Charles Isbell: Computing, Interactive AI, and Race in America | Lex Fridman Podcast #135
Chapters
0:00 Introduction
2:36 Top 3 movies of all time
8:45 People are easily predictable
14:27 Breaking out of our bubbles
26:13 Interactive AI
32:45 Lifelong machine learning
41:12 Faculty hiring
48:47 University rankings
56:15 Science communicators
65:39 Hip hop
74:39 Funk
76:03 Computing
91:55 Race
107:59 Cop story
116:20 Racial tensions
125:42 MLK vs Malcolm X
129:03 Will human civilization destroy itself?
133:34 Fear of death and the passing of time
00:00:00.000 |
The following is a conversation with Charles Isbell, 00:00:03.120 |
Dean of the College of Computing at Georgia Tech, 00:00:10.720 |
and someone who deeply thinks about what exactly is 00:00:14.980 |
the field of computing and how do we teach it. 00:00:17.880 |
He also has a fascinatingly varied set of interests, 00:00:21.440 |
including music, books, movies, sports, and history 00:00:35.640 |
and I knew I had to eventually talk to him on this podcast. 00:00:41.120 |
followed by some thoughts related to the episode. 00:00:44.280 |
First is Neuro, the maker of functional sugar-free gum 00:00:48.080 |
and mints that I use to give my brain a quick caffeine boost. 00:00:59.240 |
Third is Masterclass, online courses that I watch 00:01:02.120 |
from some of the most amazing humans in history. 00:01:05.020 |
And finally, Cash App, the app I use to send money 00:01:10.760 |
Please check out these sponsors in the description 00:01:13.080 |
to get a discount and to support this podcast. 00:01:16.360 |
As a side note, let me say that I'm trying to make it 00:01:18.600 |
so that the conversations with Charles, Eric Weinstein, 00:01:24.180 |
before Americans vote for president on November 3rd. 00:01:28.280 |
There's nothing explicitly political in these conversations, 00:01:31.480 |
but they do touch on something in human nature 00:01:34.520 |
that I hope can bring context to our difficult time, 00:01:37.880 |
and maybe, for a moment, allow us to empathize 00:02:00.340 |
Alexander the Great, Genghis Khan, Hitler, Stalin, 00:02:04.820 |
and all the complicated parts of human history in between, 00:02:10.420 |
for our humble little civilization here on Earth. 00:02:14.020 |
The conversation with Dan will hopefully be posted tomorrow 00:02:20.260 |
If you enjoy this thing, subscribe on YouTube, 00:02:31.240 |
And now, here's my conversation with Charles Isbell. 00:02:35.120 |
You've mentioned that you love movies and TV shows. 00:02:43.160 |
but you have to be definitively, objectively conclusive. 00:02:49.880 |
- So you're asking me to be definitive and to be conclusive. 00:02:52.460 |
That's a little hard, I'm gonna tell you why. 00:02:55.340 |
It's because movies is too broad of a category. 00:03:02.140 |
I'll pick one or two from each of the genres. 00:03:04.180 |
I'll get us to three, so I'm not gonna cheat. 00:03:09.860 |
which is probably my favorite movie of all time, 00:03:12.060 |
is "His Girl Friday," which is probably a movie 00:03:16.540 |
but it's based on a play called "The Front Page" 00:03:35.940 |
So you've seen these shows where there's a man and a woman, 00:03:38.860 |
and they clearly are in love with one another, 00:03:48.980 |
It's very much of its time, so it's, I don't know, 00:03:52.340 |
must have come out sometime between 1934 and 1939. 00:03:55.260 |
I'm not sure exactly when the movie itself came out. 00:04:08.940 |
Someone's on death row, and they're newspapermen, 00:04:16.380 |
They were divorced, the editor, the publisher, I guess, 00:04:25.580 |
and there's this whole other thing that's going on. 00:04:28.980 |
- Yeah, it's just a little play in conversation. 00:04:33.980 |
about the conversation, because at the end of the day, 00:04:38.340 |
and so I really like that movie for that reason. 00:04:45.180 |
and they're Crouching Tiger, Hidden Dragon, and John Wick. 00:04:53.540 |
It gets increasingly, I love them all for different reasons, 00:04:59.660 |
despite the fact they're two completely different movies. 00:05:01.780 |
But the reason I put Crouching Tiger, Hidden Dragon, 00:05:04.420 |
and John Wick together is 'cause I actually think 00:05:06.580 |
they're the same movie, or what I like about them, 00:05:08.820 |
the same movie, which is both of them create a world 00:05:17.740 |
But the story is done so well that you pick it up. 00:05:22.140 |
you have these little coins, and they're headed out, 00:05:26.340 |
every single person in New York City is an assassin. 00:05:29.140 |
There's like two people who come through who aren't, 00:05:36.140 |
Crouching Tiger, Hidden Dragon's a lot like that. 00:05:38.180 |
You get the feeling that this is chapter nine 00:05:42.100 |
the first eight chapters, and they're not gonna 00:05:46.140 |
- But you get pulled in anyway, like immediately. 00:05:48.300 |
So it's just excellent storytelling in both cases, 00:06:04.020 |
Like it scrolls off the page, but I didn't see 00:06:07.820 |
- I do not do martial arts, but I certainly-- 00:06:11.420 |
- Oh, we could talk about every Jackie Chan movie 00:06:13.300 |
ever made, and I would be on board with that. 00:06:21.980 |
would be Drunken Master 2, known in the States 00:06:36.140 |
- No, first one ever that I saw and remember, 00:06:40.500 |
- I didn't know what it was, and I didn't know 00:06:46.460 |
I only later rediscovered that that was actually-- 00:06:52.500 |
was he actually drinking, or was he play drinking? 00:07:04.660 |
- He was definitely drinking, and in the end, 00:07:10.660 |
- Yeah, and has one of the most fantastic fights ever 00:07:15.220 |
Anyway, that's my favorite one of his movies, 00:07:19.540 |
It's actually a movie called Nothing But a Man, 00:07:26.380 |
who you'll know from Hogan's Heroes, and Abbey Lincoln. 00:07:34.140 |
It's a beautiful story, but my favorite scenes, 00:07:38.700 |
one of my favorite movies just for the ending 00:07:42.780 |
I think the last scene of that is just fantastic. 00:07:45.660 |
It's the whole movie all summarized in just eight, 00:07:51.500 |
I don't think you can, you need to worry about spoilers 00:08:04.180 |
And she asks him if he did this terrible thing, 00:08:07.940 |
and he says, "No," and she says, "Thank you," 00:08:12.620 |
and you see him, you see her going out of the door, 00:08:20.100 |
and they're kissing Michael's hands, and Godfather. 00:08:25.580 |
so instead of looking at him, you're looking at her, 00:08:36.060 |
and your position as dean at Georgia Tech, Charles? 00:08:50.960 |
enjoy all kinds of experiments, including on yourself, 00:08:54.160 |
but I saw a video where you said you did an experiment 00:08:56.980 |
where you tracked all kinds of information about yourself, 00:09:00.020 |
and a few others, sort of wiring up your home, 00:09:05.540 |
and this little idea that you mentioned in that video, 00:09:10.180 |
that you thought that two days' worth of data 00:09:14.660 |
is enough to capture majority of the behavior 00:09:18.420 |
First, can you describe what the heck you did 00:09:23.140 |
to collect all that data, 'cause it's fascinating, 00:09:25.220 |
just like little details of how you collect that data, 00:09:27.820 |
and also what your intuition behind the two days is. 00:09:31.080 |
- So, first off, it has to be the right two days, 00:09:32.540 |
but I was thinking of a very specific experiment. 00:09:34.940 |
There's actually a suite of them that I've been a part of, 00:09:38.140 |
I just sort of dabbled in that part of the world. 00:09:41.980 |
that I was talking about had to do with recording 00:09:45.100 |
all the IR going on in my, infrared going on in my house. 00:09:48.820 |
So this is a long time ago, so everything's being controlled 00:10:04.200 |
My house was completely wired up at the time. 00:10:06.460 |
But you know, what, I'm about to look at a movie, 00:10:12.940 |
It was kind of surprising, it shouldn't have been. 00:10:27.780 |
and you saw, you captured everything that was going on. 00:10:31.500 |
or anything like that, just the way that the system 00:10:35.700 |
So it turns out that, and I did this with myself, 00:10:39.780 |
and then I had students, and they worked with 00:10:41.540 |
many other people, and it turns out at the end of the day, 00:10:44.500 |
people do the same things over and over and over again. 00:10:48.120 |
So it has to be the right two days, like a weekend, 00:10:53.820 |
at the level of what button they're gonna press next 00:11:00.740 |
like a, you don't even need a hidden Markov model, 00:11:09.900 |
just by doing something very simple and stupid, 00:11:21.780 |
by the things they do, the things you can measure about them 00:11:27.500 |
so distribution over actions, and you try to represent them 00:11:30.420 |
by the distribution of actions that are done on them, 00:11:38.380 |
and they cluster remarkably well, in fact, irritatingly so. 00:11:43.220 |
And so, by clustering people this way, you can, 00:11:49.620 |
of what's the next button you're gonna press, 00:11:51.340 |
but I can get 99% accuracy, or somewhere thereabouts, 00:11:54.620 |
on the collections of things you might press. 00:11:56.460 |
And it turns out, the things that you might press 00:12:01.540 |
So, for example, all the numbers on a keypad, 00:12:09.140 |
And so, you would naturally cluster them together, 00:12:11.420 |
and you discover that numbers are all related 00:12:15.140 |
to one another in some way, and all these other things. 00:12:17.100 |
And then, and here's the part that I think's important. 00:12:19.700 |
I mean, you can see this in all kinds of things. 00:12:25.060 |
but any given individual is remarkably predictable, 00:12:28.460 |
because you keep doing the same things over and over again. 00:12:32.460 |
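A minimal sketch of the kind of approach described here, with entirely made-up data and assuming scikit-learn is available: represent each person by their distribution over button presses, cluster the people, and then cluster the buttons themselves by who presses them, so that digits, volume keys, and so on group together. The logs, button names, and cluster counts are hypothetical illustrations, not the original experiment.

```python
# Hypothetical sketch: predict the *group* of buttons a user presses next,
# rather than the exact button, by modeling each user as a distribution
# over past actions. Data and button names are made up for illustration.
from collections import Counter

import numpy as np
from sklearn.cluster import KMeans

# Made-up remote-control logs: one list of button presses per person.
logs = {
    "person_a": ["1", "3", "7", "vol_up", "1", "9", "vol_up", "2"],
    "person_b": ["vol_up", "vol_down", "mute", "vol_up", "vol_down"],
    "person_c": ["2", "2", "5", "guide", "8", "4", "vol_down", "6"],
}
buttons = sorted({b for presses in logs.values() for b in presses})

def action_distribution(presses):
    """Represent a person by their empirical distribution over buttons."""
    counts = Counter(presses)
    total = sum(counts.values())
    return np.array([counts[b] / total for b in buttons])

X = np.vstack([action_distribution(p) for p in logs.values()])

# People cluster by what they tend to do (channel surfers vs. volume fiddlers).
people_clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(dict(zip(logs, people_clusters)))

# Buttons also cluster: represent each button by *who* presses it how often,
# so digits end up grouped together, volume keys together, and so on.
B = X.T  # rows = buttons, columns = people
button_clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(B)
print(dict(zip(buttons, button_clusters)))
```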
in the long time that I've been thinking about this, 00:12:45.140 |
and philosophically speaking, is it possible to say 00:12:51.580 |
So, even though some large percentage of our behaviors, 00:13:10.100 |
I would say it a little differently, I think. 00:13:13.660 |
One is, I'm gonna disagree with the premise, I think, 00:13:23.140 |
from lots of other people, but they're not 0%. 00:13:27.560 |
So, in fact, even if you do this kind of clustering 00:13:33.420 |
even if they individually behave very differently 00:13:54.000 |
I would agree with that, at least at a philosophical level, 00:13:59.800 |
with something difficult, a decision that you have to make, 00:14:07.680 |
that's sort of what defines you as the individual, 00:14:11.840 |
It's the hard problem, it's not the easy problem. 00:14:22.300 |
and I do think that that's a reasonable place to start 00:14:33.300 |
which we are working on, let me just go along this thread 00:14:36.940 |
to skip to kind of our world of social media, 00:14:41.660 |
at least on the artificial intelligence side, 00:14:48.680 |
but that we have these silos in social media, 00:14:53.480 |
and we have these clusterings, as you're kind of mentioning, 00:15:00.840 |
is that we wanna break each other out of those silos 00:15:11.000 |
If you're a Democrat, you'd be empathetic to the Republican. 00:15:14.160 |
If you're a Republican, you're empathetic to a Democrat. 00:15:21.000 |
but there's other binnings that we can think about. 00:15:24.120 |
Is there, from an artificial intelligence perspective, 00:15:28.120 |
'cause you're just saying we cluster along the data, 00:15:33.680 |
is referring to throwing agents into that mix, 00:15:43.900 |
Is that something that you think is possible? 00:16:07.900 |
other people's points of view, all that kind of stuff? 00:16:11.440 |
- Yes, and I actually don't think it's that hard. 00:16:20.600 |
Let's assume that you can do a kind of partial ordering 00:16:28.920 |
so long as there's some way that this is a cluster, 00:16:30.840 |
this is a cluster, there's some edge between them, right? 00:16:33.600 |
They don't quite touch even, or maybe they come very close. 00:16:41.100 |
The way you get from here to here is you find the edge 00:16:44.800 |
And I think that machines are actually very good 00:16:46.560 |
at that sort of thing, once we can kind of define the problem 00:16:48.720 |
either in terms of behavior or ideas or words or whatever. 00:16:57.400 |
and you kind of have some semantic meaning for them, 00:17:00.720 |
the machine doesn't have to, you do as the designer, 00:17:03.800 |
then yeah, I think you can kind of move people along 00:17:09.560 |
or sort of coming up with the network structure itself 00:17:12.960 |
is hard is because I'm gonna tell you a story 00:17:17.460 |
I may get some of the details a little bit wrong, 00:17:22.280 |
You take two sets of people from the same backgrounds 00:17:27.820 |
So you separate them up, which we do all the time, 00:17:29.740 |
right, oh, you know, we're gonna break out in the, 00:17:32.220 |
you're gonna go over there and you're gonna talk about this, 00:17:33.540 |
you're gonna go over there and you're gonna talk about this. 00:17:35.120 |
And then you have them sort of in this big room, 00:17:38.740 |
and you have them sort of interact with one another. 00:17:41.060 |
When they come back to talk about what they learned, 00:17:48.160 |
they basically don't speak the same language anymore. 00:17:50.080 |
Like when you create these problems and you dive into them, 00:17:57.080 |
'cause we were in the middle of that at the time, 00:18:00.960 |
and they're talking about these rooms that you can see, 00:18:03.560 |
but you're seeing them from different vantage points, 00:18:05.480 |
depending upon what side of the room you're on. 00:18:12.580 |
This group over here, looking at the same room, 00:18:16.420 |
but it's not in their line of sight or whatever, 00:18:18.420 |
so they end up referring to it by some other way. 00:18:26.720 |
and they don't even realize they're referring 00:18:32.180 |
The clock on the wall is the thing that stuck with me. 00:18:35.620 |
the problem isn't that the ideologies disagree, 00:18:42.800 |
The hard part is just getting them to agree on the, 00:18:45.720 |
well, maybe we'd say the axioms in our world, right? 00:18:48.880 |
But just get them to agree on some basic definitions. 00:18:52.560 |
Because right now they're talking past each other, 00:18:58.700 |
getting them to interact, that may not be that difficult. 00:19:10.380 |
but it feels like there's multiple layers to this. 00:19:16.000 |
being able to put yourself in the shoes of the other person, 00:19:35.040 |
I'm very lucky to have this amazing community 00:19:39.240 |
of loving people, but whenever I encounter trolls, 00:19:42.200 |
they always roll their eyes at the idea of love 00:19:48.200 |
- So they show love by derision, I would say. 00:20:06.080 |
and bridge the gap of what is this person's life like? 00:20:23.420 |
even just to think about what was their upbringing like, 00:20:31.440 |
or a shitty education or all those kinds of things, 00:20:52.360 |
- So the word understander is doing a lot of work, right? 00:20:58.960 |
that there is something similar as a point to touch, right? 00:21:08.560 |
I think you're right in the way that you're using 00:21:17.060 |
Empathy is kind of understanding where they're coming from 00:21:20.260 |
And for most people, those things go hand in hand. 00:21:23.960 |
For some people, some are very good at empathy 00:21:29.840 |
well, my observation would be, I'm not a psychologist, 00:21:39.320 |
understand where they're coming from and still think, 00:22:07.800 |
Because we're not all exactly like each other. 00:22:14.360 |
Even though they get clearly tangled up in one another. 00:22:16.920 |
So what I think AI could help you do, actually, 00:22:19.440 |
is if, and I'm being quite fanciful, as it were, 00:22:26.660 |
the words that you use, the actions you take, 00:22:41.680 |
some kind of commonality, a mapping, as it were, 00:22:48.520 |
then I can take the cosine of the angle between you, 00:22:51.100 |
and if it's zero, you've got nothing in common. 00:22:53.840 |
If it's one, you're completely the same person. 00:23:00.520 |
If I can find the place where there's the overlap, 00:23:02.580 |
then I might be able to introduce you on that basis, 00:23:07.740 |
and make it easier for you to take that step of empathy. 00:23:14.520 |
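A tiny worked version of that cosine idea, assuming made-up feature vectors for two hypothetical people: a cosine near zero means almost nothing in common, near one means almost the same person, and the dimensions where both have weight are candidate places to start the introduction.

```python
# Hypothetical sketch of the cosine-of-the-angle idea: describe two people
# by made-up feature vectors (words used, topics engaged with, etc.),
# measure how aligned they are, and surface the shared dimensions.
import numpy as np

features = ["ai", "movies", "hip_hop", "politics", "sports"]
person_a = np.array([0.9, 0.7, 0.8, 0.1, 0.0])
person_b = np.array([0.1, 0.6, 0.0, 0.9, 0.3])

cosine = person_a @ person_b / (np.linalg.norm(person_a) * np.linalg.norm(person_b))
print(f"cosine similarity: {cosine:.2f}")  # 0 = nothing in common, 1 = same person

# The overlap: dimensions where both people have nonzero weight are the
# places an introduction (or an empathy nudge) might start.
overlap = [f for f, a, b in zip(features, person_a, person_b) if a > 0 and b > 0]
print("shared ground:", overlap)
```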
although I wonder if it requires that everyone involved 00:23:18.840 |
is at least interested in asking the question. 00:23:21.320 |
So maybe the hard part is just getting them interested 00:23:23.880 |
In fact, maybe if you can get them to ask the question, 00:23:28.400 |
Maybe that's the problem that AI should be working on, 00:23:30.280 |
not telling you how you're similar or different, 00:23:36.640 |
- It feels like an economist's answer, actually. 00:23:43.760 |
which is I think everything you said is brilliant, 00:23:46.040 |
but I tend to believe, philosophically speaking, 00:23:49.840 |
that people are interested underneath it all, 00:24:27.040 |
and that person in my mind will become warmer and warmer, 00:24:30.880 |
and I'll start to feel more and more compassion towards them. 00:24:34.240 |
I think for majority of the population, that's true, 00:24:39.600 |
- Yeah, I mean, it's an empirical question, right? 00:24:44.200 |
and so I'm gonna say that I think you're right. 00:25:23.700 |
that it is much easier to be dismissive of a person 00:25:27.100 |
if they're not in front of you, if they're not real, right? 00:25:35.180 |
So if you're on social media, if you're on the web, 00:25:39.680 |
being forced to deal with someone as a person, 00:25:47.520 |
one, you're forced to deal with their humanity 00:25:50.280 |
The other is, of course, that they might punch you 00:25:53.540 |
so both of those things kind of work together, 00:25:59.880 |
is really a kind of substitute for forcing them 00:26:28.760 |
about the field of interactive artificial intelligence? 00:27:01.980 |
I care about building some kind of intelligent artifact, 00:27:07.900 |
that would be at least as intelligent as humans 00:27:16.660 |
- So that's the deep underlying love and dream 00:27:21.640 |
- It's the bigger, whatever the heck that is. 00:27:26.880 |
And I don't understand how one could be intelligent 00:27:29.480 |
without learning, so therefore I gotta figure out 00:27:33.160 |
But machine learning, by the way, is also a tool. 00:27:35.440 |
I said statistical because that's what most people 00:27:37.520 |
think of themselves, machine learning people. 00:27:40.040 |
I think, although Pat Langley might disagree, 00:27:58.140 |
which is building something truly intelligent 00:28:05.580 |
Which is, is there something you can say concrete 00:28:10.100 |
about the mysterious gap between the subset ML 00:28:21.040 |
not totally, but in part unknown at this time, 00:28:25.700 |
is it knowledge, like expert system reasoning 00:28:29.820 |
- So AI is bigger than ML, but ML is bigger than AI. 00:28:36.980 |
that are really interested in slightly different problems. 00:28:39.540 |
I tend to think of ML, and there are many people out there 00:28:42.700 |
but I tend to think of ML being much more concerned 00:28:46.320 |
I'm an AI about the sort of more philosophical goal 00:28:50.960 |
that motivates me, even if I end up finding myself 00:28:53.480 |
living in this kind of engineering-ish space. 00:28:57.840 |
but you know, it's, to me, they just feel very different. 00:29:03.400 |
Your sort of goals of where you're trying to be 00:29:06.360 |
are somewhat different, but to me, AI is about 00:29:10.520 |
And typically, but not always, for the purpose 00:29:14.500 |
of understanding ourselves a little bit better. 00:29:16.820 |
Machine learning is, I think, trying to solve the problem, 00:29:23.520 |
- So on that note, so with the interactive AI, 00:29:29.060 |
as a singular system, or is it as a collective, 00:29:32.260 |
huge amount of systems interacting with each other? 00:29:38.380 |
and of AI systems fundamental to intelligence? 00:29:46.800 |
So the reason the interactive AI part matters to me 00:29:51.040 |
is because I don't, this is gonna sound simple, 00:29:55.660 |
but I don't care whether a tree makes a sound 00:30:09.780 |
with other people, right, or other things, anyway. 00:30:12.960 |
And we go out of our way to make other things intelligent. 00:30:21.840 |
I think the interactive AI part is being intelligent 00:30:25.280 |
in and of myself in isolation is a meaningless act, 00:30:30.520 |
The correct answer is you have to be intelligent 00:30:35.440 |
to learn faster, because you can import from past history. 00:30:47.600 |
But I'm also intelligent as a part of a larger species 00:30:50.300 |
and group of people, and we're trying to move 00:30:53.520 |
And so I think that notion of being intelligent 00:31:01.520 |
And so that's why I care about that aspect of it. 00:31:07.320 |
One is not just building something intelligent with others, 00:31:10.320 |
but understanding that you can't always communicate 00:31:13.180 |
They have been in a room where there's a clock on the wall 00:31:15.480 |
that you haven't seen, which means you have to spend 00:31:18.720 |
with one another constantly in order to figure out 00:31:24.000 |
So, I mean, this is why people project, right? 00:31:25.620 |
You project your own intentions and your own reasons 00:31:28.740 |
for doing things onto others as a way of understanding them 00:31:32.960 |
But by the way, you, completely predictable person, 00:31:37.040 |
I don't know you well enough, but you probably eat 00:31:51.120 |
but other people get the chicken and broccoli. 00:31:56.680 |
- I don't know what's wrong with those people. 00:31:57.840 |
- I don't know what's wrong with them either. 00:32:06.000 |
We got to communicate, and it's going to change, right? 00:32:08.440 |
So it's not, interactive AI is not just about 00:32:20.880 |
This is what we mean about things like adaptable models, 00:32:28.040 |
but you're different from the person you were 15 minutes ago, 00:32:31.920 |
and I have to assume that you're at least going to drift. 00:32:38.040 |
And I have to have some mechanism for adapting to that 00:32:51.000 |
which is, I think, a topic that's understudied, 00:32:56.000 |
or maybe because nobody knows what to do with it. 00:33:01.520 |
or most of our artificial intelligence systems 00:33:04.360 |
that are primarily machine learning based systems, 00:33:07.000 |
or dialogue systems, all those kinds of things, 00:33:29.880 |
that seems to pick up the crumbs along the way 00:33:34.360 |
that somehow seems to capture a person pretty well. 00:33:37.080 |
Do you have any ideas how to do lifelong learning? 00:33:42.080 |
Because it seems like most of the machine learning community 00:33:47.320 |
not spend a lot of time on lifelong learning, 00:33:49.840 |
I don't think they spend a lot of time on learning, period, 00:33:52.760 |
in the sense that they tend to be very task-focused. 00:33:55.120 |
Everybody is over-fitting to whatever problem 00:33:58.360 |
They're over-engineering their solutions to the task. 00:34:01.480 |
Even the people, and I think these people do, 00:34:03.720 |
are trying to solve a hard problem of transfer learning, 00:34:11.060 |
where the light is, 'cause that's where the light is, right? 00:34:14.920 |
I mean, one could argue that we tend to do this in general. 00:34:20.600 |
We tend to hill climb and get stuck in local optima. 00:34:28.360 |
Because, so, look, here's the hard thing about AI, right? 00:34:32.240 |
The hard thing about AI is it keeps changing on us, right? 00:34:35.440 |
AI is the art and science of making computers act 00:34:49.700 |
to the sort of ineffable quality of who we are. 00:34:52.980 |
Which means that the moment you understand something 00:34:58.060 |
That's just, you take the derivative and you divide by two 00:35:00.300 |
and then you average it out over time in the window. 00:35:09.440 |
of either there's very simple task-based things 00:35:13.700 |
There's all of AI, and there's like nothing in the middle. 00:35:16.560 |
Like it's very hard to get from here to here, 00:35:18.740 |
and it's very hard to see how to get from here to here. 00:35:21.740 |
And I don't think that we've done a very good job of it 00:35:24.220 |
because we get stuck trying to solve the small problem 00:35:28.140 |
I'm not gonna pretend that I'm better at this 00:35:31.300 |
And of course, all the incentives in academia 00:35:34.340 |
and in industry are set to make that very hard 00:35:56.280 |
then you've probably spent much of your career 00:35:57.840 |
taking these little risks, relatively speaking. 00:36:02.720 |
telling you not to take that particular big risk, right? 00:36:04.760 |
So the whole system's set up to make progress very slow. 00:36:17.820 |
At least try to get n equal two and maybe n equal seven 00:36:25.500 |
I'm gonna solve this problem and another problem. 00:36:39.700 |
which you're gonna have live for months at a time 00:36:45.740 |
then you're never going to make progress in that direction. 00:36:52.380 |
It's yes, you should be deploying these things 00:36:59.940 |
that it's gonna take you five years to do this. 00:37:01.780 |
Not rerunning the same experiment over and over again 00:37:11.580 |
and seeing what its learning algorithm, say, can learn, 00:37:11.580 |
Without that, you're gonna be stuck ultimately. 00:37:19.800 |
- What do you think about the possibility of N equals one 00:37:23.840 |
growing, it's probably a crude approximation, 00:37:27.240 |
but growing like if we look at language models like GPT-3, 00:37:31.040 |
if you just make it big enough, it'll swallow the world. 00:37:34.360 |
Meaning like it'll solve all your T to infinity 00:37:43.820 |
and just pumping it full of steroids in terms of compute, 00:37:57.860 |
and it will learn how to reason, how to paint, 00:38:06.180 |
- I mean, I can't think of a more terrifying world 00:38:08.340 |
to live in than a world that is based on YouTube videos, 00:38:16.880 |
You will get somewhere and you will learn something, 00:38:23.580 |
You won't solve the, you know, here's the thing. 00:38:25.980 |
We build these things and we say we want them to learn, 00:38:30.700 |
but what actually happens, and let's say they do learn. 00:38:33.420 |
I mean, certainly every paper I've gotten published 00:38:35.220 |
the things learn, I don't know about anyone else, 00:38:41.280 |
So we keep redefining what it means to be successful, 00:38:51.980 |
which is like the one you just described with YouTube. 00:38:53.580 |
Let's get completely out of machine learning. 00:38:55.340 |
Well, not completely, but mostly out of machine learning. 00:38:58.420 |
People were trying to solve information retrieval, 00:39:02.180 |
the ad hoc information retrieval problem forever. 00:39:04.420 |
I mean, first major book I ever read about it was what, 00:39:09.760 |
Anyway, it's, you know, we'll treat everything as a vector 00:39:12.600 |
and we'll do these vector space models and whatever. 00:39:20.160 |
And then Google comes and makes the ad hoc problem 00:39:27.960 |
and there's some brilliant algorithmic stuff behind it too, 00:39:40.100 |
so that you have, you know, there are 10 million answers 00:39:43.140 |
quite literally to the question that you're asking, 00:39:46.540 |
then the problem wasn't give me the things that are 00:39:49.620 |
relevant, the problem is don't give me anything 00:39:51.580 |
that's irrelevant, at least in the first page, 00:39:56.260 |
So Google is not solving the information retrieval problem, 00:40:05.940 |
which is not the same thing as getting an answer. 00:40:12.300 |
what the problem was we thought we were trying to solve 00:40:15.520 |
You thought you were trying to find an answer, 00:40:17.420 |
but you're not, we're trying to find the answer, 00:40:19.060 |
but it turns out you're just trying to find an answer. 00:40:24.220 |
Of course, you trained yourself to figure out 00:40:25.980 |
what the keywords were to get you that webpage. 00:40:31.980 |
you've just changed the problem into something else. 00:40:33.820 |
You haven't actually learned what you set out to learn. 00:40:37.180 |
maybe we're not doing that either, we just think we are, 00:40:40.120 |
because we're in our own heads, maybe we're learning 00:40:45.520 |
I think the point is is that Google has not solved 00:40:48.560 |
information retrieval, Google has done amazing service. 00:40:51.180 |
I have nothing bad to say about what they've done. 00:40:52.820 |
Lord knows my entire life is better because Google exists, 00:40:55.820 |
If it weren't for Google Maps, I don't think I've ever 00:41:00.060 |
- Like 95, I see 110 and I see, but where did 95 go? 00:41:04.360 |
So I'm very grateful for Google, but they just have 00:41:08.580 |
to make certain the first five things are right. 00:41:12.780 |
Look, we're going off on a totally different topic here, 00:41:29.500 |
We say things like we want to find the best person 00:41:39.280 |
which I will point out was founded 30 years after 00:41:47.480 |
- I'm just saying I appreciate all that they did 00:41:51.400 |
Anyway, so we're gonna try to hire the best person 00:41:58.120 |
That's what we say, the best person for this job. 00:42:02.540 |
Do you know which percentage of faculty in the top four 00:42:15.940 |
- 60% of the faculty in the top four earn their PhDs 00:42:18.980 |
This is computer science for which there is no top five. 00:42:40.560 |
And Princeton and Georgia Tech are tied for eight 00:42:49.020 |
you know what percentage of faculty in the top 10 00:42:57.120 |
If you look at the top 55 ranked departments, 00:43:00.540 |
50% of the faculty earn their PhDs from the top 10. 00:43:04.300 |
There's no universe in which all the best faculty, 00:43:28.640 |
But what it tells you is, well, ask yourself the question, 00:43:42.520 |
nobody ever lost his job buying a computer from IBM. 00:43:47.080 |
And nobody ever lost their job hiring a PhD from MIT, right? 00:43:51.920 |
If the person turned out to be terrible, well, you know, 00:43:54.200 |
they came from MIT, what did you expect me to know? 00:44:07.400 |
particularly because you're only gonna hire one this year, 00:44:10.920 |
you're only gonna hire one or two or three this year. 00:44:14.240 |
you're stuck with them for at least seven years 00:44:16.120 |
in most places, because that's before you know 00:44:18.560 |
And if they get tenure, you're stuck with them 00:44:19.960 |
for a good 30 years, unless they decide to leave. 00:44:22.240 |
That means the pressure to get this right is very high. 00:44:27.820 |
You don't care about saying no inappropriately. 00:44:30.480 |
You only care about saying yes inappropriately. 00:44:40.000 |
was in exactly the same situation with their search. 00:44:42.000 |
It turns out you just don't wanna give people 00:44:43.560 |
the wrong page in the first three or four pages. 00:44:50.680 |
just make certain the wrong answers don't get up there. 00:45:02.240 |
something beautiful, profound to your question. 00:45:19.440 |
how do we get that 13th page with a truly special person? 00:45:27.440 |
Computer science probably has those kinds of people. 00:45:32.140 |
Like you have the Russian guy, Grigori Perelman. 00:45:40.600 |
that don't know how to play the little game of etiquette 00:45:44.240 |
that faculty have all agreed somehow converged 00:45:48.320 |
over the decades how to play with each other. 00:45:53.480 |
is not from the top four, top whatever numbers, the schools. 00:45:57.520 |
And maybe actually just says a few every once in a while 00:46:19.680 |
- Well, you have to be willing to take a certain kind of, 00:46:23.120 |
you'd have to take, you have to be willing to take risk. 00:46:34.640 |
you would say, oh, well, the main thing is you want, 00:46:41.280 |
we'll just inject some randomness in and it'll be okay. 00:46:44.760 |
The problem is that feels very, very hard to do with people. 00:46:56.440 |
that, you know, injecting randomness in the system 00:46:58.600 |
at that level for who you hire is just not worth doing 00:47:01.840 |
because the price is too high or the cost is too high. 00:47:04.320 |
We had infinite resources, sure, but we don't. 00:47:07.000 |
So, you know, you're ruining other people's lives 00:47:10.840 |
But we've taken that principle, even if I grant it, 00:47:22.240 |
of people we look at and give an opportunity to. 00:47:24.960 |
If we do that, then we have a better chance of finding that. 00:47:30.280 |
another level, but let me tell you something else. 00:47:32.200 |
You know, I did a sort of study, I call it a study. 00:47:35.080 |
I called up eight of my friends and asked them 00:47:36.400 |
for all of their data for graduate admissions, 00:47:38.280 |
but then someone else followed up and did an actual study. 00:47:43.360 |
how everybody gets into grad school, more or less. 00:47:50.720 |
You admit most people from places ranked around you, 00:47:52.600 |
and you admit almost no one from places ranked below you, 00:47:54.600 |
with the exception of the small liberal arts colleges 00:48:01.160 |
Which means the decision of whether, you know, 00:48:11.760 |
By what you knew to go to undergrad to do whatever, right? 00:48:15.720 |
So if we can push these things back a little bit 00:48:20.680 |
that you will be able to see someone interesting 00:48:25.320 |
The other answer to that question, by the way, 00:48:30.000 |
you either adjust the pool so the probabilities go up, 00:48:32.520 |
that's a way of injecting a little bit of uniformity. 00:48:40.240 |
You just let yourself be measured by something 00:48:51.840 |
move entire universities to behave differently, 00:48:56.440 |
- Can you talk trash about those rankings for a second? 00:49:01.280 |
I actually, it's so funny how, from my perspective, 00:49:07.320 |
how dogmatic, like how much I trust those rankings. 00:49:27.040 |
like most people don't know what they're based on. 00:49:34.600 |
- Well, so it depends on which rankings you're talking about. 00:49:40.280 |
- Computer science, US News, isn't that the main one? 00:49:46.480 |
Sorry, csrankings.org, but nothing else matters 00:49:50.200 |
So US News has formula that it uses for many things, 00:49:55.280 |
because computer science is considered a science, 00:49:58.480 |
So the rankings for computer science is 100% reputation. 00:50:20.000 |
So that means, how do you improve reputation? 00:50:22.120 |
How do you move up and down the space of reputation? 00:50:33.120 |
because Georgia Tech is actually the case to look at. 00:50:37.640 |
but because Georgia Tech is the only computing unit 00:50:40.440 |
that was not in the top 20 that has made it into the top 10. 00:50:43.120 |
It's also the only one in the last two decades, I think, 00:51:11.560 |
I think it's because we have shown leadership 00:51:17.000 |
So we created a college, first public university to do it, 00:51:22.320 |
I also think it's no accident that CMU is the largest, 00:51:36.040 |
when there was a crisis about undergraduate education, 00:51:41.000 |
and succeeded at rethinking undergrad education 00:51:47.840 |
when most public universities anyway were afraid to do it. 00:51:51.820 |
and that mattered because people were trying to figure out 00:51:57.200 |
I think it's about being observed by your peers 00:52:02.200 |
So, I mean, that is what reputation is, right? 00:52:04.240 |
So the way you move up in the reputation rankings 00:52:15.260 |
And there's huge hysteria in the system, right? 00:52:18.280 |
I can't remember this, this may be apocryphal, 00:52:39.280 |
the best students come, which keeps it being great. 00:52:48.920 |
Like, it doesn't actually have to be backed by reality. 00:52:52.080 |
And it's, you know, not to say anything about MIT, 00:53:03.840 |
like one of the surprising things when I showed up at MIT 00:53:13.360 |
they're the same people as I've met other places. 00:53:25.000 |
It's a lot better than it was when I was here. 00:53:36.520 |
It just doesn't get all of the best students. 00:53:39.040 |
There are many more best students out there, right? 00:53:45.800 |
And it just kind of, it's a sort of positive feedback. 00:53:49.380 |
which I think is worth examining for a moment, right? 00:53:54.800 |
You said, "We're living in the space of narrative 00:54:04.920 |
We just build stories to explain why we do what we do. 00:54:08.060 |
Someone once asked me, "But wait, there's nothing objective." 00:54:12.920 |
It's an objective measure of the opinions of everybody else. 00:54:20.600 |
But, you know, what, I mean, tell me something 00:54:23.180 |
you think is actually objective and measurable 00:54:29.900 |
I mean, you're getting me off on something here, 00:54:34.900 |
which are just reflecting light and putting them on film, 00:54:49.100 |
and Western Europe, were relatively light-skinned. 00:54:55.620 |
That got fixed with better film and whole processes. 00:55:02.980 |
wanted to be able to take pictures of mahogany furniture. 00:55:08.540 |
wanted to be able to take pictures of chocolate. 00:55:18.740 |
- Are objective, they're just capturing light. 00:55:27.300 |
if I may use that word, some physics over others, 00:55:32.260 |
So I can either worry about this part of the spectrum 00:55:38.300 |
but I have more people paying money over here, right? 00:55:40.380 |
And it turns out that if a giant conglomerate 00:55:43.860 |
wants, demands that you do something different 00:55:45.900 |
and it's gonna involve all kinds of money for you, 00:55:52.220 |
Oh, it's because of this notion of objectiveness, right? 00:55:57.060 |
because at the end you've gotta tell a story, 00:56:00.140 |
and what else is engineering other than that? 00:56:01.900 |
So I think that the rankings capture something. 00:56:20.020 |
with whatever that narrative is, have fun with it, 00:56:26.540 |
of that calm, sexy voice of explaining the stars 00:56:31.660 |
or the Elon Musk, dare I even say Donald Trump, 00:56:35.180 |
where you're like trolling and shaking up the system 00:56:47.580 |
thinks like, finds the counterintuitive ideas 00:56:51.700 |
in the particular science and throws them out there 00:56:54.820 |
and sees how they play in the public discourse. 00:57:00.420 |
And why doesn't academia attract an Elon Musk type? 00:57:14.420 |
Well, I think the answer is we have told ourselves a story, 00:57:31.620 |
that in some ways you're the mathematician, right? 00:57:44.180 |
then it is beneath you to do that kind of thing, right? 00:57:51.340 |
I think that, and by the way, everyone doesn't have to do this. 00:57:54.220 |
and everyone, even if they would be good at it, 00:58:08.900 |
are ones that engage with the rest of the world. 00:58:20.700 |
whereas of course that wasn't true 20 years ago, 00:58:25.300 |
And if it was, it wasn't around in a meaningful way. 00:58:26.660 |
I don't actually know how long Twitter's been around. 00:58:28.740 |
As I get older, I find that my notion of time 00:58:33.100 |
Like Google really has been around that long? 00:58:42.180 |
that a part of our job is to impact the people 00:58:46.380 |
and that that's the point of being at a great place 00:58:53.660 |
Forget Twitter, we could look at just online courses 00:58:59.220 |
Like there is a kind of force that pulls you back. 00:59:09.740 |
There's a little bit of, all of us have this, 00:59:11.780 |
but certainly faculty have this, which is jealousy. 00:59:15.100 |
It's whoever's popular at being a good communicator, 00:59:23.020 |
And of course, when you excite the world with the science, 00:59:37.100 |
and they hate that a TED Talk gets millions of views 00:59:46.940 |
unless you like win a Nobel Prize or whatever. 00:59:48.740 |
Like it's only when you like get senior enough 00:59:54.940 |
But just like you said, even when you get tenure, 01:00:00.380 |
I have many colleagues and friends who have gotten tenure, 01:00:23.560 |
to think in a certain way, to accept certain values, 01:00:41.340 |
I do think that as a field, not just as a field, 01:00:45.420 |
as a profession, we have a habit of belittling those 01:00:55.680 |
as if the word itself is a kind of scarlet A, right? 01:01:15.260 |
are not being pure to whatever the values and ethos is 01:01:22.900 |
Now, having said that, I think that ultimately, 01:01:26.580 |
people who are able to be popular and out there 01:01:30.260 |
and are touching the world and making a difference, 01:01:39.380 |
or you have to be very interested in pursuing it. 01:01:45.860 |
I'd be really interested in how Rod Brooks felt 01:01:50.500 |
when he did "Fast, Cheap, and Out of Control" 01:01:56.420 |
- It was a documentary that involved four people. 01:01:59.820 |
I remember nothing about it other than Rod Brooks was in it 01:02:04.820 |
Can't remember what the other two things were. 01:02:06.460 |
It was robots, naked mole rats, and then two other-- 01:02:10.260 |
of the Artificial Intelligence Laboratory at MIT, 01:02:23.380 |
And also is a little bit of a rock star personality 01:02:27.180 |
in the AI world, very opinionated, very intelligent. 01:02:33.420 |
Also, he was one of my two advisors for my PhD. 01:02:56.980 |
'cause I was a student at the time, it was amazing. 01:02:59.500 |
This guy was in a movie, being very much himself. 01:03:09.620 |
I mean, I think they edited it appropriately for him. 01:03:12.860 |
But it was very much Rod, and he did all this 01:03:15.700 |
I mean, he was running, was he running the AI Lab 01:03:22.040 |
He did amazing things, made a lot of his bones 01:03:23.720 |
by doing the kind of counterintuitive science, right? 01:03:40.880 |
I don't know if he would tell you it was good or bad, 01:03:42.080 |
but I know that for everyone else out there in the world, 01:03:48.340 |
So it's not as if it destroyed his career by being popular. 01:03:52.080 |
- All right, let's go into a topic where I'm on thin ice, 01:03:56.280 |
because I grew up in the Soviet Union and Russia. 01:03:58.080 |
My knowledge of music, this American thing you guys do, 01:04:12.180 |
the Lab for Interactive Artificial Intelligence, 01:04:14.660 |
but also there's just a bunch of mystery around this. 01:04:27.880 |
- So a lot of my life is about making acronyms. 01:04:30.440 |
So if I have one quirk, it's that people will say words, 01:04:45.840 |
But finally I decided that the P stands for probabilistic 01:04:51.440 |
and it's uncertainty, which is the important thing here. 01:04:54.300 |
And the FUNK can be lots of different things, 01:04:56.580 |
but I decided I should leave it up to the individual 01:05:00.680 |
But I will tell you that when my students graduate, 01:05:06.460 |
I hand them, they put on a hat and star glasses 01:05:22.320 |
- So there's a sense to it which is not an acronym, 01:05:53.020 |
especially the past couple of decades in the '90s 01:05:59.800 |
what records or artists would you introduce me to? 01:06:09.740 |
or maybe what influenced you in your journey, 01:06:13.520 |
Like when the family's gone and you just sit back 01:06:26.520 |
no matter how old they are or where they live. 01:06:28.560 |
But for me, the first thing that's worth pointing out 01:06:31.420 |
is that hip hop and rap aren't the same thing, 01:06:35.080 |
and there are people who feel very strongly about this, 01:06:38.800 |
- You're offending everybody in this conversation, 01:06:44.880 |
- It's a whole set of things, of which rap is a part. 01:06:55.860 |
And there's all these, including the popping and the locking 01:07:03.040 |
And then there's rap, which is this particular-- 01:07:09.000 |
I mean, you wouldn't call the stuff that DJs do 01:07:14.280 |
So given that we understand that hip hop is this whole thing, 01:07:17.480 |
what are the rap albums that best touch that for me? 01:07:37.320 |
by watching old episodes of "I Love the '70s," 01:07:44.200 |
and just see where people come in and out of pop culture. 01:07:56.680 |
Particularly, "It Takes a Nation of Millions to Hold Us Back," 01:07:59.840 |
which is clearly the best album ever produced, 01:08:03.560 |
and certainly the best hip hop album ever produced, 01:08:09.920 |
Fantastic lyrics, 'cause to me, it's all about the lyrics. 01:08:16.520 |
and he did a lot, very kind of heavy metal-ish, 01:08:33.280 |
I'd probably get you to someone like a Mos Def. 01:08:35.280 |
I would give you a history lesson, basically. 01:08:50.640 |
and eventually, I would take you back to "The Last Poets," 01:08:56.000 |
and particularly their first album, "The Last Poets," 01:08:58.080 |
which was 1970, to give you a sense of history, 01:09:14.280 |
who are kind of confused about any kind of music, 01:09:17.040 |
you know, the truth is, this is the same thing 01:09:19.720 |
It's about narrative and being a part of something 01:09:31.960 |
and you have no idea what they're talking about. 01:09:38.640 |
you start using words that nobody else understands, right? 01:09:40.520 |
And it becomes part of, hip-hop's the same way, 01:09:42.680 |
everything's the same way, they're all cultural artifacts. 01:09:44.860 |
But I would help you to see that there's a history of it, 01:09:50.680 |
so that you could kind of see how it connects 01:09:54.520 |
including some of the good work that's been done 01:10:09.720 |
But I'd start you with "It Takes a Nation of Millions to Hold Us Back" 01:10:12.560 |
- There's an interesting tradition in more modern hip-hop 01:10:16.000 |
of integrating almost like classic rock songs or whatever, 01:10:38.200 |
- Well, that's been true since the beginning. 01:10:45.480 |
'cause it was the DJ that brought all the records together 01:10:49.440 |
If you go back to those days, mostly in New York, 01:10:53.040 |
though not exclusively, but mostly in New York, 01:10:56.600 |
it was the DJ that brought all the music together 01:10:58.080 |
and the beats and showed that basically music 01:11:10.400 |
when I became really into it, which was most of the '80s, 01:11:14.600 |
it was more that funk was the backing for a lot of the stuff, 01:11:22.960 |
'cause it tied into what my parents listened to 01:11:28.160 |
And by the way, complete revival of George Clinton 01:11:32.080 |
and Parliament and Funkadelic and all of those things 01:11:34.560 |
to bring it sort of back into the '80s and into the '90s. 01:11:37.280 |
And as we go on, you're gonna see the last decade 01:11:44.760 |
it's probably because it's being sampled by someone 01:12:00.840 |
So this stuff's been going on for a long time. 01:12:02.280 |
It's one of the things that I think is beautiful. 01:12:04.640 |
Run DMC, Jam Master Jay used to play, he played piano. 01:12:13.000 |
of what was going on rather than play the piano. 01:12:17.080 |
- Well, it's pieces, you're putting pieces together. 01:12:21.760 |
I mean, the roots are doing their own thing, right? 01:12:40.440 |
this is me talking trash about modern hip hop. 01:12:42.640 |
I haven't investigated, I'm sure people will correct me 01:13:06.040 |
or you see Public Enemy, or Rage Against the Machine 01:13:09.360 |
that's the place where we go to those lyrics. 01:13:18.720 |
or I'm really happy that she's still with me, 01:13:22.200 |
or the flip side, it's like love songs of different kinds. 01:13:31.240 |
It seems like rap is the place where you would find that, 01:13:38.160 |
what I see, you look at like mumble rap or whatever, 01:13:42.520 |
and more towards the beat and the musicality of it. 01:13:46.880 |
In fact, if you go back and you read my reviews, 01:14:01.360 |
but I often would start with, it's all about the lyrics. 01:14:08.240 |
before I've even finished having this conversation 01:14:10.160 |
that neither of us knows what we're talking about, 01:14:36.800 |
but there's a little bit of sampling here and there, 01:14:50.440 |
- It's hard to imagine somebody being James Brown. 01:14:55.740 |
and just listen to Snatch It Back and Hold It, 01:15:21.240 |
You know, it's, I guess I would answer your question 01:15:23.600 |
depending on whether I'm thinking about it in 2020 01:15:28.920 |
I'm just thinking in terms of, you know, that was rock, 01:15:35.080 |
But we didn't use those words, or maybe we did, 01:15:40.760 |
Certainly not the way we used it in the '70s and the '80s. 01:15:45.640 |
I appreciate all the mistakes that we have made 01:15:49.000 |
Actually, some of the disco is actually really, really good. 01:15:51.120 |
- John Travolta, oh boy, he regrets it probably. 01:15:56.840 |
- Yeah, and it got him to where he's going, where he is. 01:15:59.640 |
- Oh, well, thank you for taking that detour. 01:16:05.720 |
we've already talked about computing a little bit, 01:16:13.440 |
where it fits into the sets of different disciplines? 01:16:18.360 |
What should people, how should they think about computing, 01:16:27.020 |
that defines for a young mind what computing is? 01:16:32.900 |
- So I don't know about a perfect curriculum, 01:16:36.680 |
without the curriculum, you don't get anywhere. 01:16:38.520 |
Curriculum, to me, is the fundamental data structure. 01:16:44.520 |
So I think the curriculum is where I like to play. 01:16:48.000 |
So I spend a lot of time thinking about this. 01:16:50.400 |
But I will tell you, I'll answer your question 01:16:51.840 |
by answering a slightly different question first 01:16:58.560 |
The truth is, what we really educate people in 01:17:01.880 |
from the beginning, but certainly through college, 01:17:04.600 |
you've sort of failed if you don't think about it this way, 01:17:09.420 |
people often think about tools and tool sets, 01:17:26.640 |
not just the skill of learning how to hammer well, 01:17:31.040 |
what's the fundamental way to think about the world, right? 01:17:39.680 |
They give you different ways of sort of thinking through. 01:17:41.600 |
So, with that in mind, I think that computing, 01:17:44.760 |
to even ask the question whether it's a discipline, 01:17:48.040 |
does it have a way of thinking about the world 01:17:49.640 |
that is different from the scientist who is doing discovery 01:17:53.480 |
and using the scientific method as a way of doing it, 01:17:59.800 |
about the abstractions that may be artificial, but whatever. 01:18:18.040 |
and I've come to a view about what computing actually is, 01:18:22.160 |
what the mindset is, and it's a little abstract, 01:18:27.280 |
I think that what distinguishes the computationalist 01:18:36.200 |
that models, languages, and machines are equivalent. 01:18:43.960 |
but it's a machine that is an executable thing 01:18:55.960 |
but it is fundamentally dynamic and executable. 01:19:07.040 |
that I make it static, and that's not a bad thing. 01:19:12.640 |
It's not a process that continually runs, right? 01:19:18.000 |
that self-reflection of the system itself matters, 01:19:23.360 |
So, it is a science, because the models fundamentally 01:19:27.960 |
Information is a scientific thing to discover, right? 01:19:30.900 |
Not just a mathematical conceit that gets created. 01:19:35.280 |
because you're actually dealing with constraints 01:19:41.420 |
But it's also a math, because you're actually worrying 01:19:44.120 |
about these languages that describe what's happening. 01:19:52.400 |
and finite state automata, one of which feels like a machine, 01:20:08.320 |
and we would do better if we made that more explicit. 01:20:12.220 |
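As a hedged illustration of that equivalence, here is one arbitrarily chosen regular language (binary strings ending in "01") written twice: once as a regular expression, the language view, and once as a hand-built finite state automaton, the machine view. Both accept exactly the same strings; the particular language and state names are just for the example.

```python
# Sketch of the "models, languages, and machines are equivalent" point:
# one regular language described two ways. The language itself
# (binary strings ending in "01") is an arbitrary choice for illustration.
import re

pattern = re.compile(r"^[01]*01$")  # the language, written as a regular expression

# The same language, written as a deterministic finite automaton.
# States: "start", "saw0", "saw01"; the accepting state is "saw01".
dfa = {
    ("start", "0"): "saw0",  ("start", "1"): "start",
    ("saw0",  "0"): "saw0",  ("saw0",  "1"): "saw01",
    ("saw01", "0"): "saw0",  ("saw01", "1"): "start",
}

def dfa_accepts(s: str) -> bool:
    state = "start"
    for ch in s:
        if (state, ch) not in dfa:
            return False  # symbol outside the alphabet
        state = dfa[(state, ch)]
    return state == "saw01"

# The two descriptions agree on every test string.
for s in ["01", "1101", "0110", "", "abc"]:
    assert bool(pattern.fullmatch(s)) == dfa_accepts(s)
    print(s or "<empty>", dfa_accepts(s))
```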
How my life has changed in my thinking about this 01:20:18.100 |
since I tried to put that to paper with some colleagues 01:20:20.680 |
is the realization, which comes to a question 01:20:42.020 |
computer science, whatever you want to call it, 01:20:55.060 |
kind of only matters with respect to human beings 01:20:58.340 |
So, the triangle exists, that is fundamentally computing. 01:21:10.100 |
and intelligence that has to interact with it 01:21:11.980 |
that changes the data, the information that makes sense 01:21:35.900 |
into the idea, into this framework of computing, 01:21:41.340 |
like parts of psychology, parts of neuroscience, 01:21:44.780 |
What about philosophy, like studies of human nature 01:21:57.860 |
which is nice because I understand if-then statements. 01:22:09.700 |
There'll still be biology involved, but whatever. 01:22:15.180 |
in fact, the cell is a bunch of if-then statements 01:22:22.220 |
certainly the way we would do AI and machine learning, 01:22:23.900 |
but there's just even the way that we think about, 01:22:31.440 |
who are not in computer science worry about computer science 01:22:45.020 |
but it's not the most important thing in the world. 01:22:55.860 |
without understanding some data science and computing 01:22:57.860 |
because the way you're gonna get history done, in part, 01:23:07.480 |
to help you to think about a better way to describe history 01:23:11.020 |
and what it tells us about where we might be going. 01:23:17.040 |
is because the philosopher has a lot to say about computing. 01:23:22.060 |
about the way humans interact with computing, right? 01:23:33.620 |
- Did you think computing will eat everything 01:23:37.500 |
or almost like disappear because it's part of everything? 01:23:42.860 |
but there's kind of two ways that fields destroy themselves. 01:23:48.680 |
and I think we can think of fields that might be that way. 01:23:54.860 |
And we have that instinct, we have that impulse. 01:23:58.660 |
who want computer science to be this pure thing. 01:24:07.440 |
I'm gonna teach Fortran for engineers or whatever, 01:24:12.660 |
that makes it worth studying in and of itself. 01:24:26.660 |
In fact, the thriving major, almost every place. 01:24:33.340 |
because people need to know the things we need to know. 01:24:36.220 |
And our job, much as the mathematician's job, 01:24:41.880 |
much the way the point of you taking chemistry as a freshman 01:24:47.000 |
it's to learn to think like a scientist, right? 01:24:48.820 |
Our job is to help them to think like a computationalist, 01:24:51.980 |
and we have to take both of those things very seriously. 01:24:56.820 |
we have historically certainly taken the second thing, 01:24:59.180 |
that our job is to help them to think a certain way. 01:25:02.780 |
I don't think we've taken that very seriously at all. 01:25:06.020 |
- I don't know if you know who Dan Carlin is, 01:25:07.700 |
he has this podcast called "Hardcore History." 01:25:10.620 |
- I just did an amazing four-hour conversation with him, 01:25:15.780 |
But I bring him up because he talks about this idea 01:25:20.240 |
that it's possible that history as a field will become, 01:25:24.340 |
like currently, most people study history a little bit, 01:25:29.340 |
kind of are aware of it, we have a conversation about it, 01:25:35.700 |
that some parts of history are being ignored, 01:26:08.300 |
of computer science, it becomes a niche thing 01:26:14.980 |
and all the history, the founding of the United States, 01:26:22.420 |
And it's a kind of profound thing to think about, 01:26:26.220 |
how we can lose track, how we can lose these fields 01:26:31.220 |
when they're best, like in the case of history, 01:26:36.940 |
that everybody learns and thinks about and so on. 01:26:44.180 |
similar to history in the sense that it seems 01:26:47.100 |
like it should be a part of everybody's life to some degree, 01:26:56.300 |
And it's not obvious that that's the way it'll go. 01:27:09.100 |
It's currently very successful, but it's not, 01:27:13.060 |
I mean, you're at the leadership level of this, 01:27:15.420 |
you're defining the future, so it's in your hands. 01:27:21.420 |
this can go, and there's this kind of conversation 01:27:45.620 |
it would be an absolute shame if no one studied history. 01:27:51.220 |
the amount of history is presumably also growing, 01:28:03.340 |
if you think of your brains as being outside of your head, 01:28:06.400 |
that you can kind of learn the history you need to know 01:28:09.900 |
That seems fanciful, but it's a kind of way of, 01:28:17.740 |
for the particular thing you have to care about, 01:28:20.820 |
- For our objective camera discussion, right? 01:28:23.220 |
- Yeah, right, and we've already lost lots of history, 01:28:27.420 |
that some of which will be, it's even lost to you. 01:28:46.260 |
the point is you have to get across those lessons. 01:28:54.600 |
even if you don't necessarily have the information 01:28:57.880 |
With computing, I think it's somewhat different. 01:29:02.800 |
but everyone needs to learn how to think in the way 01:29:08.240 |
in the sense of repeatable, not just in the sense of, 01:29:12.600 |
not resolution in the sense of get the right number of bits. 01:29:15.600 |
In saying what it is you want the machine to do, 01:29:19.520 |
and being able to describe a problem in such a way 01:29:29.240 |
talking back and forth just to kind of vaguely understand 01:29:31.480 |
what the other person means, and hope we get it good enough 01:29:35.360 |
You can't do that with machines, at least not yet. 01:29:53.720 |
what that means, that it's a programming language 01:29:55.760 |
and it has these sort of things that you fiddle with 01:30:01.080 |
In fact, I would argue that one of the big holes 01:30:05.080 |
is that we forget that we are basically doing programming, 01:30:09.560 |
that we are using programming. 01:30:13.440 |
Like, we're using languages to express what we're doing. 01:30:15.400 |
We get just so all caught up in the deep network, 01:30:22.800 |
based upon a set of parameters that we made up. 01:30:29.080 |
And so the lesson of computing, computer science education, 01:30:42.720 |
or we call them if-then statements or whatever, 01:30:45.960 |
but you're forced to surface those assumptions. 01:30:50.720 |
of a computing education, that and that the models, 01:30:52.600 |
the languages, and the machines are equivalent. 01:31:04.360 |
So you better get it right, or at least understand it, 01:31:06.720 |
and be able to express roughly what you want to express. 01:31:19.040 |
because at the end, it will not only make them better 01:31:29.680 |
It'll help them to understand what others are doing to them 01:31:35.440 |
'cause you're not gonna solve the problem of social media 01:31:37.720 |
insofar as you think of social media as a problem 01:31:43.920 |
It only works if people react to it appropriately 01:31:49.160 |
and therefore take control over what they're doing. 01:32:01.600 |
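As a minimal sketch of that point, assuming a hypothetical spam filter (the names and numbers below are made up for illustration, not anything from the conversation), a learned model's decision is still a program whose knobs a person chose and can surface explicitly:

# Minimal sketch: a "learned" score still becomes an action through
# assumptions that humans made up and can surface as explicit code.

ASSUMPTIONS = {
    "learning_rate": 0.01,   # how aggressively the model updates its weights
    "hidden_units": 64,      # how much capacity the network is given
    "spam_threshold": 0.8,   # how confident we must be before acting
}

def decide(spam_score: float) -> str:
    # The decision is an if-then statement; the threshold is a value
    # a person picked, not something the machine discovered on its own.
    if spam_score >= ASSUMPTIONS["spam_threshold"]:
        return "send to spam folder"
    return "deliver to inbox"

if __name__ == "__main__":
    # Surfacing the assumptions is the point: they are part of the program.
    for name, value in ASSUMPTIONS.items():
        print(f"assumption: {name} = {value}")
    print(decide(0.93))  # -> send to spam folder
    print(decide(0.42))  # -> deliver to inbox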
- One is because it's a fascinating part of your story, 01:32:07.720 |
through a pretty tense time in terms of race tensions 01:32:13.320 |
and discussions and ideas in this time in America. 01:32:42.520 |
By the way, black, African-American, person of color. 01:33:04.680 |
when you showed up to Georgia Tech for your undergrad, 01:33:12.000 |
and that was like, oh, that was a new experience. 01:33:14.520 |
So can you take me from just a human perspective, 01:33:25.840 |
And by the way, that story continues through MIT as well. 01:33:28.800 |
In fact, it was quite a bit more stark at MIT and Boston. 01:33:34.800 |
Georgia Tech was undergrad, MIT was graduate school. 01:33:45.760 |
- You didn't go on a backpacking trip in Europe. 01:33:48.960 |
In fact, I literally went to IBM for three months, 01:33:51.920 |
got in a car, and drove straight to Boston with my mother, 01:33:55.560 |
Moved into an apartment I'd never seen over the Royal East. 01:34:12.440 |
So as you said, I was born in Chattanooga, Tennessee. 01:34:16.680 |
in a moving truck at the age of three and a half, 01:34:25.800 |
Like much of the country, and certainly much of Atlanta 01:34:29.560 |
in the '70s and '80s, it was deeply, highly segregated, 01:34:33.240 |
though not in a way that I think was obvious to you 01:34:41.040 |
and Atlanta's hardly unique in this way, by highway, 01:34:45.220 |
So I grew up not only in a predominantly black area, 01:34:48.600 |
to say the very least, I grew up on the poor side of that. 01:34:52.060 |
But I was very much aware of race for a bunch of reasons, 01:34:57.640 |
one, that people made certain that I was, my family did, 01:35:07.120 |
I say I had a girlfriend, I didn't have a girlfriend, 01:35:10.760 |
in the first grade, but I do remember she decided 01:35:13.880 |
I was her boyfriend, this little white girl named Heather. 01:35:16.640 |
And we had a long discussion about how it was okay 01:35:20.320 |
despite the fact that she was white and I was black. 01:35:26.520 |
- Yes, but being a girlfriend and boyfriend in first grade 01:35:41.300 |
I was like, ah, my life is, now my life has changed 01:35:44.080 |
in first grade, no one told me elementary school 01:35:56.320 |
I didn't think too much about it, but I was aware of it. 01:36:02.220 |
it's that I wasn't aware that I was a minority. 01:36:10.240 |
I mean, I'm six years old, five years old in first grade. 01:36:13.120 |
The world is the seven people I see every day. 01:36:20.200 |
home of the Civil Rights Movement and all the rest, 01:36:24.720 |
'cause there were only three, four, five channels, right? 01:36:27.040 |
And I saw the news, which my mother might make me watch. 01:36:29.840 |
Monica Kaufman was on TV telling me the news. 01:36:35.280 |
And they were all black, and the mayor was black, 01:36:42.200 |
I remember the first day walking across campus, 01:36:48.960 |
that of the hundreds and hundreds and hundreds 01:36:59.560 |
And then of course it continued that way for, 01:37:02.240 |
well, for much of the rest of my career at Georgia Tech. 01:37:12.200 |
So I began to meet students of Asian descent, 01:37:14.480 |
and I met students who we would call Hispanic, 01:37:17.640 |
And so my world, this is what college is supposed to do, 01:37:19.680 |
right, it's supposed to open you up to people, and it did. 01:37:23.040 |
But it was a very strange thing to be in the minority. 01:37:28.040 |
When I came to Boston, I will tell you a story. 01:37:32.160 |
I applied to one place as an undergrad, Georgia Tech, 01:37:36.840 |
because I was stupid, I didn't know any better. 01:37:42.640 |
When I went to grad school, I applied to three places, 01:37:44.920 |
Georgia Tech, because that's where I was, MIT, and CMU. 01:37:59.560 |
He spent his time explaining to me about Pittsburgh, 01:38:01.880 |
much less about CMU, but more about Pittsburgh, 01:38:07.860 |
something about the sun coming out two days out of the year. 01:38:23.600 |
for a variety of reasons whether they liked Boston. 01:38:26.160 |
And 10 of them loved it, and 10 of them hated it. 01:38:33.480 |
And they explained to me very much why that was the case. 01:38:42.240 |
And I came up here, and I could see it immediately, 01:38:45.840 |
why people would love it and why people would not. 01:38:48.000 |
- White people tell you about the nice coffee shops. 01:38:56.000 |
Nice shops, oh, there's all these students here. 01:39:00.400 |
and you can walk, and something about the outdoors, 01:39:01.960 |
which I wasn't the slightest bit interested in. 01:39:03.480 |
The outdoors is for the bugs, it's not for humans. 01:39:09.560 |
- Yeah, I mean, it's the way I feel about it. 01:39:12.520 |
And the black folk told me completely different stories 01:39:14.960 |
about which part of town you did not wanna be caught in 01:39:18.040 |
after dark, and I heard all, but that was nothing new. 01:39:22.060 |
So I decided that MIT was a great place to be 01:39:25.480 |
as a university, and I believed it then, I believe it now. 01:39:37.680 |
that nobody was working on at the time, but that's okay. 01:39:41.520 |
It was great, and so I thought that I would be fine. 01:39:43.200 |
And I'd only be there for like four or five years. 01:39:45.640 |
I told myself, which turned out not to be true at all. 01:39:48.620 |
But I enjoyed my time, I enjoyed my time there. 01:39:55.800 |
that were driven by what I look like while I was here. 01:39:59.560 |
I got asked a lot of questions, I ran into a lot of cops. 01:40:06.320 |
But at the time, I mean, I haven't been here a long time, 01:40:17.800 |
I don't know if there are any radio stations anymore. 01:40:19.720 |
I'm sure there are, but I don't listen to the radio anymore, 01:40:26.800 |
But the idea is you could be in a major metropolitan area 01:40:28.840 |
and there wasn't a single black radio station, 01:40:31.560 |
that played what we would call black music then, 01:40:50.560 |
Insofar as it existed, it was uniformly distributed 01:40:57.940 |
And where you had concentrations of black Bostonians, 01:41:07.960 |
but then in high school, well, in ninth grade, 01:41:17.920 |
There was at least one major fight every week. 01:41:24.820 |
But when I went to ninth grade, I went to academy. 01:41:31.880 |
It was a magnet school, that's why I was able to go there. 01:41:40.680 |
It had 385 students, all but four of whom were black. 01:41:50.360 |
of the former mayor of Atlanta, Michael Jackson's cousin. 01:41:54.240 |
I mean, you know, it was an upper middle class-- 01:41:59.680 |
You know, I dropped the mic, dropped some names, 01:42:04.440 |
12th cousin, nine times removed, I don't know. 01:42:10.080 |
I did not come from a place where you had cars. 01:42:12.360 |
I had my first car when I came to MIT, actually. 01:42:14.840 |
So it was just a very different experience for me. 01:42:19.840 |
But I'd been to places where, whether you were rich 01:42:24.660 |
or whether you were poor, you could be black and rich 01:42:28.440 |
and there were places, and they were segregated 01:42:31.000 |
by class as well as by race, but that existed. 01:42:40.260 |
It felt like it was the interracial dating capital 01:42:54.240 |
You couldn't go up the Orange Line, at that time. 01:43:08.780 |
And that was just the greater Boston area in 1992. 01:43:18.900 |
did you feel, were there levels in which you were empowered 01:43:24.140 |
to be first, or one of the first black people 01:43:29.620 |
in some of these great institutions that you were a part of, 01:43:35.640 |
And was there a part where it was, it felt limiting? 01:43:46.380 |
So I never felt, in fact, quite the opposite. 01:43:50.260 |
Not only did I not feel as if no one was trying to stop me, 01:44:00.900 |
Not my fellow students, not that they didn't want me 01:44:07.020 |
or at least that people were happy to see me succeed 01:44:12.260 |
But you know, 1990, you're dealing with a different set 01:44:15.740 |
You're very early, at least in computer science, 01:44:18.620 |
you're very early in the sort of Jackie Robinson period. 01:44:22.540 |
Jackie Robinson syndrome, which is that the first person 01:44:27.860 |
has to be sure to succeed, because if that person fails, 01:44:32.440 |
So, you know, it was kind of in everyone's best interest. 01:44:37.460 |
I'm completely sure that people went out of their way 01:44:40.700 |
to try to make certain that the environment would be good. 01:44:47.620 |
And I was hardly the only, I was the only person 01:44:57.220 |
Less than 20 years away from the first black PhD 01:45:03.620 |
1971, something like that, somewhere around then. 01:45:07.020 |
So we weren't that far away from the first first. 01:45:12.700 |
from the first black PhD computer science, right? 01:45:15.900 |
So we were in a, it was a sort of interesting time. 01:45:26.880 |
And furthermore, I felt as if there was enough 01:45:34.120 |
though I didn't know them, who wanted to make certain 01:45:40.180 |
of the rest of the city, which I think were designed 01:45:44.140 |
in such a way that they felt no need to be supportive. 01:46:04.220 |
that no matter what, you should feel empowered? 01:46:07.860 |
You said, you used the word, I think, illusion or delusion. 01:46:12.060 |
Is there a sense from the individual perspective 01:46:15.140 |
where you should always kind of ignore, you know, 01:46:39.880 |
whether it's just hatred in its like deluded form, 01:46:43.820 |
all that kind of stuff, and just kind of see yourself 01:46:46.780 |
as empowered and confident, all those kinds of things? 01:46:50.340 |
- I mean, it certainly helps, but there's a trade-off, right? 01:46:55.020 |
I mean, you can't get a PhD unless you're crazy enough 01:47:00.140 |
I mean, that kind of massive delusion is that. 01:47:03.700 |
that you can succeed despite whatever odds you see 01:47:07.580 |
that you don't think that you need to step out 01:47:19.500 |
I've been able to find that wherever I've gone, 01:47:21.860 |
even if it wasn't necessarily on the floor that I was in. 01:47:29.140 |
So I felt supported, and certainly I had my mother 01:47:36.980 |
even if it were a long-distance call that cost money, 01:47:39.420 |
which is not something that any of the kids today 01:47:49.340 |
But you cannot be so deluded that you miss the obvious 01:47:54.140 |
and it makes you think you're doing better than you are, 01:48:35.940 |
I guess the answer is it looks exactly the way it looks now 01:48:39.940 |
because this is the world that we happen to live in. 01:48:42.620 |
It's people clustering and doing the things that they do 01:48:46.220 |
and making decisions based on one or two bits 01:48:51.460 |
which, by the way, are all positive feedback loops, 01:48:57.340 |
because you behave in a certain way that makes it true, 01:49:06.260 |
I do not, despite having made it over 50 now. 01:49:13.620 |
- God, I have a few gray hairs here and there. 01:49:17.420 |
- I think, you know, I don't imagine I will ever 01:49:22.180 |
see a police officer and not get very, very tense. 01:49:28.480 |
because it probably means you're being pulled over 01:49:32.140 |
or you're gonna get a ticket or whatever, right? 01:49:34.340 |
I mean, the interesting thing about the law in general 01:49:44.540 |
except in a few very small circumstances, right? 01:49:47.620 |
But, so that's just, that's an underlying reality. 01:49:57.540 |
halfway between Boston and Wellesley, actually. 01:50:06.380 |
that if he shot me right now, he'd get away with it. 01:50:15.360 |
is that if he shoots me now, he will get away with it. 01:50:31.000 |
odds were, would be, of course, that it wouldn't, 01:50:32.940 |
but if it became a thing that other people knew about, 01:50:51.240 |
You know, that hurts not just you, you're dead, 01:50:53.700 |
but your family, and the way people look at you, 01:51:02.300 |
I absolutely believe it would've worked had he done it. 01:51:08.180 |
He did not go out that night expecting to do that, 01:51:11.240 |
and I wouldn't be surprised if he never, ever did that, 01:51:23.160 |
- And you were basically speeding or something like that? 01:51:29.080 |
but in fact, I may not have even gotten a ticket. 01:51:35.240 |
- Yeah, apparently I moved too fast or something. 01:51:44.800 |
- So how, if we can take a little walk around your brain, 01:51:55.280 |
and how do you feel about cops after that experience? 01:52:02.880 |
is the same view I have about lots of things. 01:52:06.480 |
Fire is an important and necessary thing in the world, 01:52:10.320 |
but you must respect fire because it will burn you. 01:52:13.860 |
Fire is a necessary evil in the sense that it can burn you, 01:52:21.360 |
and all the other things that we use fire for. 01:52:23.340 |
So when I see a cop, I see a giant ball of flame, 01:52:31.880 |
- And then some people might see a nice place, 01:52:34.120 |
a nice thing to roast marshmallows with a family over. 01:52:40.120 |
- Okay, so let me go a little darker, and I apologize. 01:52:43.120 |
Just talked to Dan Carlin about it, we went for four hours. 01:52:59.760 |
And one might even argue that it is a logical conclusion. 01:53:05.100 |
On the other hand, you've got to live in the world, 01:53:13.420 |
I mean, hate is something that takes a lot of energy. 01:53:18.020 |
So one should reserve it for when it is useful 01:53:21.940 |
and not carried around with you all the time. 01:53:24.280 |
Again, there's a big difference between the happy delusion 01:53:28.140 |
that convinces you that you can actually get out of bed 01:53:30.140 |
and make it to work today without getting hit by a car, 01:53:33.660 |
and the sad delusion that means you can not worry 01:53:37.260 |
about this car that is barreling towards you, right? 01:53:46.460 |
If we go all the way back to something you said earlier 01:53:48.580 |
about empathy, I think what I would ask other people 01:53:53.580 |
to get out of this one of many, many, many stories 01:54:03.160 |
People would ask me to empathize with the police officer. 01:54:12.700 |
But being a police officer isn't even in the top 10 most dangerous jobs in the United States; 01:54:14.860 |
you're much more likely to get killed driving a taxi cab. 01:54:17.540 |
Half of the police officers who are killed are actually killed by suicide. 01:54:32.780 |
I think though that if we step back from what I feel, 01:54:37.220 |
and we step back from what an individual police officer 01:54:42.300 |
because all things tie back into interactive AI. 01:54:45.500 |
The real problem here is that we've built a narrative, 01:54:47.460 |
we built a big structure that has made it easy 01:54:50.340 |
for people to put themselves into different pots 01:55:02.300 |
It is a useful exercise to ask yourself sometimes, 01:55:04.540 |
I think, that if I had grown up in a completely 01:55:07.140 |
different house, in a completely different household, 01:55:09.340 |
as a completely different person, if I had been a woman, 01:55:13.660 |
Would I believe what that crazy person over there believes? 01:55:31.180 |
if you think Facebook is broken, how do you fix racism? 01:55:38.740 |
It's not, I mean, individual conversations matter a lot, 01:55:41.720 |
but you have to create structures that allow people 01:55:45.420 |
to have those individual conversations all the time 01:55:50.460 |
and that allows them to understand that other people 01:55:52.700 |
have had different experiences, but that ultimately 01:56:13.100 |
is actually setting up the structures in the first place, 01:56:22.860 |
have a big role in that, of selling that optimistic delusion 01:56:22.860 |
which is this particular moment in history feels 01:56:43.460 |
like there's a non-zero probability, if we go to the P, 01:56:49.380 |
of something akin to a violent or a non-violent civil war. 01:56:58.700 |
you can speak to this from perhaps a more knowledgeable 01:57:09.740 |
There's a lot of anger, and it has to do with people, 01:57:17.660 |
I think, much of it is the quiet economic pain of millions 01:57:28.500 |
because of closed businesses, because of lost dreams. 01:57:32.100 |
So that's building, whatever that tension is building. 01:57:44.620 |
from which the protests and so on percolated. 01:57:52.220 |
I mean, the very first race riots were in Boston, 01:57:59.060 |
- Going way back, I mean, like the 1700s or whatever, right? 01:58:25.260 |
to remember what happened the last time, right? 01:58:32.420 |
you said two things there that I think are worth unpacking. 01:58:35.300 |
One has to do with this sort of moment in time, 01:58:47.980 |
So I'm actually, I want to separate those things, 01:58:55.420 |
So let's separate these two things for a moment. 01:59:04.600 |
one of my three minors as an undergrad was history, 01:59:16.380 |
- And history of, and Spanish history, actually, 01:59:28.300 |
That was really, that was really fascinating. 01:59:32.060 |
from all the computer science classes I'd been taking, 01:59:34.260 |
even the cog sci classes I was taking as an undergrad. 01:59:37.780 |
Anyway, I'm not, I am a, I'm interested in history, 01:59:45.060 |
So, you know, forgive my, I will ask the audience 01:59:50.500 |
But I think the question that's always worth asking 02:00:11.620 |
Well, first off, there was a civil rights movement 02:00:12.980 |
in the '30s and '40s, it just wasn't of the same character 02:00:17.380 |
Post-World War II, lots of interesting things 02:00:36.100 |
- It could have easily happened right after World War II. 02:00:38.900 |
- Yes, I think, and again, I am not a scholar, 02:00:57.980 |
'Cause one was born 20 years after the other, whatever. 02:01:02.700 |
I think it turns out that, you know what King's 02:01:14.320 |
trying to integrate, and I forget the guy's name, 02:01:21.820 |
he was a sheriff, made a deal with the whole state of Georgia. 02:01:30.140 |
and put them in jails very far away from here. 02:01:33.280 |
And we're gonna do that, and we're not gonna, 02:01:35.200 |
there'll be no reason for the press to hang around. 02:01:49.240 |
little boys and girls being hit with fire hoses 02:01:52.500 |
and being knocked down, and there was outrage, 02:01:56.260 |
Part of the delusion is pretending that nothing bad 02:01:59.820 |
is happening that might force you to do something big 02:02:01.680 |
you don't want to do, but sometimes it gets put in your face 02:02:05.760 |
And a large part, in my view, of what happened, right, 02:02:16.080 |
but part of that delusion was that it wasn't gonna affect 02:02:17.860 |
the West or the Northeast, and of course it did, 02:02:23.880 |
and in some ways we're living with that legacy now, 02:02:35.240 |
there's not just more TV, there's social media, 02:02:45.920 |
I mean, the world is changing rapidly, right? 02:02:49.200 |
You're now seeing people you could have avoided seeing 02:02:50.920 |
most of your life growing up in a particular time, 02:02:53.960 |
and it's happening, it's dispersing at a speed 02:02:56.920 |
that is fast enough to cause concern for some people, 02:03:00.080 |
but not so fast to cause massive negative reaction. 02:03:11.880 |
I'm happy to be yelled at by a real historian. 02:03:14.080 |
- Oh yeah, I mean, there's just the obvious thing, 02:03:17.400 |
I mean, I guess you're implying, but not saying this, 02:03:20.480 |
I mean, it seemed to have percolated the most 02:03:27.960 |
It's fascinating to think that whatever the mechanisms 02:03:40.320 |
those mechanisms are the mechanisms of change. 02:03:46.160 |
I seem to be the only person who remembers this, 02:03:47.680 |
but sometime before the Rodney King incident, 02:03:55.600 |
in Southern California, and he was gonna prove it 02:03:59.040 |
by having some news, some camera people follow him around. 02:04:09.480 |
he crosses into the city, some cops pull him over, 02:04:15.400 |
They like shove his face through a glass window. 02:04:18.640 |
like I distinctly remember watching this as a kid. 02:04:22.520 |
I was in college at the time, I was in grad school at the time. 02:04:30.740 |
- Whatever that is, whatever that magic thing is. 02:04:37.680 |
Or '91, actually it must have been '90 or '91. 02:04:59.440 |
other than one thing caught on and one thing didn't. 02:05:09.200 |
And the other is there's easier and easier ways 02:05:15.480 |
But we're still finding ourselves telling the same story. 02:05:18.520 |
I would invite you to go back and read the op-eds 02:05:23.640 |
the violence is not the right answer after Rodney King. 02:05:31.720 |
It's the same words over and over and over again. 02:05:35.200 |
I mean, there's your remembering history right there. 02:05:39.920 |
and I'm surprised no one got flagged for plagiarism. 02:05:53.320 |
- You know Malcolm X was older than Martin Luther King? 02:05:55.800 |
People kind of have it in their head that he's younger. 02:06:05.640 |
and they think of Malcolm X as the young, angry, whatever. 02:06:27.320 |
One thing I will say, without taking a moral position, 02:06:53.760 |
just talking it out is gonna lead to progress, 02:06:57.120 |
but it seems like if you just look through history, 02:07:00.740 |
being irrationally upset is the way you make progress. 02:07:12.080 |
I mean, what's the difference between that and what, 02:07:13.880 |
again, without taking a moral position on this, 02:07:22.360 |
or it takes longer, or it takes a very different form. 02:07:27.220 |
a whole host of things, positive and negative, 02:07:40.800 |
I mean, many people far more talented and thoughtful 02:07:43.400 |
than I have have said this in one form or another, right? 02:07:46.000 |
That violence is the voice of the unheard, right? 02:07:53.400 |
when they feel as if they have no other option. 02:07:56.000 |
And sometimes we agree, and sometimes we disagree. 02:08:14.360 |
So another way of putting it, which I think is less, 02:08:16.760 |
let us just say, provocative, but I think is true, 02:08:45.000 |
And by the way, this isn't just about violent, political, 02:08:48.180 |
whatever, nonviolent political change, right? 02:08:49.920 |
This is true for understanding calculus, right? 02:08:53.880 |
- We're back to talking about faculty hiring. 02:08:55.560 |
- At the end of the day, in the end of the day, 02:09:01.400 |
Faculty hiring is a metaphor for all of life. 02:09:07.040 |
Do you think everything is gonna be okay in the next year? 02:09:16.760 |
- I tend to think that everything's gonna be okay, 02:09:21.800 |
My mother says something to me a lot, and always has, 02:09:47.560 |
She'll ignore me just as much as I ignored my parents 02:09:50.160 |
when I was in college and went to grad school. 02:09:52.400 |
But I think that one day, if we're all lucky, 02:09:55.920 |
you live long enough to look back on something 02:09:57.640 |
that happened a while ago, even if it was painful, 02:10:07.320 |
Do you think we'll live through the 21st century? 02:10:14.880 |
are you worried that we might destroy ourselves 02:10:16.720 |
with nuclear weapons, with AGI, with engineering? 02:10:21.400 |
but I am worried, I mean, at any given moment, right? 02:10:23.840 |
Also, but you know, at any given moment, a comet could, 02:10:32.300 |
we have a better chance than not of making it. 02:10:36.960 |
- You know, I talked to Alex Filippenko from Berkeley. 02:10:50.100 |
and they can just enter, and then we have less than a month. 02:10:53.320 |
- Yeah, and yet, you make it from day to day. 02:10:59.340 |
Well, maybe for Earth it'll pass, but not for humans. 02:11:09.620 |
at least not while I'm around, and if we are, 02:11:13.560 |
so I might as well assume it's not going to happen. 02:11:25.400 |
the This Week in Black History calendar of facts. 02:11:28.860 |
There's like a million questions I can ask here. 02:11:33.360 |
You said you're not a historian, but is there, 02:11:41.680 |
is there somebody in history, in black history, 02:11:49.220 |
that you draw personal inspiration from, or you just find interesting, 02:11:55.180 |
- Well, I find the entirety of the '40s and the '60s 02:11:59.060 |
and the civil rights movement that didn't happen 02:12:04.540 |
I mean, I've read quite a bit of the time period, 02:12:10.000 |
when I had more time to read as many things as I wanted to. 02:12:12.900 |
What was quirky about This Week in Black History 02:12:17.140 |
when I started in the '80s was how focused it was. 02:12:22.140 |
It was because of the sources I was stealing from, 02:12:25.640 |
so I'd take calendars, anything I could find, 02:12:28.160 |
Google didn't exist, right, and I just pulled 02:12:33.760 |
and I started getting people sending me information, 02:12:56.280 |
mother necessity, right, all these other things 02:13:00.760 |
and they went to the wrong state at the wrong time, 02:13:04.160 |
but they were inventing things we use, right? 02:13:17.440 |
you know, the Charles Richard Drews of the world. 02:13:19.200 |
You know, you create things that impact people, 02:13:33.740 |
I mean, look, in our world, all we really have is credit. 02:13:38.740 |
- I was always bothered by how much value is given to credit. 02:13:38.740 |
- But you got the actual, we're all gonna be dead soon. 02:14:02.980 |
The Turing Award given to three people for deep learning, 02:14:27.500 |
- Well, you know, someone asked me about immortality once, 02:14:29.660 |
and I said, and I stole this from somebody else, 02:14:35.420 |
I asked him, "What's your great-grandfather's name?" 02:14:42.540 |
I mean, I'm not entirely sure I know my grandparents' names, 02:14:47.700 |
I don't know their middle names, for example. 02:14:49.800 |
They did live in living memory, so I could find out. 02:14:53.900 |
Actually, my grandfather didn't know when he was born. 02:15:02.740 |
So in some sense, immortality is doing something, 02:15:06.020 |
preferably positive, so that your great-grandchildren 02:15:22.100 |
I don't have to know who my great-grandfather was 02:15:26.820 |
- And I don't know who my great-grandchildren are, 02:15:29.300 |
certainly who my great-great-grandchildren are, 02:15:37.900 |
in such a way that their lives will be better 02:15:39.700 |
than they would have been if I hadn't done that. 02:15:51.380 |
- I don't know if I'm afraid of death, but I don't like it. 02:16:06.500 |
This feels like a very Russian conversation, actually. 02:16:10.660 |
a very, something that happened to me recently. 02:16:13.460 |
If you look very carefully, you will see I have a scar. 02:16:17.580 |
- Which, by the way, is an interesting story of its own 02:16:20.000 |
about why people who have half of their thyroid taken out, 02:16:29.060 |
is its own interesting story, but I won't go into it. 02:16:30.780 |
Just suffice it to say, I did what I keep telling people 02:16:33.020 |
you should never do, which is never go to the doctor 02:16:34.500 |
unless you have to, because there's nothing good 02:16:36.280 |
that's ever gonna come out of a doctor's visit, right? 02:16:38.000 |
So I went to the doctor to look at one thing, 02:16:53.380 |
So I was sitting there, and to look at my thyroid, 02:17:00.220 |
and he said, "We're gonna have to take it out 02:17:05.420 |
"but if you wait 'til you're 85, that'll be really bad 02:17:10.420 |
"when you're 85 years old, if you can help it." 02:17:19.340 |
I would decide I would put it off until December 19th 02:17:25.180 |
and I wouldn't be able to say I made it to 49 or whatever, 02:17:27.940 |
so I said, "I'll wait 'til after my birthday." 02:17:30.220 |
In the first six months of that, nothing changed. 02:17:46.260 |
I don't have to take a pill or anything like that. 02:17:49.740 |
I'm in the hospital room, and the doctor comes in. 02:17:58.060 |
They're talking to me, and the anesthesiologist says, 02:18:01.200 |
"Huh, your blood pressure's through the roof. 02:18:04.260 |
I said, "No, but I'm terrified if that helps you at all." 02:18:10.220 |
who supports the anesthesiologist, if I got that right, 02:18:16.000 |
"You're gonna be feeling pretty good in a couple minutes." 02:18:20.240 |
"Well, I'm gonna feel pretty good in a couple minutes." 02:18:28.260 |
I have this distinct impression that I've met this guy, 02:18:34.060 |
but I kind of just don't remember what just happened. 02:18:39.700 |
and I'm like, "Oh, it's just like in the movies 02:18:52.840 |
And I remember thinking, "Oh, she's not talking to me." 02:18:57.260 |
And in between the time where the tiles were going by 02:19:24.660 |
and me waking up in my hospital bed, no time passed. 02:19:32.060 |
When I go to sleep and I wake up in the morning, 02:19:46.500 |
By the way, my wife was there with me talking. 02:19:52.340 |
but luckily I didn't say anything I wouldn't normally say. 02:20:02.780 |
But her point of view is I would start talking 02:20:07.220 |
and then I would wake up and leave off where I was before. 02:20:39.180 |
to discover on my own whether immortality sucks 02:20:48.420 |
- I would like to have a choice in the matter. 02:20:54.060 |
that gives it a little flavor, a little spice. 02:20:57.820 |
- Well, in reinforcement learning, we believe that. 02:21:31.120 |
you won't know who your great-grandchildren are, 02:21:45.320 |
that the entire universe, save for one trifling exception, 02:21:58.880 |
- Charles, this was one of the best conversations 02:22:03.600 |
I've ever had, and I get to see you tomorrow again 02:22:06.640 |
to hang out with who looks to be one of the most, 02:22:15.760 |
that I'll ever get to meet with Michael Littman. 02:22:26.760 |
I'm excited to see with you being involved there 02:22:35.720 |
with Charles Isbell, and thank you to our sponsors, 02:22:39.000 |
Neuro, the maker of functional sugar-free gum and mints 02:22:42.600 |
that I use to give my brain a quick caffeine boost, 02:22:46.480 |
Decoding Digital, a podcast on tech and entrepreneurship 02:23:03.680 |
Please check out these sponsors in the description 02:23:06.000 |
to get a discount and to support this podcast. 02:23:09.360 |
If you enjoy this thing, subscribe on YouTube, 02:23:19.600 |
And now let me leave you with some poetic words 02:23:28.600 |
of being pushed out of the glittering sunlight 02:23:30.840 |
of life's July and left standing amid the piercing chill 02:23:37.620 |
Thank you for listening, and hope to see you next time.