Charles Isbell and Michael Littman: Machine Learning and Education | Lex Fridman Podcast #148
Chapters
0:00 Introduction
2:27 Is machine learning just statistics?
6:49 NeurIPS vs ICML
9:05 Data is more important than algorithm
14:49 The role of hardship in education
23:33 How Charles and Michael met
28:05 Key to success: never be satisfied
31:23 Bell Labs
42:50 Teaching machine learning
53:01 Westworld and Ex Machina
61:00 Simulation
67:49 The college experience in the times of COVID
96:27 Advice for young people
103:19 How to learn to program
114:43 Friendship
00:00:00.000 |
The following is a conversation with Charles Isbell and Michael Littman. 00:00:03.360 |
Charles is the dean of the College of Computing at Georgia Tech, 00:00:07.000 |
and Michael is a computer science professor at Brown University. 00:00:10.480 |
I've spoken with each of them individually on this podcast, 00:00:14.480 |
and since they are good friends in real life, 00:00:17.000 |
we all thought it would be fun to have a conversation together. 00:00:20.920 |
Quick mention of each sponsor, followed by some thoughts related to the episode. 00:00:28.320 |
the all-in-one drink that I start every day with to cover all my nutritional bases, 00:00:32.680 |
Eight Sleep, a mattress that cools itself and gives me yet another reason to enjoy sleep, 00:00:38.120 |
Masterclass, online courses from some of the most amazing humans in history, 00:00:43.280 |
and Cash App, the app I use to send money to friends. 00:00:46.720 |
Please check out the sponsors in the description to get a discount 00:00:52.280 |
As a side note, let me say that having two guests on the podcast is an experiment 00:01:02.400 |
I would like to occasionally be a kind of moderator for debates 00:01:06.600 |
between people that may disagree in some interesting ways. 00:01:10.120 |
If you have suggestions for who you would like to see debate on this podcast, 00:01:15.800 |
As with all experiments of this kind, it is a learning process. 00:01:20.160 |
Both the video and the audio might need improvement. 00:01:23.160 |
I realized I think I should probably do three or more cameras next time, 00:01:28.400 |
and also try different ways to mount the microphone for the third person. 00:01:36.480 |
I'm going to have to go figure out the thumbnail for the video version of the podcast, 00:01:41.800 |
since I usually put the guest's head on the thumbnail, 00:01:45.120 |
and now there's two heads and two names to try to fit into the thumbnail. 00:01:55.080 |
which in theoretical computer science happens to be an NP-hard problem. 00:02:01.560 |
Whatever I come up with, if you have better ideas for the thumbnail, 00:02:06.120 |
And in general, I always welcome ideas how this thing can be improved. 00:02:09.840 |
If you enjoy it, subscribe on YouTube, review it with five stars on Apple Podcasts, 00:02:14.720 |
follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman. 00:02:21.040 |
And now, here's my conversation with Charles Isbell and Michael Littman. 00:02:26.040 |
You'll probably disagree about this question, 00:02:29.760 |
but what is your biggest, would you say, disagreement about either something 00:02:34.920 |
profound and very important or something completely not important at all? 00:02:39.440 |
I don't think I have any disagreements at all. 00:02:45.360 |
So one thing that you sometimes mention is that, 00:02:51.440 |
whether or not machine learning is computational statistics. 00:02:58.400 |
And in particular, and more importantly, it is not just computational statistics. 00:03:14.040 |
If it were just the statistics, then we would be happy with where we are. 00:03:22.640 |
I agree that machine learning is not just statistics. 00:03:30.080 |
What is the computational in computational statistics? 00:03:32.760 |
Does this take us into the realm of computing? 00:03:35.600 |
But I think perhaps the way I can get him to admit that he's wrong is that it's about rules. 00:03:50.120 |
It's not just a random variable that you choose and you have a probability... 00:03:52.560 |
I think you have a narrow view of statistics. 00:03:54.760 |
Well, then what would be the broad view of statistics that would still allow it to be statistics and not, say, history, that would make computational statistics okay? 00:04:03.320 |
So I had my first sort of research mentor, a guy named Tom Landauer, taught me to do some statistics. 00:04:12.360 |
And I was annoyed all the time because the statistics would say that what I was doing was not statistically significant. 00:04:21.760 |
And basically what he said to me is statistics is how you're going to keep from lying to yourself, which I thought was really deep. 00:04:28.880 |
It is a way to keep yourself honest in a particular way. 00:04:43.800 |
Even regular statisticians, non-computational statisticians, do spend some of their time evaluating rules, right? 00:04:51.160 |
Applying statistics to try and understand, does this rule capture this? 00:04:55.640 |
You mean like hypothesis testing kind of thing? 00:05:01.560 |
Like I feel like the word statistic literally means like a summary, like a number that summarizes other numbers. 00:05:06.840 |
But I think the field of statistics actually applies that idea to things like rules, to understand whether or not a rule is valid. 00:05:21.160 |
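As a toy illustration of the distinction being drawn here — a statistic as a summary number, and hypothesis testing as a way to keep from lying to yourself about whether a rule really works — the following minimal sketch is not from the conversation; it assumes NumPy and SciPy and uses made-up accuracy numbers for two hypothetical rules.

```python
# Toy illustration: a "statistic" is a summary number, and a hypothesis test
# uses such summaries to judge whether "rule B beats rule A" is supported.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
old_scores = rng.normal(loc=0.70, scale=0.05, size=30)  # hypothetical accuracy of rule A
new_scores = rng.normal(loc=0.74, scale=0.05, size=30)  # hypothetical accuracy of rule B

mean_diff = new_scores.mean() - old_scores.mean()        # a summary statistic
t_stat, p_value = stats.ttest_ind(new_scores, old_scores)

print(f"mean difference = {mean_diff:.3f}, p-value = {p_value:.3g}")
# A small p-value suggests the observed difference is unlikely if the two
# rules actually perform the same -- the "keeping yourself honest" part.
```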
Because I think it's useful to think about a lot of what AI and machine learning is, or certainly should be, as software engineering, as programming languages. 00:05:29.280 |
Just to put it in language that you might understand, it's the hyperparameters beyond the problem itself. 00:05:35.520 |
The hyperparameters is too many syllables for me to understand. 00:05:44.800 |
It's the loss function you choose to focus on. 00:05:45.680 |
You want to say the practice of machine learning is different than the practice of statistics. 00:05:50.000 |
Like the things you have to worry about and how you worry about them are different, therefore they're different. 00:05:54.800 |
I mean, at the very least, that much is true. 00:05:58.920 |
It doesn't mean that statistics, computational or otherwise, aren't important. 00:06:06.880 |
I think that we could think about game theory in terms of statistics, but I don't think it's as useful to do. 00:06:12.280 |
I mean, the way I would think about it, or a way I would think about it is this way. 00:06:18.760 |
But I don't think it's as useful to think about chemistry as being just physics. 00:06:25.200 |
The level of abstraction really matters here. 00:06:26.920 |
So I think it is, there are contexts in which it is useful. 00:06:31.480 |
And so finding that connection is actually helpful. 00:06:33.120 |
And I think that's when I emphasize the computational statistics thing. 00:06:36.080 |
I think I want to befriend statistics and not absorb them. 00:06:41.280 |
Here's a way to think about it beyond what I just said, right? 00:06:44.720 |
So what would you say, and I want you to think back to a conversation we had a very long time ago. 00:06:49.240 |
What would you say is the difference between, say, the early 2000s ICML and what we used to call NIPS, now NeurIPS? 00:06:58.000 |
A lot of, particularly on the machine learning that was done there? 00:07:11.320 |
I think my most cited ICML paper is from '94. 00:07:14.840 |
Michael knows this better than me because, of course, he's significantly older than I am. 00:07:17.880 |
But the point is, what is the difference between ICML and NeurIPS in the late '90s, early 2000s? 00:07:24.320 |
I don't know what everyone else's perspective would be, but I had a particular perspective at that time, which is I felt like ICML was more of a computer science 00:07:32.800 |
place and that NIPS/NeurIPS was more of an engineering place, like the kind of math that happened at the two places. 00:07:39.880 |
As a computer scientist, I felt more comfortable with the ICML math and the NeurIPS people would say that that's because I'm dumb. 00:07:53.280 |
We actually had a nice conversation with Tom Dietterich about this on Twitter just a couple of days ago. 00:07:58.440 |
I put it a little differently, which is that ICML was machine learning done by computer scientists and NeurIPS was machine learning done by computer scientists trying to impress statisticians. 00:08:10.640 |
Which was weird because it was the same people, at least by the time I started paying attention. 00:08:18.800 |
And I think that that perspective of whether you're trying to impress the statisticians or you're trying to impress the programmers is actually very different and has real impact on what you do. 00:08:28.400 |
You choose to worry about and what kind of outcomes you come to. 00:08:32.720 |
And computational statistics is a means to an end. 00:08:35.800 |
And I think that really matters here in the same way that I don't think computer science is just engineering or just science or just math or whatever. 00:08:43.080 |
Okay, so I'd have to now agree that we agree on everything. 00:08:47.540 |
The important thing here is that my opinions may have changed, but not the fact that I'm right, I think is what we just came to. 00:08:54.560 |
Right. And my opinions may have changed and not the fact that I'm wrong. 00:09:05.880 |
How do neural networks change this, just to even linger on this topic, change this idea of statistics? 00:09:14.500 |
How big of a pie statistics is within the machine learning thing? 00:09:18.820 |
Like, because it sounds like hyperparameters and also just the role of data. 00:09:24.180 |
You know, people are starting to use this terminology of software 2.0, which is like the act of programming as a... 00:09:33.300 |
Like you're a designer in the hyperparameter space of neural networks, and you're also the collector and the organizer and the cleaner of the data. 00:09:46.940 |
So how did, on the NeurIPS versus ICML topic, what's the role of neural networks in redefining the size and the role of machine learning? 00:09:57.060 |
I can't wait to hear what Michael thinks about this, but I would add one. 00:10:04.140 |
I think there's one other thing I would add to your description, which is the kind of software engineering part of what does it mean to debug, for example. 00:10:10.900 |
But this is a difference between the kind of computational statistics view of machine learning and the computational view of machine learning, which is, I think, one is worried about the equation, as it were. 00:10:20.740 |
And by the way, this is not a value judgment. 00:10:24.580 |
But the kind of questions you would ask, you start asking yourself, well, what does it mean to program and develop and build the system, is a very computer science-y view of the problem. 00:10:33.020 |
I mean, if you get on data science Twitter and econ Twitter, you actually hear this a lot with the economist and the data scientist complaining about the machine learning people. 00:10:46.220 |
And I don't know why they don't see this, but they're not even asking the same questions. 00:10:49.300 |
They're not thinking about it as a kind of programming problem. 00:10:52.780 |
And I think that that really matters, just asking this question. 00:10:55.660 |
I actually think it's a little different from programming and hyperparameter space and sort of collecting the data. 00:11:02.900 |
But I do think that that immersion really matters. 00:11:05.980 |
So I'll give you a quick example, the way I think about this. 00:11:09.780 |
Michael and I have co-taught a machine learning class, which has now reached, I don't know, 10,000 people at least over the last several years or somewhere thereabouts. 00:11:16.940 |
And my machine learning assignments are of this form. 00:11:21.180 |
So the first one is something like implement these five algorithms, you know, k-NN and SVMs and boosting and decision trees and neural networks. 00:11:32.580 |
And when I say implement, I mean steal the code. 00:11:35.980 |
You get zero points for getting the thing to work. 00:11:38.500 |
I don't want you spending your time worrying about getting the corner case right of, you know, what happens when you are trying to normalize distances and the points on the thing. 00:11:50.620 |
However, you're going to run those algorithms on two data sets. 00:11:58.820 |
Well, a data set is interesting if it reveals differences between algorithms, which presumably are all the same because they can represent whatever they can represent. 00:12:05.940 |
And two data sets are interesting together if they show different differences, as it were. 00:12:11.860 |
You have to justify their interestingness and you have to analyze them in a whole bunch of ways. 00:12:14.740 |
But all I care about is the data in your analysis, not the programming. 00:12:18.300 |
And I occasionally end up in these long discussions with students. 00:12:22.580 |
I copy and paste the things that I've said the other 15,000 times it's come up, which is, they go, "But the only way to really learn, 00:12:28.980 |
to really understand, is to code them up," which is a very programmer, software-engineering view of the world. 00:12:35.140 |
If you don't program it, you don't understand it, which is, by the way, I think is wrong in a very specific way. 00:12:40.180 |
But it is a way that you come to understand because then you have to wrestle with the algorithm. 00:12:44.500 |
But the thing about machine learning is not just sorting numbers where in some sense the data doesn't matter. 00:12:49.100 |
What matters is, well, does the algorithm work on these abstract things? 00:12:59.740 |
And so as a result, you have to live with the data and don't get distracted by the algorithm per se. 00:13:04.820 |
And I think that that focus on the data and what it can tell you and what question it's actually answering for you, as opposed to the question you thought you were asking, is a key and important thing about machine learning and is a way that computationalists, as opposed to statisticians, bring a particular view about how to think about the process. 00:13:23.300 |
The statisticians, by contrast, bring, I think I'd be willing to say, a better view about the kind of formal math that's behind it and what an actual number ultimately is saying about the data. 00:13:35.700 |
And those are both important, but they're also different. 00:13:37.740 |
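For a concrete picture of the kind of assignment being described — run several off-the-shelf learners on two data sets and focus the analysis on the data rather than on implementing the algorithms — here is a minimal sketch, not the actual course code, assuming scikit-learn and two of its bundled datasets.

```python
# Illustrative sketch only (not the actual assignment): compare several
# standard classifiers on two datasets, so the analysis centers on the data.
from sklearn.datasets import load_digits, load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

algorithms = {
    "k-NN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Boosting": AdaBoostClassifier(),
    "Decision tree": DecisionTreeClassifier(),
    "Neural network": MLPClassifier(max_iter=1000),
}

datasets = {
    "digits": load_digits(return_X_y=True),
    "breast_cancer": load_breast_cancer(return_X_y=True),
}

# The interesting part is not the raw scores but where the two datasets
# rank the algorithms differently -- the "different differences."
for data_name, (X, y) in datasets.items():
    for algo_name, model in algorithms.items():
        score = cross_val_score(model, X, y, cv=5).mean()
        print(f"{data_name:>13} | {algo_name:<14} | {score:.3f}")
```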
- I didn't really think of it this way: to build intuition about the role of data, the different characteristics of data, by having two data sets that are different and that reveal the differences in the differences. 00:13:50.820 |
That's a really fascinating, that's a really interesting educational approach. 00:13:57.060 |
- No, they love it at the end. - They love it later. 00:13:58.620 |
- They love it at the end, not at the beginning. 00:14:04.020 |
- I feel like there's a deep, profound lesson about education there. 00:14:07.940 |
- That you can't listen to students about whether what you're doing is the right or the wrong thing. 00:14:16.060 |
- Well, as a wise Michael Littman once said to me about children, which I think applies to teaching, is you have to give them what they need without bending to their will. 00:14:29.620 |
Your whole job is to curate and to present, because on their own, they're not going to necessarily know where to search. 00:14:34.980 |
So you're providing pushes in some direction in the learning space. 00:14:38.940 |
And you have to give them what they need in a way that keeps them engaged enough so that they eventually discover what they want and they get the tools they need to go and learn other things off of. 00:14:52.020 |
Let me put on my Russian hat, which believes that life is suffering. 00:15:01.940 |
- What do you think is the role of, we talked about balance a little bit. 00:15:08.020 |
What do you think is the role of hardship in education? 00:15:11.140 |
Like, I think the biggest things I've learned, like what made me fall in love with math, for example, is by being bad at it until I got good at it. 00:15:24.260 |
So like struggling with a problem, which increased the level of joy I felt when I finally figured it out. 00:15:33.380 |
And it always felt with me, with teachers, especially modern discussions of education, how can we make education more fun, more engaging, more all those things? 00:15:44.180 |
Or from my perspective, it's like you're maybe missing the point that education, that life is suffering. 00:15:51.540 |
Education is supposed to be hard and that actually what increases the joy you feel when you actually learn something. 00:16:03.900 |
- Okay, so this may be a point where we differ. 00:16:11.900 |
- Okay, well, I was gonna not answer the question. 00:16:14.140 |
- So you don't want the students to know you enjoy them suffering? 00:16:18.900 |
I was gonna say that there's, I think there's a distinction that you can make in the kind of suffering, right? 00:16:24.940 |
So I think you can be in a mode where you're suffering in a hopeless way versus you're suffering in a hopeful way, right? 00:16:33.340 |
Where you're like, you can see that you still have, you can still imagine getting to the end, right? 00:16:41.900 |
And as long as people are in that mindset where they're struggling, but it's not a hopeless kind of struggling, that's productive. 00:16:50.460 |
But if struggling, like if you break their will, if you leave them hopeless, no, that don't, sure, some people are gonna, whatever, lift themselves up by their bootstraps. 00:17:00.380 |
But like mostly you give up and certainly it takes the joy out of it. 00:17:03.420 |
And you're not gonna spend a lot of time on something that brings you no joy. 00:17:10.300 |
You have to thwart people in a way that they still believe that there's a way through. 00:17:17.140 |
So that's a, that we strongly agree, actually. 00:17:19.900 |
So I think, well, first off, struggling and suffering aren't the same thing, right? 00:17:25.020 |
- Oh, no, no, I actually appreciate the poetry. 00:17:27.580 |
And I, one of the reasons I appreciate it is that they are often the same thing and often quite different, right? 00:17:37.100 |
You don't necessarily have to struggle to suffer. 00:17:38.700 |
So I think that you want people to struggle, but that hope matters. 00:17:42.860 |
You have to, they have to understand that they're gonna get through it on the other side. 00:17:48.940 |
I actually think Brown University has a very, just philosophically has a very different take 00:17:55.260 |
on the relationship with their students, particularly undergrads from say, a place like Georgia Tech, which is-- 00:18:03.180 |
- I mean, remember Charles said, "It doesn't matter what the facts are, I'm always right." 00:18:07.020 |
- The correct answer is that it doesn't matter. 00:18:09.580 |
They're different, but they're clearly answers to that. 00:18:13.020 |
- He went to a school like the school where he is as an undergrad. 00:18:18.060 |
I went to a school, specifically the same school, 00:18:20.860 |
though it changed a bit in the intervening years. 00:18:25.980 |
- Yeah, and I went to an undergrad place that's a lot like the place where I work now. 00:18:29.580 |
And so it does seem like we're more familiar with these models. 00:18:32.940 |
- There's a similarity between Brown and Yale? 00:18:38.700 |
- Duke has some similarities too, but it's got a little Southern draw. 00:18:42.620 |
- You've kind of worked, you've sort of worked at universities that are like the places where you 00:18:51.580 |
- Are you uncomfortable venturing outside the box? 00:18:59.820 |
- He only goes to places that have institute in the name, right? 00:19:05.980 |
Well, no, I was a visiting scientist at UPenn, or visiting something at UPenn. 00:19:19.340 |
- The institute is in the, that Charles only goes to places that have institute in the name. 00:19:24.860 |
So I guess Georgia, I forget that Georgia Tech is Georgia Institute of Technology. 00:19:30.380 |
- The number of people who refer to it as Georgia Tech University is large and incredibly irritating. 00:19:36.380 |
This is one of the few things that genuinely gets under my skin. 00:19:38.940 |
- But like schools like Georgia Tech and MIT have, as part of the ethos, like there is, 00:19:43.660 |
I wanna say there's an abbreviation that someone taught me, like IHTFP, something like that. 00:19:49.500 |
Like there's an expression which is basically, I hate being here, which they say so proudly. 00:19:54.780 |
And that is definitely not the ethos at Brown. 00:19:57.820 |
Like Brown is, there's a little more pampering and empowerment and stuff. 00:20:02.060 |
And it's not like we're gonna crush you and you're gonna love it. 00:20:04.860 |
So yeah, I think there's a, I think the ethoses are different. 00:20:12.380 |
- In order to graduate from Georgia Tech, this is a true thing, feel free to look it up. 00:20:18.700 |
- No, actually Georgia Tech was really the first-- 00:20:22.220 |
- I feel like Georgia Tech was the first in a lot of ways. 00:20:32.140 |
First master's in computer science, actually. 00:20:38.700 |
- We had the first information and computer science master's degree in the country. 00:20:42.700 |
But the Georgia Tech, it used to be the case in order to graduate from Georgia Tech, 00:20:48.140 |
you had to take a drownproofing class, where effectively, 00:20:50.700 |
they threw you in the water, tied you up, and if you didn't drown, you got to graduate. 00:20:56.380 |
- You basically, there were certainly versions of it, but I mean, luckily, 00:20:59.260 |
they ended it just before I had to graduate, because otherwise I would have never graduated. 00:21:04.060 |
I wanna say '84, '83, someone around then, they ended it, but yeah, 00:21:08.940 |
you used to have to prove you could tread water for some ridiculous amount of time. 00:21:13.660 |
- You couldn't graduate. No, it was more than two minutes. 00:21:25.020 |
- I bet it was that and not tied up, because like, who needs to learn how to swim when you're tied? 00:21:30.220 |
Nobody, but who needs to learn to swim when you're actually falling into the water dressed? 00:21:34.700 |
- I think your facts are getting in the way with a good story. 00:21:40.380 |
- Sometimes the narrative matters more, but whatever it was, you had to, 00:21:52.220 |
That's a part of what Georgia Tech has always been, and we struggle with that, by the way, 00:21:56.220 |
about what we want to be, particularly as things go. 00:21:59.740 |
But you sort of, how much can you be pushed without breaking? 00:22:06.540 |
And you come out of the other end stronger, right? 00:22:08.780 |
There's this saying we used to have when I was an undergrad there, 00:22:10.460 |
which was Georgia Tech, building tomorrow the night before. 00:22:13.820 |
Right, and it was just kind of idea that, you know, 00:22:17.660 |
give me something impossible to do, and I'll do it in a couple of days, 00:22:20.780 |
because that's what I just spent the last four or five or six years doing. 00:22:26.060 |
Having now done a number of projects with you, 00:22:31.180 |
There's nothing wrong with waiting until the last minute. 00:22:33.500 |
The secret is knowing when the last minute is. 00:22:37.980 |
- Yeah, that is a definite Charles statement that I am trying not to embrace. 00:22:43.940 |
- Wow, and I appreciate that, because you helped move my last minute up. 00:22:47.180 |
- That's the social construct, the way you converge together, 00:22:54.540 |
In fact, MIT, you know, I'm sure a lot of universities have this, 00:22:58.380 |
but MIT has like MIT time that everyone is always agreed together 00:23:03.580 |
that there is such a concept and everyone just keeps showing up like 10 to 15 to 20, 00:23:08.620 |
depending on the department, late to everything. 00:23:16.380 |
- In fact, the classes will say, you know, well, this is no longer true, actually. 00:23:20.220 |
But it used to be a class would start at eight, but actually it started at 8.05, 00:23:25.740 |
Everything's five minutes off, and nobody expects anything to start 00:23:28.620 |
until five minutes after the half hour, whatever it is. 00:23:32.700 |
- Well, let's rewind the clock back to the '50s and '60s when you guys met. 00:23:42.940 |
So like the internet and the world kind of knows you as connected in some ways 00:23:50.140 |
in terms of education, of teaching the world. 00:23:54.620 |
But how did you as human beings and as collaborators meet? 00:24:01.660 |
One is how we met, and the other is how we got to know each other. 00:24:08.140 |
I'm gonna say that we came to understand that we had some common something. 00:24:13.580 |
Yeah, it's funny, 'cause on the surface, I think we're different in a lot of ways. 00:24:20.180 |
- Yeah, I mean, now we just leave each other's... 00:24:22.540 |
- So I will tell the story of how we met, and I'll let Michael tell the story of how we met. 00:24:29.980 |
I was already at that point, it was AT&T Labs. 00:24:33.980 |
But anyway, I was there, and Michael was coming to interview. 00:24:38.620 |
He was a professor at Duke at the time, but decided for reasons that he wanted to be in New Jersey. 00:24:49.500 |
Interviews are very much like academic interviews. 00:24:52.380 |
We all had to meet with him afterwards and so on, one-on-one. 00:24:56.060 |
But it was obvious to me that he was gonna be hired. 00:25:00.860 |
They were just talking about all the great stuff he did. 00:25:03.100 |
"Oh, he did this great thing, and you had just won something at AAAI, I think, 00:25:06.300 |
or maybe you got 18 papers in AAAI that year." 00:25:08.060 |
- "I got the best paper award at AAAI for the crossword stuff." 00:25:11.900 |
So that had all happened, and everyone was going on and on and on about it. 00:25:14.620 |
Actually, so Satinder was saying incredibly nice things about you. 00:25:20.860 |
- He was grumpily saying very nice things about you. 00:25:31.100 |
- So he remembers meeting me as inconveniencing his afternoon. 00:25:35.980 |
I was in the middle of trying to do something. 00:25:38.860 |
And for reasons that are purely accidental, despite what Michael thinks, 00:25:42.460 |
my desk at the time was set up in such a way that it had sort of an L shape, 00:25:46.540 |
and the chair on the outside was always lower than the chair that I was in. 00:25:52.380 |
- The only reason I think that it was on purpose is because you told me it was on purpose. 00:25:58.620 |
- His guest chair was really low so that he could look down at everybody. 00:26:02.540 |
- The idea was just to simply create a nice environment, as if you were asking for a mortgage, 00:26:09.260 |
Anyway, so we sat there, and we just talked for a little while. 00:26:12.060 |
And I think he got the impression that I didn't like him, which wasn't true. 00:26:18.540 |
And right after the talk, I said to my host, Michael Kearns, who ultimately was my boss-- 00:26:23.740 |
I'm a friend and a huge fan of Michael, yeah. 00:26:45.740 |
- As a game, not his skill level, 'cause I'm pretty sure he's-- 00:26:49.820 |
All right, there's some competitiveness there. 00:26:51.420 |
But the point is that it was like the middle of the day. 00:26:56.380 |
But then in the middle of the day, I gave a job talk. 00:27:08.620 |
Because that was so bad that it'd just be embarrassing 00:27:16.060 |
Like, it's just, let's just forget this ever happened. 00:27:21.420 |
- That's one of the most Michael Littman set of sentences 00:27:21.420 |
And I remember him being very much the way I remember him now 00:27:40.300 |
- The chair thing and the low voice, I think. 00:28:03.660 |
- And then I got hired and I was in the group. 00:28:05.260 |
- Can we take a slight tangent on this topic of, 00:28:08.220 |
it sounds like, maybe you could speak to the bigger picture. 00:28:23.340 |
- Yeah, that was like a three out of 10 response. 00:28:29.580 |
You know, I remember Marvin Minsky said on a video interview 00:28:35.100 |
something that the key to success in academic research 00:28:44.060 |
- I think I followed that because I hate everything he's done. 00:28:53.260 |
But do you find that resonates with you at all 00:29:02.140 |
- That's such an MIT view of the world though. 00:29:04.300 |
So I remember talking about this when, as a student, you know, 00:29:09.260 |
I will clean it up for the purposes of the podcast. 00:29:12.060 |
My work is crap, my work is crap, my work is crap, my work is crap. 00:29:16.060 |
Then you like go to a conference or something. 00:29:27.520 |
- Yes. I've never hated my work, but I have been dissatisfied with it. 00:29:35.920 |
being okay with the fact that you've taken a positive step, 00:29:42.240 |
That's important because that's a part of the hope, right? 00:29:45.040 |
But you have to, but I haven't gotten there yet. 00:29:47.360 |
If that's not there, that I haven't gotten there yet, 00:29:53.280 |
So I buy that, which is a little different from hating everything that you do. 00:29:56.720 |
I mean, there's things that I've done that I like better than I like myself. 00:30:01.120 |
So it's separating me from the work, essentially. 00:30:06.640 |
but sometimes the work I'm really excited about. 00:30:11.120 |
So I found the work that I've liked, that I've done, most of it, 00:30:16.320 |
I liked it in retrospect more when I was far away from it in time. 00:30:21.040 |
I have to be fairly excited about it to get done. 00:30:24.000 |
- No, excited at the time, but then happy with the result. 00:30:26.800 |
But years later, or even I might go back, "You know what? 00:30:31.760 |
Or, "Oh gosh, it turns out I've been thinking about that. 00:30:34.080 |
It's actually influenced all the work that I've done since without realizing it." 00:30:46.960 |
I think there's something to the idea you've got to hate what you do, 00:30:52.400 |
And different people motivate themselves differently. 00:30:54.160 |
I don't happen to motivate myself with self-loathing. 00:30:56.480 |
I happen to motivate myself with something else. 00:30:58.240 |
- So you're able to sit back and be proud of, 00:31:04.400 |
- Well, and it's easier when you can connect it with other people 00:31:10.720 |
- And then you can still safely hate yourself privately. 00:31:15.040 |
Or at least win-lose, which is what you're looking for. 00:31:28.400 |
because we didn't do much research together at AT&T. 00:31:37.760 |
but that was one of the most magical places, historically speaking. 00:31:46.080 |
I feel like there's a profound lesson in there, too. 00:31:57.600 |
and people who really believed in machine learning 00:32:02.320 |
Let's get the people who are thinking about this 00:32:17.760 |
- And to be clear, he's gotten to be at Bell Labs. 00:32:22.000 |
- Yeah, it would have been '91, as a grad student. 00:32:24.400 |
So I was there for a long time, every summer, except for-- 00:32:33.360 |
- So Bell Labs was several locations for the research? 00:32:42.560 |
- But they were in a couple places in Jersey. 00:32:53.280 |
on the cricket pitch at Bell Labs at Murray Hill. 00:32:57.520 |
when it split off with Lucent during what we called 00:33:00.720 |
- Are you better than Michael Kearns at ultimate frisbee? 00:33:14.480 |
Okay, I have played on a championship-winning 00:33:25.200 |
- I know how young he was when he was younger. 00:33:30.400 |
Michael was a much better basketball player than I was. 00:33:36.960 |
- Let's be clear, I've not played basketball with you. 00:33:50.080 |
Anyway, but we were talking about something else, 00:33:55.680 |
So this was kind of cool about what was magical about it. 00:34:01.200 |
is that Bell Labs was an arm of the government, right? 00:34:06.720 |
And every month you paid a little thing on your phone bill, 00:34:10.640 |
which turned out was a tax for all the research 00:34:17.440 |
- The big bang or whatever, the cosmic background radiation. 00:34:21.200 |
They had some amazing stuff with directional microphones, 00:34:23.520 |
I got to go in this room where they had all these panels 00:34:28.720 |
And one another, and he'd move some panels around. 00:34:30.720 |
And then he'd have me step two steps to the left, 00:34:40.080 |
- which is deeply disturbing to hear your heartbeat. 00:34:44.800 |
There's so much, all this sort of noise around. 00:34:50.320 |
the purest sense of a university, but without students. 00:34:52.960 |
So it was all the faculty working with one another 00:34:58.880 |
you know, during the summer and they would go away. 00:35:00.640 |
But it was just this kind of wonderful experience. 00:35:04.480 |
In fact, I would often have to walk out my door 00:35:11.600 |
the proper way to prove something or another. 00:35:22.560 |
And it was just a place where you could think thoughts. 00:35:25.120 |
And it was okay because so long as once every 25 years or so, 00:35:29.200 |
somebody invented a transistor, it paid for everything else. 00:35:35.440 |
it became harder and harder and harder to justify it 00:35:39.280 |
as far as the folks who were very far away were concerned. 00:35:46.240 |
that you never had a chance to really build a relationship. 00:35:51.360 |
So when the diaspora happened, it was amazing, right? 00:35:55.040 |
- Everybody left and I think everybody ended up 00:36:06.960 |
And as a professor, anyway, the answer that I would give is, 00:36:09.920 |
well, probably Bell Labs in some very real sense. 00:36:19.200 |
And Microsoft Research is great and Google does good stuff 00:36:22.160 |
and you can pick IBM, you can tell if you want to, 00:36:33.680 |
about the physical proximity and the chance collisions? 00:36:39.280 |
where everyone is maybe trying to see the silver lining 00:36:46.080 |
Is there, one of the things that people like faculty 00:36:58.640 |
everything is about meetings that are supposed to be, 00:37:00.880 |
there's not a chance to just talk about comic book 00:37:07.040 |
- So it's funny you say this because that's how we met. 00:37:10.880 |
So I'll let Michael say that, but I'll just add one thing, 00:37:12.640 |
which is just that research is a social process 00:37:15.520 |
and it helps to have random social interactions, 00:37:22.560 |
One of the great things about the AI lab when I was there, 00:37:28.640 |
but we had entire walls that were whiteboards 00:37:31.680 |
and they would just write and people would walk up 00:37:36.000 |
and you got so much out of the freedom to do that. 00:37:44.240 |
which I would sometimes find deeply irritating, 00:37:49.440 |
But the sort of pointlessness and the interaction 00:37:51.920 |
was in some sense the point, at least for me. 00:37:56.960 |
I mentioned Josh Tenenbaum and he's very much, 00:37:59.840 |
he's such an inspiration in the child-like way 00:38:07.680 |
It doesn't even have to be about machine learning 00:38:11.520 |
He'll just pull you into a closest writable surface, 00:38:14.960 |
which is still, you can find whiteboards at MIT everywhere. 00:38:23.760 |
and talk for a couple hours about some aimless thing. 00:38:33.920 |
- It's definitely something worth missing in this world 00:38:42.640 |
- Whenever I wonder myself whether MIT really is as great 00:38:54.160 |
of what particular institutions stand for, right? 00:38:57.760 |
I mean, I don't, my guess is he's unaware of this. 00:39:02.080 |
- That the masters are not aware of their mastery. 00:39:28.320 |
We were given an opportunity to do a job search 00:39:38.240 |
and you would go to my office to keep me from working. 00:39:43.360 |
You had decided that there was really no point 00:39:46.400 |
'cause our relationship with the company was done. 00:39:49.360 |
- Yeah, but remember, I felt that way beforehand. 00:40:08.880 |
like things, like right before the ship was about to sink, 00:40:30.320 |
"Okay, wait, maybe we can save a couple of these people 00:40:32.560 |
if we can have them do something really useful." 00:40:39.280 |
we were gonna make basically an automated assistant 00:40:48.720 |
across all sorts of your personal information. 00:41:15.280 |
is that our boss's boss is a guy named Ron Brachman. 00:41:42.320 |
trying to implement this vision that Ron Brachman had 00:41:52.240 |
from what is possibly the greatest job of all time, 00:41:54.800 |
I think about, well, we kind of helped birth Siri. 00:42:02.960 |
But we got to spend a lot of time in his office 00:42:07.600 |
- We got to spend a lot of time in my office. 00:42:16.400 |
which was what he always dreamed he would do. 00:42:35.280 |
It was inspirational to see things go that way. 00:42:48.080 |
and the work that you were doing at Georgia Tech. 00:43:02.800 |
between Georgia Tech and Udacity to make this happen. 00:43:09.040 |
but wouldn't tell me what was actually going on. 00:43:35.360 |
so there's a little bit of jumping around in time. 00:43:44.080 |
I mean, maybe Charles, maybe this is a Charles story. 00:43:49.200 |
- And worked on things like revamping the curriculum, 00:43:53.600 |
so that it had some kind of semblance of modular structure, 00:43:59.360 |
moving from a fairly narrow, specific set of topics 00:44:03.520 |
to touching a lot of other parts of intellectual life. 00:44:08.160 |
And the curriculum was supposed to reflect that. 00:44:15.840 |
- And for my labors, I ended up the associate dean. 00:44:28.480 |
they will give you responsibility to do more of that thing. 00:44:47.920 |
depending on what you're trying to do that week, 00:44:51.040 |
- Well, one of the problems with the word work, 00:45:02.480 |
that's one of, you know, we talked about balance. 00:45:09.520 |
It always rubbed me the wrong way as a terminology. 00:45:17.600 |
- Oh, I can't tell you how much time I'd spend, 00:45:25.440 |
"I cannot believe they're paying me to do this." 00:45:46.640 |
well, we texted almost every day during the period. 00:45:53.920 |
the machine learning conference was in Atlanta. 00:45:56.480 |
I was the chair, the general chair of the conference. 00:46:00.320 |
Charles was my publicity chair, something like that, 00:46:07.840 |
if he didn't actually show up for the conference 00:46:21.920 |
because he didn't like how it was to text with me 00:46:27.040 |
if I had an iPhone, the text would be somehow smoother. 00:46:34.400 |
But it was, yeah, Charles forced me to get an iPhone 00:46:43.920 |
and then eventually we did the teaching thing 00:46:46.960 |
And there's a couple of reasons for that, by the way. 00:46:48.720 |
One is I really wanted to do something different. 00:46:54.400 |
What's a thing that you could do in this medium 00:47:01.120 |
And being able to do something with another person 00:47:12.160 |
- Yeah, I always thought, he makes it sound brilliant 00:47:22.560 |
and it would be great if Michael could teach the course 00:47:29.040 |
- Well, the second class was more like that. 00:47:29.040 |
- I wish you were once again letting the facts get in the way. 00:47:57.280 |
which is supervised learning, unsupervised learning, 00:48:00.240 |
and reinforcement learning and decision-making, 00:48:02.880 |
the kind of assignments that we talked about earlier. 00:48:08.320 |
which is reinforcement learning and decision-making. 00:48:12.160 |
I'd been teaching at that point for well over a decade. 00:48:17.360 |
Actually, I learned quite a bit teaching that class with him, 00:48:21.440 |
But the first one I drove most of it was all my material. 00:48:23.680 |
Although I had stolen that material originally 00:48:34.480 |
At least some of the, at least when I found the slides, 00:48:37.760 |
Yes, every machine learning class taught in the early 2000s 00:48:46.560 |
a lot more with reinforcement learning and such 00:48:57.440 |
Most people were just doing supervised learning 00:48:58.880 |
and maybe a little bit of clustering and whatnot. 00:49:01.600 |
But we took it all the way to machine learning. 00:49:02.720 |
- A lot of it just comes from Tom Mitchell's book. 00:49:06.000 |
half of it comes from Tom Mitchell's book, right? 00:49:12.320 |
'Cause certain things weren't invented when Tom-- 00:49:29.280 |
or I'm on the edge of being an introvert anyway. 00:49:39.520 |
and bringing them into whatever we find interesting 00:49:45.280 |
And I found the idea of staring alone at a computer screen 00:49:54.560 |
- And I had in fact done a MOOC for Udacity on algorithms 00:49:59.120 |
and it was a week in a dark room talking at the screen, 00:50:29.760 |
But each time I tried it and I got no reaction, 00:50:32.800 |
it just was taking the energy out of my performance, 00:50:45.280 |
because you can't let your guard down for a moment with Charles. 00:50:52.080 |
But we would work really well together, I thought, 00:50:54.640 |
So I knew that we could sort of make it work. 00:51:09.200 |
And that's when we did the overfitting thriller video. 00:51:21.280 |
- Okay, so it happened, it was completely spontaneous. 00:51:40.240 |
and then the student comes in not knowing anything. 00:51:49.120 |
- And so he needed to set up a little "Prisoner's Dilemma" grid. 00:51:52.000 |
So he drew it and I could see what he was drawing. 00:51:54.400 |
And the "Prisoner's Dilemma" consists of two players, 00:51:57.360 |
two parties, so he decided he would make little cartoons 00:52:04.640 |
that were deciding whether or not to rat each other out. 00:52:06.800 |
One of them he drew as a circle with a smiley face 00:52:14.080 |
And the other one with all sorts of curly hair. 00:52:22.880 |
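For anyone unfamiliar with the game being sketched on that whiteboard, here is a minimal illustration of the Prisoner's Dilemma payoff structure, with made-up numbers rather than whatever was actually drawn in the lecture.

```python
# Illustrative Prisoner's Dilemma grid (hypothetical numbers): each entry is
# (years in prison for player 1, years for player 2); lower is better.
payoffs = {
    ("stay quiet", "stay quiet"): (1, 1),
    ("stay quiet", "rat out"):    (10, 0),
    ("rat out",    "stay quiet"): (0, 10),
    ("rat out",    "rat out"):    (5, 5),
}

# Whatever the other player does, ratting them out yields fewer years for you,
# so both players defect and end up at (5, 5) even though (1, 1) is better for both.
for other in ("stay quiet", "rat out"):
    quiet = payoffs[("stay quiet", other)][0]
    rat = payoffs[("rat out", other)][0]
    print(f"Other plays {other!r}: stay quiet -> {quiet} years, rat out -> {rat} years")
```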
- And that stuck, I actually watched that video. 00:52:28.800 |
- He started singing "Smooth Criminal" by Michael Jackson. 00:52:36.880 |
our kind of first actual episode should be coming out today, 00:52:43.120 |
where the two of us discuss episodes of "Westworld." 00:52:48.800 |
"Huh, what does this say about computer science and AI?" 00:52:53.760 |
I mean, I know it's on season three or whatever we have. 00:53:17.920 |
- That was more my time than your time, Charles. 00:53:20.000 |
- That's right, 'cause you're much older than I am. 00:53:21.440 |
I think the important thing here is that it's narrative, 00:53:27.280 |
But the idea that they would give these reveries, 00:53:35.200 |
- Who could possibly think that was gonna, I gotta, 00:53:37.600 |
I mean, I don't know, I've only seen the first two episodes 00:53:42.640 |
That the robots were actually designed by Hannibal Lecter. 00:53:54.160 |
But still, I was just struck by how it's all driven 00:54:02.400 |
is talking to them about what's going on in their heads, 00:54:10.000 |
But think about how it would work in real life. 00:54:17.120 |
It was quite interesting to just kind of ask this question. 00:54:23.280 |
- So we don't know, we can't answer that question. 00:54:25.680 |
- I'm also a fan of a guy named Alex Garland. 00:54:41.600 |
Like he's curious enough to go into that direction. 00:54:43.920 |
On the Westworld side, I felt there was more emphasis 00:55:02.480 |
- Well, they said specifically, so we make a change 00:55:05.760 |
And that's bad because something terrible could happen. 00:55:07.680 |
Like if you're putting things out in the world 00:55:09.360 |
and you're not sure whether something terrible 00:55:13.040 |
- I just feel like there should have been someone 00:55:16.560 |
poke his head in and say, "What could possibly go wrong?" 00:55:22.000 |
and I did watch a lot more, I'm not giving anything away. 00:55:24.640 |
I would have loved it if there was like an episode 00:55:29.840 |
a new model or something and like it just keeps failing. 00:55:34.000 |
And then it's more turns into like an episode 00:55:41.840 |
that are constantly like threatening the fabric 00:55:47.520 |
Yeah. And you know, this reminds me of something that, 00:55:50.960 |
so I agree with that, that actually would be very cool, 00:56:06.640 |
- I think that's where we lose people, by the way, 00:56:10.480 |
either figure out debugging or think debugging is terrible. 00:56:14.560 |
- This is part of the struggle versus suffering, right? 00:56:16.880 |
You get through it and you kind of get the skills of it, 00:56:28.560 |
really, really neat about framing it that way. 00:56:34.000 |
But what I don't like about all of these things, 00:56:46.320 |
he says that the thing that nobody noticed he put in 00:57:04.560 |
of passing the general version of the Turing test, 00:57:08.400 |
or the consciousness test, is smiling for no one. 00:57:17.040 |
oh, it's like the Chinese room kind of experiment. 00:57:26.160 |
with the actual experience and just take it in. 00:57:29.680 |
I don't know, he said nobody noticed the magic of it. 00:57:32.560 |
- I have this vague feeling that I remember the smile, 00:57:34.880 |
but now you've just put the memory in my head, 00:57:49.440 |
But here's the problem I have with all of those movies, 00:57:57.360 |
is it sets up the problem of AI as succeeding, 00:58:10.960 |
It's using the data to make decisions that are terrible. 00:58:13.600 |
It's not the intelligence that's gonna go out there 00:58:17.840 |
or lock us into a room to starve to death slowly 00:58:26.000 |
that are allowing us to make the terrible decisions 00:58:30.160 |
we would have less efficiently made before, right? 00:58:32.160 |
Computers are very good at making us more efficient, 00:58:35.520 |
including being more efficient at doing terrible things. 00:58:38.160 |
And that's the part of the AI we have to worry about. 00:58:40.080 |
It's not the true intelligence that we're gonna build 00:58:44.000 |
sometime in the future, probably long after we're around. 00:58:46.560 |
I think that whole framing of it sort of misses the point, 00:58:58.880 |
'cause I wanted to build something like that. 00:59:00.640 |
Philosophical questions are interesting to me, 00:59:17.120 |
But I feel like HAL 9000 came a little bit closer to that 00:59:28.000 |
It felt like closer to the AI systems we have today. 00:59:31.120 |
And the real things we might actually encounter, 00:59:35.760 |
which is over-relying in some fundamental way 00:59:41.360 |
on our dumb assistants or on social networks, 00:59:46.960 |
onto things that require internet and power and so on, 00:59:55.040 |
and thereby becoming powerless as a standalone entity. 01:00:02.160 |
in some subtle way, it creates a lot of problems. 01:00:05.520 |
And those problems are dramatized when you're in space 01:00:12.240 |
once we started making the decisions for you, 01:00:17.040 |
That's The Matrix, Michael, in case you don't remember. 01:00:23.040 |
because isn't that what we do with people anyway? 01:00:24.880 |
This kind of the shared intelligence that is humanity 01:00:32.160 |
As individuals, we're still generally intelligent. 01:00:35.840 |
but we leave most of this up to other people, 01:00:39.760 |
And by the way, everyone doesn't necessarily share our goals. 01:00:44.960 |
Sometimes we make decisions that others would see 01:00:48.880 |
and yet we somehow manage it, manage to survive. 01:01:16.080 |
it's a good experiment of how difficult would it be 01:01:28.960 |
then I don't believe that we were put in the simulation. 01:01:31.440 |
I believe that it's just physics playing out, 01:01:38.160 |
- So you think you have to build the universe? 01:01:56.720 |
where it doesn't feel very natural to me at all. 01:02:02.000 |
I don't understand this thing that we're living in. 01:02:09.520 |
Now, if you wanna call that the result of a simulator, 01:02:17.520 |
I mean, the interesting thing about simulation 01:02:26.640 |
- Unless you were aware enough to know that there was a bug. 01:02:31.280 |
- Yeah, the way you put the question, though. 01:02:32.240 |
- I don't think that we live in a simulation created for us. 01:02:36.800 |
I've actually never thought about it that way. 01:02:38.000 |
I mean, the way you asked the question, though, 01:02:39.760 |
is could you create a world that is enough for us humans? 01:02:42.960 |
It's an interestingly sort of self-referential question 01:02:45.360 |
because the beings that created the simulation 01:02:52.240 |
But we're in the simulation, and so it's realistic for us. 01:02:59.040 |
for the people in the simulation, as it were, 01:03:17.200 |
- It becomes a world, even like in brief moments, 01:03:43.760 |
is if we journey into that world early on in life, often. 01:03:49.680 |
- Yeah, but from a video game design perspective, 01:04:00.640 |
'cause it's clear that video games are getting much better. 01:04:16.160 |
if we were to fast forward 100 years into the future 01:04:18.880 |
in a way that might change society fundamentally. 01:04:44.640 |
- You really need to read "Calculating God" by Sawyer. 01:04:54.160 |
but it's, assuming you're that kind of reader, 01:05:05.600 |
And I'm pretty sure it's Robert Sawyer. 01:05:12.240 |
which is why the story mostly takes place in Toronto. 01:05:24.320 |
sort of thing from say, "The Egg," for example. 01:05:59.760 |
But anyway, you should read "Calculating God." 01:06:11.120 |
One thing I've noticed about people growing up now, 01:06:19.440 |
bigger and bigger and bigger part of their lives. 01:06:21.520 |
And the video games have become much more realistic. 01:06:32.800 |
exactly with the numbers we're talking about here. 01:06:42.480 |
- I understand that economists can actually see 01:06:45.520 |
the impact of video games on the labor market, 01:06:48.480 |
that there's fewer young men of a certain age 01:06:54.240 |
participating in like paying jobs than you'd expect. 01:07:10.800 |
That's it, you go in the holodeck, you never come out. 01:07:17.760 |
so this is the end of humanity as we know it, right? 01:07:24.960 |
It's some possibility to go into another world 01:07:28.160 |
that can be artificially made better than this one. 01:07:34.000 |
or speeding it up so you appear to live forever, 01:07:37.680 |
- And then most of us will just be old people on the porch 01:07:42.160 |
yelling at the kids these days in their virtual reality. 01:08:09.840 |
all the time, whenever the editor didn't like something 01:08:12.320 |
or whatever, I would say, "We'll fix it in post." 01:08:24.640 |
- So, is there something you've learned about, 01:08:28.160 |
I mean, it's interesting to talk about MOOCs. 01:08:35.600 |
I think there's two lines of conversation to be had here, 01:08:56.960 |
but because I think it's reminded us of a lot of things. 01:09:01.920 |
there's an article out by a good friend of mine 01:09:20.160 |
and why people wanted us to go back to college. 01:09:27.840 |
And what they're paying for is not the classes. 01:09:29.680 |
What they're paying for is the college experience. 01:09:59.600 |
the disaggregation was not the disaggregation 01:10:04.960 |
and that you can get the best anywhere you want to. 01:10:10.080 |
The disaggregation is having it shoved in our faces 01:10:30.480 |
and we're happy we had the learning experience as well. 01:10:41.440 |
I'm standing in front of you telling you this 01:10:50.800 |
So to me, that's what COVID has forced us to deal with, 01:10:53.440 |
even though I think we're still all in deep denial about it 01:11:08.880 |
is a way of providing a more dispersed experience 01:11:13.600 |
and these kinds of remote things that we've learned. 01:11:15.760 |
And we'll have to come up with new ways to engage them 01:11:25.280 |
so that they actually come out four or five or six years later 01:11:30.720 |
So I think the world will be radically different afterwards. 01:11:41.920 |
And I think this would have been true even without COVID, 01:11:47.680 |
So it's happening in two or three years or five years 01:11:51.200 |
- That was an amazing answer that I did not understand. 01:12:06.240 |
- Well, so, you know, the power of technology 01:12:09.280 |
that if you go on the West Coast and hang out long enough 01:12:11.440 |
is all about, we're gonna disaggregate these things together, 01:12:13.440 |
the books from the bookstore, you know, that kind of a thing. 01:12:15.760 |
And then suddenly Amazon controls the universe, right? 01:12:30.000 |
and then take classes over the network anywhere? 01:12:33.200 |
- Yeah, this is what people thought was gonna happen, 01:12:34.720 |
or at least people claimed it was gonna happen, right? 01:12:36.400 |
- 'Cause my daughter is essentially doing that now. 01:12:38.720 |
She's on one campus, but learning in a different campus. 01:12:40.800 |
- Sure, and COVID makes that possible, right? 01:12:47.360 |
- But the idea originally was that, you know, 01:12:49.280 |
you and I were gonna create this machine learning class 01:12:52.560 |
there'd be the machine learning class everyone takes, right? 01:12:55.920 |
But, you know, something like that, you can see how- 01:13:01.760 |
- I don't think that will be the thing that happens. 01:13:05.200 |
maybe I missed what the college experience was. 01:13:07.120 |
I thought it was peers, like people hanging around. 01:13:13.120 |
- Yeah, but none of that, you can do classes online 01:13:24.240 |
- It's in a context and the context is the university. 01:13:28.720 |
that Georgia Tech really is different from Brown. 01:13:40.240 |
to the students in making an informed decision. 01:13:42.480 |
- But the truth, but yes, they will make choices 01:13:46.640 |
And some of those choices will be made for them. 01:13:49.840 |
'cause they think it's this, that, or the other. 01:13:51.440 |
I just don't want to say, I don't want to give the idea- 01:13:56.720 |
I mean, Georgia Tech is different from Brown. 01:13:59.280 |
Brown is different from, pick your favorite state school 01:14:05.600 |
Which I guess is my favorite state school in Iowa. 01:14:10.560 |
And a lot of those contexts are, they're about history, yes, 01:14:13.360 |
but they're also about the location of where you are. 01:14:15.760 |
They're about the larger group of people who are around you, 01:14:20.480 |
and you're basically the only thing that's there 01:14:22.320 |
as a university, you're responsible for all the jobs, 01:14:25.200 |
or whether you're at Georgia State University, 01:14:26.960 |
which is an urban campus where you're surrounded by, 01:14:35.600 |
It actually matters whether you're a small campus 01:14:48.560 |
And if you, not to, you know, if you get a degree 01:14:53.840 |
at an online university somewhere, you don't, 01:15:09.200 |
The reason for that, I think, and you'd have to ask them, 01:15:19.600 |
and that it's reaching, you know, 11,000 students, 01:15:22.800 |
And we're admitting everyone we believe who can succeed. 01:15:31.840 |
depending on how long you take, a dollar degree, 01:15:34.320 |
as opposed to the 46,000 it costs you to come on campus. 01:15:36.480 |
So that feels, and I can do it while I'm working full time, 01:15:43.280 |
So it's an opportunity to do something you wanted to do, 01:15:53.040 |
So I think we created something that's had an impact, 01:15:56.800 |
but importantly, we gave a set of people opportunities 01:16:00.640 |
So I think people feel very loyal about that. 01:16:04.000 |
besides the surveys, is that we have somewhere north 01:16:09.920 |
who graduated, but come back and TA for this class 01:16:17.840 |
because they believe in sort of having that opportunity 01:16:25.840 |
15 years from now, will people have that same sense? 01:16:32.800 |
it's a matter of feeling as if you're a part of something. 01:16:45.760 |
Going through a shared experience makes that easier. 01:16:49.600 |
if you're alone looking at a computer screen. 01:16:54.800 |
- The question is, it still is the intuition to me, 01:16:57.440 |
and it was at the beginning when I saw something 01:17:07.280 |
- No, it won't replace universities, but it will-- 01:17:13.760 |
The people who are taking it are already adults. 01:17:15.520 |
They've gone through their undergrad experience. 01:17:18.560 |
I think their goals have shifted from when they were 17. 01:17:25.360 |
something very social and very important, right? 01:17:30.080 |
don't build the sidewalks, just leave the grass, 01:17:31.920 |
and the students will, or the people will walk, 01:17:33.440 |
and you put the sidewalks where they create paths, 01:17:36.400 |
- There are architects who apparently believe 01:17:39.920 |
The metaphor here is that we created this environment. 01:17:45.120 |
We didn't quite know how to think about the social aspect, 01:17:48.640 |
but, you know, we didn't have time to solve all, 01:17:57.440 |
like on Google+, there were like 30-something groups 01:17:59.680 |
created in the first year because somebody had used Google+. 01:18:16.400 |
putting on their T-shirts as they travel around the world. 01:18:18.240 |
I climbed this mountaintop, I'm putting this T-shirt on, 01:18:24.560 |
on top of the social network and the social media that existed 01:18:49.440 |
But I don't think it's going to replace the university 01:19:01.360 |
Now, maybe there'll be some other rite of passage 01:19:05.920 |
So the university is such a fascinating mess of things. 01:19:11.120 |
So just even the faculty position is a fascinating mess. 01:19:33.280 |
it's maybe an accident of history or human evolution. 01:19:36.240 |
It seems like the people who are really good at teaching 01:19:45.360 |
At the same time, it also doesn't seem to make sense 01:19:51.680 |
is the same place where you go to learn calculus 01:20:00.240 |
- Yeah, relatively speaking, it's a safe space. 01:20:02.240 |
Now, by the way, I feel the need very strongly 01:20:10.720 |
And by the way, the ones who do go to college, 01:20:16.240 |
You know, the places where we've been, where we are, 01:20:22.240 |
the traditional movie version of universities are. 01:20:25.440 |
But for most people, it's not that way at all. 01:20:27.280 |
By the way, most people who drop out of college, 01:20:31.120 |
So, you know, we were talking about a particular experience. 01:20:38.560 |
which is very small, but larger than it was a decade 01:20:47.120 |
My concern, which I think is kind of implicit 01:20:51.440 |
is that somehow we will divide the world up further 01:20:54.000 |
into the people who get to have this experience 01:20:59.200 |
and everyone else while increasingly requiring 01:21:10.560 |
but they're not gonna get to have that experience. 01:21:12.320 |
And there'll be a small group of people who do 01:21:13.680 |
who continue to, you know, positive feedback, 01:21:16.800 |
I worry a lot about that, which is why for me, 01:21:25.920 |
I think the reason, whether it's good, bad or strange, 01:21:29.520 |
but I think it's useful to have the faculty member, 01:21:40.880 |
and with the fundamental mission of the university 01:21:48.160 |
because they're creating, they're reproducing basically, 01:21:50.720 |
right, and it lets them do their research and multiply. 01:21:52.720 |
But they understand that the mission is the undergrads. 01:21:57.120 |
And so they will do it without complaint mostly 01:22:00.240 |
because it's a part of the mission and why they're here. 01:22:02.320 |
And they have experiences with it themselves. 01:22:07.200 |
The people who tend to get squeezed in that, by the way, 01:22:12.000 |
nor the undergrads we have already bought into the idea 01:22:18.000 |
Anyway, I think tying that mission in really matters. 01:22:26.000 |
Education feels like more of a higher calling to me 01:22:29.360 |
Because education, you cannot treat it as a hobby 01:22:34.640 |
- But that's the pushback on this whole system 01:22:38.320 |
is that you should, education should be a full-time job. 01:22:48.080 |
- Yes, although I think most of our colleagues, 01:22:51.120 |
many of our colleagues would say that research is the job 01:23:09.040 |
and the other thing I want Michael to point out 01:23:11.760 |
to sort of the ideal professor in some sense than I am. 01:23:20.720 |
but he is a dean, so he has a different experience. 01:23:23.120 |
- I'm giving him time to think of the profound thing 01:23:29.520 |
we have lecturers in the College of Computing where I am. 01:23:33.120 |
There's 10 or 12 of them depending on how you count 01:23:35.520 |
as opposed to the 90 or so tenure track faculty. 01:23:40.960 |
well, they don't only teach, they also do service. 01:23:45.360 |
They teach 50%, over 50% of our credit hours, 01:24:00.880 |
And that's including our grad courses, right? 01:24:02.880 |
So they're doing this, they're teaching more, 01:24:12.720 |
You hire someone from the outside to do whatever, 01:24:18.080 |
because it's all internal confidential stuff. 01:24:21.040 |
is there was a single question we asked our alumni. 01:24:25.680 |
all the way up to people who graduated last week, right? 01:24:45.440 |
And then, so they got all the answers from people 01:24:48.160 |
It was clearly a word cloud created by people 01:24:52.080 |
'cause they had one person whose name like appeared 01:24:56.080 |
like Philip, Phil, Dr. Phil, you know, but whatever. 01:25:00.000 |
And I looked at it and I noticed something really cool. 01:25:02.240 |
The five people from the College of Computing, 01:25:17.200 |
both were chairs of our division of computing instruction. 01:25:19.680 |
One just retired, one is gonna retire soon. 01:25:22.160 |
And the other two were lecturers I remembered 01:25:36.560 |
Two of those are people teaching awards are named after, right? 01:25:48.080 |
the big introductory classes that got me into it. 01:25:50.080 |
There's a guy named Richard Barker on there 01:26:00.880 |
'cause it kept showing up over and over again. 01:26:04.720 |
- But different people spelled it differently. 01:26:15.680 |
I went to read about him 'cause I was curious who he was, 01:26:27.520 |
they remember the people who were kind to them, 01:26:37.600 |
Not to completely lose track of the fundamental problem 01:26:41.760 |
of how do we replace the party aspect of universities. 01:26:47.920 |
- Before we go to what makes the platonic professor, 01:26:59.680 |
Like are we, should we desperately be clamoring 01:27:05.760 |
or is this a stable place to be for a little while? 01:27:12.480 |
and learning experience has been really rough. 01:27:18.160 |
in a way that's not a happy, positive struggle. 01:27:29.520 |
But I worry about just even before this happened, 01:27:42.800 |
I mean, all the data that I'm aware of seems to indicate, 01:27:46.880 |
and this kind of fits, I think, with Charles's story, 01:27:55.120 |
if they feel connected to the person teaching the class, 01:28:20.160 |
So, I literally, I think, learned linear algebra 01:28:24.800 |
from Gilbert Strang by watching MIT OpenCourseWare 01:29:10.400 |
They seem to think they're learning something anyway. 01:29:25.760 |
to try to make certain that what they've curated 01:29:31.680 |
And so there's huge differences in what they prefer. 01:29:34.800 |
what they prefer is more connection, not less. 01:29:45.200 |
'Cause that was the biggest classroom on campus. 01:29:50.720 |
I'm literally on a stage looking down on them 01:30:10.160 |
Daphne has actually said some version of this, 01:30:22.000 |
Even the people who had access to our material 01:30:30.080 |
their boredom, and like when the parts are boring 01:30:34.880 |
and their excitement when the parts are exciting, 01:30:46.480 |
- Watching the circus on TV alone is not really the same. 01:30:59.360 |
- Well, you need, maybe you need just another person. 01:31:05.760 |
- Well, there's different kinds of connection, right? 01:31:22.960 |
And so different jokes work in different size crowds too. 01:31:26.880 |
- Where sometimes if it's a big enough crowd, 01:31:30.080 |
then even a really subtle joke can take root someplace 01:31:34.640 |
And it kind of, there's a whole statistics of, 01:31:48.000 |
So I purposely didn't laugh just to see if I was right. 01:32:09.040 |
- So I wanna say that it was a good thing that I did. 01:32:22.800 |
I mean, certainly movie theaters are a thing, right? 01:32:30.640 |
aren't really co-present with the people in the audience. 01:32:36.080 |
it's an open question that's being raised by this, 01:32:43.600 |
So that's, it's a very parallel question for education. 01:32:47.040 |
Will movie theaters still be a thing in 2021? 01:32:51.840 |
that there is a feeling of being in the crowd 01:32:54.560 |
that isn't replicated by being at home watching it 01:33:03.120 |
- But I feel like we're having a conversation 01:33:25.200 |
- I'll wait to publish this until we have a vaccine. 01:33:37.200 |
- First of all, movie theaters weren't this way, right? 01:33:39.360 |
In like the '60s and '70s, they weren't like this. 01:33:43.840 |
With "Jaws" and "Star Wars" created blockbusters, right? 01:33:54.800 |
So it's just a very different, it's very different. 01:33:56.960 |
So what we've been experiencing in the last 10 years 01:34:11.920 |
So I think that's a painful way of saying that it will change. 01:34:23.760 |
Replace is too strong of a word, but it will change. 01:34:42.640 |
well, this is dumb, than before there were records. 01:34:45.760 |
It's possible to argue that, if you look at the data, 01:34:50.640 |
that it just expanded the pie of what music listening means. 01:34:55.680 |
So it's possible that like universities grow in parallel 01:34:59.200 |
or the theaters grow, but also more people get to watch movies, 01:35:06.880 |
- Yeah, and to the extent that we can grow the pie, 01:35:09.520 |
and have education be not just something you do 01:35:11.840 |
for four years when you're done with your other education, 01:35:20.480 |
especially as the economy and the world change rapidly. 01:35:24.160 |
Like people need opportunities to stay abreast 01:35:39.120 |
from Laserdisc to DVDs or record players to CDs. 01:35:42.880 |
I mean, I'm willing to grant that that is true, 01:35:47.600 |
And the ability to do something that you couldn't do otherwise 01:35:53.600 |
And you can tell me I'm only getting 90% of the experience, 01:35:57.520 |
I wasn't getting it before, or it wasn't lasting as long, 01:36:00.720 |
I mean, this just seems straightforward to me. 01:36:05.360 |
It is for the good that more people get access. 01:36:10.400 |
One, to educate them and make access available. 01:36:20.000 |
We can do both of those things at the same time. 01:36:26.480 |
- So you've educated some scary number of people. 01:36:38.800 |
Is there advice that you can give to a young person today 01:36:52.000 |
about whatever the journey that one takes in their, 01:36:59.360 |
maybe in their teens, in their early twenties, 01:37:04.960 |
as you try to go through the essential process of partying 01:37:12.240 |
- If you get to the point where you're far enough up 01:37:18.720 |
that you can actually make decisions like this, 01:37:21.760 |
then find the thing that you're passionate about 01:37:25.520 |
And sometimes it's the thing that drives your life 01:37:34.320 |
And I understand that and it's not easy for everyone, 01:37:36.240 |
but always take a moment or two to pursue the things 01:37:53.360 |
And it's okay if it takes you a long time to get there. 01:37:55.840 |
Rodney Dangerfield became a comedian in his fifties, 01:38:01.600 |
And lots of people failed for a very long time 01:38:09.600 |
I mean, you and I talked about the experience 01:38:17.360 |
Wasn't my first one and it wasn't my last one. 01:38:19.840 |
But in my view, I wasn't supposed to be here after that 01:38:25.760 |
So you might as well go ahead and grab life as you can 01:38:31.040 |
While recognizing, again, the delusion matters, right? 01:38:35.760 |
Allow yourself to believe that it's all gonna work out. 01:38:37.920 |
Just don't be so deluded that you miss the obvious 01:38:53.280 |
- Yeah, I mean, there's a whole lot of things 01:39:03.600 |
can depend a lot on things out of your control. 01:39:06.000 |
But I really do believe in the passion, excitement thing. 01:39:10.160 |
I was talking to my mom on the phone the other day 01:39:11.840 |
and essentially what came out is that computer science 01:39:21.840 |
And I get to be a professor teaching something 01:39:28.720 |
And she was like trying to give me some appreciation 01:39:33.360 |
for how foresightful I was for choosing this line of work 01:39:37.520 |
as if somehow I knew that this is what was gonna happen 01:39:49.360 |
I didn't think it would be particularly lucrative. 01:40:03.440 |
And I pride myself on my ability to remain un-rich. 01:40:20.000 |
So I got lucky and the thing that I cared about 01:40:24.960 |
But I don't think I would have had a fun time 01:40:38.000 |
and the internet is part of the problem here, 01:40:41.360 |
is they say they're passionate about so many things. 01:40:46.320 |
Which is a harder thing for me to know what to do with. 01:41:05.120 |
- Is there anything about this particular hallway 01:41:06.960 |
that's relevant or you're just in general choices? 01:41:09.600 |
- It sounds like you regret not taking the right turn. 01:41:16.320 |
On the left was Michael Dimmon's office, right? 01:41:23.040 |
It wasn't a huge choice. - It would have really hurt. 01:41:42.720 |
But by the way, I decided to say yes to something 01:42:05.520 |
Some things are clearly smarter than other things. 01:42:08.720 |
But in the end, if you've got multiple choices, 01:42:11.360 |
there are lots of things you think you might love. 01:42:17.120 |
The worst thing that'll happen is you took a left turn 01:42:18.800 |
instead of a right turn and you ended up merely happy. 01:42:23.440 |
- So accepting, so taking the step and just accepting, 01:42:43.760 |
- If I ever get the chance again, I'm doing it. 01:42:54.160 |
There was a thing where we did do the zombie thing. 01:43:14.560 |
- Exactly, you took the left turn and it ended up-- 01:43:17.680 |
- Took the left turn and it ended up being the right thing. 01:43:19.440 |
So a lot of people ask me, people that are a little bit tangential, 01:43:28.160 |
like from all kinds of disciplines that are outside 01:43:30.240 |
of the particular discipline of computer science, 01:43:37.920 |
who want to either taste this little skill set or discipline 01:43:44.640 |
or try to see if it can be used somehow in their own life? 01:43:49.840 |
- It feels, well, one of the magic things about the internet 01:43:58.960 |
My daughter is taking AP Computer Science right now. 01:44:07.920 |
and I'll be really curious where he takes it. 01:44:12.240 |
for this sort of thing and she's doing great. 01:44:14.080 |
But one of the things I have to tell her all the time, 01:44:17.120 |
she points, well, I want to make a rhythm game. 01:44:18.800 |
So I want to go for two weeks and then build a rhythm game. 01:44:30.960 |
I was in grad school when I suddenly woke up one day 01:44:34.000 |
over the Royal East and I thought, wait a minute, 01:44:37.840 |
I should be able to write "Pac-Man" in an afternoon. 01:44:42.960 |
I had to figure out how the ghost moved and everything. 01:44:48.560 |
But if I had started out trying to build "Pac-Man", 01:44:52.400 |
I think it probably would have ended very poorly for me. 01:44:54.960 |
Luckily back then there weren't these magical devices 01:45:00.000 |
to give me this illusion that I could create something 01:45:02.240 |
by myself from the basics inside of a weekend like that. 01:45:05.600 |
I mean, that was a culmination of years and years and years 01:45:09.520 |
before I decided I should be able to write this and I could. 01:45:16.720 |
There are lots of people there to give you the information. 01:45:20.400 |
Remember they've been doing it for a very long time. 01:45:22.560 |
Take it slow, learn the little pieces, get excited about it. 01:45:25.600 |
And then keep the big project you want to build in mind. 01:45:29.520 |
Because as a wise man once said, life is long. 01:45:32.720 |
Sometimes it doesn't seem that long, but it is long. 01:45:35.520 |
And you'll have enough time to build it all out. 01:45:44.480 |
That's not exciting, but it'll get you around. 01:45:48.480 |
- Well, there's only one programming language, it's Lisp. 01:45:50.800 |
But if you have to pick a programming language, 01:45:55.840 |
- Python is basically Lisp, but with better syntax. 01:46:03.120 |
- So you're gonna argue that C syntax is better than anything? 01:46:06.880 |
Anyway, also I'm gonna answer Python despite what he said. 01:46:09.760 |
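To make that quip a bit more concrete, here is a minimal sketch, not from the conversation itself, of the sense in which Python inherits Lisp's functional flavor: first-class functions, closures, and list processing, just under more conventional syntax.

# A closure, much like returning (lambda (x) (+ x n)) from a Lisp function.
def make_adder(n):
    return lambda x: x + n

add3 = make_adder(3)
print(add3(4))                                   # 7

# map-style list processing, Lisp's bread and butter.
squares = list(map(lambda x: x * x, range(5)))
print(squares)                                   # [0, 1, 4, 9, 16]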
- Tell me, tell your story about somebody's dissertation 01:46:15.280 |
This is Dave's, Dave McAllester's dissertation, 01:46:19.840 |
- And then he came in our group at Bell Labs. 01:46:21.920 |
- And now he's at the Toyota Technological Institute at Chicago. 01:46:33.760 |
And he decided to have as an appendix his actual code, 01:46:40.720 |
And like the last 20 pages are just right parentheses. 01:46:58.240 |
If you're of a certain age, if you're really young 01:47:01.120 |
and trying to figure it out, graphical languages 01:47:02.960 |
that let you kind of see how the thing works, 01:47:08.800 |
thinking about how to build languages that get people in. 01:47:17.920 |
And that's why I asked you what stage of life people are in. 01:47:22.880 |
- The answer to that question of which language 01:47:25.280 |
keeps changing, I mean, there's some value to exploring. 01:48:03.520 |
And if you push far enough, like it can be assembly language, 01:48:09.440 |
before you start to hit the really deep concepts 01:48:11.360 |
that you would get sooner in other languages. 01:48:13.200 |
But like, I don't know, computation is kind of computation, 01:48:16.480 |
is kind of Turing equivalent, is kind of computation. 01:48:22.080 |
but you have to build out that mental structure in your mind. 01:48:25.360 |
And I don't think it super matters which language. 01:48:29.840 |
because some things are just at the wrong level of abstraction. 01:48:32.080 |
I think assembly's at the wrong level of abstraction 01:48:38.160 |
- Yes, for frameworks, big frameworks matter quite a bit. 01:48:46.080 |
and I think of a project and I go through it in a weekend. 01:48:50.240 |
You're right though, the languages that are designed for that 01:48:54.800 |
Pick the ones that people have built tutorials 01:49:05.120 |
I was teaching intro to CS in the summer as a favor. 01:49:15.040 |
And it was very funny 'cause I'd go in every single time thinking, 01:49:18.000 |
how am I possibly gonna fill up an hour and a half 01:49:28.960 |
writing to a variable and conditional branching. 01:49:35.920 |
And when I say that's it, I don't mean it's simple. 01:50:11.840 |
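As a minimal sketch of that point, not something shown in the conversation, writing to variables plus branching on a condition already gets you real computation, for example Euclid's algorithm for the greatest common divisor.

# Nothing here but writing to variables and branching on a condition,
# yet it computes the greatest common divisor (Euclid's algorithm).
def gcd(a, b):
    while b != 0:          # conditional branching
        a, b = b, a % b    # writing to variables
    return a

print(gcd(48, 36))         # 12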
- That's one of the trickiest things to get for programmers, 01:50:15.440 |
that there's a memory and the variables are pointing 01:50:20.880 |
And sometimes the languages hide that from you 01:50:29.840 |
or used to worry about these sorts of things anyway, 01:50:32.320 |
had this kind of belief that actually people think, 01:50:36.800 |
when you write X equals something, Y equals something, Y equals X, 01:50:38.640 |
that you have now made a mathematical statement 01:50:43.920 |
- Which you can if you just put like an ampersand in front of it. 01:50:47.520 |
- Yes, but people, that's not what you're doing. 01:50:56.480 |
is that most of the people who didn't know the answer, 01:51:03.120 |
- And so it's by reference, or by name really, right? 01:51:10.800 |
And so depending upon what you think they are, 01:51:16.240 |
or one could go two thirds of the way through a semester, 01:51:19.280 |
and people still hadn't figured out in their heads, 01:51:29.920 |
"Oh, if you just put an ampersand in front of it," 01:51:31.680 |
I mean, that doesn't make any sense for an intro class. 01:51:34.560 |
don't even give you the ability to think about it 01:51:38.800 |
about the difference between eq and equal in Lisp? 01:51:48.960 |
So you shouldn't be, it's not too hard, we all do it, 01:52:05.040 |
and those very basic things are the very basics 01:52:16.880 |
Like even a simpler version of the equal sign 01:52:26.480 |
Like I think basically every single programming language 01:52:30.160 |
with just a handful of exceptions, equals is assignment. 01:52:34.880 |
- And you have some other operator for equality. 01:52:38.720 |
- And even that, like everyone kind of knows it. 01:52:59.520 |
is being okay in that state of confusion for a while. 01:53:19.440 |
And then you just kind of stare into the void 01:53:26.240 |
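A minimal Python sketch of the distinctions being circled here, added for illustration rather than quoted from the conversation: the equal sign binds a name to an object rather than asserting a mathematical fact, == tests structural equality, and is tests identity, roughly the same split as Lisp's equal versus eq.

x = [1, 2]
y = x              # assignment: y now names the same list object as x
y.append(3)
print(x)           # [1, 2, 3] -- the change shows through x, same object

a = [1, 2, 3]
print(a == x)      # True  -- structurally equal (like Lisp's equal)
print(a is x)      # False -- not the same object (like Lisp's eq)
print(y is x)      # True  -- same object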
- By the way, the fact that they didn't get this 01:53:29.840 |
I mean, they were still able to do their assignments. 01:53:49.280 |
And it was all memory management and terrible. 01:53:57.360 |
it was clear to me that it was overwriting memory. 01:54:05.840 |
at the front of the main that was like 400K, 01:54:11.760 |
Because wherever I was scribbling over memory, 01:54:14.160 |
it would scribble into that space and it didn't matter. 01:54:18.640 |
But I did create something to sort of deal with it. 01:54:22.400 |
- And it, you know, that's crazy, that's crazy. 01:54:27.120 |
But I knew enough about memory management to go, 01:54:30.080 |
"You know, I'm just gonna create an empty array here 01:54:39.680 |
so you're not even gonna come across that problem. 01:54:45.920 |
of hating everything you do and hating yourself. 01:55:15.600 |
- Yeah, I can tell you what I hate about Charles. 01:55:23.920 |
but different way that it's sort of like having 01:55:32.800 |
because I would not naturally gravitate to them that way. 01:55:39.360 |
- Yeah, the inner product is not zero for sure. 01:56:00.880 |
He also sometimes works as an outward confidence 01:56:19.280 |
- At the end of the day, luck favors the Charles. 01:56:32.000 |
You guys are an inspiration to a huge number of people 01:57:02.720 |
from some of the most amazing humans in history, 01:57:05.040 |
and Cash App, the app I use to send money to friends. 01:57:08.800 |
Please check out the sponsors in the description 01:57:12.240 |
to get a discount and to support this podcast. 01:57:15.120 |
If you enjoy this thing, subscribe on YouTube, 01:57:29.440 |
Don't raise your voice, improve your argument. 01:57:33.760 |
Thank you for listening and hope to see you next time.