
Take Back MIT | Eric Weinstein and Lex Fridman



00:00:00.000 | - This idea of DISC, right,
00:00:05.000 | Distributed Idea Suppression Complex.
00:00:07.920 | - Yeah.
00:00:08.760 | - Is that what's bringing the Elons of the world down?
00:00:12.960 | - You know, it's so funny,
00:00:13.800 | it's like he's asking Joe Rogan, like, is that a joint?
00:00:17.480 | You know, it's like, well, what will happen if I smoke it?
00:00:19.680 | What will happen to the stock price?
00:00:21.400 | What will happen if I scratch myself in public?
00:00:24.320 | What will happen if I say what I think about Thailand
00:00:27.760 | or COVID or who knows what?
00:00:30.280 | And everybody's like, don't say that, say this,
00:00:32.480 | go do this, go do that.
00:00:33.880 | Well, it's crazy making.
00:00:35.240 | It's absolutely crazy making.
00:00:37.960 | And if you think about what we put people through,
00:00:40.920 | we need to get people who can use FU money,
00:00:47.980 | the FU money they need to insulate themselves
00:00:51.760 | from all of the people who know better.
00:00:53.880 | 'Cause my nightmare is that why did we only get one Elon?
00:00:58.880 | What if we were supposed to have thousands
00:01:00.920 | and thousands of Elons?
00:01:02.360 | And the weird thing is like, this is all that remains.
00:01:06.160 | You're looking at like Obi-Wan and Yoda,
00:01:10.360 | and it's like, this is all that's left
00:01:13.080 | after Order 66 has been executed.
00:01:17.840 | And that's the thing that's really upsetting to me
00:01:19.640 | is we used to have Elons five deep
00:01:22.120 | and then we could talk about Elon
00:01:24.040 | in the context of his cohort.
00:01:26.960 | But this is like, if you were to see a giraffe
00:01:29.780 | in the Arctic with no trees around,
00:01:31.660 | you'd think, why the long neck?
00:01:33.800 | What a strange sight.
00:01:35.000 | - How do we get more Elons?
00:01:37.900 | How do we change these?
00:01:40.120 | So I think that you've, so we know MIT and Harvard.
00:01:45.120 | So maybe returning to our previous conversation,
00:01:48.840 | my sense is that the Elons of the world
00:01:51.360 | are supposed to come from MIT and Harvard.
00:01:53.360 | - Right.
00:01:54.320 | - And how do you change?
00:01:56.280 | - Let's think of one that MIT sort of killed.
00:01:58.760 | Have any names in mind?
00:02:01.820 | Aaron Swartz leaps to my mind.
00:02:05.920 | - Yeah.
00:02:06.760 | - Okay, are we, MIT, supposed to shield the Aaron Swartzes
00:02:11.760 | from, I don't know, journal publishers?
00:02:16.960 | Or are we supposed to help the journal publishers
00:02:19.560 | so that we can throw 35-year sentences in his face
00:02:22.340 | or whatever it is that we did that depressed him?
00:02:25.080 | Okay, so here's my point.
00:02:27.400 | I want MIT to go back to being the home of Aaron Swartz.
00:02:32.400 | And if you wanna send Aaron Swartz to a state
00:02:38.400 | where he's looking at 35 years in prison
00:02:41.320 | or something like that, you are my sworn enemy.
00:02:44.520 | You are not MIT.
00:02:45.720 | - Yeah.
00:02:47.560 | - You are the traitorous,
00:02:49.600 | irresponsible, middle-brow, pencil-pushing,
00:02:56.560 | green-eyeshade fool that needs to not be in the seat
00:03:00.440 | at the presidency of MIT, period, the end.
00:03:03.160 | Get the fuck out of there
00:03:04.720 | and let one of our people sit in that chair.
00:03:07.080 | - And the thing that you've articulated is that
00:03:09.880 | the people in those chairs are not the way they are
00:03:13.880 | because they're evil or somehow morally compromised,
00:03:17.040 | it's just that that's the distributed nature.
00:03:20.880 | Is that there's some kind of aspect of the system that--
00:03:22.880 | - These are people who wed themselves to the system.
00:03:26.640 | They adapt every instinct.
00:03:28.640 | And the fact is that they're not going to be
00:03:32.040 | on Joe Rogan smoking a blunt.
00:03:34.060 | - Let me ask a silly question.
00:03:36.280 | Do you think institutions generally
00:03:37.960 | just tend to become that?
00:03:39.580 | - No, we get some of the institutions.
00:03:43.720 | We get Caltech.
00:03:45.440 | Here's what we're supposed to have.
00:03:46.480 | We're supposed to have Caltech.
00:03:47.840 | We're supposed to have Reed.
00:03:49.840 | We're supposed to have Deep Springs.
00:03:51.680 | We're supposed to have MIT.
00:03:54.240 | We're supposed to have a part of Harvard.
00:03:56.320 | And when the sharp elbow crowd comes after
00:03:59.440 | the sharp mind crowd, we're supposed to break
00:04:02.560 | those sharp elbows and say, don't come around here again.
00:04:05.880 | - So what are the weapons that the sharp minds
00:04:07.880 | are supposed to use in our modern day?
00:04:10.720 | So to reclaim MIT, what is the, what's the future?
00:04:15.200 | - Are you kidding me?
00:04:16.480 | First of all, assume that this is being seen at MIT.
00:04:20.040 | Hey, everybody. - It definitely is.
00:04:21.280 | - Okay.
00:04:22.800 | Hey, everybody.
00:04:24.200 | Try to remember who you are.
00:04:26.300 | You're the guys who put the police car
00:04:28.080 | on top of the Great Dome.
00:04:29.760 | You guys came up with the great breast of knowledge.
00:04:32.160 | You created a Tetris game in the Green Building.
00:04:35.440 | Now, what is your problem?
00:04:38.560 | They killed one of your own.
00:04:40.480 | You should make their life a living hell.
00:04:43.960 | You should be the ones who keep the memory
00:04:46.720 | of Aaron Swartz alive and all of those hackers
00:04:49.880 | and all of those mutants.
00:04:51.340 | It's like, it's either our place or it isn't.
00:04:57.160 | And if we have to throw 12 more pianos off of the roof,
00:05:02.160 | if Harold Edgerton was taking those photographs
00:05:07.800 | with slow-mo back in the 40s,
00:05:13.560 | if Noam Chomsky's on your faculty,
00:05:16.840 | what the hell is wrong with you kids?
00:05:19.160 | You are the most creative and insightful people
00:05:21.800 | and you can't figure out how to defend Aaron Swartz?
00:05:24.320 | That's on you guys.
00:05:25.600 | - So some of that is giving more power to the young,
00:05:28.240 | like you said, to the brave, to the bold.
00:05:30.640 | - Taking power from the feeble and the middle-brow.
00:05:33.720 | - Yeah, but what is the mechanism?
00:05:35.560 | To me-- - I don't know.
00:05:36.400 | You have some nine-volt batteries?
00:05:39.040 | You have some copper wire?
00:05:41.240 | - I tend to-- - Do you have a capacitor?
00:05:44.760 | - I tend to believe you have to create an alternative
00:05:47.720 | and make the alternative so much better
00:05:51.000 | that it makes MIT obsolete unless they change.
00:05:55.720 | And that's what forces change.
00:05:58.000 | So as opposed to somehow--
00:05:59.400 | - Okay, so you use projection mapping.
00:06:02.060 | - What's projection mapping?
00:06:03.300 | - Where you take some complicated edifice
00:06:05.520 | and you map all of its planes
00:06:07.320 | and then you actually project some unbelievable graphics,
00:06:10.240 | re-skinning a building, let's say, at night.
00:06:12.320 | - That's right, yeah.
00:06:13.160 | - Okay, so you wanna do some graffiti art with light.
00:06:15.240 | - You basically wanna hack the system?
00:06:16.720 | - No, I'm saying, look, listen to me, Lex.
00:06:19.480 | We're smarter than they are.
00:06:21.680 | And you know what they say?
00:06:23.440 | They say things like, I think we need some geeks.
00:06:27.160 | Get me two PhDs.
00:06:28.680 | You treat PhDs like that, that's a bad move.
00:06:33.880 | 'Cause PhDs are capable.
00:06:36.200 | And we act like our job is to peel grapes for our betters.
00:06:40.640 | - Yeah, that's a strange thing.
00:06:42.400 | You speak about it very eloquently.
00:06:44.560 | It's how we treat basically the greatest minds in the world,
00:06:49.480 | which is like at their prime, which is PhD students.
00:06:52.940 | We pay them nothing.
00:06:56.020 | - I'm done with it.
00:06:58.960 | - Yeah.
00:06:59.780 | - Right, we gotta take what's ours.
00:07:01.880 | So take back MIT.
00:07:04.720 | Become ungovernable.
00:07:07.080 | Become ungovernable.
00:07:08.620 | And by the way, when you become ungovernable,
00:07:11.960 | don't do it by throwing food.
00:07:13.820 | Don't do it by pouring salt on the lawn like a jerk.
00:07:19.040 | Do it through brilliance.
00:07:20.540 | Because what you, Caltech and MIT can do,
00:07:23.280 | and maybe Rensselaer Polytechnic or Worcester Polytech,
00:07:26.640 | I don't know, Lehigh.
00:07:28.800 | God damn it, what's wrong with you technical people?
00:07:31.000 | You act like you're a servant class.
00:07:34.080 | - It's unclear to me how you reclaim it,
00:07:36.080 | except with brilliance, like you said.
00:07:38.120 | But to me, the way you reclaim it with brilliance
00:07:41.680 | is to go outside the system.
00:07:43.040 | - Aaron Swartz came from the Elon Musk class.
00:07:45.800 | What you guys gonna do about it?
00:07:48.160 | Right?
00:07:49.000 | The super capable people need to flex,
00:07:53.120 | need to be individual, they need to stop giving away
00:07:55.320 | all their power to a zeitgeist or a community
00:07:58.120 | or this or that.
00:07:59.520 | You're not indoor cats, you're outdoor cats.
00:08:02.080 | Go be outdoor cats.
00:08:02.960 | - Do you think we're gonna see this kind of change?
00:08:05.240 | - You were the one asking me before,
00:08:06.880 | like what about the World War II generation?
00:08:09.400 | What I'm trying to say is that there's
00:08:10.480 | a technical revolt coming.
00:08:12.240 | You wanna talk about--
00:08:14.200 | - But I'm trying to lead it, right?
00:08:15.400 | I'm trying to see--
00:08:16.520 | - No, you're not trying to lead it.
00:08:17.360 | - I'm trying to get a blueprint here.
00:08:18.800 | - All right, Lex.
00:08:20.280 | How angry are you about our country pretending
00:08:23.520 | that you and I can't actually do technical subjects
00:08:26.920 | so that they need an army of kids coming in
00:08:30.340 | from four countries in Asia?
00:08:32.280 | It's not about the four countries in Asia,
00:08:34.080 | it's not about those kids.
00:08:35.800 | It's about lying about us, that we don't care enough
00:08:38.240 | about science and technology, that we're incapable of it.
00:08:42.000 | As if we don't have Chinese and Russians
00:08:44.440 | and Koreans and Croatians, like we've got everybody here.
00:08:49.400 | The only reason you're looking outside
00:08:51.480 | is that you wanna hire cheap people
00:08:53.500 | from the family business because you don't wanna pass
00:08:55.760 | the family business on.
00:08:57.520 | And you know what?
00:08:59.160 | You didn't really build the family business.
00:09:01.160 | It's not yours to decide.
00:09:03.440 | You the boomers and you the silent generation,
00:09:05.900 | you did your bit, but you also fouled a lot of stuff up.
00:09:09.300 | And you're custodians.
00:09:11.580 | You are caretakers.
00:09:12.800 | You are supposed to hand something.
00:09:14.440 | What you did instead was to gorge yourself
00:09:17.960 | on cheap foreign labor, which you then held up
00:09:21.120 | as being much more brilliant than your own children,
00:09:23.500 | which was never true.
00:09:24.660 | - See, but I'm trying to understand
00:09:26.880 | how we create a better system without anger,
00:09:29.200 | without revolution.
00:09:30.440 | - Ah.
00:09:31.280 | - Not by kissing and hugs, but by,
00:09:36.280 | I mean, I don't understand within MIT
00:09:39.840 | what the mechanism of building a better MIT is.
00:09:42.920 | - We're not gonna pay Elsevier.
00:09:44.720 | Aaron Swartz was right.
00:09:45.960 | JSTOR is an abomination.
00:09:48.520 | - But why, who within MIT, who within institutions
00:09:52.320 | is going to do that when, just like you said,
00:09:55.000 | the people who are running the show are more senior.
00:09:57.880 | Why did I get Frank Wilczek to speak out?
00:10:00.320 | - So you're, it's basically individuals that step up.
00:10:03.840 | I mean, one of the surprising things about Elon
00:10:06.000 | is that one person can inspire so much.
00:10:09.640 | - He's got academic freedom.
00:10:11.240 | It just comes from money.
00:10:12.480 | - I don't agree with that.
00:10:16.600 | Do you think money, okay, so yes, certainly--
00:10:20.480 | - Sorry, and testicles.
00:10:23.080 | - You've, yes, but I think that testicles
00:10:25.080 | is more important than money.
00:10:26.440 | - Right.
00:10:27.280 | - Or guts.
00:10:29.000 | I think, I do agree with you, you speak about this a lot,
00:10:31.360 | that because the money in the academic institutions
00:10:34.080 | has been so constrained that people are misbehaving
00:10:37.520 | in horrible ways.
00:10:39.560 | But I don't think that if we reverse that
00:10:42.760 | and give a huge amount of money,
00:10:44.120 | people will all of a sudden behave well.
00:10:45.640 | I think it also takes guts.
00:10:46.880 | - No, you need to give people security.
00:10:49.080 | - Security, yes.
00:10:49.920 | - Like you need to know that you have a job on Monday
00:10:54.120 | when on Friday you say, "I'm not so sure
00:10:56.600 | "I really love diversity and inclusion."
00:10:59.160 | And somebody's like, "Wait, what?
00:11:00.840 | "You didn't love diversity?
00:11:02.360 | "We had a statement on diversity and inclusion
00:11:03.880 | "and you wouldn't sign?
00:11:05.040 | "Are you against the inclusion part
00:11:06.880 | "or are you against diversity?
00:11:08.040 | "Do you just not like people like you?"
00:11:10.040 | You're like, "Actually, that has nothing to do with anything.
00:11:12.920 | "You're making this into something that it isn't.
00:11:14.960 | "I don't wanna sign your goddamn stupid statement.
00:11:17.640 | "And get out of my lab."
00:11:19.680 | Get out of my lab, it all begins from the middle finger.
00:11:22.920 | Get out of my lab.
00:11:25.480 | The administrators need to find other work.
00:11:28.460 | - Yeah, listen, I agree with you
00:11:31.200 | and I hope to seek your advice and wisdom
00:11:36.040 | as we change this because I'd love to see--
00:11:38.880 | - I will visit you in prison if that's what you're asking.
00:11:42.400 | - I have no, I think prison is great.
00:11:45.000 | You get a lot of reading done and good working out.
00:11:49.040 | Well, let me ask, something I brought up before
00:11:54.200 | is the Nietzsche quote of,
00:11:55.800 | "Beware that when fighting monsters,
00:11:57.880 | "you yourself do not become a monster.
00:12:00.040 | "For when you gaze long into the abyss,
00:12:02.440 | "the abyss gazes into you."
00:12:04.240 | Are you worried that your focus on the flaws in the system
00:12:08.400 | that we've just been talking about has damaged your mind
00:12:12.360 | or the part of your mind that's able to see the beauty
00:12:15.760 | in the world in the system?
00:12:18.560 | That because you have so sharply been able
00:12:23.120 | to see the flaws in the system,
00:12:25.480 | you can no longer step back and appreciate its beauty?
00:12:28.880 | - Look, I'm the one who's trying to get the institutions
00:12:33.180 | to save themselves by getting rid of their inhabitants
00:12:36.080 | but leaving the institution, like a neutron bomb
00:12:38.760 | that removes the unworkable leadership class
00:12:43.760 | but leaves the structures.
00:12:45.400 | - So the leadership class is really the problem.
00:12:48.320 | - The leadership class is the problem.
00:12:49.160 | - But the individual, like the professors,
00:12:50.920 | the individual scholars--
00:12:51.760 | - Well, the professors are gonna have to go back
00:12:54.200 | into training to remember how to be professors.
00:12:57.840 | Like people are cowards at the moment
00:12:59.640 | because if they're not cowards, they're unemployed.
00:13:02.240 | - Yeah, that's one of the disappointing things
00:13:05.660 | I've encountered is to me, tenure--
00:13:08.280 | - But nobody has tenure now.
00:13:11.200 | - Whether they do or not, they certainly don't have
00:13:20.920 | the kind of character and fortitude
00:13:23.280 | that I was hoping to see.
00:13:25.120 | To me--
00:13:25.960 | - But they'd be gone.
00:13:26.960 | See, you're dreaming about the people
00:13:30.800 | who used to live at MIT.
00:13:34.180 | You're dreaming about the previous inhabitants
00:13:38.440 | of your university.
00:13:40.240 | And if you looked at somebody like,
00:13:42.320 | Isadore Singer is very old, I don't know what state he's in
00:13:46.520 | but that guy was absolutely the real deal.
00:13:49.560 | And if you look at Noam Chomsky,
00:13:51.720 | tell me that Noam Chomsky has been muzzled, right?
00:13:55.200 | Now, what I'm trying to get at is you're talking
00:13:59.220 | about younger energetic people, but those people,
00:14:02.120 | like when I say something like,
00:14:03.840 | I'm against, I'm for inclusion and I'm for diversity
00:14:08.840 | but I'm against diversity and inclusion TM,
00:14:12.920 | like the movement.
00:14:13.820 | Well, I couldn't say that if I was a professor.
00:14:19.400 | Oh my God, he's against our sacred document.
00:14:21.880 | Okay, well, in that kind of a world,
00:14:24.840 | do you wanna know how many things I don't agree with you on?
00:14:27.520 | Like we could go on for days and days and days,
00:14:29.440 | all of the nonsense that you've parroted
00:14:31.640 | inside of the institution.
00:14:33.840 | Any sane person has no need for it.
00:14:36.320 | They have no want or desire.
00:14:38.560 | - Do you think you have to have some patience for nonsense
00:14:44.740 | when many people work together in a system?
00:14:47.280 | - How long has string theory gone on for
00:14:48.960 | and how long have I been patient?
00:14:51.240 | Okay, so you're talking about--
00:14:52.080 | - There's a limit to patience, I imagine.
00:14:53.400 | - You're talking about like 36 years
00:14:55.760 | of modern nonsense in string theory.
00:14:58.000 | - So you can do like eight to 10 years, but not more.
00:15:00.800 | - I can do 40 minutes.
00:15:03.020 | This is 36 years.
00:15:05.280 | - Well, you've done that over two hours already.
00:15:06.880 | - No, but it's-- - I appreciate.
00:15:08.320 | - But it's been 36 years of nonsense
00:15:10.640 | since the anomaly cancellation in string theory.
00:15:14.000 | It's like, what are you talking about about patience?
00:15:16.800 | I mean, Lex, you're not even acting like yourself.
00:15:20.160 | Well, you're trying to stay in the system.
00:15:23.520 | - I'm not trying, I'm not.
00:15:25.080 | I'm trying to see if perhaps,
00:15:28.640 | so my hope is that the system just has a few assholes in it,
00:15:32.640 | which you highlight,
00:15:35.200 | and the fundamentals of the system are broken,
00:15:38.600 | because if the fundamentals of the systems are broken,
00:15:41.640 | then I just don't see a way for MIT to succeed.
00:15:45.740 | Like, I don't see how young people take over MIT.
00:15:50.260 | I don't see how--
00:15:51.300 | - By inspiring us.
00:15:54.460 | You know, the great part about being at MIT,
00:15:58.460 | like when you saw the genius in these pranks,
00:16:02.260 | the heart, the irreverence, it's like, don't,
00:16:06.380 | we were talking about Tom Lehrer the last time.
00:16:08.700 | Tom Lehrer was as naughty as the day is long, agreed?
00:16:12.340 | - Agreed.
00:16:13.220 | - Was he also a genius?
00:16:14.740 | Was he well-spoken?
00:16:15.740 | Was he highly cultured?
00:16:17.900 | He was so talented, so intellectual,
00:16:20.140 | that he could just make fart jokes morning, noon, and night.
00:16:23.380 | Okay, well, in part, the right to make fart jokes,
00:16:27.180 | the right to, for example, put a functioning phone booth
00:16:30.440 | that was ringing on top of the Great Dome at MIT
00:16:33.540 | has to do with we are such badasses
00:16:35.460 | that we can actually do this stuff.
00:16:37.900 | Well, don't tell me about it anymore.
00:16:39.740 | Go break the law.
00:16:41.740 | Go break the law in a way that inspires us
00:16:44.100 | and makes us not want to prosecute you.
00:16:46.700 | Break the law in a way that lets us know
00:16:49.540 | that you're calling us out on our bullshit,
00:16:51.060 | that you're filled with love,
00:16:53.260 | and that our technical talent has not gone to sleep,
00:16:57.260 | it's not incapable, and if the idea is
00:17:00.660 | that you're gonna dig a moat around the university
00:17:03.360 | and fill it with tiger sharks, that's awesome,
00:17:07.420 | 'cause I don't know how you're gonna do it,
00:17:08.780 | but if you actually manage to do that,
00:17:10.840 | I'm not gonna prosecute you under a reckless endangerment.
00:17:14.800 | - That's beautifully put.
00:17:18.120 | I hope those, first of all, they'll listen.
00:17:20.840 | I hope young people at MIT will take over
00:17:22.920 | in this kind of way.
00:17:24.260 | In the introduction to your podcast episode
00:17:27.640 | on Jeffrey Epstein, you give to me a really moving story,
00:17:32.640 | but unfortunately for me, too brief,
00:17:37.300 | about your experience with a therapist
00:17:40.140 | and a lasting terror that permeated your mind.
00:17:42.440 | Can you go there?
00:17:47.080 | Can you tell?
00:17:48.240 | - I don't think so.
00:17:49.080 | I mean, I appreciate what you're saying.
00:17:50.820 | I said it obliquely.
00:17:51.960 | I said enough.
00:17:52.820 | There are bad people who cross our paths,
00:17:57.080 | and the current vogue is to say, oh, I'm a survivor.
00:18:02.080 | I'm a victim.
00:18:05.940 | I can do anything I want.
00:18:08.600 | This is a broken person, and I don't know why
00:18:11.500 | I was sent to a broken person as a kid.
00:18:14.100 | And to be honest with you, I also felt like in that story,
00:18:17.160 | I say that I was able to say no,
00:18:19.260 | and this was like the entire weight of authority,
00:18:22.860 | and he was misusing his position,
00:18:26.780 | and I was also able to say no.
00:18:29.800 | What I couldn't say no to
00:18:32.920 | was having him re-inflicted in my life.
00:18:35.360 | - Right, so you were sent back.
00:18:38.380 | - Yeah, second time.
00:18:39.220 | I tried to complain about what had happened,
00:18:41.180 | and I tried to do it in a way that did not
00:18:43.260 | immediately cause horrific consequences
00:18:48.700 | to both this person and myself,
00:18:50.340 | because we don't have the tools
00:18:52.780 | to deal with sexual misbehavior.
00:18:58.340 | We have nuclear weapons.
00:19:00.260 | We don't have any way of saying,
00:19:02.560 | this is probably not a good place
00:19:05.720 | or a role for you at this moment as an authority figure,
00:19:09.920 | and something needs to be worked on.
00:19:11.680 | So in general, when we see somebody
00:19:13.840 | who is misbehaving in that way,
00:19:17.000 | our immediate instinct is to treat the person as Satan,
00:19:22.000 | and we understand why.
00:19:26.000 | We don't want our children to be at risk.
00:19:28.720 | Now, I personally believe that I fell down on the job
00:19:34.920 | and did not call out the Jeffrey Epstein thing early enough
00:19:38.200 | because I was terrified of what Jeffrey Epstein represents,
00:19:41.080 | and this recapitulated the old terror,
00:19:44.080 | trying to tell the world, this therapist is out of control.
00:19:48.120 | And when I said that, the world responded by saying,
00:19:51.880 | well, you have two appointments booked,
00:19:53.480 | and you have to go for the second one.
00:19:55.440 | So I got re-inflicted into this office on this person
00:19:59.800 | who was now convinced that I was about to tear down
00:20:01.400 | his career and his reputation
00:20:02.840 | and might have been on the verge of suicide for all I know.
00:20:04.720 | I don't know.
00:20:06.200 | But he was very, very angry,
00:20:08.120 | and he was furious with me
00:20:09.520 | that I had breached the sacred confidence of his office.
00:20:12.480 | - What kind of ripple effects does that have,
00:20:16.520 | has that had to the rest of your life?
00:20:19.220 | The absurdity and the cruelty of that.
00:20:23.200 | I mean, there's no sense to it.
00:20:24.760 | - Well, see, this is the thing
00:20:27.960 | people don't really grasp, I think.
00:20:31.880 | There's an academic who I got to know many years ago
00:20:35.760 | named Jennifer Freyd, who has a theory of betrayal,
00:20:42.160 | which she calls institutional betrayal.
00:20:44.320 | And her gambit is that when you were betrayed
00:20:46.800 | by an institution that has sort of a fiduciary
00:20:50.480 | or parental obligation to take care of you,
00:20:53.700 | that you find yourself in a far different situation
00:20:58.720 | with respect to trauma than if you were betrayed
00:21:01.240 | by somebody who's a peer.
00:21:03.600 | And so I think that in my situation,
00:21:08.560 | I kind of repeat a particular dynamic with authority.
00:21:15.480 | I come in not following all the rules,
00:21:20.180 | trying to do some things, not trying to do others,
00:21:23.340 | blah, blah, blah.
00:21:24.560 | And then I get into a weird relationship with authority.
00:21:28.520 | And so I have more experience
00:21:30.120 | with what I would call institutional betrayal.
00:21:32.520 | Now, the funny part about it is that
00:21:35.920 | when you don't have masks or PPE
00:21:39.080 | in an influenza-like pandemic,
00:21:42.720 | and you're missing ICU beds and ventilators,
00:21:45.440 | that is ubiquitous institutional betrayal.
00:21:49.880 | So I believe that in a weird way, I was very early.
00:21:53.440 | The idea of, and this is like the really hard concept,
00:21:58.680 | pervasive or otherwise universal institutional betrayal,
00:22:02.680 | where all of the institutions,
00:22:04.200 | you can count on any hospital to not charge you properly
00:22:07.860 | for what their services are.
00:22:09.920 | You can count on no pharmaceutical company
00:22:12.520 | to produce the drug that will be maximally beneficial
00:22:15.600 | to the people who take it.
00:22:17.480 | You know that your financial professionals
00:22:20.000 | are not simply working in your best interest.
00:22:22.800 | And that issue had to do with
00:22:25.160 | the way in which growth left our system.
00:22:28.240 | So I think that the weird thing is
00:22:29.960 | that this first institutional betrayal by a therapist
00:22:33.640 | left me very open to the idea of,
00:22:35.840 | okay, well, maybe the schools are bad.
00:22:37.500 | Maybe the hospitals are bad.
00:22:38.760 | Maybe the drug companies are bad.
00:22:40.120 | Maybe our food is off.
00:22:41.800 | Maybe our journalists are not serving journalistic ends.
00:22:44.960 | And that was what allowed me
00:22:46.920 | to sort of go all the distance and say,
00:22:49.560 | huh, I wonder if our problem is that something
00:22:52.560 | is causing all of our sense-making institutions to be off.
00:22:57.240 | That was the big insight.
00:22:58.360 | And tying that to a single ideology,
00:23:02.320 | what if it's just about growth?
00:23:03.680 | They were all built on growth,
00:23:05.080 | and now we've promoted people who are capable
00:23:08.160 | of keeping quiet that their institutions aren't working.
00:23:11.640 | So the privileged, silent aristocracy,
00:23:16.040 | the people who can be counted upon
00:23:18.000 | not to mention a fire when a raging fire
00:23:20.320 | is tearing through a building.
00:23:23.520 | - But nevertheless, how big of a psychological burden is that?
00:23:28.520 | - It's huge.
00:23:29.400 | It's terrible.
00:23:30.240 | It's crushing.
00:23:31.080 | It's very-- - It's very comforting
00:23:33.040 | to be the parental.
00:23:34.880 | I mean, I don't know.
00:23:37.320 | I treasure, I mean, we were just talking about MIT.
00:23:41.240 | I can intellectualize and agree
00:23:44.120 | with everything you're saying,
00:23:45.360 | but there's a comfort, a warm blanket
00:23:47.600 | of being within the institution.
00:23:49.960 | And up until Aaron Swartz, let's say.
00:23:53.100 | In other words, now, if I look at the provost
00:23:56.880 | and the president as mommy and daddy,
00:23:59.160 | you did what to my big brother?
00:24:01.080 | You did what to our family?
00:24:06.440 | You sold us out in which way?
00:24:08.340 | What secrets left for China?
00:24:11.940 | You hired which workforce?
00:24:13.720 | You did what to my wages?
00:24:16.120 | You took this portion of my grant for what purpose?
00:24:18.660 | You just stole my retirement through a fringe rate?
00:24:21.320 | What did you do?
00:24:22.680 | - But can you still, I mean,
00:24:24.720 | the thing is about this view you have
00:24:28.560 | is it often turns out to be sadly correct.
00:24:31.420 | - But this is the thing.
00:24:32.920 | - But let me just, in this silly hopeful thing,
00:24:37.440 | do you still have hope in institutions?
00:24:39.880 | Can you within your-- - Yes.
00:24:41.160 | - Psychologically. - Yes.
00:24:42.640 | - I'm referring not intellectually.
00:24:44.540 | Because you have to carry this burden,
00:24:50.360 | can you still have a hope within you?
00:24:50.360 | When you sit at home alone,
00:24:52.640 | and as opposed to seeing the darkness
00:24:55.200 | within these institutions, seeing a hope.
00:24:57.560 | - Well, but this is the thing.
00:24:58.760 | I want to confront, not for the purpose of a dust up.
00:25:03.760 | I believe, for example, if you've heard episode 19,
00:25:08.320 | that the best outcome is for Carol Greider to come forward,
00:25:13.320 | as we discussed in episode 19.
00:25:15.880 | - With your brother, Bret Weinstein.
00:25:17.360 | - And say, you know what? - It's a great episode.
00:25:19.120 | - I screwed up.
00:25:20.680 | He did call, he did suggest the experiment.
00:25:23.960 | I didn't understand that it was his theory
00:25:26.120 | that was producing it.
00:25:27.200 | Maybe I was slow to grasp it.
00:25:30.260 | But my bad, and I don't want to pay for this bad choice
00:25:35.260 | on my part, let's say, for the rest of my career.
00:25:42.440 | I want to own up, and I want to help make sure
00:25:44.680 | that we do what's right with what's left.
00:25:48.200 | - And that's one little case within the institution
00:25:50.640 | that you would like to see made.
00:25:51.800 | - I would like to see MIT very clearly come out
00:25:55.400 | and say, you know, Margot O'Toole was right
00:25:57.520 | when she said David Baltimore's lab here
00:25:59.960 | produced some stuff that was not reproducible
00:26:05.720 | with Thereza Imanishi-Kari's research.
00:26:08.700 | I want to see the courageous people.
00:26:11.840 | I would like to see the Aaron Swartz wing
00:26:14.880 | of the computer science department.
00:26:17.720 | Yeah, let's think about it.
00:26:20.520 | Wouldn't that be great if they said,
00:26:21.920 | you know, an injustice was done,
00:26:23.800 | and we're gonna right that wrong
00:26:25.900 | just as if this was Alan Turing?
00:26:27.740 | - Which I don't think they've righted that wrong.
00:26:31.840 | - Well, then let's have the Turing-Schwartz wing.
00:26:34.400 | - The Turing-Schwartz, they're starting
00:26:36.680 | a new college of computing.
00:26:37.880 | It wouldn't be wonderful to call it the Turing-Schwartz.
00:26:40.360 | - I would like to have the Madame Wu wing
00:26:42.520 | of the physics department,
00:26:44.000 | and I'd love to have the Emmy Noether statue
00:26:47.240 | in front of the math department.
00:26:48.520 | I mean, like, you want to get excited
00:26:49.840 | about actual diversity and inclusion?
00:26:52.560 | Well, let's go with our absolute best people
00:26:54.720 | who never got theirs, 'cause there is structural bigotry.
00:26:57.560 | But if we don't actually start celebrating
00:27:01.880 | the beautiful stuff that we're capable of
00:27:03.880 | when we're handed heroes and we fumble them into the trash,
00:27:07.660 | what the hell?
00:27:08.500 | I mean, Lex, this is such nonsense.
00:27:13.500 | Just pulling our head out.
00:27:17.220 | You know, on everyone's cecum should be tattooed:
00:27:22.340 | if you can read this, you're too close.
00:27:24.520 | - Beautifully put, and I'm a dreamer just like you.
00:27:32.160 | So I don't see as much of the darkness,
00:27:36.400 | genetically or due to my life experience,
00:27:39.940 | but I do share the hope for MIT,
00:27:43.820 | the institution that we care a lot about.
00:27:45.620 | - We both do.
00:27:46.600 | - Yeah, and Harvard, the institution
00:27:48.660 | I don't give a damn about, but you do.
00:27:50.960 | - I love Harvard.
00:27:52.000 | - I'm just kidding.
00:27:52.840 | - I love Harvard, but Harvard and I
00:27:54.800 | have a very difficult relationship,
00:27:56.480 | and part of what, you know,
00:27:57.760 | when you love a family that isn't working,
00:27:59.920 | I don't want to trash, I didn't bring up the name
00:28:04.080 | of the president of MIT during the Aaron Swartz period.
00:28:07.160 | It's not vengeance, I want the rot cleared out.
00:28:12.080 | I don't need to go after human beings.
00:28:14.720 | - Yeah, just like you said, with the DISC formulation,
00:28:19.480 | the individual human beings don't necessarily carry the--
00:28:24.000 | - It's those chairs that are so powerful
00:28:28.200 | in which they sit.
00:28:29.520 | - It's the chairs, not the humans.
00:28:31.240 | - It's not the humans.