
Eric Weinstein's Harvard Story - The System Breaks Down in Novel Situations | AI Podcast Clips


Whisper Transcript

00:00:00.000 | - Without naming names, can you tell the story
00:00:05.000 | of your struggle during your time at Harvard?
00:00:09.620 | Maybe in a way that tells the bigger story
00:00:13.180 | of the struggle of young, bright minds
00:00:15.880 | that are trying to come up with big, bold ideas
00:00:20.320 | within the institutions that we're talking about?
00:00:23.840 | - You can start.
00:00:26.800 | I mean, in part, it starts with coffee,
00:00:31.800 | with a couple of Croatians in the math department at MIT.
00:00:36.920 | And we used to talk about music and dance
00:00:44.640 | and math and physics and love and all this kind of stuff
00:00:49.000 | as Eastern Europeans love to, and I ate it up.
00:00:54.600 | And my friend, Gordana, who was an instructor
00:00:58.600 | in the MIT math department
00:01:00.120 | when I was a graduate student at Harvard,
00:01:02.160 | said to me, and I'm probably gonna do a bad version
00:01:05.960 | of her accent.
00:01:06.800 | - There we go.
00:01:07.640 | - Will I see you tomorrow at the secret seminar?
00:01:12.800 | And I said, what secret seminar?
00:01:16.380 | Eric, don't joke.
00:01:18.860 | I said, I'm not used to this style of humor, Gordana.
00:01:24.160 | Eric, the secret seminar that your advisor is running,
00:01:27.420 | I said, what are you talking about?
00:01:30.440 | Ha ha ha ha.
00:01:32.600 | You know, your advisor is running a secret seminar
00:01:35.960 | on this aspect, I think it was like
00:01:37.440 | the Chern-Simons invariant.
00:01:39.300 | Not sure what the topic was again,
00:01:42.400 | but she gave me the room number and the time
00:01:45.320 | and she was like not cracking a smile.
00:01:47.280 | I've never known her to make this kind of a joke.
00:01:49.760 | And I thought this was crazy.
00:01:50.720 | And I was trying not to have an advisor.
00:01:52.640 | I didn't want an advisor, but people said
00:01:54.280 | you have to have one, so I took one.
00:01:56.880 | And I went to this room like 15 minutes early
00:02:01.880 | and there was not a soul inside it.
00:02:03.440 | It was outside of the math department.
00:02:05.520 | And it was still in the same building,
00:02:09.080 | the Science Center at Harvard.
00:02:10.580 | And I sat there and I let five minutes go by,
00:02:14.320 | I let seven minutes go by, 10 minutes go by,
00:02:16.280 | there was nobody.
00:02:17.640 | I thought, okay, so this was all an elaborate joke.
00:02:21.120 | And then like three minutes to the hour,
00:02:23.120 | this graduate student walks in
00:02:26.400 | and sees me and does a double take.
00:02:29.080 | And then I start to see the professors
00:02:31.600 | in geometry and topology start to file in.
00:02:34.480 | And everybody's like very disconcerted
00:02:39.040 | that I'm in this room.
00:02:40.140 | And finally, the person who was supposed to be my advisor
00:02:46.040 | walks in to the seminar and sees me
00:02:49.400 | and goes white as a ghost.
00:02:51.740 | And I realized that the secret seminar is true,
00:02:56.660 | that the department is conducting a secret seminar
00:03:01.660 | on the exact topic that I'm interested in,
00:03:04.500 | not telling me about it,
00:03:06.460 | and that these are the reindeer games
00:03:08.700 | that the Rudolphs of the department are not invited to.
00:03:12.420 | And so then I realized, okay, I did not understand it.
00:03:16.220 | There's a parallel department.
00:03:19.620 | And that became the beginning of an incredible odyssey
00:03:24.620 | in which I came to understand that the game
00:03:32.180 | that I had been sold about publication,
00:03:36.260 | about blind refereeing, about openness
00:03:40.860 | and scientific transmission of information was all a lie.
00:03:48.820 | I came to understand that at the very top,
00:03:51.860 | there's a second system that's about closed meetings
00:03:56.480 | and private communications and agreements about citation
00:04:01.480 | and publication that the rest of us don't understand.
00:04:06.380 | And that in large measure,
00:04:08.420 | that is the thing that I won't submit to.
00:04:11.460 | And so when you ask me questions like,
00:04:13.220 | well, why wouldn't you feel good about
00:04:15.660 | talking to your critics?
00:04:16.580 | Or why wouldn't you feel?
00:04:17.500 | The answer is, oh, you don't know.
00:04:19.620 | Like if you stay in a nice hotel,
00:04:21.720 | you don't realize that there's an entire second structure
00:04:25.200 | inside of that hotel,
00:04:26.940 | where like there's usually a worker's cafe
00:04:29.800 | in a resort complex that isn't available
00:04:32.660 | to the people who are staying in the hotel.
00:04:34.340 | And then there are private hallways inside the same hotel
00:04:39.340 | that are parallel structures.
00:04:42.400 | So that's what I found, which was in essence,
00:04:45.420 | just the way you can stay in hotels your whole life
00:04:47.540 | and not realize that inside of every hotel
00:04:49.820 | is a second structure
00:04:50.940 | that you're not supposed to see as the guest.
00:04:53.600 | There is a second structure inside of academics
00:04:56.560 | that behaves totally differently
00:04:58.340 | with respect to how people get dinged,
00:05:00.920 | how people get their grants taken away,
00:05:02.700 | how this person comes to have that thing named after them.
00:05:07.660 | And this pretense that we're not running
00:05:11.320 | a parallel structure,
00:05:14.660 | I have no patience for it anymore.
00:05:16.820 | So I got a chance to see how the game,
00:05:19.220 | how hardball is really played at Harvard.
00:05:21.600 | And I'm now eager to play hardball
00:05:25.660 | back with the same people who played hardball with me.
00:05:29.960 | - Let me ask two questions on this.
00:05:32.980 | So one, do you think it's possible,
00:05:36.740 | so I call those people assholes.
00:05:39.580 | That's the technical term.
00:05:41.860 | Do you think it's possible
00:05:42.700 | that that's just not the entire system,
00:05:45.340 | but a part of the system?
00:05:47.740 | You can navigate, you can swim in the waters
00:05:53.460 | and find the groups of people who do aspire to--
00:05:57.100 | - The guy who rescued my PhD
00:05:59.380 | was one of the people who filed in to the secret seminar.
00:06:03.300 | - Right, but are there people--
00:06:06.860 | - I'm just trying to say-- - Who are outside of this?
00:06:08.620 | - Is he an asshole?
00:06:10.300 | - Well, yes, I was a bad--
00:06:12.220 | - No, but I'm trying to make this point,
00:06:13.860 | which is this isn't my failure
00:06:16.000 | to correctly map these people, it's yours.
00:06:19.400 | You have a simplification that isn't gonna work.
00:06:23.740 | - I think, okay, asshole's the wrong term.
00:06:25.620 | I would say lacking of character.
00:06:30.020 | - What would you have had these people do?
00:06:32.980 | Why did they do this?
00:06:34.100 | Why have a secret seminar?
00:06:35.580 | - I don't understand the exact dynamics
00:06:38.540 | of a secret seminar,
00:06:39.380 | but I think the right thing to do is to,
00:06:42.420 | I mean, to see individuals like you.
00:06:44.780 | There might be a reason to have a secret seminar,
00:06:47.340 | but they should detect that an individual like you,
00:06:51.300 | a brilliant mind who's thinking about certain ideas
00:06:54.660 | could be damaged by this.
00:06:55.980 | - I don't think that they see it that way.
00:06:58.360 | The idea is we're going to sneak food
00:07:01.140 | to the children we want to survive.
00:07:03.480 | - Yeah, so that's highly problematic,
00:07:05.460 | and there should be people within that room.
00:07:07.380 | I'm trying to say, this is the thing,
00:07:09.580 | the ball is thrown but it won't be caught.
00:07:12.300 | The problem is they know
00:07:15.000 | that most of their children won't survive,
00:07:17.100 | and they can't say that.
00:07:21.460 | - I see, sorry to interrupt.
00:07:24.700 | You mean that the fact that the whole system is underfunded,
00:07:29.700 | that they naturally have to pick favorites.
00:07:32.540 | - They live in a world which reached steady state
00:07:35.540 | at some level, let's say, in the early '70s.
00:07:40.160 | And in that world, before that time,
00:07:45.720 | you have a professor like Norman Steenrod,
00:07:47.940 | and you'd have 20 children, that is, graduate students,
00:07:50.780 | and all of them would go on to be professors,
00:07:52.420 | and all of them would want to have 20 children.
00:07:55.380 | So you start taking higher and higher powers of 20,
00:07:59.620 | and you see that the system could not,
00:08:01.540 | it's not just about money, the system couldn't survive.
00:08:04.980 | So the way it's supposed to work now
00:08:07.380 | is that we should shut down the vast majority
00:08:10.380 | of PhD programs, and we should let the small number
00:08:14.160 | of truly top places populate mostly teaching
00:08:19.160 | and research departments that aren't PhD producing.
00:08:23.500 | We don't want to do that because we use PhD students
00:08:26.420 | as a labor force.
00:08:27.340 | So the whole thing has to do with growth,
00:08:30.500 | resources, dishonesty.
00:08:33.100 | And in that world, you see all of these adaptations
00:08:37.500 | to a ruthless world where the key question is,
00:08:40.320 | where are we gonna bury this huge number of bodies
00:08:42.480 | of people who don't work out?
00:08:43.980 | So my problem was I wasn't interested in dying.
00:08:49.060 | - So you clearly highlighted there's aspects
00:08:52.620 | of the system that are broken, but as an individual,
00:08:55.180 | is your role to exit the system or just acknowledge
00:09:01.860 | that it's a game and win it?
00:09:03.100 | - My role is to survive and thrive in the public eye.
00:09:06.920 | In other words, when you have an escapee of the system--
00:09:13.020 | - Like yourself.
00:09:14.060 | - Such as, and that person says,
00:09:16.820 | you know, I wasn't exactly finished.
00:09:18.580 | Let me show you a bunch of stuff.
00:09:20.900 | Let me show you that the theory of telomeres
00:09:24.300 | never got reported properly.
00:09:26.000 | Let me show you that all of marginal economics
00:09:30.300 | is supposed to be redone with a different version
00:09:32.180 | of the differential calculus.
00:09:33.460 | Let me show you that you didn't understand
00:09:35.460 | the self-dual Yang-Mills equations correctly
00:09:37.940 | in topology and physics because they're in fact
00:09:41.900 | much more broadly found and it's only the mutations
00:09:46.980 | that happen in special dimensions.
00:09:48.780 | There are lots of things to say,
00:09:50.620 | but this particular group of people,
00:09:54.420 | like if you just take, where are all the Gen X
00:09:58.420 | and millennial university presidents?
00:10:00.520 | - Right.
00:10:02.220 | - Okay, they're all in a holding pattern.
00:10:06.740 | Now, why in this story of telomeres,
00:10:11.740 | was it an older professor and a younger graduate student?
00:10:15.860 | It's this issue of what would be called
00:10:18.420 | interference competition.
00:10:20.460 | So for example, orcas try to drown minke whales
00:10:24.220 | by covering their blowholes so that they suffocate
00:10:26.760 | because the needed resource is air.
00:10:29.240 | Okay, well, what do the universities do?
00:10:32.140 | They try to make sure that you can't be viable,
00:10:35.580 | that you need them, that you need their grants,
00:10:38.540 | you need to be zinged with overhead charges
00:10:43.540 | or fringe rates or all of the games
00:10:45.780 | that the locals love to play.
00:10:48.180 | Well, my point is, okay, what's the cost of this?
00:10:51.060 | How many people died as a result
00:10:53.260 | of these interference competition games?
00:10:56.340 | You know, when you take somebody like Douglas Prasher
00:10:58.520 | who did green fluorescent protein
00:11:00.680 | and he drives the shuttle bus, right,
00:11:03.300 | 'cause his grant runs out
00:11:04.840 | and he has to give away all of his research
00:11:06.360 | and all of that research gets a Nobel Prize
00:11:08.320 | and he gets to drive a shuttle bus for $35,000 a year.
00:11:11.280 | - What do you mean by died?
00:11:12.240 | Do you mean their career, their dreams, their passions?
00:11:14.120 | - Yeah, as an academic, Doug Prasher was dead
00:11:17.780 | for a long period of time.
00:11:19.200 | - Okay, so as a person who's escaped the system.
00:11:25.360 | - Yeah.
00:11:26.680 | - Can't you, 'cause you also have in your mind
00:11:30.560 | a powerful theory that may turn out to be useful, maybe not.
00:11:35.440 | - Let's hope.
00:11:36.880 | - Can't you also play the game enough,
00:11:40.200 | like with the children, so like publish, but also--
00:11:45.040 | - If you told me that this would work,
00:11:46.840 | really what I wanna do, you see,
00:11:49.120 | is I would love to revolutionize a field
00:11:53.540 | with an H index of zero.
00:11:55.320 | Like we have these proxies that count
00:12:00.100 | how many papers you've written,
00:12:01.420 | how cited are the papers you've written.
00:12:03.600 | All this is nonsense.
00:12:06.140 | - That's interesting, sorry, what do you mean by a field
00:12:08.420 | with an H index of zero?
00:12:09.460 | So a totally new field.
00:12:10.620 | - H index counts somehow how many papers have you gotten
00:12:13.900 | that get so many citations.
00:12:15.260 | Let's say H index undefined.
00:12:20.780 | Like for example, I don't have an advisor for my PhD,
00:12:25.660 | but I have to have an advisor
00:12:28.220 | as far as something called the Math Genealogy Project
00:12:31.160 | that tracks who advised whom,
00:12:34.020 | all the way down the line.
00:12:36.180 | So I am my own advisor, which sets up a loop, right?
00:12:41.020 | How many students do I have? An infinite number.
00:12:43.660 | Or descendants.
00:12:45.340 | They don't want to have that story,
00:12:47.380 | so I have to have a formal advisor, Raoul Bott,
00:12:50.540 | and my Wikipedia entry, for example,
00:12:52.380 | says that I was advised by Raoul Bott, which is not true.
00:12:55.180 | So you get fit into a system that says,
00:12:59.680 | well, we have to know what your H index is.
00:13:01.460 | We have to know, you know, where are you a professor
00:13:05.040 | if you want to apply for a grant?
00:13:06.140 | It makes all of these assumptions.
00:13:08.460 | What I'm trying to do is in part
00:13:10.460 | to show all of this is nonsense.
00:13:12.420 | This is proxy BS that came up in the institutional setting,
00:13:16.160 | and right now it's important for those of us
00:13:18.380 | who are still vital, like Elon,
00:13:20.780 | it would be great to have Elon
00:13:21.900 | as a professor of physics and engineering, right?
00:13:25.500 | - It seems ridiculous to say, but--
00:13:27.900 | - No, just as a shot in the arm.
00:13:30.860 | You know, like, it'd be great to have Elon at Caltech,
00:13:35.220 | even one day a week, one day a month.
00:13:38.200 | Okay, well, why can't we be in there?
00:13:41.300 | It's the same reason.
00:13:42.140 | Well, why can't you be on The View?
00:13:43.900 | Why can't you be on Bill Maher?
00:13:45.300 | We need to know what you're gonna do
00:13:46.660 | before we take you on the show.
00:13:48.860 | Well, I don't wanna tell you what I'm gonna do.
00:13:51.220 | - Do you think you need to be able
00:13:52.580 | to dance the dance a little bit?
00:13:55.000 | - I can dance the dance fine.
00:13:56.380 | - To be on The View.
00:13:57.220 | - Oh, come on.
00:13:58.460 | - So you can, yeah, you do.
00:13:59.540 | You're not-- - I can do that fine.
00:14:01.300 | Here's where it's, the place that it goes south is
00:14:04.500 | there's like a set of questions
00:14:07.500 | that get you into this more adversarial stuff,
00:14:10.300 | and you've in fact asked some of those
00:14:12.060 | more adversarial questions in this setting,
00:14:14.980 | and they're not things that are necessarily aggressive,
00:14:17.500 | but they're things that are making assumptions.
00:14:20.620 | - Right.
00:14:21.460 | - So when you have a question, it's like,
00:14:24.220 | Lex, are you avoiding your critics?
00:14:26.200 | It's just like, okay, well, why did you frame that that way?
00:14:30.040 | Or the next question would be,
00:14:31.540 | do you think that you should have a special exemption
00:14:35.100 | and that you should have the right to break rules
00:14:36.720 | and everyone else should have to follow them?
00:14:38.620 | Like that question I find enervating.
00:14:40.980 | It doesn't really come out of anything meaningful.
00:14:42.900 | It's just like we feel we're supposed to ask that
00:14:45.100 | of the other person to show that we're not captured
00:14:47.480 | by their madness.
00:14:48.860 | That's not the real question you wanna ask me.
00:14:50.860 | If you wanna get really excited about this,
00:14:52.540 | you wanna ask, do you think this thing is right?
00:14:56.140 | Yeah, weirdly, I do.
00:14:58.260 | Do you think that it's going to be
00:14:59.520 | immediately seen to be right?
00:15:00.700 | I don't.
00:15:01.540 | I think it's gonna have an interesting fight
00:15:04.460 | and it's gonna have an interesting evolution.
00:15:06.460 | And well, what do you hope to do with it
00:15:08.380 | in non-physical terms?
00:15:10.340 | Gosh, I hope it revolutionizes our relationship
00:15:14.460 | with people outside of the institutional framework
00:15:18.300 | and reinjects us into the institutional framework
00:15:21.020 | where we can do the most good
00:15:22.780 | to bring the institutions back to health.
00:15:24.880 | It's like these are positive uplifting questions.
00:15:29.060 | If you had Frank Wilczek, you wouldn't say,
00:15:31.500 | Frank, let's be honest, you have done very little
00:15:34.900 | with your life after the original huge show
00:15:38.620 | that you used to break into physics.
00:15:40.980 | We weirdly ask people different questions
00:15:43.620 | based upon how they sit down.
00:15:45.460 | - Yeah, that's very strange, right?
00:15:46.980 | But you have to understand that,
00:15:49.140 | so here's the thing, I get these days
00:15:54.140 | a large number of emails from people
00:15:56.580 | with the equivalent of a theory of everything for AGI.
00:15:59.620 | - Yeah.
00:16:00.700 | - And I use my own radar, BS radar, to detect.
00:16:06.940 | - Unfairly, perhaps, whether they're full of shit or not.
00:16:11.860 | - Right.
00:16:12.700 | I love where you're going with this, by the way.
00:16:16.100 | - And (laughs)
00:16:19.060 | my concern I often think about is
00:16:22.460 | there's elements of brilliance in what people write to me
00:16:25.860 | and I'm trying to, right now, as you made it clear,
00:16:30.660 | the kind of judgments and assumptions we make,
00:16:33.700 | how am I supposed to deal with you
00:16:36.180 | who are an outsider of the system
00:16:38.860 | and think about what you're doing?
00:16:42.060 | Because my radar is saying you're not full of shit.
00:16:44.920 | - But I'm also not completely outside of the system.
00:16:48.580 | - That's right, you've danced beautifully.
00:16:50.980 | You've actually got all the credibility
00:16:54.780 | that you're supposed to,
00:16:55.780 | all the nice little stamps of approval,
00:16:58.500 | not all, but a large enough amount.
00:17:01.680 | I mean, it's hard to put into words
00:17:05.360 | exactly why you sound,
00:17:09.760 | whether your theory turns out to be good or not,
00:17:14.100 | you sound like a special human being.
00:17:17.780 | - I appreciate that and thank you very much for saying that.
00:17:19.100 | - In a good way, right?
00:17:19.940 | - No, no, no.
00:17:20.780 | - So, but what am I supposed to do
00:17:22.700 | with that flood of emails from AGI folks?
00:17:25.300 | - Why do I sound different?
00:17:27.300 | - I don't know.
00:17:28.620 | And I would like to systemize that, I don't know.
00:17:31.780 | Look, when you're talking to people,
00:17:35.540 | you very quickly can surmise,
00:17:39.440 | am I claiming to be a physicist?
00:17:40.900 | No, I say it every turn, I'm not a physicist.
00:17:43.200 | When you say something about bundles,
00:17:47.320 | you say, well, can you explain it differently?
00:17:49.760 | I'm pushing around on this area, that lever over there.
00:17:54.760 | I'm trying to find something
00:17:57.680 | that we can play with and engage.
00:18:00.480 | And you know another thing is
00:18:01.960 | that I'll say something at scale.
00:18:05.000 | So if I was saying completely wrong things
00:18:07.100 | about bundles on the Joe Rogan program,
00:18:09.280 | you don't think that we would hear a crushing chorus?
00:18:12.400 | - Yes.
00:18:13.240 | - And same thing with geometric unity.
00:18:15.840 | So I put up this video from this Oxford lecture.
00:18:20.760 | I understand that it's not a standard lecture,
00:18:23.060 | but you haven't heard the most brilliant people
00:18:28.220 | in the field say, well, this is obviously nonsense.
00:18:31.440 | They don't know what to make of it.
00:18:33.200 | And they're gonna hide behind,
00:18:35.600 | well, he hasn't said enough detail.
00:18:37.080 | Where's the paper?
00:18:38.160 | - And where's the paper?
00:18:39.000 | I've seen the criticism.
00:18:41.080 | I've gotten the same kind of criticism.
00:18:42.440 | I've published a few things,
00:18:44.080 | like especially stuff related to Tesla.
00:18:47.740 | We did studies on Tesla vehicles,
00:18:50.660 | and the kind of criticism I've gotten
00:18:53.320 | was showed that they're completely--
00:18:55.320 | - Oh, right, like the guy who had Elon Musk
00:18:57.600 | on his program twice is gonna give us
00:18:59.160 | an accurate assessment.
00:19:00.240 | - Yeah, exactly, exactly.
00:19:02.160 | - It's just very low level.
00:19:03.520 | - Like without actually ever addressing the content.
00:19:07.960 | - You know, Lex, I think that in part,
00:19:13.280 | you're trying to solve a puzzle
00:19:14.800 | that isn't really your puzzle.
00:19:16.400 | I think you know that I'm sincere.
00:19:18.080 | You don't know whether the theory is gonna work or not.
00:19:21.200 | And you know that it's not coming out of somebody
00:19:23.480 | who's coming out of left field.
00:19:25.440 | Like the story makes sense.
00:19:26.780 | There's enough that's new and creative and different
00:19:30.560 | in other aspects where you can check me
00:19:32.880 | that your real concern is, are you really telling me
00:19:37.600 | that when you start breaking the rules,
00:19:39.120 | you see the system for what it is,
00:19:41.200 | and it's become really vicious and aggressive?
00:19:43.560 | And the answer is yes.
00:19:44.640 | And I had to break the rules in part
00:19:47.480 | because of learning issues,
00:19:48.800 | because I came into this field
00:19:51.120 | with a totally different set of attributes.
00:19:54.100 | My profile just doesn't look like anybody else's remotely.
00:19:57.560 | But as a result, what that did is it showed me
00:20:00.240 | what is the system true to its own ideals?
00:20:03.400 | Or does it just follow these weird procedures
00:20:05.640 | and then when you take it off the rails,
00:20:08.840 | it behaves terribly.
00:20:10.160 | And that's really what my story I think does
00:20:13.540 | is it just says, well, he completely takes the system
00:20:17.280 | into new territory where it's not expecting
00:20:19.480 | to have to deal with somebody
00:20:20.400 | with these confusing sets of attributes.
00:20:22.400 | And I think what he's telling us
00:20:24.880 | is he believes it behaves terribly.
00:20:26.860 | Now, if you take somebody with perfect standardized tests
00:20:31.860 | and a winner of math competitions
00:20:34.940 | and you put them in a PhD program,
00:20:37.880 | they're probably gonna be okay.
00:20:39.440 | I'm not saying that the system breaks down
00:20:45.400 | for everybody under all circumstances.
00:20:48.260 | I'm saying when you present the system
00:20:50.400 | with a novel situation, at the moment,
00:20:53.200 | it will almost certainly break down
00:20:55.000 | with probability approaching 100%.
00:20:58.320 | - But to me, the painful and the tragic thing
00:21:02.240 | is it, sorry to bring out my motherly instinct,
00:21:07.240 | but it feels like it's too much,
00:21:09.640 | it could be too much of a burden
00:21:11.040 | to exist outside the system.
00:21:12.600 | - Maybe, but-- - Psychologically.
00:21:14.560 | - First of all, I've got a podcast that I kinda like.
00:21:20.040 | I've got amazing friends.
00:21:22.280 | I have a life which has more interesting people
00:21:24.800 | passing through it than I know what to do with.
00:21:27.280 | And they haven't managed to kill me off yet,
00:21:29.240 | so so far, so good.
00:21:30.600 | (laughing)
00:21:32.840 | (upbeat music)