
Eric Weinstein: Geometric Unity and the Call for New Ideas & Institutions | Lex Fridman Podcast #88


Chapters

0:00 Introduction
2:08 World War II and the Coronavirus Pandemic
14:03 New leaders
31:18 Hope for our time
34:23 WHO
44:19 Geometric Unity
98:55 We need to get off this planet
100:47 Elon Musk
106:58 Take Back MIT
135:31 The time at Harvard
157:01 The Portal
162:58 Legacy


00:00:00.000 | The following is a conversation with Eric Weinstein,
00:00:03.180 | the second time we've spoken on this podcast.
00:00:06.140 | He's a mathematician with a bold and piercing intelligence,
00:00:09.500 | unafraid to explore the biggest questions in the universe
00:00:13.180 | and shine a light on the darkest corners of our society.
00:00:16.900 | He's the host of the Portal podcast,
00:00:20.580 | as part of which he recently released
00:00:22.980 | his 2013 Oxford lecture on his theory of geometric unity
00:00:28.260 | that is at the center of his lifelong efforts
00:00:31.220 | to arrive at a theory of everything
00:00:33.540 | that unifies the fundamental laws of physics.
00:00:36.520 | This conversation was recorded recently
00:00:38.500 | in the time of the coronavirus pandemic.
00:00:41.100 | For everyone feeling the medical, psychological,
00:00:43.340 | and financial burden of this crisis,
00:00:45.260 | I'm sending love your way.
00:00:47.060 | Stay strong, we're in this together, we'll beat this thing.
00:00:50.780 | This is the Artificial Intelligence Podcast.
00:00:53.300 | If you enjoy it, subscribe on YouTube,
00:00:55.340 | review it with five stars on Apple Podcasts,
00:00:57.460 | support it on Patreon,
00:00:58.900 | or simply connect with me on Twitter @lexfridman,
00:01:02.020 | spelled F-R-I-D-M-A-N.
00:01:04.860 | This show is presented by Cash App,
00:01:07.280 | the number one finance app in the App Store.
00:01:09.540 | When you get it, use code LEXPODCAST.
00:01:12.820 | Cash App lets you send money to friends,
00:01:14.620 | buy Bitcoin, and invest in the stock market
00:01:16.720 | with as little as $1.
00:01:18.780 | Since Cash App does fractional share trading,
00:01:21.020 | let me mention that the order execution algorithm
00:01:23.860 | that works behind the scenes
00:01:25.500 | to create the abstraction of the fractional orders
00:01:28.420 | is an algorithmic marvel.
00:01:30.500 | So big props to the Cash App engineers
00:01:32.540 | for solving a hard problem that in the end
00:01:35.060 | provides an easy interface that takes a step up
00:01:37.940 | to the next layer of abstraction of the stock market,
00:01:41.060 | making trading more accessible to new investors
00:01:43.980 | and diversification much easier.
00:01:47.140 | So again, if you get Cash App from the App Store
00:01:49.460 | or Google Play and use code LEXPODCAST,
00:01:53.500 | you get $10 and Cash App will also donate $10 to FIRST,
00:01:57.180 | an organization that is helping to advance robotics
00:01:59.700 | and STEM education for young people around the world.
00:02:02.580 | And now, here's my conversation with Eric Weinstein.
00:02:07.720 | Do you see a connection between World War II
00:02:11.380 | and the crisis we're living through right now?
00:02:14.080 | - Sure.
00:02:15.200 | The need for collective action,
00:02:17.180 | reminding ourselves of the fact
00:02:20.320 | that all of these abstractions,
00:02:21.740 | like everyone should just do exactly what he or she
00:02:25.740 | wants to do for himself and leave everyone else alone,
00:02:28.620 | none of these abstractions work in a global crisis.
00:02:31.860 | And this is just a reminder
00:02:33.980 | that we didn't somehow put all that behind us.
00:02:36.900 | - When I hear stories about my grandfather
00:02:39.700 | who was in the army in the Soviet Union,
00:02:42.060 | where most people die when you're in the army,
00:02:44.420 | there's a brotherhood that happens,
00:02:46.300 | there's a love that happens.
00:02:48.460 | Do you think that's something we're gonna see here?
00:02:50.660 | - Well, we're not there.
00:02:51.700 | I mean, what the Soviet Union went through.
00:02:55.100 | I mean, the enormity of the war on the Russian doorstep.
00:03:00.100 | - This is different.
00:03:03.260 | What we're going through now is not--
00:03:04.820 | - We can't talk about Stalingrad and COVID
00:03:06.660 | in the same breath yet.
00:03:08.060 | We're not ready.
00:03:08.940 | And the sort of, you know,
00:03:12.580 | just the sense of like the great patriotic war
00:03:15.540 | and the way in which I was very moved
00:03:18.380 | by the Soviet custom of newlyweds going
00:03:20.980 | and visiting war memorials on their wedding day.
00:03:23.340 | Like the happiest day of your life,
00:03:24.500 | you have to say thank you to the people
00:03:26.140 | who made it possible.
00:03:27.740 | We're not there.
00:03:28.980 | We're just restarting history.
00:03:32.060 | You know, I've called this,
00:03:34.620 | on the Rogan program I called it the Great Nap.
00:03:37.140 | - Yeah. - The 75 years
00:03:38.500 | with very little by historical standards
00:03:43.180 | in terms of really profound disruption.
00:03:46.180 | And so-- - When you call it
00:03:47.380 | the Great Nap, you mean lack of deep global tragedy?
00:03:51.660 | - Well, lack of realized global tragedy.
00:03:54.620 | So I think that the development, for example,
00:03:56.740 | of the hydrogen bomb, you know,
00:03:59.060 | was something that happened during the Great Nap.
00:04:02.740 | And that doesn't mean that people who lived during
00:04:07.380 | that time didn't feel fear, didn't know anxiety.
00:04:11.060 | But it was to say that most of the violent potential
00:04:13.620 | of the human species was not realized.
00:04:16.060 | It was in the form of potential energy.
00:04:18.420 | And this is the thing that I've sort of taken issue with
00:04:20.780 | in the description of Steven Pinker's optimism,
00:04:23.980 | is that if you look at the realized kinetic variables,
00:04:26.860 | things have been getting much better for a long time,
00:04:28.840 | which is the Great Nap.
00:04:30.540 | But it's not as if our fragility has not grown,
00:04:34.540 | our dependence on electronic systems,
00:04:36.860 | our vulnerability to disruption.
00:04:39.300 | And so all sorts of things have gotten much better.
00:04:42.820 | Other things have gotten much worse
00:04:44.620 | and the destructive potential has skyrocketed.
00:04:47.740 | - Is tragedy the only way we wake up from the Great Nap?
00:04:51.900 | - Well, no, you could also have, you know,
00:04:54.540 | jubilation about positive things,
00:04:57.100 | but it's harder to get people's attention.
00:04:59.420 | - Can you give an example of a big global positive thing
00:05:02.580 | that could happen?
00:05:03.640 | - I think that when, for example, just historically speaking,
00:05:07.080 | HIV went from being a death sentence
00:05:10.420 | to something that people could live with
00:05:12.140 | for a very long period of time,
00:05:14.320 | it would be great if that had happened on a Wednesday,
00:05:17.260 | right, like all at once,
00:05:18.380 | like you knew that things had changed.
00:05:20.500 | And so the bleed-in somewhat kills
00:05:23.140 | the sort of the Wednesday effect
00:05:25.340 | where it all happens on a particular day
00:05:28.440 | at a particular moment.
00:05:30.020 | I think if you look at the stock market here,
00:05:31.900 | you know, there's a very clear moment
00:05:33.500 | where you can see that the market absorbs
00:05:35.780 | the idea of the coronavirus.
00:05:38.260 | I think that with respect to positives,
00:05:41.900 | the moon landing was the best example of a positive
00:05:45.840 | that happened at a particular time,
00:05:48.000 | or recapitulating the Soviet-American link-up
00:05:52.520 | in terms of Apollo and Soyuz, right?
00:05:58.120 | Like that was a huge moment
00:05:59.600 | when you actually had these two nations connecting in orbit.
00:06:04.600 | And so, yeah, there are great moments
00:06:07.340 | where something beautiful and wonderful and amazing happens,
00:06:10.620 | you know, but it's just, there are fewer of them.
00:06:12.920 | That's why as much as I can't imagine proposing to somebody
00:06:16.760 | at a sporting event,
00:06:18.800 | when you have like 30,000 people waiting,
00:06:21.680 | and you know, like, she says yes, it's pretty exciting.
00:06:25.360 | So I think that we shouldn't discount that.
00:06:28.000 | - So how bad do you think it's going to get
00:06:32.960 | in terms of the global suffering
00:06:35.360 | that we're going to experience with this crisis?
00:06:38.720 | - I can't figure this one out.
00:06:40.400 | I'm just not smart enough.
00:06:41.400 | Something is going weirdly wrong.
00:06:44.040 | They're almost like two separate storylines.
00:06:46.560 | We're in one storyline,
00:06:48.640 | we aren't taking things nearly seriously enough.
00:06:52.240 | We see people using food packaging lids as masks
00:06:57.240 | who are doctors or nurses.
00:07:00.000 | We hear horrible stories about people dying needlessly
00:07:04.320 | due to triage.
00:07:06.600 | And that's a very terrifying story.
00:07:10.500 | On the other hand, there's this other story which says
00:07:13.580 | there are tons of ventilators someplace.
00:07:16.020 | We've got lots of masks, but they haven't been released.
00:07:19.740 | We've got hospital ships
00:07:21.020 | where none of the beds are being used.
00:07:23.460 | And it's very confusing to me
00:07:25.940 | that somehow these two stories give me the feeling
00:07:29.860 | that they both must be true simultaneously,
00:07:32.820 | and they can't both be true in any kind of standard way.
00:07:36.140 | I don't know whether it's just that I'm dumb,
00:07:38.420 | but I can't get one or the other story to quiet down.
00:07:41.660 | So I think weirdly, this is much more serious
00:07:44.340 | than we had understood it.
00:07:45.820 | And it's not nearly as serious
00:07:47.580 | as some people are making it out to be at the same time,
00:07:51.700 | and that we're not being given the tools
00:07:54.380 | to actually understand,
00:07:55.420 | oh, here's how to interpret the data,
00:07:57.060 | or here's the issue with the personal protective equipment
00:08:00.660 | is actually a jurisdictional battle
00:08:02.500 | or a question of who pays for it
00:08:04.780 | rather than a question of whether it's present or absent.
00:08:07.060 | I don't understand the details of it,
00:08:09.260 | but something is wildly off
00:08:11.140 | in our ability to understand where we are.
00:08:13.100 | - So that's policy, that's institutions.
00:08:15.700 | What about, do you think about the quiet suffering
00:08:18.860 | of millions of people that have lost their job?
00:08:22.720 | Is this a temporary thing?
00:08:24.660 | I mean, my ear's not attuned to the suffering
00:08:28.820 | of those people who have lost their job
00:08:30.860 | or the 50% possibly of small businesses
00:08:33.660 | that are gonna go bankrupt.
00:08:35.220 | Do you think about that quiet suffering?
00:08:38.900 | - Well-- - And how that might arise
00:08:40.660 | itself? - Could be not quiet too.
00:08:43.260 | I mean-- - Right, that's the--
00:08:44.700 | - Could be a depression.
00:08:46.020 | This could go from recession to depression,
00:08:48.420 | and depression could go to armed conflict and then to war.
00:08:51.220 | So it's not a very abstract causal chain
00:08:56.220 | that gets us to the point where we can begin
00:08:59.240 | with quiet suffering and anxiety
00:09:01.860 | and all of these sorts of things
00:09:03.100 | and people losing their jobs
00:09:04.580 | and people dying from stress and all sorts of things.
00:09:08.260 | But look, anything powerful enough to put us all indoors
00:09:13.260 | in a, I mean, think about this as an incredible experiment.
00:09:18.700 | Imagine that you proposed,
00:09:22.020 | hey, I wanna do a bunch of research.
00:09:24.420 | Let's figure out what changes in our emissions profiles
00:09:28.820 | for our carbon footprints when we're all indoors
00:09:31.180 | or what happens to traffic patterns
00:09:33.780 | or what happens to the vulnerability of retail sales
00:09:36.660 | as Amazon gets stronger, et cetera, et cetera.
00:09:40.980 | I believe that in many of those situations,
00:09:45.100 | we're running an incredible experiment.
00:09:48.540 | And am I worried for us all?
00:09:50.540 | Yes, there are some bright spots,
00:09:52.300 | one of which is that when you're ordered to stay indoors,
00:09:56.060 | people are gonna feel entitled.
00:09:57.620 | And the usual thing that people are going to hit
00:10:00.980 | when they hear that they've lost their job,
00:10:03.220 | there's this kind of tough love attitude
00:10:10.500 | that you see, particularly in the United States.
00:10:12.980 | Oh, you lost your job, poor baby.
00:10:15.540 | Well, go retrain, get another one.
00:10:18.220 | I think there's gonna be a lot less appetite for that
00:10:20.980 | because we've been asked to sacrifice,
00:10:24.260 | to risk, to act collectively.
00:10:27.060 | And that's the interesting thing.
00:10:28.660 | What does that reawaken in us?
00:10:30.380 | Maybe the idea that we actually are nations
00:10:33.540 | and that your fellow countrymen
00:10:35.940 | may start to mean something to more people.
00:10:38.660 | It certainly means something to people in the military,
00:10:41.420 | but I wonder how many people who aren't in the military
00:10:44.500 | start to think about this as like, oh yeah,
00:10:46.380 | we are kind of running separate experiments
00:10:48.700 | and we are not China.
00:10:50.140 | - So you think this is kind of a period
00:10:53.300 | that might be studied for years to come?
00:10:55.620 | From my perspective, we are a part of the experiment,
00:10:59.720 | but I don't feel like we have access to the full data,
00:11:03.060 | the full data of the experiment.
00:11:04.820 | We're just like little mice in a large--
00:11:08.340 | - Does this one make sense to you, Lex?
00:11:10.300 | - I'm romanticizing it
00:11:12.940 | and I keep connecting it to World War II.
00:11:15.100 | So I keep connecting to historical events
00:11:17.260 | and making sense of them through that way
00:11:19.660 | or reading "The Plague" by Camus.
00:11:22.260 | Almost kind of telling narratives and stories,
00:11:27.620 | but I'm not hearing the suffering
00:11:32.540 | that people are going through
00:11:33.940 | because I think that's quiet.
00:11:36.560 | Everybody's numb currently.
00:11:39.400 | They're not realizing what it means to have lost your job
00:11:42.920 | and to have lost your business.
00:11:44.280 | There's kind of a, I don't,
00:11:45.680 | I'm afraid how that fear will materialize itself
00:11:52.560 | once the numbness wears out.
00:11:56.260 | And especially if this lasts for many months,
00:12:00.280 | and if it's connected to the incompetence of the CDC
00:12:02.840 | and the WHO and our government
00:12:06.560 | and perhaps the election process.
00:12:09.400 | My biggest fear is that the elections get delayed
00:12:13.760 | or something like that.
00:12:14.920 | So the basic mechanisms of our democracy get
00:12:20.120 | slowed or damaged in some way
00:12:24.680 | that then mixes with the fear that people have
00:12:26.860 | that turns to panic, that turns to anger, that anger.
00:12:31.220 | - Can I just play with that for a little bit?
00:12:32.460 | - Sure.
00:12:33.740 | - What if, in fact, all of that structure
00:12:37.780 | that you grew up thinking about,
00:12:40.340 | and again, you grew up in two places, right?
00:12:43.180 | So when you were inside the US,
00:12:47.300 | we tend to look at all of these things as museum pieces.
00:12:50.700 | Like how often do we amend the Constitution anymore?
00:12:54.660 | And in some sense, if you think about the Jewish tradition
00:12:57.660 | of Simchat Torah, you've got this beautiful scroll
00:13:01.220 | that has been lovingly hand drawn in calligraphy
00:13:05.640 | that's very valuable.
00:13:08.640 | And it's very important that you not treat it
00:13:12.620 | as a relic to be revered.
00:13:15.900 | And so we, one day a year, we dance with the Torah
00:13:18.840 | and we hold this incredibly vulnerable document up
00:13:22.940 | and we treat it as if it was Ginger Rogers
00:13:26.700 | being led by Fred Astaire.
00:13:29.300 | Well, that is how you become part of your country.
00:13:33.800 | In fact, maybe the election will be delayed.
00:13:37.040 | Maybe extraordinary powers will be used.
00:13:39.380 | Maybe any one of a number of things will indicate
00:13:42.380 | that you're actually living through history.
00:13:44.140 | This isn't a museum piece that you were handed
00:13:46.180 | by your great-great-grandparents.
00:13:48.220 | - But you're kind of suggesting that there might be
00:13:51.820 | like a community thing that pops up,
00:13:54.060 | like as opposed to an angry revolution,
00:13:59.060 | it might have a positive effect of--
00:14:02.020 | - Well, for example, are you telling me
00:14:04.940 | that if the right person stood up
00:14:06.900 | and called for us to sacrifice PPE for our nurses
00:14:11.900 | and our MDs who are on the front lines,
00:14:17.300 | that like people wouldn't reach down deep
00:14:20.640 | in their own supply that they've been like stocking
00:14:23.080 | and carefully storing and just say, "Here, take it."
00:14:25.860 | Right now, an actual leader would use this time
00:14:32.340 | to bring out the heroic character,
00:14:36.540 | and I'm gonna just go wildly patriotic
00:14:38.660 | 'cause I friggin' love this country.
00:14:40.920 | We've got this dormant population in the US
00:14:43.580 | that loves leadership and country and pride in our freedom
00:14:49.860 | and not being told what to do,
00:14:51.600 | and we still have this thing that binds us together,
00:14:54.680 | and all of the merchants of division just be gone.
00:14:59.680 | - I totally agree with you.
00:15:02.460 | I think there is a deep hunger for that leadership.
00:15:05.200 | Why hasn't that, why hasn't one arisen?
00:15:08.040 | - Because we don't have the right surgeon general.
00:15:10.540 | We have guys saying, "Come on, guys, don't buy masks.
00:15:14.600 | "They don't really work for you.
00:15:15.680 | "Save them for our healthcare professionals."
00:15:19.000 | No, you can't do that.
00:15:21.220 | You have to say, "You know what?
00:15:22.220 | "These masks actually do work,
00:15:23.780 | "and they more work to protect other people from you,
00:15:27.600 | "but they would work for you.
00:15:28.940 | "They'll keep you somewhat safer if you wear them."
00:15:31.040 | Here's the deal.
00:15:32.220 | You've got somebody who's taking huge amounts
00:15:33.960 | of viral load all the time
00:15:35.420 | because the patients are shedding.
00:15:37.340 | Do you wanna protect that person who's volunteered
00:15:39.200 | to be on the front line, who's up sleepless nights?
00:15:41.760 | You just change the message.
00:15:44.060 | You stop lying to people.
00:15:45.980 | You just, you level with them.
00:15:47.760 | It's bad.
00:15:49.160 | - Absolutely, but that's a little bit specific,
00:15:52.360 | so you have to be just honest
00:15:54.200 | about the facts of the situation, yes.
00:15:56.280 | But I think you were referring to something bigger
00:15:58.360 | than just that, which is inspiring:
00:16:00.880 | rewriting the Constitution,
00:16:04.900 | sort of rethinking how we work as a nation.
00:16:07.840 | - Yeah, I think you should probably amend the Constitution
00:16:10.580 | once or twice in a lifetime
00:16:13.500 | so that you don't get this distance
00:16:15.640 | from the foundational documents.
00:16:17.260 | And part of the problem is that we've got two generations
00:16:21.500 | on top that feel very connected to the US.
00:16:24.720 | They feel bought in.
00:16:25.840 | And we've got three generations below.
00:16:28.000 | It's a little bit like watching your parents
00:16:31.120 | riding the tricycle that they were supposed
00:16:34.120 | to pass on to you.
00:16:35.440 | And it's like, you're now too old to ride a tricycle,
00:16:38.080 | and they're still whooping it up,
00:16:39.440 | ringing the bell with the streamers
00:16:41.120 | coming off the handlebars, and you're just thinking,
00:16:43.560 | "Do you guys never get bored?
00:16:45.240 | "Do you never pass a torch?
00:16:46.840 | "Do you really want to?"
00:16:48.500 | We had five septuagenarians, all born in the '40s,
00:16:51.980 | running for president of the United States
00:16:53.220 | when Klobuchar dropped out.
00:16:54.960 | The youngest was Warren.
00:16:56.180 | We had Warren, Biden, Sanders, Bloomberg, and Trump
00:17:00.380 | born from like 1949 back to 1941,
00:17:03.540 | all of whom would have been the oldest president at inauguration.
00:17:07.100 | And nobody says, "Grandma and Grandpa,
00:17:09.900 | "you're embarrassing us."
00:17:11.360 | - Except Joe Rogan.
00:17:13.900 | Let me put it on you.
00:17:16.320 | You have a big platform.
00:17:18.220 | You're somewhat of an intelligent, eloquent guy.
00:17:20.860 | What role do you play?
00:17:25.060 | Why aren't you that leader?
00:17:26.580 | I mean, I would argue that you're
00:17:29.220 | in ways becoming that leader.
00:17:32.380 | - So I haven't taken enough risk.
00:17:34.740 | Is that your idea?
00:17:35.980 | What should I do or say at the moment?
00:17:38.180 | - No, you're a little bit,
00:17:39.780 | no, you have taken quite big risks,
00:17:41.820 | and we'll talk about it.
00:17:43.820 | But you're also on the outside shooting in,
00:17:48.820 | meaning you're dismantling the institution
00:17:56.080 | from the outside as opposed to becoming the institution.
00:18:00.920 | - Do you remember that thing you brought up
00:18:02.000 | when you were on "The View"?
00:18:03.400 | - "The View"?
00:18:05.760 | - I'm sorry, when you were on "Oprah"?
00:18:08.000 | - I didn't make, I didn't get the invite.
00:18:09.880 | - I'm sorry, when you were on Bill Maher's program,
00:18:12.880 | what was that thing you were saying?
00:18:14.680 | They don't know we're here.
00:18:19.160 | They may watch us.
00:18:20.680 | - Yeah.
00:18:21.720 | - They may quietly slip us a direct message,
00:18:26.720 | but they pretend that this internet thing
00:18:28.800 | is some dangerous place where only lunatics play.
00:18:33.280 | - Well, who has the bigger platform,
00:18:35.340 | "The Portal" or Bill Maher's program or "The View"?
00:18:38.960 | - Bill Maher and "The View".
00:18:41.100 | - In terms of viewership or in terms of,
00:18:43.320 | what's the metric of size?
00:18:44.640 | - Well, first of all, the key thing is,
00:18:46.980 | take a newspaper and even imagine
00:18:51.080 | that it's completely fake, okay?
00:18:53.560 | And that has very little in the way of circulation.
00:18:56.160 | Yet, imagine that it's a 100-year-old paper
00:19:00.240 | and that it's still part of this game,
00:19:02.120 | this internal game of media.
00:19:03.760 | The key point is that those sources
00:19:07.380 | that have that kind of mark of respectability
00:19:12.380 | to the institutional structures matter in a way
00:19:17.360 | that even if I say something on a very large platform
00:19:20.200 | that makes a lot of sense,
00:19:21.780 | if it's outside of what I've called
00:19:23.420 | the gated institutional narrative, or GIN,
00:19:26.060 | it sort of doesn't matter to the institutions.
00:19:28.940 | So the game is, if it happens outside of the club,
00:19:33.400 | we can pretend that it never happened.
00:19:36.160 | - How can you get the credibility and the authority
00:19:38.860 | from outside the gated institutional narrative?
00:19:43.100 | - Well, first of all, you and I both share
00:19:47.300 | institutional credibility coming from our associations.
00:19:53.100 | So we were both at MIT?
00:19:54.900 | - Yes.
00:19:56.300 | - Were you at Harvard at any point?
00:19:57.640 | - Nope.
00:19:58.620 | - Okay, well--
00:19:59.900 | - I lived in Harvard Square.
00:20:01.740 | - So did I.
00:20:02.820 | But at some level, the issue isn't
00:20:06.200 | whether you have credentials in that sense.
00:20:09.460 | The key question is, can you be trusted
00:20:11.400 | to file a flight plan and not deviate from that flight plan
00:20:15.240 | when you are in an interview situation?
00:20:18.040 | Will you stick to the talking points?
00:20:19.880 | I will not.
00:20:20.720 | And that's why you're not going to be allowed
00:20:23.800 | in the general conversation,
00:20:26.660 | which amplifies these sentiments.
00:20:28.640 | - But I'm still trying to--
00:20:30.480 | - So your point would be, is that we're,
00:20:32.120 | let's say both, so you've done how many Joe Rogan?
00:20:35.680 | - Four.
00:20:36.520 | - I've done four too, right?
00:20:37.720 | So both of us are somewhat frequent guests.
00:20:40.000 | The show is huge, you know the power as well as I do.
00:20:43.020 | And people are gonna watch this conversation.
00:20:46.240 | Huge number watched our last one, by the way.
00:20:48.320 | I want to thank you for that one.
00:20:49.400 | That was a terrific, terrific conversation.
00:20:51.520 | Really did change my life.
00:20:52.920 | - Changed my life.
00:20:53.960 | - You're a brilliant interviewer, so thank you.
00:20:56.560 | - Thank you, Eric.
00:20:57.800 | That was, you changed my life too,
00:21:00.500 | that you gave me a chance.
00:21:01.680 | So that was--
00:21:02.520 | - No, no, no, I'm so glad I did that one.
00:21:04.640 | What I would say is, is that we keep mistaking
00:21:07.120 | how big the audience is for whether or not
00:21:09.360 | you have the KISS.
00:21:10.440 | And the KISS is a different thing.
00:21:12.080 | - KISS?
00:21:12.920 | What does that stand for?
00:21:14.400 | - It's not an acronym yet.
00:21:15.400 | - Okay.
00:21:16.240 | - Thank you for asking.
00:21:19.200 | It's a question of, are you part of the
00:21:21.800 | interoperable institution-friendly discussion?
00:21:26.000 | And that's the discussion which we ultimately
00:21:28.180 | have to break into.
00:21:29.600 | - But that's what I'm trying to get at,
00:21:31.200 | is how does Eric Weinstein become the President
00:21:34.720 | of the United States?
00:21:36.440 | - I shouldn't become the President of the United States.
00:21:38.480 | Not interested, thank you very much for asking.
00:21:40.320 | - Okay, get into a leadership position where,
00:21:43.800 | I guess I don't know what that means,
00:21:45.660 | but where you can inspire millions of people
00:21:49.260 | to inspire the sense of community,
00:21:52.320 | inspire the kind of actions required to overcome hardship,
00:21:57.320 | the kind of hardship that we may be experiencing,
00:22:00.260 | to inspire people to work hard and face
00:22:04.400 | the difficult, hard facts of the realities
00:22:06.960 | we're living through, all those kinds of things
00:22:08.960 | that you're talking about.
00:22:10.520 | That leader, can that leader emerge from the current
00:22:15.000 | institutions, or alternatively,
00:22:19.360 | can it also emerge from the outside?
00:22:21.880 | I guess that's what I was asking.
00:22:22.960 | - So my belief is that this is the last hurrah
00:22:26.420 | for the elderly, centrist kleptocrats.
00:22:30.260 | (laughing)
00:22:31.820 | - Can you define each of those terms?
00:22:33.820 | - Okay, elderly, I mean people who were born
00:22:37.620 | at least a year before I was.
00:22:41.140 | That's a joke, you can laugh.
00:22:42.580 | (laughing)
00:22:43.640 | No, because I'm born at the cusp of the Gen X boomer divide.
00:22:46.900 | Centrist, they're pretending there are two parties,
00:22:52.560 | the Democratic and the Republican, in the United States.
00:22:54.900 | I think it's easier to think of the mainstream
00:22:57.020 | of both of them as part of an aggregate party
00:22:59.540 | that I sometimes call the looting party,
00:23:01.780 | which gets us to kleptocracy, which is ruled by thieves.
00:23:05.900 | And the great temptation has been to treat
00:23:08.380 | the US like a trough, and you just have to get yours
00:23:11.020 | because it's not like we're doing anything productive.
00:23:13.380 | So everybody's sort of looting the family mansion,
00:23:16.180 | and somebody stole the silver, and somebody's
00:23:18.140 | cutting the pictures out of the frames.
00:23:20.500 | You know, roughly speaking, we're watching our elders
00:23:25.220 | live it up in a way that doesn't make sense
00:23:27.300 | to the rest of us.
00:23:28.900 | - Okay, so if it's the last hurrah,
00:23:31.260 | this is the time for leaders to step up?
00:23:34.380 | - Well, no, we're not ready yet.
00:23:36.580 | We're not ready, seriously, I call out,
00:23:40.580 | the head of the CDC should resign.
00:23:42.520 | Should resign.
00:23:44.480 | The Surgeon General should resign.
00:23:47.220 | Trump should resign.
00:23:48.060 | Pelosi should resign.
00:23:49.940 | de Blasio should resign.
00:23:50.780 | - But they're not gonna resign.
00:23:51.860 | - I understand that, so we'll wait.
00:23:55.280 | - No, but that's not how revolutions work.
00:23:57.480 | You don't wait for people to resign.
00:24:00.200 | You step up and inspire the alternative.
00:24:03.960 | - Do you remember the Russian Revolution of 1907?
00:24:06.440 | - That's before my time.
00:24:08.960 | - But there wasn't a Russian Revolution of 1907.
00:24:12.440 | - So you're thinking we're in 1907, not 1917.
00:24:14.160 | - I'm saying we're too early.
00:24:16.120 | - But we got this, you know, Spanish flu came in 1918,
00:24:20.520 | so I would argue that there are a lot of parallels there.
00:24:24.380 | - World War I.
00:24:25.400 | - I think it's not time yet.
00:24:26.680 | Like John Prine, the songwriter, just died of COVID.
00:24:31.680 | That was a pretty big--
00:24:34.280 | - Really?
00:24:35.120 | - Yeah.
00:24:35.940 | - By the way, yes, of course,
00:24:37.520 | every time we do this, we discover our mutual appreciation
00:24:44.680 | of obscure, brilliant, witty songwriters.
00:24:48.000 | - He's really quite good, right?
00:24:49.120 | - He's really good, yeah.
00:24:51.400 | He died--
00:24:52.600 | - My understanding is that he passed recently
00:24:54.460 | due to complications of corona.
00:24:56.900 | So we haven't had
00:25:01.260 | enough large, shocking deaths yet.
00:25:04.440 | Picturesque deaths, deaths of a family
00:25:07.740 | that couldn't get treatment.
00:25:09.200 | There are stories that will come and break our hearts,
00:25:13.600 | and we have not had enough of those.
00:25:15.100 | The visuals haven't come in.
00:25:16.540 | - But I think they're coming.
00:25:17.900 | - Well, we'll find out.
00:25:19.420 | - But you have to be there when they come.
00:25:22.500 | - But we didn't get the visual, for example,
00:25:24.880 | of the Falling Man from 9/11.
00:25:28.320 | So the outside world did, but Americans were not,
00:25:31.960 | it was thought that we would be too delicate.
00:25:33.920 | So just the way you remember Pulitzer Prize-winning
00:25:36.500 | photographs from the Vietnam era,
00:25:38.600 | you don't easily remember the photographs
00:25:42.360 | from all sorts of things that have happened since,
00:25:44.360 | because something changed in our media.
00:25:46.120 | We are insensitive, we cannot feel
00:25:48.640 | or experience our own lives.
00:25:51.900 | And the tragedy that would animate us to action.
00:25:55.440 | - Yeah, but I think there, again,
00:25:57.660 | I think there's going to be that suffering
00:25:59.480 | that's going to build and build and build
00:26:02.280 | in terms of businesses, mom and pop shops that close.
00:26:06.560 | And I think for myself, I think often
00:26:09.760 | that I'm being weak,
00:26:14.560 | and I feel like I should be doing something.
00:26:18.920 | I should be becoming a leader on a small scale.
00:26:21.400 | - You can't.
00:26:22.660 | This is not World War II, and this is not Soviet Russia.
00:26:26.500 | - Why not?
00:26:27.340 | Why not?
00:26:29.140 | - Because our internal programming,
00:26:31.580 | the malware that sits between our ears
00:26:34.060 | is much different than the propagandized malware
00:26:39.060 | of the Soviet era.
00:26:42.200 | I mean, people were both very indoctrinated
00:26:45.700 | and also knew at some level it was BS.
00:26:50.020 | They had a double mind.
00:26:51.760 | I don't know, there must be a great word in Russian
00:26:53.760 | for being able to think both of those things simultaneously.
00:26:58.620 | - You don't think people are actually sick
00:27:02.460 | of the partisanship, sick of incompetence?
00:27:06.380 | - Yeah, but I called for revolt the other day on Joe Rogan,
00:27:08.980 | people found it quixotic.
00:27:10.500 | - Well, because I think you're not,
00:27:13.840 | I think revolt is different.
00:27:15.500 | I think as like--
00:27:17.180 | - Okay, I'm really angry.
00:27:18.700 | I'm furious.
00:27:20.940 | I cannot stand that this is my country at the moment.
00:27:24.020 | I'm embarrassed.
00:27:25.300 | - So let's build a better one.
00:27:26.700 | - Yeah.
00:27:27.540 | - That's the--
00:27:28.380 | - I'm in.
00:27:29.200 | (laughing)
00:27:30.040 | - Okay, so, well--
00:27:31.180 | - Okay, but let's take over a few universities.
00:27:34.580 | Let's start running a different experiment
00:27:36.580 | at some of our better universities.
00:27:38.260 | When I did this experiment, and I said,
00:27:40.540 | if this were 40 years ago,
00:27:44.540 | the median age, I believe,
00:27:46.140 | of a university president was 51.
00:27:48.700 | That would put the person in Gen X,
00:27:51.260 | and we'd have a bunch of millennial presidents,
00:27:53.460 | and more than half Gen X.
00:27:56.860 | It's almost 100% baby boom at this point.
00:27:59.620 | And how did that happen?
00:28:02.660 | We can get into how they changed retirement,
00:28:05.080 | but this generation above us does not feel,
00:28:09.560 | or even the older generation, silent generation.
00:28:13.620 | I had Roger Penrose on my program.
00:28:16.900 | - Excellent conversation.
00:28:17.740 | - And I, thank you, really appreciate that.
00:28:19.900 | And I asked him a question that was very important to me.
00:28:21.500 | I said, look, you're in your late 80s.
00:28:24.980 | Is there anyone you could point to as a successor
00:28:27.580 | that we should be watching, we can get excited?
00:28:30.020 | You know, I said, here's an opportunity to pass the baton.
00:28:32.860 | He said, well, let me hold off on that.
00:28:35.260 | I was like, oh, is it ever the right moment
00:28:38.300 | to point to somebody younger than you
00:28:39.980 | to keep your flame alive after you're gone?
00:28:42.660 | And also, I don't know whether,
00:28:44.180 | I'm just gonna admit to this,
00:28:45.380 | people treat me like I'm crazy
00:28:47.100 | for caring about the world after I'm dead.
00:28:49.220 | Or wanting to be remembered after you're gone.
00:28:53.460 | Like, well, what does it matter to you?
00:28:54.480 | You're gone.
00:28:55.320 | It's this deeply sort of secular,
00:28:57.060 | somatic perspective on everything.
00:28:59.860 | Where we don't, you know that phrase in As Time Goes By?
00:29:04.700 | It says, it's still the same old story,
00:29:07.500 | a fight for love and glory, a case of do or die.
00:29:13.220 | I don't think people imagined then
00:29:15.840 | that there wouldn't be a story
00:29:18.640 | about fighting for love and glory.
00:29:20.320 | And like, we are so out of practice
00:29:23.620 | about fighting rivals for love
00:29:26.660 | and fighting for glory in something bigger than yourself.
00:29:31.660 | - But the hunger is there.
00:29:36.080 | - Well, that was the point then, right?
00:29:37.680 | The whole idea is that Rick was,
00:29:39.660 | you know, he was like Han Solo of his time.
00:29:41.820 | He's just like, I stick my neck out for nobody.
00:29:44.660 | You know, it's like, oh, come on, Rick.
00:29:46.360 | You're just pretending.
00:29:47.340 | You actually have a big soul, right?
00:29:49.420 | And so at some level, that's the question.
00:29:51.900 | Do we have a big soul or is it just all bullshit?
00:29:54.100 | - See, I think there's huge Manhattan Project style projects
00:29:59.100 | whether you're talking about physical infrastructure
00:30:01.340 | or going to Mars, you know, the SpaceX, NASA efforts
00:30:06.340 | or huge, huge scientific efforts.
00:30:10.100 | - Well, we need to get back into the institutions
00:30:12.100 | and we need to remove the weak leadership,
00:30:13.980 | that we have weak leaders
00:30:15.140 | and the weak leaders need to be removed
00:30:16.940 | and they need to seat people more dangerous
00:30:19.380 | than the people who are currently sitting
00:30:21.000 | in a lot of those chairs.
00:30:22.500 | - Or build new institutions.
00:30:24.380 | - Good luck.
00:30:25.820 | - Well, so one of the nice things from the internet
00:30:30.820 | is, for example, somebody like you can have a bigger voice
00:30:35.500 | than almost anybody at the particular institutions
00:30:39.020 | we're talking about.
00:30:39.940 | - That's true.
00:30:40.900 | But the thing is, I might say something.
00:30:43.580 | You can count on the fact that the provost at Princeton
00:30:47.260 | isn't gonna say anything.
00:30:48.500 | - What do you mean?
00:30:50.460 | Too afraid?
00:30:51.540 | - Well, if that person were to give an interview,
00:30:54.180 | how are things going in research at Princeton?
00:30:58.140 | Well, I'm hesitant to say it,
00:30:59.940 | but they're perhaps as good as they've ever been
00:31:02.220 | and I think they're gonna get better.
00:31:03.720 | Oh, is that right?
00:31:04.780 | All fields?
00:31:06.220 | Yep, I don't see a weak one.
00:31:08.340 | It's just like, okay, great.
00:31:10.060 | Who are you and what are you even saying?
00:31:13.100 | We're just used to total nonsense 24/7.
00:31:17.500 | - Yeah.
00:31:18.460 | What do you think might be a beautiful thing
00:31:21.860 | that comes out of this?
00:31:23.160 | Is there a hope, like a little inkling,
00:31:27.500 | a little fire of hope you have about our time right now?
00:31:30.380 | - Yeah, I think one thing is coming to understand
00:31:34.280 | that the freaks, weirdos, mutants, and other ne'er-do-wells,
00:31:39.280 | sometimes referred to as grifters, I like that one,
00:31:43.420 | grifters and gadflies were very often
00:31:48.420 | the earliest people on the coronavirus.
00:31:51.560 | That's a really interesting question.
00:31:52.820 | Why was that?
00:31:54.100 | And it seems to be that they had already paid
00:31:58.180 | such a social price that they weren't going to be beaten up
00:32:03.100 | by being told that, oh my God, you're xenophobic.
00:32:07.860 | You just hate China.
00:32:09.000 | Or, wow, you sound like a conspiracy theorist.
00:32:12.880 | So if you'd already paid those prices,
00:32:16.020 | you were free to think about this.
00:32:17.300 | And everyone in an institutional framework was terrified
00:32:21.020 | that they didn't want to be seen as the alarmist,
00:32:24.580 | the chicken little.
00:32:28.020 | And so that's why you have this confidence
00:32:30.060 | where de Blasio says, get on with your lives,
00:32:34.380 | get back in there and celebrate Chinese New Year
00:32:37.020 | in Chinatown, despite coronavirus.
00:32:39.780 | It's like, okay, really?
00:32:41.060 | So you just always thought everything
00:32:43.260 | would automatically be okay if you adapted,
00:32:46.780 | sorry, if you adopted that posture.
00:32:49.020 | - So you think this time reveals the weakness
00:32:52.300 | of our institutions and reveals the strength
00:32:55.020 | of our gadflies and the weirdos and the--
00:32:58.100 | - No, not necessarily the strength,
00:32:59.540 | but the value of freedom.
00:33:02.260 | Like a different way of saying it would be,
00:33:04.540 | wow, even your gadflies and your grifters
00:33:06.860 | were able to beat your institutional folks
00:33:09.140 | because your institutional folks were playing
00:33:11.180 | with a giant mental handicap.
00:33:13.820 | So just imagine like we were in the story
00:33:15.820 | of Harrison Bergeron by Vonnegut.
00:33:18.620 | And our smartest people were all subjected
00:33:21.500 | to distracting noises every seven seconds.
00:33:26.240 | Well, they would be functionally much dumber
00:33:29.880 | because they couldn't continue a thought
00:33:31.840 | through all the disturbance.
00:33:33.600 | So in some sense, that's a little bit
00:33:35.280 | like what belonging to an institution is,
00:33:37.240 | is that if you have to make a public statement,
00:33:39.120 | of course the Surgeon General is gonna be the worst.
00:33:42.360 | 'Cause they're just playing with too much of a handicap.
00:33:44.440 | There are too many institutional players
00:33:46.000 | who are like, don't screw us up.
00:33:48.200 | And so the person has to say something wrong.
00:33:50.080 | We're gonna back propagate a falsehood.
00:33:53.520 | And this is very interesting.
00:33:54.480 | Some of my socially oriented friends say,
00:33:57.360 | Eric, I don't understand what you're on about.
00:33:59.000 | Of course masks work, but you know
00:34:00.360 | what they're trying to do?
00:34:01.560 | They're trying to get us not to buy up the masks
00:34:03.440 | for the doctors.
00:34:05.160 | And I think, okay, so you imagine
00:34:06.880 | that we can just create scientific fiction at will
00:34:10.200 | so that you can run whatever social program you want.
00:34:13.400 | This is what I, you know, my point is get out of my lab.
00:34:15.560 | Get out of the lab.
00:34:16.560 | You don't belong in the lab.
00:34:17.640 | You're not meant for the lab.
00:34:19.200 | You're constitutionally incapable of being around the lab.
00:34:22.140 | You need to leave the lab.
00:34:23.920 | - You think the CDC and WHO knew that masks work
00:34:27.800 | and were trying to sort of imagine
00:34:31.960 | that people are kind of stupid
00:34:33.640 | and they would buy masks in excess
00:34:37.800 | if they were told that masks work?
00:34:39.940 | Is that like, 'cause this does seem
00:34:43.200 | to be a particularly clear example of mistakes made.
00:34:47.960 | - You're asking me this question?
00:34:50.040 | - Yeah. - No, you're not.
00:34:51.400 | What do you think, Lex?
00:34:54.280 | - Well, I actually probably disagree with you a little bit.
00:34:56.520 | - Great, let's do it.
00:34:57.560 | - I think it's not so easy to be honest with the populace
00:35:02.280 | when the danger of panic is always around the corner.
00:35:08.040 | So I think the kind of honesty you exhibit
00:35:13.040 | appeals to a certain class of brave intellectual minds
00:35:20.800 | that it appeals to me, but I don't know,
00:35:24.420 | from the perspective of WHO, I don't know
00:35:28.360 | if it's so obvious that they should be honest
00:35:33.360 | 100% of the time with people.
00:35:37.440 | - I'm not saying you should be perfectly transparent
00:35:39.760 | and 100% honest.
00:35:41.060 | I'm saying that the quality of your lies
00:35:43.040 | has to be very high and it has to be public spirited.
00:35:46.200 | There's a big difference between,
00:35:48.280 | so I'm not a child about this.
00:35:50.880 | I'm not saying that when you're at war, for example,
00:35:53.000 | you turn over all of your plans to the enemy
00:35:55.680 | because it's very important that you're transparent
00:35:58.200 | with 360 degree visibility, far from it.
00:36:01.160 | What I'm saying is something has been forgotten
00:36:05.080 | and I forgot who it was who told it to me,
00:36:06.880 | but it was a fellow graduate student
00:36:08.500 | in the Harvard math department and he said,
00:36:12.960 | you know, I learned one thing being out in the workforce
00:36:15.640 | because he was one of the few people
00:36:16.680 | who had had a work life in the department as a grad student.
00:36:20.920 | He said, you can be friends with your boss,
00:36:24.440 | but if you're going to be friends with your boss,
00:36:26.120 | you have to be doing a good job at work.
00:36:28.120 | And there's an analog here, which is,
00:36:33.140 | if you're going to be reasonably honest with the population,
00:36:36.800 | you have to be doing a good job at work
00:36:38.460 | as the Surgeon General or as the head of the CDC.
00:36:41.480 | So if you're doing a terrible job,
00:36:44.820 | you're supposed to resign.
00:36:46.800 | And then the next person is supposed to say,
00:36:49.760 | look, I'm not going to lie to you.
00:36:51.520 | I inherited the situation.
00:36:53.280 | It was in a bit of disarray,
00:36:54.720 | but I had several requirements before I agreed to step in
00:36:58.400 | and take the job because I needed to know
00:36:59.960 | I could turn it around.
00:37:00.780 | I needed to know that I had clear lines of authority.
00:37:02.980 | I needed to know that I had the resources available
00:37:05.280 | in order to rectify the problem.
00:37:06.680 | And I needed to know that I had the ability
00:37:08.440 | and the freedom to level with the American people directly
00:37:10.600 | as I saw fit.
00:37:11.520 | All of my wishes were granted
00:37:13.160 | and that's why I'm happy here.
00:37:14.640 | On Monday morning, I've got my sleeves rolled up.
00:37:17.580 | Boy, do we got a lot to do.
00:37:18.900 | So please come back in two weeks
00:37:20.360 | and then ask me how I'm doing then.
00:37:21.660 | And I hope to have something to show you.
00:37:23.620 | That's how you do it.
00:37:25.020 | - So why is that excellence and basic competence missing?
00:37:30.020 | - The big net.
00:37:32.240 | You see, you come from multiple traditions
00:37:34.760 | where it was very important to remember things.
00:37:38.360 | The Soviet tradition made sure
00:37:40.320 | that you remembered the sacrifices that came in that war.
00:37:44.120 | And the Jewish tradition,
00:37:46.300 | we're doing this on Passover, right?
00:37:48.680 | Okay, well, every year we tell one simple story.
00:37:52.880 | Well, why can't it be different every year?
00:37:54.480 | Maybe we could have a rotating series of seven stories
00:37:57.520 | because it's the one story that you need.
00:38:00.580 | It's like, you work with the Men in Black group, right?
00:38:03.240 | And it's the last suit that you'll ever need.
00:38:05.100 | This is the last story that you ever need.
00:38:06.920 | Don't think I fell for your neuralyzer last time.
00:38:11.960 | In any event, we tell one story
00:38:14.320 | because it's the get out of Dodge story.
00:38:16.100 | There's a time when you need to not wait
00:38:18.040 | for the bread to rise.
00:38:19.760 | And that's the thing, which is,
00:38:21.280 | even if you live through a great nap,
00:38:24.880 | you deserve to know what it feels like
00:38:27.720 | to have to leave everything that has become comfortable
00:38:30.600 | and unworkable.
00:38:32.260 | - It's sad that you need that tragedy,
00:38:37.880 | I imagine, to have the tradition of remembering.
00:38:42.620 | - It's sad to think that because things have been nice
00:38:47.620 | and comfortable means that we can't have
00:38:51.000 | great, competent leaders,
00:38:52.600 | which is kind of the implied statement.
00:38:55.660 | Like, can we have great leaders who take big risks,
00:39:00.460 | who inspire hard work, who deal with difficult truth,
00:39:05.460 | even though things have been comfortable?
00:39:09.180 | Well, we know what those people sound like.
00:39:11.900 | I mean, if, for example, Jocko Willink
00:39:14.940 | suddenly threw his hat into the ring,
00:39:16.740 | everyone would say, okay, right, party's over.
00:39:23.380 | It's time to get up at 4:30 and really work hard
00:39:26.660 | and we've got to get back into fighting shape.
00:39:28.920 | - Yeah, but Jocko's a very special,
00:39:33.320 | I think that whole group of people,
00:39:38.820 | by profession, put themselves into hardship
00:39:42.460 | on a daily basis.
00:39:44.220 | And he's not, well, I don't know,
00:39:47.160 | but he's probably not going to be,
00:39:49.020 | well, could Jocko be president?
00:39:52.700 | - Okay, but it doesn't have to be Jocko, right?
00:39:54.340 | Like, in other words, if it was Kai Lenny
00:39:56.260 | or if it was Alex Honnold from rock climbing,
00:40:01.260 | they're just serious people.
00:40:04.460 | They're serious people who can't afford your BS.
00:40:09.460 | - Yeah, but why do we have serious people
00:40:12.540 | that do rock climbing and don't have serious people
00:40:17.540 | who lead the nation?
00:40:19.080 | That seems to-- - Because those
00:40:22.340 | skills needed in rock climbing
00:40:24.540 | are not good during the big nap.
00:40:28.620 | And at the tail end of the big nap,
00:40:30.040 | they would get you fired.
00:40:31.240 | - But I don't, don't you think there's a,
00:40:33.620 | the fundamental part of human nature
00:40:35.380 | that desires to excel, to be exceptionally good at your job?
00:40:39.860 | - Yeah, but what is your job?
00:40:42.220 | I mean, in other words, my point to you is,
00:40:44.740 | if you're a general in a peacetime army
00:40:47.820 | and your major activity is playing war games,
00:40:50.260 | what if the skills needed to win war games
00:40:54.180 | are very different than the skills needed to win wars
00:40:56.660 | because you know how the war games are scored
00:40:58.620 | and you've done Moneyball, for example, with war games.
00:41:02.780 | You figured out how to win games on paper.
00:41:05.200 | So then the advancement skill becomes divergent
00:41:08.660 | from the ultimate skill that it was proxying for.
00:41:13.660 | - Yeah, but you create, we're good as human beings to,
00:41:17.780 | I mean, at least me, I can't do a big nap.
00:41:21.460 | So at any one moment when I finish something,
00:41:24.220 | a new dream pops up.
00:41:25.740 | So going to Mars, going-- - What do you like to do?
00:41:28.460 | You like to do Brazilian jiu-jitsu?
00:41:30.580 | - Well, first of all, I like to do everything.
00:41:31.980 | You like to play guitar? - Guitar.
00:41:34.380 | - You do this podcast, you do theory.
00:41:36.340 | You're constantly taking risks and exposing yourself, right?
00:41:41.940 | Because you got one of those crazy, I'm sorry to say it,
00:41:44.940 | you got an Eastern European Jewish personality,
00:41:47.360 | which I'm still tied to,
00:41:48.660 | and I'm a couple generations more distant than you are.
00:41:52.820 | And I've held on to that thing because it's valuable to me.
00:41:56.380 | - You don't think there's a huge percent of the populace,
00:41:59.500 | even in the United States, that's that.
00:42:02.140 | It might be a little bit dormant, but--
00:42:04.140 | - Do you know Anna Khachiyan from the Red Scare podcast?
00:42:07.620 | - Did you interview her? - Yeah.
00:42:09.100 | - Yeah, yeah, yeah, I listened, yeah, yeah, she was great.
00:42:11.060 | - She was great, right? - Yeah, she's fun.
00:42:12.860 | - She's terrific, but she also has the same thing going on.
00:42:15.900 | And I made a joke in the liner notes for that episode,
00:42:19.020 | which is somewhere on the road from Stalingrad
00:42:22.840 | to Forever 21, something was lost.
00:42:25.100 | Like how can Stalingrad and Forever 21
00:42:27.160 | be in the same sentence?
00:42:29.180 | And in part, it's that weird thing.
00:42:32.060 | It's like trying to remember.
00:42:33.620 | Even words, like in Russian and Hebrew,
00:42:36.260 | things like, it's like, but (speaking in foreign language)
00:42:39.620 | You know, these words have much more potency about memory.
00:42:42.940 | And I don't know.
00:42:46.580 | I do, I think there's still a dormant populace
00:42:50.980 | that craves leaders on a small scale, on large scale.
00:42:55.180 | And I hope to be that leader on a small scale.
00:42:58.860 | And I think you, sir, have a role to be a leader.
00:43:03.460 | - You kids go ahead without me.
00:43:06.060 | I'm just gonna, I'm gonna do a little bit
00:43:07.900 | of weird podcasting.
00:43:09.020 | - See, now you're putting on your Joe Rogan hat.
00:43:14.100 | He says I'm just a comedian.
00:43:15.460 | - Oh, no, I'm not. - And you say I'm just a--
00:43:17.060 | - No, it's not that.
00:43:17.900 | If I say I wanna lead too much because of the big nap,
00:43:21.860 | there's like a group, a chorus of automated idiots.
00:43:25.140 | And their first thought is like, ah, I knew it!
00:43:27.580 | This was a power grab all along.
00:43:29.500 | Why should you lead?
00:43:30.580 | You know, just like.
00:43:31.780 | And so the idea is you're just trying to skirt around,
00:43:34.420 | not stepping on all of the idiot landmines.
00:43:37.500 | It's like, okay, so now I'm gonna hear that
00:43:39.540 | in my inbox for the next three days.
00:43:41.500 | - Okay, so lead by example, just live--
00:43:43.660 | - No, I mean-- - On a large platform.
00:43:45.380 | - Look, we should take over the institutions.
00:43:47.140 | There are institutions.
00:43:48.340 | We've got bad leadership.
00:43:49.540 | We should mutiny.
00:43:50.660 | And we should inject a, I don't know, 15%, 20%
00:43:56.300 | disagreeable, dissident, very aggressive,
00:43:58.740 | loner individual, mutant freaks,
00:44:01.180 | all the people that you go to see Avengers movies about
00:44:03.580 | or the X-Men or whatever it is,
00:44:05.220 | and stop pretending that everything good
00:44:07.700 | comes out of some great, giant, inclusive,
00:44:10.300 | communal 12-hour meeting.
00:44:14.100 | It's like, stop it!
00:44:16.780 | That's not how shit happens.
00:44:18.360 | - You recently published a video of a lecture
00:44:23.300 | you gave at Oxford presenting some aspects
00:44:25.780 | of a theory of everything called geometric unity.
00:44:30.780 | So this was a work of 30 plus years.
00:44:35.320 | This is life's work.
00:44:36.640 | Let me ask the silly old question.
00:44:40.780 | How do you feel as a human?
00:44:42.860 | Excited, scared, the experience of posting it.
00:44:47.020 | - You know, it's funny.
00:44:48.840 | One of the things that you learn to feel as an academic
00:44:52.980 | is that the great sin you can commit in academics
00:44:57.420 | is to show yourself to be a non-serious person,
00:45:01.260 | to show yourself to have delusions,
00:45:03.240 | to avoid the standard practices
00:45:08.500 | which everyone is signed up for.
00:45:11.040 | And it's weird because you know
00:45:17.820 | that those people are gonna be angry.
00:45:19.580 | He did what?
00:45:21.800 | Why would he do that?
00:45:23.320 | - And what we're referring to, for example,
00:45:26.600 | there's traditions of sort of publishing incrementally,
00:45:30.000 | certainly not trying to have a theory of everything,
00:45:32.700 | perhaps working within the academic departments,
00:45:37.480 | all those things.
00:45:38.320 | - That's true.
00:45:39.880 | - And so you're going outside of all of that.
00:45:42.480 | - Well, I mean, I was going inside of all of that.
00:45:45.760 | And we did not come to terms when I was inside.
00:45:49.880 | And what they did was so outside to me,
00:45:52.440 | was so weird, so freakish.
00:45:54.760 | Like the most senior respectable people
00:45:57.400 | at the most senior respectable places
00:45:59.640 | were functionally insane as far as I could tell.
00:46:02.940 | And again, it's like being functionally stupid
00:46:05.140 | if you're the head of the CDC or something
00:46:08.360 | where you're giving recommendations out
00:46:10.480 | that aren't based on what you actually believe,
00:46:12.400 | they're based on what you think you have to be doing.
00:46:14.900 | Well, in some sense, I think that that's a lot
00:46:17.480 | of how I saw the math and physics world
00:46:19.520 | as the physics world was really crazy
00:46:23.280 | and the math world was considerably less crazy,
00:46:25.720 | just very strict and kind of dogmatic.
00:46:28.600 | - Well, we'll psychoanalyze those folks,
00:46:30.720 | but I really wanna maybe linger on it a little bit longer
00:46:35.240 | of how you feel 'cause this is such a special moment
00:46:39.080 | in your life.
00:46:39.920 | - Well, I really appreciate it.
00:46:40.740 | It's a great question.
00:46:41.580 | So if we can pair off some of those other issues.
00:46:48.700 | It's new being able to say what the Observerse is,
00:46:53.700 | which is my attempt to replace space time
00:46:57.120 | with something that is both closely related
00:46:59.060 | to space time and not space time.
00:47:01.240 | So I used to carry the number 14
00:47:04.460 | as a closely guarded secret in my life
00:47:06.760 | where 14 is really four dimensions of space and time
00:47:11.760 | plus 10 extra dimensions of rulers and protractors
00:47:15.820 | for the cool kids out there, symmetric two tensors.
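[Editor's note: the arithmetic behind that 14 — this gloss is not spelled out in the conversation, but it is the standard dimension count if the "rulers and protractors" are taken to be symmetric 2-tensors (candidate metrics) on a 4-manifold:]

```latex
% A metric on a 4-manifold X is a symmetric 2-tensor,
% and at each point x the space of symmetric 2-tensors has
\dim \operatorname{Sym}^2(T_x^{*}X) \;=\; \binom{4+1}{2} \;=\; \frac{4 \cdot 5}{2} \;=\; 10 ,
% so four dimensions of spacetime plus the ten-dimensional
% space of rulers and protractors gives
4 + 10 \;=\; 14 \ \text{dimensions.}
```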
00:47:20.820 | - So you had a geometric, complicated,
00:47:24.500 | beautiful geometric view of the world
00:47:26.260 | that you carried with you for a long time.
00:47:27.980 | - Yeah.
00:47:28.820 | - Did you have friends that you, colleagues that you--
00:47:32.700 | - Essentially, no.
00:47:33.860 | - Talked?
00:47:34.700 | - No, in fact, some of these stories are me coming out
00:47:38.860 | to my friends and I use the phrase coming out
00:47:43.380 | because I think that gays have monopolized
00:47:46.100 | the concept of a closet.
00:47:47.860 | Many of us are in closets having nothing to do
00:47:50.220 | with our sexual orientation.
00:47:51.620 | Yeah, I didn't really feel comfortable
00:47:55.400 | talking to almost anyone.
00:47:56.620 | So this was a closely guarded secret
00:48:00.320 | and I think that I let on in some ways
00:48:02.100 | that I was up to something and probably,
00:48:04.840 | but it was a very weird life.
00:48:06.580 | So I had to have a series of things
00:48:08.460 | that I pretended to care about
00:48:10.140 | so that I could use that as the stalking horse
00:48:12.860 | for what I really cared about.
00:48:13.900 | And to your point, I never understood this whole thing
00:48:17.200 | about theories of everything.
00:48:19.100 | Like if you were gonna go into something
00:48:20.820 | like theoretical physics,
00:48:22.820 | isn't that what you would normally pursue?
00:48:25.460 | Like wouldn't it be crazy to do something that difficult
00:48:28.000 | and that poorly paid if you were gonna try
00:48:30.820 | to do something other than figure out
00:48:32.360 | what this is all about?
00:48:33.500 | - Now I have to reveal my cards,
00:48:36.100 | my sort of weaknesses and lack of understanding
00:48:39.620 | of the music of physics and math departments.
00:48:42.780 | But there's an analogy here to artificial intelligence
00:48:45.900 | and often folks come in and say,
00:48:50.540 | okay, so there's a giant department working on
00:48:53.420 | quote unquote artificial intelligence,
00:48:55.920 | but why is nobody actually working on intelligence?
00:49:00.580 | Like you're all just building little toys.
00:49:04.600 | You're not actually trying to understand
00:49:06.500 | and that breaks a lot of people.
00:49:08.100 | It confuses them 'cause like,
00:49:11.980 | okay, so I'm at MIT, I'm at Stanford,
00:49:15.060 | I'm at Harvard, I'm here, I dreamed of being,
00:49:17.640 | working on artificial intelligence.
00:49:20.060 | Why is everybody not actually working on intelligence?
00:49:23.260 | And I have the same kind of sense
00:49:25.740 | that that's what working on the theory of everything is.
00:49:28.720 | That strangely you somehow become an outcast for even--
00:49:33.100 | - But we know why this is, right?
00:49:34.740 | - Why?
00:49:36.780 | - Well, it's because, let's take the artificial,
00:49:38.500 | let's play with AGI for example.
00:49:40.180 | - Yeah.
00:49:41.260 | - I think that the idea starts off
00:49:43.500 | with nobody really knows how to work on that.
00:49:45.820 | And so if we don't know how to work on it,
00:49:48.220 | we choose instead to work on a program
00:49:50.660 | that is tangentially related to it.
00:49:52.940 | So we do a component of a program
00:49:55.220 | that is related to that big question
00:49:57.700 | because it's felt like at least I can make progress there.
00:50:00.860 | And that wasn't where I was.
00:50:04.020 | Where I was in, it's funny,
00:50:06.900 | there was this book called Freed-Uhlenbeck
00:50:09.460 | and it had this weird mysterious line
00:50:11.620 | in the beginning of it.
00:50:13.340 | And I tried to get clarification
00:50:15.600 | of this weird mysterious line
00:50:17.060 | and everyone said wrong things.
00:50:19.580 | And then I said, okay, well,
00:50:20.480 | so I can tell that nobody's thinking properly
00:50:23.060 | because I just asked the entire department
00:50:26.020 | and nobody has a correct interpretation of this.
00:50:29.320 | And so it's a little bit like you see a crime scene photo
00:50:33.900 | and you have a different idea.
00:50:36.040 | Like there's a smoking gun and you figure,
00:50:39.420 | that's actually a cigarette lighter.
00:50:39.420 | I don't really believe that.
00:50:40.600 | And then there's like a pack of cards
00:50:42.940 | and you think, oh, that looks like the blunt instrument
00:50:45.140 | that the person was beaten with.
00:50:47.420 | So you have a very different idea about how things go.
00:50:50.580 | And very quickly you realize
00:50:51.780 | that there's no one thinking about that.
00:50:54.200 | - There's a few human sides to this and technical sides,
00:50:58.500 | both of which I'd love to try to get down to.
00:51:01.780 | So the human side, I can tell from my perspective,
00:51:04.820 | I think it was before April 1st, April Fool's,
00:51:08.340 | maybe the day before, I forget.
00:51:10.040 | But I was laying in bed in the middle of the night
00:51:12.580 | and somehow it popped up on my feed somewhere
00:51:17.580 | that your beautiful face is speaking live.
00:51:23.300 | And I clicked and it's kind of weird
00:51:27.540 | how the universe just brings things together
00:51:29.460 | in this kind of way.
00:51:30.900 | And all of a sudden I realized
00:51:32.860 | that there's something big happening
00:51:34.500 | at this particular moment.
00:51:35.580 | It's strange, on a day like any day.
00:51:39.420 | And all of a sudden you were thinking of,
00:51:43.380 | you had this somber tone, like you were serious,
00:51:46.660 | like you were going through some difficult decision.
00:51:49.900 | And it seems strange.
00:51:54.020 | I almost thought you were maybe joking,
00:51:56.120 | but there was a serious decision being made.
00:51:58.100 | And it was a wonderful experience to go through with you.
00:52:00.700 | - I really appreciate it.
00:52:01.540 | I mean, it was April 1st.
00:52:03.040 | - Yeah, it's kind of fascinating.
00:52:04.500 | I mean, just the whole experience.
00:52:06.200 | So I want to ask, I mean, thank you for letting me be part
00:52:12.960 | of that kind of journey of decision-making
00:52:15.460 | that took 30 years.
00:52:17.140 | But why now?
00:52:19.740 | Why did you think, why did you struggle so long
00:52:22.900 | not to release it and decide to release it now?
00:52:27.020 | While the whole world is on lockdown, on April fools,
00:52:32.500 | is it just because you like the comedy of absurd ways
00:52:37.500 | that the universe comes together?
00:52:39.580 | - I don't think so.
00:52:41.020 | I think that the COVID epidemic is the end of the big nap.
00:52:45.540 | And I think that I actually tried this seven years earlier
00:52:51.540 | in Oxford.
00:52:52.420 | And it was too early.
00:52:55.840 | - Which part was too, is it the platform?
00:52:59.340 | 'Cause your platform is quite different now, actually.
00:53:01.260 | The internet, I remember you,
00:53:02.760 | I read several of your brilliant answers
00:53:06.020 | that people should read for the Edge questions.
00:53:07.820 | One of them was related to the internet.
00:53:10.420 | - It was the first one.
00:53:11.860 | - Was it the first one?
00:53:12.700 | - An essay called Go Virtual, Young Man.
00:53:14.900 | - Yeah, yeah, that's like forever ago now.
00:53:18.420 | - Well, that was 10 years ago,
00:53:19.620 | and that's exactly what I did,
00:53:21.020 | is I decamped to the internet,
00:53:22.620 | which is where the portal lives.
00:53:23.940 | The portal, the portal, the portal.
00:53:25.740 | (laughing)
00:53:27.220 | - Well, let's start the whole, the theme,
00:53:29.100 | the ominous theme music, which you just listen to forever.
00:53:33.260 | - I actually started recording tiny guitar licks
00:53:36.540 | for the audio portion, not for the video portion.
00:53:40.140 | You kind of inspire me with bringing your guitar
00:53:43.660 | into the story, but keep going.
00:53:45.860 | - So you thought, so the Oxford was like step one,
00:53:48.500 | and you kind of, you put your foot into the water
00:53:52.100 | to sample it, but it was too cold at the time,
00:53:55.380 | so you didn't want to step in all the--
00:53:56.220 | - I was just really disappointed.
00:53:58.100 | - What was disappointing about that experience?
00:54:00.340 | - It's a hard thing to talk about.
00:54:01.720 | It has to do with the fact that,
00:54:03.580 | and I can see this mirrors a disappointment within myself.
00:54:09.860 | There are two separate issues.
00:54:11.260 | One is the issue of making sure that the idea
00:54:15.180 | is actually heard and explored,
00:54:17.540 | and the other is the question about,
00:54:20.500 | will I become disconnected from my work
00:54:24.060 | because it will be ridiculed,
00:54:25.860 | it will be immediately improved,
00:54:28.100 | it will be found to be derivative of something
00:54:30.100 | that occurred in some paper in 1957.
00:54:32.380 | When the community does not want you to gain a voice,
00:54:35.680 | it's a little bit like a policeman deciding
00:54:39.320 | to weirdly enforce all of these little-known regulations
00:54:44.320 | against you and sometimes nobody else,
00:54:48.500 | and I think that's kind of this weird thing
00:54:53.060 | where I just don't believe that we can reach
00:54:58.020 | the final theory necessarily
00:55:00.780 | within the political economy of academics.
00:55:03.860 | So if you think about how academics are tortured
00:55:06.060 | by each other and how they're paid
00:55:09.300 | and where they have freedom and where they don't,
00:55:11.780 | I actually weirdly think that that system
00:55:13.820 | of selective pressures is going to eliminate anybody
00:55:16.460 | who's going to make real progress.
00:55:18.460 | - So that's interesting.
00:55:19.300 | So if you look at the story of Andrew Wiles, for example,
00:55:22.460 | with the Fermat's Last Theorem,
00:55:25.860 | I mean, he, as far as I understand,
00:55:28.260 | he pretty much isolated himself
00:55:30.420 | from the world of academics in terms of the bulk
00:55:33.700 | of the work he did.
00:55:35.180 | And from my perspective, it's dramatic and fun to read about
00:55:39.960 | but it seemed exceptionally stressful
00:55:41.540 | the first steps he took when actually
00:55:44.940 | making the work public.
00:55:46.140 | That seemed, to me, it would be hell.
00:55:48.700 | - Yeah, but it's like so artificially dramatic.
00:55:51.560 | You know, he leads up to it at a series of lectures.
00:55:55.580 | He doesn't want to say it.
00:55:56.980 | And then he finally says it at the end
00:55:59.840 | because obviously this comes out of a body of work where,
00:56:02.660 | I mean, the funny part about Fermat's Last Theorem
00:56:05.220 | is that it wasn't originally thought to be a deep
00:56:07.420 | and meaningful problem.
00:56:09.180 | It was just an easy to state one that had gone unsolved.
00:56:12.420 | But if you think about it,
00:56:14.060 | it became attached to the body of regular theory.
00:56:17.580 | So he built up this body of regular theory,
00:56:20.060 | gets all the way up to the end,
00:56:21.260 | announces, and then there's this whole drama about,
00:56:25.540 | okay, somebody's checking the proof.
00:56:26.940 | I don't understand what's going on in line 37.
00:56:29.340 | Oh, is it serious?
00:56:31.660 | It seems a little bit more serious than we knew.
00:56:33.580 | - I mean, do you see parallels?
00:56:34.960 | Do you share the concern that your experience
00:56:37.060 | might be something similar?
00:56:38.220 | - Well, in his case, I think that if I recall correctly,
00:56:40.980 | his original proof was unsalvageable.
00:56:42.820 | He actually came up with a second proof
00:56:46.020 | with a colleague, Richard Taylor.
00:56:50.880 | And it was that second proof which carried the day.
00:56:53.820 | So it was a little bit that he got put
00:56:55.360 | under incredible pressure and then had to succeed
00:56:59.100 | in a new way having failed the first time,
00:57:00.940 | which is like even a weirder and stranger story.
00:57:03.540 | - That's an incredible story in some sense.
00:57:05.540 | But I mean, I'm trying to get a sense
00:57:08.580 | of the kind of stress you're under.
00:57:09.420 | - I think that this is, okay, but I'm rejecting.
00:57:12.500 | What I don't think people understand with me
00:57:15.100 | is the scale of the critique.
00:57:18.500 | It's like I don't, people say,
00:57:21.400 | "Well, you must implicitly agree with this
00:57:23.040 | "and implicitly agree."
00:57:23.880 | And it's like, "No, try me.
00:57:25.740 | "Ask before you decide that I am mostly in agreement
00:57:29.320 | "with the community about how these things
00:57:31.060 | "should be handled or what these things mean."
00:57:34.240 | - Can you elaborate?
00:57:35.080 | And also just why does criticism matter so much here?
00:57:40.080 | So you seem to dislike the burden of criticism
00:57:46.760 | that it will choke away all--
00:57:49.300 | - There's different kinds of criticism.
00:57:51.380 | There's constructive criticism
00:57:52.960 | and there's destructive criticism.
00:57:54.860 | And what I don't like is I don't like a community
00:57:58.740 | that can't, first of all,
00:58:02.180 | like if you take the physics community,
00:58:04.700 | just the way we screwed up on masks and PPE,
00:58:07.720 | just the way we screwed up in the financial crisis
00:58:10.600 | and mortgage-backed securities,
00:58:11.900 | we screwed up on string theory.
00:58:13.880 | - Can we just forget the string theory happened
00:58:15.780 | or? (laughs)
00:58:17.180 | - Sure, but then somebody should say that, right?
00:58:19.460 | Somebody should say, "You know, it didn't work out."
00:58:22.220 | - Yeah.
00:58:23.060 | - But okay, but you're asking this,
00:58:25.860 | like why do you guys get to keep the prestige
00:58:28.380 | after failing for 35 years?
00:58:31.060 | That's an interesting question.
00:58:31.900 | - Who is the you guys?
00:58:32.940 | Because to me--
00:58:33.780 | - Whoever the profession, look, these things,
00:58:36.620 | if there is a theory of everything to be had, right?
00:58:39.660 | It's going to be a relatively small group of people
00:58:42.300 | where this will be sorted out.
00:58:43.660 | - Absolutely.
00:58:44.660 | - It's not tens of thousands,
00:58:46.820 | it's probably hundreds at the top.
00:58:50.400 | - But within that community, there's the assholes.
00:58:55.400 | There's the, I mean, you always in this world
00:59:01.620 | have people who are kind, open-minded.
00:59:04.660 | - It's not a question about kind, it's a question about,
00:59:07.620 | okay, let's imagine, for example,
00:59:10.100 | that you have a story where you believe
00:59:13.500 | that ulcers are definitely caused by stress.
00:59:15.820 | And you've never questioned it,
00:59:18.740 | or maybe you felt like the Japanese came out of the blue
00:59:21.300 | and attacked us at Pearl Harbor, right?
00:59:23.860 | And now somebody introduces a new idea to you,
00:59:26.640 | which is like, what if it isn't stress at all?
00:59:29.060 | Or what if we actually tried to make resource-starved Japan
00:59:32.900 | attack us somewhere in the Pacific
00:59:34.380 | so we could have a casus belli to enter the Asian theater?
00:59:37.280 | And the person's original idea is like,
00:59:39.780 | what, what are you even saying?
00:59:41.940 | You know, it's like too crazy.
00:59:43.500 | Well, when Dirac in 1963
00:59:47.420 | talked about the importance of beauty
00:59:51.280 | as a guiding principle in physics,
00:59:53.040 | and he wasn't talking about the scientific method,
00:59:56.820 | that was crazy talk.
00:59:58.180 | But he was actually making a great point,
01:00:00.860 | and he was using Schrodinger,
01:00:02.100 | and I think Schrodinger was standing in for him,
01:00:05.260 | and he said that if your equations
01:00:06.860 | don't agree with experiment, that's kind of a minor detail.
01:00:10.580 | If they have true beauty in them,
01:00:12.460 | you should explore them because very often
01:00:14.740 | the agreement with experiment is an issue
01:00:18.260 | of fine-tuning of your model, of the instantiation.
01:00:21.340 | And so it doesn't really tell you that your model is wrong.
01:00:25.540 | And of course, Heisenberg told Dirac
01:00:27.300 | that his model was wrong because the proton and the electron
01:00:31.820 | should be the same mass
01:00:32.820 | if they are each other's antiparticles.
01:00:35.540 | And that was an irrelevant kind of silliness
01:00:39.620 | rather than a real threat to the Dirac theory.
01:00:43.060 | - But okay, so amidst all this silliness,
01:00:46.580 | I'm hoping that we could talk about
01:00:48.660 | the journey that geometric unity has taken and will take
01:00:52.700 | as an idea and an idea that will see the light.
01:00:56.020 | - Yeah.
01:00:57.180 | - So first of all, I'm thinking of writing a book
01:01:00.740 | called "Geometric Unity for Idiots."
01:01:03.340 | - Okay.
01:01:04.160 | - And I need you as a consultant, so can we--
01:01:06.580 | - First of all, I hope I have the trademark
01:01:08.280 | on geometric unity.
01:01:09.260 | - You do.
01:01:10.180 | - Good.
01:01:11.020 | - Can you give a basic introduction
01:01:13.820 | of the goals of geometric unity,
01:01:17.140 | the basic tools of mathematics,
01:01:20.300 | use the viewpoints in general for idiots like me?
01:01:24.820 | - Okay, great, fun.
01:01:26.300 | - So what's the goal of geometric unity?
01:01:28.820 | - The goal of geometric unity is to start with something
01:01:31.980 | so completely bland that you can simply say,
01:01:37.340 | well, that's something that begins the game
01:01:39.580 | is as close to a mathematical nothing as possible.
01:01:42.940 | In other words, I can't answer the question,
01:01:44.420 | why is there something rather than nothing?
01:01:46.140 | But if there has to be a something that we begin from,
01:01:49.180 | let it begin from something that's like a blank canvas.
01:01:52.500 | - Let's even more basic, so what is something?
01:01:56.820 | What are we trying to describe here?
01:01:58.060 | - Okay, right now we have a model of our world,
01:02:03.060 | and it's got two sectors.
01:02:05.580 | One of the sectors is called general relativity,
01:02:07.660 | the other is called the standard model.
01:02:09.820 | So we'll call it GR for general relativity
01:02:13.660 | and SM for standard model.
01:02:15.940 | - What's the difference between the two?
01:02:17.140 | What do the two describe?
01:02:19.300 | - So general relativity gives pride of place to gravity,
01:02:24.300 | and everything else is acting as a sort of a backup singer.
01:02:30.420 | - Gravity is the star of the show.
01:02:32.140 | - Gravity is the star of general relativity.
01:02:35.140 | And in the standard model,
01:02:38.300 | the other three non-gravitational forces,
01:02:41.500 | so if there are four forces that we know about,
01:02:43.300 | three of the four are non-gravitational,
01:02:45.740 | that's where they get to shine.
01:02:48.260 | - Great, so tiny little particles
01:02:50.740 | and how they interact with each other.
01:02:52.460 | - So photons, gluons,
01:02:54.540 | and so-called intermediate vector bosons.
01:02:57.920 | Those are the things that the standard model showcases,
01:03:00.940 | and general relativity showcases gravity.
01:03:03.820 | And then you have matter,
01:03:05.160 | which is accommodated in both theories,
01:03:08.700 | but much more beautifully inside of the standard model.
01:03:11.900 | - So what does a theory of everything do?
01:03:15.620 | - So first of all, I think that that's the first place
01:03:19.180 | where we haven't talked enough.
01:03:20.500 | We assume that we know what it means,
01:03:23.300 | but we don't actually have any idea what it means.
01:03:25.780 | And what I claim it is, is that it's a theory
01:03:28.300 | where the questions beyond that theory
01:03:32.100 | are no longer of a mathematical nature.
01:03:34.980 | In other words, if I say,
01:03:38.540 | let us take X to be a four-dimensional manifold.
01:03:43.540 | To a mathematician or physicist, I've said very little.
01:03:49.680 | I've simply said, there's some place for calculus
01:03:52.660 | and linear algebra to dance together and to play.
01:03:56.580 | And that's what manifolds are.
01:03:59.260 | They're the most natural place
01:04:00.620 | where our two greatest math theories can really intertwine.
01:04:05.060 | - Which are the two?
01:04:08.740 | Oh, you mean calculus and linear algebra, yep.
01:04:10.780 | - Right.
01:04:12.180 | Okay, now the question is beyond that.
01:04:15.100 | So it's sort of like saying,
01:04:16.620 | I'm an artist and I want to order a canvas.
01:04:18.840 | Now the question is, does the canvas paint itself?
01:04:24.640 | Does the canvas come up with an artist?
01:04:30.620 | And paint an ink, which then paint the canvas.
01:04:35.620 | Like that's the hard part about theories of everything,
01:04:39.220 | which I don't think people talk enough about.
01:04:41.620 | - Can we just, you bring up Escher
01:04:43.740 | and the hand that draws itself.
01:04:46.100 | - The fire that lights itself or drawing hands.
01:04:48.700 | - The drawing hands.
01:04:49.580 | - Yeah.
01:04:50.500 | - And every time I start to think about that,
01:04:52.940 | my mind like shuts down.
01:04:55.620 | - No, don't do that.
01:04:56.660 | - It, there's a spark.
01:04:59.080 | - No, but this is the most beautiful part.
01:05:00.500 | We should do this together.
01:05:01.340 | - No, it's beautiful, but this robot's brain sparks fly.
01:05:06.340 | So can we try to say the same thing over and over
01:05:11.300 | in different ways about what you mean
01:05:14.900 | by that having to be a thing we have to contend with?
01:05:17.620 | - Sure.
01:05:18.620 | - Like why do you think that creating a theory
01:05:22.060 | of everything, as you call the source code,
01:05:25.060 | our understanding our source code require a view
01:05:29.500 | like the hand that draws itself?
01:05:31.000 | - Okay, well, here's what goes on
01:05:32.360 | in the regular physics picture.
01:05:35.160 | We've got these two main theories,
01:05:36.660 | general relativity and the standard model, right?
01:05:39.060 | Think of general relativity as more or less
01:05:44.800 | the theory of the canvas, okay?
01:05:48.880 | Maybe you have the canvas in a particularly rigid shape,
01:05:52.740 | maybe you've measured it, so it's got length
01:05:54.480 | and it's got an angle, but more or less,
01:05:56.160 | it's just canvas and length and angle,
01:05:58.720 | and that's all that really general relativity is,
01:06:02.560 | but it allows the canvas to warp a bit.
01:06:04.880 | Then we have the second thing,
01:06:08.580 | which is this import of foreign libraries,
01:06:13.580 | which aren't tied to space and time.
01:06:18.680 | So we've got this crazy set of symmetries
01:06:20.960 | called SU3 cross SU2 cross U1.
01:06:24.240 | We've got this collection of 16 particles in a generation,
01:06:27.560 | which are these sort of twisted spinors,
01:06:29.640 | and we've got three copies of them.
01:06:32.840 | Then we've got this weird Higgs field that comes in
01:06:35.240 | and like deus ex machina, solves all the problems
01:06:38.360 | that have been created in the play
01:06:40.360 | that can't be resolved otherwise.
01:06:42.000 | - So that's the standard model,
01:06:43.200 | quantum field theory just plopped on top of this canvas.
01:06:45.880 | - Yeah, it's a problem of the double origin story.
01:06:48.840 | One origin story is about space and time,
01:06:51.560 | the other origin story is about
01:06:53.160 | what we would call internal quantum numbers
01:06:56.040 | and internal symmetries.
01:06:58.000 | And then there was an attempt to get one to follow
01:07:01.720 | from the other called Calusa-Klein theory,
01:07:03.700 | which didn't work out.
01:07:04.860 | And this is sort of in that vein.
01:07:09.480 | - So you said origin story.
01:07:13.200 | So in the hand that draws itself, what is it?
01:07:16.200 | - So it's as if you had the canvas
01:07:20.040 | and then you ordered up also give me paint brushes,
01:07:23.840 | paints, pigments, pencils, and artists.
01:07:26.080 | - But you're saying that's like,
01:07:27.640 | if you want to create a universe from scratch,
01:07:30.800 | the canvas should be generating the paint brushes
01:07:33.000 | and the paint brushes should be generating the canvas.
01:07:34.880 | - Yeah, yeah, yeah, right.
01:07:36.080 | Like you should--
01:07:36.920 | - Who's the artist in this analogy?
01:07:38.280 | - Well, this is, sorry,
01:07:40.120 | then we're gonna get into a religious thing
01:07:41.680 | and I don't want to do that.
01:07:42.880 | - Okay.
01:07:43.720 | - Well, you know my shtick, which is that we are the AI.
01:07:47.600 | We have two great stories about the simulation
01:07:50.200 | and artificial general intelligence.
01:07:52.080 | In one story, man fears that some program
01:07:57.000 | we've given birth to will become self-aware,
01:07:59.480 | smarter than us, and will take over.
01:08:02.660 | In another story, there are genius simulators
01:08:06.660 | and we live in their simulation.
01:08:09.160 | And we haven't realized that those two stories
01:08:11.860 | are the same story.
01:08:13.440 | In one case, we are the simulator.
01:08:18.080 | In another case, we are the simulated.
01:08:19.980 | And if you buy those and you put them together,
01:08:24.040 | we are the AGI and whether or not we have simulators,
01:08:27.140 | we may be trying to wake up by learning our own source code.
01:08:30.380 | So this could be our Skynet moment,
01:08:32.000 | which is one of the reasons I have some issues around it.
01:08:35.260 | - I think we'll talk about that 'cause I--
01:08:37.220 | - Well, that's the issue of the emergent artist
01:08:39.000 | within the story.
01:08:40.160 | Just to get back to the point.
01:08:41.520 | - Okay, so now the key point is,
01:08:44.240 | the standard way we tell the story
01:08:45.880 | is that Einstein sets the canvas
01:08:48.320 | and then we order all the stuff that we want
01:08:50.680 | and then that paints the picture that is our universe.
01:08:54.340 | So you order the paint, you order the artist,
01:09:00.600 | you order the brushes, and that then,
01:09:04.600 | when you collide the two,
01:09:06.120 | gives you two separate origin stories.
01:09:08.080 | The canvas came from one place
01:09:10.680 | and everything else came from somewhere else.
01:09:12.840 | - So what are the mathematical tools required
01:09:16.200 | to construct consistent geometric theory,
01:09:21.620 | make this concrete?
01:09:25.780 | - Well, somehow, you need to get three copies, for example,
01:09:30.780 | of generations with 16 particles each.
01:09:35.400 | And so the question would be,
01:09:39.040 | well, there's a lot of special personality
01:09:42.600 | in those symmetries.
01:09:44.440 | Where would they come from?
01:09:45.800 | So for example, you've got what would be called
01:09:49.560 | grand unified theories that sound like SU5,
01:09:54.560 | the Georgi-Glashow theory.
01:09:56.160 | There's something that should be called spin 10,
01:09:58.400 | but physicists insist on calling it SO10.
01:10:01.680 | There's something called the Pati-Salam theory
01:10:04.280 | that tends to be called SU4 cross SU2 cross SU2,
01:10:07.760 | which should be called spin six cross spin four.
01:10:10.320 | I can get into all of these.
01:10:11.800 | - Now, what are they all accomplishing?
01:10:14.000 | - They're all taking the known forces that we see
01:10:16.760 | and packaging them up to say,
01:10:19.640 | we can't get rid of the second origin story,
01:10:22.800 | but we can at least make that origin story more unified.
01:10:26.720 | So they're trying, grand unification is the attempt to--
01:10:29.360 | - And that's a mistake in your--
01:10:31.040 | - It's not a mistake.
01:10:32.280 | The problem is, is it was born lifeless.
01:10:35.040 | When Georgi and Glashow first came out
01:10:37.520 | with the SU5 theory, it was very exciting
01:10:41.960 | because it could be tested in a South Dakota mine
01:10:45.920 | filled up with like, I don't know, cleaning fluid
01:10:48.360 | or something like that.
01:10:49.280 | And they looked for proton decay and didn't see it.
01:10:51.840 | And then they gave up because in that day
01:10:54.280 | when your experiment didn't work, you gave up on the theory.
01:10:57.720 | It didn't come to us born of a fusion
01:11:00.600 | between Einstein and Bohr.
01:11:04.480 | And that was kind of the problem,
01:11:07.880 | is it had this weird parenting
01:11:09.280 | where it was just on the Bohr side.
01:11:10.720 | There was no Einsteinian contribution.
01:11:12.920 | Lex, how can I help you most?
01:11:17.480 | I'm trying to figure out what questions you wanna ask
01:11:21.360 | so that you get the most satisfying answers.
01:11:23.700 | - There's a bunch of questions I wanna ask.
01:11:27.800 | I mean, one, and I'm trying to sneak up on you somehow
01:11:32.480 | to reveal in a accessible way the nature of our universe.
01:11:37.480 | - So I can just give you a guess, right?
01:11:42.600 | We have to be very careful that we're not claiming
01:11:45.640 | that this has been accepted.
01:11:47.920 | This is a speculation.
01:11:49.540 | But I will make the speculation.
01:11:52.000 | I think what you would wanna ask me
01:11:53.520 | is how can the canvas generate all the stuff
01:11:55.840 | that usually has to be ordered separately?
01:11:58.280 | All right, should we do that?
01:11:59.320 | - Let's go there. - Okay.
01:12:01.760 | So the first thing is is that you have a concept
01:12:05.360 | in computers called technical debt.
01:12:08.560 | You're coding and you cut corners
01:12:10.080 | and you know you're gonna have to do it right
01:12:12.280 | before the thing is safe for the world.
01:12:15.440 | But you're piling up some series of IOUs to yourself
01:12:20.440 | and your project as you're going along.
01:12:22.800 | So the first thing is we can't figure out
01:12:27.360 | if you have only four degrees of freedom,
01:12:29.380 | and that's what your canvas is,
01:12:30.920 | how do you get at least Einstein's world?
01:12:33.440 | Einstein said, look, it's not just four degrees of freedom,
01:12:37.200 | but there need to be rulers and protractors
01:12:39.500 | to measure length and angle in the world.
01:12:41.540 | You can't just have a flabby four degrees of freedom.
01:12:44.920 | So the first thing you do is you create 10 extra variables,
01:12:49.800 | which is like if we can't choose any particular set
01:12:52.320 | of rulers and protractors to measure length and angle,
01:12:55.520 | let's take the set of all possible rulers and protractors.
01:13:00.040 | And that would be called symmetric non-degenerate two tensors
01:13:04.160 | on the tangent space of the four manifold X4.
01:13:07.120 | Now, because there are four degrees of freedom,
01:13:11.000 | you start off with four dimensions,
01:13:12.600 | then you need four rulers
01:13:14.040 | for each of those different directions.
01:13:17.880 | So that's four, that gets us up to eight variables.
01:13:20.480 | And then between four original variables,
01:13:23.240 | there are six possible angles.
01:13:24.840 | So four plus four plus six is equal to 14.
01:13:29.320 | So now you've replaced X4 with another space,
01:13:32.720 | which in the lecture, I think I called U14,
01:13:34.680 | but I'm now calling Y14.
01:13:36.160 | This is one of the big problems of working on something
01:13:38.720 | in private is every time you pull it out,
01:13:40.320 | you sort of can't remember it,
01:13:41.360 | you name something new.
01:13:43.160 | Okay, so you've got a 14 dimensional world,
01:13:45.600 | which is the original four dimensional world,
01:13:47.800 | plus a lot of extra gadgetry for measurement.
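The dimension count described in this exchange (four base coordinates, plus the ten independent entries of a symmetric four-by-four metric: four "rulers" on the diagonal and six "angles" off it) can be sketched in a few lines. This is an editorial illustration of the arithmetic, not anything from the lecture itself, and the name `metric_dof` is just an illustrative label.

```python
# Minimal sketch of the counting above: a metric at a point of an
# n-dimensional manifold is a symmetric non-degenerate n x n matrix,
# giving n "rulers" (diagonal entries) plus n*(n-1)/2 "angles"
# (independent off-diagonal entries).
def metric_dof(n):
    """Dimension of the space of symmetric n x n matrices."""
    rulers = n                   # one length scale per direction
    angles = n * (n - 1) // 2    # one angle per pair of directions
    return rulers + angles

base = 4                   # the original four degrees of freedom
fiber = metric_dof(base)   # 4 rulers + 6 angles = 10
total = base + fiber       # the 14-dimensional space Y14
print(base, fiber, total)  # 4 10 14
```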
01:13:52.060 | - And because you're not in the four dimensional world,
01:13:55.240 | you don't have the technical debt.
01:13:56.800 | - No, now you've got a lot of technical debt
01:13:58.640 | 'cause now you have to explain away a 14 dimensional world,
01:14:01.200 | which is a big, you're taking a huge advance
01:14:03.880 | on your payday check, right?
01:14:06.000 | - But aren't more dimensions allow you more freedom?
01:14:10.360 | - Maybe, but you have to get rid of them somehow
01:14:12.080 | because we don't perceive them.
01:14:14.040 | - So eventually you have to collapse it down
01:14:15.320 | to the thing that we perceive.
01:14:16.640 | - Or you have to sample a four dimensional filament
01:14:20.980 | within that 14 dimensional world
01:14:22.840 | known as a section of a bundle.
01:14:26.320 | - Okay, so how do we get from the 14 dimensional world
01:14:30.200 | where I imagine a lot of--
01:14:31.400 | - Oh, wait, wait, wait.
01:14:33.080 | You're cheating, the first question was
01:14:35.540 | how do we get something from almost nothing?
01:14:38.920 | Like how do we get the,
01:14:40.760 | if I've said that the who and the what
01:14:43.240 | in the newspaper story that is a theory of everything
01:14:47.520 | are bosons and fermions,
01:14:49.920 | so let's make the who the fermions
01:14:51.640 | and the what the bosons.
01:14:53.040 | Think of it as the players and the equipment for a game.
01:14:56.300 | - Are we supposed to be thinking of actual physical things
01:14:59.000 | with mass or energy?
01:15:00.880 | - Yep. - Okay.
01:15:01.920 | - So think about everything you see in this room.
01:15:05.240 | So from chemistry you know it's all protons,
01:15:07.240 | neutrons and electrons,
01:15:08.480 | but from a little bit of late 1960s physics,
01:15:12.620 | we know that the protons and neutrons
01:15:14.420 | are all made of up quarks and down quarks.
01:15:17.120 | So everything in this room is basically up quarks,
01:15:19.640 | down quarks and electrons stuck together
01:15:21.640 | with the what, the equipment.
01:15:25.640 | Okay, now the way we see it currently
01:15:29.640 | is we see that there are space time indices
01:15:33.380 | which we would call spinors that correspond to the who,
01:15:37.520 | that is the fermions, the matter, the stuff,
01:15:39.720 | the up quarks, the down quarks, the electrons.
01:15:42.320 | And there are also 16 degrees of freedom
01:15:47.320 | that come from this space of internal quantum numbers.
01:15:53.420 | So in my theory, in 14 dimensions,
01:15:58.260 | there's no internal quantum number space that figures in.
01:16:02.280 | It's all just spinorial.
01:16:05.400 | So spinors in 14 dimensions without any festooning
01:16:11.700 | with extra linear algebraic information.
01:16:15.720 | There's a concept of spinors which is natural
01:16:21.980 | if you have a manifold with length and angle.
01:16:25.680 | And Y14 is almost a manifold with length and angle.
01:16:30.200 | It's so close.
01:16:34.260 | In other words, because you're looking at the space
01:16:37.660 | of all rulers and protractors,
01:16:39.300 | maybe it's not that surprising
01:16:41.220 | that a space of rulers and protractors
01:16:43.120 | might come very close to having rulers and protractors
01:16:45.640 | on it itself.
01:16:47.020 | Like can you measure the space of measurements?
01:16:49.840 | And you almost can.
01:16:51.860 | And a space that has length and angle,
01:16:54.580 | if it doesn't have a topological obstruction,
01:16:56.900 | comes with these objects called spinors.
01:16:59.260 | Now spinors are the stuff of our world.
01:17:05.180 | We are made of spinors.
01:17:07.140 | They are the most important really deep object
01:17:09.540 | that I can tell you about.
01:17:10.580 | They were very surprising.
01:17:11.740 | - What is a spinor?
01:17:13.420 | - So famously, there are these weird things
01:17:16.820 | that require 720 degrees of rotation.
01:17:22.180 | In order to come back to normal.
01:17:24.620 | And that doesn't make sense.
01:17:26.820 | And the reason for this is that there's a knottedness
01:17:30.380 | in our three dimensional world that people don't observe.
01:17:33.820 | And you can famously see it by this Dirac string trick.
01:17:38.820 | So if you take a glass of water,
01:17:40.480 | imagine that this was a tumbler
01:17:41.800 | and I didn't want to spill any of it.
01:17:43.740 | And the question is, if I rotate the cup
01:17:47.340 | without losing my grip on the base 360 degrees,
01:17:52.020 | and I can't go backwards, is there any way I can take a sip?
01:17:56.540 | And the answer is this weird motion,
01:17:58.540 | which is go over first and under second.
01:18:03.540 | And that's 720 degrees of rotation to come back to normal
01:18:07.980 | so that I can take a sip.
01:18:09.540 | Well, that weird principle,
01:18:10.900 | which sometimes is known as the Philippine wine glass dance
01:18:13.980 | because waitresses in the Philippines
01:18:16.660 | apparently learned how to do this.
01:18:18.380 | That move defines, if you will,
01:18:24.500 | this hidden space that nobody knew was there of spinors,
01:18:28.700 | which Dirac figured out when he took the square root
01:18:32.020 | of something called the Klein-Gordon equation,
01:18:34.300 | which I think had earlier work incorporated
01:18:40.060 | from Cartan and Killing and Company in mathematics.
01:18:43.260 | So spinors are one of the most profound aspects
01:18:45.620 | of human existence.
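The 720-degree property described above can be seen concretely in the unit quaternions, which double-cover ordinary 3D rotations: a full 360-degree turn lands on -1 rather than +1, and only a second full turn returns you to the identity. This is a standard textbook illustration, not Weinstein's own formalism; the name `rotation_quaternion` is illustrative.

```python
import math

# A rotation by theta about an axis corresponds to the unit quaternion
# (cos(theta/2), sin(theta/2) * axis). Because of the half-angle, a
# 360-degree rotation gives the quaternion -1 (a spinor picks up a
# sign), and only 720 degrees returns the quaternion to +1.
def rotation_quaternion(theta):
    """Quaternion (w, x, y, z) for rotation by theta about the z-axis."""
    return (math.cos(theta / 2), 0.0, 0.0, math.sin(theta / 2))

full_turn = rotation_quaternion(2 * math.pi)       # ~(-1, 0, 0, 0)
double_turn = rotation_quaternion(4 * math.pi)     # ~(+1, 0, 0, 0)
print(round(full_turn[0]), round(double_turn[0]))  # -1 1
```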
01:18:46.860 | - And you forgive me for the perhaps dumb questions,
01:18:48.980 | but would a spinor be the mathematical object
01:18:52.100 | that's the basic unit of our universe?
01:18:55.660 | - When you start with a manifold,
01:18:59.300 | which is just like something like a donut or a sphere,
01:19:04.820 | or a circle, or a Möbius band,
01:19:06.820 | a spinor is usually the first wildly surprising thing
01:19:11.780 | that you found was hidden in your original purchase.
01:19:15.960 | So you order a manifold and you didn't even realize,
01:19:20.320 | it's like buying a house and finding a panic room inside
01:19:23.800 | that you hadn't counted on.
01:19:25.600 | It's very surprising when you understand
01:19:27.480 | that spinors are running around in your spaces.
01:19:29.880 | - Again, perhaps a dumb question,
01:19:33.180 | but we're talking about 14 dimensions and four dimensions.
01:19:36.600 | What is the manifold we're operating under?
01:19:40.120 | - So in my case, it's proto-spacetime.
01:19:42.280 | It's before Einstein can slap rulers
01:19:46.240 | and protractors on spacetime.
01:19:48.840 | - What you mean by that, sorry to interrupt,
01:19:50.760 | is spacetime is the 4D manifold?
01:19:54.600 | - Spacetime is a four-dimensional manifold
01:19:57.320 | with extra structure.
01:19:58.700 | - What's the extra structure?
01:20:00.800 | - It's called a semi-Riemannian or pseudo-Riemannian metric.
01:20:05.800 | And in essence, there is something akin
01:20:08.940 | to a four by four symmetric matrix,
01:20:13.280 | which is equivalent to length and angle.
01:20:15.680 | So when I talk about rulers and protractors
01:20:17.520 | or I talk about length and angle,
01:20:19.520 | or I talk about Riemannian or pseudo-Riemannian
01:20:22.040 | or semi-Riemannian manifolds,
01:20:24.020 | I'm usually talking about the same thing.
01:20:26.600 | Can you measure how long something is
01:20:28.600 | and what the angle is between two different rays or vectors?
01:20:33.600 | So that's what Einstein gave us as his arena,
01:20:36.920 | his place to play, his canvas.
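The "four by four symmetric matrix" can be made concrete with the flat Minkowski metric of special relativity (an illustrative choice; Einstein's arena allows curved metrics too). It hands out lengths, via the spacetime interval, and angles:

```python
import math

# Flat Minkowski metric, signature (+,-,-,-): the 4x4 symmetric matrix
# that assigns lengths and angles on spacetime.
eta = [[1, 0, 0, 0],
       [0, -1, 0, 0],
       [0, 0, -1, 0],
       [0, 0, 0, -1]]

def inner(u, v, g):
    """The metric g acting on two vectors: g(u, v)."""
    return sum(g[i][j] * u[i] * v[j] for i in range(4) for j in range(4))

light_ray = [1, 1, 0, 0]    # travels at the speed of light
worldline = [1, 0.5, 0, 0]  # slower than light

print(inner(light_ray, light_ray, eta))      # 0: null direction
print(inner(worldline, worldline, eta) > 0)  # True: timelike direction

# With an ordinary (Riemannian) metric, the same machinery measures angles:
delta = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
u, v = [0, 1, 0, 0], [0, 1, 1, 0]
cos_t = inner(u, v, delta) / math.sqrt(inner(u, u, delta) * inner(v, v, delta))
print(round(math.degrees(math.acos(cos_t))))  # 45
```

The sign of the interval is what separates timelike, null, and spacelike directions, which is the extra structure rulers and protractors encode.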
01:20:39.440 | - There's a bunch of questions I could ask here,
01:20:46.000 | but like I said, I'm working on this book,
01:20:49.000 | Geometric Unity for Idiots.
01:20:50.580 | And I think what would be really nice as your editor
01:20:57.600 | to have beautiful, maybe even visualizations
01:21:03.200 | that people could try to play with,
01:21:06.720 | try to reveal small little beauties
01:21:09.280 | about the way you're thinking about this world.
01:21:11.200 | - Well, I usually use the Joe Rogan program for that.
01:21:13.720 | Sometimes I have him doing the Philippine wine glass dance.
01:21:17.080 | I had the Hopf fibration.
01:21:19.000 | The part of the problem is that most people don't know
01:21:22.780 | this language about spinors, bundles, metrics, gauge fields.
01:21:27.780 | And they're very curious about the theory of everything,
01:21:31.480 | but they have no understanding
01:21:32.920 | of even what we know about our own world.
01:21:34.880 | - Is it a hopeless pursuit?
01:21:38.040 | - No.
01:21:38.880 | - Like even gauge theory.
01:21:40.000 | - Right.
01:21:40.840 | - Just this, I mean, it seems to be very inaccessible.
01:21:44.200 | Is there some aspect of it that could be made accessible?
01:21:46.600 | - I mean, I could go to the board right there
01:21:48.520 | and give you a five minute lecture on gauge theory
01:21:51.400 | that would be better than the official lecture
01:21:54.600 | on gauge theory.
01:21:55.420 | You would know what gauge theory was.
01:21:57.440 | - So it is possible to make it accessible.
01:21:59.760 | - Yeah, but nobody does.
01:22:01.840 | Like, in other words, you're gonna watch
01:22:03.880 | over the next year lots of different discussions
01:22:06.920 | about quantum entanglement or the multiverse.
01:22:10.520 | Where are we now?
01:22:12.360 | Or many worlds, are they all equally real?
01:22:15.760 | - Yeah.
01:22:16.600 | - Right?
01:22:18.640 | - I mean, yeah, that's like--
01:22:19.480 | - Okay, but you're not gonna hear anything
01:22:20.960 | about the Hopf fibration except if it's from me,
01:22:23.320 | and I hate that.
01:22:24.800 | - Why can't you be the one?
01:22:26.920 | - Well, because I'm going a different path.
01:22:28.720 | I think that we've made a huge mistake,
01:22:30.600 | which is we have things we can show people
01:22:32.780 | about the actual models.
01:22:34.620 | We can push out visualizations where they're not listening
01:22:38.000 | by analogy, they're watching the same thing
01:22:40.120 | that we're seeing.
01:22:41.280 | And as I've said to you before,
01:22:42.920 | this is like choosing to perform sheet music
01:22:46.080 | that hasn't been performed in a long time.
01:22:48.000 | Or the experts can't afford orchestras,
01:22:50.480 | so they just trade Beethoven symphonies as sheet music.
01:22:53.600 | And they're like, oh, wow, that was beautiful.
01:22:55.920 | But it's like nobody heard anything.
01:22:58.400 | They just looked at the score.
01:22:59.480 | Well, that's how mathematicians and physicists
01:23:02.040 | trade papers and ideas, is that they write down
01:23:05.440 | the things that represent stuff.
01:23:07.280 | I want to at least close out this thought line
01:23:11.100 | that you started, which is how does the canvas
01:23:15.360 | order all of this other stuff into being?
01:23:19.860 | So I at least want to say some incomprehensible things
01:23:23.800 | about that, and then we'll have that much done, all right?
01:23:28.120 | - On that just point, does it have to be incomprehensible?
01:23:32.880 | - Do you know what the Schrödinger equation is?
01:23:34.800 | - Yes.
01:23:35.640 | - Do you know what the Dirac equation is?
01:23:38.440 | - What does "know" mean?
01:23:40.120 | - Well, my point is you're gonna have some feeling
01:23:42.680 | that you know what the Schrödinger equation is.
01:23:45.240 | As soon as we get to the Dirac equation,
01:23:47.280 | your eyes are gonna get a little bit glazed, right?
01:23:50.360 | So now why is that?
01:23:53.840 | Well, the answer to me is that you want to ask me
01:23:58.840 | about the theory of everything,
01:24:01.040 | but you haven't even digested the theory of everything
01:24:05.680 | as we've had it since 1928,
01:24:08.300 | when Dirac came out with his equation.
01:24:10.680 | So for whatever reason, and this isn't a hit on you,
01:24:15.960 | you haven't been motivated enough in all the time
01:24:20.640 | that you've been on Earth to at least get as far
01:24:23.360 | as the Dirac equation.
01:24:24.680 | And this was very interesting to me
01:24:25.760 | after I gave the talk in Oxford.
01:24:27.720 | New Scientist, which had done kind of a hatchet job on me
01:24:31.880 | to begin with, sent a reporter to come to the third version
01:24:34.760 | of the talk that I gave,
01:24:36.560 | and that person had never heard of the Dirac equation.
01:24:39.460 | So you have a person who's completely professionally
01:24:44.760 | not qualified to ask these questions,
01:24:48.880 | wanting to know, well, how does your theory
01:24:52.640 | solve new problems?
01:24:53.800 | And it's like, well, in the case of the Dirac equation,
01:24:56.160 | well, tell me about that, I don't know what that is.
01:24:58.000 | So then the point is, okay, I got it.
01:25:01.000 | You're not even caught up minimally to where we are now,
01:25:05.040 | and that's not a knock on you, almost nobody is.
01:25:08.560 | But then how does it become my job
01:25:11.120 | to digest what has been available for like over 90 years?
01:25:17.960 | - Well, to me, the open question is whether
01:25:20.360 | what's been available for over 90 years
01:25:22.360 | can be, there could be a blueprint of a journey
01:25:27.360 | that one takes to understand it, not to--
01:25:31.560 | - Oh, I want to do that with you.
01:25:32.960 | And one of the things I think
01:25:35.000 | I've been relatively successful at, for example,
01:25:37.660 | when you ask other people what gauge theory is,
01:25:41.760 | you get these very confusing responses.
01:25:44.440 | And my response is much simpler.
01:25:45.880 | It's, oh, it's a theory of differentiation,
01:25:49.080 | where when you calculate the instantaneous rise over run,
01:25:52.480 | you measure the rise not from a flat horizontal,
01:25:55.400 | but from a custom endogenous reference level.
01:25:57.980 | What do you mean by that?
01:25:59.880 | It's like, okay, and then I do this thing
01:26:01.720 | with Mount Everest, which is, Mount Everest is how high?
01:26:05.400 | Then they give the height.
01:26:06.240 | I say, above what?
01:26:07.060 | Then they say sea level.
01:26:08.040 | And I say, which sea is that in Nepal?
01:26:10.680 | They're like, oh, I guess there isn't a sea,
01:26:11.880 | 'cause it's landlocked.
01:26:12.800 | It's like, okay, well, what do you mean by sea level?
01:26:15.000 | Oh, there's this thing called the geoid I'd never heard of.
01:26:17.600 | Oh, that's the reference level.
01:26:19.200 | That's a custom reference level that we imported.
01:26:22.520 | So all sorts of people have remembered
01:26:26.080 | the exact height of Mount Everest
01:26:28.160 | without ever knowing what it's a height from.
01:26:30.660 | Well, in this case, in gauge theory,
01:26:34.200 | there's a hidden reference level
01:26:35.720 | where you measure the rise in rise over run
01:26:38.320 | to give the slope of a line.
01:26:39.880 | What if you have different concepts
01:26:43.360 | of where that rise should be measured from
01:26:47.920 | that vary within the theory,
01:26:49.160 | that are endogenous to the theory?
01:26:51.360 | That's what gauge theory is.
01:26:52.760 | Okay, we have a video here, right?
01:26:55.560 | - Yeah. - Okay.
01:26:57.160 | I'm gonna use my phone.
01:26:58.320 | If I wanna measure my hand and its slope,
01:27:04.000 | this is my attempt to measure it using standard calculus.
01:27:07.760 | In other words, the reference level is apparently flat,
01:27:10.680 | and I measure the rise above that phone using my hand, okay?
01:27:15.640 | If I wanna use gauge theory, it means I can do this,
01:27:18.720 | or I can do that, or I can do this, or I can do this,
01:27:21.520 | or I could do what I did from the beginning, okay?
01:27:24.720 | At some level, that's what gauge theory is.
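The custom-reference-level picture of gauge theory can be sketched numerically. The `height` and `geoid` functions below are invented stand-ins for the Everest example, not anything from the conversation:

```python
def slope(f, x, h=1e-6):
    """Ordinary rise-over-run: the rise is measured from a flat horizontal."""
    return (f(x + h) - f(x)) / h

def gauge_slope(f, ref, x, h=1e-6):
    """Rise-over-run where the rise is measured from a custom,
    position-dependent reference level instead of a flat horizontal."""
    return ((f(x + h) - ref(x + h)) - (f(x) - ref(x))) / h

height = lambda x: 0.5 * x * x + 3.0  # hypothetical terrain height
geoid = lambda x: 0.5 * x * x         # hypothetical "sea level" reference

print(round(slope(height, 2.0), 3))               # 2.0: steep against a flat reference
print(round(gauge_slope(height, geoid, 2.0), 3))  # about 0: flat against the geoid
```

Same terrain, different answer, because the rise is being measured from an endogenous reference level rather than from zero, which is the whole point of the Everest example.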
01:27:27.320 | Now, that is an act, no,
01:27:28.720 | I've never heard anyone describe it that way.
01:27:31.860 | So while the community may say, well, who is this guy?
01:27:34.080 | And why does he have the right to talk in public?
01:27:36.080 | I'm waiting for somebody to jump out of the woodwork
01:27:38.660 | and say, you know Eric's whole shtick
01:27:41.020 | about rulers and protractors leading to a derivative,
01:27:45.120 | derivatives are measured as rise over run
01:27:47.000 | above a reference level, and the reference level's not fixed.
01:27:49.000 | Like, I go through this whole shtick
01:27:50.600 | in order to make it accessible.
01:27:52.200 | I've never heard anyone say it.
01:27:53.760 | I'm trying to make, Prometheus would like
01:27:57.160 | to discuss fire with everybody else.
01:27:59.220 | All right, I'm gonna just say one thing
01:28:02.040 | to close out the earlier line,
01:28:03.680 | which is what I think we should have continued with.
01:28:06.420 | When you take the naturally occurring spinors,
01:28:09.780 | the unadorned spinors, the naked spinors,
01:28:12.540 | not on this 14 dimensional manifold,
01:28:16.760 | but on something very closely tied to it,
01:28:19.320 | which I've called the chimeric tangent bundle.
01:28:22.160 | That is the object which stands in for the thing
01:28:26.340 | that should have had length and angle on it,
01:28:27.940 | but just missed, okay?
01:28:29.400 | When you take that object and you form spinors on that
01:28:33.720 | and you don't adorn them,
01:28:34.820 | so you're still in the single origin story,
01:28:37.040 | you get very large spinorial objects
01:28:41.400 | upstairs on this 14 dimensional world, Y14,
01:28:45.640 | which is part of the observers.
01:28:47.600 | When you pull that information back from Y14 down to X4,
01:28:52.600 | it miraculously looks like the adorned spinors,
01:29:00.720 | the festooned spinors,
01:29:03.120 | the spinors that we play with in ordinary reality.
01:29:07.700 | In other words, the 14 dimensional world
01:29:10.520 | looks like a four dimensional world
01:29:12.520 | plus a 10 dimensional complement.
01:29:15.120 | So 10 plus four equals 14.
01:29:17.440 | That 10 dimensional complement,
01:29:19.100 | which is called a normal bundle,
01:29:21.540 | generates spin properties, internal quantum numbers
01:29:24.880 | that look like the things
01:29:26.600 | that give our particles personality,
01:29:28.840 | that make, let's say, up quarks and down quarks
01:29:32.920 | charged by negative one third or plus two thirds,
01:29:37.360 | that kind of stuff, or whether or not
01:29:39.280 | some quarks feel the weak force and other quarks do not.
01:29:45.620 | So the X4 generates Y14,
01:29:50.240 | Y14 generates something called the chimeric tangent bundle.
01:29:53.440 | Chimeric tangent bundle generates unadorned spinors.
01:29:56.580 | The unadorned spinors get pulled back from 14 down to four
01:30:00.520 | where they look like adorned spinors.
01:30:03.260 | And we have the right number of them.
01:30:05.280 | You thought you needed three, you only got two.
01:30:08.440 | But then something else that you'd never seen before
01:30:10.800 | broke apart on this journey.
01:30:13.040 | And it broke into another copy of the thing
01:30:15.640 | that you already have two copies of.
01:30:17.400 | One piece of that thing broke off.
01:30:19.760 | So now you have two generations
01:30:21.600 | plus an imposter third generation, which is,
01:30:24.100 | I don't know why we never talk about this possibility
01:30:27.580 | in regular physics.
01:30:28.580 | And then you've got a bunch of stuff that we haven't seen,
01:30:30.520 | which has descriptions.
01:30:32.160 | So people always say,
01:30:33.000 | does it make any falsifiable predictions?
01:30:34.880 | Yes, it does.
01:30:35.720 | It says that the matter that you should be seeing next
01:30:41.560 | has particular properties that can be read off.
01:30:44.000 | - Like?
01:30:46.120 | - Like weak isospin, weak hypercharge,
01:30:48.920 | like the responsiveness to the strong force.
01:30:51.860 | The one I can't tell you
01:30:52.760 | is what energy scale it would happen at.
01:30:55.760 | - So you can't say if those characteristics
01:30:59.780 | can be detected with the current--
01:31:01.580 | - But it may be that somebody else can.
01:31:03.260 | I'm not a physicist.
01:31:05.140 | I'm not a quantum field theorist.
01:31:06.540 | I can't, I don't know how you would do that.
01:31:09.820 | - The hope for me is that there's some simple
01:31:13.700 | explanations for all of it.
01:31:16.780 | - Like, should we have a drink?
01:31:19.140 | - You're having fun.
01:31:20.560 | - No, I'm trying to have fun with you.
01:31:22.100 | You know that.
01:31:22.940 | - Yeah, there's a bunch of fun things
01:31:24.900 | which are very serious.
01:31:26.540 | - To talk about here.
01:31:28.800 | - Anyway, that was how I got what I thought you wanted,
01:31:31.980 | which is if you think about the fermions as the artists
01:31:36.980 | and the bosons as the brushes and the paint,
01:31:42.560 | what I told you is that's how we get the artists.
01:31:46.240 | - What are the open questions for you in this?
01:31:50.160 | What are the challenges?
01:31:52.020 | So you're not done.
01:31:53.640 | - Well, there's things that I would like
01:31:55.580 | to have in better order.
01:31:57.380 | So a lot of people will say,
01:31:59.800 | the reason I hesitate on this is I just have
01:32:03.100 | a totally different view than the community.
01:32:04.900 | So for example, I believe that general relativity
01:32:08.440 | began in 1913 with Einstein and Grossman.
01:32:12.900 | Now that was the first of like four major papers
01:32:17.100 | in this line of thinking.
01:32:19.980 | To most physicists, general relativity happened
01:32:23.140 | when Einstein produced a divergence-free gradient,
01:32:28.140 | which turned out to be the gradient
01:32:33.540 | of the so-called Hilbert or Einstein-Hilbert action.
01:32:36.740 | And from my perspective, that wasn't true.
01:32:40.140 | This is that it began when Einstein said,
01:32:41.980 | "Look, this is about differential geometry
01:32:46.500 | "and the final answer is gonna look like
01:32:49.020 | "a curvature tensor on one side
01:32:51.420 | "and matter and energy on the other side."
01:32:54.060 | And that was enough.
01:32:55.260 | And then he published a wrong version of it
01:32:57.820 | where it was the Ricci tensor, not the Einstein tensor.
01:33:00.660 | Then he corrected the Ricci tensor
01:33:02.880 | to make it into the Einstein tensor.
01:33:04.720 | Then he corrected that to add a cosmological constant.
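The Ricci-to-Einstein correction described here is, algebraically, a trace adjustment: G_mu_nu = R_mu_nu - (1/2) g_mu_nu R. A toy check with invented numbers (only the algebra of the correction is being illustrated):

```python
# Toy check that the Einstein tensor is the trace-reversed Ricci tensor:
# G_mu_nu = R_mu_nu - (1/2) g_mu_nu R.  The numbers are made up.
eta = [1, -1, -1, -1]          # diagonal Minkowski metric (its own inverse here)
ricci = [[0.4, 0, 0, 0],
         [0, 0.2, 0, 0],
         [0, 0, 0.2, 0],
         [0, 0, 0, 0.2]]       # hypothetical Ricci tensor

# Scalar curvature: R = g^{mu nu} R_{mu nu}.
R = sum(eta[mu] * ricci[mu][mu] for mu in range(4))

einstein = [[ricci[i][j] - 0.5 * (eta[i] if i == j else 0) * R
             for j in range(4)] for i in range(4)]

# Trace reversal: in four dimensions the Einstein tensor's metric trace
# is minus the Ricci tensor's metric trace.
G_trace = sum(eta[mu] * einstein[mu][mu] for mu in range(4))
print(round(R, 6), round(G_trace, 6))  # -0.2 0.2
```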
01:33:07.520 | I can't stand that the community thinks in those terms.
01:33:12.580 | There's some things about which,
01:33:14.940 | like there's a question about which contraction do I use?
01:33:17.820 | There's an Einstein contraction,
01:33:19.820 | there's a Ricci contraction.
01:33:21.100 | They both go between the same spaces.
01:33:23.780 | I'm not sure what I should do.
01:33:25.180 | I'm not sure which contraction I should choose.
01:33:28.260 | This is called a shiab operator,
01:33:30.180 | for "ship in a bottle," in my stuff.
01:33:32.400 | - You have this big platform in many ways
01:33:37.580 | that inspires people's curiosity
01:33:41.500 | about physics and mathematics.
01:33:43.660 | - Right.
01:33:44.500 | - And I'm one of those people.
01:33:47.980 | - Well, great.
01:33:50.180 | - But then you start using a lot of words
01:33:52.300 | that I don't understand.
01:33:53.500 | And I might know them, but I don't understand.
01:33:59.660 | And what's unclear to me,
01:34:01.260 | if I'm supposed to be listening to those words,
01:34:04.160 | or if it's just, if this is one of those technical things
01:34:08.280 | that's intended for a very small community,
01:34:11.780 | or if I'm supposed to actually take those words
01:34:14.140 | and start a multi-year study,
01:34:19.020 | not a serious study, but the kind of study
01:34:21.500 | when you're interested in learning
01:34:23.740 | about machine learning, for example,
01:34:25.420 | or any kind of discipline.
01:34:27.220 | That's where I'm a little bit confused.
01:34:29.180 | So you speak beautifully about ideas.
01:34:32.620 | You often reveal the beauty in geometry.
01:34:36.780 | And I'm unclear in what are the steps I should be taking.
01:34:41.620 | I'm curious, how can I explore?
01:34:45.420 | How can I play with something?
01:34:46.640 | How can I play with these ideas?
01:34:48.460 | - Right.
01:34:49.300 | - And enjoy the beauty of,
01:34:51.120 | not necessarily understanding the depth of a theory
01:34:53.620 | that you're presenting, but start to share in the beauty.
01:34:56.960 | As opposed to sharing and enjoying the beauty
01:35:00.500 | of just the way, the passion with which you speak,
01:35:03.460 | which is in itself fun to listen to,
01:35:08.060 | but also starting to be able to understand
01:35:11.300 | some aspects of this theory that I can enjoy it.
01:35:16.660 | And start to build an intuition,
01:35:18.660 | what the heck we're even talking about.
01:35:20.580 | 'Cause you're basically saying we need to throw
01:35:22.380 | a lot of our ideas of views of the universe out.
01:35:27.380 | And I'm trying to find accessible ways in.
01:35:34.060 | - Okay.
01:35:34.900 | - Not in this conversation.
01:35:36.980 | - No, I appreciate that.
01:35:37.820 | So one of the things that I've done
01:35:39.100 | is I've picked on one paragraph from Edward Witten.
01:35:42.660 | And I said, this is the paragraph.
01:35:46.500 | If I could only take one paragraph with me,
01:35:48.620 | this is the one I'd take.
01:35:49.860 | And it's almost all in prose, not in equations.
01:35:52.220 | And he says, look, this is our knowledge of the universe
01:35:55.980 | at its deepest level.
01:35:57.140 | And he was writing this during the 1980s.
01:35:59.740 | And he has three separate points
01:36:02.020 | that constitute our deepest knowledge.
01:36:04.540 | And those three points refer to equations.
01:36:07.740 | One to the Einstein field equation,
01:36:09.440 | one to the Dirac equation,
01:36:10.820 | and one to the Yang-Mills-Maxwell equation.
01:36:14.540 | Now, one thing I would do is take a look at that paragraph
01:36:19.460 | and say, okay, what do these three lines mean?
01:36:22.940 | Like it's a finite amount of verbiage.
01:36:24.580 | You can write down every word that you don't know.
01:36:27.780 | - Beautiful.
01:36:28.620 | - And you can say, what do I think?
01:36:30.780 | - Done.
01:36:31.620 | - Now, young man.
01:36:33.600 | - Yes.
01:36:34.440 | - There's a beautiful wall in Stony Brook, New York,
01:36:38.620 | built by someone who I know you will interview,
01:36:41.340 | named Jim Simons.
01:36:44.260 | And Jim Simons, he's not the artist,
01:36:46.700 | but he's the guy who funded it.
01:36:48.220 | World's greatest hedge fund manager.
01:36:50.180 | And that wall contains the three equations
01:36:53.320 | that Witten refers to in that paragraph.
01:36:57.340 | And so that is the transmission from the paragraph
01:37:00.180 | or graph to the wall.
01:37:02.480 | Now, that wall needs an owner's manual,
01:37:06.400 | which Roger Penrose has written called "The Road to Reality."
01:37:11.000 | Let's call that the tome.
01:37:13.180 | So this is the subject of the so-called
01:37:15.340 | graph wall tome project that is going on
01:37:19.220 | in our Discord server and our general group
01:37:22.580 | around the portal community,
01:37:24.100 | which is how do you take something that purports
01:37:27.660 | in one paragraph to say what the deepest understanding
01:37:31.220 | man has of the universe in which he lives.
01:37:33.500 | It's memorialized on a wall, which nobody knows about,
01:37:39.380 | which is an incredibly gorgeous piece of art.
01:37:42.800 | And that was written up in a book,
01:37:45.960 | which has been written for no man.
01:37:48.560 | Maybe it's for a woman, I don't know.
01:37:51.300 | But no one should be able to read this book
01:37:53.880 | because either you're a professional
01:37:56.380 | and you know a lot of this book,
01:37:57.660 | in which case it's kind of a refresher
01:37:59.220 | to see how Roger thinks about these things.
01:38:01.580 | Or you don't even know that this book
01:38:03.100 | is a self-contained invitation
01:38:05.980 | to understanding our deepest nature.
01:38:08.580 | So I would say find yourself in the graph wall
01:38:11.980 | tome transmission sequence and join
01:38:14.140 | the graph wall tome project if that's of interest.
01:38:17.580 | - Okay, beautiful.
01:38:18.980 | Now just to linger on a little longer,
01:38:21.060 | what kind of journey do you see Geometric Unity taking?
01:38:24.540 | - I don't know.
01:38:25.500 | I mean, that's the thing is that first of all,
01:38:27.300 | the professional community has to get very angry
01:38:29.360 | and outraged and they have to work through
01:38:31.580 | their feeling that this is nonsense, this is bullshit,
01:38:34.100 | or like, no, wait a minute, this is really cool.
01:38:37.100 | Actually, I need some clarification over here.
01:38:38.980 | So there's gonna be some sort of
01:38:40.580 | weird coming back together process.
01:38:43.460 | - Are you already hearing murmurings of that?
01:38:47.460 | - It's very funny.
01:38:48.340 | Officially, I've seen very little.
01:38:50.040 | - So it's perhaps happening quietly.
01:38:53.780 | - Yeah.
01:38:54.620 | - You often talk about we need to get off this planet.
01:38:58.640 | - Yep.
01:38:59.480 | - Can I try to sneak up on that by asking
01:39:03.700 | what in your kind of view is the difference,
01:39:06.340 | the gap between the science of it, the theory,
01:39:10.580 | and the actual engineering of building something
01:39:12.900 | that leverages the theory to do something?
01:39:14.900 | Like how big is that--
01:39:16.220 | - We don't know.
01:39:17.260 | - Gap?
01:39:18.220 | - I mean, if you have 10 extra dimensions to play with
01:39:20.860 | that are the rulers and protractors of the world themselves,
01:39:24.540 | can you gain access to those dimensions?
01:39:26.740 | - Do you have a hunch, so--
01:39:30.620 | - I don't know.
01:39:31.900 | I don't wanna get ahead of myself.
01:39:33.580 | Because you have to appreciate, I can have hunches
01:39:35.780 | and I can jaw off.
01:39:37.720 | But one of the ways that I'm succeeding in this world
01:39:43.380 | is to not bow down to my professional communities
01:39:46.980 | nor to ignore them.
01:39:48.540 | Like I'm actually interested in the criticism,
01:39:50.580 | I just wanna denature it so that it's not
01:39:53.340 | mostly interpersonal and irrelevant.
01:39:56.580 | I believe that they don't want me to speculate
01:40:00.900 | and I don't need to speculate about this.
01:40:02.720 | I can simply say I'm open to the idea
01:40:05.760 | that it may have engineering prospects
01:40:07.760 | and it may be a death sentence.
01:40:09.240 | We may find out that there's not enough new here
01:40:13.120 | that even if it were right,
01:40:14.960 | that there would be nothing new to do.
01:40:16.400 | Can't tell you.
01:40:17.520 | - That's what you mean by death sentence,
01:40:19.660 | there would not be exciting breakthroughs
01:40:21.560 | that follow on. - Wouldn't it be terrible
01:40:22.740 | if you couldn't, like you can do new things
01:40:25.240 | in an Einsteinian world that you couldn't do
01:40:27.420 | in a Newtonian world.
01:40:29.320 | - Right.
01:40:30.160 | - You know, like you have twin paradoxes
01:40:31.160 | or Lorentz contraction of length
01:40:33.080 | or any one of a number of new cool things
01:40:35.360 | happen in relativity theory that didn't happen for Newton.
01:40:38.680 | What if there wasn't new stuff to do
01:40:41.600 | at the next and final level?
01:40:43.240 | - So first of all-- - That would be quite sad.
01:40:47.920 | Let me ask a silly question but--
01:40:50.540 | - We'll say it with a straight face.
01:40:53.200 | - Impossible.
01:40:57.440 | So let me mention Elon Musk.
01:41:01.720 | What are your thoughts about,
01:41:03.440 | he's more, you're more in the physics theory side of things,
01:41:08.200 | he's more in the physics engineering side of things
01:41:10.720 | in terms of SpaceX efforts.
01:41:13.020 | What do you think of his efforts to get off this planet?
01:41:18.020 | - Well I think he's the other guy
01:41:22.560 | who's semi-serious about getting off this planet.
01:41:26.800 | I think there are two of us who are semi-serious
01:41:28.480 | about getting off the planet.
01:41:29.920 | - What do you think about his methodology and yours
01:41:33.120 | when you look at them?
01:41:34.640 | - I don't wanna be against Elon
01:41:36.160 | because I was so excited that your top video
01:41:40.000 | was Ray Kurzweil and then I did your podcast
01:41:42.440 | and we had some chemistry so it zoomed up
01:41:44.920 | and I thought okay, I'm gonna beat Ray Kurzweil.
01:41:46.600 | So just as I'm coming up on Ray Kurzweil,
01:41:49.000 | you're like, and now, a Lex Fridman special: Elon Musk,
01:41:52.580 | and he blew me out of the water.
01:41:53.800 | So I don't wanna be petty about it.
01:41:55.680 | I wanna say that I don't but I am.
01:41:59.040 | Okay, but here's the funny part.
01:42:00.640 | He's not taking enough risk.
01:42:03.320 | Like he's trying to get us to Mars.
01:42:06.240 | Imagine that he got us to Mars, the moon
01:42:08.600 | and we'll throw in Titan.
01:42:10.160 | And nowhere good enough.
01:42:14.260 | The diversification level is too low.
01:42:16.280 | Now there's a compatibility.
01:42:18.280 | First of all, I don't think Elon is serious about Mars.
01:42:23.520 | I think Elon is using Mars.
01:42:25.380 | - As a narrative, as a story, as a dream?
01:42:29.680 | - To make the moon jealous.
01:42:31.120 | - To make the, no.
01:42:32.200 | - I think he's using it as a story to organize us
01:42:38.000 | to reacquaint ourselves with our need for space,
01:42:41.000 | our need to get off this planet.
01:42:42.360 | It's a concrete thing.
01:42:43.680 | He's shown that, many people think that he's shown
01:42:48.380 | that he's the most brilliant and capable person
01:42:50.300 | on the planet.
01:42:51.140 | I don't think that's what he showed.
01:42:52.260 | I think he showed that the rest of us
01:42:53.600 | have forgotten our capabilities.
01:42:55.640 | And so he's like the only guy who has still kept the faith
01:42:59.040 | and is like, what's wrong with you people?
01:43:01.600 | - So you think the lesson we should draw from Elon Musk
01:43:03.680 | is there's a capable person within a lot of us.
01:43:07.920 | - Elon makes sense to me.
01:43:09.960 | - In what way?
01:43:11.440 | - He's doing what any sensible person should do.
01:43:13.560 | He's trying incredible things
01:43:16.040 | and he's partially succeeding, partially failing.
01:43:18.520 | - To try to solve the obvious problems before.
01:43:20.480 | - Duh.
01:43:21.320 | - Yeah.
01:43:22.160 | - But he comes up with things like, I got it.
01:43:25.200 | We'll come up with a battery company,
01:43:26.620 | but batteries aren't sexy, so we'll make a car around it.
01:43:29.560 | Like, great.
01:43:30.520 | Or any one of a number of things.
01:43:34.580 | Elon is behaving like a sane person
01:43:38.840 | and I view everyone else as insane.
01:43:41.280 | And my feeling is that we really have to get off this planet.
01:43:46.280 | We have to get out of this,
01:43:47.240 | we have to get out of the neighborhood.
01:43:49.640 | - To linger a little bit.
01:43:50.840 | Do you think that's a physics problem
01:43:52.720 | or an engineering problem?
01:43:54.280 | - I think it's a cowardice problem.
01:43:57.240 | I think that we're afraid that we had .400 hitters
01:44:02.240 | of the mind, like Einstein and Dirac,
01:44:04.240 | and that era is done and now we're just sort of copy editors.
01:44:09.240 | - So is some of it money?
01:44:11.680 | Like, if we become brave enough to go outside
01:44:15.380 | the solar system, can we afford to, financially?
01:44:19.340 | - Well, I think that that's not really the issue.
01:44:21.540 | The issue is, look what Elon did well.
01:44:26.080 | He amassed a lot of money.
01:44:27.480 | And then he plowed it back in and he spun the wheel
01:44:32.960 | and he made more money, and now he's got FU money.
01:44:36.740 | Now, the problem is that a lot of the people
01:44:41.060 | who have FU money are not people whose middle finger
01:44:44.540 | you ever want to see.
01:44:45.660 | I want to see Elon's middle finger.
01:44:48.420 | I want to see what he's-- - What do you mean by that?
01:44:49.680 | Or like when you say, "Fuck it, I'm gonna do
01:44:51.760 | "the biggest possible thing." - He's gonna do
01:44:52.600 | whatever the fuck he wants.
01:44:54.380 | Right?
01:44:55.220 | Fuck you, fuck anything that gets in his way
01:44:57.240 | that he can afford to push out of his way.
01:44:59.000 | - And you're saying he's not actually even doing that enough.
01:45:01.980 | - No, I'm-- - He's not going--
01:45:03.480 | - Please, I'm gonna go, Elon's doing fine with his money.
01:45:06.900 | I just want him to enjoy himself,
01:45:09.640 | have the most Dionysian-- - Well, no,
01:45:12.520 | but you're saying Mars is playing it safe.
01:45:15.380 | - He doesn't know how to do anything else.
01:45:19.320 | He knows rockets.
01:45:20.700 | - Yeah.
01:45:21.820 | - And he might know some physics at a fundamental level.
01:45:26.820 | - Yeah, I guess, okay, just let me just go right back to it.
01:45:32.180 | How much physics do you really,
01:45:33.900 | how much brilliant breakthrough ideas on the physics side
01:45:37.480 | do you need to get off this planet?
01:45:40.080 | - I don't know, and I don't know whether,
01:45:42.900 | like in my most optimistic dream,
01:45:44.580 | I don't know whether my stuff gets us off the planet,
01:45:47.140 | but it's hope.
01:45:48.800 | It's hope that there's a more fundamental theory
01:45:51.080 | that we can access, that we don't need,
01:45:53.840 | whose elegance and beauty will suggest
01:45:58.220 | that this is probably the way the universe goes.
01:46:01.420 | Like you have to say this weird thing,
01:46:03.160 | which is this I believe,
01:46:05.700 | and this I believe is a very dangerous statement,
01:46:08.640 | but this I believe, I believe that my theory points the way.
01:46:14.160 | Now, Elon might or might not be able to access my theory.
01:46:18.760 | I don't know what he knows, but keep in mind,
01:46:21.940 | why are we all so focused on Elon?
01:46:25.760 | It's really weird.
01:46:26.680 | It's kind of creepy too.
01:46:28.380 | - Why?
01:46:29.220 | He's just the person who's just asking
01:46:31.320 | the obvious questions and doing whatever he can.
01:46:33.880 | - But he makes sense to me.
01:46:35.120 | You see, Craig Venter makes sense to me.
01:46:37.120 | Jim Watson makes sense to me.
01:46:39.200 | - But we're focusing on Elon because he somehow is rare.
01:46:44.540 | - Well, that's the weird thing.
01:46:45.940 | Like we've come up with a system
01:46:47.260 | that eliminates all Elon from our pipeline,
01:46:51.660 | and Elon somehow snuck through
01:46:55.460 | when they weren't quality adjusting everything, you know?
01:46:58.220 | - And this idea of DISC,
01:47:02.420 | the Distributed Idea Suppression Complex,
01:47:05.260 | is that what's bringing the Elons of the world down?
01:47:09.900 | - You know, it's so funny.
01:47:11.340 | He's asking Joe Rogan, like, is that a joint?
01:47:14.420 | You know, it's like, well, what will happen if I smoke it?
01:47:16.620 | What will happen to the stock price?
01:47:18.360 | What will happen if I scratch myself in public?
01:47:21.280 | What will happen if I say what I think about Thailand
01:47:24.700 | or COVID or who knows what?
01:47:27.260 | And everybody's like, don't say that.
01:47:28.740 | Say this, go do this, go do that.
01:47:30.840 | Well, it's crazy making.
01:47:32.200 | It's absolutely crazy making.
01:47:34.940 | And if you think about what we put people through,
01:47:41.180 | we need to get people who can use FU money,
01:47:44.940 | the FU money they need to insulate themselves
01:47:48.700 | from all of the people who know better.
01:47:50.860 | 'Cause my nightmare is that why did we only get one Elon?
01:47:55.860 | What if we were supposed to have
01:47:57.300 | thousands and thousands of Elons?
01:47:59.500 | And the weird thing is like, this is all that remains.
01:48:03.140 | You're looking at like Obi-Wan and Yoda,
01:48:07.340 | and it's like, this is the only,
01:48:09.060 | this is all that's left after Order 66 has been executed.
01:48:14.060 | And that's the thing that's really upsetting to me
01:48:16.600 | is we used to have Elons five deep.
01:48:19.100 | And then we could talk about Elon
01:48:21.020 | in the context of his cohort.
01:48:23.920 | But this is like, if you were to see a giraffe
01:48:26.740 | in the Arctic with no trees around,
01:48:28.620 | you'd think, why the long neck?
01:48:30.740 | What a strange sight.
01:48:31.960 | - How do we get more Elons?
01:48:34.860 | How do we change the,
01:48:37.100 | so I think that you've,
01:48:39.080 | so we know MIT and Harvard.
01:48:43.140 | So maybe returning to our previous conversation,
01:48:45.800 | my sense is that the Elons of the world
01:48:48.340 | are supposed to come from MIT and Harvard.
01:48:50.340 | - Right.
01:48:51.300 | - And how do you change?
01:48:53.260 | - Let's think of one that MIT sort of killed.
01:48:55.740 | Have any names in mind?
01:48:58.800 | Aaron Swartz leaps to my mind.
01:49:02.900 | - Yeah.
01:49:04.120 | - Are we, MIT, supposed to shield the Aaron Swartzes
01:49:09.120 | from, I don't know, journal publishers?
01:49:13.940 | Or are we supposed to help the journal publishers
01:49:16.520 | so that we can throw 35-year sentences in his face
01:49:19.300 | or whatever it is that we did that depressed him?
01:49:22.040 | Okay, so here's my point.
01:49:24.360 | I want MIT to go back to being the home of Aaron Swartz.
01:49:30.360 | And if you want to send Aaron Swartz to a state
01:49:35.360 | where he's looking at 35 years in prison
01:49:38.280 | or something like that, you are my sworn enemy.
01:49:41.480 | You are not MIT.
01:49:42.700 | - Yeah.
01:49:44.520 | - You are the traitorous,
01:49:46.560 | irresponsible, middle-brow, pencil-pushing,
01:49:53.540 | green-eyeshade fool that needs to not be in the seat
01:49:57.400 | at the presidency of MIT, period, the end,
01:50:00.140 | get the fuck out of there and let one of our people
01:50:02.840 | sit in that chair.
01:50:04.080 | - And the thing that you've articulated is that
01:50:06.840 | the people in those chairs are not the way they are
01:50:10.840 | because they're evil or somehow morally compromised,
01:50:14.000 | is that it's just that's the distributed nature.
01:50:17.860 | Is that there's some kind of aspect of the system--
01:50:19.720 | - These are people who wed themselves to the system.
01:50:23.580 | They adapt every instinct.
01:50:25.600 | And the fact is that they're not going to be
01:50:29.000 | on Joe Rogan smoking a blunt.
01:50:31.000 | - Let me ask a silly question.
01:50:33.240 | Do you think institutions generally
01:50:34.920 | just tend to become that?
01:50:36.540 | - No, we get some of the institutions.
01:50:40.680 | We get Caltech.
01:50:42.400 | Here's what we're supposed to have.
01:50:43.440 | We're supposed to have Caltech.
01:50:44.800 | We're supposed to have Reed.
01:50:46.800 | We're supposed to have Deep Springs.
01:50:48.640 | We're supposed to have MIT.
01:50:51.200 | We're supposed to have a part of Harvard.
01:50:53.280 | And when the sharp elbow crowd comes after
01:50:56.400 | the sharp mind crowd, we're supposed to break
01:50:59.520 | those sharp elbows and say, "Don't come around here again."
01:51:02.840 | - So what are the weapons that the sharp minds
01:51:04.840 | are supposed to use in our modern day to reclaim MIT?
01:51:09.120 | What is the, what's the future?
01:51:12.160 | - Are you kidding me?
01:51:13.440 | First of all, assume that this is being seen at MIT.
01:51:17.000 | Hey, everybody. - It definitely is.
01:51:18.280 | - Okay.
01:51:19.760 | Hey, everybody.
01:51:21.160 | Try to remember who you are.
01:51:23.260 | You're the guys who put the police car
01:51:25.040 | on top of the Great Dome.
01:51:26.720 | You guys came up with the Great Breast of Knowledge.
01:51:29.080 | You created a Tetris game in the Green Building.
01:51:32.400 | Now, what is your problem?
01:51:35.500 | They killed one of your own.
01:51:37.400 | You should make their life a living hell.
01:51:40.920 | You should be the ones who keep the memory
01:51:43.680 | of Aaron Swartz alive.
01:51:45.160 | And all of those hackers and all of those mutants.
01:51:48.280 | You know, it's like, it's either our place or it isn't.
01:51:54.080 | And if we have to throw 12 more pianos off of the roof,
01:51:59.080 | right, if Harold Edgerton was taking those photographs,
01:52:04.720 | you know, with slow-mo back in the '40s,
01:52:08.180 | if Noam Chomsky is on your faculty,
01:52:13.760 | what the hell is wrong with you kids?
01:52:16.080 | You are the most creative and insightful people
01:52:18.720 | and you can't figure out how to defend Aaron Swartz?
01:52:21.260 | That's on you guys.
01:52:22.520 | - So some of that is giving more power to the young,
01:52:25.160 | like you said, to the brave, to the bold.
01:52:27.560 | - Taking power from the feeble and the middle-brow.
01:52:30.640 | - Yeah, but what is the mechanism?
01:52:32.480 | To me-- - I don't know.
01:52:33.320 | You have some nine-volt batteries?
01:52:35.960 | You have some copper wire?
01:52:37.400 | Do you have a capacitor?
01:52:41.680 | - I tend to believe you have to create an alternative
01:52:44.640 | and make the alternative so much better
01:52:48.000 | that it makes MIT obsolete unless they change.
01:52:52.680 | And that's what forces change.
01:52:54.940 | So as opposed to somehow--
01:52:56.360 | - Okay, so you use projection mapping.
01:52:59.020 | - What's projection mapping?
01:53:00.280 | - Where you take some complicated edifice
01:53:02.480 | and you map all of its planes
01:53:04.280 | and then you actually project some unbelievable graphics,
01:53:07.200 | re-skinning a building, let's say, at night.
01:53:09.280 | - That's right, yeah.
01:53:10.120 | - Okay, so you wanna do some graffiti art with light.
01:53:12.200 | - You basically wanna hack the system.
01:53:13.680 | - No, I'm saying, look, listen to me, Lex.
01:53:16.480 | We're smarter than they are.
01:53:18.680 | And you know what they say?
01:53:20.440 | They say things like, I think we need some geeks.
01:53:24.160 | Get me two PhDs.
01:53:25.680 | - Right.
01:53:27.480 | - You treat PhDs like that, that's a bad move.
01:53:30.880 | 'Cause PhDs are capable.
01:53:33.220 | And we act like our job is to peel grapes for our betters.
01:53:37.620 | - Yeah, that's a strange thing.
01:53:39.440 | You speak about it very eloquently,
01:53:41.560 | is how we treat basically the greatest minds in the world,
01:53:46.480 | which is like at their prime, which is PhD students.
01:53:49.920 | We pay them nothing.
01:53:52.980 | - I'm done with it.
01:53:55.920 | - Yeah.
01:53:56.760 | - Right, we gotta take what's ours.
01:53:58.840 | So take back MIT.
01:54:01.680 | Become ungovernable.
01:54:04.040 | Become ungovernable.
01:54:05.580 | And by the way, when you become ungovernable,
01:54:08.920 | don't do it by throwing food.
01:54:11.800 | Don't do it by pouring salt on the lawn like a jerk.
01:54:16.000 | Do it through brilliance.
01:54:17.520 | Because what you, Caltech and MIT can do,
01:54:20.240 | and maybe Rensselaer Polytechnic or Worcester Polytech,
01:54:23.600 | I don't know, Lehigh.
01:54:25.780 | God damn it, what's wrong with you technical people?
01:54:27.960 | You act like you're a servant class.
01:54:31.040 | - It's unclear to me how you reclaim it,
01:54:33.000 | except with brilliance, like you said.
01:54:35.040 | But to me, the way you reclaim it with brilliance
01:54:38.600 | is to go outside the system.
01:54:40.000 | - Aaron Swartz came from the Elon Musk class.
01:54:42.760 | What you guys gonna do about it?
01:54:45.080 | Right?
01:54:45.920 | The super capable people need to flex,
01:54:50.080 | need to be individual, they need to stop giving away
01:54:52.240 | all their power to a zeitgeist or a community
01:54:55.040 | or this or that.
01:54:56.440 | You're not indoor cats, you're outdoor cats.
01:54:59.020 | Go be outdoor cats.
01:54:59.860 | - Do you think we're gonna see this kind of change happen?
01:55:02.320 | - You were the one asking me before,
01:55:04.000 | what about the World War II generation?
01:55:06.360 | All I'm trying to say is that
01:55:07.280 | there's a technical revolt coming.
01:55:09.700 | Here's, you wanna talk about--
01:55:11.160 | - I'm trying to lead it, right?
01:55:12.400 | I'm trying to see--
01:55:13.520 | - No, you're not trying to lead it.
01:55:14.360 | - I'm trying to get a blueprint here.
01:55:15.800 | - All right, Lex.
01:55:17.240 | How angry are you about our country pretending
01:55:20.480 | that you and I can't actually do technical subjects
01:55:23.900 | so that they need an army of kids coming in
01:55:27.300 | from four countries in Asia?
01:55:29.260 | It's not about the four countries in Asia,
01:55:31.040 | it's not about those kids.
01:55:32.760 | It's about lying about us,
01:55:34.160 | that we don't care enough about science and technology,
01:55:37.020 | that we're incapable of it.
01:55:38.980 | As if we don't have Chinese and Russians
01:55:41.420 | and Koreans and Croatians.
01:55:43.720 | Like, we've got everybody here.
01:55:46.360 | The only reason you're looking outside
01:55:48.440 | is that you wanna hire cheap people
01:55:50.480 | from the family business
01:55:51.520 | because you don't wanna pass the family business on.
01:55:54.480 | And you know what?
01:55:56.120 | You didn't really build the family business.
01:55:58.140 | It's not yours to decide.
01:56:00.400 | You the boomers and you the silent generation,
01:56:02.860 | you did your bit, but you also fouled a lot of stuff up.
01:56:06.260 | And you're custodians.
01:56:08.540 | You are caretakers.
01:56:09.760 | You were supposed to hand something.
01:56:11.400 | What you did instead was to gorge yourself
01:56:14.920 | on cheap foreign labor, which you then held up
01:56:18.080 | as being much more brilliant than your own children,
01:56:20.460 | which was never true.
01:56:21.620 | - See, but I'm trying to understand
01:56:23.840 | how we create a better system without anger,
01:56:26.160 | without revolution.
01:56:27.400 | - Ah.
01:56:28.240 | - Not by kissing and hugs,
01:56:31.040 | but by, I mean, I don't understand within MIT
01:56:36.800 | what the mechanism of building a better MIT is.
01:56:39.900 | - We're not gonna pay Elsevier.
01:56:41.700 | Aaron Swartz was right.
01:56:42.920 | JSTOR is an abomination.
01:56:45.500 | - But why, who within MIT,
01:56:47.340 | who within institutions is going to do that
01:56:50.180 | when, just like you said,
01:56:52.000 | the people who are running the show are more senior.
01:56:54.860 | - I don't get Frank Wilczek to speak out.
01:56:57.320 | - So you're, it's basically individuals that step up.
01:57:00.860 | I mean, one of the surprising things about Elon
01:57:03.020 | is that one person can inspire so much.
01:57:06.660 | He's got academic freedom.
01:57:08.240 | It just comes from money.
01:57:09.480 | - I don't agree with that.
01:57:13.580 | That you think money, okay, so yes, certainly.
01:57:17.380 | - Sorry, and testicles.
01:57:20.040 | - You've, yes, but I think the testicles
01:57:22.060 | is more important than money.
01:57:23.400 | - Right.
01:57:24.240 | - Or guts.
01:57:25.760 | I think, I do agree with you.
01:57:27.280 | You speak about this a lot,
01:57:28.340 | that because the money in the academic institutions
01:57:31.040 | has been so constrained that people are misbehaving
01:57:34.480 | in horrible ways.
01:57:35.920 | - Yes.
01:57:36.760 | - But I don't think that if we reverse that
01:57:39.760 | and give a huge amount of money,
01:57:41.100 | people will all of a sudden behave well.
01:57:42.640 | I think it also takes guts.
01:57:43.880 | - No, you need to give people security.
01:57:46.080 | - Security, yes.
01:57:46.920 | - Like you need to know that you have a job on Monday
01:57:51.120 | when on Friday you say,
01:57:52.560 | "I'm not so sure I really love diversity and inclusion."
01:57:56.160 | And somebody's like, "Wait, what?
01:57:57.880 | "You didn't love diversity?
01:57:59.380 | "We had a statement on diversity and inclusion.
01:58:00.920 | "You wouldn't sign?
01:58:02.060 | "Are you against the inclusion part?
01:58:03.900 | "Or are you against diverse?
01:58:05.040 | "Do you just not like people like you?"
01:58:07.040 | You're like, "Actually, that has nothing to do with anything.
01:58:09.920 | "You're making this into something that it isn't.
01:58:11.960 | "I don't wanna sign your goddamn stupid statement.
01:58:14.640 | "And get out of my lab."
01:58:16.320 | Right, get out of my lab.
01:58:18.000 | It all begins from the middle finger.
01:58:19.920 | Get out of my lab.
01:58:21.300 | The administrators need to find other work.
01:58:25.460 | - Yeah.
01:58:27.320 | Listen, I agree with you,
01:58:28.240 | and I hope to seek your advice and wisdom
01:58:33.040 | as we change this.
01:58:34.800 | Because I'd love to see--
01:58:35.880 | - I will visit you in prison if that's what you're asking.
01:58:39.400 | - I have no, I think prison is great.
01:58:42.000 | You get a lot of reading done,
01:58:43.440 | and good working out.
01:58:46.040 | Well, let me ask,
01:58:48.000 | something I brought up before is the Nietzsche quote of,
01:58:52.780 | "Beware that when fighting monsters,
01:58:54.880 | "you yourself do not become a monster.
01:58:57.020 | "For when you gaze long into the abyss,
01:58:58.840 | "the abyss gazes into you."
01:59:01.220 | Are you worried that your focus on the flaws
01:59:04.520 | in the system that we've just been talking about
01:59:07.480 | has damaged your mind,
01:59:09.320 | or the part of your mind that's able to see the beauty
01:59:12.720 | in the world in the system?
01:59:14.680 | That because you have so sharply been able
01:59:20.080 | to see the flaws in the system,
01:59:22.440 | you can no longer step back and appreciate its beauty?
01:59:25.840 | - Look, I'm the one who's trying to get
01:59:27.720 | the institutions to save themselves
01:59:31.160 | by getting rid of their inhabitants,
01:59:33.040 | but leaving the institution,
01:59:34.540 | like a neutron bomb that removes
01:59:37.320 | the unworkable leadership class,
01:59:40.800 | but leaves the structures.
01:59:42.360 | - So the leadership class is really the problem.
01:59:45.280 | - The leadership class is the problem.
01:59:46.120 | - But the individual, like the professors,
01:59:47.880 | the individual scholars--
01:59:48.720 | - Well, the professors are gonna have to go back
01:59:51.140 | into training to remember how to be professors.
01:59:54.760 | Like people are cowards at the moment,
01:59:56.600 | because if they're not cowards, they're unemployed.
01:59:59.160 | - Yeah, that's one of the disappointing things
02:00:02.640 | I've encountered is, to me, tenure--
02:00:05.240 | - Nobody has tenure now.
02:00:08.200 | - Whether they do or not, they certainly don't have
02:00:14.800 | the kind of character and fortitude
02:00:20.240 | that I was hoping to see.
02:00:22.120 | To me--
02:00:22.960 | - But they'd be gone.
02:00:23.920 | See, you're dreaming about the people
02:00:27.760 | who used to live at MIT.
02:00:32.140 | You're dreaming about the previous inhabitants
02:00:35.420 | of your university.
02:00:37.200 | And if you looked at somebody like,
02:00:39.300 | Isadore Singer is very old, I don't know what state he's in,
02:00:43.500 | but that guy was absolutely the real deal.
02:00:46.540 | And if you look at Noam Chomsky,
02:00:48.680 | tell me that Noam Chomsky has been muzzled, right?
02:00:52.140 | - Yeah.
02:00:53.100 | - Now, what I'm trying to get at is,
02:00:55.300 | you're talking about younger, energetic people,
02:00:58.140 | but those people, like when I say something like,
02:01:00.820 | I'm against, I'm for inclusion and I'm for diversity,
02:01:05.820 | but I'm against diversity and inclusion TM,
02:01:09.900 | like the movement.
02:01:10.800 | Well, I couldn't say that if I was a professor.
02:01:14.780 | Oh my God, he's against our sacred document.
02:01:18.820 | Okay, well, in that kind of a world,
02:01:21.780 | do you wanna know how many things I don't agree with you on?
02:01:24.500 | Like we could go on for days and days and days,
02:01:26.420 | all of the nonsense that you've parroted
02:01:28.620 | inside of the institution.
02:01:30.820 | Any sane person has no need for it.
02:01:33.300 | They have no want or desire.
02:01:35.520 | - Do you think you have to have some patience for nonsense
02:01:41.700 | when many people work together in a system?
02:01:44.240 | - How long has string theory gone on for
02:01:45.900 | and how long have I been patient?
02:01:48.180 | Okay, so you're talking about--
02:01:49.020 | - There's a limit to patience, I imagine.
02:01:50.420 | - You're talking about like 36 years
02:01:52.740 | of modern nonsense in string theory.
02:01:54.940 | - So you can do like eight to 10 years, but not more?
02:01:57.780 | - I can do 40 minutes.
02:02:00.000 | This is 36 years.
02:02:02.260 | - Well, you've done that over two hours already.
02:02:03.860 | - No, but it's-- - I appreciate.
02:02:05.300 | - But it's been 36 years of nonsense
02:02:07.620 | since the anomaly cancellation in string theory.
02:02:10.980 | It's like, what are you talking about, patience?
02:02:13.740 | I mean, Lex, you're not even acting like yourself.
02:02:17.140 | Well, you're trying to stay in the system.
02:02:20.500 | - I'm not trying, I'm not.
02:02:22.060 | I'm trying to see if perhaps,
02:02:25.580 | so my hope is that the system just has a few assholes in it,
02:02:29.580 | which you highlight,
02:02:32.180 | and the fundamentals of the system aren't broken,
02:02:35.580 | because if the fundamentals of the systems are broken,
02:02:38.620 | then I just don't see a way for MIT to succeed.
02:02:41.940 | I don't see how young people take over MIT.
02:02:47.260 | I don't see how--
02:02:48.280 | - By inspiring us.
02:02:51.440 | You know the great part about being at MIT?
02:02:55.420 | Like when you saw the genius in these pranks,
02:02:59.220 | the heart, the irreverence.
02:03:01.420 | We were talking about Tom Lehrer the last time.
02:03:05.700 | Tom Lehrer was as naughty as the day is long, agreed?
02:03:09.300 | - Agreed.
02:03:10.220 | - Was he also a genius?
02:03:11.700 | Was he well-spoken?
02:03:12.720 | Was he highly cultured?
02:03:14.860 | He was so talented, so intellectual,
02:03:17.100 | that he could just make fart jokes morning, noon, and night.
02:03:20.340 | Okay, well, in part, the right to make fart jokes,
02:03:24.140 | the right to, for example, put a functioning phone booth
02:03:27.420 | that was ringing on top of the Great Dome at MIT,
02:03:30.500 | has to do with we are such badasses
02:03:32.420 | that we can actually do this stuff.
02:03:34.860 | Well, don't tell me about it anymore.
02:03:36.700 | Go break the law.
02:03:38.740 | Go break the law in a way that inspires us
02:03:41.080 | and makes us not want to prosecute you.
02:03:43.660 | Break the law in a way that lets us know
02:03:46.500 | that you're calling us out on our bullshit,
02:03:48.020 | that you're filled with love,
02:03:50.260 | and that our technical talent has not gone to sleep,
02:03:54.220 | it's not incapable, and if the idea is
02:03:57.620 | that you're gonna dig a moat around the university
02:04:00.300 | and fill it with tiger sharks, that's awesome,
02:04:04.380 | 'cause I don't know how you're gonna do it,
02:04:05.740 | but if you actually manage to do that,
02:04:07.800 | I'm not gonna prosecute you under a reckless endangerment.
02:04:11.760 | - That's beautifully put.
02:04:15.060 | I hope those, first of all, they'll listen.
02:04:17.780 | I hope young people at MIT will take over
02:04:19.860 | in this kind of way.
02:04:21.200 | In the introduction to your podcast episode
02:04:24.540 | on Jeffrey Epstein, you give to me a really moving story,
02:04:29.540 | but unfortunately for me, too brief,
02:04:34.220 | about your experience with a therapist
02:04:37.060 | and a lasting terror that permeated your mind.
02:04:39.380 | Can you go there?
02:04:43.980 | Can you tell?
02:04:45.140 | - I don't think so.
02:04:45.980 | I mean, I appreciate what you're saying.
02:04:47.740 | I said it obliquely.
02:04:48.860 | I said enough.
02:04:49.740 | There are bad people who cross our paths,
02:04:53.980 | and the current vogue is to say, oh, I'm a survivor.
02:04:58.980 | I'm a victim.
02:05:02.860 | I can do anything I want.
02:05:04.460 | This is a broken person, and I don't know why
02:05:08.440 | I was sent to a broken person as a kid,
02:05:11.060 | and to be honest with you, I also felt like in that story,
02:05:14.120 | I say that I was able to say no,
02:05:17.360 | and this was like the entire weight of authority,
02:05:19.820 | and he was misusing his position,
02:05:23.720 | and I was also able to say no.
02:05:26.740 | What I couldn't say no to was having him
02:05:30.780 | re-inflicted in my life.
02:05:32.340 | - All right, so you were sent back a second time.
02:05:36.140 | - I tried to complain about what had happened,
02:05:38.140 | and I tried to do it in a way that did not
02:05:40.180 | immediately cause horrific consequences
02:05:45.680 | to both this person and myself,
02:05:47.300 | because we don't have the tools
02:05:49.740 | to deal with sexual misbehavior.
02:05:55.300 | We have nuclear weapons.
02:05:57.240 | We don't have any way of saying
02:05:59.440 | this is probably not a good place or a role for you
02:06:03.540 | at this moment as an authority figure,
02:06:06.900 | and something needs to be worked on,
02:06:08.660 | so in general, when we see somebody
02:06:10.800 | who is misbehaving in that way,
02:06:13.940 | our immediate instinct is to treat the person as Satan,
02:06:18.940 | and we understand why.
02:06:22.940 | We don't want our children to be at risk.
02:06:25.660 | Now, I personally believe that I fell down on the job
02:06:31.860 | and did not call out the Jeffrey Epstein thing early enough
02:06:35.140 | because I was terrified of what Jeffrey Epstein represents,
02:06:38.020 | and thus recapitulated the old terror
02:06:41.020 | trying to tell the world this therapist
02:06:43.500 | is out of control, and when I said that,
02:06:47.220 | the world responded by saying,
02:06:48.820 | well, you have two appointments booked,
02:06:50.420 | and you have to go for the second one,
02:06:52.420 | so I got re-inflicted into this office
02:06:56.100 | on this person who was now convinced
02:06:57.460 | that I was about to tear down his career and his reputation
02:06:59.780 | and might have been on the verge of suicide for all I know.
02:07:01.700 | I don't know, but he was very, very angry,
02:07:05.060 | and he was furious with me that I had breached
02:07:07.380 | a sacred confidence of his office.
02:07:11.020 | - What kind of ripple effects has that had
02:07:14.060 | through the rest of your life,
02:07:16.180 | the absurdity and the cruelty of that?
02:07:20.180 | I mean, there's no sense to it.
02:07:21.740 | - Well, see, this is the thing
02:07:24.940 | people don't really grasp, I think.
02:07:26.880 | There's an academic who I got to know many years ago
02:07:32.700 | named Jennifer Freyd, who has a theory of betrayal,
02:07:39.140 | what she calls institutional betrayal,
02:07:41.300 | and her gambit is that when you were betrayed
02:07:43.760 | by an institution that is sort of like a fiduciary
02:07:47.460 | or a parental obligation to take care of you,
02:07:50.660 | that you find yourself in a far different situation
02:07:55.660 | with respect to trauma than if you were betrayed
02:07:58.220 | by somebody who's a peer,
02:08:00.560 | and so I think that in my situation,
02:08:07.440 | I kind of repeat a particular dynamic with authority.
02:08:12.440 | I come in not following all the rules,
02:08:17.140 | trying to do some things, not trying to do others,
02:08:20.320 | blah, blah, blah, and then I get into
02:08:23.580 | a weird relationship with authority,
02:08:25.480 | and so I have more experience with what I would call
02:08:28.040 | institutional betrayal.
02:08:29.480 | Now, the funny part about it is that
02:08:32.880 | when you don't have masks or PPE
02:08:36.040 | in an influenza-like pandemic,
02:08:39.680 | and you're missing ICU beds and ventilators,
02:08:42.400 | that is ubiquitous institutional betrayal,
02:08:46.840 | so I believe that in a weird way, I was very early,
02:08:50.400 | the idea of, and this is like the really hard concept,
02:08:54.400 | pervasive or otherwise universal institutional betrayal,
02:08:59.660 | where all of the institutions,
02:09:01.180 | you can count on any hospital to not charge you properly
02:09:04.840 | for what their services are.
02:09:06.880 | You can count on no pharmaceutical company
02:09:09.500 | to produce the drug that will be maximally beneficial
02:09:12.560 | to the people who take it.
02:09:14.440 | You know that your financial professionals
02:09:16.960 | are not simply working in your best interest,
02:09:19.760 | and that issue had to do with the way
02:09:22.440 | in which growth left our system,
02:09:25.200 | so I think that the weird thing is
02:09:26.920 | that this first institutional betrayal by a therapist
02:09:30.600 | left me very open to the idea of,
02:09:32.800 | okay, well, maybe the schools are bad,
02:09:34.480 | maybe the hospitals are bad,
02:09:35.760 | maybe the drug companies are bad,
02:09:37.100 | maybe our food is off,
02:09:38.760 | maybe our journalists are not serving journalistic ends,
02:09:41.920 | and that was what allowed me
02:09:43.880 | to sort of go all the distance and say,
02:09:46.560 | huh, I wonder if our problem is that something
02:09:49.540 | is causing all of our sense-making institutions to be off.
02:09:54.200 | That was the big insight,
02:09:55.320 | and tying that to a single ideology,
02:09:59.280 | what if it's just about growth?
02:10:00.640 | They were all built on growth,
02:10:02.040 | and now we've promoted people who are capable
02:10:05.120 | of keeping quiet that their institutions aren't working.
02:10:08.600 | So the privileged, silent aristocracy,
02:10:13.000 | the people who can be counted upon,
02:10:14.960 | not to mention a fire when a raging fire
02:10:17.280 | is tearing through a building.
02:10:18.780 | - But nevertheless,
02:10:21.360 | it's how big of a psychological burden is that?
02:10:25.520 | - It's huge, it's terrible, it's crushing.
02:10:27.840 | It's very-- - It's very comforting
02:10:30.000 | to be the parental, I mean, I don't know,
02:10:34.280 | I treasure, I mean, we were just talking about MIT.
02:10:38.200 | I can intellectualize and agree
02:10:41.080 | with everything you're saying,
02:10:42.320 | but there's a comfort, a warm blanket
02:10:44.560 | of being within the institution.
02:10:46.920 | - And up until Aaron Swartz, let's say.
02:10:50.080 | In other words, now, if I look at the provost
02:10:53.840 | and the president as mommy and daddy,
02:10:56.120 | you did what to my big brother?
02:10:59.040 | (sighs)
02:11:00.440 | You did what to our family?
02:11:03.440 | You sold us out in which way?
02:11:05.300 | What secrets left for China?
02:11:08.880 | You hired which workforce?
02:11:10.680 | You did what to my wages?
02:11:13.040 | You took this portion of my grant for what purpose?
02:11:15.600 | You just stole my retirement through a fringe rate.
02:11:18.240 | What did you do?
02:11:19.600 | - But can you still, I mean,
02:11:21.640 | the thing is about this view you have
02:11:25.480 | is it often turns out to be sadly correct.
02:11:28.360 | - But this is the thing.
02:11:29.880 | - But let me just, in this silly hopeful thing,
02:11:34.400 | do you still have hope in institutions?
02:11:36.800 | Can you within your, psychologically?
02:11:39.120 | - Yes.
02:11:39.960 | - I'm referring not intellectually,
02:11:41.480 | because you have to carry this burden,
02:11:43.540 | can you still have a hope within you?
02:11:47.280 | When you sit at home alone,
02:11:49.560 | and as opposed to seeing the darkness
02:11:52.120 | within these institutions, seeing a hope.
02:11:54.520 | - Well, but this is the thing, I want to confront,
02:11:57.500 | not for the purpose of a dust up.
02:12:01.120 | I believe, for example, if you've heard episode 19,
02:12:05.280 | that the best outcome is for Carol Greider
02:12:07.800 | to come forward, as we discussed in episode 19.
02:12:14.280 | - With your brother, Bret Weinstein.
02:12:14.280 | - And say, you know what? - It's a great episode.
02:12:16.040 | - I screwed up.
02:12:17.640 | He did call, he did suggest the experiment.
02:12:20.920 | I didn't understand that it was his theory
02:12:23.060 | that was producing it.
02:12:24.120 | Maybe I was slow to grasp it.
02:12:27.200 | But my bad, and I don't want to pay for this bad
02:12:32.200 | choice on my part, let's say,
02:12:37.160 | for the rest of my career.
02:12:39.400 | I want to own up, and I want to help make sure
02:12:41.600 | that we do what's right with what's left.
02:12:45.160 | - And that's one little case within the institution
02:12:47.600 | that you would like to see made.
02:12:48.720 | - I would like to see MIT very clearly come out
02:12:52.320 | and say, you know, Margot O'Toole was right
02:12:54.480 | when she said David Baltimore's lab here
02:12:56.920 | produced some stuff that was not reproducible
02:13:02.680 | with Thereza Imanishi-Kari's research.
02:13:05.680 | I want to see the courageous people,
02:13:08.840 | I would like to see the Aaron Swartz wing
02:13:11.840 | of the computer science department.
02:13:14.680 | Yeah, wouldn't, no, let's think about it.
02:13:17.480 | Wouldn't that be great if we said, you know,
02:13:19.200 | an injustice was done, and we're gonna right that wrong
02:13:22.880 | just as if this was Alan Turing?
02:13:24.700 | - Which I don't think they've righted that wrong.
02:13:28.800 | - Well, then let's have the Turing-Swartz wing.
02:13:31.360 | - The Turing-Swartz, they're starting
02:13:33.640 | a new college of computing.
02:13:34.840 | Wouldn't it be wonderful to call it the Turing-Swartz?
02:13:37.320 | - I would like to have the Madame Wu wing
02:13:39.480 | of the physics department, and I'd love to have
02:13:41.920 | the Emmy Noether statue in front of the math department.
02:13:45.480 | I mean, like, you want to get excited
02:13:46.800 | about actual diversity and inclusion?
02:13:49.120 | - Yeah.
02:13:49.960 | - Well, let's go with our absolute best people
02:13:51.680 | who never got theirs, 'cause there is structural bigotry.
02:13:54.520 | But if we don't actually start celebrating
02:13:58.840 | the beautiful stuff that we're capable of
02:14:00.840 | when we're handed heroes and we fumble them into the trash,
02:14:04.600 | what the hell, I mean, Lex, this is such nonsense.
02:14:09.600 | Just pulling our head out.
02:14:16.000 | You know, on everyone's cecum should be tattooed,
02:14:19.320 | if you can read this, you're too close.
02:14:21.500 | - Beautifully put, and I'm a dreamer just like you.
02:14:29.120 | So I don't see as much of the darkness genetically
02:14:34.120 | or due to my life experience, but I do share the hope
02:14:39.120 | for MIT, the institution that we care a lot about.
02:14:42.600 | - We both do.
02:14:43.560 | - Yeah, and Harvard, an institution I don't give a damn about,
02:14:46.680 | but you do.
02:14:47.920 | - I love Harvard.
02:14:48.960 | - I'm just kidding.
02:14:49.800 | - I love Harvard, but Harvard and I
02:14:51.760 | have a very difficult relationship,
02:14:53.440 | and part of what, you know, when you love a family
02:14:55.820 | that isn't working, I don't want to trash,
02:14:59.200 | I didn't bring up the name of the president of MIT
02:15:02.560 | during the Aaron Swartz period.
02:15:04.160 | It's not vengeance.
02:15:06.640 | I want the rot cleared out.
02:15:09.040 | I don't need to go after human beings.
02:15:11.640 | - Yeah.
02:15:13.500 | Just like you said, with the DISC formulation,
02:15:16.440 | the individual human beings don't necessarily carry the--
02:15:21.080 | - It's those chairs that are so powerful
02:15:25.160 | in which they sit.
02:15:26.480 | - It's the chairs, not the humans.
02:15:28.240 | - It's not the humans.
02:15:29.480 | - Without naming names, can you tell the story
02:15:35.200 | of your struggle during your time at Harvard?
02:15:39.080 | Maybe in a way that tells the bigger story
02:15:42.640 | of the struggle of young, bright minds
02:15:45.340 | that are trying to come up with big, bold ideas
02:15:49.800 | within the institutions that we're talking about.
02:15:53.320 | - You can start.
02:15:55.200 | I mean, in part, it starts with coffee,
02:16:01.280 | with a couple of Croatians in the math department at MIT,
02:16:09.120 | and we used to talk about music and dance
02:16:14.120 | and math and physics and love and all this kind of stuff
02:16:18.460 | as Eastern Europeans loved to, and I ate it up.
02:16:23.460 | And my friend, Gordana, who was an instructor
02:16:28.080 | in the MIT math department when I was a graduate student
02:16:30.640 | at Harvard, said to me,
02:16:34.000 | and I'm probably gonna do a bad version of her accent.
02:16:36.320 | - There we go.
02:16:37.660 | - Eric, will I see you tomorrow at the secret seminar?
02:16:42.260 | And I said, what secret seminar?
02:16:45.860 | Eric, don't joke.
02:16:48.320 | I said, I'm not used to this style of humor, Gordana.
02:16:53.620 | Eric, the secret seminar that your advisor is running.
02:16:56.900 | I said, what are you talking about?
02:16:59.900 | Ha ha ha ha.
02:17:02.100 | Your advisor is running a secret seminar on this aspect,
02:17:06.400 | I think it was like the Chern-Simons invariant.
02:17:08.780 | Not sure what the topic was again,
02:17:11.860 | but she gave me the room number and the time
02:17:14.780 | and she was like not cracking a smile.
02:17:16.740 | I've never known her to make this kind of a joke.
02:17:19.220 | And I thought this was crazy.
02:17:20.220 | And I wasn't trying to have an advisor.
02:17:22.100 | I didn't want an advisor,
02:17:23.220 | but people said you have to have one, so I took one.
02:17:26.340 | And I went to this room like 15 minutes early
02:17:31.340 | and there was not a soul inside it.
02:17:32.900 | It was outside of the math department.
02:17:36.380 | And it was still in the same building,
02:17:38.560 | the science center at Harvard.
02:17:40.880 | And I sat there and I let five minutes go by,
02:17:43.800 | I let seven minutes go by, 10 minutes go by,
02:17:45.760 | there was nobody.
02:17:47.120 | I thought, okay, so this was all an elaborate joke.
02:17:50.600 | And then like three minutes to the hour,
02:17:52.600 | this graduate student walks in
02:17:55.840 | and like sees me and does a double take.
02:17:58.560 | And then I start to see the professors
02:18:01.060 | in geometry and topology start to file in.
02:18:04.960 | And everybody's like very disconcerted
02:18:08.540 | that I'm in this room.
02:18:09.640 | And finally, the person who was supposed to be my advisor
02:18:15.540 | walks in to the seminar and sees me
02:18:18.900 | and goes white as a ghost.
02:18:20.240 | And I realized that the secret seminar is true,
02:18:26.180 | that the department is conducting a secret seminar
02:18:31.180 | on the exact topic that I'm interested in,
02:18:34.020 | not telling me about it.
02:18:35.960 | And that these are the reindeer games
02:18:38.200 | that the Rudolphs of the department are not invited to.
02:18:41.920 | And so then I realized, okay, I did not understand it.
02:18:45.720 | There's a parallel department.
02:18:47.540 | And that became the beginning of an incredible odyssey
02:18:54.080 | in which I came to understand that the game
02:19:01.600 | that I had been sold about publication,
02:19:05.680 | about blind refereeing, about openness
02:19:10.280 | and scientific transmission of information was all a lie.
02:19:15.280 | I came to understand that at the very top,
02:19:21.320 | there's a second system that's about closed meetings
02:19:25.940 | and private communications and agreements about citation
02:19:31.340 | and publication that the rest of us don't understand.
02:19:35.840 | And that in large measure,
02:19:37.880 | that is the thing that I won't submit to.
02:19:40.920 | And so when you ask me questions like,
02:19:42.680 | well, why wouldn't you feel good about
02:19:45.120 | talking to your critics?
02:19:46.040 | Or why wouldn't you feel, the answer is, oh, you don't know.
02:19:49.080 | Like if you stay in a nice hotel,
02:19:51.180 | you don't realize that there's an entire second structure
02:19:54.680 | inside of that hotel where like there's usually
02:19:57.640 | a worker's cafe in a resort complex
02:20:01.100 | that isn't available to the people
02:20:02.660 | who are staying in the hotel.
02:20:03.800 | And then there are private hallways
02:20:05.500 | inside the same hotel that are parallel structures.
02:20:11.880 | So that's what I found, which was in essence,
02:20:14.900 | just the way you can stay in hotels your whole life
02:20:17.000 | and not realize that inside of every hotel
02:20:19.280 | is a second structure that you're not supposed to see
02:20:21.700 | as the guest.
02:20:23.060 | There is a second structure inside of academics
02:20:26.020 | that behaves totally differently with respect
02:20:28.600 | to how people get dinged, how people get their grants
02:20:31.520 | taken away, how this person comes to have that thing
02:20:35.400 | named after them.
02:20:37.120 | And by pretending that we're not running
02:20:40.800 | a parallel structure, I have no patience for that anymore.
02:20:45.800 | So I got a chance to see how the game,
02:20:48.680 | how hardball is really played at Harvard.
02:20:51.060 | And I'm now eager to play hardball
02:20:56.680 | back with the same people who played hardball with me.
02:20:59.440 | - Let me ask two questions on this.
02:21:02.440 | So one, do you think it's possible,
02:21:06.240 | so I call those people assholes, that's the technical term.
02:21:11.240 | Do you think it's possible that that's just
02:21:13.600 | not the entire system, but a part of the system?
02:21:17.200 | That there's, you can navigate, you can swim in the waters
02:21:22.920 | and find the groups of people who do aspire to--
02:21:26.560 | - The guy who rescued my PhD was one of the people
02:21:30.040 | who filed in to the secret seminar.
02:21:32.760 | - Right, but are there people--
02:21:36.280 | - I'm just trying to say--
02:21:37.120 | - Who are outside of this, right?
02:21:38.120 | - Is he an asshole?
02:21:39.680 | - Well, yes, I was a bad--
02:21:41.680 | - No, but I'm trying to make this point,
02:21:43.320 | which is this isn't my failure to correctly map
02:21:46.840 | these people, it's yours.
02:21:48.880 | You have a simplification that isn't gonna work.
02:21:53.200 | - I think, okay, asshole's the wrong term.
02:21:55.100 | I would say lacking of character.
02:21:59.480 | - What would you have had these people do?
02:22:02.440 | Why did they do this?
02:22:03.580 | Why have a secret seminar?
02:22:05.060 | - I don't understand the exact dynamics
02:22:08.020 | of a secret seminar, but I think the right thing to do
02:22:11.080 | is to see individuals like you.
02:22:14.240 | There might be a reason to have a secret seminar,
02:22:16.800 | but they should detect that an individual like you,
02:22:20.800 | a brilliant mind who's thinking about certain ideas
02:22:24.140 | could be damaged by this.
02:22:25.440 | - I don't think that they see it that way.
02:22:27.840 | The idea is we're going to sneak food
02:22:30.600 | to the children we want to survive.
02:22:32.940 | - Yeah, so that's highly problematic,
02:22:34.920 | and there should be people within that room.
02:22:36.800 | - But I'm trying to say, this is the thing,
02:22:39.080 | the ball that's thrown but won't be caught.
02:22:41.760 | The problem is they know that most of their children
02:22:45.640 | won't survive, and they can't say that.
02:22:51.800 | - I see, sorry to interrupt.
02:22:54.180 | You mean that the fact that the whole system
02:22:57.820 | is underfunded, that they naturally have to pick favorites.
02:23:02.020 | - They live in a world which reached steady state
02:23:05.000 | at some level, let's say, in the early '70s.
02:23:09.620 | And in that world, before that time,
02:23:15.200 | you have a professor like Norman Steenrod,
02:23:17.420 | and you'd have 20 children that is graduate students,
02:23:20.260 | and all of them would go on to be professors,
02:23:21.880 | and all of them would want to have 20 children.
02:23:24.860 | So you start taking higher and higher powers of 20,
02:23:29.100 | and you see that the system could not,
02:23:31.020 | it's not just about money, the system couldn't survive.
02:23:34.460 | So the way it's supposed to work now
02:23:36.840 | is that we should shut down the vast majority
02:23:39.840 | of PhD programs, and we should let the small number
02:23:43.620 | of truly top places populate mostly teaching
02:23:49.260 | and research departments that aren't PhD producing.
02:23:52.940 | We don't want to do that because we use PhD students
02:23:55.900 | as a labor force.
02:23:56.820 | So the whole thing has to do with growth,
02:23:59.980 | resources, dishonesty, and in that world,
02:24:04.420 | you see all of these adaptations to a ruthless world,
02:24:08.500 | the key question is where are we gonna bury
02:24:10.580 | this huge number of bodies of people who don't work out?
02:24:13.420 | So my problem was I wasn't interested in dying.
02:24:18.540 | - So you clearly highlighted there's aspects
02:24:22.060 | of the system that are broken, but as an individual,
02:24:24.620 | is your role to exit the system,
02:24:29.900 | or just acknowledge that it's a game and win it?
02:24:32.540 | - My role is to survive and thrive in the public eye.
02:24:36.400 | In other words, when you have an escapee of the system--
02:24:42.500 | - Like yourself. - Such as,
02:24:44.340 | and that person says, you know, I wasn't exactly finished,
02:24:48.060 | let me show you a bunch of stuff.
02:24:50.380 | Let me show you that the theory of telomeres
02:24:53.740 | never got reported properly.
02:24:55.460 | Let me show you that all of marginal economics
02:24:59.780 | is supposed to be redone with a different version
02:25:01.660 | of the differential calculus.
02:25:02.940 | Let me show you that you didn't understand
02:25:04.900 | the self-dual Yang-Mills equations correctly
02:25:07.420 | in topology and physics because they're in fact
02:25:11.340 | much more broadly found, and it's only the mutations
02:25:16.460 | that happen in special dimensions.
02:25:18.260 | There are lots of things to say,
02:25:20.080 | but this particular group of people,
02:25:23.900 | like if you just take, where are all the Gen X
02:25:27.900 | and millennial university presidents?
02:25:30.020 | - Right. - Okay?
02:25:31.780 | They're all in a holding pattern.
02:25:36.220 | Now, why in this story of telomeres,
02:25:41.220 | was it an older professor and a younger graduate student?
02:25:45.340 | It's this issue of what would be called
02:25:47.900 | interference competition.
02:25:49.940 | So for example, orcas try to drown minke whales
02:25:53.700 | by covering their blowholes so that they suffocate
02:25:56.220 | because the needed resource is air.
02:25:58.700 | Okay, well, what do the universities do?
02:26:01.620 | They try to make sure that you can't be viable,
02:26:05.060 | that you need them, that you need their grants,
02:26:08.020 | you need to be zinged with overhead charges
02:26:13.060 | or fringe rates or all of the games
02:26:15.260 | that the locals love to play.
02:26:17.660 | Well, my point is, okay, what's the cost of this?
02:26:20.540 | How many people died as a result
02:26:22.740 | of these interference competition games?
02:26:25.060 | When you take somebody like Douglas Prasher
02:26:27.980 | who did green fluorescent protein,
02:26:30.140 | and he drives a shuttle bus, right,
02:26:32.780 | 'cause his grant runs out, and he has to give away
02:26:34.960 | all of his research, and all of that research
02:26:36.760 | gets a Nobel Prize, and he gets to drive a shuttle bus
02:26:38.980 | for $35,000 a year.
02:26:40.740 | - What do you mean by died?
02:26:41.700 | Do you mean their career, their dreams, their passions?
02:26:43.580 | - Yeah, the whole, as an academic,
02:26:45.940 | Doug Prasher was dead for a long period of time.
02:26:48.660 | - Okay, so as a person who's escaped the system,
02:26:54.920 | can't you, 'cause you also have in your mind
02:27:00.020 | a powerful theory that may turn out to be useful,
02:27:03.940 | maybe not. - Let's hope.
02:27:06.380 | - Can't you also play the game enough with the children,
02:27:11.100 | so like publish, but also--
02:27:14.500 | - If you told me that this would work,
02:27:16.300 | really what I wanna do, you see,
02:27:18.580 | is I would love to revolutionize a field
02:27:23.020 | with an h-index of zero.
02:27:24.780 | Like we have these proxies that count
02:27:29.560 | how many papers you've written,
02:27:30.900 | how cited are the papers you've written.
02:27:33.060 | All this is nonsense.
02:27:35.620 | - That's interesting, sorry, what do you mean by
02:27:37.540 | a field with an h-index?
02:27:38.940 | So a totally new field.
02:27:40.100 | - H-index somehow counts how many papers you've gotten
02:27:43.340 | that get so many citations.
02:27:44.700 | Let's say h-index undefined.
02:27:48.300 | Like for example, I don't have an advisor for my PhD,
02:27:55.120 | but I have to have an advisor as far as something called
02:27:58.940 | the Math Genealogy Project that tracks
02:28:01.420 | who advised whom down the line.
02:28:07.040 | So I am my own advisor, which sets up a loop, right?
02:28:10.480 | How many students do I have, an infinite number,
02:28:13.120 | or descendants.
02:28:14.800 | They don't want to have that story,
02:28:16.840 | so I have to have formal advisor, Raoul Bott,
02:28:20.000 | and my Wikipedia entry, for example,
02:28:21.840 | says that I was advised by Raoul Bott, which is not true.
02:28:24.680 | So you get fit into a system that says,
02:28:29.160 | well, we have to know what your h-index is,
02:28:30.920 | we have to know, you know, where are you a professor
02:28:34.500 | if you wanna apply for a grant?
02:28:35.600 | It makes all of these assumptions.
02:28:37.920 | What I'm trying to do is in part to show
02:28:40.640 | all of this is nonsense, this is proxy BS
02:28:43.200 | that came up in the institutional setting.
02:28:45.620 | And right now, it's important for those of us
02:28:47.840 | who are still vital, like Elon,
02:28:50.240 | it would be great to have Elon as a professor
02:28:52.080 | of physics and engineering.
02:28:53.600 | - Yeah. - Right?
02:28:54.980 | - It seems ridiculous to say, but--
02:28:57.360 | - No, just as a shot in the arm.
02:29:00.960 | - Yeah. - You know, like,
02:29:02.600 | it'd be great to have Elon at Caltech, even one day a week.
02:29:06.160 | - Yeah. - One day a month.
02:29:07.700 | Okay, well, why can't we be in there?
02:29:10.840 | It's the same reason, well, why can't you be on "The View"?
02:29:13.400 | Why can't you be on "Bill Maher"?
02:29:14.800 | We need to know what you're gonna do
02:29:16.160 | before we take you on the show.
02:29:18.360 | Well, I don't wanna tell you what I'm gonna do.
02:29:20.720 | - Do you think you need to be able to dance
02:29:22.920 | the dance a little bit?
02:29:24.520 | - I can dance the dance fine.
02:29:25.880 | - To be on "The View"? - Oh, come on.
02:29:28.000 | - So you can, yeah, you do, you're not--
02:29:29.760 | - I can do that fine.
02:29:30.800 | - Here's where, the place that it goes south is,
02:29:34.040 | there's like a set of questions that get you
02:29:37.640 | into this more adversarial stuff.
02:29:39.760 | And you've, in fact, asked some of those
02:29:41.520 | more adversarial questions this setting.
02:29:44.440 | And they're not things that are necessarily aggressive,
02:29:46.960 | but they're things that are making assumptions.
02:29:50.080 | - Right. - Right, so when you make,
02:29:51.560 | I have a question, it's like,
02:29:53.680 | Lex, are you avoiding your critics?
02:29:55.660 | You know, it's just like, okay,
02:29:58.160 | well, why did you frame that that way?
02:29:59.520 | Or the next question would be, it's like,
02:30:01.560 | do you think that you should have a special exemption
02:30:04.560 | and that you should have the right to break rules
02:30:06.200 | and everyone else should have to follow them?
02:30:08.080 | Like that question I find enervating.
02:30:10.120 | - Yeah. - It doesn't really come out
02:30:11.360 | of anything meaningful, it's just like,
02:30:12.760 | we feel we're supposed to ask that of the other person
02:30:15.460 | to show that we're not captured by their madness.
02:30:18.320 | That's not the real question you wanna ask me.
02:30:20.320 | If you wanna get really excited about this,
02:30:22.040 | you wanna ask, do you think this thing is right?
02:30:25.600 | Yeah, weirdly, I do.
02:30:27.720 | Do you think that it's going to be
02:30:29.000 | immediately seen to be right?
02:30:30.160 | I don't.
02:30:31.000 | I think it's gonna have an interesting fight
02:30:33.920 | and it's gonna have an interesting evolution.
02:30:35.920 | And, well, what do you hope to do with it
02:30:37.860 | in non-physical terms?
02:30:39.800 | Gosh, I hope it revolutionizes our relationship
02:30:43.920 | well, with people outside of the institutional framework
02:30:47.760 | and it re-injects us into the institutional framework
02:30:50.480 | where we can do the most good
02:30:52.240 | to bring the institutions back to health.
02:30:56.440 | These are positive, uplifting questions.
02:30:58.520 | And if you had Frank Wilczek, you wouldn't say,
02:31:00.960 | Frank, let's be honest, you have done very little
02:31:04.360 | with your life after the original huge show
02:31:08.120 | that you used to break into physics.
02:31:10.480 | We weirdly ask people different questions
02:31:13.120 | based upon how they sit down.
02:31:14.920 | - Yeah, that's very strange, right?
02:31:16.440 | But you have to understand that,
02:31:18.640 | so here's the thing, I get, these days,
02:31:23.600 | a large number of emails from people
02:31:26.040 | with the equivalent of a theory of everything for AGI.
02:31:29.080 | - Yeah.
02:31:30.160 | - And I use my own radar, BS radar, to detect
02:31:34.480 | unfairly, perhaps, whether they're full of shit or not.
02:31:41.320 | - Right.
02:31:42.160 | I love where you're going with this, by the way.
02:31:45.560 | - And, (laughs)
02:31:48.560 | my concern I often think about is
02:31:51.940 | there's elements of brilliance in what people write to me.
02:31:55.040 | And I'm trying to, right now, as you made it clear,
02:32:00.120 | the kind of judgments and assumptions we make,
02:32:03.160 | how am I supposed to deal with you
02:32:05.640 | who are an outsider of the system
02:32:08.320 | and think about what you're doing?
02:32:11.520 | Because my radar is saying you're not full of shit.
02:32:14.380 | - Well, but I'm also not completely outside of the system.
02:32:18.040 | - That's right, you've danced beautifully.
02:32:20.440 | You've actually got all the credibility
02:32:24.220 | that you're supposed to,
02:32:25.260 | all the nice little stamps of approval,
02:32:27.980 | not all, but a large enough amount.
02:32:31.140 | I mean, it's hard to put into words exactly why
02:32:36.220 | you sound, whether your theory turns out to be good or not,
02:32:43.100 | you sound like a special human being.
02:32:47.220 | - I appreciate that, and thank you for your--
02:32:48.540 | - In a good way, right?
02:32:49.380 | - No, no, no.
02:32:50.300 | So, but what am I supposed to do
02:32:52.160 | with that flood of emails from AGI folks?
02:32:54.760 | - Why do I sound different?
02:32:56.760 | - I don't know.
02:32:58.100 | And I would like to systemize that, I don't know.
02:33:01.240 | - Look, when you're talking to people,
02:33:05.020 | you very quickly can surmise,
02:33:08.920 | am I claiming to be a physicist?
02:33:10.360 | No, I say it every turn, I'm not a physicist, right?
02:33:14.720 | When you say something about bundles,
02:33:16.780 | you say, well, can you explain it differently?
02:33:19.200 | I'm pushing around on this area, that lever over there.
02:33:24.200 | I'm trying to find something
02:33:27.160 | that we can play with and engage.
02:33:29.920 | And you know, another thing is
02:33:31.400 | is that I'll say something at scale.
02:33:34.440 | So if I was saying completely wrong things
02:33:36.540 | about bundles on the Joe Rogan program,
02:33:38.720 | you don't think that we wouldn't hear a crushing chorus?
02:33:41.840 | - Yes, absolutely.
02:33:42.680 | - And same thing with geometric unity.
02:33:45.280 | So I put up this video from this Oxford lecture.
02:33:50.180 | I understand that it's not a standard lecture,
02:33:52.480 | but you haven't heard the most brilliant people in the field
02:33:58.240 | say, well, this is obviously nonsense.
02:34:00.900 | They don't know what to make of it.
02:34:02.640 | And they're gonna hide behind,
02:34:05.080 | well, he hasn't said enough details.
02:34:06.520 | Where's the paper?
02:34:07.640 | - Where's the paper?
02:34:08.480 | I've seen the criticism.
02:34:10.560 | I've gotten the same kind of criticism.
02:34:11.920 | I've published a few things
02:34:13.440 | and especially stuff related to Tesla.
02:34:17.220 | We did studies on Tesla vehicles
02:34:20.120 | and the kind of criticism I've gotten
02:34:22.800 | was showed that they're completely--
02:34:24.780 | - Oh, right, like the guy who had Elon Musk
02:34:27.040 | on his program twice
02:34:28.040 | is gonna give us an accurate assessment.
02:34:29.680 | - Yeah, exactly, exactly.
02:34:31.640 | - It's just very low level.
02:34:33.000 | - Like without actually ever addressing the content.
02:34:37.420 | - You know, Lex, I think that in part,
02:34:42.760 | you're trying to solve a puzzle
02:34:44.260 | that isn't really your puzzle.
02:34:45.880 | I think you know that I'm sincere.
02:34:47.560 | You don't know whether the theory is gonna work or not.
02:34:50.680 | And you know that it's not coming out of somebody
02:34:52.960 | who's coming out of left field.
02:34:54.880 | Like the story makes sense.
02:34:56.240 | There's enough that's new and creative and different
02:35:00.040 | in other aspects where you can check me
02:35:02.360 | that your real concern is,
02:35:05.900 | are you really telling me
02:35:07.040 | that when you start breaking the rules,
02:35:08.560 | you see the system for what it is
02:35:10.640 | and it's become really vicious and aggressive?
02:35:13.000 | And the answer is yes.
02:35:14.100 | And I had to break the rules in part
02:35:16.920 | because of learning issues,
02:35:18.240 | because I came into this field
02:35:20.600 | with a totally different set of attributes.
02:35:23.560 | My profile just doesn't look like anybody else's remotely.
02:35:27.000 | But as a result, what that did is it showed me
02:35:29.680 | what is the system true to its own ideals
02:35:32.840 | or does it just follow these weird procedures
02:35:35.080 | and then when you take it off the rails,
02:35:38.300 | it behaves terribly.
02:35:39.640 | And that's really what my story I think does
02:35:43.000 | is it just says, well,
02:35:44.200 | he completely takes the system into new territory
02:35:48.000 | where it's not expecting to have to deal with somebody
02:35:49.880 | with these confusing sets of attributes.
02:35:51.880 | And I think what he's telling us
02:35:54.340 | is he believes it behaves terribly.
02:35:56.340 | Now, if you take somebody with perfect standardized tests
02:36:01.340 | and a winner of math competitions
02:36:04.400 | and you put them in a PhD program,
02:36:07.360 | they're probably gonna be okay.
02:36:09.880 | I'm not saying that the system
02:36:11.760 | breaks down for everybody under all circumstances.
02:36:17.720 | I'm saying when you present the system
02:36:19.860 | with a novel situation,
02:36:21.960 | at the moment it will almost certainly break down
02:36:24.480 | with probability approaching 100%.
02:36:27.800 | - But to me, the painful and the tragic thing
02:36:31.700 | is it, sorry to bring out my motherly instinct,
02:36:36.920 | but it feels like it's too much,
02:36:39.120 | it could be too much of a burden
02:36:40.520 | to exist outside the system.
02:36:42.080 | - Maybe, but-- - Psychologically.
02:36:44.040 | - First of all, I've got a podcast that I kinda like.
02:36:49.040 | I've got amazing friends.
02:36:51.780 | I have a life which has more interesting people
02:36:54.280 | passing through it than I know what to do with.
02:36:56.760 | And they haven't managed to kill me off yet.
02:36:58.720 | So, so far, so good.
02:37:00.080 | - Speaking of which, you host an amazing podcast
02:37:04.440 | that we've mentioned several times,
02:37:05.800 | but should mention over and over, The Portal,
02:37:08.460 | where you somehow manage every single conversation
02:37:13.440 | is a surprise.
02:37:15.480 | You go, I mean, not just the guests,
02:37:18.100 | but just the places you take them,
02:37:20.460 | the kind of ways they become challenging
02:37:23.760 | and how you recover from that.
02:37:25.640 | I mean, it's, there's just,
02:37:28.520 | it's full of genuine human moments.
02:37:30.760 | So I really appreciate what you're,
02:37:32.920 | it's a fun podcast to listen to.
02:37:35.600 | Let me ask some silly questions about it.
02:37:38.080 | What have you learned about conversation,
02:37:41.560 | about human-to-human conversation?
02:37:44.600 | - Well, I have a problem that I haven't solved
02:37:46.120 | on The Portal, which is that in general,
02:37:49.920 | when I ask people questions,
02:37:51.400 | they usually find their deeply grooved answers.
02:37:56.280 | And I'm not so interested
02:37:57.360 | in all of the deeply grooved answers.
02:37:59.000 | And so there's a complaint,
02:38:00.200 | which I'm very sympathetic to, actually,
02:38:02.620 | that I talk over people,
02:38:03.880 | that I won't sit still for the answer.
02:38:05.720 | And I think that that's weirdly sort of correct.
02:38:09.080 | It's not that I'm not interested in hearing other voices.
02:38:12.560 | It's that I'm not interested in hearing the same voice
02:38:15.320 | on my program that I could have gotten on somebody else's.
02:38:18.320 | And I haven't solved that well.
02:38:19.480 | So I've learned that I need a new conversational technique
02:38:23.120 | where I can keep somebody
02:38:24.500 | from finding their comfortable place
02:38:27.320 | and yet not be the voice talking over that person.
02:38:30.120 | - Yeah, it's funny.
02:38:30.960 | I get a sense, like your conversation with Bret,
02:38:33.720 | I can sense you detect that the line he's going down is,
02:38:38.720 | you know how it's gonna end,
02:38:41.080 | and you think it's a useless line,
02:38:43.580 | so you'll just stop it right there
02:38:45.120 | and you take him into the direction
02:38:46.600 | that you think it should go.
02:38:47.960 | But that requires interruption.
02:38:49.440 | - Well, and it does so far.
02:38:51.320 | I haven't found a better way.
02:38:52.440 | I'm looking for a better way.
02:38:53.640 | It's not like I don't hear the problem.
02:38:57.040 | I do hear the problem.
02:38:58.320 | I just, I haven't solved the problem.
02:39:01.440 | And on the Bret episode, I was insufferable.
02:39:06.440 | It was very difficult to listen to.
02:39:08.680 | It was so overbearing.
02:39:10.360 | But on the other hand, I was right.
02:39:12.560 | You know, it's like-- - It's funny.
02:39:13.880 | You keep saying that, but I didn't find it,
02:39:17.400 | maybe because I heard brothers.
02:39:19.000 | Like I heard a big brother.
02:39:20.320 | - Yeah, it was pretty bad.
02:39:21.660 | - Really? - I think so.
02:39:23.160 | - I didn't think it was bad at all.
02:39:24.240 | - Well, a lot of people found it insufferable.
02:39:26.320 | And I think it also has to do with the fact
02:39:28.320 | that this has become a frequent experience.
02:39:31.040 | I have several shows where somebody who I very much admire
02:39:34.040 | and think of as courageous,
02:39:35.720 | you know, I'm talking with them, maybe we're friends,
02:39:39.360 | and they sit down on the show,
02:39:41.440 | and they immediately become this fake person.
02:39:44.320 | Like two seconds in, they're sort of saying,
02:39:47.880 | "Well, I don't wanna be too critical or too harsh.
02:39:50.040 | "I don't wanna name any names.
02:39:51.000 | "I don't wanna this, don't wanna."
02:39:52.400 | He's like, "Okay, I'm gonna put my listeners
02:39:54.660 | "through three hours of you being sweetness and light."
02:39:58.360 | - Yeah.
02:40:00.000 | - Like at least give me some reality,
02:40:02.920 | and then we can decide to shelve the show
02:40:04.720 | and never let it hear the call of freedom
02:40:08.960 | in the bigger world.
02:40:10.240 | - I've seen you break out of that a few times.
02:40:12.720 | I've seen you be successful.
02:40:14.960 | I forgot the guest, but the one
02:40:17.200 | where at the end of the episode,
02:40:22.520 | you had an argument about Brett.
02:40:24.600 | I forgot her name. - Oh, Agnes Callard.
02:40:26.080 | - Yeah, Agnes Callard. - Agnes Callard,
02:40:27.440 | the philosopher at the University of Chicago.
02:40:30.080 | - Yeah, you've continuously broken out of that.
02:40:32.960 | You guys went, you know, you seem pretty genuine.
02:40:38.080 | - I like her.
02:40:39.420 | I'm completely ethically opposed
02:40:41.040 | to what she's ethically for.
02:40:42.440 | (Lex laughing)
02:40:43.400 | - Well, she was great, and she wasn't like,
02:40:45.840 | you're both going hard-- - She's a grownup.
02:40:48.540 | - Yeah, exactly. - And she doesn't care
02:40:49.720 | about her, so she's-- - That was awesome.
02:40:50.840 | - Yeah.
02:40:51.680 | - But you're saying that some people
02:40:54.120 | are difficult to break out that way.
02:40:56.040 | - It's just that, you know, she was bringing
02:40:58.400 | the courage of her conviction.
02:40:59.720 | She was sort of defending the system,
02:41:01.680 | and I thought, wow, that's a pretty indefensible system
02:41:05.800 | that you're defending. - Well, it's great, though,
02:41:08.560 | that she's doing that, isn't it?
02:41:08.560 | - I mean-- - It made for an awesome--
02:41:10.800 | - I think it's very informative for the world.
02:41:13.720 | - Yes, you just hated-- (laughing)
02:41:15.760 | - I just can't stand the idea that somebody says,
02:41:17.960 | well, we don't care who gets paid
02:41:19.680 | or who gets the credit as long as we get the goodies,
02:41:21.800 | 'cause that seems like insane.
02:41:24.280 | - Have you ever been afraid leading into a conversation?
02:41:28.300 | - Garry Kasparov.
02:41:32.320 | (laughing)
02:41:33.480 | - Really?
02:41:35.120 | - By the way, I mean, I'm just a fan taking requests, but--
02:41:39.560 | - I started at the beginning in Russian,
02:41:41.320 | and in fact, I used one word incorrectly.
02:41:44.240 | Is that terrible?
02:41:45.080 | - You know, it was pretty good.
02:41:46.240 | It was pretty good Russian.
02:41:47.440 | What was terrible is I think he complimented you, right?
02:41:50.120 | No, did he compliment you or was that me?
02:41:52.980 | Did he compliment you on your Russian?
02:41:54.840 | - Well, he said almost perfect Russian.
02:41:57.200 | - Yeah, like he was full of shit.
02:41:58.840 | That was not great Russian.
02:42:02.080 | - That was not great Russian.
02:42:03.120 | - That was hard, you tried hard, which is what matters.
02:42:07.160 | - That is so insulting.
02:42:08.840 | - I hope so, but I do hope you continue.
02:42:12.160 | It felt like, I don't know how long it went.
02:42:14.240 | It might've been like a two-hour conversation,
02:42:15.840 | but it felt, I hope it continues.
02:42:18.960 | I feel like you have many conversations with Garry.
02:42:22.000 | I would love to hear, there's certain conversation
02:42:24.580 | I would just love to hear much, much longer.
02:42:27.500 | - He's coming from a very, it's this issue
02:42:30.460 | about needing to overpower people in a very dangerous world,
02:42:33.820 | and so Garry has that need.
02:42:35.680 | - Yeah, he was interrupting you.
02:42:39.180 | It was an interesting dynamic.
02:42:41.220 | It was an interesting dynamic.
02:42:43.100 | - Two Weinsteins going at it.
02:42:44.780 | - I mean, two powerhouse egos, brilliant.
02:42:47.660 | - No, you were just gonna say egos.
02:42:49.180 | Minds, spirits.
02:42:51.460 | You don't have an ego.
02:42:52.300 | You're the most humble person I know.
02:42:54.540 | - Is that true?
02:42:55.380 | - No, that's a complete lie.
02:42:56.900 | Do you think about your own mortality, death?
02:43:01.420 | - Sure.
02:43:02.340 | - Are you afraid of death?
02:43:04.500 | - I released a theory during something
02:43:06.300 | that can kill older people, sure.
02:43:08.120 | - Oh, is there a little bit of a parallel there?
02:43:12.460 | - Of course, of course, I don't want it to die with me.
02:43:15.200 | - What do you hope your legacy is?
02:43:20.500 | - Oh, I hope my legacy is accurate.
02:43:24.400 | I'd like it to rest on my accomplishments
02:43:30.540 | rather than how my community decided to ding me
02:43:33.360 | while I was alive. That would be great.
02:43:35.060 | - What about if it was significantly exaggerated?
02:43:38.340 | - I don't want it.
02:43:39.740 | - You want it to be accurate.
02:43:41.180 | - I've got some pretty terrific stuff,
02:43:45.060 | and whether it works out or doesn't,
02:43:47.500 | I would like it to reflect what I actually was.
02:43:50.380 | I'll settle for accurate.
02:43:54.240 | - What would you say, what is the greatest element
02:43:59.900 | of Eric Weinstein accomplishment in life?
02:44:04.600 | In terms of being accurate, what are you most proud of?
02:44:10.660 | - Trying.
02:44:14.620 | (pages fluttering)
02:44:17.540 | The idea that we were stalled out in the hardest field
02:44:23.220 | at the most difficult juncture,
02:44:25.540 | and that I didn't listen to that voice ever
02:44:31.500 | that said, "Stop, you're hurting yourself,
02:44:36.020 | "you're hurting your family, you're hurting everybody,
02:44:37.500 | "you're embarrassing yourself, you're screwing up.
02:44:39.840 | "You can't do this, you're a failure, you're a fraud.
02:44:43.280 | "Turn back, save yourself."
02:44:45.340 | Like that voice, I didn't ultimately listen to it,
02:44:51.420 | and it was going for 35, 37 years.
02:44:56.140 | Very hard.
02:44:59.400 | - And I hope you never listen to that voice.
02:45:05.900 | - Well-- - That's why
02:45:06.740 | you're an inspiration. - Thank you.
02:45:08.420 | I appreciate that.
02:45:09.260 | I'm just infinitely honored that you would spend time
02:45:14.260 | with me. You've been a mentor to me,
02:45:18.760 | almost a friend. I can't imagine a better person
02:45:22.680 | to talk to in this world,
02:45:23.560 | so thank you so much for talking today.
02:45:24.960 | - I can't wait till we do it again.
02:45:26.280 | Lex, thanks for sticking with me,
02:45:27.960 | and thanks for being the most singular guy
02:45:31.440 | in the podcasting space.
02:45:33.160 | In terms of all of my interviews,
02:45:34.800 | I would say that the last one I did with you,
02:45:37.920 | many people feel was my best,
02:45:40.280 | and it was a non-conventional one.
02:45:42.500 | So whatever it is that you're bringing to the game,
02:45:45.540 | I think everyone's noticing, and keep at it.
02:45:48.180 | - Thank you.
02:45:49.020 | Thanks for listening to this conversation
02:45:51.380 | with Eric Weinstein, and thank you
02:45:53.420 | to our presenting sponsor, Cash App.
02:45:55.580 | Please consider supporting the podcast
02:45:57.260 | by downloading Cash App and using code LEXPODCAST.
02:46:01.300 | If you enjoy this podcast, subscribe on YouTube,
02:46:04.000 | review it with five stars on Apple Podcasts,
02:46:06.420 | support it on Patreon, or simply connect
02:46:08.320 | with me on Twitter @LexFriedman.
02:46:10.740 | And now, let me leave you with some words of wisdom
02:46:14.080 | from Eric Weinstein's first appearance on this podcast.
02:46:17.560 | "Everything is great about war, except all the destruction."
02:46:22.760 | Thank you for listening, and hope to see you next time.
02:46:27.200 | (upbeat music)