
Eric Weinstein: Revolutionary Ideas in Science, Math, and Society | Lex Fridman Podcast #16


Chapters

0:00 Intro
0:44 Who influenced your thinking
9:59 Outtelligence
16:22 OpenAI
18:05 Edward Teller
20:03 Outtelligence
21:22 Building a nightmare machine
23:07 Metrics
23:49 The Great Mystery of Our Time
26:20 My Leading Concern
28:54 Gated Institutional Narrative
32:30 Nuclear Weapons
35:16 Chess
37:22 The Future
40:39 Temporal Dimensions


00:00:00.000 | The following is a conversation with Eric Weinstein.
00:00:03.360 | He's a mathematician, economist, physicist,
00:00:06.000 | and the managing director of Thiel Capital.
00:00:08.680 | He coined the term, and you could say,
00:00:10.880 | is the founder of the intellectual dark web,
00:00:14.320 | which is a loosely assembled group of public intellectuals
00:00:17.280 | that includes Sam Harris, Jordan Peterson, Steven Pinker,
00:00:20.920 | Joe Rogan, Michael Shermer, and a few others.
00:00:24.800 | This conversation is part
00:00:26.080 | of the Artificial Intelligence Podcast at MIT and beyond.
00:00:30.240 | If you enjoy it, subscribe on YouTube, iTunes,
00:00:33.700 | or simply connect with me on Twitter @LexFridman,
00:00:37.080 | spelled F-R-I-D.
00:00:39.400 | And now, here's my conversation with Eric Weinstein.
00:00:43.420 | - Are you nervous about this?
00:00:46.320 | - Scared shitless.
00:00:47.240 | - Okay, (speaking in foreign language)
00:00:50.560 | - You mentioned "Kung Fu Panda"
00:00:51.720 | as one of your favorite movies.
00:00:54.560 | It has the usual profound master-student dynamic going on.
00:00:58.000 | So, who has been a teacher
00:01:01.100 | that significantly influenced the direction
00:01:02.800 | of your thinking and life's work?
00:01:04.900 | So, if you're the Kung Fu Panda, who was your Shifu?
00:01:08.680 | - Oh, that's interesting,
00:01:09.500 | because I didn't see Shifu as being the teacher.
00:01:12.140 | - Who was the teacher?
00:01:13.640 | - Oogway, Master Oogway, the turtle.
00:01:16.500 | - Oh, the turtle, right.
00:01:18.140 | - They only meet twice in the entire film,
00:01:20.800 | and the first conversation sort of doesn't count.
00:01:23.960 | So, the magic of the film, in fact,
00:01:27.540 | its point is that the teaching that really matters
00:01:32.540 | is transferred during a single conversation,
00:01:37.180 | and it's very brief.
00:01:40.040 | And so, who played that role in my life?
00:01:42.660 | I would say either my grandfather,
00:01:48.600 | Harry Rubin, and his wife, Sophie Rubin, my grandmother,
00:01:52.560 | or Tom Lehrer.
00:01:54.040 | - Tom Lehrer?
00:01:55.880 | - Yeah.
00:01:57.200 | - In which way?
00:01:58.120 | - If you give a child Tom Lehrer records,
00:02:01.400 | what you do is you destroy their ability
00:02:04.200 | to be taken over by later malware.
00:02:07.200 | And it's so irreverent, so witty, so clever, so obscene,
00:02:14.160 | that it destroys the ability to lead a normal life
00:02:18.520 | for many people.
00:02:19.340 | So, if I meet somebody who's usually really shifted
00:02:24.340 | from any kind of neurotypical presentation,
00:02:27.120 | I'll often ask them, are you a Tom Lehrer fan?
00:02:30.680 | And the odds that they will respond are quite high.
00:02:34.120 | - Now, Tom Lehrer's Poisoning Pigeons in the Park, Tom Lehrer?
00:02:38.160 | - That's very interesting.
00:02:39.160 | There are a small number of Tom Lehrer songs
00:02:41.400 | that broke into the general population.
00:02:43.800 | Poisoning Pigeons in the Park, the Element Song,
00:02:46.160 | and perhaps The Vatican Rag.
00:02:47.640 | So, when you meet somebody who knows those songs,
00:02:51.440 | but doesn't know--
00:02:52.560 | - Oh, you're judging me right now, aren't you?
00:02:54.640 | - Harshly.
00:02:56.040 | No, but you're Russian, so undoubtedly you know
00:02:58.720 | Nikolai Ivanovich Lobachevsky, that song.
00:03:00.920 | - Yes, yeah.
00:03:01.760 | - So, that was a song about plagiarism
00:03:03.800 | that was in fact plagiarized,
00:03:05.520 | which most people don't know, from Danny Kaye,
00:03:07.760 | where Danny Kaye did a song called
00:03:10.640 | Stanislavsky of the Moscow Arts.
00:03:12.940 | And so, Tom Lehrer did this brilliant job
00:03:16.280 | of plagiarizing a song and making it about plagiarism,
00:03:19.940 | and then making it about this mathematician
00:03:22.500 | who worked in non-Euclidean geometry.
00:03:24.800 | That was like giving heroin to a child.
00:03:27.540 | It was extremely addictive, and eventually led me
00:03:31.360 | to a lot of different places,
00:03:33.540 | one of which may have been a PhD in mathematics.
00:03:36.520 | - And he was also at least a lecturer in mathematics,
00:03:39.580 | I believe, at Harvard, something like that.
00:03:42.160 | - I just had dinner with him, in fact.
00:03:44.160 | When my son turned 13, we didn't tell him,
00:03:48.400 | but his bar mitzvah present was dinner
00:03:52.720 | with his hero, Tom Lehrer.
00:03:54.320 | And Tom Lehrer was 88 years old, sharp as a tack,
00:03:58.780 | irreverent and funny as hell, and just,
00:04:01.860 | you know, there are very few people in this world
00:04:04.560 | that you have to meet while they're still here,
00:04:06.640 | and that was definitely one for our family.
00:04:09.240 | - So that wit is a reflection of intelligence
00:04:14.240 | in some kind of deep way, like where,
00:04:16.400 | that would be a good test of intelligence,
00:04:18.560 | whether you're a Tom Lehrer fan.
00:04:20.200 | So what do you think that is about wit,
00:04:22.740 | about that kind of humor,
00:04:26.200 | ability to see the absurdity in existence?
00:04:29.120 | Do you think that's connected to intelligence,
00:04:31.040 | or are we just two Jews on a mic
00:04:33.000 | that appreciate that kind of humor?
00:04:34.680 | - No, I think that it's absolutely connected to intelligence.
00:04:37.640 | So you can see it, there's a place where Tom Lehrer
00:04:41.400 | decides that he's going to lampoon Gilbert
00:04:44.680 | of Gilbert and Sullivan, and he's going to outdo Gilbert
00:04:47.320 | with clever, meaningless wordplay.
00:04:49.440 | And he has, forget the, well, let's see.
00:04:52.960 | He's doing Clementine as if Gilbert and Sullivan wrote it,
00:04:55.960 | and he says, "That I missed her depressed her
00:04:57.720 | "young sister named Esther, this Esther de Pester
00:04:59.420 | "she tried, now a pestering sister's a festering blister,
00:05:01.560 | "you're best to resist her, say I.
00:05:03.400 | "The sister persisted, the mister resisted,
00:05:04.920 | "I kissed her, all loyalty slipped.
00:05:06.040 | "When she said I could have her, her sister's cadaver
00:05:08.280 | "must surely have turned in its crypt."
00:05:10.520 | That's so dense, it's so insane,
00:05:13.560 | that that's clearly intelligence,
00:05:17.180 | because it's hard to construct something like that.
00:05:19.480 | If I look at my favorite Tom Lehrer lyric,
00:05:23.000 | there's a perfectly absurd one, which is,
00:05:26.780 | "Once all the Germans were warlike and mean,
00:05:28.660 | "but that couldn't happen again.
00:05:30.100 | "We taught them a lesson in 1918,
00:05:31.880 | "and they've hardly bothered us since then."
00:05:34.440 | That is a different kind of intelligence.
00:05:36.840 | You're taking something that is so horrific,
00:05:39.780 | and you're sort of making it palatable and funny,
00:05:43.200 | and demonstrating also just your humanity.
00:05:48.200 | I think the thing that came through,
00:05:50.120 | as Tom Lehrer wrote all of these terrible, horrible lines,
00:05:55.200 | was just what a sensitive and beautiful soul he was,
00:05:58.040 | who was channeling pain through humor and through grace.
00:06:02.840 | I've seen throughout Europe, throughout Russia,
00:06:04.920 | that same kind of humor emerged
00:06:06.160 | from the generation of World War II.
00:06:09.200 | It seemed like that humor is required
00:06:11.440 | to somehow deal with the pain and the suffering
00:06:14.000 | that that war created.
00:06:15.960 | - Well, you do need the environment
00:06:17.880 | to create the broad Slavic soul.
00:06:19.720 | I don't think that many Americans
00:06:23.000 | really appreciate Russian humor,
00:06:27.080 | how you had to joke during the time of,
00:06:31.380 | let's say, Article 58 under Stalin.
00:06:33.840 | You had to be very, very careful.
00:06:36.040 | The concept of a Russian satirical magazine
00:06:38.500 | like "Krokodil" doesn't make sense.
00:06:41.360 | So you have this cross-cultural problem
00:06:44.000 | that there are certain areas of human experience
00:06:48.800 | that it would be better to know nothing about.
00:06:51.160 | And quite unfortunately,
00:06:53.440 | Eastern Europe knows a great deal about them,
00:06:55.640 | which makes the songs of Vladimir Vysotsky so potent.
00:07:00.340 | The prose of Pushkin, whatever it is,
00:07:04.260 | you have to appreciate the depth
00:07:06.580 | of the Eastern European experience.
00:07:08.940 | And I would think that perhaps Americans
00:07:11.500 | knew something like this around the time of the Civil War,
00:07:15.560 | or maybe under slavery and Jim Crow,
00:07:19.660 | or even the harsh tyranny of the coal and steel employers
00:07:24.660 | during the labor wars.
00:07:29.100 | But in general, I would say it's hard for us
00:07:32.220 | to understand and imagine the collective culture
00:07:36.040 | unless we have the system of selective pressures
00:07:38.440 | that, for example, Russians were subjected to.
00:07:41.540 | - Yeah, so if there's one good thing that comes out of war,
00:07:46.700 | it's literature, art, and humor, and music.
00:07:51.700 | - Oh, I don't think so.
00:07:52.560 | I think almost everything is good about war
00:07:54.880 | except for death and destruction.
00:07:57.020 | - Right.
00:07:59.060 | - Without the death, it would bring the romance of it.
00:08:02.660 | The whole thing is nice.
00:08:03.500 | - Well, this is why we're always caught up in war.
00:08:05.700 | We have this very ambiguous relationship to it,
00:08:08.340 | is that it makes life real and pressing and meaningful
00:08:12.780 | and at an unacceptable price,
00:08:15.860 | and the price has never been higher.
00:08:17.780 | - So to jump into AI a little bit,
00:08:22.220 | in one of the conversations you had, or one of the videos,
00:08:27.780 | you described that one of the things AI systems can't do,
00:08:30.940 | and biological systems can,
00:08:32.820 | is self-replicate in the physical world.
00:08:34.900 | - Oh, no, no, no.
00:08:35.740 | - In the physical world.
00:08:39.020 | - Well, yes, the physical robots can't self-replicate.
00:08:42.660 | This is a very tricky point,
00:08:46.860 | which is that the only thing that we've been able to create
00:08:50.780 | that's really complex,
00:08:52.100 | that has an analog of our reproductive system, is software.
00:08:57.300 | But nevertheless, software replicates itself,
00:09:01.860 | if we're speaking strictly for the replication,
00:09:04.060 | in this kind of digital space.
00:09:06.060 | So let me, just to begin, let me ask a question.
00:09:08.660 | Do you see a protective barrier or a gap
00:09:12.380 | between the physical world and the digital world?
00:09:15.220 | - Let's not call it digital.
00:09:16.420 | Let's call it the logical world versus the physical world.
00:09:19.700 | - Why logical?
00:09:21.820 | - Well, because even though we had,
00:09:24.140 | let's say Einstein's brain preserved,
00:09:27.220 | it was meaningless to us as a physical object
00:09:29.820 | because we couldn't do anything
00:09:31.940 | with what was stored in it at a logical level.
00:09:35.540 | And so the idea that something may be stored logically
00:09:38.420 | and that it may be stored physically are not necessarily,
00:09:43.420 | we don't always benefit from synonymizing.
00:09:45.740 | I'm not suggesting that there isn't a material basis
00:09:48.100 | to the logical world,
00:09:49.840 | but that it does warrant identification
00:09:52.980 | with a separate layer that need not invoke logic gates
00:09:57.860 | and zeros and ones.
00:09:59.460 | - And so connecting those two worlds,
00:10:01.260 | so the logical world and the physical world,
00:10:03.340 | or maybe just connecting to the logical world
00:10:06.900 | inside our brain, Einstein's brain,
00:10:09.140 | you mentioned the idea of out, outtelligence.
00:10:14.140 | - Artificial outtelligence.
00:10:15.460 | - Artificial outtelligence.
00:10:16.660 | - Yes, this is the only essay
00:10:18.660 | that John Brockman ever invited me to write
00:10:21.740 | that he refused to publish in Edge.
00:10:23.920 | - Why?
00:10:26.740 | - Well, maybe it wasn't well-written, but I don't know.
00:10:31.100 | - The idea is quite compelling.
00:10:32.460 | It's quite unique and new,
00:10:33.900 | and at least from my standpoint,
00:10:37.200 | maybe you can explain it.
00:10:38.300 | - Sure.
00:10:39.480 | What I was thinking about is why it is that we're waiting
00:10:42.580 | to be terrified by artificial general intelligence
00:10:47.260 | when in fact artificial life is terrifying in and of itself
00:10:52.260 | and it's already here.
00:10:54.500 | So in order to have a system of selective pressures,
00:10:57.780 | you need three distinct elements.
00:11:00.180 | You need variation within a population,
00:11:04.300 | you need heritability, and you need differential success.
00:11:07.440 | So what's really unique,
00:11:10.620 | and I've made this point I think elsewhere,
00:11:14.480 | about software is that if you think about
00:11:17.440 | what humans know how to build, that's impressive.
00:11:19.280 | So I always take a car and I say,
00:11:21.500 | does it have an analog of each
00:11:23.200 | of the physiological systems?
00:11:25.480 | Does it have a skeletal structure?
00:11:26.680 | That's its frame.
00:11:27.840 | Does it have a neurological structure?
00:11:30.200 | It has an onboard computer.
00:11:32.040 | It has a digestive system.
00:11:33.720 | The one thing it doesn't have is a reproductive system.
00:11:38.380 | But if you can call spawn on a process,
00:11:42.920 | effectively you do have a reproductive system.
00:11:45.220 | And that means that you can create something
00:11:49.680 | with variation, heritability, and differential success.
00:11:53.360 | Now, the next step in the chain of thinking was
00:11:56.520 | where do we see inanimate, non-intelligent life
00:12:01.520 | outwitting intelligent life?
00:12:04.320 | And I have two favorite systems
00:12:08.200 | and I try to stay on them so that we don't get distracted.
00:12:11.320 | One of which is the Ofres orchid subspecies,
00:12:15.840 | or subclade, I don't know what to call it.
00:12:18.200 | - Is it a type of flower?
00:12:19.080 | - Yeah, it's a type of flower that mimics the female
00:12:21.940 | of a pollinator species in order to dupe the males
00:12:25.840 | into engaging in what's called pseudocopulation
00:12:29.000 | with the fake female, which is usually represented
00:12:31.840 | by the lowest petal.
00:12:33.680 | And there's also a pheromone component
00:12:35.520 | to fool the males into thinking
00:12:36.880 | they have a mating opportunity.
00:12:37.960 | But the flower doesn't have to give up energy
00:12:40.680 | in the form of nectar as a lure
00:12:42.640 | because it's tricking the males.
00:12:44.240 | The other system is a particular species of mussel,
00:12:50.280 | Lampsilis, in the clear streams of Missouri.
00:12:54.960 | And it fools bass into biting a fleshy lip
00:12:59.900 | that contains its young.
00:13:01.720 | And when the bass see this fleshy lip,
00:13:04.800 | which looks exactly like a species of fish
00:13:07.000 | that the bass like to eat,
00:13:08.880 | the young explode and clamp onto the gills
00:13:12.240 | and parasitize the bass and also use the bass
00:13:15.680 | to redistribute them as they eventually release.
00:13:18.340 | Both of these systems, you have a highly intelligent dupe
00:13:24.440 | being fooled by a lower life form.
00:13:29.180 | And what is sculpting these convincing lures?
00:13:34.560 | It's the intelligence of previously duped targets
00:13:39.560 | for these strategies.
00:13:41.440 | So when the target is smart enough to avoid the strategy,
00:13:44.740 | those weaker mimics fall off.
00:13:49.440 | They have terminal lines.
00:13:51.000 | And only the better ones survive.
00:13:52.680 | So it's an arms race between the target species
00:13:56.560 | that is being parasitized, getting smarter,
00:14:00.860 | and this other less intelligent or non-intelligent object
00:14:05.860 | getting as if smarter.
00:14:08.040 | And so what you see is that artificial general intelligence
00:14:13.780 | is not needed to parasitize us.
00:14:17.100 | It's simply sufficient for us to outwit ourselves.
00:14:22.020 | So you could have a program, let's say,
00:14:24.140 | one of these Nigerian scams that writes letters
00:14:28.180 | and uses whoever sends it Bitcoin
00:14:32.300 | to figure out which aspects of the program should be kept,
00:14:36.740 | which should be varied and thrown away.
00:14:38.780 | And you don't need it to be in any way intelligent
00:14:41.240 | in order to have a really nightmarish scenario
00:14:43.460 | of being parasitized by something
00:14:44.980 | that has no idea what it's doing.
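A minimal sketch of the three ingredients named above: variation, heritability, and differential success operating without any intelligence in the system itself. This is purely illustrative and not from the conversation; the target string and fitness function are stand-ins for real-world feedback such as which letters got a response.

```python
# Illustrative evolutionary loop: nothing here is "intelligent"; the population
# improves only because unsuccessful variants are discarded each generation.
import random

TARGET = "send me your bitcoin"  # hypothetical environment the variants are scored against
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate: str) -> int:
    """Differential success: count positions matching the environment's preference."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.05) -> str:
    """Heritability with variation: child copies the parent, with occasional random changes."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c for c in parent)

def evolve(generations: int = 500, pop_size: int = 100) -> str:
    population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the most successful variants, discard the rest.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 5]
        # Reproduction: refill the population with mutated copies of survivors.
        population = [mutate(random.choice(survivors)) for _ in range(pop_size)]
    return max(population, key=fitness)

if __name__ == "__main__":
    print(evolve())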
00:14:46.620 | - So you phrased a few concepts really eloquently.
00:14:49.300 | So let me try to, as a few directions this goes.
00:14:52.980 | So one, first of all, in the way we write software today,
00:14:56.880 | it's not common that we allow it to self-modify.
00:15:01.100 | - But we do have that ability now.
00:15:02.820 | - We have the ability.
00:15:03.860 | It's-- - Just not common.
00:15:05.100 | - It's just not common.
00:15:06.020 | So your thought is that that is a serious worry
00:15:11.020 | if there becomes a reason--
00:15:15.300 | - But self-modifying code is available now.
00:15:18.420 | - So there's different types of self-modification, right?
00:15:21.020 | There's personalization, your email app,
00:15:24.940 | your Gmail is self-modifying to you
00:15:29.620 | after you log in or whatever, you can think of it that way.
00:15:32.280 | But ultimately, it's central,
00:15:34.100 | all the information is centralized.
00:15:37.740 | But you're thinking of ideas where you're completely,
00:15:40.980 | this is a unique entity operating under selective pressures
00:15:45.580 | and it changes--
00:15:46.900 | - Well, you just, if you think about the fact
00:15:49.540 | that our immune systems don't know
00:15:52.340 | what's coming at them next,
00:15:53.620 | but they have a small set of spanning components.
00:15:57.860 | And if it's a sufficiently expressive system
00:16:00.940 | in that any shape or binding region can be approximated
00:16:05.940 | with the Lego that is present,
00:16:10.120 | then you can have confidence that you don't need to know
00:16:14.940 | what's coming at you because the combinatorics
00:16:17.500 | are sufficient to reach any configuration needed.
00:16:23.340 | - So that's a beautiful thing, well,
00:16:25.860 | terrifying thing to worry about
00:16:27.060 | because it's so within our reach.
00:16:28.700 | - Whenever I suggest these things,
00:16:31.840 | I do always have a concern as to whether or not
00:16:33.980 | I will bring them into being by talking about them.
00:16:36.900 | - So there's this thing from OpenAI,
00:16:39.740 | so next week, I have to talk to the founder of OpenAI,
00:16:44.080 | this idea that their text generation,
00:16:47.940 | the new stuff they have for generating text
00:16:51.620 | is they didn't wanna bring it,
00:16:53.500 | they didn't wanna release it
00:16:54.500 | because they're worried about the--
00:16:57.500 | - I'm delighted to hear that,
00:16:58.880 | but they're going to end up releasing it.
00:17:00.620 | - Yes, so that's the thing, I think talking about it,
00:17:04.140 | well, at least from my end,
00:17:05.220 | I'm more a proponent of technology preventing,
00:17:09.600 | so further innovation preventing
00:17:14.020 | the detrimental effects of innovation.
00:17:16.420 | - Well, we're sort of tumbling down a hill
00:17:20.060 | at accelerating speed,
00:17:22.180 | so whether or not we're proponents or--
00:17:24.860 | - It doesn't really matter.
00:17:25.700 | - It may not matter, but I--
00:17:27.340 | - Well, it may not.
00:17:28.260 | - Well, I do feel that there are people
00:17:29.860 | who've held things back and died poorer
00:17:33.780 | than they might have otherwise been,
00:17:35.300 | we don't even know their names.
00:17:37.400 | I don't think that we should discount the idea
00:17:39.820 | that having the smartest people
00:17:42.380 | showing off how smart they are by what they've developed
00:17:46.620 | may be a terminal process.
00:17:50.660 | I'm very mindful in particular of a beautiful letter
00:17:55.180 | that Edward Teller of all people wrote to Leo Szilard
00:17:58.300 | where Szilard was trying to figure out
00:17:59.540 | how to control the use of atomic weaponry
00:18:02.580 | at the end of World War II,
00:18:03.900 | and Teller rather strangely,
00:18:07.100 | because many of us view him as a monster,
00:18:09.940 | showed some very advanced moral thinking
00:18:13.140 | talking about the slim chance we have for survival
00:18:16.380 | and that the only hope is to make war unthinkable.
00:18:19.460 | I do think that not enough of us feel in our gut
00:18:23.100 | what it is we are playing with
00:18:24.660 | when we are working on technical problems.
00:18:27.180 | And I would recommend to anyone who hasn't seen it,
00:18:29.220 | a movie called "The Bridge on the River Kwai"
00:18:33.140 | about, I believe, captured British POWs
00:18:36.500 | who just in a desire to do a bridge well
00:18:39.960 | end up over-collaborating with their Japanese captors.
00:18:43.060 | - Well, now you're making me question
00:18:46.620 | the unrestricted open discussion of ideas in AI.
00:18:50.240 | - I'm not saying I know the answer.
00:18:52.580 | I'm just saying that I could make a decent case
00:18:55.820 | for either our need to talk about this
00:18:57.740 | and to become technologically focused on containing it
00:19:00.660 | or need to stop talking about this
00:19:02.780 | and try to hope that the relatively small number
00:19:06.960 | of highly adept individuals
00:19:08.560 | who are looking at these problems
00:19:09.860 | is small enough that we should in fact
00:19:12.140 | be talking about how to contain them.
00:19:13.940 | - Well, the way ideas, the way innovation happens,
00:19:16.500 | what new ideas develop, Newton with calculus,
00:19:20.280 | whether if he was silent,
00:19:23.480 | the idea would emerge elsewhere.
00:19:25.820 | Well, in the case of Newton, of course.
00:19:27.700 | But in the case of AI,
00:19:30.660 | how small is the set of individuals
00:19:32.580 | out of which such ideas would arise?
00:19:35.780 | Is it a question? - Well, the idea
00:19:38.580 | is that the researchers we know
00:19:40.540 | and those that we don't know
00:19:41.660 | who may live in countries that don't wish us to know
00:19:43.740 | what level they're currently at
00:19:46.260 | are very disciplined in keeping these things to themselves.
00:19:50.500 | Of course, I will point out
00:19:51.460 | that there's a religious school in Kerala
00:19:55.780 | that developed something very close to the calculus,
00:19:59.220 | certainly in terms of infinite series
00:20:01.020 | in, I guess, religious prayer and rhyme and prose.
00:20:10.340 | So it's not that Newton had any ability to hold that back,
00:20:14.940 | and I don't really believe
00:20:16.420 | that we have an ability to hold it back.
00:20:17.620 | I do think that we could change the proportion
00:20:20.380 | of the time we spend worrying about the effects,
00:20:23.100 | what if we are successful,
00:20:24.300 | rather than simply trying to succeed
00:20:25.780 | and hope that we'll be able to contain things later.
00:20:28.260 | - Beautifully put.
00:20:29.140 | So on the idea of outtelligence,
00:20:31.380 | what form, treading cautiously,
00:20:34.860 | as we've agreed as we tumbled down the hill,
00:20:37.860 | what form-- - Can't stop ourselves, can we?
00:20:40.020 | - We cannot.
00:20:40.860 | What form do you see it taking?
00:20:43.460 | So one example, Facebook, Google,
00:20:47.700 | do want to, I don't know a better word,
00:20:51.540 | you want to influence users to behave a certain way.
00:20:55.780 | And so that's one kind of example of outtelligence,
00:20:59.580 | is systems perhaps modifying the behavior
00:21:02.500 | of these intelligent human beings
00:21:05.660 | in order to sell more product of different kind.
00:21:08.620 | But do you see other examples of this actually emerging in--
00:21:12.860 | - Just take any parasitic system.
00:21:14.620 | Make sure that there's some way in which
00:21:17.860 | that there's differential success,
00:21:20.460 | heritability, and variation.
00:21:24.940 | And those are the magic ingredients.
00:21:27.420 | And if you really wanted to build a nightmare machine,
00:21:29.260 | make sure that the system that expresses the variability
00:21:33.980 | has a spanning set so that it can learn to arbitrary levels
00:21:38.980 | by making it sufficiently expressive.
00:21:41.780 | That's your nightmare.
00:21:43.140 | - So it's your nightmare, but it could also be,
00:21:46.260 | it's a really powerful mechanism by which to create,
00:21:50.220 | well, powerful systems.
00:21:52.260 | So are you more worried about the negative direction
00:21:57.100 | that might go versus the positive?
00:21:59.020 | So you said parasitic, but that doesn't necessarily
00:22:01.700 | need to be what the system converges towards.
00:22:05.060 | It could be, what is it, symbiotic?
00:22:07.100 | - Parasitism, the dividing line between parasitism
00:22:10.620 | and symbiosis is not so clear.
00:22:13.580 | - That's what they tell me about marriage.
00:22:15.060 | I'm still single, so I don't know.
00:22:17.340 | - Well, yeah, we could go into that too, but.
00:22:22.340 | (Lex laughing)
00:22:23.820 | No, I think we have to appreciate,
00:22:27.380 | are you infected by your own mitochondria?
00:22:30.500 | - Right. (laughing)
00:22:32.900 | - Good.
00:22:33.740 | Right? - Yeah.
00:22:35.980 | - So in marriage, you fear the loss of independence,
00:22:38.900 | but even though the American therapeutic community
00:22:42.640 | may be very concerned about codependence,
00:22:45.300 | what's to say that codependence isn't what's necessary
00:22:48.060 | to have a stable relationship in which to raise children
00:22:52.140 | who are maximally K-selected
00:22:54.100 | and require incredible amounts of care
00:22:56.020 | because you have to wait 13 years
00:22:57.340 | before there's any reproductive payout,
00:22:58.940 | and most of us don't want our 13-year-olds having kids.
00:23:01.740 | It's a very tricky situation to analyze,
00:23:04.500 | and I would say that predators and parasites
00:23:09.020 | drive much of our evolution,
00:23:10.820 | and I don't know whether to be angry at them or thank them.
00:23:13.920 | - Well, ultimately, I mean, nobody knows the meaning of life
00:23:17.580 | or what even happiness is, but there is some metrics.
00:23:21.180 | - Oh, they didn't tell you?
00:23:22.260 | - They didn't, that's what all the poetry and books are about.
00:23:27.980 | You know, there is some metrics under which
00:23:29.660 | you can kind of measure how good it is
00:23:32.140 | that these AI systems are roaming about.
00:23:34.900 | So you're more nervous about software
00:23:39.560 | than you are optimistic about ideas of,
00:23:43.860 | yeah, self-replicating large scale.
00:23:45.140 | - I don't think we've really felt where we are.
00:23:48.540 | You know, occasionally we get a wake-up.
00:23:52.340 | 9/11 was so anomalous compared to everything else
00:23:58.060 | we've experienced on American soil
00:24:00.580 | that it came to us as a complete shock
00:24:03.220 | that that was even a possibility.
00:24:04.940 | What it really was was a highly creative
00:24:07.440 | and determined R&D team deep in the bowels of Afghanistan
00:24:12.440 | showing us that we had certain exploits
00:24:16.540 | that we were open to that nobody had chosen to express.
00:24:19.380 | I can think of several of these things
00:24:21.180 | that I don't talk about publicly
00:24:23.180 | that just seem to have to do with
00:24:26.920 | how relatively unimaginative those who wish
00:24:30.820 | to cause havoc and destruction have been up until now.
00:24:33.780 | But the great mystery of our time,
00:24:36.320 | of this particular little era,
00:24:40.120 | is how remarkably stable we've been since 1945
00:24:45.120 | when we demonstrated the ability
00:24:47.240 | to use nuclear weapons in anger.
00:24:50.220 | And we don't know why things like that
00:24:56.760 | haven't happened since then.
00:24:58.320 | We've had several close calls, we've had mistakes,
00:25:00.800 | we've had brinksmanship.
00:25:03.420 | And what's now happened is that we've settled
00:25:05.820 | into a sense that, oh, it'll always be nothing.
00:25:10.720 | It's been so long since something was
00:25:14.240 | at that level of danger
00:25:18.040 | that we've got a wrong idea in our head.
00:25:20.760 | And that's why when I went on the Ben Shapiro show,
00:25:23.000 | I talked about the need to resume
00:25:25.280 | above-ground testing of nuclear devices,
00:25:28.080 | because we have people whose developmental experience
00:25:30.560 | suggests that when, let's say, Donald Trump
00:25:34.000 | and North Korea engage on Twitter,
00:25:37.200 | oh, it's nothing, it's just posturing.
00:25:39.320 | Everybody's just in it for money.
00:25:41.040 | There's a sense that people are in a video game mode
00:25:45.160 | which has been the right call since 1945.
00:25:49.360 | We've been mostly in video game mode.
00:25:51.320 | It's amazing.
00:25:52.460 | - So you're worried about a generation
00:25:54.160 | which has not seen any existential--
00:25:57.000 | - We've lived under it.
00:25:58.640 | You see, you're younger.
00:26:00.240 | I don't know if, and again, you came from Moscow.
00:26:03.940 | There was a TV show called The Day After
00:26:09.320 | that had a huge effect on a generation growing up in the US.
00:26:14.320 | And it talked about what life would be like
00:26:17.760 | after a nuclear exchange.
00:26:20.920 | We have not gone through an embodied experience
00:26:24.480 | collectively where we've thought about this.
00:26:27.400 | And I think it's one of the most irresponsible things
00:26:30.040 | that the elders among us have done,
00:26:32.680 | which is to provide this beautiful garden
00:26:36.280 | in which the thorns are cut off of the rose bushes
00:26:42.260 | and all of the edges are rounded and sanded.
00:26:47.720 | And so people have developed this totally unreal idea
00:26:50.880 | which is everything's going to be just fine.
00:26:53.920 | And do I think that my leading concern is AGI
00:26:57.160 | or my leading concern is thermonuclear exchange
00:27:01.880 | or gene drives or any one of these things?
00:27:04.080 | I don't know.
00:27:05.640 | But I know that our time here
00:27:08.840 | in this very long experiment here is finite
00:27:11.880 | because the toys that we've built are so impressive.
00:27:15.000 | And the wisdom to accompany them has not materialized.
00:27:19.000 | And I think we actually got a wisdom uptick since 1945.
00:27:24.000 | We had a lot of dangerous skilled players
00:27:27.160 | on the world stage who nevertheless,
00:27:29.760 | no matter how bad they were, managed to not embroil us
00:27:33.880 | in something that we couldn't come back from.
00:27:38.160 | - The Cold War.
00:27:39.240 | - Yeah, and the distance from the Cold War.
00:27:41.400 | You know, I'm very mindful of,
00:27:46.520 | there was a Russian tradition actually,
00:27:49.080 | of on your wedding day,
00:27:51.400 | going to visit a memorial to those who gave their lives.
00:27:56.400 | Can you imagine this?
00:27:58.280 | Where on the happiest day of your life,
00:28:00.200 | you go and you pay homage to the people
00:28:03.720 | who fought and died in the Battle of Stalingrad?
00:28:06.480 | I'm not a huge fan of communism, I gotta say.
00:28:11.760 | But there were a couple of things that the Russians did
00:28:15.080 | that were really positive in the Soviet era.
00:28:18.760 | And I think trying to let people know
00:28:21.320 | how serious life actually is,
00:28:23.240 | the Russian model of seriousness
00:28:26.040 | is better than the American model.
00:28:28.400 | - And maybe, like you mentioned,
00:28:30.600 | there was a small echo of that after 9/11.
00:28:33.560 | - We wouldn't let it form.
00:28:36.120 | We talk about 9/11, but it's 9/12
00:28:39.000 | that really moved the needle.
00:28:41.600 | When we were all just there and nobody wanted to speak.
00:28:44.740 | We witnessed something super serious
00:28:48.280 | and we didn't want to run to our computers
00:28:53.280 | and blast out our deep thoughts and our feelings.
00:28:57.880 | And it was profound because we woke up briefly there.
00:29:04.000 | I talk about the gated institutional narrative
00:29:07.320 | that sort of programs our lives.
00:29:09.000 | I've seen it break three times in my life.
00:29:11.880 | One of which was the election of Donald Trump.
00:29:14.980 | Another time was the fall of Lehman Brothers
00:29:17.680 | when everybody who knew that Bear Stearns
00:29:21.360 | wasn't that important knew that Lehman Brothers
00:29:25.040 | meant AIG was next.
00:29:27.240 | And the other one was 9/11.
00:29:29.300 | And so if I'm 53 years old and I only remember three times
00:29:33.520 | that the global narrative was really interrupted,
00:29:37.280 | that tells you how much we've been on top
00:29:39.800 | of developing events.
00:29:43.280 | I mean, we had the Murrah Federal Building explosion,
00:29:45.680 | but it didn't cause the narrative to break.
00:29:47.560 | It wasn't profound enough.
00:29:48.920 | Around 9/12, we started to wake up out of our slumber.
00:29:53.920 | And the powers that be did not want to coming together.
00:29:59.920 | The admonition was go shopping.
00:30:02.620 | - The powers that be, what is that force,
00:30:06.520 | as opposed to blaming individuals?
00:30:07.800 | - We don't know.
00:30:08.800 | - So whatever that--
00:30:10.080 | - Whatever that force is,
00:30:11.680 | there's a component of it that's emergent
00:30:13.440 | and there's a component of it that's deliberate.
00:30:15.640 | So give yourself a portfolio with two components.
00:30:18.620 | Some amount of it is emergent,
00:30:20.120 | but some amount of it is also an understanding
00:30:23.400 | that if people come together,
00:30:25.240 | they become an incredible force.
00:30:27.580 | And what you're seeing right now, I think,
00:30:29.580 | is there are forces that are trying to come together
00:30:34.640 | and there are forces that are trying to push things apart.
00:30:37.780 | And one of them is the globalist narrative
00:30:41.840 | versus the national narrative,
00:30:43.280 | where to the globalist perspective,
00:30:47.400 | the nations are bad things in essence,
00:30:50.080 | that they're temporary, they're nationalistic,
00:30:52.880 | they're jingoistic, it's all negative.
00:30:55.460 | To people in the national, more in the national idiom,
00:30:58.240 | they're saying, "Look, this is where I pay my taxes.
00:31:00.560 | "This is where I do my army service.
00:31:02.500 | "This is where I have a vote.
00:31:04.180 | "This is where I have a passport.
00:31:05.960 | "Who the hell are you to tell me
00:31:07.660 | "that because you've moved into some place
00:31:09.600 | "that you can make money globally,
00:31:11.680 | "that you've chosen to abandon other people
00:31:14.040 | "to whom you have a special and elevated duty?"
00:31:16.840 | And I think that these competing narratives
00:31:19.540 | have been pushing towards the global perspective
00:31:22.080 | from the elite and a larger and larger number
00:31:25.720 | of disenfranchised people are saying,
00:31:27.240 | "Hey, I actually live in a place and I have laws
00:31:30.960 | "and I speak a language, I have a culture,
00:31:33.280 | "and who are you to tell me that because you can profit
00:31:36.580 | "in some faraway land that my obligations
00:31:40.360 | "to my fellow countrymen are so much diminished?"
00:31:43.260 | - So these tensions between nations and so on,
00:31:45.380 | ultimately you see being proud of your country and so on,
00:31:48.620 | which creates potentially the kind of things
00:31:51.780 | that led to wars and so on.
00:31:54.060 | Ultimately it is human nature and it is good for us
00:31:57.340 | for wake-up calls of different kinds.
00:31:59.060 | - Well, I think that these are tensions.
00:32:01.220 | And my point isn't, I mean, nationalism run amok
00:32:05.040 | is a nightmare, and internationalism run amok
00:32:08.520 | is a nightmare.
00:32:09.680 | And the problem is we're trying to push these pendulums
00:32:14.680 | to some place where they're somewhat balanced,
00:32:18.160 | where we have a higher duty of care to those
00:32:22.000 | who share our laws and our citizenship,
00:32:25.880 | but we don't forget our duties of care to the global system.
00:32:30.960 | I would think this is elementary,
00:32:32.720 | but the problem that we're facing concerns the ability
00:32:37.720 | for some to profit by abandoning their obligations
00:32:42.760 | to others within their system.
00:32:45.300 | And that's what we've had for decades.
00:32:48.540 | - You mentioned nuclear weapons.
00:32:50.220 | I was hoping to get answers from you
00:32:51.660 | since one of the many things you've done is economics,
00:32:56.020 | and maybe you can understand human behavior,
00:32:57.660 | why the heck we haven't blown each other up yet.
00:33:01.020 | But okay, so we'll get back--
00:33:02.740 | - I don't know the answer.
00:33:03.580 | - Yeah, it's really important to say
00:33:06.340 | that we really don't know.
00:33:07.660 | - A mild uptick in wisdom.
00:33:09.540 | - A mild uptick in wisdom.
00:33:10.900 | Well, Steven Pinker, who I've talked with,
00:33:13.660 | has a lot of really good ideas about why, but--
00:33:18.060 | - I don't trust his optimism.
00:33:19.500 | - Listen, I'm Russian, so I never trust a guy
00:33:23.980 | who's that optimistic.
00:33:24.820 | - No, no, no, it's just that you're talking
00:33:26.860 | about a guy who's looking at a system
00:33:29.980 | in which more and more of the kinetic energy,
00:33:33.700 | like war, has been turned into potential energy,
00:33:36.660 | like unused nuclear weapons.
00:33:38.540 | - Wow, beautifully put.
00:33:39.380 | - And now I'm looking at that system,
00:33:41.420 | and I'm saying, okay, well,
00:33:42.540 | if you don't have a potential energy term,
00:33:44.200 | then everything's just getting better and better.
00:33:45.940 | - Yeah, wow, that's beautifully put.
00:33:49.140 | Only a physicist could, okay.
00:33:51.300 | - I'm not a physicist.
00:33:52.400 | - Is that a dirty word?
00:33:55.300 | - No, no, I wish I were a physicist.
00:33:57.780 | - Me too, my dad's a physicist.
00:33:59.180 | I'm trying to live up to that,
00:34:00.380 | probably for the rest of my life.
00:34:02.820 | He's probably gonna listen to this too, so.
00:34:05.060 | - He did. - Yeah.
00:34:06.060 | So your friend Sam Harris worries a lot
00:34:10.220 | about the existential threat of AI,
00:34:12.220 | not in the way that you've described, but in the more.
00:34:16.900 | - Well, he hangs out with Elon, I don't know Elon.
00:34:19.060 | (laughing)
00:34:20.260 | - So are you worried about that kind of,
00:34:23.780 | you know, about the,
00:34:26.660 | about either robotic systems
00:34:30.740 | or traditionally defined AI systems
00:34:33.620 | essentially becoming super intelligent,
00:34:35.620 | much more intelligent than human beings,
00:34:37.540 | and getting-- - Well, they already are.
00:34:40.420 | And they're not.
00:34:41.260 | - When seen as a collective, you mean?
00:34:46.100 | - Well, I mean, I can mean all sorts of things,
00:34:48.220 | but certainly many of the things that we thought
00:34:53.020 | were peculiar to general intelligence
00:34:55.780 | do not require general intelligence.
00:34:57.340 | So that's been one of the big awakenings
00:34:59.660 | that you can write a pretty convincing sports story
00:35:04.220 | from stats alone without needing to have watched the game.
00:35:09.220 | So, you know, is it possible to write lively prose
00:35:13.140 | about politics?
00:35:14.180 | Yeah, no, not yet.
00:35:15.300 | So we're sort of all over the map.
00:35:20.340 | One of the things about chess,
00:35:22.700 | there's a question I once asked on Quora
00:35:25.380 | that didn't get a lot of response,
00:35:26.900 | which was what is the greatest brilliancy ever produced
00:35:30.060 | by a computer in a chess game,
00:35:31.420 | which was different than the question
00:35:32.700 | of what is the greatest game ever played.
00:35:35.460 | So if you think about brilliancies,
00:35:36.900 | that's what really animates many of us
00:35:38.620 | to think of chess as an art form.
00:35:40.300 | Those are those moves and combinations
00:35:44.020 | that just show such flair, panache, and soul.
00:35:49.020 | Computers weren't really great at that.
00:35:50.460 | They were great positional monsters.
00:35:52.300 | And recently we've started seeing brilliancies.
00:35:56.940 | - Yeah, a few grandmasters have identified
00:35:59.740 | in AlphaZero's games things that were quite brilliant.
00:36:02.860 | - Yeah, so that's an example of something.
00:36:06.260 | We don't think that that's AGI,
00:36:07.580 | but in a very restricted set of rules like chess,
00:36:11.680 | you're starting to see poetry of a high order.
00:36:15.540 | And so I don't like the idea that we're waiting for AGI.
00:36:21.220 | AGI is sort of slowly infiltrating our lives
00:36:25.900 | in the same way that I don't think a worm should be,
00:36:30.060 | C. elegans shouldn't be treated as non-conscious
00:36:32.820 | because it only has 300 neurons.
00:36:34.500 | Maybe it just has a very low level of consciousness
00:36:37.780 | because we don't understand what these things mean
00:36:39.360 | as they scale up.
00:36:40.740 | So am I worried about this general phenomena?
00:36:43.620 | Sure, but I think that one of the things that's happening
00:36:46.900 | is that a lot of us are fretting about this
00:36:50.180 | in part because of human needs.
00:36:52.420 | We've always been worried about the Golem, right?
00:36:57.260 | - Well, the Golem is the artificially created--
00:36:59.740 | - Life, you know.
00:37:00.700 | - It's like Frankenstein type of character.
00:37:02.740 | - It's a Jewish version.
00:37:04.040 | And Frankenberg, Frankenstein.
00:37:09.040 | - Yeah, that makes sense.
00:37:10.460 | - But we've always been worried
00:37:14.020 | about creating something like this
00:37:16.300 | and it's getting closer and closer.
00:37:18.460 | And there are ways in which
00:37:20.260 | we have to realize that the whole thing
00:37:25.220 | that we've experienced or the context of our lives
00:37:29.180 | is almost certainly coming to an end.
00:37:32.340 | And I don't mean to suggest that we won't survive,
00:37:37.340 | I don't know.
00:37:38.940 | And I don't mean to suggest that it's coming tomorrow.
00:37:41.100 | It could be 300, 500 years.
00:37:43.580 | But there's no plan that I'm aware of
00:37:46.940 | if we have three rocks that we could possibly inhabit
00:37:49.660 | that are sensible within current technological dreams,
00:37:54.660 | the Earth, the Moon, and Mars.
00:37:57.820 | And we have a very competitive civilization
00:38:01.980 | that is still forced into violence
00:38:04.500 | to sort out disputes that cannot be arbitrated.
00:38:07.380 | It is not clear to me that we have a long-term future
00:38:10.460 | until we get to the next stage,
00:38:12.820 | which is to figure out whether or not
00:38:14.580 | the Einsteinian speed limit can be broken.
00:38:17.740 | And that requires our source code.
00:38:19.800 | - Our source code, the stuff in our brains
00:38:23.620 | to figure out, what do you mean by our source code?
00:38:26.380 | - The source code of the context,
00:38:27.820 | whatever it is that produces the quarks,
00:38:29.980 | the electrons, the neutrinos.
00:38:32.180 | - Our source code, I got it.
00:38:33.900 | So this is--
00:38:35.060 | - No, you're talking about stuff
00:38:36.020 | that's written in a higher level language.
00:38:38.140 | - Yeah, yeah, that's right.
00:38:39.140 | You're talking about the low level, the bits.
00:38:42.100 | That's what is currently keeping us here.
00:38:46.540 | We can't even imagine.
00:38:48.660 | We have harebrained schemes for staying
00:38:52.440 | within the Einsteinian speed limit.
00:38:54.320 | Maybe if we could just drug ourselves
00:38:57.300 | and go into a suspended state,
00:38:58.740 | or we could have multiple generations.
00:39:00.380 | I think all that stuff is pretty silly.
00:39:02.780 | But I think it's also pretty silly to imagine
00:39:05.620 | that our wisdom is going to increase
00:39:07.560 | to the point that we can have the toys we have,
00:39:10.420 | and we're not going to use them for 500 years.
00:39:14.020 | - Speaking of Einstein, I had a profound breakthrough
00:39:17.260 | when I realized you're just one letter away from the guy.
00:39:20.340 | - Yeah, but I'm also one letter away from Feinstein.
00:39:23.580 | - It's, well, you get to pick.
00:39:25.180 | (laughing)
00:39:26.180 | Okay, so unified theory.
00:39:28.540 | You know, you've worked, you enjoy the beauty of geometry.
00:39:32.660 | I don't actually know if you enjoy it.
00:39:34.340 | You certainly are quite good at explaining--
00:39:35.860 | - I tremble before it.
00:39:36.820 | - Tremble before it.
00:39:38.900 | If you're religious, that is one of the--
00:39:40.460 | - I don't have to be religious.
00:39:42.100 | It's just so beautiful, you will tremble anyway.
00:39:45.020 | - I mean, I just read Einstein's biography,
00:39:47.380 | and one of the ways, one of the things you've done
00:39:51.300 | is try to explore a unified theory,
00:39:55.140 | talking about a 14-dimensional observer
00:39:57.460 | that has the 4D space-time continuum embedded in it.
00:40:02.020 | I'm just curious how you think,
00:40:05.940 | philosophically, at a high level,
00:40:08.140 | about something more than four dimensions.
00:40:10.740 | How do you try to, what does it make you feel,
00:40:14.900 | talking in the mathematical world about dimensions
00:40:19.240 | that are greater than the ones we can perceive?
00:40:22.160 | Is there something that you take away
00:40:25.060 | that's more than just the math?
00:40:27.020 | - Well, first of all, stick out your tongue at me.
00:40:29.500 | Okay.
00:40:33.740 | Now, on the front of that tongue,
00:40:36.180 | - Yeah?
00:40:37.020 | - There was a sweet receptor.
00:40:38.460 | And next to that were salt receptors on two different sides.
00:40:44.420 | A little bit farther back, there were sour receptors.
00:40:46.620 | And you wouldn't show me the back of your tongue
00:40:48.100 | where your bitter receptor was.
00:40:50.140 | - Show the good side always.
00:40:51.260 | - Okay, but that was four dimensions of taste receptors.
00:40:56.260 | But you also had pain receptors on that tongue,
00:40:58.820 | and probably heat receptors on that tongue.
00:41:01.140 | So let's assume that you had one of each.
00:41:03.140 | That would be six dimensions.
00:41:05.260 | So when you eat something, you eat a slice of pizza,
00:41:07.860 | and it's got some hot pepper on it, maybe some jalapeno.
00:41:14.300 | You're having a six-dimensional experience, dude.
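As an aside on the point being made: a "dimension" here is just an independent quantity you can track, so the bite of pizza above is a single point in a six-dimensional space. The sketch below is purely illustrative; the field names and numbers are invented for the example, not anything from the conversation.

```python
# Illustrative only: the "six-dimensional experience" above as a point in R^6.
# Each coordinate is one independent receptor channel.
from dataclasses import dataclass, astuple

@dataclass
class TasteExperience:
    sweet: float
    salt: float
    sour: float
    bitter: float
    pain: float   # e.g. capsaicin from the jalapeno
    heat: float   # thermal temperature

pizza_with_jalapeno = TasteExperience(
    sweet=0.3, salt=0.7, sour=0.2, bitter=0.1, pain=0.8, heat=0.9
)
# The whole experience is one point in a six-dimensional space:
print(astuple(pizza_with_jalapeno))
```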
00:41:17.600 | - Do you think we overemphasize the value of time
00:41:21.020 | as one of the dimensions, or space?
00:41:24.300 | Well, we certainly overemphasize the value of time,
00:41:26.380 | 'cause we like things to start and end,
00:41:28.140 | or we really don't like things to end, but they seem to.
00:41:30.820 | - Well, what if you flipped one of the spatial dimensions
00:41:33.380 | into being a temporal dimension?
00:41:35.020 | And you and I were to meet in New York City,
00:41:39.300 | and say, "Well, where and when should we meet?"
00:41:42.180 | Say, "How about I'll meet you on 36th and Lexington
00:41:46.420 | "at two in the afternoon and 11 o'clock in the morning?"
00:41:51.260 | That would be very confusing.
00:41:54.660 | - Well, so it's convenient for us to think about time,
00:41:58.620 | you mean?
00:41:59.460 | - We happen to be in a delicious situation
00:42:01.260 | in which we have three dimensions of space and one of time,
00:42:04.260 | and they're woven together in this sort of strange fabric
00:42:07.460 | where we can trade off a little space for a little time,
00:42:09.380 | but we still only have one dimension that is picked out
00:42:12.220 | relative to the other three.
00:42:13.380 | It's very much "Gladys Knight and the Pips."
00:42:15.560 | - So which one developed for who?
00:42:17.940 | Do we develop for these dimensions,
00:42:19.660 | or did the dimensions, or were they always there
00:42:22.740 | and it doesn't--
00:42:23.620 | - Well, do you imagine that there isn't a place
00:42:25.480 | where there are four temporal dimensions,
00:42:27.160 | or two and two of space and time,
00:42:28.780 | or three of time and one of space,
00:42:30.660 | and then would time not be playing the role of space?
00:42:33.820 | Why do you imagine that the sector that you're in
00:42:35.880 | is all that there is?
00:42:36.940 | - Certainly do not, but I can't imagine otherwise.
00:42:40.860 | I mean, I haven't done ayahuasca or any of those drugs,
00:42:44.540 | I'd hope to one day, but--
00:42:46.140 | - Instead of doing ayahuasca,
00:42:47.020 | you could just head over to building two.
00:42:49.500 | - That's where the mathematicians are?
00:42:50.740 | - Yeah, that's where they hang.
00:42:52.060 | - Just to look at some geometry?
00:42:53.380 | - Well, just ask about pseudo-Riemannian geometry,
00:42:55.500 | that's what you're interested in.
00:42:56.740 | (Lex laughs)
00:42:58.140 | - Okay.
00:42:58.980 | - Or you could talk to a shaman and end up in Peru.
00:43:01.420 | - And then some extra money for that trip?
00:43:03.020 | - Yeah, but you won't be able to do any calculations
00:43:04.860 | if that's how you choose to go about it.
00:43:06.420 | - Well, a different kind of calculation.
00:43:08.300 | - So to speak.
00:43:09.140 | - Yeah.
00:43:09.960 | One of my favorite people, Edward Frenkel,
00:43:11.620 | Berkeley professor, author of "Love and Math,"
00:43:13.860 | great title for a book,
00:43:15.020 | said that you were quite a remarkable intellect
00:43:19.780 | to come up with such beautiful original ideas
00:43:22.060 | in terms of the unified theory and so on,
00:43:25.020 | but you were working outside academia.
00:43:28.060 | So one question, in developing ideas
00:43:31.540 | that are truly original, truly interesting,
00:43:33.500 | what's the difference between inside academia
00:43:35.940 | and outside academia when it comes to developing such ideas?
00:43:39.980 | - Oh, it's a terrible choice, terrible choice.
00:43:43.100 | So if you do it inside of academics,
00:43:46.820 | you are forced to constantly
00:43:50.340 | show great loyalty to the consensus
00:43:56.700 | and you distinguish yourself with small,
00:44:00.300 | almost microscopic heresies
00:44:02.900 | to make your reputation in general.
00:44:05.000 | And you have very competent people
00:44:09.180 | and brilliant people who are working together
00:44:11.540 | who form very deep social networks
00:44:16.460 | and have a very high level of behavior,
00:44:21.100 | at least within mathematics
00:44:22.780 | and at least technically within physics,
00:44:25.100 | theoretical physics.
00:44:26.180 | When you go outside, you meet lunatics and crazy people,
00:44:31.380 | madmen, and these are people
00:44:36.100 | who do not usually subscribe to the consensus position
00:44:40.420 | and almost always lose their way.
00:44:42.320 | And the key question is,
00:44:46.780 | will progress likely come from someone
00:44:50.540 | who has miraculously managed to stay within the system
00:44:54.820 | and is able to take on a larger amount of heresy
00:44:57.380 | that is sort of unthinkable,
00:45:00.320 | in which case that will be fascinating?
00:45:04.500 | Or is it more likely that somebody will maintain
00:45:07.940 | a level of discipline from outside of academics
00:45:10.820 | and be able to make use of the freedom
00:45:15.660 | that comes from not having to constantly
00:45:18.540 | affirm your loyalty to the consensus of your field?
00:45:21.700 | - So you've characterized in ways
00:45:23.020 | that academia in this particular sense is declining.
00:45:28.020 | You posted a plot,
00:45:30.260 | the older population of the faculty is getting larger,
00:45:34.380 | the younger is getting smaller and so on.
00:45:37.020 | So which direction of the two are you more hopeful about?
00:45:40.660 | - Well, the baby boomers can't hang on forever.
00:45:43.220 | - Well, first of all, in general true,
00:45:44.580 | and second of all, in academia.
00:45:46.380 | - But that's really what this time is about,
00:45:49.460 | is the baby boomers' control.
00:45:51.460 | We're used to like financial bubbles
00:45:53.460 | that last a few years in length and then pop.
00:45:57.140 | The baby boomer bubble is this really long lived thing.
00:46:01.860 | And all of the ideology,
00:46:03.900 | all of the behavior patterns, the norms,
00:46:07.060 | now for example, string theory
00:46:08.380 | is an almost entirely baby boomer phenomena.
00:46:11.580 | It was something that baby boomers were able to do
00:46:13.960 | because it required a very high level
00:46:16.660 | of mathematical ability.
00:46:20.460 | - You don't think of string theory as an original idea?
00:46:24.820 | - Oh, I mean, it was original to Veneziano,
00:46:26.900 | who probably is older than the baby boomers.
00:46:29.660 | And there are people who are younger than the baby boomers
00:46:31.860 | who are still doing string theory.
00:46:33.300 | And I'm not saying that nothing discovered
00:46:35.220 | within the large string theoretic complex is wrong.
00:46:38.460 | Quite the contrary, a lot of brilliant mathematics
00:46:41.420 | and a lot of the structure of physics
00:46:43.620 | was elucidated by string theorists.
00:46:45.640 | What do I think of the deliverable nature
00:46:49.220 | of this product that will not ship called string theory?
00:46:52.460 | I think that it is largely an affirmative action program
00:46:55.300 | for highly mathematically and geometrically talented
00:46:59.300 | baby boomer physicists so that they can say
00:47:02.980 | that they're working on something
00:47:04.660 | within the constraints of what they will say
00:47:08.740 | is quantum gravity.
00:47:10.420 | Now there are other schemes,
00:47:12.420 | there's like asymptotic safety.
00:47:14.540 | There are other things that you could imagine doing.
00:47:17.100 | I don't think much of any of the major programs,
00:47:20.740 | but to have inflicted this level of loyalty
00:47:25.260 | through a shibboleth, well, surely you don't question X.
00:47:29.540 | Well, I question almost everything in the string program.
00:47:32.780 | And that's why I got out of physics.
00:47:34.220 | When you called me a physicist, it was a great honor.
00:47:37.300 | But the reason I didn't become a physicist
00:47:39.140 | wasn't that I fell in love with mathematics.
00:47:41.260 | It's that I said, wow, in 1984, 1983, I saw the field going mad.
00:47:46.620 | And I saw that mathematics,
00:47:48.740 | which has all sorts of problems, was not going insane.
00:47:52.900 | And so instead of studying things within physics,
00:47:55.500 | I thought it was much safer to study
00:47:57.100 | the same objects within mathematics.
00:47:59.780 | There's a huge price to pay for that.
00:48:01.180 | You lose physical intuition.
00:48:03.420 | But the point is that it wasn't
00:48:05.500 | a North Korean reeducation camp either.
00:48:08.180 | - Are you hopeful about cracking open
00:48:11.220 | the Einstein unified theory in a way
00:48:13.600 | that gives real, true understanding,
00:48:17.140 | uniting everything together
00:48:20.620 | with quantum theory and so on?
00:48:21.860 | - I mean, I'm trying to play this role myself.
00:48:25.420 | To do it to the extent of handing it over
00:48:28.480 | to the more responsible, more professional,
00:48:32.300 | more competent community.
00:48:33.800 | So I think that they're wrong about a great number
00:48:37.740 | of their belief structures.
00:48:39.560 | But I do believe, I mean,
00:48:42.460 | I have a really profound love hate relationship
00:48:45.120 | with this group of people.
00:48:46.800 | - On the physics side? - Oh yeah.
00:48:48.680 | - 'Cause the mathematicians actually
00:48:50.040 | seem to be much more open-minded.
00:48:51.880 | - They are and they aren't.
00:48:54.080 | They're open-minded about anything
00:48:55.560 | that looks like great math.
00:48:56.920 | They'll study something that isn't very important physics,
00:49:00.180 | but if it's beautiful mathematics,
00:49:01.700 | then they'll have, they have great intuition
00:49:04.320 | about these things.
00:49:06.100 | As good as the mathematicians are,
00:49:07.900 | and I might even intellectually at some horsepower level
00:49:10.940 | give them the edge, the theoretical physics community
00:49:15.460 | is bar none the most profound intellectual community
00:49:19.860 | that we have ever created.
00:49:21.900 | It is the number one, there's nobody in second place
00:49:24.980 | as far as I'm concerned.
00:49:25.940 | Look, in their spare time, in their spare time,
00:49:29.140 | they invented molecular biology.
00:49:31.060 | - What was the origin of molecular biology?
00:49:33.100 | You're saying physics?
00:49:33.940 | - Well, somebody like Francis Crick.
00:49:34.860 | I mean, a lot of the early molecular biologists.
00:49:38.620 | - Were physicists? - Yeah, I mean,
00:49:40.180 | you know, Schrodinger wrote, "What is life?"
00:49:42.500 | That was highly inspirational.
00:49:44.420 | I mean, you have to appreciate that there is no community
00:49:49.420 | like the basic research community in theoretical physics.
00:49:54.660 | And it's not something, I'm highly critical of these guys.
00:49:59.380 | I think that they have just wasted decades of time
00:50:04.380 | with a near-religious devotion
00:50:08.340 | to their misconceptualization
00:50:11.020 | of where the problems were in physics.
00:50:13.260 | But this has been the greatest intellectual collapse
00:50:16.740 | ever witnessed within academics.
00:50:18.780 | - You see it as a collapse or just a lull?
00:50:22.780 | - Oh, I'm terrified that we're about to lose the vitality.
00:50:25.940 | We can't afford to pay these people.
00:50:27.900 | We can't afford to give them an accelerator
00:50:31.260 | just to play with in case they find something
00:50:33.480 | at the next energy level.
00:50:35.180 | These people created our economy.
00:50:38.180 | They gave us the Rad Lab and radar.
00:50:41.780 | They gave us two atomic devices to end World War II.
00:50:45.500 | They created the semiconductor and the transistor
00:50:48.380 | to power our economy through Moore's Law.
00:50:51.580 | As a positive externality of particle accelerators,
00:50:54.660 | they created the World Wide Web.
00:50:56.380 | And we have the insolence to say,
00:50:59.940 | "Why should we fund you with our taxpayer dollars?"
00:51:02.380 | No, the question is, are you enjoying your physics dollars?
00:51:08.000 | Right, these guys signed
00:51:09.420 | the world's worst licensing agreement.
00:51:12.380 | And if they simply charged for every time
00:51:16.180 | you used a transistor or a URL,
00:51:19.660 | or enjoyed the peace that they have provided
00:51:21.900 | during this period of time through the terrible weapons
00:51:25.740 | that they developed, or your communications devices,
00:51:29.600 | all of the things that power our economy,
00:51:32.300 | I really think came out of physics,
00:51:33.880 | even to the extent that chemistry came out of physics
00:51:35.700 | and molecular biology came out of physics.
00:51:37.880 | So, first of all, you have to know
00:51:39.940 | that I'm very critical of this community.
00:51:42.440 | Second of all, it is our most important community.
00:51:45.060 | We have neglected it, we've abused it,
00:51:47.540 | we don't take it seriously,
00:51:49.700 | we don't even care to get them to rehab
00:51:52.300 | after a couple of generations of failure.
00:51:54.700 | No one, I think the youngest person
00:51:57.980 | to have really contributed to the standard model
00:52:00.980 | at a theoretical level was born in 1951, right?
00:52:05.620 | Frank Wilczek.
00:52:07.580 | And almost nothing has happened
00:52:10.000 | in theoretical physics after 1973, '74
00:52:14.820 | that sent somebody to Stockholm
00:52:17.020 | for theoretical development that predicted experiment.
00:52:21.600 | So, we have to understand that we are doing this to ourselves.
00:52:24.760 | Now, with that said, these guys have behaved abysmally,
00:52:28.440 | in my opinion, because they haven't owned up
00:52:31.980 | to where they actually are,
00:52:33.380 | what problems they're really facing,
00:52:34.920 | how definite they can actually be.
00:52:37.280 | They haven't shared some of their most brilliant discoveries
00:52:39.760 | which are desperately needed in other fields,
00:52:41.820 | like gauge theory,
00:52:43.120 | which at least the mathematicians can share,
00:52:45.580 | which is an upgrade of the differential calculus
00:52:47.400 | of Newton and Leibniz.
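A quick editorial gloss on what "an upgrade of the differential calculus" means here, in standard textbook notation rather than anything specific to this conversation: the ordinary derivative is replaced by a covariant derivative built from a connection (the gauge field).

```latex
% Ordinary calculus differentiates a field directly: \partial_\mu \psi(x).
% Gauge theory upgrades this to a covariant derivative, adding a connection
% (the gauge field A_\mu) so that differentiation respects a local symmetry:
\[
  D_\mu \psi(x) \;=\; \bigl(\partial_\mu - i\,g\,A_\mu(x)\bigr)\,\psi(x).
\]
% Under a local (U(1), for simplicity) transformation
% \psi(x) \to e^{i\alpha(x)}\,\psi(x) with
% A_\mu(x) \to A_\mu(x) + \tfrac{1}{g}\,\partial_\mu\alpha(x),
% the combination D_\mu\psi transforms exactly like \psi itself,
% which is what the bare \partial_\mu fails to do.
```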
00:52:49.040 | And they haven't shared the importance
00:52:50.560 | of renormalization theory,
00:52:52.840 | even though this should be standard operating procedure
00:52:55.400 | for people across the sciences dealing with different layers
00:52:59.440 | and different levels of phenomena.
00:53:01.040 | - And by shared, you mean communicated in such a way
00:53:03.440 | that it disseminates throughout the different sciences.
00:53:06.800 | - These guys are sitting,
00:53:07.920 | both theoretical physicists and mathematicians
00:53:10.680 | are sitting on top of a giant stockpile
00:53:13.520 | of intellectual gold, right?
00:53:16.320 | They have so many things
00:53:17.660 | that have not been manifested anywhere.
00:53:19.760 | I was just on Twitter, I think I mentioned
00:53:23.400 | the Hoberman Switch Pitch that shows the self-duality
00:53:26.080 | of the tetrahedron realized as a linkage mechanism.
00:53:29.640 | Now, this is like a triviality
00:53:32.080 | and it makes an amazing toy that's built a market,
00:53:36.440 | hopefully a fortune for Chuck Hoberman.
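For readers who want the self-duality he's referring to spelled out (this is standard polyhedral geometry, not anything taken from the toy itself):

```latex
% Taking the dual of a polyhedron swaps vertices and faces and keeps edges.
% For the tetrahedron:
\[
  (V, E, F) = (4, 6, 4)
  \;\xrightarrow{\ \text{dual}\ }\;
  (F, E, V) = (4, 6, 4),
\]
% and the dual solid is again a regular tetrahedron -- the only Platonic
% solid that is its own dual -- which is what the linkage toy animates.
```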
00:53:38.520 | Well, you have no idea how much great stuff
00:53:41.280 | that these priests have in their monastery.
00:53:43.440 | - So it's truly a love and hate relationship for you.
00:53:47.360 | - Yeah, well, look--
00:53:48.200 | - Sounds like it's more on the love side.
00:53:49.240 | - This building that we're in right here
00:53:51.960 | is the building in which I really put together
00:53:54.840 | the conspiracy between the National Academy of Sciences
00:53:57.640 | and the National Science Foundation
00:54:00.040 | through the Government-University-Industry
00:54:01.560 | Research Roundtable to destroy the bargaining power
00:54:04.840 | of American academics using foreign labor.
00:54:08.520 | - With a--
00:54:09.360 | - On microfiche in the basement.
00:54:11.080 | Oh yeah, that was done here in this building.
00:54:13.240 | Isn't that weird?
00:54:14.080 | - And I'm truly speaking with a revolutionary
00:54:16.960 | and a radical--
00:54:18.160 | - No, no, no, no, no, no, no, no, no, no.
00:54:20.120 | At an intellectual level, I am absolutely garden variety.
00:54:25.120 | I'm just straight down the middle.
00:54:27.580 | The system that we are in,
00:54:29.480 | this university is functionally insane.
00:54:34.680 | Harvard is functionally insane.
00:54:36.680 | And we don't understand that when we get these things wrong,
00:54:41.120 | the financial crisis made this very clear.
00:54:43.560 | There was a long period where every grownup,
00:54:46.120 | everybody with a tie who spoke in baritone tones
00:54:51.120 | with the right degree at the end of their name,
00:54:55.520 | was talking about how we had banished volatility.
00:54:59.360 | We were in the great moderation.
00:55:01.000 | Okay, they were all crazy.
00:55:04.040 | And who was right?
00:55:05.200 | It was like Nassim Taleb, Nouriel Roubini.
00:55:08.400 | Now, what happens is that they claimed
00:55:10.720 | the market went crazy, but the market didn't go crazy.
00:55:14.600 | The market had been crazy.
00:55:16.140 | And what happened is that it suddenly went sane.
00:55:19.040 | Well, that's where we are with academics.
00:55:21.120 | Academics right now is mad as a hatter.
00:55:23.860 | And it's absolutely evident.
00:55:25.520 | I can show you graph after graph.
00:55:27.040 | I can show you the internal discussions.
00:55:28.520 | I can show you the conspiracies.
00:55:30.540 | Harvard's dealing with one right now
00:55:32.160 | over its admissions policies for people of color
00:55:36.320 | who happen to come from Asia.
00:55:38.200 | All of this madness is necessary to keep the game going.
00:55:41.880 | What we're talking about,
00:55:43.120 | just while we're on the topic of revolutionaries,
00:55:46.280 | is we're talking about the danger of an outbreak of sanity.
00:55:49.820 | - Yeah, you're the guy pointing out the elephant
00:55:53.560 | in the room here.
00:55:55.360 | - The elephant has no clothes.
00:55:56.960 | (Lex laughing)
00:55:58.200 | - Is that how that goes?
00:55:59.600 | I was gonna talk a little bit to Joe Rogan about this.
00:56:04.600 | We ran out of time.
00:56:06.280 | But I think you have some,
00:56:09.480 | just listening to you,
00:56:11.920 | you could probably speak really eloquently to academia
00:56:14.400 | on the difference between the different fields.
00:56:16.480 | So you think there's a difference
00:56:19.000 | between science, engineering,
00:56:20.560 | and then the humanities, in academia,
00:56:22.440 | in terms of the ideas that they're willing to tolerate?
00:56:25.760 | So from my perspective, I thought,
00:56:28.780 | computer science and maybe engineering
00:56:32.060 | are more tolerant of radical ideas,
00:56:34.740 | but that's perhaps innocent of me.
00:56:36.900 | 'Cause I always, all the battles going on now
00:56:39.660 | are a little bit more on the humanities side
00:56:41.420 | and gender studies and so on.
00:56:43.180 | - Have you seen the American Mathematical Society's
00:56:46.540 | publication of an essay called "Get Out the Way"?
00:56:49.460 | - I have not.
00:56:50.580 | What's the--
00:56:51.420 | - The idea is that white men who hold positions
00:56:56.300 | within universities in mathematics
00:56:58.020 | should vacate their positions
00:56:59.460 | so that young black women can take over
00:57:02.500 | or something like this.
00:57:03.820 | - That's in terms of diversity,
00:57:04.860 | which I also wanna ask you about,
00:57:06.100 | but in terms of diversity of strictly ideas.
00:57:10.140 | - Oh, sure.
00:57:10.980 | - Do you think, 'cause you're basically saying physics
00:57:14.220 | as a community has become a little bit intolerant
00:57:16.580 | to some degree to new radical ideas.
00:57:20.340 | Or at least you said--
00:57:21.180 | - Well, it's changed a little bit recently,
00:57:24.180 | which is that even string theory is now admitting,
00:57:28.100 | okay, we don't, this doesn't look very promising
00:57:30.900 | in the short term, right?
00:57:32.820 | So the question is what compiles,
00:57:35.980 | if you wanna take the computer science metaphor,
00:57:39.580 | what will get you into a journal?
00:57:41.900 | Will you spend your life trying to push some paper
00:57:44.260 | into a journal or will it be accepted easily?
00:57:47.440 | What do we know about the characteristics of the submitter?
00:57:51.940 | And what gets taken up and what does not?
00:57:55.420 | All of these fields are experiencing pressure
00:57:58.660 | because no field is performing so brilliantly well
00:58:02.180 | that it's revolutionizing our way of speaking and thinking
00:58:08.620 | in the ways to which we've become accustomed.
00:58:12.820 | - But don't you think, even in theoretical physics,
00:58:15.880 | a lot of times, even with theories like string theory,
00:58:19.860 | you could speak to this, it does eventually lead to
00:58:22.860 | what are the ways that this theory would be testable?
00:58:25.500 | - So ultimately, although, look,
00:58:28.740 | there's this thing about Popper and the scientific method
00:58:32.020 | that's a cancer and a disease
00:58:34.020 | in the minds of very smart people.
00:58:36.260 | That's not really how most of the stuff gets worked out,
00:58:39.780 | it's how it gets checked.
00:58:41.180 | And there is a dialogue between theory and experiment.
00:58:45.500 | But everybody should read Paul Dirac's 1963
00:58:49.900 | Scientific American article where he,
00:58:55.940 | it's very interesting, he talks about it
00:58:58.620 | as if it was about the Schrodinger equation
00:59:00.580 | and Schrodinger's failure to advance his own work
00:59:03.780 | because of his failure to account for some phenomenon.
00:59:06.260 | The key point is that if your theory is a slight bit off,
00:59:08.740 | it won't agree with experiment,
00:59:10.260 | but it doesn't mean that the theory is actually wrong.
00:59:13.500 | But Dirac could as easily have been talking
00:59:15.700 | about his own equation in which he predicted
00:59:18.740 | that the electron should have an antiparticle.
00:59:22.020 | And since the only positively charged particle
00:59:24.580 | that was known at the time was the proton,
00:59:26.740 | Heisenberg pointed out, well, shouldn't your antiparticle,
00:59:29.500 | the proton, have the same mass as the electron,
00:59:31.620 | and doesn't that invalidate your theory?
00:59:33.540 | So I think that Dirac was actually being quite,
00:59:35.620 | potentially quite sneaky and talking about the fact
00:59:39.340 | that he had been pushed off of his own theory
00:59:41.300 | to some extent by Heisenberg.
00:59:43.780 | But look, we fetishize the scientific method
00:59:47.960 | and Popper and falsification because it protects us
00:59:52.460 | from crazy ideas entering the field.
00:59:55.480 | So it's a question of balancing type one and type two error
00:59:58.420 | and we were pretty maxed out in one direction.
01:00:01.460 | - On the opposite side of that, let me say what comforts me,
01:00:04.140 | sort of in biology or engineering:
01:00:07.620 | at the end of the day, does the thing work?
01:00:10.700 | - Yeah.
01:00:11.540 | - You can test the crazies away.
01:00:14.620 | The crazy, well, see, now you're saying,
01:00:16.700 | but some ideas are truly crazy
01:00:18.180 | and some are actually correct.
01:00:20.780 | - So there's pre-correct, currently crazy.
01:00:24.300 | - Yeah. - Right?
01:00:25.300 | And so you don't wanna get rid of everybody
01:00:27.160 | who's pre-correct and currently crazy.
01:00:29.660 | The problem is that we don't have standards in general
01:00:35.180 | for trying to determine who has to be put to the sword
01:00:38.740 | in terms of their career and who has to be protected
01:00:42.100 | as some sort of giant time suck pain in the ass
01:00:46.300 | who may change everything.
01:00:47.860 | - Do you think that's possible,
01:00:49.220 | creating a mechanism by which those people are selected?
01:00:51.300 | - Well, you're not gonna like the answer, but here it comes.
01:00:53.300 | - Oh, boy.
01:00:55.140 | - It has to do with very human elements.
01:00:59.360 | We're trying to do this at the level of rules and fairness.
01:01:02.500 | It's not gonna work.
01:01:03.620 | 'Cause the only thing that really understands this
01:01:08.660 | is the rules.
01:01:09.980 | - You ever read "The Double Helix"?
01:01:12.220 | - It's a book?
01:01:13.260 | - Oh, you have to read this book.
01:01:16.000 | Not only did Jim Watson half discover
01:01:19.820 | this three-dimensional structure of DNA,
01:01:21.940 | he was also one hell of a writer before he became an ass.
01:01:24.940 | No, he's tried to destroy his own reputation.
01:01:29.620 | - I knew about the ass,
01:01:30.460 | I didn't know about the good writer.
01:01:32.940 | - Jim Watson is one of the most important people now living.
01:01:35.780 | And as I've said before,
01:01:38.860 | Jim Watson is too important a legacy
01:01:41.100 | to be left to Jim Watson.
01:01:42.840 | That book tells you more
01:01:46.620 | about what actually moves the dial.
01:01:49.260 | And there's another story about him,
01:01:51.340 | which I don't agree with,
01:01:52.700 | which is that he stole everything from Rosalind Franklin.
01:01:54.820 | I mean, the problems that he had
01:01:56.420 | with Rosalind Franklin are real,
01:01:58.180 | but we should actually honor that tension in our history
01:02:02.020 | by delving into it rather than having a simple solution.
01:02:05.180 | Jim Watson talks about Francis Crick
01:02:07.940 | being a pain in the ass
01:02:09.240 | that everybody secretly knew was super brilliant.
01:02:11.700 | And there's an encounter between Chargaff,
01:02:16.740 | who came up with the equimolar relations
01:02:19.340 | between the nucleotides,
01:02:20.880 | who should have gotten the structure of DNA,
01:02:22.860 | and Watson and Crick.
01:02:24.620 | And he talks about missing a shiver
01:02:28.940 | in the heartbeat of biology.
01:02:30.300 | And this stuff is so gorgeous,
01:02:31.620 | it just makes you tremble even thinking about it.
01:02:35.620 | Look, we know very often who is to be feared,
01:02:40.620 | and we need to fund the people that we fear.
01:02:44.000 | The people who are wasting our time
01:02:46.980 | need to be excluded from the conversation.
01:02:49.660 | You see, and maybe we'll make some errors
01:02:52.980 | in both directions,
01:02:54.780 | but we have known our own people.
01:02:58.100 | We know the pains in the asses that might work out.
01:03:01.020 | And we know the people who are really just blowhards
01:03:03.380 | who really have very little to contribute most of the time.
01:03:07.340 | It's not 100%, but you're not gonna get there with rules.
01:03:10.460 | - Right, it's using some kind of instinct.
01:03:12.620 | I mean, to be honest,
01:03:14.540 | I'm gonna make you roll your eyes for a second,
01:03:16.540 | but the first time I heard
01:03:19.580 | that there is a large community of people
01:03:21.100 | who believe the Earth is flat,
01:03:22.680 | actually made me pause and ask myself the question.
01:03:26.420 | - Why would there be such a community?
01:03:27.980 | - Yeah, is it possible the Earth is flat?
01:03:30.180 | So I had to like, wait a minute.
01:03:33.180 | I mean, then you go through a thinking process
01:03:35.140 | that I think is really healthy.
01:03:37.500 | It ultimately ends up being a geometry thing, I think.
01:03:40.700 | It's an interesting thought experiment at the very least.
01:03:44.100 | - Well, I do a different version of it.
01:03:46.540 | I say, why is this community stable?
01:03:48.660 | - Yeah, that's a good way to analyze it.
01:03:51.500 | - Interesting that whatever we've done
01:03:53.220 | has not erased the community.
01:03:54.900 | So, you know, they're taking a long shot bet
01:03:57.820 | that won't pan out, you know.
01:03:59.300 | Maybe we just haven't thought enough
01:04:01.500 | about the rationality of the square root of two
01:04:03.460 | and somebody brilliant will figure it out.
01:04:05.140 | Maybe we will eventually land one day
01:04:07.060 | on the surface of Jupiter and explore it.
01:04:09.100 | Right, these are crazy things that will never happen.
01:04:14.020 | - So much of social media operates by AI algorithms.
01:04:17.540 | You talked about this a little bit,
01:04:19.500 | recommending the content you see.
01:04:21.700 | So on this idea of radical thought,
01:04:24.980 | how much should AI show you things you disagree with
01:04:28.180 | on Twitter and so on?
01:04:33.140 | In a Twitterverse.
01:04:33.140 | - I hate this question.
01:04:34.500 | - Yeah? - Yeah.
01:04:35.340 | - 'Cause you don't know the answer?
01:04:37.300 | - No, no, no, no.
01:04:38.900 | Look, they've pushed out this cognitive Lego to us
01:04:43.220 | that will just lead to madness.
01:04:45.780 | It's good to be challenged with things
01:04:47.820 | that you disagree with.
01:04:49.380 | The answer is no.
01:04:50.500 | It's good to be challenged with interesting things
01:04:52.980 | with which you currently disagree,
01:04:55.340 | but that might be true.
01:04:57.060 | So I don't really care about whether or not
01:04:58.500 | I disagree with something or don't disagree.
01:05:00.380 | I need to know why that particular disagreeable thing
01:05:03.420 | is being pushed out.
01:05:05.340 | Is it because it's likely to be true?
01:05:07.020 | Is it because, is there some reason?
01:05:09.700 | Because I can write a computer generator
01:05:12.300 | to come up with an infinite number of disagreeable statements
01:05:15.900 | that nobody needs to look at.
01:05:17.640 | So please, before you push things at me
01:05:19.660 | that are disagreeable, tell me why.
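A minimal sketch of the point he's making here, that generating disagreement is computationally trivial, so "shows you things you disagree with" cannot by itself be a useful feed criterion. The names and template strings below are invented purely for illustration.

```python
import itertools

def disagreeable_statements(reader_likes):
    """Yield an effectively endless stream of statements the reader will reject."""
    templates = [
        "{} is overrated.",
        "{} peaked a long time ago.",
        "{} will be irrelevant in five years.",
    ]
    # Loop forever over every (thing the reader likes, insult template) pair.
    for n in itertools.count():
        for thing, template in itertools.product(reader_likes, templates):
            yield template.format(thing) + f" (variant {n})"

# Usage: three of the infinitely many disagreeable-but-worthless statements.
gen = disagreeable_statements(["Your field", "Your favorite team", "Your hometown"])
for _ in range(3):
    print(next(gen))
```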
01:05:22.780 | - There is an aspect in which that question is quite dumb,
01:05:25.220 | especially because it's being used almost
01:05:29.920 | generically by these different networks to say,
01:05:33.800 | well, we're trying to work this out.
01:05:35.440 | But basically, how much value
01:05:39.620 | do you see in seeing things you don't like,
01:05:43.560 | not just things you disagree with? Because it's very difficult
01:05:45.540 | to know exactly what you articulated,
01:05:47.600 | which is the stuff that's important for you to consider
01:05:52.480 | that you disagree with.
01:05:53.360 | That's really hard to figure out.
01:05:54.960 | The bottom line is the stuff you don't like.
01:05:57.120 | If you're a Hillary Clinton supporter,
01:06:00.640 | you may not want to, it might not make you feel good
01:06:03.800 | to see anything about Donald Trump.
01:06:05.800 | That's the only thing algorithms
01:06:06.920 | can really optimize for currently.
01:06:08.920 | They really can't. - No, they can do better.
01:06:10.320 | This is, we're-- - You think so?
01:06:12.700 | - No, we're engaged in some moronic back and forth
01:06:17.200 | where I have no idea why people who are capable
01:06:22.200 | of building Google, Facebook, Twitter
01:06:25.960 | are having us in these incredibly low level discussions.
01:06:28.880 | Do they not know any smart people?
01:06:31.200 | Do they not have the phone numbers of people
01:06:33.200 | who can elevate these discussions?
01:06:34.960 | - They do, but this, they're optimizing
01:06:39.540 | for a different thing and they are pushing those people
01:06:41.580 | out of those rooms.
01:06:42.420 | - They're optimizing for things we can't see.
01:06:46.080 | And yes, profit is there.
01:06:48.480 | Nobody's questioning that.
01:06:50.240 | But they're also optimizing for things like
01:06:54.160 | political control or the fact that they're doing business
01:06:56.640 | in Pakistan and so they don't want to talk
01:06:58.720 | about all the things that they're going to be
01:07:00.640 | bending to in Pakistan.
01:07:03.280 | So we're involved in a fake discussion.
01:07:07.400 | - You think so?
01:07:08.240 | You think these conversations at that depth
01:07:09.760 | are happening inside Google?
01:07:11.240 | You don't think they have some basic metrics
01:07:14.000 | under user engagements?
01:07:15.720 | - You're having a fake conversation with us, guys.
01:07:18.220 | We know you're having a fake conversation.
01:07:19.880 | I do not wish to be part of your fake conversation.
01:07:23.600 | You know how to cool these units.
01:07:26.720 | You know high availability like nobody's business.
01:07:29.360 | My Gmail never goes down, almost.
01:07:33.160 | - So you think just because they can do incredible work
01:07:36.240 | on the software side with infrastructure,
01:07:38.280 | they can also deal with some of these difficult questions
01:07:43.280 | about human behavior, human understanding,
01:07:46.280 | you're not, you're not.
01:07:47.440 | - I mean, I've seen the developers' screens
01:07:50.920 | that people take shots of inside of Google.
01:07:54.320 | And I've heard stories inside of Facebook and Apple.
01:07:58.440 | We're not, we're engaged, they're engaging us
01:08:01.600 | in the wrong conversations.
01:08:04.060 | We are not at this low level.
01:08:06.080 | Here's one of my favorite questions.
01:08:08.100 | Why is every piece of hardware that I purchase
01:08:11.840 | in tech space equipped as a listening device?
01:08:15.560 | Where's my physical shutter to cover my lens?
01:08:19.740 | We had this in the 1970s.
01:08:22.640 | The cameras that had lens caps, you know?
01:08:25.080 | How much would it cost to have a security model?
01:08:27.920 | Pay five extra bucks.
01:08:29.780 | Why is my indicator light software controlled?
01:08:33.040 | Why, when my camera is on, do I not see
01:08:35.240 | that the light is on by putting it as something
01:08:38.040 | that cannot be bypassed?
01:08:39.700 | Why have you set up all my devices
01:08:42.900 | at some difficulty to yourselves as listening devices
01:08:46.280 | and we don't even talk about this?
01:08:47.720 | This thing is total fucking bullshit.
01:08:51.520 | - Well, I hope-- - No, no, wait, wait, wait.
01:08:53.100 | - These discussions are happening about privacy.
01:08:55.320 | Is it a more difficult thing than you're giving credit for?
01:08:57.080 | - It's not just privacy.
01:08:59.040 | It's about social control.
01:09:01.080 | We're talking about social control.
01:09:03.560 | Why do I not have controls over my own levers?
01:09:07.120 | Just have a really cute UI where I can switch,
01:09:09.940 | I can dial things, or I can at least see
01:09:11.680 | what the algorithms are.
01:09:12.960 | - But you think that there are some deliberate choices
01:09:16.720 | being made here. - There is emergence
01:09:19.040 | and there is intention.
01:09:21.560 | There are two dimensions.
01:09:22.920 | The vector does not collapse onto either axis.
01:09:26.320 | But the idea that anybody who suggests
01:09:29.080 | that intention is completely absent is a child.
01:09:34.080 | - That's really beautifully put.
01:09:35.960 | And like many things you've said is gonna make me--
01:09:38.840 | - Can I turn this around slightly?
01:09:40.800 | - Yeah.
01:09:41.800 | - I sit down with you and you say
01:09:42.880 | that you're obsessed with my feed.
01:09:45.720 | I don't even know what my feed is.
01:09:47.560 | What are you seeing that I'm not?
01:09:49.720 | - I was obsessively looking through your feed on Twitter
01:09:53.560 | 'cause it was really enjoyable
01:09:54.880 | because there's the Tom Lehrer element,
01:09:56.520 | there's the humor in it.
01:09:58.240 | - By the way, that feed is Eric R. Weinstein on Twitter.
01:10:01.600 | - It's great. - Eric R. Weinstein.
01:10:03.720 | No, but seriously, why?
01:10:06.620 | - Why did I find it enjoyable or what was I seeing?
01:10:09.840 | - What are you looking for?
01:10:11.360 | Why are we doing this?
01:10:12.920 | What is this podcast about?
01:10:14.800 | I know you've got all these interesting people.
01:10:16.480 | I'm just some guy who's sort of a podcast guest.
01:10:18.880 | - Sort of a podcast guest, you're not even wearing a tie.
01:10:22.360 | I mean, it's not even a serious interview.
01:10:24.740 | I'm searching for meaning, for happiness,
01:10:30.560 | for a dopamine rush, so short-term and long-term.
01:10:34.400 | - And how are you finding your way to me?
01:10:36.500 | I don't honestly know what I'm doing to reach you.
01:10:41.240 | - You're representing ideas which feel like common sense to me
01:10:46.240 | and that not many people are speaking about,
01:10:47.880 | so it's kind of like the intellectual dark web folks.
01:10:52.120 | These folks, from Sam Harris to Jordan Peterson to yourself,
01:10:58.660 | are saying things where it's like you're saying,
01:11:01.080 | look, there's an elephant and he's not wearing any clothes.
01:11:03.980 | And I say, yeah, yeah, let's have more of that conversation.
01:11:09.400 | That's how I'm finding you.
01:11:10.960 | - I'm desperate to try to change
01:11:13.320 | the conversation we're having.
01:11:14.720 | I'm very worried we've got an election in 2020.
01:11:17.400 | I don't think we can afford four more years
01:11:20.120 | of a misinterpreted message,
01:11:22.380 | which is what Donald Trump was.
01:11:25.320 | And I don't want the destruction of our institutions.
01:11:28.360 | They all seem hell-bent on destroying themselves.
01:11:30.600 | So I'm trying to save theoretical physics,
01:11:33.200 | trying to save the New York Times,
01:11:34.720 | trying to save our various processes.
01:11:38.200 | And I think it feels delusional to me
01:11:40.560 | that this is falling to a tiny group of people
01:11:44.600 | who are willing to speak out without getting so freaked out
01:11:48.720 | that everything they say will be misinterpreted
01:11:50.760 | and that their lives will be ruined through the process.
01:11:53.000 | I mean, I think we're in an absolutely bananas period
01:11:56.000 | of time, and I don't believe it should fall
01:11:57.920 | to such a tiny number of shoulders to shoulder this way.
01:12:01.120 | - So I have to ask you, on the capitalism side,
01:12:05.840 | you mentioned that technology is killing capitalism,
01:12:08.160 | or it has effects that are, well, not unintended,
01:12:12.880 | but not what economists would predict
01:12:16.000 | or speak of capitalism creating.
01:12:18.800 | I just wanna talk to you about, in general,
01:12:21.240 | the effect of even then artificial intelligence
01:12:23.640 | or technology automation taking away jobs
01:12:27.280 | and these kinds of things,
01:12:28.200 | and what you think is the way to alleviate that,
01:12:31.500 | whether the Andrew Yang presidential candidate
01:12:33.700 | with universal basic income, UBI,
01:12:36.040 | what are your thoughts there?
01:12:38.680 | How do we fight off the negative effects of technology?
01:12:42.000 | - All right, you're a software guy, right?
01:12:44.560 | A human being is a worker is an old idea.
01:12:48.500 | A human being has a worker is a different object, right?
01:12:53.960 | So if you think about object-oriented programming
01:12:55.720 | as a paradigm, a human being has a worker
01:12:59.640 | and a human being has a soul.
01:13:01.840 | We're talking about the fact that for a period of time,
01:13:04.400 | the worker that a human being has was in a position
01:13:08.720 | to feed the soul that a human being has.
01:13:11.340 | However, we have two separate claims
01:13:15.040 | on the value in society.
01:13:17.020 | One is as a worker and the other is as a soul,
01:13:20.740 | and the soul needs sustenance, it needs dignity,
01:13:23.440 | it needs meaning, it needs purpose.
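A rough sketch of the "has-a" picture being drawn here, in object-oriented terms. All class and field names below are illustrative assumptions, not anything specified in the conversation.

```python
from dataclasses import dataclass, field

@dataclass
class Worker:
    """The economic claim: what the market will pay for."""
    skills: list = field(default_factory=list)
    marginal_product: float = 0.0

@dataclass
class Soul:
    """The human claim: what needs sustenance, dignity, meaning, purpose."""
    dignity: bool = True
    meaning: str = "unknown"

# Old picture: a human being IS a worker (inheritance).
class HumanBeingIsWorker(Worker):
    pass

# The picture described here: a human being HAS a worker and HAS a soul (composition).
@dataclass
class HumanBeing:
    worker: Worker
    soul: Soul

# For a period of time, the worker attribute was able to feed the soul attribute;
# the argument is that nothing guarantees that coupling holds as automation advances.
```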
01:13:25.320 | As long as your means of support is not highly repetitive,
01:13:33.600 | I think you have a while to go
01:13:34.840 | before you need to start worrying.
01:13:36.920 | But if what you do is highly repetitive
01:13:39.760 | and it's not terribly generative,
01:13:41.240 | you are in the crosshairs of for loops and while loops,
01:13:45.960 | and that's what computers excel at, repetitive behavior.
01:13:48.880 | And when I say repetitive, I may mean things
01:13:52.440 | that have never happened before, generated through combinatorial possibilities,
01:13:54.840 | but as long as it has a looped characteristic to it,
01:13:57.200 | you're in trouble.
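A toy example of "repetitive in the combinatorial sense": each output below has arguably never been produced before, but the work has a looped characteristic, so a for loop does it at essentially zero marginal cost. The product, region, and horizon lists are invented for illustration.

```python
import itertools

products = ["policy A", "policy B", "policy C"]
regions = ["US", "EU", "APAC"]
horizons = ["1-year", "5-year", "10-year"]

# 27 "novel" tasks, each one routine to generate; scale the lists and it's millions.
for product, region, horizon in itertools.product(products, regions, horizons):
    print(f"Draft the {horizon} cost summary for {product} in {region}.")
```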
01:13:58.100 | We are seeing a massive push towards socialism
01:14:02.960 | because capitalists are slow to address the fact
01:14:07.960 | that a worker may not be able to make claims.
01:14:10.880 | A relatively undistinguished median member of our society
01:14:15.640 | still has needs: the need to reproduce, the need for dignity.
01:14:20.640 | And when capitalism abandons the median individual
01:14:25.520 | or the bottom 10th or whatever it's going to do,
01:14:29.660 | it's flirting with revolution.
01:14:32.640 | And what concerns me is that the capitalists
01:14:35.400 | aren't sufficiently capitalistic to understand this.
01:14:38.080 | You really want to court authoritarian control
01:14:43.040 | in our society because you can't see
01:14:45.100 | that people may not be able to defend themselves
01:14:47.300 | in the marketplace because the marginal product
01:14:50.080 | of their labor is too low to feed their dignity as a soul.
01:14:55.040 | So my great concern is that our free society has to do
01:14:59.960 | with the fact that we are self-organized.
01:15:02.320 | I remember looking down from my office in Manhattan
01:15:04.960 | when Lehman Brothers collapsed and thinking,
01:15:07.880 | who's going to tell all these people
01:15:09.440 | that they need to show up at work
01:15:11.820 | when they don't have a financial system
01:15:14.120 | to incentivize them to show up at work?
01:15:16.440 | So my complaint is first of all, not with the socialists
01:15:20.880 | but with the capitalists, which is you guys are being idiots.
01:15:24.520 | You're courting revolution by continuing to harp
01:15:28.280 | on the same old ideas that, well, you know,
01:15:30.960 | try harder, bootstrap yourself.
01:15:33.040 | Yeah, to an extent that works, to an extent.
01:15:36.300 | But we are clearly headed to a place
01:15:37.940 | where there's nothing that ties together our need
01:15:41.040 | to contribute and our need to consume.
01:15:45.400 | And that may not be provided by capitalism
01:15:47.580 | because it may have been a temporary phenomenon.
01:15:49.460 | So check out my article on anthropic capitalism
01:15:52.800 | and the new gimmick economy.
01:15:55.480 | I think people are late getting the wake-up call
01:15:58.020 | and we would be doing a better job
01:15:59.980 | saving capitalism from itself
01:16:01.860 | because I don't want this done under authoritarian control.
01:16:05.740 | And the more we insist that everybody
01:16:08.100 | who's not thriving in our society
01:16:10.100 | during their reproductive years
01:16:11.860 | in order to have a family is failing at a personal level.
01:16:15.260 | I mean, what a disgusting thing that we're saying.
01:16:18.380 | What a horrible message.
01:16:19.940 | Who the hell have we become
01:16:21.900 | that we've so bought into the Chicago model
01:16:24.940 | that we can't see the humanity
01:16:26.580 | that we're destroying in that process?
01:16:28.140 | And I hate the thought of communism.
01:16:31.460 | I really do.
01:16:32.280 | My family has flirted with it decades past.
01:16:34.540 | It's a wrong, bad idea.
01:16:36.460 | But we are going to need to figure out
01:16:38.460 | how to make sure that those souls
01:16:40.620 | are nourished and respected
01:16:43.060 | and capitalism better have an answer.
01:16:45.060 | And I'm betting on capitalism,
01:16:46.460 | but I gotta tell you, I'm pretty disappointed with my team.
01:16:49.980 | - So you're still on the capitalism team.
01:16:52.140 | You just, there's a theme here.
01:16:54.260 | - Radical capital.
01:16:56.180 | - Hyper capitalism.
01:16:57.340 | - I want, I think hyper capitalism
01:16:59.620 | is gonna have to be coupled to hyper socialism.
01:17:01.980 | You need to allow the most productive people
01:17:04.260 | to create wonders.
01:17:06.100 | And you gotta stop bogging them down
01:17:08.240 | with all of these extra nice requirements.
01:17:11.260 | Nice is dead.
01:17:12.860 | Good has a future.
01:17:14.500 | Nice doesn't have a future
01:17:16.180 | because nice ends up with gulags.
01:17:19.020 | - Damn, that's a good line.
01:17:21.100 | Okay, last question.
01:17:22.700 | You tweeted today a simple, quite insightful equation
01:17:27.140 | saying imagine that for every unit F of fame you picked up
01:17:33.060 | S stalkers and H haters.
01:17:35.660 | So I imagine S and H are dependent on your path to fame
01:17:38.660 | perhaps a little bit.
01:17:40.940 | - Well, it's not that simple.
01:17:40.940 | People always take these things literally
01:17:42.460 | when you have like 280 characters to explain yourself.
01:17:45.160 | (laughing)
01:17:47.020 | - So you mean that that's not a mathematical--
01:17:49.500 | - No, there's no law.
01:17:50.340 | - Oh, okay.
01:17:51.260 | All right.
01:17:52.260 | - I put the word imagine
01:17:53.380 | because I still have a mathematician's desire for precision.
01:17:56.260 | Imagine that this were true.
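Taken literally, which, as he notes, is exactly how it should not be taken, the tweet's "imagine" reads as a simple linear relation; it is written out here only to make the variables S and H mentioned above concrete, not as any actual law.

```latex
% Hypothetically, if S stalkers and H haters arrive per unit of fame F, then
\[
  \text{stalkers}(F) \approx S \cdot F,
  \qquad
  \text{haters}(F) \approx H \cdot F,
\]
% with slopes S and H that plausibly depend on the path by which
% the fame was acquired, as suggested in the question above.
```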
01:17:57.820 | - But it was a beautiful way to imagine
01:18:00.080 | that there is a law that has those variables in it.
01:18:03.460 | And you've become quite famous these days.
01:18:06.700 | So how do you yourself optimize that equation
01:18:09.940 | with the peculiar kind of fame
01:18:11.860 | that you have gathered along the way?
01:18:13.760 | - I wanna be kinder.
01:18:14.740 | I wanna be kinder to myself.
01:18:16.180 | I wanna be kinder to others.
01:18:17.380 | I wanna be able to have heart,
01:18:22.100 | compassion, these things are really important.
01:18:24.460 | And I have a pretty spectrumy kind of approach to analysis.
01:18:28.940 | I'm quite literal.
01:18:30.460 | I can go full Rain Man on you at any given moment.
01:18:33.100 | No, I can't.
01:18:33.940 | I can't.
01:18:34.780 | It's faculty of autism, if you like.
01:18:36.440 | And people are gonna get angry
01:18:37.380 | because they want autism to be respected.
01:18:39.180 | But when you see me coding or you see me doing mathematics,
01:18:44.180 | I'm, you know, I speak with speech apnea.
01:18:47.740 | (stammering)
01:18:49.300 | Be right down to dinner.
01:18:50.500 | And we have to try to integrate ourselves
01:18:54.180 | in those tensions between, you know,
01:18:57.380 | it's sort of back to us as a worker and us as a soul.
01:19:00.660 | Many of us are optimizing one at the expense of the other.
01:19:05.540 | And I struggle with social media
01:19:08.020 | and I struggle with people making threats
01:19:09.900 | against our families.
01:19:11.720 | And I struggle with just how much pain people are in.
01:19:15.860 | And if there's one message I would like to push out there,
01:19:20.020 | you're responsible, everybody, all of us,
01:19:22.360 | myself included, for struggling.
01:19:24.860 | Struggle, struggle mightily because you,
01:19:27.500 | it's nobody else's job to do your struggle for you.
01:19:30.780 | Now with that said, if you're struggling and you're trying
01:19:33.460 | and you're trying to figure out how to better yourself
01:19:35.540 | and where you've failed, where you've let down your family,
01:19:38.100 | your friends, your workers, all this kind of stuff,
01:19:40.780 | give yourself a break.
01:19:43.500 | You know, if it's not working out,
01:19:46.380 | I have a lifelong relationship with failure and success.
01:19:50.300 | There's been no period of my life
01:19:52.780 | where both haven't been present in one form or another.
01:19:55.900 | And I do wish to say that a lot of times
01:19:59.300 | people think this is glamorous.
01:20:01.200 | I'm about to go, you know, do a show with Sam Harris.
01:20:04.400 | People are gonna listen in on two guys
01:20:05.760 | having a conversation on stage.
01:20:07.280 | It's completely crazy.
01:20:08.500 | I'm always trying to figure out how to make sure
01:20:09.980 | that those people get maximum value.
01:20:12.300 | And that's why I'm doing this podcast, you know,
01:20:16.540 | just give yourself a break.
01:20:18.260 | You owe us, you owe us your struggle.
01:20:20.500 | You don't owe your family or your coworkers
01:20:22.900 | or your lovers or your family members success.
01:20:25.800 | As long as you're in there and you're picking yourself up,
01:20:29.680 | recognize that this new situation with the economy
01:20:33.620 | that doesn't have the juice to sustain our institutions
01:20:37.060 | has caused the people who've risen to the top
01:20:39.700 | of those institutions to get quite brutal and cruel.
01:20:43.380 | Everybody is lying at the moment.
01:20:45.100 | Nobody's really a truth teller.
01:20:46.800 | Try to keep your humanity about you.
01:20:50.060 | Try to recognize that if you're failing,
01:20:52.780 | if things aren't where you want them to be
01:20:54.940 | and you're struggling and you're trying to figure out
01:20:56.620 | what you're doing wrong, what you could do,
01:20:58.420 | it's not necessarily all your fault.
01:21:01.260 | We are in a global situation.
01:21:02.960 | I have not met the people who are honest,
01:21:06.060 | kind, good, successful.
01:21:08.220 | Nobody that I've met is checking all the boxes.
01:21:12.740 | Nobody's getting all 10s.
01:21:14.420 | So I just think that's an important message
01:21:17.340 | that doesn't get pushed out enough.
01:21:18.740 | Either people wanna hold society responsible
01:21:21.780 | for their failures, which is not reasonable.
01:21:23.980 | You have to struggle, you have to try.
01:21:26.140 | Or they wanna say you're 100% responsible
01:21:28.340 | for your failures, which is total nonsense.
01:21:30.500 | - Beautifully put.
01:21:32.900 | Eric, thank you so much for talking today.
01:21:34.380 | - Thanks for having me, buddy.
01:21:35.860 | (upbeat music)