Eric Weinstein: Revolutionary Ideas in Science, Math, and Society | Lex Fridman Podcast #16
Chapters
0:00 Intro
0:44 Who influenced your thinking
9:59 Telogens
16:22 OpenAI
18:05 Edward Teller
20:03 All telogens
21:22 Building a nightmare machine
23:07 Metrics
23:49 The Great Mystery of Our Time
26:20 My Leading Concern
28:54 Gated Institutional Narrative
32:30 Nuclear Weapons
35:16 Chess
37:22 The Future
40:39 Temporal Dimensions
The following is a conversation with Eric Weinstein.
which is a loosely assembled group of public intellectuals that includes Sam Harris, Jordan Peterson, Steven Pinker, Joe Rogan, Michael Shermer, and a few others.
of the Artificial Intelligence Podcast at MIT and beyond.
If you enjoy it, subscribe on YouTube, iTunes, or simply connect with me on Twitter @lexfridman.
And now, here's my conversation with Eric Weinstein.
It has the usual profound master-student dynamic going on. So, if you're the Kung Fu Panda, who was your shifu?
because I didn't see Shifu as being the teacher.
and the first conversation sort of doesn't count.
its point is that the teaching that really matters
Harry Rubin, and his wife, Sophie Rubin, my grandmother,
And it's so irreverent, so witty, so clever, so obscene, that it destroys the ability to lead a normal life
So, if I meet somebody who's unusually gifted, I'll often ask them, are you a Tom Lehrer fan? And the odds that they will respond are quite high.
- Now, Tom Lehrer's Poisoning Pigeons in the Park, Tom Lehrer?
Poisoning Pigeons in the Park, the Element Song,
So, when you meet somebody who knows those songs,
- Oh, you're judging me right now, aren't you?
No, but you're Russian, so undoubtedly you know
which most people don't know, from Danny Kaye,
of plagiarizing a song and making it about plagiarism,
It was extremely addictive, and eventually led me
one of which may have been a PhD in mathematics.
- And he was also at least a lecturer in mathematics,
And Tom Lehrer was 88 years old, sharp as a tack, you know, there are very few people in this world that you have to meet while they're still here,
- So that wit is a reflection of intelligence
Do you think that's connected to intelligence,
- No, I think that it's absolutely connected to intelligence. So you can see it, there's a place where Tom Lehrer
of Gilbert and Sullivan, and he's going to outdo Gilbert
He's doing Clementine as if Gilbert and Sullivan wrote it,
"young sister named Esther, this Esther de-pester,
"she tried, pestering sister's a festering blister,
"When she said I could have her, her sister's cadaver
because it's hard to construct something like that.
and you're sort of making it palatable and funny,
as Tom Lehrer wrote all of these terrible, horrible lines, was just what a sensitive and beautiful soul he was, who was channeling pain through humor and through grace.
I've seen throughout Europe, throughout Russia,
to somehow deal with the pain and the suffering
that there are certain areas of human experience that it would be better to know nothing about.
Eastern Europe knows a great deal about them, which makes the songs of Vladimir Vysotsky so potent.
knew something like this around the time of the Civil War, or even the harsh tyranny of the coal and steel employers
to understand and imagine the collective culture unless we have the system of selective pressures that, for example, Russians were subjected to.
- Yeah, so if there's one good thing that comes out of war,
- Without the death, it would bring the romance of it.
- Well, this is why we're always caught up in war. We have this very ambiguous relationship to it,
is that it makes life real and pressing and meaningful
in one of the conversations you had, or one of the videos, you described that one of the things AI systems can't do,
- Well, yes, the physical robots can't self-replicate.
which is that the only thing that we've been able to create that has an analog of our reproductive system is software.
But nevertheless, software replicates itself, if we're speaking strictly about replication,
So let me, just to begin, let me ask a question.
between the physical world and the digital world? Let's call it the logical world versus the physical world.
it was meaningless to us as a physical object
with what was stored in it at a logical level.
And so the idea that something may be stored logically and that it may be stored physically are not necessarily,
I'm not suggesting that there isn't a material basis
with a separate layer that need not invoke logic gates
you mentioned the idea of out, outtelligence. 00:10:26.740 |
- Well, maybe it wasn't well-written, but I don't know. 00:10:39.480 |
What I was thinking about is why it is that we're waiting 00:10:42.580 |
to be terrified by artificial general intelligence 00:10:47.260 |
when in fact artificial life is terrifying in and of itself 00:10:54.500 |
So in order to have a system of selective pressures, 00:11:04.300 |
you need heritability, and you need differential success. 00:11:17.440 |
what humans know how to build, that's impressive. 00:11:33.720 |
The one thing it doesn't have is a reproductive system. 00:11:42.920 |
effectively you do have a reproductive system. 00:11:49.680 |
with variation, heritability, and differential success. 00:11:53.360 |
Now, the next step in the chain of thinking was 00:11:56.520 |
where do we see inanimate, non-intelligent life 00:12:08.200 |
and I try to stay on them so that we don't get distracted. 00:12:19.080 |
- Yeah, it's a type of flower that mimics the female of a pollinator species in order to dupe the males into engaging, it was called pseudocopulation, with the fake female, which is usually represented
But the flower doesn't have to give up energy
The other system is a particular species of mussel, Lampsilis, in the clear streams of Missouri.
and parasitize the bass and also use the bass to redistribute them as they eventually release.
In both of these systems, you have a highly intelligent dupe
And what is sculpting these convincing lures? It's the intelligence of previously duped targets
So when the target is smart enough to avoid the strategy,
So it's an arms race between the target species and this other less intelligent or non-intelligent object
And so what you see is that artificial general intelligence
It's simply sufficient for us to outwit ourselves.
one of these Nigerian scams that writes letters
to figure out which aspects of the program should be kept,
And you don't need it to be in any way intelligent in order to have a really nightmarish scenario
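The loop he is describing needs no intelligence anywhere in it: variation, heritability, and differential success are enough. A minimal sketch of that selection loop, where everything is illustrative — the target string and the `fitness` function are stand-ins for whatever reply rate a real pipeline would measure, not anything from the conversation:

```python
import random

random.seed(0)  # reproducible run

ALPHABET = "abcdefghijklmnopqrstuvwxyz "
TARGET = "send money now"  # stand-in for "whatever actually elicits replies"

def fitness(msg):
    # Differential success: similarity to TARGET stands in for the
    # measured response rate that would drive selection in the wild.
    return sum(a == b for a, b in zip(msg, TARGET))

def mutate(msg, rate=0.1):
    # Variation: each character may be randomly replaced.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in msg)

def evolve(pop_size=50, generations=200):
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 5]           # selection
        pop = [mutate(random.choice(survivors))    # heritability + variation
               for _ in range(pop_size)]
        pop.extend(survivors)                      # keep the best unchanged
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Nothing in the loop understands the messages; the "design" comes entirely from which variants get copied forward, which is the point being made about lures sculpted by their victims.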
- So you phrased a few concepts really eloquently. So let me try to follow a few directions this goes in.
So one, first of all, in the way we write software today, it's not common that we allow it to self-modify.
So your thought is that that is a serious worry
- So there's different types of self-modification, right?
after you log in or whatever, you can think of it that way.
But you're thinking of ideas where you're completely, this is a unique entity operating under selective pressures
- Well, you just, if you think about the fact
but they have a small set of spanning components.
in that any shape or binding region can be approximated
then you can have confidence that you don't need to know what's coming at you because the combinatorics are sufficient to reach any configuration needed.
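The combinatorial point — a small heritable set of spanning components covering a huge space of shapes — is the logic of the adaptive immune system's V(D)J recombination. A toy count, with segment numbers in the rough ballpark quoted for the human antibody heavy chain (illustrative figures, not data from the conversation):

```python
from math import prod

# Approximate counts of gene segments available for the antibody heavy chain;
# the immune system picks one segment of each kind and splices them together.
segments = {"V": 65, "D": 27, "J": 6}

distinct_chains = prod(segments.values())
print(distinct_chains)  # 65 * 27 * 6 = 10530
```

Junctional variation and pairing with light chains multiplies this by orders of magnitude again, which is how a genome of fixed size can avoid needing to know in advance what is coming at it.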
I do always have a concern as to whether or not I will bring them into being by talking about them.
so next week, I have to talk to the founder of OpenAI,
- Yes, so that's the thing, I think talking about it,
I'm more a proponent of technology preventing,
I don't think that we should discount the idea
showing off how smart they are by what they've developed
I'm very mindful in particular of a beautiful letter that Edward Teller, of all people, wrote to Leo Szilard
talking about the slim chance we have for survival and that the only hope is to make war unthinkable.
I do think that not enough of us feel in our gut
And I would recommend to anyone who hasn't seen it a movie called "The Bridge on the River Kwai"
end up over-collaborating with their Japanese captors.
the unrestricted open discussion of ideas in AI.
I'm just saying that I could make a decent case
and to become technologically focused on containing it and try to hope that the relatively small number
- Well, the way ideas, the way innovation happens, what new ideas develop, Newton with calculus,
who may live in countries that don't wish us to know
are very disciplined in keeping these things to themselves.
that developed something very close to the calculus,
in, I guess, religious prayer and rhyme and prose.
So it's not that Newton had any ability to hold that back,
I do think that we could change the proportion of the time we spend worrying about the effects,
and hope that we'll be able to contain things later.
you want to influence users to behave a certain way. And so that's one kind of example of our intelligence,
in order to sell more product of a different kind.
But do you see other examples of this actually emerging in--
And if you really wanted to build a nightmare machine, make sure that the system that expresses the variability has a spanning set so that it can learn to arbitrary levels
- So it's your nightmare, but it could also be, it's a really powerful mechanism by which to create,
So are you more worried about the negative direction
So you said parasitic, but that doesn't necessarily need to be what the system converges towards.
- Parasitism, the dividing line between parasitism
- Well, yeah, we could go into that too, but.
- So in marriage, you fear the loss of independence, but even though the American therapeutic community
what's to say that codependence isn't what's necessary to have a stable relationship in which to raise children
and most of us don't want our 13-year-olds having kids.
and I don't know whether to be angry at them or thank them.
- Well, ultimately, I mean, nobody knows the meaning of life or what even happiness is, but there are some metrics.
- They didn't, that's what all the poetry and books are about.
- I don't think we've really felt where we are.
9/11 was so anomalous compared to everything else
and determined R&D team deep in the bowels of Afghanistan
that we were open to that nobody had chosen to express.
to cause havoc and destruction have been up until now.
is how remarkably stable we've been since 1945
We've had several close calls, we've had mistakes,
And what's now happened is that we've settled into a sense that, oh, it'll always be nothing.
And that's why when I went on the Ben Shapiro show,
because we have people whose developmental experience
There's a sense that people are in a video game mode
I don't know if, and again, you came from Moscow.
that had a huge effect on a generation growing up in the US.
We have not gone through an embodied experience
And I think it's one of the most irresponsible things
in which the thorns are cut off of the rose bushes
And so people have developed this totally unreal idea
And do I think that my leading concern is AGI or my leading concern is thermonuclear exchange
because the toys that we've built are so impressive. And the wisdom to accompany them has not materialized. And I think we actually got a wisdom uptick since 1945.
no matter how bad they were, managed to not embroil us in something that we couldn't come back from.
going to visit a memorial to those who gave their lives.
who fought and died in the Battle of Stalingrad? I'm not a huge fan of communism, I gotta say. But there were a couple of things that the Russians did
When we were all just there and nobody wanted to speak.
and blast out our deep thoughts and our feelings. And it was profound because we woke up briefly there.
I talk about the gated institutional narrative
One of which was the election of Donald Trump.
wasn't that important knew that Lehman Brothers
And so if I'm 53 years old and I only remember three times that the global narrative was really interrupted,
I mean, we had the Murrah Federal Building explosion,
Around 9/12, we started to wake up out of our slumber. And the powers that be did not want us coming together.
and there's a component of it that's deliberate. So give yourself a portfolio with two components.
but some amount of it is also an understanding
is there are forces that are trying to come together and there are forces that are trying to push things apart.
that they're temporary, they're nationalistic,
To people in the national, more in the national idiom, they're saying, "Look, this is where I pay my taxes.
"to whom you have a special and elevated duty?"
have been pushing towards the global perspective from the elite, and a larger and larger number
"Hey, I actually live in a place and I have laws
"and who are you to tell me that because you can profit
"to my fellow countrymen are so much diminished?"
- So these tensions between nations and so on, ultimately you see being proud of your country and so on,
Ultimately it is human nature and it is good for us
And my point isn't, I mean, nationalism run amok is a nightmare, and internationalism run amok
And the problem is we're trying to push these pendulums to some place where they're somewhat balanced,
but we don't forget our duties of care to the global system.
but the problem that we're facing concerns the ability for some to profit by abandoning their obligations
since one of the many things you've done is economics,
why the heck we haven't blown each other up yet.
has a lot of really good ideas about why, but--
- Listen, I'm Russian, so I never trust a guy
in which more and more of the kinetic energy, like war, has been turned into potential energy,
then everything's just getting better and better.
not in the way that you've described, but in the more.
- Well, he hangs out with Elon, I don't know Elon.
- Well, I mean, I can mean all sorts of things, but certainly many of the things that we thought
that you can write a pretty convincing sports story from stats alone without needing to have watched the game. So, you know, is it possible to write lively prose
So, you know, is it possible to write lively prose 00:35:26.900 |
which was what is the greatest brilliancy ever produced 00:35:44.020 |
that just show such flair, panache, and soul. 00:35:52.300 |
And recently we've started seeing brilliancies. 00:35:59.740 |
with AlphaZero that things were quite brilliant. 00:36:07.580 |
but in a very restricted set of rules like chess, 00:36:11.680 |
you're starting to see poetry of a high order. 00:36:15.540 |
And so I don't like the idea that we're waiting for AGI. 00:36:25.900 |
in the same way that I don't think a worm should be, 00:36:30.060 |
C. elegans shouldn't be treated as non-conscious 00:36:34.500 |
Maybe it just has a very low level of consciousness 00:36:37.780 |
because we don't understand what these things mean 00:36:40.740 |
So am I worried about this general phenomena? 00:36:43.620 |
Sure, but I think that one of the things that's happening 00:36:52.420 |
We've always been worried about the Golem, right? 00:36:57.260 |
- Well, the Golem is the artificially created-- 00:37:25.220 |
that we've experienced or the context of our lives
And I don't mean to suggest that we won't survive, and I don't mean to suggest that it's coming tomorrow.
if we have three rocks that we could possibly inhabit that are sensible within current technological dreams,
to sort out disputes that cannot be arbitrated. It is not clear to me that we have a long-term future
to figure out, what do you mean by our source code?
You're talking about the low level, the bits.
But I think it's also pretty silly to imagine
to the point that we can have the toys we have, and we're not going to use them for 500 years.
- Speaking of Einstein, I had a profound breakthrough when I realized you're just one letter away from the guy.
- Yeah, but I'm also one letter away from Feinstein.
You know, you've worked, you enjoy the beauty of geometry.
It's just so beautiful, you will tremble anyway.
and one of the ways, one of the things you've done
that has the 4D space-time continuum embedded in it.
How do you try to, what does it make you feel, talking in the mathematical world about dimensions that are greater than the ones we can perceive?
- Well, first of all, stick out your tongue at me.
And next to that were salt receptors on two different sides. A little bit farther back, there were sour receptors. And you wouldn't show me the back of your tongue
- Okay, but that was four dimensions of taste receptors. But you also had pain receptors on that tongue,
So when you eat something, you eat a slice of pizza, and it's got some hot pepper on it, maybe some jalapeno. You're having a six-dimensional experience, dude.
- Do you think we overemphasize the value of time
Well, we certainly overemphasize the value of time, or we really don't like things to end, but they seem to.
- Well, what if you flipped one of the spatial dimensions
and say, "Well, where and when should we meet?" Say, "How about I'll meet you on 36th and Lexington at two in the afternoon and 11 o'clock in the morning?"
- Well, so it's convenient for us to think about time,
in which we have three dimensions of space and one of time, and they're woven together in this sort of strange fabric where we can trade off a little space for a little time, but we still only have one dimension that is picked out
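The fabric being described, with one dimension "picked out," is the Minkowski line element of special relativity; the sign on the time term is what distinguishes it from the three spatial ones:

```latex
ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2
```

Flipping a spatial dimension into a temporal one, as in the thought experiment above, would mean a second minus sign in this expression — a two-time geometry — which is why "two in the afternoon and 11 o'clock in the morning" reads as a coherent, if alien, pair of coordinates.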
or did the dimensions, or were they always there
- Well, do you imagine that there isn't a place
and then would time not be playing the role of space? Why do you imagine that the sector that you're in
- Certainly do not, but I can't imagine otherwise. I mean, I haven't done ayahuasca or any of those drugs,
- Well, just ask about pseudo-Riemannian geometry,
- Or you could talk to a shaman and end up in Peru.
- Yeah, but you won't be able to do any calculations
Berkeley professor, author of "Love and Math," said that you were quite a remarkable intellect to come up with such beautiful original ideas
what's the difference between inside academia and outside academia when it comes to developing such ideas?
- Oh, it's a terrible choice, terrible choice.
and brilliant people who are working together
When you go outside, you meet lunatics and crazy people,
who do not usually subscribe to the consensus position
who has miraculously managed to stay within the system and is able to take on a larger amount of heresy
Or is it more likely that somebody will maintain a level of discipline from outside of academia
affirm your loyalty to the consensus of your field?
that academia in this particular sense is declining.
the older population of the faculty is getting larger,
So which direction of the two are you more hopeful about?
- Well, the baby boomers can't hang on forever.
that last a few years in length and then pop. The baby boomer bubble is this really long-lived thing.
It was something that baby boomers were able to do
- You don't think of string theory as an original idea?
And there are people who are younger than the baby boomers
within the large string theoretic complex is wrong. Quite the contrary, a lot of brilliant mathematics
of this product that will not ship called string theory? I think that it is largely an affirmative action program for highly mathematically and geometrically talented
There are other things that you could imagine doing. I don't think much of any of the major programs,
through a shibboleth, well, surely you don't question X. Well, I question almost everything in the string program.
When you called me a physicist, it was a great honor.
As I said, wow, in 1984, 1983, I saw the field going mad.
which has all sorts of problems, was not going insane. And so instead of studying things within physics,
- I mean, I'm trying to play this role myself.
So I think that they're wrong about a great number
I have a really profound love-hate relationship
They'll study something that isn't very important physics,
and I might even intellectually at some horsepower level give them the edge, the theoretical physics community is bar none the most profound intellectual community
It is the number one; there's nobody in second place
Look, in their spare time, in their spare time,
I mean, a lot of the early molecular biologists.
I mean, you have to appreciate that there is no community like the basic research community in theoretical physics.
And it's not something, I'm highly critical of these guys. I think that they've just wasted decades of time
But this has been the greatest intellectual collapse
- Oh, I'm terrified that we're about to lose the vitality.
just to play with in case they find something
They gave us two atomic devices to end World War II. They created the semiconductor and the transistor
As a positive externality of particle accelerators,
"Why should we fund you with our taxpayer dollars?" No, the question is, are you enjoying your physics dollars?
during this period of time through the terrible weapons that they developed, or your communications devices,
even to the extent that chemistry came out of physics
Second of all, it is our most important community.
to have really contributed to the standard model at a theoretical level was born in 1951, right?
for theoretical development that predicted experiment.
So, we have to understand that we are doing this to ourselves. Now, with that said, these guys have behaved abysmally,
They haven't shared some of their most brilliant discoveries, which are desperately needed in other fields,
which is an upgrade of the differential calculus
even though this should be standard operating procedure for people across the sciences dealing with different layers
- And by shared, you mean communicated in such a way that it disseminates throughout the different sciences.
both theoretical physicists and mathematicians
the Hoberman switch pitch that shows the self-duality of the tetrahedron realized as a linkage mechanism.
and it makes an amazing toy that's built a market,
- So it's truly a love and hate relationship for you.
is the building in which I really put together the conspiracy between the National Academy of Sciences
research round table to destroy the bargaining power
Oh yeah, that was done here in this building.
- And I'm truly speaking with a revolutionary
At an intellectual level, I am absolutely garden variety.
And we don't understand that when we get these things wrong,
everybody with a tie who spoke in baritone tones with the right degree at the end of their name,
we're talking about how we banished volatility.
the market went crazy, but the market didn't go crazy. And what happened is that it suddenly went sane.
over its admissions policies for people of color
All of this madness is necessary to keep the game going.
just while we're on the topic of revolutionaries, is we're talking about the danger of an outbreak of sanity.
- Yeah, you're the guy pointing out the elephant
I was gonna talk a little bit to Joe Rogan about this.
you could probably speak really eloquently to academia on the difference between the different fields.
in terms of what they're willing to tolerate?
'Cause I always, all the battles going on now
- Have you seen the American Mathematical Society's publication of an essay called "Get Out the Way"?
- The idea is that white men who hold positions
- Do you think, 'cause you're basically saying physics as a community has become a little bit intolerant
which is that even string theory is now admitting, okay, we don't, this doesn't look very promising
if you wanna take the computer science metaphor,
Will you spend your life trying to push some paper into a journal or will it be accepted easily? What do we know about the characteristics of the submitter?
All of these fields are experiencing pressure because no field is performing so brilliantly well that it's revolutionizing our way of speaking and thinking in the ways in which we've become accustomed.
- But don't you think, even in theoretical physics, a lot of times, even with theories like string theory, you could speak to this, it does eventually lead to what are the ways that this theory would be testable?
there's this thing about Popper and the scientific method
That's not really how most of the stuff gets worked out,
And there is a dialogue between theory and experiment.
and Schrödinger's failure to advance his own work because of his failure to account for some phenomenon.
The key point is that if your theory is a slight bit off,
but it doesn't mean that the theory is actually wrong.
that the electrons should have an antiparticle.
And since the only positively charged particle
Heisenberg pointed out, well, shouldn't your antiparticle, the proton, have the same mass as the electron
So I think that Dirac was actually being quite, potentially quite sneaky and talking about the fact that he had been pushed off of his own theory
But look, we fetishize the scientific method and Popper and falsification because it protects us
So it's a question of balancing type one and type two error, and we were pretty maxed out in one direction.
- The opposite of that, let me say what comforts me,
The problem is that we don't have standards in general for trying to determine who has to be put to the sword in terms of their career and who has to be protected as some sort of giant time suck pain in the ass
- Well, you're not gonna like the answer, but here it comes.
We're trying to do this at the level of rules and fairness.
'Cause the only thing that really understands this
he was also one hell of a writer before he became an ass.
No, he's tried to destroy his own reputation.
- Jim Watson is one of the most important people now living.
which is that he stole everything from Rosalind Franklin.
but we should actually honor that tension in our history by delving into it rather than having a simple solution.
that everybody secretly knew was super brilliant.
it just makes you tremble even thinking about it.
Look, we know very often who is to be feared,
We know the pains in the asses that might work out. And we know the people who are really just blowhards who really have very little to contribute most of the time. It's not 100%, but you're not gonna get there with rules.
I'm gonna make you roll your eyes for a second, 01:03:22.680 |
actually made me pause and ask myself the question. 01:03:33.180 |
I mean, then you go through a thinking process 01:03:37.500 |
It ultimately ends up being a geometry thing, I think. 01:03:40.700 |
It's an interesting thought experiment at the very least. 01:04:01.500 |
about the irrationality of the square root of two 01:04:09.100 |
Right, these are crazy things that will never happen. 01:04:14.020 |
- So much of social media operates by AI algorithms. 01:04:24.980 |
how much should AI show you things you disagree with 01:04:38.900 |
Look, they've pushed out this cognitive Lego to us 01:04:50.500 |
It's good to be challenged with interesting things 01:05:00.380 |
I need to know why that particular disagreeable thing 01:05:12.300 |
to come up with an infinite number of disagreeable statements 01:05:22.780 |
- There is an aspect in which that question is quite dumb, 01:05:25.220 |
especially because it's being used to almost, 01:05:29.920 |
very generically by these different networks to say, 01:05:39.620 |
do you see the value of seeing things you don't like, 01:05:43.560 |
not you disagree with, because it's very difficult 01:05:47.600 |
which is the stuff that's important for you to consider 01:06:00.640 |
you may not want to, it might not make you feel good 01:06:12.700 |
- No, we're engaged in some moronic back and forth 01:06:17.200 |
where I have no idea why people who are capable 01:06:25.960 |
are having us in these incredibly low level discussions. 01:06:39.540 |
for a different thing and they are pushing those people 01:06:42.420 |
- They're optimizing for things we can't see. 01:06:54.160 |
political control or the fact that they're doing business 01:06:58.720 |
about all the things that they're going to be 01:07:15.720 |
- You're having a fake conversation with us, guys. 01:07:19.880 |
I do not wish to be part of your fake conversation. 01:07:26.720 |
You know high availability like nobody's business. 01:07:33.160 |
- So you think just because they can do incredible work 01:07:38.280 |
they can also deal with some of these difficult questions 01:07:54.320 |
And I've heard stories inside of Facebook and Apple. 01:07:58.440 |
We're not, we're engaged, they're engaging us 01:08:08.100 |
Why is every piece of hardware that I purchase 01:08:11.840 |
in tech space equipped as a listening device? 01:08:15.560 |
Where's my physical shutter to cover my lens? 01:08:25.080 |
How much would it cost to have a security model? 01:08:29.780 |
Why is my indicator light software controlled? 01:08:35.240 |
that the light is on by putting it as something 01:08:42.900 |
at some difficulty to yourselves as listening devices 01:08:53.100 |
- These discussions are happening about privacy. 01:08:55.320 |
Is it a more difficult thing than you're giving credit for? 01:09:03.560 |
Why do I not have controls over my own levers? 01:09:07.120 |
Just have a really cute UI where I can switch, 01:09:12.960 |
- But you think that there are some deliberate choices 01:09:12.960 |
The vector does not collapse onto either axis. 01:09:29.080 |
that intention is completely absent is a child. 01:09:35.960 |
And like many things you've said is gonna make me-- 01:09:49.720 |
- I was obsessively looking through your feed on Twitter 01:09:58.240 |
- By the way, that feed is Eric R. Weinstein on Twitter. 01:10:06.620 |
- Why did I find it enjoyable or what was I seeing? 01:10:14.800 |
I know you've got all these interesting people. 01:10:16.480 |
I'm just some guy who's sort of a podcast guest. 01:10:18.880 |
- Sort of a podcast, you're not even wearing a tie. 01:10:30.560 |
for a dopamine rush, so short-term and long-term. 01:10:36.500 |
I don't honestly know what I'm doing to reach you. 01:10:41.240 |
They're representing ideas which feel common sense to me 01:10:47.880 |
so it's kind of like the intellectual dark web folks. 01:10:52.120 |
These folks, from Sam Harris to Jordan Peterson to yourself, 01:10:58.660 |
are saying things where it's like you're saying, 01:11:01.080 |
look, there's an elephant and he's not wearing any clothes. 01:11:03.980 |
And I say, yeah, yeah, let's have more of that conversation. 01:11:14.720 |
I'm very worried we've got an election in 2020. 01:11:25.320 |
And I don't want the destruction of our institutions. 01:11:28.360 |
They all seem hell-bent on destroying themselves. 01:11:40.560 |
that this is falling to a tiny group of people 01:11:44.600 |
who are willing to speak out without getting so freaked out 01:11:48.720 |
that everything they say will be misinterpreted 01:11:50.760 |
and that their lives will be ruined through the process. 01:11:53.000 |
I mean, I think we're in an absolutely bananas period 01:11:57.920 |
to such a tiny number of shoulders to shoulder this way. 01:12:01.120 |
- So I have to ask you, on the capitalism side, 01:12:05.840 |
you mentioned that technology is killing capitalism, 01:12:08.160 |
or it has effects that are, well, not unintended, 01:12:21.240 |
the effect of even then artificial intelligence 01:12:28.200 |
and what you think is the way to alleviate that, 01:12:31.500 |
whether the Andrew Yang presidential candidate 01:12:38.680 |
How do we fight off the negative effects of technology? 01:12:48.500 |
A human being has a worker; the worker is a different object, right? 01:12:53.960 |
So if you think about object-oriented programming 01:13:01.840 |
We're talking about the fact that for a period of time, 01:13:04.400 |
the worker that a human being has was in a position 01:13:17.020 |
One is as a worker and the other is as a soul, 01:13:20.740 |
and the soul needs sustenance, it needs dignity, 01:13:25.320 |
As long as your means of support is not highly repetitive, 01:13:41.240 |
you are in the crosshairs of for loops and while loops, 01:13:45.960 |
and that's what computers excel at, repetitive behavior. 01:13:52.440 |
that have never happened through combinatorial possibilities 01:13:54.840 |
but as long as it has a looped characteristic to it, 01:13:58.100 |
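The object-oriented framing above can be made concrete with a small sketch. This is my hypothetical illustration, not code from the conversation: the point is composition ("has-a") rather than inheritance ("is-a") — a human being has a worker, the worker is a different object, and automation targets only the worker's repetitive, loop-like tasks, not the soul's needs.

```python
# Hypothetical sketch of the "has-a" reading of Weinstein's analogy.
# A human is not a worker; a human HAS a worker (composition), and
# automation targets only the worker's repetitive tasks.

class Worker:
    """The economic role a human holds: a bundle of (task, repetitive?) pairs."""
    def __init__(self, tasks):
        self.tasks = tasks

    def automatable(self):
        # Repetitive tasks are "in the crosshairs of for loops and while loops."
        return [name for name, repetitive in self.tasks if repetitive]

class Soul:
    """The part with non-economic needs that automation cannot service."""
    needs = ("sustenance", "dignity", "meaning")

class Human:
    # Composition, not inheritance: the human and the worker are
    # different objects, so losing the worker need not erase the soul.
    def __init__(self, tasks):
        self.worker = Worker(tasks)
        self.soul = Soul()

h = Human([("stamp forms", True), ("mediate a dispute", False)])
print(h.worker.automatable())  # -> ['stamp forms']
```

The design choice matters for the argument: if `Human` inherited from `Worker`, automating the worker would mean automating the human; with composition, the worker can lose economic value while the soul's needs remain.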
We are seeing a massive push towards socialism 01:14:02.960 |
because capitalists are slow to address the fact 01:14:07.960 |
that a worker may not be able to make claims. 01:14:10.880 |
A relatively undistinguished median member of our society 01:14:15.640 |
still has needs to reproduce, needs for dignity. 01:14:20.640 |
And when capitalism abandons the median individual 01:14:25.520 |
or the bottom 10th or whatever it's going to do, 01:14:35.400 |
aren't sufficiently capitalistic to understand this. 01:14:38.080 |
You really want to court authoritarian control 01:14:45.100 |
that people may not be able to defend themselves 01:14:47.300 |
in the marketplace because the marginal product 01:14:50.080 |
of their labor is too low to feed their dignity as a soul. 01:14:55.040 |
So my great concern is that our free society has to do 01:15:02.320 |
I remember looking down from my office in Manhattan 01:15:16.440 |
So my complaint is first of all, not with the socialists 01:15:20.880 |
but with the capitalists, which is you guys are being idiots. 01:15:24.520 |
You're courting revolution by continuing to harp 01:15:37.940 |
that there's nothing that ties together our need 01:15:47.580 |
because it may have been a temporary phenomenon. 01:15:49.460 |
So check out my article on anthropic capitalism 01:15:55.480 |
I think people are late getting the wake-up call 01:16:01.860 |
because I don't want this done under authoritarian control. 01:16:11.860 |
in order to have a family is failing at a personal level. 01:16:15.260 |
I mean, what a disgusting thing that we're saying. 01:16:46.460 |
but I gotta tell you, I'm pretty disappointed with my team. 01:16:59.620 |
is gonna have to be coupled to hyper socialism. 01:17:22.700 |
You tweeted today a simple, quite insightful equation 01:17:27.140 |
saying imagine that for every unit F of fame you picked up 01:17:35.660 |
So I imagine S and H are dependent on your path to fame 01:17:42.460 |
when you have like 280 characters to explain yourself. 01:17:47.020 |
- So you mean that that's not a mathematical-- 01:17:53.380 |
because I still have a mathematician's desire for precision. 01:18:00.080 |
that there is a law that has those variables in it. 01:18:06.700 |
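Read literally — and this is a loose reconstruction of the tweet's joke, not anything written out in the conversation — the "law" would be a pair of linear relations whose coefficients depend on the path taken to fame:

```latex
\Delta\,\mathrm{status} = S(\mathrm{path})\,\Delta F, \qquad
\Delta\,\mathrm{hatred} = H(\mathrm{path})\,\Delta F
```

which captures the point raised above: $S$ and $H$ are not universal constants but depend on how the fame was acquired.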
So how do you yourself optimize that equation 01:18:22.100 |
compassion, these things are really important. 01:18:24.460 |
And I have a pretty spectrumy kind of approach to analysis. 01:18:30.460 |
I can go full Rain Man on you at any given moment. 01:18:39.180 |
But when you see me coding or you see me doing mathematics, 01:18:57.380 |
it's sort of back to us as a worker and us as a soul. 01:19:00.660 |
Many of us are optimizing one at the expense of the other. 01:19:11.720 |
And I struggle with just how much pain people are in. 01:19:15.860 |
And if there's one message I would like to push out there, 01:19:27.500 |
it's nobody else's job to do your struggle for you. 01:19:30.780 |
Now with that said, if you're struggling and you're trying 01:19:33.460 |
and you're trying to figure out how to better yourself 01:19:35.540 |
and where you've failed, where you've let down your family, 01:19:38.100 |
your friends, your workers, all this kind of stuff, 01:19:46.380 |
I have a lifelong relationship with failure and success. 01:19:52.780 |
where both haven't been present in one form or another. 01:20:01.200 |
I'm about to go, you know, do a show with Sam Harris. 01:20:08.500 |
I'm always trying to figure out how to make sure 01:20:12.300 |
And that's why I'm doing this podcast, you know, 01:20:22.900 |
or your lovers or your family members success. 01:20:25.800 |
As long as you're in there and you're picking yourself up, 01:20:29.680 |
recognize that this new situation with the economy 01:20:33.620 |
that doesn't have the juice to sustain our institutions 01:20:37.060 |
has caused the people who've risen to the top 01:20:39.700 |
of those institutions to get quite brutal and cruel. 01:20:54.940 |
and you're struggling and you're trying to figure out 01:21:08.220 |
Nobody that I've met is checking all the boxes.