Liv Boeree: Poker, Game Theory, AI, Simulation, Aliens & Existential Risk | Lex Fridman Podcast #314
Chapters
0:00 Introduction
0:58 Poker and game theory
8:01 Dating optimally
12:53 Learning
21:05 Daniel Negreanu
26:13 Phil Hellmuth
28:46 Greatest poker player ever
33:04 Bluffing
43:25 Losing
52:41 Mutually assured destruction
57:35 Simulation hypothesis
1:14:13 Moloch
1:43:25 Beauty
1:55:33 Quantifying life
2:15:43 Existential risks
2:34:06 AI
2:43:57 Energy healing
2:51:07 Astrophysics
2:54:01 Aliens
3:19:43 Advice for young people
3:21:48 Music
3:29:37 Meaning of life
00:00:00.000 |
evolutionarily, if we see a lion running at us, 00:00:03.040 |
we didn't have time to calculate the lion's kinetic energy 00:00:06.360 |
and is it optimal to go this way or that way? 00:00:16.640 |
this is not something that you ever evolved to do, 00:00:19.320 |
and yet you're in that same flight or fight response. 00:00:25.260 |
to basically learn how to meditate in the moment 00:00:28.280 |
and calm yourself so that you can think clearly. 00:00:30.680 |
- The following is a conversation with Liv Boeree, 00:00:35.760 |
formerly one of the best poker players in the world, 00:00:44.240 |
on topics of game theory, physics, complexity, and life. 00:00:58.580 |
What role do you think luck plays in poker and in life? 00:01:06.980 |
- The longer you play, the less influence luck has, 00:01:22.820 |
If you and I sat and played 10 hands right now, 00:01:31.760 |
then I'll probably win like over 98, 99% of the time. 00:01:40.160 |
The betting strategy that this individual does 00:01:45.940 |
- Against any individual over time, the better player, 00:01:49.540 |
So what does that mean to make a better decision? 00:01:51.300 |
Well, to get into the real nitty gritty already, 00:02:00.020 |
familiar with like Nash equilibria, that term, right? 00:02:02.540 |
So there are these game theory optimal strategies 00:02:17.260 |
but back in, you know, when I was playing all the time, 00:02:20.020 |
I would study these game theory optimal solutions 00:02:33.220 |
it's actually, it's a loss minimization thing 00:02:37.340 |
Your best bet is to try and play a sort of similar style. 00:02:42.340 |
You also need to try and adopt this loss minimization. 00:02:46.180 |
But because I've been playing much longer than you, 00:02:59.500 |
and then deviating from this game theory optimal strategy 00:03:05.100 |
- Can you define game theory and Nash equilibria? 00:03:08.660 |
Can we try to sneak up to it in a bunch of ways? 00:03:10.900 |
Like what's a game theory framework of analyzing poker, 00:03:25.060 |
I mean, it's technically a branch of economics, 00:03:27.660 |
but it also applies to like wider decision theory. 00:03:35.780 |
it's these like little payoff matrices and so on. 00:03:38.740 |
But it's essentially just like study of strategies 00:03:50.460 |
And what that means is when you're in a Nash equilibrium, 00:04:02.780 |
assuming your opponent is also doing the same thing. 00:04:06.700 |
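For reference, the condition being described here has a compact standard form; a strategy profile is a Nash equilibrium when no player can gain by unilaterally deviating (here $u_i$ is player $i$'s payoff and $s_{-i}^*$ is everyone else's equilibrium strategy):

```latex
% Nash equilibrium: no player gains by unilaterally deviating.
u_i(s_i^*, s_{-i}^*) \;\ge\; u_i(s_i, s_{-i}^*)
\qquad \text{for every player } i \text{ and every alternative strategy } s_i
```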
If we're both playing in a game theory optimal strategy, 00:04:12.140 |
now we're putting ourselves at a disadvantage. 00:04:18.860 |
Like if we were to start playing rock, paper, scissors, 00:04:27.860 |
What would your sort of optimal strategy be, do you think? 00:04:35.220 |
I would probably try to be as random as possible. 00:04:43.580 |
You wanna, because you don't know anything about me. 00:04:46.060 |
You don't want to give anything away about yourself. 00:04:59.660 |
but I would probably just randomize back too. 00:05:07.660 |
where we're both playing this like unexploitable strategy. 00:05:13.060 |
that I'm playing rock a little bit more often than I should. 00:05:16.420 |
- Yeah, you're the kind of person that would do that. 00:05:37.860 |
and you're not deviating and exploiting my mistake. 00:05:41.340 |
So you'd wanna start throwing paper a bit more often 00:05:43.820 |
in whatever you figure is the right sort of percentage 00:05:45.980 |
of the time that I'm throwing rock too often. 00:05:54.700 |
but it's not always the maximally profitable thing 00:06:02.580 |
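As a rough illustration of that trade-off (a toy sketch with made-up opponent frequencies, not anything from the conversation): the uniform random mix earns exactly zero against any opponent, while shifting toward paper exploits someone who throws rock too often.

```python
# Rock-paper-scissors: expected value per round of a strategy vs. an opponent mix.
# The uniform (game theory optimal) mix earns 0 against anything, i.e. it's
# unexploitable but not maximally profitable; a deviation toward paper exploits
# a (hypothetical) opponent who over-plays rock.

BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def ev(my_mix, opp_mix):
    """Expected payoff per round (+1 win, -1 loss, 0 tie)."""
    total = 0.0
    for m, pm in my_mix.items():
        for o, po in opp_mix.items():
            if BEATS[m] == o:
                total += pm * po   # I win
            elif BEATS[o] == m:
                total -= pm * po   # I lose
    return total

uniform    = {"rock": 1/3, "paper": 1/3, "scissors": 1/3}
rock_heavy = {"rock": 0.40, "paper": 0.30, "scissors": 0.30}  # hypothetical leak
exploit    = {"rock": 0.00, "paper": 1.00, "scissors": 0.00}  # pure best response

print(ev(uniform, rock_heavy))   # ~0.0: unexploitable, but leaves profit on the table
print(ev(exploit, rock_heavy))   # +0.10 per round by throwing paper more often
```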
So that's kind of then how it works in poker, 00:06:10.660 |
nowadays they study, the games changed so much 00:06:12.940 |
and I think we should talk about how it sort of evolved. 00:06:15.460 |
But nowadays like the top pros basically spend all their time 00:06:26.180 |
sort of doing billions of fictitious self-play hands. 00:06:52.580 |
And then you wanna try and memorize what these are. 00:07:04.500 |
- Yeah, those kinds of simulations incorporate 00:07:06.420 |
all the betting strategies and everything else like that. 00:07:08.540 |
So as opposed to some kind of very crude mathematical model 00:07:16.380 |
it's including everything else too, the game theory of it. 00:07:26.340 |
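A minimal sketch of what that "fictitious self-play" idea looks like, shrunk down to rock-paper-scissors (real poker solvers run counterfactual regret minimization over full betting trees; this is only the toy version, with illustrative function names): two copies of a regret-matching learner play each other, and their average strategy drifts toward the unexploitable one-third mix.

```python
import random

# Toy self-play: regret matching on rock-paper-scissors.
N = 3  # rock, paper, scissors
PAYOFF = [[0, -1, 1], [1, 0, -1], [-1, 1, 0]]  # row player's payoff

def strategy(regrets):
    """Play in proportion to positive regret; uniform if no positive regret yet."""
    pos = [max(r, 0.0) for r in regrets]
    s = sum(pos)
    return [p / s for p in pos] if s > 0 else [1.0 / N] * N

def sample(probs):
    return random.choices(range(N), weights=probs)[0]

reg_a, reg_b = [0.0] * N, [0.0] * N
avg_a = [0.0] * N
T = 100_000

for _ in range(T):
    sa, sb = strategy(reg_a), strategy(reg_b)
    a, b = sample(sa), sample(sb)
    for m in range(N):  # regret for not having played m instead
        reg_a[m] += PAYOFF[m][b] - PAYOFF[a][b]
        reg_b[m] += PAYOFF[m][a] - PAYOFF[b][a]
    for m in range(N):
        avg_a[m] += sa[m]

print([round(p / T, 3) for p in avg_a])  # ≈ [0.333, 0.333, 0.333]
```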
the super high stakes tournaments or tough cash games, 00:07:30.700 |
you're gonna get eaten alive in the long run. 00:07:33.260 |
But of course you could get lucky over the short run. 00:07:35.140 |
And that's where this like luck factor comes in 00:07:40.420 |
If luck didn't, if there wasn't this random element 00:07:42.580 |
and there wasn't the ability for worse players 00:07:45.260 |
to win sometimes, then poker would fall apart. 00:07:48.460 |
The same reason people don't play chess professionally 00:07:51.500 |
for money against, you don't see people going 00:07:57.860 |
because you know there's very little luck in chess, 00:08:01.420 |
- Have you seen "Beautiful Mind," that movie? 00:08:11.660 |
- The way they illustrated it is they're trying 00:08:13.940 |
to pick up a girl at a bar and there's multiple girls. 00:08:20.100 |
I don't remember the details, but I remember- 00:08:21.980 |
- Don't you like then speak to her friends first? 00:08:25.700 |
I mean, it's classic pickup artist stuff, right? 00:08:28.220 |
- And they were trying to correlate that somehow, 00:08:30.780 |
that being an optimal strategy, game theoretically. 00:08:39.740 |
I mean, there's probably an optimal strategy. 00:08:41.900 |
Is it, does that mean that there's an actual Nash equilibrium 00:08:51.740 |
- So where it's an optimal dating strategy where you, 00:08:57.500 |
you know you've got like a set of a hundred people 00:08:59.900 |
you're gonna look through and after how many do you, 00:09:04.220 |
now after that, after going on this many dates out of a 100, 00:09:09.180 |
okay, the next best person I see, is that the right one? 00:09:24.860 |
Yeah, so, but it's funny under those strict constraints, 00:09:36.460 |
that is better than anyone you've seen before. 00:10:37.140 |
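This is the classic "secretary problem" setup, and a small simulation of the rule being described (a sketch under those strict constraints: 100 candidates in random order, no going back) shows why the usual answer is to look at roughly the first 37, about 100/e, and then commit to the next one who beats them all.

```python
import random

# Optimal stopping ("secretary problem") simulation: look at the first k
# candidates without committing, then take the next one who beats everyone
# seen so far. With n = 100, k ≈ 37 picks the single best candidate ~37% of the time.

def run_once(n, k):
    ranks = list(range(n))                       # 0 is the best candidate
    random.shuffle(ranks)
    best_seen = min(ranks[:k]) if k > 0 else n   # benchmark from the look phase
    for r in ranks[k:]:
        if r < best_seen:
            return r == 0                        # commit to the first one who beats it
    return ranks[-1] == 0                        # never found better: stuck with the last

def success_rate(n=100, k=37, trials=100_000):
    return sum(run_once(n, k) for _ in range(trials)) / trials

print(success_rate(k=37))   # ≈ 0.37
print(success_rate(k=10))   # noticeably lower: stopping too early loses value
```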
- I've been on dating apps, but I've never used them. 00:10:45.260 |
'Cause there aren't a lot of people that use social, 00:10:47.380 |
like dating apps are on there for a long time. 00:10:51.020 |
So the interesting aspect is like, all right, 00:10:58.740 |
before it actually starts affecting your mind negatively 00:11:12.780 |
of finding somebody that's gonna make you happy 00:11:18.260 |
I wish they would be able to release that data. 00:11:22.220 |
I think they ran a huge, huge study on all of their-- 00:11:28.060 |
I think there's a lot of opportunity for dating apps 00:11:49.460 |
- Goodreads just lists books that you've read 00:11:54.460 |
and allows you to comment on the books you read 00:11:58.940 |
But it's a giant social network of people reading books. 00:12:01.300 |
And that seems to be a much better database of interests. 00:12:04.620 |
Of course, it constrains you to the books you're reading, 00:12:06.580 |
but that really reveals so much more about the person. 00:12:16.020 |
Also, like the kind of places, people you're curious about, 00:12:24.860 |
Are you into Ayn Rand or are you into Bernie Sanders? 00:12:48.580 |
of probably the type of people you'd be looking for. 00:12:50.620 |
- Well, at least be able to fake reading books. 00:12:55.860 |
- Yeah, game the dating app by feigning intellectualism. 00:12:59.060 |
- Can I admit something very horrible about myself? 00:13:50.420 |
there's kind of these advanced versions of courses 00:14:00.580 |
I think a lot of it was also writing, written. 00:14:04.940 |
You have like AP Physics, AP Computer Science, 00:14:13.460 |
but I think Shakespeare was a part of that, but I- 00:14:16.980 |
- And you game, the point is you gamified it. 00:14:19.180 |
- Gamified, well, entirely, I was into getting As. 00:14:33.980 |
The deepest learning I've done has been outside of school, 00:14:36.260 |
with a few exceptions, especially in grad school, 00:14:43.740 |
it was outside of getting the A for the course. 00:14:46.220 |
The best stuff I've ever done is when you read the chapter 00:14:49.660 |
and you do many of the problems at the end of the chapter, 00:14:52.340 |
which is usually not what's required for the course, 00:14:58.820 |
If you go back now and you look at like biology textbook 00:15:11.980 |
plus they have practice problems of increasing difficulty 00:15:19.940 |
- I got through my entire physics degree with one textbook 00:15:26.300 |
that they told us at the beginning of the first year, 00:15:28.660 |
buy this, but you're gonna have to buy 15 other books 00:15:33.380 |
and I was like, every time I would just check 00:15:36.700 |
and it did, and I think I only bought like two or three extra 00:15:39.820 |
and thank God, 'cause they're super expensive textbooks, 00:15:46.220 |
you get the right one, it's just like a manual for, 00:15:52.940 |
this is the tyranny of having exams and metrics. 00:16:04.020 |
and then like sort of dust off my shoulders afterwards 00:16:06.380 |
when I get a good grade or be annoyed at myself 00:16:08.140 |
when I didn't, but yeah, you're absolutely right 00:16:10.860 |
in that the actual, how much of that physics knowledge 00:16:14.420 |
I've retained, like I've, I learned how to cram and study 00:16:24.700 |
I mean, yes and no, but really like nothing makes you learn 00:16:29.100 |
a topic better than when you actually then have 00:16:33.180 |
You know, like I'm trying to wrap my teeth around this, 00:16:38.140 |
and there's no exam at the end of it that I can gamify. 00:16:43.060 |
There's no way to gamify and sort of like shortcut 00:16:47.220 |
from like deep foundational levels to then build upon it 00:16:52.340 |
And like, you're about to go and do some lectures, right? 00:17:00.620 |
that you got through when you were studying for an exam 00:17:06.820 |
especially the kind of stuff you do on YouTube, 00:17:12.780 |
You have to think through what is the core idea here. 00:17:17.100 |
And when you do the lectures live especially, 00:17:23.900 |
That is the luxury you get if you're recording a video 00:17:30.180 |
But it definitely is a luxury you shouldn't lean on. 00:17:39.320 |
And you realize, oh, you've gamified this system 00:17:43.400 |
because you're not really thinking deeply about stuff. 00:17:46.760 |
You're through the edit, both written and spoken, 00:17:57.640 |
So live teaching, or at least recording video 00:18:04.680 |
And I think it's the most honest way of doing it, 00:18:18.520 |
I do think people talk about high school and college 00:18:28.320 |
But looking back, of course I did a lot of those things. 00:18:33.320 |
No, yes, but it's also a time when you get to read textbooks 00:18:39.240 |
or read books or learn with all the time in the world. 00:18:47.740 |
You don't have these responsibilities of laundry 00:18:54.920 |
and having to pay for mortgage or all that kind of stuff, 00:19:04.060 |
In most cases, there's just so much time in the day 00:19:07.360 |
for learning, and you don't realize it at the time 00:19:15.480 |
But you never get a chance to do this kind of learning, 00:19:21.080 |
unless later in life you really make a big effort out of it. 00:19:24.640 |
You get, basically your knowledge gets solidified. 00:19:34.320 |
Like some people like knowledge is not something 00:19:39.320 |
that they think is fun, but if that's the kind of thing 00:19:42.320 |
that you think is fun, that's the time to have fun 00:19:44.880 |
and do the drugs and drink and all that kind of stuff. 00:19:46.920 |
But the learning, just going back to those textbooks, 00:19:59.240 |
with their TikTok and their-- - Well, not even that, 00:20:08.100 |
- Yeah, but they're not, they are using that, 00:20:11.100 |
but college is still very, there's a curriculum. 00:20:16.100 |
I mean, so much of school is about rigorous study 00:20:19.640 |
of a subject and still on YouTube, that's not there. 00:20:23.960 |
YouTube has, Grant Sanderson talks about this, 00:20:33.840 |
"I just take really cool concepts and I inspire people, 00:20:41.740 |
"you should do the textbook, you should do that." 00:20:45.240 |
And there's still the textbook industrial complex 00:20:49.020 |
that charges like $200 for a textbook and somehow, 00:20:54.960 |
- Well, they're like, "Oh, sorry, new edition, 00:20:58.920 |
"edition 14.6, sorry, you can't use 14.5 anymore." 00:21:08.120 |
I'm gonna get a chance to talk to him on this podcast 00:21:41.540 |
you basically develop this gut feeling about, 00:21:49.820 |
You look at the board and somehow from the fog 00:22:01.320 |
you can't really put a finger on exactly why, 00:22:04.480 |
but it just comes from your gut feeling or no. 00:22:10.560 |
So gut feelings are definitely very important. 00:22:19.060 |
You've got your sort of logical linear voice in your head, 00:22:35.860 |
You know, often they do some kind of inspired play 00:22:40.420 |
and they wouldn't really be able to explain it. 00:22:47.560 |
but it was more just because no one had the language 00:22:50.120 |
with which to describe what optimal strategies were 00:22:52.340 |
because no one really understood how poker worked. 00:22:54.260 |
This was before, you know, we had analysis software, 00:22:59.860 |
I guess some people would write down their hands 00:23:02.660 |
but there was no way to assimilate all this data 00:23:05.900 |
But then, you know, when computers became cheaper 00:23:11.500 |
where it would like automatically save your hand histories, 00:23:14.220 |
now all of a sudden you kind of had this body of data 00:23:19.580 |
And so that's when people started to see, you know, 00:23:24.220 |
And so what that meant is the role of intuition 00:23:33.920 |
And it went more into, as we talked before about, 00:23:40.600 |
But also, as I said, like game theory optimal 00:23:43.300 |
is about loss minimization and being unexploitable. 00:23:47.740 |
But if you're playing against people who aren't, 00:23:49.620 |
because no person, no human being can play perfectly 00:23:51.620 |
game theory optimal in poker, not even the best AIs. 00:24:04.620 |
So when, yeah, when you're playing this unexploitable style, 00:24:08.460 |
but when your opponents start doing something, 00:24:11.440 |
you know, suboptimal that you want to exploit, 00:24:14.160 |
well now that's where not only your like logical brain 00:24:17.360 |
will need to be thinking, oh, okay, I know I have this, 00:24:19.960 |
my, I'm in the sort of top end of my range here 00:24:23.940 |
So that means I need to be calling X percent of the time 00:24:30.480 |
But then sometimes you'll have this gut feeling 00:24:34.000 |
that will tell you, you know, you know what, this time, 00:24:37.320 |
I know mathematically I'm meant to call now, you know, 00:24:40.380 |
I've got, I'm in the sort of top end of my range 00:24:49.960 |
Like they're beating you, maybe your hand is worse. 00:24:56.940 |
this is where the last remaining art in poker, 00:24:59.620 |
the fuzziness is like, do you listen to your gut? 00:25:13.120 |
I mean, I can't speak for how much he's studying 00:25:17.760 |
I think he has, like he must be to still be keeping up, 00:25:26.360 |
he's seen so many hands of poker in the flesh. 00:25:29.220 |
He's seen so many people, the way they behave 00:25:31.900 |
when the chips are, you know, when the money's on the line 00:25:33.760 |
and you've got him staring you down in the eye, 00:25:42.200 |
- And he talks a lot, which is an interactive element, 00:25:45.040 |
which is he's getting stuff from other people. 00:25:52.960 |
this extra layer of information that others can't. 00:25:55.920 |
Now that said though, he's good online as well. 00:25:59.680 |
would he be beating the top cash game players online? 00:26:07.280 |
and he's got that additional layer of information, 00:26:10.680 |
but he knows what to do with it still so well. 00:26:17.080 |
And he's one of my favorite people to talk about 00:26:19.920 |
in terms of, I think he might have cracked the simulation. 00:26:28.000 |
- In more ways than one, he's cracked the simulation, 00:26:34.560 |
and I love you, Phil, I'm not in any way knocking you. 00:26:50.280 |
going deep or winning these huge field tournaments, 00:27:45.240 |
with a frequency far beyond what the real percentages are. 00:28:05.240 |
everybody else is stupid because he was obviously 00:28:08.320 |
- Meant to win, if that silly thing hadn't happened. 00:28:18.080 |
at the World Series, only at the World Series of Poker, 00:28:32.760 |
But he's clearly good because he's still winning. 00:28:40.640 |
and how cards, a randomly shuffled deck of cards come out. 00:28:44.320 |
I don't know what it is, but he's doing it right still. 00:28:46.680 |
- Who do you think is the greatest of all time? 00:29:04.960 |
He's got, he truly, I mean, he is one of the greatest. 00:29:11.000 |
He's certainly the greatest at the World Series of Poker. 00:29:22.840 |
And this like, just through sheer force of will, 00:29:28.640 |
And I think it's something that should be studied 00:29:32.000 |
- Yeah, there might be some actual game theoretical wisdom. 00:30:04.480 |
to manipulate the things the way you want them to go 00:30:09.760 |
- Do you think Phil Hellmuth understands them? 00:30:23.120 |
- Phil Hellmuth wrote a book about positivity. 00:30:28.680 |
- And I think it's about sort of manifesting what you want 00:30:36.560 |
and in your ability to win, like eyes on the prize. 00:30:50.160 |
too much out of the scene for the last few years to really, 00:30:56.600 |
He's so incredibly intimidating to play against. 00:31:00.480 |
I've only played against him a couple of times, 00:31:05.080 |
oof, no one's made me sweat harder than Phil Ivey. 00:31:10.440 |
That was actually one of the most thrilling moments 00:31:12.720 |
was it was in a Monte Carlo in a high roller. 00:31:24.400 |
And I felt like he was just scouring my soul. 00:31:36.400 |
there's a chance I was bluffing with the best hand, 00:31:43.320 |
It was truly one of the deep highlights of my career. 00:31:52.640 |
Because especially as I felt like I was one of the worst 00:31:57.880 |
unless I had a really solid plan that I was now like 00:32:01.160 |
advertising, oh look, I'm capable of bluffing Phil Ivey, 00:32:09.040 |
which is like, I'm a scared girl playing a high roller 00:32:15.640 |
But isn't there layers to this, like psychological warfare, 00:32:31.080 |
You know, the goal of poker is to be as deceptive as possible 00:32:34.680 |
about your own strategies while elucidating as much out of 00:32:42.080 |
particularly if they're people who you consider 00:32:44.800 |
any information I give them is going into their little 00:32:49.240 |
I assume it's going to be calculated and used well. 00:32:51.760 |
So I have to be really confident that my like meta gaming 00:33:00.640 |
So it's better just to keep that little secret to myself 00:33:08.360 |
- So, yeah, I mean, maybe actually, let me ask, 00:33:10.440 |
like, what did it feel like with Phil Ivey or anyone else 00:33:18.840 |
So a lot of money on the table and maybe, I mean, 00:33:26.000 |
but also some uncertainty in your mind and heart about, 00:33:31.960 |
well, maybe I miscalculated what's going on here, 00:33:43.800 |
you know, running a, I mean, any kind of big bluff 00:34:04.120 |
I think it was the first time I'd ever played a 20, 00:34:13.000 |
because it was much bigger than I would ever normally play. 00:34:18.720 |
And now I was sitting there against all the big boys, 00:34:21.240 |
you know, the Negreanus, the Phil Iveys and so on. 00:34:23.600 |
And then to like, each time you put the bets out, 00:34:29.720 |
you know, you put another bet out, your card. 00:34:36.280 |
that would make my hand very, very strong and therefore win. 00:34:39.160 |
But most of the time, those cards don't come. 00:34:40.920 |
- So it's a semi-bluff 'cause you're representing, 00:34:43.880 |
are you representing that you already have something? 00:34:46.960 |
- So I think in this scenario, I had a flush draw. 00:34:51.120 |
So I had two clubs, two clubs came out on the flop, 00:34:55.920 |
and then I'm hoping that on the turn in the river, 00:35:00.960 |
I could hit a club and then I'll have the best hand 00:35:04.600 |
And so I can keep betting and I'll want them to call, 00:35:07.480 |
but I've also got the other way of winning the hand 00:35:11.920 |
I can keep betting and get them to fold their hand. 00:35:14.720 |
And I'm pretty sure that's what the scenario was. 00:35:18.000 |
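For rough context on the math of that draw: with two clubs in hand and two on the flop, nine clubs remain among the 47 unseen cards, so the draw completes by the river only about 35 percent of the time, which is why the line also leans on getting opponents to fold.

```latex
% Probability a flop flush draw (9 outs among 47 unseen cards) completes by the river
P(\text{hit by river}) = 1 - \frac{38}{47}\cdot\frac{37}{46} \approx 0.35
```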
So I had some future equity, but it's still, you know, 00:35:25.320 |
because I'm, you know, the pot is now getting bigger 00:35:27.920 |
And in the end, like I jam all in on the river, 00:35:34.560 |
this might be the one time I ever get to play a big 25K. 00:35:37.200 |
You know, this was the first time I played one. 00:35:38.680 |
So it was, it felt like the most momentous thing. 00:35:42.000 |
And this was also when I was trying to build myself up, 00:35:43.840 |
you know, build my name, a name for myself in poker. 00:35:50.520 |
Like, I mean, it literally does feel like a form of life 00:35:58.320 |
Are you just like, what are you thinking about? 00:36:02.760 |
- More than a mixture of like, okay, what are the cards? 00:36:07.080 |
what are cards that make my hand look stronger? 00:36:10.080 |
Which cards hit my perceived range from his perspective? 00:36:15.680 |
What's the right amount of bet size to, you know, 00:36:23.280 |
But I think in reality, because I was so scared, 00:36:26.960 |
there's a certain threshold of like nervousness or stress 00:36:35.320 |
it just like, it feels like a game of wits, basically. 00:36:38.800 |
It's like of nerve, can you hold your resolve? 00:36:41.840 |
And it certainly got by that, like by the river. 00:36:45.880 |
I don't even know if this is a good bluff anymore, 00:36:55.200 |
And that's, and it happens in all kinds of decision-making. 00:36:58.920 |
I think anything that's really, really stressful. 00:37:00.800 |
I can imagine someone in like an important job interview, 00:37:06.560 |
like Bridgewater style where they ask these really hard, 00:37:13.440 |
to be able to like subdue your flight or fight response. 00:37:20.360 |
So you can actually, you know, engage that voice 00:37:23.880 |
in your head and do those slow logical calculations. 00:37:33.000 |
And, you know, is it optimal to go this way or that way? 00:37:43.280 |
this is not something that you ever, you know, 00:37:45.960 |
And yet you're in that same flight or fight response. 00:37:51.080 |
to be able to develop, to basically learn how to like 00:37:57.440 |
- But as you were searching for a comparable thing, 00:38:00.800 |
it's interesting 'cause you just made me realize 00:38:03.360 |
that bluffing is like an incredibly high stakes form 00:38:15.240 |
In the context of game, it's not a negative kind of lying. 00:38:20.400 |
You're representing something that you don't have. 00:38:36.120 |
I was thinking when Hitler was lying to Stalin 00:38:44.640 |
And so you're talking to a person like your friends 00:38:53.960 |
But meanwhile, the whole time you're building up troops 00:39:25.600 |
hence one of the biggest wars in human history began. 00:39:30.360 |
Stalin for sure thought that this was never going to be, 00:39:34.920 |
that Hitler is not crazy enough to invade the Soviet Union. 00:39:43.040 |
And ideologically, even though there's a tension 00:39:46.280 |
between communism and fascism or national socialism, 00:39:58.680 |
They in theory had a common enemy, which was the West. 00:40:17.680 |
And because of that, everybody in the Soviet Union 00:40:22.240 |
believed it, which is why it was such a huge shock when Kiev was invaded. 00:40:25.840 |
And you hear echoes of that when I traveled to Ukraine, 00:40:32.240 |
It's not just the invasion on one particular border, 00:40:39.480 |
Especially at that time when you thought World War I, 00:40:44.240 |
you realized that that was the war to end all wars. 00:40:52.440 |
to try to take on this monster in the Soviet Union. 00:41:02.320 |
And yeah, but that's a very large scale kind of lie, 00:41:07.320 |
but I'm sure there's in politics and geopolitics, 00:41:19.080 |
But in our personal lives, I don't know how often we, 00:41:23.720 |
I mean, like think of spouses cheating on their partners. 00:41:32.920 |
unfortunately that stuff happens all the time. 00:41:36.240 |
- Or having like multiple families, that one is great. 00:41:39.240 |
When each family doesn't know about the other one 00:41:44.640 |
There's probably a sense of excitement about that too. 00:41:54.200 |
the truth finds a way of coming out, you know? 00:42:00.200 |
Yeah, people, I mean, and that's why I think actually, 00:42:03.520 |
like poker, what's so interesting about poker 00:42:12.480 |
but actually poker players are very honest people. 00:42:15.120 |
I would say they are more honest than the average, 00:42:17.560 |
you know, if you just took a random population sample. 00:42:17.560 |
most people like to have some kind of, you know, 00:42:30.560 |
mysterious, you know, an opportunity to do something 00:42:40.440 |
everyone knows what they're in for, and that's allowed. 00:42:43.400 |
And you get to like really get that out of your system. 00:42:46.280 |
And then also like poker players learned that, you know, 00:42:51.000 |
I would play in a huge game against some of my friends, 00:42:55.400 |
even my partner Igor, where we will be, you know, 00:42:59.680 |
trying to draw blood in terms of winning each money 00:43:02.080 |
off each other and like getting under each other's skin, 00:43:04.040 |
winding each other up, doing the craftiest moves we can. 00:43:17.280 |
Because that, and that's why games are so great, 00:43:21.080 |
like this competitive urge that, you know, most people have. 00:43:28.720 |
Like we talked about bluffing when it worked out. 00:43:37.280 |
fortunately I've never gone broke through poker. 00:43:51.840 |
- I feel like you haven't lived unless you've gone broke. 00:43:58.960 |
I've sort of lived through it vicariously through him 00:44:11.560 |
It really, you know, varies from person to person. 00:44:28.120 |
Or you see everything in like ROI or expected hourlies 00:44:37.240 |
is calibrate the strength of my emotional response 00:44:45.560 |
you have a huge emotional dramatic response to a tiny loss. 00:44:53.240 |
and you're so dead inside that you don't even feel it. 00:45:04.240 |
I mean, I've had times where I've lost, you know, 00:45:09.040 |
It's, you know, especially if I got really unlucky 00:45:13.960 |
where I've gone away and like, you know, kicked the wall, 00:45:17.560 |
punched a wall, I like nearly broke my hand one time. 00:45:19.920 |
Like I'm a lot less competitive than I used to be. 00:45:30.880 |
And I think that sort of slowly waned as I've gotten older. 00:45:35.720 |
- I don't know if others would say the same, right? 00:45:43.000 |
It's like, he's a lot less competitive than he used to be. 00:45:53.000 |
Like when, you know, I play a game with my buddies online 00:46:00.800 |
- Thank you for passing on your obsession to me. 00:46:05.840 |
- But I'm terrible and I enjoy playing terribly. 00:46:09.080 |
because that's gonna pull me into your monster 00:46:18.280 |
- You just do the points thing, you know, against the bots. 00:46:23.520 |
there's like a hard one and there's a very hard one. 00:46:35.000 |
as opposed to enjoy building a little world and- 00:46:37.960 |
- Yeah, no, no, there's no time for exploration 00:46:46.440 |
- Yeah, so in order to be able to play a decent game 00:47:00.440 |
there'll be, I could teach you it within a day, honestly. 00:47:05.720 |
And then you'll be ready for the big leagues. 00:47:08.120 |
- Generalizes to more than just Polytopia, but okay. 00:47:08.120 |
- Losing hurts, oh yeah, yes, competitiveness over time. 00:47:24.720 |
about winning when I choose to play something. 00:47:34.960 |
you start to see the world more as a positive something, 00:47:38.040 |
or at least you're more aware of externalities 00:47:47.480 |
and I'm more aware of my own, you know, like, 00:47:50.160 |
if I have a really strong emotional response to losing, 00:47:52.760 |
and that makes me then feel shitty for the rest of the day, 00:47:58.640 |
that that's unnecessary negative externality. 00:48:01.680 |
So I'm like, okay, I need to find a way to turn this down, 00:48:09.680 |
and think about some of the lower points of your life, 00:48:12.160 |
like the darker places you've got in your mind, 00:48:17.120 |
Like, did losing spark the descent into a new thing? 00:48:22.120 |
The descent into darkness, or was it something else? 00:48:29.480 |
were when I was wanting to quit and move on to other things, 00:48:34.480 |
but I felt like I hadn't ticked all the boxes 00:48:40.000 |
Like, I wanted to be the most winningest female player, 00:48:50.960 |
I've won one of these, I've won one of these, 00:48:57.440 |
is a drive of like over-optimization to random metrics 00:49:01.600 |
without much wisdom at the time, but then I carried on. 00:49:09.600 |
than I still actually had the passion to chase it for. 00:49:15.760 |
I played for as long as I did, because who knows, you know, 00:49:24.920 |
- This is it, peak experience, absolute pinnacle 00:49:40.080 |
But the dark times were in the sort of like 2016 to 18, 00:49:59.720 |
I would take the losses harder than I needed to, 00:50:04.880 |
I felt like my life was ticking away and I was like, 00:50:11.720 |
slightly more optimally than her next opponent. 00:50:20.240 |
to be doing something more directly impactful 00:50:26.280 |
And I think it's a thing a lot of poker players, 00:50:45.280 |
He's got this incredible brain, like what to put it to? 00:50:52.120 |
- It's this weird moment where I've just spoken 00:50:55.240 |
with people that won multiple gold medals at the Olympics 00:51:07.960 |
the thought you thought would give meaning to your life. 00:51:11.880 |
But in fact, life is full of constant pursuits of meaning. 00:51:20.240 |
and there's endless bliss, no, it continues going on and on. 00:51:23.980 |
You constantly have to figure out how to rediscover yourself. 00:51:27.840 |
And so for you, like that struggle to say goodbye to poker, 00:51:33.800 |
- There's always a bigger game, that's the thing. 00:51:35.640 |
That's my like motto, it's like, what's the next game? 00:51:41.200 |
because obviously game usually implies zero sum, 00:51:54.680 |
that's a fast track to either completely stagnate 00:52:17.080 |
that if anyone makes a single bet, fires some weapons, 00:52:26.960 |
And if we keep playing these adversarial zero sum games, 00:52:41.120 |
- What do you think about that mutually assured destruction, 00:52:46.360 |
almost to the point of caricaturing game theory idea 00:53:11.720 |
that it was a, you know, it's a Nash equilibrium. 00:53:31.160 |
Where it's basically like, you have a, you know, 00:53:40.640 |
you know, particle splitting or pair particle splitting. 00:53:52.480 |
Because you can only ever be in the universe, 00:54:00.240 |
where you continually make, you know, option A comes in. 00:54:20.800 |
'Cause someone, and you will only find yourself 00:54:22.600 |
in the set of observers that make it down that path. 00:54:29.200 |
- That doesn't mean you're still not gonna be fucked 00:54:33.880 |
No, I'm not advocating like that we're all immortal 00:54:40.120 |
of like these things called observer selection effects, 00:54:42.440 |
which Bostrom, Nick Bostrom talks about a lot. 00:54:47.080 |
but I think that logic could be overextended. 00:54:47.080 |
- Well, no, I mean, it leads you into like solipsism, 00:55:02.160 |
Again, if everyone like falls into solipsism of like, 00:55:10.880 |
But my point is, is that with the nuclear weapons thing, 00:55:14.600 |
there have been at least, I think it's 12 or 11 00:55:24.200 |
and it made weird reflections of some glaciers 00:55:33.840 |
And that put them on high alert, nearly ready to shoot. 00:55:35.720 |
And it was only because the head of Russian military 00:55:39.600 |
happened to be at the UN in New York at the time 00:55:43.360 |
why would they fire now when their guy is there? 00:55:50.240 |
where they didn't then escalate it into firing. 00:55:55.960 |
the person who should be the most famous person on earth, 00:56:01.160 |
like billions of people by ignoring Russian orders to fire 00:56:07.440 |
And it turned out to be, you know, very hard thing to do. 00:56:14.520 |
that we aren't having this kind of like selection effect 00:56:20.800 |
But of course we don't know the actual probabilities 00:56:30.880 |
And it is an absolute moral imperative, if you ask me, 00:56:41.160 |
but it's not like we're in the bottom of a pit. 00:56:42.840 |
You know, if you would like map it topographically, 00:56:46.400 |
it's not like a stable ball at the bottom of a thing. 00:56:49.400 |
We're on the top of a hill with a ball balanced on top. 00:56:52.360 |
And just at any little nudge could send it flying down 00:56:55.560 |
and nuclear war pops off and hellfire and bad times. 00:57:03.200 |
And another intelligent civilization might still pop up. 00:57:10.160 |
Nuclear war, sure, that's one of the perhaps less bad ones. 00:57:13.080 |
Green goo through synthetic biology, very bad. 00:57:17.560 |
Will turn, you know, destroy all organic matter through, 00:57:28.880 |
Or AI type, you know, mass extinction thing as well 00:57:42.280 |
Do you think we're living inside a video game 00:57:47.400 |
- Well, I think, well, so what was the second part 00:58:03.120 |
it's like Phil Hellmuth type of situation, right? 00:58:12.640 |
Like there's a lot of interesting stuff going on. 00:58:28.800 |
- Kind of, but God seems to be not optimizing 00:58:33.800 |
in the different formulations of God that we conceive of. 00:58:45.840 |
but the, you know, just like the basically like a teenager 00:58:49.800 |
in their mom's basement watching, creating a fun universe 00:58:49.800 |
to observe what kind of crazy shit might happen. 00:59:00.080 |
do I think there is some kind of extraneous intelligence 00:59:25.040 |
Partly because I've had just small little bits of evidence 00:59:34.040 |
like, so I was a diehard atheist even five years ago. 00:59:39.400 |
You know, I got into like the rationality community, 00:59:41.680 |
big fan of LessWrong, it continues to be an incredible resource. 00:59:45.680 |
But I've just started to have too many little snippets 00:59:58.040 |
with the current sort of purely materialistic explanation 01:00:08.680 |
- Isn't that just like a humbling practical realization 01:00:19.360 |
- Yeah, no, it's a reminder of epistemic humility 01:00:21.520 |
because I fell too hard, you know, same as people, 01:00:24.360 |
like I think, you know, many people who are just like, 01:00:26.440 |
my religion is the way, this is the correct way, 01:00:37.040 |
But similarly, I think the sort of the Richard Dawkins brand 01:00:44.720 |
you know, there's a way to try and navigate these questions 01:00:52.720 |
which I still think is our best sort of realm 01:00:54.760 |
of like reasonable inquiry, you know, a method of inquiry. 01:00:57.600 |
So an example, I have two kind of notable examples 01:01:06.640 |
The first one was actually in 2010, early on in, 01:01:13.480 |
And I remember the Icelandic volcano 01:01:22.760 |
And I, it meant I got stuck down in the South of France. 01:01:29.920 |
well, there's a big poker tournament happening in Italy. 01:01:39.920 |
which was much bigger than my bankroll would normally allow. 01:01:45.960 |
won my way in kind of like I did with the Monte Carlo, 01:01:50.920 |
from 500 euros into 5,000 euros to play this thing. 01:01:59.000 |
it was the biggest tournament ever held in Europe 01:02:01.720 |
It got over like 1,200 people, absolutely huge. 01:02:06.640 |
for before, you know, the normal shuffle up and deal 01:02:12.120 |
And they played Chemical Brothers' "Hey Boy, Hey Girl," 01:02:19.200 |
It was like one of these like pump me up songs. 01:02:21.200 |
And I was sitting there thinking, oh yeah, it's exciting. 01:02:24.600 |
And out of nowhere, just suddenly this voice in my head, 01:02:35.840 |
And so it sounded like my own voice and it said, 01:02:40.320 |
And it was so powerful that I got this like wave of like, 01:02:46.760 |
And I even, I remember looking around being like, 01:03:06.480 |
Okay, yes, maybe I have that feeling before every time 01:03:13.840 |
I play and it's just that I happened to, you know, 01:03:16.200 |
because I won the tournament, I retroactively remembered it. 01:03:25.840 |
- Like it gave you a confidence, a deep confidence. 01:03:33.960 |
I then went and lost half my stack quite early on. 01:03:35.920 |
And I remember thinking like, oh, well that was bullshit. 01:03:41.080 |
But you know, I managed to like keep it together 01:03:42.920 |
and recover and then just went like pretty perfectly 01:03:52.800 |
And I don't want to put a, I can't put an explanation. 01:04:03.680 |
or was it just my own self-confidence and so on 01:04:08.880 |
And I don't, I'm not going to put a frame on it. 01:04:12.400 |
So we're a bunch of NPCs living in this world 01:04:24.200 |
And that feeling you got is somebody just like, 01:04:27.240 |
they got to play a poker tournament through you. 01:04:36.680 |
Like, I don't even know how much I can trust my memory. 01:04:38.960 |
- You're just an NPC retelling the same story. 01:04:41.520 |
'Cause they just played the tournament and left. 01:04:48.520 |
left as a boring NPC retelling this story of greatness. 01:04:51.880 |
- But it was, and what was interesting was that after that, 01:04:53.720 |
then I didn't obviously win a major tournament 01:04:56.800 |
And it left, that was actually another sort of dark period 01:05:04.360 |
just on a material level were insane, winning the money. 01:05:08.520 |
'cause there was this girl that came out of nowhere 01:05:12.160 |
And so again, sort of chasing that feeling was difficult. 01:05:16.720 |
But then on top of that, there was this feeling 01:05:45.320 |
I think everybody wrestles with to an extent, right? 01:05:47.240 |
We all, we are truly the protagonists in our own lives. 01:05:51.520 |
And so it's a natural bias, human bias to feel special. 01:05:56.520 |
And I think, and in some ways we are special. 01:06:00.760 |
Every single person is special because you are that, 01:06:10.480 |
and take the amalgam of everyone's experiences, 01:06:12.920 |
So there is this shared sort of objective reality, 01:06:15.680 |
but sorry, there's objective reality that is shared, 01:06:17.720 |
but then there's also this subjective reality, 01:06:22.280 |
And it's not like one is correct and one isn't. 01:06:36.800 |
And I think a lot of people have this as like, 01:06:46.560 |
There's all these big people doing big things. 01:06:48.640 |
There's big actors and actresses, big musicians. 01:06:53.400 |
There's big business owners and all that kind of stuff, 01:06:58.680 |
I have my own subjective experience that I enjoy and so on, 01:07:09.440 |
I mean, one of the things just having interacted 01:07:19.160 |
And that realization I think is really empowering. 01:07:52.880 |
Like, we tend to say this person or that person is brilliant 01:07:57.880 |
but really, no, they're just like sitting there 01:08:01.240 |
and thinking through stuff, just like the rest of us. 01:08:05.280 |
I think they're in the habit of thinking through stuff, 01:08:14.120 |
in a bunch of bullshit and minutiae of day-to-day life. 01:08:17.040 |
They really think big ideas, but those big ideas, 01:08:20.080 |
it's like allowing yourself the freedom to think big, 01:08:27.280 |
that actually solved this particular big problem. 01:08:29.280 |
First, identify a big problem that you care about, 01:08:37.320 |
And I think sometimes you do need to have like-- 01:08:50.520 |
So again, like going through the classic rationalist training 01:08:54.920 |
of "Less Wrong" where it's like, you want your map, 01:08:57.440 |
you know, the image you have of the world in your head 01:09:00.080 |
to as accurately match up with how the world actually is. 01:09:04.400 |
You want the map and the territory to perfectly align as, 01:09:11.040 |
I don't know if I fully subscribe to that anymore, 01:09:20.400 |
Like there is value in overconfidence sometimes, I do. 01:09:44.440 |
whether they were gonna be the world's greatest, 01:09:47.320 |
because they had this innate, deep self-confidence, 01:09:54.920 |
That gave them the confidence to then pursue this thing 01:09:56.960 |
and like, with the kind of focus and dedication 01:10:01.840 |
in whatever it is you're trying to do, you know? 01:10:09.560 |
with the classic sort of rationalist community 01:10:13.440 |
because that's a field that is worth studying 01:10:44.280 |
I think we're all like alluding to the same thing. 01:10:53.480 |
The simulation, at least one of the simulation hypothesis 01:10:56.680 |
is like, as you said, like a teenager in his bedroom 01:11:02.560 |
It just like wants to see how this thing plays out 01:11:05.120 |
'cause it's curious and it could turn it off like that. 01:11:24.360 |
is I think an essential thing actually for people to find. 01:11:31.880 |
And like, it is uniquely humbling of all of us to an extent. 01:11:36.880 |
But B, it gives people that little bit of like reserve, 01:11:43.400 |
And I do think things are gonna get pretty dark 01:11:59.280 |
- You and C, AKA Grimes, call what the game? 01:12:07.880 |
- Not, well, the universe, like what if it's a game 01:12:12.320 |
and the goal of the game is to figure out like, 01:12:14.600 |
well, either how to beat it, how to get out of it. 01:12:16.960 |
You know, maybe this universe is an escape room, 01:12:26.880 |
in order to like unlock this like hyper dimensional key 01:12:32.120 |
- No, but then, so you're saying it's like different levels 01:12:34.400 |
and it's like a cage within a cage within a cage 01:12:36.520 |
and one cage at a time, you figure out how to escape that. 01:12:42.240 |
like us becoming multi-planetary would be a level up 01:12:49.600 |
Or spiritually, you know, humanity becoming more combined 01:12:56.880 |
and us becoming a little bit more enlightened, 01:12:59.600 |
You know, there's many different frames to it, 01:13:09.240 |
is probably the biological evolutionary process. 01:13:14.040 |
So going from single cell organisms to early humans. 01:13:37.880 |
Probably, 'cause it allows us to exponentially scale. 01:13:53.440 |
so that it can accommodate more of our stuff, more of us. 01:13:58.480 |
but I don't know if it like fully solves this issue of, 01:14:32.360 |
well, so apparently back in the Canaanite times, 01:14:43.240 |
somewhere around 300 BC or 200 AD, I don't know. 01:14:53.240 |
to this awful demon god thing they called Moloch 01:15:00.920 |
And it was literally like about child sacrifice. 01:15:02.640 |
Whether they actually existed or not, we don't know, 01:15:10.960 |
it seemed like it was kind of quiet throughout history 01:15:15.440 |
until this movie "Metropolis" in 1927 talked about this, 01:15:20.440 |
you see that there was this incredible futuristic city 01:15:29.080 |
but then the protagonist goes underground into the sewers 01:15:31.360 |
and sees that the city is run by this machine. 01:15:38.080 |
because it was just so hard to keep it running. 01:15:40.920 |
So there was all this suffering that was required 01:15:45.920 |
that this machine is actually this demon Moloch. 01:15:48.160 |
So again, it's like this sort of mechanistic consumption 01:15:53.840 |
And then Allen Ginsberg wrote a poem in the '60s, 01:16:03.120 |
And a lot of people sort of quite understandably 01:16:14.680 |
that's moved Moloch into this idea of game theory 01:16:22.600 |
one that literally I think it might be my favorite piece 01:16:33.400 |
We can link to it in the show notes or something, right? 01:16:42.280 |
I have a professional operation going on here. 01:16:51.520 |
If I don't, please, somebody in the comments remind me. 01:17:05.760 |
- He's migrated onto Substack, but yeah, it's still a blog. 01:17:15.160 |
which will mean something to people when we continue. 01:17:21.560 |
So anyway, so he writes this piece, "Meditations on Moloch," 01:17:25.440 |
and basically he analyzes the poem and he's like, 01:17:36.400 |
where people would sacrifice a thing that they care about, 01:17:42.480 |
in order to gain power, a competitive advantage. 01:17:55.520 |
if you're trying to become a famous Instagram model, 01:18:02.360 |
you are incentivized to post the hottest pictures 01:18:19.320 |
And then more recently, these beauty filters, 01:18:23.480 |
it makes your face look absolutely incredible, 01:18:31.600 |
Everyone is incentivized to that short-term strategy. 01:18:43.080 |
the reason why I talked about them in this video 01:18:45.880 |
Like I was trying to grow my Instagram for a while. 01:18:50.000 |
And I noticed these filters, how good they made me look. 01:18:53.440 |
And I'm like, well, I know that everyone else 01:18:59.000 |
- Please, so I don't have to use the filters. 01:19:08.440 |
- Exactly, these short-term incentives to do this thing 01:19:12.760 |
that either sacrifices your integrity or something else 01:19:23.760 |
to the bottom spiral where everyone else ends up 01:19:25.880 |
in a situation which is worse off than if they hadn't started 01:19:36.400 |
a competitive system of like everyone sitting 01:19:38.160 |
and having a view, that if someone at the very front 01:19:49.160 |
So you need this like top-down God's eye coordination 01:19:53.840 |
But from within the system, you can't actually do that. 01:19:57.680 |
It's this thing that makes people sacrifice values 01:20:01.600 |
in order to optimize for winning the game in question, 01:20:21.280 |
It's a force of bad incentives on a multi-agent system 01:20:26.600 |
prisoner's dilemma is technically a kind of Moloch-y 01:20:29.880 |
system as well, but it's just a two-player thing. 01:20:31.960 |
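To make that two-player core concrete, here is the textbook prisoner's dilemma with standard illustrative payoffs (the numbers are conventional, not from the conversation): defecting is individually better whatever the other side does, yet mutual defection leaves both sides worse off than mutual cooperation, which is the lose-lose Moloch dynamic in miniature.

```python
# Classic prisoner's dilemma: (my move, their move) -> (my payoff, their payoff).
# Standard textbook numbers, chosen only to illustrate the incentive structure.
PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

for theirs in ("C", "D"):
    coop   = PAYOFF[("C", theirs)][0]
    defect = PAYOFF[("D", theirs)][0]
    print(f"If they play {theirs}: cooperate={coop}, defect={defect} -> defect is better")

# Yet the equilibrium everyone is pushed into is worse for both than cooperation:
print("(D, D) gives", PAYOFF[("D", "D")], "versus (C, C) giving", PAYOFF[("C", "C")])
```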
But another word for Moloch is multipolar trap, 01:20:35.720 |
where basically you just got a lot of different people 01:20:44.720 |
but because that strategy gives you a short-term advantage, 01:20:54.040 |
for a large number of people to play game theory. 01:21:02.560 |
And is it on them to try to anticipate what kind of, 01:21:06.960 |
like to do the thing that poker players are doing, 01:21:12.360 |
If, you know, Mark Zuckerberg and Jack and all the, 01:21:17.240 |
if they had at least just run a few simulations 01:21:23.240 |
different types of algorithms would turn out for society, 01:21:33.080 |
So not kind of this level of how do we optimize engagement 01:21:47.960 |
How is it gonna get morphed in iterative games played 01:21:52.960 |
into something that will change society forever? 01:21:58.800 |
I hope there's meetings like that inside companies, 01:22:04.480 |
And it's difficult because like when you're starting up 01:22:10.280 |
you're aware that you've got investors to please, 01:22:17.280 |
there's only so much R&D you can afford to do. 01:22:20.120 |
You've got all these like incredible pressures, 01:22:23.760 |
and just build your thing as quickly as possible 01:22:29.120 |
when they built these social media platforms, 01:22:38.720 |
everyone these days is optimizing for clicks, 01:22:42.000 |
whether it's the social media platforms themselves, 01:22:53.760 |
or whether it's the New York Times or whoever, 01:22:56.080 |
they're trying to get their story to go viral. 01:22:58.280 |
So everyone's got this bad incentive of using, 01:23:00.200 |
as you called it, the clickbait industrial complex. 01:23:04.200 |
because everyone is now using worse and worse tactics 01:23:06.240 |
in order to like try and win this attention game. 01:23:08.680 |
And yeah, so ideally these companies would have had 01:23:19.800 |
okay, what are the ways this could possibly go wrong 01:23:24.680 |
they should be aware of this concept of Moloch 01:23:27.680 |
whenever you have a highly competitive multi-agent system, 01:23:35.920 |
and you try and bring all this complexity down 01:23:51.520 |
- I think there should be an honesty when founders, 01:23:53.920 |
I think there's a hunger for that kind of transparency 01:23:56.400 |
of like, we don't know what the fuck we're doing. 01:24:04.320 |
And like, actually just be honest about this, 01:24:06.440 |
that we're all like these weird rats in a maze. 01:24:12.400 |
There's this kind of sense like the founders, 01:24:22.880 |
- He's at the mercy of this, like everyone else. 01:24:33.000 |
about how Instagram and Facebook are good for you, 01:24:38.000 |
to actually ask some of the deepest questions 01:24:43.240 |
How do we design the game such that we build a better world? 01:24:48.120 |
- I think a big part of this as well is people, 01:24:51.760 |
there's this philosophy, particularly in Silicon Valley 01:25:03.080 |
where yes, technology has solved a lot of problems 01:25:05.440 |
and can potentially solve a lot of future ones. 01:25:08.200 |
But it can also, it's always a double-edged sword 01:25:10.840 |
and particularly as technology gets more and more powerful 01:25:17.360 |
psychological manipulation with it and so on. 01:25:20.680 |
It's, technology is not a values neutral thing. 01:25:24.960 |
People think, I used to always think this myself, 01:25:30.720 |
It's just, it's the humans that either make it good or bad. 01:25:39.440 |
They literally dictate how humans now form social groups 01:25:54.400 |
where it's technology driving social interaction, 01:26:00.480 |
and like which ideas become popular, that's Moloch. 01:26:06.480 |
We need it so we need to figure out what are the good memes? 01:26:13.400 |
we need to optimize for that like makes people happy 01:26:26.800 |
And, you know, like as much as I love in many ways 01:26:31.800 |
the culture of Silicon Valley and like, you know, 01:26:39.280 |
There are, we have to like be honest with ourselves. 01:26:44.160 |
We're getting to a point where we are losing control 01:26:47.240 |
of this very powerful machine that we have created. 01:26:49.600 |
- Can you redesign the machine within the game? 01:26:53.200 |
Can you just have, can you understand the game enough? 01:27:06.720 |
You know, like the way I try to be in real life 01:27:11.320 |
and the way I try to be online is to be about kindness 01:27:16.080 |
And I feel like I'm sometimes get like criticized 01:27:19.840 |
for being naive and all those kinds of things. 01:27:22.040 |
But I feel like I'm just trying to live within this game. 01:27:27.520 |
Yeah, but also like, hey, it's kind of fun to do this. 01:27:32.920 |
You know, that, and that's like trying to redesign 01:27:48.640 |
- Well, the other option is to create new companies 01:27:55.280 |
or anyone who has control of the rules of the game. 01:27:59.040 |
- I think we need to be doing all of the above. 01:28:03.000 |
about what are the kind of positive, healthy memes. 01:28:09.760 |
As Elon said, he who controls the memes controls the universe. 01:28:28.040 |
We have the ability to learn and evolve through culture 01:28:37.360 |
And that means culture is incredibly powerful 01:28:40.800 |
and we can create and become victim to very bad memes 01:28:56.520 |
We also need to, you know, 'cause I don't want the, 01:29:01.520 |
I'm making this video right now called the Attention Wars, 01:29:05.520 |
like the media machine is this Moloch machine. 01:29:11.440 |
that where everyone is optimizing for engagement 01:29:13.200 |
in order to win their share of the attention pie. 01:29:16.480 |
And then if you zoom out, it's really like Moloch 01:29:19.240 |
'cause the only thing that benefits from this in the end, 01:29:20.960 |
you know, like our information ecosystem is breaking down. 01:29:24.040 |
Like we have, you look at the state of the US, 01:29:35.120 |
in terms of what their actual shared reality is. 01:29:51.920 |
It doesn't matter how innocuous the topic is, 01:29:59.840 |
And that's like an emergent Moloch type force 01:30:06.880 |
and cuts through it so that it can split nicely 01:30:15.560 |
all everyone is trying to do within the system 01:30:18.040 |
is just maximize whatever gets them the most attention 01:30:24.560 |
And the way, the best emotion for getting attention, 01:30:29.240 |
well, because it's not just about attention on the internet, 01:30:35.960 |
They need to like comment or retweet or whatever. 01:30:47.520 |
even from like previously uncontacted tribes have. 01:30:51.040 |
Some of those are negative, you know, like sadness, 01:30:56.080 |
disgust, anger, et cetera, some are positive, 01:31:14.280 |
and I'm not just like talking out my ass here, 01:31:15.840 |
there are studies here that have looked into this. 01:31:17.840 |
Whereas like if someone's like disgusted or fearful, 01:31:31.040 |
well, now that triggers all the old tribalism emotions. 01:31:35.920 |
And so that's how then things get sort of spread, 01:31:39.400 |
They out-compete all the other memes in the ecosystem. 01:31:50.520 |
I did a tweet, the problem with raging against the machine 01:31:55.080 |
is that the machine has learned to feed off rage 01:31:59.080 |
That's the thing that's now keeping it going. 01:32:06.040 |
in the war of attention is constantly maximizing rage. 01:32:13.600 |
And it happens to be that engagement is, well, propaganda. 01:32:28.880 |
whether it's the culture war or the Ukraine war, yeah. 01:32:33.240 |
do you think it's possible that in the long arc 01:32:48.600 |
one of the magic things about democracy and so on 01:32:51.120 |
is you have the blue versus red constantly fighting. 01:33:04.880 |
like almost really embodying different ideas, 01:33:08.840 |
And through the yelling over the period of decades, 01:33:11.600 |
maybe centuries, figuring out a better system. 01:33:18.120 |
- But in the long arc, it actually is productive. 01:33:43.560 |
can be used for propaganda or just manipulating people 01:33:54.040 |
there are significant resources being put in. 01:34:01.440 |
too doom and gloom, but there are bad actors out there. 01:34:05.080 |
There are plenty of good actors within the system 01:34:06.640 |
who are just trying to stay afloat in the game. 01:34:09.760 |
But then on top of that, we have actual bad actors 01:34:17.360 |
- And using, so because of the digital space, 01:34:19.640 |
they're able to use artificial actors, meaning bots. 01:34:50.960 |
- You know, there's truth to fight fire with fire. 01:34:53.480 |
It's like, but how, you always have to be careful 01:35:01.360 |
- Yeah, yeah, Hitler was trying to spread love too. 01:35:11.440 |
the road to hell is paved with good intentions. 01:35:13.560 |
And there's always unforeseen circumstances, you know, 01:35:18.560 |
outcomes, externalities, if you're trying to adopt a thing, 01:35:25.040 |
- If you can run some Sims on it first, absolutely. 01:35:28.600 |
- But also there's certain aspects of a system, 01:35:39.640 |
it's not good for me to have full control over it. 01:35:44.000 |
I might have a good understanding of what's good and not. 01:35:51.920 |
wouldn't it be nice to get rid of those assholes? 01:35:54.040 |
But then that power starts getting to your head, 01:36:17.520 |
the more you can decentralize the control of a thing 01:36:24.880 |
- But then you still need the ability to coordinate, 01:36:26.720 |
because that's the issue when if something is too, 01:36:29.440 |
you know, that's really, to me, like the culture wars, 01:36:32.440 |
the bigger war we're dealing with is actually 01:36:39.600 |
but like centralization versus decentralization. 01:36:44.120 |
Power and control by a few versus completely distributed. 01:36:48.240 |
And the trouble is if you have a fully centralized thing, 01:36:52.720 |
Stalin type things can happen, or completely distributed. 01:36:56.880 |
Now you're at risk of complete anarchy and chaos 01:36:58.560 |
where you can't even coordinate to like on, you know, 01:37:01.160 |
when there's like a pandemic or anything like that. 01:37:03.280 |
So it's like, what is the right balance to strike 01:37:14.560 |
- So the dictator can commit huge atrocities, 01:37:17.760 |
but they can also make sure the infrastructure works 01:37:24.880 |
They have the ability to create like laws and rules 01:37:27.360 |
that like force coordination, which stops Moloch. 01:37:40.640 |
So that's where, so I've been working on this series. 01:37:45.640 |
It's been driving me insane for the last year and a half. 01:37:52.120 |
The second one, hopefully we're coming out in like a month. 01:37:54.800 |
And my goal at the end of the series is to like present, 01:37:59.960 |
of like what Moloch is and how it's affecting 01:38:11.880 |
- Wait, wait, the generator function of existential risk. 01:38:14.200 |
So you're saying Moloch is sort of the engine 01:38:24.520 |
- It's not my phrase, it's Daniel Schmachtenberger. 01:38:26.800 |
- Oh, Schmachtenberger. - I got that from him. 01:38:29.880 |
- It's like all roads lead back to Daniel Schmachtenberger. 01:38:37.800 |
- But anyway, sorry, totally rude interruptions from me. 01:38:45.440 |
because it's just like this one big external thing. 01:38:51.520 |
but, you know, synthetic bio, you know, bioweapons, 01:38:55.800 |
that's one because everyone's incentivized to build, 01:39:01.040 |
you know, just to threaten someone else, et cetera. 01:39:08.040 |
But yeah, so if Moloch is this like generator function 01:39:14.560 |
that's driving all of these issues over the coming century 01:39:20.240 |
And so far what I've gotten to is this character 01:39:26.600 |
Because Moloch is the God of lose-lose ultimately. 01:39:34.320 |
So I was like, well, what's the opposite of that? 01:39:49.160 |
and I addressed it in the video, like it's red and black. 01:39:57.480 |
So Win-Win is kind of actually like these colors. 01:40:06.640 |
It loves a little bit of healthy competition, 01:40:19.120 |
And then beyond that, it also loves cooperation, 01:40:28.480 |
It's not like kind of like boring, you know, like, 01:40:51.280 |
when it has to do like official functions, is Omnia. 01:41:06.000 |
- But there is an angelic kind of sense to Omnia though. 01:41:10.880 |
- So it's more like, it embraces the fun aspect. 01:41:23.560 |
that requires embracing the chaos of the game 01:41:39.400 |
not optimizing for the consequences of the game. 01:41:42.520 |
- Right, well, it's recognizing the value of competition 01:41:48.720 |
it's about you enjoying the process of having a competition 01:41:59.080 |
Because one of the reason why Moloch is doing so well now 01:42:02.720 |
in our civilization is because we haven't been able 01:42:07.000 |
And so it's just having all these negative externalities 01:42:12.600 |
I think my guess is, and now we're getting really like, 01:42:20.880 |
but I think we'll be in a more interesting universe 01:42:26.800 |
if we have one that has both pure cooperation, 01:42:36.240 |
Like it's good to have some little zero-sum-ness bits, 01:42:41.520 |
and I'm not qualified as a philosopher to know that. 01:43:01.480 |
different frameworks of how we think about that. 01:43:04.680 |
- At the small scale of a collection of individuals 01:43:09.960 |
It's a meme, I think it's an example of a good meme. 01:43:13.360 |
And I'm open, I'd love to hear feedback from people 01:43:16.720 |
they have a better idea or it's not, you know, 01:43:18.320 |
but it's the direction of memes that we need to spread, 01:43:21.640 |
this idea of like, look for the win-wins in life. 01:43:27.240 |
so in that particular context where Moloch creates 01:43:43.800 |
what kind of thing we would like to converge towards 01:43:59.200 |
when it can't be reduced down to easy metrics. 01:44:04.200 |
Like if you think of a tree, what is it about a tree, 01:44:12.280 |
What is it about it that we find so beautiful? 01:44:14.680 |
It's not, you know, the sweetness of its fruit 01:44:32.920 |
That's both, like it walks this fine line between, 01:44:45.500 |
And you know, you can't, it's evolving over time. 01:44:51.440 |
You know, the definition of a complex versus, 01:44:54.760 |
you know, a complex versus a complicated system. 01:44:57.660 |
A complicated system can be sort of broken down into bits, 01:45:02.240 |
A complex system is kind of like a black box. 01:45:05.000 |
It does all this crazy stuff, but if you take it apart, 01:45:11.800 |
And also very importantly, like the sum of the parts, 01:45:21.400 |
And I think that extends to things like art as well. 01:45:23.560 |
Like there's something immeasurable about it. 01:45:27.640 |
There's something we can't break down to a narrow metric. 01:45:33.840 |
- So how can Instagram reveal that kind of beauty, 01:45:41.160 |
- And this takes us back to dating sites and Goodreads, 01:45:50.320 |
It shouldn't try and like, right now, you know, 01:45:53.320 |
one of the, I was talking to like a social media expert 01:45:58.320 |
- Is there such a thing as a social media expert? 01:46:03.280 |
'Cause I'm thinking about working with one to like, 01:46:09.440 |
You should, you should have done it a long time ago. 01:46:14.760 |
And it's gonna be about this like positive stuff. 01:46:17.480 |
And the thing that, you know, they always come back 01:46:21.000 |
and say, it's like, well, you need to like figure out 01:46:24.080 |
You know, you need to narrow down what your thing is 01:46:32.200 |
They wanna know that they're coming back to the same thing. 01:46:34.320 |
And that's the advice on YouTube, Twitter, you name it. 01:46:45.400 |
It's making things more, it's an oversimplification. 01:46:56.560 |
It's trying to like encapsulate the human experience 01:47:00.120 |
and put it into digital form and commodify it to an extent. 01:47:09.520 |
And that's why I think it's kind of ultimately 01:47:15.240 |
- Yeah, it's interesting because there is some sense 01:47:18.120 |
in which a simplification sort of in the Einstein 01:47:28.040 |
a simplification in a way that still captures 01:47:30.280 |
some core power of an idea of a person is also beautiful. 01:47:36.120 |
And so maybe it's possible for social media to do that. 01:47:50.520 |
but in a simple way, in a way that can be displayed 01:48:04.920 |
Of course, the viral machine that spreads those words 01:48:09.920 |
often results in people taking the thing out of context. 01:48:14.840 |
People often don't read tweets in the context 01:48:20.800 |
The full history of the tweets they've written, 01:48:35.520 |
But in a certain sense, if you do take it in context, 01:48:39.840 |
it reveals some kind of quirky little beautiful idea 01:48:43.240 |
or a profound little idea from that particular person 01:48:48.560 |
So in that sense, Twitter can be more successful. 01:49:02.040 |
is there a way to rewrite the Twitter algorithm 01:49:08.800 |
the fertile breeding ground of the culture wars? 01:49:30.880 |
Twitter is where you have this amalgam of human culture 01:49:38.480 |
and the angriest, most divisive takes and amplifies them. 01:49:47.800 |
because all the journalists are also on Twitter, 01:49:55.120 |
from this already like very boiling lava of rage. 01:50:00.760 |
And then spread that to their millions and millions of people 01:50:05.800 |
And so I honestly, I think if I could press a button, 01:50:10.840 |
turn them off, I probably would at this point, 01:50:13.640 |
'cause I just don't see a way of being compatible 01:50:16.160 |
with healthiness, but that's not gonna happen. 01:50:18.600 |
And so at least one way to like stem the tide 01:50:23.160 |
and make it less Moloch-y would be to change, 01:50:30.040 |
at least if like it was on a subscription model, 01:50:31.960 |
then it's now not optimizing for impressions, 01:50:46.520 |
So you're trying to encourage addictive behaviors. 01:50:56.960 |
whether someone comes back to the site once a month 01:50:59.160 |
or 500 times a month, they get the same amount of money. 01:51:02.080 |
So now that takes away that incentive to use technology, 01:51:37.360 |
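A minimal sketch, with made-up rates, of the incentive difference being described: under an ad model every extra visit earns the platform more, while a flat subscription pays the same whether someone shows up once or five hundred times a month, so there is no structural reward for maximizing engagement.

```python
# Toy comparison with invented numbers (ads_per_visit, revenue_per_ad, monthly_fee
# are illustrative assumptions, not real platform figures).
def ad_revenue(visits_per_month: int, ads_per_visit: int = 5,
               revenue_per_ad: float = 0.002) -> float:
    # Ad revenue scales linearly with every additional visit.
    return visits_per_month * ads_per_visit * revenue_per_ad

def subscription_revenue(visits_per_month: int, monthly_fee: float = 8.0) -> float:
    # Subscription revenue is flat, regardless of engagement.
    return monthly_fee

for visits in (1, 30, 500):
    print(f"{visits:>3} visits/month  ads: ${ad_revenue(visits):6.2f}"
          f"  subscription: ${subscription_revenue(visits):.2f}")
```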
- Competition. - But then I disagree with you. 01:51:40.520 |
- I disagree with you that all virality is negative then, 01:51:49.520 |
'cause we have a lot of data on virality being negative. 01:51:52.920 |
But I happen to believe that the core of human beings, 01:52:10.560 |
but it's possible to engineer systems that are viral 01:52:25.840 |
meaning a lot of people need to be challenged. 01:52:31.480 |
you might not like it, but you ultimately grow from it. 01:52:42.320 |
that people at their core are on average good, 01:52:44.600 |
as opposed to, care for each other, as opposed to not. 01:52:47.280 |
I think it's actually a very small percentage of people 01:52:50.920 |
are truly wanting to do just destructive malicious things. 01:52:54.440 |
Most people are just trying to win their own little game 01:52:57.880 |
they're just stuck in this badly designed system. 01:53:15.160 |
I think there should be a whole field of study 01:53:20.360 |
that above a certain threshold of the population 01:53:24.040 |
agree is a positive, happy, bringing people together meme, 01:53:27.360 |
the kind of thing that brings families together 01:53:29.880 |
that would normally argue about cultural stuff 01:53:34.800 |
Identify those memes and figure out what it was, 01:53:37.440 |
what was the ingredient that made them spread that day. 01:53:49.600 |
that enables like productivity, like cooperation, 01:53:52.720 |
solving difficult problems and all those kinds of stuff. 01:54:09.320 |
It's like, Moloch hates collaboration and coordination 01:54:14.600 |
And that's, again, like the internet started out as that 01:54:20.680 |
but because of the way it was sort of structured 01:54:30.600 |
And, but they needed to find a way to pay the bills anyway, 01:54:40.240 |
But that meant that things were very decoupled. 01:54:42.760 |
You know, you've got this third-party interest, 01:54:57.480 |
you start making the thing for the advertiser. 01:55:02.000 |
Yeah, like it's, there's no clean solution to this. 01:55:07.440 |
And I, it's a really good suggestion by you actually 01:55:11.160 |
to like figure out how we can optimize virality 01:55:19.560 |
- I shall be the general of the love bot army. 01:55:33.280 |
You've talked about quantifying your thinking. 01:55:42.280 |
Like if you think about different trajectories 01:55:45.680 |
just actually analyzing life in game theoretic way, 01:55:52.820 |
that you had an honest conversation with Igor 01:55:54.680 |
about like how long is this relationship gonna last? 01:56:01.560 |
having an honest conversation about the probability 01:56:05.080 |
of things that we sometimes are a little bit too shy 01:56:08.600 |
or scared to think of in a probabilistic terms. 01:56:11.840 |
Can you speak to that kind of way of reasoning 01:56:17.040 |
Can you do this kind of thing with human relations? 01:56:20.880 |
- Yeah, so the scenario you're talking about, 01:56:27.680 |
- I think it was about a year into our relationship 01:56:30.920 |
and we were having a fairly heavy conversation 01:56:35.440 |
whether or not I was gonna sell my apartment. 01:56:46.360 |
- When you guys are having that conversation, 01:56:49.800 |
or is he sober and you're actually having a serious- 01:57:03.360 |
for a couple of years before we even got romantic. 01:57:20.720 |
- So the probability of it being a big deal was high. 01:57:35.200 |
But Igor's MO has always been much more than mine. 01:57:50.320 |
And if you aren't able to accept difficult things yourself, 01:57:59.800 |
The relationship needs this bedrock of honesty 01:58:07.720 |
but I would like to push against some of those ideas, 01:58:25.280 |
what's the likelihood that we're going to be together 01:58:28.240 |
'Cause I think it was roughly a three-year time horizon. 01:58:35.080 |
let's both write down our predictions formally. 01:58:38.560 |
we were just getting into like effective altruism 01:58:47.360 |
well, your own foresight essentially in a quantified way. 01:59:03.880 |
And I remember having this moment of like, ooh, 01:59:09.600 |
but like a lot can happen in 10 years, you know? 01:59:21.400 |
I don't want to give a number lower than his. 01:59:22.680 |
And I remember thinking, I was like, uh-uh, don't game it. 01:59:42.000 |
if his had been consistently lower than mine, 01:59:49.880 |
I think he's just kind of like a water off the duck's back 01:59:53.320 |
Be like, okay, well, all right, we'll figure this out. 01:59:55.200 |
- Well, did you guys provide error bars on the estimate? 01:59:59.560 |
We didn't give formal plus or minus error bars. 02:00:05.720 |
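A minimal sketch of the kind of quantified forecasting being described, using hypothetical probabilities rather than their actual numbers: each person writes a probability down up front and can later score it against the outcome, for example with a Brier score.

```python
def brier_score(forecast_prob: float, outcome: int) -> float:
    """Squared error of a probabilistic forecast; lower is better.
    A hedged 0.5 forecast always scores 0.25."""
    return (forecast_prob - outcome) ** 2

# Hypothetical forecasts for "still together in 3 years" (illustrative only).
forecasts = {"partner_a": 0.75, "partner_b": 0.90}
outcome = 1  # 1 = still together after 3 years, 0 = not

for name, p in forecasts.items():
    print(f"{name}: forecast={p:.2f}, Brier score={brier_score(p, outcome):.3f}")
```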
did you feel informed enough to make such decisions? 02:00:19.080 |
I would want to say one of the assumptions you have 02:00:23.120 |
is you're not that different from other relationships. 02:00:26.240 |
- And so I wanna have some data about the way- 02:00:31.560 |
And also actual trajectories of relationships. 02:00:39.080 |
about the ways that relationships fall apart or prosper, 02:00:56.320 |
and how often the different trajectories change in life. 02:01:05.640 |
Can you look at my life and have a good prediction 02:01:09.760 |
about in terms of my characteristics and my relationships 02:01:13.560 |
of what that's gonna look like in the future or not? 02:01:15.880 |
I don't even know the answer to that question. 02:01:17.200 |
I'll be very ill-informed in terms of making the probability. 02:01:20.720 |
I would be far, yeah, I just would be under-informed. 02:01:26.680 |
I'll be over-biasing to my prior experiences, I think. 02:01:35.760 |
say, "Look, I have really wide error bars on this 02:01:46.760 |
- And I feel also the romantic nature of that question. 02:02:06.520 |
That's one of my pushbacks against radical honesty, 02:02:20.200 |
- Going back to the wise sage- - In order to sort of 02:02:29.440 |
the positive, the power of positive thinking. 02:02:39.800 |
And I agree, I don't think there's a clear answer to this, 02:02:44.280 |
Some people this works better for than others. 02:02:46.600 |
You know, to be clear, Igor and I weren't doing 02:02:52.000 |
Like we did it with very much tongue in cheek. 02:02:57.400 |
I don't think it even would have drastically changed 02:03:06.920 |
you really actually kinda, there was a deep honesty to it. 02:03:09.920 |
- Exactly, it was a deep, and it was just like 02:03:13.840 |
I actually have to think through this quite critically, 02:03:26.000 |
So there was one thing of what my actual prediction is, 02:03:28.600 |
but what are my desires, and could these desires 02:03:34.960 |
and I personally don't think it loses anything. 02:03:37.080 |
It didn't take any of the magic away from our relationship, 02:03:40.640 |
It brought us closer together, 'cause it was like 02:03:42.640 |
we did this weird, fun thing that I appreciate 02:03:47.920 |
And I think it was somewhat unique in our relationship 02:03:51.880 |
that both of us are very, we both love numbers, 02:03:54.880 |
we both love statistics, we're both poker players. 02:03:57.320 |
So this was kind of like our safe space anyway. 02:04:01.320 |
For others, one partner really might not like 02:04:09.520 |
But I do think there's, it's interesting sometimes 02:04:24.040 |
Which is interesting, 'cause that's in tension 02:04:25.480 |
with the idea of what we just talked about with beauty 02:04:28.680 |
the fact that you can't measure everything about it. 02:04:39.160 |
of measuring the utility of a tree in its entirety. 02:04:43.280 |
I don't know, maybe we should, maybe we shouldn't. 02:04:52.600 |
People are overly biased against trying to do 02:05:03.920 |
It's like, well, sure, but guts, our intuitions 02:05:19.000 |
You only make those decisions a couple of times 02:05:22.960 |
- Well, I would love to know, there's a balance, 02:05:40.080 |
For example, just talking to soldiers in Ukraine, 02:05:43.520 |
you ask them, what's the probability of you winning, 02:06:01.600 |
- First of all, the morale there is higher than probably, 02:06:06.320 |
and I've never been to a war zone before this, 02:06:22.560 |
- Everybody, not just soldiers, not everybody. 02:06:28.640 |
- I think because there's perhaps a dormant desire 02:06:41.960 |
because it's been going through this 30 year process 02:06:45.360 |
of different factions and political bickering, 02:06:52.600 |
They say all great nations have had an independence war. 02:07:04.600 |
There's constantly been factions, there's been divisions, 02:07:17.720 |
and there's that kind of sense that we're going to fight 02:07:25.640 |
And that, on top of the fact that there's just, 02:07:33.960 |
and there's certain other countries like this, 02:07:36.760 |
there are certain cultures are feisty in their pride 02:07:49.520 |
In certain countries, you do not want to occupy. 02:08:00.240 |
If we occupy this land for prolonged periods of time, 02:08:04.400 |
Like, they're not going to want to be occupied." 02:08:07.160 |
And certain other countries are like, pragmatic. 02:08:31.000 |
because you said it's always been under conflict 02:08:36.120 |
- So you would expect them to actually be the opposite 02:08:44.800 |
I mean, I think they've developed this culture 02:09:07.880 |
against oppression and all that kind of stuff, 02:09:13.720 |
But a lot of other aspects are also part of it 02:09:16.440 |
that has to do with the reverse Moloch kind of situation, 02:09:20.040 |
where social media has definitely played a part of it. 02:09:27.220 |
The fact that the president of the nation, Zelensky, 02:09:41.460 |
when the capital of the nation is under attack, 02:09:46.820 |
that the United States advised Zelensky to do 02:09:49.120 |
is to flee and to be the leader of the nation 02:09:57.280 |
Everyone around him, there was a pressure to leave, 02:10:09.200 |
There's a lot of people that criticize Zelensky 02:10:18.860 |
especially that singular act of staying in the capital. 02:10:27.160 |
come together to create something within people. 02:10:35.640 |
so how zoomed out of a view do you wanna take? 02:10:40.640 |
Because, yeah, you describe it as an anti-Moloch thing 02:10:48.000 |
because it brought the Ukrainian people together 02:10:51.800 |
Maybe that's a good thing, maybe that's a bad thing. 02:10:56.740 |
But if you zoom it out from a level, on a global level, 02:11:15.680 |
It seems like a good thing that they came together, 02:11:17.600 |
but we don't know how this is all gonna play out. 02:11:21.600 |
we'll be like, "Okay, that was the bad, that was the." 02:11:23.440 |
- Oh yeah, so I was describing the reverse Moloch 02:11:31.240 |
and they say, "Well, if you channel most of the resources 02:11:36.240 |
"of the nation and the nation supporting Ukraine 02:11:40.500 |
"into the war effort, are you not beating the drums of war 02:12:20.240 |
an agreement that guarantees no more invasions. 02:12:34.120 |
is you want to demonstrate to the rest of the world 02:12:36.600 |
who's watching carefully, including Russia and China 02:12:39.640 |
and different players on the geopolitical stage, 02:12:42.280 |
that this kind of conflict is not going to be productive 02:13:10.160 |
is just individual human beings and human lives 02:13:17.760 |
we should realize that it's entirely possible 02:13:22.000 |
that we will see a World War III in the 21st century. 02:13:29.840 |
And so the way we play this as a human civilization 02:13:34.840 |
will define whether we do or don't have a World War III. 02:13:39.800 |
How we discuss war, how we discuss nuclear war, 02:14:11.520 |
the ever-increasing military-industrial complex. 02:14:17.560 |
that when you say pro-Ukraine or pro-anybody, 02:14:29.440 |
that creates narratives that says it's pro-human beings. 02:14:36.800 |
But it's actually, if you look at the raw use 02:14:47.440 |
The real, we have to just somehow get the meme 02:14:50.680 |
into everyone's heads that the real enemy is war itself. 02:14:57.120 |
And that doesn't mean to say that there isn't justification 02:15:01.600 |
for small local scenarios, adversarial conflicts. 02:15:13.760 |
It's not that they're on the side of team country, 02:15:21.520 |
that your corrective measure doesn't actually then end up 02:15:25.800 |
being co-opted by the war machine and creating greater war. 02:15:35.200 |
that the weapons that can be used are so mass destructive 02:15:44.080 |
- What existential threat, in terms of us not making it, 02:15:49.640 |
What existential threat to human civilization? 02:15:56.640 |
- No, it's like, well, while we're in the somber place, 02:16:08.000 |
We mentioned asteroids, we mentioned AGI, nuclear weapons. 02:16:17.400 |
mostly because I think it's the one where we have 02:16:22.000 |
in a positive direction, or more specifically, 02:16:39.960 |
So, of course, we have natural risks from natural pandemics, 02:16:49.640 |
and technology becomes more and more democratized 02:16:55.980 |
And whether or not you fall into the camp of COVID 02:17:03.720 |
or whether it was purely naturally occurring, 02:17:13.320 |
synthetic pathogens or human meddled with pathogens 02:17:23.280 |
whether they're omnicidal maniacs, either way. 02:17:27.720 |
And so that means we need more robustness for that. 02:17:31.160 |
And you would think that us having this nice little dry run, 02:17:45.800 |
And so you'd think that we would then be coming, 02:17:49.920 |
we'd be much more robust in our pandemic preparedness. 02:17:52.620 |
And meanwhile, the budget in the last two years for the US, 02:18:01.900 |
I can't remember the name of what the actual budget was, 02:18:04.800 |
but it was like a multi-trillion dollar budget 02:18:10.580 |
considering that COVID cost multiple trillions 02:18:17.320 |
for future pandemic preparedness was 60 billion. 02:18:28.680 |
all the way down to 2 billion out of multiple trillions 02:18:31.560 |
for a thing that has just cost us multiple trillions. 02:18:34.180 |
We've just finished, we're not even really out of it. 02:18:41.020 |
"Whew, all right, we've got the pandemic out of the way. 02:18:50.640 |
but there's an immense amount of naivety about, 02:18:55.080 |
they think that nature is the main risk moving forward, 02:19:02.920 |
than this project that I was just reading about 02:19:12.360 |
and we're not talking about just like within cities, 02:19:15.080 |
like deep into like caves that people don't go to, 02:19:19.200 |
scour the earth for whatever the most dangerous 02:19:22.760 |
possible pathogens could be that they can find. 02:19:33.080 |
And again, whether you think COVID was a lab leak or not, 02:19:37.480 |
but we have historically had so many, as a civilization, 02:19:42.600 |
from even like the highest level security things. 02:19:44.600 |
Like it just, people should go and just read it. 02:19:47.520 |
It's like a comedy show of just how many there are, 02:19:54.640 |
So bring these things then back to civilization. 02:19:58.960 |
Then the next step would be to then categorize them, 02:20:04.520 |
by their level of potential pandemic lethality. 02:20:07.000 |
And then the pièce de résistance on this plan 02:20:10.680 |
is to then publish that information freely on the internet 02:20:14.840 |
about all these pathogens, including their genome, 02:20:16.920 |
which is literally like the building instructions 02:20:21.240 |
And this is something that genuinely a pocket 02:20:33.360 |
oh, this is good because it might buy us some time 02:20:36.200 |
to buy, to develop the vaccines, which, okay, sure. 02:20:39.520 |
Maybe would have made sense prior to mRNA technology, 02:20:46.880 |
when we find a new pathogen within a couple of days. 02:20:57.680 |
Meanwhile, the downside is you're not only giving, 02:21:06.400 |
to every bad actor on earth who would be doing cartwheels. 02:21:10.320 |
And I'm talking about like Kim Jong-un, ISIS, 02:21:14.880 |
they think the rest of the world is their enemy. 02:21:17.200 |
And in some cases they think that killing themselves 02:21:22.680 |
And you're literally giving them the building blocks 02:21:26.920 |
Like on expectation, it's probably like minus EV 02:21:33.440 |
Certainly in the tens or hundreds of millions. 02:21:35.760 |
So the cost benefit is so unbelievably, it makes no sense. 02:21:46.600 |
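A back-of-the-envelope sketch of the "minus EV" point, with every number invented purely for illustration: even a small probability of a catastrophic downside can swamp a likely, more modest upside in expectation.

```python
# Toy expected-value calculation with entirely made-up numbers.
p_speeds_up_vaccines = 0.5    # hypothetical chance the published data meaningfully helps
lives_saved_if_so    = 1e5    # hypothetical benefit in that case

p_enables_misuse     = 0.01   # hypothetical chance the data enables a deliberate release
lives_lost_if_so     = 1e8    # hypothetical cost in that case

expected_lives = (p_speeds_up_vaccines * lives_saved_if_so
                  - p_enables_misuse * lives_lost_if_so)
print(f"Expected value in lives: {expected_lives:,.0f}")  # about -950,000 under these assumptions
```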
And it's not that it's malice or anything like that. 02:22:08.920 |
even if a bad actor couldn't physically make it themselves, 02:22:14.080 |
like the technology is getting cheaper and easier to use. 02:22:18.000 |
But even if they couldn't make it, they could now bluff it. 02:22:20.400 |
Like what would you do if there's like some deadly new virus 02:22:28.600 |
"Hey, if you don't let me build my nuclear weapons, 02:22:31.560 |
"I'm gonna release this, I've managed to build it." 02:22:33.640 |
Well, now he's actually got a credible bluff. 02:22:36.480 |
And so that's, it's just like handing the keys, 02:22:39.320 |
it's handing weapons of mass destruction to people. 02:22:44.560 |
but the possible world in which it might make sense 02:22:55.520 |
but the good guys are like an order of magnitude 02:23:10.120 |
By very good, not meaning like a little bit better, 02:23:17.240 |
in each of those individual disciplines, is that feasible? 02:23:24.920 |
leapfrog to the place where the good guys are? 02:23:31.880 |
with pertaining to this like particular plan of like, 02:23:39.360 |
where at least then that would maybe make sense 02:23:41.200 |
for steps one and step two of like getting the information, 02:23:55.000 |
- But there's different levels of release, right? 02:24:02.440 |
but there's a release where you incrementally give it 02:24:08.880 |
but it's like you're giving it to major labs. 02:24:10.960 |
- There's different layers of reasonability, but- 02:24:14.640 |
if you go anywhere beyond like complete secrecy, 02:24:23.440 |
- So you might as well release it to the public, 02:24:38.240 |
- Right, which is why you shouldn't get the information 02:24:44.880 |
Yeah, the solution is either don't get the information 02:24:46.600 |
in the first place or B, keep it incredibly contained. 02:24:55.800 |
So in the case of biology, I do think you're very right. 02:25:06.640 |
Meaning don't just even collect the information, 02:25:09.200 |
but like don't do, I mean, gain of function research 02:25:15.720 |
- I mean, it's all about cost benefits, right? 02:25:17.680 |
There are some scenarios where I could imagine 02:25:19.280 |
the cost benefit of a gain of function research 02:25:24.480 |
all the potential risks, factored in the probability 02:25:28.560 |
not only known unknowns, but unknown unknowns as well, 02:25:32.360 |
And then even then it's like orders of magnitude 02:25:37.320 |
is that there's this like naivety that's preventing people 02:25:43.320 |
Because, you know, I get it, the science community, 02:25:47.280 |
again, I don't wanna bucket the science community, 02:25:49.320 |
but like some people within the science community 02:25:54.080 |
and everyone just cares about getting knowledge 02:26:02.400 |
Listen, I've been criticizing the science community 02:26:07.400 |
There's so many brilliant people that brilliance 02:26:13.200 |
And then you start to look at the history of science, 02:26:20.880 |
And it's dark how you can use brilliant people 02:26:24.900 |
that like playing the little game of science, 02:26:28.820 |
You know, you're building, you're going to conferences, 02:26:30.740 |
you're building on top of each other's ideas, 02:26:33.280 |
Hi, I think I've realized how this particular molecule works 02:26:42.520 |
In that little game, everyone gets really excited 02:26:46.120 |
Oh, I came up with a pill that solves this problem 02:26:51.400 |
that shows the exact probability it's gonna help or not. 02:26:56.360 |
and you forget to realize this game, just like Moloch, 02:26:56.360 |
human civilization or divide human civilization 02:27:17.560 |
I mean, the effects of, I mean, it's just so, 02:27:22.920 |
have nothing to do with the biology of the virus, 02:27:29.600 |
But like one of them is the complete distrust 02:27:34.200 |
The other one is because of that public distrust, 02:27:36.400 |
I feel like if a much worse pandemic came along, 02:27:50.080 |
- And they won't be, they'll distrust every single thing 02:27:53.640 |
that any major institution is gonna tell them. 02:28:07.800 |
they very knowingly told, it was a white lie, 02:28:12.440 |
such as early on when there was clearly a shortage of masks. 02:28:17.440 |
And so they said to the public, "Oh, don't get masks, 02:28:35.200 |
you're gonna make it so you're gonna get sicker, 02:28:35.200 |
And it's pretty clear the reason why they did that 02:28:45.320 |
was because there was actually a shortage of masks 02:28:47.640 |
and they really needed it for health workers, 02:28:52.800 |
But the cost of lying to the public when that then comes out, 02:28:57.800 |
people aren't as stupid as they think they are. 02:29:02.760 |
And that's, I think, where this distrust of experts 02:29:13.160 |
Now, that's not to say that there aren't a lot of stupid 02:29:14.800 |
people who have a lot of wacky ideas around COVID 02:29:18.200 |
but if you treat the general public like children, 02:29:21.560 |
they're going to see that, they're going to notice that, 02:29:23.480 |
and that is going to absolutely decimate the trust 02:29:26.760 |
in the public institutions that we depend upon. 02:29:29.520 |
And honestly, the best thing that could happen, 02:29:32.240 |
I wish, if Fauci and these other leaders who, 02:29:36.400 |
I mean, God, I can't imagine how nightmare his job has been 02:29:41.360 |
So I have a lot of sympathy for the position he's been in. 02:29:51.120 |
"We didn't handle this as well as we could have. 02:29:53.920 |
"These are all the things I would have done differently 02:29:56.680 |
"I apologize for this and this and this and this." 02:30:03.040 |
Maybe this would backfire, but I don't think it would. 02:30:06.240 |
'cause I've lost trust in a lot of these things. 02:30:08.840 |
But I'm fortunate that I at least know people 02:30:10.360 |
who I can go to who I think have good epistemics 02:30:13.520 |
But if they could sort of put their hands on and go, 02:30:16.360 |
"Okay, these are the spots where we screwed up. 02:30:21.200 |
"Yeah, we actually told a little white lie here. 02:30:24.960 |
Where they just did the radical honesty thing, 02:30:28.680 |
that would go so far to rebuilding public trust. 02:30:49.480 |
from being honest in that way now, when he's a leader. 02:30:49.480 |
- I mean, he still has a chance to do that, I think. 02:31:24.000 |
I don't think he's irredeemable by any means. 02:31:28.720 |
of whether there was arrogance or there or not. 02:31:30.720 |
Just know that I think, like, coming clean on the, 02:31:34.200 |
you know, it's understandable to have fucked up 02:31:37.800 |
Like, I won't expect any government to handle it well 02:31:42.880 |
so much, like, lack of information and so on. 02:31:48.760 |
okay, look, we're doing a scrutiny of where we went wrong. 02:31:51.320 |
And for my part, I did this wrong in this part. 02:32:01.960 |
Another person that screwed up in terms of trust, 02:32:10.320 |
There seems to have been a kind of dishonesty 02:32:18.080 |
in that they didn't trust people to be intelligent. 02:32:23.080 |
Like, we need to tell them what's good for them. 02:32:26.040 |
We know what's good for them, that kind of idea. 02:32:39.080 |
Nut picking is where the craziest, stupidest, 02:32:45.920 |
let's call it, you know, let's say people who are vaccine, 02:32:48.760 |
people who are vaccine hesitant, vaccine speculative, 02:32:52.080 |
you know, what social media did or the media or anyone, 02:33:07.880 |
select the craziest ones and then have that beamed to, 02:33:13.040 |
or Francis's perspective, that's what they get 02:33:15.680 |
because they're getting the same social media stuff as us. 02:33:18.960 |
I mean, they might get some more information, 02:33:20.880 |
but they too are gonna get the nuts portrayed to them. 02:33:31.280 |
And that just means they're not social media savvy. 02:33:33.640 |
So one of the skills of being on social media 02:33:37.840 |
like to understand, to put into proper context. 02:33:57.240 |
- Where the hell did that analogy come from in my mind? 02:34:01.560 |
I think you need to, there's some Freudian thing 02:34:03.760 |
we need to deeply investigate with a therapist. 02:34:08.320 |
Are you worried about AGI, superintelligence systems, 02:34:19.960 |
but I feel kind of bipolar in that some days I wake up 02:34:26.960 |
I'm like, wow, we can unlock the mysteries of the universe, 02:34:38.960 |
In some ways you need this like omnibenevolent, 02:34:48.280 |
that can like make us all not do the Moloch-y thing, 02:34:48.280 |
or redesign the system so that it's not vulnerable 02:35:04.960 |
is that maybe, you know, we can't survive without it. 02:35:13.260 |
unfortunately now that there's multiple actors 02:35:17.280 |
this was fine 10 years ago when it was just DeepMind, 02:35:23.340 |
Now it's like, the whole thing is at the same, 02:35:30.040 |
that like optimizes for speed at the cost of safety 02:35:35.280 |
and so will be the more likely ones to build the AGI, 02:35:37.360 |
you know, and that's the same cycle that you're in. 02:35:45.020 |
if you go and try and like stop all the different companies, 02:35:51.160 |
then it will, you know, the good ones will stop 02:35:57.680 |
but then that leaves all the other ones to continue 02:36:08.640 |
I know at least some of the folks at DeepMind 02:36:12.120 |
and they're incredible and they're thinking about this. 02:36:13.760 |
They're very aware of this problem and they're like, 02:36:15.720 |
you know, I think some of the smartest people on earth. 02:36:22.000 |
and they're some of the best machine learning engineers. 02:36:26.240 |
So it's possible to have a company or a community of people 02:36:31.720 |
and are thinking about the philosophical topics. 02:36:33.760 |
- Exactly, and importantly, they're also game theorists, 02:36:38.240 |
a game theory problem, the thing, this Moloch mechanism 02:36:41.640 |
and like, you know, how do we avoid arms race scenarios? 02:36:46.640 |
You need people who aren't naive to be thinking about this. 02:36:50.040 |
And again, like luckily there's a lot of smart, 02:36:56.000 |
And I think it's again, a thing that we need people 02:37:02.360 |
how do we create, how do we mitigate the arms race dynamics 02:37:10.280 |
Bostrom calls it the orthogonality problem whereby, 02:37:13.560 |
because obviously there's a chance, you know, 02:37:17.480 |
is that you build something that's super intelligent 02:37:19.760 |
and by definition of being super intelligent, 02:37:25.160 |
and have the wisdom to know what the right goals are. 02:37:27.840 |
And hopefully those goals include keeping humanity alive. 02:37:30.960 |
Right, but Bostrom says that actually those two things, 02:37:35.320 |
you know, super intelligence and super wisdom 02:37:42.880 |
And how do we make it so that they are correlated? 02:37:48.920 |
- But I think that like merging of intelligence and wisdom, 02:37:58.840 |
that we're constantly having these kinds of debates, 02:38:09.600 |
- Yes, buying time is a good thing, definitely. 02:38:27.800 |
we can't even make it through the next few decades 02:38:39.360 |
- Well, there is, I'm suspicious of that kind of thinking 02:38:42.240 |
because it seems like the entirety of human history 02:38:45.160 |
has people in it that are like predicting doom 02:39:04.320 |
I've talked and listened to a bunch of people 02:39:15.720 |
is they love to be the person that kind of figured out 02:39:23.400 |
it's going to mark something extremely important 02:39:31.320 |
When in reality, most of us will be forgotten 02:39:42.000 |
whenever you lose loved ones or just tragedy happens, 02:40:08.720 |
I mean, it depends on the kind of nuclear war, 02:40:10.960 |
but in case of nuclear war, it will still go on. 02:40:18.320 |
I just, I feel like the doom and gloom thing is a-- 02:40:23.760 |
- Well, we don't want a self-fulfilling prophecy. 02:40:32.980 |
from the amount of time we've spent in this conversation 02:40:45.000 |
these bad scenarios can be a self-fulfilling prophecy. 02:40:51.180 |
with at least making people aware of the problem 02:40:56.280 |
the reason why I wanna talk about this to your audience 02:40:58.360 |
is that on average, they're the type of people 02:41:04.960 |
and they can sort of sense that there's trouble brewing. 02:41:24.260 |
So it's right, I think, for people to be thinking about it. 02:41:38.060 |
because it gives you at least this anchor of hope. 02:41:44.780 |
I do think there's something out there that wants us to win. 02:41:47.700 |
I think there's something that really wants us to win. 02:42:07.720 |
We are the ones who have to come up with the solutions, 02:42:21.340 |
all the destructive trajectories that lay in our future 02:42:44.880 |
or they're looking at just the waves going in and out. 02:42:47.880 |
And ultimately there's a kind of deep belief there 02:42:50.440 |
in the momentum of humanity to figure it all out. 02:42:55.440 |
- I think we'll make it, but we've got a lot of work to do. 02:43:11.760 |
Battle of Polytopia is a really radical simplification 02:43:20.540 |
It still has a lot of the skill tree development, 02:43:33.260 |
It's one of the most elegantly designed games I've ever seen. 02:43:46.220 |
the dopamine reward circuits in our brains very well. 02:43:59.980 |
I have in my notes, energy healing question mark. 02:44:18.300 |
The other craziest thing that's happened to me 02:44:23.740 |
I started getting this weird problem in my ear 02:44:30.340 |
where it was kind of like low frequency sound distortion, 02:44:44.140 |
and it was almost like a physical sensation in my ear, 02:44:48.140 |
And it would last for a few hours and then go away, 02:44:51.020 |
and then come back for a few hours and go away. 02:45:10.580 |
where people basically end up losing their hearing, 02:45:13.060 |
it often comes with dizzy spells and other things, 02:45:16.300 |
'cause it's like the inner ear gets all messed up. 02:45:18.740 |
Now, I don't know if that's actually what I had, 02:45:21.820 |
but that's what at least one doctor said to me. 02:45:24.980 |
But anyway, so I'd had three months of this stuff, 02:45:27.020 |
this going on, and it was really getting me down. 02:45:32.580 |
Don't mean to be that person talking about Burning Man. 02:45:40.100 |
'cause Burning Man is a very loud, intense place. 02:45:45.980 |
I get talking to this girl who's a friend of a friend. 02:45:51.060 |
"Oh, I'm really down in the dumps about this." 02:46:24.780 |
She's like, "No, no, no, there's something there. 02:46:26.660 |
And I was like, "No, no, no, I really don't like it. 02:46:31.940 |
and I don't know how long, for a few minutes. 02:46:33.740 |
And then she eventually collapses on the ground, 02:46:54.700 |
She said it was something very unpleasant and dark. 02:46:59.900 |
You'll have the physical symptoms for a couple of weeks, 02:47:10.020 |
I'd had something bad in me that made someone feel bad, 02:47:16.260 |
"Wait, I thought, you do this, this is the thing. 02:47:20.340 |
Like you pulled like some kind of exorcism or something? 02:47:24.620 |
So it, like just, the most insane experience. 02:47:35.380 |
But my ear problem went away about a couple of weeks later, 02:47:39.540 |
and touch wood, I've not had any issues since. 02:48:16.500 |
Maybe there's a whole science of what we call placebo. 02:48:23.940 |
And I mean, I don't know what the problem was. 02:48:30.540 |
"Oh, that's how, you know, if they do have that 02:48:43.260 |
it comes with this like baggage of like frame. 02:48:49.820 |
All I can do is describe the experience and what happened. 02:48:52.780 |
I cannot put an ontological framework around it. 02:48:56.460 |
I can't say why it happened, what the mechanism was, 02:49:00.020 |
what the problem even was in the first place. 02:49:06.980 |
And fortunately for me, it made the problem go away. 02:49:13.500 |
this took me on this journey of becoming more humble 02:49:18.780 |
I was in the like Richard Dawkins train of atheism 02:49:25.660 |
We know, you know, the only way we can get through, 02:49:30.820 |
and chemical interactions and that kind of stuff. 02:49:39.500 |
And that doesn't mean that it's ascientific as well. 02:49:43.540 |
'Cause, you know, the beauty of the scientific method 02:49:47.140 |
is that it still can apply to this situation. 02:49:51.340 |
I would like to try and test this experimentally. 02:49:55.540 |
I don't know how we would go about doing that. 02:49:57.100 |
We'd have to find other people with the same condition. 02:49:58.860 |
I guess, and like try and repeat the experiment. 02:50:02.780 |
But it doesn't, just because something happens 02:50:06.940 |
that's sort of out of the realms of our current understanding 02:50:13.820 |
- Yeah, I think the scientific method sits on a foundation 02:50:38.940 |
Like we haven't really figured this whole thing out. 02:50:41.580 |
- But at the same time, we have found ways to act, 02:50:45.460 |
you know, we're clearly doing something right. 02:50:47.500 |
Because think of the technological scientific advancements, 02:50:52.460 |
would blow people's minds even from 100 years ago. 02:50:55.620 |
- Yeah, and we've even allegedly gone out to space 02:51:00.340 |
Although I still haven't, I have not seen evidence 02:51:02.500 |
of the earth being round, but I'm keeping an open mind. 02:51:06.540 |
Speaking of which, you studied physics and astrophysics. 02:51:23.620 |
Like when did you fall in love with astronomy 02:51:33.380 |
but particularly my mom, my mom is like the most nature, 02:51:38.300 |
she is mother earth, is the only way to describe her. 02:51:41.180 |
Just, she's like Dr. Dolittle, animals flock to her 02:51:53.500 |
she doesn't have any, she never went to university 02:51:57.100 |
or anything like that, she's actually phobic of maths. 02:52:01.060 |
I was trying to teach her poker and she hated it. 02:52:03.860 |
But she's so deeply curious and that just got instilled in me 02:52:13.220 |
when it was warm enough in the UK to do that. 02:52:15.820 |
And we would just lie out there until we fell asleep 02:52:18.740 |
looking for satellites, looking for shooting stars. 02:52:22.140 |
And I was just always, I don't know whether it was from that 02:52:31.980 |
And also the like the most layers of abstraction. 02:52:44.140 |
it also made logical sense in that it was a degree 02:52:47.460 |
that was a subject that ticked the box of being, 02:53:02.140 |
I thought I was gonna become like a research scientist. 02:53:04.460 |
My original plan was I wanna be a professional astronomer. 02:53:08.660 |
that asks the big questions and it's not like biology 02:53:14.100 |
and the path to go to medical school or something like that 02:53:19.140 |
is very pragmatic. - The more pragmatic side. 02:53:23.140 |
- But this is, yeah, physics is a good combination 02:53:30.220 |
Yeah, I mean, it wasn't like I did an immense amount 02:53:34.380 |
It just was like this, it made the most sense. 02:53:38.220 |
I mean, you have to make this decision in the UK, age 17 02:53:41.300 |
which is crazy 'cause in US, you go the first year, 02:53:48.540 |
- Yeah, I think the first few years of college, 02:53:50.020 |
you focus on the drugs and only as you get closer 02:54:05.180 |
When you looked up at the stars with your mom 02:54:15.780 |
I would imagine she would take the viewpoint of, 02:54:21.300 |
she knows there's a huge number of potential spawn sites 02:54:29.460 |
- Yeah, spawn sites in Polytopia, we spawned on Earth. 02:54:29.460 |
'Cause it feels like life that originated on earth 02:55:00.980 |
it doesn't exclude the completely different forms of life 02:55:04.340 |
and different biochemical soups can't also spawn, 02:55:09.140 |
but I guess it implies that there's some spark 02:55:12.580 |
that is uniform, which I kind of like the idea of it. 02:55:19.220 |
like after it dies, like what happens if life on earth ends? 02:55:32.820 |
If it's a paperclip maximizer, not for the example, 02:55:39.620 |
high on the capabilities, very low on the wisdom type thing. 02:55:44.380 |
So whether that's gray goo, green goo, like nanobots 02:55:51.180 |
that thinks it needs to turn everything into paperclips. 02:55:57.700 |
then it's gonna be very hard for life, complex life, 02:56:05.660 |
deeply low complexity, over-optimization on a single thing, 02:56:08.620 |
sacrificing everything else, turning the whole world into-- 02:56:12.180 |
like if we actually take a paperclip maximizer, 02:56:27.420 |
- So it becomes a multi-planetary paperclip maximizer? 02:56:38.660 |
'cause it's a hypothetical thought experiment, 02:56:41.220 |
much practical application to the AI safety problem, 02:56:43.260 |
but it's just a fun thing to play around with. 02:56:45.260 |
But if by definition it is maximally intelligent, 02:56:54.740 |
but extremely bad at choosing goals in the first place, 02:56:58.060 |
so again, we're talking on this orthogonality thing, right? 02:57:00.060 |
It's very low on wisdom, but very high on capability. 02:57:03.720 |
Then it will figure out how to jump the vacuum gap 02:57:17.800 |
is necessarily all about maximizing paperclips. 02:57:27.360 |
and is willing to do anything to accomplish that goal, 02:57:32.280 |
and all human life and all of consciousness in the universe 02:57:36.320 |
for the goal of producing a maximum number of paperclips. 02:58:10.320 |
different people have different perspectives. 02:58:12.480 |
But don't you think within the paperclip world 02:58:16.920 |
just like in the zeros and ones that make up a computer, 02:58:27.800 |
as you scale to multiple planets and throughout, 02:58:33.240 |
that on top of the fabric of maximizing paperclips, 02:58:38.240 |
that would emerge like little societies of paperclip-- 02:58:51.560 |
it is literally just a piece of bent iron, right? 02:58:55.560 |
So if it's maximizing that throughout the universe, 02:59:13.880 |
will just emerge and create through gravity or something. 02:59:18.920 |
'cause there's a dynamic element to the whole system. 02:59:21.560 |
It's not just, it's creating those paperclips. 02:59:24.760 |
And the act of creating, there's going to be a process, 02:59:32.600 |
There's a whole complex three-dimensional system 02:59:44.520 |
They can be interacting in very interesting ways 02:59:46.520 |
as you scale exponentially through three-dimensional. 03:00:02.640 |
- I love your optimism. - It has to understand 03:00:06.200 |
we're going into the realm of pathological optimism, 03:00:16.280 |
- So you're saying that basically intelligence 03:00:18.440 |
is inherent in the fabric of reality and will find a way, 03:00:21.640 |
kind of like Goldblum says, "Life will find a way." 03:00:25.160 |
even out of this perfectly homogenous dead soup. 03:00:31.520 |
It has to, it's perfectly maximal in the production. 03:00:34.920 |
I don't know why people keep thinking it's homogenous. 03:00:59.520 |
that will make it beautiful. - You think that even out of- 03:01:05.640 |
with the whole heat death of the universe, right? 03:01:08.120 |
'Cause that's another sort of instantiation of this. 03:01:10.120 |
It's like everything becomes so far apart and so cold 03:01:20.440 |
Do you think that even out of that homogenous grayness 03:01:38.200 |
will figure out ways to travel to other universes 03:01:43.560 |
or through black holes to create whole other worlds 03:01:46.160 |
to break what we consider are the limitations of physics. 03:01:52.760 |
The paperclip maximizer will find a way if a way exists. 03:01:56.720 |
And we should be humble to realize that we don't- 03:01:59.040 |
- Yeah, but because it just wants to make more paperclips. 03:02:18.320 |
- Whether it's, yeah, whether it's, you know, 03:02:20.320 |
Planck lengths or paperclips as the base unit. 03:02:31.320 |
It has, like, I don't know if you can summarize it. 03:02:37.320 |
and yet out of them amazing complexity emerges. 03:02:39.640 |
- And its goals seem to be pretty basic and dumb. 03:02:52.000 |
I don't know if you can assign goals to physics, 03:03:09.120 |
That's where intelligence, that's where humans emerge. 03:03:19.000 |
is that you think that the force of emergence itself 03:03:31.800 |
And you're trusting that emergence will find a way 03:03:34.880 |
to even out of seemingly the most Moloch-y awful, 03:03:39.880 |
plain outcome emergence will still find a way. 03:03:52.920 |
How about we build the paperclip maximizer and find out? 03:04:00.000 |
But the thing is, it will destroy humans in the process, 03:04:10.240 |
Would that make you sad if AI systems that are beautiful, 03:04:30.320 |
I mean, that's the reason why I'm in some ways 03:04:41.560 |
there's a chance, if we're in this hypothetical 03:04:48.240 |
it needs our atoms and energy to do something, 03:04:56.800 |
bio just kills everything on Earth and that's it. 03:05:03.340 |
in the few hundred million years it has left. 03:05:27.040 |
that he thinks it's sort of building off this Gaia theory 03:05:30.840 |
where Earth is living some form of intelligence itself 03:06:00.120 |
because if something is truly brilliant and wise and smart 03:06:17.460 |
There should be plenty of space for it to go out 03:06:33.000 |
we'll then go and create another super intelligent agent 03:06:36.600 |
because it should be omni-wise and smart enough 03:06:40.680 |
- Unless it deems humans to be kind of assholes. 03:06:44.200 |
The humans are a source of a lose-lose kind of dynamics. 03:06:55.440 |
Moloch is, that's why I think it's important to say. 03:07:00.200 |
- No, I mean, I think game theory is the source of Moloch. 03:07:03.840 |
And 'cause Moloch exists in non-human systems as well. 03:07:18.520 |
that's on an island of animals, rats out-competing 03:07:22.680 |
the ones that massively consume all the resources 03:07:31.960 |
Moloch exists in little pockets in nature as well. 03:07:35.960 |
- I wonder if it's actually a result of consequences 03:07:38.320 |
of the invention of predator and prey dynamics. 03:07:41.540 |
Maybe AI will have to kill off every organism that-- 03:07:46.540 |
- Now you're talking about killing off competition. 03:08:08.520 |
It'll put them in a zoo like we do with parasites. 03:08:22.440 |
outside of the geographically constricted region, 03:08:33.240 |
for beauty and kindness and non-Moloch behavior 03:08:41.460 |
Let me, I don't know if you answered the aliens question. 03:08:54.680 |
but I think he said it's a good chance we're alone. 03:09:18.240 |
we should be, and even if only a tiny fraction of those 03:09:23.160 |
we should be, the universe should be very noisy. 03:09:24.920 |
There should be evidence of Dyson spheres or whatever, 03:09:29.760 |
But seemingly things are very silent out there. 03:09:32.560 |
Now, of course, it depends on who you speak to. 03:09:33.840 |
Some people say that they're getting signals all the time 03:09:50.720 |
So the Drake equation is basically just a simple thing 03:09:55.160 |
of trying to estimate the number of possible civilizations 03:09:59.920 |
of stars created per year by the number of stars 03:10:02.960 |
that have planets, planets that are habitable, blah, blah, 03:10:08.600 |
and depending on the range of your lower bound 03:10:12.200 |
and your upper bound point estimates that you put in, 03:10:22.480 |
was Toby Ord, a researcher at the Future of Humanity Institute. 03:10:25.680 |
They, instead of, they realized that it's like basically 03:10:31.240 |
a statistical quirk that if you put in point estimates, 03:10:41.120 |
it spans like maybe even like a couple of hundred 03:10:50.920 |
And so they, by putting stuff on a log scale, 03:10:54.360 |
or actually they did it on like a log log scale 03:10:56.120 |
on some of them, and then like ran the simulation 03:11:04.360 |
When you do that, then actually the number comes out 03:11:10.720 |
mathematically correct way of doing the calculation. 03:11:13.400 |
It's still a lot of hand-waving, as science goes, 03:11:18.720 |
I don't know what an analogy is, but it's hand-wavy. 03:11:24.760 |
and then they did a Bayesian update on it as well 03:11:27.040 |
to like factor in the fact that there is no evidence 03:11:41.400 |
intelligent civilization in our galaxy thus far, 03:11:45.120 |
and around 50/50 in the entire observable universe. 03:11:53.440 |
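A rough sketch of the approach being described: sample each Drake-equation factor from a wide, log-scale distribution instead of multiplying point estimates, and look at the share of runs where we end up effectively alone. The ranges below are illustrative assumptions, not the priors actually used in the Sandberg, Drexler and Ord paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def log_uniform(low, high, size):
    # Sample uniformly in log10 space, i.e. spread evenly across orders of magnitude.
    return 10.0 ** rng.uniform(np.log10(low), np.log10(high), size)

# Illustrative guesses at plausible ranges for each factor.
R_star = log_uniform(1, 100, n)      # star formation rate (stars/year)
f_p    = log_uniform(0.1, 1, n)      # fraction of stars with planets
n_e    = log_uniform(0.1, 1, n)      # habitable planets per such star
f_l    = log_uniform(1e-30, 1, n)    # fraction where life arises (huge uncertainty)
f_i    = log_uniform(1e-3, 1, n)     # fraction developing intelligence
f_c    = log_uniform(1e-2, 1, n)     # fraction becoming detectable
L      = log_uniform(1e2, 1e8, n)    # years a civilization stays detectable

N = R_star * f_p * n_e * f_l * f_i * f_c * L  # detectable civilizations in the galaxy

print("Median N across samples:", np.median(N))
print("Fraction of samples with N < 1 (we're plausibly alone):", np.mean(N < 1))
```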
- Well, yeah, the math around this particular equation, 03:11:55.680 |
which the equation is ridiculous on many levels, 03:12:03.240 |
is there's different things, different components 03:12:06.800 |
that can be estimated, and the error bars on which 03:12:13.840 |
And hence, throughout, since the equation came out, 03:12:17.360 |
the error bars have been coming out on different aspects. 03:12:23.300 |
this gives you a mission to reduce the error bars 03:12:30.000 |
and once you do, you can better and better understand. 03:12:32.760 |
Like in the process of reducing the error bars, 03:12:34.720 |
you'll get to understand actually what is the right way 03:12:41.480 |
how many of them there are, and all those kinds of things. 03:12:43.960 |
So I don't think it's good to use that for an estimation. 03:12:53.860 |
Like, and trying to understand the very physics-based, 03:12:59.320 |
biology, chemistry, biology-based question of what is life. 03:13:04.240 |
Maybe computation-based, what the fuck is this thing? 03:13:07.920 |
And that, like how difficult is it to create this thing? 03:13:12.160 |
It's one way to say like, how many planets like this 03:13:16.160 |
but it feels like from our very limited knowledge 03:13:23.640 |
how does, what is this thing, and how does it originate? 03:13:42.480 |
and these like weird systems that encode information 03:13:46.360 |
and pass information from-- - Self-replicate. 03:13:48.480 |
- Self-replicate, and then also select each other 03:13:51.080 |
and mutate in interesting ways such that they can adapt 03:13:53.440 |
and evolve and build increasingly more complex systems. 03:13:56.600 |
- Right, well, it's a form of information processing. 03:14:05.640 |
which then results in, I guess, information processing? 03:14:26.680 |
- The goal is, well, the goal is to make more of itself. 03:14:44.240 |
in a way that maximizes the chance of its survival. 03:14:47.700 |
- Individual agents within an ecosystem do, yes. 03:15:06.520 |
I think the mistake is that we're anthropomorphizing, 03:15:26.500 |
the mistake we make when we try and put our mind, 03:15:29.220 |
think through things from an evolutionary perspective, 03:15:32.340 |
as though we're giving evolution some kind of agency 03:15:43.980 |
that say that anthropomorphization is a mistake, 03:15:48.760 |
that anthropomorphization is a mistake is a mistake. 03:15:51.620 |
I think there's a lot of power in anthropomorphization, 03:15:54.980 |
if I can only say that word correctly one time. 03:15:57.500 |
I think that's actually a really powerful way 03:16:00.820 |
to reason through things, and I think people, 03:16:04.380 |
seem to run away from it as fast as possible, 03:16:09.060 |
- Can you give an example of how it helps in robotics? 03:16:12.220 |
- Oh, in that our world is a world of humans, 03:16:19.060 |
and to see robots as fundamentally just tools 03:16:23.340 |
runs away from the fact that we live in a world, 03:16:30.020 |
that all these game theory systems we've talked about, 03:16:33.760 |
that a robot that ever has to interact with humans, 03:16:37.580 |
and I don't mean intimate friendship interaction, 03:16:42.500 |
where it has to deal with the uncertainty of humans, 03:16:45.540 |
you have to acknowledge that the robot's behavior 03:16:52.020 |
just as much as the human has an effect on the robot, 03:17:00.540 |
this is obvious in a physical manifestation of a robot, 03:17:07.820 |
they have their own personal life projections, 03:17:14.840 |
they have their own memories of what a dog is like, 03:17:18.540 |
and that's gonna be useful in a safety setting, 03:17:22.240 |
which is one of the most trivial settings for a robot, 03:17:25.100 |
in terms of how to avoid any kind of dangerous situations, 03:17:37.840 |
how a robot should consider navigating its environment 03:17:43.780 |
- I also think our brains are designed to think 03:17:53.860 |
I think is best applied in the space of human decisions, 03:18:15.480 |
is because there is a danger of overly applying, 03:18:19.440 |
overly, wrongly assuming that this artificial intelligence 03:18:23.480 |
is going to operate in any similar way to us, 03:18:34.320 |
anthropomorphizing them is less of a mistake, I think, 03:18:37.520 |
than an AI, even though it's an AI we built, and so on, 03:18:40.200 |
because at least we know that they're running 03:18:45.580 |
out of the same evolutionary process, you know, 03:18:52.420 |
and needing to find a mate, and that kind of stuff, 03:18:55.180 |
whereas an AI that has just popped into existence 03:19:23.540 |
their method of whatever their form of thinking is, 03:20:17.620 |
'cause it's something I'm still figuring out myself, 03:20:23.940 |
Don't see everything as a zero-sum game, 03:20:40.940 |
they're like, "Oh, she's a pro, I wanna do that too." 03:20:56.780 |
but don't try making a living from it these days, 03:21:04.460 |
really, really be aware of how much time you spend 03:21:15.820 |
every moment that you spend on it is bad for you. 03:21:20.140 |
but just have that running in the background. 03:21:27.220 |
- Of course, about becoming a professional poker player, 03:21:42.660 |
Find a thing that you can't be talked out of, too. 03:22:39.820 |
I'm a guitarist, more like a classic rock guitarist. 03:22:50.780 |
what's the better metal band, Metallica versus Pantera? 03:22:53.900 |
This is more of a '90s discussion, maybe. 03:23:05.780 |
But they were, basically everybody was against me. 03:23:16.180 |
- I think that's crazy. - 'Cause Metallica's pop, 03:23:24.420 |
you can't say who was the godfather of metal, blah, blah, blah. 03:23:26.540 |
But they were so groundbreaking and so brilliant. 03:23:31.540 |
I mean, you've named literally two of my favorite bands. 03:23:35.620 |
When you ask that question of who are my favorites, 03:23:41.500 |
who I just think, ugh, they just tick all the boxes for me. 03:23:48.260 |
Nowadays, I kind of sort of feel like a repulsion to the, 03:23:56.900 |
Come on, who's, like, no, you have to rank them. 03:23:58.940 |
But it's like this false zero-sum-ness that's like, why? 03:24:04.620 |
- Although, when people ask that kind of question 03:24:07.020 |
about anything, movies, I feel like it's hard work 03:24:11.460 |
and it's unfair, but still, you should pick one. 03:24:31.820 |
- Can I just like, what, why does this matter? 03:24:39.820 |
- Oh, like, do you use it for, like, motivation 03:24:44.620 |
- Yeah, I was weirdly listening to 80s hair metal 03:25:00.020 |
But yeah, sorry, to answer your question about guitar playing, 03:25:09.140 |
My objective would be to hear some really hard technical solo 03:25:12.180 |
and then learn it, memorize it, and then play it perfectly. 03:25:15.980 |
But I was incapable of trying to write my own music. 03:25:19.220 |
Like, the idea was just absolutely terrifying. 03:25:24.660 |
it'd be kind of cool to actually try starting a band again 03:25:38.140 |
Like, I play Comfortably Numb on the internet. 03:25:47.700 |
with technical playing, both on piano and guitar. 03:25:50.300 |
And one of the reasons I started learning guitar 03:26:01.740 |
And one of the first solos I learned is that, 03:26:09.260 |
- Yeah, there's some tapping, but it's just really fast. 03:26:16.900 |
But there's a melody that you can hear through it, 03:26:21.620 |
It's a beautiful solo, but it's also technically, 03:26:23.900 |
just visually the way it looks when a person's watching, 03:26:36.420 |
that I think requires you to generate beautiful music. 03:26:42.340 |
And so that took me a long time to let go of that 03:26:52.260 |
I think that journey is a little bit more inspired 03:26:59.940 |
But I think ultimately it's a more rewarding journey 03:27:04.940 |
'cause your relationship with the guitar changes. 03:27:20.020 |
It used to be something to tame and defeat. 03:27:24.780 |
- Which was kind of what my whole personality was back then. 03:27:33.140 |
Whereas writing music, you work with it, it's like a dance. 03:27:37.860 |
- But I think because of the competitive aspect, 03:27:47.180 |
I think there's just like a harsh self-criticism 03:27:55.420 |
- I mean, there's certain things that feel really personal. 03:27:59.300 |
And on top of that, as we talked about poker offline, 03:28:03.380 |
there's certain things where you get to a certain height 03:28:05.580 |
in your life, and that doesn't have to be very high, 03:28:15.940 |
And it's hard to, like you being at a very high level 03:28:19.860 |
in poker, it might be hard for you to return to poker 03:28:24.620 |
knowing that you're just not as sharp as you used to be 03:28:26.900 |
because you're not doing it every single day. 03:28:29.460 |
That's something I always wonder with, I mean, 03:28:47.260 |
it's like accepting the fact that this whole ride is finite 03:28:51.460 |
and you have a prime, there's a time when you were really 03:29:02.940 |
- But you can still discover joy within that process. 03:29:06.500 |
It's been tough, especially with some level of like, 03:29:12.100 |
and people film stuff, you don't have the privacy 03:29:16.620 |
of just sharing something with a few people around you. 03:29:29.140 |
But all those pressures aside, if you really, 03:29:31.860 |
you can step up and still enjoy the fuck out of 03:29:37.900 |
What do you think is the meaning of this whole thing? 03:29:47.340 |
you have to live up, do you feel the requirement 03:30:22.060 |
- I mean, I feel the urge to live up to that, 03:30:31.980 |
or is good something completely separate to that? 03:30:45.180 |
- I think to explore, have fun, and understand, 03:30:51.820 |
and make more of what's here, and to keep the game going. 03:31:10.460 |
to try and put it into a vaguely scientific term, 03:31:18.340 |
the length of code required to describe the universe 03:31:22.660 |
And highly complex, and therefore interesting. 03:31:26.260 |
Because again, I know we banged the metaphor to death, 03:31:37.180 |
doesn't require that much code to describe. 03:31:42.700 |
but that steady state, assuming a steady state, isn't very interesting. 03:31:45.620 |
Whereas it seems like our universe is over time 03:31:49.660 |
becoming more and more complex and interesting. 03:31:51.900 |
There's so much richness and beauty and diversity 03:31:53.940 |
on this Earth, and I want that to continue and get more. 03:32:12.060 |
- 'Cause we do create a lot of fun things along the way. 03:32:22.460 |
And perhaps that has to do with the finiteness of life, 03:32:31.300 |
Like the fact that they end, there's this, whatever it is, 03:33:08.420 |
I don't know what that is, the finiteness of it. 03:33:13.820 |
I mean a big thing I think that one has to learn 03:33:25.620 |
people cling onto things beyond what they're meant to be, 03:33:37.060 |
I think it's obvious, as we've talked about many times, 03:33:41.380 |
You should, you're already doing a lot of stuff publicly 03:33:47.420 |
You're a great educator, you're a great mind, 03:33:50.300 |
But it's also this whole medium of just talking 03:34:09.380 |
and if you're talking to others to understand them better 03:34:22.880 |
that like melt together in just hilarious ways, 03:34:26.100 |
fascinating ways, just the tension of ideas there 03:34:30.260 |
But in general, I think you got a beautiful voice. 03:34:36.540 |
Thank you for honoring me with this conversation 03:34:42.580 |
- Thanks for listening to this conversation with Liv Boeree. 03:34:46.300 |
To support this podcast, please check out our sponsors in the description. 03:34:53.040 |
"I think it's much more interesting to live not knowing 03:34:59.300 |
I have approximate answers and possible beliefs 03:35:08.540 |
And there are many things I don't know anything about 03:35:11.460 |
such as whether it means anything to ask why we're here. 03:35:20.760 |
by being lost in a mysterious universe without any purpose, 03:35:24.300 |
which is the way it really is as far as I can tell." 03:35:27.620 |
Thank you for listening and hope to see you next time.