
Liv Boeree: Poker, Game Theory, AI, Simulation, Aliens & Existential Risk | Lex Fridman Podcast #314


Chapters

0:00 Introduction
0:58 Poker and game theory
8:01 Dating optimally
12:53 Learning
21:05 Daniel Negreanu
26:13 Phil Hellmuth
28:46 Greatest poker player ever
33:04 Bluffing
43:25 Losing
52:41 Mutually assured destruction
57:35 Simulation hypothesis
74:13 Moloch
103:25 Beauty
115:33 Quantifying life
135:43 Existential risks
154:06 AI
163:57 Energy healing
171:07 Astrophysics
174:01 Aliens
199:43 Advice for young people
201:48 Music
209:37 Meaning of life

Transcript

00:00:00.000 | evolutionarily, if we see a lion running at us,
00:00:03.040 | we didn't have time to calculate the lion's kinetic energy
00:00:06.360 | and is it optimal to go this way or that way?
00:00:08.840 | You just reacted.
00:00:10.440 | And physically, our bodies are well-attuned
00:00:13.240 | to actually make right decisions.
00:00:14.520 | But when you're playing a game like poker,
00:00:16.640 | this is not something that you ever evolved to do,
00:00:19.320 | and yet you're in that same flight or fight response.
00:00:22.360 | And so that's a really important skill
00:00:24.440 | to be able to develop,
00:00:25.260 | to basically learn how to meditate in the moment
00:00:28.280 | and calm yourself so that you can think clearly.
00:00:30.680 | - The following is a conversation with Liv Boeree,
00:00:35.760 | formerly one of the best poker players in the world,
00:00:38.400 | trained as an astrophysicist
00:00:40.320 | and is now a philanthropist and an educator
00:00:44.240 | on topics of game theory, physics, complexity, and life.
00:00:48.280 | This is the Lex Fridman Podcast.
00:00:51.100 | To support it, please check out our sponsors
00:00:53.200 | in the description.
00:00:54.440 | And now, dear friends, here's Liv Boeree.
00:00:58.580 | What role do you think luck plays in poker and in life?
00:01:02.620 | You can pick whichever one you want,
00:01:04.300 | poker or life, and/or both.
00:01:06.980 | - The longer you play, the less influence luck has,
00:01:10.700 | you know, like with all things,
00:01:11.700 | the bigger your sample size,
00:01:13.940 | the more the quality of your decisions
00:01:16.420 | or your strategies matter.
00:01:17.960 | So to answer that question, yeah, in poker,
00:01:21.620 | it really depends.
00:01:22.820 | If you and I sat and played 10 hands right now,
00:01:26.180 | I might only win 52% of the time, 53% maybe.
00:01:30.300 | But if we played 10,000 hands,
00:01:31.760 | then I'll probably win like over 98, 99% of the time.
00:01:35.180 | So it's a question of sample sizes.
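
A quick way to see the sample-size point she is making is a minimal simulation sketch: model each hand as a plus/minus one unit flip that the stronger player wins slightly more than half the time (the 52-53% figure above, used here purely as an illustrative number, not a claim about real hand-by-hand win rates), and check how often they finish ahead.

```python
import random

def prob_better_player_ahead(edge=0.53, hands=10, trials=100_000):
    """Estimate how often the stronger player finishes in profit after `hands` hands,
    modeling each hand as a +/-1 unit flip won with probability `edge`.
    (A deliberately crude illustration of the sample-size point, not real poker.)"""
    ahead = 0
    for _ in range(trials):
        profit = sum(1 if random.random() < edge else -1 for _ in range(hands))
        if profit > 0:
            ahead += 1
    return ahead / trials

print(prob_better_player_ahead(hands=10))      # close to a coin flip over 10 hands
print(prob_better_player_ahead(hands=10_000))  # essentially 1.0: the edge dominates
```

Over ten hands the result is barely distinguishable from chance; over ten thousand, the small edge decides the outcome almost every time, which is the convergence she describes.
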
00:01:38.220 | - And what are you figuring out over time?
00:01:40.160 | The betting strategy that this individual does
00:01:42.300 | or literally it doesn't matter
00:01:43.780 | against any individual over time?
00:01:45.940 | - Against any individual over time, the better player,
00:01:48.020 | because they're making better decisions.
00:01:49.540 | So what does that mean to make a better decision?
00:01:51.300 | Well, to get into the real nitty gritty already,
00:01:55.340 | basically poker is a game of math.
00:01:57.800 | There are these strategies,
00:02:00.020 | familiar with like Nash equilibria, that term, right?
00:02:02.540 | So there are these game theory optimal strategies
00:02:06.300 | that you can adopt.
00:02:08.300 | And the closer you play to them,
00:02:10.340 | the less exploitable you are.
00:02:12.400 | So because I've studied the game a bunch,
00:02:15.940 | although admittedly not for a few years,
00:02:17.260 | but back in, you know, when I was playing all the time,
00:02:20.020 | I would study these game theory optimal solutions
00:02:23.020 | and try and then adopt those strategies
00:02:24.900 | when I go and play.
00:02:25.740 | So I'd play against you and I would do that.
00:02:27.740 | And because the objective,
00:02:31.820 | when you're playing game theory optimal,
00:02:33.220 | it's actually, it's a loss minimization thing
00:02:35.980 | that you're trying to do.
00:02:37.340 | Your best bet is to try and play a sort of similar style.
00:02:42.340 | You also need to try and adopt this loss minimization.
00:02:46.180 | But because I've been playing much longer than you,
00:02:48.100 | I'll be better at that.
00:02:49.380 | So first of all, you're not taking advantage
00:02:51.900 | of my mistakes.
00:02:53.140 | But then on top of that,
00:02:54.980 | I'll be better at recognizing
00:02:56.900 | when you are playing suboptimally
00:02:59.500 | and then deviating from this game theory optimal strategy
00:03:02.180 | to exploit your bad plays.
00:03:05.100 | - Can you define game theory and Nash equilibria?
00:03:08.660 | Can we try to sneak up to it in a bunch of ways?
00:03:10.900 | Like what's a game theory framework of analyzing poker,
00:03:14.340 | analyzing any kind of situation?
00:03:16.300 | - So game theory is just basically the study
00:03:20.060 | of decisions within a competitive situation.
00:03:25.060 | I mean, it's technically a branch of economics,
00:03:27.660 | but it also applies to like wider decision theory.
00:03:30.860 | And usually when you see it,
00:03:35.780 | it's these like little payoff matrices and so on.
00:03:37.860 | That's how it's depicted.
00:03:38.740 | But it's essentially just like study of strategies
00:03:41.100 | under different competitive situations.
00:03:43.020 | And as it happens, certain games,
00:03:46.340 | in fact, many, many games,
00:03:47.980 | have these things called Nash equilibria.
00:03:50.460 | And what that means is when you're in a Nash equilibrium,
00:03:52.380 | basically it is not,
00:03:55.460 | there is no strategy that you can take
00:03:59.700 | that would be more beneficial
00:04:01.180 | than the one you're currently taking,
00:04:02.780 | assuming your opponent is also doing the same thing.
00:04:05.660 | So it would be a bad idea.
00:04:06.700 | If we're both playing in a game theory optimal strategy,
00:04:10.660 | if either of us deviate from that,
00:04:12.140 | now we're putting ourselves at a disadvantage.
00:04:16.620 | Rock, paper, scissors is actually
00:04:17.620 | a really great example of this.
00:04:18.860 | Like if we were to start playing rock, paper, scissors,
00:04:22.420 | you know, you know nothing about me
00:04:23.780 | and we're gonna play for all our money,
00:04:26.220 | let's play 10 rounds of it.
00:04:27.860 | What would your sort of optimal strategy be, do you think?
00:04:30.860 | What would you do?
00:04:31.780 | - Let's see.
00:04:35.220 | I would probably try to be as random as possible.
00:04:40.220 | - Exactly.
00:04:43.580 | You wanna, because you don't know anything about me.
00:04:46.060 | You don't want to give anything away about yourself.
00:04:48.140 | So ideally you'd have like a little dice
00:04:49.580 | or somewhat, you know, perfect randomizer
00:04:52.620 | that makes you randomize 33% of the time
00:04:54.540 | each of the three different things.
00:04:56.100 | And in response to that,
00:04:57.340 | well, actually I can kind of do anything,
00:04:59.660 | but I would probably just randomize back too.
00:05:01.420 | But actually it wouldn't matter
00:05:02.420 | 'cause I know that you're playing randomly.
00:05:05.180 | So that would be us in a Nash equilibrium
00:05:07.660 | where we're both playing this like unexploitable strategy.
00:05:10.660 | However, if after a while you then notice
00:05:13.060 | that I'm playing rock a little bit more often than I should.
00:05:16.420 | - Yeah, you're the kind of person that would do that.
00:05:18.180 | Wouldn't you?
00:05:19.020 | - Sure, yes, yes, yes.
00:05:19.980 | I'm more of a scissors girl, but anyway.
00:05:21.340 | - You are?
00:05:22.540 | - No, I'm a, as I said, randomizer.
00:05:24.460 | So you notice I'm throwing rock too much
00:05:27.140 | or something like that.
00:05:28.180 | Now you'd be making a mistake by continuing
00:05:30.140 | playing this game theory optimal strategy,
00:05:32.220 | well, the previous one,
00:05:33.060 | because you are now, I'm making a mistake
00:05:37.860 | and you're not deviating and exploiting my mistake.
00:05:41.340 | So you'd wanna start throwing paper a bit more often
00:05:43.820 | in whatever you figure is the right sort of percentage
00:05:45.980 | of the time that I'm throwing rock too often.
00:05:48.100 | So that's basically an example of where,
00:05:51.740 | what game theory optimal strategy is
00:05:53.300 | in terms of loss minimization,
00:05:54.700 | but it's not always the maximally profitable thing
00:05:58.060 | if your opponent is doing stupid stuff,
00:06:00.180 | which in that example.
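
The rock-paper-scissors example can be written out as a tiny payoff calculation. The numbers below are just the standard RPS payoffs, and the sketch shows the two properties she describes: the uniform (Nash) mix cannot be exploited, but it also only breaks even against a leaky opponent, whereas deviating to the best response actually profits.

```python
import numpy as np

# Rock-paper-scissors payoffs for the row player: +1 win, -1 loss, 0 tie.
# Move order: [rock, paper, scissors]
PAYOFF = np.array([[ 0, -1,  1],
                   [ 1,  0, -1],
                   [-1,  1,  0]])

def ev(mine, theirs):
    """Expected payoff per round of mixed strategy `mine` against `theirs`."""
    return mine @ PAYOFF @ theirs

uniform = np.array([1/3, 1/3, 1/3])        # the Nash equilibrium mix
rock_heavy = np.array([0.5, 0.25, 0.25])   # an opponent throwing too much rock

print(ev(np.array([1, 0, 0]), uniform))    # 0.0: nothing gains against the equilibrium...
print(ev(uniform, uniform))                # 0.0: ...and it never loses either
print(ev(uniform, rock_heavy))             # 0.0: the GTO mix only breaks even vs the leak
print(ev(np.array([0, 1, 0]), rock_heavy)) # 0.25: all-paper exploits it for a profit
```

That gap between 0.0 and 0.25 per round is exactly the loss-minimization versus maximum-exploitation trade-off she is describing.
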
00:06:02.580 | So that's kind of then how it works in poker,
00:06:04.540 | but it's a lot more complex.
00:06:06.460 | And the way poker players typically,
00:06:10.660 | nowadays they study, the games changed so much
00:06:12.940 | and I think we should talk about how it sort of evolved.
00:06:15.460 | But nowadays like the top pros basically spend all their time
00:06:19.380 | in between sessions running these simulators
00:06:23.140 | using like software where they do basically
00:06:25.020 | Monte Carlo simulations,
00:06:26.180 | sort of doing billions of fictitious self-play hands.
00:06:31.180 | You input a fictitious hand scenario,
00:06:34.060 | like, oh, what do I do with Jack nine suited
00:06:36.100 | on a King 10 four two spade board
00:06:40.380 | and against this bet size.
00:06:43.980 | So you'd input that, press play,
00:06:45.820 | it'll run its billions of fake hands
00:06:49.380 | and then it will converge upon
00:06:50.380 | what the game theory optimal strategies are.
00:06:52.580 | And then you wanna try and memorize what these are.
00:06:55.460 | Basically they're like ratios of how often,
00:06:57.500 | what types of hands you want to bluff
00:06:59.940 | and what percentage of the time.
00:07:01.340 | So then there's this additional layer
00:07:02.820 | of inbuilt randomization built in.
00:07:04.500 | - Yeah, those kinds of simulations incorporate
00:07:06.420 | all the betting strategies and everything else like that.
00:07:08.540 | So as opposed to some kind of very crude mathematical model
00:07:12.340 | of what's the probability you win
00:07:13.860 | just based on the quality of the card,
00:07:16.380 | it's including everything else too, the game theory of it.
00:07:20.220 | - Yes, yeah, essentially.
00:07:21.980 | And what's interesting is that nowadays,
00:07:23.780 | if you want to be a top pro
00:07:25.100 | and you go and play in these really like
00:07:26.340 | the super high stakes tournaments or tough cash games,
00:07:29.220 | if you don't know this stuff,
00:07:30.700 | you're gonna get eaten alive in the long run.
00:07:33.260 | But of course you could get lucky over the short run.
00:07:35.140 | And that's where this like luck factor comes in
00:07:36.860 | because luck is both a blessing and a curse.
00:07:40.420 | If luck didn't, if there wasn't this random element
00:07:42.580 | and there wasn't the ability for worse players
00:07:45.260 | to win sometimes, then poker would fall apart.
00:07:48.460 | The same reason people don't play chess professionally
00:07:51.500 | for money against, you don't see people going
00:07:54.100 | and hustling chess, like not knowing,
00:07:56.860 | trying to make a living from it
00:07:57.860 | because you know there's very little luck in chess,
00:08:00.140 | but there's quite a lot of luck in poker.
00:08:01.420 | - Have you seen "A Beautiful Mind," that movie?
00:08:03.940 | - Years ago.
00:08:04.780 | - Well, what do you think about the game
00:08:06.100 | theoretic formulation of what is it,
00:08:08.540 | the hot blonde at the bar?
00:08:09.980 | Do you remember?
00:08:10.820 | - Oh yeah.
00:08:11.660 | - The way they illustrated it is they're trying
00:08:13.940 | to pick up a girl at a bar and there's multiple girls.
00:08:16.460 | They're like, it's like a friend group
00:08:18.140 | and you're trying to approach.
00:08:20.100 | I don't remember the details, but I remember-
00:08:21.980 | - Don't you like then speak to her friends first?
00:08:23.860 | - Yeah, yeah.
00:08:24.700 | - Something like that, feign disinterest.
00:08:25.700 | I mean, it's classic pickup artist stuff, right?
00:08:27.380 | You wanna-
00:08:28.220 | - And they were trying to correlate that somehow,
00:08:30.780 | that being an optimal strategy game, theoretically.
00:08:36.740 | What, what?
00:08:37.580 | Like, I don't think I remember-
00:08:38.900 | - I can't imagine that there is.
00:08:39.740 | I mean, there's probably an optimal strategy.
00:08:41.900 | Is it, does that mean that there's an actual Nash equilibrium
00:08:45.020 | of like picking up girls?
00:08:46.620 | - Do you know the marriage problem?
00:08:48.980 | It's optimal stopping?
00:08:50.900 | - Yes.
00:08:51.740 | - So where it's a optimal dating strategy where you,
00:08:54.980 | do you remember what it is?
00:08:56.660 | - Yeah, I think it's like something like,
00:08:57.500 | you know you've got like a set of a hundred people
00:08:59.900 | you're gonna look through and after how many do you,
00:09:04.220 | now after that, after going on this many dates out of a 100,
00:09:08.020 | at what point do you then go,
00:09:09.180 | okay, the next best person I see, is that the right one?
00:09:11.580 | And I think it's like something like 37%.
00:09:14.180 | - It's one over E, whatever that is.
00:09:17.740 | - Right, which I think is 37%.
00:09:19.300 | - Yeah.
00:09:20.140 | (laughing)
00:09:21.260 | We're gonna fact check that.
00:09:22.820 | (laughing)
00:09:24.860 | Yeah, so, but it's funny under those strict constraints,
00:09:28.420 | then yes, after that many people,
00:09:30.660 | as long as you have a fixed size pool,
00:09:32.820 | then you just pick the next person
00:09:36.460 | that is better than anyone you've seen before.
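
The "37%" they are fact-checking is 1/e, roughly 0.368, the cutoff in the classic secretary (optimal stopping) problem, and it can be checked by simulation. A small sketch under the problem's textbook assumptions (a fixed pool, you can only compare candidates you have already seen, and no going back):

```python
import math
import random

def secretary_trial(n=100, look_fraction=1/math.e):
    """One run of the optimal-stopping ('marriage') rule: look at the first ~37%
    of candidates without committing, then take the first one who beats everyone
    seen so far. Returns True if that pick turns out to be the overall best."""
    ranks = list(range(n))
    random.shuffle(ranks)                  # candidates arrive in random order; n-1 is the best
    cutoff = int(n * look_fraction)
    best_seen = max(ranks[:cutoff], default=-1)
    for r in ranks[cutoff:]:
        if r > best_seen:
            return r == n - 1              # stopped here: was it the true best?
    return False                           # never beat the benchmark; no pick made

trials = 100_000
wins = sum(secretary_trial() for _ in range(trials))
print(round(1 / math.e, 3))    # 0.368 -- the "37%" cutoff
print(wins / trials)           # the success rate also comes out near 0.37
```
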
00:09:38.460 | - Yeah.
00:09:39.300 | Have you tried this?
00:09:41.780 | Have you incorporated it?
00:09:42.620 | - I'm not one of those people.
00:09:44.660 | And we're gonna discuss this.
00:09:46.900 | And what do you mean, those people?
00:09:50.140 | I try not to optimize stuff.
00:09:52.580 | I try to listen to the heart.
00:09:55.340 | I don't think,
00:09:57.020 | I like, my mind immediately is attracted
00:10:02.180 | to optimizing everything.
00:10:06.260 | And I think that if you really give in
00:10:09.940 | to that kind of addiction,
00:10:11.100 | that you lose the joy of the small things,
00:10:14.940 | the minutia of life, I think.
00:10:17.220 | I don't know.
00:10:18.060 | I'm concerned about the addictive nature
00:10:19.820 | of my personality in that regard.
00:10:21.620 | - In some ways.
00:10:24.780 | - Well, I think the, on average,
00:10:26.540 | people under-try to quantify things
00:10:30.140 | or under-optimize.
00:10:32.540 | There are some people who,
00:10:34.220 | it's like with all these things,
00:10:35.340 | it's a balancing act.
00:10:37.140 | - I've been on dating apps, but I've never used them.
00:10:40.180 | I'm sure they have data on this
00:10:42.060 | because they probably have
00:10:43.060 | the optimal stopping control problem.
00:10:45.260 | 'Cause there aren't a lot of people that use social,
00:10:47.380 | like dating apps are on there for a long time.
00:10:51.020 | So the interesting aspect is like, all right,
00:10:56.020 | how long before you stop looking
00:10:58.740 | before it actually starts affecting your mind negatively
00:11:01.740 | such that you see dating as a kind of--
00:11:05.140 | - Game.
00:11:08.220 | - A kind of game versus an actual process
00:11:12.780 | of finding somebody that's gonna make you happy
00:11:14.420 | for the rest of your life.
00:11:15.780 | That's really interesting.
00:11:17.340 | They have the data.
00:11:18.260 | I wish they would be able to release that data.
00:11:20.420 | And I do want to--
00:11:21.260 | - It's OKCupid, right?
00:11:22.220 | I think they ran a huge, huge study on all of their--
00:11:25.340 | - Yeah, they're more data-driven, I think,
00:11:26.780 | OKCupid folks are.
00:11:28.060 | I think there's a lot of opportunity for dating apps
00:11:30.220 | in general, even bigger than dating apps,
00:11:32.340 | people connecting on the internet.
00:11:35.020 | I just hope they're more data-driven
00:11:37.580 | and it doesn't seem that way.
00:11:40.340 | I think like, I've always thought that
00:11:43.380 | Goodreads should be a dating app.
00:11:47.220 | (laughing)
00:11:48.060 | Like the-- - I've never used it.
00:11:49.460 | - The Goodreads is just lists books that you've read
00:11:54.460 | and allows you to comment on the books you read
00:11:57.380 | and what the books you're currently reading.
00:11:58.940 | But it's a giant social networks of people reading books.
00:12:01.300 | And that seems to be a much better database of interests.
00:12:04.620 | Of course, it constrains you to the books you're reading,
00:12:06.580 | but that really reveals so much more about the person.
00:12:10.220 | Allows you to discover shared interests
00:12:12.460 | because books are a kind of window
00:12:13.940 | into the way you see the world.
00:12:16.020 | Also, like the kind of places, people you're curious about,
00:12:20.380 | the kind of ideas you're curious about.
00:12:21.740 | Are you a romantic or are you cold,
00:12:23.700 | calculating rationalist?
00:12:24.860 | Are you into Ayn Rand or are you into Bernie Sanders?
00:12:28.260 | Are you into whatever?
00:12:29.940 | And I feel like that reveals so much more
00:12:31.980 | than like a person trying to look hot
00:12:35.220 | from a certain angle in a Tinder profile.
00:12:37.140 | - Well, and it also be a really great filter
00:12:38.940 | in the first place for people.
00:12:40.340 | It's like people who read books
00:12:41.940 | and are willing to go and rate them
00:12:44.700 | and give feedback on them and so on.
00:12:47.020 | So that's already a really strong filter
00:12:48.580 | of probably the type of people you'd be looking for.
00:12:50.620 | - Well, at least be able to fake reading books.
00:12:52.380 | I mean, the thing about books,
00:12:53.540 | you don't really need to read it.
00:12:54.500 | You can just look at the CliffsNotes.
00:12:55.860 | - Yeah, game the dating app by feigning intellectualism.
00:12:59.060 | - Can I admit something very horrible about myself?
00:13:02.380 | - Go on.
00:13:03.220 | - The things that, you know,
00:13:04.340 | I don't have many things in my closet,
00:13:05.700 | but this is one of them.
00:13:06.900 | I've never actually read Shakespeare.
00:13:10.700 | I've only read CliffsNotes.
00:13:12.420 | And I got a five in the AP English exam.
00:13:14.940 | - Wow.
00:13:15.900 | - And I- - Which book?
00:13:17.100 | - Which books have I read?
00:13:19.660 | - Well, yeah, which was the exam on?
00:13:21.540 | - Oh, no, they include a lot of them.
00:13:23.340 | But Hamlet, I don't even know
00:13:27.180 | if you read Romeo and Juliet, Macbeth.
00:13:30.100 | I don't remember, but I don't understand it.
00:13:32.700 | It's like really cryptic.
00:13:34.060 | - It's hard.
00:13:34.900 | - It's really, I don't,
00:13:35.980 | and it's not that pleasant to read.
00:13:37.980 | It's like ancient speak.
00:13:39.220 | I don't understand it.
00:13:40.340 | Anyway, maybe I was too dumb.
00:13:41.820 | I'm still too dumb, but I did-
00:13:44.780 | - But you got a five, which is-
00:13:45.860 | - Yeah, yeah.
00:13:46.700 | - I don't know how the US grading system-
00:13:48.020 | - Oh, no, so AP English is,
00:13:50.420 | there's kind of this advanced versions of courses
00:13:53.060 | in high school, and you take a test
00:13:54.940 | that is like a broad test for that subject
00:13:57.940 | and includes a lot.
00:13:58.940 | It wasn't obviously just Shakespeare.
00:14:00.580 | I think a lot of it was also writing, written.
00:14:04.940 | You have like AP Physics, AP Computer Science,
00:14:07.180 | AP Biology, AP Chemistry,
00:14:09.780 | and then AP English or AP Literature.
00:14:12.180 | I forget what it was,
00:14:13.460 | but I think Shakespeare was a part of that, but I-
00:14:16.980 | - And you game, the point is you gamified it.
00:14:19.180 | - Gamified, well, entirety, I was into getting As.
00:14:22.580 | I saw it as a game.
00:14:24.220 | I don't think any,
00:14:25.740 | I don't think all the learning I've done
00:14:30.380 | has been outside of school.
00:14:33.980 | The deepest learning I've done has been outside of school,
00:14:36.260 | with a few exceptions, especially in grad school,
00:14:38.420 | like deep computer science courses,
00:14:40.460 | but that was still outside of school
00:14:41.820 | 'cause it was outside of, sorry,
00:14:43.740 | it was outside of getting the A for the course.
00:14:46.220 | The best stuff I've ever done is when you read the chapter
00:14:49.660 | and you do many of the problems at the end of the chapter,
00:14:52.340 | which is usually not what's required for the course,
00:14:54.940 | like the hardest stuff.
00:14:56.180 | In fact, textbooks are freaking incredible.
00:14:58.820 | If you go back now and you look at like biology textbook
00:15:02.260 | or any of the computer science textbooks
00:15:06.100 | on algorithms and data structures,
00:15:07.780 | those things are incredible.
00:15:09.540 | They have the best summary of a subject,
00:15:11.980 | plus they have practice problems of increasing difficulty
00:15:15.340 | that allows you to truly master the basic,
00:15:17.740 | like the fundamental ideas behind that.
00:15:19.940 | - I got through my entire physics degree with one textbook
00:15:24.420 | that was just this really comprehensive one
00:15:26.300 | that they told us at the beginning of the first year,
00:15:28.660 | buy this, but you're gonna have to buy 15 other books
00:15:31.700 | for all your supplementary courses,
00:15:33.380 | and I was like, every time I would just check
00:15:35.500 | to see whether this book covered it,
00:15:36.700 | and it did, and I think I only bought like two or three extra
00:15:39.820 | and thank God, 'cause they're super expensive textbooks,
00:15:41.900 | it's a whole racket they've got going on.
00:15:44.300 | Yeah, they are, they could just,
00:15:46.220 | you get the right one, it's just like a manual for,
00:15:49.500 | but what's interesting though is,
00:15:52.940 | this is the tyranny of having exams and metrics.
00:15:56.780 | - The tyranny of exams and metrics, yes.
00:15:58.660 | - I loved them because I'm very competitive
00:16:00.980 | and I liked finding ways to gamify things
00:16:04.020 | and then like sort of dust off my shoulders afterwards
00:16:06.380 | when I get a good grade or be annoyed at myself
00:16:08.140 | when I didn't, but yeah, you're absolutely right
00:16:10.860 | in that the actual, how much of that physics knowledge
00:16:14.420 | I've retained, like I've, I learned how to cram and study
00:16:19.420 | and please an examiner, but did that give me
00:16:22.180 | the deep lasting knowledge that I needed?
00:16:24.700 | I mean, yes and no, but really like nothing makes you learn
00:16:29.100 | a topic better than when you actually then have
00:16:31.980 | to teach it yourself.
00:16:33.180 | You know, like I'm trying to wrap my teeth around this,
00:16:36.180 | like game theory, Moloch stuff right now
00:16:38.140 | and there's no exam at the end of it that I can gamify.
00:16:43.060 | There's no way to gamify and sort of like shortcut
00:16:44.940 | my way through it.
00:16:45.780 | I have to understand it so deeply
00:16:47.220 | from like deep foundational levels to then build upon it
00:16:50.700 | and then try and explain it to other people.
00:16:52.340 | And like, you're about to go and do some lectures, right?
00:16:54.340 | You can't sort of just like,
00:16:57.580 | you presumably can't rely on the knowledge
00:17:00.620 | that you got through when you were studying for an exam
00:17:03.420 | to reteach that.
00:17:04.820 | - Yeah, and especially high level lectures,
00:17:06.820 | especially the kind of stuff you do on YouTube,
00:17:09.420 | you're not just regurgitating material.
00:17:12.780 | You have to think through what is the core idea here.
00:17:17.100 | And when you do the lectures live especially,
00:17:20.740 | you have to, there's no second takes.
00:17:23.900 | That is the luxury you get if you're recording a video
00:17:28.380 | for YouTube or something like that.
00:17:30.180 | But it definitely is a luxury you shouldn't lean on.
00:17:34.800 | I've gotten to interact with a few YouTubers
00:17:37.360 | that lean on that too much.
00:17:39.320 | And you realize, oh, you've gamified this system
00:17:43.400 | because you're not really thinking deeply about stuff.
00:17:46.760 | You're through the edit, both written and spoken,
00:17:51.760 | you're crafting an amazing video,
00:17:53.920 | but you yourself as a human being
00:17:55.480 | have not really deeply understood it.
00:17:57.640 | So live teaching, or at least recording video
00:18:00.800 | with very few takes is a different beast.
00:18:04.680 | And I think it's the most honest way of doing it,
00:18:07.320 | like as few takes as possible.
00:18:09.200 | - That's why I'm nervous about this.
00:18:10.800 | (laughing)
00:18:12.040 | - Don't-- - I'll go back and be like,
00:18:12.880 | ah, let's do that.
00:18:14.040 | - Don't fuck this up, Liv.
00:18:15.520 | The tyranny of exams.
00:18:18.520 | I do think people talk about high school and college
00:18:23.520 | as a time to do drugs and drink and have fun
00:18:27.200 | and all this kind of stuff.
00:18:28.320 | But looking back, of course I did a lot of those things.
00:18:33.320 | No, yes, but it's also a time when you get to read textbooks
00:18:39.240 | or read books or learn with all the time in the world.
00:18:47.740 | You don't have these responsibilities of laundry
00:18:54.920 | and having to pay for mortgage or all that kind of stuff,
00:18:59.920 | pay taxes, all this kind of stuff.
00:19:04.060 | In most cases, there's just so much time in the day
00:19:07.360 | for learning, and you don't realize it at the time
00:19:10.280 | because at the time it seems like a chore.
00:19:12.680 | Why the hell does there's so much homework?
00:19:15.480 | But you never get a chance to do this kind of learning,
00:19:18.060 | this kind of homework ever again in life,
00:19:21.080 | unless later in life you really make a big effort out of it.
00:19:24.640 | You get, basically your knowledge gets solidified.
00:19:27.480 | You don't get to have fun and learn.
00:19:29.160 | Learning is really fulfilling and really fun
00:19:33.360 | if you're that kind of person.
00:19:34.320 | Like some people like knowledge is not something
00:19:39.320 | that they think is fun, but if that's the kind of thing
00:19:42.320 | that you think is fun, that's the time to have fun
00:19:44.880 | and do the drugs and drink and all that kind of stuff.
00:19:46.920 | But the learning, just going back to those textbooks,
00:19:51.360 | the hours spent with the textbooks
00:19:53.080 | is really, really rewarding.
00:19:55.000 | - Do people even use textbooks anymore?
00:19:56.600 | - Yeah. - Do you think?
00:19:57.640 | 'Cause-- - Kids these days
00:19:59.240 | with their TikTok and their-- - Well, not even that,
00:20:01.800 | but just like so much information,
00:20:04.680 | really high quality information
00:20:06.280 | is now in digital format online.
00:20:08.100 | - Yeah, but they're not, they are using that,
00:20:11.100 | but college is still very, there's a curriculum.
00:20:16.100 | I mean, so much of school is about rigorous study
00:20:19.640 | of a subject and still on YouTube, that's not there.
00:20:23.960 | YouTube has, Grant Sanderson talks about this,
00:20:27.760 | he's this math-- - 3Blue1Brown.
00:20:30.080 | - Yeah, 3Blue1Brown.
00:20:31.600 | He says like, "I'm not a math teacher.
00:20:33.840 | "I just take really cool concepts and I inspire people,
00:20:37.700 | "but if you wanna really learn calculus,
00:20:39.420 | "if you wanna really learn linear algebra,
00:20:41.740 | "you should do the textbook, you should do that."
00:20:45.240 | And there's still the textbook industrial complex
00:20:49.020 | that charges like $200 for a textbook and somehow,
00:20:53.400 | I don't know, it's ridiculous.
00:20:54.960 | - Well, they're like, "Oh, sorry, new edition,
00:20:58.920 | "edition 14.6, sorry, you can't use 14.5 anymore."
00:21:02.920 | It's like, "What's different?
00:21:03.760 | "We've got one paragraph different."
00:21:05.380 | - So we mentioned offline Daniel Negreanu.
00:21:08.120 | I'm gonna get a chance to talk to him on this podcast
00:21:11.640 | and he's somebody that I found fascinating
00:21:14.440 | in terms of the way he thinks about poker,
00:21:16.560 | verbalizes the way he thinks about poker,
00:21:18.640 | the way he plays poker.
00:21:20.200 | So, and he's still pretty damn good.
00:21:22.900 | He's been good for a long time.
00:21:24.900 | So you mentioned that people are running
00:21:27.280 | these kinds of simulations
00:21:28.440 | and the game of poker has changed.
00:21:30.560 | Do you think he's adapting in this way?
00:21:33.140 | Do you think like the top pros,
00:21:34.880 | do they have to adapt this way?
00:21:36.540 | Or is there still like over the years,
00:21:41.540 | you basically develop this gut feeling about,
00:21:45.300 | like you get to be like good the way,
00:21:48.580 | like alpha zero is good.
00:21:49.820 | like AlphaZero is good.
00:21:54.180 | comes out the right answer.
00:21:55.380 | Like this is likely what they have.
00:21:58.020 | This is likely the best way to move.
00:22:00.480 | And you don't really,
00:22:01.320 | you can't really put a finger on exactly why,
00:22:04.480 | but it just comes from your gut feeling or no.
00:22:09.060 | - Yes and no.
00:22:10.560 | So gut feelings are definitely very important.
00:22:14.700 | You know, that we've got our two mode
00:22:15.980 | or you can distill it down to two modes
00:22:18.140 | of decision-making, right?
00:22:19.060 | You've got your sort of logical linear voice in your head,
00:22:22.140 | system two, as it's often called
00:22:24.020 | and your system on your gut intuition.
00:22:27.240 | And historically in poker,
00:22:32.340 | the very best players were playing
00:22:34.300 | almost entirely by their gut.
00:22:35.860 | You know, often they do some kind of inspired play
00:22:39.220 | and you'd ask them why they do it
00:22:40.420 | and they wouldn't really be able to explain it.
00:22:42.540 | And that's not so much because their process
00:22:46.460 | was unintelligible,
00:22:47.560 | but it was more just because no one had the language
00:22:50.120 | with which to describe what optimal strategies were
00:22:52.340 | because no one really understood how poker worked.
00:22:54.260 | This was before, you know, we had analysis software,
00:22:57.540 | you know, no one was writing.
00:22:59.860 | I guess some people would write down their hands
00:23:01.440 | in a little notebook,
00:23:02.660 | but there was no way to assimilate all this data
00:23:04.540 | and analyze it.
00:23:05.900 | But then, you know, when computers became cheaper
00:23:08.460 | and software started emerging
00:23:09.880 | and then obviously online poker,
00:23:11.500 | where it would like automatically save your hand histories,
00:23:14.220 | now all of a sudden you kind of had this body of data
00:23:17.060 | that you could run analysis on.
00:23:19.580 | And so that's when people started to see, you know,
00:23:22.140 | these mathematical solutions.
00:23:24.220 | And so what that meant is the role of intuition
00:23:31.740 | essentially became smaller.
00:23:33.920 | And it went more into, as we talked before about,
00:23:38.620 | you know, this game theory optimal style.
00:23:40.600 | But also, as I said, like game theory optimal
00:23:43.300 | is about loss minimization and being unexploitable.
00:23:47.740 | But if you're playing against people who aren't,
00:23:49.620 | because no person, no human being can play perfectly
00:23:51.620 | game theory optimal in poker, not even the best AIs.
00:23:54.040 | They're still like, you know,
00:23:55.460 | they're 99.99% of the way there or whatever,
00:23:57.540 | but it's kind of like the speed of light.
00:23:59.260 | You can't reach it perfectly.
00:24:01.060 | - So there's still a role for intuition?
00:24:03.780 | - Yes.
00:24:04.620 | So when, yeah, when you're playing this unexploitable style,
00:24:08.460 | but when your opponents start doing something,
00:24:11.440 | you know, suboptimal that you want to exploit,
00:24:14.160 | well now that's where not only your like logical brain
00:24:17.360 | will need to be thinking, oh, okay, I know I have this,
00:24:19.960 | my, I'm in the sort of top end of my range here
00:24:22.200 | with this hand.
00:24:23.940 | So that means I need to be calling X percent of the time
00:24:26.860 | and I put them on this range, et cetera.
00:24:30.480 | But then sometimes you'll have this gut feeling
00:24:34.000 | that will tell you, you know, you know what, this time,
00:24:37.320 | I know mathematically I'm meant to call now, you know,
00:24:40.380 | I've got, I'm in the sort of top end of my range
00:24:42.740 | and this is the odds I'm getting.
00:24:45.140 | So the math says I should call,
00:24:46.300 | but there's something in your gut saying
00:24:48.360 | they've got it this time, they've got it.
00:24:49.960 | Like they're beating you, maybe your hand is worse.
00:24:54.960 | So then the real art,
00:24:56.940 | this is where the last remaining art in poker,
00:24:59.620 | the fuzziness is like, do you listen to your gut?
00:25:03.660 | How do you quantify the strength of it?
00:25:06.100 | Or can you even quantify the strength of it?
00:25:08.360 | And I think that's what Daniel has.
00:25:13.120 | I mean, I can't speak for how much he's studying
00:25:15.380 | with the simulators and that kind of thing.
00:25:17.760 | I think he has, like he must be to still be keeping up,
00:25:22.280 | but he has an incredible intuition for just,
00:25:26.360 | he's seen so many hands of poker in the flesh.
00:25:29.220 | He's seen so many people, the way they behave
00:25:31.900 | when the chips are, you know, when the money's on the line
00:25:33.760 | and you've got him staring you down in the eye,
00:25:36.120 | you know, he's intimidating.
00:25:37.480 | He's got this like kind of X factor vibe
00:25:39.640 | that he gives out.
00:25:42.200 | - And he talks a lot, which is an interactive element,
00:25:45.040 | which is he's getting stuff from other people.
00:25:47.200 | - Yes, yeah.
00:25:48.040 | - And just like the subtlety.
00:25:49.320 | So he's like, he's probing constantly.
00:25:51.480 | - Yeah, he's probing and he's getting
00:25:52.960 | this extra layer of information that others can't.
00:25:55.920 | Now that said though, he's good online as well.
00:25:57.880 | You know, I don't know how, again,
00:25:59.680 | would he be beating the top cash game players online?
00:26:02.680 | Probably not, no.
00:26:03.920 | But when he's in person
00:26:07.280 | and he's got that additional layer of information,
00:26:08.920 | he can not only extract it,
00:26:10.680 | but he knows what to do with it still so well.
00:26:14.160 | There's one player who I would say
00:26:15.520 | is the exception to all of this.
00:26:17.080 | And he's one of my favorite people to talk about
00:26:19.920 | in terms of, I think he might have cracked the simulation.
00:26:24.480 | It's Phil Hellmuth.
00:26:28.000 | - In more ways than one, he's cracked the simulation,
00:26:30.160 | I think.
00:26:31.000 | - Yeah, he somehow to this day is still,
00:26:34.560 | and I love you, Phil, I'm not in any way knocking you.
00:26:37.360 | He's still winning so much
00:26:41.480 | at the World Series of Poker specifically.
00:26:43.920 | He's now won 16 bracelets.
00:26:45.560 | The next nearest person I think has won 10.
00:26:48.520 | And he is consistently year in, year out,
00:26:50.280 | going deep or winning these huge field tournaments,
00:26:53.200 | you know, with like 2000 people,
00:26:55.400 | which statistically he should not be doing.
00:26:57.920 | And yet you watch some of the plays he makes
00:27:02.800 | and they make no sense.
00:27:04.160 | Like mathematically, they are so far
00:27:05.800 | from game theory optimal.
00:27:07.640 | And the thing is, if you went and stuck him
00:27:09.120 | in one of these high stakes cash games
00:27:11.480 | with a bunch of like GTO people,
00:27:13.320 | he's gonna get ripped apart.
00:27:15.160 | But there's something that he has
00:27:16.640 | that when he's in the halls
00:27:17.600 | of the World Series of Poker specifically,
00:27:19.680 | amongst sort of amateurish players,
00:27:24.080 | he gets them to do crazy shit like that.
00:27:26.920 | And, but my little pet theory is that also,
00:27:30.880 | he's like a wizard and he gets the cards
00:27:35.960 | to do what he needs them to.
00:27:37.360 | Because he just expects to win
00:27:42.840 | and he expects to get flop a set
00:27:45.240 | with a frequency far beyond what the real percentages are.
00:27:50.240 | And I don't even know if he knows
00:27:51.800 | what the real percentages are.
00:27:52.640 | He doesn't need to, because he gets there.
00:27:54.760 | - I think he has found a cheat code.
00:27:56.040 | 'Cause when I've seen him play,
00:27:57.400 | he seems to be like annoyed
00:27:59.880 | that the long shot thing didn't happen.
00:28:02.000 | - Yes.
00:28:02.880 | - He's like annoyed and it's almost like
00:28:05.240 | everybody else is stupid because he was obviously
00:28:07.480 | going to win with this.
00:28:08.320 | - Meant to win, if that silly thing hadn't happened.
00:28:10.560 | And it's like, you don't understand,
00:28:11.480 | the silly thing happens 99% of the time.
00:28:13.960 | And it's a 1%, not the other way around.
00:28:15.720 | But genuinely for his lived experience
00:28:18.080 | at the World Series, only at the World Series of Poker,
00:28:20.040 | it is like that.
00:28:21.520 | So I don't blame him for feeling that way.
00:28:24.000 | But he does, he has this X factor.
00:28:26.480 | And the poker community has tried for years
00:28:29.960 | to rip him down saying like, he's no good.
00:28:32.760 | But he's clearly good because he's still winning.
00:28:34.560 | Or there's something going on.
00:28:36.240 | Whether that's he's figured out how to
00:28:38.800 | mess with the fabric of reality
00:28:40.640 | and how cards, a randomly shuffled deck of cards come out.
00:28:44.320 | I don't know what it is, but he's doing it right still.
00:28:46.680 | - Who do you think is the greatest of all time?
00:28:48.720 | Would you put Hellmuth?
00:28:52.560 | - It depends.
00:28:53.400 | - He seems like the kind of person
00:28:54.640 | when mentioned he would actually watch this.
00:28:56.440 | So you might want to be careful.
00:28:58.400 | - As I said, I love Phil.
00:28:59.720 | And I would say this to his face,
00:29:03.440 | I'm not saying anything, I don't.
00:29:04.960 | He's got, he truly, I mean, he is one of the greatest.
00:29:09.520 | I don't know if he's the greatest.
00:29:11.000 | He's certainly the greatest at the World Series of Poker.
00:29:14.280 | And he is the greatest at,
00:29:16.360 | despite the game switching into a pure game,
00:29:19.440 | almost an entire game of math,
00:29:20.880 | he has managed to keep the magic alive.
00:29:22.840 | And this like, just through sheer force of will,
00:29:26.000 | making the game work for him.
00:29:27.600 | And that is incredible.
00:29:28.640 | And I think it's something that should be studied
00:29:30.440 | because it's an example.
00:29:32.000 | - Yeah, there might be some actual game theoretical wisdom.
00:29:35.240 | There might be something to be said
00:29:36.440 | about optimality from studying him.
00:29:39.200 | - What do you mean by optimality?
00:29:40.960 | - Meaning, or rather game design perhaps.
00:29:45.520 | Meaning if what he's doing is working,
00:29:48.440 | maybe poker is more complicated
00:29:51.640 | than we're currently modeling it as.
00:29:54.200 | So like-
00:29:55.040 | - Or there's an extra layer,
00:29:56.640 | and I don't mean to get too weird and wooey,
00:29:59.480 | but or there's an extra layer of ability
00:30:04.480 | to manipulate the things the way you want them to go
00:30:07.520 | that we don't understand yet.
00:30:09.760 | - Do you think Phil Hellmuth understands them?
00:30:11.960 | Is he just generally-
00:30:13.640 | - Hashtag positivity.
00:30:15.480 | He wrote a book on positivity.
00:30:17.240 | - He has?
00:30:18.080 | He did?
00:30:18.920 | - Yes, "Phil Positivity."
00:30:19.760 | - Like a trolling book?
00:30:20.600 | - No.
00:30:21.440 | - A serious-
00:30:22.280 | - He's straight up, yeah.
00:30:23.120 | - Phil Hellmuth wrote a book about positivity.
00:30:26.080 | - Yes.
00:30:27.280 | - Okay, not ironically.
00:30:28.680 | - And I think it's about sort of manifesting what you want
00:30:31.880 | and getting the outcomes that you want
00:30:34.200 | by believing so much in yourself
00:30:36.560 | and in your ability to win, like eyes on the prize.
00:30:39.120 | And I mean, it's working.
00:30:42.200 | - Demands delivered.
00:30:43.280 | But where do you put like Phil Ivey
00:30:45.400 | and all those kinds of people?
00:30:47.360 | - I mean, I'm too, I've been, to be honest,
00:30:50.160 | too much out of the scene for the last few years to really,
00:30:53.000 | I mean, Phil Ivey's clearly got,
00:30:55.040 | again, he's got that X factor.
00:30:56.600 | He's so incredibly intimidating to play against.
00:31:00.480 | I've only played against him a couple of times,
00:31:01.800 | but when he like looks you in the eye
00:31:03.760 | and you're trying to run a bluff on him,
00:31:05.080 | oof, no one's made me sweat harder than Phil Ivey.
00:31:07.160 | Just, my bluff got through, actually.
00:31:10.440 | That was actually one of the most thrilling moments
00:31:11.880 | I've ever had in poker,
00:31:12.720 | was it was in a Monte Carlo in a high roller.
00:31:15.480 | I can't remember exactly what the hand was,
00:31:16.720 | but I three-bet and then like just barreled
00:31:20.680 | all the way through.
00:31:22.120 | And he just like put his laser eyes into me.
00:31:24.400 | And I felt like he was just scouring my soul.
00:31:28.200 | And I was just like, hold it together, Liv,
00:31:29.760 | hold it together.
00:31:30.600 | And he was like, ah, I folded.
00:31:31.440 | - And you knew your hand was weaker.
00:31:33.040 | - Yeah, I mean, I was bluffing.
00:31:34.400 | I presume, which, you know,
00:31:36.400 | there's a chance I was bluffing with the best hand,
00:31:37.800 | but I'm pretty sure my hand was worse.
00:31:40.720 | And he folded.
00:31:43.320 | It was truly one of the deep highlights of my career.
00:31:47.520 | - Did you show the cards or did you fold?
00:31:50.040 | - You should never show in game.
00:31:52.640 | Because especially as I felt like I was one of the worst
00:31:54.240 | players at the table in that tournament.
00:31:55.720 | So giving that information,
00:31:57.880 | unless I had a really solid plan that I was now like
00:32:01.160 | advertising, oh look, I'm capable of bluffing Phil Ivey,
00:32:03.600 | but like why?
00:32:05.160 | It's much more valuable to take advantage
00:32:07.640 | of the impression that they have of me,
00:32:09.040 | which is like, I'm a scared girl playing a high roller
00:32:10.920 | for the first time.
00:32:12.080 | Keep that going, you know.
00:32:13.440 | - Interesting.
00:32:15.640 | But isn't there layers to this, like psychological warfare,
00:32:18.800 | that the scared girl might be way smart
00:32:22.440 | and then like to flip the tables.
00:32:24.720 | Do you think about that kind of stuff?
00:32:25.920 | - Oh, definitely.
00:32:26.760 | - Is it better not to reveal information?
00:32:28.520 | - I mean, generally speaking,
00:32:29.520 | you want to not reveal information.
00:32:31.080 | You know, the goal of poker is to be as deceptive as possible
00:32:34.680 | about your own strategies while elucidating as much out of
00:32:38.240 | your opponent about their own.
00:32:39.720 | So giving them free information,
00:32:42.080 | particularly if they're people who you consider
00:32:43.440 | very good players,
00:32:44.800 | any information I give them is going into their little
00:32:47.400 | database and being,
00:32:49.240 | I assume it's going to be calculated and used well.
00:32:51.760 | So I have to be really confident that my like meta gaming
00:32:55.800 | that I'm going to then do,
00:32:56.720 | oh, they've seen this, so therefore that,
00:32:58.320 | I'm going to be on the right level.
00:33:00.640 | So it's better just to keep that little secret to myself
00:33:03.760 | in the moment.
00:33:04.600 | - So how much is bluffing part of the game?
00:33:06.680 | - Huge amount.
00:33:08.360 | - So, yeah, I mean, maybe actually, let me ask,
00:33:10.440 | like, what did it feel like with Phil Ivey or anyone else
00:33:14.080 | when it's a high stake, when it's a big,
00:33:16.040 | it's a big bluff.
00:33:18.840 | So a lot of money on the table and maybe, I mean,
00:33:23.360 | what defines a big bluff?
00:33:24.640 | Maybe a lot of money on the table,
00:33:26.000 | but also some uncertainty in your mind and heart about,
00:33:30.560 | like self doubt,
00:33:31.960 | well, maybe I miscalculated what's going on here,
00:33:34.560 | what the bet said, all that kind of stuff.
00:33:36.720 | Like, what does that feel like?
00:33:38.800 | - I mean, it's, I imagine comparable to,
00:33:43.800 | you know, running a, I mean, any kind of big bluff
00:33:49.000 | where you have a lot of something
00:33:52.360 | that you care about on the line, you know?
00:33:54.640 | So if you're bluffing in a courtroom,
00:33:57.200 | not that anyone should ever do that,
00:33:58.160 | or, you know, something equatable to that,
00:34:00.360 | it's, you know, in that scenario, you know,
00:34:04.120 | I think it was the first time I'd ever played a 20,
00:34:05.680 | I'd won my way into this 25K tournament.
00:34:09.160 | So that was the buy-in, 25,000 euros.
00:34:11.160 | And I had satellited my way in
00:34:13.000 | because it was much bigger than I would ever normally play.
00:34:16.040 | And, you know, I hadn't,
00:34:17.160 | I wasn't that experienced at the time.
00:34:18.720 | And now I was sitting there against all the big boys,
00:34:21.240 | you know, the Negreanus, the Phil Iveys, and so on.
00:34:23.600 | And then to like, each time you put the bets out,
00:34:29.720 | you know, you put another bet out, your card.
00:34:33.000 | Yeah, I was on what's called a semi-bluff.
00:34:34.880 | So there was some cards that could come
00:34:36.280 | that would make my hand very, very strong and therefore win.
00:34:39.160 | But most of the time, those cards don't come.
00:34:40.920 | - So it's a semi-bluff 'cause you're representing,
00:34:43.880 | are you representing that you already have something?
00:34:46.960 | - So I think in this scenario, I had a flush draw.
00:34:51.120 | So I had two clubs, two clubs came out on the flop,
00:34:55.920 | and then I'm hoping that on the turn in the river,
00:34:58.080 | one will come.
00:34:59.120 | So I have some future equity.
00:35:00.960 | I could hit a club and then I'll have the best hand
00:35:02.760 | in which case, great.
00:35:04.600 | And so I can keep betting and I'll want them to call,
00:35:07.480 | but I'm also got the other way of winning the hand
00:35:09.640 | where if my card doesn't come,
00:35:11.920 | I can keep betting and get them to fold their hand.
00:35:14.720 | And I'm pretty sure that's what the scenario was.
00:35:18.000 | So I had some future equity, but it's still, you know,
00:35:21.000 | most of the time I don't hit that club.
00:35:23.280 | And so I would rather him just fold
00:35:25.320 | because I'm, you know, the pot is now getting bigger
00:35:27.080 | and bigger.
00:35:27.920 | And in the end, like I've jam all in on the river,
00:35:31.000 | that's my entire tournament on the line.
00:35:33.720 | As far as I'm aware,
00:35:34.560 | this might be the one time I ever get to play a big 25K.
00:35:37.200 | You know, this was the first time I played one.
00:35:38.680 | So it was, it felt like the most momentous thing.
00:35:42.000 | And this was also when I was trying to build myself up,
00:35:43.840 | you know, build my name, a name for myself in poker.
00:35:46.680 | I wanted to get respect.
00:35:47.720 | - Destroy everything for you.
00:35:49.280 | - It felt like it in the moment.
00:35:50.520 | Like, I mean, it literally does feel like a form of life
00:35:52.480 | and death, like your body physiologically
00:35:54.360 | is having that flight or fight response.
00:35:56.120 | - What are you doing with your body?
00:35:57.120 | What are you doing with your face?
00:35:58.320 | Are you just like, what are you thinking about?
00:36:01.360 | (laughing)
00:36:02.760 | - More than a mixture of like, okay, what are the cards?
00:36:04.800 | So in theory, I'm thinking about like, okay,
00:36:07.080 | what are cards that make my hand look stronger?
00:36:10.080 | Which cards hit my perceived range from his perspective?
00:36:13.920 | Which cards don't?
00:36:15.680 | What's the right amount of bet size to, you know,
00:36:18.360 | maximize my fold equity in this situation?
00:36:20.880 | You know, that's the logical stuff
00:36:22.160 | that I should be thinking about.
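
One concrete piece of that "logical stuff" is the break-even threshold for a bluff: a bet of size b into a pot of size P only needs to get folds b/(P+b) of the time to show an immediate profit (ignoring the extra equity a semi-bluff keeps when called). The pot and bet sizes below are illustrative numbers only, not from her hand.

```python
def breakeven_fold_frequency(pot, bet):
    """Minimum fraction of the time a pure bluff of `bet` into `pot` must get a
    fold to break even: risk / (risk + reward)."""
    return bet / (pot + bet)

print(breakeven_fold_frequency(pot=100, bet=75))    # ~0.43: a 3/4-pot bluff needs folds about 43% of the time
print(breakeven_fold_frequency(pot=100, bet=200))   # ~0.67: a 2x-pot overbet needs folds about 67% of the time
```
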
00:36:23.280 | But I think in reality, because I was so scared,
00:36:25.480 | 'cause there's this, at least for me,
00:36:26.960 | there's a certain threshold of like nervousness or stress
00:36:30.480 | beyond which the logical brain shuts off.
00:36:33.520 | And now it just gets into this like,
00:36:35.320 | it just like, it feels like a game of wits, basically.
00:36:38.800 | It's like of nerve, can you hold your resolve?
00:36:41.840 | And it certainly got by that, like by the river.
00:36:44.360 | I think by that point, I was like,
00:36:45.880 | I don't even know if this is a good bluff anymore,
00:36:47.760 | but fuck it, let's do it.
00:36:50.120 | - Your mind is almost numb
00:36:51.320 | from the intensity of that feeling.
00:36:53.200 | - I call it the white noise.
00:36:55.200 | And that's, and it happens in all kinds of decision-making.
00:36:58.920 | I think anything that's really, really stressful.
00:37:00.800 | I can imagine someone in like an important job interview,
00:37:02.960 | if it's like a job they've always wanted
00:37:04.840 | and they're getting grilled, you know,
00:37:06.560 | like Bridgewater style where they ask these really hard,
00:37:09.240 | like mathematical questions.
00:37:11.200 | You know, it's a really learned skill
00:37:13.440 | to be able to like subdue your flight or fight response.
00:37:17.840 | You know, I think get from the sympathetic
00:37:19.400 | into the parasympathetic.
00:37:20.360 | So you can actually, you know, engage that voice
00:37:23.880 | in your head and do those slow logical calculations.
00:37:26.080 | 'Cause evolutionarily, you know,
00:37:28.360 | if we see a lion running at us,
00:37:29.680 | we didn't have time to sort of calculate
00:37:31.680 | the lion's kinetic energy.
00:37:33.000 | And, you know, is it optimal to go this way or that way?
00:37:35.480 | You just reacted.
00:37:37.080 | And physically our bodies are well attuned
00:37:39.880 | to actually make right decisions.
00:37:41.160 | But when you're playing a game like poker,
00:37:43.280 | this is not something that you ever, you know,
00:37:44.800 | evolved to do.
00:37:45.960 | And yet you're in that same flight or fight response.
00:37:49.000 | And so that's a really important skill
00:37:51.080 | to be able to develop, to basically learn how to like
00:37:53.520 | meditate in the moment and calm yourself
00:37:55.720 | so that you can think clearly.
00:37:57.440 | - But as you were searching for a comparable thing,
00:38:00.800 | it's interesting 'cause you just made me realize
00:38:03.360 | that bluffing is like an incredibly high stakes form
00:38:06.880 | of lying.
00:38:08.240 | You're lying.
00:38:09.680 | And I don't think you can--
00:38:11.640 | - Telling a story.
00:38:12.480 | - It's straight up lying.
00:38:15.240 | In the context of game, it's not a negative kind of lying.
00:38:19.080 | - But it is, yeah, exactly.
00:38:20.400 | You're representing something that you don't have.
00:38:23.400 | And I was thinking like, how often in life
00:38:26.800 | do we have such high stakes of lying?
00:38:30.360 | 'Cause I was thinking,
00:38:31.520 | certainly in high level military strategy,
00:38:36.120 | I was thinking when Hitler was lying to Stalin
00:38:40.280 | about his plans to invade the Soviet Union.
00:38:44.640 | And so you're talking to a person like your friends
00:38:48.360 | and you're fighting against the enemy,
00:38:50.760 | whatever the formulation of the enemy is.
00:38:53.960 | But meanwhile, whole time you're building up troops
00:38:56.840 | on the border.
00:38:57.680 | That's extremely--
00:38:59.800 | - Wait, wait, so Hitler and Stalin
00:39:01.080 | were like pretending to be friends?
00:39:02.560 | - Yeah.
00:39:03.400 | - Well, my history knowledge is terrible.
00:39:04.360 | That's crazy.
00:39:05.280 | - Yeah, that they were...
00:39:06.640 | And it worked because Stalin,
00:39:11.520 | until the troops crossed the border
00:39:14.320 | and invaded in Operation Barbarossa
00:39:17.560 | where this storm of Nazi troops
00:39:22.560 | invaded large parts of the Soviet Union,
00:39:25.600 | hence one of the biggest wars in human history began.
00:39:30.360 | Stalin for sure thought that this was never going to be,
00:39:34.920 | that Hitler is not crazy enough to invade the Soviet Union.
00:39:38.720 | And it makes, geopolitically,
00:39:41.000 | makes total sense to be collaborators.
00:39:43.040 | And ideologically, even though there's a tension
00:39:46.280 | between communism and fascism or national socialism,
00:39:50.320 | however you formulate it,
00:39:51.520 | it still feels like this is the right way
00:39:54.360 | to battle the West.
00:39:55.800 | - Right.
00:39:56.640 | They were more ideologically aligned.
00:39:58.680 | They in theory had a common enemy, which was the West.
00:40:01.400 | - So it made total sense.
00:40:03.480 | And in terms of negotiations
00:40:05.440 | and the way things were communicated,
00:40:07.320 | it seemed to Stalin that for sure
00:40:11.520 | that they would remain at least for a while
00:40:16.120 | peaceful collaborators.
00:40:17.680 | And everybody, because of that in the Soviet Union,
00:40:22.240 | believed that it was a huge shock when Kiev was invaded.
00:40:25.840 | And you hear echoes of that when I traveled to Ukraine,
00:40:28.240 | sort of the shock of the invasion.
00:40:32.240 | It's not just the invasion on one particular border,
00:40:34.560 | but the invasion of the capital city.
00:40:36.760 | And just like, holy shit.
00:40:39.480 | Especially at that time when you thought World War I,
00:40:44.240 | you realized that that was the war to end all wars.
00:40:46.960 | You would never have this kind of war.
00:40:49.040 | And holy shit, this person is mad enough
00:40:52.440 | to try to take on this monster in the Soviet Union.
00:40:55.520 | So it's no longer going to be a war
00:40:58.360 | of hundreds of thousands dead.
00:40:59.880 | It'll be a war of tens of millions dead.
00:41:02.320 | And yeah, but that's a very large scale kind of lie,
00:41:07.320 | but I'm sure there's in politics and geopolitics,
00:41:11.480 | that kind of lying happening all the time.
00:41:13.960 | And a lot of people pay financially
00:41:17.320 | and with their lives for that kind of lying.
00:41:19.080 | But in our personal lives, I don't know how often we,
00:41:22.280 | maybe we- - I think people do.
00:41:23.720 | I mean, like think of spouses cheating on their partners.
00:41:27.400 | And then like having to lie, like,
00:41:28.440 | "Where were you last night?"
00:41:29.560 | Stuff like that. - Oh shit, that's tough.
00:41:30.600 | Yeah, that's true. - Like that's, I think,
00:41:32.920 | unfortunately that stuff happens all the time.
00:41:36.240 | - Or having like multiple families, that one is great.
00:41:39.240 | When each family doesn't know about the other one
00:41:42.240 | and like maintaining that life.
00:41:44.640 | There's probably a sense of excitement about that too.
00:41:47.320 | Or- - Seems unnecessary, yeah.
00:41:50.720 | But- - Why?
00:41:51.720 | - Well, just lying, like, you know,
00:41:54.200 | the truth finds a way of coming out, you know?
00:41:56.760 | - Yes, but hence that's the thrill.
00:41:59.240 | - Yeah, perhaps.
00:42:00.200 | Yeah, people, I mean, and that's why I think actually,
00:42:03.520 | like poker, what's so interesting about poker
00:42:05.960 | is most of the best players I know,
00:42:09.880 | there are always exceptions, you know,
00:42:10.880 | there are always bad eggs,
00:42:12.480 | but actually poker players are very honest people.
00:42:15.120 | I would say they are more honest than the average,
00:42:17.560 | you know, if you just took a random population sample.
00:42:22.360 | Because A, you know, I think
00:42:26.080 | humans like to have that,
00:42:27.560 | most people like to have some kind of, you know,
00:42:30.560 | mysterious opportunity to do something
00:42:32.720 | a little edgy.
00:42:34.880 | So we get to sort of scratch that itch
00:42:36.840 | of being edgy at the poker table,
00:42:38.360 | where it's like, it's part of the game,
00:42:40.440 | everyone knows what they're in for, and that's allowed.
00:42:43.400 | And you get to like really get that out of your system.
00:42:46.280 | And then also like poker players learned that, you know,
00:42:51.000 | I would play in a huge game against some of my friends,
00:42:55.400 | even my partner Igor, where we will be, you know,
00:42:57.760 | absolutely going at each other's throats,
00:42:59.680 | trying to draw blood in terms of winning money
00:43:02.080 | off each other and like getting under each other's skin,
00:43:04.040 | winding each other up, doing the craftiest moves we can.
00:43:08.360 | But then once the game's done, you know,
00:43:11.120 | the winners and the losers will go off
00:43:12.320 | and get a drink together and have a fun time
00:43:13.880 | and like talk about it in this like
00:43:15.720 | weird academic way afterwards.
00:43:17.280 | Because that, and that's why games are so great,
00:43:19.520 | 'cause you get to like live out,
00:43:21.080 | like this competitive urge that, you know, most people have.
00:43:26.440 | - What's it feel like to lose?
00:43:28.720 | Like we talked about bluffing when it worked out.
00:43:31.400 | What about when you go broke?
00:43:34.520 | - So like in a game, I'm,
00:43:37.280 | fortunately I've never gone broke through poker.
00:43:39.200 | - You mean like full life?
00:43:40.880 | - Full life, no.
00:43:42.480 | I know plenty of people who have.
00:43:44.120 | And I don't think Igor would mind me saying,
00:43:47.840 | he went broke once in poker, well, you know,
00:43:50.320 | early on when we were together.
00:43:51.840 | - I feel like you haven't lived unless you've gone broke.
00:43:54.720 | - Yeah, I-- - In some sense.
00:43:56.440 | - Right, well, I mean, I'm happy,
00:43:58.960 | I've sort of lived through it vicariously through him
00:44:00.880 | when he did it at the time.
00:44:02.320 | But yeah, what's it like to lose?
00:44:04.160 | Well, it depends.
00:44:05.000 | So it depends on the amount,
00:44:06.080 | it depends what percentage of your net worth
00:44:07.680 | you've just lost.
00:44:08.640 | It depends on your brain chemistry.
00:44:11.560 | It really, you know, varies from person to person.
00:44:13.160 | - You have a very cold calculating way
00:44:15.320 | of thinking about this.
00:44:16.920 | So it depends what percentage. (laughs)
00:44:19.000 | - Well, it really does, right?
00:44:20.720 | - Yeah, it's true, it's true.
00:44:21.560 | - But that's, I mean,
00:44:23.080 | that's another thing poker trains you to do.
00:44:24.760 | You see everything in percentages.
00:44:28.120 | Or you see everything in like ROI or expected hourlies
00:44:30.840 | or cost benefit, et cetera.
00:44:32.440 | You know, so that's,
00:44:36.080 | one of the things I've tried to do
00:44:37.240 | is calibrate the strength of my emotional response
00:44:39.840 | to the win or loss that I've received.
00:44:43.400 | Because it's no good if you like, you know,
00:44:45.560 | you have a huge emotional dramatic response to a tiny loss.
00:44:49.480 | Or on the flip side, you have a huge win
00:44:53.240 | and you're so dead inside that you don't even feel it.
00:44:55.480 | Well, that's, you know, that's a shame.
00:44:56.840 | I want my emotions to calibrate with reality
00:45:00.280 | as much as possible.
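To put the "see everything in percentages" framing into concrete terms, here is a minimal Python sketch; the tournament numbers, hours, and net worth below are made-up assumptions for illustration, not figures from the conversation.

```python
# Minimal sketch of the "see everything in percentages" habit described above.
# All numbers are illustrative assumptions, not figures from the conversation.

buy_in = 5_000.0           # cost to enter a tournament (euros)
expected_payout = 6_250.0  # assumed long-run average payout per entry
hours_per_event = 25.0     # assumed time spent per tournament

roi = (expected_payout - buy_in) / buy_in                # return on investment
expected_hourly = (expected_payout - buy_in) / hours_per_event

net_worth = 200_000.0      # assumed
loss = buy_in              # busting out of one event
loss_fraction = loss / net_worth

print(f"ROI: {roi:.1%}")                                      # 25.0%
print(f"Expected hourly: {expected_hourly:.2f} euros/hour")   # 50.00
print(f"Loss as share of net worth: {loss_fraction:.1%}")     # 2.5%
```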
00:45:01.480 | So yeah, what's it like to lose?
00:45:04.240 | I mean, I've had times where I've lost, you know,
00:45:07.200 | busted out of a tournament
00:45:08.120 | that I thought I was gonna win in.
00:45:09.040 | It's, you know, especially if I got really unlucky
00:45:10.560 | or I make a dumb play,
00:45:13.960 | where I've gone away and like, you know, kicked the wall,
00:45:17.560 | punched a wall, I like nearly broke my hand one time.
00:45:19.920 | Like I'm a lot less competitive than I used to be.
00:45:24.120 | Like I was pathologically competitive
00:45:26.640 | in my like late teens, early twenties.
00:45:28.760 | I just had to win at everything.
00:45:30.880 | And I think that sort of slowly waned as I've gotten older.
00:45:33.520 | - According to you, yeah.
00:45:34.880 | - According to me.
00:45:35.720 | - I don't know if others would say the same, right?
00:45:38.480 | I feel like ultra competitive people,
00:45:40.880 | like I've heard Joe Rogan say this to me.
00:45:43.000 | It's like, he's a lot less competitive than he used to be.
00:45:45.440 | I don't know about that.
00:45:47.360 | - Oh, I believe it.
00:45:48.440 | No, I totally believe it.
00:45:50.080 | Because as you get, you can still be,
00:45:51.440 | like I care about winning.
00:45:53.000 | Like when, you know, I play a game with my buddies online
00:45:56.000 | or, you know, whatever it is,
00:45:57.400 | polytopia is my current obsession.
00:45:58.760 | Like when I-
00:46:00.800 | - Thank you for passing on your obsession to me.
00:46:02.960 | - Are you playing now?
00:46:03.800 | - Yeah, I'm playing now.
00:46:04.720 | - We gotta have a game.
00:46:05.840 | - But I'm terrible and I enjoy playing terribly.
00:46:08.080 | I don't wanna have a game
00:46:09.080 | because that's gonna pull me into your monster
00:46:11.400 | of like a competitive play.
00:46:13.880 | - It's important.
00:46:14.720 | It's an important skill.
00:46:15.560 | - I enjoy playing on the, I can't.
00:46:18.280 | - You just do the points thing, you know, against the bots.
00:46:20.920 | - Yeah, against the bots.
00:46:21.960 | And I can't even do the,
00:46:23.520 | there's like a hard one and there's a very hard one.
00:46:26.920 | - That's crazy, yeah.
00:46:27.760 | - That's crazy.
00:46:28.600 | I can't, I don't even enjoy the hard one.
00:46:29.920 | The crazy I really don't enjoy.
00:46:32.040 | 'Cause it's intense.
00:46:33.680 | You have to constantly try to win
00:46:35.000 | as opposed to enjoy building a little world and-
00:46:37.960 | - Yeah, no, no, there's no time for exploration
00:46:39.520 | in polytopia, you gotta get,
00:46:40.720 | well, once you graduate from the crazies,
00:46:42.760 | then you can come play the-
00:46:44.560 | - Graduate from the crazies.
00:46:46.440 | - Yeah, so in order to be able to play a decent game
00:46:48.640 | against like, you know, our group,
00:46:50.920 | you'll need to be consistently winning
00:46:55.640 | like 90% of games against 15 crazy bots.
00:46:58.280 | - Yeah.
00:46:59.160 | - And you'll be able to, like,
00:47:00.440 | there'll be, I could teach you it within a day, honestly.
00:47:03.360 | - How to beat the crazies?
00:47:04.640 | - How to beat the crazies.
00:47:05.720 | And then you'll be ready for the big leagues.
00:47:08.120 | - Generalizes to more than just polytopia, but okay.
00:47:12.400 | Why were we talking about polytopia?
00:47:14.360 | Losing hurts.
00:47:15.960 | - Losing hurts, oh yeah, yes, competitiveness over time.
00:47:19.360 | - Oh yeah.
00:47:20.360 | - I think it's more that, at least for me,
00:47:23.360 | I still care about playing,
00:47:24.720 | about winning when I choose to play something.
00:47:26.480 | It's just that I don't see the world
00:47:28.680 | as zero sum as I used to, you know?
00:47:32.080 | I think as one gets older and wiser,
00:47:38.040 | you start to see the world more as a positive-sum thing,
00:47:38.040 | or at least you're more aware of externalities
00:47:40.800 | of scenarios, of competitive interactions.
00:47:43.760 | And so, yeah, I just, like, I'm more,
00:47:47.480 | and I'm more aware of my own, you know, like,
00:47:50.160 | if I have a really strong emotional response to losing,
00:47:52.760 | and that makes me then feel shitty for the rest of the day,
00:47:55.000 | and then I beat myself up mentally for it,
00:47:57.200 | like, I'm now more aware
00:47:58.640 | that that's an unnecessary negative externality.
00:48:01.680 | So I'm like, okay, I need to find a way to turn this down,
00:48:03.920 | you know, dial this down a bit.
00:48:05.280 | - Was poker the thing that has,
00:48:07.560 | if you think back at your life
00:48:09.680 | and think about some of the lower points of your life,
00:48:12.160 | like the darker places you've got in your mind,
00:48:15.000 | did it have to do something with poker?
00:48:17.120 | Like, did losing spark the descent into a new thing?
00:48:22.120 | The descent into darkness, or was it something else?
00:48:24.760 | - I think my darkest points in poker
00:48:29.480 | were when I was wanting to quit and move on to other things,
00:48:34.480 | but I felt like I hadn't ticked all the boxes
00:48:38.360 | I wanted to tick.
00:48:40.000 | Like, I wanted to be the most winningest female player,
00:48:43.760 | which is by itself a bad goal.
00:48:45.520 | You know, that was one of my initial goals,
00:48:48.000 | and I was like, well, I haven't, you know,
00:48:48.840 | and I wanted to win a WPT event.
00:48:50.960 | I've won one of these, I've won one of these,
00:48:52.160 | but I want one of those as well.
00:48:53.600 | And that sort of, again, like,
00:48:57.440 | is a drive of like over-optimization to random metrics
00:49:00.120 | that I decided were important
00:49:01.600 | without much wisdom at the time, but then I carried on.
00:49:05.240 | That made me continue chasing it longer
00:49:09.600 | than I still actually had the passion to chase it for.
00:49:12.920 | And I don't have any regrets that, you know,
00:49:15.760 | I played for as long as I did, because who knows, you know,
00:49:18.200 | I wouldn't be sitting here,
00:49:19.280 | I wouldn't be living this incredible life
00:49:21.040 | that I'm living now.
00:49:22.680 | - This is the height of your life right now.
00:49:24.920 | - This is it, peak experience, absolute pinnacle
00:49:28.120 | here in your robot land.
00:49:31.360 | - Yeah, yeah.
00:49:32.200 | - With your creepy light.
00:49:33.600 | No, it is, I mean, I wouldn't change a thing
00:49:37.360 | about my life right now,
00:49:38.280 | and I feel very blessed to say that.
00:49:40.080 | But the dark times were in the sort of like 2016 to 18,
00:49:48.040 | even sooner really, where I was like,
00:49:50.360 | I'd stopped loving the game
00:49:53.160 | and I was going through the motions.
00:49:55.480 | And I would, and then I was like, you know,
00:49:59.720 | I would take the losses harder than I needed to,
00:50:02.240 | 'cause I'm like, "Oh, it's another one."
00:50:03.400 | And it was, I was aware that like,
00:50:04.880 | I felt like my life was ticking away and I was like,
00:50:06.480 | is this gonna be what's on my tombstone?
00:50:07.960 | Oh yeah, she played the game of, you know,
00:50:09.600 | this zero-sum game of poker,
00:50:11.720 | slightly more optimally than her next opponent.
00:50:14.120 | Like, cool, great, legacy, you know?
00:50:16.440 | So I just wanted, you know,
00:50:19.040 | there was something in me that knew I needed
00:50:20.240 | to be doing something more directly impactful
00:50:24.120 | and just meaningful.
00:50:25.280 | It was just like a search for meaning.
00:50:26.280 | And I think it's a thing a lot of poker players,
00:50:27.960 | even a lot of, I imagine any games players
00:50:30.960 | who sort of love intellectual pursuits,
00:50:34.520 | you know, I think you should ask
00:50:37.000 | Magnus Carlsen this question.
00:50:38.120 | - Yeah, walking away from chess, right?
00:50:39.800 | - Yeah, like it must be so hard for him.
00:50:41.520 | You know, he's been on the top for so long
00:50:43.760 | and it's like, well, now what?
00:50:45.280 | He's got this incredible brain, like what to put it to?
00:50:48.680 | And yeah, it's-
00:50:52.120 | - It's this weird moment where I've just spoken
00:50:55.240 | with people that won multiple gold medals at the Olympics
00:50:58.360 | and the depression hits hard after you win.
00:51:01.540 | - Dopamine crash.
00:51:04.160 | - 'Cause it's a kind of a goodbye,
00:51:05.600 | saying goodbye to that person,
00:51:06.920 | to all the dreams you had,
00:51:07.960 | the thought you thought would give meaning to your life.
00:51:11.880 | But in fact, life is full of constant pursuits of meaning.
00:51:16.880 | You don't like arrive and figure it all out
00:51:20.240 | and there's endless bliss, no, it continues going on and on.
00:51:23.980 | You constantly have to figure out how to rediscover yourself.
00:51:27.840 | And so for you, like that struggle to say goodbye to poker,
00:51:31.600 | you have to like find the next-
00:51:33.800 | - There's always a bigger game, that's the thing.
00:51:35.640 | That's my like motto, it's like, what's the next game?
00:51:38.440 | And more importantly,
00:51:41.200 | because obviously game usually implies zero sum,
00:51:43.120 | like what's the game which is like omni-win?
00:51:46.560 | Look, what- - Omni-win?
00:51:47.640 | - Omni-win. - Why is omni-win
00:51:48.920 | so important?
00:51:50.720 | - Because if everyone plays zero sum games,
00:51:54.680 | that's a fast track to either completely stagnate
00:51:57.160 | as a civilization, but more actually,
00:51:59.200 | far more likely to extinct ourselves.
00:52:01.320 | You know, like the playing field is finite.
00:52:07.440 | Nuclear powers are playing a game of poker
00:52:10.680 | with their chips of nuclear weapons, right?
00:52:14.560 | And the stakes have gotten so large
00:52:17.080 | that if anyone makes a single bet, fires some weapons,
00:52:20.680 | the playing field breaks.
00:52:22.160 | I made a video on this.
00:52:23.320 | The playing field is finite.
00:52:26.960 | And if we keep playing these adversarial zero sum games,
00:52:31.600 | thinking that in order for us to win,
00:52:34.080 | someone else has to lose,
00:52:35.480 | or that if we lose, someone else wins,
00:52:37.760 | that will extinct us.
00:52:40.000 | It's just a matter of when.
00:52:41.120 | - What do you think about that mutually assured destruction,
00:52:44.840 | that very simple,
00:52:46.360 | almost to the point of caricaturing game theory idea
00:52:50.000 | that does seem to be at the core
00:52:52.360 | of why we haven't blown each other up yet
00:52:54.040 | with nuclear weapons.
00:52:55.680 | Do you think there's some truth to that,
00:52:57.560 | this kind of stabilizing force
00:53:00.200 | of mutually assured destruction?
00:53:01.480 | And do you think that's gonna hold up
00:53:04.960 | through the 21st century?
00:53:07.360 | - I mean, it has held, yes.
00:53:10.400 | There's definitely truth to it,
00:53:11.720 | that it was a, you know, it's a Nash equilibrium.
00:53:14.800 | - Yeah, are you surprised it held this long?
00:53:17.320 | Isn't it crazy?
00:53:18.480 | - It is crazy when you factor in
00:53:20.200 | all the like near miss accidental firings.
00:53:24.680 | Yes, that makes me wonder, like, you know,
00:53:28.160 | are you familiar with the like
00:53:29.000 | quantum suicide thought experiment?
00:53:31.160 | Where it's basically like, you have a, you know,
00:53:34.960 | like a Russian roulette type scenario
00:53:37.560 | hooked up to some kind of quantum event,
00:53:40.640 | you know, particle splitting or pair particle splitting.
00:53:45.640 | And if it, you know, if it goes A,
00:53:48.440 | then the gun doesn't go off, and if it goes B,
00:53:50.280 | then it does go off and it kills you.
00:53:52.480 | Because you can only ever be in the universe,
00:53:55.040 | you know, assuming like the Everett branch,
00:53:56.560 | you know, multiverse theory,
00:53:57.800 | you will always only end up in the branch
00:54:00.240 | where, you know, option A keeps coming in.
00:54:03.320 | But you run that experiment enough times,
00:54:05.440 | it starts getting pretty damn, you know,
00:54:07.080 | the tree gets huge.
00:54:09.360 | There's a million different scenarios,
00:54:10.920 | but you'll always find yourself in this,
00:54:12.440 | in the one where it didn't go off.
00:54:14.840 | And so from that perspective,
00:54:18.640 | you are essentially immortal.
00:54:20.800 | 'Cause you will only find yourself
00:54:22.600 | in the set of observers that make it down that path.
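To make the observer-selection point concrete, here is a minimal Monte Carlo sketch in Python (an editorial illustration with assumed numbers, not anything from the conversation): many independent branches run a repeated 50/50 trigger, and every branch that survives has, by construction, seen nothing but near misses, even though surviving all the rounds is rare overall.

```python
import random

# Minimal sketch of the observer-selection effect described above.
# Each "branch" runs k rounds of a 50/50 trigger; a branch survives only if
# the trigger never fires. Survivors have, by construction, seen k near misses.

random.seed(0)

def run_branch(rounds: int, p_fire: float) -> bool:
    """Return True if this branch survives every round."""
    return all(random.random() >= p_fire for _ in range(rounds))

n_branches = 100_000
rounds = 10
p_fire = 0.5

survivors = sum(run_branch(rounds, p_fire) for _ in range(n_branches))

print(f"Fraction of branches that survive: {survivors / n_branches:.4f}")
print(f"Theoretical survival probability:  {(1 - p_fire) ** rounds:.4f}")  # 0.5**10 ~ 0.001
# From inside any surviving branch, the record looks like an unbroken streak
# of near misses -- which is the selection effect, not evidence that the
# trigger was safe.
```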
00:54:24.960 | - Yeah. - So it's kind of a--
00:54:26.640 | - That doesn't mean, that doesn't--
00:54:29.200 | - That doesn't mean you're still not gonna be fucked
00:54:32.200 | at some point in your life.
00:54:33.040 | - No, of course not.
00:54:33.880 | No, I'm not advocating like that we're all immortal
00:54:36.040 | because of this.
00:54:36.880 | It's just like a fun thought experiment.
00:54:38.520 | And the point is it like raises this thing
00:54:40.120 | of like these things called observer selection effects,
00:54:42.440 | which Bostrom, Nick Bostrom talks about a lot.
00:54:44.480 | And I think people should go read.
00:54:46.240 | - It's really powerful,
00:54:47.080 | but I think that logic could be overextended.
00:54:48.920 | I'm not sure exactly how it can be.
00:54:52.280 | I just feel like you can get,
00:54:55.160 | you can overgeneralize that logic somehow.
00:54:58.360 | - Well, no, I mean, it leads you into like solipsism,
00:55:00.680 | which is a very dangerous mindset.
00:55:02.160 | Again, if everyone like falls into solipsism of like,
00:55:04.400 | well, I'll be fine.
00:55:05.760 | That's a great way of creating a very,
00:55:08.880 | self-terminating environment.
00:55:10.880 | But my point is, is that with the nuclear weapons thing,
00:55:14.600 | there have been at least, I think it's 12 or 11
00:55:17.160 | near misses of like just stupid things.
00:55:21.080 | Like there was moonrise over Norway
00:55:24.200 | and it made weird reflections of some glaciers
00:55:26.880 | in the mountains, which set off,
00:55:28.840 | I think the alarms of NORAD radar.
00:55:33.840 | And that put them on high alert, nearly ready to shoot.
00:55:35.720 | And it was only because the head of Russian military
00:55:39.600 | happened to be at the UN in New York at the time
00:55:42.000 | that they go like, well, wait a second,
00:55:43.360 | why would they fire now when their guy is there?
00:55:47.200 | And it was only that lucky happenstance,
00:55:49.280 | which doesn't happen very often
00:55:50.240 | where they didn't then escalate it into firing.
00:55:52.000 | And there's a bunch of these different ones.
00:55:53.880 | Stanislav Petrov, like, is
00:55:55.960 | the person who should be the most famous person on earth,
00:55:57.920 | 'cause he's probably on expectation
00:55:59.560 | saved the most human lives of anyone,
00:56:01.160 | like billions of people by ignoring Russian orders to fire
00:56:05.280 | because he felt in his gut
00:56:06.280 | that actually this was a false alarm.
00:56:07.440 | And it turned out to be, you know, a very hard thing to do.
00:56:11.040 | And there's so many of those scenarios
00:56:12.560 | that I can't help but wonder at this point
00:56:14.520 | that we aren't having this kind of like selection effect
00:56:16.840 | thing going on.
00:56:17.920 | 'Cause you look back and you're like,
00:56:18.800 | geez, that's a lot of near misses.
00:56:20.800 | But of course we don't know the actual probabilities
00:56:22.880 | that they would have led to,
00:56:23.720 | that each one would have ended up in nuclear war.
00:56:24.840 | Maybe they were not that likely, but still.
00:56:26.720 | The point is, it's a very dark, stupid game
00:56:29.440 | that we're playing.
00:56:30.880 | And it is an absolute moral imperative, if you ask me,
00:56:35.320 | to get as many people thinking about ways
00:56:39.560 | to make this, like, very precarious situation safer.
00:56:39.560 | 'Cause we're in a Nash equilibrium,
00:56:41.160 | but it's not like we're in the bottom of a pit.
00:56:42.840 | You know, if you would like map it topographically,
00:56:46.400 | it's not like a stable ball at the bottom of a thing.
00:56:49.400 | We're not in a stable equilibrium like that.
00:56:49.400 | We're on the top of a hill with a ball balanced on top.
00:56:55.560 | And just any little nudge could send it flying down
00:56:55.560 | and nuclear war pops off and hellfire and bad times.
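As a concrete aside on the Nash equilibrium point above, the mutually-assured-destruction claim can be checked mechanically with a toy payoff matrix; the payoffs below are assumptions chosen only to encode that retaliation is automatic and any strike ends in mutual destruction.

```python
from itertools import product

# Toy payoff matrix for mutually assured destruction (editorial illustration;
# the numbers are assumptions, not from the conversation). Strategies:
# 0 = restrain, 1 = strike. Retaliation is assumed automatic, so any strike
# destroys both sides.

RESTRAIN, STRIKE = 0, 1
NAMES = ("restrain", "strike")

def payoff(a: int, b: int) -> tuple[float, float]:
    if a == RESTRAIN and b == RESTRAIN:
        return (0.0, 0.0)        # uneasy peace
    return (-100.0, -100.0)      # someone fired: mutual destruction

def is_nash(a: int, b: int) -> bool:
    """Neither player can improve by unilaterally switching strategy."""
    best_a = all(payoff(a, b)[0] >= payoff(alt, b)[0] for alt in (RESTRAIN, STRIKE))
    best_b = all(payoff(a, b)[1] >= payoff(a, alt)[1] for alt in (RESTRAIN, STRIKE))
    return best_a and best_b

for a, b in product((RESTRAIN, STRIKE), repeat=2):
    if is_nash(a, b):
        print(f"Nash equilibrium: {NAMES[a]} / {NAMES[b]}")
# Prints both restrain/restrain and the grim strike/strike (the latter only
# weakly), which is one way of seeing that mutual restraint is self-consistent
# but not the only self-consistent outcome.
```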
00:56:58.880 | - On the positive side,
00:57:01.080 | life on earth will probably still continue.
00:57:03.200 | And another intelligent civilization might still pop up.
00:57:06.160 | - Maybe.
00:57:07.000 | - Several millennia after.
00:57:08.440 | - Pick your X risk, depends on the X risk.
00:57:10.160 | Nuclear war, sure, that's one of the perhaps less bad ones.
00:57:13.080 | Green goo through synthetic biology, very bad.
00:57:17.560 | It would, you know, destroy all organic matter,
00:57:22.560 | you know, it's basically like a biological
00:57:27.080 | paperclip maximizer, also bad.
00:57:28.880 | Or AI type, you know, mass extinction thing as well
00:57:32.480 | would also be bad.
00:57:33.320 | - Shh, they're listening.
00:57:35.000 | There's a robot right behind you.
00:57:36.440 | Okay, wait, so let me ask you about this
00:57:38.680 | from a game theory perspective.
00:57:40.360 | Do you think we're living in a simulation?
00:57:42.280 | Do you think we're living inside a video game
00:57:44.640 | created by somebody else?
00:57:45.920 | (laughing)
00:57:47.400 | - Well, I think, well, so what was the second part
00:57:49.840 | of the question?
00:57:50.680 | Do I think we're living in a simulation and?
00:57:52.600 | - A simulation that is observed by somebody
00:57:56.640 | for purpose of entertainment.
00:57:58.600 | So like a video game.
00:57:59.640 | Are we listening?
00:58:00.480 | Are we, because there's a,
00:58:03.120 | it's like Phil Hellmuth type of situation, right?
00:58:05.240 | Like there's a creepy level of like,
00:58:09.920 | this is kind of fun and interesting.
00:58:12.640 | Like there's a lot of interesting stuff going on.
00:58:16.920 | I mean, that could be somehow integrated
00:58:18.760 | into the evolutionary process where,
00:58:20.680 | the way we perceive and--
00:58:24.120 | - Are you asking me if I believe in God?
00:58:26.000 | Sounds like it.
00:58:28.800 | - Kind of, but God seems to be not optimizing
00:58:33.800 | in the different formulations of God that we conceive of.
00:58:37.400 | He or she doesn't seem to be optimizing
00:58:39.960 | for like personal entertainment.
00:58:42.700 | Or maybe the older gods did,
00:58:45.840 | but the, you know, basically like a teenager
00:58:49.800 | in their mom's basement creating a fun universe
00:58:54.200 | to observe what kind of crazy shit might happen.
00:58:58.520 | - Okay, so to try and ask this,
00:59:00.080 | do I think there is some kind of extraneous intelligence
00:59:11.280 | to like our classic measurable universe
00:59:16.200 | that we can measure
00:59:18.240 | through our current physics and instruments?
00:59:22.920 | I think so, yes.
00:59:25.040 | Partly because I've had just small little bits of evidence
00:59:30.760 | in my own life, which have made me question,
00:59:34.040 | like, so I was a diehard atheist even five years ago.
00:59:39.400 | You know, I got into like the rationality community,
00:59:41.680 | big fan of Less Wrong, it continues to be an incredible resource.
00:59:45.680 | But I've just started to have too many little snippets
00:59:51.320 | of experience, which don't make sense
00:59:58.040 | with the current sort of purely materialistic explanation
01:00:01.600 | of how reality works.
01:00:08.680 | - Isn't that just like a humbling practical realization
01:00:13.680 | that we don't know how reality works?
01:00:17.400 | Isn't that just a reminder to yourself?
01:00:19.360 | - Yeah, no, it's a reminder of epistemic humility
01:00:21.520 | because I fell too hard, you know, same as people,
01:00:24.360 | like I think, you know, many people who are just like,
01:00:26.440 | my religion is the way, this is the correct way,
01:00:28.280 | this is the law, you are immoral
01:00:31.200 | if you don't follow this, blah, blah, blah.
01:00:32.600 | I think they are lacking epistemic humility.
01:00:35.080 | They're a little too much hubris there.
01:00:37.040 | But similarly, I think the sort of the Richard Dawkins brand
01:00:39.720 | of atheism is too rigid as well and doesn't,
01:00:44.720 | you know, there's a way to try and navigate these questions
01:00:50.960 | which still honors the scientific method,
01:00:52.720 | which I still think is our best sort of realm
01:00:54.760 | of like reasonable inquiry, you know, a method of inquiry.
01:00:57.600 | So an example, I have two kind of notable examples
01:01:03.280 | that like really rattled my cage.
01:01:06.640 | The first one was actually in 2010, early on in,
01:01:09.800 | quite early on in my poker career.
01:01:13.480 | And, do you remember the Icelandic volcano
01:01:18.480 | that erupted that like shut down
01:01:20.520 | kind of all Atlantic airspace?
01:01:22.760 | And I, it meant I got stuck down in the South of France.
01:01:25.280 | I was there for something else.
01:01:27.160 | And I couldn't get home and someone said,
01:01:29.920 | well, there's a big poker tournament happening in Italy.
01:01:31.880 | Maybe do you wanna go?
01:01:33.040 | I was like, oh, right, sure.
01:01:33.880 | Like let's, you know, got a train across,
01:01:35.880 | found a way to get there.
01:01:37.080 | And the buy-in was 5,000 euros,
01:01:39.920 | which was much bigger than my bankroll would normally allow.
01:01:42.520 | And so I played a feeder tournament,
01:01:45.960 | won my way in kind of like I did with the Monte Carlo,
01:01:47.920 | big one.
01:01:48.760 | So then I won my way, you know,
01:01:50.920 | from 500 euros into 5,000 euros to play this thing.
01:01:54.200 | And on day one of then the big tournament,
01:01:58.040 | which turned out to have,
01:01:59.000 | it was the biggest tournament ever held in Europe
01:02:00.880 | at the time.
01:02:01.720 | It got over like 1,200 people, absolutely huge.
01:02:04.560 | And I remember they dimmed the lights
01:02:06.640 | for before, you know, the normal shuffle up and deal
01:02:10.240 | to tell everyone to start playing.
01:02:12.120 | And they played Chemical Brothers' "Hey Boy, Hey Girl,"
01:02:16.080 | which I don't know why it's notable,
01:02:17.360 | but it was just like a really,
01:02:18.360 | it was a song I always liked.
01:02:19.200 | It was like one of these like pump me up songs.
01:02:21.200 | And I was sitting there thinking, oh yeah, it's exciting.
01:02:22.920 | I'm playing this really big tournament.
01:02:24.600 | And out of nowhere, just suddenly this voice in my head,
01:02:29.320 | just, and it sounded like my own sort of,
01:02:32.000 | you know, when you think in your mind,
01:02:33.600 | you hear a voice kind of, right?
01:02:35.000 | At least I do.
01:02:35.840 | And so it sounded like my own voice and it said,
01:02:38.640 | you are gonna win this tournament.
01:02:40.320 | And it was so powerful that I got this like wave of like,
01:02:44.200 | you know, sort of goosebumps down my body.
01:02:46.760 | And I even, I remember looking around being like,
01:02:48.760 | did anyone else hear that?
01:02:49.840 | And obviously people are on their phones,
01:02:51.040 | like no one else heard it.
01:02:51.880 | And I was like, okay.
01:02:53.480 | Six days later, I win the fucking tournament
01:02:58.160 | out of 1,200 people.
01:03:01.480 | And I don't know how to explain it.
01:03:06.480 | Okay, yes, maybe I have that feeling before every time
01:03:13.840 | I play and it's just that I happened to, you know,
01:03:16.200 | because I won the tournament, I retroactively remembered it.
01:03:18.360 | But that's just-
01:03:19.400 | - Or the feeling gave you a kind of,
01:03:22.360 | to use the Phil Hellmuth term, "Hellmuthian."
01:03:25.000 | - Well, exactly.
01:03:25.840 | - Like it gave you a confidence, a deep confidence.
01:03:28.360 | - And it did, it definitely did.
01:03:29.880 | I remember then feeling this like sort of,
01:03:32.040 | well, although I remember then on day one,
01:03:33.960 | I then went and lost half my stack quite early on.
01:03:35.920 | And I remember thinking like, oh, well that was bullshit.
01:03:37.720 | You know, what kind of premonition is this?
01:03:40.240 | Thinking, oh, I'm out.
01:03:41.080 | But you know, I managed to like keep it together
01:03:42.920 | and recover and then just went like pretty perfectly
01:03:46.080 | from then on.
01:03:47.440 | And either way, it definitely instilled me
01:03:51.080 | with this confidence.
01:03:52.800 | And I don't want to put a, I can't put an explanation.
01:03:55.840 | Like, you know, was it some, you know,
01:03:58.680 | huge extra supernatural thing driving me
01:04:03.680 | or was it just my own self-confidence and so on
01:04:06.520 | that just made me make the right decisions?
01:04:08.000 | I don't know.
01:04:08.880 | And I don't, I'm not going to put a frame on it.
01:04:10.680 | And I think-
01:04:11.520 | - I think I know a good explanation.
01:04:12.400 | So we're a bunch of NPCs living in this world
01:04:14.720 | created by, in the simulation.
01:04:16.600 | And then people, not people,
01:04:19.200 | creatures from outside of the simulation
01:04:21.280 | sort of can tune in and play your character.
01:04:24.200 | And that feeling you got is somebody just like,
01:04:27.240 | they got to play a poker tournament through you.
01:04:29.320 | - Honestly, it felt like that.
01:04:30.680 | It did actually feel a little bit like that.
01:04:33.120 | But it's been 12 years now.
01:04:35.200 | I've retold the story many times.
01:04:36.680 | Like, I don't even know how much I can trust my memory.
01:04:38.960 | - You're just an NPC retelling the same story.
01:04:41.520 | 'Cause they just played the tournament and left.
01:04:43.800 | - Yeah, they're like, oh, that was fun.
01:04:44.760 | Cool. - Yeah, cool. Next.
01:04:46.080 | And now you're for the rest of your life
01:04:48.520 | left as a boring NPC retelling this story of greatness.
01:04:51.880 | - But it was, and what was interesting was that after that,
01:04:53.720 | then I didn't obviously win a major tournament
01:04:55.640 | for quite a long time.
01:04:56.800 | And it left, that was actually another sort of dark period
01:05:01.200 | because I had this incredible,
01:05:02.800 | like the highs of winning that,
01:05:04.360 | just on a material level were insane, winning the money.
01:05:07.080 | I was on the front page of newspapers
01:05:08.520 | 'cause there was this girl that came out of nowhere
01:05:10.200 | and won this big thing.
01:05:12.160 | And so again, sort of chasing that feeling was difficult.
01:05:16.720 | But then on top of that, there was this feeling
01:05:18.160 | of almost being touched by something bigger
01:05:22.320 | that was like, ah.
01:05:24.080 | - So maybe, did you have a sense
01:05:26.600 | that I might be somebody special?
01:05:29.520 | Like this kind of,
01:05:31.880 | I think that's the confidence thing
01:05:36.520 | that maybe you could do something special
01:05:40.240 | in this world after all kind of feeling.
01:05:42.320 | - I definitely, I mean, this is the thing
01:05:45.320 | I think everybody wrestles with to an extent, right?
01:05:47.240 | We all, we are truly the protagonists in our own lives.
01:05:51.520 | And so it's a natural bias, human bias to feel special.
01:05:56.520 | And I think, and in some ways we are special.
01:06:00.760 | Every single person is special because, for you,
01:06:03.760 | the universe does, the world literally
01:06:05.200 | does revolve around you.
01:06:06.200 | That's the thing in some respect.
01:06:08.560 | But of course, if you then zoom out
01:06:10.480 | and take the amalgam of everyone's experiences,
01:06:12.080 | then no, it doesn't.
01:06:12.920 | So there is this sort of objective reality,
01:06:15.680 | an objective reality that is shared,
01:06:17.720 | but then there's also this subjective reality,
01:06:19.240 | which is truly unique to you.
01:06:20.720 | And I think both of those things coexist.
01:06:22.280 | And it's not like one is correct and one isn't.
01:06:24.440 | And again, anyone who's like,
01:06:26.280 | oh no, your lived experience is everything
01:06:28.160 | versus your lived experience is nothing.
01:06:30.080 | No, it's a blend between these two things.
01:06:32.480 | They can exist concurrently.
01:06:33.840 | - But there's a certain kind of sense
01:06:35.280 | that at least I've had my whole life.
01:06:36.800 | And I think a lot of people have this as like,
01:06:38.480 | well, I'm just like this little person.
01:06:40.960 | Surely I can't be one of those people
01:06:42.920 | that do the big thing, right?
01:06:46.560 | There's all these big people doing big things.
01:06:48.640 | There's big actors and actresses, big musicians.
01:06:53.400 | There's big business owners and all that kind of stuff,
01:06:57.040 | scientists and so on.
01:06:58.680 | I have my own subjective experience that I enjoy and so on,
01:07:02.440 | but there's like a different layer.
01:07:04.760 | Surely I can't do those great things.
01:07:09.440 | I mean, one of the things just having interacted
01:07:11.480 | with a lot of great people, I realized,
01:07:14.160 | no, they're like just the same humans as me.
01:07:19.160 | And that realization I think is really empowering.
01:07:22.160 | And to remind yourself-- - But are they?
01:07:25.040 | - Huh? - But are they?
01:07:25.920 | - Are they?
01:07:26.760 | Well, in terms-- - Depends on some, yeah.
01:07:30.560 | - They're like a bag of insecurities and--
01:07:33.920 | - Yes.
01:07:34.760 | - Peculiar sort of,
01:07:39.320 | like their own little weirdnesses and so on.
01:07:43.120 | I should also say,
01:07:44.920 | they have the capacity for brilliance,
01:07:50.160 | but they're not generically brilliant.
01:07:52.880 | Like, we tend to say this person or that person is brilliant
01:07:57.880 | but really, no, they're just like sitting there
01:08:01.240 | and thinking through stuff, just like the rest of us.
01:08:05.280 | I think they're in the habit of thinking through stuff,
01:08:08.120 | seriously, and they've built up a habit
01:08:10.600 | of not allowing their mind to get trapped
01:08:14.120 | in a bunch of bullshit and minutiae of day-to-day life.
01:08:17.040 | They really think big ideas, but those big ideas,
01:08:20.080 | it's like allowing yourself the freedom to think big,
01:08:24.720 | to realize that you can be one
01:08:27.280 | that actually solved this particular big problem.
01:08:29.280 | First, identify a big problem that you care about,
01:08:31.320 | and then like, I can actually be the one
01:08:33.240 | that solves this problem.
01:08:34.880 | And like, allowing yourself to believe that.
01:08:37.320 | And I think sometimes you do need to have like--
01:08:38.920 | - Yeah. - That shock go
01:08:40.280 | through your body and a voice tells you
01:08:41.680 | you're gonna win this tournament.
01:08:42.880 | - Well, exactly, and whether it was,
01:08:44.880 | it's this idea of useful fictions.
01:08:50.520 | So again, like going through the classic rationalist training
01:08:54.920 | of "Less Wrong" where it's like, you want your map,
01:08:57.440 | you know, the image you have of the world in your head
01:09:00.080 | to as accurately match up with how the world actually is.
01:09:04.400 | You want the map and the territory to perfectly align as,
01:09:07.240 | you know, you want it to be
01:09:08.440 | as an accurate representation as possible.
01:09:11.040 | I don't know if I fully subscribe to that anymore,
01:09:13.920 | having now had these moments of like,
01:09:16.400 | feeling of something either bigger
01:09:18.560 | or just actually just being overconfident.
01:09:20.400 | Like there is value in overconfidence sometimes, I do.
01:09:23.760 | If you would, you know, take, you know, take
01:09:26.240 | Magnus Carlsen, right?
01:09:30.600 | If he, I'm sure from a young age,
01:09:32.360 | he knew he was very talented,
01:09:34.440 | but I wouldn't be surprised if he also
01:09:36.720 | had something in him to,
01:09:39.200 | well, actually maybe he's a bad example
01:09:40.320 | 'cause he truly is the world's greatest,
01:09:42.640 | but someone who, it was unclear
01:09:44.440 | whether they were gonna be the world's greatest,
01:09:45.720 | but ended up doing extremely well
01:09:47.320 | because they had this innate, deep self-confidence,
01:09:50.320 | this like even overblown idea
01:09:53.120 | of how good their relative skill level is.
01:09:54.920 | That gave them the confidence to then pursue this thing
01:09:56.960 | and like, with the kind of focus and dedication
01:10:00.480 | that it requires to excel
01:10:01.840 | in whatever it is you're trying to do, you know?
01:10:03.480 | And so there are these useful fictions
01:10:06.120 | and that's where I think I diverge slightly
01:10:09.560 | with the classic sort of rationalist community
01:10:13.440 | because that's a field that is worth studying
01:10:19.680 | of like how the stories we tell,
01:10:23.280 | what the stories we tell to ourselves,
01:10:25.040 | even if they are actually false
01:10:26.240 | and even if we suspect they might be false,
01:10:29.040 | how it's better to sort of have that like
01:10:30.400 | little bit of faith,
01:10:32.000 | the like value in faith, I think, actually.
01:10:34.360 | And that's partly another thing
01:10:35.840 | that's like now led me to explore,
01:10:37.640 | you know, the concept of God,
01:10:40.600 | whether you wanna call it a simulator,
01:10:42.920 | the classic theological thing.
01:10:44.280 | I think we're all like alluding to the same thing.
01:10:46.440 | Now, I don't know, I'm not saying, you know,
01:10:47.760 | 'cause obviously the Christian God is like,
01:10:49.760 | you know, all benevolent, endless love.
01:10:53.480 | The simulation, at least one of the simulation hypothesis
01:10:56.680 | is like, as you said, like a teenager in his bedroom
01:10:58.680 | who doesn't really care, doesn't give a shit
01:11:01.200 | about the individuals within there.
01:11:02.560 | It just like wants to see how this thing plays out
01:11:05.120 | 'cause it's curious and it could turn it off like that.
01:11:06.960 | You know, where on the sort of psychopathy
01:11:09.880 | to benevolence spectrum God is, I don't know.
01:11:13.800 | But having this,
01:11:18.600 | just having a little bit of faith
01:11:20.160 | that there is something else out there
01:11:21.720 | that might be interested in our outcome
01:11:24.360 | is I think an essential thing actually for people to find.
01:11:27.960 | A, because it creates commonality between,
01:11:29.680 | it's something we can all share.
01:11:31.880 | And like, it is uniquely humbling for all of us to an extent.
01:11:35.040 | It's like a common objective.
01:11:36.880 | But B, it gives people that little bit of like reserve,
01:11:41.400 | you know, when things get really dark.
01:11:43.400 | And I do think things are gonna get pretty dark
01:11:44.880 | over the next few years.
01:11:46.080 | But it gives that like,
01:11:48.280 | to think that there's something out there
01:11:50.760 | that actually wants our game to keep going.
01:11:53.160 | I keep calling it the game, you know.
01:11:55.040 | It's a thing, C and I, we call it the game.
01:11:59.280 | - You and C, AKA Grimes, call what the game?
01:12:03.680 | Everything, the whole thing?
01:12:05.000 | - Yeah, we joke about like--
01:12:06.680 | - So everything is a game?
01:12:07.880 | - Not, well, the universe, like what if it's a game
01:12:12.320 | and the goal of the game is to figure out like,
01:12:14.600 | well, either how to beat it, how to get out of it.
01:12:16.960 | You know, maybe this universe is an escape room,
01:12:19.920 | like a giant escape room.
01:12:21.240 | And the goal is to figure out,
01:12:24.000 | put all the pieces of the puzzle together,
01:12:25.160 | figure out how it works
01:12:26.880 | in order to like unlock this like hyper dimensional key
01:12:29.800 | and get out beyond what it is.
01:12:31.280 | That's--
01:12:32.120 | - No, but then, so you're saying it's like different levels
01:12:34.400 | and it's like a cage within a cage within a cage
01:12:36.520 | and one cage at a time, you figure out how to escape that.
01:12:40.000 | - Like a new level up, you know,
01:12:42.240 | like us becoming multi-planetary would be a level up
01:12:44.400 | or us, you know, figuring out how to upload
01:12:46.680 | our consciousnesses to the thing,
01:12:48.240 | that would probably be a leveling up.
01:12:49.600 | Or spiritually, you know, humanity becoming more combined
01:12:53.760 | and less adversarial and bloodthirsty
01:12:56.880 | and us becoming a little bit more enlightened,
01:12:58.760 | that would be a leveling up.
01:12:59.600 | You know, there's many different frames to it,
01:13:01.840 | whether it's physical, you know, digital,
01:13:05.120 | or like metaphysical--
01:13:05.960 | - I wonder what the level,
01:13:06.800 | I think level one for Earth
01:13:09.240 | is probably the biological evolutionary process.
01:13:14.040 | So going from single cell organisms to early humans.
01:13:18.360 | Then maybe level two is whatever's happening
01:13:22.200 | inside our minds and creating ideas
01:13:24.000 | and creating technologies.
01:13:26.100 | That's like evolutionary process of ideas.
01:13:30.200 | And then multi-planetary is interesting.
01:13:34.640 | Is that fundamentally different
01:13:35.880 | from what we're doing here on Earth?
01:13:37.880 | Probably, 'cause it allows us to exponentially scale.
01:13:41.520 | - It delays the Malthusian trap, right?
01:13:47.040 | It's a way to keep the playing field,
01:13:51.800 | to make the playing field get larger
01:13:53.440 | so that it can accommodate more of our stuff, more of us.
01:13:57.320 | And that's a good thing,
01:13:58.480 | but I don't know if it like fully solves this issue of,
01:14:03.480 | well, this thing called Moloch,
01:14:06.800 | which we haven't talked about yet,
01:14:08.720 | which is basically, I call it
01:14:11.040 | the god of unhealthy competition.
01:14:12.760 | - Yeah, let's go to Moloch.
01:14:14.040 | What's Moloch?
01:14:15.040 | You did a great video on Moloch,
01:14:17.480 | one aspect of it, the application of it to--
01:14:19.880 | - Yeah, Instagram beauty filters.
01:14:22.440 | - True.
01:14:23.600 | - Very niche.
01:14:25.000 | I wanted to start off small.
01:14:27.360 | So Moloch was originally coined as,
01:14:32.360 | well, so apparently back in the Canaanite times,
01:14:39.560 | it was to say ancient Carthaginian,
01:14:41.440 | I can never say it, Carthaginian,
01:14:43.240 | somewhere around 300 BC or 200 AD, I don't know.
01:14:46.760 | There was supposedly this death cult
01:14:49.720 | who would sacrifice their children
01:14:53.240 | to this awful demon god thing they called Moloch
01:14:56.080 | in order to get power to win wars.
01:14:59.400 | So really dark, horrible things.
01:15:00.920 | And it was literally like about child sacrifice.
01:15:02.640 | Whether they actually existed or not, we don't know,
01:15:04.080 | but in mythology they did.
01:15:05.680 | And this god that they worshipped
01:15:07.240 | was this thing called Moloch.
01:15:09.280 | And then, I don't know,
01:15:10.960 | it seemed like it was kind of quiet throughout history
01:15:13.720 | in terms of mythology beyond that
01:15:15.440 | until this movie "Metropolis" in 1927 talked about this,
01:15:20.440 | you see that there was this incredible futuristic city
01:15:27.040 | that everyone was living great in,
01:15:29.080 | but then the protagonist goes underground into the sewers
01:15:31.360 | and sees that the city is run by this machine.
01:15:34.280 | And this machine basically would just like
01:15:36.160 | kill the workers all the time
01:15:38.080 | because it was just so hard to keep it running.
01:15:40.080 | They were always dying.
01:15:40.920 | So there was all this suffering that was required
01:15:43.000 | in order to keep the city going.
01:15:44.560 | And then the protagonist has this vision
01:15:45.920 | that this machine is actually this demon Moloch.
01:15:48.160 | So again, it's like this sort of mechanistic consumption
01:15:50.880 | of humans in order to get more power.
01:15:53.840 | And then Allen Ginsberg wrote a poem in the '60s,
01:15:58.720 | which incredible poem called "Howl"
01:16:01.920 | about this thing Moloch.
01:16:03.120 | And a lot of people sort of quite understandably
01:16:07.000 | take the interpretation of that,
01:16:09.160 | that he's talking about capitalism.
01:16:12.360 | But then the sort of pièce de résistance
01:16:14.680 | that moved Moloch into this idea of game theory
01:16:17.320 | was Scott Alexander of Slate Star Codex.
01:16:20.240 | He wrote this incredible piece,
01:16:22.600 | one that literally I think might be my favorite piece
01:16:24.400 | of writing of all time.
01:16:25.280 | It's called "Meditations on Moloch."
01:16:27.160 | Everyone must go read it.
01:16:30.560 | - Slate Star Codex is a blog.
01:16:31.880 | - It's a blog, yes.
01:16:33.400 | We can link to it in the show notes or something, right?
01:16:36.200 | No, don't.
01:16:37.040 | - Yes, yes.
01:16:39.280 | - But I like how you assume
01:16:42.280 | I have a professional operation going on here.
01:16:44.280 | - I mean-
01:16:45.120 | - I shall try to remember to-
01:16:45.960 | - You're wearing a suit.
01:16:46.800 | What do you want?
01:16:48.920 | You're giving the impression of it.
01:16:50.160 | - Yeah, I'll look, please.
01:16:51.520 | If I don't, please, somebody in the comments remind me.
01:16:54.040 | - I'll help you.
01:16:54.880 | - If you don't know this blog,
01:16:56.640 | it's one of the best blogs ever probably.
01:17:00.240 | You should probably be following it.
01:17:01.960 | - Yes.
01:17:02.800 | - Are blogs still a thing?
01:17:04.160 | I think they are still a thing, yeah.
01:17:05.760 | - He's migrated onto Substack, but yeah, it's still a blog.
01:17:08.880 | - Substack better not fuck things up.
01:17:11.520 | - I hope not, yeah.
01:17:12.640 | I hope they don't turn Moloch-y,
01:17:15.160 | which will mean something to people when we continue.
01:17:17.560 | (both laughing)
01:17:18.760 | - When I stop interrupting for once.
01:17:20.040 | - No, no, it's good.
01:17:21.560 | So anyway, so he writes this piece, "Meditations on Moloch,"
01:17:25.440 | and basically he analyzes the poem and he's like,
01:17:28.720 | "Okay, so it seems to be something relating
01:17:30.200 | to where competition goes wrong."
01:17:32.960 | And Moloch was historically this thing
01:17:36.400 | where people would sacrifice a thing that they care about,
01:17:40.120 | in this case, children, their own children,
01:17:42.480 | in order to gain power, a competitive advantage.
01:17:45.920 | And if you look at almost everything
01:17:48.320 | that sort of goes wrong in our society,
01:17:50.360 | it's that same process.
01:17:52.840 | So with the Instagram beauty filters thing,
01:17:55.520 | if you're trying to become a famous Instagram model,
01:18:02.360 | you are incentivized to post the hottest pictures
01:18:04.400 | of yourself that you can.
01:18:05.680 | You're trying to play that game.
01:18:07.280 | There's a lot of hot women on Instagram.
01:18:08.720 | How do you compete against them?
01:18:10.080 | You post really hot pictures
01:18:11.600 | and that's how you get more likes.
01:18:13.320 | As technology gets better,
01:18:16.080 | more makeup techniques come along.
01:18:19.320 | And then more recently, these beauty filters,
01:18:22.080 | where like at the touch of a button,
01:18:23.480 | it makes your face look absolutely incredible,
01:18:26.080 | compared to your natural face.
01:18:28.720 | These technologies come along.
01:18:31.600 | Everyone is incentivized to that short-term strategy.
01:18:35.640 | But on net, it's bad for everyone
01:18:39.040 | because now everyone is kind of feeling
01:18:40.480 | like they have to use these things.
01:18:41.680 | And these things, they make you,
01:18:43.080 | the reason why I talked about them in this video
01:18:44.400 | is 'cause I noticed it myself.
01:18:45.880 | Like I was trying to grow my Instagram for a while.
01:18:48.840 | I've given up on it now.
01:18:50.000 | And I noticed these filters, how good they made me look.
01:18:53.440 | And I'm like, well, I know that everyone else
01:18:56.400 | is kind of doing it.
01:18:57.240 | - Go subscribe to Liv's Instagram.
01:18:59.000 | - Please, so I don't have to use the filters.
01:19:00.840 | (both laughing)
01:19:02.200 | - Post a bunch of, yeah, make it blow up.
01:19:06.000 | So yeah, you felt the pressure actually.
01:19:08.440 | - Exactly, these short-term incentives to do this thing
01:19:12.760 | that either sacrifices your integrity or something else
01:19:17.280 | in order to stay competitive,
01:19:19.160 | which on aggregate creates this sort of race
01:19:23.760 | to the bottom spiral where everyone ends up
01:19:25.880 | in a situation which is worse off
01:19:28.480 | than it was before they started.
01:19:29.320 | Kind of like, at a football stadium,
01:19:34.600 | the system is so badly designed,
01:19:36.400 | a competitive system of like everyone sitting
01:19:38.160 | and having a view, that if someone at the very front
01:19:40.240 | stands up to get an even better view,
01:19:42.080 | it forces everyone else behind
01:19:43.640 | to like adopt that same strategy,
01:19:45.440 | just to get to where they were before,
01:19:47.000 | but now everyone's stuck standing up.
01:19:49.160 | So you need this like top-down God's eye coordination
01:19:51.880 | to make it go back to the better state.
01:19:53.840 | But from within the system, you can't actually do that.
01:19:56.080 | So that's kind of what this Moloch thing is.
01:19:57.680 | It's this thing that makes people sacrifice values
01:20:01.600 | in order to optimize for winning the game in question,
01:20:04.440 | the short-term game.
01:20:05.480 | - But this Moloch, can you attribute it
01:20:09.760 | to any one centralized source,
01:20:11.680 | or is it an emergent phenomena
01:20:13.640 | from a large collection of people?
01:20:16.160 | - Exactly that.
01:20:17.000 | It's an emergent phenomena.
01:20:18.880 | It's a force of game theory.
01:20:21.280 | It's a force of bad incentives on a multi-agent system
01:20:25.600 | where you've got more, you know,
01:20:26.600 | prisoner's dilemma is technically a kind of Moloch-y
01:20:29.880 | system as well, but it's just a two-player thing.
01:20:31.960 | But another word for Moloch is multipolar trap,
01:20:35.720 | where basically you just got a lot of different people
01:20:37.760 | all competing for some kind of prize.
01:20:39.840 | And it would be better if everyone didn't do
01:20:43.200 | this one shitty strategy,
01:20:44.720 | but because that strategy gives you a short-term advantage,
01:20:47.240 | everyone's incentivized to do it,
01:20:48.480 | and so everyone ends up doing it.
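To make the multipolar-trap dynamic concrete, here is a minimal Python sketch with made-up numbers (an editorial illustration, not anything from the conversation): agents compete for a fixed pool of attention, a costly "filter" tactic is individually rational at every step, and the population settles into a state where everyone pays the cost and nobody ends up ahead.

```python
# Minimal sketch of a multipolar trap; all numbers are illustrative assumptions.
# N agents compete for a fixed pool of attention. Using a "filter" boosts an
# agent's pull but carries a personal cost. Each round, agents switch to
# whichever choice is individually better given what everyone else is doing.

N = 20
ATTENTION_POOL = 100.0   # total attention to divide up (the zero-sum prize)
BOOST = 2.0              # how much the filter multiplies an agent's pull
COST = 1.0               # personal cost of using the filter (integrity, effort)

def payoff(my_filter: bool, others_filtering: int) -> float:
    my_pull = BOOST if my_filter else 1.0
    total_pull = my_pull + others_filtering * BOOST + (N - 1 - others_filtering) * 1.0
    share = ATTENTION_POOL * my_pull / total_pull
    return share - (COST if my_filter else 0.0)

filtering = [False] * N
for _ in range(50):                      # iterate best responses until stable
    changed = False
    for i in range(N):
        others = sum(filtering) - (1 if filtering[i] else 0)
        best = payoff(True, others) > payoff(False, others)
        if best != filtering[i]:
            filtering[i], changed = best, True
    if not changed:
        break

avg_before = ATTENTION_POOL / N                                   # 5.0 each
avg_after = sum(payoff(f, sum(filtering) - f) for f in filtering) / N
print(f"Agents filtering at equilibrium: {sum(filtering)}/{N}")
print(f"Average payoff before: {avg_before:.2f}, after: {avg_after:.2f}")
# Every individual switch was rational at the moment it was made, yet the end
# state leaves the attention shares where they started while everyone now pays
# the cost -- the race-to-the-bottom structure described above.
```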
01:20:49.840 | - So the responsibility for, I mean,
01:20:52.200 | social media is a really nice place
01:20:54.040 | for a large number of people to play game theory.
01:20:56.840 | And so they also have the ability
01:20:59.720 | to then design the rules of the game.
01:21:02.560 | And is it on them to try to anticipate what kind of,
01:21:06.960 | like to do the thing that poker players are doing,
01:21:11.080 | to run simulations?
01:21:11.080 | - Ideally, that would have been great.
01:21:12.360 | If, you know, Mark Zuckerberg and Jack and all the,
01:21:15.800 | you know, the Twitter founders and everyone,
01:21:17.240 | if they had at least just run a few simulations
01:21:20.560 | of how their algorithms would, you know,
01:21:23.240 | different types of algorithms would turn out for society,
01:21:26.200 | that would have been great.
01:21:27.320 | - That's really difficult to do
01:21:28.680 | that kind of deep philosophical thinking
01:21:30.400 | about humanity, actually.
01:21:33.080 | So not kind of this level of how do we optimize engagement
01:21:38.080 | or what brings people joy in the short term,
01:21:42.840 | but how is this thing going to change
01:21:45.320 | the way people see the world?
01:21:47.960 | How is it gonna get morphed in iterative games played
01:21:52.960 | into something that will change society forever?
01:21:57.040 | That requires some deep thinking.
01:21:58.800 | I hope there's meetings like that inside companies,
01:22:02.600 | but I haven't seen them. - There aren't.
01:22:03.600 | That's the problem.
01:22:04.480 | And it's difficult because like when you're starting up
01:22:08.640 | a social media company, you know,
01:22:10.280 | you're aware that you've got investors to please,
01:22:14.120 | there's bills to pay, you know,
01:22:17.280 | there's only so much R&D you can afford to do.
01:22:20.120 | You've got all these like incredible pressures,
01:22:22.120 | and bad incentives to get on
01:22:23.760 | and just build your thing as quickly as possible
01:22:25.200 | and start making money.
01:22:26.640 | And, you know, I don't think anyone intended
01:22:29.120 | when they built these social media platforms,
01:22:32.560 | and just to like preface it.
01:22:33.880 | So the reason why social media is relevant
01:22:36.360 | because it's a very good example of like,
01:22:38.720 | everyone these days is optimizing for clicks,
01:22:42.000 | whether it's the social media platforms themselves,
01:22:44.760 | because every click gets more impressions
01:22:47.880 | and impressions pay for, you know,
01:22:49.720 | they get advertising dollars,
01:22:51.000 | or whether it's individual influencers,
01:22:53.760 | or whether it's the New York Times or whoever,
01:22:56.080 | they're trying to get their story to go viral.
01:22:58.280 | So everyone's got this bad incentive of using,
01:23:00.200 | as you called it, the clickbait industrial complex.
01:23:02.760 | That's a very moloch-y system
01:23:04.200 | because everyone is now using worse and worse tactics
01:23:06.240 | in order to like try and win this attention game.
01:23:08.680 | And yeah, so ideally these companies would have had
01:23:14.520 | enough slack in the beginning
01:23:17.280 | in order to run these experiments to see,
01:23:19.800 | okay, what are the ways this could possibly go wrong
01:23:22.440 | for people?
01:23:23.280 | What are the ways that moloch,
01:23:24.680 | they should be aware of this concept of moloch
01:23:26.320 | and realize that it's,
01:23:27.680 | whenever you have a highly competitive multi-agent system,
01:23:31.280 | which social media is a classic example
01:23:32.800 | of millions of agents,
01:23:34.160 | all trying to compete for likes and so on,
01:23:35.920 | and you try and bring all this complexity down
01:23:39.920 | into like very small metrics,
01:23:41.880 | such as number of likes, number of retweets,
01:23:44.880 | whatever the algorithm optimizes for,
01:23:46.720 | that is a like guaranteed recipe
01:23:48.680 | for this stuff to go wrong
01:23:49.800 | and become a race to the bottom.
01:23:51.520 | - I think there should be an honesty from founders,
01:23:53.920 | I think there's a hunger for that kind of transparency
01:23:56.400 | of like, we don't know what the fuck we're doing.
01:23:58.200 | This is a fascinating experiment.
01:23:59.720 | We're all running as a human civilization.
01:24:02.800 | Let's try this out.
01:24:04.320 | And like, actually just be honest about this,
01:24:06.440 | that we're all like these weird rats in a maze.
01:24:10.440 | None of us are controlling it.
01:24:12.400 | There's this kind of sense like the founders,
01:24:15.160 | the CEO of Instagram or whatever,
01:24:17.560 | Mark Zuckerberg has a control
01:24:19.240 | and he's like with strings playing people.
01:24:21.880 | No, they're-
01:24:22.880 | - He's at the mercy of this, like everyone else.
01:24:24.640 | He's just like trying to do his best.
01:24:26.600 | - And like, I think putting on a smile
01:24:29.280 | and doing over-polished videos
01:24:33.000 | about how Instagram and Facebook are good for you,
01:24:36.920 | I think is not the right way
01:24:38.000 | to actually ask some of the deepest questions
01:24:41.160 | we get to ask as a society.
01:24:43.240 | How do we design the game such that we build a better world?
01:24:48.120 | - I think a big part of this as well is people,
01:24:51.760 | there's this philosophy, particularly in Silicon Valley
01:24:55.720 | of well, techno-optimism,
01:24:58.520 | technology will solve all our issues.
01:25:00.360 | And there's a steel man argument to that
01:25:03.080 | where yes, technology has solved a lot of problems
01:25:05.440 | and can potentially solve a lot of future ones.
01:25:08.200 | But it can also, it's always a double-edged sword
01:25:10.840 | and particularly as technology gets more and more powerful
01:25:13.440 | and we've now got like big data
01:25:15.120 | and we're able to do all kinds of like
01:25:17.360 | psychological manipulation with it and so on.
01:25:20.680 | It's, technology is not a values neutral thing.
01:25:24.960 | People think, I used to always think this myself,
01:25:27.000 | it's like this naive view that,
01:25:28.880 | oh, technology is completely neutral.
01:25:30.720 | It's just, it's the humans that either make it good or bad.
01:25:33.840 | No, to the point we're at now,
01:25:36.640 | the technology that we are creating,
01:25:38.000 | they are social technologies.
01:25:39.440 | They literally dictate how humans now form social groups
01:25:44.440 | and so on beyond that.
01:25:46.240 | And beyond that, it also then,
01:25:47.760 | that gives rise to like the memes
01:25:49.600 | that we then like coalesce around.
01:25:51.960 | And that, if you have the stack that way
01:25:54.400 | where it's technology driving social interaction,
01:25:56.960 | which then drives like mimetic culture
01:26:00.480 | and like which ideas become popular, that's Moloch.
01:26:04.800 | And we need the other way around.
01:26:06.480 | We need it so we need to figure out what are the good memes?
01:26:08.400 | What are the good values that we think are,
01:26:13.400 | we need to optimize for that like makes people happy
01:26:16.680 | and healthy and like keeps society as robust
01:26:20.120 | and safe as possible.
01:26:21.360 | Then figure out what the social structure
01:26:22.840 | around those should be.
01:26:23.680 | And only then do we figure out technology.
01:26:25.480 | But we're doing the other way around.
01:26:26.800 | And, you know, like as much as I love in many ways
01:26:31.800 | the culture of Silicon Valley and like, you know,
01:26:33.840 | I do think that technology has, you know,
01:26:36.040 | I don't want to knock it.
01:26:36.880 | It's done so many wonderful things for us.
01:26:38.240 | Same with capitalism.
01:26:39.280 | There are, we have to like be honest with ourselves.
01:26:44.160 | We're getting to a point where we are losing control
01:26:47.240 | of this very powerful machine that we have created.
01:26:49.600 | - Can you redesign the machine within the game?
01:26:53.200 | Can you just have, can you understand the game enough?
01:26:57.360 | Okay, this is the game.
01:26:58.880 | And this is how we start to reemphasize
01:27:01.920 | the memes that matter.
01:27:03.840 | The memes that bring out the best in us.
01:27:06.720 | You know, like the way I try to be in real life
01:27:11.320 | and the way I try to be online is to be about kindness
01:27:15.240 | and love.
01:27:16.080 | And I feel like I sometimes get like criticized
01:27:19.840 | for being naive and all those kinds of things.
01:27:22.040 | But I feel like I'm just trying to live within this game.
01:27:25.840 | I'm trying to be authentic.
01:27:27.520 | Yeah, but also like, hey, it's kind of fun to do this.
01:27:31.080 | Like you guys should try this too.
01:27:32.920 | You know, that, and that's like trying to redesign
01:27:37.200 | some aspects of the game within the game.
01:27:40.960 | - Is that possible?
01:27:41.920 | - I don't know, but I think we should try.
01:27:46.760 | I don't think we have an option but to try.
01:27:48.640 | - Well, the other option is to create new companies
01:27:51.160 | or to pressure companies that,
01:27:55.280 | or anyone who has control of the rules of the game.
01:27:59.040 | - I think we need to be doing all of the above.
01:28:01.120 | I think we need to be thinking hard
01:28:03.000 | about what are the kind of positive, healthy memes.
01:28:09.760 | As Elon said, he who controls the memes controls the universe.
01:28:13.640 | - He said that.
01:28:14.480 | - I think he did, yeah.
01:28:16.320 | But there's truth to that.
01:28:17.880 | It's very, there is wisdom in that
01:28:19.440 | because memes have driven history.
01:28:22.760 | You know, we are a cultural species.
01:28:24.920 | That's what sets us apart from chimpanzees
01:28:27.200 | and everything else.
01:28:28.040 | We have the ability to learn and evolve through culture
01:28:32.520 | as opposed to biology or like, you know,
01:28:34.960 | classic physical constraints.
01:28:37.360 | And that means culture is incredibly powerful
01:28:40.800 | and we can create and become victim to very bad memes
01:28:44.960 | or very good ones.
01:28:46.680 | But we do have some agency over which memes,
01:28:49.360 | you know, we not only put out there,
01:28:52.000 | but we also like subscribe to.
01:28:53.680 | So I think we need to take that approach.
01:28:56.520 | We also need to, you know, 'cause I don't want the,
01:29:01.520 | I'm making this video right now called the Attention Wars,
01:29:03.800 | which is about like how Moloch,
01:29:05.520 | like the media machine is this Moloch machine.
01:29:08.560 | Well, is this kind of like blind, dumb thing
01:29:11.440 | that where everyone is optimizing for engagement
01:29:13.200 | in order to win their share of the attention pie.
01:29:16.480 | And then if you zoom out, it's really like Moloch
01:29:18.280 | that's pulling the strings
01:29:19.240 | 'cause the only thing that benefits from this in the end is Moloch itself.
01:29:20.960 | You know, like our information ecosystem is breaking down.
01:29:24.040 | Like we have, you look at the state of the US,
01:29:26.480 | it's in, we're in a civil war.
01:29:28.440 | It's just not a physical war.
01:29:30.000 | It's an information war.
01:29:33.320 | And people are becoming more fractured
01:29:35.120 | in terms of what their actual shared reality is.
01:29:37.440 | Like truly like an extreme left person,
01:29:39.600 | an extreme right person,
01:29:40.520 | like they literally live in different worlds
01:29:43.440 | in their minds at this point.
01:29:45.320 | And it's getting more and more amplified.
01:29:47.280 | And this force is like a razor blade
01:29:50.320 | pushing through everything.
01:29:51.920 | It doesn't matter how innocuous the topic is,
01:29:53.400 | it will find a way to split into this,
01:29:55.320 | you know, bifurcated culture war.
01:29:57.280 | And it's fucking terrifying.
01:29:58.320 | - Because that maximizes the tension.
01:29:59.840 | And that's like an emergent Moloch type force
01:30:03.200 | that takes any, anything, any topic
01:30:06.880 | and cuts through it so that it can split nicely
01:30:11.240 | into two groups.
01:30:12.560 | One that's--
01:30:13.920 | - Well, it's whatever, yeah,
01:30:15.560 | all that everyone is trying to do within the system
01:30:18.040 | is just maximize whatever gets them the most attention
01:30:21.240 | because they're just trying to make money
01:30:22.360 | so they can keep their thing going, right?
01:30:24.560 | And the way, the best emotion for getting attention,
01:30:29.240 | well, because it's not just about attention on the internet,
01:30:30.880 | it's engagement.
01:30:31.920 | That's the key thing, right?
01:30:33.120 | In order for something to go viral,
01:30:34.400 | you need people to actually engage with it.
01:30:35.960 | They need to like comment or retweet or whatever.
01:30:38.600 | And of all the emotions that, you know,
01:30:43.560 | there's like seven classic shared emotions
01:30:45.800 | that studies have found that all humans,
01:30:47.520 | even from like previously uncontacted tribes have.
01:30:51.040 | Some of those are negative, you know, like sadness,
01:30:56.080 | disgust, anger, et cetera, some are positive,
01:30:58.880 | happiness, excitement, and so on.
01:31:02.360 | The one that happens to be the most useful
01:31:04.760 | for the internet is anger
01:31:06.680 | because anger is such an active emotion.
01:31:11.120 | If you want people to engage,
01:31:13.160 | if someone's scared,
01:31:14.280 | and I'm not just like talking out my ass here,
01:31:15.840 | there are studies here that have looked into this.
01:31:17.840 | Whereas like if someone's like disgusted or fearful,
01:31:22.440 | they actually tend to then be like,
01:31:23.680 | oh, I don't wanna deal with this.
01:31:24.760 | So they're less likely to actually engage
01:31:26.360 | and share it and so on.
01:31:27.520 | They're just gonna be like, ugh.
01:31:28.480 | Whereas if they're enraged by a thing,
01:31:31.040 | well, now that triggers all the old tribalism emotions.
01:31:35.920 | And so that's how then things get sort of spread,
01:31:38.400 | you know, much more easily.
01:31:39.400 | They out-compete all the other memes in the ecosystem.
01:31:43.160 | And so this like, the attention economy,
01:31:47.440 | the wheels that make it go around is rage.
01:31:50.520 | I did a tweet, the problem with raging against the machine
01:31:55.080 | is that the machine has learned to feed off rage
01:31:57.280 | because it is feeding off our rage.
01:31:59.080 | That's the thing that's now keeping it going.
01:32:00.400 | So the more we get angry, the worse it gets.
01:32:03.240 | - So the Moloch in this attention,
01:32:06.040 | in the war of attention is constantly maximizing rage.
01:32:11.040 | - What it is optimizing for is engagement.
01:32:13.600 | And it happens to be that engagement is, well, propaganda.
01:32:18.600 | You know, it's like, I mean,
01:32:20.920 | it just sounds like everything is putting,
01:32:23.240 | more and more things are being put through
01:32:24.320 | this like propagandist lens
01:32:26.000 | of winning whatever the war is in question,
01:32:28.880 | whether it's the culture war or the Ukraine war, yeah.
01:32:31.400 | - Well, I think the silver lining of this,
01:32:33.240 | do you think it's possible that in the long arc
01:32:36.520 | of this process, you actually do arrive
01:32:39.560 | at greater wisdom and more progress?
01:32:41.800 | It just, in the moment, it feels like people
01:32:43.920 | are tearing each other to shreds over ideas.
01:32:47.760 | But if you think about it,
01:32:48.600 | one of the magic things about democracy and so on
01:32:51.120 | is you have the blue versus red constantly fighting.
01:32:53.880 | It's almost like they're in discourse,
01:32:58.160 | creating devil's advocate,
01:32:59.720 | making devils out of each other.
01:33:01.240 | And through that process, discussing ideas,
01:33:04.880 | like almost really embodying different ideas,
01:33:07.680 | just to yell at each other.
01:33:08.840 | And through the yelling over the period of decades,
01:33:11.600 | maybe centuries, figuring out a better system.
01:33:15.320 | Like in the moment, it feels fucked up.
01:33:17.280 | - Right.
01:33:18.120 | - But in the long arc, it actually is productive.
01:33:20.760 | - I hope so.
01:33:23.320 | That said, we are now in the era of,
01:33:28.320 | just as we have weapons of mass destruction
01:33:30.480 | with nuclear weapons,
01:33:32.400 | that can break the whole playing field.
01:33:35.240 | We now are developing weapons
01:33:37.680 | of informational mass destruction,
01:33:39.400 | information weapons, WMDs that basically
01:33:43.560 | can be used for propaganda or just manipulating people
01:33:47.720 | however it's needed, whether that's through
01:33:51.080 | dumb TikTok videos or, you know,
01:33:54.040 | there are significant resources being put in.
01:33:59.040 | I don't mean to sound like, you know,
01:34:01.440 | too doom and gloom, but there are bad actors out there.
01:34:04.240 | That's the thing.
01:34:05.080 | There are plenty of good actors within the system
01:34:06.640 | who are just trying to stay afloat in the game.
01:34:08.240 | So effectively doing Moloch-y things.
01:34:09.760 | But then on top of that, we have actual bad actors
01:34:12.480 | who are intentionally trying to like,
01:34:14.960 | manipulate the other side into doing things.
01:34:17.360 | - And using, so because of the digital space,
01:34:19.640 | they're able to use artificial actors, meaning bots.
01:34:24.440 | - Exactly, botnets, you know,
01:34:26.120 | and this is a whole new situation
01:34:29.720 | that we've never had before.
01:34:31.160 | - It's exciting.
01:34:32.280 | You know what I wanna do? - Is it?
01:34:34.000 | - You know what I wanna do?
01:34:36.160 | Because there is, you know,
01:34:37.120 | people are talking about bots manipulating
01:34:38.920 | and have like malicious bots
01:34:41.600 | that are basically spreading propaganda.
01:34:43.560 | I wanna create like a bot army for like,
01:34:46.560 | that like fights that. - For love?
01:34:47.600 | - Yeah, exactly, for love.
01:34:48.680 | That fights, that, I mean.
01:34:50.960 | - You know, there's truth to fight fire with fire.
01:34:53.480 | It's like, but how, you always have to be careful
01:34:57.080 | whenever you create, again, like,
01:34:59.960 | Moloch is very tricky.
01:35:01.360 | - Yeah, yeah, Hitler was trying to spread love too.
01:35:04.240 | - Yeah, so we thought, but you know,
01:35:05.440 | I agree with you that like,
01:35:07.320 | that is a thing that should be considered,
01:35:09.160 | but there is, again, everyone,
01:35:11.440 | the road to hell is paved in good intentions.
01:35:13.560 | And there's always unforeseen circumstances, you know,
01:35:18.560 | outcomes, externalities, if you're trying to adopt a thing,
01:35:21.440 | even if you do it in the very best of faith.
01:35:23.600 | - But you can learn lessons of history.
01:35:25.040 | - If you can run some Sims on it first, absolutely.
01:35:28.600 | - But also there's certain aspects of a system,
01:35:31.000 | as we've learned through history,
01:35:32.240 | that do better than others.
01:35:34.000 | Like, for example, don't have a dictator.
01:35:36.120 | So like, if I were to create this bot army,
01:35:39.640 | it's not good for me to have full control over it.
01:35:42.920 | Because in the beginning,
01:35:44.000 | I might have a good understanding of what's good and not.
01:35:46.720 | But over time, that starts to get deviated,
01:35:49.200 | 'cause I'll get annoyed at some assholes,
01:35:50.720 | and I'll think, okay,
01:35:51.920 | wouldn't it be nice to get rid of those assholes?
01:35:54.040 | But then that power starts getting to your head,
01:35:55.680 | you become corrupted.
01:35:56.880 | That's basic human nature.
01:35:58.080 | So distribute the power somehow.
01:35:59.960 | - We need a love botnet on a DAO.
01:36:04.480 | A DAO love botnet.
01:36:07.800 | - Yeah, but, and without a leader.
01:36:10.040 | Like without-
01:36:10.880 | - Exactly, a distributed, right.
01:36:12.360 | But yeah, without any kind of centralized-
01:36:14.320 | - Yeah, without even, you know,
01:36:16.120 | basically is the more control,
01:36:17.520 | the more you can decentralize the control of a thing
01:36:21.440 | to people, you know, but the balance-
01:36:24.880 | - But then you still need the ability to coordinate,
01:36:26.720 | because that's the issue when if something is too,
01:36:29.440 | you know, that's really, to me, like the culture wars,
01:36:32.440 | the bigger war we're dealing with is actually
01:36:34.640 | between the, like the sort of the,
01:36:38.320 | I don't know what even the term is for it,
01:36:39.600 | but like centralization versus decentralization.
01:36:42.000 | That's the tension we're seeing.
01:36:44.120 | Power and control by a few versus completely distributed.
01:36:48.240 | And the trouble is if you have a fully centralized thing,
01:36:50.920 | then you're at risk of tyranny, you know,
01:36:52.720 | Stalin type things can happen, or completely distributed.
01:36:56.880 | Now you're at risk of complete anarchy and chaos
01:36:58.560 | where you can't even coordinate to like on, you know,
01:37:01.160 | when there's like a pandemic or anything like that.
01:37:03.280 | So it's like, what is the right balance to strike
01:37:05.840 | between these two structures?
01:37:08.120 | - Can't Moloch really take hold
01:37:09.720 | in a fully decentralized system?
01:37:11.080 | That's the one of the dangers too.
01:37:12.560 | - Yes.
01:37:13.720 | - So you're very vulnerable to Moloch.
01:37:14.560 | - So the dictator can commit huge atrocities,
01:37:17.760 | but they can also make sure the infrastructure works
01:37:23.240 | and the trains run on time, is what I'm talking about.
01:37:23.240 | - They have that God's eye view, at least.
01:37:24.880 | They have the ability to create like laws and rules
01:37:27.360 | that like force coordination, which stops Moloch.
01:37:30.960 | But then you're vulnerable to that dictator
01:37:33.280 | getting infected with like this,
01:37:34.680 | with some kind of psychopathy type thing.
01:37:37.000 | - What's reverse Moloch?
01:37:39.480 | - So great question.
01:37:40.640 | So that's where, so I've been working on this series.
01:37:45.640 | It's been driving me insane for the last year and a half.
01:37:48.320 | I did the first one a year ago.
01:37:50.120 | I can't believe it's nearly been a year.
01:37:52.120 | The second one, hopefully we're coming out in like a month.
01:37:54.800 | And my goal at the end of the series is to like present,
01:37:58.760 | 'cause basically I'm painting the picture
01:37:59.960 | of like what Moloch is and how it's affecting
01:38:02.320 | almost all these issues in our society
01:38:04.400 | and how it's, you know, driving.
01:38:06.480 | It's like kind of the generator function
01:38:08.000 | as people describe it of existential risk.
01:38:11.040 | And then at the end of that-
01:38:11.880 | - Wait, wait, the generator function of existential risk.
01:38:14.200 | So you're saying Moloch is sort of the engine
01:38:16.480 | that creates a bunch of X risks.
01:38:19.680 | - Yes, not all of them.
01:38:20.720 | Like, you know, a-
01:38:22.960 | - It's a cool phrase, generator function.
01:38:24.520 | - It's not my phrase, it's Daniel Schmachtenberger.
01:38:26.800 | - Oh, Schmachtenberger. - I got that from him.
01:38:28.120 | - Of course. - All of these ideas.
01:38:29.880 | - It's like all roads lead back to Daniel Schmachtenberger.
01:38:32.720 | - The dude is brilliant.
01:38:34.560 | He's really, really brilliant.
01:38:35.400 | - And after that it's Mark Twain.
01:38:37.800 | - But anyway, sorry, totally rude interruptions from me.
01:38:41.120 | - No, it's fine.
01:38:42.000 | So not all X risks.
01:38:43.080 | So like an asteroid technically isn't
01:38:45.440 | because it's just like this one big external thing.
01:38:49.440 | It's not like a competition thing going on,
01:38:51.520 | but, you know, synthetic bio, you know, bioweapons,
01:38:55.800 | that's one because everyone's incentivized to build,
01:38:58.600 | even for defense, you know, bad viruses,
01:39:01.040 | you know, just to threaten someone else, et cetera.
01:39:03.720 | Or AI technically, the race to AGI
01:39:05.760 | is kind of potentially a Moloch-y situation.
01:39:08.040 | But yeah, so if Moloch is this like generator function
01:39:14.560 | that's driving all of these issues over the coming century
01:39:17.560 | that might wipe us out, what's the inverse?
01:39:20.240 | And so far what I've gotten to is this character
01:39:24.040 | that I want to put out there called Win-Win.
01:39:26.600 | Because Moloch is the God of lose-lose ultimately.
01:39:28.600 | It masquerades as the God of win-lose,
01:39:30.200 | but in reality it's lose-lose.
01:39:31.720 | Everyone ends up worse off.
01:39:34.320 | So I was like, well, what's the opposite of that?
01:39:35.800 | It's Win-Win.
01:39:36.640 | And I was thinking for ages,
01:39:37.520 | like what's a good name for this character?
01:39:39.680 | And then the more I was like, okay, well,
01:39:42.360 | don't try and think through it logically.
01:39:44.640 | What's the vibe of Win-Win?
01:39:46.480 | And to me, like in my mind, Moloch is like,
01:39:49.160 | and I addressed it in the video, like it's red and black.
01:39:52.080 | It's kind of like very, you know,
01:39:54.760 | hyper-focused on its one goal, you must win.
01:39:57.480 | So Win-Win is kind of actually like these colors.
01:40:01.360 | It's like purple, turquoise.
01:40:04.080 | It loves games too.
01:40:06.640 | It loves a little bit of healthy competition,
01:40:08.320 | but constrained, like kind of like before.
01:40:10.640 | Knows how to ring fence zero-sum competition
01:40:12.760 | into like just the right amount,
01:40:14.880 | whereby its externalities can be controlled
01:40:17.400 | and kept positive.
01:40:19.120 | And then beyond that, it also loves cooperation,
01:40:21.240 | coordination, love, all these other things.
01:40:23.920 | But it's also kind of like mischievous.
01:40:25.880 | Like, you know, it will have a good time.
01:40:28.480 | It's not like kind of like boring, you know, like,
01:40:30.640 | oh God, it knows how to have fun.
01:40:33.920 | It can get down.
01:40:36.000 | But ultimately it's like unbelievably wise
01:40:39.760 | and it just wants the game to keep going.
01:40:41.800 | And I call it Win-Win.
01:40:44.600 | - That's a good like pet name, Win-Win.
01:40:46.960 | - Yes, I think the-
01:40:48.480 | - Win-Win.
01:40:49.320 | - Win-Win, right?
01:40:50.160 | And I think its formal name,
01:40:51.280 | when it has to do like official functions, is Omnia.
01:40:55.080 | - Omnia.
01:40:55.920 | - Yeah.
01:40:56.760 | - From like omniscience, kind of, why Omnia?
01:40:59.560 | You just like Omnia?
01:41:00.400 | - Just like Omni-Win.
01:41:01.240 | - Omni-Win.
01:41:02.080 | - But I'm open to suggestions.
01:41:02.960 | So like, you know, and this is-
01:41:04.160 | - I like Omnia, yeah.
01:41:05.160 | - Yeah, like that's like-
01:41:06.000 | - But there is an angelic kind of sense to Omnia though.
01:41:08.800 | So Win-Win is more fun.
01:41:10.040 | - Exactly.
01:41:10.880 | - So it's more like, it embraces the fun aspect.
01:41:15.880 | I mean, there is something about sort of,
01:41:18.640 | there's some aspect to Win-Win interactions
01:41:23.560 | that requires embracing the chaos of the game
01:41:31.640 | and enjoying the game itself.
01:41:33.800 | I don't know, I don't know what that is.
01:41:35.240 | That's almost like a Zen-like appreciation
01:41:37.720 | of the game itself,
01:41:39.400 | not optimizing for the consequences of the game.
01:41:42.520 | - Right, well, it's recognizing the value of competition
01:41:46.160 | in of itself, it's not like about winning,
01:41:48.720 | it's about you enjoying the process of having a competition
01:41:51.520 | and not knowing whether you're gonna win
01:41:52.680 | or lose this little thing.
01:41:53.920 | But then also being aware that, you know,
01:41:56.600 | what's the boundary?
01:41:57.440 | How big do I want competition to be?
01:41:59.080 | Because one of the reason why Moloch is doing so well now
01:42:02.720 | in our civilization is because we haven't been able
01:42:05.600 | to ring fence competition.
01:42:07.000 | And so it's just having all these negative externalities
01:42:10.600 | and we've completely lost control of it.
01:42:12.600 | I think my guess is, and now we're getting really like,
01:42:18.640 | you know, metaphysical technically,
01:42:20.880 | but I think we'll be in a more interesting universe
01:42:26.800 | if we have one that has both pure cooperation,
01:42:29.720 | you know, lots of cooperation
01:42:31.360 | and some pockets of competition
01:42:33.000 | than one that's purely cooperation entirely.
01:42:36.240 | Like it's good to have some little zero-sum-ness bits,
01:42:39.360 | but I don't know that fully
01:42:41.520 | and I'm not qualified as a philosopher to know that.
01:42:44.120 | - And that's what reverse Moloch,
01:42:45.480 | so this kind of win-win creature system
01:42:49.640 | is an antidote to the Moloch system.
01:42:52.160 | - Yes.
01:42:53.600 | And I don't know how it's gonna do that.
01:42:56.360 | - But it's good to kind of try to start
01:43:00.000 | to formulate different ideas,
01:43:01.480 | different frameworks of how we think about that.
01:43:03.840 | - Exactly.
01:43:04.680 | - At the small scale of a collection of individuals
01:43:07.240 | and a large scale of a society.
01:43:09.120 | - Exactly.
01:43:09.960 | It's a meme, I think it's an example of a good meme.
01:43:13.360 | And I'm open, I'd love to hear feedback from people
01:43:15.560 | if they think it's, you know,
01:43:16.720 | they have a better idea or it's not, you know,
01:43:18.320 | but it's the direction of memes that we need to spread,
01:43:21.640 | this idea of like, look for the win-wins in life.
01:43:25.160 | - Well, on the topic of beauty filters,
01:43:27.240 | so in that particular context where Moloch creates
01:43:31.800 | negative consequences, you know,
01:43:35.160 | Dostoevsky said beauty will save the world.
01:43:37.160 | What is beauty anyway?
01:43:39.140 | It would be nice to just try to discuss
01:43:43.800 | what kind of thing we would like to converge towards
01:43:49.320 | in our understanding of what is beautiful.
01:43:52.260 | - So to me, I think something is beautiful
01:43:59.200 | when it can't be reduced down to easy metrics.
01:44:04.200 | Like if you think of a tree, what is it about a tree,
01:44:10.720 | like a big ancient beautiful tree, right?
01:44:12.280 | What is it about it that we find so beautiful?
01:44:14.680 | It's not, you know, the sweetness of its fruit
01:44:19.680 | or the value of its lumber.
01:44:23.480 | It's this entirety of it that is,
01:44:28.480 | there's these immeasurable qualities.
01:44:30.880 | It's like almost like a qualia of it.
01:44:32.920 | That's both, like it walks this fine line between,
01:44:37.360 | well, it's got lots of patternicity,
01:44:38.740 | but it's not overly predictable.
01:44:41.080 | You know, again, it walks this fine line
01:44:42.200 | between order and chaos.
01:44:43.240 | It's a very highly complex system.
01:44:45.500 | And you know, you can't, it's evolving over time.
01:44:51.440 | You know, the definition of a complex versus,
01:44:53.280 | and this is another Schmachtenberger thing,
01:44:54.760 | you know, a complex versus a complicated system.
01:44:57.660 | A complicated system can be sort of broken down into bits,
01:45:00.600 | understood and then put back together.
01:45:02.240 | A complex system is kind of like a black box.
01:45:05.000 | It does all this crazy stuff, but if you take it apart,
01:45:08.200 | you can't put it back together again
01:45:09.340 | because there's all these intricacies.
01:45:11.800 | And also very importantly, like the sum of the parts,
01:45:15.000 | sorry, the whole is much greater
01:45:16.600 | than the sum of the parts.
01:45:17.880 | And that's where the beauty lies, I think.
01:45:21.400 | And I think that extends to things like art as well.
01:45:23.560 | Like there's something immeasurable about it.
01:45:27.640 | There's something we can't break down to a narrow metric.
01:45:30.000 | - Does that extend to humans, you think?
01:45:31.840 | - Yeah, absolutely.
01:45:33.840 | - So how can Instagram reveal that kind of beauty,
01:45:37.440 | the complexity of a human being?
01:45:39.320 | - Good question.
01:45:41.160 | - And this takes us back to dating sites and Goodreads,
01:45:44.880 | I think.
01:45:45.720 | - Very good question.
01:45:47.800 | I mean, well, I know what it shouldn't do.
01:45:50.320 | It shouldn't try and like, right now, you know,
01:45:53.320 | one of the, I was talking to like a social media expert
01:45:56.240 | recently 'cause I was like, ugh, I hate-
01:45:58.320 | - Is there such a thing as a social media expert?
01:45:59.960 | - Oh yeah, there are like agencies out there
01:46:02.080 | that you can like outsource.
01:46:03.280 | 'Cause I'm thinking about working with one to like,
01:46:06.040 | I wanna start a podcast.
01:46:08.560 | - Yes.
01:46:09.440 | You should, you should have done it a long time ago.
01:46:11.920 | - Working on it.
01:46:12.920 | It's gonna be called Win-Win.
01:46:14.760 | And it's gonna be about this like positive stuff.
01:46:17.480 | And the thing that, you know, they always come back
01:46:21.000 | and say, it's like, well, you need to like figure out
01:46:23.000 | what your thing is.
01:46:24.080 | You know, you need to narrow down what your thing is
01:46:26.440 | and then just follow that.
01:46:28.000 | Have a like a sort of a formula
01:46:31.080 | because that's what people want.
01:46:32.200 | They wanna know that they're coming back to the same thing.
01:46:34.320 | And that's the advice on YouTube, Twitter, you name it.
01:46:37.480 | And that's why, and the trouble with that
01:46:40.240 | is that it's a complexity reduction.
01:46:42.480 | And generally speaking, you know,
01:46:44.120 | complexity reduction is bad.
01:46:45.400 | It's making things more, it's an oversimplification.
01:46:48.040 | Not that simplification is a bad thing,
01:46:50.240 | but when you're trying to take, you know,
01:46:55.520 | what is social media doing?
01:46:56.560 | It's trying to like encapsulate the human experience
01:47:00.120 | and put it into digital form and commodify it to an extent.
01:47:05.720 | And so you do that, you compress people down
01:47:07.600 | into these like narrow things.
01:47:09.520 | And that's why I think it's kind of ultimately
01:47:12.680 | fundamentally incompatible
01:47:13.920 | with at least my definition of beauty.
01:47:15.240 | - Yeah, it's interesting because there is some sense
01:47:18.120 | in which a simplification sort of in the Einstein
01:47:23.120 | kind of sense of a really complex idea,
01:47:28.040 | a simplification in a way that still captures
01:47:30.280 | some core power of an idea of a person is also beautiful.
01:47:36.120 | And so maybe it's possible for social media to do that.
01:47:44.360 | A presentation, sort of a sliver, a slice,
01:47:44.360 | a look into a person's life
01:47:46.360 | that reveals something real about them,
01:47:50.520 | but in a simple way, in a way that can be displayed
01:47:53.240 | graphically or through words.
01:47:55.600 | Some way, I mean, in some way,
01:47:57.680 | Twitter can do that kind of thing.
01:47:59.960 | A very few set of words can reveal
01:48:03.080 | the intricacies of a person.
01:48:04.920 | Of course, the viral machine that spreads those words
01:48:09.920 | often results in people taking the thing out of context.
01:48:14.840 | People often don't read tweets in the context
01:48:18.920 | of the human being that wrote them.
01:48:20.800 | The full history of the tweets they've written,
01:48:24.080 | the education level, the humor level,
01:48:26.560 | the worldview they're playing around with,
01:48:30.440 | all that context is forgotten
01:48:31.760 | and people just see the different words.
01:48:33.640 | So that can lead to trouble.
01:48:35.520 | But in a certain sense, if you do take it in context,
01:48:39.840 | it reveals some kind of quirky little beautiful idea
01:48:43.240 | or a profound little idea from that particular person
01:48:47.200 | that shows something about that person.
01:48:48.560 | So in that sense, Twitter can be more successful,
01:48:51.320 | if we're talking about Moloch,
01:48:53.080 | at driving a better kind of incentive.
01:48:56.080 | - Yeah, I mean, how they can,
01:48:59.600 | like if we were to rewrite,
01:49:02.040 | is there a way to rewrite the Twitter algorithm
01:49:05.480 | so that it stops being the like,
01:49:08.800 | the fertile breeding ground of the culture wars?
01:49:12.480 | Because that's really what it is.
01:49:13.720 | It's, I mean, maybe I'm giving it,
01:49:17.560 | Twitter too much power,
01:49:19.400 | but just the more I looked into it
01:49:21.800 | and I had conversations with Tristan Harris
01:49:25.640 | from the Center for Humane Technology
01:49:27.680 | and he explained it as like,
01:49:30.880 | Twitter is where you have this amalgam of human culture
01:49:34.160 | and then this terribly designed algorithm
01:49:36.280 | that amplifies the craziest people
01:49:38.480 | and the angriest, most divisive takes and amplifies them.
01:49:44.840 | And then the media, the mainstream media,
01:49:47.800 | because all the journalists are also on Twitter,
01:49:49.920 | they then are informed by that.
01:49:52.920 | And so they draw out the stories they can
01:49:55.120 | from this already like very boiling lava of rage.
01:50:00.760 | And then spread that to their millions and millions of people
01:50:04.440 | who aren't even on Twitter.
01:50:05.800 | And so I honestly, I think if I could press a button,
01:50:10.840 | turn them off, I probably would at this point,
01:50:13.640 | 'cause I just don't see a way of being compatible
01:50:16.160 | with healthiness, but that's not gonna happen.
01:50:18.600 | And so at least one way to like stem the tide
01:50:23.160 | and make it less Moloch-y would be to change the model:
01:50:30.040 | at least if like it was on a subscription model,
01:50:31.960 | then it's now not optimizing for impressions,
01:50:37.040 | 'cause basically what it wants is for people
01:50:38.320 | to keep coming back as often as possible.
01:50:40.240 | That's how they get paid, right?
01:50:42.040 | Every time an ad gets shown to someone
01:50:43.760 | and the way to do that is to get people
01:50:44.680 | constantly refreshing their feed.
01:50:46.520 | So you're trying to encourage addictive behaviors.
01:50:49.400 | Whereas if someone, if they moved on
01:50:52.320 | to at least a subscription model,
01:50:53.960 | then they're getting the money either way,
01:50:56.960 | whether someone comes back to the site once a month
01:50:59.160 | or 500 times a month, they get the same amount of money.
01:51:02.080 | So now that takes away that incentive to use technology,
01:51:06.440 | to build, to design an algorithm
01:51:08.200 | that is maximally addictive.
01:51:09.600 | That would be one way, for example.
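A back-of-the-envelope illustration of the incentive difference described above, with made-up prices rather than any platform's real figures: under an ad model, revenue scales with every additional visit, so the platform is rewarded for maximizing how often people come back, while under a subscription, revenue is flat in visits.

```python
# Toy revenue comparison; the ad rates and monthly fee are invented placeholders.
def ad_revenue(visits_per_month, ads_per_visit=3, revenue_per_ad=0.002):
    # Ad model: paid per impression, so revenue grows with every extra visit.
    return visits_per_month * ads_per_visit * revenue_per_ad

def subscription_revenue(visits_per_month, monthly_fee=8.0):
    # Subscription model: same payout whether the user shows up once or 500 times.
    return monthly_fee

for visits in (1, 30, 500):
    print(f"{visits:>3} visits/month -> ads ${ad_revenue(visits):6.2f}, "
          f"subscription ${subscription_revenue(visits):5.2f}")
```

Only the first curve gets steeper the more compulsively people refresh, which is the addictiveness incentive being discussed.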
01:51:12.520 | - Yeah, but you still want people to,
01:51:14.640 | yeah, I just feel like that just slows down,
01:51:17.280 | creates friction in the virality of things.
01:51:20.960 | - But that's good.
01:51:22.280 | We need to slow down virality.
01:51:24.440 | - It's good, it's one way.
01:51:26.600 | Virality is Moloch, to be clear.
01:51:28.960 | - So Moloch is always negative then?
01:51:34.320 | - Yes, by definition.
01:51:36.520 | - Yes.
01:51:37.360 | - Competition. - But then I disagree with you.
01:51:38.200 | - Competition is not always negative.
01:51:39.440 | Competition is neutral.
01:51:40.520 | - I disagree with you that all virality is negative then,
01:51:44.200 | is Moloch then.
01:51:45.960 | Because it's a good intuition
01:51:49.520 | 'cause we have a lot of data on virality being negative.
01:51:52.920 | But I happen to believe that the core of human beings,
01:51:57.120 | so most human beings want to be good
01:52:00.400 | more than they want to be bad to each other.
01:52:03.400 | And so I think it's possible.
01:52:05.720 | It might just be harder to engineer
01:52:07.560 | systems that enable virality,
01:52:10.560 | but it's possible to engineer systems
01:52:13.880 | that enable virality
01:52:15.720 | where the kind of stuff that rises to the top
01:52:19.160 | is things that are positive.
01:52:21.400 | And positive not like la la positive,
01:52:24.160 | it's more like win-win,
01:52:25.840 | meaning a lot of people need to be challenged.
01:52:28.240 | - Wise things, yes.
01:52:29.640 | - You grow from it, it might challenge you,
01:52:31.480 | you might not like it, but you ultimately grow from it.
01:52:34.720 | - And ultimately bring people together
01:52:36.720 | as opposed to tear them apart.
01:52:38.480 | I deeply want that to be true.
01:52:40.560 | And I very much agree with you
01:52:42.320 | that people at their core are on average good,
01:52:44.600 | and care for each other, as opposed to not.
01:52:47.280 | I think it's actually a very small percentage of people
01:52:50.920 | are truly wanting to do just destructive malicious things.
01:52:54.440 | Most people are just trying to win their own little game
01:52:56.520 | and they don't mean to be,
01:52:57.880 | they're just stuck in this badly designed system.
01:53:00.280 | That said, the current structure, yes,
01:53:04.640 | is the current structure means that virality
01:53:09.160 | is optimized towards Moloch.
01:53:10.800 | That doesn't mean there aren't exceptions.
01:53:12.400 | Sometimes positive stories do go viral
01:53:14.080 | and I think we should study them.
01:53:15.160 | I think there should be a whole field of study
01:53:17.080 | into understanding, identifying memes
01:53:20.360 | that above a certain threshold of the population
01:53:24.040 | agree is a positive, happy, bringing people together meme,
01:53:27.360 | the kind of thing that brings families together
01:53:29.880 | that would normally argue about cultural stuff
01:53:31.720 | at the table, at the dinner table.
01:53:34.800 | Identify those memes and figure out what it was,
01:53:37.440 | what was the ingredient that made them spread that day.
01:53:40.480 | - And also like, not just like happiness
01:53:44.760 | and connection between humans,
01:53:45.960 | but connection between humans in other ways
01:53:49.600 | that enables like productivity, like cooperation,
01:53:52.720 | solving difficult problems and all those kinds of stuff.
01:53:56.240 | So it's not just about let's be happy
01:53:59.040 | and have a fulfilling lives.
01:54:00.840 | It's also like, let's build cool shit.
01:54:03.000 | - Yeah.
01:54:03.840 | Which is the spirit of collaboration,
01:54:05.120 | which is deeply anti-Moloch.
01:54:06.480 | It's not using competition.
01:54:09.320 | It's like, Moloch hates collaboration and coordination
01:54:13.160 | and people working together.
01:54:14.600 | And that's, again, like the internet started out as that
01:54:18.080 | and it could have been that,
01:54:20.680 | but because of the way it was sort of structured
01:54:23.360 | in terms of, you know, very lofty ideals,
01:54:26.880 | they wanted everything to be open source,
01:54:28.960 | open source and also free.
01:54:30.600 | And, but they needed to find a way to pay the bills anyway,
01:54:32.800 | because they were still building this
01:54:33.840 | on top of our old economics system.
01:54:36.280 | And so the way they did that
01:54:37.600 | was through third-party advertisement.
01:54:40.240 | But that meant that things were very decoupled.
01:54:42.760 | You know, you've got this third-party interest,
01:54:45.600 | which means that you're then like,
01:54:47.480 | people are having to optimize for that.
01:54:48.960 | And that is, you know, the actual consumer
01:54:51.320 | is actually the product,
01:54:52.960 | not the person you're making the thing for.
01:54:56.600 | In the end,
01:54:57.480 | you start making the thing for the advertiser.
01:54:59.840 | And so that's why it then like breaks down.
01:55:02.000 | Yeah, like it's, there's no clean solution to this.
01:55:07.440 | And I, it's a really good suggestion by you actually
01:55:11.160 | to like figure out how we can optimize virality
01:55:16.080 | for positive sum topics.
01:55:19.560 | - I shall be the general of the love bot army.
01:55:22.700 | - Distributed.
01:55:26.200 | - Distributed, distributed.
01:55:27.440 | No, okay, yeah.
01:55:28.400 | The power, just even in saying that,
01:55:30.240 | the power already went to my head.
01:55:32.240 | No, okay.
01:55:33.280 | You've talked about quantifying your thinking.
01:55:35.920 | We've been talking about this,
01:55:37.000 | sort of a game theoretic view on life
01:55:39.320 | and putting probabilities behind estimates.
01:55:42.280 | Like if you think about different trajectories
01:55:44.280 | you can take through life,
01:55:45.680 | just actually analyzing life in game theoretic way,
01:55:48.360 | like your own life, like personal life.
01:55:50.840 | I think you've given an example
01:55:52.820 | that you had an honest conversation with Igor
01:55:54.680 | about like how long is this relationship gonna last?
01:55:57.840 | So similar to our sort of marriage problem
01:56:00.160 | kind of discussion,
01:56:01.560 | having an honest conversation about the probability
01:56:05.080 | of things that we sometimes are a little bit too shy
01:56:08.600 | or scared to think of in a probabilistic terms.
01:56:11.840 | Can you speak to that kind of way of reasoning
01:56:15.040 | about the good and the bad of that?
01:56:17.040 | Can you do this kind of thing with human relations?
01:56:20.880 | - Yeah, so the scenario you're talking about,
01:56:24.120 | it was like-
01:56:25.040 | - Yeah, tell me about that scenario.
01:56:27.680 | - I think it was about a year into our relationship
01:56:30.920 | and we were having a fairly heavy conversation
01:56:34.520 | because we were trying to figure out
01:56:35.440 | whether or not I was gonna sell my apartment.
01:56:37.720 | He had already moved in,
01:56:40.160 | but I think we were just figuring out
01:56:41.760 | what like our long-term plans would be.
01:56:43.640 | Should we buy a place together, et cetera.
01:56:46.360 | - When you guys are having that conversation,
01:56:47.720 | are you like drunk out of your mind on wine
01:56:49.800 | or is he sober and you're actually having a serious-
01:56:52.760 | - I think I'm sober.
01:56:53.680 | - How do you get to that conversation?
01:56:54.960 | 'Cause most people are kind of afraid
01:56:56.160 | to have that kind of serious conversation.
01:56:58.760 | - Well, so our relationship was very,
01:57:01.520 | well, first of all, we were good friends
01:57:03.360 | for a couple of years before we even got romantic.
01:57:06.760 | And when we did get romantic,
01:57:12.480 | it was very clear that this was a big deal.
01:57:15.800 | It wasn't just like another,
01:57:17.680 | it wasn't a random thing.
01:57:20.720 | - So the probability of it being a big deal was high.
01:57:22.880 | - Was already very high.
01:57:24.200 | And then we'd been together for a year
01:57:26.200 | and it had been pretty golden and wonderful.
01:57:28.800 | So there was a lot of foundation already
01:57:32.640 | where we felt very comfortable
01:57:33.840 | having a lot of frank conversations.
01:57:35.200 | But this has always been Igor's MO, much more than mine.
01:57:38.440 | He was always from the outset,
01:57:40.280 | just in a relationship,
01:57:42.600 | radical transparency and honesty is the way
01:57:45.320 | because the truth is the truth
01:57:47.080 | whether you want to hide it or not.
01:57:48.880 | It will come out eventually.
01:57:50.320 | And if you aren't able to accept difficult things yourself,
01:57:55.320 | then how could you possibly expect
01:57:59.800 | to be the most integral version of yourself?
01:57:59.800 | The relationship needs this bedrock of honesty
01:58:04.600 | as a foundation more than anything.
01:58:06.600 | - Yeah, that's really interesting,
01:58:07.720 | but I would like to push against some of those ideas,
01:58:09.960 | but let's-
01:58:10.800 | - Okay, all right.
01:58:11.640 | - Down the line, yes, throw them up.
01:58:13.720 | I just rudely interrupt.
01:58:15.280 | - No, it's fine.
01:58:16.760 | And so, we'd been about together for a year
01:58:19.840 | and things were good.
01:58:20.680 | And we were having this hard conversation
01:58:23.440 | and then he was like, well, okay,
01:58:25.280 | what's the likelihood that we're going to be together
01:58:27.280 | in three years then?
01:58:28.240 | 'Cause I think it was roughly a three-year time horizon.
01:58:31.000 | And I was like, ooh, interesting.
01:58:32.880 | And then we were like, actually wait,
01:58:34.240 | before you say it out loud,
01:58:35.080 | let's both write down our predictions formally.
01:58:37.720 | 'Cause we'd been like,
01:58:38.560 | we were just getting into like effective altruism
01:58:40.440 | and rationality at the time,
01:58:41.840 | which is all about making formal predictions
01:58:43.880 | as a means of measuring your own,
01:58:47.360 | well, your own foresight essentially in a quantified way.
01:58:53.680 | So, we both wrote down our percentages
01:58:55.480 | and we also did a one-year prediction
01:58:58.640 | and a 10-year one as well.
01:58:59.560 | So, we got percentages for all three
01:59:01.960 | and then we showed each other.
01:59:03.880 | And I remember having this moment of like, ooh,
01:59:06.240 | 'cause for the 10-year one, I was like, ooh,
01:59:07.640 | well, I mean, I love him a lot,
01:59:09.600 | but like a lot can happen in 10 years, you know?
01:59:11.880 | And we've only been together for, you know,
01:59:14.280 | so I was like, I think it's over 50%,
01:59:16.240 | but it's definitely not 90%.
01:59:17.920 | And I remember like wrestling, I was like,
01:59:19.240 | oh, but I don't want him to be hurt.
01:59:20.240 | I don't want him to, you know,
01:59:21.400 | I don't want to give a number lower than his.
01:59:22.680 | And I remember thinking, I was like, uh-uh, don't game it.
01:59:25.280 | This is an exercise in radical honesty.
01:59:28.120 | So, just give your real percentage.
01:59:29.520 | And I think mine was like 75%.
01:59:31.280 | And then we showed each other
01:59:32.400 | and luckily we were fairly well aligned.
01:59:37.200 | But honestly, even if we weren't-
01:59:38.640 | - 20%.
01:59:39.480 | - Huh?
01:59:40.320 | It definitely would have,
01:59:42.000 | if his had been consistently lower than mine,
01:59:45.520 | that would have rattled me for sure.
01:59:48.120 | Whereas if it had been the other way around,
01:59:49.880 | I think he's just kind of like a water off the duck's back
01:59:52.480 | type of guy.
01:59:53.320 | Be like, okay, well, all right, we'll figure this out.
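For readers wondering what "measuring your own foresight in a quantified way" can look like mechanically, here is a minimal sketch using the Brier score, a standard way to grade probability forecasts once outcomes are known (0 is perfect, lower is better). The predictions below are invented examples, not Liv and Igor's actual numbers.

```python
# Grading probability forecasts with the Brier score; the entries are made up.
predictions = [
    ("still together in 1 year",     0.90, True),
    ("still together in 3 years",    0.75, True),
    ("apartment sells above asking", 0.40, False),
]

def brier(prob, outcome):
    """Squared error between the stated probability and what actually happened."""
    return (prob - (1.0 if outcome else 0.0)) ** 2

scores = [brier(p, o) for _, p, o in predictions]
for (label, p, o), s in zip(predictions, scores):
    print(f"{label:30s} p={p:.2f} outcome={o!s:5s} brier={s:.3f}")
print("mean Brier score:", round(sum(scores) / len(scores), 3))
```

Tracking this score over many predictions is what turns radical honesty about forecasts into something you can actually calibrate against.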
01:59:55.200 | - Well, did you guys provide error bars on the estimate?
01:59:57.800 | Like the love them on-
01:59:58.680 | - They came built in.
01:59:59.560 | We didn't give formal plus or minus error bars.
02:00:02.200 | I didn't draw any or anything like that.
02:00:04.120 | - I guess that's the question I have is,
02:00:05.720 | did you feel informed enough to make such decisions?
02:00:10.720 | 'Cause I feel like if I were to do
02:00:14.680 | this kind of thing rigorously,
02:00:16.200 | I would want some data.
02:00:19.080 | I would want to say one of the assumptions you have
02:00:23.120 | is you're not that different from other relationships.
02:00:25.400 | - Right.
02:00:26.240 | - And so I wanna have some data about the way-
02:00:29.360 | - You want the base rates.
02:00:30.720 | - Yeah.
02:00:31.560 | And also actual trajectories of relationships.
02:00:34.400 | I would love to have time series data
02:00:39.080 | about the ways that relationships fall apart or prosper,
02:00:42.840 | how they collide with different life events,
02:00:46.720 | losses, job changes, moving,
02:00:48.840 | both partners find jobs, only one has a job.
02:00:54.520 | I want that kind of data
02:00:56.320 | and how often the different trajectories change in life.
02:01:01.280 | How informative is your past to your future?
02:01:04.400 | That's a whole thing.
02:01:05.640 | Can you look at my life and have a good prediction
02:01:09.760 | about in terms of my characteristics and my relationships
02:01:13.560 | of what that's gonna look like in the future or not?
02:01:15.880 | I don't even know the answer to that question.
02:01:17.200 | I'll be very ill-informed in terms of making the probability.
02:01:20.720 | I would be far, yeah, I just would be under-informed.
02:01:25.680 | I would be under-informed.
02:01:26.680 | I'll be over-biasing to my prior experiences, I think.
02:01:31.240 | - Right, but as long as you're aware of that
02:01:33.120 | and you're honest with yourself,
02:01:34.400 | and you're honest with the other person,
02:01:35.760 | say, "Look, I have really wide error bars on this
02:01:37.640 | for the following reasons," that's okay.
02:01:40.440 | I still think it's better
02:01:41.280 | than not trying to quantify it at all
02:01:43.200 | if you're trying to make
02:01:44.040 | really major irreversible life decisions.
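One hedged way to fold the base-rate worry above into a single number, sketched with entirely invented figures: treat a population base rate as a Beta prior with a chosen weight, encode your own relationship-specific evidence (crudely) as observations, and read off the posterior mean. The prior weight is the knob that says how much you think other couples' trajectories tell you about your own.

```python
# Blending a population base rate with personal evidence via a Beta prior.
# Every number here is invented purely for illustration.
base_rate = 0.45     # hypothetical: share of comparable couples still together at 10 years
prior_weight = 20.0  # how many "pseudo-observations" of trust the base rate gets
alpha0 = base_rate * prior_weight
beta0 = (1 - base_rate) * prior_weight

# Personal evidence, crudely encoded as favourable vs. unfavourable observations
# (e.g. a strong prior friendship, a good first year, one rough patch).
favourable, unfavourable = 6, 1

posterior_mean = (alpha0 + favourable) / (alpha0 + beta0 + favourable + unfavourable)
print(f"base rate alone:        {base_rate:.2f}")
print(f"with personal evidence: {posterior_mean:.2f}")
```

Wide error bars translate into a small prior weight and sparse evidence; the point is not precision but making the assumptions explicit enough to argue with.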
02:01:46.760 | - And I feel also the romantic nature of that question.
02:01:49.480 | For me personally, I try to live my life
02:01:52.800 | thinking it's very close to 100%.
02:01:55.440 | Like allowing myself, actually,
02:01:58.400 | this is the difficulty of this,
02:02:00.000 | is allowing myself to think differently,
02:02:03.560 | I feel like has a psychological consequence.
02:02:06.520 | That's one of my pushbacks against radical honesty,
02:02:09.720 | is this one particular perspective on-
02:02:14.280 | - So you're saying you would rather give
02:02:16.320 | a falsely high percentage to your partner?
02:02:20.200 | - Going back to the wise sage- - In order to sort of
02:02:22.640 | create this additional optimism.
02:02:24.640 | - Hellmuth. - Yes.
02:02:26.280 | - Of fake it till you make it,
02:02:29.440 | the positive, the power of positive thinking.
02:02:31.200 | - #positivity, yeah. - Yeah, #.
02:02:33.000 | - Well, so that, and this comes back
02:02:36.760 | to this idea of useful fictions, right?
02:02:39.800 | And I agree, I don't think there's a clear answer to this,
02:02:42.960 | and I think it's actually quite subjective.
02:02:44.280 | Some people this works better for than others.
02:02:46.600 | You know, to be clear, Igor and I weren't doing
02:02:50.160 | this formal prediction in earnest.
02:02:52.000 | Like we did it very much tongue in cheek.
02:02:55.560 | It wasn't like we were gonna make,
02:02:57.400 | I don't think it even would have drastically changed
02:03:00.160 | what we decided to do, even.
02:03:02.200 | We kinda just did it more as a fun exercise.
02:03:04.400 | - But the consequence of that fun exercise,
02:03:06.920 | you really actually kinda, there was a deep honesty to it.
02:03:09.920 | - Exactly, it was a deep, and it was just like
02:03:12.160 | this moment of reflection, I'm like, oh wow,
02:03:13.840 | I actually have to think through this quite critically,
02:03:16.720 | and so on.
02:03:17.560 | And it's also what was interesting was,
02:03:22.000 | I got to check in with what my desires were.
02:03:26.000 | So there was one thing of what my actual prediction is,
02:03:28.600 | but what are my desires, and could these desires
02:03:30.440 | be affecting my predictions, and so on.
02:03:32.480 | And that's a method of rationality,
02:03:34.960 | and I personally don't think it loses anything.
02:03:37.080 | It didn't take any of the magic away from our relationship,
02:03:39.240 | quite the opposite.
02:03:40.640 | It brought us closer together, 'cause it was like
02:03:42.640 | we did this weird, fun thing that I appreciate
02:03:45.400 | a lot of people find quite strange.
02:03:47.920 | And I think it was somewhat unique in our relationship
02:03:51.880 | that both of us are very, we both love numbers,
02:03:54.880 | we both love statistics, we're both poker players.
02:03:57.320 | So this was kind of like our safe space anyway.
02:04:01.320 | For others, one partner really might not like
02:04:05.160 | that kind of stuff at all, in which case
02:04:06.400 | this is not a good exercise to do.
02:04:07.840 | I don't recommend it to everybody.
02:04:09.520 | But I do think there's, it's interesting sometimes
02:04:14.320 | to poke holes in the, probe at these things
02:04:18.920 | that we consider so sacred that we can't try
02:04:21.600 | to quantify them.
02:04:24.040 | Which is interesting, 'cause that's in tension
02:04:25.480 | with the idea of what we just talked about with beauty
02:04:27.400 | and what makes something beautiful,
02:04:28.680 | the fact that you can't measure everything about it.
02:04:30.960 | And perhaps something shouldn't be tried to,
02:04:32.920 | maybe it's wrong to completely try and value
02:04:36.360 | the utilitarian, put a utilitarian frame
02:04:39.160 | of measuring the utility of a tree in its entirety.
02:04:43.280 | I don't know, maybe we should, maybe we shouldn't.
02:04:44.680 | I'm ambivalent on that.
02:04:46.640 | But overall, people have too many biases.
02:04:52.600 | People are overly biased against trying to do
02:04:57.440 | a quantified cost-benefit analysis
02:04:59.840 | on really tough life decisions.
02:05:01.480 | They're like, "Oh, just go with your gut."
02:05:03.920 | It's like, well, sure, but guts, our intuitions
02:05:07.120 | are best suited for things that we've got
02:05:08.760 | tons of experience in.
02:05:10.360 | Then we can really trust on it,
02:05:11.880 | if it's a decision we've made many times.
02:05:13.280 | But if it's like, should I marry this person
02:05:16.000 | or should I buy this house over that house?
02:05:19.000 | You only make those decisions a couple of times
02:05:20.680 | in your life, maybe.
02:05:22.960 | - Well, I would love to know, there's a balance,
02:05:26.920 | probably it's a personal balance to strike:
02:05:29.240 | the amount of rationality you apply
02:05:33.120 | to a question versus the useful fiction,
02:05:38.120 | the fake it till you make it.
02:05:40.080 | For example, just talking to soldiers in Ukraine,
02:05:43.520 | you ask them, what's the probability of you winning,
02:05:49.160 | Ukraine winning?
02:05:50.160 | Almost everybody I talk to is 100%.
02:05:55.640 | - Wow.
02:05:56.840 | - And you listen to the experts, right?
02:05:58.960 | They say all kinds of stuff.
02:06:00.760 | - Right.
02:06:01.600 | - First of all, the morale there is higher than probably,
02:06:06.320 | and I've never been to a war zone before this,
02:06:09.880 | but I've read about many wars,
02:06:12.560 | and I think the morale in Ukraine is higher
02:06:14.920 | than almost any war I've read about.
02:06:17.320 | It's every single person in the country
02:06:19.360 | is proud to fight for their country.
02:06:21.720 | - Wow.
02:06:22.560 | - Everybody, not just soldiers, not everybody.
02:06:25.640 | - Why do you think that is,
02:06:26.560 | specifically more than in other wars?
02:06:28.640 | - I think because there's perhaps a dormant desire
02:06:36.680 | for the citizens of this country
02:06:39.600 | to find the identity of this country,
02:06:41.960 | because it's been going through this 30 year process
02:06:45.360 | of different factions and political bickering,
02:06:48.400 | and they haven't had, as they talk about,
02:06:50.960 | they haven't had their independence war.
02:06:52.600 | They say all great nations have had an independence war.
02:06:55.880 | They had to fight for their independence,
02:06:58.920 | for the discovery of the identity,
02:07:00.700 | of the core of the ideals that unify us,
02:07:03.200 | and they haven't had that.
02:07:04.600 | There's constantly been factions, there's been divisions,
02:07:07.220 | there's been pressures from empires,
02:07:09.840 | from United States and from Russia,
02:07:12.320 | from NATO and Europe,
02:07:14.080 | everybody telling them what to do.
02:07:15.680 | Now they wanna discover who they are,
02:07:17.720 | and there's that kind of sense that we're going to fight
02:07:21.240 | for the safety of our homeland,
02:07:23.280 | but we're also gonna fight for our identity.
02:07:25.640 | And that, on top of the fact that there's just,
02:07:30.640 | if you look at the history of Ukraine,
02:07:33.960 | and there's certain other countries like this,
02:07:36.760 | there are certain cultures that are feisty in their pride
02:07:43.040 | of being the citizens of that nation.
02:07:45.720 | Ukraine is that, Poland was that.
02:07:48.440 | You just look at history.
02:07:49.520 | In certain countries, you do not want to occupy.
02:07:52.880 | - Right.
02:07:53.720 | (Lex laughing)
02:07:54.560 | - I mean, both Stalin and Hitler
02:07:55.840 | talked about Poland in this way.
02:07:57.400 | They're like, "This is a big problem.
02:08:00.240 | If we occupy this land for prolonged periods of time,
02:08:02.400 | they're gonna be a pain in their ass.
02:08:04.400 | Like, they're not going to want to be occupied."
02:08:07.160 | And certain other countries are like, pragmatic.
02:08:09.540 | They're like, "Well, leaders come and go.
02:08:12.200 | I guess this is good."
02:08:13.360 | Ukraine just doesn't have,
02:08:15.680 | Ukrainians, throughout the 20th century,
02:08:19.280 | don't seem to be the kind of people
02:08:20.760 | that just sit calmly and let the "occupiers"
02:08:25.760 | impose their rules.
02:08:30.160 | - That's interesting, though,
02:08:31.000 | because you said it's always been under conflict
02:08:33.360 | and leaders have come and gone.
02:08:35.280 | - Yeah.
02:08:36.120 | - So you would expect them to actually be the opposite
02:08:37.120 | under that reasoning.
02:08:38.760 | - Because it's a very fertile land.
02:08:42.360 | It's great for agriculture.
02:08:43.480 | So a lot of people want to,
02:08:44.800 | I mean, I think they've developed this culture
02:08:46.520 | because they've constantly been occupied
02:08:47.920 | by different people, for different peoples.
02:08:51.040 | And so maybe there is something to that,
02:08:54.600 | where you've constantly had to feel,
02:08:58.320 | like, within the blood of the generations,
02:09:00.720 | there's the struggle against the man,
02:09:04.880 | against the imposition of rules,
02:09:07.880 | against oppression and all that kind of stuff,
02:09:09.440 | and that stays with them.
02:09:10.600 | So there's a will there.
02:09:13.720 | But a lot of other aspects are also part of it
02:09:16.440 | that has to do with the reverse Moloch kind of situation,
02:09:20.040 | where social media has definitely played a part of it.
02:09:23.160 | Also, different charismatic individuals
02:09:25.120 | have had to play a part.
02:09:27.220 | The fact that the president of the nation, Zelensky,
02:09:31.160 | stayed in Kiev during the invasion
02:09:35.160 | is a huge inspiration to them
02:09:37.720 | because most leaders, as you can imagine,
02:09:41.460 | when the capital of the nation is under attack,
02:09:44.560 | the wise thing, the smart thing,
02:09:46.820 | that the United States advised Zelensky to do
02:09:49.120 | is to flee and to be the leader of the nation
02:09:52.760 | from a distant place.
02:09:54.760 | He said, "Fuck that, I'm staying put."
02:09:57.280 | Everyone around him, there was a pressure to leave,
02:10:01.760 | and he didn't.
02:10:02.860 | And that, in those singular acts,
02:10:07.040 | really can unify a nation.
02:10:09.200 | There's a lot of people that criticize Zelensky
02:10:11.400 | within Ukraine before the war.
02:10:14.040 | He was very unpopular, even still.
02:10:17.240 | But they put that aside,
02:10:18.860 | especially that singular act of staying in the capital.
02:10:24.160 | Yeah, a lot of those kinds of things
02:10:27.160 | come together to create something within people.
02:10:31.660 | - These things always, of course,
02:10:35.640 | so how zoomed out of a view do you wanna take?
02:10:40.640 | Because, yeah, you describe it as an anti-Moloch thing
02:10:46.320 | that happened within Ukraine
02:10:48.000 | because it brought the Ukrainian people together
02:10:49.840 | in order to fight a common enemy.
02:10:51.800 | Maybe that's a good thing, maybe that's a bad thing.
02:10:53.400 | In the end, we don't know
02:10:54.240 | how this is all gonna play out, right?
02:10:56.740 | But if you zoom out to a global level,
02:11:01.200 | they're coming together to fight,
02:11:05.140 | and that could make the conflict larger.
02:11:12.240 | You know what I mean?
02:11:13.120 | I don't know what the right answer is here.
02:11:15.680 | It seems like a good thing that they came together,
02:11:17.600 | but we don't know how this is all gonna play out.
02:11:20.000 | If this all turns into nuclear war,
02:11:21.600 | we'll be like, "Okay, that was the bad, that was the."
02:11:23.440 | - Oh yeah, so I was describing the reverse Moloch
02:11:26.080 | for the local level.
02:11:27.440 | - Exactly, yeah.
02:11:28.280 | - Now, this is where the experts come in
02:11:31.240 | and they say, "Well, if you channel most of the resources
02:11:36.240 | "of the nation and the nation supporting Ukraine
02:11:40.500 | "into the war effort, are you not beating the drums of war
02:11:45.500 | "that is much bigger than Ukraine?"
02:11:47.480 | In fact, even the Ukrainian leaders
02:11:50.880 | are speaking of it this way.
02:11:52.920 | This is not a war between two nations.
02:11:55.840 | This is the early days of a world war
02:12:00.840 | if we don't play this correctly.
02:12:02.480 | - Yes.
02:12:03.320 | And we need cool heads from our leaders.
02:12:07.760 | - So from Ukraine's perspective,
02:12:09.560 | Ukraine needs to win the war.
02:12:12.680 | Because what winning the war means
02:12:15.440 | is coming to peace negotiations,
02:12:20.240 | an agreement that guarantees no more invasions.
02:12:24.360 | And then you make an agreement
02:12:25.640 | about what land belongs to who.
02:12:28.040 | - Right.
02:12:28.880 | - And you stop that.
02:12:30.240 | And basically, from their perspective,
02:12:34.120 | is you want to demonstrate to the rest of the world
02:12:36.600 | who's watching carefully, including Russia and China
02:12:39.640 | and different players on the geopolitical stage,
02:12:42.280 | that this kind of conflict is not going to be productive
02:12:46.000 | if you engage in it.
02:12:47.240 | So you wanna teach everybody a lesson,
02:12:49.240 | let's not do World War III.
02:12:50.920 | It's gonna be bad for everybody.
02:12:53.000 | It's a lose-lose.
02:12:55.040 | - Deep lose-lose.
02:12:56.600 | Doesn't matter.
02:12:57.440 | - And I think that's actually a correct...
02:13:04.360 | When I zoom out,
02:13:06.760 | 99% of what I think about
02:13:10.160 | is just individual human beings and human lives
02:13:12.320 | and just that war is horrible.
02:13:14.600 | But when you zoom out
02:13:15.600 | and think from a geopolitics perspective,
02:13:17.760 | we should realize that it's entirely possible
02:13:22.000 | that we will see a World War III in the 21st century.
02:13:26.400 | And this is like a dress rehearsal for that.
02:13:29.840 | And so the way we play this as a human civilization
02:13:34.840 | will define whether we do or don't have a World War III.
02:13:39.800 | How we discuss war, how we discuss nuclear war,
02:13:49.320 | the kind of leaders we elect and prop up,
02:13:54.320 | the kind of memes we circulate.
02:13:58.480 | Because you have to be very careful
02:13:59.920 | when you're being pro-Ukraine, for example,
02:14:04.480 | you have to realize that you're being...
02:14:06.560 | You are also indirectly feeding
02:14:11.520 | the ever-increasing military-industrial complex.
02:14:14.620 | So you have to be extremely careful
02:14:17.560 | that when you say pro-Ukraine or pro-anybody,
02:14:22.560 | you're pro-human beings, not pro the machine
02:14:29.440 | that creates narratives that says it's pro-human beings.
02:14:36.800 | But it's actually, if you look at the raw use
02:14:39.920 | of funds and resources,
02:14:42.640 | it's actually pro-making weapons
02:14:44.880 | and shooting bullets and dropping bombs.
02:14:47.440 | The real, we have to just somehow get the meme
02:14:50.680 | into everyone's heads that the real enemy is war itself.
02:14:54.640 | That's the enemy we need to defeat.
02:14:57.120 | And that doesn't mean to say that there isn't justification
02:15:01.600 | for small local scenarios, adversarial conflicts.
02:15:06.600 | If you have a leader who is starting wars,
02:15:11.120 | they're on the side of team war, basically.
02:15:13.760 | It's not that they're on the side of team country,
02:15:15.280 | whatever that country is,
02:15:16.240 | it's they're on the side of team war.
02:15:17.920 | So that needs to be stopped and put down.
02:15:20.180 | But you also have to find a way
02:15:21.520 | that your corrective measure doesn't actually then end up
02:15:25.800 | being co-opted by the war machine and creating greater war.
02:15:28.920 | Again, the playing field is finite.
02:15:31.080 | The scale of conflict is now getting so big
02:15:35.200 | that the weapons that can be used are so mass destructive
02:15:38.800 | that we can't afford another giant conflict.
02:15:42.560 | We just, we won't make it.
02:15:44.080 | - What existential threat, in terms of us not making it,
02:15:48.000 | are you most worried about?
02:15:49.640 | What existential threat to human civilization?
02:15:51.880 | We got like--
02:15:52.720 | - Going down the dark path, huh?
02:15:53.560 | - This is good.
02:15:54.680 | Well, no, it's a dark--
02:15:56.640 | - No, it's like, well, while we're in the somber place,
02:15:58.600 | we might as well.
02:15:59.440 | (both laughing)
02:16:02.320 | - Some of my best friends are dark paths.
02:16:04.500 | What worries you the most?
02:16:08.000 | We mentioned asteroids, we mentioned AGI, nuclear weapons.
02:16:13.960 | - The one that's on my mind the most,
02:16:17.400 | mostly because I think it's the one where we have
02:16:19.480 | actually a real chance to move the needle on
02:16:22.000 | in a positive direction, or more specifically,
02:16:24.600 | stop some really bad things from happening,
02:16:26.880 | really dumb, avoidable things, is bio-risks.
02:16:31.880 | - In what kind of bio-risks?
02:16:37.440 | There's so many fun options.
02:16:39.120 | - Oh, yeah, so many.
02:16:39.960 | So, of course, we have natural risks from natural pandemics,
02:16:43.640 | naturally occurring viruses or pathogens.
02:16:45.920 | And then also as time and technology goes on
02:16:49.640 | and technology becomes more and more democratized
02:16:52.040 | into the hands of more and more people,
02:16:54.040 | the risk of synthetic pathogens.
02:16:55.980 | And whether or not you fall into the camp of COVID
02:17:00.640 | was gain of function, accidental lab leak,
02:17:03.720 | or whether it was purely naturally occurring,
02:17:05.920 | either way, we are facing a future where
02:17:13.320 | synthetic pathogens or human-meddled-with pathogens
02:17:17.880 | either accidentally get out
02:17:20.040 | or get into the hands of bad actors,
02:17:23.280 | whether they're omnicidal maniacs, either way.
02:17:27.720 | And so that means we need more robustness for that.
02:17:31.160 | And you would think that us having this nice little dry run,
02:17:33.920 | which is what, as awful as COVID was,
02:17:36.340 | and all those poor people that died,
02:17:39.480 | it was still like child's play
02:17:42.400 | compared to what a future one could be
02:17:44.480 | in terms of fatality rate.
02:17:45.800 | And so you'd think that we would then be coming,
02:17:49.920 | we'd be much more robust in our pandemic preparedness.
02:17:52.620 | And meanwhile, the budget in the last two years for the US,
02:17:58.620 | sorry, they just did this,
02:18:01.900 | I can't remember the name of what the actual budget was,
02:18:04.800 | but it was like a multi-trillion dollar budget
02:18:06.960 | that the US just set aside.
02:18:08.920 | And originally in that,
02:18:10.580 | considering that COVID cost multiple trillions
02:18:12.580 | to the economy, right?
02:18:13.940 | The original allocation in this new budget
02:18:17.320 | for future pandemic preparedness was 60 billion.
02:18:19.960 | So tiny proportion of it.
02:18:22.720 | That's proceeded to get whittled down
02:18:24.840 | to like 30 billion to 15 billion,
02:18:28.680 | all the way down to 2 billion out of multiple trillions
02:18:31.560 | for a thing that has just cost us multiple trillions.
02:18:34.180 | We've just finished, we're not even really out of it.
02:18:37.480 | It basically got whittled down to nothing
02:18:39.280 | because for some reason people think that,
02:18:41.020 | "Whew, all right, we've got the pandemic out of the way.
02:18:43.160 | That was that one."
02:18:44.620 | And the reason for that is that people are,
02:18:47.080 | and I say this with all due respect
02:18:49.620 | to a lot of the science community,
02:18:50.640 | but there's an immense amount of naivety about,
02:18:55.080 | they think that nature is the main risk moving forward,
02:18:59.160 | and it really isn't.
02:19:00.500 | And I think nothing demonstrates this more
02:19:02.920 | than this project that I was just reading about
02:19:05.640 | that's sort of being proposed right now
02:19:07.120 | called Deep Vision.
02:19:10.040 | And the idea is to go out into the wilds,
02:19:12.360 | and we're not talking about just like within cities,
02:19:15.080 | like deep into like caves that people don't go to,
02:19:17.760 | deep into the Arctic, wherever,
02:19:19.200 | scour the earth for whatever the most dangerous
02:19:22.760 | possible pathogens could be that they can find.
02:19:26.480 | And then not only do, try and find these,
02:19:29.920 | bring samples of them back to laboratories.
02:19:33.080 | And again, whether you think COVID was a lab leak or not,
02:19:36.200 | I'm not gonna get into that,
02:19:37.480 | but we have historically had so many, as a civilization,
02:19:40.400 | we've had so many lab leaks
02:19:42.600 | from even like the highest level security things.
02:19:44.600 | Like it just, people should go and just read it.
02:19:47.520 | It's like a comedy show of just how many there are,
02:19:50.560 | how leaky these labs are,
02:19:52.160 | even when they do their best efforts.
02:19:54.640 | So bring these things then back to civilization.
02:19:57.520 | That's step one of the badness.
02:19:58.960 | Then the next step would be to then categorize them,
02:20:02.960 | do experiments on them and categorize them
02:20:04.520 | by their level of potential pandemic lethality.
02:20:07.000 | And then the pièce de résistance on this plan
02:20:10.680 | is to then publish that information freely on the internet
02:20:14.840 | about all these pathogens, including their genome,
02:20:16.920 | which is literally like the building instructions
02:20:18.800 | of how to do them on the internet.
02:20:21.240 | And this is something that genuinely a pocket
02:20:24.640 | of the like bio, of the scientific community
02:20:27.960 | thinks is a good idea.
02:20:29.840 | And I think on expectation, like,
02:20:32.120 | their argument is that,
02:20:33.360 | oh, this is good because it might buy us some time
02:20:36.200 | to develop the vaccines, which, okay, sure.
02:20:39.520 | Maybe would have made sense prior to mRNA technology,
02:20:42.040 | but with mRNA,
02:20:44.680 | we can now develop a vaccine
02:20:46.880 | within a couple of days of finding a new pathogen.
02:20:49.960 | Now then there's all the trials and so on.
02:20:51.560 | Those trials would have to happen anyway
02:20:52.960 | in the case of a brand new thing.
02:20:54.440 | So you're saving maybe a couple of days.
02:20:56.400 | So that's the upside.
02:20:57.680 | Meanwhile, the downside is you're not only giving,
02:21:01.200 | you're bringing the risk of these pathogens
02:21:03.200 | of like getting leaked,
02:21:04.040 | but you're literally handing it out
02:21:06.400 | to every bad actor on earth who would be doing cartwheels.
02:21:10.320 | And I'm talking about like Kim Jong-un, ISIS,
02:21:13.280 | people who like want,
02:21:14.880 | they think the rest of the world is their enemy.
02:21:17.200 | And in some cases they think that killing themselves
02:21:19.760 | is like a noble cause.
02:21:22.680 | And you're literally giving them the building blocks
02:21:24.240 | of how to do this.
02:21:25.080 | It's the most batshit idea I've ever heard.
02:21:26.920 | Like on expectation, it's probably like minus EV
02:21:29.560 | of like multiple billions of lives
02:21:31.640 | if they actually succeeded in doing this.
02:21:33.440 | Certainly in the tens or hundreds of millions.
02:21:35.760 | So the cost benefit is so unbelievably, it makes no sense.
02:21:38.840 | And I was trying to wrap my head around,
02:21:41.120 | like what's going wrong in people's minds
02:21:44.920 | to think that this is a good idea?
02:21:46.600 | And it's not that it's malice or anything like that.
02:21:50.000 | I think it's that people don't,
02:21:53.640 | the proponents, they don't,
02:21:56.120 | they're actually overly naive
02:21:57.400 | about the interactions of humanity.
02:22:00.680 | And well, like there are bad actors
02:22:02.840 | who will use this for bad things.
02:22:04.920 | Because not only will it,
02:22:06.160 | if you publish this information,
02:22:08.920 | even if a bad actor couldn't physically make it themselves,
02:22:12.200 | which given in 10 years time,
02:22:14.080 | like the technology is getting cheaper and easier to use.
02:22:18.000 | But even if they couldn't make it, they could now bluff it.
02:22:20.400 | Like what would you do if there's like some deadly new virus
02:22:23.000 | that was published on the internet
02:22:26.160 | in terms of its building blocks?
02:22:27.600 | Kim Jong-un could be like,
02:22:28.600 | "Hey, if you don't let me build my nuclear weapons,
02:22:31.560 | "I'm gonna release this, I've managed to build it."
02:22:33.640 | Well, now he's actually got a credible bluff.
02:22:35.480 | We don't know.
02:22:36.480 | And so that's, it's just like handing the keys,
02:22:39.320 | it's handing weapons of mass destruction to people.
02:22:42.440 | Makes no sense.
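A rough sketch, in Python, of the expected-value framing behind the "minus EV" claim above. Every number is a placeholder chosen only to mirror the shape of the argument (a couple of days of vaccine lead time saved versus a small chance of a catastrophic release); none of it is a real estimate.

```python
# Illustrative only: an expected-value comparison for the
# "publish dangerous pathogen genomes" plan discussed above.
# All probabilities and magnitudes are made-up placeholders.

# Upside: a head start on vaccine development, valued (arbitrarily)
# in lives saved per day of lead time.
p_head_start_matters = 0.5      # chance the head start is ever relevant
lives_saved_per_day = 10_000    # placeholder
days_saved = 2                  # "you're saving maybe a couple of days"

expected_benefit = p_head_start_matters * lives_saved_per_day * days_saved

# Downside: some probability that publication enables a catastrophic
# release (leak or bad actor) with an enormous death toll.
p_catastrophe = 0.01                      # placeholder
lives_lost_if_catastrophe = 100_000_000   # placeholder

expected_cost = p_catastrophe * lives_lost_if_catastrophe

print(f"expected benefit (lives): {expected_benefit:,.0f}")
print(f"expected cost    (lives): {expected_cost:,.0f}")
print(f"net expected value      : {expected_benefit - expected_cost:,.0f}")
```

With placeholder inputs of this shape, the expected downside dwarfs the expected upside, which is the cost-benefit point being made.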
02:22:43.280 | - The possible, I agree with you,
02:22:44.560 | but the possible world in which it might make sense
02:22:48.320 | is if the good guys,
02:22:52.240 | which is a whole another problem,
02:22:53.800 | defining who the good guys are,
02:22:55.520 | but the good guys are like an order of magnitude
02:22:59.680 | higher competence.
02:23:00.980 | And so they can stay ahead of the bad actors
02:23:06.760 | by just being very good at the defense.
02:23:10.120 | By very good, not meaning like a little bit better,
02:23:13.740 | but an order of magnitude better.
02:23:15.920 | But of course the question is
02:23:17.240 | in each of those individual disciplines, is that feasible?
02:23:21.720 | Can you, can the bad actors,
02:23:23.480 | even if they don't have the competence,
02:23:24.920 | leapfrog to the place where the good guys are?
02:23:29.720 | - Yeah, I mean, I would agree in principle
02:23:31.880 | pertaining to this particular plan,
02:23:35.720 | you know, the thing I described,
02:23:38.520 | this deep vision thing,
02:23:39.360 | where at least then that would maybe make sense
02:23:41.200 | for steps one and step two of like getting the information,
02:23:43.440 | but then why would you release it,
02:23:45.680 | the information to your literal enemies?
02:23:47.280 | You know, that's, that makes,
02:23:49.920 | that doesn't fit at all in that perspective
02:23:52.600 | of like trying to be ahead of them.
02:23:53.560 | You're literally handing them the weapon.
02:23:55.000 | - But there's different levels of release, right?
02:23:56.640 | So there's the kind of secrecy
02:24:00.520 | where you don't give it to anybody,
02:24:02.440 | but there's a release where you incrementally give it
02:24:05.480 | to like major labs.
02:24:07.720 | So it's not public release,
02:24:08.880 | but it's like you're giving it to major labs.
02:24:10.960 | - There's different layers of reasonability, but-
02:24:12.880 | - But the problem there is it's going to,
02:24:14.640 | if you go anywhere beyond like complete secrecy,
02:24:18.120 | it's going to leak.
02:24:19.720 | - That's the thing.
02:24:20.560 | It's very hard to keep secrets.
02:24:21.600 | - And so that's still-
02:24:22.440 | - Information is-
02:24:23.440 | - So you might as well release it to the public,
02:24:25.920 | is that argument.
02:24:26.920 | So you either go complete secrecy
02:24:28.800 | or you release it to the public.
02:24:31.720 | So, which is essentially the same thing.
02:24:33.920 | It's going to leak anyway,
02:24:35.920 | if you don't do complete secrecy.
02:24:38.240 | - Right, which is why you shouldn't get the information
02:24:39.800 | in the first place.
02:24:40.640 | - Yeah, I mean, in that, I think-
02:24:43.720 | - Well, that's A solution.
02:24:44.880 | Yeah, the solution is either don't get the information
02:24:46.600 | in the first place or B, keep it incredibly contained.
02:24:51.600 | - See, I think it really matters
02:24:54.560 | which discipline we're talking about.
02:24:55.800 | So in the case of biology, I do think you're very right.
02:24:59.680 | We shouldn't even be, it should be forbidden
02:25:02.520 | to even like think about that.
02:25:06.640 | Meaning don't just even collect the information,
02:25:09.200 | but like don't do, I mean, gain of function research
02:25:12.440 | is a really iffy area.
02:25:14.880 | Like you start-
02:25:15.720 | - I mean, it's all about cost benefits, right?
02:25:17.680 | There are some scenarios where I could imagine
02:25:19.280 | the cost benefit of a gain of function research
02:25:21.360 | is very, very clear, where you've evaluated
02:25:24.480 | all the potential risks, factored in the probability
02:25:26.840 | that things can go wrong and like, you know,
02:25:28.560 | not only known unknowns, but unknown unknowns as well,
02:25:31.000 | tried to quantify that.
02:25:32.360 | And then even then it's like orders of magnitude
02:25:34.440 | better to do that.
02:25:35.480 | I'm behind that argument, but the point is,
02:25:37.320 | is that there's this like naivety that's preventing people
02:25:40.480 | from even doing the cost benefit properly
02:25:42.160 | on a lot of the things.
02:25:43.320 | Because, you know, I get it, the science community,
02:25:47.280 | again, I don't wanna bucket the science community,
02:25:49.320 | but like some people within the science community
02:25:52.160 | just think that everyone's good
02:25:54.080 | and everyone just cares about getting knowledge
02:25:55.560 | and doing the best for the world.
02:25:56.920 | And unfortunately that's not the case.
02:25:58.080 | I wish we lived in that world, but we don't.
02:26:00.920 | - Yeah, I mean, there's a lie.
02:26:02.400 | Listen, I've been criticizing the science community
02:26:05.640 | broadly quite a bit.
02:26:07.400 | There's so many brilliant people that brilliance
02:26:09.840 | is somehow a hindrance sometimes
02:26:11.320 | 'cause it has a bunch of blind spots.
02:26:13.200 | And then you start to look at the history of science,
02:26:16.440 | how easily it's been used by dictators
02:26:19.080 | to any conclusion they want.
02:26:20.880 | And it's dark how you can use brilliant people
02:26:24.900 | that like playing the little game of science,
02:26:27.200 | 'cause it is a fun game.
02:26:28.820 | You know, you're building, you're going to conferences,
02:26:30.740 | you're building on top of each other's ideas,
02:26:32.360 | there's breakthroughs.
02:26:33.280 | Hi, I think I've realized how this particular molecule works
02:26:37.060 | and I could do this kind of experiment
02:26:38.600 | and everyone else is impressed.
02:26:39.720 | Ooh, cool.
02:26:40.560 | No, I think you're wrong.
02:26:41.440 | Let me show you why you're wrong.
02:26:42.520 | In that little game, everyone gets really excited
02:26:44.880 | and they get excited.
02:26:46.120 | Oh, I came up with a pill that solves this problem
02:26:48.080 | and it's gonna help a bunch of people.
02:26:49.560 | And I came up with a giant study
02:26:51.400 | that shows the exact probability it's gonna help or not.
02:26:54.520 | And you get lost in this game
02:26:56.360 | and you forget to realize this game, just like Moloch,
02:27:00.100 | can have like--
02:27:03.120 | - Unintended consequences, yeah.
02:27:04.760 | - Unintended consequences that might destroy
02:27:07.600 | human civilization or divide human civilization
02:27:12.600 | or have dire geopolitical consequences.
02:27:17.560 | I mean, the effects of, I mean, it's just so,
02:27:20.360 | the most destructive effects of COVID
02:27:22.920 | have nothing to do with the biology of the virus,
02:27:25.800 | it seems like.
02:27:26.640 | I mean, I could just list them forever.
02:27:29.600 | But like one of them is the complete distrust
02:27:32.240 | of public institutions.
02:27:34.200 | The other one is because of that public distrust,
02:27:36.400 | I feel like if a much worse pandemic came along,
02:27:38.880 | we as a world have now cried wolf.
02:27:42.520 | And if an actual wolf now comes,
02:27:45.520 | people will be like, "Fuck masks, fuck--"
02:27:48.400 | - "Fuck vaccines, fuck everything."
02:27:50.080 | - And they won't be, they'll distrust every single thing
02:27:53.640 | that any major institution is gonna tell them.
02:27:56.040 | And--
02:27:57.040 | - Because that's the thing,
02:27:58.280 | there were certain actions made by certain
02:28:04.640 | public health figures where they
02:28:07.800 | very knowingly told a white lie,
02:28:10.760 | it was intended in the best possible way,
02:28:12.440 | such as early on when there was clearly a shortage of masks.
02:28:17.440 | And so they said to the public, "Oh, don't get masks,
02:28:23.240 | there's no evidence that they work.
02:28:25.080 | Don't get them, they don't work.
02:28:27.280 | In fact, it might even make it worse.
02:28:29.000 | You might even spread it more."
02:28:30.240 | Like that was the real like stinker.
02:28:32.280 | Yeah, no, no.
02:28:34.360 | "The less you know how to do it properly,
02:28:35.200 | you're gonna get sicker,
02:28:36.680 | or you're more likely to catch the virus,"
02:28:38.720 | which is just absolute crap.
02:28:41.360 | And they put that out there.
02:28:43.160 | And it's pretty clear the reason why they did that
02:28:45.320 | was because there was actually a shortage of masks
02:28:47.640 | and they really needed it for health workers,
02:28:50.200 | which makes sense, like I agree.
02:28:52.800 | But the cost of lying to the public when that then comes out,
02:28:57.800 | people aren't as stupid as they think they are.
02:29:02.760 | And that's, I think, where this distrust of experts
02:29:05.880 | has largely come from.
02:29:06.720 | A, they've lied to people overtly,
02:29:09.360 | but B, people have been treated like idiots.
02:29:13.160 | Now, that's not to say that there aren't a lot of stupid
02:29:14.800 | people who have a lot of wacky ideas around COVID
02:29:16.920 | and all sorts of things,
02:29:18.200 | but if you treat the general public like children,
02:29:21.560 | they're going to see that, they're going to notice that,
02:29:23.480 | and that is going to absolutely decimate the trust
02:29:26.760 | in the public institutions that we depend upon.
02:29:29.520 | And honestly, the best thing that could happen,
02:29:32.240 | I wish, if Fauci and these other leaders who,
02:29:36.400 | I mean, God, I can't imagine how nightmarish his job has been
02:29:39.760 | over the last few years, hell on earth.
02:29:41.360 | So I have a lot of sympathy for the position he's been in.
02:29:46.360 | But if he could just come out and be like,
02:29:48.720 | "Okay, look, guys, hands up.
02:29:51.120 | "We didn't handle this as well as we could have.
02:29:53.920 | "These are all the things I would have done differently
02:29:55.840 | "in hindsight.
02:29:56.680 | "I apologize for this and this and this and this."
02:29:58.680 | That would go so far,
02:30:01.360 | and maybe I'm being naive, who knows?
02:30:03.040 | Maybe this would backfire, but I don't think it would.
02:30:05.040 | To someone like me, even,
02:30:06.240 | 'cause I've lost trust in a lot of these things.
02:30:08.840 | But I'm fortunate that I at least know people
02:30:10.360 | who I can go to who I think have good epistemics
02:30:12.680 | on this stuff.
02:30:13.520 | But if they could sort of put their hands on and go,
02:30:16.360 | "Okay, these are the spots where we screwed up.
02:30:18.160 | "This, this, this.
02:30:20.200 | "This was our reasons.
02:30:21.200 | "Yeah, we actually told a little white lie here.
02:30:22.680 | "We did it for this reason.
02:30:23.600 | "We're really sorry."
02:30:24.960 | Where they just did the radical honesty thing,
02:30:26.720 | the radical transparency thing,
02:30:28.680 | that would go so far to rebuilding public trust.
02:30:32.160 | And I think that's what needs to happen.
02:30:33.320 | - Yeah, I totally agree with you.
02:30:34.680 | Unfortunately, his job was very tough
02:30:38.680 | and all those kinds of things.
02:30:39.800 | But I see arrogance,
02:30:42.920 | and arrogance prevented him
02:30:44.600 | from being honest in that way previously.
02:30:47.840 | And I think arrogance will prevent him
02:30:49.480 | from being honest in that way now when he leaves.
02:30:52.840 | I think young people are seeing that,
02:30:55.440 | that kind of talking down to people
02:30:59.960 | from a position of power,
02:31:02.080 | I hope is the way of the past.
02:31:04.400 | People really like authenticity
02:31:06.120 | and they like leaders that are like a man
02:31:10.600 | and a woman of the people.
02:31:12.320 | And I think that just-
02:31:15.000 | - I mean, he still has a chance to do that, I think.
02:31:17.040 | I mean, I don't wanna- - Yeah, sure.
02:31:18.280 | - I don't think he's, you know,
02:31:19.280 | if I doubt he's listening,
02:31:20.600 | but if he is, like, hey, I think, you know,
02:31:24.000 | I don't think he's irredeemable by any means.
02:31:25.680 | I think there's, you know,
02:31:27.560 | I don't have an opinion
02:31:28.720 | of whether there was arrogance or there or not.
02:31:30.720 | Just know that I think, like, coming clean on the,
02:31:34.200 | you know, it's understandable to have fucked up
02:31:36.960 | during this pandemic.
02:31:37.800 | Like, I won't expect any government to handle it well
02:31:39.440 | because it was so difficult,
02:31:41.120 | like, so many moving pieces,
02:31:42.880 | so much, like, lack of information and so on.
02:31:46.080 | But the step to rebuilding trust is to go,
02:31:48.760 | okay, look, we're doing a scrutiny of where we went wrong.
02:31:51.320 | And for my part, I did this wrong in this part.
02:31:53.920 | - That would be huge.
02:31:55.160 | - All of us can do that.
02:31:56.040 | I mean, I was struggling for a while
02:31:57.400 | whether I wanna talk to him or not.
02:32:00.040 | I talked to his boss, Francis Collins.
02:32:01.960 | Another person that screwed up in terms of trust,
02:32:06.380 | lost a little bit of my respect too.
02:32:10.320 | There seems to have been a kind of dishonesty
02:32:14.680 | in the back rooms,
02:32:18.080 | in that they didn't trust people to be intelligent.
02:32:23.080 | Like, we need to tell them what's good for them.
02:32:26.040 | We know what's good for them, that kind of idea.
02:32:29.000 | - To be fair, the thing that's,
02:32:32.560 | what's it called?
02:32:33.400 | I heard the phrase today, nut picking.
02:32:36.440 | Social media does that.
02:32:37.800 | So you've got like nitpicking.
02:32:39.080 | Nut picking is where the craziest, stupidest,
02:32:44.080 | you know, if you have a group of people,
02:32:45.920 | let's call it, you know, let's say people who are vaccine,
02:32:47.760 | I don't like the term anti-vaccine,
02:32:48.760 | people who are vaccine hesitant, vaccine speculative,
02:32:52.080 | you know, what social media did or the media or anyone,
02:32:56.440 | you know, their opponents would do
02:32:59.240 | is pick the craziest example.
02:33:00.960 | So the ones who are like, you know,
02:33:02.280 | I think I need to inject myself with like,
02:33:04.360 | motor oil up my ass or something, you know,
02:33:07.880 | select the craziest ones and then have that beamed to,
02:33:11.600 | you know, so from like someone like Fauci
02:33:13.040 | or Francis's perspective, that's what they get
02:33:15.680 | because they're getting the same social media stuff as us.
02:33:17.440 | They're getting the same media reports.
02:33:18.960 | I mean, they might get some more information,
02:33:20.880 | but they too are gonna get the nuts portrayed to them.
02:33:24.880 | So they probably have a misrepresentation
02:33:27.200 | of what the actual public's intelligence is.
02:33:29.320 | - Well, that just, yes.
02:33:31.280 | And that just means they're not social media savvy.
02:33:33.640 | So one of the skills of being on social media
02:33:36.000 | is to be able to filter that in your mind,
02:33:37.840 | like to understand, to put into proper context.
02:33:40.120 | - To realize that what you are saying,
02:33:41.760 | social media is not anywhere near
02:33:43.920 | an accurate representation of humanity.
02:33:46.600 | - Nut picking, and there's nothing wrong
02:33:49.520 | with putting motor oil up your ass.
02:33:51.360 | It's just one of the better aspects of,
02:33:54.480 | I do this every weekend.
02:33:56.400 | Okay.
02:33:57.240 | - Where the hell did that analogy come from in my mind?
02:33:59.880 | Like what?
02:34:00.720 | - I don't know.
02:34:01.560 | I think you need to, there's some Freudian thing
02:34:03.760 | we need to deeply investigate with a therapist.
02:34:06.680 | Okay, what about AI?
02:34:08.320 | Are you worried about AGI, superintelligence systems,
02:34:13.100 | or paperclip maximizer type of situation?
02:34:17.200 | - Yes, I'm definitely worried about it,
02:34:19.960 | but I feel kind of bipolar in that some days I wake up
02:34:24.320 | and I'm like--
02:34:25.160 | - You're excited about the future?
02:34:26.120 | - Well, exactly.
02:34:26.960 | I'm like, wow, we can unlock the mysteries of the universe,
02:34:29.440 | you know, escape the game.
02:34:30.760 | 'Cause I spend all my time thinking
02:34:35.760 | about these Moloch problems,
02:34:37.000 | that what is the solution to them?
02:34:38.960 | In some ways you need this like omnibenevolent,
02:34:43.280 | omniscient, omni-wise coordination mechanism
02:34:48.280 | that can like make us all not do the Moloch thing,
02:34:53.000 | or like provide the infrastructure,
02:34:55.560 | or redesign the system so that it's not vulnerable
02:34:57.440 | to this Moloch process.
02:34:58.900 | And in some ways, you know,
02:35:01.160 | that's the strongest argument to me
02:35:02.560 | for like the race to build AGI,
02:35:04.960 | is that maybe, you know, we can't survive without it.
02:35:08.000 | But the flip side to that is
02:35:13.260 | unfortunately now that there's multiple actors
02:35:14.880 | trying to build AI, AGI, you know,
02:35:17.280 | this was fine 10 years ago when it was just DeepMind,
02:35:19.920 | but then other companies started up
02:35:22.040 | and now it created a race dynamic.
02:35:23.340 | Now it's like, the whole thing is at the same,
02:35:27.000 | it's got the same problem.
02:35:27.960 | It's like, whichever company is the one
02:35:30.040 | that like optimizes for speed at the cost of safety
02:35:33.580 | will get the competitive advantage,
02:35:35.280 | and so will be the more likely ones to build the AGI,
02:35:37.360 | you know, and that's the same cycle that you're in.
02:35:40.280 | And there's no clear solution to that,
02:35:41.600 | 'cause you can't just go like slapping,
02:35:45.020 | if you go and try and like stop all the different companies,
02:35:51.160 | then it will, you know, the good ones will stop
02:35:54.640 | because they're the ones, you know,
02:35:55.680 | within the West's reach,
02:35:57.680 | but then that leaves all the other ones to continue
02:36:00.000 | and then they're even more likely.
02:36:01.000 | So it's like, it's a very difficult problem
02:36:03.160 | with no clean solution.
02:36:04.420 | And, you know, at the same time, you know,
02:36:08.640 | I know at least some of the folks at DeepMind
02:36:12.120 | and they're incredible and they're thinking about this.
02:36:13.760 | They're very aware of this problem and they're like,
02:36:15.720 | you know, I think some of the smartest people on earth.
02:36:18.720 | - Yeah, the culture is important there
02:36:20.640 | because they are thinking about that
02:36:22.000 | and they're some of the best machine learning engineers.
02:36:26.240 | So it's possible to have a company or a community of people
02:36:29.800 | that are both great engineers
02:36:31.720 | and are thinking about the philosophical topics.
02:36:33.760 | - Exactly, and importantly, they're also game theorists,
02:36:36.720 | you know, and because this is ultimately
02:36:38.240 | a game theory problem, the thing, this Moloch mechanism
02:36:41.640 | and like, you know, how do we avoid arms race scenarios?
02:36:46.640 | You need people who aren't naive to be thinking about this.
02:36:50.040 | And again, like luckily there's a lot of smart,
02:36:52.000 | non-naive game theorists within that group.
02:36:54.800 | Yes, I'm concerned about it.
02:36:56.000 | And I think it's again, a thing that we need people
02:36:59.640 | to be thinking about in terms of like,
02:37:02.360 | how do we create, how do we mitigate the arms race dynamics
02:37:05.960 | and how do we solve the thing of,
02:37:10.280 | Bostrom calls it the orthogonality problem whereby,
02:37:13.560 | because obviously there's a chance, you know,
02:37:16.080 | the belief, the hope is,
02:37:17.480 | is that you build something that's super intelligent
02:37:19.760 | and by definition of being super intelligent,
02:37:22.960 | it will also become super wise
02:37:25.160 | and have the wisdom to know what the right goals are.
02:37:27.840 | And hopefully those goals include keeping humanity alive.
02:37:30.960 | Right, but Bostrom says that actually those two things,
02:37:35.320 | you know, super intelligence and super wisdom
02:37:38.280 | aren't necessarily correlated.
02:37:40.400 | They're actually kind of orthogonal things.
02:37:42.880 | And how do we make it so that they are correlated?
02:37:44.800 | How do we guarantee it?
02:37:45.640 | Because we need it to be guaranteed really,
02:37:47.040 | to know that we're doing the thing safely.
02:37:48.920 | - But I think that like merging of intelligence and wisdom,
02:37:53.920 | at least my hope is that this whole process
02:37:56.920 | happens sufficiently slowly,
02:37:58.840 | that we're constantly having these kinds of debates,
02:38:02.040 | that we have enough time to figure out
02:38:06.160 | how to modify each version of the system
02:38:07.920 | as it becomes more and more intelligent.
02:38:09.600 | - Yes, buying time is a good thing, definitely.
02:38:12.040 | Anything that slows everything down,
02:38:14.240 | we just, everyone needs to chill out.
02:38:16.080 | We've got millennia to figure this out.
02:38:19.560 | Or at least, well, it depends again.
02:38:26.560 | Some people think that, you know,
02:38:27.800 | we can't even make it through the next few decades
02:38:29.800 | without having some kind of
02:38:31.560 | omni-wise coordination mechanism.
02:38:36.320 | And there's also an argument to that.
02:38:37.960 | Yeah, I don't know.
02:38:39.360 | - Well, there is, I'm suspicious of that kind of thinking
02:38:42.240 | because it seems like the entirety of human history
02:38:45.160 | has people in it that are like predicting doom
02:38:48.600 | or just around the corner.
02:38:50.760 | There's something about us
02:38:52.720 | that is strangely attracted to that thought.
02:38:57.320 | It's almost like fun to think about
02:38:59.680 | the destruction of everything.
02:39:01.200 | Just objectively speaking,
02:39:04.320 | I've talked and listened to a bunch of people
02:39:08.160 | and they are gravitating towards that.
02:39:11.120 | It's almost, I think it's the same thing
02:39:13.200 | that people love about conspiracy theories
02:39:15.720 | is they love to be the person that kind of figured out
02:39:19.200 | some deep fundamental thing
02:39:22.120 | that's going to mark something extremely important
02:39:26.160 | about the history of human civilization
02:39:28.320 | because then I will be important.
02:39:31.320 | When in reality, most of us will be forgotten
02:39:33.720 | and life will go on.
02:39:37.640 | And one of the sad things about
02:39:40.040 | whenever anything traumatic happens to you,
02:39:42.000 | whenever you lose loved ones or just tragedy happens,
02:39:46.720 | you realize life goes on.
02:39:48.260 | Even after a nuclear war that will wipe out
02:39:52.160 | some large percentage of the population
02:39:55.040 | and will torture people for years to come
02:40:00.040 | because of the sort of,
02:40:01.960 | I mean, the effects of a nuclear winter,
02:40:05.440 | people will still survive,
02:40:07.280 | life will still go on.
02:40:08.720 | I mean, it depends on the kind of nuclear war,
02:40:10.960 | but in case of nuclear war, it will still go on.
02:40:13.520 | That's one of the amazing things about life,
02:40:15.840 | it finds a way.
02:40:17.080 | And so in that sense,
02:40:18.320 | I just, I feel like the doom and gloom thing is a--
02:40:23.760 | - Well, we don't want a self-fulfilling prophecy.
02:40:26.120 | - Yes, that's exactly.
02:40:27.480 | - Yes, and I very much agree with that.
02:40:29.680 | And I even have a slight feeling
02:40:32.980 | from the amount of time we've spent in this conversation
02:40:35.800 | talking about this 'cause it's like,
02:40:37.600 | is this even a net positive
02:40:40.200 | if it's making everyone feel,
02:40:41.880 | or in some ways, making people imagine
02:40:45.000 | these bad scenarios can be a self-fulfilling prophecy.
02:40:47.480 | But at the same time, that's weighed off
02:40:51.180 | with at least making people aware of the problem
02:40:54.360 | and gets them thinking.
02:40:55.320 | And I think particularly,
02:40:56.280 | the reason why I wanna talk about this to your audience
02:40:58.360 | is that on average, they're the type of people
02:41:00.680 | who gravitate towards these kind of topics
02:41:02.800 | 'cause they're intellectually curious
02:41:04.960 | and they can sort of sense that there's trouble brewing.
02:41:07.880 | They can smell that there's,
02:41:09.400 | I think there's a reason
02:41:10.240 | people are thinking about this stuff a lot
02:41:11.440 | is because the probability,
02:41:13.640 | it's increased in probability
02:41:16.840 | over certainly over the last few years.
02:41:19.440 | Trajectories have not gone favorably,
02:41:21.700 | let's put it since 2010.
02:41:24.260 | So it's right, I think, for people to be thinking about it.
02:41:28.460 | But that's where they're like,
02:41:30.060 | I think whether it's a useful fiction
02:41:31.660 | or whether it's actually true
02:41:33.340 | or whatever you wanna call it,
02:41:34.420 | I think having this faith,
02:41:35.940 | this is where faith is valuable
02:41:38.060 | because it gives you at least this anchor of hope.
02:41:41.140 | And I'm not just saying it to trick myself.
02:41:43.900 | Like I do truly,
02:41:44.780 | I do think there's something out there that wants us to win.
02:41:47.700 | I think there's something that really wants us to win.
02:41:49.720 | And it just, you just have to be like,
02:41:53.120 | just like, okay, now I sound really crazy,
02:41:55.800 | but like open your heart to it a little bit.
02:41:58.800 | And it will give you the like,
02:42:03.200 | the sort of breathing room
02:42:04.880 | with which to marinate on the solutions.
02:42:07.720 | We are the ones who have to come up with the solutions,
02:42:10.400 | but we can use,
02:42:15.000 | there's like this hashtag positivity.
02:42:18.060 | There's value in that.
02:42:19.340 | - Yeah, you have to kind of imagine
02:42:21.340 | all the destructive trajectories that lay in our future
02:42:24.980 | and then believe in the possibility
02:42:27.980 | of avoiding those trajectories.
02:42:29.500 | All while, you said audience,
02:42:32.300 | all while sitting back, which is majority,
02:42:35.140 | the two people that listen to this
02:42:36.700 | are probably sitting on a beach,
02:42:38.700 | smoking some weed.
02:42:40.280 | - God damn it.
02:42:43.240 | - It's a beautiful sunset,
02:42:44.880 | or they're looking at just the waves going in and out.
02:42:47.880 | And ultimately there's a kind of deep belief there
02:42:50.440 | in the momentum of humanity to figure it all out.
02:42:55.440 | - I think we'll make it, but we've got a lot of work to do.
02:42:58.400 | - Which is what makes this whole simulation,
02:43:01.040 | this video game kind of fun.
02:43:02.440 | This battle of Polytopia, I still,
02:43:06.720 | man, I love those games so much.
02:43:08.560 | - They're so good.
02:43:09.400 | - And that one for people who don't know,
02:43:11.760 | Battle of Polytopia is a really radical simplification
02:43:16.760 | of a civilization type of game.
02:43:20.540 | It still has a lot of the skill tree development,
02:43:24.140 | a lot of the strategy,
02:43:25.800 | but it's easy enough to play on a phone.
02:43:29.900 | - Yeah.
02:43:30.740 | - It's kind of interesting.
02:43:32.140 | - They've really figured it out.
02:43:33.260 | It's one of the most elegantly designed games I've ever seen.
02:43:35.580 | It's incredibly complex.
02:43:37.980 | And yet being, again, it walks that line
02:43:39.580 | between complexity and simplicity
02:43:40.980 | in this really, really great way.
02:43:42.580 | And they use pretty colors that hack
02:43:46.220 | the dopamine reward circuits in our brains very well.
02:43:49.740 | - It's fun.
02:43:50.860 | Video games are so fun.
02:43:52.300 | - Yeah.
02:43:53.580 | - Most of this life is just about fun,
02:43:55.540 | escaping all the suffering to find the fun.
02:43:57.780 | What's energy healing?
02:43:59.980 | I have in my notes, energy healing question mark.
02:44:02.260 | What's that about?
02:44:03.100 | (laughing)
02:44:05.540 | - Oh man.
02:44:06.880 | God, your audience are gonna think I'm mad.
02:44:09.500 | So the two crazy things that happened to me,
02:44:13.300 | the one was the voice in the head
02:44:15.020 | that said you're gonna win this tournament,
02:44:16.140 | and then I won the tournament.
02:44:18.300 | The other craziest thing that's happened to me
02:44:20.980 | was in 2018,
02:44:23.740 | I started getting this weird problem in my ear
02:44:30.340 | where it was kind of like low frequency sound distortion,
02:44:35.300 | where voices, particularly men's voices,
02:44:37.340 | became incredibly unpleasant to listen to.
02:44:39.580 | It would create this,
02:44:42.420 | it would be falsely amplified or something,
02:44:44.140 | and it was almost like a physical sensation in my ear,
02:44:46.380 | which was really unpleasant.
02:44:48.140 | And it would last for a few hours and then go away,
02:44:51.020 | and then come back for a few hours and go away.
02:44:52.580 | And I went and got hearing tests,
02:44:54.500 | and they found that the bottom end,
02:44:56.380 | I was losing the hearing in that ear.
02:44:58.400 | And so in the end,
02:45:03.620 | the doctors said they think it was
02:45:05.380 | this thing called Meniere's disease,
02:45:07.260 | which is this very unpleasant disease
02:45:10.580 | where people basically end up losing their hearing,
02:45:12.140 | but they get this,
02:45:13.060 | it often comes with dizzy spells and other things,
02:45:16.300 | 'cause it's like the inner ear gets all messed up.
02:45:18.740 | Now, I don't know if that's actually what I had,
02:45:21.820 | but that's what at least one doctor said to me.
02:45:24.980 | But anyway, so I'd had three months of this stuff,
02:45:27.020 | this going on, and it was really getting me down.
02:45:28.860 | And I was at Burning Man, of all places.
02:45:32.580 | Don't mean to be that person talking about Burning Man.
02:45:35.140 | But I was there, and again, I'd had it,
02:45:37.980 | and I was unable to listen to music,
02:45:39.180 | which is not what you want,
02:45:40.100 | 'cause Burning Man is a very loud, intense place.
02:45:42.620 | And I was just having a really rough time.
02:45:44.060 | And on the final night,
02:45:45.980 | I get talking to this girl who's a friend of a friend.
02:45:49.660 | And I mentioned, I was like,
02:45:51.060 | "Oh, I'm really down in the dumps about this."
02:45:52.580 | And she's like, "Oh, well,
02:45:53.420 | I've done a little bit of energy healing.
02:45:54.980 | Would you like me to have a look?"
02:45:56.380 | And I was like, "Sure."
02:45:57.940 | Now, this was, again,
02:45:59.460 | no time in my life for this.
02:46:03.420 | I didn't believe in any of this stuff.
02:46:04.980 | I was just like, "It's all bullshit.
02:46:06.100 | It's all wooey nonsense."
02:46:07.500 | But I was like, "Sure, have a go."
02:46:10.740 | And she starts with her hand,
02:46:13.700 | and she says, "Oh, there's something there."
02:46:15.340 | And then she leans in,
02:46:16.180 | and she starts sucking over my ear,
02:46:18.780 | not actually touching me,
02:46:19.900 | but close to it, with her mouth.
02:46:22.660 | And it was really unpleasant.
02:46:23.780 | I was like, "Whoa, can you stop?"
02:46:24.780 | She's like, "No, no, no, there's something there.
02:46:25.820 | I need to get it."
02:46:26.660 | And I was like, "No, no, no, I really don't like it.
02:46:27.780 | Please, this is really loud."
02:46:29.140 | She's like, "I need to, just bear with me."
02:46:31.100 | And she does it,
02:46:31.940 | and I don't know how long, for a few minutes.
02:46:33.740 | And then she eventually collapses on the ground,
02:46:36.540 | like freezing cold, crying.
02:46:38.500 | And I'm just like,
02:46:41.420 | "I don't know what the hell is going on."
02:46:42.860 | I'm thoroughly freaked out,
02:46:44.300 | as is everyone else watching.
02:46:45.420 | Just like, "What the hell?"
02:46:46.380 | And we warm her up, and she was like,
02:46:47.660 | (gasps)
02:46:48.500 | "What, ugh."
02:46:49.460 | She was really shaken up.
02:46:51.340 | And she's like, "I don't know what that..."
02:46:54.700 | She said it was something very unpleasant and dark.
02:46:57.540 | "Don't worry, it's gone.
02:46:58.700 | I think you'll be fine in a couple...
02:46:59.900 | You'll have the physical symptoms for a couple of weeks,
02:47:01.420 | and you'll be fine."
02:47:02.700 | But, you know, she was just like that.
02:47:04.620 | You know, so I was so rattled, A,
02:47:07.900 | because the potential that actually,
02:47:10.020 | I'd had something bad in me that made someone feel bad,
02:47:13.060 | and that she was scared.
02:47:15.340 | That was what, you know, I was like,
02:47:16.260 | "Wait, I thought, you do this, this is the thing.
02:47:19.300 | Now you're terrified?
02:47:20.340 | Like you pulled like some kind of exorcism or something?
02:47:22.540 | Like what the fuck is going on?"
02:47:24.620 | So it, like just, the most insane experience.
02:47:29.620 | And frankly, it took me like a few months
02:47:31.740 | to sort of emotionally recover from it.
02:47:35.380 | But my ear problem went away about a couple of weeks later,
02:47:39.540 | and touch wood, I've not had any issues since.
02:47:42.940 | So...
02:47:44.700 | - That gives you like hints
02:47:48.140 | that maybe there's something out there.
02:47:50.100 | - I mean, I don't, again,
02:47:53.340 | I don't have an explanation for this.
02:47:55.180 | The most probable explanation was, you know,
02:47:57.780 | I was a burning man.
02:47:58.620 | I was in a very open state.
02:47:59.860 | Let's just leave it at that.
02:48:01.820 | And, you know,
02:48:04.580 | placebo is an incredibly powerful thing
02:48:08.060 | and a very not understood thing.
02:48:11.140 | - Almost assigning the word placebo to it
02:48:12.660 | reduces it down in a way that
02:48:14.860 | it doesn't deserve to be reduced.
02:48:16.500 | Maybe there's a whole science of what we call placebo.
02:48:19.180 | Maybe there's a, placebo is a door-
02:48:21.580 | - Self-healing, you know?
02:48:23.940 | And I mean, I don't know what the problem was.
02:48:26.300 | Like I was told it was Meniere's.
02:48:27.620 | I don't want to say I definitely had that
02:48:29.380 | because I don't want people to think that,
02:48:30.540 | "Oh, that's how, you know, if they do have that
02:48:32.060 | 'cause it's a terrible disease.
02:48:33.020 | And if they have that,
02:48:33.860 | that this is gonna be a guaranteed way
02:48:34.740 | for it to fix it for them."
02:48:35.580 | I don't know.
02:48:36.420 | And I also don't, I don't,
02:48:39.700 | and you're absolutely right to say like
02:48:41.300 | using even the word placebo is like,
02:48:43.260 | it comes with this like baggage of like frame.
02:48:47.900 | And I don't want to reduce that.
02:48:49.820 | All I can do is describe the experience and what happened.
02:48:52.780 | I cannot put an ontological framework around it.
02:48:56.460 | I can't say why it happened, what the mechanism was,
02:49:00.020 | what the problem even was in the first place.
02:49:02.900 | I just know that something crazy happened
02:49:05.140 | and it was while I was in an open state.
02:49:06.980 | And fortunately for me, it made the problem go away.
02:49:09.460 | But what I took away from it, again,
02:49:11.780 | it was part of this, you know,
02:49:13.500 | this took me on this journey of becoming more humble
02:49:15.900 | about what I think I know.
02:49:17.380 | Because as I said before, I was like,
02:49:18.780 | I was in the like Richard Dawkins train of atheism
02:49:21.740 | in terms of there is no God.
02:49:23.060 | There's everything like that is bullshit.
02:49:24.820 | We know everything.
02:49:25.660 | We know, you know, the only way we can get through,
02:49:28.980 | we know how medicine works and its molecules
02:49:30.820 | and chemical interactions and that kind of stuff.
02:49:33.980 | And now it's like, okay, well,
02:49:36.580 | there's clearly more for us to understand.
02:49:39.500 | And that doesn't mean that it's ascientific as well.
02:49:43.540 | 'Cause, you know, the beauty of the scientific method
02:49:47.140 | is that it still can apply to this situation.
02:49:49.940 | Like I don't see why, you know,
02:49:51.340 | I would like to try and test this experimentally.
02:49:54.300 | I haven't really, you know,
02:49:55.540 | I don't know how we would go about doing that.
02:49:57.100 | We'd have to find other people with the same condition.
02:49:58.860 | I guess, and like try and repeat the experiment.
02:50:02.780 | But it doesn't, just because something happens
02:50:06.940 | that's sort of out of the realms of our current understanding
02:50:10.060 | it doesn't mean that it's,
02:50:11.900 | the scientific method can't be used for it.
02:50:13.820 | - Yeah, I think the scientific method sits on a foundation
02:50:17.900 | of those kinds of experiences.
02:50:20.180 | 'Cause they, scientific method is a process
02:50:24.700 | to carve away at the mystery all around us.
02:50:29.700 | And experiences like this is just a reminder
02:50:33.740 | that we're mostly shrouded in mystery still.
02:50:36.700 | That's it, it's just like a humility.
02:50:38.940 | Like we haven't really figured this whole thing out.
02:50:41.580 | - But at the same time, we have found ways to act,
02:50:45.460 | you know, we're clearly doing something right.
02:50:47.500 | Because think of the technological scientific advancements,
02:50:50.020 | the knowledge that we have,
02:50:52.460 | would blow people's minds even from 100 years ago.
02:50:55.620 | - Yeah, and we've even allegedly gone out to space
02:50:58.660 | and landed on the moon.
02:51:00.340 | Although I still haven't, I have not seen evidence
02:51:02.500 | of the earth being round, but I'm keeping an open mind.
02:51:06.540 | Speaking of which, you studied physics and astrophysics.
02:51:11.620 | (laughing)
02:51:13.860 | Just to go to that, just to jump around
02:51:18.740 | through the fascinating life you've had,
02:51:20.940 | when did you, how did that come to be?
02:51:23.620 | Like when did you fall in love with astronomy
02:51:25.900 | and space and things like this?
02:51:28.620 | - As early as I can remember.
02:51:30.460 | I was very lucky that my mom and my dad,
02:51:33.380 | but particularly my mom, my mom is like the most nature,
02:51:38.300 | she is mother earth, is the only way to describe her.
02:51:41.180 | Just, she's like Dr. Dolittle, animals flock to her
02:51:44.540 | and just like sit and look at her adoringly.
02:51:47.060 | - As she sings.
02:51:48.380 | - Yeah, she just is mother earth
02:51:51.060 | and she has always been fascinated by,
02:51:53.500 | she doesn't have any, she never went to university
02:51:57.100 | or anything like that, she's actually phobic of maths.
02:51:59.380 | If I try and get her to like,
02:52:01.060 | I was trying to teach her poker and she hated it.
02:52:03.860 | But she's so deeply curious and that just got instilled in me
02:52:08.860 | when we would sleep out under the stars
02:52:11.260 | whenever it was the two nights a year
02:52:13.220 | when it was warm enough in the UK to do that.
02:52:15.820 | And we would just lie out there until we fell asleep
02:52:18.740 | looking for satellites, looking for shooting stars.
02:52:22.140 | And I was just always, I don't know whether it was from that
02:52:25.340 | but I've always naturally gravitated to like
02:52:28.660 | the biggest questions.
02:52:31.980 | And also the like the most layers of abstraction.
02:52:35.180 | I love just like, what's the meta question,
02:52:36.780 | what's the meta question and so on.
02:52:38.820 | So I think it just came from that really.
02:52:40.820 | And then on top of that, like physics,
02:52:44.140 | it also made logical sense in that it was a degree
02:52:47.460 | that was a subject that ticked the box of
02:52:54.100 | answering these really big picture questions
02:52:55.580 | but it was also extremely useful.
02:52:57.900 | It like has a very high utility
02:52:59.660 | in terms of, I didn't know necessarily.
02:53:02.140 | I thought I was gonna become like a research scientist.
02:53:04.460 | My original plan was I wanna be a professional astronomer.
02:53:07.060 | - So it's not just like a philosophy degree
02:53:08.660 | that asks the big questions and it's not like biology
02:53:14.100 | and the path to go to medical school or something like that
02:53:16.220 | which is overly pragmatic, not overly,
02:53:19.140 | is very pragmatic. - The more pragmatic side.
02:53:23.140 | - But this is, yeah, physics is a good combination
02:53:25.580 | of the two.
02:53:26.500 | - Yeah, at least for me, it made sense.
02:53:27.940 | And I was good at it, I liked it.
02:53:30.220 | Yeah, I mean, it wasn't like I did an immense amount
02:53:32.460 | of soul searching to choose it or anything.
02:53:34.380 | It just was like this, it made the most sense.
02:53:38.220 | I mean, you have to make this decision in the UK, age 17
02:53:41.300 | which is crazy 'cause in US, you go the first year,
02:53:44.820 | you do a bunch of stuff, right?
02:53:46.260 | And then you choose your major.
02:53:48.540 | - Yeah, I think the first few years of college,
02:53:50.020 | you focus on the drugs and only as you get closer
02:53:53.100 | to the end, do you start to think,
02:53:55.820 | oh shit, this wasn't about that
02:53:57.420 | and I owe the government a lot of money.
02:54:01.380 | How many alien civilizations are out there?
02:54:05.180 | When you looked up at the stars with your mom
02:54:07.580 | and you were counting them,
02:54:11.140 | what's your mom think about the number
02:54:12.900 | of alien civilization?
02:54:13.740 | - I actually don't know.
02:54:15.780 | I would imagine she would take the viewpoint of,
02:54:18.660 | she's pretty humble and she knows how many,
02:54:21.300 | she knows there's a huge number of potential spawn sites
02:54:24.100 | out there, so she would--
02:54:25.380 | - Spawn sites?
02:54:26.220 | - Spawn sites, yeah.
02:54:28.020 | This is our spawn sites. - Spawn sites.
02:54:29.460 | - Yeah, spawn sites in polytopia, we spawned on earth.
02:54:32.160 | - Yeah, spawn sites.
02:54:35.860 | Why does that feel weird to say spawn?
02:54:39.260 | - Because it makes me feel like it's,
02:54:41.320 | there's only one source of life
02:54:45.300 | and it's spawning in different locations.
02:54:47.380 | That's why the word spawn.
02:54:49.180 | 'Cause it feels like life that originated on earth
02:54:52.540 | really originated here.
02:54:54.660 | - Right, it is unique to this particular,
02:54:58.740 | yeah, I mean, but I don't, in my mind,
02:55:00.980 | it doesn't exclude the completely different forms of life
02:55:04.340 | and different biochemical soups can't also spawn,
02:55:09.140 | but I guess it implies that there's some spark
02:55:12.580 | that is uniform, which I kind of like the idea of it.
02:55:16.180 | - And then I get to think about respawning,
02:55:19.220 | like after it dies, like what happens if life on earth ends?
02:55:23.500 | Is it gonna restart again?
02:55:26.060 | - Probably not, it depends.
02:55:28.060 | - Maybe earth is too--
02:55:28.900 | - It depends on the type of,
02:55:30.700 | what's the thing that kills it off, right?
02:55:32.820 | If it's a paperclip maximizer, not for the example,
02:55:36.100 | but some kind of very self-replicating,
02:55:39.620 | high on the capabilities, very low on the wisdom type thing.
02:55:44.380 | So whether that's gray goo, green goo, like nanobots
02:55:48.620 | or just a shitty misaligned AI
02:55:51.180 | that thinks it needs to turn everything into paperclips.
02:55:54.340 | If it's something like that,
02:55:57.700 | then it's gonna be very hard for life, complex life,
02:56:00.860 | because by definition, a paperclip maximizer
02:56:03.180 | is the ultimate instantiation of Moloch,
02:56:05.660 | deeply low complexity, over-optimization on a single thing,
02:56:08.620 | sacrificing everything else, turning the whole world into--
02:56:11.100 | - Although something tells me,
02:56:12.180 | like if we actually take a paperclip maximizer,
02:56:14.620 | it destroys everything.
02:56:16.180 | It's a really dumb system
02:56:17.820 | that just envelops the whole of earth.
02:56:20.780 | - And the universe beyond, yeah.
02:56:22.380 | - I didn't know that part, but okay, great.
02:56:26.580 | - That's the thought experiment anyway.
02:56:27.420 | - So it becomes a multi-planetary paperclip maximizer?
02:56:30.300 | - Well, it just propagates, I mean,
02:56:32.100 | it depends whether it figures out
02:56:33.660 | how to jump the vacuum gap.
02:56:35.060 | But again, I mean, this is all silly
02:56:38.660 | 'cause it's a hypothetical thought experiment,
02:56:40.340 | which I think doesn't actually have
02:56:41.220 | much practical application to the AI safety problem,
02:56:43.260 | but it's just a fun thing to play around with.
02:56:45.260 | But if by definition it is maximally intelligent,
02:56:47.820 | which means it is maximally good
02:56:49.820 | at navigating the environment around it
02:56:53.100 | in order to achieve its goal,
02:56:54.740 | but extremely bad at choosing goals in the first place,
02:56:58.060 | so again, we're talking on this orthogonality thing, right?
02:57:00.060 | It's very low on wisdom, but very high on capability.
02:57:03.720 | Then it will figure out how to jump the vacuum gap
02:57:05.680 | between planets and stars and so on,
02:57:07.400 | and thus just turn every atom it gets
02:57:09.360 | its hands on into paperclips.
02:57:10.960 | - Yeah, by the way, for people who don't--
02:57:12.800 | - Which is maximum virality, by the way.
02:57:14.600 | That's what virality is.
02:57:16.240 | - But does not mean that virality
02:57:17.800 | is necessarily all about maximizing paperclips.
02:57:20.040 | In that case, it is.
02:57:21.080 | So for people who don't know,
02:57:22.280 | this is just a thought experiment example
02:57:24.280 | of an AI system that has a goal
02:57:27.360 | and is willing to do anything to accomplish that goal,
02:57:30.400 | including destroying all life on Earth
02:57:32.280 | and all human life and all of consciousness in the universe
02:57:36.320 | for the goal of producing a maximum number of paperclips.
02:57:40.720 | Okay.
02:57:41.840 | - Or whatever its optimization function was
02:57:43.960 | that it was set at.
02:57:45.000 | - But don't you think--
02:57:45.840 | - It could be recreating Lexes.
02:57:47.920 | Maybe it'll tile the universe in Lex.
02:57:50.560 | - Go on.
02:57:51.600 | I like this idea.
02:57:52.440 | No, I'm just kidding.
02:57:53.360 | - That's better.
02:57:54.800 | That's more interesting than paperclips.
02:57:56.320 | - That could be infinitely optimal
02:57:57.640 | if I were to say so myself.
02:57:58.480 | - But if you ask me, it's still a bad thing
02:58:00.400 | because it's permanently capping
02:58:02.760 | what the universe could ever be.
02:58:04.200 | It's like, that's its end state.
02:58:05.880 | - Or achieving the optimal
02:58:07.840 | that the universe could ever achieve,
02:58:09.240 | but that's up to,
02:58:10.320 | different people have different perspectives.
02:58:12.480 | But don't you think within the paperclip world
02:58:15.520 | that would emerge,
02:58:16.920 | just like in the zeros and ones that make up a computer,
02:58:20.200 | that would emerge beautiful complexities?
02:58:22.860 | Like, it won't suppress,
02:58:27.800 | as you scale to multiple planets and throughout,
02:58:30.800 | there'll emerge these little worlds
02:58:33.240 | that on top of the fabric of maximizing paperclips,
02:58:38.240 | that would emerge like little societies of paperclip--
02:58:45.200 | - Well, then we're not describing
02:58:47.480 | a paperclip maximizer anymore
02:58:48.560 | because if you think of what a paperclip is,
02:58:51.560 | it is literally just a piece of bent iron, right?
02:58:55.560 | So if it's maximizing that throughout the universe,
02:58:58.920 | it's taking every atom it gets its hand on
02:59:01.000 | into somehow turning it into iron or steel,
02:59:04.200 | and then bending it into that shape,
02:59:05.360 | and then done and done.
02:59:06.760 | By definition, like paperclips,
02:59:08.840 | there is no way for, well, okay,
02:59:11.920 | so you're saying that paperclips somehow
02:59:13.880 | will just emerge and create through gravity or something.
02:59:17.920 | - No, no, no, no,
02:59:18.920 | 'cause there's a dynamic element to the whole system.
02:59:21.560 | It's not just, it's creating those paperclips.
02:59:24.760 | And the act of creating, there's going to be a process,
02:59:27.880 | and that process will have a dance to it
02:59:30.640 | 'cause it's not like sequential thing.
02:59:32.600 | There's a whole complex three-dimensional system
02:59:35.200 | of paperclips.
02:59:36.200 | People like string theory, right?
02:59:39.400 | It's supposed to be strings
02:59:40.280 | that are interacting in fascinating ways.
02:59:41.960 | I'm sure paperclips are very string-like.
02:59:44.520 | They can be interacting in very interesting ways
02:59:46.520 | as you scale exponentially through three-dimensional.
02:59:50.040 | I mean, I'm sure the paperclip maximizer
02:59:53.840 | has to come up with a theory of everything.
02:59:55.640 | It has to create wormholes, right?
02:59:58.280 | It has to break, like,
03:00:01.000 | it has to understand quantum mechanics.
03:00:02.640 | - I love your optimism. - It has to understand
03:00:04.120 | general relativity. - This is where I'd say
03:00:06.200 | we're going into the realm of pathological optimism,
03:00:08.400 | wherever it's. (laughs)
03:00:10.520 | - I'm sure there'll be a,
03:00:11.720 | I think there's an intelligence
03:00:14.840 | that emerges from that system.
03:00:16.280 | - So you're saying that basically intelligence
03:00:18.440 | is inherent in the fabric of reality and will find a way,
03:00:21.640 | kind of like Goldblum says, "Life will find a way."
03:00:23.840 | You think life will find a way
03:00:25.160 | even out of this perfectly homogenous dead soup.
03:00:29.480 | - It's not perfectly homogenous.
03:00:31.520 | It has to, it's perfectly maximal in the production.
03:00:34.920 | I don't know why people keep thinking it's homogenous.
03:00:37.360 | It maximizes the number of paperclips.
03:00:39.520 | That's the only thing.
03:00:40.360 | It's not trying to be homogenous.
03:00:42.000 | It's trying to- - True, true, true, true.
03:00:42.840 | - It's trying to maximize paperclips.
03:00:44.720 | - So you're saying that because,
03:00:48.840 | kind of like in "The Big Bang,"
03:00:50.640 | it seems like things, there were clusters,
03:00:53.120 | there was more stuff here than there.
03:00:54.880 | That was enough of an inhomogeneity
03:00:56.640 | that kickstarted the evolutionary process.
03:00:58.480 | - It's the little weirdnesses
03:00:59.520 | that will make it beautiful. - You think that even out of-
03:01:01.080 | - Yeah, complexity emerges.
03:01:03.000 | - Interesting, okay.
03:01:04.240 | Well, so how does that line up then
03:01:05.640 | with the whole heat death of the universe, right?
03:01:08.120 | 'Cause that's another sort of instantiation of this.
03:01:10.120 | It's like everything becomes so far apart and so cold
03:01:13.280 | and so perfectly mixed
03:01:16.720 | that it's like homogenous grayness.
03:01:20.440 | Do you think that even out of that homogenous grayness
03:01:23.320 | where there's no negative entropy,
03:01:27.280 | there's no free energy that we understand
03:01:30.960 | even from that new stuff?
03:01:34.000 | - Yeah, the paperclip maximizer
03:01:36.640 | or any other intelligent systems
03:01:38.200 | will figure out ways to travel to other universes
03:01:41.040 | to create big bangs within those universes
03:01:43.560 | or through black holes to create whole other worlds
03:01:46.160 | to break what we consider are the limitations of physics.
03:01:51.160 | (laughing)
03:01:52.760 | The paperclip maximizer will find a way if a way exists.
03:01:56.720 | And we should be humble to realize that we don't-
03:01:59.040 | - Yeah, but because it just wants to make more paperclips.
03:02:01.520 | So it's gonna go into those universes
03:02:02.680 | and turn them into paperclips.
03:02:03.800 | - Yeah, but we, humans, not humans,
03:02:07.480 | but complex systems exist on top of that.
03:02:10.320 | We're not interfering with it.
03:02:13.000 | This complexity emerges from-
03:02:16.280 | - The simple base state.
03:02:17.400 | - The simple base state.
03:02:18.320 | - Whether it's, yeah, whether it's, you know,
03:02:20.320 | Planck lengths or paperclips as the base unit.
03:02:22.640 | - Yeah, you can think of like the universe
03:02:25.720 | as a paperclip maximizer
03:02:27.320 | because it's doing some dumb stuff.
03:02:28.880 | Like physics seems to be pretty dumb.
03:02:31.320 | It has, like, I don't know if you can summarize it.
03:02:34.160 | - Yeah, the laws are fairly basic
03:02:37.320 | and yet out of them amazing complexity emerges.
03:02:39.640 | - And its goals seem to be pretty basic and dumb.
03:02:43.440 | If you can summarize its goals,
03:02:45.280 | I mean, I don't know what's a nice way,
03:02:47.000 | maybe laws of thermodynamics could be,
03:02:52.000 | I don't know if you can assign goals to physics,
03:02:55.080 | but if you formulate in the sense of goals,
03:02:57.980 | it's very similar to paperclip maximizing
03:03:00.840 | in the dumbness of the goals.
03:03:02.960 | But the pockets of complexity as it emerge
03:03:06.520 | is where beauty emerges.
03:03:07.960 | That's where life emerges.
03:03:09.120 | That's where intelligence, that's where humans emerge.
03:03:12.120 | And I think we're being very down
03:03:14.000 | on this whole paperclip maximizer thing.
03:03:16.240 | Now, the reason we hate it-
03:03:17.680 | - I think, yeah, because what you're saying
03:03:19.000 | is that you think that the force of emergence itself
03:03:24.000 | is another like unwritten, not unwritten,
03:03:27.480 | but like another baked in law of reality.
03:03:31.800 | And you're trusting that emergence will find a way
03:03:34.880 | to even out of seemingly the most molecule awful,
03:03:39.880 | plain outcome emergence will still find a way.
03:03:42.240 | I love that as a philosophy.
03:03:43.480 | I think it's very nice.
03:03:44.360 | I would wield it carefully
03:03:47.320 | because there's large error bars on that
03:03:50.520 | and the certainty of that.
03:03:52.080 | - Yeah.
03:03:52.920 | How about we build the paperclip maximizer and find out?
03:03:55.960 | - Classic, yeah.
03:03:56.880 | Moloch is doing cartwheels, man.
03:03:59.160 | - Yeah.
03:04:00.000 | But the thing is, it will destroy humans in the process,
03:04:03.240 | which is the reason we really don't like it.
03:04:05.320 | We seem to be really holding on
03:04:07.240 | to this whole human civilization thing.
03:04:10.240 | Would that make you sad if AI systems that are beautiful,
03:04:13.840 | that are conscious, that are interesting
03:04:15.680 | and complex and intelligent,
03:04:17.300 | ultimately lead to the death of humans?
03:04:20.040 | Would that make you sad?
03:04:21.040 | - If humans led to the death of humans?
03:04:23.000 | Sorry.
03:04:23.840 | - Like if they would supersede humans.
03:04:25.520 | - Oh, if some AI?
03:04:26.840 | - Yeah, AI would end humans.
03:04:30.320 | I mean, that's the reason why I'm in some ways
03:04:33.320 | less emotionally concerned about AI risk
03:04:36.440 | than say, bio risk.
03:04:39.480 | Because at least with AI,
03:04:41.560 | there's a chance, if we're in this hypothetical
03:04:44.100 | where it wipes out humans,
03:04:45.640 | but it does it for some higher purpose,
03:04:48.240 | it needs our atoms and energy to do something,
03:04:51.060 | at least now the universe is going on
03:04:53.120 | to do something interesting.
03:04:55.240 | Whereas if it wipes everything,
03:04:56.800 | bio just kills everything on Earth and that's it.
03:05:00.120 | And there's no more,
03:05:01.440 | Earth cannot spawn anything more meaningful
03:05:03.340 | in the few hundred million years it has left.
03:05:05.960 | 'Cause it doesn't have much time left.
03:05:07.680 | Then, yeah, I don't know.
03:05:13.480 | So one of my favorite books I've ever read
03:05:15.680 | is "Novacene" by James Lovelock,
03:05:18.200 | who sadly just died.
03:05:19.920 | He wrote it when he was like 99.
03:05:22.080 | He died aged 102, so it's a fairly new book.
03:05:24.280 | And he sort of talks about that,
03:05:27.040 | that he thinks it's sort of building off this Gaia theory
03:05:30.840 | where Earth is living some form of intelligence itself
03:05:35.840 | and that this is the next step, right?
03:05:38.400 | Is this, whatever this new intelligence
03:05:41.740 | that is maybe silicon-based
03:05:43.360 | as opposed to carbon-based goes on to do.
03:05:46.160 | And it's really sort of, in some ways,
03:05:47.680 | an optimistic but really fatalistic book.
03:05:49.820 | I don't know if I fully subscribe to it,
03:05:52.400 | but it's a beautiful piece to read anyway.
03:05:54.400 | So am I sad by that idea?
03:05:56.760 | I think so, yes.
03:05:57.920 | And actually, yeah, this is the reason
03:05:59.240 | why I'm sad by the idea,
03:06:00.120 | because if something is truly brilliant and wise and smart
03:06:03.880 | and truly super intelligent,
03:06:05.580 | it should be able to figure out abundance.
03:06:09.320 | So if it figures out abundance,
03:06:11.720 | it shouldn't need to kill us off.
03:06:12.800 | It should be able to find a way for us.
03:06:15.060 | There's plenty, the universe is huge.
03:06:17.460 | There should be plenty of space for it to go out
03:06:19.920 | and do all the things it wants to do
03:06:21.760 | and give us a little pocket
03:06:23.440 | where we can continue doing our things
03:06:24.880 | and we can continue to do things and so on.
03:06:26.640 | And again, if it's so supremely wise,
03:06:28.920 | it shouldn't even be worried
03:06:29.760 | about the game theoretic considerations
03:06:31.920 | that by leaving us alive,
03:06:33.000 | we'll then go and create another super intelligent agent
03:06:35.120 | that it then has to compete against,
03:06:36.600 | because it should be omni-wise and smart enough
03:06:38.280 | to not have to concern itself with that.
03:06:40.680 | - Unless it deems humans to be kind of assholes.
03:06:44.200 | The humans are a source of a lose-lose kind of dynamics.
03:06:50.880 | - Well, yes and no.
03:06:55.440 | Moloch is, that's why I think it's important to say.
03:06:57.560 | - But maybe humans are the source of Moloch.
03:07:00.200 | - No, I mean, I think game theory is the source of Moloch.
03:07:03.840 | And 'cause Moloch exists in non-human systems as well.
03:07:08.000 | It happens within agents within a game
03:07:10.560 | in terms of, it applies to agents,
03:07:13.920 | but it can apply to a species
03:07:18.520 | that's on an island of animals, rats out-competing
03:07:22.680 | the ones that massively consume all the resources
03:07:25.520 | are the ones that are gonna win out
03:07:26.840 | over the more chill, socialized ones.
03:07:29.680 | And so, creates this Malthusian trap.
03:07:31.960 | Moloch exists in little pockets in nature as well.
03:07:34.480 | So it's not a strictly human thing.
03:07:35.960 | - I wonder if it's actually a result of consequences
03:07:38.320 | of the invention of predator and prey dynamics.
03:07:41.540 | Maybe AI will have to kill off every organism that--
03:07:46.540 | - Now you're talking about killing off competition.
03:07:50.280 | - Not competition, but just like the way,
03:07:55.280 | it's like the weeds or whatever
03:08:00.000 | in a beautiful flower garden.
03:08:01.720 | - Parasites.
03:08:02.560 | - The parasites, yeah.
03:08:03.960 | On the whole system.
03:08:05.240 | Now, of course, it won't do that completely.
03:08:08.520 | It'll put them in a zoo like we do with parasites.
03:08:10.560 | - It'll ring fence.
03:08:11.400 | - Yeah, and there'll be somebody doing a PhD
03:08:13.160 | on like, they'll prod humans with a stick
03:08:15.840 | and see what they do.
03:08:18.920 | But I mean, in terms of letting us run wild
03:08:22.440 | outside of the geographically constricted region,
03:08:26.040 | that might be, it might decide to,
03:08:30.020 | no, I think there's obviously the capacity
03:08:33.240 | for beauty and kindness and non-Moloch behavior
03:08:38.240 | amidst humans.
03:08:39.360 | So I'm pretty sure AI will preserve us.
03:08:41.460 | Let me, I don't know if you answered the aliens question.
03:08:46.280 | - Oh, I didn't.
03:08:47.120 | - You had a good conversation with Toby Ord
03:08:49.520 | about various sides of the universe.
03:08:52.200 | I think, did he say, now I'm forgetting,
03:08:54.680 | but I think he said it's a good chance we're alone.
03:08:58.200 | - So the classic Fermi paradox question is,
03:09:02.220 | there are so many spawn points,
03:09:06.040 | and yet it didn't take us that long
03:09:10.640 | to go from harnessing fire
03:09:12.120 | to sending out radio signals into space.
03:09:15.160 | So surely given the vastness of space,
03:09:18.240 | we should be, and even if only a tiny fraction of those
03:09:20.760 | create life and other civilizations too,
03:09:23.160 | we should be, the universe should be very noisy.
03:09:24.920 | There should be evidence of Dyson spheres or whatever,
03:09:27.920 | like at least radio signals and so on.
03:09:29.760 | But seemingly things are very silent out there.
03:09:32.560 | Now, of course, it depends on who you speak to.
03:09:33.840 | Some people say that they're getting signals all the time
03:09:35.880 | and so on, and I don't wanna make
03:09:37.520 | an epistemic statement on that.
03:09:38.680 | But it seems like there's a lot of silence.
03:09:43.160 | And so that raises this paradox.
03:09:45.720 | And then say, the Drake equation.
03:09:50.720 | So the Drake equation is basically just a simple thing
03:09:55.160 | of trying to estimate the number of possible civilizations
03:09:58.360 | within the galaxy by multiplying the number
03:09:59.920 | of stars created per year by the number of stars
03:10:02.960 | that have planets, planets that are habitable, blah, blah,
03:10:04.640 | blah, so all these different factors.
03:10:06.560 | And then you plug in numbers into that,
03:10:08.600 | and depending on the range of your lower bound
03:10:12.200 | and your upper bound point estimates that you put in,
03:10:15.200 | you get out a number at the end
03:10:16.880 | for the number of civilizations.
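[Editor's note: for reference, the Drake equation being described here multiplies seven factors together. In its standard form,

N = R_* \times f_p \times n_e \times f_l \times f_i \times f_c \times L

where R_* is the rate of star formation in the galaxy, f_p the fraction of stars with planets, n_e the average number of potentially habitable planets per planet-bearing star, f_l the fraction of those on which life arises, f_i the fraction of those that develop intelligence, f_c the fraction of those that become detectable (for example via radio), and L the length of time such a civilization remains detectable. These symbols are the conventional ones, not terms used by the speakers.]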
03:10:18.560 | But what Toby and his crew did differently,
03:10:22.480 | was this: Toby is a researcher at the Future of Humanity Institute,
03:10:25.680 | and they realized that it's basically
03:10:31.240 | a statistical quirk that if you put in point sources,
03:10:33.880 | even if you think you're putting
03:10:34.720 | in conservative point sources,
03:10:36.040 | because on some of these variables,
03:10:38.760 | the uncertainty is so large,
03:10:41.120 | it spans like maybe even like a couple of hundred
03:10:43.640 | of orders of magnitude.
03:10:45.040 | By putting in point sources,
03:10:47.480 | it's always going to lead to overestimates.
03:10:50.920 | And so they, by putting stuff on a log scale,
03:10:54.360 | or actually they did it on like a log log scale
03:10:56.120 | on some of them, and then like ran the simulation
03:10:58.840 | across the whole bucket of uncertainty,
03:11:02.640 | across all those orders of magnitude.
03:11:04.360 | When you do that, then actually the number comes out
03:11:07.280 | much, much smaller.
03:11:08.280 | And that's the more statistically rigorous,
03:11:10.720 | mathematically correct way of doing the calculation.
03:11:13.400 | It's still a lot of hand-waving, as science goes,
03:11:15.440 | it's definitely,
03:11:18.720 | I don't know what the right analogy is, but it's hand-wavy.
03:11:21.280 | And anyway, when they did this,
03:11:24.760 | and then they did a Bayesian update on it as well
03:11:27.040 | to like factor in the fact that there is no evidence
03:11:30.160 | that we're picking up, because no evidence
03:11:31.880 | is actually a form of evidence, right?
03:11:33.960 | And the long and short of it comes out
03:11:36.600 | that we're roughly around 70% to be the only
03:11:41.400 | intelligent civilization in our galaxy thus far,
03:11:45.120 | and around 50/50 in the entire observable universe.
03:11:47.880 | Which sounds so crazily counterintuitive,
03:11:50.240 | but their math is legit.
03:11:53.440 | - Well, yeah, the math around this particular equation,
03:11:55.680 | which the equation is ridiculous on many levels,
03:11:57.840 | but the powerful thing about the equation
03:12:03.240 | is there's different things, different components
03:12:06.800 | that can be estimated, and the error bars on which
03:12:10.880 | can be reduced with science.
03:12:13.840 | And hence, throughout, since the equation came out,
03:12:17.360 | the error bars have been coming out on different aspects.
03:12:19.640 | - Yeah, that's very true.
03:12:21.000 | - And so that, it almost kind of says,
03:12:23.300 | this gives you a mission to reduce the error bars
03:12:27.020 | on these estimates over a period of time,
03:12:30.000 | and once you do, you can better and better understand.
03:12:32.760 | Like in the process of reducing the error bars,
03:12:34.720 | you'll get to understand actually what is the right way
03:12:38.480 | to find out where the aliens are,
03:12:41.480 | how many of them there are, and all those kinds of things.
03:12:43.960 | So I don't think it's good to use that for an estimation.
03:12:47.800 | I think you do have to think from like,
03:12:50.120 | more like from first principles,
03:12:51.840 | just looking at what life is on Earth.
03:12:53.860 | Like, and trying to understand the very physics-based,
03:12:59.320 | biology, chemistry, biology-based question of what is life.
03:13:04.240 | Maybe computation-based, what the fuck is this thing?
03:13:07.920 | And that, like how difficult is it to create this thing?
03:13:12.160 | It's one way to say like, how many planets like this
03:13:14.500 | are out there, all that kind of stuff,
03:13:16.160 | but it feels like from our very limited knowledge
03:13:20.560 | perspective, the right way is to think,
03:13:23.640 | how does, what is this thing, and how does it originate?
03:13:28.200 | From very simple, non-life things,
03:13:32.000 | how does complex, life-like things emerge?
03:13:36.200 | From a rock to a bacteria, protein,
03:13:42.480 | and these like weird systems that encode information
03:13:46.360 | and pass information from-- - Self-replicate.
03:13:48.480 | - Self-replicate, and then also select each other
03:13:51.080 | and mutate in interesting ways such that they can adapt
03:13:53.440 | and evolve and build increasingly more complex systems.
03:13:56.600 | - Right, well, it's a form of information processing.
03:13:58.800 | - Yeah. - Right?
03:13:59.760 | - Right.
03:14:00.600 | - Well, it's information transfer,
03:14:02.960 | but then also an energy processing,
03:14:05.640 | which then results in, I guess, information processing?
03:14:09.100 | Maybe I'm getting bogged down.
03:14:09.940 | - Well, it's doing some modification,
03:14:12.040 | and yeah, the input is some energy.
03:14:14.480 | - Right, it's able to extract, yeah,
03:14:17.640 | extract resources from its environment
03:14:20.720 | in order to achieve a goal.
03:14:22.760 | - But the goal doesn't seem to be clear.
03:14:25.000 | - Right.
03:14:25.840 | (Lex laughing)
03:14:26.680 | - The goal is, well, the goal is to make more of itself.
03:14:29.480 | - Yeah, but in a way that increases,
03:14:33.800 | I mean, I don't know if evolution
03:14:36.360 | is a fundamental law of the universe,
03:14:39.880 | but it seems to want to replicate itself
03:14:44.240 | in a way that maximizes the chance of its survival.
03:14:47.700 | - Individual agents within an ecosystem do, yes.
03:14:51.120 | Yes, evolution itself doesn't give a fuck.
03:14:53.320 | - Right.
03:14:54.160 | - It's very, it don't care.
03:14:56.040 | It's just like, oh, you optimize it.
03:14:58.560 | Well, at least it's certainly, yeah,
03:15:01.600 | it doesn't care about the welfare
03:15:03.520 | of the individual agents within it,
03:15:05.160 | but it does seem to, I don't know.
03:15:06.520 | I think the mistake is that we're anthropomorphizing,
03:15:09.760 | to even try and give evolution a mindset,
03:15:13.580 | because it is, there's a really great post
03:15:17.240 | by Eliezer Yudkowsky on LessWrong
03:15:21.500 | called "An Alien God," and he talks about
03:15:26.500 | the mistake we make when we try and put our mind,
03:15:29.220 | think through things from an evolutionary perspective,
03:15:32.340 | as though we're giving evolution some kind of agency
03:15:35.180 | and what it wants.
03:15:36.360 | Yeah, worth reading, but yeah.
03:15:39.580 | - I would like to say that having interacted
03:15:42.620 | with a lot of really smart people
03:15:43.980 | that say that anthropomorphization is a mistake,
03:15:46.940 | I would like to say that saying
03:15:48.760 | that anthropomorphization is a mistake is a mistake.
03:15:51.620 | I think there's a lot of power in anthropomorphization,
03:15:54.980 | if I can only say that word correctly one time.
03:15:57.500 | I think that's actually a really powerful way
03:16:00.820 | to reason through things, and I think people,
03:16:03.020 | especially people in robotics,
03:16:04.380 | seem to run away from it as fast as possible,
03:16:07.160 | and I just--
03:16:09.060 | - Can you give an example of how it helps in robotics?
03:16:12.220 | - Oh, in that our world is a world of humans,
03:16:19.060 | and to see robots as fundamentally just tools
03:16:23.340 | runs away from the fact that we live in a world,
03:16:27.940 | a dynamic world of humans,
03:16:30.020 | that all these game theory systems we've talked about,
03:16:33.760 | that a robot that ever has to interact with humans,
03:16:37.580 | and I don't mean intimate friendship interaction,
03:16:40.820 | I mean in a factory setting,
03:16:42.500 | where it has to deal with the uncertainty of humans,
03:16:44.420 | all that kind of stuff,
03:16:45.540 | you have to acknowledge that the robot's behavior
03:16:49.860 | has an effect on the human,
03:16:52.020 | just as much as the human has an effect on the robot,
03:16:54.580 | and there's a dance there,
03:16:56.100 | and you have to realize that this entity,
03:16:58.420 | when a human sees a robot,
03:17:00.540 | this is obvious in a physical manifestation of a robot,
03:17:04.220 | they feel a certain way,
03:17:05.820 | they have a fear, they have uncertainty,
03:17:07.820 | they have their own personal life projections,
03:17:11.880 | we have to have pets and dogs,
03:17:13.260 | and the thing looks like a dog,
03:17:14.840 | they have their own memories of what a dog is like,
03:17:17.120 | they have certain feelings,
03:17:18.540 | and that's gonna be useful in a safety setting,
03:17:20.740 | safety critical setting,
03:17:22.240 | which is one of the most trivial settings for a robot,
03:17:25.100 | in terms of how to avoid any kind of dangerous situations,
03:17:29.220 | and a robot should really consider that
03:17:32.500 | in navigating its environment,
03:17:34.860 | and we humans are right to reason about
03:17:37.840 | how a robot should consider navigating its environment
03:17:41.460 | through anthropomorphization.
03:17:43.780 | - I also think our brains are designed to think
03:17:46.520 | in human terms,
03:17:51.520 | like game theory,
03:17:53.860 | I think is best applied in the space of human decisions,
03:18:00.320 | and so--
03:18:03.440 | - Right, you're dealing,
03:18:04.280 | I mean, with things like AI,
03:18:06.160 | AI is, they are,
03:18:07.400 | you know, we can somewhat,
03:18:10.520 | like I don't think it's,
03:18:12.000 | the reason I say anthropomorphization,
03:18:14.160 | we need to be careful with,
03:18:15.480 | is because there is a danger of overly applying,
03:18:19.440 | overly, wrongly assuming that this artificial intelligence
03:18:23.480 | is going to operate in any similar way to us,
03:18:25.880 | because it is operating
03:18:27.760 | on a fundamentally different substrate,
03:18:29.680 | like even dogs, or even mice, or whatever,
03:18:33.280 | in some ways, like,
03:18:34.320 | anthropomorphizing them is less of a mistake, I think,
03:18:37.520 | than an AI, even though it's an AI we built, and so on,
03:18:40.200 | because at least we know that they're running
03:18:41.980 | from the same substrate,
03:18:43.380 | and they've also evolved from the same,
03:18:45.580 | out of the same evolutionary process, you know,
03:18:48.660 | they've followed this evolution of, like,
03:18:50.620 | needing to compete for resources,
03:18:52.420 | and needing to find a mate, and that kind of stuff,
03:18:55.180 | whereas an AI that has just popped into existence
03:18:58.260 | somewhere on a cloud server, let's say,
03:19:01.180 | you know, or whatever, however it runs,
03:19:02.820 | and whatever, whether, I don't know,
03:19:04.260 | whether they have an internal experience,
03:19:05.540 | I don't think they necessarily do,
03:19:07.020 | in fact, I don't think they do,
03:19:08.100 | but the point is, is that to try and apply
03:19:11.900 | any kind of modeling of, like,
03:19:13.960 | thinking through problems and decisions
03:19:15.340 | in the same way that we do,
03:19:16.900 | has to be done extremely carefully,
03:19:18.740 | because they are, like, they're so alien,
03:19:23.540 | their method of whatever their form of thinking is,
03:19:26.300 | it's just so different,
03:19:27.180 | because they've never had to evolve,
03:19:29.140 | you know, in the same way.
03:19:30.620 | - Yeah, beautifully put,
03:19:32.260 | I was just playing devil's advocate.
03:19:33.740 | I do think, in certain contexts,
03:19:35.700 | anthropomorphization is not gonna hurt you.
03:19:37.900 | - Yes.
03:19:38.740 | - Engineers run away from it too fast.
03:19:39.940 | - I can see that.
03:19:41.140 | - But for the most point, you're right.
03:19:43.660 | Do you have advice for young people today,
03:19:48.660 | like the 17-year-old that you were,
03:19:51.340 | of how to live life you can be proud of,
03:19:55.380 | how to have a career you can be proud of,
03:19:57.780 | in this world full of Molochs?
03:19:59.960 | - Think about the win-wins,
03:20:02.460 | look for win-win situations,
03:20:05.500 | and be careful not to, you know,
03:20:09.940 | overly use your smarts to convince yourself
03:20:12.220 | that something is win-win when it's not,
03:20:13.620 | so that's difficult,
03:20:14.620 | and I don't know how to advise, you know,
03:20:16.340 | people on that,
03:20:17.620 | 'cause it's something I'm still figuring out myself,
03:20:20.180 | but have that as a sort of default MO.
03:20:23.940 | Don't see things, everything as a zero-sum game,
03:20:28.060 | try to find the positive-sum-ness,
03:20:29.740 | and, like, find ways,
03:20:30.660 | if there doesn't seem to be one,
03:20:32.700 | consider playing a different game,
03:20:34.220 | so I would suggest that.
03:20:35.700 | Do not become a professional poker player,
03:20:39.020 | 'cause people always ask,
03:20:40.940 | they're like, "Oh, she's a pro, I wanna do that too."
03:20:43.220 | Fine, you could've done it if you were,
03:20:44.860 | you know, when I started out,
03:20:46.260 | it was a very different situation back then.
03:20:48.260 | Poker is, you know,
03:20:49.620 | a great game to learn
03:20:52.580 | in order to understand the ways to think,
03:20:54.940 | and I recommend people learn it,
03:20:56.780 | but don't try making a living from it these days,
03:20:58.380 | it's almost, it's very, very difficult,
03:21:00.260 | to the point of being impossible.
03:21:03.620 | And then,
03:21:04.460 | really, really be aware of how much time you spend
03:21:09.660 | on your phone and on social media,
03:21:12.100 | and really try and keep it to a minimum.
03:21:14.260 | Be aware that, basically,
03:21:15.820 | every moment that you spend on it is bad for you.
03:21:18.100 | So, doesn't mean to say you can never do it,
03:21:20.140 | but just have that running in the background.
03:21:22.500 | I'm doing a bad thing for myself right now.
03:21:25.260 | I think that's the general rule of thumb.
03:21:27.220 | - Of course, about becoming a professional poker player,
03:21:31.140 | if there is a thing in your life
03:21:32.900 | that's like that,
03:21:35.380 | and nobody can convince you otherwise,
03:21:37.660 | just fucking do it.
03:21:38.660 | Don't listen to anyone's advice.
03:21:42.660 | Find a thing that you can't be talked out of, too.
03:21:46.340 | That's a thing.
03:21:47.340 | - I like that, yeah.
03:21:48.500 | - You were a lead guitarist in a metal band.
03:21:54.460 | Did I write that down for something?
03:21:56.940 | What did you do it for?
03:22:02.620 | Was it the performing?
03:22:03.620 | Was it the pure, the music of it?
03:22:07.140 | Was it just being a rock star?
03:22:08.980 | Why'd you do it?
03:22:10.020 | - So, we only ever played two gigs.
03:22:15.460 | We didn't last, you know, it wasn't a very,
03:22:17.940 | we weren't famous or anything like that.
03:22:20.620 | But I was very into metal.
03:22:25.340 | Like, it was my entire identity.
03:22:27.500 | Sort of from the age of 16 to 23.
03:22:29.420 | - What's the best metal band of all time?
03:22:31.940 | Don't ask me that, it's so hard to answer.
03:22:34.060 | - So, I know I had a long argument with,
03:22:39.820 | I'm a guitarist, more like a classic rock guitarist.
03:22:43.980 | So, you know, I've had friends
03:22:45.820 | who are very big Pantera fans,
03:22:47.340 | and so there was often arguments about,
03:22:50.780 | what's the better metal band, Metallica versus Pantera?
03:22:53.900 | This is a more kind of '90s maybe discussion.
03:22:57.340 | But I was always on the side of Metallica,
03:23:00.380 | both musically and in terms of performance
03:23:02.740 | and the depth of lyrics and so on.
03:23:05.780 | But they were, basically everybody was against me.
03:23:10.260 | 'Cause if you're a true metal fan,
03:23:12.260 | I guess the idea goes is you can't possibly
03:23:14.540 | be a Metallica fan.
03:23:16.180 | - I think that's crazy. - 'Cause Metallica's pop,
03:23:17.500 | it's like, they sold out.
03:23:19.580 | - Metallica are metal.
03:23:20.620 | Like, they were the, I mean, again,
03:23:24.420 | you can't say who was the godfather of metal, blah, blah, blah.
03:23:26.540 | But they were so groundbreaking and so brilliant.
03:23:31.540 | I mean, you've named literally two of my favorite bands.
03:23:35.620 | When you ask that question of who are my favorites,
03:23:37.260 | like those were two that came up.
03:23:39.020 | A third one is Children of Bodom,
03:23:41.500 | who I just think, ugh, they just tick all the boxes for me.
03:23:44.980 | Yeah, I don't know.
03:23:48.260 | Nowadays, I kind of sort of feel like a repulsion to the,
03:23:54.260 | I was that myself.
03:23:55.700 | Like, I'd be like, who do you prefer more?
03:23:56.900 | Come on, who's, like, no, you have to rank them.
03:23:58.940 | But it's like this false zero-sum-ness that's like, why?
03:24:01.740 | They're so additive.
03:24:02.660 | Like, there's no conflict there.
03:24:04.620 | - Although, when people ask that kind of question
03:24:07.020 | about anything, movies, I feel like it's hard work
03:24:11.460 | and it's unfair, but it's, you should pick one.
03:24:14.940 | - Yeah. - Like, and I,
03:24:16.260 | that's actually, you know, the same kind of,
03:24:17.820 | it's like a fear of a commitment.
03:24:19.380 | - Right. - When people ask me,
03:24:20.220 | what's your favorite band?
03:24:21.060 | It's like, phew.
03:24:22.580 | But I, you know, it's good to pick.
03:24:24.300 | - Exactly, and thank you for, yeah,
03:24:25.460 | thank you for the tough question, yeah.
03:24:27.660 | - Well, maybe not in the context
03:24:29.500 | when a lot of people are listening.
03:24:31.820 | - Can I just like, what, why does this matter?
03:24:33.980 | No, it does, it's--
03:24:35.580 | - Are you still into metal?
03:24:37.380 | - Funny enough, I was listening to a bunch
03:24:38.500 | before I came over here.
03:24:39.820 | - Oh, like, do you use it for, like, motivation
03:24:43.420 | or get you in a certain--
03:24:44.620 | - Yeah, I was weirdly listening to 80s hair metal
03:24:46.900 | before I came.
03:24:48.100 | - Does that count as metal?
03:24:49.820 | - I think so.
03:24:50.700 | It's like proto-metal and it's happy.
03:24:53.420 | It's optimistic, happy, proto-metal.
03:24:55.780 | Yeah, I mean, these things, you know,
03:24:57.780 | all these genres bleed into each other.
03:25:00.020 | But yeah, sorry, to answer your question about guitar playing,
03:25:03.420 | my relationship with it was kind of weird
03:25:05.060 | in that I was deeply uncreative.
03:25:09.140 | My objective would be to hear some really hard technical solo
03:25:12.180 | and then learn it, memorize it, and then play it perfectly.
03:25:15.980 | But I was incapable of trying to write my own music.
03:25:19.220 | Like, the idea was just absolutely terrifying.
03:25:23.140 | But I was also just thinking, I was like,
03:25:24.660 | it'd be kind of cool to actually try starting a band again
03:25:28.620 | and getting back into it and write.
03:25:31.580 | But it's scary.
03:25:34.020 | - It's scary.
03:25:34.860 | I mean, I put out some guitar playing
03:25:36.860 | just other people's covers.
03:25:38.140 | Like, I play Comfortably Numb on the internet.
03:25:41.500 | It's scary, too.
03:25:42.500 | It's scary putting stuff out there.
03:25:44.260 | And I had this similar kind of fascination
03:25:47.700 | with technical playing, both on piano and guitar.
03:25:50.300 | And one of the reasons I started learning guitar
03:25:55.300 | is from Ozzy Osbourne, Mr. Crowley's solo.
03:26:01.740 | And one of the first solos I learned is that,
03:26:04.300 | and there's a beauty to it.
03:26:07.580 | There's a lot of beauty to it.
03:26:08.420 | - Tapping, right?
03:26:09.260 | - Yeah, there's some tapping, but it's just really fast.
03:26:13.500 | - Beautiful, like arpeggios.
03:26:15.700 | - Yeah, arpeggios, yeah.
03:26:16.900 | But there's a melody that you can hear through it,
03:26:19.620 | but there's also build up.
03:26:21.620 | It's a beautiful solo, but it's also technically,
03:26:23.900 | just visually the way it looks when a person's watching,
03:26:27.060 | you feel like a rockstar playing it.
03:26:29.660 | But it ultimately has to do with technical.
03:26:32.140 | You're not developing the part of your brain
03:26:36.420 | that I think requires you to generate beautiful music.
03:26:40.620 | It is ultimately technical in nature.
03:26:42.340 | And so that took me a long time to let go of that
03:26:45.820 | and just be able to write music myself.
03:26:48.980 | And that's a different journey, I think.
03:26:52.260 | I think that journey is a little bit more inspired
03:26:55.060 | in the blues world, for example,
03:26:57.060 | where improvisation is more valued,
03:26:58.620 | obviously in jazz and so on.
03:26:59.940 | But I think ultimately it's a more rewarding journey
03:27:04.940 | 'cause you get to, your relationship with the guitar
03:27:08.420 | then becomes a kind of escape from the world
03:27:12.180 | where you can create, create.
03:27:14.100 | I mean, creating stuff is-
03:27:16.140 | - And it's something you work with,
03:27:18.780 | 'cause my relationship with my guitar
03:27:20.020 | was like it was something to tame and defeat.
03:27:22.780 | (laughing)
03:27:23.620 | - Yeah, it's a challenge.
03:27:24.780 | - Which was kind of what my whole personality was back then.
03:27:26.820 | Like I was just very, like, as I said,
03:27:28.820 | like very competitive, very just like must
03:27:31.380 | bend this thing to my will.
03:27:33.140 | Whereas writing music is you work, it's like a dance.
03:27:36.620 | You work with it.
03:27:37.860 | - But I think because of the competitive aspect,
03:27:39.900 | for me at least, that's still there,
03:27:42.460 | which creates anxiety about playing publicly
03:27:45.740 | or all that kind of stuff.
03:27:47.180 | I think there's just like a harsh self criticism
03:27:49.380 | within the whole thing.
03:27:50.780 | It's really, really, it's really tough.
03:27:53.380 | - I wanna hear some of your stuff.
03:27:55.420 | - I mean, there's certain things that feel really personal.
03:27:59.300 | And on top of that, as we talked about poker offline,
03:28:03.380 | there's certain things that you get to a certain height
03:28:05.580 | in your life, and that doesn't have to be very high,
03:28:07.140 | but you get to a certain height
03:28:09.060 | and then you put it aside for a bit
03:28:11.740 | and it's hard to return to it
03:28:13.180 | because you remember being good.
03:28:15.940 | And it's hard to, like you being at a very high level
03:28:19.860 | in poker, it might be hard for you to return to poker
03:28:22.540 | every once in a while and you enjoy it,
03:28:24.620 | knowing that you're just not as sharp as you used to be
03:28:26.900 | because you're not doing it every single day.
03:28:29.460 | That's something I always wonder with, I mean,
03:28:31.620 | even just like in chess with Kasparov,
03:28:33.780 | some of these greats, just returning to it,
03:28:35.820 | it's just, it's almost painful.
03:28:37.100 | - Yes, I can imagine, yeah.
03:28:38.420 | - And I feel that way with guitar too,
03:28:40.660 | 'cause I used to play like every day a lot.
03:28:44.180 | So returning to it is painful 'cause like,
03:28:47.260 | it's like accepting the fact that this whole ride is finite
03:28:51.460 | and you have a prime, there's a time when you were really
03:28:56.460 | good and now it's over and now-
03:28:58.780 | - We're on a different chapter of life.
03:29:00.340 | It's like, oh, but I miss that, yeah.
03:29:02.940 | - But you can still discover joy within that process.
03:29:06.500 | It's been tough, especially with some level of like,
03:29:09.140 | as people get to know you, there's,
03:29:12.100 | and people film stuff, you don't have the privacy
03:29:16.620 | of just sharing something with a few people around you.
03:29:20.580 | - Yeah.
03:29:21.580 | - That's a beautiful privacy that-
03:29:23.380 | - That's a good point.
03:29:24.220 | - With the internet, it's just disappearing.
03:29:26.020 | - Yeah, that's a really good point.
03:29:27.740 | - Yeah.
03:29:29.140 | But all those pressures aside, if you really,
03:29:31.860 | you can step up and still enjoy the fuck out of
03:29:34.940 | a good musical performance.
03:29:37.900 | What do you think is the meaning of this whole thing?
03:29:42.340 | - What's the meaning of life?
03:29:43.780 | - Wow.
03:29:45.620 | - It's in your name, as we talked about,
03:29:47.340 | you have to live up, do you feel the requirement
03:29:50.980 | to have to live up to your name?
03:29:52.580 | - Because live?
03:29:55.220 | - Yeah.
03:29:56.060 | - No, because I don't see it.
03:29:57.780 | I mean, my, again, it's kind of like,
03:30:00.660 | no, I don't know.
03:30:03.300 | - And as we talked about-
03:30:04.140 | - Because my full name is Olivia.
03:30:05.340 | - Yeah.
03:30:06.180 | - So I can retreat in that and be like,
03:30:07.220 | oh, Olivia, what does that even mean?
03:30:09.060 | Live up to live.
03:30:12.020 | And no, I can't say I do,
03:30:13.820 | 'cause I've never thought of it that way.
03:30:15.340 | - And then your name backwards is evil.
03:30:17.220 | That's what we also talked about.
03:30:18.820 | There's like layers of evil.
03:30:22.060 | - I mean, I feel the urge to live up to that,
03:30:24.500 | to be the inverse of evil.
03:30:26.420 | - Yeah.
03:30:27.740 | - Or even better, 'cause I don't think,
03:30:30.420 | is the inverse of evil good,
03:30:31.980 | or is good something completely separate to that?
03:30:34.860 | I think my intuition says it's the latter,
03:30:36.980 | but I don't know.
03:30:37.820 | Anyway, again, getting in the weeds.
03:30:39.860 | - What is the meaning of all this?
03:30:42.100 | - Of life.
03:30:42.940 | Why are we here?
03:30:45.180 | - I think to explore, have fun, and understand,
03:30:51.820 | and make more of here and to keep the game going.
03:30:55.940 | - Of here?
03:30:56.780 | More of here?
03:30:57.620 | - More of-
03:30:58.460 | - More of this, whatever this is?
03:31:00.020 | - More of experience.
03:31:02.060 | Just to have more of experience,
03:31:03.180 | and ideally, positive experience.
03:31:06.460 | And more complex, I guess,
03:31:10.460 | to try and put it into a vaguely scientific term,
03:31:13.060 | make it so that the program required,
03:31:18.340 | the length of code required to describe the universe
03:31:21.460 | is as long as possible.
03:31:22.660 | And highly complex, and therefore interesting.
03:31:26.260 | Because again, I know we banged the metaphor to death,
03:31:32.180 | but like, tiled with paperclips
03:31:37.180 | doesn't require that much of a code to describe.
03:31:41.100 | Obviously, maybe something emerges from it,
03:31:42.700 | but that steady state, assuming a steady state,
03:31:44.820 | it's not very interesting.
03:31:45.620 | Whereas it seems like our universe is over time
03:31:49.660 | becoming more and more complex and interesting.
03:31:51.900 | There's so much richness and beauty and diversity
03:31:53.940 | on this Earth, and I want that to continue and get more.
03:31:56.260 | I want more diversity,
03:31:58.580 | and in the very best sense of that word,
03:32:01.860 | is to me the goal of all this.
03:32:06.860 | - Yeah, and somehow have fun in the process.
03:32:10.820 | - Yes.
03:32:12.060 | - 'Cause we do create a lot of fun things along,
03:32:14.340 | instead of in this creative force,
03:32:17.940 | and all the beautiful things we create,
03:32:19.460 | somehow there's like a funness to it.
03:32:22.460 | And perhaps that has to do with the finiteness of life,
03:32:25.100 | the finiteness of all these experiences,
03:32:27.100 | which is what makes them kind of unique.
03:32:31.300 | Like the fact that they end, there's this, whatever it is,
03:32:35.420 | falling in love or creating a piece of art,
03:32:40.420 | or creating a bridge, or creating a rocket,
03:32:45.540 | or creating a, I don't know,
03:32:48.820 | just the businesses that build something
03:32:53.140 | or solve something.
03:32:56.140 | The fact that it is born and it dies,
03:33:00.820 | somehow embeds it with fun, with joy,
03:33:05.820 | for the people involved.
03:33:08.420 | I don't know what that is, the finiteness of it.
03:33:11.420 | - It can do, some people struggle with the,
03:33:13.820 | I mean a big thing I think that one has to learn
03:33:17.860 | is being okay with things coming to an end.
03:33:21.100 | And in terms of like projects and so on,
03:33:25.620 | people cling onto things beyond what they're meant to be,
03:33:28.660 | beyond what is reasonable.
03:33:30.460 | - And I'm gonna have to come to terms
03:33:33.900 | with this podcast coming to an end.
03:33:35.820 | I really enjoyed talking to you.
03:33:37.060 | I think it's obvious, as we've talked about many times,
03:33:40.180 | you should be doing a podcast.
03:33:41.380 | You should, you're already doing a lot of stuff publicly
03:33:45.980 | to the world, which is awesome.
03:33:47.420 | You're a great educator, you're a great mind,
03:33:49.180 | you're a great intellect.
03:33:50.300 | But it's also this whole medium of just talking
03:33:52.860 | is also fun. - It is good.
03:33:53.780 | - It's a fun one.
03:33:54.620 | - It really is good.
03:33:55.540 | And it's just, it's nothing but like,
03:33:58.860 | oh, it's just so much fun.
03:34:00.580 | And you can just get into so many,
03:34:03.220 | yeah, there's this space to just explore
03:34:05.300 | and see what comes and emerges and yeah.
03:34:08.020 | - Yeah, to understand yourself better
03:34:09.380 | and if you're talking to others to understand them better
03:34:11.540 | and together with them.
03:34:12.620 | I mean, you should do your own podcast,
03:34:15.380 | but you should also do a podcast with Cee
03:34:17.020 | as we talked about.
03:34:18.740 | The two of you have such different minds
03:34:22.880 | that like melt together in just hilarious ways,
03:34:26.100 | fascinating ways, just the tension of ideas there
03:34:29.340 | is really powerful.
03:34:30.260 | But in general, I think you got a beautiful voice.
03:34:33.220 | So thank you so much for talking today.
03:34:35.340 | Thank you for being a friend.
03:34:36.540 | Thank you for honoring me with this conversation
03:34:39.380 | and with your valuable time.
03:34:40.420 | Thanks, Liv. - Thank you.
03:34:42.580 | - Thanks for listening to this conversation with Liv Boeree.
03:34:45.180 | To support this podcast,
03:34:46.300 | please check out our sponsors in the description.
03:34:48.820 | And now let me leave you with some words
03:34:50.900 | from Richard Feynman.
03:34:53.040 | "I think it's much more interesting to live not knowing
03:34:56.260 | than to have answers which might be wrong.
03:34:59.300 | I have approximate answers and possible beliefs
03:35:01.820 | and different degrees of uncertainty
03:35:03.580 | about different things,
03:35:05.120 | but I'm not absolutely sure of anything.
03:35:08.540 | And there are many things I don't know anything about
03:35:11.460 | such as whether it means anything to ask why we're here.
03:35:15.460 | I don't have to know the answer.
03:35:17.580 | I don't feel frightened not knowing things
03:35:20.760 | by being lost in a mysterious universe without any purpose,
03:35:24.300 | which is the way it really is as far as I can tell."
03:35:27.620 | Thank you for listening and hope to see you next time.
03:35:31.720 | (upbeat music)
03:35:34.300 | (upbeat music)