
The Books I Read In November 2022


Chapters

0:00 Cal's intro
0:57 Life is Hard
3:11 Superintelligence
6:24 Life 3.0
8:18 Sacred Nature
10:29 Cinema Speculation


00:00:00.000 | Alright, well, like we do each month, I like to talk about the books I read in the previous
00:00:08.200 | month.
00:00:09.960 | As long-time listeners know, my goal is to try to read at least five books a month, and
00:00:13.600 | I do this by just regularly putting aside time to read.
00:00:17.360 | I read in the morning, I often will put aside time in the evening to read.
00:00:20.480 | If I have extra time during lunch, I'll read.
00:00:22.640 | Occasionally, I'll time block a session to read if I'm getting close to finishing a book.
00:00:26.960 | It's just a little bit of effort put into freeing up time to look at the pages of a
00:00:32.000 | book and a commitment to not instead dedicating that time to your phone.
00:00:36.400 | It's surprising how many pages you can actually get through.
00:00:38.560 | Alright, so Jesse, I want to go through the five books I read in November of 2022.
00:00:47.440 | This is, I guess, the order in which I finished these.
00:00:49.960 | Alright, number one, Life is Hard: How Philosophy Can Help Us Find Our Way by Kieran Setiya.
00:01:03.560 | Kieran wrote Midlife, which I loved, and I talk about a lot in my book, Digital Minimalism.
00:01:09.800 | He's a philosopher at MIT.
00:01:13.160 | So in Life is Hard, he's pulling big ideas from philosophy to help you get through life
00:01:19.160 | against the backdrop of the fact that life is hard and hard things are going to happen.
00:01:22.520 | So obviously, it's a popular subject.
00:01:25.360 | It sometimes seems like the only consistent answer we get to this is Stoicism.
00:01:31.560 | Kieran is drawing much more widely from the world of philosophy to try to help provide
00:01:36.360 | answers about how to get through life.
00:01:38.580 | Really good book, some really strong ideas.
00:01:40.440 | I think the earlier chapters, which drew from the experience of the disability community to
00:01:46.720 | give insight into how to deal with loss or bad things happening, were
00:01:52.120 | particularly strong.
00:01:54.320 | It's a philosophical stance toward life in which you don't focus on what has been taken away
00:01:58.800 | from you, but instead on what is possible or what you can do.
00:02:03.600 | And there's this whole philosophical backdrop to this about most things you're not going
00:02:08.120 | to get time to do anyway.
00:02:09.360 | So it's not actually rational to focus on, well, now I've lost this, it's been
00:02:13.880 | taken from my life and I can't do this one thing.
00:02:15.640 | Well, there's a thousand things you're not going to get to.
00:02:18.040 | Focus on the things you can do and how you can actually build a life of real meaning around
00:02:21.520 | loss.
00:02:22.520 | I thought that was really good.
00:02:24.280 | My main critique of this book is that throughout there are these somewhat heavy-handed injections
00:02:30.960 | of, I don't know what else to call it other than wokeness, I suppose, that take you out
00:02:37.280 | of the book.
00:02:38.280 | There are these sections where it seems more like Kieran is suddenly narrowing
00:02:43.440 | his audience to fellow academics and just saying, don't yell at me, don't yell at me,
00:02:46.920 | don't yell at me.
00:02:47.920 | And I think the book would perhaps have been more broad and timeless without those
00:02:52.720 | interjections.
00:02:53.720 | It really did feel like an editor at some point said, someone might get mad about this,
00:02:58.640 | someone might get mad about that.
00:02:59.640 | And he had to go back and add these self-defensive sections, and I don't know, I think it hurt
00:03:03.840 | the timelessness of the book a little bit.
00:03:05.720 | All right.
00:03:07.200 | Book number two, Superintelligence: Paths, Dangers, Strategies by the philosopher Nick Bostrom.
00:03:15.920 | So this book's from 2014, really popular among the tech set, the techno-libertarian
00:03:23.320 | set who's concerned about artificial intelligence.
00:03:27.840 | Basically Bostrom, who runs a center at Oxford that looks at threats to humanity's future with
00:03:33.920 | a very straight face, very systematically goes through all of these scenarios of how
00:03:39.560 | superintelligent AI might essentially take over the world and potentially
00:03:46.120 | convert the world into a fuel source as it sort of takes over the whole galaxy to try
00:03:50.080 | to fuel its computation.
00:03:51.560 | So it's all abstract, all thought experiments, but like, let's think through, as superintelligent
00:03:56.920 | artificial intelligence arises, all the different things that could happen, all the ways it
00:04:00.400 | could unfold.
00:04:01.400 | Spoiler alert, most of them are bad for humanity.
00:04:05.600 | So I don't know, it's an interesting book because he's taking this issue very seriously.
00:04:11.520 | I mean, when I'm reading this book, I keep alternating between perceiving it as bracing
00:04:17.720 | and perceiving it as absurd.
00:04:21.680 | And I bounce back and forth, which I think is the mark of a provocative book.
00:04:25.720 | This caught the attention of a lot of tech types, caught the attention of Bill Gates,
00:04:30.600 | Elon Musk.
00:04:31.600 | I'm thinking of various people who blurbed this book and said, we should be worried about
00:04:35.320 | this.
00:04:36.640 | So it's interesting.
00:04:37.800 | I think what it also reveals is this strong belief among these type of particular brand
00:04:42.000 | of thinkers who are concerned about AI and think we should start preparing now to deal
00:04:46.600 | with these threats is they have this certain determinism for the future of humanity that's
00:04:53.520 | really rooted in this idea that like, of course, we need to get to a place where we expand
00:04:57.800 | beyond Earth and harness more of the resources of the galaxy of the solar system and beyond.
00:05:03.720 | And it's this sort of sci-fi type of extra planetary future vision for humanity.
00:05:12.920 | And they're really, I think it's just kind of baked into the thinking, I think, of a
00:05:16.320 | lot of these thinkers is like, this is where we're heading.
00:05:18.640 | So once you pick up on this, reading the AI prognosticators makes sense.
00:05:23.160 | Something like Elon Musk and how he thinks about Mars suddenly makes a lot more sense.
00:05:26.800 | Like, oh, this is a very common mindset.
00:05:29.280 | We're going to leave Earth and we need to build Dyson spheres around the sun.
00:05:33.040 | And how much energy is available in the solar system and how a million years from now can
00:05:37.040 | we harness all of that?
00:05:38.240 | So the reason why the Nick Bostroms of the world are so concerned about AI becoming
00:05:44.600 | a superintelligence is, A, they think it will stop us from this vision of expanding
00:05:50.040 | throughout the universe.
00:05:52.680 | And B, maybe if we're really careful, it could help accelerate that.
00:05:56.260 | So that undercurrent is something that I would say most people don't think about.
00:06:00.120 | But in this particular circle, it is just assumed that, yeah, the whole ball
00:06:05.280 | game is that 20,000 years from now, we had better be harnessing 80% of the energy from the sun
00:06:10.800 | as we become a multi-solar-system species.
00:06:14.840 | So along those same lines, I read Life 3.0, Being Human in the Age of Artificial Intelligence
00:06:21.480 | by Max Tegmark at MIT.
00:06:25.700 | I found this book to be much more energetic and interesting than Bostrom's book.
00:06:30.300 | Bostrom's is very ontological.
00:06:31.620 | It very clearly breaks down these different possibilities and goes through them.
00:06:36.140 | Tegmark has a lot of energy, a lot of originality in his thinking.
00:06:39.740 | Tegmark, he's a physicist at MIT, but he's like very broad.
00:06:43.980 | He touches on a bunch of different topics.
00:06:46.940 | And so you get a lot of this.
00:06:47.940 | It was a much more enjoyable read, in my opinion.
00:06:50.180 | But he's all over the place. It's like, let's talk about AI,
00:06:52.800 | let's get terms right,
00:06:55.540 | but let's also talk about all these different ways that we might harness energy
00:06:58.540 | from the universe.
00:06:59.540 | He bounces around to all these ideas.
00:07:01.580 | He's a smart guy.
00:07:02.580 | He's a creative thinker.
00:07:03.860 | He's also much more clear about this than Bostrom, but just like
00:07:07.420 | Bostrom, he is very much aligned with: of course, of course, of course, the whole point of humanity
00:07:11.180 | is to leave the planet and expand throughout the solar system and beyond.
00:07:14.940 | And it's just taken for granted that that's what the whole ball game is about.
00:07:19.120 | It's a more eclectic book than Bostrom's, a more fun book than Bostrom's.
00:07:23.300 | But I enjoyed it.
00:07:25.300 | Tegmark is an interesting guy.
00:07:26.300 | He is the guy, by the way, that's responsible for all of those quotes you see from famous
00:07:29.980 | scientist and engineer types saying, I'm worried about AI.
00:07:34.860 | So Bill Gates, Stephen Hawking, and Elon Musk, all three of their quotes about how
00:07:41.940 | we should be worried about AI?
00:07:43.740 | That's all Tegmark's doing.
00:07:45.940 | He is the one who organized this big conference in Puerto Rico where he brought all these
00:07:50.740 | people together and really kicked off this idea of we have to start thinking now about
00:07:55.820 | the future of AI before we actually get to a place where it's dangerous.
00:07:59.340 | And so he's really the cultural orchestra conductor of this movement of big names in tech and science
00:08:05.660 | expressing concern about AI.
00:08:07.500 | Tegmark is a huge initiator of that movement.
00:08:13.020 | I also read Sacred Nature: Restoring Our Ancient Bond with the Natural World by
00:08:17.700 | Karen Armstrong.
00:08:18.700 | I'm a huge fan of Armstrong.
00:08:20.580 | I think she's one of the most interesting and talented religious historians writing
00:08:25.580 | today.
00:08:27.020 | The main value of Sacred Nature is that it's a short book that will very quickly bring you into
00:08:32.620 | the Armstrong philosophy, which she's developed over multiple books now, about the nature of
00:08:40.140 | religion pre-Enlightenment being something that is based on action and ritual and activity,
00:08:45.980 | that insight is gained through doing.
00:08:48.580 | It's not gained in a linguistic sense.
00:08:50.980 | It's not gained by just studying a text or deciding in the abstract whether or not to
00:08:54.540 | assent to a creed.
00:08:57.220 | Religious insights were often experiential.
00:09:00.660 | By doing these things, you over time directly experience, as lived experience, the insights of
00:09:06.140 | the religion.
00:09:07.140 | Until you're actually doing all the different things, you're not getting insight.
00:09:10.260 | You can't evaluate a religion and decide to follow it or not or if it's true or not just
00:09:14.300 | based off of reading its books.
00:09:15.780 | This is this key Armstrong insight.
00:09:18.120 | Her best book on this is The Case for God.
00:09:20.700 | Sacred Nature is short, but you get a really good sampling of her thinking.
00:09:27.260 | As an actual proposal for rethinking our relationship with nature, I don't know, it's a combination
00:09:35.880 | of this incredibly insightful breakdown of the way that various spiritual traditions
00:09:41.680 | saw an energy infused throughout all of nature.
00:09:44.460 | It's very fascinating how the Abrahamic religions moved away from that.
00:09:49.720 | By situating their religion in a particular time in history, they actually changed the relationship
00:09:54.680 | with nature, made it more instrumental and less infused with the divine.
00:09:58.420 | All that is fascinating.
00:09:59.420 | And that's combined with a sort of very middle-of-the-road, standard climate
00:10:05.940 | change polemic; there's no insight there,
00:10:09.380 | just: therefore, we should care more about climate change.
00:10:12.440 | It's like that part almost feels tacked on to what is otherwise incredibly insightful
00:10:16.260 | religious scholarship.
00:10:19.660 | Last book, my favorite actually of the five, Cinema Speculation by Quentin Tarantino.
00:10:26.820 | Fantastic book.
00:10:27.960 | It's basically just him talking about his experience with the cinema in the seventies,
00:10:32.540 | which was an influential period for him as a kid.
00:10:37.380 | Each of the chapters is sort of anchored with a particular movie.
00:10:48.940 | So he'll be talking about Bullitt or he'll be talking about Deliverance, but then it
00:10:53.300 | goes all over the place.
00:10:55.660 | The reason I love this book is that it's so original in its tone and approach.
00:11:01.580 | So this captures in prose, essentially the essence of Quentin Tarantino, right?
00:11:08.220 | So it's divergent, it's obsessed with pop culture.
00:11:11.580 | It has a foundation of deep intellectual confidence, and it bounces around:
00:11:17.300 | he knows all these movies, he knows these directors, he's connecting them in interesting
00:11:20.940 | ways.
00:11:21.940 | He's jumping from this to that and back again.
00:11:23.140 | He's not trying to show off.
00:11:24.580 | And yet there's a profound intellect behind the critique, but he's not
00:11:29.340 | trying to prove that he's smart.
00:11:30.700 | So it's like watching a good Tarantino movie, but in the written form.
00:11:34.740 | And there's so little innovation that happens these days, I think, in idea nonfiction.
00:11:38.220 | I mean, there's so much sameness in tone.
00:11:41.700 | And it's a bunch of people my age who are putting on their sort of deep professor
00:11:45.360 | voice and trying to be, I'm so smart, and let me be very careful and resigned, or whatever.
00:11:51.220 | And then Tarantino comes in and it's just like a fire hose.
00:11:53.660 | It's like, boom, it's energy and divergent and he's brilliant, but he doesn't care.
00:11:57.740 | He's all over the place and you come away having learned a lot.
00:12:01.020 | So it's one of the most original works just in terms of tone and delivery of idea nonfiction
00:12:05.260 | I've read in a long time.
00:12:06.340 | So whether you're a movie geek or not, I enjoyed Cinema Speculation.
00:12:11.500 | One warning though: if you get it on audio like I did, Tarantino reads the first chapter
00:12:16.860 | and it's great.
00:12:17.860 | You're like, oh, here we go.
00:12:19.740 | We've got eight hours of him, because his voice matches the content.
00:12:24.820 | But it switches to a third-party narrator for the second chapter.
00:12:28.140 | So it's a little bit of a bait and switch.
00:12:29.140 | Be prepared for that; the narrator's fine,
00:12:30.660 | but I think Tarantino should have read the whole thing.
00:12:32.580 | But he's a busy guy.
00:12:33.580 | All right, Jesse, those are my five books from November.
00:12:37.940 | With Life is Hard, when you were talking about the ending there, it kind of reminded
00:12:42.420 | me of when you were talking about caveats with Sam Harris last week.
00:12:45.700 | Yeah, there was a little bit of that.
00:12:48.020 | Because, I mean, well, also he's just pulling from these philosophers over a 200-to-300-year
00:12:51.460 | period.
00:12:52.460 | So it's very timeless and broad, but then in half the chapters, at the end, it's
00:12:58.060 | like, and where this all should lead you is to very narrow, basically,
00:13:05.820 | 2022 elite academic thinking on political issues, like whatever that current,
00:13:13.020 | very contemporary thought is, it all just leads you there.
00:13:16.660 | And that felt tacked on. You know, Kierkegaard and Nietzsche, they didn't know about
00:13:23.620 | postmodern-influenced critical theories. Five years from now, the trends in academia are going to be
00:13:27.540 | different.
00:13:28.540 | So that's what's going to make it seem less timely.
00:13:29.540 | Like he obviously was writing this at home during the pandemic, post-George Floyd.
00:13:32.860 | And like, this was really influencing him.
00:13:34.860 | And he was thinking about people reading this and how they're going to react.
00:13:38.460 | But even like five years from now, I think where it gets specific and contemporary is
00:13:41.620 | going to feel dated, which is fine
00:13:43.900 | if you're writing a book that is contemporary and founded in a particular moment. But this
00:13:47.980 | is a book about timeless philosophies that covered all sorts of different periods,
00:13:52.620 | all sorts of different innovations in political thought, all sorts of different
00:13:56.860 | intellectual favorite ideas of their time, all these philosophies where
00:14:02.700 | all these different things were going on.
00:14:05.060 | And so maybe that's just me, but I just felt like with this book, you didn't have
00:14:08.540 | to caveat.
00:14:09.540 | Yeah.
00:14:10.540 | It was caveated.
00:14:11.540 | Cause you were talking to Sam about that.
00:14:14.540 | I think you've got to trust the reader.
00:14:16.340 | Yeah.
00:14:17.340 | Yeah.
00:14:18.340 | I mean, you know, if you need to prove to a certain subset, like I'm with your tribe,
00:14:21.220 | wear a shirt with a slogan.
00:14:22.940 | But when you try to put too many caveats into your writing, it doesn't work.
00:14:27.860 | I think it reduces the impact.
00:14:29.780 | Trust your reader.
00:14:30.780 | The readers can add the caveats.
00:14:31.780 | They can apply it to whatever moment they're reading it in and draw those insights.
00:14:36.380 | And so when you add these caveats to either avoid being yelled at or to signal
00:14:40.480 | you're on a particular team, regardless of what that team is, I always think that diminishes
00:14:44.820 | the value of nonfiction writing.
00:14:46.700 | Trust the reader.
00:14:47.700 | The reader knows what they care about.
00:14:49.020 | They're sophisticated.
00:14:50.460 | They will take your ideas and adapt them and apply them to their own lives, to their
00:14:54.820 | own situations, to the causes they care about.
00:14:56.700 | They'll stress test them against these other things going on.
00:14:58.820 | So if your book is not particularly about these issues, grafting the issues on doesn't
00:15:04.940 | end up making it better.
00:15:07.100 | It doesn't end up, I mean, maybe it does protect you from, I don't know, some nasty tweet,
00:15:12.880 | but no one really cares.
00:15:13.880 | That's the reality: no one really cares about you.
00:15:15.840 | And this is what I've decided.
00:15:17.520 | No one cares about me.
00:15:18.520 | No one really is following it that closely.
00:15:20.400 | People see what you wrote, right?
00:15:21.880 | Is there something in here that's useful to me? And they move on with their life.
00:15:24.480 | So yeah, but I did talk about that with Harris on the podcast, my philosophy of caveats.
00:15:29.080 | And we've talked about it on the show before: how caveating, as in, well, this advice might
00:15:33.140 | not apply here,
00:15:34.140 | is very relevant to one-on-one conversation, where it's just reasonable and polite, but
00:15:39.400 | doesn't work well when it's one-to-many.
00:15:42.000 | When you're doing one-to-many, broadcasting information, then you've got to let the recipients
00:15:47.480 | add the caveats.
00:15:48.480 | So it's two different modes of communication.
00:15:50.280 | [outro music]