
Ep. 190: Managing Idea Notebooks, Taming Instant Messaging, and Identifying Keystone Habits


Chapters

0:00 Cal's intro
1:22 Cal Reacts to the News
21:31 Cal talks about My Body Tutor and ExpressVPN
24:03 Cal listens to a call about managing idea notebooks
29:40 Doctor who wants to learn advanced math
34:06 Obsessed with instant messaging
47:18 Tempering a racing mind
51:09 Cal talks about Workable and Grammarly
54:30 Selecting good keystone habits


00:00:00.000 | I'm Cal Newport and this is Deep Questions, episode 190.
00:00:11.720 | I'm here at my Deep Work HQ joined by my professor, professor, I'm back to that again, joined
00:00:19.840 | by my producer, Professor Jesse.
00:00:24.420 | I'm still sick, Jesse, this is why I'm mixing up your title, but again, as we talked about
00:00:30.120 | last time, I am a hero for continuing to podcast, even though I have a bad cold.
00:00:40.320 | It's kind of funny, a lot of times you say professor, so now when you say producer, I
00:00:43.320 | get thrown off.
00:00:44.320 | Yeah, I'm trying to, I've been trying to pressure you to go back to school, is what this is
00:00:49.600 | about.
00:00:50.600 | It's subliminal, subliminal, subliminal pressuring.
00:00:54.220 | We had a question, we didn't do it, but one of the questions we were considering was from,
00:00:58.780 | maybe it was one of the calls, was from someone who wanted to later in life study theoretical
00:01:04.540 | mathematics.
00:01:08.100 | Was that you they called in?
00:01:09.900 | You've been inspired?
00:01:10.900 | He was a doctor, I'm not a doctor.
00:01:11.900 | I see, I see.
00:01:12.900 | He was inspired by, you were inspired by me calling you professor.
00:01:16.380 | All right, so we have listener calls, it's a listener calls episode.
00:01:19.940 | We've got some good calls queued up.
00:01:23.580 | Last week, however, we debuted a new segment, Cal reacts to the news, where we take something
00:01:30.260 | that's in the news and I give you my opinions on it.
00:01:34.580 | I thought that went well, so we thought we would do it again.
00:01:37.760 | So on Monday's episode, we talked about the New York Times decision to stop encouraging
00:01:43.860 | their reporters to use Twitter.
00:01:47.060 | Now we have another article, which Jesse just handed to me when I got to the HQ today, and
00:01:52.740 | it has to do about the other major Twitter story in the news this week, which is Elon
00:01:57.180 | Musk buying a 9% stake in Twitter and gaining a seat on the Twitter board of directors.
00:02:05.940 | This seems to have been causing a lot of debate.
00:02:07.580 | I mean, I've talked to some people about this in academic circles and et cetera, and people
00:02:11.900 | are really all over the place.
00:02:13.580 | People are all over the place about Musk in particular.
00:02:18.580 | I think people have a hard time trying to figure out, do I like this guy or hate this guy?
00:02:23.780 | Do I think he's interesting or do I think he's terrible?
00:02:25.140 | Because he doesn't make any of those categorizations easy.
00:02:29.860 | So it's always unclear.
00:02:31.340 | Jesse, let me ask you, pro-Musk, anti-Musk?
00:02:34.140 | I think I'm pro-Musk, yeah, for sure.
00:02:37.380 | He gets a lot of press because he's so ridiculously rich.
00:02:41.940 | He's going to be around for a long time.
00:02:43.740 | He's not even that old.
00:02:44.740 | He's not that old.
00:02:45.740 | Yeah, I think he's not that much older than us.
00:02:49.180 | He's like 10 years older.
00:02:50.180 | Yeah, I mean, he's going to be in the news.
00:02:52.260 | He's in the news all the time now, and he's going to be in the news for a long time.
00:02:57.060 | Yeah.
00:02:58.060 | I mean, I'm looking forward to the Walter Isaacson biography on Musk.
00:03:00.940 | I don't know when that's coming out, but he's such an interesting, weird character.
00:03:05.300 | I think it's going to be a fascinating book, and Isaacson, because of his stature, can
00:03:08.300 | get access.
00:03:09.300 | I mean, I think the thing about Musk is that he has done objectively sort of impressive
00:03:17.420 | era-defining things in the world of business.
00:03:19.180 | He just made electric cars an industry.
00:03:21.460 | He just made affordable space travel an industry.
00:03:25.820 | Those are crazy things to do, and just became the richest man in the world doing that.
00:03:29.180 | So it's like a larger-than-life character.
00:03:32.020 | When he interacts with the world, however, he does so in like a really weird way.
00:03:37.180 | He'll be joking.
00:03:38.500 | He'll say non sequiturs.
00:03:41.140 | He has no particular tribal allegiances, so then he's not particularly super conservative.
00:03:48.820 | So the conservatives are like, "I guess we like this guy.
00:03:50.460 | I don't know if we like this guy."
00:03:52.060 | He's not very interested in wokeism, so more of the left, the farther left, is like, "I
00:03:58.740 | guess we just don't like this guy."
00:04:00.260 | No one's really sure how they feel about him.
00:04:01.700 | But anyways, the more proximate question here is how do we feel about him buying a 9% stake
00:04:07.660 | in Twitter?
00:04:08.660 | This is the article that Jesse gave me.
00:04:10.020 | It is an article by Professor Scott Galloway from NYU.
00:04:16.180 | It's Galloway's dissection and opinions on Elon's purchase of Twitter stock.
00:04:24.180 | So I thought what I would do here is summarize the major point of Galloway, give you my reaction
00:04:29.020 | to that point, and then talk about a couple of other things that Galloway says that I
00:04:33.900 | somewhat disagree on.
00:04:34.900 | So let's start with Galloway's main point, which is one I actually agree on.
00:04:38.420 | And let me actually preface this all by saying if you have to choose between me and him on
00:04:43.260 | anything having to do with Twitter, you should probably trust Scott.
00:04:47.180 | He's actually been involved.
00:04:48.260 | He's had a stake in Twitter.
00:04:49.780 | He was part of an activist shareholder movement that helped get Jack Dorsey removed from the
00:04:55.020 | Twitter board of directors.
00:04:56.020 | So this is someone that actually knows a lot about it.
00:04:57.820 | I don't, but I do have a microphone in front of me, so I figured we should talk about it.
00:05:02.060 | So here's Galloway's main point.
00:05:05.580 | Twitter is not doing well.
00:05:07.700 | This is largely acknowledged.
00:05:08.900 | This is why Musk was able to buy so much stock at basically a discount.
00:05:14.020 | Galloway acknowledges smart move by Elon.
00:05:17.500 | It's an undervalued stock.
00:05:19.660 | And it's undervalued because Twitter is not doing well.
00:05:21.620 | There's different ways to look at it.
00:05:22.940 | Galloway has a bunch of charts.
00:05:24.140 | One of them is of ad revenue.
00:05:26.720 | Twitter is doing what, $4.5 billion in 2021 compared to $115 billion for Meta and $209
00:05:34.060 | billion for Google.
00:05:35.660 | Everyone sleeps on Google.
00:05:36.660 | They don't realize how much it still dominates the advertising market.
00:05:40.220 | You know, Meta had to create, it created its own ecosystem in which it can sell ads.
00:05:46.340 | Google is basically still selling ads on the rest of the internet.
00:05:50.020 | So people forget how much money they're producing.
00:05:52.540 | Galloway also talks about the enterprise value per daily active user.
00:05:56.480 | For Twitter it's $131 compared to almost $300 for Meta.
00:06:00.040 | So they're not doing a very good job of actually getting value out of their active user.
00:06:05.420 | So what should be done about this?
00:06:08.360 | Galloway has a solution that he calls #ExpletiveDeletedObvious.
00:06:15.820 | And that solution is move to a subscription model.
00:06:21.880 | Here's Galloway's elaboration.
00:06:24.600 | Corporate users and users with large followings would pay for a fraction of the value they
00:06:27.640 | receive.
00:06:28.640 | I have long advocated for this model.
00:06:30.520 | By shifting the company's revenue source from advertisers to users, subscription aligns
00:06:33.920 | economic incentives with user experience rather than user exploitation.
00:06:38.600 | As I hinted, I agree with that.
00:06:42.060 | I agree with that move.
00:06:43.100 | I think it would be an exciting move, not just for Twitter, but because of the precedent
00:06:46.520 | it would set for how social media could operate in the future.
00:06:52.000 | I think Galloway is absolutely right when he argues ad-supported social media platforms
00:07:01.320 | have completely misaligned incentives.
00:07:04.480 | I wrote about this a while ago for The New Yorker.
00:07:06.200 | I wrote a piece a couple years ago about indie social media where I got into this more.
00:07:10.320 | But the issue is, with an ad-supported social media platform, all that matters is engagement.
00:07:15.740 | How long do we keep you on the platform?
00:07:17.240 | The more you're on there, the more ads we can put in front of you, but also the more
00:07:20.040 | data we can gather from you to target those ads.
00:07:24.480 | And neither of those objectives are aligned with what is going to be the most satisfying
00:07:30.280 | experience for a user or what's going to be the best experience for a user from even just
00:07:34.920 | a privacy standpoint or any other standpoint you might want to look at.
00:07:39.200 | When you put all the data you can find about a user into a black box machine learning algorithm
00:07:43.480 | to have it figure out what to show you so that you'll stick around more, what you're
00:07:48.360 | doing is getting artificial neural networks to learn the darkest recesses of the human
00:07:54.880 | brain to figure out about the brainstem, to figure out about our tribal reactions, our
00:08:00.580 | fear reactions.
00:08:01.580 | There's very powerful things in the human psyche.
00:08:04.560 | And those artificial neural networks that are taking in all this data and running their
00:08:09.220 | multi-armed bandit reinforcement algorithm to try to figure out what makes you stick
00:08:12.920 | around longer than others, it's just figuring out stuff that darker souls
00:08:18.160 | in the past had also deduced.
00:08:20.960 | Let's make you angry.
00:08:22.420 | Let's make you sad.
00:08:23.940 | Let's get your back up against the wall and feel like there's a fight to be had.
00:08:28.200 | And Twitter is really good at that because the main grist for Twitter is individual interaction,
00:08:34.480 | right?
00:08:35.480 | It's words from people and you interacting with words from other people.
00:08:40.680 | It's not pictures of things.
00:08:42.360 | It's not articles or videos that maybe people are talking about.
00:08:45.360 | It is just direct human interaction going through a curation algorithm that's just
00:08:50.520 | trying to maximize engagement and it creates terrible Lord of the Flies, everyone going
00:08:56.640 | after everyone.
00:08:58.120 | Seven seconds after being on Twitter, you feel upset.
00:09:01.080 | That is the outcome.
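
(For readers curious what the "multi-armed bandit reinforcement algorithm" loop Cal describes might look like, here is a minimal epsilon-greedy sketch in Python. It is purely illustrative, not any platform's actual ranking code; the content categories and the simulated "engagement" numbers are invented for the example.)

import random

# Hypothetical content categories the recommender can choose between.
ARMS = ["outrage", "fear", "tribal", "cute_animals", "news"]

counts = {arm: 0 for arm in ARMS}          # times each category was shown
total_reward = {arm: 0.0 for arm in ARMS}  # summed engagement observed

def average_reward(arm):
    # Average seconds of engagement per impression (0 if never shown yet).
    return total_reward[arm] / counts[arm] if counts[arm] else 0.0

def choose_post(epsilon=0.1):
    # Mostly exploit the best-performing category, occasionally explore.
    if random.random() < epsilon:
        return random.choice(ARMS)
    return max(ARMS, key=average_reward)

def record_engagement(arm, seconds_on_platform):
    # Update the running estimate after seeing how long the user stuck around.
    counts[arm] += 1
    total_reward[arm] += seconds_on_platform

# Simulated feedback loop: pretend "outrage" content reliably holds attention longer.
for _ in range(1000):
    arm = choose_post()
    engagement = random.gauss(45, 10) if arm == "outrage" else random.gauss(20, 10)
    record_engagement(arm, max(engagement, 0.0))

print(max(ARMS, key=average_reward))  # almost always prints "outrage"

The point of the toy loop is the feedback: whatever holds attention longest gets shown more, which is exactly the misaligned incentive being described here.
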
00:09:02.080 | And so I think Galloway is onto something.
00:09:04.240 | If the money is from ads, then the money is from users wanting to pay that ad revenue.
00:09:09.880 | I mean, not from ads, I'm sorry, subscriptions.
00:09:12.320 | The income comes from users wanting to pay those subscription fees.
00:09:15.240 | Why do users want to pay subscriptions to things?
00:09:17.280 | Not because they use it all the time, but because they value what they get when they do use it.
00:09:24.160 | It's a completely different model.
00:09:26.480 | And now you want the experience to be informative, uplifting, interesting, entertaining.
00:09:30.560 | It's a completely different incentive.
00:09:32.280 | Now I don't know if I share Galloway's optimism that this would create a monster
00:09:38.760 | company.
00:09:40.840 | I think once you start charging, you might have a severe contraction of user base.
00:09:47.920 | But I think it would be a very profitable company and a very useful company.
00:09:50.680 | And I hope that actually happens.
00:09:52.600 | Now there's different ways to go forward here.
00:09:54.120 | It could be the people who post have to pay, but anyone can read.
00:09:57.680 | I think that's probably fine.
00:09:59.980 | I think he didn't mention this, but I think getting rid of pseudo anonymity would be good.
00:10:04.360 | Real people talking to other real people.
00:10:06.800 | You kind of have to stand by what you're doing.
00:10:10.400 | I think that would be good.
00:10:11.400 | There's a lot of other details to work out, but I really do think it probably would become
00:10:14.360 | a better experience if it was subscription based and it would teach the world.
00:10:18.960 | You can run a really profitable, good online user generated content company without having
00:10:26.840 | to have ads be what's at the core of it.
00:10:29.000 | Now if it was very profitable and grew to be really big, that'd be great because then
00:10:32.860 | more people would do it.
00:10:33.860 | I don't know if it would be, but I like the idea.
00:10:37.160 | So I mentioned two things I didn't quite agree with Galloway about.
00:10:39.800 | Again, defer to him, but let's just throw it out there.
00:10:43.720 | Number one, his enthusiasm for Twitter, he calls it, I can't find a wording here, but
00:10:50.720 | basically he calls it like one of the most important companies in history.
00:10:59.040 | What's the way he talked about this?
00:11:00.240 | I can't find his exact wording, but it's like, this is obviously like one of the most important
00:11:03.320 | companies in the world, that its impact on the world is critical and it's very important
00:11:08.420 | what happens to it.
00:11:09.600 | So he says that somewhere.
00:11:10.600 | I don't have the exact terminology.
00:11:12.000 | All right.
00:11:13.000 | I don't completely agree with that.
00:11:15.080 | I think there's ironically an echo chamber around Twitter for Twitter users.
00:11:22.200 | It disproportionately matters for a very small segment of the population that has a lot of
00:11:27.400 | power: for reporters and for politicians, for people who are involved in criticism, content
00:11:35.680 | producers, writers.
00:11:36.680 | Like there's a pretty small group that has a lot of influence on culture to which Twitter
00:11:40.060 | is at the center of their universe.
00:11:41.720 | And I think that is easy to generalize and be like, this thing is at the center of most
00:11:46.600 | people's universes.
00:11:47.600 | Most people don't care about it.
00:11:48.780 | Its impact on most people is second order.
00:11:50.640 | And maybe this is what Galloway had in mind.
00:11:52.740 | For most people, they don't use Twitter.
00:11:54.680 | They don't care about Twitter.
00:11:55.680 | Or if they use it, they use it rarely like I would, which is I need to look up what this
00:12:01.600 | Nationals beat reporter is saying about what just happened in the baseball game.
00:12:05.960 | Right.
00:12:06.960 | And but they are getting a second order impact because as we talked about on Monday, the
00:12:11.520 | news they're encountering, the bills being passed in Congress, like a lot of that is
00:12:15.240 | being influenced by Twitter.
00:12:16.360 | So they do have a second order impact.
00:12:18.040 | But I think 99 percent of the people in our country, if you turn Twitter off tomorrow,
00:12:22.640 | wouldn't notice.
00:12:23.640 | They just don't use it that much.
00:12:25.360 | So I don't think it's as central as the people who are in that world think it's central
00:12:30.120 | to their world, but they don't realize that that's a more constrained world.
00:12:35.680 | The second thing is Galloway's evaluation of Elon.
00:12:42.840 | Let's get to the question of is it good or bad for Twitter?
00:12:45.760 | And what Galloway says is like, it's kind of mixed.
00:12:49.320 | Like the fact that, you know, you bring in this larger-than-life character who's
00:12:53.520 | obviously a genius, you put him on the board, it's going to shake things up.
00:12:56.960 | It was smart for Twitter to put him on the board.
00:12:58.800 | It's better to have activists on your board than to have them attacking you from the outside.
00:13:01.920 | He said that Twitter needs a kick and that might be a kick.
00:13:06.280 | But he's really worried about Elon's personality traits.
00:13:09.480 | He said Elon is very volatile.
00:13:11.880 | He will tweet off all sorts of random things.
00:13:14.840 | He might take Twitter in arbitrary directions.
00:13:17.960 | Galloway was upset that Elon was just doing like Twitter polls about major new features,
00:13:22.520 | like that he might just add an edit button because people said that was a good idea.
00:13:26.840 | And Galloway's right about that.
00:13:28.200 | Elon is a volatile person.
00:13:29.560 | And so you worry about what that's going to do over time to Twitter and its share value.
00:13:34.040 | The place where I probably disagree with Galloway, though, and it's a disagreement I have with
00:13:38.480 | a lot of people, is it seems like a lot of people, especially around here, their take
00:13:43.640 | on Musk and Twitter is that Musk is upset about moderation and what he wants is basically
00:13:53.000 | unrestricted speech on Twitter.
00:13:55.880 | And so typically when people say, here's the problem with people like Musk is they
00:13:59.240 | want complete free speech.
00:14:00.480 | And then like Galloway does, they go on and be like, the First Amendment doesn't guarantee
00:14:04.080 | free speech to everyone.
00:14:05.280 | It guarantees freedom from government intervention, and it's fine for companies to restrict speech.
00:14:08.920 | And if you really want to do unfettered free speech where you're just stopping illegal
00:14:11.880 | stuff, you get 8chan and it's terrible.
00:14:15.120 | And I think most people would agree with that.
00:14:17.180 | I don't think, though, that that's what Musk or people like him who are critiquing Twitter,
00:14:21.160 | I don't think that's what they want.
00:14:22.160 | I think it's a bit of a straw man.
00:14:24.360 | I mean, I think a lot of people, you know, again, especially and I think here it gets
00:14:28.520 | a little bit partisan.
00:14:30.960 | They want that to be the thing that someone like Elon is advocating for because it's easy
00:14:34.600 | to dismiss.
00:14:35.600 | I don't know Elon.
00:14:36.600 | I don't really know his stance on Twitter that well.
00:14:38.360 | I know some people who know Elon, though, who are in his orbit.
00:14:41.440 | So I can kind of guess at where he stands.
00:14:43.680 | I would assume Elon's complaint is basically that, no, of course there should
00:14:48.440 | be moderation.
00:14:51.080 | If Twitter was like 8chan, no one's going to want to use it, at least not at the scale
00:14:54.880 | it has, but that this moderation should be more centrist.
00:14:59.400 | And I think that's a harder thing to argue about.
00:15:02.680 | If let's say you're coming at this more from a position to the left, that's a harder thing
00:15:08.980 | to argue about: that there should be moderation, but that the moderation should align more with
00:15:12.120 | like centrist positions.
00:15:14.120 | What let's say like the majority of the country might be comfortable with.
00:15:17.600 | And I think Elon's argument is too often when it's like politically related, moderation
00:15:23.480 | decisions will align with like a relatively far to the left perspectives that are shared
00:15:29.840 | by maybe 6 or 7% of the country.
00:15:32.800 | And that feels great for those people who share those views, but for everyone else,
00:15:36.400 | it can feel arbitrary or weird and or that they're left out.
00:15:39.840 | And so like we should come at it from a like a centrist position.
00:15:42.240 | If it's something that would upset your aunt, then you're like, okay, maybe that shouldn't
00:15:48.200 | be there.
00:15:49.200 | Like, maybe that should be the standard and the standard shouldn't be.
00:15:51.960 | Is this something that is going to upset, like, a radical critical theorist?
00:15:57.320 | You know, like that's a bigger stance.
00:15:59.480 | So I don't know.
00:16:01.240 | That's another thing I think I disagree with Galloway on.
00:16:02.440 | I don't think Musk wants 8chan on Twitter.
00:16:04.760 | I think he just wants the aunt standard and not the critical theorist standard, I suppose.
00:16:12.200 | So that's what I think.
00:16:14.240 | What do you know?
00:16:15.240 | What do you think, Jesse?
00:16:16.240 | Well, what about the part of Elon filling out the wrong SEC form and what are your
00:16:22.320 | views on that?
00:16:23.320 | Yeah.
00:16:24.320 | I mean, I think that falls under the he's volatile critique, which is fair.
00:16:30.160 | I think there's probably a little bit more planning in it than that.
00:16:33.040 | I think it's similar to like the Brady retirement thing and then him going back, like they have
00:16:37.120 | this stuff planned out.
00:16:38.520 | So, so I'm not up, I'm not up on the latest breaking.
00:16:41.440 | So is the thought that this is so he can then step back out?
00:16:45.280 | Well, he announced it on April 4th or whatever, and then the stock price
00:16:53.720 | immediately went up, but he had filled out the wrong form.
00:16:56.480 | He was technically supposed to announce that like two weeks prior, in which case when he
00:17:00.040 | was buying all that stock, the price would have gone up.
00:17:03.120 | So essentially he saved himself like 150 million, but Galloway was saying that that's fraud.
00:17:07.560 | Oh, well that probably, that might be.
00:17:09.400 | I mean, I think that falls under the volatility.
00:17:13.200 | He's not great with corporate governance.
00:17:15.280 | He sort of flies by the seat of his pants.
00:17:16.720 | I mean, it kind of works well in creating new industries, but then also causes trouble.
00:17:20.440 | Yeah.
00:17:21.440 | I think it's, I don't know.
00:17:22.440 | I don't, I don't think that he just filled out the wrong form.
00:17:25.720 | I feel like he probably, I'm reading this now.
00:17:30.120 | Yeah.
00:17:31.120 | It's down in the end, like where, yeah.
00:17:34.640 | Yeah.
00:17:35.640 | Well, that's, well, that's a whole other issue.
00:17:38.640 | But yeah, again, you're bringing Musk into your world.
00:17:41.960 | All this stuff follows.
00:17:42.960 | Well, a lot of this, a lot of this stuff and like other publications is that's what the
00:17:48.200 | headline is about, you know, it's the form thing.
00:17:51.400 | Yeah.
00:17:52.400 | It's the form and saving money and buying Twitter.
00:17:53.800 | Yeah.
00:17:54.800 | I think to me, that's like the least interesting part of the story.
00:17:58.360 | I mean, it's interesting, but I also think it's, that's more like the reason that's the
00:18:03.120 | headlines in a lot of places.
00:18:04.560 | It's like, we're working backwards from, we don't like this guy.
00:18:09.000 | So let's emphasize the, like, look, he might've done fraud here, which I, which I think is
00:18:15.120 | big, but there's also like these huge discussions of like, what should Twitter be?
00:18:18.120 | And is he right?
00:18:19.120 | Is he wrong?
00:18:20.120 | And yeah, when I talk to people, I get a lot of, like, people around here
00:18:27.560 | are really worried that he sort of wants to get rid of all moderation, which I, again,
00:18:33.040 | I think is crazy.
00:18:34.040 | Like no one would want to use a platform where it's like, eh, people can say whatever, as
00:18:38.080 | long as it's not defamation or putting someone in imminent harm. Those platforms
00:18:42.240 | exist and they're weird and melt people's eyes and they're terrible and no one likes
00:18:47.080 | to be there.
00:18:48.080 | Yeah.
00:18:49.080 | Um, and so I can't imagine that, uh, assuming he does get to keep his stock, I can't imagine
00:18:54.800 | that's what Elon Musk wants.
00:18:56.280 | Yeah.
00:18:57.280 | Yeah.
00:18:58.280 | Again, I really think for a lot of these intellectual dark web types who all seem to
00:19:02.360 | know him, that's really what they would want: the "would it upset my aunt"
00:19:09.000 | test?
00:19:10.000 | Yeah.
00:19:11.000 | Yeah.
00:19:12.000 | Which would moderate a lot of stuff, but wouldn't completely align with either side of the political
00:19:15.920 | spectrum's more extreme stances.
00:19:18.960 | Like I think that stance could very well, you know, still moderate things like
00:19:24.360 | vaccine misinformation in a way that people on the right would be really upset
00:19:30.160 | about.
00:19:31.160 | Um, but it would also like not kick people off for things that people on the left think
00:19:35.600 | they should be kicked off for.
00:19:36.600 | But your aunt would be like, I don't understand what the problem is.
00:19:38.600 | Like there's probably a middle ground there.
00:19:40.320 | And there's a whole theory in media criticism about, uh, when media outlets like newspapers
00:19:47.960 | used to make most of their money off advertisements, the goal was to have the largest possible circulation.
00:19:53.580 | And so the second-order effect of that is that you get coverage that is aimed at
00:19:59.160 | the broadest possible audience because you need the broadest possible audience to like
00:20:02.320 | pick your paper up from the newsstand and read it so you can get those ad reads.
00:20:05.520 | And then when things shifted behind paywalls, it changed all the incentives.
00:20:08.560 | And now it's more about how do we keep the people who are paying us money each month
00:20:12.560 | for this excited about this.
00:20:14.800 | And the incentives change more towards like, well, let's figure out what they're into,
00:20:19.080 | who they support, who they dislike.
00:20:20.920 | And like, let's really attack the people they dislike and really support the things they like.
00:20:25.920 | Let's validate their particular whatever.
00:20:28.040 | And so that's an interesting theory, that the Overton
00:20:31.560 | window significantly realigns when you shift your actual economic model, whether that's true
00:20:37.480 | or not, but there we go.
00:20:41.400 | Elon Musk, we'll ask him about it.
00:20:43.880 | Yeah.
00:20:44.880 | He'll be on the podcast.
00:20:45.880 | He can come on the podcast.
00:20:46.880 | Yeah.
00:20:47.880 | He's on the list of people that Jesse thinks is going to come, kind of come on the podcast.
00:20:50.880 | So when Elon comes on, we'll ask him about the SEC form.
00:20:53.840 | We'll ask him about what he really wants to do with content moderation.
00:20:56.640 | We'll ask him to give us some of his stock.
00:20:59.760 | And we will ask him about when we're going to get to Mars.
00:21:03.360 | So we'll ask him all those things.
00:21:06.760 | We'll have him on, and Zuckerberg.
00:21:07.760 | Maybe we'll just do it like the same time.
00:21:09.160 | Cause A, that'd be more efficient for us.
00:21:11.680 | I think.
00:21:12.680 | Mark can wait in the waiting room.
00:21:13.760 | Mark can wait.
00:21:14.760 | Yeah.
00:21:15.760 | Mark can wait in the waiting room.
00:21:18.080 | We'll send down for some Bevco coffee.
00:21:20.440 | Oh man.
00:21:21.880 | All right.
00:21:23.160 | Let's talk about a couple of sponsors before we get to some of these calls.
00:21:26.920 | Let's start by talking about MyBodyTutor.
00:21:30.640 | As I mentioned on Monday, I have known Adam Gilbert, the founder of MyBodyTutor, for a
00:21:35.440 | long time.
00:21:36.440 | He used to be the fitness advice guy for my blog, Study Hacks.
00:21:41.400 | His company, MyBodyTutor is a 100% online coaching program that solves the biggest problem
00:21:46.360 | in health and fitness, which is the lack of consistency.
00:21:50.120 | When you sign up for MyBodyTutor, you get an online coach who builds a plan for you and
00:21:56.320 | your life, customized to you about your eating and your exercise.
00:22:00.160 | And then, and this is the magic sauce, you check in every day with an app.
00:22:04.800 | Here's what I did.
00:22:05.800 | Here's what I ate.
00:22:06.800 | Here's my questions.
00:22:07.800 | It's that daily accountability with an online coach, the same online coach every day that
00:22:11.680 | makes MyBodyTutor so effective.
00:22:14.200 | So if you're serious about getting fit, Adam is giving deep question listeners $50 off
00:22:18.880 | their first month.
00:22:20.640 | All you have to do is mention this podcast when you join.
00:22:25.200 | So go to MyBodyTutor, T-U-T-O-R.com, mention deep questions when you sign up and you will
00:22:32.240 | get $50 off.
00:22:33.240 | I also want to talk about ExpressVPN.
00:22:41.160 | You need a VPN to protect you when you use the internet.
00:22:44.660 | As I explain every time I talk about this, because I think it's important to understand
00:22:48.880 | the technology, a VPN does something very fundamental.
00:22:52.440 | Instead of you directly connecting to a website or server, if you use a VPN, you will instead
00:22:57.400 | automatically be connected to a VPN server.
00:23:01.400 | That server will then talk to the website or service on your behalf.
00:23:05.560 | That way the website or service doesn't know who is talking to it.
00:23:08.600 | And because your connection to the VPN is encrypted, no one knows on your end who you're
00:23:13.040 | talking to either.
00:23:15.180 | Computer scientists like me use VPNs all the time and ExpressVPN is my preferred VPN because
00:23:20.240 | they have very fast speeds.
00:23:22.860 | They have a large selection of servers, they're in over 94 countries now, and they're compatible
00:23:28.440 | with all of your devices.
00:23:30.880 | You just run the ExpressVPN app.
00:23:34.360 | And when you turn it on, it will do that connection for you.
00:23:36.640 | You use your device as normal, except now you have all the protection of a VPN.
00:23:42.680 | So be smart, get a VPN.
00:23:44.160 | If you're going to get a VPN, make it ExpressVPN.
00:23:48.640 | To find out more, go to ExpressVPN.com/deep.
00:23:53.960 | Don't forget to use my link at ExpressVPN.com/deep to get an extra three months of ExpressVPN
00:24:01.120 | for free.
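
(To make the relay idea concrete, here is a toy sketch in Python of a server that accepts a client connection and then talks to the destination on the client's behalf, which is the core of what Cal describes. It is only a conceptual illustration, not how ExpressVPN or any real VPN is built; a real VPN also encrypts the client-to-server leg and works below the application layer. The port and destination below are arbitrary placeholders.)

import socket
import threading

LISTEN_PORT = 9999                 # arbitrary local port for the toy relay
DESTINATION = ("example.com", 80)  # placeholder site the relay contacts for the client

def pipe(src, dst):
    # Copy bytes one way until the connection closes.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        dst.close()

def handle(client):
    # The relay, not the client, opens the connection to the destination,
    # so the destination only ever sees the relay's address.
    upstream = socket.create_connection(DESTINATION)
    threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
    pipe(upstream, client)  # replies flow back to the client

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", LISTEN_PORT))
server.listen()
while True:
    conn, _ = server.accept()
    threading.Thread(target=handle, args=(conn,), daemon=True).start()

Anything sent to 127.0.0.1:9999 is forwarded, and the destination sees only the relay's address; encrypt the client-to-relay leg and you have the essence of the protection described above.
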
00:24:02.120 | All right, I am in the mood for some listener calls.
00:24:07.760 | So Jesse, kick us off here.
00:24:09.360 | Who do we have?
00:24:10.360 | All right, first question is about idea notebooks.
00:24:12.960 | Hello, Cal, Sean here from balmy Miami.
00:24:16.040 | I'm finding particular benefit in my nascent application of the idea notebooks that you
00:24:20.520 | have suggested.
00:24:22.000 | So far, this approach seems to be the right way for me to tame what is otherwise an excess
00:24:26.320 | of ideas without letting them creep into my workspaces.
00:24:30.160 | My question to you, how do you go through your idea notebooks once you've completed
00:24:33.540 | the notebook?
00:24:34.600 | Do you reread page by page?
00:24:36.740 | That takes a while.
00:24:37.740 | Do you schedule time for it or does it somehow conveniently happen to you when you have a
00:24:41.460 | spare moment?
00:24:42.820 | And what does that process look like?
00:24:45.040 | By the way, Moleskine should definitely make a branded Cal Newport idea notebook.
00:24:49.180 | Tropical regards and thank you.
00:24:52.700 | Good question.
00:24:53.700 | I like the Moleskine Cal Newport branded notebook idea.
00:24:59.020 | You know, I do have my time block planner, so I'm kind of in that business.
00:25:02.620 | And we're at...
00:25:03.620 | I looked it up, Jesse.
00:25:05.460 | We've got royalty statements recently.
00:25:08.020 | 21,000 time block planner sold.
00:25:10.540 | So we're rolling.
00:25:11.980 | People are using it.
00:25:12.980 | And the new version of that, by the way, is coming.
00:25:14.820 | We're working on a spiral bound version.
00:25:17.020 | It lies flat.
00:25:18.460 | It's delayed somewhat because there's a lot of supply chain issues.
00:25:21.460 | Like in all of publishing, it's like glue, like stuff that you think is standard.
00:25:25.700 | There's like a lot of supply chain issues.
00:25:28.780 | But the new and improved one is coming.
00:25:29.980 | What I tell people is don't wait for it.
00:25:31.880 | Just get the version we have now because in a few months you'll be done with it anyways.
00:25:35.220 | Like you're buying into a system, not just one.
00:25:37.820 | So for existing users...
00:25:39.100 | I didn't mean this to be an advertisement, by the way, but now it's an advertisement.
00:25:43.300 | Timeblockplanner.com to find out more.
00:25:47.220 | Use promo code superfluous unmonetized advertisement to get 1% off.
00:25:55.100 | So anyways, I'm digressing.
00:25:57.500 | Yes, I want a Cal Newport idea notebook.
00:25:59.860 | I want it to be beautiful.
00:26:02.300 | I like Field Notes, like Moleskine.
00:26:02.300 | So talk to me if you're at those companies.
00:26:05.060 | Timeblock planners are out there.
00:26:06.780 | Timeblockplanner.com.
00:26:07.780 | Those are only going to keep improving.
00:26:08.780 | So buy those.
00:26:10.060 | Let's get to the mechanics of using an idea notebook.
00:26:13.020 | Yes, it takes time to go through them.
00:26:15.780 | And yes, I think you should.
00:26:17.660 | There's two things you spend time on with an idea notebook.
00:26:21.260 | Number one is the regular review.
00:26:24.980 | I suggest once a month, but you can adjust depending on how quickly you pile up ideas
00:26:31.060 | in your notebook.
00:26:32.380 | Once a month, you take your existing notebook and you go through and you can read through
00:26:38.420 | either the entire notebook or just what's new and process ideas.
00:26:44.020 | If there's something in there, you're like, "Yeah, you know what?
00:26:45.500 | I think I want to act on this."
00:26:47.260 | Then you can move it into your systems.
00:26:49.340 | Let me make a task about this.
00:26:51.180 | Let me put a deadline on my calendar to think about it.
00:26:53.060 | Let me put aside an afternoon to kick off work on that.
00:26:56.300 | Let me make it something that's on my quarterly plan.
00:27:00.700 | So now I'm going to start putting time aside from it.
00:27:02.460 | So you might process something off of there into your systems.
00:27:07.060 | And it's important that you do this on this regular basis.
00:27:11.100 | Because in the moment when you're writing down an idea in the idea notebook, you want
00:27:15.580 | to be able to let go of that.
00:27:17.620 | You want to be able to turn back to whatever you're doing, what you were doing before
00:27:21.460 | the idea popped up.
00:27:22.460 | You want to turn back to that without your mind holding on to, "Wait, don't forget
00:27:25.420 | about this."
00:27:26.980 | And if your mind trusts, within a few weeks, I'm going to look at that and really think
00:27:30.300 | about it.
00:27:31.300 | And if it needs to be scheduled, I will.
00:27:33.700 | Then you're going to be able to let it go in the moment.
00:27:35.140 | So regular review is critical.
00:27:37.060 | The second way you interact with your idea notebook is when it fills.
00:27:41.580 | And what do you do when it fills?
00:27:42.660 | There's different schools of thought here.
00:27:44.700 | My school of thought is you get the new notebook, you go through the old one, and you copy over
00:27:52.540 | into the first pages of the new one, short summaries of any ideas from the old one that
00:27:56.620 | you think you want to keep thinking about, you want to keep on your mind, you want to
00:28:01.100 | keep in the conversation.
00:28:05.100 | Copy over shorter description of those by hand.
00:28:08.460 | Most things you'll find, most things in your notebook you've either already processed into
00:28:13.060 | your systems like, "Yeah, let me try it."
00:28:15.460 | Or in the light of day, once you filled that notebook, you're like, "Whatever.
00:28:20.420 | That was a momentary inspiration.
00:28:21.860 | There's nothing there to think about."
00:28:23.140 | So I rarely fill more than a couple of pages of my new notebook with holdover ideas.
00:28:28.460 | But I do that discipline.
00:28:29.460 | And then I can shut down the old book and I have the new notebook.
00:28:31.620 | And then when I'm reviewing that new notebook the next month, those ideas are up front.
00:28:35.460 | So they don't disappear.
00:28:36.460 | So you're not forgetting them.
00:28:37.460 | And when that notebook fills, maybe one of those will still make it to the next notebook.
00:28:41.060 | And that might be a sign of like, "This is something that's important to me.
00:28:43.260 | I don't really know what to do about it, but it's important to me."
00:28:44.940 | And that's a good indicator.
00:28:45.940 | It's a good factor.
00:28:46.940 | And that's what I recommend doing.
00:28:48.860 | Yes, these things take time.
00:28:51.260 | Put aside time.
00:28:52.700 | I used to go to cafes.
00:28:53.700 | For whatever reason, I remember this.
00:28:55.700 | When we were living on Beacon Hill and I was a postdoc at MIT, I would often go to the
00:29:01.780 | Bagel Store.
00:29:02.780 | It was like a Brugger's that was right at the MGH Hospital.
00:29:09.140 | I say Charles because the metro stop there, the T stop there was Charles Street, MGH.
00:29:14.660 | I just have a lot of memories of going there and going through my Moleskines and transferring
00:29:18.560 | new things to the new notebook and figuring out what's going on.
00:29:20.980 | But go get a breakfast sandwich and spend a half hour, have a coffee.
00:29:24.940 | So it's worth the time.
00:29:26.540 | Ideas are currency.
00:29:27.540 | You don't want to lose them.
00:29:28.660 | You also don't want them to distract you.
00:29:30.820 | The idea notebook method is a good way of dealing with them.
00:29:35.220 | All right, Jesse, what do we got for call number two?
00:29:38.860 | I'm feeling like I'm going to be on a roll today because I'm sick.
00:29:41.140 | I'm going to move.
00:29:42.140 | I'm going to move with alacrity.
00:29:44.900 | We got a cool question here and you kind of referred to it earlier in the episode about
00:29:48.340 | the doctor who's interested in theoretical math.
00:29:50.940 | We did have this question.
00:29:51.940 | Okay, good.
00:29:52.940 | I thought I remembered it.
00:29:53.940 | Hi, Cal.
00:29:56.940 | This is Ryan.
00:29:57.940 | I am a doctor and self-taught Python programmer.
00:30:02.180 | Outside of my physician role, I like to apply machine learning and AI to various medical
00:30:07.180 | related questions.
00:30:09.520 | And I've loved learning about the underlying mathematics that formed the foundation of
00:30:14.140 | many of these models.
00:30:15.780 | My question for you is what would you recommend for someone who would like to learn more about
00:30:22.340 | advanced and theoretical mathematics who is later on in their career and does not want
00:30:28.080 | to go back to school?
00:30:29.940 | Thank you.
00:30:30.940 | Love the podcast.
00:30:31.940 | Well, so Ryan, when you say advanced or theoretical mathematics, if what you really mean is you
00:30:38.380 | want to better understand the mathematics that's relevant to machine learning, the type
00:30:43.260 | of work you're doing in machine learning, I'm going to assume that's what you mean.
00:30:47.300 | That's not really super advanced mathematics.
00:30:49.100 | I mean, you want to make sure that you bone up on calculus, sort of, I don't know, college
00:30:54.460 | level calculus, maybe some statistics.
00:30:57.820 | I mean, bone up on some statistics.
00:31:00.640 | And I would just use online courses for those. Perhaps, you know, a resource I'd really recommend
00:31:08.380 | for some of the AI stuff and some of the related math would probably be to go to MIT's OpenCourseWare.
00:31:12.660 | MIT is part of this initiative where a lot of their courses are available completely
00:31:17.660 | online.
00:31:18.660 | You can watch videos of all the lectures and get all the assignments.
00:31:20.940 | It's all free.
00:31:23.100 | They have some fantastic lectures on these topics.
00:31:26.580 | And so, yeah, if your calculus is rusty, if your statistics are rusty, you know, you can
00:31:31.820 | Khan Academy that, get an online course.
00:31:34.020 | You can learn that pretty quickly.
00:31:35.020 | And then I would say take some of these MIT OpenCourseWare topics, courses, free courses.
00:31:39.500 | By "take it," I mean you watch the lectures and do the assignments on your own time.
00:31:43.140 | It's not going back to school.
00:31:44.140 | It's not going to cost you anything, just time.
00:31:46.580 | And you can be pretty advanced.
00:31:48.500 | You would be pretty advanced pretty quickly on some of these areas where you're interested.
00:31:53.060 | If you want to see an extreme example of that, my longtime friend, Scott Young, with whom
00:31:59.940 | I co-produce our two online courses, Top Performer and Life of Focus, years ago, he did something
00:32:05.860 | called the MIT Challenge, where he did the entire computer science curriculum, undergraduate
00:32:11.780 | computer science curriculum at MIT on his own and in one year.
00:32:17.420 | And he did it all using the MIT OpenCourseWare because all of the courses that he needed
00:32:21.740 | to actually satisfy the requirements of the MIT computer science major were all available
00:32:26.820 | online.
00:32:27.820 | So he watched the lectures at high speed.
00:32:28.900 | He did all the assignments.
00:32:30.900 | He graded himself.
00:32:31.900 | Like, so he gave himself grades.
00:32:33.340 | He graded the exams and problem sets.
00:32:35.340 | And he tried to grade himself honestly.
00:32:37.460 | And so his point was like, look, I'm passing these courses.
00:32:40.740 | Like, I am learning the material.
00:32:43.260 | I gave him the capstone.
00:32:44.620 | So for the capstone of this challenge, I gave him all the material for my graduate theory
00:32:48.460 | of computation course I teach at Georgetown.
00:32:50.180 | And then gave him my exams.
00:32:51.460 | And I graded the exams.
00:32:52.460 | And he did well on them.
00:32:53.460 | So I can kind of confirm he did learn a lot of stuff.
00:32:55.380 | So anyways, a lot is possible on your own.
00:32:58.740 | And MIT OpenCourseWare, if you're already a smart guy like you are, Ryan, if you already
00:33:01.700 | know about some of this stuff, you have some math backgrounds, you know some machine learning,
00:33:06.180 | that's going to be a great way for a self-motivated and disciplined person like yourself to push
00:33:10.580 | things. As long as I'm doing free ads:
00:33:13.340 | if you want to find out more about that MIT Challenge, Scott wrote a book called
00:33:18.580 | Ultralearning, about doing these types of radical learning challenges.
00:33:22.980 | And he talks in there about the MIT challenge.
00:33:27.100 | And I will take credit.
00:33:28.100 | I gave him that name.
00:33:29.380 | I gave him that name, Ultra Learning.
00:33:31.500 | So I take, and I think this is fair, 60% of the credit for that book.
00:33:38.140 | Because really the names are everything.
00:33:39.140 | I think you would agree, Jesse, that I should get 60%.
00:33:42.060 | And because of that, 60% of the royalties.
00:33:44.060 | I think fair.
00:33:46.420 | Have our lawyer call Scott and say, fair is fair.
00:33:52.220 | Where's our 60% of the royalties?
00:33:55.220 | We're giving you that name.
00:33:56.220 | You're nothing without the name.
00:33:57.740 | Oh man.
00:33:59.340 | All right.
00:34:00.780 | Making progress.
00:34:01.780 | Feeling good.
00:34:04.140 | Call number three.
00:34:06.940 | Next call is about checking her phone way too much.
00:34:10.740 | Hi, Scott.
00:34:11.740 | My name is Lucia.
00:34:12.740 | I'm a Spanish law student.
00:34:13.740 | And I'm very thankful for your student help books.
00:34:16.980 | They have helped me a lot.
00:34:19.300 | And my question is the following.
00:34:20.620 | I have gone through a digital minimalism process and deleted all my social media, including
00:34:26.220 | three Twitter accounts, which now sounds crazy even to me.
00:34:30.620 | But I still use instant messaging apps.
00:34:32.820 | And although I don't use them when I'm with other people, whenever I'm alone, I check
00:34:37.860 | them a lot.
00:34:38.860 | More specifically, I check them every time I pick up my phone, be it for meditation,
00:35:43.820 | for checking my uni schedule, or timing my exercise, or any other reason.
00:35:48.460 | This has come to the point where I often forget what I have picked my phone up for originally,
00:34:54.460 | because I get too swallowed by the messaging.
00:34:58.660 | I feel like Narcissus, who keeps returning to his reflection again and again, regardless
00:35:02.860 | of what's better for him.
00:35:04.340 | Do you have any tips regarding this?
00:35:06.860 | Thank you so much.
00:35:07.860 | Great show.
00:35:08.860 | Well, first of all, it's been a while since we've had a good Greek reference.
00:35:13.140 | So I appreciate that.
00:35:14.940 | That's old school deep questions, Jesse.
00:35:16.900 | Back in the day when it was just-- when I would record in here in the summer, and we
00:35:22.220 | had no video, and I had the AC off.
00:35:24.540 | Even though it turns out having the AC on is not a problem, because we have gated microphones.
00:35:27.540 | And I'd just be in here sweating just by myself during the pandemic recording podcast episodes.
00:35:33.100 | We got a little crazy with the Greek references.
00:35:36.180 | So maybe you've been a moderating influence.
00:35:37.980 | People have emerged back into the world and are a little more reasonable.
00:35:40.820 | But I do appreciate the Greek reference.
00:35:44.700 | So I want to apply here the digital minimalism philosophy.
00:35:49.820 | Everything you talked about was things you quit or things you got rid of.
00:35:53.060 | I'm glad you quit things and got rid of things.
00:35:54.780 | But in some sense, that's not what I'm mainly interested in.
00:35:58.180 | If you want to become a digital minimalist, what I'm interested in is your vision of the
00:36:01.980 | life well-lived.
00:36:04.100 | What was it that you discovered when you did your 30-day digital declutter that really
00:36:07.980 | mattered to you?
00:36:08.980 | What were the things that are important to you and your life?
00:36:12.500 | Because that should be the foundation of reconstructing your digital interactions.
00:36:17.220 | When you know what you want to spend your time on, what matters to you, what satisfies--
00:36:20.780 | we're going to use some more Greek references here.
00:36:24.300 | Aristotelian notion of the virtuous life, of the ethical life.
00:36:31.140 | Then you can work backwards and say, what tech helps these things?
00:36:34.580 | And if I know why I'm using these techs, what's the right rules to put around them?
00:36:38.460 | Now I suspect if you go through this exercise, you will find, because you're using these
00:36:44.900 | instant messenger apps a lot, that there's people in your life that being connected to
00:36:48.380 | matters to you, that the relationships in your life matter to you, that you want to
00:36:51.620 | be a good friend, you want to be a good daughter, you want to be a good aunt, whoever these
00:36:56.060 | people are that you're connecting to.
00:36:58.020 | You want to sacrifice non-trivial time and attention on their behalf, as that is what
00:37:03.540 | we as humans are wired to do as social beings.
00:37:05.940 | And if we're not doing that, we get unhappy, we get anxious, we start to worry about ourselves,
00:37:13.220 | we start to worry about our lives.
00:37:14.740 | That's probably what you would identify.
00:37:15.900 | And if you identify that, you say, great, I want to build a life where I'm doing that.
00:37:20.180 | Great.
00:37:21.300 | What's a good way to throw tech into that picture?
00:37:24.980 | And when you ask that question, your answer is not going to be, obviously, the right use
00:37:29.780 | of tech to satisfy this goal is going to be checking instant messengers every single time
00:37:33.740 | I pick up my phone.
00:37:34.740 | How could that possibly be the right thing there?
00:37:38.100 | What's the right thing going to be?
00:37:39.100 | Well, if you're really thinking this through, you say, okay, well, first of all, these five
00:37:43.300 | people, I want to whatever, I want to talk to them every single week, and here's when
00:37:46.500 | I call them, and I put aside a lot of time for that, and that's important.
00:37:49.260 | And these people, we go out and we get coffee every weekend because they're very important people
00:37:54.060 | to me, and I go once a quarter to go see this relative of mine.
00:37:57.980 | Some of this has nothing to do with technology, but you're actively saying, this is important
00:38:01.220 | to me, what do I want to do in this?
00:38:02.820 | And then you might be saying, okay, instant messenger might be useful for talking to these
00:38:07.860 | people in other times or these relatives that are overseas who I can't see as much, and
00:38:11.260 | that's great.
00:38:12.260 | But if you know that's why you're using it, then you can say, what rules do I want to
00:38:14.700 | put around that so that I can, in true digital minimalist fashion, preserve the benefit here
00:38:19.340 | while avoiding most of the unnecessary harms?
00:38:21.820 | And when you go through that exercise, you're probably going to learn, oh, I got to retrain
00:38:26.740 | people's expectations about how I use these services.
00:38:29.980 | I probably need to step away from, I will be involved in ad hoc threads that are happening
00:38:34.780 | throughout the day.
00:38:35.780 | Now, I just got to step away from that.
00:38:37.820 | It'll take people a week to adjust and then they'll get it.
00:38:40.300 | And I want to put aside time, and maybe it's at the beginning of each day over a lunch
00:38:43.580 | break, where I do check in on things and see what people are talking about.
00:38:46.980 | And what I do there is not so much try to just jump into those conversations, but use
00:38:51.780 | the instant messenger to maybe set up or arrange a call or a meeting, use it kind of logistically
00:38:55.780 | so I can kind of see what's going on in people's lives.
00:38:57.580 | But I don't think just back and forth asynchronous conversations on there, that doesn't count
00:39:01.120 | for much.
00:39:02.120 | I don't feel like that's that significant.
00:39:03.180 | I'm not going to spend that much time doing it.
00:39:04.500 | You could just completely rethink how you use those tools once you know what your goal is.
00:39:08.820 | If your goal is having deep, meaningful connections, you're not going to come away with an answer
00:39:13.420 | that says I should be on this all the time.
00:39:16.260 | Now why is this important?
00:39:17.500 | Well, when you come at tool usage from the perspective of how do I support a vision of
00:39:22.900 | a life well lived that I seriously believe in, changes are sustainable.
00:39:27.580 | And you stop using it the way you did before.
00:39:29.380 | And you stop checking in on all these threads.
00:39:31.340 | You stop using it as a distraction.
00:39:33.820 | Because you have a new way of using it that aligns with something you believe in deep
00:39:36.740 | in your bones to be the right way to live.
00:39:38.820 | That is very compelling.
00:39:41.180 | And now suddenly when you're like, oh, maybe I should pick up this phone when I have to
00:39:44.580 | look up a schedule at uni and go to instant messenger, you're like, if I do that, then
00:39:50.180 | I'm kind of repudiating this whole vision I'm so excited about, about a life well lived.
00:39:53.300 | And I don't want to repudiate that.
00:39:54.300 | I want to live that life.
00:39:55.300 | I'm excited about it.
00:39:56.380 | So you don't check.
00:39:57.900 | And that is the power of coming at digital usage from the perspective of supporting the
00:40:01.340 | positive, not just trying to reduce or eliminate the negative.
00:40:05.060 | So that is what I'm going to suggest.
00:40:06.780 | Let's give this the full digital minimalism treatment.
00:40:10.620 | Identify that picture of what matters to you.
00:40:13.820 | Come up with a plan that really supports that.
00:40:15.580 | Be radical about it.
00:40:16.580 | Like, Hey, I'm going to every other month fly to go visit this relative or whatever.
00:40:20.940 | The more radical you are, the more you single to yourself, you take that seriously, but
00:40:25.460 | then have very specific rules about your tech inside of that picture.
00:40:28.180 | And I think you'll find that your digital usage is going to be greatly reduced.
00:40:32.660 | I mean, the hardest thing about this, and Jesse, I have this in my own family, is the transition
00:40:38.440 | of not being someone who participates in threads because it's basically like a binary categorization.
00:40:45.460 | There's like the people you know that for whatever reason will reliably participate
00:40:50.580 | in text or instant messenger threads.
00:40:52.620 | Like if you say something, they'll come back to it or they're always kind of there.
00:40:55.260 | And it's sort of nice because you can sort of be in touch with that person all the time,
00:40:59.980 | but it's not sustainable and people get used to it.
00:41:03.100 | Like, okay, you know, Jesse probably is not going to just answer back a random text, right?
00:41:07.660 | When I sent it, I mean, unless like he happens to be doing other text messages, people completely
00:41:12.380 | adjust to it.
00:41:13.380 | And you know what?
00:41:14.380 | It does not hurt your relationships with people because all of that doesn't do much anyways.
00:41:18.220 | If you're instead prioritizing, like, how do I really talk to this person?
00:41:21.420 | I'm going to call this person every week.
00:41:22.660 | I can go see these people once a month.
00:41:24.140 | Then the fact that you're not at three in the afternoon answering the random back and
00:41:28.340 | forth doesn't matter.
00:41:30.420 | So you can always talk about how texting is a little bit different than social media
00:41:35.220 | too as well.
00:41:36.220 | Right?
00:41:37.220 | Yeah.
00:41:38.220 | Because with the mechanics, it's not that someone's trying to steal your time.
00:41:42.500 | It's not engineered to be addictive.
00:41:44.360 | The pressures there are social.
00:41:46.080 | So it's a social engineering problem, not an attention engineering problem.
00:41:49.380 | The reason why people have a hard time getting away from the texting is because they have
00:41:54.560 | configured their expectations with the people they know in such a way that they expect them
00:41:58.780 | to be reachable.
00:41:59.780 | And that is very convenient.
00:42:01.500 | But once you've configured those expectations, you're going to have to be answering texts
00:42:04.420 | back and forth all the time.
00:42:06.260 | So you have to completely rebuild those expectations of, I'm not just randomly available, but I
00:42:12.940 | do see you and spend time with you a lot.
00:42:14.420 | And if you need to get in touch with me, I'll see it at some point today and I'm reliable,
00:42:19.340 | but I'm also not on my phone most of the time.
00:42:21.780 | But no matter what, social media and texting are still context switching if you're in deep
00:42:25.740 | work.
00:42:26.740 | Well, yeah, that's the problem.
00:42:27.740 | It has the same cost.
00:42:28.740 | Yeah.
00:42:29.740 | So the social media is bringing you in because there's a neural net that is selecting
00:42:32.200 | things to show you that press your brainstem.
00:42:33.540 | You're like, man, I got to look at that. Context switch.
00:42:36.100 | You're in trouble.
00:42:37.100 | Texting has none of that, but it's still very powerful, because if there's a social expectation
00:42:42.140 | that you have established that I will answer, and now the person you know is sending you
00:42:47.100 | a quick question, well, I've got to keep checking, because it's a problem if I don't
00:42:50.740 | answer. And then you have the same cost.
00:42:53.900 | Yeah.
00:42:54.900 | Actually, I want to ask you this question about texting because a lot of times, like
00:42:58.060 | say you have a note or something that you want to text somebody, you might write that
00:43:01.260 | down when you're in like a deep work session or whatever.
00:43:04.140 | Then like say that that session's over, you go to text the person, then you read another
00:43:07.940 | text and you forget about it.
00:43:10.220 | There's really no way to, well, I've figured out one way.
00:43:14.940 | If you just close your texting interface and pull everybody up by their contact, you
00:43:18.900 | can text them directly.
00:43:19.900 | Right.
00:43:20.900 | But otherwise, like you can't avoid it, you're always going to see the main page of the texting app.
00:43:23.700 | Have you ever thought about that?
00:43:26.060 | So then what, so you're saying the issue is if you want to like hold on to.
00:43:29.180 | You get distracted because like you say you go, you want to text Elon Musk about coming
00:43:34.100 | on your podcast.
00:43:35.100 | Right.
00:43:36.100 | Right.
00:43:37.100 | And then you're going to go there and you see Mark Zuckerberg asked you to go get
00:43:38.660 | drinks for happy hour.
00:43:39.900 | You sigh like, oh, Mark.
00:43:42.860 | So that's the expectation.
00:43:44.100 | And then you forget to invite Elon on the thing because you sent Mark the response
00:43:49.220 | or whatever.
00:43:50.220 | Yeah.
00:43:51.220 | Do you ever think about that or do you ever, do you have a way around that or?
00:43:54.300 | I mean, yeah, I think that's the problem.
00:43:58.060 | Like text messages are very convenient, but it's the problem of using them as like the
00:44:01.140 | primary way you communicate. Where they're useful is, I'm meeting you, right?
00:44:07.460 | I'm coming to meet you and, hey, I'm five minutes late. Or like I'll often
00:44:10.940 | text you if I'm coming over here, and like every single week I'm late, I'll
00:44:14.220 | be like, I'm late, or here's what's going on.
00:44:16.500 | It's logistical.
00:44:17.500 | It's helping us like coordinate.
00:44:19.660 | We're about to get together.
00:44:20.660 | Like it works fine for that.
00:44:21.860 | Or you're like, Hey, why aren't you here?
00:44:22.980 | I'm at the bar or something like that.
00:44:24.540 | But then, and I get these types of things a lot,
00:44:26.940 | I don't like it when, say, someone randomly texts me during the day.
00:44:29.780 | Yeah.
00:44:30.780 | And there's like an obligation in it.
00:44:31.780 | And like, what about this?
00:44:32.780 | Or can you let me know when you're available for whatever?
00:44:34.540 | I'm like, it's exactly this issue.
00:44:36.300 | How am I going to, this is going to be lost in the string full of all these other texts
00:44:40.140 | from all these other people.
00:44:41.140 | How am I going to remember?
00:44:42.660 | Like typically what I try to do is, immediately if I see something like that, get it into my
00:44:45.660 | capture system.
00:44:46.660 | Like right away, get it in there.
00:44:48.900 | Yeah.
00:44:49.900 | I think it's a huge problem.
00:44:50.900 | Like texting, text interfaces were not invented for a world in which there could be dozens
00:44:55.500 | of people you're spontaneously engaged in conversations with, any one of which could
00:45:00.700 | involve obligations being introduced for you.
00:45:03.420 | Yeah.
00:45:04.420 | Completely asynchronous and outside of your control.
00:45:06.280 | I found a workaround.
00:45:07.280 | Um, I use an Android.
00:45:08.580 | So if you go to the contacts page on the Android and you pull up, like, you know, Mark's cell
00:45:12.660 | phone, it will show only their texts.
00:45:14.600 | You can text them through that interface.
00:45:16.540 | Oh, that's smart.
00:45:17.580 | And then, so if you're disciplined about it, you go to the texting page, but you're
00:45:22.420 | on Mark's, you know, individual profile where you're texting him.
00:45:26.380 | And then as long as you do that, that's interesting.
00:45:28.260 | That's smart.
00:45:29.260 | Okay.
00:45:30.260 | So you pull up the contact and then it can just be, yeah.
00:45:32.100 | So like if you and I are talking like right before a recording session, we could just be
00:45:37.500 | texting each other.
00:45:38.500 | Like I can see it, it brings it over to the other one.
00:45:41.540 | But if I just know that I needed to text you, then I could just go into the contact, grab
00:45:45.260 | you and text you and I wouldn't be distracted by other texts.
00:45:47.900 | Right.
00:45:48.900 | If you need to send something... Oh, so outbound.
00:45:49.900 | Yeah.
00:45:50.900 | Yeah.
00:45:51.900 | Um, yeah, it's a good fix.
00:45:52.900 | Yeah.
00:45:53.900 | I mean, that's a hard one.
00:45:54.900 | People have a hard time with that.
00:45:55.900 | And again, I think it's all just expectations, because that's the problem.
00:46:00.260 | Once the expectation is set, it's true,
00:46:01.540 | you can't just stop using it.
00:46:03.420 | Like I think that is harder than people leaving social media.
00:46:06.960 | People think it's going to be hard to leave social media because people imagine there's
00:46:09.760 | these big audiences that care, but they don't.
00:46:12.260 | And no one notices when you leave and it's kind of embarrassing.
00:46:14.660 | Text messaging people notice when you change your habit.
00:46:17.100 | Right.
00:46:18.100 | So like, you know, if you're on Twitter and you quit Twitter, no one cares.
00:46:22.700 | Like no one, because you were just being algorithmically thrown in the feeds with other stuff and no
00:46:26.380 | one knows it's all dehumanized and depersonalized and who cares.
00:46:29.620 | But if like your brother stops responding to the texts, when he had always reliably been
00:46:33.540 | doing that, then that's like, oh my God. People notice. It's way more, I learned
00:46:37.420 | when working on Digital Minimalism and working with people that are going through the process
00:46:41.580 | of the declutter.
00:46:42.920 | It's way more like personal and fraught changing your text habits or instant messenger habits
00:46:47.780 | than it is changing your social media habits.
00:46:50.660 | Maybe I should just get my text, my phone number out to our listeners.
00:46:57.100 | So they can just text you?
00:46:58.100 | Just text me.
00:46:59.100 | Yeah, that would go well.
00:47:00.100 | Oh my.
00:47:01.100 | All right.
00:47:02.100 | Where are we at?
00:47:03.100 | Oh, no, I'm going to speed up now.
00:47:05.940 | I always say 45 minutes and we've never been under like 57 minutes.
00:47:10.900 | Well, I think there's always value in the episode.
00:47:13.340 | So that's true.
00:47:14.340 | Your audience wants to hear you.
00:47:15.340 | That's true.
00:47:16.340 | Jive on stuff.
00:47:17.340 | I hear you.
00:47:18.340 | All right.
00:47:19.340 | Well, speaking of jiving, what do we got?
00:47:20.340 | We've got a question about tempering a racing mind.
00:47:23.020 | Hey, Cal, I was wondering what you do if you have a racy mind?
00:47:27.300 | One issue I have is that when I'm working on stuff, I get easily distracted and I end
00:47:33.460 | up going down another rabbit hole, which I think will go somewhere.
00:47:37.060 | But when I visited the next day, even though I put in like lots of work the day before,
00:47:43.340 | I somehow just lose energy for it.
00:47:45.300 | So I was wondering, is there like a way to kind of temper a, what do you call it, a racing
00:47:54.300 | mind?
00:47:55.300 | Thanks.
00:47:56.300 | And I love the podcast, by the way.
00:47:58.860 | So for that type of racing mind, what we're talking about is you have ideas while you're
00:48:03.220 | working on one thing and then you go down the rabbit hole of that idea because you don't
00:48:06.180 | want to forget it or you think it could be important.
00:48:08.660 | We can go back to some of the stuff we talked about earlier in this episode.
00:48:12.140 | The idea notebook is going to be your friend.
00:48:14.340 | The whole point of that is like, let's get ideas down where I don't have to deal with
00:48:18.340 | it now, but I trust I will deal with it.
00:48:21.180 | Now what I'm going to tell you, if you have a really racing mind, is I'm going to give
00:48:23.380 | you a layer, an intermediary layer between your brain and the idea notebook.
00:48:29.100 | So if you're working on something, you've time blocked it, I'm writing this chapter
00:48:34.940 | or whatever you're doing and your mind has an idea.
00:48:38.540 | What I would do is have somewhere open on your computer a blank text file.
00:48:42.500 | I'm a huge fan of blank text files.
00:48:44.100 | I call mine working memory.txt because I really think of it as like an extension of my brain
00:48:48.580 | and you can type fast, and you can write down, as fast as you can type, enough
00:48:54.620 | stuff that you're not going to forget it.
00:48:56.660 | Like yeah, idea about, you know, on podcast, giving away briefcases full of money, dot
00:49:03.660 | dot dot research, Mr. Beast, dot dot dot, boom, back to what you're doing.
00:49:08.380 | Another idea comes up, right?
00:49:11.300 | Like see if Elon Musk would be willing to do a jujitsu style fight with Mark Zuckerberg
00:49:19.260 | when they're on the show.
00:49:22.420 | Look into it, ask Jesse, question mark, just type it as fast as you can, back to what you're
00:49:26.780 | doing.
00:49:27.780 | And then at the very end of that session, just give yourself a couple of minutes to
00:49:31.440 | look at the stuff you captured.
00:49:33.980 | And by the way, this could also just be tasks that you think of, light bulb, right?
00:49:39.340 | Look at that list.
00:49:40.340 | Like, let me get this out of working memory.txt.
00:49:42.460 | If you have time right there, you do it.
00:49:43.940 | If you're back to back, if you're going from a work session to a meeting, it's fine.
00:49:47.900 | It's in that text file.
00:49:49.420 | You'll process that back down to empty at some point before the day is over.
00:49:52.260 | It's one of your capture bins.
00:49:53.260 | Next time you get a chance, you process that thing.
00:49:55.460 | If you're like, I kind of like this fight idea, I think it has merit.
00:50:00.220 | And you can figure out what to do with it.
00:50:01.280 | Like maybe it's still fledgling, so you're going to write it into your idea notebook.
00:50:05.620 | Now that'll get looked at next month.
00:50:07.240 | Maybe it's urgent, like we should just get rolling on this.
00:50:09.220 | So like, let me put a task into my task system, you know, buy Jiu Jitsu mats.
00:50:13.860 | And let me write this into my quarterly plan that I'm working on this.
00:50:16.500 | So I'll know to generate more tasks and you can process it that way.
00:50:19.020 | And then it gets processed out of there, it'll be fine.
00:50:22.100 | So that's what I would suggest.
00:50:23.100 | If you're a real idea racing mind type, use a blank text file as your intermediary, that
00:50:28.060 | when you're working, everything goes on there as fast as you can type.
00:50:31.060 | And you're right back to what you're doing.
00:50:32.820 | And you put aside time when you have it to process that down and figure out what to do
00:50:36.460 | with it.
00:50:37.460 | Be it idea file, be it going right into your multi-scale planning system.
00:50:40.660 | That I think is going to help your mind.
00:50:42.640 | The thing that will keep your mind obsessing over ideas you had is lack of trust.
00:50:48.340 | If your mind does not trust that those ideas will be looked at, that task will be handled,
00:50:53.080 | that concept will be contemplated more deeply.
00:50:56.300 | If it does not trust that, it's not going to give you peace.
00:50:58.540 | If it does trust that, it'll allow you to get back to the thing you're doing.
00:51:02.060 | It might take a little bit of practice with it, but after a few weeks, I think you'll
00:51:04.620 | find that is quite useful.
00:51:07.740 | All right.
00:51:09.340 | Well, let me take a quick break here to talk about a couple more sponsors that makes this
00:51:14.500 | show possible.
00:51:15.940 | Start with workable.
00:51:18.580 | Hiring is important.
00:51:19.740 | Hiring the right people is even more important, but it is difficult to do, especially right
00:51:23.380 | now in this pandemic or near post-pandemic market.
00:51:27.380 | It's difficult to hire.
00:51:29.180 | A tool like workable can help.
00:51:31.540 | It does a couple of things for you.
00:51:33.820 | One, it helps you cast the widest net possible by posting your jobs to all the top job boards
00:51:38.820 | with just one click.
00:51:40.860 | It also helps you evaluate and hire quickly with modern tools like video interviews and
00:51:44.660 | e-signatures.
00:51:45.660 | It can even help you automate repetitive tasks in the hiring process like scheduling interviews
00:51:50.180 | so you can spend your time on what's important, which is finding the right people to hire.
00:51:55.660 | So whether you're hiring for your coffee shop or your engineering team, workable is exactly
00:51:59.220 | what you need to get the right people fast.
00:52:02.680 | Start hiring today with a risk-free 15-day trial.
00:52:06.180 | If you hire during that trial, it won't cost you a thing.
00:52:10.700 | Just go to workable.com/podcast to start hiring.
00:52:15.620 | Workable is hiring made easy.
00:52:18.740 | So if you are trying to get hired or you're in your job and wanting to stand out, few
00:52:24.620 | things are more important than being able to express yourself clearly and professionally.
00:52:32.540 | This is where Grammarly enters the scene.
00:52:37.260 | Grammarly is software that runs on all of your devices and all the apps you use to write,
00:52:42.820 | and it helps you become a better writer.
00:52:47.060 | I've been messing around with Grammarly Premium recently, so there's a free and premium version.
00:52:51.660 | The premium version has even more features, and I am quite impressed with what this technology
00:52:55.540 | can do.
00:52:56.540 | So let's say, for example, you want to be clearer in your emails.
00:53:00.780 | You'll be able to now persuade your audience with a confident and polished tone using Grammarly
00:53:04.940 | Premium's tone adjustments.
00:53:06.820 | If you write a sentence that's too clunky and hard to understand, don't worry.
00:53:11.860 | Grammarly Premium has a full sentence rewrite feature that will help you make that clearer.
00:53:17.140 | They also have clarity suggestions.
00:53:19.980 | Don't use this word, that's jargon.
00:53:21.940 | Use this one instead.
00:53:22.940 | It's actually really pretty eerily good, the clarity suggestions.
00:53:25.760 | There's even a tone detector.
00:53:28.460 | Young people have a hard time with this.
00:53:30.860 | Like am I at the right level of professionalness?
00:53:34.020 | Honestly, I think if you're under the age of 25, you need the Grammarly Premium tone
00:53:39.740 | detector because young people have a hard time with this.
00:53:41.980 | It's just they have a different culture.
00:53:44.980 | They're on the phones these days, Jesse, with the tipping and the tapping and the emojis
00:53:51.220 | and the jargon.
00:53:52.220 | So you've got to get your tone right if you're going to be taken seriously.
00:53:55.500 | And until then, you've got to get off my yard.
00:54:00.260 | So get through those emails and your work quicker by keeping it concise, confident,
00:54:03.740 | and effective with Grammarly.
00:54:06.820 | If you go to grammarly.com/deep to sign up for a free account, when you're ready to upgrade
00:54:12.820 | to Grammarly Premium, you will get 25% off for being my listener.
00:54:18.020 | That's 25% off at g-r-a-m-m-a-r-l-y.com/deep.
00:54:28.100 | All right, 54 minutes.
00:54:29.780 | Let's do one more call, Jesse.
00:54:31.900 | Let's see what we got here.
00:54:33.700 | We've got a good question about Keystone Habits, especially focusing on the community bucket.
00:54:38.700 | Ooh, deep life question.
00:54:40.700 | Hello, Cal.
00:54:41.700 | Sean here from Miami.
00:54:44.580 | Some queries about Keystone Habits.
00:54:46.900 | I have been applying them with the community bucket as a special focus for this semester,
00:54:52.060 | being a better father, son, and partner.
00:54:54.860 | I have begun noticing more intentionality in those moments that the opportunity presents
00:54:59.380 | itself to become the person I wish to be.
00:55:02.060 | For that, you always have my gratitude.
00:55:04.420 | I originally thought it would be good to have a Keystone Habit about reflecting on my familial
00:55:08.980 | interactions.
00:55:10.340 | For example, did I review my interactions with my family, looking for what I should
00:55:14.020 | have said differently?
00:55:15.880 | But that feels lazy.
00:55:17.220 | It's a trivial commitment.
00:55:18.920 | So I thought of two other ones, a 10-minute uninterrupted conversation with my parents
00:55:25.420 | and not arguing in front of our daughter.
00:55:28.620 | Do these sound to you like decent Keystone Habits?
00:55:31.220 | I was also trying to refine my Keystone Habits for my craft.
00:55:35.300 | I was originally tracking, did I review the class that I taught after my class with the
00:55:40.020 | intention of finding three things I would have done differently?
00:55:43.660 | Do you have an opinion for what would be a good Keystone Habit for a practitioner or
00:55:47.420 | teacher trying to improve their craft?
00:55:50.100 | I also tracked my hours of personal practice, of course.
00:55:53.860 | Always with gratitude.
00:55:55.180 | Deep regards.
00:55:56.180 | Sean, you sound familiar.
00:55:58.860 | I think this might've been your second question in this episode, so well played.
00:56:03.900 | Doesn't happen that often, but it means you're asking good queries.
00:56:07.660 | Quick primer for people who are confused by the terminology.
00:56:12.180 | My concept of the deep life has this prescriptive process that comes with it.
00:56:19.180 | It's like one particular way to help try to cultivate a deeper life.
00:56:23.460 | And it's where you divide your life into the areas that are important.
00:56:25.980 | We call those buckets.
00:56:26.980 | The first thing you do is come up with a Keystone Habit in each that you do and track daily.
00:56:32.180 | That's what Sean is talking about.
00:56:34.340 | When you're talking about community, so Sean said community bucket, that's the area of
00:56:38.380 | your life focused on your connection to other people, your relationships.
00:56:42.780 | I'd like both those Keystone Habits.
00:56:45.660 | The uninterrupted conversation of a minimum length is a really good Keystone Habit.
00:56:49.940 | I might recommend not just making it for your parents, but you have the circle of people
00:56:53.180 | who are close to you in your life, your kids, your partner, your parents, maybe some siblings.
00:57:00.340 | And what you're tracking, what I would suggest is did I have an uninterrupted conversation
00:57:04.900 | at least 10 minutes with someone in that circle today?
00:57:07.620 | Check or no.
00:57:08.620 | The idea is getting in the habit of, like, every day I want to try to find some time where I can actually
00:57:11.900 | connect with someone.
00:57:13.740 | And it might just be talking to like my oldest kid.
00:57:16.620 | Just let's just talk and do nothing else at the table.
00:57:18.380 | Or it might be calling up a relative.
00:57:19.740 | And because you don't want to leave that Keystone Habit unexecuted on any given day, you're
00:57:24.900 | going to maybe put aside some time.
00:57:26.500 | You're going to shift some things, your schedule, you'll call on the way home.
00:57:30.020 | You'll do stuff you might not have otherwise done.
00:57:31.740 | This is the Keystone Habit working like it's supposed to.
00:57:34.900 | So I do like that idea.
00:57:37.260 | I think you're right that analyzing your prior conversations, I don't think that's
00:57:42.980 | that useful.
00:57:43.980 | I think it's more useful to actually get out there and do more interactions, more sacrifice
00:57:48.460 | on behalf of each other.
00:57:50.020 | So I think that's a good habit.
00:57:51.340 | Did you have another one, Jesse?
00:57:52.740 | I feel like he had another community Keystone.
00:57:57.180 | Now I'm racking my brain.
00:57:58.180 | He said he talked about the one about having the conversation.
00:58:05.300 | And then also going back and looking at his lectures.
00:58:08.020 | Yeah, there's a craft one, but he had a second community one.
00:58:10.140 | And the second community one.
00:58:11.140 | Oh, calling his parents.
00:58:12.140 | Yeah.
00:58:13.140 | I mean, that's the... oh, not arguing in front of the kids.
00:58:16.220 | Yeah.
00:58:18.220 | That's a good one.
00:58:19.220 | So if you're in a state where you find that you're having a lot of unnecessary arguments,
00:58:24.980 | you know, in front of your kids, or if you're getting mad at your kids too much, I've gone
00:58:27.620 | through phases where I've worried about that.
00:58:29.540 | I think having a metric you track that says, I didn't do that today,
00:58:33.620 | That's really powerful because you don't want to break the chain.
00:58:35.700 | You're like, you know what, I've done this for 20 days.
00:58:38.180 | So now when like my kid's being a pain, instead of yelling, it's like, man, I want to get
00:58:42.300 | that check.
00:58:43.300 | I don't want to break the chain.
00:58:44.300 | That can actually be quite powerful.
00:58:46.100 | Though I will say what I've learned, and this might be the same for what you're
00:58:50.940 | saying about yelling at your partner, but for like yelling at your kids, usually what I've
00:58:54.100 | learned is there's a deeper issue.
00:58:58.500 | The issue often has nothing to do with your kids.
00:58:59.740 | It has to do usually with your schedule.
00:59:01.380 | Usually you're overworked or you're stressed, you know?
00:59:04.340 | And so often the solution is not just, I mean, I think that's the proximate thing.
00:59:08.420 | It's like, let me just not yell at my kids. But really, in that particular case,
00:59:12.140 | you want to get to the point where I'm just not yelling that often because I have more
00:59:16.020 | breathing room in my schedule.
00:59:17.300 | I'm doing cleaner shutdowns.
00:59:18.860 | I'm exercising after work before I go into childcare mode.
00:59:22.780 | And so that's the only other thing I would tell you there is that it might be a seemingly
00:59:26.020 | disconnected Keystone habit that actually solves that problem.
00:59:29.820 | You could say, I want to, I want to not argue or yell.
00:59:32.300 | You could track that, but really the thing you want to track might be, did I do a shutdown?
00:59:36.260 | Did I exercise right after work?
00:59:38.620 | And that might actually be the thing you track that solves that problem.
00:59:40.660 | So just keep that in mind.
00:59:42.700 | When it comes to craft, yes, you're talking about teaching.
00:59:46.900 | Yeah, I would track, I mean, you got to figure out what is the, what is the tractable activity
00:59:52.020 | that really matters here to matters for the thing that you're trying to do.
00:59:55.180 | If you invent an activity that you like, I could imagine doing a 10 minute review of
00:59:59.380 | my course, but if that doesn't actually matter or make much difference, there's no point
01:00:04.280 | in tracking it.
01:00:05.280 | So if you can find an activity that really does make a difference and track whether or
01:00:07.980 | not you did it, that makes a lot of sense for the craft bucket.
01:00:11.420 | You know, like I track deep work hours because I write and I think for a living and I want
01:00:17.580 | to see those build up and I want to make sure that's something I prioritize. When it comes
01:00:22.260 | to teaching,
01:00:23.260 | I don't know.
01:00:24.260 | You have to figure out what is the thing you could do every day that really makes a difference.
01:00:26.540 | Is it having a 30 minute debrief session where you go back and clean up your notes for the
01:00:31.260 | next time you teach it?
01:00:32.620 | Okay.
01:00:33.620 | Then track that.
01:00:35.260 | Is it that you went and improved at least one thing in your course going forward, prospectively?
01:00:39.300 | Let me update this upcoming assignment.
01:00:41.060 | Let me find another exercise for the exam.
01:00:44.060 | And maybe your thing is I want to do this most days.
01:00:46.900 | It only takes a half hour at a time, but it'll build up to a lot of innovation.
01:00:50.380 | Then track that, but make sure that whatever you're tracking, it's something that really
01:00:54.060 | does make a difference.
01:00:55.780 | If you invent something that's tractable, but doesn't do much, your mind is going to
01:01:00.020 | say, I don't care.
01:01:01.020 | I mean, yeah, we put a check or we didn't, but I don't really care.
01:01:04.860 | All right, Sean, well, enjoy Miami.
01:01:08.340 | I'm glad you asked this question because it gave me a chance to review some of those deep
01:01:12.060 | life Keystone habit bucket type strategies that we talk about a lot,
01:01:17.140 | and then sometimes stop talking about for a while.
01:01:18.700 | So I'm glad to be able to reintroduce that back in.
01:01:20.740 | I think, Jesse, summertime brings out more of those reflective things because people
01:01:24.180 | have more time and they're vacationing.
01:01:26.220 | They're outside more.
01:01:27.220 | They're outside more.
01:01:28.220 | So we'll see.
01:01:29.220 | I think we're going to get a lot more.
01:01:30.220 | It's my theory, more deep life questions in the summer.
01:01:34.260 | Like right now we're kind of in a crunch mode.
01:01:36.180 | We're getting more work questions.
01:01:37.260 | We'll see if that's true.
01:01:38.260 | I think a lot more about deep life stuff in the summer.
01:01:40.460 | So we'll see if that holds true.
01:01:43.980 | Speaking of summer though, I should probably rest because I am exhausted and I am sick.
01:01:49.620 | So thank you everyone who sent in your calls.
01:01:52.220 | Go to calnewport.com/podcast to find out how to record your own listener call straight
01:01:57.820 | from your browser.
01:01:58.820 | If you liked what you heard, you'll like what you see.
01:02:02.020 | You can watch the video of this full episode at youtube.com/calnewportmedia.
01:02:08.540 | Be back next week, if I don't die of whatever illness this is, with a new episode of the
01:02:14.540 | Deep Questions Podcast.
01:02:15.540 | And until then, as always, stay deep.
01:02:18.060 | [MUSIC PLAYING]