
Before You Rot Away At Home - How To Rebuild a Life of Meaning In a Digital World | Cal Newport


Chapters

0:00 Deep Dive: On Screens and Solitude
24:07 How can I study at night after my doctor work?
34:03 Can you comment on Leopold Aschenbrenner’s Situational Awareness essay?
48:52 How do I successfully pursue my non-work values?
58:13 How can I become a better writer?
60:18 Can someone break into the top 0.1% of their respective field without periods of unsustainable and obsessive work?
65:49 Managing active projects
70:19 A follow-up from Episode 323
76:48 An Offline Person Tries TikTok for the First Time

Whisper Transcript

00:00:00.000 | So the writer Derek Thompson, who I know and I like, has a big new feature article in The Atlantic
00:00:04.720 | right now. Many of you sent it to me, so you probably have heard of it. It's titled "The
00:00:10.480 | Antisocial Century." Americans are now spending more time alone than ever. It's changing our
00:00:17.360 | personalities, our politics, and even our relationship to reality. For this article,
00:00:23.600 | Derek talked to a lot of different experts and explored a lot of different related ideas. But
00:00:27.600 | today, there's one point in particular from the article I want to focus on, because I think it
00:00:31.920 | represents one of the biggest issues created by our modern digital environment. The good news is,
00:00:38.080 | once we make that issue clear, the solution will also be quite obvious. All right, so to start
00:00:45.280 | here, let's talk a little bit more about what Derek is saying in this article, then we'll point
00:00:49.280 | out the part I care about. For those who are watching, instead of just listening, I have it up
00:00:52.480 | on the screen here. There's the headline, the opening graphic. I'm going to read a quote from
00:01:00.560 | this, but just to set it up, the key idea in this article is, Derek notes, a lot has been said about
00:01:07.760 | the so-called loneliness epidemic. Loneliness is an actual negative subjective state arising from
00:01:16.480 | the sense that you are not connected to other people. Derek says this is a bit of a misnomer
00:01:23.760 | in the sense that if you look at the data around loneliness in particular, it's not like that is
00:01:29.920 | getting a lot worse, or that's getting worse in some sort of pronounced way. He says the real
00:01:34.960 | issue is solitude, which he defines as time spent alone. Solitude does not depend on you feeling
00:01:43.440 | bad about it. It's just an actual physical state. Let me read you a key quote about this from his
00:01:50.000 | article. The privatization of American leisure is one part of a much bigger story. Americans are
00:01:56.560 | spending less time with other people than in any other period for which we have trustworthy data
00:02:02.000 | going back all the way to 1965. Between that year and the end of the 20th century, in-person
00:02:07.040 | socializing slowly declined. From 2003 to 2023, it plunged by more than 20%, according to the
00:02:14.880 | American Time Use Survey, an annual study conducted by the Bureau of Labor Statistics.
00:02:20.320 | So the issue is not necessarily that we're lonely, but that we're spending more time alone,
00:02:24.560 | and in a lot of cases we don't mind it. Derek goes on to say self-imposed solitude might just be the
00:02:30.880 | most important social fact of the 21st century in America. All right, so there's a lot of problems
00:02:38.240 | that Derek surveys in this article that come from this rise in solitude. But there's one point in
00:02:46.560 | particular that I want to highlight because I think it's particularly relevant to the modern
00:02:50.720 | digital environment. So again, I'm going to quote from the article here. This is Derek talking about
00:02:56.080 | one of multiple problems with solitude. "Richard V. Reeves, the president of the American Institute
00:03:02.720 | for Boys and Men, told me that for men, as for women, something hard to define is lost when we
00:03:08.800 | pursue a life of isolationist comforts. He calls it 'neededness,' the way we make ourselves essential
00:03:17.440 | to our families and community. I think at some level, we all need to feel like we're a jigsaw
00:03:22.640 | piece that's going to fit into a jigsaw somewhere, he said. This neededness can come in several forms,
00:03:28.480 | social, economic, or communitarian. Our children and partners can depend on us for care or income.
00:03:35.200 | Our colleagues can rely on us to finish a project or to commiserate about an annoying boss. Our
00:03:39.680 | religious congregations and weekend poker parties can count on us to fill a pew or to bring the dip."
00:03:45.760 | All right, so let's talk about this notion of neededness. I think we can kill this here, Jesse.
00:03:52.240 | In my book "Digital Minimalism," which actually made a lot of points that I think are
00:03:57.520 | being underscored by the experts in this article, I made this related argument where I said, look,
00:04:04.320 | when it comes to sociality, what our brain really looks for is us sacrificing non-trivial time and
00:04:13.440 | attention on behalf of someone else. So we have evolved to think about, if I am sacrificing
00:04:20.560 | non-trivial time and attention, so reproductively relevant, survival-relevant resources,
00:04:25.280 | on behalf of another person, that person is someone with whom I have an important connection.
00:04:30.000 | We're connected. We are in a community, right? This is an important person to me.
00:04:35.280 | So your brain is sort of measuring how much you sacrifice for someone
00:04:39.040 | to gauge how important that person actually is in your life.
00:04:43.280 | So you can imagine, if we're drawing a social graph, so we put points for all the different
00:04:48.880 | people around you, like in a tribe back in the Paleolithic period, you draw a line between people
00:04:54.000 | if they are sacrificing non-trivial attention on behalf of each other, and what you would want is
00:04:58.000 | your point in the middle of that graph to be densely connected into this web. You have lots
00:05:02.080 | of people to whom you're connected, and a lot of those people are connected to each other as well.
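To make the graph idea concrete, here is a minimal sketch in Python. Everything in it, the names, the hours, the one-hour threshold, is a made-up illustration rather than anything specified in the episode; the point is only that edges come from non-trivial sacrifice, so low-friction contact leaves the graph sparse.

```python
# Sketch of the "sacrifice graph": nodes are people; an edge exists only if
# a pair sacrifices non-trivial time and attention on each other's behalf.
from itertools import combinations

# Hours of non-trivial sacrifice exchanged this month (hypothetical data).
sacrifices = {
    ("you", "alice"): 3.0,   # walked with Alice to help her figure something out
    ("you", "bob"):   0.1,   # only traded a few low-friction texts with Bob
    ("alice", "bob"): 2.0,
    ("you", "carol"): 1.5,   # drove soup over when Carol was sick
}

THRESHOLD = 1.0  # what counts as "non-trivial" is a judgment call

people = {"you", "alice", "bob", "carol"}
edges = {pair for pair, hours in sacrifices.items() if hours >= THRESHOLD}

# Degree of "you": how many sacrifice-backed connections you actually have.
degree = sum(1 for pair in edges if "you" in pair)

# Density: edges present out of all possible pairs; low density = sparse graph.
density = len(edges) / len(list(combinations(people, 2)))

print(f"neededness links: {degree}, graph density: {density:.2f}")
```

Rerun with a fresh tally each month and this becomes the redraw-the-graph exercise described later in the episode.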
00:05:05.440 | So now imagine you're drawing one of these social graphs today. The problem is, if you're not
00:05:14.240 | sacrificing non-trivial time and attention on behalf of someone, you don't get to draw a line.
00:05:18.640 | And so we're seeing a lot of people's social graphs are sparse. And if your social graph
00:05:24.560 | is sparse, you're not feeling that neededness that Reeves talked about. So I just think this
00:05:30.480 | is two sides of the same coin. He talks about neededness. That's a subjective description
00:05:36.000 | of what it means to have a connection to an individual community, that they need you,
00:05:41.360 | you play an important role. I use sacrifice of non-trivial resources as a sort of quantitative
00:05:46.480 | or functional description of what this type of connection means. It's about what are you
00:05:49.760 | actually giving up on behalf of another person. So the more of these actual neededness or sacrifice
00:05:57.200 | connections you have to other people around you, the more resilient you become, the more fulfilled
00:06:02.560 | you become, the more satisfied you become about your life. So why then is this dissolving, as
00:06:09.440 | Reeves points out in Derek Thompson's article? Well, I think technology plays a major role in
00:06:14.000 | that story of dissolution. And it does so in two major ways that sort of work together into a
00:06:20.000 | negative symphony. So the first way that modern digital technology plays a role in this
00:06:26.160 | is it leads us away from that type of behavior, that sacrifice of non-trivial time and attention
00:06:33.760 | that's really required to feel connected to someone. Let's think about the ways in which
00:06:39.120 | it does this. Digital communication, low friction digital communication, simulates
00:06:44.320 | enough of the idea of connecting to another person that it can help stave off loneliness.
00:06:49.760 | But it doesn't require sacrifice. So it doesn't give us that neededness that we also crave.
00:06:56.240 | I think there's a subtle point that's really important here. It's easy to text message
00:07:02.160 | someone that's very low friction. It's easy to jump on someone's social feed, see what they're
00:07:08.160 | doing, and leave a comment. It's low risk, low friction. It doesn't take much energy.
00:07:12.640 | If you're doing enough of this, you're probably not going to feel lonely because you're interacting
00:07:18.960 | with people. Like, I am not alone in this world. There are other people that I am interacting with.
00:07:23.920 | But this is so low friction, it's not requiring you to sacrifice any non-trivial time and attention.
00:07:28.080 | You're not taking out your afternoon, putting everything aside to go for a walk with a friend
00:07:33.440 | to help them figure something out. You're not cooking soup and driving it over to your friend's
00:07:36.960 | house because they're sick and giving it to them. You're taking that, you're making that sacrifice
00:07:40.480 | to make their life better. You're not making any significant investment of resources.
00:07:44.720 | So the social circuits in your mind don't see these people as being a part of your social graph.
00:07:49.200 | So we get this mismatch. I don't have loneliness because I'm simulating these social connections.
00:07:56.160 | Without loneliness, what's driving you to sacrifice this time and attention? There's
00:07:59.920 | nothing left to drive you. It's comfortable to be at home. You don't feel particularly bad in
00:08:03.200 | the moment. Why get off the couch and go for that walk or deliver that soup? This is a point that's
00:08:08.320 | emphasized actually in Derek's article, that loneliness serves the purpose of feeling really
00:08:13.280 | bad. So to make that bad feeling go away, we get off our butts and go do things for other people.
00:08:18.720 | And the neededness follows. Social media, and digital communication
00:08:23.280 | more generally, short-circuit the loneliness loop. And so we feel completely contented to
00:08:28.160 | keep sitting there, not really noticing that that actual substantial social graph is quietly
00:08:33.520 | beginning to dissipate behind us. Social media itself, if we focus in here more, also plays
00:08:38.800 | a role in being led astray from these type of non-trivial sacrifice behaviors. Because it gives
00:08:45.680 | us a sense, if you're a user, that you're a part of a community. Yeah, I'm a community. I have
00:08:51.760 | leadership. I'm out there. I'm needed. There's my followers need me. Because look, I post the
00:08:58.000 | things and they give me reactions and it passes around. So again, it short-circuits the sort of
00:09:04.960 | natural human drive we have to be in community, to be there for our community, to be someone that
00:09:13.280 | people look up to and depend on. It kind of simulates that enough that we don't feel bad
00:09:17.040 | about ourselves. But those deeper parts of our social circuitry say we're not sacrificing on
00:09:22.800 | behalf of a community. We're not really out there doing something that is hard and requires energy.
00:09:28.720 | These aren't real connections. These are Potemkin podiums at which we're making our
00:09:35.840 | imagined grand speeches. But it's just an algorithm ginning up some fake response so
00:09:41.120 | that we feel important. So again, this is a theme that we see. Video games are doing the same thing,
00:09:44.960 | especially for young men. It scratches that itch to be a leader, to stand up and be someone people
00:09:50.000 | can count on, because your Call of Duty squad is killing a bunch of Nazis. But you're not really.
00:09:54.400 | You're deeper down. Your mind knows this isn't real. Where's the actual physical pain or hardship?
00:09:59.120 | Where's the time we're actually investing, helping the guy down the street dig his car
00:10:05.680 | out after the storm? We're not actually doing the stuff our brain counts. So again, this theme comes
00:10:09.760 | back again and again, where the technology scratches the itch that would otherwise drive
00:10:17.840 | us to do the stuff that matters, just enough that we don't do the stuff that matters. We have a
00:10:22.000 | disconnect: one part of our brain is happy with the simulated sociality, but the other part
00:10:27.440 | of our brain is not. We don't have that neededness, because we know deep down our Call of Duty squad
00:10:34.160 | and our social media followers don't really need us. Is it really a friend if all we're
00:10:38.160 | doing is trading text? Hey, it's Cal. I wanted to interrupt briefly to say that if you're enjoying
00:10:43.520 | this video, then you need to check out my new book, Slow Productivity, The Lost Art of Accomplishment
00:10:50.640 | Without Burnout. This is like the Bible for most of the ideas we talk about here in these videos.
00:10:58.000 | You can get a free excerpt at calnewport.com/slow. I know you're going to like it.
00:11:05.120 | Check it out. Now let's get back to the video. The final way technology, I think, is leading
00:11:12.000 | to this actually comes from the world of work. So it seems maybe like it's coming from out of
00:11:17.040 | nowhere, but I think it connects. So like I write about in my book, Slow Productivity, we have
00:11:21.520 | pseudo-productivity, the management heuristic that treats visible activity as a proxy for useful effort,
00:11:27.840 | combined with mobile computing, so now I can do work at a very visible, fine-grained level in any
00:11:32.480 | location on earth. Those two things have made us very, very busy, especially or notably outside
00:11:38.160 | of our normal work hours. Of course, I wrote a whole book about this. But from the point of view
00:11:43.520 | of what we're talking about here, neededness in the social graph, being more and more busy outside
00:11:48.480 | of normal work hours means there's less and less time to sacrifice non-trivial time and attention
00:11:53.680 | on behalf of other people. So that too is getting in the way of building these strong social graphs,
00:11:57.840 | which give us that sense of neededness. All right, so modern technology is playing a big role
00:12:03.760 | in this solitude problem. But I said there was a second way, that there's two major ways that
00:12:10.640 | technology is playing this role. Well, the first way we just talked about, it's making our graph
00:12:14.880 | sparser. The second way, and this is where it becomes an insidious, insidious, I almost said
00:12:21.600 | it, Jesse. I almost said insidious. Insidious cycle, it helps numb us from the pain of not
00:12:31.360 | having that neededness. It drives us away from neededness and then gives us the succor so that
00:12:37.040 | we can survive not having it or just barely. And that's where we really get that self-reinforcing
00:12:43.360 | cycle. I don't feel needed anymore. My social graph is sparse. I'm not really connected into
00:12:51.280 | a thick network of people who depend on me and I depend on them. This makes me uncomfortable.
00:12:57.040 | Let's distract myself. Let's TikTok, let's video game. Let's endlessly scroll. Let's get caught up
00:13:04.000 | in, I don't know, it could be a conspiracy theory or whatever we want to do that's going to give us
00:13:07.760 | some sort of distraction away from this big lack that's actually happening in our lives.
00:13:14.560 | Then we use devices more or we work more to try to fill in that void. And then we get even more
00:13:20.320 | distance from our actual sacrifice-driven social graph, and our neededness goes down even more
00:13:25.200 | severely. It's a terrible cycle. It was a cycle that got amplified, of course, by COVID and other
00:13:30.320 | types of trends with computing. And it brings us to where we are now and to where Derek's article
00:13:34.960 | is. All right, so here's the good news. Once we know what's going on here, the solutions are
00:13:41.920 | obvious. We got to add back more links to that sacrifice social graph. That's it. We got to add
00:13:47.040 | back more links. Now that we know that's the problem, that those are being taken out because
00:13:50.640 | of technology, we need to add those back in. Now, we could be indirect about this. And I think this
00:13:56.960 | is the problem: too often what happens in these discussions is we say, well, maybe we need
00:14:00.720 | to think about how to get rid of the forces that are causing this problem in the first place.
00:14:04.800 | And we have to completely reform both our personal relationship and our cultural relationship
00:14:08.320 | with technology and work so that we can finally have the time and drive to get back to building
00:14:14.160 | social lives in a way that we're more used to doing it. Or we could just say, I'm just going
00:14:17.360 | to go add links directly. We'll figure that out on the way. I just want to go sacrifice
00:14:21.360 | non-trivial time and attention on behalf of people I care about. Let me just go do that.
00:14:24.400 | Just do that first. Let's just directly add the lines back. And then we can figure out how to fix
00:14:30.320 | the bigger problem and fix our culture and get utopia. So what we need to do is not a hard thing
00:14:35.440 | to understand. Spend more time actually doing things for other people. That's what we need to
00:14:42.320 | do. For how many people in the last month have you gone out of your way to really be there
00:14:47.920 | or to sacrifice on their behalf? If you have a family, you've probably done it for your kids,
00:14:52.880 | maybe for one friend or another. But this should actually be something I'm doing multiple times a
00:14:57.440 | week. You start adding those lines back in your graph. You could even draw one of these things.
00:15:01.680 | Here's a dot for all the people I really know well. And each month, I'm going to draw a line
00:15:06.240 | if I do at least one non-trivial sacrifice on their behalf. And each month, how thick can I
00:15:10.240 | get this graph to look? How many points on this star can I actually create if I'm at the center
00:15:14.800 | and they're around the periphery? Not a bad exercise to actually do. Now, here's the good
00:15:19.600 | news. If you go right to that solution, what are you going to find? Well, you're going to find
00:15:27.040 | something getting reactivated within you. And suddenly that drive to be on the devices so much
00:15:33.280 | goes down because this is better. We're on our devices a lot because we were missing this.
00:15:38.640 | We're on our devices a lot because we're convincing ourselves this counts as sociality. But
00:15:42.000 | when we get re-exposed to the real thing, suddenly this other stuff, this digital simulation comes
00:15:49.040 | across as sort of trivial or a low-resolution simulation. It's no longer as appealing.
00:15:54.640 | As we get used to sacrificing other people, we see that's important. Now, I'm not going to do
00:15:59.920 | email all evening. We'll have to just figure that out. I'll have to figure out another approach to
00:16:03.200 | my work, either grow some confidence or change some systems, or this is just what it's going to
00:16:07.600 | have to be. It pushes back on the digital. So the digital pushed us into this problem.
00:16:13.360 | Sure. But instead of trying to fix our digital life first, let's go right back and fix this
00:16:17.840 | social problem. And actually the digital itself will suddenly seem less urgent. So I think that's
00:16:22.400 | the good news in this. Because really, what we do on this show is we're often navigating the
00:16:27.440 | perils of the modern digital environment, figuring out what's causing its disorders and mismatches,
00:16:31.520 | and then trying to figure out how to actually solve them. This is one of the biggest
00:16:35.200 | perils right now. This lack of neededness caused by the sparsification of the sacrifice-social
00:16:40.960 | graph. And no, Jesse, I don't like to create alliterative, unnecessarily technical terms.
00:16:45.200 | That's just how normal people talk. Let's just be clear about that. Sparsification of the
00:16:50.800 | sacrifice-driven-sociality graph. That's how normal people talk, let's be honest.
00:16:54.160 | This is, I think, one of the big problems of culture right now. Technology got us there.
00:16:59.120 | Technology is keeping us there. But going back to our roots as a social being suddenly makes
00:17:03.680 | technology's role in this seem more glaring and hard to miss. And therefore, the role that
00:17:07.680 | technology plays in our lives begins to reduce a little bit. So anyways, I want to throw it out
00:17:13.120 | there. I can't help but connect these type of issues to technology. And here's a place where
00:17:19.200 | we have a big negative impact, but we also have a very clear lever to pull to make things more
00:17:23.200 | positive. I like that phrase. Sparsification of sacrifice-driven-social graph. It's like
00:17:31.840 | a computer science paper title right there. I like it too.
00:17:34.720 | Yeah. All right. So there we go. We got a bunch of good questions. We have a reaction piece coming
00:17:42.320 | up later at Tech Corner, which, once again, is an article I just wrote for The New Yorker. So
00:17:46.240 | we're getting a bunch of Cal New Yorker this month. So I'm excited to get to that. But first,
00:17:51.440 | let's talk quickly about a sponsor. We actually have a new sponsor this week. This is a company
00:17:58.000 | that came into my life at a very opportune time. We're talking about Uplift. The Uplift desk is at
00:18:04.880 | the forefront of ergonomic solutions promoting better posture and health through adjustable
00:18:09.280 | standing desk design to help you live a healthier lifestyle. Plus, they have all kinds of accessories
00:18:15.280 | to keep you moving throughout your day, even if you work for only a few hours at your desk. Uplift
00:18:20.560 | came into our life as a potential sponsor, as I mentioned, at a good time because, man, my back,
00:18:25.360 | I'm having all sorts of problems. I don't talk a lot about this on the show. Very short answer.
00:18:31.360 | I got an abdominal injury, being awesome in the gym,
00:18:36.640 | just being really cool with weights, and people were thinking I'm awesome. I got an abdominal
00:18:42.320 | injury, screwed up my core, I needed a surgery, blah, blah, blah, the point being when your core
00:18:50.560 | gets messed up, and I had to wear braces and stuff for a couple months, and then that messes up your
00:18:54.480 | back, so now my back is really messed up. Now my abdomen is healed, my back is messed up, so now I
00:19:01.600 | have to re-strengthen the abdomen, and my back is hurting all the time. Anyways, man, do I
00:19:06.560 | understand now posture and how much this matters. I don't think I would have understood. The
00:19:13.040 | associate director of undergraduate studies I work with at Georgetown, he was showing me last semester
00:19:17.440 | his Uplift desk, which I was like, oh, that's a beautiful desk. It looks great. It's this bamboo
00:19:22.720 | style, and the lifting mechanism is now really built into the legs in a way you don't even know
00:19:27.200 | it's a standing desk. I feel like the old standing desks, correct me if I'm wrong here, Jesse, my
00:19:32.560 | memory of back in the day when this technology came around is basically the technology looked
00:19:37.120 | roughly comparable in size and footprint to Mike Mulligan's steam shovel. A huge contraption, there's a foreman
00:19:45.120 | blowing a whistle, and these giant gears start turning. These Uplift desks now, like, oh, that's
00:19:49.760 | just a normal desk in just like the normal leg on the side. It's like in there. I don't know how,
00:19:55.200 | anyway, so I was like, oh, it's a beautiful desk, but, you know, I was like, what do you need a
00:19:57.840 | standing desk for? Now I get it because it's like, oh my God, posture is everything for me right now.
00:20:02.000 | So I am working standing, sitting in all sorts of different ways, a lot more working standing.
00:20:06.880 | So one of the things I'm using now, and I don't mean to preach on this, but it's like my whole
00:20:10.160 | life right now, is Uplift, it's not just the desk, they have these accessories, the anti-fatigue mat.
00:20:15.600 | They sent me one of those, that's been a lifesaver, that's really helped. So you stand on this mat,
00:20:19.520 | so it's not just your full weight, just like on your feet, just on the hard ground. It's like,
00:20:25.840 | either you could wear those shoes that people wear now that have, and I think the official
00:20:31.760 | measurement is, like, 17 inches of heel, so you walk around like you're on clown stilts, or you can
00:20:37.760 | have this mat, and it's just so you don't have that pressure as much, like this stuff really
00:20:41.680 | matters in a way I didn't think about before. I also got, I'm going to bring it to the HQ,
00:20:45.680 | a wobble stool. I can put it in front of the 3D printer. So it's like you can sit on it,
00:20:51.760 | but instead of just sitting completely still, you can wobble on it and kind of like move your core
00:20:56.560 | around. It doesn't, you can't fall, you're not going to fall over, but you can like kind of
00:20:59.840 | move and work out your core. I originally was like, oh, my kids will like it, but like, oh God,
00:21:05.120 | I need this thing now. So anyways, I'm going on a rant here, but like, I am obsessed now with
00:21:09.680 | posture and ergonomics because my whole life is like about this right now. So Uplift came into my,
00:21:14.800 | came into my life at a good time. You guys got to care about this stuff. The Uplift Desk is the
00:21:20.320 | industry standard, but their accessories are cool as well. So anyways, let me get to an actual call
00:21:24.480 | to action here. Make this year yours by going to upliftdesk.com/deep and use our code DEEP to get
00:21:31.520 | four free accessories, free same day shipping, free returns, and an industry-leading 15-year
00:21:38.000 | warranty that covers your entire desk and an extra discount off your entire order. That might be the
00:21:42.400 | most free things I've ever heard for a discount code. That's Uplift Desk, U-P-L-I-F-T desk.com/deep
00:21:53.120 | for a special offer. And it's only available at that link. Start 2025 right, stand up, move,
00:21:58.160 | thrive with Uplift Desk. Also want to talk about our longtime friends at ExpressVPN. Look,
00:22:07.040 | going online without ExpressVPN is like not having a case for your phone. Most of the time,
00:22:12.640 | you'll probably be fine. But all it takes is just that one drop and you'll wish you had spent those
00:22:16.720 | extra few dollars on a case. Well, that's what it's like using the internet without a VPN because
00:22:23.360 | every time you connect to an unencrypted network, at the airport, you're at the hotel, you're at a
00:22:28.240 | coffee shop, your online data is not secure. Any hacker who is right there can see the packets that
00:22:34.400 | you are sending, including who it is that you're talking to. A VPN protects you from all of this.
00:22:40.560 | When you use a VPN, what happens is you take the message you really want to send,
00:22:43.920 | and you encrypt it, so no one can read it. And then you send that to a VPN server, then the VPN
00:22:50.880 | server decrypts it, talks to that site or service on your behalf, encrypts the response, sends it
00:22:55.040 | back to you. So now if I'm the hacker next to you at the coffee shop, all I can learn by looking at
00:22:59.440 | your packets being sent over the radio waves is that you're communicating with a VPN server. I
00:23:04.080 | have no idea what actual site or service you're talking to. All of that is obfuscated for me.
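To illustrate the tunneling idea just described, here is a toy sketch in Python using the third-party cryptography package's Fernet cipher. This is not how ExpressVPN is actually built (commercial VPNs use protocols like OpenVPN or WireGuard with a proper key exchange); it only shows why a local eavesdropper sees ciphertext addressed to the VPN server instead of your real destination.

```python
# Toy model of a VPN tunnel: the client encrypts the real request, so anyone
# sniffing the local network sees only ciphertext headed to the VPN server.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# Client and VPN server share a session key. (Real VPNs negotiate this with
# a handshake; hard-coding it here keeps the sketch short.)
session_key = Fernet.generate_key()
cipher = Fernet(session_key)

# 1. The client encrypts the real request before it leaves the laptop.
real_request = b"GET https://example-bank.com/account"
tunneled = cipher.encrypt(real_request)

# 2. The hacker at the coffee shop sees only this ciphertext going to the
#    VPN server's address, not which site or service you're talking to.
print("eavesdropper sees:", tunneled[:32], "... -> vpn-server")

# 3. The VPN server decrypts and talks to the site on your behalf, then
#    encrypts the response for the trip back.
assert cipher.decrypt(tunneled) == real_request
```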
00:23:08.720 | So you do need a VPN to protect yourself and your privacy. If you're going to use a VPN,
00:23:14.800 | use the one I recommend, which is ExpressVPN. Its encryption is super secure. There's enough
00:23:19.520 | bits there in the encryption that no feasible computer in the world could ever crack
00:23:23.440 | it. It's very easy to use. You just turn on the app, and now it's working. Just use your other
00:23:27.760 | apps and web browsers just like you would normally. It just automatically works in the background.
00:23:32.000 | You can use it on your phone, your laptop, your tablet, so it's easy to stay secure. It's been
00:23:36.800 | rated number one by not just me, but top tech reviewers like CNET and The Verge. So you need
00:23:43.600 | a VPN to be on the internet in the modern world. ExpressVPN should be your friend. So secure your
00:23:49.760 | online data today by visiting expressvpn.com/deep. That's expressvpn.com/deep. And you can get an
00:23:58.720 | extra four months free expressvpn.com/deep. All right, Jesse, let's do some questions.
00:24:04.800 | First question is from Kay. I'm a medical doctor switching specialties. This requires I study for
00:24:12.720 | entrance exams. I time block my nights for studying after my eight to five doctor work,
00:24:18.560 | but I struggle with this as I'm too tired. This leaves only the weekends to study. How can I
00:24:22.880 | improve on scheduling and following through on weekdays? Well, as I learned doing research for
00:24:28.320 | last week's episode on morning routines, I think the key is, and Jesse will agree with me,
00:24:32.400 | is to do your studying from within a cold plunge. Because what that's going to do
00:24:36.960 | is the cytokines are going to sharpen your focus muscles, and then it's going to be very difficult.
00:24:43.280 | Actually, you need a cold plunge that's deep enough
00:24:48.000 | that you can do pull-ups while in the cold plunge. You're maybe doing pull-ups in and
00:24:54.320 | out of the cold plunge, and then that will help you get after it. Okay, let's get to the heart of
00:25:00.560 | this here. Kay, I'm going to tell you first of all to use a phrase from the older episodes of the
00:25:07.280 | show, to face the productivity dragon here, which means confronting and accepting the reality of a
00:25:13.200 | particular workload that you're struggling with. You have a very hard job, and so you're finding
00:25:20.960 | it hard to also do a lot of hard studying after your job is over. That is just the reality. That's
00:25:25.920 | not a broken thing. It's not unusual. It's not inexplicable. It's not a problem. In fact, it's
00:25:31.440 | not at all surprising. It's hard to be a doctor. Those are long shifts. And so to study something
00:25:38.640 | intense after a hard shift might just be really hard. So we have to just accept that at first.
00:25:43.600 | But we can't ignore it. To ignore the productivity dragon is just to really want something to be
00:25:50.240 | doable, to be frustrated that it's not, and just hope if you get upset enough or focus on it enough,
00:25:58.000 | you can just sort of make it possible. That dragon is there, and sometimes it's going to
00:26:03.040 | block you from getting where you want to go. All right, once we accept that, now we can review what
00:26:08.000 | are our options and tools here without yet trying to assess whether any of these is going to solve
00:26:13.280 | the problem. Let's just put on the table, there's a dragon up here. All right, townspeople, what
00:26:18.240 | weapons do we have? Let's see what we have, and then we can build a plan. So in your case, there's
00:26:22.560 | a few things that could be relevant. Better energy could help, right? There's things you could do so
00:26:28.240 | that you're coming off of your shift, you have higher energy, maybe you're able to persist in
00:26:31.920 | more studying than you are now. These are things like sleep, exercise, and nutrition. Those are
00:26:36.560 | probably the big ones, maybe with a good shutdown routine from the doctor job. In theory, if you're
00:26:43.520 | in really good shape and have really good health, in theory, you probably would have more energy
00:26:50.480 | to spare. I don't know if that's going to be a lot because the physical and intellectual
00:26:55.040 | are related, but they're not completely congruent. But that's a tool we have on our table.
00:26:58.800 | That might take a while, though. Getting healthy, getting in good shape, that takes time.
00:27:03.360 | And kind of ironically, you don't have a lot of time each day to work on this. We could get
00:27:07.760 | around that, doctors get around that, they work out at the hospital, etc. But it takes a lot of
00:27:12.000 | time to, you know, get healthier. And you might have to be done with this in a couple
00:27:15.920 | months. Alright, another tool, though, better study habits. So if you're using the right study
00:27:21.200 | habits, maybe with a shorter amount of time per day, you can get more out of it. Also, studying is less
00:27:27.200 | exhausting when it's more focused and you trust it, right? When you're a really good studier,
00:27:32.240 | I was a really good studier, I wrote books about how to study. When you're a really good studier,
00:27:36.160 | it's a lot less exhausting. Why? Because the sense of exhaustion from studying in particular
00:27:43.280 | is sometimes generated from your mind having resistance to the activity that you're about
00:27:48.160 | to do. It doesn't want to do it. So it's like, I don't want to do this. And now you're competing
00:27:52.080 | with your mind trying to drag it into this activity. That's exhausting and not super
00:27:56.560 | sustainable. Why does your mind reject studying? Well, one of the reasons why it rejects it is
00:28:03.840 | because studying is not a precise verb. Your mind doesn't think you have a particularly good plan
00:28:09.760 | for how you're going to get prepared. Your mind knows that you're just going to sit down and open
00:28:12.880 | up your books and then look at your phone and then look at your instant messenger and then kind of
00:28:16.400 | read some things and kind of look over at something else. It has no confidence that this is going to
00:28:20.960 | lead to anything good at all. And so it holds back motivation. So now you're dragging your
00:28:26.560 | mind through it. By contrast, if you're a really good studier, your mind's like, oh, yeah, we got
00:28:30.160 | a good plan. We know how to prepare for these types of questions. We've got these 50 hard minutes,
00:28:34.480 | and we are going to really make progress in these 50 minutes. You're going to have a lot more
00:28:39.360 | motivation to do it, even if it's intellectually harder. So improving your study habits is
00:28:43.840 | something else that could help here. Let's step back now and look at more drastic or
00:28:48.800 | reconfiguration-based plans. You could just take longer. Maybe you want to sit for these
00:28:56.000 | entrance exams in three months. Maybe you're like, what I really need to do here is do this six
00:29:00.640 | months from now or a year from now, because my studying is going to be—I can't study every night.
00:29:06.000 | And maybe I'm doing my study on the weekends or just one night a week. It's going to take me a
00:29:09.600 | lot longer to prepare. So if I push this off by a year, then I can get there in a reasonable time
00:29:15.600 | frame. That's like a real slow productivity type of idea. No one knows how long it took you to do
00:29:20.800 | something. They just know in the end what things you did. And often the key to sustainability is
00:29:24.880 | simply just taking longer. Ten years from now, all people are going to know is like, oh, you made
00:29:29.840 | this shift in your clinical practice. They won't remember exactly how long it took from you
00:29:34.240 | having this idea to you taking the entrance exams. The final tool we can put on the table here is
00:29:39.600 | change your work situation temporarily. Maybe you take a leave for two weeks. You can just do
00:29:44.480 | nothing but seriously study and just get this thing done. So we have different tools on the table.
00:29:51.840 | And your question is just, okay, what combination of these is going to get me where I need to get?
00:29:56.080 | Facing the productivity dragon rarely leads, as people fear, to the conclusion that this thing
00:30:03.840 | that's important to me can't be done. That's not what happens. What happens is you come up with a more
00:30:08.720 | reasonable plan for how to get there. And it might not be as easy as you hope or as quick as you hope
00:30:12.960 | or as painless as you hope, but typically you find a way to take care of that dragon. Once you actually
00:30:18.640 | see it and you're looking at it and having an honest conversation about what options you actually
00:30:22.160 | have. So probably some combination of those things I mentioned will get you there. And almost
00:30:27.920 | certainly it's not going to end up being as quick or as easy as you hoped when you first went down
00:30:31.440 | this path, but that's okay. Sometimes paths have dragons on them. We still have to figure out how
00:30:36.160 | to get up to the castle. So hopefully, no, not hopefully: you will find a way to get there.
00:30:41.760 | When did you come up with the term productivity dragon?
00:30:45.040 | I feel like it was early in the show.
00:30:46.880 | Oh, it wasn't before?
00:30:48.560 | It was definitely early in the show because when I was listening to it,
00:30:51.680 | I loved it when I was just a fan.
00:30:53.280 | Should we look it up?
00:30:54.000 | Well, was it? I don't know. I was thinking maybe you discovered it when you were in your 20s.
00:30:59.440 | No, no, no, but it's possible. So, I'm looking this up now. So here's what's possible is
00:31:03.200 | that like I wrote about it on my newsletter around the time the show was coming up.
00:31:08.880 | Okay. I thought you had it for 15 years before.
00:31:12.400 | But God, now you're making me doubt myself, Jesse. All right.
00:31:15.760 | Man, so it's definitely early Study Hacks because I'm seeing a clip from August of 2020.
00:31:27.200 | Okay.
00:31:28.080 | So that'd be pretty early, but here's a-
00:31:29.840 | It's like five years old.
00:31:30.800 | Here's an article from July of 2020 on confronting the productivity dragon, take two. Okay. So
00:31:39.040 | you know why this is take two? It wasn't great.
00:31:42.480 | So I wrote this article, now I remember this, July of 2020.
00:31:45.280 | I wrote this article about confronting the productivity dragon.
00:31:49.680 | And so I just grabbed some image online of St. George fighting a dragon
00:31:53.920 | because that's the classic. And here's a picture of it now, of the artwork.
00:32:00.160 | I didn't realize the picture I had grabbed, I guess, St. George has
00:32:03.760 | white supremacist connections as well.
00:32:06.800 | I kid you not, this picture I posted was St. George stabbing a dragon,
00:32:13.680 | and it was, like, his cloak or his sword: swastikas. Swastikas.
00:32:18.880 | People are like, all right, that checks. That's what I feared.
00:32:25.520 | I think I put a note about this. I don't know. I didn't, but, oh yeah, I did down here.
00:32:30.480 | It was from Wikicommons. That's why, you know, I was like, oh, it's a Wikicommons,
00:32:35.120 | like, you know, a no-copyright image. So I didn't expect it.
00:32:38.080 | Okay. So here's what I said. In my first attempt to post this article,
00:32:42.400 | I grabbed an image of St. George from Wikicommons that seemed to be of the
00:32:45.360 | right resolution and dimensions, but I missed one crucial detail.
00:32:48.320 | His heraldry was full of swastikas. Whoops.
00:32:51.280 | Anyways, if I read this article from July of 2020, it opens by saying,
00:32:59.200 | on a recent episode of my podcast, someone asked me, and I mentioned this term.
00:33:04.320 | So I think it was the podcast, a very early episode.
00:33:07.520 | Yeah. I came up with it in a very early episode of the podcast.
00:33:11.440 | Okay. I kind of want to return to it.
00:33:13.840 | Let's revisit the productivity dragon. I love the term. I just thought maybe you
00:33:17.200 | had, like, a poster of the dragon in your college dorm or something.
00:33:21.360 | I don't know. Maybe I did. So I had to look it up, but no,
00:33:24.480 | it mainly is just a vehicle for me to put swastika imagery on.
00:33:29.360 | I thought like my site was going to get put on a hate watch list or something like that.
00:33:36.720 | I was like, just probably bots that are just, you know,
00:33:40.960 | following sites and be like, Oh, they posted a lot of swastikas. Like we got to take them.
00:33:44.320 | You know, it's a good thing you weren't on Facebook at the time.
00:33:46.480 | Yeah. That would not have gone well. Yeah. Maybe that's the real reason
00:33:49.600 | why I'm not on social media. They just, I got kicked off all of them for,
00:33:52.320 | and it got reposted a few places too, I think, because people just repost my articles.
00:33:57.440 | Anyways, we should do a productivity dragon, like revisit.
00:34:00.480 | Okay. I'll note that.
00:34:01.600 | All right. What do we got next?
00:34:02.560 | Next question is from David. As a non-tech person interested in tech, I enjoy your comments on AI.
00:34:09.680 | Can you comment on Leopold Aschenbrenner's Situational Awareness essay?
00:34:14.880 | It's getting a lot of hype and criticism.
00:34:17.200 | Yes, I think I have something. I loaded up something here. Okay. I'm not going to load up.
00:34:24.480 | So for people who don't know, Leopold Aschenbrenner, I think now he's an investor.
00:34:31.360 | He runs a fund, but used to be in tech. Wrote this essay called Situational Awareness,
00:34:38.160 | covering AI from now to 2034, in which, basically, he's synthesizing all these conversations
00:34:45.680 | with people in tech, and he's laying out this vision of axioms and predictions for the future
00:34:51.120 | of AI. And it's pretty extreme. And because of that, it's gathering a lot of attention.
00:34:55.520 | I'm not going to read that or even go through the essay because I think it's like 160 something
00:35:01.360 | pages long. I mean, it's a book basically. It's like this huge, really long thing.
00:35:04.400 | But I did find a good one, Mike Allen has a good summary of the main points on Axios.
00:35:10.560 | So I'll read a few of these. I have it on the screen here for people
00:35:14.160 | who are watching instead of just listening. Like here are some of the points that were made,
00:35:19.360 | some of the stickier points that were made in this big, long essay.
00:35:22.160 | One, trust the trend lines. The trend lines are intense, and they were right.
00:35:26.480 | The magic of deep learning is that it just works, and the trend lines have been astonishingly
00:35:30.560 | consistent despite naysayers at every turn. Another big point, over and over again,
00:35:35.680 | year after year, skeptics have claimed deep learning won't be able to do x.
00:35:38.800 | They've been quickly proven wrong. Point three, it's strikingly plausible that by 2027,
00:35:44.240 | models will be able to do the work of an AI research engineer.
00:35:48.800 | By 2027, rather than a chatbot, you're gonna have something that looks more like an agent,
00:35:52.400 | like a coworker. Number five, the data wall, there's a potentially important source of variance
00:35:58.960 | for all this, we're running out of internet data. Number six, AI progress won't stop at the human
00:36:04.480 | level. We would rapidly go from human level to vastly superhuman systems. He points to the idea
00:36:11.360 | of superintelligence possibly by 2030 and so on. These are the types of ideas that are in this
00:36:20.320 | essay, and it's an interesting essay, and it's getting a lot of attention. I would say, be wary
00:36:27.120 | to just naively dismiss this essay because Aschenbrenner knows a lot about this technology, and he
00:36:32.800 | really did talk to a lot of people. He really has a sense for it. On the other hand, you do have to
00:36:37.120 | take his essay with a grain of salt because his fund is basically focused on we're investing in
00:36:43.200 | the technologies that are going to lead directly to AGI. That's his pitch to investors. So it is,
00:36:47.440 | of course, very much to his benefit for people to believe that
00:36:52.640 | AI technology trends are very extreme and noteworthy because that's the pitch of his fund
00:37:00.000 | as well. So you have to keep those things in mind. I'll add a couple observations. These aren't like
00:37:05.920 | limits to what he's saying, but I'll put a couple of potential braking observations to keep in the mix
00:37:14.240 | here. So one thing that interests me that Aschenbrenner doesn't talk about in the summary, at
00:37:19.360 | least, he says over and over again, year after year, skeptics have claimed deep learning won't
00:37:23.760 | be able to do X and have been quickly proven wrong. If there's one lesson we've learned from
00:37:28.640 | the past decade of AI, it's that you should never bet against deep learning. Well, it is true. Its
00:37:34.160 | capabilities keep growing. And when we say, well, it's still bad at this, then engineers work on
00:37:40.400 | this, and then for a lot of those thises, it gets better. But there have been, year after year,
00:37:46.800 | predictions that have not been coming true: the predictions about the practical impact in
00:37:51.600 | our lives. As soon as ChatGPT came out, it was like, we're six months away from this
00:37:56.400 | disruption. Whole industries are going away. Homework apocalypse, education as we know it is
00:38:01.360 | done. These whole sectors are gone. Look, this guy over here fired half of his call center.
00:38:06.560 | That's going to be everyone. These jobs are gone. So far, there's been almost no major disruption.
00:38:14.000 | So the one place where there is a gap is the impact gap. So the connection that we've been
00:38:20.480 | getting wrong is we thought there would be this tight coupling between functional breakthrough
00:38:25.760 | and disruption. That as the magnitude of a functional breakthrough on AI models
00:38:31.120 | jumped, the immediate disruptions would jump as well. Well, it turns out that there's at the very
00:38:36.240 | least a large lag between these two things. I think this is a significant thing to keep in mind.
00:38:40.400 | It is turning out that to make this technology high impact on people's day-to-day lives,
00:38:48.240 | there is no escaping the actual sort of hard, hard-to-predict product development cycle.
00:38:55.360 | Just the fact that these models can do amazing things doesn't mean they're doing
00:39:00.560 | amazing things in people's lives. People still have to now do the painstaking work of integrating
00:39:05.040 | this AI into specific products. Nine out of ten ways you do this are not going to work or be that
00:39:09.840 | useful. So it's hard. There's competition. Companies are going to fail. Initiatives are
00:39:13.440 | going to lose money for these big companies. And then in there, you're going to find, "Oh,
00:39:17.040 | here's the right product that actually works." The consumer internet was the same way.
00:39:22.160 | We knew it was a big deal. And a lot of companies were like, "This is a big deal. This changes
00:39:26.400 | everything," which it did. But we thought at first, "Great. So if I just put money into anything
00:39:32.000 | internet, it's going to be successful." And it wasn't. And most of the early things we did didn't
00:39:35.280 | really work. And we had the first dot-com crash in the early 2000s. And then what ended up being
00:39:41.200 | required was years of different companies and startups and people trying, "Well, what's the
00:39:45.040 | right way to get the internet to people? Or how do people actually want to use it?" And then we got
00:39:49.280 | out of it some of these Web2-based models that have then become incredibly profitable. But you
00:39:54.880 | had in 1999 people being like, "Yeah, well, Time Warner should be on this. We'll buy AOL. We'll
00:40:02.240 | have this online version of the articles. And Webvan. We'll have warehouses full of food you
00:40:07.360 | can buy on the internet. And we're all going to make a lot of money." None of that worked.
00:40:11.520 | But you fast forward another 25 years, and Meta has a trillion-plus dollar market cap. So they
00:40:18.240 | were right, but it just took a long time to try to figure out what works and what didn't. So that's
00:40:22.160 | going to slow down AI's progress some. Because we're three years out from highly capable language
00:40:28.160 | models and don't yet have large disruption use cases. So just that lag is longer than we think.
00:40:35.920 | On the flip side, that means when the disruptions come, it might seem like it's coming out of
00:40:39.200 | nowhere because it's not going to be tightly coupled to an innovation. I actually think the
00:40:43.120 | power—I was just giving a talk at Microsoft recently. We were talking about this. I actually
00:40:48.480 | think that we have sufficient capability and AI tools today to support major disruption to the
00:40:56.640 | way knowledge work happens. We don't actually need any future innovation, like we
00:41:01.280 | could freeze everything where it is now. We have sufficient capability and power in these
00:41:06.960 | models for significant disruption to knowledge work. We just haven't figured out the right tools or
00:41:10.880 | way to integrate it yet in the products. And that's what everyone's working on right now.
00:41:13.680 | So it might be slower till the everyday person is feeling the disruption.
00:41:19.840 | But the disruption might also seem to be somewhat out of nowhere because, again, it won't be tied
00:41:24.720 | to a recent innovation. It will be a product innovation that finally just works just right.
00:41:31.280 | All right. So that's one idea I want to point out that I think is relevant.
00:41:36.320 | The second is I think there's a pause or wall or sort of AI mini winter that is coming up
00:41:44.400 | because there are two limits we're coming up against. Aschenbrenner mentions one of these
00:41:50.560 | limits. One, we're running out of data. So this idea of we're going to train these sort of
00:41:54.960 | transformer-based language models, these feedforward language models, we're going to
00:41:58.480 | train them with text data. There's not much text left. We've kind of used all the text on the
00:42:04.320 | Internet. I mean, Meta has this advantage. I heard Scott Galloway talking about this. I think it's
00:42:08.560 | probably a smart analysis. He was saying don't bet too hard against Meta right now because there's a
00:42:14.000 | couple wins in their favor, like TikTok perhaps going away, which will be good for them because
00:42:20.000 | Reels has become an effective TikTok clone. But the other thing is they have a lot of extra text
00:42:24.080 | because of all the platforms. All their platforms have these giant archives of text, and text is
00:42:28.000 | what you need to train these. And so maybe they can eke out some more training than, say, OpenAI,
00:42:33.200 | which maybe is just limited to the full open Internet and every book ever written. So to also
00:42:37.120 | have everything ever said on Facebook, that's more text. So more text helps. But we're kind
00:42:40.960 | of running out of text to train these things on. So we're sort of getting to the limit of data.
00:42:46.160 | Because of the inefficiency of how the training happens and knowledge is represented, you need a
00:42:52.320 | ton of data to train these things. So we could be kind of running out of ways to get new
00:42:55.200 | capabilities. Now, there are different training methods that matter, right? Like the o1 model,
00:43:02.560 | and I don't want to go too deep down this rabbit hole, but the newest ChatGPT is better
00:43:06.960 | at reasoning. And this is in part due to the way they train it now. It turns out you can make these
00:43:12.000 | things a little bit better at, feed forward networks better at reasoning if you do a particular
00:43:16.400 | type of training where what you really do is you say, when you give an answer, it's a simple idea,
00:43:21.200 | but it has a huge impact. There's actually a Dartmouth kid who figured this out. Jason Wei,
00:43:25.600 | I think. But it's a simple idea. When you give an answer, ChatGPT, we want you to explain
00:43:32.240 | your steps that lead to your answer. And if you give an answer that doesn't have a lot of steps,
00:43:38.240 | we're going to zap you during training with a bad signal. And if you give an answer where you kind
00:43:42.880 | of spell out your steps, we're going to zap you with a good signal. That's called reinforcement
00:43:47.280 | learning. And we're going to add that onto your normal training. This is how they train these
00:43:50.480 | models not to like say bad things or to avoid certain topics. Well, they just say, oh, we'll
00:43:53.920 | just zap them while we're training. Like, hey, show your work. So if I ask you like a math problem,
00:43:59.760 | don't just say the answer to that math problem is 27, I'm going to give you a happy zap if instead
00:44:04.160 | of just saying that, you say, well, let's walk this through. We started with this many apples,
00:44:08.000 | and we took away this apple, so we've left this many apples, so now the answer is 27.
00:44:11.360 | So by zapping it, like, hey, we really like when you show your work. Now, when you're training
00:44:16.320 | these networks, they're more likely to train in a way that actually captures more of the logic,
00:44:21.440 | because they have to actually say the steps along the way, and then they're more likely to do
00:44:26.080 | reasoning better. So there's stuff you can do.
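To make the "zap" concrete, here is a toy sketch in Python. Real training of this kind scores outputs with a learned reward model and nudges the network's weights accordingly; this hand-written scorer is only meant to illustrate the shaping signal: answers that spell out their steps earn positive reward, bare answers earn negative reward.

```python
# Toy reward signal for "show your work": multi-step answers get a good zap,
# bare answers get a bad zap. (Illustrative only; actual reinforcement
# learning uses a learned reward model, not a hand-written rule like this.)

def step_count(answer: str) -> int:
    """Count non-empty lines, treating each as one worked step."""
    return sum(1 for line in answer.splitlines() if line.strip())

def reward(answer: str) -> float:
    """Positive reward for answers that show several steps, negative otherwise."""
    return 1.0 if step_count(answer) >= 3 else -1.0

bare = "The answer is 27."
worked = (
    "We started with 30 apples.\n"
    "We gave away 3 apples.\n"
    "30 - 3 = 27, so the answer is 27."
)

print(reward(bare))    # -1.0: no steps shown, discourage this behavior
print(reward(worked))  #  1.0: steps shown, reinforce this behavior
```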
00:44:29.360 | But we are going to hit a limit where we're going to run out of data. I also am this big believer that the feed-forward network model, there's only
00:44:33.280 | so far we can get with that. There is no state, there is no recurrence, there is no looping,
00:44:38.080 | there is no let's try out a bunch of things. There's no here's a novel state of a problem
00:44:42.240 | in the world, and we want to now explore what to do with this and compare this to other stuff we
00:44:45.840 | know. Feed-forward model, everything has to be stuck in these forward connections of the deep
00:44:49.520 | learning model. So I think the limitations of that structure plus data limitations means we
00:44:54.240 | might hit an AI mini winter. The way we're going to break out of that, I think, is going to be with
00:44:58.720 | more complicated model structure. We're going to have multiple models. Individual models might go
00:45:03.680 | through deep learning to actually learn what they're doing. But they're going to interact
00:45:06.880 | with each other. And some of these models or modules are going to be human coded and not
00:45:10.560 | learned. And it's going to be in the ensemble of different models. This is keeping a state.
00:45:14.800 | Here's a simulator model. Here's like an understand the world model. Over here is a
00:45:18.400 | prediction model. Over here is like a meta model. All of these working together is what's going to,
00:45:24.400 | I think, get us out of the AI mini winter and actually move AI to that next level,
00:45:29.440 | which is going to be a much bigger step towards something like AGI. So I'm getting kind of
00:45:32.560 | technical here, but there we go. AI mini winter is going to come, but then we'll eventually get
00:45:36.480 | through it. And the impact gap on AI, we should not look down on. It takes years, actually, to go
00:45:42.400 | from this tech is great to this tech is having a great impact on people's lives. We got to factor
00:45:46.800 | that in. So that wasn't quite 165 pages worth of material, Jesse, but I think it was close.
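For a sense of what that multi-module structure could look like, here is a minimal sketch. The module names echo the ones mentioned above (world model, simulator, prediction, meta), but the interfaces and scoring are invented for illustration; in a real system each piece would be a learned network or a hand-coded component.

```python
# Invented sketch of an ensemble of modules with shared state and looping --
# the things a single feed-forward pass doesn't give you.

class WorldModel:
    def interpret(self, observation):
        return {"facts": observation}     # parse the situation into state

class Simulator:
    def rollout(self, state, action):
        return (action, state)            # imagine what an action would do

class Predictor:
    def score(self, outcome):
        return len(str(outcome))          # stand-in for a learned value estimate

class MetaController:
    """Keeps state between calls and tries out candidate actions."""
    def __init__(self):
        self.world, self.sim, self.pred = WorldModel(), Simulator(), Predictor()
        self.memory = []                  # persistent state across problems

    def decide(self, observation, candidate_actions):
        state = self.world.interpret(observation)
        self.memory.append(state)
        # loop over options, simulate each, keep the best -- explicit search
        return max(candidate_actions,
                   key=lambda a: self.pred.score(self.sim.rollout(state, a)))

ctl = MetaController()
print(ctl.decide("door is locked", ["push", "wait", "find the key"]))
```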
00:45:52.080 | On Rogan, Zuckerberg talked about how AI can basically do the work of an average
00:45:57.120 | programmer now. Did you hear that? I mean, it just depends what you mean by that.
00:46:02.000 | Yeah. Yeah. Yeah. It's good at generating code.
00:46:04.960 | But it's unclear. So when you look at professional programmers, it can produce OK code,
00:46:13.040 | but that's not really where it's impacting productivity in programming. Where it's
00:46:16.720 | impacting productivity in programming, based on the programmers I've talked to,
00:46:20.240 | is it's preventing you from having to do what for the last 10 or 15 years programmers have been
00:46:25.760 | doing, which is I know there is some sort of library call I need to make here to
00:46:31.040 | erase the screen or whatever. I don't remember what it's called. So I'm going to Google it,
00:46:37.840 | Google is going to load up a page on the Stack Overflow forum, and the Stack Overflow forum is
00:46:43.440 | going to have the answer like, oh, that's the name of that library. And what are the parameters? OK,
00:46:48.000 | great. And then you go back over and type it in. So this is a lot of programming nowadays.
00:46:52.240 | You don't master, you don't memorize everything. You're constantly Googling things,
00:46:56.880 | and then you're getting answers and going back to what you're doing.
00:46:59.680 | AI is very good at like, I don't even have to do that. I can just start typing,
00:47:03.760 | and it kind of figures out like, oh, you're looking for this. Here it is. Here's the name
00:47:07.520 | of the library. Here's the parameters. You don't have to leave your development environment.
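As a toy illustration of that in-editor flow, here is the earlier "erase the screen" example in Python. The completion shown is just one plausible suggestion an assistant might offer; the point is that the name and parameters appear without a Google and Stack Overflow round trip.

```python
# You half-remember there's some call that clears the terminal screen.
# Instead of Googling for the Stack Overflow answer, the assistant
# completes the call, name and parameters included.
import os

def clear_screen() -> None:
    # Assistant-style completion: 'cls' on Windows, 'clear' elsewhere.
    os.system("cls" if os.name == "nt" else "clear")

clear_screen()
```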
00:47:10.640 | Or you can kind of even ask it like, what's the thing I need to draw circles?
00:47:14.480 | Or just write it like what would I call it? Circle drawing thing. And it says, is this what
00:47:19.840 | you mean? And like, yeah, that's what I mean, right? So for programmers, it's literally shaving
00:47:26.880 | time off of what they're doing. But it's not like they're pasting in a whole bunch of AI-written code. On the other hand,
00:47:31.600 | I know a lot of people who aren't programmers at all who are now building simple programs who
00:47:35.840 | wouldn't have been able to without AI. This is one of the ideas. I kind of introduced this in
00:47:39.680 | this talk I was giving the other day. But I think one of the first big productivity impacts it's
00:47:44.320 | going to have in knowledge work is really going to be this: unlocking complex software
00:47:51.280 | capabilities in individuals without complex software training. And it's not just with
00:47:56.480 | programming; it's just that software has powerful capabilities, and often only power users know how
00:48:02.320 | to use them. AI is going to make it easier for non-power users to get power capabilities. So I'm going to be
00:48:07.440 | able to do crazy stuff in Excel without having to really understand Excel macros and how these sort
00:48:12.480 | of complicated things work, because I can just kind of describe what I want. And the AI can
00:48:16.640 | understand that and turn it into a macro language that Excel understands, and I can get it done.
00:48:20.960 | So that's where I think the first productivity gains are going to happen is unlock these more
00:48:25.440 | powerful features. So now I don't program, but I can write a simple program. That's useful.
00:48:30.480 | I kind of know about Excel, but I don't know how to do an advanced sort, or swap the rows with
00:48:36.080 | these numbers with these others -- I don't know how to do those operations. Now AI will help me do it.
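Here is a sketch of that describe-it-and-get-the-macro workflow. The `ask_model` function is a stand-in for whatever LLM API you would actually call, and the returned VBA is a plausible example of output, not a recorded session.

```python
def ask_model(prompt: str) -> str:
    # Placeholder: swap in a real LLM API call. A canned answer is returned
    # here so the sketch runs on its own.
    return (
        'Sub SortByColumnB()\n'
        '    Worksheets("Sheet1").UsedRange.Sort _\n'
        '        Key1:=Worksheets("Sheet1").Range("B1"), Order1:=xlDescending\n'
        'End Sub'
    )

request = ("Write an Excel VBA macro that sorts the rows in Sheet1 by the "
           "numbers in column B, descending, keeping each row intact.")

# The non-power user describes the operation in plain English and pastes
# the generated macro into Excel without learning the macro language.
print(ask_model(request))
```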
00:48:41.680 | So I think unlocking power features without power-user training will be one of the low-hanging
00:48:46.400 | fruits. We're going to see some impact there. All right, what do we got next?
00:48:51.360 | Next question is from Colin. I'm fortunate to have a remote job that supports flexibility,
00:48:56.400 | but I often struggle to translate the values I care about-- learning, curiosity, self-improvement,
00:49:02.080 | connection, and adventure-- into concrete goals and actions. I want to be able to sustain these
00:49:07.040 | practices. Too often I find myself stuck in a cycle of pseudo-productivity, going through the
00:49:11.440 | motions without feeling truly fulfilled. I think this is a common problem, especially if you have
00:49:16.800 | the blessing of time, is you get kind of systematic and say, "Okay, well, here's the things that
00:49:22.000 | matter to me," and then you kind of start, "I'm going to do this, I'm going to do that, I'm going
00:49:26.400 | to do this," and it feels sort of soulless. Like I have these like checklist of things I do every
00:49:30.880 | day that's connecting to the things that I value, and I don't know, it just feels like going through
00:49:35.680 | the motions. It doesn't actually feel like it's infusing my life with value. That's a really
00:49:38.880 | common problem, actually like acting on your values in a way that's really meaningful.
00:49:43.440 | I have four things I want to mention that could be helpful here.
00:49:46.400 | One, you know, once you've identified what's important to you, you have your buckets,
00:49:51.120 | have some sort of keystone habit in each, sure, as a starting point. Something you do for each
00:49:56.560 | of these values or things you care about on a regular basis. It's not trivial, but it's tractable,
00:50:02.400 | so you're just signaling to yourself, "I care about these things, sure," but then choose one
00:50:07.360 | and say, "This is the thing I'm going to really work on for the next six months.
00:50:11.200 | This is the thing for the next season I'm going to try to figure out through experimentation
00:50:17.760 | and focus how to integrate this into my life in the coolest possible way," because it actually
00:50:22.240 | can be hard. You could say, "I like adventure," great. Building an actual rhythm of adventure
00:50:27.760 | that's meaningful to you in your life, that might take a lot of experimentation. It's not an obvious
00:50:31.600 | thing to do. Maybe you spend a full summer really focusing on that. "Well, what if I go on weekend
00:50:38.080 | trips and that's not enough? Maybe what I want to do is once a quarter, let me try one of these
00:50:42.080 | quarterly trips. What does that feel like? Maybe I want to challenge myself every week to go to a
00:50:48.240 | place that I haven't been before. Maybe I want to get a group of friends. We do this together."
00:50:51.680 | You figure out what's really pressing my buttons on this value and how do I best integrate that
00:50:57.840 | into a part of my life? That takes time and experimentation, so just focus on one until
00:51:01.120 | you feel good about it. Then you can move on to another. It can take years to kind of button down
00:51:07.120 | a full lifestyle setup. Then at the end of that, you kind of say, "Okay, now I had kids. Whoops,
00:51:11.200 | we got to change all these again." What adventure means is very different now than it did before,
00:51:15.200 | and that's okay. So spend more time and go one by one in figuring these things out. There's
00:51:20.960 | a patience thing. The second solution, go back to lifestyle-centric planning to better understand
00:51:27.680 | what it means for these values to be a part of your life. In particular, the part of
00:51:31.120 | lifestyle-centric planning that's key when you're thinking about these type of values like curiosity
00:51:35.120 | or adventure, for example, is to find examples that resonate. Like, "You know what? What I'm
00:51:41.840 | looking for is someone who is doing something in their life that really—that specific thing they're
00:51:49.200 | doing really resonates with me." So get more concrete. Move from the abstract to the more
00:51:54.240 | concrete. Like, "Oh, I really love the way this guy works." Maybe you're—really, when it comes
00:52:02.480 | to adventure, you try to get concrete. The thing that resonates with you is this movie, which I
00:52:07.840 | watched a bunch as a kid. I wonder if you would know this one, Jesse. K2. Right? Oh, yeah. With
00:52:13.680 | Spacey? Is it Spacey? Yeah, right? Oh, it might be. Yeah. They go to climb K2. It doesn't go well.
00:52:22.240 | Oh, no. That was it, everyone. It's a mountain climbing movie. Wait,
00:52:26.560 | I'm looking this up because -- He was in like K-something else.
00:52:30.400 | Yeah. Oh, oh, K-PAX. Exactly. Exactly what you're talking about. No, I can think of -- I'm
00:52:37.120 | looking this up here. 1991 film. Man, I used to like this film. Oh, it's Michael Biehn. Yeah,
00:52:45.360 | Michael Biehn and Matt Craven. Spoiler alert. I don't think it goes well for Matt Craven. Anyway,
00:52:50.880 | so it was this mountain climbing movie. K2, you know, is the second highest mountain in the world
00:52:54.960 | behind Everest, and it's like the most dangerous mountain. People die. I mean, people die all the
00:52:58.560 | time. I mean, okay, not to go down the rabbit hole, but the reputation of K2 at the time was like, "This is
00:53:05.920 | the real killer." Like, Everest, you can have these companies that like take you up to the top.
00:53:11.440 | If you pay them $60,000, you don't have to be a world-class athlete. K2 is really, really hard.
00:53:15.760 | It's the second highest mountain. It was really, really hard, and it had the highest death toll.
00:53:19.520 | It was like one out of five people die or whatever. But then after this movie came out,
00:53:23.440 | you know, you get the disaster on Everest that Krakauer wrote about in Into Thin Air, where all these
00:53:28.160 | people died, and then a lot of people died on Everest after that. So no one thinks about
00:53:31.920 | Everest as being easy. I mean, in theory, you can do it without being an elite athlete. You can pay
00:53:36.000 | to do it, but now its death rate's also pretty high. Anyways, the reason why I think about this
00:53:41.280 | movie, because I saw it all the time, is that Michael Biehn was a corporate executive in this
00:53:48.320 | movie. And they were always showing he was in his skyscraper office, and he had a NordicTrack
00:53:53.520 | machine in there, because he was training for this. So he was a world-class mountaineer and
00:53:58.400 | had this job, right? So maybe that really resonates with you. Like, yeah, that's what I
00:54:02.400 | want to be, like adventure. My job just fulfills other things for me, and it's specific and
00:54:09.120 | corporate or whatever. But I want to be the guy who also has the NordicTrack machine in my office,
00:54:13.440 | because I'm training to go do these extreme things, and there's these two sides of me.
00:54:17.840 | So maybe that's what resonates with you. Then that gives you a concrete way of thinking about
00:54:22.960 | integrating adventure into your life. So you look for what resonates, because sometimes
00:54:25.920 | with the abstract principle, you don't know what about it appeals to you, or what way of integrating
00:54:33.360 | it into a life really is interesting to you. The concrete examples get you there. So you want
00:54:38.480 | to use a single-purpose notebook for this, like a Field Notes or Moleskine notebook, where you're
00:54:42.080 | taking notes on these things as you watch things, as you read things, as you meet people. Take
00:54:47.760 | notes on what's resonating, and that's going to give you some better ideas of how to implement
00:54:51.600 | this. Solution three, you might want to simplify. Maybe you want to simplify down the things you're
00:54:57.760 | focusing on so that you don't have too many specific things that you're trying to make
00:55:01.200 | progress on. So it might simplify down to heart, body, mind. Heart is like community and
00:55:12.880 | connection. Body is like, I want to be in good shape and healthy, and it fuels all these other
00:55:18.800 | things, and go do things that use my body. And mind is like, I want to enjoy the world of ideas
00:55:24.640 | and interestingness and just do cool stuff with my mind. Simplify it. And then under those things,
00:55:29.600 | there's lots of different things you could do, and maybe you do different things at different times.
00:55:33.040 | I'm going to start with this season: like, this winter, I want to read these five great books
00:55:38.080 | and have a discussion group about them. And maybe in the summer, I'm doing something else with my
00:55:41.520 | curiosity. And with my body right now, maybe it's just, in my case, getting my back working
00:55:46.160 | again. But then the next season, I might be working on returning to my Alexander Skarsgård workout and
00:55:52.960 | getting giant traps or whatever, and it might get more extreme. So if you simplify it, smaller
00:55:58.400 | categories that have many more possibilities under them, now you have less going on at any one time.
00:56:03.280 | My final thing I would mention here, make sure you're not missing a foundation of what David
00:56:09.200 | Brooks would call second mountain virtues. Like in your list, outside of connection,
00:56:14.080 | we have learning, curiosity, self-improvement, and adventure. None of these are second mountain
00:56:18.400 | virtues, service virtues: you serving other people in the world. This is like a foundation
00:56:24.720 | of meaning, especially as you get past a certain age. So often this will happen as people leave
00:56:28.240 | their 20s and move through their 30s, is that they'll find that just the sort of self-focused
00:56:32.320 | things, the things we see in the beginning of those morning ritual videos, I'm up and I have
00:56:38.880 | 17 steps I do just to like perfect my, you know, every aspect of my being. They don't fulfill,
00:56:44.960 | it's not as exciting anymore. You feel this bit of a lack. You're like, I'm trying to kill it at
00:56:48.880 | my job and be in really good shape and go on all these adventures and catalog them or whatever.
00:56:54.400 | And it's feeling a little bit empty after a while. The second mountain virtues, which are character
00:56:58.960 | and service based, that's when these kick in and really give you a strong foundation and really
00:57:03.840 | probably a life where a lot of your discretionary time is on second mountain virtues. And then on
00:57:08.320 | top of that, you're able to do these sort of, you know, I'm training to mountain climb as like my
00:57:13.600 | other thing I do, or I'm really in the movies and me and my friends are like really in the movies.
00:57:17.760 | That becomes that balance of second mountain virtues versus like other types of self-focused
00:57:23.520 | virtues. That ratio needs to shift as you get a little bit older. So it might just be that,
00:57:27.360 | like, it's more about your heart and soul, and you have to get that cleaned up. And then the other
00:57:33.200 | things maybe will be less important, or you'll enjoy them more, or you don't have to do as many
00:57:37.040 | of them to get the same fulfillment. So, long answer, because, you know, I'm thinking about a
00:57:42.160 | lot of this for my deep life book, which, actually, I'm on like a six-week pause from writing
00:57:47.440 | because I'm doing this New Yorker thing. Yeah, you mentioned that.
00:57:50.320 | So I am excited to get back to it. You know, I took over Kyle Chayka's column for one month,
00:57:55.680 | and I'm writing the third or fourth of the four articles right now. I'm kind of looking forward,
00:58:01.440 | on the other end of that, to just easing my way back into the deep life book, kind of building up
00:58:08.080 | the speed. It has been fun writing columns, but man, that's a fast pace. All right, who do we got
00:58:13.120 | next? Next question is from Holden. Speaking of writing, how would you recommend somebody go about
00:58:18.320 | deliberate and consistent improvement in writing? You know, my thing for writing: it's like
00:58:23.760 | any other sort of skilled activity, you have to train, right? Doing it a lot will get you
00:58:30.720 | part of the way. So if I'm writing a bunch, I want to do my pages every day, that will help.
00:58:37.680 | You will become a better writer than if you don't do that at all. You get more used to it,
00:58:40.320 | it's less hard. You build some circuits in your brain, words come more easily,
00:58:43.840 | then you're going to hit a wall. And if you want to get better, you have to have
00:58:46.880 | activities designed to stretch you in specific ways past where your current capabilities are.
00:58:51.520 | Writing for an editor is really the best way to do that. I'm trying to make this good enough that
00:58:55.840 | this person likes it. So like someone's going to evaluate it and you're going to get that feedback.
00:58:59.520 | Like it stretches you and the feedback helps you get better in particular ways.
00:59:03.920 | Taking on specific writing challenges also helps. I want to work on this technique here.
00:59:08.240 | I'm going to read people who are good at that technique, try to understand it and then use
00:59:11.760 | that knowledge in this thing I'm writing now. So it's writing where you are specifically
00:59:15.440 | stretching a particular piece of the writing talent. It's the stretch and the specificity
00:59:20.240 | that's going to make you better. So you've got to think about it as something that you're going to
00:59:24.480 | train. I mean, it's why, for example, at the very upper levels of fiction writing, like elite
00:59:29.680 | literary fiction writers, so many of them go to MFA programs. They often need that final
00:59:34.800 | bit of really learning and being pushed with other writers, reading their stuff and they're
00:59:38.880 | reading your stuff. You just need that final push on where you're still a little rusty.
00:59:42.560 | Seeing someone who's better at something than you are and like reading their thing and then trying
00:59:47.760 | to be better in yours the next time. They need that final training push if you want to be like
00:59:51.520 | an elite level fiction writer. But that persists at every level on the writing ladder. You got to
00:59:58.240 | train to get to the next level. So that's the way I usually think about that. All right, what's next?
01:00:03.440 | We have our corner.
01:00:04.720 | Slow Productivity Corner. This is the segment where each week we have a question about my latest book,
01:00:08.240 | Slow Productivity, The Lost Art of Accomplishment Without Burnout. The main reason we do this
01:00:12.640 | segment is so that we can play this segment theme song, which we're going to hear right now.
01:00:17.520 | All right, Jesse, what's our Slow Productivity Corner question of the week?
01:00:28.800 | It's from JJ. Many individuals who've reached the absolute pinnacle of their fields from athletes
01:00:34.080 | like Michael Jordan to entrepreneurs like Elon Musk seem to follow a different pattern of obsessive,
01:00:38.960 | all-consuming work without clear boundaries. While your approach clearly leads to meaningful
01:00:43.520 | achievements and a fulfilling life, I wonder if someone can truly reach the uppermost
01:00:47.600 | echelons of their field while maintaining the balanced approach you discuss in slow productivity.
01:00:52.080 | Well, it's a good question. I don't know necessarily that you would use balance as
01:00:57.520 | one of the key adjectives for slow productivity. I would say it's focused. I would say it's
01:01:02.880 | sustainable. I would say it's kind of the opposite of pseudo productivity, which is performative.
01:01:09.280 | Activity for the sake of activity. That's what it rejects.
01:01:11.760 | But I was thinking about people in the top 1% of their field. And in a lot of fields,
01:01:16.720 | elite-level performers, if you look at how they approach their work,
01:01:21.040 | it echoes a lot of the ideas from slow productivity. So let's consider, for example,
01:01:26.960 | elite writers. I used elite writers as examples frequently in the book Slow Productivity because
01:01:33.680 | to be an elite writer, you almost always have to take a slow productivity approach. You're not
01:01:37.600 | working on many things. You're basically just all-in on the book you're writing. You kind of
01:01:41.120 | simplify your life in that way. You're of course obsessing over quality if you're an elite writer.
01:01:46.080 | I want this thing to win whatever literary prizes I'm hoping for, or become a New York
01:01:50.880 | Times notable book. You really care about quality more than anything else.
01:01:53.920 | And yeah, it's seasonal. I'm really working on a book hard. Now I'm completely doing nothing.
01:02:01.200 | Now I'm brainstorming the next book. Now I'm editing a book. There's real variations. And
01:02:05.280 | because they have such autonomy, there can be seasonality in their day. Often these writers
01:02:09.920 | have specific hours they write in, and then they're done. They're not writing all day long,
01:02:13.360 | and there could be seasonality in their month or week. I was thinking, if I was just a full-time
01:02:19.680 | writer, just a book writer, I bet this back thing I'm dealing with would be so much easier to deal
01:02:25.040 | with because all I'm doing is writing. I could just take the foot off the gas pedal while I work
01:02:30.800 | on this rehab while it's bothering me, and then just put it back on again. Because full-time
01:02:36.640 | writers, and you sometimes have to be pretty much elite to do it full-time, have that type of
01:02:42.400 | ability to be more seasonal. Elite athletes, I actually think of as practitioners of slow
01:02:46.400 | productivity as well. I mean, they do one thing: their sport. They're not working on 30 things.
01:02:52.160 | They're not answering emails and jumping off and on calls. They're training for their sport. Of
01:02:57.600 | course they obsess over quality. That's what makes them elite athletes. And their work is literally
01:03:02.000 | seasonal. Here's the sport season. Here's the off-season. We treat these things very differently,
01:03:07.040 | so they have different rhythms of their week. Elite academics, often they become elite because,
01:03:13.200 | again, they are slow productivity practitioners as well. Hey, I'm focusing on just this result.
01:03:19.040 | Academia is very seasonal, teaching, non-teaching, but also
01:03:25.520 | working on a result and being done with a result. It might take you a couple of years before you get
01:03:28.640 | going again on another big project. And they obsess over quality. Now, of course, a lot of
01:03:32.560 | academic positions have the slow productivity subverted by the injection of the administrative.
01:03:38.320 | But in this context, we note that's a problem. They say, I'm worse at being a professor now
01:03:43.120 | because I have to do all this administrative work. So what made them elite was not the
01:03:48.400 | busyness you see of a later stage career professor holding all these administrative positions. What
01:03:53.120 | made them elite is when they were more true to the slow productivity principles. So I see it.
01:03:58.240 | I mean, what's missing here, you mentioned balance. I think what you mean by balance is like the total
01:04:02.080 | number of hours you're working in a day is not too bad. Athletes, I guess, violate that because
01:04:08.080 | it takes a lot of hours when you're in season. Yeah. Writers don't work a huge number
01:04:12.640 | of hours. So I think that's still OK. Elite academics, they can work a lot of hours. Yeah,
01:04:17.040 | I think, especially if it's a lab-based academia. So fair enough on that point.
01:04:21.120 | There are, of course, you mentioned Elon Musk, that points towards the idea that there are
01:04:25.440 | careers in which elite level is not really compatible with slow productivity principles.
01:04:29.440 | Entrepreneurship is probably the classic example of that, like starting up a company.
01:04:33.600 | You just, you aren't doing one thing. You're doing lots of things. It's not seasonal. It's
01:04:37.520 | all out all the time. And it's not really obsessing over quality because you don't
01:04:41.600 | have time or energy to do that. It's just like putting out fires and trying to keep things
01:04:45.040 | rolling forward. So yeah, starting big companies is usually not compatible with slow productivity.
01:04:51.360 | Elite leaders of complicated teams, those type of positions often aren't compatible. I'm thinking
01:04:56.960 | about like Navy commanders. You're the CO on a big ship, like a destroyer or something, like that's
01:05:01.600 | not compatible with slow productivity. It's doing many, many different things. It's all out all day.
01:05:05.760 | It's getting things done right, but not trying to like push to quality. You don't have the time,
01:05:11.360 | energy, or luxury of like, I'm just going to obsess over quality in one thing. It's like
01:05:14.320 | trying to prevent bad things from happening. So yeah, no, not every job has the elite levels
01:05:19.360 | be compatible with slow productivity, but a lot do. And you'll see that if you read the book,
01:05:23.520 | because I draw from stories of knowledge workers who've done elite work in times past to draw out
01:05:28.720 | these principles. So it should be no surprise. All right. I think we decided, right, Jesse,
01:05:33.440 | we're now doing the music on the way out as well. >>Jesse: We sure did.
01:05:36.640 | >>Cal: All right. Let's hear that. This is my competitor product for dundaily.com.
01:05:50.800 | All it is, is every five minutes, it just plays that music. It's like an app, just plays that
01:05:57.440 | music every five minutes. Just relaxing people into good work. All right. Do we have a call this week?
01:06:01.520 | >>Jesse: We do. >>Cal: All right. Let's hear it.
01:06:03.200 | >>Caller: Hi, Cal. This is Chris. I'm a data architect in Minnesota. And I have a question about how
01:06:10.880 | do you manage different projects in the columns, specifically active and waiting on someone else?
01:06:18.880 | I created a personal project in Asana in which I have one task for each of my projects.
01:06:29.360 | And I have the active column, and I'm trying to keep that to no more than three
01:06:35.680 | open projects at a time. But I also have projects that I'm waiting on other people
01:06:42.320 | to get back to me for. And so I'm curious if you have any advice or rules of thumb around
01:06:49.760 | what to do if that list of projects that I'm waiting on people starts to stack up and then,
01:06:57.440 | say, four people get back to me at the same time.
01:06:59.600 | Wondering if you have any advice on that. Really appreciate the show. Thanks.
01:07:03.920 | >>Cal: Yeah. It's a good question about
01:07:07.600 | these task storage boards. Two things I want to say. First of all, I don't have everything I
01:07:14.000 | need to do on my task storage board. These tend to be sort of like tasks that need to get done.
01:07:18.960 | But what's not typically included on those boards for me is like ongoing work. You know,
01:07:25.120 | if I'm working on a book chapter, for example, that's a major thing I'm working on. There's
01:07:29.280 | not a task on my task board that says, like, work on book chapter. That's something that's going to
01:07:33.200 | come up in my weekly planning. I'm like, what am I doing right now? Oh, I'm working on my book as
01:07:39.040 | one of my big goals for this quarter. So what do I want to get done this week? Well, let's see,
01:07:42.960 | if I could finish a draft of chapter three this week, that would actually be good. Great. Let me
01:07:47.200 | put that on my weekly plan. Like today's big, you know, this week's focus is working on chapter
01:07:51.200 | three. And in fact, maybe I want to actually block out a few big writing blocks to make
01:07:55.520 | sure I have time on my calendar for working on chapter three, right? Nothing here ever touched
01:07:59.920 | a task on a Trello board. But the Trello board stuff, they're often like one-
01:08:04.960 | off tasks or individual tasks, like stuff I need to do or people to get back to that I don't want to
01:08:08.480 | forget about. All right, so keep that in mind first. Second, okay, so what happens if you have a
01:08:14.800 | lot of stuff waiting to hear back from? Well, you put these items on the waiting to get back column,
01:08:19.520 | so you don't forget about them. You're telling your mind, yeah, I sent out this request. I don't
01:08:23.600 | want to forget that that's out there. Like that person may never get back to me again. That's
01:08:29.200 | going to be an open loop that's going to generate stress in my brain. So I want to make sure I
01:08:32.720 | remember like, yeah, I asked Jesse about this, that I'm waiting to hear back. And a good waiting
01:08:38.320 | to get back card on a Trello board will say what you're going to do when that comes back. When that
01:08:42.720 | gets back, make a decision and tell Jeremy. So here's what I'm waiting to hear back on. And
01:08:48.720 | here's what I'm going to do when I get it. Okay. You don't have to execute that right away.
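To make that card format concrete, here is a minimal sketch of a waiting-to-hear-back card as a small data structure. The field names are invented for illustration; in practice this is just the text you would write on the Trello or Asana card itself.

```python
from dataclasses import dataclass

@dataclass
class WaitingCard:
    waiting_on: str    # who owes you a response
    about: str         # what you asked for
    next_action: str   # what you'll do once it comes back

card = WaitingCard(
    waiting_on="Jesse",
    about="the question I sent him",
    next_action="make a decision and tell Jeremy",
)

# When the reply arrives, the card leaves the waiting column -- to active,
# to a back burner list, or straight to done. Nothing forces immediate action.
print(card.next_action)
```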
01:08:53.120 | So, you know, if someone gets back to you, you can take that off the waiting to get back to
01:08:59.360 | you list now. Hey, that's back in my world. But then what you do with that's up to you. It's kind
01:09:03.360 | of like a new task has entered your life. You could just do it right then.
01:09:06.960 | You could put it on your active list as something like, I want to try to get to this as soon as
01:09:09.840 | possible, or it could go on a back burner list. All right, ball's back in my court. I'm not going
01:09:15.760 | to act on this right now, but, okay, the status has changed. I've
01:09:21.280 | heard back. I have this information. Now I have a new thing to do. I'm going to put that back,
01:09:24.400 | you know, under whatever column is appropriate. So you don't have to do those things right away.
01:09:29.520 | The goal of that list is, you know, not to forget things that are outstanding,
01:09:36.160 | but you don't have to execute those things right away. All right. So hopefully that helps. And
01:09:40.880 | also, you know, I'm pretty loose about these things. Like often the things I have on my active
01:09:44.880 | list, it's non-major things, but from my list of things I kind of want to make
01:09:49.680 | progress on. And as I go through my daily plan, I have put aside admin blocks, and
01:09:56.800 | I'll go look at those and see how many of those I can churn through. But you know, hey, sometimes
01:09:59.920 | things take longer or you lose some admin blocks, you don't get them done. And like, that's fine.
01:10:04.560 | I find that kind of loose. Like the critical stuff is going to end up being a part of my weekly plan
01:10:08.720 | and probably make its way onto my calendar. So hopefully that makes sense about waiting:
01:10:12.800 | just because you've been waiting for something for a while doesn't mean you have to
01:10:15.440 | act on it right away when it comes back to you. All right. We've got a case study this week where
01:10:21.680 | people send in a description of using that type of advice we talked about here on the show in their
01:10:25.360 | lives. If you have a case study, you can send it to jesse@calnewport.com. This case study
01:10:30.880 | comes from Amy, who we talked to in episode 323. She was also one of the listeners who pointed us
01:10:38.240 | towards the Derek Thompson article that we talked about earlier in the deep dive. So thank you, Amy.
01:10:42.640 | All right. So I don't remember episode 323, Jesse, in detail, but I guess it was about,
01:10:48.240 | she was going back to grad school. It had been a few years since she'd been in school.
01:10:52.640 | She's in her early thirties.
01:10:53.760 | And we were giving some advice about how to tackle school. And I think one of the points we made is,
01:10:59.120 | hey, don't be too stressed about this. You probably are going to find coming back to
01:11:03.840 | school in your thirties, it's not going to seem as hard of a job as it was when you were
01:11:07.600 | 20 whatever. All right. So here's her follow-up case study. I got all A's in my first semester
01:11:14.640 | of graduate school. Going to school and doing well is much easier at 34 than it was at 18.
01:11:21.360 | And it wasn't like I wasn't interested in my college education. I went to Berklee College
01:11:25.760 | of Music because I was, and still am, obsessed with music. But after having some more life
01:11:31.280 | experience, my grad school program, though challenging and demanding, feels much easier
01:11:35.680 | than undergrad. My unsolicited advice for anyone considering college or grad school,
01:11:40.160 | take a gap year. If you're 18 and planning to go to college, seriously consider deferring
01:11:44.480 | your acceptance for a year. This is a common practice in other countries for various reasons,
01:11:48.800 | but Americans would do well to adopt it too. I appreciate that, Amy. It is a true point.
01:11:55.600 | Older people find school easier because school is not that hard once you're used to doing hard
01:12:00.880 | things. An 18-year-old's not really used to doing hard things. But a 34-year-old is. And if it's
01:12:05.760 | their full-time job, they say it's not too hard to study. Like, studying is not fun. But honestly,
01:12:09.840 | this is going to take me like five hours this week to be prepared for this exam. Five hours is not
01:12:14.480 | that much time. I used to spend five hours just on my inbox on Monday morning alone. This is no big
01:12:18.560 | deal. I notice this again and again when I would advise non-traditional college students. So at
01:12:23.920 | Georgetown, I would help advise or give talks to the advising program that would work with
01:12:30.240 | non-traditional college students. So people coming back later in life, but also we did some work with
01:12:34.880 | the veteran program. So people coming back on the GI Bill, and they would just crush it, right?
01:12:39.600 | They'd seen real hardship. If you are new to school, the gap year is a good idea. Another idea,
01:12:44.640 | just read my book, How to Become a Straight-A Student. Read it and do it. Your friends are
01:12:50.080 | idiots when it comes to studying. Do not look at how they study. Take no advice from them.
01:12:55.760 | They are really bad at it. Do the stuff in that book. You'll get very good grades. That's just
01:12:59.840 | it. That book is like, here's how the people who get after it, this is how they actually study.
01:13:03.520 | This is the stuff that works. This is what you really need to do. Do that stuff. It tells you
01:13:08.400 | how to be organized, how to take notes, the right way to study for math, the right way to write
01:13:11.520 | papers. Just do it that way, and you're going to get really good grades, and it's going to be a
01:13:15.040 | lot easier than what your friends are doing. So yeah, if you treat being a student like a job,
01:13:20.400 | it's like an easy job. If on the flip side, you do what many students do, is you treat being a
01:13:26.160 | student like a vacation, then you're like, this is a really crappy vacation because I keep having
01:13:30.400 | to go to the library. And you see everything you have to do is somehow be negative because it's
01:13:34.640 | getting in the way of you having fun. But if you see it as a job, you're like, this is the easiest
01:13:37.280 | job I ever had. It's like a halftime job, and I'm doing great and getting a lot of praise for it.
01:13:40.800 | So anyways, Amy, thanks for helping to emphasize that point. We have a cool final segment coming
01:13:46.480 | up. I'll react to one of my own articles. But first, let's briefly hear from another sponsor.
01:13:52.640 | I want to talk about the Defender, the class of vehicles that we have been promoting here on the
01:13:58.480 | show, because it seems to kind of fit with our theme, right? I mean, it is a vehicle that is
01:14:05.280 | well-suited for those who are seeking something deeper in life. But it's also pretty cool.
01:14:10.480 | It's rugged and comfortable, which is, by the way, how people like to describe me.
01:14:16.800 | Rugged, rugged, and comfortable. I've always liked these cars. There's actually the current
01:14:22.480 | Defender line of cars. They have the 90, the 110, and the 130 model. The 130 model can now hold up
01:14:28.800 | to eight seats. This is a car that has a very durable, rugged design that you can take
01:14:36.320 | adventurous places, but it's very comfortable inside. It's got all of the latest technologies
01:14:40.720 | to make driving not just comfortable, but easy. I particularly like they have the under-the-car
01:14:47.600 | camera system. So if you're driving in some sort of situation off-road, you can see what's under
01:14:53.920 | the car. Like, where is that big rock? Because I want to make sure that I'm going around it with
01:14:57.520 | my tire. And you see it like you're seeing through your car on the screen, which is really cool.
01:15:01.920 | Or of course, the way I would use that feature, which is, okay, what kid toy am I currently
01:15:06.800 | running over right now in our driveway? And how valuable is it? Do I have to bother going to get
01:15:11.200 | it or can I just continue to drive over it? It would help me there. It's a cool car. Rugged,
01:15:15.600 | but also comfortable. Adventurous, but also relaxing. So you can design your Defender at
01:15:22.080 | LandRoverUSA.com. Build your Defender at LandRoverUSA.com. Also want to talk about our
01:15:29.920 | friends at Shopify. Everyone we know, and okay, I'm not fact checking that statement. So let's say
01:15:35.920 | so many people Jesse and I know who are in this business who sell things, they just use Shopify.
01:15:40.400 | That's just what you do. Like if you're going to sell something online or in a store,
01:15:45.040 | you use Shopify because they have selling things nailed down. It just is going to make it
01:15:52.080 | professional and easy and effective. Nobody does selling better than Shopify. They have the number
01:15:58.400 | one checkout on the planet. And their not-so-secret secret, Shop Pay, which boosts conversions
01:16:04.160 | up to 50%. People who are thinking about buying a thing are going to buy it. Shop Pay pushes
01:16:10.080 | them forward. That means fewer carts go abandoned. So if you're into growing your business,
01:16:14.560 | your commerce platform better be ready to sell wherever your customers are scrolling or strolling
01:16:18.960 | on the web, in your store, in their feed and everywhere in between. Businesses that sell more
01:16:23.360 | sell on Shopify. So upgrade your business and get the same checkout that basically everyone we know
01:16:30.400 | who's selling things online uses. Sign up for your $1 per month trial at shopify.com/deep.
01:16:38.800 | Just be sure to type that in all lowercase to get the discount. Go to shopify.com/deep to upgrade
01:16:44.320 | your selling today. That's shopify.com/deep. All right, Jesse, let's move on to our final segment.
01:16:50.880 | We like to do one of two segments in the end, either a tech corner where I talk about technology
01:16:58.480 | or a CalReacts where I react to something on the internet. Today, we're doing both again,
01:17:01.920 | because I am reacting to my own latest article for the New Yorker, and it is an article about
01:17:07.280 | technology. I'm going to pull this up on the screen here for people who are watching instead
01:17:11.120 | of just listening to show you what I think is probably the most disturbing graphic that has
01:17:15.360 | ever accompanied something I have written, Jesse. I would describe it as a phone melting somebody's
01:17:21.520 | face. So it's pretty intense. A cool graphic, actually. It's like a... Do you ever know the
01:17:27.760 | graphics they're going to use? No, they do it kind of last minute. Yeah. Yeah. There have been occasions
01:17:33.440 | where I wish I had a version of it. So it's not like the book cover input that you have? Yeah,
01:17:38.720 | I don't know how it works. Sometimes they're drawing it from scratch, and sometimes they're
01:17:41.920 | like they have it already. I don't quite know how it works. But I guess this is probably not
01:17:47.680 | an artwork I want blown up large in my house because it would give me nightmares. It's a
01:17:51.120 | cool picture though. All right, here's the article I wrote. It's a column. This is me again. I took
01:17:55.280 | over Kyle Chayka's Infinite Scroll column for a month. This column is titled "What Happened When
01:17:59.760 | an Extremely Offline Person Tried TikTok." So the premise was, hey, I'm recovering from this injury.
01:18:06.640 | I'm kind of laid up a little bit. Maybe it would be fun to try TikTok. The formal journalistic
01:18:13.520 | experiment I was doing here was to see how the experience of social media and our relationship
01:18:19.200 | with social media has changed since I was last really actively writing about how people
01:18:24.880 | use social media, whether they should use social media, which was really about a decade ago.
01:18:28.240 | I'm just going to point out a couple points. So perhaps one of the most striking things I found
01:18:35.520 | is that when I was writing about quitting social media, this was like 2013 and 2016,
01:18:42.000 | that's when I became known for that. I went back and read those articles again for this.
01:18:45.520 | There were really big debates happening. Supporters of social media had very strong
01:18:52.240 | reasons why it was important. I was debating against those reasons. So my articles were like
01:18:56.800 | very carefully walking through these arguments and saying these arguments are not as strong as you
01:19:00.560 | think, and people would get upset about those stances. It was really a pretty robust debate.
01:19:05.520 | I've talked about this before on the show, but like I would write a Times op-ed, and then the
01:19:09.520 | Times would publish a response op-ed, or I would go on the radio to talk about that article, and
01:19:13.440 | then they would bring on someone to push back on me on the radio show to say, you know, "Cal is
01:19:17.440 | wrong." Like it was a pretty contentious debate that was unfolding at that time. Most of those
01:19:23.920 | arguments I used to debate against, none of them apply anymore to social media.
01:19:29.440 | We use the same phrase, but when I was on TikTok or trying YouTube shorts or Instagram reels,
01:19:36.640 | the arguments that people used to make in favor of social media just don't apply anymore. They
01:19:41.200 | said, "This is how you keep up with your friends and your social life." No one keeps up with their
01:19:44.880 | friends or social life on TikTok. They said, "This is going to open up career opportunities." This
01:19:49.840 | was a big one. People were like, "You're crazy. You're going to disappear and have no job if
01:19:53.440 | you're not using these platforms." No one's saying that about TikTok. No one's saying like, "Yeah,
01:19:57.760 | I got my job because my boss at the insurance company thought my TikToks were fire." That just
01:20:04.000 | doesn't happen. The other major argument from 10 years ago was, "This is the online town square.
01:20:08.160 | This is where culture is being formed." This was like the Twitter argument back then. The most
01:20:12.720 | important articles were moving around Facebook. With TikTok, Instagram Reels, YouTube Shorts, you're
01:20:16.880 | getting these incredibly individualized, atomized feeds. Your feed looks nothing like the person
01:20:21.440 | next to you. It's not creating collective culture. It's creating isolated, customized distraction.
01:20:26.400 | I was really struck. I was like, "Man, all this fighting I used to do,
01:20:29.760 | none of it's relevant anymore." These big arguments for why social media is important don't apply to
01:20:36.560 | the latest, most popular generation of social media. I went and I talked to some young people
01:20:41.680 | who do use these services. I had them show me TikTok. I was like, "Well, why do you use it?
01:20:45.600 | Here's the thing." They don't have a great answer. None of these young people were giving
01:20:51.280 | full-throated defenses of TikTok in the way that I used to get full-throated defenses of Facebook,
01:20:55.200 | Twitter, and Instagram back in the day. They're like, "Yeah, it's pretty stupid,
01:20:59.040 | but it's diverting." The one guy I talked to, Zach, would talk about it. He's like, "There's
01:21:04.560 | these memes, these video memes, and it's funny." He showed them to me, and they're funny and
01:21:07.920 | interesting, and they remind me of some of the absurdist type of humor that was popular on the
01:21:11.840 | early web when I was in college in the early 2000s, and I get it. But that's not a profound argument.
01:21:17.120 | It was like, "Yeah, this is funny. I like him." He would actually use the funny TikToks he found
01:21:23.520 | as just a social lubricant. You could send these to friends via text message, and it gave you
01:21:28.160 | something, an excuse to ping your friend or to talk to someone. The young woman I talked to was
01:21:33.360 | like, "I don't know. It feels kind of authentic. It creates emotion." She sent me some TikToks. It
01:21:38.480 | was like a recipe thing that was visually appealing, and a video of vets returning home
01:21:44.160 | early to surprise their kids, and that was touching. None of them have a grandiose theory
01:21:49.840 | like you used to get from communication professors back in 2013 about why this was at the key of
01:21:57.360 | culture, or this was at the key of your success, or it's at the key of the evolving civic life.
01:22:03.280 | People are just like, "I don't know. It's diverting," and I could use a little diversion in my life.
01:22:07.520 | There's a lot of fears around this because it's very diverting, and we see young people,
01:22:13.440 | they have a hard time turning their eyes away from this because once we get rid of all those
01:22:16.800 | other justifications, you can hone in on just being as engaging as possible, and that can be
01:22:21.600 | pretty addictive. But I found an almost hopeful note in this. If we're no longer fighting for
01:22:27.360 | social media, then I think its footprint on our lives is going to get smaller. I think, yes,
01:22:33.680 | its addictive nature maybe is higher, but the addiction is no longer protected. It's no longer
01:22:38.960 | protected in the clothing of virtue. It's just addicting. It's cigarettes in the '80s versus
01:22:46.160 | cigarettes in the '50s. No one wants to be smoking anymore. We all get it. It's still hard to stop,
01:22:51.920 | but everyone kind of agrees like, "Yeah, I probably should do this less."
01:22:54.320 | So we use the term social media today. We used the term social media 10 years ago,
01:22:59.840 | but it's describing something different. In some ways, it's something more insidious, but in other
01:23:05.200 | ways, it's something that feels like it's much more solvable because it feels much less important.
01:23:09.200 | Its grasp is hard, but its grounding is shallow. So I actually came away from this like, "Oh,
01:23:15.920 | not as scary because no one's fighting me on this. We're all on the same side."
01:23:21.200 | And the fact that we have the TikTok ban, at least in some form, seems like it's going through,
01:23:24.880 | that would be positive as well. You see one of these services being banned. That also just
01:23:30.880 | helped change their mindset of like, "Yeah, these things are kind of optional. It was okay. We took
01:23:33.680 | that one away. We all survived." It just kind of emphasizes the optionality, the triviality,
01:23:39.600 | the tangentiality of these services. So it was an interesting experiment, Jesse.
01:23:43.680 | I had no... By the way, I have no interest in... They're on my phone now because I was
01:23:48.000 | doing an experiment. I have no interest in clicking those apps. I don't know if you've
01:23:51.600 | used TikTok before. It's just... I've never used it.
01:23:53.760 | But I get that you would get used to it, the way these young people are used to it.
01:23:58.640 | You think Elon's going to buy it? Maybe. Yeah, maybe. I don't know what's going to happen.
01:24:06.160 | I'm bad at predicting the legislative stuff. We're recording this on Friday, and
01:24:14.000 | the ban could go into effect very quickly. But probably Congress wants to extend it.
01:24:20.160 | I don't know what's going to happen. I think he's going to buy it.
01:24:22.480 | I just wish... It'll be expensive, but...
01:24:24.880 | Yeah, I think he has the money. He'll raise it.
01:24:27.760 | Yeah, maybe a syndicate. Yeah. I just hope it goes away just so we get used to this idea
01:24:34.320 | of like, "Yeah, these things come and go," which I think is the reality of social media today.
01:24:38.800 | These things come and go. The guy I talked to for the article, Zach, shout out to Zach,
01:24:44.320 | he also uses Instagram Reels, which is very similar to TikTok, and he mixes them up. He
01:24:47.680 | doesn't care. He's like, "Oh, here, check out this TikTok," and really, it's an Instagram Reel.
01:24:51.360 | It doesn't matter. There's no social graph. Most people, like the typical TikTok user,
01:24:58.480 | according to Pew Research, doesn't ever even touch their bio field. It doesn't know anything
01:25:02.480 | about you other than the videos you like. It's not like your friends are in there, or
01:25:05.040 | your followers are in there, such that if you leave the platform, it's a problem.
01:25:08.400 | I can jump over to Reels and see videos on there, and I'm getting the same experience.
01:25:13.920 | Like, it doesn't matter. These things have become portable. They're just
01:25:16.720 | becoming increasingly generic sources of short-form distraction, and that feels very
01:25:21.920 | different than like, "Man, I would get yelled at." People thought I was like an eccentric, Luddite,
01:25:27.120 | anti-democratic weirdo for not using one of these three platforms. That is just not the case
01:25:34.400 | anymore. He bought Twitter for $44 billion, right? How much would TikTok be, like $200 billion?
01:25:40.720 | I don't know. That's a good question. So, I mean, Twitter's user base is in the hundreds
01:25:45.360 | of millions, and TikTok's much bigger than that. You would think it would be at least four times.
01:25:49.920 | TikTok's generating a lot more revenue as well. Yeah. I mean, that's not easy money to raise.
01:25:54.880 | He's in trouble right now for some of the details of how he used his own stock to raise it. The SEC's
01:26:02.160 | mad at him for how he raised the money for Twitter. He's taking loans against his own stock.
01:26:06.720 | I just read that SBF book from Lewis about FTX. Yeah. He could have bought it. Not anymore.
01:26:14.160 | Not anymore. That's a lot of abbreviations. A similar thing with all the commingled funds.
01:26:18.640 | Yeah. Maybe we should buy it. Just be all slow productivity corner theme music
01:26:23.760 | and Jesse Skeleton. That would be a good... Just like Jesse Skeleton doing funny things
01:26:31.680 | and slow productivity corner theme music. That would be a successful platform. I'll record that.
01:26:37.040 | All right. Anyways, let's wrap it up for now. We'll be back next week with another episode,
01:26:40.960 | and until then, as always, stay deep.
01:26:43.520 | Hey, if you liked today's discussion, I think you'll also like episode 330 in which we explore
01:26:48.880 | how to tackle social media's hidden dangers. Check it out. I think you'll like it. The
01:26:54.800 | final part of this deep dive, I will then connect what's going on in Australia
01:26:58.800 | with all of our general struggles to control the role of technology for better or for worse in our