
Is Overstimulation Ruining Your Life? - How Your Phone Affects Intelligence, Focus & Productivity


Chapters

0:00 Are We Getting Dumber?
22:19 Is it possible to build a better Twitter?
27:23 Should I ditch my higher paying job to avoid stress?
32:19 Can I remain relevant with younger, tech savvy colleagues?
35:57 How should I navigate my time while receiving severance?
38:41 What should I do next after being fired from my federal job?
42:27 Practicing focus on the weekends
46:06 An engineering Kanban system
57:26 The AI Blind Spot

Whisper Transcript | Transcript Only Page

00:00:00.000 | So several people recently sent me the same article.
00:00:02.600 | It was from the Financial Times.
00:00:04.300 | It was written by John Burn-Murdoch, and it had a provocative headline,
00:00:07.320 | Have Humans Passed Peak Brain Power?
00:00:11.020 | So I'm going to take a closer look at this claim.
00:00:13.960 | I have two goals in mind.
00:00:15.180 | First, I want to develop a better understanding of why the data seems to show that we are getting dumber.
00:00:21.200 | But two, I want to use that understanding.
00:00:23.120 | It's my second goal.
00:00:23.840 | Use that understanding to help find practical ways that you as an individual can push back on that trend.
00:00:30.340 | And not only not get dumber, but make sure that you continue to get smarter.
00:00:34.220 | So this article that I was just citing was inspired by some recent analysis that was released by the Organization for Economic Cooperation and Development.
00:00:42.820 | They do this regular test called the PISA, which benchmarks teenagers around the world and their knowledge of math, reading, and science.
00:00:51.820 | So we have sort of trends over time.
00:00:53.600 | It's a useful test to kind of understand what's going on.
00:00:55.880 | So there's a recent analysis of this.
00:00:59.200 | The article looked at that recent analysis, plus some other tests that have been given worldwide recently.
00:01:03.460 | And the author of the article made the following conclusion, and I'm quoting here,
00:01:08.200 | Across a range of tests, the average person's ability to reason and solve novel problems appears to have peaked in the early 2010s and has been declining ever since.
00:01:20.220 | So I have a graph to show here, Jesse, bring up this graph for those who are watching instead of just listening.
00:01:26.600 | So here's one of the key graphs that indicates this point.
00:01:29.560 | It shows performance on reasoning and problem-solving tests over time.
00:01:33.820 | On the left-hand graph, what you see is a line for science, reading, and math.
00:01:37.780 | And you can attest, Jesse, that right around 2012, all those lines go downward.
00:01:44.400 | That's from the PISA test.
00:01:46.180 | We have a test of adults on the right.
00:01:47.620 | Literacy takes a big spill right around 2012 and goes down dramatically ever since.
00:01:54.840 | All right, so we can bring that graph down.
00:01:55.920 | So that indicates the thesis of the article that, hey, starting in the early 2010s, at least according to the test, we're doing worse.
00:02:03.900 | Worse, us being humans.
00:02:07.060 | So why, why are we doing worse?
00:02:10.140 | Well, the article points to an obvious culprit based just on the forensic evidence of the timing.
00:02:14.160 | These downward shifts all seem to occur right around that 2012 to 2014 period.
00:02:21.500 | That date should sound familiar.
00:02:23.940 | Other worrying trends also became more prominent starting around that time.
00:02:29.020 | For example, teenage mental health deterioration is another one.
00:02:32.620 | What happened around that 2012 period?
00:02:34.940 | Smartphones became ubiquitous.
00:02:37.300 | This is when worldwide ubiquity of smartphones became a reality.
00:02:41.840 | So the article points out, correctly, that we seem to be seeing a negative turn on these tests of various reasoning and intelligence abilities around the time smartphones came along.
00:02:51.620 | And it's been getting worse ever since.
00:02:53.960 | But I don't think it's useful to just leave it there.
00:02:56.640 | So if we just say, yes, smartphones seem to have led to us getting dumber.
00:03:01.780 | It's unclear how we should respond.
00:03:04.740 | We're probably not going to get rid of phones.
00:03:07.500 | Most people need various aspects of the phone and app ecosystem to operate.
00:03:12.180 | So it sort of leaves us without much to do except shrug our shoulders and say, well, I guess phones make us dumber, but what are we going to do?
00:03:17.940 | It's sort of like when cars came along and traffic deaths, you know, got higher.
00:03:21.880 | Here's a source of 20,000 new deaths a year that didn't exist before cars.
00:03:25.640 | But, you know, we kind of need cars.
00:03:27.260 | And it was like, this is just something we're going to have to live with.
00:03:29.040 | It feels that way sometimes when we're dealing with these cognitive impacts of smartphones.
00:03:33.980 | But I think we can do better and I want to do better today.
00:03:38.400 | So I'm going to look closer and I'm going to try to develop a hypothesis that explains at least partially,
00:03:43.860 | specifically why, what mechanisms of smartphones are making us perform worse on these tests, making us dumber.
00:03:48.960 | Because if we know more specifically what about these things is making us dumber,
00:03:53.800 | then maybe we have a chance of reversing that even without having to get rid of our phones.
00:03:58.740 | So to look closer, what I'm going to do is pull up another graph.
00:04:02.160 | Jesse, bring this up on the screen here.
00:04:03.280 | Here's another graph from this article that gets at what specifically is changing in the smartphone era.
00:04:10.540 | So we see here on the left, a graph over time, measuring percentage of respondents that are saying they have difficulty thinking or concentrating.
00:04:19.380 | This comes from another survey called Monitoring the Future, which John Burn-Murdoch sort of pulled up.
00:04:23.640 | What you notice here is it's relatively stable, difficulty thinking or concentrating until that same inflection point of around 2012.
00:04:32.200 | And then it shoots up and we see an aggressive upward trend.
00:04:37.720 | On the right, we have another graph, percentage of people saying they have trouble learning new things.
00:04:41.620 | It's relatively flat starting in 1990.
00:04:43.720 | Again, right around 2012, shoots up.
00:04:46.840 | Same time that difficulty thinking or concentrating shoots up.
00:04:50.080 | So, of course, right at the smartphone inflection point, we see mechanistically that people suddenly reported at much higher rates,
00:04:57.780 | having difficulty thinking or concentrating and having trouble learning new things.
00:05:01.920 | All right, we can bring this down, Jesse.
00:05:04.300 | That is where I think we're seeing the effect of smartphones.
00:05:08.640 | And if we look a little bit closer, why are smartphones now causing us to have difficulty thinking or concentrating or trouble learning new things?
00:05:15.720 | Keep zooming in.
00:05:16.940 | I believe we can identify what I think of as a cognitive death spiral here.
00:05:21.560 | And here's how I think this works.
00:05:23.980 | So, you now have a smartphone.
00:05:25.440 | The phone itself is not the problem, of course.
00:05:27.720 | It's the ecosystem of attention economy that arose around the smartphone.
00:05:31.700 | Pre-smartphone, if you were building a sort of information platform, say Facebook,
00:05:37.100 | you were building a product that was trying to be maximally useful to users.
00:05:41.820 | I want to make Facebook so useful that people will think to log in and want to be a member of it.
00:05:47.860 | So, like, all your friends are on here.
00:05:50.060 | That's a marker of this being useful.
00:05:51.960 | You can find out what your friends are up to.
00:05:54.120 | That is a really useful thing.
00:05:55.880 | Post-smartphone, we had a shift towards an attention paradigm where the idea now is not being useful,
00:06:00.740 | but capturing as much attention as possible.
00:06:03.220 | They realized that you wanted a large user count if you were trying to raise money,
00:06:07.700 | but once you had a company running, you wanted to monetize those users.
00:06:12.340 | And that's a different game.
00:06:13.760 | And this is when the goal of platforms changed to not being as useful as possible,
00:06:17.260 | but being as addictive as possible.
00:06:18.640 | So, ubiquitous smartphone use picks up around this time.
00:06:21.860 | Why does that cause a cognitive death spiral?
00:06:24.060 | Well, think about what happens.
00:06:25.180 | You have this rhythm in your life of constantly being distracted because
00:06:29.060 | the apps on your phone are designed to grab your attention.
00:06:33.260 | It has faster, more desirable stimuli than other things in your life.
00:06:36.600 | So, now you're rewiring these circuits in your brain
00:06:40.460 | so that the reward circuits are very much tuned towards if a phone is nearby,
00:06:44.920 | let's focus on that.
00:06:46.960 | Let's have our dopamine cascade focus on the action of looking at that phone
00:06:52.100 | because that we have learned, these circuits have ingrained,
00:06:54.320 | that's going to give us the quicker hit compared to whatever else we are doing.
00:06:57.340 | And because the phone is ubiquitous, we constantly have those reward circuits firing
00:07:01.940 | because the phone is always there.
00:07:03.280 | The reward is always there.
00:07:05.520 | Look, if you put a donut in front of me, I'm going to build up a reward.
00:07:09.360 | You know, every day at four, you put out donuts at the office,
00:07:11.580 | I will build up a reward circuit where, like, I'm really looking forward to that donut.
00:07:14.760 | If you now follow me everywhere I go with a cart of donuts, there's going to be a problem.
00:07:18.680 | Right?
00:07:19.480 | So, that's what started happening with the phone.
00:07:20.660 | So, now our mind gets rewired to crave this faster pace of stimuli.
00:07:26.340 | That can directly impact our ability to concentrate because that's distracting us.
00:07:33.380 | If we're trying to take a PISA test,
00:07:34.500 | we're going to do worse on it.
00:07:35.420 | It's harder to sort of apply our existing intelligence.
00:07:38.400 | But the reason why I think it creates a cognitive death spiral
00:07:41.120 | is that it also means we spend less time on the type of activities that could make us smarter.
00:07:48.700 | So, two things are going on at the same time: our mind gets rewired for faster stimuli,
00:07:53.460 | we have a harder time applying our existing intelligence,
00:07:56.340 | but we also have a harder time engaging in activities that would make us smarter.
00:08:02.460 | Now, this is also captured in this article.
00:08:05.200 | Jesse, bring up one more chart here.
00:08:06.520 | What we have on the screen here is a chart showing the decline of reading.
00:08:11.900 | There's two plots on here.
00:08:13.340 | So, this is a percentage of U.S. teenagers who read in their leisure time.
00:08:18.540 | One of these plots on here shows who says they hardly ever read,
00:08:23.160 | and the other plot shows who reads almost every day.
00:08:27.140 | So, we see the almost-every-day plot
00:08:29.520 | moving mildly down through the 80s and 90s.
00:08:34.260 | Right around 2012, that goes down real sharply,
00:08:37.540 | and the people reporting that they hardly ever read goes up real sharply.
00:08:41.720 | All right, so we can bring that graph down.
00:08:43.340 | So, what's this saying?
00:08:44.960 | Reading is an example of an activity that makes you smarter.
00:08:48.740 | The brain circuits involved in reading make you smarter.
00:08:53.180 | You can better understand other people.
00:08:55.520 | You can better sustain your attention on abstract targets.
00:08:57.720 | You can better manipulate information and build and construct worlds in your mind.
00:09:00.900 | Reading is calisthenics for your mind.
00:09:03.460 | It is just straight-up exercise for your mind.
00:09:05.200 | It's why it's been at the core of sort of every academic curriculum
00:09:07.960 | since the invention of the codex.
00:09:09.520 | So, it's one among other activities that we do less of
00:09:13.200 | because it requires sustained attention,
00:09:14.640 | and when we rewire our mind for faster stimuli,
00:09:17.120 | we're less likely, as we see in that graph,
00:09:20.080 | to actually spend time doing that.
00:09:22.000 | So, we get this double whammy.
00:09:23.160 | We have a hard time applying whatever intelligence we have,
00:09:25.860 | and we slow down or completely stop the increase of our intelligence
00:09:30.080 | that should be happening over time
00:09:31.280 | as we do activities that would naturally get us there.
00:09:33.440 | The result, we're dumber, and we see it.
00:09:36.600 | Our performance on those tests plummets.
00:09:38.580 | So, we're not getting smarter,
00:09:40.120 | and we're having a harder time applying the intelligence we have.
00:09:43.140 | Okay, so really now what we're talking about,
00:09:46.160 | our issue is not with smartphones so much
00:09:50.380 | as it is with the specific effect
00:09:52.500 | of having our brain rewired for faster stimuli,
00:09:56.460 | and because of this,
00:09:57.920 | spending less time with activities that foster intelligence.
00:10:01.360 | Hey, it's Cal.
00:10:02.840 | I wanted to interrupt briefly to say that if you're enjoying this video,
00:10:06.480 | then you need to check out my new book,
00:10:08.920 | Slow Productivity,
00:10:11.060 | The Lost Art of Accomplishment Without Burnout.
00:10:14.440 | This is like the Bible for most of the ideas we talk about here in these videos.
00:10:20.020 | You can get a free excerpt at calnewport.com slash slow.
00:10:25.600 | I know you're going to like it.
00:10:27.100 | Check it out.
00:10:28.100 | Now, let's get back to the video.
00:10:29.460 | So, if we're looking for a response here,
00:10:32.620 | we can actually come up with actions
00:10:35.920 | that don't involve us having to go back in time.
00:10:38.240 | Now, before I talk about what that could particularly be,
00:10:40.520 | here's the analogy that came to my mind from 60 years ago, right?
00:10:43.840 | We had this issue 60, 70 years ago,
00:10:46.000 | where in the U.S., for example,
00:10:48.220 | the economy shifted from being primarily industrial and agricultural
00:10:51.900 | to having this very strong sort of office-centric knowledge work sector.
00:10:55.100 | And we noticed in the 1950s,
00:10:57.140 | and in particular the 1960s,
00:10:58.680 | this issue of we are having health problems
00:11:02.620 | at a higher rate than they existed before
00:11:04.560 | because before you were probably working on a farm
00:11:07.440 | and you were exercising all day long.
00:11:09.620 | You were on your feet.
00:11:10.380 | You were moving.
00:11:10.880 | You were lifting things.
00:11:11.660 | It was very physical.
00:11:12.280 | And now suddenly you're sedentary because you're in an office.
00:11:14.560 | You're not getting that exercise.
00:11:16.060 | This caught us off guard.
00:11:17.420 | Like, oh, that was important,
00:11:19.860 | and we're not getting that anymore.
00:11:22.180 | You know, in the era before bypass surgery,
00:11:24.600 | people would just drop dead in their 60s.
00:11:26.500 | That's just how it worked.
00:11:27.180 | You just have a heart attack and die in your 60s.
00:11:28.760 | You're like, whoa, what's going on here?
00:11:29.860 | How do we respond to that?
00:11:32.100 | Well, we didn't say we need to shut down the offices
00:11:36.220 | and go back to the farms.
00:11:37.200 | We said, what was the thing we're missing now from the farms
00:11:40.880 | now that we have this new knowledge sector?
00:11:42.440 | Oh, it's the exercise.
00:11:43.460 | Okay, I guess people need to exercise.
00:11:45.340 | You didn't have to think about that before.
00:11:47.060 | In 1920, you didn't have to think about exercising.
00:11:49.420 | You just got it.
00:11:50.240 | But in 1975, I got to go jogging.
00:11:53.280 | You know, I got to move some weights around
00:11:56.040 | because that is out of my life now,
00:11:58.100 | and it is actually pretty important.
00:11:59.240 | That is a good analogy for thinking about
00:12:01.260 | this smartphone-induced dumbness issue.
00:12:03.780 | We don't necessarily have to go back
00:12:05.920 | to a pre-2012 technology era,
00:12:08.640 | but we do now have to think explicitly
00:12:11.980 | about increasing our intelligence
00:12:14.580 | and maintaining our ability to hold attention
00:12:16.240 | in a way that we didn't have to in 2009.
00:12:18.300 | We just did this naturally.
00:12:20.520 | Now we have to think about it.
00:12:21.960 | That's the mindset shift.
00:12:23.680 | We have to exercise our minds
00:12:25.580 | in the same way we learned
00:12:26.480 | we have to exercise our bodies.
00:12:27.820 | So what might that mean?
00:12:30.060 | Well, we talk about this a lot on the show,
00:12:32.520 | but just to give you four ideas,
00:12:34.340 | you know, to get your mind going
00:12:36.500 | about how one might have a cognitive exercise routine
00:12:39.120 | and push back on this dumbness trend,
00:12:42.420 | you could one, force yourself to read.
00:12:43.960 | Reading is pull-ups and push-ups for your brain.
00:12:46.900 | Read.
00:12:48.160 | Every week, read a book.
00:12:49.220 | Start with things you love,
00:12:51.880 | easy to read, you're excited to read,
00:12:53.700 | but force yourself to sit there and read.
00:12:56.600 | The best way to do this
00:12:58.700 | is to be outside of arm's reach of a phone.
00:13:01.980 | In fact, be in a completely different room
00:13:03.340 | from a phone.
00:13:03.800 | Even better, go for a walk
00:13:05.420 | and read on a bench without your phone
00:13:06.940 | so that you don't have to fight
00:13:08.700 | against a reward circuit
00:13:09.840 | that sees the phone and says,
00:13:11.420 | it is right there.
00:13:11.960 | We could pick that up.
00:13:12.700 | Dopamine, dopamine, dopamine.
00:13:13.680 | So make your life easier.
00:13:14.740 | But reading is calisthenics for your brain.
00:13:16.400 | More generally,
00:13:18.140 | in the constant companion model of your phone,
00:13:20.180 | when you're at home,
00:13:20.900 | plug it in in the kitchen.
00:13:21.820 | Go there if you need to look something up.
00:13:24.400 | Go there if you need to check your text messages.
00:13:25.860 | Go there if you need to make a call,
00:13:27.160 | but don't have it with you
00:13:28.560 | when you're doing other things.
00:13:29.720 | Again, you want to sort of break out of this pattern
00:13:32.300 | of I can at any moment get faster stimuli.
00:13:34.960 | You certainly want to avoid,
00:13:37.720 | and I just learned this term.
00:13:38.940 | I don't know if you know this term, Jesse,
00:13:40.040 | but I just learned this term stimuli stacking.
00:13:42.060 | I don't, no.
00:13:43.900 | I heard this from a younger person.
00:13:45.240 | Shout out to Nate.
00:13:46.200 | Stimuli stacking is where you're consuming
00:13:50.020 | multiple streams of stimulus at the same time.
00:13:52.880 | So you're watching something
00:13:54.100 | while checking something on your phone,
00:13:56.100 | and maybe you even have like a different device
00:13:57.980 | on which you're like following something else.
00:14:00.540 | And supposedly some of the streamers like Netflix
00:14:03.980 | are actually redesigning their shows
00:14:06.320 | to be more compatible with stimuli stacking.
00:14:08.280 | So if missing what was said here
00:14:11.640 | means I don't know what's going on,
00:14:12.480 | That's a bad show
00:14:13.240 | because you can't actually look at your phone
00:14:14.660 | at the same time and watch that show.
00:14:16.100 | So don't stimuli stack.
00:14:17.420 | We want your mind to be used to like doing one thing
00:14:21.220 | for a long piece of time.
00:14:22.060 | Reflection walks is another great one.
00:14:23.980 | Go for a walk with a particular problem
00:14:25.980 | you want to solve.
00:14:26.680 | It could be just a problem in your life.
00:14:28.260 | I want to work this through,
00:14:29.400 | and your mind is going to be everywhere.
00:14:30.880 | It's going to be squirrel, squirrel, squirrel.
00:14:32.580 | But you keep pulling it back.
00:14:34.600 | Be in the sunshine.
00:14:35.760 | Be in the woods.
00:14:36.640 | Get used to just being alone with your own thoughts
00:14:39.380 | and manipulating your thoughts.
00:14:40.580 | You will get better at this.
00:14:41.640 | This also pushes back on the negative trends
00:14:44.660 | that smartphones are inducing.
00:14:45.860 | Also, have hobbies that require concentration.
00:14:48.040 | Playing the guitar requires a lot of concentration
00:14:51.340 | to get better.
00:14:52.040 | Woodworking requires a lot of concentration,
00:14:54.380 | you know, to get better.
00:14:56.040 | A particular sport requires a lot of work and focus
00:14:59.340 | to actually get better at it.
00:15:00.420 | So have, you know,
00:15:01.300 | things that require sustained concentration
00:15:03.100 | and give you obvious rewards as you get better.
00:15:05.920 | So notable rewards.
00:15:06.980 | So you feel that appreciation.
00:15:08.100 | All right.
00:15:09.300 | So anyways, I thought that was a cool article.
00:15:10.900 | That's what I think is going on.
00:15:12.360 | It's not just that the phone itself makes us dumber.
00:15:17.720 | It's particularly the way that it's rewired our brain,
00:15:21.340 | which creates that death spiral
00:15:22.780 | of we have a harder time applying our intelligence
00:15:24.400 | and we don't increase it.
00:15:25.240 | So we just push back.
00:15:26.040 | Look, man, when you're in the office building,
00:15:28.520 | Mad Men-style, in the 1960s,
00:15:30.020 | you got to start exercising.
00:15:31.580 | You didn't have to exercise
00:15:33.200 | when you're on the farm in the 1940s.
00:15:34.840 | You got to exercise now
00:15:35.860 | in the office building in the 1960s.
00:15:37.520 | Well, same thing.
00:15:38.500 | When I was in college in the early 2000s,
00:15:40.980 | I didn't have to worry about
00:15:42.280 | how do I keep my brain sharp?
00:15:45.200 | How do I keep getting smarter?
00:15:46.360 | Because we were just doing this all the time.
00:15:48.300 | We had to read books
00:15:49.460 | and we didn't have like constant distractions
00:15:51.300 | and we were often bored
00:15:52.340 | and walking long distances
00:15:53.460 | in the interminable snow
00:15:55.560 | of Hanover, New Hampshire,
00:15:57.200 | going through the snow,
00:15:59.260 | like trying to find our car,
00:16:00.480 | but we couldn't because it was buried in snow.
00:16:02.100 | And there was nothing in our ear
00:16:03.420 | and there was nothing to look at.
00:16:04.460 | You would just have to think.
00:16:05.380 | We were just thinking thoughts
00:16:06.400 | and mainly just I'm cold
00:16:07.780 | and why didn't I go to school at Pepperdine?
00:16:10.280 | But you were thinking
00:16:12.000 | and then you would go
00:16:12.800 | and you would trudge through this to a library
00:16:14.620 | and you're just stuck there with your book
00:16:15.980 | and you would sit there
00:16:16.600 | and have to like read your books for a while.
00:16:18.080 | We didn't have to think about it.
00:16:19.320 | We were like the farmers in the 40s.
00:16:20.620 | Now, 2025, you got to exercise.
00:16:22.900 | So you got to like force yourself to read books.
00:16:24.840 | You got to go for reflection walks.
00:16:26.620 | So cool article, scary trend,
00:16:29.280 | but at least on the individual level,
00:16:31.500 | I think it's reversible.
00:16:32.380 | When you read articles on a desktop
00:16:36.180 | or like a laptop,
00:16:38.520 | what do you do if you get distracted?
00:16:40.500 | Just put stuff in the working memory?
00:16:41.860 | So you put like, what are you talking about?
00:16:45.020 | Like if a thought comes up
00:16:46.000 | that's unrelated to the article?
00:16:47.260 | Yeah.
00:16:47.720 | Just trying to distract me.
00:16:48.680 | So you're not on your phone,
00:16:50.240 | but you're on a laptop or a desktop.
00:16:51.860 | I guess I would put it in working memory.
00:16:53.860 | I don't know.
00:16:54.240 | I'm pretty used to now
00:16:55.340 | when I'm doing something,
00:16:56.320 | I lock in on that thing.
00:16:57.300 | And then when I'm done,
00:16:58.560 | like now, what do I want to think about?
00:16:59.760 | But do you ever just read articles
00:17:01.480 | on a desktop or a laptop?
00:17:03.380 | Yeah, sometimes.
00:17:04.880 | So I'm trying to think,
00:17:05.760 | it's a good question.
00:17:06.460 | Like this morning,
00:17:07.860 | I read articles from both
00:17:09.080 | the New York Times and the New Yorker.
00:17:10.400 | And in both cases, I use the app.
00:17:12.920 | On what type of device?
00:17:14.900 | On my phone.
00:17:15.540 | On your phone.
00:17:16.280 | Yeah.
00:17:16.700 | I'll also read articles on the browser
00:17:19.840 | and I'll print articles.
00:17:21.500 | It's like another thing I like to do.
00:17:23.660 | But I'm not very distracted by the web.
00:17:25.600 | You know, like I don't really have places
00:17:27.500 | to go to distract me.
00:17:28.560 | Yeah.
00:17:28.920 | Like maybe MLB trade rumors.
00:17:30.760 | But that's only relevant
00:17:32.020 | for like a three-month period each year.
00:17:33.400 | So it's easier for me
00:17:34.940 | to just read an article
00:17:35.780 | and then I'm done reading that article.
00:17:37.060 | Right.
00:17:37.680 | Yeah.
00:17:37.960 | Well, anyways,
00:17:39.680 | we got a bunch of good questions coming up.
00:17:41.480 | But first,
00:17:42.320 | let's hear briefly from some sponsors.
00:17:44.460 | Talk about a relatively new sponsor of ours
00:17:48.080 | that I really enjoy called Factor.
00:17:49.600 | Factor offers chef-made gourmet meals
00:17:54.980 | that make eating well easy.
00:17:56.420 | Jesse, let me ask you this.
00:17:58.020 | Let's say you're at home.
00:18:00.100 | It's like a weekend.
00:18:00.780 | You're not at work.
00:18:01.440 | Lunchtime.
00:18:02.680 | What are you doing?
00:18:03.480 | How, like what's going to be
00:18:05.100 | a typical lunch for you?
00:18:05.980 | Well, I actually don't eat dinner.
00:18:07.660 | That's right.
00:18:08.440 | Yeah.
00:18:08.620 | Jesse eats like one meal a week
00:18:09.880 | and he has to be deadlifting
00:18:11.600 | while he does it.
00:18:12.240 | All right.
00:18:12.880 | Dinner.
00:18:13.260 | What are you going to do?
00:18:14.740 | Where would you typically find the dinner?
00:18:16.060 | I usually,
00:18:17.560 | depending on,
00:18:19.020 | say I'm at home,
00:18:19.840 | I would make eggs with,
00:18:22.040 | you know,
00:18:23.040 | cut up onions and vegetables
00:18:24.600 | and stuff like that.
00:18:25.260 | Yeah.
00:18:25.560 | So, okay.
00:18:26.020 | You would be a perfect example
00:18:27.440 | where Factor would be really useful.
00:18:29.660 | I've really been enjoying these.
00:18:30.940 | They're really healthy,
00:18:34.380 | tasty meals.
00:18:35.540 | They're refrigerated.
00:18:36.600 | You microwave them.
00:18:37.840 | It's like two minutes.
00:18:38.540 | I've been eating them all week.
00:18:40.340 | Two minutes
00:18:41.360 | and you have
00:18:43.000 | chicken and broccoli
00:18:45.300 | and cheese sauce
00:18:46.140 | or I had this thing
00:18:46.840 | that was
00:18:47.340 | some sort of like
00:18:48.580 | spiced ground meat
00:18:50.060 | with some sort of sauce
00:18:51.020 | or whatever.
00:18:51.400 | It's just like two minutes.
00:18:52.340 | You have it.
00:18:53.500 | It's like,
00:18:54.300 | it's healthy.
00:18:55.180 | It's good portions
00:18:56.320 | or whatever.
00:18:56.740 | It's a way of having
00:18:58.080 | like a variety of food
00:19:01.320 | and the ease with which
00:19:02.640 | you can make it
00:19:03.680 | is what really surprised me.
00:19:03.680 | I'm used to some of these
00:19:04.880 | microwave products
00:19:05.820 | where it's like,
00:19:06.620 | yeah,
00:19:06.780 | all you got to do
00:19:07.260 | is just like microwave
00:19:07.920 | for like 20 minutes
00:19:08.720 | and then you need to make
00:19:10.240 | like 17 incisions
00:19:11.260 | and then microwave
00:19:11.720 | for just seven minutes.
00:19:12.680 | but every 30 seconds
00:19:13.660 | you have to rotate it
00:19:14.600 | 45 degrees
00:19:15.860 | until you've gone past
00:19:17.120 | an obtuse angle
00:19:17.820 | at which point
00:19:18.240 | you want to turn it upside down
00:19:19.440 | and then once the vernal equinox comes
00:19:20.880 | you give it another five minutes
00:19:21.920 | and then you just put it in the oven
00:19:22.940 | and cook it for a half hour.
00:19:23.780 | Like,
00:19:24.200 | I was so surprised with Factors.
00:19:26.040 | Like,
00:19:26.340 | you just put a couple slits,
00:19:27.800 | you microwave it,
00:19:28.480 | mix some stuff up,
00:19:29.200 | tastes good.
00:19:30.120 | So Factor arrives fresh
00:19:31.760 | and fully prepared
00:19:32.580 | at your house.
00:19:33.960 | I've been seeing Factor vans
00:19:35.380 | around here,
00:19:35.860 | by the way.
00:19:36.300 | Yeah,
00:19:36.720 | I've seen them too.
00:19:37.360 | Yeah,
00:19:37.600 | so I guess they're coming
00:19:38.320 | from their own vans.
00:19:39.100 | You can lose up to eight pounds
00:19:41.020 | in eight weeks,
00:19:41.680 | for example,
00:19:42.260 | with Factor Keto meals
00:19:43.580 | based on a randomized
00:19:44.840 | controlled trial
00:19:45.600 | with Factor Keto.
00:19:46.320 | The results will vary
00:19:47.940 | depending on diet and exercise.
00:19:49.120 | More generally though,
00:19:50.640 | there's 40 different options
00:19:52.260 | across eight dietary preferences
00:19:54.100 | on the menu each week.
00:19:55.580 | So you can really pick
00:19:56.580 | what's tailored to your goals.
00:19:58.000 | I worry about calories.
00:19:59.280 | I want more protein.
00:20:00.300 | I'm doing keto.
00:20:01.260 | So not only do you get
00:20:02.360 | healthy meals,
00:20:02.920 | but you can choose exactly
00:20:03.920 | what you're looking for
00:20:04.580 | in your meals.
00:20:05.120 | Shows up,
00:20:06.520 | put in the fridge,
00:20:08.020 | throw it in the microwave
00:20:09.040 | when it's time to eat.
00:20:09.740 | They also have these
00:20:10.660 | wholesome smoothies,
00:20:11.680 | breakfast,
00:20:12.080 | grab and go snacks
00:20:12.880 | and more add-ons.
00:20:13.860 | So an easy way
00:20:15.140 | to really take control
00:20:16.280 | of your nutrition,
00:20:17.360 | make the eating
00:20:18.960 | kind of automatic
00:20:19.820 | and make it healthy.
00:20:21.320 | So eat smart with Factor.
00:20:23.600 | Get started at
00:20:24.380 | factormeals.com
00:20:25.540 | slash deep50off
00:20:28.160 | and use the code
00:20:29.340 | deep50off.
00:20:30.800 | So the word deep,
00:20:31.700 | the number 50,
00:20:33.140 | the word off
00:20:34.420 | to get 50% off
00:20:35.880 | your first box
00:20:36.540 | plus free shipping.
00:20:37.400 | That's code
00:20:38.660 | deep50off
00:20:39.540 | at factormeals.com
00:20:40.620 | slash deep50off
00:20:41.840 | to get 50% off
00:20:42.840 | plus free shipping
00:20:43.680 | on your first box.
00:20:46.200 | I also want to talk
00:20:47.520 | about our long-time friends
00:20:48.620 | and long-time sponsor
00:20:50.100 | at Grammarly.
00:20:51.800 | Jesse,
00:20:52.440 | I'm going to put you
00:20:52.940 | on the spot again.
00:20:53.780 | What percentage
00:20:55.400 | of the work week
00:20:56.220 | do you think
00:20:56.740 | the average professional
00:20:57.740 | spends doing
00:20:59.120 | written communication?
00:21:01.840 | It is 50%.
00:21:03.520 | That's pretty good though.
00:21:04.600 | You're pretty close.
00:21:05.460 | So Grammarly
00:21:06.580 | helps you with the thing,
00:21:09.580 | unless you have
00:21:11.320 | like a written-word
00:21:12.140 | diet though, right?
00:21:12.700 | You only write at night,
00:21:14.260 | like you're eating.
00:21:14.780 | You're only going to write
00:21:15.800 | 10 words a day.
00:21:16.400 | It's all part of it.
00:21:17.120 | So Grammarly
00:21:19.220 | helps you with this thing
00:21:20.140 | that you're spending
00:21:20.460 | half your week already doing
00:21:21.340 | which is writing
00:21:21.840 | with AI.
00:21:23.180 | Grammarly
00:21:23.680 | is your AI writing
00:21:24.420 | partner.
00:21:24.920 | You can stay focused
00:21:25.800 | and get through
00:21:26.180 | your work faster
00:21:26.960 | with relevant
00:21:28.060 | real-time suggestions
00:21:29.040 | wherever you write
00:21:30.500 | and you can download
00:21:31.380 | Grammarly for free
00:21:32.200 | at grammarly.com
00:21:33.400 | slash podcast.
00:21:34.240 | I'm going to talk
00:21:34.720 | about this more
00:21:35.260 | in the final segment
00:21:37.340 | of the show
00:21:37.860 | but this is where
00:21:38.760 | I think AI
00:21:39.460 | is really going
00:21:40.340 | to make its move
00:21:41.080 | early on
00:21:41.740 | is form-fit to
00:21:43.460 | the specific uses
00:21:44.440 | and this is what
00:21:45.620 | Grammarly is doing.
00:21:46.460 | In the tools
00:21:47.440 | you already do
00:21:48.140 | to do this writing
00:21:48.860 | that takes up
00:21:49.380 | 50% of your time
00:21:50.540 | it is right there
00:21:51.560 | to help make
00:21:52.040 | that writing faster.
00:21:53.580 | Whether you're
00:21:54.380 | brainstorming
00:21:55.080 | whether you want
00:21:55.820 | to rewrite something
00:21:56.700 | whether you want
00:21:58.060 | to check the tone
00:21:59.180 | of what you are doing
00:22:00.340 | you want context
00:22:01.360 | aware suggestions
00:22:02.100 | of like what
00:22:02.780 | you're going to write
00:22:03.420 | Grammarly is there
00:22:04.720 | for you to make
00:22:05.500 | your writing better
00:22:06.400 | and to get that
00:22:07.120 | good writing faster.
00:22:08.240 | 90% of professionals
00:22:09.540 | say Grammarly
00:22:10.220 | has saved them time
00:22:11.260 | writing and editing
00:22:12.080 | their work.
00:22:12.620 | Four out of five
00:22:14.320 | professionals say
00:22:15.040 | Grammarly helps
00:22:16.040 | them gain buy-in
00:22:17.240 | and action
00:22:17.780 | through their
00:22:18.420 | communication.
00:22:19.480 | Your data is secure
00:22:21.540 | and never sold
00:22:22.060 | with Grammarly.
00:22:22.620 | It's been doing
00:22:23.260 | this for 15 years.
00:22:24.320 | Grammarly is like
00:22:25.400 | the gold standard
00:22:26.160 | of responsible AI
00:22:27.040 | in this space.
00:22:27.680 | So get more done
00:22:28.900 | with Grammarly.
00:22:29.460 | Download Grammarly
00:22:30.720 | for free
00:22:31.200 | at grammarly.com
00:22:32.140 | slash podcast.
00:22:32.960 | That's grammarly.com
00:22:34.760 | slash podcast.
00:22:36.400 | All right Jesse
00:22:37.520 | do some questions.
00:22:38.540 | First question
00:22:41.700 | is from Grant.
00:22:42.380 | Decentralized platforms
00:22:44.580 | such as Mastodon,
00:22:45.740 | Pixel Fed,
00:22:46.880 | and Blue Sky
00:22:47.700 | have been slowly
00:22:48.520 | gaining momentum.
00:22:49.220 | Is this a good idea
00:22:50.580 | or should we move
00:22:51.420 | from these types
00:22:52.320 | of platforms
00:22:52.900 | altogether?
00:22:53.420 | Well I know
00:22:54.900 | two of those platforms.
00:22:56.040 | I know Mastodon,
00:22:57.760 | and Blue Sky.
00:22:57.760 | Have you heard of
00:22:58.460 | Pixel Fed?
00:23:00.200 | Well then I'm typing
00:23:01.040 | it in here.
00:23:01.580 | Just so we know
00:23:03.060 | what we're talking about.
00:23:03.880 | All right.
00:23:04.160 | Pixelfed.com
00:23:06.260 | I typed in
00:23:09.960 | Pixel Jed.
00:23:10.620 | That if you're
00:23:12.760 | wondering
00:23:13.100 | that URL is
00:23:14.780 | available.
00:23:15.020 | All right so
00:23:15.460 | Pixelfed.com
00:23:16.400 | I don't know
00:23:16.740 | about this.
00:23:17.280 | All right, this is
00:23:19.180 | also decentralized
00:23:20.620 | photo sharing
00:23:21.360 | and social media
00:23:21.960 | powered by
00:23:22.500 | Pixel Fed.
00:23:23.600 | All right so I
00:23:24.120 | think Mastodon
00:23:24.820 | and Pixel Fed
00:23:25.440 | are Twitter,
00:23:27.800 | Instagram style
00:23:28.600 | social media
00:23:29.120 | platforms that
00:23:29.900 | really show
00:23:30.880 | their decentralization.
00:23:32.660 | So like
00:23:33.040 | individuals can
00:23:34.080 | start up their
00:23:34.620 | own servers
00:23:35.320 | and then
00:23:35.700 | different servers
00:23:36.240 | talk to each
00:23:36.800 | other.
00:23:37.040 | Blue Sky is
00:23:38.720 | I think this was
00:23:39.220 | started by Jack Dorsey
00:23:40.140 | it's a little bit
00:23:41.120 | more polished so
00:23:41.880 | the experience is
00:23:42.640 | much more directly
00:23:43.400 | like using Twitter
00:23:44.400 | but there's some
00:23:45.160 | technological changes
00:23:46.820 | underneath the hood.
00:23:47.860 | So the question I
00:23:49.480 | think Grant is
00:23:50.040 | asking is like
00:23:50.820 | okay so is like
00:23:51.560 | this what we need
00:23:52.140 | to do?
00:23:52.720 | Keep working on
00:23:54.680 | like if Twitter
00:23:55.500 | slash X
00:23:56.320 | has all these
00:23:57.380 | problems we need
00:23:58.100 | to keep working
00:23:58.720 | to get a version
00:23:59.520 | of that that's
00:24:00.060 | better or do we
00:24:00.740 | not need a
00:24:01.380 | platform like that
00:24:02.100 | at all?
00:24:02.420 | I tend to be in
00:24:04.220 | the camp of
00:24:04.680 | we don't need
00:24:05.020 | a platform like
00:24:05.620 | that at all.
00:24:06.260 | I wrote an
00:24:06.980 | article about this
00:24:07.900 | a couple years
00:24:08.980 | ago back when
00:24:09.800 | Meta announced
00:24:11.320 | Threads which was
00:24:12.420 | their sort of
00:24:13.020 | Instagram slash
00:24:14.120 | Twitter clone.
00:24:15.440 | I wrote a New
00:24:16.460 | Yorker essay that
00:24:17.060 | was titled
00:24:17.520 | We Don't Need
00:24:18.260 | a New Twitter.
00:24:18.760 | And my argument
00:24:20.400 | is that issues
00:24:22.380 | with these platforms
00:24:23.360 | is not just the
00:24:24.840 | specific rules by
00:24:26.160 | which they're run
00:24:27.000 | which is what a
00:24:28.680 | lot of the fight
00:24:29.420 | is right now.
00:24:30.160 | So like Elon
00:24:31.060 | Musk takes over
00:24:31.820 | Twitter,
00:24:32.140 | renames it X,
00:24:33.720 | changes the
00:24:34.260 | content moderation
00:24:35.180 | and suddenly you
00:24:36.100 | get much more
00:24:36.660 | of this type of
00:24:37.240 | content and less
00:24:38.280 | of that type of
00:24:38.780 | content.
00:24:39.180 | Blue Sky,
00:24:39.820 | people move to
00:24:40.740 | blue sky and say
00:24:41.340 | we're going to
00:24:41.680 | moderate differently,
00:24:42.480 | we'll be more
00:24:43.620 | hostile towards
00:24:44.900 | like right-leaning
00:24:45.560 | ideas but very
00:24:46.680 | embracing of left-leaning
00:24:48.060 | ideas.
00:24:48.480 | Twitter kind of went
00:24:49.160 | the other way,
00:24:49.740 | we're going to be
00:24:50.480 | much more embracing
00:24:51.200 | of right,
00:24:51.620 | even far-right ideas
00:24:52.580 | and more hostile
00:24:53.780 | towards left-leaning
00:24:54.400 | ideas.
00:24:54.660 | So it's all about
00:24:55.200 | like what are we
00:24:55.700 | doing on these
00:24:56.360 | platforms?
00:24:56.820 | My argument is like
00:24:58.040 | that's not really
00:24:58.860 | the issue.
00:24:59.480 | I mean it's an
00:24:59.900 | issue but the
00:25:00.400 | bigger issue is the
00:25:01.260 | whole idea of a
00:25:02.360 | global conversation
00:25:03.020 | platform.
00:25:03.480 | This whole notion
00:25:05.420 | that we should have
00:25:07.000 | hundreds of millions
00:25:07.860 | of people trying to
00:25:08.840 | interact on the
00:25:09.860 | same platform,
00:25:10.940 | that notion is
00:25:13.180 | broken.
00:25:13.480 | Like that is,
00:25:15.160 | my argument is that
00:25:16.560 | is just guaranteed
00:25:17.800 | to lead to
00:25:19.020 | destabilization and
00:25:19.860 | rancor.
00:25:20.780 | Because if you have
00:25:21.580 | hundreds of millions
00:25:22.260 | of people that are
00:25:22.920 | trying to communicate
00:25:23.600 | and yet we need
00:25:24.200 | some sort of notion
00:25:24.940 | of a common zeitgeist,
00:25:25.880 | like a small number
00:25:26.660 | of conversations that
00:25:28.000 | are being surfaced
00:25:28.780 | from this scrum that
00:25:30.640 | most people are going
00:25:31.480 | to encounter.
00:25:32.040 | This is the
00:25:32.780 | selling proposition
00:25:33.460 | of the global
00:25:34.840 | platforms: you kind
00:25:35.300 | of like create this
00:25:36.300 | of-the-moment
00:25:37.740 | internet zeitgeist
00:25:38.680 | you can put
00:25:39.540 | your finger on.
00:25:39.540 | It's going to require
00:25:40.800 | aggressive,
00:25:41.320 | aggressive curation
00:25:42.220 | because you have
00:25:42.740 | hundreds of millions
00:25:43.420 | of messages from
00:25:44.080 | which you want to
00:25:44.600 | pull these like
00:25:45.140 | common threads,
00:25:45.880 | a very small number
00:25:46.620 | of messages.
00:25:47.040 | And once you have
00:25:47.820 | heavy curation
00:25:48.900 | like that,
00:25:49.400 | you are going to
00:25:50.840 | lead towards all
00:25:51.700 | sorts of problems.
00:25:52.380 | You're going to
00:25:52.800 | lead to behaviors
00:25:53.760 | at far extremes.
00:25:54.980 | It's not going
00:25:56.280 | to be a pretty place.
00:25:57.120 | It's not going
00:25:57.580 | to serve most
00:25:58.100 | people's needs.
00:25:58.780 | So I
00:25:59.640 | hold the
00:26:00.340 | point of view
00:26:01.680 | that global
00:26:02.120 | conversation platforms
00:26:03.020 | are a bad use
00:26:03.620 | of the internet.
00:26:04.060 | I think a better
00:26:05.140 | use of the internet
00:26:05.740 | is more niche
00:26:07.600 | conversations,
00:26:08.200 | bringing together
00:26:08.800 | people from all
00:26:09.460 | around the world
00:26:10.120 | into smaller
00:26:11.360 | communities where
00:26:12.080 | they can talk
00:26:12.560 | about shared
00:26:12.920 | interest and create
00:26:14.680 | a shared sense
00:26:15.760 | of community.
00:26:16.580 | These are our
00:26:17.280 | community standards
00:26:18.380 | based on our
00:26:19.840 | real community,
00:26:20.940 | not 300 million
00:26:22.920 | people in the same
00:26:23.540 | platform.
00:26:23.980 | And we have to
00:26:24.560 | have like a group
00:26:25.080 | of people in Palo
00:26:26.100 | Alto somewhere who
00:26:26.760 | say, here's our
00:26:27.440 | community standards.
00:26:28.080 | Like, no, this
00:26:28.580 | is a hundred
00:26:29.520 | people who like
00:26:30.200 | this baseball team
00:26:31.000 | and we like to
00:26:31.440 | get together and
00:26:32.000 | talk about it on
00:26:32.700 | this platform.
00:26:33.260 | And we can come
00:26:33.960 | up with our own
00:26:34.400 | standards for how
00:26:35.060 | we talk about this
00:26:35.860 | because we're
00:26:36.260 | really a community
00:26:37.080 | or here's a group
00:26:38.460 | of people that like
00:26:39.220 | we like to get
00:26:39.680 | together to talk
00:26:40.320 | about like this
00:26:40.960 | type of movie or
00:26:41.860 | something like that.
00:26:42.500 | We can create
00:26:43.280 | our own community
00:26:43.840 | standards for that
00:26:44.420 | because it's a,
00:26:44.860 | it's a real
00:26:45.320 | community.
00:26:45.720 | We know each
00:26:46.360 | other, we care
00:26:46.900 | about what's
00:26:47.240 | going on, we
00:26:47.680 | like each
00:26:48.060 | other, we're
00:26:48.380 | trying to share
00:26:48.860 | information.
00:26:49.380 | That is the use
00:26:50.520 | of the internet.
00:26:50.900 | So I've been a
00:26:51.380 | long, long been
00:26:52.560 | arguing that these
00:26:53.500 | global platforms,
00:26:54.900 | the idea that we
00:26:55.320 | need everyone
00:26:55.820 | using a small
00:26:56.340 | number of
00:26:56.620 | platforms only
00:26:58.260 | really serves
00:26:59.260 | investors in the
00:27:00.920 | really large
00:27:01.300 | platforms.
00:27:01.740 | It's a way to
00:27:02.440 | try to concentrate
00:27:03.200 | a huge amount
00:27:04.120 | of monetization
00:27:04.880 | into a small
00:27:05.700 | number of
00:27:06.060 | people's pockets,
00:27:06.680 | but it doesn't
00:27:07.340 | create a better
00:27:07.840 | experience for the
00:27:08.440 | users.
00:27:08.740 | So my argument
00:27:10.480 | is niche
00:27:11.720 | communities are
00:27:12.280 | better.
00:27:13.460 | And once you
00:27:14.080 | have niche
00:27:14.440 | communities, I
00:27:15.080 | don't think the
00:27:15.440 | chronological timeline
00:27:16.380 | of like a
00:27:16.840 | Twitter, this or
00:27:17.380 | that is necessarily
00:27:18.000 | best.
00:27:18.400 | You want more
00:27:19.800 | of like a, maybe
00:27:20.400 | like a forum
00:27:21.000 | metaphor, discussion
00:27:22.040 | metaphor, a chat
00:27:23.040 | metaphor, or a
00:27:24.760 | live, you know,
00:27:25.520 | discord style
00:27:26.300 | metaphor, whatever
00:27:27.820 | one you want to use.
00:27:27.820 | It's not necessarily
00:27:28.540 | just a sort of
00:27:29.440 | algorithmically curated
00:27:30.540 | or chronological
00:27:31.220 | timeline.
00:27:31.640 | So no, I don't
00:27:32.280 | think we need to
00:27:34.040 | fix Twitter with a
00:27:34.820 | different Twitter.
00:27:35.300 | I think we need to
00:27:36.080 | move past this era
00:27:36.940 | where we thought that
00:27:37.620 | Twitter was a good
00:27:38.200 | idea.
00:27:38.560 | All right, what
00:27:40.320 | do we got next?
00:27:40.840 | Next question is
00:27:42.220 | from Kevin.
00:27:42.680 | I took a job that
00:27:44.240 | pays more than
00:27:44.820 | double my previous salary.
00:27:46.120 | There's a lot more
00:27:47.060 | pressure and I'm
00:27:47.680 | stressed out.
00:27:48.300 | My wife works and
00:27:49.660 | we don't spend
00:27:50.240 | lavishly.
00:27:50.880 | There is a job back
00:27:52.160 | at my old company
00:27:52.840 | for my initial pay.
00:27:53.780 | How should I apply
00:27:55.100 | lifestyle-centric
00:27:56.040 | planning to my
00:27:56.740 | situation if I've
00:27:57.620 | only been at my
00:27:58.200 | new role for six
00:27:59.120 | months?
00:27:59.480 | Well, this sounds
00:28:01.140 | like an example of
00:28:02.060 | lifestyle-centric
00:28:02.740 | planning not being
00:28:03.680 | applied and now you
00:28:05.540 | have a little
00:28:05.900 | buyer's remorse.
00:28:06.720 | So I talk about
00:28:08.500 | this in my book,
00:28:09.140 | So Good They Can't
00:28:09.800 | Ignore You from
00:28:10.360 | 2012, as one of
00:28:11.980 | the so-called
00:28:12.580 | control traps.
00:28:13.660 | So I said, look,
00:28:15.640 | one of the traps
00:28:17.580 | that arises around
00:28:19.760 | taking control of
00:28:21.720 | your life and
00:28:22.440 | aiming it towards
00:28:23.080 | the things that
00:28:23.540 | resonate and away
00:28:24.260 | from the things
00:28:24.740 | that don't is
00:28:25.940 | that it's just as
00:28:27.520 | you're building up
00:28:28.320 | the skills that
00:28:29.080 | could give you
00:28:29.520 | that leverage that
00:28:31.540 | you will be tempted
00:28:33.020 | with, forget
00:28:34.660 | taking control,
00:28:35.880 | come get more
00:28:37.460 | money, come get
00:28:38.540 | more prestige.
00:28:39.400 | So it's like
00:28:39.920 | exactly when
00:28:40.780 | you're good
00:28:41.120 | enough to start
00:28:41.680 | dictating what
00:28:42.400 | your working life
00:28:43.040 | is like, that
00:28:44.300 | someone comes and
00:28:44.900 | says, no, no, no,
00:28:45.460 | this is the
00:28:46.240 | prestigious job,
00:28:46.980 | double your
00:28:47.380 | salary, come on,
00:28:47.980 | scoreboard, you're
00:28:49.100 | gonna put points on
00:28:49.600 | the scoreboard.
00:28:50.100 | It's precisely when
00:28:51.660 | people care enough
00:28:52.480 | about what you can
00:28:53.160 | do to try to lock
00:28:54.040 | you into things that
00:28:55.080 | are gonna have
00:28:56.200 | obvious indicators of
00:28:58.380 | prestige, like title
00:28:59.500 | and income, but that
00:29:01.040 | lock you in and
00:29:01.800 | prevent you from
00:29:02.420 | taking more control
00:29:03.100 | over your life.
00:29:03.600 | So you have to be
00:29:04.380 | careful.
00:29:04.640 | It's probably what
00:29:05.200 | happened here.
00:29:05.700 | It's like, this is
00:29:06.240 | impressive.
00:29:06.880 | This job is
00:29:08.160 | double the money.
00:29:08.720 | It's a hard job to
00:29:09.700 | get, and I think I
00:28:11.120 | might be able to get it.
00:29:12.280 | I'm winning the game
00:29:13.560 | here.
00:29:13.780 | I'm proud of myself.
00:29:14.680 | But then the
00:29:15.080 | problem is you have
00:29:15.700 | that job, and you
00:29:17.460 | have to confront the
00:29:18.120 | reality of, well, how
00:29:18.880 | does this actually fit
00:29:19.520 | into my ideal lifestyle
00:29:20.380 | plan?
00:29:20.780 | And Kevin, it sounds
00:29:21.480 | like here it doesn't,
00:29:22.360 | right?
00:29:23.620 | It's taking up a lot
00:29:24.440 | of your time.
00:28:24.860 | It's stressing you out.
00:29:25.620 | You're bothered by
00:29:26.860 | that stress.
00:29:27.400 | You don't like the
00:29:29.360 | lifestyle that it's
00:29:30.000 | leading to.
00:29:31.460 | So in general, you
00:29:32.760 | know, the way you
00:29:33.400 | avoid this, and then
00:29:34.120 | we'll talk about what
00:29:34.560 | you can do now, but the
00:29:36.280 | way you avoid this is
00:29:36.980 | you're always working
00:29:37.580 | back from the ideal
00:29:38.220 | lifestyle.
00:29:38.640 | What is my day like, an
00:29:40.860 | ideal day like?
00:29:41.740 | What type of place do I
00:29:43.480 | live?
00:29:43.900 | What's the schedule like?
00:29:44.960 | What am I doing?
00:29:45.580 | What type of places am I
00:29:46.440 | around?
00:29:46.660 | Who am I around?
00:29:47.240 | What's going on?
00:29:47.840 | How do I feel?
00:29:48.480 | What do I see?
00:29:49.060 | Smell and touch, right?
00:29:49.860 | Like you have this
00:29:50.340 | really clear picture, like
00:29:51.340 | what your ideal lifestyle
00:29:52.160 | is like.
00:29:52.600 | And then all you're
00:29:54.380 | thinking about is how
00:29:55.640 | do I move closer to
00:29:56.940 | that with the decisions
00:29:57.860 | I make?
00:29:59.460 | If you have a really
00:30:00.260 | clear image of that, it
00:30:01.900 | then might be the case
00:30:02.780 | when you look at a
00:30:03.320 | promotion that's going to
00:30:04.020 | give you a lot of money
00:30:04.740 | or a job that's going to
00:30:05.640 | give you a lot of money.
00:30:06.380 | But if that job breaks
00:30:08.100 | many aspects of your
00:30:09.440 | vision, you're like, of
00:30:10.440 | course I'm not going to
00:30:11.020 | do that.
00:30:11.360 | But if you don't have
00:30:12.560 | that vision in place,
00:30:13.340 | you're like, well, the
00:30:13.900 | only scoreboard I have is
00:30:15.080 | like title and income, so
00:30:16.100 | I want to keep putting up
00:30:17.060 | points, so that's what I'm
00:30:17.840 | going to do.
00:30:18.220 | Now that you're already in
00:30:19.720 | this position, that's okay.
00:30:21.480 | Let's do our lifestyle
00:30:22.420 | centric planning now.
00:30:23.280 | Why are you stressed?
00:30:24.860 | What is it that you're
00:30:25.660 | missing?
00:30:25.920 | What is it that you're
00:30:26.680 | looking for in the
00:30:28.520 | ideal life?
00:30:29.180 | Don't fixate on the
00:30:30.660 | solution.
00:30:31.060 | Fixate on where you want
00:30:32.020 | to be.
00:30:32.340 | It's easy.
00:30:33.760 | What happens is when
00:30:34.640 | people are unhappy with
00:30:35.520 | the situation is that they
00:30:36.540 | want to fixate on a
00:30:37.320 | specific solution.
00:30:38.060 | So like maybe right now
00:30:39.560 | you want to fixate on like
00:30:40.680 | going back to your old
00:30:41.440 | job or something, but
00:30:42.160 | don't get stuck yet in the
00:30:43.440 | solution.
00:30:43.800 | Just focus on the image
00:30:45.120 | of the ideal.
00:30:45.640 | Then you can do an
00:30:47.940 | evidence-based analysis of
00:30:49.180 | how to get around
00:30:49.700 | obstacles and what
00:30:50.480 | opportunities you have
00:30:51.220 | available, and there
00:30:52.020 | might be paths there
00:30:53.120 | that you're not thinking
00:30:55.060 | about right now.
00:30:55.980 | It might be, oh,
00:30:57.000 | maybe at my current
00:30:57.900 | new job, I do this
00:31:00.920 | smaller pivot, and that
00:31:02.900 | actually sets me up to
00:31:04.040 | get to these things, this
00:31:05.080 | ideal lifestyle in a
00:31:05.960 | different way.
00:31:06.420 | Or maybe the solution
00:31:08.100 | is, yeah, I got to go
00:31:08.700 | back.
00:31:09.000 | I'm going to go back to
00:31:10.280 | this other position, but
00:31:11.520 | don't fixate on this.
00:31:12.760 | People like the grand
00:31:14.360 | gesture.
00:31:14.820 | Don't worry about that yet.
00:31:16.100 | Worry about what it is you
00:31:17.520 | want in your life, and
00:31:19.240 | be sober-minded and
00:31:20.160 | careful about figuring
00:31:21.080 | out what are the
00:31:22.140 | obstacles to that, what
00:31:23.020 | are the opportunities to
00:31:23.700 | get there.
00:31:24.040 | Maybe in this new
00:31:24.800 | job, it's possible.
00:31:25.500 | This money is going to
00:31:26.420 | allow you to do certain
00:31:27.220 | things that you
00:31:27.700 | couldn't do before, and
00:31:28.980 | it's going to unlock all
00:31:29.720 | sorts of cool things in
00:31:30.460 | your lifestyle, but the
00:31:31.240 | problem is the time, but
00:31:32.120 | maybe you can get on top
00:31:32.880 | of the time by using
00:31:35.120 | some Cal Newport
00:31:36.180 | techniques, and actually I
00:31:37.680 | can really reduce the
00:31:38.640 | footprint of this job and
00:31:39.760 | still use the money.
00:31:40.500 | You don't know what the
00:31:41.420 | right answer is going to
00:31:42.060 | be until you know what
00:31:43.580 | you're trying to get to
00:31:44.240 | first.
00:31:44.600 | So focus on the goal, not
00:31:47.500 | the fixes.
00:31:48.040 | This comes up a lot,
00:31:49.720 | Jesse.
00:31:49.900 | People's reaction to a
00:31:52.720 | difficult situation is
00:31:54.880 | to focus on like a
00:31:55.940 | particular move, not
00:31:58.400 | what's wrong, where do I
00:32:00.100 | want to get, what's five
00:32:01.240 | options to get there.
00:32:02.060 | They just want to have, I
00:32:03.940 | guess it feels good in the
00:32:04.900 | moment.
00:32:05.200 | I'm going to do something
00:32:06.460 | radical.
00:32:06.940 | I'm going to quit.
00:32:07.960 | I'm going to move to this
00:32:09.300 | country or whatever it is,
00:32:10.500 | right?
00:32:10.620 | You just, you get some,
00:32:11.720 | some big idea and the
00:32:14.840 | radicalness of the idea
00:32:15.840 | gives you some succor, but
00:32:17.300 | you're not actually thinking
00:32:18.340 | through what are the five
00:32:20.400 | different things I could do
00:32:21.280 | here and let me sober
00:32:22.100 | mindedly look at them.
00:32:23.000 | And actually in the end,
00:32:23.740 | this is not as exciting, but
00:32:24.860 | this path here, this one
00:32:26.080 | year plan is going to end
00:32:28.160 | up in a much better place.
00:32:29.140 | It's, it's not people's
00:32:30.340 | instincts to work backwards
00:32:31.920 | from ideal lifestyle and be
00:32:33.200 | systematic and exploring
00:32:34.240 | different ways forward.
00:32:35.000 | All right.
00:32:37.280 | Who do we got next?
00:32:37.940 | Next question is from
00:32:39.400 | Ravi.
00:32:39.760 | I'm a 40 year old software
00:32:41.700 | engineer.
00:32:42.120 | My younger colleagues are
00:32:43.600 | technically sharper and
00:32:44.640 | bring more value to my
00:32:45.640 | manager.
00:32:46.000 | This makes me
00:32:47.000 | dispensable.
00:32:47.640 | What's my path towards
00:32:49.220 | relevancy?
00:32:50.440 | Well, I think you have
00:32:51.160 | two options, Ravi.
00:32:51.940 | One, you have to keep in
00:32:53.220 | mind young people have more
00:32:55.940 | relevant skills because the
00:32:58.260 | thing they learned when they
00:32:59.600 | were learning this field
00:33:00.460 | because they're younger is
00:33:02.260 | much more recent.
00:33:02.980 | So they're getting into the
00:33:05.360 | technology field.
00:33:06.080 | They're putting in that
00:33:06.980 | initial push to learn
00:33:08.080 | skills.
00:33:08.440 | They're learning whatever
00:33:09.260 | skill is relevant right now.
00:33:10.580 | And if they just learned the
00:33:11.580 | skill a couple of years ago,
00:33:12.480 | it's going to be new and it's
00:33:13.680 | going to be relevant.
00:33:15.820 | But this doesn't mean that you
00:33:17.520 | can't, uh, get back ahead of
00:33:20.260 | them again.
00:33:20.840 | So yeah, they're coming in with
00:33:22.220 | relevant skills, but if you're
00:33:24.300 | pretty systematic about, I'm
00:33:25.540 | going to put aside regular
00:33:26.400 | time to master what's new
00:33:28.280 | now, you can get yourself back
00:33:31.180 | into that conversation, right?
00:33:32.960 | So it's not that the younger
00:33:34.320 | people are much smarter.
00:33:35.340 | It's not that the younger people
00:33:36.660 | can learn skills much faster than
00:33:38.280 | you either.
00:33:38.660 | They're just starting with the
00:33:40.700 | latest skill.
00:33:41.700 | But once you've also learned
00:33:43.140 | the latest skill, you kind
00:33:44.080 | of have parity now.
00:33:44.980 | Like once they're working for
00:33:46.860 | this company, they're not
00:33:47.600 | necessarily picking up the next
00:33:49.200 | thing that's going to come any
00:33:50.180 | faster than you can.
00:33:51.100 | If you systematically put aside
00:33:52.360 | time to do it.
00:33:53.040 | In fact, you probably have an
00:33:54.920 | advantage.
00:33:55.540 | If you're older, you say you're
00:33:57.380 | 40, you're probably less
00:33:58.980 | stimuli addicted than they are.
00:34:00.220 | Look back at our deep dive from
00:34:01.980 | the beginning of this episode.
00:34:02.940 | You probably have less of a
00:34:05.140 | problem with sustained
00:34:07.200 | concentration than they have.
00:34:08.340 | You probably have less of a
00:34:09.520 | problem doing the type of
00:34:10.480 | activities that's going to
00:34:11.180 | make them smarter, right?
00:34:12.140 | Your brain is calmer.
00:34:14.580 | Your brain has been less
00:34:16.560 | drenched in high stimuli
00:34:17.920 | dopamine.
00:34:18.360 | So if you are continually,
00:34:22.600 | slowly, and steadily keeping up:
00:34:24.240 | what's the new thing this year?
00:34:25.240 | Let me learn it.
00:34:25.800 | Okay, what's the new thing
00:34:26.480 | next year?
00:34:26.860 | Let me learn that.
00:34:27.600 | You are going to be ahead of,
00:34:30.920 | I think you'll get back ahead
00:34:31.860 | of the people who have joined
00:34:33.120 | more recently, right?
00:34:35.080 | And you can keep your
00:34:36.340 | relevancy alive.
00:34:37.020 | So I always think, you know,
00:34:39.400 | young people have more
00:34:40.440 | time, they have more
00:34:41.640 | energy, but they are more
00:34:42.800 | distracted.
00:34:43.280 | The other advantage you have
00:34:45.300 | is because you're older and
00:34:46.520 | more sober-minded, you're
00:34:47.440 | probably better at mature
00:34:49.140 | decision-making, communication,
00:34:50.500 | personability.
00:34:51.240 | So you could also pivot,
00:34:52.540 | which is like the common move
00:34:54.220 | towards more of a managerial
00:34:55.660 | role, right?
00:34:56.860 | This is where it's hard:
00:34:58.580 | a phone-addicted 23-year-old is not
00:34:59.800 | going to be able to manage
00:35:00.740 | other people.
00:35:01.340 | They just have trouble
00:35:02.120 | interacting with them,
00:35:03.300 | making mature decisions, and
00:35:04.560 | avoiding impulsivity.
00:35:05.140 | You're 40, you can do that in a
00:35:07.420 | way a young person can't.
00:35:08.240 | So you can also pivot
00:35:09.080 | towards a role that actually
00:35:10.140 | rewards your age in that way
00:35:11.500 | as well.
00:35:11.840 | So use your ability and hone
00:35:14.440 | your ability to focus the fact
00:35:15.980 | that you're not young and on
00:35:17.000 | TikTok all the time to either
00:35:18.680 | be systematically learning
00:35:19.980 | stuff faster than the new
00:35:21.840 | people or use your older,
00:35:24.000 | more sober brain to pivot to a
00:35:26.340 | role that the young people
00:35:27.740 | can't do as well.
00:35:28.520 | I mean, I've heard Scott
00:35:30.380 | Galloway talk about this as
00:35:31.520 | sort of the problem with the
00:35:33.320 | tech industry.
00:35:33.940 | As he said, sort of one of the
00:35:35.120 | big models in the tech
00:35:36.320 | industry is bring in people
00:35:39.340 | who are young and they might
00:35:42.200 | be only 70% as good as the
00:35:45.620 | people who've been there for
00:35:46.320 | a while, but they are half the
00:35:47.360 | salary.
00:35:47.680 | So keep hiring young people.
00:35:49.960 | You don't have to pay them
00:35:51.000 | nearly as much as the older
00:35:51.960 | people.
00:35:52.280 | They're not as good, but they
00:35:53.920 | have high energy and it keeps
00:35:55.780 | the expenses low.
00:35:56.660 | So that's sort of the headwinds
00:35:58.360 | you're pushing back against
00:35:59.220 | here.
00:35:59.340 | But I think by systematically
00:36:01.300 | and deliberately learning new
00:36:02.620 | skills, you can actually be more
00:36:03.840 | nimble than younger people and
00:36:04.960 | then move into the positions
00:36:05.880 | where the young people just
00:36:06.620 | can't take them.
00:36:07.180 | The position where
00:36:08.640 | the 23-year-old out of Stanford
00:36:09.800 | just can't take that position.
00:36:10.920 | They just are unable to deal
00:36:12.220 | with adults yet.
00:36:12.900 | So I think those are your two
00:36:14.060 | options.
00:36:14.440 | All right, who do we have next?
00:36:17.080 | Next question is from Selah.
00:36:18.680 | My grand goal is to own a
00:36:20.860 | production company that writes
00:36:22.000 | and creates animations.
00:36:23.140 | I currently am a producer that
00:36:25.060 | works on podcasts and TV shows.
00:36:26.820 | However, we've been informed
00:36:28.440 | that our last deal will end this
00:36:30.020 | August and we'll get a year of
00:36:31.260 | severance.
00:36:32.080 | I currently work about three
00:36:33.220 | days a week and dabble in some
00:36:34.620 | freelance.
00:36:35.060 | Should I spend this time trying
00:36:36.760 | to look for another job or hone
00:36:38.320 | in on trying to make my long-term
00:36:40.180 | goal a reality?
00:36:41.320 | I mean, a couple points here.
00:36:43.660 | First, I would temper or
00:36:46.600 | complement goal-based thinking
00:36:48.280 | with lifestyle-centric
00:36:49.680 | thinking.
00:36:50.000 | So don't get completely fixated on
00:36:52.700 | this specific goal.
00:36:53.680 | Like, I need to be running this
00:36:55.280 | type of company.
00:36:56.080 | It might not be possible or it's
00:36:58.060 | possible, but it's actually not
00:36:59.080 | going to make your life as good
00:36:59.920 | as you think.
00:37:00.460 | So work backwards from
00:37:02.520 | lifestyle and this will, you can
00:37:04.480 | see how this particular idea fits
00:37:06.000 | into your ideal lifestyle, but it
00:37:07.220 | will also unlock other paths
00:37:09.440 | towards your ideal lifestyle that
00:37:10.460 | you might not be thinking about
00:37:11.520 | right now because they're less
00:37:12.560 | definitive or less sexy.
00:37:13.920 | So I would say that number one.
00:37:15.740 | Number two, I think it's good,
00:37:18.520 | right?
00:37:18.780 | You're starting from a place of
00:37:20.000 | expertise.
00:37:20.400 | You're working in podcasts and TV
00:37:23.180 | show production.
00:37:23.880 | So it's not like this is a pie in
00:37:26.220 | the sky dream.
00:37:27.060 | You're not an insurance
00:37:28.820 | actuarial analyst who's like, I
00:37:30.820 | want to have an animation company.
00:37:31.860 | Like, so you know this world, you
00:37:33.080 | know what's realistic.
00:37:33.720 | I think that's really useful.
00:37:34.780 | I would still use the advice from
00:37:38.820 | my book, So Good They Can't Ignore
00:37:39.940 | You, use money as a neutral
00:37:41.360 | indicator of value.
00:37:42.260 | You need to actually see people
00:37:47.180 | paying for what you're offering as
00:37:50.440 | indication that what you're
00:37:51.440 | offering has value to the market.
00:37:53.900 | So if you want to start an
00:37:55.100 | animation production studio, like
00:37:56.400 | you need to be producing animation
00:37:57.660 | that like, okay, I actually sold
00:37:58.980 | this project or that, and maybe
00:38:00.080 | it's on the side at first, but now
00:38:01.360 | it's making this much money that I
00:38:03.320 | would be okay if I switched to this
00:38:04.700 | full time.
00:38:05.300 | Like, don't just hope or guess it's
00:38:07.740 | going to succeed.
00:38:08.360 | Wait to see that people are giving
00:38:10.600 | you their money, not just giving you
00:38:12.300 | their encouragement, but giving you
00:38:14.920 | their money as an indicator that your
00:38:16.820 | project is valuable.
00:38:17.700 | So if you could do this quickly, like
00:38:18.980 | you have a year as a severance.
00:38:20.060 | So if you can do this in the next
00:38:20.980 | six months, yeah, go all in on it
00:38:23.440 | and see if it works.
00:38:24.180 | If this is going to be a longer
00:38:25.320 | endeavor, it might take a few years
00:38:26.820 | to really like get something, making
00:38:29.040 | enough money to see if it's viable
00:38:30.440 | or not, then you should be looking
00:38:31.340 | for another job at the same time.
00:38:32.640 | That's my advice.
00:38:34.540 | Introduce lifestyle centric planning,
00:38:36.820 | not just goal-based planning to make
00:38:39.100 | sure that you see the full scope of
00:38:40.380 | both your opportunities and options,
00:38:42.620 | but also like make sure that whatever
00:38:43.960 | you're doing is aimed in the right
00:38:45.080 | direction.
00:38:45.400 | And then two, use money as a neutral
00:38:47.460 | indicator of value, right?
00:38:50.240 | If you're making money off of this,
00:38:51.920 | you can spend more time doing it.
00:38:53.180 | If you're not, then maybe it's not
00:38:54.260 | going to work.
00:38:54.580 | It's a really good test of viability.
00:38:55.980 | All right.
00:38:58.240 | We got another, we got one last job
00:38:59.680 | question here, right?
00:39:00.820 | We got Sam.
00:39:01.560 | I just got fired from my federal job
00:39:04.680 | as I was in a probationary period.
00:39:06.240 | How do I recover from this setback?
00:39:08.060 | I was partway through a quasi
00:39:09.680 | development program and I had some
00:39:11.120 | great career capital opportunities
00:39:13.000 | lined up.
00:39:13.520 | I'm worried that without finishing
00:39:15.080 | the program, I won't be competitive
00:39:16.980 | for another job in this niche,
00:39:19.380 | and all my skills so far
00:39:21.900 | are specialized to this
00:39:23.480 | field.
00:39:23.780 | Well, I mean, look, first of all,
00:39:26.260 | empathy is due.
00:39:27.560 | Like I can see you're, you're
00:39:29.780 | struggling with what happened and
00:39:31.080 | rightly so.
00:39:31.660 | It sucks that you have a job and you
00:39:35.400 | lose it.
00:39:35.900 | And in particular, if it was a new
00:39:37.160 | job and you liked where it was
00:39:38.760 | taking you, you had this plan, it
00:39:41.260 | was a good plan and it got taken
00:39:43.360 | away.
00:39:43.620 | Like there's something traumatic in
00:39:44.720 | that and I, that comes through, I
00:39:46.380 | think, in your question.
00:39:47.040 | So I think there's empathy here and
00:39:48.720 | nothing good about this situation.
00:39:50.180 | So we need to regroup and we need to
00:39:52.880 | reattach.
00:39:53.540 | And so we need to be wary of emotional
00:39:57.100 | attachments and narratives that are no
00:39:58.660 | longer possible.
00:39:59.420 | So there's a particular narrative here
00:40:02.260 | about a particular development program
00:40:03.600 | that you're a part of, and
00:40:04.820 | you saw where that was going to lead.
00:40:05.900 | That may be off the table now.
00:40:06.800 | It was only available to
00:40:09.000 | government employees.
00:40:09.600 | You're not gonna be able to get that
00:40:10.360 | government job back anytime soon.
00:40:11.780 | So we have to pivot here and the
00:40:14.960 | right way to pivot is to do an
00:40:17.900 | inventory of your career capital.
00:40:19.240 | Like, well, what are my skills?
00:40:20.860 | What are my rare and valuable skills
00:40:23.200 | and try to find where else are those
00:40:26.560 | going to be valued?
00:40:27.680 | So I want to try to find another job.
00:40:30.040 | You might also think, look, I might
00:40:32.760 | have to spend a couple of years doing
00:40:34.180 | something else just because I have to
00:40:35.240 | put food on the table and I'm going to
00:40:37.360 | regroup in that new position, build
00:40:39.180 | skills and think about my next move
00:40:41.580 | to get back on the track I was on
00:40:44.120 | before or another path towards whatever
00:40:46.940 | ideal lifestyle I have in place.
00:40:48.220 | So it might be a regroup and reattach.
00:40:49.620 | The regroup might take a couple of
00:40:50.740 | years.
00:40:50.940 | I got to find the job.
00:40:52.620 | Maybe I have to move.
00:40:53.540 | I got to put my capital to work.
00:40:54.740 | Maybe I have to take a lower position
00:40:55.900 | and work my way up real quick.
00:40:56.940 | I have to get back on my feet before
00:40:59.940 | I can make my really highly strategic
00:41:02.620 | plan.
00:41:02.960 | I think this is hard for a lot of
00:41:04.720 | people.
00:41:05.040 | If you've done a lot of work to set
00:41:06.460 | yourself up for a strategic move and
00:41:07.940 | then that gets taken away from you, you
00:41:10.000 | want to just jump right into another
00:41:11.640 | equally strategic move.
00:41:12.620 | But sometimes you have to go regroup,
00:41:14.060 | especially if like the job loss is
00:41:16.020 | unexpected.
00:41:16.460 | I got to just go find something that's
00:41:18.660 | going to take advantage of my career
00:41:20.040 | capital.
00:41:20.360 | Let me build up capital quick.
00:41:21.660 | Let me regroup, catch my breath, keep
00:41:23.980 | my bank account from emptying.
00:41:25.080 | And it's like, OK, now let me try
00:41:26.340 | again.
00:41:26.660 | And that I think that's going to be the
00:41:28.140 | case, especially for a lot of
00:41:29.120 | probationary or federal workers who have
00:41:30.860 | lost their job.
00:41:31.480 | You got to find something.
00:41:33.600 | Then you got to catch your breath and
00:41:35.640 | then you got to figure out, OK, now what
00:41:37.580 | do I want to do next?
00:41:38.240 | In your case, Sam, there are these very
00:41:40.920 | specific things you care about.
00:41:41.940 | So your re-attack might
00:41:44.060 | actually be back in the government.
00:41:45.480 | It might be back in the same program, but
00:41:47.820 | you need to go back.
00:41:48.480 | It might take a year or two until that's
00:41:50.000 | available again.
00:41:50.680 | Right.
00:41:51.580 | So your re-attack might be back in what
00:41:53.440 | you're doing or could be something really
00:41:54.560 | different.
00:41:54.900 | But, you know, don't hold on to the career
00:41:58.900 | narrative that unfairly got taken from you.
00:42:00.740 | We're not denying that it sucks, but we
00:42:05.300 | can't get caught on it because we need to
00:42:07.080 | keep making forward progress.
00:42:09.340 | So inventory your capital, find something
00:42:11.340 | that rewards it, build more capital, create
00:42:14.380 | a new plan, re-attack the new plan.
00:42:17.180 | We're not giving up.
00:42:18.140 | We're strategically regrouping, not
00:42:20.440 | retreating, and then we're re-attacking
00:42:22.660 | again.
00:42:22.880 | I think that's the right way to think
00:42:24.120 | about your career during periods of
00:42:25.540 | turmoil.
00:42:25.820 | It's not always a straight line upwards or
00:42:29.300 | straight down a path that you've planned
00:42:30.760 | before.
00:42:31.120 | Sometimes we get knocked off.
00:42:33.020 | It's going to take us a little while to
00:42:34.340 | find our way back, but we keep hiking
00:42:35.900 | some more.
00:42:36.340 | Okay.
00:42:36.740 | So I feel for you, Sam, but you are going
00:42:38.840 | to be okay.
00:42:39.320 | Let's regroup and we'll re-attack in a
00:42:41.640 | little bit.
00:42:41.980 | All right.
00:42:43.620 | Do we got a call this week?
00:42:44.460 | We do.
00:42:45.020 | All right.
00:42:46.080 | Let's hear it.
00:42:46.540 | Hi, Cal.
00:42:47.360 | My name is Alfie.
00:42:48.420 | I'm from the United Kingdom.
00:42:50.300 | And my question is around the ability to do
00:42:53.580 | deep work while in a shallow job.
00:42:56.400 | Now, a little bit of context.
00:42:58.420 | I'm two years into my career and I'm on a
00:43:01.960 | rotational program at a bank.
00:43:03.600 | And part of this rotation means that every six
00:43:07.640 | months I can change team.
00:43:08.960 | And I've recently changed team to one which
00:43:12.200 | really prioritizes shallow work, where
00:43:16.000 | pseudo-productivity is almost the name of the
00:43:18.400 | game and is very much celebrated.
00:43:19.860 | There's a lot of context switching throughout the day.
00:43:22.660 | So my question is really, given that this is a
00:43:27.340 | temporary role and given that at the moment I feel
00:43:31.940 | that my ability to concentrate and work deep has
00:43:34.440 | diminished, what are some practices I can do maybe over
00:43:40.500 | weekends or in my free time to improve that ability to
00:43:44.620 | concentrate and once I roll off this rotation and go into my
00:43:50.120 | new role, what are some things I can do early on to get back into
00:43:54.220 | the deep work routines and habits to ensure that that skill remains
00:44:00.320 | one I've trained well?
00:44:02.720 | Thank you.
00:44:04.540 | Well, Alfie, it's a good question.
00:44:05.960 | I think, by the way, Jesse, because Alfie mentioned the word
00:44:09.960 | pseudo productivity, which comes from my book, Slow
00:44:13.680 | Productivity, that we could play the theme music.
00:44:15.860 | I didn't schedule the corner.
00:44:17.900 | Someone brought up slow productivity organically.
00:44:22.040 | So we're going to play the slow productivity theme music.
00:44:23.920 | All right, Alfie, it's a good question.
00:44:25.860 | I mean, first of all, this emphasizes there's some roles where deep
00:44:30.220 | work is not rewarded.
00:44:31.040 | Like deep work is just a type of effort, right?
00:44:32.820 | It's an effort with sustained concentration.
00:44:34.320 | It is a good way of maximizing cognitive abilities for a lot of
00:44:38.720 | tasks where that's important.
00:44:39.680 | Alfie's in a role right now where it's not important.
00:44:41.780 | So he's not spending a lot of time in the state of deep work.
00:44:45.920 | So that's fine, right?
00:44:47.940 | This role doesn't require it.
00:44:49.040 | But I think it's a cool question of like, how do I make sure I don't
00:44:52.700 | lose that ability, that ability to concentrate that, you know, will be
00:44:57.300 | relevant again later with another job.
00:44:58.580 | Let's go back to the deep dive.
00:45:01.180 | We talked about this in the beginning of this episode, that if you're
00:45:05.280 | constantly in like a high stimuli type of situation, you get less comfortable
00:45:09.820 | concentrating.
00:45:10.380 | We see that in the data that you spend less time doing activities that make you
00:45:13.460 | smarter.
00:45:13.780 | And then you get even less comfortable, less smart and less comfortable
00:45:16.940 | concentrating.
00:45:17.380 | You get a cognitive death spiral.
00:45:18.680 | So do the type of things we talked about earlier in this episode.
00:45:21.880 | Force yourself to read books outside of work.
00:45:23.800 | Practice the non-constant companion model of the phone is plugged in, not with me when
00:45:30.080 | I'm at home.
00:45:30.560 | So you're just used to doing one thing at a time.
00:45:32.420 | Do reflection walks all the time.
00:45:33.760 | You're going to work on professional and personal projects just in my head while I'm
00:45:37.320 | walking with no source of distraction with me.
00:45:39.340 | Right?
00:45:40.400 | These are all things that are going to help your mind be comfortable with sustained
00:45:44.700 | concentration and actually strengthen your mind's ability to do things.
00:45:49.040 | You might add into there the hard hobby, you know, learn a new skill, learn how to computer
00:45:53.600 | program, learn how to do like microelectronics or woodworking.
00:45:56.360 | Be really careful about cognitive calisthenics during this period in which you're basically
00:46:00.880 | doing the cognitive equivalent of smoking during the workday.
00:46:03.320 | You've got to sort of offset the damage that's happening with active improving activities.
00:46:07.520 | So yeah, you'll be out of this rotation soon.
00:46:10.360 | Do things like that in the background.
00:46:12.900 | And then when it comes time to schedule deep work again in your next rotation, you're not
00:46:16.540 | going to struggle with maintaining your concentration.
00:46:18.840 | You're going to feel like your instrument is still well-practiced.
00:46:21.960 | All right.
00:46:23.740 | We got a case study here.
00:46:25.100 | All right.
00:46:27.440 | Our case study today, this is from Ian.
00:46:30.500 | All right.
00:46:31.960 | Ian says, great discussion on Kanban boards and systems.
00:46:36.040 | I love this space, and it has been a great reminder to me of how simplicity here is what works.
00:46:41.140 | A reminder I also needed.
00:46:43.060 | Attached are images of my engineering Kanban board, which I created around 2014 or so, that
00:46:49.220 | I thought you might find interesting.
00:46:51.140 | Still my most powerful organizing system yet.
00:46:54.600 | I have yet to replicate this, or do better, in my
00:47:01.280 | new roles now.
00:47:02.100 | Remote work-wise, it's hard.
00:47:03.020 | This is around the third version revision of the system and nothing I've done yet has replicated
00:47:07.780 | the visual organizational power, simplicity, and political leverage with internal customers.
00:47:12.540 | So I'm going to read about his system that he uses.
00:47:15.960 | And then for those who are watching, he sent some pictures of it that I'll bring on the screen
00:47:19.100 | as well.
00:47:19.360 | But let me read about it first.
00:47:20.960 | So he says, we did a standup meeting.
00:47:23.720 | So he's talking about his Kanban-based system, like how
00:47:28.240 | it operated at its peak.
00:47:30.380 | We did a standup meeting in front of the board, Monday, Wednesday, Fridays.
00:47:33.540 | It was the center of the engineering office nestled between two filing cupboards and hard
00:47:37.680 | to miss.
00:47:38.080 | We had custom cards with blue tack on the side of the cupboard and whiteboard.
00:47:42.080 | We had constant visual reminder of what's going on.
00:47:44.600 | We color coded by product or category.
00:47:46.600 | We had red dots for crazy important stuff.
00:47:48.980 | And we had queues, parking lots, done piles, et cetera.
00:47:51.960 | It worked really well for parking and queuing items rather than solve the last job that came
00:47:56.200 | in the door.
00:47:56.660 | If there was a real fire, we just made the call to drop everything and solve the fire.
00:48:00.960 | I think we gave up on the percentage-done bars on the cards as that just didn't add value
00:48:06.040 | versus maintenance time.
00:48:06.900 | Great for visual prompt to others of what we were working on and where things were at.
00:48:10.600 | I've tried digital systems and they just don't work nearly as well or as powerful
00:48:13.900 | as this.
00:48:14.920 | All right, I'm going to bring up some of these images here for those who are watching.
00:48:18.840 | Okay, so here is the main Kanban board.
00:48:21.980 | It is on the wall between two filing cabinets.
00:48:24.220 | For those who are listening, it's on a whiteboard.
00:48:28.420 | So there's blue tacked paper cards on a whiteboard that's up on the wall.
00:48:34.500 | There's columns here.
00:48:36.180 | One of these columns is to do next unassigned.
00:48:40.420 | So we talked about this in slow productivity.
00:48:42.740 | You have a place to keep track of work that needs to be done in your group that doesn't
00:48:45.620 | exist on anyone's plate.
00:48:46.680 | All right.
00:48:48.800 | It's not being forgotten, but it's not just in someone's inbox or in someone's head.
00:48:52.240 | It's on this column on this board.
00:48:53.640 | All right.
00:48:54.240 | In the middle is to do assigned.
00:48:56.600 | Now, look, this column is a cool way of doing it.
00:48:58.960 | It's split up into rows.
00:49:00.480 | Each row corresponds to a different person.
00:49:03.460 | So you can see who's working on what.
00:49:05.260 | So the person in the top row has nothing.
00:49:06.980 | The person in the second row is working on three things.
00:49:09.100 | You have those three cards.
00:49:10.140 | The person in the fourth row has four things.
00:49:11.640 | You can see specifically the four things they're working on.
00:49:14.040 | You'll notice, Jesse, two of these things are sort of on the border between two rows.
00:49:20.400 | So I think that means those two people are working on that together, maybe?
00:49:23.120 | Possibly.
00:49:24.400 | We can't actually read the cards, but they have the information about the task.
00:49:28.660 | It's got to be because there's another one at the end, too.
00:49:32.120 | So that's to do next assigned.
00:49:33.800 | And then they have a column for in progress.
00:49:35.680 | So it's like what people are working on at the moment.
00:49:38.040 | Oh, I see.
00:49:41.220 | There's a lot more people down here.
00:49:42.360 | Okay.
00:49:43.320 | Oh, cool.
00:49:43.800 | So if we look down here, we see there's rows for lots of people.
00:49:47.380 | People have a lot of stuff queued up.
00:49:49.320 | So they can queue up.
00:49:50.680 | I talk about this in Slow Productivity, doing this for yourself, having a queue of like what
00:49:54.360 | you're going to work on and in what order.
00:49:55.760 | They just have this up on a board.
00:49:57.380 | So I can look at Adam's row here and see one, two, three, four, five, six, seven things,
00:50:01.660 | one with a red dot, like things he's waiting to work on.
00:50:04.400 | And then next to it would be like, here's the four things he's working on now, knowing
00:50:07.620 | that as he finishes one of these things, he'll pull something else over there.
00:50:10.340 | So none of this is being kept track of in their head.
00:50:12.640 | And none of this is being kept track of just their email inboxes.
00:50:16.260 | It's really clear.
00:50:18.020 | This is unassigned work.
00:50:19.520 | Here's work we've assigned, but it's not being worked on.
00:50:21.440 | Here's the work people are working on.
00:50:23.400 | And we can just see it visually who's working on what and its status.
00:50:26.300 | Notice over here on the left, this is called parking lots.
00:50:30.080 | They have on the side of the cupboard parking lot.
00:50:33.740 | This is a Kanban idea where it's things you don't know.
00:50:39.200 | You're not looking to assign them.
00:50:40.960 | You're kind of like, let's put this on hold.
00:50:42.480 | Like these are things we're thinking about, but we're not looking to get these done right now.
00:50:45.980 | So you have a place to put those tasks.
00:50:47.800 | So anyways, what I love about this, and I'm really big on this idea in my book, Slow
00:50:51.080 | Productivity, is they're keeping track of what needs to be done, its status, and who's working
00:50:57.040 | on what in a centralized, transparent way, as opposed to allowing workloads to exist non-transparently,
00:51:03.940 | informally, and on individual plates.
00:51:06.080 | It makes a huge difference in the experience of your day.
00:51:08.860 | Because now, if you're one of these people represented by a row on this board, you're
00:51:12.260 | only working on the things in your row in the in-progress column.
00:51:15.260 | You don't have to do emails.
00:51:16.680 | You don't have to take meetings.
00:51:17.720 | And you don't have to waste cognitive cycles on all these other things.
00:51:20.380 | But Jesse, look at how many things are in to-do next, unassigned, and parking lot.
00:51:28.300 | Without a system like that, all of those things would just exist on people's plates and be
00:51:33.740 | generating potential "hey, what's going on with those?" emails.
00:51:35.760 | Or can we just have a quick meeting to check in on those emails?
00:51:38.420 | Or it's just in the back of your head as something you're supposed to be working on.
00:51:42.240 | So I see all of those cards that are in all of those other places as cognitive overhead
00:51:48.240 | that's been removed from the system.
00:51:49.480 | So they're going to finish stuff much quicker, and people are going to be much happier.
00:51:52.680 | So thanks for that.
00:51:54.180 | What was this, Ian?
00:51:55.220 | Thanks for sending the picture.
00:51:57.260 | That's a great demonstration of how these types of task board systems can work for teams.
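[Editor's sketch] For anyone who wants to experiment with Ian's setup digitally anyway, its mechanics are simple enough to sketch in a few lines of code. This is a minimal, illustrative Python model only; the class, field, and person names are hypothetical, not taken from Ian's actual system:

```python
from dataclasses import dataclass, field

@dataclass
class Card:
    title: str
    category: str = ""    # Ian color-codes cards by product or category
    urgent: bool = False  # the red dot for "crazy important" items

@dataclass
class Board:
    parking_lot: list = field(default_factory=list)  # on hold, not being assigned
    unassigned: list = field(default_factory=list)   # to do next, on no one's plate
    queued: dict = field(default_factory=dict)       # person -> to do next, assigned
    in_progress: dict = field(default_factory=dict)  # person -> actively worked on
    done: list = field(default_factory=list)

    def assign(self, card: Card, person: str) -> None:
        # move a card from the unassigned column into someone's row
        self.unassigned.remove(card)
        self.queued.setdefault(person, []).append(card)

    def start_next(self, person: str) -> Card:
        # pull the front of the queue into in-progress, like on the wall
        card = self.queued[person].pop(0)
        self.in_progress.setdefault(person, []).append(card)
        return card

    def finish(self, person: str, card: Card) -> None:
        # move a finished card to the done pile
        self.in_progress[person].remove(card)
        self.done.append(card)

board = Board()
card = Card("Update valve spec", category="blue", urgent=True)
board.unassigned.append(card)
board.assign(card, "Adam")
board.start_next("Adam")
board.finish("Adam", card)
```

The point, exactly as on the physical board, is that every task lives in exactly one column at a time, so nothing is tracked only in someone's head or inbox.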
00:52:03.700 | All right.
00:52:04.960 | So we got a final segment coming up.
00:52:08.840 | I want to have a hot take on AI.
00:52:10.400 | But first, let's hear from another sponsor.
00:52:12.980 | We'll talk about our friends at the Defender line of vehicles.
00:52:19.020 | We're talking about the Defender 90, the Defender 110, and the Defender 130.
00:52:24.060 | This is a very good-looking car.
00:52:26.220 | They're designed in a way that has like the modern features or conveniences you would want
00:52:30.480 | from like a modern high-end car.
00:52:31.860 | But they also have that tough, rigid body design, durability, that lightweight monocoque architecture
00:52:40.040 | for extra strength that you can take this thing on adventures.
00:52:42.120 | And I like that mix.
00:52:43.300 | Good-looking car that can be smooth and comfortable, and yet also take you where adventure might hold.
00:52:52.840 | Jesse, I got good news and bad news.
00:52:54.960 | I am going to get you, from our sponsor, a Defender 110 that you can drive.
00:53:00.280 | The bad news is it's wrapped with a picture of me.
00:53:03.520 | So it's going to be sort of like an advertisement for the podcast.
00:53:06.140 | So it's me.
00:53:07.880 | I'm going to have like a big thumbs up on both sides of the car.
00:53:10.700 | And there's a speaker on top that's going to chastise people for being on their phone as
00:53:15.480 | you drive past.
00:53:16.040 | So that's like the trade-off.
00:53:17.320 | It's a cool enough looking car that you probably still get away with it.
00:53:22.060 | Yeah.
00:53:22.340 | I've been seeing more and more of these.
00:53:23.940 | I haven't seen the one yet, again, that was parked outside our office after the last Defender
00:53:28.140 | read.
00:53:29.340 | But I see them everywhere now that I know about them, and it's cool.
00:53:32.580 | It's a good-looking car.
00:53:33.840 | They know what they're doing there.
00:53:35.800 | Anyways, if you want to see what these cars look like with or without the Cal Newport wrap,
00:53:39.960 | I don't know if that's a standard feature yet.
00:53:41.540 | It should be.
00:53:41.920 | Go check them out.
00:53:43.400 | I really like the way this car looks.
00:53:44.800 | Go to LandRoverUSA.com.
00:53:48.560 | So you can visit LandRoverUSA.com to learn more about the Defender.
00:53:53.320 | I also want to talk about our friends at ExpressVPN.
00:53:58.340 | How do you choose which internet service provider to use?
00:54:02.620 | The sad thing is most of us don't have very many options, right?
00:54:06.540 | ISPs are often operating like monopolies in your region.
00:54:08.980 | This is the internet service provider that's nearby.
00:54:10.960 | They use this monopoly to take advantage of their customers because you don't have any other
00:54:16.180 | options.
00:54:17.140 | Often, you don't have any other recourse if you don't like what they're doing.
00:54:19.080 | And you get things like data caps and bandwidth throttling, et cetera.
00:54:22.380 | But here's one thing that these ISPs are doing that you can push back on using ExpressVPN.
00:54:27.440 | They're trying to keep track of every website you visit, right?
00:54:33.700 | And here's how this works.
00:54:35.500 | Do you know what, Jess, I'm going to put you on the spot here.
00:54:38.220 | I never know like what network terminology is well-known or not.
00:54:41.580 | If I say packet, what do you think of when I say like an internet packet?
00:54:46.240 | I'm not sure.
00:54:47.980 | You think?
00:54:48.700 | Yeah.
00:54:48.880 | So this is the thing.
00:54:49.560 | I always assume this terminology is known.
00:54:51.720 | Okay.
00:54:51.960 | So when you're communicating on a network, like you're communicating to the deeplife.com over
00:54:56.480 | the internet, your communication gets broken up into these little messages called packets.
00:55:01.340 | And the front of the packet's like an address on an envelope.
00:55:05.160 | Here's the website this message is going to.
00:55:07.720 | Here's the website that it's been sent from.
00:55:10.060 | And then that packet gets sent through the internet.
00:55:11.840 | It gets bounced from router to router until it gets to the website.
00:55:14.700 | So every router has to look at the address along the way until it gets to where it's going.
00:55:20.120 | And then the destination can open up that envelope and, oh, here's what you're sending me, a request
00:55:24.760 | for a podcast or something like that.
00:55:26.640 | And a lot of websites these days use a secure protocol so that the stuff inside the envelope
00:55:33.260 | is encrypted.
00:55:33.780 | So if I'm your internet service provider and you hand me this envelope, hey, get this to
00:55:38.960 | the deeplife.com.
00:55:39.980 | I don't know what you're sending to the deeplife.com, but I can see that's who you're talking
00:55:43.940 | to because I have to pass this on.
00:55:45.360 | The address has to be plain.
00:55:46.860 | So just like in the mail, you can have a thick envelope, so I really don't know what
00:55:50.260 | you're mailing, but who you're mailing it to, my postman knows.
00:55:53.140 | All right.
00:55:54.120 | So ISPs, just look at the address on these envelopes and they know, hey, here's who you're
00:55:59.560 | talking to.
00:55:59.960 | I don't know what you're saying, but I know who you're talking to and they sell that data
00:56:02.160 | or they can.
00:56:02.780 | With a VPN like ExpressVPN, you get around that.
00:56:06.280 | And the way you get around that is you take the envelope you really want to send.
00:56:09.340 | I really want to talk to the deeplife.com, but I don't want people to know.
00:56:12.480 | I'm going to put that inside another envelope and I'm going to send that envelope
00:56:16.740 | to a VPN server.
00:56:17.840 | So now all your ISP knows is, you know, Jesse's talking to a VPN server.
00:56:23.760 | And then the VPN server can open that out and take out your real envelope.
00:56:26.900 | Oh, you really want to talk to the deeplife.com.
00:56:28.740 | It will talk to the site on your behalf and then it will put the response back in a big
00:56:32.860 | envelope and send that back to you.
00:56:34.160 | And now all your ISP learns is that you're talking to a VPN server.
00:56:37.400 | Does that work as an explanation?
00:56:39.400 | I like that explanation.
00:56:40.320 | Yeah.
00:56:40.560 | Okay.
00:56:41.020 | Didn't even have to use the word encryption.
00:56:42.160 | So that's what a VPN server does.
00:56:45.400 | It makes it so among other advantages, you can communicate with other sites and services
00:56:51.560 | without like your service provider knowing who you're talking to.
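[Editor's sketch] The envelope-inside-an-envelope idea above can be captured in a toy model. This is purely illustrative Python, not how any real VPN client is implemented; the site and server names are made up, and a real tunnel would also encrypt the inner packet rather than just nesting it:

```python
# Toy model of VPN encapsulation: what an ISP can and cannot read.
# A "packet" here is just a dict with a visible address and an opaque payload.

def make_packet(destination, payload):
    # the address must stay readable so routers can forward the packet;
    # the payload is the part that TLS would encrypt
    return {"to": destination, "payload": payload}

def wrap_for_vpn(packet, vpn_server="vpn.example.com"):
    # put the real packet inside an outer envelope addressed to the VPN server
    return {"to": vpn_server, "payload": packet}

def isp_view(packet):
    # the ISP (and every router along the way) only ever reads the outer address
    return packet["to"]

direct = make_packet("thedeeplife.com", "<encrypted request>")
tunneled = wrap_for_vpn(direct)

print(isp_view(direct))    # thedeeplife.com (ISP sees the real site)
print(isp_view(tunneled))  # vpn.example.com (ISP sees only the VPN server)
```

Without the wrapper, the outer address names the real destination; with it, the ISP learns only that you talked to a VPN server, and the VPN server unwraps the inner envelope and forwards it on your behalf.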
00:56:54.660 | If you're going to use a VPN, use the one I recommend, which is Express VPN.
00:57:01.180 | This is fantastic software.
00:57:03.120 | It's easy to use.
00:57:03.980 | You fire up the app and click one button and now all of your network communication on that
00:57:07.240 | device is going to go through the VPN and is protected.
00:57:09.860 | It works on phones, laptops, tablets, and more.
00:57:12.780 | So you can, no matter what you're checking through the internet with, you can have VPN privacy with
00:57:16.460 | Express VPN.
00:57:17.100 | It's been rated number one by tech reviewers at CNET and The Verge among others.
00:57:22.840 | So you should use a VPN and Express VPN is a great option.
00:57:26.460 | So protect your online privacy today by visiting expressvpn.com slash deep.
00:57:31.840 | That's E-X-P-R-E-S-S-V-P-N.com slash deep.
00:57:36.280 | You can get an extra four months free, but only when you go to expressvpn.com slash deep.
00:57:43.000 | All right, let's go on to our final segment.
00:57:45.220 | All right, that VPN ad got me into a technology mindset.
00:57:49.520 | So we do another tech corner.
00:57:51.420 | I think like my goal here, Jesse, is to give people like at least one interesting thing to
00:57:56.240 | throw into like a dinner party conversation about technology.
00:57:59.660 | You should come out of the tech corner with like a little tidbit you can pull out to be
00:58:02.560 | smart.
00:58:02.840 | I like it.
00:58:03.580 | Yeah.
00:58:03.900 | So I want to elaborate briefly an idea that I've been playing with.
00:58:08.500 | It came up in a panel discussion I was in recently.
00:58:10.800 | I spoke at a board of directors meeting this morning that came up again.
00:58:14.320 | So I'm sort of playing with this idea that there's a potential blind spot in the world
00:58:21.000 | of AI and in particular, a blind spot about where big impacts are going to come next.
00:58:26.760 | So when we think about generative AI tools like ChatGPT and economic impacts, which is really
00:58:32.180 | the topic at the heart of a lot of my reporting on AI, a lot of the focus, when you
00:58:37.780 | see people talking about products or you see the products that are being produced
00:58:40.740 | by the big players, particularly OpenAI or Microsoft or Google, is on the
00:58:46.760 | ability of these generative AI tools to generate text.
00:58:50.660 | We're thinking about the advantages of these tools in terms of the text they can produce.
00:58:58.620 | So often, when we're thinking about the non-tech applications of generative AI,
00:59:13.720 | so not programming or data analysis or that type of thing,
00:59:16.500 | it's the text generation that we focus on.
00:59:18.420 | And that's important.
00:59:20.380 | But I'm becoming increasingly convinced that the first sort of ubiquitous productivity gains
00:59:30.280 | from generative AI, so gains that are going to be cross-industry, are going to come from
00:59:35.200 | the symmetric ability of these models, which is to interpret text that's input.
00:59:42.080 | So it does both things.
00:59:43.760 | I can type something into ChatGPT.
00:59:45.540 | It can understand what I'm saying very well.
00:59:48.040 | And based on the understanding, it can produce text very well.
00:59:51.100 | I think the big next ubiquitous productivity gains are going to be based on the interpretation
00:59:55.540 | of text.
00:59:56.220 | And in particular, the ability of these models to be natural language interfaces to other
01:00:01.000 | software tools.
01:00:01.720 | So like the example I like to give is maybe I have a piece of software where I don't know
01:00:07.360 | how to use this software in an advanced way.
01:00:09.760 | It's like a spreadsheet.
01:00:10.780 | I don't really know how to do advanced analysis or data cleaning in the spreadsheet.
01:00:15.900 | An expert user might know.
01:00:17.100 | I don't know how to use it.
01:00:18.460 | I have a data analysis package.
01:00:20.920 | I don't really know how to do it.
01:00:22.720 | I know what I want it to do.
01:00:24.120 | You know, I want it to like take this data and like do a regression, but I don't know like
01:00:28.300 | what to click or where to pull or how to do this.
01:00:29.980 | I haven't been learning the software.
01:00:32.800 | This is a place where generative AI can help because you could just say in natural language,
01:00:37.020 | here's what I want to do.
01:00:38.340 | And what these models are very good at is translating between languages.
01:00:42.780 | So it can translate what you want to do from natural English language to some sort of highly
01:00:48.220 | structured macro machine language that the application understands.
01:00:52.520 | So I don't know.
01:00:53.300 | I want to take out all of the rows where column B has a dollar amount
01:00:57.360 | less than $5.
01:00:58.020 | And with the rows that remain, I want to build a pie chart that buckets them in intervals of
01:01:05.480 | $100.
01:01:06.640 | I just say that in natural language.
01:01:08.420 | And the language model takes that and then spits out on the other end, a bunch of sort
01:01:12.380 | of like very well formatted macro commands, which you can then feed to the spreadsheet.
01:01:16.520 | And the spreadsheet does that work for you.
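The spreadsheet example described here can be sketched in a few lines. This is a minimal illustration, with an invented JSON "macro" schema standing in for whatever structured command language a real spreadsheet would expose; the model's only job is to emit the macro, and plain deterministic code executes it:

```python
import json

# Hypothetical "macro" a small model might emit for the spoken request:
# "drop rows where column B is under $5, then bucket the rest in $100 intervals".
# This schema is invented for illustration, not a real spreadsheet API.
macro = json.loads("""
{
  "filter": {"column": "B", "op": ">=", "value": 5},
  "chart": {"type": "pie", "bucket_size": 100}
}
""")

rows = [{"B": 3}, {"B": 42}, {"B": 150}, {"B": 260}]

def apply_macro(rows, macro):
    """Deterministically execute the structured command the model produced."""
    f = macro["filter"]
    # Only ">=" is handled in this sketch; a real executor would dispatch on f["op"].
    kept = [r for r in rows if r[f["column"]] >= f["value"]]
    size = macro["chart"]["bucket_size"]
    buckets = {}
    for r in kept:
        low = (r[f["column"]] // size) * size
        buckets[(low, low + size)] = buckets.get((low, low + size), 0) + 1
    return buckets

print(apply_macro(rows, macro))  # counts per $100 bucket, ready to chart
```

The point of the sketch is the division of labor: the language model handles the fuzzy translation from English to the macro, and the spreadsheet's own machinery, not the model, does the actual filtering and charting.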
01:01:17.860 | That's where I think the low-hanging fruit is that's going to be plucked next, or at least
01:01:21.840 | could be plucked next.
01:01:22.720 | More important to me than you writing the email on my behalf is you helping me take
01:01:30.040 | advantage of the power of software tools that already exist.
01:01:33.480 | Those are low-hanging-fruit productivity gains for both individuals and organizations,
01:01:38.800 | because now you need fewer experts and fewer people.
01:01:42.840 | So that is what I'm keeping an eye on right now.
01:01:45.560 | One of the reasons why I think this is not being emphasized is that interpretation of text
01:01:50.840 | and translation of it into machine language doesn't require massive models.
01:01:54.380 | And if you're Anthropic, if you're OpenAI or Microsoft, right?
01:01:58.940 | If you're Google, you want massive models to be the thing that people care about.
01:02:03.460 | Because you're among the only companies that can afford to create these massive models.
01:02:06.940 | You really see the might and power of the massive models in production.
01:02:10.740 | Look at the graphic it created.
01:02:12.940 | Look at the very subtle text it created.
01:02:14.940 | Look at the code it created and like how the code compiles right away.
01:02:20.100 | Like that's where you really get into the power of it.
01:02:22.300 | But you don't need a 600 billion parameter model to take natural language commands for Excel
01:02:28.140 | and turn them into like spreadsheet commands.
01:02:29.920 | You could probably train a much smaller model to do that.
01:02:31.880 | And like a lot of companies could probably do that.
01:02:33.620 | So they want the focus to be on text outputted because that requires the fanciest models.
01:02:39.460 | But I really think this is the low-hanging fruit.
01:02:41.960 | And the reason why I'm pretty sure it'll be plucked is that it, again,
01:02:44.460 | it doesn't require a 30,000-GPU, you know, nuclear-power-plant-powered data center in order to train.
01:02:52.440 | Like much more modest models can be natural language translators.
01:02:56.160 | And so a smaller company can build their own or multiple companies can have their own version of these models for their particular tools.
01:03:03.280 | So that's the idea I want to throw out here now.
01:03:05.320 | Don't just focus on the ability of language models to generate text.
01:03:09.280 | Focus on their ability to interpret text.
01:03:11.100 | Small agile models that unlock the power of existing software tools, I think is going to be a big deal.
01:03:17.900 | So along these lines, like I think it's a misconception.
01:03:22.580 | Again, people want to think about the language model, for example, doing all the work.
01:03:28.400 | I think this is unnecessary as well.
01:03:30.420 | They want to think about, like, I have a bunch of data.
01:03:32.360 | I want to just input that data to my language model.
01:03:35.400 | And the language model will analyze it.
01:03:37.480 | Like it's going to move through a language model and a result will come out the other side.
01:03:40.340 | That's not really what we want.
01:03:42.140 | We don't want a language model that's so big and trained on so many things that
01:03:47.460 | you can give it a bunch of data and the model itself can actually do the statistical analysis.
01:03:52.060 | No, what you want is a really predictable, dependable, high quality statistical analysis software.
01:04:00.000 | And the language model, you tell it what you want to do with the data and then it tells the statistical software, here's the analysis we want to run.
01:04:07.680 | And then the statistical software does the analysis, right?
01:04:10.100 | Like that's what we really want.
01:04:11.500 | We want, and I think this, again, is where the low-hanging fruit is going to be,
01:04:14.320 | not massive models that just do everything you want them to do, but models that unlock the things that other software does for the average user.
01:04:21.180 | All right, so that's my big idea I'm throwing out there, trademark, trademark 2025.
01:04:27.640 | I don't have a catchy name for it, but keep that in mind.
01:04:29.760 | Look for natural language interfaces as like the next, maybe the first big killer app of this, not this oracular idea of just like I talk to this oracle and I give it data.
01:04:41.560 | It just does everything for me.
01:04:43.020 | I just don't think that's the most efficient way to get the near-future value out of generative AI.
01:04:48.640 | All right, so there's my tech corner idea of the week.
01:04:51.620 | And with that, I think we'll wrap up this episode.
01:04:54.280 | We'll be back next week with some more deep questions.
01:04:57.480 | But until then, as always, stay deep.
01:05:00.140 | If you like today's discussion about how we're getting dumber and what to do about it, you might also like episode 336, which was titled On Screens and Solitude.
01:05:09.540 | It has a lot of good ideas there as well for getting back in control of your own brain.
01:05:14.680 | Check it out.
01:05:15.600 | I think you'll like it.
01:05:16.620 | So the writer Derek Thompson, who I know and I like, has a big new feature article in The Atlantic right now.
01:05:21.780 | Many of you sent it to me, so you probably have heard of it.
01:05:25.260 | It's titled The Anti-Social Century.
01:05:28.580 | Americans are now spending more time alone than ever.
01:05:32.440 | It's changing our personalities, our politics, and even our relationship to reality.