ChatGPT Is Making You Dumber? - Here's Why It Might Be... | Cal Newport

Chapters
0:00 Should We Fear Cognitive Debt?
43:00 Can AI be creative?
47:00 What’s the smallest change I can make to address my disorganization?
51:45 How do I find time for personal projects?
64:41 How should I choose my next internship?
70:18 How did you develop your goal-setting philosophy?
73:25 Inbox Zero and Notion
82:34 A Thoreau Schedule
90:38 AI CEO’s hot takes on work
So last month, Ezra Klein went on the How I Write podcast. 00:00:04.200 |
The host, David Perel, asked Ezra about using AI for his writing, 00:00:08.840 |
and Ezra's answer generated some controversy. 00:00:12.100 |
It began when Ezra said the following, and I quote, 00:00:15.300 |
I think it is very dangerous to use ChatGPT in any serious way for writing. 00:00:21.500 |
Ezra then goes on to give some reasons for this claim. 00:00:25.360 |
And one of the reasons is that AI can help you rewrite or polish or check what you've written, 00:00:30.400 |
but it can't tell you whether the ideas themselves are good or not. 00:00:34.360 |
As Ezra elaborates, you have to be attuned to that voice in you. 00:00:38.860 |
It's like, not right, not right, not right, not right. 00:00:41.840 |
You're not trying to bypass that or get around it or get to where it's soft. 00:00:46.660 |
You're trying to get to the point where you're like, ah, got it right. 00:00:52.480 |
Ezra later adds, ChatGPT can't identify fundamentally wrong ideas. 00:00:58.200 |
When later asked specifically about AI and journalism, Ezra says, 00:01:03.600 |
I'm not completely against anything, and I have not, and not for lack of trying, 00:01:08.000 |
and I think not for lack of being informed or interested in the issue, 00:01:10.800 |
found a way that I consistently use AI in my work. 00:01:13.980 |
I'll sometimes use it right now as a replacement for Google searches, 00:01:21.400 |
but some people, including several I know who wrote into the show, 00:01:25.220 |
thought that his stance was nostalgic and out of touch, 00:01:28.500 |
like sticking with a typewriter in the age of word processors. 00:01:32.620 |
In their view, it's basically just a worse process that makes you slower at your craft, 00:01:39.800 |
and the future of writing must include a symbiotic relationship with AI. 00:01:44.360 |
So who is right on this point, Ezra or AI's defenders? 00:01:51.860 |
and that's when I saw that my good friend and longtime friend of the show, 00:01:55.480 |
the author Brad Stulberg, sent me an Instagram post, 00:01:59.320 |
which he had recently published about a new paper that came out of MIT 00:02:02.640 |
that took a closer look at the impact of using AI in writing. 00:02:07.900 |
This post hit a nerve online and it caught my attention too. 00:02:10.740 |
This paper, I would argue, points us towards a stronger, 00:02:14.040 |
broader argument about AI writing in our culture 00:02:30.020 |
and get into it on this topic for our deep dive today. 00:02:39.720 |
So tell me about this paper that you wrote about. 00:02:52.400 |
Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for an Essay Writing Task. 00:03:24.200 |
and I'm going to quote the researchers directly: 00:03:25.880 |
LLM users consistently underperformed 00:03:29.900 |
at neural, linguistic, and behavioral levels. 00:03:35.260 |
Those are the three things that they evaluated. 00:03:48.360 |
So their brains were using 47% fewer neural connections 00:03:56.400 |
they were getting a lot of help from the model. 00:04:05.720 |
couldn't quote anything from what they'd just written. 00:04:16.280 |
from the group that used a simple Google search 00:04:24.860 |
give subjective, qualitative analyses of the writing. 00:04:35.300 |
soulless, empty, lacking individuality, typical. 00:04:42.920 |
is perhaps the biggest finding from this study, 00:04:54.280 |
is absolutely more efficient than writing yourself. 00:05:31.800 |
We are mortgaging our future cognitive fitness, 00:05:40.260 |
It reminds me of the article I told you about 00:05:44.800 |
and it was about students using AI for writing. 00:05:55.320 |
and this was one of the points of that article, 00:06:05.140 |
Like, oh, this is somehow going to produce writing 00:06:08.440 |
It mainly seemed to be about strain reduction. 00:06:24.080 |
And it gave them all these moments of release from strain, 00:06:29.980 |
to get a sentence by just asking the model again and again. 00:06:32.960 |
But it was better than, from a strain perspective, 00:06:40.200 |
than to produce stuff from scratch with the brain. 00:06:52.720 |
And yet we often think about productivity enhancing tools 00:07:11.180 |
The metaphor that I used in the little mini essay 00:07:55.600 |
but you wouldn't accrue those health benefits. 01:00:24.320 |
not what he wanted to be working on, and so I think he just kind of got depressed. He got paid for that, though, right? They got paid for it, but it was just to try to keep the lights on, because this is not the same as having, like, a hit movie. So they had to fire a lot of people, and then they had a lot of labor issues; the animators had gone on strike. This might have been pre-war. That really disillusioned him; he had all this disillusionment happening for years. And so I think, when you're really driven like that and you suddenly are taken out of your life, the 01:00:24.320 |
things you were working on and succeeding on. 01:00:36.780 |
And that was actually sort of saved Disney with these parks eventually. 01:00:41.800 |
When you go to Disneyland, they have abundant opportunities for you to spend money. 01:00:49.700 |
It's almost as if they're trying to make money off of their guests. 01:00:54.680 |
I always say, what are the two most powerful ways to make money off of people? 01:01:01.280 |
A, convincing them that they might make a lot of money. 01:01:13.500 |
What else would you buy, though, other than a ticket and food and stuff? 01:01:20.300 |
Not me, but actually the guy who does my hair had to figure this out, right? 01:01:23.580 |
So we're in Star Wars World or whatever they call it at Disneyland. 01:01:27.420 |
They spent a billion dollars on just like this part of the park. 01:01:32.000 |
And we come across this thing where there's like a guy out there. 01:01:39.080 |
And they're like, oh, it's a make-your-own-lightsaber experience, right? 01:01:42.940 |
And they're like, oh, that sounds interesting. 01:01:57.480 |
So we're like, well, we're not doing that, right? 01:02:02.960 |
My wife's like, we're not spending $1,000 on lightsabers. 01:02:10.920 |
He's like, you know, at this gift shop though, you can build plastic ones and it's like $30 01:02:23.360 |
There's no, there was nothing behind that wall. 01:02:29.000 |
The entire point of that is to get you to buy the $30 lightsabers, right? 01:02:34.700 |
They're like, hey, your kids want to build lightsabers? 01:02:39.900 |
And then they're like, well, I guess you could, I guess you could do this $30 experience. 01:02:47.240 |
But I like this idea that it was there just for that, because it's a very effective way to get 01:02:52.980 |
No, it seemed like it was, I don't want to say it. 01:02:58.740 |
Some people were, I'm trying to say it nicely. 01:03:01.220 |
I think there were plenty of people in the Star Wars portion of the park who seemed like 01:03:09.100 |
And I would say they were, they're in their 20s, probably live with their parents and don't 01:03:22.220 |
That's like, that's probably how I describe it. 01:03:26.020 |
I think I could see like people abroad, like flying private over there and be like, all right, 01:03:31.740 |
But are they that interested in a lightsaber? 01:03:34.220 |
If they fly there from like Japan or something. 01:03:52.680 |
I mean, think about if you fly private there. 01:03:58.040 |
And then you do the private tour where you spend like a thousand bucks for your party 01:04:02.200 |
and you get to go to the front of every line. 01:04:05.100 |
But all it is, is they don't want to just say, if you're rich, you don't have to wait in line. 01:04:09.060 |
So it's like, oh, we have a private tour and our tour guides are able to bypass the lines. 01:04:13.600 |
But all they do is just let you get on the rides right away. 01:04:17.400 |
So yeah, you're spending a thousand bucks a day on that anyways. 01:04:26.080 |
Point is, who was the original question from? 01:04:30.640 |
Point is, Duncan, beware of lightsaber scams. 01:04:34.520 |
I think we handled that one perfectly, Jesse. 01:04:39.620 |
I have an internship in mainframe development with a rare and valuable skill. 01:04:43.700 |
Very few people have the skill set within my organization and many of them are nearing 01:04:47.880 |
retirement. However, it's long hours and often overlooked by management. 01:04:50.560 |
This fall, I have an opportunity to transition into a data analyst role for a third internship. 01:04:59.280 |
You know, I don't care as much about internships. 01:05:05.040 |
So the default there would be, yeah, do the data analyst one. 01:05:13.280 |
Like you're gathering information with these internships. 01:05:16.560 |
When it comes to actually choosing what you want to do for a job after the fact, especially 01:05:20.660 |
if these internships lead to offers, each thing you did an internship in opens up a job opportunity. 01:05:26.060 |
The question is always twofold when it comes to these career capital moves, right? 01:05:30.600 |
The building up rare and valuable skills and using them to construct your career. 01:05:34.000 |
Is there an opportunity in this job to build career capital? 01:05:37.600 |
So are there rare and valuable skills that I could develop? 01:05:40.080 |
For the mainframe job that your mainframe internship could lead to, the answer seems to be clearly yes. 01:05:45.340 |
You describe it as a rare and valuable skill. 01:05:47.880 |
Most people no longer know how to work with these mainframes, but you need to keep them up. 01:05:54.440 |
The second question you have to ask is, will there be an opportunity for me to cash in career 01:06:00.060 |
capital if acquired to gain some autonomy or control over how my career unfolds? 01:06:08.040 |
And that's what you would really have to assess here. 01:06:10.340 |
So if for whatever reason, this company is very rigid about people in their mainframe 01:06:17.200 |
development group, no, this is just what that job is. 01:06:24.840 |
Then the answer to that second question is no. 01:06:27.980 |
If on the other hand, you're like, look, yeah, they don't realize the value of this. 01:06:31.700 |
But if I got really good at that and was like, hey, I'm keeping up 10 of these systems 01:06:38.000 |
and now I want to change so that I'm doing this like remotely or I'm here once a week 01:06:42.800 |
And they're like, oh, OK, yeah, we don't want to lose you. 01:06:44.840 |
If you think there would be a chance to apply your capital to get leverage, then I think it 01:06:49.000 |
could be good whether or not they're recognizing the value right now. 01:06:52.160 |
The classic place where the second question trips up people is law partners. 01:06:57.260 |
There's an example from the book, So Good They Can't Ignore You. 01:07:01.240 |
In law, especially like big law, so like working at the big law firms in the big cities, you 01:07:06.540 |
for sure are building up a rare and valuable skill, right? 01:07:08.560 |
Because you are mastering a specific part of the legal code that is literally very valuable 01:07:17.060 |
You're using your law degree in your brain and your ability to like really work hard to 01:07:20.980 |
You build up a huge amount of career capital in law, but at big law firms, they give you 01:07:26.720 |
a very limited number of options for investing that career capital, right? 01:07:30.700 |
Really, the only option they give you is investing it into having more salary. 01:07:34.780 |
If you get good at this, you can become a partner and then you can become a managing partner and 01:07:39.940 |
what you're going to get in exchange for that is more money. 01:07:44.040 |
If you're going to be at this firm, here is the path and here are the expectations. 01:07:48.860 |
And the only thing you can open up by getting better is moving to the next step. 01:07:52.180 |
And so it's a classic example of a place where, yeah, you can build up a lot of career 01:07:55.700 |
capital, but they make it very hard for you to have flexibility in how you cash in that 01:08:00.180 |
career capital to take control of your career. 01:08:02.740 |
And it's why a lot of lawyers end up with good bank accounts, but really unhappy because 01:08:07.160 |
they got really good, but they have no choice with what to do with that except for just 01:08:11.120 |
And for a lot of people, other parts of their ideal lifestyle are then being trampled on because 01:08:17.800 |
So you always have to ask both of those questions. 01:08:19.460 |
So that is how you should evaluate the career opportunities to come out of these internships. 01:08:28.380 |
And that's what you're looking for: whatever has the strongest affirmative answer to both questions. 01:08:34.180 |
When it comes to your internships, yeah, do the other internship, right? 01:08:36.940 |
Like might as well open up more opportunities, learn more things. 01:08:42.500 |
What's going to matter is the choice you make for what job you go after. 01:08:45.480 |
And then once you do, once you have that job. 01:08:52.220 |
I mean, there are some options for cashing out your law career capital, but they're hard. 01:09:02.280 |
I know someone, for example, who successfully renegotiated, I'm not going to, I'll leave 01:09:07.680 |
the partner track and we can like keep my salary where it is. 01:09:11.360 |
And I'm going to do this many hours because it's billable. 01:09:15.080 |
So actually maybe the salary is going to be lower than I would be getting if I was doing 01:09:18.520 |
80 hours a week at billables or whatever, but I'm going to live remotely. 01:09:22.960 |
I'm going to work remotely and live somewhere else. 01:09:25.060 |
And where I'm living, what you're paying me for this like actual 40 hours of work I'm 01:09:29.660 |
doing each week is way better than what most people are being paid here for working 40 hours 01:09:34.540 |
It's not as prestigious and it's not on the track and I'm not going to make a million five, 01:09:37.960 |
but it's a good salary for a reasonable amount of work. 01:09:40.940 |
And now I can live in this other part of the country. 01:09:43.320 |
You've also seen people like try to put out their own shingles and then you can have some 01:09:48.620 |
But law, famously, is I think the classic example: you build a lot of capital, but you 01:09:53.580 |
only have one thing you're allowed to invest it in. 01:09:55.060 |
It's like the company store back in the old days of mining companies. 01:09:58.740 |
Like they would pay you a good wage to be a miner, but the only thing you could do with 01:10:05.520 |
You had to live in the company housing and you had to buy from the company store and they 01:10:09.560 |
That's how I think about some of those partnership track sort of elite jobs sometimes. 01:10:17.780 |
I've noticed that much of Cal's goal setting philosophy aligns with research on goal 01:10:23.420 |
hierarchies, top level subordinate goals down to specific goals for the current day. 01:10:29.020 |
Has Cal ever seen this research or explored similar evidence-based frameworks in shaping 01:10:33.040 |
his approach to goal setting or behavior change? 01:10:38.100 |
There is this research that often comes out of business schools and there's often long acronyms 01:10:43.660 |
and like here is the whatever 17 letter long word type of goal setting paradigm and it's 01:10:52.320 |
Like you give someone a framework, you're giving them structure to their thinking and planning 01:10:56.660 |
that they didn't have before and like it does better. 01:10:59.640 |
So no, I, I kind of find that research boring, but I would say this, the type of things you're 01:11:05.940 |
talking about, like multi-scale planning, et cetera. 01:11:11.000 |
Like what is time management in the end other than making intentional decisions about what 01:11:17.560 |
Like that's time management and there's a particular segment of people, sort of this 01:11:23.240 |
modern, like educated knowledge worker, spend a lot of time on a computer screen type people 01:11:27.620 |
where you have this tension between bigger picture things you want to work on that are going to 01:11:34.320 |
be long-term important for you and maybe for your professional prospects and the dizzying 01:11:39.640 |
whirlwind of digital distractions that just sort of like makes up your day to day. 01:11:44.660 |
Otherwise you're just in the whirlwind and nothing gets done. 01:11:47.380 |
And you get stuck and you get stagnant and you're basically trying to prove your worth 01:11:50.620 |
through your pseudo productivity, which is a young man's game and something that's not sustainable. 01:11:55.020 |
And so you need some way of making a smarter decision in the moment about what to do next 01:11:59.080 |
beyond just like who wants my attention right now. 01:12:01.540 |
And so having multiple scales of thinking about what's important helps you trickle big picture 01:12:07.800 |
ideas down to small picture decisions about what to come next. 01:12:10.160 |
To me, this is just common sense and it works well in practice, but I don't really see it 01:12:14.120 |
as goal setting so much as I see it as making smart time management 01:12:17.280 |
decisions in a current digital work culture where it is very hard to do that on the fly without structure. 01:12:27.540 |
I mean, if you're in a situation where you're not in a whirlwind of digital distraction workplace, 01:12:32.520 |
If you're working on one big project, you could run like Oliver Berkman's schedule, which is 01:12:37.080 |
basically deep work for three hours on the thing you're doing that's important to you. 01:12:41.360 |
And then just so like do your best with the rest of the day. 01:12:44.380 |
Keep up with stuff that's urgent and just try not to work too much. 01:12:48.360 |
If like maybe you're a professional writer, for example, or an independent thinker or something 01:12:53.820 |
But if you are trying to build career capital and find meaning and move the needle in a 01:12:58.720 |
sort of busy knowledge work job, I just, I think time management requires some care about 01:13:07.040 |
It requires some care in terms of how do you make the decision about what's the right usage 01:13:11.940 |
of my time and multiple scale seems to make sense. 01:13:14.340 |
So I don't know if there's a lot of research on it, but there is a lot of good experience 01:13:26.820 |
My name is Jamie Chalmers, a long time listener, first time caller. 01:13:31.820 |
They've been tremendously helpful in my creative career. 01:13:35.000 |
I've got several different projects that run concurrently. 01:13:39.000 |
And I was listening to your recent episode about Inbox Zero, which I thought was fascinating. 01:13:44.180 |
In particular, the method where you use Trello and your working memory text file 01:13:50.260 |
to capture information out of email and plug it into your card system. 01:13:54.360 |
Now, I've listened to that episode and you talking about that about half a dozen times 01:13:59.760 |
And while I've had some success in using the working memory file to get emails out of my 01:14:05.480 |
inbox, as it were, I've not had a lot of success in translating those into my Notion task system. 01:14:11.900 |
And so I kind of wondered whether you could just dig back into that a little bit more, 01:14:15.520 |
explain some of the mechanics, perhaps the head spaces that you get into when you're 01:14:19.320 |
focusing on translating that information into action. 01:14:23.140 |
That would be super helpful because I can see that it would be a really useful way to go 01:14:31.100 |
So thank you once again for everything that you do. 01:14:40.860 |
And I think part of like the hang up here might be that I go from inbox to working memory 01:14:47.860 |
And then from there, there's a lot of directions where that information can go. 01:14:58.080 |
So the basic idea here is how do you clear an inbox, right? 01:15:02.800 |
I don't recommend going message by message necessarily and trying to dispatch each message 01:15:09.220 |
completely before moving on to the next. 01:15:11.220 |
The context switching there becomes a real cognitive load. 01:15:16.280 |
It becomes, you know, the strain everyone knows, of like, why can't I just keep going through 01:15:22.300 |
It's because your mind can't keep switching from one topic to another so quickly. 01:15:36.360 |
I just put a summary of like what that message demands of me into a text file. 01:15:45.060 |
It doesn't have to be, you know, clean or interesting or something like that. 01:15:49.460 |
And then I have all of those in my text file. 01:15:52.960 |
In fact, I was thinking what I'm going to do, Jesse, is I'm going to load up an inbox right 01:15:58.560 |
So I can just give an example of how I'm going to read real messages from my inbox right now 01:16:05.880 |
and talk about like what would be my summary in my working memory.txt file. 01:16:10.980 |
So like I'm loading up, I have so many inboxes. 01:16:13.580 |
This is a personal inbox, an inbox where we do kind of behind-the-scenes 01:16:19.440 |
stuff for the podcast and for like my books or whatever. 01:16:28.160 |
Next one is a meeting-scheduling email for an appointment with a trainer. 01:16:42.260 |
Next one, ironically, is from a PT I used to work with checking in. 01:17:02.060 |
I got a scheduling email about a club I helped run at my kid's school. 01:17:06.220 |
So I just put, again, get back to blah, blah, blah about whatever. 01:17:13.040 |
Do you delete these emails after you make the note? 01:17:26.520 |
So when I say like, get back to blank, I put their name. 01:17:29.820 |
So now I know I can just type that name to Gmail and that email will come back. 01:17:40.720 |
It's a note from actually a well-known podcaster who I sent a note to and he got back to me. 01:18:12.780 |
Put such and such a schedule for two o'clock on Thursday. 01:18:21.420 |
And then you're going to note that that's actually kind of funny. 01:18:25.700 |
Like if I'm getting my haircut and we're like, oh, let's schedule the next one. 01:18:29.560 |
You had a lot going on at this hair appointment. 01:18:40.040 |
Anyways, I'll email myself on the fly because I know when I next leave my inbox, 01:18:45.440 |
that'll move to working memory dot TXT and then I'll take care of it. 01:18:56.240 |
When I say sort, I mean, I'm just copying and pasting text within a plain text file. 01:19:00.220 |
I'm going to put like the scheduling things all in a row. 01:19:02.980 |
So I just have a bunch of like scheduling things like in a row. 01:19:09.180 |
I can be like, all right, getting back to people. 01:19:10.560 |
And I'll put those like next to each other or whatever. 01:19:12.660 |
So I'm kind of like grouping these things by type, by type of message. 01:19:16.620 |
Or if there's a bunch of things on the same subject matter, like if I had multiple, I did 01:19:20.680 |
a lot of foreign press this week for some reason. 01:19:22.420 |
So like I would have like multiple things about foreign press. 01:19:29.080 |
Then I will go through and tackle these things by group. 01:19:34.180 |
And it's just easier when the groups are all the same type of thinking. 01:19:37.820 |
So now I have like in that list there, four or five scheduling things. 01:19:41.140 |
I'm going to open up my calendar, open up my inbox, like, all right, now we're doing 01:19:45.800 |
scheduling and doing this all at once has like a lot of advantages, right? 01:19:50.900 |
So most of those, first of all, are not going to end up on my task list. 01:19:55.240 |
I'm just going to go through one by one and do the scheduling. 01:19:57.660 |
And some of these, there's a link to a calendar. 01:20:00.200 |
And other ones I email back, like, here's the time I want to suggest. 01:20:05.000 |
So I'll go back and find those again in my inbox and like respond to them as needed. 01:20:11.040 |
But because I'm doing all this scheduling together, it allows me to have some extra 01:20:15.760 |
efficiencies as well, because I might be like, man, this is going to pepper my whole week; I don't want that. 01:20:23.000 |
I'm going to make like Friday afternoon, my like appointment call time next week. 01:20:30.060 |
And I'm just going to offer that time to everybody. 01:20:31.720 |
And that way the rest of this week will stay clear, right? 01:20:34.600 |
So seeing them all together, or I might say, I'm going to take these two things. 01:20:39.260 |
I'm going to punt them and be like, you know what? 01:20:40.840 |
I don't, let's, let's get at this later in the summer. 01:20:42.760 |
I was like, this is feeling like there's too many things. 01:20:44.400 |
When I see how many things I'm scheduling, I'm like, this is too much. 01:20:46.820 |
I'm going to punt two of these things. 01:20:52.940 |
And then I'm like, let me get back to people. 01:20:56.200 |
Again, those are things where I'm just responding. 01:20:57.880 |
Then some of these things will require actual putting a task, right? 01:21:02.200 |
Like the South Korean publicity thing. 01:21:04.240 |
Oh, this is like a non-trivial amount of work I need to do here. 01:21:08.540 |
So maybe I'm going to add this to my Trello as a task card. 01:21:12.980 |
Maybe I need to actually connect this to a deadline. 01:21:15.780 |
So I'm going to put the deadline on the calendar. 01:21:17.860 |
Like, this is like the drop dead deadline for getting these like answers back to this question. 01:21:22.760 |
And I'll paste the questions into a Trello card and move on from there. 01:21:25.920 |
So it's moving things from the inbox into hastily typed lines in a working memory.txt file. 01:21:31.420 |
Sorting those things manually into like groups and then dealing with those groups where I'm 01:21:36.640 |
either ignoring them or dealing with them right away, scheduling something on my calendar or 01:21:42.020 |
creating a task where the information, I'll point to the information. 01:21:44.940 |
And I still do it the old fashioned way where I just copy the subject line and put it in the task card. 01:21:49.740 |
So I know what to search for when I get to that task. 01:21:54.440 |
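For anyone who wants to tinker, that sort-then-batch step is easy to prototype. Here is a minimal Python sketch, assuming a hypothetical working_memory.txt where each hastily typed line starts with a type tag like "schedule:" or "reply:". The tag format is my own invention for illustration; Cal groups his lines by hand, not with a script.

```python
from collections import defaultdict

def group_working_memory(path="working_memory.txt"):
    """Group hastily typed lines by a leading 'type:' tag so that
    same-kind items (all scheduling, all replies, ...) can be
    processed together in one batch, as described in the episode."""
    groups = defaultdict(list)
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            # "schedule: trainer appointment" -> key "schedule"
            kind, _, note = line.partition(":")
            # Lines with no tag fall back to using the whole line.
            groups[kind.strip().lower()].append(note.strip() or kind)
    return dict(groups)
```

With all the scheduling notes collected under one key, you can open the calendar once and dispatch that whole group before moving on to the next, which is the point of the batching.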
I know people have been telling me there's ways to link directly to the message in Gmail. 01:21:58.100 |
People keep emailing me this and I, I keep ignoring it. 01:22:01.180 |
But yeah, there are more advanced ways of doing it. 01:22:06.100 |
It feels like an extra step, but I'm telling you, like, this is much easier mentally than 01:22:11.260 |
if I just went from each email to each email and tried to answer it until I was done with the inbox. 01:22:20.560 |
Now I'm trying to get back to this person about this. 01:22:24.200 |
It just becomes overwhelming and it's surprising that it does. 01:22:28.320 |
But underneath the covers, it's just context switching. 01:22:35.400 |
All right, TAF says, I was listening to you talk about the Thoreau schedule today and you 01:22:41.500 |
essentially described the working schedule I've managed to carve out for myself over recent 01:22:46.260 |
It resulted due to a variety of factors, including COVID and having a young family. 01:22:51.100 |
I have a pretty mid-range marketing job for a CPG brand, managing a small team. 01:22:55.800 |
And in my current role, the career capital I've developed enables me to take ownership 01:23:01.520 |
of my schedule, because the management team knows that I will deliver strong work regardless 01:23:07.940 |
I'll stop right there briefly to point out this is a common application of career capital. 01:23:13.280 |
If you have very valuable skills and you deliver, people trust you. 01:23:18.540 |
You do the things you say you're going to do when you say you're going to do them. 01:23:21.680 |
You are now a super rare commodity and there will be a lot of accommodations because, oh my 01:23:27.500 |
God, here's someone who actually does what they say they're going to do when they're going to do it. 01:23:33.080 |
So here now is the schedule that TAF created. 01:23:37.020 |
So that being said, my schedule borrows from many concepts you've discussed. 01:23:42.440 |
I wake at 6 a.m. and read, but not on my phone. 01:23:45.020 |
I go to my office at 7 a.m. and start my day clearing emails and checking in on core project 01:23:50.280 |
timelines to make sure there's nothing urgent on the cards. 01:23:52.620 |
I find that starting at 7 a.m. means I'm working before most of my colleagues are 01:23:57.220 |
logged on, which means there are very few distractions or meetings booked during this time. 01:24:00.360 |
And I can just focus on what I need to get done. 01:24:03.080 |
Then I have time blocked my calendar from 7:30 a.m. to 10:30 a.m. for my highlight. 01:24:07.720 |
This is a term that comes from the time dorks, where I focus on the one major priority deep 01:24:14.960 |
work project that is most critical to my day. 01:24:17.480 |
Then at 12 p.m., having completed five hours of work, I stop and work out. 01:24:22.840 |
And then once I come home, I cover off the last two to three hours of the day with any 01:24:28.180 |
shallow work admin meetings and sign off at 4 p.m. to be with my family. 01:24:31.320 |
In the evening after my daughter has gone to bed, my wife and I read for about one to two 01:24:36.060 |
I always have a number of books on the go, a novel, something nonfiction, something 01:24:39.820 |
theological, and a book of poetry so that I have plenty of choice for whatever mood I'm 01:24:43.280 |
I basically read until I fall asleep, and I like the idea of reading at the very start and end of the day. 01:24:52.180 |
I use Trello to organize my work and personal life. 01:24:55.140 |
I have one central board with my priorities for today, which contains anything critical. 01:25:02.120 |
Anything that sits in backlog for a long time eventually gets archived if nobody raises it again. 01:25:07.120 |
I treat my Outlook as if it's an actual mailbox, so I close it once I'm finished with it and 01:25:12.280 |
then go back to check it at specific intervals in the day. 01:25:15.000 |
I've found this reduces the impulse of reading emails as they arrive and constantly being distracted. 01:25:19.760 |
I also have a two-folder system for my email. 01:25:22.780 |
My inbox has incoming mail or emails I need to address, and then filed has everything else. 01:25:28.240 |
My reason is that once I've read an email, it can be filed, and if I need it again, I can search for it. 01:25:32.420 |
This has reduced all email admin and unnecessary folder assignments or rules. 01:25:36.840 |
Finally, I time block my work days with recurring meetings so that I'm automatically blocked off 01:25:43.440 |
every day for my morning highlight, my lunchtime 10K run, and my time with my family at 4 p.m. 01:25:50.520 |
This means every day has the same structure, and my calendar availability is prescribed for 01:25:55.260 |
I've never had any issues or pushback from colleagues with this approach, and people rarely book meetings 01:26:02.640 |
This is someone where it's not a sexy story of, I became the only nuclear physicist in the world that 01:26:10.100 |
could handle this, and therefore I have a job where I work one day a week and surf all day. 01:26:14.940 |
It's just someone who does their job very well. 01:26:19.840 |
They combine this with lifestyle-centric planning or lifestyle engineering or lifestyle architecting, 01:26:24.320 |
whatever we want to call it, to work backwards from their ideal lifestyle, and it's not something 01:26:27.960 |
that would catch your attention if you hear it described, but it is a really good lifestyle. 01:26:31.500 |
Reading in the morning, done at work at 4, doing a long run and workout in the middle of the 01:26:36.700 |
day, get stuff done, deep work every day, meetings are constrained, email checks are not all the 01:26:44.800 |
time, people are okay with it because they deliver, and the life is really good. 01:26:52.740 |
I appreciate you sending in that case study. 01:26:55.620 |
All right, so we have a good final segment coming up here. 01:26:58.840 |
We got some Cal Network artwork and a what I'm not reading segment to get into as well, but 01:27:04.780 |
first let's take a quick break to hear from some sponsors. 01:27:10.120 |
How many people do you think stopped me when I was at Disneyland recently to compliment how good my shave looked? 01:27:22.300 |
Now, whether or not that's actually true, I will say I have been enjoying shaving more 01:27:30.000 |
Harry's sends the best quality razors right to your door for a fraction of the price of drugstore brands. 01:27:35.160 |
We're talking about really good German-engineered blades, these nice, comfortable sort of rubber handles. 01:27:41.420 |
I really like them, but you also get shaving products, like excellent shaving cream, delivered as well. 01:27:47.960 |
One I like is that you can also get a richly lathering skin softening body wash in scents 01:27:58.020 |
Interestingly, Jesse, richly lathering skin softening is how a lot of people describe 01:28:06.840 |
You set it up so you don't have to remember to go to the store. 01:28:09.440 |
You don't have to add another task to your to-do list. 01:28:11.660 |
Before I started using Harry's, I was basically in the Stone Age when it came to shaving. 01:28:17.800 |
I might as well have been scraping my face with a piece of sharpened flint. 01:28:25.020 |
Normally their trial set is $10, but right now you can get it for just $6 if you go to harrys.com slash deep. 01:28:34.880 |
That's our exclusive link, harrys.com slash deep to get a $6 trial set. 01:28:42.380 |
I also want to talk about our friends at ShipStation. 01:28:44.660 |
If you ask Jesse where all of these packages he orders from e-commerce stores come from, 01:28:51.280 |
he will say, and this is absolutely true, Jesse, you can back me up on this, a stork delivers them. 01:28:58.940 |
But for those of you who actually run an e-commerce store, you know packages don't arrive by stork. 01:29:04.240 |
For a lot of companies, it is a huge source of labor and stress. 01:29:08.220 |
This is where ShipStation can make your life so much easier. 01:29:12.900 |
Last year alone, over 700 million orders were fulfilled with ShipStation, half of which were 01:29:22.520 |
With ShipStation, you can sync orders from everywhere you sell into one dashboard. 01:29:27.720 |
You have one dashboard and you can replace manual tasks with custom automations to reduce shipping 01:29:32.800 |
And all of this is at a fraction of the cost you probably were already spending for shipping. 01:29:37.640 |
Now, here's a couple of things about ShipStation that caught my attention. 01:29:40.520 |
It's the fastest, most affordable way to ship products to your customers because it has 01:29:44.340 |
discounts up to 88% off UPS, DHL Express, and USPS rates and up to 90% off FedEx rates. 01:29:53.460 |
It also seamlessly integrates with the services and selling channels you are already using so you can manage all of your orders over multiple different channels and systems in one easy-to-see dashboard. 01:30:08.920 |
During the time I've been reading this spot, 1,400 packages were shipped with the help of ShipStation and none of them, Jesse, were delivered by a stork. 01:30:21.840 |
Go to ShipStation.com slash deep to sign up for your free trial. 01:30:25.680 |
No credit card or contract is required and you can cancel any time. 01:30:36.100 |
All right, so in our final segment, I want to do a twist on my what I'm reading segment that's called what I'm not reading. 01:30:43.560 |
But first, Jesse, I wanted to quickly cover a new piece of Cal Network artwork that came in. 01:30:50.160 |
I do have to sort of ding this person a little bit because a lot of people have been making sort of like custom artwork. 01:30:57.020 |
In this case, they just took an existing photo of me and added to the existing photo a copy of a fake Cal Network book, but I still think it is interesting. 01:31:06.720 |
We'll put it on the screen here for people who are watching instead of just listening. 01:31:10.420 |
And what we've got here is a – this is my author photo actually. 01:31:22.980 |
Veining Out, as I constantly do, with Austin Aviator Sunglasses, holding Cal Network's hit bestselling book, I'm the Boss Now, which is his career advice book. 01:31:39.060 |
So I always appreciate some good Cal Network artwork. 01:31:43.520 |
So to our final segment, I have been doing recently what I'm reading where I talk about like interesting articles and books that either cover interesting ideas you might like or I think are just good books or articles for those looking for some more depth. 01:31:56.560 |
I'm going to call this segment what I'm not reading. 01:31:58.620 |
It's a particular type of article that's been common recently, and I want to give you permission to ignore every single one of them because they're nonsense and are going to cause unnecessary stress. 01:32:07.360 |
I have loaded here a sample of one of these type of articles that I'm going to suggest you don't read. 01:32:13.260 |
I'll put it on the screen here for those who are watching instead of just listening. 01:32:17.620 |
The article is from Fortune, and here's the headline. 01:32:22.680 |
OpenAI CEO Sam Altman says AI can rival someone with a PhD just weeks after saying it's ready for entry-level jobs. 01:32:36.480 |
This is an article – I saw this because a Google notification just showed it to me on my phone. 01:32:42.460 |
There are a lot of articles like this where there is some sort of over-the-top alarmist headline which makes anyone who's just loosely following AI be like, oh, my God, we're so screwed. 01:32:51.740 |
But I want to look a little bit deeper at this article to try to explain why you can ignore this type of thing. 01:32:59.060 |
So if we read this article, I have a couple quotes here. 01:33:04.200 |
Earlier this month, OpenAI CEO Sam Altman revealed that the technology can already perform the task equal to that of an entry-level employee. 01:33:13.120 |
Now on a podcast posted just last week, the ChatGPT mastermind went even further, saying AI can even perform tasks typically expected of the smartest grads with a doctorate. 01:33:23.680 |
Soon after, it says, companies like Amazon have admitted they will soon cut their corporate ranks thanks to AI, and Anthropic CEO Dario Amodei is warning that the technology could wipe out half of all entry-level white-collar jobs. 01:33:36.860 |
It begs the question, what jobs will be left for those tossing their graduation caps into the air in the coming years? 01:33:43.580 |
There are a lot of articles like this where tech CEOs are saying pearl-clutching, groin-tightening, scary type of things like this. 01:33:58.340 |
First of all, let's start with the claim here that, and I'm going to read this, AI can perform tasks typically expected of the smartest grads with a doctorate. 01:34:09.640 |
What they're actually talking about here is the fact that OpenAI carefully tuned one of their foundational models to work on a specific type of math competition question. 01:34:20.060 |
So, and we got some correction on this from a listener. 01:34:25.060 |
They used a data set from a company that hired math PhDs and paid them $100 an hour to write math problems of this type with step-by-step solutions. 01:34:35.940 |
And then they could use reinforcement learning techniques to try to tune one of these foundation models to do well on this very specific type of math problem. 01:34:43.300 |
A professor involved in this project said, oh, these are hard math problems, not the type you would assign an undergrad, but the type you might assign a graduate student. 01:34:52.340 |
From there, we get: AI can now do the jobs of PhD-level workers. 01:34:59.420 |
That is a nonsense leap, from a very specific type of math competition problem, of the type that might get assigned in a graduate student problem set, to AI is doing graduate-level jobs. 01:35:15.420 |
And not only is it a huge exaggeration, but as we went over in the What I'm Reading segment a few weeks ago, 01:35:22.800 |
even those results are highly contested because OpenAI was like, look, these are hard problems and we can do them. 01:35:29.660 |
And independent research firms said, great, let's take these models and we'll put them on other similar math competitions that happened recently. 01:35:37.020 |
And they did terribly, leading to the idea that they had been very, very fit to this very specific type of problem that happened to be really well suited for the reinforcement learning techniques that we know how to use now on AI. 01:35:50.840 |
Another massive exaggeration that's happening with a lot of this reporting is the conflation of AI with the post-pandemic tech downturn, which is leading to lots of layoffs. 01:36:01.640 |
It's getting a little bit better now, but it's leading to lots of layoffs because there was a huge boom in the tech industry during the pandemic. 01:36:13.280 |
We put a lot of money into a lot of these areas. 01:36:15.900 |
We need to cut back because we need our profit ratios to be higher for the stock market. 01:36:25.360 |
And there's a lot of this sort of disingenuous conflating by these reporters that make it seem like, without maybe necessarily 100% claiming it, that these job losses are because AI is automating the jobs. 01:37:00.320 |
They are not replacing hundreds of thousands of people or tens of thousands of people with AI that's automating them. 01:37:11.440 |
I saw an article the other day that said, look, CS majors are cratering. 01:37:20.540 |
CS majors are down because the tech industry is down. 01:37:25.100 |
The same thing happens every time the tech industry goes down. 01:37:28.700 |
When I was a computer science undergraduate, we had the same issue. 01:37:33.720 |
Post .com bust in early 2001, the tech industry contracted. 01:37:39.380 |
Majors went down because there were fewer jobs. 01:35:43.280 |
And then majors went back up again when the Web 2.0 boom brought a lot more investment in 01:37:51.580 |
and made jobs. Right now, jobs, except for, you know, specialized AI jobs, like if you're a machine learning person, are contracting some. 01:37:59.200 |
And yet people will conflate it and say, oh, it's because AI is taking the jobs. 01:38:07.440 |
Most of the cuts at these companies are not software developer jobs. 01:38:10.780 |
So I think it's really disingenuous reporting, but there's a lot of this going on. 01:38:14.640 |
Later in the article, we see some of this evidence. 01:38:18.560 |
So here, for example, they do note, and I'm now reading from later in the article: 01:38:26.100 |
However, in the tech industry in particular, volatility in the jobs market is nothing new, said Art Zeile, CEO of the tech career platform Dice. 01:38:32.700 |
After all, nearly 600,000 tech employees lost their jobs between 2022 and 2024, according to Layoffs.fyi. 01:38:39.740 |
Yes, the tech industry has really post-pandemic contracted. 01:38:45.500 |
That's not because in 2022, before ChatGPT was out, AI was taking those jobs. 01:38:51.320 |
It contracted, and then it will grow again once more investment capital comes back in. 01:38:57.180 |
Look at, like, Dario Amodei saying technology could wipe out half of all entry-level white-collar jobs within, like, two years, whatever he said. 01:39:05.120 |
It's not going to do it, and he's just blowing hot air. 01:39:07.640 |
And Sam Altman knows that AI is not taking PhD-level jobs because it's not really taking any jobs right now. 01:39:18.960 |
Because it requires a massive amount of investment capital, ongoing investment capital, to keep these places afloat. 01:39:25.620 |
OpenAI is desperately in need of this massive loan from SoftBank, this huge, many, many billion-dollar loan, which is happening in tranches, to go through. 01:39:38.680 |
And the more you feel like these tools are going to, in the future, be immensely disruptive, the more you overlook right now that they're losing $3 to $4 billion a year. 01:39:47.480 |
So it is in the interest of the CEOs of tech companies to push any possible narrative of very large world-shaking disruption because you will overlook any issues with their business right now if you think where they're going is going to be the biggest disruption that we've had sort of like in the history of humanity. 01:40:07.880 |
So they are now just spewing anything they can get attention for because they know they will get articles like this. 01:40:13.900 |
They know they can look at a model fine-tuned to do well on math problems, but not really because when other people tested it, it didn't, and say, well, now even PhD-level jobs are gone. 01:40:24.320 |
That is a nonsense leap, just like it was a nonsense leap when Sam Altman talked about how AI was going to help us build Dyson spheres around the sun to power the growth of the universe. 01:40:35.580 |
Just like it was a massive leap when Scott Alexander very confidently claimed in AI 2027 that, well, yeah, pretty soon the AI is not only going to be able to program its own superintelligences, it's going to build self-replicating robots that will build the data centers, and this will all happen in the next three years. 01:40:52.440 |
So anyways, I wanted to do a little bit of a rant and say, you can feel confident ignoring these super alarmist headlines that don't reflect anything you've seen in your own life or your own industry or career, 01:41:06.700 |
the ones saying all these jobs are gone and quoting some sort of tech CEO. 01:41:12.800 |
Follow the more serious tech reporting and tech journalism, but also just wait to actually see major changes in your own world. 01:41:20.260 |
I think the hype coming out of these CEOs is becoming almost like parody, and this has now entered my list of what I'm not reading, is alarmist articles about whatever the latest chaos, crazy, nonsensical hot air thing that Sam Altman has said, and will be widely repeated by outlets looking to get people to click. 01:41:42.020 |
Wait, so do you have notifications on your phone? 01:41:44.160 |
I don't know, okay, let me rant about this. 01:41:48.060 |
So you have to have this Google app on your phone. 01:41:52.340 |
They force you to have it. I had to get it because I couldn't consistently log in to Gmail. 01:42:08.140 |
So something about like the two-factor authentication. 01:42:10.020 |
You have to have the Google app to use, like, Google things. 01:42:14.360 |
And so that app just like shows me, I don't know, notifications, I guess. 01:42:19.280 |
Like it just will pop up, like these articles will pop up in like little bars on my iPhone. 01:42:23.160 |
But I can't get rid of the Google app because then I can't use my email. 01:42:37.080 |
But anyways, because I read a lot of AI content in my job, when I do tech journalism, it's always showing me these nonsense articles. 01:42:45.360 |
It's always like, Dario Amodei says you have 17 minutes left to live before AI-driven robots harvest your brains to lubricate their self-replicating machines. 01:42:58.640 |
It's just like these type of headlines again and again and again. 01:43:00.920 |
So I see a lot of this nonsense and it's what I am not reading. 01:43:04.220 |
Because again, every time you look at it, it's like some super exaggerated claim like here, or they just make it up. 01:43:10.320 |
Amodei is just like, half the jobs will be gone next year. 01:43:14.140 |
And the reporter's like, oh, that sounds bad. 01:43:19.620 |
And you're like, well, how is that going to happen? 01:43:25.700 |
I should just do reports like as the deep media LLC head and be like, we're going to be able to fly robotic pterodactyls next month. 01:43:38.560 |
Bees with lasers are going to start murdering our dogs by 2028. 01:43:53.840 |
There's the RoboBee project that, you know, happened at Harvard 20 years ago. 01:44:00.600 |
I sometimes feel like, there's so much interesting stuff happening with AI, so many particular impacts, good stuff and bad stuff. 01:44:08.160 |
There's so much serious reporting happening that I hate that all this junk is getting out there. 01:44:12.120 |
I mean, the robot bee thing is true, so be careful about that. 01:44:17.340 |
We'll be back next week with another episode of the podcast. 01:44:24.200 |
Well, if you're looking for a counterbalance to my rant against Sam Altman's claims about AI, check out episode 349, where I actually look at Sam Altman's productivity techniques from earlier in his career, where I think there are actually a lot of good ideas. 01:44:40.580 |
So, you can't deny this guy was successful, even if we don't trust what he's saying now. 01:44:44.000 |
Check out that episode about how he got there. 01:44:47.840 |
So, as someone who writes and talks a lot about producing meaningful stuff in a distracted world, I always get excited when prominent individuals give us insight into their own processes for achieving this goal. 01:45:01.000 |
So, you can imagine how happy I was when I saw Tim Ferriss recently linked to a blog post that was titled simply Productivity that was published in 2018 by OpenAI's Sam Altman.