
ChatGPT Is Making You Dumber? - Here's Why It Might Be... | Cal Newport


Chapters

0:00 Should We Fear Cognitive Debt?
43:00 Can AI be creative?
47:00 What’s the smallest change I can make to address my disorganization?
51:45 How do I find time for personal projects?
64:41 How should I choose my next internship?
70:18 How did you develop your goal-setting philosophy?
73:25 Inbox Zero and Notion
82:34 A Thoreau Schedule
90:38 AI CEO’s hot takes on work

Whisper Transcript

00:00:00.000 | So last month, Ezra Klein went on the How I Write podcast.
00:00:04.200 | The host, David Perell, asked Ezra about using AI for his writing,
00:00:08.840 | and Ezra's answer generated some controversy.
00:00:12.100 | It began when Ezra said the following, and I quote,
00:00:15.300 | I think it is very dangerous to use ChatGPT in any serious way for writing.
00:00:21.500 | Ezra then goes on to give some reasons for this claim.
00:00:25.360 | And one of the reasons is that AI can help you rewrite or polish or check what you've written,
00:00:30.400 | but it can't tell you if the ideas themselves are good or not.
00:00:34.360 | As Ezra elaborates, you have to be attuned to that voice in you.
00:00:38.860 | It's like, not right, not right, not right, not right.
00:00:41.840 | You're not trying to bypass that or get around it or get to where it's soft.
00:00:46.660 | You're trying to get to the point where you're like, ah, got it right.
00:00:49.440 | And that's usually intellectual labor.
00:00:52.480 | Ezra later adds, ChatGPT can't identify fundamentally wrong ideas.
00:00:58.200 | When later asked specifically about AI and journalism, Ezra says,
00:01:03.600 | I'm not completely against anything, and I have not, and not for lack of trying,
00:01:08.000 | and I think not for lack of being informed or interested in the issue,
00:01:10.800 | found a way that I consistently use AI in my work.
00:01:13.980 | I'll sometimes use it right now as a replacement for Google searches,
00:01:17.140 | and it's valuable for that.
00:01:19.480 | A lot of people appreciated Ezra's points,
00:01:21.400 | but some people, including several I know and who wrote into the show,
00:01:25.220 | thought that his stance was nostalgic and out of touch,
00:01:28.500 | like sticking with a typewriter in the age of word processors.
00:01:31.000 | A few cranky people might do it,
00:01:32.620 | but it's basically just a worse process that makes you slower at your craft.
00:01:37.340 | The future of writing, they're convinced,
00:01:39.800 | must include a symbiotic relationship with AI.
00:01:44.360 | So who is right on this point, Ezra or AI's defenders?
00:01:49.320 | Well, this question has been on my mind,
00:01:51.860 | and that's when I saw that my good friend and longtime friend of the show,
00:01:55.480 | the author Brad Stulberg, sent me an Instagram post,
00:01:59.320 | which he had recently published about a new paper that came out of MIT
00:02:02.640 | that took a closer look at the impact of using AI in writing.
00:02:07.900 | This post hit a nerve online and it caught my attention too.
00:02:10.740 | This paper, I would argue, points us towards a stronger,
00:02:14.040 | broader argument about AI writing in our culture
00:02:16.220 | that shows Ezra is perhaps onto something.
00:02:18.560 | Anyway, here's the good news.
00:02:19.920 | In a coincidence of timing,
00:02:21.900 | Brad is here in DC visiting his family.
00:02:25.860 | So I asked him if he would swing by the HQ
00:02:28.100 | to talk with me about this paper
00:02:30.020 | and get into it on this topic for our deep dive today.
00:02:33.240 | And fortunately, he agreed.
00:02:35.340 | So we are joined today by Brad.
00:02:37.080 | Brad, thanks for coming.
00:02:37.920 | Hey, it's great to be here, Cal.
00:02:39.720 | So tell me about this paper that you wrote about.
00:02:43.460 | What is it?
00:02:44.240 | Who wrote it?
00:02:44.860 | What's the big points?
00:02:45.980 | The paper is called
00:02:47.840 | Your Brain on ChatGPT:
00:02:50.120 | Accumulation of Cognitive Debt
00:02:52.400 | When Using an AI Assistant for an Essay Writing Task.
00:02:56.780 | The paper came out of MIT
00:02:58.960 | and the highest level summary is
00:03:02.580 | for four months,
00:03:03.760 | a team of researchers from the MIT Media Lab
00:03:07.940 | followed 54 study participants
00:03:10.660 | who were using ChatGPT
00:03:13.840 | to heavily assist them in writing essays.
00:03:18.440 | And the top line finding,
00:03:24.200 | and I'm going to quote the researchers directly,
00:03:25.880 | is that LLM users consistently underperformed
00:03:29.900 | at neural, linguistic, and behavioral levels.
00:03:33.680 | So those are the three things.
00:03:35.260 | Those are the three things that they evaluated.
00:03:36.840 | On the neural level,
00:03:38.440 | EEG brain studies found that
00:03:41.500 | there was a 47% reduction in brain activity
00:03:45.580 | with the heavy ChatGPT users.
00:03:48.360 | So their brains were using 47% less neural connections
00:03:53.320 | as they were quote-unquote writing.
00:03:55.640 | I guess in this case,
00:03:56.400 | they were getting a lot of help from the model.
00:03:58.320 | Behaviorally, researchers found that
00:04:01.700 | 83% of the heavy model users
00:04:05.720 | couldn't quote anything from what they'd just written.
00:04:08.960 | This compares to about 10%
00:04:11.300 | from the group that used no technology,
00:04:13.760 | and very similarly about 13%
00:04:16.280 | from the group that used a simple Google search
00:04:18.860 | to aid them in their writing.
00:04:20.200 | And then on the linguistic level,
00:04:22.560 | they had neutral arbiters
00:04:24.860 | give subjective, qualitative analyses of the writing.
00:04:29.820 | And words that came up repeatedly
00:04:32.080 | to describe the AI writing was,
00:04:34.760 | and I'm quoting,
00:04:35.300 | soulless, empty, lacking individuality, typical.
00:04:41.200 | Now, the last thing that I'll say
00:04:42.920 | is perhaps the biggest finding from this study,
00:04:45.860 | and it's right there in the title,
00:04:47.300 | this term cognitive debt.
00:04:48.840 | And what the researchers found is that,
00:04:50.960 | yes, using ChatGPT to write for you
00:04:54.280 | is absolutely more efficient than writing yourself.
00:04:57.720 | You don't have any cognitive strain.
00:04:59.080 | It's not hard.
00:04:59.840 | You just put the prompt in,
00:05:01.100 | you massage it a little bit,
00:05:02.220 | and it's done.
00:05:03.140 | And they looked at the time required,
00:05:05.080 | and they finished earlier.
00:05:06.420 | Of course they finished earlier,
00:05:07.640 | because they're just prompting the model.
00:05:08.960 | So, yes, you have this short-term benefit,
00:05:12.960 | if you will,
00:05:13.540 | of being much more efficient.
00:05:14.680 | But what you pay in the long term
00:05:17.680 | to get that benefit
00:05:18.860 | seems like tenfold the cost
00:05:20.980 | in decreased cognitive fitness
00:05:23.460 | and cognitive health,
00:05:24.280 | and that's the term cognitive debt.
00:05:25.860 | They essentially argue
00:05:27.000 | that when we use AI to write for us,
00:05:28.760 | we pick up a fair amount of cognitive debt.
00:05:31.800 | We are mortgaging our future cognitive fitness,
00:05:34.520 | really our ability to think for ourselves
00:05:36.620 | for this sort of short-term efficiency.
00:05:38.820 | I mean, this is interesting.
00:05:40.260 | It reminds me of the article I told you about
00:05:43.120 | that I wrote last year for The New Yorker,
00:05:44.800 | and it was about students using AI for writing.
00:05:47.880 | And I looked virtually over the shoulder
00:05:50.660 | of a graduate student writing a paper.
00:05:53.260 | And that's what surprised me,
00:05:55.320 | and this was one of the points of that article,
00:05:57.020 | is it wasn't about speed for the student.
00:06:01.260 | It wasn't about producing text
00:06:04.340 | they otherwise couldn't.
00:06:05.140 | Like, oh, this is somehow going to produce writing
00:06:07.460 | I'm not ready to produce.
00:06:08.440 | It mainly seemed to be about strain reduction.
00:06:11.140 | That's what really seemed to be going on
00:06:13.620 | is that writing is a pain.
00:06:17.060 | Quite literally, like cognitively,
00:06:18.940 | it doesn't feel good to write.
00:06:20.400 | And they would go back and forth with the,
00:06:22.620 | what about this, what about that?
00:06:24.080 | And it gave them all these moments of release from strain,
00:06:27.220 | or sometimes it would take them a long time
00:06:29.980 | to get a sentence by just asking the model again and again.
00:06:32.960 | But it was better than, from a strain perspective,
00:06:35.860 | than the de novo construction, right?
00:06:37.740 | It's easier to ask questions of a model
00:06:40.200 | than to produce stuff from scratch with the brain.
00:06:44.180 | But, okay, but this is interesting.
00:06:45.460 | This cognitive debt idea,
00:06:46.780 | it's arguing there is this huge benefit
00:06:51.120 | that comes from writing.
00:06:52.720 | And yet we often think about productivity enhancing tools
00:06:55.240 | and making things easier.
00:06:56.100 | How do you understand
00:06:58.080 | what is the dividing line between activity
00:07:00.980 | where you want it to be hard
00:07:02.440 | versus the hardness is a bug,
00:07:06.260 | not a feature?
00:07:07.200 | I think that's at the core of this argument.
00:07:08.600 | And I think you're spot on.
00:07:11.180 | The metaphor that I used in the little mini essay
00:07:14.240 | that I wrote for Instagram
00:07:15.660 | is this notion of fitness.
00:07:19.000 | So if you think about physical fitness,
00:07:21.840 | there's a whole manner of reasons
00:07:24.460 | that one would train their body.
00:07:25.800 | But let's just focus on physical health.
00:07:27.560 | We know that lifting weights, running,
00:07:29.380 | pushing yourself physically
00:07:30.660 | is good for your health.
00:07:32.420 | You live longer is associated
00:07:34.520 | with reduced risk of certain diseases,
00:07:37.260 | both physical and cognitive and so on.
00:07:40.380 | Now, if you went to the gym
00:07:42.560 | with a forklift or some other apparatus
00:07:46.120 | to lift the weight for you,
00:07:47.880 | you could spend the same hour in the gym,
00:07:49.920 | but it would be much more comfortable.
00:07:51.500 | You wouldn't be straining.
00:07:52.620 | And that gym session would be a lot easier,
00:07:55.600 | but you wouldn't accrue those health benefits.
00:07:58.440 | And I think essentially what's happening
00:08:01.060 | when we over-rely on a tool like ChatGPT
00:08:04.060 | to write for us
00:08:04.900 | is we're sacrificing our cognitive fitness.
00:08:07.360 | We are replacing the strain of thinking,
00:08:10.340 | of writing, with a forklift.
00:08:14.020 | In this sense, we're replacing it with a forklift.
00:08:17.340 | And in this case,
00:08:17.940 | it's the model that does the work for us.
00:08:19.940 | And I think the result is a decline
00:08:22.340 | in cognitive health and cognitive fitness.
00:08:24.260 | And that's precisely what this paper showed.
00:08:26.360 | Another point I'll make real quick, Cal,
00:08:27.740 | before turning it back to you,
00:08:28.880 | is the researchers also found
00:08:30.860 | that when they took ChatGPT away
00:08:34.320 | from the group that had been using it
00:08:36.060 | for four months,
00:08:36.780 | they wrote worse than people
00:08:39.680 | who had never used ChatGPT before.
00:08:41.640 | This is scary.
00:08:42.600 | Exactly.
00:08:43.680 | That's interesting.
00:08:44.900 | So you're losing familiarity
00:08:47.700 | with the mental processes
00:08:48.820 | required to produce thought.
00:08:50.520 | I mean,
00:08:51.040 | something that comes to mind here
00:08:53.060 | is when I wrote that article
00:08:53.920 | for The New Yorker,
00:08:54.580 | I went deep
00:08:56.020 | on what happens in your brain
00:08:57.720 | when you write.
00:08:58.960 | And it is really complicated, right?
00:09:00.860 | Because writing is unnatural.
00:09:02.000 | We're not evolved to write.
00:09:03.320 | Written communication
00:09:04.240 | is a cultural invention,
00:09:05.200 | not an evolutionary invention.
00:09:07.000 | So in order to write,
00:09:08.420 | we have to hijack
00:09:09.360 | a lot of different parts of our brain
00:09:10.580 | and get them to work together.
00:09:12.100 | There's like visual cortexes involved.
00:09:14.440 | There's spatial reasoning involved.
00:09:15.720 | I didn't learn this
00:09:16.420 | until I read this.
00:09:17.160 | But there's a spatial reasoning component,
00:09:19.480 | the same part of your brain
00:09:20.440 | you use to keep a map in your head,
00:09:22.180 | like how do I get from here
00:09:23.420 | back to the watering hole?
00:09:25.100 | That gets used
00:09:26.020 | to spatially position
00:09:27.320 | the outline of your arguments
00:09:28.760 | in your heads.
00:09:29.540 | You're trying to understand
00:09:30.240 | where you are
00:09:31.020 | in an abstract argument.
00:09:32.200 | The visual cortex
00:09:33.500 | brings up images.
00:09:34.460 | The linguistic centers
00:09:35.580 | bring up like words.
00:09:36.760 | The auditory centers
00:09:38.480 | like convert those words
00:09:39.380 | to sounds.
00:09:40.000 | You can think about
00:09:41.160 | how to put them into words.
00:09:42.040 | I think I used the term
00:09:43.380 | as like a symphony
00:09:44.140 | being conducted
00:09:45.660 | in your brain.
00:09:46.800 | This is probably
00:09:48.280 | all very beneficial.
00:09:48.980 | We know this
00:09:49.660 | in the context of reading.
00:09:51.420 | So there's research
00:09:52.480 | on what reading does for us
00:09:53.880 | and writing is just
00:09:54.420 | reading in reverse.
00:09:55.560 | You're producing the words
00:09:57.100 | instead of reading it.
00:09:57.660 | But we know reading,
00:09:58.420 | the research from Maryanne Wolf
00:09:59.680 | among others argues,
00:10:00.780 | we develop these deep processes,
00:10:04.240 | deep reading processes,
00:10:05.300 | which creates stronger connections
00:10:08.160 | between all these brain areas.
00:10:09.240 | Once they've been formed,
00:10:10.600 | then you can access them
00:10:12.220 | more easily.
00:10:12.680 | Now you can access
00:10:13.540 | and string together
00:10:14.180 | different parts of your brain
00:10:15.140 | and it allows for
00:10:15.780 | more advanced reasoning,
00:10:16.540 | more sophisticated reasoning,
00:10:17.540 | more empathy, etc.
00:10:18.460 | So I guess this is
00:10:20.520 | probably what they're seeing
00:10:22.080 | when EEG activity goes down
00:10:25.240 | while using ChatGPT:
00:10:26.200 | you're probably taking
00:10:27.300 | whole parts of your brain offline
00:10:29.700 | that you otherwise
00:10:30.220 | would be using
00:10:31.040 | during the writing process
00:10:32.020 | and so those connections
00:10:32.860 | aren't getting strengthened.
00:10:33.660 | So like the equivalent
00:10:34.400 | of the muscle getting stronger
00:10:35.420 | is probably literally neurons
00:10:36.940 | wiring together
00:10:38.420 | because they were firing together.
00:10:39.960 | And so if we think about it
00:10:41.600 | like cognitive weightlifting,
00:10:42.480 | yeah, I think you're
00:10:43.500 | absolutely right.
00:10:44.140 | Like those muscles
00:10:45.420 | aren't getting worked.
00:10:47.320 | But that's what I was
00:10:48.360 | trying to figure out.
00:10:48.840 | What's the context
00:10:49.540 | where we want to get stronger?
00:10:50.400 | Right?
00:10:51.180 | Because we do use forklifts
00:10:52.380 | to follow the physical analogy.
00:10:53.960 | Like if I'm in a warehouse,
00:10:55.220 | a forklift is great
00:10:56.980 | because we used to have to use,
00:10:58.560 | you know,
00:10:59.040 | the longshoremen
00:11:00.900 | would have to pick things up
00:11:02.420 | and hurt their backs
00:11:03.200 | or use pulleys
00:11:04.400 | and forklifts are better.
00:11:05.480 | We can move things onto it.
00:11:06.740 | But in the gym,
00:11:07.560 | I want to do the work.
00:11:08.520 | So how do we think about this?
00:11:09.880 | Like how do you think about it
00:11:11.180 | in intellectual work?
00:11:12.600 | I think it's a trade-off
00:11:14.600 | between what capacities
00:11:15.740 | do we want to maintain
00:11:17.820 | and strengthen
00:11:18.540 | versus what capacities
00:11:20.820 | are we okay letting go?
00:11:23.360 | By no means
00:11:26.260 | am I personally
00:11:27.200 | an anti-AI purist.
00:11:29.780 | I think there are
00:11:30.480 | so many good use cases.
00:11:32.020 | And I do think it's true
00:11:32.900 | that whenever there's
00:11:34.080 | a new technology,
00:11:34.840 | there tends to be
00:11:35.620 | this kind of hysteria
00:11:36.620 | or moral panic.
00:11:37.560 | It happened with the fire.
00:11:38.940 | It happened with the wheel.
00:11:39.960 | The Industrial Revolution
00:11:41.980 | was so beneficial
00:11:43.860 | for so many different reasons.
00:11:45.620 | However,
00:11:46.220 | there's tons of public health data
00:11:48.000 | that shows
00:11:48.960 | that the invention
00:11:49.680 | of the vacuum cleaner,
00:11:51.520 | for example,
00:11:52.280 | helped people
00:11:53.480 | to clean a lot faster.
00:11:54.960 | And as a result,
00:11:55.600 | you are a lot more sedentary
00:11:57.180 | because you're not having
00:11:57.940 | to crouch down
00:11:58.640 | and stand back up
00:11:59.580 | and cover your whole house
00:12:00.520 | on your hands and knees
00:12:01.320 | to clean.
00:12:01.760 | Is it good
00:12:03.120 | that we have vacuum cleaners?
00:12:04.220 | But if we don't replace that,
00:12:05.720 | then you have
00:12:07.500 | these negative health consequences.
00:12:08.560 | You have obesity
00:12:09.360 | if you're just sedentary
00:12:10.340 | all the time.
00:12:10.880 | And I think that
00:12:12.080 | when it comes to something
00:12:13.220 | like writing,
00:12:14.100 | which really is
00:12:16.280 | maybe the best proxy
00:12:17.360 | that we have for thinking,
00:12:18.520 | and if you want to think well,
00:12:20.140 | then it helps to write well,
00:12:21.520 | I don't think
00:12:23.280 | that we should be so flippant
00:12:25.320 | with outsourcing
00:12:26.200 | our ability to think.
00:12:27.180 | Our ability to think
00:12:28.180 | is a central feature
00:12:29.220 | of our humanity.
00:12:29.860 | In many ways,
00:12:30.660 | it separates us
00:12:31.520 | from other species.
00:12:32.580 | We can very quickly
00:12:34.120 | become like Pavlov's dog
00:12:35.600 | putting something
00:12:36.680 | into the AI.
00:12:37.320 | It responds.
00:12:38.320 | We either like it
00:12:39.020 | or we don't.
00:12:39.520 | We give it a thumbs up
00:12:40.300 | or a thumbs down.
00:12:40.880 | We ask it the next question
00:12:42.040 | and then we're sacrificing
00:12:44.260 | so much of our humanity.
00:12:46.240 | If AI
00:12:46.900 | was going to
00:12:48.340 | be able to,
00:12:49.720 | let's say,
00:12:50.380 | run a gazillion
00:12:51.680 | different permutations
00:12:52.900 | to somehow get
00:12:53.540 | to the bottom of cancer,
00:12:54.480 | well then the cost benefit
00:12:56.000 | is pretty clear.
00:12:56.660 | Let it do that.
00:12:57.680 | Let it use all of its
00:12:58.700 | computing power
00:12:59.340 | to figure that out.
00:13:00.240 | But in the case of writing,
00:13:01.740 | we're essentially
00:13:02.280 | letting AI think for us.
00:13:03.620 | And I just don't think
00:13:04.820 | that that's a smart thing
00:13:05.520 | to outsource.
00:13:06.040 | I like this idea,
00:13:07.140 | by the way,
00:13:07.600 | of a moral panic
00:13:08.300 | around the invention
00:13:09.580 | of fire.
00:13:10.060 | I like to think of
00:13:11.380 | like a Stone Age Jonathan Haidt.
00:13:12.820 | When these kids
00:13:14.440 | are looking at this fire,
00:13:15.820 | they're not out there
00:13:17.520 | learning how to mammoth hunt.
00:13:18.760 | And do we really want kids
00:13:19.720 | who can't mammoth hunt?
00:13:20.580 | Come on, fire.
00:13:21.200 | Fire is a problem.
00:13:22.000 | So then what if we say,
00:13:23.320 | let me throw,
00:13:23.860 | I'll throw out a proposal.
00:13:24.580 | You tell me what you think.
00:13:26.540 | School, don't use LLMs.
00:13:29.360 | It defeats the whole purpose.
00:13:30.400 | School is the closest thing
00:13:31.800 | we have to an intellectual gym.
00:13:32.980 | So why would we put
00:13:34.520 | a forklift or exoskeleton
00:13:36.200 | in the gym?
00:13:36.820 | That's the whole point
00:13:38.920 | of writing in school.
00:13:40.000 | It's not getting us
00:13:41.440 | to something else
00:13:42.240 | that we're trying to get to
00:13:43.320 | and making that more efficient
00:13:44.460 | would be good.
00:13:46.140 | The writing itself
00:13:47.660 | is the point.
00:13:48.960 | So I'm going to say school,
00:13:50.000 | don't use LLMs
00:13:52.020 | to write at all.
00:13:53.760 | I would say maybe
00:13:55.480 | if you're a professional writer too.
00:13:56.900 | This seems to be Ezra's argument.
00:13:58.340 | It's like I do this for a living.
00:14:00.000 | If you listen to this,
00:14:00.520 | you know,
00:14:00.840 | I'm trying to be really,
00:14:02.020 | really good at this.
00:14:03.020 | To outsource parts of my thinking,
00:14:06.460 | my thinking is what I'm paid for.
00:14:08.160 | Like that's the whole ball game.
00:14:10.160 | This doesn't make sense to me.
00:14:11.240 | Like why would I want
00:14:12.080 | to outsource any?
00:14:12.720 | All I do is write these scripts
00:14:15.060 | for my episodes.
00:14:16.520 | That's my whole job
00:14:17.640 | to make that a little bit faster.
00:14:20.020 | Like what am I trying to do here?
00:14:21.440 | This is not,
00:14:22.100 | it doesn't get me
00:14:22.740 | to something else faster.
00:14:23.640 | So, okay.
00:14:24.080 | Do you agree with those
00:14:25.240 | or am I missing something?
00:14:25.840 | School,
00:14:26.320 | professional writing,
00:14:27.640 | you shouldn't be using LLMs
00:14:30.460 | in the actual
00:14:32.020 | like crafting a prose process.
00:14:34.120 | I don't know.
00:14:34.480 | Am I getting this right?
00:14:36.400 | I'm sort of doing this
00:14:37.060 | on the fly here.
00:14:37.700 | I'm doing it on the fly with you.
00:14:39.380 | I do think you're getting it right.
00:14:40.300 | I think that you would add
00:14:41.860 | maybe composing music
00:14:44.000 | or if you're going to be writing
00:14:46.600 | emails or notes
00:14:47.680 | to family members
00:14:48.740 | that feel important.
00:14:50.060 | Essentially,
00:14:50.960 | if you step back
00:14:52.020 | and you look at this
00:14:52.740 | maybe more philosophically
00:14:53.860 | or spiritually,
00:14:54.440 | if there are activities
00:14:56.340 | that require strain
00:14:58.280 | in the short term,
00:14:59.320 | but in the long term,
00:15:01.040 | you derive satisfaction from,
00:15:03.760 | then those are activities
00:15:05.480 | worth protecting.
00:15:07.460 | The thought experiment
00:15:09.100 | that makes this so simple
00:15:10.660 | because it's an extreme
00:15:11.560 | is imagine
00:15:13.160 | that you could
00:15:14.540 | press one button
00:15:15.600 | and perhaps we'll get here
00:15:16.680 | in a few years,
00:15:17.120 | who knows,
00:15:17.980 | and you could say,
00:15:18.680 | I want a Grammy
00:15:19.380 | award-winning folk song.
00:15:20.820 | And you press that button
00:15:22.500 | and the song comes out
00:15:23.640 | and you spend maybe
00:15:24.280 | 30 minutes going back
00:15:25.440 | and forth making edits
00:15:26.380 | and then you win the Grammy.
00:15:27.980 | Would you be as satisfied
00:15:30.260 | with that Grammy
00:15:31.900 | as if you worked
00:15:31.900 | on that album
00:15:32.420 | for two years
00:15:33.220 | and the lion's share,
00:15:35.180 | if not all of it,
00:15:35.880 | came out of your brain?
00:15:36.740 | I don't think so.
00:15:38.440 | I think that you'd be
00:15:39.020 | much more satisfied
00:15:39.740 | if you did the work.
00:15:40.640 | There could be a world
00:15:42.140 | where a part of doing
00:15:43.020 | that work
00:15:43.400 | is using this
00:15:44.180 | as a tool
00:15:44.880 | but that's very different
00:15:46.220 | than relying on it
00:15:47.280 | or having it replace
00:15:48.100 | the actual act.
00:15:49.000 | I think it's twofold.
00:15:50.240 | I think it's exactly
00:15:50.960 | what you said
00:15:51.540 | which is you don't want
00:15:53.220 | to use it in areas
00:15:54.380 | that are core
00:15:55.140 | to how you live your life,
00:15:57.040 | how you make your living,
00:15:58.620 | you know,
00:15:59.020 | if you get paid to think
00:16:00.200 | or you're a knowledge worker,
00:16:01.140 | you want to be really careful
00:16:02.040 | about protecting
00:16:02.600 | your ability to think
00:16:03.800 | and I think
00:16:05.120 | we just need to be careful
00:16:06.020 | of this,
00:16:06.680 | you know,
00:16:07.660 | this is a tale
00:16:08.360 | as old as time,
00:16:08.960 | this trade-off
00:16:09.480 | between short-term convenience
00:16:10.780 | and long-term satisfaction
00:16:12.280 | and I personally
00:16:14.400 | am hesitant
00:16:14.980 | to outsource
00:16:16.640 | things that are hard
00:16:17.900 | in the moment
00:16:18.380 | but give me satisfaction,
00:16:19.680 | make me content,
00:16:20.480 | make me satiated,
00:16:21.440 | allow me to feel
00:16:22.560 | like I'm falling asleep
00:16:23.440 | after a good day's work
00:16:24.620 | because I put in the effort.
00:16:25.680 | I don't want to click
00:16:26.420 | two buttons
00:16:26.940 | and have that done for me.
00:16:27.860 | I have a term
00:16:28.880 | I just invented.
00:16:29.520 | You're so good
00:16:30.740 | at naming things.
00:16:31.380 | All right,
00:16:31.580 | what do you think about this?
00:16:32.160 | And it's an adjustment
00:16:33.400 | of existing term,
00:16:34.040 | type two thinking.
00:16:36.000 | Right?
00:16:37.200 | Type two fun
00:16:38.160 | is this well-known term
00:16:39.380 | for activities
00:16:40.640 | that like aren't fun
00:16:41.980 | in the moment
00:16:42.540 | while you're doing them
00:16:43.220 | like hard mountaineering
00:16:44.180 | or something like this
00:16:44.900 | or a really hard workout
00:16:45.760 | but it's very satisfying
00:16:46.840 | and fun when you're done.
00:16:47.920 | You're glad that you've done them.
00:16:49.040 | We should have the same thing
00:16:50.380 | for thinking.
00:16:50.760 | Type two thinking
00:16:51.560 | is difficult in the moment.
00:16:53.580 | The blank page
00:16:54.460 | is difficult in the moment
00:16:55.860 | but having composed
00:16:57.100 | something that you're proud of
00:16:58.220 | is like deeply satisfying
00:16:59.700 | and long-term,
00:17:01.160 | you know it's going to make you
00:17:02.140 | better and smarter
00:17:03.160 | and so you think about
00:17:04.360 | type two thinking
00:17:05.000 | like type two fun.
00:17:05.780 | Like, yeah,
00:17:06.100 | the hardness is the feature here.
00:17:07.260 | Maybe that'll catch on.
00:17:08.780 | I don't know.
00:17:09.760 | I love that
00:17:10.120 | and I also think
00:17:10.860 | that there's a chance
00:17:13.100 | that in the future
00:17:13.920 | we'll have essentially
00:17:15.600 | cognitive gyms
00:17:16.920 | where like you set aside time
00:17:18.380 | and you say
00:17:19.520 | during this hour and a half
00:17:20.880 | I'm not going to use
00:17:21.580 | the model at all.
00:17:22.400 | I'm going to walk
00:17:23.680 | on a treadmill
00:17:24.260 | instead of drive the car.
00:17:25.940 | I do have one question
00:17:26.840 | for you
00:17:27.260 | and this is going to be
00:17:28.400 | perhaps some pushback
00:17:29.520 | from your listeners.
00:17:30.400 | I know that people
00:17:31.140 | who read my work
00:17:31.820 | are probably thinking this.
00:17:33.080 | I think the pushback
00:17:34.600 | would be
00:17:34.980 | this is a tool
00:17:37.000 | and it's all about
00:17:39.220 | how you use it
00:17:40.060 | and what if
00:17:41.000 | you get the type two hard
00:17:42.440 | by getting better
00:17:43.540 | at using the tool?
00:17:44.400 | So why would you spend
00:17:45.780 | two hours writing
00:17:46.520 | if you could sit there
00:17:47.940 | and prompt the tool
00:17:48.800 | back and forth
00:17:49.680 | and you're still going to have
00:17:50.900 | that cognitive strain
00:17:51.720 | and it's still going to be
00:17:52.320 | really effortful
00:17:53.140 | and it's going to be
00:17:54.300 | as you said in the opening
00:17:55.240 | a symbiotic relationship
00:17:56.420 | where the skill
00:17:57.420 | that you're developing
00:17:58.100 | or the competency
00:17:58.800 | that you're developing
00:17:59.620 | is prompting
00:18:01.200 | and using the tool.
00:18:02.220 | I think it's a much more
00:18:03.360 | probably the response
00:18:04.220 | would be
00:18:04.360 | it's a much more
00:18:04.800 | narrow cognitive activity
00:18:06.180 | right?
00:18:06.760 | Like there is
00:18:07.420 | some like
00:18:08.880 | sophistication
00:18:10.020 | you're like
00:18:10.260 | look when I'm in the gym
00:18:11.100 | and I have the animatronic
00:18:12.140 | that's lifting the weights
00:18:12.960 | for me
00:18:13.280 | there's a lot of
00:18:13.980 | I have to like control it
00:18:15.380 | and make a lot of decisions
00:18:16.420 | and move this lever
00:18:17.200 | and that lever
00:18:17.740 | it's not easy right
00:18:18.660 | to get it
00:18:19.100 | it's a different type of hard
00:18:20.100 | but it's not
00:18:21.120 | the type of hard
00:18:22.140 | that you're hoping
00:18:22.620 | to get out of the gym
00:18:23.260 | which was getting stronger
00:18:24.240 | so prompting
00:18:26.140 | carefully
00:18:27.080 | to try to work
00:18:28.080 | back and forth
00:18:28.800 | to get out stuff
00:18:29.540 | you can use
00:18:30.280 | is probably a much more
00:18:31.240 | narrower band
00:18:31.960 | of your mind
00:18:32.420 | you're using
00:18:32.820 | it's a very sort of
00:18:33.540 | like tool functional type
00:18:34.700 | we're already good
00:18:35.600 | at that by the way
00:18:36.380 | like we're
00:18:36.840 | we are very good
00:18:38.300 | at like using
00:18:38.960 | these type of
00:18:39.500 | consumer facing tools
00:18:40.500 | which are made
00:18:40.960 | to be really easy
00:18:41.760 | that's not necessarily
00:18:43.100 | something we have
00:18:43.760 | to work really hard at
00:18:44.640 | whereas the connections
00:18:46.060 | we get in writing
00:18:48.980 | those are hard-won
00:18:48.980 | they make you smarter
00:18:50.200 | they make you more empathetic
00:18:51.360 | you're able to better
00:18:52.100 | understand the world
00:18:53.060 | and manipulate ideas
00:18:54.160 | which I think is
00:18:54.940 | more of a superpower
00:18:56.140 | I mean
00:18:56.500 | I don't know
00:18:57.140 | I'll ask you where
00:18:57.600 | you would draw your line
00:18:59.160 | I think I'm probably
00:19:00.000 | similar to Ezra here
00:19:02.560 | I think
00:19:03.140 | these tools
00:19:04.040 | are fine
00:19:05.700 | as a better
00:19:06.380 | Google search
00:19:07.000 | we're kind of
00:19:07.480 | already used to
00:19:08.020 | doing Google searches
00:19:08.740 | sometimes it's like
00:19:09.440 | a Google search
00:19:10.040 | that'll do some
00:19:10.500 | summaries for you
00:19:11.200 | I have to be a little
00:19:12.220 | bit careful about it
00:19:12.940 | because it hallucinates
00:19:13.600 | a lot
00:19:13.900 | but that can be useful
00:19:15.720 | sometimes if I'm
00:19:17.060 | writing an article
00:19:18.120 | on deadline
00:19:18.800 | I'll do some
00:19:19.360 | copy editing with it
00:19:21.540 | not that I can't copy edit
00:19:22.580 | myself
00:19:23.080 | but that
00:19:23.480 | you know
00:19:24.400 | if I'm in a hurry
00:19:26.980 | I'll sometimes
00:19:28.460 | be like
00:19:28.680 | hey I'll let
00:19:29.500 | LLM help me
00:19:31.160 | copy edit
00:19:31.620 | because if this is
00:19:32.200 | cleaner grammatically
00:19:33.040 | it's just going to
00:19:33.520 | make the next
00:19:33.920 | editing pass
00:19:34.560 | faster
00:19:35.540 | like I wouldn't
00:19:36.160 | necessarily bother
00:19:36.760 | with a book
00:19:37.280 | because we have
00:19:38.020 | but if it's on a
00:19:39.120 | tight deadline
00:19:39.520 | for an article
00:19:40.080 | I might do that
00:19:40.960 | I'm wary about
00:19:42.460 | deep research
00:19:43.220 | unless it's really
00:19:44.440 | hey here's some
00:19:46.020 | quantitative data
00:19:47.060 | I kind of know
00:19:47.600 | where it is
00:19:48.320 | will you
00:19:49.040 | O2 or something
00:19:50.020 | put this into
00:19:51.020 | a spreadsheet
00:19:52.160 | or into a format
00:19:53.040 | like actually do
00:19:53.780 | data manipulation
00:19:54.560 | in a way that makes
00:19:55.320 | it easier to work
00:19:55.860 | with like that
00:19:56.320 | makes sense to me
00:19:56.960 | I'm a little bit
00:19:57.820 | more wary of like
00:19:58.640 | hey go summarize
00:19:59.380 | everything on this
00:20:00.520 | topic because I
00:20:01.280 | just don't think
00:20:01.720 | it does a good
00:20:02.200 | job and also
00:20:02.820 | that's part of
00:20:03.500 | the work of
00:20:03.920 | journalism is
00:20:04.560 | knowing like
00:20:05.140 | your sources
00:20:05.620 | and following
00:20:06.200 | those trails
00:20:06.720 | I think it
00:20:07.340 | doesn't do a
00:20:07.780 | great job
00:20:08.220 | I don't know
00:20:08.880 | that's where I
00:20:09.460 | am but maybe
00:20:10.600 | we're being
00:20:10.900 | arbitrary here
00:20:11.640 | but I draw the
00:20:13.400 | line where it's
00:20:14.000 | like the actual
00:20:14.580 | construction of the
00:20:15.440 | ideas or
00:20:16.680 | prose like
00:20:17.800 | I'm filling
00:20:18.500 | the blank page
00:20:19.420 | when you have
00:20:19.840 | your sources
00:20:20.300 | I think I
00:20:20.920 | agree with
00:20:21.240 | Ezra there
00:20:21.680 | you grapple
00:20:22.580 | you make all
00:20:23.740 | these connections
00:20:24.420 | you write
00:20:25.060 | it goes slow
00:20:25.960 | 20 minutes in
00:20:26.760 | it starts to
00:20:27.200 | pick up speed
00:20:27.920 | that's where a
00:20:29.240 | lot of interesting
00:20:29.700 | stuff happens
00:20:30.280 | but your cognitive
00:20:30.880 | let me go back
00:20:31.360 | to your cognitive
00:20:31.840 | gem idea
00:20:32.500 | in some sense
00:20:33.640 | that's what we
00:20:34.000 | want to avoid
00:20:34.720 | like what we
00:20:35.860 | want to avoid
00:20:36.380 | is getting in
00:20:36.800 | a world where
00:20:37.480 | this I mean
00:20:37.860 | it's exactly
00:20:38.380 | what happened
00:20:38.900 | as you said
00:20:39.260 | in physical
00:20:39.660 | health
00:20:40.020 | we became
00:20:41.080 | much more
00:20:41.540 | sedentary
00:20:42.080 | and then had
00:20:42.520 | to invent
00:20:43.100 | activities
00:20:43.660 | just to try
00:20:44.220 | to get
00:20:44.500 | some exercise
00:20:46.080 | we could do
00:20:47.100 | this intellectually
00:20:47.820 | we kind
00:20:48.660 | of outsource
00:20:49.380 | all of our
00:20:50.220 | thinking
00:20:50.660 | our brains
00:20:51.260 | are much
00:20:51.580 | worse
00:20:52.080 | and now
00:20:52.580 | there's like
00:20:52.900 | a small
00:20:53.200 | number of us
00:20:53.800 | that have to
00:20:54.180 | like go to
00:20:54.460 | cognitive gyms
00:20:55.120 | to like try
00:20:55.500 | to keep
00:20:55.740 | the brain
00:20:56.300 | or we could
00:20:56.860 | just not
00:20:57.140 | outsource the
00:20:57.740 | intellectual
00:20:58.100 | activities
00:20:58.680 | because having
00:21:00.500 | my student
00:21:01.120 | paper written
00:21:01.820 | faster
00:21:02.360 | it doesn't open up new economic opportunities or growth. It's just sort of convenience in the moment.
00:21:12.220 | It would be nice to avoid the cognitive equivalent, and so thinking about type two thinking, maybe that's the way to do it.
00:21:18.260 | Unsurprisingly, I'm going to completely agree with you. We can continue to build this metaphor in a way that I have found really helpful in relation to health and fitness.
00:21:29.880 | So we talked about the production of writing or thinking, and we made the parallel to lifting weights, and sometimes you do want to outsource lifting the heavy thing at the warehouse.
00:21:40.740 | But if the goal is to maintain strength in this case, if the goal is to maintain the ability to think for ourselves, we certainly don't want to outsource it. I would say, at the very least, we don't want to become reliant on it.
00:21:52.560 | The metaphor extends to what we consume. So at the same time we underwent the industrial revolution and we became sedentary, there was a massive change in technology and the creation of ultra-processed, or highly processed, foods.
00:22:07.540 | Highly processed foods, they're convenient, they taste great, they have a kind of addictive quality about them. And the more that you eat them, the harder it is to enjoy nourishing foods.
00:22:21.120 | What happens with ultra-processed foods is all sorts of metabolic issues and dysfunction, struggling with overweight and obesity. We've built an environment around us where we're sedentary and there's ultra-processed foods everywhere.
00:22:36.620 | And we spent a lot of time talking about the production of writing, but I think there's something to say about consumption too. If you're consuming a bunch of short, generated content on TikTok, Instagram, Twitter, wherever, content that these researchers coined soulless, empty, lacking individuality,
00:23:02.000 | not only are you not exerting energy to build cognitive fitness, you're also consuming highly ultra-processed information that has a very synthetic quality.
00:23:13.380 | I think the biggest risk is that we enter a mental built environment that is similar to the physical built environment, where we're not exerting ourselves and we're consuming ultra-processed content.
00:23:24.320 | I mean, this comes up in the reading research. If you're reading on a screen lightweight material that's immediately interesting but not challenging, it doesn't develop these so-called deep reading processes.
00:23:36.180 | There's actually a distinction between deep reading and lighter reading, and it doesn't matter so much whether it's a phone or a classic book: the harder the thing you're reading, the more your brain gets those intellectual advantages.
00:23:47.640 | The reading research, and I think Proust and the Squid is great on this, says that when you're really challenged, when you're grappling with ideas, a complicated character, a complicated structure, trying to follow what's going on, that's where the brain gets stronger.
00:24:03.100 | I agree with a lot of that. The prescription, if we follow the analogy through, the prescriptions would be: write things yourself, consume ultra-processed content sparingly, and be smart about the things you pursue.
00:24:22.380 | Brad's Instagram, which is textual by the way, he writes the kind of thing we're talking about today, those are worth following. There's a particular use there. What I always say about Instagram: if it gives you motivation for a particular pursuit, people who are triathletes, it helps to watch videos of triathletes, it gives you something, right, that's doing digital minimalism right.
00:24:48.020 | Aside from specific people you follow for a particular purpose, don't be there by default, because then the platform is going to decide where you're going. One of the things I discovered
00:25:02.160 | when I wrote something about this and was working on it: looking at the history to understand the physical obesity issue was interesting, because it took a long time, and it has me wondering.
00:25:12.660 | I don't think we have our arms fully around the equivalent, cognitive obesity, and what's really going on; we don't always see a problem while it's developing.
00:25:20.100 | The problem in the physical world, obesity, we can measure; we have a very specific way of talking about exactly which chronic health conditions it really makes worse. That's understood now, but it wasn't understood when the graph started to hockey stick with calories and sedentariness.
00:25:38.340 | We're in those early stages with cognitive obesity, and I don't think we know exactly what's going on.
00:25:47.060 | I don't know if you've sat on an airplane across from someone on TikTok, scrolling. You're on TikTok, three seconds, swipe, a second, swipe, a few seconds, swipe. What is that doing to your brain? I don't know.
00:26:06.500 | Funny TikTok story, as an aside, actually funny and terrifying.
00:26:11.220 | Maybe a couple of years ago, TikTok was going to be the thing for authors, because there's ... a TikTok account ... the way I'd describe it is like having eaten Skittles, eating Skittles the whole time, it's synthetic.
00:26:35.400 | I remember telling my marketing person, who said I should be on TikTok: I can't. I'm happy to forfeit whatever publicity was going to come from it, for my brain. Precisely.
00:26:48.820 | That's the reason I think another piece of prescriptive advice would be the importance of being aware of the trade-offs when you're producing or consuming highly processed information, and having rules and constraints, as you mentioned.
00:27:07.080 | To continue the metaphor, to continue the analogy: merely existing in the modern environment is not healthy; that's the obesity problem. In my opinion, education about carbohydrates, protein, added sugars, calories actually helps people maintain decent health in a terrible environment.
00:27:32.240 | Nobody does it by default, because the stuff is on every corner and it tastes great. So you need that level of knowledge, and with that knowledge, constraints and rules. Some people go into overdrive with cockamamie diets and all kinds of restriction, but at the least, I think, being intentional and deliberate about what you're eating matters.
00:27:56.880 | I think at the least we should approach the information environment similarly, because there's going to be fast food restaurants popping up on every corner in the world of thinking.
00:28:06.140 | You're saying people who stay healthy, often there's, it's not like hardship, there's rules they follow, there's points of friction. Exactly. They're going to poker night and having whatever, but it doesn't dominate their life as a constant feature; there's an awareness of what they're doing.
00:28:27.640 | If we don't do something like that on the digital side, I'm worried we're going to end up cognitively obese. It's interesting: everything seems to be about making things easier, and we can't point towards where, where people think that's weird. That's probably, that's probably the problem.
00:28:52.680 | I always think about financial budgeting: if I haven't changed how I'm spending money, if I haven't recently, I don't really budget, right? Because the budget doesn't exist unless it's inducing constraints that change behavior, where you would have normally spent money and you didn't. If that's not happening, you don't really have a budget; you don't really budget. Maybe there's, there's a cognitive analogy there.
00:29:16.120 | All right, final thing. How do we think about writing where we don't really think of it as writing, where it could be different? I guess I'm thinking of emails, right, business and logistical writing, where we don't think about the writing being about cognitive development.
00:29:31.240 | Writing a paper for school, or an article if you're a professional writer, is something where you're grappling with your thoughts. But there's professional writing where it is writing that's fully functional: coordinate a meeting, book something, a summary of these notes for this person, integrate this into a slide, something that's in that other major category of writing, right?
00:30:00.540 | So what do you think, what are your thoughts? For the writing where we don't think the writing is cognitively important, should we be having AI write those emails? How do you think about that?
00:30:11.680 | You're the expert here, so I'm going to partly defer to you, since you asked the question, but it is a thing I've thought about. I think what will happen is we'll be in a world where we're talking about going back and forth about a schedule for a meeting, or any other number of administrative overhead tasks.
00:30:35.560 | The question that I have is why you're going to write it, why you're really writing it in the first place, because I assume if you're going to have AI write it, perhaps the recipient is going to have their AI, or at least their AI is going to respond. Then it's robots talking to robots, and is there a human in the process?
00:30:56.420 | I think I agree. The overwrought email we're doing, linguistic communication in the workplace, is a crude solution to a complicated problem. There's usually a coordination problem that needs to be solved, information that needs to get to a person, a decision that needs to be made: get the people with the right information so a decision can happen.
00:31:18.800 | Typing back and forth at each other, we'll use that as a crude proxy for reaching a decision, right? But it seems like there's a more direct way, and that's probably where we should stop having to write a really complicated email to figure out whose information goes to the meeting. Wouldn't it be easier if we could get the information where it needs to go, if we could go back and forth directly?
00:31:41.080 | I think when it comes to things like email communication, there isn't much there about making your brain smarter, so you can think about efficiency at a broader level. I write about this in A World Without Email: a lot of it is about what the communication is serving, a project, a meeting, booking something, and then it's actually a simple question: what's the best way to capture the message and get it there as effectively and efficiently as possible? There's no rule that it has to be a text window going back and forth, right?
00:32:21.900 | The other thing is, I think people using AI for professional writing often don't trust their writing ability, probably because when you're writing there's a sense of, I don't really trust myself to write to this person, the other person is going to expect something professional, something that reads right, which is probably more about confidence than anything.
00:32:41.860 | But I think there's a lot of places in professional communication where it straight up saves time: going through these notes to summarize the key points of the meeting is drudgery it can do for you.
00:32:58.340 | cliff
00:33:00.340 | son's
00:33:00.680 | baseball
00:33:01.900 | eight
00:33:02.160 | o'clock
00:33:02.440 | and we
00:33:02.700 | haven't
00:33:03.000 | eaten
00:33:03.280 | we'll
00:33:04.780 | McDonald's
00:33:05.640 | Burger
00:33:06.880 | built
00:33:08.020 | McDonald's
00:33:08.680 | Burger
00:33:09.620 | would
00:33:10.260 | unhealthy
00:33:11.140 | wouldn't
00:33:12.260 | physical
00:33:12.660 | health
00:33:13.080 | physical
00:33:13.380 | fitness
00:33:15.260 | intentional
00:33:16.060 | deliberate
00:33:17.080 | there
00:33:17.520 | specific
00:33:17.880 | times
00:33:18.800 | makes
00:33:19.040 | sense
00:33:20.200 | think
00:33:20.320 | that's
00:33:20.840 | think
00:33:21.480 | analogy
00:33:21.860 | holds
00:33:22.600 | nutrition
00:33:23.020 | purists
00:33:24.020 | shall
00:33:24.200 | never
00:33:24.940 | those
00:33:26.220 | rigid
00:33:26.420 | rules
00:33:27.180 | consuming
00:33:30.520 | means
00:33:31.280 | little
00:33:31.440 | piece
00:33:31.960 | wrote
00:33:32.420 | saying
00:33:33.020 | should
00:33:35.320 | think
00:33:36.180 | should
00:33:36.440 | really
00:33:36.720 | deliberate
00:33:37.700 | intentional
00:33:38.080 | about
00:33:38.640 | because
00:33:39.140 | weren't
00:33:40.240 | smartphones
00:33:40.640 | and the
00:33:40.880 | adoption
00:33:41.380 | social
00:33:41.660 | media
00:33:42.840 | becoming
00:33:43.160 | pretty
00:33:43.700 | clear
00:33:44.980 | there
00:33:46.460 | detrimental
00:33:46.980 | negative
00:33:47.600 | impacts
00:33:48.160 | people
00:33:48.500 | probably
00:33:48.660 | listening
00:33:49.280 | podcast
00:33:50.100 | phone
00:33:50.280 | which
00:33:51.180 | proof
00:33:51.980 | there
00:33:52.520 | great
00:33:52.840 | things
00:33:53.200 | about
00:33:53.880 | modern
00:33:54.140 | technology
00:33:57.880 | let's
00:33:58.520 | happens
00:34:00.140 | think
00:34:00.880 | writing
00:34:02.280 | consumption
00:34:03.120 | consumption
00:34:03.900 | written
00:34:04.280 | material
00:34:06.180 | arguing
00:34:08.100 | let's
00:34:08.360 | learn
00:34:08.840 | mistakes
00:34:09.500 | adoption
00:34:09.900 | social
00:34:10.100 | media
00:34:10.500 | smartphones
00:34:11.240 | let's
00:34:11.540 | really
00:34:11.820 | deliberate
00:34:12.500 | intentional
00:34:12.920 | about
00:34:14.020 | technology
00:34:14.500 | instead
00:34:15.020 | going
00:34:15.380 | wherever
00:34:15.760 | current
00:34:15.940 | takes
00:34:16.960 | right
00:34:17.180 | sounds
00:34:17.440 | we're
00:34:17.940 | Ezra's
00:34:19.660 | summarize
00:34:20.380 | advice
00:34:20.920 | seems
00:34:21.240 | we're
00:34:21.360 | saying
00:34:21.720 | least
00:34:22.040 | saying
00:34:22.860 | you're
00:34:23.140 | student
00:34:23.520 | don't
00:34:25.980 | writing
00:34:27.100 | whole
00:34:27.340 | point
00:34:27.620 | there
00:34:28.300 | smarter
00:34:29.380 | better
00:34:29.500 | thinker
00:34:29.960 | writing
00:34:31.300 | incredible
00:34:33.620 | cognitive
00:34:33.920 | world
00:34:34.300 | don't
00:34:35.500 | you're
00:34:35.680 | professional
00:34:36.000 | writer
00:34:36.400 | that's
00:34:37.680 | whole
00:34:39.540 | nuanced
00:34:39.920 | subtle
00:34:40.220 | grasp
00:34:41.960 | athlete
00:34:42.240 | needs
00:34:42.600 | actually
00:34:43.400 | training
00:34:43.720 | don't
00:34:45.360 | writing
00:34:45.760 | you're
00:34:45.960 | professional
00:34:46.420 | circumstance
00:34:48.260 | there
00:34:49.200 | tedious
00:34:52.220 | information
00:34:54.160 | another
00:34:56.160 | email
00:34:56.480 | chain
00:34:56.840 | where
00:34:56.980 | people
00:34:57.320 | talking
00:34:57.540 | about
00:34:57.660 | their
00:34:57.760 | availabilities
00:34:58.380 | and I
00:34:58.680 | want to
00:34:58.860 | suggest
00:34:59.120 | three
00:34:59.320 | times
00:34:59.740 | works
00:35:00.060 | everyone
00:35:01.240 | that's
00:35:02.020 | maybe
00:35:02.980 | because
00:35:03.380 | drudgery
00:35:03.940 | helping
00:35:05.620 | annoying
00:35:06.860 | email
00:35:07.460 | future
00:35:08.720 | could
00:35:09.060 | schedule
00:35:10.120 | short
00:35:11.840 | going
00:35:12.060 | automate
00:35:13.500 | final
00:35:13.720 | thing
00:35:13.980 | would
00:35:14.360 | you're
00:35:14.580 | professional
00:35:14.940 | context
00:35:16.320 | you're
00:35:17.160 | nervous
00:35:17.660 | about
00:35:18.180 | writing
00:35:18.500 | quality
00:35:19.120 | maybe
00:35:19.400 | you're
00:35:19.520 | sending
00:35:20.060 | important
00:35:20.380 | email
00:35:20.760 | you're
00:35:22.120 | check
00:35:22.960 | Grammarly
00:35:24.680 | it'll
00:35:25.520 | actually
00:35:26.080 | here's
00:35:27.300 | places
00:35:27.780 | could
00:35:28.200 | sound
00:35:28.540 | professional
00:35:29.620 | would
00:35:30.060 | short
00:35:30.660 | you're
00:35:30.820 | worried
00:35:32.680 | could
00:35:32.960 | helpful
00:35:34.520 | should
00:35:35.140 | signal
00:35:36.460 | should
00:35:36.640 | actually
00:35:36.940 | deliberately
00:35:37.760 | on my
00:35:38.040 | riding
00:35:38.700 | lower
00:35:39.020 | stakes
00:35:39.360 | situation
00:35:39.980 | until I
00:35:40.260 | get to
00:35:40.580 | point
00:35:40.780 | where
00:35:41.540 | nervous
00:35:41.880 | about
00:35:42.460 | think
00:35:43.280 | you're
00:35:44.700 | national
00:35:45.580 | don't
00:35:46.500 | actually
00:35:47.040 | these
00:35:47.240 | trails
00:35:47.840 | going
00:35:48.080 | drive
00:35:48.580 | scooter
00:35:48.820 | around
00:35:50.120 | scooter
00:35:51.660 | trail
00:35:52.140 | whatever
00:35:52.680 | context
00:35:53.320 | should
00:35:54.140 | saying
00:35:54.300 | maybe
00:35:54.800 | should
00:35:56.040 | better
00:35:56.240 | shape
00:35:58.320 | don't
00:35:59.220 | scooter
00:35:59.600 | around
00:35:59.960 | all right
00:36:00.100 | those
00:36:00.660 | pieces
00:36:01.080 | advice
00:36:02.800 | we're
00:36:03.940 | think
00:36:04.080 | we're
00:36:05.200 | using
00:36:05.580 | typewriter
00:36:06.600 | processors
00:36:07.840 | someone
00:36:08.040 | who uses
00:36:08.380 | their
00:36:08.540 | brain
00:36:09.020 | living
00:36:09.300 | worried
00:36:10.560 | about
00:36:10.800 | their
00:36:10.960 | brain
00:36:11.260 | wanting
00:36:11.500 | their
00:36:12.240 | possible
00:36:13.620 | would
00:36:14.620 | nostalgic
00:36:15.800 | moral
00:36:16.020 | panic
00:36:16.580 | actually
00:36:16.920 | pretty
00:36:17.640 | reasonable
00:36:19.800 | hitting
00:36:20.820 | missing
00:36:21.640 | think
00:36:21.760 | you're
00:36:21.880 | nailing
00:36:22.700 | thing
00:36:23.380 | would
00:36:24.160 | fourth
00:36:24.460 | example
00:36:25.220 | you're
00:36:25.360 | someone
00:36:25.640 | who's
00:36:25.980 | worried
00:36:26.640 | writing
00:36:26.860 | could
00:36:27.160 | better
00:36:27.740 | you're
00:36:27.880 | using
00:36:29.800 | wouldn't
00:36:30.540 | autopilot
00:36:31.580 | would
00:36:31.940 | attention
00:36:33.000 | edits
00:36:33.460 | gives
00:36:34.360 | think
00:36:34.660 | sounds
00:36:34.880 | better
00:36:35.780 | would
00:36:36.520 | think
00:36:36.960 | strain
00:36:39.020 | sound
00:36:39.220 | better
00:36:40.380 | change
00:36:41.860 | better
00:36:44.440 | closer
00:36:45.720 | maybe
00:36:49.940 | professional
00:36:51.400 | don't
00:36:53.260 | better
00:36:53.420 | version
00:36:55.580 | different
00:36:55.720 | version
00:36:57.860 | ranking
00:36:59.080 | caution
00:36:59.420 | because
00:37:00.740 | actual
00:37:01.080 | person
00:37:02.880 | confident
00:37:03.780 | tells
00:37:04.640 | doesn't
00:37:05.200 | right
00:37:06.100 | could
00:37:06.960 | training
00:37:07.180 | mechanism
00:37:07.620 | that's
00:37:09.160 | going
00:37:10.360 | forth
00:37:10.560 | about
00:37:10.880 | sentences
00:37:11.880 | gotta
00:37:12.100 | worried
00:37:12.300 | about
00:37:13.120 | right
00:37:13.860 | thanks
00:37:14.140 | coming
00:37:14.520 | think
00:37:14.760 | cracked
00:37:15.220 | problem
00:37:16.180 | interesting
00:37:16.500 | thanks
00:37:17.060 | turning
00:37:17.420 | attention
00:37:18.000 | paper
00:37:19.200 | obviously
00:37:19.820 | bigger
00:37:20.220 | issue
00:37:21.180 | write
00:37:21.420 | about
00:37:21.880 | essays
00:37:22.340 | books
00:37:24.140 | we'll
00:37:24.300 | touch
00:37:24.520 | on it
00:37:25.080 | future
00:37:26.080 | cognitive
00:37:26.960 | fitness
00:37:27.360 | cognitive
00:37:28.180 | cognitive
00:37:28.560 | obesity
00:37:30.360 | these
00:37:30.500 | issues
00:37:30.960 | wrapped
00:37:31.620 | together
00:37:31.960 | we'll
00:37:33.160 | about
00:37:33.480 | future
00:37:34.180 | least
00:37:34.500 | gives
00:37:34.960 | interesting
00:37:35.640 | what's
00:37:35.820 | going
00:37:36.160 | right
00:37:36.660 | thank
00:37:37.520 | thanks
00:37:38.380 | pleasure
00:37:38.600 | let's
00:37:40.280 | quick
00:37:40.440 | break
00:37:41.580 | another
00:37:42.420 | sponsors
00:37:42.920 | notion
00:37:47.240 | powered
00:37:49.680 | place
00:37:51.100 | automatically
00:37:51.580 | captures
00:37:52.120 | meeting
00:37:52.440 | notes
00:37:52.760 | instantly
00:37:53.160 | finds
00:37:53.680 | exact
00:37:54.080 | content
00:37:55.120 | drafts
00:37:56.100 | detailed
00:37:59.420 | models
00:37:59.880 | notion
00:38:01.160 | became
00:38:01.620 | twice
00:38:02.600 | powerful
00:38:03.200 | teams
00:38:03.560 | making
00:38:06.820 | little
00:38:07.900 | background
00:38:08.440 | Jesse
00:38:08.680 | and I
00:38:09.780 | notions
00:38:10.940 | various
00:38:11.380 | custom
00:38:11.900 | tools
00:38:12.420 | built
00:38:13.100 | notion
00:38:14.000 | years
00:38:14.340 | clearly
00:38:14.940 | listeners
00:38:17.000 | caller
00:38:17.680 | heard
00:38:18.100 | mentioned
00:38:19.840 | custom
00:38:20.380 | notion
00:38:21.220 | system
00:38:21.760 | right
00:38:23.260 | something
00:38:23.840 | we've
00:38:24.260 | known
00:38:24.700 | while
00:38:25.660 | they've
00:38:25.940 | added
00:38:26.260 | these
00:38:26.860 | features
00:38:27.280 | recently
00:38:28.460 | think
00:38:28.740 | really
00:38:29.340 | Jesse
00:38:29.920 | and I
00:38:30.360 | experimenting
00:38:30.920 | with one
00:38:31.440 | of these
00:38:31.860 | features
00:38:32.200 | which
00:38:32.460 | really
00:38:32.740 | caught
00:38:33.100 | attention
00:38:33.460 | they call
00:38:34.460 | enterprise
00:38:34.880 | search
00:38:35.540 | but here's
00:38:35.960 | the idea
00:38:36.360 | you can use
00:38:37.560 | keywords
00:38:38.180 | or an
00:38:39.200 | open-ended
00:38:39.740 | question
00:38:40.320 | for a
00:38:41.520 | single
00:38:41.940 | powerful
00:38:42.460 | search
00:38:42.880 | experience
00:38:43.520 | across
00:38:43.940 | all of
00:38:44.840 | connected
00:38:45.280 | tools
00:38:45.920 | information
00:38:46.520 | you're
00:38:47.380 | unifying
00:38:48.500 | scattered
00:38:48.980 | knowledge
00:38:49.440 | right
00:38:49.880 | within
00:38:50.400 | workplace
00:38:51.800 | quick
00:38:52.420 | summary
00:38:53.280 | those
00:38:53.760 | results
00:38:54.300 | you can
00:38:55.440 | something
00:38:55.600 | called
00:38:56.160 | connectors
00:38:57.540 | search
00:38:57.760 | be able
00:38:58.180 | connect
00:38:58.820 | other
00:38:59.600 | tools
00:39:00.160 | already
00:39:00.560 | using
00:39:01.640 | example
00:39:02.040 | Jesse
00:39:02.380 | and I
00:39:02.960 | Google
00:39:03.340 | workspace
00:39:04.720 | running
00:39:05.380 | podcast
00:39:07.260 | notion
00:39:07.520 | enterprise
00:39:07.840 | search
00:39:08.340 | we can
00:39:09.000 | search
00:39:09.520 | might
00:39:09.740 | combine
00:39:10.380 | information
00:39:12.740 | we have
00:39:13.120 | stored
00:39:13.520 | notion
00:39:14.440 | episode
00:39:14.860 | scripts
00:39:15.640 | stored
00:39:16.040 | Google
00:39:16.220 | drive
00:39:18.360 | example
00:39:20.460 | episode
00:39:21.040 | summarize
00:39:22.800 | summarize
00:39:23.500 | connect
00:39:23.960 | other
00:39:24.160 | information
00:39:25.360 | really
00:39:25.940 | bring
00:39:26.300 | power
00:39:27.480 | information
00:39:30.020 | general
00:39:30.340 | chatbot
00:39:31.240 | might
00:39:31.800 | using
00:39:32.160 | notion
00:39:33.720 | models
00:39:34.200 | built
00:39:35.880 | choose
00:39:36.100 | which
00:39:39.120 | cloud
00:39:40.240 | sonnet
00:39:41.780 | directly
00:39:42.320 | notion
00:39:43.020 | subscription
00:39:43.840 | separate
00:39:44.240 | subscription
00:39:44.800 | needed
00:39:45.480 | don't
00:39:46.460 | separate
00:39:48.460 | separate
00:39:48.760 | window
00:39:49.980 | we're
00:39:50.360 | first
00:39:50.660 | discover
00:39:50.940 | notion
00:39:52.700 | fortune
00:39:53.320 | companies
00:39:54.140 | teams
00:39:54.720 | notion
00:39:56.060 | email
00:39:56.480 | cancel
00:39:57.960 | meetings
00:39:58.360 | and save
00:39:59.080 | searching
00:39:59.600 | their
00:39:59.980 | that's
00:40:00.340 | music
00:40:01.340 | course
00:40:01.580 | check
00:40:02.760 | notion
00:40:04.540 | right
00:40:05.620 | notion.com
00:40:06.880 | slash
00:40:07.980 | now you
00:40:08.500 | need to
00:40:08.700 | type that
00:40:09.340 | all in
00:40:10.580 | lowercase
00:40:11.100 | letters
00:40:11.560 | notion.com
00:40:13.040 | slash
00:40:14.080 | to try
00:40:14.680 | powerful
00:40:15.060 | all-in-one
00:40:15.840 | notion
00:40:16.700 | today
00:40:17.140 | when you
00:40:19.340 | supporting
00:40:21.620 | notion.com
00:40:23.120 | slash
00:40:24.840 | I also
00:40:25.480 | want to
00:40:25.680 | talk about
00:40:25.960 | a new
00:40:26.160 | sponsor
00:40:27.660 | friends
00:40:28.720 | smalls
00:40:31.480 | don't
00:40:31.980 | Jesse
00:40:34.480 | Siamese
00:40:38.780 | little
00:40:39.300 | known
00:40:40.660 | grown
00:40:40.860 | African
00:40:45.840 | killed
00:40:46.140 | people
00:40:47.900 | Siamese
00:40:50.460 | having
00:40:51.240 | still
00:40:51.760 | tuned
00:40:53.320 | world
00:40:54.560 | which
00:40:55.820 | noticing
00:40:58.000 | people
00:41:01.080 | mentioning
00:41:01.820 | smalls
00:41:05.920 | digestive
00:41:06.880 | issues
00:41:07.880 | maybe
00:41:08.140 | they're
00:41:08.320 | throwing
00:41:08.700 | their
00:41:10.240 | don't
00:41:11.800 | you're
00:41:11.980 | feeding
00:41:13.040 | you're
00:41:13.360 | think
00:41:13.940 | could
00:41:14.220 | better
00:41:15.420 | should
00:41:15.620 | check
00:41:16.200 | smalls
00:41:17.280 | smalls
00:41:18.480 | protein
00:41:18.960 | packed
00:41:20.420 | protein
00:41:20.780 | packed
00:41:21.040 | recipes
00:41:21.960 | preservative
00:41:22.540 | ingredients
00:41:23.180 | you'd
00:41:23.940 | fridge
00:41:24.540 | deliver
00:41:25.100 | right
00:41:26.240 | delivery
00:41:26.580 | stuff
00:41:26.920 | because
00:41:27.340 | don't
00:41:27.860 | extra
00:41:28.240 | tasks
00:41:29.820 | that's
00:41:30.940 | cats.com
00:41:31.700 | names
00:41:32.020 | smalls
00:41:32.520 | their best
00:41:33.020 | overall
00:41:35.980 | first
00:41:36.300 | order
00:41:37.240 | shipping
00:41:38.360 | smalls.com
00:41:39.320 | and you
00:41:39.640 | can use
00:41:40.180 | promo
00:41:42.340 | limited
00:41:42.960 | today
00:41:43.660 | smalls
00:41:44.180 | started
00:41:46.840 | couple
00:41:47.780 | cooking
00:41:48.780 | small
00:41:48.980 | batches
00:41:49.560 | grown
00:41:50.040 | there
00:41:51.340 | humane
00:41:51.620 | world
00:41:51.980 | animals
00:41:52.360 | they've
00:41:52.660 | donated
00:41:53.320 | million
00:41:53.500 | dollars
00:41:53.780 | worth
00:41:56.540 | important
00:41:57.700 | everyone
00:41:58.580 | Jesse
00:41:59.900 | don't
00:42:00.240 | throw
00:42:00.480 | under
00:42:04.620 | comes
00:42:06.860 | mainly
00:42:07.520 | unmixed
00:42:08.440 | cement
00:42:08.720 | powder
00:42:09.200 | garbage
00:42:09.560 | that's
00:42:12.080 | there's
00:42:12.300 | people
00:42:13.300 | don't
00:42:13.520 | understand
00:42:14.820 | value
00:42:15.280 | actually
00:42:17.280 | you're
00:42:17.500 | going
00:42:18.560 | smalls
00:42:19.440 | after
00:42:20.020 | switching
00:42:20.540 | smalls
00:42:21.700 | of cat
00:42:22.020 | owners
00:42:22.220 | reported
00:42:22.560 | overall
00:42:23.000 | health
00:42:23.340 | improvement
00:42:23.760 | that's
00:42:24.060 | a big
00:42:26.000 | smalls
00:42:26.640 | confident
00:42:28.280 | their
00:42:28.420 | product
00:42:30.020 | means
00:42:30.420 | refund
00:42:31.460 | won't
00:42:32.660 | their
00:42:34.680 | waiting
00:42:36.840 | deserve
00:42:37.420 | limited
00:42:38.080 | because
00:42:39.680 | questions
00:42:40.040 | listener
00:42:42.920 | first
00:42:43.140 | smalls
00:42:43.420 | order
00:42:44.680 | shipping
00:42:48.800 | smalls
00:42:50.240 | promo
00:42:51.120 | again
00:42:51.740 | that's
00:42:52.040 | promo
00:42:53.840 | first
00:42:54.100 | order
00:42:55.660 | shipping
00:42:56.860 | smalls
00:42:58.100 | all right
00:42:59.580 | let's
00:43:01.000 | questions
00:43:01.500 | all right
00:43:05.700 | first
00:43:05.980 | questions
00:43:06.720 | Antonio
00:43:09.060 | models
00:43:09.680 | creative
00:43:10.660 | naive
00:43:12.900 | models
00:43:13.380 | explore
00:43:14.460 | space
00:43:15.240 | described
00:43:15.980 | training
00:43:17.560 | suited
00:43:18.200 | explore
00:43:18.600 | beyond
00:43:19.140 | training
00:43:20.640 | capability
00:43:22.460 | going
00:43:22.660 | beyond
00:43:23.180 | training
00:43:24.900 | creativity
00:43:26.660 | Antonio
00:43:27.000 | this is
00:43:27.380 | really
00:43:27.600 | one of
00:43:28.460 | questions
00:43:29.840 | goals
00:43:31.040 | research
00:43:32.780 | terminology
00:43:33.960 | called
00:43:34.200 | generalization
00:43:34.980 | beyond
00:43:35.560 | distribution
00:43:37.740 | grail
00:43:38.760 | artificially
00:43:39.260 | intelligent
00:43:39.700 | systems
00:43:40.600 | would
00:43:42.460 | train
00:43:44.040 | system
00:43:44.540 | would
00:43:45.080 | happen
00:43:45.640 | would
00:43:45.780 | train
00:43:46.200 | system
00:43:48.720 | doing
00:43:49.300 | would
00:43:49.460 | learn
00:43:49.980 | generalizations
00:43:51.300 | about
00:43:52.420 | various
00:43:52.760 | things
00:43:54.080 | interact
00:43:54.740 | other
00:43:55.760 | using
00:43:56.000 | those
00:43:56.220 | generalizations
00:43:57.660 | produce
00:43:59.480 | understanding
00:44:02.360 | ideas
00:44:02.920 | and new
00:44:03.380 | output
00:44:04.900 | doesn't
00:44:05.460 | directly
00:44:05.920 | reflect
00:44:06.740 | things
00:44:07.800 | training
00:44:08.380 | humans
00:44:09.240 | we're
00:44:09.380 | pretty
00:44:09.960 | generalization
00:44:11.740 | maybe
00:44:12.480 | learn
00:44:13.500 | looking
00:44:14.780 | particular
00:44:15.140 | example
00:44:16.700 | gravity
00:44:17.460 | works
00:44:19.960 | understand
00:44:21.460 | rules
00:44:22.760 | gravity
00:44:23.300 | going
00:44:23.500 | apply
00:44:24.120 | these
00:44:24.300 | things
00:44:24.540 | we've
00:44:24.720 | never
00:44:25.140 | before
00:44:25.860 | context
00:44:26.420 | right
00:44:27.040 | generalize
00:44:28.520 | apply
00:44:28.900 | generalization
00:44:31.320 | debate
00:44:31.720 | right
00:44:33.360 | generalization
00:44:34.160 | happening
00:44:35.000 | models
00:44:35.440 | based
00:44:36.260 | large
00:44:36.560 | language
00:44:36.800 | models
00:44:38.420 | sense
00:44:39.440 | research
00:44:39.760 | community
00:44:40.120 | seems
00:44:41.760 | researchers
00:44:42.160 | would
00:44:44.220 | comes
00:44:45.320 | researchers
00:44:45.940 | talked
00:44:46.520 | recent
00:44:47.020 | articles
00:44:48.140 | would
00:44:48.540 | there
00:44:48.640 | to be
00:44:48.980 | generalization
00:44:49.540 | but it
00:44:49.840 | takes
00:44:50.020 | a huge
00:44:50.340 | amount
00:44:50.860 | and they
00:44:51.100 | don't
00:44:51.440 | how much
00:44:52.040 | happening
00:44:52.380 | there's
00:44:53.220 | important
00:44:53.840 | paper
00:44:54.800 | couple
00:44:54.940 | weeks
00:44:55.500 | Apple
00:44:57.040 | where
00:44:57.860 | looking
00:44:58.420 | reasoning
00:44:58.700 | capabilities
00:44:59.720 | cutting
00:45:00.920 | large
00:45:01.260 | language
00:45:01.580 | models
00:45:02.020 | and they
00:45:02.480 | basically
00:45:02.900 | arguing
00:45:04.140 | reasoning
00:45:04.500 | quote
00:45:04.780 | unquote
00:45:05.140 | really
00:45:05.720 | doesn't
00:45:06.540 | generalize
00:45:07.200 | can't
00:45:07.680 | beyond
00:45:08.340 | applying
00:45:09.940 | directly
00:45:10.680 | rules
00:45:11.640 | that were
00:45:11.940 | inferred
00:45:12.360 | during
00:45:12.580 | training
00:45:12.920 | so it
00:45:13.220 | seems
00:45:14.140 | language
00:45:14.420 | models
00:45:14.880 | working
00:45:15.160 | right
00:45:21.760 | build
00:45:22.600 | pattern
00:45:23.000 | recognition
00:45:23.860 | transformer
00:45:24.240 | architecture
00:45:25.240 | understanding
00:45:26.280 | these
00:45:26.420 | examples
00:45:27.880 | infer
00:45:28.600 | general
00:45:28.880 | rules
00:45:29.320 | about
00:45:30.280 | and I'm
00:45:30.860 | using
00:45:31.040 | rules
00:45:31.860 | in the
00:45:32.080 | traditional
00:45:32.640 | sense
00:45:32.900 | but in
00:45:33.360 | informal
00:45:33.700 | sense
00:45:35.240 | embed
00:45:35.720 | their
00:45:35.980 | actual
00:45:36.500 | neural
00:45:36.720 | network
00:45:37.160 | architecture
00:45:38.800 | rules
00:45:39.140 | about
00:45:39.600 | these
00:45:39.780 | types
00:45:40.140 | examples
00:45:41.460 | forward
00:45:43.020 | novel
00:45:43.420 | version
00:45:44.100 | example
00:45:44.500 | it'll
00:45:44.720 | recognize
00:45:46.240 | apply
00:45:47.740 | answer
00:45:48.600 | example
00:45:51.320 | a lot
00:45:51.800 | simple
00:45:52.100 | arithmetic
00:45:52.500 | problems
00:45:53.540 | pattern
00:45:54.100 | recognition
00:45:54.660 | piece
00:45:55.100 | of the
00:45:55.340 | neural
00:45:55.520 | networks
00:45:56.080 | recognize
00:45:56.840 | there's
00:45:57.420 | arithmetic
00:45:57.840 | addition
00:45:58.380 | problem
00:45:59.240 | prompt
00:45:59.760 | seeing
00:46:00.940 | there
00:46:01.420 | rules
00:46:02.500 | embedded
00:46:02.980 | informally
00:46:03.860 | architecture
00:46:04.660 | captures
00:46:06.460 | different
00:46:06.760 | numbers
00:46:07.260 | being
00:46:07.540 | added
00:46:07.800 | together
00:46:08.160 | here's
00:46:08.620 | result
00:46:08.900 | would
00:46:11.420 | novel
00:46:11.740 | problem
00:46:13.180 | hasn't
00:46:13.760 | exact
00:46:15.440 | problem
00:46:15.960 | knows
00:46:19.360 | seems
00:46:19.580 | to be
00:46:20.180 | doing
00:46:21.840 | you're
00:46:22.280 | going
00:46:22.800 | generalization
00:46:23.720 | about
00:46:24.160 | properties
00:46:24.680 | numbers
00:46:24.960 | beyond
00:46:26.640 | language
00:46:27.260 | models
00:46:27.620 | at least
00:46:28.060 | we're
00:46:28.180 | seeing
00:46:29.000 | scaling
00:46:29.880 | generalization
00:46:30.580 | would
00:46:31.000 | we're
00:46:31.980 | running
00:46:33.060 | which
00:46:33.500 | scale
00:46:34.240 | massive
00:46:34.480 | sense
00:46:36.320 | creativity
00:46:36.740 | you're
00:46:36.940 | talking
00:46:37.200 | about
00:46:37.360 | Antonio
00:46:37.820 | probably
00:46:38.220 | going to
00:46:38.480 | require
00:46:38.820 | other
00:46:39.220 | types
00:46:40.300 | systems
00:46:40.880 | maybe
00:46:41.600 | language
00:46:41.880 | models
00:46:42.580 | symbolic
00:46:43.020 | reasoning
00:46:43.720 | other
00:46:44.220 | types
00:46:44.740 | explicit
00:46:45.780 | thinking
00:46:46.760 | ontological
00:46:47.720 | organization
00:46:48.620 | there
00:46:49.020 | you're
00:46:50.720 | going
00:46:51.680 | generalization
00:46:53.280 | models
00:46:53.900 | people
00:46:54.080 | disagree
00:46:55.740 | possible
00:46:56.960 | systems
00:46:57.660 | right
00:46:58.960 | All right, what do we got next, Jesse?
00:47:00.840 | Next up is Michelle: I feel like I'm incredibly disorganized in my work life. What's the smallest possible first step I can take to generate real improvement?
00:47:10.680 | All right, that's a good question. A lot of people ask about this: like, what is step one if I want to get some sort of minimum effective dose, get some sort of sense of organization out of it?
00:47:22.380 | I thought about this a little. Here's what I would suggest.
00:47:25.880 | You're completely disorganized at work and at home, and you constantly feel like you're behind things, and maybe you're in your email inbox a lot or on your text threads, like reacting to crisis.
00:47:33.800 | I think you need some sort of master list. You need some place where you write down everything that is just in the landscape of things you might have some responsibility for or need to deal with.
00:47:44.520 | This is probably a large number of things, and you have to tend to it, right?
00:47:48.880 | So at the end of every day, just dump: what came up today that I'm just keeping track of in my head, what text message that I get that I'm kind of forgetting about, what is coming through on emails, and now I realize this is a bigger project I need to do.
00:48:02.820 | Just get it all written somewhere.
00:48:05.520 | Do it on like a legal pad so you can carry it with you, maybe a Google Doc, something like that. That's probably the minimum thing.
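To make that capture step concrete, here is a minimal sketch of what an end-of-day dump could look like if you kept the master list as a plain text file rather than a legal pad; the file name master_list.txt, the date-stamped line format, and the blank-line-to-finish convention are illustrative assumptions, not anything prescribed on the show.

```python
# Minimal sketch of an end-of-day capture dump into a plain-text master list.
# Assumptions (not from the show): the list lives in master_list.txt, each entry
# gets the current date, and an empty input ends the capture session.
from datetime import date
from pathlib import Path

MASTER_LIST = Path("master_list.txt")  # hypothetical location for the master list


def capture_session() -> None:
    """Prompt for open loops (emails, texts, projects) and append them to the list."""
    today = date.today().isoformat()
    with MASTER_LIST.open("a", encoding="utf-8") as f:
        while True:
            item = input("Open loop (blank to finish): ").strip()
            if not item:
                break
            f.write(f"{today}  {item}\n")  # one dated line per captured obligation


if __name__ == "__main__":
    capture_session()
```

The only point of the sketch is the one made above: everything lands in a single trusted place that you can review later, instead of staying scattered across your inbox, text threads, and memory.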
00:48:16.680 | permission
00:48:19.500 | brain
00:48:20.080 | constantly
00:48:20.640 | trying
00:48:21.160 | refresh
00:48:21.660 | remember
00:48:22.080 | everything
00:48:22.740 | could
00:48:22.900 | cause
00:48:23.280 | problem
00:48:24.000 | brain
00:48:24.920 | stress
00:48:26.620 | negative
00:48:27.300 | subjective
00:48:28.120 | experience
00:48:29.840 | forgot
00:48:31.220 | they're
00:48:31.360 | calling
00:48:31.680 | and saying
00:48:32.000 | where
00:48:33.580 | don't
00:48:34.220 | brains
00:48:34.440 | don't
00:48:35.560 | thinks
00:48:36.700 | could
00:48:36.960 | happen
00:48:37.480 | unless
00:48:38.240 | remembers
00:48:38.740 | something
00:48:39.020 | you're
00:48:39.240 | forgetting
00:48:40.680 | a lot
00:48:41.040 | cycles
00:48:41.740 | prevent
00:48:42.220 | negative
00:48:42.500 | experience
00:48:43.160 | happening
00:48:44.220 | reassure
00:48:45.080 | brain
00:48:46.640 | don't
00:48:47.480 | track
00:48:48.060 | everything
00:48:48.440 | anymore
00:48:48.980 | we're
00:48:49.540 | forgetting
00:48:49.960 | something
00:48:52.600 | throw
00:48:52.780 | stuff
00:48:54.920 | you know
00:48:55.820 | a few
00:48:56.060 | times
00:48:56.320 | a day
00:48:57.480 | don't
00:48:58.020 | sense
00:48:58.680 | least
00:48:59.780 | forgot
00:48:59.940 | about
00:49:00.700 | should
00:49:02.260 | wrote
00:49:02.980 | that's
00:49:04.500 | going
00:49:04.840 | a huge
00:49:05.120 | psychological
00:49:05.580 | difference
00:49:06.120 | that's
00:49:06.420 | foundation
00:49:06.960 | which
00:49:07.580 | management
00:49:08.420 | built
00:49:11.140 | figure
00:49:14.100 | you're
00:49:14.240 | going
00:49:15.460 | organize
00:49:15.880 | these
00:49:16.240 | bigger
00:49:16.460 | projects
00:49:17.600 | progress
00:49:18.700 | that's
00:49:19.080 | complicated
00:49:21.100 | master
00:49:23.100 | management
00:49:23.440 | equivalent
00:49:24.200 | saying
00:49:26.300 | 5,000
00:49:26.800 | steps
00:49:28.360 | comes
00:49:28.680 | trying
00:49:29.020 | yourself
00:49:29.660 | shape
00:49:31.000 | movement
00:49:32.820 | sedentariness
00:49:34.800 | first step
00:49:35.500 | in the
00:49:36.200 | physical
00:49:36.440 | world
00:49:38.100 | capture
00:49:38.520 | to use
00:49:38.880 | David
00:49:39.060 | Allen
00:49:39.640 | versus
00:49:40.260 | haphazardly
00:49:41.220 | keeping
00:49:41.460 | track
00:49:41.860 | things
00:49:42.100 | spread
00:49:42.420 | between
00:49:42.740 | emails
00:49:43.200 | chats
00:49:44.820 | brain
00:49:46.680 | really
00:49:48.400 | you're
00:49:48.560 | going
00:49:49.520 | after
00:49:49.900 | while
00:49:55.080 | reasonable
00:49:56.020 | friction
00:49:58.240 | going to
00:49:58.380 | believe
00:49:58.760 | impossible
00:49:59.240 | going to
00:49:59.320 | give up
00:49:59.800 | eventually
00:50:00.960 | going to
00:50:01.060 | start
00:50:01.260 | saying
00:50:01.780 | maybe
00:50:02.140 | could
00:50:03.140 | couple
00:50:03.280 | rules
00:50:04.200 | sense
00:50:04.860 | maybe
00:50:05.080 | we're
00:50:05.220 | going to
00:50:05.900 | look at
00:50:06.880 | calendar
00:50:08.440 | organically
00:50:09.600 | bottom
00:50:10.260 | management
00:50:10.860 | emerge
00:50:11.940 | think
00:50:12.080 | that's
00:50:12.420 | minimum
00:50:12.600 | effective
00:50:14.520 | going
00:50:15.700 | fully
00:50:16.200 | organized
00:50:20.260 | going
00:50:21.380 | worst
00:50:22.720 | stresses
00:50:23.280 | could
00:50:23.480 | caused
00:50:23.860 | disorganization
00:50:26.080 | thinking
00:50:26.280 | about
00:50:26.540 | Jesse
00:50:27.060 | podcast
00:50:27.540 | topic
00:50:27.940 | might
00:50:28.140 | actually
00:50:29.420 | quite
00:50:31.220 | thinking
00:50:31.580 | about
00:50:32.960 | minimum
00:50:36.180 | management
00:50:36.480 | organization
00:50:36.880 | system
00:50:40.140 | above
00:50:40.380 | water
00:50:41.080 | while
00:50:41.440 | because
00:50:42.740 | interested
00:50:43.260 | question
00:50:43.980 | summer
00:50:44.480 | example
00:50:44.940 | you're
00:50:45.380 | organized
00:50:45.920 | person
00:50:46.780 | takes
00:50:47.640 | cognitive
00:50:49.300 | effort
00:50:50.120 | maintain
00:50:52.220 | intricate
00:50:53.680 | management
00:50:53.980 | system
00:50:55.940 | minimum
00:50:56.200 | effective
00:50:56.860 | system
00:50:57.560 | downgrade
00:50:58.160 | something
00:50:58.400 | if you
00:50:58.660 | tried
00:50:59.540 | years
00:51:00.040 | you're
00:51:00.200 | going
00:51:00.320 | start
00:51:00.480 | missing
00:51:00.700 | stuff
00:51:01.340 | sustainable
00:51:02.080 | could
00:51:02.400 | couple
00:51:02.540 | months
00:51:03.020 | summer
00:51:05.040 | summer
00:51:05.280 | feeling
00:51:05.960 | you're
00:51:06.100 | really
00:51:06.300 | behind
00:51:06.840 | stressed
00:51:07.920 | thinking
00:51:08.200 | about
00:51:09.660 | ideas
00:51:10.020 | for it
00:51:10.640 | tuned
00:51:10.840 | I might
00:51:12.120 | week's
00:51:12.360 | podcast
00:51:13.040 | something
00:51:13.380 | you're
00:51:13.600 | going to
00:51:14.100 | maybe
00:51:17.700 | maybe
00:51:18.320 | maybe
00:51:19.680 | that's
00:51:20.160 | maybe
00:51:27.000 | involve
00:51:28.240 | regular
00:51:28.580 | electric
00:51:28.900 | shocks
00:51:31.220 | don't
00:51:31.420 | forget
00:51:32.440 | Pavlovian
00:51:34.860 | shocked
00:51:35.520 | every
00:51:37.600 | forget
00:51:38.480 | appointment
00:51:38.840 | you're
00:51:39.260 | going
00:51:39.420 | forget
00:51:39.580 | those
00:51:39.780 | appointments
00:51:40.100 | anymore
00:51:41.280 | right
00:51:43.020 | Duncan
00:51:44.760 | projects
00:51:45.400 | would
00:51:45.800 | tackle
00:51:46.660 | outside
00:51:47.800 | struggle
00:51:48.600 | enough
00:51:54.080 | Duncan
00:51:54.440 | going
00:51:54.620 | connect
00:51:56.900 | unexpected
00:51:57.960 | source
00:51:58.880 | maybe
00:51:59.740 | makes
00:51:59.900 | sense
00:52:00.500 | because
00:52:02.380 | Disney
00:52:05.160 | a lot
00:52:05.380 | things
00:52:05.540 | about
00:52:05.720 | Disney
00:52:07.080 | because
00:52:08.260 | Disneyland
00:52:09.180 | rereading
00:52:10.080 | first
00:52:10.660 | years
00:52:11.100 | about
00:52:11.620 | history
00:52:12.080 | Disneyland
00:52:12.440 | itself
00:52:13.500 | interesting
00:52:14.740 | analogy
00:52:15.980 | interesting
00:52:16.520 | study
00:52:16.920 | guess
00:52:17.140 | right
00:52:18.940 | 1940s
00:52:20.080 | Disney
00:52:21.180 | is like
00:52:21.480 | a big
00:52:21.900 | concern
00:52:22.400 | at this
00:52:22.760 | point
00:52:23.000 | they've
00:52:23.340 | a bunch
00:52:23.820 | movies
00:52:26.520 | really
00:52:26.740 | involved
00:52:27.380 | effort
00:52:29.400 | company
00:52:30.540 | Disney
00:52:30.800 | pretty
00:52:30.960 | famous
00:52:33.120 | little
00:52:33.500 | stressed
00:52:35.000 | dealing
00:52:35.840 | depression
00:52:37.560 | begins
00:52:41.920 | development
00:52:43.420 | scale
00:52:44.500 | train
00:52:45.800 | himself
00:52:47.260 | himself
00:52:48.680 | house
00:52:50.400 | acres
00:52:52.020 | train
00:52:54.680 | custom
00:52:55.760 | built
00:52:56.280 | locomotive
00:52:57.300 | actual
00:52:57.680 | steam
00:52:58.040 | locomotive
00:52:58.600 | but it's
00:52:59.520 | scaled
00:53:00.080 | I don't
00:53:00.620 | scale
00:53:01.020 | specific
00:53:02.560 | numbers
00:53:03.260 | but it
00:53:05.700 | enough
00:53:06.200 | could
00:53:08.420 | front
00:53:08.840 | think
00:53:09.060 | about
00:53:10.800 | wheels
00:53:11.700 | wheels
00:53:11.880 | power
00:53:12.160 | wheels
00:53:12.360 | what's
00:53:12.660 | plastic
00:53:14.140 | drive
00:53:14.380 | that's
00:53:15.780 | right
00:53:16.120 | and he
00:53:17.560 | thing
00:53:17.780 | built
00:53:18.240 | and he
00:53:18.480 | built
00:53:18.680 | rolling
00:53:19.000 | stock
00:53:19.440 | and then
00:53:20.240 | built
00:53:21.180 | track
00:53:22.620 | through
00:53:24.140 | property
00:53:24.900 | that he
00:53:25.540 | bought
00:53:25.880 | I think
00:53:26.880 | this was
00:53:27.100 | in like
00:53:27.300 | Beverly
00:53:27.520 | Hills
00:53:27.740 | or something
00:53:28.060 | like that
00:53:28.480 | and it
00:53:29.820 | elaborate
00:53:30.840 | right
00:53:31.120 | they had
00:53:31.680 | train
00:53:32.260 | bridge
00:53:32.600 | on it
00:53:33.080 | trestles
00:53:33.560 | that was
00:53:34.680 | so big
00:53:35.140 | it was
00:53:35.600 | enough
00:53:35.960 | that it
00:53:36.540 | under
00:53:37.220 | jurisdiction
00:53:38.240 | county
00:53:38.800 | actually
00:53:39.000 | have to
00:53:39.500 | inspected
00:53:40.160 | other
00:53:40.360 | bridge
00:53:40.800 | would
00:53:42.120 | built
00:53:42.740 | tunnel
00:53:47.040 | tunnel
00:53:47.260 | so it's
00:53:47.480 | a big
00:53:47.800 | enough
00:53:48.020 | tunnel
00:53:48.700 | people
00:53:49.060 | grownups
00:53:49.740 | riding
00:53:50.280 | train
00:53:52.440 | tunnel
00:53:52.760 | can't
00:53:53.180 | light
00:53:53.360 | on the
00:53:53.560 | other
00:53:54.060 | because
00:53:54.960 | through
00:53:56.260 | so that
00:53:56.940 | effect
00:53:57.620 | don't
00:53:57.860 | how long
00:53:58.380 | tunnel
00:53:58.660 | going
00:53:59.900 | buried
00:54:00.340 | completely
00:54:00.680 | landscaped
00:54:01.500 | whole
00:54:01.660 | thing
00:54:02.080 | forgot
00:54:02.880 | exact
00:54:03.140 | total
00:54:03.500 | length
00:54:03.940 | track
00:54:04.920 | maybe
00:54:05.380 | something
00:54:06.040 | quarter
00:54:07.780 | something
00:54:08.440 | anyways
00:54:09.420 | here's
00:54:11.340 | about
00:54:11.900 | that's
00:54:13.600 | incredible
00:54:14.040 | amount
00:54:15.940 | hobby
00:54:17.260 | speak
00:54:18.980 | incredible
00:54:19.380 | amount
00:54:20.140 | treasure
00:54:21.320 | build
00:54:21.780 | thing
00:54:24.520 | person
00:54:26.820 | large
00:54:27.260 | movie
00:54:27.600 | company
00:54:28.460 | doing
00:54:29.340 | different
00:54:29.680 | projects
00:54:30.600 | there's
00:54:31.220 | money
00:54:31.420 | things
00:54:31.680 | moving
00:54:32.320 | forth
00:54:34.460 | wondering
00:54:35.400 | Disney
00:54:37.500 | build
00:54:39.060 | project
00:54:41.480 | answer
00:54:42.940 | company
00:54:45.680 | flexibility
00:54:46.340 | that's
00:54:47.880 | answer
00:54:49.080 | digital
00:54:49.340 | world
00:54:51.580 | flexibility
00:54:52.440 | didn't
00:54:54.100 | demand
00:54:55.160 | because
00:54:55.540 | couldn't
00:54:56.120 | demand
00:54:56.960 | there
00:54:57.440 | email
00:54:57.900 | right
00:54:58.320 | there
00:54:59.720 | slack
00:55:00.160 | messages
00:55:00.620 | people
00:55:03.280 | autonomy
00:55:04.100 | because
00:55:04.820 | actual
00:55:05.300 | obstacles
00:55:06.180 | access
00:55:07.000 | every
00:55:07.440 | Disney
00:55:07.800 | needed
00:55:09.460 | financiers
00:55:11.400 | that's
00:55:12.620 | three
00:55:13.360 | train
00:55:14.480 | three
00:55:15.540 | train
00:55:16.480 | vacation
00:55:17.240 | Europe
00:55:18.080 | that's
00:55:18.700 | going
00:55:18.800 | to be
00:55:19.200 | weeks
00:55:20.080 | takes
00:55:21.440 | across
00:55:21.820 | country
00:55:23.220 | across
00:55:23.540 | we're
00:55:24.300 | wasn't
00:55:25.020 | paced
00:55:25.400 | things
00:55:26.420 | still
00:55:27.800 | found
00:55:28.640 | massive
00:55:29.060 | project
00:55:29.740 | about
00:55:30.380 | period
00:55:32.120 | what's
00:55:32.300 | going
00:55:34.580 | mindset
00:55:36.920 | mindset
00:55:38.580 | thing
00:55:38.880 | really
00:55:39.120 | important
00:55:43.220 | there's
00:55:43.420 | a lot
00:55:43.800 | gives
00:55:44.500 | schedules
00:55:46.500 | early
00:55:46.840 | you're
00:55:47.020 | working
00:55:47.700 | night
00:55:48.760 | build
00:55:49.200 | lunch
00:55:49.380 | break
00:55:49.640 | around
00:55:49.900 | doing
00:55:50.460 | right
00:55:50.860 | something
00:55:51.620 | becomes
00:55:52.040 | important
00:55:52.820 | could
00:55:53.140 | something
00:55:53.520 | sparks
00:55:54.220 | right
00:55:54.640 | you'll
00:55:55.280 | surprised
00:55:56.940 | we've
00:55:57.540 | effect
00:55:57.840 | a few
00:55:58.120 | weeks
00:55:59.020 | podcast
00:56:00.740 | about
00:56:02.200 | people
00:56:03.880 | counter
00:56:04.240 | intuitive
00:56:04.820 | unexpected
00:56:05.660 | response
00:56:06.380 | where
00:56:07.480 | these
00:56:07.620 | experiments
00:56:12.160 | turned
00:56:12.520 | there
00:56:13.040 | given
00:56:13.540 | schedule
00:56:14.640 | along
00:56:15.240 | their
00:56:15.460 | schedule
00:56:15.780 | could
00:56:16.680 | instead
00:56:17.640 | plenty
00:56:21.920 | things
00:56:22.840 | longer
00:56:24.180 | meanings
00:56:25.880 | would
00:56:26.000 | argue
00:56:26.480 | Disney
00:56:26.760 | argument
00:56:27.360 | would
00:56:28.940 | important
00:56:29.300 | enough
00:56:29.680 | project
00:56:30.540 | actually
00:56:30.980 | really
00:56:31.860 | progress
00:56:32.640 | you'll
00:56:33.380 | surprised
00:56:33.780 | by how
00:56:34.260 | progress
00:56:35.080 | here's
00:56:35.680 | second
00:56:35.940 | thing
00:56:36.640 | about
00:56:39.740 | don't
00:56:40.140 | screens
00:56:41.960 | another
00:56:42.360 | thing
00:56:43.000 | using
00:56:44.100 | inordinate
00:56:44.560 | amount
00:56:45.260 | people's
00:56:46.020 | don't
00:56:46.180 | realize
00:56:48.380 | default
00:56:49.220 | let me
00:56:51.300 | streaming
00:56:51.580 | service
00:56:52.840 | phone
00:56:54.100 | don't
00:56:54.420 | realize
00:56:55.280 | hours
00:56:56.320 | outside
00:56:57.400 | getting
00:56:57.620 | eaten
00:56:59.220 | experiment
00:57:00.000 | something
00:57:00.560 | digital
00:57:00.820 | declutter
00:57:01.640 | about
00:57:02.440 | digital
00:57:02.840 | minimalism
00:57:03.980 | month
00:57:04.480 | going
00:57:04.860 | these
00:57:05.080 | tools
00:57:06.320 | going
00:57:06.820 | phone
00:57:07.320 | social
00:57:07.540 | media
00:57:09.160 | allowed
00:57:09.580 | watch
00:57:09.940 | streaming
00:57:10.220 | thing
00:57:10.500 | if it's
00:57:10.940 | other
00:57:11.120 | people
00:57:11.740 | family
00:57:12.020 | movie
00:57:12.300 | night
00:57:12.680 | those
00:57:13.180 | rules
00:57:13.540 | place
00:57:13.840 | you're
00:57:14.460 | gonna
00:57:14.500 | bored
00:57:15.160 | first
00:57:16.000 | a lot
00:57:17.220 | don't
00:57:17.640 | leave
00:57:17.960 | another
00:57:18.500 | minutes
00:57:19.260 | going
00:57:19.700 | those
00:57:20.200 | minutes
00:57:20.560 | right
00:57:22.680 | really
00:57:22.820 | gonna
00:57:22.940 | start
00:57:23.120 | dinner
00:57:23.500 | another
00:57:24.620 | going
00:57:25.240 | bored
00:57:26.720 | boredom
00:57:27.100 | drive
00:57:27.500 | towards
00:57:27.700 | projects
00:57:28.800 | progress
00:57:30.200 | things
00:57:31.740 | example
00:57:32.300 | Walter
00:57:32.600 | Isaacson
00:57:33.240 | wrote
00:57:34.840 | of his
00:57:35.180 | initial
00:57:35.720 | historical
00:57:36.400 | bestsellers
00:57:37.900 | Einstein
00:57:38.200 | biography
00:57:38.960 | Franklin
00:57:39.240 | biography
00:57:41.140 | write
00:57:41.400 | these
00:57:42.280 | heard
00:57:42.560 | interviewer
00:57:45.540 | Warner
00:57:47.420 | Disney
00:57:48.880 | answer
00:57:49.500 | don't
00:57:49.600 | watch
00:57:50.200 | night
00:57:51.480 | would
00:57:52.120 | instead
00:57:52.840 | watching
00:57:54.100 | gonna
00:57:54.220 | write
00:57:57.140 | could
00:57:57.280 | write
00:57:58.020 | night
00:57:58.300 | little
00:58:00.080 | stuff
00:58:01.380 | surprised
00:58:02.740 | produced
00:58:04.000 | that's
00:58:04.500 | together
00:58:05.700 | project
00:58:06.140 | exciting
00:58:06.660 | enough
00:58:07.660 | fight
00:58:09.940 | screens
00:58:10.880 | experimentally
00:58:12.520 | those
00:58:12.900 | things
00:58:13.140 | together
00:58:14.420 | think
00:58:14.640 | might
00:58:14.880 | surprised
00:58:16.840 | happened
00:58:17.140 | Jesse
00:58:17.700 | train
00:58:18.220 | locomotive
00:58:19.840 | happened
00:58:20.300 | Disney
00:58:21.900 | train
00:58:23.880 | crash
00:58:24.520 | scalded
00:58:25.620 | little
00:58:27.840 | there
00:58:28.260 | injury
00:58:30.040 | we're
00:58:32.000 | train
00:58:34.660 | because
00:58:36.160 | inspiration
00:58:36.740 | needed
00:58:37.360 | Disneyland
00:58:39.860 | learned
00:58:40.200 | something
00:58:41.180 | turned
00:58:41.540 | attention
00:58:41.980 | creating
00:58:42.180 | Disneyland
00:58:43.060 | interesting
00:58:43.920 | depressed
00:58:46.240 | because
00:58:46.980 | didn't
00:58:47.540 | didn't
00:58:47.900 | a lot
00:58:48.100 | about
00:58:49.620 | really
00:58:53.100 | people
00:58:53.560 | worked
00:58:54.060 | talked
00:58:54.400 | about
00:58:55.700 | going
00:58:57.300 | there's
00:58:57.460 | a lot
00:58:59.260 | think
00:59:00.620 | think
00:59:00.980 | depressed
00:59:01.320 | because
00:59:03.640 | didn't
00:59:03.860 | mental
00:59:04.060 | health
00:59:08.740 | company
00:59:09.100 | right
00:59:09.580 | world
00:59:10.680 | company
00:59:11.480 | growing
00:59:13.740 | white
00:59:15.060 | right
00:59:16.780 | you're
00:59:17.520 | movie
00:59:17.960 | studio
00:59:18.520 | things
00:59:19.740 | going
00:59:19.880 | really
00:59:21.560 | comes
00:59:22.620 | European
00:59:22.960 | market
00:59:24.120 | that's
00:59:24.800 | revenue
00:59:26.420 | suddenly
00:59:29.440 | people
00:59:31.400 | their
00:59:31.500 | efforts
00:59:31.860 | towards
00:59:32.160 | government
00:59:32.480 | contracts
00:59:33.840 | you're
00:59:34.240 | having
00:59:34.860 | biggest
00:59:35.140 | movie
00:59:35.760 | world
00:59:36.180 | government
00:59:36.780 | contracts
00:59:37.720 | to make
00:59:38.240 | stuff
00:59:38.600 | for the
00:59:39.000 | it was
00:59:39.580 | animations
00:59:40.180 | for the
00:59:41.060 | effort
00:59:41.420 | propaganda
00:59:42.740 | stuff
00:59:43.940 | it would
00:59:45.260 | a bunch
00:59:45.820 | would
00:59:46.820 | animations
00:59:48.660 | fighter
00:59:49.260 | planes
00:59:49.740 | flying
00:59:50.080 | around
00:59:50.380 | globe
00:59:52.100 | propaganda
00:59:52.780 | videos
00:59:53.540 | propaganda
00:59:53.860 | videos
00:59:54.200 | which
01:00:24.320 | not what he wanted to be working on, and so I think he just kind of got depressed. He got paid for that, though, right? They got paid for it, but it was just to try to keep the lights on, because this is not the same as having like a hit movie. So it was like they had to fire a lot of people, and then they had a lot of labor issues right there; the animators had gone on strike, and this might have been pre-war. That really disillusioned him; he had all this disillusionment happening for years. And so, when you're really driven like that and the things you were working on and succeeding on are suddenly taken out of your life,
01:00:26.680 | I think he just got depressed.
01:00:28.000 | Disneyland took him out of it.
01:00:29.900 | He's like, we're going to build this park.
01:00:31.580 | And everyone thought he was crazy.
01:00:33.020 | He's like, we're going to make it happen.
01:00:34.060 | And it was.
01:00:36.140 | It was very successful.
01:00:36.780 | And that actually sort of saved Disney eventually, these parks.
01:00:39.740 | They generate a lot of money, Jesse.
01:00:41.080 | I don't know if you know this.
01:00:41.800 | When you go to Disneyland, they have abundant opportunities for you to spend money.
01:00:46.880 | Don't know if you know about it.
01:00:49.700 | It's almost as if they're trying to make money off of their guests.
01:00:54.680 | I always say, what are the two most powerful ways to make money off of people?
01:01:01.280 | A, convincing them that they might make a lot of money.
01:01:04.280 | So that's casinos.
01:01:04.920 | Or B, their kids.
01:01:07.540 | I want my kids to have the right experience.
01:01:09.880 | And so you write these checks.
01:01:12.300 | I had an aha moment.
01:01:13.500 | What else would you buy, though, other than a ticket and food and stuff?
01:01:16.680 | Souvenirs.
01:01:17.740 | So this was an aha moment I had.
01:01:20.300 | Not me, but actually the guy who does my hair figured this out, right?
01:01:23.580 | So we're in Star Wars World or whatever they call it at Disneyland.
01:01:26.740 | I don't know what it's called.
01:01:27.420 | They spent a billion dollars on just like this part of the park.
01:01:30.580 | And it's all like Star Wars-y.
01:01:32.000 | And we come across this thing where there's like a guy out there.
01:01:37.240 | They're promoting something.
01:01:38.120 | And we're like, what's going on here?
01:01:39.080 | And they're like, oh, it's a make-your-own-lightsaber experience, right?
01:01:42.940 | And they're like, oh, that sounds interesting.
01:01:45.420 | And he's like, yeah.
01:01:46.260 | And you know what?
01:01:47.200 | We have some spots.
01:01:48.000 | It's by reservation.
01:01:48.740 | We have some spots.
01:01:49.480 | Like, oh, yeah.
01:01:50.020 | This is interesting.
01:01:50.680 | Like, this could be fun.
01:01:51.360 | You have spots right now.
01:01:52.020 | Great.
01:01:52.300 | He's like, great.
01:01:52.780 | So how much is it?
01:01:53.520 | He's like, it's $237 a person.
01:01:55.680 | Weird number.
01:01:57.020 | Yeah.
01:01:57.480 | So we're like, well, we're not doing that, right?
01:01:59.440 | Because you had five people.
01:02:01.480 | Yeah, we're not going to spend that money.
01:02:02.960 | My wife's like, we're not spending $1,000 on lightsabers.
01:02:06.760 | Yeah.
01:02:07.120 | He's like, well, he's like, I'll tell you.
01:02:09.400 | He's like, yeah, I get it.
01:02:10.660 | I get it.
01:02:10.920 | He's like, you know, at this gift shop though, you can build plastic ones and it's like $30
01:02:16.760 | a lightsaber.
01:02:18.200 | Like, oh yeah, that's much better.
01:02:19.320 | Okay, good.
01:02:19.780 | Here's the insight my hair guy had.
01:02:23.360 | There's no, there was nothing behind that wall.
01:02:25.800 | There was no $237 build a lightsaber thing.
01:02:29.000 | The entire point of that is to get you to buy the $30 lightsabers, right?
01:02:33.840 | Because that'd be brilliant.
01:02:34.700 | They're like, hey, your kids want to build lightsabers?
01:02:37.300 | Like, yeah, like it's going to be $250.
01:02:39.040 | Like, boo.
01:02:39.900 | And then they're like, well, I guess you could, I guess you could do this $30 experience.
01:02:44.240 | Well, some people must pay the $250.
01:02:46.120 | That's probably what's really happening.
01:02:47.240 | But I like this idea that it was like just for, because it's a very effective way to get
01:02:51.420 | you to spend $30 on a lightsaber.
01:02:52.980 | No, it seemed like it was, I don't want to say it.
01:02:57.340 | But some people wouldn't shake a stick.
01:02:58.740 | Some people were, I'm trying to say it nicely.
01:03:01.220 | I think there were plenty of people in the Star Wars portion of the park who seemed like
01:03:06.000 | they would spend the $237.
01:03:07.660 | Yeah.
01:03:09.100 | And I would say they were, they're in their 20s, probably live with their parents and don't
01:03:21.160 | spend a lot of time in the gym.
01:03:22.220 | That's like, that's probably how I describe it.
01:03:23.500 | There was a certain demographic.
01:03:24.560 | See, I think about it totally differently.
01:03:26.020 | I think I could see like people abroad, like flying private over there and be like, all right,
01:03:29.520 | no problem.
01:03:30.040 | Like whatever.
01:03:30.720 | Yeah.
01:03:31.740 | But are they that interested in a lightsaber?
01:03:33.300 | Probably.
01:03:34.220 | If they fly there from like Japan or something.
01:03:36.560 | Yeah.
01:03:37.000 | Okay.
01:03:37.420 | So you're like, this is nothing.
01:03:38.900 | We spent so much money.
01:03:40.140 | Yeah.
01:03:40.740 | Yeah.
01:03:40.940 | I get that.
01:03:41.520 | We spent so much money to come here.
01:03:43.400 | Like, why not just add that?
01:03:45.200 | Yeah.
01:03:45.880 | Yeah.
01:03:46.320 | Yeah.
01:03:47.120 | Because you can do like private tours there.
01:03:49.880 | It's $800 a day or something like that.
01:03:52.680 | I mean, think about if you fly private there.
01:03:54.480 | I mean, that's 50K right there.
01:03:56.380 | Yeah.
01:03:58.040 | And then you do the private tour where you spend like a thousand bucks for your party
01:04:02.200 | and you get to go to the front of every line.
01:04:03.580 | So it's like a private tour.
01:04:05.100 | But all it is, is they don't want to just say, if you're rich, you don't have to wait
01:04:08.540 | in any lines.
01:04:09.060 | So it's like, oh, we have a private tour and our tour guides are able to bypass the lines.
01:04:13.600 | But all they do is just let you get on the rides right away.
01:04:17.400 | So yeah, you're spending a thousand bucks a day on that anyways.
01:04:19.460 | Yeah, you're right.
01:04:20.400 | The 237 for the lightsaber.
01:04:22.660 | Whatever.
01:04:26.080 | Point is, who was the original question from?
01:04:28.440 | Duncan.
01:04:29.840 | Duncan.
01:04:30.640 | Point is, Duncan, beware of lightsaber scams.
01:04:34.520 | I think we handled that one perfectly, Jesse.
01:04:36.740 | All right.
01:04:37.560 | Who do we got next?
01:04:38.180 | Next up is Fred.
01:04:39.620 | I have an internship in mainframe development, which involves a rare and valuable skill.
01:04:43.700 | Very few people have the skill set within my organization and many of them are nearing
01:04:47.400 | retirement.
01:04:47.880 | However, it's long hours and often overlooked by management.
01:04:50.560 | This fall, I have an opportunity to transition into a data analyst role for a third internship.
01:04:55.580 | However, there are drawbacks as well.
01:04:57.600 | How should I manage my internships?
01:04:59.280 | You know, I don't care as much about internships.
01:05:02.880 | Right.
01:05:04.220 | It's not your job.
01:05:05.040 | So the default there would be, yeah, do the data analyst one.
01:05:08.460 | You're just gathering data.
01:05:09.540 | Right.
01:05:10.600 | See what that's like.
01:05:12.060 | Maybe there's something about that you like.
01:05:13.280 | Like you're gathering information with these internships.
01:05:15.120 | I'm not going to overthink that.
01:05:16.560 | When it comes to actually choosing what you want to do for a job after the fact, especially
01:05:20.660 | if these internships open things up, each thing you did an internship in opens up a job opportunity.
01:05:26.060 | The question is always twofold when it comes to these career capital moves, right?
01:05:30.600 | The building up of rare and valuable skills and using them to construct your career.
01:05:34.000 | Is there an opportunity in this job to build career capital?
01:05:37.600 | So are there rare and valuable skills that I could develop?
01:05:40.080 | For the mainframe job that your mainframe internship could lead to, the answer seems to be clearly yes.
01:05:45.340 | You describe it as a rare and valuable skill.
01:05:47.880 | Most people don't know how to work with these mainframes anymore, but you need to keep them up.
01:05:54.440 | So yeah, yes.
01:05:54.440 | The second question you have to ask is, will there be an opportunity for me to cash in career
01:06:00.060 | capital if acquired to gain some autonomy or control over how my career unfolds?
01:06:05.660 | That's often the sticking point.
01:06:08.040 | And that's what you would really have to assess here.
01:06:10.340 | So if for whatever reason, this company is very rigid about people in their mainframe
01:06:17.200 | development group, no, this is just what that job is.
01:06:20.080 | And the salaries are here and that's it.
01:06:22.820 | And we don't really care.
01:06:23.720 | We feel like you're expendable.
01:06:24.840 | Then the answer to that second question is no.
01:06:26.860 | And you'd be wary about it.
01:06:27.980 | If on the other hand, you're like, look, yeah, they don't realize the value of this.
01:06:31.700 | But if I got really good at that and was like, hey, I'm keeping up 10 of these systems
01:06:38.000 | and now I want to change so that I'm doing this like remotely or I'm here once a week
01:06:41.300 | or whatever you want to do.
01:06:42.800 | And they're like, oh, OK, yeah, we don't want to lose you.
01:06:44.840 | If you think there would be a chance to apply your capital to get leverage, then I think it
01:06:49.000 | could be good whether or not they're recognizing the value right now.
01:06:52.160 | The classic place where the second question trips up people is law partners.
01:06:57.260 | There's an example from the book, So Good They Can't Ignore You.
01:07:01.240 | In law, especially like big law, so like working at the big law firms in the big cities, you
01:07:06.540 | for sure are building up a rare and valuable skill, right?
01:07:08.560 | Because you are mastering a specific part of the legal code that is literally very valuable
01:07:14.440 | to clients.
01:07:15.180 | And not a lot of people can do it.
01:07:17.060 | You're using your law degree and your brain and your ability to like really work hard to
01:07:20.640 | learn it.
01:07:20.980 | You build up a huge amount of career capital in law, but at big law firms, they give you
01:07:26.720 | a very limited number of options for investing that career capital, right?
01:07:30.700 | Really, the only option they give you is investing it into having more salary.
01:07:34.780 | If you get good at this, you can become a partner and then you can become a managing partner and
01:07:39.940 | what you're going to get in exchange for that is more money.
01:07:41.480 | And they are very rigid about it: that is it.
01:07:44.040 | If you're going to be at this firm, here is the path and here are the expectations.
01:07:48.860 | And the only thing you can open up by getting better is moving to the next step.
01:07:52.180 | And so it's a classic example of a place where, yeah, you can build up a lot of career
01:07:55.700 | capital, but they make it very hard for you to have flexibility in how you cash in that
01:08:00.180 | career capital to take control of your career.
01:08:01.900 | And that's a problem.
01:08:02.740 | And it's why a lot of lawyers end up with good bank accounts, but really unhappy because
01:08:07.160 | they got really good, but they have no choice with what to do with that except for just
01:08:10.280 | to make more money.
01:08:11.120 | And for a lot of people, other parts of their ideal lifestyle are then being trampled on because
01:08:16.840 | they don't have control over it.
01:08:17.800 | So you always have to ask both of those questions.
01:08:19.460 | So that is how you should evaluate the career opportunities to come out of these internships.
01:08:24.400 | Can I build a rare and valuable skill?
01:08:26.540 | And will I have options if I do?
01:08:28.380 | And that's what you're looking for is whatever has the strongest affirmative answer to both
01:08:33.860 | of those.
01:08:34.180 | When it comes to your internships, yeah, do the other internship, right?
01:08:36.940 | Like might as well open up more opportunities, learn more things.
01:08:40.000 | That information is not going to hurt you.
01:08:42.500 | What's going to matter is the choice you make for what job you go after.
01:08:45.480 | And then what you do once you have that job.
01:08:47.560 | So that would be my advice.
01:08:52.220 | I mean, there are some options for cashing out your law career capital, but they're hard.
01:08:56.280 | You have to leave the big firms.
01:08:58.080 | You have to try to do your own thing.
01:08:59.320 | You have to renegotiate your situation.
01:09:02.280 | I know someone, for example, who successfully renegotiated: I'll leave
01:09:07.680 | the partner track and we can keep my salary where it is.
01:09:11.360 | And I'm going to do this many hours of billables.
01:09:15.080 | So actually maybe the salary is going to be lower than I would be getting if I was doing
01:09:18.520 | 80 hours a week at billables or whatever, but I'm going to live remotely.
01:09:22.960 | I'm going to work remotely and live somewhere else.
01:09:25.060 | And where I'm living, what you're paying me for this like actual 40 hours of work I'm
01:09:29.660 | doing each week is way better than what most people are being paid here for working 40 hours
01:09:33.840 | a week.
01:09:34.200 | Sure.
01:09:34.540 | It's not as prestigious and it's not on the track and I'm not going to make a million five,
01:09:37.960 | but it's a good salary for a reasonable amount of work.
01:09:40.940 | And now I can live in this other part of the country.
01:09:42.300 | So I've seen that before.
01:09:43.320 | You've also seen people try to hang out their own shingle, and then you can have some
01:09:48.320 | control.
01:09:48.620 | But law, famously, I think is a classic example: you build a lot of capital, but you
01:09:53.580 | only have one thing you're allowed to invest it in.
01:09:55.060 | It's like the company store back in the old days of mining companies.
01:09:58.740 | Like they would pay you a good wage to be a miner, but the only thing you could do with
01:10:02.820 | it is spend it at the company store.
01:10:04.100 | You had to live in the company town.
01:10:05.520 | You had to live in the company housing and you had to buy from the company store and they
01:10:08.220 | kind of just got all that money back.
01:10:09.560 | That's how I think about some of those partnership track sort of elite jobs sometimes.
01:10:14.300 | All right.
01:10:15.880 | Who else do we got?
01:10:16.400 | Next up is Joel.
01:10:17.780 | I've noticed that much of Cal's goal-setting philosophy aligns with research on goal
01:10:23.420 | hierarchies: top-level goals, subordinate goals, down to specific goals for the current day.
01:10:29.020 | Has Cal ever seen this research or explored similar evidence-based frameworks in shaping
01:10:33.040 | his approach to goal setting or behavior change?
01:10:35.220 | I mean, I don't know.
01:10:36.180 | I find goal research to be so boring.
01:10:38.100 | There is this research that often comes out of business schools and there's often long acronyms
01:10:43.660 | and like here is the whatever 17 letter long word type of goal setting paradigm and it's
01:10:50.080 | really not that interesting to me.
01:10:51.300 | I don't find the research interesting.
01:10:52.320 | Like you give someone a framework, you're giving them structure to their thinking and planning
01:10:56.660 | that they didn't have before, and they do better.
01:10:58.320 | And the conclusion is, look, this is a good framework.
01:10:59.640 | So no, I, I kind of find that research boring, but I would say this, the type of things you're
01:11:05.940 | talking about, like multi-scale planning, et cetera.
01:11:07.820 | I don't see that as goal setting.
01:11:09.380 | I just see it as time management.
01:11:11.000 | Like what is time management in the end other than making intentional decisions about what
01:11:16.040 | should I do next?
01:11:17.560 | Like that's time management and there's a particular segment of people, sort of this
01:11:23.240 | modern, like educated knowledge worker, spend a lot of time on a computer screen type people
01:11:27.620 | where you have this tension between bigger picture things you want to work on that are going to
01:11:34.320 | be long-term important for you and maybe for your professional prospects and the dizzying
01:11:39.640 | whirlwind of digital distractions that just sort of like makes up your day to day.
01:11:42.760 | And so you have to be very intentional.
01:11:44.660 | Otherwise you're just in the whirlwind and nothing gets done.
01:11:47.380 | And you get stuck and you get stagnant and you're basically trying to prove your worth
01:11:50.620 | through your pseudo productivity, which is a young man's game and something that's not
01:11:53.560 | going to lead to like a sustainable life.
01:11:55.020 | And so you need some way of making a smarter decision in the moment about what to do next
01:11:59.080 | beyond just like who wants my attention right now.
01:12:01.540 | And so having multiple scales of thinking about what's important helps you trickle big picture
01:12:07.800 | ideas down to small-picture decisions about what to do next.
01:12:10.160 | To me, this is just common sense and it works well in practice, but I don't really see it
01:12:14.120 | as goal setting so much as I see it as making smart time management
01:12:17.280 | decisions in a current digital work culture where it is very hard on the fly without structure
01:12:24.480 | to make smart decisions about your time.
01:12:25.960 | Other people don't have this issue, right?
01:12:27.540 | I mean, if you're in a situation where you're not in a whirlwind of digital distraction workplace,
01:12:31.140 | it might be much easier.
01:12:32.520 | If you're working on one big project, you could run something like Oliver Burkeman's schedule, which is
01:12:37.080 | basically deep work for three hours on the thing you're doing that's important to you.
01:12:41.360 | And then just do your best with the rest of the day.
01:12:43.420 | See what you're in the mood for.
01:12:44.380 | Keep up with stuff that's urgent and just try not to work too much.
01:12:46.720 | Like that's fine.
01:12:48.360 | If, like, maybe you're a professional writer, for example, or an independent thinker or something
01:12:53.000 | like that.
01:12:53.820 | But if you are trying to build career capital and find meaning and move the needle in a
01:12:58.720 | sort of busy knowledge work job, I just, I think time management requires some care about
01:13:05.840 | what you want to work on next.
01:13:07.040 | It requires some care in terms of how do you make the decision about what's the right usage
01:13:11.940 | of my time and multiple scale seems to make sense.
01:13:14.340 | So I don't know if there's a lot of research on it, but there is a lot of good experience
01:13:17.820 | of real people who have tried this.
01:13:20.520 | All right.
01:13:20.960 | Do we have a call this week?
01:13:21.960 | We do.
01:13:22.680 | All right.
01:13:23.080 | Excellent.
01:13:23.320 | Let's hear it.
01:13:23.860 | Hello, Cal.
01:13:26.820 | My name is Jamie Chalmers, a long time listener, first time caller.
01:13:30.060 | I want to thank you for your books.
01:13:31.820 | They've been tremendously helpful in my creative career.
01:13:35.000 | I've got several different projects that run concurrently.
01:13:39.000 | And I was listening to your recent episode about Inbox Zero, which I thought was fascinating.
01:13:44.180 | In particular, the method that you use using Trello and your working memory text file
01:13:50.260 | to capture information out of email and plug it into your card system.
01:13:54.360 | Now, I've listened to that episode and you talking about that about half a dozen times
01:13:58.700 | to try to get my head around it.
01:13:59.760 | And while I've had some success in using the working memory file to get emails out of my
01:14:05.480 | inbox, as it were, I've not had a lot of success in translating those into my Notion task system.
01:14:11.900 | And so I kind of wondered whether you could just dig back into that a little bit more,
01:14:15.520 | explain some of the mechanics, perhaps the head spaces that you get into when you're
01:14:19.320 | focusing on translating that information into action.
01:14:23.140 | That would be super helpful because I can see that it would be a really useful way to go
01:14:27.520 | about things.
01:14:28.160 | And yeah, I'd love to find out more.
01:14:31.100 | So thank you once again for everything that you do.
01:14:33.620 | Massively important in my life.
01:14:36.000 | I appreciate you.
01:14:36.840 | I look forward to hearing from you.
01:14:38.040 | Thanks.
01:14:38.520 | All right, Jamie.
01:14:40.060 | It's a good question.
01:14:40.860 | And I think part of like the hang up here might be that I go from inbox to working memory
01:14:47.220 | dot TXT.
01:14:47.860 | And then from there, there's a lot of directions where that information can go.
01:14:53.460 | Only one of which is a task system.
01:14:56.400 | So like, let's do this again real briefly.
01:14:58.080 | So the basic idea here is how do you clear an inbox, right?
01:15:02.800 | I don't recommend going message by message necessarily and trying to dispatch each
01:15:09.220 | message completely before moving on to the next.
01:15:11.220 | The context switching there becomes a real cognitive load.
01:15:15.060 | It becomes very exhausting.
01:15:16.280 | It becomes, you know, that strain everyone knows, of like, why can't I just keep going through
01:15:20.380 | these messages?
01:15:21.040 | Why am I so resistant to it?
01:15:22.300 | It's because your mind can't keep switching from one topic to another so quickly.
01:15:27.640 | It's mentally really dragging on it, right?
01:15:30.800 | And so what I do is I go through my inbox.
01:15:33.160 | I grab, I don't handle the message.
01:15:36.360 | I just put a summary of like what that message demands of me into a text file.
01:15:42.220 | Really just as fast as I can type, right?
01:15:45.060 | It doesn't have to be, you know, clean or interesting or something like that.
01:15:49.460 | And then I have all of those in my text file.
01:15:52.060 | I can organize them.
01:15:52.960 | In fact, I was thinking what I'm going to do, Jesse, is I'm going to load up an inbox right
01:15:57.680 | now while we're talking.
01:15:58.560 | So I can just give an example of how I'm going to read real messages from my inbox right now
01:16:05.880 | and talk about like what would be my summary in my working memory.txt file.
01:16:10.720 | Okay.
01:16:10.980 | So like I'm loading up, I have so many inboxes.
01:16:13.220 | This is crazy.
01:16:13.580 | This is my personal, this is a personal inbox, an inbox that we do kind of behind the scenes
01:16:19.440 | stuff for the podcast and for like my books or whatever.
01:16:23.540 | All right.
01:16:24.040 | So I'm looking through this now.
01:16:24.920 | First email is a newsletter.
01:16:27.240 | So it doesn't matter.
01:16:28.160 | Next one is a meeting scheduling for an appointment with a trainer.
01:16:36.040 | So what would I put there?
01:16:37.660 | I would just type down schedule training.
01:16:41.100 | All right.
01:16:42.260 | Next one, ironically, is from a PT I used to work with checking in.
01:16:47.400 | So I'd be like, get back to their name.
01:16:51.280 | There's something from you, Jesse.
01:16:53.380 | So I put that in ignore.
01:16:56.740 | That's immediately where that goes.
01:16:59.440 | That's the button I press there.
01:17:02.060 | I got a scheduling email about a club I helped run at my kid's school.
01:17:06.220 | So I just put, again, get back to blah, blah, blah about whatever.
01:17:10.300 | I have an invoice, pay, pay invoice.
01:17:13.040 | Do you delete these emails after you make the note?
01:17:17.980 | So then I would be.
01:17:19.440 | Do you delete, make the note?
01:17:20.680 | I don't delete.
01:17:21.300 | I archive.
01:17:22.120 | Okay.
01:17:22.940 | Yeah, you are.
01:17:23.440 | I archive, right?
01:17:24.560 | And so there's enough information.
01:17:26.520 | So when I say like, get back to blank, I put their name.
01:17:29.820 | So now I know I can just type that name to Gmail and that email will come back.
01:17:33.920 | Invoice to pay, pay invoice.
01:17:36.880 | Okay.
01:17:37.780 | Respond to.
01:17:40.720 | It's a note from actually a well-known podcaster who I sent a note to and he got back to me.
01:17:46.100 | Rogan?
01:17:47.800 | It's not Rogan.
01:17:49.180 | Yeah, he doesn't like you.
01:17:50.280 | He doesn't like me.
01:17:50.840 | He's like.
01:17:51.820 | I'm convinced he doesn't like you.
01:17:53.200 | He's like, why don't you tweet me?
01:17:54.440 | You're moved by tweet me.
01:17:58.660 | Mailing list, mailing list.
01:18:00.280 | Someone sending me a galley of something.
01:18:02.620 | A South Korea publicity thing.
01:18:07.640 | Put on calendar, right?
01:18:09.420 | And something to schedule.
01:18:10.860 | It's a note to myself.
01:18:12.780 | Put such and such a schedule for two o'clock on Thursday.
01:18:16.040 | So that type, that's like what I'm writing.
01:18:17.960 | Hold on a minute.
01:18:18.400 | You emailed yourself?
01:18:19.380 | Because I was somewhere else.
01:18:21.000 | Okay.
01:18:21.420 | And then you're going to note that? That's actually kind of funny.
01:18:23.640 | Yeah, because I was, I was out and about.
01:18:25.700 | Like if I'm getting my haircut and we're like, oh, let's schedule the next
01:18:28.600 | haircut while I'm here.
01:18:29.560 | You had a lot going on at this hair appointment.
01:18:31.420 | This is a Disneyland emailing yourself.
01:18:34.320 | Key of today's episode.
01:18:35.600 | You would like this guy actually.
01:18:37.580 | It's real.
01:18:40.040 | Anyways, I'll email myself on the fly because I know when I next leave my inbox,
01:18:45.440 | that'll move to working memory dot TXT and then I'll take care of it.
01:18:48.340 | All right.
01:18:48.540 | So those are the type of things I'd write.
01:18:49.580 | Then you sort them by type.
01:18:51.240 | Right.
01:18:52.620 | So like a bunch of these were schedule.
01:18:54.400 | I'm just going to like copy and paste.
01:18:56.240 | When I say sort, I mean, I'm just copying and pasting text within a text file, plain text
01:18:59.980 | file.
01:19:00.220 | I'm going to put like the scheduling things all in a row.
01:19:02.980 | So I just have a bunch of like scheduling things like in a row.
01:19:06.800 | There's like getting back to people things.
01:19:09.180 | I can be like, all right, getting back to people.
01:19:10.560 | And I'll put those like next to each other or whatever.
01:19:12.660 | So I'm kind of like grouping these things by, by type, by type of message.
01:19:16.620 | Or if there's a bunch of things on the same subject matter, like if I had multiple, I did
01:19:20.680 | a lot of foreign press this week for some reason.
01:19:22.420 | So like I would have like multiple things about foreign press.
01:19:25.220 | I'd put all those things together.
01:19:27.060 | I'm just kind of like sorting things.
01:19:29.080 | Then I will go through and tackle these things by group.
01:19:34.180 | And it's just easier when the groups are all the same type of thinking.
01:19:37.820 | So now I have like in that list there, four or five scheduling things.
01:19:41.140 | I'm going to open up my calendar, open up my inbox, like, all right, now we're doing
01:19:45.800 | scheduling and doing this all at once has like a lot of advantages, right?
01:19:50.900 | So most of those, first of all, are not going to end up on my task list.
01:19:55.240 | I'm just going to go through one by one and do the scheduling.
01:19:57.660 | And some of these, there's a link to a calendar.
01:19:59.300 | I'll do it that way.
01:20:00.200 | And other ones I email back, like, here's the time I want to suggest.
01:20:03.020 | I'm just handling it right there.
01:20:05.000 | So I'll go back and find those again in my inbox and like respond to them as needed.
01:20:09.320 | I'm handling it right there.
01:20:11.040 | But because I'm doing all this scheduling together, it allows me to have some extra
01:20:15.760 | efficiencies as well, because I might be like, man, this is going to pepper my, I don't want
01:20:20.440 | to pepper my week with these things.
01:20:21.960 | All right.
01:20:22.200 | You know what I'm going to do?
01:20:23.000 | I'm going to make like Friday afternoon, my like appointment call time next week.
01:20:30.060 | And I'm just going to offer that time to everybody.
01:20:31.720 | And that way the rest of this week will stay clear, right?
01:20:34.600 | So seeing them all together, or I might say, I'm going to take these two things.
01:20:39.260 | I'm going to punt them and be like, you know what?
01:20:40.840 | I don't, let's, let's get at this later in the summer.
01:20:42.760 | I was like, this is feeling like there's too many things.
01:20:44.400 | When I see how many things I'm scheduling, I'm like, this is too much.
01:20:46.820 | I'm going to punt two of these things
01:20:49.080 | that aren't that important.
01:20:49.740 | I see it all together.
01:20:50.460 | I can make that decision more clearly.
01:20:52.060 | Right.
01:20:52.940 | And then I'm like, let me get back to people.
01:20:54.440 | And now I'm like in that mindset.
01:20:56.200 | Again, those are things where I'm just responding.
01:20:57.880 | Then some of these things will require actually putting in a task, right?
01:21:02.200 | Like the South Korean publicity thing.
01:21:04.240 | Oh, this is like a non-trivial amount of work I need to do here.
01:21:06.800 | I'm not going to do it right now.
01:21:08.540 | So I'm, maybe I'm going to add this to my, my Trello as a task card.
01:21:12.980 | Maybe I need to actually connect this to a deadline.
01:21:15.780 | So I'm going to put the deadline on the calendar.
01:21:17.860 | Like, this is like the drop dead deadline for getting these like answers back to this question.
01:21:22.760 | And I'll paste the questions into a Trello card and move on from there.
01:21:25.920 | So it's moving things from the inbox into hastily typed lines in a working memory.txt.
01:21:31.420 | Sorting those things manually into like groups and then dealing with those groups where I'm
01:21:36.640 | either ignoring them or dealing with them right away, scheduling something on my calendar or
01:21:42.020 | creating a task where the information, I'll point to the information.
01:21:44.940 | And I still do it the old fashioned way where I just copy the subject line and put it in the
01:21:49.040 | Trello card.
01:21:49.740 | So I know what to search for when I get to that task.
01:21:52.660 | And I'll be able to find it right away.
01:21:54.440 | I know people have been telling me there's ways to link directly to the message in Gmail.
01:21:58.100 | People keep emailing me this and I, I keep ignoring it.
01:22:01.180 | But yeah, there are more advanced ways of doing it.
01:22:03.520 | So hopefully that's helpful.
01:22:04.300 | Does that make sense, Jesse?
01:22:05.200 | Like what I'm talking about here?
01:22:06.100 | It feels like an extra step, but I'm telling you, like, this is much easier mentally than
01:22:11.260 | if I just went from each email to each email and tried to answer each one until I was done with it.
01:22:16.480 | Like, I'm scheduling this appointment.
01:22:18.200 | Now I'm responding to someone about this.
01:22:20.560 | Now I'm trying to get back to this person about this.
01:22:23.040 | Now I'm scheduling an appointment.
01:22:24.200 | It just becomes overwhelming and it's surprising that it does.
01:22:28.320 | But underneath the covers, it's just context switching.
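A minimal sketch, in Python, of the grouping step described above. This is an illustration only: in the episode the sorting is done by hand, with copy and paste inside a plain text file. The file name working_memory.txt is taken from the episode; the category-prefix convention ("schedule:", "reply:", "task:") and the script itself are assumptions added purely for the example.

    # Sketch (not the method described in the episode): group working-memory lines
    # by an assumed "category: note" prefix so each batch can be handled together.
    from collections import defaultdict
    from pathlib import Path

    WORKING_MEMORY = Path("working_memory.txt")  # file name from the episode; location assumed

    def group_by_category(lines):
        """Group lines of the form 'category: note' by their category word."""
        groups = defaultdict(list)
        for raw in lines:
            line = raw.strip()
            if not line:
                continue
            # Everything before the first colon is treated as the category;
            # lines without a colon fall into a catch-all "misc" group.
            category, sep, note = line.partition(":")
            if sep and note.strip():
                groups[category.strip().lower()].append(note.strip())
            else:
                groups["misc"].append(line)
        return groups

    def main():
        lines = WORKING_MEMORY.read_text(encoding="utf-8").splitlines()
        for category, notes in group_by_category(lines).items():
            print(f"== {category} ({len(notes)}) ==")
            for note in notes:
                print(f"  - {note}")

    if __name__ == "__main__":
        main()

Run against a file whose lines look like "schedule: training appointment" or "reply: PT checking in", it would print each group together, which mirrors the batch processing described above.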
01:22:31.260 | All right, we got a case study here.
01:22:32.800 | It comes from TAF.
01:22:35.400 | All right, TAF says, I was listening to you talk about the Thoreau schedule today and you
01:22:41.500 | essentially described the working schedule I've managed to carve out for myself over recent
01:22:45.520 | years.
01:22:46.260 | It resulted due to a variety of factors, including COVID and having a young family.
01:22:51.100 | I have a pretty mid-range marketing job for a CPG brand, managing a small team.
01:22:55.800 | And in my current role, the career capital I've developed enables me to take ownership
01:23:01.520 | of my schedule because the management team knows that I will deliver strong work regardless
01:23:05.620 | of where and when I work.
01:23:07.940 | I'll stop right there briefly to point out this is a common application of career capital.
01:23:13.280 | If you have rare and valuable skills and you deliver, people trust you.
01:23:18.540 | You do the things you say you're going to do when you say you're going to do them.
01:23:21.680 | You are now a super rare commodity and there will be a lot of accommodations because, oh my
01:23:27.500 | God, here's someone who actually does what they say they're going to do when they're going
01:23:30.100 | to do it.
01:23:30.500 | Like we do not want to lose this person.
01:23:32.060 | You have a lot of control there.
01:23:33.080 | So here now is the schedule that TAF created.
01:23:37.020 | So that being said, my schedule borrows from many concepts you've discussed.
01:23:40.560 | All right, so here's the schedule.
01:23:42.440 | I wake at 6 a.m. and read, but not on my phone.
01:23:45.020 | I go to my office at 7 a.m. and start my day clearing emails and checking in on core project
01:23:50.280 | timelines to make sure there's nothing urgent on the cards.
01:23:52.620 | I find that starting at 7 a.m. means I'm working before most of my colleagues are
01:23:57.220 | logged on, which means there are very few distractions or meetings booked during this time.
01:24:00.360 | And I can just focus on what I need to get done.
01:24:03.080 | Then I have time blocked my calendar from 7.30 a.m. to 10.30 a.m. for my highlight.
01:24:07.720 | This is a term that comes from the time dorks, where I focus on the one major priority deep
01:24:14.960 | work project that is most critical to my day.
01:24:17.480 | Then at 12 p.m., having completed five hours of work, I stop and work out.
01:24:21.380 | I'm outside and that allows me to think.
01:24:22.840 | And then once I come home, I cover off the last two to three hours of the day with any
01:24:28.180 | shallow work, admin, and meetings, and sign off at 4 p.m. to be with my family.
01:24:31.320 | In the evening after my daughter has gone to bed, my wife and I read for about one to two
01:24:35.140 | hours before bed.
01:24:36.060 | I always have a number of books on the go, a novel, something nonfiction, something
01:24:39.820 | theological, and a book of poetry so that I have plenty of choice for whatever mood I'm in.
01:24:43.280 | I basically read until I fall asleep, and I like the idea of reading at the very start
01:24:46.720 | and end of each day.
01:24:47.780 | Some additional habits that help.
01:24:49.820 | I have no social media apps on my phone.
01:24:52.180 | I use Trello to organize my work and personal life.
01:24:55.140 | I have one central board with my priorities for today, which contains anything critical.
01:24:59.320 | The rest fall into priority two or backlog.
01:25:02.120 | Anything that sits in backlog for a long time eventually gets archived if nobody raises
01:25:05.960 | it as important.
01:25:07.120 | I treat my Outlook as if it's an actual mailbox, so I close it once I'm finished with it and
01:25:12.280 | then go back to check it at specific intervals in the day.
01:25:15.000 | I've found this reduces the impulse of reading emails as they arrive and constantly being distracted.
01:25:19.760 | I also have a two-folder system for my email.
01:25:22.780 | My inbox has incoming mail or emails I need to address, and then filed has everything else.
01:25:28.240 | My reason is that once I've read an email, it can be filed, and if I need it again, I can search
01:25:31.920 | for it.
01:25:32.420 | This has reduced all email admin and unnecessary folder assignments or rules.
01:25:36.840 | Finally, I time block my work days with recurring meetings so that I'm automatically blocked off
01:25:43.440 | every day for my morning highlight, my lunchtime 10K run, and my time with my family at 4 p.m.
01:25:50.520 | This means every day has the same structure, and my calendar availability is prescribed for
01:25:54.320 | anyone booking meetings.
01:25:55.260 | I've never had any issues or pushback from colleagues with this approach, and people rarely book meetings
01:25:59.740 | across my time blocks.
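The recurring time blocks TAF describes can be set up directly in Outlook or any calendar app; for readers who want to see the structure spelled out, here is a minimal sketch that emits them as an iCalendar (.ics) file. The block names and approximate times come from the case study; the exact end times, the weekday-only recurrence, the time zone, and the output file name are assumptions added for the example.

    # Sketch (not TAF's actual setup): write recurring weekday time blocks to an .ics file.
    # Block names and rough times follow the case study; end times, time zone, and the
    # weekday-only recurrence (BYDAY=MO..FR) are assumptions for the example.
    from datetime import date

    TZID = "America/New_York"  # assumed time zone
    BLOCKS = [
        ("Highlight (deep work)", "073000", "103000"),
        ("Lunchtime 10K run",     "120000", "133000"),
        ("Family time",           "160000", "170000"),
    ]

    def vevent(summary, start_hms, end_hms, day):
        d = day.strftime("%Y%m%d")
        return "\r\n".join([
            "BEGIN:VEVENT",
            f"UID:{summary.lower().replace(' ', '-')}@example.invalid",
            f"DTSTAMP:{d}T000000Z",
            f"DTSTART;TZID={TZID}:{d}T{start_hms}",
            f"DTEND;TZID={TZID}:{d}T{end_hms}",
            "RRULE:FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR",  # repeat every weekday
            f"SUMMARY:{summary}",
            "END:VEVENT",
        ])

    def main():
        today = date.today()
        body = "\r\n".join(vevent(s, a, b, today) for s, a, b in BLOCKS)
        calendar = "\r\n".join([
            "BEGIN:VCALENDAR",
            "VERSION:2.0",
            "PRODID:-//sketch//time-blocks//EN",
            body,
            "END:VCALENDAR",
        ]) + "\r\n"
        with open("time_blocks.ics", "w", newline="") as f:
            f.write(calendar)

    if __name__ == "__main__":
        main()

Importing the resulting file into a calendar would produce the same kind of standing blocks TAF books as recurring meetings.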
01:26:00.840 | So there we go.
01:26:02.640 | Someone where it's not a sexy story of, I became the only nuclear physicist in the world who
01:26:10.100 | could handle this, and therefore I have a job where I work one day a week and surf all day.
01:26:13.280 | It's not a sexy story like that.
01:26:14.940 | It's just someone who does their job very well.
01:26:16.760 | They deliver.
01:26:17.340 | They're reliable.
01:26:18.040 | They do high-quality work.
01:26:19.840 | They combine this with lifestyle-centric planning or lifestyle engineering or lifestyle architecting,
01:26:24.320 | whatever we want to call it, to work backwards from their ideal lifestyle, and it's not something
01:26:27.960 | that would catch your attention if you hear it described, but it is a really good lifestyle.
01:26:31.500 | Reading in the morning, done at work at 4, doing a long run and workout in the middle of the
01:26:36.700 | day, get stuff done, deep work every day, meetings are constrained, email checks are not all the
01:26:44.800 | time, people are okay with it because they deliver, and the life is really good.
01:26:49.460 | So I think that is a really good example.
01:26:52.740 | I appreciate TAF sending in that case study.
01:26:55.620 | All right, so we have a good final segment coming up here.
01:26:58.840 | We got some Cal Network artwork and a what I'm not reading segment to get into as well, but
01:27:04.780 | first let's take a quick break to hear from some sponsors.
01:27:07.220 | Jesse, I have a question for you.
01:27:10.120 | How many people do you think stopped me when I was at Disneyland recently to compliment how
01:27:17.040 | good my shave looked?
01:27:19.080 | Three.
01:27:19.560 | It was 1,700 people.
01:27:21.040 | All right.
01:27:22.300 | Now, whether or not that's actually true, I will say I have been enjoying shaving more
01:27:26.060 | recently because I have been using Harry's.
01:27:30.000 | Harry sends the best quality razors right to your door for a fraction of the price of the
01:27:34.560 | big brands.
01:27:35.160 | We're talking about really good German engineered blades, these nice, comfortable sort of rubber
01:27:40.480 | coated handles.
01:27:41.420 | I really like them, but they also have shaving products, and you can get excellent shaving cream delivered
01:27:46.620 | right with your razors.
01:27:47.960 | One thing I like is that you can also get a richly lathering, skin-softening body wash in scents
01:27:54.700 | like Redwood, Wildlands, and Stone.
01:27:58.020 | Interestingly, Jesse, richly lathering skin softening is how a lot of people describe
01:28:02.600 | this podcast.
01:28:03.520 | So there you go.
01:28:04.560 | They send this all to your door.
01:28:06.240 | It's automatic.
01:28:06.840 | You set it up so you don't have to remember to go to the store.
01:28:09.440 | You don't have to add another task to your to-do list.
01:28:11.660 | Before I started using Harry's, I was basically in the Stone Age when it came to shaving.
01:28:17.800 | I might as well have been scraping my face with a piece of sharpened flint.
01:28:22.140 | Take your own grooming out of the Stone Age.
01:28:24.400 | Get Harry's.
01:28:25.020 | Normally their trial set is $10, but right now you can get it for just $6 if you go to
01:28:33.180 | harrys.com slash deep.
01:28:34.880 | That's our exclusive link, harrys.com slash deep to get a $6 trial set.
01:28:42.380 | I also want to talk about our friends at ShipStation.
01:28:44.660 | If you ask Jesse where all of these packages he orders from e-commerce stores come from,
01:28:51.280 | he will say, and this is absolutely true, Jesse, you can back me up on this, a stork delivers
01:28:55.500 | them.
01:28:55.760 | He told me this just earlier today.
01:28:58.000 | He thinks that's how they come.
01:28:58.940 | But for those of you who actually run an e-commerce store, you know packages don't arrive
01:29:03.200 | by magic.
01:29:04.240 | For a lot of companies, it is a huge source of labor and stress.
01:29:08.220 | This is where ShipStation can make your life so much easier.
01:29:12.900 | Last year alone, over 700 million orders were fulfilled with ShipStation, half of which were
01:29:20.760 | creatine shipments going to Jesse.
01:29:22.520 | With ShipStation, you can sync orders from everywhere you sell into one dashboard.
01:29:26.900 | This is key.
01:29:27.720 | You have one dashboard and you can replace manual tasks with custom automations to reduce shipping
01:29:32.460 | errors.
01:29:32.800 | And all of this is at the fraction of the cost you probably were already spending for
01:29:37.200 | fulfillment.
01:29:37.640 | Now, here's a couple of things about ShipStation that caught my attention.
01:29:40.520 | It's the fastest, most affordable way to ship products to your customers because it has
01:29:44.340 | discounts up to 88% off UPS, DHL Express, and USPS rates and up to 90% off FedEx rates.
01:29:53.460 | It also seamlessly integrates with the services and selling channels you are already using so you can manage all of your orders over multiple different channels and systems in one easy-to-see dashboard.
01:30:07.800 | All right, you want to hear something cool?
01:30:08.920 | During the time I've been reading this spot, 1,400 packages were shipped with the help of ShipStation and none of them, Jesse, were delivered by a stork.
01:30:18.620 | Come on.
01:30:19.020 | Upgrade to a smoother shipping experience.
01:30:21.840 | Go to ShipStation.com slash deep to sign up for your free trial.
01:30:25.680 | No credit card or contract is required and you can cancel any time.
01:30:30.200 | That's ShipStation.com slash deep.
01:30:33.580 | All right, let's get back to our show.
01:30:36.100 | All right, so in our final segment, I want to do a twist on my what I'm reading segment that's called what I'm not reading.
01:30:43.560 | But first, Jesse, I wanted to quickly cover a new piece of Cal Network artwork that came in.
01:30:50.160 | I do have to sort of ding this person a little bit because a lot of people have been making sort of like custom artwork.
01:30:57.020 | In this case, they just took an existing photo of me and added a copy of a fake Cal Network book to it, but I still think it is interesting.
01:31:06.060 | So here we go.
01:31:06.720 | We'll put it on the screen here for people who are watching instead of just listening.
01:31:10.420 | And what we've got here is a – this is my author photo actually.
01:31:15.000 | I don't know if you've seen it.
01:31:15.780 | It's what's on my book jacket.
01:31:16.800 | Cal Network with his Georgetown tank top.
01:31:20.260 | That's what I wear to teach.
01:31:22.980 | Veining Out, as I constantly do, with Austin Aviator Sunglasses, holding Cal Network's hit bestselling book, I'm the Boss Now, which is his career advice book.
01:31:34.400 | So there we go.
01:31:34.880 | It's my author photo with a mocked-up book.
01:31:39.060 | So I always appreciate some good Cal Network artwork.
01:31:42.560 | All right.
01:31:43.520 | So to our final segment, I have been doing recently what I'm reading where I talk about like interesting articles and books that either cover interesting ideas you might like or I think are just good books or articles for those looking for some more depth.
01:31:54.240 | I want to rant briefly here today.
01:31:56.560 | I'm going to call this segment what I'm not reading.
01:31:58.620 | It's a particular type of article that's been common recently, and I want to give you permission to ignore every single one of them because they're nonsense and are going to cause unnecessary stress.
01:32:07.360 | I have loaded here a sample of one of these type of articles that I'm going to suggest you don't read.
01:32:13.260 | I'll put it on the screen here for those who are watching instead of just listening.
01:32:17.620 | The article is from Fortune, and here's the headline.
01:32:22.680 | OpenAI CEO Sam Altman says AI can rival someone with a PhD just weeks after saying it's ready for entry-level jobs.
01:32:31.120 | So what's left for grads?
01:32:33.760 | This came out on June 20th.
01:32:36.480 | This is an article. I saw this because, like, a Google notification just showed it to me on my phone.
01:32:42.460 | There are a lot of articles like this where there is some sort of over-the-top alarmist headline which makes anyone who's just loosely following AI be like, oh, my God, we're so screwed.
01:32:51.740 | But I want to look a little bit deeper at this article to try to explain why you can ignore this type of thing.
01:32:58.140 | All right.
01:32:59.060 | So if we read this article, I have a couple quotes here.
01:33:02.380 | Here's the first paragraph.
01:33:04.200 | Earlier this month, OpenAI CEO Sam Altman revealed that the technology can already perform the task equal to that of an entry-level employee.
01:33:13.120 | Now on a podcast posted just last week, the ChatGPT mastermind went even further, saying AI can even perform tasks typically expected of the smartest grads with a doctorate.
01:33:23.680 | Soon after, it says, companies like Amazon have admitted they will soon cut their corporate ranks thanks to AI, and Anthropic CEO Dario Amodei is warning that the technology could wipe out half of all entry-level white-collar jobs.
01:33:36.860 | It begs the question, what jobs will be left for those tossing their graduation caps into the air in the coming years?
01:33:43.580 | There are a lot of articles like this where tech CEOs are saying pearl-clutching, groin-tightening, scary type of things like this.
01:33:52.640 | This is all nonsense.
01:33:55.120 | And let me try to explain to you why.
01:33:57.400 | Okay.
01:33:58.340 | First of all, let's start with the claim here that, and I'm going to read this, AI can perform tasks typically expected of the smartest grads with a doctorate.
01:34:07.580 | What are they actually talking about here?
01:34:09.640 | What they're actually talking about here is the fact that OpenAI carefully tuned one of their foundational models to work on a specific type of math competition question.
01:34:20.060 | So they hired, we got some correction on this from a listener.
01:34:25.060 | They used a data set from a company that hired math PhDs and paid them $100 an hour to write math problems of this type with step-by-step solutions.
01:34:35.940 | And then they could use reinforcement learning techniques to try to tune one of these foundation models to do well on this very specific type of math problem.
01:34:43.300 | A professor involved in this project said, oh, these are hard math problems, not the type you would assign an undergrad, but the type you might assign a graduate student.
01:34:52.340 | From there, we get AI can now do PhD-level workers' jobs.
01:34:59.420 | That is a nonsense leap from a very specific type of math competition problem that is like of the type that might get assigned in a graduate student problem set to AI is doing graduate-level jobs.
01:35:15.420 | And not only is it a huge exaggeration, but as we went over in the What I'm Reading segment a few weeks ago,
01:35:22.800 | even those results are highly contested because OpenAI was like, look, these are hard problems and we can do them.
01:35:29.660 | And independent research firms said, great, let's take these models and we'll put them on other similar math competitions that happened recently.
01:35:37.020 | And they did terribly, leading to the idea that they had been overfit to this very specific type of problem that happened to be really well suited to the reinforcement learning techniques that we know how to use now on AI.
01:35:48.880 | So that is a massive exaggeration.
01:35:50.840 | Another massive exaggeration that's happening with a lot of this reporting is conflating of the post-pandemic tech downturn, which is leading to lots of layoffs.
01:36:01.640 | It's getting a little bit better now, but it's leading to lots of layoffs because there was a huge boom in the tech industry during the pandemic.
01:36:07.280 | And post-pandemic, it's cyclical.
01:36:09.100 | It's a hard job market there.
01:36:11.340 | Companies are cutting back.
01:36:13.280 | We put a lot of money into a lot of these areas.
01:36:15.900 | We need to cut back because we need our profit ratios to be higher for the stock market.
01:36:21.680 | And just what happens?
01:36:22.780 | Companies are cyclical.
01:36:23.640 | So there's a cutback.
01:36:25.360 | And there's a lot of this sort of disingenuous conflating by these reporters that make it seem like, without maybe necessarily 100% claiming it, that these job losses are because AI is automating the jobs.
01:36:39.580 | This is nonsense.
01:36:40.780 | It is not what is happening.
01:36:43.400 | Yes, Amazon is cutting.
01:36:46.500 | Meta cut a bunch of people.
01:36:48.020 | Microsoft is cutting a bunch of people.
01:36:50.560 | Because they're bloated.
01:36:52.720 | They cut those revenue numbers.
01:36:55.240 | They cut that employment.
01:36:56.380 | Those expenditure numbers go down.
01:36:58.120 | Their profitability goes up.
01:36:59.200 | They got too big.
01:37:00.320 | They are not replacing hundreds of thousands of people or tens of thousands of people with AI that's automating them.
01:37:07.220 | AI cannot do that yet.
01:37:09.780 | So I think that is very disingenuous.
01:37:11.440 | I saw an article the other day that said, look, CS majors are cratering.
01:37:15.960 | They're going down.
01:37:16.720 | And it's because AI is taking all the jobs.
01:37:19.160 | It's like, hold on a second.
01:37:20.540 | CS majors are down because the tech industry is down.
01:37:25.100 | The same thing happens every time the tech industry goes down.
01:37:28.700 | When I was a computer science undergraduate, we had the same issue.
01:37:33.720 | Post .com bust in early 2001, the tech industry contracted.
01:37:39.380 | Majors went down because there were fewer jobs.
01:37:43.280 | And then majors went back up again when the Web 2.0 boom brought a lot more investment in.
01:37:48.520 | There were more jobs again.
01:37:49.400 | There's a contraction post-pandemic.
01:37:51.580 | Jobs, except for, you know, specialized AI jobs, like if you're a machine learning person, are contracting some.
01:37:58.280 | So majors go down.
01:37:59.200 | And yet people will conflate it and say, oh, it's because AI is taking the jobs.
01:38:03.160 | AI is not replacing software developers.
01:38:07.440 | Most of the cuts at these companies are not software developer jobs.
01:38:10.780 | So I think it's really disingenuous reporting, but there's a lot of this going on.
01:38:14.640 | Later in the article, we see some of this evidence.
01:38:18.560 | So here, for example, they do note, and I'm reading now from later in the article:
01:38:26.100 | However, in the tech industry in particular, volatility in the jobs market is nothing new, said Art Zeile, CEO of the tech career platform Dice.
01:38:32.700 | After all, nearly 600,000 tech employees lost their jobs between 2022 and 2024, according to Layoffs.fyi.
01:38:39.740 | Yes, the tech industry has really post-pandemic contracted.
01:38:43.320 | So yes, there's a lot of job cuts.
01:38:45.500 | That's not because in 2022, before ChatGPT was out, AI was taking those jobs.
01:38:51.320 | It contracted, and then it will grow again once more investment capital comes back in.
01:38:57.180 | Look at, like, Dario Amodei saying technology could wipe out half of all entry-level white-collar jobs within, like, two years, or whatever he said.
01:39:03.800 | All right, here's the reality.
01:39:05.120 | It's not going to do it, and he's just blowing hot air.
01:39:07.640 | And Sam Altman knows that AI is not taking PhD-level jobs because it's not really taking any jobs right now.
01:39:15.860 | They're blowing hot air.
01:39:17.300 | Why are they blowing hot air?
01:39:18.960 | Because it requires a massive amount of investment capital, ongoing investment capital, to keep these places afloat.
01:39:25.620 | OpenAI is desperately in need of this massive loan from SoftBank, this huge, many, many billion-dollar loan, which is happening in tranches, to go through.
01:39:35.720 | And they need excitement around their tools.
01:39:38.680 | And the more you feel like these tools are going to, in the future, be immensely disruptive, the more you overlook right now that they're losing $3 to $4 billion a year.
01:39:47.480 | So it is in the interest of the CEOs of tech companies to push any possible narrative of very large world-shaking disruption because you will overlook any issues with their business right now if you think where they're going is going to be the biggest disruption that we've had sort of like in the history of humanity.
01:40:07.880 | So they are now just spewing anything they can get attention for because they know they will get articles like this.
01:40:13.900 | They know they can look at a model fine-tuned to do well on math problems, but not really because when other people tested it, it didn't, and say, well, now even PhD-level jobs are gone.
01:40:24.320 | That is a nonsense leap, just like it was a nonsense leap when Sam Altman talked about how AI was going to help us build Dyson spheres around the sun to power the growth of the universe.
01:40:35.580 | Just like it was a massive leap when Scott Alexander very confidently says in Project 2027 of like, well, yeah, I mean, pretty soon the AI is not only going to be able to program its own super intelligences, it's going to build self-replicating robots that will build the data centers, and this will all happen in the next three years.
01:40:52.440 | So anyways, I wanted to do a little bit of a rant and say, you can feel confident right now when you see like these like super alarmist headlines that don't reflect anything you've seen in your own life or your own industry, your own career.
01:41:06.700 | All these jobs are gone, and it's quoting some sort of tech CEO.
01:41:10.120 | You can essentially ignore those for now.
01:41:12.800 | Follow the more like tech reporting, tech journalism, but also just wait to actually see major changes in your own world.
01:41:20.260 | I think the hype coming out of these CEOs is becoming almost like parody, and this has now entered my list of what I'm not reading: alarmist articles about whatever the latest chaotic, crazy, nonsensical hot-air thing Sam Altman has said, which will be widely repeated by outlets looking to get people to click.
01:41:39.580 | All right, there we go.
01:41:40.300 | That's my rant, Jesse.
01:41:42.020 | Wait, so do you have notifications on your phone?
01:41:44.160 | I don't know what I, okay, let me rant about this.
01:41:48.060 | So you have to have this Google app on your phone.
01:41:52.340 | They force you to have it, or at least I had to get it, because I couldn't, like, consistently log in to Gmail.
01:42:02.240 | You have to have this Google app.
01:42:04.220 | It's this, so you have an Android?
01:42:05.520 | No, I have an iPhone.
01:42:07.140 | Oh, you have an iPhone.
01:42:07.880 | Yeah.
01:42:08.140 | So something about like the two-factor authentication.
01:42:10.020 | You have to have the Google app to use, like, Google things.
01:42:14.360 | And so that app just like shows me, I don't know, notifications, I guess.
01:42:19.280 | Like it just will pop up, like these articles will pop up in like little bars on my iPhone.
01:42:23.160 | But I can't get rid of the Google app because then I can't use my email.
01:42:27.300 | Can you turn off the notifications?
01:42:28.740 | I don't know.
01:42:29.440 | Is that something I can do?
01:42:31.020 | I don't.
01:42:33.420 | It cracks me up.
01:42:34.800 | I have no idea.
01:42:35.580 | Let's see here.
01:42:36.320 | I don't know, probably.
01:42:37.080 | But anyways, because I read a lot of AI content in my job as like a tech, when I do tech journalism, it's always showing me these nonsense articles.
01:42:45.360 | It's always like, Dario Amodei says you have 17 minutes left to live before AI-driven robots harvest your brains to lubricate their self-replicating robot machines.
01:42:58.640 | It's just like these type of headlines again and again and again.
01:43:00.920 | So I see a lot of this nonsense and it's what I am not reading.
01:43:04.220 | Because again, every time you look at it, it's like some super exaggerated claim like here, or they just make it up.
01:43:10.320 | Amodei is just like, half the jobs will be gone next year.
01:43:14.140 | And the reporter's like, oh, that sounds bad.
01:43:15.520 | Let me get a headline about that.
01:43:16.500 | That would be really bad, right?
01:43:17.480 | It's great.
01:43:19.140 | It's crazy.
01:43:19.620 | And you're like, well, how is that going to happen?
01:43:21.320 | I don't know.
01:43:21.860 | Key unlocks.
01:43:23.200 | It's going to happen.
01:43:23.920 | I don't know.
01:43:24.760 | I could write these articles.
01:43:25.700 | I should just put out reports, like, as the head of Deep Media LLC, and be like, we're going to be able to fly robotic pterodactyls next month.
01:43:35.280 | All right.
01:43:36.560 | Why not?
01:43:37.020 | Like that would be a big deal.
01:43:38.560 | Bees with lasers are going to start murdering our dogs by 2028.
01:43:45.580 | Sure.
01:43:47.060 | Right.
01:43:47.500 | I mean, I don't know.
01:43:48.140 | It's possible.
01:43:49.400 | Lasers exist.
01:43:51.380 | Bees exist.
01:43:53.840 | There's someone, there's the RoboBee project that, you know, happened at Harvard 20 years ago.
01:43:59.760 | It's possible.
01:44:00.600 | I sometimes feel like this is what, there's so much interesting stuff happening with AI and so much particular impacts and good stuff and bad stuff.
01:44:08.160 | There's so much serious reporting happening that I hate that all this junk is getting out there.
01:44:12.120 | I mean, the robot bee thing is true, so be careful about that.
01:44:15.540 | All right.
01:44:16.160 | That's all the time we have for today.
01:44:17.340 | We'll be back next week with another episode of the podcast.
01:44:20.380 | And until then, as always, stay tuned.
01:44:23.100 | All right.
01:44:24.200 | Well, if you're looking for a counterbalance to my rant against Sam Altman's claims about AI, check out episode 349, where I actually look at Sam Altman's productivity techniques from earlier in his career, where I think there are actually a lot of good ideas.
01:44:40.580 | So, you can't deny this guy was successful, even if we don't trust what he's saying now.
01:44:44.000 | Check out that episode about how he got there.
01:44:46.040 | I think you might like it.
01:44:47.840 | So, as someone who writes and talks a lot about producing meaningful stuff in a distracted world, I always get excited when prominent individuals give us insight into their own processes for achieving this goal.
01:45:01.000 | So, you can imagine how happy I was when I saw Tim Ferriss recently linked to a blog post that was titled simply Productivity that was published in 2018 by OpenAI's Sam Altman.