The Real Reason You Can’t Put Down Your Phone (and What To Do About It) | Cal Newport

Chapters
0:00 This is your brain on phones
47:00 My 11-year-old is bullied for not having a smartphone. What should I do?
55:40 How can I overcome my irrational urge to show off on social media?
59:12 How is video game addiction different than phone addiction?
63:24 Can I become addicted to messaging?
66:43 Is it ok to use newspaper apps on my phone?
74:07 A 5th grader drops her Apple Watch
78:14 An aspiring photographer and Instagram
82:54 The 5 Books Cal Read in September 2025
I've been talking recently about the idea that making your life deeper will make your life better. And this is true, but it's not always enough. For some people, the constant allure of their phone is so strong, so inescapable, that it can seem impossible to find any sort of freedom. So today I want to talk about how to create relief from your phone overuse, to gain enough breathing room that you can actually pay attention to making the other parts of your life more meaningful.

Now, I have a very particular approach that I want to take here. I'm going to start inside your skull, deep within the folds of your brain, because I think identifying exactly what is happening among your neurons when you find yourself picking up that device more than you want to will help us come up with specific responses grounded in how the brain actually works when you look at your phone. And we're also going to discover why so much of the common advice you hear about phone overuse falls short. So we'll figure out what does work and what doesn't work.

So if you're sick and tired of being distracted from everything that matters, looking at your phone more than you want to, trying again and again to stop this and not having any success, this episode is for you. As always, I'm Cal Newport, and this is Deep Questions. Today's episode: the real reason you can't put down your phone, and what to do about it.
I want to summarize the neuroscience of why you feel the urge to pick up your phone, but I'm going to do this in a way that avoids naming specific brain regions or going too deep into the anatomy. In fact, I guarantee I'm only going to mention one neurotransmitter and no brain regions, because I want to get to the core of this without getting bogged down in too much detail. So this is going to be a sort of high-level view of what's happening low down, deep within your skull.

So here's the basic way I want you to think about what's going on in your head, sort of simplified. There are numerous groups of neurons in what we can call the short-term motivation system. These are neurons that have learned over time to recognize different situations as cues to take certain actions. So we have situations that are acting as cues for taking certain actions.

I'm going to draw you a picture here, God help us all, but I want to try to illustrate what I mean. So let's start with, uh, I have this on the screen here for people who are watching instead of just listening. It's going to be very clear exactly what these neurons are doing.
I was coaching a Little League game down in Anacostia, and the field, Jesse, was all artificial turf, but it was just baking in the sun; there didn't happen to be any shade. So this is coming from recent experience. But basically, let's say this: you're the baseball coach on a hot day, you're feeling very thirsty, and there's a water cooler nearby. Well, there's a group of neurons that are going to fire in response to that situation. It's a group of neurons that are pattern recognizers, and they're going to recognize this situation: both the internal part of the state, which is that you're feeling very thirsty, and the external part, which is, I see a water cooler that, you know, is nearby and it's in my dugout. So they're going to fire in response to that situation because they have learned through experience that drinking water in this situation is going to give you a big reward: the satisfaction you feel when you drink when you're thirsty. You can imagine that group of neurons is essentially casting a vote: okay, we're voting for you, the person, to take the action of going to drink the water from that water cooler.
Here's the only technical term I'll throw into this neuroscience discussion. This is where the neurotransmitter dopamine enters the scene. Dopamine is often misunderstood when people talk about it. It plays many roles in the brain, but it's often simplified to some notion of the pleasure chemical. Like, you do something because you want the dopamine. No. Actually, the role of dopamine in this situation is that that particular group of neurons that recognizes this situation, the water cooler plus you're thirsty, includes dopamine neurons. So when that pattern fires, it activates those neurons, and those neurons release dopamine. That dopamine is going to go down a pathway connected to that situation, a pathway that connects to the action of going to get the water. You will experience the cascade of dopamine down that particular pathway as a sense of motivation.
Now, here's what's interesting about the brain: there are often many different situations being recognized at the same time. So if we go back to this picture again, this beautifully drawn picture, and no, it's not Norman Rockwell, so I know you were thinking about it. In the same picture, maybe over on the dugout wall over here, you have your bat and your helmet, perfectly drawn, right? So another possible action you could take: maybe you also know that you are up to bat soon. And so you also have a group of neurons that might recognize that situation: it recognizes the internal knowledge of I'm first up at the bottom of the inning and we're heading in soon, and I see my batting equipment down here. And so the relevant action that cues is, well, maybe I should go get my helmet on and get my bat ready. So now we have competing things you could do next. But what's going to happen is the bundle of neurons associated with getting the bat and the helmet ready will cast its own vote, and it'll still flood some dopamine down the relevant action pathway to try to make you motivated: well, maybe I should go get my equipment ready to bat. So you have different votes going on for different possible actions.
In general, the vote that's going to win is the vote associated with the largest expected reward. So the expected reward for drinking the water has been built up in that circuit through learning over time. There's a whole mechanism that dopamine is also involved in, which I won't go into in detail, where it adjusts that reward. So if you think, like, oh, it'll be kind of nice to drink water when I'm thirsty, and the actual experience turns out better or worse than that, there's a gap between what happened and your expectation. Your brain will actually use dopamine to mediate a change to that circuit so that next time it fires with a more accurate expectation. These circuits have learned through experience what reward to imagine. Very roughly speaking, the loudest vote in the situation wins. And in this situation, if you're really thirsty, that vote is going to win, and you're probably going to drink the water. And then, once you've drunk, the vote for get ready to bat might be stronger.
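If it helps to see the shape of that mechanism, here is a minimal sketch of the voting idea, assuming a simple value-learning model. The action names and numbers are invented for illustration; this is not a model named in the episode.

```python
# A minimal sketch of the "voting" idea: each candidate action carries a
# learned expected reward, the loudest vote wins, and a dopamine-like
# prediction error nudges the estimate after the outcome.
# All values here are invented for illustration.

votes = {"drink water": 0.9, "get bat ready": 0.4}  # learned expected rewards

def pick_action(votes):
    # The loudest vote (largest expected reward) wins.
    return max(votes, key=votes.get)

def update(votes, action, actual_reward, lr=0.1):
    # Gap between what happened and what was expected.
    prediction_error = actual_reward - votes[action]
    votes[action] += lr * prediction_error  # roughly, the dopamine-mediated adjustment

choice = pick_action(votes)   # "drink water"
update(votes, choice, 1.0)    # better than expected, so that vote gets a bit louder
print(votes)                  # {'drink water': 0.91, 'get bat ready': 0.4}
```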
Now, neuroscientists, I know already you're really upset. I'm not talking about things like the ventral tegmental area, or the nucleus accumbens, or the incentive salience that dopamine increases. Let's just say this is what's happening here: different parts of your brain in the short-term motivation system are voting for different actions. The strongest vote, based on the strongest expected reward, tends to win.
Now let's talk about what happens with our phones. So if you want to be precise, a big part of the problem we face when we look at our phones too much is that they overwhelm this particular short-term motivation system. There are three things at play here that lead phones to so effectively overwhelm this system.

First, many of the things we do on our phones generate very clean and effective reward experiences. So as we're learning about the reward from different activities, the rewards of many of the apps we look at on our phone are designed to be very pure, to generate a very strong association in our head, and to really make that circuit associated with the action of looking at your phone expect a big payoff.
On apps like TikTok, for example, you have these machine learning algorithms. They try to estimate, to build an approximation of, some sort of unknown process that generates rewards in your head. The algorithm might not know in advance how that process works, but it learns by observation how to approximate it so it can get the biggest reward. So what's really happening with one of these machine learning algorithms is that it's basically building an approximation of the reward-generating circuits in your head. And then it's selecting things to show you that are going to generate the strongest, cleanest rewards possible. So we're getting these really pure reward signals when we're looking at our phones, because if we're looking at algorithmically curated content, it is devised to give you much stronger, sort of artificially stronger, consistent rewards in a way that you might not actually encounter in the real world.
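To make that "approximation of your reward circuits" idea concrete, here is a toy sketch using an epsilon-greedy bandit, a standard technique assumed here purely for illustration; it is not TikTok's actual system, and the categories and engagement rates are invented.

```python
# A toy sketch: a feed algorithm never sees your neurons. It just learns,
# from engagement alone, which content categories pay off, and serves
# more of whatever pays off most cleanly. Purely illustrative.

import random

categories = ["funny", "outrage", "sports", "cooking"]
true_engagement = {"funny": 0.8, "outrage": 0.6, "sports": 0.3, "cooking": 0.2}
estimates = {c: 0.0 for c in categories}  # the algorithm's model of "you"
counts = {c: 0 for c in categories}

def recommend(eps=0.1):
    if random.random() < eps:                 # occasionally explore
        return random.choice(categories)
    return max(estimates, key=estimates.get)  # otherwise serve the best-known hit

for _ in range(2000):
    c = recommend()
    engaged = 1.0 if random.random() < true_engagement[c] else 0.0
    counts[c] += 1
    estimates[c] += (engaged - estimates[c]) / counts[c]  # running average

# The feed converges on whatever category rewards you most reliably,
# without ever needing to know why it works on you.
print(max(estimates, key=estimates.get))
```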
Now, there are a couple of different types of reward signals you get from social-media-style apps on your phone. One is positive: things that surprise or delight you. So this is why funny things or unexpected things happening in TikTok videos are popular. That's something the machine learning algorithms curating the content learn generates a very strong reward signal for humans. The other is what I'd call negative, and what I mean by negative is that you're escaping the negative state of boredom. This is another really consistent reward you get from a phone, because you're often able to use it in a situation where you're otherwise bored. This is not the case with a lot of other rewards. Like, I like watching movies, but by the time I'm sitting down in a movie theater with my wife and we've got some popcorn, I'm excited about the movie. I'm not bored when that movie starts. But our phone is something we can pull out the moment boredom strikes. The alleviation of boredom is a very strong positive signal. So this is why, you know, interesting tidbits or things that give you emotional arousal work so well. This frees you from the negative state of boredom in a really strong way, and that's a very positive signal for our brain as well.

So we get very clean, consistent reward signals. Our brain is like, my God: we get messy rewards in the real world, but this thing delivers almost every time. And we're adjusting that circuitry every time: oh, we should expect a reward almost always. Or, again, a very strong expected reward becomes associated with looking at the phone.
The second reason why phones overwhelm the system in our brain is that, in addition to delivering a very clean and consistent positive reward, so the expected reward is good, they occasionally deliver big rewards, especially, again, when we're using things like social apps: really, really positive rewards. The really big rewards they can't offer all the time, but they can offer them occasionally. Well, you know, social approval indicators would be one. This was part of the power of Facebook's early rise after they switched to mobile. The idea that sometimes when I go on and check, and I click that, that's the cue: when I click that Facebook app, I could see, unexpectedly, that a post of mine got a lot of likes. That's a big reward, because we care a lot about social approval. TikTok has this as well, with views or favorites. You're posting videos with your friends, you're doing dances, and, you know, most of the time no one cares, but every once in a while: 10,000 views. In fact, TikTok knows that, and will sometimes just artificially pump up those numbers to give you that experience every once in a while.
Or, if you're someone like me, forget even social media: me during baseball trade season. The closest I get to what a gambling addiction feels like is the MLB trade rumors site. I mean, they track trade rumors coming up to the trade deadline in MLB. Most of the time you go to that website and there's nothing there for you, but occasionally it's "we just traded for Max Scherzer," and that's a huge reward if you're a baseball fan. Those really big rewards are delivered intermittently. That also really messes with this type of system. It's why slot machines are so effective, and why they're all over Las Vegas casinos. That plays into our expected reward system in a way where we're like, well, we might hit it big this time. The hunting for the big reward, and the way that inflates the expected reward calculation, really gets us wanting to check things a lot.
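A small simulation, with invented numbers, makes the point: when roughly one check in fifty pays off big, the learned expected reward settles well above what a typical check actually delivers.

```python
# A sketch (illustrative numbers, not data from the episode) of why rare
# jackpots keep the "check the phone" vote loud: the learned expected
# reward per check ends up several times the typical payoff.

import random

expected = 0.0
lr = 0.01
for _ in range(100_000):
    # Most checks pay almost nothing; ~2% are "10,000 views" moments.
    reward = 10.0 if random.random() < 0.02 else 0.05
    expected += lr * (reward - expected)

print(f"learned expected reward per check: {expected:.2f}")
# Hovers around 0.25, roughly five times the ordinary payoff of 0.05,
# so the circuit keeps voting to check even though most checks disappoint.
```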
The third reason why phones overwhelm these brain systems is that the cue, unlike many other things we encounter in life, is ubiquitous, right? There are certain things that I have very strong expected rewards for, where I'm going to get a really strong incentive salience, to use a technical term, with dopamine cascading down those pathways. There are certain situations where that's really going to be the case, right? Like, if I'm super hungry and I come home and there's a big pizza out there that my kids just started eating, it would be very hard for me not to grab a slice. But it's pretty rare that I'm hungry and someone's putting a pizza in front of me. If there was someone, like an Italian pizza chef, who followed me around and just waited until I was hungry and then was like, hey, here you go, pizza (I imagine it was Mario, you know), that'd be too much pizza, right? That'd be a problem. But how often does that come up?
The problem with the phone is that the cue is ubiquitous, because it's in your pocket. So that circuit that's looking at the situation and asking, is there a pattern here I recognize, which means I have to calculate a reward and maybe incentivize an action, is always firing off, because the phone is always with you, and it has a nice, good expected reward signal, because you get this clean reward plus intermittent big rewards. So those brain clusters associated with picking up the phone are always voting. This is like the neuronal equivalent of Obama's campaign infrastructure: well organized and always out there working. And so you've got a good, clean, solid vote from the pick-up-your-phone circuit all the time. There are many things that will outweigh it throughout the day. But if it's there voting all the time, it's going to win very often. Basically, yeah, it's not winning every election. It's like, you know, I have a bigger thing I want to do here, this is urgent, or this or that. But man, every moment, it's voting. And in a lot of these elections, if we're going to keep this metaphor going, there aren't many other good candidates running, if you know what I mean.
So this is the way I think about the short-term motivation system being overwhelmed: we're not used to getting such a clean, consistent reward from an action, but machine learning algorithms and algorithmic curation ensure that's the case. Intermittent big rewards are incredibly compelling, so powerful that before we had phones, we only saw this in a few places in our world, like casinos. Now we have that effect with us all the time, and not a watered-down version of it. This is why we look at our phone all the time. It's not really a fair fight, once we recognize how things have been rewired.
So now the question is: once we understand this, what should we do about it? In other words, if we know what's happening in our brain, which we do now, how does that help us better understand why certain types of responses to phone overuse work, and why certain other things that people talk about a lot don't work? So I think what I'm going to do here is start with the things that don't work. I want to go through common advice people give for not looking at your phone so much, and then, using this frame of understanding about what happens in your brain, we can go through each and understand why it rarely actually works.
So let's start with the idea of adding friction. If you're watching instead of just listening, you can see this on the screen here. This is the notion of, for example: I'm going to move my TikTok or Instagram apps to a folder within a folder on the third screen of my phone. This might mean using one of these tools where you have to look at a picture of a tree before you click on the app, or maybe even one of these things where you have a physical fob you have to hold to your phone to unlock it, or maybe you've taken the apps off your phone for some of these social media services, so to access them you have to go through your browser, and that's less convenient. So we think: yeah, adding friction, maybe then I'm less likely to do it.
And why does it not make much of a difference? Because, again, those brain bundles that are voting are making an expected value calculation, and they have a really big value they think is going to happen. If you add friction, they weigh that friction; it's integrated into the expected value. So it reduces the value some; the value of picking up this phone is a little bit lower because there's some extra action I have to do. But the cost of that friction is minor from the brain's perspective, compared to that massive neurochemical change to your subjective state that you're going to get by seeing the good content or potentially getting the big reward. So the difference it makes is minor. In order for friction to be powerful enough to actually outweigh that reward, to the place where you don't even want to pick up your phone, it would have to be way more severe: way more severe friction than "I have to click through a few screens" or "I have to touch a fob to my phone before I can unlock it." It would literally have to be something like: if I look at Instagram, you know, someone slaps me. That would be a case where you're like, you know what, I'm not going to look at Instagram. But you really have to have something kind of that powerful.
Actually, you know, my friend Ramit Sethi's brother, who I also know, Manish, did exactly this. It was a stunt, but he hired someone to sit next to him and slap him every time he loaded Facebook. I mean, he was kind of making a point about it. But honestly, that is the level of net cost you would need before you're going to outweigh the reward your brain has learned from these phones and actually get a major change in your behavior.
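To put rough numbers on that intuition (all invented for illustration), friction enters the calculation as a small cost subtracted from a large learned reward:

```python
# A back-of-the-envelope sketch with made-up values, showing why small
# friction barely moves the vote while slap-level friction flips it.

expected_reward = 10.0   # learned value of checking the phone
other_option = 2.0       # value of, say, returning to a boring task

for friction in [0.1, 0.5, 1.0, 9.0]:
    phone_vote = expected_reward - friction
    winner = "phone" if phone_vote > other_option else "the other thing"
    print(f"friction {friction:>4}: phone vote {phone_vote:>4}, winner: {winner}")

# Only friction on the order of the reward itself (the hire-someone-to-
# slap-you level) flips the outcome; folders and fobs shave off a sliver.
```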
All right, let's look back at this list of things that people suggest. Change the way you think about these devices, right? Like, let's think about social media and I'll tell you why it's bad from a societal point of view. Let me tell you why it's bad because of the people who run it. Like, is it really that important to you that you're donating a lot of your time, to quote one of my recent newsletters, toiling for free in an attention factory so that Mark Zuckerberg can buy the second half of Kauai? But those mindset shifts aren't likely, on their own, to make a major change to your behavior, because we go back within your skull: there's the bundle of neurons recognizing the situation, here's the phone and I could pick it up, and they have a high expected value. That mindset shift does not have a major inhibiting effect on this much simpler short-term motivation circuit, which says: we have a good chance of getting a good expected reward in terms of how we feel, alleviating the negative state of boredom or getting a pleasant surprise, if we pick this thing up. So again, this makes sense on paper, but not once we understand your brain.
Okay, next: what I just need to do is have better rules. I'm only going to use Instagram for 30 minutes a day. I'll have a limit of 30 minutes of TikTok total, or whatever. Again, those brain bundles don't know about your time limits. What they care about is: your phone is here, and if I pick it up and I tap on that app, it's like pulling the lever on the slot machine. My expected reward, in terms of the positive impact on my subjective affect, is high. It's like going to someone who's addicted to a slot machine and saying, okay, here's the thing: you should have a rule about how much you play it, and say, I'm not going to play it more than that. That's no match for what's happening in the short-term motivation system.
Yeah, you can make your phone black and white. That reduces the expected value of looking at one of these services a teeny bit. That's not nearly enough to make a difference. Or the digital detox: I'm going to do, you know, my internet Shabbat. I'm going to follow the Jewish tradition, and on Saturdays I'm not going to use technology and I'm not going to use my phone. That's fine. You have a better day that day, not being on your phone. But one day without your phone is not nearly enough to change anything about those circuits. Now, maybe if you detoxed for eight months without your phone, with the lack of use on those circuits, eventually the expected reward would come down and your phone would seem less appealing. But taking a day off, or going on a week-long meditation retreat once a year: lots of benefits, but one benefit you're not going to get out of there, once we understand the neuroscience of phone use, is a notable reduction in your urge to check your phone.
The final thing that people think about is escape. What if I just get rid of my phone altogether? Well, that does work, in the sense that if you get rid of your phone altogether and you replace it with something like a dumb phone, the cue is not there, and the urge really does fade. People talk about this, and I meet these people all the time because of what I write. People talk about this feeling of freedom. And you're like, oh, everything else seems more interesting to me. But the problem, as a long-term general solution, is that it's not sustainable for a lot of people; there are just too many logistical things that you need the smartphone to do. So escape gets you away from those circuits, but escape is hard to maintain. So by itself, it doesn't end up being a general solution.
So why does so much of this common advice fall short? It's because we're doing the wrong thing for the brain. If we know how the brain works, we know why those other things aren't helping us out: they don't address the way our brain activates. So how can this knowledge of the brain help point us towards fixes that actually work? That is what I want to get to next: the advice that actually will work. But first, we need to take a quick break to talk about some of our sponsors.
So you know what I like, Jesse? A good pair of jeans. It's an embarrassing story, but it's a true story. I discovered this a few months ago in the worst possible way. I was driving to campus and I realized my jeans, this is all true, had a hole in them, not in the place you want there to be a hole in your pants when you're going to a public place. So I was like, okay, I need a new pair of jeans, a better pair of jeans. So I went looking, and I found exactly what I was looking for with rag & bone's infused denim.

With rag & bone, you get jeans built to move with you. They hold their shape and develop character over time. There are multiple fits, from slim and straight to relaxed or athletic. I like the straight fit, but I know other people like the other fits. I mean, with your CrossFit, where they make you do, like, a thousand deadlifts and snatches a day, you probably need the athletic fit, I guess. They're like, okay, here's our workout today: 700 snatches with 4,000-meter rows in between, for time. But anyways, there's a style for every preference and occasion, with premium denim and construction that will help them stand out.

What you really are going to notice about the infused denim is the way they look. It's because of the way they handle the wash. Each pair goes through an eight-step over-dye process. This gives rich, layered tones and deep, dimensional shades that make looking polished feel effortless. And over time, that color and texture evolves, right? You get that more authentic, lived-in feel. If you've never had a really good pair of jeans, you should get one. So what I'm trying to say here is: it's time to upgrade your denim with rag & bone. Our listeners will get 20% off their entire order when they use the promo code DEEP. Use that code. And when they ask you on the site where you heard about us, please mention our show.
I also want to talk briefly about our friends at Grammarly. Look, in the knowledge economy, the ability to communicate clearly is everything. If you're a good communicator, it'll help you get promoted. You really should care about how you communicate. But most people struggle, because writing is hard and it's not obvious how to get better at it. Grammarly is the essential AI communication assistant that boosts both the productivity and the quality of your communication. I don't think people fully realize how powerful Grammarly has gotten. Let me tell you about a particular feature I've been enjoying: the proofreading agent. I had to write an email that was going to go out to a large group of people. But before sending it, I had the proofreading agent in Grammarly take a look, and it came back with suggestions. So I clicked, for example, "sharpen opening point," and it suggested ways to be less wishy-washy and more precise in explaining up front what this email was about. It actually gave me ideas that made the email better. And this was without me having to sit there and give this email, which I didn't care that much about, the same type of thought I would give if I was writing, you know, a New Yorker article or something. So even in places where I'm not thinking much about writing, or my brain is tired, it can still help you write better. If you're a knowledge worker, have Grammarly be like an editor who sits behind you and looks over your shoulder. Have it take a look at any communication that's going out to people where you care about how they think of you. It can help you brainstorm titles or ideas if you get stuck. This will all make you a better writer, and that's what makes the difference in our current economy. You can download Grammarly for free at grammarly.com/podcast.
All right, Jesse, let's get back to our deep dive. So before we took a quick break, we talked about the common advice for using your phone less and why it doesn't work, and we used our new understanding of the short-term motivation system to explain why it doesn't work. Now I want to do the opposite: I want to look at a few ideas that do work, and now we can understand why they work based on our understanding of the brain.
The first idea I want to mention here that I think actually does work is: eliminate the strongest reward signals. So one of the things you can do is prevent the brain circuit that recognizes the pattern of your phone being nearby from being exposed to such a constant stream of highly purified reward signals. We want to reduce the expected reward that that brain circuit associates with looking at your phone. The easiest way to do that is basically to stop using algorithmically curated content on your phone. The strongest rewards come from content that is selected by a machine learning algorithm that uses your engagement as input. They're building an approximation of the reward estimator in your brain, and they're going to give you a very clean signal. So just stop using, and I know this sounds both simple and impossible, but this is what I'm telling you: stop using on your phone things like TikTok or Instagram or X, or anything where you have an algorithm trying to choose things that are going to free you from boredom real easily or give you pleasant surprise. When you reduce that reward signal, over time the power of the vote of the pick-up-the-phone circuit is reduced. And it goes from being a really well-organized campaign to a sort of haphazard campaign, and that's good. If you still need to use these apps for work or this or that, great: you do it on a computer or whatever. And you might ask, what's the difference? Because now you're getting those reward signals over on your computer, they're not being associated with the pick-up-your-phone circuit. And the phone is the thing you have with you all the time, so that matters.
Speaking of having it with you all the time: now that we understand the brain, the second category of advice that I think will actually work is reduce the ubiquity of the cues. If you don't have your phone right with you all the time, then the amount of time during your day when the pick-up-the-phone pattern fires is reduced, because the phone actually has to be accessible for that pattern to fire. So even if the expected reward is really high for looking at your phone, if the phone is not there, the pattern that's going to generate that vote is not going to fire. Now, the easiest way to do this, and I've been saying this for a while now, people don't like this advice, but I'm telling you, you need to do this: when you're at home, your phone is plugged in in your kitchen, plugged into a charger, whatever. If you need to check on a text message conversation, you go to the kitchen and check it there. If there's a call you're expecting, you put on the ringer, like an old-fashioned phone, and if it rings, you go in there and you check on it. If you want to listen to a podcast while you do the dishes, you use earbuds, you use wireless earphones, so that the phone can stay where it is, plugged in. So, importantly, if you're reading, if you're at the dinner table, if you're watching TV, if you're brushing your teeth, whatever it is, the phone isn't actually there for you to grab. This is the easiest thing you can do to make the cue non-ubiquitous. And if the cue is not there, the pattern won't fire, and you don't have to fight that vote.
The third thing that actually works, once we understand the brain, is strengthening competing systems. So the short-term motivation system is really good at, like, yeah, go get that water, go pick up this phone, because that's most of what we do during the day. But there are other brain systems that can overwhelm or override the short-term reward system. If there weren't, things would get pretty primal pretty quickly. So one of the systems that can overwhelm the vote from the short-term system is our long-term motivation system: the system that does a better job of simulating the long-term future of certain activities, right? The short-term system is like, I can associate the immediate reward we'll get by doing this. When I eat the cookie, it will feel like this. When I pick up the phone, almost immediately we'll get something like this type of reward. The long-term system is thinking about, for example, when you're picking up a weight, it's looking down the line to you being in shape. Or when you're working on a book chapter, it's looking down the line towards the book being done. So instead of it just being an association, this cue with this reward, it actually runs a whole simulation. There's a fascinating literature about how these simulations work. I've talked about them before on previous episodes, but it uses stored memories, and it has a sort of logical simulator that takes those memories and tries to figure out possible futures: based on those memories, what can it expect? Anyway, the reward signals from that system can easily overwhelm those from the short-term system. This is why we do boring, hard stuff at our jobs even though there are more fun things nearby we could be doing: because the long-term reward of keeping our job, and of finishing this thing my boss is asking me for, is swamping the short-term reward of, well, don't we want to go to the snack machine right now?
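A rough sketch of that difference, with invented numbers: the long-term system sums a simulated, discounted stream of future rewards instead of reacting only to the immediate payoff.

```python
# Illustrative only: a long-term system that simulates future rewards and
# discounts them can outvote a large immediate reward. The specific
# values and the discount rate are made up.

def long_term_value(future_rewards, discount=0.95):
    # Sum of simulated future rewards, each discounted by how far off it is.
    return sum(r * discount**t for t, r in enumerate(future_rewards))

phone_now = 10.0                      # immediate hit from checking the phone
book_chapter = [0.5] * 30 + [100.0]   # a month of grind, then the book is done

print(long_term_value(book_chapter))  # ~29.3, the simulated payoff
print(phone_now)                      # 10.0 -- the short-term vote loses
```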
So you can strengthen this long-term reward system by practicing and introducing discipline into your life. And by discipline, I mean getting used to the long-term pursuit of goals that require consistent action over time and that have really good, rewarding outcomes on the other end. The more of these you have, the more power you're giving to your long-term reward system, and the more comfortable it is saying: hey, short-term system, I don't care about your vote. The reward that really feels good is when these big long-term projects succeed. And so the more your brain gets used to discipline, gets used to working on long-term projects and reaping big rewards down the line, the easier time you'll have pushing aside the incentive salience you're getting from the short-term system and turning your attention to efforts that maybe, in the moment, aren't as pleasurable. But your brain's like, I know where this story ends, and that's the real stuff. And the long-term reward system, of course, is how the human species has really differentiated itself, at least part of how we really differentiate ourselves from other animals. Because we can overcome these short-term impulses to work on, you know, building fire, inventing language or mathematics, or building massive structures. There are so many things we do that differentiate us from other animals because we can override the short-term system. So practicing strengthening your competing systems, practicing the disciplined pursuit of long-term important goals, really begins to shift the balance of power within your brain. And the short-term reward system gets much, much less play.
All right. So there we go. Those are, once we understand the brain, the things that we know don't work and the things that I think actually will work. They're not as sexy, by the way, right? The things that do work are annoying. It's: take the fun stuff off your phone, keep your phone plugged in in the kitchen, and be disciplined. That's not fun. I want to have a fun detox every week, and make my phone grayscale, and rail against Mark Zuckerberg being bad. All that stuff's more fun. This stuff is not fun, but it works, because it's actually compatible with our brain. All right. So, Jesse, let's do some takeaways. So we have some takeaway music.
All right. So here's how I feel about this. There was a time when talking about spending too much time on your phone sort of sounded like a kids-these-days, old-man-yelling-from-his-porch type of advice, but we're past that time. No one is happy about how much of their life is punctuated by that little glowing piece of glass, nor are they happy about the impact this is having, not just on their ability to live deeply, but on our society: our political system's ability to function, the mental health of our kids, the freedom from random nihilistic violence. So eliminating that sense of needing to constantly look at your phone, that addictive pull, has never been something that people care more about. And the message I'm hoping to deliver today is that not all advice about this goal is made equal. It's not enough to just say, let me go try some stuff. Some advice works better than other advice. And if you study how the brain actually works, how it actually creates the sense of motivation, and what circuits are firing and why to get you to keep looking back at that device again and again, when you understand that, it becomes much clearer that most things you might try to fix the problem are ineffective, but there are a few things that we can expect to work really well. And again, as I just said, they're not sexy. It's taking apps like TikTok off your phone: just get rid of that super clean reward signal associated with your phone. It's putting your phone in the kitchen when you're at home, or going for a walk without it: get rid of the cue, and the pattern doesn't fire. If it doesn't fire, it begins to weaken. And finally, it's strengthening the other systems, the systems that really make us human and that can overwhelm the short-term motivation system, systems like long-term motivation based on the disciplined pursuit of stuff that you care about. Focus on those, so that the balance of power shifts in your brain and the short-term motivation system doesn't have so much say. None of that is easy. None of it is fun. But once we understand how strong those reward loops are in our brain, we see how far we really have to push ourselves if we want to get free. The little stuff isn't going to matter. It's the big, uncool, but effective stuff that we have to focus on, and we have to push on it hard. So we can tame our phones. It just requires understanding what we're really trying to do when we set out to accomplish that goal. There we go. That's my "this is your phone on your brain"... "your brain on phones." Is that the way we go? "This is your brain"? Yeah. "On phones." That's my, that's my analysis, Jesse.
Um, have you seen those fob things? No? I saw them. They were filming a documentary at my house, and some of the crew showed them to me; they were using them. You have to touch the fob to your phone to unlock it. So the idea is, you know, there's this friction step. Yeah. And the reason, and it seems like kind of a weird thing, but the reason why that became a big product is that Apple makes this hard: they really don't want third-party apps to have control over other apps on their phone. They changed this. There used to be a lot of apps you could buy for your phone that restrict your phone use, but those are hard to make now, because the new terms of service say your app can't affect someone else's app. If you want to control your phone time, you have to use Screen Time. But as we talked about on the show over the summer, Screen Time is great if you're trying to control your kid's phone; if you're the one in control of the Screen Time settings, you can just turn it off. Yeah. So one thing you can do is control locking and unlocking, which can be very strong, and that's what the fob thing is: it's just an unlock mechanism. Your phone is anything-goes once you turn it on. So I thought those were cool, but, you know, they're not going to be enough, because that friction is not enough to change the expected rewards. So there we go.

All right, we've got a lot of good show coming up on this topic. We have questions from listeners about their own struggles, both them and their kids, trying to deal with their phones, including a mom of an 11-year-old who I think is really looking for permission from me about getting that 11-year-old a phone. I think it's going to go the way she thinks. Um, we also have coming up, a little bit later, my reading list from last month, where I talk about the five books I read in September. I think that's really well suited for today's episode, because what could be more contrary to looking at your phone all the time than reading? So I kind of like that. I mean, I try to do this early each month, but I think going over the books I read is a really good sort of endcap to the discussion we're having today.
But before we get into all of that, let's do a little housekeeping. Jesse, do we have any housekeeping about the show we need to get to this week? Yes, we do. We need some more calls, specifically about phone usage, social media, that sort of thing. How do people call into the show and leave calls to be featured on the show? They just go to thedeeplife.com/listen, and there's a link right there at the top, right? So you can do this right from your phone or your computer, from your web browser. We need more calls. The calls we have are all pretty Deep Work-y, pretty productivity-focused. So what we really need are calls about other types of struggles you have with technology in your life: questions about technology in your life, questions about creating a life that is more internet-proof. We're really looking for that. So you have a good chance of getting featured. What else do we got? If you're interested in the newsletter, which is revamped and awesome and comes out every Monday morning, you can go to calnewport.com and sign up, and you can also go to thedeeplife.com and sign up. Yeah, you've got to sign up for that. I mean, the way I would put it recently, Jesse, is that the newsletter has really been in conversation with the podcast. So often the newsletter, like the one that came out today, the day this episode is coming out, takes one of the ideas from the podcast and actually runs with it. There's a lot of that going on, or I'll get a reaction to the podcast that becomes a newsletter. So really, the newsletter plus the podcast together gets you into these ideas a lot better than either one on its own. All right, anything else?
Did you see that Paramount's buying The Free Press for $150 million? I did see that. Yeah, I think that's a good sign, I guess, for those of us doing independent media. Here's what I want you to put on your list: call NBC and tell them, hey, look, we're reasonable, they can have Deep Questions for, like, a hundred million, right? Because I guess that's what they're doing now. Does that mean I get to host the Today show? Like, how does this work? If they buy us, we could do four hours in the morning. I think it's politically oriented as well, based on, you know, the politics of the moment. The one thing: there seem to be a lot of people reacting online who are upset or very jealous about the money. I do want to just make a, like, here's my business PSA: Bari Weiss did not just get handed $150 million. I think people read that and they're like, oh, someone just gave $150 million to Bari Weiss. No, no, no. There's an agreement to buy The Free Press, which unfolds over many years, and much of that has to do with stock and stock swaps. So there's a certain amount of stock The Free Press is going to get every year, and the actual amount of the purchase, in the end, will depend on what the Paramount stock does. And Bari Weiss doesn't own a hundred percent of The Free Press. I mean, they have many partners; when they started, they took on a lot of venture capital. So, you know, yeah, she made a lot of money, I don't want to downplay it, but she didn't just get handed $150 million. I think people don't always appreciate that.

Speaking of The Free Press, you need to have Niall Ferguson on the show. That's a smart guy. Niall, Niall Ferguson. Um, did you read something of his recently? Yeah, and then I signed up for The Free Press, and I get alerts when he writes stuff. He's a cool guy. Niall Ferguson: people say "Neil," but it's N-I-A-L-L. He was at Harvard before he went to the Hoover Institution, and he was a historical economist, or an economic historian, I guess. The thing is, he was a historian, but he would use the tools of economics to do his history, right? So it'd be one of these things where, like, oh, we want to learn more about medieval Florence in the early Renaissance period. And that type of historian says: we're going to go read all the ledgers, and we're going to reconstruct how the money was moving through the economy. And it was really cool. He's a really smart guy and he writes a lot of books. Yeah. But then he left; he's conservative-ish, I mean, it's all relative, and he went to the Hoover Institution at Stanford. Yeah, I think he's a smart guy. We should have him on. And he has an accent, right? Yeah. And he's going to sound so much smarter. Here's what we're going to do. All right, here's how we're going to counteract that. We're going to have Niall Ferguson on. I'm afraid his English accent is going to make him seem smarter than me, so you can speak in your French accent. Well, no, here's the solution from last week's episode; it's even better. When I speak, we put the Jordan Peterson violin music behind me, just really soaring. So then I'm going to sound more profound, and that can counteract his English accent. And when he speaks, I'm thinking, like, someone playing the spoons, or, you know, where you blow into the moonshine jug with the triple X's on it, as part of an old-time band from the Depression. We're going to play that behind him to kind of reduce the impact of his English accent. And then, for me, it's going to be really emotional violin music, and I'm going to talk real slowly. All right, that's housekeeping for today. Let's move on to some questions.
Wait, who do we got first, Jesse? First question's from Andy: "My 11-year-old son is the only one in his class without a smartphone. He gets bullied, and he feels left out of trends. Smartphones aren't allowed at the school, but that doesn't change the fact that he's clearly missing out."
Well, I mean, first of all, I'll say I recognize the issues here, and I recognize that it's really hard. It's also not really fair for my generation of parents, and especially the people in my generation who are a little bit older, whose kids are in high school or just going into college now, because we didn't sign up to be part of this experiment of: let's take this incredibly powerful new technology, let's give it to our kids, and let's see what happens. Like, let's just make it socially ubiquitous and then leave it on individual parents: okay, you've got to go read Jon Haidt's 700-page, you know, annotated bibliography online about the mental health impacts of smartphones, and make a decision that's going to make you a complete outlier in your community. We really put a really big burden on parents. So I have a lot of empathy.
This is a really hard time. A couple of things I want to say here. It's probably not true that your 11-year-old is the only kid in his class without a smartphone, but it can really feel that way, and at some schools it's more true than at others. That's changing now, which I think is good, but it hasn't completely changed yet. I'll also say: if he's being bullied, the phone is not going to help.
to help. So one of the big concerns, and I did a talk recently at my kid's school where I went through 00:48:26.120 |
and said, why are the experts actually concerned about these things? And I broke those concerns in 00:48:30.200 |
the four categories. And I talked about the four categories. And one of the categories was the sort of 00:48:35.480 |
negative externalities of digital sociality, which by the way, Jesse is exactly the way to talk when 00:48:41.160 |
you're trying to impress a bunch of seventh graders. They really liked that. It was funny because we're 00:48:46.120 |
doing it for the students and their parents. But there's these negative externalities of digital 00:48:50.520 |
sociality. And one of them is there's whole parts of our brain that give us sort of like interpersonal 00:48:56.520 |
guardrails that, you know, prevents me from like trying to club Jesse when I'm upset about, you know, 00:49:03.320 |
an ad transition or something. We have all these interpersonal guardrails that makes us 00:49:07.960 |
more reasonable people. We need that to survive as like a community oriented species. 00:49:12.520 |
A lot of them turn off when social interactions become purely linguistic. So when I am just sending text to you on Snapchat or on WhatsApp: yes, if you ask my prefrontal cortex, is there a person there, am I talking to Jesse over WhatsApp, it knows it, right? It's not confused about it. But many of these other, deeper social circuits aren't seeing a person. They're not hearing a person. So they are turned off. The guardrails get turned off, which means that in purely linguistic interaction we're basically worse people, and we're much more likely to be mean or to bully. I mean, obviously we see this among adults on social media all the time, just the stuff they say on X, you know, it's crazy, right? How enraged or outraged or how mean or how terrible they get. It's because the stuff that stops us from saying that in person is turned off when it's linguistic.
So anyways, bullying goes up when you move more conversation digital. And then if you're in a 00:50:11.160 |
compromised social situation in school, and by the way, tell your 11-year-old that it does get, 00:50:15.640 |
you know, it will get better. Middle school is not great necessarily, but keep doing what 00:50:21.560 |
you're doing, you know, pursue your interests, develop skills, develop discipline, start doing 00:50:27.400 |
stuff that's really hard and following through. It will get better. But if you're having to struggle 00:50:31.640 |
socially, then the other thing that happens when you have a smartphone, again, man, such a burden, is your 00:50:37.560 |
relief goes away. Those interactions are happening all the time, not just at school but at home. When you're 00:50:44.520 |
doing your homework, when you're at dinner, when you're going upstairs to bed, it's right there on your phone 00:50:48.040 |
all the time. So you get no cognitive relief from high-stakes social interaction, which is exhausting. And 00:50:53.720 |
if you're in a socially compromised situation, those interactions are potentially also really negative. And so 00:50:58.520 |
that's even worse. So this is not a solution. If you're upset about your social situation in middle 00:51:04.840 |
school, like a classic middle school situation of I'm not fitting in, there's this bullying situation, 00:51:08.360 |
then hey, here's a phone where people can be even bigger jerks and you have to be in touch with them all the 00:51:14.280 |
time. That's not necessarily going to make that better. There's a couple of things that can help 00:51:18.520 |
with this collective action problem. I'm a fan of things like the Wait Until 8th pledge, where you 00:51:23.080 |
get parents in your grade to sign up, using the Wait Until 8th website, onto this pledge: I'm 00:51:29.240 |
not going to give a phone to my kid until after eighth grade. Which is roughly what Jonathan 00:51:32.840 |
Haidt recommends, what the surgeon general, the last surgeon general, recommends. It's sort of an 00:51:37.400 |
emerging consensus: wait till high school to have your first smartphone. And it gives you statistics, and 00:51:42.120 |
that's where it's powerful. And then you can say, you know what, you're not the only kid not to have a 00:51:46.120 |
phone: 33% of the parents in your grade have signed a pledge saying that their kids aren't going to get a phone 00:51:49.880 |
until high school anyways. So that social evidence that you're not the only one is very effective, because 00:51:56.680 |
you do not need 99% of your class to not have a phone before you feel comfortable with that. You need 00:52:02.280 |
roughly like 15 to 20% before that becomes a socially acceptable option. So I'm a big fan of those. 00:52:06.920 |
You can help organize one of those pledges. The way this works is it's really parents in grades at 00:52:12.600 |
individual schools who say, I want a pledge for our grade. They get together with a couple of other 00:52:16.840 |
parents, put together an email, tell the school, the school gives out the email addresses, 00:52:20.520 |
and they send it out. That's all it is. And grade by grade, you begin to create these pledges, which I 00:52:26.040 |
think are very powerful. We should start those as early as third or fourth grade. So you have different 00:52:29.800 |
options there. Okay. And then you have lots of logistical things. Like, you know, if there are group 00:52:35.080 |
chats he wants to be a part of, you can set up the group chat on a family iPad and have certain 00:52:39.560 |
times set aside where he can look at the iPad and do the group texts and stuff like that. 00:52:42.920 |
But I really want to add friction to the idea of, wouldn't it all just be easier if I just gave 00:52:48.920 |
him the phone? I wouldn't have to be dealing with this, it's really so bad. And I really want to 00:52:53.240 |
emphasize: that is not going to be the solution to the social things you worry about, 00:53:00.040 |
his social struggles. There's so much bad that comes with it. What I want to do here, I don't know if we 00:53:03.960 |
have this, Jesse, do we have that clip? Yeah. I want to play an ad that's been going 00:53:08.120 |
around the internet. It's from Smartphone Free Childhood, a nonprofit organization. It's their 00:53:13.720 |
ad. We only have the audio here, but you can kind of imagine the visuals, which I think 00:53:17.400 |
does a good job of making you confront the reality of what happens if you give, 00:53:22.440 |
you know, your 11-year-old a smartphone. So we're going to hear the audio. Just to set the 00:53:26.520 |
visual of this ad: there's an 11-year-old kid, I think it's a 10 or 11-year-old-looking kid, 00:53:31.160 |
a young-looking kid that age, and the kid is in bed and his dad is at the doorway, 00:53:36.600 |
like saying goodnight to him. All right, let's hear this audio, Jesse. 00:53:38.920 |
Hey kiddo, it's about time for bed. Okay. Okay. Well, remember there's a box 00:53:47.240 |
in the corner over there with all the pornographic material that's ever been made in the world. 00:53:51.160 |
Even the really weird stuff that could scar you for life. I'm trusting you not to look in there, 00:53:56.200 |
okay? Okay. Feelings are for losers. Oh, and this guy's going to be in your corner all night just 00:54:01.640 |
randomly spewing out hateful things. Just ignore him, okay? Oh, while I'm thinking of it, there's an order 00:54:07.720 |
form on your desk where you can purchase illegal drugs. The mean girls from your school are going 00:54:11.640 |
to be standing there talking about you all night. And this Russian hacker is going to keep asking for 00:54:16.680 |
your password. I'm not a hacker. Amazon customer service. Just need you to ignore him, okay? 00:54:23.480 |
Love you, buddy. We ask too much of our kids when we give them a smartphone. Let's change the norm. 00:54:29.800 |
Together. Maybe we go around the room and share social security numbers. Join the movement at 00:54:33.960 |
smartphonefreechildhoodus.com. Eh, I'll get it anyway. 00:54:38.520 |
We try to hide that reality, but it's the truth. Like, would you ever, with your 11-year-old, be 00:54:45.720 |
like, look, I'm going to put in your room pornography and bullies and weird interest groups, these 00:54:54.200 |
influencers that have these really weird agendas. There's gonna be a bunch of hackers. Let's 00:54:57.960 |
put in there catfishers too, that are going to pretend that they're young women 00:55:03.000 |
on text messages and then try to exploit you out of money, which has been causing a very large 00:55:07.640 |
trend in self-harm, all this stuff. Oh, and a lot of addictions, let me just put in 00:55:12.520 |
constant video games and TVs that are always on and addictive stuff in here. And just, 00:55:17.080 |
I trust you, just handle it. You know, it's an incredible thing to ask of a kid. And that's 00:55:23.480 |
what happens when you give them the smartphone. So it's hard, but stand strong is what I 00:55:28.920 |
would say. I think norms are changing, and your kid will thank you for not having to deal with all that 00:55:33.160 |
at their current age. All right. Who do we got next? 00:55:36.200 |
Next up is Natasha. I want to quit social media. However, I sometimes get an irrational urge to show 00:55:43.480 |
off. I think everybody feels this to some extent. They want to brag about their looks, their lifestyle, 00:55:48.280 |
body, wealth, job, intelligence, whatever. I was often the weird loner at school. And as an adult, 00:55:53.240 |
I still have that chip on my shoulder. How should I overcome this? 00:55:56.440 |
All right. Well, the main thing I want to say is the urge is not irrational. 00:56:01.320 |
So, deep down, the urge you're feeling is a very human urge. 00:56:06.360 |
The thing that is abnormal here is the technology medium in which that urge is being expressed. 00:56:12.200 |
That is what sort of perverts things and creates the negative outcomes. 00:56:16.840 |
So what is the natural, so to speak, outlet for this urge to want to show off? The real 00:56:24.440 |
natural outlet for that is: I want to have respect and be in a position of leadership in my 00:56:28.280 |
communities. That is where that's going. In the communities I'm involved in, real-world 00:56:33.880 |
communities where I know what the people look like and I see them in person, I want 00:56:37.400 |
to, over time, through sacrifice, service, and demonstration of competency, build up 00:56:42.600 |
increasing levels of respect and leadership, so that people look to me, they're impressed by me, 00:56:48.520 |
and they want me to be in charge of things or be involved, someone they can count on. 00:56:53.560 |
That is actually what we crave. Social media just takes that craving and perverts it. This 00:56:58.120 |
is what happens with a lot of attention economy engagement technology: they take 00:57:03.240 |
completely normal human impulses or urges that actually can lead to very positive outcomes, 00:57:09.800 |
that's why we evolved to have them, and they hijack them. You know, they hijack them, right? 00:57:13.880 |
It's just like boredom: a very strong urge that is meant to push humans 00:57:19.320 |
to not just lie in the sun like cats can do and are perfectly happy to do, 00:57:23.080 |
but to get up and actually go try to see intentions made manifest concretely in the 00:57:26.680 |
world in positive ways. That leads to invention. That leads to innovation. That leads to protection. 00:57:31.960 |
I'm going to go build fences around our Paleolithic camp. I'm going to try to build a better 00:57:36.920 |
flint axe that's going to help us better do whatever. It's a very human thing, 00:57:41.720 |
but social media hijacks that. It's like, oh, boredom's bad, it feels good when it goes away, 00:57:46.600 |
scroll endlessly on TikTok. You know, it's hijacking, right? 00:57:52.600 |
Just like procreation is good, and pornography hijacks that: oh, that urge is there, we 00:57:56.360 |
can hijack it over here. Like junk food hijacks our hunger urge. That's 00:58:00.760 |
what's being hijacked here by the social validation that's built in as an engagement mechanism in 00:58:05.160 |
social media. So take that urge, don't push it away, but find a natural, healthy outlet for it, 00:58:11.880 |
which is becoming a leader and someone who is respected within actual real world communities. 00:58:16.200 |
And I'll tell you what, when you match these urges, which people feel all the time, 00:58:19.320 |
to what they're really meant to drive you towards, it's a completely different feeling. 00:58:23.880 |
It is a completely different feeling knowing, hey, I was here for my community, 00:58:30.360 |
I stood up and people look up to me. That gives you a sense of satisfaction that lasts so much longer 00:58:36.360 |
than, I did an Instagram filter on my face when I was taking a picture at the beach and, you know, 00:58:43.240 |
it got 50 of those hearts, I don't know what that means, and someone, you know, in broken English 00:58:48.200 |
was like, you hot. That's a short-term little simulacrum of what that urge is 00:58:54.200 |
actually driving you towards. So nothing you're doing is irrational, but the technology is screwing 00:58:59.000 |
with a very healthy, natural urge. Go be a leader. Go earn respect among real people. 00:59:04.520 |
The real deal is 10x better than what you get on those screens. 00:59:07.080 |
All right. Who do we got next? Next up is Carl. You talk a lot about phones and social media. I don't 00:59:13.880 |
actually spend that much time on my phone. My issue is video games. How do I stop playing these so much? 00:59:19.080 |
Well, if we go through our model of how the brain works, there's a lot of these same brain 00:59:23.160 |
mechanisms at play. Games are designed to give you a pretty consistent 00:59:28.200 |
reward, because it's such an artificial environment and it's calibrated to be difficult, but not too 00:59:33.160 |
difficult. The rewards you get are a mix of novelty, you're seeing new stuff, a little bit 00:59:37.400 |
of adrenaline if it's an action game, and most importantly, a sense of progress. 00:59:41.640 |
And so it fits roughly in that model. Like, oh, I see my video game is here. I'm at home. 00:59:46.920 |
I could go to the gym, I could read a book, or I could pick up the controller. The cue is right 00:59:51.960 |
there. The controller is right there. And because it's this artificial environment, 00:59:55.160 |
it can give me a really clean reward. So it does build up a strong association. 01:00:00.200 |
There's a certain type of game that really messes with the system even more than our phones do, and 01:00:05.880 |
that is massively multiplayer online games. If you talk to people in technology-overuse-related 01:00:10.920 |
fields and research, the real things they fear are massively multiplayer online games, because they add in some 01:00:17.400 |
extra elements to make their attraction even more powerful. They have an endless increase in 01:00:23.560 |
score. So if you're playing, you know, World of Warcraft or something like this, you're going to get 01:00:28.120 |
this steady leveling. I do a little bit of effort and I level up. That's simulating, like in the real world, 01:00:34.760 |
building up competency and getting more respect for it, which is a very powerful driver. 01:00:38.520 |
They simulate it in those games. Now, of course you get this when you're just playing Mario a little 01:00:43.640 |
bit, because you're making progress, but in the online multiplayer games, your mind is like, no, 01:00:48.440 |
no, no, other people know what level I'm at and they see it. So that's much more powerful. 01:00:54.040 |
It's simulating, I am in a real community of people and I am earning their respect. I just talked 01:00:59.000 |
about that in my answer to Natasha, that that's a strong urge. This thing grabs it way stronger than 01:01:03.880 |
getting hearts on Instagram and really twists it. So that is super compelling. 01:01:08.520 |
It also hijacks this sense of, this is my tribe, because I hear 01:01:16.120 |
them in my headset, we run a Discord server while we play or whatever, and I'm with them all 01:01:21.320 |
the time and we're doing stuff that pushes the buttons of adventures and trials. It's not 01:01:26.760 |
the real thing, it's not nearly as strong as actually being in battle with people, but it simulates 01:01:31.720 |
that sort of band-of-brothers type connection. This is super, super addictive, on top of all the 01:01:36.600 |
standard nice positive reward signals that get associated with it. That's where we 01:01:41.160 |
have the most problems. If you look to South Korea, where they have 01:01:44.600 |
roughly five times the problems we have with technology here, because it's a much more technology-focused 01:01:49.640 |
culture, they have detox centers there for these video games, not for phones, but for massively multiplayer online 01:01:56.280 |
video games. They have cases over there of people dying because they played the game so long, like 01:02:01.080 |
dehydration and heart issues. So you gotta be worried. Probably the most addictive 01:02:08.280 |
consumer-facing technology we have is some of these massively multiplayer online games. I'd be really 01:02:12.600 |
worried about it. My advice about those games, games where a lot of people are playing at 01:02:16.120 |
the same time, is just don't play them. Just don't use them. I would just stay away. 01:02:20.360 |
Right? It's like, you know, you might be a marijuana user, but don't get near the stuff that 01:02:26.040 |
might have fentanyl in it. Why even play with that? Okay. So I am very wary, I am 01:02:32.040 |
very wary of those games. What I would recommend if you're playing video games: non-multiplayer AAA games are usually 01:02:37.480 |
the best. A AAA game costs 60 bucks and has about 50 hours of gameplay 01:02:43.880 |
programmed into it. It's supposed to be stretched out. It's challenging. I'm going to want to take 01:02:46.680 |
breaks because it's hard. I'm not leveling up in a way where someone's watching me. Yeah, 01:02:51.400 |
that's your equivalent of me watching a lot of movies. I'm not so worried about 01:02:55.400 |
that, but I worry about the massively multiplayer online games. So, to give you actual 01:02:59.800 |
advice, Carl: don't play those, just stop playing those. You know, grown-ups shouldn't play 01:03:03.960 |
those games. Stop playing those games. If you like other video games, works of art that are 01:03:08.120 |
single-player and AAA, then yeah, you could treat it like TV. You know, like, yeah, 01:03:12.840 |
instead of watching a movie when I get home on Friday night, I'm going to play the video game 01:03:16.520 |
for two or three hours. I guess that's okay. I'm not as worried about that. All right. Who do we got next? 01:03:21.800 |
Next up is Carissa. I spend most of my time on my phone, text messaging. I don't really think I'm 01:03:27.000 |
addicted, but I feel there's a ton of stuff to figure out and people's questions to answer. What do I do about 01:03:31.720 |
this? Yeah, that's a complicated question. We did an episode about this earlier in the summer about text 01:03:37.240 |
messaging actually being a major driver of phone use, especially when you get a little bit older 01:03:42.520 |
and you have more responsibilities and maybe you're doing logistics for kids, or you just have like a 01:03:46.280 |
complicated social life or work slash social life you're trying to navigate. And this really complicates 01:03:52.280 |
the picture because when we're talking about something like TikTok, that's purely optional. 01:03:57.080 |
Like it gives you a nice reward signal, but you do not need that particular reward signal to function 01:04:01.880 |
in any way. And no one notices or cares if you stop using TikTok. People do notice if 01:04:06.840 |
you stop being available on the messaging services. So the hard thing with these is trying 01:04:14.040 |
to differentiate between logistical necessity and social driven addiction cues. And so here's the 01:04:22.040 |
issue. You might be using this a lot mainly because you're involved in six threads that are all relevant, 01:04:29.240 |
because they're all logistics and timing and you have to figure it out. But there's also a really strong 01:04:35.720 |
cue here that our short term reward center is going to really care about, which is there might be people 01:04:41.880 |
waiting for me to respond. And the negative affect they have towards me will increase the longer I'm not 01:04:50.200 |
responding. That catches our attention. People are getting madder every minute I don't look at my phone. 01:04:56.760 |
Potentially makes you really want to look at your phone and for obvious reasons, right? 01:05:01.160 |
So these two things together can make this really powerful. The best solutions here, as we talked about 01:05:06.920 |
in that episode earlier in the summer, is not just to abstain, I just don't use my phone anymore, but to try to 01:05:16.120 |
reroute the actual necessary communication more out of these text messages and or to reprogram over time 01:05:24.360 |
the expectations of people who communicate to you through these apps so that your mind is not so worried that 01:05:30.040 |
people are upset. Like, you know, we have different ways of organizing different logistics, 01:05:36.760 |
ways we check in, I'll call someone once a day, or we figure out a plan in advance, so there's less 01:05:41.320 |
stuff coming in that's urgent. And also people in my circles have learned: my phone's in my 01:05:47.800 |
kitchen, and I come and check it maybe once an hour, but there might be a two-hour break, and 01:05:52.680 |
they can call if there's something really time urgent. That tells your brain they're not just sitting 01:05:56.680 |
there upset and stewing. They know and understand the way you use your phone. You basically have to 01:06:02.440 |
change your relationship to communication so that frequent time-sensitive communication coming through 01:06:07.480 |
text messages is not that common anymore, so that people understand you're not always checking it, and because 01:06:11.880 |
of that, you have the backup phone calls and other types of logistical things you do. 01:06:16.360 |
That's all a pain. What I'm saying is, do that pain. It's worth the pain, because the cost on the other 01:06:22.120 |
hand is almost addictively having to check that phone all the time. And that has so many negative 01:06:27.960 |
impacts. It's worth the pain of, I have a kind of janky way I deal with logistics and 01:06:34.680 |
communication. It's worth the effort. All right. Let's see. Do we have one more question? 01:06:39.880 |
Yep. Cool. Next up is Robert. I have newspaper apps on my phone, like the Washington Post and New York 01:06:45.800 |
Times and New York Post. I also have these on my iPad where I normally read my news. Is it fine 01:06:51.240 |
that I occasionally read these apps on my phone? I don't go on social media sites when doing so. 01:06:55.320 |
It's not the worst thing in the world. What I would recommend, if you're going to look at 01:07:02.760 |
the New York Times app, is treat it like a newspaper and just say, yeah, this is when I sit down, 01:07:08.760 |
like with my morning coffee or something, and I go through it. Because it's not super dynamic, 01:07:13.880 |
they kind of set it for each day and then have some minor changes throughout the day based on what's 01:07:17.800 |
happening. Let me go through and see which articles I read, and maybe one of the opinion pieces I 01:07:23.160 |
read. And then you're kind of done with the New York Times for the day. 01:07:25.560 |
Just like you would have done in the days when you had the newspaper and you would read it over 01:07:29.880 |
breakfast and that's it. And now you're done reading the newspaper. If you treat it that way, 01:07:34.360 |
it doesn't really matter what medium you're reading it on. An interesting thing about the New York Times 01:07:38.680 |
in particular is in their app design and in the way that they're now thinking about news stories, 01:07:45.160 |
they're highly influenced by the success of news-bearing social media, which they see as their 01:07:49.400 |
competitors. And it's reflected in the way they actually cover news now. So you'll notice two 01:07:54.200 |
things they do to try to prevent you from bypassing them and going straight to something like Twitter 01:08:00.440 |
when there's breaking news. The first thing they do is the live updates. That is a direct response to 01:08:05.880 |
social media. So they want to give you a sense if there's some sort of breaking thing happening, 01:08:10.360 |
that if you're on the app, there will be updates. It'll keep updating and you can click the thing, 01:08:17.240 |
show me the new updates or whatever. Because they know people have now learned, through social 01:08:21.480 |
media, that it's not enough to think: something happened, I'm going to wait till we know more about it, then 01:08:25.720 |
I'll read an article that summarizes what we know, and then maybe later that day or the next day 01:08:29.000 |
I'll get the, okay, a day later, now what do we know? No, I need to be collecting information, 01:08:33.800 |
right? That's what social media taught us. I have to be like Woodward and Bernstein 01:08:40.440 |
in All the President's Men, trying to put together all the pieces. You want to have 01:08:43.560 |
information just coming at you. So they try to simulate this: oh, we'll do these 01:08:47.800 |
breaking reports. The other thing they do, which is a very conscious reaction to social media, 01:08:52.680 |
is they'll do five articles on an issue. Instead of just, okay, here's what happened, 01:08:57.720 |
here's what's going on, and here's our article where we explain what's going on, 01:09:02.120 |
they'll do five articles on it. Here's what's happening. Here's another take on it. Here's 01:09:05.960 |
a news analysis on it. Here's a look at the other side of it. Because they want you to feel like, we're 01:09:10.600 |
flooding the box, I have lots of things I can read about this thing I care about. Again, 15 years ago 01:09:15.480 |
this would be weird. You'd just have one good article on it where you put in what you know. But social media 01:09:21.720 |
trained people. Again, they want to be reading lots of stuff: I want to read this and that and this, 01:09:27.000 |
and I kind of want to immerse myself in what's happening. So they do that. They'll 01:09:31.000 |
generate five articles on something right away as opposed to one. So it's interesting. They've 01:09:34.280 |
had to respond to social media and try to be attractive. But it's fine. 01:09:39.880 |
You can read it on your phone. Just do it once a day. I mean, again, they don't update those apps 01:09:44.280 |
that much because it's a newspaper model. So there's not that much for you to see there, 01:09:48.200 |
even if you check it all the time. All right. We still have a lot more cool stuff on this topic to come, 01:09:53.960 |
including a case study from one of my listeners where they talk about what happened when they, 01:09:59.560 |
as a parent, I'm going to have to gasp here, changed their mind and took back a piece of technology they had 01:10:05.880 |
given their kid. It turns out you can do that. So we'll see what actually happened there. 01:10:10.600 |
It involves a double murder. No, not really, but that would be funny if that's how it ended up, like, 01:10:15.880 |
and he killed me and my wife. Um, no. All right. And we also have a phone call, 01:10:20.840 |
it's about Instagram. What's our phone call about? Yeah, we got a phone call from a 01:10:24.040 |
listener about Instagram. And of course I get to reveal the five books I read in September. So we've 01:10:30.440 |
got a lot of cool content coming forward on this topic, but first stay tuned because we just have to 01:10:35.000 |
take a quick break to hear from our sponsors. So I want to talk about our friends at My Body Tutor. 01:10:42.440 |
I've known Adam Gilbert, My Body Tutor's founder, for many years. He's one of my go-to guys for fitness 01:10:48.120 |
advice. His company, My Body Tutor, is 100% online. It's a coaching program. So it's a 100% online 01:10:55.400 |
coaching program that solves the biggest problem in health and fitness, which is lack of consistency. 01:10:59.320 |
The way this works is you're assigned a coach who you check in with every day with an app on your 01:11:03.720 |
phone. It's quick, but you check in every day, and that accountability gives you consistency. You 01:11:08.760 |
check in on what you ate, you check in on your exercise, and you explain what's going on 01:11:14.360 |
if something unusual is happening, and they give you feedback. And knowing you're checking in with this 01:11:18.440 |
coach every day keeps you on track. The coach helps you figure out your plan. What are we doing with 01:11:22.680 |
your diet? What are we doing with your fitness? The coach can help you when you're like, 01:11:27.000 |
hey, I'm going on a trip, how do I adjust this? So having this coach that's working with you, 01:11:31.880 |
holding you accountable, but also giving you information and helping you adjust. It really 01:11:36.920 |
makes this work way better than if you just try to do this on your own. So if you're trying to get 01:11:42.840 |
healthier, whatever that means to you, and you're worried about trying to do this on your own, 01:11:47.240 |
you've tried before and it's failed, My Body Tutor is a fantastic model for getting this done. I have good news. 01:11:53.880 |
Adam will give Deep Questions listeners $50 off their first month if you mention you came from 01:11:59.480 |
this podcast when you sign up, so definitely do that. Head over to mybodytutor.com, t-u-t-o-r, 01:12:05.560 |
mybodytutor.com. Tell them you came from Deep Questions, get $50 off, and begin your quest towards 01:12:11.640 |
becoming healthier. I just want to talk about our friends at Indeed. Jesse, let me tell you something I'm not 01:12:17.160 |
very good at, which is hiring. We recently hired a creative director to help run the newsletter, 01:12:23.160 |
among other things. And you know what my process was for trying to hire this person? I took a receipt 01:12:28.040 |
I found in my pocket and I wrote on the back: me need newsletter guy, money maybe, non-juggler preferred. 01:12:36.200 |
I just nailed that to a telephone pole. Long story short, the juggler we hired ended up being pretty good, 01:12:42.360 |
actually. So that was okay, but there has to be an easier way to do this. And there is, and it is 01:12:46.120 |
using Indeed. When it comes to hiring, Indeed is all you need. So stop struggling to get your job post 01:12:53.960 |
seen on other job sites. Use Indeed Sponsored Jobs, which will help you stand out and hire fast. It works: 01:13:00.680 |
according to the data from Indeed, sponsored jobs posted directly on Indeed have 45% more 01:13:07.400 |
applications than non-sponsored jobs. You can spend more time interviewing 01:13:13.240 |
candidates who check all your boxes: less stress, less time, 01:13:17.480 |
more results. Now with Indeed Sponsored Jobs, listeners of the show will get a $75 sponsored job credit to get 01:13:25.080 |
their jobs more visibility if they go to indeed.com/deep. Just go to indeed.com/deep right now and support our 01:13:34.360 |
show by saying you heard about Indeed on this podcast. indeed.com/deep. Terms and conditions apply. Hiring? 01:13:42.600 |
Indeed is all you need. All right, Jesse, let's get back into it. 01:13:49.160 |
So let's move to a case study here. This is where listeners write in to talk about how the ideas 01:13:52.920 |
we talk about on the show have actually impacted their lives. We got a good one today, but we can't 01:13:57.800 |
do a case study until we get our minds in the right context. And the best way to 01:14:02.360 |
do that, Jesse, is playing our case study theme music. 01:14:09.320 |
All right. Our case today comes from John. John says, our daughter is currently finishing fifth grade. 01:14:21.560 |
At the end of fourth grade, she began walking home from school. So we got her an Apple watch for calls 01:14:27.480 |
and location tracking. However, at the end of fifth grade, she began focusing on the watch to the 01:14:33.080 |
exclusion of people around her at times, becoming visibly upset by some of the group texts she was on. 01:14:38.440 |
Then her grades began suffering. So after a few attempts to work around the tech, we took the watch 01:14:43.960 |
away permanently, locked up the TV remotes until 5:00 PM during the week and allowed unlimited books. 01:14:50.200 |
Last month, she tested into eighth grade math and achieved the highest score in her grade on an 01:14:55.880 |
end of year literature exam. Her social life certainly hasn't suffered either. She now has a flip phone 01:15:01.240 |
for calls and emergencies, but the novelty wore off quickly and it lacks the intense stimulus of the watch. 01:15:06.120 |
We otherwise do our best to favor in-person interaction through hangouts, sports, and scouting. 01:15:11.000 |
So surprise, surprise, Jesse. You take these technologies that completely overwhelm the short-term 01:15:16.760 |
motivation centers in our brain, and you give them to a fourth grader, and their brain goes haywire. 01:15:22.360 |
And you know, I feel like we pretend this is not true, but it is. It's like that commercial 01:15:27.000 |
we just played. You're giving all these capabilities to someone who's, you know, nine, and 01:15:32.920 |
you're like, okay, we just trust you to use this well. So well done, John, realizing that, 01:15:37.480 |
yeah, you can change your mind. I get this all the time at talks, where people are like, man, I wish I 01:15:41.720 |
hadn't given this phone to my kid. What can I do? You know? And I'm like, 01:15:46.120 |
well, you are the parent. You can change your mind. And if you're not willing to 01:15:51.080 |
fully change your mind, the thing I've been recommending is that people say, okay, yeah, 01:15:55.560 |
you have a phone. It's not yours. It's ours. We pay for it. It's not your property. You have no right 01:15:58.760 |
to it. I always joke in my talks that these kids who, you know, talk monosyllabically, when it comes 01:16:04.840 |
to you trying to take their phone away, become like Burkean private property scholars. Like, 01:16:11.160 |
liberty is built on the private possession of property and you may not tread on my freedom. 01:16:16.040 |
Whatever. It's your phone. And so you just say, okay, at home, the phone lives in the kitchen. 01:16:21.800 |
So when you're home, we plug it in there and you can go there if you need to text your friends 01:16:26.600 |
or you need to check in on things, but you don't have it at the dinner table and you don't have it 01:16:29.640 |
at the couch and you certainly don't have it upstairs in your room with the sheets over your head. 01:16:33.160 |
The phone lives here. None of this, I really don't like it when I hear parents going, 01:16:39.000 |
please, please stop using your phone, we're trying to eat dinner, oh, please stop using your 01:16:44.760 |
phone, what can you do, kids these days? I don't know, it's your phone, you're paying for it, it 01:16:48.200 |
lives in the kitchen, so you can take it back. All right. What do I do with my kids? So people do ask what 01:16:53.720 |
we do. None of my kids have phones. We own a couple of family flip phones, not kids' phones, 01:16:59.000 |
family phones. It's the phone we leave if one of our kids is at home and we're out going for a walk, 01:17:04.200 |
like that's our equivalent of the old-fashioned phone, if you need it, there's an emergency or whatever. 01:17:08.520 |
And if you're going somewhere, like taking the public bus to baseball practice, 01:17:12.760 |
where it would be good to have a backup, or you could tell us something went wrong, 01:17:16.680 |
you can basically check out one of the family flip phones, which are terrible to use, it's 01:17:20.760 |
really bad technology. You can check one out to bring with you to have in case there's a problem. 01:17:25.480 |
When you get back, we take it back. And just like John was talking about, there's something 01:17:28.120 |
interesting about these phones: they don't actually want to use them. That's what we do. When I was worried 01:17:31.800 |
about location tracking for my second grader, what I innovated instead, I think this is a really 01:17:37.240 |
good idea, is that I integrated into his backpack a 150-decibel fire siren. 01:17:45.080 |
And what I have it do is, at random intervals, it just wails for 30 seconds at a time. So I can kind 01:17:50.680 |
of just tell where he is around town. It's like, oh, I hear the fire siren over by the library. 01:17:56.680 |
So that, you know, that works. You gotta innovate. That's the idea. All right. So good. Way to go, 01:18:01.480 |
John. And don't give her a smartphone until high school and have it super locked down. And don't 01:18:05.160 |
give her a less locked down smartphone with social media until she's 16. That's my advice. All right. 01:18:09.240 |
Do we have a call this week? We do. All right. Let's hear this. Hi, Cal. Thank you so much for 01:18:14.120 |
your content and writing. It's been a great help for me and countless others. My question today is 01:18:20.120 |
around Instagram, as an aspiring photographer. I'm wondering what your thoughts are on whether I need to 01:18:27.800 |
grow a presence on Instagram to make a success in photography, or do you think I can do it without it? 01:18:35.720 |
It feels like in today's world, it'd be difficult to be successful in fields like this 01:18:40.760 |
without a social media presence. I appreciate you, mate. Thank you. All right. It's a good question. 01:18:47.080 |
I don't know that it's vital for you to be successful, but let's say you're worried about it and we want 01:18:52.520 |
to alleviate this worry. Understanding what we now know about how the brain works, and what creates that urge 01:18:57.560 |
to look at your phone that we talked about in the beginning of the show, gives us some options 01:19:01.640 |
here that can give us a little bit more confidence. And the clear option that comes to mind for 01:19:04.840 |
your situation is: if you need an Instagram presence for your photography, it should have nothing to do 01:19:11.400 |
with your phone. Do it from your laptop. Now, this is going to be easier for you, by the way, 01:19:16.680 |
because if you're a professional photographer, your photos are not being taken on your phone. 01:19:20.040 |
So you're going to have to upload them onto your computer anyways before you post them. 01:19:23.800 |
One of the reasons why Instagram is so successful is it was mobile native, because for 01:19:31.160 |
most people, the photos they were posting were taken on their phone. 01:19:36.520 |
And the reason why it took off ahead of Facebook, at the time when it really began to take off in 01:19:41.400 |
the 2010s, is because Facebook was just moving onto mobile, but people didn't yet have 01:19:47.720 |
that ingrained habit of, oh, I want to go to Facebook on my phone. They were used to it as 01:19:51.320 |
something they did at work when they were bored. It was a website they went to. But Instagram was 01:19:55.560 |
mobile native, because the whole idea was, I took a picture on my phone and now I can go over into 01:20:01.560 |
Instagram and post it. And so it got this really big user growth. But it wasn't just the user growth, 01:20:06.600 |
it was the engagement that really made Facebook perk up: oh, wow, Instagram users use it a lot, 01:20:12.360 |
because the camera was on your phone. And so it was the first real social app where people associated 01:20:18.440 |
it with, oh, I do this on my phone. And then the reward signal got really 01:20:24.200 |
strong, and they built up that pattern recognizer in their short-term motivation system, and they 01:20:27.240 |
started picking up their phone all the time. So Instagram, more than Facebook, actually got people 01:20:30.520 |
looking at their phone more than they planned. Okay. So do your photography 01:20:35.080 |
Instagram on your computer. Don't have it on your phone. Don't log in to Instagram on Safari on your 01:20:40.200 |
phone. Have a bad password and never type it in on your phone, so it's not something you can do impulsively. 01:20:44.600 |
By bad, I mean good, in the sense that it would be complicated to remember, kind of confusing. 01:20:51.080 |
You know what I'm talking about, right? And so then this could be, 01:20:56.360 |
you know, have a plan. Think about your Instagram for photography as one of the boring things 01:21:01.560 |
that you have to do. It's like sending out your stupid invoices and the invoice reminders: 01:21:05.800 |
I got to go into QuickBooks and click these buttons and I hate it, and no one ever 01:21:11.080 |
really knows what's happening in QuickBooks, and you just press these buttons and you're like, I think 01:21:15.000 |
that kind of worked, right? Treat it like that. I got to log in on my computer, I got to 01:21:19.080 |
import the photos, and I'm going to change the format and make it more of this. I go to my Instagram 01:21:23.880 |
account, I load it in, I have this copy, and I post that. And then I log out again. And you do it three 01:21:30.680 |
times a week, and that's how your stuff gets up there. Your brain never builds up the cue of your 01:21:38.360 |
phone being involved. Without your phone being involved, you're not going to consume a lot of 01:21:43.640 |
Instagram, so you're not going to have that strong reward signal associated with it. And so you're 01:21:50.200 |
not going to get that strong vote being fired up in your brain all the time, and it's not going to 01:21:54.120 |
lead to technology overuse, unless you work at your computer all the time and maybe you could 01:21:57.960 |
learn that cue of, I want to go over to the Instagram website. So do it that way. But the other thing I 01:22:02.040 |
would recommend, this is an idea from my book Deep Work: at some point, either before you do this or after 01:22:07.640 |
you've done it for a while, take a 30-day break where you don't do anything on social media and 01:22:11.800 |
see, does it matter? Does anyone notice? Does it make any difference? Because a lot of times 01:22:16.200 |
you might realize, okay, my 75 followers on Instagram and that one photo that went viral are 01:22:21.320 |
not making a big difference. That's not where the action is. Where I really need to be is 01:22:24.760 |
on Thumbtack as a preferred contractor, or on LinkedIn, or it has nothing to do with the internet, it has to do 01:22:30.360 |
with shows or whatever. So, you know, test it out. Don't just assume it's vital, 01:22:35.080 |
but do it on your computer and it won't be that addictive. All right. Final part of our show. 01:22:39.960 |
We're going to need a sound effect for this one. Yeah, well, we have a transition. Oh, do we? Yeah, 01:22:45.080 |
let's do transitions. Okay. Part three. Let's hear a sound here. Final part of the show. It's early 01:22:51.080 |
in the new month, so I'm going to talk about the books I read in the last month. My goal is 01:22:56.120 |
always to read five books a month. Reading, reading, reading is the best way to internet-proof your brain, 01:23:01.320 |
the best way to become smarter, the best way to be able to think better. Like, 01:23:05.240 |
everything good, cognitively, comes from reading. It's like walking for 01:23:11.800 |
your physical health. All right. So I try to read five books a month. I want to talk about the 01:23:16.040 |
five books I read in September 2025. A quick shout-out first: I like to shout out friends of the show that 01:23:21.000 |
have new books out. A friend of the show, Robert Glazer, I've been on his podcast, the Elevate podcast, several 01:23:25.560 |
times. He has a cool new book out this week called The Compass Within, little stories 01:23:31.320 |
about the values that guide us. It's about how we find authenticity and fulfillment in every area 01:23:36.520 |
of our lives. It's built around a parable. Check that out: The Compass Within. All right. So what are 01:23:41.080 |
the books I actually read? Three novels. Let me look here. Three novels this month. 01:23:46.520 |
Unusual for me. First novel: a novel from my childhood that I saw in a little free 01:23:53.160 |
library around Takoma Park, and I grabbed it and read it. The Ice Limit by Lincoln Child and Douglas 01:23:58.360 |
Preston. Classic early-2000s techno-thriller. The premise is basically there's a giant meteorite 01:24:07.240 |
on an island off the southern tip of South America, and this billionaire has hired Eli Glinn 01:24:14.120 |
and Effective Engineering Solutions, this sort of mysterious engineering company that's 01:24:22.680 |
under the radar but can do the impossible. And their whole plan 01:24:28.120 |
is how are we going to extract this meteorite and get it back to this guy's museum in upstate New York? 01:24:33.080 |
It's good. And it would be the heaviest object ever moved by man, because it's super dense. And then, 01:24:39.560 |
like all great techno-thrillers, there's weird things happening with the meteorite, 01:24:45.240 |
right, but also the Chilean Navy, this one commander in the Chilean Navy who has this 01:24:52.440 |
really weird, interesting backstory, is really bitter, is like, I'm suspicious of 01:24:57.880 |
this. So he's around bothering them, and eventually he has reasons to want to kill them. And so, 01:25:03.560 |
like a good techno-thriller, you have all these things coming together for the final sequence. 01:25:07.880 |
I love classic techno-thrillers. There's not as much of a market for them anymore. A lot of the 01:25:11.400 |
genre fiction market right now, outside of fantasy, like the Brandon Sanderson types, 01:25:18.680 |
is not techno-thrillers. It's people falling in love with dragons and romances with fairies and dark 01:25:26.040 |
academia, and I read one of those a couple of summers ago. Anyways, fiction is a big market, 01:25:32.520 |
but techno-thrillers, they were the thing in the eighties and nineties. They're not anymore, 01:25:36.440 |
which I think is too sad. Until I write my techno-thriller, which is going to be about Jesse's skeleton 01:25:43.880 |
going back in time, maybe to fight Vikings. I don't know. I got to figure that out. All right. 01:25:47.480 |
Second book I read: a Byung-Chul Han book, The Burnout Society. So early in the 01:25:54.040 |
summer, I did a podcast about his book about the swarm, I forgot the 01:26:00.840 |
exact name, In the Swarm, I think. Anyways, he's a philosopher 01:26:08.200 |
in Germany. I believe he's South Korean, a South Korean philosopher at a German institution, and he 01:26:13.480 |
writes this series of books that are a little bit more accessible than a lot of continental 01:26:17.800 |
philosophy. And they're sort of aphoristic, and they're very popular among Gen Z or whatever. 01:26:22.280 |
And so I had done a podcast about his other book, whose topic seemed most relevant, but this 01:26:27.720 |
is the book that everyone talks about from him, The Burnout Society. So it was good. Yeah. It's 01:26:32.680 |
interesting. I mean, it's interesting. He talks gnomically, kind of makes these 01:26:39.160 |
pronouncements. It turns out there's a lot of other writers, especially from the 60s 01:26:44.280 |
and 70s critical theory era, that write the same way. So the format is not as novel as I think a lot of 01:26:48.600 |
his Gen Z readers think, but it's interesting stuff. I think it's 01:26:53.320 |
a little Marxist-influenced. I mean, he plays with a lot of different philosophers. 01:27:00.200 |
So it's cool. I thought it was interesting writing. I took a lot of notes. Maybe I should 01:27:02.920 |
do a podcast about it at some point. But he's hard to read, but easy to read, 01:27:07.000 |
right? You know what? He's an easy-to-read hard writer, if that makes sense. He's a real philosopher, 01:27:11.880 |
but he makes it just accessible enough that you can sort of get wisdom out of it. All right. 01:27:16.360 |
Second novel I read: Down and Out in the Magic Kingdom by Cory Doctorow. Now this was interesting. 01:27:23.880 |
I didn't get this book at first, and then I did. The premise for this book: 01:27:30.840 |
it takes place in the future. We're at a future time where basically 01:27:37.640 |
death has been solved, and money too. You don't really have to work, and they can 01:27:45.960 |
replace your body. If you get killed, you download your brain, and they can 01:27:50.920 |
either put it into your body again after fixing you or do a 01:27:55.400 |
rapid clone of your body. People aren't worried about death. And there's this weird 01:28:01.240 |
sort of culture going on where people are just doing projects, and you can kind of 01:28:05.720 |
take over a project because everyone has a popularity ranking. And if people like you, 01:28:11.240 |
your reputation is good and you can kind of take things over. And the main character lives in Disney World 01:28:16.040 |
in Florida, but it's not run by Disney anymore. These different coalitions kind of take 01:28:21.640 |
over parts of the park, and they can kind of just run them, because to try to 01:28:27.000 |
push them out and take over that part of the park would make your reputation really fall, unless 01:28:31.800 |
their reputation falls enough, and then you can come in and do it anyways. At first, when you're reading 01:28:36.600 |
this book, you're like, what? This novel is not doing it for me. There's 01:28:41.960 |
none of the normal stakes you need for drama. Nothing really matters. This seems so fake. 01:28:47.400 |
It's weird, this world where they're trying to redo the Haunted Mansion so that 01:28:52.680 |
this other group doesn't take it over with their technology, and everyone kind of cares about it, 01:28:56.680 |
but it's also stupid, and it's all kind of weird and superficial, and there's no stakes. The main 01:29:01.400 |
character gets shot early on and they just reload him. And, you know, maybe the only 01:29:05.000 |
stakes are that his neural interface isn't really working. I don't care about what they care 01:29:09.560 |
about. I don't empathize with this. And it doesn't make sense. It's plastic. 01:29:14.600 |
It doesn't seem like a real world. And then you get it. He's personifying online 01:29:20.760 |
culture, and doing a brilliant job of it. It's urgent and at the same time 01:29:27.400 |
meaningless. You have these things you really care about and you're racing to get them 01:29:32.680 |
done, but it doesn't matter, and no one cares. And it's all reputation and popularity, and this is 01:29:39.640 |
having its moment and now this or whatever. What Doctorow is really doing, 01:29:45.000 |
so it's maybe more postmodern in this sense, is he's 01:29:48.120 |
personifying the sort of plasticness and emptiness of online culture 01:29:52.760 |
in this view of the future. And then once you get that, you're like, oh, 01:29:55.560 |
that makes it a hard read, right? Because it doesn't have all of the normal elements 01:30:00.360 |
you might want out of drama, like empathy with characters, stakes that you care about, 01:30:04.120 |
surprise, and caring about the resolution. You don't really care. 01:30:08.360 |
But in doing that, you're like, oh, I see what you're saying. This is actually really smart. 01:30:12.040 |
So it's a really good book. I'm actually glad I read it, but it took me halfway 01:30:15.160 |
through to really get what was going on there. Then I went to a nonfiction book: 01:30:20.840 |
Sarah Hurwitz's new book, As a Jew. Sarah Hurwitz is a speechwriter. She was a 01:30:29.400 |
Clinton speechwriter and a Michelle Obama speechwriter, so she's a DC person. She lives around 01:30:35.480 |
here somewhere. And in her thirties, she wrote a book called Here All Along, which was about 01:30:40.120 |
rediscovering, she's Jewish but was non-practicing, and then rediscovered Judaism 01:30:45.320 |
in her thirties, and how it really helped her in a lot of ways. And she kind of wrote 01:30:51.400 |
a book about it. And this new book is actually pretty interesting. 01:30:55.640 |
In it, she's confronting her tendency when she was younger to say things like, 01:31:03.320 |
you know, I'm a cultural Jew, or I'm a social justice Jew, or this or that, when, 01:31:09.000 |
you know, she didn't really know much about it. And she's basically making the argument in this book 01:31:13.400 |
that she now sees this as just part of the last hundred years, 01:31:20.360 |
like 150 years, of an ultimately fruitless effort of Jewish individuals to try to assimilate, to try to 01:31:26.680 |
avoid the baggage and the harm and what they've had to put up with for 01:31:33.960 |
the last 2,000 years. The idea of, if we just assimilate enough, then, 01:31:38.600 |
you know, people will like us or whatever. And she's basically coming to grips with, 01:31:41.640 |
it never works. This happened in the early 20th century. This happened in the 19th 01:31:45.880 |
century. This happened mid-century. And it never works. And she's kind of coming to grips with, 01:31:49.480 |
I have to come to grips with being Jewish and what that means. She said, 01:31:54.120 |
I didn't talk about Israel in my first book, and I can't not talk about it. 01:32:00.200 |
I'm Jewish. I have to confront it and its role in my life. I didn't talk about antisemitism in my 01:32:04.760 |
first book, but it's the defining force for the way I was acting. So I don't know, I 01:32:09.000 |
liked this better than her first book in the sense that it's more self-reflective. 01:32:15.720 |
To place it in the context of the show, it's very deep-life relevant. 01:32:21.080 |
So if you want a good example, a particular case study, of someone trying to understand their life 01:32:25.640 |
and what matters, and how to transform it into something deeper when it has 01:32:30.360 |
become something more shallow, this and her last book are a great 01:32:34.040 |
contribution to that category. But I like the second book better because I think it's more 01:32:38.520 |
psychologically self-reflective and searing. It's more interesting in that sense. 01:32:43.960 |
It's really a person grappling with something they're really dealing with, 01:32:48.840 |
in a way that is real and messy, which I think is often more meaningful than a more pat, you know, 01:32:57.080 |
I was struggling and I found this and then I'm better. So I thought that was really interesting. 01:33:00.200 |
She does a really good history of antisemitism, which I think is really good. 01:33:05.560 |
I mean, the best work on this, I really think, is Dara Horn's People Love Dead Jews, and I think 01:33:12.040 |
some of the writing she's done on this recently for the Atlantic and Tablet is probably the best. 01:33:15.800 |
But Hurwitz's, because she's a speechwriter, is very good. The thing I noticed is, 01:33:19.880 |
some of this history stuff can be kind of dry, but speechwriters are very good at, 01:33:25.560 |
you know, moving it forward and simplifying the language. And I learned a lot, so that was good. I 01:33:30.040 |
think her thing about where antisemitism comes from, it's a 2,000-year groove that you can 01:33:35.400 |
trace it back to, and she goes through the history, that was very useful. So I'm thinking 01:33:40.520 |
about the book a lot. So, you know, whether or not you care about the struggles of Jewish identity, 01:33:47.960 |
it was, I think, a good book, if you care about someone, she's roughly my age, 01:33:55.560 |
struggling with, more generally speaking, what am I all about, and what does that actually 01:34:00.680 |
mean? Because I think a lot of us go through that same series of questions when we're trying to figure 01:34:07.160 |
ourselves out. All that really differs is the specific details of those questions. So, you know, 01:34:12.520 |
if you're a secular Jew from DC, your questions might be centered on that, but 01:34:18.760 |
different people will have different content behind those questions. But how people tackle those 01:34:24.280 |
questions, I think, is really important for thinking about how you build a deep life in a world 01:34:28.760 |
where the digital is trying to make it superficial and fragmented. So it stuck with me. I think it was 01:34:33.560 |
uh, it was interesting, interesting book. All right. Final book, a novel. I read the searcher by Tana French, 01:34:40.120 |
Everyone's been talking about this novel, and it was good. It's genre, but it's highbrow genre, lit genre, and she's a really good writer. Technically, I guess, it's detective mystery. It's about a retired Chicago police officer named Cal Hooper, which I appreciated. And I like the premise; it's very deep-lifey. He's just done: he's divorced, his daughter is grown, and being on the police force was hard. So he moves to a small town in very rural Ireland to fix up an old house. He has ten acres, he walks into town to the pub, gets to know the local characters. One of those classic setups.
He just wants to get away from it all, and that's the plan. But then mystery finds him: a local kid shows up and says, my brother's missing. Against his better judgment, Cal Hooper gets drawn in; for various reasons he feels he has to figure it out. And then the pushback comes from the town: you shouldn't be asking those questions. So it's a great setup. I love the setting, and I love the deep life themes: escaping the city to move to Ireland, what's genuinely good about that, and which parts are fantasy; it's not what I expected. Tana French is fantastic at writing these types of books. So that was a good one.
So there you go: those were the books I read in September. All right, I think that's all the time we have, right, Jesse? Thanks for listening. We'll be back next week with another episode of the podcast. Hopefully everyone's getting their Halloween decorations ready. I just showed Jesse the sound-and-light synchronization controller I built from scratch, which I'm hoping to install this weekend. I'm looking forward to it. It's incredible. Hopefully everyone's doing something similar.
Until then, as always: if you liked today's discussion of what happens in our brains that makes our phones so compelling, you might also like episode 371, titled "Is It Finally Time to Leave Social Media?", which is more about the social impact, the impact on our humanity, of using these technologies. It's a great match for today's episode. Check it out; I think you'll like it.
Those of us who study online culture like to use the phrase "Twitter is not real life." But as we saw yet again this week, when the digital discourses fostered on services like Twitter or Bluesky or TikTok do intersect with the real world, whether they originate from the left or the right, the results are often