
Avoiding Distractions & Doing Deep Work | Dr. Cal Newport & Dr. Andrew Huberman


Chapters

0:00 Deep Work and Digital Distraction: The Battle Against Social Media
0:54 The Illusion of Internet's Allure Without Social Media
1:11 Confronting FOMO and the Anxiety of Disconnection
1:46 The Evolution of Connectivity and Its Impact
2:21 Navigating the Digital Age: Personal Strategies and Anecdotes
3:57 Exploring the Psychological Effects of Social Media and Smartphones
4:22 The Debate on Digital Dependency: Addiction vs. Extension of the Brain
6:53 Reimagining Internet Usage: A Call for Cultural Shift
8:02 Personal Experiences and the Power of Unplugging
9:43 Closing Thoughts and Invitation to Full Episode

Whisper Transcript

00:00:00.000 | - In terms of deep work and getting a little bit back
00:00:05.280 | to kind of practical steps towards deep work,
00:00:07.640 | I also have to ask you 'cause I didn't earlier,
00:00:10.260 | when you are on your laptop in your library
00:00:12.580 | with your fireplace and these books,
00:00:14.160 | it's a beautiful image, actually,
00:00:15.480 | that you've drawn for us in our minds,
00:00:17.800 | is the Wi-Fi connection to your computer activated
00:00:21.240 | or are you offline?
00:00:23.240 | - It's connected because it doesn't really matter to me,
00:00:26.360 | you know, because what's drawing my attention?
00:00:29.540 | I mean, the most important decision I think I made,
00:00:32.040 | technically speaking, to be a cognitive worker
00:00:35.540 | is the lack of social media.
00:00:37.760 | Like, I think we underestimate the degree to which
00:00:42.320 | our problem with digital distraction is not the internet,
00:00:45.240 | it's not our phones, it is specific products and services
00:00:48.120 | that are engineered at great expense
00:00:50.500 | to pull you back to 'em.
00:00:51.720 | When you take that away,
00:00:52.560 | the internet's not that interesting.
00:00:54.400 | Like, I don't have a cycle of sites to go to.
00:00:58.240 | You know, I can check my email,
00:00:59.800 | but I don't really know where else to go.
00:01:01.240 | I mean, I could go to the New York Times, I guess,
00:01:02.700 | but then you've seen the articles, right?
00:01:03.940 | They change it once a day.
00:01:05.620 | There's just not much. I've set things up
00:01:07.340 | so there's not much that's that interesting to me.
00:01:09.780 | - We've all heard of FOMO, fear of missing out.
00:01:14.780 | I feel like there's the other thing,
00:01:16.840 | which is fear of missing something bad, right?
00:01:21.420 | Sort of like an anxiety,
00:01:22.420 | a more primitive anxiety within us
00:01:24.100 | that if we are not engaged on social media
00:01:26.460 | or looking at our phone often or texting often,
00:01:28.740 | that it's not that we'll miss the party,
00:01:31.520 | we'll miss the emergency.
00:01:32.800 | You don't seem to suffer from those kinds of everyday ills.
00:01:37.180 | - Yeah, I mean, it doesn't happen that much.
00:01:39.140 | I mean, I have a phone, you know.
00:01:41.060 | - A standard.
00:01:42.260 | - No, I mean, I have my phone.
00:01:43.220 | I guess if I'm working away from it,
00:01:44.420 | yeah, I guess it's true if there was an emergency.
00:01:47.460 | But this was the case for a very long time, right?
00:01:49.720 | We didn't have smartphones till really relatively recently.
00:01:52.620 | This is, you know, 15 years ago.
00:01:54.980 | So we were just used to this until yesterday, essentially,
00:01:59.100 | that there's just periods of time where you're out of touch.
00:02:01.740 | Like you're at a restaurant with someone,
00:02:03.840 | you're out of touch until you get back to your office.
00:02:05.500 | Like we were okay.
00:02:06.420 | You know, we weren't plagued by emergencies
00:02:08.340 | that led to disastrous results
00:02:10.660 | because we couldn't hear about it, right?
00:02:12.020 | Then you go to the movies, like you're out of touch, right?
00:02:13.780 | It'd be a couple hours, then you're in touch again.
00:02:16.180 | And so I don't, you know,
00:02:17.100 | it's not something that's affected me as much.
00:02:18.980 | So maybe I'm working without my phone nearby.
00:02:21.180 | A lot of people have this response.
00:02:22.580 | They begin sort of catastrophizing.
00:02:24.100 | Like what if this happens or this or that?
00:02:25.620 | And I'm thinking, you know, I survived before that.
00:02:28.180 | My parents survived without that.
00:02:29.540 | My grandparents survived without that.
00:02:32.300 | I don't worry about it as much, you know?
00:02:34.380 | And some of this maybe is just,
00:02:36.300 | this doesn't upset people as much as it used to,
00:02:38.540 | the fact I don't use a lot of these apps or have my phone.
00:02:41.340 | But it really does upset people, right?
00:02:43.740 | There's, well, what about this?
00:02:44.780 | What about that?
00:02:45.600 | What about this?
00:02:46.440 | And I don't know how much of this is just maybe
00:02:48.700 | I'm oblivious and how much of this is people
00:02:50.900 | back-solving an explanation for why they do need their phone,
00:02:54.540 | why they do need to look at it all the time.
00:02:55.860 | But I get a lot of it.
00:02:57.340 | - Yeah, well, maybe they're upset and you don't know
00:02:58.620 | because you're not looking at your phone.
00:02:59.820 | - That's right.
00:03:01.020 | Hey, I'll tell you what, that's a blessing.
00:03:02.700 | Not knowing how upset people are at you.
00:03:04.540 | Yeah, it's a blessing as a semi-public figure.
00:03:06.460 | I'll tell you that.
00:03:07.540 | - Yeah, I can comment on that, but I won't.
00:03:09.700 | I am on social media and I do enjoy it.
00:03:14.300 | I sort of got started posting on Instagram
00:03:16.040 | and then expanded to other platforms,
00:03:17.780 | including the podcast.
00:03:18.980 | But there's a threshold beyond which it becomes
00:03:21.540 | counterproductive for sure.
00:03:23.080 | I think there's information there,
00:03:25.760 | like questions that people ask are often informative.
00:03:29.220 | It's sort of like ending a class
00:03:30.860 | and asking, are there any questions?
00:03:32.120 | Sometimes the comments that people bring back
00:03:33.900 | are truly informative, both about where they might have
00:03:37.460 | some misunderstanding, but also sometimes
00:03:40.100 | some really terrific ideas.
00:03:42.140 | So there's that, but I completely agree
00:03:45.440 | that this is a very precarious space.
00:03:48.760 | And I'll just relay a quick anecdote.
00:03:52.120 | Years ago, I gave a quick lecture
00:03:54.460 | down at Santa Clara University, south of Stanford.
00:03:56.400 | And I was talking about this issue.
00:03:57.440 | I recommended your book and a student came up afterwards
00:03:59.800 | and he said, "You don't get it."
00:04:02.240 | At that time I was in my early forties.
00:04:03.560 | He said, "You don't get it.
00:04:05.200 | You grew up without social media and the phone.
00:04:07.760 | And so you've adopted it into your life,
00:04:09.660 | but we grew up with it.
00:04:11.040 | And when my phone," he's speaking for himself
00:04:13.100 | and in the first person, "When my phone loses power,
00:04:16.640 | I feel a physical drain within my body.
00:04:18.880 | And when it comes back on, I feel a lift within my body."
00:04:22.560 | So I'd love your thoughts on whether or not you think
00:04:25.480 | the phone and perhaps social media as well
00:04:28.800 | are in some ways an extension of our brain.
00:04:30.980 | It's almost like another cortical area
00:04:33.080 | that contains all this information.
00:04:34.680 | It's a version of us.
00:04:35.920 | This gets into notions of AI that we can talk about as well.
00:04:38.360 | I know you're involved in AI and writing about AI.
00:04:41.160 | But to me, when the phone is used in that way,
00:04:44.900 | it really is almost like a piece of neural machinery of sorts.
00:04:51.000 | - Yeah, I mean, there's two ways of looking at it.
00:04:53.360 | Yeah, so there is the sort of cyborg image, I suppose, right?
00:04:57.080 | Like you're extending, you're plugging into this new sphere.
00:05:01.520 | Like you have this sort of digital network extension
00:05:03.600 | of information and what's going on.
00:05:05.560 | There's also the much more pessimistic view,
00:05:07.840 | which is no, no, that feeling is the feeling
00:05:10.280 | of a moderate behavioral addiction, right?
00:05:12.600 | So you'll hear the same thing from a gambler.
00:05:16.040 | I really, when I'm away from being able to play, right,
00:05:19.280 | to make my bets or do whatever, like I feel really,
00:05:21.560 | I feel not myself.
00:05:22.920 | And then when I'm around it and I can play and make some bets,
00:05:25.760 | play some poker, whatever it is.
00:05:26.600 | - The feeling of the chips.
00:05:27.520 | - I feel myself, the chips, right?
00:05:29.440 | Like they would say, so it could be,
00:05:31.280 | both of these things could be true.
00:05:33.120 | I think the moderate behavioral addiction side
00:05:35.100 | is more true than a lot of us want to admit, actually.
00:05:38.400 | Like it does feel bad because moderate behavioral addictions
00:05:41.800 | build these feedback response loops.
00:05:44.360 | And then you get the dopamine system going,
00:05:45.920 | with the anticipation.
00:05:47.760 | Because what's on there is things that have been engineered
00:05:50.160 | so that you're gonna get this sort of highly engaging stimuli.
00:05:52.520 | And then you see the deliverer of that stimuli, right?
00:05:55.300 | This really nice piece of glass on a piece of metal,
00:05:58.480 | I'm gonna press this sort of carefully,
00:06:00.920 | this icon whose colors have been chosen
00:06:03.200 | because we know it's gonna hit various parts
00:06:05.880 | of our neural alert systems to be as engaging as possible.
00:06:09.040 | And I'm gonna see something in there
00:06:10.800 | that's gonna generate some sort of emotional response.
00:06:13.360 | So of course, when you see that thing sitting there,
00:06:15.680 | you wanna use it.
00:06:16.520 | And when you can't, it's a stymied dopamine response.
00:06:19.840 | You're like, this is not good, I'm uncomfortable.
00:06:22.520 | And I think that's a big part of it as well.
00:06:25.120 | Because I've had this argument with some people.
00:06:27.680 | And by the way, I see both sides of this.
00:06:29.840 | Like there are great advantages
00:06:31.880 | to what people are doing with these tools.
00:06:33.560 | It's just that it's all mixed up
00:06:35.060 | with all these disadvantages.
00:06:36.680 | And it becomes very difficult.
00:06:38.320 | It's like the alcohol in the neighborhood bar is too potent.
00:06:41.680 | And people are going there to socialize
00:06:43.800 | and they're coming home at three in the morning,
00:06:46.400 | passing out, it's like the balance is off.
00:06:48.640 | Not that there's not something good there,
00:06:50.600 | but the balance is off.
00:06:51.500 | So it becomes pretty difficult to navigate.
00:06:53.120 | So I think some of that's what's going on,
00:06:54.760 | especially with the younger generation
00:06:56.080 | that was raised on it.
00:06:57.400 | Which is why, by the way,
00:06:58.960 | I think the cultural norms are gonna change around this.
00:07:01.560 | I think we're gonna think about unrestricted internet usage
00:07:04.960 | not as something that we just sort of bequeath to youth
00:07:08.120 | as they become 10 years old,
00:07:09.560 | but something that we're actually much more careful about.
00:07:11.920 | Probably something that's gonna be post-pubescent
00:07:14.360 | is gonna make a lot more sense
00:07:15.440 | once you've had more brain development,
00:07:17.200 | once you've had more social entrenchment,
00:07:19.920 | you sort of understand your identity, et cetera.
00:07:22.440 | Because we recognize the flip side
00:07:25.000 | of plugging this thing into your brain is,
00:07:26.200 | yeah, you have access to more information,
00:07:27.480 | but it also pumps that into your brain.
00:07:30.080 | So I don't know, I lean a little bit heavier
00:07:32.720 | towards the pessimistic read
00:07:34.200 | because I know too many people because of my books
00:07:37.200 | who've really reduced the impact
00:07:38.420 | of these things in their lives.
00:07:39.680 | And they don't, on the far side of that transformation,
00:07:43.520 | they don't typically report a great impoverishment
00:07:45.920 | in experience.
00:07:47.080 | They don't report, I'm less mentally agile,
00:07:50.460 | the information at my fingertips is less,
00:07:52.300 | I'm missing out on life.
00:07:54.320 | There's typically this coming out of the fog
00:07:56.100 | on the other side of it where they're like,
00:07:57.280 | oh, this is fine.
00:07:58.800 | So I'm a little bit suspicious
00:08:00.320 | about exactly what this mechanism is.
00:08:02.440 | - I think you're right
00:08:03.280 | about the moderate behavioral addiction piece.
00:08:06.880 | Years ago, when I was starting my lab,
00:08:08.920 | I had grants to write and I found the phone
00:08:11.280 | to be pretty intrusive for that process.
00:08:14.020 | So I used to give the phone to somebody in my lab
00:08:15.840 | and announced to everyone in my lab
00:08:17.080 | that if I asked for it back prior to 5 p.m. that day,
00:08:20.440 | I would give everyone in the lab,
00:08:21.640 | I think it was a hundred dollar bill.
00:08:22.760 | My lab was pretty big at the time.
00:08:24.040 | As a junior professor,
00:08:26.720 | academic institutions, not to be named,
00:08:30.160 | did not pay us very much, despite what people might think.
00:08:32.240 | And it was difficult several times
00:08:35.640 | throughout the day or more.
00:08:36.480 | I was like, ah, I really want to look at that thing.
00:08:38.120 | But at the end of the day, I'll tell you that no one got paid.
00:08:41.840 | I got my phone back.
00:08:42.880 | But it's wonderful the amount of work that you can get done
00:08:45.340 | when that thing is out of the room.
00:08:46.720 | - I mean, it's my superpower, right?
00:08:49.320 | I don't work that hard
00:08:50.800 | in the sense that I don't do long hours.
00:08:52.760 | Like I'm not constitutionally suited for long hours.
00:08:55.840 | This was never my thing.
00:08:57.760 | My brain tires, right?
00:08:59.660 | I mean, I'm good for four,
00:09:01.580 | four and a half good hours a day
00:09:03.660 | of actually producing good stuff with my brain, probably max.
00:09:07.540 | But you know, I don't use my phone that much.
00:09:08.860 | I don't use the internet that much.
00:09:10.580 | And I prioritize it and a lot just gets done.
00:09:12.980 | It just sort of piles up over time, you know?
00:09:15.220 | And there's a sense of like,
00:09:16.060 | you must be burning the midnight oil
00:09:17.500 | and you have all these things going on.
00:09:19.620 | But again, I think people underestimate
00:09:21.420 | the impact of this.
00:09:24.060 | It's not just the accumulation of time
00:09:26.340 | you spend looking at your phone.
00:09:28.480 | It's also this network switching cost, right?
00:09:31.060 | Because like the phone is very good
00:09:32.540 | at inducing a network switch.
00:09:34.180 | And that's an expensive, time-consuming,
00:09:37.120 | energy-consuming neuronal operation.
00:09:40.140 | - Task switching.
00:09:41.140 | - I'm gonna switch my focus of attention from this to that.
00:09:44.000 | - Thank you for tuning into the Huberman Lab Clips channel.
00:09:46.700 | If you enjoyed the clip that you just viewed,
00:09:48.820 | please check out the full length episode by clicking here.