Cal Newport: Deep Work, Focus, Productivity, Email, and Social Media | Lex Fridman Podcast #166


Chapters

0:00 Introduction
2:15 Deep work
7:00 Focus
12:43 Time blocking
19:38 Deadlines
29:13 Do less, do better, know why
31:55 Clubhouse
45:58 Burnout
52:25 Boredom
1:00:10 Quit social media for 30 days
1:10:04 Social media
1:35:12 How email destroyed our productivity at work
1:44:57 How we fix email
1:51:59 Over-optimization
1:56:14 When to use email and when not to
2:03:57 Podcasting
2:08:33 Alan Turing proving the impossible
2:12:32 Fragility of math in the face of randomness
2:21:21 Neural networks
2:30:06 What will the P=NP proof look like?
2:33:46 Is math discovered or invented?
2:37:53 Book publishing
2:47:59 Love
2:51:21 Death
2:54:17 Meaning of life

Transcript

00:00:00.000 | "The following is a conversation with Cal Newport.
00:00:02.480 | "He's a friend and someone who's writing,
00:00:04.640 | "like his book, 'Deep Work' for example,
00:00:06.920 | "has guided how I strive to approach productivity
00:00:09.600 | "and life in general.
00:00:11.220 | "He doesn't use social media,
00:00:13.100 | "and in his book, 'Digital Minimalism',
00:00:15.640 | "he encourages people to find the right amount
00:00:17.760 | "of social media usage that provides value and joy.
00:00:21.120 | "He has a new book out called 'A World Without Email',
00:00:24.700 | "where he argues, brilliantly I would say,
00:00:27.240 | "that email is destroying productivity in companies
00:00:30.120 | "and in our lives.
00:00:31.660 | "And, very importantly, he offers solutions.
00:00:35.500 | "He is a computer scientist at Georgetown University
00:00:38.680 | "who practices what he preaches.
00:00:41.060 | "To do theoretical computer science
00:00:42.820 | "at the level that he does it,
00:00:44.540 | "you really have to live a focused life
00:00:46.780 | "that minimizes distractions
00:00:48.300 | "and maximizes hours of deep work.
00:00:51.100 | "Lastly, he's a host of an amazing podcast
00:00:54.340 | "called 'Deep Questions' that I highly recommend
00:00:57.600 | "for anyone who wants to improve their productive life.
00:01:00.720 | "Quick mention of our sponsors,
00:01:02.840 | "ExpressVPN, Linode Linux Virtual Machines,
00:01:06.680 | "Sun Basket Meal Delivery Service,
00:01:09.000 | "and Simply Safe Home Security.
00:01:11.340 | "Click the sponsor links to get a discount
00:01:13.360 | "and to support this podcast."
00:01:15.520 | As a side note, let me say that 'Deep Work'
00:01:17.460 | or long periods of deep, focused thinking
00:01:20.800 | have been something I've been chasing more and more
00:01:22.840 | over the past few years.
00:01:24.560 | Deep work is hard, but it's ultimately the thing
00:01:28.080 | that makes life so damn amazing.
00:01:30.960 | The ability to create things you're passionate about
00:01:33.820 | in a flow state where the distractions
00:01:35.880 | of the world just fade away.
00:01:37.960 | Social media, yes, reading the comments,
00:01:41.560 | yes, I still read the comments,
00:01:43.520 | is a source of joy for me in strict moderation.
00:01:46.960 | Too much takes away the focused mind,
00:01:49.260 | and too little, at least I think,
00:01:51.600 | takes away all of the fun.
00:01:53.760 | We need both, the focus and the fun.
00:01:57.000 | If you enjoy this thing, subscribe on YouTube,
00:01:59.320 | review it on Apple Podcasts, follow on Spotify,
00:02:02.760 | support on Patreon, or connect with me on Twitter
00:02:06.100 | at Lex Fridman if you can only figure out
00:02:08.760 | how to spell that.
00:02:10.120 | And now, here's my conversation with Cal Newport.
00:02:13.980 | What is deep work?
00:02:16.640 | Let's start with the big question.
00:02:18.960 | - So, I mean, it's my term
00:02:20.480 | for when you're focusing without distraction
00:02:23.680 | on a cognitively demanding task,
00:02:25.880 | which is something we've all done,
00:02:28.520 | but we had never really given it a name, necessarily,
00:03:31.720 | that was separate from other types of work.
00:02:33.520 | And so, I gave it a name and said,
00:02:36.560 | let's compare that to other types of efforts
00:02:38.480 | you might do while you're working
00:02:40.680 | and see that the deep work efforts
00:02:42.320 | actually have a huge benefit
00:02:44.320 | that we might be underestimating.
00:02:45.840 | - What does it mean to work deeply on something?
00:02:49.000 | - I had been calling it hard focus
00:02:51.840 | in my writing before that.
00:02:54.360 | Well, so the context you would understand,
00:02:55.720 | I was in the theory group in CSAIL at MIT, right?
00:02:58.400 | So I was surrounded at the time
00:03:00.000 | when I was coming up with these ideas
00:03:01.840 | by these professional theoreticians.
00:03:03.560 | And that's like a murderers' row of thinkers there, right?
00:03:06.920 | I mean, it's like Turing Award, Turing Award,
00:03:08.960 | MacArthur, Turing Award.
00:03:10.120 | I mean, you know the crew, right?
00:03:12.720 | - Theoretical computer science.
00:03:13.920 | - Theoretical computer science, yeah, yeah.
00:03:15.880 | So I'm in the theory group, right?
00:03:18.120 | Doing theoretical computer science.
00:03:20.000 | And I publish a book.
00:03:21.400 | So I was in this milieu where I was being exposed to people
00:03:25.400 | where focus was their tier one skill.
00:03:27.720 | Like that's what you would talk about, right?
00:03:29.080 | Like how intensely I can focus,
00:03:31.520 | that was the key skill.
00:03:33.400 | It's like your 440 time or something
00:03:34.840 | if you were an athlete, right?
00:03:37.160 | - So this is something that people were actually,
00:03:39.120 | the theory folks are thinking about?
00:03:41.720 | - Oh yeah.
00:03:42.560 | - Really? - Oh yeah.
00:03:43.400 | - Like they're openly discussing like, how do you focus?
00:03:46.040 | I mean, I don't know if they would quantify it,
00:03:48.520 | but focus was the tier one skill.
00:03:51.560 | So you would come in, here'd be a typical day.
00:03:53.720 | You'd come in and Erik Demaine would be sitting
00:03:56.480 | in front of a whiteboard, right?
00:03:58.360 | With a whole group of visitors
00:04:00.200 | who had come to work with him.
00:04:01.280 | And maybe they projected like a grid on there
00:04:03.680 | because they're working on some graph theory problem.
00:04:06.440 | You go to lunch, you go to the gym, you come back,
00:04:10.040 | they're sitting there staring at the same whiteboard, right?
00:04:12.960 | Like that's the tier one skill.
00:04:14.240 | - This is the difference between different disciplines.
00:04:16.120 | Like I often feel for many reasons like a fraud,
00:04:20.640 | but I definitely feel like a fraud when I hang out
00:04:22.760 | with like either mathematicians or physicists.
00:04:25.440 | It's like, it feels like they're doing the legit work
00:04:29.000 | because when you talk,
00:04:30.200 | the closer in computer science you get to programming
00:04:32.720 | or like machine learning,
00:04:33.840 | like the experimental machine learning
00:04:38.400 | or like just the engineering version of it.
00:04:41.120 | It feels like you've gone so far away
00:04:44.560 | from what's required to solve something fundamental
00:04:47.880 | about this universe.
00:04:49.000 | It feels like you're just like cheating your way
00:04:51.160 | into like some kind of trick to figure out
00:04:53.560 | how to solve a problem in this one particular case.
00:04:56.200 | That's how it feels.
00:04:57.400 | I'd be interested to hear what you think about that
00:05:02.840 | because programming doesn't always feel
00:05:07.040 | like you need to think deeply, to work deeply,
00:05:11.440 | but sometimes it does.
00:05:12.840 | So it's a weird dance.
00:05:14.360 | - For sure code does, right?
00:05:15.760 | I mean, especially if you're coming up
00:05:17.560 | with original algorithmic designs,
00:05:20.680 | I think it's a great example of deep work.
00:05:22.720 | I mean, yeah, the hardcore theoreticians,
00:05:24.960 | they push it to an extreme.
00:05:26.480 | I mean, I think it's like knowing
00:05:29.040 | that athletic endeavor is good
00:05:31.080 | and then hanging out with an Olympic athlete.
00:05:33.800 | Like, oh, I see that's what it is.
00:05:35.680 | Now for the grad students like me,
00:05:37.080 | we're not anywhere near that level,
00:05:38.680 | but the faculty in that group,
00:05:41.520 | these were the cognitive Olympic athletes.
00:05:44.600 | But coding, I think, is a classic example of deep work
00:05:48.040 | because I got this problem I wanna solve.
00:05:50.840 | I have all of these tools
00:05:52.160 | and I have to combine them somehow creatively
00:05:55.760 | and on the fly.
00:05:56.600 | So basically I had been exposed to that.
00:05:58.880 | So I was used to this notion when I was in grad school
00:06:00.800 | and I was writing my blog, I'd write about hard focus.
00:06:03.280 | You know, that was the term I used.
00:06:05.000 | Then I published this book, "So Good They Can't Ignore You,"
00:06:07.720 | which came out in 2012.
00:06:09.360 | So like right as I began as a professor.
00:06:12.120 | And that book had this notion
00:06:13.480 | of skill being really important for career satisfaction,
00:06:16.480 | that it's not just following your passion.
00:06:19.320 | You have to actually really get good at something
00:06:21.000 | and then you use that skill as leverage.
00:06:22.480 | And there was this big follow-up question to that book
00:06:24.360 | of, okay, well, how do I get really good at this?
00:06:27.080 | And then I look back to my grad school experience.
00:06:28.960 | I was like, huh, there was this focus thing
00:06:31.480 | that we used to do.
00:06:32.320 | I wonder how generally applicable that is
00:06:35.400 | to the knowledge sector.
00:06:36.280 | And so as I started thinking about it,
00:06:38.280 | it became clear there's this interesting storyline
00:06:40.320 | that emerged that, okay,
00:06:41.440 | actually undistracted concentration is not just important
00:06:44.000 | for esoteric theoreticians.
00:06:46.280 | It's important here, it's important here,
00:06:47.440 | it's important here.
00:06:48.680 | And that evolved into the deep work hypothesis,
00:06:51.160 | which is across the whole knowledge work sector,
00:06:55.160 | focus is very important
00:06:56.480 | and we've accidentally created circumstances
00:06:58.480 | where we just don't do a lot of it.
00:07:00.440 | - So focus is the sort of prerequisite for basically,
00:07:03.680 | you say knowledge work,
00:07:04.520 | but basically any kind of skill acquisition,
00:07:07.040 | any kind of major effort in this world.
00:07:09.320 | Can we break that apart a little bit?
00:07:11.640 | - Yeah, so a key aspect of focus
00:07:15.280 | is not just that you're concentrating hard on something,
00:07:17.840 | but you do it without distraction.
00:07:20.080 | So a big theme of my work is that context shifting
00:07:24.720 | kills the human capacity to think.
00:07:27.000 | So if I change what I'm paying attention to
00:07:29.520 | to something different,
00:07:31.120 | really, even if it's brief
00:07:32.000 | and then try to bring it back to the main thing I'm doing,
00:07:34.280 | that causes a huge cognitive pileup
00:07:36.040 | that makes it very hard to think clearly.
00:07:38.120 | So even if you think, okay, look, I'm writing this code
00:07:40.400 | or I'm writing this essay and I'm not multitasking
00:07:43.640 | and all my windows are closed and I have no notifications on
00:07:47.120 | but every five or six minutes, you quickly check
00:07:50.200 | like an inbox or your phone,
00:07:51.920 | that initiates a context shift in your brain, right?
00:07:54.240 | We're gonna start to suppress some neural networks,
00:07:55.960 | we're gonna try to amplify some others.
00:07:57.440 | It's a pretty complicated process, actually.
00:07:59.280 | There's a sort of neurological cascade that happens.
00:08:02.280 | You rip yourself away from that halfway through
00:08:04.440 | and go back to what you're doing.
00:08:05.520 | And now it's trying to switch back to the original thing,
00:08:07.160 | even though your brain's also in the process
00:08:08.800 | of switching to these emails
00:08:09.960 | and trying to understand those contexts.
00:08:11.760 | And as a result, your ability to think clearly
00:08:14.640 | just goes really down.
00:08:16.440 | And it's fatiguing too.
00:08:17.360 | I mean, you do this long enough, you get midday
00:08:19.480 | and you're like, okay, I can't think anymore.
00:08:22.120 | You've exhausted yourself.
00:08:23.280 | - Is there some kind of perfect number of minutes,
00:08:27.960 | would you say?
00:08:28.880 | So we're talking about focusing on a particular task
00:08:32.280 | for one minute, five minutes, 10 minutes, 30 minutes.
00:08:37.200 | Is it possible to kind of context switch
00:08:40.120 | while maintaining deep focus every 20 minutes or so?
00:08:44.240 | So if you're thinking of like this,
00:08:46.560 | again, maybe it's a selfish kind of perspective,
00:08:48.680 | but if you think about programming,
00:08:50.400 | you're focused on a particular design,
00:08:53.680 | maybe at a small scale on a particular function
00:08:55.880 | or at a large scale on a system.
00:08:59.560 | And then the shift of focus happens like this,
00:09:02.840 | which is like, wait a minute,
00:09:04.960 | is there a library that can achieve this little task
00:09:07.400 | or something like that?
00:09:08.280 | And then you have to look it up.
00:09:10.200 | This is the danger zone.
00:09:11.480 | You go to the internets.
00:09:13.040 | And so you have to, now it is a kind of context switch
00:09:17.040 | because as opposed to thinking about the particular problem,
00:09:20.000 | you now have switched to thinking about, like, consuming
00:09:25.000 | and integrating knowledge that's out there
00:09:27.920 | that can plug into your solution to a particular problem.
00:09:31.160 | It definitely feels like a context switch,
00:09:33.560 | but is that a really bad thing to do?
00:09:36.000 | So should you be setting it aside always
00:09:38.560 | and really trying to as much as possible go deep
00:09:42.160 | and stay there for like a really long period of time?
00:09:45.960 | - Well, I mean, I think if you're looking up a library
00:09:48.680 | that's relevant to what you're doing, that's probably okay.
00:09:51.360 | And I don't know that I would count that
00:09:52.480 | as a full context shift
00:09:53.720 | because the semantic networks involved
00:09:56.520 | are relatively similar, right?
00:09:58.040 | You're thinking about this type of solution.
00:10:00.520 | You're thinking about coding.
00:10:01.640 | You're thinking about this type of functions.
00:10:03.600 | Where you're really gonna get hit
00:10:04.620 | is if you switch your context to something that's different
00:10:08.000 | and if there's unresolved obligation.
00:10:09.800 | So really the worst possible thing you could do
00:10:11.720 | would be to look at like an email inbox, right?
00:10:14.360 | 'Cause here's 20 emails.
00:10:16.040 | I can't answer most of these right now.
00:10:18.020 | They're completely different.
00:10:19.560 | Like the context of these emails, like, okay,
00:10:21.440 | there's a grant funding issue or something like this.
00:10:23.360 | It's very different than the coding I'm doing.
00:10:25.520 | And I'm leaving it unresolved.
00:10:27.200 | So it's like, someone needs something from me
00:10:29.200 | and I'm gonna try to pull my attention back.
00:10:30.680 | The second worst would be something
00:10:31.840 | that's emotionally arousing.
00:10:33.360 | So if you're like, let me just glance over at Twitter.
00:10:35.180 | I'm sure it's nice and calm and peaceful over there, right?
00:10:37.520 | That could be devastating
00:10:38.560 | because you're gonna expose yourself
00:10:39.800 | to something that's emotionally arousing.
00:10:41.560 | That's gonna completely mess up the cognitive plateau there.
00:10:43.920 | And then when you come back to,
00:10:45.040 | okay, let me try to code again, it's really difficult.
00:10:47.840 | - So it's both the information and the emotion.
00:10:50.100 | Yeah, both can be killers for what you're trying to do.
00:10:53.420 | So I would recommend at least an hour at a time
00:10:55.940 | 'cause it could take up to 20 minutes
00:10:57.600 | to completely clear out the residue
00:10:59.320 | from whatever it was you were thinking about before.
00:11:02.660 | So if you're coding for 30 minutes,
00:11:04.040 | you might only be getting 10 or 15 minutes
00:11:05.700 | of actual sort of peak Lex going on there, right?
00:11:08.860 | So an hour, at least you get a good 40, 45 minutes plus.
00:11:11.840 | I'm partial to 90 minutes as a really good chunk.
00:11:15.540 | You can get a lot done, but just before you get exhausted,
00:11:18.620 | you can sort of pull back a little bit.
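
(As a back-of-the-envelope check on the numbers here, a minimal sketch in Python. The 20-minute residue figure and the block lengths come from the conversation above; modeling the residue as a flat startup cost per block is an illustrative assumption, not Cal's claim.)

```python
# Toy model of attention residue: each focus block pays a fixed
# startup cost before peak concentration begins.
# Assumption (illustrative only): the cost is a flat 20 minutes, per
# "it could take up to 20 minutes to completely clear out the residue."

RESIDUE_MINUTES = 20

def effective_focus(block_minutes: int) -> int:
    """Minutes of peak focus left in a block after the residue cost."""
    return max(0, block_minutes - RESIDUE_MINUTES)

for block in (30, 60, 90):
    peak = effective_focus(block)
    print(f"{block:3d}-minute block -> ~{peak} min of peak focus "
          f"({peak / block:.0%} of the block)")
# 30-minute block -> ~10 min of peak focus (33% of the block)
# 60-minute block -> ~40 min of peak focus (67% of the block)
# 90-minute block -> ~70 min of peak focus (78% of the block)
```

Under that assumption the arithmetic lines up with what Cal describes: a 30-minute block is mostly residue, while an hour or 90 minutes amortizes the cost.
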
00:11:21.240 | - Yeah, and one of the beautiful,
00:11:22.880 | and people can read about it in your book, "Deep Work,"
00:11:27.040 | but, and I know this has been out for a long time
00:11:29.540 | and people are probably familiar with many of the concepts,
00:11:31.960 | but it's still pretty profound
00:11:33.800 | and it has stayed with me for a long time.
00:11:35.880 | There's something about adding the terms to it
00:11:39.400 | that actually solidifies the concepts.
00:11:42.160 | Like words matter, it's pretty cool.
00:11:44.360 | And just for me, sort of as a comment,
00:11:48.600 | there's, it's a struggle and it's very difficult
00:11:52.680 | to maintain focus for a prolonged period of time,
00:11:56.160 | but the days on which I'm able to accomplish
00:11:59.360 | several hours of that kind of work, I'm happy.
00:12:03.360 | So forget being productive and all that.
00:12:05.540 | I'm just satisfied with my life.
00:12:07.880 | I feel fulfilled.
00:12:10.600 | It's like joyful.
00:12:11.560 | And then I can be, I'm less of a dick
00:12:14.440 | to other people in my life afterwards.
00:12:17.560 | It's a beautiful thing.
00:12:19.120 | And I find the opposite.
00:12:22.280 | When I don't do that kind of thing, I'm much more irritable.
00:12:25.480 | Like I feel like I didn't accomplish anything
00:12:27.320 | and there's this stress that then,
00:12:29.160 | the negative emotion builds up to where you're no longer
00:12:31.760 | able to sort of enjoy a lot of this amazing life.
00:12:35.480 | So in that sense, "Deep Work" has been a source
00:12:37.880 | of a lot of happiness.
00:12:39.640 | I'd love to ask you, how do you,
00:12:42.280 | again, you cover this in the book,
00:12:43.480 | but how do you integrate "Deep Work" into your life?
00:12:46.480 | What are different scheduling strategies
00:12:48.560 | that you would recommend just at a high level?
00:12:51.160 | What are different ideas there?
00:12:52.560 | - Well, I mean, I'm a big fan of time blocking.
00:12:55.320 | So if you're facing your workday,
00:12:58.480 | don't allow your inbox or to-do list to sort of drive you.
00:13:02.800 | Don't just come into your day and think,
00:13:04.000 | "What do I wanna do next?"
00:13:05.840 | I mean, I'm a big proponent of saying,
00:13:06.720 | "Here's the time available.
00:13:09.640 | "Let me make a plan for it."
00:13:11.480 | So I have a meeting here, I have an appointment here.
00:13:13.360 | Here's what's left.
00:13:14.200 | What do I actually wanna do with it?
00:13:15.400 | So in this half hour, I'm gonna work on this.
00:13:17.960 | For this 90-minute block, I'm gonna work on that.
00:13:19.680 | And during this hour, I'm gonna try to fit this in.
00:13:21.320 | And then actually I have this half hour gap
00:13:22.680 | between two meetings.
00:13:23.520 | So why don't I take advantage of that
00:13:24.680 | to go run five errands?
00:13:25.800 | I can kind of batch those together.
00:13:27.720 | But blocking out in advance,
00:13:30.120 | this is what I wanna do with the time available.
00:13:32.480 | I mean, I find that's much more effective.
00:13:33.720 | Now, once you're doing this,
00:13:34.600 | once you're in a discipline of time blocking,
00:13:36.320 | it's much easier to actually see,
00:13:37.800 | this is where I want, for example, to deep work.
00:13:40.160 | And I can get a handle on the other things
00:13:41.840 | that need to happen and find better places to fit them
00:13:44.880 | so I can prioritize this.
00:13:46.080 | And you're gonna get a lot more of that done
00:13:48.520 | than if it's just going through your day
00:13:49.680 | and saying, what's next?
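
(To make the time-blocking mechanics concrete, a minimal sketch in Python. The meetings, tasks, and times are hypothetical; this is not Cal's planner, just one way to express "block out in advance what to do with the time available.")

```python
# Minimal time-blocking sketch: lay out fixed commitments, then give
# every remaining gap in the day an intentional assignment.
# All commitments and tasks below are hypothetical.

from datetime import time

day_start, day_end = time(9, 0), time(17, 0)

fixed = [                       # (start, end, label) of fixed commitments
    (time(11, 0), time(11, 30), "meeting"),
    (time(14, 0), time(15, 0), "appointment"),
]

plan = ["deep work: writing", "errand batch", "email batch"]  # in order

def gaps(start, end, busy):
    """Yield the free (start, end) gaps around the sorted commitments."""
    cursor = start
    for b_start, b_end, _ in sorted(busy):
        if cursor < b_start:
            yield (cursor, b_start)
        cursor = max(cursor, b_end)
    if cursor < end:
        yield (cursor, end)

for (g_start, g_end), task in zip(gaps(day_start, day_end, fixed), plan):
    print(f"{g_start:%H:%M}-{g_end:%H:%M}  {task}")
# 09:00-11:00  deep work: writing
# 11:30-14:00  errand batch
# 15:00-17:00  email batch
```

The point is the discipline, not the code: every gap gets a decision in advance instead of being surrendered to an in-the-moment "what's next?"
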
00:13:51.400 | - And schedule every single day kind of thing.
00:13:53.400 | So is that something you try to do in the morning,
00:13:54.880 | to have a plan?
00:13:57.000 | - Yeah, so I do a quarterly, weekly, daily planning.
00:14:00.280 | So at the semester or quarterly level,
00:14:02.320 | I have a big picture vision
00:14:04.800 | for what I'm trying to get done during the fall, let's say,
00:14:07.320 | or during the winter.
00:14:08.360 | There's a deadline coming up for academic papers
00:14:11.200 | at the end of the season.
00:14:12.120 | Here's what I'm working on.
00:14:13.480 | I wanna have this many chapters done.
00:14:14.840 | I wanna have a book, something like this.
00:14:15.840 | Like you have the big picture vision
00:14:18.200 | of what you wanna get done.
00:14:20.080 | Then weekly, you look at that
00:14:22.240 | and then you look at your week
00:14:23.960 | and you put together a plan for like, okay,
00:14:25.280 | what's my week gonna look like?
00:14:27.200 | What do I need to do?
00:14:28.200 | Or how am I gonna make progress on these things?
00:14:29.880 | Maybe I need to do an hour every morning
00:14:32.640 | or I see that Monday is my only really empty day.
00:14:34.640 | So that's gonna be the day that I really need to nail
00:14:36.440 | on writing or something like this.
00:14:38.080 | And then every day, you look at your weekly plan
00:14:41.160 | and see, let me block off the actual hours.
00:14:42.760 | So you do that at three scales,
00:14:44.720 | the quarterly down to weekly, down to daily.
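
(The three planning scales can be sketched the same way. Everything below is invented for illustration; the only thing taken from the conversation is the quarterly-to-weekly-to-daily structure.)

```python
# Hypothetical sketch of the three nested planning scales.
quarterly = "submit paper by the seasonal deadline; draft three chapters"

weekly = {                       # derived from the quarterly goal
    "Mon": ["writing (only empty day: nail the chapter)"],
    "Tue": ["1h writing", "teaching prep"],
    "Wed": ["1h writing", "research meeting"],
}

def daily_blocks(day, start_hour=9):
    """Toy rule: lay the day's weekly entries out as back-to-back hours."""
    return [
        (f"{start_hour + i:02d}:00-{start_hour + i + 1:02d}:00", task)
        for i, task in enumerate(weekly[day])
    ]

print("Quarter:", quarterly)
print("Tuesday:", daily_blocks("Tue"))
# Tuesday: [('09:00-10:00', '1h writing'), ('10:00-11:00', 'teaching prep')]
```
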
00:14:47.840 | - And we're talking about actual times of day versus,
00:14:50.800 | so the alternative is what I end up doing a lot,
00:14:55.080 | and I'm not sure it's the best way to do it,
00:14:56.640 | is scheduling the duration of time.
00:15:01.640 | This is called the luxury when you don't have any meetings.
00:15:04.520 | I'm like, religiously don't do meetings.
00:15:07.840 | - All other academics are jealous of you, by the way.
00:15:09.840 | - Yeah. - I know.
00:15:11.520 | No Zoom meetings.
00:15:13.440 | I find those are, that's one of the worst tragedies
00:15:18.240 | of the pandemic, is both the opportunity to,
00:15:21.360 | well, okay, the positive thing is to have more time
00:15:24.040 | with your family, sort of reconnect in many ways,
00:15:27.320 | and that's really interesting.
00:15:28.960 | Be able to remotely sort of not waste time on travel
00:15:34.120 | and all those kinds of things.
00:15:35.200 | The negative is, actually, both those things
00:15:38.400 | are also sources of the negative.
00:15:40.400 | But the negative is, it seems like people
00:15:42.720 | have multiplied the number of meetings
00:15:44.280 | because they're so easy to schedule,
00:15:46.240 | and there's nothing more draining to me,
00:15:49.920 | intellectually, philosophically, just my spirit
00:15:54.240 | is destroyed by even a 10-minute Zoom meeting.
00:15:57.880 | Like, what are we doing here?
00:15:59.720 | - What's the meaning of life?
00:16:00.960 | Come on, what is this all about?
00:16:02.720 | - Every Zoom meeting, I have an existential crisis.
00:16:05.720 | - Kierkegaard with an internet connection.
00:16:10.320 | - So, what the hell were we talking about?
00:16:13.760 | Oh, so when you don't have meetings,
00:16:16.720 | there's a luxury to really allow for certain things
00:16:21.560 | if they need to, like the important things,
00:16:25.080 | like deep work sessions to last way longer
00:16:28.000 | than you maybe planned for.
00:16:30.840 | I mean, that's my goal, is to try to schedule.
00:16:33.320 | The goal is to schedule, to sit and focus
00:16:35.520 | on a particular task for an hour,
00:16:37.480 | and hope I can keep going.
00:16:40.000 | And hope I can get lost in it.
00:16:41.920 | And do you find that this is at all an okay way to go?
00:16:46.920 | And the time blocking is just something you have to do
00:16:51.800 | to actually be an adult and operate in this real world?
00:16:54.880 | Or is there some magic to the time blocking?
00:16:57.960 | - Well, I mean, there's magic to the intention.
00:17:00.520 | There's magic to it if you have varied responsibilities.
00:17:05.040 | So, I'm often juggling multiple jobs, essentially.
00:17:08.800 | There's academic stuff, there's teaching stuff,
00:17:11.160 | there's book stuff, there's the business
00:17:13.560 | surrounding my book stuff.
00:17:16.560 | But I'm of your same mindset.
00:17:19.000 | If a deep work session is going well,
00:17:22.240 | you just rock and roll and let it go on.
00:17:24.480 | So, one of the big keys of time blocking,
00:17:26.400 | at least the way I do it, so I even sell this planner
00:17:29.280 | to help people time block, it has many columns.
00:17:31.880 | Because the discipline is,
00:17:32.960 | oh, if your initial schedule changes,
00:17:35.960 | you just move over one.
00:17:36.880 | Next time you get a chance to move over one column,
00:17:38.960 | and then you just fix it for the time that's remaining.
00:17:41.280 | So, in other words, there's no bonus for,
00:17:44.200 | I made a schedule and I stuck with it.
00:17:46.320 | Like, there's actually,
00:17:47.160 | it's not like you get a prize for it, right?
00:17:48.640 | Like, for me, the prize is,
00:17:50.600 | I have an intentional plan for my time.
00:17:52.960 | And if I have to change that plan, that's fine.
00:17:54.680 | Like, the state I wanna be is basically
00:17:56.320 | at any point in the day,
00:17:57.200 | I've thought about what time remains
00:17:58.760 | and gave it some thought for what to do.
00:18:01.160 | Because I'll do the same thing,
00:18:02.120 | even though I have a lot more meetings
00:18:04.680 | and other types of things I have to do in my various jobs.
00:18:06.960 | And I basically prioritize the deep work
00:18:09.720 | and then get yelled at a lot.
00:18:11.080 | - Yeah, got it. - So that's kind of
00:18:11.920 | my strategy is like, just be okay,
00:18:14.000 | just be okay getting yelled at a lot.
00:18:15.560 | Because I feel you, if you're rolling, yeah.
00:18:18.160 | Well, that's what it is for me.
00:18:19.480 | Like with writing, I think it's,
00:18:20.720 | writing's so hard in a certain way that it's,
00:18:22.920 | you don't really get on a roll in some sense.
00:18:24.920 | Like, it's just difficult.
00:18:26.800 | But working on proofs,
00:18:28.440 | it's very hard to pull yourself away from a proof
00:18:32.360 | if you start to get some traction.
00:18:33.900 | Just, you've been at it for a couple hours
00:18:35.520 | and you feel the pins and tumblers
00:18:37.800 | starting to click together and progress is being made.
00:18:40.520 | It's really hard to pull away from that.
00:18:42.400 | So I'm willing to get yelled at by almost everyone.
00:18:45.200 | - Of course, there is also a positive effect
00:18:48.560 | to pulling yourself out of it when things are going great.
00:18:53.200 | Because then you're kind of excited to resume.
00:18:55.700 | - Yeah. - As opposed to stopping
00:18:57.000 | on a dead end.
00:18:58.640 | - That's true.
00:18:59.480 | - Yeah,
00:19:03.640 | there's an extra force of procrastination that comes with
00:19:06.060 | if you stop on a dead end to return to the task.
00:19:08.940 | - Yeah, or a cold start.
00:19:10.860 | - Yeah. - Like, whenever I,
00:19:12.340 | like I'm in a stage now, I submitted a few papers recently.
00:19:15.420 | So now we're sort of starting something up from cold.
00:19:18.620 | And it takes way too long to get going
00:19:20.540 | because it's very hard to,
00:19:22.220 | it's very hard to get the motivation to schedule a time
00:19:24.420 | when it's not, yeah, we're in it.
00:19:25.860 | Like, here's where we are.
00:19:26.900 | We feel like something's about to give here.
00:19:28.340 | We're in the very early stages where it's just,
00:19:30.660 | I don't know, I'm gonna read hard papers
00:19:32.900 | and it's gonna be hard to understand them.
00:19:34.120 | I'm gonna have no idea how to make progress.
00:19:35.640 | It's not motivating.
00:19:38.480 | - What about deadlines?
00:19:39.720 | Can we, okay, so this is like a therapy session.
00:19:42.760 | (laughing)
00:19:44.240 | Is why, it seems like I only get stuff done
00:19:48.640 | that has deadlines.
00:19:50.160 | And so one of the implied powerful things
00:19:53.240 | about time blocking is there's a kind of deadline
00:19:56.160 | or there's an artificial or real sense of urgency.
00:19:59.920 | Do you think it's possible to get anything done
00:20:02.820 | in this world without deadlines?
00:20:04.260 | Why do deadlines work so well?
00:20:06.160 | - Well, I mean, it's a clear motivational signal,
00:20:08.500 | but in the short term,
00:20:10.900 | you do get an effect like that in time blocking.
00:20:12.700 | I think the strong effect you get by saying,
00:20:15.460 | this is the exact time I'm gonna work on this,
00:20:18.140 | is that you don't have the debate with yourself
00:20:20.020 | every three minutes about, should I take a break now?
00:20:23.260 | Right, like this is the big issue with just saying,
00:20:24.860 | you know, I'm gonna go write.
00:20:26.320 | I'm gonna write for a while and that's it
00:20:27.780 | because your mind is saying, well, obviously,
00:20:29.240 | we're gonna take some breaks, right?
00:20:30.940 | We're not just gonna write forever.
00:20:32.380 | And so why not right now?
00:20:34.380 | You have to like, well, not right now,
00:20:35.220 | let's go a little bit longer, five minutes.
00:20:36.340 | So why don't we just take a break now?
00:20:37.300 | Like we should probably look at the internet.
00:20:38.820 | Now you have to constantly have this battle.
00:20:40.280 | On the other hand, if you're in a time block schedule,
00:20:42.340 | like I've got these two hours put aside for writing,
00:20:44.300 | that's what I'm supposed to be doing.
00:20:46.020 | I have a break scheduled over here.
00:20:48.300 | I don't have to fight with myself, right?
00:20:50.060 | And maybe at a larger scale,
00:20:51.300 | deadlines give you a similar sort of effect.
00:20:53.580 | Is I know this is what I'm supposed to be working on
00:20:55.340 | because it's due.
00:20:57.980 | - Perhaps, but we are describing as much healthier
00:21:01.240 | sort of giving yourself over.
00:21:02.880 | And you talk about this in the new email book,
00:21:05.680 | the process, I mean, in general,
00:21:07.200 | you talk about it all over is creating a process
00:21:09.800 | and then giving yourself over to the process.
00:21:13.420 | But then you have to be strict with yourself.
00:21:17.600 | - Yeah, but what are the deadlines you're talking about?
00:21:19.640 | It's like with papers,
00:21:20.600 | like what's the main type of deadline work?
00:21:22.800 | - Well, so papers definitely,
00:21:26.240 | but publications like say this podcast,
00:21:31.240 | I have to publish this podcast early next week,
00:21:35.300 | one, because your book is coming out.
00:21:36.720 | I'd love to support this amazing book.
00:21:40.080 | But the other is I have to fly to Vegas on Thursday
00:21:45.080 | to run 40 miles with David Goggins.
00:21:48.540 | And so I want this podcast,
00:21:51.620 | this conversation we're doing now to be out of my life.
00:21:55.020 | Like I don't want to be in a hotel in Vegas,
00:21:57.340 | like editing the, like freaking out
00:21:59.660 | while David Goggins is yelling.
00:22:01.420 | - On hour 43 of your Terrathon thing.
00:22:05.820 | - But actually it's possible that I still will be doing that
00:22:09.260 | because that's not a hard,
00:22:10.340 | that's a softer deadline, right?
00:22:12.040 | But those are sort of,
00:22:13.180 | life imposes these kinds of deadlines.
00:22:15.460 | I'm not, so yeah, papers are nice
00:22:18.020 | because there's an actual deadline.
00:22:20.580 | But I am almost referring to like the pressure
00:22:24.500 | that people put on you.
00:22:26.020 | Hey man, you said you're going to get this done
00:22:28.060 | two months ago, why haven't you gotten it done?
00:22:30.260 | - See, I don't like that pressure.
00:22:31.660 | - Yeah.
00:22:32.500 | - So maybe we, now, first of all, I think we can all--
00:22:33.340 | - I hate it too.
00:22:34.260 | - We can agree by the way,
00:22:35.140 | having David Goggins yell at you
00:22:37.180 | is probably the top productivity technique.
00:22:39.520 | (laughing)
00:22:41.140 | I think we'd all get a lot more done if he was yelling.
00:22:44.020 | But see, I don't like that.
00:22:45.080 | So I will try to get things done early.
00:22:47.540 | I like having flex.
00:22:49.040 | I also don't like the idea of this has to get done today.
00:22:52.960 | Right, like it's due at midnight
00:22:55.060 | and we've got a lot to do as the night before
00:22:57.180 | because then I get in my head about what if I get sick?
00:22:59.780 | Or like, what if, you know,
00:23:01.420 | what if I don't get a bad night's sleep
00:23:03.260 | and I can't think clearly?
00:23:04.800 | So I like to have the flex.
00:23:05.740 | So I'm all process.
00:23:07.500 | And that's like the philosophical aspect
00:23:08.980 | of that book, "Deep Work"
00:23:09.820 | is that there's something very human and deep
00:23:13.080 | about just wrangling with the world of ideas.
00:23:15.100 | I mean, Aristotle talked about this.
00:23:16.540 | If you go back and read the ethics,
00:23:18.380 | he's trying to understand the meaning of life.
00:23:20.300 | And he eventually ends up ultimately at the human capacity
00:23:24.720 | to contemplate deeply.
00:23:26.440 | It's kind of a teleological argument.
00:23:27.960 | It's the things that only humans can do
00:23:29.520 | and therefore it must be somehow connected to our ends.
00:23:31.720 | And he said, ultimately, that's where he found his meaning.
00:23:34.280 | But, you know, he's touching on some sort of intimation
00:23:36.520 | there that's correct.
00:23:37.840 | And so what I try to build my life around
00:23:39.640 | is regularly thinking hard about stuff that's interesting.
00:23:44.400 | Just like if you get a fitness habit going,
00:23:46.400 | you feel off when you don't do it.
00:23:50.080 | I try to get that cognitive habit.
00:23:51.620 | So it's like, I got it.
00:23:52.940 | I mean, look, I have my bag here somewhere.
00:23:54.300 | I have my notebook in it because I was thinking
00:23:56.780 | on the Uber ride over, I was like, you know,
00:23:58.300 | I could get some, I'm working on this new proof.
00:24:00.420 | And it just, so you train yourself.
00:24:02.740 | You train yourself to appreciate certain things.
00:24:04.700 | And then over time, the hope is that it accretes.
00:24:08.060 | - Well, let's talk about some demons
00:24:09.740 | because I wonder, so there's like "Deep Work"
00:24:13.780 | which and the "World Without Email" books
00:24:19.260 | that to me symbolize the life I want to live, okay?
00:24:24.100 | And then there is, I'm like, despite appearances,
00:24:27.680 | an adult at this point.
00:24:29.860 | And this is the life I actually live.
00:24:32.100 | And I'm in constant chaos.
00:24:36.180 | You said you don't like that anxiety.
00:24:37.780 | I hate it too, but it seems like I'm always in it.
00:24:41.300 | It's a giant mess.
00:24:42.980 | It's like, it's almost like whenever I establish,
00:24:46.580 | whenever I have successful processes for doing deep work,
00:24:49.340 | I'll add stuff on top of it just to introduce the chaos.
00:24:52.460 | And like, I don't want to, but you know,
00:24:55.660 | you have to look in the mirror at a certain point
00:24:57.500 | and you have to say like, who the hell am I?
00:25:00.580 | Like, I keep doing this.
00:25:02.420 | Is this something that's fundamental to who I am
00:25:04.660 | or do I really need to fix this?
00:25:06.180 | - What's the chaos right now?
00:25:07.500 | Like I've seen your video about like your routine.
00:25:09.820 | It seemed very structured and deep.
00:25:12.300 | In fact, I was really envious of it.
00:25:13.640 | So like, what's the chaos now that's not in that video?
00:25:17.060 | - Many of those sessions go way longer.
00:25:19.100 | I don't get enough sleep.
00:25:20.580 | And then the main introduction of chaos is,
00:25:24.080 | it's taking on too many things on the to-do list.
00:25:26.640 | - I see.
00:25:27.800 | - It's, I mean, I suppose it's a problem
00:25:29.440 | that everybody deals with, which is saying, not saying no.
00:25:33.400 | But it's not like I have trouble saying no.
00:25:36.880 | It's that there's so much cool shit in my life.
00:25:39.280 | Okay, listen, there's nothing I love more in this world
00:25:42.720 | than the Boston Dynamics robots.
00:25:45.320 | - Spot and the other, yeah.
00:25:47.000 | - And they're giving me Spot.
00:25:48.380 | So there's a to-do, what am I gonna say, no?
00:25:51.400 | So they're giving me Spot
00:25:52.600 | and I wanna do some computer vision stuff
00:25:54.360 | for the hell of it.
00:25:55.480 | Okay, so that's now a to-do item.
00:25:57.520 | - And then you go to Texas for a while.
00:25:59.000 | - And there's Texas.
00:25:59.840 | - But then everything's happening
00:26:00.840 | to all the interesting people down there.
00:26:02.400 | - And then there's surprises, right?
00:26:03.920 | There are power outages in Texas.
00:26:05.800 | There's constant changes to plans
00:26:07.360 | and all those kinds of things.
00:26:08.480 | And you sleep less.
00:26:09.880 | And then there's personal stuff,
00:26:11.400 | like just people in your life,
00:26:13.600 | sources of stress, all those kinds of things.
00:26:16.520 | But it does feel like if I'm just being introspective
00:26:19.960 | that I bring it onto myself.
00:26:22.380 | I suppose a lot of people do this kind of thing
00:26:24.640 | is they flourish under pressure.
00:26:30.360 | And I wonder if that's just a hack I've developed
00:26:34.920 | as a habit early on in life
00:26:37.600 | that you need to let go of.
00:26:40.720 | You need to fix.
00:26:42.080 | - But it's all interesting things.
00:26:44.000 | - Yeah, it's interesting.
00:26:44.960 | - That's interesting.
00:26:45.800 | Yeah, because these are all interesting things.
00:26:47.520 | - Well, one of the things you talked about in "Deep Work,"
00:26:49.720 | which is really important,
00:26:50.840 | is having an end to the day, putting it down.
00:26:55.060 | I don't think I've ever done that in my life.
00:26:59.040 | - Yeah.
00:26:59.880 | Well, see, I started doing that early
00:27:01.000 | because I got married early.
00:27:04.040 | So I didn't have a real job.
00:27:05.280 | I was a grad student, but my wife had a real job.
00:27:07.680 | And so I just figured I should do my work when she's at work
00:27:12.200 | because, hey, when work's over, she'll be home
00:27:14.480 | and I don't wanna be on campus or whatever.
00:27:17.760 | And so real early on, I just got in that habit
00:27:19.720 | of this is when you end work.
00:27:22.320 | And then when I was a postdoc,
00:27:24.040 | which is kind of an easy job,
00:27:25.440 | I put artificial, I was like, "I wanna train."
00:27:30.020 | I was like, "When I'm a professor, it's gonna be busier
00:27:31.600 | because there's demands that professors have beyond research."
00:27:34.200 | And so as a postdoc,
00:27:35.560 | I added artificial, large, time-consuming things
00:27:37.960 | into the middle of my day.
00:27:38.800 | I'd basically exercise for two hours
00:27:40.440 | in the middle of the day
00:27:41.260 | and do all this productive meditation and stuff like this
00:27:44.600 | while still maintaining the nine to five.
00:27:46.720 | So it's like, okay, I wanna get really good
00:27:48.680 | at putting artificial constraints on so that I stay,
00:27:51.600 | I didn't wanna get flabby when my job was easy
00:27:54.720 | so that when I became a professor,
00:27:56.600 | and now all of that's paying off
00:27:57.800 | because I have a ton of kids.
00:27:59.760 | So now I don't really have a choice.
00:28:01.760 | That's what's probably keeping me away from cool things
00:28:04.120 | is I just don't have time to do them.
00:28:06.280 | And then after a while, people stop bothering.
00:28:09.240 | - But that's how you have a successful life.
00:28:13.040 | Otherwise, it's too easy
00:28:15.240 | to then go into the full Hunter S. Thompson,
00:28:18.000 | like to where nobody functional wants
00:28:23.000 | to be in your vicinity.
00:28:25.720 | Like you're driving, you attract the people
00:28:29.240 | that have a similar behavior pattern as you.
00:28:33.400 | So if you live in chaos,
00:28:35.440 | you're going to attract chaotic people.
00:28:37.400 | And then it becomes like this self-fulfilling prophecy.
00:28:42.040 | And it feels like, I'm not bothered by it,
00:28:45.920 | but I guess this is all coming around
00:28:48.400 | to exactly what you're saying,
00:28:49.520 | which is like, I think one of the big hacks
00:28:52.240 | for productive people that I've met is to get married
00:28:54.920 | and have kids, honestly.
00:28:57.600 | It's very, perhaps counterintuitive,
00:29:00.840 | but it's like the ultimate timetable enforcer.
00:29:05.640 | - Yeah, it enforces a lot of timetables,
00:29:08.840 | though it has a huge, kids have a huge productivity hit.
00:29:12.240 | Those, you gotta weigh it.
00:29:13.320 | But okay, here's the complicated thing, though.
00:29:15.440 | Like you could think about in your own life,
00:29:17.120 | starting the podcast as one of these just cool opportunities
00:29:20.600 | that you put on yourself, right?
00:29:22.080 | Like I could have been talking to you at MIT four years ago
00:29:25.040 | and be like, don't do that.
00:29:25.880 | Like your research is going well, right?
00:29:28.120 | But then everyone who watches you is like,
00:29:29.360 | okay, this podcast is, the direction that's taking you
00:29:31.720 | is like a couple of years from now,
00:29:33.680 | it's gonna, there'll be something really monumental
00:29:36.600 | that you're probably, that's gonna probably lead to, right?
00:29:38.080 | There'll be some really,
00:29:39.920 | it just feels like your life is going somewhere.
00:29:41.800 | - It's going somewhere, it's interesting.
00:29:43.240 | - Yeah. - Unexpected, yeah.
00:29:44.600 | - Yeah, so how do you balance those two things?
00:29:46.360 | And so what I try to throw at it is this motto
00:29:48.920 | of do less, do better, know why, right?
00:29:50.920 | So do less, do better, know why.
00:29:55.600 | It used to be the motto of my website years ago.
00:29:58.760 | So do a few things, but like an interesting array, right?
00:30:01.920 | So I was doing MIT stuff, but I was also writing, you know?
00:30:06.880 | So a couple of things are, you know, they were interesting.
00:30:08.480 | Like I have a couple bets placed
00:30:10.240 | on a couple of different numbers on the roulette table,
00:30:12.800 | but not too many things.
00:30:14.120 | And then really try to do those things really well
00:30:15.880 | and see where it goes.
00:30:16.840 | Like with my writing,
00:30:17.680 | I just spent years and years and years just training.
00:30:19.560 | I was like, I wanna be a better writer,
00:30:20.400 | I wanna be a better writer.
00:30:21.240 | I started writing student books when I was a student.
00:30:24.200 | I really wanted to write hardcover idea books.
00:30:25.960 | I started training.
00:30:27.480 | I would use like New Yorker articles to train myself.
00:30:30.320 | I'd break them down and then I'd get commissions
00:30:32.000 | with much smaller magazines and practice the skills.
00:30:34.360 | And it took forever until, you know, but now today,
00:30:37.040 | like I actually get to write for the New Yorker,
00:30:38.360 | but it took like a decade.
00:30:40.320 | So a small number of things, try to do them really well.
00:30:42.120 | And then the know why is have a connection
00:30:44.080 | to some sort of value.
00:30:45.800 | Like in general, I think this is worth doing
00:30:48.600 | and then seeing where it leads.
00:30:50.640 | - And so the choice of the few things is grounded
00:30:54.840 | in what, like a little flame of passion,
00:30:59.200 | like a love for the thing,
00:31:00.280 | like a sense that you say you wanted to write,
00:31:02.600 | get good at writing.
00:31:04.320 | You had that kind of introspective moment of thinking,
00:31:07.800 | this actually brings me a lot of joy and fulfillment.
00:31:10.360 | - Yeah, I mean, it gets complicated
00:31:11.600 | 'cause I wrote a whole book about
00:31:13.200 | following your passion being bad advice,
00:31:14.680 | which is like the first thing I kind of got infamous for.
00:31:18.440 | I wrote that back in 2012.
00:31:20.120 | But the argument there is like passion cultivates, right?
00:31:23.280 | So what I was pushing back on was the myth
00:31:26.120 | that the passion for what you do exists full intensity
00:31:29.880 | before you start, and then that's what propels you.
00:31:32.600 | Where actually the reality is as you get better at something,
00:31:35.400 | as you gain more autonomy, more skill,
00:31:36.680 | and more impact, the passion grows along with it.
00:31:38.600 | So that when people look back later and say,
00:31:42.080 | oh, follow your passion, what they really mean is
00:31:43.760 | I'm very passionate about what I do,
00:31:45.200 | and that's a worthy goal.
00:31:47.080 | But how you actually cultivate that is much more complicated
00:31:49.320 | than just introspection is gonna identify,
00:31:51.960 | like for sure you should be a writer or something like this.
00:31:54.080 | - So I was actually quoting you.
00:31:55.320 | I was on a social network last night in a clubhouse.
00:32:00.320 | I don't know if you've heard of it.
00:32:01.720 | - Wait, I have to ask you about this
00:32:03.280 | because I'm invited to do a clubhouse.
00:32:05.560 | I don't know what that means.
00:32:07.160 | A tech reporter has invited me to do a clubhouse
00:32:09.320 | about my new book.
00:32:10.240 | - That's awesome.
00:32:12.560 | Well, let me know when 'cause I'll show up.
00:32:14.240 | - But what is it?
00:32:15.080 | - Okay, so first of all, let me just mention
00:32:16.680 | that I was in a clubhouse room last night
00:32:21.120 | and I kept plugging exactly what you said about passion.
00:32:24.760 | So we'll talk about it.
00:32:25.800 | It was a room that was focused on burnout.
00:32:28.400 | - Okay.
00:32:29.520 | - But first, clubhouse is a kind of fascinating place
00:32:34.200 | in terms of your mind would be very interesting
00:32:37.700 | to analyze this place because we talk about email,
00:32:41.640 | talk about social networks,
00:32:43.560 | but clubhouse is something very different
00:32:45.400 | and I've encountered in other places, Discord and so on,
00:32:49.080 | that's voice only communication.
00:32:52.320 | So it's a bunch of people in a room,
00:32:53.720 | they're just eyes closed.
00:32:56.240 | All you hear is their voices.
00:32:57.640 | - In real time.
00:32:58.520 | - Real time, live.
00:32:59.860 | It only happens live.
00:33:01.040 | You're technically not allowed to record,
00:33:03.240 | but some people still do,
00:33:04.720 | and especially when it's big conversations.
00:33:07.760 | But the whole point is it's there live.
00:33:09.840 | And there's different structures.
00:33:10.960 | Like on Discord, it was so fascinating.
00:33:13.920 | I have this Discord server that would have
00:33:16.880 | hundreds of people in a room together, right?
00:33:19.480 | We're all just little icons
00:33:21.000 | that can mute and unmute our mics.
00:33:22.880 | - Okay.
00:33:23.700 | - And so you're sitting there,
00:33:25.840 | so it's just voices and you're able with hundreds of people
00:33:30.840 | to not interrupt each other.
00:33:33.960 | Well, first of all, as a dynamic system.
00:33:37.400 | - You see icons, just like mics muted or not muted,
00:33:39.680 | basically.
00:33:40.520 | - Yeah, so everyone's muted and they unmute
00:33:42.360 | and it starts flashing.
00:33:44.120 | - Yeah.
00:33:45.280 | - So you're like, okay, let me get precedence.
00:33:47.960 | - Yeah.
00:33:48.800 | - So it's the digital equivalent
00:33:49.620 | of when you're in a conversation,
00:33:50.920 | like at a faculty meeting,
00:33:52.360 | and you sort of like kind of make some noises
00:33:54.880 | like while the other person's finishing.
00:33:56.240 | And so people realize like,
00:33:57.620 | okay, this person wants to talk next,
00:33:58.920 | but now it's purely digital.
00:34:00.240 | You see a flashing.
00:34:01.520 | - But in a faculty meeting, which is very interesting,
00:34:04.320 | like even as we're talking now,
00:34:06.580 | there's a visual element that seems to increase
00:34:09.640 | the probability of interruption.
00:34:11.360 | - Yeah.
00:34:12.200 | - When it's just darkness,
00:34:13.700 | you actually listen better and you don't interrupt.
00:34:17.300 | So like if you create a culture,
00:34:18.760 | there's always gonna be assholes,
00:34:20.880 | but they're actually exceptions.
00:34:23.360 | Everybody adjusts.
00:34:24.680 | They kind of evolve to the beat of the room.
00:34:28.280 | Okay, that's one fascinating aspect.
00:34:30.320 | It's like, okay, that's weird.
00:34:32.240 | 'Cause it's different than like a Zoom call
00:34:34.400 | where there's video.
00:34:35.560 | - Yeah.
00:34:36.800 | - It's just audio.
00:34:38.560 | You think video adds,
00:34:40.040 | but it actually seems like it subtracts.
00:34:42.760 | The second aspect of it that's fascinating
00:34:45.220 | is when it's no video, just audio,
00:34:48.120 | there's an intimacy.
00:34:49.980 | It's weird.
00:34:51.700 | Because with strangers,
00:34:53.860 | you connect in a much more real way.
00:34:57.220 | It's similar to podcasts.
00:34:59.020 | - Yeah.
00:34:59.860 | - But--
00:35:00.700 | - With a lot of people.
00:35:01.660 | - With a lot of people and new people.
00:35:03.700 | - Huh.
00:35:04.540 | - And they bring, okay, first of all,
00:35:07.560 | different voices like low voices and like high voices.
00:35:11.360 | And it's more difficult to judge.
00:35:14.360 | In Discord, you couldn't even see the people.
00:35:18.560 | It was a culture where you do funny profile pictures
00:35:21.640 | as opposed to your actual face.
00:35:23.160 | In Clubhouse, it's your actual face.
00:35:24.960 | So you can tell like as an older person, younger person.
00:35:27.640 | In Discord, you couldn't.
00:35:28.840 | You just have to judge based on the voice.
00:35:31.260 | But there's something about the listening
00:35:35.000 | and the intimacy of being surprised by different strangers.
00:35:39.040 | It feels almost like a party with friends
00:35:43.580 | and friends of friends you haven't met yet,
00:35:45.340 | but you really like.
00:35:47.300 | Now, Clubhouse also has an interesting innovation
00:35:49.700 | where there's a large crowd that just listens
00:35:52.340 | and there's a stage.
00:35:54.140 | And you can bring people up on the stage.
00:35:56.340 | So only people on stage are talking.
00:35:59.220 | And you can have like five, six, seven, eight,
00:36:01.340 | sometimes 20, 30 people on stage.
00:36:03.340 | And then you can also have thousands
00:36:04.820 | of people just listening.
00:36:05.980 | - I see.
00:36:06.820 | - So there's a, I don't know.
00:36:08.900 | A lot of people are being surprised by this.
00:36:10.760 | - Why is it called a social network?
00:36:12.280 | It seems like it doesn't have, there's not social links.
00:36:14.200 | There's not a feed that's trying to harvest attention.
00:36:17.200 | It feels like a communication.
00:36:19.160 | - So the social network aspect is you follow people.
00:36:24.200 | And the people you follow,
00:36:26.180 | now this is like the first social network
00:36:27.600 | that's actually correct use of follow, I think.
00:36:30.460 | You're more likely to see the rooms they're in.
00:36:35.120 | So there's a, your feed is a bunch of rooms
00:36:37.240 | that are going on right now.
00:36:38.560 | - Okay.
00:36:39.400 | - And the people you follow are the ones
00:36:43.380 | that will increase the likelihood
00:36:44.820 | that you'll see the room they're in.
00:36:46.380 | And so the final result is like,
00:36:48.540 | there's a list of really interesting rooms.
00:36:50.360 | Like I have all these,
00:36:52.460 | I've been speaking Russian quite a bit.
00:36:54.380 | There's practicing, but also just like talking politics
00:36:58.260 | and philosophy in Russian.
00:37:00.100 | I've never done that before,
00:37:01.220 | but it allows me to connect with that community.
00:37:03.340 | And then there's a community of like, it's funny,
00:37:07.620 | but like I'll go in a community
00:37:09.200 | of all African-American people talking about race
00:37:12.200 | and I'll be welcomed.
00:37:13.460 | - Yeah.
00:37:14.300 | - I've never had, like, I've literally never been
00:37:17.080 | in a difficult conversation about race,
00:37:20.220 | like with people from all over the place.
00:37:22.560 | It's like fascinating.
00:37:23.480 | And musicians, jazz musicians, I don't know.
00:37:26.440 | You could say that a lot of other places
00:37:28.460 | could have created that culture.
00:37:29.680 | I suppose Twitter and Facebook allow for that culture,
00:37:33.000 | but there's something about this network
00:37:35.880 | as it stands now, 'cause no Android users.
00:37:39.200 | It's probably just because it's iPhone people.
00:37:42.780 | - Yeah, less conspiratorial or something.
00:37:45.240 | - Well, like less, listen, I'm an Android person,
00:37:47.300 | so I got an iPhone just for this network, which is funny.
00:37:50.200 | For now, it's all like, there's very few trolls.
00:37:55.920 | There's very few people that are trying
00:37:57.240 | to manipulate the system and so on.
00:37:59.240 | So I don't know, it's interesting.
00:38:00.920 | Now, the downside, the reason you're going to hate it
00:38:04.760 | is because it's so intimate, because it pulls you in
00:38:08.520 | and pulls in very successful people like you,
00:38:11.320 | just like really successful, productive, very busy people.
00:38:15.960 | It's a huge time sink.
00:38:21.360 | It's very difficult to pull yourself out.
00:38:23.640 | - Interesting, you mean once you're in a room?
00:38:25.080 | - Well, no, leaving the room is actually easy.
00:38:27.760 | The beautiful thing about a stage with multiple people,
00:38:30.480 | there's actually a little button that says leave quietly.
00:38:33.600 | So culture, no, etiquette-wise, it's okay to just leave.
00:38:38.520 | So you and I in a room, when it's just you and I,
00:38:41.000 | it's a little awkward to leave.
00:38:42.200 | - If you're asking questions and I'm just gone.
00:38:44.240 | - But, and actually, if you're being interviewed
00:38:46.360 | for the book, that's weird because you're now in the event
00:38:51.040 | and you're supposed to, but usually the person interviewing
00:38:54.000 | would be like, okay, it's time for you to go.
00:38:55.920 | It's more normal, but the normal way to use the room
00:38:59.960 | is like, you're just opening the app,
00:39:02.820 | and there'll be like, I don't know, Sam Harris,
00:39:05.480 | Eric Weinstein, I think Joe Rogan showed up to the app,
00:39:11.840 | Bill Gates, these people on stage just randomly
00:39:14.880 | just plugged in, and then you'll step up on stage,
00:39:18.020 | listen, maybe you won't contribute at all,
00:39:20.360 | maybe you'll say something funny,
00:39:21.960 | and then you'll just leave.
00:39:23.400 | And there's the addicting aspect to it,
00:39:26.840 | the reason it's a time sink is you don't wanna leave.
00:39:30.000 | - What I've noticed about exceptionally busy people
00:39:33.420 | is that they love this, I think it might have to do
00:39:36.500 | with the pandemic. - Might be a little bit, yeah.
00:39:38.220 | - There's a loneliness. - They're all starved.
00:39:40.540 | - But also it's really cool people.
00:39:42.480 | - Yeah. - Like when was the last time
00:39:44.980 | you talked to Sam Harris or whoever?
00:39:47.020 | Like think of anybody, Tyler, like any faculty.
00:39:52.020 | - This is like what universities strive to create,
00:39:54.860 | but it's taken hundreds of years of cultural evolution
00:39:57.700 | to try to get a lot of interesting, smart people together
00:39:59.660 | that run into each other.
00:40:00.660 | - We have really strong faculty in a room together
00:40:04.620 | with no scheduling, this is the power of it.
00:40:07.180 | It's like you just show up, there's none of that baggage
00:40:10.140 | of scheduling and so on, and there's no pressure to leave,
00:40:13.380 | sorry, no pressure to stay, it's very easy for you to leave.
00:40:16.700 | You realize that there's a lot of constraints on meetings,
00:40:19.220 | and, like, even stopping by, before the pandemic,
00:40:24.220 | to see a friend or faculty colleague and so on,
00:40:28.380 | there's a weirdness about leaving,
00:40:30.140 | but here there's not a weirdness about leaving.
00:40:33.260 | So they've discovered something interesting,
00:40:36.060 | but the final result when you observe it,
00:40:38.340 | it's very fulfilling, I think it's very beneficial,
00:40:43.420 | but it's very addicting.
00:40:44.860 | So you have to make sure you moderate.
00:40:48.540 | - Yeah, that's interesting.
00:40:50.140 | Okay, well, so maybe I'll try it.
00:40:52.180 | I mean, look, there's no, the things that make me suspicious
00:40:54.700 | about other platforms aren't here.
00:40:56.660 | So the feed is not full of user generated content
00:41:00.500 | that is going through some sort of algorithmic rating
00:41:02.340 | process with all the weird incentives and nudging that it does.
00:41:05.780 | And you're not producing content that's being harvested
00:41:08.900 | to be monetized by another company.
00:41:11.260 | I mean, it seems like it's more ephemeral, right?
00:41:14.420 | You're here, you're talking,
00:41:15.780 | the feed is just actually just showing you,
00:41:17.780 | here's interesting things happening, right?
00:41:19.380 | You're not jockeying in the feed for,
00:41:21.060 | look, I'm being clever or something,
00:41:22.420 | and I'm gonna get a like count that goes up
00:41:24.700 | and that's gonna influence.
00:41:26.700 | And there's more friction,
00:41:27.540 | there's more cognitive friction, I guess,
00:41:28.980 | involved in listening to smart people
00:41:31.380 | versus scrolling through.
00:41:33.540 | Yeah, there's something there.
00:41:34.460 | - So there's no--
00:41:35.380 | - Why are people so, I see a lot of,
00:41:37.180 | there's all these articles that seem,
00:41:39.260 | I haven't really read them,
00:41:40.100 | but it seems, why are reporters negative about this?
00:41:42.380 | - Competition.
00:41:43.300 | The New York Times wrote this article
00:41:44.700 | called "Unfettered Conversations Happening on Clubhouse."
00:41:47.620 | - So I'm right in picking up a tone,
00:41:51.180 | even from the headlines,
00:41:52.020 | that there's some negative vibes from the press.
00:41:55.060 | - No, so I can say, let's say,
00:41:58.500 | well, I'll tell you what the article was saying,
00:42:00.980 | which is they're having cancelable conversations,
00:42:05.780 | like the biggest people in the world
00:42:07.340 | almost trolling the press.
00:42:09.500 | And the press is desperate--
00:42:10.340 | - Like 4channing the press.
00:42:11.540 | - Yeah, 4channing the press,
00:42:13.500 | by saying that you guys are looking for clickbait
00:42:16.780 | from our genuine human conversations.
00:42:19.260 | And so I think, honestly, the press is just like,
00:42:24.260 | what do we do with this?
00:42:25.500 | We can't, first of all, it's a lot of work for the,
00:42:28.940 | okay, it's what Naval says,
00:42:31.380 | which is like, this is skipping the journalist.
00:42:34.620 | Like the interview, if you go on Clubhouse,
00:42:37.180 | the interview you might do for the book
00:42:39.740 | would be with somebody who's like a journalist
00:42:41.460 | and interviewing you.
00:42:43.140 | That's more traditional.
00:42:45.140 | It'd be a good introduction for you to try it,
00:42:47.280 | but the way to use Clubhouse is you just show up
00:42:52.280 | and it's like, again, like me, I'm sorry,
00:42:55.620 | I'm like, I keep mentioning Sam Harris
00:42:58.980 | as if it's like the only person I know,
00:43:00.540 | but like a lot of these major faculty,
00:43:03.660 | I don't know, Max Tegmark, just major faculty
00:43:06.860 | just sitting there, and then you show up
00:43:08.620 | and then I'll ask like,
00:43:10.940 | oh, don't you have a book coming out or something?
00:43:12.780 | And then you'll talk about the book
00:43:14.460 | and then you'll leave five minutes later
00:43:15.700 | 'cause you have to go get coffee and go to the bathroom.
00:43:18.160 | So like, that's the, it's not the journalistic,
00:43:21.000 | you're not gonna actually enjoy the interview as much
00:43:23.920 | because it'll be like the normal thing.
00:43:26.680 | Like you're there for 40 minutes or an hour
00:43:28.880 | and there'll be questions from the audience.
00:43:31.160 | - Like I'm doing an event next week for the book launch
00:43:33.800 | where it's like Jason Fried and I are talking about email,
00:43:37.440 | but it's using some more,
00:43:39.080 | there'll be like a thousand people who are there
00:43:40.680 | to watch virtually, but it's using
00:43:42.080 | some sort of traditional webinar.
00:43:44.760 | Clubhouse would be a situation
00:43:46.020 | where that could just happen informally.
00:43:47.620 | Like I jump in like Jason's there
00:43:49.020 | and then someone else jumps in and yeah, that's interesting.
00:43:51.940 | - But for now it's still closed.
00:43:53.940 | So even though there's a lot of excitement
00:43:56.180 | and there'll be quite famous people
00:43:58.940 | just sitting there listening to you,
00:44:00.740 | but the numbers aren't exactly high.
00:44:04.300 | So you're talking about rooms,
00:44:05.940 | like even the huge rooms are like just a few thousand.
00:44:09.340 | - Right, and this is probably like Soho
00:44:11.260 | in the fifties or something too,
00:44:12.620 | just because of the exponential growth,
00:44:15.480 | give it seven more months.
00:44:17.120 | And if one invite begets two invites,
00:44:19.480 | begets four invites, pretty soon it'll be everyone.
00:44:22.200 | And then the rooms in your feed are gonna be whatever,
00:44:25.440 | marketing, performance-enhancing drugs
00:44:27.160 | or something like that.
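A rough back-of-the-envelope sketch of that invite-driven doubling, with an assumed starting user count purely for scale:

```typescript
// Hypothetical doubling model: assume each month of invites roughly
// doubles the user base. The starting size is a made-up number.
function usersAfter(months: number, start: number): number {
  return start * 2 ** months;
}

console.log(usersAfter(7, 2_000_000)); // 256,000,000 after "seven more months"
```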
00:44:28.760 | - But then there's a bunch of competitors,
00:44:30.600 | there's already like 30-plus competitors
00:44:32.680 | that sprung up, like Twitter Spaces.
00:44:34.400 | So Twitter is creating a competitor
00:44:36.160 | that's going to likely destroy Clubhouse
00:44:38.560 | because they just have a much larger user base
00:44:40.480 | and they already have a social network.
00:44:42.400 | So I would be very cautious of course
00:44:46.380 | with the addictive element,
00:44:47.580 | but, just like you said,
00:44:49.300 | this particular implementation in its early stages
00:44:52.100 | doesn't have the,
00:44:53.340 | it doesn't have the context-switching problem.
00:44:58.500 | You'll just switch to it and you'll be stuck.
00:45:01.140 | - Yeah, keeping the context is great.
00:45:02.860 | - Yeah. - Yeah.
00:45:04.380 | - But then I think the best way I've found to use it
00:45:07.660 | is to acknowledge that these things pull you in.
00:45:12.660 | - Yeah.
00:45:13.780 | - So I've used it in the past,
00:45:17.000 | like almost, I'll go get a coffee
00:45:19.140 | and I'll tune into a conversation
00:45:20.880 | as if that's how I use podcasts sometimes.
00:45:24.360 | I'll just like play a little bit of a podcast
00:45:26.960 | and then I can just turn it off.
00:45:29.600 | The problem with these is it pulls you in,
00:45:31.880 | it's really interesting.
00:45:32.840 | And then the other problem that you'll experience
00:45:35.420 | is like somebody will recognize you
00:45:37.520 | and then they'll be like, oh Lex, come on up.
00:45:41.320 | Come on, no way, I had a question for you.
00:45:43.440 | And then it takes a lot for you to go like to ignore that.
00:45:47.600 | - Yeah, yeah.
00:45:49.280 | - And then you pulled in and it's fascinating
00:45:51.120 | and it's really cool people.
00:45:52.320 | So it's like a source of a lot of joy,
00:45:53.880 | but you have to be very, very careful.
00:45:58.160 | The reason I brought it up is there's a room,
00:46:00.920 | there's an entire club actually on burnout
00:46:04.120 | and I brought you up, and I brought up David Goggins
00:46:08.100 | and the process I go through,
00:46:09.700 | which is my passion goes up and down, it dips.
00:46:14.700 | And I don't think I trust my own mind
00:46:17.700 | to tell me whether I'm getting close to burnout
00:46:22.620 | or exhaustion or not.
00:46:24.820 | I kind of go with the David Goggins model of,
00:46:28.260 | I mean, he's probably more applying it to running,
00:46:30.260 | but when it feels like your mind can't take any more,
00:46:35.140 | that you're just 40% at your capacity.
00:46:38.900 | I mean, it's just like an arbitrary level.
00:46:41.020 | - It's the Navy SEAL thing, right?
00:46:41.860 | - The Navy SEAL thing.
00:46:43.060 | I mean, you could put that at any percent,
00:46:44.620 | but it is remarkable that if you just take it
00:46:48.100 | one step at a time, just keep going,
00:46:49.980 | it's similar to this idea of a process.
00:46:53.380 | If you just trust the process and you just keep following,
00:46:55.860 | even if the passion goes up and down and so on,
00:46:58.420 | then ultimately, if you look in aggregate,
00:47:02.740 | the passion will increase.
00:47:04.420 | Your self-satisfaction will increase.
00:47:05.820 | - Yeah, and if you have two things,
00:47:08.300 | this has been a big strategy of mine,
00:47:09.780 | so that what you hope for is off-phase,
00:47:12.580 | off-phase alignment.
00:47:14.220 | Sometimes it's in-phase and that's a problem,
00:47:16.580 | but off-phase alignment's good.
00:47:18.060 | So, okay, my research, I'm struggling,
00:47:20.700 | but my book stuff is going well, right?
00:47:22.740 | And so when you add those two waves together,
00:47:24.820 | like, oh, we're doing pretty well.
00:47:25.780 | And then in other periods, like on my writing,
00:47:28.460 | I feel like I'm just not getting anywhere,
00:47:29.940 | but oh, I've had some good papers,
00:47:31.180 | I'm feeling good over there.
00:47:32.420 | So having two things that can counteract each other.
00:47:35.980 | Now, sometimes they fall into sync and then it gets rough.
00:47:38.980 | (laughs)
00:47:40.300 | Because everything for me is cyclical,
00:47:41.780 | good periods, bad periods with all this stuff.
00:47:43.300 | So typically they don't coincide, so it helps compensate.
00:47:47.620 | When they do coincide, you get really high highs,
00:47:50.420 | like where everything's clicking,
00:47:51.260 | and then you get these really low lows
00:47:52.820 | where your research is not working,
00:47:54.740 | your program's not clicking,
00:47:56.420 | you feel like you're nowhere with your writing,
00:47:59.340 | and then it's a little rougher.
00:48:00.740 | - Do you think about the concept of burnout?
00:48:04.060 | 'Cause I, so I personally have never experienced burnout
00:48:06.420 | in the way that folks talk about,
00:48:08.300 | which is like, it's not just the up and down,
00:48:11.740 | it's like, you don't wanna do anything ever again.
00:48:14.780 | - Yeah.
00:48:15.620 | - It's like, for some people it's like physical,
00:48:17.740 | like to the hospital kind of thing.
00:48:19.740 | - Yeah, so I do worry about it.
00:48:22.880 | So when I used to do student writing,
00:48:24.840 | like writing about students and student advice,
00:48:27.460 | it came up a lot with students at elite schools,
00:48:30.440 | and I used to call it deep procrastination,
00:48:32.420 | but it was a real, really vivid,
00:48:35.380 | very replicatable syndrome
00:48:37.680 | where they stop being able to do schoolwork.
00:48:39.700 | - Yeah.
00:48:40.540 | - Like, this is due, and the professor gives you an extension
00:48:42.980 | and the professor gives you an incomplete
00:48:44.340 | and says, "You got it, you were gonna fail the course,
00:48:46.140 | "you have to hand this in," and they can't do it.
00:48:48.600 | Right, it's like a complete stop
00:48:50.680 | on the ability to actually do work.
00:48:52.460 | And so I used to counsel students who had that issue,
00:48:54.180 | and often it was a combination of,
00:48:56.500 | at least this is my best analysis,
00:48:58.700 | is you have just the physical and cognitive difficulties
00:49:01.380 | of they're usually under a very hard load, right?
00:49:03.700 | They're doing too many majors, too many extracurriculars,
00:49:05.540 | just really pushing themselves,
00:49:07.500 | and the motivation is not sufficiently intrinsic.
00:49:11.500 | - Right.
00:49:12.340 | - So if you have a motivational center
00:49:13.340 | that's not completely on board,
00:49:14.460 | so a lot of these kids, like when I'm dealing with MIT kids,
00:49:16.740 | they would be, their whole town was shooting off fireworks
00:49:20.220 | that they got in,
00:49:21.060 | everyone hoped that they were going there,
00:49:23.500 | and that they're in three majors,
00:49:24.620 | they don't wanna let people down,
00:49:25.620 | but they're not really interested
00:49:26.580 | in being a doctor or whatever.
00:49:28.180 | So your motivation's not in the right place,
00:49:30.100 | the motivational psychologist would say
00:49:31.580 | the locus of control was more towards
00:49:33.180 | the extrinsic end of the spectrum, and you have hardship.
00:49:36.660 | And you could just fritz out the whole system.
00:49:38.840 | And so I would always be very worried about that,
00:49:40.540 | so I think about that a lot.
00:49:41.900 | I do a lot of multi-phase or multi-scale seasonality.
00:49:45.400 | So I'll go hard on something for a while,
00:49:48.140 | and then for a few weeks, go easy.
00:49:50.300 | I'll have semesters that are hard
00:49:51.940 | and semesters that are easier.
00:49:53.140 | I'll take the summer really low.
00:49:54.140 | So on multiple scales,
00:49:55.180 | and in the day I'll go really hard on something,
00:49:56.660 | but then have a hard cutoff at five.
00:49:57.900 | So like every scale, it's all about rest and recovery.
00:50:01.900 | 'Cause I really wanna avoid that, and I do burn out.
00:50:03.620 | I burnt out pretty recently; I get minor burnouts.
00:50:06.620 | I got a couple papers that I was trying to work through
00:50:10.060 | for a deadline a few weeks ago,
00:50:12.780 | and I wasn't sleeping well,
00:50:14.660 | and there's some other things going on,
00:50:17.500 | and it just knocks me out, and I get sick usually,
00:50:20.300 | is how I know I've pushed myself too far.
00:50:22.380 | And so I kind of pulled it back.
00:50:23.460 | Now I'm doing this book launch,
00:50:24.460 | then after this book launch, I'm pulling it back again.
00:50:26.700 | So seasonality for rest and recovery, I think is crucial.
00:50:30.140 | And at every scale, daily, monthly,
00:50:33.700 | and then at the annual scale.
00:50:34.940 | An easy summer, for example,
00:50:36.260 | I think is a great idea, if that's possible.
00:50:38.900 | - Okay, you just made me realize
00:50:41.260 | that that's exactly what I do.
00:50:43.260 | 'Cause I feel like I'm not even close
00:50:45.160 | to burn out on anything,
00:50:46.100 | even though I'm in chaos.
00:50:48.800 | I feel like the exact right way is seasonality, or
00:50:53.740 | not even the seasonality,
00:50:55.180 | but you always have multiple seasons operating.
00:50:59.020 | It's like you said,
00:51:00.140 | 'cause when you have a lot of cool shit going on,
00:51:02.820 | there's always at least one thing that's a source of joy.
00:51:05.660 | There's always a reason.
00:51:08.740 | I suppose the fundamental thing,
00:51:10.900 | and I've known people that suffer from depression too,
00:51:13.780 | the fundamental problem with the experience
00:51:16.080 | of depression and burnout is like,
00:51:18.380 | why do anything, like, life is meaningless.
00:51:21.900 | And I always have an answer of like, why?
00:51:25.420 | Why today could be cool.
00:51:26.940 | - And you have to contrive it, right?
00:51:29.940 | If you don't have it, you have to contrive it.
00:51:31.780 | I think it's really important.
00:51:33.180 | Like, okay, well, this is going bad.
00:51:34.860 | So now is the time to start thinking about,
00:51:37.300 | I mean, look, I started a podcast during the pandemic.
00:51:39.860 | It's like, this is going pretty bad,
00:51:42.300 | but you know what?
00:51:43.180 | This could be something really interesting.
00:51:46.340 | - Deep Questions with Cal Newport.
00:51:48.140 | - I do it all in that voice.
00:51:50.580 | - I love the podcast, by the way.
00:51:53.500 | But yeah, I think David Foster Wallace said,
00:51:56.740 | "The key to life is to be unboreable."
00:51:59.100 | I've always kind of taken that to heart,
00:52:01.840 | which is like, you should be able to,
00:52:04.660 | maybe artificially, generate anything.
00:52:10.780 | Find something in your environment, in your surroundings,
00:52:14.960 | that's a source of joy.
00:52:16.080 | Like, everything is fun.
00:52:17.400 | - Yeah.
00:52:18.520 | Did you read "The Pale King"?
00:52:20.160 | It goes deep on boredom.
00:52:21.540 | It's like uncomfortable.
00:52:22.980 | It's like an uncomfortable meditation on boredom.
00:52:25.680 | Like, the characters in that are just driven
00:52:27.700 | to the extremes of...
00:52:30.000 | I just bought three books on boredom the other day.
00:52:33.340 | So now I'm really interested in this topic.
00:52:35.200 | Because I was anxious about my book launch
00:52:37.200 | happening this week.
00:52:38.040 | So I was like, okay, I need something else.
00:52:39.560 | So I have this idea for a...
00:52:41.120 | I might do it as an article first, but as a book.
00:52:43.580 | Like, okay, I need something cool to be thinking about.
00:52:46.760 | Because I was worried about, like, I don't know.
00:52:48.800 | Is the launch gonna work?
00:52:49.800 | The pandemic, what's gonna happen?
00:52:51.200 | I don't know if it's gonna get there.
00:52:52.140 | So this is exactly what we're talking about.
00:52:54.120 | So I went out and I bought a bunch of books,
00:52:56.040 | and I'm beginning a whole sort of intellectual exploration.
00:53:00.200 | - Well, I think that's one of the profound ideas
00:53:03.000 | in deep work that you don't expand on too much is boredom.
00:53:08.360 | - Yeah, well, so deep work had a superficial idea
00:53:12.520 | about boredom, which was...
00:53:13.800 | I had this chapter called "Embrace Boredom."
00:53:16.040 | And a very functionalist idea was basically,
00:53:19.720 | you have to have some boredom in your regular schedule,
00:53:21.720 | or your mind is gonna form a Pavlovian connection
00:53:24.960 | between as soon as I feel boredom, I get stimuli.
00:53:28.440 | And once it forms that connection,
00:53:29.600 | it's never gonna tolerate deep work.
00:53:30.800 | So there's this very pragmatic treatment of boredom
00:53:34.240 | of your mind better be used to the idea
00:53:36.640 | that sometimes you don't get stimuli,
00:53:37.880 | because otherwise you can't write for three hours.
00:53:39.760 | Like it's just not gonna tolerate it.
00:53:41.780 | But more recently, what I'm really interested in boredom
00:53:44.080 | is it as a fundamental human drive, right?
00:53:47.200 | Because it's incredibly uncomfortable.
00:53:49.720 | And think about the other things
00:53:50.960 | that are incredibly uncomfortable, like hunger or thirst.
00:53:53.240 | They serve a really important purpose for our species, right?
00:53:56.640 | Like if something is really distressing, there's a reason.
00:53:58.720 | Pain is really uncomfortable
00:53:59.960 | because we need to worry about getting injured.
00:54:02.360 | Thirst is really uncomfortable
00:54:03.640 | because we need water to survive.
00:54:05.560 | So what's boredom?
00:54:07.000 | Why is that uncomfortable?
00:54:08.680 | And I've been interested in this notion
00:54:11.480 | that boredom is about driving us towards productive action.
00:54:16.480 | Like as a species, I mean, think about it.
00:54:19.160 | Like what got us to actually take advantage of these brains?
00:54:22.320 | What got us to actually work with fire?
00:54:24.400 | What got us to start shaping stones and the hand axes
00:54:27.660 | and figuring out if we could actually sharpen a stick
00:54:29.680 | sharp enough that we could throw it as a melee weapon
00:54:32.320 | or a distance weapon for hunting mammoth, right?
00:54:35.240 | Boredom drives us towards action.
00:54:37.960 | So now I'm fascinated by this fundamental action instinct
00:54:41.480 | because I have this theory that I'm working on
00:54:43.560 | that we're out of sync with it.
00:54:45.800 | Just like we have this drive for hunger,
00:54:47.940 | but then we introduced junk food
00:54:49.080 | and got out of sync with hunger
00:54:50.300 | and it makes us really unhealthy.
00:54:52.080 | We have this drive towards action,
00:54:53.420 | but then we overload ourselves
00:54:55.400 | and we have all of these distractions.
00:54:56.800 | And then that causes,
00:54:58.600 | it's like a cognitive action obesity type of thing,
00:55:01.440 | because it short circuits the system
00:55:02.840 | that wants us to do things,
00:55:03.800 | but we put more things on our plate
00:55:04.840 | than we can possibly do.
00:55:05.720 | And then we're really frustrated we can't do them.
00:55:07.480 | And we're short circuiting all of our wires.
00:55:09.200 | So it all comes back to this question,
00:55:11.560 | well, what would be the ideal amount of stuff to do
00:55:16.560 | and type of things to do?
00:55:18.400 | Like if we wanted to look back
00:55:19.320 | at our ancestral environment and say,
00:55:21.720 | if I could just build from scratch,
00:55:24.000 | how much work I do and what I work on
00:55:26.520 | to be as in touch with that as like paleo people
00:55:28.640 | are trying to get their diets in touch with that.
00:55:30.120 | And so now I'm just,
00:55:31.320 | but see, it's something I made up.
00:55:34.040 | But now I'm going deep on it.
00:55:36.160 | And one of my podcast listeners,
00:55:37.560 | I was talking about on the show and I was like,
00:55:39.320 | well, I keep trying to learn about animals and boredom.
00:55:41.440 | And she sent me this cool article
00:55:42.880 | from an animal behaviorist journal
00:55:44.600 | about what we know about human boredom versus animal boredom.
00:55:48.120 | So trying to figure out that puzzle
00:55:50.000 | is the wave that's high.
00:55:52.280 | So I can get through the wave that's low of like,
00:55:54.160 | I don't know about this pandemic book launch.
00:55:55.700 | And my research is stumbling a little bit
00:55:59.800 | because of the pandemic.
00:56:00.640 | And so I needed a nice high.
00:56:03.680 | So there we go, there's a case study.
00:56:05.360 | - Well, it's both a case study
00:56:07.520 | and a very interesting set of concepts
00:56:09.320 | 'cause I didn't even realize that it's so simple.
00:56:12.280 | I'm one of the people that has an interesting
00:56:17.280 | push and pull dynamic with hunger,
00:56:18.980 | trying to understand the hunger with myself.
00:56:21.200 | Like I probably have an unhealthy relationship with food.
00:56:24.680 | I don't know, but there's probably a perfect,
00:56:28.480 | that's a nice way to think about it, diet as action.
00:56:32.800 | There's probably an optimal diet response
00:56:36.520 | to the experience that our body's telling us,
00:56:40.340 | the signal that our body's sending, which is hunger.
00:56:43.280 | And in that same way, boredom is sending a signal.
00:56:46.840 | And most of our intellectual activities in this world,
00:56:49.960 | our creative activities are essentially
00:56:52.680 | a response to that signal.
00:56:56.600 | - Yeah, and think about this analogy
00:56:59.640 | that we have this hunger instinct
00:57:01.080 | that junk food short circuits.
00:57:03.360 | - Yes.
00:57:04.200 | - Right, it's like, oh, we'll satisfy that hyper-palatably
00:57:06.720 | and it doesn't end up well.
00:57:08.240 | Now think about modern attention engineered,
00:57:11.760 | digitally mediated entertainment.
00:57:14.520 | We have this boredom instinct.
00:57:16.000 | Oh, we can take care of that
00:57:17.760 | with a hyper-palatable alternative.
00:57:20.500 | Is that gonna lead to a similar problem?
00:57:22.260 | - So I've been fasting a lot lately.
00:57:23.720 | Like I'm doing eating once a day.
00:57:27.660 | I've been doing that for over a month.
00:57:29.620 | Just eating one meal a day and primarily meat.
00:57:33.800 | But it's very, fasting has been incredible for me,
00:57:38.280 | for focus, for wellbeing, for a few,
00:57:40.880 | I don't know, just for feeling good.
00:57:42.680 | Okay, we'll put on a chart what makes me feel good.
00:57:45.680 | And that fasting and eating primarily a meat-based diet
00:57:50.680 | makes me feel really good.
00:57:52.500 | And so, but that ultimately, what fasting did,
00:57:57.800 | I haven't fasted super long yet, like a seven-day fast,
00:58:00.500 | which I'd really like to do.
00:58:02.140 | But even just fasting for a day for 24 hours
00:58:05.120 | gets you in touch with the signal.
00:58:09.980 | It's fascinating.
00:58:10.820 | Like you get to listen to your,
00:58:12.340 | learn to listen to your body that like,
00:58:15.040 | you know, it's okay to be hungry.
00:58:17.720 | It's like a little signal that sends you stuff.
00:58:19.780 | And then I get to listen to how it responds
00:58:24.140 | when I put food in my body.
00:58:27.540 | And I get to like, okay, cool.
00:58:30.480 | So like food is a thing that pacifies the signal.
00:58:33.760 | Like it sounds ridiculous, okay.
00:58:35.760 | You could do that with-
00:58:36.600 | - And do different types of food.
00:58:38.320 | It feels different.
00:58:39.160 | So you learn about what your body wants.
00:58:41.660 | - For some reason, fasting,
00:58:44.140 | it's similar to the deep work "embrace boredom" idea.
00:58:47.360 | Fasting allowed me to go into mode of listening,
00:58:50.400 | of trying to understand the signal,
00:58:52.080 | that I could say I have an unhealthy appreciation of fruit.
00:58:56.920 | Okay.
00:58:57.860 | I love apples and cherries.
00:58:59.540 | Like I don't know how to moderate them.
00:59:01.420 | So if you take just the same amount of calories,
00:59:03.500 | I don't know if calories matter,
00:59:04.840 | but they say calories,
00:59:05.960 | 2,000 calories of cherries versus 2,000 calories of steak.
00:59:10.960 | If I eat 2000 calories of steak,
00:59:13.300 | maybe just a little bit of like green beans or cauliflower,
00:59:17.180 | I'm going to feel really good, fulfilled, focused, and happy.
00:59:22.180 | If I eat cherries, I'm going to be,
00:59:24.460 | I'm going to wake up behind a dumpster crying with like,
00:59:27.780 | naked and like, it's just-
00:59:29.780 | - Pits all around.
00:59:30.620 | - Yeah, with everything.
00:59:31.460 | - With a face, yeah.
00:59:32.300 | - And just like bloated, just not, and unhappy.
00:59:36.420 | And also the mood swings up and down.
00:59:39.740 | I don't know.
00:59:41.340 | And I'll be much hungrier the next day.
00:59:44.740 | Sometimes it takes a couple of days,
00:59:46.220 | but when I introduce carbs into the system, too many carbs,
00:59:50.100 | it starts, it's just unhealthy.
00:59:53.020 | I go into this roller coaster as opposed to a calm boat ride
00:59:56.020 | along the river in the Amazon or something like that.
00:59:58.540 | And so fasting was the mechanism of,
01:00:01.140 | for me to start listening to the body.
01:00:03.940 | I wonder if you can do that same kind of,
01:00:05.940 | I guess that's what meditation a little bit is.
01:00:07.860 | - A little bit, but yeah, listen to boredom.
01:00:10.100 | But so two years ago,
01:00:11.020 | I had a book out called "Digital Minimalism."
01:00:13.540 | And one of the things I was recommending that people do
01:00:16.100 | is basically a 30-day fast,
01:00:18.300 | but from digital personal entertainment,
01:00:20.420 | social media, online videos,
01:00:21.900 | anything that captures your attention and dispels boredom.
01:00:26.420 | And people were thinking like, oh, this is a detox.
01:00:29.540 | Like I just want to teach your body
01:00:30.700 | not to need the distraction or this or that,
01:00:32.780 | but it really wasn't what I was interested in.
01:00:34.300 | I wanted there to be space
01:00:37.660 | that you could listen to your boredom.
01:00:39.380 | Like, okay, I can't just dispel it.
01:00:41.100 | I can't just look at the screen
01:00:42.540 | and revel in it a little bit and start to listen to it
01:00:45.140 | and say, what is this really pushing me towards?
01:00:48.020 | And you take the new stuff,
01:00:49.460 | the new technology off the table and sort of ask,
01:00:51.460 | what is this, what am I craving?
01:00:53.220 | Like, what's the activity equivalent of 2000 calories of meat
01:00:57.300 | with a little bit of green beans on the side?
01:00:59.380 | And I had 1700 people go through this experiment,
01:01:01.620 | like spend 30 days doing this.
01:01:03.420 | And it's hard at first,
01:01:04.340 | but then they get used to listening to themselves
01:01:06.980 | and sort of seeking out
01:01:07.860 | what is this really pushing me towards?
01:01:09.740 | And it was pushing people towards connection.
01:01:12.260 | It was pushing people towards,
01:01:13.460 | I just want to go be around other people.
01:01:15.580 | It was pushing people towards high quality leisure activities
01:01:19.340 | like I want to go do something that's complicated.
01:01:21.700 | And it took weeks sometimes for them
01:01:23.260 | to get in touch with their boredom,
01:01:25.140 | but then it completely rewired how they thought about
01:01:28.780 | what do I want to do with my time outside of work?
01:01:30.780 | And then the idea is when you're done with that,
01:01:32.060 | then it was much easier to go back
01:01:33.340 | and completely change your digital life
01:01:34.940 | because you have alternatives, right?
01:01:37.420 | You're not just trying to abstain from things you don't like
01:01:39.860 | but that's basically a listening to boredom experiment.
01:01:42.660 | Like just be there with the boredom
01:01:45.140 | and see where it drives you
01:01:46.380 | when you don't have the digital cheese.
01:01:48.860 | Okay, so if I can't do that, where is it gonna drive me?
01:01:52.020 | Well, I guess I kind of want to go to the library,
01:01:53.940 | which came up a lot by the way.
01:01:54.780 | A lot of people rediscovered the library.
01:01:57.460 | - With physical books.
01:01:58.300 | - Physical books, so you can just go borrow them.
01:02:00.460 | And there's low pressure and you can explore
01:02:03.180 | and you bring them home and then you read them
01:02:04.860 | and you can sit by the window and read them
01:02:06.620 | and it's nice weather outside.
01:02:07.660 | And I used to do that 20 years ago.
01:02:09.740 | That's listening to boredom.
01:02:10.980 | - So can you maybe elaborate a little bit
01:02:12.980 | on the different experiences that people had
01:02:15.620 | when they quit social media for 30 days?
01:02:17.820 | Like if you were to recommend that process,
01:02:20.740 | what is ultimately the goal?
01:02:23.020 | - Yeah, digital minimalism,
01:02:24.940 | that's my philosophy for all this tech.
01:02:27.740 | And it's working backwards from what's important.
01:02:30.700 | So it's, you figure out what you're actually all about,
01:02:33.180 | like what you want to do,
01:02:34.140 | what you want to spend your time doing.
01:02:35.900 | And then you can ask, okay,
01:02:37.660 | is there a place that tech could amplify
01:02:39.180 | or support some of these things?
01:02:40.180 | And that's how you decide what tech to use.
01:02:42.860 | And so the process is let's actually
01:02:45.300 | get away from everything.
01:02:46.500 | Let's be bored for a while.
01:02:47.580 | Let's really spend a month getting,
01:02:48.980 | really figuring out what do I actually want to do?
01:02:51.060 | What do I want to spend my time doing?
01:02:52.180 | What's important to me?
01:02:53.820 | What makes me feel good?
01:02:54.700 | And then when you're done,
01:02:55.540 | you can bring back in tech very strategically
01:02:57.140 | to help those things, right?
01:02:58.740 | And that was the goal.
01:02:59.980 | That turns out to be much more successful
01:03:01.860 | than when people take an abstention-only approach.
01:03:05.500 | So if you come at your tech life and say,
01:03:09.060 | whatever, I look at Instagram too much.
01:03:10.700 | Like I don't like how much I'm on Instagram.
01:03:12.860 | That's a bad thing.
01:03:13.740 | I want to reduce this bad thing.
01:03:15.100 | So here's my new thing.
01:03:16.260 | I'm going to spend less time looking at Instagram,
01:03:18.100 | much less likely to succeed in the long term.
01:03:20.580 | We're much worse at trying to reduce
01:03:23.100 | this sort of amorphous negative because in the moment,
01:03:25.500 | you're like, yeah, but it's not that bad.
01:03:27.220 | And it would be kind of interesting to look at it now.
01:03:29.140 | When you're instead controlling behavior
01:03:30.780 | because you have a positive that you're aiming towards,
01:03:32.620 | it's very powerful for people.
01:03:33.620 | Like I want my life to be like this.
01:03:35.860 | Here's the role that tech plays in that life.
01:03:39.140 | The connection to wanting your life to be like that
01:03:41.260 | is very, very strong.
01:03:42.460 | And then it's much, much easier to say,
01:03:43.700 | yeah, like using Instagram is not part of my plan
01:03:45.780 | for how I have that life.
01:03:46.660 | And I really want to have that life.
01:03:47.780 | So of course I'm not going to use Instagram.
01:03:49.060 | So it turns out to be a much more sustainable way
01:03:51.780 | to tame what's going on.
01:03:53.260 | - So if you quit social media for 30 days,
01:03:55.340 | you kind of have to do the work.
01:03:58.100 | - You have to do the work.
01:03:59.340 | - Of thinking like, what am I actually,
01:04:01.460 | what makes me happy in terms of these tools
01:04:04.340 | that I've previously used?
01:04:05.620 | And when you try to integrate them back,
01:04:08.940 | how can I integrate them to maximize the thing
01:04:11.100 | that actually makes me happy?
01:04:11.940 | - Yeah, or what makes me happy unrelated to technology?
01:04:14.500 | Like, what do I actually, what do I want my life to be like?
01:04:16.300 | Well, maybe what I want to do is be outside of nature
01:04:18.660 | two hours a day and spend a lot more time
01:04:20.220 | like helping my community and sacrificing
01:04:22.020 | on behalf of my connections,
01:04:23.180 | and then have some sort of intellectually engaging
01:04:26.500 | leisure activity, like I'm reading
01:04:28.220 | or trying to read the great books
01:04:29.620 | and having more calm and seeing the sunset.
01:04:31.700 | Like you create this picture and then you go back and say,
01:04:35.180 | well, I still need my Facebook group
01:04:36.580 | because that's how I keep up with my cycling group.
01:04:39.820 | But Twitter is just, you know,
01:04:41.300 | toxic, it's not helping any of these things.
01:04:42.820 | And well, I'm an artist, so I kind of need Instagram
01:04:45.420 | to get inspiration.
01:04:46.260 | But if I know that's why I'm using Instagram,
01:04:48.180 | I don't need it on my phone, it's just on my computer.
01:04:49.980 | And I just follow 10 artists and check it once a week.
01:04:51.900 | Like you really can start deploying.
01:04:54.060 | It was the number one thing that differentiated
01:04:55.820 | in that experiment, the people who ended up
01:04:58.220 | sustainably making changes and getting through the 30 days
01:05:00.940 | and those who didn't, was the people who did
01:05:03.060 | the experimentation and the reflection.
01:05:04.580 | Like, let me try to figure out what's positive.
01:05:07.500 | They were much more successful than the people
01:05:09.180 | that just said, I'm sick of using my phone so much.
01:05:11.940 | So I'm just gonna white knuckle it.
01:05:12.900 | Just 30 days will be good for me.
01:05:14.100 | I just gotta, I just gotta get away from it or something.
01:05:16.540 | It doesn't last.
01:05:17.620 | - So you don't use social media currently.
01:05:19.820 | - Yeah.
01:05:21.060 | - Do you find that a lot of people going through this process
01:05:24.340 | will seek to basically arrive at a similar place
01:05:29.340 | to not use social media primarily?
01:05:30.900 | - About half, right.
01:05:32.460 | So about half when they went through this exercise,
01:05:34.700 | and these aren't quantified numbers.
01:05:36.700 | You know, this is just, they sent me reports and yeah.
01:05:40.100 | - That's pretty good though, 1,700?
01:05:42.060 | - Yeah, yeah.
01:05:43.380 | So roughly half probably got rid of social media altogether.
01:05:47.220 | Once they did this exercise,
01:05:48.260 | they realized, for these things I care about,
01:05:50.140 | social media is not the tool that's really helping.
01:05:53.580 | The other half kept some, there were some things
01:05:56.060 | in their life where some social media was useful.
01:05:59.060 | But the key thing is, if they knew why they were deploying
01:06:01.180 | social media, they could put fences around it.
01:06:04.380 | So for example, of those half that kept some social media,
01:06:07.100 | almost none of them kept it on their phone.
01:06:09.460 | - Oh, interesting.
01:06:10.300 | - Yeah, you can't optimize if you don't know
01:06:12.100 | what function you're trying to optimize.
01:06:13.620 | So it's like this huge hack.
01:06:14.660 | It's like, once you know this is why I'm using Twitter,
01:06:16.940 | then you can have a lot of rules about how you use Twitter.
01:06:19.180 | And suddenly you take this cost benefit ratio
01:06:21.660 | and it goes, like, way away from the company's advantage
01:06:24.140 | and way over towards your advantage.
01:06:25.820 | - It's kind of fascinating 'cause I've been torn
01:06:28.700 | with social media, but I did this kind of process.
01:06:30.580 | I haven't actually done it for 30 days,
01:06:32.300 | which I probably should.
01:06:33.740 | I'll do it for like a week at a time and regularly
01:06:36.100 | and thinking what kind of approach to Twitter works for me.
01:06:41.100 | I'm distinctly aware of the fact that I really enjoy
01:06:47.940 | posting once or twice a day.
01:06:51.160 | And at that time checking from the previous post,
01:06:55.060 | it makes me feel, even when there's negative comments,
01:06:59.780 | they go right past me.
01:07:01.340 | And when there's positive comments, it makes you smile.
01:07:03.500 | I feel like love and connection with people,
01:07:06.140 | especially if people I know, but even just in general,
01:07:08.500 | it's like, it makes me feel like the world
01:07:10.300 | is full of awesome people.
01:07:12.220 | Okay, when you increase that from checking from two to,
01:07:15.380 | like, I don't know what the threshold is for me,
01:07:17.500 | but probably like five or six per day,
01:07:19.900 | it starts going to anxiety world,
01:07:21.940 | like where negative comments will actually stick
01:07:25.620 | to me mentally and positive comments will feel more shallow.
01:07:32.380 | - Yeah, yeah.
01:07:33.660 | - It's kind of fascinating.
01:07:34.660 | So I've been trying to, there's been long stretches
01:07:39.660 | of time, I think December and January,
01:07:43.300 | where I did just post and check, post and check.
01:07:46.340 | That makes me really happy.
01:07:49.020 | Most of 2020 I did that, it made me really happy.
01:07:52.540 | Recently I started, like, I'll go, you know,
01:07:56.060 | you go right back in like a drug addict
01:07:57.980 | where you check it like, I don't know what that number is,
01:08:00.700 | but that number is high.
01:08:01.660 | It's not good.
01:08:02.500 | You don't come out happy.
01:08:03.860 | No one comes out of a day full of Twitter
01:08:06.020 | celebrating humanity.
01:08:07.300 | - And it's not even, 'cause I'm very fortunate
01:08:10.380 | to have a lot of just like positivity in the Twitter,
01:08:12.900 | but there's just a general anxiety.
01:08:16.180 | I wouldn't even say, I wouldn't even say it's,
01:08:19.860 | it's probably the thing that you're talking about
01:08:21.380 | with the contact switching.
01:08:22.620 | It's almost like an exhaustion.
01:08:25.820 | I wouldn't even say it's like a negative feeling.
01:08:27.900 | It's almost just an exhaustion to where I'm not creating
01:08:30.740 | anything beautiful in my life, just exhausted.
01:08:33.820 | - Like an existential exhaustion.
01:08:35.220 | - Existential exhaustion.
01:08:36.900 | But I wonder, do you think it's possible to use,
01:08:39.540 | from the people you've seen, from yourself,
01:08:42.580 | to use social media in the way I'm describing, moderation,
01:08:45.500 | or is it always going to become?
01:08:48.060 | - When people do this exercise,
01:08:49.220 | you get lots of configurations.
01:08:52.340 | So for people that have a public presence, for example,
01:08:56.100 | like what you're doing is not that unusual.
01:08:58.820 | Okay, I post one thing a day and my audience likes it
01:09:02.220 | and that's kind of it, which, but you've thought through,
01:09:04.860 | like, okay, this supports something I value,
01:09:06.940 | which is like having a sort of informal connection
01:09:09.420 | with my audience and being exposed
01:09:12.140 | to some sort of positive randomness.
01:09:16.020 | - Yes.
01:09:16.860 | - Okay, then you could say, if that's my goal,
01:09:18.900 | what's the right way to do it?
01:09:19.740 | Well, I don't need to be on Twitter on my phone all day.
01:09:21.260 | Maybe what I do is every day at five,
01:09:23.180 | I do my post and check on the day.
01:09:25.660 | So I have a writer friend, Ryan Holiday,
01:09:28.980 | who writes about the Stoics a lot,
01:09:30.940 | and he has this similar strategy.
01:09:32.540 | He posts one quote every day, usually from a famous Stoic
01:09:36.900 | and sometimes from a contemporary figure,
01:09:38.100 | and that's just what he does.
01:09:38.940 | He just posts it and it's a very positive thing.
01:09:41.700 | Like his readers really love it
01:09:43.000 | because it's just like a dose of inspiration.
01:09:44.740 | He doesn't spend time,
01:09:46.440 | he's never interacting with anyone on social media, right?
01:09:49.180 | But that's an example of,
01:09:50.820 | I figured out what's important to me,
01:09:52.260 | what's the best way to use tools to amplify it,
01:09:54.460 | and then you get advantages out of the tools.
01:09:56.700 | So I like what you're doing.
01:09:57.980 | I looked up your Twitter feed before I came over here.
01:10:00.980 | I was curious, you're not on there a lot.
01:10:02.980 | - No.
01:10:03.820 | - I don't see you yelling at people.
01:10:04.640 | Now, do you think social media as a medium
01:10:08.020 | changed the cultural standards?
01:10:09.540 | And I mean it in a, have you read Neil Postman at all?
01:10:12.340 | Have you read like "Amusing Ourselves to Death"?
01:10:14.420 | He was a social critic, technology critic,
01:10:16.520 | and wrote a lot about sort of technological determinism.
01:10:20.140 | So the ways, which is a really influential idea
01:10:22.840 | to a lot of my work, which is actually a little out
01:10:24.620 | of fashion right now in academia,
01:10:25.960 | but the ways that the properties and presence
01:10:28.900 | of technologies change things about humans
01:10:31.540 | in a way that's not really intended
01:10:32.980 | or planned by the humans themselves.
01:10:34.260 | And he has, that book is all about
01:10:35.460 | how different communication medium,
01:10:38.460 | like fundamentally just changed the way
01:10:39.700 | the human brain understands and operates.
01:10:42.380 | And so he sort of gets into the,
01:10:43.700 | what happened when the printed word was widespread
01:10:45.980 | and how television changed it.
01:10:47.540 | And this was all pre-social media.
01:10:50.120 | But one of these ideas I'm having is like,
01:10:51.700 | what's the degree to which,
01:10:52.780 | I get into it sometimes on my show,
01:10:54.300 | I get into a little bit,
01:10:55.140 | like the degree to which like Twitter in particular,
01:10:58.340 | just changed the way that people conceptualized what,
01:11:00.780 | for example, debate and discussion was.
01:11:04.180 | Like it introduced a rhetorical dunk culture,
01:11:06.380 | or it's sort of more about tribes not giving ground
01:11:09.260 | to other tribes.
01:11:10.580 | And it's like, it's a complete,
01:11:12.340 | there's different places and times
01:11:15.060 | when that type of discussion was thought of differently.
01:11:18.100 | - Well, yeah, absolutely.
01:11:19.540 | But I tend to believe,
01:11:20.780 | I don't know what you think,
01:11:21.620 | that there's the technological solutions.
01:11:23.580 | Like there's literally different features in Twitter
01:11:27.860 | that could completely reverse that.
01:11:29.540 | There's so much power in the different choices
01:11:32.980 | that are made.
01:11:33.800 | And it could still be highly engaging
01:11:36.540 | and have very different effects,
01:11:37.740 | perhaps more negative,
01:11:38.980 | or hopefully more positive.
01:11:40.580 | - Yeah, so I'm trying to pull these two things apart.
01:11:42.740 | So there's these two ways social media,
01:11:45.540 | let's say could change the experience
01:11:46.940 | of reading a major newspaper today.
01:11:49.460 | One could be a little bit more economic, right?
01:11:51.300 | So the internet made it cheaper to get news.
01:11:53.700 | The newspapers had to retreat to a paywall model
01:11:55.720 | because it was the only way they were gonna survive.
01:11:57.340 | But once you're in a paywall model,
01:11:58.580 | then what you really wanna do is make your tribe,
01:12:01.740 | which is within the paywall, very, very happy with you.
01:12:04.120 | So you wanna work to them.
01:12:05.260 | But then there's the sort of the determinist point of view,
01:12:07.860 | which is the properties of Twitter, which were arbitrary.
01:12:10.660 | Jack and Evan just, whatever, let's just do it this way,
01:12:14.220 | influenced the very way that people now understand
01:12:16.060 | and think about the world.
01:12:17.100 | - So the one influenced the other,
01:12:19.060 | I think they kind of started adjusting together.
01:12:22.100 | I did this thing, I mean, I'm trying to understand this.
01:12:25.380 | Part of the, I've been playing with the entrepreneurial idea
01:12:30.380 | that's a very particular dream I've had of a startup
01:12:34.700 | that this is a longer term thing
01:12:37.260 | that has to do with artificial intelligence.
01:12:39.580 | But more and more, it seems like there's some trajectory
01:12:43.020 | through creating social media type of technologies,
01:12:47.420 | very different than what people are thinking I'm doing.
01:12:49.540 | But it's a kind of challenge to the way that Twitter is done.
01:12:54.540 | But it's not obvious what the best mechanisms are
01:12:58.740 | to still make an exceptionally engaging platform,
01:13:01.900 | like Clubhouse is very engaging,
01:13:04.020 | and not have any of the negative effects.
01:13:06.540 | For example, there's Chrome extensions
01:13:08.940 | that allow you to turn off all likes and dislikes
01:13:13.660 | and all of that from Twitter.
01:13:15.020 | So all you're seeing is just the content.
01:13:18.260 | On Twitter, that to me creates,
01:13:21.540 | that's not a compelling experience at all.
01:13:23.620 | Because I still need, I would argue,
01:13:26.940 | I still need the likes to know what's a tweet worth reading.
01:13:30.420 | 'Cause I only have a limited amount of time,
01:13:32.220 | so I need to know what's valuable.
01:13:34.020 | It's like great Yelp reviews on tweets or something.
01:13:36.980 | But I've turned these off, for example,
01:13:40.620 | on my account on YouTube:
01:13:44.620 | I wrote a Chrome extension that turns off
01:13:47.380 | all likes and dislikes, and even views.
01:13:50.300 | I don't know how many views a video gets and so on,
01:13:53.020 | unless it's on my phone.
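A minimal sketch of what a content script for that kind of metric-hiding extension could look like (not the actual extension); it assumes a manifest.json that registers it for https://www.youtube.com/*, and the CSS selectors are illustrative guesses that would need updating as YouTube's markup changes:

```typescript
// content.ts: hide engagement metrics on YouTube pages.
const METRIC_SELECTORS = [
  "span.view-count",        // view counts (assumed selector)
  "#segmented-like-button", // like/dislike buttons (assumed selector)
];

function hideMetrics(): void {
  for (const selector of METRIC_SELECTORS) {
    document.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      el.style.visibility = "hidden"; // hide, don't remove, to keep the layout stable
    });
  }
}

// YouTube is a single-page app, so re-apply whenever the DOM changes.
new MutationObserver(hideMetrics).observe(document.documentElement, {
  childList: true,
  subtree: true,
});
hideMetrics();
```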
01:13:53.860 | - Do you take off the recommendations?
01:13:55.660 | - No, no.
01:13:58.540 | - On YouTube, some people,
01:13:59.540 | distraction for YouTube is a big one for people.
01:14:02.100 | - No, I'm not worried about the distraction
01:14:04.100 | because I'm able to control myself on YouTube.
01:14:06.900 | - You don't rabbit hole.
01:14:07.900 | - No, I don't rabbit hole.
01:14:09.060 | So you have to know your demons
01:14:10.500 | or your addictions or whatever.
01:14:11.900 | On YouTube, I'm okay.
01:14:12.860 | I don't keep clicking.
01:14:14.700 | The negative feelings come from seeing the views
01:14:19.340 | on stuff you've created.
01:14:22.100 | - Oh, so you don't wanna see your views.
01:14:24.100 | - Yeah.
01:14:24.940 | So I'm just speaking to the things
01:14:26.780 | that I'm aware of myself that are helpful
01:14:29.660 | and things that are not helpful emotionally.
01:14:31.940 | And I feel like there should be,
01:14:34.540 | we need to create actually tooling for ourselves.
01:14:37.020 | That's not me with JavaScript,
01:14:38.860 | but anybody's able to create,
01:14:42.340 | sort of control the experience that they have.
01:14:45.060 | - Yeah.
01:14:45.900 | Well, so my big unified theory on social media
01:14:48.900 | is I'm very bearish on the big platforms
01:14:52.460 | having a long future.
01:14:53.820 | - You are.
01:14:54.660 | - I think the moment of three or four major platforms
01:14:57.940 | is not gonna last.
01:15:01.020 | Right, so I don't, okay, this is just perspective, right?
01:15:03.820 | So you can start shorting these stocks on my,
01:15:07.020 | don't tell Vlad. - It's not financial advice.
01:15:08.100 | - Yeah, don't do it, Robinhood.
01:15:09.980 | So here's, I think the big mistake
01:15:12.860 | the major platforms made
01:15:14.780 | is when they took out the network effect advantage, right?
01:15:19.380 | So the original pitch,
01:15:20.980 | especially if something like Facebook or Instagram
01:15:23.140 | was the people you know are on here, right?
01:15:26.420 | So like what you use this for
01:15:27.740 | is you can connect to people that you already know.
01:15:29.780 | This is what makes the network useful.
01:15:31.700 | So therefore the value of our network grows quadratically
01:15:34.780 | with the number of users.
01:15:35.820 | And therefore it's such a headstart
01:15:37.780 | that there's no way that someone else can catch up.
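That quadratic claim is the Metcalfe's-law argument: n users can form n(n-1)/2 pairwise connections, so doubling the user base roughly quadruples the network's potential value. A small sketch with assumed user counts:

```typescript
// Metcalfe-style estimate: potential pairwise connections grow as
// n(n-1)/2, roughly quadratic in the user count n.
const pairwiseConnections = (n: number): number => (n * (n - 1)) / 2;

console.log(pairwiseConnections(1_000)); // 499,500
console.log(pairwiseConnections(2_000)); // 1,999,000, about 4x for 2x the users
```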
01:15:42.660 | But when they shifted, and when Facebook took the lead
01:15:45.700 | of saying we're gonna shift towards a newsfeed model,
01:15:48.460 | they basically said we're going to try to, in the moment,
01:15:50.260 | get more data and get more likes.
01:15:50.260 | Like what we're gonna go towards
01:15:51.300 | is actually just seeing interesting stuff,
01:15:54.300 | like seeing diverting information.
01:15:55.380 | So people took this social internet impulse
01:15:58.260 | to connect to people digitally to other tools,
01:16:00.860 | like group text messages and WhatsApp
01:16:02.460 | and stuff like this, right?
01:16:03.300 | So you don't think about these tools
01:16:04.420 | as oh, this is where I connect with people.
01:16:06.460 | Once it's just a feed that's kind of interesting,
01:16:09.140 | now you're competing with everything else
01:16:10.780 | that can produce interesting content that's diverting.
01:16:13.460 | And I think that is a much fiercer competition
01:16:16.300 | because now for example,
01:16:17.140 | you're going up against podcast, right?
01:16:18.620 | I mean like, okay, I guess, you know,
01:16:19.980 | the Twitter feed is interesting right now,
01:16:22.900 | but also a podcast is interesting
01:16:24.340 | or something else could be interesting too.
01:16:25.580 | I think it's a much fiercer competition
01:16:27.300 | when there's no more network effects, right?
01:16:29.900 | And so my sense is we're gonna see a fragmentation
01:16:32.220 | into what I call long tail social media,
01:16:34.740 | where if I don't need everyone I know to be on a platform,
01:16:38.460 | then why not have three or four bespoke platforms I use
01:16:41.860 | where it's a thousand people and it's all,
01:16:43.980 | we're all interested in, you know, whatever,
01:16:46.500 | AI or comedy and we've perfected this interface
01:16:50.060 | and maybe it's like Clubhouse, it's audio or something.
01:16:52.060 | And we all pay $2 so we don't have to worry
01:16:54.180 | about attention harvesting.
01:16:55.780 | And that's gonna be wildly more entertaining.
01:16:57.740 | Like, I mean, I'm thinking about comedians on Twitter.
01:17:00.580 | It's not the best possible internet format
01:17:04.060 | for them expressing themselves and being interesting
01:17:06.380 | that you have all these comedians that are trying to like,
01:17:08.100 | well, I can do like little clips and little whatever.
01:17:10.020 | Like, I don't know, if there was a long tail social media,
01:17:13.020 | it's really, this is where the comedians are,
01:17:14.460 | and there's podcasts and the comedians run podcasts now.
01:17:16.500 | So my thought is that there's really no,
01:17:19.460 | there's really no strong advantage
01:17:21.180 | to having one large platform that everyone is on.
01:17:25.860 | If all you're getting from it is I now have different
01:17:27.980 | options for diversion and like uplifting aspirational
01:17:31.180 | or whatever types of entertainment,
01:17:33.300 | that whole thing could fragment.
01:17:34.540 | And I think the glue that was holding it together
01:17:36.100 | was network effects.
01:17:36.940 | I don't think they realized that once network effects
01:17:38.620 | have been destabilized,
01:17:40.020 | they don't have the centripetal force anymore
01:17:41.900 | and they're spinning faster and faster.
01:17:43.340 | But is a Twitter feed really that much more interesting
01:17:46.660 | than all these streaming services?
01:17:48.060 | Is it really that much more interesting than Clubhouse?
01:17:51.060 | Is it that much more interesting than podcasts?
01:17:54.020 | I feel like they don't realize how unstable
01:17:56.500 | their ground actually is.
01:17:57.460 | - Yeah, that's fascinating.
01:17:58.420 | But the thing that makes Twitter and Facebook work,
01:18:03.420 | I mean, the newsfeed, you're exactly right.
01:18:07.100 | Like you can just duplicate the news.
01:18:08.780 | Like if it's not the social network and it's the newsfeed,
01:18:12.780 | then why not have multiple different feeds
01:18:15.220 | that are more, that are better at satisfying you?
01:18:17.480 | There's a dopamine gamification that they've figured out.
01:18:20.540 | - Yeah.
01:18:21.380 | - And so you have to, whatever you create,
01:18:24.820 | you have to at least provide some pleasure
01:18:27.340 | in that same gamification kind of way.
01:18:29.660 | It doesn't have to do with the scale
01:18:32.180 | of large social networks.
01:18:33.260 | But I mean, I guess you're implying
01:18:35.300 | that you should be able to design
01:18:37.300 | that kind of mechanism in other forms.
01:18:40.220 | - Or people are turning on that gamification.
01:18:42.660 | I mean, so people are getting wise to it
01:18:44.500 | and are getting uncomfortable about it, right?
01:18:46.300 | So if I'm offering something, these exist out here.
01:18:49.060 | - Like sugar.
01:18:49.900 | People realize sugar's bad for you,
01:18:51.060 | they're gonna stop eating it. - Yeah, sugar's great.
01:18:51.900 | Yeah, drinking a lot's great too,
01:18:53.100 | but also after a while you realize there's problems.
01:18:56.100 | So some of the long tail social media networks
01:18:58.140 | that are out there that I've looked at,
01:18:59.820 | they offer usually like a deeper sense of connection.
01:19:02.900 | Like it's usually interesting people
01:19:04.940 | that you share some affinity with,
01:19:06.180 | and you have these carefully cultivated communities.
01:19:08.180 | I wrote this New Yorker piece a couple of years ago
01:19:09.940 | about the indie social media movement
01:19:11.580 | that really got into some of these different technologies.
01:19:14.940 | But I think the technologies are a distraction.
01:19:17.020 | We focus too much on, you know,
01:19:18.900 | Mastodon versus, you know, whatever, like forget,
01:19:21.180 | or Discord, like actually let's forget
01:19:22.500 | the protocols right now.
01:19:23.540 | It's the idea that, okay,
01:19:26.060 | there's a lot of these long tail social media groups,
01:19:28.500 | and what people are getting out of them,
01:19:29.580 | which I think can outweigh the dopamine gamification,
01:19:32.900 | is strong connection and motivation.
01:19:35.140 | Like you're in a group with other guys
01:19:36.940 | that are all trying to be, you know,
01:19:38.820 | better dads or something like this.
01:19:40.380 | And you talk to them on a regular basis
01:19:42.660 | and you're sharing your stories
01:19:43.700 | and there's interesting talks.
01:19:44.780 | And that's a powerful thing too.
01:19:47.500 | - One interesting thing about scale of Twitter
01:19:49.860 | is you have these viral spread of information.
01:19:53.260 | So sort of Twitter has become a newsmaker in itself.
01:19:57.060 | - Yeah, I think it's a problem.
01:19:58.580 | - Well, yes, but I wonder what replaces that
01:20:01.140 | because then you immediately--
01:20:03.540 | - Reporting?
01:20:04.380 | - Well, no.
01:20:05.220 | - Reporters would have to do some work again, I don't know.
01:20:07.140 | - No, the problem with reporters and journalism
01:20:09.740 | is that they're intermediaries.
01:20:12.420 | They have control.
01:20:14.100 | I mean, this is the problem in Russia currently:
01:20:15.980 | that intermediary
01:20:17.140 | creates a shield between the people and the news.
01:20:22.420 | The interesting thing and the powerful thing about Twitter
01:20:25.100 | is that the news originates from the individual
01:20:28.100 | that's creating the news.
01:20:29.020 | Like you have the president of the United States,
01:20:31.820 | the former president of the United States on Twitter
01:20:33.700 | creating news.
01:20:34.900 | You have Elon Musk creating news.
01:20:36.780 | You have people announcing stuff on Twitter
01:20:39.620 | as opposed to talking to a journalist.
01:20:41.500 | And that feels much more genuine
01:20:44.020 | and it feels very powerful,
01:20:48.340 | but actually coming to realize
01:20:50.380 | it doesn't need the social network.
01:20:53.020 | You can just put that announcement on a YouTube type thing.
01:20:55.980 | - This is what I'm thinking, right.
01:20:56.900 | So this is my point about that because that's right.
01:20:59.620 | The democratizing power of the internet is fantastic.
01:21:02.300 | I'm an old school internet nerd,
01:21:03.660 | a guy that was, you know,
01:21:05.260 | telnetting into servers and gophering
01:21:07.100 | before the World Wide Web was around, right?
01:21:08.700 | So I'm a huge internet booster
01:21:10.100 | and that's one of its big powers.
01:21:12.260 | But when you put everything on Twitter,
01:21:14.340 | I think the fact is that you've
01:21:16.620 | homogenized everything, right?
01:21:18.380 | So everything looks the same,
01:21:20.140 | moves with the same low friction,
01:21:21.580 | and that's very difficult.
01:21:22.700 | You have no, what I call distributed curation, right?
01:21:25.260 | The only curation that really happens,
01:21:26.980 | is a little bit with likes and also the algorithm,
01:21:29.460 | but if you look back to pre-Web 2.0 or early Web 2.0,
01:21:33.660 | when a lot of this was happening, let's say on blogs,
01:21:35.940 | where people own their own servers
01:21:37.420 | and you had your different blogs,
01:21:39.180 | there was this distributed curation that happened
01:21:41.340 | where in order for your blog to get on people's radar,
01:21:45.340 | and this had nothing to do with any gatekeepers
01:21:47.500 | or legacy media, it was over time,
01:21:50.220 | you got more links and people respected you
01:21:52.020 | and you would hear about this blog over here.
01:21:53.460 | And there's this whole distributed curation
01:21:55.220 | and filtering going on.
01:21:56.740 | So if you think about the 2004 presidential election,
01:22:00.020 | one of the first big internet-news-driven elections,
01:22:02.220 | most of the information people were getting from the internet
01:22:05.740 | was from, you had Daily Kos and Drudge,
01:22:09.420 | but there were blogs that were out there.
01:22:11.020 | And this was back when Ezra Klein was just running a blog
01:22:13.460 | out of his dorm room, right?
01:22:16.260 | And you would in a distributed fashion gain credibility
01:22:20.700 | because, okay, people have paid,
01:22:22.580 | it's very hard to get people to pay attention to your blog,
01:22:24.020 | they're paying attention, I get linked to this kid Ezra
01:22:26.460 | or whatever, he seems to be really sharp,
01:22:28.060 | and now people are noticing it.
01:22:29.740 | And now you have a distributed curation
01:22:32.100 | that solves a lot of the problems we see
01:22:34.100 | when you have a completely homogenized, low friction
01:22:36.100 | environment, I mean, like Twitter,
01:22:38.260 | where any random conspiracy theory or whatever
01:22:41.140 | that people like can just shoot through and spread.
01:22:44.620 | Whereas if you're starting a blog to try to push QAnon
01:22:48.180 | or something like that,
01:22:49.500 | it's probably gonna be a really weird looking blog.
01:22:51.500 | You're gonna have a hard time,
01:22:52.420 | like it's just never gonna show up on people's radar, right?
01:22:55.580 | - So everything you've said up until the very last statement
01:22:58.700 | I would agree with.
01:22:59.940 | - This is a topic I don't know a ton about, I guess.
01:23:02.740 | - So, I think, forget QAnon.
01:23:07.100 | - Yeah, no, we can-
01:23:07.940 | - But QAnon is, QAnon could be that.
01:23:09.940 | I also don't know, I should know more,
01:23:11.900 | I apologize, I don't know more.
01:23:13.620 | I mean, that's the power and the downside.
01:23:17.220 | You can have, I mean, Hitler could have a blog today
01:23:21.660 | and he would have potentially a very large following
01:23:24.180 | if he's charismatic, if he's good with words,
01:23:28.180 | if he's able to express the ideas of whatever,
01:23:30.300 | maybe he's able to channel the frustration,
01:23:32.700 | the anger that people have about a certain thing.
01:23:35.220 | So I think that's the power of blogs,
01:23:37.300 | but it's also the limitation, but that doesn't,
01:23:39.700 | we're not trying to solve that.
01:23:40.900 | - You can't solve that, yeah.
01:23:41.740 | - The fundamental problem you're saying
01:23:43.340 | is not the problem.
01:23:44.940 | Your thesis is that there's nothing special
01:23:48.060 | about large-scale social networks
01:23:50.980 | that guarantees that they will keep existing.
01:23:53.500 | - And it's important to remember
01:23:54.940 | for a lot of the older generation of internet activists,
01:23:58.180 | so the people who were very pro-internet in the early days,
01:24:01.180 | they were completely flabbergasted
01:24:03.900 | by the rise of these platforms.
01:24:05.540 | They'd say, why would you take the internet
01:24:08.540 | and then build your own version of the internet
01:24:10.980 | where you own all the servers?
01:24:12.580 | And we built this whole distributed,
01:24:14.500 | the whole thing, we had open protocols.
01:24:16.580 | Everyone anywhere in the world
01:24:17.700 | uses the same protocols.
01:24:18.620 | Your machine can talk to any other machine.
01:24:19.980 | It's the most democratic communication system
01:24:23.300 | that's ever been built.
01:24:24.260 | And then these companies came along and said,
01:24:25.460 | "We're gonna build our own,
01:24:26.540 | we'll just own all the servers
01:24:27.500 | and put them in buildings that we own.
01:24:29.100 | And the internet will just be the first mile
01:24:30.940 | that gets you into our private internet
01:24:32.460 | where we owned the whole thing."
01:24:33.780 | It went completely against the entire motivation
01:24:37.900 | of the internet, which was like, "Yes, it's not gonna be
01:24:39.940 | one person owns all the servers
01:24:41.300 | and you pay to access them.
01:24:42.260 | It's any one server that they own
01:24:43.860 | can talk to anyone else's server
01:24:45.100 | because we all agree on a standard set of protocols."
01:24:48.380 | And so the old guard of pro-internet people
01:24:51.700 | never understood this move towards,
01:24:53.940 | "Let's build private versions of the internet.
01:24:56.540 | We'll build three or four private internets
01:24:59.060 | and that's what we'll all use."
01:25:00.060 | It was the opposite, basically.
01:25:01.740 | - Well, funny enough, I don't know if you follow,
01:25:03.340 | but Jack Dorsey is also a proponent
01:25:07.580 | and is helping to fund fully distributed versions
01:25:11.940 | of Twitter, essentially, a thing
01:25:13.180 | that would potentially destroy Twitter.
01:25:15.940 | But I think there might be financial,
01:25:18.540 | like business cases to be made there, I'm not sure.
01:25:21.900 | But that seems to be another alternative
01:25:23.660 | as opposed to creating a bunch of, like, the long tail,
01:25:28.660 | it's creating the ultimate long tail
01:25:31.500 | of, like, fully distributed platforms.
01:25:33.220 | - Yeah, which is-- - Which is what the internet is.
01:25:35.060 | - But that's sort of why I'm thinking
01:25:36.620 | about long tail social media,
01:25:37.860 | I'm thinking it's like the tech's not so important.
01:25:40.940 | Like there's groups out there, right?
01:25:42.660 | Whatever the tech they use to actually implement
01:25:45.700 | their digital only social group,
01:25:47.620 | they might use Slack, they might use some combination
01:25:50.020 | of Zoom, it doesn't matter.
01:25:51.020 | I think in the tech world,
01:25:52.740 | we wanna build the beautiful protocol
01:25:54.780 | that, okay, everyone's gonna use,
01:25:56.100 | just a federated server protocol
01:25:58.620 | in which we've worked out X, Y, and Z,
01:25:59.860 | and no one understands it,
01:26:00.700 | because the engineers need it all to fit.
01:26:02.260 | I get it, because I'm a nerd like this, like, okay,
01:26:03.940 | every standard has to fit with everything else,
01:26:05.700 | and no one understands what's going on.
01:26:07.420 | Meanwhile, you have this group of bike enthusiasts
01:26:10.540 | that are like, yeah, we'll just jump on a Zoom
01:26:12.420 | and have some Slack and put up a blog
01:26:14.260 | and the tech doesn't really matter.
01:26:15.740 | Like we built a world with our own curation, our own rules,
01:26:19.500 | our own sort of social ecosystem
01:26:22.020 | that's generating a lot of value.
01:26:23.460 | I mean, I don't know if it'll happen,
01:26:24.940 | there's a lot of money at stake
01:26:26.140 | with these large platforms, obviously,
01:26:27.340 | but I just think they're more fragile.
01:26:29.540 | I mean, look how quickly
01:26:30.900 | Americans left Facebook, right?
01:26:33.460 | I mean, Facebook was savvy to buy other properties
01:26:35.620 | and to diversify, right?
01:26:36.740 | But how long did that take
01:26:37.940 | for just the standard Facebook newsfeed?
01:26:40.780 | Everyone under a certain age was using it,
01:26:42.980 | and no one under that age is using it now.
01:26:44.540 | It took like four years.
01:26:45.380 | I mean, this stuff is really-
01:26:47.500 | - I believe people can leave Facebook overnight.
01:26:50.980 | - Yeah.
01:26:51.820 | - Like I think Facebook hasn't actually messed up
01:26:55.340 | enough. There's two things:
01:26:57.940 | they haven't messed up enough
01:26:58.820 | for people to really leave aggressively,
01:27:00.620 | and there's no good alternative for them to leave to.
01:27:03.860 | I think if good alternatives pop up,
01:27:06.220 | it would just immediately happen.
01:27:07.620 | This stuff is a lot more culturally fragile, I think.
01:27:10.300 | I mean, Twitter's having a moment
01:27:11.460 | 'cause it was feeding a certain type of anxiety,
01:27:13.180 | I mean, there's a lot of anxieties
01:27:14.260 | that were in the sort of political sphere anyways
01:27:16.460 | that Twitter was working with,
01:27:19.080 | but its moment could pass as well.
01:27:21.700 | I mean, it's a really arbitrary thing,
01:27:23.420 | short little things.
01:27:24.580 | And I read a Wired article about this
01:27:26.340 | earlier in the pandemic.
01:27:27.300 | Like this is crazy that the way
01:27:29.580 | that we're trying to communicate information
01:27:31.420 | about the pandemic is all these weird arbitrary rules
01:27:34.100 | where people are screenshotting pictures of articles
01:27:37.100 | that are part of a tweet thread
01:27:38.380 | where you put "1/" under it.
01:27:40.700 | Like we have the technology, guys,
01:27:43.100 | to really clearly convey
01:27:45.300 | long form information to people.
01:27:47.100 | Why do we have these?
01:27:48.380 | And I know it's 'cause it's the gamified dopamine hits,
01:27:50.420 | but what a weird medium.
01:27:52.300 | There's no reason for us to have these threads
01:27:55.460 | that you have to find and pin when you screenshot.
01:27:57.860 | I mean, we have technology to communicate better
01:27:59.700 | using the internet.
01:28:00.540 | I mean, why are epidemiologists having to do tweet threads?
01:28:05.100 | - Because there's mechanisms of publishing
01:28:06.980 | that make it easier on Twitter.
01:28:08.660 | I mean, we're evolving as a species
01:28:10.780 | and the internet is a very fresh thing.
01:28:12.900 | And so it's kind of interesting to think
01:28:16.100 | that as opposed to Twitter,
01:28:18.380 | this is what Jack also complains about
01:28:20.220 | is Twitter's not innovating fast enough.
01:28:23.140 | And so it's almost like the people are innovating
01:28:26.820 | and thinking about their productive life
01:28:29.620 | faster than the platforms
01:28:31.780 | on which they operate can catch up.
01:28:33.660 | And so at the point the gap grows sufficiently,
01:28:37.460 | people will jump;
01:28:39.540 | a few innovative folks will just create an alternative
01:28:42.580 | and perhaps distributed, perhaps just many little silos
01:28:47.580 | and then people will jump
01:28:49.020 | and then we'll just continue in this kind of way.
01:28:50.660 | - But see, I think like Substack, for example,
01:28:52.340 | what they're gonna pull out of Twitter,
01:28:53.660 | among other things, is the audience that was,
01:28:56.420 | let's say like slightly left of center,
01:28:58.900 | they don't like Trump,
01:29:03.060 | uncomfortable with like postmodern critical theories
01:29:05.500 | made into political action, right?
01:29:07.500 | And they're like, yeah, Twitter,
01:29:08.460 | there was people on there talking about this
01:29:10.180 | and it made me feel sort of heard
01:29:12.380 | because I was feeling a little bit like a nerd about it.
01:29:14.220 | But honestly, I'd probably rather subscribe to four subs,
01:29:16.820 | you know, I'm gonna have like Bari Weiss's and Andrew Sullivan's,
01:29:19.180 | I'll have like a Jesse Singal's,
01:29:20.740 | like I'll have a few Substacks I can subscribe to.
01:29:22.820 | And honestly, I'm a knowledge worker who's 32 anyways,
01:29:26.780 | probably in email all day.
01:29:28.100 | And so like there's an innovation that's gonna,
01:29:30.180 | you know, it's gonna siphon that group off.
01:29:32.460 | - Which is actually a very large group.
01:29:34.180 | - Yeah, that's a lot of energy.
01:29:36.380 | And then once Trump's gone, I guess that's probably gonna,
01:29:38.700 | that drove a lot of the more Trump-aligned people off Twitter.
01:29:42.220 | Like this stuff is fragile, I think.
01:29:44.380 | - So I, but the fascinating thing to me,
01:29:47.020 | 'cause I've hung out on Parler for a short amount of time, enough
01:29:50.660 | to know that the interface matters, it's so fascinating.
01:29:53.580 | Like that it's not just about ideas.
01:29:57.220 | It's about creating like Substack too,
01:30:01.220 | creating a pleasant experience, addicting experience.
01:30:04.500 | - No, you're right about that.
01:30:05.340 | And it's hard.
01:30:06.620 | And it's why, this is one of the conclusions
01:30:08.220 | from that indie social media article,
01:30:10.100 | it's just, the ugliness matters.
01:30:12.140 | And I don't mean even just aesthetically,
01:30:13.380 | but just the clunkiness of the interfaces,
01:30:15.660 | and I don't know, it's to some degree,
01:30:18.860 | the social media companies have spent a lot of money on this
01:30:21.060 | and to some degree, it's a survivorship bias, right?
01:30:23.940 | I think Twitter, every time I hear Jack talk about this,
01:30:26.860 | it seems like he's as surprised as anyone else
01:30:30.100 | at the way Twitter is being used.
01:30:31.100 | I mean, it's basically the way they had it years ago.
01:30:36.100 | And the idea then was, it'll be statuses, right?
01:30:39.460 | This is what I'm doing,
01:30:41.020 | and my friends can follow me and see it.
01:30:42.500 | And without really changing anything,
01:30:43.620 | it just happened to hit everything right
01:30:45.980 | to support this other type of interaction.
01:30:47.540 | - Well, there's also the JavaScript model,
01:30:49.420 | which Brendan Eich and I talked about.
01:30:51.340 | He just implemented JavaScript,
01:30:53.460 | like the crappy version of JavaScript in 10 days,
01:30:55.780 | threw it out there and just changed it really quickly.
01:31:00.340 | Evolved it really quickly and now has become,
01:31:03.220 | according to Stack Exchange,
01:31:04.220 | the most popular programming language in the world.
01:31:06.140 | It drives most of the internet and even the backend
01:31:09.340 | and now mobile.
01:31:10.980 | And so that's an argument for the kind of thing
01:31:14.180 | you're talking about where the bike club people
01:31:17.300 | could literally create the thing that would run
01:31:22.060 | most of the internet 10 years from now.
01:31:24.420 | So there's something to that.
01:31:27.780 | As opposed to trying to get lucky
01:31:29.340 | or trying to think through stuff,
01:31:30.500 | it's just to solve a particular problem.
01:31:33.300 | - Do stuff, yeah.
01:31:34.140 | - And then do stuff.
01:31:35.460 | - Keep tinkering until you love it.
01:31:36.820 | - Yeah.
01:31:37.660 | And then, and of course the sad thing
01:31:40.100 | is timing and luck matter,
01:31:42.420 | and that you can't really control.
01:31:43.900 | - That's the problem, yeah.
01:31:45.580 | You can't go back to 2007.
01:31:47.900 | - Yeah.
01:31:48.740 | - That's like the number one thing you could do
01:31:49.580 | to have a lot of success with a new platform
01:31:51.140 | is go back in time 14 years.
01:31:53.500 | - So the thing you have to kind of think about
01:31:55.140 | is what's the totally new thing
01:31:59.260 | that 10 years from now would seem obvious?
01:32:03.060 | I mean, some people are saying Clubhouse is that.
01:32:05.060 | There's been a lot of stuff like Clubhouse before,
01:32:08.100 | but it hit the right kind of thing.
01:32:11.280 | Similar to Tesla, actually.
01:32:14.140 | What Clubhouse did is it got a lot of
01:32:16.580 | relatively famous people on there quickly.
01:32:19.340 | And then the other effect is like,
01:32:23.160 | it's invite only, so like,
01:32:24.620 | oh, all the famous people are on there.
01:32:27.340 | I wonder what's happening there, it's the FOMO.
01:32:29.100 | Like, fear that you're missing something really profound
01:32:32.300 | or exciting happening there.
01:32:34.260 | So those social effects.
01:32:36.100 | And then once you actually show up,
01:32:37.860 | I'm a huge fan of this.
01:32:40.180 | It's the JavaScript model.
01:32:41.340 | It's like Clubhouse is so dumb,
01:32:44.700 | like so simple in its interface.
01:32:46.700 | Like you literally can't do anything except unmute.
01:32:50.460 | There's a mute button,
01:32:51.780 | and there's a leave quietly button, and that's it.
01:32:54.700 | And it's kind of--
01:32:56.100 | I love single use technology in that sense.
01:32:59.580 | - There's no like, there's no, it's just like trivial.
01:33:04.580 | And Twitter kind of started like that.
01:33:08.340 | Facebook started like that.
01:33:10.260 | But they've evolved quickly to add all these features
01:33:12.460 | and so on.
01:33:13.420 | And I do hope Clubhouse stays that way.
01:33:16.180 | - Yeah. - It'd be interesting.
01:33:17.180 | - Or there's alternatives.
01:33:18.500 | I mean, even with Clubhouse, though,
01:33:21.100 | so one of the issues with a lot of these platforms,
01:33:23.020 | I think, is bits are cheap enough now
01:33:26.740 | that we don't really need a unicorn investor model.
01:33:30.940 | I mean, the investors need that model.
01:33:32.900 | There's really not really an imperative of,
01:33:37.220 | we need something that can scale
01:33:39.860 | to 100 million plus a year revenue.
01:33:42.140 | So because it was gonna require this much seed
01:33:44.820 | and angel investment,
01:33:45.740 | and you're not gonna get this much seed angel investment
01:33:48.380 | unless you can have a potential exit this wide,
01:33:50.900 | 'cause you have to be part of a portfolio
01:33:52.060 | that depends on one out of 10 exiting here.
01:33:55.020 | If you don't actually need that,
01:33:57.420 | and you don't need to satisfy that investor model,
01:33:59.900 | which I think is basically the case.
01:34:01.620 | I mean, bits are so cheap.
01:34:02.940 | Everything is so cheap.
01:34:04.300 | You don't necessarily, so even like with clubhouse,
01:34:05.820 | it's investor backed, right?
01:34:07.500 | So this notion of like, this needs to be a major platform.
01:34:10.620 | But the bike club doesn't necessarily need a major platform.
01:34:14.140 | That's where I'm interested.
01:34:14.980 | I mean, I don't know.
01:34:15.900 | There's so much money.
01:34:16.900 | That's the only problem that bets against me,
01:34:18.340 | is that you can concentrate a lot of capital
01:34:21.460 | if you do these things, right?
01:34:22.540 | I mean, so Facebook was like
01:34:23.860 | a fantastic capital concentration machine.
01:34:26.260 | It's crazy how much capital,
01:34:27.980 | wherever it even found that capital in the world,
01:34:30.380 | it could concentrate and ossify in a stock price
01:34:32.540 | that a very small number of people have access to, right?
01:34:35.220 | That's incredibly powerful.
01:34:37.260 | So when there is a possibility to consolidate
01:34:40.700 | and gather a huge amount of capital,
01:34:41.980 | that's a huge imperative that's very hard
01:34:43.980 | for the bike club to go up against, so.
01:34:45.860 | - But there's a lot of money in the bike club,
01:34:47.300 | as you see with WallStreetBets,
01:34:50.700 | when a bunch of people get together.
01:34:53.100 | I mean, it doesn't have to be bikes,
01:34:54.780 | it could be a bunch of different bike clubs
01:34:56.220 | just kind of team up to overtake.
01:34:59.340 | - That's what we're doing now, yeah.
01:35:00.620 | Or we're gonna repurpose off the shelf stuff.
01:35:02.860 | - Yes, that's good.
01:35:03.700 | - Yeah, we're gonna repurpose whatever
01:35:05.980 | was built for office productivity or something,
01:35:07.780 | like the clubs using Slack just to build these out.
01:35:11.020 | Yeah. - Yeah.
01:35:12.580 | Let's talk about email.
01:35:14.020 | - Yeah, that's right.
01:35:15.740 | I wrote a book.
01:35:17.500 | - You wrote yet another amazing book, "A World Without Email."
01:35:22.500 | Maybe one way to enter this discussion
01:35:24.860 | is to ask what is the hyperactive hive mind,
01:35:28.260 | which is the concept you opened the book with.
01:35:29.860 | - Yeah, and the devil.
01:35:31.260 | - And the devil. (laughs)
01:35:32.900 | - It's the scourge of hundreds of millions.
01:35:35.060 | So I think, so I called this book "A World Without Email."
01:35:40.140 | The real title should be
01:35:40.980 | "A World Without the Hyperactive Hive Mind Workflow,"
01:35:43.900 | but my publisher didn't like that, right?
01:35:45.820 | So we had to get a little bit more pithy.
01:35:47.740 | I was trying to answer the question after Deep Work,
01:35:50.580 | why is it so hard to do this?
01:35:52.780 | Like if this is so valuable, if we can produce much higher,
01:35:55.620 | people are much happier, why do we check email a day?
01:35:58.980 | Why are we on Slack all day?
01:36:00.860 | And so I started working on this book
01:36:02.420 | immediately after Deep Work.
01:36:04.220 | And so my initial interviews were done in 2016.
01:36:06.780 | So it took five years to pull the threads together.
01:36:08.900 | I was trying to understand why is it so hard
01:36:11.340 | for most people to actually find any time
01:36:14.420 | to do this stuff that actually moves the needle.
01:36:16.500 | And the story was, and I thought this was,
01:36:18.300 | I hadn't heard this reported anywhere else,
01:36:19.940 | that's why it took me so long to pull it together,
01:36:22.020 | is email arrives on the scene,
01:36:24.420 | email spreads, I trace it,
01:36:25.980 | it really picks up steam in the early 1990s,
01:36:28.580 | between like 1990 and 1995, it makes its move, right?
01:36:32.300 | And it does so for very pragmatic reasons.
01:36:34.060 | It was replacing existing communication technologies
01:36:36.820 | that it was better than.
01:36:37.660 | It was mainly the fax machine, voicemail, and memos, right?
01:36:39.940 | So this was just better, right?
01:36:41.700 | So it was a killer app because it was useful.
01:36:44.340 | In its wake came a new way of collaborating,
01:36:47.820 | and that's the hyperactive hive mind.
01:36:49.540 | So it's like the virus that follows the rats
01:36:53.460 | that went through Western Europe during the Black Plague.
01:36:55.780 | As email spread through organizations,
01:36:57.840 | in its wake came the hyperactive hive mind workflow,
01:37:00.500 | which says, okay, guys,
01:37:01.480 | here's the way we're gonna collaborate.
01:37:03.500 | We'll just work things out on the fly
01:37:05.340 | with unscheduled back and forth messages.
01:37:07.300 | Just boom, boom, boom, let's go back and forth.
01:37:08.940 | Hey, what about this?
01:37:09.780 | You see this?
01:37:10.600 | What about that client?
01:37:11.440 | What's going on over here?
01:37:13.140 | That followed email.
01:37:14.780 | It completely took over office work.
01:37:18.140 | And the need to keep up with all of these asynchronous
01:37:22.260 | back and forth unscheduled messages,
01:37:24.360 | as those got more and more and more,
01:37:25.620 | we had more of those to service,
01:37:26.700 | the need to service those required us
01:37:28.140 | to check more and more and more and more, right?
01:37:30.340 | And so by the time, and I go through the numbers,
01:37:32.000 | but by the time you get to today,
01:37:33.960 | now the average knowledge worker
01:37:34.940 | has to check one of these channels once every six minutes.
01:37:37.420 | Because every single thing you do in your organization,
01:37:39.660 | how you talk to your colleagues,
01:37:40.640 | how you talk to your vendors,
01:37:41.580 | how you talk to your clients,
01:37:42.540 | how you talk to the HR department,
01:37:43.740 | it's all this asynchronous,
01:37:44.900 | unscheduled back and forth messaging.
01:37:47.180 | And you have to service the conversations.
01:37:49.540 | And it spiraled out of control,
01:37:51.300 | and it has sort of devolved a lot of work in the office now
01:37:54.180 | to all I do is constantly tend communication channels.
01:37:58.780 | - So it's fascinating, what you're describing
01:38:00.740 | is nobody ever paused in this whole evolution
01:38:05.740 | to try to create a system that actually works.
01:38:08.740 | I'm kind of a huge fan of cellular automata,
01:38:13.060 | and it just kind of started as a very simple mechanism,
01:38:17.380 | and just like cellular automata,
01:38:18.220 | it just kind of grew to overtake
01:38:20.540 | all the fundamental communication of how we do business
01:38:24.300 | and also personal life.
01:38:25.300 | - Yeah, and that's one of the big ideas
01:38:27.180 | is that the unintentionality, right?
01:38:29.740 | So this goes back to technological determinism.
01:38:31.860 | I mean, this is a weird business book
01:38:33.240 | because I go deep on philosophy,
01:38:35.780 | I go deep on, for some reason,
01:38:37.180 | we get into paleoanthropology for a while,
01:38:39.060 | we do a lot of neuroscience.
01:38:40.220 | It's kind of a weird book,
01:38:41.540 | but I got real into this technological determinism, right?
01:38:44.780 | This notion that just the presence of a technology
01:38:46.900 | can change how people act.
01:38:48.740 | That's my big argument
01:38:49.700 | about what happened with the hive mind.
01:38:51.080 | And I can document specific examples, right?
01:38:54.260 | So I document this example in IBM 1987, maybe 85,
01:38:59.260 | but it's in like the mid to late 80s,
01:39:01.220 | IBM, Armonk headquarters,
01:39:03.580 | we're gonna put in internal email, right?
01:39:05.580 | Because it's convenient.
01:39:07.600 | And so they ran a whole study.
01:39:09.380 | And so I talked to the engineer who ran the study,
01:39:11.840 | Adrian Stone. We're gonna run this study
01:39:13.120 | to figure out how much do we communicate
01:39:14.800 | because it was still an era where it's expensive, right?
01:39:17.800 | So you have to provision a mainframe
01:39:19.160 | so you can't over provision.
01:39:20.880 | Like we wanna know how much communication actually happens.
01:39:22.880 | So they went and figured it out.
01:39:24.220 | How many memos, how many calls, how many notes?
01:39:26.400 | Great, we'll provision a mainframe to handle email
01:39:28.760 | that can handle all of that.
01:39:29.940 | So if all of our communication moves to email,
01:39:32.660 | the mainframe will still be fine.
01:39:34.200 | In three days, they had melted it down.
01:39:36.060 | People were communicating six times more than that estimate.
01:39:39.500 | So just in three days,
01:39:41.200 | the presence of a low friction digital communication tool
01:39:44.420 | drastically changed how everyone collaborated.
01:39:46.600 | So that's not enough time for an all hands meeting:
01:39:49.560 | "Guys, we figured it out.
01:39:51.140 | Communicating a lot more
01:39:52.760 | is what's gonna make us more productive.
01:39:54.980 | We need more emails." It's emergent.
01:39:57.540 | - Isn't that just on the positive end, amazing to you?
01:40:00.800 | Like, isn't email amazing?
01:40:03.240 | Like in those early days,
01:40:04.720 | like just the frictionless communication.
01:40:07.180 | I mean, email is awesome.
01:40:09.620 | Like, people say that there's a lot of problems with emails,
01:40:13.500 | just like people say a lot of problems with Twitter
01:40:15.020 | and so on.
01:40:15.860 | It's kinda cool that you can just send a little note.
01:40:18.540 | - It was a miracle, right?
01:40:19.940 | So I wrote a, this originally was a New Yorker piece
01:40:23.820 | from a year or two ago called "Was Email a Mistake?"
01:40:26.100 | And then it's in the book too.
01:40:28.080 | But I go into the history of email,
01:40:31.220 | like why did it come along?
01:40:32.740 | And it solved a huge problem.
01:40:34.400 | So it was the problem of fast asynchronous communication.
01:40:37.880 | And it was a problem that did not exist
01:40:39.340 | until we got large offices.
01:40:41.060 | We got large offices, synchronous communication,
01:40:43.400 | like let's get on the phone at the same time,
01:40:44.820 | there's too much overhead to it,
01:40:45.840 | there's too many people you might have to talk to.
01:40:48.160 | Asynchronous communication, like let me send you a memo
01:40:50.580 | when I'm ready and you can read it when you're ready,
01:40:52.680 | took too long.
01:40:53.800 | And so it was like a huge problem.
01:40:54.980 | So one of the things I talked about is the way that
01:40:56.740 | when they built the CIA headquarters,
01:40:58.940 | there was such a need for fast asynchronous communication
01:41:02.100 | that they built a pneumatic powered email system.
01:41:05.100 | They had these pneumatic tubes all throughout
01:41:07.020 | the headquarters with electromagnetic routers.
01:41:09.660 | So you would put your message in a plexiglass tube
01:41:12.700 | and you would turn these brass dials about the location,
01:41:15.220 | you would stick it in these things and pneumatic tubes
01:41:17.260 | and it would shoot and sort and work its way
01:41:19.620 | through these tubes to show up in just a minute or something
01:41:23.100 | at the floor and at the general office suite
01:41:24.860 | where you wanted to go.
01:41:25.940 | And my point is the fact that they spent so much money
01:41:28.300 | to make that work shows how important
01:41:31.060 | fast asynchronous communication was to large offices.
01:41:33.100 | So when email came along,
01:41:35.020 | it was a productivity silver bullet.
01:41:37.060 | It was a miracle.
01:41:37.900 | I talked to the researchers who were working on
01:41:39.980 | computer supported collaboration in the late 80s,
01:41:41.860 | trying to figure out how are we gonna use computer networks
01:41:44.100 | to be more productive?
01:41:44.980 | And they were building all these systems and tools.
01:41:47.180 | Email showed up, it just wiped all that research
01:41:49.460 | off the map.
01:41:50.300 | There was no need to build these custom
01:41:52.100 | intranet applications.
01:41:53.420 | There was no need to build these communication platforms.
01:41:56.380 | Email could just do everything.
01:41:58.340 | So it was a miracle application,
01:42:00.840 | which is why it spread everywhere.
01:42:02.740 | That's one of these things where, okay,
01:42:04.420 | unintended consequences, right?
01:42:05.660 | You had this miracle productivity silver bullet.
01:42:07.700 | It spread everywhere, but it was so effective.
01:42:10.620 | It just, I don't know, like a drug.
01:42:12.460 | I'm sure there's some pandemic metaphor here,
01:42:15.360 | analogy here of a drug that like is so effective
01:42:17.580 | at treating this, that it also blows up
01:42:19.060 | your whole immune system and then everyone gets sick.
01:42:21.020 | - Well, ultimately it probably significantly increased
01:42:23.460 | the productivity of the world,
01:42:24.680 | but there's a kind of hump, and it now has plateaued.
01:42:28.060 | And then the fundamental question you're asking is like,
01:42:32.160 | okay, how do we take the next,
01:42:33.720 | how do we keep increasing the productivity?
01:42:35.680 | - No, I think it brought it down.
01:42:36.840 | So my contention, and so again,
01:42:41.320 | there's a little bit in the book,
01:42:42.160 | but I have a more recent Wired article
01:42:44.080 | that puts some newer numbers to this.
01:42:47.000 | I subscribe to the hypothesis that the hyperactive hive mind
01:42:50.000 | was so detrimental.
01:42:51.520 | So yeah, it helped productivity at first, right?
01:42:53.840 | When you could do fast asynchronous communication,
01:42:56.480 | but very quickly there was a sort of exponential rise
01:42:58.980 | in communication amounts.
01:43:00.980 | Once we got to the point where the hive mind meant
01:43:02.660 | you had to constantly check your email,
01:43:04.440 | I think that made us so unproductive
01:43:06.860 | that it actually was pulling down
01:43:08.220 | non-industrial productivity.
01:43:09.400 | And I think the only reason why,
01:43:11.220 | so it certainly has not been going up.
01:43:12.940 | That metric has been stagnating for a long time now
01:43:15.220 | while all this was going on.
01:43:16.900 | I think the only reason why it hasn't fallen
01:43:19.180 | is that we added these extra shifts off the books.
01:43:22.540 | I'm gonna work for three hours in the morning,
01:43:23.880 | I'm gonna work for three hours at night.
01:43:25.580 | And only that I think has allowed us
01:43:27.460 | to basically maintain a stagnated non-industrial growth.
01:43:31.620 | We should have been shooting up the charts.
01:43:32.980 | I mean, this is miraculous innovations.
01:43:35.580 | Computer networks, and then we built out
01:43:37.000 | this hundred-billion-dollar ubiquitous worldwide
01:43:39.740 | high-speed wireless internet infrastructure
01:43:42.000 | with supercomputers in our pockets
01:43:43.380 | where we could talk to anyone at any time.
01:43:44.820 | Like, why did our productivity not shoot off the charts?
01:43:47.300 | Because our brain can't context switch
01:43:48.620 | once every six minutes.
01:43:49.460 | - So it's fundamentally back to the context switching.
01:43:51.500 | - Context switching is poison.
01:43:53.420 | - Context switching is poison.
01:43:54.740 | What is it about email that forces context switching?
01:43:58.300 | Is it both our psychology that drags us in?
01:44:00.580 | Or is it the expectation of--
01:44:02.260 | - Yeah, right, right, because it's not,
01:44:03.860 | I think we've seen this through a personal will
01:44:06.740 | or failure lens recently.
01:44:08.980 | Like, oh, am I addicted to email?
01:44:11.820 | I have bad etiquette about my email.
01:44:14.100 | No, it's the underlying workflow.
01:44:16.100 | So the tool itself I will exonerate.
01:44:19.100 | I think I would rather use POP3 than a fax protocol.
01:44:23.660 | I think it's easier.
01:44:24.980 | The issue is the hyperactive hive mind workflow.
01:44:27.020 | So if I am now collaborating with 20 or 30 different people
01:44:30.900 | with back and forth unscheduled messaging,
01:44:33.060 | I have to tend those conversations, right?
01:44:35.220 | It's like you have 30 metaphorical ping pong tables.
01:44:38.180 | And when the balls come back across,
01:44:39.380 | you have to pretty soon hit it back
01:44:41.220 | or stuff actually grinds to a halt.
01:44:43.460 | So it's the workflow that's the problem.
01:44:45.580 | It's not the tool, it's the fact that we use it
01:44:47.140 | to do all of our collaboration.
01:44:48.400 | Let's just send messages back and forth,
01:44:49.900 | which means you can't be far from checking that
01:44:52.780 | 'cause if you take a break, if you batch,
01:44:54.820 | if you try to have better habits,
01:44:56.780 | it's gonna slow things down.
01:44:58.020 | So my whole villain is this hyperactive hive mind workflow.
01:45:02.420 | The tool is fine.
01:45:03.580 | I don't want the tool to go away,
01:45:05.380 | but I wanna replace the hyperactive hive mind workflow.
01:45:07.260 | I think this is gonna be one of the biggest
01:45:10.060 | value generating productivity revolutions
01:45:12.980 | of the 21st century.
01:45:14.180 | I quote an anonymous CEO who's pretty well-known
01:45:16.980 | who says this is gonna be the moon shot of the 21st century.
01:45:19.620 | It's gonna be of that importance.
01:45:20.980 | There's so much latent productivity that's being suppressed
01:45:24.420 | because we just figure things out on the fly in email
01:45:26.340 | that as we figure that out,
01:45:27.740 | I think it's gonna be hundreds of billions of dollars.
01:45:32.060 | - You're so absolutely right.
01:45:35.480 | The question is, what does a world without email look like?
01:45:39.140 | How do we fix email?
01:45:40.900 | - So what happens is, at least in my vision,
01:45:44.100 | you identify, well, actually,
01:45:46.900 | there's these different processes that make up my workday.
01:45:49.220 | Like these are things that I do repeatedly,
01:45:52.020 | often in collaboration with other people
01:45:53.460 | that do useful things for my company or whatever.
01:45:56.460 | Right now, most of these processes
01:45:57.980 | are implicitly implemented with the hyperactive hive mind.
01:46:00.660 | How do we do this thing,
01:46:01.580 | like answering client questions?
01:46:02.660 | We shoot messages back and forth.
01:46:03.820 | You know, how do we do this thing,
01:46:05.220 | posting podcast episodes?
01:46:06.300 | We'll just figure it out on the fly.
01:46:07.820 | My main argument is we actually have to do
01:46:09.340 | like they did in the industrial sector.
01:46:11.340 | Take each of these processes and say,
01:46:12.860 | is there a better way to do this?
01:46:14.900 | And by better, I mean a way that's gonna minimize
01:46:16.980 | the need to have unscheduled back and forth messaging.
01:46:19.220 | So we actually have to do process engineering.
01:46:22.020 | This created a massive growth in productivity
01:46:24.300 | in the industrial sector during the 20th century.
01:46:25.900 | We have to do it in knowledge work.
01:46:26.900 | We can't just rock and roll in an inbox,
01:46:28.460 | we actually have to say,
01:46:30.100 | how do we deal with client questions?
01:46:31.420 | Well, let's put in place a process
01:46:32.620 | that doesn't require us to send messages back and forth.
01:46:35.060 | How do we post podcast episodes?
01:46:36.580 | Let's automate this to a degree where
01:46:38.540 | I don't have to just send you a message on the fly.
01:46:40.380 | And you do this process by process
01:46:43.220 | and the pressure on that inbox is released.
01:46:45.180 | And now you don't have to check it every six minutes.
01:46:46.980 | So you still have email.
01:46:47.940 | I mean, like I need to send you a file.
01:46:49.140 | Sure, I'll use email,
01:46:50.380 | but we're not coordinating or collaborating over email
01:46:52.700 | or Slack, which is just a faster way of doing the hive mind.
01:46:55.020 | I mean, Slack doesn't solve anything there.
01:46:57.860 | You have better structured bespoke processes.
01:47:00.260 | I think that's what's gonna unleash
01:47:01.780 | this massive productivity.
01:47:03.180 | - Bespoke, so the interesting thing is like,
01:47:05.580 | if for example, you and I exchanged some emails.
01:47:07.620 | So obviously, let's just take my particular case:
01:47:10.980 | I schedule podcasts.
01:47:11.940 | There's a bunch of different tasks,
01:47:13.900 | fascinatingly enough, that I do
01:47:16.620 | that could be converted into processes.
01:47:19.180 | Is it up to me to create that process?
01:47:21.820 | Or do you think we also need to build tools
01:47:23.740 | just like email was a protocol for helping us
01:47:28.500 | create processes for the different tasks?
01:47:31.020 | - I mean, I think ultimately the whole organization,
01:47:34.260 | the whole team has to be involved.
01:47:35.380 | I think ultimately there's certainly a lot of investor money
01:47:37.740 | being spent right now to try to figure out those tools.
01:47:40.460 | So I think Silicon Valley has figured this out
01:47:42.140 | in the past couple of years.
01:47:43.140 | This is the difference between
01:47:45.060 | when I was talking to people after Deep Work
01:47:47.220 | and now, five years later: this scent is in the air.
01:47:51.580 | Because there's so much latent productivity.
01:47:53.260 | So yes, there are gonna be new tools,
01:47:54.900 | which I think could help.
01:47:55.740 | There are already tools that exist.
01:47:57.020 | I mean, in the different groups I profiled
01:47:59.620 | use things like Trello or Basecamp or Asana or Flow
01:48:06.060 | and ScheduleOnce and Acuity.
01:48:06.060 | Like there's a lot of tools out there.
01:48:08.740 | The key is not to think about it in terms of
01:48:10.700 | what tool do I replace email with?
01:48:12.500 | Instead you think about it with,
01:48:14.300 | we're trying to come up with a process
01:48:16.180 | that reduces back and forth messages.
01:48:17.580 | Oh, what tool might help us do that?
01:48:21.300 | Yeah, and I would push back:
01:48:22.220 | it's not necessarily about efficiency.
01:48:24.020 | In fact, some of these things are gonna take more time.
01:48:26.020 | So writing a letter to someone is like a high value activity
01:48:29.780 | it's probably worth doing.
01:48:30.940 | The thing that's killer is the back and forth.
01:48:33.460 | 'Cause now I have to keep checking, right?
01:48:35.100 | So we scheduled this together
01:48:36.700 | 'cause I knew you from before,
01:48:38.540 | but like most of the interviews I was scheduling for this,
01:48:41.500 | actually I have a process with my publicist
01:48:43.780 | where we use a shared document
01:48:45.140 | and she puts stuff in there
01:48:46.460 | and then I check it twice a week
01:48:48.060 | and there's scheduling options.
01:48:50.060 | I say, here's what I wanna do this one
01:48:51.180 | or this will work for this one or whatever.
01:48:52.780 | And it takes more time in the moment than just,
01:48:54.980 | but it means that we have almost no back and forth messaging
01:48:58.460 | for podcast scheduling, which without this,
01:49:00.500 | so like with my UK publisher,
01:49:02.460 | I didn't put this process into place
01:49:03.860 | 'cause we're not doing as many interviews,
01:49:07.420 | but it's back and forth all the time.
01:49:07.420 | And I'm like, oh man,
01:49:08.420 | I could really feel the difference, right?
01:49:10.100 | It's the back and forth that's killer.
01:49:11.740 | - I suppose it is up to the individual people involved.
01:49:15.060 | Like you said, knowledge workers,
01:49:18.140 | like they have to carry the responsibility
01:49:21.020 | of creating processes.
01:49:23.540 | Like how, always asking the first principles question,
01:49:25.940 | how can this be converted into a process?
01:49:28.300 | - Yeah, so you can start by doing this yourself,
01:49:30.700 | like just with what you can control.
01:49:32.700 | I think ultimately once the teams are doing that,
01:49:34.940 | I think that's probably the right scale.
01:49:36.460 | If you try to do that at the organizational scale,
01:49:38.180 | you're gonna get bureaucracy, right?
01:49:39.860 | So if Elon Musk is gonna dictate down
01:49:44.660 | to everyone at Tesla or something like this,
01:49:46.900 | that's too much remove and you get bureaucracy.
01:49:48.620 | But if it's, we're a team of six
01:49:50.980 | that's working together on whatever powertrain software,
01:49:55.140 | then we can figure out on our own,
01:49:56.420 | what are our processes?
01:49:57.260 | How do we wanna do this?
01:49:58.180 | - So it's ultimately also creating a culture
01:50:02.020 | where it's saying, like, in email,
01:50:03.820 | sending an email just for the hell of it
01:50:05.620 | should be taboo.
01:50:05.620 | So you are being destructive to the productivity of the team
01:50:10.620 | by sending this email,
01:50:12.540 | as opposed to helping develop a process and so on
01:50:17.060 | that will ultimately automate this.
01:50:20.780 | - That's why I'm trying to spread this message
01:50:22.500 | that context switching is poison.
01:50:24.180 | I get so much into the science of it.
01:50:25.460 | I think we underestimate how much it kills us
01:50:28.020 | to have to wrench away our context,
01:50:29.660 | look at a message and come back.
01:50:30.820 | And so once you have the mindset of,
01:50:32.820 | it's a huge thing to ask of someone
01:50:35.180 | to have to take their attention off something
01:50:37.100 | and look back at this.
01:50:38.180 | And if they have to do that for three or four times,
01:50:40.380 | like we're just gonna figure this out on the fly
01:50:42.140 | and every message is gonna require five checks
01:50:44.060 | of the inbox while you wait for it.
01:50:45.540 | Now you've created whatever it is at this point,
01:50:47.660 | 25 or 30 context shifts.
01:50:50.220 | Like you've just done a huge disservice to someone's day.
01:50:52.820 | This would be like if I had a professional athlete,
01:50:54.620 | like, "Hey, do me a favor.
01:50:55.900 | I need you to go do this press interview."
01:50:57.060 | But to get there, you're gonna have to carry this sandbag
01:50:59.580 | and sprint up this hill,
01:51:00.700 | like completely exhaust your muscles
01:51:02.060 | and then you have to go play a game.
01:51:03.180 | Like, of course I'm not gonna ask an athlete
01:51:04.660 | to do like an incredibly physically demanding thing
01:51:07.340 | right before a game,
01:51:08.660 | but something as easy as "Thoughts?"
01:51:11.780 | or like, "Hey, do you wanna jump on a call?"
01:51:13.260 | where it's gonna be six back and forth messages
01:51:14.900 | to figure it out.
01:51:15.820 | It's kind of the cognitive equivalent, right?
01:51:17.780 | You're taking the wind out of someone.
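As a rough sketch of that arithmetic (using the counts from the exchange above, which are illustrative rather than measured), the context shifts compound quickly:

```python
# Rough illustration of the cost of resolving one trivial question
# with unscheduled back-and-forth messaging. The counts come from the
# example above; they are illustrative, not measured.
messages_to_settle_it = 6  # "six back and forth messages to figure it out"
checks_per_message = 5     # "five checks of the inbox" while waiting
print(messages_to_settle_it * checks_per_message, "context shifts")  # 30
```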
01:51:19.580 | - Yeah, and by the way, for people who are listening,
01:51:22.460 | 'cause I recently posted a few job openings
01:51:26.500 | for folks to help with this thing.
01:51:28.860 | And one of the things that people are surprised by
01:51:28.860 | when they work with me
01:51:29.700 | is how many spreadsheets and processes are involved.
01:51:32.180 | - And it's like Claude Shannon, right?
01:51:33.420 | I talked about communication theory or information theory.
01:51:36.340 | It takes time to come up with a clever code upfront.
01:51:38.700 | So you spend more time upfront
01:51:39.740 | figuring out those spreadsheets
01:51:40.780 | and trying to get people on board with it.
01:51:42.860 | But then your communication going forward
01:51:45.300 | is all much more efficient.
01:51:46.260 | So over time, you're using much less bandwidth, right?
01:51:49.740 | So you take the pain upfront.
01:51:52.300 | It's quicker just right now to send an email.
01:51:54.700 | But if I spend a half day to do this
01:51:56.140 | over the next six months, I've saved myself 600 emails.
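To make that amortization concrete, here is a minimal sketch: only the half day of setup and the 600 avoided emails come from the example above; the two per-message costs are assumptions for illustration.

```python
# Amortizing an upfront process cost over the messages it avoids.
SETUP_MINUTES = 4 * 60    # the "half day" spent building the process
EMAILS_AVOIDED = 600      # over the next six months, per the example
MINUTES_PER_EMAIL = 2     # assumed: reading/writing one message
SWITCH_MINUTES = 5        # assumed: attention cost of each context switch

saved = EMAILS_AVOIDED * (MINUTES_PER_EMAIL + SWITCH_MINUTES)
net = saved - SETUP_MINUTES
print(f"saved {saved} min, net {net} min (~{net / 60:.0f} hours)")
# saved 4200 min, net 3960 min (~66 hours)
```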
01:52:00.220 | - Now, here's a tough question for, you know,
01:52:02.660 | from the computer science perspective,
01:52:04.660 | we often over optimize.
01:52:10.940 | So you create processes, and, okay,
01:52:14.860 | just like you're saying, it's so pleasurable
01:52:19.700 | to increase long-term productivity
01:52:22.900 | that sometimes you just enjoy the activity in itself,
01:52:25.020 | just creating processes.
01:52:27.280 | And it can actually
01:52:31.060 | have a negative effect on productivity long-term
01:52:33.500 | because you're too obsessed with the processes.
01:52:33.500 | Is that a nice problem to have essentially?
01:52:37.860 | - I mean, it's a problem.
01:52:38.860 | I mean, because let's look at the one sector
01:52:41.220 | that does do this, which is developers, right?
01:52:44.540 | So agile methodologies like Scrum or Kanban
01:52:47.220 | are basically workflow methodologies
01:52:49.860 | that are much better than the hyperactive hive mind.
01:52:52.360 | But man, some of those programmers get pretty obsessive.
01:52:55.740 | I don't know if you've ever talked
01:52:56.660 | to a whatever level three Scrum master.
01:52:59.460 | They get really obsessive about like,
01:53:01.980 | it has to happen exactly this way.
01:53:04.200 | And it's probably seven times more complex
01:53:05.900 | than it needs to be.
01:53:07.300 | I'm hoping that's just because nerds like me,
01:53:09.780 | you know, like to do that.
01:53:11.020 | But it's probably broadly an issue, right?
01:53:14.220 | We have to be careful because you can just go down
01:53:16.060 | that fiddling path.
01:53:17.940 | Like, so it needs to be, here's how we do it.
01:53:19.760 | Let's reduce the messages and let's roll, you know?
01:53:22.580 | You can't save yourself
01:53:26.120 | by getting the process just right, right?
01:53:28.220 | So I wrote this article kind of recently
01:53:30.580 | called "The Rise and Fall of Getting Things Done."
01:53:32.820 | And I profiled this productivity guru named Merlin Mann.
01:53:39.940 | And I talked about this movement called "productivity pr0n,"
01:53:42.900 | as like a leetspeak term, in the early 2000s,
01:53:42.900 | where people just became convinced
01:53:44.540 | that if they could combine their productivity systems
01:53:47.140 | with software and they could find just the right software,
01:53:50.180 | just the right configuration where they could offload
01:53:51.900 | most of the difficulty of work
01:53:53.100 | onto the machines,
01:53:54.660 | which would kind of figure it out for them.
01:53:55.500 | And then they could just sort of crank widgets.
01:53:57.060 | And the whole thing fell apart
01:53:58.660 | because work is hard and it's hard to do
01:54:00.860 | and making decisions about what to work on is hard
01:54:03.020 | and no system can really do that for you.
01:54:04.740 | So you have to have this sort of balance between,
01:54:08.820 | context switches are poison.
01:54:10.900 | So we got to get rid of the context switches.
01:54:12.460 | Once like something's working good enough
01:54:13.900 | to get rid of the context switches, then get after it.
01:54:17.060 | - Yeah, there's a psychological process there for me.
01:54:19.500 | The OCD nature, like, embarrassingly enough,
01:54:23.580 | I've literally lost my shit before when,
01:54:26.260 | so in many of the processes that involve Python scripts,
01:54:30.420 | the rule is to not use spaces,
01:54:34.900 | use underscores; there's like rules
01:54:36.580 | for like how you format stuff, okay?
01:54:39.300 | And like, I should not lose my shit
01:54:42.100 | when somebody had a space and maybe capital letters.
01:54:45.300 | Like, it's okay to have a space.
01:54:48.020 | 'Cause there's this feeling like something's not perfect.
01:54:50.980 | - Yeah.
01:54:51.820 | - And as opposed to allowing some flexibility
01:54:54.620 | around that in the Python script,
01:54:56.660 | you create this programmatic way that demands flawlessness.
01:54:59.100 | And when everything's working perfectly, it's perfect.
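As an aside, the flexibility being described could be as small as normalizing inputs instead of rejecting them. A minimal sketch, assuming a hypothetical pipeline that wants lowercase, underscore-separated filenames (the function and its rules are illustrative, not from any actual script discussed here):

```python
import re

def normalize_name(raw: str) -> str:
    """Lowercase, trim, and collapse whitespace to underscores so a stray
    space or capital letter doesn't break the rest of the pipeline."""
    name = raw.strip().lower()
    name = re.sub(r"\s+", "_", name)          # spaces -> underscores
    name = re.sub(r"[^a-z0-9._-]", "", name)  # drop stray characters
    return name

# e.g. normalize_name("My Episode  Draft.MP3") -> "my_episode_draft.mp3"
```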
01:55:01.740 | But actually, if you strive for perfection,
01:55:06.080 | it has a lot of the same stress
01:55:10.100 | that you were seeking to escape
01:55:12.940 | from the context switching.
01:55:17.940 | Like when the process is functioning,
01:55:20.820 | there's always this anxiety of like,
01:55:23.180 | I wonder if it's gonna succeed.
01:55:25.100 | - Yeah.
01:55:25.940 | - I wonder if it's gonna succeed.
01:55:26.820 | - Yeah, no, I think some of that's just you and I probably.
01:55:29.060 | I mean, it's just our mindset, right?
01:55:30.300 | We're in, we do computer science, right?
01:55:32.340 | So chicken and egg, I guess.
01:55:34.500 | And a lot of the processes that end up working here
01:55:36.500 | are much rougher.
01:55:37.820 | It's like, okay, instead of letting clients
01:55:39.440 | just email me all the time, we have a weekly call,
01:55:43.700 | and then we send them a breakdown
01:55:45.420 | of everything we committed to, right?
01:55:47.600 | That's a process that works.
01:55:48.540 | Okay, I get asked a lot of questions
01:55:50.100 | 'cause I'm the JavaScript guy in the company.
01:55:51.940 | Instead of doing it by email, I have office hours.
01:55:53.780 | This is what Basecamp does.
01:55:54.860 | All right, so you come to my office hours,
01:55:55.940 | that cuts down a lot of back and forth.
01:55:57.140 | All right, we're gonna, instead of emailing
01:55:58.300 | about this project, we'll have a Trello board,
01:56:02.040 | and we'll do a weekly really structured status meeting
01:56:04.680 | real quick, what's going on, who needs what, let's go.
01:56:07.020 | And now everything's on there and not in our inboxes,
01:56:09.060 | we don't have to send as many messages.
01:56:10.180 | So like that rough level of granularity,
01:56:12.460 | that gets you most of the way there.
01:56:14.540 | - So the parts that you can't automate
01:56:17.140 | and turn into a process.
01:56:19.780 | So how many parts like that do you think
01:56:21.780 | should remain in a perfect world?
01:56:24.660 | And for those parts where email is still useful,
01:56:29.660 | what do you recommend those emails look like?
01:56:32.660 | How should you write the emails?
01:56:34.420 | When should you send them?
01:56:35.820 | - Yeah, I think email is good for delivering information.
01:56:40.780 | Right, so I think about like a fax machine or something.
01:56:42.860 | You know, it's a really good fax machine.
01:56:44.340 | So if I need to send you something,
01:56:46.260 | like I just need to send you a file,
01:56:47.100 | or I need to broadcast a new policy or something,
01:56:49.460 | like email is a great way to do it.
01:56:51.340 | It's bad for collaboration.
01:56:53.620 | So if you're having a conversation,
01:56:55.660 | like we're trying to reach a decision on something,
01:56:57.480 | I'm trying to learn about something,
01:56:58.860 | I'm trying to clarify what something is,
01:57:01.220 | that's more than just like a one answer type question,
01:57:04.820 | then I think that you shouldn't be doing an email.
01:57:07.460 | - But see, here's the thing,
01:57:08.960 | like you and I don't talk often.
01:57:11.580 | And so we have a kind of new interaction.
01:57:13.820 | It's not, so sure, yeah, you have a book coming out,
01:57:17.500 | so there's a process and so on,
01:57:19.060 | but say there, don't you think there's a lot
01:57:22.020 | of novel interactive experiences?
01:57:24.300 | - Yeah, I think it's fine.
01:57:25.340 | - So you could, just for every novel experience,
01:57:27.780 | it's okay to have a little bit of an exchange.
01:57:29.980 | - I think it's fine.
01:57:30.820 | Like I think it's fine if stuff comes in over the transom
01:57:33.100 | or you hear from someone you haven't heard from in a while.
01:57:35.980 | I think all that's fine.
01:57:37.460 | I mean, that's email at its best.
01:57:39.700 | Where it starts to kill us is where all of our collaboration
01:57:42.460 | is happening with that back and forth.
01:57:43.460 | So when you've moved the bulk of that out of your inbox,
01:57:45.700 | now you're back in that Meg Ryan movie,
01:57:47.980 | like 'You've Got Mail', or it's like, all right,
01:57:49.940 | load this up and you wait for the modem.
01:57:51.580 | I'm like, oh, we got a message.
01:57:53.100 | Yeah, Lex sent me a message.
01:57:54.740 | This is interesting, right?
01:57:55.580 | You're back to the AOL days.
01:57:56.900 | - So you're talking about the bulk of the business world
01:58:00.060 | where like email has replaced the actual communication,
01:58:04.060 | all of the communication protocols required
01:58:05.940 | to accomplish anything.
01:58:06.780 | - Everything is just happening with messages.
01:58:08.060 | So if you now get most stuff,
01:58:10.980 | the repeatable collaborations, done with other processes
01:58:14.060 | that don't require you to check these inboxes,
01:58:15.580 | then the inbox can serve like an inbox,
01:58:17.860 | which includes hearing from interesting people, right?
01:58:20.860 | Or sending something, hey, I don't know if you saw this,
01:58:22.660 | I thought you might like it.
01:58:23.500 | I think it's great for that.
01:58:24.700 | - So there's probably a bunch of people listening to this.
01:58:27.580 | They're like, yeah, but I work on a team
01:58:31.100 | and all they use is email.
01:58:33.100 | How do you start the revolution from like the ground up?
01:58:35.900 | - Yeah, well, do it, do asymmetric optimization first.
01:58:39.020 | So identify all your processes
01:58:40.700 | and then change what you can change
01:58:42.340 | and be socially very careful about it.
01:58:44.340 | So don't necessarily say like, okay,
01:58:46.220 | this is a new process we all have to do.
01:58:48.260 | You're just, hey, we gotta get this report ready.
01:58:51.820 | Here's what I think we should do.
01:58:52.700 | Like I'll get a draft into our Dropbox folder
01:58:54.660 | by like noon on Monday, grab it.
01:58:57.900 | I won't touch it again until Tuesday morning.
01:59:00.100 | And then I'll look at your changes.
01:59:01.540 | I have this office hours always scheduled Tuesday afternoon.
01:59:03.860 | So if there's anything that catches your attention,
01:59:05.780 | grab me then.
01:59:06.960 | But I've told the designer who's CC'd on this
01:59:09.180 | that by COB Tuesday, the final version will be ready
01:59:13.060 | for them to take and polish or whatever.
01:59:14.900 | Like the person on the other end is like, great,
01:59:16.100 | I'm glad, you know, Cal has a plan.
01:59:18.140 | So I just, what do I need to do?
01:59:19.260 | I need to edit this tomorrow, whatever, right?
01:59:21.260 | But you've actually pulled them into a process.
01:59:22.740 | That means we're gonna get this report together
01:59:24.220 | without having to just go back and forth.
01:59:25.900 | So you just asymmetrically optimize these things
01:59:29.220 | and then you can begin the conversation.
01:59:31.340 | And maybe that's where my book comes into play.
01:59:32.940 | You just sort of slide it, slide it across the desk.
01:59:36.300 | - So buy the book and just leave it.
01:59:38.020 | - Leave it at the, yeah.
01:59:38.860 | - Give it to everybody on your team.
01:59:40.220 | Okay, so we solved the bulk of the email problem with this.
01:59:42.940 | Is there a case to be made that even for like communication
01:59:45.500 | between you and I, we should move away from email?
01:59:50.500 | And for example, there's a guy, I recently,
01:59:52.500 | I don't know if you know comedians,
01:59:53.700 | but there's a guy named Joey Diaz
01:59:55.940 | that I've had an interaction with recently.
01:59:57.900 | And that guy, first of all, the sweetest human,
02:00:00.460 | despite what his comedy sounds like,
02:00:02.500 | is the sweetest human being.
02:00:04.300 | And he's a big proponent of just pick up the phone and call.
02:00:08.620 | And it makes me so uncomfortable when people call me.
02:00:10.900 | It's like, I don't know what to do with this thing.
02:00:14.080 | But it kind of gets everything done quicker, I think,
02:00:17.780 | if I remove the anxiety from that.
02:00:19.900 | Is there a case to be made for that?
02:00:21.220 | Or is email could still be the most efficient way
02:00:24.260 | to do this?
02:00:25.100 | - No, I mean, look, if you have to interact with someone,
02:00:27.620 | there's a lot of efficiency in synchrony, right?
02:00:30.020 | And this is something from distributed system theory
02:00:31.780 | where you know if you go from synchronous
02:00:33.420 | to asynchronous networks,
02:00:34.740 | there's a huge amount of overhead to the asynchrony.
02:00:36.660 | So actually the protocols required to solve things
02:00:39.060 | in asynchronous networks are significantly more complicated
02:00:42.180 | and fragile than synchronous protocols.
02:00:44.060 | So if we can just do real time, it's usually better.
02:00:47.020 | And also from an interaction,
02:00:48.620 | like social connection standpoint,
02:00:50.020 | there's a lot more information in the human voice
02:00:51.900 | and the back and forth.
02:00:53.820 | Yeah, if you just call, so very generational, right?
02:00:56.500 | Like our generation will be comfortable talking on the phone
02:00:59.460 | in a way that like a younger generation isn't,
02:01:01.460 | but an older generation is more comfortable with,
02:01:03.320 | well, you just call people.
02:01:05.140 | Whereas we, so there's a happy medium,
02:01:07.060 | but most of my good friends,
02:01:08.780 | we just talk, we have regular phone calls.
02:01:11.060 | - Okay.
02:01:11.900 | - Yeah, it's not, I don't just call them,
02:01:13.060 | we schedule it, we schedule it.
02:01:14.220 | Yeah, just on text, like, yeah,
02:01:15.280 | you want to talk sometime soon.
02:01:17.980 | - Do you ever have a process around friends?
02:01:20.860 | - Not really, no.
02:01:22.300 | - I feel like I should, I feel like-
02:01:24.420 | - When you have like a lot of interesting
02:01:26.020 | friend possibilities,
02:01:27.580 | you have like an interesting problem, right?
02:01:29.180 | Like really interesting people you can talk to.
02:01:32.100 | - Well, that's one problem.
02:01:33.380 | And the other one is the introversion
02:01:34.740 | where I'm just afraid of people and get really stressed.
02:01:37.660 | Like I freak out and so-
02:01:39.520 | - You picked a good line of work.
02:01:41.420 | - Yeah, now perhaps it's the Goggins thing.
02:01:43.980 | It's like facing your fears or whatever,
02:01:46.500 | but it's almost like there's a,
02:01:50.440 | it has to do with the time-blocking thing
02:01:52.020 | and the deep work, that the nice thing
02:01:54.660 | about the processes is they not only
02:01:59.660 | sort of automate away the context switching,
02:02:03.380 | they ensure you do the important things too.
02:02:05.740 | - Yeah.
02:02:06.580 | - It's like prioritize, so the thing is with email,
02:02:10.140 | because everything is done over email,
02:02:12.500 | you can be lazy in the same way with like social networks
02:02:17.260 | and do the easy things first that are not that important.
02:02:21.060 | So the process also enforces
02:02:23.260 | that you do the important things.
02:02:24.900 | And for me, the important things is like,
02:02:28.180 | okay, this sounds weird, but like social connection.
02:02:30.380 | - No, that's one of the most important things
02:02:33.020 | in all of human existence.
02:02:34.340 | - Yeah.
02:02:35.180 | - And doing it, the paradoxical thing,
02:02:37.620 | I got into this for digital minimalism,
02:02:40.340 | the more you sacrifice on behalf of the connection,
02:02:42.500 | the stronger the connection feels, right?
02:02:44.460 | So sacrificing non-trivial time and attention
02:02:47.180 | on behalf of someone is what tells your brain
02:02:49.100 | that this is a serious relationship,
02:02:52.260 | which is why social media had this paradoxical effect
02:02:54.860 | of making people feel less social
02:02:57.100 | 'cause it took the friction out of it.
02:02:58.580 | And so the brain just doesn't like,
02:02:59.860 | yeah, you've been commenting on this person's whatever,
02:03:02.380 | you've been retweeting them or sending them some texts,
02:03:05.400 | you haven't, it's not hard enough.
02:03:07.500 | And then the perceived strength
02:03:09.700 | of that social connection diminishes,
02:03:11.100 | where if you talk to them or go spend time with them
02:03:13.300 | or whatever, you're gonna feel better about it.
02:03:16.180 | So the friction is good.
02:03:17.740 | I have a thing with some of my friends
02:03:18.920 | where at the end of each call,
02:03:20.700 | we take a couple minutes to schedule the next.
02:03:23.100 | Then you never have to,
02:03:23.940 | it's like I do with haircuts or something, right?
02:03:25.340 | Like if I don't schedule it then,
02:03:27.340 | I'm never gonna get my haircut, right?
02:03:29.100 | And so it's like, okay, when do you wanna talk next?
02:03:32.200 | - Yeah, that's a really good idea.
02:03:34.180 | I just don't call friends and like every 10 years,
02:03:38.100 | I do something dramatic for them
02:03:39.660 | so that we maintain the friendship.
02:03:40.980 | Like I'd murder somebody that they really don't like.
02:03:43.460 | - Yeah, exactly.
02:03:44.300 | - I just don't like that. - Careful, man,
02:03:45.420 | Joey might ask you to do that.
02:03:46.780 | - Yeah, that's why, oh, this is one of my favorite things.
02:03:49.300 | - Lex, you need to come down to New Jersey.
02:03:51.580 | - That's exactly what we're gonna do.
02:03:52.420 | - With that robot dog of yours.
02:03:54.140 | - We're gonna go down to Jersey.
02:03:56.740 | There's a special human.
02:03:57.780 | I love the comedian world.
02:04:00.020 | They've been shaking up,
02:04:01.500 | I don't know if you listen to Joe Rogan, all those folks,
02:04:04.100 | they kind of are doing something interesting
02:04:08.100 | for MIT and academia.
02:04:10.460 | They're shaking up this world a little bit,
02:04:13.060 | like podcasting, because comedians are paving the way
02:04:15.520 | for podcasting.
02:04:17.060 | And so you have like Andrew Huberman,
02:04:18.820 | who's a neuroscientist at Stanford, a friend of mine now.
02:04:21.620 | He's like into podcasting now,
02:04:25.260 | and you're into podcasting.
02:04:27.100 | Of course, you're not necessarily podcasting
02:04:29.180 | about computer science currently, right?
02:04:30.960 | But that, it feels like you could have a lot
02:04:35.760 | of the free spirit of the comedians implemented
02:04:40.520 | by the people who are academically trained.
02:04:43.920 | - Who actually have a niche specialty.
02:04:46.760 | - Yeah, and then that results, I mean,
02:04:49.280 | who knows what the experiment looks like,
02:04:51.340 | but that results in me being able to talk about robotics
02:04:54.200 | with Joey Diaz, when he, you know,
02:04:56.960 | drops F-bombs every other sentence.
02:04:58.680 | And I, the world is, like, I've seen actually a shift
02:05:02.720 | within colleagues and friends within MIT
02:05:06.240 | where they're becoming much more accepting
02:05:08.520 | of that kind of thing.
02:05:09.340 | It's very interesting.
02:05:10.480 | - That's interesting.
02:05:11.320 | So you're seeing, okay.
02:05:12.720 | - Because they're seeing how popular it is.
02:05:14.520 | They're like--
02:05:15.360 | - Well, you're really popular.
02:05:16.180 | I don't know how they think about it
02:05:17.360 | at Georgetown, for example.
02:05:18.880 | - I don't know.
02:05:19.720 | It's interesting, but I think what happens
02:05:22.080 | is the popularity of it combined
02:05:25.160 | with just good conversations with people.
02:05:28.400 | They respect, it's like, oh, okay, wait, this is the thing.
02:05:32.720 | - Yeah.
02:05:33.560 | - And this is more fun to listen to
02:05:34.760 | than a shitty Zoom lecture about their work.
02:05:39.160 | - Yeah.
02:05:40.000 | - It's like, there's something here.
02:05:40.820 | - There's something interesting.
02:05:41.660 | - And we don't, nobody actually knows what that is.
02:05:44.000 | Just like with Clubhouse or something,
02:05:46.440 | nobody's figured out, like, where does this medium take us?
02:05:49.080 | Is this a legitimate medium of education?
02:05:51.520 | - Yeah.
02:05:52.360 | - Or is this just like a fun--
02:05:54.360 | - Well, that's your innovation, I think,
02:05:55.560 | was we can bring on professors.
02:05:58.040 | - Yeah.
02:05:58.880 | - And I know Joe Rogan did some of that too,
02:06:00.600 | but your professors in your field.
02:06:04.880 | - Yeah, exactly.
02:06:05.720 | - You bring on all these MIT guys who I remember.
02:06:08.280 | - Well, that's been the big challenge for me.
02:06:10.120 | I don't know, I feel
02:06:13.400 | I would ask big philosophical questions
02:06:17.680 | of people like yourself that know a field really well.
02:06:22.680 | Like, so for example, you have a lot of excellent papers
02:06:25.920 | that, you know, have a lot of theory in them, right?
02:06:30.920 | And there is some temptation to just go through papers.
02:06:35.440 | And I think it's possible to actually do that.
02:06:37.080 | I haven't done that much, but I think it's possible.
02:06:39.440 | It just requires a lot of preparation.
02:06:41.720 | And I can probably only do that with things
02:06:43.920 | that I'm actually, like, in the field I'm aware of.
02:06:48.080 | But there's a dance that I would love to be able
02:06:51.120 | to try to hit right, where it's actually getting
02:06:53.320 | to the core of some interesting ideas,
02:06:55.000 | as opposed to just talking about philosophy.
02:06:56.960 | At the same time, there's a large audience of people
02:06:59.840 | that just wanna be inspired by, like,
02:07:02.920 | by disciplines where they don't necessarily
02:07:05.960 | know the details.
02:07:07.240 | But there's a lot of people that are like,
02:07:08.560 | hmm, I'm really curious.
02:07:10.640 | I've been thinking about pivoting careers
02:07:13.120 | into software engineering.
02:07:14.720 | They would love to hear from people like you
02:07:16.680 | about computer science, even if it's like theory.
02:07:19.320 | - Yeah, but just like the idea that you can have big ideas,
02:07:22.200 | you push them through and it's interesting,
02:07:24.120 | you fight for it, yeah.
02:07:25.200 | - Well, there's some, there's, what is it,
02:07:27.720 | Computerphile and Numberphile, these YouTube channels.
02:07:32.720 | There's channels I watch on chess, exceptionally popular,
02:07:37.240 | where I don't understand maybe 80% of the time
02:07:41.480 | what the hell they're talking about,
02:07:42.720 | 'cause they're talking about why this move
02:07:44.600 | is better than this move.
02:07:45.560 | But I love the passion and the genius of those people
02:07:48.480 | and just overhearing it.
02:07:50.200 | I don't know why that's so exciting.
02:07:52.080 | - Do you look at Scott Aaronson's blog at all?
02:07:54.040 | Shtetl-Optimized.
02:07:55.200 | Yeah, it's like hardcore complexity theory,
02:07:57.880 | but it's just an enthusiasm or like Terry Tao's blog.
02:08:01.200 | - A little bit of humor.
02:08:02.680 | Terry Tao has a blog?
02:08:03.920 | - He used to, yeah.
02:08:04.840 | And it would just be, I'm going all in on,
02:08:08.720 | you know, here's the new affine group
02:08:10.280 | with which you can do whatever.
02:08:11.680 | I mean, it was just equations.
02:08:13.040 | - Well, in the case of Scott Aaronson, he's good.
02:08:15.360 | He's able to turn on like the inner troll
02:08:19.360 | and comedian and so on.
02:08:20.600 | He keeps the fun, which is the best kind of fun.
02:08:22.960 | - He's a philosophical guy.
02:08:24.160 | He wrote that book.
02:08:25.000 | - Yeah, he turns on the philosophy.
02:08:27.120 | Yeah, so, you know, we're exploring these different ways
02:08:30.400 | of communicating science and exciting the world.
02:08:33.480 | Speaking of which, I gotta ask you about computer science.
02:08:36.160 | (laughs)
02:08:37.160 | - That's right, I do some of that.
02:08:39.320 | - So, I mean, a lot of your work is what inspired
02:08:43.600 | this deep thinking about productivity
02:08:46.480 | from all the different angles,
02:08:48.800 | because some of the most rigorous work is mathematical work.
02:08:52.080 | And in computer science, the theoretical computer science.
02:08:55.040 | Let me ask the Scott Aaronson question of like,
02:08:57.440 | is there something to you that stands out in particular
02:09:00.760 | that's beautiful or inspiring
02:09:03.440 | or just really insightful about computer science
02:09:05.920 | or maybe mathematics?
02:09:08.400 | - I mean, I like theory.
02:09:11.240 | And in particular, what I've always liked in theory
02:09:13.120 | is the notion of impossibilities.
02:09:14.840 | That's kind of my specialty.
02:09:16.600 | So, within the context of distributed algorithms,
02:09:19.880 | my specialty is impossibility results.
02:09:21.680 | The idea that you can argue nothing exists that solves this
02:09:26.040 | or nothing exists that can solve this faster than this.
02:09:30.520 | And I think that's really interesting.
02:09:32.000 | And that goes all the way back to Turing.
02:09:34.640 | His original paper on computable numbers
02:09:37.760 | with their connection to the,
02:09:38.760 | it's in German, the Entscheidungsproblem,
02:09:40.240 | but basically the German name
02:09:41.840 | that Hilbert called the decision problem.
02:09:43.400 | This was pre-computers, but he's English,
02:09:46.280 | so it's written in English, so it's a very accessible paper.
02:09:48.760 | And it lays the foundation
02:09:50.480 | for all of theoretical computer science.
02:09:51.880 | He just has this insight.
02:09:53.080 | He's like, well, if we think about an algorithm,
02:09:55.120 | I mean, he figures out all effective procedures
02:09:57.680 | or Turing machines are basically algorithms.
02:09:59.560 | We could really describe a Turing machine with a number,
02:10:01.960 | which we can now imagine with computer code,
02:10:04.120 | you could just take a source file
02:10:05.560 | and just treat the binary version of the file
02:10:07.640 | as a really long number, right?
02:10:09.400 | But he's like, every program is just a finite number.
02:10:12.400 | It's a natural number.
02:10:14.000 | And then he realized one way to think about a problem
02:10:16.520 | is you have, and this is kind of the Mike Sipser approach,
02:10:19.680 | but you have a sort of, it's a language.
02:10:21.760 | So an infinite number of strings,
02:10:23.440 | some of them are in the language and some of them aren't,
02:10:25.040 | but basically you can imagine a problem
02:10:26.720 | is represented as an infinite binary string,
02:10:29.320 | where in every position, like a one means
02:10:31.040 | that string is in the language and a zero means it isn't.
02:10:33.600 | And then he applied Cantor from the 19th century and said,
02:10:37.480 | okay, the natural numbers are countable,
02:10:39.720 | so it's countably infinite, and infinite binary strings,
02:10:43.720 | you can use a diagonalization argument
02:10:45.160 | and show they're uncountable.
02:10:47.920 | So there's just vastly more problems
02:10:50.800 | than there are algorithms.
02:10:51.920 | So basically anything you can come up with
02:10:53.440 | for the most part, almost certainly
02:10:54.480 | is not solvable by a computer.
02:10:56.320 | You know, and then he was like,
02:10:57.400 | let me give a particular example,
02:10:58.760 | and he figured out the very first computability proof.
02:11:00.760 | And he said, let's just walk through
02:11:02.000 | with a little bit of simple logic.
02:11:03.720 | The halting problem can't be solved by an algorithm.
02:11:06.040 | And that kicked off the whole enterprise
02:11:08.840 | of some things can't be solved by algorithms,
02:11:12.800 | some things can't be solved by computers.
02:11:14.240 | And we've just been doing theory on that
02:11:16.240 | since that was the '30s he wrote that.
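As an aside, the contradiction at the heart of that halting-problem proof fits in a few lines. Here is a minimal sketch of the diagonal argument in Python, where halts() is the hypothetical oracle assumed for the sake of contradiction (the names are illustrative, not Turing's notation):

    def halts(program, argument) -> bool:
        # Hypothetical oracle: True iff program(argument) eventually halts.
        # The argument below shows no real implementation can exist.
        raise NotImplementedError("assumed for the sake of contradiction")

    def diagonal(program):
        # Do the opposite of whatever the oracle predicts about the
        # program run on its own source.
        if halts(program, program):
            while True:   # oracle said "halts", so loop forever
                pass
        # oracle said "loops forever", so halt immediately

    # Does diagonal(diagonal) halt? If the oracle answers yes, diagonal
    # loops forever; if it answers no, diagonal halts. Either answer is
    # wrong, so the assumed halts() oracle cannot exist.

The same diagonalization is what separates the countably many programs from the uncountably many problems.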
02:11:18.600 | - So proving that something is impossible
02:11:21.120 | is sort of a more, a stricter version of that.
02:11:23.400 | Is it like proving bounds on the performance
02:11:26.280 | of different algorithms?
02:11:27.200 | - Yeah, so those are, yeah,
02:11:28.040 | so bounds are upper bounds, right?
02:11:29.360 | So you say this algorithm does at least this well
02:11:32.920 | and no worse than this,
02:11:33.760 | but you're looking at a particular algorithm.
02:11:35.560 | And impossibility proofs say no algorithm
02:11:39.200 | could ever solve this problem.
02:11:40.120 | So no algorithm could ever solve the halting problem.
02:11:42.480 | - So it's problem-centric.
02:11:43.880 | It's making something different,
02:11:46.120 | making a conclusive statement about the problem.
02:11:48.520 | - Yes.
02:11:49.360 | - And that's somehow satisfying 'cause it's--
02:11:51.800 | - It's philosophically interesting.
02:11:53.240 | - Yeah.
02:11:54.080 | - I mean, it all goes back to,
02:11:54.920 | you get back to Plato, it's all reductio ad absurdum.
02:11:58.360 | So all these arguments have to start the same way,
02:11:59.680 | because the only way to do it,
02:12:00.520 | since there's an infinite number of solutions
02:12:01.960 | you can't go through, is by contradiction.
02:12:02.800 | You say, let's assume for the sake of contradiction
02:12:06.320 | that there existed something that solves this problem.
02:12:09.040 | And then you turn the crank of logic
02:12:10.280 | until you blow up the universe.
02:12:11.720 | And then you go back and say,
02:12:12.560 | okay, our original assumption
02:12:13.920 | that this solution exists can't be true.
02:12:16.480 | I just think philosophically,
02:12:17.680 | it's like a really exciting kind of beautiful thing.
02:12:19.760 | It's what I specialize in within distributed algorithms
02:12:22.080 | is more like time-bound impossibility results.
02:12:24.840 | Like no algorithm can solve this problem faster than this
02:12:28.680 | in this setting.
02:12:29.680 | Of all the infinite number of ways you might ever do it.
02:12:32.080 | - So you have many papers,
02:12:34.080 | but the one that caught my eye
02:12:35.440 | is "Smooth Analysis of Dynamic Networks,"
02:12:38.400 | in which you write,
02:12:40.640 | "A problem with the worst-case perspective
02:12:42.720 | "is that it often leads to extremely strong lower bounds.
02:12:45.540 | "These strong results motivate a key question.
02:12:48.000 | "Is this bound robust in the sense
02:12:49.800 | "that it captures the fundamental difficulty
02:12:51.940 | "introduced by dynamism?
02:12:53.840 | "Or is the bound fragile in the sense
02:12:56.480 | "that the poor performance it describes
02:12:58.200 | "depends on an exact sequence of adversarial changes?
02:13:01.980 | "Fragile lower bounds leave open the possibility
02:13:05.120 | "of algorithms that might still perform well in practice."
02:13:08.680 | So that, in the sense of the impossibility
02:13:11.760 | and the bounds discussion, presents an interesting question.
02:13:15.280 | I just like the idea of robust and fragile bounds,
02:13:18.440 | but what do you make about this kind of tension
02:13:23.000 | between what's provably,
02:13:25.760 | like what bounds you can prove that are like robust
02:13:30.000 | and something that's a bit more fragile?
02:13:32.520 | And also by way of answering that
02:13:36.040 | for this particular paper,
02:13:38.120 | can you say what the hell are dynamic networks?
02:13:41.400 | - What are distributed algorithms?
02:13:42.240 | - You don't know this, come on now.
02:13:43.680 | - And I have no idea.
02:13:44.880 | And what is smoothed analysis?
02:13:46.160 | - Yeah, well, okay.
02:13:47.000 | So smoothed analysis, it wasn't my idea.
02:13:49.960 | So Spielman and Teng came up with this
02:13:52.400 | in the context of sequential algorithms.
02:13:54.360 | So just like the normal world of an algorithm
02:13:57.040 | that runs on a computer.
02:13:58.460 | And they were looking at,
02:13:59.920 | there's a well-known algorithm called the simplex algorithm,
02:14:02.760 | but basically you're trying to, whatever,
02:14:07.480 | find a hull around a group of points.
02:14:07.480 | And there was an algorithm
02:14:08.320 | that worked really well in practice.
02:14:10.480 | But when you analyze it, you would say,
02:14:12.040 | I can't guarantee it's gonna work well in practice
02:14:13.960 | because if you have just the right inputs,
02:14:16.400 | this thing could run really long, right?
02:14:18.960 | But in practice, it seemed to be really fast.
02:14:20.480 | So smoothed analysis is, they came in and they said,
02:14:22.580 | let's assume that a bad guy chooses the inputs.
02:14:26.000 | It could be anything like really bad ones.
02:14:27.880 | And all we're gonna do, because in simplex, they're numbers.
02:14:30.880 | We're gonna just randomly put a little bit of noise
02:14:33.520 | on each of the numbers.
02:14:34.640 | And they showed if you put a little bit of noise
02:14:36.080 | on the numbers, suddenly simplex algorithm goes really fast.
02:14:39.720 | Like, oh, that explains this lower bound,
02:14:41.840 | this idea that it could sometimes run really long
02:14:44.280 | was a fragile bound because it could only run
02:14:46.280 | a really long time if you had exactly
02:14:47.840 | the worst pathological input.
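To make the recipe concrete, here is a toy sketch of smoothed analysis in Python. The cost function is a stand-in for a running time that explodes on pathological inputs, and the noise scale is an arbitrary choice for illustration, not Spielman and Teng's actual construction:

    import random

    def toy_cost(xs):
        # Stand-in for an algorithm whose running time blows up on
        # pathological inputs (here, exact ties between values).
        return sum(1.0 / (abs(a - b) + 1e-12) for a, b in zip(xs, xs[1:]))

    def smoothed_cost(xs, sigma=0.01, trials=1000):
        # Smoothed analysis: the adversary picks xs, then we add a little
        # Gaussian noise to every number and average the resulting cost.
        total = 0.0
        for _ in range(trials):
            noisy = [x + random.gauss(0.0, sigma) for x in xs]
            total += toy_cost(noisy)
        return total / trials

    adversarial = [1.0, 1.0, 1.0, 1.0]   # exact ties: the fragile worst case
    print(toy_cost(adversarial))         # astronomically large
    print(smoothed_cost(adversarial))    # typically modest once noise is added

The worst case still exists, but after perturbation it has essentially zero probability of being the input you actually run on.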
02:14:49.800 | So then my collaborators and I brought this over
02:14:51.560 | to the world of distributed algorithms.
02:14:53.580 | We brought it over to these general lower bounds, right?
02:14:55.360 | So in the world of dynamic networks,
02:14:58.160 | so distributed algorithm is a bunch of algorithms
02:15:00.080 | on different machines talking to each other,
02:15:02.440 | trying to solve a problem.
02:15:03.280 | And sometimes they're in a network.
02:15:05.000 | So you imagine them connected with network links.
02:15:07.460 | And a dynamic network, those can change, right?
02:15:09.980 | So I was talking to you, but now I can't talk to you anymore
02:15:12.300 | and I'm connected to a person over here.
02:15:14.020 | It's a really hard environment, mathematically speaking.
02:15:16.420 | And there's a lot of really strong lower bounds,
02:15:19.060 | which you could imagine if the network can change
02:15:20.700 | all the time and a bad guy is doing it,
02:15:23.520 | it's like hard to do things well.
02:15:24.960 | - So there's an algorithm running
02:15:26.460 | on every single node in the network.
02:15:27.820 | - Yeah.
02:15:28.660 | - And then you're trying to say something of any kind
02:15:30.780 | that makes any kind of definitive sense
02:15:32.820 | about the performance of that algorithm.
02:15:34.780 | - Yeah, so like, so I just submitted a new paper
02:15:37.300 | on this a couple of weeks ago,
02:15:38.340 | and we were looking at a very simple problem.
02:15:39.860 | There's some messages in the network.
02:15:42.340 | We want everyone to get them.
02:15:44.220 | If the network doesn't change, you can do this pretty well.
02:15:47.900 | You can pipeline them.
02:15:48.740 | There's some algorithms that work,
02:15:50.260 | basic algorithms that work really well.
02:15:52.180 | If the network can change every round,
02:15:54.140 | there's these lower bounds that say
02:15:57.100 | it takes a really long time.
02:15:58.060 | There's a way that like,
02:15:58.900 | no matter what algorithm you come up with,
02:16:00.200 | there's a way the network can change in such a way
02:16:02.140 | that just really slows down your progress basically, right?
02:16:05.640 | So smoothed analysis there says,
02:16:07.100 | yeah, but that seems like a really,
02:16:09.060 | you'd have really bad luck
02:16:10.180 | if your network was changing like exactly
02:16:13.820 | in the right way that you needed to screw your algorithm.
02:16:15.940 | So we said, what if we randomly just add
02:16:19.180 | or remove a couple edges in every round?
02:16:20.620 | So the adversary is trying to choose
02:16:21.700 | the worst possible network.
02:16:22.660 | We're just tweaking it a little bit.
02:16:24.780 | And in that case, this is a new paper.
02:16:25.980 | I mean, it's a blinded submission,
02:16:27.500 | so maybe I shouldn't, it's not, whatever.
02:16:30.340 | We basically showed-
02:16:31.660 | - An anonymous friend of yours submitted a paper.
02:16:33.620 | - Anonymous friend of mine, yeah, yeah.
02:16:35.700 | Whose paper should be accepted.
02:16:37.780 | Showed that even just adding like one random edge per round,
02:16:40.780 | and here's the cool thing about it,
02:16:43.260 | the simplest possible solution to this problem
02:16:46.020 | blows away that lower bound and does really well.
02:16:47.860 | So that's like a very fragile lower bound
02:16:50.100 | because we're like, it's almost impossible
02:16:53.100 | to actually keep things slow.
02:16:55.340 | - I wonder how many lower bounds you can smash open
02:16:59.900 | with this kind of analysis and show that they're fragile.
02:17:02.420 | - It's my interest, yeah.
02:17:03.620 | Because in distributed algorithms,
02:17:05.660 | there's a ton of really famous strong lower bounds,
02:17:08.300 | but things have to go wrong, really, really wrong
02:17:12.500 | for these lower bound arguments to work.
02:17:14.580 | And so I like this approach.
02:17:15.660 | So this whole notion of fragile versus robust,
02:17:17.940 | I was like, well, let's go in
02:17:19.020 | and just throw a little noise in there.
02:17:21.100 | And if it becomes solvable,
02:17:22.980 | then maybe that lower bound wasn't really something
02:17:24.540 | we should worry about.
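A toy version of that experiment is easy to simulate. The sketch below assumes a simple flooding algorithm and a crude isolating adversary, both illustrative stand-ins rather than the model in the actual submission:

    import random

    def spread(n, rounds, smoothing_edges=0):
        # Node 0 starts with the message; each round the adversary picks
        # the graph, then informed nodes flood across that round's edges.
        informed = {0}
        for _ in range(rounds):
            # Crude adversary: give informed nodes no edges at all, and
            # chain the uninformed nodes together so nothing crosses over.
            rest = sorted(set(range(n)) - informed)
            edges = {(rest[i], rest[i + 1]) for i in range(len(rest) - 1)}
            # Smoothing: overlay a few uniformly random extra edges.
            for _ in range(smoothing_edges):
                u, v = random.sample(range(n), 2)
                edges.add((u, v))
            # One round of flooding over this round's edges.
            informed |= {w for u, v in edges
                         if u in informed or v in informed
                         for w in (u, v)}
        return len(informed)

    print(spread(100, 200))                     # adversary wins: stays at 1
    print(spread(100, 200, smoothing_edges=3))  # typically reaches most nodes

The exact numbers are beside the point; what matters is that the pathological schedule the lower bound depends on stops being achievable once a little randomness is layered on top of the adversary's choices.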
02:17:25.540 | - You know, that's gonna embarrass,
02:17:27.340 | that's really uncomfortable.
02:17:28.580 | That's really embarrassing to a lot of people.
02:17:31.500 | 'Cause okay, this is like the OCD thing with the spaces:
02:17:35.900 | it feels really good when you can prove a nice bound.
02:17:39.060 | And if you say that that bound is fragile,
02:17:42.680 | that's like, there's gonna be a sad kid
02:17:46.100 | that walks with their lunchbox back home,
02:17:49.740 | like, "Well, my lower bound doesn't matter."
02:17:52.340 | - No, I don't think they care.
02:17:53.260 | It's all, I don't know, it feels like to me
02:17:55.380 | a lot of this theory is just math machismo.
02:17:57.460 | It's like, whatever, this was a hard bound to prove.
02:18:00.460 | - What do you think about that?
02:18:02.260 | So if you show that something is fragile,
02:18:03.740 | that's more important,
02:18:04.700 | that's really important in practice, right?
02:18:07.200 | So do you think kind of theoretical computer science
02:18:10.740 | is living in its own world, just like mathematics,
02:18:13.340 | and their main effort, which I think is very valuable,
02:18:16.300 | is to develop ideas where it's not necessarily
02:18:19.340 | interesting whether they're applicable in the real world?
02:18:21.060 | - Yeah, we don't care about the applicability.
02:18:23.220 | We kind of do, but not really.
02:18:24.620 | And we're terrible with computers.
02:18:26.060 | We can't do anything useful with computers
02:18:27.740 | and we don't know how to code.
02:18:28.700 | And we're not productive members
02:18:31.860 | of like technological society,
02:18:33.300 | but I do think things percolate.
02:18:36.260 | - Exactly.
02:18:37.100 | - It percolates from the world of theory
02:18:38.660 | into the world of algorithm design,
02:18:40.100 | where we'll pull on the theory
02:18:41.260 | and now suddenly it's useful.
02:18:42.980 | And then the algorithm design gets pulled
02:18:44.840 | into the world of practice where they say,
02:18:46.020 | "Well, actually we can make this algorithm a lot better
02:18:47.700 | because in practice, really these servers do XYZ
02:18:50.020 | and now we can make this super efficient."
02:18:51.620 | And so I do think, I mean, I tell my,
02:18:53.580 | I teach theory to the PhD students at Georgetown.
02:18:56.500 | I show them the sort of funnel of like,
02:18:58.820 | "Okay, we're over here doing theory,"
02:19:00.060 | but eventually some of this stuff will percolate down
02:19:02.500 | and affect, at the very end, a phone,
02:19:05.540 | but it's a long funnel.
02:19:08.060 | - But the very question you're asking
02:19:10.020 | at the highest philosophical level is fascinating.
02:19:12.380 | Like if you take a system, a distributed system
02:19:14.980 | or a network and introduce a little bit of noise into it,
02:19:19.520 | like how many problems of that nature
02:19:23.340 | are fundamentally changed
02:19:25.220 | by that little introduction of noise?
02:19:27.580 | - Yeah, because it's all,
02:19:29.060 | especially in distributed algorithms,
02:19:30.260 | the model is everything.
02:19:31.540 | Like the way we work is we're incredibly precise
02:19:33.820 | about here's exactly, it's mathematical.
02:19:36.020 | Here's exactly how the network works
02:19:37.540 | and it's a state machine, algorithms are state machines.
02:19:40.140 | There's rounds and schedulers.
02:19:41.320 | We're super precise, we can prove lower bounds.
02:19:44.160 | But yeah, often those lower,
02:19:45.060 | those impossibility results really get at the hard edges
02:19:48.780 | of exactly how that model works.
02:19:50.540 | So we'll see if this, so we published a paper on this,
02:19:53.500 | that paper you mentioned,
02:19:55.480 | that kind of introduced the idea
02:19:56.740 | to the distributed algorithms world.
02:19:58.060 | And I think that's got some traction
02:20:01.060 | and there's been some follow-ups.
02:20:02.180 | So we've just submitted our next.
02:20:05.660 | I mean, honestly, the issue with the next
02:20:06.900 | is that like the result fell out so easily,
02:20:09.340 | and this shows the mathematical machismo problem
02:20:11.100 | in these fields, is there's a good chance
02:20:13.940 | the paper won't be accepted
02:20:15.020 | because there wasn't enough mathematical self-flagellation.
02:20:18.220 | - That's such a nice finding.
02:20:20.260 | So even, so showing that very few,
02:20:22.580 | just very little bit of noise,
02:20:24.380 | can have a dramatic, make a dramatic statement
02:20:27.300 | about the-- - It was a big surprise to us,
02:20:29.380 | but once we figured out how to show it, it's not too hard.
02:20:33.220 | - And these are venues
02:20:37.740 | for theoretical work.
02:20:38.900 | Okay, so the fascinating tension
02:20:41.140 | that exists in other disciplines,
02:20:42.540 | like one of them is machine learning,
02:20:44.700 | which despite the power of machine learning
02:20:48.780 | and deep learning and all, like the impact of it,
02:20:52.720 | in the real world, the main conferences on machine learning
02:20:55.960 | are still resistant to application papers.
02:20:58.400 | - Yeah.
02:20:59.240 | - I'm not, sort of, and application papers broadly defined,
02:21:03.880 | meaning like finding almost like you would,
02:21:08.160 | like Darwin did by like going around,
02:21:12.760 | collecting some information, saying,
02:21:14.240 | "Huh, isn't this interesting?"
02:21:15.680 | - Yeah.
02:21:16.760 | - Like those are some of the most popular blogs,
02:21:19.200 | and yet as a paper, it's not really accepted.
02:21:21.320 | I wonder what you think about this whole world
02:21:23.400 | of deep learning from a perspective of theory.
02:21:27.900 | What do you make of this whole discipline
02:21:31.720 | of the success of neural networks,
02:21:33.240 | of how to do science on them?
02:21:34.960 | Are you excited by the possibilities
02:21:37.960 | of what we might discover about neural networks?
02:21:40.080 | Do you think it's fundamental in engineering discipline,
02:21:42.320 | or is there something theoretical
02:21:44.680 | that we might crack open one of these days
02:21:47.480 | in understanding something deep about how system,
02:21:49.640 | optimization, and how systems learn?
02:21:52.160 | - I am convinced by, is it Tegmark at MIT, who's--
02:21:56.160 | - Tegmark?
02:21:57.000 | - Yeah, Tegmark, right?
02:21:58.000 | So his notion has always been convincing to me
02:22:00.240 | that the fact that some of these models are inscrutable
02:22:04.800 | is not fundamental to them,
02:22:06.880 | and that we can, we're gonna get better and better,
02:22:08.600 | because in the end, you know,
02:22:09.920 | the reason why practicing computer scientists
02:22:12.360 | often who are doing AI, or working in AI industry,
02:22:15.560 | aren't like worried about so much existential threats
02:22:18.640 | is because they see the reality
02:22:20.320 | is they're multiplying matrices with numpy
02:22:22.920 | or something like this, right?
02:22:23.760 | Yeah, and tweaking constants
02:22:25.440 | and hoping that the classifier, for God's sakes,
02:22:27.800 | actually gets above some fitness threshold
02:22:30.520 | before the submission deadline.
02:22:31.680 | Like it feels like it's linear algebra and tedium, right?
02:22:36.120 | But anyways, I'm really convinced with his idea
02:22:39.280 | that once we understand better and better
02:22:40.720 | what's going on from a theory perspective,
02:22:42.240 | it's gonna make it into an engineering discipline.
02:22:44.680 | So in my mind, where we're gonna end up is,
02:22:47.240 | okay, forget these metaphors of neurons,
02:22:50.280 | these things are gonna be put down
02:22:52.360 | into these mathematical kind of elegant equations,
02:22:54.880 | differentiable equations that just kind of work well.
02:22:57.640 | And then it's gonna be when I need a little bit of AI
02:23:00.000 | in this thing, plumbing,
02:23:02.320 | like let's get a little bit of a pattern recognizer
02:23:05.240 | with a noise module and let's connect,
02:23:06.840 | I mean, you know this feel better than me,
02:23:08.280 | so I don't know if this is like a reasonable prediction,
02:23:11.240 | but that we're gonna, it's gonna become less inscrutable,
02:23:14.120 | and then it's gonna become more engineerable,
02:23:16.320 | and then we're gonna have AI and more things
02:23:18.600 | because we're gonna have a little bit more control
02:23:20.600 | over how we piece together
02:23:22.560 | these different classification black boxes.
02:23:25.880 | - So one of the problems,
02:23:26.960 | and there might be some interesting parallels
02:23:29.040 | that you might provide intuition on is,
02:23:31.040 | you know, neural networks are very large
02:23:32.680 | and they have a lot of,
02:23:33.840 | we were talking about, you know,
02:23:38.120 | dynamic networks and distributed algorithms.
02:23:41.920 | One of the problems with the analysis of neural networks
02:23:45.520 | is, you know, you have a lot of nodes
02:23:48.720 | and you have a lot of edges.
02:23:50.840 | To be able to interpret and to control different things
02:23:53.160 | is very difficult.
02:23:54.000 | There's whole fields trying to figure out,
02:23:58.120 | like mathematically, how you form clean representations
02:24:03.120 | where, like, one node contains all the information
02:24:07.720 | about a particular thing and no other node
02:24:09.800 | is correlated with it, so like it has unique knowledge.
02:24:13.680 | But that ultimately boils down to trying
02:24:15.920 | to simplify this thing in a way
02:24:19.000 | that goes against its very nature,
02:24:20.600 | which is like deeply connected and like dynamic
02:24:25.600 | and just, you know, hundreds of millions, billions of nodes.
02:24:30.400 | And in a distributed sense, like when you zoom out,
02:24:33.960 | the thing has a representation
02:24:35.440 | and understanding of something,
02:24:36.960 | but the individual nodes are just doing
02:24:38.520 | their little exchange thing.
02:24:40.520 | And it's the same thing with Stephen Wolfram
02:24:42.800 | when he talked about cellular automata.
02:24:44.480 | It's very difficult to do math
02:24:46.360 | when you have a huge collection of distributed things,
02:24:48.800 | each acting on their own.
02:24:50.440 | And it's almost like, it feels like it's almost impossible
02:24:54.760 | to do any kind of theoretical work in the traditional sense.
02:24:58.200 | It almost becomes completely like a biology,
02:25:02.560 | you become a biologist as opposed to a theoretician.
02:25:06.080 | You just study it experimentally.
02:25:07.720 | - Yeah, so I think that's the big question, I guess, right?
02:25:10.720 | Yeah, so is the large size and interconnectedness
02:25:15.280 | of the, like a deep learning network,
02:25:17.800 | fundamental to that task,
02:25:19.160 | or are we just not very good at it yet
02:25:20.400 | because we're using the wrong metaphor?
02:25:23.400 | I mean, the human brain learns with far fewer examples
02:25:26.800 | and with much less tuning of the whatever, whatever,
02:25:30.440 | whatever it probably requires to get
02:25:32.360 | those, like, DeepMind networks up and running.
02:25:34.480 | But yeah, so I don't really know,
02:25:36.240 | but the one thing I have observed is that,
02:25:38.160 | yeah, the mundane nature
02:25:41.280 | of some of the work with these models
02:25:43.760 | tends to lead people to think less that
02:25:45.560 | it could be Skynet,
02:25:49.200 | and more that it could be like a lot of pain to get
02:25:52.320 | the thermostat to do what we want it to do.
02:25:54.600 | - And there's a lot of open questions in between there.
02:25:56.680 | And then of course, the distributed network
02:26:01.680 | of humans that use these systems.
02:26:04.760 | So like you can have the system itself,
02:26:07.520 | then you know network,
02:26:08.680 | but you can also have like little algorithms
02:26:10.520 | controlling the behavior of humans,
02:26:11.800 | which is what you have with social networks.
02:26:14.120 | It's possible that a very, what is it a toaster,
02:26:17.120 | or whatever, the opposite of Skynet,
02:26:19.560 | when taken at scale,
02:26:20.640 | but used by individual humans and controlling their behavior
02:26:23.320 | can actually have the Skynet effect.
02:26:25.640 | - Yeah. - So the scale there.
02:26:27.800 | - We might have that now.
02:26:29.040 | - We might have that now, we just don't know.
02:26:30.720 | - Yeah. - As it's happening.
02:26:32.240 | - Is Twitter creating a little mini Skynet?
02:26:35.080 | I mean, because what happens there
02:26:38.000 | ripples out ramifications in the world.
02:26:38.000 | And is it really that much different
02:26:40.160 | if it's a robot with tentacles or a bunch of servers that.
02:26:44.520 | - Yeah, and the destructive effects could be,
02:26:47.560 | I mean, it could be political,
02:26:48.880 | but it could also be like,
02:26:51.040 | you could probably make an interesting case
02:26:52.640 | that the virus, the coronavirus spread on Twitter too,
02:26:57.640 | in the minds of people,
02:27:00.800 | like the fear and the misinformation
02:27:03.400 | in some very interesting ways.
02:27:05.000 | - Yeah. - Mixed up.
02:27:06.400 | And maybe this pandemic wasn't sufficiently dangerous
02:27:08.960 | to where that could have created a weird instability,
02:27:13.000 | but maybe other things might create instability.
02:27:15.120 | Like somebody, God forbid,
02:27:17.000 | detonates a nuclear weapon somewhere.
02:27:19.200 | And then maybe the destructive aspect of that
02:27:21.800 | would not as much be the military actions,
02:27:24.920 | but the way those news are spread on Twitter.
02:27:27.800 | - Yeah. - And the panic that creates.
02:27:29.320 | - Yeah, yeah.
02:27:30.440 | I mean, I think that's a great case study, right?
02:27:32.360 | Like what happened?
02:27:34.320 | I'm not suggesting that Lex go let off a nuclear bomb.
02:27:36.760 | I meant the coronavirus, but okay.
02:27:39.280 | But yeah, I think that's a really interesting case study.
02:27:42.120 | I'm interested in the counterfactual of 1995,
02:27:46.040 | like do the same virus in 1995.
02:27:48.360 | So first of all, it would have been,
02:27:50.440 | I get to hear whatever, the nightly news,
02:27:53.520 | we'll talk about it,
02:27:54.480 | and then there'll be my local health board
02:27:57.280 | will talk about it.
02:27:58.320 | That meant mitigation decisions
02:27:59.880 | would probably necessarily be very sort of localized.
02:28:04.320 | Like our community is trying to figure out
02:28:05.480 | what are we gonna do?
02:28:06.320 | What's gonna happen?
02:28:07.160 | Like we see this with schools,
02:28:08.080 | like where I grew up in New Jersey,
02:28:10.400 | there's very localized school districts.
02:28:12.800 | So even though they had sort of really bad viral numbers
02:28:16.120 | there, my school I grew up in has been open since the fall
02:28:18.360 | because it's very localized.
02:28:20.160 | It's like these teachers and these parents,
02:28:21.640 | what do we wanna do?
02:28:22.480 | What are we comfortable with?
02:28:23.680 | I live in a school district right now in Montgomery County
02:28:26.440 | that's a billion dollar a year budget,
02:28:27.960 | 150,000 kid school district.
02:28:30.000 | It just can't, it's closed, you know, because it's too big.
02:28:32.720 | So I'm interested in that counterfactual.
02:28:33.960 | Yes, you have all this information moving around.
02:28:36.440 | And then you have the effects on discourse
02:28:39.240 | that we were talking about earlier,
02:28:40.520 | that the Neil Postman style effects of Twitter,
02:28:43.520 | which shifts people into a sort of a dunk culture mindset
02:28:46.200 | of don't give an inch to the other team.
02:28:49.880 | And we're used to this, and it was fired up by politics
02:28:51.720 | and the unique attributes of Twitter.
02:28:53.480 | Now throw in the coronavirus
02:28:54.760 | and suddenly we see decades of public health
02:28:57.800 | knowledge, a lot of which was honed during the HIV epidemic,
02:29:00.960 | thrown out the window
02:29:02.320 | because a lot of this was happening on Twitter.
02:29:04.280 | And suddenly we had public health officials
02:29:06.200 | using a don't give an inch to the other team mindset
02:29:08.760 | of like, well, if we say this,
02:29:10.600 | that might validate something that was wrong over here.
02:29:13.000 | And we need to, if we say this,
02:29:14.240 | then maybe like that'll stop them from doing this.
02:29:16.440 | That's like very Twittery in a way that in 1995
02:29:19.760 | is probably not the way public health officials
02:29:22.620 | would be thinking.
02:29:23.460 | Or now it's like, well, this is,
02:29:25.320 | if we said this about masks,
02:29:26.560 | but the other team said that about masks,
02:29:28.160 | we can't give an inch to this.
02:29:29.320 | So we gotta be careful.
02:29:30.280 | And like, we can't tell people it's okay
02:29:31.680 | after they're vaccinated because that might,
02:29:33.640 | we're giving them an inch on this.
02:29:34.680 | And that's very Twittery in my mind, right?
02:29:36.840 | That is the impact of Twitter
02:29:39.440 | on the way we think about discourse,
02:29:40.680 | which is a dunking culture of don't give any inch
02:29:42.480 | to the other team.
02:29:43.320 | And it's all about slam dunks where you're completely right
02:29:45.240 | and they're completely wrong.
02:29:46.480 | As a rhetorical strategy it's incredibly simplistic,
02:29:49.080 | but it's also the way that we think right now
02:29:50.840 | about how we do debate.
02:29:52.540 | It combined terribly with an election year pandemic.
02:29:56.400 | - Yeah, election year pandemic.
02:29:58.240 | I wonder if we could do some smoothed analysis.
02:30:00.040 | Let's run the simulation over a few times.
02:30:01.840 | - A little bit of noise.
02:30:02.680 | - Yeah, see if it can dramatically change
02:30:05.760 | the behavior of the system.
02:30:07.240 | Okay, we talked about your love for proving
02:30:10.080 | that something is impossible.
02:30:11.640 | So there's quite a few still open problems
02:30:14.360 | and complexity of algorithms.
02:30:17.440 | So let me ask, does P equal NP?
02:30:20.820 | - Probably not.
02:30:22.040 | - Probably not.
02:30:23.380 | If P equals NP, what kind of,
02:30:28.380 | and you'd be really surprised if somebody proves it.
02:30:31.600 | What would that proof look like?
02:30:33.960 | And why would that even be?
02:30:35.080 | What would that mean?
02:30:36.480 | What would that proof look like?
02:30:38.320 | And what possible universe could P equals NP?
02:30:40.800 | Is there something insightful you could say there?
02:30:43.440 | - It could be true.
02:30:45.440 | And I mean, I'm not a complexity theorist,
02:30:47.280 | but every complexity theorist I know
02:30:49.520 | is convinced they're not equal
02:30:51.440 | and are basically not working on it anymore.
02:30:53.120 | I mean, there is a million dollars at stake
02:30:54.480 | if you can solve it.
02:30:55.920 | It's one of the Millennium Prizes.
02:30:57.800 | Okay, so here's how I think the P not equals NP proof
02:31:00.840 | is gonna eventually happen.
02:31:02.800 | I think it's gonna fall out
02:31:04.520 | and it's gonna be not super simple,
02:31:07.360 | but not as hard as people think.
02:31:08.720 | Because my theory about a lot of theoretical
02:31:11.640 | computer science based on just some results I've done,
02:31:14.000 | so this is a huge extrapolation,
02:31:15.440 | is that a lot of what we're doing
02:31:17.200 | is just obfuscating deeper mathematics.
02:31:20.040 | So like this happens to me a lot, not a lot,
02:31:22.680 | but it's happened to me a few times in my work
02:31:24.320 | where we obfuscate it because we say,
02:31:26.240 | well, there's an algorithm and it has this much memory
02:31:29.280 | and they're connected on a network.
02:31:30.640 | And okay, here's our setup.
02:31:31.760 | And now we're trying to see how fast it can solve a problem.
02:31:34.480 | And people do bounds about it.
02:31:35.960 | And then the end, it turns out
02:31:37.040 | that we were just obfuscating some underlying
02:31:39.320 | mathematical thing that already existed.
02:31:43.320 | Right, so this has happened to me.
02:31:44.360 | I had this paper I was quite fond of a while ago.
02:31:47.400 | It was looking at this problem called contention resolution
02:31:50.580 | where you put an unknown set of people on a shared channel
02:31:54.520 | and they're trying to break symmetry.
02:31:55.740 | So it was like an ethernet, whatever.
02:31:57.720 | Only one person can use it at a time.
02:31:58.920 | You try to break symmetry.
02:31:59.760 | There's all these bounds people have proven over the years
02:32:02.360 | about how long it takes to do this, right?
02:32:05.040 | And like I discovered at some point,
02:32:07.400 | there's this one combinatorial result
02:32:10.400 | from the early 1990s.
02:32:12.600 | All of these lower bound proofs all come from this.
02:32:15.400 | And in fact, it improved a lot of them
02:32:16.800 | and simplified a lot.
02:32:17.640 | You could put it all in one paper.
02:32:19.760 | It was like, are we really?
02:32:20.600 | And then, okay, so this new paper
02:32:21.860 | that I submitted a couple of weeks ago,
02:32:24.180 | I found you could take some of these same lower bound proofs
02:32:26.380 | for this contention resolution problem.
02:32:28.040 | You could reprove them using Shannon's source coding theorem.
02:32:32.240 | That actually when you're breaking contention,
02:32:34.040 | what you're really doing is building a code over,
02:32:37.320 | if you have a distribution on the network sizes,
02:32:40.460 | it's a code over that source.
02:32:41.700 | And if you plug in a high entropy information source
02:32:44.400 | and plug in, from 1948, the source coding theorem
02:32:47.460 | that says on a noiseless channel,
02:32:48.680 | you can't send things at a faster rate
02:32:51.080 | than the entropy allows,
02:32:52.400 | the exact same lower bounds fall back out again.
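For reference, the bound being invoked is Shannon's entropy, H = -sum over x of p(x) * log2 p(x): on a noiseless channel, no code can average fewer than H bits per symbol from that source. A minimal sketch of computing it, with a made-up distribution over network sizes purely for illustration:

    import math

    def entropy_bits(dist):
        # Shannon entropy: the 1948 noiseless-channel lower bound on the
        # average bits per symbol drawn from this distribution.
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # Illustrative distribution over possible network sizes.
    network_sizes = {1: 0.5, 2: 0.25, 4: 0.125, 8: 0.125}
    print(entropy_bits(network_sizes))   # 1.75: no code averages below this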
02:32:54.400 | So like this type of thing happens,
02:32:55.800 | there's some famous lower bounds in distributed algorithms
02:32:58.680 | that turned out to all be algebraic topology
02:33:01.560 | underneath the covers.
02:33:02.680 | And they won the Gödel Prize for working on that.
02:33:05.480 | So my sense is what's gonna happen is at some point,
02:33:08.360 | someone really smart, it's gonna be very exciting,
02:33:10.960 | is gonna realize there's some sort of other representation
02:33:14.720 | of what's going on with these Turing machines
02:33:16.800 | trying to sort of efficiently compute.
02:33:18.480 | - It'll naturally fall out of that.
02:33:19.320 | - And there'll be an existing mathematical result
02:33:22.080 | that applies.
02:33:23.920 | - Someone or something, I guess.
02:33:25.360 | It could be AI theorem provers kind of thing.
02:33:28.000 | - It could be, yeah.
02:33:28.840 | I mean, not a, well, yeah.
02:33:30.520 | I mean, there's theorem provers,
02:33:31.760 | like what that means now, which is not fun.
02:33:35.520 | - It's just a bunch of--
02:33:37.040 | - Very carefully formulated postulates that,
02:33:39.280 | but I take your point, yeah.
02:33:41.320 | - Yeah, so, okay.
02:33:44.480 | On a small tangent, and then you're kind of
02:33:47.480 | implying that mathematics, it almost feels
02:33:49.840 | like a kind of weird evolutionary tree
02:33:52.480 | that ultimately leads back to some kind of ancestral,
02:33:55.400 | few fundamental ideas that all are just like,
02:33:58.820 | they're all somehow connected.
02:34:00.320 | In that sense, do you think math is fundamental
02:34:06.240 | to our universe and we're just like slowly
02:34:09.080 | trying to understand these patterns?
02:34:12.000 | Or is it discovered?
02:34:14.200 | Or is it just a little game that we play
02:34:17.960 | amongst ourselves to try to fit
02:34:21.560 | little patterns to the world?
02:34:23.640 | - Yeah, that's the question, right?
02:34:24.840 | That's the physicist's question.
02:34:26.800 | I mean, I'm probably in the discovered camp,
02:34:29.080 | but I don't do theoretical physics,
02:34:30.680 | so I know they feel like
02:34:34.240 | they have a stronger claim to answering that question.
02:34:37.240 | But everything comes back to it.
02:34:38.440 | Everything comes back to it.
02:34:39.480 | I mean, all of physics, the fact that
02:34:41.360 | the universe is, well, okay.
02:34:43.900 | It's a complicated question.
02:34:47.200 | - So how often do you think, how deeply
02:34:51.080 | does this result describe the fundamental reality of nature?
02:34:55.680 | - So the reason I hesitated, because it's something I'm,
02:35:01.080 | I taught this seminar and did a little work
02:35:03.080 | on what are called biological algorithms.
02:35:05.000 | So there's this notion of,
02:35:09.360 | so physicists use mathematics to explain the universe,
02:35:14.360 | and it's unreasonable how well mathematics works.
02:35:18.200 | All these differential equations,
02:35:19.760 | why does that explain all we need to know
02:35:21.640 | about thermodynamics and gravity
02:35:23.000 | and all these type of things?
02:35:24.200 | Well, there's this movement within
02:35:26.400 | the intersection of computer science and biology.
02:35:28.640 | It's kind of Wolfram-ian, I guess, really,
02:35:31.120 | that algorithms can be very explanatory.
02:35:35.000 | Like if you're trying to explain parsimoniously
02:35:39.400 | something about like an ant colony or something like this,
02:35:41.800 | ultimately,
02:35:43.640 | it's not gonna be explained by an equation,
02:35:45.800 | like a physics equation.
02:35:46.800 | It's gonna be explained by an algorithm.
02:35:48.240 | So like this algorithm run distributedly
02:35:51.880 | is going to explain the behavior.
02:35:53.220 | So that's mathematical, but not quite,
02:35:56.120 | though it is if you think about an algorithm
02:35:57.620 | as lambda calculus, which brings you back
02:35:59.480 | to the world of mathematics.
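As a toy illustration of that point (my own construction, not from the conversation): a minimal sketch where each agent follows a tiny local rule, and the global trail-forming behavior is a property of the distributed algorithm rather than of any single closed-form equation.

```python
# Each "ant" follows a tiny local rule; the emergent concentration of ants
# is explained by the algorithm run distributedly, not by one equation.
import random

CELLS, ANTS, STEPS = 30, 20, 200
pheromone = [0.0] * CELLS                         # shared environment
ants = [random.randrange(CELLS) for _ in range(ANTS)]

for _ in range(STEPS):
    for i, pos in enumerate(ants):
        left, right = (pos - 1) % CELLS, (pos + 1) % CELLS
        if random.random() < 0.1:                 # occasional exploration
            ants[i] = random.choice([left, right])
        else:                                     # local rule: follow pheromone
            ants[i] = left if pheromone[left] > pheromone[right] else right
        pheromone[ants[i]] += 1.0                 # deposit on arrival
    pheromone = [0.95 * p for p in pheromone]     # evaporation

# The five most-reinforced cells: a global pattern no single ant "computes."
print(sorted(range(CELLS), key=lambda c: -pheromone[c])[:5])
```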
02:36:01.040 | So I'm thinking out loud here, but basically,
02:36:04.880 | abstract math is sort of like unreasonably effective
02:36:09.120 | at explaining a lot of things.
02:36:10.400 | And that's just what I feel like I glimpse.
02:36:11.820 | I'm not like a super well-known theoretician.
02:36:14.960 | I don't have really famous results.
02:36:16.940 | So even as a sort of middling career theoretician,
02:36:21.840 | I keep encountering this where we think
02:36:25.440 | we're solving some problem about computers and algorithms,
02:36:28.600 | and it's some much deeper underlying math.
02:36:31.000 | It's Shannon, but Shannon is entropy,
02:36:33.200 | and entropy really goes all the way back
02:36:36.800 | to, whatever it was, Boltzmann, all the way back
02:36:39.280 | to the early physics.
02:36:40.600 | And it's, anyways, to me, I think it's amazing.
02:36:44.920 | - Yeah, but it could be the flip side of that
02:36:47.360 | could be just our brains draw so much pleasure
02:36:49.440 | from deriving generalized theories
02:36:53.200 | and simplifying the universe that we just naturally see
02:36:56.640 | that kind of simplicity in everything.
02:36:58.640 | - Yeah, so that's the whole Newton to Einstein, right?
02:37:01.840 | So you can say this must be right
02:37:03.560 | because it's so predictive.
02:37:04.920 | And well, it's not quite predictive
02:37:06.400 | because Mercury wobbles a little bit,
02:37:07.680 | but I think we have it settled.
02:37:08.640 | And then it turns out, no, Einstein.
02:37:10.720 | And then you get Bohr, like, no, not Einstein.
02:37:13.880 | It's actually statistical.
02:37:15.120 | And yeah, so that would say-
02:37:16.600 | - It's hard to also know where smoothed analysis
02:37:18.960 | fits into all that, where a little bit of noise,
02:37:21.520 | like you can say something very clean about a system
02:37:25.680 | and then a little bit of noise,
02:37:27.400 | like the average case is actually very different.
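A minimal numerical sketch of that smoothed-analysis intuition (my own example, assuming numpy is available): a worst-case input is often a knife-edge construction, and a tiny random perturbation almost always knocks it over.

```python
# Smoothed-analysis intuition in miniature: compare the condition number of
# an adversarial matrix before and after a tiny random perturbation.
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = np.ones((n, n))        # adversarial input: rank 1, singular
sigma = 1e-3               # tiny smoothing perturbation
A_smoothed = A + sigma * rng.standard_normal((n, n))

# Infinite (or astronomically large) for the worst case,
# but modest with high probability after smoothing.
print(np.linalg.cond(A))
print(np.linalg.cond(A_smoothed))
```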
02:37:29.840 | And so, I mean, that's where the quantum mechanics comes in.
02:37:32.880 | It's like, ugh, why does it have to be randomness in this?
02:37:36.160 | - Yeah, you'd have to do these complex statistics.
02:37:38.680 | - Yeah. - Yeah.
02:37:39.520 | - So to be determined.
02:37:42.560 | - Yeah, that'll be my next book.
02:37:43.960 | That'd be ambitious.
02:37:45.600 | The fundamental core of reality, comma,
02:37:50.000 | and some advice for being more productive at work.
02:37:52.440 | (both laugh)
02:37:53.720 | - Can I ask you just if it's possible to do an overview
02:37:57.080 | and just some brief comments of wisdom
02:38:00.400 | on the process of publishing a book?
02:38:02.880 | What's that process entail?
02:38:04.040 | What are the different options?
02:38:05.400 | And what's your recommendation
02:38:06.960 | for somebody that wants to write a book like yours,
02:38:12.400 | a nonfiction book that discovers something interesting
02:38:15.960 | about this world?
02:38:17.760 | - So what I usually advise is follow the process as is.
02:38:25.560 | Don't try to reinvent.
02:38:27.280 | I think that happens a lot where you'll try to reinvent
02:38:31.480 | the way the publishing industry should work.
02:38:33.000 | Like, this is kind of not in a business-model way,
02:38:36.280 | but just like, this is what I want to do.
02:38:38.200 | I wanna write a thousand words a day and I wanna do this,
02:38:40.680 | and I'm gonna put it on the internet.
02:38:42.120 | And the publishing industry is very specific
02:38:44.640 | about how it works.
02:38:46.280 | And so like when I got started writing books,
02:38:47.840 | which was at a very young age,
02:38:48.800 | so I sold my first book at the age of 21,
02:38:52.280 | the way I did that is I found a family friend
02:38:56.280 | that was an agent.
02:38:57.920 | And I said, "I'm not trying to make you be my agent.
02:39:00.480 | "Just explain to me how this works.
02:39:02.160 | "Not just how the world works,
02:39:03.240 | "but give me the hard truth about how would a 21-year-old,
02:39:06.580 | "under what conditions could a 21-year-old sell a book
02:39:08.700 | "and what would that look like?"
02:39:09.540 | And she just explained it to me.
02:39:10.560 | Well, you'd have to do this,
02:39:11.400 | and it'd have to be a subject that it made sense for you to write.
02:39:14.000 | And you would have to do this type of writing
02:39:15.320 | for other publications to get validated, and blah, blah, blah.
02:39:17.560 | And you have to get the agent first.
02:39:18.720 | And I learned the whole game plan.
02:39:20.760 | And then I executed.
02:39:22.240 | And so the rough game plan is with nonfiction,
02:39:24.680 | you get the agent first
02:39:26.440 | and the agent's gonna sell it to the publishers.
02:39:28.720 | So you're never sending something directly
02:39:30.200 | to the publishers.
02:39:31.040 | In nonfiction, you're not writing the book first.
02:39:34.280 | You're gonna get an advance from the publisher once sold.
02:39:37.260 | And then you're gonna do the primary writing of the book.
02:39:40.160 | In fact, it will, in most circumstances,
02:39:42.320 | hurt you if you've already written.
02:39:43.520 | - If you've already written.
02:39:44.360 | - Yeah.
02:39:45.200 | - So you're trying to sell,
02:39:46.020 | well, I guess the agent,
02:39:47.160 | first you sell it to the agent
02:39:48.000 | and then the agent sells it to the publishers.
02:39:49.760 | - Yeah, it's much easier to get an agent
02:39:51.640 | than a book deal.
02:39:52.480 | So the thought is,
02:39:53.440 | if you can't get an agent, then why would a publisher take you?
02:39:55.260 | So you start with the agent,
02:39:56.200 | and also, the way this works with a good agent is,
02:39:59.160 | they know all the editors
02:40:00.440 | and they have lunch with the editors.
02:40:01.480 | And they're always just like,
02:40:02.320 | "Hey, what projects do you have coming?
02:40:03.160 | "What are you looking for?
02:40:04.000 | "Here's one of my authors."
02:40:04.820 | That's the way all these deals happen.
02:40:06.040 | It's not, you're not emailing a manuscript to a slush pile.
02:40:09.480 | - Yeah, and so, so first of all,
02:40:11.120 | the agent takes a percentage and then the publishers,
02:40:13.200 | this is where the process comes in.
02:40:14.640 | They also take a cut that's probably ridiculous.
02:40:17.840 | So if you try to reinvent the system,
02:40:20.980 | you'll probably be frustrated by the percentage
02:40:22.760 | that everyone takes relative to
02:40:24.320 | how much bureaucracy, inefficiency,
02:40:26.560 | and ridiculousness there is in the system.
02:40:28.200 | Your recommendation is like,
02:40:29.960 | you're just one ant.
02:40:31.620 | Stop trying to build your own ant colony.
02:40:34.360 | - Well, or if you create your own process
02:40:36.800 | for how it should work,
02:40:38.400 | the book's not gonna get published.
02:40:39.440 | So there's the separate question,
02:40:40.880 | the economic question of like,
02:40:41.920 | should I create my own,
02:40:43.660 | like self-publish it or do something like that?
02:40:45.400 | But putting that aside,
02:40:47.080 | there's a lot of people I encounter
02:40:48.860 | that wanna publish a book with a main publisher,
02:40:51.320 | but they invent their own rules for how it works, right?
02:40:54.080 | - So then the alternative though is self-publishing
02:40:56.400 | and the downside, there's a lot of downsides.
02:41:00.440 | It's like, it's almost like publishing an opinion piece
02:41:03.080 | in the New York Times versus writing on a blog.
02:41:05.560 | There's no reason why writing a blog post on Medium
02:41:09.520 | can't get way more attention and legitimacy
02:41:13.560 | and long-lasting prestige than a New York Times article.
02:41:16.800 | But nevertheless, for most people,
02:41:18.900 | writing in a prestigious newspaper,
02:41:21.220 | quote unquote prestigious, is just easier.
02:41:26.220 | - And well, and depends on your goal.
02:41:29.060 | So, you know, like I push you towards a big publisher
02:41:32.220 | because I think your goal is
02:41:33.980 | huge ideas; you want to have impact.
02:41:36.060 | You're gonna have more impact, you know?
02:41:37.900 | - Even though, like actually,
02:41:39.820 | so there's different ways to measure impact, right?
02:41:41.860 | - In the world of ideas.
02:41:42.780 | - In the world of ideas.
02:41:44.180 | And also, yeah, in the world of ideas,
02:41:46.420 | it's kind of like the clubhouse thing now,
02:41:48.920 | even if the audience is not large,
02:41:51.160 | the people in the audience are very interesting.
02:41:53.560 | It's like the conversation feels like
02:41:56.860 | it has long-lasting impact among the people
02:42:01.460 | who in different and disparate industries
02:42:03.720 | that are also then starting their own conversations
02:42:06.540 | and all that kind of stuff.
02:42:07.380 | - Yeah, because you have other options.
02:42:09.020 | So like, self-publishing a book,
02:42:11.080 | the goals that would solve,
02:42:13.260 | you have much better ways of getting to those goals,
02:42:15.340 | right?
02:42:16.220 | So if there's the financial aspect
02:42:17.900 | of, well, you get to keep more of it,
02:42:19.580 | I mean, the podcast is probably gonna crush
02:42:22.420 | what the book's gonna do anyways, right?
02:42:23.580 | Yeah, if you
02:42:25.260 | wanna get directly to certain audiences or crowds,
02:42:28.780 | it might be harder through a traditional publisher.
02:42:30.580 | There's better ways to talk to those crowds.
02:42:32.580 | It could be on clubhouse.
02:42:33.420 | With all these new technologies,
02:42:34.660 | a self-published book's not gonna be the most effective way
02:42:36.900 | to find your way to a new crowd.
02:42:39.220 | But if the idea is like,
02:42:40.040 | I wanna leave a dent in the world of ideas,
02:42:43.600 | then to have a venerable old publisher
02:42:47.200 | put out your book in a nice hardcover
02:42:49.140 | and do the things they do,
02:42:51.080 | that goes a long way.
02:42:53.000 | And they do do a lot.
02:42:53.820 | I mean, it's very difficult actually.
02:42:55.660 | There's so much involved in putting together a book.
02:42:57.800 | - They get books into bookstores
02:42:59.160 | and all that kind of stuff.
02:42:59.980 | - All this, and from an efficiency standpoint,
02:43:02.480 | I mean, just the time involved
02:43:03.720 | in trying to do this yourself is,
02:43:05.520 | you know, people do it. - They have a process, right?
02:43:06.720 | Like you said, they have a process.
02:43:08.360 | - They've got a process.
02:43:09.200 | I mean, I know like Jocko did this recently.
02:43:10.960 | He started his own imprint, and I have a couple other examples,
02:43:13.000 | but it's a huge overhead.
02:43:14.960 | I mean, if you like, if you run a business and you,
02:43:17.160 | so like Jocko is a good case study, right?
02:43:19.220 | So he got fed up with Simon and Schuster
02:43:22.360 | dragging their feet and said,
02:43:23.400 | "I'm gonna start my own imprint then,
02:43:25.160 | if you're not gonna publish my kid's book."
02:43:28.040 | But he, what does he do?
02:43:29.240 | He runs businesses, right?
02:43:30.760 | So I think in his world, like I already run,
02:43:32.720 | I'm a partner in whatever, in Origin,
02:43:34.880 | and I have this and that.
02:43:35.720 | And so it's like, yeah, we can run businesses.
02:43:37.680 | That's what we know how to do.
02:43:38.520 | That's what I do.
02:43:39.340 | I run businesses, I have people,
02:43:40.500 | but for like you or I, we don't run businesses.
02:43:43.120 | It'd be terrible.
02:43:43.960 | - Yeah. - Yeah.
02:43:44.800 | - Well, especially these kinds of businesses, right?
02:43:47.400 | So I do wanna launch a business,
02:43:48.880 | but very different technology business.
02:43:50.400 | It's a very different space. - Very different.
02:43:51.240 | - Very different space. - Very, very different.
02:43:52.560 | Yeah, yeah.
02:43:53.400 | I mean, this is like, okay, I need copy editors
02:43:55.720 | and graphic designers and book binders,
02:43:57.840 | and I need a contract with the printer,
02:43:59.240 | but the printer doesn't have slots.
02:44:00.720 | And so now I have to try to, I mean, it's-
02:44:02.760 | - I get so, I need to shut this off in my brain,
02:44:05.640 | but I get so frustrated when the system
02:44:07.180 | could clearly be improved.
02:44:08.580 | It's the thing that you're mentioning.
02:44:10.080 | - Yeah.
02:44:10.920 | - It's like, this is so inefficient.
02:44:12.440 | Every time I go to the DMV or something like that,
02:44:15.640 | you'd think like, ah, this could be done so much better.
02:44:17.940 | - Yeah.
02:44:18.780 | - But, you know, the same thing is the worry
02:44:22.600 | with an editor, which I guess would come from the publisher.
02:44:26.080 | Like, how much supervision on your book
02:44:31.080 | did you receive?
02:44:32.060 | Like, hey, do you think this is too long?
02:44:34.260 | Or do you think the title, like title,
02:44:36.240 | how much choice do you have in the title,
02:44:38.220 | in the cover, in the presentation,
02:44:40.460 | and the branding and all that kind of stuff?
02:44:41.940 | - Yeah, I mean, all of it depends, right?
02:44:43.820 | So when it comes to the relationship
02:44:47.260 | with the editor on the writing,
02:44:48.500 | it depends on the editor and it depends on you.
02:44:51.620 | So like at this point, I'm on my seventh book
02:44:54.100 | and I write for a lot of major publications.
02:44:56.860 | And at this point I have what I feel is a voice
02:45:00.180 | and a level of craft
02:45:01.900 | that I'm very comfortable with, right?
02:45:03.860 | So my editor, she kind of is gonna trust me
02:45:07.060 | and it's gonna be more big picture.
02:45:08.300 | Like I'm losing the thread here
02:45:10.860 | or this seems like it could be longer.
02:45:12.820 | Whereas the first book I wrote when I was 21,
02:45:15.380 | I had notes such as, you start a lot of sentences with so,
02:45:19.500 | you don't use any contractions
02:45:20.940 | because I'd been doing scientific writing,
02:45:22.340 | where we don't use contractions.
02:45:23.180 | Like, you should probably use contractions.
02:45:25.220 | I think it was way more, you know,
02:45:26.940 | I had to go back and rewrite the whole thing, yeah.
02:45:29.660 | - But ultimately the recommendation,
02:45:31.980 | I mean, we talked offline and sort of,
02:45:33.740 | I was thinking loosely, not really sure,
02:45:36.860 | but I was thinking of writing a book
02:45:37.940 | and there's a kind of desire to go self-publishing,
02:45:40.620 | not for financial reasons.
02:45:42.060 | - And the money can be good by the way, right?
02:45:43.380 | I mean, it's very power-law distributed, right?
02:45:48.100 | So the money on a hardcover is somewhere
02:45:49.820 | between one or $2 a book.
02:45:52.020 | - So the thing is, I personally don't-
02:45:53.860 | - But then you give up 15% to the agent, so.
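Back-of-envelope version of the numbers just quoted ($1 to $2 per hardcover, 15% to the agent); the sales figure is hypothetical, purely for illustration.

```python
# Hardcover royalty arithmetic using the figures mentioned in the
# conversation; copies_sold is a made-up number for illustration.
copies_sold = 50_000          # hypothetical
royalty_per_book = 1.50       # midpoint of the quoted $1-$2 range
agent_cut = 0.15              # the 15% mentioned above

gross = copies_sold * royalty_per_book
net = gross * (1 - agent_cut)
print(f"gross: ${gross:,.0f}, after agent: ${net:,.0f}")
```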
02:45:56.060 | - I personally don't care about money
02:45:57.420 | as I've mentioned before,
02:45:58.620 | but I, for some reason, really don't like spending money
02:46:02.220 | on things that are not worth it.
02:46:05.800 | Like I don't care if I get money,
02:46:08.060 | I just don't like spending money on,
02:46:10.020 | like feeding a system that's inefficient.
02:46:12.820 | It's like I'm contributing to the problem.
02:46:14.460 | That's my biggest problem.
02:46:16.100 | - Right, so you think that you're worried
02:46:17.620 | about the inefficiencies of the-
02:46:19.820 | - Yeah, the fact that-
02:46:20.660 | - Like the overheads, the number of people involved or-
02:46:23.220 | - The overhead, the emails again,
02:46:26.120 | the fact that they have this way of speaking,
02:46:30.300 | which I'm allergic to, that many people have,
02:46:32.340 | like that's very marketing speak.
02:46:34.640 | Like you could tell they've been having Zoom meetings
02:46:36.780 | all day.
02:46:37.780 | It's like, as opposed to sort of creative collaborators
02:46:42.780 | who are also a little bit crazy.
02:46:45.700 | - Yeah, well-
02:46:46.540 | - And I suppose some of that is finding the right people.
02:46:48.160 | - Finding the right people, that's what I would say.
02:46:49.540 | I say there's definitely, and maybe it's just good fortune,
02:46:52.500 | good fortune in terms of like my agents
02:46:55.140 | and editors I've worked with,
02:46:55.980 | there's really good people who see the vision,
02:47:00.980 | are smart, are incredibly literary.
02:47:02.660 | - And they actually help you.
02:47:03.500 | - Yeah, and like let's-
02:47:04.340 | - Basically.
02:47:05.180 | - Yeah, I had a great editor when I was first moving
02:47:06.820 | into hardcover books, for example.
02:47:08.740 | It was my first big book advance
02:47:12.260 | and my first sort of big deal.
02:47:14.660 | And he was like a senior editor and it was very useful.
02:47:19.180 | He was like, we had a lot of long talks, right?
02:47:21.340 | I was, so this was my fourth book,
02:47:23.140 | "So Good They Can't Ignore You"
02:47:23.980 | was my first big hardcover idea book.
02:47:26.600 | And we had a lot of talks,
02:47:29.260 | like even before I started writing it,
02:47:31.020 | just let's talk about books and his philosophy.
02:47:34.020 | He'd been in the business for a long time.
02:47:35.540 | He was the head of the imprint.
02:47:37.020 | It was useful.
02:47:38.420 | - Yeah, but I mean, the other frustrating thing
02:47:40.980 | is how long the whole thing takes.
02:47:42.540 | - Takes a long time.
02:47:44.220 | - Yeah, I suppose that's, you just have to accept that.
02:47:46.300 | - Well, yeah, I handed in this manuscript
02:47:48.020 | for the book that comes out now,
02:47:53.580 | I handed it in, I mean, over the summer,
02:47:53.580 | like during the pandemic.
02:47:54.420 | So it's not terrible, right?
02:47:56.420 | And we were editing during the pandemic
02:47:57.940 | and I finished it in the spring.
02:48:00.460 | - We've talked most of today,
02:48:02.260 | except for a little bit computer science,
02:48:03.620 | most of today about a productive life.
02:48:05.560 | How does love, friendship, and family fit into that?
02:48:11.200 | Is there, do you find that there's a tension?
02:48:16.200 | Is it possible for relationships
02:48:19.660 | to energize the whole process, to benefit?
02:48:22.500 | Or is it ultimately a trade-off,
02:48:25.460 | where because life is short
02:48:27.540 | and ultimately we seek happiness, not productivity,
02:48:32.820 | we have to accept that tension?
02:48:35.060 | - Yeah, I mean, I think relationships is the,
02:48:39.220 | that's the whole deal.
02:48:41.540 | Like I thought about this the other day,
02:48:43.780 | I don't know what the context was.
02:48:44.700 | I was thinking about if I was gonna give
02:48:46.180 | like an advice speech, like a commencement address,
02:48:49.060 | or like giving advice to young people.
02:48:51.700 | And like the big question I have for young people
02:48:55.260 | is if they haven't already, bad things are gonna happen
02:48:58.600 | that you don't control, so what's the plan, right?
02:49:01.820 | Like, let's start figuring that out now
02:49:03.580 | because it's not all good.
02:49:05.620 | Some people get off better than others,
02:49:07.460 | but eventually stuff happens, right?
02:49:09.980 | You get sick, something falls apart,
02:49:11.900 | the economy craters, someone you know dies,
02:49:16.020 | like all sorts of bad stuff is gonna happen, right?
02:49:19.020 | So how are we gonna do this?
02:49:20.660 | Like, how do we like live life when life is hard?
02:49:23.100 | And in ways that is unfair and unpredictable.
02:49:25.780 | Then relationships is the,
02:49:27.860 | that's the buffer for all of that.
02:49:29.560 | 'Cause we're wired for it, right?
02:49:31.540 | I went down this rabbit hole with digital minimalism.
02:49:34.180 | I went down this huge rabbit hole
02:49:35.460 | about the human brain and sociality.
02:49:38.820 | It's all wired, it's like all of our brain is for this.
02:49:41.300 | Like everything, all of our mechanisms,
02:49:43.380 | everything is made to service social connections
02:49:46.420 | because it's what kept you alive.
02:49:47.460 | You know, I mean, your tribal connections
02:49:49.700 | are how you didn't starve during a famine,
02:49:52.060 | people would share food, et cetera.
02:49:53.820 | And so you can't neglect that, and it's like everything.
02:49:57.180 | And people feel it, right?
02:49:58.380 | Like, our social networks
02:50:00.060 | are hooked up to the pain center.
02:50:01.740 | It's why it feels so terrible when you miss someone
02:50:04.020 | or like someone dies or something, right?
02:50:05.340 | That's like how seriously we take it.
02:50:07.700 | There's a pretty accepted theory
02:50:09.300 | that a lot of what the default mode network is doing,
02:50:10.520 | the sort of default state our brain goes into
02:50:12.580 | when we're not doing something in particular,
02:50:16.300 | is practicing sociality, practicing interactions,
02:50:19.580 | because it's so crucial to what we do.
02:50:22.580 | It's like at the core of human thriving.
02:50:25.140 | So more recently, the way I think about it
02:50:27.100 | is relationships first.
02:50:28.860 | Like, okay, given that foundation,
02:50:31.620 | and I don't think we put nearly enough time into it.
02:50:33.220 | I worry that social media is reducing relationships,
02:50:36.100 | strong relationships.
02:50:37.460 | Strong relationships where you're sacrificing
02:50:39.180 | non-trivial time and attention,
02:50:42.140 | resources, whatever, on behalf of other people.
02:50:44.620 | That's the net that is gonna allow you
02:50:46.460 | to get through anything.
02:50:48.260 | Then, all right, now what do we wanna do
02:50:51.060 | with the surplus that remains?
02:50:53.580 | Maybe I wanna build some fire, build some tools.
02:50:55.940 | - So put relationships first.
02:50:57.740 | I like the worst case analysis
02:50:59.300 | from the computer science perspective.
02:51:01.220 | Put relationships first.
02:51:03.420 | Yeah, because everything else is just
02:51:05.260 | assuming average case,
02:51:08.700 | assuming things kind of keep going as they were going.
02:51:10.780 | - And you're neglecting the fundamental human drive.
02:51:13.420 | Like we have this, we talk about the boredom instinct.
02:51:15.380 | We wanna build things, we wanna have impact,
02:51:17.020 | we wanna do productivity.
02:51:17.860 | That's not nearly as clear-cut a drive as our need for people.
02:51:21.460 | - But if we look at the real worst case analysis here
02:51:25.660 | is one day, you're pretty young now,
02:51:28.940 | but that's not gonna last very long.
02:51:31.900 | You're gonna die one day.
02:51:33.060 | Is that something you think about?
02:51:35.260 | - Little bit.
02:51:36.420 | - Are you afraid of death?
02:51:37.660 | - Well, I'm of the mindset of,
02:51:39.460 | let's make that a productivity hack.
02:51:41.980 | I'm of the mindset of,
02:51:43.260 | we need to confront that soon.
02:51:47.100 | So let's do what we can now
02:51:48.700 | so that when we really confront and think about it,
02:51:50.620 | we're more likely to feel better about it.
02:51:52.780 | So in other words, like let's focus now
02:51:54.700 | on living and doing things in such a way
02:51:57.020 | that we're proud of,
02:51:58.660 | so that when it really comes time to confront that,
02:52:00.740 | we're more likely to say like,
02:52:02.900 | okay, I feel kind of good about the situation.
02:52:05.100 | - So what, when you're laying in your deathbed,
02:52:07.580 | would you, in looking back,
02:52:09.700 | what would make you think like,
02:52:11.060 | oh, I did okay, I'm proud of that.
02:52:13.900 | I optimized the hell out of that.
02:52:15.900 | - That's a good, I mean, it's a good question
02:52:17.180 | to go backwards on.
02:52:19.940 | I mean, this is like David Brooks's eulogy
02:52:24.140 | virtues versus résumé virtues, right?
02:52:26.980 | So his argument is that,
02:52:28.900 | and that's another interesting DC area person.
02:52:30.700 | I keep thinking of interesting DC area people.
02:52:32.940 | All right, David Brooks is here too.
02:52:34.740 | His argument, he thinks eulogy virtues is,
02:52:38.300 | so what we eulogize is different
02:52:39.740 | than what we promote on the resume.
02:52:42.060 | That's his whole thing now, right?
02:52:44.100 | "The Second Mountain," "The Road to Character,"
02:52:45.980 | both these books are,
02:52:47.460 | he has this whole premise
02:52:48.700 | of there's like this professional phase
02:52:50.100 | and there's a phase of giving of yourself
02:52:53.260 | and sacrificing on behalf of other people.
02:52:55.700 | I don't know, maybe it's all mixed together, right?
02:52:57.260 | You wanna, I think living by a code is important, right?
02:53:00.260 | I mean, this is something that's not emphasized enough.
02:53:03.140 | I always think of advice
02:53:03.980 | that my undergrad should be given,
02:53:05.700 | that they're not given,
02:53:06.660 | especially at a place like Georgetown
02:53:07.740 | that has this like deep history of,
02:53:10.260 | you know, trying to promote human flourishing
02:53:11.780 | because of the Jesuit connection.
02:53:13.420 | There's such resiliency and pride
02:53:19.300 | that comes out of living well,
02:53:21.420 | even when it's hard,
02:53:22.260 | living according to a code,
02:53:23.540 | which, you know,
02:53:25.220 | I think religion used to structure this for people,
02:53:28.540 | but in its absence, you need some sort of replacement.
02:53:31.900 | Soldiers get this a lot, right?
02:53:33.540 | They experience this a lot.
02:53:34.380 | Even when things were tough,
02:53:35.220 | I was able to persist in living in this way
02:53:36.780 | that I knew was right,
02:53:37.620 | even though it wasn't the easiest thing
02:53:38.500 | to do in the moment.
02:53:39.340 | Like, few things give humans more resiliency.
02:53:42.060 | It's like having done that,
02:53:43.980 | your relationships were strong, right?
02:53:45.740 | Many people coming to your funeral is a standard.
02:53:47.500 | Like a lot of people are gonna come to your funeral,
02:53:48.900 | like that means you matter to a lot of people.
02:53:51.180 | And then maybe having tried,
02:53:52.380 | to the extent of whatever capabilities
02:53:55.660 | you happen to be granted, you know,
02:53:57.860 | and they're different for different people,
02:53:59.420 | some people can sprint real fast
02:54:00.740 | and some people can do math problems,
02:54:02.860 | to actually do something of impact.
02:54:04.860 | - I'll just promise to give gift cards
02:54:08.540 | to anybody who shows up to the funeral.
02:54:10.220 | - You're gonna hack it?
02:54:11.380 | - I'm gonna hack even the funeral.
02:54:12.900 | - There's gonna be a lottery wheel you spin
02:54:14.740 | when you come in and someone goes away with $10,000.
02:54:17.940 | - See, the problem is like,
02:54:19.340 | with all this, the living by principles,
02:54:22.180 | living a principle life, focusing on relationships
02:54:24.300 | and kind of thinking of this life as this perfect thing
02:54:27.780 | kind of forgets the notion that none of it,
02:54:30.220 | you know, makes any sense, right?
02:54:34.060 | Like the, like it kind of implies
02:54:37.980 | that this is like a video game
02:54:40.340 | and you wanna get a high score,
02:54:42.420 | as opposed to none of this even makes sense.
02:54:45.820 | Like why would he, like what the?
02:54:47.940 | (laughing)
02:54:49.780 | Like what does it even mean to die?
02:54:52.700 | It's gonna be over.
02:54:53.740 | It's like everything I do, all of these productivity hacks,
02:54:57.820 | all this life, all these efforts, all this creative efforts,
02:55:00.180 | kind of assume it's gonna go on forever.
02:55:02.220 | There's a kind of a sense of immortality
02:55:05.180 | and I don't even know how to intellectually
02:55:06.700 | make sense that it ends.
02:55:08.380 | Of course, gotta ask you in that context,
02:55:10.820 | what do you think is the meaning of it all?
02:55:13.220 | Especially for a computer scientist,
02:55:14.780 | I mean, there's gotta be some mathematical--
02:55:17.420 | - Yeah, 27 or what's the--
02:55:19.740 | - What's the Douglas Adams?
02:55:21.300 | - Yeah, 42, okay.
02:55:23.420 | - 27 is a better number.
02:55:24.820 | - I should read more sci-fi.
02:55:26.700 | - Maybe you're onto something with a 27.
02:55:28.780 | - I don't wanna give away too much,
02:55:30.020 | but just trust me, 27.
02:55:31.940 | - It's invisible, yeah.
02:55:33.180 | - So, I mean, I don't know, obviously, right?
02:55:37.060 | I mean, I'm a--
02:55:38.060 | - So be it.
02:55:38.900 | - Yeah, I don't know, but going back to what you were saying
02:55:41.860 | about the sort of the existentialist
02:55:43.460 | or sort of the more nihilist style approach,
02:55:47.140 | the one thing that there is are intimations, right?
02:55:53.700 | So there's these intimations that humans have
02:55:53.700 | of somehow this feels right and this feels wrong,
02:55:56.540 | this feels good, this feels like I'm doing,
02:55:59.180 | I'm aligned with something, you know,
02:56:01.020 | when I'm acting with courage to save whatever, right?
02:56:05.540 | These intimations are a grounding
02:56:05.540 | against arbitrariness.
02:56:07.180 | Like one of the ideas I'm really interested in is that
02:56:09.820 | when you look at religion, right?
02:56:13.500 | So I'm interested in world religions.
02:56:15.820 | My grandfather was like a theologian
02:56:18.380 | that studied and wrote all these books,
02:56:19.540 | and I'm very interested in this type of stuff.
02:56:21.420 | And there's this great book that's,
02:56:24.620 | it's not specific to a particular religion,
02:56:27.580 | but it's Karen Armstrong wrote this great book
02:56:29.300 | called "The Case for God."
02:56:30.900 | She's very interesting.
02:56:31.740 | She was a Catholic nun who sort of left that religion,
02:56:34.700 | but one of the smartest thinkers
02:56:36.780 | in terms of like accessible theological thinking
02:56:40.300 | that's not tied to any particular religion.
02:56:43.020 | Her whole argument is that the way to understand religion,
02:56:45.540 | you first of all, you have to go way back pre-enlightenment
02:56:47.660 | where all this was formed.
02:56:48.540 | We got messed up thinking about religion post-enlightenment,
02:56:51.140 | right?
02:56:51.980 | And these were operating systems
02:56:54.780 | for making sense of intimations.
02:56:56.420 | The one thing we had were these different intimations
02:57:00.340 | of this feel like awe and mystical experience.
02:57:03.860 | And this feels, there's something you feel
02:57:05.980 | when you act in a certain way
02:57:07.300 | and don't act in this other way.
02:57:08.860 | And it was like the scientists who were trying to study
02:57:12.060 | and understand the model of the atom
02:57:13.620 | by just looking at experiments
02:57:15.420 | and trying to understand what's going on.
02:57:16.700 | Like the great religions of the world
02:57:18.420 | were basically figuring out
02:57:19.900 | how do we make sense of these intimations
02:57:21.460 | and live in alignment with them
02:57:23.100 | and build a life of meaning around that?
02:57:24.980 | What were the tools they were using?
02:57:26.340 | They were using ritual, they were using belief,
02:57:28.260 | they were using action.
02:57:29.620 | But all of it was like an OS.
02:57:31.220 | It was like a liturgical model of the atom.
02:57:33.900 | - It's hard coded in.
02:57:36.020 | So it got there through the evolutionary process.
02:57:39.300 | - I mean, they wouldn't have called it that back then.
02:57:41.900 | Yeah, I mean, they didn't have that as pre-enlightenment.
02:57:45.500 | They just said this is here.
02:57:46.980 | And the directive is to try to live in alignment with that.
02:57:50.940 | - Well, then I wanna ask who wrote the original code.
02:57:53.460 | - Yeah, so-- - That's the open question.
02:57:54.980 | - Yeah, so Armstrong lays out this good argument.
02:57:56.820 | And where it gets really interesting
02:57:58.260 | is that she emphasizes that
02:58:00.740 | all of this was considered ineffable, right?
02:58:03.260 | So the whole notion,
02:58:04.500 | and this is like rich in Jewish tradition in particular
02:58:06.580 | and also in Islamic tradition,
02:58:08.340 | we can't comprehend and understand what's going on here.
02:58:11.540 | And so the best we can do to approximate understanding
02:58:14.020 | and live in alignment is we act as if this is true,
02:58:17.020 | do these rituals, have these actions or whatever.
02:58:20.020 | Post-enlightenment, a lot of that changed;
02:58:21.900 | once we had the Enlightenment,
02:58:24.940 | we grew these suspicions around religion
02:58:26.780 | that are very much of the modern era, right?
02:58:28.860 | So like to Karen Armstrong,
02:58:30.860 | like Sam Harris's critique of religion makes no sense.
02:58:33.740 | The critique's based on, well,
02:58:36.100 | you're giving assent to propositions
02:58:37.700 | that you think are true for which
02:58:38.660 | you do not have evidence that they are true.
02:58:39.900 | She's like, that's an enlightenment thing, right?
02:58:41.500 | This is not the context and this is not,
02:58:43.660 | the religion is the Rutherford model of the atom.
02:58:46.740 | Like it's not actually maybe what is underneath happening,
02:58:50.300 | but this model explains why your chemical equations work.
02:58:52.660 | And so this is like the way religion was.
02:58:54.980 | There's a God, we'll call it this, this is how it works.
02:58:57.140 | We do this ritual, we act in this way, it aligns with it.
02:58:59.500 | Just like the model of the atom predicted why
02:59:02.020 | Na and Cl are gonna become salt,
02:59:03.580 | this predicts that you're gonna feel
02:59:04.940 | and live in alignment, right?
02:59:06.060 | It's like this beautiful, sophisticated theory,
02:59:08.340 | which actually matches how a lot of great theologians
02:59:11.060 | have thought about it.
02:59:13.540 | But then when you come forward in time,
02:59:14.860 | yeah, maybe it's evolution.
02:59:15.820 | I mean, this is like what Peterson hints at, right?
02:59:18.380 | Like he's basically, he doesn't like to get
02:59:23.380 | super pinned down on this, but it kind of seems where he--
02:59:26.660 | - He's almost like searching for the words.
02:59:29.100 | - He focuses more on like Jung and other people,
02:59:30.980 | but I mean, I know he's very Jungian,
02:59:33.140 | but that same type of analysis, I think, roughly speaking,
02:59:35.820 | like Armstrong is sort of a,
02:59:37.780 | it's kind of like a Petersonian analysis,
02:59:39.380 | but she's looking more at the deep history of religion.
02:59:42.700 | But yeah, he throws in an evolutionary aspect.
02:59:46.540 | - And I wonder what home it finds.
02:59:48.500 | I wonder what the new home is if religion dissipates,
02:59:51.860 | what the new home for these kinds of
02:59:54.100 | natural inclinations are.
02:59:55.580 | - Yeah.
02:59:57.500 | - Whether it's technology, whether--
02:59:58.780 | - And if it's evolution, I mean,
02:59:59.820 | this is Francis Collins' book also.
03:00:01.500 | He's like, well, that's a religious,
03:00:04.420 | that could be a very religious notion.
03:00:06.500 | I don't, I think this stuff is interesting.
03:00:07.580 | I'm not a very religious person,
03:00:08.700 | but I'm thinking it's not a bad idea.
03:00:11.900 | Maybe what replaces, honestly,
03:00:13.500 | like maybe what replaces religion is a return to religion,
03:00:17.220 | but in this sort of more sophisticated,
03:00:20.060 | I mean, if you went back, yeah, I mean,
03:00:22.020 | it's the issue with like a lot of the recent critiques,
03:00:24.860 | I think it's a stronger critique in a complicated way,
03:00:29.460 | right, because the whole way these,
03:00:31.060 | the way this works, I mean, the theologians,
03:00:33.260 | if you're reading Paul Tillich, if you're reading Heschel,
03:00:35.940 | if you're reading these people,
03:00:36.980 | they're thinking very sophisticatedly
03:00:38.620 | about religion in terms of this.
03:00:39.860 | It's ineffable and we're just these things
03:00:42.020 | and this is as deep, it connects us to these things
03:00:44.220 | in a way that puts life in alignment.
03:00:46.060 | We can't really explain what's going on
03:00:47.180 | because our brains can't handle it, right?
03:00:50.060 | For the average person though,
03:00:52.380 | this notion of "live as if" is kind of how religions work:
03:00:55.420 | live as if this is true.
03:00:57.620 | It's like an OS for getting in alignment,
03:01:00.140 | because through cultural evolution,
03:01:02.780 | behaving in this way, doing these rituals,
03:01:04.220 | living as if this is true,
03:01:06.300 | gives you the goal you're looking for.
03:01:10.140 | But that's a complicated thing, live as if this is true,
03:01:12.380 | because if you, especially if you're not a theologian,
03:01:15.020 | say, yeah, this is not true in an enlightenment sense,
03:01:18.620 | but I'm living as if, it kind of takes the heat out of it.
03:01:21.420 | But of course it's what people are doing
03:01:22.820 | because highly religious people still do bad things
03:01:25.660 | where if you really believed there's absolutely a hell
03:01:28.380 | and I'm definitely gonna go to it if I do this bad thing,
03:01:30.100 | no one would ever murder anyone
03:01:33.140 | if they were an evangelical Christian, right?
03:01:34.780 | So this is kind of a tangent
03:01:37.380 | where I'm on shaky ground,
03:01:39.540 | but it's something I've been interested in off and on a lot.
03:01:43.300 | - Well, it's fascinating.
03:01:44.220 | I mean, I think we're in some sense searching for,
03:01:46.620 | 'cause it does make for a good operating system.
03:01:48.660 | We're searching for a good "live as if X is true,"
03:01:52.060 | and we're searching for a new X.
03:01:54.260 | And maybe artificial intelligence will be
03:01:58.800 | the new gods that we're so desperately looking for.
03:02:02.540 | - Or it'll just spit out 42.
03:02:04.000 | - I thought it was 27.
03:02:06.460 | Cal, this is, as you know, I've been a huge fan.
03:02:09.800 | So are a huge number of people that I've spoken with.
03:02:14.700 | So they keep telling me, I absolutely have to talk to you.
03:02:17.380 | This was a huge honor.
03:02:18.380 | This was really fun.
03:02:19.220 | Thanks for wasting all this time with me.
03:02:20.700 | - Yeah, no, likewise, man.
03:02:21.540 | I've been a long time fan.
03:02:22.440 | So this was a lot of fun.
03:02:23.500 | - Yeah, thanks man.
03:02:24.500 | Thanks for listening to this conversation
03:02:27.060 | with Cal Newport.
03:02:28.140 | A thank you to our sponsors, ExpressVPN,
03:02:31.860 | Linode Linux Virtual Machines,
03:02:34.260 | Sun Basket Meal Delivery Service,
03:02:36.580 | and SimpliSafe Home Security.
03:02:39.300 | Click the sponsor links to get a discount
03:02:42.180 | and to support this podcast.
03:02:44.460 | And now let me leave you with some words from Cal himself.
03:02:47.860 | "Clarity about what matters
03:02:49.700 | "provides clarity about what does not."
03:02:52.960 | Thank you for listening and hope to see you next time.
03:02:55.800 | (upbeat music)
03:02:58.380 | (upbeat music)