Ep. 237. Is It Time To Rethink The Internet?


Chapters

0:00 Cal's intro
7:28 Today’s Deep Question - Should we consider a radically reimagined internet?
23:22 Cal talks about Henson Shaving and Huel
29:30 Can we build a better internet?
36:27 Will the coming augmented reality make the deep life impossible?
44:35 How do I become more disciplined about shutting down my phone?
49:44 Is the internet alive?
55:49 How does Cal see the connection between his professor and writer roles?
56:00 Cal talks about Mint Mobile and Policy Genius
71:04 Something Interesting: Michael Cera Doesn’t Use a Smartphone


00:00:00.000 | what would happen if there was a radically reimagined internet? Would that be good or bad?
00:00:04.480 | That's the deep question I want to explore in today's episode. Should we consider a radically
00:00:10.160 | reimagined internet? I'm Cal Newport, and this is Deep Questions,
00:00:22.960 | the show about living and working deeply in an increasingly distracted world.
00:00:30.320 | I'm here in my Deep Work HQ, joined as always by my producer, Jesse. You know, Jesse, one thing
00:00:39.680 | that goes on in this town in DC that's rarely relevant to our show is the Supreme Court,
00:00:45.280 | is obviously based here. The week that we're recording this, which is the week before it
00:00:50.080 | comes out, is actually one of the rare instances where the dealings of the Supreme Court are
00:00:55.520 | relevant to the type of things we talk about. Yeah. So I figure I can be like a sophisticated
00:01:00.960 | court-watcher journalist today and talk a little bit about this case. But it's an
00:01:07.040 | important case because it, in theory, could reshape the internet. Now, I don't know if that's
00:01:13.440 | actually going to happen, but they're hearing a case that in theory could reshape the internet.
00:01:17.840 | And I wanted to talk about it briefly because I think it gets to some deeper ideas about the
00:01:21.680 | internet that I would like to discuss. So I want to load up here on the screen,
00:01:27.440 | for those who are watching at youtube.com/calnewportmedia, this is episode 237. You'll
00:01:33.120 | see this on the screen, but for those who are listening, I'll narrate. This article is from
00:01:37.680 | the Times. This is Adam Liptak's coverage of this court case. He did a good job at this.
00:01:46.480 | Let's talk briefly about what's going on. So I'm pulling from this New York Times
00:01:49.600 | summary to talk about what's going on in this case. All right. The case in question argued
00:01:55.680 | in the Supreme Court last week was brought by the family of Nohemi Gonzalez, a 23-year-old
00:02:02.160 | college student who was killed in a restaurant in Paris during the terrorist attacks in November
00:02:06.320 | 2015. A lawyer for the family argued that YouTube, which is a subsidiary of Google,
00:02:14.080 | bore responsibility because it had used algorithms to push Islamic State videos to interested viewers
00:02:18.800 | using information the company had collected about them. So this case is Gonzalez v. Google. So it's
00:02:26.880 | the Gonzalez family suing Google because Google owns YouTube. All right. So they're saying YouTube
00:02:33.760 | is partially responsible for our son's death. Now, the core legal issue at stake here is found,
00:02:42.640 | and I'm scrolling to find the exact text here. It's a small section, Section 230, actually,
00:02:49.120 | it's going to be very specific, Section 230(c)(1) of the 1996 Communications Decency Act.
00:02:55.680 | The Section 230, so it's a part of U.S. law, has the following sentence in it.
00:03:00.960 | "No provider or user of an interactive computer service shall be treated as the publisher or
00:03:07.040 | speaker of any information provided by another information content provider." So as this New York Times
00:03:13.520 | article clarifies, what this means is it has enabled the rise of social networks like Facebook
00:03:18.560 | and Twitter by ensuring that the sites did not assume legal liability for every post.
00:03:23.120 | So Section 230 was put into law in the '90s. It was actually spurred by a lawsuit that was
00:03:30.960 | raised, I believe, against the early internet service provider, Prodigy, about something
00:03:36.240 | defamatory that was posted on a Prodigy bulletin board. And this lawsuit back in the 90s argued,
00:03:43.760 | "You can't just say you're a passive host of information because you do moderate comments.
00:03:49.920 | You do take some comments down. So the fact that you didn't take this comment down meant you,
00:03:53.360 | Prodigy, are responsible for this information, and we are going to sue you." And in response to that,
00:03:58.960 | we get Section 230, which says, "No, no, no, it's not the internet service provider's fault.
00:04:02.960 | Even if they are moderating the content, even if they are trying to get rid of some things,
00:04:08.560 | they still won't be held responsible for what remains." Many court cases subsequent to the
00:04:13.920 | passing of the law containing Section 230 reinforced this idea that Section 230 gives
00:04:20.000 | immunity from liability for internet service providers. So at the core of this case, the
00:04:26.960 | Gonzalez family's lawyers are arguing that they should be able to pierce the Section 230 protection
00:04:31.760 | because new services like YouTube, when I say new, I mean new as compared to Prodigy,
00:04:38.160 | as compared to 1996 internet, YouTube, Twitter, Instagram, Facebook, these services aren't just
00:04:46.240 | a repository of information like a bulletin board that's perhaps moderated. They are using algorithms
00:04:52.160 | to curate information, to learn about their users, and to recommend information to those users. And
00:04:57.360 | so the Gonzalez family is arguing, once you are algorithmically curating data, the Section 230
00:05:04.480 | protection no longer holds. You're not a bulletin board. You are an entertainment company. You're
00:05:10.800 | making decisions about what you're showing people. There should be some liability for that.
00:05:16.240 | Right? So that's the issue at stake. Is anything going to happen from this case?
00:05:21.760 | The consensus seems to be no. Most of the coverage I've read of this case, from the court watchers who
00:05:29.920 | studied the hours of back and forth conversation between the Supreme
00:05:35.360 | Court justices and the Gonzalez family's lawyer, as well as some government lawyers
00:05:39.360 | involved; the Biden administration actually has a lawyer arguing on behalf of the Gonzalez family.
00:05:43.840 | It's a very complicated legal picture, but the consensus seems to be the judges are not going
00:05:48.400 | to make a major change to the interpretation of Section 230. I'll give you a quote right here.
00:05:54.400 | Elena Kagan said, "You know, these are not like the nine greatest experts on the internet."
00:05:59.840 | I disagree with that. When I think of internet high-tech experts, you know who I think about?
00:06:05.040 | Clarence Thomas. Imagine him with like seven or eight computer screens up as he has code compiling
00:06:11.600 | over here and an OpenGL virtual reality environment that he's building over there, fiercely typing in
00:06:19.840 | Vim with an ergonomic keyboard. So they do recognize they're no internet experts.
00:06:25.040 | Brett Kavanaugh said, "You know, if we change the interpretation here, this would really crash
00:06:29.680 | the digital economy." So like they're really wary of the impact even a minor change to the
00:06:34.400 | interpretation would have. He noted that making changes to 230 is probably a job for Congress: they
00:06:41.120 | passed the original law. They should pass clarifying laws, right? So the court is not going to make a
00:06:45.440 | major change to 230. But it got me thinking, what if they did? It's an interesting daydream.
00:06:56.000 | What if the court came back and significantly reduced 230's protection of companies that
00:07:06.000 | algorithmically curate or recommend information? In other words, what would happen if there was a
00:07:12.880 | radically re-imagined internet? Would that be good or bad? That's the deep question I want
00:07:16.640 | to explore in today's episode. Should we consider a radically re-imagined internet?
00:07:23.440 | So like we always do, we're going to tackle this in three segments. Segment one, let's go a little
00:07:28.720 | bit deeper. We'll go a little bit deeper on this question of what would it be like if we radically
00:07:34.160 | changed Section 230 and therefore radically changed the internet? How would it be better? How
00:07:39.120 | would it be worse? Then we'll do some questions, five questions from you, the listeners that are
00:07:43.120 | relevant to this topic. I'm going to be a little broad there in what I count as relevant to this
00:07:48.720 | topic. So the five questions will all orbit roughly around rethinking our relationship to technology,
00:07:55.520 | such as the internet. So we'll have some hardcore new visions of internet questions, but also some
00:08:00.720 | more personal. How do I personally change my relationship to the internet? And a few in
00:08:05.360 | between. And then the third segment of the show, as usual, we'll shift gears and do something
00:08:10.880 | interesting. All right. So let's deep dive into this a little bit. What would happen if we
00:08:17.040 | radically changed section 230? We can start with existing visions. So people who are advocating
00:08:25.280 | for changes to section 230, we can start with what their vision is for how the internet would change.
00:08:30.000 | And then we'll go forward to the vision I have in mind. Here's the thing about section 230.
00:08:35.920 | At the moment, it has opponents, so people who want to see it changed on both the American
00:08:41.920 | political right and the American political left. I don't know how the social media companies have
00:08:49.040 | managed to do this time and again, but time and again, they have managed to alienate all sides
00:08:53.040 | of the political spectrum. It really is a case study in public relations done terribly wrong.
00:08:57.920 | So there's already a clamoring for changes to 230 from both the right and the left.
00:09:02.240 | So the American political right, which I think was the first to really start questioning Section 230,
00:09:09.200 | what they are looking for is accountability on moderation. So their concern is, you can't just
00:09:18.640 | take me off of your platform, kick me off Twitter, remove or demonetize my videos on YouTube
00:09:24.640 | without being held responsible for why. And if your reason is simply, we don't like your politics
00:09:29.520 | or these particular questions are bad for a particular, let's say progressive policy that
00:09:35.920 | we want to promote, then you, the major platform, should be held accountable. So the right
00:09:39.360 | really comes at it from, you can't just take stuff off the internet for no good reason.
00:09:44.640 | You can be held liable for why you're doing that. And it can't be discriminating on political
00:09:48.880 | beliefs or something like that. So this is the right's approach to section 230. They think it's
00:09:52.560 | giving too much cover for these companies to do whatever they want. Interestingly, there's a new
00:09:57.520 | case coming to the Supreme Court later this session that gets specifically at a First Amendment
00:10:06.080 | argument about whether or not services can remove individuals based on
00:10:11.440 | things like political beliefs. The American political left came to dislike Section 230
00:10:15.680 | a little bit more recently, and they want the opposite. They want to reduce 230 protections so
00:10:22.640 | that companies can be held more responsible for what they allow to be on their services.
00:10:27.840 | So they want to make sure that if you, as a Facebook or a Twitter, publish something that is
00:10:36.800 | really damaging, you can be held liable. So they want to actually be able to essentially
00:10:42.320 | punish interactive services for what is on their platform. The right wants to be able to punish or
00:10:48.400 | hold accountable interactive services for what they take off the platform. They're coming at
00:10:52.160 | 230 from two different angles. Here's my take. Neither of these two visions of what the internet
00:10:57.920 | could be like with 230 revisions, as we could call them, is nearly radical enough. If you
00:11:06.480 | look at the commentary from the right and left, they're imagining a new internet in which it's
00:11:10.240 | basically like the current internet. There's a small number of massive platform monopolies through
00:11:15.200 | which most people interact with each other and the information landscape. Each of
00:11:20.160 | these two sides just wants those existing landscapes to be more friendly to their team.
00:11:24.080 | And this kind of makes sense if you think about who benefits. If you're a political media figure talking about
00:11:31.280 | this, it is in your interest for there to be these highly concentrated information platforms that
00:11:37.520 | everyone uses, because if you have influence on, let's say Twitter, that's massive influence. You
00:11:42.800 | want these platforms to be around. You want them around because they can give you a lot of power
00:11:47.440 | in the information landscape. You just want them to be more friendly to the things you care about.
00:11:51.760 | So I think the current visions for 230 reinterpretations coming from the American
00:11:57.520 | political spectrum are not that interesting to me because they don't get to the core of what we do
00:12:01.920 | on the internet. So let's talk about a more radical thought experiment. As long as we're daydreaming,
00:12:07.600 | let's talk about a more radical thought experiment. Imagine the justices came back and said, no,
00:12:11.920 | we're not just tweaking 230. We're essentially going to get rid of any implied protection for
00:12:20.880 | any type of service that does any sort of curation or recommendation of information.
00:12:25.600 | So in this vision, I'm thinking about, you know, it's okay if you're a search engine,
00:12:31.600 | because you're not hosting the information. Google is just pointing you towards sites that
00:12:36.800 | already exist. If you're a web hosting company, you're fine. You're just running a bunch of
00:12:42.080 | servers that people can host WordPress instances on and websites, but you're doing no curation.
00:12:47.840 | You're doing no recommendation. You're just running the racks there. But if you're Facebook,
00:12:52.800 | nope, you're liable for everything on Facebook. If you're Twitter, you're liable for everything
00:12:56.320 | people put on Twitter. If you are Instagram, you're liable for everything that goes on Instagram.
00:13:01.440 | YouTube is liable for every video, as long as they're curating and recommending the videos.
00:13:05.680 | And let's imagine in this extreme thought experiment that because of that,
00:13:09.760 | those major algorithmic curated content centric companies all go away. It becomes financially
00:13:17.920 | unviable. So essentially any company whose business model is built on third-party users
00:13:24.160 | supplying content for free, that they then use algorithms to help sort and display back to users
00:13:30.480 | to capture their attention. Any company doing that, what we could call the algorithmic internet,
00:13:34.640 | basically they go away. It becomes financially unviable because the lawsuits are myriad and
00:13:40.960 | unrelenting. All right, so let's ask about what this would be like. So if this algorithmic internet
00:13:48.080 | went away because the courts waved this wand, what would be left? What would be the consequences?
00:13:54.080 | Would the internet that remains be usable? Would it be a step backwards? What would we face?
00:13:59.600 | And there would obviously be a lot of short-term pain. A lot of people would lose their jobs in
00:14:03.920 | the tech sector at first, because these are big employers. A lot of stock portfolios would go
00:14:09.200 | down because a lot of stock portfolios have heavy tech sector investment. That would eventually shake
00:14:15.120 | itself out. In the optimistic view, all this talent would eventually move to perhaps more
00:14:20.000 | productive applications of this talent. Instead of saying, let's have the best and brightest trying
00:14:25.680 | to capture the attention of 17-year-olds better on their phones, these people might now start
00:14:30.480 | thinking about building tools or new technologies that are more generally productive, more generally
00:14:36.240 | beneficial to society. But that would be really disruptive. But what about the average individual
00:14:42.080 | internet user? I would argue, at least as long as we're daydreaming, as long as we're ignoring some
00:14:48.000 | of the inevitable rough edges here, the internet that remains would be more human and would actually
00:14:54.080 | be not such a bad place to be. So in this world where all the algorithmic internet giants are gone,
00:15:02.560 | I can imagine sort of three tiers of different internet sites or services. So at the top,
00:15:09.520 | you would still have all of those old-fashioned news sites and streaming services, all of those
00:15:15.680 | content media companies that subscribe to old-fashioned beliefs like having editors,
00:15:22.400 | paying people who are good at what they do to produce the content that you show to people.
00:15:26.560 | So we're talking, you know, the New York Times.com, New Yorker.com, Netflix, Hulu, of course,
00:15:32.800 | all that's still there, right? They already have an editorial liability stance. The New York Times
00:15:38.640 | is responsible for everything they publish. Netflix is responsible for every show they put on
00:15:41.920 | there. So nothing would change for them. So you would have sort of high-end, high-quality, edited,
00:15:46.320 | professionally produced content. All that's still there on the internet. None of that changes.
00:15:49.920 | In the middle tier, you would have all of the fruits of the recent revolution in independent
00:15:56.000 | media companies and independent media content. That would all still be there. Podcasts, email
00:16:02.160 | newsletters, these are all produced by individuals. The individuals producing these things have always
00:16:08.000 | been liable for what they produce. You know, on this podcast, for example, we're liable for
00:16:14.960 | what we say. I mean, Jesse will tell you, we get cease and desist letters from Brandon Sanderson's
00:16:19.680 | company, you know, what, like once or twice a week. We're liable, right? Because, you know,
00:16:24.480 | we point out the truth that he wrote The Name of the Wind and won't admit that. And we will
00:16:29.440 | continue to push for that truth. But nothing would change for a podcast. We've always been liable.
00:16:33.920 | So nothing would change for us. Nothing would change for a newsletter subscriber. Now I did
00:16:37.920 | write about this idea on my email newsletter at calnewport.com and a reader wrote in to me and
00:16:42.480 | said, yeah, but Patreon and Substack would go away because they would be liable for the things
00:16:47.040 | they're hosting. To which I would say: that's not a big deal, because you don't need Substack to have
00:16:50.880 | an email newsletter. You would just host your own, just have an account with a hosting company like
00:16:54.880 | I do. And you don't need a giant venture backed company like Patreon to accept money from your
00:17:00.640 | supporters. Tools would just emerge that you could just plug into your WordPress instance. I mean,
00:17:04.800 | all this stuff would be a little bit harder, but it would just be shifting back towards the
00:17:07.840 | individuals deploying these tools. Why do we have to do this every time there's something good, like
00:17:13.840 | newsletters? Why can't we just all have newsletters? Why does there have to be a company
00:17:18.000 | backed by $60 million in venture capital that tries to own all the newsletters and then turn
00:17:22.320 | it into some sort of glorified text-based Netflix? We would be fine. All right. And then at the lowest
00:17:26.800 | tier, what about individuals and their ability to express themselves and connect to people?
00:17:31.760 | So we still have interesting content from the top tier and the middle tier, but what about,
00:17:35.360 | hey, I can't post my pictures on Instagram anymore. I can't just talk to the small niche
00:17:40.640 | of people who are into Viking cosplay, and I can't send them tweets in this little tweet group we
00:17:47.920 | have. There, I think the shift would head back towards individually hosted websites. I have a
00:17:54.800 | website. I can post pictures on it. I can post videos on it. No, I can't post on YouTube, but
00:17:58.480 | you know what? I have a WordPress instance and a very easy plugin for me to upload videos for
00:18:02.400 | people to watch. This whole thing costs me a couple of dollars a month, which is enough to
00:18:06.800 | make it difficult to have a lot of bots, but not at all prohibitive for the average user to be online.
00:18:11.920 | How do I find other people? Well, it'd be a little bit like 2005, links, serendipity,
00:18:17.200 | word of mouth. I imagine there'd be a resurgence in feeds. So you could sort of subscribe to the
00:18:22.240 | feeds of people who are publishing interesting articles or pictures. And there would be probably
00:18:26.960 | very low friction, aesthetically pleasing RSS reader apps that show up on your phone.
00:18:31.440 | So just on your phone, you could just see like, hey, here's the newest stuff that,
00:18:34.880 | you know, Jesse posted about Viking cosplay and I can see the pictures and the whatever.
00:18:39.280 | And it almost would feel like a current social media network, but everyone owns their own stuff.
00:18:43.120 | Everyone's liable for their own stuff. And it's not connected in this massive, fully connected,
00:18:51.120 | underlying social graph with power law dynamics in which if Jesse says the wrong thing in one of
00:18:57.040 | his posts on Viking cosplay, it can spread to the whole internet really quickly. Just way more
00:19:01.040 | friction and human curation involved here. I don't think that's a bad internet.
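For the technically curious, here is a minimal sketch of what following a handful of self-hosted feeds could look like. It assumes Python and the third-party feedparser library, and the feed URLs are made-up placeholders, not real sites:

```python
import feedparser  # pip install feedparser

# Feeds you have chosen to follow; no algorithm decides this list.
# These URLs are hypothetical placeholders.
FEEDS = [
    "https://example.com/viking-cosplay/feed.xml",
    "https://example.org/essays/rss",
]

for url in FEEDS:
    feed = feedparser.parse(url)        # fetch and parse one feed
    print(feed.feed.get("title", url))  # the site's own name, if it gives one
    for entry in feed.entries[:3]:      # the three newest posts
        print("  -", entry.get("title"), entry.get("link"))
```

An "aesthetically pleasing RSS reader app" is essentially this loop with a nicer interface: it only ever shows you the feeds you deliberately subscribed to.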
00:19:06.640 | I mean, I think basically what we would have there is the 2005 internet plus a lot of the
00:19:12.560 | cool new stuff that's happened since then, minus a lot of the bad stuff that's happened since then,
00:19:17.760 | minus the compulsively addictive content created by these algorithmic curated services like the
00:19:25.520 | TikTok, like the Instagram, minus these weird unexpected dynamics that we got from services
00:19:30.640 | like Twitter and Instagram, where we took everyone on the internet and put them in the same service,
00:19:34.800 | made them all the same. Everyone's account looks the same. Everyone looks the same.
00:19:38.240 | Put them on power law dynamics and allowed ideas to ping-pong back and forth across the entire world with
00:19:44.880 | explosive virality, and get this weird tribal entrenchment that arose. And we begin
00:19:51.280 | to see the worst versions of ourselves and see the worst in other people. All those dynamics are gone
00:19:55.920 | too. Now this would be sad for some people. I mean, reporters would have to come out of their caves
00:20:01.440 | where they've been on Twitter 12 hours a day and actually get used to sunlight again and realize
00:20:05.600 | what the wind feels like. I mean, yeah, so that would be hard for them. It'd be hard to be an
00:20:10.320 | influencer that makes a lot of money. Now you have to sort of build your own thing. You can't rely on
00:20:15.440 | algorithms to jumpstart you to some sort of notoriety, but I actually think it wouldn't be
00:20:19.600 | that bad of an internet. So is this going to happen? No, of course not. Not actually going to happen.
00:20:25.120 | Neither the Supreme Court nor Congress is going to significantly change 230. If it does get
00:20:31.280 | changed, it's going to get changed along the narrow vision that I was talking about earlier,
00:20:35.120 | that the political left or right has where it's going to be tweaked. Like, yeah, a little bit
00:20:39.600 | liable if you really put something pretty terrible on here, or you really have to have a reason to
00:20:44.560 | kick me off beyond just, you know, "Republicans bad." That's what the tweaks would be. And fine,
00:20:50.160 | we can argue about that. But to me, that vision is also narrow. But I love this thought experiment.
00:20:54.880 | What if, in one swipe, algorithmic exploitation, curation, and recommendation of third-party user
00:21:03.120 | content went away? It might not be that bad of an internet. And I make this point not to say that
00:21:09.040 | that's going to happen or that we should advocate for that specific thing to happen, but that we
00:21:13.120 | should not just accept that the internet we've stumbled into, the consumer-facing internet
00:21:19.440 | we've stumbled into over the past 25 years, is somehow the right way to do this, or that it is
00:21:22.640 | somehow fundamental. There are other ways we could imagine the internet operating. There are other
00:21:28.800 | relationships we can imagine having to these tools. And we should keep thinking about that.
00:21:34.240 | We should keep daydreaming, keep wondering what a better internet might actually be like, even if
00:21:39.360 | it's not imminently about to happen. All right. So Jesse, that was my sermon on Section 230 and
00:21:44.800 | the human internet. Yeah, I like it. I haven't done a tech sermon in a while. Yeah. I figured
00:21:51.040 | I used to do more of these back in the old days of the show, when I was just here alone.
00:21:54.400 | I would do tech sermons because I was just, you know, missing humans and just sitting here alone
00:22:00.320 | in my office, you know, thinking random thoughts. So I thought that'd be nice to get
00:22:04.240 | back to. I like the tech sermons. If that actually happened though, it wouldn't
00:22:09.040 | change your life very much. Right. Oh, I mean, we take this idea that these giant platforms that
00:22:15.920 | just suck in all of this third-party content for free basically, and just repackage it with
00:22:20.480 | algorithms to try to capture eyeballs. That's the internet. It's not, you know, I mean, look,
00:22:25.920 | look at my engagement with the internet: none of it goes through these platforms except for
00:22:29.840 | YouTube. But if there's no YouTube, we would just be posting these videos on our own server
00:22:33.840 | and people would be going and they'd be watching them there and that'd be fine.
00:22:37.360 | Right. I'd be fine with that. The website is our own. The newsletter is our own.
00:22:41.920 | The podcast is our own. We're liable for everything we say, you know, it's fine. You don't have to
00:22:46.480 | be in those worlds. What you get in those worlds is the ability to have the lotto ticket of viral expansion
00:22:52.480 | of audience. And on the flip side, it's good for advertisers, but I don't think the world is that
00:22:57.520 | bad without that lotto ticket. I don't know that that lotto ticket, sudden growth and virality,
00:23:01.680 | is even that great for people anyway. Slow and steady is my way. I'm not a very viral guy.
00:23:07.280 | All right. So we have a collection of questions that all loosely orbit this issue of rethinking
00:23:11.920 | our relationship to technology. First, I want to mention one of the sponsors that makes this
00:23:17.840 | show possible. That's our friends at Henson Shaving. Jesse, true story. Speaking of Henson
00:23:23.120 | Shaving, as you know, I was in Florida last weekend, had an interesting experience. We were
00:23:29.760 | taxiing, we were in a 737-800 taxiing on the runway at Dulles getting ready to take off.
00:23:36.560 | And I saw, you know, the stair car things, the ones that come up to the planes to bring the food
00:23:42.240 | onto the planes, racing after our plane. And it cut across another plane
00:23:49.840 | that was near us taxiing. That plane skidded off of the runway, the front landing
00:23:54.960 | gear snapped, it smashes down into the ground, cuts off our plane, which screeches to a stop. The overhead
00:24:00.000 | compartments are opening. The guy runs up the stairs, rips open the emergency exit
00:24:06.320 | door on the wing from the outside, so the slides automatically inflate. Everyone is screaming.
00:24:10.880 | The guy sticks his head in, I was sitting near the window, points to me, and says, excuse me, sir,
00:24:15.520 | I really like your shave. He had seen it from afar. And you know what that shave is? That shave is a
00:24:25.200 | Henson razor shave. You know that I love the Henson razor. Here's the idea. This company
00:24:33.120 | manufactures precision parts for the aerospace industry. So they have these machines that can
00:24:38.800 | do incredibly precise manufacturing of parts. So they created this razor made out of aluminum,
00:24:45.360 | very solid, beautiful piece of construction. It allows you to put a standard
00:24:50.560 | five-cent safety razor blade into their aluminum razor and screw it on.
00:24:57.120 | And it's so precise that only 0.0013 inches of the blade extends beyond the housing.
00:25:05.440 | This is what gives you a really good shave, because you have just enough blade to actually
00:25:09.600 | shave, but not so much blade that you begin to get a diving board effect where it goes up and
00:25:15.600 | down. That diving board effect, where you have too much blade sticking out, is what causes nicks,
00:25:19.760 | what causes razor burn. So by having just the barest human-hair-width bit of your
00:25:25.200 | blade coming out of the edge of your razor, you get a great shave and you only require one
00:25:31.600 | blade. You don't need, like you get from the modern disposable companies, the 17 blades
00:25:36.480 | that you have to, like, start like a lawnmower and hold up to your face.
00:25:41.280 | One five-cent blade simply can do it, if you have a beautifully precision-designed aluminum razor.
00:25:47.520 | So this is what I use. So I have this permanent thing. I'm not throwing out stuff.
00:25:52.160 | I'm not buying stuff. I don't have stuff mailed to me every week. I use the Henson's, this beautifully
00:25:56.400 | designed Henson's razor. Clean shave, simple. Over time, I'm already saving money because the blades
00:26:05.120 | are so cheap. Even though you pay more for the initial razor, it doesn't take long before
00:26:11.280 | the cost of using disposables or a subscription service exceeds and continues to exceed what you
00:26:17.040 | spent on the one nice Henson's razor, because the actual blades you're using are so cheap. So
00:26:22.640 | anyways, I'm a big fan of the Henson razor. And if you like to shave, you will like it as well.
00:26:29.520 | So it's time to say no to subscriptions and yes to a razor that will last you a lifetime.
00:26:34.720 | Visit hensonshaving.com/cal to pick the razor for you. Use that code Cal at checkout, and you'll
00:26:41.760 | get two years worth of blades for free. So you add the two year supply of blades to your cart,
00:26:46.560 | and then use Cal promo code when you check out and the cost of those will drop to zero.
00:26:50.800 | So that's 100 free blades when you head to H E N S O N S H A V I N G.com/cal and use that code Cal.
00:27:00.960 | I also want to talk about a newer sponsor of the show. It has a product that I've been enjoying.
00:27:06.800 | This is Huel, H U E L. In particular, I want to talk about the Huel Black Edition,
00:27:13.280 | which is a high protein, nutritionally complete meal in a convenient shake. It has everything
00:27:19.120 | your body needs in just two scoops that you add to water, including 27 essential vitamins and
00:27:23.360 | minerals and 40 grams of protein. So each scoop has 200 calories. So two scoops gives you more or
00:27:28.720 | less a reasonable-sized meal. The way I have been using Huel comes from the fact that I automate my eating.
00:27:38.880 | My whole idea is breakfast, lunch, automate. Don't think about it. Just have a very easy way.
00:27:45.040 | This is what I do for breakfast. This is what I do for lunch. And then for dinner,
00:27:47.760 | you can actually cook interesting food. And that's when you can actually care about your food. To me,
00:27:50.800 | this is the easiest way to stay healthy. And I have this very strict way I do this. So days when
00:27:54.720 | I'm on campus, I know exactly where I'm going to go to get lunch. I go to Epicurean and I get a
00:27:59.120 | whole bunch of vegetables and one thing of protein. And when I'm at home, I know what I make.
00:28:03.680 | And Huel helps me when I have to do breakfast replacement. So in my
00:28:09.600 | automation of my food, sometimes I don't have a lot of time in the morning, especially if I'm going
00:28:14.320 | on campus, knowing I can do scoop one, scoop two, shake, bring it with me in the car. I've got
00:28:19.520 | calories. I got proteins. I got vitamins. It fits perfectly with my automated meal philosophy.
00:28:25.920 | So you just make sure breakfast and lunches are healthy and you don't think about it. And then
00:28:29.600 | dinner, you can actually enjoy food. So that's how I use it. You can use it however you want to,
00:28:34.080 | but I've tried different shakes. I've tried different meal replacements, and this is one
00:28:38.400 | of my favorites. It's vegan, it's gluten-free, it's lactose-free. There are no artificial sweeteners.
00:28:43.760 | It's low glycemic index. It has omega-3, it has omega-6. There's no palm oil. It's available in
00:28:50.080 | nine flavors. It's also affordable. It works out to be about $2.50 for a 400-calorie meal for each
00:28:58.480 | double scoop you do. They will also give you a free t-shirt and a shaker with your first order
00:29:03.280 | if you go to Huel.com/questions. So if you're looking for a good meal replacement, just to
00:29:08.240 | automate those meals you don't want to worry about, go to Huel.com/questions. Don't forget
00:29:13.680 | the slash questions. That's what's going to get you a free t-shirt and the shaker you can use
00:29:17.040 | to shake up your shake. All right, Jesse, let us do some questions all about radically rethinking
00:29:25.840 | our relationship to technology. Sounds great. We'll start with Jessica.
00:29:29.680 | What does Cal think about Urbit? Does it answer Cal and Jaron Lanier's call for a social internet,
00:29:35.840 | or is it just another barrier to the deep life? Well, I had to look this up. Urbit is U-R-B-I-T.
00:29:43.200 | So it's one of these blank slate replacements for the internet. So there's a lot of these that come
00:29:50.000 | out of the tech community. They're typically very purist. So they'll have these goals. We want to
00:29:56.560 | be exactly, like, fully functional, fully safe, fully peer-to-peer. I love these types of experiments. I
00:30:02.800 | didn't know Urbit, so I looked it up. Here's the website. So if you're watching, again, if you're
00:30:07.600 | watching on YouTube, episode 237, you'll see this on your screen. The Urbit website calls it a clean
00:30:14.080 | slate OS and network for the 21st century. They go on to say Urbit is a new kind of computer
00:30:21.760 | that you can own completely in ways that matter, networking, identity, and data.
00:30:26.240 | They further explain, we realized that in order to fix the internet, we had to build a new computer
00:30:31.280 | from scratch. Good thing we started over a decade ago. Today, Urbit is a real system with thousands
00:30:36.720 | of users that are building all kinds of community software, DAOs, and more, and it's getting better
00:30:41.840 | every day. So it's a little bit complicated what's going on here. I spent some time reading about
00:30:45.840 | Urbit earlier today. It's an operating system that runs entirely in a virtual machine,
00:30:55.040 | completely self-encapsulated. You might run it on a different Unix-based machine, and it's like an
00:30:59.200 | OS that runs in its own little box that it can't break out of. So it's a very safe, built-from-scratch
00:31:05.760 | environment. It has peer-to-peer networking built into the kernel of this operating system. So the
00:31:11.360 | whole idea is if you're running the Urbit OS, it makes it very easy for you to directly connect to
00:31:17.760 | other people running an Urbit OS on their own machine. So peer to peer connections,
00:31:24.720 | if you want to network nerd out on this, this protocol is implemented using UDP
00:31:28.800 | on top of the standard IP stack of the public internet. So getting a little bit nerdy there.
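As a rough illustration of what peer-to-peer delivery over UDP means, here is a toy sketch in Python using only the standard socket library. This is not Urbit's actual protocol; the peer address and port are hypothetical, and real systems would also have to handle NAT traversal, encryption, and lost datagrams:

```python
import socket

PEER = ("203.0.113.7", 9000)  # the other person's address (a placeholder)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # a UDP socket
sock.bind(("0.0.0.0", 9000))  # listen on our own port as well

sock.sendto(b"hello, peer", PEER)   # sent directly to them, no server between
data, sender = sock.recvfrom(4096)  # block until a datagram arrives
print(sender, data.decode())
```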
00:31:36.000 | It also comes with a system for managing IDs. So you can have a username in the Urbit ecosystem
00:31:43.200 | that's yours and no one else can use it, and no one else can pretend to be you. And they implement
00:31:47.840 | this using the Ethereum blockchain. So you're essentially, you'll claim a name in the blockchain
00:31:54.160 | and you can give your public key in there. And now you can sign things from that name and everyone
00:31:58.240 | can verify it's you and no one else can steal it. It costs money and time and effort to put
00:32:04.240 | something into the blockchain too. So you can't just create thousands of fake accounts and spam
00:32:08.800 | and have a lot of bots because you actually have to pay to get all of these IDs put in the blockchain.
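The sign-and-verify idea behind these blockchain-backed IDs can be sketched with ordinary public-key cryptography. This toy example uses the third-party Python cryptography library, and the Urbit-style name in it is made up; in the real system the public key would live in an Ethereum contract rather than a local variable:

```python
# pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()  # stays secret with the owner
public_key = private_key.public_key()       # published under the claimed name

message = b"a post signed by ~sampel-palnet"  # hypothetical Urbit-style name
signature = private_key.sign(message)

try:
    public_key.verify(signature, message)  # raises if forged or altered
    print("verified: this really came from that ID")
except InvalidSignature:
    print("forged or tampered with")
```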
00:32:14.640 | So that's what's kind of going on here. So what they imagine is you can create your sort of own
00:32:18.640 | overlay internet on top of the actual internet where there are no central servers that people
00:32:23.040 | connect to. If I want to have a chat with a bunch of people, there's not a WhatsApp server somewhere
00:32:29.120 | that we all connect to that's keeping track of us and keeping track of our information.
00:32:32.560 | I just directly connect to the people I'm chatting to. We all just share messages back and forth.
00:32:36.400 | And they want to make it really easy to write applications where lots of people can connect
00:32:41.280 | and share things with each other. And there's no central servers involved. So no one that's
00:32:45.520 | mining your data, no one that's selling your data. And it also avoids security issues because it's
00:32:50.320 | a sandboxed OS, all carefully protected in a virtual machine. It's very simple. So there can't
00:32:56.320 | be malicious programs that take over your computer. Great idea in general. The issue, of course, with
00:33:02.720 | these specific tools is that they're complicated and they're missing the thing that makes the real
00:33:08.480 | internet fun, which is all of these services that have been really optimized and tuned to capture
00:33:13.440 | your attention to be interesting. So none of these endeavors, and there are a lot like
00:33:18.800 | Urbit, really ever gains mainstream traction, because they're pure and appealing to
00:33:24.800 | people who understand the technology, but they're not very fun. And most people don't care. Here's
00:33:30.640 | my argument. I like this, these directions, but I don't think we actually need new technologies,
00:33:37.600 | new operating systems, or new protocols to build a more human internet. I think the tools that have
00:33:45.760 | long existed are already sufficient to create a DIY human internet without needing new
00:33:54.480 | peer-to-peer native operating systems, without needing blockchains or building DAOs. So what
00:34:01.120 | are these tools? Well, here's my vision of a more DIY human internet. You consume podcasts,
00:34:09.440 | you consume email newsletters, you still use, as we talked about in the deep dive, you still use
00:34:15.520 | the sort of high-end editorial content providers. So maybe you still stream HBO or read the New
00:34:22.160 | York Times or something like this. You talk to people that you know, and you're probably using
00:34:30.080 | text. You have text threads with people you actually know to have digital conversations with.
00:34:34.640 | If you yourself want to express yourself on the internet, like we talked about before,
00:34:39.120 | you can have your own server instance somewhere and have a WordPress server where you can post
00:34:43.360 | videos and you can post images, you can post texts, and you can put them out in an RSS feed,
00:34:49.200 | and you can follow what other people are doing as well. And that's kind of it. Like that is a DIY
00:34:56.400 | more human internet that you could make your interaction with the internet today: you have
00:35:01.280 | some professionally edited sites with professional content producers to get great content from the
00:35:06.720 | internet. You listen to some podcasts, you read some email newsletters, maybe you post some stuff
00:35:11.200 | and you talk directly with people you've met online through text messages, and you know their
00:35:15.680 | sites, they know yours, and that's it. And there is no super virality, and there is no, I need to
00:35:19.680 | see everything that's going on in the world, there needs to be memes, and I need to have 100,000
00:35:23.600 | people who are going to watch my video because of an algorithm. You don't need any of that. So what
00:35:27.840 | I'm saying is, from a technical perspective, this is not as pure or interesting as these
00:35:33.520 | conceptions like Urbit, where you have all of these standards that it meets and it's entirely
00:35:37.760 | decentralized, and we're using the blockchain and it's entirely peer-to-peer. It's still messy,
00:35:42.800 | but it's more human. So that's usually how I come down on these, is we could just change our
00:35:46.960 | engagement with the internet today to be more human without a new tool needing to be introduced.
00:35:52.640 | It's much more about what we don't do than what we need to do. It's much more about, I don't want
00:35:58.720 | to deal with giant platforms that just suck in a lot of third-party content and then use algorithms
00:36:05.120 | to spread it back out and capture people's attention. I just don't want to use Instagram.
00:36:09.840 | I don't want to be on Twitter. It's just saying what you don't want to do, what's left behind,
00:36:14.640 | I think, is still a pretty great internet. So I call that the DIY human internet because it is
00:36:18.880 | something that you can implement in your relationship to these tools right now.
00:36:22.640 | All right, what do we got next, Jesse?
00:36:25.520 | All right, next question is from Brian. You've often mentioned that you view the next big
00:36:30.480 | disruption in personal tech being the transition to using augmented reality glasses to project
00:36:35.440 | virtual screens. How do you see this development interacting with the attention economy,
00:36:40.160 | chronic distraction, and the paucity of quiet in our lives?
00:36:44.000 | So Brian, for those who haven't heard this prediction of mine,
00:36:47.840 | this is where I think the future of the personal electronics industry is going,
00:36:52.400 | is we're going to move away eventually from a world in which there's many different devices
00:36:56.880 | that all have their own processors and dedicated screens. So we have a computer, we have an iPad,
00:37:01.920 | we have a phone, we have a television, all of these have their own processors and screens.
00:37:05.440 | I think it's inevitable that we're going to move away from that world to a world where we each have
00:37:09.760 | augmented reality goggles that can project screens of very high resolution convincingly anywhere in
00:37:16.960 | our actual physical world. So I don't need a physical phone. I can have a phone screen just
00:37:21.920 | appear. I don't need to carry a laptop with me to write on the plane. I can just have my laptop
00:37:28.080 | screen appear as if it's right there in front of me. I don't need to buy a high-end television. I
00:37:32.960 | can make a very large television screen appear on an otherwise blank wall. I think the economics
00:37:38.240 | there just make a lot of sense. And so we're going to see a collapse of the consumer electronics
00:37:44.000 | industry towards software. So for a lot of these devices, you don't build a TV, you build TV software.
00:37:50.960 | You don't build a phone, you build phone software. So it's going to collapse towards software and
00:37:54.240 | a small number of people who make the goggles and the companies that win in that battle to produce
00:37:59.600 | this hardware are going to be some of the largest companies we've ever seen in the history of the
00:38:02.560 | world. Because imagine if Apple and Sony and Samsung, you know, all were one company and all
00:38:11.840 | of that consumer base was all being serviced by one company and one product. It's going to make
00:38:15.040 | the success of the iPhone seem like a niche product. So that's why I think it's going to
00:38:21.040 | happen. Interestingly, and I won't get too much into the details here, I think there's
00:38:26.080 | two big obstacles to this. One is of course, just getting the AR technology sufficiently advanced.
00:38:31.360 | That will get solved; it's getting a lot better. The biggest issue now, I think,
00:38:35.680 | is the computation. So the way you would want to imagine this is going to work is that the
00:38:40.720 | computation, so the phone operating system or the TV operating system, the computer operating system
00:38:46.800 | is running efficiently on a virtual instance in a server farm somewhere. And essentially that
00:38:53.040 | instance is just streaming over the internet to your glasses, the screen images. So all you need
00:38:59.120 | on yourself is a box that can accept, here's the screen that's being generated by some powerful,
00:39:05.360 | you know, program. And I just put that screen in the world. So then I don't need my personal device
00:39:10.560 | to be able to do all the different computation. It doesn't have to actually generate the images
00:39:14.160 | from scratch itself. So if I'm playing a video game, for example, I don't need my computer
00:39:18.640 | attached to my AR glasses to be calculating all those polygons. That can be in an instance
00:39:23.600 | somewhere and I'm just getting the screen captures, right? And if I have a pipeline where the screen
00:39:29.040 | can be streamed to me, then my glasses can do anything, right? Because we can have as much
00:39:34.960 | back end power as we need. The issue, I think the biggest bottleneck to this future is having enough,
00:39:40.960 | low enough latency and high enough bandwidth to actually stream those screens fast enough
00:39:46.480 | to these devices that it actually looks like a native running device. There's a whole interesting
00:39:52.000 | discussion of this in Matthew Ball's book, The Metaverse. And basically what he says is they've
00:39:56.480 | tried this. There's a Google product, for example, that does this with video games where they just
00:40:00.480 | stream the screens and you don't actually have to have the video game processor local. It's hard
00:40:06.720 | because it takes time to send things across connections. Even the speed of light is not as
00:40:11.360 | fast as you would like it to be when you're trying to send a hundred frames a second over the
00:40:16.080 | internet. You also need a lot of bandwidth. So there's some real problems that have to be solved
00:40:18.960 | there. I imagine a future in which you're basically going to have a virtual machine instance that
00:40:24.240 | follows you geographically. So it's wherever you are, it's in a data center near you. And it is
00:40:29.040 | like one powerful instance is doing all of your computations, and it's close enough
00:40:33.440 | that it can actually send the screens to you fast enough. Anyways, Brian, that's all nerd stuff.
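To put rough numbers on that latency and bandwidth point, here is a back-of-the-envelope sketch; the distance and frame parameters are illustrative assumptions, not figures from the episode:

```python
FPS = 100
frame_budget_ms = 1000 / FPS  # 10 ms between frames at 100 frames per second

# Light in optical fiber travels at roughly two-thirds the speed of light,
# about 200,000 km/s, which works out to 200 km per millisecond.
FIBER_KM_PER_MS = 200
distance_km = 1000  # user to a far-away data center
round_trip_ms = 2 * distance_km / FIBER_KM_PER_MS

print(f"frame budget: {frame_budget_ms:.0f} ms")                    # 10 ms
print(f"round trip over {distance_km} km: {round_trip_ms:.0f} ms")  # 10 ms

# An uncompressed 1080p frame: 1920 x 1080 pixels at 3 bytes per pixel.
frame_bytes = 1920 * 1080 * 3
gbit_per_s = frame_bytes * 8 * FPS / 1e9
print(f"uncompressed 1080p at {FPS} fps: {gbit_per_s:.1f} Gbit/s")  # ~5 Gbit/s
```

The round trip to a distant data center can eat the entire frame budget by itself, which is exactly why an instance in a data center near you matters.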
00:40:38.000 | The big conclusion here is, you know, we probably are coming to a future where that's the case.
00:40:43.360 | Will that make us more distracted? It will, but not in the way that I think science fiction writers
00:40:49.200 | think. So I think science fiction writers imagine if once we have a world with these augmented
00:40:54.560 | reality glasses, that somehow we're going to be subjected to constant advertisements, just
00:41:00.400 | floating in our world, like in Ready Player One, the Spielberg version of that book,
00:41:05.040 | where there's just all these different floating ads everywhere, because we could do that. I don't
00:41:09.200 | think that's what's going to happen. But these consumer technology companies like Apple, they
00:41:14.800 | don't really care. They're not selling advertisements to you.
00:41:19.200 | They want to sell their products to you. So I don't think that's going to be the issue. I don't
00:41:21.760 | think we're going to clutter our visual field with things that don't exist today that are
00:41:26.560 | pulling at our attention. Here's what the problem is going to be: it'll be difficult to get away from
00:41:31.200 | your devices. So any one moment in this new future I'm talking about will not look visually distinct
00:41:38.560 | from a moment today. There might be a phone screen here that you're scrolling through to check out a
00:41:43.440 | text message. There might be a computer screen here where you're writing an email.
00:41:46.960 | It won't look any more cluttered or high tech or different than our world today. The difference is
00:41:51.840 | we're going to always have the ability to bring up that phone screen. We're always going to have
00:41:56.400 | the ability to bring up that computer screen because as long as there's one thing we might
00:41:59.600 | need to do, we're going to have those glasses on. And if we have the glasses on, we're one swipe, or one
00:42:04.800 | telegraphic brain-link thought, away from having a screen pop up. And so it's going to
00:42:09.760 | be like 2023 if you weren't really able to put your phone away for a while, if you weren't able to
00:42:16.800 | just go for a walk and not have a TV screen with you. So I think that's going to be the real issue
00:42:21.920 | is the ubiquity of screens will be total. Once we're in this AR future, any screen you want will
00:42:30.160 | always be available. That is a pretty hard thing to resist. And so what we're going to have to do
00:42:35.360 | to prepare ourselves for that, Brian, I think, is become practiced digital minimalists. We need to
00:42:42.240 | start getting much more serious about thinking proactively. Here's what matters to me in my life.
00:42:47.200 | Here's how I want to use technology to support these things. Those are the only reasons I use
00:42:51.200 | technology. We get away from technology being just the default thing we do when we're bored, or
00:42:56.000 | something we only remove if we can point out a specific harm, and instead see technology as
00:42:59.600 | something we add to service other more lofty goals. We're going to have to become much better
00:43:03.760 | at structuring our relationship to the digital because this AR future is going to make the
00:43:09.280 | digital always one hand swipe away. So it's going to be a much harder distraction environment.
00:43:14.400 | Now, Jesse, that's my one prediction I've been making consistently. I don't hear about it a lot.
00:43:19.760 | Yeah, I only hear about it from you. Yeah, I feel good about it. And I've had other predictions
00:43:24.960 | where, you know, people thought I was crazy. I think with the coming zeitgeist shift
00:43:30.560 | and destabilization of social media, you know, I was pushing that, I was seeing that handwriting
00:43:35.200 | on the wall in 2016, 2017, and people thought I was crazy. And it's all sort of coming into
00:43:41.120 | play. So I'm going to give myself the benefit of the doubt on this AR question.
00:43:44.800 | I think 10 years, 10 year window, we're going to see a major shift of the personal consumer
00:43:49.920 | technology industry. And then you can always take the glasses off, right? Yeah, you can take them
00:43:54.000 | off. But we're going to become so used to the idea of, if I just need one thing, like if I'm writing
00:43:59.840 | or watching TV, I need the glasses on to watch TV, like watch a movie.
00:44:04.160 | But now the phone is right there. The tablet is right there. My video game player is right there.
00:44:10.080 | I'm just imagining a 14 year old with this technology, just like video game, video game,
00:44:15.200 | phone, phone, just movie, movie, just like moving these things all around.
00:44:18.080 | Like we're gonna have to get our act together before we give ourselves such like universal
00:44:22.960 | accessibility. You're gonna have to write the sequel to the book. I know. Or maybe just more
00:44:27.120 | people can buy that book. All right. Speaking of digital minimalism, I think this next question
00:44:31.600 | is relevant, right? What do we got here? All right. Next question is from Nandan.
00:44:37.680 | What are the benefits of shutting down the phone usage before going to bed and after waking up?
00:44:42.640 | What techniques can I implement to become more disciplined in doing so?
00:44:46.160 | Well, Nandan, I mean, the benefits are massive, because what are we really talking about here?
00:44:52.640 | If you're using your phone up to the moment you go to bed, if you're using your phone the moment
00:44:57.360 | that you wake up, that means that you are constantly in this four dimensional world,
00:45:02.320 | where you have the real and the digital, and it's all sort of mixed together.
00:45:06.960 | That's not a humane way to live. It's an impoverished approach to reality.
00:45:13.920 | It is an anxiety generation machine. It is also, think of it as like a potential magnet,
00:45:22.000 | just pulling all this potential for you constructing a life well lived and just
00:45:26.240 | sucking it away and converting it into returns for the people who own
00:45:31.760 | stock in these giant attention economy platforms. It is no way to live. That's the benefit.
00:45:36.640 | So how do you get away from it? Well, Nandan, you have to be way more aggressive than what
00:45:39.680 | you're talking about here. What you're talking about here, I see all the time.
00:45:42.960 | You want to nibble at the edges with tips. I use my phone too much. Do you have a good tip?
00:45:49.040 | Like maybe I should not bring it in the bedroom, or I should have a set time before I start turning on my phone.
00:45:54.960 | This is not how we solve this problem. When your life is entirely enmeshed with the digital,
00:46:00.160 | we do not solve this problem by nibbling at the edges with tips.
00:46:04.000 | We don't, we don't solve this problem by writing books where we put in all these caveats about
00:46:07.440 | like, well, I'm not saying that you shouldn't use your phone and please don't yell at me,
00:46:10.640 | but like, you know, maybe if you're able, you should consider like putting down your phone
00:46:13.920 | while you're having dinner, you know, unless like you really need your phone
00:46:16.960 | during dinner, that's not what's going to solve this problem. We need to fundamentally
00:46:20.320 | repair our relationship with these tools. We need a philosophy well thought through
00:46:26.160 | about how technology is integrated into our lives. And the philosophy I preach is digital
00:46:31.840 | minimalism, from my eponymously titled book, Digital Minimalism. And it's a pretty simple
00:46:38.080 | approach. It says, what you need to do is figure out what you care about in your life,
00:46:41.760 | the positive, what you want to spend time doing. You then work backwards and say,
00:46:45.760 | for each of these things I care about, what's a really useful or effective way to use
00:46:50.240 | technology to support it? Your answers to those questions describe the technologies you use;
00:46:56.880 | everything else you don't use. For something to lay claim to your time and attention,
00:47:01.680 | it has to go through the test of: I care about this, this technology is the best way to support
00:47:06.800 | this, and here are the specific rules I use for using this technology to support this thing I
00:47:10.480 | care about. Everything else, by default, you do not use in your life. This allows you to leverage
00:47:15.280 | the power of technology without being a slave to its worst excesses.
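To make the shape of that screen concrete, here is a minimal sketch of the test in Python. Every value, technology, and rule below is a hypothetical placeholder, not a recommendation from the episode; the point is just the three-part filter: tied to something you care about, judged the best tool for that value, and governed by specific operating rules.

```python
# A toy model of the digital minimalism screen: a technology survives
# only if it maps to a declared value, is judged the best way to
# support that value, and comes with specific rules for its use.
# All entries below are hypothetical examples.

values = {"family", "fitness", "craft"}

candidates = [
    {"name": "group_text_app", "supports": "family",
     "best_tool_for_value": True,
     "rules": ["check once at 5pm", "lives on the kitchen shelf"]},
    {"name": "infinite_scroll_feed", "supports": "family",
     "best_tool_for_value": False,  # fails the "best way to support it" test
     "rules": []},
    {"name": "run_tracker", "supports": "fitness",
     "best_tool_for_value": True,
     "rules": ["log runs only", "no social features"]},
]

def passes_screen(tech: dict) -> bool:
    """Keep a technology only if all three conditions hold."""
    return (tech["supports"] in values
            and tech["best_tool_for_value"]
            and len(tech["rules"]) > 0)

kept = [t["name"] for t in candidates if passes_screen(t)]
dropped = [t["name"] for t in candidates if not passes_screen(t)]

print("keep:", kept)     # keep: ['group_text_app', 'run_tracker']
print("drop:", dropped)  # drop: ['infinite_scroll_feed']
```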
00:47:20.720 | That's where you need to get, Nandan. You need to establish your own philosophy of digital minimalism. You don't
00:47:24.640 | just need tips to nibble at the edges of excessive phone use. But I know tips help people, so let me
00:47:31.040 | give you a few warmup rules that will prepare you for going through a bigger digital minimalism
00:47:36.480 | type transition. These are rules that real digital minimalists have come up with as they have,
00:47:42.880 | as they've gone through this exercise of reshaping their relationship to technology. Number one,
00:47:47.520 | the phone for your method. I see this a lot. I see this a lot among minimalists who really
00:47:52.160 | think through how do I want to use tech? When you are at home, the phone goes on a shelf in the
00:47:57.920 | kitchen or in your foyer, where it's plugged in to charge. That's where it is. If you're expecting
00:48:03.040 | a call, put the ringer on. If it rings, you can go in there and answer the call. If there's text
00:48:08.000 | messages you might need to check in on, you can walk to the foyer or the kitchen; that phone
00:48:12.000 | stays plugged in and you can answer those text messages right there. It is not with you elsewhere
00:48:17.920 | in your house. So it's not with you when you're watching TV. If you want to go see if someone
00:48:22.880 | texted you, you got to walk over there and stand there at the foyer typing until you're done with
00:48:26.480 | that. If you want to look something up, you have to walk over to the foyer, look it up, type it in,
00:48:29.760 | and then go back to what you're doing. It's not with you at the dinner table. It's for sure not
00:48:33.120 | with you in the bedroom. So that's an example of a rule around use that might emerge as you do a
00:48:39.600 | digital minimalism overhaul to your life. Another rule that I think is interesting that's emerged
00:48:46.080 | often among digital minimalists is purposefully engineered disconnected time every single day.
00:48:51.360 | Every single day I'm going for a walk or running an errand without a phone. And if I'm bored,
00:48:58.640 | I'm bored. And if an emergency happens, an emergency happens. But you know what? Until like a minute
00:49:04.000 | ago, we did not have phones with us everywhere we went. And the vast majority of us did not die
00:49:09.760 | in tragic emergencies because we were unable to quickly text someone. So you might want to try
00:49:14.560 | those as a warmup, Nandan: the phone foyer method, and significant excursions of time every day without
00:49:21.120 | your phone just to get used to what it's like not to be with it. That's a good warmup, but you need
00:49:24.160 | to do the full transformation. My book, Digital Minimalism, is a good place to start. That book
00:49:28.880 | is available in lots of different countries. I think we've sold rights on that to a lot of
00:49:32.880 | different places. So wherever you're from, you can probably find a copy of that book.
00:49:37.280 | All right. I like it. We're rolling, Jesse. Let's keep moving.
00:49:41.840 | All right. Next question is from Andrew. Can the internet possibly be conscious?
00:49:48.160 | The internet seems alive to me. Andrew, it's not. I've actually done a bit of a deep dive
00:49:54.480 | recently on machine consciousness for a big article I'm working on about artificial intelligence and
00:49:59.760 | consciousness. So I've been thinking about this. And I can come up with multiple reasons, maybe
00:50:06.400 | three or four different reasons why the internet itself is not conscious. Now, I understand why
00:50:11.600 | people think this. So as Andrew elaborated in the full version of his question, he said, look,
00:50:15.200 | this thing is all of these connections, like it's just deeply connected, all these network
00:50:20.080 | connections, that looks like a brain. And there's storage. Think about all this information on
00:50:24.160 | servers. We have all these connections. It's taking information out of servers and moving
00:50:27.200 | it to other servers. It looks like a giant brain. It's of a huge scale. Isn't it possible that
00:50:32.000 | something like consciousness can emerge there? And from my understanding of machine consciousness
00:50:37.920 | philosophy, no, probably not. So issue number one we can get at by looking at
00:50:44.720 | research from a field known as the neural correlates of consciousness, NCC. It's a well
00:50:49.120 | established field that was sort of launched in the 90s. Crick, of Watson and Crick fame, was
00:50:54.080 | actually involved in the beginning of this field. They study, using clever experiments and brain
00:50:59.760 | imaging tools, human consciousness, and in particular, what areas of the nervous system
00:51:06.800 | in humans seem to be involved in the experience of consciousness. And basically, the conclusion
00:51:12.240 | of that research is that it's very small and very specialized. We have huge sections of our brain,
00:51:18.080 | for example, that process stuff all the time, like the visual cortex that's constantly processing and
00:51:23.520 | trying to make sense of images. Their experiments do not find that as having any sort of standalone
00:51:29.200 | consciousness. It's just generating information. Our gut is incredibly complicated. The nerves in
00:51:34.160 | our gut and digestive system are very complex. NCC research says there's no consciousness we can
00:51:39.360 | find. None of our sense of consciousness emerges from those actual centers. They've identified
00:51:44.080 | these very narrow portions deep in our brain where a lot of different information integrates,
00:51:49.280 | and activity in there seems to be what our notion of conscious awareness
00:51:54.640 | of ourselves depends on. The point of this is consciousness is not haphazard.
00:52:00.640 | Just connecting a bunch of neurons is not going to give you consciousness. It's not just a matter
00:52:05.680 | of numbers. Well, if we make this network big enough, if we have a huge neural network that
00:52:09.920 | can process images really well, eventually it's conscious. That's not how it works.
00:52:13.360 | We need very, very specialized structures. Most likely, we need structures
00:52:19.120 | that have been engineered exactly to generate something like consciousness. So
00:52:22.880 | NCC research pushes you away from this idea that if something kind of looks like a brain and it's
00:52:27.200 | big, it's probably going to just eventually have consciousness. The other issue is the connections
00:52:32.160 | that make up the internet are too loose. There are a lot of connections, but they're probably
00:52:36.160 | not nearly dense enough to have enough of what's known as integration. So there's a whole other
00:52:41.520 | theory about consciousness that has to do with the integration of information. The internet is not
00:52:45.120 | densely connected enough. It's highly connected; there are many ways for me to get from this server
00:52:50.160 | to that server through network links. But it's not super densely connected, and so it's probably
00:52:54.720 | not nearly as densely connected as the human brain is, which is what you would need for something like consciousness.
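To put rough numbers on that sparseness point, here is a back-of-envelope sketch. The node and edge counts are loose order-of-magnitude assumptions (tens of billions of networked devices with a handful of links each, versus roughly 86 billion neurons with thousands of synapses each), not measured values.

```python
# Back-of-envelope comparison of average degree (direct links per node)
# and graph density for an internet-like graph versus a brain-like graph.
# All counts are loose order-of-magnitude assumptions, not measured data.

def average_degree(nodes: float, edges: float) -> float:
    """Each undirected edge contributes to the degree of two nodes."""
    return 2 * edges / nodes

def graph_density(nodes: float, edges: float) -> float:
    """Fraction of all possible node pairs that are directly linked."""
    return 2 * edges / (nodes * (nodes - 1))

# Internet-ish: ~30 billion devices, each with only a handful of links.
net_nodes, net_edges = 3e10, 1e11
# Brain-ish: ~86 billion neurons, on the order of 1e14 synapses.
brain_nodes, brain_edges = 8.6e10, 1e14

print(f"internet: avg degree ~{average_degree(net_nodes, net_edges):.0f}, "
      f"density ~{graph_density(net_nodes, net_edges):.1e}")
print(f"brain:    avg degree ~{average_degree(brain_nodes, brain_edges):.0f}, "
      f"density ~{graph_density(brain_nodes, brain_edges):.1e}")
# Similar node counts, but the brain has hundreds of times more direct
# connections per node: "highly connected" is not "densely connected."
```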
00:53:00.000 | We also have the issue that the connectivity of the internet is not oriented towards the type of
00:53:06.960 | goals that we think aggregate up to what's necessary for consciousness. So in the human brain, different
00:53:11.840 | parts of the brain do very specific things that feed into higher layers that make use of those
00:53:15.840 | things in very specific ways. We have image processing and these images then go into more
00:53:22.240 | complicated integrative cortical regions, which bring in other types of information. Those connect
00:53:26.160 | to memory. The brain is incredibly precisely structured to build up and
00:53:31.840 | integrate information in a way that those specialized centers can then create consciousness.
00:53:35.200 | The internet's not doing that. You can't just take a random collection of network connections
00:53:39.840 | and say, this thing is processing information. It's not, it's just sending packets back and forth.
00:53:45.600 | Now, the final issue is it's too slow. Think about the actual speed that information
00:53:52.080 | functionally moves through the internet, where you have to wait until
00:53:56.480 | it's your turn in the packet queue, and then this thing gets sent over here and a program has it,
00:54:00.480 | and eventually some packets get sent over here. That's so slow compared to neurons in a human
00:54:05.440 | brain. And then if somehow packets going back and forth and data being pulled in and out of
00:54:10.080 | hard disks was creating something like a giant human brain, it would be a human brain that
00:54:14.640 | operated incredibly slowly. So no, Andrew, I think if we look deeper at these questions of
00:54:19.840 | machine consciousness, the internet is probably not alive. If you want to find out more about this,
00:54:24.640 | a good introduction to these notions can be found in Max Tegmark's book, Life 3.0. He has some
00:54:31.360 | chapters in the end about consciousness. And I think it covers a lot of this really well.
00:54:35.920 | Also, again, I'm hinting that I'm writing an article about this. So keep an eye out for
00:54:40.720 | a Cal Newport byline, you know, I don't know, at some point in the next few weeks,
00:54:45.440 | maybe. But we'll also take a deeper look at these questions as well. All right.
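As an aside on that speed point, here is a rough sketch of the gap. The figures are ballpark assumptions (synaptic transmission on the order of a millisecond, an intercontinental packet round trip on the order of 100 milliseconds), just to show the scale of the slowdown.

```python
# Ballpark latency comparison: a synaptic hop in the brain versus a
# packet round trip on the internet. Both figures are rough assumptions.

synaptic_delay_s = 1e-3  # ~1 millisecond per synaptic transmission
packet_rtt_s = 100e-3    # ~100 ms for an intercontinental round trip

print(f"one round trip ~= {packet_rtt_s / synaptic_delay_s:.0f} synaptic hops")

# If a thought takes a few hundred serial synaptic steps, a "brain"
# built out of packet exchanges would need far longer for the same work.
steps = 300
print(f"{steps} serial steps: brain ~{steps * synaptic_delay_s:.1f} s, "
      f"packet brain ~{steps * packet_rtt_s:.0f} s")
```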
00:54:49.920 | >> What would a dense internet look like? >> Well, I mean, I think the issue is that
00:54:54.080 | the human brain is like, it's very densely connected. So lots of things are connected
00:54:58.560 | to each other. The internet's not actually that densely connected. There's a way in which it's
00:55:03.280 | highly connected. So it's like if I want to get information from a server in this room
00:55:07.600 | to a server in Singapore, there are a lot of paths it could go on.
00:55:11.600 | >> Would it all have to be in one giant server for it to be dense?
00:55:14.080 | >> No, but it would have to be more like my server is connected directly to like 10,000
00:55:20.640 | other servers, and the servers in this one area are all densely connected to each
00:55:26.720 | other so they can all work together and like process things. And it's not really the way
00:55:29.760 | the internet is. It's much more sparse than that. It's probably not big enough either. I mean,
00:55:34.880 | I don't know the total number of internet links, but the number of actual neuronal connections
00:55:40.480 | in the human brain numbers in the trillions. And I don't think there's that many links in
00:55:43.520 | the internet. All right. I think we have time for one more.
00:55:47.120 | >> All right. Next question is from Teresa. How do you see your work as a professor interfacing
00:55:53.280 | with your writing? Does it support your writing lifestyle or is it fulfilling your other interests?
00:55:57.600 | >> Well, I want to include this in part because I think it's relevant to this discussion of
00:56:03.680 | rethinking technology because that's a lot of the work I do. And also just because of the timing.
00:56:07.680 | So around the time this episode is released, knock on wood, a newer version of my website,
00:56:14.800 | calnewport.com will also be released. So I've rebuilt my web presence to try to
00:56:20.160 | provide more clarity on this question of, well, is Cal a professor? Is he a writer?
00:56:25.280 | Are these unrelated? What about his podcast? Like what goes on in Cal's life? How do these
00:56:30.720 | things interact? I'm reshaping my online presence to try to add some clarity here.
00:56:36.240 | So if you look at the new calnewport.com, once it launches, you'll see I have merged my academic
00:56:42.400 | website with my personal website into one unified website and the homepage will be split right down
00:56:48.960 | the middle. On one side it says Cal the writer; on the other side, it says Cal the academic. So I
00:56:53.840 | directly confront this question on the new version of my website: what are these roles and
00:56:59.520 | how do they interact? And the way I see it is that as an academic, I'm a computer scientist and
00:57:04.720 | technologist who does academic work on those questions. That includes, you know, hardcore,
00:57:12.720 | old fashioned computer science theory. Increasingly, moving forward, my academic work
00:57:18.720 | includes and will continue to include rigorous academic work on issues around technology and
00:57:24.080 | their interaction with culture and society. So there's some shifts that are happening,
00:57:28.400 | potentially in my role at Georgetown and some other work I'm getting involved in. So
00:57:32.160 | I'm an academic that publishes stuff about technology. Some of it's like hardcore math.
00:57:37.280 | Some of it is more tech and culture, tech and society, tech intersecting with work,
00:57:41.120 | tech intersecting with our lives and democracies, et cetera. I'm also a writer who writes about
00:57:47.680 | all sorts of issues that orbit that general theme for non-academic audiences. So the New Yorker,
00:57:55.840 | of course, is like the main article outlet for me. And then I also write books. And again,
00:58:02.720 | my public facing writing, for the most part over the last however many years, is related pretty
00:58:09.440 | strongly to this academic interest. So I'm sort of an intellectual that thinks about technology
00:58:12.960 | and society, a technologist at a university. And then I write these public facing things that
00:58:16.160 | all sort of orbit around it. You know, people sometimes think of me as
00:58:19.600 | talking about productivity. It's not really productivity by itself. I'm talking about
00:58:25.440 | work in a world with all these digital tools and how that upended what work
00:58:31.120 | meant and how do we rethink work in a world where we have ubiquitous internet connections?
00:58:35.840 | How did something like email completely change our relationship to work or a book like digital
00:58:40.480 | minimalism? I'm talking about how our sense of ourself and our vision of the life well lived
00:58:45.200 | was changed by ubiquitous high speed wireless internet access. So a lot of the things I write
00:58:49.120 | about really orbit around technology creating these unexpected side effects in our society.
00:58:55.680 | So I'm an academic and I'm a writer, and they're related, and calnewport.com
00:58:59.840 | makes that clear. Right. So in this work I've done, especially the public
00:59:07.120 | facing writing, one of the things that emerged from it was what I think of as the deep life
00:59:14.880 | movement. So there was this sort of more pragmatic community that emerged from some of my work
00:59:20.400 | about how do I live a deeper life? How do I personally like rethink my life so that my
00:59:26.560 | work is deeper? My life is deeper in an increasingly distracted world. This community,
00:59:31.440 | this movement arose out of all these issues I was writing about in my books and in my articles and
00:59:35.840 | to some degree in my academic work. So then the way I understand that is that I have sort of spun off
00:59:42.560 | this movement, this deep life movement, as sort of its own thing that I helped create, and
00:59:48.240 | now I'm putting it out there into the world. And it's under that umbrella that you have this
00:59:51.760 | podcast. It's under that umbrella that you have the videos I do for YouTube. It's a movement that's
00:59:57.120 | about very pragmatically trying to make your life deeper. Now, this has actually been soft
01:00:03.360 | launched. Me and Jesse have been working on this. We've soft launched a website just for that
01:00:07.280 | movement, thedeeplife.com. And you can actually go check that out right now. Now we're
01:00:11.840 | going to be pushing this more heavily as we get it more up to speed and we get more used to it.
01:00:15.520 | But basically it's a standalone home where you can go and listen to every podcast, watch all
01:00:21.360 | the videos I've done, see clips, any clip from a podcast on the same page as that podcast, go to
01:00:26.880 | all the show notes. There's a page for every podcast episode we've done. We have Netflix
01:00:30.560 | style carousels for everything I'm doing on YouTube. So you can watch the videos I make without
01:00:34.880 | having to go to youtube.com, without having to see algorithmic recommendations. You can just
01:00:39.360 | treat thedeeplife.com like a personal Netflix. This is an online movement that we're
01:00:44.400 | going to grow. Other personalities can eventually be involved, people and content providers that
01:00:51.360 | aren't just me. And so that's its own movement. It's about this one idea that people are very
01:00:56.560 | interested in, that I'm trying to help people with trying to make a positive impact in the world.
01:01:00.000 | But the right way to understand the deep life movement and thedeeplife.com is it's like
01:01:04.720 | when Jonathan Haidt started Heterodox Academy:
01:01:10.720 | a particular movement or organization he created about one idea that he worked on as a
01:01:15.120 | social psychologist. But he's a social psychologist who works on other things. So that's the way I
01:01:19.120 | understand myself. I work on the intersection of technology and culture. I write about a lot of
01:01:23.280 | different things on these issues. One of the things that has been most impactful that I've
01:01:26.880 | worked on has been this idea of the deep life and deep life movement. So I've spun that off into
01:01:30.560 | its own organization and entity and portal and will be continuing to sort of feed and grow that.
01:01:35.280 | And so that's how I understand how all these pieces fit together. I don't know if that's
01:01:39.360 | complicated, Jesse. In my mind, that all makes sense. Yeah, no, I like it. And you
01:01:46.240 | structured it well. And I think thedeeplife.com we'll really be pushing, you know. I'm a slow
01:01:50.720 | productivity guy, so we take our time. But slowly we're growing this slowly growing thing. So I
01:01:56.560 | think the new calnewport.com integrates the academic and writing portions of my life in
01:02:01.680 | the one site. And then thedeeplife.com is the home for the podcast, the home for the
01:02:06.720 | video. So that's not living on calnewport.com. I think there's a lot of clarity for me in
01:02:11.120 | that. Yeah. So I can be an academic who thinks about technology and writes academic stuff on it,
01:02:18.960 | also writes public facing stuff on it. And then also have this entity I created,
01:02:22.320 | thedeeplife.com, that really helps this one movement that came out of my work,
01:02:27.440 | really helps that grow and helps a lot of people. But the important thing, I think, is to distinguish
01:02:32.400 | my general work as a thinker from thedeeplife.com. I write about a lot of other stuff, too.
01:02:36.320 | And I think that's the difference between, let's say, me and just someone who's maybe like an
01:02:41.760 | online personality. It's not like thedeeplife.com and this podcast, like, this is me.
01:02:46.640 | Yeah, it's not like this is all I do. And this is something I started because I think it's very
01:02:54.000 | important, but it doesn't totalize what I work on. You know, a lot of my New Yorker stuff is
01:02:58.960 | deeper looks at particular issues about technology, the workplace, the future of the workplace, that
01:03:04.720 | maybe has nothing to do with the quest to live a deeper life. The academic work I'm working on now,
01:03:08.880 | some of it's pretty technical; it's not directly related to it. So I think by
01:03:13.040 | being able to separate these two, we can really feed the deep life as a movement
01:03:16.400 | and help as many people as possible without muddying it up too much with, oh, and I'm giving
01:03:21.600 | this lecture at this university on digital ethics or something. Like, they're related,
01:03:25.760 | but they're not all the same thing. So we'll see if that's clear. I mean, the new calnewport.com
01:03:30.240 | makes this clear. So it's like, oh, if you want to watch videos or podcasts, that's all
01:03:34.560 | over at thedeeplife.com. Is this the first time your academic stuff has been on the website?
01:03:40.400 | Yeah. So I used to have, it's still up, I'm going to have to sort of change it,
01:03:45.040 | a website at georgetown.edu. And that was all my academic stuff.
01:03:48.640 | So now people can go do a deep dive in your academic world.
01:03:52.160 | You can read my CV. You can look at my publications. Yeah. So now, yeah,
01:03:55.840 | the academic stuff is all going to be on calnewport.com. Knock on wood,
01:04:00.560 | that's going to be live. I mean, it's going live the day after we record this,
01:04:03.680 | hopefully. But website launches are tricky. So maybe it won't. I don't know if that's clear
01:04:09.520 | for people, but that's what I'm trying to do. I mean, I think the big shift happening in my
01:04:12.560 | academic career now is doing more tech and society stuff.
01:04:17.440 | And then, to me, that makes things a lot clearer. So I do academic work on tech
01:04:23.120 | and society. I do public facing writing on that. And that's sort of my role as a public
01:04:27.200 | thinker. Well, you still write these like complex math papers. I mean, I'm working on one now.
01:04:32.240 | Yeah. You like that, though, right? I like that. Yeah. I like being an academic.
01:04:36.080 | So keep your math skills up. It's a weird fake job. I really like it.
01:04:39.440 | All right. So final segment of the show, we shift gears and go to talk about something
01:04:47.040 | interesting, or I just take something interesting that a listener sent to my
01:04:51.680 | interesting@calnewport.com email address. I got a good one, at least one that makes me happy for a lot of
01:04:57.120 | reasons before we get there. Let me just briefly mention another one of our sponsors. That's our
01:05:01.360 | friends at Mint Mobile. So the idea with Mint Mobile is that it is a company that sells premium
01:05:09.040 | wireless service and it does it in an online only format. They don't have physical brick and mortar
01:05:14.640 | stores. So they get rid of all that overhead cost, and so they can sell you really high end
01:05:21.040 | wireless and cellular service cheap. So you can just order right from home, you do it online,
01:05:27.280 | they just send you a SIM card you can stick into any phone, without all that overhead.
01:05:31.760 | You can have cellular wireless service starting as low as just fifteen dollars a month. We're
01:05:38.800 | talking about unlimited talk and text and high speed data delivered on the nation's largest 5G
01:05:42.880 | network. They're so efficient, that's how you can get it this cheap. Two things you can do with Mint Mobile.
01:05:47.520 | One, just lower your wireless bill. I think people just get numbed into their accounts with one of
01:05:53.280 | the big players and how much money they're spending. Like, I don't know, my phone's important
01:05:56.400 | and they just get used to these big bills going out every month. You can get equally good high
01:06:03.840 | speed data, high speed connections for very cheap with Mint Mobile. It's not hard. You do it all
01:06:10.160 | online. They send you this thing, you stick it into the phone and you have it. The other thing
01:06:14.160 | you can do with Mint Mobile is what I do with it. I use it to set up a cheap secondary phone account.
01:06:20.880 | So what I wanted to do, and I brought it in again today, so if you're watching on YouTube,
01:06:25.120 | you can see it. I have it turned off so that you won't accidentally see any personal information.
01:06:30.640 | I got this off Amazon, this NUU phone. It's a flip phone. It has a camera and big buttons.
01:06:37.440 | There's a call button and a hang up button and buttons you can do text messaging with, and
01:06:42.560 | you can make and receive calls. Mint Mobile, 15 bucks a month, SIM card, stuck it right in here.
01:06:47.680 | I now have a dumb phone on which my wife can reach me with text messages. My family can call
01:06:52.320 | me if there's an emergency. I can call someone if I lost something or I'm running late. And it allows
01:06:57.120 | me now to have excursions or days or parts of my day where I say I can bring this, I call it my bat
01:07:02.880 | phone, as emergency backup, but I don't have a phone that has a browser in it on me. I have no
01:07:07.680 | way of checking email. I have no way of going to see how spring training is doing today. And so
01:07:12.480 | this is a cool sort of deep life hack that Mint Mobile can help support. Get a $15 a month plan,
01:07:17.840 | spend 70 bucks on a flip phone, stick that card in here. And now you have the ability
01:07:22.400 | to go off grid without actually being completely off grid. So to get your new wireless plan for
01:07:28.640 | just 15 bucks a month and get that plan shipped to your door for free, go to mintmobile.com/deep.
01:07:34.400 | That's mintmobile.com/deep. Cut your wireless bills to $15 a month at mintmobile.com/deep.
01:07:40.960 | All right. Well, the other thing I want to briefly mention before we get to something
01:07:45.440 | interesting is our other friends at Policy Genius. This is tax season. So I often get
01:07:51.040 | stressed during tax season because I know I'm supposed to be preparing my taxes for our
01:07:55.440 | accountant. It's looming over my head. There's always these ambiguities involved where you're
01:08:00.160 | trying to pull together the information. There's something you don't quite have.
01:08:02.480 | And so I just procrastinate. We all do that. But one of the other very common areas where people
01:08:09.280 | procrastinate is in life insurance. Just like you know you need to prepare your taxes, you know you
01:08:14.000 | need life insurance if there's anyone who depends on you financially. Why don't you have life
01:08:17.920 | insurance? Because it's a pain. You don't know how to start. Where do you go to get it? Who do you
01:08:24.560 | talk to? Do you have to talk to that guy in the red shirt from that commercial or the lady
01:08:30.160 | named Flo? Like, I don't know which insurance company, how does this even work, right? This is
01:08:34.720 | why people don't get life insurance when they know they need it because they don't know where to
01:08:37.680 | start. I'll tell you how to do it. PolicyGenius. They make it easy. Their whole idea is to modernize
01:08:45.200 | the life insurance industry. Their technology makes it easy to compare life insurance quotes
01:08:49.520 | from America's top insurers. In just a few clicks, you can get your lowest price. Click, boom, off
01:08:56.880 | my task list. You just go to PolicyGenius.com and you can get life insurance set up real easy.
01:09:05.040 | With PolicyGenius, you can find life insurance policies that start at just $39 per month for
01:09:09.200 | $2 million of coverage. That's a great price. Some options offer coverage in as little as a week and
01:09:13.760 | they avoid unnecessary medical exams. So if you need life insurance, just go to PolicyGenius.com.
01:09:18.800 | Or click on the link in the description. Okay. So yeah,
01:09:23.440 | you can click the link in the description or just go to
01:09:26.080 | PolicyGenius.com. Your loved ones deserve a financial safety net. You deserve a smarter
01:09:30.480 | way to find it. Buy it. Go to PolicyGenius.com. Get the life insurance. 10 minutes from now,
01:09:34.800 | you can take that off your task list. All right, Jesse, let's do something interesting.
01:09:39.200 | Before we get into the particular thing I want to show you, I just want to mention briefly,
01:09:45.920 | a friend of the show has a new book out and I want to mention that for the interested listeners.
01:09:53.360 | So my friend, Robert Glazer, who runs the Elevate podcast, it's a podcast I've been on
01:09:59.760 | several times. It's been around for a really long time. I really liked that podcast. He has a new
01:10:03.760 | book that's just come out or is just about to come out, depending on when you hear this,
01:10:08.480 | called Elevate Your Team, Empower Your Team to Reach Their Full Potential and Build
01:10:13.440 | a Business that Builds Leaders. I blurbed this book. So let me just read you my blurb.
01:10:18.480 | For those who are watching on YouTube, you can see my picture right here next to Dan Pink. So
01:10:24.080 | here's what I said. "A team that reaches its full capacity is a force to be reckoned with.
01:10:28.080 | Robert Glazer provides an evidence-based roadmap for achieving this goal."
01:10:31.920 | Here's Dan Pink's blurb, just happens to be right next to mine. "This book is at once
01:10:37.440 | perceptive and practical. It will open new vistas for your own thinking about leadership and equip
01:10:41.120 | you with a host of tools and tips to build capacity in your team. Follow Bob Glazer or
01:10:44.800 | prepare to be left behind." So if you run a team, if you're in management, check out Elevate Your
01:10:49.840 | Team. Bob's just a friend of ours. So I wanted to mention that. All right, something interesting.
01:10:55.600 | The article I want to discuss today is titled, where's the actual title here? So I'm loading
01:11:01.520 | this up on the screen for those who are watching episode 237 on YouTube. All right, here's the
01:11:06.080 | title. It's at BuzzFeed. Michael Cera explained why he doesn't own a smartphone. And honestly,
01:11:12.640 | it's a solid explanation. Michael Cera, of course, played George Michael on the epic cult classic
01:11:22.400 | Fox TV comedy, Arrested Development. Jesse, I have to say something that made me happy the other day
01:11:29.200 | is in my class at Georgetown. So I'm teaching 19 year olds who would have been not of TV watching
01:11:36.800 | age when Arrested Development came out. I made a Arrested Development reference in class. So I was
01:11:42.240 | talking about the need that there's, I was saying there's always points when you're doing induction
01:11:47.200 | problems on a, on a exam. So you always get some points for just doing the base case, which is
01:11:51.520 | easy. There's always points in the base case. And I said this, if you need to remember this,
01:11:55.120 | it's just sort of my version of there's always money in the banana stand, which is an Arrested
01:11:58.640 | Development reference. Two students came up to me after class and said, I just want to say,
01:12:02.560 | I really appreciated the Arrested Development reference. So even the young kids today,
01:12:07.680 | at least some of them have come across Arrested Development. It gives me hope for the future.
01:12:12.640 | All right. So Michael Cera was a star of Arrested Development and has done a bunch of other stuff
01:12:19.120 | since. So this Buzzfeed article has some cool details. So let me just read a few things from
01:12:24.560 | this article here. It all warms my heart. In an interview with the Hollywood Reporter
01:12:28.720 | about his latest projects, Michael explained why he's rejected modern cell phone technology and
01:12:33.520 | attempted to keep his private life extremely private. The reality is that Michael is not a
01:12:37.920 | fan of too much socializing and he's completely opted out of social media. "It doesn't feel
01:12:42.480 | conscious," he said. "I guess it's just something that I didn't elect to do because everyone does
01:12:46.720 | it. It starts to feel like a big choice, but it's just not that interesting to me."
01:12:50.240 | In fact, he's so private, you won't even catch him with an iPhone or Android. His explanation
01:12:55.840 | makes a lot of sense. "I also don't have a smartphone," he added, "and that is a conscious
01:12:59.920 | choice because I feel a bit of fear about it, honestly, like I'd really lose control of my
01:13:05.920 | waking life." I think that's cool. So here's the message. My good friend, Michael Cera, and I do
01:13:13.200 | not use social media. He doesn't even use a smartphone. Between the two of us, we have been
01:13:20.800 | involved in multiple Emmy award-winning shows and cool movies. Between the two of us, we're
01:13:26.720 | actually quite famous. He won all the Emmys, he did all the movies. But I think it's cool.
01:13:32.320 | Here's the bigger point. When we say I need to use something, like I need to use social media,
01:13:40.560 | I need to have a smartphone. What we usually really mean by need is I'm afraid that some
01:13:46.160 | bad things might happen if I don't. And Cera gets into this farther on in this BuzzFeed article
01:13:51.680 | about like, yeah, there's probably roles I didn't get. There's probably some opportunities that
01:13:56.240 | aren't available to me because I'm not out there with a big social media presence.
01:14:00.080 | But here's the thing, who cares? He has a cool life. He was involved in a cool show. He's done
01:14:04.960 | cool plays. He's done cool movies. He continues to work. He's not a huge celebrity because he's
01:14:10.000 | not out there. He's not that visible. So he has his private life still relatively private.
01:14:14.320 | He didn't need to use these tools. He just recognized if he didn't, he might lose a few
01:14:19.600 | opportunities. And he said, great, what a fair price. What a fair price to pay to have a life
01:14:24.400 | that's more under my control. That's what I like about this Michael Cera example.
01:14:28.640 | You don't have to fall into this optimization mindset of like, OK, if I'm going to be an actor,
01:14:35.680 | how can I be the most successful possible actor in the history of the world? And I don't want to
01:14:39.760 | do anything that's going to take that away from me. How about just, like, I want to be a working
01:14:42.640 | actor and I have a cool life or I want to be a writer instead of saying I want to optimize to
01:14:47.200 | be the best selling author of all time. And until I outsell James Clear, I am going to every night
01:14:51.680 | stay up and say, what can I do to get there? How about just like I'm an author that sells books
01:14:55.520 | and people like them and it's a really cool life. You know, you don't have to optimize everything.
01:14:59.920 | You don't have to be the baseball player trying to break the record for AAV
01:15:02.720 | in his contract. So that's Michael Cera. He's built a cool life without trying to wring
01:15:08.880 | out every possible bit of exposure or growth. And I like that. The other thing I learned about this
01:15:16.240 | and Jesse, maybe you're more familiar with BuzzFeed, but what the hell is going on with this?
01:15:20.160 | Is this what BuzzFeed is? It's just like pictures with two sentences and another picture that
01:15:24.880 | another two sentences. I never go on BuzzFeed. My goodness. That's the other lesson I take away
01:15:30.240 | from this. BuzzFeed kind of depresses me. See, Michael Cera can probably be probably reading
01:15:36.480 | long New Yorker pieces because he's not distracted by his phone. If you're on your phone all the time,
01:15:40.080 | I guess you have to just read BuzzFeed. So I was a little depressed to finally be exposed to
01:15:44.320 | BuzzFeed. That's kind of depressing, but it's balanced out by the fact that Michael Cera is
01:15:48.480 | the man. George Michael is the man. So I like to call him Mr. Manager. Another rest of development
01:15:54.560 | reference. He doesn't use a phone and he's doing pretty well for himself. All right. Well, speaking
01:15:59.040 | about doing well for ourselves, I think this has been a pretty good episode. I enjoyed doing a
01:16:03.760 | little bit of tech sermonizing and prognostication. If you like what you heard, you'll like what you
01:16:09.680 | saw. The full episode and clips will be available at youtube.com/calnewportmedia. We'll be back
01:16:14.000 | next week with another episode of the show. And until then, as always, stay deep.