
Cal Newport Reacts To Elon Buying Twitter | Deep Questions Podcast


Chapters

0:00 Cal's intro
1:00 Cal talks about a recent Scott Galloway article about Elon
4:45 Cal talks about Twitter not getting enough value out of its users
7:30 Subscription model for Twitter?
12:20 Where Cal disagrees with Galloway
15:15 Cal and Jesse talking about Elon's purchase of Twitter stock

Whisper Transcript

00:00:00.000 | All right, so we have a listener calls episode.
00:00:02.600 | We've got some good calls queued up.
00:00:06.480 | Last week, however, we debuted a new segment,
00:00:08.400 | Cal Reacts to the News, where we take something that's in the news
00:00:13.880 | and I give you my opinions on it.
00:00:17.120 | I thought that went well.
00:00:18.320 | So we thought we would do it again.
00:00:20.040 | So on Monday's episode, we talked about the New York Times' decision to stop
00:00:25.920 | encouraging their reporters to use Twitter.
00:00:29.680 | Now we have another article, which Jesse just handed to me when I got to the HQ
00:00:34.280 | today, and it has to do with the other major Twitter story in the news this
00:00:38.400 | week, which is Elon Musk buying a 9% stake in Twitter and gaining a seat
00:00:45.160 | on the Twitter board of directors.
00:00:47.520 | This seems to have been causing a lot of debate.
00:00:50.480 | I mean, I've talked to some people about this in academic circles and elsewhere.
00:00:54.400 | People are really all over the place.
00:00:58.040 | About Musk in particular, I think people have a hard time trying to figure out,
00:01:03.680 | do I like this guy or hate this guy?
00:01:05.680 | Do I think he's interesting or do I think he's terrible?
00:01:07.560 | 'Cause he doesn't make any of those categorizations easy.
00:01:11.480 | So it's always unclear.
00:01:13.880 | Like, Jesse, let me ask you: pro-Musk or anti-Musk?
00:01:17.120 | I think I'm pro-Musk.
00:01:19.240 | Yeah, for sure.
00:01:19.960 | He gets a lot of press 'cause he's, you know, so ridiculously rich.
00:01:24.200 | Yeah.
00:01:24.520 | He's going to be around for a long time.
00:01:26.400 | He's not even that old.
00:01:27.600 | He's not that old.
00:01:28.320 | Yeah.
00:01:28.680 | I think he's not that much older than us.
00:01:31.400 | He's like 10 years older.
00:01:32.760 | Yeah.
00:01:33.280 | I mean, he's going to be in the news.
00:01:34.720 | He's in the news all the time now, and he's going to be in
00:01:37.480 | the news for a long time.
00:01:38.440 | Yeah.
00:01:38.800 | I mean, I'm looking forward to the Walter Isaacson biography on Musk.
00:01:43.640 | I don't know when that's coming out, but he's such an interesting, weird character.
00:01:48.040 | I think it's going to be a fascinating book, and Isaacson, because
00:01:50.360 | of his stature, can get access.
00:01:52.080 | I mean, I think the thing about Musk is that he has done objectively
00:01:57.720 | impressive, era-defining things in the world of business.
00:02:01.880 | He just made electric cars an industry.
00:02:03.720 | He just made affordable space travel an industry.
00:02:08.480 | Those are crazy things to do, and he became the richest
00:02:10.600 | man in the world doing them.
00:02:11.880 | So he's a larger-than-life character.
00:02:13.800 | When he interacts with the world, however, he does so in like a really weird way.
00:02:19.120 | He'll be joking.
00:02:21.080 | He'll say non sequiturs.
00:02:23.400 | He has no particular tribal allegiances.
00:02:26.360 | He's not particularly super conservative,
00:02:31.480 | so the conservatives are like, I guess we like this guy?
00:02:33.200 | I don't know if we like this guy.
00:02:34.440 | He's not very interested in wokeism,
00:02:37.640 | so the farther left is like, I
00:02:41.400 | guess we just like this guy?
00:02:42.560 | No one's really sure how they feel about him, but anyways.
00:02:45.520 | The more proximate question here is how do we feel about him
00:02:49.120 | buying a 9% stake in Twitter?
00:02:50.880 | And this is the article that Jesse gave me.
00:02:52.840 | It is an article by professor Scott Galloway from NYU.
00:02:58.200 | It's Galloway's dissection and opinions on Elon's purchase of Twitter stock.
00:03:06.240 | And so what I thought I would do here is summarize the major point of
00:03:10.280 | Galloway, give you my reaction to that point, and then talk about a couple
00:03:13.880 | of the things that Galloway says that I somewhat disagree on.
00:03:17.520 | So let's start with Galloway's main point, which is one I actually agree on.
00:03:21.040 | And let me actually preface this all by saying, if you have to choose
00:03:24.400 | between me and him on anything having to do with Twitter, you
00:03:27.600 | should probably trust Scott.
00:03:28.920 | He's actually been involved.
00:03:30.920 | He's had a stake in Twitter.
00:03:32.240 | He was part of an activist shareholder movement that helped
00:03:36.280 | get Jack Dorsey removed from the Twitter board of directors.
00:03:38.680 | So this is someone that actually knows a lot about it.
00:03:40.560 | I don't, but I do have a microphone in front of me.
00:03:43.040 | So I figured we should talk about it.
00:03:44.360 | So here's Galloway's main point.
00:03:46.480 | Twitter is not doing well.
00:03:49.520 | This is largely acknowledged.
00:03:51.560 | This is why Musk was able to buy so much stock at basically a discount.
00:03:55.400 | Galloway acknowledges it's a smart move by Elon.
00:03:59.080 | It's an undervalued stock and it's undervalued because
00:04:03.080 | Twitter is not doing well.
00:04:04.280 | There's different ways to look at it.
00:04:05.400 | Galloway has a bunch of charts.
00:04:06.600 | One of them is of ad revenue.
00:04:09.200 | Twitter is doing, what, $4.5 billion in 2021, compared to $115 billion for
00:04:14.960 | Meta and $209 billion for Google.
00:04:17.880 | Everyone sleeps on Google.
00:04:19.280 | They don't realize how much it still dominates the advertising market.
00:04:22.000 | You know, Meta created its own ecosystem
00:04:26.240 | in which it can sell ads.
00:04:28.200 | Google is basically still selling ads on the rest of the internet.
00:04:32.080 | So people forget how much money they're producing.
00:04:34.720 | Galloway also talks about the enterprise value per daily active user.
00:04:39.080 | For Twitter, it's $131, compared to almost $300 for Meta.
00:04:42.880 | So they're not doing a very good job of actually getting value
00:04:45.880 | out of their active users.
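(For reference: the enterprise-value-per-daily-active-user metric Galloway is citing is just a ratio. Here is a minimal sketch in Python; the enterprise-value and user-count figures in it are rough illustrative assumptions chosen to land near the numbers quoted above, not figures from the article.)

```python
# Enterprise value per daily active user is just EV divided by DAU.
# The EV and DAU figures below are rough illustrative assumptions,
# not numbers from Galloway's article.

def ev_per_dau(enterprise_value: float, daily_active_users: float) -> float:
    """Enterprise value divided by daily active users, in dollars."""
    return enterprise_value / daily_active_users

# Roughly Twitter-scale: ~$29B EV over ~220M DAU is ~$131 per user.
print(round(ev_per_dau(29e9, 220e6), 2))
# Roughly Meta-scale: ~$600B EV over ~2B DAU is ~$300 per user.
print(round(ev_per_dau(600e9, 2e9), 2))
```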
00:04:48.040 | So what should be done about this?
00:04:49.520 | Galloway has a solution that he calls hashtag
00:04:54.560 | [expletive deleted] obvious.
00:04:56.880 | And that solution is: move to a subscription model.
00:05:03.040 | Here's Galloway's elaboration: corporate users and users with large
00:05:08.440 | followings would pay for a fraction of the value they receive.
00:05:10.920 | I have long advocated for this model.
00:05:14.400 | By shifting the company's revenue source from advertisers to users,
00:05:15.920 | subscription aligns economic incentives with user experience
00:05:18.560 | rather than user exploitation.
00:05:19.960 | As I hinted, I agree with that.
00:05:23.600 | I agree with that move.
00:05:25.800 | I think it would be an exciting move, not just for Twitter, but because of the
00:05:28.720 | precedent it would set for how social media could operate in the future.
00:05:34.720 | I think Galloway is absolutely right when he argues that
00:05:40.600 | ad-supported social media platforms have completely misaligned incentives.
00:05:46.600 | I wrote about this a while ago for The New Yorker, a piece a couple of
00:05:49.960 | years ago about indie social media, where I got into this more. But the issue is:
00:05:54.080 | with an ad-supported social media platform, all that matters is engagement.
00:05:58.360 | How long do we keep you on the platform?
00:05:59.880 | The more you're on there, the more ads we can put in front of you, but also
00:06:02.560 | the more data we can gather from you to target those ads. And neither of those
00:06:08.240 | objectives is aligned with what is going to be the most satisfying experience for
00:06:14.160 | a user, or the best experience for a user from even just a
00:06:17.880 | privacy standpoint or any other standpoint you might want to look at.
00:06:21.360 | When you put all the data you can find about a user into a black-box
00:06:25.200 | machine learning algorithm to have it figure out what to show you so that
00:06:29.040 | you'll stick around more, what you're doing is getting artificial neural
00:06:34.800 | networks to learn the darkest recesses of the human brain: to figure out
00:06:40.020 | about the brain stem, about our tribal reactions, our fear reactions, right?
00:06:44.520 | There are very powerful things in the human psyche, and those artificial
00:06:48.360 | neural networks taking in all this data and running their multi-armed bandit
00:06:53.120 | reinforcement algorithms to figure out what makes you stick around longer
00:06:57.800 | are just figuring out stuff that darker souls in the past had also deduced.
00:07:02.920 | Let's make you angry.
00:07:04.480 | Let's make you sad.
00:07:05.960 | Let's get your back up against the wall and feel like there's a fight to be had.
00:07:09.960 | And Twitter is really good at that, because the main grist for Twitter is
00:07:14.720 | individual interaction, right?
00:07:19.500 | It's words from people, and you interacting with words from other people.
00:07:23.400 | It's not pictures of things.
00:07:25.000 | It's not articles or videos that maybe people are talking about. It's just direct
00:07:28.960 | human interaction, going through a curation algorithm
00:07:32.920 | that is trying to maximize engagement, and it creates
00:07:35.440 | a terrible Lord of the Flies situation:
00:07:37.880 | everyone going after everyone. Seven seconds after being on Twitter, you feel
00:07:42.660 | upset.
00:07:43.160 | That is the outcome.
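(For the curious: the engagement-maximizing loop described above can be sketched as an epsilon-greedy multi-armed bandit. Everything in this Python sketch is invented for illustration: the content categories, the hidden engagement probabilities, and the reward signal. Real feed-ranking systems are vastly more complex, but the incentive structure is the same.)

```python
import random

# "Arms" the bandit can pull: candidate content categories to show.
ARMS = ["outrage", "cute_animals", "sports", "tribal_politics"]

# Hidden "true" probability that each category keeps a user engaged.
# These numbers are made up; the algorithm never sees them directly.
TRUE_ENGAGEMENT = {"outrage": 0.7, "cute_animals": 0.4,
                   "sports": 0.3, "tribal_politics": 0.6}

counts = {arm: 0 for arm in ARMS}    # times each arm was shown
values = {arm: 0.0 for arm in ARMS}  # running mean observed reward
EPSILON = 0.1                        # exploration rate

def choose_arm():
    """Mostly exploit the best-looking arm; occasionally explore at random."""
    if random.random() < EPSILON:
        return random.choice(ARMS)
    return max(ARMS, key=lambda a: values[a])

def update(arm, reward):
    """Incrementally update the running mean reward for the chosen arm."""
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]

for _ in range(10_000):
    arm = choose_arm()
    # Reward is 1 if the simulated user kept scrolling, 0 if they left.
    reward = 1 if random.random() < TRUE_ENGAGEMENT[arm] else 0
    update(arm, reward)

# The loop converges on whatever maximizes engagement (here, outrage),
# without anyone ever telling it to prefer anger.
print(counts)
```

The point of the sketch is the incentive: nothing in the code says "make people angry"; the preference for the angriest content simply falls out of optimizing for time-on-platform.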
00:07:44.640 | And so I think Galloway is onto something.
00:07:46.120 | If the money is from subscriptions, not ads,
00:07:52.740 | then the income comes from users wanting to pay those subscription fees.
00:07:57.840 | Why do users want to pay subscriptions to things?
00:07:59.720 | Not because they use it all the time, but because they value what they get when
00:08:04.720 | they do use it.
00:08:06.080 | It's a completely different model.
00:08:08.760 | And now you want the experience to be informative, uplifting, interesting,
00:08:12.480 | entertaining.
00:08:13.140 | It's a completely different incentive.
00:08:14.540 | Now, I don't know if I share Galloway's optimism that this would create a
00:08:21.000 | monster company.
00:08:23.520 | I think once you start charging, you might have a severe contraction of user
00:08:28.720 | base, right?
00:08:30.360 | But I think it would be a very profitable company and a very useful company.
00:08:33.160 | And I hope that actually happens.
00:08:34.880 | Now there's different ways to go forward here.
00:08:36.760 | It could be that the people who post have to pay, but anyone can read.
00:08:40.400 | I think that's probably fine.
00:08:41.560 | He didn't mention this, but I think getting rid of pseudo-anonymity
00:08:45.200 | would be good.
00:08:45.840 | It's real people talking to other real people.
00:08:49.460 | You have to stand by what you're doing.
00:08:52.920 | You know, I think that would be good.
00:08:54.000 | There's a lot of other details to work out, but I really do think it probably
00:08:56.640 | would become a better experience if it was subscription-based, and it would teach
00:09:00.400 | the world that
00:09:00.900 | you can run a really profitable, good online user-generated content company
00:09:07.640 | without having to have ads be what's at the core of it.
00:09:11.240 | Now, if it was very profitable and grew to be really big, that'd be great
00:09:15.160 | because then more people would do it.
00:09:16.320 | I don't know if it would be, but I like the idea.
00:09:18.000 | So I mentioned two things I didn't quite agree with Galloway about.
00:09:22.560 | Again, I defer to him, but let's just throw it out there.
00:09:24.760 | Number one, his enthusiasm for Twitter. I can't find his exact wording
00:09:33.080 | here, but basically he calls it one of the most important companies in
00:09:39.560 | history.
00:09:42.800 | It's like, this is obviously one of the
00:09:45.560 | most important companies in the world, its impact on the world
00:09:48.680 | is critical, and so it's very important what happens to it.
00:09:52.200 | He says that somewhere; I don't have the exact terminology.
00:09:55.560 | All right.
00:09:56.160 | I don't completely agree with that.
00:09:57.760 | I think there's, ironically, an echo chamber around Twitter for
00:10:03.120 | Twitter users.
00:10:03.880 | It disproportionately matters for a very small segment of the population that has
00:10:09.720 | a lot of power: reporters, politicians, people who are
00:10:14.160 | involved in criticism, content producers, writers. There's a pretty
00:10:20.240 | small group that has a lot of influence on culture for whom Twitter is at the
00:10:23.160 | center of the universe.
00:10:24.320 | And I think it's easy to generalize from that and be like, this thing is at the
00:10:28.720 | center of most people's universes.
00:10:30.080 | Most people don't care about it.
00:10:31.240 | Its impact on most people is second order.
00:10:33.200 | And maybe this is what Galloway had in mind: most people
00:10:36.160 | don't use Twitter,
00:10:37.400 | they don't care about Twitter, or if they use it, they use it
00:10:39.400 | rarely, like I would, which is: I need to look up what this Nationals beat
00:10:45.080 | reporter is saying about what just happened in the baseball game.
00:10:48.200 | Right.
00:10:49.600 | But they are getting a second-order impact, because, as we talked about on
00:10:52.640 | Monday, the news they're encountering, the bills being passed in Congress,
00:10:57.240 | a lot of that is being influenced by Twitter.
00:10:58.960 | So there is a second-order impact, but I think 99% of the people in our
00:11:03.360 | country, if you turned Twitter off tomorrow, wouldn't notice.
00:11:05.960 | They just don't use it that much.
00:11:08.000 | So I don't think it's as central as the people who are in that world
00:11:12.080 | think. It's central to their world, but they don't realize that that's
00:11:14.840 | a more constrained world.
00:11:18.200 | The second thing is Galloway's evaluation of Elon.
00:11:24.400 | Let's get to the question of, is it good or bad for Twitter?
00:11:28.400 | And what Galloway says is, yeah, it's kind of mixed.
00:11:30.840 | The fact that, you know, you bring in this larger-than-life
00:11:35.480 | character, who's obviously a genius, and you put him on the board, it's
00:11:38.200 | going to shake things up.
00:11:39.120 | It was smart for Twitter to put him on the board.
00:11:41.320 | It's better to have activists on your board than to have them attacking
00:11:43.880 | you from the outside.
00:11:44.520 | He said that Twitter needs a kick and this might be that kick, but he's really
00:11:49.320 | worried about Elon's personality traits.
00:11:52.080 | He said, Elon is very volatile.
00:11:54.040 | He will tweet all sorts of random things.
00:11:57.400 | He might take Twitter in arbitrary directions.
00:11:59.680 | Galloway was upset that Elon was just doing Twitter polls about major new
00:12:04.160 | features, like that he might just add an edit button because people said that was
00:12:08.360 | a good idea. And Galloway's right about that.
00:12:10.560 | Elon is a volatile person.
00:12:12.080 | And so you worry about what that's going to do over time to
00:12:14.960 | Twitter and its share value.
00:12:16.120 | The place where I probably disagree with Galloway, though, and it's a disagreement
00:12:20.320 | I have with a lot of people, is it seems like a lot of people, especially around
00:12:24.440 | here, their take on Musk and Twitter is that Musk is upset about moderation
00:12:31.560 | and what he wants is basically unrestricted speech on Twitter.
00:12:37.120 | And so typically people say, here's the problem with people like Musk:
00:12:41.200 | they want complete free speech.
00:12:43.040 | And then, like Galloway does, they go on to say the First Amendment
00:12:46.160 | doesn't guarantee free speech to everyone.
00:12:47.920 | It guarantees freedom from government intervention, and it's fine for
00:12:50.640 | companies to moderate speech.
00:12:51.640 | And if you really want unfettered free speech where you're just stopping
00:12:54.160 | illegal stuff, you get 8chan, which is terrible.
00:12:56.480 | And I think most people would agree with that.
00:12:58.960 | I don't think, though, that that's what the people Galloway and others like
00:13:02.560 | him are critiquing actually want.
00:13:03.760 | I don't think that's what they want.
00:13:04.920 | I think that's a bit of a straw man.
00:13:06.160 | I mean, I think a lot of people, and here
00:13:10.760 | it gets a little bit partisan,
00:13:11.960 | want that to be the thing that someone like Elon is advocating
00:13:16.560 | for, because it's easy to dismiss.
00:13:17.960 | I don't know Elon.
00:13:19.160 | I don't really know his stance on Twitter that well.
00:13:21.000 | I know some people who know Elon though, who are in his orbit.
00:13:24.040 | So I can kind of guess at where he stands.
00:13:26.040 | I would assume Elon's complaint is basically that, no, of
00:13:30.720 | course there should be moderation.
00:13:31.880 | If Twitter was like 8chan, no one's going to want to use it, at
00:13:36.840 | least not at the scale it has. But this moderation should be more centrist.
00:13:41.640 | And I think that's a harder thing to argue about.
00:13:43.880 | If, let's say, you're coming at this more from a position
00:13:49.720 | to the left, that's a harder thing to argue about: that there should be
00:13:52.400 | moderation, but the moderation should align more with centrist positions,
00:13:56.320 | what, let's say, the majority of the country might be comfortable with.
00:13:59.720 | And I think Elon's argument is that too often, when it's politically
00:14:04.760 | related, moderation decisions will align with relatively far-left
00:14:11.240 | perspectives that are shared by maybe 6 or 7% of the country.
00:14:14.200 | And that feels great for those people who share those views, but for everyone
00:14:18.240 | else, it can feel arbitrary or weird, or like they're left out.
00:14:22.160 | And so we should come at it from a centrist position.
00:14:24.480 | If it's something that would upset your aunt, then you're like, okay,
00:14:30.000 | maybe that shouldn't be there.
00:14:31.040 | Maybe that should be the standard, and the standard shouldn't be: is
00:14:34.400 | this something that is going to upset a radical critical theorist?
00:14:39.480 | You know, that's a bigger stance.
00:14:41.480 | So I don't know.
00:14:43.080 | That's another thing where I disagree with Galloway: I don't
00:14:45.640 | think Musk wants 8chan on Twitter.
00:14:47.160 | I think he just wants the aunt standard and not the critical
00:14:52.880 | theorist standard, I suppose.
00:14:54.200 | So that's where I land. What do you think, Jesse?
00:14:57.480 | Well, what about the part about Elon filling out the wrong SEC form?
00:15:03.880 | What are your views on that?
00:15:04.920 | Yeah.
00:15:05.560 | I mean, I think that falls under the he's volatile critique, which is fair.
00:15:09.880 | I think there's probably a little bit more planning in it than that.
00:15:14.480 | I think it's similar to the Brady retirement thing
00:15:17.040 | and then him going back. Like, they have this stuff planned out.
00:15:20.000 | So I'm not up on the latest breaking news.
00:15:23.280 | Is the thought that this is so he can then step back out?
00:15:26.680 | Well, he announced it on April 4th or whatever,
00:15:34.280 | and then the stock price immediately went up. But he
00:15:37.480 | had filled out the wrong form.
00:15:39.120 | He was technically supposed to announce it like two weeks prior,
00:15:41.640 | in which case, when he was buying all that stock, the price would have gone up.
00:15:45.800 | So essentially he saved himself like $150 million, but Galloway
00:15:49.160 | was saying that that's fraud.
00:15:50.360 | Oh, well, that might be.
00:15:52.120 | Yeah.
00:15:52.400 | I mean, I think that falls under the volatility.
00:15:55.480 | He's not great with corporate governance.
00:15:58.000 | He sort of flies by the seat of his pants.
00:15:59.640 | I mean, it kind of works well in creating new industries, but
00:16:01.760 | then also causes trouble.
00:16:02.960 | Yeah.
00:16:03.720 | I don't know.
00:16:05.520 | I don't think that he just filled out the wrong form.
00:16:08.240 | I feel like he probably... I'm reading this now.
00:16:11.920 | Yeah.
00:16:14.080 | It's down at the end, like where I saw it.
00:16:16.440 | Yeah.
00:16:19.200 | Well, that's a whole other issue, but yeah, again, you're
00:16:22.280 | bringing Musk into your world.
00:16:23.760 | All this stuff follows.
00:16:25.200 | Well, a lot of this stuff in other publications, that's
00:16:30.000 | what the headline is about, you know:
00:16:33.080 | it's the form thing.
00:16:33.720 | Yeah.
00:16:34.080 | It's the form and saving money and buying Twitter.
00:16:36.320 | Yeah.
00:16:37.040 | To me, that's like the least interesting part of the story.
00:16:40.760 | I mean, it's interesting, but I also think the reason
00:16:45.320 | that's the headline in a lot of places
00:16:46.960 | is that we're working backwards from, we don't like this guy.
00:16:50.760 | So let's emphasize the look-he-might've-done-fraud-here angle, which
00:16:56.920 | I think is big. But there are also these huge discussions of,
00:16:59.800 | what should Twitter be?
00:17:00.720 | And is he right?
00:17:01.400 | Is he wrong?
00:17:02.640 | And yeah, when I talk to people around here, a lot of them
00:17:09.680 | are really worried that he wants to get rid of all
00:17:14.000 | moderation, which, again, I think is crazy.
00:17:16.080 | No one would want to use a platform where it's like, eh, people can say
00:17:20.040 | whatever, as long as it's not defamation or putting someone in imminent
00:17:23.680 | harm. Those platforms exist, and they're weird, and they melt people's
00:17:27.760 | eyes, and they're terrible, and no one likes to be there.
00:17:30.360 | And so, assuming he does get to keep his stock,
00:17:36.480 | I can't imagine that's what Elon Musk wants.
00:17:38.400 | Yeah.
00:17:39.320 | Again, I really think for a lot of these intellectual dark web types, who all
00:17:44.640 | seem to know him, that's really what they would want:
00:17:50.800 | the would-it-upset-my-aunt test.
00:17:52.160 | Yeah.
00:17:53.240 | Which would moderate a lot of stuff, but wouldn't completely align with either
00:17:57.360 | side of the political spectrum's more extreme stances.
00:18:01.040 | Like, I think that stance could very well still moderate things like
00:18:06.120 | vaccine misinformation in a way that people on the right would be really
00:18:12.360 | upset about.
00:18:13.080 | But it would also not kick people off for things that people on the left
00:18:17.960 | think they should be kicked off for.
00:18:19.080 | But your aunt would be like, I don't understand what the problem is.
00:18:20.960 | Like there's probably a middle ground there.
00:18:22.720 | And there's a whole theory in media criticism about this: when media outlets
00:18:29.560 | like newspapers used to make most of their money off advertisements, the goal
00:18:33.240 | was to have the largest possible circulation.
00:18:35.360 | And so the second-order effect of that is that you get coverage that is
00:18:41.240 | aimed at the broadest possible audience, because you need the broadest possible
00:18:43.880 | audience to pick your paper up from the newsstand and read it,
00:18:46.320 | so you can get those ad reads.
00:18:47.360 | And then when things shifted behind paywalls, it changed all the incentives.
00:18:50.960 | And now it's more about, how do we keep the people who are paying us money each
00:18:54.880 | month excited about this?
00:18:57.240 | And the incentives change more towards, well, let's figure out what
00:18:59.800 | they're into, who they support, who they dislike.
00:19:02.920 | And let's really attack the people they dislike and really support the
00:19:07.080 | things they do like; let's validate their particular whatever.
00:19:10.120 | And so that's an interesting theory: the
00:19:13.680 | Overton window significantly realigns when you shift your actual economic model.
00:19:19.560 | And so that's a really interesting theory.
00:19:21.120 | And I think that's a really interesting way to think about it.
00:20:02.880 | [MUSIC PLAYING]