
A Brief History Of How Giant Internet Companies Print Coin | Deep Questions With Cal Newport


Chapters

0:00 Cal's intro
3:55 The Old Model
8:34 Trying to make money from online content
10:05 The Network model
14:46 Externalities
15:02 The Loop

Whisper Transcript

00:00:00.000 | All right, segment number one, deep dive.
00:00:02.240 | I call this deep dive, loops, networks, and links.
00:00:07.240 | What I'm gonna talk about here
00:00:11.080 | is something you probably never knew you should care about,
00:00:13.320 | which is distributed curation
00:00:16.360 | of user generated content online.
00:00:18.680 | That is the most boring title you have probably heard.
00:00:21.860 | This is why most people don't think about it,
00:00:23.320 | but it is actually a subject that is incredibly important
00:00:26.160 | for understanding the dynamics
00:00:27.360 | of the current internet economy.
00:00:30.600 | So let me start with the backdrop to this discussion,
00:00:35.040 | which is this idea that right now,
00:00:37.160 | if we look at the internet economy,
00:00:38.960 | there are companies making a huge amount of money
00:00:42.320 | monetizing content generated by users,
00:00:48.080 | largely unpaid users generating content,
00:00:51.000 | making lots of money.
00:00:51.880 | And I'm underscoring the word lots here.
00:00:54.680 | We're talking about some of the biggest corporations
00:00:57.040 | in the world right now are using this model.
00:01:01.280 | So it's not just, here's a nice niche
00:01:03.880 | where some people made some money.
00:01:05.040 | Monetizing user generated content on the internet
00:01:08.800 | has become a massive industry.
00:01:11.880 | Here's the thing, that is new.
00:01:13.380 | That is relatively new.
00:01:15.680 | That's less than 20 years old,
00:01:17.760 | this idea that you could make a lot of money
00:01:20.480 | off of this type of content.
00:01:21.740 | So quick beats of the timeline leading up
00:01:24.800 | to this current state of our economy,
00:01:26.640 | before the web, before the consumer facing web
00:01:30.640 | became available in the 1990s,
00:01:32.560 | there was basically no way to make a lot of money
00:01:35.300 | off user generated content.
00:01:36.840 | The model was, if you're a media company,
00:01:40.860 | pay a small number of talented people to create content
00:01:43.880 | to be consumed by a lot of consumers.
00:01:46.600 | It's not user generated content,
00:01:47.660 | it's highly paid professional generated content.
00:01:51.060 | This created various economies.
00:01:52.920 | So if you were trying to reach a very broad audience,
00:01:54.880 | like you're a national television network,
00:01:57.000 | there could be a lot of competition for this talent
00:01:58.960 | because you wanted the best television writer,
00:02:00.760 | you wanted the best actor,
00:02:01.860 | and they could be really highly paid.
00:02:03.400 | But this model didn't require the superstar economics
00:02:05.680 | because of localization.
00:02:08.000 | This is why you had conglomerates like Gannett
00:02:10.800 | become really, really big
00:02:13.000 | because they found that was a good model
00:02:14.640 | to buy up local papers.
00:02:16.280 | Local papers used to be a lucrative model.
00:02:18.680 | You pay people who are as good as anyone else
00:02:21.660 | who's writing for your particular market,
00:02:24.340 | and then you can make money off of everyone
00:02:26.060 | who lives in that market.
00:02:26.900 | All right, Web 1.0 comes along, '96, '97, '98.
00:02:31.700 | Now it is possible for individuals to create content
00:02:37.380 | that can be consumed by anyone else.
00:02:39.900 | We gotta emphasize that this was a major transformation
00:02:47.120 | in the history of media production,
00:02:49.740 | that now anyone could produce content
00:02:51.660 | that could be consumed by anyone else.
00:02:53.700 | We're talking, of course,
00:02:54.540 | almost primarily about written content here.
00:02:57.120 | This did not change the main media economics yet.
00:03:01.100 | It was too hard to do.
00:03:02.320 | It was technically demanding.
00:03:04.380 | You had to hand code HTML often
00:03:06.220 | to try to put stuff online.
00:03:07.300 | It didn't really look that good.
00:03:08.980 | Most of the leveraging of the first web revolution
00:03:12.260 | was actually by media companies
00:03:14.320 | that already were using the old model,
00:03:16.900 | small number of highly paid writers serving lots of people,
00:03:19.760 | to reduce distribution cost.
00:03:22.600 | So you didn't get brand new creators.
00:03:26.440 | You got existing creators like Time Magazine realizing
00:03:30.480 | if we release on the web,
00:03:31.840 | it's cheaper than printing things on paper.
00:03:33.640 | Then we get Web 2.0.
00:03:36.500 | This is the major turning point
00:03:38.220 | in the economics of content.
00:03:42.940 | Web 2.0, which happens once we get to the new millennium,
00:03:46.500 | is where we made it easier for people
00:03:51.620 | to publish information on the web.
00:03:54.620 | Now, instead of hand coding HTML,
00:03:58.100 | you can type into a box and click Submit or click Post.
00:04:01.880 | For the old school web geeks among you,
00:04:05.920 | there were small technical innovations
00:04:07.760 | that were critical here,
00:04:08.600 | like AJAX (asynchronous JavaScript and XML),
00:04:10.800 | which made it possible to send information
00:04:13.360 | from a web page to a server
00:04:14.520 | and have the response update the page
00:04:16.580 | without having to reload the whole thing.
00:04:18.200 | These were the little innovations that made Web 2.0 possible.
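To make that concrete, here is a minimal sketch of the pattern AJAX enabled, written with the modern fetch API rather than the original XMLHttpRequest. The "/api/posts" endpoint and the element IDs are hypothetical, purely for illustration.

```typescript
// Minimal sketch of the AJAX pattern: send the user's post to the server in the
// background, then update one element of the page without a full reload.
// Uses the modern fetch API; the endpoint and element IDs are hypothetical.
async function submitPost(): Promise<void> {
  const box = document.getElementById("post-box") as HTMLTextAreaElement;

  // Asynchronously send the new content to the server.
  const response = await fetch("/api/posts", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text: box.value }),
  });
  const saved: { text: string } = await response.json();

  // Update just the feed element; the rest of the page stays put.
  const feed = document.getElementById("feed");
  if (feed) {
    const item = document.createElement("p");
    item.textContent = saved.text;
    feed.prepend(item);
  }
  box.value = "";
}
```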
00:04:20.840 | Now it was easy enough
00:04:21.820 | that almost anyone could generate content.
00:04:23.580 | This made it possible to generate a ton of content.
00:04:26.680 | But before we could mine this new resource,
00:04:31.180 | this information resource into massive companies
00:04:33.520 | and into massive monetizations,
00:04:35.240 | curation had to be solved.
00:04:38.540 | And this is what I wanna talk about,
00:04:40.140 | is the evolution of curation once Web 2.0 came along.
00:04:45.140 | 'Cause it turns out
00:04:46.120 | having a lot of people generating content does you no good.
00:04:49.440 | If you're trying to make money,
00:04:51.340 | selling content does you no good
00:04:53.980 | if you can't select for your audience
00:04:56.400 | stuff they actually want to see.
00:04:59.240 | So this goes back to the title
00:05:01.460 | of my deep dive: loops, networks, and links.
00:05:04.840 | These are the three dominant models
00:05:07.180 | of curation of user generated content
00:05:10.100 | that emerged, and that title lists them in reverse chronological order.
00:05:12.580 | Links were first, then the network model,
00:05:15.020 | then the loop model.
00:05:15.860 | I wanna walk through these three models briefly,
00:05:18.760 | how they work and their advantages and disadvantages.
00:05:20.740 | And I think this will help clarify
00:05:22.260 | a lot of what we see going on.
00:05:23.780 | All right, so the first effective model
00:05:28.380 | for curating user generated content in the Web 2.0 era
00:05:32.100 | was the link model.
00:05:34.940 | When I say link, I'm talking about hyperlinks.
00:05:38.300 | This is how the blogosphere worked
00:05:40.940 | during those early years
00:05:42.880 | of this content production revolution.
00:05:45.580 | It is a distributed curation method that is very human.
00:05:49.440 | It is based on human webs of trust
00:05:53.900 | that are augmented with digital technology.
00:05:56.140 | So here's roughly how it works.
00:05:57.720 | If I'm gonna enter the world of blogs and websites
00:06:00.900 | that are linked to each other,
00:06:02.100 | I'm gonna enter it in a place
00:06:03.340 | where I have a pre-established trust relationship.
00:06:06.700 | Okay, I know this person.
00:06:08.600 | This person has a foot in traditional media maybe.
00:06:11.280 | I've seen their newspaper column.
00:06:12.740 | They write books I care about.
00:06:14.260 | Friends of mine have really pushed.
00:06:15.740 | This is the smart person that you need to read.
00:06:17.980 | Okay, so I enter into this web of trust relationships
00:06:21.000 | through an entryway of pre-existing trust.
00:06:23.780 | I then see who are the people I already trust linking to.
00:06:28.400 | If sufficient people link to a new source of information,
00:06:33.700 | a new blog or a new website,
00:06:34.980 | and that website has sufficient aesthetic capital
00:06:39.500 | that it's trustworthy,
00:06:40.700 | it's not a weird gray background website
00:06:43.000 | with animated gifs of eagles and what have you,
00:06:47.020 | then that will enter into my web of trust.
00:06:49.060 | I will trust that too,
00:06:50.100 | and I will begin to consume that content.
00:06:51.560 | Now that new site, when it's linking to something else,
00:06:54.380 | again, will help convey trust into these new targets
00:06:58.220 | and help expand that web.
00:06:59.820 | So ultimately this is humans building trust,
00:07:03.260 | and then using that trust to expand
00:07:06.100 | where you get your content from.
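Nothing actually ran this as software; the curation lived in readers' heads. But as a rough sketch of the logic just described, you can model it as trust propagating over a link graph: start from sources you already trust, and admit a new site once enough already-trusted sources link to it. The data shapes and threshold below are invented for illustration.

```typescript
// Rough sketch of link-based curation as trust propagation over a link graph.
// Nothing like this ran as software; the "algorithm" was carried out by readers.
type Site = string;

function expandWebOfTrust(
  links: Map<Site, Site[]>, // who links to whom
  seeds: Site[],            // sources you already trust
  threshold: number         // trusted inbound links needed to admit a new site
): Set<Site> {
  const trusted = new Set<Site>(seeds);
  let grew = true;
  while (grew) {
    grew = false;
    // Count inbound links to each candidate from currently trusted sites.
    const inbound = new Map<Site, number>();
    for (const source of trusted) {
      for (const target of links.get(source) ?? []) {
        if (!trusted.has(target)) {
          inbound.set(target, (inbound.get(target) ?? 0) + 1);
        }
      }
    }
    for (const [site, count] of inbound) {
      if (count >= threshold) {
        trusted.add(site); // enough trusted people link here; admit it
        grew = true;
      }
    }
  }
  return trusted;
}
```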
00:07:09.580 | This was remarkably effective.
00:07:13.820 | It actually works really well,
00:07:15.980 | if you actually stick with it,
00:07:18.000 | at excavating really interesting quality sources
00:07:22.860 | of information that you might not have otherwise
00:07:24.260 | had access to, and more importantly,
00:07:25.900 | filtering out the weirdness,
00:07:27.740 | 'cause in this model
00:07:29.660 | it's very difficult to get into someone's web of trust.
00:07:32.900 | So weird conspiratorial work, blatant misinformation,
00:07:37.900 | just general emotional outrage and ickiness
00:07:42.420 | could not gain a lot of traction
00:07:44.220 | in the link model of curation
00:07:46.340 | because it would never get the entry point.
00:07:48.780 | It is very hard, in other words,
00:07:50.780 | to see something like QAnon gain a lot of traction
00:07:55.500 | in, let's say, the 2006 online ecosystem,
00:07:59.620 | because if you're one of these initial,
00:08:02.700 | somewhat eccentric QAnon conspiracy theorists
00:08:03.860 | with one of these blogs,
00:08:06.660 | how is that gonna get into a web of trust
00:08:08.180 | that's gonna intersect mine?
00:08:09.340 | It probably won't.
00:08:10.780 | It probably won't.
00:08:11.860 | And so it worked pretty well.
00:08:13.180 | The disadvantages were two.
00:08:15.300 | One, it was hard to monetize this type of world
00:08:19.260 | of user-generated information.
00:08:20.740 | The blogosphere was famously hard to monetize,
00:08:23.460 | both for large networks and individual content creators.
00:08:26.980 | There just wasn't the model there.
00:08:28.740 | So that was an issue.
00:08:31.420 | You had the individual writers.
00:08:32.820 | It was hard to aggregate them.
00:08:34.540 | You can look at Nick Denton and Gawker.
00:08:36.820 | There's a lot of interesting oral history
00:08:40.420 | on trying to make money off that model.
00:08:42.060 | It was difficult.
00:08:43.300 | And two, it's hard work.
00:08:45.580 | So if I wanna consume content, I have to do a lot of work.
00:08:50.220 | You actually have to spend a lot of time online.
00:08:51.900 | You have to see, build trust, expand this web of trust.
00:08:54.860 | This takes a lot of time surfing,
00:08:56.140 | following links, being exposed.
00:08:57.700 | So it was biased towards heavy tech users.
00:09:01.460 | If you wanted to create content, it was even harder.
00:09:04.340 | It was very difficult to gain a foothold.
00:09:07.860 | So this was the flip side of the filtering
00:09:09.820 | and curation being very effective.
00:09:12.340 | It allowed a lot more voices than existed
00:09:14.140 | in the world of newspapers, TV, and radio only.
00:09:16.660 | But it was really hard, if you're starting from scratch,
00:09:18.580 | to gain access to these webs of trust.
00:09:21.100 | I mean, I remember in the early days
00:09:23.260 | of my blog at calnewport.com,
00:09:25.540 | when I used to focus only on student advice,
00:09:29.500 | I specifically remember being very frustrated
00:09:33.700 | when I would see links from more established, trusted blogs.
00:09:38.060 | Lifehacker was one that comes to mind,
00:09:40.140 | to other student advice sources
00:09:42.140 | that I thought I was better than.
00:09:44.140 | It's like, look, I publish these books.
00:09:45.460 | My advice is better.
00:09:46.500 | Why aren't they linking to me?
00:09:47.780 | It's 'cause it's a very slow moving system.
00:09:49.900 | Eventually I gathered enough trust
00:09:51.580 | to get linked to a lot by those types of sources,
00:09:53.860 | but it could take years.
00:09:55.020 | So it was not very exciting for content creators.
00:09:57.900 | It was very difficult, very difficult.
00:10:00.940 | All right, that led to model number two.
00:10:03.260 | We'll call this the network model.
00:10:06.660 | Facebook was the innovator here.
00:10:08.780 | They figured out, okay, if we have a social network
00:10:11.380 | where we make it easy for anyone to create content,
00:10:16.460 | so now we can greatly increase the pool
00:10:19.260 | of possible content out there
00:10:20.820 | by having these very slick web interfaces,
00:10:22.940 | so you don't have to worry about what you look like.
00:10:25.060 | You don't have to worry about doing the hard effort
00:10:27.780 | of gaining aesthetic capital to convince people
00:10:30.060 | that you're someone legitimate.
00:10:31.140 | Everyone looks the same.
00:10:31.980 | Take that off the table.
00:10:33.740 | You don't have to set up a blog,
00:10:35.180 | WordPress account somewhere.
00:10:36.100 | Take that off the table.
00:10:36.940 | You just sign up for this account, click these buttons.
00:10:38.660 | It looks great.
00:10:40.060 | And if we can get people, they realize,
00:10:42.260 | to do the work on their own
00:10:44.300 | of teaching us who their friends are,
00:10:46.180 | we can leverage that underlying social graph
00:10:50.100 | to do the curation.
00:10:51.580 | So now, instead of people having to do
00:10:53.300 | the individual hard work of being on the internet a lot
00:10:56.460 | and following links and building up this web of trust,
00:10:58.940 | you can have a newsfeed.
00:11:00.140 | The newsfeed will fill in with what's interesting to you.
00:11:03.380 | And what Facebook realized is,
00:11:05.100 | well, if we see what your self-declared friends
00:11:08.700 | are interested in,
00:11:10.140 | we will guess you're probably interested in that too.
00:11:12.820 | We can use friend relationships
00:11:14.500 | plus a little bit of magic secret sauce
00:11:16.180 | to try to keep out redundant information
00:11:18.060 | and keep things fresh,
00:11:19.860 | to curate for you
00:11:21.860 | stuff based off of your friend relationships.
00:11:23.780 | And that worked out really well.
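As a toy illustration of that idea, and only that, since the real ranking systems are proprietary and vastly more complex: score each candidate post by how much of its engagement comes from your declared friends, with a freshness discount so the feed stays current. The field names and weights below are invented.

```typescript
// Toy sketch of friend-graph curation: rank candidate posts by how many of the
// viewer's friends engaged with them, discounted by age to keep the feed fresh.
// Real ranking systems are far more complex; this only shows the shape.
interface Post {
  id: string;
  authorId: string;
  engagedUserIds: Set<string>; // users who liked or commented on this post
  ageHours: number;
}

function rankFeed(posts: Post[], friendIds: Set<string>): Post[] {
  const score = (p: Post): number => {
    let friendEngagement = 0;
    for (const u of p.engagedUserIds) {
      if (friendIds.has(u)) friendEngagement++;
    }
    const authorBonus = friendIds.has(p.authorId) ? 2 : 0; // posts by friends count extra
    const freshness = 1 / (1 + p.ageHours);                // newer posts score higher
    return (friendEngagement + authorBonus) * freshness;
  };
  return [...posts].sort((a, b) => score(b) - score(a));
}
```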
00:11:25.020 | So Facebook innovated that model.
00:11:26.580 | Instagram followed up that model, but with images.
00:11:29.620 | And it was actually really successful.
00:11:31.060 | So now everyone can be involved in producing content
00:11:33.620 | and you can get a pretty well curated stream of stuff
00:11:36.740 | that's interesting to you
00:11:38.100 | without having to do too much work.
00:11:39.660 | So it lowered the bar.
00:11:42.660 | Twitter had a twist on this model.
00:11:44.740 | This is the retweet model.
00:11:47.500 | Facebook eventually copied this model as well
00:11:49.900 | where they added their share button.
00:11:51.140 | The retweet model says,
00:11:52.100 | let's make it really easy for you to share a piece of content
00:11:56.300 | with everyone you're directly connected to.
00:11:58.820 | And then those people who you're directly connected to
00:12:01.180 | who really liked the content will do the same thing.
00:12:04.460 | Now, if you model this out mathematically,
00:12:06.940 | what you see is that the most compelling content
00:12:09.940 | on the network at any one point can,
00:12:14.660 | with dramatic speed, spread to huge swaths of the network.
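Here is roughly what modeling it out looks like, as a toy branching-process sketch rather than a measurement of any real network: each exposure triggers a reshare with some probability, each reshare reaches some number of new people, and once the expected reshares per exposure crosses one, reach stops fizzling and explodes. The numbers below are invented to show that threshold effect.

```typescript
// Toy branching-process model of retweet dynamics. Each exposed user reshares
// with probability `reshareProb`, and each reshare reaches `followers` new
// people, so r = reshareProb * followers is the expected reshares per exposure.
// When r < 1 cascades fizzle; when r > 1 expected reach grows geometrically.
// All numbers are illustrative, not measurements of any real network.
function expectedReach(reshareProb: number, followers: number, generations: number): number {
  const r = reshareProb * followers;
  let exposedThisGen = 1; // start from one unit of initial exposure
  let total = 1;
  for (let g = 0; g < generations; g++) {
    exposedThisGen *= r;
    total += exposedThisGen;
  }
  return total;
}

// Same mechanism, slightly more compelling content, wildly different outcome:
console.log(expectedReach(0.004, 200, 20)); // r = 0.8 -> about 5
console.log(expectedReach(0.008, 200, 20)); // r = 1.6 -> about 32,000
```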
00:12:19.660 | So this was an even more dynamic
00:12:23.100 | and aggressive source of distributed curation.
00:12:25.500 | And it became the core of Twitter's success.
00:12:27.700 | You carefully set up who you follow.
00:12:29.700 | You do the work of propagating stuff you like
00:12:32.580 | with this low friction retweet, or in Facebook's case, share.
00:12:36.580 | And the resulting fierce viral dynamics
00:12:39.140 | will become an incredibly effective
00:12:41.740 | distributed selection mechanism
00:12:43.420 | for things that will engage people.
00:12:45.740 | And that's why Twitter became so powerful.
00:12:48.180 | You know, Facebook was interesting.
00:12:49.260 | You see what your friends are up to and sharing,
00:12:50.740 | but Twitter, man, it would come out of left field with things.
00:12:53.820 | It was almost magical in the trends it would unearth.
00:12:57.460 | And that was all distributed.
00:12:59.380 | That's not a super clever algorithm.
00:13:01.740 | That's not HAL 9000 sitting somewhere
00:13:04.300 | learning about the human psyche.
00:13:05.900 | It's a hundred million users
00:13:09.020 | making hundreds of retweet or not decisions every day.
00:13:13.460 | So that's the network model.
00:13:16.100 | Homogenize the interfaces and leverage
00:13:19.900 | these networks, these social networks,
00:13:23.020 | to help curate the content created
00:13:25.340 | within these closed garden networks.
00:13:27.540 | Again, advantages, much easier to use.
00:13:29.540 | Many more people could be involved.
00:13:31.780 | You can make a lot more money off it.
00:13:32.940 | Very easy to monetize because these networks
00:13:35.660 | work within closed gardens.
00:13:37.540 | Disadvantage, you homogenize all the aesthetics
00:13:40.100 | of the content and the curation becomes obfuscated.
00:13:43.420 | You just get this feed of stuff that's interesting
00:13:45.020 | and all looks the same.
00:13:46.220 | Now, suddenly the QAnon,
00:13:49.180 | the proverbial QAnon conspiracy theorist
00:13:51.020 | who would never be able to enter the web of trust in 2005
00:13:54.220 | can easily spread and gain traction in 2015.
00:13:59.460 | Because all content looks the same.
00:14:01.180 | Curation is happening more behind the scenes.
00:14:03.100 | It's not based off of these more natural,
00:14:05.660 | deeply human trust relationships.
00:14:07.860 | The other disadvantage, of course, is that the viral dynamics,
00:14:11.300 | especially the retweet and share dynamics,
00:14:13.660 | led to a lot of unexpected externalities:
00:14:17.900 | tribalism, outrage culture, mob swarms,
00:14:22.900 | heavy feedback influence on, for example,
00:14:25.980 | media outlets, where then you have reporters
00:14:28.260 | so fearful of the fierce pushback
00:14:33.020 | that can happen overnight because of these dynamics
00:14:35.540 | that they really start to tailor
00:14:36.700 | what they say or don't say.
00:14:37.980 | Then you get the balkanization of media coverage itself.
00:14:40.380 | And there's all of these externalities
00:14:41.700 | that no one could have guessed.
00:14:43.020 | Twitter was not Dr. No with his cat
00:14:48.020 | on his island off of Jamaica with an evil plot
00:14:52.260 | to bring down democracy.
00:14:53.380 | They just wanted people to spend time on their service.
00:14:55.780 | These were all unexpected side effects.
00:14:57.340 | All right, moving quickly now,
00:14:58.900 | model number three is the loop.
00:15:02.060 | This is personified I think best by TikTok.
00:15:05.900 | So now what we do with the loop
00:15:08.820 | is you basically take the human out of the equation
00:15:13.220 | and you use simple,
00:15:15.060 | but devastatingly effective machine learning loops
00:15:17.340 | to just select for you as an individual
00:15:19.620 | from the whole pool of potential content, what to show you.
00:15:23.260 | No shares required, no retweets required,
00:15:25.620 | no you going through and telling the network
00:15:27.780 | who your friends are, none of that's required.
00:15:30.060 | And again, as to the technicalities of this,
00:15:31.780 | what really happens with these machine learning loops
00:15:33.900 | is that all of the content is embedded
00:15:35.580 | in some sort of multidimensional statistical space.
00:15:38.260 | It then feeds you items from this space.
00:15:41.420 | It looks at how long you watch each video
00:15:44.460 | to try to assess your preference
00:15:46.940 | towards that particular region.
00:15:48.380 | This gives it some weighted cores
00:15:49.780 | in this multidimensional space,
00:15:51.060 | so it can then weight its selections of future videos
00:15:53.860 | by what's gonna be closer to one of these cores,
00:15:56.460 | blah, blah, blah, nerd, nerd, nerd, math, math, math.
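For the technically inclined, here is a hedged sketch of that loop, not TikTok's actual system, which is proprietary: each video is a point in an embedding space, the user's taste is tracked as a single running weighted centroid (the "core" above), watch time pulls the centroid toward what they lingered on, and the next video is whichever candidate sits closest. A real system would track several cores and mix in exploration; everything here is simplified for illustration.

```typescript
// Hedged sketch of the machine-learning loop described above, not any real
// recommender: videos are points in an embedding space, the user's taste is a
// running weighted centroid ("core"), watch time pulls the centroid toward
// what they lingered on, and the next video is whatever lies closest.
type Vec = number[];

interface Video { id: string; embedding: Vec; }

function addScaled(a: Vec, b: Vec, w: number): Vec {
  return a.map((x, i) => x + w * b[i]);
}

function distance(a: Vec, b: Vec): number {
  return Math.sqrt(a.reduce((s, x, i) => s + (x - b[i]) ** 2, 0));
}

class TasteModel {
  private core: Vec;
  private weight = 1e-6; // avoid division by zero before any signal arrives

  constructor(dims: number) {
    this.core = new Array(dims).fill(0);
  }

  // watchFraction in [0, 1]: how much of the video the user actually watched.
  // Updates the core as a running weighted average of watched embeddings.
  observe(video: Video, watchFraction: number): void {
    this.core = addScaled(
      this.core.map((x) => x * this.weight),
      video.embedding,
      watchFraction
    ).map((x) => x / (this.weight + watchFraction));
    this.weight += watchFraction;
  }

  // Pick the candidate closest to the user's current taste core.
  next(candidates: Video[]): Video {
    return candidates.reduce((best, v) =>
      distance(v.embedding, this.core) < distance(best.embedding, this.core) ? v : best
    );
  }
}
```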
00:16:00.540 | It works eerily well.
00:16:02.540 | You start watching videos, scrolling up and down,
00:16:05.060 | it gathers that data, do this for half a day,
00:16:07.660 | and it seems like TikTok knows you better
00:16:10.700 | than the people who are closest to you.
00:16:12.380 | So it was an incredibly effective way of doing this.
00:16:15.260 | Of course, services like YouTube do something similar,
00:16:17.860 | but YouTube is more complicated.
00:16:19.340 | It has to serve many different purposes.
00:16:21.780 | It doesn't purify this model nearly as well as TikTok,
00:16:24.900 | which is just this model purified.
00:16:26.900 | Videos, full screen, swipe when you're done,
00:16:30.400 | we'll send you the next, that's it.
00:16:32.300 | And when you purify this model,
00:16:33.740 | you saw it was probably one of the most effective
00:16:36.180 | curation methods we've ever seen.
00:16:39.220 | So again, the advantages, no social graph needed,
00:16:43.340 | anyone can compete in this space.
00:16:45.180 | You just need a reasonable pool of content
00:16:47.720 | and a machine learning loop,
00:16:50.700 | and you can be titillating people in a very effective way.
00:16:54.340 | Disadvantages, this is like the fentanyl of distraction.
00:16:57.940 | It's too purified.
00:16:59.140 | It can take over your whole life.
00:17:00.460 | It is distraction now completely purified
00:17:03.220 | of even any attempt to connect it to community relationships
00:17:07.660 | or being up on the news or exposure to interesting people.
00:17:10.720 | It is just, let's go straight to the brainstem
00:17:13.940 | and inject that chemical.
00:17:17.980 | So all humanity is now being stripped out
00:17:21.120 | of the curation loop.
00:17:22.700 | So we started with 2005, rich humanity,
00:17:26.980 | but hard to monetize, hard to use.
00:17:28.580 | 2015, now you have this sort of
00:17:32.620 | exploiting of human things like our friendship networks
00:17:37.180 | and our retweet decisions, and that produces,
00:17:40.340 | let's call it, we're gonna use a drug metaphor,
00:17:41.980 | a kind of cocaine of distraction.
00:17:44.500 | This is Twitter, this is Facebook, this is Instagram.
00:17:46.620 | And then we get to TikTok and we purify down,
00:17:50.940 | get the human out of the loop altogether,
00:17:52.300 | purify the curation down to its strongest form.
00:17:54.780 | And we are living in a tent city,
00:17:58.620 | drooling out of the side of our mouth, waiting to overdose.
00:18:02.240 | All right, so that is the history
00:18:06.860 | of distributed curation of user generated content.
00:18:10.980 | Two takeaways, once we understand this,
00:18:12.620 | a lot of the recent history of the internet economy
00:18:15.020 | makes sense.
00:18:15.860 | So here, for example, are two practical takeaways
00:18:17.820 | this framework can help you come up with.
00:18:20.460 | One, we lost something special
00:18:23.660 | when we left the link-based curation.
00:18:25.980 | Now I understand we can't go back to a world
00:18:27.780 | where the only type of user generated content
00:18:30.020 | is curated in a link-based manner.
00:18:32.740 | We can't go back to a pure 2005 blogosphere world,
00:18:35.420 | but couldn't we add this world back to what we have today?
00:18:38.940 | Isn't there a market out there for this more human web,
00:18:42.120 | a trust-based, slower, harder,
00:18:43.720 | but better quality connection, better quality information,
00:18:47.080 | really effective filtering of the weird
00:18:50.220 | and the conspiratorial and the based
00:18:52.660 | and loose foundations?
00:18:53.860 | Isn't there some sort of revivification of the blog
00:18:58.700 | that at least the sort of expert class
00:19:00.980 | or sub expert class could be participating in?
00:19:03.020 | Maybe podcasts are doing this,
00:19:04.360 | but there's not a lot of that; we don't have the same links.
00:19:06.460 | Anyways, I think that's interesting.
00:19:08.240 | Two, once you understand distributed curation,
00:19:11.120 | you see that it is difficult to fix the negative side effects
00:19:15.460 | of, in particular, the network and loop-based curation models
00:19:20.940 | through human intervention.
00:19:23.300 | We'd be mixing two different things here.
00:19:26.660 | So if you think you can go in and solve the negatives
00:19:30.220 | of, let's say, Twitter's fierce retweet-based
00:19:32.260 | viral dynamics by having humans in the loop,
00:19:36.180 | trying to kick people off of Twitter, good luck.
00:19:39.140 | These are two completely different types of dynamics
00:19:41.220 | going on, you're mixing and matching.
00:19:42.540 | Same thing with TikTok,
00:19:43.420 | this fiercely effective machine learning loop.
00:19:46.660 | What are you gonna do
00:19:47.660 | when you don't like the outcomes of that,
00:19:49.220 | have a human come in and try to intervene?
00:19:51.160 | You're mixing two different modalities, it doesn't work.
00:19:53.540 | If you want to get away from the negative side effects
00:19:56.260 | of these distributed curation models,
00:19:57.700 | you have to actually change the cultural zeitgeist
00:20:00.220 | to push people onto other sources of interaction,
00:20:02.500 | other sources of distraction, other sources of engagement.
00:20:05.620 | I don't know that you can come in and fix
00:20:08.220 | something so cybernetically effective
00:20:10.140 | as the TikTok machine learning loop or Twitter retweets
00:20:14.220 | with a board of safety.
00:20:15.460 | What we need to do is convince people
00:20:17.500 | that they shouldn't really be on Twitter that much, or at all.
00:20:20.780 | But anyways, two takeaways just to show you
00:20:22.880 | that once you have these frameworks,
00:20:25.180 | you can actually make some useful conclusions.
00:20:28.560 | (upbeat music)
00:20:31.140 | (upbeat music)