A Brief History Of How Giant Internet Companies Print Coin | Deep Questions With Cal Newport
Chapters
0:00 Cal's intro
3:55 The Old Model
8:34 Trying to make money from online content
10:05 The Network Model
14:46 Externalities
15:02 The Loop
I call this deep dive "Loops, Networks, and Links." It is something you probably never knew you should care about, and that is probably the most boring title you have ever heard. This is why most people don't think about it, but it is actually a subject that is incredibly important.
So let me start with the backdrop to this discussion: there are companies, some of the biggest corporations around, making a huge amount of money by monetizing user-generated content on the internet.
Before the web, before the consumer-facing web, there was basically no way to make a lot of money from content outside the old model: pay a small number of talented people to create content. It's highly paid, professionally generated content. So if you were trying to reach a very broad audience, there could be a lot of competition for this talent, because you wanted the best television writers. But this model didn't always require superstar economics. This is why you had conglomerates like Gannett: you pay people who are as good as anyone else.
All right, Web 1.0 comes along: '96, '97, '98. Now it is possible for individuals to create content and put it online themselves. We've got to emphasize that this was a major transformation, but it did not change the main media economics yet. Most of the leveraging of the first web revolution stuck with the existing model: a small number of highly paid writers serving lots of people. You got existing creators like Time Magazine realizing they could put their content on the web as well.
Web 2.0, which happens once we get to the new millennium, changes the equation: now you can type into a box and click Submit or click Post. There were technologies that made it possible for you to send information back to the server, and these little innovations made Web 2.0 possible. This made it possible for anyone to generate a ton of content. What turned this information resource into massive companies was the evolution of curation once Web 2.0 came along, because on its own, having a lot of people generating content does you no good. There were a few different models of curation that emerged; my title lists them in reverse chronological order. I wanna walk through these three models briefly: how they work and their advantages and disadvantages.
The first model for curating user-generated content in the Web 2.0 era was the link. When I say link, I'm talking about hyperlinks. It is a distributed curation method that is very human. If I'm gonna enter the world of blogs and websites, I start with a handful of sources where I have a pre-established trust relationship. This person has a foot in traditional media, maybe. This is the smart person that you need to read. Okay, so I enter into this web of trust relationships, and I then see who the people I already trust are linking to. If sufficient people link to a new source of information, and that website has sufficient aesthetic capital, it's not some sketchy page plastered with animated GIFs of eagles and what have you, then it earns a place in my web of trust too. Now that new site, when it's linking to something else, again, will help convey trust onto these new targets. So the web keeps expanding, excavating really interesting, quality sources of information that you might not have otherwise found.

The advantage of this model: it's very difficult to get into someone's web of trust. So weird conspiratorial work, blatant misinformation, had a very hard time spreading. It would have been difficult to see something like QAnon gain a lot of traction in that world, because you would have had to trust the somewhat eccentric QAnon conspiracy theorists directly.
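To make that mechanism concrete, here is a minimal sketch in Python of how trust might propagate through links. The site names, the link graph, and the two-endorsement threshold are all invented for illustration; this is a toy model of the idea, not a description of how any real blog network actually worked.

```python
# A minimal sketch of the link-based "web of trust" idea, assuming a toy link
# graph. The site names, the graph, and the two-endorsement threshold are all
# invented for illustration; this is not how any real blog platform worked.

def expand_web_of_trust(links, seeds, threshold=2):
    """Grow a trusted set iteratively: a site becomes trusted once at least
    `threshold` already-trusted sites link to it, and newly trusted sites
    then help convey trust to the sites they link to."""
    all_sites = set(links) | {t for targets in links.values() for t in targets}
    trusted = set(seeds)
    changed = True
    while changed:
        changed = False
        for site in all_sites - trusted:
            # Count how many already-trusted sites link to this candidate.
            endorsements = sum(1 for s in trusted if site in links.get(s, ()))
            if endorsements >= threshold:
                trusted.add(site)
                changed = True
    return trusted

# Hypothetical link graph: each site maps to the sites it links out to.
links = {
    "econ_prof_blog":  {"new_policy_blog", "stats_nerd_blog"},
    "tech_columnist":  {"new_policy_blog", "stats_nerd_blog"},
    "stats_nerd_blog": {"obscure_gem_blog"},
    "new_policy_blog": {"obscure_gem_blog"},
    "conspiracy_site": {"another_conspiracy_site"},
}
seeds = {"econ_prof_blog", "tech_columnist"}  # pre-established trust relationships

print(expand_web_of_trust(links, seeds))
# Trust spreads to new_policy_blog, stats_nerd_blog, and then obscure_gem_blog,
# but the conspiracy sites never accumulate enough trusted inbound links.
```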
What were the disadvantages? One, it was hard to monetize this type of world. The blogosphere was famously hard to monetize, both for large networks and individual content creators. Two, it took effort. If I wanna consume content, I have to do a lot of work. You actually have to spend a lot of time online. You have to build trust and expand this web of trust. If you wanted to create content, it was even harder. It was easier than trying to break in back in the world of newspapers, TV, and radio only, but it was really hard, if you were starting from scratch, to get other people to link to you. I specifically remember being very frustrated back when I started blogging: how exciting it was when I would see links from more established, trusted blogs, and how hard it was to get linked to a lot by those types of sources. So it was not very exciting for content creators.
Then came the network model. The social networks figured out: okay, if we have a social network where we make it easy for anyone to create content, the profiles and posts all look the same, so you don't have to worry about what you look like. You don't have to worry about doing the hard effort of gaining aesthetic capital to convince people you're worth reading. You just sign up for this account and click these buttons. And instead of the individual hard work of being on the internet a lot, following links and building up this web of trust, the newsfeed will fill in with what's interesting to you. The bet was: well, if we see what your self-declared friends are engaging with, we will guess you're probably interested in that too. Instagram followed up that model, but with images. So now everyone can be involved in producing content, and you can get a pretty well curated stream of stuff without doing much work.

Twitter pushed this further with the retweet, and Facebook eventually copied this model as well: let's make it really easy for you to share a piece of content with everyone connected to you. And then those people who you're directly connected to that really liked the content will do the same thing. What you see is that the most compelling content on the network at any one point can, with dramatic speed, spread to huge swaths of the network. It is a very efficient and aggressive source of distributed curation. You do the work of propagating stuff you like with this low-friction retweet or, in Facebook's case, share.
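Here is a toy simulation, in Python, of that low-friction reshare dynamic. The random follower graph, the single "appeal" number standing in for how compelling a post is, and the reshare rule are all invented for illustration; no real platform works exactly this way, but the threshold behavior it shows is the "dramatic speed" effect described above.

```python
# A toy simulation of the low-friction retweet/share dynamic, assuming a random
# follower graph and a single "appeal" number standing in for how compelling a
# post is. None of this is any platform's real algorithm.
import random

def simulate_cascade(followers, author, appeal, rng):
    """The author posts; their followers see it, and everyone who sees it
    reshares with probability `appeal`, exposing their own followers in turn."""
    seen = set(followers[author]) | {author}
    frontier = list(followers[author])
    while frontier:
        next_frontier = []
        for user in frontier:
            if rng.random() < appeal:          # this user hits retweet/share
                for f in followers[user]:
                    if f not in seen:
                        seen.add(f)
                        next_frontier.append(f)
        frontier = next_frontier
    return len(seen)

rng = random.Random(0)
n = 10_000
# Hypothetical follower graph: every user is followed by 20 random others.
followers = {u: rng.sample(range(n), 20) for u in range(n)}

for appeal in (0.02, 0.05, 0.10, 0.20):
    reach = simulate_cascade(followers, author=0, appeal=appeal, rng=rng)
    print(f"appeal={appeal:.2f} -> reached {reach} of {n} users")
# Below a reshare threshold the post fizzles out after a hop or two; above it,
# the same mechanics carry it to a huge swath of the network very quickly.
```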
The advantage: you see what your friends are up to and sharing, but Twitter, man, it would come out of left field with things. It was almost magical in the trends it would unearth, because you had millions of people making hundreds of retweet-or-not decisions every day. Disadvantage: you homogenize all the aesthetics of the content, and the curation becomes obfuscated. You just get this feed of stuff that's interesting, including from sources who would never have been able to enter the web of trust in 2005. Curation is happening more behind the scenes. The other disadvantage, of course, is the viral dynamics: the way things can blow up overnight because of these fierce dynamics.
Then you get the balkanization of media coverage itself. To be clear, it's not like someone was sitting on his island off of Jamaica with an evil plot. They just wanted people to spend time on their service.
The third model, the loop, is where you basically take the human out of the equation and rely instead on devastatingly effective machine learning loops to select, from the whole pool of potential content, what to show you. You don't have to tell it what you're interested in or who your friends are; none of that's required. What really happens with these machine learning loops is that you and the content get placed in some sort of multidimensional statistical space. The system learns which parts of that space hold your attention, so that it can then weight its selections of future videos by what's gonna be closer to one of these cores: blah, blah, blah, nerd, nerd, nerd, math, math, math. You start watching videos, scrolling up and down, and it gathers that data; do this for half a day and it has a remarkably good sense of what keeps you watching. So it was an incredibly effective way of doing this.
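As a rough sketch of what an embedding-and-feedback loop like that could look like, here is a toy version in Python. The vectors, the watch-time model, and the update rule are all made up for illustration; TikTok's actual recommender is not public, but the shape of the loop (embed the content, recommend near the learned core, update from watch time) is the mechanism described above.

```python
# A bare-bones sketch of an embedding-and-feedback loop of the kind described
# above. The vectors, the watch-time model, and the update rule are all made up
# for illustration; TikTok's actual recommender is not public.
import numpy as np

rng = np.random.default_rng(42)
DIM, N_VIDEOS = 16, 5_000

# Each video is a point in a multidimensional statistical space.
video_vecs = rng.normal(size=(N_VIDEOS, DIM))
video_vecs /= np.linalg.norm(video_vecs, axis=1, keepdims=True)

# What actually holds this viewer's attention (hidden from the app).
true_taste = rng.normal(size=DIM)
true_taste /= np.linalg.norm(true_taste)

interest_core = np.zeros(DIM)  # what the app has learned so far

def pick_video(explore=0.1):
    """Mostly exploit: favor videos close to the learned interest core,
    with a little random exploration mixed in."""
    if not interest_core.any() or rng.random() < explore:
        return int(rng.integers(N_VIDEOS))
    scores = video_vecs @ interest_core        # similarity to the learned core
    probs = np.exp(5 * scores)
    return int(rng.choice(N_VIDEOS, p=probs / probs.sum()))

for _ in range(2000):
    v = pick_video()
    # Feedback signal: watch time is higher when the video sits close to the
    # viewer's true taste (plus noise). No likes, follows, or friends needed.
    watch_time = max(0.0, float(video_vecs[v] @ true_taste) + 0.1 * rng.normal())
    # Nudge the learned core toward whatever held attention.
    interest_core = 0.99 * interest_core + 0.01 * watch_time * video_vecs[v]

alignment = interest_core @ true_taste / np.linalg.norm(interest_core)
print(f"alignment with true taste after 2000 scrolls: {alignment:.2f}")
# No profile, no friends, no follows: just watch-time feedback steadily pulling
# the recommendations toward whatever keeps this viewer watching.
```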
Of course, services like YouTube do something similar with their recommendations, but YouTube doesn't purify this model nearly as well as TikTok. When TikTok arrived, you saw it was probably one of the most effective attention-capturing machines ever built. So again, the advantages: no social graph needed, no web of trust needed, and you can be titillating people in a very effective way. Disadvantages: this is like the fentanyl of distraction. It is no longer diluted by any even attempted connection to community relationships, or being up on the news, or exposure to interesting people.
It is just: let's go straight to the brainstem. So all humanity is now being stripped out of the curation. With the network model, we're at least exploiting human things like our friendship networks; to use a drug metaphor, let's call this the less refined form of the drug. This is Twitter, this is Facebook, this is Instagram. And then we get to TikTok and we purify down, purify the curation down to its strongest form, and now we're drooling out of the side of our mouth, waiting to overdose.
So those are the three models of distributed curation of user-generated content, and once you see them, a lot of the recent history of the internet economy makes more sense. So here, for example, are two practical takeaways. One, I miss the link-based web, where the only type of user-generated content wasn't posts dropped into someone else's feed. We can't go back to a pure 2005 blogosphere world, but couldn't we add this world back to what we have today? Isn't there a market out there for this more human web: slower, maybe, but better quality connection, better quality information? Isn't there some sort of revivification of the blog and the personal website that the expert or sub-expert class could be participating in? There's some of that today, but there's not a lot of it; we don't have the same links.
Two, once you understand distributed curation, you see that it is difficult to fix the negative side effects of, in particular, the network and loop-based curation models with outside intervention. We're mixing two different things here. So if you think you can go in and solve the negative effects of these fierce viral dynamics by having humans in the loop, trying to kick people off of Twitter, good luck. These are two completely different types of dynamics. You have this fiercely effective machine learning loop, and the main remedy on offer is to have a human come in and try to intervene. You're mixing two different modalities; it doesn't work. If you want to get away from the negative side effects, you have to actually change the cultural zeitgeist to push people onto other sources of interaction, other sources of distraction, other sources of engagement, ones that aren't as aggressively tuned as the TikTok machine learning loop or Twitter retweets. You have to convince people that they shouldn't really be on Twitter that much, or not at all. Once you understand loops, networks, and links, you can actually make some useful conclusions.