Cal Newport On Why Social Media Can't Be Tamed | Deep Questions Podcast
Chapters
0:00 Cal's intro
0:56 Cal explains the 2 things about the algorithm
2:40 Cal talks about neural networks
3:30 Can we get rid of that in social media?
5:30 Explosive virality
8:16 Section 230
and he was interested in your talk about the algorithm about social media, and he'd like to kind of hear you... And he's even wondering why an algorithm is even needed.
- So if you could just dive into that a little bit.
- I mean, there's two things about the algorithm. So if we're gonna get a little bit more precise with our terminology, I'm teaching algorithms, and an algorithm technically is you have a finite sequence of well-defined steps... It's way more ambiguous what goes on with social media.
Think about it as a large bank of black boxes, and then using that information helps make decisions about what we should show this particular user. And what is actually happening in these black boxes is not just a single algorithm that you can study. These are neural networks trained through back propagation to try to essentially learn what it is that appeals to you and what doesn't. And then their feedback is fed into each other: black boxes connected together in a complex manner, different networks hooked up to each other, that in the end spits out its recommendations.
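The bank-of-black-boxes picture can be sketched in miniature. This is a toy illustration, not any platform's actual system: the sub-models, their names, and the feedback rule are all invented for the example.

```python
# A toy sketch, not any platform's real system: a few hypothetical
# "black box" sub-models each score a post, a combiner blends the
# scores, and engagement feedback nudges the blend toward whatever
# held the user's attention. All names here are made up.

def blend(scores, weights):
    """Weighted combination of the sub-model scores for one post."""
    return sum(s * w for s, w in zip(scores, weights))

def update_weights(weights, scores, engaged, lr=0.1):
    """Crude feedback step: if the user engaged, boost the sub-models
    that rated this post highly; otherwise dampen them."""
    sign = 1.0 if engaged else -1.0
    return [max(0.0, w + sign * lr * s) for w, s in zip(weights, scores)]

# Two sub-models scoring one post, e.g. "novelty" and "outrage".
weights = [0.5, 0.5]
post_scores = [0.2, 0.9]   # this post rates high on the second signal
weights = update_weights(weights, post_scores, engaged=True)
# After positive engagement, the second sub-model gains influence.
```

The point of the sketch is only the loop: no single component is "the algorithm"; the behavior emerges from how feedback reshapes the combination.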
I mean, if you wanna really get into neural networks, what they're really doing is dividing the space in which the inputs exist into these different regions. And when you get something like a deep learning neural net, you have many layers of these, and then they're communicating to each other, and that shifts where the straight edges are and how many there are.
But that's what I wanna emphasize: it's complicated. So it's not an algorithm we can tweak easily; it's not an algorithm we can understand easily.
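The "regions with straight edges" point is a real property of ReLU networks: they compute piecewise-linear functions, and each hidden unit contributes one boundary. A tiny hand-weighted example (the weights are made up purely to show the kinks):

```python
# A minimal illustration of the "straight edges" idea: a one-hidden-layer
# ReLU network computes a piecewise-linear function, so each hidden unit
# contributes one straight boundary (in 1-D, a kink) that carves up the
# input space. All weights here are hand-picked for the demo.

def relu(x):
    return max(0.0, x)

def tiny_net(x, w1, b1, w2):
    """One hidden ReLU layer applied to a single number x."""
    hidden = [relu(w * x + b) for w, b in zip(w1, b1)]
    return sum(h * v for h, v in zip(hidden, w2))

# Two hidden units put kinks at x = 1 and x = 2; between and beyond the
# kinks the function is an ordinary straight line.
w1, b1, w2 = [1.0, 1.0], [-1.0, -2.0], [1.0, -2.0]
# Piecewise: 0 for x <= 1, rising on [1, 2], falling past x = 2.
```

With two units you can reason about both kinks by hand; a production-scale network has millions of units interacting, which is why nobody can eyeball where its boundaries sit.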
This was basically all social media before 2009. All it did was, sort of: all the people you follow, here are their tweets, sort them in chronological order.
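That pre-algorithm feed is simple enough to write down directly. A minimal sketch, with invented field names:

```python
# The pre-2009 model Cal describes really is just a sort: gather the
# posts of the accounts you follow and order them newest-first. The
# field names here are made up for illustration.

posts = [
    {"author": "alice", "text": "hi",    "ts": 300},
    {"author": "bob",   "text": "lunch", "ts": 100},
    {"author": "carol", "text": "news",  "ts": 200},
]

def chronological_feed(posts):
    """No ranking model, no engagement signal: timestamp order only."""
    return sorted(posts, key=lambda p: p["ts"], reverse=True)

feed = chronological_feed(posts)   # alice, then carol, then bob
```

The contrast with the black-box ranking described earlier is the whole point: this version is one line of logic anyone can inspect.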
See, people don't even talk about this on Facebook anymore. It's just become this newsfeed distraction machine, where you get these complex neural-network-based algorithms that say, well, I'm not just gonna show you things...
And that's where we get this complicated play. Like, many of the issues of this were unexpected. Like, all you're trying to do with these networks is find what's engaging, in the sense of "I wanna spend more time looking at this."
I think one of the big ones is what we talked about when we covered Jonathan Haidt's Atlantic article. And it introduced intense virality, right? So if something's getting engagement, it's gonna start showing it to more people, which gives it a chance to be even more popular.
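The popularity loop just described (popular posts get shown more, which makes them more popular) can be simulated in a few lines. The constants are arbitrary; only the compounding, rich-get-richer shape is the point:

```python
# A hedged sketch of the feedback loop: when popularity drives exposure
# and exposure drives popularity, a small difference in how provocative
# a post is compounds round after round. Numbers are arbitrary.

def simulate_shares(rounds, appeal, seed_shares=1.0):
    """Each round, exposure is proportional to current shares, and a
    fixed fraction (appeal) of the exposed audience reshares."""
    shares = seed_shares
    for _ in range(rounds):
        exposure = shares * 10        # popular posts get shown more
        shares += exposure * appeal   # which makes them more popular
    return shares

modest = simulate_shares(5, appeal=0.05)   # mildly engaging post
viral = simulate_shares(5, appeal=0.2)     # provocative post
# Four times the appeal compounds into a ~30x gap after five rounds.
```

This multiplicative compounding, rather than any single show-it-to-more-people decision, is what makes the outcome feel explosive.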
That led to an environment that was high-stakes and terrifying. And that completely changed how you used social media. It had to do with, I talked about this in Digital Minimalism, worrying about what reaction you were gonna get to your pieces.
And that also touched on just social psychology: the intermittent reinforcement of sometimes people responding and sometimes not. That became a real addictive factor in getting people back.
So it's possible, but there's no reason why companies would, because, Carl, it makes them way more profitable... because they are perfectly optimized to get you to do that.
Like, most of what you saw was not interesting, 'cause most of what most people you follow post is not that interesting. And I believe the timeline was Facebook made that move. So when they saw, when Facebook saw Twitter move away from that, Facebook realized they had to do something similar.
So now I don't know if that genie is gonna go back in the bottle. Because, I mean, it would be like going to an oil company and saying, "...out of the ground per day, but could you stop using them?" And the response would be, "This is our main technology for getting oil out of the ground. We don't wanna artificially go back to a time..."
You know, I got another email about that segment, reacting to Obama's call to regulate social media. So there's this rule on the books called Section 230, part of this larger legislation, that has a big impact: it means the platforms aren't liable for what other users are posting on the platforms.
And it was invented, we talked about this before, around the idea of "...but I'm not the editor selecting information." And the idea was these large social media platforms... And so there's one of the pushes for regulation: make them liable, just like a newspaper is liable for what they print... just because anything that would lead social media...
Long story short, Jesse, he was basically saying... and you do so because there's an impact you want... He said, "Actually, they would probably be okay. They'd find ways to try to walk around and sidestep these liabilities. It really could be a problem for smaller companies."
So look, I'm not a lawyer, but it's an interesting note. Somebody asked about that; that's the only reason...