
Cal Newport On Why Social Media Can’t Be Tamed | Deep Questions Podcast


Chapters

0:00 Cal's intro
0:56 Cal explains the 2 things about the algorithm
2:40 Cal talks about neural networks
3:30 Can we get rid of that in social media?
5:30 Explosive virality
8:16 Section 230

Whisper Transcript

00:00:00.000 | All right, Jesse, enough about that.
00:00:03.000 | - Actually, we did get an email from Carl.
00:00:07.280 | He's a big fan.
00:00:08.300 | - Was it about the air conditioner?
00:00:09.840 | - It wasn't, but it should be.
00:00:14.320 | - Well, then I'm not interested.
00:00:15.880 | - But he actually, he's been emailing me
00:00:19.800 | and he was interested in your talk about the algorithm
00:00:23.880 | when you were talking about the Obama call
00:00:26.520 | for more regulation for--
00:00:28.080 | - During one of our Cal Reacts to the News segments.
00:00:30.840 | - Yeah, a couple of weeks ago.
00:00:32.440 | So he thought it was enlightening
00:00:36.960 | and he's like talking about the algorithm
00:00:39.080 | and how it's like one of the worst things
00:00:40.360 | about social media and he'd like to kind of hear you
00:00:42.520 | talk about that more.
00:00:44.880 | And he's even wondering why an algorithm is even needed.
00:00:48.560 | - Yes.
00:00:49.400 | - So if you could just dive into that a little bit.
00:00:51.000 | - Well, that's interesting.
00:00:51.840 | I mean, there's two things about the algorithm.
00:00:53.840 | One is it's not an algorithm, right?
00:00:56.600 | So if we're gonna get a little bit more precise
00:00:58.960 | with our terminology, I'm teaching algorithms,
00:01:03.000 | for example, this semester.
00:01:04.440 | And an algorithm, technically, is a finite sequence
00:01:08.680 | of unambiguous steps that are executed
00:01:11.520 | in a clear control order.
00:01:13.800 | It's way more ambiguous what goes on with social media.
00:01:17.480 | So instead of really an algorithm,
00:01:19.040 | think about it as a large bank of black boxes
00:01:24.800 | that takes in lots of information
00:01:27.360 | and then using that information helps make decisions
00:01:30.280 | about what should we show this particular user.
00:01:33.480 | It's actually assigning weights
00:01:34.600 | to different possible things to figure out
00:01:36.160 | which tweet to show this user,
00:01:37.880 | which Facebook newsfeed posts to prioritize.
00:01:41.080 | And what is actually happening in these black boxes
00:01:42.960 | is not just a single algorithm that you can study.
00:01:45.160 | It's really for the most part,
00:01:46.320 | a collection of mainly neural nets.
00:01:48.080 | So you have neural networks that are trained
00:01:50.120 | through backpropagation to try to essentially learn
00:01:55.040 | what it is that appeals to you and what doesn't,
00:01:58.560 | but it's not just one neural net.
00:02:00.160 | There's multiple different neural nets
00:02:01.600 | that do different things.
00:02:02.560 | And then their feedback is fed into each other
00:02:04.800 | in somewhat complicated ways.
00:02:06.480 | And so it's a real, almost intractable mess
00:02:10.960 | of black boxes connected together in a complex manner.
00:02:13.720 | So it's this real complicated mess
00:02:16.880 | of different networks hooked up to each other
00:02:18.760 | that in the end spits out its recommendations
00:02:23.200 | in terms of weights.
00:02:24.040 | So show this tweet, not that.
00:02:25.440 | So it's completely obfuscated.
00:02:28.800 | It's very difficult to try to find out
00:02:30.760 | what is going on inside those networks.
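To make the "bank of black boxes" picture concrete, here is a minimal Python sketch of a feed ranker that blends the outputs of several stand-in models into a single weight per post. The Post fields, the blend coefficients, and the follow boost are all invented for illustration; none of this reflects any platform's actual code.

# Hypothetical sketch of a "bank of black boxes" feed ranker.
# Each predicted_* field stands in for the output of a separately
# trained neural net; the numbers and the blend are made up.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author_followed: bool
    predicted_click: float   # output of one black-box model
    predicted_reply: float   # output of another
    predicted_dwell: float   # output of a third

def combined_weight(post: Post) -> float:
    # Blend several model outputs into one ranking weight; in practice
    # the blend itself would be learned and constantly retuned.
    score = (0.5 * post.predicted_click
             + 0.3 * post.predicted_reply
             + 0.2 * post.predicted_dwell)
    if post.author_followed:
        score *= 1.1  # small boost for accounts you follow
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    # Show the highest-weighted posts first, not the newest ones.
    return sorted(posts, key=combined_weight, reverse=True)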
00:02:35.000 | I mean, if you wanna really get into neural networks,
00:02:37.000 | I mean, what they're really doing is they're really,
00:02:38.960 | as we've talked about before,
00:02:39.880 | but they're essentially building spaces,
00:02:42.880 | like regions in multidimensional space
00:02:47.800 | that they can associate.
00:02:49.440 | So this is what a standard,
00:02:50.800 | I don't wanna get too far down this,
00:02:51.880 | but a standard, not-many-layer neural net
00:02:53.760 | is basically just dividing up the space
00:02:56.440 | in which the inputs exist into these different regions
00:02:58.640 | so it can categorize things.
00:02:59.840 | And when you get something like a deep learning neural net,
00:03:03.160 | you have multiple layers
00:03:04.600 | that are each doing something like that,
00:03:06.160 | and then they're communicating to each other.
00:03:07.840 | So one layer might look at a picture
00:03:09.480 | and just be really good at figuring out
00:03:11.320 | where the straight edges are and how many there are.
00:03:14.480 | And then another layer takes that input
00:03:16.360 | and is really good at figuring out,
00:03:17.480 | well, if there's this many straight edges,
00:03:18.720 | then it's probably a crosswalk.
00:03:20.920 | And so it all gets kind of complicated,
00:03:22.440 | but that's what I wanna emphasize: it's complicated.
00:03:24.440 | So it's not an algorithm we can tweak easily,
00:03:26.480 | it's not an algorithm we can understand easily.
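As a rough illustration of that layered idea, here is a tiny two-layer network in NumPy: the first layer carves the input space into half-plane regions (crude "edge detectors"), and the second layer combines those region activations into a single judgment. The weights below are random placeholders; a real network would learn them through backpropagation.

# Toy two-layer network showing layers building on layers.
import numpy as np

rng = np.random.default_rng(0)

# Random placeholder weights; a real net would learn these.
W1 = rng.normal(size=(4, 2))   # layer 1: 4 hidden units over 2 input features
b1 = rng.normal(size=4)
W2 = rng.normal(size=(1, 4))   # layer 2: 1 output unit reading the 4 regions
b2 = rng.normal(size=1)

def relu(x):
    return np.maximum(0.0, x)

def forward(x):
    # x is a 2-d input point; returns a score between 0 and 1.
    h = relu(W1 @ x + b1)                 # which regions is x in?
    logit = W2 @ h + b2                   # combine the region activations
    return 1.0 / (1.0 + np.exp(-logit))   # squash to a probability

print(forward(np.array([0.5, -1.2])))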
00:03:29.160 | Can we get rid of that in social media?
00:03:31.160 | I mean, we can imagine that being true.
00:03:35.200 | We can imagine that being true
00:03:37.080 | because we used to have social media
00:03:38.840 | without these quote unquote algorithms.
00:03:40.800 | This was basically all social media before 2009.
00:03:45.160 | So if you use Twitter in 2007,
00:03:49.600 | there's no complicated algorithm involved
00:03:51.720 | in showing what you see.
00:03:53.320 | All it did was take all the people you follow,
00:03:58.160 | here are their tweets, sort them in chronological order,
00:04:01.320 | put the newest one at the top.
00:04:03.120 | Facebook was doing the same thing.
00:04:05.080 | You have different people that you,
00:04:06.880 | I don't know what their terminology was,
00:04:08.320 | you like, you friend.
00:04:10.680 | See, people don't even talk about this on Facebook anymore.
00:04:12.560 | It's just become this newsfeed distraction machine,
00:04:14.440 | but you would friend people
00:04:16.040 | and would look at what they were posting
00:04:17.640 | and put what they were posting
00:04:20.000 | in chronological order.
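A minimal sketch of that pre-2009-style feed, assuming nothing more than a list of posts with timestamps: the newest post goes on top, and nothing about the content influences the ordering.

# Pre-2009-style feed: no learned weights, just reverse chronological order.
from datetime import datetime, timezone

posts = [
    {"author": "alice", "text": "coffee", "time": datetime(2007, 6, 1, 9, 0, tzinfo=timezone.utc)},
    {"author": "bob", "text": "lunch", "time": datetime(2007, 6, 1, 12, 30, tzinfo=timezone.utc)},
]

def chronological_feed(posts):
    # Newest first; content plays no role in the ordering.
    return sorted(posts, key=lambda p: p["time"], reverse=True)

for p in chronological_feed(posts):
    print(p["time"], p["author"], p["text"])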
00:04:21.960 | And then at some point after that,
00:04:24.400 | as we've talked about before,
00:04:25.280 | you get these complex neural network based algorithms
00:04:27.840 | that say, well, I'm not just gonna show you things
00:04:29.200 | in chronological order,
00:04:31.200 | I'm gonna prioritize things
00:04:32.280 | that are gonna increase engagement,
00:04:33.720 | which means time on service.
00:04:35.160 | And that's where we get this complicated play
00:04:39.760 | where we no longer understand
00:04:40.920 | how it shows us what it shows us.
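By contrast, an engagement-ranked feed of the kind described here orders posts by a predicted engagement score instead of by timestamp. In this hypothetical sketch, predicted_engagement stands in for the output of the neural-net ensemble discussed earlier.

# Engagement-ranked feed: recency no longer decides the order,
# a model's engagement prediction does.
def engagement_ranked_feed(posts):
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

posts = [
    {"author": "alice", "text": "quiet update", "predicted_engagement": 0.12},
    {"author": "bob", "text": "outrage bait", "predicted_engagement": 0.87},
]

for p in engagement_ranked_feed(posts):
    print(p["author"], p["text"])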
00:04:42.600 | Many of the impacts that the switch
00:04:44.520 | to this quote unquote algorithmic sorting
00:04:47.200 | of social media content had,
00:04:48.800 | like many of the issues with this were unexpected.
00:04:51.880 | There are side effects.
00:04:53.440 | Like all you're trying to do with these networks,
00:04:56.160 | these neural nets all connected together
00:04:57.720 | is just to make the user happier
00:04:59.680 | in the sense of I wanna spend more time looking at this.
00:05:02.280 | But it had all sorts of side effects.
00:05:03.640 | I think one of the big ones is what we talked about
00:05:07.120 | when we covered Jonathan Haidt's Atlantic article
00:05:09.880 | from a few weeks ago.
00:05:11.960 | And that was that it introduced intense virality, right?
00:05:16.520 | Because now I could post something,
00:05:19.960 | other people could start spreading it,
00:05:21.280 | the algorithm will see that it's popular.
00:05:23.800 | So it's gonna start showing it to more people,
00:05:25.240 | which gives it a chance to be even more popular.
00:05:26.720 | And you can have explosive virality.
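That feedback loop can be sketched with a toy simulation: engagement earns a post more exposure in the next round, which earns it more engagement. All the numbers are invented; the point is that once amplification times engagement rate passes 1, the growth is explosive rather than gradual.

# Toy simulation of the popularity feedback loop described above.
def simulate_virality(initial_views=100, engagement_rate=0.05,
                      amplification=4.0, rounds=8):
    views = initial_views
    history = []
    for _ in range(rounds):
        engagements = views * engagement_rate
        # The ranking system reads engagement as a signal to show the
        # post to a wider audience in the next round.
        views = engagements * amplification
        history.append(int(views))
    return history

# Below the 1.0 threshold the post fizzles out; above it, it explodes.
print(simulate_virality(engagement_rate=0.3))   # 0.3 * 4.0 = 1.2 > 1: explodes
print(simulate_virality(engagement_rate=0.05))  # 0.05 * 4.0 = 0.2 < 1: dies out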
00:05:28.640 | And Haidt's point was explosive virality
00:05:32.360 | led to an environment that was high stakes and terrifying.
00:05:37.360 | You could be a hero or canceled in 12 hours,
00:05:43.280 | just like boom, it could just happen.
00:05:44.840 | And that completely changed who used social media,
00:05:47.040 | how they used it.
00:05:47.920 | Another issue with these types of algorithms
00:05:50.480 | had to do with, I talked about this in Digital Minimalism,
00:05:54.280 | but it made it way less predictable
00:05:56.680 | what reaction you were gonna get to your pieces.
00:05:59.160 | And that also touched on just social psychology,
00:06:02.480 | the intermittent reinforcement of sometimes people
00:06:04.400 | like what I do and sometimes people don't,
00:06:06.680 | that became a real addictive factor in getting people back,
00:06:09.320 | especially for Instagram and Facebook
00:06:12.400 | in the earlier days of these algorithms:
00:06:14.080 | now that people could like
00:06:15.520 | and things could be shared,
00:06:17.000 | you had this much more unpredictable
00:06:19.920 | and unstable environment of feedback.
00:06:22.080 | And that was very addictive.
00:06:23.240 | Like, are people gonna like this?
00:06:24.400 | Are people not gonna like this?
00:06:26.680 | So it's possible, but there's no reason why companies would
00:06:31.240 | because, Carl, it makes them way more profitable.
00:06:35.360 | It's fantastically more profitable.
00:06:38.760 | People use these services all the time
00:06:40.720 | because they are perfectly optimized to get you to do that.
00:06:44.400 | If you went back to 2007 Facebook,
00:06:49.120 | it is way more boring than 2022 Facebook.
00:06:52.520 | If you go back to Twitter, early Twitter,
00:06:55.600 | before they began building
00:06:57.600 | these algorithmically juiced timelines,
00:07:00.440 | it was a way more boring place.
00:07:01.920 | Like most of what you saw was not interesting
00:07:04.840 | 'cause most of what most people you follow post
00:07:07.160 | is not interesting.
00:07:09.480 | So it was good for us as a culture
00:07:12.320 | because it was a diversion
00:07:13.240 | we didn't wanna spend too much time on,
00:07:14.320 | but it's a completely different beast.
00:07:16.480 | And I believe the timeline was,
00:07:20.480 | Twitter made that move first
00:07:22.520 | and then Facebook and Instagram followed.
00:07:25.400 | So when they saw, when Facebook saw Twitter move away
00:07:31.480 | from a strict chronological timeline,
00:07:33.760 | Facebook realized they had to do something similar.
00:07:35.840 | So now I don't know if that genie is gonna go back
00:07:37.440 | in the bottle for those companies
00:07:38.440 | because I mean, it would be like going to an oil company,
00:07:41.440 | like going to Exxon and saying,
00:07:43.880 | "I know you've invented these technologies
00:07:45.720 | which allows you to get 10X more oil
00:07:47.360 | out of the ground per day, but could you stop using them?"
00:07:52.240 | And they're probably gonna say no.
00:07:54.520 | They're probably gonna say, "No,
00:07:56.160 | this is our main technology for getting oil out of the ground.
00:07:58.240 | We've gotten really good at it.
00:08:00.000 | We don't wanna artificially go back to a time
00:08:01.640 | when we were worse at it."
00:08:03.600 | Yeah, so it's interesting.
00:08:04.520 | You know, I got another email about that segment.
00:08:07.120 | So that particular article,
00:08:09.120 | reacting to Obama's call to regulate social media.
00:08:13.040 | And I mentioned section 230.
00:08:15.800 | So there's this rule on the books called section 230
00:08:19.600 | of this larger legislation that has a big impact
00:08:23.880 | in that it protects, as we talked about,
00:08:25.920 | it protects platforms from liability
00:08:30.480 | for what other users are posting on the platforms.
00:08:33.400 | And it was invented, we talked about this before,
00:08:35.720 | the intent was thinking about comments.
00:08:38.080 | You know, I have a blog, it's 2008,
00:08:44.240 | it has a comment section.
00:08:45.640 | And maybe someone's gonna come along
00:08:46.800 | and leave a comment on that section
00:08:48.760 | that, you know, violates some laws
00:08:51.760 | like revealing insider trading
00:08:54.400 | or conducting libel against someone
00:08:56.760 | or something like this, right?
00:08:57.720 | And section 230, very crudely speaking,
00:08:59.560 | would say, "I'm not responsible for that.
00:09:02.200 | I just had a comment board,
00:09:03.240 | but I'm not the editor selecting information."
00:09:06.000 | And the idea was these large social media platforms
00:09:07.960 | were using that coverage to say,
00:09:09.160 | "Look, we're not an editor, right?
00:09:12.440 | So we can't be held liable
00:09:13.480 | for what people actually say on it."
00:09:14.880 | And so one of the pushes for regulation
00:09:16.680 | is to get rid of that protection
00:09:18.240 | and say, "No, you are liable
00:09:19.520 | for what's posted on your platform,
00:09:20.760 | just like a newspaper is liable for what they print
00:09:23.400 | and that this might lead to more aggressive
00:09:27.560 | or more effective content moderation."
00:09:30.160 | I said, "I might be interested in it
00:09:32.080 | just because anything that would lead social media
00:09:35.000 | to fragment and have to become more niche,
00:09:38.400 | I thought would be good."
00:09:39.680 | Anyways, I got an email from a lawyer
00:09:41.240 | who used to specialize in section 230.
00:09:43.560 | Long story short, Jesse, he was basically saying,
00:09:46.800 | "It's a tricky path to go down."
00:09:49.760 | He's like, "If you pull up 230
00:09:53.840 | and you do so because there's an impact you want
00:09:57.400 | on these very large social media companies,"
00:09:59.000 | he said, "Actually, they would probably be okay.
00:10:01.400 | They can afford the legal expertise
00:10:04.760 | to try to walk around and sidestep these liabilities
00:10:08.320 | and that the smaller players are gonna get hurt.
00:10:10.800 | It's gonna be calnewport.com.
00:10:13.360 | It really could be a problem for smaller companies."
00:10:16.360 | So look, I'm not a lawyer, but it's an interesting note.
00:10:19.560 | And he's like, "It's just complicated."
00:10:21.640 | So there you go.
00:10:22.720 | Good question though, Carl.
00:10:26.120 | Yeah, I like people emailing.
00:10:27.400 | Jesse is jesse@calnewport.com.
00:10:29.560 | He loves to hear about the show,
00:10:31.360 | what you like, what you don't like.
00:10:33.160 | J-E-S-S-E.
00:10:35.640 | J-E-S-S-E@calnewport.com.
00:10:39.120 | Somebody asked about that, that's the only reason
00:10:40.560 | why I threw that in there.
00:10:41.920 | Yeah, yeah.
00:10:43.960 | It's not G-E-S-S-E-Y.
00:10:47.200 | (both laughing)
00:10:48.040 | It's all right.
00:10:48.880 | There's a lot of Jesses spelled with I-E.
00:10:50.520 | Hmm, fair enough, fair enough.
00:10:52.960 | (upbeat music)
00:10:55.540 | (upbeat music)