
Why Cal Newport Hopes Elon Musk Ruins Twitter | Deep Questions with Cal Newport


Chapters

0:00 Cal's intro
1:06 Cal talks about "Elon Musk Details Twitter Takeover Plan"
6:28 Former Reddit CEO's Anti-Twitter Rant
19:04 Obama Calls for More Social Regulation

Whisper Transcript

00:00:00.000 | All right, well, let us do our first segment.
00:00:05.000 | I have been enjoying doing some
00:00:08.920 | of these news reaction segments.
00:00:11.200 | What I'm really looking for, of course,
00:00:13.280 | is not to just give my opinion on everything, who cares,
00:00:15.440 | but to look at segments of the news that overlap things
00:00:17.920 | that we talk about here on the show
00:00:19.840 | and give me a chance to actually bounce off them
00:00:23.000 | and elaborate some of the theories I've been developing
00:00:26.400 | on the show, some of the theories that I talk
00:00:28.040 | about commonly on the show.
00:00:29.660 | So it's news that's relevant to the show.
00:00:31.560 | And unlike prior Cal Reacts to the News segments,
00:00:34.840 | I actually wanna do several different pieces here.
00:00:38.420 | So I'm gonna start with an article returning
00:00:42.480 | to what we've been discussing,
00:00:43.620 | returning to Elon Musk and his potential takeover
00:00:48.620 | of Twitter.
00:00:50.180 | There's an article I read this morning.
00:00:52.960 | It came out the morning I'm recording this,
00:00:55.120 | from the New York Times.
00:00:57.760 | It was titled "Elon Musk Details His Plan to Pay
00:01:01.800 | for a $46.5 Billion Takeover of Twitter."
00:01:06.640 | And this article starts by noting,
00:01:08.980 | "Elon Musk said on Thursday that he had commitments
00:01:12.260 | worth $46.5 billion to finance his proposed bid for Twitter
00:01:17.260 | and was exploring whether to launch a hostile takeover
00:01:20.520 | for the social media company."
00:01:23.140 | Jesse, I think he's beating out the $5,000 bid
00:01:26.300 | that we put in.
00:01:27.200 | I think our vision for Cal Newport Twitter,
00:01:30.060 | they haven't said no yet,
00:01:32.980 | but it looks like Elon has the better bid
00:01:35.140 | he's putting together here.
00:01:36.660 | The article goes on to say,
00:01:37.540 | "The financial commitments gathered a week
00:01:39.620 | after Mr. Musk made an unsolicited offer for Twitter
00:01:42.660 | put pressure on the social media company's board
00:01:44.580 | to take his advances seriously."
00:01:47.580 | It's serious.
00:01:49.380 | Steven Davidoff Solomon, a professor at the School of Law
00:01:51.940 | at the University of California at Berkeley,
00:01:53.340 | said of the new filing,
00:01:54.580 | "He's getting more professional
00:01:56.140 | and this is starting to look like a normal hostile bid.
00:01:58.780 | You do not do that unless you are going to launch an offer."
00:02:02.980 | So I was actually,
00:02:06.280 | I think I was leaning towards this probably not being for real
00:02:11.000 | until I read this article.
00:02:12.880 | I thought he might've just been messing with people.
00:02:15.000 | This article is seeming to imply
00:02:17.880 | that he is actually serious.
00:02:19.320 | Did you take him seriously, Jesse?
00:02:20.840 | You were never quite sure
00:02:22.160 | if he was actually gonna do this.
00:02:23.760 | - I thought he was serious.
00:02:26.420 | - Okay, so you more understood it.
00:02:28.380 | Now, the breaking news that almost happened this morning
00:02:31.580 | is he tweeted today, the day we're recording this,
00:02:35.140 | what were his exact words here?
00:02:36.620 | It was "moving on, dot, dot, dot," right?
00:02:41.300 | Now, this is big news in part
00:02:43.140 | because I never see Twitter breaking news
00:02:45.040 | because I don't use Twitter,
00:02:46.900 | but I had to go on the Twitter to get this tweet thread
00:02:50.380 | I'm gonna talk about for the next story
00:02:51.740 | that a user sent me, and it popped up.
00:02:54.680 | So I was like, maybe that was Musk saying
00:02:56.020 | he was moving on from this,
00:02:57.020 | but then he clarified, it was breaking news.
00:02:58.780 | It was 14 minutes before I came over here.
00:03:00.900 | He clarified that when he said moving on,
00:03:03.360 | it wasn't about his Twitter takeover bid.
00:03:05.520 | It was him moving on from making fun of Bill Gates
00:03:09.860 | because Bill Gates had shorted Tesla stock
00:03:13.840 | at a time when he was talking a lot about climate change.
00:03:15.720 | So I guess Elon Musk has been dunking on Gates recently
00:03:19.260 | for that.
00:03:20.100 | Captain Climate Change is shorting Tesla stock.
00:03:22.700 | And so he was moving on from that.
00:03:24.420 | Musk goes on to say on Twitter,
00:03:27.980 | "If our Twitter bid succeeds,
00:03:29.180 | we will defeat the spam bots or die trying."
00:03:33.380 | So that's interesting because that's a new spin.
00:03:36.540 | I think a lot of the coverage of Musk's plans for Twitter
00:03:40.580 | had to do with content moderation.
00:03:42.100 | We'll get into that more here in a second.
00:03:44.180 | And here he is emphasizing another type of improvement
00:03:47.600 | he would look to make in this case:
00:03:48.820 | get rid of spam bots.
00:03:50.740 | That's interesting that he is making that pivot.
00:03:54.060 | The final thing is I would point out
00:03:56.720 | that Morgan Stanley is a big part of the money
00:04:00.940 | he is raising for this takeover bid.
00:04:03.820 | And they quote a lecturer from Cornell University saying,
00:04:07.940 | "There are lots of very senior people at Morgan Stanley
00:04:10.060 | that are responsible for that brand.
00:04:12.060 | That in my view would not allow this to happen
00:04:13.940 | unless there was some level of seriousness behind it."
00:04:16.060 | Okay, so all of the coverage seems to be saying
00:04:21.060 | this is probably serious.
00:04:23.060 | It's probably not Musk messing with us.
00:04:26.740 | Here's the two thoughts I have about that.
00:04:28.740 | One, it really still could be Musk messing with everyone.
00:04:31.700 | I think he gets enjoyment out of it.
00:04:33.980 | I think he's very persuasive.
00:04:35.540 | I think he could persuade Morgan Stanley to come on board
00:04:38.740 | even if he never actually planned to make the bid.
00:04:41.520 | So I'm still not sure about that.
00:04:44.140 | The other thing I was thinking about this morning
00:04:45.980 | is that there could be a silver lining
00:04:49.460 | to this Musk takeover that isn't really being reported,
00:04:54.260 | but I think it could be good.
00:04:56.380 | And that's the fact that the media for the most part
00:05:00.140 | doesn't like Elon Musk for a lot of complicated reasons.
00:05:04.180 | But of course, the fact that he messes with them this way
00:05:06.220 | is probably one of them.
00:05:07.720 | So if he did take over Twitter,
00:05:11.300 | the media might stop using Twitter and focusing on Twitter
00:05:16.300 | and allowing Twitter to influence them as much.
00:05:20.220 | Because on principle, they're like,
00:05:21.340 | "I don't like Elon Musk.
00:05:22.820 | "Twitter is now his.
00:05:24.100 | "I don't wanna use Twitter as much anymore
00:05:25.880 | "as a TV reporter or a newspaper reporter.
00:05:28.480 | "I don't wanna be a part of something that he owns."
00:05:31.820 | And this would be good for the Republic.
00:05:35.780 | I think if reporters and journalists in general
00:05:39.100 | spent less time on Twitter and being influenced by Twitter,
00:05:41.860 | it's probably better for everybody.
00:05:43.040 | So there's a silver lining here.
00:05:44.740 | If Musk continues acting sort of erratically
00:05:47.600 | and the media continues to dislike him
00:05:49.420 | and he takes over Twitter,
00:05:50.800 | maybe it will actually ironically and paradoxically
00:05:54.740 | reduce the impact of Twitter on our culture.
00:05:57.720 | And I think that would be a good thing.
00:06:00.880 | Basically anything that hurts Twitter, I'm kind of a fan of.
00:06:05.540 | All right, so then I had a second article here
00:06:08.740 | that elaborates on what's going on with Musk and Twitter.
00:06:12.660 | And maybe it's not fair to call it an article.
00:06:14.420 | It's a Twitter thread.
00:06:15.720 | So there's all sorts of recursive ironies abounding here.
00:06:19.160 | It's from the former CEO of Reddit, Yishan Wong,
00:06:24.760 | and hat tip to listener Andy, who emailed this to me.
00:06:29.420 | In general, by the way, if you have tips or articles
00:06:31.780 | you think I would like,
00:06:33.140 | my longtime address for that is interesting@calnewport.com.
00:06:37.940 | That's where Andy sent me this Twitter thread.
00:06:40.580 | And I thought it was quite interesting
00:06:41.900 | as this has gotten quite a lot of play.
00:06:44.340 | Yishan starts in this thread by saying,
00:06:47.820 | "I've now been asked multiple times
00:06:49.180 | "for my take on Elon's offer for Twitter.
00:06:52.040 | "So fine, this is what I think about that."
00:06:54.460 | All right, so here's his main point.
00:06:57.820 | If Elon takes over Twitter, he's in for a world of pain.
00:07:01.660 | He has no idea.
00:07:04.640 | I'm gonna summarize.
00:07:05.560 | This is a long thread.
00:07:07.540 | But Yixing says, "There is this old culture of the internet,
00:07:11.700 | "roughly Web 1.0 and early Web 2.0,
00:07:15.340 | "that had a very strong free speech culture.
00:07:18.640 | "This free speech idea arose out of a culture
00:07:20.780 | "of late '90s America, where the main people
00:07:22.680 | "who were interested in censorship
00:07:23.900 | "were religious conservatives.
00:07:25.500 | "In practical terms, this meant that they would try
00:07:27.200 | "to ban porn or other imagined moral degeneracy
00:07:29.960 | "on the internet.
00:07:30.860 | "Many of the older tech leaders today,"
00:07:34.900 | and he points to Elon Musk or Marc Andreessen,
00:07:38.040 | "grew up with that internet.
00:07:39.340 | "To them, the internet represented freedom,
00:07:40.920 | "a new frontier, a flowering of the human spirit,
00:07:42.700 | "and a great optimism that technology
00:07:44.240 | "could birth a new golden age of mankind."
00:07:47.820 | Skipping ahead here a little bit.
00:07:49.480 | "Reddit," which he ran, "was born in the last years
00:07:53.880 | "of this old internet, when free speech
00:07:55.400 | "meant freedom from religious conservatives
00:07:56.920 | "trying to take down porn
00:07:57.900 | "and sometimes first-person shooters.
00:07:59.540 | "And so we tried to preserve that ideal,
00:08:01.120 | "but this is not what free speech is about today."
00:08:04.860 | He then goes on to argue, "The internet is not a frontier
00:08:07.520 | "where people can go to be free.
00:08:08.860 | "It's where the entire world is now,
00:08:10.240 | "and every culture war is being fought on it.
00:08:12.100 | "It's the main battlefield for our culture wars."
00:08:15.020 | And he says, "It means that upholding free speech
00:08:16.780 | "means you're not standing up
00:08:18.620 | "against some religious conservatives
00:08:20.380 | "lobbying to remove Judy Blume books from the library.
00:08:22.420 | "It means you're standing up against everyone
00:08:24.060 | "because every side is trying to take away
00:08:25.700 | "the speech rights of the other side."
00:08:28.340 | And so he goes on to say, for example,
00:08:29.980 | that all of his left-wing woke friends
00:08:32.820 | are convinced that the social media platforms
00:08:35.660 | uphold the white supremacist misogynistic patriarchy,
00:08:38.340 | and they have plenty of screenshots and evidence.
00:08:40.580 | And at the same time,
00:08:41.460 | all of his center-right libertarian friends
00:08:43.860 | are convinced that social media platforms
00:08:46.020 | uphold the woke BLM Marxist LGBTQ agenda,
00:08:49.020 | and they also have plenty of screenshots, blah, blah, blah.
00:08:51.460 | So his point is everyone has their own definition
00:08:55.660 | of free speech.
00:08:56.980 | Everyone wants to stop the other side,
00:08:59.980 | whatever the other team is,
00:09:01.100 | from whatever they're doing to impede their team
00:09:03.380 | and to get more freedom for their own team,
00:09:05.460 | that it's a battlefield where there are no clear sides.
00:09:10.460 | And he says, Elon Musk doesn't realize this.
00:09:13.660 | He says, "Elon doesn't understand
00:09:15.980 | "what has happened to internet culture since 2014."
00:09:18.660 | I know he doesn't because he was late to Bitcoin.
00:09:22.580 | Elon's been too busy doing actual real things
00:09:24.580 | like making electric cars and reusable rockets.
00:09:28.260 | Cutting out some inappropriate language here.
00:09:31.620 | So he has a pretty good excuse for not paying attention,
00:09:34.340 | but this is something that's hard to understand
00:09:35.940 | unless you've run a social network.
00:09:38.300 | All right, I'll call it there.
00:09:41.700 | That's my summary.
00:09:42.540 | But basically what this former Reddit CEO is saying
00:09:45.340 | is that Elon Musk is from an old generation
00:09:48.220 | where free speech was an ideal
00:09:50.020 | that all tech people supported, and it was pretty clear.
00:09:52.540 | It was like the internet is for free speech,
00:09:54.060 | and we have to stop Jerry Falwell
00:09:56.260 | from trying to prevent first-person shooters,
00:09:59.740 | or Al Gore's wife from preventing first-person shooters,
00:10:02.740 | or whatever was going on at the time.
00:10:03.860 | He's like, "Today, that's no longer the case.
00:10:05.900 | "Free speech means different things for everybody.
00:10:08.140 | "There's no solution that's gonna make everyone happy.
00:10:10.220 | "What this team wants is completely different
00:10:12.500 | "from what this team wants,
00:10:13.740 | "and you're gonna have to either pick sides,
00:10:15.940 | "and if you don't pick sides,
00:10:17.300 | "everyone's gonna hate you.
00:10:18.140 | "You're gonna be in a world of pain from all sides."
00:10:20.860 | So this thread has gone somewhat viral,
00:10:25.460 | and I think it is an interesting take.
00:10:27.940 | All right, so here's what I think about that.
00:10:29.860 | Having thought a lot about this
00:10:31.140 | and written a lot about this,
00:10:33.340 | there are some things here I agree with.
00:10:35.300 | Yes, there certainly was an older internet culture.
00:10:39.780 | I think they're often described
00:10:41.180 | as the open-culture techno-optimists,
00:10:44.900 | who were big believers in the internet
00:10:47.060 | as bringing openness to everyone.
00:10:49.620 | This is a movement that was really tied to things
00:10:51.820 | like "information wants to be free,"
00:10:54.100 | open source software.
00:10:55.380 | They were very anti-digital rights management.
00:10:57.900 | They thought software and music and text
00:11:01.140 | and everything should just be freely available
00:11:02.700 | on the internet, and it was a utopian movement.
00:11:05.140 | It came out of California techno-optimist circles.
00:11:07.860 | Kevin Kelly, who I know and respect,
00:11:09.540 | was one of the big thinkers of that.
00:11:10.700 | So that movement did exist.
00:11:12.300 | Their version of the internet
00:11:13.460 | is very different from what it is today.
00:11:15.420 | A lot of writers have talked about that transition.
00:11:17.580 | I think Jaron Lanier is probably the most eloquent.
00:11:20.100 | He was an open culture techno-optimist
00:11:21.980 | who became decidedly not that
00:11:23.580 | after the internet took a turn in the early 2000s.
00:11:26.780 | So I think that is definitely true.
00:11:29.260 | I think he's also right,
00:11:31.660 | and I think we've seen this clearly,
00:11:33.500 | that there is no obvious solution
00:11:38.060 | that's gonna make most people happy
00:11:39.620 | when it comes to things like content moderation.
00:11:41.860 | The left wants this moderated,
00:11:43.820 | the right wants that moderated.
00:11:45.900 | And then there are other weird, crazy offshoots
00:11:48.580 | of the mainstream left and right
00:11:49.860 | that have all sorts of crazy thoughts
00:11:51.540 | about what should be moderated or not.
00:11:54.780 | And so there's no way to keep everyone happy.
00:11:56.820 | Facebook saw this.
00:11:58.420 | Facebook somehow got everyone,
00:12:00.860 | no matter where they were in the political spectrum,
00:12:02.300 | mad at them.
00:12:03.620 | The right wing got mad that they were being censored.
00:12:05.460 | The left wing got mad they weren't censoring enough.
00:12:07.300 | And so it is a very difficult place to be in.
00:12:10.300 | There is no politically neutral stance
00:12:12.380 | that everyone will agree is doing it well.
00:12:13.740 | So I think he's right about that as well.
00:12:16.220 | What I think he's getting wrong though
00:12:19.460 | is this idea that Elon Musk or Marc Andreessen
00:12:22.020 | don't understand that.
00:12:23.980 | Gen X I think is too new of a generation.
00:12:26.540 | The sort of middle-aged tech oligarch class,
00:12:30.500 | the Peter Thiels, the Andreessens, the Musks,
00:12:34.540 | you might have David Sacks, Reid Hoffman,
00:12:38.300 | the sort of big names who made a lot of money.
00:12:42.420 | They were not really from that school
00:12:44.740 | of the original open-culture techno-optimists.
00:12:47.020 | They're a little bit older.
00:12:47.940 | That's a little bit of an older time.
00:12:49.540 | That's Kevin Kelly, that's Stuart Brand,
00:12:51.580 | that's Jaron Lanier.
00:12:52.820 | That's a slightly older group.
00:12:54.500 | This group came up in the dot-com boom of the 90s.
00:12:57.260 | They're much more money-focused
00:12:59.500 | than the original techno-optimists were.
00:13:01.100 | And they know exactly what's going on.
00:13:02.980 | I do not think Elon Musk misunderstands
00:13:07.020 | what's actually happening on Twitter.
00:13:10.660 | I think what's going on when he talks about
00:13:13.020 | content moderation is much simpler.
00:13:14.620 | And I don't know why we don't just simplify this to this.
00:13:18.340 | I think when Elon Musk says, look at my notes here,
00:13:21.740 | but when Elon Musk says basically,
00:13:23.620 | he wants free speech,
00:13:25.700 | I think almost certainly what he means is
00:13:28.620 | he thinks that content moderation should come
00:13:31.180 | more from a centrist position than from a farther
00:13:36.020 | influence to the farther to the left position.
00:13:38.300 | I think that's all it is.
00:13:40.700 | And I think we see the split.
00:13:44.300 | We saw the split happening in Silicon Valley
00:13:48.620 | where this small group of these tech oligarchs,
00:13:52.820 | so especially the ones with brand names,
00:13:54.380 | the people who made a lot of money, were very successful,
00:13:57.340 | had accrued a lot of power in the internet,
00:14:02.180 | out of the internet's growth.
00:14:04.580 | When there was the shift more recently in our culture
00:14:09.140 | towards using postmodern critical theories
00:14:11.620 | as the main perspective through which
00:14:12.860 | we understand the world,
00:14:14.100 | that group largely resisted it.
00:14:15.700 | And there might be a bit of a "don't tell me how to think"
00:14:20.140 | attitude: look, I'm used to being the smartest person in the room
00:14:22.100 | and explaining how things work,
00:14:23.460 | and I don't want someone from a university coming along
00:14:26.900 | and telling me how to think.
00:14:27.740 | I don't know what it was.
00:14:28.780 | It could be the antagonism that grew between the media
00:14:32.460 | and these groups.
00:14:33.300 | So as more of the culture, and especially media culture,
00:14:36.820 | shifted to using postmodern critical theories
00:14:38.860 | as their main lens,
00:14:40.700 | their treatment of these tech bro oligarchs
00:14:43.300 | got pretty rough.
00:14:45.260 | And there's this whole tension that's not reported a lot,
00:14:47.580 | but there basically has been a complete break
00:14:49.660 | between the Silicon Valley brand name leaders
00:14:52.940 | and the East Coast media,
00:14:54.260 | where they just won't talk to them anymore.
00:14:56.100 | Like every time we talk to you,
00:14:57.340 | you just dunk on us in the piece,
00:14:58.780 | and look, we're just not even gonna do interviews with you.
00:15:00.780 | We'll just talk to people directly through our own podcast,
00:15:03.140 | and we'll talk to people through our own websites.
00:15:04.780 | And there's this real tension between the two worlds.
00:15:07.540 | That probably didn't help either.
00:15:09.100 | But I think it's as simple as that.
00:15:10.780 | Elon Musk is from that group of brand name tech oligarchs
00:15:13.660 | that says, "I don't know.
00:15:15.060 | "I don't wanna re-center all my perspectives
00:15:17.540 | "through postmodern critical theories.
00:15:19.460 | "I think Twitter does that too much.
00:15:21.580 | "I wanted to do it less."
00:15:23.700 | So I don't know why it has to be such a complex analysis
00:15:26.940 | of what's really going on with free speech,
00:15:28.900 | and maybe Musk is from this weird prior time,
00:15:32.220 | and we have to understand these complex rules.
00:15:34.980 | I think he just has a different location
00:15:39.500 | on the political spectrum, and has a lot of money.
00:15:42.700 | You can put those two things together.
00:15:46.260 | I don't know.
00:15:47.100 | I mean, Jesse, you probably hear this, right?
00:15:47.940 | Like there's a lot of bending over backwards
00:15:50.100 | and complex analysis for some of this stuff,
00:15:51.860 | but I think some of it's pretty straightforward.
00:15:53.500 | Musk is like, "I wanna be more centrist on this,
00:15:55.620 | "and I have a lot of money,
00:15:56.600 | "and I'm kinda screwing with people."
00:15:58.660 | - Yeah, I do think he has a plan.
00:16:01.860 | - Yeah. - I'm not exactly sure
00:16:02.780 | what it is, but I do think he knows what's going on.
00:16:06.500 | 'Cause he uses Twitter a lot anyway.
00:16:07.700 | He's been using it for years, right?
00:16:09.300 | - Yeah, he's got 80-something million followers.
00:16:11.220 | - Yeah. - Yeah.
00:16:12.140 | - So anything he uses that much,
00:16:15.420 | he's obviously thinking a lot about it.
00:16:17.100 | - I mean, this comes back to my bigger point,
00:16:19.380 | which I make often about social media,
00:16:20.980 | which is this is the impossibility
00:16:22.780 | of trying to have universalism.
00:16:24.320 | I just think this model of social media universalism,
00:16:27.340 | where everyone uses the same small number of platforms,
00:16:29.900 | doesn't make sense.
00:16:31.900 | That's not what the internet was architected for.
00:16:34.180 | Like the whole point of the internet
00:16:35.580 | is now you have potential point-to-point connections
00:16:38.660 | between everyone in the world.
00:16:41.140 | Meaning that you can put together
00:16:42.860 | any type of communication graph topology that you want.
00:16:46.140 | Small groups of people, interesting connections
00:16:49.200 | that you surf to find people you've never known before.
00:16:51.660 | But to go to this broadcast topology,
00:16:53.660 | we say, "No, no, no.
00:16:54.900 | "Everyone's gonna talk to the same server banks
00:16:57.180 | "at the same companies,
00:16:58.220 | "and everyone's gonna read the same information
00:16:59.940 | "being posted on the same three websites."
00:17:02.100 | Completely gets in the way
00:17:03.620 | and obviates all the advantages of the internet
00:17:05.140 | in the first place, and it's completely impossible.
00:17:06.780 | And this is really what we're seeing here
00:17:08.460 | in that Twitter thread.
00:17:09.580 | You're never going to make a platform work
00:17:11.740 | where you want it to be the platform everyone uses.
00:17:14.300 | What an impossible task.
00:17:17.500 | You want everyone in the country to use the same platform
00:17:20.020 | with the same interface
00:17:21.220 | and the same content moderation rules.
00:17:22.980 | Of course, that's going to explode, and it should.
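To put rough numbers on that architectural point, here's a quick, purely illustrative back-of-the-envelope sketch in Python; the endpoint counts are arbitrary:

```python
from math import comb

# With point-to-point connectivity, n endpoints admit comb(n, 2) possible
# direct links, and any subset of those links is a valid communication
# graph: small groups, niche clusters, whatever topology you want.
for n in (10, 1_000, 1_000_000):
    links = comb(n, 2)
    print(f"{n:>9,} endpoints: {links:,} possible links, 2^{links} possible topologies")

# The broadcast model throws that flexibility away: one star topology
# where every endpoint talks to the same central hub.
```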
00:17:26.500 | And it's why I'm obviously much more in favor
00:17:29.380 | of social media being, and by social media,
00:17:31.620 | I mean social interaction on the internet
00:17:33.620 | being much more niche, much smaller scale.
00:17:37.780 | Do what you wanna do in your particular community.
00:17:40.740 | Let community standards emerge in a grassroots fashion
00:17:44.500 | from the small number of users
00:17:45.900 | that are using each of these particular networks or groups
00:17:48.540 | or however you want it to work.
00:17:49.500 | That is where the internet really works.
00:17:51.500 | The people who are into X have a place to go
00:17:54.780 | and hang out with people that are into X.
00:17:56.260 | And the standards for how they talk about things
00:17:58.020 | might be really different than people that are into Y.
00:17:59.780 | And the people who are into Y don't have to know
00:18:01.180 | what the people that are into X are talking about.
00:18:03.060 | And we don't have to have some sort of common set of rules
00:18:06.460 | that the people from X and Y both have to follow.
00:18:09.820 | And so I think the folly here, the Shakespearean tragedy
00:18:13.540 | underlying all this is this push towards,
00:18:16.100 | we need a digital town square.
00:18:18.620 | We need one service that everyone uses.
00:18:21.100 | So look, I don't think Musk doesn't know what's going on.
00:18:26.100 | I don't think he's Kevin Kelly reincarnated
00:18:28.900 | as being too techno-optimistic.
00:18:30.380 | He knows what he's doing.
00:18:31.620 | I don't think it's gonna be super successful
00:18:34.340 | because I think this whole project
00:18:35.740 | of having giant platforms that everyone uses
00:18:37.500 | makes no sense and is destroying the internet.
00:18:40.140 | But I don't think it's complicated what he's doing.
00:18:42.380 | He likes Twitter.
00:18:43.860 | He doesn't like some of the politics
00:18:45.260 | behind how it's being implemented.
00:18:46.740 | He has money.
00:18:48.180 | So he's like, I'm gonna try to change it.
00:18:50.740 | All right, let's do one more quick article.
00:18:53.500 | So today my theme is social media,
00:18:55.940 | future social media, social media regulation.
00:18:57.940 | It doesn't mean I'm always gonna talk about this,
00:18:59.420 | but just seems to be in the air these days.
00:19:01.620 | So this final one also comes from the Times,
00:19:05.580 | from a couple of days ago.
00:19:07.460 | The title of the article is
00:19:08.500 | "Obama Calls for More Regulatory Oversight
00:19:11.060 | of Social Media Giants."
00:19:13.580 | This is from a talk that Obama gave last week
00:19:17.740 | at the Stanford Cyber Policy Center.
00:19:19.620 | Okay, so the article says,
00:19:20.740 | former President Barack Obama on Thursday
00:19:23.380 | called for greater regulatory oversight
00:19:25.420 | of the country's social media giants,
00:19:26.980 | saying their power to curate the information
00:19:28.580 | that people consume has turbocharged political polarization
00:19:32.980 | and threatened the pillars of democracy.
00:19:35.300 | Weighing in on the debate over how to address
00:19:37.060 | the spread of disinformation,
00:19:38.620 | he said the companies needed to subject
00:19:40.980 | their proprietary algorithms
00:19:42.140 | to the same kind of regulatory oversight used
00:19:43.900 | to ensure the safety of cars, food,
00:19:46.020 | and other consumer products.
00:19:47.380 | Tech companies need to be more transparent
00:19:48.980 | about how they operate, Mr. Obama said.
00:19:52.100 | Well, I mean, there's a lot of things
00:19:55.660 | I agree with the former president on.
00:19:57.100 | I think this take, however, is a little bit out of touch
00:19:59.980 | with the underlying technology.
00:20:03.060 | Social media, quote unquote, algorithms
00:20:06.460 | are not like food safety or car safety,
00:20:11.460 | where, okay, we have data from crash tests
00:20:15.620 | that need to be shared or whatever needs to happen here.
00:20:18.380 | They're very complicated, but they're not just complicated.
00:20:21.140 | They are fundamentally doing something that's ineffable.
00:20:26.140 | So what's actually happening under the covers
00:20:29.500 | at these social media companies,
00:20:30.620 | how, let's say, items are selected to add
00:20:33.940 | to the stream, the timeline, the feed that you consume.
00:20:37.460 | This is a complex collection
00:20:40.420 | of neural networks that have been trained
00:20:43.620 | in complex ways, usually with reinforcement mechanisms,
00:20:46.660 | and then they're connected together
00:20:47.860 | in some sort of dynamical way.
00:20:50.220 | You can't look at a complex connection of neural networks
00:20:53.780 | and say, what does this do?
00:20:55.380 | It doesn't work that way.
00:20:58.980 | These networks have learned on their own
00:21:02.140 | through hundreds of millions of trials of training
00:21:04.860 | and backpropagation and reinforcement.
00:21:07.140 | They have learned patterns, like
00:21:11.140 | what information is more likely to get engagement
00:21:13.180 | from this person versus that person,
00:21:14.980 | that cannot be easily reduced
00:21:17.020 | to a human-understandable format.
00:21:19.620 | A lot of what's happening here is that the information,
00:21:22.340 | let's say a particular post on Twitter,
00:21:24.620 | is going to exist as a multidimensional vector
00:21:27.300 | of data points, and that any one of these neural networks
00:21:30.820 | in the quote unquote algorithm
00:21:32.100 | is actually creating a multidimensional hyperplane
00:21:34.740 | through which they can actually categorize
00:21:36.420 | a multidimensional location for this point.
00:21:38.740 | And then the user they're gonna show this to
00:21:40.420 | also exists as their own multidimensional point,
00:21:41.660 | and the system can check whether the two
00:21:43.460 | have been segmented into the same region
00:21:45.260 | by this hyperplane, meaning the user is more likely
00:21:47.060 | to have an affinity for the post.
00:21:49.060 | All of which I'm trying to say here
00:21:50.380 | is that this is complicated, abstract, multidimensional work
00:21:54.780 | that is happening with these systems.
00:21:56.340 | It is not an algorithm like we would think about it.
00:21:59.660 | It is not a bunch of Turbo Basic code that says,
00:22:03.020 | if about cats and reader is over 60,
00:22:09.580 | show them this post.
00:22:14.060 | It's not that.
00:22:15.060 | It is vectors of numbers
00:22:18.780 | going through complex linear algebra convolutions
00:22:22.300 | and scores coming out of the other side.
00:22:24.140 | And what happens in between is not understandable by people.
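To make that concrete, here is a minimal, hypothetical sketch of the kind of computation being described. Everything in it is invented for illustration (a single random layer stands in for what is really many stacked, trained networks), but the shape is the point: vectors go in, a number comes out, and no human-readable rule appears anywhere.

```python
import numpy as np

rng = np.random.default_rng(0)

# A post and a user each exist as points in a high-dimensional space
# learned from engagement data (random here, purely for illustration).
post_embedding = rng.normal(size=128)
user_embedding = rng.normal(size=128)

# One layer of learned weights: its rows act like hyperplanes that
# segment that space. Real systems stack many such layers, trained by
# backpropagation on hundreds of millions of engagement examples.
weights = rng.normal(size=(128, 128))
bias = rng.normal(size=128)

def affinity(post, user):
    # Project both points through the learned transformation...
    post_region = np.tanh(weights @ post + bias)
    user_region = np.tanh(weights @ user + bias)
    # ...then ask, roughly: did the hyperplanes segment these two points
    # into the same region? A higher score means more likely to engage.
    return float(post_region @ user_region)

# The output is just a number. Nothing in `weights` says "about cats" or
# "reader is over 60"; the "why" is smeared across thousands of parameters.
print(affinity(post_embedding, user_embedding))
```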
00:22:29.020 | But I think this is an old-fashioned view,
00:22:31.060 | that what's happening here
00:22:32.340 | is some mustache-twirling executive
00:22:36.740 | going, hmm, we will get more views.
00:22:40.260 | I'm looking at this here.
00:22:41.580 | These people like hearing why vaccines are bad.
00:22:44.420 | So let me just type into here,
00:22:46.340 | show articles about vaccines being bad
00:22:50.220 | because then we will get more money.
00:22:51.740 | And oh, the regulators are here.
00:22:53.300 | And they say, don't put that in your algorithm.
00:22:54.820 | So, okay, I'm gonna take that out of my algorithm here.
00:22:56.620 | That's not how it works.
00:22:57.540 | Again, it's incredibly complex and abstract
00:23:00.220 | and you can't break it down into what's really happening.
00:23:03.380 | Now, furthermore, even if you could,
00:23:05.380 | it would be disastrous for these companies
00:23:07.220 | if you could somehow make this algorithm interrogatable.
00:23:11.580 | Right, so maybe you wanna apply
00:23:13.100 | an explainable AI approach here and say,
00:23:14.940 | well, let's just interrogate the algorithms.
00:23:17.220 | See what it says, you know,
00:23:19.060 | let's put in different things and see which it prefers.
00:23:21.420 | But if you did that,
00:23:22.620 | then everyone would start scamming the system.
00:23:26.020 | Everyone trying to get people to look at their dubious
00:23:29.620 | diet pill site or whatever would figure out
00:23:32.860 | exactly what to put in their text
00:23:34.540 | so that it would dominate everything else.
00:23:36.340 | It'd be like showing spammers the spam filter
00:23:39.700 | that Gmail used.
00:23:41.300 | Everything would then slip through the filter
00:23:42.820 | because they could just sit there
00:23:43.700 | and work with it to figure it out.
00:23:44.700 | So you can't really make it transparent in that way.
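Here's a toy sketch of that failure mode. The word list and the stand-in scoring function are made up; the point is that gaming an exposed ranker needs nothing but query access.

```python
import random

WORDS = ["miracle", "diet", "pill", "cats", "free", "breaking", "exclusive"]

def toy_score(post):
    # Stand-in for the platform's secret ranking model. Any callable
    # mapping a post to a score will do, which is exactly the problem.
    return post.count("cats") + 0.5 * post.count("breaking") - post.count("pill")

def craft_spam(score_fn, length=6, trials=500):
    """Greedy hill-climbing against an exposed ranking function:
    mutate one word at a time, keep any change that scores higher."""
    post = random.choices(WORDS, k=length)
    best = score_fn(post)
    for _ in range(trials):
        candidate = list(post)
        candidate[random.randrange(length)] = random.choice(WORDS)
        if score_fn(candidate) > best:
            post, best = candidate, score_fn(candidate)
    return post, best

# The diet-pill spammer never needs to understand the model;
# they just query it until their post dominates everything else.
print(craft_spam(toy_score))
```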
00:23:46.140 | Anyways, again, I respect the former president.
00:23:48.980 | I'm just saying this is out of date
00:23:51.460 | with what's going on with this technology.
00:23:52.820 | It's not so simple.
00:23:53.700 | It's not that you're going in there and turning knobs.
00:23:55.940 | This knob is turned towards bad information.
00:23:58.140 | Let's just turn that down.
00:23:59.660 | And this knob is turned towards good information.
00:24:01.420 | Let's turn it up.
00:24:02.700 | The reality I think is much more complicated.
00:24:06.140 | But there's a bigger point here I wanna make,
00:24:08.500 | which is again, in a lot of discourse,
00:24:10.180 | especially discourse coming out of
00:24:12.620 | more elite circles about social media,
00:24:14.540 | there's this real focus on the idea that the problem is
00:24:18.340 | the wrong information being amplified.
00:24:21.140 | That this is the problem.
00:24:22.260 | It's all about content amplification.
00:24:24.300 | This is bad information.
00:24:26.300 | This platform is sending out this bad information
00:24:29.020 | to a lot of people.
00:24:30.100 | And from the elite perspective, most people are dumb.
00:24:33.100 | So then they get tricked
00:24:33.940 | and then they believe this bad information.
00:24:35.180 | So let's just stop it
00:24:36.020 | from spreading the bad information.
00:24:38.100 | I am more in line right now with Jonathan Haidt's latest take
00:24:42.380 | that we talked about last week on the show.
00:24:44.620 | His take from his big Atlantic article,
00:24:46.100 | which I increasingly think is right.
00:24:49.580 | And I think it gives us a more nuanced understanding
00:24:51.820 | of the issues with social media than simply saying,
00:24:54.420 | it pushes the bad information more
00:24:57.220 | than the quote unquote good information.
00:24:58.820 | 'Cause if you'll remember from our discussion
00:25:00.380 | of Jonathan Haidt's Atlantic article,
00:25:03.820 | what he was arguing is the problem
00:25:05.220 | is not what it does to information,
00:25:06.580 | it's what social media does to the people.
00:25:08.740 | What it does to the people
00:25:09.620 | who are interacting on social media.
00:25:11.100 | And his whole point was,
00:25:13.020 | once these platforms shifted towards viral dynamics,
00:25:17.620 | where something could get a huge amount
00:25:19.780 | of attention right away,
00:25:21.540 | things could blow up really quickly.
00:25:23.220 | He said, this really changed the way
00:25:24.780 | that people use social media.
00:25:28.620 | Three things happened.
00:25:30.660 | One, there could be
00:25:35.660 | immediate consequences.
00:25:38.100 | If you sort of say the wrong thing,
00:25:40.180 | your team could swarm in a way that was breathtaking.
00:25:43.820 | And it became, he called it a vigilante culture.
00:25:46.380 | Where out of nowhere,
00:25:48.020 | people could just get piled on and destroyed.
00:25:52.340 | Two, it created a culture then
00:25:53.860 | where you became very wary of letting the other team,
00:25:57.900 | depending on how you define the other team,
00:25:59.300 | gain any ground.
00:26:00.820 | Can't let the other team gain any ground.
00:26:03.220 | So we got to quickly tamp down or attack.
00:26:06.260 | Don't give in on anything that might get amplified.
00:26:08.860 | So it created this really tense, anxious type of environment.
00:26:12.580 | And three, that drove out almost everyone but the extremes.
00:26:15.980 | So as Haidt documents,
00:26:17.100 | you're left with the extremes on the left
00:26:18.660 | and the extremes on the right,
00:26:20.300 | basically fighting back and forth,
00:26:22.140 | desperate to avoid being attacked by their own team,
00:26:24.180 | desperate not to give any ground to the other team.
00:26:26.500 | It became a spectacle of the elite extremists.
00:26:30.100 | That is what is happening on a platform
00:26:32.260 | like Twitter right now.
00:26:33.100 | And it's great entertainment for those groups
00:26:36.060 | and a larger group of adjacent people
00:26:37.740 | that quietly like to watch it,
00:26:39.300 | but it's a terrible environment.
00:26:42.100 | So the incentives there
00:26:44.380 | are incentives where really wonky information,
00:26:47.340 | really weird, bad information,
00:26:49.380 | can really spread and take hold
00:26:50.700 | because the point there is not,
00:26:52.300 | hey, let's try to spread interesting information,
00:26:54.060 | it's, we're going to win
00:26:54.900 | and I don't want to get hung by my own team.
00:26:57.860 | So we'll grasp onto something crazy
00:26:59.780 | if that gives us a little advantage.
00:27:01.580 | And we will ignore refuting evidence about something
00:27:04.700 | with diligent blinders,
00:27:07.620 | if that might lead to an attack,
00:27:10.100 | if I acknowledge it,
00:27:10.940 | or if I might give the other team room.
00:27:12.540 | So Haidt's argument is the problem is not
00:27:14.660 | the social media algorithms,
00:27:16.580 | how they amplify or choose
00:27:17.820 | what information to amplify.
00:27:18.980 | It's what do they do to the people?
00:27:20.780 | And the viral dynamics turn people
00:27:22.220 | into these weird, obsessive, extreme tribal warriors.
00:27:26.380 | And that is an environment where really wonky
00:27:29.540 | or bad information can spread, can take hold,
00:27:31.820 | can be really difficult to dislodge.
00:27:36.620 | I mean, I think if there was somehow a way
00:27:41.180 | to really dunk on Trump by believing in flat-earthism,
00:27:45.500 | you would see a lot of flat-earthism.
00:27:47.420 | The other way around: if there's some way
00:27:49.620 | to really, really get at Biden
00:27:53.460 | by believing in lizard people,
00:27:55.460 | lizard people stories are going to spread really strongly
00:27:58.180 | and people are really going to grasp onto them.
00:27:59.740 | It's not the information that matters here.
00:28:02.620 | It's the human dynamic.
00:28:04.020 | Social media warps people into a mode
00:28:09.020 | in which all sorts of crazy stuff is going to spread.
00:28:12.180 | So again, I think the solution is we got to get away
00:28:14.740 | from platform universalism.
00:28:17.060 | We got to get away from this idea
00:28:18.540 | that everyone needs to be on the same platform,
00:28:20.420 | that we need these quote unquote,
00:28:21.900 | what we call digital town halls,
00:28:23.900 | which aren't digital town halls at all.
00:28:25.420 | It's the digital Roman Colosseum.
00:28:27.020 | It's a spectacle for elite extremists doing combat
00:28:30.500 | and the small group of people who like to watch the blood,
00:28:33.060 | but it's a spectacle that has a trickle-down impact
00:28:35.060 | on everyone.
00:28:35.940 | Most people don't use Twitter.
00:28:37.580 | Most people, the vast majority of people
00:28:39.700 | never post anything on Twitter,
00:28:40.820 | but there are huge impacts on what happens in their lives
00:28:43.820 | because of what's going on in that elite Colosseum.
00:28:46.140 | And I think Haidt is absolutely right about that.
00:28:47.940 | And the problem is not just, again,
00:28:49.380 | we have to go in there and turn some content knobs.
00:28:51.420 | Don't promote this content, promote that content.
00:28:53.260 | We're way past that.
00:28:54.460 | We got to stop the impact it has on people.
00:28:57.620 | And I think the way we do that is we de-emphasize
00:29:00.620 | the importance of these platforms.
00:29:02.020 | Once we recognize it's an elite spectacle,
00:29:03.780 | maybe we'll spend less time paying attention to it.
00:29:06.620 | We'll reduce its impact.
00:29:07.820 | So Obama goes on to say one of his proposals
00:29:12.100 | was to look at Section 230
00:29:14.580 | of the Communications Decency Act,
00:29:16.140 | which protects social media platforms from liability
00:29:18.260 | for content that their users post.
00:29:19.980 | So he is supporting proposals to get rid of that.
00:29:24.940 | That makes companies more liable for what's posted.
00:29:27.660 | And again, I think that's interesting.
00:29:29.100 | 230 is complicated.
00:29:30.300 | Like technically 230 is what would give me protection
00:29:33.380 | if commenters on my blog said something damaging or illegal.
00:29:38.380 | 230 would say, well, I'm not gonna be held responsible
00:29:42.020 | for what other people posted on my platform.
00:29:44.260 | The social media companies are really leaning on 230.
00:29:46.900 | I'm not opposed to the idea of getting rid of 230
00:29:49.980 | to some degree, because I think, again,
00:29:51.540 | anything that might lead to fracturing social media
00:29:54.860 | is probably gonna be better for everyone involved.
00:29:59.380 | I mean, I'd like a world where, okay, we drop 230
00:30:01.940 | and maybe I have to turn comment threads off
00:30:03.700 | because I don't wanna be liable, sure.
00:30:05.980 | But it means that it's no longer legally viable
00:30:08.500 | to be a massive universal platform.
00:30:10.540 | What you get instead is more of a Reddit type culture
00:30:12.620 | of smaller communities where
00:30:15.020 | there's community moderation and people are responsible
00:30:17.420 | for what's posted on there.
00:30:18.540 | And there's care to how the community interacts
00:30:21.780 | and the people who are really into this
00:30:23.540 | are not talking to the people who are really into that.
00:30:25.700 | That's probably a better world.
00:30:27.140 | So probably, as a matter of legal principle,
00:30:29.740 | there are problems with just saying drop 230,
00:30:31.620 | but I like anything that might fracture social media
00:30:35.060 | away from this form it is now,
00:30:38.340 | where we have the spectacle of the elites.
00:30:40.740 | And though most of us don't wanna pay attention to it,
00:30:42.540 | it ends up really affecting our lives.
00:30:46.940 | So I don't know.
00:30:50.540 | You probably know Twitter better than I do, Jesse.
00:30:54.260 | - I don't think so.
00:30:55.100 | I don't really know Twitter. - You don't know it either?
00:30:55.940 | Yeah, see, that's the thing.
00:30:56.900 | Most people don't.
00:30:57.780 | And yet it has a big impact on all of our lives.
00:31:00.140 | Like it could have a big impact on
00:31:02.260 | how our employers operate, the news we receive,
00:31:05.500 | the legislation taken up or not taken up,
00:31:08.700 | what politicians pay attention to.
00:31:10.780 | It's like the Roman Colosseum,
00:31:13.060 | the only people going there are like the elite landowners
00:31:17.220 | because they really like it.
00:31:18.100 | But what's happening in the Colosseum
00:31:19.340 | is completely affecting what happens
00:31:20.700 | to the rest of the Roman citizens
00:31:22.220 | who are just like out there trying to run their farms.
00:31:24.020 | - It's a great analogy.
00:31:25.220 | - Yeah, and I think Haidt actually mentioned
00:31:29.180 | the Roman Colosseum in his Atlantic article.
00:31:31.020 | So I'm sort of taking and running with his perspective,
00:31:34.820 | but it's elite capture.
00:31:36.820 | And I think that's all this stuff is,
00:31:39.260 | that's what keeps capturing me about all of this.
00:31:41.780 | We're in 1750 France,
00:31:47.340 | and there's like huge arguments going on
00:31:49.420 | about the Hall of Mirrors at Versailles.
00:31:52.660 | It's like, it's not what most people in France
00:31:54.300 | cared about right then.
00:31:55.500 | So I think there's more of that going on with social media
00:31:59.380 | than other people are willing to let on.
00:32:01.860 | So there we go.
00:32:03.340 | That's what I think about that.
00:32:06.340 | (upbeat music)