Why Cal Newport Hopes Elon Musk Ruins Twitter | Deep Questions with Cal Newport
Chapters
0:00 Cal's intro
1:06 Cal talks about "Elon Musk Details Twitter Takeover Plan"
6:28 Former Reddit CEO's Anti-Twitter Rant
19:04 Obama Calls for More Social Regulation
All right, well, let us do our first segment. [...] is not to just give my opinion on everything, who cares, but to look at segments of the news that overlap things and give me a chance to actually bounce off them and elaborate some of the theories I've been developing on the show, some of the theories that I talk [...] And unlike prior Cal Reacts to the News segments, I actually wanna do several different pieces here.
[...] returning to Elon Musk and his potential takeover [...] It was titled "Elon Musk Details His Plan to Pay [...]"

"Elon Musk said on Thursday that he had commitments worth $46.5 billion to finance his proposed bid for Twitter and was exploring whether to launch a hostile takeover [...]"

Jesse, I think he's beating out the $5,000 bid [...] after Mr. Musk made an unsolicited offer for Twitter [...] put pressure on the social media company's board [...] Steven Davidoff Solomon, a professor at the School of Law [...] "and this is starting to look like a normal hostile bid. You do not do that unless you are going to launch an offer." [...] I think I was leaning towards this was probably not for real. I thought he might've just been messing with people.
Now, the breaking news that almost happened this morning is he tweeted today, the day we're recording this, [...] but I had to go on the Twitter to get this tweet thread. [...] It was him moving on from making fun of Bill Gates at a time where he was talking a lot about climate change. So I guess Elon Musk has been dunking on Gates recently [...] Captain climate change is shorting Tesla stock. [...] So that's interesting because that's a new spin. I think a lot of the coverage of Musk's plans for Twitter [...] And here he is emphasizing another type of improvement. [...] That's interesting that he is making that pivot.
[...] that Morgan Stanley is a big part of the money. [...] And they quote a lecturer from Cornell University saying, "There are lots of very senior people at Morgan Stanley that, in my view, would not allow this to happen unless there was some level of seriousness behind it."

Okay, so all of the coverage seems to be saying [...] One, it really still could be Musk messing with everyone. [...] I think he could persuade Morgan Stanley to come on board even if he never actually planned to make the bid.
The other thing I was thinking about this morning [...] to this Musk takeover that isn't really being reported. [...] And that's the fact that the media for the most part doesn't like Elon Musk for a lot of complicated reasons. [...] But of course, the fact that he messes with them this way [...] the media might stop using Twitter and focusing on Twitter and allowing Twitter to influence them as much. [...] "I don't wanna be a part of something that he owns." [...] I think if reporters and journalists in general spent less time on Twitter and being influenced by Twitter, [...] maybe it will actually ironically and paradoxically [...] Basically anything that hurts Twitter, I'm kind of a fan of.
All right, so then I had a second article here that elaborates on what's going on with Musk and Twitter. And maybe it's not fair to call it an article. [...] So there's all sorts of recursive ironies abounding here. It's from the former CEO of Reddit, Yishan Wong, [...] and hat tip to listener Andy, who emailed this to me. In general, by the way, if you have tips or articles, [...] my longtime address for that is interesting@calnewport.com. That's where Andy sent me this Twitter thread. [...] If Elon takes over Twitter, he's in for a world of pain.
But Yishan says, "There is this old culture of the internet. [...] This free speech idea arose out of a culture [...] In practical terms, this meant that they would try to ban porn or other imagined moral degeneracy [...]" and he points to Elon Musk or Marc Andreessen, [...] "a new frontier, a flowering of the human spirit. [...] Reddit," which he ran, "was born in the last years [...] but this is not what free speech is about today."

He then goes on to argue, "The internet is not a frontier [...] and every culture war is being fought on it. It's the main battlefield for our culture wars." And he says, "It means that upholding free speech [...] lobbying to remove Judy Blume books from the library. It means you're standing up against everyone [...]" are convinced that the social media platforms uphold the white supremacist misogynistic patriarchy, and they have plenty of screenshots and evidence, [...] and they also have plenty of screenshots, blah, blah, blah.

So his point is everyone has their own definition [...] from whatever they're doing to impede their team [...] that it's a battlefield where there are no clear sides.
[...] "what has happened to internet culture since 2014." [...] I know he doesn't because he was late to Bitcoin. Elon's been too busy doing actual real things like making electric cars and reusable rockets. Cutting out some inappropriate language here. So he has a pretty good excuse for not paying attention, but this is something that's hard to understand [...]

But basically what this former Reddit CEO is saying [...] that all tech people supported, and it was pretty clear. [...] from trying to prevent first-person shooters, or Al Gore's wife from preventing first-person shooters. [...] He's like, "Today, that's no longer the case. Free speech means different things for everybody. There's no solution that's gonna make everyone happy. What this team wants is completely different [...] You're gonna be in a world of pain from all sides."
All right, so here's what I think about that. [...] Yes, there certainly was an older internet culture. [...] This is a movement that was really tied to things [...] They were very anti-digital rights management, [...] and everything should just be freely available on the internet, and it was a utopian movement. It came out of California techno-optimist circles. [...]

A lot of writers have talked about that transition. I think Jaron Lanier is probably the most eloquent, [...] after the internet took a turn in the early 2000s. [...] that it's also right that there is no obvious solution when it comes to things like content moderation. [...] And then there's other weird, crazy offshoots. [...] And so there's no way to keep everyone happy. [...] no matter where they were in the political spectrum. [...] The right wing got mad that they were being censored. The left wing got mad they weren't censoring enough. And so it is a very difficult place to be in, [...] where people will see, like, you're doing it well.
[...] is this idea that Elon Musk or Marc Andreessen, [...] the sort of big, made-a-lot-of-money Reid Hoffman, [...] of the original open-culture techno-optimists. [...] This group came up in the dot-com boom of the 90s. [...]

And I don't know why we don't just simplify this to this. I think when Elon Musk says, look at my notes here, [...] he thinks that content moderation should come more from a centrist position than from a farther [...] influence to the farther-to-the-left position. [...] where this small group of these tech oligarchs, the people who made a lot of money, were very successful, [...] When there was the shift more recently in our culture [...] And there might be a bit of a "don't tell me how to think": look, I'm used to being the smartest person in the room and I don't want someone from a university coming along [...]

It could be the antagonism that grew between the media [...] So as more of the culture, and especially media culture, shifted to using postmodern critical theories [...] And there's this whole tension that's not reported a lot, but there basically has been a complete break between the Silicon Valley brand name leaders [...] and look, we're just not even gonna do interviews with you. We'll just talk to people directly through our own podcasts, and we'll talk to people through our own websites. And there's this real tension between the two worlds. [...] Elon Musk is from that group of brand name tech oligarchs. [...]

So I don't know why it has to be such a complex analysis, [...] and maybe Musk is from this weird prior time, and we have to understand these complex rules. [...] on the political spectrum, and has a lot of money. [...] I mean, Jesse, you probably hear this, right? [...] but I think some of it's pretty straightforward. Musk is like, "I wanna be more centrist on this," [...] what it is, but I do think he knows what's going on.
- Yeah, he's got 80-something million followers. [...]

- I mean, this comes back to my bigger point. [...] I just think this model of social media universalism, where everyone uses the same small number of platforms, [...] That's not what the internet was architected for. [...] is now you have potential point-to-point connections, [...] any type of communication graph topologies that you want. Small groups of people, interesting connections that you surf to find people you've never known before. [...] "Everyone's gonna talk to the same server banks, and everyone's gonna read the same information [...]" and obviates all the advantage of the internet in the first place, and it's completely impossible. [...] where you want it to be the platform everyone uses. You want everyone in the country to use the same platform. [...] Of course, that's going to explode, and it should.

And it's why I'm obviously much more in favor [...] being much more niche, much smaller scale. Do what you wanna do in your particular community. Let community standards emerge in a grassroots fashion [...] that are using each of these particular networks or groups. [...] And the standards for how they talk about things might be really different than people that are into Y. And the people who are into Y don't have to know what the people that are into X are talking about. And we don't have to have some sort of common set of rules that the people from X and Y both have to follow.
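The contrast being drawn here, one universal platform versus many small niche networks, can be sketched as two communication graph topologies. This is a hypothetical illustration, not anything from the episode; the function names and the 9-user example are invented:

```python
import itertools

def star_topology(n_users):
    """Social media universalism: every user talks through a single hub."""
    hub = "platform"
    return {(u, hub) for u in range(n_users)}

def clustered_topology(communities):
    """Niche networks: small groups, each fully connected internally."""
    edges = set()
    for members in communities:
        edges |= set(itertools.combinations(members, 2))
    return edges

# 9 users on one universal platform vs. three niche communities of 3
universal = star_topology(9)
niche = clustered_topology([range(0, 3), range(3, 6), range(6, 9)])

# In the clustered graph no edge crosses community lines, so people
# into X never need to share one rule set with people into Y.
cross = [(a, b) for a, b in niche if a // 3 != b // 3]
print(len(universal), len(niche), len(cross))  # 9 9 0
```

Both graphs have the same number of edges here, but in the star every conversation passes through one hub with one set of moderation rules, while the clustered graph lets each community set its own standards.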
And so I think the folly here, the Shakespearean tragedy, [...] So look, I don't think Musk doesn't know what's going on. [...] makes no sense and is destroying the internet. But I don't think it's complicated what he's doing. [...] future social media, social media regulation. It doesn't mean I'm always gonna talk about this, [...]
This is from a talk that Obama gave last week [...] that people consume has turbocharged political polarization. [...] Weighing in on the debate over how to address [...]

I think this take, however, is a little bit out of touch. [...] that need to be shared or whatever needs to happen here. They're very complicated, but they're not just complicated.
They are fundamentally doing something that's ineffable. [...] So what's actually happening underneath the coverage, [...] how, let's say, items are selected to add to the stream, [...] of complex neural networks that have been trained in complex ways, usually with reinforcement mechanisms. [...] You can't look at a complex connection of neural networks [...] through hundreds of millions of trials of training. [...] They have learned patterns, instructions, the information, what information is more likely to get engagement. [...]

A lot of what's happening here is that the information [...] is going to exist as a multidimensional vector of data points, and that any one of these neural networks is actually creating a multidimensional hyperplane, [...] meaning they're more likely to have an affinity. [...] is that this is complicated, abstract, multidimensional work. [...] It is not an algorithm like we would think about it. It is not a bunch of Turbo Basic code that says, [...] going through complex linear algebra convolutions. [...] And what happens in between is not understandable by people.
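To make the vector-and-hyperplane picture concrete, here is a minimal sketch with made-up numbers. Real ranking models are deep, nonlinear networks; this linear toy is just the simplest case of a learned hyperplane separating items predicted to get engagement from the rest. The weights, bias, and item vectors are all hypothetical:

```python
def engagement_score(item_vec, weights, bias):
    # Signed distance (up to scaling) from the separating hyperplane:
    # positive -> predicted engaging, negative -> predicted not.
    return sum(x * w for x, w in zip(item_vec, weights)) + bias

# Hypothetical learned parameters: opaque numbers, not legible "knobs"
# like "promote good information".
weights = [0.83, -1.42, 0.07, 2.10]
bias = -0.31

item_a = [0.9, 0.1, 0.5, 0.8]   # lands on the "engaging" side
item_b = [0.1, 0.9, 0.2, 0.05]  # lands on the other side

# The stream is just items sorted by this abstract score.
ranked = sorted([("a", item_a), ("b", item_b)],
                key=lambda kv: engagement_score(kv[1], weights, bias),
                reverse=True)
print([name for name, _ in ranked])  # ['a', 'b']: item a gets promoted
```

Nothing in `weights` says why item a wins; that is the point Cal is making about trained models versus hand-written rules.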
But I think this is an old-fashioned view, that, like, [...] is that, like, someone who was mustache-twisting, [...] These people like hearing why vaccines are bad. [...] And he says, don't put that in your algorithm. So, okay, I'm gonna take that out of my algorithm here. [...] It's these, again, it's incredibly complex and abstract, and you can't break it down into what's really happening. [...] if you could somehow make this algorithm interrogatable. [...] let's put in different things and see which it prefers, [...] then everyone would start scamming the system. Everyone trying to get people to look at their dubious [...] It'd be like showing spammers the spam filter. [...] Everything would then slip through the filter.
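The spam-filter analogy can be sketched in code. Suppose, hypothetically, the filter were made interrogatable, meaning anyone could query its score. With nothing but query access, an adversary can hill-climb their content until it slips through; the scoring function, weights, and feature numbers below are all invented for illustration:

```python
import random

def spam_score(features):
    """Stand-in for an interrogatable filter: higher = more likely blocked."""
    weights = [1.5, 2.0, -0.5, 1.0]
    return sum(f * w for f, w in zip(features, weights))

def game_the_filter(features, trials=200, seed=0):
    """With query access, nudge content randomly, keeping any tweak
    the filter scores lower, until the message slips through."""
    rng = random.Random(seed)
    best = list(features)
    for _ in range(trials):
        candidate = [max(0.0, f + rng.uniform(-0.1, 0.1)) for f in best]
        if spam_score(candidate) < spam_score(best):
            best = candidate
    return best

original = [0.9, 0.8, 0.1, 0.7]
evaded = game_the_filter(original)
print(spam_score(original) > spam_score(evaded))  # True: score driven down
```

The adversary never needs to understand the model; being able to query it repeatedly is enough, which is why exposing the ranking function would invite exactly this kind of gaming.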
Anyways, again, I respect the former president. [...] It's not you're going in there and turning knobs, [...] And this knob is turned towards good information. The reality I think is much more complicated.
But there's a bigger point here I wanna make. [...] This platform is sending out this bad information. [...] And from the elite perspective, most people are dumb. [...] I am more in line right now with Jonathan Haidt's latest take. [...] And I think it gives us a more nuanced understanding of the issues with social media than simply saying, [...]

'Cause if you'll remember from our discussion, [...] once these platforms shifted towards viral dynamics, [...] your team could swarm in a way that was breathtaking. And it became, he called it, a vigilante culture. [...] you could have people just piled on and destroyed. [...] where you became very wary of letting the other team, [...] So we got to, like, quickly tamp down or attack. Don't give in on anything that might get amplified. So it created this really tense, anxious type of environment. And three, that drove out almost everyone but the extremes. [...] desperate to avoid being attacked by their own team, desperate not to give any ground to the other team. It became a spectacle of the elite extremists.
And it's great entertainment for those groups. [...] our incentives were really wonky information, [...] hey, let's try to spread interesting information, [...] And we will ignore refuting evidence about something [...] into these weird, obsessive, extreme tribal warriors. And that is an environment where really wonky or bad information can spread, can take hold. [...] to really dunk on Trump by believing in flat-earthism, [...] lizard people stories are going to spread really strong [...] in which all sorts of crazy stuff is going to spread.

So again, I think the solution is we got to get away [...] that everyone needs to be on the same platform. [...] It's a spectacle for elite extremists doing combat, and the small group of people who like to watch the blood, but it's a spectacle that has a trickle-down impact. [...] but there's huge impacts about what happens in their life because of what's going on in that elite Colosseum. And I think Haidt is absolutely right about that. [...] we have to go in there and turn some content knobs. Don't promote this content, promote that content. [...] And I think the way we do that is we de-emphasize [...] maybe we'll spend less time paying attention to it.
[...] which protects social media platforms from liability. [...] So he is supporting proposals to get rid of that, [...] That makes companies more liable for what's posted. [...] Like, technically, 230 is what would give me protection if commenters on my blog said something damaging or illegal. [...] 230 would say, well, I'm not gonna be held responsible. [...] The social media companies are really leaning on 230.

I'm not opposed to the idea of getting rid of 230, [...] anything that might lead to fracturing social media is probably gonna be better for everyone involved. [...] I mean, I like a world if, like, okay, we drop 230, [...] But it means that it's no longer legally viable [...] What you get instead is more of a Reddit-type culture, [...] there's community moderation and people are responsible, [...] And there's care to how the community interacts. [...] are not talking to the people who are really into that. [...] but I like anything that might fracture social media. [...] And though most of us don't wanna pay attention to it,
You probably know Twitter better than I do, Jesse. [...] I never know Twitter. - You don't know either? [...] And yet it has a big impact on all of our lives, [...] how our employers operate, the news we receive. [...] the only people going there are, like, the elite landowners [...] who are just, like, out there trying to run their farms. [...]

So I'm sort of taking and running with his perspective; [...] that's what keeps capturing me about all of this. [...] It's like, it's not what most people in France [...] So I think there's more of that going on with social media