All right, well, let us do our first segment. I have been enjoying doing some of these news reaction segments. What I'm really looking for, of course, is not just to give my opinion on everything, who cares, but to look at pieces of the news that overlap with things we talk about here on the show and give me a chance to bounce off them and elaborate on some of the theories I've been developing and that I talk about commonly on the show.
So it's news that's relevant to the show. And unlike prior Cal Reacts to the News segments, I actually wanna do several different pieces here. So I'm gonna start with an article returning to what we've been discussing, returning to Elon Musk and his potential takeover of Twitter. There's an article I read this morning.
It came out the morning I'm recording this from the New York Times. It was titled "Elon Musk Details His Plan to Pay for a $46.5 Billion Takeover of Twitter." And this article starts by noting, "Elon Musk said on Thursday that he had commitments worth $46.5 billion to finance his proposed bid for Twitter and was exploring whether to launch a hostile takeover for the social media company." Jesse, I think he's beating out the $5,000 bid that we put in.
I think our vision for Cal Newport Twitter, they haven't said no yet, but it looks like Elon has the better bid he's putting together here. The article goes on to say, "The financial commitments, gathered a week after Mr. Musk made an unsolicited offer for Twitter, put pressure on the social media company's board to take his advances seriously." It's serious.
Steven Davidoff Solomon, a professor at the School of Law at the University of California at Berkeley, said of the new filing, "He's getting more professional and this is starting to look like a normal hostile bid. You do not do that unless you are going to launch an offer." So I think I was leaning towards this probably not being for real until I read this article.
I thought he might've just been messing with people. This article seems to imply that he is actually serious. Did you take him seriously, Jesse? You were never quite sure if he was actually gonna do this. - I thought he was serious. - Okay, so you more understood it.
Now, the breaking news that almost happened this morning is he tweeted today, the day we're recording this, what were his exact words here? It was "moving on, dot, dot, dot," right? Now, this is big news in part because I never see Twitter breaking news because I don't use Twitter, but I had to go on the Twitter to get the tweet thread I'm gonna talk about for the next story, the one a user sent me, and this tweet popped up.
So I was like, maybe that was Musk saying he was moving on from this, but then he clarified, and this was the breaking news 14 minutes before I came over here, that when he said "moving on," it wasn't about his Twitter takeover bid. It was him moving on from making fun of Bill Gates, because Bill Gates had shorted Tesla stock at a time when he was talking a lot about climate change.
So I guess Elon Musk has been dunking on Gates recently for that. Captain climate change is shorting Tesla stock. And so he was moving on from that. Musk goes on to say on Twitter, "If our Twitter bid succeeds, we will defeat the spam bots or die trying." So that's interesting because that's a new spin.
I think a lot of the coverage of Musk's plans for Twitter had to do with content moderation. We'll get into that more here in a second. And here he is emphasizing another type of improvement he would look to do in this case, get rid of spam bots. That's interesting that he is making that pivot.
The final thing is I would point out that Morgan Stanley is a big part of the money he is raising for this takeover bid. And they quote a lecturer from Cornell University saying, "There are lots of very senior people at Morgan Stanley that are responsible for that brand. That in my view would not allow this to happen unless there was some level of seriousness behind it." Okay, so all of the coverage seems to be saying this is probably serious.
It's probably not Musk messing with us. Here's the two thoughts I have about that. One, it really still could be Musk messing with everyone. I think he gets enjoyment out of it. I think he's very persuasive. I think he could persuade Morgan Stanley to come on board even if he never actually planned to make the bid.
So I'm still not sure about that. The other thing I was thinking about this morning is that there could be a silver lining to this Musk takeover that isn't really being reported, but I think it could be good. And that's the fact that the media for the most part doesn't like Elon Musk for a lot of complicated reasons.
But of course, the fact that he messes with them this way is probably one of them. So if he did take over Twitter, the media might stop using Twitter and focusing on Twitter and allowing Twitter to influence them as much. Because on principle, they're like, "I don't like Elon Musk.
"Twitter is now his. "I don't wanna use Twitter as much anymore "as a TV reporter or a newspaper reporter. "I don't wanna be a part of something that he owns." And this would be good for the Republic. I think if reporters and journalists in general spent less time on Twitter and being influenced by Twitter, it's probably better for everybody.
So there's a silver lining here. If Musk continues acting sort of erratically and the media continues to dislike him and he takes over Twitter, maybe it will actually ironically and paradoxically reduce the impact of Twitter on our culture. And I think that would be a good thing. Basically anything that hurts Twitter, I'm kind of a fan of.
All right, so then I had a second article here that elaborates on what's going on with Musk and Twitter. And maybe it's not fair to call it an article. It's a Twitter thread. So there's all sorts of recursive ironies abounding here. It's from the former CEO of Reddit, Yishan Wong, and hat tip to listener Andy, who emailed this to me.
In general, by the way, if you have tips or articles you think I would like, my longtime address for that is interesting@calnewport.com. That's where Andy sent me this Twitter thread. And I thought it was quite interesting, as it has gotten quite a lot of play. Yishan starts in this thread by saying, "I've now been asked multiple times "for my take on Elon's offer for Twitter.
"So fine, this is what I think about that." All right, so here's his main point. If Elon takes over Twitter, he's in for a world of pain. He has no idea. I'm gonna summarize. This is a long thread. But Yixing says, "There is this old culture of the internet, "roughly Web 1.0 and early Web 2.0, "that had a very strong free speech culture.
"This free speech idea arose out of a culture "of late '90s America, where the main people "who were interested in censorship "were religious conservatives. "In practical terms, this meant that they would try "to ban porn or other imagined moral degeneracy "on the internet. "Many of the older tech leaders today," and he points to Elon Musk or Marc Andreessen, "grew up with that internet.
"To them, the internet represented freedom, "a new frontier, a flowering of the human spirit, "and a great optimism that technology "could birth a new golden age of mankind." Skipping ahead here a little bit. "Reddit," which he ran, "was born in the last years "of this old internet, when free speech "meant freedom from religious conservatives "trying to take down porn "and sometimes first-person shooters.
"And so we tried to preserve that ideal, "but this is not what free speech is about today." He then goes on to argue, "The internet is not a frontier "where people can go to be free. "It's where the entire world is now, "and every culture war is being fought on it.
"It's the main battlefield for our culture wars." And he says, "It means that upholding free speech "means you're not standing up "against some religious conservatives "lobbying to remove Judy Blume books from the library. "It means you're standing up against everyone "because every side is trying to take away "the speech rights of the other side." And so he goes on to say, for example, that all of his left-wing woke friends are convinced that the social media platforms uphold the white supremacist misogynistic patriarchy, and they have plenty of screenshots and evidence.
And at the same time, all of his center-right libertarian friends are convinced that social media platforms uphold the woke BLM Marxist LGBTQ agenda, and they also have plenty of screenshots, blah, blah, blah. So his point is everyone has their own definition of free speech. Everyone wants to stop the other side, whatever the other team is, from whatever they're doing to impede their team, and to get more freedom for their own team. It's a battlefield where there are no clear sides.
And he says, Elon Musk doesn't realize this. He says, "Elon doesn't understand "what has happened to internet culture since 2014." I know he doesn't because he was late to Bitcoin. Elon's been too busy doing actual real things like making electric cars and reusable rockets. Cutting out some inappropriate language here.
So he has a pretty good excuse for not paying attention, but this is something that's hard to understand unless you've run a social network. All right, I'll call it there. That's my summary. But basically what this former Reddit CEO is saying is that Elon Musk is from an old generation where free speech was an ideal that all tech people supported, and it was pretty clear.
It was like the internet is for free speech, and we have to stop Jerry Falwell from trying to prevent first-person shooters, or Al Gore's wife from preventing first-person shooters, or whatever was going on at the time. He's like, "Today, that's no longer the case. "Free speech means different things for everybody.
"There's no solution that's gonna make everyone happy. "What this team wants is completely different "from what this team wants, "and you're gonna have to either pick sides, "and if you don't pick sides, "everyone's gonna hate you. "You're gonna be in a world of pain from all sides." So this thread has gone somewhat viral, and I think it is an interesting take.
All right, so here's what I think about that. Having thought a lot about this and written a lot about this, there's some things here I agree with. Yes, there certainly was an older internet culture. I think they're often described as the open-culture techno-optimists, who were big believers in the internet bringing openness to everyone.
This is a movement that was really tied to things like information wants to be free, open source software. They were very anti-digital rights management. They thought software and music and text and everything should just be freely available on the internet, and it was a utopian movement. It came out of California techno-optimist circles.
Kevin Kelly, who I know and respect, was one of the big thinkers of that. So that movement did exist. Their version of the internet is very different from what it is today. A lot of writers have talked about that transition. I think Jaron Lanier is probably the most eloquent. He was an open-culture techno-optimist who became decidedly not that after the internet took a turn in the early 2000s.
So I think that is definitely true. I think he's also right, and I think we've seen this clearly, that there is no obvious solution that's gonna make most people happy when it comes to things like content moderation. The left wants this moderated, the right wants that moderated.
And then there's other weird, crazy offshoots of the mainstream left and right that have all sorts of crazy thoughts about what should be moderated or not. And so there's no way to keep everyone happy. Facebook saw this. Facebook somehow got everyone, no matter where they were in the political spectrum, mad at them.
The right wing got mad that they were being censored. The left wing got mad they weren't censoring enough. And so it is a very difficult place to be in. There is no politically neutral stance where everyone will agree you're doing it well. So I think he's right about that as well.
What I think he's getting wrong, though, is this idea that Elon Musk or Marc Andreessen don't understand that. Gen X, I think, is too new a generation for that. The sort of middle-aged tech oligarch class, the Peter Thiels, Andreessen, Musk, you might have David Sacks, Reid Hoffman, the big names who made a lot of money. They were not really from that school of the original open-culture techno-optimists. That school is a little bit older. That's a little bit of an older time. That's Kevin Kelly, that's Stewart Brand, that's Jaron Lanier. That's a slightly older group. This group came up in the dot-com boom of the '90s. They're much more money-focused than the original techno-optimists were. And they know exactly what's going on. I do not think Elon Musk misunderstands what's actually happening on Twitter. I think what's going on when he talks about content moderation is much simpler. And I don't know why we don't just simplify it to this.
I think when Elon Musk says, and let me look at my notes here, when Elon Musk says, basically, that he wants free speech, I think almost certainly what he means is that he thinks content moderation should come more from a centrist position than from a position farther to the left.
I think that's all it is. And I think we see the split. We saw this split happening in Silicon Valley, where this small group of tech oligarchs, especially the ones with brand names, the people who made a lot of money, were very successful, and had accrued a lot of power out of the internet's growth, largely resisted the more recent shift in our culture towards using postmodern critical theories as the main perspective through which we understand the world. And there might be a bit of a "don't tell me how to think" reaction there: look, I'm used to being the smartest person in the room and explaining how things work, and I don't want someone from a university coming along and telling me how to think.
I don't know what it was. It could be the antagonism that grew between the media and these groups. So as more of the culture, and especially media culture, shifted to using postmodern critical theories as their main lens, their treatment of these tech bro oligarchs got pretty rough. And there's this whole tension that's not reported a lot, but there basically has been a complete break between the Silicon Valley brand name leaders and the East Coast media, where they just won't talk to them anymore.
It's like, every time we talk to you, you just dunk on us in the piece, so look, we're just not even gonna do interviews with you anymore. We'll just talk to people directly through our own podcasts, and we'll talk to people through our own websites. And there's this real tension between the two worlds.
That probably didn't help either. But I think it's as simple as that. Elon Musk is from that group of brand name tech oligarchs that says, "I don't know. "I don't wanna re-center all my perspectives "through postmodern critical theories. "I think Twitter does that too much. "I want it to do that less." So I don't know why it has to be such a complex analysis of what's really going on with free speech, and maybe Musk is from this weird prior time, and we have to understand these complex rules.
I think he just has a different location on the political spectrum, and has a lot of money. You can put those two things together. I don't know. I mean, Jesse, you probably hear this, right? Like there's a lot of bending over backwards and complex analysis for some of this stuff, but I think some of it's pretty straightforward.
Musk is like, "I wanna be more centrist on this, "and I have a lot of money, "and I'm kinda screwing with people." - Yeah, I do think he has a plan. - Yeah. - I'm not exactly sure what it is, but I do think he knows what's going on.
'Cause he uses Twitter a lot anyway. He's been using it for years, right? - Yeah, he's got 80-something million followers. - Yeah. - Yeah. - So anything he uses that much, he's obviously thinking a lot about it. - I mean, this comes back to my bigger point, which I make often about social media, which is the impossibility of trying to have universalism.
I just think this model of social media universalism, where everyone uses the same small number of platforms, doesn't make sense. That's not what the internet was architected for. Like the whole point of the internet is that you now have potential point-to-point connections between everyone in the world. Meaning that you can put together any type of communication graph topology that you want.
Small groups of people, interesting connections that you surf to find people you've never known before. But going to this broadcast topology, where we say, "No, no, no. "Everyone's gonna talk to the same server banks "at the same companies, "and everyone's gonna read the same information "being posted on the same three websites," completely gets in the way of and obviates all the advantages of the internet in the first place, and it's completely impossible.
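To make the topology point a little more concrete, here's a minimal Python sketch. The users and graphs are made up purely for illustration; nothing here is meant to model any real platform.

```python
# Plain dictionaries as adjacency lists: who hears whom.

# The internet's native shape: lots of small, separate clusters,
# each free to evolve its own norms.
niche = {
    "alice": ["bob", "carol"],   # a small hiking group
    "bob": ["alice", "carol"],
    "carol": ["alice", "bob"],
    "dave": ["erin"],            # a separate retro-gaming group
    "erin": ["dave"],
}

everyone = ["alice", "bob", "carol", "dave", "erin"]

# Platform universalism: every post fans out to every user through one hub.
universal = {user: [u for u in everyone if u != user] for user in everyone}

def reach(graph, author):
    # Who immediately sees a post by `author` in this communication graph?
    return graph.get(author, [])

print(reach(niche, "alice"))      # ['bob', 'carol']: norms can stay local
print(reach(universal, "alice"))  # everyone else: so the rules have to be global
```

Same users, radically different communication graphs, and only the second one forces a single global set of moderation rules onto everybody.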
And this is really what we're seeing here in that Twitter thread. You're never going to make a platform work where you want it to be the platform everyone uses. What an impossible task. You want everyone in the country to use the same platform with the same interface and the same content moderation rules.
Of course, that's going to explode, and it should. And it's why I'm obviously much more in favor of social media, and by social media I mean social interaction on the internet, being much more niche, much smaller scale. Do what you wanna do in your particular community. Let community standards emerge in a grassroots fashion from the small number of users that are using each of these particular networks or groups or however you want it to work.
That is where the internet really works. The people who are into X have a place to go and hang out with people that are into X. And the standards for how they talk about things might be really different than people that are into Y. And the people who are into Y don't have to know what the people that are into X are talking about.
And we don't have to have some sort of common set of rules that the people from X and Y both have to follow. And so I think the folly here, the Shakespearean tragedy underlying all this is this push towards, we need a digital town square. We need one service that everyone uses.
So look, I don't think Musk doesn't know what's going on. I don't think he's Kevin Kelly reincarnated, being too techno-optimistic. He knows what he's doing. I don't think it's gonna be super successful, because I think this whole project of having giant platforms that everyone uses makes no sense and is destroying the internet.
But I don't think it's complicated what he's doing. He likes Twitter. He doesn't like some of the politics behind how it's being implemented. He has money. So he's like, I'm gonna try to change it. All right, let's do one more quick article. So today my theme is social media, future social media, social media regulation.
It doesn't mean I'm always gonna talk about this, but just seems to be in the air these days. So this final one also comes from the Times, from a couple of days ago. The title of the article is Obama Calls for More Regulatory Oversight of Social Media Giants. This is from a talk that Obama gave last week at the Stanford Cyber Policy Center.
Okay, so the article says, former President Barack Obama on Thursday called for greater regulatory oversight of the country's social media giants, saying their power to curate the information that people consume has turbocharged political polarization and threatened the pillars of democracy. Weighing in on the debate over how to address the spread of disinformation, he said the companies needed to subject their proprietary algorithms to the same kind of regulatory oversight used to ensure the safety of cars, food, and other consumer products.
Tech companies need to be more transparent about how they operate, Mr. Obama said. Well, I mean, there's a lot of things I agree with the former president on. I think this take, however, is a little bit out of touch with the underlying technology. Social media, quote unquote, algorithms are not like food safety or car safety, where, okay, we have data from crash tests that need to be shared or whatever needs to happen here.
They're very complicated, but they're not just complicated. They are fundamentally doing something that's ineffable. So think about what's actually happening under the covers at these social media companies, how, let's say, items are selected to add to the timeline that you consume, the feed that you consume.
This is a complex collection of complex neural networks that have been trained in complex ways, usually with reinforcement mechanisms, and then connected together in some sort of dynamical way. You can't look at a complex collection of connected neural networks and ask, what does this do? It doesn't work that way. These networks have learned on their own through hundreds of millions of trials of training, backpropagation, and reinforcement. They have learned patterns, which information is more likely to get engagement from this person versus that person, that cannot be easily reduced to a human-understandable format. A lot of what's happening here is that a piece of information, let's say a particular post on Twitter, is going to exist as a multidimensional vector of data points, and any one of these neural networks in the quote unquote algorithm is effectively creating a multidimensional hyperplane that it uses to categorize where that point sits. And then the user themselves exists as their own multidimensional point, and the system can check whether the post and the user have been segmented into the same region by that hyperplane, meaning they're more likely to have an affinity. All of which is to say that this is complicated, abstract, multidimensional work happening inside these systems. It is not an algorithm like we would normally think about it. It is not a bunch of Turbo Basic code that says, if the post is about cats and the reader is over 60, show them this post. It's not that. It is vectors of numbers going through complex linear algebra convolutions, and scores coming out the other side.
And what happens in between is not understandable by people. But I think the old-fashioned view is that what's happening here is some mustache-twirling executive going, hmm, we will get more views. I'm looking at the data here. These people like hearing why vaccines are bad. So let me just type in here, show articles about vaccines being bad, because then we will get more money. And then, oh, the regulators are here, and they say, don't put that in your algorithm. So, okay, I'm gonna take that out of my algorithm. That's not how it works. Again, it's incredibly complex and abstract, and you can't break it down into what's really happening.
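To give a sense of the shape of this, here's a toy Python sketch. Every number and name in it is made up for illustration; real ranking systems stack many learned, nonlinear layers on top of this basic idea, which is exactly why their behavior can't be read off like a rulebook.

```python
import numpy as np

# Hypothetical learned embeddings. Nobody typed these in by hand; they
# would come out of training on engagement data, and no single number
# means "cats" or "vaccines" or "reader is over 60".
user_embedding = np.array([0.12, -0.87, 0.44, 0.03])
post_embeddings = {
    "post_a": np.array([0.09, -0.91, 0.52, -0.10]),
    "post_b": np.array([-0.66, 0.23, 0.18, 0.77]),
}

def engagement_score(user_vec, post_vec):
    # The core of the "algorithm": project the user and the post into the
    # same learned space and measure affinity. Here it's a dot product;
    # real systems use deep networks, but the principle is the same:
    # geometry in a learned space, not if-then rules.
    return float(np.dot(user_vec, post_vec))

ranked = sorted(
    post_embeddings,
    key=lambda name: engagement_score(user_embedding, post_embeddings[name]),
    reverse=True,
)
print(ranked)  # the order is clear; why it is the order is buried in the numbers
```

Even in this tiny sketch, the "reason" post_a beats post_b is just that its numbers line up with the user's numbers, and at scale those numbers were set by millions of training examples, not by a person writing a rule.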
Now, furthermore, even if you could, it would be disastrous for these companies if you somehow made this algorithm open to interrogation. Right, so maybe you wanna apply an explainable AI approach here and say, well, let's just interrogate the algorithms: put in different things and see which it prefers.
But if you did that, then everyone would start gaming the system. Everyone trying to get people to look at their dubious diet pill site or whatever would figure out exactly what to put in their posts so that they would dominate everything else. It'd be like showing spammers the spam filter that Gmail uses.
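Here's a toy sketch of why that goes wrong, in the same made-up style as the example above. If scoring weights were exposed in the name of transparency, a spammer could optimize content directly against them; for a simple linear scorer the optimal spam is a one-liner, and for real nonlinear models the same attack works with gradient-based optimization.

```python
import numpy as np

# Hypothetical scoring weights, "published" for transparency.
exposed_weights = np.array([0.4, -1.2, 0.9, 0.1])

def score(post_features):
    # The platform's (toy) ranking score for a post's feature vector.
    return float(np.dot(exposed_weights, post_features))

# A spammer with access to the weights doesn't need to guess what the
# system rewards. With each feature capped at 1.0, the highest-scoring
# post simply pushes every feature in whichever direction the weight rewards.
optimal_spam = np.sign(exposed_weights) * 1.0
print(score(optimal_spam))  # the maximum achievable score, by construction
```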
Everything would then slip through the filter, because they could just sit there and work with it to figure it out. So you can't really make it transparent in that way. Anyway, again, I respect the former president. I'm just saying this is out of date with what's going on with this technology. It's not so simple.
It's not like you're going in there and turning knobs: this knob is turned towards bad information, let's just turn that down, and this knob is turned towards good information, let's turn it up. The reality, I think, is much more complicated. But there's a bigger point here I wanna make, which is that in a lot of discourse, especially discourse coming out of more elite circles about social media, there's this real focus on the idea that the problem is the wrong information being amplified.
That this is the problem. It's all about content amplification. This is bad information. This platform is sending out this bad information to a lot of people. And from the elite perspective, most people are dumb, so they get tricked and then they believe this bad information. So let's just stop the platforms from spreading the bad information.
I am more in line right now with Jonathan Haidt's latest take that we talked about last week on the show. His take from his big Atlantic article, which I increasingly think is right. And I think it gives us a more nuanced understanding of the issues with social media than simply saying it pushes the bad information more than the quote unquote good information.
'Cause if you'll remember from our discussion of Haidt's Atlantic article, what he was arguing is that the problem is not what social media does to information, it's what social media does to the people. What it does to the people who are interacting on social media. And his whole point was, once these platforms shifted towards viral dynamics, where something could get a huge amount of attention right away, things could blow up really quickly.
He said this really changed the way that people use social media. Three things happened. One, there could be immediate consequences. If you sort of say the wrong thing, your own team could swarm you in a way that was breathtaking. He called it a vigilante culture.
Where out of nowhere, people could get piled on and destroyed. Two, it created a culture where you became very wary of letting the other team, depending on how you define the other team, gain any ground. Can't let the other team gain any ground. So we've got to quickly tamp down or attack.
Don't give in on anything that might get amplified. So it created this really tense, anxious type of environment. And three, that drove out almost everyone but the extremes. So as Haidt documents, you're left with the extremes on the left and the extremes on the right, basically fighting back and forth, desperate to avoid being attacked by their own team, desperate not to give any ground to the other team.
It became a spectacle of the elite extremists. That is what is happening on a platform like Twitter right now. And it's great entertainment for those groups and a larger group of adjacent people that quietly like to watch it, but it's a terrible environment. The incentives there are incentives where really wonky, really weird, bad information can spread and take hold, because the point is not, hey, let's try to spread interesting information; the point is, we're going to win, and I don't want to get hanged by my own team.
So we'll grasp onto something crazy if that gives us a little advantage. And we will ignore refuting evidence about something, with diligent blinders on, if acknowledging it might lead to an attack or might give the other team room. So Haidt's argument is that the problem is not the social media algorithms and how they amplify or choose what information to amplify.
It's what do they do to the people? And the viral dynamics turn people into these weird, obsessive, extreme tribal warriors. And that is an environment where really wonky or bad information can spread, can take hold, can be really difficult to dislodge. I mean, I think if there was somehow a way to really dunk on Trump by believing in flat-earthism, you would see a lot of flat-earthism.
The other way around: if there's some way to really, really get at Biden by believing in lizard people, then lizard people stories are going to spread really strongly and people are really going to grasp onto them. It's not the information that matters here. It's the human dynamic. Social media warps the people into a mode in which all sorts of crazy stuff is going to spread.
So again, I think the solution is we've got to get away from platform universalism. We've got to get away from this idea that everyone needs to be on the same platform, that we need these quote unquote digital town halls, which aren't digital town halls at all.
It's the digital Roman Colosseum. It's a spectacle for elite extremists doing combat and the small group of people who like to watch the blood, but it's a spectacle that has a trickle-down impact on everyone. Most people don't use Twitter. The vast majority of people never post anything on Twitter, but there are huge impacts on what happens in their lives because of what's going on in that elite Colosseum.
And I think Haidt is absolutely right about that. And the problem is not just, again, we have to go in there and turn some content knobs. Don't promote this content, promote that content. We're way past that. We got to stop the impact it has on people. And I think the way we do that is we de-emphasize the importance of these platforms.
Once we recognize it's an elite spectacle, maybe we'll spend less time paying attention to it. We'll reduce its impact. So Obama goes on to say one of his proposals was to look at section 230 of the Communications Decency Act, which protects social media platforms from liability for content that their users post.
So he is supporting proposals to get rid of that, which would make companies more liable for what's posted. And again, I think that's interesting. 230 is complicated. Like, technically 230 is what would give me protection if commenters on my blog said something damaging or illegal. 230 would say, well, I'm not gonna be held responsible for what other people posted on my platform.
The social media companies are really leaning on 230. I'm not opposed to the idea of getting rid of 230 to some degree, because I think, again, anything that might lead to fracturing social media is probably gonna be better for everyone involved. I mean, I'd like a world where, okay, we drop 230 and maybe I have to turn comment threads off because I don't want to be liable, sure.
But it means that it's no longer legally viable to be a massive universal platform. What you get instead is more of a Reddit-type culture of smaller communities, where there's community moderation and people are responsible for what's posted there. And there's care for how the community interacts, and the people who are really into this are not talking to the people who are really into that.
That's probably a better world. So probably as a legal principle, there are problems with just saying drop 230, but I like anything that might fracture social media away from the form it's in now, where we have the spectacle of the elites. And though most of us don't wanna pay attention to it, it ends up really affecting our lives.
So I don't know. You probably know Twitter better than I do, Jesse. - I don't think so. I don't really know Twitter. - You don't know it either? Yeah, see, that's the thing. Most people don't. And yet it has a big impact on all of our lives. Like it could have a big impact on how our employers operate, the news we receive, the legislation taken up or not taken up, what politicians pay attention to.
It's like the Roman Colosseum: the only people going there are the elite landowners, because they really like it. But what's happening in the Colosseum is completely affecting what happens to the rest of the Roman citizens who are just out there trying to run their farms. - It's a great analogy.
- Yeah, and I think Haidt actually mentioned the Roman Colosseum in his Atlantic article. So I'm sort of taking his perspective and running with it, but it's elite capture. And I think that's what all this stuff is; that's what keeps capturing me about all of this. We're in 1750 France, and there are huge arguments going on about the Hall of Mirrors at Versailles.
It's like, it's not what most people in France cared about right then. So I think there's more of that going on with social media than other people are willing to let on. So there we go. That's what I think about that. (upbeat music)