Jonathan Haidt: The Case Against Social Media | Lex Fridman Podcast #291
Chapters
0:00 Introduction
1:25 Social media and mental health
15:14 Mark Zuckerberg
24:51 Children's use of social media
35:36 Social media and democracy
51:42 Elon Musk and Twitter
68:13 Anonymity on social media
74:12 Misinformation
81:06 Social media benefits
83:50 Political division on social media
90:22 Future of social media
96:14 Advice for young people
The following is a conversation with Jonathan Haidt, 00:00:04.720 |
and critic of the negative effects of social media 00:00:10.480 |
He gives a respectful but hard hitting response 00:00:24.960 |
He has brilliantly discussed these topics in his writing, 00:00:28.760 |
including in his book, "The Coddling of the American Mind" 00:00:32.240 |
and in his recent long article in "The Atlantic" 00:00:35.560 |
titled, "Why the Past 10 Years of American Life 00:00:41.800 |
When Teddy Roosevelt said in his famous speech 00:00:55.320 |
but both his criticism and our disagreement is essential 00:01:04.640 |
Social media has both the power to destroy our society 00:01:10.560 |
It's up to us to figure out how we take the lighter path. 00:01:18.080 |
please check out our sponsors in the description. 00:01:20.760 |
And now, dear friends, here's Jonathan Haidt. 00:01:25.360 |
- So you have been thinking about the human mind 00:01:31.000 |
"The Righteous Mind," "The Coddling of the American Mind," 00:01:33.640 |
and today you're thinking, you're writing a lot 00:01:42.120 |
let's go through the thread that connects all of that work. 00:01:46.120 |
How do we get from the very beginning to today 00:01:49.280 |
with the good, the bad, and the ugly of social media? 00:01:55.120 |
which means I study how we think about other people 00:02:01.760 |
And in graduate school at the University of Pennsylvania, 00:02:07.220 |
and I studied how morality varied across countries. 00:02:17.840 |
And in that decade was really when the American culture war 00:02:23.740 |
And I began to notice that left and right in this country 00:02:28.540 |
And you could use the tools of cultural psychology 00:02:36.100 |
And I began growing alarmed in the early 2000s 00:02:42.340 |
And I began studying the causes of polarization, 00:02:46.520 |
bringing moral psychology to bear on our political problems. 00:02:52.940 |
to basically help the Democrats stop screwing up, 00:02:55.540 |
because I could see that some of my research showed 00:02:58.460 |
people on the right understand people on the left, 00:03:14.160 |
So originally I wanted to actually help the Democrats 00:03:21.840 |
And I got a contract to write "The Righteous Mind." 00:03:49.420 |
on helping left and right understand each other 00:03:51.660 |
and helping our democratic institutions to work better. 00:03:54.820 |
Okay, so all this is before I had any interest 00:04:25.780 |
And they're saying it's dangerous, it's violence. 00:04:34.300 |
that led us to write "The Coddling of the American Mind," 00:04:37.020 |
which wasn't primarily about social media either. 00:04:42.820 |
But after that, things got so much worse everywhere. 00:05:02.180 |
whether it's fundamentally the nature of social media 00:05:08.780 |
that lead the social media companies that's the problem. 00:05:12.960 |
that's highlighted in the "Coddling of the American Mind" 00:05:22.400 |
to sort of emphasize the anti-fragility of the human mind 00:05:27.400 |
as it interacts with the social media platforms 00:05:34.220 |
- That should take us an hour or two to cover. 00:05:40.120 |
You said you wanted to challenge some of the things 00:05:49.500 |
- Okay, there are two major areas that I study. 00:05:51.780 |
One is what is happening with teen mental health? 00:05:54.300 |
It fell off a cliff in 2013, it was very sudden. 00:06:00.140 |
to our democratic and epistemic institutions? 00:06:14.300 |
and what's the evidence that social media is a contributor? 00:06:18.760 |
what's happening to democracies, not just America, 00:06:26.800 |
'cause that's what the Atlantic article is about. 00:06:43.560 |
There's self-reports of how depressed, anxious, lonely. 00:06:47.040 |
There's data on hospital admissions for self-harm. 00:06:50.740 |
And all of these things, they bounce around somewhat, 00:06:53.720 |
but they're relatively level in the early 2000s. 00:06:56.840 |
And then all of a sudden, around 2010 to 2013, 00:07:00.520 |
depending on which statistic you're looking at, 00:07:02.680 |
all of a sudden, they begin to shoot upwards. 00:07:08.040 |
but on the whole, it's like up for both sexes. 00:07:26.800 |
Suicide for preteen girls, thankfully, it's not very common, 00:07:42.880 |
So as I've been studying it, I found, first of all, 00:07:55.000 |
but yet it's not as clear in the Germanic countries. 00:07:58.240 |
In continental Europe, it's a little different, 00:08:00.120 |
and we can get into that when we talk about childhood. 00:08:20.760 |
That wouldn't have hit preteen girls the hardest. 00:08:25.080 |
The complexity here in the data is, of course, 00:08:27.360 |
as everyone knows, correlation doesn't prove causation. 00:08:30.520 |
The fact that television viewing was going up 00:08:37.080 |
So what I've done, and this is work with Jean Twenge, 00:08:40.480 |
who wrote the book "iGen," is because I was challenged, 00:08:48.960 |
"Oh, you don't know what you're talking about. 00:08:52.960 |
"and mental health, they exist, but they're tiny. 00:09:10.080 |
which exists, but it's so tiny it's zero, essentially. 00:09:14.480 |
And that claim, that social media's no more harmful 00:09:19.400 |
it was a very catchy claim, and it's caught on, 00:09:31.160 |
is you implied, but just to make it explicit, 00:09:36.880 |
as you're proposing, is that a very particular aspect 00:09:46.440 |
a certain mechanism of virality that was invented, 00:09:51.120 |
or perhaps some aspect of social media is the cause. 00:09:56.960 |
I mean, overall, the more you connect people, the better. 00:10:00.080 |
Giving people the telephone was an amazing step forward. 00:10:02.940 |
Giving them free telephone, free long distance, 00:10:06.400 |
Video is, I mean, so connecting people is good. 00:10:10.120 |
And social media, at least the idea of users posting things, 00:10:14.120 |
like that happens on LinkedIn, and it's great. 00:10:17.900 |
What I'm talking about here is not the internet. 00:10:28.200 |
in which people are incentivized to create content. 00:10:31.760 |
And that content is what brings other people on. 00:10:41.880 |
which seems to be incredibly harmful for teenagers, 00:10:45.480 |
especially for young girls, 10 to 14 years old 00:10:53.320 |
because it leads to all kinds of anger, conflict, 00:11:08.160 |
since we're doing the nuance now in this section, 00:11:14.840 |
I use Twitter because it's an amazing way to put out news, 00:11:37.200 |
what are the possible positive effects of social media 00:11:40.840 |
So for example, in the way I've been using Twitter, 00:11:45.840 |
not the promotion or any of that kind of stuff, 00:11:48.280 |
it makes me feel less lonely to connect with people, 00:11:53.280 |
to make me smile, a little bit of humor here and there. 00:11:56.800 |
And that at scale is a very interesting effect, 00:12:05.000 |
So we kind of have to consider that and be honest. 00:12:10.960 |
We have to be honest about the positive and the negative. 00:12:13.560 |
And sometimes we're not sufficiently positive 00:12:18.880 |
we're not rigorous in a scientific way about the negative. 00:12:25.280 |
And so that brings us to the Mark Zuckerberg email. 00:12:31.880 |
let me just pick up on the issue of trade-offs 00:12:38.280 |
No, that's a one-dimensional conceptualization. 00:12:47.480 |
Like we would have been sitting there alone in our homes. 00:12:49.840 |
Yeah, if all we had was texting, telephone, Zoom, Skype, 00:12:57.260 |
all sorts of ways of communicating with each other. 00:13:00.120 |
Oh, and there's blogs and the rest of the internet. 00:13:07.660 |
Now those did help certain things get out faster. 00:13:12.040 |
but it also led to huge explosions of misinformation 00:13:14.540 |
and the polarization of our politics to such an extent 00:13:19.600 |
didn't believe what the medical establishment was saying. 00:13:25.300 |
was playing political games that made them less credible. 00:13:30.080 |
if you've got the internet, smartphones, blogs, 00:13:36.360 |
that adding in this particular business model 00:13:43.280 |
- And one interesting one we'll also talk about is YouTube. 00:13:46.840 |
I think it's easier to talk about Twitter and Facebook. 00:13:50.240 |
YouTube is another complex beast that's very hard to, 00:14:09.600 |
called "Social Media and Political Dysfunction 00:14:14.160 |
That includes, I believe, papers on YouTube as well. 00:14:23.440 |
because I can't imagine life without YouTube. 00:14:34.840 |
the way I think about the internet in general, 00:14:37.020 |
and I don't know enough to really comment on YouTube. 00:14:39.200 |
So I have been focused, and it's also interesting. 00:14:42.000 |
One thing we know is teen social life changed radically 00:14:47.760 |
Before 2010, they weren't mostly on every day 00:15:04.680 |
Those don't seem to be as harmful to mental health 00:15:11.560 |
that seem to really have done in girls' mental health. 00:15:18.600 |
So at 64 minutes and 31 seconds on the video, 00:15:26.180 |
- This is the very helpful YouTube transcript. 00:15:42.940 |
"Actually, on most measures, the kids are doing better 00:15:48.500 |
And then he says, "I think an accurate characterization 00:15:56.760 |
"is generally positive for their mental health." 00:16:16.340 |
to another document that we'll make available. 00:16:25.140 |
Should we force the platforms to actually tell us 00:16:28.620 |
Like, we have no idea, other than self-report. 00:16:32.260 |
They're the only ones who know, like, the kid does this, 00:16:34.240 |
and then over the next hours, the kid's depressed or happy. 00:16:39.860 |
So should they be compelled to reveal the data? 00:16:43.700 |
- So you raised just, to give people a little bit of context, 00:16:47.940 |
and this document is brilliantly structured with questions, 00:16:51.740 |
studies that indicate that the answer to a question is yes, 00:16:54.940 |
indicate that the answer to a question is no, 00:17:03.700 |
- Right, wait, so that's the one that we're gonna get to. 00:17:08.900 |
- So I've got three different Google Docs here, 00:17:13.220 |
It's an amazing way to organize the research literature, 00:17:31.980 |
is that our gut feelings drive our reasoning. 00:17:34.500 |
That was my dissertation, that was my early research. 00:17:37.180 |
And so if Jean Twenge and I are committed to, 00:17:39.620 |
but we're gonna obviously preferentially believe 00:17:51.300 |
is other people who have a different confirmation bias. 00:17:53.780 |
So these documents evolve because critics then say, 00:18:02.340 |
and I'm gonna put links to everything on my website. 00:18:05.340 |
If listeners, viewers go to jonathanhaidt.com/socialmedia. 00:18:13.700 |
I'll put everything together in one place there, 00:18:19.180 |
and other things like it that we're talking about. 00:18:22.380 |
So yeah, so the thing I wanna call attention to now 00:18:29.340 |
and Social Media is a Major Contributing Cause. 00:18:32.220 |
So Ben Sasse and Chris Coons are on the Judiciary Committee. 00:18:40.580 |
So they asked me to testify on what do we know, 00:18:44.340 |
And so what I did was I put together everything I know 00:18:49.940 |
That first, what do we know about the crisis? 00:18:52.040 |
Well, that the crisis is specific to mood disorders, 00:18:56.420 |
It's not just self-report, it's also behavioral data, 00:18:59.300 |
because suicide and self-harm go skyrocketing after 2010. 00:19:05.460 |
and the crisis is gendered, and it's hit many countries. 00:19:17.900 |
So self-report, just how you kind of collect data 00:19:22.820 |
- You have a self-report, a survey, you ask people-- 00:19:30.180 |
- How many hours a week do you use social media? 00:19:35.680 |
you can collect large amounts of data that way, 00:19:41.020 |
And but then there's, I forget the term you use, 00:19:48.260 |
Where you actually have self-harm and suicide numbers. 00:19:55.740 |
So this is from the National Survey on Drug Use and Health. 00:19:58.300 |
So the federal government, and also Pew and Gallup, 00:20:03.360 |
that have been collecting survey data for decades. 00:20:07.020 |
And what you see on these graphs over and over again 00:20:09.300 |
is relatively straight lines up until around 2010 or 2012. 00:20:36.300 |
And literally the day before my book with Greg was published, 00:20:39.420 |
the day before, there was a psychiatrist in "New York Times" 00:20:44.520 |
"Smartphones are not ruining your kid's brain." 00:20:56.720 |
but all we have to do is look at the hospitalization data 00:20:59.380 |
for self-harm and suicide, and we see the exact same trends. 00:21:08.660 |
You have an elbow, and then it goes up, up, up. 00:21:15.460 |
So we have a catastrophe, and this was all true before COVID. 00:21:18.860 |
COVID made things worse, but we have to realize, 00:21:22.920 |
Kids are back in school, but we're not gonna go back 00:21:25.900 |
to where we were because this problem is not caused by COVID. 00:21:31.620 |
Well, just again, to just go through the point, 00:21:34.740 |
I just feel like I just really wanna get out the data 00:21:38.140 |
to show that Mark is wrong. - This is amazing. 00:21:55.340 |
is actually much larger than for eating potatoes. 00:21:58.020 |
So that famous line wasn't about social media use. 00:22:07.540 |
And so what they did is they looked at all screen use, 00:22:20.300 |
but the media has reported it as social media is 0.03, 00:22:29.940 |
there's more than 100 studies in the Google Doc. 00:22:36.400 |
What happens if we zoom in on just social media? 00:22:42.020 |
What happens if we zoom in on girls and social media? 00:22:52.060 |
Amy Orben says, I think I have a quote from here, 00:22:59.020 |
"The associations between social media use and wellbeing 00:23:02.060 |
therefore range from about R equals 0.15 to R equals 0.10." 00:23:11.900 |
And a lot of research, including hers and mine, 00:23:15.700 |
So for girls, we're talking about correlations 00:23:19.500 |
Jean Twenge and I found it's about 0.2 or 0.22. 00:23:22.600 |
Now this might sound like an arcane social science debate, 00:23:26.240 |
public health correlations are almost never above 0.2. 00:23:29.180 |
So the correlation of childhood exposure to lead 00:23:31.900 |
and adult IQ, very serious problem, that's 0.09. 00:23:35.220 |
Like the world's messy and our measurements are messy. 00:23:37.900 |
And so if you find a consistent correlation of 0.15, 00:23:41.180 |
like you would never let your kid do that thing. 00:23:51.980 |
you actually can explain the mental health epidemic 00:24:00.780 |
because I quit potatoes and it had no effect on me. 00:24:32.420 |
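To make these effect sizes concrete, here is a minimal illustrative sketch in Python. The numbers are simulated, not the survey datasets discussed above; it only shows how an assumed correlation around r = 0.17 between social media hours and a depression score plays out when you compare the heaviest and lightest quartiles of users at population scale.

```python
# Illustrative only: simulated data, not the actual survey datasets.
# Shows how a "small" correlation (r around 0.15-0.2) between social media
# hours and a depression score still separates heavy and light users at scale.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                               # hypothetical population of teen girls
r = 0.17                                  # assumed correlation from the discussion

hours = rng.normal(0.0, 1.0, n)           # standardized daily social media use
noise = rng.normal(0.0, 1.0, n)
depression = r * hours + np.sqrt(1 - r**2) * noise   # standardized symptom score

cutoff = 1.0                              # hypothetical "high symptoms" threshold
light = depression[hours < np.percentile(hours, 25)]  # lightest quarter of users
heavy = depression[hours > np.percentile(hours, 75)]  # heaviest quarter of users

print(f"high-symptom rate, light users: {100 * (light > cutoff).mean():.1f}%")
print(f"high-symptom rate, heavy users: {100 * (heavy > cutoff).mean():.1f}%")
# With r = 0.17 the heavy-use quartile ends up with roughly twice the rate of
# the light-use quartile: tiny in R-squared terms, large at population scale.
```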
So if we just linger on the Mark Zuckerberg statements. 00:24:43.200 |
So if you put yourself in the shoes of Mark Zuckerberg 00:25:08.660 |
because ultimately, and this is something you argue, 00:25:18.980 |
all you have is maybe some leaks of internal data. 00:25:27.200 |
about the potential bias in those kinds of leaks. 00:25:29.700 |
You want to, it would be nice to have a non-leak data. 00:25:45.940 |
And when people commit suicide, what was happening before. 00:25:51.060 |
- So how do we begin to fix social media, would you say? 00:25:55.620 |
- Okay, so here's the most important thing to understand. 00:26:04.780 |
You have to have much more specific questions. 00:26:13.580 |
is done on what's called the dose response model. 00:26:16.340 |
That is, everybody, including most of the researchers, 00:26:28.660 |
And if kids have a lot of sugar, then it's bad. 00:26:40.320 |
So we evolved as a species in which kids play 00:26:50.780 |
That's how you become a mature adult, until the 1990s. 00:26:58.540 |
So we completely, we began rewiring childhood 00:27:41.020 |
Millennials didn't get it until they were in college. 00:27:48.180 |
So they really began to get on around 2009, 2010, 00:27:51.820 |
and boom, two years later, they're depressed. 00:27:53.740 |
It's not because they ate too much sugar necessarily. 00:28:18.160 |
and worrying and imagine going through puberty that way 00:28:21.840 |
versus imagine there was a policy, no phones in school. 00:28:38.220 |
Keep kids off until they're done with puberty. 00:28:42.220 |
and Andy Przybylski showing that the damage is greatest 00:28:42.220 |
So there is no way to make it safe for preteens 00:28:51.640 |
We've gotta, kids should simply not be allowed 00:28:54.220 |
on these business models where you're the product. 00:29:01.760 |
- So I think that's a really powerful solution, 00:29:03.660 |
but it makes me wonder if there's other solutions 00:29:15.580 |
Sort of if there's a way that's more productive 00:29:21.500 |
So of course one thing is putting your phone down, 00:29:25.740 |
of social media companies, it might be difficult 00:29:34.580 |
without social media, social media is a source of joy. 00:29:39.500 |
So I wonder if it's possible to design the mechanisms 00:29:47.260 |
but actually just technically the recommender system 00:30:00.700 |
but does not lead to depression, self-harm, and suicide. 00:30:08.500 |
as the objective function, not engagement or-- 00:30:12.940 |
- Yeah, I don't think that can be done for kids. 00:30:15.980 |
So I am very reluctant to tell adults what to do. 00:30:20.420 |
and I would lose their friendship if I started saying, 00:30:22.460 |
oh, it's bad for adults and we should stop adults 00:30:31.180 |
or without me having any ability to control it. 00:30:33.580 |
As a parent, it's very hard to stop your kid. 00:30:35.860 |
I have stopped my kids from getting on Instagram 00:30:44.220 |
They see that what the kids are really on it, 00:30:46.500 |
what they post, they see that the culture of it 00:30:50.220 |
So I don't think there's a way to make it healthy for kids. 00:30:54.240 |
I think there's one thing which is healthy for kids, 00:30:57.160 |
We already robbed them of most of it in the '90s. 00:31:04.220 |
I'm not saying that these things are all bad, 00:31:29.420 |
because I think the story's actually quite clear now. 00:31:38.180 |
and the effect sizes are coming in around 0.1 to 0.15, 00:31:41.240 |
whether you call that a correlation coefficient or a beta. 00:31:48.700 |
We collect true experiments with random assignment 00:32:01.860 |
is good for your mental health or bad for it? 00:32:13.980 |
What do you think is causing depression and anxiety? 00:32:16.900 |
And the number one thing they say is social media. 00:32:20.980 |
- Do you think the recommendation is, as a parent, 00:32:25.460 |
that teens should not use Instagram, Twitter? 00:32:38.020 |
I mean, it might be very difficult to make it safe. 00:32:42.340 |
while we don't know how to make it safe, put down the phone. 00:32:56.220 |
That's one of five or seven different avenues of harm. 00:32:59.400 |
The main one I think, which does in the girls, 00:33:03.440 |
It's living a life where you're thinking all the time 00:33:11.100 |
so it's bad enough that they're scrolling through, 00:33:19.080 |
are the ones the algorithm picked that were the night. 00:33:21.000 |
Anyway, so the scrolling, I think, is bad for the girls. 00:33:24.300 |
But I'm beginning to see, I can't prove this, 00:33:26.020 |
but I'm beginning to see from talking to girls, 00:33:27.580 |
from seeing how it's used, is once you start posting, 00:33:32.260 |
And now you're, basically, you're no longer present. 00:33:40.740 |
And when you're in class, you're thinking about 00:33:43.780 |
how are people responding to the post that I made 00:33:51.080 |
but now I've got this big article, I'm tweeting about it, 00:34:01.740 |
Imagine being a 12-year-old girl going through puberty, 00:34:10.900 |
who are putting up sexy photos of themselves. 00:34:12.780 |
Like, and this is so sad, so sad, don't be doing this. 00:34:16.100 |
- Yeah, see, the thing where I disagree a little bit is, 00:34:23.500 |
I feel it's the responsibility of social media, 00:34:31.100 |
and for the company to maximize the long-term happiness 00:34:43.660 |
- That's the way it is currently, that driven. 00:34:45.540 |
- If we can get a business model, as you're saying, 00:34:48.440 |
- And I think that's the way to make much more money. 00:35:01.960 |
I mean, a lot of people say it's about the source of money, 00:35:19.760 |
- It should, in theory, in morality land, it should. 00:35:26.840 |
But that's maybe a discussion for another day. 00:35:29.000 |
We're studying the reality of the way things currently are, 00:35:33.480 |
and they are as they are, as the studies are highlighting. 00:35:36.920 |
So let us go then in from the land of mental health 00:35:49.440 |
is there a connection, is there a correlation 00:36:08.640 |
this is an oversimplification, but there's some truth to it. 00:36:10.720 |
There's what's called the behavioral activation system, 00:36:22.040 |
And of course, students, I'm a college professor, 00:36:24.840 |
and most of us think about our college days like, 00:36:27.120 |
you know, yeah, we were anxious at times, but it was fun. 00:36:29.760 |
And it was like, I can take all these courses, 00:36:32.120 |
I can do all these clubs, I know all these people. 00:36:39.480 |
with their front right cortex hyperactivated, 00:36:41.520 |
everything's a threat, everything is dangerous, 00:36:48.320 |
what's called defend mode as opposed to discover mode. 00:36:53.000 |
Imagine a large, diverse, secular, liberal democracy 00:36:57.520 |
in which people are most of the time in discover mode. 00:37:03.160 |
And this is what de Tocqueville said about Americans, 00:37:12.600 |
But here, like, you know, let's roll up our sleeves, 00:37:26.120 |
whatever anyone says, you're not looking like, 00:37:29.040 |
You're thinking, you know, how is this dangerous? 00:37:33.540 |
How can I, you know, so if you imagine, you know, 00:37:38.200 |
okay, let's push everyone over into, you know, 00:37:41.520 |
And it's like joy breaks out, age of Aquarius. 00:37:49.560 |
than to have them spend some time on partisan 00:37:55.840 |
including videos about how horrible the other side is. 00:38:02.320 |
then we lose our country or then it's catastrophe. 00:38:06.100 |
So Twitter, and again, we're not saying all of Twitter, 00:38:11.320 |
and people that are mostly not talking about politics, 00:38:14.260 |
but the ones that are on talking about politics 00:38:20.020 |
All the mainstream media is hugely influenced by Twitter. 00:38:26.320 |
if there's more sort of anxiety, sense of threat, 00:38:32.760 |
the great thing about a democracy and especially, 00:38:36.440 |
you know, a legislature that has some diversity in it, 00:38:39.260 |
is that the art of politics is that you can grow the pie 00:38:45.220 |
You find ways that we can all get 60% of what we want. 00:38:48.920 |
And that ends when everyone's anxious and angry. 00:38:52.260 |
- So let's try to start to figure out who's to blame here. 00:39:04.900 |
that they're making in the detailed engineering designs 00:39:09.100 |
Is it the users of social media that drive narratives 00:39:30.340 |
can't look away from an accident when you're driving by? 00:39:38.020 |
can we start to figure out what the solution would be 00:40:07.320 |
The first time my buddies and I played paintball, 00:40:12.220 |
And we were divided into teams with strangers 00:40:14.140 |
to shoot guns at each other and kill each other. 00:40:21.780 |
Like it really felt like we'd opened a room in our hearts 00:40:25.420 |
But as men, testosterone changes our brains and our bodies 00:40:29.340 |
and activates the war stuff, like we've got more stuff. 00:40:31.820 |
And that's why boys like certain team sports, it's play war. 00:40:38.820 |
It doesn't mean we're always wanting to fight. 00:40:40.180 |
We're also really good at making peace and cooperation 00:40:48.460 |
you want a society that has room for conflict, 00:40:53.900 |
That's totally, it's not just harmless, it's actually good. 00:41:02.820 |
And if you're gonna have a diverse democracy, 00:41:12.620 |
that talks about the difficulty of diversity and democracy 00:41:22.940 |
Now let's imagine that the technological environment 00:41:28.820 |
Let's give everyone telephones and the postal service. 00:41:32.900 |
we can do all these things together with people far away. 00:41:35.900 |
Now, instead of that, let's give them a technology 00:41:40.100 |
So early Facebook and Twitter were generally lovely places. 00:41:44.300 |
People old enough to remember, like, they were fun. 00:41:48.260 |
You didn't feel like you were gonna get your head blown off 00:41:57.020 |
And like almost all the platforms start off as nice places. 00:42:00.340 |
But, and this is the key thing in the article, 00:42:03.540 |
in the Atlantic article on Babel, on after Babel. 00:42:15.020 |
After Babel, Adapting to a World We Can No Longer Share, 00:42:22.740 |
and it was why the past 10 years have been unique. 00:42:29.220 |
So first of all, what is it, what's the Tower of Babel? 00:42:33.140 |
- Okay, so the Tower of Babel is a story early in Genesis 00:42:36.460 |
where the descendants of Noah are spreading out 00:42:42.420 |
and they say, "Let us build us a city with a tower 00:42:45.340 |
to make a name for ourselves, lest we be scattered again." 00:42:48.300 |
And so it's a very short story, there's not a lot in it, 00:42:54.060 |
let's build a city and a tower to reach the heavens. 00:42:57.660 |
And God is offended by the hubris of these people 00:43:05.700 |
he says, "Let us go down and confuse their language 00:43:10.060 |
so that they may not understand one another." 00:43:13.100 |
So in the story, he doesn't literally knock the tower over, 00:43:15.920 |
but many of us have seen images or movie dramatizations 00:43:20.420 |
where a great wind comes and the tower is knocked over 00:43:23.500 |
and the people are left wandering amid the rubble, 00:43:28.540 |
So I've been grappling, I've been trying to say, 00:43:33.700 |
Beginning in 2014, what the hell is happening 00:43:38.140 |
it hit journalism, the arts, and now it's all over companies. 00:44:00.180 |
"Let half of the people hate the other half." 00:44:02.860 |
It's God said, "Let us confuse their language 00:44:05.140 |
"that they, none of them can understand each other 00:44:14.180 |
So Meta or Facebook wrote a rebuttal to my article. 00:44:19.180 |
They disputed what I said, and one of their arguments was, 00:44:23.380 |
oh, but polarization goes back way before social media 00:44:31.100 |
And I did say that, but I should have said it more clearly 00:44:43.740 |
Cable TV, Fox News, whatever you want to point to 00:44:55.980 |
I'm a liberal professor and my liberal students frightened me. 00:45:02.220 |
the first draft of our coddling article. 00:45:07.440 |
students are not as afraid of their professors, 00:45:12.740 |
It's not like the whole generation has lost their minds. 00:45:17.460 |
a small number are adept at using social media 00:45:31.860 |
It's usually about a word, like he used this word, 00:45:38.760 |
So they don't even engage with ideas and arguments. 00:45:50.140 |
there's something about social media in those years 00:46:10.340 |
You get a big change in the dynamics of groups. 00:46:21.180 |
of what makes groups smart and what makes them stupid. 00:46:37.800 |
as like the one I'm having with these other researchers 00:46:40.840 |
over social media, I write something, they write something, 00:46:53.360 |
It's what makes them generators of knowledge. 00:47:02.820 |
if any academic comes up with research that says otherwise, 00:47:16.420 |
who in the days after George Floyd was killed 00:47:19.420 |
and there were protests, and the question is, 00:47:24.860 |
Now, most of them were peaceful, but some were violent. 00:47:27.400 |
And he tweeted a study, he just simply tweeted a study 00:47:32.400 |
I think sociologist at Princeton, Omar Wasow. 00:47:35.300 |
And Wasow's study showed that when you look back 00:47:38.300 |
at the '60s, you see that where there were violent protests, 00:47:40.500 |
it tended to backfire, while peaceful protests tended to work. 00:48:04.260 |
in that situation, because it happens time and time again, 00:48:08.860 |
but also in the coddling of the American mind, 00:48:11.160 |
is the blame on the mob, the mechanisms that enable the mob 00:48:11.160 |
that is the place for courage and for ideals, right? 00:48:43.820 |
So there's going to be new mechanisms for mobs 00:48:57.380 |
which means I look at the social forces that work on people. 00:49:05.420 |
and it's a way that we find surprising and shameful, 00:49:08.620 |
I'm not going to say, wow, 95% of the people are shameful. 00:49:11.220 |
I'm going to say, wow, what a powerful situation. 00:49:17.900 |
because there are hardly any in the first few years, 00:49:24.140 |
There are hardly any leaders who stood up against it. 00:49:26.700 |
But I've talked to a lot, and it's always the same thing. 00:49:35.100 |
generally they're progressive or on the left. 00:49:37.660 |
They're accused of things by their young employees. 00:49:40.340 |
They don't have the vocabulary to stand up to it. 00:49:47.300 |
and there's only a few examples of university presidents 00:49:49.580 |
who said like, no, we're not going to stop this talk 00:49:53.620 |
No, you know, we're not going to fire this professor 00:49:55.540 |
because he wrote a paper that you don't like. 00:50:01.100 |
I have to include that the situational forces are so strong. 00:50:06.220 |
we are seeing a reversal in the last few weeks or months. 00:50:10.140 |
A clear sign of that is that the New York Times 00:50:18.660 |
Now that's amazing that the Times had the guts 00:50:21.380 |
to stand up for free speech because, you know, 00:50:33.940 |
and the New York Times literally will change papers. 00:50:36.780 |
I have an essay in Politico with Nadine Strossen, 00:50:36.780 |
retracted and changed an editorial by Bret Stephens. 00:50:44.740 |
And they did it in a sneaky way and they lied about it. 00:50:50.460 |
And they did this out of fear because he mentioned IQ. 00:51:00.540 |
And the New York Times, I mean, they were really cowardly. 00:51:07.060 |
They know that they should not have fired James Bennet. 00:51:07.060 |
And that's why they're now poking their head up 00:51:16.860 |
And then of course they got their heads blown off 00:51:18.220 |
'cause Twitter reacted like, how dare you say this? 00:51:24.580 |
They didn't apologize for defending free speech. 00:51:30.540 |
- Can I ask you an opinion on something here? 00:51:34.820 |
in terms of Twitter being the editorial board 00:51:38.140 |
for the prestigious journalistic organizations, 00:51:43.580 |
what's the importance of the role of Mr. Elon Musk in this? 00:51:52.260 |
but here's a human who tweets about the importance 00:52:02.300 |
the positive and the negative possible consequences 00:52:09.460 |
and if he's gonna be one of the major reasons 00:52:11.860 |
why we decarbonize quickly and why we get to Mars, 00:52:19.780 |
Now where I'm concerned and where I'm critical 00:52:22.220 |
is we're in the middle of a raging culture war 00:52:25.180 |
and this culture war is making our institutions stupid. 00:52:30.220 |
This culture war, I think, could destroy our country. 00:52:34.820 |
into constant constitutional crises, a lot more violence. 00:52:40.800 |
but I think there will be a lot more violence. 00:52:42.620 |
So we're in the middle of this raging culture war. 00:53:35.980 |
but they should not be weighing in on culture war issues. 00:53:45.780 |
"because they were mostly the kindness party, 00:53:50.260 |
"but they have become the party of division and hate, 00:53:53.940 |
"so I can no longer support them and will vote Republican." 00:53:59.400 |
"Now watch their dirty tricks campaign against me unfold." 00:54:10.420 |
- Because he's probably communicating with the board, 00:54:23.260 |
bringing back the former president onto the platform, 00:54:35.660 |
it seems that there's a strong lean in Twitter 00:54:45.340 |
it seems like the current operation of Twitter 00:54:49.940 |
is the extremes of the left get outraged by something, 00:55:23.920 |
with everything with the parties, that's not the issue. 00:55:26.120 |
The issue is, should the most important CEO in America, 00:55:29.560 |
the CEO of some of our biggest and most important companies, 00:55:36.320 |
In one world, the CEO of every Fortune 500 company has said, 00:55:41.320 |
I'm a Republican because I hate those douchebags, 00:55:44.000 |
or I'm a Democrat because I hate those Nazi racists. 00:55:50.240 |
everybody, it's culture war everywhere, all the time, 00:55:54.360 |
You pick a doctor based on whether he's red or blue. 00:55:57.880 |
That's one possible future, which we're headed towards. 00:56:02.320 |
Political conflict should be confined to political spaces. 00:56:06.200 |
but you don't go protesting at people's private homes. 00:56:14.560 |
When you go shopping, when you go to a restaurant, 00:56:19.400 |
you should be able to buy products from an excellent company. 00:56:21.200 |
You shouldn't have to always think, what's the CEO? 00:56:31.360 |
That was, I think, really throwing fuel on a fire 00:56:36.680 |
are gonna get even more politicized than they are. 00:56:48.280 |
of course we can find out what his views are. 00:56:53.320 |
but it's not that a CEO can't be a partisan or have views, 00:57:05.040 |
and it's setting a precedent that corporations 00:57:22.200 |
So I'm guessing this was written while sitting 00:57:24.920 |
on the toilet and I could see in a day from now 00:57:35.920 |
there's a chaos element, and that's, chaos is not-- 00:57:52.480 |
- That's a trade-off, because if you become a politician, 00:57:55.760 |
so there's a trade-off between, in this case, 00:58:01.600 |
like being calculating about the impact you have 00:58:09.120 |
And I'm not sure calculating is also a slippery slope. 00:58:13.000 |
Both are slippery slopes, you have to be careful. 00:58:18.080 |
and we just say like, "What should a person do?" 00:58:20.400 |
Those are very hard, but our world is actually structured 00:58:25.520 |
And if it's just like, "Oh, talking here among our friends, 00:58:31.160 |
But the CEO of a company has fiduciary duties, 00:58:38.640 |
And if he is using the company for his own political gain 00:58:44.960 |
that's a violation of his fiduciary duty to the company. 00:58:49.640 |
whether your fiduciary duty is to the shareholders, 00:59:06.960 |
So it's not necessarily he's violating a law by doing it, 00:59:16.760 |
- I think you have to take it in the full context 00:59:18.520 |
because you see that he's not being a political player, 00:59:35.080 |
"left-handed people are douchebags, I hate them." 00:59:38.880 |
Like, why would you drag away left-handed people? 00:59:40.440 |
- What you said now is not either funny or light-hearted 00:59:47.160 |
I'm not picking on you, I'm saying that statement. 00:59:49.640 |
Words matter, there's a lightness to the statement 00:59:57.160 |
there's ridiculous memes and there's nonstop jokes 01:00:06.920 |
there's not a celebration of the beauty of the engineering 01:00:18.540 |
where you think that it's really not authentic. 01:00:26.880 |
by being civil, you're actually playing politics 01:00:29.800 |
because all of your actual political decision-making 01:00:41.320 |
and actually struggling with some of the ideas 01:00:45.220 |
I think this lightness represents the kind of positivity 01:01:01.140 |
- But look at what you just said, in the full context. 01:01:09.420 |
- Well, that's the problem you're highlighting. 01:01:10.260 |
- The rule of Twitter is there is no context. 01:01:12.220 |
Everything is taken in the maximum possible way. 01:01:16.640 |
you know, yes, I wish we did take people in context. 01:01:23.500 |
- So you're saying it is a bit of responsibility 01:01:30.580 |
there is the fundamental mechanism of Twitter 01:01:32.500 |
where people don't give you the benefit of the doubt. 01:01:34.900 |
- Well, I don't wanna hang it on large platform 01:01:36.600 |
because then that's what a lot of people say. 01:01:41.220 |
and she should say things that agree with my politics. 01:01:47.660 |
CEOs of a company have duties and responsibilities. 01:01:54.780 |
as being a really bad role model for young men. 01:01:59.080 |
and he is a very appealing, attractive role model. 01:02:04.820 |
I think, I don't wanna put a responsibility on people, 01:02:08.540 |
but yes, he could be a much, much better role model. 01:02:12.220 |
- You can't insult sitting senators by calling them old. 01:02:31.420 |
the reason I like Elon is because of the engineering, 01:02:37.340 |
- But what I admire on the Twitter side is the authenticity 01:02:41.200 |
because I've been a little bit jaded and worn out 01:02:49.260 |
the CEOs and the politicians who built up walls 01:02:53.000 |
That's one of the reasons I love long-form podcasting, 01:02:55.500 |
is especially if you talk more than 10 minutes, 01:03:10.340 |
at least to me, the biggest problem is to pick sides, 01:03:12.840 |
to say I'm not going to vote this way or that way. 01:03:21.140 |
You have much, like the importance of social media 01:03:30.680 |
the short-term bickering of any one political party. 01:03:35.760 |
where we develop ideas through sort of rigorous discourse, 01:03:43.420 |
- So, okay, so here's an idea about social media 01:03:48.060 |
which is everyone freaks out because they think either, 01:03:52.620 |
oh, he's going to do less content moderations. 01:03:58.140 |
because they think the people doing the content moderation 01:04:06.380 |
And it was, I would defeat the bots or something. 01:04:17.220 |
And actually, I think this could change the world. 01:04:19.140 |
Even if the bid fails, the fact that Elon said that, 01:04:22.460 |
that he thinks we need to authenticate all humans is huge. 01:04:26.260 |
Because now we're talking about solutions here. 01:04:29.900 |
What can we do to make social media a better place 01:04:33.540 |
a place that actually makes democracy better? 01:04:48.860 |
Whereas we're using it to make ourselves weaker, 01:04:57.500 |
And all the argument is about content moderation. 01:05:02.720 |
is that what, five or 10% of what they might call 01:05:06.740 |
hate speech gets caught, 1% of violence and intimidation. 01:05:09.900 |
Content moderation, even if we do a lot more of it, 01:05:13.700 |
All the power is in the dynamics and in changes to the architecture. 01:05:20.460 |
what are the reforms that would matter for social media? 01:05:23.420 |
the number one thing I believe is user authentication 01:05:36.460 |
anyone can open an account on Twitter, Facebook, whatever, 01:05:39.420 |
as long as you're over 16, and that's another piece. 01:05:46.780 |
You can open an account and you can look, you can read, 01:05:50.220 |
and you can make up whatever fake name you want. 01:05:57.420 |
on a company that has section 230 protection from lawsuits, 01:06:07.580 |
If you want to be able to post on platforms that, 01:06:13.100 |
there's a lot of evidence that they are undermining 01:06:16.520 |
Then the company has this minimal responsibility 01:06:23.460 |
You can't just walk up to a bank with a bag of money 01:06:25.420 |
that you stole and say, here, deposit this for me. 01:06:31.420 |
And the bank isn't gonna announce who you are publicly, 01:06:33.460 |
but you have to, if they're gonna do business with you, 01:06:35.620 |
they need to know you're a real person, not a criminal. 01:06:39.620 |
And so there's a lot of schemes for how to do this. 01:06:53.240 |
meaning you have to at least pass a CAPTCHA or some test 01:07:00.300 |
it's a constant cat and mouse struggle between bots. 01:07:06.160 |
The next level up, there are a variety of schemes 01:07:13.640 |
So some, whether you show an ID, whether you use biometric, 01:07:19.580 |
that establishes identity, whether it's linked to a phone, 01:07:22.340 |
whatever it is, there are multiple schemes now 01:07:24.580 |
that companies have figured out for how to do this. 01:07:27.540 |
And so if you did that, then in order to get an account 01:07:30.920 |
where you have posting privileges on Facebook or Twitter 01:07:33.620 |
or TikTok or whatever, you have to at least do that. 01:07:49.460 |
You still have trolls who use their real name, 01:07:51.420 |
but this would just make it a little scarier for trolls. 01:07:59.100 |
but some men, as soon as they have the anonymity, 01:08:04.820 |
One troll can ruin thousands of people's day. 01:08:08.180 |
- You know, I'm somebody who believes in free speech. 01:08:12.640 |
And so there's been a lot of discussions about this. 01:08:15.940 |
And we'll ask you some questions about this too. 01:08:18.940 |
But there is, the tension there is the power of a troll 01:08:35.840 |
if you wanna have a private party and enjoy your time, 01:08:57.400 |
So see, you're saying it's a step in the right direction 01:09:19.960 |
So actually there's a lot of ways to do this. 01:09:24.260 |
So if you go to the social media and political dysfunction, 01:09:35.100 |
So we're collecting there now some of the ideas 01:09:41.800 |
And so one is WorldCoin, there's one human-id.org. 01:09:46.480 |
This is a new organization created by an NYU Stern student 01:09:56.360 |
of identity verification that is keyed to your phone. 01:10:03.280 |
And of course you can buy seven different phone numbers 01:10:06.160 |
if you want, but it's gonna be about 20 or $30 a month. 01:10:15.600 |
but most people do and nobody has a thousand. 01:10:24.800 |
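As a rough sketch of the tiered idea described here, reading stays open to everyone, posting is gated behind at least a proof-of-humanity check, and identity verification (ID, phone-keyed, biometric) sits above that; the toy model below captures that logic. All names, tiers, and rules are hypothetical, not any platform's actual policy or API.

```python
# Toy model of the tiered-access idea: reading stays open, posting requires at
# least a proof-of-humanity check. All names and rules here are hypothetical.
from dataclasses import dataclass
from enum import IntEnum

class Verification(IntEnum):
    NONE = 0      # freshly created account
    HUMAN = 1     # passed a CAPTCHA-style "are you a person" gate
    IDENTITY = 2  # verified as a unique real person (ID, phone, biometric, ...)

@dataclass
class Account:
    handle: str                 # can still be a pseudonym in public
    verification: Verification

def can_read(account: Account) -> bool:
    return True                 # anyone can open an account and look

def can_post(account: Account) -> bool:
    # Posting privileges require at least proof of humanity; the platform
    # knows the account is real but does not have to reveal who it is.
    return account.verification >= Verification.HUMAN

lurker = Account("anon123", Verification.NONE)
poster = Account("pseudonym42", Verification.IDENTITY)
print(can_read(lurker), can_post(lurker))   # True False
print(can_read(poster), can_post(poster))   # True True
```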
Imagine a public square in which the incentives 01:10:34.080 |
Okay, so that's the public square we have now. 01:10:41.280 |
If there's a thousand of us in the public square 01:10:43.600 |
and 10 are incentivized to kick us and throw shit at us, 01:11:01.560 |
Imagine if the incentive was to be constructive. 01:11:04.160 |
And so this is an idea that I've been kicking around. 01:11:05.880 |
I talked about it with Reid Hoffman last week 01:11:19.000 |
focus on users, what users are incredibly aggressive. 01:11:27.040 |
AI could easily code nastiness or just aggression, hostility. 01:11:40.600 |
You will see everybody who's a four and below, 01:11:43.280 |
but you won't even see the people who are fives 01:11:49.500 |
We're not censoring them, they can say what they want. 01:11:52.060 |
But now there's actually an incentive to not be an asshole. 01:11:59.180 |
So imagine our country goes in two directions. 01:12:05.180 |
in which we could actually talk about things. 01:12:06.780 |
And in the other, we actually try to disincentivize 01:12:10.020 |
being an asshole and encourage being constructive. 01:12:22.140 |
I've been looking at a lot of the machine learning models 01:12:23.860 |
involved and I believe that the nastiness classification 01:12:30.020 |
- So I personally believe in crowdsourced nastiness labeling. 01:12:35.020 |
But in an objective way where it doesn't become 01:12:52.420 |
Is this a shitty thing to say at this moment? 01:12:55.420 |
- Well, wait, no, but we don't care about individual posts. 01:13:02.740 |
as long as we're talking about averages here, 01:13:04.540 |
one person has a misclassified post, it doesn't matter. 01:13:22.300 |
'Cause that's what, we have that incentive in real life. 01:13:26.060 |
- It's actually really painful to be a full-time asshole, 01:13:32.100 |
- It should be also pain to be an asshole on the internet. 01:13:36.700 |
There could be different mechanisms for that. 01:13:38.420 |
I wish AI was there, machine learning models were there. 01:13:44.220 |
so one track is we have AI machine learning models 01:13:55.500 |
they look at it and they say, "What's going on here?" 01:13:59.700 |
and you can refine whatever the algorithms are 01:14:03.420 |
and people can only, "Hey, let's all rate this guy 01:14:06.940 |
So you wouldn't want just to rely on one single track. 01:14:09.580 |
But if you have two tracks, I think you could do it. 01:14:12.940 |
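A minimal sketch of what that two-track setup could look like, with all thresholds hypothetical: each user gets a nastiness average on the 1-to-9 scale, blended from a classifier score and crowdsourced ratings, and the result affects only default visibility, not whether the posts exist.

```python
# Minimal sketch of the two-track idea, with hypothetical thresholds.
from collections import defaultdict
from statistics import mean

classifier_scores = defaultdict(list)   # user -> model-rated nastiness per post
crowd_scores = defaultdict(list)        # user -> average crowd rating per post

def record_post(user, model_score, crowd_ratings):
    classifier_scores[user].append(model_score)
    if crowd_ratings:
        crowd_scores[user].append(mean(crowd_ratings))

def user_nastiness(user):
    # Blend the two tracks so neither the model alone nor a brigading crowd
    # can single-handedly sink someone.
    tracks = [mean(track[user]) for track in (classifier_scores, crowd_scores)
              if track[user]]
    return mean(tracks) if tracks else 1.0

def visible_by_default(user, threshold=5.0):
    # "Fours and below" stay in the default feed; fives and up are not
    # deleted or censored, just not shown unless a reader opts in.
    return user_nastiness(user) < threshold

record_post("troll", 8.5, [9, 8, 7])
record_post("regular", 2.0, [1, 2])
print(visible_by_default("troll"), visible_by_default("regular"))  # False True
```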
- What do you think about this word misinformation 01:14:15.040 |
that maybe connects to our two discussions now? 01:14:18.660 |
So one is the discussion of social media and democracy. 01:14:23.620 |
And then the other is the coddling of the American mind. 01:14:31.340 |
or used as a bullying word, like racism and so on, 01:15:00.860 |
- So I think there is a meaning of disinformation 01:15:09.420 |
by Russian agents to plant a story and spread it. 01:15:14.340 |
And they've been doing that since the '50s or '40s even. 01:15:21.060 |
- It's a disinformation campaign by the Russians. 01:15:29.660 |
- So I think to the extent that there are campaigns 01:15:43.540 |
that are intending to confuse or spread lies. 01:15:48.540 |
This is the Soviet, the firehose of falsehood tactic. 01:15:53.760 |
All the companies need to have pretty large staffs, 01:16:14.540 |
The Russians have been trying since the '50s, 01:16:17.860 |
they would come to America and they would do hate crimes. 01:16:27.220 |
Most of the debate nowadays, however, is not that. 01:16:37.140 |
then you're very conscious of the times when, 01:16:44.300 |
They would, you literally would get shut down 01:16:49.660 |
but there's at least a real likelihood that it is, 01:16:53.500 |
So I tend to stay away from any such discussions. 01:16:58.980 |
One is because they're almost entirely partisan. 01:17:03.260 |
what the other side is saying is misinformation 01:17:07.380 |
or disinformation, and they can prove certain examples. 01:17:12.460 |
We certainly are never gonna get 60 votes in the Senate 01:17:25.340 |
and then the nature of who is on the platform, 01:17:30.340 |
unverified people, and how much amplification they get. 01:17:47.060 |
is fundamentally in the dynamics of social media, 01:17:49.780 |
not in the subtle choices of content moderation, 01:17:57.900 |
You know, "The Turner Diaries" is this book written in 1978. 01:18:01.180 |
It introduced the replacement theory to a lot of people. 01:18:05.900 |
Timothy McVeigh had it on him when he was captured in 1995 01:18:22.340 |
They've been around, but this guy's from a small town. 01:18:30.860 |
he was entirely influenced by things he found online. 01:18:33.780 |
He was not influenced by anyone he met in person. 01:18:45.560 |
And this is, again, back to the "Atlantic" article. 01:18:49.500 |
I've got this amazing quote from Martin Gurri. 01:18:53.500 |
But Martin Gurri, he was a former CIA analyst, 01:18:56.660 |
wrote this brilliant book called "The Revolt of the Public" 01:19:04.860 |
He says, he talks about how in the age of mass media, 01:19:14.480 |
but we had stories in common, we had facts in common. 01:19:20.460 |
And he describes how the flood of information 01:19:22.480 |
with the internet is like a tsunami washing over us. 01:19:26.980 |
And he says, this is in an interview on Vox. 01:19:30.340 |
He says, "The digital revolution has shattered that mirror. 01:19:33.940 |
And now the public inhabits those broken pieces of glass. 01:19:39.340 |
It's highly fragmented and it's basically mutually hostile. 01:19:45.260 |
and living in bubbles of one sort or another." 01:19:48.060 |
And so, we now see clearly there's this little bubble 01:19:51.740 |
of just bizarre nastiness in which the killer 01:19:58.460 |
and now in Buffalo, they're all put into a community 01:20:05.060 |
So we can never stamp those words or ideas out. 01:20:09.300 |
The question is not, can we stop them from existing? 01:20:16.900 |
all over the world and into every little town 01:20:21.240 |
whatever percentage of young men are vulnerable to this, 01:20:40.060 |
but much more important is fixing the dynamics. 01:20:50.660 |
that's only if all we talk about is content moderation. 01:20:55.180 |
You know, how much or how little do we, you know, 01:21:05.440 |
- So can you try to steel man the other side? 01:21:09.400 |
So the people that might say that social media 01:21:30.340 |
If you look at the whole, which is difficult to measure, 01:21:48.560 |
you know, the argument is usually what they say is, 01:21:52.700 |
well, you know, for communities that are cut off, 01:21:54.700 |
especially LGBTQ kids, they can find each other. 01:22:01.320 |
especially kids who wouldn't find connection otherwise. 01:22:04.700 |
It exposes you to a range of ideas and content, 01:22:16.700 |
is there inklings of data that's maybe early data 01:22:25.120 |
or how would you measure behavioral positive? 01:22:38.780 |
And this is what Mark was referring to when he said, 01:22:42.820 |
and on most it was positive, and on some it was negative. 01:22:48.460 |
look, most kids enjoy it, they're having fun, 01:22:51.620 |
but some kids are feeling inferior, cut off, bullied. 01:22:56.620 |
So if we're saying what's the average experience 01:23:01.700 |
on the platform, that might actually be positive. 01:23:22.300 |
like how do you feel while you're consuming heroin? 01:23:24.800 |
Like while I'm consuming heroin, I feel great, 01:23:29.740 |
Am I glad that everyone in my seventh grade class 01:23:37.660 |
sitting on the bench shooting up during recess. 01:23:40.020 |
So when you look at it as an emergent phenomenon 01:23:42.700 |
that changed childhood, now it doesn't matter 01:23:45.140 |
what are the feelings while you're actually using it. 01:23:47.220 |
We need to zoom out and say, how has this changed childhood? 01:23:56.780 |
what Mark said in 2012 when he was taking Facebook public, 01:24:01.780 |
and you know, this is the wake of the Arab Spring. 01:24:11.860 |
Now people say, you know, Facebook took them down. 01:24:24.720 |
So the argument that Mark makes in this letter 01:24:31.600 |
is, you know, we're at a turning point in history, 01:24:38.820 |
We're giving people the tools to rewire their institutions. 01:24:51.620 |
with the fall of the Iron Curtain and the Soviet Union. 01:24:54.580 |
And then the internet comes in and, you know, 01:24:58.440 |
how extraordinary it was, how much fun it was. 01:25:01.780 |
I mean, the sense that this was the dawning of a new age, 01:25:06.700 |
And so this optimism runs all the way from the early '90s, 01:25:09.740 |
all the way through 2011 with the Arab Spring. 01:25:12.980 |
And of course that year ends with Occupy Wall Street. 01:25:15.180 |
And there were also big protest movements in Israel, 01:25:24.740 |
that Facebook in particular, but all these platforms, 01:25:30.700 |
What dictator could possibly keep out the internet? 01:25:41.500 |
And then we can see what happened in the years after. 01:25:43.420 |
Now, first of all, so in Mark's response to you, 01:25:51.380 |
He says, "I think it's worth grounding this conversation 01:25:55.260 |
"in the actual research that has been done on this, 01:26:02.540 |
Then he says, "Most academic studies that I've seen 01:26:12.700 |
But he asserts that, well, actually, wait, it's tricky 01:26:18.220 |
So I can't, so it might be that the studies he has seen 01:26:20.420 |
say that, but if you go to the Google Doc with Chris Bale, 01:26:33.900 |
But on the other six, the evidence is pretty strongly 01:26:38.620 |
- He also draws a line between the United States 01:26:41.820 |
- Right, and there's one thing true about that, 01:26:46.020 |
much faster in the US than in any other major country. 01:26:49.580 |
So we're talking about an article by Matthew Gentzkow 01:27:06.720 |
that it's organized by, whether studies indicate yes or no, 01:27:10.400 |
question one is does social media make people more angry 01:27:15.020 |
Question two is does social media create echo chambers? 01:27:17.700 |
These are fascinating, really important questions. 01:27:19.720 |
Question three is does social media amplify posts 01:27:22.620 |
that are more emotional, inflammatory, or false? 01:27:31.240 |
foreign governments to increase political dysfunction 01:27:36.660 |
Question six, does social media decrease trust? 01:27:39.780 |
Seven, does social media strengthen populist movements? 01:27:44.500 |
And then there's other sections, as you mentioned. 01:27:47.980 |
But once you operationalize it as seven different questions, 01:27:52.320 |
so one is about polarization and there are measures of that, 01:27:56.000 |
the degree to which people say they hate the other side. 01:27:58.380 |
And so in this study by Boxell, Gentzkow, and Shapiro, 2021, 01:28:02.860 |
they looked at all the measures of polarization 01:28:10.940 |
And they show plots, you have these nice plots 01:28:13.140 |
with red lines showing that in some countries 01:28:15.140 |
it's going up, like the United States especially, 01:28:28.460 |
But so much depends on how you operationalize things. 01:28:31.860 |
Are we interested in the straight line, regression line, 01:28:37.180 |
And if so, well, then he's right in what he says. 01:28:44.780 |
The argument is it's been rising and falling since 2012 01:28:50.420 |
I've been emailing with the authors of the study 01:28:56.060 |
'cause there's only a few observations after 2012. 01:28:58.160 |
But if you look at the graphs in their study, 01:29:00.420 |
and they actually do provide, as they pointed out to me, 01:29:07.180 |
So actually, polarization is going up pretty widely 01:29:12.940 |
which is when the internet would be influential. 01:29:37.260 |
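To illustrate the operationalization point with simulated numbers (not the Boxell, Gentzkow, and Shapiro data): fitting one straight regression line across decades averages over any post-2012 elbow, while splitting the series at 2012 makes the change in slope visible.

```python
# Simulated series, not the actual polarization data: one regression line over
# the whole period averages the two regimes, while splitting at 2012 exposes
# the elbow the single line hides.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1995, 2021)
trend = np.where(years < 2012,
                 50 + 0.1 * (years - 1995),      # hypothetical slow pre-2012 drift
                 51.7 + 1.5 * (years - 2012))    # hypothetical post-2012 rise
index = trend + rng.normal(0.0, 0.5, years.size)

slope_all = np.polyfit(years, index, 1)[0]
slope_pre = np.polyfit(years[years < 2012], index[years < 2012], 1)[0]
slope_post = np.polyfit(years[years >= 2012], index[years >= 2012], 1)[0]

print(f"single line, 1995-2020: {slope_all:+.2f} per year")
print(f"before 2012:            {slope_pre:+.2f} per year")
print(f"after 2012:             {slope_post:+.2f} per year")
# How you operationalize the question (one line vs. a break at 2012)
# largely determines which answer you get.
```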
that polarization is up some places, down other places. 01:29:41.980 |
But actually, we have another section in the Google Doc 01:29:45.040 |
where we review all the data on the decline of democracy. 01:29:52.500 |
by some measures it begins to drop in the late 2000s, 01:30:06.220 |
of democracies on this planet that began in the 2010s. 01:30:14.620 |
there are a lot of other studies that point the other way. 01:30:18.020 |
are getting more polarized or less, more polarized. 01:30:27.300 |
Do you think Mark, do you think Elon or whoever 01:30:42.660 |
now this is you giving sort of financial advice to me. 01:30:52.540 |
But what do you think, when we talk again in 10 years, 01:30:57.300 |
what do you think we'd be looking at if it's a better world? 01:31:03.260 |
the dynamics of each change that needs to be made. 01:31:07.620 |
And so the biggest change for teen mental health, 01:31:12.380 |
it was set to 13 in COPPA in like 1997 or six or whatever, 01:31:19.500 |
I think it needs to go to 16 or 18 with enforcement. 01:31:38.820 |
when they were talking about rolling out Facebook for kids, 01:31:45.180 |
There's a business imperative to hook them early 01:31:48.480 |
So I don't expect Facebook to act on its own accord 01:31:56.020 |
like what economists call like a prisoner's dilemma 01:31:58.340 |
or a social dilemma is generalized to multiple people. 01:32:03.180 |
each player can't opt out because they're gonna lose. 01:32:18.860 |
that may put pressure on the platforms globally 01:32:22.300 |
So anyway, my point is, we have to have regulation 01:32:31.700 |
So I think that if the companies and the users, 01:32:36.940 |
in which the incentives against doing the right thing 01:32:40.900 |
are strong, we do need regulation on certain matters. 01:32:44.540 |
And again, it's not about content moderation, 01:32:47.860 |
but it's things like the Platform Accountability 01:32:49.620 |
and Transparency Act, which is from Senator Coons, 01:32:54.300 |
This would force the platforms to just share information 01:33:05.020 |
the Digital Platforms Commission Act of 2022, 01:33:09.020 |
which would create a body tasked with actually regulating 01:33:13.660 |
Right now, the US government doesn't have a body. 01:33:19.260 |
but we don't have a body that can oversee or understand 01:33:22.420 |
these things that are transforming everything 01:33:24.380 |
and possibly severely damaging our political life. 01:33:28.020 |
So I think there's a lot of, oh, and then the, 01:33:31.580 |
the state of California is actually currently considering 01:33:35.140 |
a version of the UK's Age-Appropriate Design Code, 01:33:39.140 |
which would force the companies to do some simple things 01:33:42.180 |
like not be sending alerts and notifications to children 01:33:47.140 |
just things like that to make platforms just less damaging. 01:33:52.540 |
So I think there's an essential role for regulation. 01:33:55.060 |
And I think if the US Congress is too paralyzed by politics, 01:33:58.180 |
if the UK and the EU and the state of California 01:34:05.020 |
the platforms don't wanna have different versions 01:34:17.260 |
designed to hit Amazon, but it's hitting YouTube. 01:34:40.740 |
That's just to say that government can sometimes 01:34:46.420 |
- And so my preference is the threat of regulation 01:34:54.740 |
- Good behavior. - You really shouldn't need it. 01:35:03.220 |
And I actually, honestly, this, to our earlier 01:35:08.620 |
that I think it's good business to do the right thing 01:35:18.020 |
- Well, I think it's important because I've been thinking 01:35:30.580 |
in either creating World War III or avoiding World War III. 01:35:34.300 |
It seems like so much of wars throughout history 01:35:37.940 |
have been started through very fast escalation. 01:35:42.300 |
And it feels like just looking at our recent history, 01:35:44.780 |
social media is the mechanism for escalation. 01:35:47.420 |
And so it's really important to get this right, 01:35:49.460 |
not just for the mental health of young people, 01:35:57.380 |
but literally the survival of human civilization. 01:36:07.380 |
about World War III than I am about Civil War II. 01:36:13.880 |
Can I ask for your wise sage advice to young people? 01:36:25.420 |
But to young people in high school and college, 01:36:34.660 |
'cause I teach a course at NYU in the business school 01:36:47.140 |
But the course has evolved that it's now about three things, 01:36:50.620 |
how to get stronger, smarter, and more sociable. 01:37:03.820 |
love, and friendships, then you will be happier. 01:37:08.780 |
how do you become smarter, stronger, and happier? 01:37:22.680 |
that needs a lot of experience to wire itself up. 01:37:25.900 |
And the most important part of that experience is 01:37:33.660 |
You have an attachment figure to make you feel confident 01:37:43.500 |
because over time, you then develop the strength 01:37:50.200 |
That's what we blocked in the 1990s in this country. 01:37:52.980 |
So young people have to get themselves the childhood, 01:38:00.820 |
that older generations are blocking them from out of fear 01:38:06.100 |
out of just hijacking almost all the inputs into their life 01:38:10.920 |
So go out there, put yourself out in experiences. 01:38:14.800 |
You are anti-fragile and you're not gonna get strong 01:38:24.120 |
And then there's an analogy in how you get smarter, 01:38:27.080 |
which is you have to expose yourself to other ideas, 01:38:34.560 |
And this is why I co-founded Heterodox Academy 01:38:36.860 |
because we believe that faculty need to be in communities 01:38:41.580 |
that have political diversity and viewpoint diversity, 01:39:02.740 |
and so you have to realize that to get stronger, 01:39:07.400 |
And then the key to becoming more sociable is very simple. 01:39:17.640 |
what's interesting to them, what do they want? 01:39:19.840 |
And if you develop the skill of looking at it 01:39:36.100 |
But take charge of your life and your development 01:39:40.520 |
then the older protective generation and your phone 01:39:47.800 |
- So on anti-fragility and coddling the American mind, 01:40:00.840 |
but it's real struggle, absurd, unfair struggle 01:40:04.400 |
that seems to be formative to the human mind. 01:40:07.240 |
He says, "From time to time in the years to come, 01:40:13.120 |
so that you will come to know the value of justice. 01:40:18.780 |
because that will teach you the importance of loyalty. 01:40:30.560 |
so that you will be conscious of the role of chance in life 01:40:33.480 |
and understand that your success is not completely deserved 01:40:41.240 |
And when you lose, as you will from time to time, 01:40:55.260 |
so you know the importance of listening to others. 01:41:15.560 |
- Yes, for his son's middle school graduation. 01:41:21.200 |
- Yeah, and I think your work is really important 01:41:24.040 |
and it is beautiful and it's bold and fearless 01:41:27.200 |
and it's a huge honor that you would sit with me. 01:41:30.520 |
Thank you for spending your valuable time with me today, 01:41:39.880 |
please check out our sponsors in the description. 01:41:42.880 |
And now, let me leave you with some words from Carl Jung. 01:41:50.240 |
"can lead us to an understanding of ourselves." 01:41:53.940 |
Thank you for listening and hope to see you next time.