
Jonathan Haidt: The Case Against Social Media | Lex Fridman Podcast #291


Chapters

0:00 Introduction
1:25 Social media and mental health
15:14 Mark Zuckerberg
24:51 Children's use of social media
35:36 Social media and democracy
51:42 Elon Musk and Twitter
68:13 Anonymity on social media
74:12 Misinformation
81:06 Social media benefits
83:50 Political division on social media
90:22 Future of social media
96:14 Advice for young people


00:00:00.000 | The following is a conversation with Jonathan Haidt,
00:00:02.560 | social psychologist at NYU
00:00:04.720 | and critic of the negative effects of social media
00:00:07.480 | on the human mind and human civilization.
00:00:10.480 | He gives a respectful but hard hitting response
00:00:13.740 | to my conversation with Mark Zuckerberg.
00:00:16.040 | And together, he and I try to figure out
00:00:18.560 | how we can do better,
00:00:20.200 | how we can lessen the amount of depression
00:00:22.560 | and division in the world.
00:00:24.960 | He has brilliantly discussed these topics in his writing,
00:00:28.760 | including in his book, "The Coddling of the American Mind"
00:00:32.240 | and in his recent long article in "The Atlantic"
00:00:35.560 | titled, "Why the Past 10 Years of American Life
00:00:39.120 | Have Been Uniquely Stupid."
00:00:41.800 | When Teddy Roosevelt said in his famous speech
00:00:45.160 | that it is not the critic who counts,
00:00:47.680 | he had not yet read
00:00:48.960 | the brilliant writing of Jonathan Haidt.
00:00:51.320 | I disagree with John on some of the details
00:00:53.680 | of his analysis and ideas,
00:00:55.320 | but both his criticism and our disagreement are essential
00:01:00.320 | if we are to build better
00:01:02.200 | and better technologies that connect us.
00:01:04.640 | Social media has both the power to destroy our society
00:01:08.520 | and to help it flourish.
00:01:10.560 | It's up to us to figure out how we take the lighter path.
00:01:14.140 | This is the Lex Fridman Podcast.
00:01:17.120 | To support it,
00:01:18.080 | please check out our sponsors in the description.
00:01:20.760 | And now, dear friends, here's Jonathan Haidt.
00:01:25.360 | - So you have been thinking about the human mind
00:01:27.560 | for quite a long time.
00:01:29.320 | You wrote "The Happiness Hypothesis,"
00:01:31.000 | "The Righteous Mind," "The Coddling of the American Mind,"
00:01:33.640 | and today you're thinking, you're writing a lot
00:01:35.680 | about social media and about democracy.
00:01:38.960 | So perhaps if it's okay,
00:01:42.120 | let's go through the thread that connects all of that work.
00:01:46.120 | How do we get from the very beginning to today
00:01:49.280 | with the good, the bad, and the ugly of social media?
00:01:53.180 | So I'm a social psychologist,
00:01:55.120 | which means I study how we think about other people
00:01:59.640 | and how people affect our thinking.
00:02:01.760 | And in graduate school at the University of Pennsylvania,
00:02:04.560 | I picked the topic of moral psychology,
00:02:07.220 | and I studied how morality varied across countries.
00:02:10.920 | I studied in Brazil and India.
00:02:13.040 | And in the '90s, I began,
00:02:15.120 | this was like I got my PhD in 1992.
00:02:17.840 | And in that decade was really when the American culture war
00:02:21.280 | kind of really began to blow up.
00:02:23.740 | And I began to notice that left and right in this country
00:02:26.400 | were becoming like separate countries.
00:02:28.540 | And you could use the tools of cultural psychology
00:02:30.300 | to study this split,
00:02:31.980 | this moral battle between left and right.
00:02:34.780 | So I started doing that.
00:02:36.100 | And I began growing alarmed in the early 2000s
00:02:39.700 | about how bad polarization was getting.
00:02:42.340 | And I began studying the causes of polarization,
00:02:46.520 | bringing moral psychology to bear on our political problems.
00:02:50.240 | And I was originally gonna write a book
00:02:52.940 | to basically help the Democrats stop screwing up,
00:02:55.540 | because I could see that some of my research showed
00:02:58.460 | people on the right understand people on the left,
00:03:01.860 | they know what they think.
00:03:02.780 | You can't grow up in America
00:03:03.980 | without knowing what progressives think.
00:03:06.060 | But here I grew up generally on the left,
00:03:08.220 | and I had no idea what conservatives thought
00:03:10.060 | until I went and sought it out
00:03:11.660 | and started reading conservative things
00:03:13.100 | like National Review.
00:03:14.160 | So originally I wanted to actually help the Democrats
00:03:18.180 | to understand moral psychology
00:03:19.500 | so they could stop losing to George W. Bush.
00:03:21.840 | And I got a contract to write "The Righteous Mind."
00:03:24.380 | And once I started writing it,
00:03:25.500 | I committed to understanding conservatives
00:03:27.840 | by reading the best writings, not the worst.
00:03:30.060 | And I discovered, you know what?
00:03:31.540 | You don't understand anything
00:03:32.580 | until you look from multiple perspectives.
00:03:34.600 | And I discovered there are a lot of great
00:03:36.380 | social science ideas
00:03:37.580 | in the conservative intellectual tradition.
00:03:39.860 | And I also began to see, you know what?
00:03:42.320 | America's actually in real trouble.
00:03:43.980 | And this is like 2008, 2009.
00:03:45.420 | Things are really, we're coming apart here.
00:03:47.560 | So I began to really focus my research
00:03:49.420 | on helping left and right understand each other
00:03:51.660 | and helping our democratic institutions to work better.
00:03:54.820 | Okay, so all this is before I had any interest
00:03:56.680 | in social media.
00:03:57.520 | I was on Twitter, I guess like 2009,
00:03:59.620 | and not much, didn't think about it much.
00:04:02.220 | And then, so I'm going along
00:04:05.900 | as a social psychologist studying this.
00:04:07.900 | And then everything seems to kind of blow up
00:04:10.540 | in 2014, 2015 at universities.
00:04:13.620 | And that's when Greg Lukianoff came to me
00:04:16.380 | in May of 2014 and said,
00:04:17.860 | "John, weird stuff is happening.
00:04:20.500 | Students are freaking out
00:04:22.260 | about a speaker coming to campus
00:04:24.340 | that they don't have to go see.
00:04:25.780 | And they're saying it's dangerous, it's violence.
00:04:27.700 | Like what is going on?"
00:04:28.940 | And so anyway, Greg's ideas
00:04:31.060 | about how we were teaching students
00:04:33.220 | to think in distorted ways,
00:04:34.300 | that led us to write the "Coddling the American Mind,"
00:04:37.020 | which wasn't primarily about social media either.
00:04:38.740 | It was about, you know,
00:04:39.740 | this sort of a rise of depression, anxiety.
00:04:42.820 | But after that, things got so much worse everywhere.
00:04:45.780 | And that's when I began to think like,
00:04:47.100 | whoa, something systemically has changed.
00:04:49.180 | Something has changed
00:04:50.020 | about the fabric of the social universe.
00:04:52.020 | And so ever since then,
00:04:52.860 | I've been focused on social media.
00:04:54.820 | - So we're going to try to sneak up
00:04:57.340 | to the problems and the solutions at hand
00:04:59.340 | from different directions.
00:05:00.860 | I have a lot of questions
00:05:02.180 | whether it's fundamentally the nature of social media
00:05:05.140 | that's the problem,
00:05:05.980 | it's the decisions of various human beings
00:05:08.780 | that lead the social media companies that's the problem.
00:05:11.360 | Is there still some component
00:05:12.960 | that's highlighted in the "Coddling of the American Mind"
00:05:16.020 | that's the individual psychology at play
00:05:18.180 | or the way parenting and education works
00:05:22.400 | to make sort of emphasize anti-fragility of the human mind
00:05:27.400 | as it interacts with the social media platforms
00:05:31.300 | and the other humans through the social.
00:05:32.820 | So all that beautiful mess.
00:05:34.220 | - That should take us an hour or two to cover.
00:05:36.300 | - Or maybe a couple of years, yes.
00:05:37.940 | But so let's start if it's okay.
00:05:40.120 | You said you wanted to challenge some of the things
00:05:43.180 | that Mark Zuckerberg has said
00:05:44.920 | in a conversation with me.
00:05:46.620 | What are some of the ideas he expressed
00:05:48.620 | that you disagree with?
00:05:49.500 | - Okay, there are two major areas that I study.
00:05:51.780 | One is what is happening with teen mental health?
00:05:54.300 | It fell off a cliff in 2013, it was very sudden.
00:05:57.580 | And then the other is what is happening
00:06:00.140 | to our democratic and epistemic institutions?
00:06:02.860 | That means knowledge generating
00:06:04.140 | like universities, journalism.
00:06:07.060 | So my main areas of research
00:06:09.300 | where I'm collecting the empirical research
00:06:10.700 | and trying to make sense of it
00:06:12.020 | is what's happened to teen mental health
00:06:14.300 | and what's the evidence that social media is a contributor?
00:06:17.360 | And then the other areas,
00:06:18.760 | what's happening to democracies, not just America,
00:06:22.320 | and what's the evidence that social media
00:06:23.760 | is a contributor to the dysfunction?
00:06:25.520 | So I'm sure we'll get to that
00:06:26.800 | 'cause that's what the Atlantic article is about.
00:06:28.820 | But if we focus first on
00:06:30.360 | what's happened to teen mental health.
00:06:32.400 | So before I read the quotes from Mark,
00:06:34.440 | I'd like to just give the overview.
00:06:36.360 | And it is this.
00:06:39.720 | There's a lot of data tracking adolescents.
00:06:43.560 | There's self-reports of how depressed, anxious, lonely.
00:06:47.040 | There's data on hospital admissions for self-harm.
00:06:49.120 | There's data on suicide.
00:06:50.740 | And all of these things, they bounce around somewhat,
00:06:53.720 | but they're relatively level in the early 2000s.
00:06:56.840 | And then all of a sudden, around 2010 to 2013,
00:07:00.520 | depending on which statistic you're looking at,
00:07:02.680 | all of a sudden, they begin to shoot upwards.
00:07:06.280 | More so for girls in some cases,
00:07:08.040 | but on the whole, it's like up for both sexes.
00:07:10.760 | It's just that boys have lower levels
00:07:12.100 | of anxiety and depression.
00:07:13.080 | So the curve is not quite as dramatic.
00:07:15.240 | But what we see is not small increases.
00:07:17.600 | It's not like, oh, 10%, 20%.
00:07:20.240 | No, the increases are between 50 and 150%,
00:07:24.640 | depending on which group you're looking at.
00:07:26.800 | Suicide for preteen girls, thankfully, it's not very common,
00:07:31.560 | but it's two to three times more common now.
00:07:34.000 | Or by 2015, it had doubled.
00:07:36.600 | Between 2010 and 2015, it doubled.
00:07:38.680 | So something is going radically wrong
00:07:40.600 | in the world of American preteens.
00:07:42.880 | So as I've been studying it, I found, first of all,
00:07:46.360 | it's not just America.
00:07:47.520 | It's identical in Canada and the UK.
00:07:50.800 | Australia and New Zealand are very similar.
00:07:52.320 | They're just after a little delay.
00:07:53.800 | So whatever we're looking for here,
00:07:55.000 | but yet it's not as clear in the Germanic countries.
00:07:58.240 | In continental Europe, it's a little different,
00:08:00.120 | and we can get into that when we talk about childhood.
00:08:02.640 | But something's happening in many countries,
00:08:05.640 | and it started right around 2012, 2013.
00:08:08.920 | It wasn't gradual.
00:08:10.520 | It hit girls hardest,
00:08:11.520 | and it hit preteen girls the hardest.
00:08:13.920 | So what could it be?
00:08:15.640 | Nobody has come up with another explanation.
00:08:18.400 | Nobody.
00:08:19.280 | It wasn't the financial crisis.
00:08:20.760 | That wouldn't have hit preteen girls the hardest.
00:08:23.760 | There is no other explanation.
00:08:25.080 | The complexity here in the data is, of course,
00:08:27.360 | as everyone knows, correlation doesn't prove causation.
00:08:30.520 | The fact that television viewing was going up
00:08:32.480 | in the '60s and '70s doesn't mean
00:08:35.360 | that that was the cause of the crime.
00:08:37.080 | So what I've done, and this is work with Jean Twenge,
00:08:40.480 | who wrote the book "iGen," is because I was challenged,
00:08:44.280 | when Greg and I put out the book,
00:08:46.400 | "The Coddling of the American Mind,"
00:08:47.800 | some researchers challenged us and said,
00:08:48.960 | "Oh, you don't know what you're talking about.
00:08:50.760 | "The correlations between social media use
00:08:52.960 | "and mental health, they exist, but they're tiny.
00:08:55.500 | "It's like a correlation coefficient of .03,
00:08:59.000 | "or a beta of .05, tiny little things."
00:09:02.360 | And one famous article said,
00:09:03.960 | "It's no bigger than the correlation
00:09:06.160 | "of bad mental health and eating potatoes,"
00:09:10.080 | which exists, but it's so tiny it's zero, essentially.
00:09:14.480 | And that claim, that social media's no more harmful
00:09:17.400 | than eating potatoes or wearing eyeglasses,
00:09:19.400 | it was a very catchy claim, and it's caught on,
00:09:21.680 | and I keep hearing that.
00:09:22.880 | But let me unpack why that's not true,
00:09:25.320 | and then we'll get to what Mark said.
00:09:26.340 | 'Cause what Mark basically said,
00:09:27.760 | here, I'll actually read it.
00:09:29.200 | - And by the way, just to pause real quick,
00:09:31.160 | is you implied, but just to make it explicit,
00:09:35.240 | that the best explanation we have now,
00:09:36.880 | as you're proposing, is that a very particular aspect
00:09:39.880 | of social media is the cause,
00:09:42.280 | which is not just social media,
00:09:43.600 | but the like button and the retweet,
00:09:46.440 | a certain mechanism of virality that was invented,
00:09:51.120 | or perhaps some aspect of social media is the cause.
00:09:53.360 | - Okay, good idea.
00:09:54.320 | Let's be clear.
00:09:55.400 | Connecting people is good.
00:09:56.960 | I mean, overall, the more you connect people, the better.
00:10:00.080 | Giving people the telephone was an amazing step forward.
00:10:02.940 | Giving them free telephone, free long distance,
00:10:05.560 | is even better.
00:10:06.400 | Video is, I mean, so connecting people is good.
00:10:08.360 | I'm not a Luddite.
00:10:10.120 | And social media, at least the idea of users posting things,
00:10:14.120 | like that happens on LinkedIn, and it's great.
00:10:16.080 | It can serve all kinds of needs.
00:10:17.900 | What I'm talking about here is not the internet.
00:10:20.520 | It's not technology.
00:10:21.840 | It's not smartphones.
00:10:23.360 | And it's not even all social media.
00:10:25.940 | It's a particular business model
00:10:28.200 | in which people are incentivized to create content.
00:10:31.760 | And that content is what brings other people on.
00:10:34.200 | And the people on there are the product
00:10:36.480 | which is sold to advertisers.
00:10:38.160 | It's that particular business model
00:10:39.680 | which Facebook pioneered,
00:10:41.880 | which seems to be incredibly harmful for teenagers,
00:10:45.480 | especially for young girls, 10 to 14 years old
00:10:47.940 | is where they're most vulnerable.
00:10:50.040 | And it seems to be particularly harmful
00:10:51.720 | for democratic institutions
00:10:53.320 | because it leads to all kinds of anger, conflict,
00:10:55.560 | and the destruction of any shared narrative.
00:10:57.520 | So that's what we're talking about.
00:10:59.000 | We're talking about Facebook, Twitter.
00:11:00.560 | I don't have any data on TikTok.
00:11:01.920 | I suspect it's gonna end up being,
00:11:03.680 | having a lot of really bad effects
00:11:05.680 | because the teens are on it so much.
00:11:07.000 | And to be really clear,
00:11:08.160 | since we're doing the nuance now in this section,
00:11:10.320 | lots of good stuff happens.
00:11:11.720 | There's a lot of funny things on Twitter.
00:11:14.840 | I use Twitter because it's an amazing way to put out news,
00:11:17.520 | to put out when I write something,
00:11:19.520 | you and I use it to promote things.
00:11:22.280 | We learn things quickly.
00:11:24.680 | - Well, there's gonna be,
00:11:25.520 | now this is harder to measure.
00:11:28.000 | And we'll probably,
00:11:29.600 | I'll try to mention it
00:11:30.720 | because so much of our conversation
00:11:32.760 | will be about rigorous criticism.
00:11:34.400 | I'll try to sometimes mention
00:11:37.200 | what are the possible positive effects of social media
00:11:39.760 | in different ways.
00:11:40.840 | So for example, in the way I've been using Twitter,
00:11:45.840 | not the promotion or any of that kind of stuff,
00:11:48.280 | it makes me feel less lonely to connect with people,
00:11:53.280 | to make me smile, a little bit of humor here and there.
00:11:56.800 | And that at scale is a very interesting effect,
00:11:59.160 | being connected across the globe,
00:12:00.760 | especially during times of COVID and so on.
00:12:03.240 | It's very difficult to measure that.
00:12:05.000 | So we kind of have to consider that and be honest.
00:12:08.840 | There is a trade-off.
00:12:10.960 | We have to be honest about the positive and the negative.
00:12:13.560 | And sometimes we're not sufficiently positive
00:12:16.360 | or in a rigorous scientific way about the,
00:12:18.880 | we're not rigorous in a scientific way about the negative.
00:12:22.760 | And that's what we're trying to do here.
00:12:25.280 | And so that brings us to the Mark Zuckerberg email.
00:12:31.040 | - Okay, but wait,
00:12:31.880 | let me just pick up on the issue of trade-offs
00:12:33.280 | because people might think like,
00:12:35.680 | well, how much of this do we need?
00:12:37.200 | If we have too much, it's bad.
00:12:38.280 | No, that's a one-dimensional conceptualization.
00:12:42.120 | This is a multi-dimensional issue.
00:12:44.040 | And a lot of people seem to think like,
00:12:45.160 | oh, what would we have done
00:12:46.280 | without social media during COVID?
00:12:47.480 | Like we would have been sitting there alone in our homes.
00:12:49.840 | Yeah, if all we had was texting, telephone, Zoom, Skype,
00:12:54.840 | multiplayer video games, WhatsApp,
00:12:57.260 | all sorts of ways of communicating with each other.
00:13:00.120 | Oh, and there's blogs and the rest of the internet.
00:13:02.600 | Yeah, we would have been fine.
00:13:03.680 | Did we really need the hyper-viral platforms
00:13:06.680 | of Facebook and Twitter?
00:13:07.660 | Now those did help certain things get out faster.
00:13:09.960 | And that did help science Twitter sometimes,
00:13:12.040 | but it also led to huge explosions of misinformation
00:13:14.540 | and the polarization of our politics to such an extent
00:13:17.740 | that a third of the country, you know,
00:13:19.600 | didn't believe what the medical establishment was saying.
00:13:22.520 | And we'll get into this.
00:13:23.660 | The medical establishment sometimes
00:13:25.300 | was playing political games that made them less credible.
00:13:27.960 | So on net, it's not clear to me,
00:13:30.080 | if you've got the internet, smartphones, blogs,
00:13:33.280 | all of that stuff, it's not clear to me
00:13:36.360 | that adding in this particular business model
00:13:39.400 | of Facebook, Twitter, TikTok,
00:13:40.920 | that that really adds a lot more.
00:13:43.280 | - And one interesting one we'll also talk about is YouTube.
00:13:46.840 | I think it's easier to talk about Twitter and Facebook.
00:13:50.240 | YouTube is another complex beast that's very hard to,
00:13:53.640 | 'cause YouTube has many things.
00:13:54.960 | It's a content platform,
00:13:56.120 | but it also has a recommendation system.
00:13:59.480 | That's, let's focus our discussion
00:14:02.600 | on perhaps Twitter and Facebook,
00:14:04.240 | but you do in this large document
00:14:06.560 | that you're putting together on social media
00:14:09.600 | called "Social Media and Political Dysfunction
00:14:11.640 | Collaborative Review" with Chris Bale.
00:14:14.160 | That includes, I believe, papers on YouTube as well.
00:14:18.040 | - It does, but yeah, again,
00:14:19.120 | just to finish up with the nuance,
00:14:20.600 | yeah, YouTube is really complicated
00:14:23.440 | because I can't imagine life without YouTube.
00:14:25.280 | It's incredibly useful.
00:14:26.520 | It does a lot of good things.
00:14:28.260 | It also obviously helps to radicalize
00:14:30.600 | terrorist groups and murderers.
00:14:32.440 | So I think about YouTube
00:14:34.840 | the way I think about the internet in general,
00:14:37.020 | and I don't know enough to really comment on YouTube.
00:14:39.200 | So I have been focused, and it's also interesting.
00:14:42.000 | One thing we know is teen social life changed radically
00:14:45.640 | between about 2010 and 2012.
00:14:47.760 | Before 2010, they weren't mostly on every day
00:14:49.920 | 'cause they didn't have smartphones yet.
00:14:51.680 | By 2012 to '14, that's the era
00:14:53.680 | in which they almost all get smartphones,
00:14:55.720 | and they become daily users of the, what?
00:14:58.780 | So the girls go to Instagram and Tumblr.
00:15:00.820 | They go to the visual ones.
00:15:02.380 | The boys go to YouTube and video games.
00:15:04.680 | Those don't seem to be as harmful to mental health
00:15:06.540 | or even harmful at all.
00:15:07.860 | It's really Tumblr, Instagram particularly,
00:15:11.560 | that seem to really have done in girls' mental health.
00:15:14.300 | So now, okay, so let's look at the quote
00:15:16.380 | from Mark Zuckerberg.
00:15:18.600 | So at 64 minutes and 31 seconds on the video,
00:15:23.600 | I time-coded this.
00:15:25.060 | - This is excellent.
00:15:26.180 | - This is the very helpful YouTube transcript.
00:15:28.620 | YouTube's an amazing program.
00:15:30.240 | You ask him about Francis Haugen.
00:15:33.100 | You give him a chance to respond.
00:15:34.760 | And here's the key thing.
00:15:37.500 | So he talks about what Francis Haugen said.
00:15:41.300 | He said, "No, but that's mischaracterized.
00:15:42.940 | "Actually, on most measures, the kids are doing better
00:15:45.360 | "when they're on Instagram.
00:15:46.200 | "It's just on one out of the 18."
00:15:48.500 | And then he says, "I think an accurate characterization
00:15:52.140 | "would have been that kids using Instagram,"
00:15:55.380 | or not kids, but teens,
00:15:56.760 | "is generally positive for their mental health."
00:15:59.420 | That's his claim, that Instagram is overall,
00:16:03.420 | taken as a whole, Instagram is positive
00:16:05.640 | for their mental health.
00:16:06.480 | That's what he says.
00:16:07.500 | Now, is it really?
00:16:10.500 | Is it really?
00:16:11.660 | So first, just a simple, okay, now here,
00:16:14.060 | what I'd like to do is turn my attention
00:16:16.340 | to another document that we'll make available.
00:16:18.540 | So I was invited to give testimony
00:16:20.540 | before a Senate subcommittee two weeks ago,
00:16:22.900 | where they were considering
00:16:23.820 | the Platform Accountability Act.
00:16:25.140 | Should we force the platforms to actually tell us
00:16:27.220 | what our kids are doing?
00:16:28.620 | Like, we have no idea, other than self-report.
00:16:30.580 | We have no idea.
00:16:32.260 | They're the only ones who know, like, the kid does this,
00:16:34.240 | and then over the next hours, the kid's depressed or happy.
00:16:36.860 | We can't know that, but Facebook knows it.
00:16:39.860 | So should they be compelled to reveal the data?
00:16:42.700 | We need that.
00:16:43.700 | - So you raised just, to give people a little bit of context,
00:16:47.940 | and this document is brilliantly structured with questions,
00:16:51.740 | studies that indicate that the answer to a question is yes,
00:16:54.940 | indicate that the answer to a question is no,
00:16:57.180 | and then mixed results.
00:16:58.140 | And questions include things like,
00:17:00.340 | does social media make people more angry
00:17:02.180 | or effectively polarized?
00:17:03.700 | - Right, wait, so that's the one that we're gonna get to.
00:17:06.140 | That's the one for democracy.
00:17:07.900 | - Yes, that's for democracy.
00:17:08.900 | - So I've got three different Google Docs here,
00:17:10.540 | 'cause I found this is an amazing way,
00:17:11.860 | and thank God for Google Docs.
00:17:13.220 | It's an amazing way to organize the research literature,
00:17:16.060 | and it's a collaborative review,
00:17:17.660 | meaning that, so on this one,
00:17:19.220 | Gene Twenge and I put up the first draft,
00:17:21.300 | and we say, please, comment, add studies,
00:17:23.580 | tell us what we missed.
00:17:24.660 | And it evolves in real time.
00:17:25.860 | - In any direction, the yes or the no.
00:17:27.700 | - Oh yeah, we specifically encourage,
00:17:29.060 | 'cause look, the center of my research
00:17:31.980 | is that our gut feelings drive our reasoning.
00:17:34.500 | That was my dissertation, that was my early research.
00:17:37.180 | And so if Gene Twenge and I are committed to,
00:17:39.620 | but we're gonna obviously preferentially believe
00:17:41.940 | that these platforms are bad for kids,
00:17:43.540 | 'cause we said so in our books.
00:17:44.980 | So we have confirmation bias,
00:17:47.020 | and I'm a devotee of John Stuart Mill,
00:17:49.300 | the only cure for confirmation bias
00:17:51.300 | is other people who have a different confirmation bias.
00:17:53.780 | So these documents evolve because critics then say,
00:17:55.940 | no, you missed this, or they say,
00:17:57.160 | you don't know what you're talking about.
00:17:58.180 | It's like, great, say so, tell us.
00:17:59.880 | So I put together this document,
00:18:02.340 | and I'm gonna put links to everything on my website.
00:18:05.340 | If listeners, viewers go to jonathanhaidt.com/socialmedia.
00:18:11.900 | It's a new page I just created.
00:18:13.700 | I'll put everything together in one place there,
00:18:15.620 | and we'll put those in the show notes.
00:18:17.220 | - Like links to this document,
00:18:19.180 | and other things like it that we're talking about.
00:18:21.140 | - That's right, exactly.
00:18:22.380 | So yeah, so the thing I wanna call attention to now
00:18:24.740 | is this document here with the title,
00:18:27.580 | Teen Mental Health is Plummeting,
00:18:29.340 | and Social Media is a Major Contributing Cause.
00:18:32.220 | So Ben Sass and Chris Coons are on the Judiciary Committee.
00:18:34.980 | They had a subcommittee hearing
00:18:36.420 | on Nate Priscilli's bill,
00:18:38.900 | Platform Accountability Transparency Act.
00:18:40.580 | So they asked me to testify on what do we know,
00:18:42.740 | what's going on with teen mental health?
00:18:44.340 | And so what I did was I put together everything I know
00:18:46.340 | with plenty of graphs to make these points.
00:18:49.940 | That first, what do we know about the crisis?
00:18:52.040 | Well, that the crisis is specific to mood disorders,
00:18:55.300 | not everything else.
00:18:56.420 | It's not just self-report, it's also behavioral data,
00:18:59.300 | because suicide and self-harm go skyrocketing after 2010.
00:19:02.940 | The increases are very large,
00:19:05.460 | and the crisis is gendered, and it's hit many countries.
00:19:08.780 | So I go through the data on that.
00:19:10.300 | So we have a pretty clear characterization,
00:19:12.340 | and nobody's disputed me on this part.
00:19:14.540 | - So can we just pause real quick,
00:19:16.060 | just so for people who are not aware.
00:19:17.900 | So self-report, just how you kind of collect data
00:19:20.780 | on this kind of thing.
00:19:21.980 | - Sure.
00:19:22.820 | - You have a self-report, a survey, you ask people--
00:19:26.380 | - Yeah, how anxious are you these days?
00:19:29.340 | - Yeah.
00:19:30.180 | - How many hours a week do you use social media?
00:19:31.860 | - That kind of stuff.
00:19:32.700 | And you do, it's maybe,
00:19:35.680 | you can collect large amounts of data that way,
00:19:38.640 | 'cause you can ask a large number of people
00:19:40.060 | that kind of question.
00:19:41.020 | And but then there's, I forget the term you use,
00:19:43.260 | but more, so non-self-report data.
00:19:46.300 | - Behavioral data.
00:19:47.140 | - Behavioral data, that's right.
00:19:48.260 | Where you actually have self-harm and suicide numbers.
00:19:53.260 | - Exactly.
00:19:54.660 | So there are a lot of graphs like this.
00:19:55.740 | So this is from the National Survey on Drug Use and Health.
00:19:58.300 | So the federal government, and also Pew and Gallup,
00:20:01.900 | there are a lot of organizations
00:20:03.360 | that have been collecting survey data for decades.
00:20:05.740 | So this is a gold mine.
00:20:07.020 | And what you see on these graphs over and over again
00:20:09.300 | is relatively straight lines up until around 2010 or 2012.
00:20:12.500 | - And on the X-axis, we have time,
00:20:14.180 | years going from 2004 to 2020.
00:20:17.380 | And the Y-axis is the percent of US teens
00:20:20.340 | who had a major depression in the last year.
00:20:22.540 | - That's right.
00:20:23.380 | So when this data started coming out around,
00:20:24.900 | so Jean Twain's book, "I, Jen," 2017,
00:20:27.220 | a lot of people said, "Oh, she doesn't know
00:20:29.040 | "what she's talking about.
00:20:29.880 | "This is just self-report."
00:20:30.980 | Like, Gen Z, they're just really comfortable
00:20:32.740 | talking about this.
00:20:33.700 | This is a good thing.
00:20:34.780 | This isn't a real epidemic.
00:20:36.300 | And literally the day before my book with Greg was published,
00:20:39.420 | the day before, there was a psychiatrist in "New York Times"
00:20:42.380 | who had an op-ed saying, "Relax.
00:20:44.520 | "Smartphones are not ruining your kid's brain."
00:20:47.060 | And he said, "It's just self-report.
00:20:48.200 | "It's just that they're giving higher rates.
00:20:50.080 | "There's more diagnosis.
00:20:50.900 | "But underlying, there's no change."
00:20:52.460 | No, because it's theoretically possible,
00:20:56.720 | but all we have to do is look at the hospitalization data
00:20:59.380 | for self-harm and suicide, and we see the exact same trends.
00:21:03.140 | We see also a very sudden, big rise
00:21:06.500 | around between 2009 and 2012.
00:21:08.660 | You have an elbow, and then it goes up, up, up.
00:21:10.660 | So, and that is not self-report.
00:21:11.900 | Those are actual kids admitted to hospitals
00:21:13.700 | for cutting themselves.
00:21:15.460 | So we have a catastrophe, and this was all true before COVID.
00:21:18.860 | COVID made things worse, but we have to realize,
00:21:21.700 | you know, COVID's going away.
00:21:22.920 | Kids are back in school, but we're not gonna go back
00:21:25.900 | to where we were because this problem is not caused by COVID.
00:21:29.780 | What is it caused by?
00:21:31.620 | Well, just again, to just go through the point,
00:21:33.900 | then I'll stop.
00:21:34.740 | I just feel like I just really wanna get out the data
00:21:38.140 | to show that Mark is wrong. - This is amazing.
00:21:40.300 | - So first point, correlational studies
00:21:42.180 | consistently show a link.
00:21:43.520 | They almost all do, but it's not big.
00:21:45.740 | Equivalent to a correlation coefficient
00:21:47.580 | around 0.1, typically.
00:21:49.440 | That's the first point.
00:21:51.380 | The second point is that the correlation
00:21:55.340 | is actually much larger than for eating potatoes.
00:21:58.020 | So that famous line wasn't about social media use.
00:22:01.780 | That was about digital media use.
00:22:04.300 | That included watching Netflix,
00:22:05.960 | doing homework on everything.
00:22:07.540 | And so what they did is they looked at all screen use,
00:22:10.780 | and then they said, "This is correlated
00:22:12.380 | "with self-reports of depression, anxiety."
00:22:15.220 | Like, you know, 0.03, it's tiny.
00:22:17.840 | And they said that clearly in the paper,
00:22:20.300 | but the media has reported it as social media is 0.03,
00:22:24.860 | or tiny, and that's just not true.
00:22:26.980 | What I found digging into it,
00:22:28.300 | you don't know this until you look at the,
00:22:29.940 | there's more than 100 studies in the Google Doc.
00:22:32.520 | Once you dig in, what you see is,
00:22:34.260 | okay, you see a tiny correlation.
00:22:36.400 | What happens if we zoom in on just social media?
00:22:38.420 | It always gets bigger, often a lot bigger,
00:22:40.180 | two or three times bigger.
00:22:42.020 | What happens if we zoom in on girls and social media?
00:22:44.460 | It always gets bigger, often a lot bigger.
00:22:46.380 | And so what I think we can conclude,
00:22:48.840 | in fact, what one of the authors
00:22:49.980 | of the potato studies herself concludes,
00:22:52.060 | Amy Orban says, I think I have a quote from here,
00:22:55.060 | she reviewed a lot of studies,
00:22:56.660 | and she herself said that, quote,
00:22:59.020 | "The associations between social media use and wellbeing
00:23:02.060 | therefore range from about R equals 0.15 to R equals 0.10."
00:23:07.060 | So that's the range we're talking about.
00:23:09.860 | And that's for boys and girls together.
00:23:11.900 | And a lot of research, including hers and mine,
00:23:14.220 | show that girls, it's higher.
00:23:15.700 | So for girls, we're talking about correlations
00:23:17.160 | around 0.15 to 0.2, I believe.
00:23:19.500 | Jean Twenge and I found it's about 0.2 or 0.22.
00:23:22.600 | Now this might sound like an arcane social science debate,
00:23:24.780 | but people have to understand,
00:23:26.240 | public health correlations are almost never above 0.2.
00:23:29.180 | So the correlation of childhood exposure to lead
00:23:31.900 | and adult IQ, very serious problem, that's 0.09.
00:23:35.220 | Like the world's messy and our measurements are messy.
00:23:37.900 | And so if you find a consistent correlation of 0.15,
00:23:41.180 | like you would never let your kid do that thing.
00:23:43.900 | That actually is dangerous.
00:23:45.180 | And it can explain, when you multiply it
00:23:47.500 | over tens of millions of kids spending,
00:23:50.280 | you know, years of their lives,
00:23:51.980 | you actually can explain the mental health epidemic
00:23:54.060 | just from social media use.
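
To make this scale argument concrete, here is a minimal Python sketch of how a correlation in the r = 0.15 to 0.2 range can translate into a large gap in depression rates between heavy and light users. All of the specifics (the population size, the 0.17 correlation, the top-10% clinical cutoff, the heavy/light thresholds) are hypothetical assumptions chosen only for illustration, not figures from the studies being discussed.

import numpy as np

# Hypothetical illustration: a "small" correlation (r ~ 0.15-0.2) between
# social media use and depressive symptoms, applied to a large population.
rng = np.random.default_rng(0)
n = 1_000_000          # assumed population size (illustrative)
r = 0.17               # assumed correlation, within the range discussed

# Two standardized variables with correlation r:
# z-scored hours of use, and a latent depression score.
use = rng.standard_normal(n)
depression = r * use + np.sqrt(1 - r**2) * rng.standard_normal(n)

# Treat the top 10% of the depression distribution as "depressed"
# (an arbitrary cutoff used only for this sketch).
cutoff = np.quantile(depression, 0.90)

heavy = use > 1.0      # roughly the top ~16% of users
light = use < -1.0     # roughly the bottom ~16%

rate_heavy = (depression[heavy] > cutoff).mean()
rate_light = (depression[light] > cutoff).mean()

print(f"depression rate, heavy users: {rate_heavy:.1%}")
print(f"depression rate, light users: {rate_light:.1%}")
print(f"relative risk: {rate_heavy / rate_light:.2f}x")

With these assumed numbers, the heavy-use group ends up with roughly two to three times the depression rate of the light-use group, which is the kind of population-level gap that the "correlation is tiny" framing obscures.
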
00:23:56.900 | - Well, and then there's questions.
00:23:58.300 | By the way, this is really good to learn
00:24:00.780 | because I quit potatoes and it had no effect on me.
00:24:04.140 | And as a Russian, that was a big sacrifice.
00:24:06.260 | They're quite literal actually,
00:24:08.220 | because I'm mostly eating keto these days.
00:24:10.020 | But that's funny that they're actually
00:24:12.700 | literally called the potato studies.
00:24:14.540 | Okay, but given this,
00:24:17.380 | and there's a lot of fascinating data here,
00:24:19.820 | there's also a discussion of how to fix it.
00:24:23.820 | What are the aspects that if fixed
00:24:28.820 | would start to reverse some of these trends?
00:24:32.420 | So if we just linger on the Mark Zuckerberg statements.
00:24:37.420 | So first of all, do you think Mark is aware
00:24:40.420 | of some of these studies?
00:24:43.200 | So if you put yourself in the shoes of Mark Zuckerberg
00:24:48.200 | and the executives at Facebook and Twitter,
00:24:51.500 | how can you try to understand the studies?
00:24:53.720 | Like the Google Docs you put together
00:24:55.580 | to try to make decisions that fix things?
00:24:59.340 | Is there a stable science now
00:25:02.060 | that you can start to investigate?
00:25:03.740 | And also maybe if you can comment
00:25:06.780 | on the depth of data that's available,
00:25:08.660 | because ultimately, and this is something you argue,
00:25:11.600 | that the data should be more transparent,
00:25:14.380 | should be provided.
00:25:15.740 | But currently, if it's not,
00:25:18.980 | all you have is maybe some leaks of internal data.
00:25:22.460 | - That's right, that's right.
00:25:23.620 | And we could talk about the potential.
00:25:25.460 | You have to be very sort of objective
00:25:27.200 | about the potential bias in those kinds of leaks.
00:25:29.700 | You want to, it would be nice to have a non-leak data.
00:25:33.700 | Like--
00:25:35.180 | - Yeah, it'd be nice to be able to actually
00:25:36.780 | have academic researchers able to access
00:25:39.140 | in de-individuated, de-identified form
00:25:41.620 | the actual data on what kids are doing
00:25:44.560 | and how their mood changes.
00:25:45.940 | And when people commit suicide, what was happening before.
00:25:48.780 | And it'd be great to know that.
00:25:49.660 | We have no idea.
00:25:51.060 | - So how do we begin to fix social media, would you say?
00:25:55.620 | - Okay, so here's the most important thing to understand.
00:25:58.220 | In the social sciences, we say,
00:26:00.700 | is social media harmful to kids?
00:26:02.140 | That's a broad question.
00:26:03.320 | You can't answer that directly.
00:26:04.780 | You have to have much more specific questions.
00:26:06.640 | You have to operationalize it
00:26:08.540 | and have a theory of how it's harming kids.
00:26:11.420 | And so almost all of the research
00:26:13.580 | is done on what's called the dose response model.
00:26:16.340 | That is, everybody, including most of the researchers,
00:26:19.160 | are thinking about this like,
00:26:20.380 | let's treat this like sugar.
00:26:22.580 | You know, 'cause the data usually shows
00:26:23.860 | a little bit of social media use
00:26:24.900 | isn't correlated with harm, but a lot is.
00:26:26.780 | So, you know, think of it like sugar.
00:26:28.660 | And if kids have a lot of sugar, then it's bad.
00:26:31.100 | So how much is okay?
00:26:32.200 | But social media is not like sugar at all.
00:26:35.900 | It's not a dose response thing.
00:26:37.380 | It's a complete rewiring of childhood.
00:26:40.320 | So we evolved as a species in which kids play
00:26:43.620 | in mixed age groups.
00:26:44.860 | They learn the skills of adulthood.
00:26:46.180 | They're always playing and working
00:26:47.660 | and learning and doing errands.
00:26:48.780 | That's normal childhood.
00:26:49.740 | That's how you develop your brain.
00:26:50.780 | That's how you become a mature adult, until the 1990s.
00:26:53.740 | In the 1990s, we dropped all that.
00:26:55.340 | We said, it's too dangerous.
00:26:56.620 | If we let you outside, you'll be kidnapped.
00:26:58.540 | So we completely, we began rewiring childhood
00:27:00.940 | in the '90s before social media.
00:27:03.140 | And that's a big part of the story.
00:27:05.140 | I'm a big fan of Lenore Skenazy,
00:27:07.020 | who wrote the book "Free-Range Kids."
00:27:08.300 | If there are any parents listening to this,
00:27:10.540 | please buy Lenore's book "Free-Range Kids,"
00:27:13.680 | and then go to letgrow.org.
00:27:15.900 | It's a nonprofit that Lenore and I started
00:27:21.260 | with Peter Gray and Daniel Shuchman
00:27:21.260 | to help change the laws and the norms
00:27:24.520 | around letting kids out to play.
00:27:25.780 | They need free play.
00:27:26.940 | So that's the big picture.
00:27:28.340 | They need free play.
00:27:29.740 | And we started stopping that in the '90s
00:27:31.620 | that we reduced it.
00:27:32.700 | And then Gen Z, kids born in 1996,
00:27:35.740 | they're the first people in history
00:27:37.900 | to get on social media before puberty.
00:27:41.020 | Millennials didn't get it until they were in college.
00:27:43.520 | But Gen Z, they get it 'cause you can lie.
00:27:46.100 | You just lie about your age.
00:27:48.180 | So they really began to get on around 2009, 2010,
00:27:51.820 | and boom, two years later, they're depressed.
00:27:53.740 | It's not because they ate too much sugar necessarily.
00:27:56.240 | It's because even normal social interactions
00:27:59.140 | that kids had in the early 2000s, largely,
00:28:02.260 | well, they decline because now everything's
00:28:04.580 | through the phone.
00:28:05.880 | And that's what I'm trying to get across,
00:28:08.340 | that it's not just a dose response thing.
00:28:10.660 | It's imagine one middle school
00:28:13.280 | where everyone has an Instagram account
00:28:15.180 | and it's constant drama.
00:28:16.260 | Everyone's constantly checking and posting
00:28:18.160 | and worrying and imagine going through puberty that way
00:28:21.840 | versus imagine there was a policy, no phones in school.
00:28:24.800 | You have to check them in a locker.
00:28:26.100 | No one can have an Instagram account.
00:28:27.520 | All the parents are on board.
00:28:29.460 | Parents only let their kids have Instagram
00:28:31.300 | because the kid says everyone else has it.
00:28:33.420 | And we're stuck in a social dilemma.
00:28:35.060 | We're stuck in a trap.
00:28:36.140 | So what's the solution?
00:28:38.220 | Keep kids off until they're done with puberty.
00:28:40.420 | There's a new study actually by Amy Orban
00:28:42.220 | and Andy Shabilsky showing that the damage is greatest
00:28:45.820 | for girls between 11 and 13.
00:28:47.740 | So there is no way to make it safe for preteens
00:28:50.480 | or even 13, 14 year olds.
00:28:51.640 | We've gotta, kids should simply not be allowed
00:28:54.220 | on these business models where you're the product.
00:28:57.720 | They should not be allowed until you're 16.
00:28:59.540 | We need to raise the age and enforce it.
00:29:00.940 | That's the biggest thing.
00:29:01.760 | - So I think that's a really powerful solution,
00:29:03.660 | but it makes me wonder if there's other solutions
00:29:08.660 | like controlling the virality of bullying.
00:29:15.580 | Sort of if there's a way that's more productive
00:29:18.780 | to childhood to use social media.
00:29:21.500 | So of course one thing is putting your phone down,
00:29:24.060 | but first of all from the perspective
00:29:25.740 | of social media companies, it might be difficult
00:29:28.380 | to convince them to do so.
00:29:29.800 | And also for me as an adult who grew up
00:29:34.580 | without social media, social media is a source of joy.
00:29:39.500 | So I wonder if it's possible to design the mechanisms
00:29:43.460 | both challenge the ad driven model,
00:29:47.260 | but actually just technically the recommender system
00:29:49.940 | and how virality works on these platforms.
00:29:55.420 | If it's possible to design a platform
00:29:57.780 | that leads to growth, anti-fragility,
00:30:00.700 | but does not lead to depression, self-harm, and suicide.
00:30:05.700 | Like finding that balance and making that
00:30:08.500 | as the objective function, not engagement or--
00:30:12.940 | - Yeah, I don't think that can be done for kids.
00:30:15.980 | So I am very reluctant to tell adults what to do.
00:30:18.740 | I have a lot of libertarian friends
00:30:20.420 | and I would lose their friendship if I started saying,
00:30:22.460 | oh, it's bad for adults and we should stop adults
00:30:24.660 | from using it.
00:30:25.780 | But by the same token, I'm very reluctant
00:30:27.580 | to have Facebook and Instagram tell my kids
00:30:29.500 | what to do without me even knowing
00:30:31.180 | or without me having any ability to control it.
00:30:33.580 | As a parent, it's very hard to stop your kid.
00:30:35.860 | I have stopped my kids from getting on Instagram
00:30:38.220 | and that's caused some difficulties,
00:30:39.900 | but they also have thanked me
00:30:42.540 | 'cause they see that it's stupid.
00:30:44.220 | They see that what the kids are really on it,
00:30:46.500 | what they post, they see that the culture of it
00:30:48.500 | is stupid, as they say.
00:30:50.220 | So I don't think there's a way to make it healthy for kids.
00:30:54.240 | I think there's one thing which is healthy for kids,
00:30:56.100 | which is free play.
00:30:57.160 | We already robbed them of most of it in the '90s.
00:30:59.860 | The more time they spend on their devices,
00:31:01.340 | the less they have free play.
00:31:03.020 | Video games is a kind of play.
00:31:04.220 | I'm not saying that these things are all bad,
00:31:06.700 | but 12 hours of video game play
00:31:08.480 | means you don't get any physical play.
00:31:10.740 | So anyway--
00:31:11.580 | - To me, physical play is the way
00:31:13.240 | to develop physical antifragility.
00:31:17.940 | - And especially social skills.
00:31:19.020 | Kids need huge amounts of conflict
00:31:21.620 | with no adult to supervise or mediate,
00:31:24.140 | and that's what we robbed them of.
00:31:25.460 | So anyway, we should move on
00:31:26.980 | 'cause I get really into the evidence here
00:31:29.420 | because I think the story's actually quite clear now.
00:31:32.260 | There was a lot of ambiguity.
00:31:33.680 | There are conflicting studies,
00:31:34.820 | but when you look at it all together,
00:31:36.580 | the correlational studies are pretty clear
00:31:38.180 | and the effect sizes are coming in around 0.1 to 0.15,
00:31:41.240 | whether you call that a correlation coefficient or a beta.
00:31:43.580 | It's all standardized beta.
00:31:44.900 | It's all in that sort of range.
00:31:46.620 | There's also experimental evidence.
00:31:48.700 | We collect true experiments with random assignment
00:31:50.900 | and they mostly show an effect.
00:31:52.500 | And there's eyewitness testimony.
00:31:55.820 | With the kids themselves,
00:31:58.580 | you talk to girls and you poll them.
00:32:00.700 | Do you think overall Instagram
00:32:01.860 | is good for your mental health or bad for it?
00:32:04.060 | You're not gonna find a group saying,
00:32:05.020 | "Oh, it's wonderful.
00:32:05.860 | Oh yeah, yeah, Mark, you're right.
00:32:07.220 | It's mostly good."
00:32:08.060 | No, the girls themselves say,
00:32:09.900 | "This is the major reason."
00:32:10.900 | And I've got studies in the Google Doc
00:32:13.140 | where there've been surveys.
00:32:13.980 | What do you think is causing depression and anxiety?
00:32:16.900 | And the number one thing they say is social media.
00:32:19.140 | So there's multiple strands of evidence.
00:32:20.980 | - Do you think the recommendation is, as a parent,
00:32:25.460 | that teens should not use Instagram, Twitter?
00:32:30.300 | - Yes.
00:32:31.140 | - That's ultimately, maybe in the longterm,
00:32:33.620 | there's a more nuanced solution.
00:32:34.740 | - There's no way to make it safe.
00:32:36.060 | It's unsafe at any speed.
00:32:38.020 | I mean, it might be very difficult to make it safe.
00:32:41.180 | And then the short term,
00:32:42.340 | while we don't know how to make it safe, put down the phone.
00:32:45.620 | - Well, hold on a second.
00:32:47.380 | Play with other kids via a platform
00:32:50.220 | like Roblox or multiplayer video games.
00:32:52.140 | That's great.
00:32:52.980 | I have no beef with that.
00:32:54.500 | You focus on bullying before.
00:32:56.220 | That's one of five or seven different avenues of harm.
00:32:59.400 | The main one I think, which does in the girls,
00:33:02.220 | is not being bullied.
00:33:03.440 | It's living a life where you're thinking all the time
00:33:07.740 | about posting.
00:33:09.660 | Because once a girl starts posting,
00:33:11.100 | so it's bad enough that they're scrolling through,
00:33:12.860 | and this is, everyone comments on this,
00:33:14.300 | you're scrolling through
00:33:15.140 | and everyone's life looks better than yours
00:33:16.820 | because it's fake and all that you see
00:33:19.080 | are the ones the algorithm picked that were the nicest.
00:33:21.000 | Anyway, so the scrolling, I think, is bad for the girls.
00:33:24.300 | But I'm beginning to see, I can't prove this,
00:33:26.020 | but I'm beginning to see from talking to girls,
00:33:27.580 | from seeing how it's used, is once you start posting,
00:33:31.180 | that takes over your mind.
00:33:32.260 | And now you're, basically, you're no longer present.
00:33:34.860 | Because even if you're only spending
00:33:36.460 | five or six hours a day on Instagram,
00:33:39.100 | you're always thinking about it.
00:33:40.740 | And when you're in class, you're thinking about
00:33:43.780 | how are people responding to the post that I made
00:33:46.180 | between period, between classes.
00:33:48.800 | I mean, I do it.
00:33:49.720 | I try to stay off Twitter for a while,
00:33:51.080 | but now I've got this big article, I'm tweeting about it,
00:33:54.140 | and I can't help it.
00:33:55.100 | I check 20 times a day, I'll check.
00:33:56.900 | Like, what are people saying?
00:33:57.740 | What are people saying?
00:33:58.580 | This is terrible.
00:33:59.500 | And I'm a 58-year-old man.
00:34:01.740 | Imagine being a 12-year-old girl going through puberty,
00:34:04.760 | you're self-conscious about how you look.
00:34:06.740 | And I see some young women,
00:34:07.980 | I see some professional young women,
00:34:09.420 | women in their 20s and 30s,
00:34:10.900 | who are putting up sexy photos of themselves.
00:34:12.780 | Like, and this is so sad, so sad, don't be doing this.
00:34:16.100 | - Yeah, see, the thing where I disagree a little bit is,
00:34:20.420 | I agree with you in the short term,
00:34:22.660 | but in the long term,
00:34:23.500 | I feel it's the responsibility of social media,
00:34:26.180 | not in some kind of ethical way,
00:34:27.700 | not just in an ethical way,
00:34:29.380 | but it'll actually be good for the product
00:34:31.100 | and for the company to maximize the long-term happiness
00:34:34.740 | and well-being of the person.
00:34:36.580 | So not just engagement.
00:34:37.700 | So consider--
00:34:38.620 | - But the person is not the customer.
00:34:40.460 | So the thing is not to make them happy,
00:34:42.340 | it's to keep them on.
00:34:43.660 | - That's the way it is currently, ad-driven.
00:34:45.540 | - If we can get a business model, as you're saying,
00:34:47.460 | I'd be all for it.
00:34:48.440 | - And I think that's the way to make much more money.
00:34:51.360 | - So like a subscription model,
00:34:52.620 | where the money comes from paying?
00:34:54.780 | - It's not--
00:34:57.220 | - That would work, wouldn't it?
00:34:58.060 | That would help.
00:34:58.880 | - So subscription definitely would help,
00:34:59.880 | but I'm not sure it's so much,
00:35:01.960 | I mean, a lot of people say it's about the source of money,
00:35:05.160 | but I just think it's about
00:35:06.360 | the fundamental mission of the product.
00:35:09.960 | If you want people to really love a thing,
00:35:13.060 | I think that thing should maximize
00:35:17.780 | your long-term well-being.
00:35:19.760 | - It should, in theory, in morality land, it should.
00:35:23.760 | - I don't think it's just morality land.
00:35:25.400 | I think in business land, too.
00:35:26.840 | But that's maybe a discussion for another day.
00:35:29.000 | We're studying the reality of the way things currently are,
00:35:33.480 | and they are as they are, as the studies are highlighting.
00:35:36.920 | So let us go then in from the land of mental health
00:35:41.920 | for young people to the land of democracy.
00:35:45.480 | By the way, in these big umbrella areas,
00:35:49.440 | is there a connection, is there a correlation
00:35:52.520 | between the mental health of a human mind
00:35:55.880 | and the division of our political discourse?
00:35:59.160 | - Oh yes, oh yes.
00:36:00.800 | So our brains are structured
00:36:02.880 | to be really good at approach and avoid.
00:36:06.260 | So we have circuits, the front left circuit,
00:36:08.640 | this is an oversimplification, but there's some truth to it.
00:36:10.720 | There's what's called the behavioral activation system,
00:36:12.720 | front left cortex.
00:36:13.860 | It's all about approach, opportunity,
00:36:16.480 | kid in a candy store.
00:36:17.640 | And then the front right cortex has circuits
00:36:19.280 | specialized for withdrawal, fear, threat.
00:36:22.040 | And of course, students, I'm a college professor,
00:36:24.840 | and most of us think about our college days like,
00:36:27.120 | you know, yeah, we were anxious at times, but it was fun.
00:36:29.760 | And it was like, I can take all these courses,
00:36:32.120 | I can do all these clubs, I know all these people.
00:36:35.320 | Now imagine if in 2013, all of a sudden,
00:36:38.080 | students are coming in
00:36:39.480 | with their front right cortex hyperactivated,
00:36:41.520 | everything's a threat, everything is dangerous,
00:36:43.980 | there's not enough to go around.
00:36:45.840 | So the front right cortex puts us into
00:36:48.320 | what's called defend mode as opposed to discover mode.
00:36:51.360 | Now let's move up to adults.
00:36:53.000 | Imagine a large, diverse, secular, liberal democracy
00:36:57.520 | in which people are most of the time in discover mode.
00:37:00.760 | And you know, we have a problem,
00:37:01.920 | hmm, let's think how to solve it.
00:37:03.160 | And this is what de Tocqueville said about Americans,
00:37:05.400 | like, there's a problem, we get together,
00:37:07.120 | we figure out how to solve it.
00:37:08.360 | And he said, whereas in England and France,
00:37:10.640 | people would wait for the king to do it.
00:37:12.600 | But here, like, you know, let's roll up our sleeves,
00:37:14.400 | let's do it.
00:37:15.280 | That's the can-do mindset.
00:37:16.820 | That's front left cortex discover mode.
00:37:19.040 | If you have a national shift
00:37:21.920 | of people spending more time in defend mode,
00:37:24.440 | now you, so everything that comes up,
00:37:26.120 | whatever anyone says, you're not looking like,
00:37:27.700 | oh, is there something good about it?
00:37:29.040 | You're thinking, you know, how is this dangerous?
00:37:31.040 | How is this a threat?
00:37:31.880 | How is this violence?
00:37:32.700 | How can I attack this?
00:37:33.540 | How can I, you know, so if you imagine, you know,
00:37:35.880 | God up there with a little lever, like,
00:37:38.200 | okay, let's push everyone over into, you know,
00:37:40.300 | more into discover mode.
00:37:41.520 | And it's like joy breaks out, age of Aquarius.
00:37:44.020 | All right, let's shift them back into,
00:37:45.400 | let's put everyone in defend mode.
00:37:47.320 | And I can't think of a better way
00:37:48.400 | to put people in defend mode
00:37:49.560 | than to have them spend some time on partisan
00:37:52.200 | or political Twitter,
00:37:53.400 | where it's just a stream of horror stories,
00:37:55.840 | including videos about how horrible the other side is.
00:37:58.520 | And it's not just that they're bad people.
00:38:00.240 | It's that if they win this election,
00:38:02.320 | then we lose our country or then it's catastrophe.
00:38:06.100 | So Twitter, and again, we're not saying all of Twitter,
00:38:09.080 | you know, most people aren't on Twitter
00:38:11.320 | and people that are mostly not talking about politics,
00:38:14.260 | but the ones that are on talking about politics
00:38:16.480 | are flooding us with stuff.
00:38:18.680 | All the journalists see it.
00:38:20.020 | All the mainstream media is hugely influenced by Twitter.
00:38:23.720 | So if we put everyone,
00:38:26.320 | if there's more sort of anxiety, sense of threat,
00:38:29.640 | this colors everything.
00:38:30.940 | And then you're not, you know,
00:38:32.760 | the great thing about a democracy and especially,
00:38:36.440 | you know, a legislature that has some diversity in it,
00:38:39.260 | is that the art of politics is that you can grow the pie
00:38:42.720 | and then divide it.
00:38:43.700 | You don't just fight zero sum.
00:38:45.220 | You find ways that we can all get 60% of what we want.
00:38:48.920 | And that ends when everyone's anxious and angry.
00:38:52.260 | - So let's try to start to figure out who's to blame here.
00:38:57.260 | Is it the nature of social media?
00:38:59.540 | Is it the decision of the people
00:39:03.460 | at the heads of social media companies
00:39:04.900 | that they're making in the detailed engineering designs
00:39:07.620 | of the algorithm?
00:39:09.100 | Is it the users of social media that drive narratives
00:39:13.860 | like you mentioned journalists,
00:39:15.900 | that want to maximize drama in order to
00:39:18.940 | drive clicks to their off-site articles?
00:39:25.340 | Is it just human nature that loves drama,
00:39:30.340 | can't look away from an accident when you're driving by?
00:39:33.060 | Is there something to be said about,
00:39:35.340 | the reason I ask these questions is to see,
00:39:38.020 | can we start to figure out what the solution would be
00:39:42.740 | to alleviate, to deescalate the-
00:39:44.780 | - Not yet, not yet.
00:39:45.980 | Let's first, we have to understand,
00:39:47.880 | as we did on the teen mental health thing,
00:39:51.140 | okay, now let's lay out what is the problem?
00:39:53.100 | What's messing up our country?
00:39:54.540 | And then we can talk about solutions.
00:39:56.380 | So it's all the things you said,
00:39:58.020 | interacting in an interesting way.
00:40:00.540 | So human nature is tribal.
00:40:03.260 | We evolved for intergroup conflict.
00:40:05.860 | We love war.
00:40:07.320 | The first time my buddies and I played paintball,
00:40:10.500 | I was 29.
00:40:12.220 | And we were divided into teams with strangers
00:40:14.140 | to shoot guns at each other and kill each other.
00:40:16.700 | And we all, afterwards, it was like,
00:40:18.900 | oh my God, that was incredible.
00:40:21.780 | Like it really felt like we'd opened a room in our hearts
00:40:23.900 | that had never been opened.
00:40:25.420 | But as men, testosterone changes our brains and our bodies
00:40:29.340 | and activates the war stuff, like we've got more stuff.
00:40:31.820 | And that's why boys like certain team sports, it's play war.
00:40:35.900 | So that's who we are.
00:40:37.540 | It doesn't mean we're always tribal.
00:40:38.820 | It doesn't mean we're always wanting to fight.
00:40:40.180 | We're also really good at making peace and cooperation
00:40:42.580 | and finding deals.
00:40:43.620 | We're good at trade and exchange.
00:40:45.500 | So you want your country to,
00:40:48.460 | you want a society that has room for conflict,
00:40:52.060 | ideally over sports, like that's great.
00:40:53.900 | That's totally, it's not just harmless, it's actually good.
00:40:57.060 | But otherwise you want cooperation
00:40:58.660 | to generally prevail in the society.
00:41:00.600 | That's how you create prosperity and peace.
00:41:02.820 | And if you're gonna have a diverse democracy,
00:41:04.740 | you really better focus on cooperation,
00:41:06.900 | not on tribalism and division.
00:41:09.860 | And there's a wonderful book by Yascha Mounk
00:41:11.620 | called "The Great Experiment"
00:41:12.620 | that talks about the difficulty of diversity and democracy
00:41:15.620 | and what we need to do to get this right
00:41:19.340 | and to get the benefits of diversity.
00:41:21.540 | So that's human nature.
00:41:22.940 | Now let's imagine that the technological environment
00:41:26.620 | makes it really easy for us to cooperate.
00:41:28.820 | Let's give everyone telephones and the postal service.
00:41:31.020 | Let's give them email, like, wow,
00:41:32.900 | we can do all these things together with people far away.
00:41:35.060 | It's amazing.
00:41:35.900 | Now, instead of that, let's give them a technology
00:41:38.540 | that encourages them to fight.
00:41:40.100 | So early Facebook and Twitter were generally lovely places.
00:41:44.300 | People old enough to remember, like, they were fun.
00:41:46.700 | There was a lot of humor.
00:41:48.260 | You didn't feel like you were gonna get your head blown off
00:41:50.100 | no matter what you said.
00:41:51.300 | 2007, 2008, 2009, it was still fun.
00:41:55.460 | These were nice places mostly.
00:41:57.020 | And like almost all the platforms start off as nice places.
00:42:00.340 | But, and this is the key thing in the article,
00:42:03.540 | in the Atlantic article on Babel, on after Babel.
00:42:06.580 | - The Atlantic article, by the way,
00:42:07.820 | is why the past 10 years of American life
00:42:09.700 | have been uniquely stupid.
00:42:11.460 | - Yeah, my title in the magazine was,
00:42:15.020 | After Babel, Adapting to a World We Can No Longer Share,
00:42:18.420 | is what I proposed.
00:42:19.260 | But they A/B tested, what's the title
00:42:21.820 | that gets the most clicks,
00:42:22.740 | and it was why the past 10 years have been uniquely stupid.
00:42:24.500 | - So Babel, the Tower of Babel,
00:42:25.940 | is a driving metaphor in the piece.
00:42:29.220 | So first of all, what is it, what's the Tower of Babel?
00:42:31.700 | What's Babel, what are we talking about?
00:42:33.140 | - Okay, so the Tower of Babel is a story early in Genesis
00:42:36.460 | where the descendants of Noah are spreading out
00:42:39.100 | and repopulating the world.
00:42:40.700 | And they're on the plain of Shinar,
00:42:42.420 | and they say, "Let us build us a city with a tower
00:42:45.340 | to make a name for ourselves, lest we be scattered again."
00:42:48.300 | And so it's a very short story, there's not a lot in it,
00:42:50.740 | but it looks like they're saying,
00:42:52.660 | we don't want God to flood us again,
00:42:54.060 | let's build a city and a tower to reach the heavens.
00:42:57.660 | And God is offended by the hubris of these people
00:43:00.780 | acting again like gods.
00:43:02.560 | And he says, and here's the key line,
00:43:05.700 | he says, "Let us go down and confuse their language
00:43:10.060 | so that they may not understand one another."
00:43:13.100 | So in the story, he doesn't literally knock the tower over,
00:43:15.920 | but many of us have seen images or movie dramatizations
00:43:20.420 | where a great wind comes and the tower is knocked over
00:43:23.500 | and the people are left wandering amid the rubble,
00:43:26.500 | unable to talk to each other.
00:43:28.540 | So I've been grappling, I've been trying to say,
00:43:30.740 | what the hell happened to our society?
00:43:33.700 | Beginning in 2014, what the hell is happening
00:43:35.580 | to universities?
00:43:36.420 | And then it spread out from universities,
00:43:38.140 | it hit journalism, the arts, and now it's all over companies.
00:43:41.300 | What the hell happened to us?
00:43:44.060 | And it wasn't until I reread the Babel story
00:43:46.220 | a couple of years ago that I thought,
00:43:47.780 | whoa, this is it, this is the metaphor.
00:43:51.100 | Because I'd been thinking about tribalism
00:43:53.620 | and left-right battles and war,
00:43:55.620 | and that's easy to think about.
00:43:57.880 | But Babel isn't like, and God said,
00:44:00.180 | "Let half of the people hate the other half."
00:44:01.860 | No, it wasn't that.
00:44:02.860 | It's God said, "Let us confuse their language
00:44:05.140 | "that they, none of them can understand each other
00:44:07.420 | "ever again, or at least for a while."
00:44:09.740 | So it's a story about fragmentation.
00:44:12.780 | And that's what's unique about our time.
00:44:14.180 | So Meta or Facebook wrote a rebuttal to my article.
00:44:19.180 | They disputed what I said, and one of their arguments was,
00:44:23.380 | oh, but polarization goes back way before social media
00:44:27.060 | and it was happening in the '90s.
00:44:29.780 | And they're right, it does.
00:44:31.100 | And I did say that, but I should have said it more clearly
00:44:33.620 | with more examples.
00:44:34.940 | But here's the new thing.
00:44:36.660 | Even though left and right were beginning
00:44:38.300 | to hate each other more,
00:44:39.700 | we weren't afraid of the person next to us.
00:44:42.100 | We weren't afraid of each other.
00:44:43.740 | Cable TV, Fox News, whatever you want to point to
00:44:47.020 | about increasing polarization,
00:44:48.740 | it didn't make me afraid of my students.
00:44:50.980 | And that was new in around 2014, 2015.
00:44:53.620 | We started hearing, getting articles,
00:44:55.980 | I'm a liberal professor and my liberal students terrify me.
00:44:58.980 | It was in Vox in 2015.
00:45:00.820 | And that was after Greg and I had turned in
00:45:02.220 | the first draft of our coddling article.
00:45:05.900 | And surveys show over and over again,
00:45:07.440 | students are not as afraid of their professors,
00:45:09.580 | they're actually afraid of other students.
00:45:11.420 | Most students are lovely.
00:45:12.740 | It's not like the whole generation has lost their minds.
00:45:15.620 | What happens, a small number,
00:45:17.460 | a small number are adept at using social media
00:45:22.140 | to destroy anyone that they think
00:45:25.040 | they can get credit for destroying.
00:45:27.880 | And the bizarre thing is it's never,
00:45:30.060 | it's rarely about what ideas you express.
00:45:31.860 | It's usually about a word, like he used this word,
00:45:35.020 | or this was insensitive,
00:45:36.420 | or I can link this word to that word.
00:45:38.760 | So they don't even engage with ideas and arguments.
00:45:42.000 | It's a real sort of gotcha, prosecutable,
00:45:45.460 | it's like a witch trial mindset.
00:45:47.220 | - So the unique thing here,
00:45:50.140 | there's something about social media in those years
00:45:53.000 | that a small number of people
00:45:54.900 | can sort of be catalysts for this division.
00:45:56.820 | They can start the viral wave
00:45:58.800 | that leads to a division that's different
00:46:00.980 | than the kind of division we saw before.
00:46:02.620 | - It's a little different than a viral wave.
00:46:04.020 | Once you get some people
00:46:05.260 | who can use social media to intimidate,
00:46:08.260 | you get a sudden phase shift.
00:46:10.340 | You get a big change in the dynamics of groups.
00:46:12.740 | And that's the heart of the article.
00:46:13.900 | This isn't just another article
00:46:15.300 | about how social media is polarizing us
00:46:16.980 | and destroying democracy.
00:46:18.580 | The heart of the article is an analysis
00:46:21.180 | of what makes groups smart and what makes them stupid.
00:46:24.420 | And so because, as we said earlier,
00:46:27.660 | my own research is on post-hoc reasoning,
00:46:30.200 | post-hoc justification, rationalization.
00:46:32.240 | The only cure for that is other people
00:46:33.720 | who don't share your biases.
00:46:35.760 | And so if you have an academic debate,
00:46:37.800 | as like the one I'm having with these other researchers
00:46:40.840 | over social media, I write something, they write something,
00:46:43.680 | I have to take account of their arguments
00:46:45.160 | and they have to take account of mine.
00:46:46.880 | When the academic world works,
00:46:48.240 | it's because it puts us together
00:46:49.640 | in ways that things cancel out.
00:46:51.160 | That's what makes universities smart.
00:46:53.360 | It's what makes them generators of knowledge.
00:46:55.600 | Unless we stop dissent, what if we say,
00:46:59.020 | on these topics, there can be no dissent.
00:47:01.380 | And if anyone says otherwise,
00:47:02.820 | if any academic comes up with research that says otherwise,
00:47:05.680 | we're gonna destroy them.
00:47:06.980 | And if any academic even tweets a study
00:47:10.260 | contradicting what is the official word,
00:47:13.660 | we're gonna destroy them.
00:47:14.500 | And that was the famous case of David Shor,
00:47:16.420 | who in the days after George Floyd was killed
00:47:19.420 | and there were protests, and the question is,
00:47:21.620 | are these protests gonna be productive?
00:47:23.260 | Are they gonna backfire?
00:47:24.860 | Now, most of them were peaceful, but some were violent.
00:47:27.400 | And he tweeted a study, he just simply tweeted a study
00:47:30.780 | done by an African-American,
00:47:32.400 | I think sociologist at Princeton, Omar Wasow.
00:47:35.300 | And Wasow's study showed that when you look back
00:47:38.300 | at the '60s, you see that where there were violent protests,
00:47:40.500 | it tended to backfire, peaceful protests tend to work.
00:47:43.740 | And so he simply tweeted that study.
00:47:46.340 | And there was a Twitter mob after him,
00:47:48.080 | this was insensitive, this was anti-black,
00:47:51.740 | I think he was accused of,
00:47:53.100 | and he was fired within a day or two.
00:47:55.140 | So this is the kind of dynamic.
00:47:56.740 | This is not caused by cable TV.
00:47:58.580 | This is not caused, this is something new.
00:48:00.780 | - Can I, just on a small tangent there,
00:48:04.260 | in that situation, because it happens time and time again,
00:48:06.940 | you highlight in your current work,
00:48:08.860 | but also in "The Coddling of the American Mind,"
00:48:11.160 | is the blame on the mob, the mechanisms that enables the mob
00:48:17.780 | or the people that do the firing?
00:48:19.720 | The administration does the firing.
00:48:21.660 | - Yeah, it's all of them.
00:48:22.780 | - Well, can I, I sometimes feel
00:48:26.700 | that we don't put enough blame
00:48:28.380 | on the people that do the firing.
00:48:30.940 | Which is, that feels like,
00:48:33.100 | in the long arc of human history,
00:48:36.000 | that is the place for courage and for ideals, right?
00:48:40.380 | That's where it stops.
00:48:41.860 | That's where the buck stops.
00:48:43.820 | So there's going to be new mechanisms for mobs
00:48:46.900 | and all that kind of stuff.
00:48:47.740 | There's going to be tribalism.
00:48:49.540 | But at the end of the day,
00:48:50.500 | that's what it means to be a leader,
00:48:52.300 | is to stop the mob at the door.
00:48:56.000 | - But I'm a social psychologist,
00:48:57.380 | which means I look at the social forces that work on people.
00:49:00.380 | And if you show me a situation
00:49:02.220 | in which 95% of the people behave one way,
00:49:05.420 | and it's a way that we find surprising and shameful,
00:49:08.620 | I'm not going to say, wow, 95% of the people are shameful.
00:49:11.220 | I'm going to say, wow, what a powerful situation.
00:49:14.420 | We've got to change that situation.
00:49:16.420 | So that's what I think is happening here,
00:49:17.900 | because there are hardly any in the first few years,
00:49:20.540 | you know, it began around 2018, 2019,
00:49:22.420 | it really enters the corporate world.
00:49:24.140 | There are hardly any leaders who stood up against it.
00:49:26.700 | But I've talked to a lot, and it's always the same thing.
00:49:29.700 | You have these, you know, people in their,
00:49:33.020 | usually in their 50s or 60s,
00:49:35.100 | generally they're progressive or on the left.
00:49:37.660 | They're accused of things by their young employees.
00:49:40.340 | They don't have the vocabulary to stand up to it.
00:49:43.780 | And they give in very quickly.
00:49:45.520 | And because it happens over and over again,
00:49:47.300 | and there's only a few examples of university presidents
00:49:49.580 | who said like, no, we're not going to stop this talk
00:49:52.380 | just because you're freaking out.
00:49:53.620 | No, you know, we're not going to fire this professor
00:49:55.540 | because he wrote a paper that you don't like.
00:49:59.180 | There are so few examples,
00:50:01.100 | I have to conclude that the situational forces are so strong.
00:50:04.220 | Now, I think we are seeing,
00:50:06.220 | we are seeing a reversal in the last few weeks or months.
00:50:10.140 | A clear sign of that is that the New York Times
00:50:12.340 | actually came out with an editorial
00:50:14.700 | from the editorial board
00:50:16.160 | saying that free speech is important.
00:50:18.660 | Now that's amazing that the Times had the guts
00:50:21.380 | to stand up for free speech because, you know,
00:50:24.100 | they're the people, well,
00:50:27.420 | what's been happening with the Times
00:50:28.580 | is that they've allowed Twitter
00:50:29.780 | to become the editorial board.
00:50:31.500 | Twitter has control over the New York Times
00:50:33.940 | and the New York Times literally will change articles.
00:50:36.780 | I have an essay in Politico with Nadine Strossen,
00:50:39.540 | Steve Pinker and Pamela Paresky
00:50:41.540 | on how the New York Times
00:50:44.740 | retracted and changed an editorial by Bret Stephens.
00:50:48.180 | And they did it in a sneaky way and they lied about it.
00:50:50.460 | And they did this out of fear because he mentioned IQ.
00:50:53.820 | He mentioned IQ and Jews.
00:50:55.820 | And then he went on to say,
00:50:57.540 | it probably isn't a genetic thing,
00:50:58.700 | it's probably cultural, but he mentioned it.
00:51:00.540 | And the New York Times, I mean, they were really cowardly.
00:51:03.540 | Now, I think they, from what I hear,
00:51:05.500 | they know that they were cowardly.
00:51:07.060 | They know that they should not have fired James Bennet.
00:51:10.480 | They know that they gave in to the mob.
00:51:12.300 | And that's why they're now poking their head up
00:51:13.800 | above the parapet and they're saying,
00:51:15.320 | oh, we think that free speech is important.
00:51:16.860 | And then of course they got their heads blown off
00:51:18.220 | 'cause Twitter reacted like, how dare you say this?
00:51:20.500 | Are you saying racist speech is okay?
00:51:22.400 | But they didn't back down.
00:51:23.300 | They didn't retract it.
00:51:24.580 | They didn't apologize for defending free speech.
00:51:27.020 | So I think the Times might be coming back.
00:51:30.540 | - Can I ask you an opinion on something here?
00:51:32.420 | What, in terms of the Times coming back,
00:51:34.820 | in terms of Twitter being the editorial board
00:51:38.140 | for the prestigious journalistic organizations,
00:51:43.580 | what's the importance of the role of Mr. Elon Musk in this?
00:51:48.020 | So, you know, it's all fun and games,
00:51:52.260 | but here's a human who tweets about the importance
00:51:55.060 | of freedom of speech and buys Twitter.
00:51:59.020 | What are your thoughts on the influence,
00:52:02.300 | the positive and the negative possible consequences
00:52:05.220 | of this particular action?
00:52:07.060 | - So, you know, if he is gonna succeed
00:52:09.460 | and if he's gonna be one of the major reasons
00:52:11.860 | why we decarbonize quickly and why we get to Mars,
00:52:14.820 | then I'm willing to cut him a lot of slack.
00:52:17.520 | So I have an overall positive view of him.
00:52:19.780 | Now where I'm concerned and where I'm critical
00:52:22.220 | is we're in the middle of a raging culture war
00:52:25.180 | and this culture war is making our institutions stupid.
00:52:28.520 | It's making them fail.
00:52:30.220 | This culture war, I think, could destroy our country.
00:52:32.780 | And by destroy, I mean we could descend
00:52:34.820 | into constant constitutional crises, a lot more violence.
00:52:38.460 | You know, not that we're gonna disappear,
00:52:39.700 | not that we're gonna kill each other,
00:52:40.800 | but I think there will be a lot more violence.
00:52:42.620 | So we're in the middle of this raging culture war.
00:52:45.540 | It's possibly turning to violence.
00:52:48.220 | You need to not add fuel to the fire.
00:52:51.380 | And the fact that he declared
00:52:53.020 | that he's gonna be a Republican
00:52:54.660 | and the Democrats are the bad party.
00:52:56.460 | And, you know, as an individual citizen,
00:52:59.340 | he's entitled to his opinion, of course,
00:53:01.420 | but as an influential citizen,
00:53:03.300 | he should at least be thoughtful.
00:53:06.080 | And more importantly,
00:53:10.620 | companies need and I think would benefit
00:53:14.640 | from a Geneva Convention for the culture war
00:53:16.980 | in which, 'cause they're all being damaged
00:53:20.740 | by the culture war coming to the companies.
00:53:22.820 | What we need to get to, I hope,
00:53:24.660 | is a place where companies do,
00:53:27.620 | they have strong ethical obligations
00:53:30.100 | about the effects that they cause,
00:53:31.420 | about how they treat their employees,
00:53:32.620 | about their supply chain.
00:53:33.500 | They have strong ethical obligations,
00:53:35.980 | but they should not be weighing in on culture war issues.
00:53:39.220 | - Well, if I can read the exact tweet,
00:53:41.220 | 'cause part of the tweet I like,
00:53:43.220 | he said, "In the past, I voted Democrat
00:53:45.780 | "because they were mostly the kindness party,
00:53:50.260 | "but they have become the party of division and hate,
00:53:53.940 | "so I can no longer support them and will vote Republican."
00:53:58.340 | And then he finishes with,
00:53:59.400 | "Now watch their dirty tricks campaign against me unfold."
00:54:03.460 | Okay.
00:54:04.860 | - What do you make of that?
00:54:05.700 | Like, what do you think he was thinking
00:54:07.220 | that he came out so blatantly as a partisan?
00:54:10.420 | - Because he's probably communicating with the board,
00:54:13.060 | with the people inside Twitter,
00:54:14.340 | and he's clearly seeing the lean.
00:54:16.740 | And he's responding to that lean.
00:54:18.340 | He's also opening the door to the potential,
00:54:23.260 | bringing back the former president onto the platform,
00:54:27.900 | and also bringing back,
00:54:29.820 | which he's probably looking at the numbers
00:54:31.540 | of the people who are behind Truth Social,
00:54:34.300 | saying that, okay,
00:54:35.660 | it seems that there's a strong lean in Twitter
00:54:39.460 | in terms of the left.
00:54:41.620 | And in fact, from what I see,
00:54:45.340 | it seems like the current operation of Twitter
00:54:49.940 | is the extremes of the left get outraged by something,
00:54:54.940 | and the extremes of the right point out
00:54:58.580 | how the left is ridiculous.
00:55:00.300 | Like, that seems to be the mechanism.
00:55:02.600 | And that's the source of the drama,
00:55:07.120 | and then the left gets very mad at the right
00:55:09.920 | that points out the ridiculousness,
00:55:11.360 | and there's this vicious kind of cycle.
00:55:13.320 | - That's the polarization cycle.
00:55:14.720 | That's what we're in.
00:55:15.880 | - There's something that happened here
00:55:17.160 | that there's a shift where,
00:55:19.440 | there's a decline, I would say,
00:55:20.400 | in both parties towards being shitty.
00:55:23.080 | - Okay, but look,
00:55:23.920 | with everything with the parties, that's not the issue.
00:55:26.120 | The issue is, should the most important CEO in America,
00:55:29.560 | the CEO of some of our biggest and most important companies,
00:55:32.520 | so let's imagine five years from now,
00:55:35.360 | two different worlds.
00:55:36.320 | In one world, the CEO of every Fortune 500 company has said,
00:55:41.320 | I'm a Republican because I hate those douchebags,
00:55:44.000 | or I'm a Democrat because I hate those Nazi racists.
00:55:47.160 | That's one world where everybody,
00:55:48.600 | everybody puts up a thing in their window,
00:55:50.240 | everybody, it's culture war everywhere, all the time,
00:55:53.020 | 24 hours a day.
00:55:54.360 | You pick a doctor based on whether he's red or blue.
00:55:56.560 | Everything is culture war.
00:55:57.880 | That's one possible future, which we're headed towards.
00:56:00.100 | The other is, we say, you know what?
00:56:02.320 | Political conflict should be confined to political spaces.
00:56:05.080 | There is a room for protest,
00:56:06.200 | but you don't go protesting at people's private homes.
00:56:08.360 | You don't go threatening their children.
00:56:09.520 | You don't go doxing them.
00:56:11.160 | We have to have channels
00:56:12.240 | that are not culture war all the time.
00:56:14.560 | When you go shopping, when you go to a restaurant,
00:56:16.440 | you shouldn't be yelled at and screamed at.
00:56:18.520 | When you buy a product,
00:56:19.400 | you should be able to buy products from an excellent company.
00:56:21.200 | You shouldn't have to always think, what's the CEO?
00:56:23.520 | What is the, I mean, what an insane world,
00:56:25.080 | but that's where we're heading.
00:56:25.920 | So I think that Elon did a really bad thing
00:56:29.040 | in launching that tweet.
00:56:31.360 | That was, I think, really throwing fuel on a fire
00:56:33.760 | and setting a norm in which businesses
00:56:36.680 | are gonna get even more politicized than they are.
00:56:39.000 | - And you're saying specifically the problem
00:56:41.360 | was that he picked the side.
00:56:43.280 | - As the head of, yes, as the CEO,
00:56:45.480 | as the head of several major companies,
00:56:48.280 | of course we can find out what his views are.
00:56:50.720 | You know, it's not like it's, I mean,
00:56:52.040 | actually with him, it's maybe hard to know,
00:56:53.320 | but it's not that a CEO can't be a partisan or have views,
00:56:58.320 | but to publicly declare it in that way,
00:57:00.880 | in such a really insulting way,
00:57:03.800 | this is throwing fuel on the fire
00:57:05.040 | and it's setting a precedent that corporations
00:57:07.880 | are major players in the culture war.
00:57:09.720 | And I'm trying to reverse that.
00:57:10.720 | We've got to pull back from that.
00:57:11.880 | - Let me play devil's advocate here.
00:57:13.400 | So, 'cause I've gotten a chance to interact
00:57:15.640 | with quite a few CEOs,
00:57:17.200 | there is also a value for authenticity.
00:57:22.200 | So I'm guessing this was written while sitting
00:57:24.920 | on the toilet and I could see in a day from now
00:57:30.200 | saying, "LOL, just kidding."
00:57:32.480 | There's a humor, there's a lightness,
00:57:35.920 | there's a chaos element, and that's, chaos is not--
00:57:39.600 | - Yeah, that's not what we need right now.
00:57:40.920 | We don't need more chaos.
00:57:41.880 | - Well, so yes, there's a balance here.
00:57:45.220 | The chaos isn't engineered chaos,
00:57:47.680 | it's really authentically who he is.
00:57:50.060 | And I would like to say that there's--
00:57:51.640 | - I agree with that.
00:57:52.480 | - That's a trade-off, because if you become a politician,
00:57:55.760 | so there's a trade-off between, in this case,
00:57:58.040 | maybe authenticity and civility, maybe,
00:58:01.600 | like being calculating about the impact you have
00:58:06.600 | with your words versus just being yourself.
00:58:09.120 | And I'm not sure calculating is also a slippery slope.
00:58:13.000 | Both are slippery slopes, you have to be careful.
00:58:15.120 | - So when we have conversations in a vacuum
00:58:18.080 | and we just say like, "What should a person do?"
00:58:20.400 | Those are very hard, but our world is actually structured
00:58:23.300 | into domains and institutions.
00:58:25.520 | And if it's just like, "Oh, talking here among our friends,
00:58:29.200 | "like we should be authentic," sure.
00:58:31.160 | But the CEO of a company has fiduciary duties,
00:58:34.460 | legal fiduciary duties to the company,
00:58:36.480 | he owes loyalty to the company.
00:58:38.640 | And if he is using the company for his own political gain
00:58:42.920 | or other purposes or social standing,
00:58:44.960 | that's a violation of his fiduciary duty to the company.
00:58:48.440 | Now there's debate among scholars
00:58:49.640 | whether your fiduciary duty is to the shareholders,
00:58:51.520 | I don't think it's the shareholders.
00:58:53.160 | I think many legal experts say
00:58:55.120 | the company's a legal person,
00:58:57.600 | you have duties to the company,
00:58:58.820 | employees owe a duty to the company.
00:59:00.800 | So he's got those duties and I think he,
00:59:03.920 | you can say he's being authentic,
00:59:05.000 | but he's also violating those duties.
00:59:06.960 | So it's not necessarily he's violating a law by doing it,
00:59:10.140 | but he certainly is shredding any notion
00:59:12.880 | of professional ethics around leadership
00:59:14.840 | of a company in the modern age.
00:59:16.760 | - I think you have to take it in the full context
00:59:18.520 | because you see that he's not being a political player,
00:59:23.000 | he's just saying quit being douchey.
00:59:24.920 | - Suppose the CEO of Ford says,
00:59:27.240 | "You know what, let's pick a group."
00:59:30.320 | Well, I shouldn't do a racial group
00:59:32.000 | 'cause that would be different.
00:59:32.880 | Let's just say, "You know what,
00:59:35.080 | "left-handed people are douchebags, I hate them."
00:59:37.400 | Like, why would you say that?
00:59:38.880 | Like, why would you drag away left-handed people?
00:59:40.440 | - What you said now is not either funny or light-hearted
00:59:45.240 | because I hate them, it wasn't funny.
00:59:47.160 | I'm not picking on you, I'm saying that statement.
00:59:49.640 | Words matter, there's a lightness to the statement
00:59:53.000 | in the full context.
00:59:54.080 | If you look at the timeline of the man,
00:59:57.160 | there's ridiculous memes and there's nonstop jokes.
01:00:01.060 | My big problem with the CEO of Ford
01:00:03.280 | is there's never any of that.
01:00:04.760 | Not only is there none of that,
01:00:06.920 | there's not a celebration of the beauty of the engineering
01:00:09.420 | of the different products.
01:00:10.800 | It's all political speak,
01:00:12.760 | channeled through multiple meetings of PR.
01:00:15.480 | There's levels upon levels upon levels
01:00:18.540 | where you think that it's really not authentic.
01:00:22.640 | And there, you're actually, by being polite,
01:00:26.880 | by being civil, you're actually playing politics
01:00:29.800 | because all of your actual political decision-making
01:00:34.120 | is done in the back channels.
01:00:36.060 | That's obvious.
01:00:37.120 | Here, here's a human being authentic
01:00:41.320 | and actually struggling with some of the ideas
01:00:43.440 | and having fun with it.
01:00:45.220 | I think this lightness represents the kind of positivity
01:00:50.220 | that we should be striving for.
01:00:52.260 | It's funny to say that
01:00:53.660 | because you're looking at these statements
01:00:56.020 | and they seem negative,
01:00:57.420 | but in the full context of social media,
01:00:59.600 | I don't know if they are.
01:01:01.140 | - But look at what you just said, in the full context.
01:01:03.020 | You're taking his tweets in context.
01:01:05.620 | You know who doesn't do that?
01:01:07.060 | Twitter.
01:01:07.900 | Like, that's the Twitter--
01:01:09.420 | - Well, that's the problem you're highlighting.
01:01:10.260 | - The rule of Twitter is there is no context.
01:01:12.220 | Everything is taken in the maximum possible way.
01:01:14.140 | There is no context.
01:01:14.980 | - Oh, you're saying--
01:01:15.800 | - So this is not like,
01:01:16.640 | you know, yes, I wish we did take people in context.
01:01:20.020 | I wish we lived in that world.
01:01:21.220 | But now that we have Twitter and Facebook,
01:01:22.420 | we don't live in that world anymore.
01:01:23.500 | - So you're saying it is a bit of responsibility
01:01:26.580 | for people with a large platform
01:01:28.580 | to consider the fact that
01:01:30.580 | there is the fundamental mechanism of Twitter
01:01:32.500 | where people don't give you the benefit of the doubt.
01:01:34.900 | - Well, I don't wanna hang it on large platform
01:01:36.600 | because then that's what a lot of people say.
01:01:38.180 | Like, well, you know, she shouldn't say that
01:01:40.060 | because she has a large platform
01:01:41.220 | and she should say things that agree with my politics.
01:01:43.300 | I don't wanna hang it on large platform.
01:01:44.820 | I wanna hang it on CEO of a company.
01:01:47.660 | CEOs of a company have duties and responsibilities.
01:01:50.780 | And, you know, Scott Galloway,
01:01:52.300 | I think is very clear about this.
01:01:53.500 | You know, he criticized Elon a lot
01:01:54.780 | as being a really bad role model for young men.
01:01:57.500 | Young men need role models
01:01:59.080 | and he is a very appealing, attractive role model.
01:02:01.540 | - So I agree with you,
01:02:02.420 | but in terms of being a role model,
01:02:04.820 | I think, I don't wanna put a responsibility on people,
01:02:08.540 | but yes, he could be a much, much better role model.
01:02:11.380 | There's--
01:02:12.220 | - You can't insult sitting senators by calling them old.
01:02:14.020 | I mean, that's, you know.
01:02:15.280 | - Yeah, I won't do a both-side-ism of like,
01:02:19.680 | well, those senators can be assholes too.
01:02:21.540 | - Yeah, yes, yes. - But that's fair enough.
01:02:23.420 | Respond intelligently, as I tweeted,
01:02:25.460 | to unintelligent treatment, yes, yes.
01:02:29.240 | So the reason I like, and he's now a friend,
01:02:31.420 | the reason I like Elon is because of the engineering,
01:02:34.340 | because of the work he does.
01:02:35.580 | - No, I admire him enormously for that.
01:02:37.340 | - But what I admire on the Twitter side is the authenticity
01:02:41.200 | because I've been a little bit jaded and worn out
01:02:43.460 | by people who have built up walls,
01:02:48.020 | people in the position of power,
01:02:49.260 | the CEOs and the politicians who built up walls
01:02:51.660 | and you don't see the real person.
01:02:53.000 | That's one of the reasons I love long-form podcasting,
01:02:55.500 | is especially if you talk more than 10 minutes,
01:02:58.240 | it's hard to have a wall up.
01:02:59.940 | It all kind of crumbles away.
01:03:01.640 | So I don't know, but yes, yes, you're right.
01:03:03.840 | That is a step backwards to say,
01:03:10.340 | at least to me, the biggest problem is to pick sides,
01:03:12.840 | to say I'm not going to vote this way or that way.
01:03:16.140 | That's, like leave that to the politicians.
01:03:21.140 | You have much, like the importance of social media
01:03:26.000 | is far bigger than the bickering,
01:03:30.680 | the short-term bickering of any one political party.
01:03:33.080 | It's a platform where we make progress,
01:03:35.760 | where we develop ideas through sort of rigorous discourse,
01:03:40.760 | all those kinds of things.
01:03:43.420 | - So, okay, so here's an idea about social media
01:03:45.540 | developed through social media from Elon,
01:03:48.060 | which is everyone freaks out because they think either,
01:03:52.620 | oh, he's going to do less content moderation.
01:03:54.620 | The left is freaking out
01:03:55.540 | because they want more content moderation.
01:03:57.240 | The right is celebrating
01:03:58.140 | because they think the people doing the content moderation
01:03:59.940 | are on the left.
01:04:01.060 | But there was one, I think it was a tweet,
01:04:04.180 | where he said like three things
01:04:05.220 | he was going to do to make it better.
01:04:06.380 | And it was, I would defeat the bots or something.
01:04:08.080 | But he said, authenticate all humans.
01:04:11.460 | And this is a hugely important statement.
01:04:13.500 | And it's pretty powerful that this guy
01:04:15.580 | can put three words in a tweet.
01:04:17.220 | And actually, I think this could change the world.
01:04:19.140 | Even if the bid fails, the fact that Elon said that,
01:04:22.460 | that he thinks we need to authenticate all humans is huge.
01:04:26.260 | Because now we're talking about solutions here.
01:04:29.900 | What can we do to make social media a better place
01:04:32.700 | for democracy,
01:04:33.540 | a place that actually makes democracy better?
01:04:35.940 | As Tristan Harris has pointed out,
01:04:38.140 | social media and digital technology,
01:04:39.820 | the Chinese are using it really skillfully
01:04:41.700 | to make a better authoritarian nation.
01:04:44.300 | And by better, I don't mean morally better.
01:04:45.660 | I mean like more stable, successful.
01:04:48.860 | Whereas we're using it to make ourselves weaker,
01:04:51.060 | more fragmented, and more insane.
01:04:53.500 | So we're on the way down.
01:04:55.820 | We're in big trouble.
01:04:57.500 | And all the argument is about content moderation.
01:05:01.180 | And what we learned from Frances Haugen
01:05:02.720 | is that what, five or 10% of what they might call
01:05:06.740 | hate speech gets caught, 1% of violence and intimidation.
01:05:09.900 | Content moderation, even if we do a lot more of it,
01:05:11.660 | isn't gonna make a big difference.
01:05:13.700 | All the power is in the dynamics, in changes to the architecture.
01:05:18.100 | And as I said in my Atlantic article,
01:05:20.460 | what are the reforms that would matter for social media?
01:05:22.140 | And the number one thing I said,
01:05:23.420 | the number one thing I believe is user authentication
01:05:27.380 | or user verification.
01:05:28.900 | And people freak out and they say like,
01:05:30.700 | oh, but we need anonymous.
01:05:31.700 | Like, yeah, fine, you can be anonymous.
01:05:33.760 | But what I think needs to be done is
01:05:36.460 | anyone can open an account on Twitter, Facebook, whatever,
01:05:39.420 | as long as you're over 16, and that's another piece.
01:05:41.540 | Once you're 16 or 18, at a certain age,
01:05:44.660 | you can be treated like an adult.
01:05:46.780 | You can open an account and you can look, you can read,
01:05:50.220 | and you can make up whatever fake name you want.
01:05:52.740 | But if you want to post,
01:05:54.260 | if you want the viral amplification
01:05:57.420 | on a company that has section 230 protection from lawsuits,
01:06:00.560 | which is a very special privilege,
01:06:01.900 | I understand the need for it,
01:06:03.260 | but it's an incredibly powerful privilege
01:06:05.260 | to protect them from lawsuits.
01:06:07.580 | If you want to be able to post on platforms that,
01:06:10.980 | as we'll get to in the Google Doc,
01:06:13.100 | there's a lot of evidence that they are undermining
01:06:15.180 | and damaging democracy.
01:06:16.520 | Then the company has this minimal responsibility
01:06:20.780 | it has to meet.
01:06:21.820 | Banks have know your customer laws.
01:06:23.460 | You can't just walk up to a bank with a bag of money
01:06:25.420 | that you stole and say, here, deposit this for me.
01:06:27.740 | My name's John Smith.
01:06:29.280 | You have to actually show who you are.
01:06:31.420 | And the bank isn't gonna announce who you are publicly,
01:06:33.460 | but you have to, if they're gonna do business with you,
01:06:35.620 | they need to know you're a real person, not a criminal.
01:06:39.620 | And so there's a lot of schemes for how to do this.
01:06:42.940 | There's multiple levels.
01:06:44.020 | People don't seem to understand this.
01:06:45.340 | Level zero of authentication is nothing.
01:06:48.020 | That's what we have now.
01:06:49.200 | Level one, this might be what Elon meant,
01:06:51.740 | authenticate all humans,
01:06:53.240 | meaning you have to at least pass a CAPTCHA or some test
01:06:55.900 | to show you're not a bot.
01:06:57.300 | There's no identity, there's nothing.
01:06:58.600 | Just something that, you know,
01:07:00.300 | it's a constant cat and mouse struggle with the bots.
01:07:03.500 | So we try to just filter out pure bots.
01:07:06.160 | The next level up, there are a variety of schemes
01:07:08.720 | that allow you to authenticate identity
01:07:11.120 | in ways that are not traceable or kept.
01:07:13.640 | So some, whether you show an ID, whether you use biometric,
01:07:17.380 | whether you have something on the blockchain
01:07:19.580 | that establishes identity, whether it's linked to a phone,
01:07:22.340 | whatever it is, there are multiple schemes now
01:07:24.580 | that companies have figured out for how to do this.
01:07:27.540 | And so if you did that, then in order to get an account
01:07:30.920 | where you have posting privileges on Facebook or Twitter
01:07:33.620 | or TikTok or whatever, you have to at least do that.
01:07:36.860 | And if you do that, you know,
01:07:39.220 | now the other people are real humans too.
01:07:44.220 | And suddenly our public square's a lot nicer
01:07:46.420 | 'cause you don't have bots swarming around.
01:07:48.140 | This would also cut down on trolls.
01:07:49.460 | You still have trolls who use their real name,
01:07:51.420 | but this would just make it a little scarier for trolls.
01:07:54.780 | Some men turn into complete assholes.
01:07:57.100 | They can be very polite in real life,
01:07:59.100 | but some men, as soon as they have the anonymity,
01:08:01.300 | they start using racial slurs.
01:08:03.380 | They're horrible.
01:08:04.820 | One troll can ruin thousands of people's day.
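A minimal sketch of the tiered-verification idea described above: anyone can open a pseudonymous account and read, but posting (and the viral amplification that comes with it) requires at least a humanness check, with private identity verification as a stricter tier. The names and the age cutoff here (VerificationLevel, Account, can_post, 16) are assumptions for illustration, not any platform's actual API.

```python
from dataclasses import dataclass
from enum import IntEnum


class VerificationLevel(IntEnum):
    NONE = 0      # level zero: no check at all (the status quo)
    HUMAN = 1     # level one: passed a CAPTCHA-style "not a bot" test
    IDENTITY = 2  # level two: identity privately verified (ID, phone, biometric)


@dataclass
class Account:
    display_name: str              # any pseudonym; anonymity is preserved
    age: int
    verification: VerificationLevel = VerificationLevel.NONE


def can_read(account: Account) -> bool:
    # Anyone may open an account and read, regardless of verification.
    return True


def can_post(account: Account, min_age: int = 16) -> bool:
    # Posting, and therefore amplification, requires being old enough
    # and having passed at least the humanness check.
    return account.age >= min_age and account.verification >= VerificationLevel.HUMAN


lurker = Account("anon_reader", age=15)
poster = Account("pseudonymous_adult", age=30, verification=VerificationLevel.HUMAN)
print(can_read(lurker), can_post(lurker))   # True False
print(can_read(poster), can_post(poster))   # True True
```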
01:08:08.180 | - You know, I'm somebody who believes in free speech.
01:08:12.640 | And so there's been a lot of discussions about this.
01:08:15.940 | And we'll ask you some questions about this too.
01:08:18.940 | But there is, the tension there is the power of a troll
01:08:26.000 | to ruin the party.
01:08:27.800 | - Yeah, that's right.
01:08:28.940 | - So like this idea of free speech,
01:08:31.060 | boy, do you have to also consider
01:08:35.840 | if you wanna have a private party and enjoy your time,
01:08:39.920 | challenging, lots of disagreement,
01:08:43.640 | debate, all that kind of stuff, but fun.
01:08:45.340 | No like annoying person screaming,
01:08:48.840 | just not disagreeing, but just like spilling
01:08:52.160 | like the drinks all over the place.
01:08:55.920 | Yeah, all that kind of stuff.
01:08:57.400 | So see, you're saying it's a step in the right direction
01:09:02.240 | to at least verify the humanness of a person
01:09:07.240 | while maintaining anonymity.
01:09:08.840 | So that's one step, but the further step,
01:09:12.720 | that's maybe doesn't go all the way
01:09:15.480 | because you can still figure out ways
01:09:17.200 | to create multiple accounts and you can--
01:09:19.080 | - But it's a lot harder.
01:09:19.960 | So actually there's a lot of ways to do this.
01:09:21.840 | There's a lot of creativity out there
01:09:23.120 | about solving this problem.
01:09:24.260 | So if you go to the social media and political dysfunction,
01:09:27.140 | Google Doc that I created with Chris Bail,
01:09:29.800 | and then you go to section 11,
01:09:32.880 | proposals for improving social media.
01:09:35.100 | So we're collecting there now some of the ideas
01:09:39.400 | for how to do user authentication.
01:09:41.800 | And so one is WorldCoin, there's one human-id.org.
01:09:46.480 | This is a new organization created by an NYU Stern student
01:09:50.240 | who just came into my office last week,
01:09:52.240 | working with some other people.
01:09:54.160 | And what they do here is they have a method
01:09:56.360 | of identity verification that is keyed to your phone.
01:10:00.840 | So you do have to have a phone number.
01:10:03.280 | And of course you can buy seven different phone numbers
01:10:06.160 | if you want, but it's gonna be about 20 or $30 a month.
01:10:08.760 | So nobody's gonna buy a thousand phones.
01:10:11.320 | So yeah, you don't have just one unique ID,
01:10:15.600 | but most people do and nobody has a thousand.
01:10:18.400 | So just things like this
01:10:20.600 | that would make an enormous difference.
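For illustration only, one way a phone-keyed scheme like the one just described could cap account creation while preserving pseudonymity is to store a keyed hash of the verified number rather than the number itself. The secret key, the per-phone limit, and the function names below are assumptions for the sketch, not human-id.org's actual protocol.

```python
import hashlib
import hmac

# Assumed platform-side secret; in practice it would live in a secrets
# manager, not in source code.
PLATFORM_KEY = b"example-secret-key"
MAX_ACCOUNTS_PER_PHONE = 3  # assumed limit for the sketch

# Opaque phone token -> number of accounts already registered with it.
accounts_per_token: dict = {}


def phone_token(verified_phone_number: str) -> str:
    """Derive an opaque, stable token from a verified phone number.

    The raw number is never stored, so the account stays pseudonymous,
    but the same phone always maps to the same token."""
    return hmac.new(PLATFORM_KEY, verified_phone_number.encode(),
                    hashlib.sha256).hexdigest()


def register_account(verified_phone_number: str) -> bool:
    """Allow a new posting account only while the phone's quota remains.

    Buying many phone lines is expensive, which is what bounds
    large-scale bot or sock-puppet account creation."""
    token = phone_token(verified_phone_number)
    count = accounts_per_token.get(token, 0)
    if count >= MAX_ACCOUNTS_PER_PHONE:
        return False
    accounts_per_token[token] = count + 1
    return True
```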
01:10:22.520 | So here's the way that I think about it.
01:10:24.800 | Imagine a public square in which the incentives
01:10:27.280 | are to be an asshole,
01:10:28.700 | that the more you kick people in the shins
01:10:30.480 | and spit on them and throw things at them,
01:10:32.680 | the more people applaud you.
01:10:34.080 | Okay, so that's the public square we have now.
01:10:37.420 | Not for most people, but as you said,
01:10:39.080 | just one troll can ruin it for everybody.
01:10:41.280 | If there's a thousand of us in the public square
01:10:43.600 | and 10 are incentivized to kick us and throw shit at us,
01:10:48.240 | it's no fun to be in that public square.
01:10:50.200 | So right now, I think Twitter in particular
01:10:52.760 | is making our public square much worse.
01:10:54.560 | It's making our democracy much weaker,
01:10:56.040 | much more divided, it's bringing us down.
01:10:59.560 | Imagine if we changed the incentives.
01:11:01.560 | Imagine if the incentive was to be constructive.
01:11:04.160 | And so this is an idea that I've been kicking around.
01:11:05.880 | I talked about it with Reid Hoffman last week
01:11:07.340 | and he seemed to think it's a good idea.
01:11:09.380 | And it would be very easy to,
01:11:14.480 | rather than trying to focus on posts,
01:11:16.600 | what post is fake or whatever,
01:11:19.000 | focus on users, what users are incredibly aggressive.
01:11:22.880 | And so people just use a lot of
01:11:24.120 | obscenity and exclamation points.
01:11:27.040 | AI could easily code nastiness or just aggression, hostility.
01:11:30.880 | And imagine if every user is rated
01:11:33.040 | on a one to five scale for that.
01:11:35.840 | And the default, when you open an account
01:11:37.680 | on Twitter or Facebook, the default is four.
01:11:40.600 | You will see everybody who's a four and below,
01:11:43.280 | but you won't even see the people who are fives
01:11:45.400 | and they don't get to see you.
01:11:46.940 | So they can say what they want, free speech.
01:11:49.500 | We're not censoring them, they can say what they want.
01:11:52.060 | But now there's actually an incentive to not be an asshole.
01:11:55.780 | 'Cause the more of an asshole you are,
01:11:57.260 | the more people block you out.
01:11:59.180 | So imagine our country goes in two directions.
01:12:01.220 | In one, things continue to deteriorate
01:12:03.460 | and we have no way to have a public square
01:12:05.180 | in which we could actually talk about things.
01:12:06.780 | And in the other, we actually try to disincentivize
01:12:10.020 | being an asshole and encourage being constructive.
01:12:13.500 | What do you think?
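A minimal sketch of the incentive mechanism floated here, with assumed names and numbers: each user's hostility is the running average of per-post scores (produced elsewhere, by a classifier, crowd ratings, or both), and a viewer's default threshold of four simply hides the most aggressive accounts from the feed without deleting anything they say.

```python
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class User:
    handle: str
    # Per-post hostility ratings on a 1-5 scale, produced elsewhere
    # (by a classifier, crowd ratings, or both).
    post_scores: list = field(default_factory=list)

    @property
    def aggression(self) -> float:
        # Only the average matters; one misclassified post barely moves it.
        return mean(self.post_scores) if self.post_scores else 1.0


def visible_feed(authors, viewer_threshold: float = 4.0):
    """Hide authors whose average hostility exceeds the viewer's threshold.

    Nothing is censored: highly aggressive users can still post,
    they just reach fewer people by default."""
    return [u for u in authors if u.aggression <= viewer_threshold]


calm = User("constructive", [1.5, 2.0, 1.0])
troll = User("shin_kicker", [5.0, 4.8, 4.9])
print([u.handle for u in visible_feed([calm, troll])])  # ['constructive']
```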
01:12:14.540 | - Well, this is because I'm an AI person.
01:12:17.260 | And I very much, ever since I talked to Jack
01:12:20.420 | about the health of conversation,
01:12:22.140 | I've been looking at a lot of the machine learning models
01:12:23.860 | involved and I believe that the nastiness classification
01:12:27.300 | is a difficult problem to solve automatically.
01:12:29.180 | - I'm sure it is.
01:12:30.020 | - So I personally believe in crowdsourced nastiness labeling.
01:12:35.020 | But in an objective way where it doesn't become
01:12:39.780 | viral mob cancellation type of dynamics.
01:12:43.900 | But more sort of objectively,
01:12:46.300 | is this a shitty, almost out of context,
01:12:49.380 | with only local context.
01:12:52.420 | Is this a shitty thing to say at this moment?
01:12:54.300 | Because here's the thing.
01:12:55.420 | - Well, wait, no, but we don't care about individual posts.
01:12:57.780 | - No, no, but--
01:12:58.620 | - All that matters is the average.
01:12:59.660 | - The posts make the man.
01:13:01.060 | - They do, but the point is,
01:13:02.740 | as long as we're talking about averages here,
01:13:04.540 | one person has a misclassified post, it doesn't matter.
01:13:07.500 | - Right, right, right.
01:13:08.340 | Yeah, yeah, but you need to classify posts
01:13:10.580 | in order to build up the average.
01:13:12.020 | That's what I mean.
01:13:13.260 | And so I really like that idea,
01:13:16.700 | the high level idea of incentivizing people
01:13:20.820 | to be less shitty.
01:13:22.300 | 'Cause that's what, we have that incentive in real life.
01:13:25.220 | - Yeah, that's right.
01:13:26.060 | - It's actually really painful to be a full-time asshole,
01:13:29.040 | I think, in physical reality.
01:13:30.980 | - That's right, you'd be cut off.
01:13:32.100 | - It should be also pain to be an asshole on the internet.
01:13:36.700 | There could be different mechanisms for that.
01:13:38.420 | I wish AI was there, machine learning models were there.
01:13:41.340 | They just aren't yet.
01:13:42.900 | But how about we have,
01:13:44.220 | so one track is we have AI machine learning models
01:13:46.900 | and they render a verdict.
01:13:48.060 | Another class is crowdsourcing.
01:13:49.740 | You get, and then whenever the two disagree,
01:13:52.260 | you have staff at Twitter or whatever,
01:13:55.500 | they look at it and they say, "What's going on here?"
01:13:57.380 | And that way you can refine both the AI
01:13:59.700 | and you can refine whatever the algorithms are
01:14:01.340 | for the crowdsourcing.
01:14:02.180 | Because of course that can be gamed
01:14:03.420 | and people can collude, like, "Hey, let's all rate this guy
01:14:05.260 | as really aggressive."
01:14:06.940 | So you wouldn't want just to rely on one single track.
01:14:09.580 | But if you have two tracks, I think you could do it.
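A rough sketch of the two-track arrangement just described, with made-up thresholds and names: a model score and a crowd score are compared for each post, and only the posts where the two tracks disagree get escalated to human staff, whose verdicts can then be fed back to refine both tracks.

```python
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    model_score: float  # hostility score from a machine-learning model, 1-5
    crowd_score: float  # hostility score aggregated from crowd ratings, 1-5


def route(post: Post, disagreement_threshold: float = 1.5) -> str:
    """Decide how a post's hostility rating gets finalized.

    When the two independent tracks roughly agree, accept their average;
    when they diverge, escalate to human review, and use that verdict
    to refine both the model and the crowd-rating rules."""
    if abs(post.model_score - post.crowd_score) <= disagreement_threshold:
        return "auto"          # use the average of the two scores
    return "human_review"      # the tracks disagree; a person decides


agreed = Post("a1", model_score=4.6, crowd_score=4.2)
contested = Post("b2", model_score=1.2, crowd_score=4.9)
print(route(agreed), route(contested))  # auto human_review
```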
01:14:12.940 | - What do you think about this word misinformation
01:14:15.040 | that maybe connects to our two discussions now?
01:14:18.660 | So one is the discussion of social media and democracy.
01:14:23.620 | And then the other is the coddling of the American mind.
01:14:27.060 | I've seen the word misinformation misused
01:14:31.340 | or used as a bullying word, like racism and so on,
01:14:35.700 | which are important concepts to identify,
01:14:41.140 | but they're nevertheless instead overused.
01:14:45.980 | Does that worry you?
01:14:46.900 | Because that seems to be the mechanism
01:14:49.440 | from inside Twitter, from inside Facebook,
01:14:52.540 | to label information you don't like
01:14:54.900 | versus information that's actually
01:14:58.260 | fundamentally harmful to society.
01:15:00.860 | - So I think there is a meaning of disinformation
01:15:03.660 | that is very useful and helpful,
01:15:06.900 | which is when you have a concerted campaign
01:15:09.420 | by Russian agents to plant a story and spread it.
01:15:14.340 | And they've been doing that since the '50s or '40s even.
01:15:17.640 | - That's what this podcast actually is.
01:15:19.700 | - Is what? (laughs)
01:15:21.060 | - It's a disinformation campaign by the Russians.
01:15:21.900 | - Yeah, you seem really Soviet to me, buddy.
01:15:24.340 | - It's subtle, it's between the lines.
01:15:28.820 | Okay, I'm sorry.
01:15:29.660 | - So I think to the extent that there are campaigns
01:15:33.860 | by either foreign agents or just by
01:15:36.500 | the Republican or Democratic parties,
01:15:38.100 | there have been examples of that,
01:15:39.740 | there are all kinds of concerted campaigns
01:15:43.540 | that are intending to confuse or spread lies.
01:15:48.540 | This is the Soviet, the firehose of falsehood tactic.
01:15:52.420 | So it's very useful for that.
01:15:53.760 | All the companies need to have pretty large staffs,
01:15:56.260 | I think, to deal with that,
01:15:57.100 | 'cause that will always be there.
01:15:58.380 | And that is really bad for our country.
01:16:00.680 | So Renée DiResta is just brilliant on this.
01:16:03.700 | Reading her work has really frightened me
01:16:05.620 | and opened my eyes about how easy it is
01:16:07.860 | to manipulate and spread misinformation
01:16:12.420 | and especially polarization.
01:16:14.540 | The Russians have been trying since the '50s,
01:16:17.860 | they would come to America and they would do hate crimes.
01:16:19.580 | They would spray swastikas in synagogues
01:16:21.820 | and they would spray anti-Black slurs.
01:16:24.100 | They try to make Americans feel
01:16:25.420 | that they're as divided as possible.
01:16:27.220 | Most of the debate nowadays, however, is not that.
01:16:31.000 | It seems to be people are talking about
01:16:32.900 | what the other side is saying.
01:16:34.500 | So if you're on the right,
01:16:37.140 | then you're very conscious of the times when,
01:16:39.420 | well, the left wouldn't let us even say,
01:16:42.900 | could COVID be from a lab?
01:16:44.300 | They would, you literally would get shut down
01:16:46.500 | for saying that, and it turns out,
01:16:47.780 | well, we don't know if it's true,
01:16:49.660 | but there's at least a real likelihood that it is,
01:16:51.340 | and it certainly is something
01:16:52.260 | that should have been talked about.
01:16:53.500 | So I tend to stay away from any such discussions.
01:16:57.460 | And the reason is twofold.
01:16:58.980 | One is because they're almost entirely partisan.
01:17:01.540 | It generally is each side thinks
01:17:03.260 | what the other side is saying is misinformation
01:17:07.380 | or disinformation, and they can prove certain examples.
01:17:10.980 | So we're not gonna get anywhere on that.
01:17:12.460 | We certainly are never gonna get 60 votes in the Senate
01:17:14.660 | for anything about that.
01:17:16.780 | I don't think content moderation
01:17:17.940 | is nearly as important as people think.
01:17:19.700 | It has to be done, and it can be improved.
01:17:21.740 | Almost all the action is in the dynamics,
01:17:23.540 | the architecture, the virality,
01:17:25.340 | and then the nature of who is on the platform,
01:17:30.340 | unverified people, and how much amplification they get.
01:17:33.060 | That's what we should be looking at,
01:17:34.660 | rather than wasting so much of our breath
01:17:36.940 | on whether we're gonna do a little more
01:17:38.620 | or a little less content moderation.
01:17:40.180 | - So the true harm to society on average
01:17:44.540 | and over the long term is in the dynamics,
01:17:47.060 | is fundamentally in the dynamics of social media,
01:17:49.780 | not in the subtle choices of content moderation,
01:17:53.780 | aka censorship.
01:17:55.140 | - Exactly.
01:17:55.980 | There've always been conspiracy theories.
01:17:57.900 | You know, "The Turner Diaries" is this book written in 1978.
01:18:01.180 | It introduced the replacement theory to a lot of people.
01:18:05.900 | Timothy McVeigh had it on him when he was captured in 1995
01:18:09.020 | after the Oklahoma City bombing.
01:18:10.340 | It's a kind of a Bible of that fringe,
01:18:12.820 | violent, racist, white supremacist group.
01:18:16.380 | So the killer in Buffalo
01:18:20.860 | was well acquainted with these ideas.
01:18:22.340 | They've been around, but this guy's from a small town.
01:18:25.500 | I forget where he's from.
01:18:26.740 | But he was, as he says in a manifesto,
01:18:30.860 | he was entirely influenced by things he found online.
01:18:33.780 | He was not influenced by anyone he met in person.
01:18:36.300 | Ideas spread and communities can form,
01:18:40.440 | these like micro communities can form
01:18:43.120 | with bizarre and twisted beliefs.
01:18:45.560 | And this is, again, back to the "Atlantic" article.
01:18:49.500 | I've got this amazing quote from Martin Gurri.
01:18:52.500 | I mean, just find it.
01:18:53.500 | But Martin Gurri, he was a former CIA analyst,
01:18:56.660 | wrote this brilliant book called "The Revolt of the Public,"
01:18:59.860 | and he has this great quote.
01:19:04.860 | He says, he talks about how in the age of mass media,
01:19:09.780 | we were all in a sense looking at a mirror,
01:19:11.940 | looking back at us.
01:19:12.860 | And it might've been a distorted mirror,
01:19:14.480 | but we had stories in common, we had facts in common.
01:19:18.580 | It was mass media.
01:19:20.460 | And he describes how the flood of information
01:19:22.480 | with the internet is like a tsunami washing over us.
01:19:25.460 | It has all kinds of effects.
01:19:26.980 | And he says this in an interview on Vox.
01:19:30.340 | He says, "The digital revolution has shattered that mirror.
01:19:33.940 | And now the public inhabits those broken pieces of glass.
01:19:37.760 | So the public isn't one thing.
01:19:39.340 | It's highly fragmented and it's basically mutually hostile.
01:19:43.080 | It's mostly people yelling at each other
01:19:45.260 | and living in bubbles of one sort or another."
01:19:48.060 | And so, we now see clearly there's this little bubble
01:19:51.740 | of just bizarre nastiness in which the killer
01:19:56.360 | in Christchurch and the killer in Norway
01:19:58.460 | and now in Buffalo, they're all put into a community
01:20:01.620 | and posts flow up within that community
01:20:03.220 | by a certain dynamic.
01:20:05.060 | So we can never stamp those words or ideas out.
01:20:09.300 | The question is not, can we stop them from existing?
01:20:12.340 | The question is, what platforms,
01:20:14.820 | what are the platforms by which they spread
01:20:16.900 | all over the world and into every little town
01:20:18.760 | so that the 1% of whatever,
01:20:21.240 | whatever percentage of young men are vulnerable to this,
01:20:23.620 | that they get exposed to it.
01:20:25.640 | It's in the dynamics and the architecture.
01:20:27.820 | - It's a fascinating point to think about
01:20:29.740 | 'cause we often debate and think about
01:20:33.180 | the content moderation, the censorship,
01:20:36.180 | the ideas of free speech, but you're saying,
01:20:38.460 | yes, that's important to talk about,
01:20:40.060 | but much more important is fixing the dynamics.
01:20:43.140 | - That's right, 'cause everyone thinks
01:20:44.120 | if there's regulation, it means censorship.
01:20:45.780 | At least people on the right think
01:20:46.980 | regulation equals censorship.
01:20:49.140 | And I'm trying to say, no, no,
01:20:50.660 | that's only if all we talk about is content moderation.
01:20:53.100 | Well, then yes, that is the framework.
01:20:55.180 | You know, how much or how little do we, you know,
01:20:57.440 | but I don't even wanna talk about that
01:20:59.300 | 'cause all the action is in the dynamics.
01:21:00.960 | That's the point of my article.
01:21:02.080 | It's the architecture changed
01:21:03.560 | and our social world went insane.
01:21:05.440 | - So can you try to steel man the other side?
01:21:09.400 | So the people that might say that social media
01:21:12.640 | is good for society overall,
01:21:15.360 | both in the dimension of mental health,
01:21:18.400 | as Mark said, for teenagers, teenage girls,
01:21:23.000 | and for our democracy.
01:21:24.640 | Yes, there's a lot of negative things,
01:21:26.920 | but that's slices of data.
01:21:30.340 | If you look at the whole, which is difficult to measure,
01:21:33.240 | it's actually good for society.
01:21:35.040 | And to the degree that it's not good,
01:21:37.520 | it's getting better and better.
01:21:38.520 | Is it possible to steel man their point?
01:21:40.200 | - Yeah.
01:21:41.480 | It's hard, but I should be able to do it.
01:21:44.560 | I need to put my money where my mouth is,
01:21:45.960 | and that's a good question.
01:21:47.120 | So on the mental health front,
01:21:48.560 | you know, the argument is usually what they say is,
01:21:52.700 | well, you know, for communities that are cut off,
01:21:54.700 | especially LGBTQ kids, they can find each other.
01:21:59.000 | So it connects kids,
01:22:01.320 | especially kids who wouldn't find connection otherwise.
01:22:04.700 | It exposes you to a range of ideas and content,
01:22:09.300 | and it's fun.
01:22:13.860 | - Is there, in the studies you looked at,
01:22:16.700 | are there inklings of data, maybe early data,
01:22:22.100 | that show that there are positive effects
01:22:23.820 | in terms of self-report data,
01:22:25.120 | or how would you measure behavioral positive?
01:22:28.660 | It's difficult.
01:22:29.700 | - Right, so if you look at how do you feel
01:22:33.220 | when you're on the platform,
01:22:34.700 | you get a mix of positive and negative,
01:22:36.780 | and people say they feel supported.
01:22:38.780 | And this is what Mark was referring to when he said,
01:22:41.140 | you know, there was like 18 criteria,
01:22:42.820 | and on most it was positive, and on some it was negative.
01:22:45.700 | So if you look at how do you feel
01:22:46.580 | while you're using the platform,
01:22:48.460 | look, most kids enjoy it, they're having fun,
01:22:51.620 | but some kids are feeling inferior, cut off, bullied.
01:22:56.620 | So if we're saying what's the average experience
01:23:01.700 | on the platform, that might actually be positive.
01:23:04.420 | If we just measured the hedonics,
01:23:06.020 | like how much fun versus fear is there,
01:23:08.180 | it could well be positive.
01:23:09.480 | But what I'm trying to, okay,
01:23:11.400 | so is that enough steel manning?
01:23:12.740 | Can I?
01:23:13.580 | - That's pretty good.
01:23:15.140 | - Okay.
01:23:15.960 | - You held your breath.
01:23:16.800 | - Yeah.
01:23:18.200 | But what I'm trying to point out is
01:23:20.460 | this isn't a dose response sugar thing,
01:23:22.300 | like how do you feel while you're consuming heroin?
01:23:24.800 | Like while I'm consuming heroin, I feel great,
01:23:27.940 | but am I glad that heroin came into my life?
01:23:29.740 | Am I glad that everyone in my seventh grade class
01:23:31.660 | is on heroin?
01:23:32.500 | Like, no, I'm not.
01:23:33.340 | Like I wish that people weren't on heroin,
01:23:35.220 | and they could play on the playground,
01:23:36.500 | but instead they're just, you know,
01:23:37.660 | sitting on the bench shooting up during recess.
01:23:40.020 | So when you look at it as an emergent phenomenon
01:23:42.700 | that changed childhood, now it doesn't matter
01:23:45.140 | what are the feelings while you're actually using it.
01:23:47.220 | We need to zoom out and say, how has this changed childhood?
01:23:50.860 | - Can you try to do the same for democracy?
01:23:52.860 | - Yeah.
01:23:54.140 | So we can go back to, you know,
01:23:56.780 | what Mark said in 2012 when he was taking Facebook public,
01:24:01.780 | and you know, this is the wake of the Arab Spring.
01:24:04.580 | I think people really have to remember
01:24:05.760 | what an extraordinary year 2011 was.
01:24:08.140 | It starts with the Arab Spring.
01:24:10.460 | Dictators are pulled down.
01:24:11.860 | Now people say, you know, Facebook took them down.
01:24:13.860 | I mean, of course it was the citizen,
01:24:15.460 | the people themselves took down dictators,
01:24:18.060 | aided by Facebook and Twitter and,
01:24:21.540 | I don't know if it was, or texting,
01:24:23.060 | there were some other platforms they used.
01:24:24.720 | So the argument that Mark makes in this letter
01:24:28.440 | to potential shareholders, investors,
01:24:31.600 | is, you know, we're at a turning point in history,
01:24:34.580 | and, you know, social media is rewiring.
01:24:38.820 | We're giving people the tools to rewire their institutions.
01:24:42.380 | So this all sounds great.
01:24:43.900 | Like this is the democratic dream.
01:24:45.700 | And what I write about in the essay
01:24:47.340 | is the period of techno-democratic optimism,
01:24:50.260 | which began in the early '90s
01:24:51.620 | with the fall of the Iron Curtain and the Soviet Union.
01:24:54.580 | And then the internet comes in and, you know,
01:24:56.940 | people, I mean, people my age remember
01:24:58.440 | how extraordinary it was, how much fun it was.
01:25:01.780 | I mean, the sense that this was the dawning of a new age,
01:25:04.900 | and there was so much optimism.
01:25:06.700 | And so this optimism runs all the way from the early '90s,
01:25:09.740 | all the way through 2011 with the Arab Spring.
01:25:12.980 | And of course that year ends with Occupy Wall Street.
01:25:15.180 | And there were also big protest movements in Israel,
01:25:18.060 | in Spain, and in a lot of areas.
01:25:19.660 | Martin Gurri talks about this.
01:25:21.740 | So there certainly was a case to be made
01:25:24.740 | that Facebook in particular, but all these platforms,
01:25:28.020 | these were God's gift to democracy.
01:25:30.700 | What dictator could possibly keep out the internet?
01:25:33.820 | What dictator could stand up to people
01:25:35.620 | connected on these digital media platforms?
01:25:38.240 | So that's the strong case
01:25:39.180 | that this is gonna be good for democracy.
01:25:41.500 | And then we can see what happened in the years after.
01:25:43.420 | Now, first of all, so in Mark's response to you,
01:25:47.900 | so here, let me read from what he said
01:25:49.800 | when you interviewed him.
01:25:51.380 | He says, "I think it's worth grounding this conversation
01:25:55.260 | "in the actual research that has been done on this,
01:25:57.300 | "which by and large finds that social media
01:25:59.260 | "is not a large driver of polarization."
01:26:01.660 | He says that.
01:26:02.540 | Then he says, "Most academic studies that I've seen
01:26:05.360 | "actually show that social media use
01:26:07.120 | "is correlated with lower polarization."
01:26:09.540 | That's a factual claim that he makes,
01:26:11.220 | which is not true.
01:26:12.700 | But he asserts that, well, actually, wait, it's tricky
01:26:15.940 | because he says the studies he has seen.
01:26:18.220 | So I can't, so it might be that the studies he has seen
01:26:20.420 | say that, but if you go to the Google Doc with Chris Bail,
01:26:23.020 | you see there's seven different questions
01:26:25.140 | that can be addressed.
01:26:26.300 | And on one of them, which is filter bubbles,
01:26:29.260 | the evidence is very mixed.
01:26:30.420 | And he might be right that Facebook overall
01:26:32.500 | doesn't contribute to filter bubbles.
01:26:33.900 | But on the other six, the evidence is pretty strongly
01:26:36.500 | on the yes side, it is a cause.
01:26:38.620 | - He also draws a line between the United States
01:26:40.620 | versus the rest of the world.
01:26:41.820 | - Right, and there's one thing true about that,
01:26:43.640 | which is that polarization has been rising
01:26:46.020 | much faster in the US than in any other major country.
01:26:48.380 | So he's right about that.
01:26:49.580 | So we're talking about an article by Matthew Gentzkow
01:26:53.300 | and a few other researchers.
01:26:56.340 | A very important article, we've got it
01:26:57.860 | in the political dysfunction database.
01:27:01.540 | - And we should say that in this study,
01:27:02.940 | there's, like I started to say,
01:27:04.980 | there's a lot of fascinating questions
01:27:06.720 | that it's organized by, whether studies indicate yes or no,
01:27:10.400 | question one is does social media make people more angry
01:27:12.980 | or affectively polarized?
01:27:15.020 | Question two is does social media create echo chambers?
01:27:17.700 | These are fascinating, really important questions.
01:27:19.720 | Question three is does social media amplify posts
01:27:22.620 | that are more emotional, inflammatory, or false?
01:27:25.920 | Question four is does social media increase
01:27:27.700 | the probability of violence?
01:27:29.340 | Question five is does social media enable
01:27:31.240 | foreign governments to increase political dysfunction
01:27:34.260 | in the US and other democracies?
01:27:36.660 | Question six, does social media decrease trust?
01:27:39.780 | Seven, does social media strengthen populist movements?
01:27:44.500 | And then there's other sections, as you mentioned.
01:27:47.140 | - Yeah, that's right.
01:27:47.980 | But once you operationalize it as seven different questions,
01:27:52.320 | so one is about polarization and there are measures of that,
01:27:56.000 | the degree to which people say they hate the other side.
01:27:58.380 | And so in this study by Boxell, Gentzkow, and Shapiro, 2021,
01:28:02.860 | they looked at all the measures of polarization
01:28:06.660 | they could find going back to the 1970s
01:28:09.060 | for about 20 different countries.
01:28:10.940 | And they show plots, you have these nice plots
01:28:13.140 | with red lines showing that in some countries
01:28:15.140 | it's going up, like the United States especially,
01:28:17.580 | in some countries it's going down,
01:28:19.140 | and in some countries it's pretty flat.
01:28:20.940 | And so Mark says, well, you know,
01:28:22.700 | if polarization's going up a lot in the US
01:28:24.900 | but not in most other countries,
01:28:26.140 | well, maybe Facebook isn't responsible.
01:28:28.460 | But so much depends on how you operationalize things.
01:28:31.860 | Are we interested in the straight line, regression line,
01:28:35.420 | going back to the '70s?
01:28:37.180 | And if so, well, then he's right in what he says.
01:28:40.380 | But that's not the argument.
01:28:41.500 | The argument isn't about, you know,
01:28:43.340 | whether it's been rising or falling since the '70s.
01:28:44.780 | The argument is about what's happened since 2012
01:28:47.620 | or so.
01:28:48.500 | And for that, now I just spoke with,
01:28:50.420 | I've been emailing with the authors of the study
01:28:53.020 | and they say there's not really enough data
01:28:54.380 | to do it statistically reliably
01:28:56.060 | 'cause there's only a few observations after 2012.
01:28:58.160 | But if you look at the graphs in their study,
01:29:00.420 | and they actually do provide, as they pointed out to me,
01:29:02.820 | they do provide a statistical test
01:29:04.780 | if you break the data at the year 2000.
01:29:07.180 | So actually, polarization is going up pretty widely
01:29:11.540 | if you just look after 2000,
01:29:12.940 | which is when the internet would be influential.
01:29:15.900 | And if you look just after 2012,
01:29:17.580 | you have to just do it by eye.
01:29:18.940 | But if you do it on their graphs by eye,
01:29:21.100 | you see that actually a number of countries
01:29:22.700 | do see a sudden sharp upturn.
01:29:24.340 | Not all, not all by any means.
01:29:26.060 | But my point is, Mark asserts,
01:29:28.420 | he points to one study and he points to this
01:29:29.900 | over and over again.
01:29:30.740 | I have had two conversations with him.
01:29:31.900 | He pointed to this study both times.
01:29:34.540 | He asserts that this study shows
01:29:37.260 | that polarization is up some places, down other places.
01:29:40.180 | There's no association.
01:29:41.980 | But actually, we have another section in the Google Doc
01:29:45.040 | where we review all the data on the decline of democracy.
01:29:47.820 | And the high point of democracy,
01:29:49.380 | of course, it was rising in the '90s.
01:29:50.860 | But if you look around the world,
01:29:52.500 | by some measures it begins to drop in the late 2000s,
01:29:55.220 | around 2007, 2008.
01:29:57.540 | By others, it's in the early to mid 2010s.
01:30:00.820 | The point is, by many measures,
01:30:03.500 | there's a drop in the quality and the number
01:30:06.220 | of democracies on this planet that began in the 2010s.
01:30:09.940 | And so, yes, Mark can point to one study.
01:30:13.540 | But if you look in the Google Doc,
01:30:14.620 | there are a lot of other studies that point the other way.
01:30:16.740 | And especially about whether things
01:30:18.020 | are getting more polarized or less polarized.
01:30:20.340 | Not in all countries, but in a lot.
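As a rough illustration of what "breaking the data at the year 2000" means, here is a minimal sketch in Python on synthetic numbers. It is not the Boxell, Gentzkow, and Shapiro specification, just the general idea of fitting separate trends on either side of a breakpoint and comparing the slopes:

```python
# Sketch: compare a polarization trend before and after a breakpoint year.
# The series below is synthetic and only illustrates the idea.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1975, 2021)
# Flat-ish before 2000, rising afterward, plus noise.
polarization = (0.40 + 0.012 * np.clip(years - 2000, 0, None)
                + rng.normal(0.0, 0.02, years.size))

def slope(x: np.ndarray, y: np.ndarray) -> float:
    """Slope of an ordinary least-squares line y ~ a + b*x."""
    b, _a = np.polyfit(x, y, deg=1)
    return b

BREAK = 2000
pre, post = years < BREAK, years >= BREAK
print(f"trend before {BREAK}: {slope(years[pre], polarization[pre]):+.4f} per year")
print(f"trend after  {BREAK}: {slope(years[post], polarization[post]):+.4f} per year")
```

The takeaway in the conversation is simply that the verdict depends on which window you test: a single regression line run back to the 1970s can look flat even when the post-2000 or post-2012 segment is clearly rising.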
01:30:22.100 | - So you've provided the problem,
01:30:25.260 | several proposals for solutions.
01:30:27.300 | Do you think Mark, do you think Elon or whoever
01:30:32.300 | is at the head of Twitter would be able
01:30:35.580 | to implement these changes?
01:30:36.900 | Or does there need to be a competitor,
01:30:38.620 | social network to step up?
01:30:40.780 | If you were to predict the future,
01:30:42.660 | now this is you giving sort of financial advice to me.
01:30:45.740 | No. - No, not that I can't do.
01:30:47.260 | - Definitely not financial advice.
01:30:48.100 | - I can give you advice.
01:30:48.940 | Do the opposite of whatever I've done.
01:30:50.620 | - Okay, excellent. (laughs)
01:30:52.540 | But what do you think, when we talk again in 10 years,
01:30:57.300 | what do you think we'd be looking at if it's a better world?
01:31:00.940 | - Yeah, so you have to look at each,
01:31:03.260 | the dynamics of each change that needs to be made.
01:31:05.620 | And you have to look at it systemically.
01:31:07.620 | And so the biggest change for teen mental health,
01:31:09.740 | I think, is to raise the age from 13,
01:31:12.380 | it was set to 13 in COPPA in like 1997 or '98,
01:31:16.260 | whatever year it was.
01:31:17.940 | It was set to 13 with no enforcement.
01:31:19.500 | I think it needs to go to 16 or 18 with enforcement.
01:31:22.860 | Now, there's no way that Facebook can say,
01:31:25.880 | actually, so look, Instagram, the age is 13,
01:31:29.220 | but they don't enforce it.
01:31:30.500 | And they're under pressure to not enforce it
01:31:32.700 | because if they did enforce it,
01:31:34.420 | then all the kids would just go to TikTok,
01:31:36.100 | which they're doing anyway.
01:31:36.940 | But if we go back a couple of years,
01:31:38.820 | when they were talking about rolling out Facebook for kids,
01:31:42.500 | 'cause they need to get those kids,
01:31:43.740 | they need to get kids under 13.
01:31:45.180 | There's a business imperative to hook them early
01:31:47.220 | and keep them.
01:31:48.480 | So I don't expect Facebook to act on its own accord
01:31:51.220 | and do the right thing because then they--
01:31:52.060 | - So regulation is the only way.
01:31:53.540 | - Exactly.
01:31:54.380 | When you have a social dilemma,
01:31:56.020 | like what economists call like a prisoner's dilemma
01:31:58.340 | or a social dilemma, which is the same thing generalized to multiple people.
01:32:01.140 | And when you have a social dilemma,
01:32:03.180 | each player can't opt out because they're gonna lose.
01:32:06.100 | You have to have central regulation.
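To make the social-dilemma structure concrete, here is a minimal sketch in Python. The payoff function and its numbers are illustrative assumptions, not anything from the conversation or from a specific model: each platform does better by not enforcing a higher age limit no matter what its rivals do, yet all platforms end up better off if a central rule makes everyone enforce.

```python
# Illustrative multi-player social dilemma: should a platform enforce a
# higher minimum age? All payoff numbers are made up for illustration.

def payoff(enforces: bool, rivals_enforcing: int, num_rivals: int) -> float:
    """Toy payoff for one platform, given how many rivals also enforce."""
    share = rivals_enforcing / num_rivals            # fraction of rivals enforcing
    baseline = 10.0                                  # status-quo value
    shared_benefit = 6.0 * share                     # everyone gains as enforcement spreads
    poaching_bonus = 0.0 if enforces else 4.0        # non-enforcers pick up migrating users
    unilateral_cost = 5.0 * (1 - share) if enforces else 0.0  # enforcing mostly alone is costly
    return baseline + shared_benefit + poaching_bonus - unilateral_cost

NUM_RIVALS = 3
for rivals in range(NUM_RIVALS + 1):
    print(f"rivals enforcing: {rivals}  "
          f"my payoff if I enforce: {payoff(True, rivals, NUM_RIVALS):5.1f}  "
          f"if I don't: {payoff(False, rivals, NUM_RIVALS):5.1f}")

# Not enforcing wins in every row (it is the dominant strategy), and yet:
print("everyone enforces:", payoff(True, NUM_RIVALS, NUM_RIVALS))   # 16.0 each
print("no one enforces:  ", payoff(False, 0, NUM_RIVALS))           # 14.0 each
```

Under that structure no single platform can opt in to the right thing without losing, which is the case being made here for a binding external rule rather than voluntary action.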
01:32:07.660 | So I think we have to raise the age.
01:32:09.540 | The UK parliament is way ahead of us.
01:32:11.380 | I think they're actually functional.
01:32:12.700 | The US Congress is not functional.
01:32:14.780 | So the parliament is implementing
01:32:16.580 | the Age Appropriate Design Code
01:32:18.860 | that may put pressure on the platforms globally
01:32:21.180 | to change certain things.
01:32:22.300 | So anyway, my point is, we have to have regulation
01:32:25.860 | to force them to be transparent
01:32:28.140 | and share what they're doing.
01:32:30.100 | There are some good bills out there.
01:32:31.700 | So I think that if the companies and the users,
01:32:34.580 | if we're all stuck in a social dilemma
01:32:36.940 | in which the incentives against doing the right thing
01:32:40.900 | are strong, we do need regulation on certain matters.
01:32:44.540 | And again, it's not about content moderation,
01:32:46.660 | who gets to say what,
01:32:47.860 | but it's things like the Platform Accountability
01:32:49.620 | and Transparency Act, which is from Senators Coons,
01:32:52.820 | Portman, and Klobuchar.
01:32:54.300 | This would force the platforms to just share information
01:32:56.940 | on what they're doing.
01:32:57.780 | Like we can't even study what's happening
01:32:59.940 | without the information.
01:33:01.140 | So that I think is just common sense.
01:33:03.380 | Senator Michael Bennet introduced
01:33:05.020 | the Digital Platforms Commission Act of 2022,
01:33:09.020 | which would create a body tasked with actually regulating
01:33:12.580 | and having oversight.
01:33:13.660 | Right now, the US government doesn't have a body.
01:33:16.020 | I mean, the FTC can do certain things.
01:33:17.980 | We have things about antitrust,
01:33:19.260 | but we don't have a body that can oversee or understand
01:33:22.420 | these things that are transforming everything
01:33:24.380 | and possibly severely damaging our political life.
01:33:28.020 | So I think there's a lot of, oh, and then
01:33:31.580 | the state of California is actually currently considering
01:33:35.140 | a version of the UK's Age Appropriate Design Code,
01:33:39.140 | which would force the companies to do some simple things
01:33:42.180 | like not be sending alerts and notifications to children
01:33:45.740 | at 10 or 11 o'clock at night,
01:33:47.140 | just things like that to make platforms just less damaging.
01:33:52.540 | So I think there's an essential role for regulation.
01:33:55.060 | And I think if the US Congress is too paralyzed by politics,
01:33:58.180 | if the UK and the EU and the state of California
01:34:01.540 | and the state, a few other states,
01:34:02.620 | if they enact legislation,
01:34:05.020 | the platforms don't wanna have different versions
01:34:07.020 | in different states or countries.
01:34:08.660 | So I think there actually is some hope,
01:34:10.140 | even if the US Congress is dysfunctional.
01:34:12.660 | - So there is, 'cause I've been interacting
01:34:15.140 | with certain regulations,
01:34:17.260 | designed to hit Amazon, but hitting YouTube.
01:34:19.580 | YouTube folks have been talking to me about it,
01:34:21.300 | and it has to do with recommender systems.
01:34:23.020 | The algorithm has to be public, I think,
01:34:26.760 | versus private, which completely breaks things.
01:34:30.180 | It's way too clumsy a regulation,
01:34:33.180 | where the unintended consequences
01:34:35.140 | break recommender systems, not for Amazon,
01:34:38.160 | but for other platforms.
01:34:40.740 | That's just to say that government can sometimes
01:34:43.860 | be clumsy with the regulation.
01:34:45.580 | - Usually is.
01:34:46.420 | - And so my preference is the threat of regulation
01:34:51.420 | in a friendly way encourages--
01:34:54.740 | - Good behavior. - You really shouldn't need it.
01:34:56.340 | You really shouldn't need it.
01:34:57.500 | My preference is great leaders lead the way
01:35:01.620 | in doing the right thing.
01:35:03.220 | And I actually, honestly, this, to our earlier
01:35:05.860 | kind of maybe my naive disagreement
01:35:08.620 | that I think it's good business to do the right thing
01:35:11.300 | in these spaces.
01:35:12.900 | So creating a problem-- - Sometimes it is.
01:35:14.220 | Sometimes it loses you most of your users.
01:35:18.020 | - Well, I think it's important because I've been thinking
01:35:20.460 | a lot about World War III recently.
01:35:22.980 | - Yeah. - And it might be silly
01:35:26.580 | to say, but I think social media has a role
01:35:30.580 | in either creating World War III or avoiding World War III.
01:35:34.300 | It seems like so much of wars throughout history
01:35:37.940 | have been started through very fast escalation.
01:35:42.300 | And it feels like just looking at our recent history,
01:35:44.780 | social media is the mechanism for escalation.
01:35:47.420 | And so it's really important to get this right,
01:35:49.460 | not just for the mental health of young people,
01:35:52.180 | not just for the polarization of bickering
01:35:54.140 | over small-scale political issues,
01:35:57.380 | but literally the survival of human civilization.
01:36:01.580 | So there's a lot at stake here.
01:36:04.360 | - Yeah, I certainly agree with that.
01:36:05.540 | I would just say that I'm less concerned
01:36:07.380 | about World War III than I am about Civil War II.
01:36:09.820 | I think that's a more likely prospect.
01:36:11.540 | - Yeah, yeah, yeah.
01:36:13.880 | Can I ask for your wise sage advice to young people?
01:36:19.420 | So advice number one is put down the phone.
01:36:22.500 | Don't use Instagram and social media.
01:36:25.420 | But to young people in high school and college,
01:36:28.980 | how to have a career,
01:36:30.220 | or how to have a life they can be proud of.
01:36:33.500 | - Yeah, I'd be happy to,
01:36:34.660 | 'cause I teach a course at NYU in the business school
01:36:38.780 | called Work, Wisdom, and Happiness.
01:36:40.860 | And the course is,
01:36:42.340 | it's advice on how to have a happy,
01:36:44.220 | a successful career as a human being.
01:36:47.140 | But the course has evolved that it's now about three things,
01:36:50.620 | how to get stronger, smarter, and more sociable.
01:36:54.900 | If you can do those three things,
01:36:56.640 | then you will be more successful
01:36:59.180 | at work and in love and friendships.
01:37:01.940 | And if you are more successful in work,
01:37:03.820 | love, and friendships, then you will be happier.
01:37:05.500 | You will be as happy as you can be, in fact.
01:37:07.500 | So the question is,
01:37:08.780 | how do you become smarter, stronger, and happier?
01:37:11.240 | And the answer to all three is,
01:37:15.380 | it's a number of things,
01:37:16.260 | but it's you have to see yourself
01:37:18.340 | as this complex adaptive system.
01:37:20.900 | You've got this complicated mind
01:37:22.680 | that needs a lot of experience to wire itself up.
01:37:25.900 | And the most important part of that experience is
01:37:28.220 | that you don't grow
01:37:30.020 | when you are with your attachment figure.
01:37:31.820 | You don't grow when you're safe.
01:37:33.660 | You have an attachment figure to make you feel confident
01:37:35.780 | to go out and explore the world.
01:37:37.240 | In that world, you will face threats,
01:37:39.540 | you will face fear,
01:37:40.700 | and sometimes you'll come running back.
01:37:42.380 | But you have to keep doing it
01:37:43.500 | because over time, you then develop the strength
01:37:46.260 | to stay out there and to conquer it.
01:37:48.340 | That's normal human childhood.
01:37:50.200 | That's what we blocked in the 1990s in this country.
01:37:52.980 | So young people have to get themselves the childhood,
01:37:56.700 | and this is all the way through adolescence
01:37:58.100 | and young adulthood,
01:37:59.340 | they have to get themselves the experience
01:38:00.820 | that older generations are blocking them from out of fear
01:38:04.020 | and that their phones are blocking them from
01:38:06.100 | out of just hijacking almost all the inputs into their life
01:38:09.080 | and almost all the minutes of their day.
01:38:10.920 | So go out there, put yourself out in experiences.
01:38:14.800 | You are anti-fragile and you're not gonna get strong
01:38:17.240 | unless you actually have setbacks
01:38:19.360 | and criticisms and fights.
01:38:21.200 | So that's how you get stronger.
01:38:24.120 | And then there's an analogy in how you get smarter,
01:38:27.080 | which is you have to expose yourself to other ideas,
01:38:29.960 | to ideas from people that criticize you,
01:38:32.880 | people that disagree with you.
01:38:34.560 | And this is why I co-founded Heterodox Academy
01:38:36.860 | because we believe that faculty need to be in communities
01:38:41.580 | that have political diversity and viewpoint diversity,
01:38:43.780 | but so do students.
01:38:45.100 | And it turns out students want this.
01:38:47.540 | The surveys show very clearly Gen Z
01:38:50.260 | has not turned against viewpoint diversity.
01:38:52.780 | Most of them want it,
01:38:54.100 | but they're just afraid of the small number
01:38:55.740 | that will sort of shoot darts at them
01:38:57.420 | if they say something wrong.
01:38:59.540 | So anyway, the point is you're anti-fragile
01:39:02.740 | and so you have to realize that to get stronger,
01:39:05.640 | and you have to realize that to get smarter.
01:39:07.400 | And then the key to becoming more sociable is very simple.
01:39:10.080 | It's just always looking at it
01:39:10.960 | through the other person's point of view.
01:39:12.680 | Don't be so focused on what you want
01:39:14.260 | and what you're afraid of.
01:39:16.200 | Put yourself in the other person's shoes,
01:39:17.640 | what's interesting to them, what do they want?
01:39:19.840 | And if you develop the skill of looking at it
01:39:21.960 | from their point of view,
01:39:23.040 | you'll be a better conversation partner,
01:39:24.560 | you'll be a better life partner.
01:39:27.200 | So there's a lot that you can do.
01:39:28.960 | I mean, I could say,
01:39:29.880 | go read "The Coddling of the American Mind."
01:39:32.120 | I could say, go read Dale Carnegie,
01:39:33.440 | "How to Win Friends and Influence People."
01:39:36.100 | But take charge of your life and your development
01:39:39.120 | 'cause if you don't do it,
01:39:40.520 | then the older protective generation and your phone
01:39:45.520 | are gonna take charge of you.
01:39:47.800 | - So on anti-fragility and coddling the American mind,
01:39:51.680 | if I may read just a few lines
01:39:53.480 | from Chief Justice John Roberts,
01:39:55.200 | which I find is really beautiful.
01:39:58.060 | So it's not just about viewpoint diversity,
01:40:00.840 | but it's real struggle, absurd, unfair struggle
01:40:04.400 | that seems to be formative to the human mind.
01:40:07.240 | He says, "From time to time in the years to come,
01:40:10.480 | I hope you will be treated unfairly
01:40:13.120 | so that you will come to know the value of justice.
01:40:16.360 | I hope that you will suffer betrayal
01:40:18.780 | because that will teach you the importance of loyalty.
01:40:21.800 | Sorry to say, but I hope you will be lonely
01:40:24.200 | from time to time
01:40:25.540 | so that you don't take friends for granted.
01:40:27.760 | I wish you bad luck again from time to time
01:40:30.560 | so that you will be conscious of the role of chance in life
01:40:33.480 | and understand that your success is not completely deserved
01:40:37.720 | and that the failure of others
01:40:39.000 | is not completely deserved either.
01:40:41.240 | And when you lose, as you will from time to time,
01:40:44.520 | I hope every now and then,
01:40:46.800 | your opponent will gloat over your failure.
01:40:49.360 | It is a way for you to understand
01:40:51.200 | the importance of sportsmanship.
01:40:53.200 | I hope you'll be ignored
01:40:55.260 | so you know the importance of listening to others.
01:40:57.520 | And I hope you will have just enough pain
01:41:00.120 | to learn compassion.
01:41:01.760 | Whether I wish these things or not,
01:41:04.040 | they're going to happen.
01:41:05.720 | And whether you benefit from them or not
01:41:08.080 | will depend upon your ability
01:41:09.660 | to see the message in your misfortunes."
01:41:13.140 | He read that at a middle school graduation.
01:41:15.560 | - Yes, for his son's middle school graduation.
01:41:18.640 | That's what I was trying to say,
01:41:19.800 | only that's much more beautiful.
01:41:21.200 | - Yeah, and I think your work is really important
01:41:24.040 | and it is beautiful and it's bold and fearless
01:41:27.200 | and it's a huge honor that you would sit with me.
01:41:29.440 | I'm a big fan.
01:41:30.520 | Thank you for spending your valuable time with me today,
01:41:32.440 | John, thank you so much.
01:41:33.360 | - Thanks so much, Lex, what a pleasure.
01:41:35.720 | - Thanks for listening to this conversation
01:41:37.160 | with Jonathan Haidt.
01:41:38.640 | To support this podcast,
01:41:39.880 | please check out our sponsors in the description.
01:41:42.880 | And now, let me leave you with some words from Carl Jung.
01:41:45.780 | "Everything that irritates us about others
01:41:50.240 | "can lead us to an understanding of ourselves."
01:41:53.940 | Thank you for listening and hope to see you next time.
01:41:57.120 | (upbeat music)