Greg Lukianoff: Cancel Culture, Deplatforming, Censorship & Free Speech | Lex Fridman Podcast #397
Chapters
0:00 Introduction
2:11 Cancel culture & freedom of speech
16:42 Left-wing vs right-wing cancel culture
25:27 Religion
28:07 College rankings by freedom of speech
34:15 Deplatforming
48:50 Whataboutism
53:53 Steelmanning
61:29 How the left argues
72:09 Diversity, equity, and inclusion
84:00 Why colleges lean left
91:38 How the right argues
96:13 Hate speech
105:00 Platforming
114:31 Social media
135:38 Depression
147:09 Hope
00:00:00.000 |
If the goal is the project of human knowledge, which is to know the world as it is, you cannot 00:00:05.800 |
know the world as it is without knowing what people really think. 00:00:10.560 |
And what people really think is an incredibly important fact to know. 00:00:15.000 |
So every time you're actually saying, "You can't say that," you're actually depriving 00:00:19.460 |
yourself of the knowledge of what people really think. 00:00:21.880 |
You're causing what Timur Kuran, who's on our board of advisors, calls preference falsification. 00:00:27.440 |
You end up with an inaccurate picture of the world, which by the way, in a lot of cases, 00:00:32.600 |
because there are activists who want to restrict more speech, they actually tend to think that 00:00:36.180 |
people are more prejudiced than they might be. 00:00:38.480 |
And actually, one very real practical way it makes things worse is when you censor people, 00:00:46.560 |
it just encourages them to not share what they think with people who will get them in trouble. 00:00:50.900 |
So it leads them to talk to people who they already agree with, and group polarization 00:00:58.680 |
The following is a conversation with Greg Lukianoff, free speech advocate, First Amendment 00:01:03.480 |
attorney, president and CEO of FIRE, the Foundation for Individual Rights and Expression. 00:01:09.160 |
And he's the author of Unlearning Liberty, co-author with Jonathan Haidt of The Coddling 00:01:14.840 |
of the American Mind, and co-author with Rikki Schlott of a new book coming out in October 00:01:21.040 |
that you should definitely pre-order now called The Canceling of the American Mind, which 00:01:26.600 |
is a definitive accounting of the history, present and future of cancel culture, a term 00:01:31.840 |
used and overused in public discourse, but rarely studied and understood with the depth 00:01:38.080 |
and rigor that Greg and Ricky do in this book, and in part in this conversation. 00:01:45.200 |
Freedom of speech is important, especially on college campuses, the very place that 00:01:51.000 |
should serve as the battleground of ideas, including weird and controversial ones, and 00:01:56.160 |
that should encourage bold risk-taking, not conformity. 00:02:03.400 |
To support it, please check out our sponsors in the description. 00:02:06.800 |
And now, dear friends, here's Greg Lukianoff. 00:02:14.920 |
Now you've said that you don't like the term as it's been, quote, "dragged through the 00:02:18.520 |
mud and abused endlessly by a whole host of controversial figures." 00:02:25.640 |
Cancel culture is the uptick of campaigns, especially successful campaigns, starting 00:02:30.920 |
around 2014 to get people fired, expelled, deplatformed, et cetera, for speech that would 00:02:39.120 |
normally be protected by the First Amendment. 00:02:41.200 |
And I say "would be protected" because we're talking about circumstances in which the 00:02:46.400 |
First Amendment doesn't necessarily apply. 00:02:48.420 |
But what I mean is like as an analog to, say, things you couldn't lose your job as a public 00:02:54.160 |
employee for, and also the climate of fear that's resulted from that phenomenon, the 00:03:00.840 |
fact that you can lose your job for having the wrong opinion. 00:03:03.560 |
And it wasn't subtle that there was an uptick in this, particularly on campus around 2014. 00:03:09.800 |
Jon Ronson wrote a book called So You've Been Publicly Shamed that came out in 2015, 00:03:15.840 |
I wrote a book called Freedom from Speech in 2014. 00:03:20.080 |
But it really was in 2017 when you started seeing this be directed at professors. 00:03:24.520 |
And when it comes to the number of professors that we've seen be targeted and lose their 00:03:29.760 |
jobs, I've been doing this for 22 years and I've seen nothing like it. 00:03:34.160 |
So, there's so many things I want to ask you here. 00:03:37.480 |
One actually, just look at the organization of FIRE. 00:03:41.360 |
Because it's interconnected to this whole fight and the rise of cancel culture and the 00:03:46.600 |
fight for freedom of speech since 2014 and before. 00:03:50.440 |
So FIRE was founded in 1999 by Harvey Silverglate. 00:03:58.720 |
He's the person who actually found me out in my very happy life out in San Francisco, 00:04:03.480 |
but knew I was looking for a First Amendment job. 00:04:06.280 |
I'd gone to law school specifically to do First Amendment. 00:04:13.280 |
His protege, Kathleen Sullivan, was the dean of Stanford Law School. 00:04:16.800 |
And this remains the best compliment I ever got in my life is that she recommended me 00:04:24.120 |
And since that's the whole reason why I went to law school, I was excited to be a part 00:04:29.040 |
The other co-founder of FIRE is Alan Charles Kors. 00:04:34.860 |
He is one of the leading experts in the world on the Enlightenment and particularly 00:04:41.280 |
And if any of your listeners do like the great courses, he has a lecture on Blaise Pascal. 00:04:48.120 |
And Blaise, of course, is famous for Pascal's wager. 00:04:51.760 |
And I left it just so moved and impressed and with a depth of understanding of how important 00:05:00.840 |
You mentioned to me offline connected to this that there is a, at least it runs in parallel, 00:05:06.840 |
or there's a connection between the love of science and the love of the freedom of speech. 00:05:10.840 |
Can you maybe elaborate where that connection is? 00:05:15.600 |
I think that for those of us who are really, who've devoted our lives to freedom of speech, 00:05:20.800 |
one thing that we are into, whether we know it or not, is epistemology, the study and 00:05:29.880 |
Freedom of speech has lots of moral and philosophical dimensions, but from a pragmatic standpoint, 00:05:36.520 |
it is necessary because we're creatures of incredibly limited knowledge. 00:05:43.480 |
I always love the fact that Yuval Harari refers to the Enlightenment as the discovery of ignorance 00:05:51.560 |
It was suddenly being like, "Wow, hold on a second. 00:05:55.280 |
All of this incredibly interesting folk wisdom we got," which by the way, can be surprisingly 00:06:00.360 |
reliable here and there, "when you start testing a lot of it is nonsense. 00:06:07.760 |
Even our ideas about the way things fall," as Galileo established, "even our intuitions, 00:06:14.520 |
A lot of the early history of freedom of speech, it was happening at the same time as the scientific 00:06:23.960 |
A lot of the early debates about freedom of speech were tied in. 00:06:28.040 |
Certainly, Galileo, I always point out like Kepler was probably the even more radical 00:06:34.640 |
idea that there weren't even perfect spheres. 00:06:37.040 |
But at the same time, largely because of the invention of the printing press, you also 00:06:45.360 |
I always talk about Jan Hus, the famous Czech hero who was burned at the stake and 00:06:56.560 |
But he was basically Luther before the printing press. 00:07:01.880 |
Before Luther could get his word out, he didn't stand a chance and that was exactly what Jan 00:07:07.320 |
But a century later, thanks to the printing press, everyone could know what Luther thought 00:07:13.240 |
But it led to, of course, this completely crazy, hyper-disrupted period in European 00:07:20.440 |
- Well, you mentioned, to jump around a little bit, the First Amendment. 00:07:24.080 |
First of all, what is the First Amendment and what is the connection to you between 00:07:28.760 |
the First Amendment, the freedom of speech and cancel culture? 00:07:31.560 |
- So I'm a First Amendment lawyer, as I mentioned, and that's what I—it's my passion. 00:07:37.040 |
That's what I studied and I think American First Amendment law is incredibly interesting. 00:07:41.400 |
In one sentence, the First Amendment is trying to get rid of basically all the reasons why 00:07:47.120 |
humankind had been killing each other for its entire existence. 00:07:51.080 |
That we weren't going to fight anymore over opinion. 00:07:53.440 |
We weren't going to fight anymore over religion, that you have the right to approach your government 00:07:57.000 |
for redress of grievances, that you have the freedom to associate. 00:08:00.760 |
But all of these things in one sentence, we're like, "Nope, the government will no longer 00:08:05.720 |
interfere with your right to have these fundamental human rights." 00:08:12.120 |
And so one thing that makes FIRE a little different from other organizations is, however, 00:08:17.280 |
we're not just a First Amendment organization. 00:08:22.840 |
And so—but at the same time, a lot of what I think free speech is can be well explained 00:08:31.360 |
with reference to a lot of First Amendment law, partially because in American history, 00:08:36.440 |
some of our smartest people have been thinking about what the parameters of freedom of speech 00:08:43.400 |
And a lot of those principles, they transfer very well just as pragmatic ideas. 00:08:48.440 |
So the biggest sin in terms of censorship is called viewpoint discrimination, that essentially 00:08:53.440 |
you allow freedom of speech except for that opinion. 00:08:56.760 |
Now, it's found to be kind of more defensible, and I think this makes sense, if you 00:09:01.920 |
set up a forum where we're only going to talk about economics, to exclude people who want 00:09:05.980 |
to talk about a different topic; but it's rightfully considered a bigger deal if you 00:09:11.600 |
set up a forum for economics but say we're not going to let people talk about that kind of 00:09:15.280 |
economics, or have that opinion on economics, most particularly. 00:09:20.660 |
So a lot of the principles from First Amendment law actually make a lot of philosophical sense 00:09:24.880 |
as good principles for when—like what is protected and unprotected speech, what should 00:09:30.280 |
get you in trouble, how you actually analyze it, which is why we actually try in our definition 00:09:35.640 |
of cancel culture to work in some of the First Amendment norms just in the definition so 00:09:41.640 |
You're saying so many interesting things, but if you can linger on the viewpoint discrimination, 00:09:46.480 |
is there any gray area of discussion there, like what is and isn't economics, for the 00:09:54.000 |
Is there—I mean, is it a science or is it an art to draw lines of what is and isn't 00:10:01.600 |
You know, if you're saying that something is or is not economics, well, you can say 00:10:03.640 |
everything's economics, and therefore I want to talk about poetry. 00:10:06.600 |
There'd be some line drawing exercise in there. 00:10:08.720 |
But let's say once you decide to open it up even to poetry, it's a big difference 00:10:16.760 |
between saying, "Okay, now we're open to poetry," but you can't say, you know, 00:10:22.880 |
Like that's a forbidden opinion now officially in this otherwise open forum. 00:10:27.640 |
That would immediately at an intuitive level strike people as a bigger problem than just 00:10:34.080 |
Yeah, I mean, that intuitive level that you speak to, I hope that all of us have that 00:10:42.840 |
kind of basic intuition when the line is crossed. 00:10:45.440 |
It's the same thing for like pornography, right? 00:10:48.680 |
I think there's the same level of intuition that should be applied across the board here. 00:10:55.720 |
And it's when that intuition becomes deformed by whatever forces of society, that's when 00:11:02.560 |
Yeah, I mean, people find it a different thing. 00:11:05.640 |
You know, if someone loses their job simply for their political opinion, even if that 00:11:09.360 |
employer has every right in the world to fire you, I think Americans should still be like, 00:11:15.560 |
And I'm not making a legal case that maybe you shouldn't fire someone for their political 00:11:22.280 |
Like what society do we want to—what kind of society do we want to live in? 00:11:26.140 |
And it's been funny watching—you know, and I point this out. 00:11:30.540 |
Yes, I will defend businesses' First Amendment rights of association to be able to have the 00:11:36.460 |
legal right to decide, you know, who works for them. 00:11:40.300 |
But from a moral or philosophical matter, if you think through the implications of if 00:11:44.820 |
every business in America becomes an expressive association in addition to being a profit-maximizing 00:11:51.820 |
organization, that would be a disaster for democracy. 00:11:55.180 |
Because you would end up in a situation where people would actually be saying to themselves, 00:11:58.700 |
"I don't think I can actually say what I really think and still believe I can keep 00:12:05.580 |
I felt like we were headed because a lot of the initial response to people getting canceled 00:12:11.300 |
was very simply, you know, "Oh, but they have the right to get rid of this person." 00:12:16.580 |
And that's the end and beginning and end of the discussion. 00:12:22.620 |
I thought that wasn't actually a very serious way of thinking it through, if you care 00:12:27.060 |
about both the First Amendment and freedom of speech. 00:12:30.300 |
To you, just to clarify, the First Amendment is kind of a legal embodiment of the ideal 00:12:41.820 |
And it's very specific—applied to government. 00:12:43.820 |
And freedom of speech is the application of the principle to, like, everything, including, 00:12:50.060 |
like, kind of the high-level philosophical ideal of what it—of the value of people 00:13:04.340 |
And you can have a situation—and I talk about countries that have good free speech 00:13:08.060 |
law but not necessarily great free speech culture. 00:13:11.220 |
And I talk about how when we sometimes make this distinction between free speech law and 00:13:15.420 |
free speech culture, we're thinking in a very cloudy kind of way. 00:13:20.380 |
And what I mean by that is that law is generally—particularly in a common law country, it's the reflection 00:13:31.300 |
And in a lot of cases, common law is supposed to actually take our intuitive ideas of fairness 00:13:38.540 |
So if you actually have a culture that doesn't appreciate free speech from a philosophical 00:13:42.820 |
standpoint, it's not going to be able to protect free speech for the long haul, even 00:13:46.860 |
in the law, because eventually—that's one of the reasons why I worry so much about 00:13:50.180 |
some of these terrible cases coming out of law schools, because I fear that even though, 00:13:55.660 |
sure, American First Amendment law is very strongly protective of First Amendment, for 00:13:59.820 |
now, it's not going to stay that way if you have generations of law students graduating 00:14:05.700 |
who actually think there's nothing—there's no higher goal than shouting down you're 00:14:10.460 |
Yeah, so that's why so much of your focus, or a large fraction of your focus, is on the 00:14:16.460 |
higher education or education period, is because education is the foundation of culture. 00:14:25.780 |
'64, you have the free speech movement at Berkeley. 00:14:29.140 |
And in '65, you have "Repressive Tolerance" by Herbert Marcuse, which was a declaration 00:14:33.940 |
of, "By the way, we on the left, we shouldn't—we should have free speech, but we should have 00:14:41.260 |
I mean, I went back and reread "Repressive Tolerance," and how clear it is. 00:14:47.460 |
I forgot—I had forgotten that it really is kind of like, "And these so-called conservatives 00:14:51.780 |
and right-wingers, we need to repress them because they're regressive thinkers." 00:14:55.780 |
It really doesn't come out to anything more sophisticated than the very old idea that 00:15:07.980 |
We should not have—we have to retrain society." 00:15:10.980 |
And of course, like, it ends up being another—he was also a fan of Mao, so it's not surprising 00:15:15.940 |
that he—of course, the system would have to rely on some kind of totalitarian system. 00:15:22.600 |
But that was a laughable position, you know, say, 30, 40 years ago. 00:15:30.220 |
The idea that essentially, you know, "Free speech for me, not for thee," as the great, 00:15:34.860 |
you know, free speech champion, Nat Hentoff, used to say, was something that you were supposed 00:15:41.020 |
But I saw this when I was in law school in '97. 00:15:44.740 |
I saw this when I was interning at the ACLU in '99, that there was a slow-motion train 00:15:51.040 |
wreck coming, that essentially there were these bad ideas from campus that had been taking 00:15:57.400 |
on more and more steam of basically, "No free speech for my opponent," 00:16:02.060 |
that were actually becoming more and more accepted, partially because academia was becoming 00:16:09.720 |
I think that, as my co-author Jonathan Haidt points out, that when you have low viewpoint 00:16:15.260 |
diversity, people start thinking in a very kind of tribal way. 00:16:18.840 |
And if you don't have the respected dissenters, you don't have the people that you can point 00:16:22.660 |
to that are like, "Hey, this is a smart person. 00:16:25.180 |
This is like—this is a smart, reasonable, decent person that I disagree with. 00:16:29.140 |
So I guess not everyone thinks alike on this issue." 00:16:32.240 |
You start getting much more kind of like, only bad people, only heretics, only blasphemers, 00:16:38.840 |
only right-wingers can actually think in this way. 00:16:42.740 |
Every time you say something, I always have a million thoughts and a million questions 00:16:48.040 |
But since you mentioned there's a kind of drift, as you write about in the book, and 00:16:52.100 |
you mentioned now there's a drift towards the left in academia, which will also maybe 00:16:57.360 |
draw a distinction here between the left and the right and the cancel culture, as you present 00:17:01.520 |
in your book, is not necessarily associated with any one political viewpoint, that there's 00:17:06.880 |
mechanisms on both sides that result in cancellation and censorship in violation of freedom of 00:17:14.400 |
So one thing I want to be really clear about is the book takes on both right and left cancel 00:17:20.520 |
And definitely, you know, cancel culture from the left is more important in academia, where 00:17:28.240 |
But we talk a lot about cancel culture coming from legislatures. 00:17:31.420 |
We talk a lot about cancel culture on campus as well, because even though most of the attempts 00:17:38.920 |
that come from on campus to get people canceled are still from the left, there are a lot of 00:17:43.420 |
attacks that come from the right, that come from, you know, attempts by different organizations 00:17:48.760 |
and sometimes when there are stories in Fox News, you know, like they'll go after professors. 00:17:54.460 |
And about one-third of the attempts to get professors punished that are successful actually 00:18:00.720 |
And we talk about attempts to get books banned in the book. 00:18:10.720 |
Ron DeSantis had something called the Stop WOKE Act, which we told everyone was laughably 00:18:18.220 |
They tried to ban, you know, particular topics in higher ed. 00:18:26.840 |
And they didn't listen to us and they brought it, they passed it and we sued and we won. 00:18:33.100 |
Now they're trying again with something that's equally as unconstitutional and we will sue 00:18:41.600 |
So this is presumably trying to limit certain topics from being taught in school? 00:18:48.520 |
You know, it came out of the sort of attempt to get at critical race theory. 00:18:53.420 |
So it's topics related to race, gender, et cetera. 00:18:57.160 |
I don't remember exactly how they tried to cabin it to CRT. 00:19:02.840 |
But when you actually – the law is really well established that you can't tell higher 00:19:07.400 |
education what they're allowed to teach without violating the First Amendment. 00:19:13.380 |
And when this got in front of a judge, he was exactly as skeptical 00:19:24.800 |
So if you're against that kind of teaching, the right way to fight it is by making the 00:19:29.880 |
case that it's not a good idea as part of the curriculum as opposed to banning it from 00:19:36.080 |
It just – the state doesn't have the power to simply say – to ban, you know, what teachers 00:19:43.060 |
Now it gets a little more complicated when you talk about K-12 because the state has 00:19:47.840 |
a role in deciding what public K-12 teaches because they're your kids. 00:19:57.920 |
There is democratic oversight of that process. 00:20:00.680 |
So for K-12, is there also a lean towards the left in terms of the administration that 00:20:20.600 |
But we have run into a lot of problems with education schools at FIRE. 00:20:24.640 |
And a lot of the graduates of education school end up being the administrators who clamp 00:20:31.240 |
And so I've been trying to think of positive ways to take on some of the problems that 00:20:37.480 |
I thought that the attempt to just dictate you won't teach the following 10 books, you 00:20:43.240 |
know, or 20 books or 200 books was the wrong way to do it. 00:20:46.600 |
Now when it comes to deciding what books are in the curriculum, again, that's something 00:20:49.840 |
the legislature actually can't have some say in. 00:20:52.720 |
And that's pretty uncontroversial in terms of the law. 00:20:56.280 |
But when it comes to how you fight it, I had something that, since I'm kind of stuck with 00:20:59.720 |
a formula, I called Empowering of the American Mind. 00:21:03.240 |
I gave principles that were inconsistent with the sort of group think and heavy emphasis 00:21:09.440 |
on identity politics that, you know, some of the critics are rightfully complaining 00:21:17.800 |
And that is actually in The Canceling of the American Mind. 00:21:22.040 |
But I have a more detailed explanation of it that I'm going to be putting up on my blog, 00:21:27.720 |
Is it possible to legally, this is a silly question perhaps, create an extra protection 00:21:33.280 |
for certain kinds of literature, 1984 or something, to remain in the curriculum? 00:21:38.520 |
I mean, it's already, it's all protected, I guess. 00:21:41.080 |
I guess to protect against administrators from fiddling too much with the curriculum, 00:21:49.480 |
I don't know what the machinery of the K-12 public school. 00:21:54.000 |
In K-12, you know, state legislatures, you know- 00:21:59.000 |
And they can say like, "You should teach the following books." 00:22:01.080 |
Now, of course, people are always a little bit worried that if you, if they were to recommend, 00:22:06.560 |
you know, teach the Declaration of Independence, you know, that it will end up being, well, 00:22:11.320 |
they're going to teach the Declaration of Independence was just to protect slavery, 00:22:16.360 |
So teaching a particular topic matters, which textbooks you choose, which perspective you 00:22:21.920 |
Of course, there's like, religion starts to creep into the whole question of like, how, 00:22:25.040 |
you know, is the Bible, are you allowed to teach, incorporate that into education? 00:22:30.600 |
I mean, I'm an atheist with an intense interest in religion. 00:22:35.160 |
I actually read the entire Bible this year just because I do stuff like that. 00:22:38.480 |
And I never actually had read it from beginning to end. 00:22:40.680 |
Then I read the Quran because, you know, and I'm going to try to do the Book of Mormon, 00:22:48.600 |
I think you should, just to know, because it's such a touchstone in the way people talk 00:22:56.440 |
It can get pretty tedious, but I even made myself read through all of the very specific 00:23:02.380 |
instructions on how tall the different parts of the temple need to be and how long the 00:23:07.200 |
garbs need to be and what shape they need to be and what, like, and those go on a lot. 00:23:13.720 |
That's a surprisingly big chunk of Exodus. 00:23:17.960 |
I thought that was more like in Leviticus and Deuteronomy. 00:23:20.880 |
But then you get to books like Job, you know, wow. 00:23:24.200 |
I mean, Job is such a read and no way Job originally had that ending. 00:23:28.480 |
Like Job is basically, it starts out as this perverse bet between God and Satan about whether 00:23:36.300 |
or not they can actually make a good man renounce God. 00:23:39.120 |
And initially they can't, it's all going very predictably. 00:23:42.040 |
And then they finally really torture Job and he turns into the best, why is God cruel? 00:23:52.400 |
And he beats, he turns into like the best lawyer in the entire world and he defeats 00:23:57.160 |
All the people who come to argue with him, he argues the pants off of them. 00:24:01.040 |
And then suddenly at the end, God shows up and he's like, well, you know, I am everywhere. 00:24:12.000 |
He gives an answer kind of like, I am there when lionesses give birth and I am there. 00:24:16.880 |
And by the way, there's this giant monster Leviathan that's very big and it's very scary 00:24:23.600 |
And I'm kind of like, God, are you saying that you're very busy? 00:24:31.160 |
And you don't mention the whole kind of like, that I have a bet that's why I was torturing 00:24:39.200 |
And then at the end, God decides, Job's like, oh no, you're totally right. 00:24:44.920 |
And God says, I'm going to punish those people who tried to argue with you and didn't win. 00:24:50.480 |
So he gets rid of the, I don't know exactly what he does to them, I don't remember. 00:24:54.520 |
And then he gives Job all his money back and it makes him super prosperous. 00:24:58.680 |
And I'm like, no way that was the original ending of that book. 00:25:03.040 |
Because this was clearly a beloved novel that they were like, but it can't have that ending. 00:25:08.360 |
So it's a long way of saying, I actually think it's worthwhile. 00:25:12.040 |
Some of it was, you're always kind of surprised when you end up in the part. 00:25:17.160 |
There are parts of it that will sneak up on you. 00:25:28.040 |
So what, it'd be interesting to ask, is there a tension between the study of religious texts 00:25:35.160 |
or the following of religion and just believing in God and following the various aspects of 00:25:44.960 |
In the First Amendment, we have something that we call the religion clause. 00:25:48.960 |
And I've never liked calling it just that because it's two brilliant things right next 00:25:54.160 |
The state may not establish an official religion, and it cannot interfere with your right to 00:26:06.120 |
And I think sometimes the right gets very excited about the free exercise clause and the 00:26:12.200 |
And I like the fact that we have both of them together. 00:26:15.240 |
Now how does this relate to freedom of speech? 00:26:17.720 |
And how does it relate to the curriculum like we were talking about? 00:26:21.720 |
I actually think it would be great if public schools could teach the Bible, like in the 00:26:27.360 |
sense of like read it as a historical document. 00:26:30.200 |
But back when I was at the ACLU, every time I saw people trying this, it always turned 00:26:34.160 |
into them actually advocating for, you know, a Catholic or a Protestant or some, or Orthodox 00:26:43.320 |
So if you actually make it into something advocating for a particular view on religion, 00:26:47.640 |
then it crosses into the establishment clause side. 00:26:50.700 |
So Americans haven't figured out a way to actually teach it. 00:26:53.460 |
So it's probably better that you, you know, learn outside of a public school class. 00:26:57.480 |
Do you think it's possible to teach religion from like a world religions kind of course 00:27:09.640 |
I think the answer is it depends on from whose perspective. 00:27:12.400 |
Well, like the practitioners. Say you're an Orthodox follower of a particular religion. 00:27:19.280 |
Is it possible to not piss you off in teaching like all the major religions of the world? 00:27:25.160 |
For some people, the bottom line is you have to teach it as true. 00:27:31.800 |
And with that, under those conditions, then the answer is no, you can't teach it without 00:27:38.280 |
Can't you say these people believe it's true? 00:27:41.960 |
So you have to walk on eggshells, essentially. 00:27:44.400 |
You can try really hard and you will still make some people angry, but serious people 00:27:49.320 |
will be like, "Oh no, you actually tried to be fair to the beliefs here." 00:27:53.400 |
And I try to be respectful as much as I can about a lot of this. 00:27:58.400 |
I still find myself much more drawn to both Buddhism and Stoicism though. 00:28:08.200 |
One interesting thing to get back to college campuses is FIRE keeps the college free speech 00:28:17.960 |
I highly recommend it because forget even just the ranking, you get to learn a lot about 00:28:21.840 |
the universities from this entirely different perspective than people are used to when they 00:28:26.160 |
go to pick whatever university they want to go to. 00:28:29.560 |
It just gives another perspective on the whole thing. 00:28:32.320 |
And it gives quotes from people that are students there and so on about their experiences. 00:28:37.440 |
And it gives different, maybe you could speak to the various measures here before we talk 00:28:42.240 |
about who's in the top five and who's in the bottom five. 00:28:45.720 |
What are the different parameters that contribute to the evaluation? 00:28:50.320 |
So people have been asking me since day one to do a ranking of schools according to freedom 00:28:55.840 |
And even though we have the best database in existence of campus speech codes, policies 00:29:03.240 |
that universities have that violate First Amendment or First Amendment norms, we also 00:29:07.400 |
have the best database of what we call the disinvitation database. 00:29:12.240 |
But it's better named the deplatforming database, which is what we're 00:29:17.760 |
And these are all cases where somebody was invited as a speaker to campus and they were 00:29:23.880 |
Disinvited or deplatforming also includes shouting down. 00:29:27.280 |
So they showed up and they couldn't really speak. 00:29:33.800 |
And so having that, what we really needed in order to have some serious social science 00:29:38.200 |
to really make a serious argument about what the ranking was, was to be able to one, get 00:29:43.760 |
a better sense of how many professors were actually getting punished during this time. 00:29:49.120 |
And then the biggest missing element was to be able to ask students directly what the 00:29:55.800 |
environment was like on that campus for freedom of speech. 00:29:58.920 |
Are you comfortable disagreeing with each other? 00:30:00.820 |
Are you comfortable disagreeing with your professors? 00:30:03.920 |
Do you think violence is acceptable in response to a speaker? 00:30:09.920 |
Do you think blocking people's access to a speaker is okay? 00:30:14.960 |
And once we were able to get all those elements together, we first did a test run, I think 00:30:22.720 |
And we've been doing it for four years now, always trying to make the methodology more 00:30:26.980 |
and more precise to better reflect the actual environment at particular schools. 00:30:32.780 |
And this year, the number one school was Michigan Technological University, which was a nice 00:30:38.840 |
The number two school was actually Auburn University, which was nice to see. 00:30:44.720 |
In the top 10, the most well-known prestigious school is actually UVA, which did really well 00:30:50.360 |
University of Chicago was not happy that they weren't number one, but University of Chicago 00:30:54.880 |
was 13, and they had been number one or in the top three for years prior to that. 00:31:03.200 |
Is it because of the really strong economics departments and things like this or why? 00:31:09.340 |
They wouldn't recognize a chapter of Turning Point USA. 00:31:12.640 |
And they made a very classic argument that we, and classic in the bad way, that we hear 00:31:18.120 |
campuses across the country, "Oh, we have a campus Republicans, so we don't need this 00:31:26.160 |
We've seen dozens and dozens, if not hundreds of attempts to get this one particular conservative 00:31:32.000 |
student group derecognized or not recognized. 00:31:36.720 |
And so we told them, like, "Listen, we told them at FIRE that we consider this serious," 00:31:45.420 |
So that's a point down in our ranking, and it was enough to knock them from—they probably 00:31:56.080 |
They're still one of the best schools in the country. 00:32:00.080 |
The school that did not do so well was at a negative 10.69, negative 10.69, and we rounded up to zero. 00:32:11.520 |
And Harvard has been not very happy with that result. 00:32:15.400 |
The only school to receive the abysmal ranking. 00:32:22.000 |
There are a couple people who have actually been really, I think, making a mistake by 00:32:25.360 |
getting very Harvard-sounding, by being like, "I've had statisticians look at this, and 00:32:31.120 |
they think your methodology is a joke," and pointing out, "And this case wasn't that 00:32:35.200 |
important, and that scholar wasn't—that scholar—" 00:32:37.640 |
Like one of the arguments against one of the scholars that we counted against them for 00:32:40.920 |
punishing, was that that wasn't a very famous or influential scholar. 00:32:46.360 |
So your argument seems to be snobbery, like essentially that you're not understanding 00:32:56.240 |
And then you're saying that actually that scholar wasn't important enough to count. 00:33:08.960 |
Even if we took all of your arguments as true, even if we decided to get rid of those two 00:33:13.760 |
professors, you would still be in negative numbers. 00:33:18.880 |
You would still be after Georgetown and Penn, and neither of those schools are good for free speech. 00:33:23.520 |
We should say the bottom five is the University of Pennsylvania, like you said, Penn, the 00:33:28.360 |
University of South Carolina, Georgetown University, and Fordham University. 00:33:33.920 |
They have so many bad cases at all of those schools. 00:33:36.720 |
What's the best way to find yourself in the bottom five if you're a university? 00:33:41.120 |
What's the fastest way to that negative, to that zero? 00:33:47.120 |
When we looked at the bottom five, 81% of attempts to get speakers deplatformed were 00:33:55.120 |
There were a couple of schools, I think Penn included, where every single attempt, every 00:33:59.160 |
time a student objected, a student group objected to that speaker coming, they canceled the speaker. 00:34:10.080 |
I think Harvard did stand up for a couple, but mostly people got deplatformed there as well. 00:34:23.760 |
What's the dynamics of pushing back of, basically, 'cause I imagine some of it is culture, but 00:34:33.240 |
I imagine every university has a bunch of students who will protest basically every speaker. 00:34:40.320 |
- Well, here's the dirty little secret about the big change in 2014. 00:34:46.000 |
And FIRE and me and Haidt have been very clear that the big change that we saw on campus 00:34:51.000 |
was that for most of my career, students were great on freedom of speech. 00:34:55.520 |
They were the best constituency for free speech, absolutely unambiguously, until about 2013, 00:35:02.600 |
And it was only in 2014 where we had this very kind of sad for us experience where suddenly 00:35:07.200 |
students were the ones advocating for deplatforming and new speech codes, kind of in a similar 00:35:12.360 |
way that they had been doing in, say, like the mid '80s, for example. 00:35:20.720 |
It's students and administrators, sometimes only a handful of them, though, working together 00:35:28.280 |
And this was exactly what happened at Stanford when Kyle Duncan, a Fifth Circuit judge, tried 00:35:33.340 |
to speak at my alma mater and a fifth of the class showed up to shout him down. 00:35:38.740 |
It was a real showing of what was going on that 10 minutes into the shout down of a Fifth 00:35:45.400 |
Circuit judge, and I keep on emphasizing that because I'm a constitutional lawyer. 00:35:53.600 |
You know, about a fifth of the school shows up to shout him down. 00:35:56.760 |
After 10 minutes of shouting him down, an administrator, a DEI administrator, gets up 00:36:03.440 |
That's a seven-minute long speech where she talks about free speech. 00:36:09.920 |
And we're at this law school where people could learn to challenge these norms. 00:36:15.020 |
So it's clear that there was coordination, you know, amongst some of these administrators. 00:36:19.280 |
And from talking to students there, they were in meetings, extensive meetings for a long time. 00:36:25.720 |
Then they take additional seven minutes to lecture the speaker on free speech not being 00:36:32.440 |
– the juice of free speech not being worth the squeeze. 00:36:36.520 |
And then for the rest of it, it's just constant heckling after she leaves. 00:36:42.040 |
This is clearly – and something very similar, you know, happened a number of times at Yale 00:36:46.280 |
where it was very clearly administrators were helping along with a lot of these disruptions. 00:36:51.440 |
So I think every time there is a shout down at a university, the investigation should 00:36:56.780 |
be first and foremost, did administrators help create this problem? 00:37:05.320 |
Because I think a lot of what's really going on here is the hyper-bureaucratization of 00:37:08.560 |
universities with a lot more ideological people who think of their primary job as basically 00:37:16.200 |
They're encouraging students – sorry, they're encouraging students who have opinions they 00:37:23.200 |
And that's why they really need to investigate this. 00:37:26.600 |
And it is at Stanford, the administrator who gave the prepared remarks about the juice 00:37:32.840 |
not being worth the squeeze, she has not been invited back to Stanford. 00:37:36.260 |
But she's one of the only examples I can think of when these things happen a lot where 00:37:40.440 |
an administrator clearly facilitated something that was a shout down or a deplatforming or 00:37:45.580 |
resulted in a professor getting fired or resulted in a student getting expelled, where the administrator 00:37:51.120 |
has got off scot-free or probably in some cases even got a promotion. 00:37:55.560 |
And so a small number of administrators, maybe even a single administrator, could participate 00:38:01.960 |
in the encouraging and the organization and thereby empower the whole process. 00:38:06.360 |
And that's something I've seen throughout my entire career. 00:38:08.520 |
And the only thing is it's kind of hard to catch this sort of in the act, so to speak. 00:38:12.040 |
And that's one of the reasons why it's helpful for people to know about this, you 00:38:21.240 |
And we actually featured this in a documentary made in 2015, that came out in 2015, 2016, 00:38:29.520 |
And this was when we started noticing something was changing on campus. 00:38:32.560 |
We also heard that comedians were saying that they couldn't use their good humor anymore. 00:38:36.460 |
This was right around the time that Jerry Seinfeld and Chris Rock said that they couldn't 00:38:40.320 |
– they didn't want to play on campuses because they couldn't be funny. 00:38:45.480 |
But we featured a case of a comedian who wanted to do a musical called The Passion of the 00:38:50.600 |
Musical, making fun of the Passion of the Christ, with the stated goal of offending 00:39:00.640 |
And it's an unusual case because we actually got documentation of administrators buying 00:39:07.040 |
tickets for angry students and holding an event where they trained them to jump up in 00:39:16.080 |
Like they bought them tickets, they sent them to this thing with the goal of shouting it 00:39:22.440 |
Now unsurprisingly, when you send an angry group of students to shut down a play, it's 00:39:36.320 |
And then the Pullman, Washington police told Chris Lee, the guy who made the play, that 00:39:43.800 |
Now, it's not every day you're going to have that kind of hard evidence that – of 00:39:47.960 |
actually seeing the administrators be so brazen that they recorded the fact that they bought 00:39:58.400 |
And I think it's a good excuse to cut down on one of the big problems in higher education 00:40:04.240 |
In your experience, is there a distinction between administrators and faculty in terms 00:40:13.240 |
So if we got rid of all – like Harvey's talked about getting rid of a large percentage 00:40:18.440 |
of the administration, does that help fix the problem or is the faculty also – small 00:40:25.200 |
percent of the faculty also part of the encouraging and the organization of these kind of cancel 00:40:31.480 |
And that's something that has been profoundly disappointing, is that when you look at the 00:40:35.400 |
huge uptick in attempts to get professors fired that we've seen over the last 10 years, 00:40:41.160 |
and actually over the last 22 years, as far back as our records go, at first, they were 00:40:48.080 |
overwhelmingly led by administrators, attempts to get professors punished. 00:40:53.680 |
And that was most – I'd say that was my career up until 2013 was fighting back at administrators. 00:41:00.400 |
Then you start having the problem in 2014 of students trying to get people canceled. 00:41:07.240 |
And the number – so one way that – one thing that makes it easier to document are 00:41:11.560 |
the petitions to get professors fired or punished and how disproportionately that those actually 00:41:18.240 |
But another big uptick has been fellow professors demanding that their fellow professors get fired. 00:41:28.240 |
You shouldn't be proud of signing the petition to get your fellow professor fired. 00:41:32.720 |
And what's even more shameful is that we get – this has almost become a cliche within 00:41:40.960 |
When someone is facing one of these cancellation campaigns as a professor, I would get letters 00:41:45.320 |
from some of my friends saying, "I am so sorry this has happened to you." 00:41:50.440 |
And these were the same people who publicly signed the petition to get them fired. 00:42:00.360 |
Integrity is an important thing in this world. 00:42:02.640 |
And I think some of it – I'm so surprised people don't stand up more for this. 00:42:11.560 |
And if you have the guts as a faculty or an administrator to really stand up with eloquence, 00:42:21.040 |
with rigor, with integrity, I feel like it's impossible for anyone to do anything. 00:42:28.120 |
Because there's such a hunger – it's so refreshing. 00:42:31.240 |
I think everybody agrees that freedom of speech is a good thing. 00:42:39.920 |
I think the majority of people, even at the universities, that there's a hunger, but 00:42:43.440 |
it's almost like this kind of nervousness around it because there's a small number 00:42:51.440 |
So I mean, again, that's where great leadership comes in. 00:42:54.640 |
And so presidents of universities should probably be making clear declarations of like, "This 00:43:01.160 |
is a place where we value the freedom of expression." 00:43:08.160 |
A president, a university president who puts their foot down early and says, "Nope. 00:43:14.120 |
We are not entertaining firing this professor. 00:43:21.920 |
Although sometimes – and this is where you can really tell the administrative involvement 00:43:24.760 |
– students will do things like take over the president's office, and then that takeover is catered. 00:43:32.420 |
People point this out sometimes as being kind of like, "Oh, it's clearly –" like 00:43:36.340 |
my friend Sam Abrams when they tried to get him fired at Sarah Lawrence College. 00:43:42.740 |
And that was one of the times that it was used as kind of like, "Oh, this was hostile 00:43:47.600 |
to the university because the students took over the president's office." 00:43:51.380 |
And I'm like, "No, they let them take over the president's office." 00:43:54.020 |
And I don't know if that was one of the cases in which the takeover was catered, but 00:43:58.020 |
if there was ever sort of like a sign that's kind of like, "Yes, this is actually really 00:44:04.500 |
Well, in some sense, like protesting and having really strong opinions, even like ridiculous, 00:44:09.900 |
It's just it shouldn't lead to actual firing or deplatforming of people. 00:44:15.100 |
It's just not good for a university to support that and take action based on it. 00:44:20.100 |
And this is one of those like tensions in First Amendment that actually I think has 00:44:26.780 |
You absolutely have the right to devote your life to ending freedom of speech and ridiculing 00:44:36.380 |
And there are people who really can come off as very contemptible about even the philosophy 00:44:47.820 |
And if you try to get a professor fired, we will be on the other side of that. 00:44:50.980 |
Now, I think you had Randy Kennedy, who I really love him. 00:44:56.220 |
But he criticized us for our deplatforming database as saying, "This is saying that 00:45:07.420 |
We at FIRE as an organization have defended the right to protest all the time. 00:45:11.700 |
We are constantly defending the rights of protesters. 00:45:15.100 |
Not believing the protesters have the right to say this would—basically, that would 00:45:21.740 |
We're not calling for punishing the protesters. 00:45:25.340 |
But what we are saying is you can't let the protesters win if they're demanding someone 00:45:31.260 |
So, the line there is between protesters protesting and the university taking action based on their demands. 00:45:44.660 |
And that's something where the university—the way you deal with that tension in First Amendment 00:45:49.700 |
law is essentially kind of like the one positive duty that the government has. 00:45:53.900 |
First, the negative duty, the thing that it's not allowed to do is censor you. 00:45:58.460 |
But its positive duty is that if I want to say awful things or, for that matter, great 00:46:03.300 |
things that aren't popular in a public park, you can't let the crowd just shout me down. 00:46:09.420 |
You can't allow what's called a heckler's veto. 00:46:14.260 |
That's so interesting because I feel like that comes into play on social media as well. 00:46:19.620 |
There's this whole discussion about censorship and freedom of speech, but to me, the carrot 00:46:25.140 |
question is almost more interesting once the freedom of speech is established is how do 00:46:30.860 |
you incentivize high-quality debate and disagreement? 00:46:35.940 |
And that's one of the things we talk about in The Canceling of the American Mind is arguing 00:46:38.620 |
towards truth and that cancel culture is cruel, it's merciless, it's anti-intellectual, but 00:46:46.260 |
it also will never get you anywhere near truth and you are going to waste so much time destroying 00:46:50.980 |
your opponents in something that can actually never get you to truth through the process, 00:46:56.100 |
of course, of you never actually get directly at truth. 00:47:01.180 |
But everybody having a megaphone on the internet with anonymity, it seems like it's better 00:47:07.620 |
than censorship, but it feels like there's incentives on top of that you can construct 00:47:20.940 |
To incentivize somebody who puts a huge amount of effort to make even the most ridiculous 00:47:25.140 |
arguments, but basically ones that don't include any of the things you highlight in terms of 00:47:30.620 |
all the rhetorical tricks to shut down conversations. 00:47:35.260 |
Just make really good arguments for whatever, it doesn't matter if it's communism, for fascism, 00:47:41.220 |
whatever the heck you want to say, but do it with scale, with historical context, with 00:47:48.100 |
steel man on the other side, all those kind of elements. 00:47:50.500 |
We try to make three major points in the book. 00:47:53.340 |
One is just simply cancel culture is real, it's a historic era, and it's on a historic 00:48:00.420 |
The second one is you should think of cancel culture as part of a rhetorical, as a larger 00:48:06.740 |
lazy rhetorical approach to what we refer to as winning arguments without winning arguments. 00:48:14.300 |
We mean that in two senses, without having winning arguments or actually having won arguments. 00:48:19.580 |
We talk about all the different what we call rhetorical fortresses that both the left and 00:48:23.740 |
the right have that prevent you from, that allow you to just dismiss the person or dodge 00:48:30.540 |
the argument without actually ever getting to the substance of the argument. 00:48:36.300 |
But the rhetorical fortress stuff is actually something I've been very passionate about 00:48:40.100 |
because it interferes with our ability to get at truth and it wastes time and frankly 00:48:45.580 |
it also, since cancel culture is part of that rhetorical tactic, it can also ruin lives. 00:48:50.580 |
- It would actually be really fun to talk about this particular aspect of the book and 00:48:54.580 |
I highly recommend if you're listening to this, go pre-order the book now. 00:49:04.100 |
Okay, so in the book you also have a list of cheap rhetorical tactics that both the 00:49:11.660 |
left and the right use and then you have a list of tactics that the left uses and the 00:49:19.620 |
So there's the rhetorical, the perfect rhetorical fortress that the left uses and the efficient rhetorical fortress that the right uses. 00:49:29.860 |
Maybe we can go through a few of them that capture your heart in this particular moment 00:49:33.620 |
as we talk about it and if you can describe examples of it or if there's aspects of it 00:49:44.220 |
So whataboutism is defending against criticism of your side by bringing up the other side's 00:49:51.540 |
- I want to make little cards of these, of all of these tactics and start using them 00:49:56.620 |
on X all the time because they are so commonly deployed and whataboutism I put first for 00:50:03.260 |
- You know, it'd be an interesting idea to actually integrate that into Twitter/X where 00:50:07.900 |
people instead of clicking heart, they can click which of the rhetorical tactics this 00:50:14.900 |
is and then, because you know there's actually community notes. 00:50:18.180 |
I don't know if you've seen on X that people can contribute notes and it's quite fascinating. 00:50:26.300 |
But to give it a little more structure, that's a really interesting method actually. 00:50:30.460 |
- Yeah, I actually, when I was thinking about ways that X could be used to argue towards 00:50:35.180 |
truth, I wouldn't want to have it so that everybody would be bound to that, but I think 00:50:40.100 |
that, I imagine it almost being like a stream within X that was truth-focused that agrees 00:50:46.300 |
to some additional rules on how they would argue. 00:50:50.980 |
Where like there's, in terms of streams that intersect and can be separated, the shit-talking 00:51:00.260 |
- And then there's like truth and then there's humor, then there's like good vibes. 00:51:09.500 |
I'm not like somebody who absolutely needs good vibes all the time, but sometimes it's 00:51:14.460 |
nice to just log in and not have to see the drama, the fighting, the bickering, the cancellations, 00:51:24.260 |
That's why I go to Reddit, r/aww, or like one of the cute animals ones where there's cute 00:51:32.500 |
- I just want to see Ryan Reynolds singing with Will Ferrell. 00:51:45.260 |
- Whataboutism, yeah, that's everywhere when you look at it, when you look at Twitter, 00:51:52.060 |
And the first, what we call the obstacle course is basically time-tested, old-fashioned, argumentative 00:52:01.420 |
And whataboutism is just bringing up something, like someone makes an argument like Biden 00:52:06.100 |
is corrupt and then someone says, well, Trump was worse. 00:52:10.140 |
And that's not an illegitimate argument to make back, but it does, it seems to happen 00:52:16.740 |
every time someone makes an assertion, someone just points out some other thing that was 00:52:20.460 |
And it can get increasingly attenuated from what you're actually trying to argue. 00:52:24.900 |
And you see this all the time on social media. 00:52:28.260 |
And it's kind of, I was a big fan of Jon Stewart's Daily Show, but an awful lot of what the humor 00:52:33.740 |
was and what the tactic was for arguing was this thing over here. 00:52:37.100 |
It's like, oh, I'm making this argument about this important problem. 00:52:39.580 |
Oh, actually, there's this other problem over here that I'm more concerned about. 00:52:48.260 |
So January 6, watching everybody arguing about CHOP, like the occupied part of Seattle or 00:52:56.940 |
the occupied part of Portland, and basically trying to like, oh, you're bringing up the 00:53:03.380 |
And by the way, I live on Capitol Hill, so believe me, I was very aware of how scary 00:53:09.860 |
You know, like I just, my dad grew up in Yugoslavia and that was a night where we all ate dinner 00:53:14.700 |
in the basement because I'm like, oh, when the shit goes down, eat in the basement. 00:53:20.940 |
And people would try to deflect from January 6 being serious by actually making the argument 00:53:26.220 |
that, oh, well, there are crazy, horrible things happening in all over the country, 00:53:30.820 |
you know, riots that came from some of the social justice protests. 00:53:35.980 |
And of course, the answer is you can be concerned about both of these things and find them both 00:53:41.380 |
But you know, if I'm arguing about CHOP, you know, someone bringing up January 6 isn't 00:53:47.860 |
Or if I'm arguing about January 6, someone bringing up the riots in 2020 isn't that helpful. 00:53:53.260 |
We took a long, dark journey from whataboutism. 00:53:56.580 |
And related to that is strawmanning and steelmanning. 00:53:59.740 |
So misrepresenting the perspective of the opposing side. 00:54:05.660 |
And this is something also, I guess, it's very prevalent. 00:54:10.460 |
And it's difficult to do the reverse of that, which is steelmanning. 00:54:14.260 |
It requires empathy, it requires eloquence, it requires understanding, actually doing 00:54:18.980 |
the research and understanding the alternative perspective. 00:54:23.260 |
- My wonderful employee, Angel Eduardo, has something that he calls starmanning. 00:54:29.940 |
It's nice to have, you know, two immigrant parents, because I remember being in San Francisco, 00:54:36.500 |
in the weird kind of like, ACLU slash Burning Man kind of cohort. 00:54:43.060 |
And having a friend there who was an artist who would talk about hating Kansas. 00:54:48.820 |
And that was his metaphor for middle America, is what he meant by it. 00:54:53.380 |
And but he was kind of proud of the fact that he hated Kansas. 00:54:57.060 |
And I'm like, you gotta understand, I still see all of you a little bit as foreigners. 00:55:02.180 |
And think about like, change the name of Kansas to Croatia, you know, change the name of Kansas 00:55:12.100 |
And the starmanning idea, which I like, is the idea of being like, so you're saying that 00:55:16.820 |
you really hate your dominant religious minority. 00:55:20.700 |
And that's when you start actually detaching yourself a little bit from it. 00:55:28.140 |
But some of our dynamics are incredibly typical. 00:55:31.400 |
It's one of the reasons why like, when people start reading Thomas Sowell, for example, 00:55:35.940 |
Because one of the things he does, is he does comparative analysis of countries problems 00:55:40.220 |
and points out that some of these things that we think are just unique to the United States 00:55:43.700 |
exist in, you know, 75% of the rest of the countries in the world. 00:55:48.500 |
Francis Fukuyama's, the book that I'm reading right now, The Origins of Political Order, 00:55:52.980 |
actually does this wonderful job of pointing out how we're not special in a variety of 00:55:57.940 |
This is actually something that's very much on my mind. 00:56:05.180 |
It's not, it's stilted a little bit in its writing. 00:56:09.140 |
Because his term for one of the things he's concerned about what destroys societies is 00:56:12.900 |
repatrimonialization, which is the reversion to societies in which you favor your family 00:56:24.540 |
And I actually think a lot of what I'm seeing in sort of in the United States, it makes 00:56:30.100 |
me worried that we might be going through a little bit of a process of repatrimonialization. 00:56:34.100 |
And I think that's one of the reasons why people are so angry. 00:56:35.940 |
I think having, I think the prospect that we, you know, we very nearly seem to have 00:56:42.380 |
an election that was going to be, you know, Jeb Bush versus Hillary Clinton. 00:56:50.860 |
But also it's one of the reasons why people are getting so angry about legacy admissions, 00:56:54.620 |
about like how much, you know, certain families seem to be able to keep their people in the 00:56:59.940 |
upper classes of the United States perpetually. 00:57:02.780 |
And believe me, like I was poor when I was a kid and I went to, and I got to go to, I 00:57:12.060 |
And I got to see how people, they treat you differently in a way that's almost insulting. 00:57:20.140 |
Like basically like suddenly to a certain kind of person, I was a legitimate person. 00:57:25.620 |
And I look at how much America relies on Harvard, on Yale to produce its, I'm going to use a 00:57:35.500 |
And that's one of the reasons why you have to be particularly worried about what goes 00:57:40.900 |
And these elite colleges with the exception of University of Chicago and UVA do really badly. 00:57:52.740 |
It doesn't bode well for the future of the protection of freedom of speech for the rest 00:57:57.240 |
So can you also empathize there with the folks who voted for Donald Trump? 00:58:04.700 |
Because as precisely that as a resistance to this kind of momentum of the ruling class, 00:58:14.780 |
this royalty that passes on the rule from generation to generation. 00:58:20.020 |
I try really hard to empathize with to a degree everybody and try to really see where they're 00:58:30.500 |
I mean like I feel like the book, so "The Coddling of the American Mind" was a book that could be 00:58:38.020 |
sort of a crowd pleaser to a degree, partially because we really meant what we said in the 00:58:43.780 |
subtitle that these are good intentions and bad ideas that are hurting people. 00:58:50.440 |
And if you understand it and read the book, you can say it's like, "Okay, this isn't anybody 00:58:58.740 |
They're just doing it in a way that actually can actually lead to greater anxiety, depression, 00:59:03.380 |
and strangely, eventually pose a threat to freedom of speech." 00:59:08.280 |
But in this one, we can't be quite—me and my—oh, I haven't even mentioned my brilliant 00:59:13.020 |
co-author, Rikki Schlott, a 23-year-old genius. 00:59:19.100 |
I started working with her when she was 20, who's my co-author on this book. 00:59:22.300 |
So when I'm saying we, I'm talking about me and Rikki. 00:59:30.740 |
But we can't actually write this in a way that's too kind because cancelers aren't kind. 00:59:35.620 |
There's a cruelty and a mercilessness about it. 00:59:38.060 |
I mean, I started getting really depressed this past year when I was writing it, and 00:59:41.940 |
I didn't even want to tell my staff why I was getting so anxious and depressed. 00:59:45.220 |
It's partially because I'm talking about people who will, you know, in some of the cases we're 00:59:50.100 |
talking about, go to your house, target your kids. 00:59:54.500 |
So that's a long-winded way of saying the—I kind of can get what sort of drives the right 01:00:03.220 |
I feel like they're constantly feeling like they're being gaslit. 01:00:08.180 |
Elite education is really insulting to the working class. 01:00:12.780 |
Like part of the ideology that is dominant right now kind of treats almost 70% of the 01:00:18.660 |
American public like they're—we developed this a little bit in The Perfect Rhetorical 01:00:22.580 |
Fortress—like they're to some way illegitimate and not worthy of respect or compassion. 01:00:30.220 |
Yeah, the general elitism that radiates, self-fueling elitism that radiates from the people that 01:00:40.420 |
And what's funny is the elitism has been repackaged as a kind of—it masquerades as 01:00:49.340 |
kind of infinite compassion, that essentially it's based in a sort of very, to be frank, 01:00:56.100 |
overly simple ideology, an oversimplified explanation of the world, breaking people 01:01:01.820 |
into groups and judging people on how oppressed they are on the intersection of their various 01:01:10.540 |
And it came to that, I think, initially with an added appeal from a compassionate core, 01:01:17.140 |
but it gets used in a way that can be very cruel, very dismissive, compassionless, and 01:01:24.860 |
allows you to not take seriously most of your fellow human beings. 01:01:30.820 |
Maybe you can explore why a thing that has—kind of sounds good at first—can be, can create, 01:01:41.500 |
can become such a cruel weapon of canceling and hurting people and ignoring people. 01:01:46.660 |
I mean, this is what you described with the Perfect Rhetorical Fortress, which is a set 01:01:51.780 |
Maybe you can elaborate on what the Perfect Rhetorical Fortress is. 01:01:55.100 |
Yeah, so the Perfect Rhetorical Fortress is the way that's been developed on the left 01:02:01.780 |
to not ever get to someone's actual argument. 01:02:05.180 |
I want to make a chart, like a flow chart of this, about like, "Here's the argument 01:02:08.660 |
and here is this perfect fortress that will deflect you every time from getting to the 01:02:14.700 |
And I started to notice this, certainly, when I was in law school, that there were lots 01:02:20.660 |
And Perfect Rhetorical Fortress step one—and I can attest to this because I was guilty 01:02:26.300 |
of this as well—that you can dismiss people if you can argue that they're conservative. 01:02:31.740 |
They don't have to be conservative, to be clear. 01:02:36.860 |
So I never read Thomas Sowell because he was a right-winger. 01:02:40.940 |
I didn't read Camille Paglia because I was—someone had convinced me she was a right-winger. 01:02:46.020 |
There were lots of authors that—and when I was in law school, among a lot of very bright 01:02:51.940 |
people, it really was already an intellectual habit that if you could designate something 01:02:57.880 |
conservative, then you didn't really have to think about it very much anymore or take 01:03:02.020 |
This is a childish way of arguing, but nonetheless, I engaged in it. 01:03:08.100 |
I even mentioned in the book there was a time when a gay activist friend who was, I think, 01:03:14.980 |
steadily to my left, but nonetheless had that pragmatic experience of actually being an 01:03:18.980 |
activist said something like, "Well, just because someone's conservative doesn't mean 01:03:23.140 |
And I remember feeling kind of scandalized at some level, just being like, "Whoa, that's 01:03:27.540 |
kind of—isn't that the whole thing we're saying, is that they're just kind of bad people 01:03:31.540 |
You can just throw, "Oh, that guy's a right-winger." 01:03:37.380 |
Yeah, and then it can—if you're popular enough, it can be kind of sticky. 01:03:48.460 |
Essentially, it should have hit someone's—because I have a great liberal pedigree, everything 01:03:56.160 |
from working at the ACLU to doing refugee law in Eastern Europe. 01:03:59.380 |
I was part of an environmental mentoring program for inner-city high school kids in DC. 01:04:07.500 |
I can defend myself as being on the left, but I hate doing that because there's also 01:04:17.740 |
Are you really saying that if you can magically make me argue or convince yourself that I'm 01:04:23.660 |
on the right that you don't have to listen to me anymore?" 01:04:28.540 |
The reason why this has become so popular is because even among—or maybe especially 01:04:33.380 |
among elites, it works so effectively as a perfect weapon that you can use uncritically. 01:04:38.580 |
If I can just prove you're on the right, I don't have to think about you. 01:04:42.100 |
It's no wonder that suddenly you start seeing people calling the ACLU right wing and calling 01:04:47.100 |
the New York Times right wing because it's been such an effective way to delegitimize 01:04:55.580 |
Steven Pinker, who's on our board of advisors, he refers to academia as being the left pole, 01:05:01.980 |
that essentially it's a position that from that point of view, everything looks as if 01:05:09.260 |
But once it becomes a tactic that we accept, and it's one of the reasons why I'm more 01:05:16.460 |
on the left, but I think I'm a left-of-center liberal. 01:05:23.740 |
And initially I was kind of like, "Should I really be writing something with someone 01:05:29.500 |
I have to actually live up to what I believe on this stuff because it's ridiculous that 01:05:33.620 |
we have this primitive idea that you can dismiss someone as soon as you claim rightly or wrongly 01:05:39.900 |
Well, correct me if I'm wrong, but I feel like you were recently called right wing. 01:05:46.340 |
FIRE, maybe you by association because of that debate. 01:05:54.220 |
So, yes, there's an article, there's a debate. 01:05:57.020 |
I can't wait to watch it because I don't think it's available yet to watch on video. 01:06:04.220 |
But FIRE was in part supporting it, and then the LA Times wrote a scathing article saying that 01:06:10.820 |
everybody in the debate was basically leaning right. 01:06:17.460 |
You know, Bari Weiss has this great project, The Free Press. 01:06:22.540 |
It's covering stories that a lot of the media, right or left, isn't willing to cover. 01:06:28.420 |
We hosted a debate with her and we wanted to make it as fun and controversial 01:06:37.500 |
So FIRE and The Free Press hosted a debate, "Did the Sexual Revolution Fail?" 01:06:42.400 |
So the debate was really exciting, really fun. 01:06:45.060 |
The side that said the sexual revolution wasn't a failure was the one Grimes and Sarah Haider were on. 01:06:52.060 |
It was, you know, a nice, meaty, thoughtful night. 01:06:56.180 |
There was a review of it that was just sort of scathing about the whole 01:06:59.820 |
thing and it included a line saying that FIRE, which claims to believe in free speech but 01:07:05.180 |
only defends viewpoints it agrees with, I can't believe that even made it into the magazine. 01:07:09.900 |
Because it's not just calling us, because of course, you know, the implication of course 01:07:15.220 |
Actually, the staff leans decidedly more to the left than to the right. 01:07:19.940 |
But we also defend people all over the spectrum all the time. 01:07:24.180 |
Like that's something that even the most minimal Google search would have solved. 01:07:27.740 |
So like we've been giving LA Times some heat on this because it's like, yeah, if you said 01:07:32.340 |
in my opinion, they're right wing, we would have argued back, you know, saying, "Well, 01:07:37.500 |
here's the following 50,000 examples of us not being." 01:07:41.940 |
But when you actually make the factual claim that we only defend opinions we agree with, 01:07:46.340 |
first of all, there's no way for us to agree with opinions because we actually have a politically 01:07:50.820 |
diverse staff who won't even agree on which opinions are good and what opinions we have. 01:07:56.740 |
But yeah, I had one time when someone did something like this, and they were just being 01:08:01.780 |
a little bit flippant about kind of like free speech being fine. 01:08:04.660 |
I did a 70 tweet long thread, you know, just being like, "Hey, do you really think this 01:08:10.260 |
I decided not to do that on this particular one. 01:08:13.900 |
But the nice thing about it is it demonstrated two parts of the book, "The Canceling of the American Mind." 01:08:21.200 |
One of them is dismissing someone because they're conservative, because that was the 01:08:24.700 |
implication, don't have to listen to FIRE because they're conservative. 01:08:27.380 |
But the other one is something, a term that I invented specifically for the way people 01:08:32.420 |
argue on Twitter, which is hypocrisy projection. 01:08:35.460 |
"Hi, I'm a person who only cares about one side of the political fence, and I think everyone 01:08:43.300 |
And by the way, I haven't done any actual research on this, but I assume everyone else 01:08:50.980 |
And this happens to FIRE a lot where someone will be like, "Where is FIRE on this case?" 01:08:54.220 |
And we're like, "We are literally quoted in the link you just sent but didn't actually read." 01:09:02.540 |
It's like, "Here's our lawsuit about it from six months ago." 01:09:07.940 |
So it's a favorite thing and also, Jon Stewart, Daily Show, like the whataboutism and the 01:09:15.700 |
kind of like idea that these people must be hypocrites is something that great as comedy, 01:09:20.320 |
but as far as actually a rhetorical tactic that will get you to truth, just assuming 01:09:24.260 |
that your opponent or just accusing your opponent of always being a hypocrite is not a good 01:09:31.160 |
But by the way, it tends to always come from people who aren't actually consistent on free 01:09:37.220 |
So that's the projection part, but basically not doing the research about whether the person 01:09:37.220 |
is or isn't a hypocrite and assuming others or a large fraction of others reading it will 01:09:50.420 |
And therefore, this kind of statement becomes a kind of truthiness without a grounding in 01:09:57.180 |
It breaks down that barrier between what is and isn't true because if the mob says something 01:10:01.900 |
is true, it takes too much effort to correct it. 01:10:04.980 |
And there are three ways I want to respond to this, which is just giving example after 01:10:10.180 |
example of times where we defended people on both sides of every major issue, basically 01:10:15.460 |
every major issue, whether it's Israel-Palestine, whether it's terrorism, whether it's gay marriage, 01:10:20.340 |
whether it's abortion, we have defended both sides of that argument. 01:10:25.460 |
The other part, and I call these the orphans of the culture war, I really want to urge 01:10:30.500 |
the media to start caring about free speech cases that actually don't have a political 01:10:35.860 |
valence, that are actually just about good old-fashioned exercise of power against the 01:10:40.740 |
little guy or little girl or little group on campus or off campus for that matter. 01:10:46.540 |
Because these cases happen, and a lot of our litigation is just little people, just regular people 01:10:51.420 |
being told that they can't protest, that they can't hold signs. 01:10:54.500 |
And then the last part of the argument that I want people to really get is like, "Yeah, 01:10:58.100 |
and by the way, right-wingers get in trouble too, and there are attacks from the left, 01:11:05.940 |
You should care when Republicans get in trouble. 01:11:08.560 |
You should care when California has a DEI program that requires this: California Community 01:11:15.420 |
Colleges has a DEI policy that actually requires even chemistry professors 01:11:21.180 |
to work different DEI ideas, from intersectionality to anti-racism, into their classroom. 01:11:29.740 |
This is a gross violation of academic freedom. 01:11:33.720 |
It is as bad as it is to tell professors what they can't say, like we fought and defeated 01:11:39.020 |
It's even worse to tell them what they must say. 01:11:41.680 |
That's downright totalitarian, and we're suing against this. 01:11:45.160 |
And what I'm saying is that when you're dismissing someone for just being on the other side of 01:11:52.160 |
the political fence, you are also kind of claiming, making a claim that none of these 01:11:58.600 |
And I want people to care about censorship when it even is against people they hate. 01:12:09.680 |
If we can take that tangent briefly with DEI, diversity, equity, and inclusion, what is 01:12:17.600 |
the good and what is the harm of such programs? 01:12:20.360 |
- DEI, I know people who are DEI consultants. 01:12:24.280 |
Actually, I have a dear friend who I love very much who does DEI, an absolutely decent person. 01:12:32.400 |
What they want to do is create bonds of understanding, friendship, compassion among people who are 01:12:41.480 |
Unfortunately, the research on what a lot of DEI actually does is oftentimes the opposite 01:12:47.060 |
And I think that it's partially a problem with some of the ideology that comes from 01:12:52.040 |
critical race theory, which is a real thing, by the way, that informs a lot of DEI that 01:12:57.320 |
actually makes it something more likely to divide than unite. 01:13:00.600 |
We talk about this in Coddling the American Mind as the difference between common humanity 01:13:05.460 |
identity politics and common enemy identity politics. 01:13:09.680 |
And I think that I know some of the people that I know who do DEI, they really want it 01:13:14.840 |
to be common humanity identity politics, but some of the actual ideological assumptions 01:13:20.440 |
that are baked in can actually cause people to feel more alienated from each other. 01:13:25.720 |
Now, when I started at FIRE, my first cases involved 9/11, and it was bad. 01:13:33.680 |
Professors were getting targeted, professors were losing their jobs for saying insensitive 01:13:37.800 |
things about 9/11, and both from the right and the left. 01:13:41.600 |
Actually, in that case, sometimes a lot more from the right. 01:13:46.120 |
And it was really bad, and about five professors lost their jobs. 01:13:51.880 |
Five professors over a relatively short period of time being fired for a political opinion, 01:13:56.240 |
that's something that would get written up in any previous decade. 01:14:00.760 |
We're now evaluating how many professors have been targeted for cancellation between 2014 01:14:12.280 |
We're at about well over 1,000 attempts to get professors fired or punished, usually 01:14:18.520 |
driven by students and administrators, often driven by professors, unfortunately, as well. 01:14:24.200 |
About two-thirds of those result in the professor being punished in some way, everything from 01:14:30.440 |
having their article removed to suspension, et cetera. 01:14:35.200 |
About one-fifth of those result in professors being fired. 01:14:39.500 |
So right now, it's almost 200, it's around 190 professors being fired. 01:14:47.840 |
The Red Scare is generally considered to have been from 1947 to 1957. 01:14:52.520 |
It ended, by the way, in '57 when it finally became clear, thanks to the First Amendment, 01:14:57.880 |
that you couldn't actually fire people for their ideologies. 01:15:00.800 |
Prior to that, a lot of universities thought they could. 01:15:12.680 |
It was only '57 when the law was established. 01:15:14.680 |
So right now, these are happening in an environment where freedom of speech, academic freedom 01:15:18.800 |
are clearly protected at public colleges in the United States, and we're still seeing 01:15:27.040 |
During the Red Scare, the biggest study that was done of what was going on, I think this 01:15:31.720 |
came out in '55, and the evaluation was that there were about 62 professors fired for being 01:15:40.440 |
communists and about 90-something professors fired for political views overall. 01:15:52.120 |
So 60, 90, 100, depending on how you look at it. 01:15:55.080 |
I think the number is actually higher, but that's only because of hindsight. 01:16:00.420 |
What I mean by hindsight is we can look back and we actually find there were more professors 01:16:07.520 |
We're at 190 professors fired, and I still have to put up with people saying this isn't 01:16:13.320 |
I'm like, "In the nine and a half years of cancel culture, 190 professors fired. 01:16:17.360 |
In the 11 years of the Red Scare, probably somewhere around 100, probably more." 01:16:25.560 |
The number is going to keep going up, but unlike during the Red Scare where people could 01:16:29.600 |
clearly tell something was happening, the craziest thing about cancel culture is I'm 01:16:33.040 |
still dealing with people who are saying this isn't happening at all, and it hasn't been 01:16:38.800 |
We know that's a wild undercount, by the way, because when we surveyed professors, 17% of 01:16:44.240 |
them said that they had been threatened with investigation or actually investigated for 01:16:54.720 |
One third of them said that they were told by administrators not to take on controversial 01:17:01.240 |
Extrapolating that out, that's a huge number. 01:17:04.000 |
The reason why you're not going to hear about a lot of these cases is because there are 01:17:07.400 |
so many different conformity-inducing mechanisms in the whole thing. 01:17:11.960 |
That's one of the reasons why the idea that you'd add something requiring a DEI statement 01:17:16.880 |
to be hired or to get into a school under the current environment is so completely nuts. 01:17:22.680 |
We have had a genuine crisis of academic freedom over the last, particularly since 2017 on 01:17:29.520 |
We have very low viewpoint diversity to begin with. 01:17:33.040 |
Under these circumstances, administrators just start saying, "You know what the problem 01:17:43.800 |
We need another political litmus test," which is nuts. 01:17:47.440 |
That's what a DEI statement effectively is because there's no way to actually fill out 01:17:51.200 |
a DEI statement without someone evaluating you on your politics. 01:17:59.160 |
Nate Honeycutt got something like 3,000 professors to participate in evaluating 01:18:09.760 |
One was basically like the standard kind of identity politics intersectionality one. 01:18:21.680 |
As far as where my heart really is, it's that we have too little socioeconomic diversity, 01:18:26.320 |
particularly in elite higher ed, but also in education period. 01:18:30.800 |
The experiment had large participation and was really interestingly set up. 01:18:35.840 |
It tried to model the way a lot of these DEI policies were actually implemented. 01:18:40.280 |
One of the ways these have been implemented, and I think in some of the California schools, 01:18:43.960 |
is that administrators go through the DEI statements before anyone else looks at them 01:18:50.720 |
and then eliminate people off the top depending on how they feel about their DEI statements. 01:18:57.800 |
The one on viewpoint diversity, I think like half of the people who reviewed it would eliminate 01:19:06.440 |
I think it was basically the same for religious diversity. 01:19:09.520 |
It was slightly better, like 40% for socioeconomic diversity, but that kills me. 01:19:16.040 |
The idea that kind of like, "Yeah, that actually is the kind of diversity that I think we need 01:19:22.320 |
It's not hostile to the other kinds, by the way, but the idea that we need more people 01:19:27.160 |
from the bottom three quarters of American society in higher education, I think should 01:19:34.680 |
The only one that really succeeded was the one that spouted back exactly the kind of 01:19:38.840 |
ideology that they thought the reviewers would like, which is like, "Okay, there's no way 01:19:47.240 |
We've proved that it's a political litmus test and still school after school is adding 01:19:51.960 |
these to its application process to make schools still more ideologically homogenous." 01:19:59.400 |
Is it because it enforces a kind of groupthink where people start becoming 01:20:08.760 |
afraid to sort of think and speak freely, liberally about whatever? 01:20:15.640 |
Well, one, it selects for people who tend to be farther to the left in a situation where 01:20:21.320 |
universities already do lean decidedly that way. 01:20:26.840 |
But it also establishes essentially a set of sacred ideas that if you're being quizzed 01:20:33.040 |
on what you've done to advance anti-racism, how you've been conscious of intersectionality, 01:20:42.680 |
it's unlikely that you'd actually get in if you said, "By the way, I actually think these 01:20:48.800 |
I think they're philosophically not very defensible." 01:20:51.040 |
Basically, if your position was, "I actually reject these concepts as being oversimplified," 01:21:01.440 |
I think that the person that I always think of that wasn't a right winger that would be 01:21:06.160 |
like, "Go to hell," if you made him fill one of these things out, it's Feynman. 01:21:11.360 |
I feel like if you gave one of these things to Richard Feynman, he'd be like, he would 01:21:17.440 |
- Yeah, there's some element of it that creates this hard to pin down fear. 01:21:28.120 |
So you said like the firing, the thing I wanted to say is firing 100 people or 200 people, 01:21:34.080 |
the point is even firing one person, I've just seen it, it can create this quiet ripple 01:21:42.120 |
- That single firing of a faculty member has a ripple effect across tens of thousands of people, 01:21:49.520 |
of educators, of who is hired, what kind of conversations are being had, what kind of 01:21:55.600 |
textbooks are chosen, what kind of self-censorship and different flavors of that is happening. 01:22:02.200 |
- Yeah, I mean, when you ask professors about, "Are they intimidated under the current environment?" 01:22:10.160 |
The answer is yes, and particularly conservative professors already reporting that they're 01:22:16.680 |
afraid for their jobs in a lot of different cases. 01:22:18.760 |
- You have a lot of good statistics in the book, things like self-censorship. When provided 01:22:22.920 |
with a definition of self-censorship, at least a quarter of students said they self-censor 01:22:27.160 |
fairly often or very often during conversations with other students, with professors and during 01:22:31.520 |
classroom discussions, 25%, 27%, and 28% respectively. 01:22:37.440 |
A quarter of students also said that they are more likely to self-censor on campus now 01:22:43.240 |
at the time they were surveyed than they were when they first started college. 01:22:47.800 |
So sort of college is kind of instilling this idea of censorship, self-censorship. 01:22:54.360 |
- And back to the Red Scare comparison, and this is one of the interesting things about 01:22:57.400 |
the data as well, is that that same study that I was talking about, the most comprehensive 01:23:02.000 |
study of the Red Scare, there was polling about whether or not professors were self-censoring 01:23:09.440 |
And 9% of professors said that they were self-censoring their research and what they were saying. 01:23:18.120 |
That's almost a tenth of professors saying that their speech was chilled. 01:23:23.120 |
When we did this question for professors on our latest faculty survey, when you factor 01:23:28.780 |
together if they're, we asked them are they self-censoring in their research, are they 01:23:32.240 |
self-censoring in class, are they self-censoring online, et cetera, it was 90% of professors. 01:23:38.200 |
So the idea that we're actually in an environment that is historic in terms of how scared people 01:23:44.400 |
are actually of expressing controversial views, I think that it's the reason why we're gonna 01:23:49.540 |
actually be studying this in 50 years, the same way we studied the Red Scare. 01:23:55.320 |
The idea that this isn't happening will just be correctly viewed as insane. 01:23:59.640 |
- So maybe we can just discuss the leaning, the current leaning of academia towards the 01:24:05.720 |
left which you describe in various different perspectives. 01:24:08.840 |
So one, there's a voter registration ratio chart that you have by department which I 01:24:15.000 |
Can you explain this chart and can you explain what it shows? 01:24:18.920 |
- Yeah, when I started at FIRE in 2001, I didn't take the viewpoint diversity issue as seriously. 01:24:24.760 |
I thought it was just something that right-wingers complained about. 01:24:28.120 |
But I really started to get what happens when you have a community with low viewpoint diversity. 01:24:36.800 |
And actually a lot of the research that I got most interested in was done in conjunction 01:24:42.040 |
with the great Cass Sunstein who writes a lot about group polarization. 01:24:49.920 |
But essentially when you have groups with political diversity, and you can see this 01:24:55.000 |
actually in judges for example, it tends to produce reliably more moderate outcomes. 01:25:01.040 |
Whereas groups that have low political diversity tend to sort of spiral off in their own direction. 01:25:07.680 |
And when you have a super majority of people from just one political perspective, that's 01:25:13.880 |
It creates a situation where there are sacred ideas. 01:25:17.400 |
And when you look at some of the departments, I think the estimate from the Crimson is that 01:25:25.540 |
But when you look at different departments, there are elite departments that have literally 01:25:31.760 |
And I don't think that's a healthy intellectual environment. 01:25:35.760 |
The problem is definitely worse as you get more elite. 01:25:39.560 |
We definitely see more cases of lefty professors getting canceled at less elite schools. 01:25:44.920 |
It gets worse as you get down from the elite schools. 01:25:48.400 |
Where a lot of the one third of attempts to get professors punished that are successful 01:25:54.320 |
do come from the right and largely from off campus sources. 01:25:58.520 |
And we spend a lot of time talking about that in the book as well. 01:26:01.840 |
It's something that I do think is underappreciated. 01:26:05.200 |
But when it comes to the low viewpoint diversity, it works out kind of like you'd expect to 01:26:11.080 |
Economics is what, four to one or something like that. 01:26:15.920 |
But then when you start getting into some of the humanities, there are departments that 01:26:22.280 |
Is there a good answer to why the university's faculty and administration moved to the left? 01:26:31.000 |
I don't love, and this is an argument that you'll sometimes run into on the left, just 01:26:35.400 |
the argument that, well, people on the left are just smarter. 01:26:40.760 |
It's interesting because at least the research as of 10 years ago was indicating that if 01:26:45.840 |
you dig a little bit deeper into that, a lot of the people who do consider themselves on 01:26:49.400 |
the left tend to be a little bit more libertarian. 01:26:51.120 |
This is something that Pinker wrote a fair amount about. 01:26:56.920 |
The idea that we're just smarter is not an opinion I'm the least bit comfortable with. 01:27:04.720 |
I do think that departments take on momentum when they become a place where you're like, 01:27:11.360 |
wow, it'd be really unpleasant for me to work in this department if I'm the token conservative. 01:27:17.920 |
There are also departments where a lot of the ideology is kind of explicitly leftist. 01:27:25.160 |
You look at education schools, a lot of the stuff that is actually left over from what 01:27:30.520 |
is correctly called critical race theory is present. 01:27:33.880 |
And you end up having that in a number of the departments. 01:27:37.800 |
And it would be very strange in many departments to be a conservative social work professor. 01:27:44.040 |
I'm sure they exist, but there's a lot of pressure to shut up if you are. 01:27:50.720 |
- So the process on the left of cancellation, as you started to talk about with the perfect 01:27:56.000 |
rhetorical fortress, the first step is dismiss a person if you can put a label of conservative on them. 01:28:07.080 |
What other effective dismissal mechanisms are there? 01:28:13.320 |
- We have a little bit of fun with demographic numbers. 01:28:19.680 |
And I remember him being kind of like, don't include the actual percentage. 01:28:22.560 |
I'm like, no, we need to include the actual percentages because people are really bad 01:28:26.440 |
at estimating what the demographics of the US actually looks like, both the right and 01:28:33.580 |
So we put it in the numbers and we talk about being dismissed for being white or being dismissed 01:28:38.480 |
for being straight or being dismissed for being male. 01:28:43.440 |
And you can already dismiss people for being conservative. 01:28:45.680 |
And so we give examples in the book of these being used to dismiss people. 01:28:50.280 |
And oftentimes on topics not related to the fact that they are male or whether or not 01:28:55.520 |
And then we get to, I think it's like layer six. 01:28:59.720 |
You're down to 0.4% of the population and none of it mattered. 01:29:02.960 |
Because if you have the wrong opinion, even if you're in that 0.4% of the most intersectional 01:29:07.480 |
person who ever lived and you have the wrong opinion, you're a heretic and you actually 01:29:13.480 |
And the most interesting part of the research we did for this was just asking every prominent 01:29:20.200 |
black conservative and moderate that we knew personally, "Have you been told that you're not really black?" 01:29:32.640 |
And it's kind of funny because it's oftentimes white lefties telling them that. 01:29:37.760 |
John McWhorter talked about a reporter, when he showed that he 01:29:42.120 |
dissented from some of what he described as kind of like woke racism in his book, Woke Racism. 01:29:48.600 |
And the reporter actually is like, so do you consider yourself black? 01:29:54.160 |
And Coleman Hughes had one of the best quotes on it. 01:29:56.800 |
He said, I'm constantly being told that the most important thing to how legitimate my 01:30:06.040 |
But then when I have a dissenting opinion, I get told I'm not really black. 01:30:19.840 |
So you lay this out really nicely in the book, that there is this process of saying, 01:30:31.760 |
There's these categories that make it easier for you to dismiss a person's ideas based 01:30:38.480 |
And like you said, you end up in that tiny percentage and you can still dismiss. 01:30:42.360 |
We talk about this from a practical standpoint, the way the limitations on reality. 01:30:49.880 |
And a lot of cancel culture as cultural norms, as this way of winning arguments without winning 01:30:58.920 |
Because by the time you get down to the bottom of the – actually even to get a couple of 01:31:04.000 |
steps into the perfect rhetorical fortress and where has the time gone? 01:31:09.040 |
You probably just give up trying to actually have the argument and you never get to the 01:31:16.040 |
And all of these things are pretty sticky on social media. 01:31:19.520 |
Social media practically invented the perfect rhetorical fortress. 01:31:22.340 |
So that each one of those stages has a virality to it. 01:31:25.560 |
So it could stick and then it can get people really excited. 01:31:28.760 |
It allows you to feel outrage and superiority. 01:31:32.080 |
Because of that, at the scale of the virality, it allows you to never get to the actual discussion 01:31:37.840 |
But, you know, it's not just the left, it's the right. 01:31:44.500 |
So something to be proud of on the right, it's more efficient. 01:31:49.680 |
So you don't have to listen to liberals and anyone can be labeled a liberal if they have 01:31:55.120 |
I've seen liberal and left and leftist all used in the same kind of way. 01:32:02.800 |
You don't have to listen to experts, even conservative experts, if they have the wrong 01:32:08.980 |
You don't have to listen to journalists, even conservative journalists, if they have the 01:32:13.920 |
And among the MAGA wing, there's a fourth proposition. 01:32:19.240 |
You don't need to listen to anyone who isn't pro-Trump. 01:32:23.000 |
And we call it efficient because it eliminates a lot of people you probably should listen 01:32:28.760 |
You know, like we point out sometimes like how cancel culture can interfere with faith 01:32:33.540 |
So we get kind of being a little suspicious of experts, but at the same time, if you follow 01:32:38.840 |
that and you follow it mechanically, and I definitely, you know, I think everybody in 01:32:42.880 |
the US probably has some older uncle who exercises some of these. 01:32:49.080 |
It is a really efficient way to sort of wall yourself off from the rest of the world and 01:32:54.660 |
dismiss at least some people you really should be listening to. 01:32:58.280 |
The way you laid it out, it made me realize that we just take up so much of our brain 01:33:07.600 |
And you get like, you kind of exhaust yourself through this process of being outraged based 01:33:12.800 |
on these labels and you never get to actually, there's almost not enough time for empathy, 01:33:17.560 |
for like looking at a person and thinking, well, maybe they're right because you're so 01:33:25.560 |
And I mean, what's so interesting about this is that so much societal energy seems to be 01:33:31.280 |
spent on these nasty primal desires where essentially a lot of it is like, please tell 01:33:41.720 |
Where can I actually exercise some aggression against somebody? 01:33:45.760 |
And it seems to sometimes be just finding new justifications for that. 01:33:50.360 |
And it's an understandable, you know, human failing that sometimes can be used to defend 01:33:57.600 |
But again, it will never get you anywhere near the truth. 01:34:01.080 |
One interesting case that you cover about expertise is with COVID. 01:34:05.780 |
So how did cancel culture come into play on the topic of COVID? 01:34:09.520 |
Yeah, I think that COVID was a big blow to people's faith in expertise, and cancel culture 01:34:18.360 |
I think one of the best examples of this is Jennifer Sey at Levi's. 01:34:28.080 |
She was actually potentially going to be the president of Levi's jeans. 01:34:36.280 |
And when they started shutting down the schools, she started saying, this is going to be a 01:34:42.040 |
This is going to hurt the poor and disadvantaged kids the most. 01:34:48.080 |
We have to figure out a way to open the schools back up. 01:34:53.960 |
And the typical kind of cancel culture wave took over, and there were all sorts of petitions 01:34:58.840 |
for her to be fired and that she needed to apologize and all this kind of stuff. 01:35:03.480 |
And she was offered, I think, like a million dollar severance, which she wouldn't take 01:35:07.840 |
because she wanted to tell the world what she thought about this and that she wanted 01:35:11.660 |
to continue saying that she hadn't changed her mind, that this was a disaster for young 01:35:18.400 |
And now that's kind of the conventional wisdom and the research is quite clear that this 01:35:24.600 |
was devastating to particularly disadvantaged youths. 01:35:28.680 |
Like people understand this now as being, okay, she was probably right. 01:35:32.460 |
But one of the really sad aspects of cancel culture is people forget why you were canceled 01:35:41.120 |
There's this lingering kind of like, well, I don't have to take them seriously anymore. 01:35:44.680 |
By the way, did you notice they happen to be right on something very important? 01:35:48.520 |
Now one funny thing about freedom of speech, freedom of speech wouldn't exist if you didn't 01:35:52.800 |
also have the right to say things that were wrong. 01:35:56.160 |
Because if you can't engage in ideaphoria, if you can't actually speculate, you'll never 01:36:01.440 |
actually get to something that's right in the first place. 01:36:04.040 |
But it's especially galling when people who were right were censored and never actually 01:36:12.080 |
- Well, this might be a good place to ask a little bit more about the freedom of speech. 01:36:17.640 |
And you said that included in the freedom of speech is the right to say things that are wrong. 01:36:27.920 |
- Hate speech is the best marketing campaign for censorship. 01:36:34.400 |
And it came from academia in the 20th century. 01:36:38.680 |
And when I talked about the anti-free speech movement, that was one of their first projects. 01:36:44.280 |
There was a lot of talk about critical race theory and being against critical race theory. 01:36:49.440 |
And FIRE will sue if you say that people can't advocate for it or teach it or research it. 01:36:56.040 |
Because you do absolutely have the right to pursue it academically. 01:37:00.280 |
However, every time someone mentioned CRT, they should also say the very first project 01:37:06.280 |
of the people who founded CRT, Richard Delgado, Mari Matsuda, et cetera, was to create this 01:37:14.000 |
new category of unprotected speech called hate speech and to get it banned. 01:37:18.600 |
The person who enabled this drift, of course, was Herbert Marcuse in 1965, you know, basically 01:37:23.840 |
questioning whether or not free speech should be a sacred value on the left. 01:37:27.340 |
And he was on the losing side for a really long time. 01:37:29.680 |
The liberals, you know, the way I grew up, being pro-free speech was basically what being a liberal meant. 01:37:37.040 |
But that started to be chipped away at on campus. 01:37:40.480 |
And the way it was done was with the idea of hate speech: essentially, oh, but we can designate some speech as too hateful to allow. 01:37:54.960 |
Who's going to decide what hate speech actually is? 01:37:57.160 |
Well, that can usually only happen in an environment of really low viewpoint 01:38:02.360 |
diversity, because you have to actually agree on what the most hateful and wrong things are. 01:38:11.680 |
This is referred to in a great case about flag burning and the First Amendment that I love. 01:38:18.160 |
You can't ban speech just because it's offensive. 01:38:22.400 |
It's basically one of the reasons why these kinds of codes have been more happily 01:38:27.280 |
adopted in places like Europe, where they have a sense that there's like a modal German 01:38:33.480 |
who can say, "I think this is offensive and therefore I can say that this is wrong." 01:38:37.660 |
In a more multicultural and genuinely more diverse country, there's never actually been 01:38:43.840 |
an honest thought that there is a single kind of American; we 01:38:49.800 |
had the idea of Uncle Sam, but that was always kind of a joke. 01:39:00.040 |
And we get, in a society that diverse, that you can't ban things simply because they're offensive. 01:39:08.240 |
And that's one of the reasons why hate speech is not an unprotected category of speech. 01:39:14.460 |
My theory on freedom of speech is slightly different than most other constitutional lawyers. 01:39:19.800 |
And I think that's partially because some of the ways—some of these theories, although 01:39:23.480 |
a lot of them are really good, are inadequate. 01:39:28.480 |
And I sometimes call my theory the pure informational theory of freedom of speech. 01:39:33.300 |
Or sometimes when I want to be fancy, the lab and the looking glass theory. 01:39:37.320 |
And its most important tenet is that if the goal is the project of human knowledge, 01:39:43.220 |
which is to know the world as it is, you cannot know the world as it is without knowing what 01:39:51.100 |
And what people really think is an incredibly important fact to know. 01:39:55.560 |
So every time you're actually saying, "You can't say that," you're actually depriving 01:40:00.020 |
yourself of the knowledge of what people really think. 01:40:02.460 |
You're causing what Timur Kuran, who's on our board of advisors, calls preference 01:40:08.020 |
You end up with an inaccurate picture of the world, which by the way, in a lot of cases, 01:40:13.160 |
because there are activists who want to restrict more speech, they actually tend to think that 01:40:16.760 |
people are more prejudiced than they might be. 01:40:20.200 |
And actually, these kind of restrictions—there was a book called Racial Paranoia that came 01:40:24.980 |
out about 15 years ago that was making the point that the imposition of some of these 01:40:29.880 |
codes can sometimes make people think that the only thing holding others back from being openly racist is the code itself. 01:40:40.180 |
And one—which we talk about in the book—one very real practical way it makes things worse 01:40:45.340 |
is when you censor people, it doesn't change their opinion. 01:40:49.420 |
It just encourages them to not share it with people who will get them in trouble. 01:40:53.780 |
So it leads them to talk to people who they already agree with, and group polarization 01:40:59.320 |
So we have some interesting data in the book about how driving people off of Twitter, for 01:41:05.360 |
example, in 2017, and then again, I think, in 2020, driving people to Gab led to greater radicalization. 01:41:18.200 |
Censorship doesn't actually change people's minds, and it pushes them in directions that 01:41:21.300 |
actually by very solid research will actually make them more radicalized. 01:41:26.740 |
So yeah, I think that the attempt to ban hate speech, it doesn't really protect us from 01:41:32.700 |
it, but it gives the government such a vast weapon to use against us that we will regret 01:41:41.020 |
Is there a way to sort of to look at extreme cases to test this idea out a little bit? 01:41:47.100 |
So if we look on campus, what's your view about allowing, say, white supremacists on campus? 01:41:56.980 |
I think you should be able to study what people think, and I think it's important that we do. 01:42:05.100 |
So I think that, you know, let's take, for example, QAnon. 01:42:20.380 |
That's important to understand modern American politics. 01:42:24.340 |
And so if you put your scholar hat on, you should be curious about kind of everyone, 01:42:34.460 |
Daryl Davis, who I'm sure you're familiar with, part of his goal was just simply to understand them. 01:42:40.580 |
And in the process, he actually de-radicalized a number of Klan members when they actually 01:42:44.860 |
realized that this black man who had befriended them actually was compassionate, was a decent person. 01:42:51.100 |
They realized all their preconceptions were wrong. 01:42:52.540 |
So it can have a de-radicalizing factor, by the way. 01:42:55.500 |
But even when it doesn't, it's still really important to know what the bad people in your society think. 01:43:01.220 |
Honestly, in some ways, for your own safety, it's probably more important to know what 01:43:06.140 |
the bad people in your society actually think. 01:43:07.860 |
I personally, I don't know what you think about that, but I personally think that freedom 01:43:12.020 |
of speech in cases like that, like the KKK on campus, can do more harm in the short term, 01:43:21.980 |
Because you can sometimes argue for like, this is going to hurt in the short term. 01:43:27.580 |
But I mean, as Harvey said, consider the alternative. 01:43:31.620 |
Because you've just kind of made the case for like, this potentially would be a good thing. 01:43:36.980 |
And it often is, I think, especially in a stable society like ours, with a strong middle 01:43:42.500 |
class, all these kinds of things, where people have like the comforts to reason through things. 01:43:47.620 |
But to me, it's like, even if it hurts in the short term, even if it does create more 01:43:52.820 |
hate in the short term, the freedom of speech has this really beneficial thing, which is 01:43:58.860 |
it helps you move towards the truth, the entirety of society, towards a deeper, more accurate 01:44:04.300 |
understanding of life on earth, of society, of how people function, of ethics, of metaphysics, 01:44:16.260 |
It gets rid of the Nazis in the long term, even if it adds to the number of Nazis in 01:44:22.100 |
- Yeah, well, and meanwhile, and the reality check part of this is people always bring up the Klan. 01:44:31.020 |
I haven't seen a case where they've been invited. 01:44:35.260 |
Usually the Klan argument gets thrown out when people are trying to excuse deplatforming: 01:44:42.900 |
"And that's why you can't have Bill Maher on campus." 01:44:49.740 |
It's a little bit of that whataboutism again, about being like, well, that thing over there 01:44:53.220 |
is terrible, and therefore, this comedian shouldn't come. 01:44:57.020 |
- So I do have a question, maybe by way of advice. 01:45:00.500 |
You know, interviewing folks and seeing this, like a podcast is a platform, and deciding who 01:45:08.980 |
to talk to or not, that's something I have to come face to face with on occasion. 01:45:13.460 |
My natural inclination before I started the podcast was I would talk to anyone, including 01:45:18.780 |
people whom I'm still interested in, who are, you know, current members of the 01:45:25.500 |
And to me, there's a responsibility to do that with skill. 01:45:31.820 |
And that responsibility has been weighing heavier and heavier on me, because you realize 01:45:35.980 |
how much skill it actually takes, because you have to know to understand so much. 01:45:41.340 |
Because I've come to understand that the devil is always going to be charismatic. 01:45:47.300 |
The devil's not going to look like the devil. 01:45:49.340 |
And so you have to realize you can't always come to the table with just a deep compassion for the person. 01:45:57.420 |
You have to have, you know, like 90% compassion and another 90% deep historical knowledge 01:46:03.700 |
about the context of the battles around this particular issue. 01:46:08.980 |
I don't know if there's thoughts you have about this, how to handle speech in a way, 01:46:19.780 |
without censoring, bringing it to the surface, but in a way that creates more love in the world. 01:46:25.020 |
I remember when Steve Bannon was invited to the New Yorker Festival. 01:46:31.620 |
And Jim Carrey freaked out and all sorts of other people freaked out, and he got disinvited. 01:46:37.220 |
And I got invited to speak on Smerconish about this. 01:46:40.660 |
And I was saying, like, listen, you don't have people to your conference because you endorse them. 01:46:49.340 |
We have to get out of that idea, because they were trying to make 01:46:53.700 |
it sound like that's an endorsement of Steve Bannon. 01:46:56.660 |
Like if you actually look at the opinions of all the people who are there, you can't possibly 01:47:00.660 |
endorse all the opinions that all these other people who are going to be there actually 01:47:04.700 |
And so in the process of making that argument, I also made, of course, the very classic point: 01:47:10.180 |
it's very valuable to know what someone like Steve Bannon thinks. 01:47:14.300 |
And I remember someone arguing back saying, well, would you want someone to interview the head of ISIS? 01:47:19.180 |
And I'm like, because at the moment, like it was at the time when ISIS was really going strong. 01:47:26.660 |
And I was like, would you not want to go to a talk where someone was trying to figure out ISIS? 01:47:33.620 |
Because that changes your framing that essentially it's like, no, it's curiosity. 01:47:40.260 |
And we need a great deal more curiosity and a lot less unwarranted certainty. 01:47:44.720 |
And there's a question of like, how do you conduct such conversations? 01:47:53.920 |
I feel like documentary filmmakers usually do a much better job. 01:47:57.980 |
And the best job is usually done by biographers. 01:48:00.740 |
So the more time you give to a particular conversation, like really deep thought and 01:48:06.500 |
historical context and studying the people, how they think, looking at all different perspectives, 01:48:11.620 |
looking at the psychology of the person, upbringing, their parents, their grandparents, all of 01:48:16.180 |
this, the more time you spend with that, the better the quality of the conversation 01:48:23.460 |
is, because you get to really empathize with the person, with their worldview. 01:48:31.860 |
And you get to see the common humanity, all of this. 01:48:40.340 |
So like the best stuff I've seen is interviews that are part of a documentary. 01:48:44.620 |
But even now documentaries are like, there's a huge incentive to do it as quickly as possible. 01:48:49.180 |
There's not an incentive to really spend time with the person. 01:48:52.000 |
- There's a great new documentary about Floyd Abrams that I really recommend. 01:48:56.300 |
We did a documentary about Ira Glasser called Mighty Ira, which was my video team and my 01:49:02.300 |
protege Nico Perrino and Chris Maltby and Aaron Reese put it together. 01:49:07.460 |
And it just follows the life and times of Ira Glasser, the former head of the ACLU. 01:49:12.900 |
- If you could just linger on that, that's a fascinating story. 01:49:20.860 |
He started working at the NYCLU, the New York Civil Liberties Union, back in I think the 01:49:27.020 |
He was, I think Robert Kennedy recommended that he go in that direction. 01:49:32.420 |
And he became the president of the ACLU right at the time that they were suffering from the fallout of Skokie. 01:49:41.140 |
And Nico and Aaron and Chris put together this, and they'd never done a documentary before. 01:49:50.740 |
And it tells the story of the Nazis in Skokie. 01:49:53.620 |
It tells the story of the case around it, tells the story of the ACLU at the time and 01:49:59.900 |
And one of the things that's so great is like when you get to see the Nazis at Skokie, they 01:50:05.740 |
come off like the idiots that you would expect them to. 01:50:09.260 |
There's a moment when the rally is not going very well and the leader gets flustered. 01:50:15.020 |
And it almost seems like he's gonna shout out, kind of like, "You're making this Nazi rally really hard!" 01:50:21.080 |
So it showed how actually allowing the Nazis to speak at Skokie kind of took the wind out of their sails. 01:50:27.260 |
Like the whole movement, everybody just kind of, it all kind of dissolved 01:50:32.340 |
after that, because they looked like the racist fools they were. 01:50:36.180 |
Even The Blues Brothers made jokes about them. 01:50:40.780 |
And it didn't turn into the disaster that people thought it was going to be, just by letting them speak. 01:50:45.900 |
But Ira Glasser, okay, so he has this wonderful story about how Jackie Robinson joined the 01:50:52.960 |
Brooklyn Dodgers, and how there was a moment of seeing someone, an African-American, 01:50:58.940 |
literally on their team, and how that really got him excited about the cause 01:51:04.140 |
of racial equality and that became a big part of what his life was. 01:51:08.860 |
And I just think of that as such a great metaphor: expanding your circle and seeing more people 01:51:15.540 |
as being quite literally on your team is the solution to so many of these problems. 01:51:20.560 |
And I worry that one of the things that is absolutely just a fact of life in America 01:51:24.900 |
is like we do see each other more as enemy camps as opposed to people on the same team. 01:51:31.140 |
And that was actually something in the early days, like me and Will Creeley, the legal 01:51:34.460 |
director of FIRE, wrote about the forthcoming free speech challenges of everyone being on the internet. 01:51:40.780 |
And one thing that I was hoping was that as more people were exposing more of their lives, 01:51:47.740 |
we'd realize a lot of these things we knew intellectually, like kids go to the bar and 01:51:51.420 |
get drunk and do stupid things, that when we started seeing the evidence of them doing 01:51:58.840 |
stupid things, we might be shocked at first, but then eventually get more sophisticated 01:52:03.060 |
and be like, "Well, come on, people are like that." 01:52:08.660 |
But I think that there are plenty of things we know about human nature and we know about 01:52:13.300 |
dumb things people say, and we've made it into an environment where there's just someone 01:52:20.280 |
out there waiting to be kind of like, "Oh, remember that dumb thing you said when we were kids? 01:52:25.540 |
Well, I'm gonna make sure that you don't get into your dream school because of that." 01:52:34.220 |
- Yeah, digging through someone's past comments to find speech that hasn't aged well. 01:52:39.900 |
Like that one isn't just someone not being empathetic. 01:52:41.740 |
They're like, "I'm gonna punish you for this." 01:52:44.660 |
And that's one of the reasons why I got depressed writing this book, because there's already 01:52:50.340 |
people who don't love me because of The Coddling of the American Mind, usually based on a misunderstanding 01:52:54.420 |
of what we actually said in the book, but nonetheless. 01:52:57.940 |
But on this one, I'm calling out people for being very cruel in a lot of cases. 01:53:05.820 |
But one thing that was really scary about studying a lot of these cases is that once 01:53:10.340 |
you have that target on your back, what they're gonna try to cancel you for could be anything. 01:53:15.460 |
They might go back into your old posts, find something that you said in 1995, do something 01:53:23.660 |
where essentially it looks like it's this entire other thing, but really what's going 01:53:29.020 |
on is they didn't like your opinion, they didn't like your point of view on something, 01:53:32.700 |
and they're gonna find a way that from now on, anytime your name comes up, it's like, 01:53:36.500 |
"Oh, remember this thing I didn't like about him?" 01:53:39.260 |
And it's, again, it's cruel, doesn't get you anywhere closer to the truth, but it is a powerful weapon. 01:53:46.500 |
- Okay, in terms of solutions, I'm gonna ask you a few things. 01:53:56.420 |
- So I'm sure you've figured it all out then. 01:54:02.060 |
From a free speech culture perspective, how to be a good parent. 01:54:06.260 |
- I think the first quality you should be cultivating in your children if you want to 01:54:11.420 |
have a free speech culture is curiosity, and an awareness of the vastness that will always exceed what you know. 01:54:21.540 |
And getting my kids excited about the idea that's like, "We're gonna spend our whole lives learning." 01:54:27.540 |
And it's vast and exciting and endless, and we'll never make a big dent in it. 01:54:36.060 |
But only fools think they know everything, and sometimes dangerous fools at that. 01:54:42.540 |
So giving the sense of intellectual humility early on, also saying things that actually 01:54:49.140 |
do sound kind of old-fashioned, but I say things to my kids like, "Listen, if you enjoy 01:54:56.700 |
study and work, both things that I very much enjoy and do for fun, your life is going to be a lot better." 01:55:05.380 |
So some of those old-fashioned virtues are things I try to preach. 01:55:10.500 |
Counterintuitive stuff like outdoor time, playing, having time that's not intermediated by screens. 01:55:19.340 |
And little things, like I talk about in the book, about when my kids are watching something scary. 01:55:26.140 |
And I'm not talking about like zombie movies, I'm talking about like a cartoon that has a scary moment. 01:55:32.540 |
And they say that they want to turn the TV off. 01:55:34.820 |
And I talk to them and I say, "Listen, I'm gonna sit next to you and we're gonna finish it. 01:55:41.340 |
And I want you to tell me what you think of this afterwards." 01:55:49.020 |
And by the end of it, every single time, when I asked them, "Was that as scary as you thought it would be?" 01:55:54.300 |
And they were like, "No, Daddy, that was fine." 01:55:56.300 |
And I'm like, "That's one of the great lessons in life. 01:55:59.140 |
The fear that you don't go through becomes much bigger in your head than actually simply facing it. 01:56:05.460 |
That's one of the reasons why I'm fighting back against this culture. 01:56:07.420 |
I'd love for all of our kids to be able to grow up in an environment where people give 01:56:12.420 |
you grace and accept the fact that sometimes people are gonna say things that piss you 01:56:17.220 |
off, take seriously the possibility of being wrong and be curious. 01:56:21.700 |
- Well, I have hope for the thing you mentioned: because so much of young people's 01:56:26.820 |
stuff is on the internet, they're going to give each other a break. 01:56:33.540 |
- Generation Z hates cancel culture the most. 01:56:35.940 |
And that's another reason why it's like people still claiming this isn't even happening. 01:56:39.340 |
It's kind of like, "No, you actually can ask kids what they think of cancel culture." 01:56:45.420 |
Well, I kind of think of them as like the immune system that's like, it's the culture 01:56:49.060 |
waking up to like, "No, this is not a good thing." 01:56:52.540 |
I mean, I am one of those kids who is really glad that I was a little kid in the '80s and 01:56:58.420 |
a teenager in the '90s, because having everything potentially online, it's not an upbringing I would have wanted. 01:57:06.740 |
- Well, you can also take the absolutist free speech view. 01:57:11.860 |
I like leaning into it, where I hope for a future where a lot of our insecurities and flaws are out in the open. 01:57:23.460 |
And being raw honest with it, I think, leads to a better world, because the flaws, 01:57:30.420 |
I mean, the flaws are the basic ingredients of human connection. 01:57:39.020 |
And I talked about trying to use social media from a Buddhist perspective and like as if 01:57:45.020 |
it's the collective unconscious meditating and seeing those little like angry bits that 01:57:51.820 |
are trying to cancel you or get you to shut up and just kind of like letting them go the 01:57:56.700 |
same way you're supposed to watch your thoughts kind of trail off. 01:58:01.980 |
Whatever the drama going on, just seeing the sea of it, of the collective consciousness 01:58:08.500 |
just processing this and having a little, like, panic attack, and just kind of breathing through it. 01:58:15.620 |
- Look at the little sort of hateful, angry voices kind of pop up and be like, "Okay, 01:58:20.140 |
I see you," and stay focused on that thing. Because, okay, yeah, 01:58:26.900 |
actually this is probably late in the game to be giving my grand theory on this stuff. 01:58:34.940 |
- So in law school, when I ran out of First Amendment classes, I decided 01:58:40.820 |
to study censorship during the Tudor dynasty because that's where we get our ideas of prior 01:58:45.340 |
restraint, which come from the licensing of the printing press, something that Henry VIII instituted. 01:58:51.940 |
Where basically the idea was that if you can't print anything in England unless it's with 01:58:57.180 |
these Your-Majesty-approved printers, it will prevent heretical work and anti-Henry VIII material. 01:59:06.820 |
A pretty efficient idea, if nothing else. 01:59:13.740 |
So he started getting angry at the printing press around 1521, and then, 01:59:18.460 |
along with Parliament, passed something in 1538 that required printing to be licensed. 01:59:25.620 |
And I always think of that as kind of like where we are now, because back 01:59:30.900 |
then we had the original disruptive technology. 01:59:32.820 |
You know, writing was probably really bad, but the next one, which was the printing press, was calamitous. 01:59:38.660 |
And I mean, and I say calamitous on purpose because in the short term, the witch hunts 01:59:44.100 |
went up like crazy because the printing press allowed you to get that manual on how to find witches. 01:59:53.060 |
It led to all sorts of distress, misinformation, nastiness. 01:59:58.580 |
And Henry VIII was trying to put the genie back in the bottle. 02:00:00.940 |
You know, he was kind of like, "I want to use this for good." 02:00:08.020 |
But he was in an unavoidable period of epistemic anarchy. 02:00:13.340 |
There's nothing you can do to make the period after the printing press came out to be a 02:00:18.900 |
non-disruptive, non-crazy period other than like absolute totalitarianism and destroy 02:00:24.420 |
all the presses, which simply was not possible in Europe. 02:00:28.940 |
So I feel like that's kind of like where we are now. 02:00:32.180 |
That disruption came from adding, I think, you know, several million people to the European 02:00:36.740 |
conversation and then eventually the global conversation. 02:00:39.260 |
But eventually, it became the best tool for disconfirmation, for getting rid of falsity. 02:00:48.180 |
And the long-term benefits of the printing press are incalculably great. 02:00:56.260 |
And that's what gives me some optimism for where we are now with social media because 02:01:00.100 |
we are in that unavoidably anarchical period. 02:01:02.700 |
And I do worry that there are attempts in states to pass things to try to put the genie back in the bottle. 02:01:09.500 |
Like if we ban TikTok or we say that nobody under 18 can be on the internet unless they 02:01:16.660 |
have parental permission, we're going at something that no amount of sort of top-down control can stop. 02:01:25.860 |
We have to culturally adapt to the fact of it in ways that actually make us wiser 02:01:34.300 |
and allow it potentially to be that wonderful engine for disconfirmation that we're nowhere near yet. 02:01:41.580 |
But think about it, additional millions of eyes on problems, thanks to the printing press, 02:01:47.900 |
helped create the scientific revolution, the enlightenment, the discovery of ignorance. 02:01:54.220 |
We now have added billions of eyes and voices to solving problems, and we're using them poorly so far. 02:02:01.660 |
But these are just the early days, like the early days of the printing press. 02:02:08.980 |
Is there something about X, about Twitter, which is perhaps the most energetic source of public conversation? 02:02:17.580 |
It seems like the collective unconscious of the species. 02:02:19.700 |
I mean, like it's one of these things where the tendency to want to see patterns in history 02:02:26.380 |
sometimes can limit the actual batshit crazy experience of what history actually is. 02:02:34.860 |
But yes, we have these nice comforting ideas that it's going to be like last time. 02:02:42.580 |
And I think about how unusual Twitter is, because people 02:02:49.700 |
talk about writing and mass communications as expanding the size of our collective brain. 02:02:59.500 |
But now we're kind of looking at our collective brain in real time, and it's filled just like 02:03:03.740 |
our own brains with all sorts of little crazy things that pop up and appear like virtual 02:03:09.220 |
particles kind of all over the place of people reacting in real time to things. 02:03:15.700 |
There's never been anything even vaguely like it. 02:03:22.220 |
At its best, sometimes seeing people just getting euphoric over something going on and 02:03:27.260 |
cracking absolutely brilliant immediate jokes at the same time, it can even be beautiful. 02:03:35.260 |
I feel like-- and I live in a neighborhood now on X where I mostly deal with people that 02:03:43.740 |
I think are actually thoughtful, even if I disagree with them. 02:03:49.340 |
I occasionally run into those other sort of what I call neighborhoods on X where it's 02:03:53.700 |
just all canceling, all nastiness, and it's always kind of an unpleasant visit to those 02:03:58.660 |
I'm not saying the whole thing needs to be like my experience, but I do think that the 02:04:05.040 |
reason why people keep on coming back to it is it reveals raw aspects of humanity that 02:04:12.960 |
Yeah, but also it's totally new, like you said. 02:04:16.480 |
Just the virality, the speed that news travels, that opinions travel, the battle over 02:04:24.000 |
what is true and not; lies travel pretty fast, the old Mark Twain thing. 02:04:30.940 |
And it changes your understanding of how to interpret information. 02:04:38.740 |
The stats are pretty bad on mental health with young people, and I'm definitely in the 02:04:43.980 |
camp of people who think that social media is part of that. 02:04:46.540 |
I understand the debate, but I'm pretty persuaded that one of the things that hasn't been great 02:04:51.620 |
for mental health of people is just constantly being exposed. 02:04:57.460 |
I think it's possible to create social media that makes a huge amount of money and makes people happier. 02:05:03.820 |
To me, it's possible to align the incentives. 02:05:08.100 |
So in terms of making teenagers, making every stage of life, giving you long-term fulfillment 02:05:15.140 |
and happiness with your physical existence outside of the social media and on social 02:05:19.100 |
media, helping you grow as a human being, helping challenge you just the right amount 02:05:23.980 |
and just the right amount of cat videos, whatever, gives this full, rich human experience. 02:05:29.900 |
I think it's just a machine learning problem. 02:05:36.260 |
So the easiest feed you could do is one that maximizes engagement. 02:05:42.620 |
The harder feed is for the algorithm to learn enough about you to understand what will make you 02:05:49.540 |
truly happy as a human being, to grow long-term. 02:05:52.940 |
That's just a very difficult problem to solve. 02:06:02.180 |
One of the reasons why people love it so much is it sets you up that you're watching a raunchy 02:06:06.220 |
British Sex and the City, except the main character is the most promiscuous one. 02:06:12.700 |
It's like, okay, and you kind of roll your eyes a little bit. 02:06:15.140 |
It's kind of funny and it's kind of cute and kind of spicy. 02:06:19.460 |
And then you realize that the person is actually kind of suffering and having a hard time. 02:06:25.140 |
And it gets deeper and deeper as the show goes on. 02:06:29.040 |
And she will do these incredible speeches about, "Tell me what to do." 02:06:33.300 |
Like I just I know there's experts out there. 02:06:36.900 |
I know there's an optimal way to live my life. 02:06:39.780 |
So why can't someone just tell me what to do? 02:06:42.680 |
And it's this wonderfully like accurate, I think, aspect of human desire that what if 02:06:51.340 |
something could actually tell me the optimal way to go? 02:06:55.340 |
Because I think there is a desire to give up some amount of your own freedom and discretion 02:07:00.460 |
in order to be told to do the optimally right thing. 02:07:07.100 |
- Yeah, but see the way you phrase it, that scares me too. 02:07:12.060 |
One, you can be constantly distracted in a TikTok way by things that keep you engaged. 02:07:17.820 |
So removing that, and giving you a bunch of options constantly, and learning over the long 02:07:25.160 |
term what results in your actual long-term happiness. 02:07:29.860 |
But like which amounts of challenging ideas are good for you? 02:07:39.480 |
- But there is a number like that for you, Greg. 02:07:47.920 |
I love the feeling of like realizing, holy shit, I've been wrong. 02:07:52.720 |
But like, I would love for the algorithm to know that about me and to help me, but always 02:07:58.040 |
giving me options if I want to descend into cat videos and so on. 02:08:04.440 |
- Like the idea of kind of like both going the speed that you need to and running as fast as you can. 02:08:11.960 |
I just feel like YouTube's recommendations, for better or worse, if used correctly, work. 02:08:20.140 |
Whenever I just refuse to click on stuff that's just dopamine-based and click on only educational 02:08:26.240 |
things, the recommendations it provides are really damn good. 02:08:29.400 |
So I feel like it's a solvable problem, at least in the space of education of challenging 02:08:35.080 |
yourself, but also expanding your realm of knowledge and all this kind of stuff. 02:08:39.080 |
- And I'm definitely more in the camp of: we're in an inescapably anarchical period that will require 02:08:44.100 |
big cultural adjustments and there's no way that this isn't gonna be a difficult transition. 02:08:48.980 |
- Is there any specific little or big things that you would like to see X do, Twitter do? 02:08:57.140 |
With the printing press, extra millions of eyes on any problem can tear down any institution, 02:09:03.820 |
And that's good in some ways, 'cause a lot of medieval institutions needed to be torn 02:09:07.120 |
down and some people did too, and a lot of ideas needed to be torn down. 02:09:11.200 |
Same thing is true now: billions of extra eyes on every problem can tear down any institution. 02:09:17.760 |
And again, some of those things needed to be torn down, but it can't build yet. 02:09:22.640 |
We are not at the stage that it can build yet, but it has shown us how thin our knowledge is. 02:09:27.640 |
It's one of the reasons why we're also aware of the replication crisis. 02:09:29.600 |
It's one of the reasons why we're also aware of how kind of shoddy our research is, how 02:09:33.520 |
much our expert class is arrogant in many cases. 02:09:37.600 |
But people don't wanna live in a world where they don't have people that they respect and 02:09:44.000 |
And I think what's happening, possibly now, but will continue to happen is people are 02:09:51.000 |
gonna establish themselves as being high integrity, that they will always be honest. 02:09:54.240 |
I think you are establishing yourself as someone who is high integrity, where they can trust you. 02:10:00.320 |
FIRE wants to be the institution that people can come to. 02:10:03.240 |
It's like, if it's free speech, we will defend it, period. 02:10:06.640 |
And I think that people need to have authorities that they can actually trust. 02:10:12.600 |
And I think that if you actually had a stream that maybe people can watch in action, but 02:10:17.080 |
not flood with stupid cancel culture stuff or dumb cat memes, where it is actually a 02:10:22.760 |
serious discussion bounded around rules, no perfect rhetorical fortress, no efficient 02:10:27.240 |
rhetorical fortress, none of the BS ways we debate, I think you could start to actually 02:10:32.540 |
create something that could actually be a major improvement in the speed with which 02:10:37.700 |
we come up with new, better ideas and establish and separate truth from falsity. 02:10:41.480 |
- Yeah, if it's done well, it can inspire a large number of people to become higher integrity. 02:10:47.640 |
And it can create integrity as a value to strive for. 02:10:52.200 |
- I mean, there have been projects throughout the internet that have done an incredible job of this. 02:10:59.240 |
Like Wikipedia is an example of a big leap forward in doing that. 02:11:07.820 |
- So there's a few really powerful ideas for the people who edit Wikipedia. 02:11:13.940 |
One of which is each editor kind of for themselves declares, "I'm into politics and I really 02:11:22.700 |
kind of am a left leaning guy, so I really shouldn't be editing political articles because 02:11:30.620 |
- So they declare their biases and they often do a good job of actually declaring the biases, 02:11:34.460 |
but they'll still like, they'll find a way to justify themselves. 02:11:41.660 |
- And they want to correct it because they love correcting untruth into truth. 02:11:46.460 |
But the perspective of what is true or not is affected by their bias. 02:11:51.660 |
- And it is true that there is a left-leaning bias among the editors of Wikipedia. 02:11:57.240 |
So for that, what happens is on articles, which I mostly appreciate, that don't have 02:12:03.480 |
a political aspect to them, scientific articles or technical articles, they can be really good. 02:12:13.640 |
Even history, just describing the facts of history that don't have a subjective element, 02:12:18.960 |
Also, just using my own brain, I can kind of filter out if it's something about January 6th, 02:12:27.000 |
I know I'm going to be like, I'm not, whatever's going on here, I'm gonna kind of read it, 02:12:34.280 |
I'm gonna look to a bunch of different perspectives on it. 02:12:38.240 |
There's probably going to be some kind of bias. 02:12:40.540 |
Maybe some wording will be such, which is where Wikipedia does its thing, the way they 02:12:48.020 |
word stuff will be biased, the choice of words. 02:12:52.400 |
But the Wikipedia editors themselves are so self-reflective. 02:12:56.440 |
They literally have articles describing these very effects of how you can use words to inject bias. 02:13:07.520 |
- It's incredibly healthy, but I think you could do better. 02:13:10.920 |
One of the big flaws of Wikipedia to me that Community Notes on X does better is the accessibility of editing. 02:13:20.860 |
It's difficult to become an editor and it's not as visible, the process of editing. 02:13:26.080 |
So I would love, like you said, a stream for everyone to be able to observe this debate 02:13:31.680 |
between people with integrity of when they discuss things like January 6th, the very 02:13:36.720 |
controversial topics, to see how the process of the debate goes, as opposed to being hidden 02:13:42.800 |
in the shadows, which it currently is in Wikipedia. 02:13:48.200 |
And I've also seen how they will use certain articles on certain people. 02:13:54.400 |
Those about people I've learned to trust less and less. 02:13:57.600 |
Because they'll literally use those to make personal attacks. 02:14:03.000 |
They'll use descriptions of different controversies to paint a picture of a person that doesn't 02:14:10.080 |
to me at least feel like an accurate representation of the person. 02:14:14.200 |
It's like writing an article about Einstein, mentioning something about the theory of relativity 02:14:19.960 |
and then saying that he was a womanizer and an abuser and dwelling on controversy. 02:14:26.720 |
Feynman also, not exactly the perfect human in terms of women. 02:14:36.680 |
And to capture that human properly, there's a certain way to do it. 02:14:40.680 |
And I think Wikipedia will often lean, they really try to be self-reflective and try to 02:14:47.160 |
stop this, but they will lean into the drama if it matches the bias. 02:14:52.560 |
But again, the world, I believe, is much better because Wikipedia exists. 02:15:00.200 |
But now that we're in these adolescent stages, we're growing and trying to come up with different 02:15:05.200 |
technologies, the idea of a stream is really, really interesting. 02:15:08.760 |
As you get more and more people into this discourse where the value is, let's try to get at the truth. 02:15:17.120 |
And that basically, you get little cards for nope, wrong, nope, wrong. 02:15:21.920 |
And the different rhetorical techniques that are being used to avoid actually discussing the issue. 02:15:26.480 |
Yeah, and I think actually you can make it a little bit fun: you get a limited number of cards. 02:15:30.320 |
It's kind of like, you get three whataboutism cards. 02:15:37.400 |
Let me ask you about, so you mentioned going through some difficult moments in your life. 02:15:45.120 |
What has been your experience with depression? 02:15:48.520 |
What has been your experience getting out of it, overcoming it? 02:15:51.600 |
Yeah, I mean, the whole thing, the whole journey with Coddling the American Mind began with 02:15:58.040 |
me in the Belmont psychiatric facility in Philadelphia back in 2007. 02:16:05.200 |
I had called 911 in a moment of clarity because I'd gone to the hardware store to make sure 02:16:16.080 |
I wanted to make sure that I had my head wrapped and everything. 02:16:19.440 |
So if all the drugs I was planning to take didn't work, that I wouldn't be able to claw 02:16:28.080 |
And I always had issues with depression, but they were getting worse. 02:16:32.520 |
And frankly, one of the reasons why this cancel culture stuff is so important to me is that 02:16:38.920 |
the thing that I didn't emphasize as much in Coddling the American Mind, which by the 02:16:41.920 |
way, that description that I give of trying to kill myself was the first time I'd ever written about it. 02:16:47.520 |
Nobody in my family was aware of it being like that. 02:16:53.020 |
And basically, the only way I was able to write that was by doing, you know how you 02:16:59.360 |
And I was like, I'm going to convince myself that this is just between me and my computer 02:17:03.800 |
And it's probably now the most public thing I've ever written. 02:17:07.640 |
But what I didn't emphasize in that was how much the culture war played into how depressed 02:17:13.520 |
Because I was originally the legal director of FIRE, then I became president of FIRE in 02:17:17.520 |
2005, moved to Philadelphia, where I got depressed. 02:17:23.880 |
There's something about the town, they don't seem to like me very much. 02:17:27.520 |
But the main thing was being in the culture war all the time. 02:17:33.440 |
I remember, you know, she didn't seem to really approve of what I did. 02:17:38.200 |
And meanwhile, like I was defending people on the left all the time. 02:17:42.360 |
And they'd be like, oh, that's good that you're defending someone on the left. 02:17:44.480 |
But they still would never forgive me for defending someone on the right. 02:17:47.360 |
And I remember saying at one point, I'm like, listen, I'm like, I'm a true believer in this 02:17:56.880 |
And she actually said, I think Republicans might be worse. 02:18:01.400 |
And that didn't, that relationship didn't go very well. 02:18:04.160 |
And then I nearly got in fistfights a couple times with people on the right, because they 02:18:08.560 |
found out I defended people who cracked jokes about 9/11. 02:18:22.760 |
You can see how friends can turn on you if they don't like your politics. 02:18:27.160 |
So I got an early preview of this, of what the culture we were heading into by being 02:18:33.320 |
the president of FIRE, and it was exhausting. 02:18:37.880 |
And that was one of the main things that led me to be, you know, suicidally depressed. 02:18:43.280 |
At the Belmont Center, if you told me that that would be the beginning of a new and better 02:18:47.640 |
life for me, I would have laughed if I could have. 02:18:50.640 |
But, you know, you can tell I'm okay if I'm still laughing. 02:18:57.180 |
So I got a doctor and I started doing cognitive behavioral therapy. 02:19:02.240 |
I started noticing all these voices in my head that were catastrophizing and, you know, 02:19:09.280 |
engaging in overgeneralization and fortune telling, you know, mind reading, all of these things 02:19:17.960 |
And what you do in CBT is essentially, you have something that makes you upset, and then you write down your response to it. 02:19:28.040 |
And, you know, something minor could happen and your response was, you know, like, well, 02:19:35.400 |
And that's because I'm broken and will die alone. 02:19:39.640 |
And you name which of the following it is, you know: that's catastrophizing, that's mind reading, that's fortune telling. 02:19:46.380 |
And you have to do this several times a day, forever. 02:19:49.760 |
I actually need to brush up on it at the moment. 02:19:53.080 |
And slowly over time, the voices in my head that had been saying horrible, you know, 02:19:58.960 |
horrible internal talk just didn't sound as convincing anymore, which was a really big deal. 02:20:05.400 |
Like it was just kind of like, oh, wait, I don't buy that I'm broken. 02:20:11.400 |
That doesn't sound like truth from God, like it used to. 02:20:15.680 |
And nine months after I was planning to kill myself, I was probably happier than I'd been 02:20:24.080 |
And that was one of the things that, you know, the CBT is what led me to notice this in my 02:20:30.080 |
own work that it felt like administrators were kind of selling cognitive distortions, 02:20:36.220 |
And then when I started noticing that students seemed to come in actually already believing 02:20:39.280 |
a lot of this stuff, I realized that would be very dangerous. 02:20:40.960 |
And that led to Coddling the American Mind and all that stuff. 02:20:44.760 |
But the thing that was rough about writing, "Canceling the American Mind," I've mentioned this before: 02:20:50.520 |
I got really depressed this past year because of what I was studying. 02:20:54.440 |
You know, there's a friend in there that I talk about who killed himself after being 02:20:59.640 |
I talked to him a week before he killed himself and I hadn't actually checked in with him 02:21:04.520 |
because he seemed so confident I thought he would be totally fine because he had an insensitive 02:21:09.600 |
tweet in June of 2020 and, you know, got forced out in a way that didn't actually sound as bad. 02:21:17.000 |
He actually at least got a severance package, but they knew he'd sue and win because he 02:21:22.600 |
And so I waited to check in on him because we were so overwhelmed with the requests for 02:21:26.040 |
help, and he was saying people were coming to his house still, and then he shot himself. 02:21:31.560 |
And I definitely, and because everyone knows I'm so public about my struggles with this 02:21:36.000 |
stuff, everybody who fights this stuff comes to me when they're having a hard time. 02:21:41.560 |
And this is a very hard psychologically taxing business to be in. 02:21:45.840 |
And even admitting this right now, like I think about like all the vultures out there, 02:21:52.200 |
Just like the same way when my friend Mike Adams killed himself, there were people like 02:21:55.160 |
celebrating on Twitter that a man was dead because they didn't like his tweets. 02:22:03.280 |
But somehow they considered themselves compassionate toward some abstract other person. 02:22:07.440 |
So I was getting a little depressed and anxious and the thing that really helped me more than 02:22:11.920 |
anything else was confessing to my staff that I was struggling. You know, books take a lot of energy. 02:22:20.720 |
So I knew they didn't want to hear that not only was this taking a lot of the boss's time, 02:22:26.840 |
But when I finally told the leadership of my staff, you know, people that even though 02:22:32.040 |
I try to maintain a lot of distance from, I love very, very much, it made such a difference, 02:22:37.960 |
you know, because I could be open about that. 02:22:39.960 |
And the other thing was, have you heard of this conference, Dialog? 02:22:49.360 |
It intentionally tries to get people from across the political spectrum to come together and 02:22:54.200 |
have off the record conversations about big issues. 02:22:57.560 |
And it was nice to be in a room where liberal, conservative, none of the above were all like, 02:23:03.160 |
oh, thank God someone's taking on cancel culture. 02:23:05.760 |
And where it felt like, it felt like maybe this won't be the disaster for me and my family 02:23:12.360 |
that I was starting to be afraid it would be, that taking this stuff on might actually work out. 02:23:17.760 |
Well, one thing that just stands out from that is that the pain of cancellation can be really intense. 02:23:30.720 |
And that doesn't necessarily mean losing your job. You can call it bullying, 02:23:34.880 |
you can call it whatever name, but just some number of people on the internet, 02:23:40.120 |
and that number can be small, kind of saying bad things to you. 02:23:46.640 |
That can be a pretty powerful force to the human psyche, which was very surprising. 02:23:51.640 |
And then the flip side also of that, it really makes me sad how cruel people can be. 02:24:01.120 |
Thinking that your cause is social justice in many cases can lead people to think, I can be as cruel as I want. 02:24:09.800 |
When a lot of times it's just a way to sort of vent some aggression on a person that you 02:24:18.360 |
So I think it's important for people to realize that, whatever negative energy, whatever 02:24:29.200 |
negativity you want to put out there, there's real people that can get hurt. 02:24:34.920 |
You can really get people to one, be the worst version of themselves, or two, possibly take their own life. 02:24:46.360 |
Well, that's one of the things that we do in the book to really kind of address people 02:24:50.560 |
who still try to claim this isn't real, is we just quote people who have lived through it. 02:25:02.400 |
And Taylor Swift's quote is essentially about how behind all of this, when it gets particularly 02:25:07.480 |
nasty, there's this very clear kill yourself kind of undercurrent to it. 02:25:16.400 |
And the problem is that in an environment so wide open, there's always going to be someone 02:25:22.560 |
who wants to be so transgressive and say the most hurtful, terrible thing. 02:25:27.960 |
But then you have to remember the misrepresentation, getting back to the old idioms, sticks and 02:25:32.720 |
stones will break my bones, but names will never hurt me, has been reimagined in campus culture. 02:25:43.440 |
People will literally say stuff like, but now we know words can hurt. 02:25:49.720 |
Guys, you didn't have to come up with a special little thing that you teach children to make 02:25:55.520 |
words hurt less if they never hurt in the first place. 02:26:00.040 |
It's a saying that you repeat to yourself to give yourself strength when the bullies come after you. 02:26:12.120 |
It really does help to be like, listen, okay, assholes are going to say asshole things, 02:26:17.800 |
and I can't let them have that kind of power over me. 02:26:22.400 |
But also as a learning experience, because it does hurt. 02:26:26.320 |
But for the good people out there who actually just sometimes think that they're venting, 02:26:32.240 |
Remember that there are people on the other side of it. 02:26:35.440 |
For me, it hurts my kind of faith in humanity. 02:26:40.720 |
When I just see people being cruel to each other, it casts a cloud over my perspective on the world. 02:26:54.040 |
That was always my sort of flippant answer to whether mankind is basically good or basically 02:26:59.720 |
evil, the biggest debate in philosophy: well, the problem with the 02:27:05.760 |
first is there's nothing basic about humanity. 02:27:10.400 |
What gives you hope about this whole thing, about this dark state that we're in, as you've described? 02:27:24.600 |
I think people are sick of not being able to be authentic. 02:27:31.800 |
It's basically telling you don't be yourself. 02:27:42.320 |
I think that people have kind of had enough of it. 02:27:46.640 |
One thing I definitely want to say to your audience is it can't just be up to us arguers to fix this. 02:27:59.640 |
I think that, and this may sound like it's an unrelated problem, I think if there were 02:28:06.160 |
highly respected, let's say extremely difficult ways to prove that you're extremely smart 02:28:12.120 |
and hardworking that cost little or nothing that actually can give the Harvards and the 02:28:18.560 |
Yales of the world a run for their money, I think that might be the most positive thing 02:28:22.520 |
we could do to deal with a lot of these problems. And here's why. 02:28:26.360 |
I think America, with its great anti-elitist tradition, 02:28:33.000 |
has become weirdly elitist, in the respect that not only is our leadership 02:28:39.240 |
coming from these few fancy schools, we actually have great admiration for them. 02:28:46.200 |
But I think we'd have a lot healthier of a society if people could prove their excellence 02:28:51.360 |
in ways that are coming from completely different streams and that are highly respected. 02:28:56.040 |
I sometimes talk about how there should be a test such that anyone who passes it gets like a BA. 02:29:09.200 |
I'm talking about something that, you know, only one out of a couple hundred people could pass. 02:29:14.880 |
Some other way of not going through these massive, bloated, expensive institutions 02:29:20.640 |
so that people can raise their hands and say, "I'm smart and hardworking." 02:29:23.600 |
I think that could be an incredibly healthy way. 02:29:26.480 |
I think we need additional streams for creative people to be solving problems whether that's 02:29:32.240 |
I think that there's lots of things that technology could do to really help with this. 02:29:35.960 |
I think some of the stuff that Sal Khan is working on at Khan Academy could really help. 02:29:40.960 |
So I think there's a lot of ways but they exist largely around coming up with new ways 02:29:45.140 |
of doing things, not just expecting the old institutions that have, say, $40 billion in the bank, to fix themselves. 02:29:54.200 |
And here's my, you know, I've been picking on Harvard a lot but I'm going to pick on 02:30:03.040 |
And you know, there's a great book called Poison Ivy by Evan Mandery, which I recommend 02:30:09.360 |
It sounds like me in a rant at Stanford. And I think the stats, you know, elite 02:30:15.240 |
higher education has more kids from the top 1% than they have from the bottom 50 or 60%, 02:30:23.520 |
And when you look at how much they actually replicate class privilege, it's really striking. 02:30:31.160 |
And above all else, if you're weird, continue being weird. 02:30:38.240 |
And you're one of the most interesting, one of the weirdest, in the most beautiful way, people I know. 02:30:43.160 |
Greg, thank you for the really important work you do. 02:30:51.920 |
I appreciate the class, the hilarity that you brought here today, man. 02:31:00.560 |
And for me, who deeply cares about education, higher education, thank you for holding the 02:31:06.560 |
MITs and the Harvards accountable for doing right by the people that walk their halls. 02:31:16.200 |
Thanks for listening to this conversation with Greg Lukianoff. 02:31:18.800 |
To support this podcast, please check out our sponsors in the description. 02:31:23.040 |
And now let me leave you with some words from Noam Chomsky. 02:31:27.000 |
If you believe in freedom of speech, you believe in freedom of speech for views you don't like. 02:31:32.560 |
Goebbels was in favor of freedom of speech for views he liked. 02:31:38.100 |
If you're in favor of freedom of speech, that means you're in favor of freedom of speech precisely for views you despise. 02:31:45.880 |
Thank you for listening, and hope to see you next time.