
Ronald Sullivan: The Ideal of Justice in the Face of Controversy and Evil | Lex Fridman Podcast #170


Chapters

0:00 Introduction
2:13 Harvey Weinstein
8:18 Harvard succumbs to pressure
19:33 Safe spaces
25:51 Cancel culture
28:48 Evil
32:58 Hitler
37:34 Criminal justice system
41:35 Innocence
44:04 Racism in the judicial system
56:06 George Floyd
58:31 The trial of Derek Chauvin
72:20 O. J. Simpson
76:54 Aaron Hernandez
88:35 Book recommendations
96:10 Advice for young people
98:18 Death
100:24 Meaning of life


00:00:00.000 | The following is a conversation with Ronald Sullivan,
00:00:02.840 | a professor at Harvard Law School
00:00:04.680 | known for taking on difficult and controversial cases.
00:00:08.440 | He was on the head legal defense team
00:00:11.120 | for the Patriots football player Aaron Hernandez
00:00:14.160 | in his double murder case.
00:00:15.920 | He represented one of the GNS6 defendants
00:00:19.160 | and never lost a case during his years
00:00:21.760 | in Washington DC's Public Defender Services Office.
00:00:25.920 | In 2019, Ronald joined the legal defense team
00:00:29.400 | of Harvey Weinstein, a film producer
00:00:32.280 | facing multiple charges of rape and other sexual assault.
00:00:36.360 | This decision met with criticism
00:00:38.600 | from Harvard University students,
00:00:40.480 | including an online petition by students
00:00:42.600 | seeking his removal as faculty dean of Winthrop House.
00:00:47.000 | Then a letter supporting him,
00:00:49.120 | signed by 52 Harvard Law School professors,
00:00:51.720 | appeared in the Boston Globe on March 8th, 2019.
00:00:55.800 | Following this, the Harvard administration
00:00:57.560 | succumbed to the pressure of a few Harvard students
00:01:00.480 | and announced that they will not be renewing
00:01:03.240 | Ronald Sullivan's dean position.
00:01:05.640 | This created a major backlash in the public discourse
00:01:09.280 | over the necessary role of universities
00:01:11.900 | in upholding the principles of law and freedom
00:01:15.680 | at the very foundation of the United States.
00:01:19.160 | This conversation is brought to you
00:01:20.520 | by Brooklinen sheets, Wine Access online wine store,
00:01:24.160 | Monk Pack low carb snacks,
00:01:25.960 | and Blinkist app that summarizes books.
00:01:28.440 | Click their links to support this podcast.
00:01:31.400 | As a side note, let me say that the free exchange
00:01:34.320 | of difficult ideas is the only mechanism
00:01:37.360 | through which we can make progress.
00:01:39.540 | Truth is not a safe space.
00:01:42.320 | Truth is humbling, and being humbled can hurt.
00:01:46.460 | But this is the role of education,
00:01:48.240 | not just in the university, but in business and in life.
00:01:52.560 | Freedom and compassion can coexist,
00:01:55.640 | but it requires work and patience.
00:01:58.460 | It requires listening to the voices
00:02:00.840 | and to the experiences unlike our own.
00:02:03.680 | Listening, not silencing.
00:02:06.520 | This is the Lex Fridman Podcast,
00:02:09.300 | and here is my conversation with Ronald Sullivan.
00:02:13.080 | You were one of the lawyers who represented
00:02:16.720 | the Hollywood producer Harvey Weinstein
00:02:20.060 | in advance of a sexual assault trial.
00:02:22.320 | For this, Harvard forced you to step down
00:02:24.800 | as faculty deans, you and your wife, of Winthrop House.
00:02:29.040 | Can you tell the story of this saga
00:02:30.840 | from first deciding to represent Harvey Weinstein
00:02:35.060 | to the interesting, complicated events that followed?
00:02:39.260 | - Yeah, sure.
00:02:40.100 | So I got a call one morning from a colleague
00:02:42.760 | at the Harvard Law School who asked
00:02:46.100 | if I would consent to taking a call from Harvey.
00:02:52.560 | He wanted to meet me and chat with me
00:02:55.440 | about representing him.
00:02:56.640 | I said yes, and one thing led to another.
00:03:00.640 | I drove out to Connecticut where he was staying
00:03:05.120 | and met with him and some of his advisors.
00:03:08.040 | And then a day or two later, I decided to take the case.
00:03:13.040 | This would have been back in January of 2019, I believe.
00:03:18.920 | So the sort of cases, I have a very small practice.
00:03:23.560 | Most of my time is teaching and writing,
00:03:27.160 | but I tend to take cases that most deem to be impossible.
00:03:32.160 | I take the challenging sorts of cases.
00:03:36.680 | And this was, fit the bill, it was quite challenging
00:03:41.160 | in the sense that everyone had prejudged the case.
00:03:46.080 | When I say everyone, I just mean the general sentiment
00:03:48.680 | and the public had the case prejudged,
00:03:52.360 | even though the specific allegations did not regard
00:03:57.200 | any of the people in the New Yorker.
00:04:02.360 | That's the New Yorker article that sort of exposed
00:04:05.360 | everything that was going on, allegedly, with Harvey.
00:04:12.160 | So I decided to take the case and I did.
00:04:17.060 | - Is there a philosophy behind you taking on
00:04:20.160 | these very difficult cases?
00:04:21.680 | Like, is it a set of principles?
00:04:23.460 | Is it just your love of the law?
00:04:25.600 | Or is there like set of principles
00:04:27.840 | why you take on the cases?
00:04:29.780 | - Yeah, I do.
00:04:31.600 | I like to take on hard cases and I like to take on the cases
00:04:35.280 | that are with unpopular defendants, unpopular clients.
00:04:40.280 | And with respect to the latter,
00:04:44.840 | that's where Harvey Weinstein fell.
00:04:48.120 | It's because we need lawyers and good lawyers
00:04:53.120 | to take the unpopular cases because those sorts of cases
00:04:59.020 | determine what sort of criminal justice system we have.
00:05:04.040 | If we don't protect the rights and the liberties
00:05:06.800 | of those whom the society deems to be the least
00:05:10.040 | and the last, the unpopular client,
00:05:12.280 | then that's the camel's nose under the tent.
00:05:15.080 | If we let the camel's nose under the tent,
00:05:17.280 | the entire tent is gonna collapse.
00:05:19.560 | That is to say, if we short circuit the rights
00:05:23.720 | of a client like Harvey Weinstein,
00:05:26.160 | then the next thing you know,
00:05:27.800 | someone will be at your door knocking it down
00:05:30.400 | and violating your rights.
00:05:31.640 | There's a certain creep there with respect to the way
00:05:36.280 | in which the state will respect the civil rights
00:05:39.680 | and civil liberties of people.
00:05:41.400 | And these are the sorts of cases that test it.
00:05:44.240 | So, for example, there was a young man many, many years ago
00:05:49.240 | named Ernesto Miranda.
00:05:52.080 | By all accounts, he was not a likable guy.
00:05:55.120 | He was a three-time knife thief and not a likable guy.
00:06:00.120 | But lawyers stepped up and took his case.
00:06:04.400 | And because of that, we now have the Miranda warnings.
00:06:08.200 | You have the right to remain silent,
00:06:09.720 | those warnings that officers are forced to give to people.
00:06:14.720 | So it is through these cases that we express oftentimes
00:06:19.560 | the best values in our criminal justice system.
00:06:22.200 | So I proudly take on these sorts of cases
00:06:26.040 | in order to vindicate not only the individual rights
00:06:28.800 | of the person whom I'm representing,
00:06:30.720 | but the rights of citizens writ large,
00:06:34.160 | most of whom do not experience the criminal justice system
00:06:39.680 | and it's partly because of lawyers
00:06:42.960 | who take on these sorts of cases
00:06:45.880 | and establish rules that protect us,
00:06:48.960 | average, everyday, ordinary, concrete citizens.
00:06:53.480 | - From a psychological perspective, just you as a human,
00:06:56.440 | is there fear, is there stress from all the pressure?
00:07:00.640 | 'Cause if you're facing, I mean, the whole point,
00:07:03.360 | a difficult case, especially in the latter
00:07:05.240 | that you mentioned of the going against popular opinion,
00:07:08.360 | you have the eyes of millions potentially
00:07:11.240 | looking at you with anger as you try to defend
00:07:15.240 | the set of laws that this country is built on.
00:07:20.240 | - No, it doesn't stress me out particularly.
00:07:22.760 | It sort of comes with the territory.
00:07:26.360 | I try not to get too excited in either direction.
00:07:31.360 | So a big part of my practice is wrongful convictions
00:07:35.400 | and I've gotten over 6,000 people out of prison
00:07:40.400 | who've been wrongfully incarcerated
00:07:43.480 | and a subset of those people have been convicted
00:07:46.800 | and there are people who've been in jail 20, 30 years
00:07:50.560 | who have gotten out and those are the sorts of cases
00:07:52.960 | where people praise you and that sort of thing.
00:07:56.880 | And so, look, I do the work that I do.
00:08:01.840 | I'm proud of the work that I do.
00:08:03.760 | And in that sense, I'm sort of a part-time Taoist.
00:08:08.520 | The expression: reversal is the movement of the Tao.
00:08:11.720 | So I don't get too high, I don't get too low.
00:08:14.640 | I just try to do my work and represent people
00:08:17.480 | to the best of my ability.
00:08:18.600 | - So one of the hardest cases of recent history
00:08:21.080 | would be the Harvey Weinstein in terms of popular opinion
00:08:24.080 | or unpopular opinion.
00:08:25.640 | So if you continue on that line,
00:08:29.120 | where does that story take you of taking on this case?
00:08:33.280 | - Yeah, so I took on the case
00:08:35.240 | and then there was a few students at the college.
00:08:40.120 | So let me back up.
00:08:41.160 | I had an administrative post at Harvard College,
00:08:44.080 | which is a separate entity from the Harvard Law School.
00:08:46.480 | Harvard College is the undergraduate portion
00:08:48.960 | of Harvard University and the law school is obviously
00:08:52.080 | the law school and I initially was appointed as master
00:08:57.080 | of one of the houses.
00:08:58.760 | We did a name change five or six years into it
00:09:01.880 | and we're called faculty deans.
00:09:04.760 | But the houses at Harvard are based on the college system
00:09:07.360 | of Oxford and Cambridge.
00:09:09.480 | So when students go to Harvard after their first year,
00:09:13.640 | they're assigned to a particular house or college
00:09:16.720 | and that's where they live and eat and so forth.
00:09:19.120 | - And these are undergraduates too.
00:09:20.200 | - These are undergraduate students.
00:09:21.640 | So I was responsible for one of the houses
00:09:25.200 | as its faculty dean.
00:09:28.080 | So it's an administrative appointment at the college
00:09:30.800 | and some students who clearly didn't like Harvey Weinstein
00:09:35.800 | began to protest about the representation
00:09:41.280 | and from there, it just mushroomed
00:09:45.560 | into one of the most craven, cowardly acts
00:09:49.840 | by any university in modern history.
00:09:53.200 | It's just a complete and utter repudiation
00:09:57.520 | of academic freedom and it is a decision
00:10:02.520 | that Harvard certainly will live to regret.
00:10:07.280 | Frankly, it's an embarrassment.
00:10:09.080 | We expect students to do what students do
00:10:12.320 | and I've encouraged students to have their voices heard
00:10:16.440 | and to protest.
00:10:17.520 | I mean, that's what students do.
00:10:19.760 | What is vexing are the adults.
00:10:24.360 | The dean of the faculty of arts and science,
00:10:27.160 | Claudine Gay, absolutely craven and cowardly.
00:10:31.000 | The dean of the college, same thing,
00:10:33.160 | Rakesh Khurana, craven and cowardly.
00:10:37.080 | They capitulated to the loudest voice in the room
00:10:42.080 | and ran around afraid of 19-year-olds.
00:10:45.040 | Oh, my 19-year-olds are upset.
00:10:47.200 | I need to do something.
00:10:49.680 | And it appeared to me that they so,
00:10:53.640 | so desired the approval of students
00:10:57.400 | that they were afraid to make the tough decision
00:11:00.320 | and the right decision.
00:11:01.600 | It really could have been an important teaching moment
00:11:04.560 | at Harvard. - Teaching moment, yeah.
00:11:06.200 | - Very important teaching moment.
00:11:07.800 | - So they forced you to step down
00:11:09.480 | from that faculty dean position at the house.
00:11:13.280 | - I would push back on the description a little bit.
00:11:16.160 | So I don't write the references
00:11:21.160 | to the op-ed I did in "The New York Times,"
00:11:23.560 | "Harvard made a mistake by making me step down,"
00:11:26.040 | or something like that.
00:11:27.680 | So I don't write those things.
00:11:29.320 | I did not step down; I refused to step down.
00:11:32.800 | Harvard declined to renew my contract.
00:11:37.200 | And I made it clear that I was not going to resign
00:11:41.240 | as a matter of principle and force them
00:11:45.000 | to do the cowardly act that they, in fact, did.
00:11:50.240 | - And you know, the worst thing about this,
00:11:54.160 | they did, the college, Dean Gay and Dean Khurana,
00:11:59.160 | commissioned this survey.
00:12:01.000 | They've never done this before, survey from the students.
00:12:03.760 | You know, how do you feel at Winthrop House?
00:12:06.760 | And the funny thing about the survey is
00:12:08.800 | they never released the results.
00:12:10.960 | Why did they never release the results?
00:12:13.380 | They never released the results
00:12:15.940 | because I would bet my salary
00:12:19.060 | that the results came back positive for me.
00:12:21.760 | And it didn't fit their narrative
00:12:23.660 | because most of the students were fine.
00:12:26.020 | Most of the students were fine.
00:12:27.460 | It was the loudest voice in the room.
00:12:29.420 | So they never released it.
00:12:30.860 | And you know, I challenge them to this day, release it.
00:12:33.940 | Release it.
00:12:34.940 | But no, but you know, they wanted to create this narrative.
00:12:42.100 | And when the data didn't support the narrative,
00:12:47.100 | then they just got silent.
00:12:48.940 | Oh, we're not gonna release it.
00:12:50.940 | The students demanded it, I demanded it,
00:12:53.780 | and they wouldn't release it because I am,
00:12:56.540 | I just know in my heart of hearts that it was,
00:13:01.480 | it came back in my favor that most students
00:13:05.100 | at Winthrop House said they were fine.
00:13:08.380 | There was a group of students
00:13:09.540 | that weaponized the term unsafe.
00:13:13.460 | They said we felt unsafe and they bandied this term about,
00:13:18.460 | but I'm, again, I'm confident that the majority of students
00:13:23.260 | at Winthrop House said they felt completely fine
00:13:26.660 | and felt safe and so forth.
00:13:29.300 | And the super majority, I am confident,
00:13:31.900 | either said, I feel great at Winthrop
00:13:35.380 | or, you know, I don't care one way or the other.
00:13:37.560 | And then there was some minority who had a different view.
00:13:40.740 | But, you know, lessons learned.
00:13:44.880 | It was a wonderful opportunity at Winthrop.
00:13:49.660 | I met some amazing students over my 10 years as master
00:13:54.180 | and then faculty dean.
00:13:56.180 | And I'm still in touch with a number of students,
00:13:58.800 | some of whom are now my students at the law school.
00:14:02.740 | So in the end, I thought it was,
00:14:06.060 | it ended up being a great experience.
00:14:09.880 | The national media was just wonderful in this,
00:14:13.720 | just wonderful.
00:14:15.000 | People wrote such wonderful articles and accounts
00:14:18.360 | and wagged their finger appropriately at Harvard.
00:14:21.600 | Compared me to John Adams,
00:14:25.940 | which I don't think is an apt comparison,
00:14:27.880 | but it's always great to read something like that.
00:14:30.400 | But at any rate, that was the Harvard,
00:14:33.780 | the Harvard versus Harvey situation.
00:14:37.400 | - So that seems like a seminal mistake by Harvard.
00:14:40.960 | And Harvard is one of the great universities in the world.
00:14:43.880 | And so sort of its successes and its mistakes
00:14:47.840 | are really important for the world
00:14:49.800 | as a beacon of like how we make progress.
00:14:53.360 | So what lessons for the bigger academia
00:14:56.320 | that's under fire a lot these days,
00:14:59.360 | what bigger lessons do you take away?
00:15:03.420 | Like how do we make Harvard great?
00:15:06.440 | How do we make other universities, Yale, MIT great
00:15:10.920 | in the face of such mistakes?
00:15:12.960 | - Well, I think that we have moved into a model
00:15:16.120 | where we have the consumerization of education.
00:15:21.120 | That is to say,
00:15:25.360 | we have feckless administrators
00:15:30.480 | who make policy based on what the students say.
00:15:35.480 | Now, this comment is not intended to suggest
00:15:40.660 | that students have no voice in governance,
00:15:44.300 | but it is to suggest that the faculty
00:15:47.260 | are there for a reason.
00:15:48.460 | They are among the greatest minds on the planet earth
00:15:52.700 | in their particular fields at schools like Harvard
00:15:55.260 | and Yale, Stanford, the schools that you mentioned, MIT,
00:15:59.060 | quite literally the greatest minds on earth.
00:16:01.380 | They're there for a reason.
00:16:03.860 | Things like curriculum and so forth
00:16:07.700 | are rightly in the province of faculty.
00:16:11.340 | And while you take input and critique and so forth,
00:16:15.180 | ultimately the grownups in the room
00:16:18.460 | have to be sufficiently responsible to take charge
00:16:23.180 | and to direct the course of a student's education.
00:16:28.380 | And my situation is one example
00:16:32.460 | where it really could have been an excellent teaching moment
00:16:36.020 | about the value of the Sixth Amendment,
00:16:38.020 | about what it means to treat people
00:16:42.860 | who are in the crosshairs of the criminal justice system.
00:16:46.620 | But rather than having that conversation,
00:16:50.100 | it's just this consumerization model.
00:16:54.940 | Well, there's a lot of noise out here,
00:16:57.460 | so we're gonna react in this sort of way.
00:17:00.500 | Higher education as well, unfortunately,
00:17:02.940 | has been commodified in other sorts of ways
00:17:07.060 | that has reduced or impeded,
00:17:10.420 | hampered these schools' commitments
00:17:13.220 | to free and robust and open dialogue.
00:17:17.900 | So to the degree that academic freedom
00:17:20.780 | doesn't sit squarely at the center of the academic mission,
00:17:25.460 | any school is gonna be in trouble.
00:17:27.300 | And I really hope that we weather
00:17:32.300 | this current political moment
00:17:38.180 | where 19-year-olds without degrees are running universities
00:17:43.180 | and get back to a system where faculty,
00:17:50.580 | where adults make decisions in the best interests
00:17:56.220 | of the university and the best interests of the student,
00:17:58.820 | even to the degree, though,
00:18:00.220 | some of those decisions may be unpopular.
00:18:04.900 | And that is gonna require a certain courage.
00:18:09.900 | And hopefully in time,
00:18:17.140 | and I'm confident that in time,
00:18:20.260 | administrators are gonna begin to push back
00:18:23.300 | on these current trends.
00:18:25.660 | Harvard's been around for a long time,
00:18:26.980 | it's been around for a long time for a reason.
00:18:29.300 | And one of the reasons is that it understands itself
00:18:32.860 | not to be static.
00:18:33.940 | So I have every view
00:18:36.420 | that Harvard is going to adapt
00:18:46.020 | and get itself back on course
00:18:49.660 | and be around another 400 years.
00:18:51.660 | At least that's my hope.
00:18:53.100 | - So, I mean, what this kind of boils down to
00:18:56.060 | is just having difficult conversation, difficult debates.
00:18:59.580 | When you mentioned sort of 19-year-olds,
00:19:01.980 | and it's funny, I've seen this even at MIT,
00:19:04.660 | it's not that they shouldn't have a voice.
00:19:07.660 | They do seem to, I guess you have to experience it
00:19:12.300 | and just observe it,
00:19:13.780 | they have a strangely disproportionate power.
00:19:17.060 | - Right, right.
00:19:17.900 | - It's very interesting to basically,
00:19:20.700 | I mean, you say, yes, there's great faculty and so on,
00:19:24.020 | but it's not even just that the faculty is smart
00:19:27.700 | or wise or whatever,
00:19:30.300 | it's that they're just silenced.
00:19:33.100 | So the terminology that you mentioned is weaponized
00:19:35.660 | as sort of safe spaces
00:19:38.100 | or that certain conversations make people feel unsafe.
00:19:41.880 | What do you think about this kind of idea?
00:19:48.460 | Is there some things that are unsafe
00:19:52.900 | to talk about in the university setting?
00:19:55.420 | Is there lines to be drawn somewhere?
00:19:58.100 | And just like you said on the flip side
00:20:01.380 | with the slippery slope,
00:20:02.940 | is it too easy for the lines to be drawn everywhere?
00:20:06.460 | - Yeah, that's a great question.
00:20:08.380 | So this idea of unsafe space,
00:20:10.980 | at least the vocabulary derives from some research,
00:20:15.780 | academic research about feeling psychologically unsafe.
00:20:20.300 | And so the notion here is that there are forms
00:20:26.060 | of psychological disquiet that impedes people
00:20:31.060 | from experiencing the educational environment
00:20:36.860 | to the greatest degree possible.
00:20:39.180 | And that's the argument.
00:20:44.380 | And assuming for a moment
00:20:46.020 | that people do have these feelings of disquiet
00:20:51.020 | at elite universities like MIT and like Harvard,
00:20:56.580 | that's probably the safest space people are gonna be in
00:20:59.980 | for their lives.
00:21:02.100 | 'Cause when they get out into the "real world",
00:21:05.460 | they won't have the sorts of nets
00:21:10.380 | that these schools provide,
00:21:11.580 | safety nets that these schools provide.
00:21:14.240 | So to the extent that research is descriptive
00:21:17.580 | of a psychological feeling,
00:21:19.780 | I think that the duty of the universities
00:21:22.460 | are to challenge people.
00:21:24.260 | Seems to me that it's a shame to go to a place like Harvard
00:21:27.540 | or a place like MIT, Yale,
00:21:29.300 | any of these great institutions
00:21:32.240 | and come out the same person that you were when you went in.
00:21:36.980 | That seems to be a horrible waste of four years
00:21:39.420 | and money and resources.
00:21:42.020 | Rather, we ought to challenge students,
00:21:45.140 | that they grow,
00:21:46.240 | challenge some of their most deeply held assumptions.
00:21:52.120 | They may continue to hold them,
00:21:54.980 | but the point of an education
00:21:57.020 | is to rigorously interrogate these fundamental assumptions
00:22:02.020 | that have guided you thus far
00:22:04.920 | and to do it fairly and civilly.
00:22:08.320 | So to the extent that there are lines that should be drawn,
00:22:12.040 | there's a long tradition
00:22:13.520 | in the university of civil discourse.
00:22:15.720 | So you should draw lines somewhere
00:22:18.920 | between civil discourse and uncivil discourse.
00:22:22.240 | The purpose of a university
00:22:23.780 | is to talk difficult conversations, tough issues,
00:22:28.520 | talk directly and frankly, but do it civilly.
00:22:32.760 | And so to yell and cuss at somebody
00:22:37.840 | and that sort of thing,
00:22:38.840 | well, do that on your own space,
00:22:42.120 | but observe the norms of civil discourse at the university.
00:22:47.120 | So look, I think that the presumption
00:22:52.080 | ought to be that the most difficult topics
00:22:56.920 | are appropriate to talk about at a university.
00:23:00.040 | That ought to be the presumption.
00:23:01.760 | Now, should
00:23:06.840 | MIT, for example, give its imprimatur
00:23:11.360 | to someone who is espousing the flat earth theory,
00:23:16.360 | you know, the earth is flat, right?
00:23:18.480 | So if certain ideas
00:23:22.680 | are so contrary to the scientific
00:23:28.640 | and cultural thinking of the moment,
00:23:34.280 | yeah, there's space there to draw a line
00:23:36.760 | and say, yeah, we're not gonna give you this platform
00:23:40.760 | to tell our students that the earth is flat.
00:23:44.740 | But, you know, a topic that's controversial,
00:23:49.760 | but contestatory, that's what universities are for.
00:23:54.440 | If you don't like the idea,
00:23:55.960 | present better ideas and articulate them.
00:23:58.840 | - And I think there needs to be a mechanism
00:24:01.860 | outside of the space of ideas of humbling.
00:24:04.240 | I've done martial arts for a long time.
00:24:07.120 | I got my ass kicked a lot.
00:24:09.040 | I think that's really important.
00:24:10.760 | I mean, in the space of ideas,
00:24:13.840 | I mean, even just in engineering,
00:24:16.240 | just all the math classes.
00:24:18.020 | My memories of math, which I love,
00:24:22.320 | is kind of pain.
00:24:24.920 | It's basically coming face to face with the idea
00:24:28.160 | that I'm not special,
00:24:30.880 | that I'm much dumber than I thought I was,
00:24:34.240 | and that accomplishing anything in this world
00:24:38.560 | requires really hard work.
00:24:40.140 | That's really humbling.
00:24:42.240 | That puts you, 'cause I remember when I was 18 and 19,
00:24:46.680 | and I thought I was gonna be the smartest,
00:24:48.980 | the best fighter, the Nobel Prize winning,
00:24:52.680 | you know, all those kinds of things.
00:24:54.080 | And then you come face to face with reality and it hurts.
00:24:58.400 | And it feels like there needs to be efficient mechanisms
00:25:01.720 | from the best universities in the world
00:25:03.960 | to, without abusing you,
00:25:06.440 | it's a very difficult line to walk,
00:25:09.160 | without mentally or physically abusing you,
00:25:13.160 | be able to humble you.
00:25:15.480 | And that's what I felt was missing
00:25:17.400 | in these very difficult, very important conversations
00:25:20.040 | is the 19-year-olds, when they spoke up,
00:25:23.600 | the mechanism for humbling them with ideas was missing.
00:25:28.600 | It kind of got broken down because, as you say,
00:25:33.640 | there does, like, I sensed fear.
00:25:37.800 | Everything was permeated with fear.
00:25:41.360 | And fear is paralyzing, fear is destructive,
00:25:45.320 | especially in a place that's supposed to be
00:25:47.120 | all about freedom of ideas.
00:25:49.400 | - Right.
00:25:50.240 | And I mean, I don't know if you have anything,
00:25:53.060 | any thoughts to say on this whole idea of cancel culture,
00:25:57.560 | where people, I mean, a lot of people use it
00:26:01.680 | as become political, so stay maybe outside of the world
00:26:04.480 | to politics.
00:26:05.420 | Is this, do you have thoughts about it?
00:26:11.080 | Does it bother you that people are sort of put in this bin
00:26:14.080 | and labeled as something and then thereby
00:26:17.880 | you can ignore everything they say?
00:26:19.560 | I mean, Steven Pinker, there's a lot of Harvard folks
00:26:22.480 | that are fighting against these set of ideas,
00:26:25.400 | but do you have thoughts?
00:26:27.360 | - I think that we as a culture are way, way, way too quick
00:26:32.360 | to cancel people.
00:26:34.680 | And it's become almost reflexive now.
00:26:39.680 | You know, someone says something or makes an offhand comment,
00:26:46.920 | even a mistake, there's a move to simply cancel folks.
00:26:51.920 | So I think that this quote unquote cancel culture
00:26:58.520 | has really gotten out of control at this point.
00:27:03.520 | It's forcing people to be robotic in many ways.
00:27:08.960 | - No offense to the robots.
00:27:11.000 | - I was gonna say, now I know I'm venturing
00:27:12.800 | into your intellectual domain.
00:27:15.600 | - For future robots watching this, no offense.
00:27:18.720 | - And there are many; it's discouraging a lot of good people
00:27:21.480 | from getting into public life in any sort of way
00:27:25.960 | because who needs the stress of it?
00:27:29.840 | - Well, in some sense, you're an inspiration
00:27:31.760 | that you're able to withstand the pressure,
00:27:35.600 | the pressure of the masses.
00:27:37.280 | But it is, it's a sad aspect of human nature
00:27:41.880 | that we kind of get into these crowds
00:27:43.600 | and we start chanting and it's fun for some reason,
00:27:47.040 | and then you forget yourself
00:27:48.240 | and then you sort of wake up the next day
00:27:50.840 | not having anticipated the consequences
00:27:55.480 | of all the chanting.
00:27:57.000 | And we get ourselves in trouble in that.
00:27:58.920 | I mean, there's some responsibility on social networks
00:28:02.800 | and the mechanisms by which they make it more frictionless
00:28:07.360 | to do the chanting, to do the canceling,
00:28:09.440 | to do the outrage and all that kind of stuff.
00:28:11.160 | So I actually on the technology side
00:28:13.160 | have a hope that that's fixable.
00:28:14.680 | But yeah, it does seem to be,
00:28:17.840 | you know, it almost like the internet showed to us
00:28:23.320 | that we have a lot of broken ways
00:28:25.980 | about which we communicate with each other
00:28:28.120 | and we're trying to figure that out.
00:28:29.360 | Same with the university.
00:28:30.680 | This mistake by Harvard showed that we need to reinvent
00:28:35.640 | what the university is.
00:28:37.400 | And I mean, all of this is,
00:28:39.160 | it's almost like we're finding our baby deer legs
00:28:42.320 | and trying to strengthen the institutions
00:28:45.480 | that have been very successful for a long time.
00:28:48.840 | You know, the really interesting thing
00:28:50.600 | about Harvey Weinstein
00:28:52.720 | and you choosing these exceptionally difficult cases
00:28:56.160 | is also thinking about
00:28:59.520 | what it means to defend evil people,
00:29:05.640 | what it means to defend these, we could say unpopular,
00:29:10.280 | and you might push back against the word evil,
00:29:13.360 | but bad people in society.
00:29:15.500 | First of all, do you think there's such a thing as evil
00:29:19.000 | or do you think all people are good
00:29:20.680 | and it's just circumstances that create evil?
00:29:24.000 | And also, is there somebody too evil for the law to defend?
00:29:28.200 | - So the first question,
00:29:30.600 | that's a deep philosophical question,
00:29:33.040 | whether the category of evil does any work for me.
00:29:38.040 | It does for me.
00:29:40.000 | I do think that,
00:29:41.880 | I do subscribe to that category,
00:29:45.080 | that there is evil in the world
00:29:48.320 | as conventionally understood.
00:29:50.600 | So there are many who will say,
00:29:52.840 | yeah, that just doesn't do any work for me,
00:29:57.160 | but the category evil, in fact,
00:30:00.280 | does intellectual work for me
00:30:02.240 | and I understand it as something that exists.
00:30:08.360 | - Is it genetic or is it the circumstance?
00:30:11.200 | What kind of work does it do for you intellectually?
00:30:13.520 | - I think that it's highly contingent,
00:30:16.920 | that is to say that the conditions
00:30:19.840 | in which one grows up and so forth
00:30:23.600 | begins to create this category
00:30:30.480 | that we may think of as evil.
00:30:32.480 | Now, there are studies and whatnot
00:30:36.040 | that show that certain brain abnormalities and so forth
00:30:41.040 | are more prevalent in say serial killers.
00:30:45.520 | So there may be a biological predisposition
00:30:49.440 | to certain forms of conduct,
00:30:51.520 | but I don't have the biological evidence
00:30:56.520 | to make a statement that someone is born evil.
00:31:02.120 | And I'm not a determinist thinker in that way,
00:31:06.880 | that you come out of the womb evil
00:31:08.680 | and you're destined to be that way.
00:31:10.520 | To the extent there may be biological determinants,
00:31:16.200 | there is still some nurture required as well.
00:31:21.200 | - But do you still put a responsibility on the individual?
00:31:26.440 | - Of course, yeah, we all make choices.
00:31:29.240 | And so some responsibility on the individual indeed.
00:31:33.360 | We live in a culture unfortunately,
00:31:40.320 | where a lot of people have a constellation
00:31:44.840 | of bad choices in front of them.
00:31:46.760 | And that makes me very sad.
00:31:48.480 | That the people grow up with predominantly bad choices
00:31:54.400 | in front of them, and that's unfair.
00:31:56.480 | And that's on all of us.
00:31:58.640 | But yes, I do think we make choices.
00:32:01.200 | - Wow, that's so powerful, the constellation of bad choices.
00:32:04.800 | That's such a powerful way to think about sort of equality,
00:32:12.840 | which is the set of trajectories before you
00:32:18.440 | that you could take if you just roll the dice.
00:32:21.320 | Life is a kind of optimization problem.
00:32:26.880 | Sorry to take this into math.
00:32:28.240 | It's an optimization over a set of trajectories
00:32:30.240 | under imperfect information.
00:32:32.120 | So you're gonna do a lot of stupid shit
00:32:36.200 | to put it in technical terms.
00:32:39.040 | But the fraction of the trajectories
00:32:45.320 | that take you into bad places or into good places
00:32:48.840 | is really important.
00:32:49.920 | And that's ultimately what we're talking about.
00:32:51.880 | And evil might be just a little bit
00:32:54.200 | of a predisposition biologically,
00:32:56.040 | but the rest is just trajectories that you can take.
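Lex's optimization framing, that what dominates outcomes is the fraction of good trajectories available to you, can be made concrete with a toy simulation. This is purely an editorial illustration, not part of the conversation: the majority-of-good-choices rule, the parameters, and the function name are all invented for the sketch.

```python
import random

def simulate_life(p_good, n_decisions=50, n_trials=10_000, seed=0):
    """Toy model: at each decision point, a fraction `p_good` of the
    available choices lead somewhere good. An agent choosing under
    imperfect information (uniformly at random here) ends up in a
    good place if most of its choices were good. Returns the
    estimated probability of a good outcome."""
    rng = random.Random(seed)
    good_outcomes = 0
    for _ in range(n_trials):
        good_choices = sum(rng.random() < p_good for _ in range(n_decisions))
        if good_choices > n_decisions / 2:
            good_outcomes += 1
    return good_outcomes / n_trials

# Shifting the constellation of options from 40% good to 60% good
# almost completely flips the likely outcome, with identical effort.
print(simulate_life(0.4), simulate_life(0.6))
```

Even this crude model shows the point being made: the same decision-making process over a mostly bad set of options yields mostly bad outcomes.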
00:32:58.320 | I've been studying Hitler a lot recently.
00:33:01.440 | I've been reading probably way too much.
00:33:03.560 | And it's interesting to think about
00:33:05.400 | all the possible trajectories that could have avoided
00:33:10.320 | this particular individual developing the hate that he did,
00:33:14.720 | the following that he did, the actual final.
00:33:18.780 | There's a few turns in him psychologically
00:33:23.400 | where he went from being a leader
00:33:26.400 | that just wants to conquer
00:33:28.760 | to somebody who allowed his anger
00:33:33.760 | and emotion to take over
00:33:35.800 | to where he started making mistakes
00:33:37.640 | militarily speaking,
00:33:41.360 | but also started doing evil things.
00:33:46.040 | And all the possible trajectories
00:33:48.160 | that could have avoided that are fascinating,
00:33:50.000 | including he wasn't that bad at painting and drawing.
00:33:54.520 | - Right, that's true.
00:33:55.840 | That is true. - From the very beginning.
00:33:57.560 | And his time in Vienna,
00:34:00.240 | there's all these possible things to think about.
00:34:02.440 | And of course, there's millions of others like him
00:34:04.800 | that never came to power and all those kinds of things.
00:34:08.200 | But that goes to the second question on the side of evil.
00:34:12.480 | Do you think, and Hitler is often brought up
00:34:15.320 | as an example of somebody who is the epitome of evil.
00:34:20.280 | Do you think you would,
00:34:22.840 | if you got that same phone call after World War II
00:34:26.960 | and Hitler had survived, during the trial for war crimes,
00:34:31.960 | would you take the case defending Adolf Hitler?
00:34:39.600 | If you don't wanna answer that one,
00:34:41.840 | is there a line to draw, for evil, for whom not to defend?
00:34:46.360 | - No, I think everyone, I'll do the second one first.
00:34:49.320 | Everyone has a right to a defense
00:34:51.840 | if you're charged criminally in the United States of America.
00:34:55.920 | So no, I do not think that there's someone so evil
00:34:59.200 | that they do not deserve a defense.
00:35:02.840 | Process matters.
00:35:05.000 | Process helps us get to results more accurately
00:35:09.880 | than we would otherwise.
00:35:11.720 | So it is important and it's vitally important
00:35:14.680 | and indeed more important for someone deemed to be evil
00:35:18.880 | to receive the same quantum of process
00:35:22.160 | and the same substance of process that anyone else would.
00:35:25.520 | It's vitally important to the health
00:35:27.240 | of our criminal justice system for that to happen.
00:35:31.360 | So yes, everybody, Hitler included,
00:35:35.080 | were he charged in the United States
00:35:37.800 | for a crime that occurred in the United States, yes.
00:35:41.960 | Whether I would do it,
00:35:45.960 | if I were a public defender and assigned the case,
00:35:49.120 | yes, I started my career as a public defender.
00:35:51.680 | I represent anyone who was assigned to me.
00:35:56.080 | I think that is our duty.
00:35:59.080 | In private practice,
00:36:03.680 | I have choices and I likely,
00:36:08.200 | based on the hypo you gave me,
00:36:09.720 | and I would tweak it a bit
00:36:10.920 | 'cause it would have to be a US crime.
00:36:13.360 | - United States, yes.
00:36:15.120 | - But I get the broader point
00:36:16.680 | and don't wanna bog down in technicalities.
00:36:19.080 | I'd likely pass right now as I see it,
00:36:24.080 | unless it was a case where nobody else
00:36:28.440 | would represent him.
00:36:30.400 | Then I would think that I have some sort of duty
00:36:35.840 | and obligation to do it.
00:36:40.680 | But yes, everyone absolutely deserves a right
00:36:44.400 | to competent counsel.
00:36:46.520 | - That is a beautiful ideal.
00:36:48.440 | It's difficult to think about it
00:36:49.880 | in the face of public pressure.
00:36:52.680 | It's just, I mean,
00:36:54.760 | it's kind of terrifying to watch the masses
00:36:59.280 | during this past year of 2020,
00:37:01.760 | to watch the power of the masses
00:37:04.080 | to make a decision before any of the data is out,
00:37:09.080 | if the data is ever out,
00:37:11.920 | any of the details, any of the processes.
00:37:15.640 | And there's an anger toward the justice system.
00:37:20.440 | There's a lot of people that feel like
00:37:22.120 | even though the ideal you describe is a beautiful one,
00:37:26.000 | it does not always operate justly.
00:37:30.280 | It does not operate to the best of its ideals.
00:37:33.040 | It operates unfairly.
00:37:34.880 | Can we go to the big picture of the criminal justice system?
00:37:38.840 | What do you, given the ideal,
00:37:42.560 | works about our criminal justice system and what is broken?
00:37:46.260 | - Well, there's a lot broken right now.
00:37:51.920 | And I usually focus on that.
00:37:54.680 | But in truth, a lot works about our criminal justice system.
00:37:59.000 | So there's an old joke.
00:38:00.600 | And it's funny, but it carries a lot of truth to it.
00:38:07.560 | And the joke is that in the United States,
00:38:11.240 | we have the worst criminal justice system in the world,
00:38:16.240 | except for every place else.
00:38:19.800 | And yes, we certainly have a number of problems
00:38:25.440 | and a lot of problems based on race and class
00:38:29.440 | and economic station,
00:38:32.100 | but we have a process that privileges liberty.
00:38:36.040 | And that's a good feature of the criminal justice system.
00:38:40.360 | So here's how it works.
00:38:41.600 | The idea of the relationship between the individual
00:38:45.160 | and the state is such that in the United States,
00:38:49.800 | we privilege liberty over and above very many values,
00:38:54.800 | so much so that a statement by Increase Mather,
00:38:58.560 | not terribly far from where we're sitting right now,
00:39:02.640 | has gained traction over all these years.
00:39:05.520 | And it's that better 10 guilty go free
00:39:07.960 | than one innocent person convicted.
00:39:10.600 | That is an expression of the way in which
00:39:14.560 | we understand liberty
00:39:16.920 | to operate in our collective consciousness.
00:39:19.360 | We would rather a bunch of guilty people go free
00:39:22.960 | than to impact the liberty interests
00:39:27.960 | of any individual person.
00:39:30.800 | So that's a guiding principle
00:39:32.760 | on our criminal justice system.
00:39:35.400 | Liberty.
00:39:36.720 | So we set a process that makes it difficult
00:39:40.320 | to convict people.
00:39:42.800 | We have rules of procedure that are cumbersome
00:39:46.280 | and that slow down the process
00:39:48.440 | and that exclude otherwise reliable evidence.
00:39:53.440 | And this is all because we place a value on liberty.
00:39:58.200 | And I think these are good things
00:40:00.040 | and it says a lot about our criminal justice system.
00:40:04.440 | Some of the bad features have to do
00:40:06.920 | with the way in which this country sees color
00:40:11.480 | as a proxy for criminality
00:40:13.480 | and treats people of color in radically different ways
00:40:17.720 | in the criminal justice system,
00:40:20.400 | from arrests to charging decisions to sentencing.
00:40:25.400 | People of color are disproportionately impacted
00:40:30.880 | on all sorts of registers.
00:40:33.440 | Now, one example, and it's a popular one,
00:40:37.000 | that although there appears to be
00:40:41.080 | no distinguishable difference between drug use
00:40:46.080 | by whites and blacks in the country,
00:40:49.800 | blacks, though only 12% of the population
00:40:54.320 | represent 40% of the drug charges in the country.
00:40:59.920 | There are some inequities along race and class
00:41:04.520 | in the criminal justice system
00:41:05.880 | that we really have to fix.
00:41:09.520 | And they've grown to more than bugs in the system
00:41:13.960 | and have become features, unfortunately, of our system.
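The two percentages Sullivan cites imply roughly a threefold overrepresentation. A quick back-of-the-envelope check (an editorial aside, using only the numbers as stated above):

```python
population_share = 0.12   # share of the US population, as stated above
drug_charge_share = 0.40  # share of drug charges, as stated above

# How overrepresented is the group relative to its population share?
disparity_ratio = drug_charge_share / population_share
print(f"{disparity_ratio:.1f}x overrepresentation")  # prints "3.3x overrepresentation"
```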
00:41:17.400 | - Oh, to make it more efficient to make judgments.
00:41:19.880 | So the racism makes it more efficient.
00:41:21.880 | - It efficiently moves people from society to the streets
00:41:27.880 | and a lot of innocent people get caught up in that.
00:41:32.880 | - Well, let me ask in terms of the innocence.
00:41:37.760 | So you've gotten a lot of people who are innocent,
00:41:42.040 | I guess, revealed their innocence,
00:41:48.440 | demonstrated their innocence.
00:41:50.080 | What's that process like?
00:41:51.680 | What's it like emotionally, psychologically?
00:41:53.840 | What's it like legally to fight that?
00:41:56.160 | Legally to fight the system
00:41:57.800 | through the process of revealing
00:42:02.440 | the innocence of a human being?
00:42:06.160 | - Yeah, emotionally and psychologically, it can be taxing.
00:42:09.400 | I follow a model of what's called empathic representation.
00:42:14.400 | And that is I get to know my clients and their family.
00:42:19.280 | I get to know their strivings, their aspirations,
00:42:21.920 | their fears, their sorrows.
00:42:24.360 | So that certainly sometimes can do psychic injury on one.
00:42:29.360 | If you get really invested and really sad or happy,
00:42:37.120 | it does become emotionally taxing.
00:42:42.560 | But the idea of someone sitting in jail for 20 years,
00:42:47.560 | completely innocent of a crime,
00:42:48.680 | can you imagine sitting there every day for 20 years
00:42:52.080 | knowing that you factually did not do the thing
00:42:55.640 | that you were convicted of by a jury of your peers?
00:42:59.240 | It's got to be the most incredible thing in the world.
00:43:02.560 | But the people who do it and the people who make it
00:43:07.760 | and come out on the other side as productive citizens
00:43:11.520 | are folks who say,
00:43:13.360 | they've come to an inner peace in their own minds.
00:43:16.960 | And they say, "These bars aren't gonna define me,
00:43:20.800 | that my humanity is there and it's immutable."
00:43:25.800 | And they are not bitter, which is amazing.
00:43:30.760 | I would tend to think that I'm not that good of a person.
00:43:33.680 | I would be bitter for every day of 20 years
00:43:36.360 | if I were in jail for something.
00:43:38.560 | But people tell me that you can't survive like that,
00:43:42.560 | that one cannot survive like that.
00:43:44.720 | And you have to come to terms with it.
00:43:46.760 | And the people whom I've exonerated,
00:43:50.920 | I mean, they come out, most of them come out
00:43:54.840 | and they just really just take on life with a vim
00:43:59.320 | and vigor without bitterness.
00:44:02.360 | And it's a beautiful thing to see.
00:44:04.960 | - Do you think it's possible to eradicate racism
00:44:08.640 | from the judicial system?
00:44:10.840 | - I do.
00:44:11.680 | I think that race insinuates itself
00:44:15.120 | in all aspects of our lives.
00:44:17.120 | And the judicial system is not immune from that.
00:44:21.200 | So to the extent we begin to eradicate
00:44:24.520 | dangerous and deleterious race thinking from society
00:44:29.760 | generally, then it will be eradicated
00:44:34.200 | from the criminal justice system.
00:44:36.360 | I think we've got a lot of work to do
00:44:37.840 | and I think it'll be a while, but I think it's doable.
00:44:42.600 | I mean, the country, so historians will look back
00:44:47.600 | 300 years from now and take note of the incredible journey
00:44:53.960 | of diasporic Africans in the US,
00:45:00.760 | an incredible journey from slavery
00:45:05.560 | to the heights of politics and business
00:45:10.680 | and judiciary and the academy and so forth
00:45:13.520 | in not a lot of time, actually not a lot of time.
00:45:17.440 | And if we can have that sort of movement historically,
00:45:22.440 | let's think about what the next 175 years will look like.
00:45:25.720 | I'm not saying it's gonna be short,
00:45:28.080 | but I'm saying that if we keep at it,
00:45:30.560 | keep getting to know each other a little better,
00:45:34.400 | keep enforcing laws that prohibit
00:45:39.880 | the sort of race-based discrimination
00:45:42.480 | that people have experienced and provide
00:45:45.520 | as a society opportunities for people to thrive
00:45:51.240 | in this world, then I think we can see a better world.
00:45:54.960 | And if we see a better world,
00:45:55.960 | we'll see a better judicial system.
00:45:58.040 | - So I think it's kind of fascinating,
00:46:00.080 | if you look throughout history, and race is just part of that,
00:46:02.480 | how we create the other and treat the other
00:46:08.480 | with disdain, not just through the legal system,
00:46:11.720 | but through human nature.
00:46:13.800 | I tend to believe, we mentioned offline
00:46:15.600 | that I work with robots.
00:46:18.320 | It sounds absurd to say, especially to you,
00:46:20.280 | especially 'cause we're talking about racism
00:46:22.080 | and it's so prevalent today.
00:46:23.760 | I do believe that there will be almost
00:46:27.520 | like a civil rights movement for robots
00:46:30.440 | because I think there's a huge value to society
00:46:37.120 | of having artificial intelligence systems
00:46:40.720 | that interact with humans and are human-like.
00:46:46.840 | And the more they become human-like,
00:46:50.440 | they will start to ask very fundamentally human questions
00:46:56.400 | about freedom, about suffering, about justice.
00:47:00.800 | And they will have to come face to face,
00:47:03.200 | like look in the mirror and ask the question,
00:47:07.320 | just because we're biologically based,
00:47:10.440 | just because we're sort of, well, just because we're human,
00:47:15.440 | does that mean we're the only ones that deserve the rights?
00:47:20.080 | Again, forming another group, which is robots.
00:47:25.080 | And I'm sure there could be along that path,
00:47:27.840 | different versions of other that we form.
00:47:32.280 | So racism, race is certainly a big other that we've made,
00:47:37.280 | as you said, a lot of progress on
00:47:39.560 | throughout the history of this country.
00:47:41.480 | But it does feel like we always create,
00:47:43.760 | as we make progress, create new other groups.
00:47:46.840 | And of course, the other group that perhaps is outside
00:47:50.560 | the legal system that people talk about is, essentially,
00:47:55.400 | now, I eat a lot of meat, but the torture of animals.
00:47:59.800 | People talk about how, when we look back
00:48:01.520 | a couple of centuries from now
00:48:03.360 | at the kind of things we're doing to animals,
00:48:06.160 | we might regret that.
00:48:07.840 | We might see that in a very different light.
00:48:09.600 | And it's kind of interesting to see the future trajectory
00:48:11.920 | of what we wake up to about the injustice in our ways.
00:48:16.920 | But the robot one is the one I'm especially focused on,
00:48:23.480 | but at this moment in time, it seems ridiculous.
00:48:26.440 | But I'm sure most civil rights movements
00:48:28.320 | throughout history seem ridiculous at first.
00:48:31.000 | - Well, it's interesting. Robots are sort of outside of my
00:48:34.480 | intellectual bailiwick, but
00:48:37.400 | as I understand the development of artificial intelligence,
00:48:42.400 | the aspect that is still missing
00:48:48.000 | is this notion of consciousness.
00:48:57.200 | And it's consciousness that is the thing
00:49:02.200 | that, if it were to exist,
00:49:07.560 | and I'm not saying that it can or will,
00:49:10.640 | would move robots from machines
00:49:15.120 | to something different,
00:49:20.320 | something that experiences the world in a way analogous
00:49:25.600 | to how we experience it.
00:49:28.240 | And also as I understand the science,
00:49:31.440 | unlike what you see on television,
00:49:35.560 | that we're not there yet in terms of this notion
00:49:40.400 | of the machines having a consciousness.
00:49:44.240 | - Or a great general intelligence,
00:49:48.800 | all those kinds of things.
00:49:49.800 | - Yeah, yeah.
00:49:50.640 | - A huge amount of progress has been made,
00:49:52.360 | and it's fascinating to watch.
00:49:55.480 | So I'm of two minds.
00:49:57.120 | As a person who's building them,
00:49:59.360 | I'm realizing how sort of quote-unquote dumb they are.
00:50:02.760 | But also looking at human history
00:50:06.960 | and how poor we are predicting the progress
00:50:09.560 | of innovation and technology,
00:50:11.720 | it's obvious that we have to be humble
00:50:16.680 | about our ability to predict,
00:50:16.680 | coupled with the fact that we keep,
00:50:19.000 | to use terminology carefully here,
00:50:22.760 | we keep discriminating against the intelligence
00:50:25.120 | of artificial systems.
00:50:28.040 | The smarter they get,
00:50:29.240 | the more ways we find to dismiss their intelligence.
00:50:33.760 | So this has just been going on throughout.
00:50:36.200 | It's almost as if we're threatened
00:50:41.240 | in the most primitive human way, animalistic way.
00:50:45.720 | We're threatened by the power of other creatures,
00:50:49.600 | and we wanna lessen, dismiss them.
00:50:52.280 | So consciousness is a really important one,
00:50:55.080 | but the one I think about a lot in terms of consciousness,
00:50:59.000 | the very engineering question,
00:51:01.000 | is whether the display of consciousness
00:51:03.560 | is the same as the possession of consciousness.
00:51:06.680 | So if a robot tells you they are conscious,
00:51:11.520 | if a robot looks like they're suffering
00:51:15.100 | when you torture them,
00:51:16.640 | if a robot is afraid of death
00:51:18.760 | and says they're afraid of death,
00:51:20.960 | and are legitimately afraid,
00:51:23.840 | in terms of just everything we as humans
00:51:28.080 | use to determine the ability of somebody
00:51:32.400 | to be their own entity,
00:51:34.840 | they're the one that loves, one that fears,
00:51:38.040 | one that hopes, one that can suffer.
00:51:42.240 | If a robot, in the dumbest of ways,
00:51:46.040 | is able to display that,
00:51:47.780 | it starts changing things very quickly.
00:51:53.280 | I'm not sure what it is,
00:51:54.520 | but it does seem that there's a huge component
00:51:57.080 | to consciousness that is a social creation.
00:52:01.320 | Like we together create our consciousness.
00:52:04.240 | Like we believe our common humanity together.
00:52:08.840 | Alone, we wouldn't be aware of our humanity.
00:52:11.840 | And the law, as it protects our freedoms,
00:52:18.760 | seems to be part of that same social construct.
00:52:18.760 | And when you add other creatures into it,
00:52:26.640 | it's not obvious to me that
00:52:26.640 | there'll be a moment when you say,
00:52:28.240 | "This thing is now conscious."
00:52:30.640 | I think there's going to be a lot of fake it
00:52:33.360 | until you make it.
00:52:34.440 | And there'll be a very gray area between fake and make
00:52:38.180 | that is going to force us to contend
00:52:41.000 | with what it means to be an entity that deserves rights,
00:52:45.800 | where all men are created equal.
00:52:48.520 | The men part might have to expand
00:52:51.680 | in ways that we are not yet anticipating.
00:52:55.000 | It's very interesting.
00:52:56.160 | I mean, my favorite,
00:52:57.380 | the fundamental thing I love about artificial intelligence
00:53:01.120 | is it gets smarter and smarter.
00:53:02.520 | It challenges us to think of what is right,
00:53:07.160 | questions of justice, questions of freedom.
00:53:09.600 | It basically challenges us to understand our own mind,
00:53:14.600 | to understand,
00:53:21.120 | almost from an engineering first-principles perspective,
00:53:24.080 | what it is that makes us human
00:53:26.560 | that is at the core of all the rights that we talk about
00:53:29.120 | and all the documents we write.
00:53:31.240 | So even if we don't give rights
00:53:32.560 | to artificial intelligence systems,
00:53:33.920 | we may be able to construct more fair legal systems
00:53:38.920 | to protect us humans.
00:53:41.000 | - Well, I mean, interesting ontological question
00:53:44.280 | between the performance of consciousness
00:53:47.800 | and actual consciousness to the extent
00:53:51.560 | that actual consciousness is anything
00:53:56.920 | beyond some contingent reality.
00:54:00.120 | But you've posed a number
00:54:01.240 | of interesting philosophical questions.
00:54:03.800 | And then there's also, it strikes me
00:54:06.440 | that philosophers of religion
00:54:10.280 | would pose another set of questions as well
00:54:13.760 | when you deal with issues of structure versus soul,
00:54:18.760 | body versus soul.
00:54:21.120 | And it will be a complicated mix.
00:54:26.120 | And I suspect I'll be dust
00:54:30.440 | by the time those questions get worked out.
00:54:33.600 | - And so, yeah, the soul is a fun one.
00:54:36.840 | There's no soul.
00:54:38.000 | I'm not sure, maybe you can correct me,
00:54:39.720 | but there's very few discussion of soul
00:54:42.640 | in our legal system, right?
00:54:44.520 | - Right, correct.
00:54:45.360 | None.
00:54:46.200 | But there is a discussion about what constitutes
00:54:49.920 | a human being.
00:54:51.400 | And I mean, you gestured at the notion
00:54:53.840 | of the potential of the law widening the domain
00:54:58.840 | of a human being.
00:55:01.880 | So in that sense, right, people are very angry
00:55:06.400 | because they can't get sort of pain and suffering damages
00:55:12.080 | if someone negligently kills a pet
00:55:14.920 | because a pet is not a human being.
00:55:17.840 | And people say, "Well, I love my pet,"
00:55:20.160 | but the law sees a pet as chattel,
00:55:24.040 | as property like this water bottle.
00:55:27.840 | So the current legal definitions trade
00:55:31.840 | on a definition of humanity
00:55:35.840 | that may not be worked out in any sophisticated way,
00:55:39.400 | but certainly there's a broad and shared understanding
00:55:44.400 | of what it means.
00:55:47.600 | So the law probably doesn't explicitly contain a definition
00:55:51.720 | of something like soul, but it's more robust
00:55:56.200 | than a carbon-based organism,
00:55:59.600 | that there's something a little more distinct
00:56:03.040 | about what the law thinks a human being is.
00:56:06.880 | - So if we can dive into, we've already been doing it,
00:56:10.720 | but if we can dive into more difficult territory.
00:56:14.040 | So 2020 had the tragic case of George Floyd.
00:56:19.040 | When you reflect on the protests, on the racial tensions
00:56:24.960 | over the death of George Floyd,
00:56:26.720 | how do you make sense of it all?
00:56:29.320 | What do you take away from these events?
00:56:31.400 | - Look, the George Floyd moment occurred
00:56:35.720 | at an historical moment where people were
00:56:40.720 | in quarantine for COVID,
00:56:45.000 | and people have these cell phones
00:56:51.920 | to a degree greater than we've ever had them before.
00:56:56.920 | And this was sort of the straw that broke the camel's back.
00:57:01.920 | After a number of these sorts of cell phone videos
00:57:05.640 | surfaced, people were fed up.
00:57:08.960 | There was unimpeachable evidence of a form of mistreatment,
00:57:15.720 | whether it constitutes murder or manslaughter,
00:57:23.400 | the trial is going on now and jurors will figure that out,
00:57:27.280 | but there was widespread appreciation
00:57:30.880 | that a fellow human being was mistreated,
00:57:35.520 | that we were just talking about humanity,
00:57:38.080 | that there was not a sufficient recognition
00:57:42.880 | of this person's humanity.
00:57:45.160 | - The common humanity of this person.
00:57:46.680 | - The common humanity of this person, well said,
00:57:49.880 | and people were fed up.
00:57:51.120 | So we were already in this COVID space
00:57:53.080 | where we were exercising care for one another,
00:57:58.080 | and there was just an explosion,
00:58:01.040 | the likes of which this country hasn't seen
00:58:03.640 | since the civil rights protests of the 1950s and 1960s.
00:58:08.640 | And people simply said, "Enough, enough, enough, enough.
00:58:15.400 | This has to stop.
00:58:16.960 | We cannot treat fellow citizens in this way,
00:58:21.200 | and we can't do it with impunity."
00:58:23.080 | And the young people said, "We're just, we're not gonna
00:58:27.040 | stand for it anymore."
00:58:28.040 | And they took to the streets.
00:58:29.680 | - But with the millions of people protesting,
00:58:34.680 | there is nevertheless, taking us back
00:58:38.040 | to the most difficult of trials,
00:58:40.560 | the trial, like you mentioned,
00:58:42.240 | that's going on now of Derek Chauvin,
00:58:44.840 | of one of the police officers involved.
00:58:47.280 | What are your thoughts?
00:58:51.080 | What are your predictions on this trial
00:58:54.240 | where the process of the law is trying to proceed
00:58:58.480 | in the face of so much racial tension?
00:59:02.000 | - Yeah, it's gonna be an interesting trial.
00:59:04.160 | I've been keeping an eye on it. They're
00:59:06.520 | in jury selection now, today, as we're talking.
00:59:10.720 | So a lot's gonna depend on what sort of jury gets selected.
00:59:13.880 | - Yeah, sorry to interrupt,
00:59:17.000 | but one of the interesting qualities of this trial,
00:59:20.880 | maybe you can correct me if I'm wrong,
00:59:22.840 | but the cameras are allowed in the courtroom,
00:59:25.400 | at least during the jury selection.
00:59:27.640 | So you get to watch some of this stuff.
00:59:31.360 | And the other part is the jury selection.
00:59:34.200 | Again, I'm very inexperienced,
00:59:35.680 | but it seems like selecting an, what is it, unbiased jury,
00:59:40.000 | is really difficult for this trial.
00:59:43.520 | It's almost like, I don't know, me as a listener,
00:59:48.520 | like listening to people that are trying
00:59:53.840 | to talk their way into the jury kind of thing,
00:59:57.160 | trying to decide, is this person really unbiased,
01:00:00.280 | or are they just trying to hold on
01:00:02.880 | to their deeply held emotions
01:00:05.560 | and trying to get onto the jury?
01:00:07.440 | I mean, it's an incredibly difficult process.
01:00:09.160 | I don't know if you can comment on a case so difficult,
01:00:12.520 | like the ones you've mentioned before.
01:00:15.000 | How do you select a jury that represents the people
01:00:18.320 | and carries the sort of the ideal of the law?
01:00:22.600 | - Yeah, so a couple of things.
01:00:23.680 | So first, yes, it is televised,
01:00:25.400 | and it will be televised, as they say, gavel to gavel.
01:00:28.800 | So the entire trial,
01:00:30.600 | the whole thing is gonna be televised.
01:00:32.560 | So people are getting a view
01:00:35.320 | of how laborious jury selection can be.
01:00:39.600 | I think as of yesterday, they had picked six jurors,
01:00:42.720 | and it's taken a week, and they have to get to 14.
01:00:46.320 | So they've got probably another week or more to do.
01:00:51.320 | I've been in jury trials where it took a month
01:00:55.000 | to choose a jury.
01:00:56.240 | So that's the most important part.
01:00:57.680 | You have to choose the right sort of jury.
01:01:01.560 | So unbiased in the criminal justice system
01:01:03.840 | has a particular meaning.
01:01:06.240 | And it means that, let me tell you what it doesn't mean.
01:01:11.240 | It doesn't mean that a person is not aware of the case.
01:01:17.040 | It also does not mean that a person
01:01:20.440 | has not formed an opinion about the case.
01:01:22.640 | Those are two popular misconceptions.
01:01:26.160 | What it does mean is that notwithstanding
01:01:28.840 | whether an individual has formed an opinion,
01:01:31.440 | notwithstanding whether an individual knows about the case,
01:01:34.920 | that individual can set aside any prior opinions,
01:01:38.920 | can set aside any notions
01:01:41.320 | that they've developed about the case,
01:01:42.960 | and listen to the evidence presented at trial
01:01:46.240 | in conjunction with the judge's instructions
01:01:48.960 | on how to understand and view that evidence.
01:01:53.200 | So if a person can do that,
01:01:55.400 | then they're considered unbiased.
01:01:58.600 | So, as a long time defense attorney,
01:02:02.960 | I would be hesitant in a big case like this
01:02:07.160 | to pick a juror who's never heard of the case
01:02:09.480 | or anything going around,
01:02:10.400 | 'cause I'm thinking, well, who is this person?
01:02:12.720 | And what in the world do they do?
01:02:17.600 | Or are they lying to me?
01:02:19.040 | I mean, how can you not have heard about this case?
01:02:22.960 | So they may bring other problems.
01:02:25.840 | So I don't mind so much people who've heard about the case
01:02:28.800 | or folks who've formed initial opinions,
01:02:32.480 | but what you don't want is people
01:02:35.360 | who have tethered themselves to that opinion
01:02:40.120 | in a way that they can't be convinced otherwise.
01:02:45.240 | So, but you also have people who, as you suggested,
01:02:49.120 | who just lie because they wanna get on the jury
01:02:52.400 | or lie because they wanna get off the jury.
01:02:54.360 | So sometimes people come and say,
01:02:57.000 | the most ridiculous, outrageous, offensive things
01:03:00.560 | because they know that they'll get excused for cause,
01:03:04.760 | and others who, you can tell,
01:03:08.800 | really badly wanna get on the jury.
01:03:11.080 | So they're just,
01:03:13.040 | they pretend to be the most neutral,
01:03:16.560 | unbiased person in the world,
01:03:19.040 | what the law calls the reasonable person.
01:03:21.120 | We have in law, the reasonable person standard.
01:03:24.080 | And I would tell my class,
01:03:26.120 | the reasonable person in real life
01:03:29.400 | is the person that you would be least likely
01:03:33.040 | to wanna have a drink with.
01:03:34.080 | They're the most boring, neutral,
01:03:37.280 | not interesting sort of person in the world.
01:03:40.200 | And so a lot of jurors engage in the performative act
01:03:45.200 | of presenting themselves as the most sort of even-keeled,
01:03:49.160 | rational, reasonable person
01:03:51.400 | 'cause they really wanna get on the jury.
01:03:52.960 | - Yeah, there's an interesting question.
01:03:55.040 | I apologize, I haven't watched a lot
01:03:56.960 | 'cause it is very long.
01:03:58.480 | I've watched it.
01:03:59.320 | You know, there's certain questions you've asked
01:04:02.880 | in the jury, you ask in the jury selection.
01:04:05.380 | I remember, I think one jumped out at me,
01:04:09.800 | which is something like,
01:04:13.960 | does the fact that this person's a police officer
01:04:17.800 | make you feel any kind of way about them?
01:04:21.480 | So trying to get at that, you know,
01:04:23.600 | I don't know what that is, I guess that's bias.
01:04:25.960 | And that's such a difficult question to ask.
01:04:29.840 | Like, I asked myself that question.
01:04:32.280 | Like, how much, you know,
01:04:34.360 | we all kinda wanna pretend that we're not racist,
01:04:37.760 | we don't judge, we don't have,
01:04:40.600 | we're like these, we're the reasonable human.
01:04:43.120 | But, you know, legitimately asking yourself,
01:04:45.560 | like, what are the prejudgments you have in your mind?
01:04:50.160 | Is that even possible for a human being?
01:04:55.680 | Like, when you look at yourself in the mirror
01:04:57.720 | and think about it, is it possible to actually answer that?
01:05:00.860 | - Yeah, look, I do not believe
01:05:04.240 | that people can be completely unbiased.
01:05:08.040 | We all have baggage and bias
01:05:11.120 | and bring it wherever we go, including to court.
01:05:14.720 | What you want is to try to find a person
01:05:18.920 | who can at least recognize when a bias is working
01:05:23.920 | and actively try to do the right thing.
01:05:29.000 | That's the best we can ask.
01:05:31.600 | So if a juror says, yeah, you know,
01:05:33.520 | I grew up in a place where I tend to believe
01:05:36.760 | what police officers say, that's just how I grew up.
01:05:39.520 | But if the judge is telling me
01:05:41.840 | that I have to listen to every witness equally,
01:05:45.240 | then I'll do my best and I won't weigh that testimony
01:05:50.040 | any higher than I would any other testimony.
01:05:52.280 | If you have someone answer a question like that,
01:05:54.400 | that sounds more sincere to me,
01:05:56.960 | it sounds more honest.
01:05:58.120 | And you want a person to try to do that.
01:06:01.880 | And then in closing arguments, right, as the lawyer,
01:06:05.360 | right, I'd say something like, ladies and gentlemen,
01:06:07.240 | you know, we chose you to be on this jury
01:06:09.720 | because you swore that you would do your level best
01:06:14.720 | to be fair.
01:06:16.200 | That's why we chose you.
01:06:19.080 | And I'm confident that you're gonna do that here.
01:06:22.840 | So when you heard that police officer's testimony,
01:06:25.600 | the judge told you, you can't give more credit
01:06:30.000 | to that testimony just because it's a police officer.
01:06:33.480 | And I trust that you're gonna do that.
01:06:35.640 | And that you're gonna look at witness number three,
01:06:38.440 | you know, John Smith, you're gonna look at John Smith.
01:06:40.840 | John Smith has a different recollection
01:06:43.680 | and you're duty bound, duty bound to look at that testimony
01:06:48.520 | and this person's credibility, you know,
01:06:50.840 | the same degree as that other witness, right?
01:06:53.360 | And now what you have is just a he said, she said matter,
01:06:56.760 | and this is a criminal case.
01:06:58.480 | There has to be reasonable doubt, right?
01:07:01.040 | So, you know, so you, and really someone who's trying
01:07:04.320 | to do the right thing, it's helpful, but no,
01:07:07.120 | you're not gonna just find 14 people with no biases.
01:07:10.800 | That's absurd.
01:07:12.240 | - Well, that's fascinating that,
01:07:14.760 | especially the way you're inspiring,
01:07:16.120 | the way you're speaking now is,
01:07:18.400 | I mean, I guess you're calling on the jury.
01:07:20.040 | That's kind of the whole system
01:07:21.280 | is you're calling on the jury,
01:07:23.000 | each individual on the jury to step up and really think,
01:07:28.000 | you know, to step up and be their most thoughtful selves,
01:07:31.560 | actually, most introspective.
01:07:33.960 | Like you're trying to basically ask people
01:07:37.240 | to be their best selves.
01:07:40.240 | And that's, and they, I guess a lot of people
01:07:43.960 | step up to that.
01:07:45.000 | That's why the system works.
01:07:47.000 | - I'm very, I'm very pro jury.
01:07:50.000 | Juries, they get it right a lot of the time,
01:07:53.640 | most of the time, and they really work hard to do it.
01:07:57.920 | So what do you think happens?
01:08:02.680 | I mean, maybe,
01:08:03.520 | I'm not so much on the legal side of things,
01:08:09.080 | but on the social side, it's like with the OJ Simpson trial.
01:08:13.520 | Do you think it's possible that Derek Chauvin
01:08:17.440 | does not get convicted of the,
01:08:19.120 | what is it, second degree murder?
01:08:20.800 | How do you think about that?
01:08:24.680 | How do you think about the potential social impact of that?
01:08:27.640 | The riots, the protests, either direction.
01:08:32.640 | Any words that are said,
01:08:34.280 | the tension here could be explosive,
01:08:37.160 | especially with the cameras.
01:08:38.520 | - Yeah, so yes, there's certainly a possibility
01:08:41.600 | that he'll be acquitted for homicide charges,
01:08:46.600 | for the jury to convict.
01:08:52.400 | They have to make a determination as to Officer Chauvin's,
01:08:56.520 | former Officer Chauvin's state of mind,
01:09:00.600 | whether he intended to cause some harm,
01:09:05.320 | whether he was grossly reckless in causing harm,
01:09:10.320 | so much so that he disregarded a known risk of death
01:09:14.360 | or serious bodily injury.
01:09:16.080 | And as you may have read in the papers yesterday,
01:09:19.360 | the judge allowed the third degree murder charge back in,
01:09:24.880 | which is, it's the mindset,
01:09:29.120 | the state of mind there is not an intention,
01:09:32.760 | but it's a depraved indifference.
01:09:37.080 | And what that means is that the jury doesn't have to find
01:09:39.680 | that he intended to do anything.
01:09:42.520 | Rather, they could find that he was just indifferent
01:09:47.520 | to a risk.
01:09:49.960 | - That's dark.
01:09:50.800 | - Yeah, yeah.
01:09:52.840 | - I'm not sure what's worse.
01:09:54.840 | - Yeah, well, that's a good point,
01:09:56.520 | but it's another basis for the jury to convict.
01:10:01.000 | But look, you never know what happens
01:10:03.480 | when you go to a jury trial.
01:10:04.640 | So there could be an acquittal.
01:10:09.080 | And if there is, I imagine there would be massive protests.
01:10:14.080 | If he's convicted, I don't think that would happen
01:10:21.440 | 'cause I just don't see,
01:10:23.480 | at least nothing I've seen or read suggests
01:10:25.560 | that there's a big pro-Chauvin camp out there
01:10:30.400 | ready to protest.
01:10:31.520 | - Well, there could be,
01:10:32.520 | is there also potential tensions
01:10:35.320 | that could arise from the sentencing?
01:10:37.600 | I don't know how that exactly works.
01:10:39.480 | Sort of not enough years kind of thing.
01:10:41.720 | - Yeah, it could be.
01:10:42.560 | - All that kind of stuff.
01:10:43.400 | - I mean, a lot could happen.
01:10:44.920 | So it depends on what he's convicted of.
01:10:47.580 | One count, I think, is like up to 10 years.
01:10:50.400 | Another count's up to 40 years.
01:10:52.720 | So it depends what he's convicted of.
01:10:54.440 | And yes, it depends on how much time the judge gives him
01:10:59.440 | if he is convicted.
01:11:01.760 | There's a lot of space for people to be very angry in.
01:11:04.780 | So we will see what happens.
01:11:07.560 | - I just feel like with the judge and the lawyers,
01:11:11.240 | there's an opportunity to have
01:11:13.800 | really important, long-lasting speeches.
01:11:18.040 | I don't know if they think of it that way,
01:11:20.360 | especially with the cameras.
01:11:22.800 | It feels like they have the capacity to heal or to divide.
01:11:27.800 | Do you ever think about that as a lawyer, as a legal mind,
01:11:34.680 | that your words aren't just about the case,
01:11:36.940 | but about they'll reverberate through history, potentially?
01:11:41.940 | - That is certainly a possible consequence
01:11:47.440 | of things you say.
01:11:49.340 | I don't think that most lawyers think about that
01:11:52.960 | in the context of the case.
01:11:55.120 | Your role is much more narrow.
01:11:58.040 | You're the partisan advocate, as a defense lawyer,
01:12:01.240 | partisan advocate for that client.
01:12:04.620 | As a prosecutor, you're a minister of justice
01:12:07.200 | attempting to prosecute that particular case.
01:12:11.640 | But the reality is you are absolutely correct
01:12:14.680 | that sometimes the things you say
01:12:18.280 | will have a shelf life.
01:12:20.540 | And you mentioned OJ Simpson before.
01:12:23.020 | If the glove doesn't fit, you must acquit.
01:12:25.780 | It's gonna be just in our lexicon
01:12:28.780 | for probably a long time now.
01:12:30.580 | So it happens, but that's not,
01:12:33.740 | and it shouldn't be foremost on your mind.
01:12:36.620 | - Right.
01:12:37.460 | What do you make of the OJ Simpson trial?
01:12:41.380 | Do you have thoughts about it?
01:12:43.140 | He's out and about on social media now.
01:12:46.780 | He's a public figure.
01:12:48.400 | Is there lessons to be drawn from that whole saga?
01:12:52.280 | - Well, that was an interesting case.
01:12:53.800 | I was a young public defender,
01:12:55.440 | I wanna say in my first year as a public defender
01:12:58.840 | when that verdict came out.
01:13:00.320 | So that case was important in so many ways.
01:13:02.360 | One, it was the first DNA case,
01:13:06.520 | major DNA case.
01:13:08.960 | And there were significant lessons learned from that.
01:13:11.440 | One mistake that the prosecution made
01:13:16.000 | was that they didn't present the science
01:13:19.880 | in a way that a lay jury could understand it.
01:13:23.420 | And what Johnnie Cochran did was he understood the science
01:13:29.640 | and was able to translate that
01:13:34.800 | into a vocabulary that he bet that that jury understood.
01:13:42.120 | So Cochran was dismissive of a lot of DNA.
01:13:46.640 | He said something like,
01:13:48.880 | they say they found such and such amount of DNA.
01:13:53.280 | That's just like me wiping my finger against my nose
01:13:56.560 | and just that little bit of DNA.
01:13:59.880 | And that was effective
01:14:02.400 | because the prosecution hadn't done a good job
01:14:05.240 | of establishing that, yes, it's microscopic.
01:14:08.960 | You don't need that much.
01:14:10.400 | Yes, wiping your hand on your nose and touching something,
01:14:13.080 | you can transfer a lot of DNA
01:14:14.800 | and that gives you good information.
01:14:16.920 | But it was the first time that the public generally
01:14:20.720 | and that jury maybe since high school science
01:14:23.280 | had heard nucleotide.
01:14:25.920 | I mean, it was just all these terms getting thrown at them.
01:14:29.020 | But it was not weaved into a narrative.
01:14:33.160 | So Cochran taught us that no matter what type of case it is,
01:14:39.000 | no matter what science is involved,
01:14:40.560 | it's still about storytelling.
01:14:42.520 | It's still about a narrative.
01:14:44.040 | And he was great at that narrative
01:14:49.040 | and was consistent with his narrative all the way out.
01:14:56.140 | Another lesson that was relearned
01:15:00.920 | is that you never ask a question
01:15:02.960 | to which you don't know the answer.
01:15:04.380 | That's like trial advocacy 101.
01:15:09.380 | So when they gave OJ Simpson the glove and it wouldn't fit,
01:15:14.020 | you don't do things
01:15:16.340 | where you just don't know how it's gonna turn out.
01:15:18.180 | It was way, way too risky.
01:15:20.980 | And I think that's what acquitted him
01:15:24.220 | 'cause the glove just wouldn't fit.
01:15:26.420 | And he got to do this and ham in front of the camera
01:15:30.020 | and all of that and it was big.
01:15:33.180 | - Do you think about representation as a storytelling,
01:15:36.380 | like you, yourself and your role?
01:15:38.700 | - Absolutely, we tell stories.
01:15:42.060 | It is fundamental.
01:15:43.900 | Since time immemorial, we have told stories
01:15:47.740 | to help us make sense of the world around us.
01:15:51.460 | So as a scientist, you tell a different type of story,
01:15:56.460 | but we as a public have told stories
01:16:03.080 | from time immemorial to help us make sense
01:16:05.980 | of the physical and the natural world.
01:16:08.700 | And we are still a species that is moved by storytelling.
01:16:13.700 | So that's first and last in trial work.
01:16:18.980 | You have to tell a good story.
01:16:21.300 | And the basic introductory books about trial work
01:16:26.900 | teach young students and young lawyers
01:16:30.260 | to start in openings with this case is about,
01:16:34.820 | this case is about, and then you fill in the blank.
01:16:37.180 | And that's your narrative.
01:16:38.800 | That's the narrative you're gonna tell.
01:16:41.460 | - And of course you can do the ultra dramatic,
01:16:44.980 | the glove doesn't fit kind of the climax
01:16:47.500 | and all those kinds of things.
01:16:49.380 | - Yes, yes.
01:16:50.220 | - But that's the best of narratives, the best of stories.
01:16:52.660 | - Yes.
01:16:53.500 | - Speaking of other really powerful stories
01:16:57.180 | that you were involved with is the Aaron Hernandez trial
01:17:01.060 | and the whole story, the whole legal case.
01:17:03.420 | Can you maybe overview the big picture story
01:17:07.540 | and legal case of Aaron Hernandez?
01:17:09.500 | - Yeah, so Aaron, whom I miss a lot.
01:17:12.820 | So he was charged with a double murder
01:17:17.180 | in the case that I tried.
01:17:20.160 | And this was a unique case
01:17:21.860 | and one of those impossible cases,
01:17:24.420 | in part because Aaron had already been convicted of a murder.
01:17:29.420 | And so we had a client who was on trial for a double murder
01:17:35.460 | after having already been convicted of a separate murder.
01:17:39.820 | And we had a jury pool,
01:17:42.140 | just about all of whom knew
01:17:45.260 | that he had been convicted of a murder
01:17:48.480 | because he was a very popular football player in Boston,
01:17:51.860 | which is a big football town
01:17:54.020 | with the Patriots.
01:17:55.820 | So everyone knew that he was a convicted murderer
01:17:58.780 | and here we are defending him in a double murder case.
01:18:03.780 | So that was the context.
01:18:07.760 | It was a cold case in the sense
01:18:09.260 | that this murder had gone unsolved for a couple of years.
01:18:14.260 | And then a nightclub bouncer said something to a cop
01:18:21.540 | who was working at a club
01:18:23.140 | that Aaron Hernandez was somehow involved
01:18:28.780 | in that murder that happened in the theater district.
01:18:32.220 | That's the district where all the clubs are in Boston
01:18:34.460 | and where the homicide occurred.
01:18:37.020 | And once the police heard Aaron Hernandez's name,
01:18:41.140 | then they went all out in order to do this.
01:18:46.140 | They found a guy named Alexander Bradley
01:18:51.220 | who was a very significant drug dealer
01:18:56.220 | in the Connecticut area.
01:19:04.780 | Very significant, very powerful.
01:19:08.940 | And he essentially, in exchange for a deal,
01:19:13.940 | pointed to Aaron, said, "Yeah, I was with Aaron."
01:19:18.300 | And Aaron was the murderer.
01:19:23.300 | So that's how the case came to court.
01:19:26.420 | - Okay, so that sets the context.
01:19:28.180 | What was your involvement in this case,
01:19:31.180 | like legally, intellectually, psychologically,
01:19:34.900 | when this particular second charge of murder?
01:19:40.460 | - So a friend called me, Jose Baez,
01:19:44.700 | who is the defense attorney,
01:19:46.400 | and he comes to a class that I teach every year at Harvard,
01:19:51.400 | the Trial Advocacy Workshop,
01:19:53.780 | as one of my teaching faculty members.
01:19:58.620 | It's a class where we teach students how to try cases.
01:20:01.500 | So Jose called me and said,
01:20:05.660 | "Hey, I got a call from Massachusetts, Aaron Hernandez.
01:20:10.660 | "You wanna go and talk to him with me?"
01:20:15.660 | So I said, "Sure."
01:20:16.500 | So we went up to the prison and met Aaron
01:20:19.500 | and spoke with him for two or three hours that first time.
01:20:27.680 | And before we left, he said he wanted to retain us.
01:20:32.320 | He wanted to work with us.
01:20:33.440 | And that started the representation.
01:20:35.400 | - What was he like in that time?
01:20:39.040 | Was he worn down by the whole process?
01:20:41.960 | Was there still a light in that?
01:20:44.320 | - He was not.
01:20:45.160 | He had, I mean, more than just a light,
01:20:47.440 | he was luminous almost.
01:20:49.640 | He had a radiant million-dollar smile
01:20:53.200 | whenever he walked in.
01:20:55.480 | My first impression, I distinctly remember,
01:20:58.280 | was, "Wow, this is what a professional athlete looks like."
01:21:01.520 | I mean, he walked in and he's just bigger
01:21:04.480 | and more fit than anyone anywhere.
01:21:09.060 | And it was like, "Wow."
01:21:10.540 | And when you saw him on television, he looked kinda little.
01:21:13.800 | And I was like, so I remember thinking,
01:21:15.840 | "Well, what do those other guys look like in person?"
01:21:19.100 | And he's extraordinarily polite, young.
01:21:26.060 | I was surprised by how young he was.
01:21:32.680 | - Both in mind and body?
01:21:36.280 | - Chronologically, I was thinking.
01:21:38.240 | He was in his early 20s, I believe.
01:21:41.960 | - But there seemed to be an innocence to him
01:21:44.000 | in terms of just the way he saw the world.
01:21:46.240 | - I think that's right.
01:21:47.960 | - They picked that up from the documentary,
01:21:49.800 | just taking that in.
01:21:50.640 | - I think that's right, yeah, yeah.
01:21:53.480 | - So there is a Netflix documentary titled
01:21:57.520 | "Killer Inside: The Mind of Aaron Hernandez."
01:22:01.060 | What are your thoughts on this documentary?
01:22:02.800 | I don't know if you've gotten a chance to see it.
01:22:05.040 | - I have not seen it.
01:22:05.880 | I did not participate in it.
01:22:07.220 | I know I was in it because there was news footage.
01:22:11.840 | But I did not participate in it.
01:22:13.520 | I had not talked to Aaron about press or anything
01:22:18.520 | before he died.
01:22:22.320 | My strong view is that the attorney-client privilege
01:22:24.360 | survives death, and so I was not inclined
01:22:27.120 | to talk about anything that Aaron and I talked about.
01:22:29.880 | So I just didn't participate and have never watched it.
01:22:34.160 | - Not even watched, huh?
01:22:35.360 | So does that apply to most of your work?
01:22:39.560 | Do you try to stay away from the way
01:22:41.280 | the press perceives stuff?
01:22:42.740 | - During, yes, I try to stay away from it.
01:22:46.600 | I will view it afterwards.
01:22:48.760 | I just hadn't gotten around to watching Aaron,
01:22:51.520 | 'cause it's kind of sad.
01:22:53.640 | So I just haven't watched it.
01:22:55.080 | But I definitely stay away from the press during trial.
01:22:59.160 | And there are some lawyers who watch it religiously
01:23:03.420 | to see what's going on, but I'm confident
01:23:06.800 | in my years of training and so forth
01:23:09.720 | that I can actively sense what's going on
01:23:14.720 | in the courtroom and that I really don't need advice
01:23:20.200 | from Joe476@gmail, some random guy on the internet
01:23:25.200 | telling me how to try cases.
01:23:29.080 | So to me, it's just confusing,
01:23:31.660 | and I keep it out of my mind.
01:23:33.200 | - And even if you think you can ignore it,
01:23:35.360 | just reading it will have a little bit
01:23:37.760 | of an effect on your mind.
01:23:39.040 | - I think that's right.
01:23:40.520 | - Over time, it might accumulate.
01:23:43.880 | So the documentary, but in general,
01:23:47.120 | it mentioned or kind of emphasized
01:23:52.680 | and talked about Aaron's sexuality
01:23:55.880 | or sort of they were discussing basically
01:23:59.680 | the idea that he was a homosexual.
01:24:02.880 | And some of the trauma, some of the suffering
01:24:07.880 | that he endured in his life had to do
01:24:10.240 | with sort of fear given the society
01:24:12.760 | of what his father would think,
01:24:16.760 | of what others around him,
01:24:19.040 | sort of especially in sport culture and football and so on.
01:24:22.900 | So I don't know in your interaction with him,
01:24:27.260 | do you think that maybe even leaning up to a suicide,
01:24:32.200 | do you think his struggle with coming to terms
01:24:37.200 | with his sexuality had a role to play
01:24:40.800 | in much of his difficulties?
01:24:43.480 | - Well, I'm not gonna talk about my interactions with him
01:24:47.520 | and anything I derived from that.
01:24:50.720 | But what I will say is that a story broke on the radio
01:25:00.680 | at some point during the trial
01:25:04.640 | that Aaron had been in the same sex relationship
01:25:08.220 | with someone and some local sportscasters,
01:25:11.200 | local Boston sportscasters,
01:25:13.600 | who really mushroomed the story.
01:25:18.600 | So he and everyone was aware of it.
01:25:23.800 | You also may know from the court record
01:25:28.240 | that the prosecutors floated a specious theory for a minute,
01:25:33.240 | but then backed off of it that Aaron was,
01:25:40.320 | that there was some sort of, I guess,
01:25:43.440 | gay rage at work with him.
01:25:45.880 | And that might be a cause, a motive for the killing.
01:25:50.880 | And luckily they really backed off of that.
01:25:53.960 | That was quite an offensive claim in theory.
01:25:58.320 | But to answer your question more directly,
01:26:01.760 | I mean, I have no idea why he killed himself.
01:26:04.640 | It was a surprise and a shock.
01:26:07.480 | I was scheduled to go see him
01:26:10.800 | like a couple of days after it happened.
01:26:13.080 | I mean, he was anxious for Jose and I to come in
01:26:18.080 | and do the appeal from the murder,
01:26:22.640 | which he was convicted for.
01:26:23.800 | He wanted us to take over that appeal.
01:26:25.800 | He was talking about going back to football.
01:26:29.520 | I mean, he said, well, you talk about this,
01:26:31.920 | earlier you talked about the sort of
01:26:33.840 | innocent aspect of him.
01:26:35.400 | He said, well, Ron, maybe not the Patriots,
01:26:39.440 | but I want to get back in the league.
01:26:41.280 | And I was like, Aaron, that's going to be tough, man.
01:26:45.720 | But he really believed it.
01:26:49.760 | And then for a few days later that to happen,
01:26:54.760 | it was just, it was a real shock to me.
01:26:57.120 | - Like when you look back at that, at his story,
01:27:00.620 | does it make you sad?
01:27:03.720 | - Very, very.
01:27:05.280 | I thought, so one, I believe he absolutely did not
01:27:11.200 | commit the crimes that we acquitted him on.
01:27:17.920 | I think that was the right answer for that.
01:27:22.240 | I don't know enough about Bradley, the first case,
01:27:26.760 | I'm sorry to make an opinion on,
01:27:29.520 | but in our case, it was just,
01:27:34.520 | he had the misfortune of having a famous name
01:27:37.640 | and the police department just really got on him there.
01:27:44.240 | So yes, I miss him a lot.
01:27:48.600 | It was very, very sad, surprising.
01:27:50.720 | - Yeah, and I mean, just on the human side,
01:27:53.800 | of course we don't know the full story,
01:27:55.400 | but just everything that led up to suicide,
01:27:59.000 | everything led up to an incredible professional
01:28:02.840 | football player, that whole story.
01:28:06.000 | - He was a remarkably talented athlete,
01:28:08.840 | remarkably talented athlete.
01:28:10.520 | And it has to do with all the possible trajectories,
01:28:14.680 | that we can take through life,
01:28:15.960 | as we were talking about before.
01:28:17.520 | And some of them lead to suicide, sadly enough.
01:28:22.520 | And it's always tragic when you have somebody
01:28:28.320 | with great potential result in the things that happen.
01:28:33.320 | People love it, when I ask about books,
01:28:37.720 | I don't know whether technical, like legal,
01:28:42.720 | or fiction, nonfiction books throughout your life,
01:28:45.880 | have had an impact on you,
01:28:47.640 | if there's something you could recommend,
01:28:49.840 | or something you could speak to about,
01:28:52.280 | something that inspired ideas, insights about this world,
01:28:57.120 | complicated world of ours.
01:28:58.880 | - Oh, wow.
01:29:00.400 | Yeah, so I'll give you a couple.
01:29:05.440 | So one is "Contingency, Irony, and Solidarity,"
01:29:08.720 | by Richard Rorty.
01:29:09.720 | He's passed away now, but was a philosopher
01:29:14.440 | at some of our major institutions,
01:29:16.080 | Princeton, Harvard, Stanford.
01:29:20.600 | "Contingency, Irony, and Solidarity,"
01:29:24.800 | at least that's a book that really helped me
01:29:27.080 | work through a series of thoughts.
01:29:31.400 | So it stands for the proposition,
01:29:33.120 | that our most deeply held beliefs are contingent,
01:29:38.120 | that there's nothing beyond history,
01:29:43.000 | or prior to socialization,
01:29:45.320 | that's definatory of the human being, that's Rorty.
01:29:49.440 | And he says that our most deeply held beliefs
01:29:54.560 | are received wisdom and highly contingent
01:29:58.120 | along a number of registers.
01:30:01.760 | And he does that, but then goes on to say
01:30:05.200 | that he nonetheless can hold strongly held beliefs,
01:30:10.200 | recognizing their contingency,
01:30:13.000 | but still believes them to be true and accurate.
01:30:15.040 | And he helps you to work through
01:30:16.680 | what could be an intellectual tension,
01:30:21.120 | other words, so you don't delve into,
01:30:25.120 | one doesn't delve into relativism,
01:30:27.240 | oh, everything is okay,
01:30:29.560 | but he gives you a vocabulary to think about
01:30:32.400 | how to negotiate these realities.
01:30:38.000 | - Do you share this tension?
01:30:40.120 | I mean, there is a real tension,
01:30:41.760 | it seems like even like the law,
01:30:43.680 | the legal system is all just a construct of our human ideas,
01:30:48.480 | and yet it seems to be,
01:30:50.720 | almost feels fundamental to what a just society is.
01:30:57.800 | - Yeah, I definitely share the tension
01:31:00.040 | and love his vocabulary
01:31:04.680 | in the way he's helped me resolve the tension.
01:31:08.920 | So, right, I mean, yeah, so like,
01:31:12.400 | infanticide, for example,
01:31:14.840 | perhaps it's socially contingent,
01:31:18.160 | perhaps it's received wisdom,
01:31:20.440 | perhaps it's anthropological,
01:31:23.440 | we need to propagate the species,
01:31:25.560 | and I still think it's wrong.
01:31:27.680 | And Rorty has helped me develop a category
01:31:32.280 | to say that, no, I can't provide any,
01:31:37.280 | in Rorty's words, non-circular theoretical backup
01:31:41.360 | for this proposition.
01:31:42.840 | At some point, it's gonna run me in a circularity problem,
01:31:46.400 | but that's okay, I hold this nonetheless
01:31:49.480 | in full recognition of its contingency,
01:31:52.040 | but what it does is makes you humble.
01:31:56.840 | And when you're humble, that's good,
01:31:59.880 | because this notion that ideas are always already
01:32:04.440 | in progress, never fully formed,
01:32:07.440 | I think is the sort of intellectual I strive to be.
01:32:12.440 | And if I have a sufficient degree of humility
01:32:17.440 | that I don't have the final answer, capital A,
01:32:21.840 | then that's gonna help me to get to better answers,
01:32:24.560 | lowercase a.
01:32:25.960 | And Rorty does, and he talks about,
01:32:28.880 | in the solidarity part of the book,
01:32:33.160 | he has this concept of imaginative,
01:32:35.880 | the imaginative ability to see other different people
01:32:41.560 | as we instead of they.
01:32:43.720 | And I just think it's a beautiful concept,
01:32:45.600 | but he talks about this imaginative ability,
01:32:47.760 | and it's this active process.
01:32:49.840 | So, I mean, so that's a book that's done a lot of work
01:32:54.040 | for me over the years.
01:32:57.520 | "Souls of Black Folk" by W.E.B. Du Bois
01:33:01.520 | was absolutely pivotal in my intellectual development.
01:33:05.560 | One of the premier set of essays
01:33:11.600 | in the Western literary tradition.
01:33:15.800 | And it's a deep and profound sociological,
01:33:19.880 | philosophical, and historical analysis
01:33:24.560 | of the predicament of blacks in America
01:33:28.560 | from one of our country's greatest polymaths.
01:33:33.560 | It's just a beautiful text, and I go to it yearly.
01:33:38.680 | - So for somebody like me, so growing up in the Soviet Union,
01:33:45.560 | the struggle, the civil rights movement,
01:33:47.680 | the struggle of race and all those kinds of things
01:33:50.880 | that is, you know, it's universal,
01:33:54.120 | but it's also very much a journey of the United States.
01:33:57.520 | It was kind of a foreign thing that I stepped into.
01:34:00.760 | Is that something you would recommend
01:34:02.040 | somebody like me to read?
01:34:03.720 | Or is there other things about race
01:34:08.400 | that are good to connect?
01:34:10.480 | Because my flavor of suffering,
01:34:13.400 | and just I'm a Jew as well,
01:34:15.360 | my flavor has to do with World War II,
01:34:17.520 | and the studies of that, you know,
01:34:19.120 | all the injustices there.
01:34:20.680 | So I'm now stepping into a new set of injustices
01:34:23.640 | and trying to learn the landscape.
01:34:26.840 | - I would say anyone is a better person
01:34:31.840 | for having read Du Bois.
01:34:34.040 | It's just, he's just a remarkable writer and thinker.
01:34:38.800 | And it, I mean, and to the extent you're interested
01:34:42.240 | in learning another history,
01:34:43.760 | he does it in a way that is quite sophisticated.
01:34:47.800 | So it's, so it's interesting.
01:34:52.240 | I was gonna give you three books.
01:34:55.040 | I noted the accent when I met you,
01:34:58.680 | but I didn't know exactly where you're from.
01:35:00.760 | But the other book I was gonna say
01:35:03.000 | is Dostoevsky's "Crime and Punishment."
01:35:05.360 | - Oh, great. (laughs)
01:35:06.240 | - And I mean, I've always wanted to go to St. Petersburg
01:35:09.560 | just to sort of see with my own eyes
01:35:12.960 | the word pictures that Dostoevsky created
01:35:16.960 | in "Crime and Punishment."
01:35:18.600 | And, you know, I love others of his stuff too,
01:35:21.040 | the Brothers Karamazov and so forth.
01:35:23.320 | But "Crime and Punishment," I first read in high school
01:35:26.560 | as a junior or senior.
01:35:28.320 | And it is a deep and profound meditation
01:35:33.320 | on both the meaning and the measure of our lives.
01:35:40.400 | And Dostoevsky, obviously in conversation
01:35:45.400 | with other thinkers, really gets at the crux
01:35:51.080 | of a fundamental philosophical problem.
01:35:57.000 | What does it mean to be a human being?
01:35:59.080 | And for that, "Crime and Punishment"
01:36:04.080 | captured me as a teenager.
01:36:05.960 | And that's another text that I return to often.
01:36:10.600 | - We've talked about young people a little bit
01:36:13.640 | at the beginning of our conversation.
01:36:15.540 | Is there advice that you could give
01:36:20.280 | to a young person today thinking about their career,
01:36:23.160 | thinking about their life,
01:36:25.200 | thinking about making their way in this world?
01:36:28.300 | - Yeah, sure.
01:36:29.140 | I'll share some advice.
01:36:30.400 | It actually picks up on a question we talked about earlier
01:36:33.840 | within the academy and schools.
01:36:36.400 | But it's some advice that a professor gave to me
01:36:39.880 | when I got to Harvard.
01:36:42.680 | And it is this, that you have to be willing
01:36:46.240 | to come face to face with your intellectual limitations
01:36:49.840 | and keep going.
01:36:50.920 | And it's hard for people.
01:36:54.120 | I mean, you mentioned this earlier,
01:36:56.000 | to face really difficult tasks,
01:37:01.000 | and particularly in these sort of elite spaces
01:37:03.720 | where you've excelled all your life and you come to MIT
01:37:07.040 | and you're like, "Wait a minute, I don't understand this.
01:37:09.480 | Wait, this is hard.
01:37:10.560 | I've never had something really hard before."
01:37:14.280 | And there are a couple options.
01:37:15.800 | And a lot of people will pull back
01:37:18.060 | and take the gentleman or gentlewoman's B
01:37:21.120 | and just go on.
01:37:23.360 | Or risk going out there, giving it your all
01:37:27.680 | and still not quite getting it.
01:37:29.820 | And that's a risk, but it's a risk well worth it
01:37:34.500 | because you're just gonna be the better person,
01:37:36.580 | the better student for it.
01:37:37.820 | And even outside of the academy,
01:37:40.180 | I mean, come face to face with your fears
01:37:44.820 | and keep going and keep going in life.
01:37:47.980 | And you're gonna be the better person,
01:37:49.820 | the better human being.
01:37:52.180 | - Yeah, it does seem to be, I don't know what it is,
01:37:54.460 | but it does seem to be that fear
01:37:57.500 | is a good indicator of something you should probably face.
01:38:02.500 | - Yes.
01:38:05.060 | - Like fear kind of shows the way a little bit.
01:38:07.940 | Not always.
01:38:09.420 | You might not wanna go into the cage with a lion,
01:38:11.620 | but maybe you should.
01:38:16.220 | - Maybe.
01:38:17.060 | - Let me ask sort of a darker question
01:38:20.620 | 'cause we're talking about Dostoevsky.
01:38:22.460 | We might as well.
01:38:25.260 | Do you, and connected to the freeing innocent people,
01:38:30.260 | do you think about mortality?
01:38:36.300 | Do you think about your own death?
01:38:38.260 | Are you afraid of death?
01:38:39.900 | - I'm not afraid of death.
01:38:42.180 | I do think about it more now
01:38:45.180 | because I'm now in my mid-50s.
01:38:47.820 | So I used to not think about it much at all,
01:38:50.420 | but the harsh reality is that I've got more time behind me
01:38:55.420 | now than I do in front of me.
01:38:59.260 | And it kind of happens all of a sudden
01:39:00.980 | to realize, wait a minute,
01:39:02.140 | I'm actually on the back nine now.
01:39:06.540 | So yeah, my mind moves to it from time to time.
01:39:10.180 | I don't dwell on it.
01:39:11.860 | I'm not afraid of it.
01:39:14.020 | My own personal religious commitments,
01:39:17.100 | I'm Christian and my religious commitments buoy me
01:39:22.100 | that death, and I believe this, death is not the end.
01:39:27.860 | So I'm not afraid of it.
01:39:31.100 | Now, this is not to say that I wanna rush to the afterlife.
01:39:35.820 | I'm good right here for a long time.
01:39:38.780 | I hope I've got 30, 35, 40 more years to go.
01:39:45.780 | But no, I don't fear death.
01:39:49.060 | We're finite creatures.
01:39:52.060 | We're all gonna die.
01:39:54.460 | - Well, the mystery of it, for somebody,
01:39:58.900 | at least for me, we human beings
01:40:01.060 | wanna figure everything out.
01:40:02.460 | Whatever the afterlife is, there's still a mystery to it.
01:40:07.780 | That uncertainty can be terrifying if you ponder it.
01:40:12.860 | But maybe what you're saying is (laughs)
01:40:17.300 | you haven't pondered it too deeply so far,
01:40:19.780 | and it's worked out pretty good.
01:40:21.260 | - It's worked out, yeah, no complaints.
01:40:24.380 | - So you said, again, that Dostoevsky kind of
01:40:27.180 | was exceptionally good at getting to the core
01:40:32.420 | of what it means to be human.
01:40:34.120 | Do you think about the why of why we're here,
01:40:38.740 | the meaning of this whole existence?
01:40:42.860 | - Yeah, no, I do.
01:40:44.780 | I think, and I actually think that's
01:40:47.340 | the purpose of an education.
01:40:49.940 | What does it mean to be a human being?
01:40:51.780 | And in one way or another,
01:40:54.300 | we set out to answer those questions,
01:40:56.780 | and we do it in a different way.
01:40:59.540 | I mean, some may look to philosophy
01:41:04.540 | to answer these questions.
01:41:07.260 | Why is it in one's personal interest to do good,
01:41:12.260 | to do justice?
01:41:16.740 | Some may look at it through the economist's lens.
01:41:21.740 | Some may look at it through the microscope
01:41:27.180 | in the laboratory that the phenomenal world
01:41:30.860 | is the meaning of life.
01:41:35.940 | Others may say that that's one vocabulary,
01:41:40.620 | that's one description,
01:41:42.660 | but the poet describes a reality
01:41:45.620 | to the same degree as a physicist.
01:41:48.340 | But that's the purpose of an education.
01:41:51.240 | It's to sort of work through these issues.
01:41:53.700 | What does it mean to be a human being?
01:41:58.700 | And I think it's a fascinating journey,
01:42:01.060 | and I think it's a lifelong endeavor
01:42:03.860 | to figure out what is the thing
01:42:05.660 | that nugget that makes us human.
01:42:09.140 | - Do you still see yourself as a student?
01:42:11.740 | - Of course.
01:42:13.100 | Yes, I mean, that's the best part
01:42:15.940 | about going into university teaching.
01:42:18.900 | You're a lifelong student.
01:42:20.940 | I'm always learning.
01:42:21.900 | I learn from my students and with my students
01:42:24.500 | and my colleagues,
01:42:26.620 | and you continue to read and learn and modify opinions,
01:42:31.620 | and I think it's just a wonderful thing.
01:42:34.660 | - Well, Ron, I'm so glad that somebody like you
01:42:39.660 | is carrying the fire of what is the best of Harvard.
01:42:45.900 | So it's a huge honor that you spent so much time,
01:42:49.460 | waste so much of your valuable time with me.
01:42:51.780 | I really appreciate the conversation.
01:42:52.620 | - Not a waste at all.
01:42:53.860 | - I think a lot of people love it.
01:42:55.140 | Thank you so much for talking today.
01:42:56.460 | - Thank you.
01:42:57.300 | - Thanks for listening to this conversation
01:42:59.780 | with Ronald Sullivan,
01:43:01.020 | and thank you to Brooklinen sheets,
01:43:03.900 | Wine Access online wine store,
01:43:05.980 | Munk Pack low-carb snacks,
01:43:07.900 | and Blinkist app that summarizes books.
01:43:10.740 | Click their links to support this podcast.
01:43:13.460 | And now let me leave you with some words
01:43:15.220 | from Nelson Mandela.
01:43:17.100 | When a man is denied the right to live the life
01:43:19.500 | he believes in,
01:43:20.780 | he has no choice but to become an outlaw.
01:43:24.440 | Thank you for listening and hope to see you next time.
01:43:28.180 | (upbeat music)