Robert Proctor: Nazi Science and Ideology | Lex Fridman Podcast #268


Chapters

0:00 Introduction
4:08 Ideology and science
15:43 Wernher von Braun
22:49 The scientific process
32:52 Censorship
42:05 Anthony Fauci
47:16 Courage in science
54:35 Tobacco industry
78:21 Nazi medicine
88:30 The Nazi War on Cancer
93:43 Science funding
104:37 Ignorance
112:17 Ideology in academia
118:37 Human origins
128:30 Hobbies
135:48 Diversity in the universe
139:23 Stones
148:18 Conspiracy theories
152:28 Nazi impact on Soviet science
157:43 Nazi tobacco industry's denial campaign
161:58 Hope for the future
164:32 Meaning of life

Whisper Transcript

00:00:00.000 | "What is the heroic action for a scientist in Nazi Germany?"
00:00:04.560 | - Science in many respects actually is the full collaborator
00:00:08.840 | in the most horrific forms of Nazi genocide,
00:00:13.200 | Nazi exclusion.
00:00:14.360 | What goes through the mind of a big tobacco executive?
00:00:17.980 | Cigarettes have killed more than any other object,
00:00:21.640 | than all the world of iron, all the world of gunpowder.
00:00:25.400 | Nuclear bombs have only killed a few hundred thousand people.
00:00:30.400 | Cigarettes have killed hundreds of millions.
00:00:33.120 | There's no contest.
00:00:35.240 | Cigarettes have killed far more
00:00:37.480 | and are far more preventable.
00:00:38.960 | What is the nature of human ignorance?
00:00:41.460 | The following is a conversation with Robert Proctor,
00:00:47.080 | historian at Stanford University,
00:00:49.160 | specializing in 20th century science,
00:00:51.840 | technology and medicine,
00:00:53.480 | especially the history of the most controversial aspects
00:00:57.440 | of those fields.
00:00:58.480 | Please allow me to say a few words about science
00:01:02.400 | and the nature of truth.
00:01:04.500 | The word science is often used as an ideal
00:01:07.960 | for a methodology that can help us escape the limitation
00:01:11.580 | of any one human mind in the pursuit of truth.
00:01:15.140 | The underlying idea here is that individual humans
00:01:18.080 | are too easily corrupted by bias, emotion,
00:01:20.800 | personal experience and the usual human craving
00:01:24.000 | for meaning, money, power and fame.
00:01:27.600 | And the hope is that the tools of science
00:01:30.440 | can help us overcome these limitations
00:01:32.580 | in striving for deeper and deeper understanding
00:01:35.440 | of objective reality, from physics to chemistry,
00:01:38.680 | biology, genetics and even psychology,
00:01:41.080 | cognitive science and neuroscience.
00:01:42.880 | But history shows that these tools of science
00:01:46.960 | are not devoid of human flaws,
00:01:49.360 | of influence from human institutions,
00:01:52.160 | of manipulation from people in power.
00:01:54.680 | As we talk about in this conversation with Robert Proctor,
00:01:58.960 | in the 1930s and 40s, there was the Nazi science
00:02:03.160 | and there was communist science
00:02:05.000 | and each had fundamentally different ideas about,
00:02:08.440 | for example, genetics and biology of disease.
00:02:11.640 | This history also shows that scientists can be corrupted
00:02:17.400 | slowly or quickly by fear, fame, money
00:02:21.680 | or just the ideological narratives of a charismatic leader
00:02:25.520 | that convinces each scientist and the scientific community
00:02:29.600 | that their work matters for the greater cause of humanity.
00:02:33.240 | Even if that cause involves the genetic purification
00:02:38.040 | of a people, the extermination of a cancer
00:02:41.840 | and the unrestricted experimentation
00:02:44.920 | on the bodies of living beings
00:02:47.080 | who do not have a voice,
00:02:49.000 | whose suffering will never be heard.
00:02:51.840 | All of this for the greater good.
00:02:54.280 | In some periods of human history,
00:02:58.160 | science was deeply influenced by the ideology
00:03:00.960 | of governments and individuals.
00:03:03.480 | In some, less so.
00:03:06.280 | The hard truth is that we can't know for sure
00:03:09.680 | about which of the two periods we're living through today.
00:03:13.300 | So let us not too quickly dismiss the voices
00:03:16.960 | of experts and non-experts alike
00:03:19.280 | that ask the simple question of,
00:03:21.880 | wait, are we doing the right thing here?
00:03:24.460 | Are we helping or hurting?
00:03:27.260 | Are we adding suffering to the world
00:03:30.080 | or are we alleviating it?
00:03:31.740 | Most such voices are nothing more than martyrs
00:03:36.000 | seeking fame, not truth.
00:03:38.280 | And they will be proven wrong.
00:03:40.480 | But some may help prevent future atrocities
00:03:44.920 | and suffering at a global scale.
00:03:47.720 | Let us then move forward with humility
00:03:50.280 | so that history will remember this period
00:03:53.160 | as one of human flourishing
00:03:54.800 | and where science lived up to its highest ideal.
00:03:58.360 | This is the Lex Fridman Podcast.
00:04:01.480 | To support it, please check out our sponsors
00:04:03.560 | in the description.
00:04:04.640 | And now, dear friends, here's Robert Proctor.
00:04:08.000 | What is the story of science and scientists
00:04:12.080 | during the rise, rule, and fall of the Third Reich?
00:04:15.640 | - Well, we tend to think of science
00:04:18.640 | as always on the side of liberty,
00:04:21.840 | as always on the side of enlightenment,
00:04:24.680 | as always on the side of enlarging human possibility.
00:04:29.680 | And here we have this phenomenon in the 1930s
00:04:34.680 | of really the world's leading scientific power,
00:04:39.480 | the Third Reich, which collectively had won
00:04:43.800 | a big chunk of all the Nobel Prizes.
00:04:45.800 | Suddenly they go fascist, they go Nazi with Hitler.
00:04:50.480 | And instead of being primarily a source of resistance,
00:04:56.600 | science in many respects actually is a full collaborator
00:05:03.400 | in the most horrific forms of Nazi genocide,
00:05:07.760 | Nazi exclusion, and that's kind of a relatively untold story
00:05:12.760 | in the sense that when we think of science
00:05:18.960 | in the Third Reich, we think of Josef Mengele
00:05:21.920 | injecting dye into the eyes of twins,
00:05:25.240 | or we think of horrific human experiments,
00:05:27.400 | and those are real.
00:05:28.960 | But it's also the story of a huge scientific apparatus,
00:05:33.880 | a bureaucracy, you could almost say,
00:05:36.760 | participating in every phase of the campaigns
00:05:41.520 | of Nazi destruction.
00:05:42.840 | And what I looked at in particular,
00:05:45.800 | and actually in my first book,
00:05:47.720 | was how physicians in particular,
00:05:49.680 | but also biomedical science,
00:05:51.480 | was collaborating with the regime,
00:05:55.800 | and that it's wrong to think of the Nazi regime
00:05:59.760 | as anti-science.
00:06:01.700 | It's anti a particular type of science.
00:06:04.820 | In particular, it was radically against
00:06:07.320 | what they called Jewish science, communist science.
00:06:10.320 | Certain types of science they did not like.
00:06:14.900 | There's a whole nature/nurture dispute in that period,
00:06:18.760 | and they're firmly on the side of nature,
00:06:21.440 | which interestingly gives rise to a very different
00:06:26.820 | type of science in the Soviet Union, by the way.
00:06:29.120 | - The Soviet Union is more on the nurture side?
00:06:34.400 | - The Soviet Union is on the nurture side,
00:06:34.400 | in the dimension of genetics.
00:06:37.120 | And this is sort of an untold story.
00:06:40.540 | I was actually gonna write a book about it
00:06:42.240 | until I was barred from access to the Soviet Union.
00:06:47.040 | There've been different times in my life
00:06:48.480 | where I was a Russianist.
00:06:50.580 | - A Russianist.
00:06:51.580 | Okay, we're gonna have to talk about that.
00:06:53.620 | - I got excluded from fulfilling that dream
00:06:59.060 | but one of the things I was gonna look at
00:07:01.160 | when I got a Fulbright in the 1980s
00:07:05.200 | was to go over and look at the anti-Nazi genetics
00:07:10.200 | and anthropology of the Soviets
00:07:15.720 | and how a lot of their Lysenkoist Lamarckism
00:07:19.720 | was actually anti-Nazi, anti-genetics,
00:07:24.720 | on the nurture side of the nature/nurture divide.
00:07:26.440 | And that's really an untold story.
00:07:27.880 | It's an uncomfortable story
00:07:29.840 | 'cause it sounds like we might wanna make heroes
00:07:35.000 | out of the twisting of science in the Soviet Union.
00:07:39.480 | But nonetheless, there are these interesting complexities
00:07:41.860 | and what's amazing about Nazi science
00:07:44.880 | is how there was this collaboration.
00:07:48.480 | And you're talking about a culture
00:07:51.240 | where they're inventing things like electron microscopy,
00:07:54.360 | they're doing all kinds of studies in anthropology.
00:07:57.660 | So a lot of that's an untold story.
00:07:59.800 | - So what was the connection
00:08:00.920 | between the ideology and the science?
00:08:03.160 | If you can just linger on it longer.
00:08:05.880 | - Well, we tend to think of science and ideology
00:08:10.000 | as completely separate
00:08:11.480 | when I think the reality is they're not.
00:08:15.400 | If you look at why the Mayans in the 7th and 8th century AD
00:08:19.840 | had the world's most accurate calendar,
00:08:23.080 | accurate to within 17 seconds per year,
00:08:26.600 | that was all part of a ritual practice
00:08:28.900 | to celebrate the rise of Kukulkan,
00:08:32.780 | the rise of Venus with what's called the heliacal rising,
00:08:37.780 | namely the rising of Venus before the rising of the sun
00:08:42.900 | at which moment Venus is destroyed
00:08:47.460 | by the light of the sun.
00:08:49.420 | Well, they developed this elaborate calendrical astronomy
00:08:53.700 | which required detailed observation,
00:08:55.660 | detailed chronicling of the movement of the heavens,
00:08:58.160 | in particular the planets,
00:08:59.760 | for the purpose of celebrating this cycle of renewal
00:09:05.040 | that they thought was sacred and holy and magical.
00:09:10.000 | So where's the ideology, where's the science?
00:09:12.060 | There's the sort of instrumentation, the calendrics,
00:09:15.660 | the measurement all in the service of this magical moment.
00:09:20.660 | And I think that's true of a lot of science.
00:09:24.920 | I had a friend years ago who was Mennonite
00:09:27.880 | and wanted to study solar cells
00:09:30.420 | and to improve silicon chips
00:09:32.480 | to make more efficient solar energy.
00:09:34.840 | There was no money for that.
00:09:37.600 | Yet when Ronald Reagan took office,
00:09:40.200 | the budgets for solar and alternative energy
00:09:42.860 | were essentially zeroed out
00:09:44.080 | and Reagan takes off the solar panels
00:09:46.780 | off of the roof of the White House.
00:09:48.980 | So my friend ends up working
00:09:53.220 | on hardening silicon chips against nuclear war.
00:09:57.460 | So he becomes part of the nuclear war protection
00:10:00.500 | defense apparatus,
00:10:02.240 | even though he wanted to work on alternative energy,
00:10:05.740 | doing very similar work with silicon chips,
00:10:08.700 | but in a different framework.
00:10:10.540 | And so the practices of science often get pushed into
00:10:15.540 | and are woven into ideological practices.
00:10:20.300 | Sort of in the same way that you get beautiful
00:10:23.560 | medieval cathedrals built in service of Catholicism.
00:10:28.560 | - Well, what's in the mind of an individual scientist?
00:10:32.280 | So this process of ideology polluting science,
00:10:36.000 | or is it science empowering ideology?
00:10:39.320 | So almost like if you can zoom in and zoom out effortlessly
00:10:44.320 | into the individual mind of a scientist
00:10:47.160 | and then back to the whole scientific community.
00:10:49.680 | Like do scientists think about nuclear war,
00:10:54.160 | about the atrocities committed by the Nazis
00:10:56.840 | as they're helping on the minute details
00:10:59.720 | of the scientific process?
00:11:01.160 | - I think sometimes they do and sometimes they don't, right?
00:11:03.680 | You think of the chemists working to develop the cyanide
00:11:08.680 | that will be used to kill Jews in a concentration camp.
00:11:15.160 | What are they thinking?
00:11:16.800 | You can imagine a whole range of thoughts,
00:11:19.480 | maybe they don't know what they're doing.
00:11:22.100 | Maybe they do, maybe they know a little bit but not a lot.
00:11:25.240 | Maybe they don't wanna know.
00:11:26.640 | Maybe they have ways of lying to themselves.
00:11:31.060 | Maybe they are the one person who agreed to do it
00:11:36.240 | and 99 refused.
00:11:39.660 | So it's hard if not impossible to know
00:11:44.960 | what's in the soul of anyone.
00:11:47.600 | But when you have enormous power directing the motion
00:11:52.600 | and the currents of the ocean,
00:11:56.380 | it's not hard to find people willing to fill that in,
00:12:01.400 | especially if they're narrow technocrats,
00:12:04.400 | if they're just doing their job,
00:12:05.840 | if they're just building the widget.
00:12:08.480 | And I think a lot of scientific training
00:12:11.520 | is in widget building and that leads to the possibility
00:12:16.120 | that they can become easily instrumentalized
00:12:19.360 | in a particular action,
00:12:23.720 | which is maybe horrific or glorious.
00:12:26.660 | The other thing to keep in mind is that science is,
00:12:31.640 | as we say, what scientists do.
00:12:33.320 | And that can include a lot of things,
00:12:37.800 | it can exclude a lot of things.
00:12:39.560 | The word science itself is interesting
00:12:42.320 | because it's cognate, it actually comes originally
00:12:45.200 | from the Proto-Indo-European root
00:12:47.400 | *skei-, meaning to cut or divide.
00:12:53.240 | And so it's cognate with scissors, schism, skin.
00:12:58.240 | Skin is that which divides you from the world.
00:13:01.360 | Shit or scat is that which has been divided from you
00:13:05.540 | into the world.
00:13:06.960 | And so there's this cognate between science and shit
00:13:10.520 | or science and cutting with the whole idea being
00:13:13.400 | that you're dividing into parts, classifying.
00:13:16.200 | It's the taxonomic impulse.
00:13:18.280 | And to know is to know where something belongs,
00:13:23.120 | to divide it into its parts and put it in its proper place.
00:13:27.120 | And that taxonomic impulse can be very static.
00:13:31.000 | It's actually one of the things that Darwin had to overcome
00:13:33.760 | in recognizing evolution, that the taxonomies are in motion.
00:13:38.160 | But it also can lead to a kind of myopia
00:13:42.080 | that my job is done when I've classified something.
00:13:44.720 | Is this bird an X, a Y, or a Z?
00:13:48.360 | And that again can be, it can be ideological
00:13:52.600 | or it cannot be, but scientists are humans
00:13:56.080 | and they're fitting in with a world, with a world practice.
00:14:01.080 | And that's limiting, it's kind of inevitable.
00:14:05.680 | It's unavoidable.
00:14:06.880 | It's hard to be, if not impossible, out of the world
00:14:11.540 | that we're walking in.
00:14:13.640 | - Yeah, and it's fascinating 'cause I think ideologies
00:14:15.540 | also have an impulse towards forming taxonomies.
00:14:18.900 | And there is, so just, so being at MIT,
00:14:23.900 | I've gotten to learn about this character
00:14:27.480 | named Jeffrey Epstein.
00:14:28.940 | I didn't know who this was until all the news broke out
00:14:32.200 | and so on.
00:14:33.200 | And I started to wonder how did all these people at MIT
00:14:36.440 | that I admire would hang out with this person?
00:14:39.500 | Just lightly, just have conversations.
00:14:41.320 | I don't mean any of the bigger things,
00:14:42.840 | but even just basic conversations.
00:14:45.400 | And I think this has to do, you said scientists
00:14:48.200 | are widget builders and taxonomizers.
00:14:52.000 | I think there's power in somebody like the Nazi regime
00:14:56.120 | or like a Jeffrey Epstein just being excited
00:14:59.000 | about your widgets and making you feel like the widget
00:15:04.000 | serves a greater purpose in the world.
00:15:07.440 | And so it's not like you're, you know,
00:15:11.480 | sometimes people say scientists wanna make money
00:15:13.800 | and, or they have a big kind of ideological drive behind it.
00:15:18.800 | I think it's just nice when the widget,
00:15:22.920 | so you like building anyway, somehow somebody convinces you,
00:15:27.040 | some charismatic person, that this widget
00:15:30.200 | actually has a grander purpose.
00:15:33.360 | And you don't almost feel, think about the negative
00:15:37.480 | or whether it's positive, just the fact that it's grand
00:15:40.760 | is already super exciting.
00:15:42.600 | - Yeah, yeah, I think that's right.
00:15:44.000 | I think that's the story of Wernher von Braun,
00:15:46.840 | you know, and the fascination with rockets
00:15:49.040 | and this will, you know, enlarge something in the world.
00:15:53.440 | And here he is, he's an SS officer,
00:15:56.040 | he's working around slave labor.
00:15:58.720 | And then, but his rocket then gets compressed
00:16:02.860 | into the Western world or the American world
00:16:06.040 | and basically launches us to the moon
00:16:09.740 | and we forget about
00:16:11.360 | how the sausage was made originally.
00:16:15.000 | - Well, can you talk about him a little bit more
00:16:17.880 | 'cause he's such a fascinating character?
00:16:20.840 | 'Cause he, so he was a Nazi, but he was also an American
00:16:25.720 | and it had such a grand impact on both.
00:16:30.400 | And like, there's this uncomfortable fact
00:16:32.480 | that he's, you know, one of the central figures
00:16:35.640 | that gave birth to the American space exploration efforts.
00:16:39.840 | - Yeah, he's an interesting figure,
00:16:40.920 | fascinated in a kind of a tunnel vision way with spaceflight.
00:16:45.280 | He makes these beautiful rockets already,
00:16:47.380 | beginning in the '20s, early '30s.
00:16:49.520 | Ends up for a while at Peenemünde using slave labor
00:16:53.840 | to build V2 engines and so forth like that.
00:16:57.760 | I remember going to Peenemünde
00:17:00.360 | where people have actually tracked the flights
00:17:03.920 | of aborted V2 rockets and found some of these beautiful,
00:17:08.840 | beautiful old engines, just the most like works of art.
00:17:13.080 | These engines used to rain terror on the British.
00:17:18.080 | It's interesting because in that same spot,
00:17:22.520 | I was hunting for amber, Baltic amber,
00:17:24.920 | 'cause I'm a stone collector.
00:17:27.320 | And among the amber collectors there,
00:17:29.760 | there's a famous story of the Peenemünde burn.
00:17:33.320 | It's called that because they find yellow phosphorus
00:17:36.480 | they think is amber, they put it in their pocket
00:17:39.020 | and then it dries out and then explodes
00:17:41.140 | and creates this big burn on their legs.
00:17:45.380 | But the whole Nazi regime is full of things like that.
00:17:49.680 | It's full of these scholars who get twisted
00:17:54.340 | into a mindset.
00:17:56.400 | And it's also important to realize
00:18:00.400 | that people didn't often see what was coming.
00:18:05.240 | And we look back and we say, how could you X, Y, or Z?
00:18:09.440 | But before the Holocaust, there's not the Holocaust.
00:18:14.440 | There are versions of it, but things take on a new meaning
00:18:18.760 | in light of subsequent events.
00:18:21.620 | - And there's an entire propaganda machine
00:18:24.080 | that makes it easier for you to hold
00:18:27.080 | the narrative in your head.
00:18:28.720 | Even if you kind of intuitively know
00:18:31.040 | there's something really wrong here,
00:18:33.160 | because of the propaganda, you can kind of convince yourself
00:18:36.080 | to be able to sleep at night.
00:18:37.560 | - That's right, and we have to remember
00:18:40.840 | that Goebbels' office was not the office of propaganda.
00:18:45.840 | It was the office of popular enlightenment and propaganda.
00:18:53.280 | - So enlightenment was part of his--
00:18:57.560 | - Just the new era of enlightenment from his perspective.
00:19:00.560 | - It was supposed to be the new age,
00:19:01.960 | the new era of enlightenment.
00:19:04.000 | It's a little bit like the kind of myth
00:19:06.540 | of Hitler's failed artist.
00:19:08.800 | You know, his art is not that bad.
00:19:11.040 | You know, there are a lot of artists who are worse.
00:19:13.960 | And I had a very interesting conversation once
00:19:16.660 | with my college roommate who became a librarian at Harvard.
00:19:20.920 | And at Harvard, he met an old, old librarian,
00:19:24.360 | a German woman who had met Hitler as a kid
00:19:28.120 | when she was like eight years old.
00:19:30.160 | Her dad was like a Gauleiter for the Nuremberg area.
00:19:33.540 | And she said that for 15 minutes,
00:19:36.360 | Hitler goes out onto the balcony with her
00:19:40.120 | and has this conversation alone
00:19:42.120 | with this eight-year-old girl.
00:19:44.160 | And she said he was charming and funny.
00:19:47.920 | And then he said he loved kids.
00:19:49.720 | And she said he was the most charming sort of person.
00:19:52.720 | And that's part of the history too,
00:19:55.680 | that we tend to forget when we make a scarecrow image
00:19:58.280 | of this rabid, raging fanatic.
00:20:02.480 | You know, there's more to it than that.
00:20:05.000 | - That's really, really, really important to think about
00:20:08.440 | when we make a scarecrow,
00:20:10.240 | because that gives you actionable,
00:20:14.100 | like it forces you to introspect
00:20:18.220 | about people in your own life,
00:20:19.840 | or leaders in your life today,
00:20:22.380 | ones you admire.
00:20:23.800 | They're charismatic, they're friendly, they love kids,
00:20:26.240 | they talk about enlightenment.
00:20:28.880 | You have to kind of think, all right,
00:20:30.680 | am I being duped on certain things?
00:20:33.000 | You have to kind of have a,
00:20:34.320 | I mean, that's the problem with Jeffrey Epstein
00:20:36.120 | that people don't seem to talk about.
00:20:38.400 | I never met the guy,
00:20:39.820 | but just given the people he talked to whom I know,
00:20:44.240 | it feels like he must have been charismatic.
00:20:46.880 | Like people think about like,
00:20:48.480 | oh, it's because of the women,
00:20:50.160 | it's because of the money.
00:20:51.540 | I don't, the people I know,
00:20:54.680 | I don't think they're going to be influenced.
00:20:57.120 | Ultimately, it has to be how you are in the room,
00:21:01.040 | and make, it's exactly like you said, the enlightenment.
00:21:04.320 | I think that excites the scientists.
00:21:06.520 | Of course, as a charismatic person,
00:21:08.000 | you have to know what to pick
00:21:10.520 | in terms of what excites you,
00:21:12.040 | but that is also the fascinating thing to me about Hitler,
00:21:15.760 | is all of these meetings,
00:21:18.200 | even like with Chamberlain,
00:21:20.480 | inside rooms, whether he was screaming,
00:21:23.740 | or whatever he was saying,
00:21:25.200 | it seems like he was very convincing.
00:21:27.440 | There must have been passion in his eyes.
00:21:29.680 | There must have been charisma that one-on-one,
00:21:32.280 | in a quiet conversation, he was convincing.
00:21:34.440 | - Yes, there's a famous story about Goebbels,
00:21:37.400 | who would do a party trick,
00:21:39.440 | where for 15 minutes,
00:21:41.080 | he would rouse the crowd to communism.
00:21:45.600 | Workers of the world unite.
00:21:47.000 | Then for 15 minutes, he would rouse the world
00:21:50.680 | to capitalism and individualism.
00:21:53.640 | Then for 15 minutes, he would rouse the world to Nazism,
00:21:56.440 | and apparently, he was quite convincing
00:21:58.480 | in each of those performances.
00:22:01.120 | - Well, all those ideologies are pretty powerful.
00:22:04.020 | And I think it's not even the reason that matters as much
00:22:09.720 | as the power of the dream, of the vision of the enlightenment.
00:22:13.600 | I mean, the vision of communism is fascinatingly powerful.
00:22:16.920 | Like, workers unite, the common people stand together,
00:22:22.560 | they'll overthrow the powerful, the greedy,
00:22:25.320 | and share the outcomes of our hard work.
00:22:31.280 | - Well, it's kind of like the story of,
00:22:33.760 | two-thirds of the things that Marx calls for
00:22:36.880 | in the Communist Manifesto
00:22:38.200 | are already just part of the liberal state.
00:22:41.600 | And so the parts we remember or forget
00:22:43.840 | about an ideology are very revealing.
00:22:46.840 | - If we can just linger on this a little bit longer,
00:22:49.800 | what have you learned from this period of the 1930s
00:22:54.800 | about the scientific process?
00:22:57.040 | So, one of the labels you can put on your work,
00:23:01.160 | and you as a scholar, as a philosopher of science,
00:23:03.820 | and you also talk about Nazi Germany
00:23:07.100 | as a singular moment in time,
00:23:09.220 | or like a rebirth of the integration
00:23:12.640 | between ideology and science.
00:23:16.040 | So, in terms of valueless science, I think is the term.
00:23:21.040 | - Value-free science.
00:23:22.880 | - Value-free science that you use.
00:23:26.080 | I mean, it seems like Nazi Germany
00:23:27.840 | is an important moment in history.
00:23:30.360 | I mean, it probably goes up and down.
00:23:32.680 | What difficult truths have you learned
00:23:37.500 | about the scientific process,
00:23:39.300 | and what hopeful things have you learned
00:23:42.320 | about the scientific process?
00:23:44.380 | - Well, I guess the saddening thing
00:23:47.280 | is how easily people can become part of a machine.
00:23:50.280 | If there's power, people can be found to follow it.
00:23:55.520 | You know, one of the things I work on is big tobacco,
00:23:59.600 | and we'll probably come to that.
00:24:01.400 | But it's amazing to me how easily people
00:24:05.440 | are willing to work for big tobacco.
00:24:06.920 | It's amazing to me how many scientists and physicians
00:24:11.200 | were willing to work for the Nazi regime
00:24:14.360 | for multiple reasons.
00:24:17.900 | Partly because a lot of them really thought
00:24:21.280 | they were doing the Lord's work.
00:24:23.520 | They thought they were cleaning the world of filth.
00:24:28.520 | You know, I mean, if you really thought
00:24:31.760 | Jews are a parasitic race,
00:24:33.960 | why wouldn't you get rid of them?
00:24:36.680 | So there's an ontology, there's a theory of the world
00:24:40.520 | that they're building on.
00:24:42.560 | And interestingly, one that was also present
00:24:46.680 | in the United States, and one of the things I did find out
00:24:49.760 | in my earliest research was that the Nazis
00:24:52.680 | had looked lovingly and enviously over at the United States
00:24:56.880 | in terms of racial segregation, racial separation,
00:25:01.360 | and saw themselves in a kind of competition
00:25:04.560 | to become the world's racial leader
00:25:06.560 | as the most purified racial form.
00:25:11.160 | And that this required this kind of cleansing process.
00:25:15.100 | And the cleansing meant getting rid of
00:25:21.080 | the physically handicapped,
00:25:22.400 | it meant getting rid of racial inferiors
00:25:24.960 | as they imagined them.
00:25:26.480 | It meant getting rid of cancer-causing chemicals
00:25:31.240 | in the air and in our food and our water.
00:25:33.800 | These were all of a piece.
00:25:35.440 | There's a famous illustration that Richard Doll,
00:25:41.560 | one of the great cancer theorists,
00:25:44.080 | talks about from studying in Nazi Germany in the 1930s.
00:25:48.080 | And he was shown a lecture where cancer cells
00:25:51.880 | are shown as Jews and x-rays are shown as stormtroopers.
00:25:56.160 | And these stormtroopers are killing the cancer cells
00:25:59.960 | who are also Jews.
00:26:01.140 | And so there's this metaphorical work
00:26:03.800 | of cleaning, extermination, sanitation.
00:26:08.200 | - Purification of a sort.
00:26:10.100 | - Purification.
00:26:10.940 | There's definitely a kind of purity quest.
00:26:13.320 | And you see that at multiple levels.
00:26:15.780 | And so you see how easy it is for people
00:26:21.040 | to fall into that given a particular theory.
00:26:25.640 | And again, coming back to that earlier sort of point
00:26:29.240 | about the scarecrow, which I think is very important.
00:26:31.940 | If we imagine that nothing like this went on
00:26:35.800 | here in the United States, that would be a big mistake.
00:26:38.900 | The Nazis are looking to the Save the Redwoods League,
00:26:43.900 | to the Aryan supremacists, to the Ku Klux Klan,
00:26:49.620 | to the separation of blacks and whites.
00:26:53.360 | Blacks were not allowed to join
00:26:54.900 | the American Medical Association until after World War II.
00:26:57.680 | So you have racial segregation.
00:26:59.820 | You have massive sterilization in the United States
00:27:02.440 | way before the Nazis.
00:27:04.540 | One of the first things the Nazis do
00:27:06.120 | from a racial hygiene point of view
00:27:08.420 | is start sterilizing what they called
00:27:10.400 | the mentally ill and the physically handicapped.
00:27:13.300 | Well, that had been going on since around World War I
00:27:17.320 | in the United States and even earlier in certain states
00:27:20.420 | in the form of castration of prisoners
00:27:25.060 | in order to prevent their demon seed
00:27:27.320 | from being propagated further into the race.
00:27:30.580 | So there's a kind of a racial international
00:27:34.180 | that's going on and that part of the story
00:27:38.300 | also needs to be told.
00:27:41.100 | - And scientists were able to carry those ideas
00:27:43.120 | in their mind from your work?
00:27:45.260 | - Of course, of course.
00:27:46.820 | I mean, that's one of the things going on
00:27:48.620 | with all the renaming of buildings now
00:27:52.860 | is scientists who were eugenicists
00:27:55.340 | are now getting their names pulled off of buildings.
00:28:00.340 | My personal view is that it has to be done
00:28:03.660 | on a case-by-case basis.
00:28:06.020 | But in general, I think it's usually better
00:28:08.780 | to add on rather than subtract.
00:28:11.580 | In other words, to add history rather than erase history
00:28:15.180 | or pretend as if history had never existed.
00:28:18.220 | Let me give you a specific example of that.
00:28:22.360 | One of the most powerful and diabolical university presidents
00:28:27.360 | in the Nazi period was a guy named Karl Astel, A-S-T-E-L.
00:28:32.580 | And he was a rabid Nazi, high up in the leadership.
00:28:37.580 | And in his portrait at the University of Jena,
00:28:44.380 | there he is in full SS uniform,
00:28:46.960 | that painting was taken down.
00:28:51.540 | Now, what I would have done is left the painting
00:28:54.360 | and put a, you know, add a plaque.
00:28:57.480 | But to pretend as if that never happened
00:29:00.040 | or to erase history in that way,
00:29:01.760 | I think is a big, big mistake.
00:29:04.800 | - Can I linger on that point?
00:29:06.440 | So I haven't gotten through it yet,
00:29:08.620 | but I've been trying to get through Mein Kampf.
00:29:12.320 | And, you know, throughout its history,
00:29:14.240 | it's been taken down and up.
00:29:15.420 | It actually was taken down from Amazon for a while recently.
00:29:19.540 | What can you say about keeping that stuff up?
00:29:23.620 | So the reason it was taken down from Amazon,
00:29:26.400 | I mean, there's a large number of people
00:29:30.000 | that will read that.
00:29:31.180 | And the hate in their heart will grow.
00:29:36.700 | So they're not using it for educational purposes.
00:29:40.400 | You can't put a plaque on Mein Kampf.
00:29:42.800 | You're ruining Mein Kampf then.
00:29:44.480 | Like you can't, I mean, this is, you know,
00:29:48.240 | Amazon can't do a warning saying like--
00:29:51.020 | - Or an expurgated, you could do
00:29:53.360 | expurgated version of Mein Kampf.
00:29:55.600 | Take out the word Jew, you know?
00:29:57.200 | - Exactly. - That would solve
00:29:58.520 | everything. - Exactly.
00:29:59.360 | So it still just stands on its own.
00:30:01.880 | I mean, it's not well-written,
00:30:05.520 | so you can maybe convince yourself that it's okay
00:30:08.280 | because it's not well-written.
00:30:10.220 | So it's not like this inspiring book of ideology
00:30:13.780 | that could easily convince.
00:30:16.300 | But can you still man the argument
00:30:20.920 | that Mein Kampf should be banned?
00:30:23.200 | And can you still man the argument
00:30:24.820 | that it should be not banned?
00:30:27.060 | - Well, I wouldn't say it should be banned.
00:30:28.960 | I think, if anything, that might make it forbidden fruit.
00:30:33.080 | Now, this might be different when we come to statues
00:30:36.640 | on the public square.
00:30:37.640 | After World War II, the statues of Hitler,
00:30:41.120 | there must have been thousands of them were taken down.
00:30:44.000 | Now, I think even the most rabid opponents
00:30:48.460 | of cancel culture would not say there was something wrong
00:30:53.460 | with taking down the statues of Hitler
00:30:56.360 | that were in every office building, every post office.
00:31:01.100 | So I think a lot depends on the placement
00:31:06.100 | and the purpose of icons, of statues, of texts.
00:31:11.340 | I don't see the harm in being able to buy Mein Kampf.
00:31:14.880 | It's so out of this world by now.
00:31:19.280 | Just the language and, if anything,
00:31:22.140 | there probably is more good done
00:31:24.600 | by people being shocked at how dumb it is
00:31:27.240 | than the evil that might be done by someone reading it.
00:31:30.520 | I can't imagine people being really gripped by that now,
00:31:35.520 | partly just 'cause it's kind of outdated and crazy talk.
00:31:39.560 | So in that case, I would not be in favor of that.
00:31:44.560 | When it comes to monuments or other types of things,
00:31:48.200 | it's a judgment call in each case.
00:31:50.120 | I think it has to be probably voted on,
00:31:52.640 | but it also, I think, in many of these cases,
00:31:58.400 | an add-on view would fix a lot of the problems.
00:32:02.140 | - We'll jump around a little bit.
00:32:03.240 | We'll come back to medicine and war on cancer.
00:32:07.040 | - Let me just add one thing on that.
00:32:08.520 | Recently, the name of Millikan,
00:32:11.760 | who worked on the charge of the electron
00:32:14.200 | in the early part of the 20th century,
00:32:15.760 | his name was taken off of a building at Caltech.
00:32:20.140 | Well, to take his name off, what do you really do?
00:32:24.840 | It wasn't a central aspect of his actual work.
00:32:28.000 | It's not why he was put on the name
00:32:30.160 | of that building at Caltech.
00:32:32.080 | And also, the memory is lost and the lesson is lost.
00:32:35.780 | When you could have kept the Millikan name
00:32:39.600 | on the building and added a plaque,
00:32:42.020 | this guy was a racist or this guy was a eugenicist
00:32:45.440 | or something to make a teaching moment
00:32:48.180 | instead of just a forgetting moment.
00:32:50.320 | - Yeah.
00:32:51.500 | Well, let me take a small tangent and ask you
00:32:54.240 | about censorship and this particular period
00:32:58.200 | we're living through.
00:32:59.840 | So my friend Joe Rogan has a podcast.
00:33:03.640 | He hosts a few folks on there
00:33:07.100 | and they're folks of differing opinions.
00:33:09.720 | And as we speak, there's kind of a battle going on
00:33:13.120 | over whether Joe Rogan should be on Spotify
00:33:16.960 | and allowed to spread scientific misinformation.
00:33:20.280 | In particular, there's a guy named Robert Malone
00:33:22.560 | that's talking about, that's making a case against,
00:33:27.560 | at least against the COVID vaccine and so on.
00:33:31.880 | So outside of the specifics of this person,
00:33:35.100 | in this battle of scientific ideas
00:33:41.560 | that are sometimes tied up with ideology
00:33:45.000 | in our modern world, what do you think is the role?
00:33:49.640 | Like who gets to censor, decide what is misinformation
00:33:54.200 | or information?
00:33:56.040 | Should we let ideas fly in the scientific realm?
00:34:01.400 | So scientific ideas,
00:34:03.080 | or should we try to get it under control?
00:34:05.720 | Like which way, obviously all approaches will go wrong
00:34:10.560 | in some ways, which is more likely to go wrong?
00:34:14.040 | One where you try to get a hold of like,
00:34:16.640 | all right, this is a viral thing
00:34:20.120 | and it doesn't fit with scientific consensus.
00:34:23.460 | So we should probably like try to like quiet it down
00:34:26.160 | a little bit, or do you let it all just fly
00:34:29.200 | and let the ideas battle?
00:34:30.980 | Do you think about this kind of stuff
00:34:33.240 | in the context of history?
00:34:36.400 | - Well, that used to be a million dollar question.
00:34:39.640 | Of course, now it's a multi-billion dollar question.
00:34:42.360 | - Not trillion, yeah.
00:34:43.420 | - We're talking about powerful internet platforms
00:34:48.760 | becoming essentially publishers.
00:34:51.920 | And publishers can't say whatever they want.
00:34:55.180 | There are limits.
00:34:58.340 | They can't yell fire in a crowded theater.
00:35:01.900 | But there's a kind of social responsibility that is there.
00:35:06.460 | And I know some of these,
00:35:08.580 | I don't know a lot about this topic,
00:35:09.960 | but I know some of the large platforms
00:35:12.420 | do have dedicated offices
00:35:14.740 | to trying to rein in misinformation
00:35:19.220 | as you would expect any publisher to do.
00:35:21.340 | You can't just let anything fly in Time Magazine
00:35:25.540 | or the New York Times either.
00:35:29.460 | There are all kinds of codes of ethics and legal obligations.
00:35:32.860 | So I'm a fan of the efforts,
00:35:37.860 | or I think some of the large internet platforms
00:35:40.020 | should be congratulated at least for trying
00:35:42.660 | to make an effort to rein in misinformation.
00:35:45.660 | It's gonna be difficult,
00:35:47.780 | and mistakes are gonna be made,
00:35:49.860 | but it can't be a let everything fly kind of situation.
00:35:53.580 | But when I watch, unfortunately,
00:35:56.820 | the pressure these platforms feel
00:35:59.460 | to identify and to censor misinformation,
00:36:02.480 | that pressure is ideological in nature currently.
00:36:08.600 | So if you just objectively look,
00:36:12.500 | there's a certain political lean
00:36:14.260 | to people that are pressing on the censorship
00:36:16.940 | on the misinformation,
00:36:18.240 | which makes me very uncomfortable
00:36:20.580 | because now there's an ideology
00:36:22.340 | to labeling something as misinformation
00:36:24.660 | as opposed to kind of having a valueless
00:36:29.660 | evaluation of what is true or not.
00:36:33.220 | And you also have to acknowledge
00:36:35.940 | that it says something,
00:36:38.660 | that there's a very large number of people
00:36:42.140 | that, for example, follow Robert Malone,
00:36:46.220 | or follow people, I mean,
00:36:47.780 | what does that say about society?
00:36:50.300 | And there's a deeper lesson in there
00:36:53.300 | that's not just about blocking misinformation.
00:36:56.300 | It's distrust in science and institutions,
00:37:00.300 | distrust in leaders.
00:37:02.020 | Like, it feels like you have to fix that.
00:37:04.120 | And then censorship of misinformation
00:37:07.040 | is not going to be fixing that.
00:37:08.860 | It's only going to like throw gasoline on the fire.
00:37:12.600 | You gotta put out the fire.
00:37:15.820 | - Well, that's certainly possible, yeah.
00:37:19.380 | I mean, I think people are distrustful
00:37:23.900 | of certain institutions and not others, right?
00:37:26.900 | And I think a lot of distrust is good.
00:37:31.040 | I'm not a conspiracy theorist,
00:37:33.740 | but I do know there have been a lot of conspiracies
00:37:37.060 | and that people work behind scenes
00:37:42.060 | to do powerful bad things,
00:37:43.780 | and that's what needs to be exposed.
00:37:47.580 | The other thing I worry about,
00:37:48.780 | which is relevant to your question,
00:37:50.460 | again, it's a billion or trillion dollar question,
00:37:53.100 | is we're, I think, in a world of kind of flattening
00:37:56.940 | where all news or all information or all data
00:38:01.500 | is kind of equal in some way.
00:38:03.140 | And so you get the Twitter-verse going,
00:38:05.740 | and it doesn't matter if it's peer-reviewed
00:38:08.780 | or it doesn't matter if it's been supported by evidence.
00:38:12.580 | It's just a kind of outburst.
00:38:16.460 | It's interesting to contrast it with, say, 100 years ago.
00:38:20.020 | I mean, what would a crazy person
00:38:22.740 | or a flat earther or anything,
00:38:26.980 | what venue would they have?
00:38:29.260 | I mean, maybe they could go to a church or someplace.
00:38:31.940 | So now we have these empowering engines,
00:38:38.880 | and that's what's new historically,
00:38:40.780 | is that basically anyone can have a blog or a Twitter
00:38:46.780 | feed, and that is new.
00:38:49.180 | And so that is, you can think of it also
00:38:51.860 | as a kind of clutter.
00:38:52.780 | So it's a kind of a radical democracy in a way,
00:38:55.020 | and kind of one of the weaknesses of democracy
00:38:57.380 | is if everyone has an equal voice
00:38:59.780 | and if everyone has equal power.
00:39:01.980 | - So there's, of course, a flip side to that
00:39:03.900 | where everyone has equal power.
00:39:05.340 | It forces the people who are quote-unquote experts
00:39:09.300 | to be better at communication.
00:39:11.140 | I think people, like scientists, are just upset
00:39:13.660 | that they have to do better work at communicating now.
00:39:16.660 | They used to be lazy and you could just say,
00:39:18.820 | I have a PhD, therefore everyone listen to me.
00:39:20.740 | Now they have to actually convince people.
00:39:22.660 | Like, you have to convince people that the Earth is round.
00:39:25.220 | You can't just say the Earth is round, that's it.
00:39:27.180 | You have to show, you have to make,
00:39:30.820 | I mean, not the Earth is round part,
00:39:32.220 | but things like that.
00:39:34.580 | You have to actually be a great communicator,
00:39:37.460 | do great lectures, do documentaries, and so on,
00:39:41.060 | to battle those ideas.
00:39:42.660 | And then also to defend the sort of,
00:39:45.500 | the people labeled as crazy.
00:39:48.660 | In Nazi Germany, if you were protesting against
00:39:53.940 | some of the uses of science, of medicine,
00:39:58.380 | to commit atrocities, you would also be labeled crazy.
00:40:01.300 | - Yeah.
00:40:02.140 | - And so those voices are important.
00:40:03.740 | - Yeah, there's so many good points there
00:40:06.340 | on the scientists becoming good communicators.
00:40:10.080 | Scientists becoming bad communicators
00:40:13.280 | has a history.
00:40:14.540 | And the last original contribution to science,
00:40:19.540 | written entirely in the form of a poem,
00:40:21.640 | is Erasmus Darwin's "The Loves of the Plants."
00:40:25.760 | And following that in the 18th century,
00:40:28.360 | you get the uglification of science.
00:40:30.720 | The deliberate uglification of science,
00:40:32.960 | with the idea being that if you are clear,
00:40:35.640 | and if you speak beautifully, if you write beautifully,
00:40:37.840 | you're hiding something.
00:40:39.160 | You're covering over the truth with flowers,
00:40:43.400 | and decorations, and scents, and pleasant odors.
00:40:47.480 | And so, you get this scientific paper format,
00:40:52.480 | introduction, discussion, methods, results, conclusions,
00:40:58.280 | and it's kind of policed in this inhumane,
00:41:03.720 | non-humanistic kind of rhetorical way,
00:41:06.440 | and that's a big problem.
00:41:07.440 | And so, you get that combined with just the rise
00:41:10.760 | of the research lab, and the ever narrower widget builders,
00:41:15.460 | the cogs in the machine.
00:41:16.800 | It's not surprising that people might not trust
00:41:21.320 | certain aspects of that.
00:41:22.560 | That combined with the dirty laundry history
00:41:26.080 | of a lot of science, that you did have the requirement
00:41:31.080 | at Auschwitz that physicians supervise the killings.
00:41:36.240 | You know, the horrors of Tuskegee,
00:41:41.240 | and all kinds of other things.
00:41:43.840 | Or even something like the atom bomb,
00:41:45.600 | which is arguably more neutral, at least.
00:41:48.420 | But nonetheless, horrific.
00:41:49.520 | And so, it's not surprising that a lot of people
00:41:53.000 | don't trust science, and a lot of science
00:41:54.680 | shouldn't be trusted, right?
00:41:56.440 | There's science, and then there's science.
00:41:58.400 | So, there's a long history of dirty, bad science
00:42:02.600 | that you don't solve just by saying
00:42:04.200 | we should have trusted it.
00:42:05.720 | - Let's just stay on COVID for a brief moment,
00:42:10.000 | and talk about a particular leader
00:42:13.760 | that I think about, is Anthony Fauci.
00:42:16.600 | I've thought about whether to talk to him or not.
00:42:19.240 | I have my own feelings about Anthony Fauci.
00:42:25.000 | By the way, I admire basically everybody,
00:42:28.280 | and I admire scientists a lot.
00:42:30.600 | And there's something about him that bothers me.
00:42:34.080 | I think because I'm always bothered by ego,
00:42:39.080 | and lack of humility, and I sense that.
00:42:42.120 | Maybe I'm very wrong on this.
00:42:44.960 | But so, he has said that he represents science.
00:42:49.760 | If you take him in full context,
00:42:51.720 | I understand the point he's making,
00:42:54.640 | which is, you know, when people attack,
00:43:01.600 | attack him, they think of him as representing science,
00:43:05.400 | things like that.
00:43:06.360 | But there's ego in that.
00:43:08.240 | And what do you think motivates and informs his decisions?
00:43:12.100 | Is it politics or science?
00:43:13.640 | And the broader question I have,
00:43:15.240 | what does it take to be a great scientific leader
00:43:20.680 | in difficult times?
00:43:23.960 | Like these, and maybe you could say
00:43:27.360 | Nazi Germany was similar,
00:43:29.640 | when there's obviously, like you,
00:43:33.080 | Anthony Fauci, just like scientific leaders
00:43:35.440 | during Nazi Germany, could have made a difference,
00:43:37.800 | it feels like, positive and negative.
00:43:41.040 | And so it's like there's a lot at stake,
00:43:43.000 | there's a lot at stake in terms of scientific leadership.
00:43:47.660 | I've asked about 17 questions,
00:43:51.760 | if there's something worthwhile answering in that.
00:43:54.960 | - Well, Fauci, I think, is doing as good a job as he can.
00:43:59.320 | I mean, he's a, you can't turn on the television
00:44:03.080 | without seeing him.
00:44:04.180 | - But no, that's, what's the goal of the job?
00:44:08.840 | That means he appears a lot, but there's,
00:44:12.000 | he does not come off as somebody with authenticity.
00:44:15.640 | Like I admire so many science communicators,
00:44:17.920 | about 10x, 100x more than him,
00:44:20.060 | including his boss, Francis Collins,
00:44:23.360 | who I've recently lost respect for,
00:44:26.160 | given some of the emails that leaked.
00:44:28.840 | There's ego in those emails.
00:44:31.240 | And it upsets me, because like,
00:44:33.440 | I hope all that stuff comes out
00:44:35.520 | and wakes young scientists up: don't be a douchebag.
00:44:40.520 | Be humble.
00:44:43.900 | Be honest, be authentic, be real, put yourself out there,
00:44:47.360 | don't play the PR game, don't play politics.
00:44:50.640 | Just get excited about the widget building that you love,
00:44:53.840 | communicate that, and think about
00:44:55.840 | the difficult ethical questions there,
00:44:57.600 | and communicate them, be transparent.
00:44:59.960 | Don't think like the public, don't talk down to the public,
00:45:02.740 | don't think the public is too dumb to understand
00:45:05.120 | the complexities involved, because
00:45:07.080 | the moment you start to think that, when you're like 30,
00:45:11.440 | what do you think happens when you're 40 and 50?
00:45:14.120 | The slippery slope of that, the ego builds,
00:45:17.360 | the, like, this taste for the public opinion builds,
00:45:22.360 | and then you get into the leadership position
00:45:26.920 | at the time you're 60 and 70,
00:45:28.960 | and then you're just a dick.
00:45:31.020 | And you're a bad communicator to the very public.
00:45:34.960 | So I think this is something that just builds over time,
00:45:37.680 | is the skill to communicate, to be honest, to be real,
00:45:41.200 | to constantly humble yourself, to surround yourself
00:45:43.720 | with people that humble you.
00:45:45.800 | Anyway, I'm bothered by it,
00:45:50.440 | because I feel like science is under attack.
00:45:52.720 | People distrust science more and more and more.
00:45:56.400 | And it is perhaps unfair to place, like,
00:45:59.760 | Anthony Fauci to blame for that.
00:46:02.260 | But you know what?
00:46:03.100 | Leaders take care of the responsibility.
00:46:06.840 | So in you saying that he's doing the best job he can,
00:46:10.940 | I would say he's doing a reasonable job,
00:46:13.560 | but not the best job he can.
00:46:15.240 | - Yeah, well, I don't know what his capabilities are
00:46:17.880 | on that one or the other. - I mean, that position.
00:46:19.920 | - Right, right.
00:46:21.280 | - Like, you can imagine how history sees great leaders,
00:46:26.400 | that unite, on whom history turns.
00:46:31.160 | That's not a great leader, because there's a huge division.
00:46:34.840 | There's a lot of people in leadership position
00:46:37.640 | that can heal the division.
00:46:39.280 | You could think of tech leaders.
00:46:43.840 | They can heal the division, 'cause they have the platform.
00:46:46.120 | They can speak out with eloquence.
00:46:48.200 | You can think of political leaders, presidents,
00:46:50.600 | that can speak out and heal the division.
00:46:53.000 | You could think of scientific leaders, like Anthony Fauci.
00:46:55.400 | They can heal division.
00:46:57.120 | None of these are doing a good job right now.
00:46:59.520 | And which is, you know, leadership is hard,
00:47:03.080 | which is why when great leaders come along,
00:47:05.680 | history remembers them.
00:47:07.440 | So I just wanna point out, the emperor has no clothes
00:47:10.400 | when the leaders are like, eh, kind of mediocre.
00:47:12.960 | - Yeah, yeah.
00:47:14.200 | - 'Cause it feels like, I guess I'll take it to a question
00:47:17.920 | about Nazi Germany.
00:47:19.480 | What is the heroic action for a scientist in Nazi Germany?
00:47:24.280 | Like, to stand, to see what's right
00:47:29.200 | when you're under this cloud of ideology?
00:47:35.240 | - Yeah.
00:47:36.640 | Well, it's an almost impossible task in Nazi Germany.
00:47:40.660 | Maybe the heroic task would have been
00:47:44.440 | before Hitler was essentially elected,
00:47:48.220 | and the Reichstag is burned.
00:47:52.660 | - So in the '30s, 'cause it's building--
00:47:55.000 | - When it's building, what the other alternatives are,
00:47:58.780 | maybe it's events in World War I
00:48:02.800 | that could have made Nazism less inevitable.
00:48:07.300 | You know, maybe it's going back to the British Empire,
00:48:13.920 | which had a giant empire,
00:48:16.560 | and Germany wanted a big empire, too, right?
00:48:20.220 | And that part of the history of World War I
00:48:22.120 | is often forgotten.
00:48:25.160 | So, you know, the heroic act is to stand up
00:48:28.840 | and tell the truth and fight against evil.
00:48:32.360 | - Well, of course you get--
00:48:34.760 | Oh, sorry to interrupt, but of course you get--
00:48:35.600 | - Well, just have some courage, you know?
00:48:38.440 | - But I also, so I personally don't always
00:48:42.640 | have complete respect of people
00:48:44.120 | who stand up and have courage,
00:48:45.760 | 'cause it's not often effective.
00:48:47.480 | What I have the most respect for
00:48:50.400 | is long-term courage, like that's effective.
00:48:55.400 | 'Cause like, you know, if you're just an activist
00:48:57.480 | and you speak out this is wrong,
00:48:59.240 | that's not gonna be effective,
00:49:00.240 | because everybody around you is saying,
00:49:03.360 | "Nah, it's, like, we like our widgets."
00:49:06.960 | So you have to somehow like steer this Titanic ship.
00:49:10.160 | And I guess you're right,
00:49:12.240 | the easiest way to steer is to do it earlier.
00:49:15.120 | - Well, everyone has different skills.
00:49:18.200 | You know, Musk is building electric cars,
00:49:20.400 | and other people are trying to build solar and wind.
00:49:25.400 | And there are all kinds of problems
00:49:28.280 | that we're gonna solve, right?
00:49:29.600 | People are building better vaccines, you know?
00:49:32.640 | There's a thousand ways to do good in the world,
00:49:37.000 | and a thousand ways to do bad in the world.
00:49:39.640 | I mean, part of the problem in science
00:49:42.760 | is that we don't look enough
00:49:45.520 | at what I call the causes of causes.
00:49:48.420 | - So cigarettes cause cancer, but what causes cigarettes?
00:49:52.260 | - Yeah, so the deeper, yeah, yeah.
00:49:56.540 | - Obesity causes heart disease, but what causes obesity?
00:50:00.060 | And it's not just gluttony and sloth,
00:50:02.060 | it's the decision to pump up the sugar industry,
00:50:07.060 | and to allow soda in school.
00:50:10.040 | And I'm a big fan of what I call loop closing.
00:50:16.980 | We're all worried about climate change,
00:50:19.940 | and reducing our carbon footprint,
00:50:21.660 | but what about the hidden causes, the unprobed causes?
00:50:26.660 | I'm doing a project now with Londa Schiebinger
00:50:29.460 | on looking at how voluntary family planning
00:50:33.700 | could actually have a big role
00:50:36.140 | in reducing carbon footprint throughout the world.
00:50:40.100 | And these literatures are never joined, or rarely joined,
00:50:44.300 | that we have this huge carbon emissions problem,
00:50:48.660 | but we also have too many people on the planet,
00:50:52.940 | and the cause of that is that too few women and men
00:50:57.940 | have access to birth control.
00:51:01.340 | And if you join those realms,
00:51:06.340 | there's gonna be new possibilities.
00:51:10.860 | And it's kind of like looking at the flip side of fascism,
00:51:15.860 | and the kind of discoveries they made
00:51:18.740 | that have been ignored.
00:51:20.840 | That's one of the things I'm interested in,
00:51:22.260 | is finding some of the gaping holes,
00:51:24.860 | the ideological gaps that have been ignored
00:51:27.300 | because of ideology, left or right, by the way,
00:51:30.920 | both of which involve blinders.
00:51:36.220 | And so there's all kinds of blinders that we live in,
00:51:38.500 | that's part of ideology, is what don't we even see?
00:51:41.980 | - And that would prevent us from seeing
00:51:45.140 | some deep, objective scientific truth.
00:51:47.940 | - Right, some truth.
00:51:49.100 | - And there's actually, just to mention,
00:51:51.340 | there's some people, including Elon,
00:51:53.740 | who are saying there's not too many people,
00:51:57.020 | there's not enough people, right?
00:51:59.220 | That if you just look at the birth rates,
00:52:01.520 | and so it's like, some of this is actually
00:52:05.340 | very difficult to figure out,
00:52:06.420 | 'cause there's these narratives,
00:52:08.340 | you mentioned tobacco, obesity with sugar,
00:52:12.300 | there's been narratives throughout the history,
00:52:14.100 | and it's very,
00:52:15.020 | there's certain topics on which it's
00:52:19.540 | easy to almost become apathetic,
00:52:24.420 | because you just see, in history,
00:52:28.540 | how narratives take hold and fade away.
00:52:32.700 | People were really sure that tobacco
00:52:36.620 | is not at all a problem, and then it fades,
00:52:39.780 | and then they figure it out,
00:52:40.900 | and then other things come along.
00:52:42.340 | What other things came along now?
00:52:44.500 | - Well, you asked about ideology,
00:52:46.340 | and one of the things I always ask students before class,
00:52:50.820 | whether I'm teaching agnotology
00:52:53.380 | or world history of science is,
00:52:56.500 | what makes fish move?
00:52:58.020 | And 90% of Americans will say,
00:53:01.740 | some version of muscles, fins, neurons,
00:53:06.740 | when the reality is, at least in saltwater,
00:53:10.340 | fish don't swim places, they're moved by currents.
00:53:14.420 | Fish are moved by currents, that's what makes fish move.
00:53:16.740 | This is not even counting the rotation of the Earth
00:53:19.780 | on its axis, or the rotation of the Earth around the sun,
00:53:22.400 | or the rotation of the solar system around the galaxy,
00:53:25.820 | ignore all that.
00:53:27.580 | Even on Earth, fish arrive up in Alaska.
00:53:31.940 | They don't swim there, they come by currents,
00:53:34.300 | and this is known to people who understand
00:53:36.380 | the ecology of fish,
00:53:40.420 | but we as sort of individualistic Americans think that--
00:53:45.420 | - The fish pulled itself up by its bootstraps.
00:53:47.940 | - Pulled itself up by its bootstraps, right,
00:53:49.980 | and with whatever gumption and courage made its own world,
00:53:56.300 | instead of thinking of something like cigarettes,
00:53:59.940 | for example, hitting a village, like an epidemic,
00:54:03.980 | hitting the village like cholera,
00:54:05.700 | or pneumonia, or something like that.
00:54:07.540 | So there's a big ideology we have of personal choice.
00:54:11.220 | A great example of that is in the tobacco world,
00:54:15.200 | where people always, there's a whole field called cessation.
00:54:19.360 | That always means cessation of consumption,
00:54:21.560 | never cessation of production.
00:54:23.700 | All blame is put on the individual smoker,
00:54:26.900 | instead of looking at how they get smoked.
00:54:29.780 | And looking at that bigger picture, I think,
00:54:32.940 | is part of the story.
00:54:35.320 | - So a few years ago, you wrote that the cigarette
00:54:40.060 | is the deadliest object in the history
00:54:42.180 | of human civilization.
00:54:44.100 | Cigarettes kill about six million people every year,
00:54:47.380 | a number that will grow before it shrinks.
00:54:50.340 | Smoking in the 20th century killed 100 million people,
00:54:55.340 | and a billion could perish in our century,
00:54:59.660 | unless we reverse the course.
00:55:01.920 | Can you explain this idea that it's the deadliest object
00:55:07.860 | in the history of human civilization?
00:55:09.380 | Maybe just also talk about big tobacco
00:55:12.100 | and your efforts there.
00:55:14.060 | - Well, cigarettes have killed more than any other object,
00:55:18.540 | than all the world of iron, all the world of gunpowder.
00:55:21.300 | Nuclear bombs have only killed a few hundred thousand
00:55:26.220 | people.
00:55:27.380 | Cigarettes have killed hundreds of millions.
00:55:30.860 | And every year they kill about as many as COVID.
00:55:33.860 | They're sort of neck and neck.
00:55:36.140 | But if you took the last five years, there's no contest.
00:55:39.980 | Cigarettes have killed far more,
00:55:42.220 | and are far more preventable.
00:55:44.720 | So we're in a world, this bizarro world,
00:55:48.580 | where every night there's a COVID report,
00:55:51.140 | and cigarettes would never be mentioned.
00:55:53.900 | Cigarettes would be no more likely to be mentioned
00:55:56.540 | than if we were talking about chewing gum on a sidewalk.
00:55:59.460 | They'd be no more likely to be in a presidential debate
00:56:03.000 | than, you know, sneezing in the wrong place.
00:56:07.740 | So we live in this world where most things are invisible.
00:56:13.940 | You know, the eyes are in the front of the head.
00:56:18.940 | We don't see what's behind us.
00:56:22.000 | We have a fovea, which means not only do we only see
00:56:24.660 | what's in front of us, we see in a very narrow tunnel.
00:56:27.360 | And that's because we're predators.
00:56:30.640 | We don't have the eternal watchfulness of prey.
00:56:33.300 | We have a zeroed-in, targeted focus.
00:56:35.780 | And that leads to a kind of myopia, or a tunnel vision,
00:56:39.780 | and all kinds of things.
00:56:42.580 | Then when you get something like a very powerful
00:56:44.620 | tobacco industry, which is a multi, multi-billion dollar
00:56:47.780 | industry, which still spends many billions of dollars
00:56:50.900 | advertising every year, but nonetheless manages
00:56:53.940 | to make themselves invisible.
00:56:55.860 | You have this powerful agent that is producing
00:56:59.020 | this engine of death that is invisible.
00:57:04.020 | It's been reduced to the fish that move themselves.
00:57:06.620 | In other words, there's not really a tobacco industry.
00:57:09.140 | There's just people who smoke, and that's a personal choice.
00:57:12.140 | Like what food we're gonna have for dinner tonight.
00:57:14.340 | And so it's erased from the policy world.
00:57:18.720 | It's as if it doesn't exist.
00:57:22.020 | And creating that sense of invisibility,
00:57:25.000 | that failure to understand the causes of causes,
00:57:28.700 | is what allows the epidemic to continue,
00:57:33.020 | but also not even to be acknowledged.
00:57:35.920 | - How's the invisibility created?
00:57:38.860 | Is it natural, is it just human nature that ideas
00:57:43.860 | just fade from our attention?
00:57:49.260 | Or is it malevolent, still-ongoing action
00:57:53.940 | by the tobacco companies to keep this invisible?
00:57:59.940 | - It's still going on.
00:58:01.780 | Even when you see an ad against cigarettes on television,
00:58:07.660 | that's dramatically curtailed because the law
00:58:11.140 | that made those ads even possible included
00:58:15.040 | an anti-vilification clause.
00:58:18.380 | The industry can't be made even visible in those ads.
00:58:22.900 | In some, they get away with it.
00:58:24.140 | But the industry operates through very powerful agents,
00:58:29.140 | you know, powerful senators.
00:58:31.720 | They used to count three quarters of the members
00:58:36.060 | of Congress as grade A contacts.
00:58:39.380 | They had most of the senators in their pocket,
00:58:41.380 | a lot of the senators.
00:58:42.620 | Sometimes they'll play both sides of the aisle.
00:58:44.640 | Basically, tobacco was with the Democratic Party
00:58:48.820 | until basically the '70s and Ronald Reagan,
00:58:53.060 | then it shifts over to becoming Republican.
00:58:57.580 | They create bodies like the Tea Party.
00:59:03.860 | They merge with big oil, with
00:59:05.900 | the Koch brothers, in the 1980s and '90s,
00:59:10.420 | to form the Tea Party and a whole series of fronts
00:59:13.780 | which fight against all regulation and all taxation
00:59:18.780 | in order to prevent gas taxes and cigarette taxes,
00:59:24.220 | which are bound together in the convenience store and Walmart.
00:59:29.100 | Most cigarettes are actually sold in places like Walmart
00:59:31.620 | and pharmacies and 7-Elevens, things like that.
00:59:36.500 | And through that locus, then you have gasoline and tobacco
00:59:39.820 | sort of in this micro-architectural collaboration.
00:59:44.160 | So there's multiple, multiple means that they use.
00:59:49.260 | Plus a lot of their targeting is hyper-specific.
00:59:53.140 | They use the internet very effectively.
00:59:54.820 | They use email and things that are customer targeting.
00:59:59.960 | - What goes through the mind of a big tobacco executive?
01:00:04.160 | This is connecting to our previous conversations
01:00:06.580 | of scientists and so on.
01:00:07.900 | I always wonder about that.
01:00:10.340 | I talked to Pfizer CEO, for example,
01:00:14.200 | and there's a deep question with the Pfizer CEO,
01:00:19.300 | with I guess any CEO, but big pharma.
01:00:27.960 | It's like, if you could come up with a cure
01:00:32.100 | that gets rid of the problem that big pharma treats,
01:00:37.860 | would you want to?
01:00:39.220 | Because you're going to lose a lot of money
01:00:42.440 | once the cure fixes the problem.
01:00:44.180 | There are so many incentives to make money.
01:00:48.180 | Can you think clearly and make the right decisions?
01:00:51.300 | I'd like to believe most people are good
01:00:53.700 | and it's almost like this Steve Jobs idea,
01:00:57.560 | just like do the right thing
01:01:00.540 | and you'll make money in the end.
01:01:02.700 | It's like long-term, you'll make a lot of money
01:01:05.100 | if you do the right action
01:01:06.180 | 'cause there's always going to be problems you can fix.
01:01:08.580 | You can always pivot the company to focus on other things.
01:01:11.480 | As long as you're doing the best innovation,
01:01:14.060 | the best science, the best development
01:01:16.100 | and the production and deployment and stuff,
01:01:18.940 | you're going to win.
01:01:20.340 | But there's another view, where
01:01:22.820 | that kind of idea of making money pollutes you.
01:01:26.820 | It's the widget building.
01:01:28.320 | It's exciting when you can release a product
01:01:31.480 | that makes a lot of money
01:01:33.300 | and you start enjoying the charts
01:01:35.040 | that say the money's going up
01:01:37.080 | and you stop thinking about whether maybe
01:01:40.680 | that's the wrong choice for human civilization.
01:01:43.080 | - Well, one of the reasons I was given
01:01:44.880 | a courtesy appointment in pulmonary medicine at Stanford
01:01:49.440 | was they recognized I was doing more to save lives
01:01:52.120 | by trying to stop big tobacco
01:01:54.680 | than they were by yanking out this lung, that lung,
01:01:58.740 | on a daily basis.
01:01:59.780 | - Cause of causes.
01:02:01.060 | - The cause of causes, which we can keep returning to.
01:02:04.600 | Your question about how do people live with themselves
01:02:08.600 | is a crucial one.
01:02:09.780 | And it's one I've thought about a lot.
01:02:12.180 | It's one you think about with, in any context of horror,
01:02:16.180 | how do people live with themselves?
01:02:18.760 | How do they get up in the morning?
01:02:20.960 | I think there's a lot of incentives.
01:02:23.700 | One thing that you have to keep in mind
01:02:26.280 | is that whoever becomes CEO of a big tobacco company,
01:02:31.280 | they have already made decisions along the way
01:02:35.600 | and they are the remnant of a whole series
01:02:39.080 | of aspiring people who want to climb the ladder of success
01:02:44.000 | who maybe would refuse something like this.
01:02:47.320 | - But those don't survive the journey.
01:02:49.280 | - Those survive the journey who can make it through.
01:02:52.400 | And I think they have a mixture of ideologies.
01:02:54.940 | One, they'll say, well, if I didn't do it,
01:02:57.060 | someone else would.
01:02:58.600 | This is kind of like pouring the Zyklon B
01:03:00.860 | down the chimney at Auschwitz.
01:03:02.480 | Well, if I didn't do it, someone else would.
01:03:05.240 | So what's really the difference
01:03:07.060 | between me doing it and someone else?
01:03:08.340 | So that's one view.
01:03:10.460 | Another one is the tobacco industry,
01:03:13.860 | I think really doesn't like their customers
01:03:15.940 | except for the fact that they like their money.
01:03:19.620 | When you look at their documents,
01:03:20.900 | they talk about targeting against young adults
01:03:25.220 | or against women or against homosexuals.
01:03:30.220 | There's a whole project Reynolds had, called Project SCUM,
01:03:34.560 | which stands for subculture urban marketing,
01:03:37.860 | where they're targeting homeless
01:03:39.140 | and homosexuals in San Francisco.
01:03:41.860 | So what kind of business model
01:03:43.820 | regards their customers as scum
01:03:46.220 | or talks about them the way one famous Reynolds executive did:
01:03:51.220 | "We don't smoke this stuff,
01:03:54.340 | "we reserve that for the poor, the black and the stupid."
01:03:57.500 | That's a direct quote, reported by one of the Winston models.
01:04:01.700 | - So it's a company culture that sees the customers
01:04:04.860 | almost like as the enemy or worthless.
01:04:09.860 | - Losers. - Losers.
01:04:12.300 | - So you have these executives,
01:04:15.900 | if we don't do it, someone else will.
01:04:17.740 | If people are dumb enough to buy our product,
01:04:21.220 | let them buy it.
01:04:22.620 | Maybe it's a personal choice,
01:04:23.820 | maybe they're libertarians,
01:04:26.260 | maybe they're just, as you said,
01:04:28.420 | seduced by the money and the money is enormous.
01:04:30.980 | The money is enormous and these tobacco executives
01:04:34.200 | make tens of millions of dollars per year
01:04:36.140 | just in their salaries.
01:04:37.500 | So I think there's a whole series of logics.
01:04:43.380 | At some point, some of the companies
01:04:44.940 | have become food producers.
01:04:46.260 | In the 1980s and '90s, Philip Morris,
01:04:48.700 | which makes Marlboro,
01:04:50.420 | was the largest food producer in the United States.
01:04:53.300 | And so they could say,
01:04:55.940 | "Well, we're producing many products,
01:04:57.900 | "many addictive, desirable products."
01:05:02.260 | I think one project I'm working on now actually
01:05:04.620 | is looking at how the industry maintains morale
01:05:08.520 | in their own workforce.
01:05:10.500 | And they create a kind of parallel world
01:05:12.900 | of prizes and rewards and
01:05:15.380 | tobacco queens and tobacco princesses
01:05:21.060 | and tobacco sports teams.
01:05:24.100 | It's this whole separate world,
01:05:25.820 | a world within a world.
01:05:27.020 | We all live in bubbles of a sort.
01:05:31.420 | And so there is this kind of tobacco world
01:05:34.680 | where you're with us or you're against us.
01:05:39.620 | And I even found evidence that the tobacco industry
01:05:44.340 | lies to its own employees.
01:05:46.100 | So they censored their own employee information
01:05:50.740 | so that everyone would be on board that,
01:05:52.940 | well, maybe it doesn't really cause cancer.
01:05:54.740 | The evidence is all statistics.
01:05:57.060 | Can't trust mice experiments 'cause mice are not men.
01:06:01.140 | They hire the guy, Darrell Huff,
01:06:03.460 | who wrote "How to Lie with Statistics,"
01:06:05.300 | the best-selling statistics book
01:06:06.940 | in the history of the world.
01:06:08.800 | They paid him to write a book called
01:06:10.300 | "How to Lie About Smoking with Statistics."
01:06:12.900 | Now that was never published
01:06:15.980 | once word of some of their other dirty tricks got out.
01:06:20.420 | So one way they're able to gain legitimacy,
01:06:24.580 | gain normalcy, is, you know,
01:06:27.140 | they're supporters of the arts.
01:06:28.980 | You know, there are universities named
01:06:31.780 | for tobacco executives.
01:06:33.820 | You know, we have Duke University,
01:06:35.380 | we have the George Weissman School,
01:06:37.720 | I think, of Arts and Sciences at CUNY.
01:06:41.340 | And there are prizes.
01:06:44.020 | You know, Philip Morris essentially created
01:06:46.100 | women's tennis as a spectator sport.
01:06:49.100 | Billie Jean King joins the board of directors
01:06:51.820 | of Philip Morris.
01:06:53.620 | She signs coupons, two-to-one coupons,
01:06:56.180 | for buying Virginia Slim cigarettes.
01:06:59.100 | So the industry is able to acquire this talent,
01:07:03.180 | and then shift the attribution
01:07:06.880 | of causality purely onto the individual smoker.
01:07:10.360 | If you smoke, you did it to yourself.
01:07:13.200 | And so in a sense, we have nothing to do with it.
01:07:15.920 | It's sort of the same argument
01:07:17.480 | Exxon is making now with carbon.
01:07:21.380 | It's like, well, we just make the gas,
01:07:23.520 | we don't burn the gas.
01:07:24.960 | So really, we're not the problem.
01:07:27.120 | It's whoever drove here in a car that burned gas.
01:07:31.400 | And so there's a very interesting question.
01:07:34.260 | Who is liable?
01:07:35.900 | Who is responsible? Is the manufacturer just immune
01:07:40.900 | because it's a legal product
01:07:42.680 | and people make the foolish decision to smoke?
01:07:46.500 | Or does the addiction play a role in the liability?
01:07:50.380 | So these are all really interesting legal questions
01:07:53.060 | and philosophical questions.
01:07:55.060 | - Where do you attribute the success
01:07:56.900 | in the fight against big tobacco?
01:07:59.060 | So, I mean, there's been a lot of progress made.
01:08:01.520 | Maybe two questions.
01:08:02.780 | One is that and two, how much more is to be done?
01:08:06.380 | - Well, there's been, in my view, not that much progress.
01:08:10.700 | The tobacco industry basically won the war against cigarettes
01:08:16.160 | in the 1950s. The broad assumption inside
01:08:19.460 | and outside the industry was
01:08:21.260 | that if cigarettes
01:08:23.580 | were ever shown to cause cancer,
01:08:25.820 | obviously they'd be banned.
01:08:28.340 | The famous slogan in the '50s was if spinach
01:08:31.900 | were ever shown to cause 1/10 the harm of cigarettes,
01:08:36.100 | it would be banned overnight.
01:08:38.300 | Flash forward 50 years,
01:08:41.700 | we still have 200-some billion cigarettes smoked
01:08:48.960 | in the United States every year.
01:08:51.340 | Globally, we still have about six trillion cigarettes
01:08:55.500 | smoked every year, that's 350 million miles
01:08:58.500 | of cigarettes smoked every year.
01:09:00.220 | That's enough to make a continuous chain of cigarettes
01:09:04.520 | from the Earth to the Sun and back
01:09:06.940 | with enough left over for several round trips to Mars.
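A rough sanity check of that chain-length figure, as a sketch: the six trillion count is quoted above, but the per-cigarette length (~85 mm, a standard king size) is an assumption of mine, not something stated in the conversation.

```python
# Back-of-envelope check of the chain-of-cigarettes figure.
# ~6 trillion cigarettes/year is quoted in the conversation;
# ~85 mm per cigarette is an assumed king-size length.
cigarettes_per_year = 6e12
cig_length_m = 0.085
meters_per_mile = 1609.34

chain_miles = cigarettes_per_year * cig_length_m / meters_per_mile
print(f"chain length: {chain_miles / 1e6:.0f} million miles")  # ~317 million

earth_sun_miles = 93e6  # mean Earth-Sun distance
leftover = chain_miles - 2 * earth_sun_miles
print(f"left over after Earth-Sun round trip: {leftover / 1e6:.0f} million miles")
```

At ~85 mm per cigarette the chain comes out around 320 million miles, the same ballpark as the 350 million quoted; subtracting the roughly 186-million-mile Earth-Sun round trip leaves about 130 million miles, enough for one or two close-approach round trips to Mars.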
01:09:10.140 | - But it's much fewer than, I mean, okay,
01:09:12.140 | so culturally speaking, I grew up in Soviet Union.
01:09:15.980 | Everybody smoked, everybody smoked.
01:09:19.660 | - Well, by everybody, you mean about half.
01:09:21.900 | - Well, by everybody, I mean culturally.
01:09:24.300 | So what does it feel like when everybody smokes, right?
01:09:27.660 | What percentage is that?
01:09:29.260 | Right now in the United States, it feels like nobody smokes.
01:09:33.260 | I'm talking about culturally.
01:09:34.820 | Do you see famous actors and actresses?
01:09:39.040 | Do you see movies?
01:09:40.220 | - All the time.
01:09:41.220 | - You do?
01:09:42.060 | - You can't watch a Hollywood movie without seeing
01:09:44.380 | pretty much continuous smoking.
01:09:47.580 | I mean, look at Peaky Blinders, look at, you know,
01:09:51.260 | any of the modern series now,
01:09:53.180 | it's pretty much nonstop.
01:09:55.820 | You're right, there has been a change.
01:09:57.460 | I mean, that's true.
01:09:59.060 | The purest metric in the United States
01:10:00.940 | is number of cigarettes smoked per year.
01:10:03.620 | And that peaks in 1981 at 640 billion cigarettes.
01:10:08.620 | That's declined now to the level it was in 1940,
01:10:14.700 | which is about 240 billion cigarettes.
01:10:19.980 | Now globally, the number has increased.
01:10:23.460 | - See, but the perception, sorry to interrupt,
01:10:25.900 | but that's interesting.
01:10:28.020 | Even in the United States, the numbers, the decrease
01:10:31.420 | is not as significant as I thought it was.
01:10:33.820 | Because just in my own experience with people,
01:10:37.520 | you know, people speak negatively about smoking.
01:10:40.700 | - Yeah, well, for one thing, smokers do.
01:10:43.460 | I mean, smokers hate the fact they smoke.
01:10:45.420 | - Right, right, so this is the interesting observation
01:10:47.820 | I'm speaking to is even the smokers,
01:10:52.500 | are talking negatively about smoking,
01:10:54.660 | but they're still smoking.
01:10:56.180 | So even though I'm seeing this shift
01:10:58.420 | where smoking is no longer the cool thing,
01:11:01.580 | where it's like, when I was growing up
01:11:04.300 | and I smoked for a time, it was like a way
01:11:08.060 | to bond with strangers, to talk bullshit with friends.
01:11:12.220 | - Share a moment.
01:11:13.060 | - Share a moment together.
01:11:14.020 | I mean, it's a beautiful thing.
01:11:15.140 | And it's interesting 'cause we need to find other ways
01:11:17.940 | to share moments.
01:11:19.660 | But you know, you could bum a smoke from a stranger.
01:11:22.300 | I mean, that was seen as a good thing.
01:11:24.060 | Now-- - Did you ever smoke?
01:11:26.060 | - Oh, yeah, yeah. - For how many years?
01:11:28.740 | - Two years.
01:11:29.580 | I was a musician, so what happened is I was a musician,
01:11:33.740 | I was in a band.
01:11:34.940 | - Well, there you go.
01:11:35.860 | - And no, there is a bonding aspect to it.
01:11:38.420 | And I think I stopped smoking when they banned
01:11:42.420 | smoking inside bars.
01:11:45.300 | - Yeah, exactly.
01:11:46.300 | - Which was, I mean, that was,
01:11:48.940 | I mean, looking back now, it seems,
01:11:50.980 | it's such a powerful move.
01:11:54.260 | I mean, maybe you can speak to that because that--
01:11:56.100 | - Yes, that's key.
01:11:57.220 | - That was one of the moments that woke me up.
01:11:59.260 | Wait a minute.
01:12:00.500 | Like, that was a big shift for me,
01:12:02.580 | and I'm sure I'm not alone, where it's not just,
01:12:05.980 | like, it forced me to rethink the effect
01:12:09.860 | that smoking has on me.
01:12:11.500 | - Yes.
01:12:12.340 | - And also to think, can I actually live a life
01:12:14.660 | without smoking?
01:12:15.780 | Can I, you know, some people have that,
01:12:18.780 | I haven't gone through that process yet,
01:12:20.700 | but some people have that with drinking.
01:12:22.380 | - Yeah.
01:12:23.220 | - Can I have fun without drinking?
01:12:25.340 | I think the answer to that is yes,
01:12:26.660 | but I'm still drinking.
01:12:27.740 | (laughing)
01:12:28.820 | So that's a big shift, for example,
01:12:31.460 | if they ban drinking at certain places.
01:12:34.660 | And there's a lot of negative things
01:12:36.260 | to say about alcohol.
01:12:37.260 | - Well, I'm older than you, and I remember when,
01:12:40.820 | and I think you weren't even in this country then,
01:12:43.180 | but there was something called
01:12:44.340 | Mothers Against Drunk Driving.
01:12:47.100 | And if you look at movies from the '50s, '60s, even '70s,
01:12:50.740 | being drunk was just kind of a funny thing.
01:12:53.180 | - Yeah.
01:12:54.020 | - And you would drive drunk, what's the big deal, really?
01:12:57.220 | And Mothers Against Drunk Driving
01:12:59.060 | really denormalized drinking and driving,
01:13:04.060 | much like seatbelts.
01:13:05.340 | When I was a kid, you know, there were no seatbelts.
01:13:07.540 | You'd just lie in the back of the car,
01:13:09.180 | and you drove out west with your parents,
01:13:11.580 | and you'd lie flat, and it was wonderful.
01:13:14.200 | Seatbelts come along, and now it's pretty normalized
01:13:16.740 | that you buckle up.
01:13:17.980 | It's pretty normalized that you don't drink and drive.
01:13:19.780 | And so the moment you identify
01:13:21.460 | is absolutely crucially important.
01:13:23.300 | A lot of it started in California,
01:13:25.660 | where there were bans on cigarettes.
01:13:29.420 | Some of it actually started in the computer industry,
01:13:31.860 | 'cause some of the early bugs that were found on tapes
01:13:34.500 | in the '70s were caused by smoke.
01:13:37.540 | And some of the earliest indoor smoking bans
01:13:40.560 | were actually in computer rooms,
01:13:42.180 | which were supposed to be clean enough
01:13:44.180 | that the tapes could spin
01:13:45.620 | without getting caught by some snag of soot.
01:13:49.060 | And the workers started saying, "Wait a minute.
01:13:51.460 | "If the smoke can hurt the tapes,
01:13:54.940 | "can it hurt my lungs as well?"
01:13:56.840 | And so some of these early laws,
01:13:58.240 | already in the late '70s, early '80s, started pushing it out.
01:14:01.340 | It was a huge struggle.
01:14:02.900 | The tobacco industry marshaled an army of experts
01:14:06.580 | to say that secondhand smoke
01:14:08.100 | was an entirely different kind of smoke.
01:14:09.880 | It can't hurt you.
01:14:10.920 | They eventually lost that battle.
01:14:13.800 | And now we have so-called smoke-free laws,
01:14:16.500 | where you can't smoke in most workplaces
01:14:20.020 | and most restaurants.
01:14:21.420 | And that denormalization has been crucial,
01:14:24.580 | because remember Aristotle says,
01:14:26.460 | "Tell me who you walk with, and I'll tell you who you are."
01:14:30.500 | And if your friends are smoking,
01:14:32.660 | if your friends are doing whatever, it makes it easier.
01:14:36.580 | The tobacco industry has been a genius
01:14:39.540 | at manipulating and really creating
01:14:41.580 | the material culture of the modern world.
01:14:44.100 | If your shirt has a pocket, that's to fit cigarettes.
01:14:48.600 | If your car has a plug-in, that's because
01:14:52.400 | every car that I used to have had a cigarette lighter.
01:14:57.060 | It had an ashtray.
01:14:58.700 | Every plane that I flew when I was a kid,
01:15:00.900 | when I was younger anyway,
01:15:02.860 | there was smoking on it originally.
01:15:05.280 | And then there were ashtrays.
01:15:07.860 | And even today, every plane by law
01:15:11.500 | has to have ashtrays in the bathrooms,
01:15:13.540 | 'cause people still smoke in the planes.
01:15:16.740 | There's a special technique they have
01:15:18.340 | where they go in and light up their cigarette
01:15:21.300 | and put their mouth right down in the middle of the toilet
01:15:23.500 | and then flush it right at that same moment.
01:15:26.260 | And that's why there--
01:15:27.100 | - Let's take a good big puff.
01:15:29.100 | - Take a good big puff and flush it.
01:15:31.740 | And to prevent people from bringing down the plane
01:15:34.080 | by putting the cigarette out in the trash,
01:15:36.860 | every plane must have ashtrays.
01:15:39.140 | So that tells you something about the power of addiction,
01:15:41.220 | the power of normalcy.
01:15:42.580 | And it's related to your question of this crucial moment.
01:15:47.780 | If you can no longer smoke in a bar,
01:15:50.100 | if you can no longer smoke.
01:15:51.020 | And by the way, that's different from drinking.
01:15:53.860 | Most people who smoke wish they didn't.
01:15:58.800 | Most people who drink, that's not true.
01:16:00.840 | Most people who drink, they don't wish.
01:16:02.380 | There are some addicts, you know, 5% we say.
01:16:04.900 | But you're talking about 70, 80, 90% of people
01:16:09.180 | who smoke cigarettes regularly wish they did not.
01:16:11.740 | And actually, where I learned about
01:16:13.740 | the idea that we could get rid of cigarettes entirely
01:16:18.060 | was just from talking to ordinary smokers.
01:16:20.020 | Those are the people who are willing to say,
01:16:23.860 | you know, let's take cigarettes
01:16:27.180 | and get rid of them altogether,
01:16:28.180 | 'cause it's not a recreational drug.
01:16:29.900 | It's very different from alcohol.
01:16:31.660 | And the genius of the tobacco industry
01:16:34.860 | has been to trivialize addiction
01:16:38.380 | into just something we all like.
01:16:40.580 | It's addictive, I like it.
01:16:41.940 | And also to say that basically smoking is like drinking,
01:16:46.460 | which in fact it's not.
01:16:48.380 | Alcohol tends to be a recreational drug
01:16:51.100 | and cigarettes are more like heroin.
01:16:53.840 | - So how do we get that 200 billion down closer to zero?
01:17:00.780 | - Well, the good news, and I know you like good news,
01:17:03.660 | and I do too, is that every year
01:17:08.280 | we have about eight billion fewer cigarettes smoked
01:17:12.020 | in the United States.
01:17:12.860 | So we're going in the right direction.
01:17:15.980 | We're going to solve this.
01:17:16.940 | You know, not every problem
01:17:18.780 | in the world can be solved.
01:17:19.980 | This is a very solvable problem.
01:17:22.280 | It's an enormous problem,
01:17:23.940 | arguably as big as COVID in certain respects.
01:17:26.900 | Much more invisible than COVID,
01:17:31.180 | but very solvable and actually will be solved
01:17:34.100 | probably because of climate change
01:17:36.700 | because we're going to need to find ways
01:17:38.400 | to reduce carbon footprints across the board.
01:17:42.020 | And that's going to be a kind of cultural revolution
01:17:46.660 | of sorts once we have a category six hurricane
01:17:50.000 | and hundreds of thousands of people start dying
01:17:53.340 | from the storms that are coming.
01:17:56.260 | But it's like that metaphor of,
01:17:59.260 | there's a sci-fi film from 1950
01:18:03.400 | where they're trying to get back to Earth from the moon
01:18:08.400 | and they have to jettison their toolbox
01:18:12.000 | and their ladder and this and this and this.
01:18:13.540 | That's sort of, I think, the world we're going to be in.
01:18:15.580 | We're going to have to jettison a lot of things
01:18:17.240 | and cigarettes will be one of the things we can get rid of.
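Taking the two US figures quoted in this answer at face value, a minimal linear extrapolation, assuming the decline rate stays constant, which it may not:

```python
# Linear extrapolation from figures quoted in this conversation:
# ~240 billion US cigarettes/year now, falling by ~8 billion/year.
# Assumes a constant rate of decline, which is a simplification.
us_cigarettes_per_year = 240e9
annual_decline = 8e9

years_to_zero = us_cigarettes_per_year / annual_decline
print(f"~{years_to_zero:.0f} years to zero at the current rate")  # ~30
```

So "very solvable" here means roughly three decades away even on a straight-line reading of the current trend.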
01:18:21.260 | - Let's come back to Nazi Germany for a time.
01:18:23.880 | You also wrote the book titled "The Nazi War on Cancer."
01:18:27.860 | - Right.
01:18:28.700 | - What is the main storyline and thesis of this book?
01:18:32.380 | - Well, I had been researching Nazi medicine.
01:18:37.380 | I went over to Germany.
01:18:40.660 | I didn't know what I wanted to do.
01:18:41.860 | I got a Fulbright.
01:18:43.020 | I originally wanted to go to Russia.
01:18:45.700 | Went to Germany partly 'cause my girlfriend
01:18:47.940 | was going there, Londa Schiebinger.
01:18:50.380 | And I was quick with the language.
01:18:53.980 | And my old landlady was born in 1900
01:19:01.700 | and I was renting a room, a tiny room in Berlin.
01:19:05.660 | And she told me, she'd been a nurse in World War I
01:19:08.860 | and told me how sad it was that all the mentally ill
01:19:11.620 | had died in that war
01:19:17.140 | and that how the same thing happened in World War II.
01:19:20.160 | And she told me about how sad it was
01:19:22.900 | that she'd never gotten married
01:19:24.140 | 'cause there were no German men around after World War I.
01:19:29.180 | But I also started taking classes in Germany.
01:19:34.140 | And at that time, there were still a few old Nazi professors
01:19:38.700 | just about to retire, very, very old.
01:19:42.380 | And I remember there was one guy who would talk about
01:19:45.020 | the impact on ovaries of women exposed to stress
01:19:49.460 | and how this would damage their ovaries
01:19:51.500 | and that this was like people who had been told
01:19:56.260 | they were about to be executed
01:19:57.660 | and they would do a before and an after on these ovaries.
01:19:59.820 | One of these horrific experiments.
01:20:01.380 | This was a physician in Berlin.
01:20:04.540 | And so I got involved with a group of people
01:20:07.840 | and really as a kind of intellectual garlic
01:20:10.280 | for living in Berlin.
01:20:13.180 | This is in 1980, '81.
01:20:17.140 | I started reading medical journals from the Nazi period
01:20:22.060 | and even the librarians didn't like that.
01:20:24.580 | I remember the Preussische Staatsbibliothek
01:20:27.060 | in downtown Berlin.
01:20:28.460 | They're like, "Why do you wanna,
01:20:30.580 | "you're not supposed to be reading these old Nazi journals."
01:20:33.640 | And these are just medical journals,
01:20:34.620 | hundreds and hundreds of journals.
01:20:37.260 | And I just read them and read them and read them
01:20:39.220 | and read them and looking for details.
01:20:40.980 | I'd find like a veterinary medicine journal
01:20:45.260 | that would have a joking section where they'd say,
01:20:48.000 | "Oh, we found a cow with a swastika on his forehead,
01:20:50.780 | "a natural black swastika."
01:20:52.940 | Isn't that funny?
01:20:54.740 | Or I'd find stories about tobacco.
01:20:56.980 | I'd find stories about abortion.
01:20:59.440 | I'd find stories about excluding Jewish medicine
01:21:04.220 | or Jews from medicine or who's been promoted,
01:21:07.100 | who's been demoted, who's been Nazified.
01:21:10.140 | I discovered there was an entire Nazi Physicians League
01:21:15.260 | that was just the top Nazi,
01:21:17.620 | the most Nazi of the physicians.
01:21:20.460 | I discovered that physicians joined the Nazi party
01:21:24.220 | in a higher proportion than any other profession.
01:21:26.860 | That they joined the SS in a higher proportion
01:21:29.380 | than any other profession.
01:21:30.660 | - Why is that?
01:21:31.500 | Do you have a sense?
01:21:32.700 | - Because the Nazi regime is a kind of sanitary utopia.
01:21:36.740 | It was to create this purified world
01:21:39.420 | that would control the mind and fertility.
01:21:44.020 | So gynecologists and psychiatrists were the top.
01:21:49.020 | They were the most Nazified
01:21:50.780 | of the various medical professions.
01:21:53.420 | Control the body through sterilization, abortion.
01:21:56.940 | Control the mind through psychiatry.
01:21:59.620 | They killed a lot of the mentally ill.
01:22:01.820 | And you can read their professional journals.
01:22:04.660 | And I'm not sure these had ever been read since.
01:22:08.940 | I also went to East Germany,
01:22:10.220 | 'cause remember this is way before the wall fell.
01:22:13.340 | And they had a very special collection of taboo literature.
01:22:18.340 | It's kind of your point about should Mein Kampf be read?
01:22:22.440 | Well, of course, East Germany, nowhere close.
01:22:24.940 | And so, but not only that,
01:22:26.580 | Time Magazine couldn't be read.
01:22:28.140 | And Newsweek couldn't be read.
01:22:29.700 | And this file, this chamber
01:22:33.820 | that the foreign scholars were allowed to look through
01:22:35.620 | had all of the old Nazi literature
01:22:37.620 | and Nazi scientific literature.
01:22:39.380 | And Time Magazine and Newsweek
01:22:41.260 | and a whole pornography section as well.
01:22:43.660 | So all of the taboo topics.
01:22:46.140 | So here I'm researching in the West,
01:22:49.220 | I'm researching these topics
01:22:50.620 | the librarians didn't even want me to look at in the East.
01:22:53.760 | I was sort of going over there.
01:22:56.020 | I would hitchhike over there
01:22:57.620 | and overstay my welcome and things like that.
01:23:01.240 | But in any event, I noticed that there was this kind of taboo
01:23:06.240 | of talking about the big eugenics.
01:23:09.560 | I'd already been as a kind of a radical graduate student
01:23:13.680 | at Harvard working with all the Marxist biologists there.
01:23:17.420 | We'd already had a critique of eugenics
01:23:19.740 | and women being excluded from science
01:23:22.680 | and South African apartheid was a big deal
01:23:26.080 | and Arthur Jensen's claim that blacks have lower IQs.
01:23:31.080 | And so there was a whole nest of controversial hot topics
01:23:36.140 | around sociobiology, around race and IQ,
01:23:40.120 | around women and scholarship and so forth.
01:23:43.240 | But we weren't looking at Nazi medicine.
01:23:45.240 | So I thought I'll look at the big eugenics,
01:23:49.200 | not just this smaller stuff.
01:23:51.200 | Only 50,000 people are sterilized in California,
01:23:55.560 | but there were huge numbers sterilized in Nazi Germany.
01:24:01.180 | So the more I looked into that,
01:24:04.120 | I realized there was a book there,
01:24:06.240 | but I had also started noticing this other weird stuff.
01:24:09.800 | Why were they anti-tobacco?
01:24:11.400 | Why did they recognize,
01:24:13.500 | why were the Nazis the first to recognize asbestos
01:24:17.280 | as causing mesothelioma?
01:24:19.720 | Why did they try to ban food dyes?
01:24:23.640 | Why did they, why are they the first culture in the world
01:24:27.560 | to encourage women to do breast self-exams?
01:24:30.440 | I told my mom this and she told me that in the 50s,
01:24:34.440 | women weren't even supposed to touch their breasts in Texas.
01:24:38.320 | And here in Nazi Germany,
01:24:39.640 | you've got these mandatory breast self-exams
01:24:42.480 | way before this was done in the United States.
01:24:46.720 | You had the first laws banning the X-raying of pregnant women
01:24:51.000 | already in the early 1930s.
01:24:54.920 | It was still standard medical practice elsewhere.
01:24:56.820 | They recognized that this could harm the fetus,
01:24:59.320 | harm the race, way before radiation was recognized
01:25:04.120 | as a hazard in England or America.
01:25:07.560 | I had started noticing these things
01:25:09.120 | and I have an eye for oddities.
01:25:12.440 | I like the weird, the contradictory, that which doesn't fit.
01:25:16.760 | And I remember finding a German magazine,
01:25:20.240 | a newspaper actually from 1919
01:25:23.280 | that talked about a Holocaust of six million Jews
01:25:26.760 | using that language.
01:25:27.860 | How could this be?
01:25:29.800 | You know, and I researched it.
01:25:31.200 | I thought it wasn't even real.
01:25:32.480 | And so I went and actually got the original newspaper
01:25:34.440 | and there it was.
01:25:35.760 | It's just one of those oddities of life that just happens.
01:25:40.760 | Just weird stuff happens, right?
01:25:44.240 | - That's the source of conspiracy theories, right?
01:25:46.320 | - Exactly.
01:25:47.240 | - So weird stuff happens, but you know,
01:25:48.800 | there's an inkling that that couldn't have been written
01:25:53.320 | in another time in history.
01:25:54.680 | Or it's much less likely for that little coincidence
01:25:57.000 | to have happened in another time.
01:25:58.480 | So it has some kind of resonance with something
01:26:03.480 | that captures something deep to the culture.
01:26:05.920 | - Yeah, and so I'm interested in probing.
01:26:08.240 | I mean, history is about seeing the universal
01:26:10.560 | through the particular in a way.
01:26:12.480 | So you look for the weird particular
01:26:14.200 | and then pull at that string
01:26:15.360 | to see if there's something there.
01:26:16.520 | - Is it that weird?
01:26:17.800 | You know, I did a project I never published on
01:26:20.920 | what I call pseudo-swastikas,
01:26:22.920 | which is a lot of companies in Nazi Germany
01:26:25.880 | made logos that look pretty much like a swastika.
01:26:29.480 | You start looking at them.
01:26:31.160 | They're disturbingly like a swastika.
01:26:34.200 | And I call those pseudo-swastikas.
01:26:35.640 | It's one of the many things I've filed away.
01:26:37.280 | It'd be a great project just to write it.
01:26:38.880 | How did this kind of visual iconography work?
01:26:42.560 | You know, you weren't supposed to do that.
01:26:43.720 | You weren't supposed to sell your, you know,
01:26:45.120 | bathroom cream with a swastika on it.
01:26:48.040 | - Yeah.
01:26:48.880 | - You know, so they would do these little things
01:26:50.880 | that look pretty much like a swastika.
01:26:53.700 | Or I would look at humor.
01:26:55.920 | What are they laughing at?
01:26:57.440 | What are they smiling at?
01:26:59.240 | - I didn't even know Germans had humor.
01:27:02.280 | - Yeah.
01:27:03.760 | - That's a good discovery.
01:27:04.720 | - Oddly enough, even Hitler had a sense of humor.
01:27:07.240 | There's one speech he gives,
01:27:09.160 | which is actually pretty funny,
01:27:10.420 | where he's ridiculing all the 29 tiny political parties.
01:27:15.000 | Oh, there's a this party and a that party.
01:27:17.240 | It's actually kind of funny.
01:27:19.080 | So we do have this, again, this scarecrow image
01:27:21.720 | even of Hitler and his personality and this and that.
01:27:25.560 | But I started noticing that there was this
01:27:30.080 | stuff that looks kind of modern.
01:27:31.920 | Hitler being a vegetarian and trying to limit alcohol
01:27:36.160 | and this and that.
01:27:37.760 | But I'd sort of filed it away.
01:27:40.400 | And then I got a call from the Holocaust Museum.
01:27:42.480 | Would I like to be the first senior scholar in residence
01:27:46.420 | at the Holocaust Museum?
01:27:48.560 | I said, well, I wasn't really working on Nazi stuff
01:27:52.200 | that much anymore, but I did have this idea
01:27:54.600 | maybe looking at how it could be that the Nazis
01:27:57.960 | had the world's most aggressive anti-cancer campaign,
01:28:01.580 | which is kind of like an amazing fact.
01:28:03.180 | And I said, it's not exactly about the Holocaust.
01:28:05.120 | In a way, it's about the opposite.
01:28:06.960 | It's about what was Nazism that it was so seductive
01:28:10.520 | that it could become so powerful
01:28:12.800 | that something like the Holocaust could be possible.
01:28:16.240 | And they said, no, that sounds great.
01:28:17.420 | Do whatever you want.
01:28:18.420 | And so I went down to Washington, D.C.
01:28:21.960 | and helped them build a little bit
01:28:23.880 | some of the racial hygiene exhibits,
01:28:26.780 | some of the push to show the medical aspect
01:28:29.780 | of the Holocaust.
01:28:30.800 | And so I ended up writing this book
01:28:33.380 | on the Nazi War on Cancer, which talks about
01:28:36.120 | how right before Hitler's about to invade Poland,
01:28:39.680 | he's talking late into the night about how to cure cancer.
01:28:43.360 | - So for Nazis, racial hygiene encompasses
01:28:48.280 | like way more than we might think.
01:28:51.600 | So it's like purifying in all ways.
01:28:54.360 | And one of the-- - Purifying,
01:28:55.360 | and it's also much more normal and more familiar.
01:28:59.160 | - It's like regular, in regular discussion.
01:29:02.360 | - It's like the famous line that if Nazism
01:29:04.480 | ever comes to Britain, it'll be wearing a bowler hat.
01:29:07.140 | And we create an image of Nazism,
01:29:12.120 | which is this fantasy image.
01:29:14.680 | And they're human beings making these decisions.
01:29:19.680 | - And when it's tied to things like removing cancer,
01:29:24.200 | so you're saying they kind of,
01:29:25.880 | the effort of purification walks alongside
01:29:29.440 | this effort of fighting cancer.
01:29:32.240 | And then the final, the difficult truth here
01:29:36.000 | is that there's a lot of innovation,
01:29:38.860 | leading scientific innovation on fighting cancer.
01:29:44.600 | - It's not a bunch of blind robots following orders.
01:29:49.280 | It's a period of massive innovation.
01:29:53.160 | I mean, they declared the soybeans
01:29:57.380 | to be the official bean of the Third Reich
01:30:00.660 | because they realized how useful soy could be
01:30:03.960 | in protein for the people.
01:30:07.200 | They built a whole car out of soybeans.
01:30:10.520 | They pushed for a whole grain bread,
01:30:13.880 | calling white bread a French revolutionary
01:30:16.500 | capitalist product.
01:30:18.520 | And they're right about whole grain bread.
01:30:21.400 | It's better than, you know, so--
01:30:23.560 | - Allegedly, so far.
01:30:25.420 | So far, that's what we think.
01:30:27.680 | We'll discover eventually that bread
01:30:29.120 | is the thing that's killing us.
01:30:31.440 | Well, by the way, I'm eating mostly meat,
01:30:35.320 | so mostly carnivore, and that's been a discovery for me.
01:30:37.980 | I don't care what, like, I'm not making
01:30:40.500 | a general statement about the population,
01:30:42.140 | but me personally, how I feel,
01:30:43.980 | I like, I've discovered fasting,
01:30:46.180 | so I often, like on days like this,
01:30:48.860 | when it's pretty stressful, I'll eat once a day
01:30:52.420 | and only meat, or mostly meat.
01:30:55.460 | And that's amazing to me,
01:30:58.800 | from a scientific discovery perspective,
01:31:01.240 | that that makes me feel way better.
01:31:03.340 | You know, there's not scientific support
01:31:06.560 | for why it might make you feel better, but I don't care.
01:31:08.820 | The point is, I've done the experiments
01:31:10.500 | on the N of one, and it just makes me feel better.
01:31:13.800 | - Well, I think fasting is way undervalued.
01:31:16.880 | I mean, where do we get the idea
01:31:18.100 | you need three meals a day?
01:31:19.440 | I have a friend at Harvard, and he'll go
01:31:22.780 | seven or eight days periodically without food.
01:31:25.900 | He drinks water, but he considers it
01:31:29.760 | a kind of purification, and you know,
01:31:32.580 | we're in a world where it's too easy to get food.
01:31:37.160 | - Yeah. - Right, we're in a world,
01:31:38.180 | I mean, most animals are living in a sense
01:31:39.880 | that they're on the brink of starvation,
01:31:42.020 | but we have technologies and social conditions
01:31:47.020 | that make it way too easy to find a piece of cake
01:31:52.260 | or a donut, and that's not something we evolved with.
01:31:56.460 | - We've been talking about purification
01:31:58.220 | in a negative context, but you know,
01:31:59.900 | there's appealing ways of minimalism,
01:32:03.500 | of removing things from your life,
01:32:04.940 | of seeking, especially for me being like OCD
01:32:09.340 | and a scientist, I do like this simplification of things,
01:32:14.140 | of this taxonomy of things.
01:32:15.900 | I just recently had storage hacked by ransomware,
01:32:20.900 | these storage devices called QNAP NAS,
01:32:25.020 | and 50 terabytes of data got locked up,
01:32:29.620 | and I can't recover it, so it's lost, but you know,
01:32:32.220 | it was, at first it was a gut punch,
01:32:33.940 | and it really hurts, and a bunch of stuff is gone,
01:32:36.460 | but it's also freeing.
01:32:40.660 | - Yeah, well there's a, my favorite New Yorker cartoon
01:32:44.260 | is where the guy's about to die,
01:32:46.580 | as I say, he's 90 years old, he's got tubes in his nose,
01:32:49.300 | and the very last words are, I wish I'd bought more crap.
01:32:52.780 | (Lex laughing)
01:32:55.340 | - Yep, and that's now in this amazing world,
01:32:58.340 | applies to digital world too.
01:33:00.100 | Like you don't need to store everything,
01:33:01.500 | you just live in the moment, and live for the people--
01:33:03.980 | - Well that's one of my fears of Bitcoin,
01:33:06.020 | is losing your password, and I know a friend,
01:33:09.220 | his son, you know, mined, I don't know how many dozens
01:33:11.820 | of Bitcoins, and lost his password, you know,
01:33:14.900 | and so what can he do?
01:33:16.660 | There's a whole, I think, Silicon Valley episode
01:33:18.580 | about something like that, where the three comma club,
01:33:23.540 | you know, asshole billionaire is trying to find
01:33:26.140 | his old laptop with the password on it.
01:33:29.580 | - Yeah, that's the kind of dread people feel
01:33:31.460 | in the modern age, losing your Bitcoin password.
01:33:34.940 | Or for me, it'd be like last pass password.
01:33:37.380 | It's hilarious, we're funny, funny creatures.
01:33:41.620 | What else can we say outside of cancer,
01:33:45.700 | about medicine, about engineering?
01:33:48.640 | Lessons about medicine, lessons about engineering,
01:33:52.960 | and lessons about sort of applied science in Nazi Germany.
01:33:56.540 | So before we leave the subject, are there some truths
01:34:01.800 | that still resonate with you, that are applicable today?
01:34:06.520 | - Well, you know, historians celebrate contingency,
01:34:09.680 | or at least recognize contingency,
01:34:11.840 | and we always say things didn't have to turn out
01:34:13.920 | the way they did.
01:34:14.840 | Or you can't always foresee what's going to happen.
01:34:19.840 | And there were definitely missteps,
01:34:24.980 | and the potency of that ideology was such that
01:34:29.980 | it trapped a lot of people.
01:34:33.400 | And I guess by the time it becomes essentially
01:34:38.400 | a wartime operation, that becomes very, very dangerous.
01:34:42.640 | When it's, whatever the ideology is,
01:34:45.400 | once it's blended with warfare, that's catastrophic.
01:34:50.400 | One of the things that's ignored,
01:34:53.220 | I'm very interested in things that are ignored.
01:34:55.600 | And one of the things that we ignore now
01:34:57.200 | on something even like the climate catastrophe
01:35:00.280 | is the role of the military.
01:35:02.480 | I mean, there's a huge amount of carbon emissions
01:35:06.160 | from military operations.
01:35:09.400 | - Again, just part of the loop we're not closing.
01:35:13.760 | - Well, military is really interesting
01:35:16.080 | because I'm an AI person, robots,
01:35:20.080 | and most of my work when I was a PhD student
01:35:23.920 | was DARPA and DoD funded.
01:35:26.480 | And I think that's probably true for a lot of science
01:35:30.440 | that's funded, especially engineering
01:35:32.920 | is funded by the military.
01:35:34.980 | And again, I don't, I really wanna be careful
01:35:39.980 | drawing parallels between Nazi Germany and anything else.
01:35:43.940 | But there is a sense in which,
01:35:47.540 | I remember, it hit me
01:35:51.820 | when one of the people close to me during my PhD,
01:35:54.180 | one of the faculty, refused to take funding
01:35:58.940 | from DoD, from DARPA.
01:36:01.500 | That was interesting to me.
01:36:02.900 | I thought, but what's the, I mean, it's not,
01:36:06.680 | like you're not taking a stand against the war,
01:36:11.140 | you just don't wanna take money
01:36:12.800 | from tangentially associated military kind of efforts.
01:36:17.800 | And that little stand, I mean, that had an impact on me.
01:36:21.080 | At least it woke me up to,
01:36:24.320 | like this is something we should be very, very careful with.
01:36:28.980 | For me, artificial intelligence is,
01:36:32.940 | much of the DARPA research on autonomous vehicles
01:36:37.620 | and all kinds of robotics, drones,
01:36:39.660 | I mean, that's pure research.
01:36:41.140 | Some of the biggest discoveries,
01:36:42.700 | like I didn't think of it as military,
01:36:44.380 | I thought of it as engineering and science.
01:36:46.900 | But then when the drums of war start beating,
01:36:51.900 | like say in some future time,
01:36:54.600 | all of that machinery is already there to turn it into:
01:36:59.940 | now Lex is walking around
01:37:02.100 | and working on autonomous drones
01:37:03.620 | that are going to swarm China or whoever,
01:37:08.620 | some terrorist part of the world.
01:37:15.940 | And then all of a sudden,
01:37:16.940 | all my widgets are being used for that.
01:37:19.060 | That's why I've been waking up more and more to,
01:37:22.220 | there's been something released called like the AI report.
01:37:25.100 | Eric Schmidt was one of the co-authors of it,
01:37:27.780 | which is essentially saying that
01:37:29.660 | because China is developing autonomous weapon systems,
01:37:33.100 | US should not ban autonomous weapon systems,
01:37:36.260 | it should also be doing it.
01:37:37.300 | So basically put AI into our weapons of war.
01:37:42.060 | And that escalation, that race is terrifying,
01:37:46.700 | just like all the things you mentioned.
01:37:48.060 | But that particular one for me is close
01:37:49.860 | because now too closely are the ideas of AI and war
01:37:54.860 | are being linked.
01:37:57.980 | - Very much, yeah.
01:37:58.820 | I mean, one of the things I think that is rarely taught
01:38:02.500 | in universities is what would you not do for money?
01:38:07.500 | - Right.
01:38:08.700 | - I mean, in a basic class on machine learning
01:38:12.260 | or even statistics or history,
01:38:15.900 | what would you not do for money?
01:38:17.780 | What should you not do for money?
01:38:19.500 | I have a lot of my own colleagues who work for big tobacco,
01:38:23.980 | you know, carrying water for them in court,
01:38:27.220 | a huge, essentially a mercenary army of historians,
01:38:30.940 | a vast undiagnosed, you know,
01:38:33.380 | essentially a hidden invisible army.
01:38:35.220 | They don't put it on their CVs.
01:38:36.780 | And it's going on the same thing
01:38:38.820 | with a lot of the technical fields.
01:38:41.220 | What wouldn't you do for money?
01:38:43.500 | At Stanford, there used to be secret PhDs,
01:38:46.820 | secret research projects.
01:38:48.060 | That was kicked off campus in 1971
01:38:50.820 | with the whole '60s radicalism.
01:38:54.820 | But nonetheless, individual professors still work
01:38:57.500 | for all kinds of military operations.
01:39:02.140 | We're setting up a new school of sustainability at Stanford,
01:39:05.080 | and it's gonna be pretty much in bed with big oil as well.
01:39:09.860 | Big oil is gonna be funding a lot of that.
01:39:11.960 | You know, what kind of influence?
01:39:15.540 | If they have a seat at the table,
01:39:16.960 | if they're giving money, if they're gifts,
01:39:19.420 | if their names are on certain projects,
01:39:23.380 | what influence is that gonna have?
01:39:25.260 | - This is what really bothered me.
01:39:27.020 | People often don't have integrity
01:39:30.220 | in the way that I hoped they would.
01:39:33.260 | This is one of the things I learned in academia.
01:39:36.260 | I think a lot of people will do a lot for money. You know,
01:39:39.360 | if I give you a million dollars to murder somebody,
01:39:44.300 | I think most people would not.
01:39:45.940 | A billion dollars, that number starts decreasing,
01:39:51.900 | but it's still pretty high,
01:39:54.700 | like I think we would be happy that direct murder
01:39:56.700 | is not being done for money.
01:39:56.700 | But like subtle stuff, just pressures,
01:40:00.780 | and it could be with like, let me buy you a drink,
01:40:03.500 | and just, you know, laugh about stuff, become friends.
01:40:06.780 | That's a subtle pressure.
01:40:08.500 | I'm very upset with how many people would just unknowingly,
01:40:13.500 | like tell themselves a story, ah, what's the harm?
01:40:17.740 | And I see that with, for example,
01:40:19.820 | me personally at MIT, a lot of people I admire,
01:40:23.540 | but a lot of people I still admire, I'm friends of mine.
01:40:26.700 | I mean, for example, in doing autonomous vehicle research,
01:40:30.260 | there's car companies that fund that research.
01:40:32.560 | And the car companies say, no, of course,
01:40:36.300 | we're not going to influence anything.
01:40:38.200 | No, that's just like, you do, it's wide open.
01:40:43.220 | Do whatever you want.
01:40:44.820 | But the fact is, you know, they give millions of dollars,
01:40:48.940 | and I'm disappointed that actually a lot of scientists
01:40:52.740 | in that context are still afraid,
01:40:55.740 | even though legally it says they cannot,
01:40:58.820 | the car company cannot at all influence the research,
01:41:01.540 | they still start leaning slowly towards the ideas
01:41:05.940 | that that company espouses.
01:41:08.180 | And that's a harmless, perhaps, topic versus big tobacco,
01:41:12.220 | but I would argue it harms innovation.
01:41:15.660 | - Yeah, well, it skews innovation.
01:41:18.620 | What happened at Stanford was,
01:41:20.920 | Philip Morris and the other big tobacco companies,
01:41:25.620 | they had a massive denial campaign to deny
01:41:28.140 | that exposure to someone else's smoke could kill you,
01:41:31.420 | when in fact it can, it kills tens of thousands
01:41:33.820 | of Americans every year, still.
01:41:35.640 | They set up an entire conspiracy body
01:41:39.100 | called the Center for Indoor Air Research
01:41:41.020 | and funded hundreds of scientists to basically say,
01:41:45.260 | you know, it's all genetic, if you get cancer,
01:41:47.180 | well, you had it coming, 'cause of your genes,
01:41:50.740 | your ancestry, your hormones, whatever.
01:41:52.960 | Well, that was broken apart through what was called
01:41:58.020 | the Master Settlement Agreement,
01:41:59.300 | but it was rejuvenated and reinvigorated
01:42:04.080 | by something called the Philip Morris
01:42:05.260 | External Research Program,
01:42:06.500 | which continued with the same fax lines and executives,
01:42:10.380 | funding universities like Stanford,
01:42:12.080 | millions and millions of dollars.
01:42:14.300 | And when I came to Stanford,
01:42:15.340 | there were millions and millions of dollars
01:42:17.020 | being given to medical professors by Philip Morris
01:42:20.500 | as part of the Philip Morris External Research Program.
01:42:24.280 | Well, what were they researching?
01:42:26.300 | They're researching genetics, they're researching diet,
01:42:29.980 | anything but cigarettes causing cancer,
01:42:32.260 | and giving the friendly research,
01:42:36.260 | as Philip Morris often called it, a bigger voice.
01:42:41.260 | They got money, they got jobs, you know,
01:42:44.100 | it amplified that as a research tradition.
01:42:47.020 | Remember, there's nothing natural in a university
01:42:49.540 | about how many professors there are
01:42:51.740 | of human origins versus AI.
01:42:55.700 | This is all a political decision
01:42:57.620 | at a very non-democratic institution.
01:43:00.700 | Universities are less democratic than the Vatican.
01:43:04.420 | At least the Pope is elected.
01:43:06.260 | Who elects a president of a university,
01:43:09.540 | or a dean, for that matter?
01:43:12.060 | And so, what happened was I helped launch a campaign
01:43:16.140 | to get Philip Morris off campus.
01:43:18.500 | And people started coming out of the woodwork,
01:43:21.820 | like, well, does this mean I shouldn't be working
01:43:23.500 | for the CIA, does this mean I shouldn't be working
01:43:25.700 | for big oil, it's like, what, you work for big oil?
01:43:29.060 | And our faculty voted against
01:43:31.300 | pushing Philip Morris off campus.
01:43:33.860 | But Philip Morris got bad press from it,
01:43:36.780 | and so they voluntarily withdrew the entire program.
01:43:40.140 | So it was kind of a lesson
01:43:42.620 | in that you can lose a battle, but win a war,
01:43:46.260 | if you're doing the right thing.
01:43:47.580 | And so by standing up, even though our own faculty
01:43:50.180 | wouldn't back us in kicking Philip Morris
01:43:54.060 | out of the medical school,
01:43:55.420 | Philip Morris did a cost-benefit analysis,
01:43:59.180 | found, well, probably really not worth the kudos we get
01:44:02.820 | for embracing Stanford.
01:44:05.460 | So it can have an influence, and in this case,
01:44:08.180 | the influence was simply by rewarding,
01:44:10.460 | giving voice to the people who were blaming cholesterol
01:44:15.420 | rather than cigarettes.
01:44:17.740 | And of course, we know that historically,
01:44:19.460 | the tobacco industry created a lot of these theories,
01:44:23.260 | these alternative theories of what causes heart disease,
01:44:25.400 | that stress causes heart disease, that salt does,
01:44:28.060 | that anything but cigarettes does.
01:44:30.340 | They funded that research to basically skew
01:44:34.820 | the whole research in their direction.
01:44:37.100 | - You edited a book titled Agnotology,
01:44:42.100 | this is an interesting term, so you mentioned it earlier,
01:44:45.220 | The Making and Unmaking of Ignorance,
01:44:47.860 | where you explore the topic of ignorance,
01:44:50.980 | or the authors explore the topic of ignorance
01:44:53.780 | in different applications and different contexts.
01:44:55.980 | Oh, let me ask the ridiculous big philosophical question.
01:45:00.580 | What is the nature of human ignorance?
01:45:03.680 | - Well, the first thing to say is that it's infinite.
01:45:06.180 | (both laughing)
01:45:08.140 | - That quote attributed to Einstein, about infinite stupidity
01:45:10.020 | or something, I forget exactly what it is, yeah.
01:45:12.740 | - The point is that there's probably trillions of planets
01:45:17.740 | in the universe, and we know one,
01:45:19.820 | or really just a tiny piece of one.
01:45:21.860 | But not only that, who are the we?
01:45:23.060 | I mean, we're all born,
01:45:24.360 | we all started as single-celled organisms, right?
01:45:28.540 | When some sperm and some egg get together,
01:45:30.940 | that's certainly ignorant, and then we're ignorant,
01:45:34.860 | each one of us, there's an ontogeny of knowledge,
01:45:38.380 | you could say, but an ontogeny of ignorance as well.
01:45:40.260 | We grow up, we have to learn,
01:45:42.000 | but almost everything that has been known
01:45:45.140 | has been forgotten.
01:45:46.340 | If you think about the names of ordinary people,
01:45:49.420 | and names of the Neanderthal, did they even have names?
01:45:52.820 | Most of the history of the world has been forgotten.
01:45:55.260 | We have a few shreds, a few traces that we try to recover.
01:45:58.020 | History is a kind of resurrection project,
01:46:01.140 | a kind of archeological project,
01:46:02.740 | and a genealogical project,
01:46:04.260 | where we look back and we find traces,
01:46:07.460 | and it's very biased.
01:46:09.440 | I'm interested in empires
01:46:12.260 | that we don't even know anything about,
01:46:14.660 | and there are whole empires that are gone
01:46:17.180 | because they left no written trace.
01:46:19.020 | We know something about Mayan cosmology,
01:46:22.140 | because we've got some of their stelae,
01:46:24.260 | and a few of their codices, four codices,
01:46:27.780 | but we know of the dozens that were burned by Diego de Landa,
01:46:32.300 | the inquisitorial Spanish friar,
01:46:34.980 | who thought these were just heresies, and so burned them.
01:46:38.820 | So that knowledge is all lost.
01:46:40.420 | - You think there's a lot of deep wisdom
01:46:43.940 | about reality that is lost forever?
01:46:46.740 | - Of course, of course.
01:46:48.940 | - That's so sad.
01:46:49.860 | - Well, it is sad, but the human condition is sad.
01:46:52.620 | I mean, but then, if we can study ignorance,
01:46:56.100 | that's also a positive thing.
01:46:57.620 | Agnotology, the study of ignorance,
01:46:59.660 | the study of the cultural production of ignorance.
01:47:02.300 | - Cultural production, sorry to interrupt.
01:47:04.020 | Cultural production of ignorance?
01:47:05.500 | - Yes, yes, you can--
01:47:07.060 | - So ignorance is not just a manifestation
01:47:10.500 | of what it means to be human,
01:47:12.340 | it's also forced back onto you through the culture?
01:47:16.700 | - That's the missing piece
01:47:18.260 | that people don't pay enough attention to.
01:47:20.420 | It's not a natural vacuum we explore,
01:47:23.620 | like some empty cave.
01:47:24.980 | There are factories of ignorance.
01:47:28.220 | The tobacco industry, when they built
01:47:29.940 | their propaganda engines to deny
01:47:32.340 | that cigarettes cause cancer,
01:47:33.980 | they measured exactly how much ignorance
01:47:37.820 | could be created by watching one of their videos.
01:47:41.100 | They would show that watching one of their propaganda videos
01:47:44.620 | in the 1970s produced a 17% increase
01:47:48.780 | in the people not willing to say
01:47:51.820 | that cigarettes cause cancer.
01:47:53.780 | So this is, I call it agnometrics.
01:47:56.460 | They actually measured the success of their propaganda,
01:48:00.300 | and I'm sure this has been done
01:48:02.180 | in marketing and other fields as well.
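To make the "agnometrics" idea concrete, here is a minimal sketch of the measurement being described: quantify a propaganda video's effect as the change in the share of respondents unwilling to say that cigarettes cause cancer. Only the roughly 17% effect size comes from the conversation; the survey counts below are hypothetical.

```python
# A minimal sketch of "agnometrics": measure manufactured doubt as the
# percentage-point change in respondents unwilling to affirm a fact.
# The survey counts here are hypothetical; only the ~17% effect size
# is taken from the conversation.

def doubt_shift(unwilling_before: int, n_before: int,
                unwilling_after: int, n_after: int) -> float:
    """Percentage-point change in the 'not willing to say' share."""
    return 100 * (unwilling_after / n_after - unwilling_before / n_before)

# hypothetical: 30 of 100 unwilling before the video, 47 of 100 after
print(f"{doubt_shift(30, 100, 47, 100):+.0f} points")  # -> +17 points
```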
01:48:06.340 | - That framing of it somehow is terrifying
01:48:11.060 | because it seems like a very effective way
01:48:19.220 | to be scientific about how to sort of
01:48:21.660 | create doubt in the mind.
01:48:25.900 | - Exactly, it's diabolical.
01:48:27.660 | And luckily we have some of the tobacco industry's
01:48:32.580 | own internal documents, the ones that were not destroyed.
01:48:36.580 | We actually know, we have some traces
01:48:38.600 | as to which ones were destroyed,
01:48:40.220 | and we know that the most sensitive were destroyed.
01:48:42.920 | And we know that some of the ones
01:48:44.980 | that were sequestered by whistleblowers
01:48:48.180 | or by disgruntled spouses or whatever,
01:48:51.580 | that those contain the real gems and the truth.
01:48:54.500 | And one of the ones that was leaked already in 1981
01:48:57.780 | was the "Doubt is our product" memo
01:48:59.660 | that we don't just make cigarettes, we make two products.
01:49:02.100 | We make doubt and we make cigarettes.
01:49:04.280 | We make cigarettes, but we can only keep selling cigarettes
01:49:07.340 | so long as we can keep selling ignorance.
01:49:09.560 | And that then becomes a template of sorts
01:49:12.360 | for climate denial and for all kinds of other denial engines
01:49:16.800 | that are produced by the 1500 trade associations
01:49:21.100 | in Washington, D.C.
01:49:23.060 | So this is something new in the research enterprise
01:49:27.940 | of the world.
01:49:28.900 | After World War II, you have this enormous trust
01:49:32.780 | in science, trust in research.
01:49:35.540 | So what could be more effective than big tobacco saying,
01:49:38.260 | look, we're supporting research.
01:49:40.700 | We wanna get at the truth.
01:49:42.300 | We're funding hundreds of millions of dollars of research,
01:49:45.860 | which is exactly what they did.
01:49:48.300 | What they didn't say was it was all an effort
01:49:51.060 | to distract from the truth that cigarettes cause cancer
01:49:55.020 | and a million other diseases too, blindness,
01:49:58.220 | amputation, all kinds of other diseases.
01:50:01.060 | All of that was hidden, covered up
01:50:03.580 | through a distraction process.
01:50:05.340 | Richard Nixon declares war on cancer in 1971.
01:50:09.940 | It's called the War on Cancer.
01:50:11.460 | Cigarettes were excluded, even though cigarettes
01:50:15.100 | cause a third of all cancers, all cancer deaths.
01:50:19.140 | Cigarettes were excluded because the tobacco industry
01:50:21.580 | successfully argued that cigarettes cause cancer
01:50:25.180 | is not a scientific fact, but a political opinion.
01:50:28.100 | Much like the argument that guns don't cause death,
01:50:32.620 | you know, pulling the trigger causes death,
01:50:34.380 | or shooters, or whatever.
01:50:35.580 | In other words, it's all about breaking down
01:50:37.820 | the chain of causation into pieces
01:50:40.980 | that serve your interests.
01:50:43.300 | So it's not that cigarettes cause cancer,
01:50:45.620 | it's maybe the smoking of them at most,
01:50:47.940 | so they're even denying that.
01:50:49.900 | It's the fact you have lungs that cause cancer.
01:50:53.060 | It's blaming the victim, and they had a thousand ways
01:50:56.220 | to blame the victim.
01:50:58.580 | I mean, there's some legitimacy to this line of argument,
01:51:02.020 | which is why it sticks, which is that
01:51:04.060 | the causation of things is hard to figure out.
01:51:06.940 | A lot of the politics of science have to do
01:51:09.460 | with which parts of the causal chain
01:51:13.740 | do you view as real or not real.
01:51:17.380 | - When we say that carbon causes climate change,
01:51:20.220 | well, what causes carbon?
01:51:21.540 | If it's Exxon causing carbon,
01:51:23.260 | is it the person driving the car causing it,
01:51:25.300 | or is it the Republican Party causing that,
01:51:28.140 | or is it the Tea Party causing that,
01:51:30.260 | or is it big tobacco and big oil
01:51:31.940 | controlling the Republican Party, or is it what?
01:51:35.500 | - Is it the Jews controlling the weather,
01:51:37.180 | which is where the conspiracy theories come in,
01:51:39.060 | or the lizards.
01:51:40.300 | So whatever sticks, you try it out,
01:51:42.980 | and if you're a tobacco company,
01:51:44.260 | you're going to actually literally
01:51:46.380 | be scientific about it and try different options.
01:51:49.620 | - The genius of the tobacco conspiracy,
01:51:52.000 | the tobacco denial campaign,
01:51:53.620 | which is born on December 14th, 1953,
01:51:57.260 | we know on an hour by hour basis how it worked,
01:52:01.260 | is to create an alliance between solid research,
01:52:06.260 | or as they called it, impassionate, dispassionate research,
01:52:11.160 | and to tar all of their opponents as fanatical,
01:52:14.460 | emotional, hysterical, political.
01:52:16.980 | - You mentioned Marxism at Harvard
01:52:20.780 | (laughs)
01:52:22.260 | a couple decades ago or something like that.
01:52:24.680 | So 30 years ago, you wrote the book Value-Free Science?:
01:52:28.900 | Purity and Power in Modern Knowledge,
01:52:30.900 | which is interesting, because
01:52:35.940 | what you were describing then
01:52:37.740 | still seems to be a concern for people now.
01:52:40.300 | So you were, I think, referencing more Nazi Germany,
01:52:44.380 | and how social scientists would attack or defend Marxism,
01:52:48.500 | feminism, and other social movements using science.
01:52:52.200 | There's a, you know, depending on who you talk to,
01:52:55.780 | I just spent a day with Jordan Peterson,
01:52:59.020 | you know, there are some arguments
01:53:00.740 | that science is not what's being leveraged
01:53:02.620 | in some parts of the university, which bothers me,
01:53:04.580 | because most of the university,
01:53:06.580 | at least at MIT, is doing engineering,
01:53:08.780 | where ideology doesn't seep in yet,
01:53:13.660 | but the concern they have is that ideology seeps in eventually,
01:53:16.940 | if you let it in at all.
01:53:18.700 | Anyway, I ask all that, do you have some modern concerns
01:53:22.900 | about the seeping in of ideology into academic research
01:53:27.900 | in these social movements, for or against Marxism,
01:53:32.380 | for or against, you know, well, nobody's for racism,
01:53:36.300 | but, you know, on the topic, (laughs)
01:53:38.600 | like anti-racism, all those kinds of
01:53:41.180 | critical race theory things,
01:53:43.060 | and then also on the feminism and gender studies,
01:53:45.140 | and all those kinds of things.
01:53:46.140 | - Yeah, I mean, these have always been in the university.
01:53:49.980 | When people have been most adamant
01:53:53.300 | in saying that science is a neutral, value-free enterprise,
01:53:58.300 | it's times like the 1950s, when there weren't Black people,
01:54:02.700 | and there weren't even women in universities, so,
01:54:05.340 | what I discovered was that value neutrality,
01:54:10.260 | or this ideology that we are value-free,
01:54:13.060 | it really arose as a defensive shield
01:54:17.180 | to prevent greater inclusion,
01:54:20.840 | to prevent, you know, questioning of
01:54:26.420 | the priorities of science, the practice of science,
01:54:29.940 | the nature of science.
01:54:31.180 | Now, we're in a period now, I think,
01:54:33.660 | of a kind of inclusive revolution,
01:54:35.500 | where people are realizing, well,
01:54:38.720 | we can't have, you know, universities
01:54:42.540 | that look only a certain way.
01:54:45.500 | There's probably gonna be, in that omelet making,
01:54:48.540 | you know, there's gonna be a few eggs that get broken.
01:54:51.980 | And I think people may exaggerate the extent
01:54:56.900 | to which that's going on, but it's definitely real.
01:54:58.940 | - So, like cancel culture, all those kinds of things.
01:55:00.980 | - I mean, it's definitely real,
01:55:02.540 | but it's also, in a way, it's also a distraction
01:55:07.980 | from looking at big power in a university.
01:55:12.980 | If big oil is going to control, or at least influence,
01:55:18.480 | the direction of the sustainability school at Stanford,
01:55:23.160 | isn't that a bigger issue than whether we have,
01:55:26.880 | we can't say certain words on campus?
01:55:29.460 | In other words, there's some very interesting
01:55:32.680 | and complex aspects to this.
01:55:36.360 | And the idea that certain words should not be said,
01:55:41.360 | or that certain people should not be invited.
01:55:44.760 | An invitation to a university is always political.
01:55:47.440 | I mean, who do you invite, who do you not invite?
01:55:50.000 | Much as an admissions process is,
01:55:53.160 | if a student is admitted to Stanford,
01:55:54.920 | what that really means is 96% of the applicants
01:55:58.840 | did not get in, they were rejected.
01:56:01.240 | They were canceled from Stanford.
01:56:03.760 | 4% are admitted, they call it an admissions committee,
01:56:06.640 | they should call it a rejection committee.
01:56:09.080 | When we hire someone in my department at Stanford,
01:56:12.760 | we get 300 applications, and maybe we accept one.
01:56:15.940 | It's not a hiring committee, it's a non-hiring committee.
01:56:20.600 | - That sounds like toxic cancel culture,
01:56:22.600 | all these rejections.
01:56:24.440 | Everybody should be accepted.
01:56:26.080 | - In that sense, the essence of meritocracy
01:56:29.080 | is that selection is involved in any hiring decision,
01:56:33.520 | because in a way, when you are hired into a university,
01:56:37.360 | you are hired to control the means of production,
01:56:40.800 | at least part of it.
01:56:42.320 | And this part of the politics of it
01:56:45.120 | is invisible to the undergraduates,
01:56:46.920 | because they are consumers,
01:56:48.240 | and you're free as a consumer to eat whatever you want.
01:56:51.080 | But you're not free to own the means of production
01:56:55.280 | to say what's on the menu.
01:56:56.880 | - And that's where the power is.
01:56:58.520 | You have to ask the question,
01:56:59.440 | where's the power in the university?
01:57:01.400 | - I think that at MIT, the entire administration
01:57:05.440 | should get fired regularly,
01:57:07.040 | and more power put in the hands of faculty and students.
01:57:10.380 | There is an overgrowth that happens,
01:57:14.280 | that it feels like administrators
01:57:17.820 | are more easily influenced by big tobacco than faculty.
01:57:22.720 | And maybe it's me being sort of romantic
01:57:24.720 | about the idea of faculty,
01:57:26.000 | but if you're in the battle doing the research,
01:57:29.860 | I feel like, well, I don't know, I don't know, I don't know.
01:57:34.760 | But it feels like the administration
01:57:38.680 | helps you delude yourself longer.
01:57:43.120 | So it prevents you from waking up.
01:57:46.160 | It's like, no, no, it's okay to take this, it's fine.
01:57:47.840 | Oh, Jeffrey Epstein, it's okay.
01:57:49.440 | And oh, okay, so he went to prison.
01:57:52.040 | Let's just keep it a little bit secret.
01:57:53.360 | It's fine, just keep taking the money.
01:57:54.880 | And I feel like that comes from the administration
01:57:57.200 | more than the faculty.
01:57:58.520 | - Well, there's certainly a cult of celebrity,
01:58:00.740 | a cult of money.
01:58:02.640 | Donors have influence. Remember the whole scandal
01:58:06.680 | about the side-door entrance into universities,
01:58:11.680 | there's always been the front door and the back door,
01:58:16.400 | where the back door is the rich donors,
01:58:19.220 | the kids of the rich donors,
01:58:20.760 | the legacy kids that you still get.
01:58:25.600 | So there are a lot of ways universities get corrupted.
01:58:29.760 | They get corrupted through money,
01:58:30.960 | they get corrupted through influence.
01:58:33.400 | And that should be recognized.
01:58:36.120 | - We're jumping around a little bit,
01:58:39.340 | but I read you also do work on human origins.
01:58:42.640 | So we mentioned this earlier.
01:58:46.080 | Let me ask another big philosophical question.
01:58:48.520 | What's human?
01:58:51.640 | What makes us human?
01:58:52.720 | What is human?
01:58:53.640 | And where did that humanness come from?
01:58:57.400 | - That's exactly the question we need to problematize,
01:59:01.280 | because it's what I call the Gandhi question.
01:59:03.960 | It's like, you know, Gandhi's asked,
01:59:06.880 | what do you think of Western civilization?
01:59:08.880 | And he says, it would be a good idea.
01:59:10.720 | And so when did humans evolve?
01:59:15.520 | Well, not yet.
01:59:18.120 | So we don't talk about, you know,
01:59:21.080 | when humans evolved; we talk about
01:59:24.320 | the rise of modern humanity.
01:59:26.440 | And what's happened in the last 50 or 60 years or so,
01:59:31.080 | which I think is a good thing intellectually,
01:59:32.760 | is that we've smeared out humanness
01:59:36.640 | to mean many different things.
01:59:37.760 | It's not just tool use.
01:59:39.660 | It's not just upright posture.
01:59:41.440 | Upright posture goes back at least 5 million years.
01:59:45.220 | Tool use goes back at least 2 1/2 million years
01:59:47.920 | for stone tools.
01:59:48.920 | But since wasps and chimpanzees use tools,
01:59:52.360 | then it's gotta be even older.
01:59:53.940 | So that's actually one of the things I'm interested is,
01:59:58.480 | how have different notions of what is human
02:00:03.080 | influenced our theories of human origins?
02:00:06.720 | And in particular, there's sort of the problem
02:00:10.480 | of what I call like sodomy in the uncanny valley,
02:00:13.560 | which is, how long ago would you be willing
02:00:15.980 | to date someone, say, someone that existed,
02:00:19.400 | say, 5 million years ago, 10 million years ago,
02:00:21.960 | 3 million years ago?
02:00:22.960 | In other words, when is it--
02:00:24.360 | - A date or a one night stand?
02:00:25.800 | I mean, that's strong. - Either one.
02:00:27.120 | Either one. - All right.
02:00:29.080 | - Let's say, be the mother of your children.
02:00:31.800 | - That's a lot of commitment, but yeah.
02:00:34.700 | - But it's an interesting question
02:00:37.200 | because after World War II,
02:00:39.120 | as a result of Nazism,
02:00:43.840 | no one wanted to be the one to say
02:00:45.720 | that this particular fossil we've just found
02:00:48.760 | was anything less than fully human.
02:00:51.160 | So there's a projection of humanness
02:00:53.320 | arbitrarily back into the past.
02:00:57.460 | So that even these little monkey-like creatures,
02:00:59.840 | ramapithecines, Ramapithecus,
02:01:01.720 | were being declared to have folkways and mores and language,
02:01:05.300 | which is ridiculous.
02:01:07.400 | No one wanted to say that Neanderthals
02:01:10.120 | were anything less than fully human.
02:01:11.660 | So it's a very interesting question.
02:01:14.680 | At what point are they us?
02:01:17.040 | I mean, human origins is very much an identity quest.
02:01:19.960 | It's when did we become us?
02:01:22.360 | Which sort of begs the question, what are we?
02:01:24.120 | Who are we?
02:01:25.040 | - How much of that is the hardware evolution question
02:01:29.360 | versus the software?
02:01:30.520 | Like what, the actual development of society?
02:01:34.720 | Can't you argue that we became human with agriculture?
02:01:39.720 | I mean, can't you argue that we became human
02:01:41.880 | with the Industrial Revolution?
02:01:43.560 | - Well, certainly by then, they are us.
02:01:47.560 | But agriculture is only 12,000 years ago.
02:01:50.760 | That's a blink in the eye, right?
02:01:52.360 | That's yesterday.
02:01:54.240 | It's interesting, prior to the 19th century,
02:01:57.200 | most scholars thought that the pyramids
02:01:59.360 | were at the beginning of time.
02:02:01.160 | Essentially, they were closer to the beginning of time
02:02:03.360 | than they are to us.
02:02:04.200 | Now, it's a blink in the eye.
02:02:05.920 | You know, we use the metaphor of a meter.
02:02:08.000 | You know, the Earth is five billion, so that's a meter.
02:02:12.840 | The natural history of upright humans is five million,
02:02:17.840 | so that would be like one millimeter.
02:02:21.480 | It'd be the thickness of the white of your fingernail.
02:02:24.960 | And then the pyramids are 5,000,
02:02:28.360 | so that's a thousandth of a millimeter, a micron,
02:02:33.360 | which is the amount taken off when you brush
02:02:36.480 | your fingers on your jacket.
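The arithmetic behind the meter metaphor is easy to reproduce; here is a quick sketch, using the round numbers from the conversation rather than precise dates.

```python
# The meter metaphor above, computed: map the Earth's ~5 billion years
# onto one meter and see where other milestones land. Dates are the
# round numbers used in the conversation, not precise figures.

EARTH_AGE_YEARS = 5e9
METERS_PER_YEAR = 1.0 / EARTH_AGE_YEARS  # 1 meter spans all of Earth history

for name, years in [
    ("upright posture (~5 million years)", 5e6),
    ("the pyramids (~5,000 years)", 5e3),
]:
    print(f"{name}: {years * METERS_PER_YEAR:.0e} m")
# -> 1e-03 m (a millimeter) and 1e-06 m (a micron), as described
```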
02:02:40.920 | So there's a natural history of humanity,
02:02:45.920 | and then there's the history of our constituents.
02:02:48.240 | We're all stardust because all of our complex atoms
02:02:52.400 | began in supernovas many billions of years ago.
02:02:56.920 | But upright posture, five million.
02:03:01.580 | Agriculture, only a few thousand years ago.
02:03:06.280 | We cultivated dogs a couple hundred thousand years ago,
02:03:10.300 | so those are paleolithic instruments.
02:03:12.300 | Cats are neolithic instruments
02:03:14.520 | 'cause they're used to kill vermin.
02:03:16.440 | Dogs are used to hunt with us.
02:03:19.460 | But there is what you say, this co-evolution,
02:03:22.960 | our social aspect and our physical aspect.
02:03:26.760 | Even the fact that we have whites of the eyes.
02:03:30.080 | We're the only animal with whites of the eyes.
02:03:32.720 | And the whites of the eyes tell intent.
02:03:37.200 | They tell direction.
02:03:38.280 | They tell interest.
02:03:39.120 | You know, if you look at something,
02:03:41.520 | I can tell what you're looking at
02:03:42.920 | because there's a lateral resolution.
02:03:47.840 | That's recent.
02:03:49.460 | And the people who do reconstruction for museums,
02:03:54.460 | they want to create what I call an ethnographic identity
02:03:57.460 | with the viewer.
02:03:58.380 | And so they fantasize about all these other early hominids,
02:04:03.380 | non-human, pre-human hominids, if that's a word,
02:04:07.800 | as having eyes like us, but they probably didn't.
02:04:10.500 | And they were probably not self-aware,
02:04:16.120 | at least the early ones can't have been self-aware
02:04:18.400 | the way we are insofar as we are.
02:04:20.720 | They may not have spoken.
02:04:23.100 | So I'm interested in basically
02:04:26.500 | when did we become what we think is human?
02:04:30.040 | It's clear that when we start burying the dead
02:04:32.800 | and making jewelry and when we, in a sense,
02:04:37.720 | invent fantasy, when we invent deception,
02:04:41.140 | that's human, that's fully human.
02:04:43.980 | We become human by thinking there's a world
02:04:48.040 | that really is not.
02:04:50.120 | - I mean, that feels like we're starting to operate
02:04:52.480 | in the space of ideas more and more.
02:04:54.600 | So to have deception, to have imagination,
02:04:57.680 | you start to be able to have ideas and share them.
02:05:00.200 | And it feels like the sharing is the thing
02:05:01.880 | that really develops the ideas.
02:05:03.360 | It's not that you come up with ideas alone.
02:05:04.880 | And we become able to sort of understand
02:05:08.180 | what each other is thinking.
02:05:09.520 | Some animals can do this to a certain extent.
02:05:11.600 | Dogs have a certain empathy, but it's limited.
02:05:14.920 | It's highly limited.
02:05:16.320 | - You could probably argue that the dogs
02:05:18.040 | got that from the humans.
02:05:19.400 | - Yeah, I mean, humans and dogs have co-evolved,
02:05:21.720 | have definitely co-evolved 'cause it's over 100,000 years
02:05:25.300 | we've been working together there.
02:05:27.920 | But all our hands have evolved with tools.
02:05:32.040 | And so I'm trying to figure out now
02:05:33.720 | the original purpose of Acheulean hand axes,
02:05:37.040 | the first beautiful tool made by humans,
02:05:39.200 | which were made essentially unchanged for over a million years.
02:05:41.120 | - What kind of axes is this?
02:05:42.160 | - They're called Acheulean hand axes.
02:05:43.880 | They're these beautiful teardrop-shaped objects
02:05:48.040 | that go back 1.5 million years.
02:05:50.200 | - And what's your thought about its possible purposes?
02:05:53.040 | - Well, the most important thing I think is that--
02:05:56.480 | - A jealous husband comes home.
02:05:58.800 | - What's astonishing is that no one knows
02:06:00.600 | what they were used for.
02:06:02.360 | So they may have been maps, they may have been weapons,
02:06:06.840 | they may have been chopping devices,
02:06:09.480 | they may have been sexual displays.
02:06:11.460 | - Like ornaments to display something
02:06:15.160 | versus actual practical--
02:06:16.520 | - Like the peacock's tail, something to attract a mate.
02:06:20.480 | No one really knows, but what's interesting
02:06:22.840 | is how knowing that we're ignorant of those
02:06:25.880 | is itself a form of knowledge.
02:06:27.960 | In other words, a lot of,
02:06:29.340 | this is one reason I'm interested in ignorance,
02:06:30.920 | is that really, to understand something,
02:06:34.160 | and especially to teach something,
02:06:36.200 | you have to know what people don't know.
02:06:38.440 | And that's hard often.
02:06:39.800 | It's very hard to remember what it's like
02:06:42.240 | to not know something once you know it.
02:06:45.600 | Very hard, very hard to do.
02:06:47.080 | But you sort of have to do that
02:06:49.440 | to recreate that moment you can teach.
02:06:54.040 | - Well, one nice thing I like about the internet
02:06:57.560 | is you can look at old tweets of yours
02:07:00.880 | and to be like, okay, for some reason it brings to mind,
02:07:03.960 | like, okay, that's where my mind was.
02:07:06.000 | Another interesting exercise is Google search history.
02:07:09.480 | So I think for everybody, it keeps that history, and
02:07:12.540 | you can look up your own history of what you searched for.
02:07:15.200 | And it's so cool to go back to 2008 or something like that.
02:07:18.760 | Like, oh, okay, I remember where my mind was.
02:07:21.520 | And immediately, actually, it's a nice way to restore
02:07:24.920 | at least an inkling of the ignorance you had,
02:07:28.800 | or have a peek into the ignorance you had
02:07:30.800 | about the world.
02:07:32.260 | And also to discover the things you've forgotten,
02:07:34.560 | the new ignorance you have now.
02:07:35.640 | You say, oh, right, right, I was really concerned
02:07:38.800 | about this and that.
02:07:41.200 | I do think that, as you're saying,
02:07:45.200 | it's both sad and illuminating to think about
02:07:50.200 | that most of what we've known,
02:07:52.400 | even like the deep wisdom, is forgotten
02:07:54.960 | as a human civilization.
02:07:56.380 | - But we create it new all the time as well.
02:08:00.260 | So-- - Right, hopefully,
02:08:02.160 | forgetting is a feature and not just a bug.
02:08:04.240 | - It's like those mice that can't forget,
02:08:06.120 | they go insane, right?
02:08:07.400 | If you imagine all of your memories as present,
02:08:11.200 | that's a recipe for insanity.
02:08:14.460 | You have to forget to learn, right?
02:08:16.960 | Learning is unlearning.
02:08:18.280 | - Which is exactly why I drink now.
02:08:23.640 | And then write some blues songs
02:08:28.360 | about forgetting a broken heart.
02:08:30.900 | Okay, you mentioned amber and stone collecting.
02:08:34.660 | I just have to ask, does that connect to human origins
02:08:37.220 | or just a personal love?
02:08:39.500 | What is it about stone collecting that attracts you?
02:08:41.580 | - Well, scholars tend to be text-oriented.
02:08:44.660 | I tend to think books are overrated.
02:08:46.580 | We evolved without books.
02:08:50.500 | I walk for a couple of hours in the forest every day.
02:08:55.720 | I gather mushrooms and all kinds of things,
02:08:58.300 | just located pieces of the 1953 Resolution airplane crash
02:09:03.300 | outside of Half Moon Bay just a couple days ago.
02:09:09.900 | I like finding things.
02:09:11.220 | - Have you ever found pieces of a crashed UFO?
02:09:14.460 | - Not yet. - Not yet, okay.
02:09:15.940 | All right, let me know, please, if you do.
02:09:17.500 | - But of course, we have other extraterrestrial stuff.
02:09:19.740 | I mean, I collect meteorites, so I'm into that.
02:09:25.480 | And so I'm interested in stone, stone quality.
02:09:28.400 | I grew up in Southern Texas and grew up surrounded
02:09:33.400 | by people who would hunt for stone and gather stone
02:09:38.360 | and cut stone.
02:09:39.200 | I cut stone as well.
02:09:40.200 | I'm a lapidary.
02:09:41.840 | And so I have this interest in the physical qualities
02:09:44.840 | of objects, sometimes called material culture,
02:09:49.040 | but it's just stuff.
02:09:50.860 | And I'm interested to know how different cultures
02:09:54.120 | have manipulated stuff, worked stuff, stone, wood,
02:09:58.000 | things like that.
02:09:58.920 | And also the fantasies people project into it.
02:10:04.200 | So I'm doing a book on all the different ways
02:10:07.280 | different cultures have found different images in stone,
02:10:10.080 | like Roshak tests.
02:10:11.540 | And so in India, they love agates with Hindu temples
02:10:17.260 | in them and altars.
02:10:18.960 | And in America, they like, you know,
02:10:21.880 | three crosses on the mount.
02:10:23.440 | And if you can find a stone with the word Allah in it,
02:10:27.560 | that's beloved in Yemen or Saudi Arabia.
02:10:30.500 | So there's a long history of people projecting fantasy
02:10:34.600 | into stone.
02:10:35.440 | And I'm using that as a kind of a metaphor.
02:10:39.640 | I'm also looking at the rise of hobbies
02:10:41.880 | and amateur stonework and how a lot of our gemologic
02:10:48.520 | techniques were actually invented by amateurs,
02:10:51.480 | which means just lovers, as opposed to professionals.
02:10:53.800 | The amateur is the lover.
02:10:55.800 | And hobbies, I don't know if you know,
02:10:57.880 | but the word hobby comes from a hobbled horse.
02:11:00.800 | And so you would hobble a horse to keep it from running.
02:11:04.160 | That's hobbling it with a stick or a string.
02:11:09.160 | And then kids would ride a hobbled horse for play,
02:11:14.400 | a horse on a stick.
02:11:16.440 | And riding a hobbled horse becomes riding a hobby horse.
02:11:19.600 | And then that becomes a hobby.
02:11:21.600 | And so hobbies become this so-called job you can't lose
02:11:25.220 | in the Great Depression in the 1930s.
02:11:27.840 | And then they explode.
02:11:29.260 | And so when I was a kid, people would collect coins
02:11:33.320 | or stamps or fossils or this or that.
02:11:35.360 | So I'm interested in that collecting passion.
02:11:38.320 | - So it's interesting, the development of hobbies,
02:11:41.760 | 'cause it feels like the future of human civilization
02:11:43.880 | will be very hobby-driven.
02:11:47.280 | Some of the, I often now,
02:11:50.840 | because of this particular little thing I'm doing
02:11:53.160 | with the podcast, I get to interact with photographers
02:11:55.360 | and videographers, and I'm disappointed to find
02:11:59.160 | how many professionals are not very good
02:12:02.320 | and how many hobbyists are very good.
02:12:04.860 | So it's almost--
02:12:07.440 | - Well, if they're amateurs, they're the lovers.
02:12:09.200 | I mean, you can think-- - The lovers, yes.
02:12:10.480 | - That's what that means, from amour.
02:12:12.280 | You're an amateur if you're a lover of the thing.
02:12:14.820 | And you're not in it for the money.
02:12:17.080 | You're in it 'cause you're obsessed.
02:12:19.080 | But as the GDP, as our freedom grows
02:12:24.080 | to sort of financially to be able to have a hobby,
02:12:29.000 | it feels like there'll be more lovers,
02:12:32.080 | more amateurs in the world,
02:12:33.560 | and not just for the artistic pursuits,
02:12:35.500 | but like science, technology development,
02:12:40.500 | building all kinds of technologies, almost as a hobby.
02:12:44.760 | - Yeah.
02:12:45.720 | - You have much more freedom to figure out
02:12:48.080 | what is the thing you love doing.
02:12:50.520 | And actually, over time, you won't even notice,
02:12:55.360 | but it'll start making money.
02:12:56.820 | And yeah, that's really fascinating.
02:12:59.600 | And yeah, it does kind of, I mean,
02:13:04.600 | when did that originate, just the collection?
02:13:08.600 | - Well, it goes through different stages.
02:13:12.360 | People have always gathered the odd thing
02:13:16.560 | to make something else.
02:13:18.920 | But you also get this tradition
02:13:21.460 | of what's called curiosity cabinets,
02:13:24.080 | especially in the Renaissance,
02:13:26.160 | which replaced the kind of treasure chambers
02:13:28.620 | of the ancient sultans or kings or whatever.
02:13:31.480 | And you get these curiosity cabinets
02:13:33.160 | that were often linked with magical practices,
02:13:35.720 | alchemical practices.
02:13:37.440 | People would gather bezoars, or
02:13:41.360 | they would have an alligator hanging from the ceiling,
02:13:44.040 | or they would have a rare shrunken head or whatever.
02:13:47.580 | And that's part of the rise of natural history,
02:13:51.000 | the idea that you taxonomize the world,
02:13:52.920 | you classify the world, you look for the rare object,
02:13:55.740 | the rarity.
02:13:57.520 | And rarity still is a kind of virtue,
02:13:59.520 | like the recent news about trying
02:14:01.960 | to figure out ball lightning.
02:14:03.100 | When I was growing up, ball lightning was the big question.
02:14:06.280 | Does it exist, does it not exist?
02:14:08.600 | And now there's new evidence of how it actually might.
02:14:10.740 | - Wait, what, really?
02:14:11.960 | There's new evidence? - Yeah, yeah, there's--
02:14:13.520 | - I grew up with that.
02:14:15.080 | My dad, when I was young, told me,
02:14:16.840 | I asked him, like, how do I win a Nobel Prize?
02:14:19.160 | He said, "Invent a time machine
02:14:21.600 | "or figure out how ball lightning works."
02:14:23.880 | And so I got really excited, I was like,
02:14:26.440 | damn it, I'm gonna figure out how does ball lightning works.
02:14:28.720 | - It's very interesting from a history of science
02:14:30.660 | point of view because it's so rare
02:14:32.960 | that in a way it doesn't exist.
02:14:34.640 | You can't replicate it, you can't make it,
02:14:37.000 | does it really exist?
02:14:38.360 | It's a little bit like Libyan glass,
02:14:40.080 | another thing I collect is Libyan glass,
02:14:42.100 | which is a tektite, which falls as a result of a meteorite.
02:14:47.100 | A meteorite hits the earth, blasts earth up into space,
02:14:50.600 | it falls back down as a glass, that's called a tektite.
02:14:55.360 | And there's a rare form of it called Libyan glass,
02:14:58.640 | which fell probably around 20 million years ago
02:15:01.760 | and now it works out of the Sahara every now and then.
02:15:04.720 | It was the most valuable stone of antiquity.
02:15:08.040 | The centerpiece of Tutankhamen's breastplate
02:15:11.440 | is made of this beautiful yellow gemstone Libyan glass.
02:15:15.380 | So rarity is something that the hobbyists
02:15:20.120 | have always liked to cherish, the rarity,
02:15:23.240 | the odd, and science has a kind of often aversion,
02:15:27.000 | it has a kind of a love-hate relationship
02:15:29.220 | with rarity and novelty.
02:15:31.280 | Science is often trying to pursue novelty
02:15:33.440 | to make discoveries.
02:15:35.460 | But if you can't replicate it,
02:15:37.240 | it's kind of like, does it really exist?
02:15:39.120 | - Yeah, which is why, I mean, UFOs and aliens
02:15:42.240 | and all those kinds, there's a general aversion to that
02:15:44.340 | 'cause it's like, it's a one-time event.
02:15:46.380 | It's sad because, just like you said,
02:15:52.560 | singular events or rare events
02:15:56.480 | are somehow really inspiring to us.
02:15:59.400 | And so you kind of have to balance that.
02:16:01.280 | Yeah, there's a scientific process,
02:16:03.080 | but you also have to like, it's the thing you find,
02:16:06.360 | the weird, the peculiar, it's like, huh, what is that?
02:16:11.360 | - Even the universe itself, it could be
02:16:14.120 | that the universe begins and then will end,
02:16:20.080 | say in a cold death, and that's it.
02:16:24.280 | I mean, it could be a one-off thing
02:16:25.720 | or it could be one of an infinite many cycles.
02:16:29.200 | And maybe all of the laws of nature
02:16:31.320 | are recreated anew with each cycle.
02:16:34.200 | - Or maybe what we're assuming about the Big Bang,
02:16:38.000 | there's some element of falsity.
02:16:39.520 | Maybe the speed of light is not constant,
02:16:42.640 | but changes over time.
02:16:43.900 | That would throw into question all kinds of theories
02:16:46.080 | about dark matter and dark energy,
02:16:49.480 | and even the age of the universe.
02:16:51.240 | - And to me, there's very likely trillions
02:16:54.640 | of conversations going on like this on other planets.
02:16:59.520 | - Yeah, no doubt.
02:17:01.600 | - Exactly.
02:17:02.760 | - Different kinds of drugs, different communication styles,
02:17:06.320 | different time scales at which life form is
02:17:09.340 | or what life looks like or how life behaves
02:17:12.240 | or what life is, and all those things.
02:17:14.800 | Every time you think about this, it's more and more humbling.
02:17:18.720 | Just this whole fog of ignorance.
02:17:23.520 | - What drives me crazy is wondering
02:17:26.440 | about the beautiful gemstones on other planets.
02:17:29.840 | I call them exo-agates.
02:17:31.540 | They must have unbelievable features and forms
02:17:35.400 | which are unimaginable to us.
02:17:38.040 | 'Cause one thing we do know is that nature is very creative.
02:17:40.760 | I mean, we are the product of nature,
02:17:42.620 | and we seem to be fairly creative.
02:17:44.600 | And so imagine what else nature's created.
02:17:48.980 | But even that's unknown, is how common
02:17:51.280 | is life in the universe?
02:17:52.400 | Is it common or is it rare?
02:17:54.840 | We only have a sample size of one.
02:17:56.640 | It could be quite common or it could be even unique.
02:18:00.520 | - Yeah.
02:18:01.360 | I tend to believe it's everywhere,
02:18:06.840 | except for the fact that we don't even know
02:18:08.880 | how to define what life is.
02:18:10.460 | Like, what is everywhere exactly?
02:18:12.300 | We're talking about, it's very possible
02:18:14.660 | that there's not anywhere in the universe
02:18:19.660 | an organism with two legs and two arms,
02:18:23.420 | with two eyes and mostly hairless,
02:18:26.860 | walking around at this time scale.
02:18:29.620 | But there could be very different kind of other things.
02:18:33.460 | It was interesting, there's some people,
02:18:36.320 | this is not a common belief,
02:18:37.600 | but a friend now named Lee Cronin,
02:18:40.240 | he's a chemist and biologist,
02:18:43.360 | and he believes that if we ran evolution
02:18:45.360 | over and over and over and over on Earth,
02:18:47.240 | you'd get very different.
02:18:48.640 | Not just, you wouldn't just get different organisms,
02:18:51.560 | you'd get very different biology.
02:18:53.360 | - Yeah, that's quite possible, yeah.
02:18:54.960 | - That's a weird thing.
02:18:56.080 | I mean, most people kind of assume,
02:18:57.440 | well, it kind of, it fits to the environment
02:19:00.600 | and you're gonna get similar things,
02:19:02.000 | maybe not humans and so on,
02:19:03.360 | but to get very different biology,
02:19:06.080 | like starting from the bacteria to just how--
02:19:09.400 | - Well, the idea that it would be DNA-based
02:19:12.040 | on some other planet, that seems to me
02:19:14.520 | like saying they're speaking Swahili on some other planet.
02:19:17.200 | I mean, the odds of that particular architecture
02:19:20.280 | I think are infinitesimally small.
02:19:22.280 | - What's the coolest stone you've ever seen?
02:19:26.840 | - Oh my God, there's so many.
02:19:29.000 | - And what defines, is it rarity, is it just raw beauty?
02:19:34.000 | What captivates your--
02:19:36.360 | - I like a storied stone.
02:19:40.160 | I have a very beautiful Fairburn agate,
02:19:43.360 | which has multiple layers,
02:19:45.440 | and there's something I call agate paralysis
02:19:47.840 | because to polish it, you have to go through the layers,
02:19:52.840 | which means you're destroying the layers.
02:19:56.160 | And maybe what should be done is it should be like a movie
02:19:58.520 | where you film the entire process of cutting and polishing
02:20:03.100 | so that it's not dead.
02:20:06.320 | In other words, what was the diamond when it started rough?
02:20:09.160 | The rough diamond is gone,
02:20:10.360 | but if you could sort of do a filmic version
02:20:13.600 | of a cutting process so that the stone would exist
02:20:18.360 | from a pre-polished to a polished state,
02:20:20.600 | all as a kind of NFT or something.
02:20:24.200 | - That should be an NFT, that's right.
02:20:26.160 | - So the other thing I fantasize about
02:20:29.920 | is how pattern recognition technology
02:20:33.160 | will probably in the future allow us to discover
02:20:35.680 | all kinds of amazing stones,
02:20:37.400 | including for example, fossil skulls,
02:20:40.160 | fossil skulls of humans.
02:20:42.280 | Now it's kind of a chance process
02:20:45.040 | that you discover a skull in East Africa,
02:20:47.960 | but why not have a drone moving constantly,
02:20:51.720 | scanning for pattern recognition of human skull,
02:20:55.560 | human teeth, very slowly and then--
02:20:58.040 | - On the surface you mean?
02:20:58.880 | - Just above the surface, just 10 feet above the surface,
02:21:01.480 | 20 feet above the surface.
02:21:02.320 | - No, no, no, sorry, you think you'll be able
02:21:03.920 | to find skulls on the surface?
02:21:05.520 | - Yes, yes, in the middle of a place that no one has looked.
02:21:09.840 | These areas are vast, right?
02:21:11.840 | So it could be found on the surface,
02:21:13.800 | then move to the next layer,
02:21:14.960 | then find it under the surface as well.
02:21:17.480 | There's LIDAR, there's all kinds of ways.
02:21:19.360 | We're finding jungle cities under the Amazon
02:21:24.120 | that people didn't know about.
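As a hedged sketch of how such an automated survey might work in its simplest form (this is my illustration, not anything described in the conversation): slide a reference template over an aerial image tile and flag the placements that correlate most strongly. A real system would use learned detectors rather than a fixed template, and all the data below is synthetic.

```python
# A toy sketch of aerial pattern matching: normalized cross-correlation
# of a small template against a grayscale tile. All data is synthetic.
import numpy as np

def match_template(tile: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Normalized cross-correlation score for every template placement."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-9)
    out = np.zeros((tile.shape[0] - th + 1, tile.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = tile[i:i + th, j:j + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            out[i, j] = (p * t).mean()
    return out

rng = np.random.default_rng(0)
template = rng.random((8, 8))   # stand-in for a "skull-like" signature
tile = rng.random((64, 64))     # stand-in for one aerial image tile
tile[20:28, 30:38] = template   # paste the target somewhere in the tile
scores = match_template(tile, template)
print(np.unravel_index(scores.argmax(), scores.shape))  # -> (20, 30)
```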
02:21:26.160 | - Do you think there's something out there
02:21:28.700 | that would just blow your mind?
02:21:30.520 | - Oh, for sure, for sure.
02:21:33.720 | Yeah, no doubt.
02:21:34.880 | - Oh man, and how much of it is a little bit underground,
02:21:39.280 | right, or how much of it is in the ocean?
02:21:41.160 | - Yeah, I mean here, right here, we are in the Bay Area.
02:21:44.640 | We know that much of the Native American civilization here
02:21:48.600 | was under the bay because 6,000 years ago,
02:21:52.560 | the bay was dry.
02:21:53.880 | It was a river, not a bay.
02:21:55.920 | And so all of those, whatever material, culture,
02:21:59.120 | archaeological traces existed there
02:22:01.480 | are now at least preserved under the water.
02:22:05.500 | So I think we're just beginning to touch the--
02:22:09.080 | - Could be treasure too.
02:22:10.800 | I mean, like literally, like you said,
02:22:12.120 | we lose the wisdom or we lose the knowledge,
02:22:14.620 | but I mean, if there's the pyramids, right,
02:22:19.620 | it's the great wonders of the world,
02:22:23.060 | there might be other wonders that are completely lost.
02:22:25.260 | - Yeah.
02:22:26.100 | - Just--
02:22:26.940 | - I mean, you asked about stones I like.
02:22:29.980 | For example, every now and then,
02:22:34.820 | dinosaurs would eat rocks as gizzard stones,
02:22:39.260 | and then you find them in their guts, in their bones.
02:22:43.140 | Well, every now and then, they would eat a piece
02:22:44.460 | of petrified wood.
02:22:46.220 | So the idea that something was a tree and then stone
02:22:50.220 | and then swallowed by a dinosaur
02:22:52.340 | and ground up in the gizzard and polished
02:22:54.460 | and then left in a, you know.
02:22:55.940 | - Yeah.
02:22:56.780 | - So I like things that have been through dramatic--
02:22:58.820 | - There's a story there.
02:22:59.980 | - There's a story.
02:23:00.820 | - Yeah, I mean, that's, okay, the really fascinating thing,
02:23:04.020 | why seeing Allah or crosses in the stone
02:23:08.120 | is it feels like the stone has wisdom
02:23:12.060 | because it's been through so many generations of humans.
02:23:17.060 | It's like bigger.
02:23:19.860 | - Right.
02:23:20.700 | - It's seen it all.
02:23:21.740 | - It's also the intellectual question
02:23:24.020 | of intelligent design.
02:23:25.460 | In other words, when people say intelligent design,
02:23:27.740 | mostly it's bogus, but there are several interesting examples
02:23:32.500 | of actual intelligent design,
02:23:34.060 | meaning when is a stone the product of artifice
02:23:39.060 | and when is it a geofact produced by nature?
02:23:42.500 | And that was an important discovery in the 19th century.
02:23:46.060 | The zone of percussion, it's called, the percussion zone.
02:23:50.560 | Or how do you know that a signal from out of space
02:23:52.980 | is an intelligent signal?
02:23:54.380 | And as opposed to hydrogen doing something
02:23:57.520 | or some natural thing,
02:23:59.720 | that's the genuine problem of intelligent design.
02:24:02.780 | How do you know if it's pi, maybe if it's E,
02:24:06.220 | if it's some pattern,
02:24:09.200 | how do you know that that's an intelligent signal?
02:24:12.700 | How do you know that an artifact in the ground is one?
02:24:15.740 | We'll see a face in the clouds.
02:24:20.140 | It's called pareidolia.
02:24:21.180 | We have a kind of a built-in ability to see faces
02:24:24.780 | where they really aren't there, right?
02:24:26.820 | That's why kids like clowns.
02:24:28.500 | We've evolved that, so babies evolve it
02:24:31.180 | to recognize their parents and so forth.
02:24:34.060 | But when is it a projection
02:24:36.540 | and when is it really in the stone?
02:24:40.360 | And that was a big question with the rise of fossils.
02:24:43.780 | If you find a curly thing, is that life or is it non-life?
02:24:48.780 | People have made this mistake before.
02:24:51.740 | They'll find a rock on the moon or Mars,
02:24:54.080 | they say, oh, this is a face or whatever.
02:24:56.060 | Well, no, that's just projection, that's pareidolia.
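One crude, hedged illustration of the signal question (mine, not the speaker's): regularity shows up as compressibility, so a repetitive or patterned broadcast compresses far better than noise. Real detection of intelligent signals is far subtler, and this test would miss patterns, like the raw digits of pi, that are structured but statistically random.

```python
# A toy structure detector: patterned data compresses, noise does not.
# This flags repetition, not intelligence; it's only an illustration.
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size; lower means more regularity."""
    return len(zlib.compress(data)) / len(data)

random.seed(0)
noise = bytes(random.randrange(256) for _ in range(10_000))
pattern = b"31415926535897932384" * 500  # a repeating 20-byte block

print(f"noise:   {compression_ratio(noise):.2f}")    # close to 1.0
print(f"pattern: {compression_ratio(pattern):.2f}")  # far below 1.0
```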
02:25:01.060 | - I guess throughout science you have this problem
02:25:03.580 | of signal, just because something is beautiful
02:25:07.020 | doesn't mean it was, I mean, that's not a good signal
02:25:12.100 | to determine if it's intelligent design
02:25:14.700 | or natural evolution or natural design.
02:25:19.700 | Just because you see a stone that just,
02:25:23.460 | the pattern is incredible, how do you know?
02:25:27.020 | - How do you know it's a fossil is one question,
02:25:30.660 | namely the remnants of an organism.
02:25:33.780 | And how do you know if it was manipulated by a human?
02:25:37.140 | This is a big problem in trying to figure out
02:25:41.120 | the oldest art.
02:25:42.980 | If you find scratchings on a bone, is that a tally?
02:25:47.140 | Is it someone marking her menstrual period?
02:25:50.140 | Is it phases of the moon?
02:25:51.820 | Or is it trampling by an antelope?
02:25:55.160 | And that's called the science of taphonomy,
02:25:57.300 | to discern when a marking on a bone or a stone
02:26:01.420 | is in a sense an artifact or a geofact
02:26:04.860 | or an antelope effect.
02:26:07.340 | And it's an intellectually challenging question.
02:26:10.220 | And people wanna fantasize, they'll find a stone
02:26:12.700 | that looks like a carving that's 300,000 years old.
02:26:15.500 | Generally, I think those are just odd stones.
02:26:19.920 | You don't find the explosion of carved stone
02:26:24.420 | until around 60,000 years ago, 50, 60,000 years ago.
02:26:29.340 | There seems to be something that paleoanthropologists
02:26:32.780 | call the creative explosion or the big bang of the mind
02:26:37.420 | that produces a kind of ability to see in the distance,
02:26:42.420 | to identify a shape in an object,
02:26:47.760 | to create a shape in an object that you don't get.
02:26:49.960 | The Neanderthals don't seem to have ever done
02:26:52.740 | what we would call art.
02:26:55.220 | That's a very interesting phenomenon.
02:26:57.560 | But it requires that you have some understanding
02:27:00.740 | when is something art and when is it just,
02:27:03.700 | oh, that's a rock that looks like a face.
02:27:05.880 | - Or some, not necessarily understanding,
02:27:07.700 | but a conception that's mutually agreed upon,
02:27:10.420 | that we're able to, 'cause maybe Neanderthals,
02:27:13.480 | maybe fish have a conception of art.
02:27:15.620 | They just--
02:27:16.460 | - And this also gets back to your question
02:27:19.800 | about professional bias and ideology,
02:27:22.480 | because there's a huge reward for finding the oldest art.
02:27:26.460 | - Yeah, yeah.
02:27:27.820 | - If everyone says it's 50,000 years ago
02:27:30.700 | and you find one that's 300,000 years ago,
02:27:32.940 | that's a huge discovery.
02:27:35.340 | So there's a bias.
02:27:38.720 | And this has been one of the things that's led
02:27:41.580 | to probably the overproliferation
02:27:45.260 | of different species of hominids,
02:27:47.700 | because there's no academic reward
02:27:51.020 | for finding yet another example of someone else's species.
02:27:55.140 | But there's a huge reward if you can find a Lex Fridmanite,
02:28:00.140 | a new fossil you can name after yourself or whatever.
02:28:07.060 | There's a huge professional reward
02:28:08.380 | to be the first at something.
02:28:09.820 | And so those types of professional rewards
02:28:13.180 | also influence science and what kind of science gets done.
02:28:17.860 | - Yeah, so I'm always suspicious of,
02:28:20.480 | and as we should all be,
02:28:22.020 | when you can kind of intuit a financial
02:28:25.220 | and otherwise motivation.
02:28:26.820 | I mean, that's actually often in the modern age
02:28:29.500 | where I'm suspicious of conspiracy theories.
02:28:32.140 | It's not that the logic doesn't make sense
02:28:35.540 | or something like that.
02:28:36.360 | I personally actually just enjoy conspiracy theories.
02:28:39.300 | I've been listening to Flat Earthers
02:28:41.420 | discuss stuff recently.
02:28:42.580 | It's kind of exciting for some reason.
02:28:44.180 | - It's fascinating, yeah.
02:28:45.300 | - It's like, 'cause I consider like, what if it's true?
02:28:48.520 | It's exciting to discover together,
02:28:50.620 | like think through first principles.
02:28:52.540 | Like, what does the world look like?
02:28:54.380 | It's exciting.
02:28:55.220 | I mean, it's the childlike discovery of a new idea.
02:28:58.900 | But the reason I'm skeptical
02:29:00.660 | of a lot of conspiracy theories
02:29:02.060 | is when I see how popular you can get
02:29:06.500 | propagating those conspiracy theories,
02:29:08.280 | how quickly you can form a large movement.
02:29:10.940 | And it's like, hmm.
02:29:12.100 | Like-- - Well, it's such thin evidence.
02:29:14.980 | It's like, if the Loch Ness monster exists, there's just one?
02:29:18.100 | I mean, how does the reproduction work on that?
02:29:21.540 | How do you talk about an animal
02:29:22.780 | that has only one in a population?
02:29:25.180 | It just doesn't, some of the things don't make sense.
02:29:26.940 | - No, but see, this is the logic side.
02:29:28.900 | I don't even go that far.
02:29:31.440 | The fact is, if you say there's a Loch Ness monster,
02:29:33.900 | I just see how quickly the idea spreads in popularity.
02:29:37.260 | It's the people are hungry to discover something new,
02:29:40.100 | just like you mentioned with the hominids.
02:29:43.040 | And I'm very suspicious of where there's like
02:29:46.620 | a strange hunger for ideas,
02:29:49.740 | 'cause then they're less likely to be objective
02:29:53.260 | and rigorous in considering the validity of that idea.
02:29:56.380 | I'm not going to the logic,
02:29:58.260 | 'cause actually, flat Earth is pretty logical.
02:30:01.980 | - Yeah, very logical.
02:30:04.000 | Logic is not the problem.
02:30:05.980 | - Right, but it spreads really quickly.
02:30:08.980 | And once again, with conspiracy theories,
02:30:11.620 | I think it represents,
02:30:13.100 | you have to think about the cause of causes,
02:30:15.500 | or cause of cause of causes, like you talked about,
02:30:18.220 | which is like it represents some deeper fragmenting
02:30:21.940 | of the common humanity,
02:30:25.020 | of the trust in the big community that is science,
02:30:30.020 | in the big community that is government,
02:30:34.740 | all that kind of stuff.
02:30:36.540 | - Well, that's why things like ball lightning are cool,
02:30:38.540 | because it's like, the scientist denied it, but here it is.
02:30:42.700 | - Exactly, exactly.
02:30:44.180 | And that ultimately ends up--
02:30:45.580 | - Everyone said I was insane, but--
02:30:47.620 | - But it's still, you said there's some breakthroughs?
02:30:51.180 | I need to look it up.
02:30:52.020 | That's really cool. - Yeah, go check it out.
02:30:53.420 | It's pretty exciting.
02:30:54.540 | Yeah, there's some new theories of how it actually might--
02:30:57.180 | - Because I think, I mean, there's obviously several ways
02:30:59.460 | to prove that, like one of them is to recreate it
02:31:01.900 | in the lab, which-- - That's the gold standard.
02:31:03.780 | - That's probably very, very difficult.
02:31:06.460 | Just 'cause we're on the topic of rocks,
02:31:08.500 | I don't know if you've heard about this interstellar rock
02:31:11.720 | that flew through our solar system, called 'Oumuamua, forgive me.
02:31:14.100 | - Yes, yes, the cigar-shaped one.
02:31:16.580 | - The cigar-shaped one, as a fan of rocks,
02:31:18.220 | what do you think about that one?
02:31:19.060 | - Well, I think that generally, I mean,
02:31:22.300 | when the people were speculating it might be a spaceship,
02:31:25.800 | I thought, come on, rocks do all kinds of crazy things.
02:31:28.920 | They do a lot more than you realize.
02:31:30.700 | They can do unbelievably cool things.
02:31:33.580 | There are parts of the desert there in Utah
02:31:36.860 | where rocks move and create these long tracks,
02:31:40.420 | and it's, now we know it's from liquefaction
02:31:44.300 | and wind and various other things,
02:31:46.780 | but they're still unbelievably cool.
02:31:48.980 | Rocks can do almost anything.
02:31:51.140 | And so just the fact that one comes
02:31:53.100 | from outside the solar system doesn't mean
02:31:55.900 | it has to be a spaceship.
02:31:57.820 | So, but nonetheless, I thought it was awesome.
02:31:59.940 | I thought it was really, really cool,
02:32:01.940 | and I sort of wish it would happen more often.
02:32:04.940 | - I kinda hope it's trash from another alien civilization.
02:32:08.300 | - That'd be fantastic.
02:32:09.140 | - 'Cause if you're, (laughs)
02:32:11.980 | if humans are any lesson, we produce more trash
02:32:15.700 | than we do intelligent signal. (laughs)
02:32:18.220 | So the first thing to reach other civilizations
02:32:21.180 | I feel like would be our trash, our pollution,
02:32:24.500 | before the intelligent signal reaches them.
02:32:26.700 | You mentioned this interesting term, Russianist.
02:32:30.620 | The things we do for love.
02:32:31.740 | For some reason you went to Germany.
02:32:33.420 | - Yes.
02:32:34.260 | - So you said you're pretty eloquent with German.
02:32:38.700 | - I learned German, yeah.
02:32:40.420 | - Did you ever learn Russian a little bit?
02:32:42.900 | - I did learn Russian, yeah.
02:32:43.900 | I studied actually Russian as an undergraduate
02:32:46.900 | at Indiana University for several years,
02:32:49.860 | and then I wanted to do Russian and Chinese
02:32:54.620 | as a graduate student, 'cause I thought
02:32:56.780 | this is kind of the future.
02:32:58.820 | And Harvard said, "Nope, it has to be French and German."
02:33:02.420 | And so I essentially gave up on my Russian and Chinese.
02:33:07.700 | And the other part of that story is nonetheless,
02:33:11.860 | I wanted to do something with Russian.
02:33:13.500 | I wanted to study how much the Lamarckian ideology
02:33:18.500 | in biology in Russia at the time, the ideology that led
02:33:23.780 | to their distrust of genetics under Stalin,
02:33:27.100 | had to do with the fact that genetics
02:33:29.940 | was being pushed by the Nazis.
02:33:32.040 | And we tend to see those literatures in isolation.
02:33:36.400 | The Nazis were racist; the Russians were environmentalists,
02:33:41.400 | biased by Lamarckian theories of heredity,
02:33:44.860 | and rejected that part of Darwin.
02:33:46.800 | When they're right next to each other
02:33:49.300 | at the very same time, there must be a connection.
02:33:51.060 | And so I started reading into this,
02:33:52.860 | and I actually got a Fulbright to go write a book on this
02:33:57.860 | and canceled all my classes.
02:34:01.820 | I was teaching at the New School
02:34:03.080 | for Social Research at the time.
02:34:05.600 | And I couldn't get a visa into the Soviet Union.
02:34:09.800 | I was just barred admission for doing this project,
02:34:13.580 | looking at how Stalinist science
02:34:16.580 | had this anti-Nazi aspect, which we've overlooked.
02:34:20.580 | - Which year was this?
02:34:21.500 | This was when the Soviet Union was still together.
02:34:23.200 | - Yes, yeah, it was in the late 1980s.
02:34:27.940 | Gorbachev was in power, but it was before '89.
02:34:31.540 | - But still then there was a careful attention to--
02:34:36.540 | - Well, you never know how careful it was,
02:34:39.520 | or what slipped through the cracks,
02:34:41.600 | and you never know when something fails.
02:34:43.600 | You don't always know why it failed,
02:34:45.160 | but I was very disappointed,
02:34:46.800 | and it sort of ended that project.
02:34:49.640 | I didn't have access to the archives on it.
02:34:52.320 | I could have obviously done it later, but you know.
02:34:55.320 | - So there was that curiosity initially,
02:34:57.680 | but then you focused on the Nazi side of--
02:35:01.400 | - Yeah, I mean, the other thing was
02:35:02.940 | I was trying to figure out where to go
02:35:04.820 | for a Fulbright on a different year,
02:35:08.020 | and I wanted to go to China,
02:35:10.780 | and turns out you could only go to Taiwan.
02:35:13.500 | I didn't really want to go to Taiwan,
02:35:14.820 | and it was one in 50 odds of going to Taiwan,
02:35:17.340 | but it was one in three of going to Germany,
02:35:21.580 | and so I ended up going to Germany.
02:35:24.340 | I didn't have any particular interest
02:35:25.820 | in Germany at that time,
02:35:27.900 | but that's what I ended up doing.
02:35:30.700 | So I wrote one book in German, actually,
02:35:34.160 | and I wrote two books on Nazi Germany,
02:35:36.960 | and otherwise I might have been doing
02:35:40.000 | the same thing in Russian or Chinese.
02:35:42.320 | - Yeah, those are--
02:35:43.160 | - In other words, history chooses us
02:35:45.920 | as much as we choose history, right?
02:35:49.320 | - And those are really powerful cultures, right?
02:35:51.880 | Maybe can you comment on the German
02:35:55.160 | and the Russian and the Chinese,
02:35:57.080 | how much language, when you were reading
02:35:59.320 | those medical journals,
02:36:00.720 | how much were you able to understand?
02:36:04.360 | How important is it to understand language deeply
02:36:06.660 | in order to understand the culture?
02:36:08.460 | Did you struggle, and the opposite of that,
02:36:12.240 | did you find the beauty of the moment,
02:36:16.100 | like richly understand the moment
02:36:18.300 | because you had a hold of the language?
02:36:20.740 | - Well, in the Russian or Chinese case, no.
02:36:23.700 | I never got that far with it.
02:36:25.580 | I knew enough, I could read some Russian,
02:36:27.780 | and I could tell there were anthropologists
02:36:31.000 | who were anti-Nazi, and therefore, anti-genetics,
02:36:36.000 | and they saw genetics as essentially Nazi,
02:36:40.420 | and that was enough for me.
02:36:41.640 | I know there's something there,
02:36:43.300 | but I didn't have enough time.
02:36:44.260 | I wasn't allowed to go and actually research it.
02:36:47.220 | In the German case, you never fully know a language.
02:36:52.220 | We don't fully know English.
02:36:53.820 | There's always more to learn.
02:36:56.220 | I'm always learning.
02:36:57.260 | I didn't know the word dun last year, D-U-N.
02:37:00.740 | It means some kind of brown color,
02:37:02.980 | and I'm always finding new words.
02:37:04.680 | The words are near infinite as well, right?
02:37:10.060 | And new combinations.
02:37:11.160 | I've coined several words, too, in my life,
02:37:14.260 | but it did help understanding the humor,
02:37:18.260 | understanding the romance,
02:37:19.740 | and mainly just plowing through
02:37:22.180 | all of these medical journals,
02:37:23.400 | one after another after another.
02:37:24.860 | There's a kind of a voyeuristic aspect
02:37:27.700 | to looking into this lost world.
02:37:29.700 | You're reading texts by people who've died long ago.
02:37:33.260 | - And it's direct, it's not like reading books by famous people.
02:37:37.140 | It's like real people.
02:37:38.740 | - It's real people, and they make mistakes,
02:37:42.020 | and fascinating little stories.
02:37:44.560 | I was looking at how the Nazi tobacco industry
02:37:49.560 | had their own denial campaign,
02:37:53.140 | which was pro-Nazi and pro-tobacco,
02:37:57.020 | even though the Nazi regime was anti-tobacco,
02:38:01.260 | and they developed a lot of these rhetorical tricks
02:38:06.260 | that were later used by the Americans,
02:38:08.500 | like, oh, you can't trust that evidence.
02:38:10.740 | It's merely statistical.
02:38:12.780 | You can't trust the animal experiments
02:38:14.340 | 'cause all it proves is that mice should not smoke,
02:38:17.220 | but I noticed just in passing these remarkable stories,
02:38:22.740 | little hints. There's a report from a Japanese military man
02:38:27.740 | in one of these tobacco industry journals
02:38:32.020 | from the Nazi period,
02:38:35.260 | and they're talking about this brotherhood
02:38:37.300 | of all men through cigarettes,
02:38:40.400 | and the tragedy that the Chinese and the Japanese,
02:38:43.380 | who were fighting each other,
02:38:44.840 | in a way, wanted nothing more than to smoke together,
02:38:49.540 | and the Chinese would sneak up to the Japanese forts
02:38:54.540 | to try to find a Japanese cigarette
02:38:57.360 | that had been thrown away, and they'd be glowing,
02:38:59.980 | and the Japanese knew this,
02:39:02.460 | and they would throw their cigarettes out.
02:39:03.820 | The Chinese would come,
02:39:04.800 | and then the Japanese would kill these Chinese,
02:39:07.620 | and then this guy is poetically lamenting the fact
02:39:10.680 | that even though all they want is a smoke,
02:39:14.100 | they nonetheless end up in the crosshairs and in death,
02:39:16.740 | and so it's just weird, and I'm reading this,
02:39:19.460 | translated from the Japanese into German
02:39:21.980 | in a Nazi tobacco industry newspaper.
02:39:26.500 | I mean, the layers of weirdness
02:39:28.500 | are really fascinating and touching, but--
02:39:31.120 | - And those kinds of brotherhood stories
02:39:36.140 | actually resonated later,
02:39:38.100 | 'cause I mean, that's how I feel about cigarettes.
02:39:39.860 | Some of my favorite moments in early life
02:39:43.100 | are of people connecting over a cigarette.
02:39:46.620 | - Of course.
02:39:47.540 | - And that works.
02:39:50.140 | Those narratives.
02:39:52.140 | - Yeah, it's the movies, right?
02:39:54.060 | The movies, it's called "Meet Cute."
02:39:56.620 | The tobacco industry, when they put cigarettes into a movie,
02:39:59.260 | they put it in right at the moment where boy meets girl.
02:40:03.300 | - Let me ask you just,
02:40:04.180 | in all the research you've done with Nazi Germany,
02:40:07.780 | just for me, from a conversational perspective,
02:40:11.500 | I was listening to a bunch of Holocaust survivors recently
02:40:15.860 | just on YouTube, listening to interviews.
02:40:18.280 | Also listening to Nazi SS soldiers
02:40:22.980 | who are still alive, or were until recently.
02:40:26.600 | Some of them, especially the ones
02:40:29.300 | that deny many aspects of the Holocaust,
02:40:31.780 | it's so interesting to watch,
02:40:34.540 | 'cause they're still... it's so fascinating.
02:40:38.700 | Anyway, in your research,
02:40:41.000 | are there interesting people to talk to
02:40:44.500 | who are still alive, or is that part of history
02:40:46.580 | mostly no longer living, only in the books?
02:40:51.180 | - It is mostly no longer living,
02:40:52.620 | and that's one reason in the 1980s
02:40:55.300 | when I started working on Nazi science,
02:41:00.500 | I really did interview quite a few people,
02:41:02.820 | elderly people,
02:41:02.820 | people who had sort of slipped through the cracks,
02:41:05.740 | maybe even should have been prosecuted.
02:41:08.580 | So few people got prosecuted.
02:41:10.360 | But these were people who had racial theories,
02:41:13.820 | who published on these topics, and they were guarded.
02:41:18.820 | But these were the lives they lived.
02:41:25.300 | And mainly they wanted people
02:41:27.980 | not to be talking too much about this.
02:41:29.860 | So it gets sealed off and walled off.
02:41:32.860 | And that's why reading the medical literature itself
02:41:35.640 | was so much more valuable.
02:41:37.060 | - 'Cause there's no self-censorship, it's just there.
02:41:41.700 | - I'm sure there's some censorship,
02:41:43.540 | but what they said is what they said, and it's immense.
02:41:48.140 | It's immense and largely unread.
02:41:50.480 | As I said, there are hundreds and hundreds
02:41:52.060 | of Nazi medical journals,
02:41:53.500 | and people had not been reading those
02:41:55.620 | before I really started looking at them.
02:41:57.760 | - Given that you studied these really difficult parts
02:42:01.700 | of human history and human nature with big tobacco
02:42:04.580 | and just these mechanisms of manipulation,
02:42:07.060 | what gives you hope about the future?
02:42:09.380 | - Oh, all kinds of things give me hope.
02:42:12.780 | The forest gives me hope, Wikipedia gives me hope,
02:42:17.580 | space exploration gives me hope.
02:42:19.540 | All kinds of things give me hope.
02:42:21.380 | I had this insight the other day.
02:42:22.780 | I walked through all of these giant redwoods,
02:42:26.520 | which were almost all cut.
02:42:28.780 | They're not very far from here,
02:42:29.780 | just half an hour straight west of where we are now,
02:42:33.420 | even up in redwood country.
02:42:35.600 | And I had this idea that they're growing back now,
02:42:46.060 | and every year they add who knows how many cubic miles of wood,
02:42:46.060 | if you count California as a whole.
02:42:48.180 | But not only that, the roots are all old growth,
02:42:51.580 | if you think about it.
02:42:52.860 | These are re-sprouting.
02:42:54.340 | They're not from seeds.
02:42:55.300 | These are re-sprouting,
02:42:56.540 | and so they have this tremendous resource underground
02:43:00.700 | that even the loggers couldn't kill.
02:43:03.060 | And so from these stumps,
02:43:06.860 | you get what are called fairy rings,
02:43:08.540 | which are like five trees coming up in a ring around it,
02:43:12.200 | each one competing to be the successor.
02:43:15.140 | So they've seen this story before,
02:43:18.380 | and they know to re-sprout.
02:43:21.580 | And that I think is a very hopeful thing,
02:43:24.020 | is that the roots are old growth,
02:43:25.700 | and hopefully in 100, 200, 300 years,
02:43:29.900 | though it won't peak until around 1,000 years from now,
02:43:32.220 | you'll get this restoration
02:43:35.060 | of all of this magnificent old growth.
02:43:38.560 | But so many other things give me hope.
02:43:43.020 | We have to have hope,
02:43:44.060 | and I think that if the world is infinite,
02:43:47.740 | there are infinitely many ways for it to be fixed.
02:43:51.720 | I mean, obviously, we have some problems
02:43:53.860 | that need to be fixed, but they're fixable.
02:43:55.940 | - That's really beautifully put.
02:43:57.500 | That is a really hopeful idea that nature,
02:44:01.260 | that life, even human civilization,
02:44:03.660 | is resilient to all the mistakes we make.
02:44:05.700 | So the roots are there.
02:44:06.900 | So it outlives us.
02:44:09.060 | It's patient with our adolescent fuck-ups.
02:44:14.060 | - I mean, we're a thin layer on the crust.
02:44:16.220 | And eventually the Earth will be swallowed by the sun,
02:44:20.020 | and humans will have long gone extinct by then.
02:44:23.200 | But yeah, there's all kinds of grounds for hope.
02:44:27.260 | - So us being a thin layer on the crust,
02:44:32.540 | what do you think is the meaning of this layer?
02:44:34.700 | What's the meaning of human existence?
02:44:36.500 | What's the meaning of life?
02:44:37.780 | - Well, I think it depends who you're talking to.
02:44:39.980 | If you're talking to a raccoon, it might be one thing.
02:44:44.180 | If you're talking to an old growth tree,
02:44:46.860 | it's making sure you're straight up, upright,
02:44:49.220 | and not on a slippery slope.
02:44:50.900 | - What about fish?
02:44:52.460 | - Yeah, fish, I guess they're trying to avoid the hook.
02:44:55.700 | Right, no fish ever, when they take the bait,
02:44:58.780 | no fish in the world has ever said, "I hope I get hooked."
02:45:02.260 | And that's one of the problems with tobacco
02:45:05.100 | is that there's all this bait and people get hooked.
02:45:09.180 | But the fish don't have heads, we have heads.
02:45:12.560 | One of the great innovations in the history of humanity,
02:45:17.140 | going back way pre-human, I mean, is the invention of the head,
02:45:19.660 | the mobile head that turns and sees.
02:45:22.560 | And the fish didn't have that.
02:45:27.140 | They didn't have hands.
02:45:29.080 | The octopus have cool stuff.
02:45:31.060 | - Yeah, it's not all about the head.
02:45:34.100 | - It's not all about the head.
02:45:35.140 | Well, in fact, the octopus, basically,
02:45:36.580 | they've got brains in their fingers.
02:45:38.740 | - And maybe brains is not even that good of an invention
02:45:41.420 | in the long arc of history,
02:45:43.420 | 'cause the fish maybe got it right.
02:45:45.380 | (Lex laughing)
02:45:47.460 | Stay in the ocean.
02:45:48.580 | - Well, of course, we evolved from fish, so.
02:45:51.260 | - Yeah, but we moved on.
02:45:54.340 | Is there a why to this?
02:45:57.220 | Or is it just the way, it's like the current.
02:46:02.220 | It's just like these pockets of interesting complexity
02:46:06.340 | pop up, like Allah showing up on a rock.
02:46:10.500 | This is what human civilization is.
02:46:12.020 | This weird little thing that showed up on a rock.
02:46:14.020 | And then it'll disappear.
02:46:15.500 | - Well, we are probably the most remarkable creation
02:46:17.940 | that nature has ever belched forth.
02:46:22.000 | We're probably the only ones, if you don't count the KT
02:46:26.500 | meteorite that almost destroyed the Earth,
02:46:29.020 | that really have the capacity
02:46:31.100 | to destroy the Earth.
02:46:32.160 | I'm fascinated by the meteorite that wiped out everything
02:46:38.080 | bigger than four feet long,
02:46:39.680 | the Mount Everest-sized meteorite
02:46:42.860 | that hit the Earth 66 million years ago
02:46:45.020 | and destroyed most species in the water and on land.
02:46:50.860 | - There could have been some smart folks around then, too.
02:46:53.940 | - Well, actually, one thing I like to think about
02:46:55.480 | is that 232.3 million years ago,
02:47:00.480 | and 232.4 million years ago, that's 100,000 years.
02:47:06.080 | The tiniest of slivers, maybe a millimeter
02:47:08.820 | in most parts of the Earth.
02:47:10.160 | It's enough time for a species of dinosaur
02:47:13.180 | to become intelligent, build a civilization,
02:47:16.860 | and go extinct with no traces.
02:47:18.640 | And maybe that happened.
02:47:23.020 | Our ignorance can fully engulf the fact that that happened.
02:47:28.020 | (laughs)
02:47:30.560 | Oh, the beautiful self-importance of us humans.
02:47:35.000 | It's easy to forget that multiple intelligent civilizations
02:47:39.280 | could have lived on Earth.
02:47:41.280 | - It's possible and gone extinct,
02:47:43.000 | or even life may have evolved more than once.
02:47:46.680 | Not only that, but proto-life may still exist
02:47:49.440 | and we're not even looking for it.
02:47:50.920 | Some type of clay that became life may still exist.
02:47:55.620 | One thing I like to think about is always,
02:47:58.760 | what is the before time that is now?
02:48:01.920 | Remember lecturing about this right before COVID.
02:48:04.680 | It's sort of like, what is our world now
02:48:07.400 | that we'll say, what was it like to be then before?
02:48:10.040 | - Yeah, before.
02:48:10.880 | - And that's the world we live in.
02:48:12.040 | We live in a before time for something
02:48:13.860 | we really can't predict.
02:48:15.280 | - Probably physical--
02:48:17.160 | - Appendages.
02:48:20.560 | - And being in person, being able to touch each other,
02:48:24.000 | or wanting to touch each other,
02:48:26.240 | versus being in the digital world, right?
02:48:29.400 | This whole idea of the metaverse
02:48:30.800 | and more and more moving into a digital space.
02:48:34.080 | What was it like being born back when most of your life
02:48:38.680 | wasn't on the computer?
02:48:40.640 | - Yeah.
02:48:41.480 | - It's pretty damn good, for the record.
02:48:45.160 | But maybe, I don't know the alternative.
02:48:47.820 | Robert, this was a fascinating conversation.
02:48:49.800 | Thank you for taking us through some dark periods
02:48:54.000 | of human history, but I think they contain
02:48:55.680 | a lot of lessons for today,
02:48:57.560 | that science is often inextricably connected
02:49:02.560 | to our values, to our ethics, to our politics,
02:49:07.080 | and that's something we have to contend with.
02:49:09.160 | So your work is really important,
02:49:10.520 | and thank you for shining a light on it.
02:49:12.360 | - Thank you.
02:49:13.760 | - Thanks for listening to this conversation
02:49:15.280 | with Robert Proctor.
02:49:16.560 | To support this podcast, please check out our sponsors
02:49:19.160 | in the description.
02:49:20.720 | And now, let me leave you with some words from Carl Sagan.
02:49:24.440 | Somewhere, something incredible is waiting to be known.
02:49:28.380 | Thank you for listening, and hope to see you next time.
02:49:32.400 | (upbeat music)
02:49:34.980 | (upbeat music)