
Lisa Feldman Barrett: Counterintuitive Ideas About How the Brain Works | Lex Fridman Podcast #129


Chapters

0:00 Introduction
2:45 Are we alone in the universe?
4:13 Life on Earth
9:05 Collective intelligence of human brains
17:53 Triune brain
24:03 The predicting brain
31:58 How the brain evolved
37:58 Free will
46:51 Is anything real?
59:23 Dreams
65:11 Emotions are human-constructed concepts
90:40 Are women more emotional than men?
99:16 Empathy
130:56 Love
134:50 Mortality
136:26 Meaning of life


00:00:00.000 | The following is a conversation with Lisa Feldman Barrett,
00:00:03.720 | a professor of psychology at Northeastern University
00:00:07.000 | and one of the most brilliant and bold thinkers
00:00:09.720 | and scientists I've ever had the pleasure of speaking with.
00:00:12.620 | She's the author of a book that revolutionized
00:00:15.160 | our understanding of emotion in the brain
00:00:17.300 | called "How Emotions Are Made"
00:00:19.640 | and she's coming out with a new book
00:00:22.000 | called "Seven and a Half Lessons About the Brain"
00:00:25.160 | that you can and should pre-order now.
00:00:29.000 | I got a chance to read it already
00:00:31.160 | and it's one of the best short whirlwind introductions
00:00:34.080 | to the human brain I've ever read.
00:00:36.360 | It comes out on November 17th,
00:00:38.160 | but again, if there's anybody worth supporting, it's Lisa,
00:00:41.300 | so please do pre-order the book now.
00:00:44.240 | Lisa and I agreed to speak once again
00:00:46.480 | around the time of the book release,
00:00:48.640 | especially because we felt that this first conversation
00:00:51.540 | is good to release now since we talk about
00:00:54.120 | the divisive time we're living through
00:00:56.000 | in the United States leading up to the election.
00:00:59.560 | And she gives me a whole new way to think about it
00:01:02.500 | from a neuroscience perspective
00:01:04.360 | that is ultimately inspiring of empathy, compassion,
00:01:08.240 | and love.
00:01:09.440 | Quick mention of each sponsor,
00:01:11.180 | followed by some thoughts related to this episode.
00:01:14.440 | First sponsor is Athletic Greens,
00:01:16.480 | the all-in-one drink that I start every day with
00:01:19.600 | to cover all my nutritional bases
00:01:21.720 | that I don't otherwise get through my diet naturally.
00:01:24.640 | Second is Magic Spoon,
00:01:26.360 | low-carb, keto-friendly, delicious cereal
00:01:28.600 | that I reward myself with after a productive day.
00:01:32.120 | The cocoa flavor is my favorite.
00:01:34.360 | Third sponsor is Cash App,
00:01:36.160 | the app I use to send money to friends for food, drinks,
00:01:40.320 | and unfortunately, for the many bets I have lost to them.
00:01:43.960 | Please check out these sponsors in the description
00:01:46.080 | to get a discount and to support this podcast.
00:01:48.960 | As a side note, let me say that the bold,
00:01:52.440 | first principles way that Lisa approaches
00:01:54.520 | our study of the brain
00:01:56.120 | is something that has inspired me
00:01:57.840 | ever since I learned about her work.
00:01:59.840 | And in fact, I invited her to speak at the AGI series
00:02:04.000 | I organized at MIT several years ago.
00:02:06.900 | But as a little twist, instead of a lecture,
00:02:09.760 | we did a conversation in front of the class.
00:02:12.200 | I think that was one of the early moments
00:02:14.240 | that led me to start this very podcast.
00:02:17.440 | It was scary and gratifying,
00:02:19.640 | which is exactly what life is all about.
00:02:21.800 | And it's kind of funny how life turns
00:02:23.960 | on little moments like these
00:02:25.760 | that at the time don't seem to be
00:02:27.440 | anything out of the ordinary.
00:02:29.400 | If you enjoy this thing, subscribe on YouTube,
00:02:31.700 | review it with Five Stars on Apple Podcast,
00:02:33.960 | follow on Spotify, support on Patreon,
00:02:36.640 | or connect with me on Twitter @lexfridman.
00:02:40.400 | And now, here's my conversation with Lisa Feldman Barrett.
00:02:44.660 | Since we'll talk a lot about the brain today,
00:02:47.920 | do you think, let's ask the craziest question,
00:02:50.300 | do you think there's other intelligent life
00:02:52.120 | out there in the universe?
00:02:53.540 | - Honestly, I've been asking myself lately
00:02:56.360 | if there's intelligent life on this planet.
00:02:58.640 | You know, I have to think probabilities suggest yes,
00:03:04.640 | and also, secretly, I think I just hope that's true.
00:03:09.520 | It would be really, I know scientists
00:03:12.520 | aren't supposed to have hopes and dreams,
00:03:14.200 | but I think it would be really cool.
00:03:17.320 | And I also think it would be really sad
00:03:19.960 | if it wasn't the case.
00:03:21.660 | If we really were alone, that would be,
00:03:23.600 | that would seem profoundly sad, I think.
00:03:29.740 | - So it's exciting to you, not scary?
00:03:31.800 | - Yeah, no, you know, I take a lot of comfort in curiosity.
00:03:37.320 | It's a great resource for dealing with stress.
00:03:42.320 | So I'm learning all about mushrooms and octopuses
00:03:49.900 | and all kinds of stuff.
00:03:52.140 | And so for me, this counts, I think, in the realm of awe.
00:03:56.420 | But also, I think I'm somebody who cultivates awe
00:04:00.220 | deliberately on purpose to feel like a speck.
00:04:03.020 | You know, I find it a relief occasionally.
00:04:07.340 | - To feel small.
00:04:08.180 | - To feel small in a profoundly large
00:04:11.780 | and interesting universe.
00:04:13.820 | - So maybe to dig more technically
00:04:17.060 | on the question of intelligence,
00:04:19.060 | do you think it's difficult for intelligent life
00:04:21.340 | to arise like it did on Earth?
00:04:23.700 | From everything you've written and studied about the brain,
00:04:27.580 | how magical of a thing is it
00:04:30.740 | in terms of the odds it takes to arise?
00:04:33.780 | - Yeah, so, you know, magic is just,
00:04:37.640 | don't get me wrong, I mean,
00:04:39.860 | I like a magic show as much as the next person.
00:04:43.540 | My husband was a magician at one time.
00:04:45.820 | But, you know, magic is just a bunch of stuff
00:04:48.700 | that we don't really understand how it works yet.
00:04:50.780 | So I would say, from what I understand,
00:04:53.600 | there are some major steps in the course of evolution
00:04:58.600 | that at the beginning of life,
00:05:00.900 | the step from single cell to multicellular organisms,
00:05:04.140 | things like that, which are really not known.
00:05:07.440 | I think for me, the question is not so much
00:05:11.180 | could it, you know, what's the likelihood
00:05:16.220 | that it would happen again,
00:05:17.660 | as much as what are the steps and how long would it take?
00:05:22.660 | And if it were to happen again on Earth,
00:05:27.420 | would we end up with the same menu of life forms
00:05:32.420 | that we currently have now?
00:05:34.620 | And I think the answer's probably no, right?
00:05:36.540 | There's just so much about evolution
00:05:38.900 | that is stochastic and driven by chance.
00:05:43.340 | - But the question is whether that menu
00:05:44.780 | would be equally delicious,
00:05:46.340 | meaning like there'd be rich complexity of the kind of,
00:05:50.020 | like would we get dolphins and humans
00:05:52.660 | or whoever else falls in that category
00:05:54.740 | of weirdly intelligent, seemingly intelligent,
00:05:59.260 | however we define that?
00:06:01.300 | - Well, I think that has to be true
00:06:03.320 | if you just look at the range of creatures
00:06:05.940 | who've gone extinct.
00:06:07.060 | I mean, if you look at the range of creatures
00:06:09.740 | that are on the Earth now, it's incredible.
00:06:13.140 | And, you know, it's sort of trite to say that,
00:06:14.860 | but it actually is really incredible.
00:06:17.220 | Particularly, I don't know, I mean,
00:06:21.280 | animals, there are animals that seem really ordinary
00:06:26.300 | until you watch them closely
00:06:27.820 | and then they become miraculous, you know,
00:06:29.500 | like certain types of birds,
00:06:31.020 | which do very miraculous things,
00:06:33.100 | build, you know, bowers and do dances
00:06:38.220 | and all these really funky things
00:06:39.740 | that are hard to explain
00:06:42.180 | with a standard evolutionary story.
00:06:43.760 | Although, you know, people have them.
00:06:46.100 | - Birds are weird.
00:06:46.940 | They do a lot of, for mating purposes.
00:06:49.680 | They have a concept of beauty that I haven't quite,
00:06:53.740 | maybe you know much better,
00:06:54.780 | but it doesn't seem to fit evolutionary arguments well.
00:06:57.980 | - It does fit.
00:06:58.860 | Well, it depends, right?
00:06:59.940 | So I think you're talking about "The Evolution of Beauty,"
00:07:02.320 | the book that was written recently by,
00:07:05.500 | was it Prum, was that his name?
00:07:08.540 | Richard Prum, I think at Yale.
00:07:09.860 | - Oh, interesting, no, I didn't.
00:07:11.340 | - Oh, it's a great book.
00:07:12.180 | It's very controversial though,
00:07:13.300 | because he's making an argument that,
00:07:17.660 | the question about birds and some other animals
00:07:19.900 | is why would they engage in such metabolically costly
00:07:24.500 | displays when it doesn't improve their fitness at all?
00:07:30.060 | And the answer that he gives
00:07:32.100 | is the answer that Darwin gave, which is sexual selection.
00:07:35.500 | Not natural selection, but you know,
00:07:38.440 | selection can occur for all kinds of reasons.
00:07:40.260 | There could be artificial selection,
00:07:41.940 | which is when we breed animals, right?
00:07:44.020 | Which is actually how Darwin,
00:07:46.460 | that observation helped Darwin come to the idea
00:07:49.300 | of natural selection.
00:07:50.540 | - Oh, interesting.
00:07:51.860 | - And then there's sexual selection,
00:07:53.820 | meaning, and the argument that I think his name is Prum
00:07:57.660 | makes is that it's the pleasure,
00:08:03.280 | the selection pressure is the pleasure of female birds.
00:08:06.940 | Which as a woman, and as someone who studies affect,
00:08:10.500 | that's a great answer.
00:08:12.500 | I actually think there probably is natural,
00:08:14.260 | I think there is an aspect of natural selection to it,
00:08:16.340 | which he maybe hasn't considered.
00:08:18.620 | - But you were saying, the reason we brought up birds
00:08:20.860 | is the life we got now seems to be quite incredible.
00:08:24.140 | - Yeah, so you peek into the ocean,
00:08:26.700 | peek into the sky, there are miraculous creatures.
00:08:29.620 | Look at creatures who've gone extinct,
00:08:31.980 | and you know, in science fiction stories,
00:08:35.340 | you couldn't dream up something as interesting.
00:08:37.580 | So my guess is that, you know,
00:08:41.440 | intelligent life evolves in many different ways,
00:08:45.580 | even on this planet.
00:08:47.180 | There isn't one form of intelligence,
00:08:49.020 | there's not one brain that gives you intelligence.
00:08:51.180 | There are lots of different brain structures
00:08:52.820 | that can give you intelligence.
00:08:54.980 | So my guess is that the menagerie might not look
00:08:59.900 | exactly the way that it looks now,
00:09:01.620 | but it would certainly be as interesting.
00:09:04.900 | - But if we look at the human brain versus the brains,
00:09:09.900 | or whatever you call 'em,
00:09:11.580 | the mechanisms of intelligence in our ancestors,
00:09:14.340 | even early ancestors, that you write about,
00:09:16.860 | for example, in your new book,
00:09:18.640 | what's the difference between the fanciest brain we got,
00:09:24.700 | which is the human brain,
00:09:27.260 | and the ancestor brains that it came from?
00:09:31.660 | - Yeah, I think it depends on how far back you wanna go.
00:09:34.600 | - You go all the way back, right, in your book.
00:09:37.820 | (laughing)
00:09:38.700 | So what's the interesting comparison, would you say?
00:09:41.220 | - Well, first of all, I wouldn't say that the human brain
00:09:43.500 | is the fanciest brain we've got.
00:09:45.100 | I mean, an octopus brain is pretty different
00:09:48.140 | and pretty fancy, and they can do some pretty amazing things
00:09:51.200 | that we cannot do.
00:09:52.860 | You know, we can't grow back limbs,
00:09:54.460 | we can't change color and texture,
00:09:56.940 | we can't comport ourselves and squeeze ourselves
00:09:59.860 | into a little crevice.
00:10:01.340 | I mean, these are things that we invent,
00:10:03.180 | these are like superhero abilities
00:10:04.740 | that we invent in stories, right?
00:10:06.260 | We can't do any of those things.
00:10:08.160 | And so the human brain is certainly,
00:10:10.420 | we can certainly do some things that other animals can't do
00:10:14.160 | that seem pretty impressive to us.
00:10:17.860 | But I would say that there are a number of animal brains
00:10:22.340 | which seem pretty impressive to me,
00:10:24.220 | that can do interesting things
00:10:26.100 | and really impressive things that we can't do.
00:10:28.940 | - I mean, with your work on how emotions are made
00:10:31.300 | and so on, you kind of repaint the view of the brain
00:10:35.380 | as less glamorous, I suppose,
00:10:39.540 | than you would otherwise think.
00:10:42.580 | Or like, I guess you draw a thread
00:10:44.580 | that connects all brains together
00:10:47.420 | in terms of homeostasis and all that kind of stuff.
00:10:50.260 | - Yeah, I wouldn't say that the human brain
00:10:53.740 | is any less miraculous than anybody else would say.
00:10:57.260 | I just think that there are other brain structures
00:11:00.240 | which are also miraculous.
00:11:02.260 | And I also think that there are a number of things
00:11:04.260 | about the human brain which we share with other vertebrates,
00:11:09.260 | other animals with backbones,
00:11:12.120 | but that we share these miraculous things.
00:11:16.660 | But we can do some things in abundance.
00:11:19.260 | And we can also do some things with our brains together,
00:11:23.900 | working together that other animals can't do,
00:11:27.260 | or at least we haven't discovered their ability to do it.
00:11:31.360 | - Yeah, this social thing.
00:11:32.960 | I mean, that's one of the things you write about.
00:11:35.400 | How do you make sense of the fact,
00:11:40.560 | like the book "Sapiens" and the fact that we're able
00:11:43.780 | to kind of connect, like network our brains together,
00:11:47.240 | like you write about?
00:11:48.600 | I'll try to stop saying that. (laughs)
00:11:53.320 | Is that like some kind of feature that's built into there?
00:11:58.320 | Is that unique to our human brains?
00:12:00.380 | Like how do you make sense of that?
00:12:02.140 | - What I would say is that our ability to coordinate
00:12:05.660 | with each other is not unique to humans.
00:12:09.340 | There are lots of animals who can do that.
00:12:12.580 | But what we do with that coordination is unique
00:12:20.660 | because of some of the structural features in our brains.
00:12:25.220 | And it's not that other animals don't have
00:12:29.580 | those structural features, it's we have them in abundance.
00:12:33.460 | So, you know, the human brain is not larger
00:12:38.100 | than you would expect it to be for a primate of our size.
00:12:42.660 | If you took a chimpanzee and you grew it
00:12:47.220 | to the size of a human, that chimpanzee would have a brain
00:12:51.660 | that was the size of a human brain.
00:12:53.720 | So there's nothing special about our brain
00:12:56.080 | in terms of its size.
00:12:57.580 | There's nothing special about our brain
00:12:59.760 | in terms of the basic blueprint that builds our brain
00:13:04.760 | from an embryo is the basic blueprint
00:13:09.340 | that builds all mammalian brains
00:13:11.300 | and maybe even all vertebrate brains.
00:13:14.840 | It's just that because of its size,
00:13:18.000 | and particularly because of the size
00:13:19.760 | of the cerebral cortex, which is a part
00:13:23.920 | that people mistakenly attribute to rationality.
00:13:28.920 | - Why mistakenly?
00:13:29.960 | Isn't that where all the clever stuff happens?
00:13:32.200 | - Well, no, it really isn't.
00:13:34.200 | And I will also say that lots of clever stuff happens
00:13:37.020 | in animals who don't have a cerebral cortex.
00:13:39.480 | - Right.
00:13:40.320 | - But because of the size of the cerebral cortex
00:13:44.680 | and because of some of the features
00:13:47.000 | that are enhanced by that size,
00:13:50.480 | that gives us the capacity to do things
00:13:53.280 | like build civilizations and coordinate with each other,
00:13:58.200 | not just to manipulate the physical world,
00:14:02.320 | but to add to it in very profound ways.
00:14:06.660 | Like, you know, other animals can cooperate
00:14:09.160 | with each other and use tools.
00:14:11.700 | We draw a line in the sand and we make countries
00:14:14.900 | and then we create, you know,
00:14:17.400 | we create citizens and immigrants.
00:14:20.900 | - But also ideas.
00:14:22.460 | I mean, the countries are centered
00:14:23.980 | around the concept of, like, ideas.
00:14:26.940 | - Well, what do you think a citizen is and an immigrant?
00:14:29.540 | Those are ideas.
00:14:30.720 | Those are ideas that we impose on reality
00:14:34.460 | and make them real.
00:14:35.420 | And then they have very, very serious and real effects,
00:14:39.340 | physical effects on people.
00:14:40.980 | - What do you think about the idea
00:14:42.860 | that a bunch of people have written about,
00:14:44.380 | Dawkins with memes, which is like ideas are breeding.
00:14:49.020 | Like, we're just like the canvas for ideas to breed
00:14:54.020 | in our brains.
00:14:55.460 | So this kind of network that you talk about of brains
00:14:58.560 | is just a little canvas for ideas
00:15:00.420 | to then compete against each other and so on.
00:15:03.260 | - I think as a rhetorical tool,
00:15:04.740 | it's cool to think that way.
00:15:08.120 | So I think it was Michael Pollan.
00:15:10.860 | I don't remember if it was in "The Botany of Desire,"
00:15:12.980 | but it was in one of his early books
00:15:15.340 | on botany and gardening,
00:15:19.680 | where he wrote about plants,
00:15:22.680 | sort of utilizing humans for their own evolutionary purposes.
00:15:32.700 | But it's kind of interesting.
00:15:33.680 | You can think about a human gut in a sense
00:15:36.560 | as a propagation device for the seeds of tomatoes
00:15:41.560 | and what have you.
00:15:43.780 | So it's kind of cool.
00:15:44.840 | So I think rhetorically, it's an interesting device,
00:15:48.800 | but ideas are, as far as I know,
00:15:52.680 | invented by humans, propagated by humans.
00:15:57.060 | So I don't think they're separate
00:16:00.940 | from human brains in any way,
00:16:02.800 | although it is interesting to think about it that way.
00:16:06.860 | - Well, of course, the ideas that are using your brain
00:16:09.740 | to communicate and write excellent books,
00:16:11.860 | and they basically pick you, Lisa,
00:16:17.220 | as an effective communicator and thereby are winning.
00:16:21.360 | So that's an interesting worldview,
00:16:23.200 | to think that there's particular aspects of your brain
00:16:27.160 | that are conducive to certain sets of ideas,
00:16:31.380 | and maybe those ideas will win out.
00:16:33.440 | - Yeah, I think the way that I would say it really though
00:16:35.520 | is that there are many species of animals
00:16:38.160 | that influence each other's nervous systems,
00:16:40.280 | that regulate each other's nervous systems.
00:16:42.280 | And they mainly do it by physical means.
00:16:44.640 | They do it by chemicals, scent.
00:16:47.600 | They do it by, so termites and ants and bees, for example,
00:16:52.600 | use chemical scents.
00:16:55.280 | Mammals like rodents use scent,
00:16:59.520 | and they also use hearing, audition,
00:17:02.620 | and that little bit of vision.
00:17:04.180 | Primates, non-human primates, add vision, right?
00:17:08.460 | And I think everybody uses touch.
00:17:13.500 | Humans, as far as I know, are the only species
00:17:15.980 | that use ideas and words to regulate each other, right?
00:17:20.780 | I can text something to someone halfway around the world.
00:17:23.900 | - That's fascinating.
00:17:24.740 | - They don't have to hear my voice.
00:17:26.140 | They don't have to see my face,
00:17:27.780 | and I can have an effect on their nervous system.
00:17:30.680 | And ideas, the ideas that we communicate with words,
00:17:34.980 | I mean, words are in a sense a way
00:17:36.700 | for us to do mental telepathy with each other, right?
00:17:39.180 | I mean, I'm not the first person to say that, obviously.
00:17:41.780 | But how do I control your heart rate?
00:17:45.560 | How do I control your breathing?
00:17:46.780 | How do I control your actions?
00:17:49.020 | With words, it's because those words
00:17:51.260 | are communicating ideas.
00:17:54.180 | - So you also write, I think, let's go back to the brain.
00:17:57.900 | You write that Plato gave us the idea
00:17:59.980 | that the human brain has three brains in it, three forces,
00:18:04.980 | which is kind of a compelling notion.
00:18:08.340 | You disagree.
00:18:09.460 | First of all, what are the three parts of the brain,
00:18:12.700 | and why do you disagree?
00:18:15.200 | - So Plato's description of the psyche,
00:18:20.660 | which for the moment we'll just assume
00:18:22.820 | is the same as a mind.
00:18:24.660 | There are some scholars who would say,
00:18:26.500 | a soul, a psyche, a mind,
00:18:28.140 | those aren't actually all the same thing in ancient Greece,
00:18:30.660 | but we'll just for now gloss over that.
00:18:33.980 | So Plato's idea was that,
00:18:36.360 | and it was a description of really about moral behavior
00:18:41.360 | and moral responsibility in humans.
00:18:44.040 | So the idea was that the human psyche
00:18:46.460 | can be described with a metaphor of two horses
00:18:52.300 | and a charioteer.
00:18:53.660 | So one horse for instincts,
00:18:56.560 | like feeding and fighting and fleeing and reproduction.
00:19:02.540 | I'm trying to control my salty language.
00:19:07.000 | Which apparently they print in England.
00:19:11.020 | Like I actually tossed off a--
00:19:13.380 | - F, S?
00:19:15.540 | - Yeah, F, F.
00:19:16.360 | - Okay.
00:19:17.200 | - Yeah.
00:19:18.020 | I was like, you printed that?
00:19:19.100 | I couldn't believe you printed that.
00:19:20.340 | - Without like the stars or whatever?
00:19:21.980 | - Oh no, no, there was full print.
00:19:23.940 | They also printed a B word and it was really quite, yeah.
00:19:28.940 | - We should learn something from England.
00:19:32.260 | - Indeed, anyways.
00:19:33.580 | But instincts, and then the other horse represents emotions
00:19:38.020 | and then the charioteer represents rationality,
00:19:40.160 | which controls the two beasts, right?
00:19:43.720 | And fast forward a couple of millennia
00:19:49.500 | and in the middle of the 20th century,
00:19:54.500 | there was a very popular view of brain evolution,
00:19:58.040 | which suggested that you have this reptilian core,
00:20:03.040 | like a lizard brain, inner lizard brain for instincts.
00:20:08.420 | And then wrapped around that evolved,
00:20:10.980 | layer on top of that evolved a limbic system in mammals.
00:20:15.980 | So the novelty was in a mammalian brain,
00:20:18.740 | which bestowed mammals with, gave them emotions,
00:20:23.420 | the capacity for emotions.
00:20:24.660 | And then on top of that evolved a cerebral cortex,
00:20:29.660 | which in largely in primates, but very large in humans.
00:20:37.580 | And it's not that I personally disagree,
00:20:46.540 | it's that as far back as the 1960s,
00:20:49.500 | but really by the 1970s, it was shown pretty clearly
00:20:54.140 | with evidence from molecular genetics,
00:20:55.900 | so peering into cells in the brain
00:20:59.180 | to look at the molecular makeup of genes,
00:21:03.040 | that the brain did not evolve that way.
00:21:05.940 | And the irony is that the idea of the three-layered brain
00:21:15.840 | with an inner lizard, that hijacks your behavior
00:21:20.760 | and causes you to do and say things
00:21:22.680 | that you would otherwise not,
00:21:24.760 | or maybe that you will regret later,
00:21:27.120 | that idea became very popular,
00:21:31.000 | was popularized by Carl Sagan in "The Dragons of Eden,"
00:21:36.000 | which won a Pulitzer Prize in 1977,
00:21:40.860 | when it was already known pretty much
00:21:42.740 | in evolutionary neuroscience,
00:21:44.400 | that the whole narrative was a myth.
00:21:46.880 | - So, well, the narrative is on the way it evolved,
00:21:50.700 | but do you, I mean, again, it's that problem
00:21:54.060 | of it being a useful tool of conversation
00:21:59.060 | to say like there's a lizard brain,
00:22:02.340 | and there's a, like if I get overly emotional on Twitter,
00:22:05.820 | that was the lizard brain and so on.
00:22:09.300 | But do you--
00:22:10.400 | - No, I don't think it's useful.
00:22:11.800 | I think it's, I think that--
00:22:13.520 | - Is it useful, is it accurate?
00:22:16.920 | - I don't think it's accurate,
00:22:18.360 | and therefore, I don't think it's useful.
00:22:20.920 | So here's what I would say.
00:22:23.000 | I think that, the way I think about philosophy and science
00:22:28.000 | is that they are useful tools for living.
00:22:32.740 | And in order to be useful tools for living,
00:22:39.180 | they have to help you make good decisions.
00:22:44.180 | The triune brain, as it's called, this three-layer brain,
00:22:49.820 | the idea that your brain is like an already baked cake,
00:22:52.740 | and the cortex, cerebral cortex,
00:22:55.460 | just layered on top like icing,
00:22:57.860 | the idea, that idea is the foundation of the law
00:23:02.860 | in most Western countries.
00:23:05.900 | It's the foundation of economic theory,
00:23:10.220 | and it largely, and it's a great narrative.
00:23:13.340 | It sort of fits our intuitions about how we work.
00:23:17.640 | But it also, in addition to being wrong,
00:23:22.640 | it lets people off the hook for nasty behavior.
00:23:27.820 | And it also suggests that emotions
00:23:32.020 | can't be a source of wisdom, which they often are.
00:23:36.020 | In fact, you would not wanna be around someone
00:23:39.020 | who didn't have emotions.
00:23:40.700 | That would be, that's a psychopath.
00:23:43.260 | I mean, that's not someone you wanna really
00:23:47.580 | have that person deciding your outcome.
00:23:50.780 | So I guess my, and I could sort of go on and on and on,
00:23:54.120 | but my point is that I don't think,
00:23:59.120 | I don't think it's a useful narrative in the end.
00:24:03.460 | What's the more accurate view of the brain
00:24:06.460 | that we should use when we're thinking about it?
00:24:08.740 | - I'll answer that in a second,
00:24:09.700 | but I'll say that even our notion of what an instinct is
00:24:12.740 | or what a reflex is, it's not quite right, right?
00:24:16.840 | So if you look at evidence from ecology, for example,
00:24:21.840 | and you look at animals in their ecological context,
00:24:25.660 | what you can see is that even things which are reflexes
00:24:30.260 | are very context-sensitive.
00:24:32.420 | The brains of those animals are executing
00:24:36.980 | so-called instinctual actions
00:24:39.460 | in a very, very context-sensitive way.
00:24:42.140 | And so even when a physician takes the,
00:24:46.860 | it's like the idea of your patellar reflex
00:24:49.500 | where they hit your patellar tendon on your knee
00:24:52.220 | and you kick, the force with which you kick and so on
00:24:57.220 | is influenced by all kinds of things.
00:24:59.500 | It's a reflex isn't like a robotic response.
00:25:04.500 | And so I think a better way is a way that,
00:25:10.140 | to think about how brains work,
00:25:12.220 | is the way that matches our best understanding,
00:25:16.540 | our best scientific understanding,
00:25:18.000 | which I think is really cool
00:25:21.280 | because it's really counterintuitive.
00:25:24.620 | So how I came to this view,
00:25:26.260 | and I'm certainly not the only one who holds this view,
00:25:28.820 | I was reading work on neuroanatomy
00:25:31.660 | and the view that I'm about to tell you
00:25:34.580 | was strongly suggested by that.
00:25:37.140 | And then I was reading work in signal processing,
00:25:39.340 | like by electrical engineering.
00:25:41.660 | And similarly, the work suggested that,
00:25:46.140 | the research suggested that the brain worked this way.
00:25:48.400 | And I'll just say that I was reading
00:25:50.220 | across multiple literatures
00:25:51.660 | and they were, who don't speak to each other.
00:25:54.020 | And they were all pointing in this direction.
00:25:57.220 | And so far, although some of the details
00:26:00.820 | are still up for grabs,
00:26:02.420 | the general gist I think is,
00:26:05.180 | I've not come across anything yet,
00:26:07.600 | which really violates and I'm looking.
00:26:10.860 | And so the idea is something like this,
00:26:13.740 | it's very counterintuitive.
00:26:15.140 | So the way to describe it is to say,
00:26:18.920 | that your brain doesn't react to things in the world.
00:26:22.860 | It's not, to us it feels like our eyes
00:26:25.900 | are windows on the world.
00:26:27.940 | We see things, we hear things, we react to them.
00:26:31.540 | In psychology, we call this stimulus response.
00:26:35.420 | So your face is, your voice is a stimulus to me.
00:26:39.940 | I receive input and then I react to it.
00:26:43.840 | And I might react very automatically, system one.
00:26:48.960 | But I also might execute some control
00:26:54.420 | where I maybe stop myself from saying something
00:26:57.860 | or doing something and in a more reflective way,
00:27:02.860 | execute a different action, right?
00:27:04.720 | That's system two.
00:27:05.800 | The way the brain works though,
00:27:08.700 | is it's predicting all the time.
00:27:10.820 | It's constantly talking to itself,
00:27:13.220 | constantly talking to your body.
00:27:16.340 | And it's constantly predicting what's going on in the body
00:27:21.260 | and what's going on in the world
00:27:23.160 | and making predictions and the information
00:27:27.840 | from your body and from the world
00:27:29.540 | really confirm or correct those predictions.
00:27:32.800 | - So fundamentally, the thing that the brain does
00:27:35.380 | most of the time is just predict,
00:27:38.880 | like talking to itself and predicting stuff about the world,
00:27:41.600 | not like this dumb thing that just senses and responds.
00:27:45.700 | Senses and responds.
00:27:46.540 | - Yeah, so the way to think about it is like this.
00:27:48.500 | You know, your brain is trapped in a dark, silent box.
00:27:52.540 | - Yeah, that's very romantic of you.
00:27:54.700 | - Which is your skull.
00:27:57.260 | And the only information that it receives
00:28:01.320 | from your body and from the world, right,
00:28:04.440 | is through the senses, through the sense organs,
00:28:07.440 | your eyes, your ears, and you have sensory data
00:28:12.440 | that comes from your body
00:28:14.000 | that you're largely unaware of to your brain,
00:28:17.940 | which we call interoceptive,
00:28:20.240 | as opposed to exteroceptive,
00:28:21.740 | which is the world around you.
00:28:23.240 | But your brain is receiving sense data continuously,
00:28:29.300 | which are the effect of some set of causes.
00:28:35.800 | Your brain doesn't know the cause of these sense data.
00:28:41.280 | It's only receiving the effects of those causes,
00:28:44.520 | which are the data themselves.
00:28:46.380 | And so your brain has to solve what philosophers call
00:28:49.320 | an inverse inference problem.
00:28:51.280 | How do you know, when you only receive the effects
00:28:54.120 | of something, how do you know what caused those effects?
00:28:56.060 | So when there's a flash of light
00:28:58.060 | or a change in air pressure
00:29:00.920 | or a tug somewhere in your body,
00:29:04.200 | how does your brain know what caused those events
00:29:08.620 | so that it knows what to do next
00:29:12.400 | to keep you alive and well?
00:29:13.960 | And the answer is that your brain has one other source
00:29:18.600 | of information available to it,
00:29:21.100 | which is your past experience.
00:29:23.800 | It can reconstitute in its wiring past experiences,
00:29:28.800 | and it can combine those past experiences in novel ways.
00:29:34.400 | And so we have lots of names for this in psychology.
00:29:39.120 | We call it memory, we call it perceptual inference,
00:29:42.440 | we call it simulation.
00:29:44.020 | It's also, we call it concepts or conceptual knowledge.
00:29:49.480 | We call it prediction.
00:29:50.960 | Basically, if we were to stop the world right now,
00:29:54.880 | stop time, your brain is in a state,
00:29:58.780 | and it's representing what it believes
00:30:05.360 | is going on in your body and in the world,
00:30:08.480 | and it's predicting what will happen next
00:30:11.560 | based on past experience, right?
00:30:13.400 | Probabilistically, what's most likely to happen.
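
[Editor's note: the inverse inference described above is often written in Bayesian form, with past experience supplying the prior; this is a standard textbook formulation, not a quote from the conversation:

\[
p(\text{cause} \mid \text{sense data}) \;\propto\; p(\text{sense data} \mid \text{cause}) \, p(\text{cause})
\]

Here the prediction corresponds to the most probable cause given the incoming data, under a prior shaped by what the brain has encountered before.]
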
00:30:15.920 | And it begins to prepare for what's going to happen
00:30:20.920 | and it prepares your action,
00:30:24.160 | and it begins to prepare your experience based,
00:30:29.160 | so it's anticipating the sense data it's going to receive.
00:30:34.260 | And then when those data come in,
00:30:38.160 | they either confirm that prediction
00:30:40.240 | and your action executes,
00:30:42.560 | because the plan's already been made,
00:30:44.880 | or there's some sense data that your brain didn't know
00:30:49.880 | your brain didn't predict that's unexpected,
00:30:52.640 | and your brain takes it in, we say encodes it.
00:30:55.640 | We have a fancy name for that, we call it learning.
00:30:58.880 | Your brain learns, and it updates
00:31:01.440 | its storehouse of knowledge,
00:31:03.900 | which we call an internal model,
00:31:05.900 | and so that you can predict better next time.
00:31:08.760 | And it turns out that predicting and correcting,
00:31:11.320 | predicting and correcting,
00:31:13.240 | is a much more metabolically efficient way
00:31:15.520 | to run a system than constantly reacting all the time.
00:31:18.800 | Because if you're constantly reacting,
00:31:20.160 | it means you have no, you can't anticipate in any way
00:31:22.800 | what's gonna happen, and so the amount of uncertainty
00:31:26.240 | that you have to deal with is overwhelming
00:31:29.480 | to a nervous system.
00:31:30.920 | - Metabolically costly, I like it.
00:31:32.840 | - And so what is a reflex?
00:31:34.280 | A reflex is when your brain doesn't check
00:31:38.440 | against the sense data,
00:31:40.400 | that the potential cost to you is so great,
00:31:45.400 | maybe because your life is threatened,
00:31:48.400 | that your brain makes the prediction
00:31:50.800 | and executes the action without checking.
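
[Editor's note: a minimal sketch of the predict-and-correct loop described above, written in Python. The scalar "internal model," the learning rate, and the reflex flag are illustrative assumptions for this sketch, not Barrett's model.]

```python
# Illustrative only: a system that predicts its next input, compares the
# prediction against the sense data that actually arrive, and updates its
# internal model from the prediction error ("learning"). A "reflex" skips
# the check and acts on the prediction alone.

def predict_and_correct(model, sense_data, learning_rate=0.1, reflex=False):
    prediction = model                      # what the system expects to sense next
    if reflex:
        return model, prediction            # act without checking the sense data
    error = sense_data - prediction         # the unexpected part of the input
    model = model + learning_rate * error   # update the internal model
    return model, prediction

model = 0.0
for signal in [1.0, 1.0, 1.2, 0.9, 1.1]:    # a toy stream of sense data
    model, expected = predict_and_correct(model, signal)
    print(f"expected {expected:.2f}, sensed {signal:.2f}, updated model {model:.2f}")
```
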
00:31:54.420 | - Yeah, so but prediction's still at the core,
00:31:57.120 | that's a beautiful vision of the brain.
00:31:58.680 | I wonder, from almost an AI perspective,
00:32:01.600 | but just computationally,
00:32:03.920 | is the brain just mostly a prediction machine then?
00:32:07.040 | Like is the perception just a nice little feature
00:32:11.280 | added on top, like the,
00:32:13.660 | both the integration of new perceptual information?
00:32:17.320 | I wonder how big of an impressive system
00:32:21.000 | is that relative to just the big predictor,
00:32:24.800 | model constructor?
00:32:25.680 | - Well, I think that we can look to evolution for that,
00:32:28.760 | for one answer, which is that when you go back,
00:32:32.120 | you know, 550 million years, give or take,
00:32:35.640 | we, you know, the world was populated by creatures,
00:32:38.720 | really ruled by creatures without brains.
00:32:41.040 | And you know, that's a biological statement,
00:32:46.500 | not a political statement.
00:32:48.400 | Really, world was--
00:32:49.240 | - You're calling dinosaurs dumb?
00:32:50.560 | You're talking about like--
00:32:51.480 | - Oh no, I'm not talking about dinosaurs, honey.
00:32:53.400 | I'm talking way back, further back than that.
00:32:56.760 | Really, there are these little creatures
00:32:59.880 | called amphioxus, which is the modern,
00:33:02.940 | it's a, or a lancelet, that's the modern animal.
00:33:06.080 | But it's an animal that scientists believe
00:33:08.880 | is very similar to our common,
00:33:12.300 | the common ancestor that we share with invertebrates.
00:33:16.720 | Because, basically because of the tracing back
00:33:21.140 | the molecular genetics in cells.
00:33:23.080 | And that animal had no brain.
00:33:27.340 | It had some cells that would later turn into a brain,
00:33:31.000 | but in that animal, there's no brain.
00:33:32.280 | But that animal also had no head.
00:33:34.520 | And it had no eyes, and it had no ears,
00:33:36.860 | and it had really, really no senses, for the most part.
00:33:40.880 | It had very, very limited sense of touch.
00:33:43.960 | It had an eye spot for, not for seeing,
00:33:47.480 | but just for entraining to circadian rhythm
00:33:50.860 | to light and dark.
00:33:52.080 | And it had no hearing, it had a vestibular cell
00:33:56.020 | so that it could keep upright in the water.
00:33:58.160 | At the time, we're talking evolutionary scale here,
00:34:02.780 | so give or take some hundred million years or something.
00:34:07.000 | But at the time, what are the vertebrate,
00:34:09.060 | like when a backbone evolved and a brain evolved,
00:34:13.280 | a full brain, that was when a head evolved
00:34:16.720 | with sense organs, and when,
00:34:19.980 | that's when your viscera, like internal systems, evolved.
00:34:23.760 | So the answer, I would say, is that senses,
00:34:28.540 | motor neuroscientists, people who study
00:34:31.680 | the control of motor behavior,
00:34:33.760 | believe that senses evolved in the service of motor action.
00:34:41.520 | So the idea is that, what triggered the,
00:34:46.400 | what triggered, what was the big evolutionary change,
00:34:49.960 | what was the big pressure that made it useful
00:34:53.960 | to have eyes and ears and a visual system
00:34:57.440 | and an auditory system and a brain, basically?
00:34:59.920 | And the answer that is commonly entertained right now
00:35:05.920 | is that it was predation.
00:35:08.160 | That when, at some point, an animal evolved
00:35:12.520 | that deliberately ate another animal,
00:35:16.160 | and this launched an arms race between predators and prey,
00:35:21.160 | and it became very useful to have senses.
00:35:25.080 | So these little amphiox, these little amphioxi,
00:35:28.580 | don't really have, they don't have,
00:35:34.720 | they're not aware of their environment very much, really.
00:35:37.600 | They, and so being able to look up ahead
00:35:42.600 | and ask yourself, should I eat that,
00:35:50.240 | or will it eat me, is a very useful thing.
00:35:53.520 | So the idea is that sense data
00:35:58.520 | is not there for consciousness.
00:36:01.180 | It didn't evolve for the purposes of consciousness.
00:36:03.920 | It didn't evolve for the purposes of experiencing anything.
00:36:07.440 | It evolved to be in the service of motor control.
00:36:13.360 | However, maybe it's useful.
00:36:16.900 | This is why scientists sometimes avoid questions
00:36:23.180 | about why things evolved.
00:36:25.520 | This is what philosophers call this teleology.
00:36:28.400 | You might be able to say something about how things evolve,
00:36:33.360 | but not necessarily why.
00:36:35.840 | We don't really know the why.
00:36:38.720 | That's all speculation.
00:36:40.520 | - But the why is kind of nice here.
00:36:42.120 | The interesting thing is that was the first element
00:36:45.800 | of social interaction is, am I gonna eat you,
00:36:48.600 | or are you gonna eat me?
00:36:50.400 | And for that, it's useful to be able to see each other,
00:36:55.400 | sense each other.
00:36:56.620 | That's kind of fascinating.
00:36:59.880 | There was a time when life didn't eat each other.
00:37:03.080 | - Or they did by accident.
00:37:04.760 | So an amphioxus, for example,
00:37:06.800 | it kind of gyrates in the water,
00:37:11.320 | and then it plants itself in the sand
00:37:14.680 | like a living blade of grass,
00:37:17.080 | and then it just filters whatever comes into its mouth.
00:37:21.480 | So it is eating, but it's not actively hunting.
00:37:26.000 | And when the concentration of food decreases,
00:37:32.200 | the amphioxus can sense this,
00:37:35.280 | and so it basically wriggles itself randomly
00:37:40.280 | to some other spot, which probabilistically
00:37:43.680 | will have more food than wherever it is.
00:37:46.040 | So it's not really, it's not guiding its actions
00:37:50.800 | on the basis of, we would say
00:37:53.000 | there's no real intentional action
00:37:55.280 | in the traditional sense.
00:37:58.560 | - Speaking of intentional action,
00:38:00.400 | and if the brain, if prediction is indeed
00:38:04.600 | a core component of the brain,
00:38:05.960 | let me ask you a question that scientists also hate
00:38:09.560 | is about free will.
00:38:11.640 | So how does, do you think about free will much,
00:38:15.520 | how does that fit into this, into your view of the brain?
00:38:19.480 | Why does it feel like we make decisions in this world?
00:38:24.480 | - This is a hard, we scientists hate this
00:38:26.960 | 'cause it's a hard question.
00:38:28.320 | We don't know the answer to it.
00:38:29.160 | - Are you taking a side?
00:38:30.360 | - I think I have. - You would have free will?
00:38:31.720 | - I think I have taken a side,
00:38:32.920 | but I don't put a lot of stock in my own intuitions
00:38:37.920 | or anybody's intuitions about the cause of things.
00:38:41.000 | One thing we know about the brain for sure
00:38:42.880 | is that the brain creates experiences for us.
00:38:46.360 | My brain creates experiences for me,
00:38:47.880 | your brain creates experiences for you
00:38:49.940 | in a way that lures you to believe
00:38:52.100 | that those experiences actually reveals
00:38:54.940 | the way that it works, but it doesn't.
00:38:59.640 | - So you don't trust your own intuition about free will.
00:39:01.600 | - Not really, not really.
00:39:03.760 | No, I mean, no, but I am also somewhat persuaded by,
00:39:07.280 | I think Dan Dennett wrote at one point,
00:39:09.520 | like the philosopher Dan Dennett wrote at one point
00:39:13.880 | that it's, I can't say it as eloquently as him,
00:39:18.040 | but people obviously have free will.
00:39:20.720 | They are obviously making choices.
00:39:22.640 | And so there is this observation that we're not robots
00:39:27.920 | and we can do some things like a little more sophisticated
00:39:30.800 | than an amphioxus.
00:39:31.800 | So here's what I would say.
00:39:35.160 | I would say that your predictions,
00:39:39.460 | your internal model that's running right now,
00:39:43.400 | that your ability to understand the sounds
00:39:46.240 | that I'm making and attach them to ideas
00:39:48.760 | is based on the fact that you have years of experience
00:39:53.760 | knowing what these sounds mean
00:39:55.840 | in a particular statistical pattern, right?
00:40:00.840 | I mean, that's how you can understand the words
00:40:03.180 | that are coming out of my--
00:40:04.760 | - Mouth? - Right.
00:40:07.240 | I think we did this once before too, didn't we?
00:40:09.280 | When we were--
00:40:10.120 | - I don't know, I would have to access my memory module.
00:40:12.360 | - I think when I was in your--
00:40:13.840 | - The class thing? - Yeah, I think we did it
00:40:15.840 | just like that actually, so bravo.
00:40:18.400 | - Wow. - Yeah.
00:40:19.240 | - I have to go look back to the tape.
00:40:21.520 | - Yeah, anyways, the idea though
00:40:25.560 | is that your brain is using past experience
00:40:28.880 | and it can use past experience in,
00:40:32.040 | so it's remembering, but you're not consciously remembering.
00:40:36.200 | It's basically re-implementing prior experiences
00:40:40.080 | as a way of predicting what's gonna happen next.
00:40:42.160 | And it can do something called conceptual combination,
00:40:44.720 | which is it can take bits and pieces of the past
00:40:48.240 | and combine it in new ways.
00:40:50.300 | So you can experience and make sense of things
00:40:55.700 | that you've never encountered before
00:40:57.580 | because you've encountered something similar to them.
00:41:00.660 | And so a brain in a sense is not just,
00:41:09.140 | doesn't just contain information,
00:41:12.980 | it is information gaining,
00:41:14.820 | meaning it can create new information
00:41:17.900 | by this generative process.
00:41:19.620 | So in a sense, you could say, well,
00:41:20.820 | that maybe that's a source of free will.
00:41:23.260 | But I think really where free will comes from
00:41:25.740 | or the kind of free will that I think
00:41:27.300 | is worth having a conversation about
00:41:29.460 | involves cultivating experiences for yourself
00:41:36.400 | that change your internal model.
00:41:40.180 | When you were born and you were raised
00:41:42.940 | in a particular context,
00:41:45.460 | your brain wired itself to your surroundings,
00:41:50.340 | to your physical surroundings
00:41:51.620 | and also to your social surroundings.
00:41:53.500 | So you were handed an internal model basically.
00:41:57.340 | But when you grow up,
00:42:01.500 | the more control you have over where you are
00:42:05.540 | and what you do,
00:42:06.660 | you can cultivate new experiences for yourself.
00:42:11.500 | And those new experiences can change your internal model
00:42:16.500 | and you can actually practice those experiences
00:42:21.080 | in a way that makes them automatic,
00:42:24.260 | meaning it makes it easier for the brain,
00:42:26.840 | your brain to make them again.
00:42:28.900 | And I think that that is something like
00:42:33.460 | what you would call free will.
00:42:35.380 | You aren't responsible for the model that you were handed,
00:42:40.380 | that someone, you know, your caregivers
00:42:43.820 | cultivated a model in your brain.
00:42:47.780 | You're not responsible for that model,
00:42:49.900 | but you are responsible for the one you have now.
00:42:52.980 | You can choose, you choose what you expose yourself to.
00:42:56.900 | You choose how you spend your time.
00:42:59.940 | Not everybody has choice over everything,
00:43:02.640 | but everybody has a little bit of choice.
00:43:05.020 | And so I think that is something
00:43:10.680 | that I think is arguably called free will.
00:43:13.500 | - Yeah, there's like the ripple effects
00:43:16.480 | of the billions of decisions you make early on in life
00:43:19.820 | are so great that even if it's not,
00:43:26.820 | even if it's like all deterministic,
00:43:30.300 | just the amount of possibilities that are created
00:43:35.300 | and then the focusing of those possibilities
00:43:38.500 | into a single trajectory,
00:43:40.140 | that somewhere within that, that's free will.
00:43:44.980 | Even if it's all deterministic, that might as well be,
00:43:48.680 | just the number of choices that are possible
00:43:52.740 | and the fact that you just make one trajectory
00:43:54.700 | through those set of choices seems to be like
00:43:57.560 | something like there'll be called free will.
00:43:59.500 | But it's still kind of sad to think like
00:44:02.180 | there doesn't seem to be a place
00:44:03.900 | where there's magic in there,
00:44:05.780 | where it is all just a computer.
00:44:08.100 | - Well, there's lots of magic, I would say, so far,
00:44:10.660 | because we don't really understand
00:44:13.720 | how all of this is exactly played out at a,
00:44:18.500 | I mean, scientists are working hard
00:44:23.040 | and disagree about some of the details
00:44:26.120 | under the hood of what I just described,
00:44:28.520 | but I think there's quite a bit of magic, actually.
00:44:31.340 | And also, there's also stochastic firing of,
00:44:36.340 | neurons don't, they're not purely digital
00:44:42.700 | in the sense that there is,
00:44:45.100 | there's also analog communication between neurons,
00:44:47.600 | not just digital, so it's not just with firing of axons.
00:44:51.920 | And some of that, there are other ways to communicate.
00:44:55.680 | And also, there's noise in the system,
00:45:00.680 | and the noise is there for a really good reason,
00:45:04.200 | and that is the more variability there is,
00:45:08.560 | the more potential there is for your brain
00:45:12.320 | to be able to be information-bearing.
00:45:15.600 | So basically, there are some animals
00:45:20.340 | that have clusters of cells,
00:45:22.720 | the only job is to inject noise into their neural patterns.
00:45:27.720 | - So maybe noise is the source of free will.
00:45:30.200 | - So you can think about stochasticity or noise
00:45:34.560 | as a source of free will,
00:45:36.740 | or you can think of conceptual combination
00:45:40.460 | as a source of free will.
00:45:42.320 | You can certainly think about cultivating,
00:45:45.780 | you can't reach back into your past and change your past.
00:45:51.400 | People try by psychotherapy and so on,
00:45:54.280 | but what you can do is change your present,
00:45:57.360 | which becomes your past.
00:46:01.100 | - Well, let me think about that sentence.
00:46:05.680 | - So one way to think about it is that you're continuously,
00:46:08.060 | this is a colleague of mine, a friend of mine said,
00:46:10.740 | "So what you're saying is that people
00:46:12.520 | "are continually cultivating their past."
00:46:15.680 | And I was like, "That's very poetic."
00:46:17.240 | Yes, you are continually cultivating your past
00:46:21.240 | as a means of controlling your future.
00:46:23.480 | - So you think, yeah, I guess the construction
00:46:29.720 | of the mental model that you use for prediction
00:46:32.900 | ultimately contains within it your perception of the past,
00:46:36.700 | like the way you interpret the past,
00:46:38.800 | or even just the entirety of your narrative about the past.
00:46:41.880 | So you're constantly rewriting the story of your past.
00:46:45.480 | Oh boy, yeah.
00:46:48.120 | That's one poetic and also just awe-inspiring.
00:46:51.160 | What about the other thing you talk about?
00:46:55.360 | You've mentioned about sensory perception
00:46:57.320 | as a thing that is just,
00:47:00.800 | you have to infer about the sources of the thing
00:47:03.640 | that you have perceived through your senses.
00:47:07.640 | So let me ask another ridiculous question.
00:47:12.200 | Is anything real at all?
00:47:14.240 | Like how do we know it's real?
00:47:15.740 | How do we make sense of the fact that,
00:47:18.360 | just like you said, there's this brain sitting alone
00:47:20.760 | in the darkness trying to perceive the world.
00:47:23.200 | How do we know that the world is out there to be perceived?
00:47:27.220 | - Yeah, so I don't think that you should be asking questions
00:47:30.380 | like that without passing a joint.
00:47:32.560 | - Right, no, for sure.
00:47:33.560 | - Yeah.
00:47:34.400 | - I actually did before this, so I apologize.
00:47:36.200 | - Okay, no, well, that's okay.
00:47:37.640 | You apologize for not sharing, that's okay.
00:47:39.480 | So I mean, here's what I would say.
00:47:41.080 | What I would say is that the reason why
00:47:43.280 | we can be pretty sure that there's a there there
00:47:46.120 | is that the structure of the information in the world,
00:47:51.120 | what we call statistical regularities
00:47:53.920 | in sights and sounds and so on,
00:47:56.040 | and the structure of the information
00:47:57.840 | that comes from your body, it's not random stuff.
00:48:00.640 | There's a structure to it.
00:48:02.000 | There's a spatial structure and a temporal structure,
00:48:05.040 | and that spatial and temporal structure wires your brain.
00:48:08.660 | So an infant brain is not a miniature adult brain.
00:48:13.920 | It's a brain that is waiting for wiring instructions
00:48:17.560 | from the world, and it must receive
00:48:20.400 | those wiring instructions to develop in a typical way.
00:48:24.800 | So for example, when a newborn is born,
00:48:28.320 | when a newborn is born, when a baby is born,
00:48:31.700 | the baby can't see very well
00:48:37.360 | because the visual system in that baby's brain
00:48:39.960 | is not complete.
00:48:41.840 | The retina of your eye, which actually is part of your brain
00:48:47.800 | has to be stimulated with photons of light.
00:48:50.460 | If it's not, the baby won't develop normally
00:48:54.120 | to be able to see in a neurotypical way.
00:48:56.880 | Same thing is true for hearing.
00:48:58.480 | The same thing is true really for all your senses.
00:49:01.080 | So the point is that the physical world,
00:49:06.080 | the sense data from the physical world,
00:49:09.160 | wires your brain so that you have an internal model
00:49:12.760 | of that world so that your brain can predict well
00:49:16.240 | to keep you alive and well and allow you to thrive.
00:49:19.800 | - That's fascinating that the brain is waiting
00:49:22.160 | for a very specific kind of set of instructions
00:49:26.600 | from the world, like not the specific,
00:49:29.040 | but a very specific kind of instructions.
00:49:31.880 | - So scientists call it expectable input.
00:49:35.120 | The brain needs some input in order to develop normally.
00:49:39.240 | And we are genetically, as I say in the book,
00:49:44.240 | we have the kind of nature that requires nurture.
00:49:48.280 | We can't develop normally without sensory input
00:49:53.200 | from the world and from the body.
00:49:55.240 | And what's really interesting about humans
00:49:58.000 | and some other animals too, but really seriously in humans,
00:50:03.000 | is the input that we need is not just physical.
00:50:08.880 | It's also social.
00:50:11.120 | We, in order for an infant, a human infant
00:50:14.880 | to develop normally, that infant needs eye contact,
00:50:19.520 | touch, it needs certain types of smells,
00:50:23.360 | it needs to be cuddled, it needs, right?
00:50:26.600 | So without social input, the brain,
00:50:31.600 | that infant's brain will not wire itself
00:50:36.840 | in a neurotypical way.
00:50:38.040 | And again, I would say there are lots of cultural patterns
00:50:43.040 | of caring for an infant.
00:50:46.480 | It's not like the infant has to be cared for in one way.
00:50:50.280 | Whatever the social environment is for an infant,
00:50:54.920 | that it will be reflected in that infant's internal model.
00:50:59.160 | So we have lots of different cultures,
00:51:00.560 | lots of different ways of rearing children.
00:51:02.800 | And that's an advantage for our species,
00:51:05.680 | although we don't always experience it that way.
00:51:07.360 | That is an advantage for our species.
00:51:09.160 | But if you just feed and water a baby
00:51:15.040 | without all the extra social doodads,
00:51:20.260 | what you get is a profoundly impaired human.
00:51:23.260 | - Yeah, but nevertheless, you're kind of saying
00:51:28.620 | that the physical reality has a consistent thing throughout
00:51:33.620 | that keeps feeding these set of sensory information
00:51:40.140 | that our brains are constructed for, but--
00:51:43.900 | - Yeah, the cool thing though,
00:51:44.860 | is that if you change the consistency,
00:51:47.140 | if you change the statistical regularities,
00:51:49.780 | so prediction error, your brain can learn it.
00:51:52.100 | It's expensive for your brain to learn it.
00:51:53.860 | And it takes a while for the brain
00:51:55.860 | to get really automated with it.
00:51:57.100 | But you had a wonderful conversation with David Eagleman,
00:52:01.740 | who just published a book about this,
00:52:04.260 | and gave lots and lots of really very, very cool examples.
00:52:07.780 | Some of which I actually discussed
00:52:09.900 | in "How Emotions Were Made,"
00:52:11.020 | but not obviously to the extent that he did in his book.
00:52:14.420 | It's a fascinating book.
00:52:16.780 | But it speaks to the point that your internal model
00:52:21.580 | is always under construction.
00:52:23.160 | And therefore, you always can modify your experience.
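To make the point about statistical regularities and prediction error concrete, here is a minimal sketch, with invented numbers and a single scalar "sensory signal" rather than anything resembling an actual brain model: an internal model keeps a running prediction, compares it with incoming sense data, and nudges itself by a fraction of the error. When the regularity shifts, the error spikes and the model slowly re-adapts, which mirrors the "expensive to learn, takes a while to automate" idea above.

```python
import random

def run_internal_model(signal_means, steps_per_regime=200, learning_rate=0.05, noise=1.0):
    """Toy predictive model: track a noisy signal whose mean shifts between regimes.

    signal_means: list of true means, one per 'regime' (a change in statistical regularities).
    Returns a list of (step, prediction, error) tuples.
    """
    prediction = 0.0          # the model's current best guess about the signal
    history = []
    step = 0
    for true_mean in signal_means:
        for _ in range(steps_per_regime):
            sample = random.gauss(true_mean, noise)   # incoming sense data
            error = sample - prediction               # prediction error
            prediction += learning_rate * error       # learning is gradual ("expensive")
            history.append((step, prediction, error))
            step += 1
    return history

if __name__ == "__main__":
    # The regularity changes from a mean of 0 to a mean of 5 halfway through.
    hist = run_internal_model([0.0, 5.0])
    for step, pred, err in hist[::100]:
        print(f"step={step:4d}  prediction={pred:6.2f}  error={err:6.2f}")
```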
00:52:29.340 | - I wonder what the limits are.
00:52:31.780 | Like, if we put it on Mars, or if we put it
00:52:34.780 | in virtual reality, or if we sit at home during a pandemic,
00:52:39.340 | and we spend most of our day on Twitter and TikTok.
00:52:42.780 | Like, I wonder where the breaking point is,
00:52:45.060 | like the limitations of the brain's capacity
00:52:48.260 | to properly continue wiring itself.
00:52:54.180 | - Well, I think what I would say is that
00:52:56.500 | there are different ways to specify your question, right?
00:53:00.260 | Like, one way to specify it would be the way
00:53:02.340 | that David phrases it, which is,
00:53:06.820 | "Can we create a new sense?"
00:53:09.900 | Like, "Can we create a new sensory modality?"
00:53:14.100 | How hard would that be?
00:53:15.540 | What are the limits in doing that?
00:53:17.240 | But another way to say it is,
00:53:22.260 | what happens to a brain when you remove
00:53:25.500 | some of those statistical regularities, right?
00:53:27.580 | Like, what happens to a brain,
00:53:29.060 | what happens to an adult brain when you remove
00:53:32.300 | some of the statistical patterns that were there,
00:53:36.820 | and they're not there anymore?
00:53:37.980 | - Are you talking about in the environment,
00:53:39.740 | or in the actual, like, you remove eyesight, for example?
00:53:43.900 | - Well, either way.
00:53:44.820 | I mean, basically, one way to limit the inputs
00:53:49.020 | to your brain is to stay home and protect yourself.
00:53:53.100 | Another way is to put someone in solitary confinement.
00:53:57.540 | Another way is to stick them in a nursing home.
00:54:01.240 | Another, well, not all nursing homes,
00:54:03.980 | but there are some, right?
00:54:06.980 | Which really are, where people are somewhat impoverished
00:54:12.500 | in the interactions and the variety
00:54:15.660 | of sensory stimulation that they get.
00:54:17.900 | Another way is that you lose a sense, right?
00:54:21.420 | But the point is, I think, that the human brain
00:54:26.340 | really likes variety, to say it in a,
00:54:30.700 | like a sort of Cartesian way.
00:54:35.160 | Variety is a good thing for a brain,
00:54:40.100 | and there are risks that you take
00:54:45.100 | when you restrict what you expose yourself to.
00:54:52.200 | - Yeah, you know, there's all this talk of diversity.
00:54:56.980 | The brain loves it, to the fullest definition
00:55:00.380 | and degree of diversity.
00:55:01.900 | - Yeah, I mean, I would say the only thing,
00:55:04.100 | basically, human brains thrive on diversity.
00:55:07.500 | The only place where we seem to have difficulty
00:55:10.140 | with diversity is with each other.
00:55:12.040 | - Yeah. - Right?
00:55:13.820 | But we, who wants to eat the same food every day?
00:55:16.500 | You never would.
00:55:17.460 | Who wants to wear the same clothes every day?
00:55:19.380 | I mean, my husband, if you ask him to close his eyes,
00:55:21.820 | he won't be able to tell you what he's wearing.
00:55:23.700 | He just, right?
00:55:24.660 | He'll buy seven shirts of exactly the same style
00:55:27.820 | in different colors, but they are in different colors,
00:55:30.560 | right, it's not like he's wearing--
00:55:31.820 | - How would you then explain my brain,
00:55:35.220 | which is terrified of choice,
00:55:36.940 | and therefore I wear the same thing every time?
00:55:39.460 | - Well, you must be getting your diversity.
00:55:41.300 | - Elsewhere. - Well, first of all,
00:55:42.140 | you are a fairly sharp dresser, so there is that,
00:55:44.540 | but so you're getting some reinforcement
00:55:47.100 | for dressing the way that you do.
00:55:48.460 | But no, your brain must get diversity in--
00:55:51.140 | - In other places. - In other places,
00:55:52.460 | but I think we, you know,
00:55:55.080 | so the two most expensive things your brain can do,
00:55:59.340 | metabolically speaking, are move your body and learn
00:56:06.900 | something new.
00:56:08.140 | So novelty, that is diversity, right,
00:56:12.580 | comes at a cost, a metabolic cost,
00:56:14.340 | but it's a cost, it's an investment that gives returns.
00:56:19.260 | And in general, people vary in how much they like novelty,
00:56:23.420 | unexpected things, some people really like it,
00:56:26.060 | some people really don't like it,
00:56:27.740 | and there's everybody in between.
00:56:29.400 | But in general, we don't eat the same thing every day,
00:56:32.860 | we don't usually do exactly the same thing
00:56:36.300 | in exactly the same order,
00:56:37.860 | in exactly the same place every day.
00:56:40.540 | The only place we have difficulty with diversity
00:56:46.380 | is in each other.
00:56:48.460 | And then we have considerable problems there,
00:56:50.820 | I would say, as a species.
00:56:52.980 | - Let me ask, I don't know if you're familiar
00:56:55.040 | with Donald Hoffman's work about questions of reality.
00:57:00.040 | What are your thoughts of the possibility
00:57:03.940 | that the very thing we've been talking about,
00:57:06.940 | of the brain wiring itself from birth
00:57:10.660 | to a particular set of inputs,
00:57:12.860 | is just a little slice of reality?
00:57:15.340 | That there is something much bigger out there
00:57:17.500 | that we humans, with our cognition,
00:57:20.420 | cognitive capabilities, are just not even perceiving?
00:57:23.260 | That the thing we're perceiving is just a crappy,
00:57:26.900 | like Windows 95 interface onto a much bigger,
00:57:32.180 | richer set of complex physics
00:57:35.340 | that we're not even in touch with?
00:57:37.720 | - Well, without getting too metaphysical about it,
00:57:41.300 | I think we know for sure,
00:57:42.620 | it doesn't have to be the crappy version of anything,
00:57:47.620 | but we definitely have a limited,
00:57:50.140 | we have a set of senses that are limited
00:57:53.260 | in very physical ways,
00:57:55.100 | and we're clearly not perceiving
00:57:56.780 | everything there is to perceive.
00:57:58.600 | That's clear.
00:57:59.840 | I mean, it's just, it's not that hard.
00:58:01.560 | We can't, without special,
00:58:03.280 | why do we invent scientific tools?
00:58:04.980 | It's so that we can overcome our senses
00:58:07.100 | and experience things that we couldn't otherwise,
00:58:10.380 | whether they are different parts of the visual spectrum,
00:58:14.180 | the light spectrum,
00:58:15.180 | or things that are too microscopically small for us to see,
00:58:19.480 | or too far away for us to see.
00:58:22.480 | So clearly, we're only getting a slice.
00:58:27.420 | And that slice,
00:58:29.460 | the interesting or potentially sad thing about humans
00:58:37.040 | is that we, whatever we experience,
00:58:40.640 | we think there's a natural reason for experiencing it.
00:58:43.540 | And we think it's obvious and natural,
00:58:46.340 | and that it must be this way,
00:58:48.440 | and that all the other stuff isn't important.
00:58:50.740 | And that's clearly not true.
00:58:53.300 | Many of the things that we think of as natural
00:58:56.480 | are anything but.
00:58:58.000 | They're certainly real, but we've created them.
00:59:01.200 | They certainly have very real impacts,
00:59:03.020 | but we've created those impacts.
00:59:05.280 | And we also know that there are many things
00:59:07.620 | outside of our awareness that have tremendous influence
00:59:11.560 | on what we experience and what we do.
00:59:14.140 | So there's no question that that's true.
00:59:17.640 | I mean, just, it's,
00:59:18.740 | but the extent is how,
00:59:21.840 | really the question is how fantastical is it?
00:59:23.880 | - Yeah, like what, you know, a lot of people ask me.
00:59:25.880 | I'm not allowed to say this.
00:59:27.640 | I think I'm allowed to say this.
00:59:29.280 | I've eaten shrooms a couple times,
00:59:31.620 | but I haven't gone the full,
00:59:33.280 | I'm talking to a few researchers in psychedelics.
00:59:35.760 | It's an interesting place, scientifically.
00:59:37.960 | Like what is the portal you're entering
00:59:40.600 | when you take psychedelics?
00:59:41.840 | Or another way to ask is like dreams.
00:59:44.920 | What are-- - Yeah, so let me tell you
00:59:46.240 | what I think, which is based on nothing.
00:59:48.520 | Like this is based on my, right?
00:59:50.480 | So I don't-- - Your intuition.
00:59:52.480 | - It's based on my, I'm guessing now,
00:59:56.840 | based on what I do know, I would say.
00:59:59.420 | But I think that, well, think about what happens.
01:00:02.500 | So you're running, your brain's running this internal model
01:00:04.960 | and it's all outside of your awareness.
01:00:06.760 | You see the, you feel the products,
01:00:08.360 | but you don't sense the,
01:00:10.440 | you have no awareness of the mechanics of it, right?
01:00:13.760 | It's going on all the time.
01:00:15.120 | And so one thing that's going on all the time
01:00:19.280 | that you're completely unaware of
01:00:20.800 | is that when your brain,
01:00:22.720 | your brain is basically asking itself,
01:00:25.720 | figuratively speaking, not literally, right?
01:00:27.760 | Like how is, the last time I was in this sensory array
01:00:32.160 | with this stuff going on in my body
01:00:33.960 | and this chain of events which just occurred,
01:00:37.680 | what did I do next?
01:00:39.460 | What did I feel next?
01:00:40.680 | What did I see next?
01:00:42.000 | It doesn't come up with one answer.
01:00:43.600 | It comes up with a distribution of possible answers.
01:00:47.360 | And then there has to be some selection process.
01:00:50.360 | And so you have a network in your brain,
01:00:54.000 | a sub-network in your brain,
01:00:55.800 | a population of neurons that helps to choose.
01:00:59.700 | It's not, I'm not talking about a homunculus in your brain
01:01:03.560 | or anything silly like that.
01:01:05.600 | This is not the soul.
01:01:08.980 | It's not the center of yourself or anything like that.
01:01:11.800 | But there is a set of neurons
01:01:17.840 | that weighs the probabilities
01:01:20.840 | and helps to select or narrow the field, okay?
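As a rough cartoon of the "distribution of possible answers plus a selection process" described here, and not a claim about how the frontoparietal control network actually works, the sketch below stores invented past episodes as situation-action pairs, weighs candidate actions by similarity to the present situation, and then narrows and samples from that distribution. All features, actions, and the similarity measure are made up for illustration.

```python
import math
import random
from collections import defaultdict

# Invented past episodes: (situation features, what was done next).
PAST_EPISODES = [
    ({"dark": 1.0, "noise": 0.8, "heart_rate": 0.9}, "freeze"),
    ({"dark": 1.0, "noise": 0.7, "heart_rate": 0.8}, "flee"),
    ({"dark": 0.2, "noise": 0.1, "heart_rate": 0.3}, "relax"),
    ({"dark": 0.9, "noise": 0.9, "heart_rate": 0.9}, "freeze"),
]

def similarity(a, b):
    """Gaussian similarity between two feature dicts (smaller distance -> larger weight)."""
    dist2 = sum((a[k] - b.get(k, 0.0)) ** 2 for k in a)
    return math.exp(-dist2)

def candidate_distribution(current):
    """Weight each past action by how similar its situation is to the present one."""
    weights = defaultdict(float)
    for situation, action in PAST_EPISODES:
        weights[action] += similarity(current, situation)
    total = sum(weights.values())
    return {action: w / total for action, w in weights.items()}

def narrow_and_select(dist, top_k=2):
    """A crude stand-in for a selection process: keep the top candidates, then sample one."""
    top = sorted(dist.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    actions, weights = zip(*top)
    return random.choices(actions, weights=weights, k=1)[0]

if __name__ == "__main__":
    now = {"dark": 0.95, "noise": 0.85, "heart_rate": 0.9}
    dist = candidate_distribution(now)
    print("candidate distribution:", {a: round(p, 2) for a, p in dist.items()})
    print("selected:", narrow_and_select(dist))
```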
01:01:26.680 | And that network is working all the time.
01:01:30.200 | It's actually called the control network,
01:01:32.240 | the executive control network,
01:01:33.580 | or you can call it a frontoparietal
01:01:35.880 | because the regions of the brain that make it up
01:01:38.240 | are in the frontal lobe and the parietal lobe.
01:01:41.180 | There are also parts that belong
01:01:43.160 | to the subcortical parts of your brain.
01:01:44.880 | It doesn't really matter.
01:01:45.720 | The point is that there is this network
01:01:47.720 | and it is working all the time.
01:01:49.240 | Whether or not you feel in control,
01:01:50.840 | whether or not you feel like you're expending effort
01:01:52.720 | doesn't really matter.
01:01:53.560 | It's on all the time, except when you sleep.
01:01:56.440 | When you sleep, it's a little bit relaxed.
01:02:02.120 | And so think about what's happening when you sleep.
01:02:05.680 | When you sleep, the external world recedes,
01:02:09.800 | the sense data from, so basically your model
01:02:13.840 | becomes a little bit, the tethers from the world are loosened
01:02:18.840 | and this network, which is involved in, you know,
01:02:24.280 | maybe weeding out unrealistic things, is a little bit quiet.
01:02:29.280 | So your dreams are really your internal model
01:02:34.040 | that's unconstrained by the immediate world.
01:02:40.040 | Except, so you can do things that you can't do in real life
01:02:44.380 | in your dreams, right?
01:02:45.640 | You can fly.
01:02:46.480 | Like I, for example, when I fly on my back in a dream,
01:02:49.320 | I'm much faster than when I fly on my front.
01:02:51.660 | Don't ask me why, I don't know.
01:02:53.240 | - When you're laying on your back in your dream?
01:02:55.040 | - No, when I'm in my dream and flying in a dream,
01:02:58.740 | I am much faster flyer in the air.
01:03:00.840 | - You fly often?
01:03:02.080 | - Not often, but I--
01:03:03.480 | - You talk about it like you,
01:03:04.520 | I don't think I've flown for many years.
01:03:06.400 | - Well, you must try it.
01:03:08.960 | - I've flown, I've fallen.
01:03:11.520 | - That's scary.
01:03:12.440 | - Yeah, but you're talking about like airplane.
01:03:15.360 | - I fly in my dreams and I'm way faster, right?
01:03:18.520 | - On your back.
01:03:19.360 | - On my back, way faster.
01:03:20.400 | Now you can say, well, you know,
01:03:22.600 | you never flew in your life, right?
01:03:24.360 | It's conceptual combination.
01:03:25.780 | I mean, I've flown in an airplane and I've seen birds fly
01:03:30.360 | and I've watched movies of people flying
01:03:31.960 | and I know Superman probably flies,
01:03:34.320 | I don't know if he flies faster on his back, but--
01:03:36.120 | - He's, I've never seen--
01:03:37.840 | - Flying on his front, right?
01:03:39.160 | But yeah, but anyways, my point is that, you know,
01:03:41.960 | all of this stuff really,
01:03:43.840 | all these experiences really become part
01:03:47.400 | of your internal model.
01:03:48.840 | The thing is that when you're asleep,
01:03:50.680 | your internal model is still being constrained by your body.
01:03:55.720 | Your brain's always attached to your body.
01:03:58.520 | It's always receiving sense data from your body.
01:04:01.040 | You're mostly never aware of it
01:04:04.200 | unless you run up the stairs
01:04:06.640 | or, you know, maybe you are ill in some way,
01:04:11.200 | but you're mostly not aware of it,
01:04:12.760 | which is a really good thing because if you were,
01:04:15.320 | you know, you'd never pay attention
01:04:17.040 | to anything outside your own skin ever again.
01:04:19.120 | Like right now, you seem like you're sitting there
01:04:21.320 | very calmly, but you have a virtual drama, right?
01:04:25.360 | It's like an opera going on inside your body.
01:04:30.320 | And so I think that one of the things that happens
01:04:34.760 | when people take psilocybin or take, you know,
01:04:39.760 | ketamine, for example, is that the tethers-
01:04:45.040 | - Completely removed. - Are completely removed.
01:04:47.960 | - Yeah. - Yeah.
01:04:48.880 | - That's fascinating.
01:04:50.240 | And then- - And that's why
01:04:51.560 | it's helpful to have a guide, right?
01:04:53.520 | Because the guide is giving you sense data
01:04:57.080 | to steer that internal model
01:04:59.680 | so that it doesn't go completely off the rails.
01:05:02.600 | - Yeah, I know there's, again, that wiring to the other brain
01:05:06.520 | that's the guide is at least a tiny little tether.
01:05:09.880 | - Exactly. - Yeah.
01:05:11.280 | Let's talk about emotion a little bit if we could.
01:05:15.000 | Emotion comes up often, and I have never spoken
01:05:18.600 | with anybody who has a clarity about emotion
01:05:23.600 | from a biological and neuroscience perspective that you do.
01:05:29.240 | And I'm not sure I fully know how to,
01:05:34.240 | as I mentioned this way too much,
01:05:37.080 | but as somebody who was born in the Soviet Union
01:05:40.040 | and romanticizes basically everything,
01:05:42.440 | talks about love nonstop, you know, emotion is a,
01:05:46.960 | I don't know what to make of it.
01:05:49.040 | I don't know what, so maybe let's just try to talk about it.
01:05:53.960 | I mean, from a neuroscience perspective,
01:05:56.440 | we talked about a little bit last time,
01:05:58.440 | your book covers it, how emotions are made,
01:06:00.720 | but what are some misconceptions we writers of poetry,
01:06:05.720 | we romanticizing humans have about emotion
01:06:10.240 | that we should move away from
01:06:14.040 | before to think about emotion
01:06:16.240 | from both a scientific and an engineering perspective?
01:06:20.040 | - Yeah, so there is a common view of emotion in the West.
01:06:25.160 | The caricature of that view is that,
01:06:29.100 | you know, we have an inner beast, right?
01:06:33.480 | Your limbic system, your inner lizard.
01:06:37.160 | We have an inner beast and that comes baked in
01:06:41.060 | to the brain at birth.
01:06:42.000 | So you've got circuits for anger, sadness, fear.
01:06:44.320 | It's interesting that they all have English names,
01:06:46.840 | these circuits, right?
01:06:48.000 | But, and they're there and they're triggered
01:06:51.540 | by things in the world.
01:06:52.780 | And then they cause you to do and say,
01:06:55.960 | and, you know, so when your fear circuit is triggered,
01:06:59.280 | you widen your eyes, you gasp,
01:07:03.120 | your heart rate goes up,
01:07:06.680 | you prepare to flee or to freeze.
01:07:10.940 | And these are modal responses.
01:07:15.120 | They're not the only responses that you give,
01:07:16.840 | but on average, they're the prototypical responses.
01:07:19.780 | That's the view.
01:07:22.160 | And that's the view of emotion in the law.
01:07:25.880 | That's the view, you know,
01:07:27.800 | that emotions are these profoundly unhelpful things
01:07:31.120 | that are obligatory kind of like reflexes.
01:07:34.300 | The problem with that view is that
01:07:40.720 | it doesn't comport with the evidence.
01:07:43.340 | The evidence actually lines up
01:07:46.380 | beautifully with itself.
01:07:49.160 | It just doesn't line up with that view.
01:07:50.800 | And it doesn't matter whether you're measuring people's
01:07:52.320 | faces, facial movements,
01:07:53.960 | or you're measuring their body movements,
01:07:55.280 | or you're measuring their peripheral physiology,
01:07:57.120 | or you're measuring their brains,
01:07:59.360 | or their voices or whatever.
01:08:00.640 | Pick any output that you wanna measure
01:08:03.620 | and, you know, any system you wanna measure,
01:08:05.760 | and you don't really find strong evidence for this.
01:08:09.080 | And I say this as somebody who not only has reviewed
01:08:13.000 | really thousands of articles and run, you know,
01:08:16.280 | big meta-analyses, which are statistical summaries
01:08:19.720 | of published papers,
01:08:21.280 | but also as someone who has sent teams of researchers
01:08:25.120 | to small-scale cultures,
01:08:29.500 | you know, remote cultures,
01:08:32.520 | which are very different from urban large-scale cultures
01:08:37.520 | like ours.
01:08:40.200 | And one culture that we visited,
01:08:42.800 | and I say we euphemistically because I myself didn't go
01:08:47.220 | because I only had two research permits,
01:08:49.900 | and I gave them to my students
01:08:52.160 | 'cause I felt like it was better for them
01:08:54.880 | to have that experience,
01:08:56.380 | and more formative for them to have that experience.
01:08:59.320 | But I was in contact with them every day by satellite phone.
01:09:03.040 | And this was to visit the Hadza hunter-gatherers
01:09:08.040 | in Tanzania, who are not an ancient people.
01:09:13.380 | They're a modern culture,
01:09:15.220 | but they live in circumstances,
01:09:18.180 | hunting and foraging,
01:09:20.180 | circumstances that are very similar,
01:09:24.460 | in similar conditions to our ancestors,
01:09:27.320 | hunting-gathering ancestors,
01:09:30.980 | when expressions of emotion were supposed to have evolved,
01:09:34.780 | at least by one view of, okay.
01:09:37.620 | So, you know, for many years,
01:09:41.540 | I was sort of struggling with this set of observations,
01:09:45.740 | right, which is that I feel emotion,
01:09:48.580 | and I perceive emotion in other people,
01:09:52.700 | but scientists can't find a single marker,
01:09:56.300 | a single biomarker,
01:09:57.700 | not a single individual measure or pattern of measures
01:10:01.640 | that can predict
01:10:04.980 | what kind of emotional state someone is in.
01:10:06.980 | How could that possibly be?
01:10:08.820 | How can you possibly make sense of those two things?
01:10:11.460 | And through a lot of reading
01:10:15.860 | and a lot of immersing myself in different literatures,
01:10:19.580 | I came to the hypothesis
01:10:22.080 | that the brain is constructing these instances
01:10:26.580 | out of more basic ingredients.
01:10:29.360 | So when I tell you that the brain,
01:10:32.100 | when I suggest to you that what your brain is doing
01:10:34.100 | is making a prediction,
01:10:37.020 | and it's asking itself, figuratively speaking,
01:10:41.020 | the last time I was in this situation
01:10:43.620 | and this physical state, what did I do next?
01:10:47.580 | What did I see next?
01:10:48.580 | What did I hear next?
01:10:50.040 | It's basically asking what in my past
01:10:53.520 | is similar to the present?
01:10:57.240 | Things which are similar to one another
01:11:01.340 | are called a category.
01:11:06.220 | A group of things which are similar
01:11:07.380 | to one another is a category.
01:11:09.460 | And a mental representation of a category is a concept.
01:11:13.500 | So your brain is constructing categories or concepts
01:11:17.440 | on the fly, continuously.
01:11:19.620 | So you really wanna understand what a brain is doing.
01:11:21.900 | You don't, using machine learning,
01:11:23.660 | like classification models is not gonna help you
01:11:26.340 | because the brain doesn't classify.
01:11:28.060 | It's doing category construction.
01:11:30.060 | And the categories change,
01:11:33.820 | or you could say it's doing concept construction.
01:11:35.700 | It's using past experience to conjure a concept,
01:11:39.700 | which is a prediction.
01:11:41.160 | And if it's using past experiences of emotion,
01:11:47.660 | then it's constructing an emotion concept.
01:11:50.140 | Your concept will be,
01:11:56.180 | the content of it is,
01:11:59.060 | changes depending on the situation that you're in.
01:12:04.180 | So for example, if your brain uses past experiences
01:12:07.100 | of anger that you have learned,
01:12:11.060 | either because somebody labeled them for you,
01:12:14.540 | taught them to you, you observed them in movies and so on,
01:12:18.600 | your concept for anger in one situation could be very different
01:12:21.420 | from your concept for anger in another situation.
01:12:24.860 | And this is how anger, instances of anger,
01:12:28.940 | are what we call a population of variable instances.
01:12:33.120 | Sometimes when you're angry, you scowl.
01:12:35.420 | Sometimes when you're angry, you might smile.
01:12:38.540 | Sometimes when you're angry, you might cry.
01:12:42.700 | Sometimes your heart rate will go up, it will go down,
01:12:45.420 | it will stay the same.
01:12:46.820 | It depends on what action you're about to take.
01:12:49.640 | Because the way prediction, and I should say,
01:12:52.180 | the idea that physiology is yoked to action
01:12:57.180 | is a very old idea
01:12:58.580 | in the study of the peripheral nervous system.
01:13:02.060 | It's been known for really decades.
01:13:04.100 | And so if you look at what the brain is doing,
01:13:07.820 | if you just look at the anatomy,
01:13:09.420 | and here's the hypothesis that you would come up with,
01:13:13.060 | and I can go into the details.
01:13:14.820 | I've published these details in scientific papers,
01:13:19.280 | and they also appear somewhat
01:13:21.300 | in "How Emotions Are Made," my first book.
01:13:23.660 | They are not in the seven and a half lessons
01:13:26.680 | because that book is really not pitched
01:13:31.000 | at that level of explanation.
01:13:33.060 | It's really just a set of little essays.
01:13:36.940 | But the evidence, but what I'm about to say
01:13:39.940 | is actually based on scientific evidence.
01:13:42.940 | When your brain begins to form a prediction,
01:13:48.100 | the first thing it's doing is it's making a prediction
01:13:53.100 | of how to change the internal systems of your body,
01:13:56.740 | your heart, your cardiovascular system,
01:13:58.740 | the control of your heart, control of your lungs,
01:14:01.860 | a flush of cortisol, which is not a stress hormone.
01:14:05.680 | It's a hormone that gets glucose
01:14:07.240 | into your bloodstream very fast
01:14:10.120 | because your brain is predicting
01:14:11.440 | you need to do something metabolically expensive.
01:14:15.440 | And so that means either move or learn.
01:14:22.920 | And so your brain is preparing your body,
01:14:28.440 | the internal systems of your body,
01:14:30.520 | to execute some actions, to move in some way.
01:14:34.220 | And then it infers based on those motor predictions
01:14:43.020 | and what we call visceromotor predictions,
01:14:45.200 | meaning the changes in the viscera
01:14:48.820 | that your brain is preparing to execute.
01:14:54.240 | Your brain makes an inference about what you will sense
01:14:59.240 | based on those motor movements.
01:15:01.600 | So your experience of the world
01:15:05.440 | and your experience of your own body
01:15:08.160 | are a consequence of those predictions, those concepts.
01:15:13.800 | When your brain makes a concept for emotion,
01:15:16.780 | it's constructing an instance of that emotion.
01:15:21.680 | That is how emotions are made.
01:15:24.240 | And those concepts load in.
01:15:27.440 | The predictions that are made
01:15:29.000 | include contents inside the body,
01:15:33.940 | contents outside the body.
01:15:36.560 | I mean, it includes other humans.
01:15:38.400 | So just this construction of a concept
01:15:42.160 | includes the variables that are much richer
01:15:46.200 | than just some sort of simple notion.
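A schematic sketch of the ordering just described, purely as a cartoon of the claim rather than anyone's working model: a "prediction" is assembled visceromotor first (budgeting the body), then motor (the prepared action), and the expected sensations are inferred from those. Every field name and value below is invented.

```python
from dataclasses import dataclass, field

@dataclass
class Prediction:
    # 1. Visceromotor: prepare the body's internal systems (heart, lungs, cortisol).
    visceromotor: dict = field(default_factory=dict)
    # 2. Motor: the action the body is being prepared to execute.
    motor: str = ""
    # 3. Sensory: what the brain expects to sense, inferred from the prepared action.
    expected_sensations: dict = field(default_factory=dict)

def construct_instance(situation: str) -> Prediction:
    """Build a prediction in the order: visceromotor -> motor -> inferred sensations."""
    p = Prediction()
    if situation == "approaching_deadline":
        p.visceromotor = {"heart_rate": "+20%", "cortisol": "release glucose"}
        p.motor = "sit and work intensely"          # metabolically expensive: learn
        p.expected_sensations = {"chest": "tight", "body": "restless"}
    else:
        p.visceromotor = {"heart_rate": "baseline", "cortisol": "none"}
        p.motor = "keep doing what you're doing"
        p.expected_sensations = {"body": "calm"}
    return p

if __name__ == "__main__":
    print(construct_instance("approaching_deadline"))
```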
01:15:51.200 | - Yeah, so our colloquial notion of a concept
01:15:53.440 | where I say, well, what's a concept of a bird?
01:15:58.440 | And then you list a set of features off to me.
01:16:02.800 | That's people's understanding,
01:16:04.760 | typically, of what a concept is.
01:16:05.900 | But if you go into the literature in cognitive science,
01:16:10.900 | what you'll see is that the way that scientists
01:16:14.100 | have understood what a concept is
01:16:15.740 | has really changed over the years.
01:16:17.520 | So people used to think about a concept
01:16:19.720 | as philosophers and scientists used to think
01:16:23.560 | about a concept as a dictionary definition for a category.
01:16:27.400 | So there's a set of things which are similar
01:16:29.600 | out in the world.
01:16:31.280 | And your concept for that category
01:16:36.280 | is a dictionary definition of the features,
01:16:39.440 | the necessary and sufficient features of those instances.
01:16:43.120 | So for a bird, it would be--
01:16:47.680 | - Wings, feathers.
01:16:49.520 | - Right, a beak, it flies, whatever, okay.
01:16:53.360 | That's called the classical category.
01:16:55.680 | And scientists discovered, observed,
01:16:58.160 | that actually not all instances of birds have feathers
01:17:02.840 | and not all instances of birds fly.
01:17:05.660 | And so the idea was that you don't have
01:17:08.400 | a single representation of necessary
01:17:11.080 | and sufficient features stored in your brain somewhere.
01:17:13.000 | Instead, what you have is a prototype.
01:17:15.820 | A prototype meaning you still have a single representation
01:17:19.480 | for the category, one,
01:17:20.960 | but the features are like of the most typical instance
01:17:27.240 | of the category or maybe the most frequent instance,
01:17:29.600 | but not all instances of the category
01:17:31.560 | have all the features, right?
01:17:32.920 | They have some graded similarity to the prototype.
01:17:36.600 | And then, you know,
01:17:42.900 | I'm gonna incredibly simplify a lot of work now
01:17:47.900 | to say that a series of experiments
01:17:52.340 | were done to show that, in fact,
01:17:55.180 | what your brain seems to be doing
01:17:57.900 | is coming up with a single exemplar
01:18:02.100 | or instance of the category and reading off the features
01:18:07.100 | when I ask you for the concept.
01:18:11.000 | So if we were in a pet store and I asked you,
01:18:16.000 | what are the features of a bird?
01:18:19.980 | Tell me the concept of bird.
01:18:21.860 | You would be more likely to give me features of a good pet,
01:18:29.220 | you know, like a budgie, right, or a canary.
01:18:32.920 | If we were in a restaurant,
01:18:34.060 | you would be more likely to give me the features
01:18:36.540 | of a bird that you would eat, like a chicken.
01:18:39.220 | And if we were in a park,
01:18:40.340 | you'd be more likely to give me in this country,
01:18:45.340 | you know, the features of a sparrow or a robin.
01:18:49.220 | Whereas if we were in South America,
01:18:50.660 | you would probably give me the features of a peacock
01:18:54.540 | because that's more common
01:18:56.300 | or it is more common there than here
01:18:59.220 | that you would see a peacock in such circumstances.
01:19:01.500 | So the idea was that really what your brain was doing
01:19:06.580 | was conjuring a concept on the fly
01:19:11.580 | that meets the function that the category is being put to.
01:19:16.740 | Okay? - Yep.
01:19:19.700 | - Okay.
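A small illustration of the exemplar idea above, with entirely made-up feature lists: rather than one fixed definition of "bird," the function below reads features off whichever stored instance best fits the current context (pet store, restaurant, park), so the concept that comes to mind changes with the situation.

```python
# Invented exemplars: past instances of "bird" tagged with the contexts they came from.
BIRD_EXEMPLARS = [
    {"name": "budgie",  "contexts": {"pet store"},  "features": ["small", "colorful", "sings", "good pet"]},
    {"name": "chicken", "contexts": {"restaurant"}, "features": ["edible", "roasted", "served on a plate"]},
    {"name": "sparrow", "contexts": {"park"},       "features": ["small", "brown", "flies", "hops on the ground"]},
]

def concept_of_bird(context: str) -> list[str]:
    """Read features off the exemplar that best matches the current context.

    Falls back to pooling all stored features if the context is unfamiliar,
    a crude way to mimic constructing a concept 'on the fly'.
    """
    for exemplar in BIRD_EXEMPLARS:
        if context in exemplar["contexts"]:
            return exemplar["features"]
    pooled = []
    for exemplar in BIRD_EXEMPLARS:
        pooled.extend(exemplar["features"])
    return pooled

if __name__ == "__main__":
    for place in ["pet store", "restaurant", "park", "spaceship"]:
        print(place, "->", concept_of_bird(place))
```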
01:19:20.540 | Then people started studying ad hoc concepts,
01:19:26.380 | meaning concepts that,
01:19:31.160 | where the instances don't share any physical features.
01:19:36.640 | But the function of the instances are the same.
01:19:40.320 | So for example, think about all the things
01:19:42.360 | that can protect you from the rain.
01:19:44.860 | What are all the things that can protect you from the rain?
01:19:49.120 | - Umbrella, like this apartment.
01:19:54.120 | - Right.
01:19:55.160 | Your car. - Not giving a damn.
01:19:58.700 | Like a mindset.
01:20:03.520 | - Yeah, right, right.
01:20:05.400 | So the idea is that the function of the instances
01:20:09.000 | is the same in a given situation,
01:20:11.480 | even if they look different, sound different,
01:20:13.840 | smell different, this is called an abstract concept
01:20:17.920 | or a conceptual concept.
01:20:21.600 | Now, the really cool thing about conceptual categories
01:20:27.320 | yes, conceptual categories,
01:20:29.880 | a conceptual category is a category of things
01:20:31.880 | that are held together by a function,
01:20:35.480 | which is called an abstract concept or a conceptual category
01:20:39.040 | 'cause the things don't share physical features,
01:20:41.640 | they share functional features.
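A sketch of an ad hoc, function-based category like "things that protect you from the rain": the members below share no physical features, only a function they can serve, and membership is computed from that function rather than from appearance. The objects and their function tags are invented.

```python
# Invented objects described only by the functions they can serve, not how they look.
OBJECTS = {
    "umbrella":   {"keeps rain off", "portable"},
    "apartment":  {"keeps rain off", "shelter", "keeps you warm"},
    "car":        {"keeps rain off", "transport"},
    "mindset of not caring": {"keeps rain off (subjectively)", "free"},
    "banana":     {"edible"},
}

def ad_hoc_category(required_function: str) -> list[str]:
    """Group physically dissimilar things by whether they can serve a given function."""
    return [name for name, functions in OBJECTS.items()
            if any(required_function in f for f in functions)]

if __name__ == "__main__":
    print("protects you from the rain:", ad_hoc_category("keeps rain off"))
```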
01:20:44.040 | There are two really cool things about this.
01:20:46.220 | One is that's what Darwin said a species was.
01:20:50.300 | So Darwin is known for discovering natural selection.
01:20:59.740 | But the other thing he really did,
01:21:02.300 | which was really profound, which he's less celebrated for,
01:21:06.420 | is understanding that all biological categories
01:21:10.520 | have inherent variation, inherent variation.
01:21:15.000 | Darwin wrote about this in "The Origin of Species."
01:21:19.180 | Before Darwin's book,
01:21:22.300 | a species was thought to be a classical category
01:21:27.420 | where all the instances of dogs were the same,
01:21:30.740 | had exactly the same features,
01:21:32.180 | and any variation from that perfect platonic instance
01:21:37.180 | was considered to be error.
01:21:41.380 | And Darwin said, "No, it's not error, it's meaningful."
01:21:44.980 | Nature selects on the basis of that variation.
01:21:50.660 | The reason why natural selection is powerful and can exist
01:21:56.020 | is because there is variation in a species.
01:21:59.860 | And in dogs, we talk about that variation
01:22:03.940 | in terms of the size of the dog
01:22:06.020 | and the amount of fur the dog has and the color
01:22:09.620 | and how long is the tail and how long is the snout.
01:22:12.840 | In humans, we talk about that variation
01:22:17.500 | in all kinds of ways, right, including in cultural ways.
01:22:24.660 | So that's one thing that's really interesting
01:22:26.900 | about conceptual categories is that Darwin
01:22:29.300 | was basically saying a species is a conceptual category.
01:22:32.660 | And in fact, if you look at modern debates
01:22:35.780 | about what is a species, you can't find anybody agreeing
01:22:40.180 | on what the criteria are for a species
01:22:43.020 | 'cause they don't all share the same genome.
01:22:45.220 | We don't all share, we don't.
01:22:47.700 | There isn't a single human genome.
01:22:49.780 | There's a population of genomes,
01:22:54.100 | but they're variable.
01:22:56.140 | It's not unbounded variation, but they are variable, right?
01:23:00.580 | And the other thing that's really cool
01:23:02.980 | about conceptual categories is that they are the categories
01:23:07.980 | that we use to make civilization.
01:23:13.900 | So think about money, for example.
01:23:17.940 | What are all the physical things
01:23:21.720 | that make something a currency?
01:23:25.460 | Is there any physical feature that all the currencies
01:23:28.660 | in all the worlds that's ever been used by humans share?
01:23:32.480 | - Well, certainly, right, but what is it?
01:23:36.620 | Is it definable?
01:23:37.760 | So it's getting to the point that you make this function.
01:23:42.940 | - It's the function, right.
01:23:44.740 | It's that we trade it for material goods.
01:23:47.220 | And we have to agree, right?
01:23:49.020 | We all impose on whatever it is, salt, barley,
01:23:52.700 | little shells, big rocks in the ocean that can't move,
01:23:55.500 | Bitcoin, pieces of plastic, mortgages,
01:23:58.260 | which are basically a promise of something in the future,
01:24:00.740 | nothing more, right?
01:24:02.300 | All of these things, we impose value on them.
01:24:06.300 | And we all agree that we can exchange them
01:24:09.740 | for material goods.
01:24:11.260 | - Yeah, and yes, that's brilliant.
01:24:13.860 | By the way, you're attributing some of that to Darwin,
01:24:16.100 | that he thought--
01:24:16.940 | - No, I'm saying that what Darwin--
01:24:18.540 | - 'Cause that's a brilliant view
01:24:19.540 | of what a species is, is the function.
01:24:21.780 | - Yeah, what I'm saying is that what Darwin,
01:24:24.540 | Darwin really talked about variation in,
01:24:28.300 | so if you read, for example, the biologist Ernst Mayr,
01:24:31.140 | who was an evolutionary biologist,
01:24:33.260 | and then when he retired, became a historian
01:24:36.220 | and philosopher of biology.
01:24:38.420 | And his suggestion is that Darwin,
01:24:42.980 | Darwin did talk about variation.
01:24:45.660 | He vanquished what's called essentialism,
01:24:48.060 | the idea that there's a single set of features
01:24:51.980 | that define any species.
01:24:54.860 | And out of that grew really discussions
01:25:01.100 | of the, like some of the functional features
01:25:06.380 | that species have, like they can reproduce,
01:25:08.540 | they can have offspring,
01:25:10.700 | the individuals of a species can have offspring.
01:25:12.740 | It turns out that's not a perfect,
01:25:16.420 | that's not a perfect criterion to use,
01:25:18.900 | but it's a functional criterion, right?
01:25:20.660 | So what I'm saying is that in cognitive science,
01:25:23.060 | people came up with the idea,
01:25:24.660 | they discovered the idea of conceptual categories
01:25:26.940 | or ad hoc concepts, these concepts that can change
01:25:30.860 | based on the function they're serving, right?
01:25:33.420 | And that it's there, it's in Darwin,
01:25:38.420 | and it's also in the philosophy of social reality.
01:25:42.260 | The way that philosophers talk about social reality,
01:25:44.860 | just look around you.
01:25:46.180 | I mean, we impose, we're treating a bunch of things
01:25:49.020 | as similar, which are physically different.
01:25:52.020 | And sometimes we take things that are physically the same
01:25:55.900 | and we treat them as separate categories.
01:25:58.340 | - But it feels like the number of variables involved
01:26:02.260 | in that kind of categorization is nearly infinite.
01:26:04.820 | And that's-- - No, I don't think so
01:26:06.140 | because there is a physical constraint, right?
01:26:08.540 | Like you and I could agree that we can fly in real life,
01:26:13.700 | but we can't, that's a physical constraint
01:26:16.780 | that we can't break, right?
01:26:18.820 | You and I could agree that we could walk through the walls,
01:26:21.500 | but we can't, we could agree that we could eat glass,
01:26:24.180 | but we can't. - Oh, there's a lot
01:26:25.020 | of constraints, but I just-- - Yeah, we can agree
01:26:26.620 | that the virus doesn't exist
01:26:28.700 | and we don't have to wear masks.
01:26:30.220 | - Right, yeah.
01:26:33.020 | - But physical reality still holds the trump card, right?
01:26:37.220 | But still there's a lot of-- - The trump card,
01:26:39.420 | well, pun unintended.
01:26:41.580 | - Pun completely unintended, but there you go,
01:26:43.140 | that's a predicting brain for you.
01:26:44.900 | (Lex laughs)
01:26:46.100 | But there's a tremendous amount of leeway.
01:26:49.620 | - Yes. - Yeah, that's the point.
01:26:51.620 | So what I'm saying is that emotions are like money.
01:26:54.540 | Basically, they're like money, they're like countries,
01:26:59.020 | they're like kings and queens and presidents,
01:27:02.380 | they're like everything that we construct
01:27:05.220 | that we impose meaning on.
01:27:07.340 | We take these physical signals and we give them meanings
01:27:10.500 | that they don't otherwise have by their physical nature.
01:27:13.860 | And because we agree, they have that function.
01:27:18.860 | - But the beautiful thing, so maybe unlike money,
01:27:23.020 | I love this similarity is, it's not obvious to me
01:27:26.680 | that this kind of emergent agreement
01:27:29.460 | should happen with emotion,
01:27:31.140 | because our experiences are so different
01:27:33.320 | for each of us humans, and yet we kind of converge.
01:27:36.580 | - Well, in a culture we converge, but not across cultures.
01:27:39.760 | There are huge, huge differences.
01:27:41.700 | There are huge differences in what concepts exist,
01:27:44.820 | what they look like.
01:27:48.500 | So what I would say is that--
01:27:51.100 | - They feel like.
01:27:52.060 | - What we're doing with our young children
01:27:55.860 | as their brains become wired to their physical
01:28:01.900 | and their social environment, right,
01:28:03.860 | is that we are curating for them,
01:28:05.980 | we are bootstrapping into their brains a set of emotion concepts.
01:28:10.220 | That's partly what they're learning.
01:28:13.060 | And we curate those for infants,
01:28:15.100 | just the way we curate for them, what is a dog?
01:28:17.260 | What is a cat?
01:28:18.100 | What is a truck?
01:28:19.660 | We sometimes explicitly label,
01:28:22.380 | and we sometimes just use mental words.
01:28:25.200 | When your kid is throwing Cheerios on the floor
01:28:30.660 | instead of eating them, or your kid is crying
01:28:33.700 | when she won't put herself to sleep or whatever.
01:28:37.420 | We use mental words.
01:28:39.420 | And for infants, words are these really special things
01:28:44.040 | that help infants learn abstract categories.
01:28:46.600 | There's a huge literature showing that infants,
01:28:49.360 | really young infants, pre-verbal infants,
01:28:53.000 | can take things that don't look alike,
01:28:56.440 | if you label them.
01:29:00.480 | So if I say to you,
01:29:04.500 | and you're an infant, okay?
01:29:07.040 | So I say, "Lex, Lexi, this is a bling."
01:29:12.040 | And I put it down and the bling makes a squeaky noise.
01:29:17.120 | And then I say, "Lexi, this is a bling."
01:29:22.120 | And I put it down and it makes a squeaky noise.
01:29:24.800 | And then I say, "Lexi, this is a bling.
01:29:29.800 | "You, as young as four months old,
01:29:34.100 | "will expect this to make a noise, a squeaky noise.
01:29:39.100 | "And if you don't, if it doesn't, you'll be surprised
01:29:41.880 | "because it violated your expectation, right?
01:29:44.640 | "I'm building for you an internal model of a bling."
01:29:49.640 | Okay, infants can do this really, really at a young age.
01:29:53.260 | And so there's no reason to believe
01:29:55.660 | that they couldn't learn emotion categories
01:29:58.500 | and concepts in the same way.
01:29:59.740 | And what happens when you go to a new culture?
01:30:04.380 | When you go to a new culture,
01:30:05.820 | you have to do what's called emotion acculturation.
01:30:11.380 | So my colleague, Batia Mesquita in Belgium,
01:30:13.760 | studies emotion acculturation.
01:30:15.140 | She studies how when people move
01:30:16.620 | from one culture to another,
01:30:18.140 | how do they learn the emotion concepts of that culture?
01:30:21.780 | How do they learn to make sense
01:30:23.500 | of their own internal sensations
01:30:25.980 | and also the movements, the raise of an eyebrow,
01:30:29.700 | the tilt of a head?
01:30:30.620 | How do they learn to make sense of cues from other people
01:30:34.720 | using concepts they don't have,
01:30:37.820 | but have to make on the fly?
01:30:40.660 | - So there's the difference in cultures.
01:30:43.300 | Let me open another door.
01:30:45.260 | I'm not sure I wanna open.
01:30:46.480 | But difference between men and women.
01:30:49.400 | Is there a difference between the emotional lives
01:30:53.940 | of those two categories of biological systems?
01:30:58.220 | - So here's what I would say.
01:31:00.340 | We did a series of studies in the 1990s
01:31:04.780 | where we asked men and women
01:31:07.340 | to tell us about their emotional lives.
01:31:10.040 | And women described themselves
01:31:11.380 | as much more emotional than men.
01:31:13.660 | They believed that they were more emotional than men,
01:31:15.480 | and men agreed.
01:31:16.340 | Women are much more emotional than men.
01:31:19.020 | Okay, and then we gave them little handheld computers.
01:31:24.020 | These were little Hewlett-Packard computers.
01:31:26.380 | They fit in the palm of your hand.
01:31:28.340 | A couple of pounds, they weighed a couple of pounds.
01:31:29.740 | So this was like pre-Palm Pilot even.
01:31:31.700 | Like this was, you know, 1990s, like early.
01:31:36.700 | And we asked them, we would, you know,
01:31:41.800 | ping them like 10 times a day,
01:31:45.820 | and just ask them to report how they were feeling,
01:31:48.940 | which is called experience sampling.
01:31:50.840 | So we experience sampled.
01:31:53.980 | And then at the end, and then we looked at their reports,
01:31:58.980 | and what we found is that men and women
01:32:01.100 | basically didn't differ.
01:32:02.540 | And there were some people who were really,
01:32:05.580 | had many more instances of emotion.
01:32:07.820 | So they were, you know, they were treading water
01:32:12.820 | in a tumultuous sea of emotion.
01:32:16.020 | And then there were other people
01:32:17.400 | who were like floating tranquilly, you know, in a lake.
01:32:21.100 | It was really not perturbed very often,
01:32:23.100 | and everyone in between.
01:32:25.180 | But there was no difference between men and women.
01:32:28.300 | And the really interesting thing is
01:32:29.140 | at the end of the sampling period, we asked people,
01:32:32.740 | so reflect over the past two weeks and tell us.
01:32:36.840 | So, you know, we've been now pinging people
01:32:38.620 | like again and again and again, right?
01:32:41.140 | So tell us how emotional do you think you are?
01:32:44.160 | No change from the beginning.
01:32:45.460 | So men and women believe that they are different.
01:32:51.120 | And when they are looking at other people,
01:32:54.120 | they make different inferences about emotion.
01:32:57.680 | If a man is scowling, like if you and I were together,
01:33:01.840 | and so somebody's watching this, okay?
01:33:05.280 | And yeah, hey, who am I saying?
01:33:08.000 | Hey, hi, yeah, hi.
01:33:09.600 | - By the way, people love it when you look at the camera.
01:33:16.280 | If you and I make exactly the same set of facial movements,
01:33:21.160 | when people look at you, both men and women look at you,
01:33:25.960 | they are more likely to think,
01:33:27.800 | oh, he's reacting to the situation.
01:33:30.760 | And when they look at me, they'll say,
01:33:33.360 | oh, she's having an emotion.
01:33:34.720 | She's, you know, yeah.
01:33:36.280 | And I wrote about this actually
01:33:38.040 | right before the 2016 election.
01:33:46.340 | - You know what, maybe I could confess.
01:33:49.260 | Let me try to carefully confess.
01:33:51.740 | - But you are really gonna.
01:33:53.100 | - Yeah, that when I,
01:33:57.600 | that there is an element when I see Hillary Clinton,
01:34:01.360 | that there was something annoying about her to me.
01:34:07.100 | And I, just that feeling,
01:34:09.420 | and then I tried to reduce that to,
01:34:13.380 | what is that?
01:34:14.220 | Because I think the same attributes
01:34:17.340 | that are annoying about her,
01:34:19.620 | when I see in other people, wouldn't be annoying.
01:34:22.100 | So I was trying to understand, what is it?
01:34:25.040 | Because it certainly does feel like that concept
01:34:28.060 | that I've constructed in my mind.
01:34:30.100 | - Well, I'll tell you that I think,
01:34:32.200 | well, let me just say that what you would predict about,
01:34:35.980 | for example, the performance of the two of them
01:34:39.060 | in the debates,
01:34:40.740 | and I wrote an op-ed for the New York Times actually,
01:34:43.900 | before the second debate.
01:34:46.340 | And it played out really pretty much as I thought
01:34:49.420 | that it would, based on research.
01:34:51.220 | It's not like I'm like a great fortune teller or anything.
01:34:53.260 | It's just, I was just applying the research,
01:34:54.580 | which was that when it comes to a woman,
01:34:57.300 | people make what are called internal attributions.
01:35:03.260 | They infer that the facial movements and body posture
01:35:07.220 | and vocalizations of a woman reflect her inner state,
01:35:10.620 | but for a man, they're more likely to assume
01:35:12.900 | that they reflect his response to the situation.
01:35:15.300 | It doesn't say anything about him.
01:35:16.460 | It says something about the situation he's in.
01:35:18.540 | - That's brilliant.
01:35:19.380 | - Now, for the thing that you were describing
01:35:22.900 | about Hillary Clinton,
01:35:24.140 | I think a lot of people experienced,
01:35:27.820 | but it's also in line with research, which shows,
01:35:30.660 | and particularly research actually
01:35:32.540 | about teaching evaluations is one place
01:35:36.180 | that you really see it,
01:35:37.620 | where the expectation is that a woman will be nurturant
01:35:42.060 | and that a man, there's just no expectation
01:35:46.180 | for him to be nurturant.
01:35:47.340 | So he's, if he is nurturant, he gets points.
01:35:50.800 | If he's not, he gets points.
01:35:54.260 | They're just different points, right?
01:35:56.020 | Whereas for a woman, especially a woman
01:35:58.500 | who's an authority figure, she's really in a catch-22.
01:36:02.500 | 'Cause if she's serious, she's a bitch.
01:36:05.460 | And if she's empathic, then she's weak.
01:36:09.500 | - Right, that's brilliant.
01:36:10.820 | I mean, one of the bigger questions to ask here,
01:36:14.140 | so that's one example where our construction
01:36:17.020 | of concepts gets us in trouble.
01:36:20.420 | - But remember I said science and philosophy
01:36:24.260 | are like tools for living.
01:36:26.060 | So I learned recently that, if you ask me,
01:36:30.460 | what is my intuition about what regulates my eating?
01:36:34.260 | I will say carbohydrates.
01:36:36.380 | I love carbohydrates.
01:36:37.500 | I love pasta.
01:36:38.500 | I love bread.
01:36:39.380 | I just love carbohydrates.
01:36:42.220 | But actually, research shows, and it's beautiful research.
01:36:46.020 | I love this research 'cause it so violates my own
01:36:49.180 | deeply, deeply held beliefs about myself,
01:36:52.840 | that most animals on this planet who have been studied,
01:36:57.500 | and there are many, actually eat
01:37:00.660 | to regulate their protein intake.
01:37:04.180 | So you will overeat carbohydrates
01:37:06.460 | in order to get enough protein.
01:37:10.020 | And this research has been done with human,
01:37:12.340 | very beautiful research, with humans, with crickets,
01:37:15.820 | with like bonobos, I mean, just like
01:37:17.780 | all these different animals, not bonobos,
01:37:19.220 | but I think like baboons.
01:37:20.460 | Now, I have no intuition about that.
01:37:24.620 | And I, even now, as I regulate my eating,
01:37:27.820 | I still just have no intuition.
01:37:30.540 | It just, I can't feel it.
01:37:32.820 | What I feel is only about the carbohydrates.
01:37:35.820 | - It feels like you're regulating around carbohydrates,
01:37:38.060 | not the protein.
01:37:38.900 | - Yeah, but in fact, actually, what I am doing,
01:37:41.140 | if I am like most animals on the planet,
01:37:43.980 | I am regulating around protein.
01:37:45.260 | So knowing this, what do I do?
01:37:48.500 | I correct my behavior to eat,
01:37:51.620 | to actually deliberately try to focus on the protein.
01:37:55.760 | This is the idea behind bias training, right?
01:38:00.180 | Like if you, I also did not experience Hillary Clinton
01:38:05.180 | as the warmest candidate.
01:38:12.860 | However, you can use consistent science,
01:38:17.860 | consistent scientific findings,
01:38:21.380 | to organize your behavior.
01:38:23.740 | That doesn't mean that rationality is the absence of emotion
01:38:28.460 | because sometimes emotion or feelings in general,
01:38:33.220 | not the same thing as emotion, that's another topic,
01:38:37.340 | but are a source of information
01:38:41.580 | and they're wisdom and helpful.
01:38:42.860 | So I'm not saying that,
01:38:44.420 | but what I am saying is that
01:38:45.420 | if you have a deeply held belief
01:38:46.860 | and the evidence shows that you're wrong,
01:38:49.380 | then you're wrong.
01:38:50.380 | It doesn't really matter how confident you feel.
01:38:52.780 | That confidence could be also explained by science, right?
01:38:56.180 | So it would be the same thing as if I,
01:38:59.660 | regardless of whether someone is a,
01:39:01.260 | like Charlie Baker, right?
01:39:02.340 | Regardless of whether somebody's a Republican
01:39:04.380 | or a Democrat,
01:39:05.220 | if that person has a record that you can see
01:39:09.300 | is consistent with what you believe,
01:39:11.900 | then that is information that you can act on.
01:39:14.720 | - Yeah, and then, so try to,
01:39:17.960 | I mean, this is kind of what empathy is
01:39:20.080 | and open-mindedness is.
01:39:21.240 | Try to consider that the set of concepts
01:39:25.260 | that your brain has constructed
01:39:27.420 | through which you are now perceiving the world
01:39:30.260 | is not painting the full picture.
01:39:32.260 | I mean, this is now true for basically every,
01:39:35.180 | it doesn't have to be men and women,
01:39:36.500 | it could be basically the prism
01:39:38.860 | through which we perceive actually
01:39:39.860 | the political discourse, right?
01:39:41.700 | - Absolutely.
01:39:42.540 | So here's what I would say.
01:39:44.460 | You know, there are people who,
01:39:51.100 | scientists who will talk to you about cognitive empathy
01:39:53.620 | and emotional empathy and I prefer to think of it,
01:39:58.620 | I think the evidence is more consistent
01:40:01.980 | with what I'm about to say,
01:40:03.140 | which is that your brain is always making predictions
01:40:08.140 | using your own past experience
01:40:10.360 | and what you've learned from books and movies
01:40:13.640 | and other people telling you
01:40:14.900 | about their experiences and so on.
01:40:16.600 | And if your brain cannot make a concept
01:40:22.820 | to make sense of those,
01:40:24.100 | anticipate what those sense data are
01:40:25.980 | and make sense of them,
01:40:27.660 | you will be experientially blind.
01:40:30.040 | So, you know, when I'm giving lectures to people,
01:40:34.300 | I'll show them like a blobby black and white image
01:40:38.060 | and they're experientially blind to the image,
01:40:42.060 | they can't see anything in it
01:40:44.340 | and then I show them a photograph
01:40:45.980 | and then I show them the image again,
01:40:47.480 | the blobby image and then they see actually an object in it.
01:40:51.180 | But the image is the same.
01:40:53.880 | It's there, they're actually adding,
01:40:55.740 | their predictions now are adding, right?
01:40:57.780 | Or anyone who's-- - It's a beautiful example.
01:40:59.620 | - Anybody who's learned a language,
01:41:02.520 | a second language after their first language
01:41:07.420 | also has this experience of things
01:41:10.820 | that initially sound like sounds
01:41:12.360 | that they can't quite make sense of
01:41:14.140 | eventually come to make,
01:41:16.140 | they eventually come to make sense of them.
01:41:18.180 | And in fact, there are really cool examples
01:41:20.660 | of people who were like born blind
01:41:23.340 | because they have cataracts or they have corneal damage
01:41:28.260 | so that no light is reaching the brain
01:41:33.260 | and then they have an operation
01:41:36.020 | and then light reaches the brain and they can't see.
01:41:39.600 | For days and weeks and sometimes years,
01:41:45.140 | they are experientially blind to certain things.
01:41:48.540 | So what happens with empathy, right?
01:41:51.100 | Is that your brain is making a prediction.
01:41:54.860 | And if it doesn't have the capacity to make,
01:41:59.860 | if you don't share, if you're not similar,
01:42:06.660 | remember, I mean, categories are instances
01:42:10.580 | which are similar in some way.
01:42:12.060 | If you are not similar enough to that person,
01:42:16.100 | you will have a hard time making a prediction
01:42:17.860 | about what they feel.
01:42:19.460 | You will be experientially blind to what they feel.
01:42:22.620 | In the United States, children of color
01:42:29.460 | are under-prescribed medicine by their physicians.
01:42:33.660 | This has been documented.
01:42:37.580 | It's not that the physicians are racist necessarily,
01:42:45.420 | but they might be experientially blind.
01:42:48.100 | The same thing is true of male physicians
01:42:54.580 | with female patients.
01:42:56.860 | I could tell you some hair-raising stories,
01:42:59.780 | really, where people die as a consequence
01:43:03.100 | of a physician making the wrong inference,
01:43:07.180 | the wrong prediction,
01:43:08.940 | because of being experientially blind.
01:43:12.300 | So we are, you know, empathy is not, it's not magic.
01:43:17.300 | We make inferences about each other,
01:43:23.660 | about what each other's feeling and thinking.
01:43:26.220 | In this culture, more than,
01:43:28.260 | and there are some cultures where people
01:43:30.660 | have what's called opacity of mind,
01:43:33.540 | where they will make a prediction
01:43:34.860 | about someone else's actions,
01:43:35.900 | but they're not inferring anything
01:43:37.180 | about the internal state of that person.
01:43:39.980 | But in our culture, we're constantly making inferences.
01:43:43.480 | What is this person thinking?
01:43:44.980 | And we're not doing it necessarily consciously,
01:43:47.300 | but we're just doing it really automatically
01:43:48.780 | using our predictions, what we know.
01:43:51.660 | And if you expose yourself to information,
01:43:56.660 | which is very different from somebody else,
01:43:59.980 | I mean, really what we have is,
01:44:02.620 | we have different cultures in this country right now
01:44:06.380 | that are, there are a number of reasons for this.
01:44:08.900 | I mean, part of it is, I don't know if you saw
01:44:10.580 | the "Social Dilemma," the Netflix documentary.
01:44:13.540 | - Heard about it.
01:44:15.540 | - Yeah, it's a great, it's really great documentary.
01:44:19.660 | - About what social networks are doing to our society?
01:44:23.220 | - Yeah, yeah.
01:44:24.700 | But you know, nothing, no phenomenon
01:44:28.200 | has a simple single cause.
01:44:31.660 | There are multiple small causes
01:44:34.980 | which all add up to a perfect storm.
01:44:37.060 | That's just how most things work.
01:44:41.860 | And so the fact that machine learning algorithms
01:44:45.100 | are serving people up information on social media
01:44:48.820 | that is consistent with what they've already viewed
01:44:51.340 | is part of the reason
01:44:55.580 | that you have these silos.
01:44:57.580 | But it's not the only reason why you have these silos,
01:45:00.820 | I think.
01:45:01.660 | There are other things afoot
01:45:04.500 | that enhance people's inability
01:45:09.500 | to even have a decent conversation.
01:45:13.340 | - Yeah, I mean, okay, so many things you said
01:45:15.700 | are just brilliant.
01:45:16.740 | So the experiment, experiential blindness.
01:45:20.020 | But also from my perspective,
01:45:21.740 | like I preach and I try to practice empathy a lot.
01:45:26.740 | And something about the way you've explained it
01:45:30.780 | makes me almost see it as a kind of exercise
01:45:33.620 | that we should all do, like to train,
01:45:35.820 | like to add experiences to the brain
01:45:38.740 | to expand this capacity to predict more effectively.
01:45:42.660 | - Absolutely.
01:45:43.620 | - So like what I do is kind of like a method acting thing,
01:45:47.660 | which is I imagine what the life of a person is like.
01:45:51.280 | Just think.
01:45:53.820 | I mean, this is something you see with Black Lives Matter
01:45:56.020 | and police officers.
01:45:58.220 | It feels like they're both, not both,
01:46:01.900 | but I have, because of martial arts and so on,
01:46:03.940 | I have a lot of friends who are cops.
01:46:06.580 | They don't necessarily have empathy
01:46:10.220 | or visualize the experience of the other.
01:46:14.940 | Certainly, currently, unfortunately,
01:46:17.000 | people aren't doing that with police officers.
01:46:19.380 | They're not imagining, they're not empathizing
01:46:22.980 | or putting themselves in the shoes of a police officer
01:46:26.300 | to realize how difficult that job is,
01:46:28.620 | how dangerous it is, how difficult it is to maintain calm
01:46:32.380 | and under so much uncertainty, all those kind of things.
01:46:35.140 | - But there's more, there's even, that's all that's true,
01:46:37.740 | but I think that there's even more,
01:46:39.500 | there's even more to be said there.
01:46:41.300 | I mean, like from a predicting brain standpoint,
01:46:44.400 | there's even more that can be said there.
01:46:47.100 | So I don't know if you wanna go down that path
01:46:48.680 | or you wanna stick on empathy,
01:46:49.920 | but I will also say that one of the things
01:46:52.980 | that I was most gratified by, I still am receiving,
01:46:57.020 | it's been more than three and a half years
01:46:59.700 | since "How Emotions Are Made" came out,
01:47:00.980 | and I'm still receiving daily emails from people, right?
01:47:04.420 | So that's gratifying.
01:47:05.800 | But one of the most gratifying emails I received
01:47:09.720 | was from a police officer in Texas
01:47:12.780 | who told me that he thought that "How Emotions Are Made"
01:47:17.160 | contained information that would be really helpful
01:47:25.280 | to resolving some of these difficulties.
01:47:28.840 | And he hadn't even read my op-ed piece
01:47:33.100 | about when is a gun not a gun,
01:47:35.060 | and using what we know about the science of perception
01:47:39.260 | from a prediction standpoint, like the brain is a predictor,
01:47:43.660 | to understand a little differently
01:47:45.820 | what might be happening in these circumstances.
01:47:49.200 | So there's a real, what's hard about,
01:47:52.360 | it's hard to talk about
01:47:53.360 | because everyone gets mad at you when you talk about this.
01:47:58.360 | And there is a way to understand this
01:48:02.620 | which has profound empathy
01:48:05.020 | for the suffering of people of color
01:48:10.020 | and that definitely is in line with Black Lives Matter
01:48:15.020 | at the same time as understanding
01:48:18.100 | the really difficult situation
01:48:21.080 | that police officers find themselves in.
01:48:23.460 | And I'm not talking about this bad apple or that bad apple.
01:48:26.500 | I'm not talking about police officers
01:48:28.020 | who are necessarily shooting people in the back
01:48:30.240 | as they run away.
01:48:31.260 | I'm talking about the cases of really good,
01:48:34.420 | well-meaning cops who have the kind of predicting brain
01:48:39.420 | that everybody else has.
01:48:41.340 | They're in a really difficult situation
01:48:45.960 | that I think both they and the people
01:48:50.420 | who are harmed don't realize.
01:48:54.980 | The way that these situations are constructed,
01:48:58.300 | I think it's just, there's a lot to be said there, I guess,
01:49:01.420 | is what I wanna say. - Yeah, is there something
01:49:03.340 | we can try to say in a sense?
01:49:05.900 | From the perspective of the predictive brain,
01:49:09.660 | which is a fascinating perspective to take on this,
01:49:14.280 | all the protests that are going on,
01:49:18.100 | there seems to be a concept of a police officer being built.
01:49:22.540 | - No, I think that concept is there.
01:49:25.640 | - But it's gaining strength, so it's being re--
01:49:29.340 | I mean, is it sure? - Yeah, it is, yeah.
01:49:32.100 | - It is there. - For sure.
01:49:32.940 | But I think, yeah, for sure, I think that that's right.
01:49:35.400 | I think that there's a shift in
01:49:40.400 | what I would call a stereotype.
01:49:44.380 | There's a stereotype of a black man in this country
01:49:48.280 | that's always in movies and television,
01:49:50.760 | not always, but largely, that many people watch.
01:49:55.760 | I mean, you think you're watching a 10 o'clock drama
01:50:00.220 | and all you're doing is kicking back and relaxing,
01:50:02.680 | but actually, you're having certain predictions reinforced
01:50:07.060 | and others not.
01:50:07.980 | And what's happening now with police is the same thing,
01:50:13.580 | that there are certain stereotypes of a police officer
01:50:16.940 | that are being abandoned and other stereotypes
01:50:19.580 | that are being reinforced by what you see happening.
01:50:23.940 | All I'll say is that if you remember,
01:50:26.500 | I mean, there's a lot to say about this, really,
01:50:28.540 | that regardless of whether it makes people mad or not,
01:50:33.220 | I mean, the science is what it is.
01:50:36.020 | Just remember what I said.
01:50:39.180 | The brain makes predictions about internal changes
01:50:44.180 | in the body first and then it starts to prepare motor action
01:50:49.180 | and then it makes a prediction about what you will see
01:50:53.020 | and hear and feel based on those actions.
01:50:55.800 | So it's also the case that we didn't talk about
01:51:01.780 | is that sensory sampling, like your brain's ability
01:51:05.620 | to sample what's out there, is yoked to your heart rate.
01:51:10.620 | It's yoked to your heartbeats.
01:51:12.940 | There are certain phases of the heartbeat
01:51:15.300 | where it's easier for you to see what's happening
01:51:17.660 | in the world than in others.
01:51:20.100 | And so if your heart rate goes through the roof,
01:51:23.440 | you will be more likely to just go with your prediction
01:51:30.100 | and not correct based on what's out there
01:51:34.780 | because you're actually literally not seeing as well.
01:51:37.880 | Or you will see things that aren't there, basically.
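As a minimal sketch of that last point, under invented assumptions (the weighting function, the heart-rate range, and the numbers are illustrative, not physiology), here is one way rising arousal could tilt a percept toward the prior prediction and away from incoming sensory evidence:

```python
# Toy sketch with assumed numbers: higher arousal -> the percept leans on the prediction.

def perceive(prediction: float, sensory_input: float, heart_rate_bpm: float) -> float:
    """Blend prediction and sensory evidence; arousal increases the prediction's weight."""
    # Map heart rate (assumed 60-180 bpm range) to a prediction weight between 0.3 and 0.95.
    arousal = min(max((heart_rate_bpm - 60) / 120, 0.0), 1.0)
    w_prediction = 0.3 + 0.65 * arousal
    return w_prediction * prediction + (1 - w_prediction) * sensory_input

# prediction = 1.0 ("threat"), sensory evidence = 0.0 ("no threat")
print(perceive(1.0, 0.0, heart_rate_bpm=70))   # ~0.35: evidence still corrects the prediction
print(perceive(1.0, 0.0, heart_rate_bpm=170))  # ~0.90: the percept mostly follows the prediction
```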
01:51:42.020 | - Is there something that we could say by way of advice
01:51:48.420 | for when this episode is released
01:51:52.540 | in the chaos of emotion,
01:51:57.140 | sorry, I don't know a better term,
01:51:58.940 | that's just flying around on social media?
01:52:02.820 | - Well, I actually think it is emotion in the following sense.
01:52:06.320 | It sounds a little bit artificial
01:52:09.980 | in the way that I'm about to say it,
01:52:16.100 | but I really think that this is what's happening.
01:52:18.980 | One thing we haven't talked about is brains evolved,
01:52:23.980 | didn't evolve for you to see,
01:52:25.300 | they didn't evolve for you to hear,
01:52:27.000 | they didn't evolve for you to feel,
01:52:28.220 | they evolved to control your body.
01:52:30.060 | That's why you have a brain.
01:52:31.740 | You have a brain so that it can control your body.
01:52:34.300 | And the metaphor, the scientific term
01:52:37.340 | for predictively controlling your body is allostasis.
01:52:40.580 | Your brain is attempting to anticipate
01:52:45.260 | the needs of your body and meet those needs
01:52:47.220 | before they arise so that you can act as you need to act.
01:52:51.420 | And the metaphor that I use is a body budget.
01:52:55.140 | You know, your brain is running a budget for your body.
01:52:57.500 | It's not budgeting money,
01:52:58.580 | it's budgeting glucose and salt and water.
01:53:01.900 | And instead of having one or two bank accounts,
01:53:05.420 | it has gazillions.
01:53:06.980 | There are all these systems in your body
01:53:08.420 | that have to be kept in balance.
01:53:10.860 | And it's monitoring very closely,
01:53:14.480 | it's making predictions about like,
01:53:16.840 | when is it good to spend and when is it good to save
01:53:19.660 | and what would be a good investment
01:53:21.180 | and am I gonna get a return on my investment?
01:53:23.900 | Whenever people talk about reward or reward prediction error
01:53:27.100 | or anything to do with reward or punishment,
01:53:29.860 | they're talking about the body budget.
01:53:32.940 | They're talking about your brain's predictions
01:53:34.580 | about whether or not there will be a deposit or withdrawal.
01:53:37.820 | So when you,
01:53:42.160 | when your brain is running a deficit in your body budgets,
01:53:48.620 | you have some kind of metabolic imbalance,
01:53:50.780 | you experience that as discomfort.
01:53:55.120 | You experience that as distress.
01:53:57.180 | When your brain, when things are chaotic,
01:54:01.020 | you can't predict what's going to happen next.
01:54:05.540 | So I have this absolutely brilliant scientist
01:54:08.860 | working in my lab, his name is Jordan Theriault
01:54:13.600 | and he's published this really terrific paper
01:54:17.080 | on a sense of should, like, why do we have social rules?
01:54:22.560 | Why do we, you know, adhere to social norms?
01:54:27.560 | It's because if I make myself predictable to you,
01:54:31.860 | then you are predictable to me.
01:54:34.220 | And if you're predictable to me,
01:54:35.880 | that's good because that is less
01:54:38.620 | metabolically expensive for me.
01:54:40.260 | Novelty or unpredictability at the extreme is expensive.
01:54:47.060 | And if it goes on for long enough,
01:54:49.020 | what happens is first of all,
01:54:51.280 | you will feel really jittery and antsy,
01:54:53.780 | which we describe as anxiety.
01:54:56.900 | It isn't necessarily anxiety.
01:54:58.600 | It could be just something is not predictable
01:55:03.600 | and you are experiencing arousal
01:55:06.420 | because the chemicals that help you learn
01:55:09.980 | increase your feeling of arousal, basically.
01:55:13.420 | But if it goes on for long enough,
01:55:15.180 | you will become depleted
01:55:17.940 | and you will start to feel really, really, really distressed.
01:55:22.120 | So what we have is a culture full of people right now
01:55:27.040 | who are, their body budgets are just decimated
01:55:30.800 | and there's a tremendous amount of uncertainty.
01:55:34.660 | When you talk about it as depression, anxiety,
01:55:39.720 | it makes you think that it's not about your metabolism,
01:55:43.860 | that it's not about your body budgeting,
01:55:46.000 | that it's not about getting enough sleep
01:55:48.360 | or about eating well
01:55:50.000 | or about making sure that you have social connections.
01:55:53.180 | You think that it's something separate from that.
01:55:57.040 | But depression and anxiety
01:55:58.120 | are just a way of being in the world.
01:56:00.480 | They're a way of being in the world
01:56:04.080 | when things aren't quite right with your predictions.
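As a toy sketch of the body-budget metaphor, with made-up numbers rather than real metabolic quantities, the loop below shows one way to picture the claim: in an unpredictable environment, prediction errors are larger, correcting them costs more, and the budget drains even when daily deposits stay the same.

```python
# Toy allostasis sketch (all quantities are illustrative assumptions).

def run_body_budget(days: int, unpredictability: float) -> float:
    """Remaining budget after `days`; `unpredictability` in [0, 1] scales daily surprises."""
    budget = 100.0
    predicted_demand = 10.0
    for day in range(days):
        # Alternating surprises stand in for an environment the brain can't anticipate.
        surprise = 8.0 * unpredictability * (1 if day % 2 == 0 else -1)
        actual_demand = 10.0 + surprise
        error = abs(actual_demand - predicted_demand)
        budget += 10.5                          # daily deposits: food, sleep, social support
        budget -= actual_demand + 0.5 * error   # spending, plus the extra cost of correcting errors
        predicted_demand += 0.3 * (actual_demand - predicted_demand)  # slow prediction update
    return budget

print(run_body_budget(days=60, unpredictability=0.1))  # stays near its starting level
print(run_body_budget(days=60, unpredictability=1.0))  # drains deep into deficit
```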
01:56:06.900 | - That's such a deep way of thinking.
01:56:10.660 | Like the brain is maintaining homeostasis
01:56:16.540 | - It's actually allostasis.
01:56:17.580 | - Allostasis, I'm sorry.
01:56:19.020 | It's constantly making predictions
01:56:22.860 | and metabolically speaking,
01:56:24.420 | it's very costly to deal with novelty,
01:56:26.660 | like to constantly be learning and making adjustments.
01:56:29.900 | And then over time, there's a cost to be paid
01:56:34.900 | if you're just in a place of chaos
01:56:39.180 | where there's constant need for adjusting
01:56:42.740 | and learning and experiencing novel things.
01:56:46.860 | - And so part of the problem here,
01:56:49.500 | there are a couple of things,
01:56:50.460 | like I said, it's a perfect storm.
01:56:52.420 | There isn't a single cause.
01:56:54.740 | There are multiple causes,
01:56:55.780 | multiple things that combine together.
01:56:57.420 | It's a complex system, multiple things.
01:57:00.520 | Part of it is that people are,
01:57:04.780 | they're metabolically encumbered and they're distressed.
01:57:10.100 | And in order to try to have empathy for someone
01:57:13.300 | who is very much unlike you,
01:57:16.100 | you have to forage for information.
01:57:19.020 | You have to explore information
01:57:22.460 | that is novel to you and unexpected,
01:57:25.940 | and that's expensive.
01:57:27.940 | And at a time when people feel,
01:57:31.020 | what do you do when you are running a deficit
01:57:33.700 | in your bank account?
01:57:34.860 | You stop spending.
01:57:36.980 | What does it mean for a brain to stop spending?
01:57:40.380 | A brain stops moving very much,
01:57:43.140 | stops moving the body, and it stops learning.
01:57:46.900 | It just goes with its internal model.
01:57:48.700 | - Brilliantly put, yep.
01:57:50.700 | - So empathy requires,
01:57:53.980 | to have empathy for someone who is unlike you
01:57:56.620 | requires learning and practice,
01:58:01.420 | foraging for information.
01:58:04.220 | I mean, it is something I talk about in the book
01:58:07.180 | in "7 1/2 Lessons About the Brain."
01:58:08.780 | I think it's really important.
01:58:10.400 | It's hard, but it's hard.
01:58:13.180 | I think it's, you know,
01:58:14.900 | it's hard for people to have,
01:58:16.520 | to be curious about views that are unlike their own
01:58:20.780 | when they feel so encumbered.
01:58:26.340 | And I'll just tell you, I had this epiphany, really.
01:58:30.180 | I was listening to Robert Reich's "The System."
01:58:34.660 | He was talking about oligarchy versus democracy.
01:58:39.020 | And so oligarchy is where very wealthy people,
01:58:41.860 | like extremely wealthy people,
01:58:43.700 | shift power so that they become even more wealthy
01:58:51.020 | and even more insulated from the, you know,
01:58:54.380 | the pressures of the common person.
01:58:58.020 | It's actually the kind of system
01:59:00.860 | that leads to the collapse of civilizations,
01:59:04.180 | if you believe Jared Diamond.
01:59:05.980 | Just say that.
01:59:06.820 | But anyways, I'm listening to this,
01:59:08.700 | and I'm listening to him describe in fairly decent detail
01:59:13.700 | how the CEOs of these companies,
01:59:18.580 | there's been a shift in what it means to be a CEO
01:59:21.100 | and not being, no longer being a steward of the community
01:59:24.580 | and so on, but like in the 1980s,
01:59:26.620 | it sort of shifted to this other model
01:59:28.620 | of being like an oligarch.
01:59:30.740 | And he's talking about how, you know,
01:59:34.140 | it used to be the case that CEOs made like 20 times
01:59:39.140 | what their employees made.
01:59:45.260 | And now they make about 300 times on average
01:59:48.020 | what their employees make.
01:59:49.100 | So where did that money come from?
01:59:51.660 | It came from the pockets of the employees.
01:59:55.060 | And they don't know about it, right?
01:59:57.380 | No one knows about it.
01:59:58.400 | They just know they can't feed their children,
02:00:00.980 | they can't pay for healthcare,
02:00:03.420 | they can't take care of their family,
02:00:05.300 | and they worry about what's gonna happen to their,
02:00:07.580 | they're living, like, you know, month to month, basically.
02:00:11.340 | Any one big bill could completely, you know,
02:00:13.980 | put them out on the street.
02:00:15.260 | So there are a huge number of people living like this.
02:00:17.900 | So all they, what they're experiencing,
02:00:19.900 | they don't know why they're experiencing it.
02:00:21.220 | So it's, and then someone comes along
02:00:24.620 | and gives them a narrative.
02:00:26.020 | - Yeah.
02:00:26.860 | - Well, somebody else butted in line in front of you.
02:00:29.900 | And that's why you're this way.
02:00:32.980 | That's why you experience what you're experiencing.
02:00:35.180 | It just for a minute, I was thinking,
02:00:39.340 | I had deep empathy for people who have beliefs
02:00:44.340 | that are really, really, really different from mine.
02:00:50.420 | But I was trying really hard to see it through their eyes.
02:00:55.420 | And did it cost me something metabolically?
02:00:59.380 | I'm sure.
02:01:01.180 | I'm sure.
02:01:02.100 | - But you had something in the gas tank.
02:01:04.660 | - Well, I--
02:01:06.020 | - In order to allocate that.
02:01:07.140 | I mean, that's the question is like,
02:01:08.620 | where did you, what resources did your brain draw on
02:01:13.100 | in order to actually make that effort?
02:01:14.700 | - Well, I'll tell you something, honestly, Lex.
02:01:17.540 | I don't have that much in the gas tank right now.
02:01:19.940 | (Lex laughing)
02:01:21.300 | Right, so I am surfing the stress that,
02:01:26.300 | stress is just, what is stress?
02:01:28.580 | Stress is your brain is preparing for a big metabolic outlay
02:01:31.660 | and it just keeps preparing and preparing
02:01:33.580 | and preparing and preparing.
02:01:35.220 | - You as a professor, you as a human?
02:01:37.460 | - Both, right?
02:01:39.020 | For me, this is a moment of existential crisis
02:01:42.700 | as much as anybody else, democracy, all of these things.
02:01:45.460 | So in many of my roles,
02:01:48.780 | so I guess what I'm trying to say is that
02:01:50.940 | I get up every morning and I exercise.
02:01:55.340 | I run, I row, I lift weights, right?
02:01:58.860 | You exercise in the middle of the day.
02:02:00.380 | I saw your like, daily thing.
02:02:02.420 | - I'm obsessed with it.
02:02:03.260 | - Yeah, I hate it, actually.
02:02:06.020 | You love it, right?
02:02:07.060 | You get a--
02:02:07.900 | - No, I hate it.
02:02:08.740 | - I hate it, but I do it religiously.
02:02:12.540 | - Yeah.
02:02:13.420 | - Why?
02:02:14.500 | Because it's a really good investment.
02:02:16.540 | It's an expenditure that is a really good investment.
02:02:20.300 | And so,
02:02:21.300 | when I was exercising, I was listening to the book
02:02:27.780 | and when I realized the insights
02:02:29.960 | that I was sort of like playing around with,
02:02:32.100 | like, does this make sense?
02:02:33.820 | Does this make sense?
02:02:34.660 | I didn't immediately plunge into it.
02:02:36.980 | I basically wrote some stuff down, I set it aside,
02:02:40.780 | and then I did what I,
02:02:42.300 | I prepared myself to make an expenditure.
02:02:45.020 | I don't know what you do before you exercise.
02:02:47.460 | I always have a protein shake, always have a protein shake,
02:02:50.900 | because I need to fuel up
02:02:52.860 | before I make this really big expenditure.
02:02:55.020 | And so, I did the same thing.
02:02:58.220 | I didn't have a protein drink, but I did the same thing.
02:03:01.700 | And fueling up can mean lots of different things.
02:03:04.300 | It can mean talking to a friend about it.
02:03:06.140 | It can mean, you know, it can mean
02:03:09.340 | making sure you get a good night's sleep before you do it.
02:03:11.420 | It can mean lots of different things.
02:03:12.980 | But I guess I think we have to do these things.
02:03:17.980 | - Yeah, I'm gonna re-listen
02:03:23.460 | to this conversation several times.
02:03:24.980 | This is brilliant.
02:03:26.340 | But I do think about,
02:03:28.380 | you know, I've encountered so many people
02:03:33.440 | that can't possibly imagine
02:03:35.580 | that a good human being can vote for Donald Trump.
02:03:38.700 | And I've also encountered people
02:03:42.380 | that can't imagine that an intelligent person
02:03:45.500 | can possibly vote for a Democrat.
02:03:47.860 | And I look at both these people,
02:03:52.020 | many of whom are friends,
02:03:54.060 | and let's just say, after this conversation,
02:03:58.620 | I can see it as their predicting brains
02:04:00.780 | not willing to invest the resources
02:04:04.360 | to empathize with the other side.
02:04:06.140 | And I think you have to, in order to be able to,
02:04:10.020 | like, to see the obvious common humanity in us.
02:04:14.500 | I don't know what the system is
02:04:16.100 | that's creating this division.
02:04:17.860 | We can put it, like you said, it's a perfect storm.
02:04:20.220 | It might be the social media.
02:04:22.220 | I don't know what the hell it is.
02:04:23.060 | - I think it's a bunch of things.
02:04:24.540 | I think it's-- - It's just coming together.
02:04:25.940 | - There's an economic system, which is disadvantaging
02:04:29.020 | large numbers of people.
02:04:30.980 | There's a use of social media.
02:04:34.420 | Like, if you, you know, if I had to orchestrate
02:04:37.340 | or architect a system that would screw up
02:04:40.980 | a human body budget, it would be the one that we live in.
02:04:44.380 | You know, we don't sleep enough.
02:04:45.600 | We eat pseudo food, basically.
02:04:48.980 | We are on social media too much,
02:04:51.620 | which is full of ambiguity,
02:04:52.920 | which is really hard for a human nervous system, right?
02:04:55.980 | Really, really hard.
02:04:57.060 | Like, ambiguity with no context to predict in.
02:04:59.700 | I mean, it's like, really?
02:05:01.180 | And then, you know, there are the economic concerns
02:05:03.800 | that affect large swaths of people in this country.
02:05:06.260 | I mean, it's really, I'm not saying everything
02:05:09.300 | is reducible to metabolism.
02:05:11.780 | Not everything is reducible to metabolism.
02:05:13.620 | But if you combine all these things together--
02:05:18.500 | - It's helpful to think of it that way.
02:05:20.060 | Then, somehow it's also,
02:05:22.140 | somehow it reduces the entirety of the human experience,
02:05:27.020 | to the same kind of obvious logic.
02:05:28.520 | Like, we should exercise every day in the same kind of way.
02:05:31.740 | We should empathize every day.
02:05:34.880 | - Yeah.
02:05:35.900 | You know, there are these really wonderful,
02:05:37.740 | wonderful programs for teens,
02:05:41.580 | and also for parents of people who've lost children
02:05:45.180 | in wars and in conflicts, in political conflicts,
02:05:48.940 | where they go to a bucolic setting,
02:05:51.540 | and they talk to each other about their experiences.
02:05:54.020 | And miraculous things happen, you know?
02:05:59.020 | So, you know, it's easy to,
02:06:04.840 | it's easy to sort of shrug this stuff off
02:06:09.840 | as kind of Pollyanna-ish, you know?
02:06:12.520 | Like, what's this really gonna do?
02:06:13.800 | But you have to think about,
02:06:18.600 | when my daughter went to college, I gave her advice.
02:06:23.520 | I said, "Try to be around people
02:06:28.520 | "who let you be the kind of person you wanna be."
02:06:34.080 | You, we're back to free will, you have a choice.
02:06:37.780 | You have a choice.
02:06:40.360 | It might seem like a really hard choice.
02:06:41.960 | It might seem like an unimaginably difficult choice.
02:06:46.080 | You have a choice.
02:06:47.280 | Do you wanna be somebody who is wrapped in fury and agony?
02:06:52.280 | Or do you wanna be somebody who extends
02:06:57.580 | a little empathy to somebody else,
02:07:00.000 | and in the process, maybe learn something?
02:07:02.720 | Curiosity is the thing that protects you.
02:07:07.480 | Curiosity is the thing, it's curative curiosity.
02:07:12.480 | - On social media, the thing I recommend to people,
02:07:15.480 | at least that's the way I've been approaching social media.
02:07:20.040 | It doesn't seem to be the common approach,
02:07:22.080 | but I basically give love to people
02:07:27.080 | who seem to also give love to others.
02:07:30.640 | So it's the same, similar concept of surrounding yourself
02:07:34.400 | by the people you wanna become.
02:07:36.240 | And I ignore, sometimes block, but just ignore.
02:07:40.440 | I don't add aggression to people
02:07:43.240 | who are just constantly full of aggression
02:07:46.400 | and negativity and toxicity.
02:07:48.400 | There's a certain desire when somebody says something mean
02:07:51.760 | to say something, to say why,
02:07:59.040 | or try to alleviate the meanness and so on.
02:08:01.720 | But what you're doing essentially
02:08:03.040 | is you're now surrounding yourself
02:08:05.620 | by that group of folks that have that negativity.
02:08:09.240 | So even just the conversation.
02:08:11.200 | So I think it's just so powerful
02:08:15.200 | to put yourself amongst people
02:08:18.520 | whose basic mode of interaction is kindness.
02:08:23.920 | Because I don't know what it is,
02:08:27.000 | but maybe it's the way I'm built,
02:08:28.680 | is that, to me, it's energizing for the gas tank
02:08:32.160 | that I can then pull from.
02:08:34.760 | - For sure.
02:08:35.600 | - When I start reading "The Rise and Fall of the Third Reich"
02:08:38.080 | and start thinking about Nazi Germany,
02:08:41.000 | I can empathize with everybody involved.
02:08:43.600 | I can start to make these difficult,
02:08:46.040 | like thinking that's required
02:08:48.480 | to understand our little planet Earth.
02:08:52.160 | - Well, there is research to back up what you said.
02:08:54.840 | There's research that's consistent
02:08:56.580 | with your intuition there.
02:08:58.080 | You know, that there's research that shows
02:09:00.680 | that being kind to other people,
02:09:04.040 | doing something nice for someone else,
02:09:05.940 | is like making a deposit to some extent.
02:09:11.080 | You know, 'cause I think making a deposit
02:09:15.840 | not only in their body budgets, but also in yours.
02:09:18.800 | Like people feel good
02:09:21.600 | when they do good things for other people.
02:09:24.000 | You know, we are social animals.
02:09:26.040 | We regulate each other's nervous systems
02:09:28.280 | for better and for worse, right?
02:09:30.720 | The best thing for a human nervous system is another human.
02:09:35.480 | And the worst thing for a human nervous system
02:09:40.500 | is another human.
02:09:41.800 | So you decide, do you wanna be somebody
02:09:44.520 | who makes people feel better,
02:09:49.520 | or do you wanna be somebody who causes people pain?
02:09:53.160 | And we are more responsible for one another
02:09:58.160 | than we might like, or than we might want.
02:10:02.480 | But remember what we said about social reality, you know?
02:10:05.440 | Social reality, you know,
02:10:08.280 | there are lots of different cultural norms
02:10:12.640 | about, you know, independence,
02:10:15.200 | or, you know, collective, you know, nature of people.
02:10:20.120 | But the fact is we have socially dependent nervous systems.
02:10:24.980 | We evolved that way as a species.
02:10:27.620 | And in this country,
02:10:29.420 | we prize individual rights and freedoms.
02:10:32.880 | And that is a dilemma that we have to grapple with.
02:10:37.880 | And we have to do it in a way,
02:10:40.080 | if we're gonna be productive about it,
02:10:41.540 | we have to do it in a way that
02:10:43.600 | requires engaging with each other,
02:10:48.440 | and which is what I understand the, you know,
02:10:52.200 | the founding members of this country intended.
02:10:55.900 | - Beautifully put.
02:10:58.520 | Let me ask a few final silly questions.
02:11:01.800 | So one, we've talked a bit about love,
02:11:05.260 | but let me, it's fun to ask somebody like you
02:11:08.560 | who can effectively, from at least neuroscience perspective,
02:11:13.120 | disassemble some of these romantic notions,
02:11:15.040 | but what do you make of romantic love?
02:11:18.380 | Why do human beings seem to fall in love,
02:11:22.040 | at least a bunch of '80s hair bands have written about it?
02:11:26.200 | Is that a nice feature to have?
02:11:29.720 | Is that a bug?
02:11:31.120 | What is it?
02:11:31.960 | - Well, I'm really happy that I fell in love.
02:11:35.080 | I wouldn't want it any other way.
02:11:36.720 | (Lex laughing)
02:11:37.600 | But I would say--
02:11:38.920 | - Is that you, the person speaking,
02:11:40.320 | or the neuroscientist?
02:11:41.720 | - Well, that's me, the person speaking.
02:11:44.080 | But I would say, as a neuroscientist,
02:11:47.460 | babies are born not able to regulate their own body budgets
02:11:51.360 | 'cause their brains aren't fully wired yet.
02:11:53.700 | When you feed a baby, when you cuddle a baby,
02:11:59.600 | everything you do with a baby
02:12:02.280 | impacts that baby's body budget
02:12:04.480 | and helps to wire that baby's body budget,
02:12:06.840 | helps to wire that baby's brain
02:12:09.960 | to manage eventually her own body budget to some extent.
02:12:13.600 | That's the basis, biologically, of attachment.
02:12:17.380 | Humans evolved as a species to be socially dependent,
02:12:25.380 | meaning you cannot manage your body budget on your own
02:12:31.800 | without a tax that eventually you pay many years later
02:12:40.240 | in terms of some metabolic illness.
02:12:42.780 | Loneliness, when you break up with someone that you love
02:12:47.780 | or you lose them, you feel like it's gonna kill you,
02:12:52.340 | but it doesn't.
02:12:53.400 | But loneliness will kill you.
02:12:55.340 | It will kill you approximately,
02:12:57.760 | what is it, seven years earlier?
02:12:59.420 | I can't remember the exact number.
02:13:01.100 | It's actually in the web notes to "Seven and a Half Lessons."
02:13:05.360 | But social isolation and loneliness will kill you earlier
02:13:10.360 | than you would otherwise die.
02:13:11.860 | And the reason why is that you didn't evolve
02:13:15.820 | to manage your nervous system on your own.
02:13:18.620 | And when you do, you pay a little tax,
02:13:20.800 | and that tax accrues very slightly over time,
02:13:24.380 | over a long period of time,
02:13:26.340 | so that by the time you're in middle age or a little older,
02:13:30.700 | you are more likely to die sooner
02:13:33.740 | from some metabolic illness,
02:13:35.340 | from heart disease, from diabetes, from depression.
02:13:38.780 | You're more likely to develop Alzheimer's disease.
02:13:40.940 | I mean, it takes a long time for that tax to accrue,
02:13:45.940 | but it does.
02:13:47.940 | So yes, I think it's a good thing
02:13:50.260 | for people to fall in love.
02:13:53.900 | - But I think the funny view of it is that it's clear
02:13:58.380 | that humans need the social attachment to,
02:14:03.560 | what is it, manage their nervous system,
02:14:05.700 | as you're describing.
02:14:08.740 | And the reason you wanna stay with somebody for a long time
02:14:12.380 | is so you don't have,
02:14:13.460 | is that novelty is very costly for our--
02:14:16.580 | - Well, now you're mixing, now you're mixing things.
02:14:18.660 | Now you're, you know, no, you have to decide whether.
02:14:21.620 | But what I would say is when you lose someone you love,
02:14:25.280 | it feels like you've lost a part of you.
02:14:29.780 | And that's because you have.
02:14:32.300 | You've lost someone who was contributing
02:14:36.080 | to your body budget.
02:14:37.380 | We are the caretakers of one another's nervous systems,
02:14:40.560 | like it or not.
02:14:41.840 | And out of that comes very deep feelings of attachment,
02:14:46.840 | some of which are romantic love.
02:14:50.260 | - Are you afraid of your own mortality?
02:14:56.020 | We two humans sitting here.
02:14:57.520 | - Yeah.
02:14:59.720 | - Do you think, do you ponder your mortality?
02:15:01.840 | I mean, you're somebody who thinks about your brain a lot.
02:15:05.620 | It seems one of the more terrifying,
02:15:10.620 | or I don't know, I don't know how to feel about it,
02:15:13.920 | but it seems to be one of the most definitive aspects
02:15:16.200 | of life is that it ends.
02:15:18.560 | - It's a complicated answer,
02:15:19.960 | but I think the best I can do in a short snippet
02:15:22.980 | would be to say, for a very long time,
02:15:25.360 | I did not fear my own mortality.
02:15:27.880 | I feared pain and suffering.
02:15:32.880 | So that's what I feared.
02:15:36.120 | I feared being harmed or dying in a way
02:15:38.760 | that would be painful.
02:15:40.020 | But I didn't fear having my life be over.
02:15:45.640 | Now, as a mother, I think I have fear.
02:15:50.640 | I fear dying before my daughter is,
02:15:58.200 | ready to be without me.
02:16:02.400 | That's what I fear.
02:16:04.580 | That's really what I fear.
02:16:10.400 | And frankly, honestly, I fear my husband dying before me
02:16:13.840 | much more than I fear my own death.
02:16:15.640 | - There's that love and social attachment again.
02:16:19.320 | - Yeah, because I know it's just gonna,
02:16:23.040 | I'm gonna feel like I wish I was dead.
02:16:25.640 | - Yeah, a final question about life.
02:16:29.280 | What do you think is the meaning of it all?
02:16:32.240 | What's the meaning of life?
02:16:33.600 | - Yeah, I think that there isn't one meaning of life.
02:16:38.800 | There's many meanings of life,
02:16:41.040 | and you use different ones on different days.
02:16:43.880 | But for me-- - Depending on the day.
02:16:46.320 | - Depending on the day, but for me, I would say,
02:16:49.320 | sometimes the meaning of life is to understand,
02:16:54.400 | to make meaning, actually.
02:16:55.800 | The meaning of life is to make meaning.
02:16:58.560 | Sometimes it's that.
02:16:59.960 | Sometimes it's to leave the world
02:17:03.680 | just slightly, a little bit better,
02:17:06.720 | the Johnny Appleseed view.
02:17:08.960 | Sometimes the meaning of life is to
02:17:19.760 | clear the path for my daughter or for my students.
02:17:24.600 | So sometimes it's that.
02:17:28.320 | And sometimes it's just,
02:17:30.400 | you ever have moments where you're looking at the sky
02:17:38.680 | or you're by the ocean?
02:17:42.760 | Or sometimes for me, it's even like I'll see
02:17:48.160 | a weed poking out of a crack in a sidewalk.
02:17:52.080 | And you just have this overwhelming sense
02:17:56.600 | of the wonder of the world.
02:18:01.040 | The physical world is so wondrous,
02:18:09.760 | and you just get very immersed in the moment,
02:18:15.640 | like the sensation of the moment.
02:18:18.240 | Sometimes that's the meaning of life.
02:18:19.880 | I don't think there's one meaning of life.
02:18:21.760 | I think it's a population of instances,
02:18:24.480 | just like any other category.
02:18:27.520 | - I don't think there's a better way to end it, Lisa.
02:18:30.840 | The first time we spoke is, I think, if not the,
02:18:35.840 | then one of, I think it's the first conversation I had
02:18:40.000 | that basically launched this podcast.
02:18:41.760 | Yeah, that's actually the first conversation
02:18:43.680 | I've had that launched this podcast.
02:18:44.520 | - Oh, wow.
02:18:45.680 | - And now we get to finally do it the right way.
02:18:48.840 | So it's a huge honor to talk to you,
02:18:50.560 | that you spent time with me.
02:18:51.960 | I can't wait for, hopefully,
02:18:54.800 | the many more books you'll write.
02:18:56.640 | Certainly can't wait to, I already read this book,
02:19:00.520 | but I can't wait to listen to it
02:19:02.120 | because, as you said offline, you're reading it,
02:19:06.000 | and I think you have a great voice.
02:19:07.480 | You have a great, I don't know what's a nice way to put it,
02:19:10.000 | but maybe NPR voice.
02:19:12.520 | - Thank you.
02:19:13.360 | - In the best version of what that is.
02:19:14.720 | So thanks again for talking today.
02:19:16.560 | - Oh, it's my pleasure.
02:19:17.400 | Thank you so much for having me back.
02:19:20.040 | - Thank you for listening to this conversation
02:19:22.640 | with Lisa Feldman Barrett,
02:19:24.320 | and thank you to our sponsors,
02:19:26.360 | Athletic Greens, which is an all-in-one nutritional drink,
02:19:30.400 | Magic Spoon, which is a low-carb, keto-friendly cereal,
02:19:34.280 | and Cash App, which is an app
02:19:36.360 | for sending money to your friends.
02:19:38.640 | Please check out these sponsors in the description
02:19:40.640 | to get a discount and to support this podcast.
02:19:44.280 | If you enjoy this thing, subscribe on YouTube,
02:19:46.700 | review it with five stars on Apple Podcasts,
02:19:48.980 | follow on Spotify, support on Patreon,
02:19:51.560 | or connect with me on Twitter, @lexfriedman.
02:19:55.200 | And now let me leave you with some words
02:19:57.440 | from Lisa Feldman Barrett.
02:19:59.720 | It takes more than one human brain to create a human mind.
02:20:03.900 | Thank you for listening.
02:20:06.080 | I hope to see you next time.
02:20:07.960 | (upbeat music)
02:20:10.540 | (upbeat music)