Lisa Feldman Barrett: Counterintuitive Ideas About How the Brain Works | Lex Fridman Podcast #129
Chapters
0:00 Introduction
2:45 Are we alone in the universe?
4:13 Life on Earth
9:05 Collective intelligence of human brains
17:53 Triune brain
24:03 The predicting brain
31:58 How the brain evolved
37:58 Free will
46:51 Is anything real?
59:23 Dreams
1:05:11 Emotions are human-constructed concepts
1:30:40 Are women more emotional than men?
1:39:16 Empathy
2:10:56 Love
2:14:50 Mortality
2:16:26 Meaning of life
00:00:00.000 |
The following is a conversation with Lisa Feldman Barrett, 00:00:03.720 |
a professor of psychology at Northeastern University 00:00:07.000 |
and one of the most brilliant and bold thinkers 00:00:09.720 |
and scientists I've ever had the pleasure of speaking with. 00:00:12.620 |
She's the author of a book that revolutionized 00:00:22.000 |
called "Seven and a Half Lessons About the Brain" 00:00:31.160 |
and it's one of the best short whirlwind introductions 00:00:38.160 |
but again, if there's anybody worth supporting, it's Lisa, 00:00:48.640 |
especially because we felt that this first conversation 00:00:56.000 |
in the United States leading up to the election. 00:00:59.560 |
And she gives me a whole new way to think about it 00:01:04.360 |
that is ultimately inspiring of empathy, compassion, 00:01:11.180 |
followed by some thoughts related to this episode. 00:01:16.480 |
the all-in-one drink that I start every day with 00:01:21.720 |
that I don't otherwise get through my diet naturally. 00:01:28.600 |
that I reward myself with after a productive day. 00:01:36.160 |
the app I use to send money to friends for food, drinks, 00:01:40.320 |
and unfortunately, for the many bets I have lost to them. 00:01:43.960 |
Please check out these sponsors in the description 00:01:46.080 |
to get a discount and to support this podcast. 00:01:59.840 |
And in fact, I invited her to speak at the AGI series 00:02:29.400 |
If you enjoy this thing, subscribe on YouTube, 00:02:40.400 |
And now, here's my conversation with Lisa Feldman Barrett. 00:02:44.660 |
Since we'll talk a lot about the brain today, 00:02:47.920 |
do you think, let's ask the craziest question, 00:02:58.640 |
You know, I have to think probabilities suggest yes, 00:03:04.640 |
and also, secretly, I think I just hope that's true. 00:03:31.800 |
- Yeah, no, you know, I take a lot of comfort in curiosity. 00:03:37.320 |
It's a great resource for dealing with stress. 00:03:42.320 |
So I'm learning all about mushrooms and octopuses 00:03:52.140 |
And so for me, this counts, I think, in the realm of awe. 00:03:56.420 |
But also, I think I'm somebody who cultivates awe 00:04:00.220 |
deliberately on purpose to feel like a speck. 00:04:19.060 |
do you think it's difficult for intelligent life 00:04:23.700 |
From everything you've written and studied about the brain, 00:04:39.860 |
I like a magic show as much as the next person. 00:04:45.820 |
But, you know, magic is just a bunch of stuff 00:04:48.700 |
that we don't really understand how it works yet. 00:04:53.600 |
there are some major steps in the course of evolution 00:05:00.900 |
the step from single cell to multicellular organisms, 00:05:04.140 |
things like that, which are really not known. 00:05:17.660 |
as much as what are the steps and how long would it take? 00:05:27.420 |
would we end up with the same menu of life forms 00:05:46.340 |
meaning like there'd be rich complexity of the kind of, 00:05:54.740 |
of weirdly intelligent, seemingly intelligent, 00:06:07.060 |
I mean, if you look at the range of creatures 00:06:13.140 |
And, you know, it's sort of trite to say that, 00:06:21.280 |
animals, there are animals that seem really ordinary 00:06:49.680 |
They have a concept of beauty that I haven't quite, 00:06:54.780 |
but it doesn't seem to fit evolutionary arguments well. 00:06:59.940 |
So I think you're talking about "The Evolution of Beauty," 00:07:17.660 |
the question about birds and some other animals 00:07:19.900 |
is why would they engage in such metabolically costly 00:07:24.500 |
displays when it doesn't improve their fitness at all? 00:07:32.100 |
is the answer that Darwin gave, which is sexual selection. 00:07:38.440 |
selection can occur for all kinds of reasons. 00:07:46.460 |
that observation helped Darwin come to the idea 00:07:53.820 |
meaning, and the argument that I think his name is Prum 00:08:03.280 |
the selection pressure is the pleasure of female birds. 00:08:06.940 |
Which as a woman, and as someone who studies affect, 00:08:14.260 |
I think there is an aspect of natural selection to it, 00:08:18.620 |
- But you were saying, the reason we brought up birds 00:08:20.860 |
is the life we got now seems to be quite incredible. 00:08:26.700 |
peek into the sky, there are miraculous creatures. 00:08:35.340 |
you couldn't dream up something as interesting. 00:08:41.440 |
intelligent life evolves in many different ways, 00:08:49.020 |
there's not one brain that gives you intelligence. 00:08:54.980 |
So my guess is that the menagerie might not look 00:09:04.900 |
- But if we look at the human brain versus the brains, 00:09:11.580 |
the mechanisms of intelligence in our ancestors, 00:09:18.640 |
what's the difference between the fanciest brain we got, 00:09:31.660 |
- Yeah, I think it depends on how far back you wanna go. 00:09:34.600 |
- You go all the way back, right, in your book. 00:09:38.700 |
So what's the interesting comparison, would you say? 00:09:41.220 |
- Well, first of all, I wouldn't say that the human brain 00:09:48.140 |
and pretty fancy, and they can do some pretty amazing things 00:09:56.940 |
we can't contort ourselves and squeeze ourselves 00:10:10.420 |
we can certainly do some things that other animals can't do 00:10:17.860 |
But I would say that there are a number of animal brains 00:10:26.100 |
and really impressive things that we can't do. 00:10:28.940 |
- I mean, with your work on how emotions are made 00:10:31.300 |
and so on, you kind of repaint the view of the brain 00:10:47.420 |
in terms of homeostasis and all that kind of stuff. 00:10:53.740 |
is any less miraculous than anybody else would say. 00:10:57.260 |
I just think that there are other brain structures 00:11:02.260 |
And I also think that there are a number of things 00:11:04.260 |
about the human brain which we share with other vertebrates, 00:11:19.260 |
And we can also do some things with our brains together, 00:11:23.900 |
working together that other animals can't do, 00:11:27.260 |
or at least we haven't discovered their ability to do it. 00:11:32.960 |
I mean, that's one of the things you write about. 00:11:40.560 |
like the book "Sapiens" and the fact that we're able 00:11:43.780 |
to kind of connect, like network our brains together, 00:11:53.320 |
Is that like some kind of feature that's built into there? 00:12:02.140 |
- What I would say is that our ability to coordinate 00:12:12.580 |
But what we do with that coordination is unique 00:12:20.660 |
because of some of the structural features in our brains. 00:12:29.580 |
those structural features, it's that we have them in abundance. 00:12:38.100 |
than you would expect it to be for a primate of our size. 00:12:47.220 |
to the size of a human, that chimpanzee would have a brain 00:12:59.760 |
in terms of the basic blueprint that builds our brain 00:13:23.920 |
that people mistakenly attribute to rationality. 00:13:29.960 |
Isn't that where all the clever stuff happens? 00:13:34.200 |
And I will also say that lots of clever stuff happens 00:13:40.320 |
- But because of the size of the cerebral cortex 00:13:53.280 |
like build civilizations and coordinate with each other, 00:14:11.700 |
We draw a line in the sand and we make countries 00:14:26.940 |
- Well, what do you think a citizen is and an immigrant? 00:14:35.420 |
And then they have very, very serious and real effects, 00:14:44.380 |
Dawkins with memes, which is like ideas are breeding. 00:14:49.020 |
Like, we're just like the canvas for ideas to breed 00:14:55.460 |
So this kind of network that you talk about of brains 00:15:00.420 |
to then compete against each other and so on. 00:15:10.860 |
I don't remember if it was in "The Botany of Desire," 00:15:22.680 |
sort of utilizing humans for their own evolutionary purposes. 00:15:36.560 |
as a propagation device for the seeds of tomatoes 00:15:44.840 |
So I think rhetorically, it's an interesting device, 00:16:02.800 |
although it is interesting to think about it that way. 00:16:06.860 |
- Well, of course, the ideas that are using your brain 00:16:17.220 |
as an effective communicator and thereby are winning. 00:16:23.200 |
to think that there's particular aspects of your brain 00:16:33.440 |
- Yeah, I think the way that I would say it really though 00:16:47.600 |
They do it by, so termites and ants and bees, for example, 00:17:04.180 |
Primates, non-human primates, add vision, right? 00:17:13.500 |
Humans, as far as I know, are the only species 00:17:15.980 |
that use ideas and words to regulate each other, right? 00:17:20.780 |
I can text something to someone halfway around the world. 00:17:27.780 |
and I can have an effect on their nervous system. 00:17:30.680 |
And ideas, the ideas that we communicate with words, 00:17:36.700 |
for us to do mental telepathy with each other, right? 00:17:39.180 |
I mean, I'm not the first person to say that, obviously. 00:17:54.180 |
- So you also write, I think, let's go back to the brain. 00:17:59.980 |
that the human brain has three brains in it, three forces, 00:18:09.460 |
First of all, what are the three parts of the brain, 00:18:28.140 |
those aren't actually all the same thing in ancient Greece, 00:18:36.360 |
and it was a description of really about moral behavior 00:18:46.460 |
can be described with a metaphor of two horses 00:18:56.560 |
like feeding and fighting and fleeing and reproduction. 00:19:23.940 |
They also printed the F-word and it was really quite, yeah. 00:19:33.580 |
But instincts, and then the other horse represents emotions 00:19:38.020 |
and then the charioteer represents rationality, 00:19:54.500 |
there was a very popular view of brain evolution, 00:19:58.040 |
which suggested that you have this reptilian core, 00:20:03.040 |
like a lizard brain, inner lizard brain for instincts. 00:20:10.980 |
layer on top of that evolved a limbic system in mammals. 00:20:18.740 |
which bestowed mammals with, gave them emotions, 00:20:24.660 |
And then on top of that evolved a cerebral cortex, 00:20:29.660 |
which is large in primates, but very large in humans. 00:20:49.500 |
but really by the 1970s, it was shown pretty clearly 00:21:05.940 |
And the irony is that the idea of the three-layered brain 00:21:15.840 |
with an inner lizard, that hijacks your behavior 00:21:31.000 |
was popularized by Carl Sagan in "The Dragons of Eden," 00:21:46.880 |
- So, well, the narrative is on the way it evolved, 00:22:02.340 |
and there's a, like if I get overly emotional on Twitter, 00:22:23.000 |
I think that, the way I think about philosophy and science 00:22:44.180 |
The triune brain, as it's called, this three-layer brain, 00:22:49.820 |
the idea that your brain is like an already baked cake, 00:22:57.860 |
the idea, that idea is the foundation of the law 00:23:13.340 |
It sort of fits our intuitions about how we work. 00:23:22.640 |
it lets people off the hook for nasty behavior. 00:23:32.020 |
can't be a source of wisdom, which they often are. 00:23:36.020 |
In fact, you would not wanna be around someone 00:23:50.780 |
So I guess my, and I could sort of go on and on and on, 00:23:59.120 |
I don't think it's a useful narrative in the end. 00:24:06.460 |
that we should use when we're thinking about it? 00:24:09.700 |
but I'll say that even our notion of what an instinct is 00:24:12.740 |
or what a reflex is, it's not quite right, right? 00:24:16.840 |
So if you look at evidence from ecology, for example, 00:24:21.840 |
and you look at animals in their ecological context, 00:24:25.660 |
what you can see is that even things which are reflexes 00:24:49.500 |
where they hit your patellar tendon on your knee 00:24:52.220 |
and you kick, the force with which you kick and so on 00:25:12.220 |
is the way that matches our best understanding, 00:25:26.260 |
and I'm certainly not the only one who holds this view, 00:25:37.140 |
And then I was reading work in signal processing, 00:25:46.140 |
the research suggested that the brain worked this way. 00:25:51.660 |
and they were, who don't speak to each other. 00:25:54.020 |
And they were all pointing in this direction. 00:26:18.920 |
that your brain doesn't react to things in the world. 00:26:27.940 |
We see things, we hear things, we react to them. 00:26:31.540 |
In psychology, we call this stimulus response. 00:26:35.420 |
So your face is, your voice is a stimulus to me. 00:26:43.840 |
And I might react very automatically, system one. 00:26:54.420 |
where I maybe stop myself from saying something 00:26:57.860 |
or doing something and in a more reflective way, 00:27:16.340 |
And it's constantly predicting what's going on in the body 00:27:32.800 |
- So fundamentally, the thing that the brain does 00:27:38.880 |
like talking to itself and predicting stuff about the world, 00:27:41.600 |
not like this dumb thing that just senses and responds. 00:27:46.540 |
- Yeah, so the way to think about it is like this. 00:27:48.500 |
You know, your brain is trapped in a dark, silent box. 00:28:04.440 |
is through the senses, through the sense organs, 00:28:07.440 |
your eyes, your ears, and you have sensory data 00:28:14.000 |
that you're largely unaware of to your brain, 00:28:23.240 |
But your brain is receiving sense data continuously, 00:28:35.800 |
Your brain doesn't know the cause of these sense data. 00:28:41.280 |
It's only receiving the effects of those causes, 00:28:46.380 |
And so your brain has to solve what philosophers call 00:28:51.280 |
How do you know, when you only receive the effects 00:28:54.120 |
of something, how do you know what caused those effects? 00:29:04.200 |
how does your brain know what caused those events 00:29:13.960 |
And the answer is that your brain has one other source 00:29:23.800 |
It can reconstitute in its wiring past experiences, 00:29:28.800 |
and it can combine those past experiences in novel ways. 00:29:34.400 |
And so we have lots of names for this in psychology. 00:29:39.120 |
We call it memory, we call it perceptual inference, 00:29:44.020 |
It's also, we call it concepts or conceptual knowledge. 00:29:50.960 |
Basically, if we were to stop the world right now, 00:30:13.400 |
Probabilistically, what's most likely to happen. 00:30:15.920 |
And it begins to prepare for what's going to happen 00:30:24.160 |
and it begins to prepare your experience based, 00:30:29.160 |
so it's anticipating the sense data it's going to receive. 00:30:44.880 |
or there's some sense data that your brain didn't know 00:30:52.640 |
and your brain takes it in, we say encodes it. 00:30:55.640 |
We have a fancy name for that, we call it learning. 00:31:05.900 |
and so that you can predict better next time. 00:31:08.760 |
And it turns out that predicting and correcting, 00:31:15.520 |
to run a system than constantly reacting all the time. 00:31:20.160 |
it means you have no, you can't anticipate in any way 00:31:22.800 |
what's gonna happen, and so the amount of uncertainty 00:31:54.420 |
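A minimal sketch of the predict-and-correct loop described above, for readers who think in code. The delta-rule update and the toy signal are assumptions for illustration; nothing in the conversation specifies a particular algorithm.

```python
import random

# Toy predict-and-correct loop: predict the incoming sense data, compare
# against what actually arrives, and encode the prediction error
# ("learning") so the next prediction is better.
def predict_and_correct(sense_data, learning_rate=0.1):
    prediction = 0.0  # current best guess about the next input
    for observed in sense_data:
        error = observed - prediction        # prediction error
        prediction += learning_rate * error  # encoding the error = learning
        yield prediction, error

# Usage: a noisy but structured signal; errors shrink as the model tunes in.
signal = [5.0 + random.gauss(0, 0.5) for _ in range(20)]
for p, e in predict_and_correct(signal):
    print(f"prediction={p:.2f}  error={e:+.2f}")
```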
- Yeah, so but prediction's still at the core, 00:32:03.920 |
is the brain just mostly a prediction machine then? 00:32:07.040 |
Like is the perception just a nice little feature 00:32:13.660 |
both the integration of new perceptual information? 00:32:25.680 |
- Well, I think that we can look to evolution for that, 00:32:28.760 |
for one answer, which is that when you go back, 00:32:35.640 |
we, you know, the world was populated by creatures, 00:32:51.480 |
- Oh no, I'm not talking about dinosaurs, honey. 00:32:53.400 |
I'm talking way back, further back than that. 00:33:02.940 |
it's an amphioxus, or a lancelet, that's the modern animal. 00:33:12.300 |
the common ancestor that we share with invertebrates. 00:33:16.720 |
Because, basically because of the tracing back 00:33:27.340 |
It had some cells that would later turn into a brain, 00:33:36.860 |
and it had really, really no senses, for the most part. 00:33:52.080 |
And it had no hearing, it had a vestibular cell 00:33:58.160 |
At the time, we're talking evolutionary scale here, 00:34:02.780 |
so give or take some hundred million years or something. 00:34:09.060 |
like when a backbone evolved and a brain evolved, 00:34:19.980 |
that's when your viscera, like internal systems, evolved. 00:34:33.760 |
believe that senses evolved in the service of motor action. 00:34:46.400 |
what triggered, what was the big evolutionary change, 00:34:49.960 |
what was the big pressure that made it useful 00:34:57.440 |
and an auditory system and a brain, basically? 00:34:59.920 |
And the answer that is commonly entertained right now 00:35:16.160 |
and this launched an arms race between predators and prey, 00:35:25.080 |
So these little amphioxus, these little amphioxi, 00:35:34.720 |
they're not aware of their environment very much, really. 00:36:01.180 |
It didn't evolve for the purposes of consciousness. 00:36:03.920 |
It didn't evolve for the purposes of experiencing anything. 00:36:07.440 |
It evolved to be in the service of motor control. 00:36:16.900 |
This is why scientists sometimes avoid questions 00:36:25.520 |
This is what philosophers call this teleology. 00:36:28.400 |
You might be able to say something about how things evolve, 00:36:42.120 |
The interesting thing is that was the first element 00:36:45.800 |
of social interaction is, am I gonna eat you, 00:36:50.400 |
And for that, it's useful to be able to see each other, 00:36:59.880 |
There was a time when life didn't eat each other. 00:37:17.080 |
and then it just filters whatever comes into its mouth. 00:37:21.480 |
So it is eating, but it's not actively hunting. 00:37:26.000 |
And when the concentration of food decreases, 00:37:46.040 |
So it's not really, it's not guiding its actions 00:38:05.960 |
let me ask you a question that scientists also hate 00:38:11.640 |
So how does, do you think about free will much, 00:38:15.520 |
how does that fit into this, into your view of the brain? 00:38:19.480 |
Why does it feel like we make decisions in this world? 00:38:30.360 |
- I think I have. - You would have free will? 00:38:32.920 |
but I don't put a lot of stock in my own intuitions 00:38:37.920 |
or anybody's intuitions about the cause of things. 00:38:42.880 |
is that the brain creates experiences for us. 00:38:59.640 |
- So you don't trust your own intuition about free will. 00:39:03.760 |
No, I mean, no, but I am also somewhat persuaded by, 00:39:09.520 |
like the philosopher Dan Dennett wrote at one point 00:39:13.880 |
that it's, I can't say it as eloquently as him, 00:39:22.640 |
And so there is this observation that we're not robots 00:39:27.920 |
and we can do some things like a little more sophisticated 00:39:39.460 |
your internal model that's running right now, 00:39:48.760 |
is based on the fact that you have years of experience 00:40:00.840 |
I mean, that's how you can understand the words 00:40:07.240 |
I think we did this once before too, didn't we? 00:40:10.120 |
- I don't know, I would have to access my memory module. 00:40:32.040 |
so it's remembering, but you're not consciously remembering. 00:40:36.200 |
It's basically re-implementing prior experiences 00:40:40.080 |
as a way of predicting what's gonna happen next. 00:40:42.160 |
And it can do something called conceptual combination, 00:40:44.720 |
which is it can take bits and pieces of the past 00:40:50.300 |
So you can experience and make sense of things 00:40:57.580 |
because you've encountered something similar to them. 00:41:23.260 |
But I think really where free will comes from 00:41:29.460 |
involves cultivating experiences for yourself 00:41:45.460 |
your brain wired itself to your surroundings, 00:41:53.500 |
So you were handed an internal model basically. 00:42:06.660 |
you can cultivate new experiences for yourself. 00:42:11.500 |
And those new experiences can change your internal model 00:42:16.500 |
and you can actually practice those experiences 00:42:35.380 |
You aren't responsible for the model that you were handed, 00:42:49.900 |
but you are responsible for the one you have now. 00:42:52.980 |
You can choose, you choose what you expose yourself to. 00:43:16.480 |
of the billions of decisions you make early on in life 00:43:30.300 |
just the amount of possibilities that are created 00:43:40.140 |
that somewhere within that, that's free will. 00:43:44.980 |
Even if it's all deterministic, that might as well be, 00:43:52.740 |
and the fact that you just make one trajectory 00:43:54.700 |
through those set of choices seems to be like 00:44:08.100 |
- Well, there's lots of magic, I would say, so far, 00:44:28.520 |
but I think there's quite a bit of magic, actually. 00:44:45.100 |
there's also analog communication between neurons, 00:44:47.600 |
not just digital, so it's not just with firing of axons. 00:44:51.920 |
And some of that, there are other ways to communicate. 00:45:00.680 |
and the noise is there for a really good reason, 00:45:22.720 |
the only job is to inject noise into their neural patterns. 00:45:30.200 |
- So you can think about stochasticity or noise 00:45:45.780 |
you can't reach back into your past and change your past. 00:46:05.680 |
- So one way to think about it is that you're continuously, 00:46:08.060 |
this is a colleague of mine, a friend of mine said, 00:46:17.240 |
Yes, you are continually cultivating your past 00:46:23.480 |
- So you think, yeah, I guess the construction 00:46:29.720 |
of the mental model that you use for prediction 00:46:32.900 |
ultimately contains within it your perception of the past, 00:46:38.800 |
or even just the entirety of your narrative about the past. 00:46:41.880 |
So you're constantly rewriting the story of your past. 00:46:48.120 |
That's one poetic and also just awe-inspiring. 00:47:00.800 |
you have to infer about the sources of the thing 00:47:18.360 |
just like you said, there's this brain sitting alone 00:47:20.760 |
in the darkness trying to perceive the world. 00:47:23.200 |
How do we know that the world is out there to be perceived? 00:47:27.220 |
- Yeah, so I don't think that you should be asking questions 00:47:34.400 |
- I actually did before this, so I apologize. 00:47:43.280 |
we can be pretty sure that there's a there there 00:47:46.120 |
is that the structure of the information in the world, 00:47:57.840 |
that comes from your body, it's not random stuff. 00:48:02.000 |
There's a spatial structure and a temporal structure, 00:48:05.040 |
and that spatial and temporal structure wires your brain. 00:48:08.660 |
So an infant brain is not a miniature adult brain. 00:48:13.920 |
It's a brain that is waiting for wiring instructions 00:48:20.400 |
those wiring instructions to develop in a typical way. 00:48:37.360 |
because the visual system in that baby's brain 00:48:41.840 |
The retina of your eye, which actually is part of your brain 00:48:58.480 |
The same thing is true really for all your senses. 00:49:09.160 |
wires your brain so that you have an internal model 00:49:12.760 |
of that world so that your brain can predict well 00:49:16.240 |
to keep you alive and well and allow you to thrive. 00:49:19.800 |
- That's fascinating that the brain is waiting 00:49:22.160 |
for a very specific kind of set of instructions 00:49:35.120 |
The brain needs some input in order to develop normally. 00:49:39.240 |
And we are genetically, as I say in the book, 00:49:44.240 |
we have the kind of nature that requires nurture. 00:49:48.280 |
We can't develop normally without sensory input 00:49:58.000 |
and some other animals too, but really seriously in humans, 00:50:03.000 |
is the input that we need is not just physical. 00:50:14.880 |
to develop normally, that infant needs eye contact, 00:50:38.040 |
And again, I would say there are lots of cultural patterns 00:50:46.480 |
It's not like the infant has to be cared for in one way. 00:50:50.280 |
Whatever the social environment is for an infant, 00:50:54.920 |
that it will be reflected in that infant's internal model. 00:51:05.680 |
although we don't always experience it that way. 00:51:23.260 |
- Yeah, but nevertheless, you're kind of saying 00:51:28.620 |
that the physical reality has a consistent thing throughout 00:51:33.620 |
that keeps feeding these set of sensory information 00:51:49.780 |
so prediction error, your brain can learn it. 00:51:57.100 |
But you had a wonderful conversation with David Eagleman, 00:52:04.260 |
and gave lots and lots of really very, very cool examples. 00:52:11.020 |
but not obviously to the extent that he did in his book. 00:52:16.780 |
But it speaks to the point that your internal model 00:52:23.160 |
And therefore, you always can modify your experience. 00:52:34.780 |
in virtual reality, or if we sit at home during a pandemic, 00:52:39.340 |
and we spend most of our day on Twitter and TikTok. 00:52:56.500 |
there are different ways to specify your question, right? 00:53:09.900 |
Like, "Can we create a new sensory modality?" 00:53:25.500 |
some of those statistical regularities, right? 00:53:29.060 |
what happens to an adult brain when you remove 00:53:32.300 |
some of the statistical patterns that were there, 00:53:39.740 |
or in the actual, like, you remove eyesight, for example? 00:53:44.820 |
I mean, basically, one way to limit the inputs 00:53:49.020 |
to your brain are to stay home and protect yourself. 00:53:53.100 |
Another way is to put someone in solitary confinement. 00:53:57.540 |
Another way is to stick them in a nursing home. 00:54:06.980 |
Which really are, where people are somewhat impoverished 00:54:21.420 |
But the point is, I think, that the human brain 00:54:45.100 |
when you restrict what you expose yourself to. 00:54:52.200 |
- Yeah, you know, there's all this talk of diversity. 00:54:56.980 |
The brain loves it, to the fullest definition 00:55:07.500 |
The only place where we seem to have difficulty 00:55:13.820 |
But we, who wants to eat the same food every day? 00:55:17.460 |
Who wants to wear the same clothes every day? 00:55:19.380 |
I mean, my husband, if you ask him to close his eyes, 00:55:21.820 |
he won't be able to tell you what he's wearing. 00:55:24.660 |
He'll buy seven shirts of exactly the same style 00:55:27.820 |
in different colors, but they are in different colors, 00:55:36.940 |
and therefore I wear the same thing every time? 00:55:42.140 |
you are a fairly sharp dresser, so there is that, 00:55:55.080 |
so the two most expensive things your brain can do, 00:55:59.340 |
metabolically speaking, are move your body and learn. 00:56:14.340 |
but it's a cost, it's an investment that gives returns. 00:56:19.260 |
And in general, people vary in how much they like novelty, 00:56:23.420 |
unexpected things, some people really like it, 00:56:29.400 |
But in general, we don't eat the same thing every day, 00:56:40.540 |
The only place we have difficulty with diversity 00:56:48.460 |
And then we have considerable problems there, 00:56:52.980 |
- Let me ask, I don't know if you're familiar 00:56:55.040 |
with Donald Hoffman's work about questions of reality. 00:57:03.940 |
that the very thing we've been talking about, 00:57:15.340 |
That there is something much bigger out there 00:57:20.420 |
cognitive capabilities, are just not even perceiving? 00:57:23.260 |
That the thing we're perceiving is just a crappy, 00:57:26.900 |
like Windows 95 interface onto a much bigger, 00:57:37.720 |
- Well, without getting too metaphysical about it, 00:57:42.620 |
it doesn't have to be the crappy version of anything, 00:58:07.100 |
and experience things that we couldn't otherwise, 00:58:10.380 |
whether they are different parts of the visual spectrum, 00:58:15.180 |
or things that are too microscopically small for us to see, 00:58:29.460 |
the interesting or potentially sad thing about humans 00:58:40.640 |
we think there's a natural reason for experiencing it. 00:58:48.440 |
and that all the other stuff isn't important. 00:58:53.300 |
Many of the things that we think of as natural 00:58:58.000 |
They're certainly real, but we've created them. 00:59:07.620 |
outside of our awareness that have tremendous influence 00:59:21.840 |
really the question is how fantastical is it? 00:59:23.880 |
- Yeah, like what, you know, a lot of people ask me. 00:59:33.280 |
I'm talking to a few researchers in psychedelics. 00:59:59.420 |
But I think that, well, think about what happens. 01:00:02.500 |
So you're running, your brain's running this internal model 01:00:10.440 |
you have no awareness of the mechanics of it, right? 01:00:15.120 |
And so one thing that's going on all the time 01:00:27.760 |
Like how is, the last time I was in this sensory array 01:00:33.960 |
and this chain of events which just occurred, 01:00:43.600 |
It comes up with a distribution of possible answers. 01:00:47.360 |
And then there has to be some selection process. 01:00:55.800 |
a population of neurons that helps to choose. 01:00:59.700 |
It's not, I'm not talking about a homunculus in your brain 01:01:08.980 |
It's not the center of yourself or anything like that. 01:01:20.840 |
and helps to select or narrow the field, okay? 01:01:35.880 |
because the regions of the brain that make it up 01:01:38.240 |
are in the frontal lobe and the parietal lobe. 01:01:50.840 |
whether or not you feel like you're expending effort 01:02:02.120 |
And so think about what's happening when you sleep. 01:02:13.840 |
becomes a little bit, the tethers from the world are loosened 01:02:18.840 |
and this network, which is involved in, you know, 01:02:24.280 |
maybe weeding out unrealistic things, is a little bit quiet. 01:02:29.280 |
So your dreams are really your internal model 01:02:40.040 |
Except, so you can do things that you can't do in real life 01:02:46.480 |
Like I, for example, when I fly on my back in a dream, 01:02:53.240 |
- When you're lying on your back in your dream? 01:02:55.040 |
- No, when I'm in my dream and flying in a dream, 01:03:12.440 |
- Yeah, but you're talking about like airplane. 01:03:15.360 |
- I fly in my dreams and I'm way faster, right? 01:03:25.780 |
I mean, I've flown in an airplane and I've seen birds fly 01:03:34.320 |
I don't know if he flies faster on his back, but-- 01:03:39.160 |
But yeah, but anyways, my point is that, you know, 01:03:50.680 |
your internal model is still being constrained by your body. 01:03:58.520 |
It's always receiving sense data from your body. 01:04:12.760 |
which is a really good thing because if you were, 01:04:17.040 |
to anything outside your own skin ever again. 01:04:19.120 |
Like right now, you seem like you're sitting there 01:04:21.320 |
very calmly, but you have a virtual drama, right? 01:04:25.360 |
It's like an opera going on inside your body. 01:04:30.320 |
And so I think that one of the things that happens 01:04:34.760 |
when people take psilocybin or take, you know, 01:04:45.040 |
- Completely removed. - Are completely removed. 01:04:59.680 |
so that it doesn't go completely off the rails. 01:05:02.600 |
- Yeah, I know there's, again, that wiring to the other brain 01:05:06.520 |
that's the guide is at least a tiny little tether. 01:05:11.280 |
Let's talk about emotion a little bit if we could. 01:05:15.000 |
Emotion comes up often, and I have never spoken 01:05:23.600 |
from a biological and neuroscience perspective that you do. 01:05:37.080 |
but as somebody who was born in the Soviet Union 01:05:42.440 |
talks about love nonstop, you know, emotion is a, 01:05:49.040 |
I don't know what, so maybe let's just try to talk about it. 01:06:00.720 |
but what are some misconceptions we writers of poetry, 01:06:16.240 |
from both a scientific and an engineering perspective? 01:06:20.040 |
- Yeah, so there is a common view of emotion in the West. 01:06:37.160 |
We have an inner beast and that comes baked in 01:06:42.000 |
So you've got circuits for anger, sadness, fear. 01:06:44.320 |
It's interesting that they all have English names, 01:06:55.960 |
and, you know, so when your fear circuit is triggered, 01:07:15.120 |
They're not the only responses that you give, 01:07:16.840 |
but on average, they're the prototypical responses. 01:07:27.800 |
that emotions are these profoundly unhelpful things 01:07:46.380 |
The evidence actually lines up beautifully with each other. 01:07:50.800 |
And it doesn't matter whether you're measuring people's 01:07:55.280 |
or you're measuring their peripheral physiology, 01:08:05.760 |
and you don't really find strong evidence for this. 01:08:09.080 |
And I say this as somebody who not only has reviewed 01:08:13.000 |
really thousands of articles and run, you know, 01:08:16.280 |
big meta-analyses, which are statistical summaries 01:08:21.280 |
but also as someone who has sent teams of researchers 01:08:32.520 |
which are very different from urban large-scale cultures 01:08:42.800 |
and I say we euphemistically because I myself didn't go 01:08:56.380 |
and more formative for them to have that experience. 01:08:59.320 |
But I was in contact with them every day by satellite phone. 01:09:03.040 |
And this was to visit the Hadza hunter-gatherers 01:09:30.980 |
when expressions of emotion were supposed to have evolved, 01:09:41.540 |
I was sort of struggling with this set of observations, 01:09:57.700 |
not a single individual measure or pattern of measures 01:10:08.820 |
How can you possibly make sense of those two things? 01:10:15.860 |
and a lot of immersing myself in different literatures, 01:10:22.080 |
that the brain is constructing these instances 01:10:32.100 |
when I suggest to you that what your brain is doing 01:10:37.020 |
and it's asking itself, figuratively speaking, 01:11:09.460 |
And a mental representation of a category is a concept. 01:11:13.500 |
So your brain is constructing categories or concepts 01:11:19.620 |
So you really wanna understand what a brain is doing. 01:11:23.660 |
like classification models is not gonna help you 01:11:33.820 |
or you could say it's doing concept construction. 01:11:35.700 |
It's using past experience to conjure a concept, 01:11:41.160 |
And if it's using past experiences of emotion, 01:11:59.060 |
changes depending on the situation that you're in. 01:12:04.180 |
So for example, if your brain uses past experiences 01:12:11.060 |
either because somebody labeled them for you, 01:12:14.540 |
taught them to you, you observed them in movies and so on, 01:12:21.420 |
from your concept for anger than another situation. 01:12:28.940 |
are what we call a population of variable instances. 01:12:35.420 |
Sometimes when you're angry, you might smile. 01:12:42.700 |
Sometimes your heart rate will go up, it will go down, 01:12:46.820 |
It depends on what action you're about to take. 01:12:49.640 |
Because the way prediction, and I should say, 01:12:58.580 |
in the study of the peripheral nervous system. 01:13:04.100 |
And so if you look at what the brain is doing, 01:13:09.420 |
and here's the hypothesis that you would come up with, 01:13:14.820 |
I've published these details in scientific papers, 01:13:48.100 |
the first thing it's doing is it's making a prediction 01:13:53.100 |
of how to change the internal systems of your body, 01:13:58.740 |
the control of your heart, control of your lungs, 01:14:01.860 |
a flush of cortisol, which is not a stress hormone. 01:14:11.440 |
you need to do something metabolically expensive. 01:14:30.520 |
to execute some actions, to move in some way. 01:14:34.220 |
And then it infers based on those motor predictions 01:14:54.240 |
Your brain makes an inference about what you will sense 01:15:08.160 |
are a consequence of those predictions, those concepts. 01:15:16.780 |
it's constructing an instance of that emotion. 01:15:51.200 |
- Yeah, so our colloquial notion of a concept 01:15:53.440 |
where I say, well, what's a concept of a bird? 01:15:58.440 |
And then you list a set of features off to me. 01:16:05.900 |
But if you go into the literature in cognitive science, 01:16:10.900 |
what you'll see is that the way that scientists 01:16:23.560 |
about a concept as a dictionary definition for a category. 01:16:39.440 |
the necessary and sufficient features of those instances. 01:16:58.160 |
that actually not all instances of birds have feathers 01:17:11.080 |
and sufficient features stored in your brain somewhere. 01:17:15.820 |
A prototype meaning you still have a single representation 01:17:20.960 |
but the features are like of the most typical instance 01:17:27.240 |
of the category or maybe the most frequent instance, 01:17:32.920 |
They have some graded similarity to the prototype. 01:17:47.900 |
a lot of work to say that then a series of experiments 01:18:02.100 |
or instance of the category and reading off the features 01:18:11.000 |
So if we were in a pet store and I asked you, 01:18:21.860 |
You would be more likely to give me features of a good pet. 01:18:34.060 |
you would be more likely to give me the features 01:18:36.540 |
of a bird that you would eat, like a chicken. 01:18:40.340 |
you'd be more likely to give me in this country, 01:18:45.340 |
you know, the features of a sparrow or a robin. 01:18:50.660 |
you would probably give me the features of a peacock 01:18:59.220 |
that you would see a peacock in such circumstances. 01:19:01.500 |
So the idea was that really what your brain was doing 01:19:11.580 |
that meets the function that the category is being put to. 01:19:20.540 |
Then people started studying ad hoc concepts, 01:19:31.160 |
where the instances don't share any physical features. 01:19:36.640 |
But the function of the instances are the same. 01:19:44.860 |
What are all the things that can protect you from the rain? 01:20:05.400 |
So the idea is that the function of the instances 01:20:11.480 |
even if they look different, sound different, 01:20:13.840 |
smell different, this is called an abstract concept 01:20:21.600 |
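A small sketch of the distinction just described, in case it helps to see it concretely: a classical category groups things by shared physical features, while an ad hoc (abstract) category groups them by the function they serve. All objects and features below are invented for illustration.

```python
# Invented toy data: physical features vs. functions for a few objects.
objects = {
    "umbrella":  {"features": {"fabric", "handle"}, "functions": {"keeps_rain_off"}},
    "newspaper": {"features": {"paper", "flat"},    "functions": {"keeps_rain_off"}},
    "awning":    {"features": {"fabric", "fixed"},  "functions": {"keeps_rain_off"}},
    "sparrow":   {"features": {"feathers", "wings"}, "functions": {"flies"}},
}

def classical_category(shared_feature):
    # Membership by shared physical feature (the "dictionary definition" view).
    return {name for name, o in objects.items() if shared_feature in o["features"]}

def ad_hoc_category(goal):
    # Membership by function in a situation, regardless of physical features.
    return {name for name, o in objects.items() if goal in o["functions"]}

print(classical_category("fabric"))       # umbrella, awning: share a feature
print(ad_hoc_category("keeps_rain_off"))  # umbrella, newspaper, awning: share only a function
```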
Now, the really cool thing about conceptual categories 01:20:35.480 |
which is called an abstract concept or a conceptual category 01:20:39.040 |
'cause the things don't share physical features, 01:20:46.220 |
One is that's what Darwin said a species was. 01:20:50.300 |
So Darwin is known for discovering natural selection. 01:21:02.300 |
which was really profound, which he's less celebrated for, 01:21:06.420 |
is understanding that all biological categories 01:21:22.300 |
a species was thought to be a classical category 01:21:27.420 |
where all the instances of dogs were the same, 01:21:32.180 |
and any variation from that perfect platonic instance 01:21:41.380 |
And Darwin said, "No, it's not error, it's meaningful." 01:21:44.980 |
Nature selects on the basis of that variation. 01:21:50.660 |
The reason why natural selection is powerful and can exist 01:22:06.020 |
and the amount of fur the dog has and the color 01:22:09.620 |
and how long is the tail and how long is the snout. 01:22:17.500 |
in all kinds of ways, right, including in cultural ways. 01:22:24.660 |
So that's one thing that's really interesting 01:22:29.300 |
was basically saying a species is a conceptual category. 01:22:35.780 |
about what is a species, you can't find anybody agreeing 01:22:56.140 |
It's not unbounded variation, but they are variable, right? 01:23:02.980 |
about conceptual categories is that they are the categories 01:23:25.460 |
Is there any physical feature that all the currencies 01:23:28.660 |
in all the world that's ever been used by humans share? 01:23:37.760 |
So it's getting to the point that you make this function. 01:23:49.020 |
We all impose on whatever it is, salt, barley, 01:23:52.700 |
little shells, big rocks in the ocean that can't move, 01:23:58.260 |
which are basically a promise of something in the future, 01:24:02.300 |
All of these things, we impose value on them. 01:24:13.860 |
By the way, you're attributing some of that to Darwin, 01:24:28.300 |
so if you read, for example, the biologist Ernst Mayr, 01:24:48.060 |
the idea that there's a single set of features 01:25:10.700 |
the individuals of a species can have offspring. 01:25:20.660 |
So what I'm saying is that in cognitive science, 01:25:24.660 |
they discovered the idea of conceptual categories 01:25:26.940 |
or ad hoc concepts, these concepts that can change 01:25:30.860 |
based on the function they're serving, right? 01:25:38.420 |
and it's also in the philosophy of social reality. 01:25:42.260 |
The way that philosophers talk about social reality, 01:25:46.180 |
I mean, we impose, we're treating a bunch of things 01:25:52.020 |
And sometimes we take things that are physically the same 01:25:58.340 |
- But it feels like the number of variables involved 01:26:02.260 |
in that kind of categorization is nearly infinite. 01:26:06.140 |
because there is a physical constraint, right? 01:26:08.540 |
Like you and I could agree that we can fly in real life, 01:26:18.820 |
You and I could agree that we could walk through the walls, 01:26:21.500 |
but we can't, we could agree that we could eat glass, 01:26:25.020 |
of constraints, but I just-- - Yeah, we can agree 01:26:33.020 |
- But physical reality still holds the trump card, right? 01:26:37.220 |
But still there's a lot of-- - The trump card, 01:26:41.580 |
- Pun completely unintended, but there you go, 01:26:51.620 |
So what I'm saying is that emotions are like money. 01:26:54.540 |
Basically, they're like money, they're like countries, 01:26:59.020 |
they're like kings and queens and presidents, 01:27:07.340 |
We take these physical signals and we give them meanings 01:27:10.500 |
that they don't otherwise have by their physical nature. 01:27:13.860 |
And because we agree, they have that function. 01:27:18.860 |
- But the beautiful thing, so maybe unlike money, 01:27:23.020 |
I love this similarity is, it's not obvious to me 01:27:33.320 |
for each of us humans, and yet we kind of converge. 01:27:36.580 |
- Well, in a culture we converge, but not across cultures. 01:27:41.700 |
There are huge differences in what concepts exist, 01:27:55.860 |
as their brains become wired to their physical 01:28:05.980 |
we are bootstrapping into their brains a set of emotion 01:28:05.980 |
concepts, that's partly what they're learning. 01:28:10.220 |
just the way we curate for them, what is a dog? 01:28:25.200 |
When your kid is throwing Cheerios on the floor 01:28:30.660 |
instead of eating them, or your kid is crying 01:28:33.700 |
when she won't put herself to sleep or whatever. 01:28:46.600 |
that they help infants learn, abstract categories. 01:28:49.360 |
There's a huge literature showing that children 01:28:56.440 |
like infants, really young infants, pre-verbal infants, 01:29:12.040 |
And I put it down and the bling makes a squeaky noise. 01:29:22.120 |
And I put it down and it makes a squeaky noise. 01:29:34.100 |
"will expect this to make a noise, a squeaky noise. 01:29:39.100 |
"And if you don't, if it doesn't, you'll be surprised 01:29:41.880 |
"because it violated your expectation, right? 01:29:44.640 |
"I'm building for you an internal model of a bling." 01:29:49.640 |
Okay, infants can do this really, really at a young age. 01:29:59.740 |
And what happens when you go to a new culture? 01:30:05.820 |
you have to do what's called emotion acculturation. 01:30:18.140 |
how do they learn the emotion concepts of that culture? 01:30:25.980 |
and also the movements, the raise of an eyebrow, 01:30:30.620 |
How do they learn to make sense of cues from other people 01:30:49.400 |
Is there a difference between the emotional lives 01:30:53.940 |
of those two categories of biological systems? 01:31:13.660 |
They believed that they were more emotional than men, 01:31:19.020 |
Okay, and then we gave them little handheld computers. 01:31:28.340 |
A couple of pounds, they weighed a couple of pounds. 01:31:45.820 |
and just ask them to report how they were feeling, 01:31:53.980 |
And then at the end, and then we looked at their reports, 01:32:07.820 |
So they were, you know, they were treading water 01:32:17.400 |
who were like floating tranquilly, you know, in a lake. 01:32:25.180 |
But there were no difference between men and women. 01:32:29.140 |
at the end of the sampling period, we asked people, 01:32:32.740 |
so reflect over the past two weeks and tell us. 01:32:41.140 |
So tell us how emotional do you think you are? 01:32:45.460 |
So men and women believe that they are different. 01:32:54.120 |
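A sketch of the contrast this study design produces: many momentary reports per person versus one retrospective belief. The numbers below are invented purely to illustrate the analysis, not the study's data.

```python
from statistics import mean

# Invented illustration data: momentary emotion reports (sampled ~10x/day
# over weeks in the real study) vs. one retrospective self-rating.
participants = [
    {"sex": "M", "momentary": [3, 4, 2, 5, 3], "retrospective": 3},
    {"sex": "M", "momentary": [4, 2, 3, 4, 3], "retrospective": 4},
    {"sex": "F", "momentary": [3, 4, 3, 5, 2], "retrospective": 8},
    {"sex": "F", "momentary": [4, 3, 2, 4, 4], "retrospective": 7},
]

for group in ("M", "F"):
    rows = [p for p in participants if p["sex"] == group]
    momentary = mean(mean(p["momentary"]) for p in rows)
    belief = mean(p["retrospective"] for p in rows)
    print(f"{group}: momentary mean={momentary:.2f}, retrospective belief={belief:.2f}")

# The pattern described above: momentary means look alike across groups,
# while retrospective beliefs diverge.
```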
they make different inferences about emotion. 01:32:57.680 |
If a man is scowling, like if you and I were together, 01:33:09.600 |
- By the way, people love it when you look at the camera. 01:33:16.280 |
If you and I make exactly the same set of facial movements, 01:33:21.160 |
when people look at you, both men and women look at you, 01:33:57.600 |
that there is an element when I see Hillary Clinton, 01:34:01.360 |
that there was something annoying about her to me. 01:34:19.620 |
when I see in other people, wouldn't be annoying. 01:34:25.040 |
Because it certainly does feel like that concept 01:34:32.200 |
well, let me just say that what you would predict about, 01:34:35.980 |
for example, the performance of the two of them 01:34:40.740 |
and I wrote an op-ed for the New York Times actually, 01:34:46.340 |
And it played out really pretty much as I thought 01:34:51.220 |
It's not like I'm like a great fortune teller or anything. 01:34:57.300 |
a woman's, people make internal attributions, it's called. 01:35:03.260 |
They infer that the facial movements and body posture 01:35:07.220 |
and vocalizations of a woman reflect her inner state, 01:35:12.900 |
that they reflect his response to the situation. 01:35:16.460 |
It says something about the situation he's in. 01:35:19.380 |
- Now, for the thing that you were describing 01:35:27.820 |
but it's also in line with research, which shows, 01:35:37.620 |
where the expectation is that a woman will be nurturant 01:35:58.500 |
who's an authority figure, she's really in a catch-22. 01:36:10.820 |
I mean, one of the bigger questions to ask here, 01:36:30.460 |
what is my intuition about what regulates my eating? 01:36:42.220 |
But actually, research shows, and it's beautiful research. 01:36:46.020 |
I love this research 'cause it so violates my own 01:36:52.840 |
that most animals on this planet who have been studied, 01:37:12.340 |
very beautiful research, with humans, with crickets, 01:37:35.820 |
- It feels like you're regulating around carbohydrates, 01:37:38.900 |
- Yeah, but in fact, actually, what I am doing, 01:37:51.620 |
to actually deliberately try to focus on the protein. 01:37:55.760 |
This is the idea behind bias training, right? 01:38:00.180 |
Like if you, I also did not experience Hillary Clinton 01:38:23.740 |
That doesn't mean that rationality is the absence of emotion 01:38:28.460 |
because sometimes emotion or feelings in general, 01:38:33.220 |
not the same thing as emotion, that's another topic, 01:38:50.380 |
It doesn't really matter how confident you feel. 01:38:52.780 |
That confidence could be also explained by science, right? 01:39:02.340 |
Regardless of whether somebody's a Republican 01:39:11.900 |
then that is information that you can act on. 01:39:27.420 |
through which you are now perceiving the world 01:39:32.260 |
I mean, this is now true for basically every, 01:39:51.100 |
scientists who will talk to you about cognitive empathy 01:39:53.620 |
and emotional empathy and I prefer to think of it, 01:40:03.140 |
which is that your brain is always making predictions 01:40:10.360 |
and what you've learned from books and movies 01:40:30.040 |
So, you know, when I'm giving lectures to people, 01:40:34.300 |
I'll show them like a blobby black and white image 01:40:38.060 |
and they're experientially blind to the image, 01:40:47.480 |
the blobby image and then they see actually an object in it. 01:40:57.780 |
Or anyone who's-- - It's a beautiful example. 01:41:23.340 |
because they have cataracts or they have corneal damage 01:41:36.020 |
and then light reaches the brain and they can't see. 01:41:45.140 |
they are experientially blind to certain things. 01:42:12.060 |
If you are not similar enough to that person, 01:42:16.100 |
you will have a hard time making a prediction 01:42:19.460 |
You will be experientially blind to what they feel. 01:42:29.460 |
are under-prescribed medicine by their physicians. 01:42:37.580 |
It's not that the physicians are racist necessarily, 01:43:12.300 |
So we are, you know, empathy is not, it's not magic. 01:43:23.660 |
about what each other's feeling and thinking. 01:43:39.980 |
But in our culture, we're constantly making inferences. 01:43:44.980 |
And we're not doing it necessarily consciously, 01:44:02.620 |
we have different cultures in this country right now 01:44:06.380 |
that are, there are a number of reasons for this. 01:44:08.900 |
I mean, part of it is, I don't know if you saw 01:44:15.540 |
- Yeah, it's a great, it's really great documentary. 01:44:19.660 |
- About what social networks are doing to our society? 01:44:41.860 |
And so the fact that machine learning algorithms 01:44:45.100 |
are serving people up information on social media 01:44:48.820 |
that is consistent with what they've already viewed 01:44:57.580 |
But it's not the only reason why you have these silos, 01:45:13.340 |
- Yeah, I mean, okay, so many things you said 01:45:21.740 |
like I preach and I try to practice empathy a lot. 01:45:26.740 |
And something about the way you've explained it 01:45:38.740 |
to expand this capacity to predict more effectively. 01:45:43.620 |
- So like what I do is kind of like a method acting thing, 01:45:47.660 |
which is I imagine what the life of a person is like. 01:45:53.820 |
I mean, this is something you see with Black Lives Matter 01:46:01.900 |
but I have, because of martial arts and so on, 01:46:17.000 |
people aren't doing that with police officers. 01:46:19.380 |
They're not imagining, they're not empathizing 01:46:22.980 |
or putting themselves in the shoes of a police officer 01:46:28.620 |
how dangerous it is, how difficult it is to maintain calm 01:46:32.380 |
and under so much uncertainty, all those kind of things. 01:46:35.140 |
- But there's more, there's even, that's all that's true, 01:46:41.300 |
I mean, like from a predicting brain standpoint, 01:46:47.100 |
So I don't know if you wanna go down that path 01:46:52.980 |
that I was most gratified by, I still am receiving, 01:47:00.980 |
and I'm still receiving daily emails from people, right? 01:47:05.800 |
But one of the most gratifying emails I received 01:47:12.780 |
who told me that he thought that "How Emotions Are Made" 01:47:17.160 |
contained information that would be really helpful 01:47:35.060 |
and using what we know about the science of perception 01:47:39.260 |
from a prediction standpoint, like the brain is a predictor, 01:47:45.820 |
what might be happening in these circumstances. 01:47:53.360 |
because everyone gets mad at you when you talk about this. 01:48:10.020 |
and that definitely is in line with Black Lives Matter 01:48:23.460 |
And I'm not talking about this bad apple or that bad apple. 01:48:28.020 |
who are necessarily shooting people in the back 01:48:34.420 |
well-meaning cops who have the kind of predicting brain 01:48:54.980 |
The way that these situations are constructed, 01:48:58.300 |
I think it's just, there's a lot to be said there, I guess, 01:49:01.420 |
is what I wanna say. - Yeah, is there something 01:49:05.900 |
From the perspective of the predictive brain, 01:49:09.660 |
which is a fascinating perspective to take on this, 01:49:18.100 |
there seems to be a concept of a police officer being built. 01:49:25.640 |
- But it's gaining strength, so it's being reinforced, 01:49:32.940 |
But I think, yeah, for sure, I think that that's right. 01:49:35.400 |
I think that there's a shift in the stereotype 01:49:44.380 |
There's a stereotype of a black man in this country 01:49:50.760 |
not always, but largely, that many people watch. 01:49:55.760 |
I mean, you think you're watching a 10 o'clock drama 01:50:00.220 |
and all you're doing is kicking back and relaxing, 01:50:02.680 |
but actually, you're having certain predictions reinforced 01:50:07.980 |
And what's happening now with police is the same thing, 01:50:13.580 |
that there are certain stereotypes of a police officer 01:50:16.940 |
that are being abandoned and other stereotypes 01:50:19.580 |
that are being reinforced by what you see happening. 01:50:26.500 |
I mean, there's a lot to say about this, really, 01:50:28.540 |
that regardless of whether it makes people mad or not, 01:50:39.180 |
The brain makes predictions about internal changes 01:50:44.180 |
in the body first and then it starts to prepare motor action 01:50:49.180 |
and then it makes a prediction about what you will see 01:50:55.800 |
So it's also the case that we didn't talk about 01:51:01.780 |
is that sensory sampling, like your brain's ability 01:51:05.620 |
to sample what's out there, is yoked to your heart rate. 01:51:15.300 |
where it's easier for you to see what's happening 01:51:20.100 |
And so if your heart rate goes through the roof, 01:51:23.440 |
you will be more likely to just go with your prediction 01:51:34.780 |
because you're actually literally not seeing as well. 01:51:37.880 |
Or you will see things that aren't there, basically. 01:51:42.020 |
- Is there something that we could say by way of advice 01:52:02.820 |
- Well, I actually think it is emotion in the following sense. 01:52:16.100 |
but I really think that this is what's happening. 01:52:18.980 |
One thing we haven't talked about is brains evolved, 01:52:31.740 |
You have a brain so that it can control your body. 01:52:37.340 |
for predictively controlling your body is allostasis. 01:52:47.220 |
before they arise so that you can act as you need to act. 01:52:51.420 |
And the metaphor that I use is a body budget. 01:52:55.140 |
You know, your brain is running a budget for your body. 01:53:01.900 |
And instead of having one or two bank accounts, 01:53:16.840 |
when is it good to spend and when is it good to save 01:53:21.180 |
and am I gonna get a return on my investment? 01:53:23.900 |
Whenever people talk about reward or reward prediction error 01:53:32.940 |
They're talking about your brain's predictions 01:53:34.580 |
about whether or not there will be a deposit or withdrawal. 01:53:42.160 |
when your brain is running a deficit in your body budgets, 01:54:01.020 |
you can't predict what's going to happen next. 01:54:05.540 |
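A toy rendering of the body-budget metaphor, with the cost-of-surprise term and all numbers assumed for illustration only: the brain anticipates deposits and withdrawals, and unpredictability itself drains the budget.

```python
# Toy body budget: each event is a metabolic deposit (+) or withdrawal (-).
# Surprise (prediction error) carries an extra cost, so the same net events
# drain the budget more when they arrive unpredictably.
def run_body_budget(events, surprise_cost=0.5):
    budget = 10.0    # arbitrary starting reserve
    expected = 0.0   # running prediction of the next event
    for actual in events:
        error = abs(actual - expected)             # how surprising was it?
        budget += actual - surprise_cost * error   # pay the event plus the surprise
        expected = 0.8 * expected + 0.2 * actual   # update the prediction
        print(f"budget={budget:6.2f}  surprise={error:.2f}")
    return budget

run_body_budget([-1, -1, -1, -1, +4])  # a fairly predictable week
run_body_budget([+4, -4, +3, -3, 0])   # same net total, far less predictable
```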
So I have this absolutely brilliant scientist 01:54:08.860 |
working in my lab, his name is Jordan Theriault 01:54:13.600 |
and he's published this really terrific paper 01:54:17.080 |
on a sense of should, like, why do we have social rules? 01:54:27.560 |
It's because if I make myself predictable to you, 01:54:40.260 |
Novelty or unpredictability at the extreme is expensive. 01:54:58.600 |
It could be just something is not predictable 01:55:17.940 |
and you will start to feel really, really, really distressed. 01:55:22.120 |
So what we have is a culture full of people right now 01:55:27.040 |
who are, their body budgets are just decimated 01:55:30.800 |
and there's a tremendous amount of uncertainty. 01:55:34.660 |
When you talk about it as depression, anxiety, 01:55:39.720 |
it makes you think that it's not about your metabolism, 01:55:50.000 |
or about making sure that you have social connections. 01:55:53.180 |
You think that it's something separate from that. 01:56:04.080 |
when things aren't quite right with your predictions. 01:56:26.660 |
like constantly be learning to making adjustments. 01:56:29.900 |
And then over time, there's a cost to be paid 01:57:04.780 |
they're metabolically encumbered and they're distressed. 01:57:10.100 |
And in order to try to have empathy for someone 01:57:31.020 |
what do you do when you are running a deficit 01:57:36.980 |
What does it mean for a brain to stop spending? 01:57:43.140 |
stops moving the body, and it stops learning. 01:57:53.980 |
to have empathy for someone who is unlike you 01:58:04.220 |
I mean, it is something I talk about in the book 01:58:16.520 |
to be curious about views that are unlike their own 01:58:26.340 |
And I'll just tell you, I had this epiphany, really. 01:58:30.180 |
I was listening to Robert Reich's "The System." 01:58:34.660 |
He was talking about oligarchy versus democracy. 01:58:39.020 |
And so oligarchy is where very wealthy people, 01:58:43.700 |
shift power so that they become even more wealthy 01:59:08.700 |
and I'm listening to him describe in fairly decent detail 01:59:18.580 |
how there's been a shift in what it means to be a CEO, 01:59:21.100 |
no longer being a steward of the community. 01:59:34.140 |
it used to be the case that CEOs made like 20 times what their average worker made. 01:59:58.400 |
Most people don't know those numbers. They just know they can't feed their children, 02:00:05.300 |
and they worry about what's gonna happen; 02:00:07.580 |
they're living, you know, month to month, basically. 02:00:15.260 |
So there are a huge number of people living like this. 02:00:26.860 |
- Well, somebody else butted in line in front of you. 02:00:32.980 |
That's why you experience what you're experiencing. 02:00:39.340 |
I had deep empathy for people who have beliefs 02:00:44.340 |
that are really, really, really different from mine. 02:00:50.420 |
But I was trying really hard to see it through their eyes. 02:01:08.620 |
- What resources did your brain draw on? 02:01:14.700 |
- Well, I'll tell you something, honestly, Lex. 02:01:17.540 |
I don't have that much in the gas tank right now. 02:01:28.580 |
Stress is your brain preparing for a big metabolic outlay. 02:01:39.020 |
For me, this is a moment of existential crisis 02:01:42.700 |
as much as for anybody else: democracy, all of these things. 02:02:16.540 |
It's an expenditure that is a really good investment. 02:02:21.300 |
when I was exercising, I was listening to the book 02:02:36.980 |
I basically wrote some stuff down, I set it aside, 02:02:45.020 |
I don't know what you do before you exercise. 02:02:47.460 |
I always have a protein shake, always have a protein shake, 02:02:58.220 |
I didn't have a protein drink, but I did the same thing. 02:03:01.700 |
And fueling up can mean lots of different things. 02:03:09.340 |
like making sure you get a good night's sleep before you do it. 02:03:12.980 |
But I guess I think we have to do these things. 02:03:35.580 |
There are people who can't imagine that a good human being can vote for Donald Trump, 02:03:42.380 |
and people who can't imagine that an intelligent person could vote the other way. 02:04:06.140 |
And I think you have to get past that 02:04:10.020 |
to see the obvious common humanity in us. 02:04:17.860 |
We can put it like you said: it's a perfect storm. 02:04:25.940 |
- There's an economic system which is disadvantaging huge swaths of people. 02:04:34.420 |
Like, you know, if I had to orchestrate a situation that is maximally hard on 02:04:40.980 |
a human body budget, it would be the one that we live in. 02:04:52.920 |
And there's tremendous uncertainty, which is really hard for a human nervous system, right? 02:04:57.060 |
Like, ambiguity with no context to predict in. 02:05:01.180 |
And then, you know, there are the economic concerns 02:05:03.800 |
that affect large swaths of people in this country. 02:05:06.260 |
I mean, it's really, I'm not saying everything comes down to this. 02:05:13.620 |
But if you combine all these things together-- 02:05:22.140 |
somehow it reduces the entirety of the human experience, 02:05:28.520 |
Like, we should exercise every day in the same kind of way. 02:05:41.580 |
and also for parents who've lost children 02:05:45.180 |
in wars and in political conflicts, 02:05:51.540 |
and they talk to each other about their experiences. 02:06:18.600 |
when my daughter went to college, I gave her this advice: 02:06:28.520 |
"Surround yourself with people who let you be the kind of person you wanna be." 02:06:34.080 |
We're back to free will: you have a choice. 02:06:41.960 |
It might seem like an unimaginably difficult choice. 02:06:47.280 |
Do you wanna be somebody who is wrapped in fury and agony? 02:07:07.480 |
Curiosity is the thing; curiosity is curative. 02:07:12.480 |
- On social media, the thing I recommend to people is to surround yourself with positive voices; 02:07:15.480 |
at least that's the way I've been approaching social media. 02:07:30.640 |
So it's the same, similar concept of surrounding yourself with positive people. 02:07:36.240 |
And I ignore, sometimes block, but just ignore. 02:07:48.400 |
There's a certain desire, when somebody says something mean, to respond, 02:08:05.620 |
but I don't wanna get pulled down by that group of folks that have that negativity. 02:08:35.600 |
- When I started reading "The Rise and Fall of the Third Reich"... 02:08:52.160 |
- Well, there is research to back up what you said. 02:09:15.840 |
Surrounding yourself with positive people has effects not only in their body budgets, but also in yours. 02:09:30.720 |
The best thing for a human nervous system is another human. 02:09:35.480 |
And the worst thing for a human nervous system is also another human. 02:09:49.520 |
So do you wanna be somebody who soothes other people's nervous systems, or do you wanna be somebody who causes people pain? 02:10:02.480 |
But remember what we said about social reality, you know? 02:10:15.200 |
You might not like the social or, you know, collective nature of people. 02:10:20.120 |
But the fact is we have socially dependent nervous systems. 02:10:32.880 |
And that is a dilemma that we have to grapple with. 02:10:48.440 |
which is what I understand, you know, 02:10:52.200 |
the founding members of this country intended. 02:11:05.260 |
but let me, it's fun to ask somebody like you 02:11:08.560 |
who can effectively, from at least a neuroscience perspective, answer the question: what is love? 02:11:22.040 |
At least a bunch of '80s hair bands have written about it. 02:11:31.960 |
- Well, I'm really happy that I fell in love. 02:11:47.460 |
Babies are born not able to regulate their own body budgets. 02:11:53.700 |
When you feed a baby, when you cuddle a baby, you're investing in her body budget, 02:12:09.960 |
teaching her brain to eventually manage her own body budget to some extent. 02:12:13.600 |
That's the basis, biologically, of attachment. 02:12:17.380 |
Humans evolved as a species to be socially dependent, 02:12:25.380 |
meaning you cannot manage your body budget on your own 02:12:31.800 |
without a tax that eventually you pay many years later 02:12:42.780 |
Take loneliness: when you break up with someone that you love 02:12:47.780 |
or you lose them, you feel like it's gonna kill you, and in a sense it can. 02:13:01.100 |
The evidence is actually in the web notes to "Seven and a Half Lessons." 02:13:05.360 |
But social isolation and loneliness will kill you earlier 02:13:20.800 |
and that tax accrues very slightly over time, 02:13:26.340 |
so that by the time you're in middle age or a little older, 02:13:35.340 |
you're more likely to die from heart disease, from diabetes, from depression. 02:13:38.780 |
You're more likely to develop Alzheimer's disease. 02:13:40.940 |
I mean, it takes a long time for that tax to accrue, 02:13:53.900 |
- But I think the funny view of it is that it's clear love is about somebody balancing your body budget. 02:14:08.740 |
And the reason you wanna stay with somebody for a long time is that they keep regulating it. 02:14:16.580 |
- Well, now you're mixing, now you're mixing things. 02:14:18.660 |
Now you're, you know, no, you have to decide whether... 02:14:21.620 |
But what I would say is when you lose someone you love, 02:14:37.380 |
We are the caretakers of one another's nervous systems, 02:14:41.840 |
And out of that comes very deep feelings of attachment, 02:14:59.720 |
- Do you think, do you ponder your mortality? 02:15:01.840 |
I mean, you're somebody who thinks about your brain a lot. 02:15:10.620 |
or I don't know, I don't know how to feel about it, 02:15:13.920 |
but it seems to be one of the most definitive aspects of life. 02:15:19.960 |
but I think the best I can do in a short snippet is this. 02:16:10.400 |
And frankly, honestly, I fear my husband dying before me more than I fear my own death. 02:16:15.640 |
- There's that love and social attachment again. 02:16:33.600 |
- Yeah, I think that there isn't one meaning of life. 02:16:41.040 |
There are many meanings, and you use different ones on different days. 02:16:46.320 |
- Depending on the day, but for me, I would say, 02:16:49.320 |
sometimes the meaning of life is to understand, 02:17:19.760 |
and sometimes it's to clear the path for my daughter or for my students. 02:17:30.400 |
Do you ever have moments where you're looking at the sky 02:18:09.760 |
and you just get very immersed in the moment, 02:18:27.520 |
- I don't think there's a better way to end it, Lisa. 02:18:30.840 |
The first time we spoke was, I think, if not the first, 02:18:35.840 |
then one of the first conversations I had. 02:18:45.680 |
- And now we get to finally do it the right way. 02:18:56.640 |
Certainly can't wait to listen to the audiobook; I already read this book, 02:19:02.120 |
because, as you said offline, you're reading it yourself, 02:19:07.480 |
You have a great, I don't know what's a nice way to put it, 02:19:20.040 |
- Thank you for listening to this conversation with Lisa Feldman Barrett, and thank you to our sponsors: 02:19:26.360 |
Athletic Greens, which is an all-in-one nutritional drink, 02:19:30.400 |
Magic Spoon, which is a low-carb, keto-friendly cereal, 02:19:38.640 |
Please check out these sponsors in the description 02:19:40.640 |
to get a discount and to support this podcast. 02:19:44.280 |
If you enjoy this thing, subscribe on YouTube, 02:19:46.700 |
review it with five stars on Apple Podcasts, 02:19:59.720 |
And now, let me leave you with some words from Lisa Feldman Barrett: "It takes more than one human brain to create a human mind."