Lisa Feldman Barrett: How the Brain Creates Emotions | MIT Artificial General Intelligence (AGI)
Chapters
0:00 Introduction
1:15 I've cried
4:25 Common misconception
6:20 How emotions are created
8:23 Building blocks of emotion
11:18 What makes us human
15:02 Brain evolution
21:23 Emotions are not real
26:58 Emotions are a language
28:37 How does an infant learn
35:33 Different mappings
39:07 Building a robot
43:10 Will it love you back
50:16 Audience questions
56:41 The need for embodied systems
Today we're going to try something different. She is a university distinguished professor of psychology, director of the Interdisciplinary Affective Science Laboratory, and author of How Emotions Are Made: The Secret Life of the Brain, drawing from social, psychological, and cognitive science. And so I think our conversation may help us gain intuition, as Josh Tenenbaum gave you a shout-out on Tuesday, about how to create artificial general intelligence systems.
- When I'm giving an academic talk, I sometimes will talk about a study that we did where we have people watch the most evocative clips of films. And there are several clips that are really powerful, and with a couple of them, every time I talk about them, I cry while I'm describing for the audience what subjects are seeing.

One of the clips is from a movie called Sophie's Choice; if I had a heart rate monitor and a respiration monitor on, it would show. Meryl Streep is Sophie, and it's very, very evocative. Another scene is with Susan Sarandon, who is dying of breast cancer; that is another scene that I also find very compelling.

And in fact, I was just giving a presentation to the Supreme Judicial Court of Massachusetts; you know, they wanted to understand the neuroscience. And to open the discussion, I showed them a clip from a film made almost 20 years ago, actually, called "A Time to Kill," a scene where Matthew McConaughey is defending an African-American man. He's this completely pathetic lawyer until the end, when he masters this fantastic defense, basically, of this man on trial. And that one, I hadn't seen that film for 20 years, and it just really punches you in the stomach.
- So one of the things you talk about a lot in your work, in your book, is the difference between the experience and the perception of emotion. So what's our biggest misconception about emotion, for those of us who have only considered emotions at a surface level?

- That you can look at someone and read their emotion the way you read words on a page. The way that I'm saying it sounds preposterous, but that is actually what a lot of people believe, and it explains the investment of some of the most creative people on this planet trying to build emotion detection systems, when really what they're building are excellent systems for detecting facial movements. I mean, sometimes facial movements communicate emotion, but as for the idea that there's one universal facial expression for each emotion, there's no strong scientific evidence for that claim.

- Because we nevertheless feel like we're observing emotion. So how does the creation of emotion change with the situation? Does the audience change the display of emotion?
- Well, emotions are not displayed, I would say, right? So basically you and I are having a conversation right now, and part of what my brain is doing is guessing at what your movements mean. So right now you have a slight smile on your face.

I have a friend of mine who's Russian who told me, actually I've had several friends tell me this. And in Bulgaria they apparently have a name for it.

But basically when you look at someone's face, although you yourself are focused on their face, and to you it feels as if that's where all the information is, your brain is actually taking in the entire sensory array. So it's taking in also the sound of the person's voice, and it uses the information in all of those signals to make sense of things. You aren't reading somebody's internal state in an obligatory way.
- One question, and as far as I understand there's no good answer yet from science, is: what are the basic building blocks of emotion?

- Well, I wouldn't say that there's no good answer from science; I would say scientists disagree. I think it's very clear what the building blocks are. If you look back all the way to ancient Greece, you can see that there are two kinds of views of emotion that have been battling it out for millennia. I will give you the modern versions of these views.

One view is that you're born with circuits in your brain that are pre-wired: an anger circuit, a fear circuit, a sadness circuit, a happiness circuit, some other circuits too. And the idea is that when one of these circuits is triggered, you have a very stereotypic change in your body, the chemicals in your body change in a particular way, and you have a propensity to make a particular action.

The other view is that, well, there are some basic ingredients that your brain uses, and it uses them to make every kind of mental event that you experience and every action that you take. The recipes are what are unique; the ingredients are common. Just like you can take flour and water and salt and make a whole bunch of different recipes with them, some of which aren't even food, like glue. In a similar way, the idea is that your brain works with common ingredients, and it makes emotions as you need them, on the spot. You don't experience sadness and you don't perceive sadness except as your brain constructs it in the moment.

- The idea of emotional intelligence is really difficult, and it seems like it's an important thing to try to instill. So for human beings, where do you see the importance of emotional intelligence?
- I actually think that's the wrong question to ask, because, no, I mean, I think people constantly make emotions. What everybody shares is affect, which is our simple feelings of feeling pleasant or unpleasant, a summary of the internal state of the physical systems in your body. Not everybody makes emotions out of those feelings. So those feelings, the way to think about it is that your brain comes wired to regulate your body. And as caregivers regulate the nervous systems of their infants, that wires the infant's brain. Infants are born waiting for a set of wiring instructions, tuned to the physical and social realities that they grow up in.

Those feelings are kind of like barometer readings, in a way. They're very simple and they lack a lot of detail. And the way that a culture makes sense of them differs, for example when we lose something significant like a loved one. A hundred or two hundred years ago, there was an emotion called nostalgia. It's not just a matter of changing the label of something; it's actually the formation of the experience.

So if you want to build an agent, I don't think emotion is what you need to endow it with. You need to endow it with these basic ingredients, which it can use to make whatever states or guide whatever actions are called for. That will be different if it's an American or a British person, or if you go and study the Hadza, who are hunting and gathering.
- So do you have words or human-interpretable labels for these basic ingredients that we can understand?

- I mean, I think the first thing to understand is that when we think about building an intelligent agent, we think about endowing the agent with cognition: especially in our culture, thinking, feeling, seeing. But the truth is that your brain did not evolve for those things. Brains evolved to control bodies: to coordinate the various systems of the body as creatures move around in order to gain resources like food and water. And a brain has to figure out how much of its resources to expend to get resources back. And scientists actually haven't really figured that out.

If you look, for example, at computer vision, it's not completely, but pretty much, a solved problem. Whereas making a robot arm that just reaches out smoothly, grabs a glass: movement, which we think of as this really basic thing, like, oh, it's so trivial, all animals can do it, is actually one of the hardest problems to solve.

And brains basically, I mean, there's a whole story here about the idea that you have a lizard brain, which is wrapped in an emotional brain, which is wrapped in a rational brain. It's a way of thinking about brain evolution. What actually happened is that as bodies got larger and there were more systems to coordinate, brains also had to get bigger and bigger. But they had to get bigger to a point, with some constraints.
We can talk about what that means in terms of depression later. The point is, your brain is controlling your body all the time. Whether you feel it or not, whether you're thinking about it or not, whether you're asleep or awake, certain parts of your brain are always very active, regulating your body. And those parts of the brain that are controlling your heart and your lungs and your immune system, all of those regions that are controlling the systems of your body, are also helping your brain make sense of the world.

You don't feel your heart beating most of the time. You don't feel your lungs expanding most of the time. And there's a really good reason why we are all wired not to feel those things: you'd never pay attention to anything outside in the world ever again. What you do feel are simple feelings: you feel really jittery, you feel really calm. Sometimes your brain can make them into emotions, but they're with you every waking moment of your life.

The brain might make emotion out of those changes, those very strong changes which you will feel as really unpleasant or really pleasant. Or your brain might even make a perception of the world out of them. Those are also moments where these simple feelings are at work. There are others that I could talk about too.

Your brain's fundamental job is to keep your body alive and well. And if you don't have some kind of body to regulate, with affective feelings that come from that regulation, I think you'll have a hard time rendering something that seems more human.
- Right, so maybe you can elaborate. Like in the book Sapiens, the argument is that as human beings we're really good en masse, with thousands, millions of people together, at agreeing on shared fictions, and from a neuroscience perspective it's maybe very true. So it's interesting; you finish and then I'll--

- So what I'm trying to say, also from an AI perspective, is that they become these trivial ideas of mapping a smile to being happy, and these kinds of trivial ideas, and we start to believe this and therefore it becomes real, just because our brain doesn't feel those explicit emotions--

- I didn't say we don't feel explicit emotions. That inference, it's an interesting inference, and it's a mistake that betrays a certain kind of thinking. Saying that emotions are constructed doesn't mean that people don't express emotion. It's just that, when they're in a state of anger or in a state of sadness or in a state of awe, they don't do one stereotyped thing. When I say, well, your body can do many things: your breathing rate can go up, it can go down, a pattern you sometimes see in sadness and you sometimes see in fear. You sometimes even see it in enthusiasm and in awe.
So when your retina communicates to your brain that you are faced with a wavelength of 600 nanometers, you experience red. In a sense, your brain has imposed meaning on a signal. So let me back up and give a different example to make it a little easier, and then we'll re-approach this.

Think about money; in fact, this is true of almost all of civilization, right? Things exist by virtue of the fact that we agree that they exist. Little pieces of paper, or little pieces of plastic, or gold, or diamonds have value only because a group of people agree that they have value. And once we all agree that that object actually has value, it does. Money, currency, exists because we impose meaning on objects in the world, physical objects in the world, that themselves don't have that meaning on their own. And they are very real; money is very real to people. I can stick somebody's head in a brain scanner and show you that they experience value in a very real way.

Emotion works similarly: we impose meaning on certain physical signals. We all agree that scowling is sometimes anger, just like little pieces of paper become money.
- So you could think of the expression of emotion as a kind of language, in the same way that we collectively agree on a language?

- A country exists because we all agree that the country exists, more or less. And wow, you guys barely even laughed at that, okay. A revolution is when some people in the country stop agreeing. A president has powers because we all agree that a president has powers; a president only has powers by virtue of the fact that we agree, and if we stop agreeing, the president doesn't have those powers anymore. People's lives depend, outcomes for real people depend, on these social realities that we build and nurture, and that we pass into the brains of our children as we socialize them. And when people move from one culture to another, they have to learn new social realities. And if they don't, they get very sick, physically, because our ability to agree on what something means actually is important for regulating our nervous systems.

- For systems that learn to behave based on a certain reward, it's important to have some kind of ground truth to learn from. So it sounds like the expression of emotion is learned. Can you talk about how we learn to fit into our culture?

- It's a long story, so I'll try to do it in a couple of sentences, yeah.
The scientific term for this is allostasis. Allostasis is your brain's ability to predict what your body is going to need before it needs it, and to try to meet those needs before they arise. An example would be: if you're going to stand up, your brain has to raise your blood pressure before it stands you up. That's costly from a metabolic standpoint.

An infant's brain can't do this very well, so a caregiver does it for the infant. And when someone does it, the infant is learning those patterns, including their consequences for the infant's body.
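The predict-before-the-need idea can be sketched as the difference between reactive and predictive control. This is a minimal sketch, not a physiological model; the pressure and demand numbers and the function name `allostatic` are invented for illustration:

```python
# Toy contrast between reactive regulation (correct after the demand hits)
# and allostatic regulation (pre-pay before the demand arrives).
# All quantities are invented; "pressure" is in arbitrary units.

def allostatic(pressure, predicted_demand):
    """Raise the regulated variable before the predicted demand arrives."""
    return pressure + predicted_demand

baseline = 80.0
demand = 15.0  # extra pressure needed at the moment of standing

reactive_at_standing = baseline                        # shortfall: fix comes afterward
allostatic_at_standing = allostatic(baseline, demand)  # need met before it arises

print(reactive_at_standing, allostatic_at_standing)  # 80.0 95.0
```

The point of the sketch is only the timing: the predictive controller has already paid the cost when the demand lands, while the reactive one is momentarily in deficit.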
It's just that the caregiver is there constantly, helping. An infant isn't born with the ability to recognize a face as a face, but it learns that in the first couple of days of life.

Then, infants start to learn what we call abstract categories: things that don't look the same, sound the same, or smell the same can actually have the same function. So if you do an experiment with a three-month-old, and you say to that baby, very, very intentionally, "Look, sweetie, this is a wug." And you put the wug down and it makes a noise, like a beep. And then you say, I'm like, I don't have props.

- Yeah, okay, and then you say, "Give me your wallet."

- And then you say, "Look, sweetie, this is a wug." And you put the wug down and it makes a beep. And this might be an object with lots of different perceptual features. The word is inviting the infant to understand that the function of those very different physical signals is the same. Conversely, you can take objects that are exactly identical in their physical features, how they sound, how they smell, what they feel like, what they look like, and you can name three of them with one word and three of them with the other word, and the infant will learn two different categories. That happens a little later, not as early as three months.

We're constantly pointing things out and labeling them. Words are considered to be kind of invitations to form categories. And it's the basis of the roles that we have with each other; it's the basis of a lot of the sort of functional categories we live by, meanings that those sensory arrays in and of themselves don't carry. They only have that meaning because you and I both learned that that package of sensory array means something. They're wired into our brains in our culture, and when we move between cultures we sometimes have to learn different packages.
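The wug experiment describes categories formed around a shared label and function rather than shared perceptual features. A toy sketch of that idea, where the specific objects, features, and the second label `dax` are all hypothetical:

```python
# Objects with very different perceptual features share the label "wug";
# a learner that groups by label can predict the shared function (beeping)
# even when the features say otherwise.
from collections import defaultdict

observations = [
    {"label": "wug", "features": {"red", "furry"},   "beeps": True},
    {"label": "wug", "features": {"blue", "smooth"}, "beeps": True},
    {"label": "dax", "features": {"red", "furry"},   "beeps": False},
]

# Learn the function per label, ignoring perceptual features entirely.
by_label = defaultdict(list)
for obs in observations:
    by_label[obs["label"]].append(obs["beeps"])

def expect_beep(label):
    votes = by_label[label]
    return sum(votes) > len(votes) / 2

# The first "dax" looks exactly like the first "wug", yet the label,
# not the look, carries the predicted function.
print(expect_beep("wug"))  # True
print(expect_beep("dax"))  # False
```

Note that the `dax` object has the same features as a `wug` but a different predicted function, mirroring the point that identically-looking objects named with different words get sorted into different categories.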
- So you're saying that there are a few sources of sensory data, the feelings of some kind, that we learn to then make sense of?

- What a brain is doing is trying to make sense of the sensory array around it. Just think about it from your brain's perspective: it spends its entire life trapped in a dark, silent box, and it has to make sense of the signals that reach it so that it knows what to do to keep itself alive and well. So a flash of light, what's a flash of light? It could be lightning, or it could be a doorbell, or it could be, right? Any particular sensory cue could have multiple causes. So your brain is trapped, basically, in your skull, receiving only the sensory effects of stuff that happens in the world. But it has to figure out what those things are.

Well, it has something else that it can draw on: past experience. Your brain basically doesn't store experiences the way a computer stores files; it can reconstitute them in its wiring. So in one situation, a siren means one thing. A flash of light means one thing in one situation and something else in another. So your brain is using past experience to make guesses about what sensations mean.

The same goes for the body. It's not that there's one ache in your stomach for nausea and a different one for hunger; there are many different feelings of achiness for nausea and many different feelings of achiness for hunger. So your brain has to make the same kinds of guesses about the body as it does about what the sensory events in the world mean. It's guessing and making sense of the sensory array, and in psychology we have a really fancy name for that. Your brain takes in the information that it didn't predict so that it can predict better the next time.
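Taking in only the unpredicted information and using it to predict better next time is, in machine-learning terms, an error-driven update. A minimal sketch, with an invented stable signal and learning rate:

```python
# Error-driven prediction: the model only moves on the part of the
# signal it failed to predict (the prediction error).

def update(prediction, observation, learning_rate=0.3):
    prediction_error = observation - prediction  # the unpredicted part
    return prediction + learning_rate * prediction_error

prediction = 0.0
for observation in [10.0, 10.0, 10.0, 10.0]:  # a stable feature of the world
    prediction = update(prediction, observation)

print(round(prediction, 3))  # approaches 10.0 as the errors shrink
```

Each pass shrinks the error, so as prediction improves, less and less new information has to be taken in, which is the sense in which the brain-as-predictor only pays attention to surprises.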
- What essential aspect of the infant experience would such a system need?

- It needs, well, I mean, I'm not a computer scientist, so the way that I would say it is: it needs to have a body. It needs to have something like physical systems that have to be kept in balance. It has to do something analogous to allostasis.

- No, it's not that the brain is stabilizing itself; it's stabilizing the body.

- Oh, and there's philosophy in everything, right? Whether you admit it or not is a different story.

Animals need dopamine to encode, to learn, information, and to work towards getting a reward, I would say. When a reward is unpredicted, that's when you see a real surge of dopamine. And people, and animals, will work tremendously hard for it. Motivation is expending resources to get a reward.

For humans, or for actually any living creature on this planet, there's an analogy to what we're talking about here: all these systems have to be kept in balance, and the consequence of that regulation is simple affective feelings, which are there for many, many creatures on this planet.
- Well, you know, people fall in love with their cars. A human doesn't need much to fall in love with something.

- Love as it's celebrated in poetry and culture is a social construct; what I mean is, sort of the idea of monogamous romantic love. You're saying you could do the same with a car?

- Well, you don't know anyone who is so in love with their car that the car loves them back. What love means, really, is that we regulate each other's nervous systems. So my brain isn't just regulating my nervous system right now; it's regulating yours.

But other animals do this too. There are some insects that do it: they do it with chemicals, they do it with smell. Earwigs will, you know, they can actually, it's like, you know, cuddle their little babies. But, you know, what about mammals, like rats? We use all of those senses to regulate each other, and humans also use words. The parts of the brain that allow us to speak and allow us to understand words are directly connected to the parts of the brainstem that regulate the body; the regions that are important for you to be able to understand language can drive your physiology. And that is why I can say something to somebody when, you know, maybe they can see me, maybe they can't; maybe they can, hopefully they can't, smell me; maybe they can hear me, maybe not; and it still lands, because the sound of a voice has an effect on another person's nervous system.

When I say a word like "car," that's a short form. When I say the word car, and I say that word, it invokes those similar mental features in you.
The attachment between two people, say a couple or two really close friends, comes from the ability that we have to regulate each other's nervous systems. People without that die, on average, seven years sooner; loneliness kills. I always tell my daughter, my daughter's 19 years old, and I always tell her: when you break up with someone, it feels like it will kill you, but it won't. Living without attachment, though, will kill you, on average, seven years earlier than it would if you did have an attachment.

As our bodies got really complex through evolution and our brains got bigger, they could only get so big; there are constraints on how big any brain can get, and it also has to do with the metabolic cost of a brain. Your brain is really expensive; my brain, really expensive. Three pounds, 20% of your metabolic budget; that's a lot. And so what did evolution do to solve this problem? Well, it couldn't make our brains any bigger, so we share the load. You bear the burden of other people's allostasis, and they bear yours; that's what it means to give people support. When someone says nice words to you or gives you a hug, they are physically having an effect on your body; they are helping your body to maintain allostasis. And so the basis of love or attachment is basically that. It's the ability to affect each other's nervous systems. And the worst thing for a human nervous system is another human being, and the best thing is also another human being.
- So I was thinking about what you were saying about reward; you described it as a return to allostasis. Is that different from just the reinforcement of pathways or behavior? And I'm also wondering about the link between the desire for allostasis and the need for novelty.

- The second question's so much more interesting, so let me start with that. So let me say this: there is a need for novelty, and the need differs for different people, I will say. We can think about it in a proximal way or we can think about it in a really distal way. But basically, when I say that a brain is organized for allostasis, that doesn't mean that the goal is only ever to have your predictions, your brain's predicting all the time, be completely perfect, because you'd be bored out of your mind, right?
So it's a constant balance between what biologists would describe as exploiting what you know and exploring for something new. With novelty there's actually an increase in norepinephrine, an increase in arousal; it feels really exciting. Novelty usually requires that you learn something new, and often that you move your body around, which is also a metabolically costly thing. So the need for novelty is balanced by its cost.

Take two rats. One is bred so that, when you stick it in a novel cage, it sits very still. And the other one, when you put it in a novel cage, it's just going crazy, kind of exploring everything. Well, the one that sits still, scientists might say, oh, that's a nervous rat, or that rat's afraid. The rat is not moving, and it's not encoding anything, but it's also not spending anything. This rat, on the other hand, is roaming all over the place, moving a lot, learning a lot, so it's encoding a lot, and it's spending a lot.

And there are also differences between times in your life: times when you explore, and times where you feel like you really have to conserve. When I talk to the public, I always talk about, it's just too boring a word, but I sort of do use a budget metaphor. Your brain is sort of like the financial office of your body. A company has lots of offices; it has to balance their budgets. What it's always trying to do is spend a little bit to get a good return. So what do you do when something goes into the red? You might actually spend a lot to try to really make a big gain; that would be novelty, that would be move and spend. Or you might say, well, I'm gonna save a little bit now.
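The two rats, and the spend-versus-save framing, amount to an explore/exploit trade-off gated by the current body budget. A toy sketch; the threshold, probability, and energy units are all invented:

```python
# Exploration is only affordable when the "body budget" allows it;
# a depleted system conserves instead of seeking novelty.
import random

def choose_action(energy_budget, threshold=20.0, explore_prob=0.5):
    """Pick among conserve / explore / exploit based on available energy."""
    if energy_budget < threshold:
        return "conserve"  # sit still: encode nothing, spend nothing
    # Budget permitting, balance trying something new against using what works.
    return "explore" if random.random() < explore_prob else "exploit"

random.seed(0)
print(choose_action(energy_budget=5.0))   # conserve: depleted system avoids novelty
print(choose_action(energy_budget=80.0))  # explore or exploit, depending on the draw
```

The one behavioral claim the sketch encodes is the one made in the text: below some resource level, novelty-seeking shuts off entirely, regardless of how rewarding it might be.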
Whether novelty is sought over and above the rewards that it would give you, I don't know. But it is really clear to me that the extent to which any nervous system will embrace novelty, and even seek it, pretty much depends on the allostatic state of that system. When that state is chronically in the red, the system stops exploring. What that means in human terms is you get depressed: you stop paying attention to anything going on around you. And maybe I'll get back to your first question.

- So if I understand your argument correctly, if we're going to make anything like a general intelligence, something approaching, you know, like a human, it needs something like a body?
- Well, I want to be careful about saying that. One, because in biology there is this concept of degeneracy: there's more than one way to skin a cat, basically. There are many ways to get to the same functional outcome. In genetics, there are a lot of characteristics that are heritable, and the reason why is that there isn't one set of genes for each of them. So maybe you could implement an agent that has multiple systems in more than one way.

But there is one thing that is very important that we continually miss: when we think about building an agent with mental states, we continually miss the fact that it has a body. Humans have bodies, and that's the brain's primary task, even though we don't normally experience it that way. So it seems to me that if you want to build an agent that is human-like, it has to have something like a body. It may be that you could implement something like a body virtually.

- What if we just gave it some sort of, I don't know...

- All a brain requires is that you at some point had a body. So basically, this is what phantom limb pain is, this is what chronic pain is, this is what happens. If at some point you cease to get information from your body, your brain can still simulate it. At some point, the body isn't really needed anymore.
- So there's a bunch of chemicals that go into your brain, that diffuse around in the fluids in your brain. And that seems like a hack that we've inherited over millions of years from primitive ancestors. And if you look at the machine learning world, we can do a bunch of similar things with neural nets. So you can increase the activation thresholds across the whole system, something like the global constraints imposed by the evolutionary history on the whole system. And it seems like that would make it work better.
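The analogy drawn here, diffuse brain chemicals as global knobs on a network, can be sketched as a gain or threshold parameter applied uniformly to a unit's activation. This is only an illustration of the analogy, not a model of any real neuromodulator; the weights and inputs are invented:

```python
# One sigmoid unit whose excitability is scaled by global "neuromodulator"
# parameters: gain sharpens responses, threshold damps them everywhere,
# without changing any individual connection weight.
import math

def unit_output(inputs, weights, gain=1.0, threshold=0.0):
    activation = sum(w * x for w, x in zip(weights, inputs)) - threshold
    return 1.0 / (1.0 + math.exp(-gain * activation))

inputs, weights = [0.5, 1.0], [0.8, 0.4]

baseline = unit_output(inputs, weights)                 # ordinary response
aroused  = unit_output(inputs, weights, gain=3.0)       # same wiring, amped globally
damped   = unit_output(inputs, weights, threshold=1.0)  # same wiring, harder to excite

print(baseline > damped)   # True: raising the threshold lowers the response
print(aroused > baseline)  # True here, because the summed input is positive
```

The design point the analogy rests on is that one scalar changes the behavior of every unit at once, the way a diffusing chemical affects whole populations of neurons rather than single synapses.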
- So when people are in negative emotional states...

- Oh, people can perform in negative emotional states, I have to tell you; they can do very nefarious things very, very effectively.

- Right, but the general point I think is true. So emotions kind of flood your nervous system with--
- No, emotions don't flood your nervous system. There is no circuit in your brain, or anywhere in your nervous system, that is for emotion.

- They influence every mental event that you have.

- And the neurochemical systems that you're referring to basically change the ease with which information is passed from one neuron to another. That's true regardless of whether the event is an emotion, or a perception, or a thought. Those systems are part of what allows your brain to delay gratification of a reward, to act because you anticipate a reward at some point in the future, to remember the past, and also to do things now.

So when I said you have to have something like a body, here's what I meant: your brain is an internal model of your body in the world. And if you want to have an agent that is somewhat human-like, then it has to be able to do something similar, whether it has an actual physical, corporeal body or not.
- So it sounds a bit as though you're disputing my premises.

- I started off by saying emotions are implemented by systems that do many other things as well. And, well, I wouldn't say they're not maladaptive; I think the kluges are kind of historical accidents. You're talking about emotions like they're one kind of thing, and I can't answer in those terms, 'cause I think it's not the correct question. Not all people in all cultures have emotions; everyone, assuming they have a neurotypical brain of sorts, has affect. And so, to answer the question that you're asking me: I can't answer it, 'cause I don't think it's the right question.

- So opioids and the pain systems seem like things that influence your emotions fairly directly.

- They matter for emotion, but they are also important for every other category of mental event.

- They affect a lot of other things too, I agree.

- So you will see sometimes that scientists will assign a psychological function to biological entities like a chemical or a brain region, but those entities are at work all the time, not just in moments that are emotional for you.
I mean, we procreate our ideas as well as physically. 01:06:42.880 |
our reactions and interactions with other people 01:06:52.440 |
is that we get rewarded when we interact with other things 01:07:20.960 |
putting that into a system inherently that it wants to, 01:07:25.440 |
that it just desperately wants to make something new? 01:07:33.960 |
when we talk about, we've been focusing a lot on bodies. 01:07:38.320 |
I'm certainly not saying that's a sufficient condition, 01:07:43.800 |
But I certainly have the motivation to create, 01:07:58.480 |
I have a teenage daughter, I'm just telling you. 01:08:05.840 |
what people find rewarding is remarkably diverse. 01:08:19.840 |
for a lot of people that's having an impact in some way, 01:08:30.240 |
building something that wasn't there before, whatever, 01:08:39.280 |
It's just true for a lot of people in this room, probably, 01:08:45.280 |
spend a lot of time with, but not for everybody. 01:08:51.960 |
we can certainly make a genetic argument here, absolutely. 01:09:06.880 |
that you have to pass your genes on to the next generation 01:09:12.480 |
and actually make sure that that generation survives 01:09:37.840 |
concepts of social reality that we've been talking about, 01:09:47.840 |
That it's very expensive to have to encode everything 01:09:59.800 |
a genetic system that allows us to wire the brains 01:10:08.400 |
- All right, so I'm curious about your analogy 01:10:19.920 |
and the perception of red to the fact that we have emotion. 01:10:24.920 |
Because the distinguishing feature, it seems to me, 01:10:31.040 |
And as you said before, our brain assigns meaning to things. 01:10:40.480 |
We don't always deliberately assign meaning to it. 01:10:43.320 |
Nothing I've said is about anything being deliberate. 01:10:55.040 |
we want to pick a goal and then we create costs 01:10:59.480 |
to achieve that goal, but that goal is deliberately assigned. 01:11:03.520 |
So when you talk about what makes something intelligent, 01:11:07.120 |
what do you think the role of intentionality is 01:11:13.000 |
- So first of all, when you talk about intentionality, 01:11:16.280 |
that you are, philosophers talk about intentionality 01:11:20.440 |
They talk about intentionality to mean a deliberate action, 01:11:23.600 |
the way you mean it, but intentionality can also mean 01:11:26.200 |
that something has a referent outside of you, really. 01:11:39.160 |
I also think that you have to make a distinction 01:11:43.880 |
explicitly, a goal that you can explicitly describe 01:12:01.080 |
the kind of question you're asking is getting very close 01:12:12.120 |
very Cartesian, unfortunately, because that's English. 01:12:15.760 |
I don't know, there's no other way to do it, actually. 01:12:18.400 |
But what I wanna say is that your brain is always, 01:12:25.280 |
but it's not always consciously experienced by you 01:12:44.760 |
No, you don't, but your brain is actually making choices 01:12:55.460 |
So I think you have to be really careful about, 01:13:01.520 |
there are words that we use in English and in science 01:13:08.120 |
They can have a meaning that is about decision-making 01:13:10.960 |
or choice that is just obligatory, automatic, 01:13:17.560 |
And then there's the kind of choice that feels deliberate 01:13:20.600 |
and effortful and where we feel like we're the agents. 01:13:33.720 |
in philosophical terms, it can also be assigned 01:13:37.640 |
Even if you're completely unaware of having made a choice, 01:13:42.040 |
you're acting on something with some degree of volition, 01:13:49.160 |
It's not like somebody hit your patellar tendon 01:13:49.160 |
- So I think you just have to make that distinction. 01:13:57.920 |
And I probably should get to the next question. 01:14:02.320 |
- Yeah, so you've been asked a lot of esoteric questions 01:14:05.060 |
about AI, but I think we might gain some insights 01:14:08.660 |
by wondering about DI, that is dog intelligence. 01:14:13.220 |
- So I believe I sort of understand what my dog is feeling. 01:14:18.220 |
And I usually believe that my dog believes the same, 01:14:37.580 |
to make the expressions that, let's say, a dog can. 01:14:58.920 |
I would have answered this question differently. 01:15:04.440 |
I think many creatures on this planet have affect, right? 01:15:20.640 |
And it's actually a really interesting question. 01:15:23.460 |
They certainly, they have something like opioids 01:15:48.360 |
And I mean, they certainly have some capacities 01:15:58.960 |
It's not natural selection, it's artificial selection. 01:16:03.800 |
And we selected them for a couple of things, right? 01:16:18.120 |
in a lot of ways that they have a lot more control 01:16:31.180 |
they do joint attention really well with gaze. 01:16:36.660 |
So this is something that really no other animal can do, 01:16:50.940 |
So they'll look at something and they'll look back at you. 01:17:14.380 |
about what's important in the world is with gaze. 01:17:17.220 |
So I think that dogs may actually have some capacities 01:17:33.420 |
- Awesome, well with that, let's give Lisa a big hand.