
Lisa Feldman Barrett: How the Brain Creates Emotions | MIT Artificial General Intelligence (AGI)


Chapters

0:00 Introduction
1:15 I've cried
4:25 Common misconception
6:20 How emotions are created
8:23 Building blocks of emotion
11:18 What makes us human
15:02 Brain evolution
21:23 Emotions are not real
26:58 Emotions are a language
28:37 How does an infant learn
35:33 Different mappings
39:07 Building a robot
43:10 Will it love you back
50:16 Audience questions
56:41 The need for embodied systems

Whisper Transcript

00:00:00.000 | Today we're going to try something different.
00:00:04.000 | We're going to try a conversation.
00:00:06.400 | We have Lisa Feldman Barrett with us.
00:00:08.960 | She is a university distinguished professor of psychology
00:00:11.800 | at Northeastern University,
00:00:13.620 | director of the Interdisciplinary Affective Science Laboratory,
00:00:17.960 | author of the new amazing book,
00:00:19.960 | How Emotions Are Made, The Secret Life of the Brain.
00:00:23.000 | She studies emotion, human emotion,
00:00:26.880 | from social, psychological, cognitive science,
00:00:30.160 | and neuroscience perspectives.
00:00:31.960 | And so I think our conversation may help us gain intuition
00:00:36.840 | about how we instill emotional life
00:00:40.120 | into future artificial intelligence systems.
00:00:43.080 | Josh Tenenbaum gave you a shout out on Tuesday
00:00:47.160 | and said that if you want to understand
00:00:50.560 | how to create artificial general intelligence systems
00:00:53.600 | from an engineering perspective,
00:00:55.400 | we should study the human brain.
00:00:57.880 | So with that, let's have some fun.
00:01:02.000 | Let me start with a curve ball
00:01:03.720 | to conjure up an image of emotion.
00:01:08.720 | Have you ever cried while watching a movie
00:01:11.300 | that you remember, and what movie was it?
00:01:13.400 | - I've cried during lots of movies.
00:01:17.720 | Let me think.
00:01:22.760 | The last time I cried,
00:01:24.440 | actually, here's an interesting thing.
00:01:29.560 | When I'm speaking about, when I'm giving an academic talk,
00:01:33.240 | I sometimes will talk about a study that we did
00:01:35.680 | where we had people watch films,
00:01:38.080 | and we have them watch the most evocative clips of films.
00:01:42.160 | And there's several clips that are really powerful,
00:01:47.160 | and a couple of them, every time I talk about them,
00:01:52.440 | I'm gonna try not to cry now,
00:01:53.920 | every time I talk about them,
00:01:55.180 | I'm describing for the audience what subjects are seeing.
00:01:59.560 | One of the clips is from a movie called Sophie's Choice.
00:02:03.520 | Does anyone know this film?
00:02:05.200 | Raise your hand if you know this film.
00:02:06.960 | So we show, this is a film about a woman
00:02:09.880 | who is forced in a concentration camp
00:02:12.880 | to choose which of her children will die
00:02:15.200 | in the gas chambers.
00:02:16.760 | And so I'm already, you know,
00:02:19.000 | if I had a heart rate monitor and respiration monitor,
00:02:22.960 | you would see, it's a really powerful scene.
00:02:27.160 | Meryl Streep is Sophie, and it's very, very evocative.
00:02:31.480 | We also show, there's a scene from a film
00:02:37.580 | with Susan Sarandon, who is dying of breast cancer,
00:02:41.960 | and she has to tell her 12-year-old daughter
00:02:43.360 | that she's dying.
00:02:44.640 | So this is another scene also that I find very compelling.
00:02:49.640 | - And you use these scenes to elicit emotion
00:02:52.520 | as part of experiments.
00:02:53.600 | - We do, yeah.
00:02:54.920 | And in fact, I was just giving a presentation
00:02:56.920 | to the Supreme Judicial Court of Massachusetts
00:03:01.920 | on implicit bias, and to start the,
00:03:05.960 | you know, they wanted to understand the neuroscience
00:03:07.660 | of implicit bias and whether or not
00:03:09.640 | they should be crafting jury instructions
00:03:13.840 | for juries to be aware of implicit bias.
00:03:17.800 | And to open the discussion, I showed them a clip
00:03:22.160 | from a film that was filmed, you know,
00:03:25.640 | almost 20 years ago, actually, called "A Time to Kill,"
00:03:30.640 | a scene where Matthew McConaughey is this kind of,
00:03:33.880 | it's his closing statements in a case
00:03:36.120 | where he is defending an African-American man
00:03:40.480 | who murdered two European-American men
00:03:43.880 | who raped his 12-year-old daughter.
00:03:46.040 | And so the scene is, he's this completely pathetic lawyer
00:03:50.040 | until the end, when he masters this fantastic defense,
00:03:55.040 | basically, of this man on trial for, you know,
00:03:59.640 | basically avenging his daughter's death.
00:04:01.840 | And that one, I hadn't seen that for 20 years, that film,
00:04:05.840 | and it brought me to tears, actually.
00:04:07.940 | It's just a really powerful, powerful scene,
00:04:11.480 | and it just really punches you in the stomach.
00:04:14.040 | You just can't help but just experience
00:04:17.600 | the weight of racism in that room
00:04:20.600 | and the brilliance of the closing argument
00:04:23.780 | to sort of puncture through it, basically.
00:04:26.360 | - Right.
00:04:27.440 | Okay, so experience.
00:04:29.620 | So one of the things you talk a lot in your work,
00:04:32.280 | in your book, is the difference in the experience
00:04:35.240 | of emotion and the expression of emotion.
00:04:38.280 | So what's our biggest misconception about emotion
00:04:42.180 | for folks that haven't considered,
00:04:44.200 | have only considered emotions at a surface level?
00:04:50.100 | - I would say one common misconception
00:04:55.100 | is that you can look at someone's face
00:04:58.420 | and read their emotion the way you read words on a page,
00:05:02.540 | that everyone around the world,
00:05:04.600 | when they're feeling angry, they scowl.
00:05:06.600 | When they're feeling happy, they smile.
00:05:08.940 | When they're feeling sad, that they frown.
00:05:11.080 | The way that I'm saying it sounds preposterous,
00:05:13.140 | and it is preposterous, but unfortunately,
00:05:15.180 | that is actually what a lot of people believe,
00:05:18.280 | and tech companies around the world
00:05:21.100 | spend tremendous amounts of money,
00:05:24.640 | and the investment of some of the most creative people
00:05:28.500 | on this planet trying to build emotion detection systems,
00:05:33.100 | when really what they're building are excellent systems
00:05:35.560 | for reading facial movements,
00:05:38.040 | which have no intrinsic emotional meaning.
00:05:40.660 | I mean, sometimes facial movements communicate emotion,
00:05:43.820 | but many times they don't.
00:05:45.220 | So people scowl for many reasons.
00:05:48.840 | They scowl when they're concentrating.
00:05:51.220 | They scowl sometimes when they're sad.
00:05:53.960 | They even can scowl when they're happy.
00:05:56.420 | People smile when they're angry.
00:05:59.460 | They smile when they're sad.
00:06:01.060 | They don't just smile when they're happy.
00:06:02.620 | They sometimes smile to indicate,
00:06:05.460 | to send a social message
00:06:07.700 | that has nothing to do with emotion.
00:06:09.300 | And so the idea that there's one universal facial expression
00:06:14.300 | for example, for each emotion is just,
00:06:17.680 | there's no strong scientific evidence for that claim.
00:06:21.500 | - So how is emotion created then?
00:06:23.580 | Because we nevertheless feel like we're observing emotion
00:06:26.980 | when we're communicating with others.
00:06:28.700 | So how does emotion, the creation of emotion change
00:06:32.620 | in the presence of others?
00:06:35.360 | Does the audience change the display of emotion?
00:06:39.620 | Is that essentially the message?
00:06:41.540 | - Well, emotions are not displayed, I would say, right?
00:06:45.340 | So basically you and I are having a conversation right now.
00:06:49.460 | And part of what my brain is doing is it's guessing,
00:06:54.460 | making educated guesses
00:06:56.380 | about what your facial movements mean.
00:06:58.020 | So right now you have a slight smile on your face,
00:07:00.220 | not exactly a smirk, but.
00:07:01.900 | (audience laughing)
00:07:04.900 | - I'm Russian by the way,
00:07:09.180 | so we're not allowed to show any emotion.
00:07:10.860 | (laughing)
00:07:13.120 | - I have a friend of mine who's Russian who told me,
00:07:17.980 | and when she moved to this country,
00:07:20.100 | actually I've had several friends tell me this,
00:07:22.400 | that their cheeks hurt for a year
00:07:24.780 | for how much smiling they had to do.
00:07:26.620 | I have also a friend from the Netherlands
00:07:27.980 | who moved here who told me the same thing.
00:07:29.340 | Her face ached for how much smiling she did.
00:07:32.940 | And in Bulgaria they have apparently like a name
00:07:36.820 | for the pervasive American smile,
00:07:40.780 | kind of like a not flattering name
00:07:44.740 | for how much Americans smile.
00:07:46.740 | But basically when you look at someone's face,
00:07:49.660 | you're making a guess about how they feel.
00:07:52.500 | And although you yourself are focused on their face,
00:07:55.500 | to you it feels as if that's where all the information is.
00:07:57.800 | Your brain is actually taking in the entire sensory array.
00:08:01.700 | So it's taking in also the sound of the person's voice
00:08:05.980 | and the person's body posture
00:08:08.460 | and the dynamic temporal changes
00:08:11.700 | in all of those signals to make sense of things.
00:08:14.500 | The face is not a lighthouse,
00:08:18.260 | it's not a beacon that just displays
00:08:20.140 | somebody's internal state in an obligatory way.
00:08:22.800 | - Nevertheless, there's some signal there?
00:08:27.020 | - Of course.
00:08:28.220 | - And so what would you say,
00:08:31.660 | as far as I understand there's no good answer yet
00:08:33.880 | from science, is what are the basic building blocks
00:08:37.260 | of emotion?
00:08:38.140 | - Well, I wouldn't say that there's no good answer
00:08:41.020 | from science, I would say scientists disagree.
00:08:43.740 | I think it's very clear what the building blocks are.
00:08:46.540 | But you know, in one way or another,
00:08:50.620 | if you look back all the way to ancient Greece,
00:08:54.660 | you can see that there are two kinds of views of emotion
00:08:58.260 | that have been battling it out for millennia.
00:09:02.740 | One is the idea that you have,
00:09:06.820 | I will give you the modern versions of these views,
00:09:09.540 | but they really do have a very long history.
00:09:11.780 | One is the idea that you are born
00:09:15.300 | with circuits in your brain that are pre-wired
00:09:19.060 | for emotion.
00:09:20.020 | So that everybody around the world is born
00:09:21.980 | with an anger circuit, a fear circuit, a sadness circuit,
00:09:25.340 | a happiness circuit, some other circuits too.
00:09:28.100 | You have them, we all have them,
00:09:29.920 | all neurotypical brains have them.
00:09:32.220 | And actually other animals have them too.
00:09:34.340 | And the idea is that when one of these circuits
00:09:38.480 | is triggered, you have an emotional response
00:09:41.460 | which is obligatory.
00:09:43.340 | So you have a very stereotypic change in your body,
00:09:47.060 | your breathing changes in a stereotypic way,
00:09:49.220 | your heart rate changes,
00:09:51.620 | the chemicals in your body change in a particular way,
00:09:56.260 | you make a particular facial expression
00:09:57.860 | and you have a propensity to make a particular action
00:10:02.740 | like you attack in anger and you run in fear
00:10:07.340 | or you freeze in fear.
00:10:08.560 | And then there's another view which says,
00:10:12.540 | well, there are some basic kind of ingredients
00:10:16.940 | or building blocks that the human mind
00:10:19.300 | or the human brain now, people talk about the brain now,
00:10:23.740 | there are some basic ingredients that your brain uses
00:10:27.100 | and it uses them to make every kind of mental event
00:10:29.400 | that you experience and every action that you take.
00:10:32.020 | The recipes are what are unique, the ingredients are common.
00:10:36.900 | Just like you can take flour and water and salt
00:10:39.260 | and you can make a whole bunch of different recipes
00:10:41.120 | with them, some of which aren't even food, like glue.
00:10:45.620 | In a similar way, the idea is that your brain
00:10:47.740 | has some kind of all-purpose capacities
00:10:50.340 | and it puts these ingredients together
00:10:53.060 | and it makes emotions as you need them on the spot.
00:10:56.020 | And you don't have one anger,
00:10:57.180 | you have a whole repertoire of anger.
00:10:59.860 | You don't have one sadness,
00:11:00.980 | you have a whole repertoire of sadness.
00:11:02.900 | And if you grow up in a culture
00:11:04.100 | that has no concept for sadness,
00:11:06.100 | you don't experience sadness and you don't perceive sadness
00:11:08.780 | because your brain becomes wired
00:11:11.580 | to make whatever mental events that exist
00:11:16.100 | in your particular culture.
00:11:17.860 | - So for artificial intelligence systems,
00:11:22.260 | the idea of emotional intelligence is really difficult
00:11:25.500 | and it seems like it's an important thing to try to instill.
00:11:28.620 | So for human beings, where do you see the importance
00:11:32.780 | on the priority list of what makes us human?
00:11:35.420 | Where does emotion sit?
00:11:36.820 | - I actually think that's the wrong question to ask
00:11:41.060 | because, no, I mean, I think people constantly
00:11:46.060 | ask that question, but I don't think
00:11:47.820 | it's the right question to ask
00:11:48.900 | because some cultures on our planet
00:11:52.700 | don't even have a concept for emotion
00:11:55.340 | and people don't, while they experience
00:11:58.380 | what I'm going to refer to as,
00:12:01.500 | in scientific terms, we call affect,
00:12:03.700 | which is our simple feelings of feeling pleasant
00:12:06.700 | or unpleasant or feeling worked up or calm
00:12:09.620 | or comfortable or having discomfort.
00:12:12.380 | These are feelings that come directly from
00:12:15.380 | the internal state of the physical systems in your body.
00:12:20.020 | Not everybody makes emotions out of those feelings
00:12:22.980 | and, in fact, we often don't make emotions
00:12:24.980 | out of those feelings.
00:12:25.820 | So those feelings, the way to think about it is
00:12:28.020 | that your brain comes wired to regulate your body.
00:12:31.580 | Actually, I'll sort of take that back.
00:12:34.460 | Your brain comes wired, when you're born,
00:12:36.500 | your brain comes wired with the potential
00:12:37.940 | to regulate your body.
00:12:39.540 | Infants actually don't regulate
00:12:41.140 | their own nervous systems very well.
00:12:43.100 | They can't put themselves to sleep.
00:12:45.080 | They can't calm themselves down.
00:12:48.200 | They can't regulate their own temperature.
00:12:50.360 | That's what they need caregivers for.
00:12:52.520 | And as caregivers regulate the nervous systems
00:12:55.780 | of their infants, that wires the infant's brain.
00:12:58.940 | So little infant brains aren't born
00:13:01.420 | like little miniature adult brains.
00:13:04.100 | They're born waiting for a set of wiring instructions
00:13:08.020 | from the world and they wire themselves
00:13:10.460 | to the physical and social realities that they grow in.
00:13:15.020 | And as they learn to do this,
00:13:17.980 | they experience these simple feelings
00:13:22.980 | that feeling pleasant or unpleasant,
00:13:25.780 | feeling worked up or feeling calm.
00:13:27.460 | These are kind of like barometer readings in a way.
00:13:30.500 | They're very simple and they lack a lot of detail
00:13:33.420 | because of how we're wired.
00:13:35.420 | And we have to make sense of them.
00:13:37.900 | And the way that a culture makes sense of them
00:13:39.660 | is not always about emotion.
00:13:41.040 | So I would say, in our culture,
00:13:43.920 | when we lose something significant like a loved one,
00:13:46.680 | we feel sad.
00:13:48.700 | In Tahitian, people feel sick.
00:13:52.620 | They have an illness.
00:13:54.060 | It's not sadness.
00:13:55.420 | 100 or 200 years ago, there was an emotion called nostalgia
00:14:00.720 | which killed people.
00:14:03.260 | It was thought to kill people after,
00:14:05.260 | for example, serving in World War I.
00:14:07.780 | We would now call that depression.
00:14:11.620 | It's not just a matter of changing the label of something.
00:14:18.780 | It's actually the formation of the experience
00:14:21.740 | is very different.
00:14:22.580 | So if you wanna build an intelligent agent,
00:14:25.900 | I don't think emotion is what you need to endow it with.
00:14:29.380 | You need to endow it with these basic ingredients
00:14:32.200 | that it can use to make whatever experiences
00:14:36.700 | or guide whatever actions,
00:14:38.860 | make whatever states or guide whatever actions
00:14:41.500 | are relevant to the situations that it's in.
00:14:44.260 | That will be different if it's an American or a British
00:14:48.020 | or a Western urbanized environment
00:14:51.620 | than say if you were to go to Tanzania
00:14:54.740 | and study the Hadza, who have been hunting and gathering,
00:14:58.900 | as a culture, since the Pleistocene.
00:15:02.100 | - So do you have words or human interpretable labels
00:15:06.800 | on these basic ingredients that we can understand?
00:15:09.500 | - I think so.
00:15:10.460 | I mean, I think the first thing to understand
00:15:12.540 | is that when we think about building an intelligent agent,
00:15:17.540 | we think about endowing the agent with cognition
00:15:21.480 | or with emotion or with the ability
00:15:24.840 | to perceive things in the world.
00:15:28.420 | Because as humans, these are the things
00:15:31.440 | that are really important to us,
00:15:32.620 | especially in our culture, thinking, feeling, seeing.
00:15:36.540 | In other cultures, they have different ways
00:15:39.380 | of parsing things.
00:15:40.940 | But the truth is that your brain did not evolve
00:15:44.580 | so that you could think and feel and see.
00:15:50.980 | It evolved, the brains evolved to control bodies.
00:15:57.220 | Brains evolved so that they could control
00:16:01.720 | the various systems of the body as creatures move around
00:16:06.720 | in order to gain resources like food and water.
00:16:13.280 | And a brain has to figure out how much resources to expend
00:16:18.120 | on getting additional resources.
00:16:21.600 | Now that may sound really trivial,
00:16:23.120 | but it turns out it's actually really hard.
00:16:25.600 | And scientists actually haven't really figured out
00:16:27.720 | completely how brains do this.
00:16:30.500 | I mean, what's really interesting to me
00:16:33.760 | is that if you look, for example, at computer vision,
00:16:36.300 | try to figure out how to make an agent see,
00:16:41.300 | that's a pretty much solved problem.
00:16:43.400 | Not completely, but it's a pretty much solved problem.
00:16:46.880 | The basics are solved.
00:16:47.960 | - The pure perception problem.
00:16:49.120 | - Yeah, the pure perception problem.
00:16:50.640 | However, if you wanna create an agent
00:16:53.880 | that just reaches out smoothly, grabs a glass,
00:16:58.320 | and brings it into the body to drink,
00:17:00.680 | that problem is not solved.
00:17:02.540 | Something as simple as movement,
00:17:05.040 | which we think of as this really basic thing,
00:17:07.140 | like, oh, it's so trivial, all animals can do it,
00:17:10.160 | is actually one of the hardest problems to solve.
00:17:13.880 | And brains basically, I mean, there's a whole story here
00:17:19.200 | about evolution, which has nothing to do
00:17:21.080 | with having a lizard brain, which is wrapped in
00:17:24.200 | a cognitive brain or anything like that.
00:17:26.360 | That's a reference to the triune brain,
00:17:29.920 | which a lot of people believe.
00:17:31.320 | It's a way of thinking about brain evolution.
00:17:33.840 | Instead, what seems to be the case is that
00:17:37.560 | as bodies got larger and there were more systems,
00:17:42.560 | because the niche of the animal,
00:17:44.760 | the environment of the animal got bigger
00:17:46.280 | and bigger and bigger, brains also had to get bigger.
00:17:49.320 | But they had to get bigger to a point with some constraints.
00:17:53.840 | Like, they have to be, you have to keep
00:17:56.520 | metabolic costs down, it's really important.
00:18:01.320 | If you don't, creatures get sick and die.
00:18:05.000 | We can talk about what that means in terms of depression
00:18:07.020 | or metabolic illnesses or what have you.
00:18:09.080 | But, so what are the basic ingredients?
00:18:11.080 | Well, one of the basic ingredients is that
00:18:13.020 | your brain is controlling your body all the time.
00:18:15.600 | Whether you feel it or not, whether you're thinking
00:18:17.700 | about it or not, whether you're asleep or awake,
00:18:20.680 | certain parts of your brain are always very active
00:18:24.120 | all the time, even when you're sleeping,
00:18:26.940 | or else you'd be dead.
00:18:29.160 | And those parts of the brain that are controlling
00:18:32.040 | your heart and your lungs and your immune system
00:18:35.580 | and all of those regions that are controlling
00:18:38.200 | the systems of your body are also helping your brain
00:18:42.360 | to represent the sensory consequences
00:18:45.600 | of those changes in your body,
00:18:48.080 | which you don't feel directly.
00:18:50.800 | You don't feel your heart beating most of the time.
00:18:53.920 | You don't feel your lungs expanding most of the time.
00:18:58.080 | And there's a really good reason why we are all wired
00:19:00.960 | not to feel those things, and that is you'd never
00:19:04.160 | pay attention to anything outside in the world ever again
00:19:07.200 | if you could feel every little movement
00:19:09.160 | that was going on inside your body.
00:19:10.840 | So your brain represents those as a summary,
00:19:13.800 | these kind of simple summary feelings.
00:19:15.400 | You feel good, you feel bad.
00:19:17.240 | You feel great, you feel like shit.
00:19:19.360 | You feel really jittery, you feel really calm.
00:19:22.480 | And these feelings are not emotions.
00:19:24.940 | They sometimes, your brain can make them into emotions,
00:19:27.940 | but they're with you every waking moment of your life.
00:19:31.040 | They are properties of consciousness
00:19:35.200 | in the way that lightness and darkness
00:19:37.320 | is a property of vision.
00:19:40.280 | And sometimes we make them into emotions.
00:19:44.520 | When we have a big change in our heart rate
00:19:47.360 | or a big change in our breathing rate
00:19:50.880 | or a big change in our temperature
00:19:53.160 | or a big surge of glucose,
00:19:55.200 | the brain might make emotion out of those changes,
00:20:01.140 | those very strong changes which you will feel
00:20:03.880 | as really feeling unpleasant or really feeling pleasant.
00:20:07.200 | But your brain might also make hunger
00:20:09.480 | or your brain might make an instance of
00:20:11.840 | a physical sensation like nausea.
00:20:16.840 | Or your brain might even make a perception of the world
00:20:21.240 | like that's a nice guy.
00:20:24.000 | That guy's an asshole.
00:20:25.960 | This is a really delicious drink.
00:20:28.120 | That's a beautiful painting.
00:20:29.960 | Those are also moments where these simple feelings
00:20:33.080 | which we call affect are very strong.
00:20:35.740 | So affect is a basic ingredient.
00:20:38.160 | There are others that I could talk about too.
00:20:41.520 | It's not the only one.
00:20:42.520 | But one of the things I think that sometimes
00:20:45.920 | people who are studying to build AI systems
00:20:50.560 | don't realize is that the brain,
00:20:54.620 | its fundamental job is to keep your body alive and well.
00:20:59.620 | And if you don't have some kind of body to regulate
00:21:05.880 | with affective feelings that come from that regulation,
00:21:11.300 | or something like that,
00:21:13.820 | you're kind of gonna be out of luck I think
00:21:18.820 | in rendering something that seems more human.
00:21:24.100 | - Right, so maybe you can elaborate like in the book Sapiens
00:21:28.500 | that as human beings we're really good en masse
00:21:31.900 | as with thousands, millions of people together
00:21:34.940 | believing something even if it's not true.
00:21:37.580 | So while scientifically sort of
00:21:42.420 | from a neuroscience perspective it's maybe very true
00:21:44.860 | that emotions aren't real.
00:21:46.580 | - I didn't say they aren't real.
00:21:48.240 | But I said they're not, there isn't.
00:21:50.900 | So it's interesting, you finish and then I'll--
00:21:53.460 | - So what I'm trying to say is also from an AI perspective
00:21:57.380 | is they become these trivial ideas of mapping a smile
00:22:02.380 | to being happy and these kind of trivial ideas
00:22:05.080 | become real to us through Hollywood,
00:22:07.980 | through cultural spreading of information
00:22:10.100 | and we start to believe this and therefore it becomes real
00:22:13.620 | in as much as anything is real
00:22:17.100 | about our perception together.
00:22:20.540 | So it's really important scientifically
00:22:23.740 | the ideas that you're presenting,
00:22:25.000 | but does that mean there's,
00:22:27.180 | just because our brain doesn't feel those explicit emotions
00:22:30.180 | does that mean they're not real?
00:22:31.900 | - I didn't say we don't feel explicit emotions.
00:22:34.300 | So I want to be really clear about this
00:22:36.120 | because it's an interesting,
00:22:37.980 | this inference, it's an interesting inference
00:22:42.980 | that people often make.
00:22:44.980 | And so, but it's a mistake.
00:22:48.560 | And it's a mistake that betrays a certain kind of thinking
00:22:51.360 | that we do in this culture.
00:22:54.360 | And it's the mistake of the following sort.
00:22:56.520 | So when I say there's no facial expression
00:23:01.100 | that is diagnostic of a single emotion,
00:23:03.880 | that doesn't mean that people don't express emotion.
00:23:07.940 | They certainly do express emotion.
00:23:09.980 | They just don't, when they're in a state of anger
00:23:13.540 | or in a state of sadness or in a state of awe,
00:23:15.680 | their faces don't do one thing.
00:23:17.740 | When I say, well, your body can do many things
00:23:21.060 | when you're angry or when you're sad,
00:23:23.740 | but let's take anger.
00:23:24.820 | Your heart rate can go up, it can go down,
00:23:26.900 | it can stay the same.
00:23:28.060 | Your breathing rate can go up, it can go down,
00:23:30.340 | it can stay the same.
00:23:31.920 | The same pattern that you see in anger,
00:23:34.400 | you sometimes see in sadness and you sometimes see in fear.
00:23:37.220 | You sometimes even see it in enthusiasm and in awe.
00:23:40.600 | So does that mean that emotions aren't real?
00:23:43.460 | No, emotions are real.
00:23:45.360 | But sometimes things are real
00:23:48.440 | because the physical meaning of the signal
00:23:54.020 | is endowed in the signal.
00:23:57.940 | So when your retina communicates to your brain
00:24:02.500 | that you are faced with a wavelength that is 600 nanometers,
00:24:13.740 | the meaning is endowed in the signal.
00:24:13.740 | That's like in the signal.
00:24:14.780 | It's not like interpret, you don't interpret
00:24:16.420 | that it's 600 nanometers.
00:24:18.420 | It is 600 nanometers.
00:24:19.900 | The information's in the signal.
00:24:21.740 | But when you see red,
00:24:26.100 | that information is not in the signal.
00:24:28.220 | Your brain has added information
00:24:32.940 | that isn't in the signal itself.
00:24:34.640 | In a sense, your brain has imposed meaning on a signal
00:24:39.640 | that the signal doesn't have on its own.
00:24:43.340 | So let me back up and give a different example
00:24:45.580 | to make it a little easier and then we'll re-approach this.
00:24:48.480 | There's a, there are some things that we,
00:24:53.380 | in fact, this is true almost of all civilization, right?
00:24:57.300 | That we, there are some things that are real
00:24:59.140 | by virtue of the fact that we agree that they exist.
00:25:02.760 | Little pieces of paper serve as money,
00:25:08.220 | or now, you know, Bitcoin,
00:25:10.420 | or little pieces of plastic, or gold, or diamonds,
00:25:15.180 | or in the past, barley, salt, shells, rocks,
00:25:21.240 | serve as currency, have value,
00:25:24.760 | only because a group of people agree that they have value.
00:25:28.360 | So we impose meaning on objects,
00:25:33.360 | and once we all agree that that object actually has value,
00:25:37.560 | we can trade it for material goods.
00:25:39.480 | The minute that some of us disagree,
00:25:42.660 | that we withdraw our consent, right,
00:25:47.660 | the things lose their value.
00:25:50.760 | That's what happened in the mortgage crisis.
00:25:53.160 | That's what happened in the tulip crisis
00:25:55.520 | in the Netherlands in the 17th century,
00:25:59.840 | or 16th century, or whatever it was.
00:26:01.640 | Money, currency exists because we impose meaning
00:26:07.760 | on objects in the world, physical objects in the world,
00:26:12.680 | that themselves don't have that meaning on their own.
00:26:15.880 | And they are very real, money is very real to people.
00:26:18.780 | I can stick somebody's head in a brain scanner
00:26:21.000 | and show you that they experience value in a very real way,
00:26:25.600 | but that reality is constructed
00:26:28.560 | by the fact that they have learned the value
00:26:34.900 | in a particular culture.
00:26:36.480 | Well, that's also what we do with emotion.
00:26:38.640 | We impose meaning on certain physical signals
00:26:41.720 | that they don't have on their own,
00:26:45.240 | but we have collective intentionality.
00:26:47.140 | We all agree that scowling is sometimes anger,
00:26:52.080 | and so it becomes anger in a very real way,
00:26:55.160 | just like little pieces of paper become money.
00:26:58.280 | - So it sounds like you kind of think about
00:27:00.040 | the expression of emotion as a kind of language,
00:27:02.120 | as an extension of a language that we learn,
00:27:04.640 | in the same way that we collectively agree on a language,
00:27:08.060 | on a lexicon, and how we use that language.
00:27:12.560 | - Sure, you could think of it that way.
00:27:14.200 | I mean, everything in our culture
00:27:18.080 | is, almost everything in our culture
00:27:23.160 | is a function of social reality in this way.
00:27:26.800 | We are citizens of a country,
00:27:29.880 | because we all agree that the country exists, more or less.
00:27:33.440 | And wow, you guys barely even laughed at that, okay.
00:27:36.800 | What's a revolution?
00:27:40.320 | A revolution is when some people in the country
00:27:42.160 | withdraw their consent.
00:27:44.000 | They no longer agree, right?
00:27:47.120 | A president has powers in a country,
00:27:49.440 | because we all agree that a president has powers.
00:27:51.720 | President only has powers by virtue of the fact
00:27:53.440 | that we all agree that he or she has powers.
00:27:55.800 | If we stop agreeing,
00:27:57.200 | the president doesn't have those powers anymore.
00:28:01.920 | It's very real.
00:28:03.880 | People's lives depend, outcomes of real people
00:28:07.720 | depend on these social realities that we build and nurture.
00:28:12.400 | And we wire these social realities
00:28:14.160 | into the brains of our children as we socialize them.
00:28:17.680 | And when people move from one culture to another,
00:28:20.280 | they have to learn the new social reality
00:28:24.160 | that they are faced with.
00:28:25.400 | And if they don't, they get very sick physically,
00:28:29.600 | because our ability to agree on what something means
00:28:34.380 | actually is important for regulating our nervous systems.
00:28:37.160 | - So can you speak to that a little bit,
00:28:39.120 | I mean, for machine learning methods,
00:28:40.560 | for systems that learn to behave based on a certain reward,
00:28:45.560 | it's important to kind of have some ground truth and learn.
00:28:52.520 | So how do, it sounds like the expression of emotion
00:28:56.240 | is learned.
00:28:57.840 | Can you talk about how we learn to fit into our culture
00:29:01.800 | by expressing emotion with our face, body,
00:29:05.160 | given the context, given the rich,
00:29:07.660 | what's that process look like?
00:29:08.840 | When does it happen?
00:29:10.100 | How much does it-- - Sure, well, you know,
00:29:12.720 | I mean, I wrote a 400 page book,
00:29:15.960 | so I'll try to do it in like a couple sentences, yeah.
00:29:18.660 | (audience laughing)
00:29:21.660 | So how does an infant learn anything?
00:29:28.740 | So an infant's born,
00:29:30.760 | and it can't really do anything for itself.
00:29:35.280 | It can't regulate its own nervous system.
00:29:37.620 | It can't keep its body systems balanced.
00:29:40.760 | This is a term, scientific term for this is allostasis.
00:29:44.120 | Allostasis is your brain's ability to predict
00:29:46.480 | what your body's gonna need before it needs it
00:29:49.200 | and tries to meet those needs before they arise.
00:29:52.380 | So an example would be, if you're gonna stand up,
00:29:55.540 | if your brain's gonna stand you up,
00:29:56.980 | it has to raise your blood pressure before it stands you up.
00:30:00.280 | If it doesn't, you'll fall.
00:30:02.320 | That's costly from a metabolic standpoint, it's costly.
00:30:06.220 | You'll hurt yourself.
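
For readers coming from control or reinforcement learning, one hedged way to picture the predictive regulation described here is a controller that acts on forecast demand rather than on error after the fact. This is only an illustrative sketch; the function names, setpoints, and numbers are assumptions, not anything from the talk or from her models.

```python
# Illustrative sketch of reactive vs. predictive (allostatic) regulation.
# All names and numbers are made up for illustration.

def reactive_controller(blood_pressure, setpoint=90.0, gain=0.5):
    """Correct blood pressure only after it has drifted from the setpoint."""
    error = setpoint - blood_pressure
    return gain * error  # adjustment arrives after the deviation occurs

def allostatic_controller(blood_pressure, predicted_demand, setpoint=90.0, gain=0.5):
    """Shift the target ahead of time, e.g. raising pressure before standing up."""
    anticipated_setpoint = setpoint + predicted_demand
    error = anticipated_setpoint - blood_pressure
    return gain * error

# The brain predicts that standing up will require roughly 15 extra units of pressure,
# so it begins adjusting before the body actually stands.
print(reactive_controller(90.0))                           # 0.0: nothing happens until after the drop
print(allostatic_controller(90.0, predicted_demand=15.0))  # 7.5: pre-emptive correction
```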
00:30:08.220 | So an infant's brain can't do this very well.
00:30:12.300 | An infant doesn't know when to go to sleep
00:30:15.360 | and when to wake up.
00:30:17.480 | An infant can't feed itself,
00:30:19.840 | can't regulate its own temperature.
00:30:22.080 | Someone else has to do it.
00:30:24.360 | And when someone does it, the infant is learning.
00:30:28.900 | The infant is learning.
00:30:30.340 | It's taking in sights and sounds and smells,
00:30:36.080 | and the physical sensations from the body,
00:30:39.380 | which are comfortable and pleasant
00:30:42.300 | when the infant's allostasis is maintained.
00:30:46.020 | So right from the get-go,
00:30:54.500 | an infant is doing statistical learning,
00:30:59.000 | capturing events,
00:31:02.160 | including their consequences for the infant's body.
00:31:02.160 | Some people think babies are born
00:31:04.120 | attached already to their caregivers,
00:31:06.040 | but they're not, actually.
00:31:08.400 | Infants don't even know what a caregiver is.
00:31:10.780 | It's just that the caregiver is there constantly
00:31:13.240 | meeting that infant's needs.
00:31:15.220 | That's how infants start to learn.
00:31:17.160 | Now, if there are statistical regularities,
00:31:19.760 | like, for example, an infant is not born
00:31:21.920 | with the ability to recognize a face as a face,
00:31:25.040 | but it learns that in the first couple of days of life.
00:31:29.220 | Because human faces have some
00:31:30.600 | statistical regularities to them.
00:31:33.200 | Two eyes, a nose, and a mouth,
00:31:34.760 | kind of in the same place most of the time.
00:31:38.280 | So it learns really quickly.
00:31:39.940 | But here's the interesting thing.
00:31:42.480 | Around three months of age,
00:31:44.880 | infants start to learn what we call abstract categories.
00:31:49.880 | They start to learn that some things
00:31:53.760 | which don't look the same or sound the same
00:31:57.120 | or smell the same actually have the same function
00:32:02.920 | and how do they learn this?
00:32:04.480 | They learn it with words.
00:32:05.980 | So if you do an experiment with a three month old,
00:32:09.800 | three months old, okay,
00:32:11.640 | and you say to that baby very, very intentionally,
00:32:14.880 | "Look, sweetie, this is a wug."
00:32:18.480 | And you put the wug down and it makes a noise, like a beep.
00:32:24.880 | And then you say, I'm like, I don't have props.
00:32:29.760 | And then you say-- (laughs)
00:32:31.440 | - I have my wallet.
00:32:32.280 | - Yeah, okay, and then you say, "Give me your wallet."
00:32:34.040 | Yeah, give me your wallet.
00:32:35.200 | (audience laughs)
00:32:36.600 | And then you say, "Look, sweetie, this is a wug."
00:32:41.600 | And you put the wug down and it makes a beep.
00:32:45.560 | Then when you show another object and say, "Look, sweetie, this is a wug,"
00:32:50.800 | that infant expects that object to beep.
00:32:55.800 | Why is that important?
00:32:58.720 | Because in the real experiments,
00:33:01.320 | this might be yellow and squishy and tall.
00:33:06.320 | And this might be red and pointy and hard.
00:33:11.680 | And this might be, so lots of different perceptual features,
00:33:17.360 | but the infant knows, learns,
00:33:22.360 | that the word is inviting the infant to understand
00:33:29.080 | that the function of those very different physical signals
00:33:33.600 | are actually the same.
00:33:35.200 | Similarly, you can take six objects
00:33:38.640 | that are exactly identical in their physical features,
00:33:41.680 | how they sound, how they smell, what they feel like,
00:33:45.920 | what they look like, and you can name three of them
00:33:48.240 | with one word and three of them with the other word,
00:33:51.320 | and the infant will understand
00:33:53.520 | that these are actually different objects.
00:33:55.880 | That happens a little after, not as early as three months.
00:33:59.280 | But the point is that we talk all the time.
00:34:02.560 | We use words all the time.
00:34:03.680 | What do we do with infants?
00:34:04.820 | We're constantly pointing things out and labeling them.
00:34:07.680 | This is a dog, this is a cat.
00:34:09.760 | You're angry, this person's sad.
00:34:12.440 | Oh, mommy's really happy today.
00:34:14.520 | Oh, you know, mommy loves you.
00:34:16.640 | Oh, daddy's really excited about this,
00:34:19.640 | and so on and so forth.
00:34:21.120 | And infants learn really, really quickly.
00:34:24.360 | Words are considered to be kind of invitations
00:34:27.240 | to form abstract concepts.
00:34:29.600 | That is the basis of almost all of the
00:34:40.280 | mental categories, the mental events,
00:34:44.040 | that we experience.
00:34:44.040 | We're basically teaching children
00:34:46.880 | to form these abstract categories.
00:34:48.920 | And that's the basis of money,
00:34:52.060 | and it's the basis of roles that we have with each other
00:34:56.800 | and what we expect from each other.
00:34:58.240 | It's the basis of a lot of the sort of functional categories
00:35:01.520 | that we use in everyday life.
00:35:02.480 | We impose meaning on sensory arrays
00:35:06.500 | that those sensory arrays in and of themselves
00:35:08.760 | don't necessarily have.
00:35:09.960 | They only have that meaning because you and I both learn
00:35:14.200 | that that package of sensory array means something.
00:35:16.840 | So when I make that,
00:35:18.680 | you can anticipate what will happen next.
00:35:21.200 | Just because we've learned those,
00:35:23.320 | they're wired into our brains in our culture.
00:35:28.320 | And when we go to a different culture,
00:35:29.400 | we have to learn sometimes different packages.
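
A hedged restatement of the point about words, for an engineering audience: a shared label licenses grouping objects whose perceptual features disagree, and the resulting category supports predictions about function. The sketch below is only an analogy to the infant studies she summarizes; the objects, features, and labels are invented.

```python
# Illustrative sketch: a shared label ("wug") invites an abstract category
# across perceptually different objects, and the category predicts function.

objects = [
    {"features": {"color": "yellow", "texture": "squishy", "size": "tall"}, "label": "wug", "beeps": True},
    {"features": {"color": "red", "texture": "hard", "size": "small"},      "label": "wug", "beeps": True},
    {"features": {"color": "yellow", "texture": "squishy", "size": "tall"}, "label": "dax", "beeps": False},
]

# Group by label, ignoring how different the features look.
categories = {}
for obj in objects:
    categories.setdefault(obj["label"], []).append(obj)

def expect_beep(label):
    """Predict a new object's function from its label alone."""
    return any(member["beeps"] for member in categories.get(label, []))

print(expect_beep("wug"))  # True: a new "wug" is expected to beep, whatever it looks like
print(expect_beep("dax"))  # False
```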
00:35:33.200 | - Yeah, different mappings.
00:35:34.200 | So you're saying that there's a few sources of sensory data
00:35:38.080 | and a few building blocks inside us,
00:35:41.480 | the feelings of some kind that we learn to then,
00:35:44.840 | from an early, we're born with those?
00:35:47.840 | - Part of what your brain is doing
00:35:49.280 | is it's trying to make sense of the sensory array around it.
00:35:53.640 | So from your brain's perspective,
00:35:57.980 | just think about it from your brain's perspective.
00:36:01.320 | Your brain's perspective,
00:36:03.240 | it spends its entire life trapped in a dark, silent box.
00:36:08.240 | And it has to make sense
00:36:14.760 | of what's going on around it in the world
00:36:17.000 | so that it knows what to do to keep itself alive and well.
00:36:22.000 | But it has to know what to do based on,
00:36:26.880 | it has to know what's happening around it
00:36:29.680 | only from the effects that it receives
00:36:32.680 | through the sensory systems of the body.
00:36:36.240 | So a flash of light, what's a flash of light?
00:36:39.040 | It could be anything.
00:36:40.120 | Like a siren, a siren could be a firetruck
00:36:46.500 | or it could be somebody's car alarm went off
00:36:50.280 | or it could be a doorbell or it could be, right?
00:36:52.800 | Any particular sensory cue could have multiple causes.
00:36:57.400 | So your brain's trapped basically in your skull
00:37:00.080 | and all it gets are the effects,
00:37:03.080 | the sensory effects of stuff that happens in the world.
00:37:06.620 | But it has to figure out what those things are
00:37:09.320 | so that it knows what to do.
00:37:10.660 | So how does it do that?
00:37:12.160 | Well, it has something else that it can draw on.
00:37:14.060 | It has your past experiences.
00:37:16.440 | Your brain basically doesn't store experiences
00:37:21.400 | from the past, it can reconstitute them in its wiring.
00:37:24.960 | And that's what it uses to guess
00:37:30.640 | at what those sensory cues mean,
00:37:33.200 | what those sensory changes mean.
00:37:35.120 | So in one situation, a siren means one thing.
00:37:39.700 | In another situation, it means another.
00:37:41.400 | A flash of light means one thing in one situation
00:37:43.960 | and a different thing in another.
00:37:45.720 | So your brain is using past experience to make guesses
00:37:48.620 | about what these sensory changes mean
00:37:51.040 | so that it knows what to do.
00:37:52.440 | And it has the same relationship
00:37:55.920 | to the sensory changes in your body.
00:37:58.000 | What's an ache in your stomach?
00:38:00.600 | Well, it could be hunger.
00:38:02.880 | It could be anger.
00:38:05.320 | It could be disgust.
00:38:06.240 | It could be longing.
00:38:07.760 | It could be nausea.
00:38:08.960 | It's not that there's one ache in your stomach for nausea
00:38:15.080 | and another ache in your stomach for hunger.
00:38:17.520 | There are many aches in your stomach,
00:38:21.720 | many different feelings of achiness for nausea
00:38:24.640 | and many different feelings of achiness for hunger
00:38:26.960 | and sometimes they overlap.
00:38:29.200 | So your brain has to make the same kinds of guesses
00:38:31.360 | about what's going on in your body
00:38:32.640 | as it does about what the sensory events mean in the world.
00:38:36.000 | And that's really what it's doing.
00:38:37.640 | It's guessing and making sense of the sensory array
00:38:40.800 | so that it knows what to do next.
00:38:42.840 | And when it guesses wrong,
00:38:44.480 | it takes in the sort of the information
00:38:49.480 | that it didn't predict well,
00:38:51.960 | which in psychology we have a really fancy name for that.
00:38:56.080 | We call it learning.
00:38:58.440 | Your brain takes in the information that it didn't predict
00:39:01.360 | and so that it can predict better in the next time
00:39:04.760 | to make sense of things the next time.
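
The description of guessing, mispredicting, and updating maps naturally onto error-driven learning. Here is a minimal sketch of that reading; it is an assumption about how to formalize what she says, not her model, and the learning rate and values are arbitrary.

```python
# Illustrative sketch: the unpredicted part of the input (prediction error)
# is what drives the update, i.e. what she calls learning.

def update_prediction(prediction, observation, learning_rate=0.1):
    """Move the prediction toward the observation by a fraction of the error."""
    prediction_error = observation - prediction  # the part the brain did not predict
    return prediction + learning_rate * prediction_error

# Example: a sound repeatedly turns out to mean "car alarm" rather than "fire truck";
# each surprise nudges the brain's guess about what that sensory cue means.
prediction = 0.0
for observation in [1.0, 1.0, 1.0, 1.0]:
    prediction = update_prediction(prediction, observation)
print(round(prediction, 3))  # 0.344: better guesses after each mismatch
```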
00:39:06.660 | - So you kind of answered this a little bit.
00:39:11.680 | I'd like to elaborate on it.
00:39:13.600 | If you were to build a robot that performs,
00:39:16.720 | maybe passes the Turing test
00:39:18.080 | or performs at the low bar level of,
00:39:21.720 | instead of myself here today,
00:39:23.320 | it would be a robot talking to you.
00:39:25.120 | It would be convincing as a human.
00:39:26.840 | How would you build that system in a sense
00:39:31.840 | in paralleling the infants?
00:39:35.600 | What essential aspect of the infant experience
00:39:39.520 | do you think is important?
00:39:41.520 | - It needs, well, I mean, I'm not a computer scientist.
00:39:45.760 | So the way that I would say it is it needs to have a body.
00:39:50.760 | It needs to have something like physical systems
00:39:53.720 | or an analogy to physical systems.
00:39:55.960 | It has to do something analogous to allostasis.
00:39:59.160 | - So what, sorry to elaborate.
00:40:00.840 | So what would be the goal for the system?
00:40:03.880 | You kind of mentioned previously
00:40:05.680 | that the goal would be for the brain
00:40:07.960 | to just stabilize itself.
00:40:09.720 | - No, it's not that the brain is stabilizing itself.
00:40:11.720 | So people talk about reward.
00:40:13.080 | What is reward?
00:40:14.160 | - In machine learning, it's pretty easy.
00:40:17.640 | It's something.
00:40:18.920 | (audience laughing)
00:40:21.880 | It's mathematical.
00:40:22.880 | So there's no philosophy to it.
00:40:25.240 | You just--
00:40:26.120 | - Oh, and there's philosophy to everything, right?
00:40:28.520 | Whether you admit it or not is a different story.
00:40:30.440 | - Right. - Yeah.
00:40:31.680 | - So you wanted to play a game of chess,
00:40:34.880 | play a game of Go.
00:40:35.800 | You wanted to pick up a water bottle.
00:40:38.760 | - Okay, but what is reward?
00:40:40.400 | - Existence?
00:40:43.840 | - Dopamine is actually not reward.
00:40:49.160 | Dopamine is effort.
00:40:50.520 | Dopamine is necessary for effort.
00:40:52.040 | It's not necessary for reward.
00:40:53.920 | I mean, it's commonly,
00:40:55.240 | if you read the most up-to-date literature,
00:40:58.960 | that is what you'll see.
00:41:00.680 | That it's actually, people can,
00:41:03.040 | animals can find things rewarding,
00:41:06.720 | can be without dopamine, actually,
00:41:09.720 | but they need dopamine to move.
00:41:13.400 | They need dopamine to encode, to learn information.
00:41:16.160 | So it's really for effort that is required
00:41:18.840 | to work towards getting a reward, I would say.
00:41:22.600 | And when a brain, an animal brain,
00:41:27.600 | any kind of animal brain,
00:41:29.800 | mispredicts what the reward will be,
00:41:32.640 | that's when you see a real surge of dopamine,
00:41:35.280 | because the animal has to adjust its action.
00:41:38.920 | But reward is basically bringing the body
00:41:44.160 | back into allostasis.
00:41:46.040 | It feels good when that happens.
00:41:48.000 | And people will, and animals will work tremendously hard
00:41:53.000 | to have that happen.
00:41:55.800 | So, you know, what is motivation?
00:41:58.600 | Motivation is expending resources to get a reward.
00:42:05.000 | So basically, if you don't have
00:42:10.000 | something like physical systems
00:42:16.240 | that have to be kept in balance,
00:42:18.640 | water, I mean, for humans,
00:42:20.800 | or for actually any living creature on this planet,
00:42:25.800 | even single-cell organisms, actually,
00:42:29.880 | there's an analogy to what we're talking about here,
00:42:32.680 | to brains.
00:42:34.000 | But, you know, salt, water, glucose,
00:42:38.280 | all these systems have to be kept in balance,
00:42:40.480 | and they have to be kept in balance
00:42:41.520 | in a very, very efficient way.
00:42:44.960 | And that's the motivating,
00:42:47.720 | so-called motivating force, really.
00:42:50.160 | That's what really brains are for.
00:42:53.480 | And the consequences of that regulation
00:42:59.960 | are the basis of affective feelings,
00:43:02.880 | which are, for many, many creatures on this planet,
00:43:07.880 | a property of consciousness.
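
One hedged way to formalize "reward is bringing the body back into allostasis" for a reinforcement learning audience is to treat reward as the reduction in how far internal variables sit from their setpoints. This is a sketch of that reading, not a claim about her theory or about the brain's actual reward computation; the variables and numbers are invented.

```python
# Illustrative sketch: reward as a reduction in deviation from bodily setpoints.

def deviation(state, setpoints):
    """Total distance of internal variables (glucose, water, ...) from their setpoints."""
    return sum(abs(state[name] - setpoints[name]) for name in setpoints)

def allostatic_reward(state_before, state_after, setpoints):
    """Positive when an action moved the body closer to balance."""
    return deviation(state_before, setpoints) - deviation(state_after, setpoints)

setpoints = {"glucose": 90.0, "water": 1.0}
before = {"glucose": 70.0, "water": 0.5}   # hungry and dehydrated
after  = {"glucose": 85.0, "water": 1.0}   # ate and drank
print(allostatic_reward(before, after, setpoints))  # 15.5: a rewarding transition
```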
00:43:09.960 | - Okay, so maybe, if it's okay,
00:43:13.920 | we'll take some questions from the audience.
00:43:15.520 | - Sure. - But first,
00:43:16.440 | let me ask the last question.
00:43:18.320 | So, building on the robot question,
00:43:20.760 | how would you build the same kind of robot
00:43:25.200 | that you would be able to, as a human being,
00:43:28.040 | fall in love with?
00:43:31.560 | - Well, you know, people fall in love with their cars.
00:43:34.880 | They fall in love with,
00:43:36.520 | they fall in love with their blankets,
00:43:39.760 | they fall in love with their toys.
00:43:42.440 | You know, you don't need,
00:43:43.400 | it doesn't need much to fall in love with something.
00:43:45.240 | The question is, will it love you back?
00:43:46.920 | - No, well. (audience laughing)
00:43:50.880 | I would elaborate, I think.
00:43:53.560 | Yeah, I think you're answering that,
00:43:56.320 | love, in the way we're defining it loosely
00:43:58.560 | in poetry and culture, is a social construct,
00:44:02.120 | and it's relative.
00:44:03.520 | What I mean is, sort of the idea of monogamous,
00:44:06.840 | long-term love that we have deep connection
00:44:09.920 | with other human beings that we have.
00:44:11.480 | You're saying you could do the same with a car,
00:44:13.400 | like a nice '69 Mustang.
00:44:14.920 | - Are you telling me that you've never,
00:44:17.480 | you don't know anyone who is so in love with their car
00:44:19.840 | that you, okay.
00:44:20.920 | Now, here's the thing. - That's true.
00:44:22.680 | - So, here's the thing.
00:44:23.720 | We are social animals, okay?
00:44:26.640 | What does that mean?
00:44:27.480 | What does it mean to be a social animal?
00:44:29.480 | It means that we regulate each other's nervous system.
00:44:31.560 | So, our brain, my brain isn't just regulating
00:44:34.160 | my nervous system, right now, it's regulating yours.
00:44:37.880 | And actually, it's regulating other people's
00:44:40.600 | in the audience, too, and vice versa.
00:44:42.640 | And why is that?
00:44:43.840 | You know, I mean, other animals do it, too.
00:44:46.760 | We're just really good at it.
00:44:49.000 | But other animals, like there are some insects
00:44:50.640 | that are social species, they regulate
00:44:53.120 | each other's nervous systems.
00:44:54.160 | They do it with chemicals, they do it with smell,
00:44:56.640 | primarily, and a little bit with touch,
00:44:58.840 | like earwigs will, you know, like, they can actually,
00:45:02.200 | I have this great picture of this, like,
00:45:03.440 | totally disgusting looking little bug,
00:45:05.240 | but it's like, you know, cuddling its little baby,
00:45:08.160 | ugly, little ugly baby bug, too.
00:45:10.840 | It's adorable picture.
00:45:13.000 | But, you know, what about mammals, like rats?
00:45:15.480 | Well, they also use chemicals, like smell,
00:45:18.440 | but they also use touch, and to some extent,
00:45:23.400 | they also use sound, they use hearing.
00:45:26.400 | Primates add vision, and as primates,
00:45:29.720 | we use all of those senses to regulate each other,
00:45:33.200 | and we also use something else.
00:45:35.200 | Words, right, exactly.
00:45:38.720 | And words, the systems in our brains
00:45:41.120 | that allow us to speak and allow us to understand words
00:45:45.160 | are directly connected to the parts of the brainstem
00:45:50.160 | that control the body.
00:45:51.880 | I don't mean like there are a bunch of,
00:45:53.040 | I mean monosynaptically connected.
00:45:56.320 | So exactly the same systems in your brain
00:45:59.560 | that are important for you to be able to understand language
00:46:04.320 | and to speak are also directly, directly
00:46:09.320 | affecting the systems of your body.
00:46:11.920 | And that is why I can say something to somebody,
00:46:15.900 | I can speak to you, and I can have an impact
00:46:18.160 | on the nervous systems of people
00:46:20.080 | all the way at the back of this auditorium
00:46:22.360 | without, you know, maybe they can see me, maybe they can't.
00:46:25.880 | Maybe they can, hopefully they can't smell me.
00:46:28.040 | Maybe they can hear me, maybe they, you know.
00:46:30.280 | But they can, if they hear me speak words,
00:46:31.920 | I can affect their nervous systems.
00:46:33.920 | That's why a telephone works.
00:46:35.760 | That's why a telephone works
00:46:36.760 | to where you can feel connected to someone
00:46:38.680 | just merely by hearing their voice
00:46:40.720 | because the sound of their voice has an effect
00:46:43.800 | on your nervous system.
00:46:45.000 | It can make you breathe faster,
00:46:46.840 | it can make you breathe slower,
00:46:48.400 | and the words also have an effect
00:46:51.000 | because when I say a word like,
00:46:54.820 | hmm, I don't know.
00:46:56.900 | When I say a word like car, that's a short form.
00:47:04.180 | I have a bunch of mental features in my mind
00:47:07.020 | when I say the word car, and I say that word,
00:47:09.700 | and that invokes those similar mental features,
00:47:12.380 | maybe not identical, but similar enough
00:47:14.780 | that it invokes it in your mind.
00:47:16.900 | And your mind is made by your brain,
00:47:22.940 | so it invokes, if I just say the word car,
00:47:26.940 | there are changes in your motor system
00:47:30.300 | that would be exactly the same or very close
00:47:33.660 | as if you were actually in a car, right?
00:47:36.980 | So this is something that we do,
00:47:41.220 | and the fact where attachment comes from,
00:47:46.100 | an infant to a caregiver or two lovers
00:47:50.380 | or two really close friends comes from the ability
00:47:54.900 | that we have to regulate each other's nervous systems.
00:47:57.760 | And that is why when you don't have
00:48:04.980 | that kind of attachment, you die sooner,
00:48:09.780 | on average, seven years sooner, loneliness kills.
00:48:13.180 | I always tell my daughter, my daughter's 19 years old,
00:48:15.180 | and I always tell her, when you break up with someone,
00:48:18.060 | it feels like it will kill you, but it won't.
00:48:22.100 | Loneliness, however, will kill you.
00:48:24.540 | It will kill you, on average, seven years earlier
00:48:30.860 | than it would if you didn't have an attachment.
00:48:34.780 | And that's because our nervous systems,
00:48:39.780 | as our bodies got really complex through evolution
00:48:43.940 | and our brains got bigger, they could only get so big
00:48:47.780 | there are constraints on how big any brain can get
00:48:52.420 | that have to do with birthing the infant,
00:48:54.660 | but it also has to do with the metabolic cost of a brain.
00:48:58.500 | Your brain is really expensive, my brain, really expensive.
00:49:02.380 | Three pounds, 20% of your metabolic budget, that's a lot.
00:49:06.980 | And so what did evolution do to solve this problem?
00:49:12.700 | Well, it couldn't make our brains any bigger,
00:49:15.380 | so it just entrained other brains
00:49:17.820 | to help manage our nervous systems.
00:49:19.860 | So you bear the burden of other people's allostasis
00:49:24.420 | and they bear the burden of yours,
00:49:27.020 | not always at the same time,
00:49:28.760 | but that's what it means to give people support.
00:49:31.320 | When you're feeling horrible
00:49:33.140 | and somebody pats you on the back
00:49:34.580 | or says nice words to you or gives you a hug,
00:49:37.360 | they are physically having an effect on your body,
00:49:40.380 | that they are helping your body to maintain allostasis
00:49:44.480 | at a time when your brain probably
00:49:46.100 | couldn't manage it on its own.
00:49:48.460 | And so the basis of love or attachment is basically that.
00:49:52.820 | It's the ability to affect each other's nervous systems
00:49:56.220 | in a positive way.
00:49:57.380 | I always say to people,
00:49:59.820 | the best thing for a nervous system,
00:50:01.780 | a human nervous system, is another human.
00:50:04.420 | And the worst thing for a human nervous system
00:50:09.420 | is another human.
00:50:13.900 | Because we're social animals.
00:50:15.360 | - Wow, beautifully put.
00:50:17.620 | So maybe a few questions from the audience.
00:50:20.580 | Do you mind?
00:50:21.540 | There's microphones on both sides.
00:50:23.180 | Go ahead.
00:50:24.020 | - Sure.
00:50:24.840 | - Hi, thanks for talking here.
00:50:28.820 | You're saying cool stuff.
00:50:30.180 | - Thanks.
00:50:31.020 | - Not a question.
00:50:32.780 | (audience laughing)
00:50:33.600 | - It's okay.
00:50:34.440 | I'll take it, it's all right.
00:50:37.980 | - So I was thinking about what you were saying about reward,
00:50:43.660 | and I'm wondering, first of all,
00:50:46.500 | you described it as a return to allostasis.
00:50:50.460 | Is reward in any way linked to
00:50:55.460 | the reinforcement of pathways or behavior,
00:51:00.700 | so that the next time you get that stimulus,
00:51:03.180 | you'll respond in the same way?
00:51:06.420 | And I'm also wondering about the link between
00:51:10.560 | the desire for allostasis and the need for novelty
00:51:15.340 | and the need to explore our environment.
00:51:16.180 | - Yeah, it's a great question.
00:51:17.420 | It's a really great question.
00:51:18.780 | So the second question's so much more interesting
00:51:23.380 | than the first, actually.
00:51:24.860 | So let me say this, that there is a need for novelty.
00:51:29.860 | The need differs for different people, I will say.
00:51:36.560 | Novelty, so first of all,
00:51:39.460 | we can think about the need for novelty
00:51:45.980 | in a really proximal way,
00:51:49.380 | or we can think about it in a really distal way.
00:51:51.780 | But basically, when I say that a brain is organized
00:51:58.940 | or engineered for metabolic efficiency,
00:52:01.780 | that doesn't mean that the goal is for your brain,
00:52:06.360 | which is predicting all the time,
00:52:10.440 | to always have its predictions completely perfect,
00:52:12.880 | so that you never learn anything,
00:52:14.340 | because you'd be bored out of your mind, right?
00:52:17.000 | And also, humans like to expand their niche,
00:52:22.000 | they like to explore.
00:52:25.440 | So it's a constant balance between what biologists
00:52:29.960 | would call exploitation and exploration.
00:52:33.680 | Novelty, it's exciting,
00:52:38.320 | there's actually an increase in norepinephrine,
00:52:40.320 | an increase in arousal, it feels really exciting,
00:52:43.180 | but it's also super costly.
00:52:46.620 | Novelty requires usually that you learn something new,
00:52:51.620 | that means that's actually a really
00:52:53.260 | metabolically expensive thing to do.
00:52:55.780 | And it also means usually that you're moving
00:52:57.740 | your body around, which is also a metabolically
00:52:59.920 | expensive thing to do.
00:53:01.420 | So the need for novelty is balanced by its cost,
00:53:05.820 | and different nervous systems can bear
00:53:10.260 | different amounts of cost.
00:53:12.720 | So for example, if you take two rats
00:53:15.420 | that have been genetically bred,
00:53:19.520 | one of them,
00:53:21.380 | when you stick it in a novel cage,
00:53:27.020 | just sits still.
00:53:28.660 | And the other one, when you put it in a novel cage,
00:53:32.060 | it like roams all over the place,
00:53:33.780 | and it's just going crazy, kind of exploring everything.
00:53:37.580 | Well, the one that sits still, scientists might say,
00:53:40.220 | oh, that's a nervous rat, or that rat's afraid.
00:53:44.060 | What is that rat doing?
00:53:45.260 | The rat is not moving, and it's not encoding anything,
00:53:48.900 | 'cause encoding something is expensive.
00:53:53.000 | This rat, on the other hand, is roaming all over the place,
00:53:57.240 | moving a lot, learning a lot, so it's encoding a lot,
00:54:00.320 | spend, spend, spend, save, save, save,
00:54:03.040 | spend, spend, spend.
00:54:05.000 | There are differences between people,
00:54:07.240 | and there are also differences between times in your life,
00:54:09.520 | where moments where you feel like you have
00:54:10.960 | a little bit to spend, and other moments
00:54:13.040 | where you feel like you really have to conserve.
00:54:14.640 | When I talk to the public,
00:54:16.360 | I don't use the word allostasis,
00:54:17.780 | it's just too boring a word,
00:54:21.320 | but I explain it like a budget.
00:54:23.880 | Your brain is sort of like the financial office
00:54:26.600 | of a company.
00:54:27.680 | A company has lots of offices, and it has to balance
00:54:30.840 | the expenditures and revenues,
00:54:33.120 | so it might take a little money here
00:54:34.440 | and move it a little over to that office;
00:54:35.580 | it's gotta keep
00:54:36.960 | everything in balance.
00:54:38.480 | What it's always trying to do is spend a little bit
00:54:40.600 | to make a little bit more.
00:54:42.040 | What happens when it spends a little bit,
00:54:45.180 | and it doesn't get a revenue back?
00:54:46.680 | There's no reward, what happens?
00:54:48.180 | Well, it goes into the red a little bit,
00:54:49.640 | so what do you do when something goes into the red?
00:54:51.480 | Well, you might do something risky,
00:54:53.100 | you might actually spend a lot to try to really make,
00:54:56.960 | you know, not just make back your deficit,
00:54:58.520 | but actually make a lot.
00:55:00.000 | That would be novelty, that would be move and spend,
00:55:02.720 | move and encode.
00:55:04.360 | Or you might reduce your spending.
00:55:07.220 | You might say, well, I'm gonna save a little bit now.
00:55:10.840 | That would be, I'm not gonna move too much,
00:55:13.720 | I'm not gonna spend too much,
00:55:15.040 | I'm not gonna encode anything.
00:55:16.960 | So, I certainly don't mean to suggest to you
00:55:19.420 | that novelty is unimportant,
00:55:22.680 | or that learning is unimportant.
00:55:25.180 | And it's a really important question
00:55:27.900 | whether there is any intrinsic value to novelty
00:55:33.080 | over and above the rewards that it would give you
00:55:38.540 | in an allostatic sense.
00:55:40.440 | But it is really clear to me that the extent to which
00:55:47.760 | any nervous system will embrace novelty and even seek it,
00:55:52.020 | pretty much depends on the allostatic state of that system.
00:55:57.020 | If you don't have a lot to spend,
00:55:59.120 | and you're already in the red,
00:56:00.720 | at a certain point, if you continue to spend
00:56:04.320 | when you're in the red, you go bankrupt.
00:56:06.560 | What that means in human terms is you get depressed.
00:56:09.560 | It means that your brain makes you fatigued
00:56:12.120 | so you can't move, and it locks you in
00:56:15.920 | so you stop paying attention to anything going on
00:56:18.080 | around you in the world, and your experience
00:56:20.600 | is just what's in your head.
00:56:22.280 | That's actually what depression is.
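The budget analogy sketched here maps loosely onto an explore/exploit policy whose appetite for novelty depends on the agent's current reserve. A minimal toy sketch under that reading follows; the class name, costs, and payoffs are invented for illustration and are not taken from the talk.

```python
import random

class BudgetedAgent:
    """Toy agent whose willingness to explore depends on its energy budget.

    The numbers are illustrative only; the point is the policy shape:
    a comfortable budget favors exploration (spend to learn), a depleted
    one favors exploitation and conservation (save).
    """

    def __init__(self, energy_budget=10.0):
        self.energy_budget = energy_budget

    def explore_probability(self):
        # More reserve -> more appetite for novelty; deep in the red -> near zero.
        return max(0.0, min(1.0, self.energy_budget / 20.0))

    def step(self, known_payoff=1.0, novel_payoff_range=(-2.0, 5.0),
             explore_cost=2.0, exploit_cost=0.5):
        if random.random() < self.explore_probability():
            # Explore: expensive (movement plus encoding), uncertain return.
            reward = random.uniform(*novel_payoff_range) - explore_cost
            action = "explore"
        else:
            # Exploit: cheap, predictable return.
            reward = known_payoff - exploit_cost
            action = "exploit"
        self.energy_budget += reward
        return action, reward

agent = BudgetedAgent()
for _ in range(20):
    print(agent.step())
```

In this toy, an agent that starts with a depleted budget almost never explores, which is the "save, save, save" regime described above, while a flush one spends freely on novelty.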
00:56:24.200 | - So that makes me think of allostasis
00:56:28.400 | more as a range than as a zero point.
00:56:31.360 | - It's not homeostasis, it's allostasis.
00:56:33.800 | It's a range, for sure.
00:56:35.240 | But I'll answer some other questions,
00:56:37.680 | and maybe I'll get back to your first question
00:56:39.440 | if there's time.
00:56:40.280 | - So if I understand your argument correctly,
00:56:44.000 | if we're going to make anything like a general intelligence,
00:56:47.680 | something approaching, you know, like a human,
00:56:51.340 | it needs to be an embodied system.
00:56:55.720 | - Well, I want to be careful about saying that,
00:56:57.440 | for two reasons.
00:56:58.720 | One, because in biology, there is this concept
00:57:01.720 | of degeneracy, which is a sucky word,
00:57:04.360 | but it's a great concept, and it means that
00:57:06.740 | there's more than one way to skin a cat, basically.
00:57:09.600 | You want a functional outcome?
00:57:11.120 | There are many ways to get to that functional outcome.
00:57:13.560 | So genes, for example, you know,
00:57:15.800 | there are a lot of characteristics that are heritable,
00:57:18.720 | but we don't know the genes for them.
00:57:20.320 | And the reason why is that there isn't one set of genes.
00:57:22.440 | There are like multiple sets of genes
00:57:24.160 | that can give you actually the same outcome.
00:57:26.640 | So what I want to say is that
00:57:29.200 | you need something akin to a body.
00:57:32.480 | It doesn't actually have to be a body.
00:57:34.800 | I imagine there are lots of ways
00:57:36.800 | that you could implement a system,
00:57:39.400 | you could implement an agent that has multiple systems
00:57:42.200 | of some sort that it has to manage.
00:57:45.280 | But my point is that there is one very important thing
00:57:47.800 | that we continually miss
00:57:51.760 | when we think about building an agent with mental states:
00:57:55.040 | the fact that it has a body.
00:57:59.000 | Humans have bodies, and that's the brain's primary task.
00:58:02.880 | And our most fundamental feelings come from
00:58:05.720 | the physical changes in our body,
00:58:07.280 | even though we don't normally experience it that way.
00:58:11.520 | That actually is how it is.
00:58:13.160 | If you just look at the wiring of the brain,
00:58:14.720 | you can just see it.
00:58:16.040 | So it seems to me that if you want to build an agent
00:58:18.880 | that is human-like, it has to have something like a body.
00:58:23.880 | Doesn't have to maybe be a body.
00:58:26.160 | And I'm sure there are many clever ways
00:58:27.880 | that you could implement something like a body
00:58:30.800 | without it actually being a body,
00:58:32.680 | if you understand what I mean.
00:58:35.000 | - So Amazon Alexa could be there
00:58:40.080 | if we just gave it some sort of, I don't know,
00:58:42.920 | representation of mental states
00:58:44.520 | or some kind of allostatic target.
00:58:47.800 | - Sure.
00:58:48.640 | Let me just say one other thing,
00:58:52.320 | I think it's really important.
00:58:53.840 | All a brain requires is that you at some point had a body.
00:58:59.000 | So basically, this is what phantom limb pain is,
00:59:05.080 | this is what chronic pain is, this is what happens.
00:59:08.840 | If at some point you cease to get information
00:59:12.200 | from your body anymore, your brain still can simulate,
00:59:16.360 | still can reinstate the sensory patterns
00:59:19.600 | that once came from the body.
00:59:22.080 | And that's what's required.
00:59:23.600 | At some point, the body isn't really needed anymore.
00:59:28.400 | Yeah.
00:59:29.240 | I think we're here, yeah.
00:59:31.880 | - So emotions in humans look as though
00:59:34.320 | they're implemented by a bunch of hacks.
00:59:36.120 | So there's a bunch of chemicals that go into your brain,
00:59:39.920 | there's oxytocin, serotonin,
00:59:42.880 | a whole bunch of biochemical things
00:59:45.920 | that diffuse around in the fluids in your brain
00:59:48.040 | that affect your emotional state.
00:59:49.680 | And that seems like a hack that we've inherited
00:59:51.960 | over millions of years from primitive ancestors.
00:59:54.720 | And if you look at the machine learning world,
00:59:57.800 | we can do a bunch of similar things with neural nets.
00:59:59.640 | So you can increase the activation thresholds
01:00:01.400 | on a large scale, change the amount of noise
01:00:05.440 | going into the system,
01:00:06.520 | you can do a bunch of similar things,
01:00:08.160 | but you don't have to rely on fluids
01:00:11.160 | being kind of cleaned slowly by glial cells.
01:00:15.120 | Things don't diffuse around in the fluids
01:00:17.480 | in the system necessarily.
01:00:18.880 | Seems like there's a lot more flexibility.
01:00:21.840 | So when you come to implementation,
01:00:23.120 | there's not so many constraints
01:00:25.440 | imposed by the evolutionary history on the whole system.
01:00:28.440 | And it seems like that would make it work better.
01:00:30.200 | So when people are in negative emotional states,
01:00:32.960 | they can't think straight.
01:00:34.320 | - Oh, that's absolutely not true.
01:00:36.160 | People can think actually quite well
01:00:37.800 | in negative emotional states, I have to tell you.
01:00:40.320 | - But the emotions-- - They can plan crimes,
01:00:42.640 | they can do very nefarious things very, very effectively.
01:00:45.800 | - Right, but the general point I think is true
01:00:48.600 | even if that example is not.
01:00:49.800 | So emotions kind of flood your nervous system with--
01:00:54.800 | - No, emotions don't flood your nervous system.
01:00:57.120 | - So some of them do.
01:00:58.040 | So powerful love. - No, really, they don't.
01:01:00.840 | So here's the thing.
01:01:01.680 | Neuro-- - Chemicals.
01:01:02.520 | - Let's talk about-- - In your bloodstream.
01:01:04.280 | - Let's talk about those chemicals.
01:01:06.240 | Let's talk about those chemicals.
01:01:07.320 | There is not a single chemical in your brain
01:01:09.240 | or anywhere in your nervous system that is for emotion.
01:01:12.880 | Serotonin is not for emotion.
01:01:14.680 | Dopamine is not for emotion.
01:01:16.040 | Oxytocin is not for emotion.
01:01:18.720 | Even opioids are not for emotion.
01:01:23.240 | - So they influence your emotions.
01:01:24.560 | They're involved in emotions.
01:01:25.400 | - They influence every mental event that you have,
01:01:29.080 | not just emotion.
01:01:30.600 | So your brain, for example,
01:01:32.720 | is a physical thing,
01:01:35.880 | a set of physical cells
01:01:38.080 | that are bathed in neurochemicals.
01:01:40.800 | And the neurochemical systems that you're referring to
01:01:43.880 | basically change the ease with which information is passed
01:01:47.080 | back and forth between those neurons.
01:01:48.800 | That's always true.
01:01:50.600 | It's true regardless of whether the event is emotion
01:01:54.080 | or whether it's a perception or whether it's a thought
01:01:57.040 | or whether it's a belief.
01:01:58.720 | It's always true.
01:02:00.200 | So for example, serotonin,
01:02:02.600 | serotonin is a neurotransmitter
01:02:05.120 | that allows your brain to delay gratification of a reward.
01:02:10.120 | It allows you to expend energy now
01:02:13.960 | because you anticipate a reward at some point in the future.
01:02:18.040 | And if you have a deficit in serotonin,
01:02:20.960 | then you can't do that very well.
01:02:22.720 | And it turns out for humans,
01:02:24.120 | one of our great superpowers
01:02:26.040 | is the ability to do mental time travel,
01:02:30.120 | to remember in the past and also to do things now
01:02:33.560 | because we know they're gonna have an effect
01:02:35.480 | 10 or 20 or 50 or 100 years from now.
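The idea of expending energy now because a reward is anticipated later is, loosely, what a discounted-return calculation expresses in reinforcement learning. The mapping below, including reading reduced patience as a lower discount factor, is an editorial analogy rather than a claim from the talk:

```python
def discounted_return(rewards, gamma):
    """Value of a stream of future rewards, discounted by gamma per step."""
    return sum(r * gamma**t for t, r in enumerate(rewards))

# A costly action now (-1) that pays off later (+3 after five steps).
stream = [-1, 0, 0, 0, 0, 3]

print(discounted_return(stream, gamma=0.95))  # patient agent: worth doing (> 0)
print(discounted_return(stream, gamma=0.50))  # impatient agent: not worth it (< 0)
```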
01:02:38.880 | So when I said you have to have something like a body,
01:02:43.520 | I'm not saying literally
01:02:44.640 | you have to have a physical corporeal body.
01:02:47.520 | I'm saying you have to have,
01:02:49.640 | I mean, it's just a fact.
01:02:51.280 | It's not an argument.
01:02:52.400 | It's a fact that your brain,
01:02:54.640 | that brains evolved for the purposes
01:02:56.640 | of regulating multiple systems.
01:03:00.160 | And from a cybernetic standpoint,
01:03:02.760 | the best way to regulate a system
01:03:05.360 | is to build an internal model of it.
01:03:06.920 | That's what your brain is.
01:03:07.880 | Your brain is an internal model of your body in the world.
01:03:10.600 | It's running simulations.
01:03:12.080 | It's running this model.
01:03:13.600 | And if you wanna have an agent that is somewhat human-like,
01:03:17.320 | that has feelings like humans,
01:03:19.320 | then they have to be able to do something similar.
01:03:23.360 | And whether it's an actual physical corporeal body or not,
01:03:27.200 | I think that's an open question, right?
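The cybernetic point, that the best way to regulate a system is to build an internal model of it and run that model as a simulation, can be read as plain model-based control. A minimal sketch under that assumption follows; the one-variable "body", its dynamics, the set point, and the candidate actions are all invented for illustration:

```python
# Minimal predictive-regulation loop: an internal model of a one-variable
# "body" (say, an energy level) is used to act before the state drifts
# too far from its set point. All dynamics here are invented for illustration.

def body_dynamics(state, action, drain=0.3):
    # The real "body": loses energy each step, recovers with action.
    return state - drain + action

def internal_model(state, action, assumed_drain=0.25):
    # The brain's imperfect copy of the body; it can be run as a simulation.
    return state - assumed_drain + action

def choose_action(state, set_point=1.0, candidates=(0.0, 0.2, 0.4, 0.6)):
    # Pick the action whose predicted outcome lands closest to the set point.
    return min(candidates, key=lambda a: abs(internal_model(state, a) - set_point))

state = 1.0
for t in range(10):
    action = choose_action(state)          # regulate by simulating, not by waiting
    state = body_dynamics(state, action)   # the actual body responds
    print(f"t={t} action={action:.1f} state={state:.2f}")
```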
01:03:29.840 | - So it sounds a bit as though you're disputing my premises
01:03:31.960 | before I've got to my question.
01:03:33.080 | So I started off by saying--
01:03:34.480 | - Probably, sorry.
01:03:36.240 | - I started off by saying emotions are implemented
01:03:39.000 | as a bunch of hacks.
01:03:40.160 | So would you say that was broadly correct?
01:03:42.680 | Or would you say that they're not hacks,
01:03:45.360 | they're finely tuned and adaptive?
01:03:47.320 | And well, I wouldn't say they're not maladaptive,
01:03:49.800 | but would you say it's a bunch of hacks?
01:03:51.680 | - I don't think they're,
01:03:52.520 | I don't know what you mean by hacks.
01:03:53.920 | I think-- - So,
01:03:54.760 | I think they're kludges, kind of historical accidents
01:03:57.800 | that got fixed. - I think it's the wrong,
01:03:59.080 | you're talking about emotions as if they're things,
01:04:02.400 | and given the premise of your question,
01:04:04.200 | I can't answer your question,
01:04:05.200 | 'cause I think it's not the correct question.
01:04:07.480 | I mean, emotions aren't like blood pressure.
01:04:12.480 | They don't exist in that sense.
01:04:15.880 | They are the way that they are.
01:04:20.880 | First of all,
01:04:24.480 | not all people in all cultures have emotions.
01:04:28.160 | Everybody has affect,
01:04:29.920 | assuming they have a neurotypical brain of sorts,
01:04:32.680 | but they don't all have emotion.
01:04:36.080 | And so to answer the question that you're asking me is,
01:04:39.840 | I can't answer it 'cause I don't think it's the right,
01:04:42.680 | I don't think it's the right question.
01:04:43.520 | - So opioids and the pain systems seem like things
01:04:48.200 | that influence your emotions fairly directly.
01:04:51.920 | So it's not that it's the same thing,
01:04:54.000 | but it's that there's a powerful link.
01:04:56.560 | - Opioids though, so opioids are important
01:05:01.560 | for instances of emotion,
01:05:03.360 | but they are also important for every other category
01:05:08.080 | of mental event that your brain can make.
01:05:10.400 | - They affect a lot of other things too, I agree.
01:05:12.720 | - So you will see that sometimes scientists will assign
01:05:17.720 | an emotional meaning to something:
01:05:22.800 | first dopamine is for hedonic pleasantness,
01:05:26.080 | and then it's for reward.
01:05:27.640 | People like to assign single functions
01:05:30.320 | to biological entities like a chemical, a brain region,
01:05:35.320 | a cluster of neurons, and it's just not like that.
01:05:38.880 | I'm not saying every chemical
01:05:41.360 | does everything, but whatever opioids do,
01:05:44.680 | they do in every waking moment of your life,
01:05:48.320 | not just in moments that are emotional for you.
01:05:52.520 | I should stop now, thank you very much.
01:05:53.960 | - Thank you.
01:05:54.800 | (audience laughing)
01:05:57.160 | - Yeah.
01:05:58.400 | - I think it was very interesting
01:06:00.320 | that you brought up the last topic of love,
01:06:02.240 | because I think it actually brings up
01:06:03.840 | a really important thing, which is,
01:06:06.240 | and what you were saying about connection.
01:06:08.520 | That's, I mean, the primary purpose of life
01:06:12.840 | is to procreate, right?
01:06:14.600 | I mean, that's what our genes do, so.
01:06:17.840 | - First of all, right?
01:06:19.000 | - I'm not gonna be touching that today.
01:06:21.520 | - Okay.
01:06:22.360 | (audience laughing)
01:06:25.760 | - So that, and it's, of course, in humans,
01:06:30.760 | it's not just genetic, it's memetic.
01:06:33.200 | I mean, we procreate our ideas as well as physically.
01:06:38.200 | And so a primary purpose of our wanting,
01:06:42.880 | our reactions and interactions with other people
01:06:47.240 | and other things is that goal,
01:06:52.440 | is that we get rewarded when we interact with other things
01:06:57.000 | in a way that creates something new,
01:06:59.720 | whether it's art or a book or technology
01:07:03.680 | or something like that.
01:07:04.880 | And if we don't put that in,
01:07:07.840 | if there's no inherent reason for a robot
01:07:11.960 | or a computer system to want to do that,
01:07:14.960 | I mean, can we even imagine
01:07:20.960 | putting that into a system inherently, so that it wants to,
01:07:25.440 | that it just desperately wants to, make something new?
01:07:29.200 | I mean--
01:07:31.400 | - Well, here's what I would say.
01:07:32.240 | You know, first of all,
01:07:33.960 | we've been focusing a lot on bodies.
01:07:38.320 | I'm certainly not saying that's a sufficient condition,
01:07:40.320 | right, I'm just saying it's a necessary one,
01:07:42.320 | having a body or something like a body.
01:07:43.800 | But I certainly have the motivation to create,
01:07:48.080 | and I'm imagining that you do too.
01:07:50.240 | I have to tell you, not everybody has that.
01:07:52.400 | Most adolescents don't have that.
01:07:55.960 | Many adolescents don't have that.
01:07:57.320 | Okay, that was a joke.
01:07:58.480 | I have a teenage daughter, I'm just telling you.
01:08:01.120 | The point is that
01:08:05.840 | what people find rewarding is remarkably diverse.
01:08:11.080 | But the property, I think,
01:08:16.080 | is that for a lot of people, a feature of reward
01:08:19.840 | has to be having an impact in some way,
01:08:24.840 | having an impact on another person,
01:08:26.880 | or
01:08:30.240 | building something that wasn't there before, whatever,
01:08:34.080 | innovating or discovering.
01:08:37.800 | But it's not true for everybody.
01:08:39.280 | It's just true for a lot of people in this room, probably,
01:08:43.840 | and certainly the people that we probably
01:08:45.280 | spend a lot of time with, but not for everybody.
01:08:47.360 | - And then there's also the genetic part.
01:08:49.840 | - Sure, I mean, look, but if you wanna,
01:08:51.960 | we can certainly make a genetic argument here, absolutely.
01:08:57.240 | There's nothing that I've said today
01:09:04.560 | that is inconsistent with the idea
01:09:06.880 | that you have to pass your genes on to the next generation
01:09:12.480 | and actually make sure that that generation survives
01:09:15.360 | to reproductive age.
01:09:16.800 | It's not just enough to have offspring.
01:09:19.960 | You have to make sure the offspring survive
01:09:22.120 | to reproductive age.
01:09:24.000 | And there's a whole argument about
01:09:28.640 | the learning of social reality,
01:09:37.840 | concepts of social reality that we've been talking about,
01:09:42.880 | like money and emotions and so on,
01:09:46.200 | that makes that argument, right?
01:09:47.840 | That it's very expensive to have to encode everything
01:09:51.080 | in genes or to have to learn everything
01:09:54.560 | from scratch every generation.
01:09:56.960 | So instead, what we have is a system,
01:09:59.800 | a genetic system that allows us to wire the brains
01:10:02.680 | of the young with what we know,
01:10:05.720 | and that's what we do, basically.
01:09:56.960 | - All right, so I'm curious, regarding intentionality,
01:10:02.680 | about the analogy you drew
01:10:05.720 | between money and the perception of red
01:10:08.400 | and the fact that we have emotion.
01:10:24.920 | Because the distinguishing feature, it seems to me,
01:10:29.240 | is the level of intentionality.
01:10:31.040 | And as you said before, our brain assigns meaning to things.
01:10:35.080 | But we don't, or maybe, and my question is
01:10:37.800 | whether or not you agree with this, I guess.
01:10:40.480 | We don't always deliberately assign meaning to it.
01:10:43.320 | This is nothing I've said is about ever deliberate.
01:10:45.600 | - But sometimes we do.
01:10:46.760 | - Sometimes we do. - Quite often, actually.
01:10:48.880 | So when you go back to the question
01:10:51.000 | of what makes something intelligent,
01:10:52.760 | a lot of previous talks have been about,
01:10:55.040 | we want to pick a goal and then we create costs
01:10:59.480 | to achieve that goal, but that goal is deliberately assigned.
01:11:03.520 | So when you talk about what makes something intelligent,
01:11:07.120 | what do you think the role of intentionality is
01:11:10.840 | and the spectrum therein?
01:11:13.000 | - So first of all, when you talk about intentionality,
01:11:14.680 | I think you have to really be careful,
01:11:16.280 | because philosophers talk about intentionality
01:11:18.680 | in two ways.
01:11:20.440 | They talk about intentionality to mean a deliberate action,
01:11:23.600 | the way you mean it, but intentionality can also mean
01:11:26.200 | that something has a referent outside of you, really.
01:11:30.360 | So that a word has a referent,
01:11:33.120 | that's the intention of the word, basically.
01:11:37.640 | So I think you have to be really careful.
01:11:39.160 | I also think that you have to make a distinction
01:11:41.020 | between a conscious, deliberate goal
01:11:43.880 | that you can explicitly describe and one that isn't.
01:11:51.880 | I mean,
01:12:01.080 | the kind of question you're asking is getting very close
01:12:03.600 | to the question of free will,
01:12:05.320 | which I would love to not have to discuss.
01:12:07.480 | But basically-- - Okay.
01:12:10.080 | - And what I'm about to say is gonna sound
01:12:12.120 | very Cartesian, unfortunately, because that's English.
01:12:15.760 | I don't know, there's no other way to do it, actually.
01:12:18.400 | But what I wanna say is that your brain is always,
01:12:22.460 | there's always volition,
01:12:25.280 | but it's not always consciously experienced by you
01:12:27.800 | as agency or will.
01:12:30.480 | So you're not a sea urchin.
01:12:35.120 | Your sensory neurons are not hardwired
01:12:37.520 | to your motor neurons.
01:12:39.080 | You have interneurons.
01:12:40.320 | That means you have choice.
01:12:41.800 | Do you consciously experience yourself
01:12:43.500 | as making choices all the time?
01:12:44.760 | No, you don't, but your brain is actually making choices
01:12:47.820 | all the time.
01:12:49.000 | That's why people who study decision-making
01:12:52.120 | think they're studying the whole brain,
01:12:53.640 | 'cause they are, actually.
01:12:55.460 | So I think you have to be really careful about,
01:13:01.520 | there are words that we use in English and in science
01:13:06.100 | that have two meanings.
01:13:08.120 | They can have a meaning that is about decision-making
01:13:10.960 | or choice that is just obligatory, automatic,
01:13:14.460 | and a function of how the system works.
01:13:17.560 | And then there's the kind of choice that feels deliberate
01:13:20.600 | and effortful and where we feel like we're the agents.
01:13:23.800 | We experience ourselves as the agents.
01:13:26.120 | And intentionality usually can be assigned
01:13:31.660 | to the second one, but actually, in truth,
01:13:33.720 | in philosophical terms, it can also be assigned
01:13:35.640 | to the first.
01:13:37.640 | Even if you're completely unaware of having made a choice,
01:13:42.040 | you're acting on something with some degree of volition,
01:13:46.320 | because it's not a reflex.
01:13:49.160 | It's not like somebody hit your patellar tendon
01:13:52.280 | and you kick.
01:13:53.120 | - That's really interesting.
01:13:54.920 | - So I think you just have to make that distinction.
01:13:57.920 | And I probably should get to the next question.
01:13:59.520 | We can follow. - Last question.
01:14:00.640 | Yeah, thank you.
01:14:02.320 | - Yeah, so you've been asked a lot of esoteric questions
01:14:05.060 | about AI, but I think we might gain some insights
01:14:08.660 | by wondering about DI, that is dog intelligence.
01:14:11.800 | - Oh, sure.
01:14:13.220 | - So I believe I sort of understand what my dog is feeling.
01:14:18.220 | And I usually believe that my dog believes the same,
01:14:23.260 | but not the same with a cat.
01:14:27.300 | - Well, it's 'cause you're not a cat person.
01:14:29.500 | - Right, yeah, sure.
01:14:31.020 | (audience laughing)
01:14:32.060 | - And I don't know that much about monkeys,
01:14:35.940 | but I've never really seen a monkey be able
01:14:37.580 | to make the expressions that, let's say, a dog can.
01:14:40.700 | And I was wondering if you had any insights
01:14:43.980 | about why dogs are able to do this
01:14:46.140 | and why we're able to read dogs.
01:14:48.140 | Is it something just simple,
01:14:50.460 | like they have the right facial muscles,
01:14:53.020 | or is it something, some drive
01:14:54.860 | that allows them to learn this?
01:14:57.020 | - So I think before I wrote my book,
01:14:58.920 | I would have answered this question differently.
01:15:02.600 | But now here's what I would say.
01:15:04.440 | I think many creatures on this planet have affect, right?
01:15:09.440 | And we can debate about whether,
01:15:15.160 | I just came across a paper the other day
01:15:16.600 | about whether flies, Drosophila, have affect.
01:15:20.640 | And it's actually a really interesting question.
01:15:23.460 | They certainly have something like opioids,
01:15:25.400 | so it's an interesting question.
01:15:29.440 | But dogs, dogs are really interesting
01:15:32.600 | because they do seem to have some capacities
01:15:37.600 | that you only see in great apes.
01:15:42.880 | And they may have capacities that even great apes
01:15:45.700 | other than us don't have.
01:15:48.360 | And I mean, they certainly have some capacities
01:15:50.480 | we don't have either.
01:15:51.320 | But here's my point.
01:15:52.840 | We actually bred them.
01:15:55.240 | We bred these animals.
01:15:57.080 | We selected them, basically.
01:15:58.960 | It's not natural selection, it's artificial selection.
01:16:03.800 | And we selected them for a couple of things, right?
01:16:06.400 | If you look at the experiments on breeding,
01:16:08.960 | taking a fox, taking foxes and breeding them
01:16:11.400 | into what look like little dog-like animals,
01:16:14.040 | it's interesting what they can do.
01:16:15.720 | And one of the things they can do
01:16:16.720 | is move their facial muscles
01:16:18.120 | in a lot of ways; they have a lot more control
01:16:21.680 | over their facial muscles than chimpanzees
01:16:24.480 | and other apes do.
01:16:28.980 | And they also do
01:16:31.180 | joint attention really well with gaze.
01:16:36.660 | So this is something that really no other animal can do,
01:16:39.860 | I think, other than a dog and humans.
01:16:42.700 | And that is, they meet your gaze
01:16:47.700 | and they use gaze for reference.
01:16:50.940 | So they'll look at something and they'll look back at you.
01:16:53.820 | And that's actually partly,
01:16:55.420 | that's how we communicate with each other.
01:16:58.180 | Chimpanzees lose that ability
01:17:00.260 | after about 10 or 11 months of age.
01:17:03.860 | But dogs continually do it.
01:17:06.420 | And actually, joint attention, shared gaze,
01:17:10.220 | is how we communicate with our infants also.
01:17:12.540 | That's actually partly how we teach infants
01:17:14.380 | about what's important in the world is with gaze.
01:17:17.220 | So I think that dogs may actually have some capacities
01:17:21.300 | probably because we bred them to have those.
01:17:25.460 | We selected the animals, you know,
01:17:28.940 | and bred them to have those capacities.
01:17:31.860 | - That's very interesting, thank you.
01:17:33.420 | - Awesome, well with that, let's give Lisa a big hand.
01:17:35.820 | Thank you so much for being here.