
Rosalind Picard: Affective Computing, Emotion, Privacy, and Health | Lex Fridman Podcast #24


Chapters

0:00
0:01 Rosalind Picard
2:02 Human-Computer Interaction
6:16 Turing Test for Emotional Intelligence
11:47 Fake News
25:40 Interaction with Alexa
30:41 Recognizing Emotion

Whisper Transcript

00:00:00.000 | The following is a conversation with Rosalind Picard.
00:00:02.880 | She's a professor at MIT,
00:00:04.540 | director of the affective computing research group
00:00:06.880 | at the MIT Media Lab and co-founder of two companies,
00:00:10.180 | Affectiva and Empatica.
00:00:12.440 | Over two decades ago,
00:00:13.560 | she launched the field of affective computing
00:00:15.420 | with her book of the same name.
00:00:17.580 | This book described the importance of emotion
00:00:20.040 | in artificial and natural intelligence,
00:00:23.040 | and the vital role emotional communication
00:00:25.320 | has in the relationship between people in general
00:00:28.540 | and in human-robot interaction.
00:00:30.920 | I really enjoyed talking with Roz over so many topics,
00:00:34.040 | including emotion, ethics, privacy, wearable computing,
00:00:37.460 | and her recent research in epilepsy
00:00:39.720 | and even love and meaning.
00:00:42.600 | This conversation is part
00:00:44.000 | of the Artificial Intelligence Podcast.
00:00:46.040 | If you enjoy it, subscribe on YouTube, iTunes,
00:00:48.760 | or simply connect with me on Twitter @LexFridman,
00:00:51.920 | spelled F-R-I-D.
00:00:54.000 | And now here's my conversation with Rosalind Picard.
00:00:58.900 | More than 20 years ago,
00:01:00.740 | you coined the term affective computing
00:01:03.340 | and led a lot of research in this area since then.
00:01:06.720 | As I understand, the goal is to make the machine detect
00:01:09.260 | and interpret the emotional state of a human being
00:01:12.420 | and adapt the behavior of the machine
00:01:14.260 | based on the emotional state.
00:01:16.160 | So how has your understanding of the problem space
00:01:19.980 | defined by affective computing changed in the past 24 years?
00:01:25.400 | So it's the scope, the applications, the challenges,
00:01:28.920 | what's involved, how has that evolved over the years?
00:01:32.160 | - Yeah, actually, originally when I defined
00:01:34.660 | the term affective computing,
00:01:36.920 | it was a bit broader than just recognizing
00:01:40.140 | and responding intelligently to human emotion,
00:01:42.300 | although those are probably the two pieces
00:01:44.560 | that we've worked on the hardest.
00:01:47.140 | The original concept also encompassed machines
00:01:50.680 | that would have mechanisms that functioned
00:01:53.520 | like human emotion does inside them.
00:01:55.720 | It would be any computing that relates to,
00:01:58.320 | arises from, or deliberately influences human emotion.
00:02:01.640 | So the human-computer interaction part
00:02:05.200 | is the part that people tend to see.
00:02:07.920 | Like if I'm really ticked off at my computer
00:02:11.040 | and I'm scowling at it and I'm cursing at it,
00:02:13.520 | and it just keeps acting smiling and happy
00:02:15.760 | like that little paperclip used to do,
00:02:17.520 | like dancing, winking,
00:02:20.840 | that kind of thing just makes you even more frustrated.
00:02:24.720 | And I thought, that stupid thing needs to see my affect.
00:02:29.160 | And if it's gonna be intelligent,
00:02:30.680 | which Microsoft researchers had worked really hard on,
00:02:33.080 | actually had some of the most sophisticated AI in it
00:02:35.200 | at the time, that thing's gonna actually be smart.
00:02:38.040 | It needs to respond to me and you,
00:02:41.660 | and we can send it very different signals.
00:02:45.400 | - So by the way, just a quick interruption.
00:02:47.200 | The Clippy, maybe it's in Word 95 and 98,
00:02:52.200 | I don't remember when it was born,
00:02:54.400 | but many people, do you find yourself with that reference
00:02:58.360 | that people recognize what you're talking about
00:03:00.360 | still to this point?
00:03:01.720 | - I don't expect the newest students to these days,
00:03:05.200 | but I've mentioned it to a lot of audiences,
00:03:07.240 | like how many of you know this Clippy thing?
00:03:09.280 | And still the majority of people seem to know it.
00:03:11.760 | - So Clippy kind of looks at
00:03:13.880 | maybe natural language processing, what you were typing,
00:03:16.320 | and tries to help you complete, I think.
00:03:19.320 | I don't even remember what Clippy was, except annoying.
00:03:22.520 | - Yeah, some people actually liked it.
00:03:25.880 | - Well, I miss it. - I would hear those stories.
00:03:27.560 | You miss it?
00:03:28.520 | - Well, I miss the annoyance.
00:03:31.360 | It felt like there was an element--
00:03:34.120 | - Someone was there.
00:03:34.960 | - Somebody was there, and we're in it together,
00:03:37.000 | and they were annoying.
00:03:38.040 | It's like a puppy that just doesn't get it.
00:03:40.920 | They keep ripping up the couch kind of thing.
00:03:41.760 | - And in fact, they could have done it smarter like a puppy.
00:03:45.000 | If they had done, like if when you yelled at it
00:03:48.040 | or cursed at it, if it had put its little ears back
00:03:50.840 | and its tail down and slunk off,
00:03:53.000 | probably people would have wanted it back, right?
00:03:55.960 | But instead, when you yelled at it, what did it do?
00:03:58.640 | It smiled, it winked, it danced, right?
00:04:01.320 | If somebody comes to my office and I yell at them,
00:04:03.240 | they start smiling, winking, and dancing,
00:04:04.800 | I'm like, I never wanna see you again.
00:04:06.800 | So Bill Gates got a standing ovation
00:04:08.560 | when he said it was going away,
00:04:10.200 | 'cause people were so ticked.
00:04:12.440 | It was so emotionally unintelligent, right?
00:04:15.080 | It was intelligent about whether you were writing a letter,
00:04:18.200 | what kind of help you needed for that context.
00:04:20.920 | It was completely unintelligent about,
00:04:23.480 | hey, if you're annoying your customer,
00:04:25.820 | don't smile in their face when you do it.
00:04:28.440 | So that kind of mismatch was something
00:04:32.400 | the developers just didn't think about.
00:04:35.120 | And intelligence at the time was really all about math
00:04:39.560 | and language and chess and games,
00:04:44.160 | problems that could be pretty well-defined.
00:04:47.960 | Social-emotional interaction is much more complex
00:04:50.920 | than chess or Go or any of the games
00:04:53.680 | that people are trying to solve.
00:04:56.100 | And understanding that required skills
00:04:58.740 | that most people in computer science
00:05:00.360 | actually were lacking personally.
00:05:02.640 | - Well, let's talk about computer science.
00:05:03.840 | Have things gotten better since the work,
00:05:06.440 | since the message, since you've really launched a field
00:05:09.600 | with a lot of research work in this space?
00:05:11.400 | I still find as a person like yourself,
00:05:14.160 | who's deeply passionate about human beings,
00:05:16.760 | and yet I'm in computer science,
00:05:18.920 | there still seems to be a lack of,
00:05:20.800 | sorry to say, empathy in us computer scientists.
00:05:26.880 | - Yeah, well--
00:05:27.880 | - Or has it gotten better?
00:05:28.960 | - Let's just say there's a lot more variety
00:05:30.800 | among computer scientists these days.
00:05:32.520 | Computer scientists are a much more diverse group today
00:05:35.080 | than they were 25 years ago, and that's good.
00:05:39.080 | We need all kinds of people to become computer scientists
00:05:41.840 | so that computer science reflects more what society needs.
00:05:45.640 | And there's brilliance among every personality type,
00:05:49.160 | so it need not be limited to people
00:05:52.080 | who prefer computers to other people.
00:05:54.160 | - How hard do you think it is?
00:05:55.880 | Your view of how difficult it is to recognize emotion
00:05:58.660 | or to create a deeply emotionally intelligent interaction,
00:06:04.000 | has it gotten easier or harder as you've explored it further?
00:06:07.480 | And how far away are we from cracking this,
00:06:10.060 | if you think of the Turing test solving the intelligence,
00:06:16.080 | looking at the Turing test for emotional intelligence?
00:06:18.780 | - I think it is as difficult as I thought it was gonna be.
00:06:25.600 | I think my prediction of its difficulty is spot on.
00:06:29.280 | I think the time estimates are always hard
00:06:33.140 | because they're always a function of society's love
00:06:37.320 | and hate of a particular topic.
00:06:39.480 | If society gets excited and you get hundreds of,
00:06:43.520 | and you get thousands of researchers working on it
00:06:47.720 | for a certain application,
00:06:49.040 | that application gets solved really quickly.
00:06:52.040 | The general intelligence, the computer's complete lack
00:06:57.040 | of ability to have awareness of what it's doing,
00:07:03.220 | the fact that it's not conscious,
00:07:05.540 | the fact that there's no signs of it becoming conscious,
00:07:08.620 | the fact that it doesn't read between the lines,
00:07:11.820 | those kinds of things that we have to teach it explicitly,
00:07:15.060 | what other people pick up implicitly,
00:07:17.500 | we don't see that changing yet.
00:07:20.400 | There aren't breakthroughs yet that lead us to believe
00:07:23.580 | that that's gonna go any faster,
00:07:25.320 | which means that it's still gonna be kind of stuck
00:07:28.660 | with a lot of limitations
00:07:31.280 | where it's probably only gonna do the right thing
00:07:34.040 | in very limited, narrow, pre-specified contexts
00:07:37.160 | where we can prescribe pretty much
00:07:40.920 | what's gonna happen there.
00:07:42.840 | So I don't see the,
00:07:44.860 | it's hard to predict a date
00:07:48.000 | because when people don't work on it, it's infinite.
00:07:51.760 | When everybody works on it, you get a nice piece of it
00:07:54.520 | well-solved in a short amount of time.
00:07:58.600 | I actually think there's a more important issue right now
00:08:01.580 | than the difficulty of it,
00:08:04.540 | and that's causing some of us
00:08:05.820 | to put the brakes on a little bit.
00:08:07.420 | Usually we're all just like, step on the gas,
00:08:09.380 | let's go faster.
00:08:11.180 | This is causing us to pull back and put the brakes on,
00:08:14.200 | and that's the way that some of this technology
00:08:18.700 | is being used in places like China right now,
00:08:21.220 | and that worries me so deeply
00:08:24.520 | that it's causing me to pull back myself
00:08:27.780 | on a lot of the things that we could be doing
00:08:30.100 | and try to get the community to think a little bit more
00:08:33.700 | about, okay, if we're gonna go forward with that,
00:08:36.060 | how can we do it in a way that puts in place safeguards
00:08:39.300 | that protects people?
00:08:41.140 | - So the technology we're referring to
00:08:42.860 | is just when a computer senses the human being,
00:08:46.420 | like the human face, right?
00:08:48.580 | So there's a lot of exciting things there,
00:08:51.860 | like forming a deep connection with a human being,
00:08:53.940 | so what are your worries how that could go wrong?
00:08:57.700 | Is it in terms of privacy?
00:08:59.500 | Is it in terms of other kinds of more subtle things?
00:09:03.540 | - Let's dig into privacy.
00:09:04.380 | So here in the US, if I'm watching a video
00:09:07.820 | of say a political leader,
00:09:09.860 | and in the US we're quite free, as we all know,
00:09:13.660 | to even criticize the President of the United States, right?
00:09:17.940 | Here that's not a shocking thing.
00:09:19.460 | It happens about every five seconds, right?
00:09:22.700 | But in China, what happens if you criticize
00:09:27.700 | the leader of the government, right?
00:09:31.060 | And so people are very careful not to do that.
00:09:34.340 | However, what happens if you're simply watching a video
00:09:37.860 | and you make a facial expression
00:09:41.020 | that shows a little bit of skepticism, right?
00:09:45.300 | Well, and here we're completely free to do that.
00:09:48.180 | In fact, we're free to fly off the handle
00:09:50.700 | and say anything we want, usually.
00:09:54.700 | I mean, there are some restrictions,
00:09:56.580 | when the athlete does this as part of the national broadcast,
00:10:00.860 | maybe the teams get a little unhappy
00:10:03.860 | about picking that forum to do it, right?
00:10:05.900 | But that's more a question of judgment.
00:10:08.740 | We have these freedoms,
00:10:11.580 | and in places that don't have those freedoms,
00:10:14.180 | what if our technology can read
00:10:17.100 | your underlying affective state?
00:10:19.620 | What if our technology can read it even non-contact?
00:10:22.460 | What if our technology can read it
00:10:24.460 | without your prior consent?
00:10:28.860 | And here in the US, in my first company,
00:10:31.420 | we started Affectiva.
00:10:32.940 | We have worked super hard to turn away money
00:10:35.580 | and opportunities that try to read people's affect
00:10:38.420 | without their prior informed consent.
00:10:41.340 | And even the software that is licensable,
00:10:45.180 | you have to sign things saying you will only use it
00:10:47.700 | in certain ways, which essentially is get people's buy-in,
00:10:51.820 | right, don't do this without people agreeing to it.
00:10:55.420 | There are other countries where they're not interested
00:10:58.660 | in people's buy-in, they're just gonna use it,
00:11:01.500 | they're gonna inflict it on you,
00:11:03.100 | and if you don't like it, you better not scowl
00:11:05.980 | in the direction of any sensors.
00:11:08.540 | - So one, let me just comment on a small tangent.
00:11:11.500 | Do you know with the idea of adversarial examples
00:11:16.020 | and deepfakes and so on, what you bring up is actually,
00:11:20.860 | in that one sense, deepfakes provide
00:11:23.740 | a comforting protection that you can no longer really trust
00:11:28.740 | that the video of your face was legitimate,
00:11:34.580 | and therefore you always have an escape clause
00:11:37.060 | if a government is trying, if a stable, balanced,
00:11:43.540 | ethical government is trying to accuse you of something,
00:11:46.220 | at least you have protection, you can say it was fake news,
00:11:48.740 | as is a popular term now.
00:11:50.580 | - Yeah, that's the general thinking of it.
00:11:52.340 | We know how to go into the video and see, for example,
00:11:55.900 | your heart rate and respiration,
00:11:58.380 | and whether or not they've been tampered with.
00:12:02.180 | And we also can put fake heart rate and respiration
00:12:05.500 | in your video now, too. - Oh, interesting.
00:12:06.660 | - We decided we needed to do that.
00:12:08.580 | After we developed a way to extract it,
00:12:12.660 | we decided we also needed a way to jam it.
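
To make the extraction idea concrete, here is a minimal sketch of camera-based pulse estimation in Python. It is not the method Picard's group actually uses; it just averages the green channel over an assumed face region and picks the dominant frequency in a plausible heart-rate band. OpenCV, NumPy, and the `face_box` coordinates are assumptions for illustration.

```python
# Illustrative only: a crude remote-pulse estimate from face video.
# Real systems add face tracking, motion rejection, and validation.
import cv2
import numpy as np

def mean_green_trace(video_path, face_box):
    """Average the green channel inside a fixed face box, frame by frame."""
    x, y, w, h = face_box
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    trace = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        roi = frame[y:y + h, x:x + w]
        trace.append(roi[:, :, 1].mean())  # green channel carries most of the pulse signal
    cap.release()
    return np.array(trace), fps

def estimate_bpm(trace, fps):
    """Pick the dominant frequency in a plausible heart-rate band (0.7-4 Hz)."""
    trace = trace - trace.mean()
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    band = (freqs > 0.7) & (freqs < 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]
```

Jamming, in the same spirit, would amount to overwriting that subtle periodic color variation with a synthetic one; the interview does not say how their extraction or jamming actually works.
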
00:12:15.260 | So the fact that we took time to do that other step, too,
00:12:20.980 | that was time that I wasn't spending
00:12:22.620 | making the machine more affectively intelligent.
00:12:25.300 | And there's a choice in how we spend our time,
00:12:28.580 | which is now being swayed a little bit less by this goal
00:12:32.500 | and a little bit more by concern
00:12:34.420 | about what's happening in society
00:12:36.660 | and what kind of future do we wanna build.
00:12:38.940 | And as we step back and say,
00:12:41.740 | okay, we don't just build AI to build AI
00:12:44.700 | to make Elon Musk more money
00:12:46.620 | or to make Amazon's Jeff Bezos more money.
00:12:48.900 | Good gosh, that's the wrong ethic.
00:12:53.020 | Why are we building it?
00:12:54.260 | What is the point of building AI?
00:12:57.340 | It used to be it was driven by researchers in academia
00:13:01.660 | to get papers published and to make a career for themselves
00:13:04.300 | and to do something cool, right?
00:13:06.140 | 'Cause maybe it could be done.
00:13:08.660 | Now we realize that this is enabling rich people
00:13:12.620 | to get vastly richer.
00:13:14.380 | The poor, the divide is even larger.
00:13:19.940 | And is that the kind of future that we want?
00:13:23.060 | Maybe we wanna think about, maybe we wanna rethink AI.
00:13:26.100 | Maybe we wanna rethink the problems in society
00:13:29.300 | that are causing the greatest inequity
00:13:32.900 | and rethink how to build AI
00:13:35.180 | that's not about a general intelligence,
00:13:36.900 | that's about extending the intelligence
00:13:39.340 | and capability of the have-nots
00:13:41.260 | so that we close these gaps in society.
00:13:43.820 | - Do you hope that kind of stepping on the brake
00:13:46.660 | happens organically?
00:13:48.020 | Because I think still the majority of the force behind AI
00:13:51.220 | is the desire to publish papers,
00:13:52.780 | is to make money without thinking about the why.
00:13:55.820 | Do you hope it happens organically?
00:13:57.300 | Is there room for regulation?
00:13:59.020 | - Yeah, yeah, yeah, great questions.
00:14:02.980 | I prefer the, you know,
00:14:05.980 | they talk about the carrot versus the stick.
00:14:07.420 | I definitely prefer the carrot to the stick.
00:14:09.220 | And, you know, in our free world,
00:14:12.140 | there's only so much stick, right?
00:14:15.020 | You're gonna find a way around it.
00:14:17.340 | I generally think less regulation is better.
00:14:21.260 | That said, even though my position is classically carrot,
00:14:24.500 | no stick, no regulation,
00:14:26.340 | I think we do need some regulations in this space.
00:14:29.140 | I do think we need regulations
00:14:30.780 | around protecting people with their data,
00:14:33.700 | that you own your data, not Amazon, not Google.
00:14:38.340 | I would like to see people own their own data.
00:14:40.940 | I would also like to see the regulations
00:14:42.620 | that we have right now around lie detection
00:14:44.660 | being extended to emotion recognition in general.
00:14:48.300 | That right now you can't use a lie detector on an employee
00:14:51.140 | or on a candidate
00:14:52.900 | when you're interviewing them for a job.
00:14:54.860 | I think similarly, we need to put in place protection
00:14:57.900 | around reading people's emotions without their consent.
00:15:00.900 | And in certain cases, like characterizing them for a job
00:15:04.180 | and other opportunities.
00:15:06.180 | So I also think that when we're reading emotion
00:15:09.220 | that's predictive around mental health,
00:15:11.740 | that that should, even though it's not medical data,
00:15:14.220 | that that should get the kinds of protections
00:15:16.140 | that our medical data gets.
00:15:18.500 | What most people don't know yet
00:15:20.060 | is right now with your smartphone use,
00:15:22.620 | and if you're wearing a sensor
00:15:25.260 | and you wanna learn about your stress and your sleep
00:15:27.780 | and your physical activity
00:15:29.060 | and how much you're using your phone
00:15:30.860 | and your social interaction,
00:15:32.660 | all of that non-medical data,
00:15:34.980 | when we put it together with machine learning,
00:15:37.980 | now called AI, even though the founders of AI
00:15:40.180 | wouldn't have called it that,
00:15:41.700 | that capability can not only tell
00:15:46.660 | that you're calm right now
00:15:48.460 | or that you're getting a little stressed,
00:15:50.860 | but it can also predict how you're likely to be tomorrow.
00:15:53.940 | If you're likely to be sick or healthy,
00:15:55.860 | happy or sad, stressed or calm.
00:15:58.740 | - Especially when you're tracking data over time.
00:16:00.660 | - Especially when we're tracking a week of your data or more.
00:16:03.780 | - Do you have an optimism towards,
00:16:06.060 | a lot of people on our phones are worried
00:16:08.220 | about this camera that's looking at us.
00:16:10.340 | For the most part, on balance,
00:16:12.180 | are you optimistic about the benefits
00:16:16.100 | that can be brought from that camera
00:16:17.500 | that's looking at billions of us?
00:16:19.620 | Or should we be more worried?
00:16:22.100 | - I think we should be a little bit more worried
00:16:28.900 | about who's looking at us and listening to us.
00:16:32.540 | The device sitting on your countertop in your kitchen,
00:16:36.740 | whether it's Alexa or Google Home or Apple's Siri,
00:16:41.740 | these devices want to listen,
00:16:46.500 | ostensibly, they say, to help us.
00:16:49.700 | And I think there are great people in these companies
00:16:52.100 | who do wanna help people.
00:16:54.220 | Let me not brand them all bad.
00:16:56.180 | I'm a user of products from all of these companies.
00:16:59.380 | I'm naming all the A companies, Alphabet, Apple, Amazon.
00:17:04.380 | They are awfully big companies.
00:17:09.180 | They have incredible power.
00:17:11.580 | And what if China were to buy them?
00:17:16.580 | And suddenly all of that data,
00:17:19.940 | And suddenly all of that data
00:17:22.500 | were not part of free America,
00:17:24.460 | who just wants to take over the world
00:17:26.700 | and you submit to them.
00:17:27.940 | And guess what happens if you so much as smirk the wrong way
00:17:32.180 | when they say something that you don't like?
00:17:34.580 | Well, they have re-education camps, right?
00:17:37.500 | That's a nice word for them.
00:17:39.020 | By the way, they have a surplus of organs
00:17:41.460 | for people who have surgery these days.
00:17:43.340 | They don't have an organ donation problem
00:17:45.100 | 'cause they take your blood and they know you're a match.
00:17:48.100 | And the doctors are on record of taking organs
00:17:51.820 | from people who are perfectly healthy
00:17:53.540 | and not prisoners.
00:17:55.380 | They're just simply not the favored ones of the government.
00:17:58.620 | And that's a pretty freaky evil society.
00:18:04.580 | And we can use the word evil there.
00:18:06.500 | - I was born in the Soviet Union.
00:18:07.860 | I can certainly connect to the worry that you're expressing.
00:18:12.860 | At the same time, probably both you and I,
00:18:15.460 | and you very much so,
00:18:19.620 | there's an exciting possibility
00:18:23.180 | that you can have a deep connection with the machine.
00:18:27.780 | - Yeah, yeah.
00:18:28.700 | - Right, so.
00:18:29.540 | - Those of us, I've admitted students who say that they,
00:18:35.500 | you know, when you list like,
00:18:36.820 | who do you most wish you could have lunch with
00:18:39.420 | or dinner with, right?
00:18:40.540 | And they'll write like, I don't like people.
00:18:43.420 | I just like computers.
00:18:44.860 | And one of them said to me once
00:18:46.420 | when I had this party at my house,
00:18:49.580 | I want you to know,
00:18:51.180 | this is my only social event of the year,
00:18:53.220 | my one social event of the year.
00:18:55.620 | Like, okay, now this is a brilliant
00:18:57.700 | machine learning person, right?
00:18:59.340 | And we need that kind of brilliance in machine learning.
00:19:01.980 | And I love that computer science welcomes people
00:19:04.820 | who love people and people who are very awkward
00:19:07.220 | around people.
00:19:08.060 | I love that this is a field that anybody could join.
00:19:12.780 | We need all kinds of people.
00:19:15.020 | And you don't need to be a social person.
00:19:16.740 | I'm not trying to force people who don't like people
00:19:19.060 | to suddenly become social.
00:19:20.900 | At the same time,
00:19:23.980 | if most of the people building the AIs of the future
00:19:26.580 | are the kind of people who don't like people,
00:19:29.500 | we've got a little bit of a problem.
00:19:31.140 | - Well, hold on a second.
00:19:32.020 | So let me push back on that.
00:19:33.500 | So don't you think a large percentage of the world
00:19:37.780 | can, you know, there's loneliness.
00:19:40.980 | There is--
00:19:41.820 | - There's a huge problem with loneliness and it's growing.
00:19:44.500 | - And so there's a longing for connection.
00:19:47.660 | Do you--
00:19:48.860 | - If you're lonely, you're part of a big and growing group.
00:19:51.460 | - Yes, so we're in it together, I guess.
00:19:54.380 | If you're lonely, join the group.
00:19:56.220 | - You're not alone.
00:19:57.060 | - You're not alone.
00:19:58.020 | That's a good line.
00:19:59.020 | But do you think there's a,
00:20:03.220 | you talked about some worry,
00:20:04.660 | but do you think there's an exciting possibility
00:20:07.660 | that something like Alexa and these kinds of tools
00:20:11.620 | can alleviate that loneliness
00:20:14.300 | in a way that other humans can't?
00:20:16.740 | - Yeah, yeah, definitely.
00:20:19.020 | I mean, a great book can kind of alleviate loneliness,
00:20:22.220 | because you just get sucked into this amazing story
00:20:25.100 | and you can't wait to go spend time with that character.
00:20:27.860 | And they're not a human character.
00:20:30.460 | There is a human behind it.
00:20:32.340 | But yeah, it can be an incredibly delightful way
00:20:35.500 | to pass the hours.
00:20:37.060 | And it can meet needs, even, you know,
00:20:40.780 | I don't read those trashy romance books,
00:20:43.500 | but somebody does, right?
00:20:44.860 | And what are they getting from this?
00:20:46.300 | Well, probably some of that feeling of being there, right?
00:20:50.860 | Being there in that social moment,
00:20:53.060 | that romantic moment, or connecting with somebody.
00:20:56.100 | I've had a similar experience
00:20:57.700 | reading some science fiction books, right?
00:20:59.580 | Connecting with the character.
00:21:00.700 | Orson Scott Card, you know,
00:21:01.900 | just amazing writing and Ender's Game
00:21:05.380 | and Speaker for the Dead, terrible title.
00:21:07.700 | But those kind of books that pull you into a character
00:21:11.140 | and you feel like you're,
00:21:13.100 | you feel very social, it's very connected,
00:21:15.180 | even though it's not responding to you.
00:21:17.660 | And a computer, of course, can respond to you.
00:21:19.860 | So it can deepen it, right?
00:21:21.580 | You can have a very deep connection,
00:21:25.660 | much more than the movie Her, you know, plays up, right?
00:21:29.540 | - Well, much more.
00:21:30.780 | I mean, movie Her is already a pretty deep connection, right?
00:21:34.900 | - Well, but it's just a movie, right?
00:21:36.900 | It's scripted, it's just, you know.
00:21:38.900 | But I mean, like, there can be a real interaction
00:21:42.780 | where the character can learn and you can learn.
00:21:46.740 | You could imagine it not just being you and one character.
00:21:49.700 | You can imagine a group of characters.
00:21:51.740 | You can imagine a group of people and characters,
00:21:53.740 | human and AI connecting,
00:21:56.580 | where maybe a few people
00:21:58.500 | can't sort of be friends with everybody,
00:22:01.700 | but the few people and their AIs can befriend more people.
00:22:06.700 | There can be an extended human intelligence in there
00:22:10.460 | where each human can connect with more people that way.
00:22:14.980 | But it's still very limited, but there are just,
00:22:19.580 | what I mean is there are many more possibilities
00:22:21.660 | than what's in that movie.
00:22:22.860 | - So there's a tension here.
00:22:24.820 | So when you express a really serious concern about privacy,
00:22:28.340 | about how governments can misuse the information,
00:22:31.260 | and there's the possibility of this connection.
00:22:34.220 | So let's look at Alexa.
00:22:36.300 | So personal assistants.
00:22:38.900 | For the most part, as far as I'm aware,
00:22:40.860 | they ignore your emotion.
00:22:42.860 | They ignore even the context or the existence of you,
00:22:47.420 | the intricate, beautiful, complex aspects of who you are,
00:22:52.220 | except maybe aspects of your voice
00:22:54.180 | that help it recognize for speech recognition.
00:22:58.380 | Do you think they should move towards
00:23:00.620 | trying to understand your emotion?
00:23:03.220 | - All of these companies are very interested
00:23:05.020 | in understanding human emotion.
00:23:07.460 | They want, more people are telling Siri
00:23:10.980 | every day they wanna kill themselves.
00:23:13.740 | Apple wants to know the difference
00:23:15.260 | between if a person is really suicidal
00:23:17.620 | versus if a person is just kind of fooling around with Siri.
00:23:21.420 | The words may be the same.
00:23:24.060 | The tone of voice and what surrounds those words
00:23:27.980 | is pivotal to understand if they should respond
00:23:32.940 | in a very serious way, bring help to that person,
00:23:35.980 | or if they should kind of jokingly tease back,
00:23:40.300 | you know, ah, you just wanna, you know,
00:23:43.100 | sell me for something else, right?
00:23:45.060 | Like, how do you respond when somebody says that?
00:23:48.020 | Well, you know, you do wanna err on the side
00:23:50.620 | of being careful and taking it seriously.
00:23:52.660 | People wanna know if the person is happy or stressed
00:23:58.740 | in part, well, so let me give you an altruistic reason
00:24:03.220 | and a business profit motivated reason,
00:24:08.220 | and there are people in companies
00:24:10.060 | that operate on both principles.
00:24:12.780 | The altruistic people really care about their customers
00:24:17.220 | and really care about helping you feel a little better
00:24:19.420 | at the end of the day, and it would just make
00:24:21.860 | those people happy if they knew
00:24:23.180 | that they made your life better.
00:24:24.420 | If you came home stressed and after talking
00:24:27.060 | with their product, you felt better.
00:24:30.020 | There are other people who maybe have studied
00:24:33.060 | the way affect affects decision making
00:24:35.180 | and prices people pay, and they know,
00:24:37.940 | I don't know if I should tell you,
00:24:38.820 | like the work of Jen Lerner on heartstrings
00:24:42.580 | and purse strings, you know, if we manipulate you
00:24:45.980 | into a slightly sadder mood, you'll pay more, right?
00:24:50.860 | You'll pay more to change your situation.
00:24:53.860 | You'll pay more for something you don't even need
00:24:55.860 | to make yourself feel better.
00:24:58.060 | So, you know, if they sound a little sad,
00:25:00.180 | maybe I don't wanna cheer them up.
00:25:01.300 | Maybe first I wanna help them get something,
00:25:04.820 | a little shopping therapy, right, that helps them.
00:25:08.500 | - Which is really difficult for a company
00:25:09.900 | that's primarily funded on advertisement,
00:25:12.140 | so they're encouraged to get you--
00:25:15.380 | - To offer you products, or Amazon is primarily funded
00:25:17.860 | on you buying things from their store.
00:25:20.060 | So I think we should be, you know,
00:25:22.180 | maybe we need regulation in the future
00:25:24.140 | to put a little bit of a wall between these agents
00:25:27.260 | that have access to our emotion
00:25:29.140 | and agents that wanna sell us stuff.
00:25:32.300 | Maybe there needs to be a little bit more
00:25:35.580 | of a firewall in between those.
00:25:37.500 | - So maybe digging in a little bit
00:25:40.500 | on the interaction with Alexa,
00:25:42.220 | you mentioned, of course, a really serious concern
00:25:44.900 | about like recognizing emotion if somebody is speaking
00:25:48.140 | of suicide or depression and so on,
00:25:49.700 | but what about the actual interaction itself?
00:25:53.540 | Do you think, so if I, you know, you mentioned Clippy
00:25:58.900 | and being annoying, what is the objective function
00:26:03.020 | we're trying to optimize?
00:26:04.220 | Is it minimize annoyingness or maximize happiness?
00:26:09.220 | Or if we look at human to human relations,
00:26:12.460 | I think that push and pull, the tension, the dance,
00:26:15.500 | you know, the annoying, the flaws,
00:26:18.340 | that's what makes it fun.
00:26:19.860 | So is there a room for, like what is the objective function?
00:26:23.780 | - I mean, there are times when you wanna have
00:26:25.820 | a little push and pull.
00:26:26.740 | Think of kids sparring, right?
00:26:29.060 | You know, I see my sons and one of them wants
00:26:31.900 | to provoke the other to be upset and that's fun.
00:26:34.700 | And it's actually healthy to learn where your limits are,
00:26:38.460 | to learn how to self-regulate.
00:26:40.020 | You can imagine a game where it's trying to make you mad
00:26:42.940 | and you're trying to show self-control.
00:26:45.220 | And so if we're doing a AI human interaction
00:26:48.580 | that's helping build resilience and self-control,
00:26:51.180 | whether it's to learn how to not be a bully
00:26:53.980 | or how to turn the other cheek
00:26:55.500 | or how to deal with an abusive person in your life,
00:26:58.940 | then you might need an AI that pushes your buttons, right?
00:27:03.380 | But in general, do you want an AI that pushes your buttons?
00:27:07.820 | Probably depends on your personality.
00:27:12.100 | I don't.
00:27:13.220 | I want one that's respectful, that is there to serve me
00:27:18.180 | and that is there to extend my ability to do things.
00:27:23.180 | I'm not looking for a rival.
00:27:25.180 | I'm looking for a helper.
00:27:27.260 | And that's the kind of AI I'd put my money on.
00:27:30.220 | - Your sense is, for the majority of people in the world,
00:27:33.700 | in order to have a rich experience,
00:27:35.140 | that's what they're looking for as well.
00:27:37.100 | So they're not looking, if you look at the movie "Her",
00:27:39.620 | spoiler alert, I believe the program,
00:27:44.420 | the woman in the movie "Her" leaves the person
00:27:49.420 | for somebody else.
00:27:51.340 | Says they don't wanna be dating anymore.
00:27:53.980 | - Right, like do you, your sense is if Alexa said,
00:27:58.260 | you know what, I'm actually had enough of you for a while,
00:28:02.820 | so I'm gonna shut myself off.
00:28:04.860 | You don't see that as--
00:28:07.100 | - I'd say you're trash 'cause I paid for you, right?
00:28:09.900 | We've gotta remember, and this is where
00:28:15.180 | this blending human AI as if we're equals
00:28:19.980 | is really deceptive because AI is something
00:28:24.980 | at the end of the day that my students
00:28:27.540 | and I are making in the lab.
00:28:28.940 | And we're choosing what it's allowed to say,
00:28:33.180 | when it's allowed to speak, what it's allowed to listen to,
00:28:36.660 | what it's allowed to act on given the inputs
00:28:40.820 | that we choose to expose it to,
00:28:43.460 | what outputs it's allowed to have.
00:28:45.940 | It's all something made by a human.
00:28:49.340 | And if we wanna make something
00:28:50.540 | that makes our lives miserable, fine.
00:28:52.900 | I wouldn't invest in it as a business,
00:28:55.420 | unless it's just there for self-regulation training.
00:28:59.540 | But I think we need to think about
00:29:01.980 | what kind of future we want.
00:29:02.820 | And actually your question, I really like the,
00:29:05.540 | what is the objective function?
00:29:06.740 | Is it to calm people down?
00:29:09.340 | Sometimes.
00:29:10.580 | Is it to always make people happy and calm them down?
00:29:13.380 | Well, there was a book about that, right?
00:29:16.140 | The Brave New World, make everybody happy,
00:29:18.860 | take your Soma if you're unhappy, take your happy pill.
00:29:22.540 | And if you refuse to take your happy pill,
00:29:24.380 | well, we'll threaten you by sending you to Iceland
00:29:27.580 | to live there.
00:29:29.620 | I lived in Iceland three years.
00:29:30.860 | It's a great place.
00:29:31.860 | Don't take your Soma, go to Iceland.
00:29:33.940 | Little TV commercial there.
00:29:36.780 | I was a child there for a few years.
00:29:39.260 | It's a wonderful place.
00:29:40.620 | So that part of the book never scared me.
00:29:43.260 | But really like, do we want AI to manipulate us
00:29:46.740 | into submission, into making us happy?
00:29:49.100 | Well, if you are a, you know,
00:29:52.660 | like a power obsessed, sick dictator individual
00:29:56.100 | who only wants to control other people
00:29:57.660 | to get your jollies in life,
00:29:59.220 | then yeah, you want to use AI to extend your power
00:30:02.900 | and your scale to force people into submission.
00:30:07.100 | If you believe that the human race
00:30:09.500 | is better off being given freedom
00:30:11.140 | and the opportunity to do things that might surprise you,
00:30:15.380 | then you want to use AI to extend people's ability.
00:30:19.260 | To build, you want to build AI
00:30:20.940 | that extends human intelligence,
00:30:22.940 | that empowers the weak and helps balance the power
00:30:27.300 | between the weak and the strong,
00:30:28.820 | not that gives more power to the strong.
00:30:31.020 | - So in this process of empowering people
00:30:37.260 | and sensing people, what is your sense on emotion
00:30:41.260 | in terms of recognizing emotion?
00:30:42.660 | The difference between emotion that is shown
00:30:44.700 | and emotion that is felt.
00:30:46.700 | So yeah, emotion that is expressed on the surface
00:30:51.700 | through your face, your body, and various other things,
00:30:56.660 | and what's actually going on deep inside
00:30:58.940 | on the biological level, on the neuroscience level,
00:31:01.820 | or some kind of cognitive level.
00:31:03.740 | - Yeah, yeah.
00:31:04.900 | Whoa, no easy questions here.
00:31:07.900 | - Well, yeah, I'm sure there's no definitive answer,
00:31:11.340 | but what's your sense?
00:31:12.420 | How far can we get by just looking at the face?
00:31:15.340 | - We're very limited when we just look at the face,
00:31:18.540 | but we can get further than most people think we can get.
00:31:22.020 | People think, "Hey, I have a great poker face,
00:31:25.940 | "therefore all you're ever gonna get from me is neutral."
00:31:28.380 | Well, that's naive.
00:31:30.220 | We can read with the ordinary camera
00:31:32.780 | on your laptop or on your phone.
00:31:35.020 | We can read from a neutral face if your heart is racing.
00:31:39.420 | We can read from a neutral face
00:31:41.380 | if your breathing is becoming irregular
00:31:44.820 | and showing signs of stress.
00:31:46.980 | We can read under some conditions
00:31:50.780 | that maybe I won't give you details on,
00:31:53.300 | how your heart rate variability power is changing.
00:31:57.140 | That could be a sign of stress,
00:31:58.700 | even when your heart rate is not necessarily accelerating.
00:32:02.460 | - Sorry, from physio sensors or from the face?
00:32:06.140 | - From the color changes that you cannot even see,
00:32:09.220 | but the camera can see.
00:32:11.420 | - That's amazing.
00:32:12.260 | So you can get a lot of signal, but--
00:32:15.140 | - So we get things people can't see using a regular camera.
00:32:18.460 | And from that, we can tell things about your stress.
00:32:21.700 | So if you were just sitting there with a blank face,
00:32:25.340 | thinking nobody can read my emotion, well, you're wrong.
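
The heart rate variability "power" mentioned above refers to standard HRV analysis. Below is a rough sketch of textbook HRV features computed from detected beat times, whatever their source (a camera-derived pulse trace or a wearable); the frequency bands are conventional defaults, not the undisclosed details she alludes to.

```python
# Illustrative HRV features from heartbeat times (seconds).
import numpy as np

def hrv_features(beat_times_s):
    ibi = np.diff(beat_times_s)                  # inter-beat intervals, seconds
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))  # short-term variability
    # Resample the irregular IBI series onto a uniform 4 Hz grid for a crude spectrum.
    t = np.cumsum(ibi)
    grid = np.arange(t[0], t[-1], 0.25)
    ibi_u = np.interp(grid, t, ibi)
    ibi_u = ibi_u - ibi_u.mean()
    psd = np.abs(np.fft.rfft(ibi_u)) ** 2
    f = np.fft.rfftfreq(len(ibi_u), d=0.25)
    lf = psd[(f >= 0.04) & (f < 0.15)].sum()     # low-frequency power
    hf = psd[(f >= 0.15) & (f < 0.40)].sum()     # high-frequency power
    return {"mean_hr_bpm": 60.0 / ibi.mean(),
            "rmssd_s": rmssd,
            "lf_hf_ratio": lf / hf if hf > 0 else float("nan")}
```

A shifting LF/HF ratio or falling RMSSD is one conventional (and debated) proxy for stress, which is the kind of change she describes reading even when the heart rate itself is not accelerating.
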
00:32:28.220 | - Right, so that's really interesting,
00:32:31.620 | but that's from sort of visual information from the face.
00:32:34.260 | That's almost like cheating your way
00:32:36.860 | to the physiological state of the body
00:32:38.860 | by being very clever with what you can do with vision.
00:32:42.500 | - With signal processing.
00:32:43.340 | - With signal processing.
00:32:44.180 | So that's really impressive.
00:32:45.060 | But if you just look at the stuff we humans can see,
00:32:48.340 | the smile, the smirks, the subtle, all the facial expressions.
00:32:54.100 | - So then you can hide that on your face
00:32:55.740 | for a limited amount of time.
00:32:57.020 | Now, if you're just going in for a brief interview
00:33:00.340 | and you're hiding it, that's pretty easy for most people.
00:33:03.660 | If you are, however, surveilled constantly everywhere you go,
00:33:08.580 | then it's gonna say, "Gee, Lex used to smile a lot,
00:33:13.060 | "and now I'm not seeing so many smiles.
00:33:15.660 | "And Roz used to laugh a lot
00:33:19.980 | "and smile a lot very spontaneously,
00:33:21.940 | "and now I'm only seeing these
00:33:23.900 | "not-so-spontaneous-looking smiles,
00:33:26.140 | "and only when she's asked these questions."
00:33:28.620 | You know?
00:33:29.860 | That's something to change to.
00:33:30.700 | - Probably not getting enough sleep.
00:33:33.460 | - We could look at that too.
00:33:34.940 | So now, I have to be a little careful too.
00:33:36.780 | When I say we, you think we can't read your emotion,
00:33:40.660 | and we can, it's not that binary.
00:33:42.500 | What we're reading is more some physiological changes
00:33:45.700 | that relate to your activation.
00:33:48.500 | Now, that doesn't mean that we know everything
00:33:51.540 | about how you feel.
00:33:52.380 | In fact, we still know very little about how you feel.
00:33:54.620 | Your thoughts are still private.
00:33:56.660 | Your nuanced feelings are still completely private.
00:34:00.860 | We can't read any of that.
00:34:02.660 | So there's some relief that we can't read that,
00:34:06.580 | even brain imaging can't read that,
00:34:09.340 | wearables can't read that.
00:34:11.820 | However, as we read your body state changes,
00:34:15.540 | and we know what's going on in your environment,
00:34:18.100 | and we look at patterns of those over time,
00:34:21.020 | we can start to make some inferences
00:34:24.540 | about what you might be feeling.
00:34:26.500 | And that is where it's not just the momentary feeling,
00:34:30.940 | but it's more your stance toward things.
00:34:33.740 | And that could actually be a little bit more scary
00:34:36.620 | with certain kinds of governmental control freak people
00:34:41.620 | who want to know more about,
00:34:44.900 | are you on their team or are you not?
00:34:47.900 | - And getting that information through over time.
00:34:49.940 | So you're saying there's a lot of signal
00:34:51.220 | by looking at the change over time.
00:34:53.300 | - Yeah.
00:34:54.140 | - So you've done a lot of exciting work,
00:34:56.180 | both in computer vision and physiological sense,
00:34:58.980 | like wearables.
00:35:00.180 | What do you think is the best modality for,
00:35:03.100 | what's the best window into the emotional soul?
00:35:08.100 | Is it the face, is it the voice?
00:35:10.220 | - Depends what you want to know.
00:35:11.980 | It depends what you want to know.
00:35:14.020 | Everything is informative.
00:35:15.660 | Everything we do is informative.
00:35:17.460 | - So for health and wellbeing and things like that,
00:35:20.220 | do you find the wearable,
00:35:21.620 | measuring physiological signals
00:35:24.900 | is the best for health-based stuff?
00:35:29.380 | - So here I'm going to answer empirically
00:35:31.900 | with data and studies we've been doing.
00:35:34.700 | We've been doing studies now.
00:35:36.300 | These are currently running
00:35:38.300 | with lots of different kinds of people,
00:35:39.620 | but where we've published data,
00:35:41.940 | and I can speak publicly to it,
00:35:44.140 | the data are limited right now
00:35:45.540 | to New England college students.
00:35:47.740 | So that's a small group.
00:35:49.260 | Among New England college students,
00:35:52.500 | when they are wearing a wearable,
00:35:55.900 | like the Empatica Embrace here,
00:35:57.660 | that's measuring skin conductance, movement, temperature.
00:36:01.740 | And when they are using a smartphone
00:36:05.820 | that is collecting their time of day
00:36:09.380 | of when they're texting, who they're texting,
00:36:12.140 | their movement around it, their GPS,
00:36:14.180 | the weather information based upon their location.
00:36:18.020 | And when it's using machine learning
00:36:19.380 | and putting all of that together
00:36:20.900 | and looking not just at right now,
00:36:22.940 | but looking at your rhythm of behaviors
00:36:26.980 | over about a week.
00:36:28.580 | When we look at that,
00:36:29.940 | we are very accurate at forecasting tomorrow's stress,
00:36:33.580 | mood, happy or sad, and health.
00:36:38.580 | And when we look at which pieces of that are most useful,
00:36:42.660 | first of all, if you have all the pieces,
00:36:45.620 | you get the best results.
00:36:47.100 | If you have only the wearable,
00:36:50.620 | you get the next best results.
00:36:53.700 | And that's still better than 80% accurate
00:36:56.580 | at forecasting tomorrow's levels.
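
As a rough illustration of the forecasting setup described here, the sketch below stacks a week of daily features (wearable plus phone) to predict the next day's self-reported label. The column names, the random forest, and the split are assumptions for illustration, not the published study's features, model, or evaluation protocol.

```python
# Illustrative next-day forecasting from a week of daily summaries.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def make_windows(daily, feature_cols, label_col, window=7):
    """Stack `window` days of features to predict the following day's label."""
    X, y = [], []
    for i in range(window, len(daily)):
        X.append(daily[feature_cols].iloc[i - window:i].to_numpy().ravel())
        y.append(daily[label_col].iloc[i])
    return pd.DataFrame(X), pd.Series(y)

# Hypothetical usage with made-up column names:
# X, y = make_windows(daily_df, ["eda_mean", "steps", "screen_minutes"], "stressed")
# X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)
# model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
# print(accuracy_score(y_te, model.predict(X_te)))
```
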
00:36:58.620 | - Isn't that exciting?
00:37:00.980 | Because the wearable stuff with physiological information,
00:37:05.420 | it feels like it violates privacy less
00:37:08.100 | than the non-contact face-based methods.
00:37:12.820 | - Yeah, it's interesting.
00:37:14.140 | I think what people sometimes don't,
00:37:16.620 | it's fine in the early days people would say,
00:37:18.940 | oh, wearing something or giving blood is invasive, right?
00:37:22.660 | Whereas a camera is less invasive
00:37:24.580 | 'cause it's not touching you.
00:37:27.100 | I think on the contrary,
00:37:28.540 | the things that are not touching you are maybe the scariest
00:37:31.420 | because you don't know when they're on or off.
00:37:34.100 | And you don't know when,
00:37:36.300 | and you don't know who's behind it, right?
00:37:40.140 | A wearable, depending upon what's happening
00:37:43.700 | to the data on it,
00:37:45.060 | if it's just stored locally,
00:37:46.940 | or if it's streaming,
00:37:48.540 | and what it is being attached to,
00:37:52.620 | in a sense you have the most control over it
00:37:54.940 | because it's also very easy to just take it off, right?
00:37:59.420 | Now it's not sensing me.
00:38:01.460 | So if I'm uncomfortable with what it's sensing,
00:38:04.380 | now I'm free, right?
00:38:07.220 | If I'm comfortable with what it's sensing,
00:38:09.940 | then, and I happen to know everything about this one
00:38:12.820 | and what it's doing with it,
00:38:13.740 | so I'm quite comfortable with it,
00:38:15.620 | then I'm, I have control, I'm comfortable.
00:38:20.260 | Control is one of the biggest factors
00:38:23.340 | for an individual in reducing their stress.
00:38:26.620 | If I have control over it,
00:38:28.180 | if I know all there is to know about it,
00:38:30.220 | then my stress is a lot lower,
00:38:32.500 | and I'm making an informed choice
00:38:34.980 | about whether to wear it or not,
00:38:36.660 | or when to wear it or not.
00:38:38.060 | I wanna wear it sometimes, maybe not others.
00:38:40.380 | - Right, so that control, yeah, I'm with you.
00:38:42.780 | That control, even if, yeah, the ability to turn it off,
00:38:47.780 | that is a really empowering thing.
00:38:49.060 | It's huge, and we need to maybe,
00:38:52.140 | if there's regulations, maybe that's number one to protect,
00:38:55.100 | is people's ability to opt out as easily as they opt in.
00:39:00.100 | - So you've studied a bit of neuroscience as well.
00:39:04.260 | How has looking at our own minds,
00:39:08.220 | sort of the biological stuff, or the neurobiological,
00:39:12.860 | the neuroscience, look at the signals in our brain,
00:39:17.380 | helped you understand the problem
00:39:18.740 | and the approach of affective computing?
00:39:20.740 | - Originally, I was a computer architect,
00:39:23.700 | and I was building hardware and computer designs,
00:39:26.300 | and I wanted to build ones that work like the brain.
00:39:28.260 | So I've been studying the brain
00:39:29.900 | as long as I've been studying how to build computers.
00:39:32.540 | - Have you figured out anything yet?
00:39:36.180 | - Very little.
00:39:37.020 | It's so amazing.
00:39:39.140 | They used to think, oh, if you remove this chunk of the brain
00:39:42.980 | and you find this function goes away,
00:39:44.460 | well, that's the part of the brain that did it.
00:39:46.140 | And then later they realized,
00:39:47.260 | if you remove this other chunk of the brain,
00:39:48.740 | that function comes back, and oh no,
00:39:51.020 | we really don't understand it.
00:39:53.220 | Brains are so interesting, and changing all the time,
00:39:56.580 | and able to change in ways
00:39:58.500 | that will probably continue to surprise us.
00:40:01.140 | When we were measuring stress, you may know the story
00:40:05.940 | where we found an unusual big skin conductance pattern
00:40:10.180 | on one wrist in one of our kids with autism.
00:40:13.140 | And in trying to figure out how on earth
00:40:15.860 | you could be stressed on one wrist and not the other,
00:40:18.100 | like how can you get sweaty on one wrist, right?
00:40:20.500 | When you get stressed
00:40:21.900 | with that sympathetic fight or flight response,
00:40:23.660 | like you kind of should like sweat more
00:40:25.500 | in some places than others,
00:40:26.620 | but not more on one wrist than the other.
00:40:28.300 | That didn't make any sense.
00:40:29.700 | We learned that what had actually happened
00:40:33.500 | was a part of his brain had unusual electrical activity,
00:40:37.580 | and that caused an unusually large sweat response
00:40:41.660 | on one wrist and not the other.
00:40:44.460 | And since then, we've learned that seizures
00:40:46.940 | cause this unusual electrical activity.
00:40:49.740 | And depending where the seizure is,
00:40:51.620 | if it's in one place and it's staying there,
00:40:53.780 | you can have a big electrical response
00:40:55.940 | we can pick up with a wearable at one part of the body.
00:40:58.900 | You can also have a seizure
00:40:59.820 | that spreads over the whole brain,
00:41:00.900 | generalized grand mal seizure,
00:41:02.820 | and that response spreads,
00:41:04.460 | and we can pick it up pretty much anywhere.
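
A toy version of flagging unusually large skin-conductance responses in a wearable signal might look like the sketch below. It is purely illustrative: the threshold and window are made up, and Empatica's FDA-cleared seizure detection also uses movement data and clinically validated algorithms that are not reproduced here.

```python
# Illustrative detector for large electrodermal (skin conductance) responses.
import numpy as np

def large_scr_events(eda_us, fs=4.0, rise_us=1.0, window_s=10.0):
    """Return onset times (seconds) where skin conductance rises by more than
    `rise_us` microsiemens within `window_s` seconds."""
    eda = np.asarray(eda_us, dtype=float)
    n = int(window_s * fs)
    onsets = []
    for i in range(len(eda) - n):
        if eda[i + n] - eda[i] > rise_us:
            t = i / fs
            # Keep only one onset per window-length stretch.
            if not onsets or t - onsets[-1] > window_s:
                onsets.append(t)
    return onsets
```
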
00:41:06.620 | As we learned this, and then later built an embrace
00:41:10.620 | that's now FDA cleared for seizure detection,
00:41:13.540 | we have also built relationships
00:41:16.180 | with some of the most amazing doctors in the world
00:41:18.860 | who not only help people
00:41:20.820 | with unusual brain activity or epilepsy,
00:41:23.460 | but some of them are also surgeons,
00:41:24.860 | and they're going in and they're implanting electrodes,
00:41:27.580 | not just to momentarily read the strange patterns
00:41:31.620 | of brain activity that we'd like to see return to normal,
00:41:35.460 | but also to read out continuously what's happening
00:41:37.740 | in some of these deep regions of the brain
00:41:39.500 | during most of life when these patients are not seizing.
00:41:42.020 | Most of the time, they're not seizing.
00:41:43.340 | Most of the time, they're fine.
00:41:45.340 | And so we are now working on mapping
00:41:48.300 | those deep brain regions
00:41:50.340 | that you can't even usually get with EEG scalp electrodes
00:41:54.060 | 'cause the changes deep inside don't reach the surface.
00:41:58.140 | But interesting, when some of those regions are activated,
00:42:01.700 | we see a big skin conductance response.
00:42:04.620 | Who would have thunk it, right?
00:42:06.100 | Like nothing here, but something here.
00:42:08.180 | In fact, right after seizures
00:42:10.740 | that we think are the most dangerous ones
00:42:12.900 | that precede what's called SUDEP,
00:42:14.420 | sudden unexpected death in epilepsy,
00:42:16.380 | there's a period where the brain waves go flat
00:42:20.100 | and it looks like the person's brain has stopped,
00:42:21.980 | but it hasn't.
00:42:23.460 | The activity has gone deep into a region
00:42:26.740 | that can make the cortical activity look flat,
00:42:29.460 | like a quick shutdown signal here.
00:42:31.460 | It can unfortunately cause breathing to stop
00:42:35.540 | if it progresses long enough.
00:42:37.340 | Before that happens,
00:42:39.620 | we see a big skin conductance response
00:42:42.100 | in the data that we have.
00:42:43.780 | The longer this flattening, the bigger our response here.
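
In analysis terms, the relationship she describes ("the longer this flattening, the bigger our response") is a correlation between per-seizure EEG suppression duration and skin-conductance response size. A minimal sketch with hypothetical variable names, not the published analysis:

```python
# Illustrative association test between EEG flattening and EDA response size.
import numpy as np
from scipy import stats

def flattening_vs_scr(flattening_durations_s, scr_amplitudes_us):
    """Correlate per-seizure EEG suppression duration (s) with the matching
    skin-conductance response amplitude (microsiemens)."""
    r, p = stats.pearsonr(np.asarray(flattening_durations_s, dtype=float),
                          np.asarray(scr_amplitudes_us, dtype=float))
    return r, p
```
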
00:42:46.900 | So we have been trying to learn initially,
00:42:49.460 | like why are we getting a big response here
00:42:51.740 | when there's nothing here?
00:42:52.580 | Well, it turns out there's something much deeper.
00:42:55.620 | So we can now go inside the brains
00:42:57.780 | of some of these individuals, fabulous people
00:43:01.340 | who usually aren't seizing
00:43:03.500 | and get this data and start to map it.
00:43:05.660 | So that's the active research that we're doing right now
00:43:07.620 | with top medical partners.
00:43:09.380 | - So this wearable sensor that's looking,
00:43:12.060 | skin conductance can capture sort of the ripples
00:43:15.220 | of the complexity of what's going on in our brain.
00:43:18.900 | So this little device,
00:43:21.100 | you have a hope that you can start to get the signal
00:43:24.980 | from the interesting things happening in the brain.
00:43:27.940 | - Yeah, we've already published the strong correlations
00:43:30.660 | between the size of this response
00:43:32.220 | and the flattening that happens afterwards.
00:43:35.380 | And unfortunately also in a real SUDEP case
00:43:38.380 | where the patient died because the,
00:43:41.100 | well, we don't know why.
00:43:42.380 | We don't know if somebody was there,
00:43:43.660 | it would have definitely prevented it.
00:43:45.340 | But we know that most SUDEPs happen when the person's alone.
00:43:48.620 | And in this case--
00:43:50.460 | - What's a SUDEP, I'm sorry?
00:43:51.420 | - A SUDEP is an acronym, S-U-D-E-P.
00:43:54.300 | And it stands for the number two cause
00:43:57.420 | of years of life lost actually
00:43:59.580 | among all neurological disorders.
00:44:01.660 | Stroke is number one, SUDEP is number two,
00:44:04.140 | but most people haven't heard of it.
00:44:06.260 | Actually, I'll plug my TED Talk.
00:44:07.860 | It's on the front page of TED right now
00:44:09.940 | that talks about this.
00:44:11.740 | And we hope to change that.
00:44:14.180 | I hope everybody who's heard of SIDS and stroke
00:44:17.740 | will now hear of SUDEP 'cause we think in most cases
00:44:20.700 | it's preventable if people take their meds
00:44:23.620 | and aren't alone when they have a seizure.
00:44:26.220 | Not guaranteed to be preventable.
00:44:27.820 | There are some exceptions,
00:44:29.660 | but we think most cases probably are.
00:44:31.580 | - So you have this embrace now in the version two wristband,
00:44:34.980 | right, for epilepsy management.
00:44:37.820 | That's the one that's FDA approved?
00:44:41.500 | - Yes.
00:44:42.700 | - Which is kind of incredible.
00:44:43.540 | - FDA cleared, they say.
00:44:44.380 | - Cleared, sorry.
00:44:45.860 | - No, it's okay.
00:44:47.180 | It essentially means it's approved for marketing.
00:44:49.460 | - Got it.
00:44:50.300 | Just a side note, how difficult is that to do?
00:44:53.020 | It's essentially getting FDA approved
00:44:54.900 | for computer science technology.
00:44:57.060 | - It's so agonizing.
00:44:58.060 | It's much harder than publishing multiple papers
00:45:01.980 | in top medical journals.
00:45:04.180 | Yeah, we published peer reviewed
00:45:05.980 | top medical journal neurology best results
00:45:08.900 | and that's not good enough for the FDA.
00:45:10.900 | - Is that system, so if we look at the peer review
00:45:13.940 | of medical journals, there's flaws, there's strengths.
00:45:16.860 | Is the FDA approval process,
00:45:19.660 | how does it compare to the peer review process?
00:45:21.700 | Does it have the strength in the--
00:45:23.260 | - I'd take peer review over FDA any day.
00:45:25.820 | - But is that a good thing?
00:45:26.740 | Is that a good thing for FDA?
00:45:28.100 | You're saying does it stop some amazing technology
00:45:31.260 | from getting through?
00:45:32.460 | - Yeah, it does.
00:45:33.540 | The FDA performs a very important good role
00:45:36.300 | in keeping people safe.
00:45:37.660 | They keep things, they put you through tons
00:45:41.020 | of safety testing and that's wonderful and that's great.
00:45:44.340 | I'm all in favor of the safety testing.
00:45:46.300 | Sometimes they put you through additional testing,
00:45:51.020 | and they don't have to explain why they put you through it,
00:45:54.100 | and you don't understand why you're going through it
00:45:56.700 | and it doesn't make sense and that's very frustrating.
00:46:00.700 | And maybe they have really good reasons,
00:46:03.100 | but it would do people a service
00:46:08.180 | to articulate those reasons.
00:46:09.740 | - Be more transparent.
00:46:10.660 | - Be more transparent.
00:46:12.180 | - So as part of Empatica, you have sensors.
00:46:15.820 | So what kind of problems can we crack?
00:46:17.860 | What kind of things, from seizures to autism
00:46:22.860 | to, I think I've heard you mention, depression?
00:46:28.060 | What kind of things can we alleviate?
00:46:29.780 | Can we detect?
00:46:30.740 | What's your hope of how we can make the world
00:46:33.340 | a better place with this wearable tech?
00:46:35.780 | - I would really like to see my fellow brilliant researchers
00:46:40.780 | step back and say, what are the really hard problems
00:46:46.900 | that we don't know how to solve that come from people
00:46:50.540 | maybe we don't even see in our normal life
00:46:52.540 | because they're living in the poorer places.
00:46:55.500 | They're stuck on the bus.
00:46:58.020 | They can't even afford the Uber or the Lyft
00:46:59.740 | or the data plan or all these other wonderful things
00:47:03.500 | we have that we keep improving on.
00:47:05.580 | Meanwhile, there's all these folks left behind in the world
00:47:08.260 | and they're struggling with horrible diseases,
00:47:10.860 | with depression, with epilepsy, with diabetes,
00:47:14.340 | with just awful stuff that maybe a little more time
00:47:19.340 | and attention hanging out with them
00:47:21.860 | and learning what are their challenges in life?
00:47:24.220 | What are their needs?
00:47:25.380 | How do we help them have job skills?
00:47:27.060 | How do we help them have a hope and a future
00:47:30.020 | and a chance to have the great life
00:47:32.700 | that so many of us building technology have?
00:47:36.540 | And then how would that reshape the kinds of AI
00:47:39.300 | that we build?
00:47:40.220 | How would that reshape the new apps that we build?
00:47:43.420 | Or maybe we need to focus on how to make things
00:47:46.700 | more low cost and green instead of thousand dollar phones.
00:47:50.660 | I mean, come on, why can't we be thinking more
00:47:54.180 | about things that do more with less for these folks?
00:47:58.180 | Quality of life is not related to the cost of your phone.
00:48:01.780 | It's been shown that, what,
00:48:05.020 | beyond about $75,000 of income, happiness is about the same.
00:48:08.620 | However, I can tell you, you get a lot of happiness
00:48:12.220 | from helping other people.
00:48:13.900 | You get a lot more than $75,000 buys.
00:48:16.780 | So how do we connect up the people who have real needs
00:48:20.900 | with the people who have the ability to build the future
00:48:23.540 | and build the kind of future that truly improves the lives
00:48:27.500 | of all the people that are currently being left behind?
00:48:30.260 | - So let me return just briefly to a point,
00:48:35.340 | maybe on the movie "Her."
00:48:37.060 | You said so much of the benefit
00:48:40.460 | of making our technology
00:48:44.020 | more empathetic to us human beings
00:48:48.500 | is that it would make them better tools,
00:48:51.620 | empower us, make our lives better.
00:48:53.100 | So if we look farther into the future,
00:48:54.700 | do you think we'll ever create an AI system
00:48:57.300 | that we can fall in love with and loves us back
00:49:00.660 | on a level that is similar to human to human interaction,
00:49:05.260 | like in the movie "Her" or beyond?
00:49:08.020 | - I think we can simulate it in ways that could,
00:49:12.420 | you know, sustain engagement for a while.
00:49:16.580 | Would it be as good as another person?
00:49:20.500 | I don't think so.
00:49:21.580 | Or if you're used to like good people,
00:49:24.940 | now, if you've just grown up with nothing but abuse
00:49:27.100 | and you can't stand human beings,
00:49:29.060 | can we do something that helps you there
00:49:32.140 | that gives you something through a machine?
00:49:33.980 | Yeah, but that's pretty low bar, right?
00:49:36.140 | If you've only encountered pretty awful people.
00:49:39.060 | If you've encountered wonderful, amazing people,
00:49:41.660 | we're nowhere near building anything like that.
00:49:44.780 | And I would not bet on building it.
00:49:49.420 | I would bet instead on building the kinds of AI
00:49:53.140 | that helps kind of raise all boats,
00:49:56.940 | that helps all people be better people,
00:49:59.540 | helps all people figure out
00:50:01.100 | if they're getting sick tomorrow
00:50:02.300 | and helps give them what they need to stay well tomorrow.
00:50:05.460 | That's the kind of AI I want to build
00:50:07.060 | that improves human lives,
00:50:09.100 | not the kind of AI that just walks on "The Tonight Show"
00:50:11.740 | and people go, wow, look how smart that is.
00:50:14.100 | You know, really?
00:50:15.340 | Like, and then it goes back in a box, you know?
00:50:18.700 | - So on that point,
00:50:20.100 | if we continue looking a little bit into the future,
00:50:23.580 | do you think an AI that's empathetic
00:50:25.340 | and does improve our lives
00:50:29.100 | need to have a physical presence, a body?
00:50:31.980 | And even, let me cautiously say the C word, consciousness,
00:50:36.980 | and even fear of mortality.
00:50:40.940 | So some of those human characteristics,
00:50:42.940 | do you think it needs to have those aspects
00:50:46.060 | or can it remain simply a machine learning tool
00:50:51.060 | that learns from data of behavior
00:50:53.220 | that learns to make us, based on previous patterns,
00:50:59.100 | feel better?
00:51:00.260 | Or does it need those elements of consciousness?
00:51:02.460 | - It depends on your goals.
00:51:03.940 | If you're making a movie, it needs a body.
00:51:06.940 | It needs a gorgeous body.
00:51:08.180 | It needs to act like it has consciousness.
00:51:10.220 | It needs to act like it has emotion, right?
00:51:11.900 | Because that's what sells.
00:51:13.540 | That's what's gonna get me to show up
00:51:15.020 | and enjoy the movie, okay?
00:51:17.140 | In real life, does it need all that?
00:51:19.980 | Well, if you've read Orson Scott Card's
00:51:21.980 | Ender's Game and Speaker for the Dead,
00:51:23.460 | you know, it could just be like a little voice
00:51:25.620 | in your earring, right?
00:51:26.900 | And you could have an intimate relationship
00:51:28.500 | and it could get to know you.
00:51:29.740 | And it doesn't need to be a robot.
00:51:31.820 | But that doesn't make as compelling of a movie, right?
00:51:37.300 | I mean, we already think it's kind of weird
00:51:38.580 | when a guy looks like he's talking to himself on the train,
00:51:41.420 | you know, even though it's earbuds.
00:51:43.860 | So embodied is more powerful.
00:51:48.860 | When you compare interactions
00:51:51.780 | with an embodied robot versus a video of a robot
00:51:55.300 | versus no robot, the robot is more engaging.
00:52:00.180 | The robot gets our attention more.
00:52:01.700 | The robot, when you walk in your house,
00:52:03.100 | is more likely to get you to remember to do the things
00:52:05.460 | that you asked it to do
00:52:06.500 | because it's kind of got a physical presence.
00:52:09.020 | You can avoid it if you don't like it.
00:52:10.820 | It could see you're avoiding it.
00:52:12.500 | There's a lot of power to being embodied.
00:52:14.820 | There will be embodied AIs.
00:52:17.220 | They have great power and opportunity and potential.
00:52:22.100 | There will also be AIs that aren't embodied,
00:52:24.660 | that just are little software assistants
00:52:28.580 | that help us with different things
00:52:30.300 | that may get to know things about us.
00:52:32.660 | Will they be conscious?
00:52:34.940 | There will be attempts to program them
00:52:36.700 | to make them appear to be conscious.
00:52:39.340 | We can already write programs that make it look like,
00:52:41.820 | oh, what do you mean?
00:52:42.660 | Of course I'm aware that you're there, right?
00:52:43.860 | I mean, it's trivial to say stuff like that.
00:52:45.780 | It's easy to fool people,
00:52:47.860 | but does it actually have conscious experience like we do?
00:52:52.260 | Nobody has a clue how to do that yet.
00:52:55.980 | That seems to be something that is beyond
00:52:58.780 | what any of us knows how to build now.
00:53:01.540 | Will it have to have that?
00:53:03.780 | I think you can get pretty far
00:53:05.580 | with a lot of stuff without it.
00:53:07.340 | Will we accord it rights?
00:53:10.940 | Well, that's more a political game
00:53:13.300 | than it is a question of real consciousness.
00:53:16.420 | - Yeah, can you go to jail for turning off Alexa?
00:53:18.860 | It's a question for an election maybe a few decades from now.
00:53:24.900 | - Well, Sophia the robot's already been given rights
00:53:27.580 | as a citizen in Saudi Arabia, right?
00:53:29.980 | Even before women have full rights.
00:53:33.660 | Then the robot was still put back in the box
00:53:36.980 | to be shipped to the next place
00:53:39.620 | where it would get a paid appearance, right?
00:53:41.820 | - Yeah, it's dark and almost comedic, if not absurd.
00:53:47.780 | So I've heard you speak about your journey in finding faith.
00:53:54.660 | - Sure.
00:53:55.980 | - And how you discovered some wisdoms about life
00:54:00.060 | and beyond from reading the Bible.
00:54:03.260 | And I've also heard you say
00:54:05.420 | that scientists too often assume
00:54:07.020 | that nothing exists beyond what can be currently measured.
00:54:10.380 | - Yeah, materialism.
00:54:12.620 | - Materialism.
00:54:13.460 | - And scientism, yeah.
00:54:14.900 | - So in some sense, this assumption enables
00:54:17.380 | the near term scientific method,
00:54:20.260 | assuming that we can uncover the mysteries of this world
00:54:25.260 | by the mechanisms of measurement that we currently have.
00:54:28.260 | But we easily forget that we've made this assumption.
00:54:32.340 | So what do you think we miss out on
00:54:35.580 | by making that assumption?
00:54:37.100 | - It's fine to limit the scientific method
00:54:42.580 | to things we can measure and reason about and reproduce.
00:54:46.780 | That's fine.
00:54:48.580 | I think we have to recognize that sometimes we scientists
00:54:52.260 | also believe in things that happened historically.
00:54:55.260 | You know, like I believe the Holocaust happened.
00:54:57.860 | I can't prove events from past history scientifically.
00:55:03.820 | You prove them with historical evidence, right?
00:55:06.420 | With the impact they had on people,
00:55:08.140 | with eyewitness testimony and things like that.
00:55:11.620 | So a good thinker recognizes that science
00:55:15.660 | is one of many ways to get knowledge.
00:55:19.400 | It's not the only way.
00:55:21.580 | And there's been some really bad philosophy
00:55:24.260 | and bad thinking recently, you can call it scientism,
00:55:27.820 | where people say science is the only way to get to truth.
00:55:31.160 | And it's not, it just isn't.
00:55:33.320 | There are other ways that work also,
00:55:35.940 | like knowledge of love with someone.
00:55:38.460 | You don't prove your love through science, right?
00:55:43.460 | So history, philosophy, love,
00:55:48.100 | a lot of other things in life show us
00:55:50.820 | that there are more ways than science
00:55:55.820 | to gain knowledge and truth,
00:55:58.020 | if you're willing to believe there is such a thing. And I believe there is.
00:56:01.260 | I am a scientist, however, and in my science,
00:56:04.280 | I do limit my science to the things
00:56:07.160 | that the scientific method can do.
00:56:09.840 | But I recognize that it's myopic
00:56:12.020 | to say that that's all there is.
00:56:13.920 | - Right, there's, just like you listed,
00:56:15.920 | there's all the why questions.
00:56:17.340 | And really, we know, if we're being honest with ourselves,
00:56:20.760 | the percent of what we really know is basically zero
00:56:25.760 | relative to the full mystery of the--
00:56:28.380 | - Measure theory, a set of measure zero.
00:56:30.600 | - If I have a finite amount of knowledge, which I do.
00:56:33.380 | - So you said that you believe in truth.
00:56:37.200 | So let me ask that old question.
00:56:40.680 | What do you think this thing is all about?
00:56:43.000 | What's life on Earth?
00:56:44.200 | - Life, the universe, and everything?
00:56:46.480 | - And everything, what's the meaning?
00:56:47.320 | - That came from Douglas Adams: 42, my favorite number.
00:56:51.360 | By the way, that's my street address.
00:56:52.840 | My husband and I
00:56:53.680 | guessed the exact same number for our house.
00:56:55.680 | We got to pick it.
00:56:56.920 | And there's a reason we picked 42, yeah.
00:57:00.320 | - So is it just 42, or do you have other words
00:57:03.840 | that you can put around it?
00:57:05.560 | - Well, I think there's a grand adventure,
00:57:07.960 | and I think this life is a part of it.
00:57:10.040 | I think there's a lot more to it than meets the eye
00:57:12.400 | and the heart and the mind and the soul here.
00:57:14.640 | I think we see but through a glass dimly in this life.
00:57:18.200 | We see only a part of all there is to know.
00:57:21.980 | If people haven't read the Bible, they should,
00:57:26.020 | if they consider themselves educated.
00:57:28.120 | And you could read Proverbs
00:57:30.680 | and find tremendous wisdom in there
00:57:33.600 | that cannot be scientifically proven,
00:57:35.960 | but when you read it, there's something in you,
00:57:38.440 | like a musician knows when the instrument's played right
00:57:41.440 | and it's beautiful.
00:57:42.960 | There's something in you that comes alive
00:57:45.480 | and knows that there's a truth there,
00:57:47.520 | that it's like your strings are being plucked by the master
00:57:50.440 | instead of by me, right, when I pluck it.
00:57:54.520 | But probably when you play, it sounds spectacular, right?
00:57:57.640 | And when you encounter those truths,
00:58:01.800 | there's something in you that sings
00:58:03.520 | and knows that there is more
00:58:06.600 | than what I can prove mathematically
00:58:09.480 | or program a computer to do.
00:58:11.560 | Don't get me wrong, the math is gorgeous.
00:58:13.640 | The computer programming can be brilliant.
00:58:16.280 | It's inspiring, right?
00:58:17.440 | We wanna do more.
00:58:19.440 | None of this squashes my desire to do science
00:58:22.080 | or to get knowledge through science.
00:58:23.640 | I'm not dissing the science at all.
00:58:26.320 | I grow even more in awe of what the science can do
00:58:29.880 | because I'm more in awe of all there is we don't know.
00:58:33.320 | And really at the heart of science,
00:58:36.360 | you have to have a belief that there's truth,
00:58:38.560 | that there's something greater to be discovered.
00:58:41.400 | And some scientists may not wanna use the faith word,
00:58:44.800 | but it's faith that drives us to do science.
00:58:48.120 | It's faith that there is truth,
00:58:50.280 | that there's something to know that we don't know,
00:58:52.840 | that it's worth knowing, that it's worth working hard,
00:58:56.280 | and that there is meaning,
00:58:58.560 | that there is such a thing as meaning,
00:59:00.120 | which by the way, science can't prove either.
00:59:03.000 | We have to kind of start with some assumptions
00:59:05.040 | that there's things like truth and meaning.
00:59:07.120 | And these are really questions philosophers own, right?
00:59:10.960 | This is their space,
00:59:12.000 | philosophers and theologians at some level.
00:59:14.680 | So these are things science,
00:59:17.160 | when people claim that science will tell you all truth,
00:59:22.600 | there's a name for that.
00:59:24.080 | It's its own kind of faith, it's scientism.
00:59:27.240 | And it's very myopic.
00:59:29.160 | - Yeah, there's a much bigger world out there
00:59:31.520 | to be explored in ways that science may not,
00:59:34.960 | at least for now, allow us to explore.
00:59:37.280 | - Yeah, and there's meaning and purpose and hope
00:59:40.240 | and joy and love and all these awesome things
00:59:43.400 | that make it all worthwhile too.
00:59:45.600 | - I don't think there's a better way to end it, Roz.
00:59:47.840 | Thank you so much for talking today.
00:59:49.200 | - Thanks, Lex, what a pleasure.
00:59:50.480 | Great questions.
00:59:51.320 | (upbeat music)
00:59:53.920 | (upbeat music)
00:59:56.520 | (upbeat music)
00:59:59.120 | (upbeat music)
01:00:01.720 | (upbeat music)
01:00:04.320 | (upbeat music)
01:00:06.920 | [BLANK_AUDIO]