Rosalind Picard: Affective Computing, Emotion, Privacy, and Health | Lex Fridman Podcast #24
Chapters
0:01 Rosalind Picard
2:02 Human-Computer Interaction
6:16 Turing Test for Emotional Intelligence
11:47 Fake News
25:40 Interaction with Alexa
30:41 Recognizing Emotion
00:00:00.000 |
The following is a conversation with Rosalind Picard. 00:00:04.540 |
director of the affective computing research group 00:00:06.880 |
at the MIT Media Lab and co-founder of two companies, 00:00:13.560 |
she launched the field of affective computing 00:00:17.580 |
This book described the importance of emotion 00:00:25.320 |
has to the relationship between people in general 00:00:30.920 |
I really enjoy talking with Roz over so many topics, 00:00:34.040 |
including emotion, ethics, privacy, wearable computing, 00:00:46.040 |
If you enjoy it, subscribe on YouTube, iTunes, 00:00:48.760 |
or simply connect with me on Twitter @lexfridman, 00:00:54.000 |
And now here's my conversation with Rosalind Picard. 00:01:03.340 |
and led a lot of research in this area since then. 00:01:06.720 |
As I understand, the goal is to make the machine detect 00:01:09.260 |
and interpret the emotional state of a human being 00:01:16.160 |
So how has your understanding of the problem space 00:01:19.980 |
defined by affective computing changed in the past 24 years? 00:01:25.400 |
So it's the scope, the applications, the challenges, 00:01:28.920 |
what's involved, how has that evolved over the years? 00:01:40.140 |
and responding intelligently to human emotion, 00:01:47.140 |
The original concept also encompassed machines 00:01:58.320 |
arises from, or deliberately influences human emotion. 00:02:11.040 |
and I'm scowling at it and I'm cursing at it, 00:02:20.840 |
that kind of thing just makes you even more frustrated. 00:02:24.720 |
And I thought, that stupid thing needs to see my affect. 00:02:30.680 |
which Microsoft researchers had worked really hard on, 00:02:33.080 |
actually had some of the most sophisticated AI in it 00:02:35.200 |
at the time, that thing's gonna actually be smart. 00:02:54.400 |
but many people, do you find yourself with that reference 00:02:58.360 |
that people recognize what you're talking about 00:03:01.720 |
- I don't expect the newest students to know it these days, 00:03:09.280 |
And still the majority of people seem to know it. 00:03:13.880 |
maybe natural language processing, what you were typing, 00:03:19.320 |
I don't even remember what Clippy was, except annoying. 00:03:25.880 |
- Well, I miss it. - I would hear those stories. 00:03:34.960 |
- Somebody was there, and we're in it together, 00:03:40.920 |
They keep ripping up the couch kind of thing. 00:03:41.760 |
- And in fact, they could have done it smarter like a puppy. 00:03:45.000 |
If they had done, like if when you yelled at it 00:03:48.040 |
or cursed at it, if it had put its little ears back 00:03:53.000 |
probably people would have wanted it back, right? 00:03:55.960 |
But instead, when you yelled at it, what did it do? 00:04:01.320 |
If somebody comes to my office and I yell at them, 00:04:15.080 |
It was intelligent about whether you were writing a letter, 00:04:18.200 |
what kind of help you needed for that context. 00:04:35.120 |
And intelligence at the time was really all about math 00:04:47.960 |
Social-emotional interaction is much more complex 00:04:56.100 |
And in order to understand that required skills 00:05:06.440 |
since the message, since you've really launched a field 00:05:20.800 |
sorry to say, empathy in us computer scientists. 00:05:32.520 |
Computer scientists are a much more diverse group today 00:05:35.080 |
than they were 25 years ago, and that's good. 00:05:39.080 |
We need all kinds of people to become computer scientists 00:05:41.840 |
so that computer science reflects more what society needs. 00:05:45.640 |
And there's brilliance among every personality type, 00:05:55.880 |
Your view of how difficult it is to recognize emotion 00:05:58.660 |
or to create a deeply emotionally intelligent interaction, 00:06:04.000 |
has it gotten easier or harder as you've explored it further? 00:06:10.060 |
if you think of the Turing test for solving intelligence, 00:06:16.080 |
looking at the Turing test for emotional intelligence? 00:06:18.780 |
- I think it is as difficult as I thought it was gonna be. 00:06:25.600 |
I think my prediction of its difficulty is spot on. 00:06:33.140 |
because they're always a function of society's love 00:06:39.480 |
If society gets excited and you get hundreds of, 00:06:43.520 |
and you get thousands of researchers working on it 00:06:52.040 |
The general intelligence, the computer's complete lack 00:06:57.040 |
of ability to have awareness of what it's doing, 00:07:05.540 |
the fact that there's no signs of it becoming conscious, 00:07:08.620 |
the fact that it doesn't read between the lines, 00:07:11.820 |
those kinds of things that we have to teach it explicitly, 00:07:20.400 |
There aren't breakthroughs yet that lead us to believe 00:07:25.320 |
which means that it's still gonna be kind of stuck 00:07:31.280 |
where it's probably only gonna do the right thing 00:07:34.040 |
in very limited, narrow, pre-specified contexts 00:07:48.000 |
because when people don't work on it, it's infinite. 00:07:51.760 |
When everybody works on it, you get a nice piece of it 00:07:58.600 |
I actually think there's a more important issue right now 00:08:07.420 |
Usually we're all just like, step on the gas, 00:08:11.180 |
This is causing us to pull back and put the brakes on, 00:08:14.200 |
and that's the way that some of this technology 00:08:18.700 |
is being used in places like China right now, 00:08:27.780 |
on a lot of the things that we could be doing 00:08:30.100 |
and try to get the community to think a little bit more 00:08:33.700 |
about, okay, if we're gonna go forward with that, 00:08:36.060 |
how can we do it in a way that puts in place safeguards 00:08:42.860 |
is just when a computer senses the human being, 00:08:51.860 |
like forming a deep connection with a human being, 00:08:53.940 |
so what are your worries how that could go wrong? 00:08:59.500 |
Is it in terms of other kinds of more subtle things? 00:09:09.860 |
and in the US we're quite free, as we all know, 00:09:13.660 |
to even criticize the President of the United States, right? 00:09:31.060 |
And so people are very careful not to do that. 00:09:34.340 |
However, what happens if you're simply watching a video 00:09:41.020 |
that shows a little bit of skepticism, right? 00:09:45.300 |
Well, and here we're completely free to do that. 00:09:56.580 |
when the athlete does this as part of the national broadcast, 00:10:11.580 |
and in places that don't have those freedoms, 00:10:19.620 |
What if our technology can read it even non-contact? 00:10:35.580 |
and opportunities that try to read people's affect 00:10:45.180 |
you have to sign things saying you will only use it 00:10:47.700 |
in certain ways, which essentially is get people's buy-in, 00:10:51.820 |
right, don't do this without people agreeing to it. 00:10:55.420 |
There are other countries where they're not interested 00:10:58.660 |
in people's buy-in, they're just gonna use it, 00:11:03.100 |
and if you don't like it, you better not scowl 00:11:08.540 |
- So one, let me just comment on a small tangent. 00:11:11.500 |
Do you know with the idea of adversarial examples 00:11:16.020 |
and deepfakes and so on, what you bring up is actually, 00:11:23.740 |
a comforting protection that you can no longer really trust 00:11:34.580 |
and therefore you always have an escape clause 00:11:37.060 |
if a government is trying, if a stable, balanced, 00:11:43.540 |
ethical government is trying to accuse you of something, 00:11:46.220 |
at least you have protection, you can say it was fake news, 00:11:52.340 |
We know how to go into the video and see, for example, 00:11:58.380 |
and whether or not they've been tampered with. 00:12:02.180 |
And we also can put fake heart rate and respiration 00:12:15.260 |
So the fact that we took time to do that other step, too, 00:12:22.620 |
making the machine more affectively intelligent. 00:12:25.300 |
And there's a choice in how we spend our time, 00:12:28.580 |
which is now being swayed a little bit less by this goal 00:12:57.340 |
It used to be it was driven by researchers in academia 00:13:01.660 |
to get papers published and to make a career for themselves 00:13:08.660 |
Now we realize that this is enabling rich people 00:13:23.060 |
Maybe we wanna think about, maybe we wanna rethink AI. 00:13:26.100 |
Maybe we wanna rethink the problems in society 00:13:43.820 |
- Do you hope that kind of stepping on the brake 00:13:48.020 |
Because I think still majority of the force behind AI 00:13:52.780 |
is to make money without thinking about the why. 00:14:21.260 |
That said, even though my position is classically carrot, 00:14:26.340 |
I think we do need some regulations in this space. 00:14:33.700 |
that you own your data, not Amazon, not Google. 00:14:38.340 |
I would like to see people own their own data. 00:14:44.660 |
being extended to emotion recognition in general. 00:14:48.300 |
That right now you can't use a lie detector on an employee 00:14:54.860 |
I think similarly, we need to put in place protection 00:14:57.900 |
around reading people's emotions without their consent. 00:15:00.900 |
And in certain cases, like characterizing them for a job 00:15:06.180 |
So I also think that when we're reading emotion 00:15:11.740 |
that that should, even though it's not medical data, 00:15:14.220 |
that that should get the kinds of protections 00:15:25.260 |
and you wanna learn about your stress and your sleep 00:15:34.980 |
when we put it together with machine learning, 00:15:37.980 |
now called AI, even though the founders of AI 00:15:50.860 |
but it can also predict how you're likely to be tomorrow. 00:15:58.740 |
- Especially when you're tracking data over time. 00:16:00.660 |
- Especially when we're tracking a week of your data or more. 00:16:22.100 |
- I think we should be a little bit more worried 00:16:28.900 |
about who's looking at us and listening to us. 00:16:32.540 |
The device sitting on your countertop in your kitchen, 00:16:36.740 |
whether it's Alexa or Google Home or Apple's Siri, 00:16:49.700 |
And I think there are great people in these companies 00:16:56.180 |
I'm a user of products from all of these companies. 00:16:59.380 |
I'm naming all the A companies, Alphabet, Apple, Amazon. 00:17:27.940 |
And guess what happens if you so much as smirk the wrong way 00:17:45.100 |
'cause they take your blood and they know you're a match. 00:17:48.100 |
And the doctors are on record of taking organs 00:17:55.380 |
They're just simply not the favored ones of the government. 00:18:07.860 |
I can certainly connect to the worry that you're expressing. 00:18:23.180 |
that you can have a deep connection with the machine. 00:18:29.540 |
- Those of us, I've admitted students who say that they, 00:18:36.820 |
who do you most wish you could have lunch with 00:18:59.340 |
And we need that kind of brilliance in machine learning. 00:19:01.980 |
And I love that computer science welcomes people 00:19:04.820 |
who love people and people who are very awkward 00:19:08.060 |
I love that this is a field that anybody could join. 00:19:16.740 |
I'm not trying to force people who don't like people 00:19:23.980 |
if most of the people building the AIs of the future 00:19:26.580 |
are the kind of people who don't like people, 00:19:33.500 |
So don't you think a large percentage of the world 00:19:41.820 |
- There's a huge problem with loneliness and it's growing. 00:19:48.860 |
- If you're lonely, you're part of a big and growing group. 00:20:04.660 |
but do you think there's an exciting possibility 00:20:07.660 |
that something like Alexa and these kinds of tools 00:20:19.020 |
I mean, a great book can kind of alleviate loneliness, 00:20:22.220 |
because you just get sucked into this amazing story 00:20:25.100 |
and you can't wait to go spend time with that character. 00:20:32.340 |
But yeah, it can be an incredibly delightful way 00:20:46.300 |
Well, probably some of that feeling of being there, right? 00:20:53.060 |
that romantic moment, or connecting with somebody. 00:21:07.700 |
But those kind of books that pull you into a character 00:21:17.660 |
And a computer, of course, can respond to you. 00:21:25.660 |
much more than the movie "Her," you know, plays up, right? 00:21:25.660 |
I mean, the movie "Her" is already a pretty deep connection, right? 00:21:30.780 |
But I mean, like, there can be a real interaction 00:21:42.780 |
where the character can learn and you can learn. 00:21:46.740 |
You could imagine it not just being you and one character. 00:21:51.740 |
You can imagine a group of people and characters, 00:22:01.700 |
but the few people and their AIs can befriend more people. 00:22:06.700 |
There can be an extended human intelligence in there 00:22:10.460 |
where each human can connect with more people that way. 00:22:14.980 |
But it's still very limited, but there are just, 00:22:19.580 |
what I mean is there are many more possibilities 00:22:24.820 |
So when you express a really serious concern about privacy, 00:22:28.340 |
about how governments can misuse the information, 00:22:31.260 |
and there's the possibility of this connection. 00:22:42.860 |
They ignore even the context or the existence of you, 00:22:47.420 |
the intricate, beautiful, complex aspects of who you are, 00:22:54.180 |
that help it recognize for speech recognition. 00:23:17.620 |
versus if a person is just kind of fooling around with Siri. 00:23:24.060 |
The tone of voice and what surrounds those words 00:23:27.980 |
is pivotal to understand if they should respond 00:23:32.940 |
in a very serious way, bring help to that person, 00:23:35.980 |
or if they should kind of jokingly tease back, 00:23:45.060 |
Like, how do you respond when somebody says that? 00:23:52.660 |
People wanna know if the person is happy or stressed 00:23:58.740 |
in part, well, so let me give you an altruistic reason 00:24:12.780 |
The altruistic people really care about their customers 00:24:17.220 |
and really care about helping you feel a little better 00:24:19.420 |
at the end of the day, and it would just make 00:24:30.020 |
There are other people who maybe have studied 00:24:42.580 |
and purse strings, you know, if we manipulate you 00:24:45.980 |
into a slightly sadder mood, you'll pay more, right? 00:24:53.860 |
You'll pay more for something you don't even need 00:25:04.820 |
a little shopping therapy, right, that helps them. 00:25:15.380 |
- To offer you products, or Amazon is primarily funded 00:25:24.140 |
to put a little bit of a wall between these agents 00:25:42.220 |
you mentioned, of course, a really serious concern 00:25:44.900 |
about like recognizing emotion if somebody is speaking 00:25:49.700 |
but what about the actual interaction itself? 00:25:53.540 |
Do you think, so if I, you know, you mentioned Clippy 00:25:58.900 |
and being annoying, what is the objective function 00:26:04.220 |
Is it minimize annoyingness or maximize happiness? 00:26:12.460 |
I think that push and pull, the tension, the dance, 00:26:19.860 |
So is there a room for, like what is the objective function? 00:26:23.780 |
- I mean, there are times when you wanna have 00:26:29.060 |
You know, I see my sons and one of them wants 00:26:31.900 |
to provoke the other to be upset and that's fun. 00:26:34.700 |
And it's actually healthy to learn where your limits are, 00:26:40.020 |
You can imagine a game where it's trying to make you mad 00:26:48.580 |
that's helping build resilience and self-control, 00:26:55.500 |
or how to deal with an abusive person in your life, 00:26:58.940 |
then you might need an AI that pushes your buttons, right? 00:27:03.380 |
But in general, do you want an AI that pushes your buttons? 00:27:13.220 |
I want one that's respectful, that is there to serve me 00:27:18.180 |
and that is there to extend my ability to do things. 00:27:27.260 |
And that's the kind of AI I'd put my money on. 00:27:30.220 |
- Your senses for the majority of people in the world, 00:27:37.100 |
So they're not looking, if you look at the movie "Her", 00:27:44.420 |
the woman in the movie "Her" leaves the person 00:27:53.980 |
- Right, like do you, your sense is if Alexa said, 00:27:58.260 |
you know what, I've actually had enough of you for a while, 00:28:07.100 |
- I'd say you're trash 'cause I paid for you, right? 00:28:33.180 |
when it's allowed to speak, what it's allowed to listen to, 00:28:55.420 |
unless it's just there for self-regulation training. 00:29:02.820 |
And actually your question, I really like the, 00:29:10.580 |
Is it to always make people happy and calm them down? 00:29:18.860 |
take your Soma if you're unhappy, take your happy pill. 00:29:24.380 |
well, we'll threaten you by sending you to Iceland 00:29:43.260 |
But really like, do we want AI to manipulate us 00:29:52.660 |
like a power obsessed, sick dictator individual 00:29:59.220 |
then yeah, you want to use AI to extend your power 00:30:02.900 |
and your scale to force people into submission. 00:30:11.140 |
and the opportunity to do things that might surprise you, 00:30:15.380 |
then you want to use AI to extend people's ability. 00:30:22.940 |
that empowers the weak and helps balance the power 00:30:37.260 |
and sensing people, what is your sense on emotion 00:30:46.700 |
So yeah, emotion that is expressed on the surface 00:30:51.700 |
through your face, your body, and various other things, 00:30:58.940 |
on the biological level, on the neuroscience level, 00:31:07.900 |
- Well, yeah, I'm sure there's no definitive answer, 00:31:12.420 |
How far can we get by just looking at the face? 00:31:15.340 |
- We're very limited when we just look at the face, 00:31:18.540 |
but we can get further than most people think we can get. 00:31:22.020 |
People think, "Hey, I have a great poker face, 00:31:25.940 |
"therefore all you're ever gonna get from me is neutral." 00:31:35.020 |
We can read from a neutral face if your heart is racing. 00:31:53.300 |
how your heart rate variability power is changing. 00:31:58.700 |
even when your heart rate is not necessarily accelerating. 00:32:02.460 |
- Sorry, from physio sensors or from the face? 00:32:06.140 |
- From the color changes that you cannot even see, 00:32:15.140 |
- So we get things people can't see using a regular camera. 00:32:18.460 |
And from that, we can tell things about your stress. 00:32:21.700 |
So if you were just sitting there with a blank face, 00:32:25.340 |
thinking nobody can read my emotion, well, you're wrong. 00:32:31.620 |
but that's from sort of visual information from the face. 00:32:38.860 |
by being very clever with what you can do with vision. 00:32:45.060 |
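A minimal sketch of the idea described above: estimating pulse rate from the tiny skin-color changes in ordinary video (remote photoplethysmography). The green-channel averaging, the 0.7-4 Hz band, and the function name are illustrative assumptions, not the actual method used by Picard's group.

```python
# Hedged sketch: pulse from subtle color changes in a face video (rPPG).
# Region choice, filter order, and band limits are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, fps):
    """frames: iterable of HxWx3 RGB arrays of a face crop; fps: frame rate."""
    # Average the green channel over the region each frame; blood-volume
    # changes modulate this trace very slightly.
    trace = np.array([f[:, :, 1].mean() for f in frames], dtype=float)
    trace -= trace.mean()

    # Band-pass to plausible human pulse frequencies (about 42-240 bpm).
    low, high = 0.7, 4.0
    b, a = butter(3, [low / (fps / 2), high / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, trace)

    # Dominant frequency in that band approximates the pulse rate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    band = (freqs >= low) & (freqs <= high)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # beats per minute

# Usage (hypothetical): bpm = estimate_heart_rate(face_frames, fps=30)
```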
But if you just look at the stuff we humans can see, 00:32:48.340 |
the smile, the smirks, the subtle, all the facial expressions. 00:32:57.020 |
Now, if you're just going in for a brief interview 00:33:00.340 |
and you're hiding it, that's pretty easy for most people. 00:33:03.660 |
If you are, however, surveilled constantly everywhere you go, 00:33:08.580 |
then it's gonna say, "Gee, Lex used to smile a lot, 00:33:36.780 |
When I say we, you think we can't read your emotion, 00:33:42.500 |
What we're reading is more some physiological changes 00:33:48.500 |
Now, that doesn't mean that we know everything 00:33:52.380 |
In fact, we still know very little about how you feel. 00:33:56.660 |
Your nuanced feelings are still completely private. 00:34:02.660 |
So there's some relief that we can't read that, 00:34:15.540 |
and we know what's going on in your environment, 00:34:26.500 |
And that is where it's not just the momentary feeling, 00:34:33.740 |
And that could actually be a little bit more scary 00:34:36.620 |
with certain kinds of governmental control freak people 00:34:47.900 |
- And getting that information through over time. 00:34:56.180 |
both in computer vision and physiological sense, 00:35:03.100 |
what's the best window into the emotional soul? 00:35:17.460 |
- So for health and wellbeing and things like that, 00:35:57.660 |
that's measuring skin conductance, movement, temperature. 00:36:09.380 |
of when they're texting, who they're texting, 00:36:14.180 |
the weather information based upon their location. 00:36:29.940 |
we are very accurate at forecasting tomorrow's stress, 00:36:38.580 |
And when we look at which pieces of that are most useful, 00:37:00.980 |
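A minimal sketch of the kind of next-day forecasting described here: daily features from a wrist sensor and phone context (skin conductance, movement, temperature, texting activity, weather) used to predict tomorrow's self-reported stress. The feature set, the synthetic data, and the model choice are assumptions for illustration, not the lab's published pipeline.

```python
# Hedged sketch: forecast tomorrow's stress from today's multimodal features.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

rng = np.random.default_rng(0)
n_days = 120
# One row per day: [mean skin conductance, activity counts, skin temperature,
# texts sent, bad-weather flag] -- stand-ins, not the study's real features.
X = rng.normal(size=(n_days, 5))
stress = rng.normal(size=n_days)  # synthetic daily self-reported stress score

# Align day t's features with the label reported on day t+1.
X_today, y_tomorrow = X[:-1], stress[1:]

model = GradientBoostingRegressor(random_state=0)
cv = TimeSeriesSplit(n_splits=5)  # time-ordered folds: never train on the future
scores = cross_val_score(model, X_today, y_tomorrow, cv=cv, scoring="r2")
print("next-day forecast R^2 per fold:", scores.round(2))
```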
Because the wearable stuff with physiological information, 00:37:16.620 |
it's fine in the early days people would say, 00:37:18.940 |
oh, wearing something or giving blood is invasive, right? 00:37:28.540 |
the things that are not touching you are maybe the scariest 00:37:31.420 |
because you don't know when they're on or off. 00:37:54.940 |
because it's also very easy to just take it off, right? 00:38:01.460 |
So if I'm uncomfortable with what it's sensing, 00:38:09.940 |
then, and I happen to know everything about this one 00:38:40.380 |
- Right, so that control, yeah, I'm with you. 00:38:42.780 |
That control, even if, yeah, the ability to turn it off, 00:38:52.140 |
if there's regulations, maybe that's number one to protect, 00:38:55.100 |
is people's ability to opt out as easily as to opt in. 00:39:00.100 |
- So you've studied a bit of neuroscience as well. 00:39:08.220 |
sort of the biological stuff, or the neurobiological, 00:39:12.860 |
the neuroscience, look at the signals in our brain, 00:39:23.700 |
and I was building hardware and computer designs, 00:39:26.300 |
and I wanted to build ones that work like the brain. 00:39:29.900 |
as long as I've been studying how to build computers. 00:39:39.140 |
They used to think, oh, if you remove this chunk of the brain 00:39:44.460 |
well, that's the part of the brain that did it. 00:39:53.220 |
Brains are so interesting, and changing all the time, 00:40:01.140 |
When we were measuring stress, you may know the story 00:40:05.940 |
where we found an unusually big skin conductance pattern 00:40:15.860 |
you could be stressed on one wrist and not the other, 00:40:18.100 |
like how can you get sweaty on one wrist, right? 00:40:21.900 |
with that sympathetic fight or flight response, 00:40:33.500 |
was a part of his brain had unusual electrical activity, 00:40:37.580 |
and that caused an unusually large sweat response 00:40:55.940 |
we can pick up with a wearable at one part of the body. 00:41:06.620 |
As we learned this, and then later built the Embrace 00:41:10.620 |
that's now FDA cleared for seizure detection, 00:41:16.180 |
with some of the most amazing doctors in the world 00:41:24.860 |
and they're going in and they're implanting electrodes, 00:41:27.580 |
not just to momentarily read the strange patterns 00:41:31.620 |
of brain activity that we'd like to see return to normal, 00:41:35.460 |
but also to read out continuously what's happening 00:41:39.500 |
during most of life when these patients are not seizing. 00:41:50.340 |
that you can't even usually get with EEG scalp electrodes 00:41:54.060 |
'cause the changes deep inside don't reach the surface. 00:41:58.140 |
But interesting, when some of those regions are activated, 00:42:16.380 |
there's a period where the brain waves go flat 00:42:20.100 |
and it looks like the person's brain has stopped, 00:42:26.740 |
that can make the cortical activity look flat, 00:42:43.780 |
The longer this flattening, the bigger our response here. 00:42:52.580 |
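A minimal sketch of the quantity being compared here: the size of the wrist skin-conductance surge around a seizure, which is then related to how long the post-seizure brain-wave flattening lasts. The baseline window, the surge definition, and the function name are illustrative assumptions, not the published analysis or the Embrace detection algorithm.

```python
# Hedged sketch: size of an electrodermal (skin conductance) surge after an event.
import numpy as np

def eda_surge_size(eda, fs, event_idx, baseline_s=60.0, window_s=300.0):
    """eda: skin conductance in microsiemens; fs: samples/sec; event_idx: onset sample."""
    b0 = max(0, event_idx - int(baseline_s * fs))
    baseline = np.median(eda[b0:event_idx])        # pre-event level
    post = eda[event_idx:event_idx + int(window_s * fs)]
    rise = np.clip(post - baseline, 0.0, None)     # only the increase counts
    peak = float(rise.max()) if len(rise) else 0.0
    area = float(rise.sum()) / fs                  # microsiemens * seconds
    return peak, area

# Usage (hypothetical): compare `area` across events against post-seizure
# EEG suppression duration to look for the kind of correlation described above.
```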
Well, it turns out there's something much deeper. 00:42:57.780 |
of some of these individuals, fabulous people 00:43:05.660 |
So that's the active research that we're doing right now 00:43:12.060 |
skin conductance can capture sort of the ripples 00:43:15.220 |
of the complexity of what's going on in our brain. 00:43:21.100 |
you have a hope that you can start to get the signal 00:43:24.980 |
from the interesting things happening in the brain. 00:43:27.940 |
- Yeah, we've already published the strong correlations 00:43:45.340 |
But we know that most SUDEPs happen when the person's alone. 00:44:14.180 |
I hope everybody who's heard of SIDS and stroke 00:44:17.740 |
will now hear of SUDEP 'cause we think in most cases 00:44:31.580 |
- So you have this Embrace now in the version two wristband, 00:44:31.580 |
It essentially means it's approved for marketing. 00:44:50.300 |
Just a side note, how difficult is that to do? 00:44:58.060 |
It's much harder than publishing multiple papers 00:45:10.900 |
- Is that system, so if we look at the peer review 00:45:13.940 |
of medical journals, there's flaws, there's strengths. 00:45:19.660 |
how does it compare to the peer review process? 00:45:28.100 |
You're saying does it stop some amazing technology 00:45:41.020 |
of safety testing and that's wonderful and that's great. 00:45:46.300 |
Sometimes they put you through additional testing 00:45:51.020 |
that they don't have to explain why they put you through it 00:45:54.100 |
and you don't understand why you're going through it 00:45:56.700 |
and it doesn't make sense and that's very frustrating. 00:46:03.100 |
and they just would, it would do people a service 00:46:22.860 |
to I think I've heard you mention depression. 00:46:30.740 |
What's your hope of how we can make the world 00:46:35.780 |
- I would really like to see my fellow brilliant researchers 00:46:40.780 |
step back and say, what are the really hard problems 00:46:46.900 |
that we don't know how to solve that come from people 00:46:59.740 |
or the data plan or all these other wonderful things 00:47:05.580 |
Meanwhile, there's all these folks left behind in the world 00:47:08.260 |
and they're struggling with horrible diseases, 00:47:10.860 |
with depression, with epilepsy, with diabetes, 00:47:14.340 |
with just awful stuff that maybe a little more time 00:47:21.860 |
and learning what are their challenges in life? 00:47:36.540 |
And then how would that reshape the kinds of AI 00:47:40.220 |
How would that reshape the new apps that we build? 00:47:43.420 |
Or maybe we need to focus on how to make things 00:47:46.700 |
more low cost and green instead of thousand dollar phones. 00:47:50.660 |
I mean, come on, why can't we be thinking more 00:47:54.180 |
about things that do more with less for these folks? 00:47:58.180 |
Quality of life is not related to the cost of your phone. 00:48:01.780 |
It's been shown that, what, 00:48:05.020 |
beyond about $75,000 of income, happiness is the same. 00:48:08.620 |
However, I can tell you, you get a lot of happiness 00:48:16.780 |
So how do we connect up the people who have real needs 00:48:20.900 |
with the people who have the ability to build the future 00:48:23.540 |
and build the kind of future that truly improves the lives 00:48:27.500 |
of all the people that are currently being left behind? 00:48:37.060 |
So do you think if we look farther into the future, 00:48:40.460 |
you said so much of the benefit from making our technology 00:48:57.300 |
that we can fall in love with and loves us back 00:49:00.660 |
on a level that is similar to human to human interaction, 00:49:08.020 |
- I think we can simulate it in ways that could, 00:49:24.940 |
now, if you've just grown up with nothing but abuse 00:49:36.140 |
If you've only encountered pretty awful people. 00:49:39.060 |
If you've encountered wonderful, amazing people, 00:49:41.660 |
we're nowhere near building anything like that. 00:49:49.420 |
I would bet instead on building the kinds of AI 00:50:02.300 |
and helps give them what they need to stay well tomorrow. 00:50:09.100 |
not the kind of AI that just walks on "The Tonight Show" 00:50:15.340 |
Like, and then it goes back in a box, you know? 00:50:20.100 |
if we continue looking a little bit into the future, 00:50:31.980 |
And even, let me cautiously say the C word, consciousness, 00:50:46.060 |
or can it remain simply a machine learning tool 00:50:53.220 |
that learns to make us, based on previous patterns, 00:51:00.260 |
Or does it need those elements of consciousness? 00:51:23.460 |
you know, it could just be like a little voice 00:51:31.820 |
But that doesn't make this compelling of a movie, right? 00:51:38.580 |
when a guy looks like he's talking to himself on the train, 00:51:51.780 |
with an embodied robot versus a video of a robot 00:52:03.100 |
is more likely to get you to remember to do the things 00:52:06.500 |
because it's kind of got a physical presence. 00:52:17.220 |
They have great power and opportunity and potential. 00:52:39.340 |
We can already write programs that make it look like, 00:52:42.660 |
Of course I'm aware that you're there, right? 00:52:47.860 |
but does it actually have conscious experience like we do? 00:53:16.420 |
- Yeah, can you go to jail for turning off Alexa? 00:53:18.860 |
It's a question for an election maybe a few decades from now. 00:53:24.900 |
- Well, Sophia the robot's already been given rights 00:53:24.900 |
- Yeah, it's dark and almost comedic, if not absurd. 00:53:47.780 |
So I've heard you speak about your journey in finding faith. 00:53:55.980 |
- And how you discovered some wisdoms about life 00:54:07.020 |
that nothing exists beyond what can be currently measured. 00:54:20.260 |
assuming that we can uncover the mysteries of this world 00:54:25.260 |
by the mechanisms of measurement that we currently have. 00:54:28.260 |
But we easily forget that we've made this assumption. 00:54:42.580 |
to things we can measure and reason about and reproduce. 00:54:48.580 |
I think we have to recognize that sometimes we scientists 00:54:52.260 |
also believe in things that happened historically. 00:54:55.260 |
You know, like I believe the Holocaust happened. 00:54:57.860 |
I can't prove events from past history scientifically. 00:55:03.820 |
You prove them with historical evidence, right? 00:55:08.140 |
with eyewitness testimony and things like that. 00:55:24.260 |
and bad thinking recently, you can call it scientism, 00:55:27.820 |
where people say science is the only way to get to truth. 00:55:38.460 |
You don't prove your love through science, right? 00:55:50.820 |
that there's more ways to gain knowledge and truth 00:55:55.820 |
if you're willing to believe there is such a thing. 00:56:01.260 |
I am a scientist, however, and in my science, 00:56:17.340 |
And really, we know, if we're being honest with ourselves, 00:56:20.760 |
the percent of what we really know is basically zero 00:56:30.600 |
- If I have a finite amount of knowledge, which I do. 00:56:47.320 |
- It came from Douglas Adams, 42, my favorite number. 00:56:47.320 |
I guessed to the exact same number for our house. 00:57:00.320 |
- So is it just 42, or do you have other words 00:57:10.040 |
I think there's a lot more to it than meets the eye 00:57:12.400 |
and the heart and the mind and the soul here. 00:57:14.640 |
I think we see but through a glass dimly in this life. 00:57:21.980 |
If people haven't read the Bible, they should, 00:57:35.960 |
but when you read it, there's something in you, 00:57:38.440 |
like a musician knows when the instrument's played right 00:57:47.520 |
that it's like your strings are being plucked by the master 00:57:50.440 |
instead of by me, right, playing when I pluck it. 00:57:54.520 |
But probably when you play, it sounds spectacular, right? 00:58:19.440 |
None of this squashes my desire to do science 00:58:26.320 |
I grow even more in awe of what the science can do 00:58:29.880 |
because I'm more in awe of all there is we don't know. 00:58:36.360 |
you have to have a belief that there's truth, 00:58:38.560 |
that there's something greater to be discovered. 00:58:41.400 |
And some scientists may not wanna use the faith word, 00:58:50.280 |
that there's something to know that we don't know, 00:58:52.840 |
that it's worth knowing, that it's worth working hard, 00:59:00.120 |
which by the way, science can't prove either. 00:59:03.000 |
We have to kind of start with some assumptions 00:59:07.120 |
And these are really questions philosophers own, right? 00:59:17.160 |
when people claim that science will tell you all truth, 00:59:29.160 |
- Yeah, there's a much bigger world out there 00:59:37.280 |
- Yeah, and there's meaning and purpose and hope 00:59:40.240 |
and joy and love and all these awesome things 00:59:45.600 |
- I don't think there's a better way to end it, Roz.