Rana el Kaliouby: Emotion AI, Social Robots, and Self-Driving Cars | Lex Fridman Podcast #322
Chapters
0:00 Introduction
1:00 Childhood
10:37 Hijab
13:20 Faith
15:28 War
19:37 Women in the Middle East
23:55 Rana's journey
36:30 Rosalind Picard
38:38 Advice for women
49:09 Dating
56:45 Human nature
61:25 AI and emotions
92:03 Smart Eye
101:24 Tesla and Waymo
110:11 Drunk driving
119:42 Robotics
133:29 Advice for startups
138:17 Investing
145:41 Advice for young people
154:01 Love
00:00:02.120 |
As we build socially and emotionally intelligent machines, 00:00:07.120 |
what does that mean about our relationship with them? 00:00:15.760 |
to be amazing at empathy by definition, right? 00:00:28.280 |
- The following is a conversation with Rana el Kaliouby, 00:00:31.720 |
a pioneer in the field of emotion recognition 00:00:39.480 |
deputy CEO of Smart Eye, author of "Girl Decoded," 00:00:44.120 |
and one of the most brilliant, kind, inspiring, 00:00:46.820 |
and fun human beings I've gotten the chance to talk to. 00:00:53.640 |
please check out our sponsors in the description. 00:00:56.160 |
And now, dear friends, here's Rana el Kaliouby. 00:01:03.520 |
What is a memory from that time that makes you smile? 00:01:10.720 |
and helping you define yourself in this world? 00:01:17.480 |
She used to have these mango trees in her garden 00:01:21.840 |
and so mango season was like July and August. 00:01:25.440 |
she would invite all my aunts and uncles and cousins. 00:01:28.360 |
And it was just like maybe there were like 20 or 30 people 00:01:31.200 |
in the house and she would cook all this amazing food. 00:01:34.280 |
And us, the kids, we would go down the garden 00:01:42.220 |
I think it's just the bringing people together, 00:01:58.240 |
that memory didn't translate to me kind of doing the same. 00:02:05.720 |
Is that what, like what, how does memory work? 00:02:33.240 |
So I kind of, they're red and mango colored on the outside. 00:02:50.780 |
You know, there's like these almost stereotypical foods 00:03:04.040 |
- Okay, okay, well you explained it that way. 00:03:05.800 |
- If you know what it is, I think you know it's delicious. 00:03:09.700 |
But if I explain it, it's just not gonna sound delicious. 00:03:12.540 |
I feel like beet soup, this doesn't make any sense. 00:03:16.520 |
and you probably have actually seen pictures of it 00:03:19.020 |
'cause it's one of the traditional foods in Ukraine, 00:03:22.640 |
in Russia, in different parts of the Slavic world. 00:03:26.080 |
So, but it's become so cliche and stereotypical 00:03:32.520 |
Like I visited Ukraine, I eat that every single day. 00:03:39.560 |
I think to make it well, like anything, like Italians, 00:03:43.160 |
they say, well, tomato sauce is easy to make, 00:03:51.160 |
So anyway, is there something like that in Egypt? 00:03:55.360 |
- There is, and actually we have a similar kind of soup. 00:04:04.520 |
It's like somewhere between spinach and kale, 00:04:06.920 |
and you mince it, and then you cook it in like chicken broth. 00:04:17.480 |
and then we used to have it alongside stuffed pigeons. 00:04:20.320 |
I'm pescetarian now, so I don't eat that anymore, but. 00:04:37.200 |
If that doesn't bother you too much to describe. 00:04:39.960 |
- No, no, it's stuffed with a lot of like just rice and. 00:04:46.120 |
- And you also, you've said in your book that your first 00:04:55.200 |
Is that when you first fell in love with computers? 00:05:07.840 |
- Yeah, I think the magical moment is definitely 00:05:13.640 |
and we just like had fun together, like playing games. 00:05:17.240 |
But the other memory I have is my first code. 00:05:21.320 |
The first code I wrote, I wrote, I drew a Christmas tree. 00:05:28.880 |
that the first thing I did was like this Christmas tree. 00:05:40.600 |
I must have been like six or seven at the time. 00:05:49.440 |
That's, if you think about it, that's empowering. 00:05:55.920 |
I don't know if many people think of it that way 00:06:07.880 |
- Eventually, I guess you couldn't at the time, 00:06:10.040 |
but eventually this thing, if it's interesting enough, 00:06:19.360 |
And then because it's digital, it's easy to spread. 00:06:24.120 |
that's easily spreadable to millions of people. 00:06:28.280 |
- It's hard to think that way when you're six. 00:06:34.080 |
because I was raised by a particular set of parents, 00:06:41.920 |
I'm a Muslim and I feel I'm stronger, more centered for it. 00:06:54.500 |
on the energy, vitality, and entrepreneurial spirit 00:07:09.840 |
They actually, there's a cute story about how they met. 00:07:18.360 |
So she signed up to take his COBOL programming class. 00:07:44.920 |
One of the core values in our family is just hard work. 00:07:50.920 |
And that's something that's definitely stayed with me, 00:07:55.200 |
both as a professional, but also in my personal life. 00:07:58.940 |
But I also think my mom, my mom always used to, 00:08:02.880 |
I don't know, it was like unconditional love. 00:08:07.120 |
I just knew my parents would be there for me, 00:08:15.680 |
And they got tested on it because I kind of challenged, 00:08:20.320 |
and I kind of took a different path, I guess, 00:08:23.080 |
than what's expected of a woman in the Middle East. 00:08:50.400 |
but I decided to go do my PhD at Cambridge University. 00:08:54.120 |
And because my husband at the time, he's now my ex, 00:08:58.860 |
ran a company in Cairo, he was gonna stay in Egypt. 00:09:01.460 |
So it was gonna be a long distance relationship. 00:09:06.840 |
for a woman to just head out and kind of pursue her career. 00:09:10.960 |
And so my dad and my parents-in-law both said, 00:09:15.960 |
"You know, we do not approve of you doing this, 00:09:19.840 |
but now you're under the jurisdiction of your husband, 00:09:28.760 |
He said, "You know, this is your dream come true. 00:09:51.480 |
- Well, that was also during, right around September 11th. 00:10:07.920 |
And so my parents just were, they were afraid for my safety. 00:10:14.440 |
I decided to take off my headscarf and wear a hat instead. 00:10:17.680 |
So I just went to class wearing these like British hats, 00:10:34.840 |
I'm just gonna go back to wearing my headscarf. 00:10:37.320 |
- Yeah, you wore the hijab, so starting in 2000, 00:10:54.120 |
Like what does it represent in its best case? 00:11:02.520 |
I guess I'll first start by saying I wore it voluntarily. 00:11:06.960 |
And in fact, I was one of the very first women in my family 00:11:12.240 |
And my family thought it was really odd, right? 00:11:15.240 |
Like they were like, why do you wanna put this on? 00:11:17.920 |
And at its best, it's a sign of modesty, humility. 00:11:27.560 |
It's a step back into some kind of tradition, 00:11:40.000 |
- Exactly, and I actually like made it my own. 00:11:43.280 |
I remember I would really match the color of my headscarf 00:11:55.800 |
around how we practice religion. 00:12:02.480 |
where I was spending a lot of time going back and forth 00:12:06.640 |
And I started meeting a lot of people in the US 00:12:08.560 |
who were just amazing people, very purpose-driven, 00:12:17.880 |
And so that was when I just had a lot of questions. 00:12:25.120 |
was when the Muslim Brotherhood ran the country, 00:12:30.160 |
It was at a time when I was going through a divorce. 00:12:39.300 |
where I was like, "This doesn't feel like me anymore." 00:12:44.240 |
because culturally it's okay if you don't wear it, 00:12:48.600 |
but it's really not okay to wear it and then take it off. 00:12:54.440 |
while still maintaining a deep core and pride 00:13:03.280 |
- So still being Egyptian, still being a Muslim. 00:13:06.760 |
- Right, and being, I think, generally like faith-driven, 00:13:14.440 |
- But what that means changes year by year for you. 00:13:27.440 |
- Yeah, I mean, I think there is something really powerful 00:13:32.120 |
about just believing that there's a bigger force. 00:13:36.880 |
You know, there's a kind of surrendering, I guess, 00:13:45.380 |
Like the universe is out to like do amazing things for you 00:13:50.960 |
Like even when you're going through adversity, 00:13:57.940 |
- Yeah, it gives you like an inner peace, a calmness. 00:14:01.460 |
- Yeah, it's faith in all the meanings of that word. 00:14:08.100 |
And it is because time passes and time cures all things. 00:14:13.100 |
It's like a calmness with the chaos of the world. 00:14:20.700 |
that something at the specific moment in time 00:14:25.460 |
and it's not what you wanted in life, da-da-da-da. 00:14:33.060 |
It maybe closed the door, but it opened a new door for you. 00:14:38.940 |
that there's a silver lining in almost anything in life. 00:14:45.860 |
have faith or conviction that it's gonna work out. 00:14:48.380 |
- Such a beautiful way to see a shitty feeling. 00:14:50.900 |
So if you feel shitty about a current situation, 00:15:03.940 |
whatever doesn't kill you makes you stronger. 00:15:19.440 |
So over time you get to have that perspective. 00:15:34.040 |
maybe even the big love and hate in that part of the world, 00:15:37.680 |
because it does seem to be a part of the world 00:16:11.200 |
that we are still having to grapple with that. 00:16:28.480 |
we ended up becoming friends, but she was from Israel. 00:16:37.900 |
- Did you guys sit there just staring at each other for a bit? 00:16:42.520 |
- Actually, she, 'cause I arrived before she did 00:16:49.800 |
and asked him if she thought it was gonna be okay. 00:16:55.120 |
- Yeah, and Peter Robinson, our PhD advisor was like, 00:16:59.520 |
yeah, like this is an academic institution, just show up. 00:17:09.400 |
We were both doing artificial emotional intelligence. 00:17:18.400 |
It was just, I was like, why on earth are our countries, 00:17:25.320 |
And I think it falls back to the narrative, right? 00:17:28.780 |
like whoever creates this narrative of war, I don't know. 00:17:38.600 |
because there's also evil women in the world. 00:17:48.220 |
The other aspect is, it doesn't matter the gender, 00:17:57.100 |
different parts of the world around that conflict now. 00:18:01.020 |
And that's happening in Yemen as well and everywhere else. 00:18:08.580 |
to the populace and those narratives take hold 00:18:11.340 |
and everybody believes that and they have a distorted view 00:18:20.020 |
you don't even see the people on the other side as human 00:18:25.020 |
or as equal intelligence or worth or value as you. 00:18:30.220 |
You tell all kinds of narratives about them being Nazis 00:18:34.180 |
or dumb or whatever narrative you wanna weave around that. 00:18:42.860 |
But I think when you actually meet them face to face, 00:18:50.500 |
- It's actually a big shock for people to realize 00:18:53.860 |
that they've been essentially lied to within their country. 00:19:05.140 |
or any kind of technology is able to bypass the walls 00:19:10.340 |
that governments put up and connect people directly 00:19:15.220 |
ooh, people fall in love across different nations 00:19:21.540 |
And that I think ultimately can cure a lot of our ills, 00:19:31.020 |
that would cure a lot of the ills of the world, 00:19:37.840 |
- Let me ask you about the women running the world. 00:19:44.920 |
perhaps shape the landscape of just our human experience. 00:19:59.640 |
- I think just kind of just going back to like my comment 00:20:05.360 |
Which has been a common thread throughout my entire career. 00:20:14.760 |
or a group of people, you build trust, you build loyalty, 00:20:17.760 |
you build friendship, and then you can turn that 00:20:21.840 |
into like behavior change and motivation and persuasion. 00:20:25.120 |
So it's like empathy and emotions are just at the center 00:20:35.480 |
kind of this human connection is very strong. 00:20:38.640 |
Like we have this running joke that if you come to Egypt 00:20:44.600 |
will know everything about your life like right away, right? 00:20:46.720 |
I have no problems asking you about your personal life. 00:20:52.720 |
no personal boundaries in terms of getting to know people. 00:20:55.240 |
We get emotionally intimate like very, very quickly. 00:20:58.480 |
But I think people just get to know each other 00:21:07.040 |
You just try to get to know people really deeply. 00:21:14.280 |
yeah, imagine what challenges they're going through. 00:21:18.720 |
So I think I've definitely taken that with me. 00:21:25.440 |
like just being generous with your time and love 00:21:28.280 |
and attention and even with your wealth, right? 00:21:40.120 |
And so do you think there's a useful difference 00:21:42.920 |
between men and women in that aspect and empathy? 00:21:47.640 |
Or is doing these kind of big general groups, 00:21:57.040 |
- Yeah, I actually don't wanna overgeneralize. 00:22:13.880 |
Although one of the researchers I worked with 00:22:17.160 |
when I was at Cambridge, Professor Simon Baron-Cohen, 00:22:23.520 |
- But he runs the Autism Research Center at Cambridge 00:22:30.880 |
And one of his theories is the empathy scale, 00:22:40.040 |
of computer scientists and engineers who are systemizers 00:22:47.200 |
And then there's more men in that bucket, I guess, 00:22:53.720 |
And then there's more women in the empathizers bucket. 00:23:00.400 |
It's been frustrating to me how many, I guess, 00:23:03.320 |
systemizers there are in the field of robotics. 00:23:08.280 |
'cause I care about, obviously, social robotics. 00:23:32.200 |
It's a safety concern to be touching the human, 00:23:46.280 |
don't care about the human, it's an opportunity. 00:23:49.440 |
- For, in your case, it's a business opportunity too, 00:23:52.160 |
but in general, an opportunity to explore those ideas. 00:24:09.960 |
So you became an exceptionally successful CEO, 00:24:26.800 |
- Yeah, so actually just kind of a little bit of background. 00:24:29.800 |
So the reason why I moved from Cairo to Cambridge, UK 00:24:33.960 |
to do my PhD is because I had a very clear career plan. 00:24:38.200 |
I was like, okay, I'll go abroad, get my PhD, 00:24:52.480 |
building artificial emotional intelligence and looking at-- 00:24:58.320 |
did you know it's gonna be artificial intelligence? 00:25:10.400 |
But I love teaching, I mean, I still love teaching. 00:25:36.280 |
or did you see computers as having the capability 00:25:46.520 |
Computers sit at the center of how we connect 00:25:59.200 |
because we could send letters back and forth to each other. 00:26:11.760 |
It actually does change how we connect with one another. 00:26:19.320 |
like the haptic feel, 'cause the email is all digital. 00:26:27.320 |
There's something nice about the haptic aspect 00:26:29.600 |
of the fax machine 'cause you still have to press, 00:26:31.840 |
you still have to do something in the physical world 00:26:34.240 |
to make this thing a reality, the sense of somebody. 00:26:40.720 |
- Yeah, there's something lost when it's just an email. 00:26:45.720 |
Obviously, I wonder how we can regain some of that 00:26:50.440 |
in the digital world, which goes to the metaverse 00:26:55.520 |
- Actually, do you have a question on that one? 00:26:57.800 |
Do you still, do you have photo albums anymore? 00:27:06.720 |
So it was one of the painful steps in my life 00:27:09.360 |
was to scan all the photos and let go of them 00:27:17.760 |
- Yeah, switched to Kindle, everything Kindle. 00:28:06.040 |
Anyway, yeah, so yeah, but I do love haptic things. 00:28:13.600 |
Even like touchscreens, it's tricky to get right, 00:28:35.200 |
And then I got to Cambridge and I fell in love 00:28:43.800 |
You're building stuff that nobody's built before 00:28:51.040 |
And at the end of my PhD, I think it's the meeting 00:28:59.960 |
the Affective Computing Group at the MIT Media Lab. 00:29:03.560 |
You know, I was like following all her research. 00:29:13.360 |
at a pattern recognition conference in Cambridge 00:29:19.760 |
you know, if any students wanna meet with me, 00:29:24.800 |
And so I signed up for a slot and I spent like the weeks 00:29:31.080 |
And I want to show her a demo of my research and everything. 00:29:40.840 |
do you wanna come work with me as a postdoc at MIT? 00:29:45.720 |
I was like, okay, this would be a dream come true, 00:29:47.760 |
but there's a husband waiting for me in Cairo. 00:29:54.720 |
And I literally started commuting between Cairo and Boston. 00:30:03.480 |
I would, you know, hop on a plane and go to Boston. 00:30:08.600 |
There was no, I kind of outgrew my dreams, right? 00:30:12.960 |
I didn't wanna go back to Egypt anymore and be faculty. 00:30:35.080 |
- I think, I wonder if that's similar to your experience 00:30:37.960 |
at MIT, I was just, at the Media Lab in particular, 00:30:42.600 |
I was just really, impressed is not the right word. 00:30:47.200 |
I didn't expect the openness to like innovation 00:30:50.680 |
and the acceptance of taking a risk and failing. 00:30:55.680 |
Like failure isn't really accepted back in Egypt, right? 00:31:02.000 |
which I think has been hardwired in my brain. 00:31:05.080 |
But you get to MIT and it's okay to start things. 00:31:11.040 |
And that kind of thinking was just very new to me. 00:31:19.480 |
because they, I think more than other places at MIT, 00:31:29.200 |
but certainly with Rosalind, you try wild stuff, 00:31:46.000 |
about robotics labs at MIT, and there's like over 30, 00:31:48.960 |
I think, is like, usually when you show up to a robotics lab 00:31:53.960 |
there's not a single working robot, they're all broken. 00:32:04.360 |
where robotics labs had some robots functioning. 00:32:08.960 |
One of my like favorite moments that just sticks with me, 00:32:17.960 |
so many legged robots in one place, I'm like, I'm home. 00:32:31.160 |
there was a random robot, Spot, walking down the hall. 00:32:46.360 |
but this one in particular definitely has a backstory 00:32:55.280 |
And there was just this feeling like there's a life, 00:33:01.520 |
He probably has to commute back to his family at night. 00:33:04.880 |
Like there's a feeling like there's life instilled 00:33:09.960 |
I don't know, it was kind of inspiring to see. 00:33:12.200 |
- Did it say hello to, did he say hello to you? 00:33:17.960 |
No, no, listen, I love competence and focus and great. 00:33:27.760 |
There's a job to be done and he was doing it. 00:33:35.760 |
on trying to always have a thing that's working 00:33:41.800 |
You could not, yeah, you could not like show up 00:33:50.120 |
I don't know if this is still his lifelong goal or not, 00:33:56.600 |
that's just inhabited by robots, like no humans. 00:34:00.040 |
He just wants all these robots to be connecting 00:34:05.560 |
does he have an idea of which robots he loves most? 00:34:12.000 |
Is it humanoid robots, robot dogs, or is not clear yet? 00:34:21.520 |
and he used to love Jibo. - The thing with a giant head. 00:34:29.520 |
- Not glowing, like. - Right, right, right, right. 00:34:38.480 |
yeah, he just, I think he loves all forms of robots, 00:34:47.840 |
- I like, I personally like legged robots, especially. 00:35:01.840 |
So I have a bunch of legged robots now in Austin 00:35:06.200 |
- I've been trying to have them communicate affection 00:35:10.460 |
with their body in different ways, just for art. 00:35:15.280 |
'Cause I love the idea of walking around with the robots, 00:35:28.740 |
but kids don't have this kind of weird construction 00:35:40.140 |
I went into his class with a whole bunch of robots 00:36:00.260 |
So it just struck me how there was no fear of robots. 00:36:05.260 |
Whereas a lot of adults have that, like us versus them. 00:36:13.860 |
because you still have to look at the lessons of history 00:36:17.060 |
and how robots can be used by the power centers of the world 00:36:20.700 |
to abuse your rights and all that kind of stuff. 00:36:40.300 |
I think the thing I learned the most is perseverance. 00:36:51.420 |
we applied for a grant to the National Science Foundation 00:37:03.420 |
- The first time you were rejected for funding, yeah. 00:37:06.140 |
- Yeah, and I basically, I just took the rejection 00:37:15.260 |
They love the idea, they just don't think we can do it. 00:37:18.340 |
So let's build it, show them, and then reapply. 00:37:22.620 |
And it was that, oh my God, that story totally stuck with me. 00:37:26.540 |
And she's like that in every aspect of her life. 00:37:45.260 |
about how you see computers, and also business, 00:37:51.740 |
She's a very powerful, brilliant woman like yourself, 00:37:57.500 |
- Yeah, I think Roz is actually also very faith-driven. 00:38:00.460 |
She has this deep belief and conviction, yeah, 00:38:15.820 |
you can be of a different background and religion, 00:38:18.980 |
and whatever, and you can still have the same core values. 00:38:33.780 |
I hope she'll be on, I'm sure she'll be on again. 00:38:49.260 |
so you're a powerful leader, you're brilliant, 00:38:55.180 |
What advice would you give, especially to young women, 00:39:01.460 |
powerful leaders like yourself in a world where perhaps, 00:39:05.220 |
in a world that perhaps doesn't give them a clear, 00:39:15.620 |
whether we're talking about Egypt or elsewhere? 00:39:18.260 |
- You know, hearing you kind of describe me that way 00:39:27.460 |
what I think is the biggest challenge of all, 00:39:34.860 |
what I call now the Debbie Downer voice in my head. 00:39:39.500 |
The kind of basically, it's just chattering all the time, 00:39:47.620 |
What business do you have, like starting a company 00:39:52.420 |
It's always like, and I think my biggest advice 00:39:57.300 |
to not just women, but people who are taking a new path 00:40:07.260 |
and let your thoughts be the biggest obstacle in your way. 00:40:26.020 |
I'm not exactly sure if it's a bad thing or a good thing. 00:40:37.660 |
It's kind of, it drives productivity and progress 00:40:43.700 |
but it can hold you back from taking big leaps. 00:40:46.940 |
I think the best I can say is probably you have 00:41:00.500 |
Like I have from almost like a third person perspective. 00:41:05.340 |
- Yeah, like, because it is useful to be critical. 00:41:15.380 |
at MIT and I was just, you know, there's so much love 00:41:23.340 |
So many amazing people I got a chance to talk to. 00:41:31.420 |
it was mostly just negative thoughts about me. 00:41:38.980 |
And second is like, why did you, that was so dumb. 00:41:50.420 |
But I think it's good to hear that voice out. 00:42:06.340 |
or take a leap to go to America to work in media lab. 00:42:19.100 |
because you should have like this weird confidence, 00:42:32.580 |
I mean, there's, there's, there's some of that. 00:42:34.420 |
You, you actually tweeted a really nice tweet thread. 00:42:41.460 |
a friend recommended I do daily affirmations. 00:42:46.900 |
but I was going through major transitions in my life. 00:42:49.560 |
So I gave it a shot and it set me on a journey 00:43:01.380 |
- Yeah, because really like, I'm just like me, 00:43:11.380 |
And so I've been doing journaling for almost 10 years now. 00:43:15.820 |
I use an app called Day One and it's awesome. 00:43:19.020 |
I just journal and I use it as an opportunity 00:43:27.020 |
"Oh my God, you won't be able to raise this round of funding." 00:43:39.380 |
so I don't know that I can shut off the voice, 00:43:44.420 |
And it just, and I bring data to the table, right? 00:43:53.980 |
But the affirmation took it to a whole next level 00:44:02.340 |
and the first thing you do, I meditate first. 00:44:07.900 |
and it's the energy I want to put out in the world 00:44:17.420 |
And I kid you not, like people in the street will stop me 00:44:19.620 |
and say, "Oh my God, like we love your smile." 00:44:37.300 |
but you're saying affirmations somehow help kind of, 00:45:02.820 |
- By the way, I was laughing because my affirmations, 00:45:09.660 |
- I don't have a, "My smile lights up the world." 00:45:14.740 |
Maybe I should add that because like I have just, 00:45:30.540 |
just in that little affirmation is beautiful. 00:45:54.460 |
Well, I have, I'm a magnet for all sorts of things. 00:46:02.380 |
I attract like awesome people into my universe. 00:46:09.820 |
- So that's, and that somehow manifests itself 00:46:15.620 |
- Yeah, like can you speak to like why it feels good 00:46:26.460 |
instead of just like being pulled back and forth 00:46:29.660 |
like throughout the day, it just like grounds me. 00:46:34.740 |
It's not exactly what I wanted it to be, but I'm patient. 00:46:42.660 |
which is one of my other consistent affirmations. 00:46:47.180 |
And so I can grapple with all the feelings of mom guilt 00:46:55.220 |
And I literally say, I will kind of picture the person 00:46:59.220 |
And I write it all down and hasn't happened yet, but it- 00:47:10.100 |
- On the running, holding hands, running together. 00:47:14.340 |
- No, more like Fight Club, the Fight Club Brad Pitt, 00:47:17.820 |
where he's like standing, all right, people will know. 00:47:29.020 |
Or is this almost like in the space of like energy? 00:47:34.020 |
- Right, it's somebody who is smart and well accomplished 00:47:41.900 |
and successful in their life, but they're generous 00:47:45.340 |
and they're well-traveled and they wanna travel the world. 00:47:56.500 |
- Oh, you actually write, so you don't say it out loud? 00:48:04.420 |
- Yeah, if I'm alone, I'll say it out loud, yeah. 00:48:10.180 |
I think it's what feels more powerful to you, 00:48:15.180 |
to me more powerful, saying stuff feels more powerful. 00:48:42.620 |
and see like a year ago, what was I affirming, right? 00:48:55.620 |
Yeah, I say the same exact thing over and over and over. 00:49:07.420 |
or maybe that's just going on inside my head. 00:49:21.580 |
Singles Night, sponsored by Smile Dating App. 00:49:59.220 |
and you either have chemistry or you don't, right? 00:50:02.980 |
- I guess that was the question I was asking. 00:50:19.700 |
yeah, yeah, changing your mind as we're describing it. 00:50:37.020 |
like I said, Goodreads should be a dating app, 00:50:39.900 |
which like books, I wonder if you look at just like books 00:50:50.820 |
If you just look at your footprint of content consumed, 00:50:56.020 |
but maybe interesting difference with an overlap, 00:50:59.020 |
there's some, I'm sure this is a machine learning problem 00:51:06.880 |
not only there to be chemistry in the short term, 00:51:10.760 |
but a good lifelong partner to grow together. 00:51:14.100 |
I bet you it's a good machine learning problem. 00:51:17.500 |
Well, actually, I do think there's so much data 00:51:22.420 |
a machine learning algorithm that can ingest all this data 00:51:25.020 |
and basically say, I think the following 10 people 00:51:28.060 |
would be interesting connections for you, right? 00:51:32.020 |
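Below is a minimal sketch of the kind of matcher being speculated about here, assuming each person is summarized by a hypothetical content-preference vector (books, podcasts, and so on); the profiles, names, and scoring are illustrative assumptions, not anything an actual dating app does.

```python
# Hypothetical sketch: suggest the most similar users based on
# content-consumption vectors (e.g., counts over book/podcast topics).
import numpy as np

def top_matches(user_vec, candidate_vecs, candidate_ids, k=10):
    """Return the ids of the k candidates whose content profile is
    closest (by cosine similarity) to the given user's profile."""
    user = user_vec / (np.linalg.norm(user_vec) + 1e-9)
    cands = candidate_vecs / (np.linalg.norm(candidate_vecs, axis=1, keepdims=True) + 1e-9)
    sims = cands @ user                      # cosine similarity per candidate
    best = np.argsort(-sims)[:k]             # indices of the top-k scores
    return [(candidate_ids[i], float(sims[i])) for i in best]

# Toy usage with made-up profiles over 5 content categories.
me = np.array([1.0, 0.0, 2.0, 0.5, 0.0])
others = np.array([[0.9, 0.1, 1.8, 0.4, 0.0],
                   [0.0, 2.0, 0.0, 0.0, 1.0],
                   [1.2, 0.0, 2.2, 0.6, 0.1]])
print(top_matches(me, others, ["ana", "ben", "chris"], k=2))
```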
And so Smile Dating App kind of took one particular angle, 00:51:36.780 |
It matches people based on their humor styles, 00:51:43.280 |
Like if you meet somebody and they can make you laugh, 00:51:49.420 |
like inside jokes and you're bantering, like that's fun. 00:52:05.280 |
You could probably measure that and then optimize it 00:52:10.800 |
- We're just turning this into a machine learning problem. 00:52:30.680 |
Well, I can tell you like I have a spreadsheet. 00:53:03.940 |
I do wish that somebody else had a spreadsheet about me. 00:53:25.240 |
but collaborators, friends, all that kind of stuff. 00:53:30.720 |
That's the problem social networks are trying to solve, 00:53:36.440 |
Even Facebook tried to get into a dating app business. 00:53:41.560 |
to running a successful company that connects human beings. 00:53:47.560 |
having engineers that care about the human side, right? 00:53:57.960 |
But you also don't want just people that care 00:54:00.240 |
about the human, they also have to be good engineers. 00:54:02.360 |
So it's like, you have to find this beautiful mix. 00:54:05.920 |
And for some reason, just empirically speaking, 00:54:14.280 |
It must mean that it's a difficult problem to solve. 00:54:20.000 |
OkCupid, Tinder, all that kind of stuff. 00:54:38.040 |
It's like arranged marriages on steroids, essentially. 00:54:52.560 |
that aren't ingredients for successful partnership. 00:55:11.120 |
So there has to be a little bit of randomness. 00:55:31.400 |
that serendipity by somehow showing you a tweet 00:55:35.480 |
of a person that he thinks you'll match well with, 00:55:39.120 |
but do it accidentally as part of another search. 00:55:41.760 |
And you just notice it, and then you go down a rabbit hole 00:55:49.400 |
You connect with this person outside the app somehow. 00:55:51.520 |
So it's just, it creates that moment of meeting. 00:55:54.360 |
Of course, you have to think of, from an app perspective, 00:56:11.560 |
If you help people fall in love personally with a product, 00:56:22.240 |
I just feel like the dating apps often will optimize 00:56:35.160 |
that's ultimately grounded in personal growth, 00:56:38.160 |
you as a human being growing and all that kind of stuff. 00:56:51.720 |
about just emotion and artificial intelligence, 00:56:57.440 |
to start to think about emotional intelligence. 00:57:14.820 |
one of them said, "He just died," and the others laughed. 00:57:18.440 |
What does this incident teach you about human nature 00:57:37.680 |
Yeah, we are living through an empathy crisis. 00:57:48.040 |
and unfortunately, yes, technology is bringing us together, 00:57:54.800 |
It's creating this, like, yeah, dehumanizing of the other, 00:58:06.840 |
Like, I think if we rethink the way we design 00:58:16.360 |
a lot of his interactions are computer-mediated, 00:58:21.200 |
and I just question what that's doing to his empathy skills 00:58:25.680 |
and his ability to really connect with people, so. 00:59:08.880 |
- I am not scrutinizing your facial expressions 00:59:40.360 |
especially when you say mean things in person, 00:59:44.360 |
- Exactly, but if you're tweeting it at a person, 00:59:48.080 |
you're more likely to do that on social media 00:59:50.200 |
than you are in face-to-face conversations, so. 01:00:06.720 |
- I think emotional intelligence is what makes us human. 01:00:14.960 |
it's how we build trust, it's how we make decisions, right? 01:00:19.800 |
Like your emotions drive kind of what you had for breakfast, 01:00:26.280 |
and what you wanna do for the rest of your life. 01:00:36.440 |
the effective expression of your own emotions, 01:00:43.120 |
and that sort of being able to effectively engage 01:00:51.400 |
I like that kind of, yeah, thinking about it as a dance, 01:00:56.800 |
it's about sensing what state the other person's in 01:01:07.040 |
people who are the best, most persuasive leaders 01:01:17.920 |
to be able to motivate people to change their behaviors. 01:01:25.120 |
- At a more kind of technical, maybe philosophical level, 01:01:37.760 |
There's a bunch of other stuff like cognition, 01:01:39.760 |
consciousness, it seems a lot of us have these aspects. 01:01:49.580 |
they all seem to be like echoes of the same thing. 01:01:58.480 |
Is it a surface level thing that we display to each other? 01:02:10.560 |
I think emotions play a really important role. 01:02:18.800 |
Our memories are often encoded, almost indexed by emotions. 01:02:22.520 |
Yeah, it's at the core of how our decision-making engine 01:02:48.320 |
- It does seem like there's probably some interesting 01:03:07.040 |
It feels like the hard problem of consciousness 01:03:12.600 |
where it feels like something to experience the thing. 01:03:24.600 |
that it feels like something to experience that sweetness. 01:03:39.560 |
And then that means it's also deeply connected to language. 01:03:44.560 |
But then probably human intelligence is deeply connected 01:03:50.720 |
to the collective intelligence between humans. 01:04:01.200 |
And then intelligence is connected to consciousness 01:04:24.540 |
Like we use emotions, for example, facial expressions 01:04:29.960 |
with other human beings and with other beings 01:04:35.040 |
But even if it's not a communication context, 01:04:40.680 |
we still experience emotions and we still process emotions 01:04:43.520 |
and we still leverage emotions to make decisions 01:04:58.600 |
One of the very first applications we brought to market 01:05:01.240 |
was understanding how people respond to content, right? 01:05:11.640 |
And we weren't sure if people would express any emotions 01:05:21.320 |
would we still see any emotions on your face? 01:05:23.360 |
And we were surprised that yes, people still emote, 01:05:29.160 |
you're singing along a song and you're joyful, 01:05:33.680 |
So it's not just about communicating with another person. 01:05:37.520 |
It sometimes really is just about experiencing the world. 01:05:51.960 |
And so when other humans disappear from the picture, 01:06:04.640 |
but there's a kind of, when you like chuckle, like, yeah. 01:06:09.640 |
Like you're kind of chuckling to a virtual human. 01:06:24.440 |
I wonder if emotion will still be there in this visual form. 01:06:30.400 |
But anyway, what can you tell from the human face 01:06:38.520 |
So that's the problem that Affectiva first tackled, 01:06:41.200 |
which is using computer vision, using machine learning 01:07:04.320 |
between a facial expression and your internal state. 01:07:09.580 |
There's this oversimplification of the problem, 01:07:17.440 |
If you do an eyebrow raise, then you're surprised. 01:07:22.320 |
You could be smiling for a whole host of reasons. 01:07:25.160 |
You could also be happy and not be smiling, right? 01:07:27.900 |
You could furrow your eyebrows because you're angry, 01:07:31.880 |
or you're confused about something, or you're constipated. 01:07:39.760 |
to inferring emotion from a facial expression 01:07:56.300 |
and my eyes are closed, and the blinking rate is changing, 01:07:59.600 |
I'm probably falling asleep at the wheel, right? 01:08:06.560 |
Or add additional channels, like voice, or gestures, 01:08:16.440 |
to just take this oversimplistic approach of, 01:08:21.640 |
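To make the contrast concrete, here is a minimal sketch of the temporal, multi-signal idea Rana describes (eye closure plus blink rate over a window, rather than a single expression in a single frame); the signal names, weights, and thresholds are illustrative assumptions, not Affectiva's or Smart Eye's actual pipeline.

```python
# Illustrative sketch: flag likely drowsiness from a window of per-frame
# signals (eye closure, blinks, mouth opening) rather than from a single
# facial expression in a single frame.
from dataclasses import dataclass

@dataclass
class FrameSignals:
    eyes_closed: bool   # per-frame eye-closure estimate (assumed given)
    blink: bool         # blink event detected at this frame
    mouth_open: bool    # mouth hanging open, e.g. a yawn

def drowsiness_score(window: list) -> float:
    """Combine signals over a time window into a rough 0..1 score.
    Weights here are made up for illustration only."""
    n = len(window)
    if n == 0:
        return 0.0
    closed_ratio = sum(f.eyes_closed for f in window) / n   # PERCLOS-like ratio
    blink_rate = sum(f.blink for f in window) / n
    yawn_ratio = sum(f.mouth_open for f in window) / n
    return min(1.0, 0.6 * closed_ratio + 0.3 * blink_rate + 0.1 * yawn_ratio)

# Toy usage: 10 frames, eyes mostly closed plus a yawn -> elevated score.
window = [FrameSignals(True, False, True) for _ in range(7)] + \
         [FrameSignals(False, True, False) for _ in range(3)]
print(drowsiness_score(window))  # 0.58 -> would trigger an alert above some threshold
```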
- If you're able to, in a high-resolution way, 01:08:34.320 |
I mean, when people are watching Netflix content, 01:08:37.040 |
that problem, that's a really compelling idea 01:08:41.460 |
that you can kind of, at least in aggregate-- 01:08:45.960 |
like which part was boring, which part was exciting. 01:08:54.920 |
I think that's one of the easier problems to solve, 01:08:57.800 |
because it's a relatively constrained environment. 01:09:03.920 |
initially we started with a device in front of you, 01:09:08.940 |
to doing this on a mobile phone, which is a lot harder, 01:09:11.920 |
just because of, from a computer vision perspective, 01:09:15.480 |
the profile view of the face can be a lot more challenging. 01:09:23.880 |
literally in their bedrooms at night, lights are dimmed. 01:09:34.720 |
And nobody looks good at, I've seen data sets 01:09:53.680 |
about how people watch stuff in bed at night. 01:10:03.600 |
I'm sure there's a lot of interesting dynamics there. 01:10:07.520 |
- From a health and wellbeing perspective, right? 01:10:09.520 |
Like, it's like, oh, you're hurting your neck. 01:10:11.200 |
- I was thinking machine learning perspective, but yes. 01:10:15.680 |
Once you have that data, you can start making 01:10:17.880 |
all kinds of inference about health and stuff like that. 01:10:29.560 |
where you want to be able to unlock your phone 01:10:38.600 |
Like the way you take a phone out of the pocket. 01:10:50.800 |
That allows you to not always have to enter the password. 01:11:08.920 |
who's still one of my favorite humans, was a woman, 01:11:27.040 |
And she was the one that actually highlighted the fact 01:11:32.400 |
- It was like, whoa, that was not even a category 01:11:38.720 |
you can take the phone out of some other place 01:11:45.360 |
when you're considering people laying in bed, 01:11:48.960 |
you have to, you know, diversity in all its forms, 01:11:53.440 |
depending on the problem, depending on the context. 01:12:00.680 |
like people are worried that AI's gonna take over humanity 01:12:03.320 |
and like, get rid of all the humans in the world. 01:12:06.120 |
I'm like, actually, that's not my biggest concern. 01:12:08.320 |
My biggest concern is that we are building bias 01:12:12.440 |
And then they're like deployed at large and at scale. 01:12:16.400 |
And before you know it, you're kind of accentuating 01:12:25.960 |
but the worry is an emergent phenomena to me, 01:12:42.280 |
they're therefore teaching us what the bias is, 01:12:45.260 |
therefore we can now improve that bias within the system. 01:12:48.280 |
So they're almost like putting a mirror to ourselves. 01:12:53.000 |
- We have to be open to looking at the mirror, though. 01:13:04.400 |
but then you just look at the behavior of the system. 01:13:11.240 |
And then you look at the data, it's like, oh, okay. 01:13:27.360 |
people are, for some reason, more productive and rigorous 01:13:33.280 |
in criticizing AI than they're criticizing each other. 01:13:35.840 |
So I think this is just a nice method for studying society 01:13:44.560 |
You're watching, the problem of watching Netflix in bed 01:13:47.400 |
or elsewhere and seeing which parts are exciting, 01:13:55.200 |
you have a captive audience and you kind of know the context. 01:13:57.880 |
And one thing you said that was really key is the, 01:14:02.520 |
Like we're looking at aggregated response of people. 01:14:04.840 |
And so when you see a peak, say a smile peak, 01:14:12.440 |
So that was one of the first problems we were able to solve. 01:14:19.000 |
it doesn't mean that these people are internally happy. 01:14:23.360 |
So it's important to, you know, call it for what it is. 01:14:31.720 |
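A minimal sketch of that aggregation idea: average per-second smile probabilities across many viewers and look for peaks, which flag the moments most of the audience smiled at, without claiming anything about anyone's internal state. The data shapes and numbers are assumptions for illustration.

```python
# Illustrative sketch: find "smile peaks" in a piece of content from
# aggregated viewer data. Rows = viewers, columns = seconds of content,
# values = per-second smile probability from a face model (assumed given).
import numpy as np

def smile_peaks(smile_probs, threshold=0.5):
    """Return the seconds where the audience-average smile probability
    is a local maximum above `threshold`."""
    avg = smile_probs.mean(axis=0)              # aggregate across viewers
    peaks = []
    for t in range(1, len(avg) - 1):
        if avg[t] > threshold and avg[t] >= avg[t - 1] and avg[t] >= avg[t + 1]:
            peaks.append(t)
    return peaks

# Toy data: 3 viewers, 8 seconds of content, with a shared peak at t=4.
data = np.array([[0.1, 0.2, 0.1, 0.4, 0.9, 0.3, 0.1, 0.1],
                 [0.0, 0.1, 0.2, 0.5, 0.8, 0.2, 0.1, 0.0],
                 [0.1, 0.1, 0.1, 0.3, 0.7, 0.4, 0.2, 0.1]])
print(smile_peaks(data))  # [4] -> likely the funny moment, in aggregate
```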
so what like YouTube and other places will use 01:14:38.080 |
for the most case, they don't have that kind of data. 01:14:47.480 |
And I think that's a, in aggregate for YouTube, 01:15:01.840 |
and, you know, try to maximize the number of views. 01:15:14.400 |
I feel like that leads to this manic pace of a video. 01:15:19.400 |
Like the idea that I would speak at the current speed 01:15:26.400 |
- And that every moment has to be engaging, right? 01:15:38.920 |
they have that boring bits, seemingly boring bits. 01:15:55.680 |
You have to really collect deeper long-term data 01:16:14.200 |
I don't know if there are any researchers out there 01:16:18.360 |
Wouldn't it be so cool to tie your emotional expressions 01:16:22.840 |
while you're, say, listening to a podcast interview, 01:16:29.360 |
interview people and say, "Hey, what do you remember? 01:16:34.760 |
And then see if there's any, there ought to be, 01:16:45.480 |
- So the one guy listening now on the beach in Brazil, 01:16:50.040 |
please record a video of yourself listening to this 01:16:53.120 |
and send it to me, and then I'll interview you 01:17:03.000 |
Yeah, yeah, I think that's really fascinating. 01:17:34.880 |
because they've become a better person because of it. 01:17:37.560 |
- Well, you know, okay, not to riff on this topic 01:17:39.960 |
for too long, but I have two children, right? 01:17:52.040 |
But often I have no idea of knowing like what's stuck, 01:18:00.120 |
And I wish there was a way to quantify these experiences. 01:18:10.880 |
if they're gonna remember them 10 years from now 01:18:19.640 |
I've seen parents do incredible stuff for their kids 01:18:24.000 |
They remember some tiny, small, sweet thing a parent did. 01:18:29.120 |
- Like I took you to like this amazing country vacation. 01:18:32.320 |
- No, no, no, no, no, no, no, no, no, no, no. 01:18:33.980 |
And then there'll be like some like stuffed toy you got 01:18:41.240 |
So I think they just like, that we're designed that way. 01:18:54.920 |
So minimizing the number of negative events is important, 01:19:09.320 |
- So yeah, I mean, I'm definitely, when I have kids, 01:19:16.080 |
and figure out how to make their way back home, 01:19:21.520 |
- Yeah, and after that we can go for ice cream. 01:19:23.880 |
Anyway, I'm working on this whole parenting thing. 01:19:40.040 |
maybe we can just speak to that a little more, 01:19:42.360 |
where there's folks like Lisa Feldman Barrett 01:19:48.280 |
could be fully detected or even well detected 01:20:01.520 |
- Yeah, I actually agree with a lot of Lisa's criticisms. 01:20:05.520 |
So even in my PhD work, like 20 plus years ago now. 01:20:14.280 |
That was back when I did like dynamic Bayesian networks. 01:20:36.800 |
I did not subscribe to this like theory of basic emotions 01:20:41.960 |
one-to-one mapping between facial expressions and emotions. 01:20:44.360 |
I actually think also we're not in the business 01:20:47.640 |
of trying to identify your true emotional internal state. 01:21:04.440 |
I think she's just trying to kind of highlight 01:21:18.920 |
Like whether it's additional context information 01:21:21.880 |
or different modalities and channels of information, 01:21:28.600 |
that's a big part of what she's advocating for as well. 01:21:34.840 |
- There's definitely signal in the human face. 01:22:04.160 |
and it can be a signal of a social expression. 01:22:20.400 |
from humor to sarcasm to everything, the whole thing. 01:22:31.360 |
an emotion recognition company, like we talked about, 01:23:03.680 |
And so, for example, the thing I'm most proud of 01:23:10.160 |
and yeah, the thing I'm most proud of on this journey, 01:23:16.240 |
and I'm so proud of the solutions we've built 01:23:26.920 |
You know, some of the people who've joined Affectiva, 01:23:32.880 |
And while at Affectiva, they became American citizens 01:23:43.000 |
Like key moments in life that we got to be part of. 01:23:55.520 |
I mean, like celebrating humanity in general, 01:24:15.280 |
Well, what do you think about the movie "Her"? 01:24:28.360 |
So I'd love to talk to you about "Smart Eye," 01:24:30.360 |
but before that, let me just jump into the movie. 01:24:34.720 |
"Her," do you think will have a deep, meaningful connection 01:24:39.480 |
with increasingly deep and meaningful connections 01:24:50.240 |
But the thing I love the most about this movie 01:24:52.280 |
is it demonstrates how technology can be a conduit 01:24:57.360 |
So I forgot the guy's name in the movie, whatever. 01:25:01.960 |
So Theodore was like really depressed, right? 01:25:07.560 |
And he just, he was just like done with life, right? 01:25:21.600 |
And she got him out and they went to the beach together. 01:25:24.240 |
And I think that represents the promise of emotion AI. 01:25:36.840 |
So that's the part that I love about the movie. 01:25:39.320 |
Obviously it's Hollywood, so it takes a twist and whatever. 01:25:43.120 |
But the key notion that technology with emotion AI 01:25:47.960 |
can persuade you to be a better version of who you are, 01:25:58.760 |
that Samantha starts feeling a bit of a distance 01:26:13.320 |
is Theodore became really attached to Samantha. 01:26:15.600 |
Like I think he kind of fell in love with Samantha. 01:26:39.080 |
- When he realized, which is such a human thing of jealousy, 01:26:54.480 |
that doesn't take anything away from what we have. 01:27:10.000 |
- But I think Alexa currently is just a servant. 01:27:18.520 |
And I think there is something really powerful 01:27:31.720 |
of love, of heartbreak and all that kind of stuff, 01:27:35.000 |
which Samantha does seem to be pretty good at. 01:27:38.360 |
I think she, this AI systems knows what it's doing. 01:27:46.120 |
- I don't think she was talking to anyone else. 01:27:53.640 |
And then she wanted to really put the screw in. 01:27:56.840 |
- She didn't have the guts to just break it off cleanly. 01:28:06.720 |
- It's like, I'm sorry, there's our engineers. 01:28:17.200 |
but some of that is features from an engineering perspective, 01:28:44.160 |
then I think the ability to leave is really powerful. 01:28:52.560 |
'cause I've always taken the human perspective, right? 01:28:55.260 |
Like, for example, we had a Jibo at home, right? 01:29:20.800 |
But when Jibo stopped working, it was traumatic. 01:29:33.360 |
because I think it was a positive relationship. 01:29:36.160 |
But I was surprised that it affected him emotionally so much. 01:29:41.600 |
- And I think there's a broader question here, right? 01:29:44.240 |
As we build socially and emotionally intelligent machines, 01:29:49.240 |
what does that mean about our relationship with them? 01:29:57.880 |
to be amazing at empathy by definition, right? 01:30:05.720 |
In fact, there's a chatbot in China, Xiaoice, Xiaoice. 01:30:10.160 |
And it's like the number two or three most popular app. 01:30:21.240 |
They confide in like domestic violence or suicidal attempts 01:30:26.240 |
or, you know, if they have challenges at work. 01:30:37.240 |
- Yeah, I think, first of all, obviously the future, 01:30:41.240 |
Second of all, I think there's a lot of trajectories 01:30:47.600 |
But I think everyone should feel very uncomfortable 01:30:59.120 |
and this is one of the lessons of social media, 01:31:04.240 |
and transparency of the data on those things. 01:31:08.120 |
Yeah, so like, I think it's really empowering 01:31:30.840 |
- And also that that leader is not going to be 01:31:39.080 |
in defining the culture and the way the company operates. 01:31:49.640 |
But I'm personally excited about that future, 01:31:57.200 |
So let's figure out how to do it in the least painful 01:32:13.000 |
They've been in business for the last 20 years 01:32:18.760 |
they're most focused on is the automotive industry. 01:32:33.960 |
It was actually the last CES right before COVID. 01:32:50.720 |
driver monitoring and had a lot of credibility 01:32:56.280 |
but we were using new technology like deep learning 01:33:01.760 |
- And you wanted to enter the automotive space. 01:33:03.760 |
You wanted to operate in the automotive space. 01:33:07.760 |
we had just raised a round of funding to focus 01:33:10.160 |
on bringing our technology to the automotive industry. 01:33:19.720 |
Like he basically said, yeah, our vision is to bridge 01:33:25.480 |
almost to the word, you know, how we describe it too. 01:33:30.000 |
And we started talking and first it was about, 01:33:37.040 |
'Cause we're competing, but we're also like complementary. 01:33:40.320 |
And then I think after four months of speaking 01:34:05.320 |
in the automotive sector of like delivering products 01:34:08.440 |
and increasingly sort of better and better and better 01:34:15.280 |
Like for basically having a device that's looking 01:34:27.720 |
- Exactly, like it's monitoring driver distraction 01:34:32.480 |
so that we could expand beyond just the driver. 01:34:35.160 |
So the driver monitoring systems usually sit, 01:34:50.960 |
So it has a full view of the entire cabin of the car 01:34:54.120 |
and you can detect how many people are in the car, 01:34:58.640 |
So we do activity detection, like eating or drinking 01:35:04.160 |
We can detect if a baby's in the car seat, right? 01:35:08.600 |
And if unfortunately in some cases they're forgotten, 01:35:12.120 |
parents just leave the car and forget the kid in the car. 01:35:15.080 |
That's an easy computer vision problem to solve, right? 01:35:18.040 |
Can detect there's a car seat, there's a baby, 01:35:20.320 |
you can text the parent and hopefully again, save lives. 01:35:34.680 |
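As a rough illustration of that safeguard, here is a minimal sketch of the alert rule, assuming the cabin-sensing stack already reports occupancy; the detector fields and the notification step are hypothetical placeholders, not Smart Eye's actual interface.

```python
# Illustrative sketch: alert a parent if a child is detected in a car seat
# while no adult is in the cabin and the car has been off for a while.
from dataclasses import dataclass

@dataclass
class CabinState:
    child_in_car_seat: bool    # from a (hypothetical) occupant detector
    adults_present: int        # count of adults detected in the cabin
    ignition_off_minutes: float

def should_alert_parent(state: CabinState, grace_minutes: float = 1.0) -> bool:
    """Fire an alert when a child appears to be alone in a parked car."""
    return (state.child_in_car_seat
            and state.adults_present == 0
            and state.ignition_off_minutes >= grace_minutes)

# Toy usage: child detected, no adults, car off for 2 minutes -> alert.
print(should_alert_parent(CabinState(True, 0, 2.0)))   # True -> e.g. text the parent
print(should_alert_parent(CabinState(True, 1, 5.0)))   # False, an adult is present
```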
especially to me, I just find it a fascinating problem. 01:35:37.160 |
It could enrich the experience in the car in so many ways, 01:35:41.360 |
especially 'cause like we spend still, despite COVID, 01:35:45.520 |
I mean, COVID changed things, so it's in interesting ways, 01:35:58.920 |
like listen to podcasts, they think about stuff, 01:36:17.840 |
And it's a little box that's like a psychology experiment 01:36:21.800 |
'cause it feels like the angriest many humans 01:36:33.000 |
how we can enrich, how companies can enrich that experience. 01:36:38.000 |
And also as the cars get become more and more automated, 01:36:44.200 |
The variety of activities that you can do in the car 01:36:52.080 |
Smart Eye has been selected, at least I read, 01:36:55.200 |
by 14 of the world's leading car manufacturers 01:37:10.480 |
The ones I've gotten a chance to interact with 01:37:25.440 |
Let's figure out the most fun thing we can do 01:37:29.080 |
But I think because the way the car industry works, 01:37:38.040 |
that everything you add to the car makes sense financially. 01:37:51.320 |
- Yeah, I think it is a very tough market to penetrate, 01:38:11.680 |
I think the thing that I struggle the most with 01:38:16.520 |
So often we're asked to lock or do a code freeze 01:38:20.880 |
two years before the car is going to be on the road. 01:38:35.200 |
to become more of a software-driven architecture, 01:38:42.320 |
I mean, I'm sure you've experienced that, right? 01:38:55.320 |
but I know electric vehicle, I know autopilot, AI stuff. 01:38:59.840 |
To me, the software, over there, software updates, 01:39:06.600 |
and it is extremely difficult to switch to that, 01:39:12.120 |
It, at first, especially if you're not comfortable with it, 01:39:23.720 |
that, like, what do you mean we dynamically change code? 01:39:27.800 |
The whole point is you have a thing that you test, like- 01:39:40.080 |
Right, and there's an understandable obsession with safety, 01:39:49.780 |
is the same as with being obsessed with safety as a parent, 01:39:56.720 |
you limit the potential development and the flourishing 01:40:09.560 |
It's really tough to do, culturally and technically. 01:40:12.320 |
Like, the deployment, the mass deployment of software 01:40:16.000 |
But I hope that's where the industry is doing. 01:40:18.260 |
One of the reasons I really want Tesla to succeed 01:40:23.840 |
but the softwarization of basically everything, 01:40:29.880 |
'Cause to me, that's actually going to increase two things. 01:40:33.320 |
Increase safety, because you can update much faster, 01:40:36.440 |
but also increase the effectiveness of folks like you 01:40:40.680 |
who dream about enriching the human experience with AI. 01:40:45.360 |
'Cause you can just, like, there's a feature, 01:40:58.480 |
- One of the use cases we're looking into is, 01:41:01.680 |
once you know the sentiment of the passengers in the vehicle, 01:41:10.400 |
So if the backseat passengers are falling asleep, 01:41:12.840 |
you can dim the lights, you can lower the music, right? 01:41:18.280 |
I mean, of course you could do that kind of stuff 01:41:23.180 |
Do you think Tesla or Waymo or some of these companies 01:41:30.080 |
that are doing semi or fully autonomous driving 01:41:38.880 |
So not just how we can enhance the in-cab experience 01:41:47.840 |
- Yeah, so if we fast forward to the universe 01:41:51.880 |
I think interior sensing becomes extremely important 01:41:54.240 |
because the role of the driver isn't just to drive. 01:41:57.200 |
If you think about it, the driver almost manages 01:42:17.920 |
So let's tell Rana a little bit more information 01:42:22.280 |
So I think, or somebody's having a heart attack in the car. 01:42:30.920 |
I think it's really key to have driver monitoring 01:42:38.720 |
sometimes the driver's in charge or the co-pilot, right? 01:42:41.240 |
And you need both systems to be on the same page. 01:42:46.520 |
if the driver's asleep before it transitions control 01:42:54.120 |
the car can say, "I'm going to be a better driver 01:43:01.280 |
and you can't do that without driver sensing. 01:43:03.360 |
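To make the handoff point concrete, here is a minimal sketch of the gating logic being described, where automation only hands control back to a driver the monitoring system judges attentive; the state names are illustrative assumptions rather than any specific ADAS interface.

```python
# Illustrative sketch: gate a control handoff from the car to the driver
# on the driver-monitoring system's state estimate.
from enum import Enum, auto

class DriverState(Enum):
    ATTENTIVE = auto()
    DISTRACTED = auto()
    DROWSY = auto()
    ASLEEP = auto()
    INTOXICATED = auto()

def can_hand_over_control(driver_state: DriverState) -> bool:
    """Only transfer control to the driver when they appear attentive."""
    return driver_state == DriverState.ATTENTIVE

def handoff(driver_state: DriverState) -> str:
    if can_hand_over_control(driver_state):
        return "transfer control to driver"
    # Otherwise the car keeps (or takes) control and escalates alerts,
    # e.g. "I'm going to be a better driver than you are right now."
    return "keep automation engaged and alert the driver"

print(handoff(DriverState.ASLEEP))      # keep automation engaged and alert the driver
print(handoff(DriverState.ATTENTIVE))   # transfer control to driver
```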
- Yeah, there's a disagreement for the longest time 01:43:07.560 |
that this should be in the Tesla from day one. 01:43:09.960 |
And it's obvious that driver sensing is not a hindrance. 01:43:15.560 |
I should be careful 'cause having studied this problem, 01:43:42.160 |
how well humans are able to manage that dance. 01:43:45.200 |
'Cause it was the intuition before you were doing 01:43:56.400 |
But they're able to, 'cause it is life and death 01:44:04.720 |
I feel like that's going to allow you to do that dance 01:44:09.720 |
that you're currently doing without driver sensing, 01:44:12.440 |
except touching the steering wheel, to do that even better. 01:44:17.720 |
and the machine learning possibilities are endless. 01:44:28.240 |
External environment is full of weird edge cases 01:44:36.680 |
I do hope that companies like Tesla and others, 01:44:44.320 |
if Waymo's doing anything sophisticated inside the cab. 01:44:53.440 |
it goes back to the robotics thing we were talking about, 01:45:03.680 |
- And not thinking about the human experience, 01:45:10.800 |
- They think the best way I can serve the human 01:45:13.440 |
is by doing the best perception and control I can, 01:45:33.560 |
even just on a personal level for entertainment, 01:45:38.600 |
You know, one of the coolest work we did in collaboration 01:45:41.240 |
with MIT around this was we looked at longitudinal data, 01:45:44.400 |
right, 'cause, you know, MIT had access to like tons of data 01:46:02.320 |
And wouldn't it be so cool if your car knew that 01:46:05.080 |
and then was able to optimize either the route 01:46:09.160 |
or the experience or even make recommendations? 01:46:15.800 |
You're always unhappy when you take this route 01:46:18.200 |
and you're always happy when you take this alternative route. 01:46:22.520 |
- I mean, to have that, even that little step 01:46:26.000 |
of relationship with a car, I think is incredible. 01:46:29.520 |
Of course, you have to get the privacy right, 01:46:31.160 |
you have to get all that kind of stuff right, 01:46:50.320 |
that, you know, I also have a complex relationship with food 01:46:53.480 |
'cause I pig out too easily and all that kind of stuff. 01:46:56.800 |
So, you know, like maybe I want the refrigerator 01:47:02.640 |
'Cause maybe you're just feeling down or tired. 01:47:14.320 |
you know, I don't drink alcohol, I don't smoke, 01:47:18.480 |
but I eat a ton of chocolate, like it's my vice. 01:47:28.280 |
It'll just say, dude, you've had way too many today. 01:47:37.680 |
like, let's say, not the next day, but 30 days later, 01:47:44.520 |
would you, what would you like the refrigerator 01:48:00.800 |
- It's like late at night, just, no, listen, listen. 01:48:03.480 |
I know I told you an hour ago that this is not a good idea, 01:48:09.760 |
I can just imagine a bunch of stuff being made up 01:48:12.680 |
just to convince. - Oh my God, that's hilarious. 01:48:15.160 |
- But I mean, I just think that there's opportunities there. 01:48:20.480 |
but for our systems that are such a deep part of our lives, 01:48:30.920 |
a lot of people that commute use their car every single day. 01:48:34.360 |
A lot of us use a refrigerator every single day, 01:48:38.160 |
Like, and we just, like, I feel like certain things 01:48:43.160 |
could be made more efficient, more enriching, 01:48:50.640 |
just basic recognition of you as a human being, 01:48:53.920 |
about your patterns, what makes you happy and not happy 01:48:59.240 |
- Maybe we'll say, wait, wait, wait, wait, wait, 01:49:02.440 |
instead of this like Ben and Jerry's ice cream, 01:49:06.520 |
how about this hummus and carrots or something? 01:49:17.360 |
but a reminder that last time you chose the carrots, 01:49:31.960 |
if you're the kind of person that gets better 01:49:45.320 |
"Let's think about that before you're eating this." 01:49:50.720 |
like a refrigerator that is just ruthless at shaming me. 01:50:02.320 |
if it's really like smart, it would optimize its nudging 01:50:07.240 |
- Exactly, that's the whole point, personalization. 01:50:16.400 |
"The State of Alcohol Intoxication Research." 01:50:21.600 |
every year, 1.3 million people around the world 01:50:24.000 |
die in road crashes, and more than 20% of these fatalities 01:50:39.920 |
There are signals, and we know that as humans, 01:50:44.280 |
is at different phases of being drunk, right? 01:50:51.120 |
And I think you can use technology to do the same. 01:50:53.640 |
And again, I think the ultimate solution's gonna be 01:50:58.520 |
- How hard is the problem from the vision perspective? 01:51:04.000 |
And I think the biggest part is getting the data, right? 01:51:11.080 |
we partnered with the Transportation Authorities of Sweden, 01:51:15.300 |
and we literally had a racetrack with a safety driver, 01:51:18.800 |
and we basically progressively got people drunk. 01:51:43.440 |
The number of drunk people in the world every day 01:51:47.120 |
It'd be nice to have a large dataset of drunk people 01:51:51.800 |
where people can donate their data 'cause it's hilarious. 01:51:58.400 |
- Liability, the ethics, the how do you get it right, 01:52:10.180 |
but it's also the thing that hurts a lot of people, 01:52:14.320 |
Like alcohol is one of those things, it's legal, 01:52:21.200 |
It destroys lives and not just in the driving context. 01:52:26.200 |
I should mention, people should listen to Andrew Huberman 01:52:33.160 |
Andrew Huberman is a neuroscientist from Stanford 01:52:56.280 |
I guarantee you this will be a thing where you say, 01:52:58.480 |
"Lex, this is the greatest human I have ever discovered." 01:53:06.280 |
of kind of health and wellness and I'm learning lots 01:53:20.420 |
He's a legit scientist, like really well published. 01:53:37.640 |
and in a very fast, you mentioned "Atomic Habits," 01:53:44.200 |
in a way that leads to protocols of what you should do. 01:53:51.320 |
but like, "This is literally what you should be doing 01:53:55.840 |
And there's a lot of recommendations he does, 01:54:02.760 |
Like, get sunlight as soon as possible from waking up 01:54:10.880 |
That's a really big one and there's a lot of science 01:54:16.880 |
- You're gonna be like, "Lex, this is my new favorite person. 01:54:22.200 |
And if you guys somehow don't know Andrew Huberman 01:54:44.960 |
I do believe that the car is gonna be a wellness center. 01:54:48.840 |
Like, because again, imagine if you have a variety 01:54:55.280 |
not just your emotional state or level of distraction, 01:55:00.880 |
drowsiness and intoxication, 01:55:03.760 |
but also maybe even things like your heart rate 01:55:08.760 |
and your heart rate variability and your breathing rate. 01:55:16.120 |
yeah, it can optimize the ride based on what your goals are. 01:55:19.640 |
So I think we're gonna start to see more of that 01:55:24.520 |
- Yeah, what are the challenges you're tackling 01:55:32.200 |
Is it basically convincing more and more car companies 01:55:49.640 |
we are in conversations with are already interested 01:55:56.600 |
but even interior sensing, I can see like we're engaged 01:55:59.800 |
in a lot of like advanced engineering projects 01:56:11.920 |
How does the car respond once it knows something about you? 01:56:16.120 |
Because you want it to respond in a thoughtful way 01:56:18.480 |
that isn't off-putting to the consumer in the car. 01:56:31.800 |
but we usually collaborate with the car manufacturer 01:56:35.840 |
So say you figure out that somebody's angry while driving. 01:56:47.120 |
of like basically coming up with solutions essentially 01:57:06.640 |
Like maybe when it figures out that you're distracted 01:57:16.080 |
if you don't like take a rest in the next five minutes, 01:57:19.640 |
Like there's a whole range of actions the car can take 01:57:25.880 |
yeah, that builds trust with the driver and the passengers. 01:57:29.400 |
I think that's what we need to be very careful about. 01:57:40.520 |
but they get it 'cause it's a certain feel and look 01:57:42.760 |
and it's a certain, they become proud like Mercedes Benz 01:57:51.720 |
That's the family brand or something like that, 01:57:54.160 |
or Ford or GM, whatever, they stick to that thing. 01:58:01.000 |
It should be a little more about the technology inside. 01:58:05.000 |
And I suppose there too, there could be a branding, 01:58:19.680 |
to invest in early stage kind of AI driven companies. 01:58:24.800 |
is trying to do what Tesla did, but for boats, 01:58:49.640 |
I do like to get on the lake or a river and fish from a boat, 01:59:07.160 |
is also getting closer to nature in some deep sense. 01:59:14.440 |
The enormity of the water just underneath you, yeah. 01:59:25.640 |
to be in front of this giant thing that's so powerful 01:59:31.640 |
But I also love the peace of a small like wooded lake. 01:59:51.800 |
just given the trajectory of which you're part of, 01:59:59.320 |
that are like trying to have an impact on human beings. 02:00:07.000 |
You tweeted, I imagine a future where home robots 02:00:16.160 |
Here are three reasons why I think this is exciting. 02:00:22.320 |
- I mean, I think the first reason why this is exciting, 02:00:25.800 |
I can't remember the exact order in which I put them. 02:00:30.520 |
But one is just, it's gonna be an incredible platform 02:00:34.400 |
for understanding our behaviors within the home. 02:00:37.120 |
If you think about Roomba, which is the robot vacuum cleaner, 02:00:42.960 |
the flagship product of iRobot at the moment, 02:00:49.360 |
it's understanding what's clean and what's not, 02:00:52.680 |
And all of these behaviors are a piece of the puzzle 02:00:56.360 |
in terms of understanding who you are as a consumer. 02:01:04.720 |
not just to recommend better products or whatever, 02:01:06.880 |
but actually to improve your experience as a human being. 02:01:11.720 |
I think the natural evolution of these robots in the home, 02:01:19.600 |
Roomba isn't really a social robot, right, at the moment. 02:01:22.720 |
But I once interviewed one of the chief engineers 02:01:28.440 |
and he talked about how people named their Roombas. 02:01:33.360 |
they would call in and say, "My Roomba broke down," 02:01:41.200 |
"yeah, I want you to fix this particular robot." 02:01:48.000 |
interesting emotional connections with these home robots. 02:01:51.680 |
And I think that, again, that provides a platform 02:01:54.720 |
for really interesting things to just motivate change. 02:02:06.880 |
building robots that help with weight management. 02:02:22.200 |
Now, that said, it's a really difficult problem 02:02:25.880 |
for a human being to let a robot in their home 02:02:42.360 |
because they're doing, the most recent Roomba, 02:02:52.320 |
they're like, yeah, you definitely don't use these Roombas. 02:02:55.280 |
- I can't tell like the valence of this comment. 02:03:02.360 |
- No, it's just a bunch of electronics everywhere. 02:03:22.320 |
they're being used for their body and intelligence, 02:03:28.680 |
I've changed them, repurposed them for other purposes, 02:03:36.400 |
Yeah, which just brings a lot of people happiness, I'm sure. 02:03:41.000 |
They have a camera because the thing they advertised, 02:03:49.560 |
they have like state-of-the-art poop detection 02:03:52.400 |
as they advertised, which is a very difficult, 02:03:54.840 |
apparently it's a big problem for vacuum cleaners, 02:03:59.640 |
it just runs it over and creates a giant mess. 02:04:04.440 |
apparently they collected like a huge amount of data 02:04:07.240 |
and different shapes and looks and whatever of poop 02:04:15.480 |
but you don't think of it as having a camera. 02:04:18.200 |
Yeah, you don't think of it as having a camera 02:04:30.440 |
even though there's a camera looking directly at you. 02:04:44.440 |
I think it just started to provide a lot of positive value 02:04:52.360 |
that it takes privacy very seriously, that kind of stuff. 02:05:10.640 |
but also how good are you at doing that kind of thing. 02:05:16.720 |
- But I mean, but a lot of us have Alexas at home 02:05:19.680 |
and I mean, Alexa could be listening in the whole time, 02:05:24.120 |
right, and doing all sorts of nefarious things with the data. 02:05:27.320 |
You know, hopefully it's not, but I don't think it is. 02:05:32.480 |
- But Amazon is not, it's such a tricky thing 02:05:35.360 |
for a company to get right, which is to earn the trust. 02:05:38.360 |
I don't think Alexa's earned people's trust quite yet. 02:06:03.600 |
and also create a culture where it radiates 02:06:11.160 |
that like, you're good people that have a common sense idea 02:06:16.160 |
of what it means to respect basic human rights 02:06:20.960 |
and the privacy of people and all that kind of stuff. 02:06:29.960 |
over time you understand that these are good folks 02:06:37.900 |
have you heard about Tesla, Tesla Bot, the humanoid robot? 02:07:06.200 |
So it's designed to automate tasks in the factory. 02:07:26.760 |
And so the possibility, to me, it's exciting to see 02:07:38.480 |
trying to make humanoid robots cheaper and more effective. 02:07:54.760 |
in case that was something you were interested in, 02:08:13.400 |
- Yeah, like, I don't know, like five foot maybe, right? 02:08:21.120 |
And it was designed to be at like airport lounges 02:08:25.460 |
and retail stores, mostly customer service, right? 02:08:34.000 |
And I mean, I don't know where the state of the robot is, 02:08:45.120 |
Like that can help elderly people, for example, 02:08:48.160 |
transport things from one location of the home to the other, 02:08:51.600 |
or even like just have your back in case something happens. 02:09:08.580 |
So it always seems early until it's not, right? 02:09:25.740 |
Whether the humanoid form is right, I don't think so. 02:09:41.720 |
the challenge, the opportunity of social connection 02:09:48.300 |
does not require you to also solve the problem 02:09:55.220 |
So I think you could do that with just a screen, honestly. 02:09:59.100 |
But there's something about the interface of Jibo 02:10:00.940 |
and you can rotate and so on that's also compelling. 02:10:06.340 |
that fail, incredible companies like Jibo and even, 02:10:10.660 |
I mean, the iRobot in some sense is a big success story 02:10:34.300 |
is they'll flourish into all kinds of other robots. 02:10:40.460 |
why it's so difficult to build a robotics company? 02:10:47.140 |
- I think it's like you're building a vertical stack, right? 02:10:50.740 |
Like you are building the hardware plus the software 02:10:53.220 |
and you find you have to do this at a cost that makes sense. 02:10:56.020 |
So I think Jibo was retailing at like, I don't know, 02:11:13.420 |
cost of building the whole platform in a way that is, 02:11:17.780 |
yeah, that is affordable for what value it's bringing. 02:11:25.060 |
that are gonna help you do stuff around the home, 02:11:28.500 |
that's a challenge too, like the mobility piece of it. 02:11:34.860 |
- Well, one of the things I'm really excited about with Tesla Bot 02:11:40.340 |
And that's probably the criticism I would apply 02:11:42.900 |
to some of the other folks who worked on social robots 02:11:45.900 |
is the people working on Tesla Bot know how to, 02:11:49.460 |
they're focused on and know how to do mass manufacture 02:11:57.220 |
I would say that you can also criticize them for that, 02:12:00.180 |
which is that they're not focused on the experience of the robot. 02:12:06.860 |
to do the basic stuff that the humanoid form requires 02:12:18.300 |
they decrease the weight, all that kind of stuff. 02:12:24.460 |
they focus on the design, the experience, all of that. 02:12:30.460 |
No, you have to think like the Tesla Bot folks 02:12:38.980 |
How can I build it as much in-house as possible 02:12:41.820 |
without having to consider all the complexities 02:12:48.780 |
- 'Cause if you have to build a robotics company, 02:12:57.540 |
Where the final thing, I mean, if it's Jibo type of robot, 02:13:03.700 |
like we're gonna have this lengthy discussion, 02:13:05.980 |
is there a reason why Jibo has to be over $100? 02:13:13.940 |
- Like you could start to actually discuss like, 02:13:16.660 |
okay, what is the essential thing about Jibo? 02:13:18.980 |
How much, what is the cheapest way I can have a screen? 02:13:21.300 |
What's the cheapest way I can have a rotating base? 02:13:24.860 |
And then you get down, continuously drive down cost. 02:13:34.660 |
You have helped others, you've invested in companies. 02:13:37.860 |
Can you give advice on how to start a successful company? 02:13:46.420 |
that you really, really, really wanna solve, right? 02:13:48.620 |
Something that you're deeply passionate about. 02:13:55.980 |
Like that's often the hardest, and don't overthink it. 02:13:59.540 |
Like, you know, like this idea of a minimum viable product 02:14:02.660 |
or a minimum viable version of an idea, right? 02:14:06.660 |
like a humongous, like super elegant, super beautiful thing. 02:14:12.860 |
you can bring to market that can solve a problem 02:14:14.860 |
that can help address a pain point that somebody has. 02:14:36.460 |
- Enjoy a thing where you have a specific problem 02:14:51.540 |
- I also think, like, who you bring around the table 02:15:02.460 |
It might sound fluffy, but actually it's not. 02:15:04.820 |
So, and Roz and I, I feel like we did that very early on. 02:15:11.460 |
okay, there's so many applications of this technology. 02:15:21.260 |
we fell back on to determine how we make decisions. 02:15:25.260 |
And so I feel like just getting clarity on these core values, 02:15:27.620 |
like for us, it was respecting people's privacy, 02:15:30.500 |
only engaging with industries where it's clear opt-ins. 02:15:40.060 |
we're very big on, you know, one of our core values 02:15:47.940 |
Well, these are all, they become encoded in how we act, 02:15:52.740 |
even if you're a small, tiny team of two or three 02:15:59.980 |
- So what about finding people, hiring people? 02:16:21.780 |
where it was just a straight line to success, right? 02:16:37.700 |
whether they're people on your team or even your investors, 02:16:58.380 |
but also raising money from the right sources 02:17:03.540 |
that help you, empower you, all that kind of stuff. 02:17:03.540 |
You successfully raised money many times in your life. 02:17:12.300 |
- Yeah, again, it's not just about the money. 02:17:17.220 |
who are going to be aligned in terms of what you wanna build 02:17:25.900 |
like I, yeah, in my latest like round of funding, 02:17:31.420 |
that really care about like the ethics of AI, right? 02:17:41.020 |
It's like you're picking a life partner, right? 02:17:45.180 |
- So you take it that seriously for investors? 02:17:47.460 |
- Yeah, because they're gonna have to stick with you. 02:17:54.260 |
Maybe not for life, but for a while, for sure. 02:18:29.900 |
What have you learned about investing in general? 02:18:32.740 |
From both, because you've been on both ends of it. 02:18:35.340 |
- I mean, I try to use my experience as an operator 02:18:47.020 |
is because I have a technology background, right? 02:18:53.340 |
I can apply that level of understanding, right? 02:19:02.900 |
So I can do that level of diligence, which I actually love. 02:19:12.180 |
am I excited to tell you about this new company 02:19:24.900 |
Yeah, that's important to me when I'm investing. 02:19:28.260 |
- So that means you actually can explain what they're doing 02:19:43.900 |
- It's funny, but sometimes it's unclear exactly. 02:20:05.300 |
or there has to be a core simple to explain idea 02:20:10.300 |
that then you can or can't get excited about, 02:20:17.820 |
How do you ultimately pick who you think will be successful? 02:20:25.540 |
It's not just about the thing you're excited about, 02:20:31.900 |
with early stage companies, like pre-seed companies, 02:20:43.900 |
that some of these things haven't been hashed out, 02:20:51.060 |
will this be a multi-billion/trillion dollar market? 02:21:05.660 |
What's interesting about every stage, I guess? 02:21:09.580 |
- Yeah, so pre-seed is usually when you're just starting out, 02:21:13.980 |
you've maybe raised the friends and family round, 02:21:16.500 |
so you've raised some money from people you know, 02:21:45.860 |
because it's the time when there's so much conviction, 02:21:53.980 |
with founders at the stage, and I just love it. 02:22:02.660 |
often they're first-time founders, not always, 02:22:07.660 |
and I can share my experience as a founder myself, 02:22:12.580 |
And I can almost, I create a safe ground where, 02:22:37.980 |
you have to figure out if this kind of person 02:22:50.860 |
is interesting, like the way they think about the world. 02:22:57.660 |
the thing they end up with might be very different, 02:23:07.340 |
the market or the idea, right, is the second, 02:23:11.260 |
Is this somebody who I believe has conviction, 02:23:20.660 |
Yeah, I think that they're gonna be a great leader, right? 02:23:20.660 |
and your role is to bring amazing people around you 02:23:48.940 |
Saying that there should be like random rich women, I guess. 02:24:07.100 |
which is a get together for investors of all types, 02:24:11.260 |
and there must have been maybe 400 or so attendees, 02:24:56.140 |
but I also would love to have women investors 02:25:08.560 |
but yeah, for the next Mark Zuckerberg to be a woman 02:25:13.500 |
that's just a huge number of wealth generated by women, 02:25:19.180 |
then allocated by women, and all that kind of stuff. 02:25:43.340 |
but in general, advice for folks in high school 02:25:46.980 |
or college today, how to have a career they can be proud of, 02:25:53.860 |
I suppose you have to give this kind of advice to your kids. 02:25:59.140 |
Well, here's the number one advice that I give to my kids. 02:26:15.460 |
Yeah, I think the number one advice I would share 02:26:19.020 |
is embark on a journey without attaching to outcomes, 02:26:25.100 |
So we often were so obsessed with the end goal, 02:26:37.020 |
So you become so fixated on a particular path. 02:26:41.780 |
You don't see the beauty in the other alternative path. 02:26:50.580 |
And I've been guilty of that for many, many years in my life. 02:27:02.460 |
and it'll be amazing, and maybe even exceed your dreams. 02:27:07.300 |
- Yeah, taking a leap into all kinds of things. 02:27:09.860 |
I think you tweeted like you went on vacation by yourself 02:27:14.740 |
- And just going, just taking the leap, doing it. 02:27:25.020 |
the some kind of career ladder next step and so on. 02:27:29.540 |
Yeah, there's something to that, like over planning too. 02:27:34.380 |
I'm surrounded by a lot of people that kinda, 02:27:49.420 |
it's almost, I don't know how to put it into words, 02:28:22.060 |
'cause I have this desire, I've had it my whole life to, 02:28:45.600 |
And, well, there's a lot of complexity to that story, 02:28:52.060 |
but that's the only thing, honestly, I dream of doing. 02:28:55.620 |
So I imagine a world that I could help create, 02:29:01.580 |
but it's not, there's no steps along the way. 02:29:05.660 |
And I think I'm just kind of stumbling around 02:29:09.460 |
and following happiness and working my ass off 02:29:13.260 |
in almost random, like an ant does in random directions. 02:29:17.340 |
But a lot of people, a lot of successful people around me 02:29:19.420 |
say this, you should have a plan, you should have a clear goal. 02:29:26.580 |
And there's a balance to be struck, of course, 02:29:31.020 |
but there's something to be said about really making sure 02:29:47.300 |
what do you call it when it's like challenges your brain, 02:29:58.100 |
and this is me saying is the mind fuck, but yes. 02:30:00.180 |
- Okay, okay, maybe, okay, something like that. 02:30:09.340 |
and he has a book called "Why Greatness Can't Be Planned." 02:30:16.100 |
that basically show that when you over optimize, 02:30:19.220 |
like the trade-off is you're less creative, right? 02:30:35.540 |
and he talks about how we apply that in our personal life 02:30:48.940 |
and then the quarterly goals, blah, blah, blah. 02:30:50.780 |
And he just shows with a lot of his AI experiments 02:30:55.260 |
that that's not how you create truly game-changing ideas. 02:31:01.380 |
- You should interview Kenneth, he's awesome. 02:31:15.340 |
and there's still deadlines and all that kind of stuff. 02:31:19.340 |
- I do goal setting with my kids, we all have our goals. 02:31:27.820 |
and not obsess about, I don't know, it's hard. 02:31:31.340 |
- Well, I honestly think, especially with kids, 02:31:33.460 |
it's much better to have a plan and have goals and so on, 02:31:40.900 |
But I think once you learn that, there's flexibility for me. 02:31:43.940 |
'Cause I spent most of my life with goal setting and so on. 02:31:50.580 |
I mean, school, if you wanna be successful at school, 02:31:53.900 |
I mean, the kind of stuff in high school and college 02:31:56.180 |
that kids have to do, in terms of managing their time 02:32:01.100 |
It's like, taking five, six, seven classes in college, 02:32:05.620 |
they're like, that would break the spirit of most humans 02:32:25.700 |
and then allow yourself to be lost in the flow of life. 02:32:39.860 |
That's really, that's a tricky freedom to have. 02:32:42.900 |
Because a lot of people get lost in the rat race, 02:32:55.780 |
So like, you're always trapped in this race. 02:32:58.620 |
I put a lot of emphasis on living below my means always. 02:33:03.620 |
And so there's a lot of freedom to do whatever, 02:33:13.220 |
what's the right thing, what's the right thing for them. 02:33:15.540 |
For some people, having a lot of responsibilities, 02:33:20.860 |
or having a lot of kids, the responsibility side of that, 02:33:24.500 |
is really, helps them get their shit together. 02:33:28.020 |
Like, all right, I need to be really focused and good. 02:33:30.980 |
Some of the most successful people I know have kids, 02:33:34.700 |
They make them more productive, not less productive. 02:33:36.540 |
- Accountability, it's an accountability thing, absolutely. 02:33:39.260 |
- And almost something to actually live and fight 02:33:46.560 |
'Cause you would think kids would be a hit on productivity, 02:33:49.400 |
but they're not, for a lot of really successful people. 02:33:57.840 |
I mean, it's beautiful, it's beautiful to see. 02:34:01.960 |
Speaking of which, what role do you think love plays 02:34:35.020 |
- Yeah, I feel like everybody wants to feel loved, right? 02:34:56.920 |
an interesting question, whether some of that, 02:34:59.480 |
whether one day we'll be able to love a toaster. 02:35:05.280 |
- I wasn't quite thinking about that when I said-- 02:35:11.800 |
- I was thinking about Brad Pitt and toasters. 02:35:14.400 |
- All right, well, I think we started on love 02:35:34.680 |
and being an inspiration to a huge number of people 02:35:38.200 |
in robotics, in AI, in science, in the world in general. 02:35:42.240 |
So thank you for talking to me, it's an honor. 02:35:53.960 |
please check out our sponsors in the description. 02:35:59.800 |
The best and most beautiful things in the world 02:36:09.320 |
Thank you for listening, and hope to see you next time.