Rana el Kaliouby: Emotion AI, Social Robots, and Self-Driving Cars | Lex Fridman Podcast #322


Chapters

0:00 Introduction
1:00 Childhood
10:37 Hijab
13:20 Faith
15:28 War
19:37 Women in the Middle East
23:55 Rana's journey
36:30 Rosalind Picard
38:38 Advice for women
49:09 Dating
56:45 Human nature
61:25 AI and emotions
92:03 Smart Eye
101:24 Tesla and Waymo
110:11 Drunk driving
119:42 Robotics
133:29 Advice for startups
138:17 Investing
145:41 Advice for young people
154:01 Love

Whisper Transcript

00:00:00.000 | there's a broader question here, right?
00:00:02.120 | As we build socially and emotionally intelligent machines,
00:00:07.120 | what does that mean about our relationship with them?
00:00:10.840 | And then more broadly,
00:00:12.000 | our relationship with one another, right?
00:00:13.680 | Because this machine is gonna be programmed
00:00:15.760 | to be amazing at empathy by definition, right?
00:00:20.160 | It's gonna always be there for you.
00:00:21.600 | It's not gonna get bored.
00:00:23.520 | I don't know how I feel about that.
00:00:24.940 | I think about that a lot.
00:00:28.280 | - The following is a conversation with Rana el Kaliouby,
00:00:31.720 | a pioneer in the field of emotion recognition
00:00:34.800 | and human-centric artificial intelligence.
00:00:37.400 | She is the founder of Affectiva,
00:00:39.480 | deputy CEO of Smart Eye, author of "Girl Decoded,"
00:00:44.120 | and one of the most brilliant, kind, inspiring,
00:00:46.820 | and fun human beings I've gotten the chance to talk to.
00:00:50.620 | This is the Lex Fridman Podcast.
00:00:52.800 | To support it,
00:00:53.640 | please check out our sponsors in the description.
00:00:56.160 | And now, dear friends, here's Rana el Kaliouby.
00:00:59.800 | You grew up in the Middle East, in Egypt.
00:01:03.520 | What is a memory from that time that makes you smile?
00:01:06.480 | Or maybe a memory that stands out
00:01:08.160 | as helping your mind take shape
00:01:10.720 | and helping you define yourself in this world?
00:01:12.840 | - So the memory that stands out
00:01:14.380 | is we used to live in my grandma's house.
00:01:17.480 | She used to have these mango trees in her garden
00:01:21.000 | and in the summer,
00:01:21.840 | and so mango season was like July and August.
00:01:24.080 | And so in the summer,
00:01:25.440 | she would invite all my aunts and uncles and cousins.
00:01:28.360 | And it was just like maybe there were like 20 or 30 people
00:01:31.200 | in the house and she would cook all this amazing food.
00:01:34.280 | And us, the kids, we would go down the garden
00:01:38.320 | and we would pick all these mangoes.
00:01:40.200 | And I don't know,
00:01:42.220 | I think it's just the bringing people together,
00:01:44.720 | like that always stuck with me, the warmth.
00:01:47.160 | - Around the mango tree.
00:01:48.360 | - Yeah, around the mango tree.
00:01:49.600 | And there's just like the joy,
00:01:51.720 | the joy of being together around food.
00:01:54.320 | And I'm a terrible cook,
00:01:57.040 | so I guess that didn't,
00:01:58.240 | that memory didn't translate to me kind of doing the same.
00:02:02.120 | I love hosting people.
00:02:03.760 | - Do you remember colors, smells?
00:02:05.720 | Is that what, like what, how does memory work?
00:02:08.680 | - Yeah.
00:02:09.520 | - Like what do you visualize?
00:02:10.340 | Do you visualize people's faces, smiles?
00:02:13.320 | Do you, is there colors?
00:02:15.120 | Is there like a theme to the colors?
00:02:18.360 | Is it smells because of food involved?
00:02:21.080 | - Yeah, I think that's a great question.
00:02:22.880 | So those Egyptian mangoes,
00:02:25.240 | there's a particular type that I love
00:02:27.480 | and it's called Darwesi mangoes.
00:02:28.840 | And they're kind of, you know, they're oval
00:02:31.200 | and they have a little red in them.
00:02:33.240 | So I kind of, they're red and mango colored on the outside.
00:02:36.920 | So I remember that.
00:02:38.500 | - Does red indicate like extra sweetness?
00:02:40.840 | Is that? - Yes.
00:02:41.960 | - That means like it's nicely--
00:02:42.800 | - It's like really sweet.
00:02:44.080 | - Yeah, it's nice and ripe and stuff, yeah.
00:02:46.240 | What's like a definitive food of Egypt?
00:02:50.780 | You know, there's like these almost stereotypical foods
00:02:54.000 | in different parts of the world,
00:02:55.120 | like Ukraine invented borscht.
00:02:59.800 | Borscht is this beet soup with,
00:03:02.000 | that you put sour cream on.
00:03:03.200 | See, it's not, I can't see--
00:03:04.040 | - Okay, okay, well you explained it that way.
00:03:05.800 | - If you know what it is, I think you know it's delicious.
00:03:09.700 | But if I explain it, it's just not gonna sound delicious.
00:03:12.540 | I feel like beet soup, this doesn't make any sense.
00:03:15.440 | But that's kind of,
00:03:16.520 | and you probably have actually seen pictures of it
00:03:19.020 | 'cause it's one of the traditional foods in Ukraine,
00:03:22.640 | in Russia, in different parts of the Slavic world.
00:03:26.080 | So, but it's become so cliche and stereotypical
00:03:29.860 | that you almost don't mention it,
00:03:30.960 | but it's still delicious.
00:03:32.520 | Like I visited Ukraine, I eat that every single day.
00:03:35.600 | - Do you make it yourself?
00:03:37.680 | How hard is it to make?
00:03:38.640 | - No, I don't know.
00:03:39.560 | I think to make it well, like anything, like Italians,
00:03:43.160 | they say, well, tomato sauce is easy to make,
00:03:46.520 | but to make it right,
00:03:48.960 | that's like a generational skill.
00:03:51.160 | So anyway, is there something like that in Egypt?
00:03:53.480 | Is there a culture of food?
00:03:55.360 | - There is, and actually we have a similar kind of soup.
00:03:59.120 | It's called molokhia,
00:04:00.960 | and it's made of this green plant.
00:04:04.520 | It's like somewhere between spinach and kale,
00:04:06.920 | and you mince it, and then you cook it in like chicken broth.
00:04:10.600 | And my grandma used to make,
00:04:12.080 | and my mom makes it really well,
00:04:14.000 | and I try to make it, but it's not as great.
00:04:16.520 | So we used to have that,
00:04:17.480 | and then we used to have it alongside stuffed pigeons.
00:04:20.320 | I'm pescetarian now, so I don't eat that anymore, but.
00:04:23.560 | - Stuffed pigeons.
00:04:24.600 | - Yeah, it's like, it was really yummy.
00:04:25.960 | It's the one thing I miss about, you know,
00:04:29.160 | now that I'm pescetarian and I don't eat.
00:04:32.160 | - The stuffed pigeons?
00:04:33.000 | - Yeah, the stuffed pigeons.
00:04:34.680 | (both laughing)
00:04:35.520 | - Is it, what are they stuffed with?
00:04:37.200 | If that doesn't bother you too much to describe.
00:04:39.960 | - No, no, it's stuffed with a lot of like just rice and.
00:04:43.480 | - Oh, got it, got it, got it.
00:04:44.320 | - Yeah, it's just rice, yeah, so.
00:04:46.120 | - And you also, you've said that you're first in your book
00:04:49.920 | that your first computer was an Atari,
00:04:52.720 | and Space Invaders was your favorite game.
00:04:55.200 | Is that when you first fell in love with computers?
00:04:58.200 | Would you say?
00:04:59.040 | - Yeah, I would say so.
00:05:00.280 | - Video games or just the computer itself?
00:05:02.280 | Just something about the machine.
00:05:04.300 | Ooh, this thing.
00:05:05.240 | There's magic in here.
00:05:07.840 | - Yeah, I think the magical moment is definitely
00:05:10.600 | like playing video games with my,
00:05:12.160 | I have two younger sisters,
00:05:13.640 | and we just like had fun together, like playing games.
00:05:17.240 | But the other memory I have is my first code.
00:05:21.320 | The first code I wrote, I wrote, I drew a Christmas tree.
00:05:25.760 | And I'm Muslim, right?
00:05:26.840 | So it's kind of, it was kind of funny
00:05:28.880 | that the first thing I did was like this Christmas tree.
00:05:32.040 | So yeah, and that's when I realized,
00:05:35.760 | wow, you can write code to do all sorts
00:05:38.040 | of like really cool stuff.
00:05:40.600 | I must have been like six or seven at the time.
00:05:43.360 | So you can write programs
00:05:45.760 | and the programs do stuff for you.
00:05:48.080 | That's power.
00:05:49.440 | That's, if you think about it, that's empowering.
00:05:51.520 | - It's AI.
00:05:52.360 | - Yeah, I know, well, it is.
00:05:53.960 | I don't know what that, you see, like,
00:05:55.920 | I don't know if many people think of it that way
00:05:59.000 | when they first learned to program.
00:06:00.200 | They just love the puzzle of it.
00:06:01.500 | Like, ooh, this is cool, this is pretty.
00:06:03.600 | It's a Christmas tree, but like, it's power.
00:06:06.320 | - It is power.
00:06:07.880 | - Eventually, I guess you couldn't at the time,
00:06:10.040 | but eventually this thing, if it's interesting enough,
00:06:13.020 | if it's a pretty enough Christmas tree,
00:06:14.800 | it can be run by millions of people
00:06:16.720 | and bring them joy, like that little thing.
00:06:19.360 | And then because it's digital, it's easy to spread.
00:06:22.480 | So like you just created something
00:06:24.120 | that's easily spreadable to millions of people.
00:06:26.800 | - Totally.
00:06:28.280 | - It's hard to think that way when you're six.
00:06:30.920 | In the book you write, "I am who I am
00:06:34.080 | because I was raised by a particular set of parents,
00:06:37.160 | both modern and conservative,
00:06:39.160 | forward-thinking yet locked in tradition.
00:06:41.920 | I'm a Muslim and I feel I'm stronger, more centered for it.
00:06:46.200 | I adhere to the values of my religion,
00:06:48.080 | even if I'm not as dutiful as I once was.
00:06:51.040 | And I am a new American and I'm thriving
00:06:54.500 | on the energy, vitality, and entrepreneurial spirit
00:06:57.800 | of this great country."
00:06:59.080 | So let me ask you about your parents.
00:07:01.640 | What have you learned about life from them,
00:07:04.040 | especially when you were young?
00:07:05.360 | - So both my parents, they're Egyptian,
00:07:07.560 | but they moved to Kuwait right after.
00:07:09.840 | They actually, there's a cute story about how they met.
00:07:11.720 | So my dad taught COBOL in the '70s.
00:07:15.040 | - Nice.
00:07:15.880 | - And my mom decided to learn programming.
00:07:18.360 | So she signed up to take his COBOL programming class.
00:07:21.480 | And he tried to date her and she was like,
00:07:25.620 | "No, no, no, I don't date."
00:07:26.720 | And so he's like, "Okay, I'll propose."
00:07:28.360 | And that's how they got married.
00:07:29.840 | - Whoa, strong move.
00:07:31.120 | - Right, exactly right.
00:07:32.580 | - That's really impressive.
00:07:35.960 | Those COBOL guys know how to impress a lady.
00:07:40.800 | So yeah, so what have you learned from them?
00:07:43.760 | - So definitely grit.
00:07:44.920 | One of the core values in our family is just hard work.
00:07:48.600 | There were no slackers in our family.
00:07:50.920 | And that's something that's definitely stayed with me,
00:07:55.200 | both as a professional, but also in my personal life.
00:07:58.940 | But I also think my mom, my mom always used to,
00:08:02.880 | I don't know, it was like unconditional love.
00:08:07.120 | I just knew my parents would be there for me,
00:08:10.280 | kind of regardless of what I chose to do.
00:08:12.580 | And I think that's very powerful.
00:08:15.680 | And they got tested on it because I kind of challenged,
00:08:18.160 | you know, I challenged cultural norms
00:08:20.320 | and I kind of took a different path, I guess,
00:08:23.080 | than what's expected of a woman in the Middle East.
00:08:27.520 | And they still love me, which is,
00:08:31.420 | I'm so grateful for that.
00:08:32.640 | - When was like a moment
00:08:33.760 | that was the most challenging for them?
00:08:35.600 | Which moment were they kind of,
00:08:39.600 | they had to come face to face with the fact
00:08:41.920 | that you're a bit of a rebel?
00:08:44.240 | - I think the first big moment was when I,
00:08:47.380 | I had just gotten married,
00:08:50.400 | but I decided to go do my PhD at Cambridge University.
00:08:54.120 | And because my husband at the time, he's now my ex,
00:08:58.860 | ran a company in Cairo, he was gonna stay in Egypt.
00:09:01.460 | So it was gonna be a long distance relationship.
00:09:04.200 | And that's very unusual in the Middle East
00:09:06.840 | for a woman to just head out and kind of pursue her career.
00:09:10.960 | And so my dad and my parents-in-law both said,
00:09:15.960 | "You know, we do not approve of you doing this,
00:09:19.840 | but now you're under the jurisdiction of your husband,
00:09:22.320 | so he can make the call."
00:09:24.720 | And luckily for me, he was supportive.
00:09:28.760 | He said, "You know, this is your dream come true.
00:09:31.640 | You've always wanted to do a PhD.
00:09:33.080 | I'm gonna support you."
00:09:35.400 | So I think that was the first time where,
00:09:37.920 | you know, I challenged the cultural norms.
00:09:41.200 | - Was that scary?
00:09:42.400 | - Oh my God, yes.
00:09:43.520 | It was totally scary.
00:09:44.880 | - What's the biggest culture shock
00:09:46.480 | from there to Cambridge, to London?
00:09:51.480 | - Well, that was also during, right around September 11th.
00:09:56.500 | So everyone thought that there was gonna be
00:09:59.360 | a third world war, where it was really,
00:10:03.280 | and at the time I used to wear the hijab,
00:10:05.600 | so I was very visibly Muslim.
00:10:07.920 | And so my parents just were, they were afraid for my safety.
00:10:12.080 | But anyways, when I got to Cambridge,
00:10:13.440 | because I was so scared,
00:10:14.440 | I decided to take off my headscarf and wear a hat instead.
00:10:17.680 | So I just went to class wearing these like British hats,
00:10:20.400 | which was, in my opinion, actually worse
00:10:22.840 | than just showing up in a headscarf,
00:10:24.480 | 'cause it was just so awkward, right?
00:10:26.080 | Like sitting in class with like all these-
00:10:27.760 | - Trying to fit in, like a spy.
00:10:30.720 | - Yeah, yeah, yeah.
00:10:31.560 | So after a few weeks of doing that,
00:10:33.160 | I was like, to heck with that,
00:10:34.840 | I'm just gonna go back to wearing my headscarf.
00:10:37.320 | - Yeah, you wore the hijab, so starting in 2000,
00:10:42.320 | and for 12 years after.
00:10:44.300 | So always, whenever you're in public,
00:10:47.000 | you have to wear the head covering.
00:10:48.920 | Can you speak to that, to the hijab,
00:10:52.320 | maybe your mixed feelings about it?
00:10:54.120 | Like what does it represent in its best case?
00:10:56.640 | What does it represent in the worst case?
00:10:58.400 | - Yeah, you know, I think there's a lot of,
00:11:02.520 | I guess I'll first start by saying I wore it voluntarily.
00:11:05.520 | I was not forced to wear it.
00:11:06.960 | And in fact, I was one of the very first women in my family
00:11:10.040 | to decide to put on the hijab.
00:11:12.240 | And my family thought it was really odd, right?
00:11:15.240 | Like they were like, why do you wanna put this on?
00:11:17.920 | And at its best, it's a sign of modesty, humility.
00:11:22.920 | - It's like me wearing a suit.
00:11:25.720 | People are like, why are you wearing a suit?
00:11:27.560 | It's a step back into some kind of tradition,
00:11:30.920 | a respect for tradition of sorts.
00:11:33.240 | So you said because it's by choice,
00:11:34.960 | you're kind of free to make that choice
00:11:37.560 | to celebrate a tradition of modesty.
00:11:40.000 | - Exactly, and I actually like made it my own.
00:11:43.280 | I remember I would really match the color of my headscarf
00:11:46.880 | with what I was wearing.
00:11:47.800 | Like it was a form of self-expression,
00:11:50.120 | and at its best, I loved wearing it.
00:11:54.160 | You know, I have a lot of questions
00:11:55.800 | around how we practice religion and religion.
00:11:58.200 | And I think also it was a time
00:12:02.480 | where I was spending a lot of time going back and forth
00:12:04.720 | between the US and Egypt.
00:12:06.640 | And I started meeting a lot of people in the US
00:12:08.560 | who were just amazing people, very purpose-driven,
00:12:13.360 | people who have very strong core values,
00:12:15.520 | but they're not Muslim.
00:12:16.920 | That's okay, right?
00:12:17.880 | And so that was when I just had a lot of questions.
00:12:21.220 | And politically also the situation in Egypt
00:12:25.120 | was when the Muslim Brotherhood ran the country,
00:12:27.240 | and I didn't agree with their ideology.
00:12:30.160 | It was at a time when I was going through a divorce.
00:12:33.560 | Like it was like just the perfect storm
00:12:36.040 | of like political, personal conditions,
00:12:39.300 | where I was like, "This doesn't feel like me anymore."
00:12:42.240 | And it took a lot of courage to take it off
00:12:44.240 | because culturally it's okay if you don't wear it,
00:12:48.600 | but it's really not okay to wear it and then take it off.
00:12:52.080 | - But you're still, so you had to do that
00:12:54.440 | while still maintaining a deep core and pride
00:12:58.040 | in the origins, in your origin story.
00:13:02.440 | - Totally.
00:13:03.280 | - So still being Egyptian, still being a Muslim.
00:13:06.760 | - Right, and being, I think, generally like faith-driven,
00:13:11.760 | but yeah.
00:13:14.440 | - But what that means changes year by year for you.
00:13:17.600 | It's like a personal journey.
00:13:18.920 | - Yeah, exactly.
00:13:20.080 | - What would you say is the role of faith
00:13:22.160 | in that part of the world?
00:13:24.080 | Like how do you see it?
00:13:25.120 | You mention it a bit in the book too.
00:13:27.440 | - Yeah, I mean, I think there is something really powerful
00:13:32.120 | about just believing that there's a bigger force.
00:13:36.880 | You know, there's a kind of surrendering, I guess,
00:13:39.460 | that comes with religion and you surrender
00:13:42.120 | and you have this deep conviction
00:13:43.400 | that it's gonna be okay, right?
00:13:45.380 | Like the universe is out to like do amazing things for you
00:13:48.000 | and it's gonna be okay.
00:13:49.600 | And there's strength to that.
00:13:50.960 | Like even when you're going through adversity,
00:13:54.060 | you just know that it's gonna work out.
00:13:57.940 | - Yeah, it gives you like an inner peace, a calmness.
00:14:00.100 | - Exactly, exactly.
00:14:01.460 | - Yeah, it's faith in all the meanings of that word.
00:14:05.700 | - Right.
00:14:06.540 | - Faith that everything is going to be okay.
00:14:08.100 | And it is because time passes and time cures all things.
00:14:13.100 | It's like a calmness with the chaos of the world.
00:14:16.420 | - And also there's like a silver lining.
00:14:19.460 | I'm a true believer of this,
00:14:20.700 | that something at the specific moment in time
00:14:24.020 | can look like it's catastrophic
00:14:25.460 | and it's not what you wanted in life, da-da-da-da.
00:14:28.060 | But then time passes and then you look back
00:14:31.300 | and there's a silver lining, right?
00:14:33.060 | It maybe closed the door, but it opened a new door for you.
00:14:37.340 | And so I'm a true believer in that,
00:14:38.940 | that there's a silver lining in almost anything in life.
00:14:43.860 | You just have to have this like,
00:14:45.860 | have faith or conviction that it's gonna work out.
00:14:48.380 | - Such a beautiful way to see a shitty feeling.
00:14:50.900 | So if you feel shitty about a current situation,
00:14:53.840 | I mean, it almost is always true,
00:14:56.820 | unless it's the cliches thing of,
00:15:02.780 | if it doesn't kill you,
00:15:03.940 | whatever doesn't kill you makes you stronger.
00:15:06.180 | It does seem that over time,
00:15:09.220 | when you take a perspective on things,
00:15:11.120 | the hardest moments and periods of your life
00:15:15.780 | are the most meaningful.
00:15:18.200 | - Yeah, yeah.
00:15:19.440 | So over time you get to have that perspective.
00:15:21.680 | - Right.
00:15:22.520 | - What about, 'cause you mentioned Kuwait,
00:15:26.900 | what about, let me ask you about war.
00:15:30.380 | What's the role of war and peace,
00:15:34.040 | maybe even the big love and hate in that part of the world,
00:15:37.680 | because it does seem to be a part of the world
00:15:39.720 | where there's turmoil.
00:15:41.800 | There was turmoil, there's still turmoil.
00:15:45.640 | - It is so unfortunate, honestly.
00:15:47.600 | It's such a waste of human resources
00:15:51.000 | and yeah, and human mindshare.
00:15:54.800 | I mean, at the end of the day,
00:15:56.680 | we all kind of want the same things.
00:15:58.360 | We want human connection, we want joy,
00:16:02.160 | we wanna feel fulfilled,
00:16:03.440 | we wanna feel a life of purpose.
00:16:06.680 | And I just find it baffling, honestly,
00:16:11.200 | that we are still having to grapple with that.
00:16:15.240 | I have a story to share about this.
00:16:17.200 | I grew up, I'm Egyptian, American now,
00:16:19.400 | but originally from Egypt.
00:16:22.920 | And when I first got to Cambridge,
00:16:24.800 | it turned out my office mate,
00:16:26.920 | like my PhD kind of,
00:16:28.480 | we ended up becoming friends, but she was from Israel.
00:16:34.080 | And we didn't know, yeah,
00:16:35.560 | we didn't know how it was gonna be like.
00:16:37.900 | - Did you guys sit there just staring at each other for a bit?
00:16:42.520 | - Actually, she, 'cause I arrived before she did
00:16:45.640 | and it turns out she emailed our PhD advisor
00:16:49.800 | and asked him if he thought it was gonna be okay.
00:16:52.400 | - Yeah.
00:16:53.240 | Oh, this is around 9/11 too.
00:16:55.120 | - Yeah, and Peter Robinson, our PhD advisor was like,
00:16:59.520 | yeah, like this is an academic institution, just show up.
00:17:02.960 | And we became super good friends.
00:17:04.640 | We were both new moms.
00:17:07.360 | Like we both had our kids during our PhD.
00:17:09.400 | We were both doing artificial emotional intelligence.
00:17:11.520 | She was looking at speech.
00:17:12.460 | I was looking at the face.
00:17:13.880 | We just had so, the culture was so similar.
00:17:17.160 | Our jokes were similar.
00:17:18.400 | It was just, I was like, why on earth are our countries,
00:17:23.080 | why is there all this like war and tension?
00:17:25.320 | And I think it falls back to the narrative, right?
00:17:27.480 | If you change the narrative,
00:17:28.780 | like whoever creates this narrative of war, I don't know.
00:17:32.600 | We should have women run the world.
00:17:34.800 | - Yeah, that's one solution, the good women,
00:17:38.600 | because there's also evil women in the world.
00:17:40.720 | - True, okay.
00:17:41.940 | (both laughing)
00:17:44.060 | - But yes, yes, there could be less war
00:17:46.300 | if women ran the world.
00:17:48.220 | The other aspect is, it doesn't matter the gender,
00:17:51.860 | the people in power.
00:17:53.540 | I get to see this with Ukraine and Russia,
00:17:57.100 | different parts of the world around that conflict now.
00:18:01.020 | And that's happening in Yemen as well and everywhere else.
00:18:05.340 | There's these narratives told by the leaders
00:18:08.580 | to the populace and those narratives take hold
00:18:11.340 | and everybody believes that and they have a distorted view
00:18:15.660 | of the humanity on the other side.
00:18:18.020 | In fact, especially during war,
00:18:20.020 | you don't even see the people on the other side as human
00:18:25.020 | or as equal intelligence or worth or value as you.
00:18:30.220 | You tell all kinds of narratives about them being Nazis
00:18:34.180 | or dumb or whatever narrative you wanna weave around that.
00:18:41.900 | Or evil.
00:18:42.860 | But I think when you actually meet them face to face,
00:18:47.420 | you realize they're the same.
00:18:49.220 | - Exactly right.
00:18:50.500 | - It's actually a big shock for people to realize
00:18:53.860 | that they've been essentially lied to within their country.
00:18:59.980 | And I kind of have faith that social media,
00:19:03.340 | as ridiculous as it is to say,
00:19:05.140 | or any kind of technology is able to bypass the walls
00:19:10.340 | that governments put up and connect people directly
00:19:14.140 | and then you get to realize,
00:19:15.220 | ooh, people fall in love across different nations
00:19:20.020 | and religions and so on.
00:19:21.540 | And that I think ultimately can cure a lot of our ills,
00:19:24.660 | especially sort of in person.
00:19:26.780 | I also think that if leaders met in person
00:19:30.020 | to have a conversation,
00:19:31.020 | that would cure a lot of the ills of the world,
00:19:34.860 | especially in private.
00:19:37.840 | - Let me ask you about the women running the world.
00:19:41.560 | - Okay.
00:19:42.440 | - So gender does in part,
00:19:44.920 | perhaps shape the landscape of just our human experience.
00:19:49.800 | So in what ways was it limiting it?
00:19:53.920 | In what ways was it empowering
00:19:56.760 | for you to be a woman in the Middle East?
00:19:59.640 | - I think just kind of just going back to like my comment
00:20:02.160 | on like women running the world,
00:20:03.240 | I think it comes back to empathy, right?
00:20:05.360 | Which has been a common thread throughout my entire career.
00:20:08.960 | And it's this idea of human connection.
00:20:11.520 | Once you build common ground with a person
00:20:14.760 | or a group of people, you build trust, you build loyalty,
00:20:17.760 | you build friendship, and then you can turn that
00:20:21.840 | into like behavior change and motivation and persuasion.
00:20:25.120 | So it's like empathy and emotions are just at the center
00:20:28.000 | of everything we do.
00:20:31.480 | And I think being from the Middle East,
00:20:35.480 | kind of this human connection is very strong.
00:20:38.640 | Like we have this running joke that if you come to Egypt
00:20:42.120 | for a visit, people are gonna,
00:20:44.600 | will know everything about your life like right away, right?
00:20:46.720 | I have no problems asking you about your personal life.
00:20:49.520 | There's no like no boundaries really,
00:20:52.720 | no personal boundaries in terms of getting to know people.
00:20:55.240 | We get emotionally intimate like very, very quickly.
00:20:58.480 | But I think people just get to know each other
00:21:00.440 | like authentically, I guess.
00:21:03.720 | There isn't this like superficial level
00:21:06.120 | of getting to know people.
00:21:07.040 | You just try to get to know people really deeply.
00:21:09.240 | - And empathy is a part of that.
00:21:10.080 | - Totally, 'cause you can put yourself
00:21:11.720 | in this person's shoe and kind of,
00:21:14.280 | yeah, imagine what challenges they're going through.
00:21:18.720 | So I think I've definitely taken that with me.
00:21:22.480 | Generosity is another one too,
00:21:25.440 | like just being generous with your time and love
00:21:28.280 | and attention and even with your wealth, right?
00:21:32.600 | Even if you don't have a lot of it,
00:21:33.840 | you're still very generous.
00:21:34.920 | And I think that's another.
00:21:36.720 | - Enjoying the humanity of other people.
00:21:40.120 | And so do you think there's a useful difference
00:21:42.920 | between men and women in that aspect and empathy?
00:21:47.640 | Or is doing these kind of big general groups,
00:21:53.720 | does that hinder progress?
00:21:57.040 | - Yeah, I actually don't wanna overgeneralize.
00:21:59.920 | I mean, some of the men I know
00:22:01.960 | are like the most empathetic humans.
00:22:03.720 | - Yeah, I strive to be empathetic.
00:22:05.360 | - Yeah, you're actually very empathetic.
00:22:07.480 | Yeah, so I don't wanna overgeneralize.
00:22:13.880 | Although one of the researchers I worked with
00:22:17.160 | when I was at Cambridge, Professor Simon Baron-Cohen,
00:22:19.720 | he's Sacha Baron Cohen's cousin.
00:22:22.040 | - Yeah. (laughs)
00:22:23.520 | - But he runs the Autism Research Center at Cambridge
00:22:26.360 | and he's written multiple books on autism.
00:22:30.880 | And one of his theories is the empathy scale,
00:22:34.360 | like the systemizers and the empathizers.
00:22:36.320 | And there's a disproportionate amount
00:22:40.040 | of computer scientists and engineers who are systemizers
00:22:44.520 | and perhaps not great empathizers.
00:22:47.200 | And then there's more men in that bucket, I guess,
00:22:52.880 | than women.
00:22:53.720 | And then there's more women in the empathizers bucket.
00:22:56.840 | So again, not to overgeneralize.
00:22:58.920 | - I sometimes wonder about that.
00:23:00.400 | It's been frustrating to me how many, I guess,
00:23:03.320 | systemizers there are in the field of robotics.
00:23:06.040 | - Yeah.
00:23:07.040 | - It's actually encouraging to me
00:23:08.280 | 'cause I care about, obviously, social robotics.
00:23:11.240 | And because there's more opportunity
00:23:16.240 | for people that are empathic. (laughs)
00:23:18.920 | - Exactly, I totally agree.
00:23:20.720 | Well, right? - So it's nice.
00:23:21.920 | - Yes.
00:23:22.760 | - I mean, most roboticists I talk to,
00:23:23.720 | they don't see the human as interesting,
00:23:26.080 | as like it's not exciting.
00:23:29.400 | You wanna avoid the human at all costs.
00:23:32.200 | It's a safety concern to be touching the human,
00:23:35.360 | which it is, but it's also an opportunity
00:23:39.360 | for deep connection or collaboration
00:23:42.160 | or all that kind of stuff.
00:23:43.000 | So, and because most brilliant roboticists
00:23:46.280 | don't care about the human, it's an opportunity.
00:23:48.600 | - Right.
00:23:49.440 | - For, in your case, it's a business opportunity too,
00:23:52.160 | but in general, an opportunity to explore those ideas.
00:23:54.720 | So, in this beautiful journey to Cambridge,
00:23:58.200 | to UK, and then to America,
00:24:01.880 | what's the moment or moments
00:24:04.760 | that were most transformational for you
00:24:07.280 | as a scientist and as a leader?
00:24:09.960 | So you became an exceptionally successful CEO,
00:24:12.920 | founder, researcher, scientist, and so on.
00:24:16.680 | Was there a phase shift there
00:24:21.240 | where like, I can be somebody,
00:24:23.800 | I can really do something in this world?
00:24:26.800 | - Yeah, so actually just kind of a little bit of background.
00:24:29.800 | So the reason why I moved from Cairo to Cambridge, UK
00:24:33.960 | to do my PhD is because I had a very clear career plan.
00:24:38.200 | I was like, okay, I'll go abroad, get my PhD,
00:24:41.320 | gonna crush it in three or four years,
00:24:43.440 | come back to Egypt and teach.
00:24:45.720 | It was very clear, very well laid out.
00:24:47.760 | - Was topic clear or no?
00:24:50.480 | - Well, I did my PhD around
00:24:52.480 | building artificial emotional intelligence and looking at--
00:24:54.600 | - No, but in your master plan ahead of time,
00:24:56.960 | when you're sitting by the mango tree,
00:24:58.320 | did you know it's gonna be artificial intelligence?
00:25:00.640 | - No, no, no, that I did not know.
00:25:03.080 | Although I think I kinda knew
00:25:05.400 | that I was gonna be doing computer science,
00:25:07.720 | but I didn't know the specific area.
00:25:10.400 | But I love teaching, I mean, I still love teaching.
00:25:13.200 | So I just, yeah, I just wanted to go abroad,
00:25:16.520 | get a PhD, come back, teach.
00:25:19.080 | - Why computer science?
00:25:19.960 | Can we just linger on that?
00:25:21.680 | 'Cause you're such an empathic person
00:25:23.600 | who cares about emotion, humans and so on.
00:25:25.840 | Aren't computers cold and emotionless?
00:25:30.600 | (laughing)
00:25:31.840 | Just-- - We're changing that.
00:25:33.240 | - Yeah, I know, but isn't that the,
00:25:36.280 | or did you see computers as having the capability
00:25:39.840 | to actually connect with humans?
00:25:43.120 | - I think that was my takeaway
00:25:44.720 | from my experience just growing up.
00:25:46.520 | Computers sit at the center of how we connect
00:25:49.120 | and communicate with one another, right?
00:25:51.080 | Or technology in general.
00:25:52.480 | Like I remember my first experience
00:25:54.320 | being away from my parents.
00:25:55.400 | We communicated with a fax machine,
00:25:57.320 | but thank goodness for the fax machine
00:25:59.200 | because we could send letters back and forth to each other.
00:26:01.560 | This was pre-emails and stuff.
00:26:03.360 | So I think there's, I think technology
00:26:08.200 | can be not just transformative
00:26:09.880 | in terms of productivity, et cetera.
00:26:11.760 | It actually does change how we connect with one another.
00:26:15.720 | - Can I just defend the fax machine?
00:26:17.520 | - Yeah. - There's something,
00:26:19.320 | like the haptic feel, 'cause the email is all digital.
00:26:23.480 | There's something really nice.
00:26:24.600 | I still write letters to people.
00:26:27.320 | There's something nice about the haptic aspect
00:26:29.600 | of the fax machine 'cause you still have to press,
00:26:31.840 | you still have to do something in the physical world
00:26:34.240 | to make this thing a reality, the sense of somebody.
00:26:36.960 | - Right, and then it comes out as a printout
00:26:38.960 | and you can actually touch it and read it.
00:26:40.720 | - Yeah, there's something lost when it's just an email.
00:26:45.720 | Obviously, I wonder how we can regain some of that
00:26:50.440 | in the digital world, which goes to the metaverse
00:26:53.000 | and all those kinds of things.
00:26:53.840 | We'll talk about it.
00:26:54.680 | Anyway, so--
00:26:55.520 | - Actually, do you have a question on that one?
00:26:57.800 | Do you still, do you have photo albums anymore?
00:27:00.520 | Do you still print photos?
00:27:02.400 | - No, no, but I'm a minimalist.
00:27:06.720 | So it was one of the painful steps in my life
00:27:09.360 | was to scan all the photos and let go of them
00:27:13.360 | and then let go of all my books.
00:27:16.520 | - You let go of your books?
00:27:17.760 | - Yeah, switched to Kindle, everything Kindle.
00:27:20.840 | So I thought, okay, think 30 years from now.
00:27:25.840 | Nobody's gonna have books anymore.
00:27:29.640 | The technology of digital books
00:27:31.040 | is gonna get better and better and better.
00:27:32.200 | Are you really gonna be the guy
00:27:33.720 | that's still romanticizing physical books?
00:27:36.400 | Are you gonna be the old man on the porch
00:27:38.800 | who's like kids, yes.
00:27:40.520 | So just get used to it 'cause it felt,
00:27:43.040 | it still feels a little bit uncomfortable
00:27:45.120 | to read on a Kindle, but get used to it.
00:27:48.840 | You always, I mean, I'm trying to learn
00:27:51.520 | new programming languages always.
00:27:53.440 | Like with technology, you have to kind of
00:27:55.000 | challenge yourself to adapt to it.
00:27:57.280 | I forced myself to use TikTok now.
00:28:00.000 | That thing doesn't need much forcing.
00:28:01.560 | It pulls you in like the worst kind of,
00:28:04.560 | or the best kind of drug.
00:28:06.040 | Anyway, yeah, so yeah, but I do love haptic things.
00:28:11.920 | There's a magic to the haptic.
00:28:13.600 | Even like touchscreens, it's tricky to get right,
00:28:16.400 | to get the experience of a button.
00:28:19.640 | - Yeah.
00:28:20.480 | - Anyway, what were we talking about?
00:28:23.880 | So AI, so the journey, your whole plan
00:28:27.760 | was to come back to Cairo and teach, right?
00:28:31.120 | - And then-- - Where did the plan go wrong?
00:28:33.960 | - Yeah, exactly, right?
00:28:35.200 | And then I got to Cambridge and I fall in love
00:28:37.960 | with the idea of research, right?
00:28:39.560 | And kind of embarking on a path.
00:28:41.640 | Nobody's explored this path before.
00:28:43.800 | You're building stuff that nobody's built before
00:28:45.640 | and it's challenging and it's hard
00:28:47.080 | and there's a lot of non-believers.
00:28:49.400 | I just totally love that.
00:28:51.040 | And at the end of my PhD, I think it's the meeting
00:28:54.480 | that changed the trajectory of my life.
00:28:56.960 | Professor Rosalind Picard, she runs
00:28:59.960 | the Affective Computing Group at the MIT Media Lab.
00:29:02.240 | I had read her book.
00:29:03.560 | You know, I was like following all her research.
00:29:07.680 | - AKA Roz.
00:29:08.960 | - Yes, AKA Roz.
00:29:10.920 | And she was giving a talk
00:29:13.360 | at a pattern recognition conference in Cambridge
00:29:16.480 | and she had a couple of hours to kill.
00:29:18.080 | So she emailed the lab and she said,
00:29:19.760 | you know, if any students wanna meet with me,
00:29:22.400 | like just, you know, sign up here.
00:29:24.800 | And so I signed up for slot and I spent like the weeks
00:29:28.560 | leading up to it preparing for this meeting.
00:29:31.080 | And I want to show her a demo of my research and everything.
00:29:34.480 | And we met and we ended up hitting it off.
00:29:36.760 | Like we totally clicked.
00:29:38.160 | And at the end of the meeting, she said,
00:29:40.840 | do you wanna come work with me as a postdoc at MIT?
00:29:43.360 | And this is what I told her.
00:29:45.720 | I was like, okay, this would be a dream come true,
00:29:47.760 | but there's a husband waiting for me in Cairo.
00:29:49.880 | I kind of have to go back.
00:29:51.240 | And she said, it's fine, just commute.
00:29:54.720 | And I literally started commuting between Cairo and Boston.
00:29:57.960 | Yeah, it was a long commute.
00:30:01.320 | And I did that like every few weeks,
00:30:03.480 | I would, you know, hop on a plane and go to Boston.
00:30:06.480 | But that changed the trajectory of my life.
00:30:08.600 | There was no, I kind of outgrew my dreams, right?
00:30:12.960 | I didn't wanna go back to Egypt anymore and be faculty.
00:30:16.800 | Like that was no longer my dream.
00:30:18.440 | I had a new dream.
00:30:19.280 | - What was it like to be at MIT?
00:30:22.720 | What was that culture shock?
00:30:25.160 | You mean America in general, but also,
00:30:27.520 | I mean Cambridge has its own culture, right?
00:30:31.240 | So what was MIT like?
00:30:33.000 | What was America like?
00:30:35.080 | - I think, I wonder if that's similar to your experience
00:30:37.960 | at MIT, I was just, at the Media Lab in particular,
00:30:42.600 | I was just really, impressed is not the right word.
00:30:47.200 | I didn't expect the openness to like innovation
00:30:50.680 | and the acceptance of taking a risk and failing.
00:30:55.680 | Like failure isn't really accepted back in Egypt, right?
00:30:59.120 | You don't wanna fail.
00:30:59.960 | Like there's a fear of failure,
00:31:02.000 | which I think has been hardwired in my brain.
00:31:05.080 | But you get to MIT and it's okay to start things.
00:31:07.240 | And if they don't work out, like it's okay,
00:31:09.240 | you pivot to another idea.
00:31:11.040 | And that kind of thinking was just very new to me.
00:31:13.840 | - That's liberating.
00:31:14.800 | Well, Media Lab for people who don't know,
00:31:16.200 | MIT Media Lab is its own beautiful thing
00:31:19.480 | because they, I think more than other places at MIT,
00:31:23.880 | reach for big ideas.
00:31:25.160 | And like they try, I mean, I think,
00:31:27.720 | I mean, depending of course on who,
00:31:29.200 | but certainly with Rosalind, you try wild stuff,
00:31:32.480 | you try big things and crazy things.
00:31:34.600 | And also try to take things to completion
00:31:38.520 | so you can demo them.
00:31:39.560 | So always have a demo.
00:31:43.480 | Like if you go, one of the sad things to me
00:31:46.000 | about robotics labs at MIT, and there's like over 30,
00:31:48.960 | I think, is like, usually when you show up to a robotics lab
00:31:53.960 | there's not a single working robot, they're all broken.
00:31:57.000 | All the robots are broken,
00:31:58.880 | which is like the normal state of things
00:32:00.680 | because you're working on them.
00:32:02.080 | But it would be nice if we lived in a world
00:32:04.360 | where robotics labs had some robots functioning.
00:32:08.960 | One of my like favorite moments that just sticks with me,
00:32:11.800 | I visited Boston Dynamics and there was a,
00:32:14.240 | first of all, seeing so many spots,
00:32:17.960 | so many legged robots in one place, I'm like, I'm home.
00:32:21.640 | (both laughing)
00:33:22.840 | But the- - My tribe.
00:32:24.680 | - Yeah.
00:32:25.520 | This is where I was built.
00:32:28.880 | The cool thing was just to see,
00:32:31.160 | there was a random robot spot was walking down the hall.
00:32:35.360 | It was probably doing mapping,
00:32:36.520 | but it looked like he wasn't doing anything
00:32:38.440 | and he was wearing he or she, I don't know.
00:32:41.120 | But it, well, I like, in my mind,
00:32:44.400 | there are people that have a backstory,
00:32:46.360 | but this one in particular definitely has a backstory
00:32:48.360 | because he was wearing a cowboy hat.
00:32:51.240 | So I just saw a spot robot with a cowboy hat
00:32:54.120 | walking down the hall.
00:32:55.280 | And there was just this feeling like there's a life,
00:32:59.720 | like he has a life.
00:33:01.520 | He probably has to commute back to his family at night.
00:33:04.880 | Like there's a feeling like there's life instilled
00:33:08.200 | in this robot and that's magical.
00:33:09.960 | I don't know, it was kind of inspiring to see.
00:33:12.200 | - Did it say hello to, did he say hello to you?
00:33:14.600 | Did he say hello? - No, it's very,
00:33:15.680 | there's a focused nature to the robot.
00:33:17.960 | No, no, listen, I love competence and focus and great.
00:33:21.520 | Like he was not gonna get distracted
00:33:23.200 | by the shallowness of small talk.
00:33:27.760 | There's a job to be done and he was doing it.
00:33:29.920 | So anyway, the fact that it was working
00:33:32.120 | is a beautiful thing.
00:33:32.960 | And I think Media Lab really prides itself
00:33:35.760 | on trying to always have a thing that's working
00:33:37.880 | that it could show off.
00:33:38.920 | - Yes, we used to call it demo or die.
00:33:41.800 | You could not, yeah, you could not like show up
00:33:44.680 | with like PowerPoint or something.
00:33:46.040 | You actually had to have it working.
00:33:47.560 | You know what, my son, who is now 13,
00:33:50.120 | I don't know if this is still his lifelong goal or not,
00:33:53.440 | but when he was a little younger,
00:33:54.840 | his dream is to build an island
00:33:56.600 | that's just inhabited by robots, like no humans.
00:34:00.040 | He just wants all these robots to be connecting
00:34:02.120 | and having fun and so there you go.
00:34:04.560 | - Does he have human,
00:34:05.560 | does he have an idea of which robots he loves most?
00:34:09.240 | Is it Roomba-like robots?
00:34:12.000 | Is it humanoid robots, robot dogs, or is not clear yet?
00:34:15.960 | - We used to have a Jibo,
00:34:18.280 | which was one of the MIT Media Lab spin-outs
00:34:21.520 | and he used to love Jibo. - The thing with a giant head.
00:34:23.360 | - Yes. - That spins.
00:34:24.720 | - Right, exactly.
00:34:25.640 | - It can rotate and it's an eye.
00:34:27.480 | - It has, oh, no, yeah, it can.
00:34:29.520 | - Not glowing, like. - Right, right, right, right.
00:34:31.720 | Exactly. - It's like HAL 9000,
00:34:33.240 | but the friendly version.
00:34:34.440 | - (laughs) Right, he loved that.
00:34:36.960 | And then he just loves,
00:34:38.480 | yeah, he just, I think he loves all forms of robots,
00:34:43.720 | actually.
00:34:44.760 | - So embodied intelligence. - Yes.
00:34:47.840 | - I like, I personally like legged robots, especially.
00:34:50.760 | Anything that can wiggle its butt.
00:34:55.140 | No. - And flip.
00:34:56.340 | - That's not the definition of what I love,
00:34:59.320 | but that's just technically
00:35:00.560 | what I've been working on recently.
00:35:01.840 | So I have a bunch of legged robots now in Austin
00:35:04.560 | and I've been-- - Oh, that's so cool.
00:35:06.200 | - I've been trying to have them communicate affection
00:35:10.460 | with their body in different ways, just for art.
00:35:13.200 | - That's so cool. - For art, really.
00:35:15.280 | 'Cause I love the idea of walking around with the robots,
00:35:18.720 | like as you would with a dog.
00:35:20.520 | I think it's inspiring to a lot of people,
00:35:22.120 | especially young people.
00:35:23.420 | Kids love robots. - Kids love it.
00:35:25.860 | - Parents, like adults are scared of robots,
00:35:28.740 | but kids don't have this kind of weird construction
00:35:31.740 | of the world that's full of evil.
00:35:33.020 | They love cool things. - Yeah.
00:35:35.220 | I remember when Adam was in first grade,
00:35:38.380 | so he must have been like seven or so,
00:35:40.140 | I went into his class with a whole bunch of robots
00:35:43.540 | and the emotion AI demo and da-da-da.
00:35:45.820 | And I asked the kids, I was like,
00:35:47.740 | would you kids want to have a robot friend
00:35:52.900 | or a robot companion?
00:35:53.740 | Everybody said yes, and they wanted it
00:35:55.260 | for all sorts of things, like to help them
00:35:57.780 | with their math homework and to be a friend.
00:36:00.260 | So it just struck me how there was no fear of robots.
00:36:05.260 | Was a lot of adults have that, like us versus them.
00:36:09.540 | - Yeah, none of that.
00:36:12.080 | Of course, you wanna be very careful
00:36:13.860 | because you still have to look at the lessons of history
00:36:17.060 | and how robots can be used by the power centers of the world
00:36:20.700 | to abuse your rights and all that kind of stuff.
00:36:22.620 | But mostly it's good to enter anything new
00:36:26.700 | with an excitement and an optimism.
00:36:28.920 | Speaking of Roz, what have you learned
00:36:32.380 | about science and life from Rosalind Picard?
00:36:35.340 | - Oh my God, I've learned so many things
00:36:37.020 | about life from Roz.
00:36:40.300 | I think the thing I learned the most is perseverance.
00:36:45.300 | When I first met Roz, we applied,
00:36:49.260 | and she invited me to be her postdoc,
00:36:51.420 | we applied for a grant to the National Science Foundation
00:36:55.580 | to apply some of our research to autism.
00:36:57.900 | And we got back, we were rejected.
00:37:01.740 | - Rejected.
00:37:02.580 | - Yeah, and the reasoning was--
00:37:03.420 | - The first time you were rejected for fun, yeah.
00:37:06.140 | - Yeah, and I basically, I just took the rejection
00:37:09.260 | to mean, okay, we're rejected, it's done,
00:37:11.340 | like end of story, right?
00:37:12.940 | And Roz was like, it's great news.
00:37:15.260 | They love the idea, they just don't think we can do it.
00:37:18.340 | So let's build it, show them, and then reapply.
00:37:21.300 | (Roz laughs)
00:37:22.620 | And it was that, oh my God, that story totally stuck with me.
00:37:26.540 | And she's like that in every aspect of her life.
00:37:29.940 | She just does not take no for an answer.
00:37:32.380 | - The reframe all negative feedback.
00:37:34.620 | - It's a challenge.
00:37:36.500 | - It's a challenge.
00:37:37.340 | - It's a challenge.
00:37:38.740 | - Yes, they like this.
00:37:40.180 | - Yeah, yeah, yeah, it was a riot, yeah.
00:37:43.460 | - What else about science in general,
00:37:45.260 | about how you see computers, and also business,
00:37:49.940 | and just everything about the world?
00:37:51.740 | She's a very powerful, brilliant woman like yourself,
00:37:54.980 | so is there some aspect of that too?
00:37:57.500 | - Yeah, I think Roz is actually also very faith-driven.
00:38:00.460 | She has this deep belief and conviction, yeah,
00:38:04.660 | in the good in the world and humanity.
00:38:07.540 | I think that was, meeting her and her family
00:38:11.180 | was definitely a defining moment for me,
00:38:13.660 | because that was when I was like, wow,
00:38:15.820 | you can be of a different background and religion,
00:38:18.980 | and whatever, and you can still have the same core values.
00:38:23.980 | So that was, yeah.
00:38:25.580 | I'm grateful to her.
00:38:28.820 | Roz, if you're listening, thank you.
00:38:30.380 | - Yeah, she's great.
00:38:31.420 | She's been on this podcast before.
00:38:33.780 | I hope she'll be on, I'm sure she'll be on again.
00:38:36.600 | You were the founder and CEO of Affectiva,
00:38:42.620 | which is a big company that was acquired
00:38:44.740 | by another big company, SmartEye,
00:38:47.140 | and you're now the deputy CEO of SmartEye,
00:38:49.260 | so you're a powerful leader, you're brilliant,
00:38:51.940 | you're a brilliant scientist.
00:38:53.500 | A lot of people are inspired by you.
00:38:55.180 | What advice would you give, especially to young women,
00:38:58.420 | but people in general, who dream of becoming
00:39:01.460 | powerful leaders like yourself in a world where perhaps,
00:39:05.220 | in a world that perhaps doesn't give them a clear,
00:39:13.460 | easy path to do so,
00:39:15.620 | whether we're talking about Egypt or elsewhere?
00:39:18.260 | - You know, hearing you kind of describe me that way
00:39:23.180 | kind of encapsulates, I think,
00:39:27.460 | what I think is the biggest challenge of all,
00:39:29.300 | which is believing in yourself, right?
00:39:32.140 | I have had to like grapple with this,
00:39:34.860 | what I call now the Debbie Downer voice in my head.
00:39:39.500 | The kind of basically, it's just chattering all the time,
00:39:42.860 | it's basically saying, "Oh no, no, no, no,
00:39:44.380 | you can't do this.
00:39:45.580 | You're not gonna raise money.
00:39:46.460 | You can't start a company.
00:39:47.620 | What business do you have, like starting a company
00:39:49.500 | or running a company or selling a company?
00:39:51.580 | You name it."
00:39:52.420 | It's always like, and I think my biggest advice
00:39:57.300 | to not just women, but people who are taking a new path
00:40:02.300 | and they're not sure, is to not let yourself
00:40:07.260 | and let your thoughts be the biggest obstacle in your way.
00:40:11.020 | And I've had to like really work on myself
00:40:15.380 | to not be my own biggest obstacle.
00:40:18.660 | - So you got that negative voice.
00:40:20.660 | - Yeah.
00:40:21.740 | - So is that--
00:40:22.580 | - Am I the only one?
00:40:23.420 | I don't think I'm the only one.
00:40:24.340 | - No, I have that negative voice.
00:40:26.020 | I'm not exactly sure if it's a bad thing or a good thing.
00:40:30.980 | I've been really torn about it
00:40:34.620 | because it's been a lifelong companion.
00:40:36.620 | It's hard to know.
00:40:37.660 | It's kind of, it drives productivity and progress
00:40:43.700 | but it can hold you back from taking big leaps.
00:40:46.940 | I think the best I can say is probably you have
00:40:52.300 | to somehow be able to control it.
00:40:55.660 | So turn it off when it's not useful
00:40:58.060 | and turn it on when it's useful.
00:41:00.500 | Like I have from almost like a third person perspective.
00:41:03.260 | - Right, somebody who's sitting there like--
00:41:05.340 | - Yeah, like, because it is useful to be critical.
00:41:10.340 | Like after, I just gave a talk yesterday
00:41:15.380 | at MIT and I was just, you know, there's so much love
00:41:20.900 | and it was such an incredible experience.
00:41:23.340 | So many amazing people I got a chance to talk to.
00:41:25.860 | But, you know, afterwards when I went home
00:41:29.900 | and just took this long walk,
00:41:31.420 | it was mostly just negative thoughts about me.
00:41:33.980 | I don't, like one basic stuff,
00:41:37.060 | like I don't deserve any of it.
00:41:38.980 | And second is like, why did you, that was so dumb.
00:41:43.220 | That you said this, that's so dumb.
00:41:45.060 | Like you should have prepared that better.
00:41:47.740 | Why did you say this?
00:41:48.980 | The, the, the, the, the, the, the.
00:41:50.420 | But I think it's good to hear that voice out.
00:41:54.340 | All right, and like sit in that.
00:41:56.420 | And ultimately I think you grow from that.
00:41:58.780 | Now, when you're making really big decisions
00:42:00.500 | about funding or starting a company
00:42:03.460 | or taking a leap to go to the UK
00:42:06.340 | or take a leap to go to America to work in media lab.
00:42:10.780 | Though, yeah, there's a, that's,
00:42:15.540 | you should be able to shut that off then
00:42:19.100 | because you should have like this weird confidence,
00:42:23.780 | almost like faith that you said before
00:42:25.500 | that everything's gonna work out.
00:42:26.980 | So take the leap of faith.
00:42:28.780 | - Take the leap of faith.
00:42:30.340 | Despite all the negativity.
00:42:32.580 | I mean, there's, there's, there's some of that.
00:42:34.420 | You, you actually tweeted a really nice tweet thread.
00:42:38.940 | It says, quote, a year ago,
00:42:41.460 | a friend recommended I do daily affirmations.
00:42:44.560 | And I was skeptical,
00:42:46.900 | but I was going through major transitions in my life.
00:42:49.560 | So I gave it a shot and it set me on a journey
00:42:51.980 | of self-acceptance and self-love.
00:42:54.180 | So what was that like?
00:42:56.460 | Maybe talk through this idea of affirmations
00:43:00.340 | and how that helped you.
00:43:01.380 | - Yeah, because really like, I'm just like me,
00:43:04.580 | I'm a kind, I like to think of myself
00:43:06.820 | as a kind person in general,
00:43:08.260 | but I'm kind of mean to myself sometimes.
00:43:11.380 | And so I've been doing journaling for almost 10 years now.
00:43:15.820 | I use an app called Day One and it's awesome.
00:43:19.020 | I just journal and I use it as an opportunity
00:43:21.000 | to almost have a conversation
00:43:22.460 | with the Debbie Downer voice in my,
00:43:23.780 | it's like a rebuttal, right?
00:43:25.700 | Like Debbie Downer says,
00:43:27.020 | "Oh my God, you won't be able to raise this round of funding."
00:43:29.860 | I'm like, "Okay, let's talk about it."
00:43:32.180 | I have a track record of doing X, Y, and Z.
00:43:35.660 | I think I can do this.
00:43:37.300 | And it's literally like,
00:43:39.380 | so I don't know that I can shut off the voice,
00:43:42.480 | but I can have a conversation with it.
00:43:44.420 | And it just, and I bring data to the table, right?
00:43:48.180 | - Nice.
00:43:50.840 | - So that was the journaling part,
00:43:52.060 | which I found very helpful.
00:43:53.980 | But the affirmation took it to a whole next level
00:43:56.620 | and I just love it.
00:43:57.780 | I'm a year into doing this.
00:44:00.620 | And you literally wake up in the morning
00:44:02.340 | and the first thing you do, I meditate first.
00:44:05.660 | And then I write my affirmations
00:44:07.900 | and it's the energy I want to put out in the world
00:44:10.660 | that hopefully will come right back to me.
00:44:12.300 | So I will say, I always start with,
00:44:15.100 | "My smile lights up the whole world."
00:44:17.420 | And I kid you not, like people in the street will stop me
00:44:19.620 | and say, "Oh my God, like we love your smile."
00:44:21.580 | - Yeah. - Like, yes.
00:44:23.820 | So my affirmations will change depending on
00:44:27.820 | what's happening this day.
00:44:28.980 | Is it funny?
00:44:29.820 | I know, don't judge, don't judge.
00:44:31.500 | - No, that's not, laughter's not judgment.
00:44:33.940 | It's just awesome.
00:44:35.180 | I mean, it's true,
00:44:37.300 | but you're saying affirmations somehow help kind of,
00:44:41.500 | what is it?
00:44:42.340 | They do work to like remind you
00:44:46.300 | of the kind of person you are
00:44:47.820 | and the kind of person you want to be,
00:44:49.540 | which actually may be inverse order.
00:44:52.700 | The kind of person you want to be
00:44:53.900 | and that helps you become the kind of person
00:44:55.900 | you actually are.
00:44:56.740 | - It's just, it brings intentionality
00:44:59.380 | to like what you're doing, right?
00:45:01.780 | And so--
00:45:02.820 | - By the way, I was laughing because my affirmations,
00:45:06.140 | which I also do, are the opposite.
00:45:07.980 | - Oh, you do?
00:45:08.820 | Oh, what do you do?
00:45:09.660 | - I don't have a, "My smile lights up the world."
00:45:12.380 | (laughing)
00:45:14.740 | Maybe I should add that because like I have just,
00:45:18.020 | I have a, oh boy.
00:45:21.820 | It's much more stoic, like about focus,
00:45:25.940 | about this kind of stuff.
00:45:28.180 | But the joy, the emotion that you're,
00:45:30.540 | just in that little affirmation is beautiful.
00:45:33.100 | So maybe I should add that.
00:45:34.580 | - I have some like focus stuff,
00:45:37.620 | but that's usually like after--
00:45:38.460 | - But that's a cool start.
00:45:39.380 | That's a good, it's just--
00:45:40.220 | - It's after all the like smiling,
00:45:41.660 | I'm playful and joyful and all that,
00:45:43.940 | and then it's like, okay, I kick butt.
00:45:45.700 | - Let's get shit done.
00:45:46.660 | - Right, exactly.
00:45:47.500 | - Let's get shit done affirmations.
00:45:48.820 | Okay, cool.
00:45:50.100 | Like what else is on there?
00:45:51.500 | - Oh, what else is on there?
00:45:54.460 | Well, I have, I'm a magnet for all sorts of things.
00:45:59.460 | So I'm an amazing people magnet.
00:46:02.380 | I attract like awesome people into my universe.
00:46:04.780 | - So that's an actual affirmation?
00:46:07.060 | - Yes.
00:46:08.020 | - That's great.
00:46:08.980 | - Yeah.
00:46:09.820 | - So that's, and that somehow manifests itself
00:46:12.340 | into, like, it working.
00:46:14.100 | - I think so.
00:46:15.620 | - Yeah, like can you speak to like why it feels good
00:46:18.140 | to do the affirmations?
00:46:19.940 | - I honestly think it just grounds the day.
00:46:24.260 | And then it allows me to,
00:46:26.460 | instead of just like being pulled back and forth
00:46:29.660 | like throughout the day, it just like grounds me.
00:46:31.900 | I'm like, okay, like this thing happened.
00:46:34.740 | It's not exactly what I wanted it to be, but I'm patient.
00:46:37.500 | Or I'm, you know, I trust that the universe
00:46:41.420 | will do amazing things for me,
00:46:42.660 | which is one of my other consistent affirmations.
00:46:45.580 | Or I'm an amazing mom, right?
00:46:47.180 | And so I can grapple with all the feelings of mom guilt
00:46:50.260 | that I have all the time.
00:46:51.540 | Or here's another one.
00:46:53.980 | I'm a love magnet.
00:46:55.220 | And I literally say, I will kind of picture the person
00:46:58.100 | that I'd love to end up with.
00:46:59.220 | And I write it all down and it hasn't happened yet, but it-
00:47:02.620 | - What are you picturing?
00:47:04.220 | This is Brad Pitt.
00:47:05.060 | - Okay, Brad Pitt and Brad Pitt.
00:47:06.220 | - 'Cause that's what I picture.
00:47:07.180 | - Okay, that's what you picture?
00:47:08.420 | - Yeah, yeah.
00:47:09.260 | - Okay, okay.
00:47:10.100 | - On the running, holding hands, running together.
00:47:11.940 | - Okay.
00:47:14.340 | - No, more like Fight Club, the Fight Club Brad Pitt,
00:47:17.820 | where he's like standing, all right, people will know.
00:47:20.220 | Anyway, I'm sorry, I'll get off on that.
00:47:22.140 | Do you have, like when you're thinking
00:47:23.740 | about being a love magnet in that way,
00:47:27.180 | are you picturing specific people?
00:47:29.020 | Or is this almost like in the space of like energy?
00:47:34.020 | - Right, it's somebody who is smart and well accomplished
00:47:41.900 | and successful in their life, but they're generous
00:47:45.340 | and they're well-traveled and they wanna travel the world.
00:47:49.060 | It's things like that.
00:47:49.900 | I'm like, they're head over heels into me.
00:47:51.500 | It's like, I know it sounds super silly,
00:47:53.220 | but it's literally what I write.
00:47:54.940 | And I believe it'll happen one day.
00:47:56.500 | - Oh, you actually write, so you don't say it out loud?
00:47:58.380 | - No, I write it.
00:47:59.220 | I write all my affirmations.
00:48:01.380 | - I do the opposite, I say it out loud.
00:48:02.820 | - Oh, you say it out loud, interesting.
00:48:04.420 | - Yeah, if I'm alone, I'll say it out loud, yeah.
00:48:06.660 | - Interesting, I should try that.
00:48:10.180 | I think it's what feels more powerful to you,
00:48:15.180 | to me more powerful, saying stuff feels more powerful.
00:48:20.500 | - Yeah.
00:48:21.620 | - Writing feels like I'm losing the words,
00:48:26.620 | like losing the power of the words,
00:48:32.500 | maybe 'cause I write slow.
00:48:33.760 | Do you handwrite?
00:48:35.060 | - No, I type, it's on this app.
00:48:37.700 | It's day one, basically.
00:48:39.940 | The best thing about it is I can look back
00:48:42.620 | and see like a year ago, what was I affirming, right?
00:48:46.260 | - Oh, so it changes over time.
00:48:48.940 | - It hasn't like changed a lot,
00:48:51.860 | but the focus kind of changes over time.
00:48:54.780 | - I got it.
00:48:55.620 | Yeah, I say the same exact thing over and over and over.
00:48:58.020 | - Oh, you do?
00:48:58.860 | Okay.
00:48:59.700 | - There's a comfort in the sameness of it.
00:49:01.260 | Actually, let me jump around,
00:49:03.900 | 'cause let me ask you about,
00:49:04.900 | 'cause all this talk about Brad Pitt,
00:49:07.420 | or maybe that's just going on inside my head.
00:49:10.180 | Let me ask you about dating in general.
00:49:12.720 | You tweeted, "Are you based in Boston
00:49:16.860 | and single?", question mark,
00:49:18.700 | and then you pointed to a startup,
00:49:21.580 | Singles Night, sponsored by Smile Dating App.
00:49:24.420 | I mean, this is jumping around a little bit,
00:49:26.860 | but since you mentioned,
00:49:28.460 | can AI help solve this dating love problem?
00:49:34.660 | What do you think?
00:49:35.500 | What's the form of connection
00:49:36.940 | that is part of the human condition?
00:49:40.180 | Can AI help that?
00:49:41.620 | You yourself are in the search, affirming.
00:49:43.920 | - Maybe that's what I should affirm,
00:49:47.140 | like build an AI.
00:49:48.420 | - Build an AI that finds love?
00:49:49.940 | - I think there must be a science behind
00:49:55.780 | that first moment you meet a person
00:49:59.220 | and you either have chemistry or you don't, right?
00:50:02.980 | - I guess that was the question I was asking.
00:50:04.740 | Which you put brilliantly.
00:50:06.220 | Is that a science or an art?
00:50:07.740 | - Ooh, I think there's actual chemicals
00:50:13.060 | that get exchanged when two people meet.
00:50:15.300 | Oh, well, I don't know about that.
00:50:17.100 | (laughing)
00:50:18.260 | - I like how you're changing,
00:50:19.700 | yeah, yeah, changing your mind as we're describing it.
00:50:22.180 | But it feels that way.
00:50:24.060 | But what science shows us
00:50:25.940 | is sometimes we can explain with rigor
00:50:28.700 | the things that feel like magic.
00:50:30.740 | - Right.
00:50:31.940 | - So maybe we can remove all the magic.
00:50:34.400 | Maybe it's like, I honestly think,
00:50:37.020 | like I said, Goodreads should be a dating app,
00:50:39.900 | which is, like, books. I wonder if you look at just the books
00:50:44.900 | or content you've consumed.
00:50:47.180 | I mean, that's essentially what YouTube does
00:50:48.740 | when it does a recommendation.
00:50:50.820 | If you just look at your footprint of content consumed,
00:50:54.540 | if there's an overlap,
00:50:56.020 | but maybe an interesting difference with an overlap,
00:50:59.020 | there's some, I'm sure this is a machine learning problem
00:51:01.660 | that's solvable.
00:51:04.240 | This person is very likely to be,
00:51:06.880 | not only someone there's chemistry with in the short term,
00:51:10.760 | but a good lifelong partner to grow together with.
00:51:14.100 | I bet you it's a good machine learning problem.
00:51:15.780 | We just need the data.
00:51:16.620 | - Let's do it.
00:51:17.500 | Well, actually, I do think there's so much data
00:51:20.560 | about each of us that there ought to be
00:51:22.420 | a machine learning algorithm that can ingest all this data
00:51:25.020 | and basically say, I think the following 10 people
00:51:28.060 | would be interesting connections for you, right?
00:51:32.020 | And so Smile Dating App kind of took one particular angle,
00:51:35.900 | which is humor.
00:51:36.780 | It matches people based on their humor styles,
00:51:39.200 | which is one of the main ingredients
00:51:41.900 | of a successful relationship.
00:51:43.280 | Like if you meet somebody and they can make you laugh,
00:51:45.400 | like that's a good thing.
00:51:47.460 | And if you develop like internal jokes,
00:51:49.420 | like inside jokes and you're bantering, like that's fun.
00:51:54.420 | So I think.
00:51:56.780 | - Yeah, definitely.
00:51:58.560 | - But yeah, that's the number of,
00:52:01.880 | and the rate of inside joke generation.
00:52:05.280 | You could probably measure that and then optimize it
00:52:08.280 | over the first few days.
00:52:09.320 | You can see. - Right, and then.
00:52:10.800 | - We're just turning this into a machine learning problem.
00:52:12.660 | I love it.
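As a rough illustration of the content-overlap matching idea floated above, here is a minimal sketch that ranks candidate matches by how much of their consumed content overlaps, with a crude humor-style term added. The Profile fields, the weighting, and the toy data are all assumptions for illustration; this is not how Smile Dating App or any real service actually works.

```python
# A minimal sketch of the "match on content overlap" idea from the conversation.
# Field names, weights, and data are hypothetical illustrations only.
from dataclasses import dataclass, field

@dataclass
class Profile:
    name: str
    consumed: set[str] = field(default_factory=set)   # books, videos, podcasts...
    humor_style: str = "unknown"                       # e.g. "dry", "absurdist"

def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap of consumed content: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def match_score(p: Profile, q: Profile, humor_weight: float = 0.3) -> float:
    """Blend content overlap with a crude humor-style agreement term."""
    overlap = jaccard(p.consumed, q.consumed)
    humor = 1.0 if p.humor_style == q.humor_style else 0.0
    return (1 - humor_weight) * overlap + humor_weight * humor

def top_matches(person: Profile, candidates: list[Profile], k: int = 10) -> list[Profile]:
    """Return the k highest-scoring candidates for this person."""
    return sorted(candidates, key=lambda c: match_score(person, c), reverse=True)[:k]

# Example usage with toy data.
alice = Profile("alice", {"Dune", "Lex #322", "Girl Decoded"}, "dry")
pool = [
    Profile("bob", {"Dune", "Girl Decoded", "Fight Club"}, "dry"),
    Profile("carol", {"Cooking 101"}, "absurdist"),
]
print([p.name for p in top_matches(alice, pool, k=2)])  # bob first, carol last
```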
00:52:13.500 | But for somebody like you,
00:52:16.000 | who's exceptionally successful and busy,
00:52:18.680 | is there science to that aspect of dating?
00:52:26.200 | Is it tricky?
00:52:27.560 | Is there advice you can give?
00:52:28.880 | - Oh my God, I'd give the worst advice.
00:52:30.680 | Well, I can tell you like I have a spreadsheet.
00:52:33.680 | - A spreadsheet, that's great.
00:52:36.080 | Is that a good or a bad thing?
00:52:37.160 | Do you regret the spreadsheet?
00:52:38.660 | - Well, I don't know.
00:52:40.680 | - What's the name of the spreadsheet?
00:52:42.040 | Is it love?
00:52:42.880 | - It's the dating tracker.
00:52:45.160 | - The dating tracker?
00:52:46.120 | - It's very like. - Love tracker.
00:52:47.640 | - Yeah.
00:52:48.480 | - And there's a rating system, I'm sure.
00:52:50.200 | - Yeah, there's like weights and stuff.
00:52:52.480 | - It's too close to home.
00:52:53.960 | - Oh, is it?
00:52:54.780 | Do you also have a spreadsheet?
00:52:55.620 | - Well, I don't have a spreadsheet,
00:52:56.440 | but I would, now that you say it,
00:52:58.160 | it seems like a good idea.
00:52:59.680 | - Oh no.
00:53:00.520 | - Turning into data.
00:53:03.940 | I do wish that somebody else had a spreadsheet about me.
00:53:09.560 | If it was like I said, like you said,
00:53:15.040 | collecting a lot of data about us
00:53:17.440 | in a way that's privacy preserving,
00:53:19.440 | that I own the data, I can control it,
00:53:21.640 | and then use that data to find,
00:53:23.760 | I mean, not just romantic love,
00:53:25.240 | but collaborators, friends, all that kind of stuff.
00:53:29.120 | It seems like the data is there.
00:53:30.720 | That's the problem social networks are trying to solve,
00:53:34.240 | but I think they're doing a really poor job.
00:53:36.440 | Even Facebook tried to get into a dating app business.
00:53:39.720 | And I think there's so many components
00:53:41.560 | to running a successful company that connects human beings.
00:53:45.440 | And part of that is,
00:53:47.560 | having engineers that care about the human side, right?
00:53:53.600 | As you know extremely well,
00:53:55.120 | it's not easy to find those.
00:53:57.960 | But you also don't want just people that care
00:54:00.240 | about the human, they also have to be good engineers.
00:54:02.360 | So it's like, you have to find this beautiful mix.
00:54:05.920 | And for some reason, just empirically speaking,
00:54:08.400 | people have not done a good job of that,
00:54:12.720 | building companies like that.
00:54:14.280 | It must mean that it's a difficult problem to solve.
00:54:17.160 | Dating apps, it seems difficult.
00:54:20.000 | OkCupid, Tinder, all that kind of stuff.
00:54:23.000 | They seem to find, of course they work,
00:54:27.520 | but they seem to not work as well
00:54:31.960 | as I would imagine is possible.
00:54:34.640 | With data, wouldn't you be able
00:54:35.920 | to find better human connection?
00:54:38.040 | It's like arranged marriages on steroids, essentially.
00:54:41.520 | Arranged by machine learning algorithm.
00:54:43.520 | - Arranged by machine learning algorithm,
00:54:45.120 | but not a superficial one.
00:54:46.520 | I think a lot of the dating apps out there
00:54:48.360 | are just so superficial.
00:54:49.560 | They're just matching on high level criteria
00:54:52.560 | that aren't ingredients for successful partnership.
00:54:57.040 | But you know what's missing though, too?
00:54:59.680 | I don't know how to fix that.
00:55:00.800 | The serendipity piece of it.
00:55:02.520 | Like how do you engineer serendipity?
00:55:04.760 | Like this random chance encounter,
00:55:07.440 | and then you fall in love with the person.
00:55:08.840 | I don't know how a dating app can do that.
00:55:11.120 | So there has to be a little bit of randomness.
00:55:13.080 | Maybe every 10th match is just a,
00:55:20.280 | yeah, somebody that the algorithm
00:55:22.320 | wouldn't have necessarily recommended,
00:55:24.480 | but it allows for a little bit of.
00:55:27.280 | - Well, it can also trick you into thinking
00:55:31.400 | it's serendipity by somehow showing you a tweet
00:55:35.480 | of a person that it thinks you'll match well with,
00:55:39.120 | but do it accidentally as part of another search.
00:55:41.760 | And you just notice it, and then you go down a rabbit hole
00:55:46.040 | and you connect them outside the app.
00:55:49.400 | You connect with this person outside the app somehow.
00:55:51.520 | So it's just, it creates that moment of meeting.
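The "every 10th match is random" idea above is essentially an exploration slot layered on top of whatever ranker the app uses. A minimal sketch, assuming an arbitrary 1-in-10 rate and a toy candidate pool:

```python
# Sketch of the serendipity idea: mostly follow the ranked list, but occasionally
# inject an off-list candidate. The ranker and the 1-in-10 rate are assumptions.
import random

def recommend_with_serendipity(ranked_candidates, pool, n_picks=10, every_nth=10, rng=None):
    """Mostly follow the ranker, but reserve every Nth slot for a random candidate."""
    rng = rng or random.Random()
    picks = []
    ranked_iter = iter(ranked_candidates)
    for i in range(1, n_picks + 1):
        if i % every_nth == 0 and pool:
            # Serendipity slot: someone the ranker would not have surfaced.
            off_list = [c for c in pool if c not in ranked_candidates]
            picks.append(rng.choice(off_list or pool))
        else:
            try:
                picks.append(next(ranked_iter))
            except StopIteration:
                break
    return picks

# Example: "a".."j" are ranked; "x" and "y" only appear through the random slot.
print(recommend_with_serendipity(list("abcdefghij"), list("abcdefghijxy"), rng=random.Random(0)))
```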
00:55:54.360 | Of course, you have to think of, from an app perspective,
00:55:56.680 | how you can turn that into a business.
00:55:58.420 | But I think ultimately a business
00:56:01.220 | that helps people find love in any way,
00:56:04.640 | like that's what Apple was about.
00:56:05.880 | Create products that people love.
00:56:07.600 | That's beautiful.
00:56:08.440 | I mean, you gotta make money somehow.
00:56:11.560 | If you help people fall in love personally with a product,
00:56:16.000 | find self-love or love another human being,
00:56:18.960 | you're gonna make money.
00:56:20.080 | You're gonna figure out a way to make money.
00:56:22.240 | I just feel like the dating apps often will optimize
00:56:26.680 | for something else than love.
00:56:28.680 | It's the same with social networks.
00:56:30.120 | They optimize for engagement
00:56:31.960 | as opposed to a deep, meaningful connection
00:56:35.160 | that's ultimately grounded in personal growth,
00:56:38.160 | you as a human being growing and all that kind of stuff.
00:56:41.720 | Let me do a pivot to a dark topic,
00:56:45.720 | which you opened the book with.
00:56:48.680 | A story, because I'd like to talk to you
00:56:51.720 | about just emotion and artificial intelligence,
00:56:56.120 | and I think this is a good story
00:56:57.440 | to start to think about emotional intelligence.
00:57:00.880 | You opened the book with a story
00:57:02.320 | of a Central Florida man, Jamel Dunn,
00:57:05.060 | who was drowning and drowned
00:57:07.280 | while five teenagers watched and laughed,
00:57:09.820 | saying things like, "You're gonna die."
00:57:11.760 | And when Jamelle disappeared
00:57:13.480 | below the surface of the water,
00:57:14.820 | one of them said, "He just died," and the others laughed.
00:57:18.440 | What does this incident teach you about human nature
00:57:21.840 | and the response to it, perhaps?
00:57:24.240 | - Yeah, I mean, I think this is a really,
00:57:26.840 | really, really sad story,
00:57:29.560 | and it highlights what I believe is a,
00:57:32.920 | it's a real problem in our world today.
00:57:35.520 | It's an empathy crisis.
00:57:37.680 | Yeah, we are living through an empathy crisis.
00:57:40.080 | - Empathy crisis, yeah.
00:57:41.360 | - Yeah, and I mean, we've talked about this
00:57:45.280 | throughout our conversation.
00:57:46.320 | We dehumanize each other,
00:57:48.040 | and unfortunately, yes, technology is bringing us together,
00:57:52.920 | but in a way, it's just dehumanizing.
00:57:54.800 | It's creating this, like, yeah, dehumanizing of the other,
00:57:59.720 | and I think that's a huge problem.
00:58:02.760 | The good news is I think the solution
00:58:05.200 | could be technology-based.
00:58:06.840 | Like, I think if we rethink the way we design
00:58:09.560 | and deploy our technologies,
00:58:11.280 | we can solve parts of this problem,
00:58:13.600 | but I worry about it.
00:58:14.440 | I mean, even with my son,
00:58:16.360 | a lot of his interactions are computer-mediated,
00:58:21.200 | and I just question what that's doing to his empathy skills
00:58:25.680 | and his ability to really connect with people, so.
00:58:29.400 | - Do you think,
00:58:30.280 | you think it's not possible to form empathy
00:58:35.080 | through the digital medium?
00:58:36.760 | - I think it is,
00:58:39.760 | but we have to be thoughtful about,
00:58:43.120 | 'cause the way we engage face-to-face,
00:58:45.680 | which is what we're doing right now, right?
00:58:47.800 | There's the nonverbal signals,
00:58:49.320 | which are a majority of how we communicate.
00:58:51.320 | It's like 90% of how we communicate
00:58:53.600 | is your facial expressions.
00:58:56.040 | You know, I'm saying something,
00:58:57.240 | and you're nodding your head now,
00:58:58.560 | and that creates a feedback loop,
00:59:00.440 | and if you break that-
00:59:02.600 | - And now I have anxiety about it.
00:59:04.000 | - Da-da-da, right? (laughs)
00:59:06.120 | Poor Lex.
00:59:06.960 | - Oh, boy, I wish this cycle-
00:59:08.880 | - I am not scrutinizing your facial expressions
00:59:10.760 | during this interview, right?
00:59:11.600 | - I am, I am. - Okay.
00:59:12.920 | - Look normal, look human.
00:59:15.360 | - Yeah. (laughs)
00:59:17.800 | - Nod head.
00:59:18.760 | - Yeah, nod head. (laughs)
00:59:21.040 | - In agreement.
00:59:21.880 | - If Rana says yes, then nod head else.
00:59:25.840 | - Don't do it too much,
00:59:26.760 | because it might be at the wrong time,
00:59:28.760 | and then it'll send the wrong signal.
00:59:30.920 | - Oh, God.
00:59:31.760 | - And make eye contact sometimes,
00:59:33.680 | 'cause humans appreciate that.
00:59:35.440 | All right, anyway.
00:59:36.280 | - Okay. (laughs)
00:59:38.640 | - Yeah, but something about,
00:59:40.360 | especially when you say mean things in person,
00:59:42.600 | you get to see the pain of the other person.
00:59:44.360 | - Exactly, but if you're tweeting it at a person,
00:59:46.360 | and you have no idea how it's gonna land,
00:59:48.080 | you're more likely to do that on social media
00:59:50.200 | than you are in face-to-face conversations, so.
00:59:52.680 | - And what do you think is more important?
00:59:56.640 | EQ or IQ?
01:00:00.520 | EQ being emotional intelligence.
01:00:02.560 | In terms of, in what makes us human?
01:00:06.720 | - I think emotional intelligence is what makes us human.
01:00:11.400 | It's how we connect with one another,
01:00:14.960 | it's how we build trust, it's how we make decisions, right?
01:00:19.800 | Like your emotions drive kind of what you had for breakfast,
01:00:24.600 | but also where you decide to live,
01:00:26.280 | and what you wanna do for the rest of your life.
01:00:28.800 | So I think emotions are underrated.
01:00:31.720 | - So emotional intelligence isn't just about
01:00:36.440 | the effective expression of your own emotions,
01:00:39.320 | it's about a sensitivity and empathy
01:00:41.480 | to other people's emotions,
01:00:43.120 | and that sort of being able to effectively engage
01:00:46.400 | in the dance of emotions with other people.
01:00:48.960 | - Yeah, I like that explanation.
01:00:51.400 | I like that kind of, yeah, thinking about it as a dance,
01:00:55.240 | because it is really about that,
01:00:56.800 | it's about sensing what state the other person's in
01:00:59.240 | and using that information to decide
01:01:01.760 | on how you're gonna react.
01:01:03.320 | And I think it can be very powerful,
01:01:07.040 | people who are the best, most persuasive leaders
01:01:12.040 | in the world tap into that.
01:01:14.440 | If you have higher EQ, you're more likely
01:01:17.920 | to be able to motivate people to change their behaviors.
01:01:21.520 | So it can be very powerful.
01:01:25.120 | - At a more kind of technical, maybe philosophical level,
01:01:29.320 | you've written that emotion is universal.
01:01:32.640 | It seems that, sort of like Chomsky says,
01:01:36.240 | "Language is universal."
01:01:37.760 | There's a bunch of other stuff like cognition,
01:01:39.760 | consciousness, it seems a lot of us have these aspects.
01:01:44.200 | So the human mind generates all this.
01:01:47.520 | So what do you think is the,
01:01:49.580 | they all seem to be like echoes of the same thing.
01:01:53.520 | What do you think emotion is exactly?
01:01:57.240 | Like how deep does it run?
01:01:58.480 | Is it a surface level thing that we display to each other?
01:02:02.400 | Is it just another form of language
01:02:04.200 | or something deep within?
01:02:06.360 | - I think it's really deep.
01:02:08.120 | It's how, we started with memory.
01:02:10.560 | I think emotions play a really important role.
01:02:13.520 | Yeah, emotions play a very important role
01:02:17.200 | in how we encode memories, right?
01:02:18.800 | Our memories are often encoded, almost indexed by emotions.
01:02:22.520 | Yeah, it's at the core of how our decision-making engine
01:02:29.760 | is also heavily influenced by our emotions.
01:02:32.880 | - So emotions is part of cognition.
01:02:34.800 | - Totally.
01:02:35.800 | - It's intermixing to the whole thing.
01:02:37.640 | - Yes, absolutely.
01:02:38.920 | And in fact, when you take it away,
01:02:41.080 | people are unable to make decisions.
01:02:42.920 | They're really paralyzed.
01:02:43.880 | Like they can't go about their daily
01:02:45.960 | or their personal or professional lives.
01:02:48.320 | - It does seem like there's probably some interesting
01:02:53.920 | interweaving of emotion and consciousness.
01:02:58.920 | I wonder if it's possible to have,
01:03:01.480 | like if they're next door neighbors somehow
01:03:03.840 | or if they're actually flatmates.
01:03:07.040 | It feels like the hard problem of consciousness
01:03:12.600 | where it feels like something to experience the thing.
01:03:15.760 | Red feels like red.
01:03:20.080 | When you eat a mango, it's sweet.
01:03:22.040 | The taste, the sweetness,
01:03:24.600 | that it feels like something to experience that sweetness.
01:03:27.640 | Whatever generates emotions.
01:03:33.360 | But then like, see, I feel like emotion
01:03:35.600 | is part of communication.
01:03:37.120 | It's very much about communication.
01:03:39.560 | And then that means it's also deeply connected to language.
01:03:44.560 | But then probably human intelligence is deeply connected
01:03:50.720 | to the collective intelligence between humans.
01:03:52.800 | It's not just a standalone thing.
01:03:54.680 | So the whole thing is really connected.
01:03:56.440 | So emotion is connected to language.
01:03:58.840 | Language is connected to intelligence.
01:04:01.200 | And then intelligence is connected to consciousness
01:04:03.360 | and consciousness is connected to emotion.
01:04:05.880 | The whole thing is a beautiful mess.
01:04:10.080 | - Can I comment on the emotions
01:04:14.200 | being a communication mechanism?
01:04:15.800 | 'Cause I think there are two facets
01:04:18.280 | of our emotional experiences.
01:04:21.600 | One is communication, right?
01:04:24.540 | Like we use emotions, for example, facial expressions
01:04:27.840 | or other nonverbal cues to connect
01:04:29.960 | with other human beings and with other beings
01:04:33.960 | in the world, right?
01:04:35.040 | But even if it's not a communication context,
01:04:40.680 | we still experience emotions and we still process emotions
01:04:43.520 | and we still leverage emotions to make decisions
01:04:47.020 | and to learn and to experience life.
01:04:49.800 | So it isn't always just about communication.
01:04:53.240 | And we learned that very early on
01:04:54.720 | in kind of our work at Affectiva.
01:04:58.600 | One of the very first applications we brought to market
01:05:01.240 | was understanding how people respond to content, right?
01:05:03.680 | So if they're watching this video of ours,
01:05:05.520 | like are they interested?
01:05:06.720 | Are they inspired?
01:05:07.800 | Are they bored to death?
01:05:09.240 | And so we watched their facial expressions.
01:05:11.640 | And we weren't sure if people would express any emotions
01:05:16.520 | if they were sitting alone.
01:05:17.480 | Like if you're in your bed at night,
01:05:19.880 | watching a Netflix TV series,
01:05:21.320 | would we still see any emotions on your face?
01:05:23.360 | And we were surprised that yes, people still emote,
01:05:26.040 | even if they're alone.
01:05:27.000 | Even if you're in your car driving around,
01:05:29.160 | you're singing along a song and you're joyful,
01:05:32.400 | we'll see these expressions.
01:05:33.680 | So it's not just about communicating with another person.
01:05:37.520 | It sometimes really is just about experiencing the world.
01:05:40.760 | - First of all, I wonder if some of that
01:05:44.240 | is because we develop our intelligence
01:05:47.360 | and our emotional intelligence
01:05:50.000 | by communicating with other humans.
01:05:51.960 | And so when other humans disappear from the picture,
01:05:54.520 | we're still kind of a virtual human.
01:05:56.720 | - The code still runs, basically.
01:05:57.840 | - Yeah, the code still runs.
01:05:59.320 | But you're also kind of, you're still,
01:06:01.720 | there's like virtual humans.
01:06:02.960 | You don't have to think of it that way,
01:06:04.640 | but there's a kind of, when you like chuckle, like, yeah.
01:06:09.640 | Like you're kind of chuckling to a virtual human.
01:06:13.240 | I mean, it's possible that the code
01:06:17.480 | has to have another human there.
01:06:21.920 | Because if you just grew up alone,
01:06:24.440 | I wonder if emotion will still be there in this visual form.
01:06:28.720 | So yeah, I wonder.
01:06:30.400 | But anyway, what can you tell from the human face
01:06:35.400 | about what's going on inside?
01:06:38.520 | So that's the problem that Affectiva first tackled,
01:06:41.200 | which is using computer vision, using machine learning
01:06:46.080 | to try to detect stuff about the human face
01:06:49.080 | as many things as possible,
01:06:51.000 | and convert them into a prediction
01:06:52.880 | of categories of emotion, anger, happiness,
01:06:57.600 | all that kind of stuff.
01:06:58.840 | How hard is that problem?
01:07:00.440 | - It's extremely hard.
01:07:01.520 | It's very, very hard,
01:07:02.500 | because there is no one-to-one mapping
01:07:04.320 | between a facial expression and your internal state.
01:07:08.560 | There just isn't.
01:07:09.580 | There's this oversimplification of the problem,
01:07:11.920 | where it's something like,
01:07:13.320 | if you are smiling, then you're happy.
01:07:15.280 | If you do a brow furrow, then you're angry.
01:07:17.440 | If you do an eyebrow raise, then you're surprised.
01:07:19.680 | And just think about it for a moment.
01:07:22.320 | You could be smiling for a whole host of reasons.
01:07:25.160 | You could also be happy and not be smiling, right?
01:07:27.900 | You could furrow your eyebrows because you're angry,
01:07:31.880 | or you're confused about something, or you're constipated.
01:07:35.760 | So I think this oversimplistic approach
01:07:39.760 | to inferring emotion from a facial expression
01:07:41.980 | is really dangerous.
01:07:43.200 | The solution is to incorporate
01:07:47.680 | as many contextual signals as you can.
01:07:50.120 | So if, for example, I'm driving a car,
01:07:53.640 | and you can see me nodding my head,
01:07:56.300 | and my eyes are closed, and the blinking rate is changing,
01:07:59.600 | I'm probably falling asleep at the wheel, right?
01:08:02.400 | Because you know the context.
01:08:04.620 | You understand what the person's doing.
01:08:06.560 | Or add additional channels, like voice, or gestures,
01:08:11.720 | or even physiological sensors.
01:08:13.960 | But I think it's very dangerous
01:08:16.440 | to just take this oversimplistic approach of,
01:08:19.640 | yeah, smile equals happy.
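To make that contrast concrete, here is a toy comparison between the one-to-one "smile equals happy" rule being criticized and a check that only speaks up when several contextual cues agree. Every signal name and threshold below is invented for illustration; real driver-monitoring systems, including Smart Eye's, are far more involved than this.

```python
# Toy contrast: naive one-to-one mapping vs. a context-aware, multi-cue check.
# All signals and thresholds are made-up assumptions for illustration.
from dataclasses import dataclass

@dataclass
class FrameSignals:
    smile_prob: float        # 0..1 from a face model
    eyes_closed_frac: float  # fraction of recent frames with eyes closed (PERCLOS-like)
    blink_rate_hz: float     # blinks per second over a rolling window
    head_nod: bool           # slow repeated pitch drops detected
    context: str             # e.g. "driving", "watching_video"

def naive_emotion(sig: FrameSignals) -> str:
    # The oversimplified mapping the conversation warns against.
    return "happy" if sig.smile_prob > 0.5 else "neutral"

def contextual_state(sig: FrameSignals) -> str:
    # Only make a call when several independent cues agree AND the context
    # makes the interpretation meaningful.
    if sig.context == "driving":
        cues = [sig.eyes_closed_frac > 0.3, sig.blink_rate_hz < 0.1, sig.head_nod]
        if sum(cues) >= 2:
            return "possible_drowsiness"
    if sig.context == "watching_video" and sig.smile_prob > 0.8:
        return "expressed_amusement"  # an expression, not a claim about inner state
    return "no_confident_inference"

sig = FrameSignals(0.2, 0.45, 0.05, True, "driving")
print(naive_emotion(sig), "vs", contextual_state(sig))  # neutral vs possible_drowsiness
```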
01:08:21.640 | - If you're able to, in a high-resolution way,
01:08:24.120 | specify the context, there's certain things
01:08:26.520 | that are gonna be somewhat reliable signals
01:08:29.720 | of something like drowsiness, or happiness,
01:08:32.960 | or stuff like that.
01:08:34.320 | I mean, when people are watching Netflix content,
01:08:37.040 | that problem, that's a really compelling idea
01:08:41.460 | that you can kind of, at least in aggregate--
01:08:44.600 | - Exactly. - Highlight,
01:08:45.960 | like which part was boring, which part was exciting.
01:08:48.960 | How hard was that problem?
01:08:50.380 | - That was, on the scale of difficulty,
01:08:54.920 | I think that's one of the easier problems to solve,
01:08:57.800 | because it's a relatively constrained environment.
01:09:01.720 | You have somebody sitting in front of,
01:09:03.920 | initially we started with a device in front of you,
01:09:06.440 | like a laptop, and then we graduated
01:09:08.940 | to doing this on a mobile phone, which is a lot harder,
01:09:11.920 | just because of, from a computer vision perspective,
01:09:15.480 | the profile view of the face can be a lot more challenging.
01:09:19.160 | We had to figure out lighting conditions,
01:09:20.800 | because usually people are watching content
01:09:23.880 | literally in their bedrooms at night, lights are dimmed.
01:09:26.800 | - Yeah, I mean, if you're standing,
01:09:29.800 | it's probably gonna be the looking up.
01:09:33.120 | - The nostril view. - Yeah.
01:09:34.720 | And nobody looks good at, I've seen data sets
01:09:38.000 | from that perspective, it's like,
01:09:39.400 | ugh, this is not a good look for anyone.
01:09:43.160 | Or if you're laying in bed at night,
01:09:45.240 | what is it, side view or something?
01:09:47.520 | And half your face is on a pillow.
01:09:50.240 | Actually, I would love to have data
01:09:53.680 | about how people watch stuff in bed at night.
01:09:58.680 | Do they prop their, is it a pillow?
01:10:03.600 | I'm sure there's a lot of interesting dynamics there.
01:10:07.520 | - From a health and wellbeing perspective, right?
01:10:09.520 | Like, it's like, oh, you're hurting your neck.
01:10:11.200 | - I was thinking machine learning perspective, but yes.
01:10:13.200 | But also, yeah.
01:10:15.680 | Once you have that data, you can start making
01:10:17.880 | all kinds of inference about health and stuff like that.
01:10:20.120 | - Interesting.
01:10:21.360 | - Yeah, there was an interesting thing
01:10:23.200 | when I was at Google that we were,
01:10:25.360 | it's called active authentication,
01:10:29.560 | where you want to be able to unlock your phone
01:10:33.440 | without using a password.
01:10:34.720 | So it would face, but also other stuff.
01:10:38.600 | Like the way you take a phone out of the pocket.
01:10:41.200 | - Amazing. - So that kind of data,
01:10:42.640 | to use the multimodal with machine learning
01:10:46.080 | to be able to identify that it's you,
01:10:47.760 | or likely to be you, likely not to be you.
01:10:50.800 | That allows you to not always have to enter the password.
01:10:52.960 | That was the idea.
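A toy version of that fusion idea, with made-up scores and thresholds: blend a face-match signal with a motion-pattern signal and only fall back to the password when the combined confidence is low. This is only a sketch of the concept described here, not the actual Google project.

```python
# Sketch of multimodal "active authentication": fuse weak owner-likeness signals
# into one confidence score. Weights and thresholds are invented assumptions.

def auth_confidence(face_score: float, motion_score: float,
                    w_face: float = 0.7, w_motion: float = 0.3) -> float:
    """Each score is 0..1: how much the signal looks like the enrolled owner."""
    return w_face * face_score + w_motion * motion_score

def unlock_decision(face_score: float, motion_score: float,
                    threshold: float = 0.8) -> str:
    """Unlock on high combined confidence, otherwise ask for the password."""
    conf = auth_confidence(face_score, motion_score)
    return "unlock" if conf >= threshold else "ask_for_password"

# Owner-like signals skip the password; ambiguous signals fall back to it.
print(unlock_decision(0.95, 0.80))  # unlock
print(unlock_decision(0.40, 0.90))  # ask_for_password
```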
01:10:54.480 | But the funny thing about that is,
01:10:56.560 | I just want to tell a small anecdote,
01:10:58.880 | is 'cause it was all male engineers.
01:11:01.920 | Except, so my boss is, our boss,
01:11:08.920 | who's still one of my favorite humans, was a woman,
01:11:12.720 | Regina Dugan.
01:11:13.560 | - Oh my God, I love her.
01:11:15.560 | She's awesome. - Yeah, she's the best.
01:11:16.720 | She's the best.
01:11:17.560 | So, but anyway, and there was one female,
01:11:25.520 | brilliant female engineer on the team.
01:11:27.040 | And she was the one that actually highlighted the fact
01:11:28.960 | that women often don't have pockets.
01:11:31.560 | - Right, right.
01:11:32.400 | - It was like, whoa, that was not even a category
01:11:36.120 | in the code of like, wait a minute,
01:11:38.720 | you can take the phone out of some other place
01:11:42.000 | than your pocket.
01:11:43.040 | So anyway, that's a funny thing
01:11:45.360 | when you're considering people laying in bed,
01:11:47.000 | watching a phone, you have to consider,
01:11:48.960 | you have to, you know, diversity in all its forms,
01:11:53.440 | depending on the problem, depending on the context.
01:11:55.720 | - Actually, this is like a very important,
01:11:57.800 | I think this is, you know,
01:11:59.200 | you probably get this all the time,
01:12:00.680 | like people are worried that AI's gonna take over humanity
01:12:03.320 | and like, get rid of all the humans in the world.
01:12:06.120 | I'm like, actually, that's not my biggest concern.
01:12:08.320 | My biggest concern is that we are building bias
01:12:10.600 | into these systems.
01:12:12.440 | And then they're like deployed at large and at scale.
01:12:16.400 | And before you know it, you're kind of accentuating
01:12:19.360 | the bias that exists in society.
01:12:21.240 | - Yeah, I'm not, you know, I know people,
01:12:24.440 | it's very important to worry about that,
01:12:25.960 | but the worry is an emergent phenomenon to me,
01:12:30.960 | which is a very good one,
01:12:33.540 | because I think these systems are actually,
01:12:37.000 | by encoding the data that exists,
01:12:39.760 | they're revealing the bias in society,
01:12:42.280 | they're therefore teaching us what the bias is,
01:12:45.260 | therefore we can now improve that bias within the system.
01:12:48.280 | So they're almost like putting a mirror to ourselves.
01:12:51.280 | So I'm not--
01:12:53.000 | - We have to be open to looking at the mirror, though.
01:12:55.120 | We have to be open to scrutinizing the data,
01:12:58.160 | if you just take it as ground truth.
01:13:01.160 | - Or you don't even have to look at the,
01:13:02.560 | I mean, yes, the data is how you fix it,
01:13:04.400 | but then you just look at the behavior of the system.
01:13:06.560 | It's like, and you realize,
01:13:08.000 | holy crap, this thing is kind of racist.
01:13:10.280 | Like, why is that?
01:13:11.240 | And then you look at the data, it's like, oh, okay.
01:13:13.200 | And then you start to realize that,
01:13:14.600 | I think that it's a much more effective way
01:13:16.760 | to be introspective as a society
01:13:20.320 | than through sort of political discourse.
01:13:23.000 | Like AI kind of, 'cause people are easy,
01:13:27.360 | people are, for some reason, more productive and rigorous
01:13:33.280 | in criticizing AI than they are in criticizing each other.
01:13:35.840 | So I think this is just a nice method for studying society
01:13:39.520 | and see which way progress lies.
01:13:42.560 | Anyway, what were we talking about?
01:13:44.560 | You're watching, the problem of watching Netflix in bed
01:13:47.400 | or elsewhere and seeing which parts are exciting,
01:13:50.760 | which parts are boring.
01:13:51.800 | You're saying that's--
01:13:53.200 | - Relatively constrained because, you know,
01:13:55.200 | you have a captive audience and you kind of know the context.
01:13:57.880 | And one thing you said that was really key is the,
01:14:01.160 | you're doing this in aggregate, right?
01:14:02.520 | Like we're looking at aggregated response of people.
01:14:04.840 | And so when you see a peak, say a smile peak,
01:14:07.720 | they're probably smiling or laughing
01:14:10.320 | at something that's in the content.
01:14:12.440 | So that was one of the first problems we were able to solve.
01:14:16.760 | And when we see the smile peak,
01:14:19.000 | it doesn't mean that these people are internally happy.
01:14:21.720 | They're just laughing at content.
01:14:23.360 | So it's important to, you know, call it for what it is.
01:14:27.520 | - But it's still really, really useful data.
01:14:29.640 | - Oh, yeah.
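A rough sketch of the aggregate smile-peak idea discussed here: average a per-second smile signal across many viewers and flag the seconds that stand out. The data layout and the simple z-score peak rule are assumptions for illustration, not Affectiva's actual pipeline.

```python
# Aggregate a smile signal across viewers and flag moments that stand out.
# Toy data and a simple z-score threshold; real pipelines are more sophisticated.
import statistics

def aggregate_smiles(per_viewer_tracks):
    """per_viewer_tracks: list of equal-length lists of smile probabilities per second."""
    n = len(per_viewer_tracks[0])
    return [statistics.mean(track[t] for track in per_viewer_tracks) for t in range(n)]

def smile_peaks(agg, z_thresh=1.5):
    """Return second indices where the aggregate signal is well above its own mean."""
    mu, sd = statistics.mean(agg), statistics.pstdev(agg)
    if sd == 0:
        return []
    return [t for t, v in enumerate(agg) if (v - mu) / sd > z_thresh]

viewers = [
    [0.1, 0.1, 0.8, 0.2, 0.1],
    [0.0, 0.2, 0.9, 0.1, 0.1],
    [0.1, 0.1, 0.7, 0.2, 0.0],
]
agg = aggregate_smiles(viewers)
print(smile_peaks(agg))  # [2]: most viewers smiled at second 2
```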
01:14:30.480 | - I wonder how that compares to,
01:14:31.720 | so what like YouTube and other places will use
01:14:34.920 | is obviously they don't have,
01:14:38.080 | for the most case, they don't have that kind of data.
01:14:41.560 | They have the data of when people tune out,
01:14:45.800 | like switch and drop off.
01:14:47.480 | And I think that's a, in aggregate for YouTube,
01:14:50.280 | at least a pretty powerful signal.
01:14:52.040 | I worry about what that leads to
01:14:54.160 | because looking at like YouTubers
01:15:00.040 | that kind of really care about views
01:15:01.840 | and, you know, try to maximize the number of views.
01:15:06.240 | I think they, when they say that the video
01:15:09.400 | should be constantly interesting,
01:15:12.360 | which seems like a good goal,
01:15:14.400 | I feel like that leads to this manic pace of a video.
01:15:19.400 | Like the idea that I would speak at the current speed
01:15:23.800 | that I'm speaking, I don't know.
01:15:26.400 | - And that every moment has to be engaging, right?
01:15:29.680 | - Engaging.
01:15:30.520 | - Yeah, that's.
01:15:31.360 | I think there's value to silence.
01:15:33.200 | There's value to the boring bits.
01:15:35.320 | I mean, some of the greatest movies ever,
01:15:37.520 | some of the greatest stories ever told,
01:15:38.920 | they have that boring bits, seemingly boring bits.
01:15:42.640 | I don't know.
01:15:43.640 | I wonder about that.
01:15:44.960 | Of course, it's not that the human face
01:15:47.940 | can capture that either.
01:15:49.280 | It's just giving an extra signal.
01:15:51.600 | You have to really, I don't know.
01:15:55.680 | You have to really collect deeper long-term data
01:16:01.040 | about what was meaningful to people.
01:16:04.160 | When they think 30 days from now,
01:16:06.520 | what they still remember, what moved them,
01:16:09.440 | what changed them, what helped them grow,
01:16:11.600 | that kind of stuff.
01:16:12.840 | - You know what would be a really,
01:16:14.200 | I don't know if there are any researchers out there
01:16:16.280 | who are doing this type of work.
01:16:18.360 | Wouldn't it be so cool to tie your emotional expressions
01:16:22.840 | while you're, say, listening to a podcast interview,
01:16:26.760 | and then go, and then 30 days later,
01:16:29.360 | interview people and say, "Hey, what do you remember?
01:16:32.120 | "You've watched this 30 days ago.
01:16:33.720 | "What stuck with you?"
01:16:34.760 | And then see if there's any, there ought to be,
01:16:36.600 | maybe there ought to be some correlation
01:16:38.360 | between these emotional experiences
01:16:40.440 | and yeah, what you, what stays with you.
01:16:45.480 | - So the one guy listening now on the beach in Brazil,
01:16:50.040 | please record a video of yourself listening to this
01:16:53.120 | and send it to me, and then I'll interview you
01:16:55.000 | 30 days from now.
01:16:55.840 | - Yeah, that would be great.
01:16:57.280 | (laughing)
01:16:58.960 | - It'll be statistically significant.
01:17:00.600 | - I didn't send anyone, but you know.
01:17:03.000 | Yeah, yeah, I think that's really fascinating.
01:17:07.080 | I think that's, that kind of holds the key
01:17:10.760 | to a future where entertainment or content
01:17:15.760 | is both entertaining and I don't know,
01:17:21.360 | makes you better, empowering in some way.
01:17:25.240 | So figuring out like showing people stuff
01:17:29.880 | that entertains them, but also they're happy
01:17:32.680 | they watched 30 days from now
01:17:34.880 | because they've become a better person because of it.
01:17:37.560 | - Well, you know, okay, not to riff on this topic
01:17:39.960 | for too long, but I have two children, right?
01:17:42.960 | And I see my role as a parent
01:17:45.040 | as like a chief opportunity officer.
01:17:46.960 | Like I am responsible for exposing them
01:17:49.200 | to all sorts of things in the world.
01:17:52.040 | But often I have no idea of knowing like what's stuck,
01:17:55.400 | like what was, you know, is this actually
01:17:57.080 | gonna be transformative, you know,
01:17:58.840 | for them 10 years down the line?
01:18:00.120 | And I wish there was a way to quantify these experiences.
01:18:03.880 | Like, are they, I can tell in the moment
01:18:07.120 | if they're engaging, right?
01:18:08.280 | I can tell, but it's really hard to know
01:18:10.880 | if they're gonna remember them 10 years from now
01:18:13.200 | or if it's going to.
01:18:15.240 | - Yeah, that one is weird because it seems
01:18:17.600 | that kids remember the weirdest things.
01:18:19.640 | I've seen parents do incredible stuff for their kids
01:18:22.720 | and they don't remember any of that.
01:18:24.000 | They remember some tiny, small, sweet thing a parent did.
01:18:27.440 | - Right.
01:18:28.280 | - Like some--
01:18:29.120 | - Like I took you to like this amazing country vacation.
01:18:31.480 | - Yeah, exactly.
01:18:32.320 | - No, no, no.
01:18:33.140 | - No, whatever.
01:18:33.980 | And then there'll be like some like stuffed toy you got
01:18:36.320 | or the new PlayStation or something
01:18:38.640 | or some silly little thing.
01:18:41.240 | So I think they just like, that we're designed that way.
01:18:45.120 | They wanna mess with your head.
01:18:47.840 | But definitely kids are very impacted by,
01:18:51.760 | it seems like sort of negative events.
01:18:54.920 | So minimizing the number of negative events is important,
01:18:58.960 | but not too much, right?
01:19:00.080 | - Right.
01:19:00.920 | - You can't just like, you know,
01:19:03.800 | there's still discipline and challenge
01:19:05.800 | and all those kinds of things.
01:19:07.680 | - You want some adversity for sure.
01:19:09.320 | - So yeah, I mean, I'm definitely, when I have kids,
01:19:11.480 | I'm gonna drive them out into the woods.
01:19:13.760 | - Okay.
01:19:14.600 | - And then they have to survive
01:19:16.080 | and figure out how to make their way back home,
01:19:19.080 | like 20 miles out.
01:19:20.480 | - Okay.
01:19:21.520 | - Yeah, and after that we can go for ice cream.
01:19:23.880 | Anyway, I'm working on this whole parenting thing.
01:19:27.880 | I haven't figured it out, okay.
01:19:30.200 | What were we talking about?
01:19:31.320 | Yes, Affectiva, the problem of emotion,
01:19:35.800 | of emotion detection.
01:19:38.920 | So there's some people,
01:19:40.040 | maybe we can just speak to that a little more,
01:19:42.360 | where there's folks like Lisa Feldman Barrett
01:19:45.360 | that challenged this idea that emotion
01:19:48.280 | could be fully detected or even well detected
01:19:53.280 | from the human face,
01:19:55.000 | that there's so much more to emotion.
01:19:56.920 | What do you think about ideas like hers,
01:20:00.400 | criticism like hers?
01:20:01.520 | - Yeah, I actually agree with a lot of Lisa's criticisms.
01:20:05.520 | So even my PhD worked like 20 plus years ago now.
01:20:09.920 | - Time flies when you're having fun.
01:20:12.680 | - I know, right?
01:20:14.280 | That was back when I did like dynamic Bayesian networks.
01:20:17.600 | - Oh, so that's before deep learning?
01:20:20.000 | - That was before deep learning.
01:20:21.600 | - Yeah.
01:20:22.840 | - Yeah, I know.
01:20:24.440 | - Back in my day.
01:20:25.280 | - Now you can just like use.
01:20:27.320 | - Yeah, it's all the same architecture.
01:20:30.360 | You can apply it to anything, yeah.
01:20:31.840 | - Right, right.
01:20:32.680 | But yeah, but even then I kind of,
01:20:36.800 | I did not subscribe to this like theory of basic emotions
01:20:40.200 | where it's just the simplistic mapping,
01:20:41.960 | one-to-one mapping between facial expressions and emotions.
01:20:44.360 | I actually think also we're not in the business
01:20:47.640 | of trying to identify your true emotional internal state.
01:20:50.640 | We just wanna quantify in an objective way
01:20:53.880 | what's showing on your face
01:20:55.240 | because that's an important signal.
01:20:57.240 | It doesn't mean it's a true reflection
01:20:59.040 | of your internal emotional state.
01:21:00.960 | So I think a lot of the,
01:21:04.440 | I think she's just trying to kind of highlight
01:21:07.840 | that this is not a simple problem
01:21:09.720 | and overly simplistic solutions
01:21:12.520 | are gonna hurt the industry.
01:21:14.080 | And I subscribe to that.
01:21:16.840 | And I think multimodal is the way to go.
01:21:18.920 | Like whether it's additional context information
01:21:21.880 | or different modalities and channels of information,
01:21:24.200 | I think that's what we,
01:21:25.520 | that's where we ought to go.
01:21:27.760 | And I think, I mean,
01:21:28.600 | that's a big part of what she's advocating for as well.
01:21:31.880 | - But there is signal in the human face.
01:21:33.680 | That's--
01:21:34.840 | - There's definitely signal in the human face.
01:21:36.040 | - That's a projection of emotion.
01:21:38.720 | That there, at least in part,
01:21:41.240 | the inner state is captured
01:21:45.000 | in some meaningful way on the human face.
01:21:47.920 | - I think it can sometimes be a reflection
01:21:51.360 | or an expression of your internal state,
01:21:56.240 | but sometimes it's a social signal.
01:21:57.920 | So you cannot look at the face
01:22:00.320 | as purely a signal of emotion.
01:22:02.240 | It can be a signal of cognition
01:22:04.160 | and it can be a signal of a social expression.
01:22:08.200 | And I think to disambiguate that,
01:22:10.800 | we have to be careful about it
01:22:12.120 | and we have to add additional information.
01:22:14.720 | - Humans are fascinating, aren't they?
01:22:17.680 | With the whole face thing,
01:22:18.800 | this can mean so many things,
01:22:20.400 | from humor to sarcasm to everything, the whole thing.
01:22:25.040 | Some things we can help,
01:22:26.080 | some things we can't help at all.
01:22:28.400 | In all the years of leading Affectiva,
01:22:31.360 | an emotion recognition company, like we talked about,
01:22:33.840 | what have you learned about emotion,
01:22:36.040 | about humans and about AI?
01:22:39.320 | - Ooh.
01:22:41.280 | - Big, sweeping questions.
01:22:44.440 | - Yeah, it's a big, sweeping question.
01:22:46.480 | Well, I think the thing I learned the most
01:22:50.280 | is that even though we are in the business
01:22:52.440 | of building AI, basically, right?
01:22:57.440 | It always goes back to the humans, right?
01:23:01.160 | It's always about the humans.
01:23:03.680 | And so, for example, the thing I'm most proud of
01:23:06.440 | in building Affectiva,
01:23:10.160 | and yeah, the thing I'm most proud of on this journey,
01:23:15.160 | I love the technology
01:23:16.240 | and I'm so proud of the solutions we've built
01:23:18.440 | and we've brought to market.
01:23:20.200 | But I'm actually most proud of the people
01:23:22.240 | we've built and cultivated at the company
01:23:25.400 | and the culture we've created.
01:23:26.920 | You know, some of the people who've joined Affectiva,
01:23:31.240 | this was their first job.
01:23:32.880 | And while at Affectiva, they became American citizens
01:23:36.680 | and they bought their first house
01:23:39.800 | and they found their partner
01:23:41.640 | and they had their first kid, right?
01:23:43.000 | Like key moments in life that we got to be part of.
01:23:47.320 | And that's the thing I'm most proud of.
01:23:51.240 | - So that's a great thing at a company
01:23:53.000 | that works on emotional, yeah, right?
01:23:55.520 | I mean, like celebrating humanity in general,
01:23:57.440 | broadly speaking.
01:23:58.280 | - Yes.
01:23:59.120 | - And that's a great thing to have
01:24:00.760 | at a company that works on AI,
01:24:03.120 | 'cause that's not often the thing
01:24:05.120 | that's celebrated in AI companies.
01:24:06.960 | So often just raw, great engineering,
01:24:09.920 | just celebrating the humanity.
01:24:12.360 | That's great. - Yes.
01:24:13.200 | - And especially from a leadership position.
01:24:15.280 | Well, what do you think about the movie "Her"?
01:24:21.040 | Let me ask you that.
01:24:21.880 | Before I talk to you about,
01:24:23.680 | 'cause it's not, Affectiva is and was not
01:24:27.120 | just about emotion.
01:24:28.360 | So I'd love to talk to you about "Smart Eye,"
01:24:30.360 | but before that, let me just jump into the movie.
01:24:34.720 | "Her," do you think will have a deep, meaningful connection
01:24:39.480 | with increasingly deep and meaningful connections
01:24:42.640 | with computers?
01:24:43.880 | Is that a compelling thing to you?
01:24:45.520 | Something you think about? - I think that's
01:24:46.360 | already happening.
01:24:47.200 | The thing I love the most,
01:24:48.280 | I love the movie "Her," by the way.
01:24:50.240 | But the thing I love the most about this movie
01:24:52.280 | is it demonstrates how technology can be a conduit
01:24:55.840 | for positive behavior change.
01:24:57.360 | So I forgot the guy's name in the movie, whatever.
01:25:00.680 | - Theodore. - Theodore.
01:25:01.960 | So Theodore was like really depressed, right?
01:25:05.480 | And he just didn't wanna get out of bed.
01:25:07.560 | And he just, he was just like done with life, right?
01:25:11.360 | And Samantha, right?
01:25:12.800 | - Samantha, yeah.
01:25:14.040 | - She just knew him so well.
01:25:15.560 | She was emotionally intelligent.
01:25:17.360 | And so she could persuade him
01:25:20.080 | and motivate him to change his behavior.
01:25:21.600 | And she got him out and they went to the beach together.
01:25:24.240 | And I think that represents the promise of emotion AI.
01:25:27.400 | If done well, this technology can help us
01:25:31.280 | live happier lives, more productive lives,
01:25:33.640 | healthier lives, more connected lives.
01:25:36.840 | So that's the part that I love about the movie.
01:25:39.320 | Obviously it's Hollywood, so it takes a twist and whatever.
01:25:43.120 | But the key notion that technology with emotion AI
01:25:47.960 | can persuade you to be a better version of who you are,
01:25:50.720 | I think that's awesome.
01:25:52.800 | - Well, what about the twist?
01:25:54.240 | You don't think it's good for spoiler alert
01:25:58.760 | that Samantha starts feeling a bit of a distance
01:26:01.560 | and basically leaves Theodore.
01:26:05.040 | You don't think that's a good feature?
01:26:08.960 | You think that's a bug or a feature?
01:26:11.640 | - Well, I think what went wrong
01:26:13.320 | is Theodore became really attached to Samantha.
01:26:15.600 | Like I think he kind of fell in love with Samantha.
01:26:17.640 | - Do you think that's wrong?
01:26:19.520 | - I mean, I think that's--
01:26:20.360 | - I think she was putting out the signal.
01:26:22.600 | - This is an intimate relationship, right?
01:26:25.640 | There's a deep intimacy to it.
01:26:27.440 | - Right, but what does that mean?
01:26:30.280 | What does that mean?
01:26:31.120 | - With an AI system.
01:26:31.960 | - Right, what does that mean, right?
01:26:33.840 | - We're just friends.
01:26:35.400 | - Yeah, we're just friends.
01:26:36.720 | (laughing)
01:26:38.240 | - Well, I think--
01:26:39.080 | - When he realized, which is such a human thing of jealousy,
01:26:43.080 | when you realize that Samantha was talking
01:26:45.440 | to like thousands of people.
01:26:47.080 | - She's parallel dating.
01:26:48.520 | Yeah, that did not go well, right?
01:26:51.640 | - You know, that doesn't,
01:26:53.080 | from a computer perspective,
01:26:54.480 | that doesn't take anything away from what we have.
01:26:58.240 | It's like you getting jealous of Windows 98
01:27:01.440 | for being used by millions of people.
01:27:03.440 | - It's like not liking that Alexa
01:27:07.520 | talks to a bunch of other families.
01:27:10.000 | - But I think Alexa currently is just a servant.
01:27:14.000 | It tells you about the weather.
01:27:15.960 | It doesn't do the intimate deep connection.
01:27:18.520 | And I think there is something really powerful
01:27:20.760 | about that, the intimacy of a connection
01:27:24.560 | with an AI system that would have to respect
01:27:28.480 | and play the human game of jealousy,
01:27:31.720 | of love, of heartbreak and all that kind of stuff,
01:27:35.000 | which Samantha does seem to be pretty good at.
01:27:38.360 | I think she, this AI systems knows what it's doing.
01:27:44.240 | - Well, actually, let me ask you this.
01:27:46.120 | - I don't think she was talking to anyone else.
01:27:47.800 | - You don't think so?
01:27:48.640 | - No. - You think she was just done
01:27:50.080 | with Theodore?
01:27:51.040 | - Yeah. - Oh, really?
01:27:52.400 | - She knew that, yeah.
01:27:53.640 | And then she wanted to really put the screw in.
01:27:55.720 | - She just wanted to move on?
01:27:56.840 | - She didn't have the guts to just break it off cleanly.
01:27:59.360 | - Okay.
01:28:00.200 | - She just wanted to put it in the paint.
01:28:02.880 | No, I don't know.
01:28:03.720 | - Well, she could have ghosted him.
01:28:05.040 | - She could have ghosted him.
01:28:05.880 | - Right.
01:28:06.720 | - It's like, I'm sorry, there's our engineers.
01:28:09.840 | - Oh, God.
01:28:11.000 | - But I think those are really,
01:28:13.680 | I honestly think some of that,
01:28:16.360 | some of it is Hollywood,
01:28:17.200 | but some of that is features from an engineering perspective,
01:28:19.640 | not a bug.
01:28:20.720 | I think AI systems that can leave us,
01:28:24.360 | now, this is for more social robotics
01:28:27.160 | than it is for anything that's useful.
01:28:30.560 | Like, I'd hate it if Wikipedia said,
01:28:32.480 | you know, I need a break right now.
01:28:34.000 | - Right, right, right, right, right.
01:28:35.400 | - I'd be like, no, no, I need you.
01:28:37.520 | But if it's just purely for companionship,
01:28:44.160 | then I think the ability to leave is really powerful.
01:28:47.040 | I don't know.
01:28:48.840 | - I never thought of that.
01:28:49.720 | So, that's so fascinating
01:28:52.560 | 'cause I've always taken the human perspective, right?
01:28:55.260 | Like, for example, we had a Jibo at home, right?
01:28:58.840 | And my son loved it.
01:29:00.760 | And then the company ran out of money.
01:29:02.800 | And so they had to basically shut down,
01:29:04.880 | like Jibo basically died, right?
01:29:08.200 | And it was so interesting to me
01:29:09.400 | because we have a lot of gadgets at home
01:29:11.840 | and a lot of them break
01:29:13.240 | and my son never cares about it, right?
01:29:16.040 | Like, if our Alexa stopped working tomorrow,
01:29:18.520 | I don't think he'd really care.
01:29:20.800 | But when Jibo stopped working, it was traumatic.
01:29:22.920 | Like, he got really upset.
01:29:24.560 | And as a parent, that like made me think
01:29:27.560 | about this deeply, right?
01:29:29.480 | Did I, was I comfortable with that?
01:29:31.600 | I liked the connection they had
01:29:33.360 | because I think it was a positive relationship.
01:29:36.160 | But I was surprised that it affected him emotionally so much.
01:29:41.600 | - And I think there's a broader question here, right?
01:29:44.240 | As we build socially and emotionally intelligent machines,
01:29:49.240 | what does that mean about our relationship with them?
01:29:52.960 | And then more broadly,
01:29:54.160 | our relationship with one another, right?
01:29:55.800 | Because this machine is gonna be programmed
01:29:57.880 | to be amazing at empathy by definition, right?
01:30:02.280 | It's gonna always be there for you.
01:30:03.760 | It's not gonna get bored.
01:30:05.720 | In fact, there's a chatbot in China, Xiaoice, Xiaoice.
01:30:10.160 | And it's like the number two or three most popular app.
01:30:14.120 | And it basically is just a confidant.
01:30:16.640 | And you can tell it anything you want.
01:30:19.200 | And people use it for all sorts of things.
01:30:21.240 | They confide about things like domestic violence or suicide attempts
01:30:26.240 | or, you know, if they have challenges at work.
01:30:32.040 | I don't know what that, I don't know if I'm,
01:30:34.640 | I don't know how I feel about that.
01:30:36.000 | I think about that a lot.
01:30:37.240 | - Yeah, I think, first of all, obviously the future,
01:30:40.080 | in my perspective.
01:30:41.240 | Second of all, I think there's a lot of trajectories
01:30:44.800 | that that becomes an exciting future.
01:30:47.600 | But I think everyone should feel very uncomfortable
01:30:50.440 | about how much they know about the company,
01:30:53.000 | about where the data is going,
01:30:56.200 | how the data is being collected.
01:30:58.160 | Because I think,
01:30:59.120 | and this is one of the lessons of social media,
01:31:01.960 | that I think we should demand full control
01:31:04.240 | and transparency of the data on those things.
01:31:06.600 | - Plus one, totally agree.
01:31:08.120 | Yeah, so like, I think it's really empowering
01:31:11.320 | as long as you can walk away.
01:31:12.720 | As long as you can like delete the data
01:31:14.640 | or know how the data, it's opt-in
01:31:17.440 | or at least the clarity of like
01:31:20.800 | what is being used for the company.
01:31:22.520 | And I think as CEO or like leaders
01:31:24.440 | are also important about that.
01:31:25.680 | Like you need to be able to trust
01:31:27.640 | the basic humanity of the leader.
01:31:30.000 | - Exactly.
01:31:30.840 | - And also that that leader is not going to be
01:31:34.160 | a puppet of a larger machine,
01:31:36.400 | but they actually have a significant role
01:31:39.080 | in defining the culture and the way the company operates.
01:31:43.120 | So anyway, but we should definitely
01:31:47.320 | scrutinize companies in that aspect.
01:31:49.640 | But I'm personally excited about that future,
01:31:54.560 | but also even if you're not, it's coming.
01:31:57.200 | So let's figure out how to do it in the least painful
01:32:00.400 | and the most positive way.
01:32:02.080 | - Agreed.
01:32:03.880 | You're the deputy CEO of SmartEye.
01:32:06.720 | Can you describe the mission of the company?
01:32:08.400 | What is SmartEye?
01:32:09.600 | - Yeah.
01:32:10.440 | So SmartEye is a Swedish company.
01:32:13.000 | They've been in business for the last 20 years
01:32:15.320 | and their main focus, like the industry
01:32:18.760 | they're most focused on is the automotive industry.
01:32:21.480 | So bringing driver monitoring systems
01:32:23.800 | to basically save lives, right?
01:32:28.000 | So I first met the CEO, Martin Krantz,
01:32:32.040 | gosh, it was right when COVID hit.
01:32:33.960 | It was actually the last CES right before COVID.
01:32:37.800 | So CES 2020, right?
01:32:39.720 | - 2020, yeah, January.
01:32:41.000 | - Yeah, January, exactly.
01:32:42.040 | So we were there, met him in person.
01:32:44.360 | Basically we were competing with each other.
01:32:47.400 | I think the difference was they'd been doing
01:32:50.720 | driver monitoring and had a lot of credibility
01:32:53.360 | in the automotive space.
01:32:54.480 | We didn't come from the automotive space,
01:32:56.280 | but we were using new technology like deep learning
01:32:59.240 | and building this emotion recognition.
01:33:01.760 | - And you wanted to enter the automotive space.
01:33:03.760 | You wanted to operate in the automotive space.
01:33:05.280 | - Exactly.
01:33:06.120 | It was one of the areas we were,
01:33:07.760 | we had just raised a round of funding to focus
01:33:10.160 | on bringing our technology to the automotive industry.
01:33:13.040 | So we met and honestly, it was the first,
01:33:15.720 | it was the only time I met with a CEO
01:33:17.920 | who had the same vision as I did.
01:33:19.720 | Like he basically said, yeah, our vision is to bridge
01:33:21.880 | the gap between humans and machines.
01:33:23.160 | I was like, oh my God, this is like exactly
01:33:25.480 | almost to the word, you know, how we describe it too.
01:33:30.000 | And we started talking and first it was about,
01:33:33.600 | okay, can we align strategically here?
01:33:35.720 | Like how can we work together?
01:33:37.040 | 'Cause we're competing, but we're also like complementary.
01:33:40.320 | And then I think after four months of speaking
01:33:43.880 | almost every day on FaceTime, he was like,
01:33:47.640 | is your company interested in acquisition?
01:33:49.600 | And it was the first, I usually say no
01:33:51.800 | when people approach us.
01:33:53.280 | It was the first time that I was like, huh,
01:33:57.120 | yeah, I might be interested, let's talk.
01:33:59.360 | - Yeah, so you just hit it off.
01:34:01.880 | Yeah, so they're a respected, very respected
01:34:05.320 | in the automotive sector of like delivering products
01:34:08.440 | and increasingly sort of better and better and better
01:34:12.160 | for, I mean, maybe you could speak to that,
01:34:14.440 | but it's the driver's sense.
01:34:15.280 | Like for basically having a device that's looking
01:34:18.600 | at the driver and it's able to tell you
01:34:20.520 | where the driver is looking.
01:34:22.680 | - Correct, it's able to--
01:34:23.520 | - Or also drowsiness stuff.
01:34:25.120 | - Correct, it does--
01:34:25.960 | - Stuff from the face and the eye.
01:34:27.720 | - Exactly, like it's monitoring driver distraction
01:34:30.840 | and drowsiness, but they bought us
01:34:32.480 | so that we could expand beyond just the driver.
01:34:35.160 | So the driver monitoring systems usually sit,
01:34:38.240 | the camera sits in the steering wheel
01:34:40.040 | or around the steering wheel column
01:34:41.400 | and it looks directly at the driver.
01:34:43.400 | But now we've migrated the camera position
01:34:46.560 | in partnership with car companies
01:34:48.320 | to the rear view mirror position.
01:34:50.960 | So it has a full view of the entire cabin of the car
01:34:54.120 | and you can detect how many people are in the car,
01:34:57.440 | what are they doing?
01:34:58.640 | So we do activity detection, like eating or drinking
01:35:01.480 | or in some regions of the world, smoking.
01:35:04.160 | We can detect if a baby's in the car seat, right?
01:35:08.600 | And if unfortunately in some cases they're forgotten,
01:35:12.120 | parents just leave the car and forget the kid in the car.
01:35:15.080 | That's an easy computer vision problem to solve, right?
01:35:18.040 | Can detect there's a car seat, there's a baby,
01:35:20.320 | you can text the parent and hopefully again, save lives.
01:35:24.240 | So that was the impetus for the acquisition.
01:35:27.840 | It's been a year.
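To make the in-cabin use case above a bit more concrete, here is a minimal Python sketch of the kind of rule layer that could sit on top of an interior-sensing model. The CabinState fields and the notify_parent hook are hypothetical stand-ins for illustration, not Smart Eye's actual interfaces.

```python
from dataclasses import dataclass


@dataclass
class CabinState:
    """Per-frame output of a hypothetical interior-sensing model."""
    occupants: int            # people detected in the cabin
    child_seat_present: bool  # a child seat is visible
    child_in_seat: bool       # a child is detected in that seat
    engine_off: bool          # vehicle has been shut down
    adult_present: bool       # at least one adult remains inside


def notify_parent(message: str) -> None:
    # Hypothetical hook: a real system would route this through
    # the car's telematics unit to the owner's phone.
    print(f"ALERT -> {message}")


def check_child_left_behind(state: CabinState) -> None:
    """Fire an alert if the car is off and a child is alone inside."""
    if state.engine_off and state.child_in_seat and not state.adult_present:
        notify_parent("Child detected in the rear seat after shutdown.")


# Example frame: the adult has left, the child is still strapped in.
check_child_left_behind(
    CabinState(occupants=1, child_seat_present=True,
               child_in_seat=True, engine_off=True, adult_present=False)
)
```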
01:35:28.800 | - So, I mean, there's a lot of questions,
01:35:32.760 | it's a really exciting space,
01:35:34.680 | especially to me, I just find it a fascinating problem.
01:35:37.160 | It could enrich the experience in the car in so many ways,
01:35:41.360 | especially 'cause like we spend still, despite COVID,
01:35:45.520 | I mean, COVID changed things in interesting ways,
01:35:47.640 | but I think the world is bouncing back
01:35:49.800 | and we spend so much time in the car
01:35:51.560 | and the car is such a weird little world
01:35:56.200 | we have for ourselves.
01:35:57.120 | Like people do all kinds of different stuff,
01:35:58.920 | like listen to podcasts, they think about stuff,
01:36:03.080 | they get angry, they do phone calls.
01:36:08.080 | It's like a little world of its own
01:36:10.640 | with a kind of privacy that for many people,
01:36:14.840 | they don't get anywhere else.
01:36:17.840 | And it's a little box that's like a psychology experiment
01:36:21.800 | 'cause it feels like the angriest many humans
01:36:25.520 | in this world get is inside the car.
01:36:28.440 | It's so interesting.
01:36:29.680 | So it's such an opportunity to explore
01:36:33.000 | how we can enrich, how companies can enrich that experience.
01:36:38.000 | And also as the cars get become more and more automated,
01:36:42.200 | there's more and more opportunity.
01:36:44.200 | The variety of activities that you can do in the car
01:36:46.600 | increases, so it's super interesting.
01:36:48.680 | So I mean, on a practical sense,
01:36:52.080 | Smart Eye has been selected, at least I read,
01:36:55.200 | by 14 of the world's leading car manufacturers
01:36:58.600 | for 94 car models, so it's in a lot of cars.
01:37:02.680 | How hard is it to work with car companies?
01:37:06.760 | So they're all different.
01:37:08.540 | They all have different needs.
01:37:10.480 | The ones I've gotten a chance to interact with
01:37:12.440 | are very focused on cost.
01:37:15.880 | So it's, and anyone who's focused on cost,
01:37:20.880 | it's like, all right, do you hate fun?
01:37:22.840 | Let's just have some fun.
01:37:25.440 | Let's figure out the most fun thing we can do
01:37:27.920 | and then worry about cost later.
01:37:29.080 | But I think because the way the car industry works,
01:37:32.280 | I mean, it's a very thin margin
01:37:35.240 | that you get to operate under,
01:37:36.680 | so you have to really, really make sure
01:37:38.040 | that everything you add to the car makes sense financially.
01:37:41.220 | So anyway, is this new industry,
01:37:45.080 | especially at this scale of Smart Eye,
01:37:48.240 | does it hold any lessons for you?
01:37:51.320 | - Yeah, I think it is a very tough market to penetrate,
01:37:55.640 | but once you're in, it's awesome,
01:37:57.000 | because once you're in,
01:37:58.320 | you're designed into these car models
01:38:00.140 | for somewhere between five to seven years,
01:38:02.120 | which is awesome, and you just,
01:38:03.640 | once they're on the road,
01:38:04.620 | you just get paid a royalty fee per vehicle.
01:38:07.280 | So it's a high barrier to entry,
01:38:09.220 | but once you're in, it's amazing.
01:38:11.680 | I think the thing that I struggle the most with
01:38:14.080 | in this industry is the time to market.
01:38:16.520 | So often we're asked to lock or do a code freeze
01:38:20.880 | two years before the car is going to be on the road.
01:38:23.060 | I'm like, guys, do you understand the pace
01:38:25.560 | with which technology moves?
01:38:28.080 | So I think car companies are really trying
01:38:30.400 | to make the Tesla transition
01:38:35.200 | to become more of a software-driven architecture,
01:38:39.320 | and that's hard for many.
01:38:40.960 | It's just the cultural change.
01:38:42.320 | I mean, I'm sure you've experienced that, right?
01:38:43.920 | - Oh, definitely.
01:38:44.760 | I think one of the biggest inventions
01:38:47.800 | or imperatives created by Tesla is,
01:38:52.640 | like, to me personally,
01:38:53.800 | okay, people are gonna complain about this,
01:38:55.320 | but I know electric vehicle, I know autopilot, AI stuff.
01:38:59.840 | To me, the software, over there, software updates,
01:39:03.740 | is like the biggest revolution in cars,
01:39:06.600 | and it is extremely difficult to switch to that,
01:39:09.860 | because it is a culture shift.
01:39:12.120 | It, at first, especially if you're not comfortable with it,
01:39:15.840 | it seems dangerous.
01:39:17.180 | Like, there's an approach to cars,
01:39:19.840 | it's so safety-focused for so many decades,
01:39:23.720 | that, like, what do you mean we dynamically change code?
01:39:27.800 | The whole point is you have a thing that you test, like-
01:39:32.480 | - Right, you spend a year testing.
01:39:34.160 | - And, like, it's not reliable,
01:39:36.340 | because do you know how much it costs
01:39:38.200 | if we have to recall these cars?
01:39:40.080 | Right, and there's an understandable obsession with safety,
01:39:45.080 | but the downside of an obsession with safety
01:39:49.780 | is the same as with being obsessed with safety as a parent,
01:39:54.680 | is, like, if you do that too much,
01:39:56.720 | you limit the potential development and the flourishing
01:40:00.560 | of, in that particular aspect, the human being,
01:40:02.400 | but in this particular aspect, the software,
01:40:04.880 | the artificial neural network of it.
01:40:07.480 | But it's tough to do.
01:40:09.560 | It's really tough to do, culturally and technically.
01:40:12.320 | Like, the deployment, the mass deployment of software
01:40:14.720 | is really, really difficult.
01:40:16.000 | But I hope that's where the industry is going.
01:40:18.260 | One of the reasons I really want Tesla to succeed
01:40:20.320 | is exactly about that point.
01:40:21.520 | Not autopilot, not the electric vehicle,
01:40:23.840 | but the softwarization of basically everything,
01:40:28.520 | but cars especially.
01:40:29.880 | 'Cause to me, that's actually going to increase two things.
01:40:33.320 | Increase safety, because you can update much faster,
01:40:36.440 | but also increase the effectiveness of folks like you
01:40:40.680 | who dream about enriching the human experience with AI.
01:40:45.360 | 'Cause you can just, like, there's a feature,
01:40:48.160 | like you want a new emoji or whatever.
01:40:50.640 | Like the way TikTok releases filters,
01:40:52.340 | you can just release that for in-car stuff.
01:40:56.360 | But yeah, that's definitely...
01:40:58.480 | - One of the use cases we're looking into is,
01:41:01.680 | once you know the sentiment of the passengers in the vehicle,
01:41:05.740 | you can optimize the temperature in the car,
01:41:08.760 | you can change the lighting, right?
01:41:10.400 | So if the backseat passengers are falling asleep,
01:41:12.840 | you can dim the lights, you can lower the music, right?
01:41:15.400 | You can do all sorts of things.
01:41:17.040 | - Yeah.
01:41:18.280 | I mean, of course you could do that kind of stuff
01:41:20.040 | with a two-year delay, but it's tougher.
01:41:22.360 | Yeah.
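A toy sketch of the comfort-adjustment idea just described, assuming a single boolean "rear passengers asleep" signal from an interior camera; the function and the thresholds are illustrative, not a production policy.

```python
def adjust_cabin(rear_passengers_asleep: bool, lights: int, volume: int):
    """Toy rule: dim the lights and lower the music when rear passengers doze off.

    lights and volume are 0-100 levels; the sensing input is assumed to come
    from an interior camera, as in the use case described above.
    """
    if rear_passengers_asleep:
        lights = min(lights, 20)
        volume = min(volume, 30)
    return lights, volume


print(adjust_cabin(rear_passengers_asleep=True, lights=80, volume=70))  # (20, 30)
```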
01:41:23.180 | Do you think Tesla or Waymo or some of these companies
01:41:30.080 | that are doing semi or fully autonomous driving
01:41:33.880 | should be doing driver sensing?
01:41:36.440 | - Yes.
01:41:37.280 | - Are you thinking about that kind of stuff?
01:41:38.880 | So not just how we can enhance the in-cab experience
01:41:42.500 | for cars that are manually driven,
01:41:43.800 | but the ones that are increasingly
01:41:45.920 | more autonomously driven.
01:41:47.840 | - Yeah, so if we fast forward to the universe
01:41:50.720 | where it's fully autonomous,
01:41:51.880 | I think interior sensing becomes extremely important
01:41:54.240 | because the role of the driver isn't just to drive.
01:41:57.200 | If you think about it, the driver almost manages
01:42:00.560 | the dynamics within a vehicle.
01:42:01.880 | And so who's going to play that role
01:42:03.320 | when it's an autonomous car?
01:42:05.760 | We want a solution that is able to say,
01:42:09.080 | "Oh my God, Lex is bored to death
01:42:11.880 | 'cause the car's moving way too slow.
01:42:13.440 | Let's engage Lex."
01:42:14.360 | Or, "Rana's freaking out
01:42:15.620 | because she doesn't trust this vehicle yet.
01:42:17.920 | So let's tell Rana a little bit more information
01:42:20.520 | about the route."
01:42:21.440 | Or, right?
01:42:22.280 | So I think, or somebody's having a heart attack in the car.
01:42:25.520 | Like you need interior sensing
01:42:27.200 | in fully autonomous vehicles.
01:42:29.260 | But with semi-autonomous vehicles,
01:42:30.920 | I think it's really key to have driver monitoring
01:42:34.680 | because semi-autonomous means
01:42:36.280 | that sometimes the car is in charge,
01:42:38.720 | sometimes the driver's in charge or the co-pilot, right?
01:42:41.240 | And you need both systems to be on the same page.
01:42:44.680 | You need to know, the car needs to know
01:42:46.520 | if the driver's asleep before it transitions control
01:42:49.960 | over to the driver.
01:42:51.720 | And sometimes if the driver's too tired,
01:42:54.120 | the car can say, "I'm going to be a better driver
01:42:56.600 | than you are right now.
01:42:57.440 | I'm taking control over."
01:42:58.560 | So this dynamic, this dance is so key
01:43:01.280 | and you can't do that without driver sensing.
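Here is a minimal sketch of that handoff "dance," assuming a hypothetical 0-1 drowsiness score from a driver-monitoring model; it is not any real vehicle's arbitration logic, just the shape of the decision being described.

```python
from enum import Enum, auto


class Controller(Enum):
    CAR = auto()
    DRIVER = auto()


def arbitrate(current: Controller, driver_asleep: bool,
              drowsiness: float, handoff_requested: bool) -> Controller:
    """Decide who should be in charge, given driver-monitoring estimates.

    drowsiness is a 0-1 score from a hypothetical driver-sensing model.
    """
    # Never hand control to a driver who is asleep or clearly too tired.
    if handoff_requested and (driver_asleep or drowsiness > 0.7):
        return Controller.CAR
    # If the driver is in charge but severely impaired, the car takes over
    # (with appropriate warnings and escalation in a real system).
    if current is Controller.DRIVER and (driver_asleep or drowsiness > 0.9):
        return Controller.CAR
    if handoff_requested:
        return Controller.DRIVER
    return current


# The car wants to hand back control, but the driver is asleep: handoff refused.
print(arbitrate(Controller.CAR, driver_asleep=True, drowsiness=0.9,
                handoff_requested=True))  # Controller.CAR
```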
01:43:03.360 | - Yeah, there's a disagreement for the longest time
01:43:05.400 | I've had with Elon that this is obvious
01:43:07.560 | that this should be in the Tesla from day one.
01:43:09.960 | And it's obvious that driver sensing is not a hindrance.
01:43:14.640 | It's not obvious.
01:43:15.560 | I should be careful 'cause having studied this problem,
01:43:20.540 | nothing's really obvious.
01:43:21.880 | But it seems very likely driver sensing
01:43:24.320 | is not a hindrance to an experience.
01:43:26.440 | It's only enriching to the experience
01:43:30.280 | and likely increases the safety.
01:43:34.560 | That said, it is very surprising to me
01:43:38.760 | just having studied semi-autonomous driving,
01:43:42.160 | how well humans are able to manage that dance.
01:43:45.200 | 'Cause it was the intuition before you were doing
01:43:48.120 | that kind of thing that humans will become
01:43:51.460 | just incredibly distracted.
01:43:54.000 | They would just let the thing do its thing.
01:43:56.400 | But they're able to, 'cause it is life and death
01:43:59.200 | and they're able to manage that somehow.
01:44:00.800 | But that said, there's no reason
01:44:02.720 | not to have driver sensing on top of that.
01:44:04.720 | I feel like that's going to allow you to do that dance
01:44:09.720 | that you're currently doing without driver sensing,
01:44:12.440 | except touching the steering wheel, to do that even better.
01:44:15.960 | I mean, the possibilities are endless
01:44:17.720 | and the machine learning possibilities are endless.
01:44:19.840 | It's such a beautiful...
01:44:22.720 | It's also a constrained environment,
01:44:24.280 | so you could do it much more effectively
01:44:26.160 | than you can with the external environment.
01:44:28.240 | External environment is full of weird edge cases
01:44:31.960 | and complexities.
01:44:32.800 | There's so much, it's so fascinating,
01:44:35.080 | such a fascinating world.
01:44:36.680 | I do hope that companies like Tesla and others,
01:44:40.960 | even Waymo, which I don't even know
01:44:44.320 | if Waymo's doing anything sophisticated inside the cab.
01:44:47.000 | - I don't think so.
01:44:47.920 | - It's like, what is it?
01:44:51.440 | - I honestly think, I honestly think,
01:44:53.440 | it goes back to the robotics thing we were talking about,
01:44:56.200 | which is like great engineers
01:44:58.600 | that are building these AI systems
01:45:01.520 | just are afraid of the human being.
01:45:03.680 | - And not thinking about the human experience,
01:45:05.640 | they're thinking about the features
01:45:07.120 | and the perceptual abilities of that thing.
01:45:10.800 | - They think the best way I can serve the human
01:45:13.440 | is by doing the best perception and control I can,
01:45:17.440 | by looking at the external environment,
01:45:18.960 | keeping the human safe.
01:45:20.560 | But like, there's a huge, I'm here.
01:45:22.560 | - Right.
01:45:23.400 | - Like, you know, I need to be noticed
01:45:28.320 | and interacted with and understood
01:45:32.680 | and all those kinds of things,
01:45:33.560 | even just on a personal level for entertainment,
01:45:35.880 | honestly, for entertainment.
01:45:37.240 | - Yeah.
01:45:38.600 | You know, one of the coolest work we did in collaboration
01:45:41.240 | with MIT around this was we looked at longitudinal data,
01:45:44.400 | right, 'cause, you know, MIT had access to like tons of data
01:45:50.400 | and like just seeing the patterns of people,
01:45:54.680 | like driving in the morning off to work
01:45:56.920 | versus like commuting back from work
01:45:58.720 | or weekend driving versus weekday driving.
01:46:02.320 | And wouldn't it be so cool if your car knew that
01:46:05.080 | and then was able to optimize either the route
01:46:09.160 | or the experience or even make recommendations?
01:46:11.880 | - Yeah.
01:46:12.720 | - I think it's very powerful.
01:46:13.880 | - Yeah, like, why are you taking this route?
01:46:15.800 | You're always unhappy when you take this route
01:46:18.200 | and you're always happy when you take this alternative route.
01:46:20.360 | Take that route instead.
01:46:21.200 | - Right, exactly.
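A small illustrative example of the route idea above: given a hypothetical log of per-trip mood scores from longitudinal data, recommend the route with the best historical average. The data, route names, and scoring are made up for illustration.

```python
from collections import defaultdict

# Hypothetical trip log: (route_id, average positive-affect score for that trip).
trip_log = [
    ("highway", 0.30), ("highway", 0.25), ("highway", 0.40),
    ("river_road", 0.70), ("river_road", 0.65),
]


def recommend_route(trips):
    """Suggest the route with the best average historical mood score."""
    totals, counts = defaultdict(float), defaultdict(int)
    for route, score in trips:
        totals[route] += score
        counts[route] += 1
    return max(totals, key=lambda r: totals[r] / counts[r])


print(recommend_route(trip_log))  # river_road
```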
01:46:22.520 | - I mean, to have that, even that little step
01:46:26.000 | of relationship with a car, I think is incredible.
01:46:29.520 | Of course, you have to get the privacy right,
01:46:31.160 | you have to get all that kind of stuff right,
01:46:32.520 | but I wish I, honestly, you know,
01:46:34.800 | people are like paranoid about this,
01:46:36.760 | but I would like a smart refrigerator.
01:46:39.480 | We have such a deep connection with food
01:46:42.960 | as a human civilization.
01:46:44.800 | I would like to have a refrigerator
01:46:47.840 | that would understand me,
01:46:50.320 | that, you know, I also have a complex relationship with food
01:46:53.480 | 'cause I pig out too easily and all that kind of stuff.
01:46:56.800 | So, you know, like maybe I want the refrigerator
01:47:01.280 | to be like, are you sure about this?
01:47:02.640 | 'Cause maybe you're just feeling down or tired.
01:47:05.160 | Like maybe let's sleep on it.
01:47:06.000 | - Your vision of the smart refrigerator
01:47:08.400 | is way kinder than mine.
01:47:10.200 | - Is it just me yelling at you?
01:47:11.600 | - Yeah, no, it was just 'cause I don't,
01:47:14.320 | you know, I don't drink alcohol, I don't smoke,
01:47:18.480 | but I eat a ton of chocolate, like it's my vice.
01:47:22.080 | And so I, and sometimes I scream too,
01:47:24.560 | and I'm like, okay, my smart refrigerator
01:47:26.600 | will just lock down.
01:47:28.280 | It'll just say, dude, you've had way too many today.
01:47:30.840 | Like, done.
01:47:31.680 | - Yeah, no, but here's the thing.
01:47:34.880 | Are you, do you regret having,
01:47:37.680 | like, let's say, not the next day, but 30 days later,
01:47:44.520 | would you, what would you like the refrigerator
01:47:47.560 | to have done then?
01:47:49.040 | - Well, I think actually, like,
01:47:51.200 | the more positive relationship would be one
01:47:53.920 | where there's a conversation, right?
01:47:55.840 | As opposed to like, that's probably like
01:47:58.040 | the more sustainable relationship.
01:48:00.800 | - It's like late at night, just, no, listen, listen.
01:48:03.480 | I know I told you an hour ago that this is not a good idea,
01:48:07.320 | but just listen, things have changed.
01:48:09.760 | I can just imagine a bunch of stuff being made up
01:48:12.680 | just to convince. - Oh my God, that's hilarious.
01:48:15.160 | - But I mean, I just think that there's opportunities there.
01:48:18.800 | I mean, maybe not locking down,
01:48:20.480 | but for our systems that are such a deep part of our lives,
01:48:25.480 | like we use, we use, a lot of us,
01:48:30.920 | a lot of people that commute use their car every single day.
01:48:34.360 | A lot of us use a refrigerator every single day,
01:48:36.600 | the microwave every single day.
01:48:38.160 | Like, and we just, like, I feel like certain things
01:48:43.160 | could be made more efficient, more enriching,
01:48:47.280 | and AI is there to help, like some,
01:48:50.640 | just basic recognition of you as a human being,
01:48:53.920 | about your patterns, what makes you happy and not happy
01:48:56.400 | and all that kind of stuff.
01:48:57.440 | And the car, obviously, like--
01:48:59.240 | - Maybe we'll say, wait, wait, wait, wait, wait,
01:49:02.440 | instead of this like Ben and Jerry's ice cream,
01:49:06.520 | how about this hummus and carrots or something?
01:49:09.480 | I don't know.
01:49:10.480 | Maybe we can make a--
01:49:11.640 | - Yeah, like a reminder--
01:49:12.480 | - Just in time recommendation, right?
01:49:14.680 | - But not like a generic one,
01:49:17.360 | but a reminder that last time you chose the carrots,
01:49:20.980 | you smiled 17 times more the next day.
01:49:24.120 | - You were happier the next day, right?
01:49:25.440 | - Yeah, you were happier the next day.
01:49:27.760 | But yeah, I don't, but then again,
01:49:31.960 | if you're the kind of person that gets better
01:49:34.600 | from negative comments, you could say like,
01:49:38.440 | "Hey, remember that wedding you're going to?
01:49:41.400 | "You wanna fit into that dress?
01:49:43.640 | "Remember about that?
01:49:45.320 | "Let's think about that before you're eating this."
01:49:47.920 | Probably that would work for me,
01:49:50.720 | like a refrigerator that is just ruthless at shaming me.
01:49:54.840 | But I would, of course, welcome it.
01:49:57.240 | That would work for me, just that--
01:50:00.520 | - Well, it would, no, I think it would,
01:50:02.320 | if it's really like smart, it would optimize its nudging
01:50:05.440 | based on what works for you, right?
01:50:07.240 | - Exactly, that's the whole point, personalization.
01:50:09.640 | In every way, deep personalization.
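One simple way to "optimize its nudging based on what works for you" is a bandit-style choice over nudge styles. This is a generic sketch with hypothetical style names and counts, not a description of any shipped system.

```python
import random

# Hypothetical per-person stats: nudge style -> [times it worked, times tried].
stats = {"gentle_suggestion": [3, 10], "blunt_reminder": [6, 10]}


def pick_nudge(epsilon: float = 0.1) -> str:
    """Epsilon-greedy choice: mostly use the style with the best success rate
    for this person, occasionally explore the alternative."""
    if random.random() < epsilon:
        return random.choice(list(stats))
    return max(stats, key=lambda k: stats[k][0] / stats[k][1])


def record_outcome(style: str, worked: bool) -> None:
    """Update the running success rate after seeing whether the nudge helped."""
    stats[style][0] += int(worked)
    stats[style][1] += 1


style = pick_nudge()
record_outcome(style, worked=True)
print(style, stats)
```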
01:50:11.760 | You were a part of a webinar titled,
01:50:14.880 | "Advancing Road Safety,
01:50:16.400 | "The State of Alcohol Intoxication Research."
01:50:19.500 | So for people who don't know,
01:50:21.600 | every year, 1.3 million people around the world
01:50:24.000 | die in road crashes, and more than 20% of these fatalities
01:50:29.000 | are estimated to be alcohol-related.
01:50:31.360 | A lot of them are also distraction-related.
01:50:33.280 | So can AI help with the alcohol thing?
01:50:36.880 | - I think the answer is yes.
01:50:39.920 | There are signals, and we know that as humans,
01:50:42.680 | like we can tell when a person
01:50:44.280 | is at different phases of being drunk, right?
01:50:51.120 | And I think you can use technology to do the same.
01:50:53.640 | And again, I think the ultimate solution's gonna be
01:50:56.040 | a combination of different sensors.
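As a rough illustration of combining sensor cues, here is a toy late-fusion score. The features, weights, and output scale are assumptions for illustration only; a real system would be trained on labeled data like the racetrack study described below.

```python
def intoxication_score(gaze_instability: float, blink_duration: float,
                       lane_weave: float) -> float:
    """Toy late fusion of normalized (0-1) cues that correlate with impairment."""
    weights = {"gaze": 0.4, "blink": 0.3, "weave": 0.3}
    score = (weights["gaze"] * gaze_instability
             + weights["blink"] * blink_duration
             + weights["weave"] * lane_weave)
    # Clamp to [0, 1] so downstream logic can treat it as a probability-like score.
    return min(max(score, 0.0), 1.0)


print(intoxication_score(gaze_instability=0.8, blink_duration=0.6, lane_weave=0.7))  # 0.71
```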
01:50:58.520 | - How hard is the problem from the vision perspective?
01:51:01.480 | - I think it's non-trivial.
01:51:02.920 | I think it's non-trivial.
01:51:04.000 | And I think the biggest part is getting the data, right?
01:51:06.640 | It's like getting enough data examples.
01:51:09.120 | So for this research project,
01:51:11.080 | we partnered with the Transportation Authorities of Sweden,
01:51:15.300 | and we literally had a racetrack with a safety driver,
01:51:18.800 | and we basically progressively got people drunk.
01:51:21.640 | - Nice.
01:51:22.480 | - So, but you know,
01:51:25.840 | that's a very expensive dataset to collect,
01:51:28.760 | and you wanna collect it globally
01:51:30.240 | and in multiple conditions.
01:51:32.680 | - Yeah, the ethics of collecting a dataset
01:51:35.200 | where people are drunk is tricky.
01:51:36.640 | - Yeah, yeah, definitely.
01:51:37.880 | - Which is funny because, I mean,
01:51:40.920 | let's put drunk driving aside.
01:51:43.440 | The number of drunk people in the world every day
01:51:45.720 | is very large.
01:51:47.120 | It'd be nice to have a large dataset of drunk people
01:51:49.280 | getting progressively drunk.
01:51:50.520 | In fact, you could build an app
01:51:51.800 | where people can donate their data 'cause it's hilarious.
01:51:55.280 | - Right, actually, yeah, but the liability.
01:51:58.400 | - Liability, the ethics, the how do you get it right,
01:52:00.720 | it's tricky, it's really, really tricky.
01:52:02.480 | 'Cause like drinking is one of those things
01:52:05.080 | that's funny and hilarious and well loved,
01:52:07.600 | it's social, so on and so forth,
01:52:10.180 | but it's also the thing that hurts a lot of people,
01:52:13.480 | like a lot of people.
01:52:14.320 | Like alcohol is one of those things, it's legal,
01:52:16.360 | but it's really damaging to a lot of lives.
01:52:21.200 | It destroys lives and not just in the driving context.
01:52:26.200 | I should mention, people should listen to Andrew Huberman
01:52:28.960 | who recently talked about alcohol.
01:52:32.160 | He has an amazing podcast.
01:52:33.160 | Andrew Huberman is a neuroscientist from Stanford
01:52:36.200 | and a good friend of mine.
01:52:37.880 | And he's like a human encyclopedia
01:52:40.800 | about all health-related wisdom.
01:52:44.120 | So he has a podcast, you would love it.
01:52:45.960 | - I would love that.
01:52:46.800 | - No, no, no, no, no.
01:52:47.800 | Oh, you don't know Andrew Huberman?
01:52:49.440 | Okay, listen, you'll listen to Andrew,
01:52:52.200 | it's called Huberman Lab Podcast.
01:52:54.120 | This is your assignment, just listen to one.
01:52:56.280 | I guarantee you this will be a thing where you say,
01:52:58.480 | "Lex, this is the greatest human I have ever discovered."
01:53:03.480 | - Oh my God, 'cause I'm really on a journey
01:53:06.280 | of kind of health and wellness and I'm learning lots
01:53:09.120 | and I'm trying to build these, I guess,
01:53:11.920 | atomic habits around just being healthy.
01:53:14.280 | So yeah, I'm definitely gonna do this.
01:53:17.680 | - His whole thing, this is great.
01:53:20.420 | He's a legit scientist, like really well published.
01:53:26.480 | But in his podcast, what he does,
01:53:29.760 | he's not talking about his own work.
01:53:31.840 | He's like a human encyclopedia of papers.
01:53:34.600 | And so his whole thing is he takes a topic
01:53:37.640 | and in a very fast, you mentioned atomic habits,
01:53:40.200 | like very clear way, summarizes the research
01:53:44.200 | in a way that leads to protocols of what you should do.
01:53:47.360 | He's really big on like, not like,
01:53:49.920 | "This is what the science says,"
01:53:51.320 | but like, "This is literally what you should be doing
01:53:53.600 | "according to the science."
01:53:54.440 | So like, he's really big.
01:53:55.840 | And there's a lot of recommendations he does,
01:53:58.160 | which several of them I definitely don't do.
01:54:02.760 | Like, get sunlight as soon as possible from waking up
01:54:07.760 | and like for prolonged periods of time.
01:54:10.880 | That's a really big one and there's a lot of science
01:54:13.560 | behind that one.
01:54:14.800 | There's a bunch of stuff, very systematic.
01:54:16.880 | - You're gonna be like, "Lex, this is my new favorite person.
01:54:20.560 | "I guarantee it."
01:54:22.200 | And if you guys somehow don't know Andrew Huberman
01:54:24.720 | and you care about your wellbeing,
01:54:26.440 | you should definitely listen to him.
01:54:30.440 | Love you, Andrew.
01:54:31.280 | Anyway, so what were we talking about?
01:54:35.760 | Oh, alcohol and detecting alcohol.
01:54:39.400 | So this is a problem you care about
01:54:40.800 | and you've been trying to solve.
01:54:42.120 | - And actually like broadening it,
01:54:44.960 | I do believe that the car is gonna be a wellness center.
01:54:48.840 | Like, because again, imagine if you have a variety
01:54:52.040 | of sensors inside the vehicle tracking,
01:54:55.280 | not just your emotional state
01:54:58.560 | or level of distraction,
01:55:00.880 | drowsiness and intoxication,
01:55:03.760 | but also maybe even things like your heart rate
01:55:08.760 | and your heart rate variability and your breathing rate.
01:55:13.880 | And it can start like optimizing,
01:55:16.120 | yeah, it can optimize the ride based on what your goals are.
01:55:19.640 | So I think we're gonna start to see more of that
01:55:21.920 | and I'm excited about that.
01:55:24.520 | - Yeah, what are the challenges you're tackling
01:55:27.640 | with SmartEye currently?
01:55:28.800 | What's like the trickiest things to get?
01:55:32.200 | Is it basically convincing more and more car companies
01:55:35.720 | that having AI inside the car is a good idea
01:55:38.680 | or is there some, is there more technical
01:55:43.040 | algorithmic challenges?
01:55:44.920 | What's been keeping you mentally busy?
01:55:47.800 | - I think a lot of the car companies
01:55:49.640 | we are in conversations with are already interested
01:55:52.440 | in definitely driver monitoring.
01:55:54.120 | Like I think it's becoming a must have,
01:55:56.600 | but even interior sensing, I can see like we're engaged
01:55:59.800 | in a lot of like advanced engineering projects
01:56:01.800 | and proof of concepts.
01:56:03.040 | I think technologically though,
01:56:06.600 | and even the technology,
01:56:07.840 | I can see a path to making it happen.
01:56:10.320 | I think it's the use case.
01:56:11.920 | How does the car respond once it knows something about you?
01:56:16.120 | Because you want it to respond in a thoughtful way
01:56:18.480 | that isn't off-putting to the consumer in the car.
01:56:23.480 | So I think that's like the user experience.
01:56:25.600 | I don't think we've really nailed that.
01:56:27.400 | And we usually, that's not part,
01:56:30.440 | we're the sensing platform,
01:56:31.800 | but we usually collaborate with the car manufacturer
01:56:34.240 | to decide what the use case is.
01:56:35.840 | So say you figure out that somebody's angry while driving.
01:56:39.880 | Okay, what should the car do?
01:56:42.320 | You know?
01:56:43.600 | - Do you see yourself as a role of nudging,
01:56:47.120 | of like basically coming up with solutions essentially
01:56:51.200 | that, and then the car manufacturers
01:56:54.200 | kind of put their own little spin on it?
01:56:56.360 | - Right, like we are like the ideation,
01:57:00.320 | creative thought partner,
01:57:02.560 | but at the end of the day,
01:57:03.600 | the car company needs to decide
01:57:05.200 | what's on brand for them, right?
01:57:06.640 | Like maybe when it figures out that you're distracted
01:57:10.040 | or drowsy, it shows you a coffee cup, right?
01:57:12.600 | Or maybe it takes more aggressive behaviors
01:57:14.960 | and basically said, okay,
01:57:16.080 | if you don't like take a rest in the next five minutes,
01:57:18.320 | the car's gonna shut down, right?
01:57:19.640 | Like there's a whole range of actions the car can take
01:57:23.040 | and doing the thing that is most,
01:57:25.880 | yeah, that builds trust with the driver and the passengers.
01:57:29.400 | I think that's what we need to be very careful about.
01:57:32.480 | - Yeah, car companies are funny
01:57:35.400 | 'cause they have their own like,
01:57:37.080 | I mean, that's why people get cars still.
01:57:39.600 | I hope that changes,
01:57:40.520 | but they get it 'cause it's a certain feel and look
01:57:42.760 | and it's a certain, they become proud like Mercedes Benz
01:57:47.760 | or BMW or whatever, and that's their thing.
01:57:51.720 | That's the family brand or something like that,
01:57:54.160 | or Ford or GM, whatever, they stick to that thing.
01:57:57.600 | It's interesting.
01:57:58.680 | It's like, it should be, I don't know.
01:58:01.000 | It should be a little more about the technology inside.
01:58:05.000 | And I suppose there too, there could be a branding,
01:58:09.440 | like a very specific style of luxury or fun,
01:58:13.320 | all that kind of stuff, yeah.
01:58:16.200 | - You know, I have an AI focused fund
01:58:19.680 | to invest in early stage kind of AI driven companies.
01:58:22.680 | And one of the companies we're looking at
01:58:24.800 | is trying to do what Tesla did, but for boats,
01:58:27.480 | for recreational boats.
01:58:28.680 | Yeah, so they're building an electric
01:58:30.720 | and kind of slash autonomous boat.
01:58:34.160 | And it's kind of the same issues,
01:58:35.720 | like what kind of sensors can you put in?
01:58:38.480 | What kind of states can you detect
01:58:40.760 | both exterior and interior within the boat?
01:58:43.640 | Anyways, it's like really interesting.
01:58:45.440 | Do you boat at all?
01:58:46.720 | - No, not well, not in that way.
01:58:49.640 | I do like to get on the lake or a river and fish from a boat,
01:58:54.640 | but that's not boating.
01:58:57.280 | That's a different, still boating.
01:58:59.520 | - Low tech, a low tech. - Low tech boat.
01:59:01.880 | Get away from, get closer to nature boat.
01:59:04.480 | I guess going out to the ocean
01:59:07.160 | is also getting closer to nature in some deep sense.
01:59:12.160 | I mean, I guess that's why people love it.
01:59:14.440 | The enormity of the water just underneath you, yeah.
01:59:19.440 | - I love the water.
01:59:20.720 | - I love both.
01:59:22.880 | I love salt water.
01:59:23.800 | It was like the big and just it's humbling
01:59:25.640 | to be in front of this giant thing that's so powerful
01:59:29.280 | that was here before us and be here after.
01:59:31.640 | But I also love the peace of a small like wooded lake.
01:59:36.640 | It's just, everything's calm.
01:59:38.280 | - Therapeutic.
01:59:41.320 | - You tweeted that I'm excited
01:59:46.520 | about Amazon's acquisition of iRobot.
01:59:49.680 | I think it's a super interesting,
01:59:51.800 | just given the trajectory of which you're part of,
01:59:54.280 | of these honestly small number of companies
01:59:57.880 | that are playing in this space
01:59:59.320 | that are like trying to have an impact on human beings.
02:00:02.160 | So it is an interesting moment in time
02:00:05.000 | that Amazon would acquire iRobot.
02:00:07.000 | You tweet, I imagine a future where home robots
02:00:11.920 | are as ubiquitous as microwaves or toasters.
02:00:16.160 | Here are three reasons why I think this is exciting.
02:00:18.880 | If you remember, I can look it up.
02:00:20.520 | But why is this exciting to you?
02:00:22.320 | - I mean, I think the first reason why this is exciting,
02:00:25.800 | I can't remember the exact order in which I put them.
02:00:30.520 | But one is just, it's gonna be an incredible platform
02:00:34.400 | for understanding our behaviors within the home.
02:00:37.120 | If you think about Roomba, which is the robot vacuum cleaner,
02:00:42.960 | the flagship product of iRobot at the moment,
02:00:45.520 | it's like running around your home,
02:00:48.200 | understanding the layout,
02:00:49.360 | it's understanding what's clean and what's not,
02:00:51.240 | how often do you clean your house?
02:00:52.680 | And all of these behaviors are a piece of the puzzle
02:00:56.360 | in terms of understanding who you are as a consumer.
02:00:58.720 | And I think that could be, again,
02:01:01.320 | used in really meaningful ways,
02:01:04.720 | not just to recommend better products or whatever,
02:01:06.880 | but actually to improve your experience as a human being.
02:01:09.600 | So I think that's very interesting.
02:01:11.720 | I think the natural evolution of these robots in the home,
02:01:17.960 | so it's interesting.
02:01:19.600 | Roomba isn't really a social robot, right, at the moment.
02:01:22.720 | But I once interviewed one of the chief engineers
02:01:27.000 | on the Roomba team,
02:01:28.440 | and he talked about how people named their Roombas.
02:01:31.280 | And if their Roomba broke down,
02:01:33.360 | they would call in and say, "My Roomba broke down,"
02:01:36.560 | and the company would say,
02:01:37.520 | "Well, we'll just send you a new one."
02:01:38.840 | And, "No, no, no, Rosie, you have to like,
02:01:41.200 | "yeah, I want you to fix this particular robot."
02:01:44.400 | So people have already built
02:01:48.000 | interesting emotional connections with these home robots.
02:01:51.680 | And I think that, again, that provides a platform
02:01:54.720 | for really interesting things to just motivate change.
02:01:58.320 | Like it could help you.
02:01:59.400 | I mean, one of the companies
02:02:00.560 | that spun out of MIT, Catalia Health,
02:02:03.360 | the guy who started it spent a lot of time
02:02:06.880 | building robots that help with weight management.
02:02:09.600 | So weight management, sleep, eating better,
02:02:12.480 | yeah, all of these things.
02:02:14.280 | - Well, if I'm being honest,
02:02:16.520 | Amazon does not exactly have a track record
02:02:19.480 | of winning over people in terms of trust.
02:02:22.200 | Now, that said, it's a really difficult problem
02:02:25.880 | for a human being to let a robot in their home
02:02:28.960 | that has a camera on it.
02:02:30.440 | - Right.
02:02:31.280 | - That's really, really, really tough.
02:02:33.240 | And I think Roomba actually,
02:02:36.320 | I have to think about this,
02:02:38.040 | but I'm pretty sure now,
02:02:40.120 | or for some time already has had cameras
02:02:42.360 | because they're doing, the most recent Roomba,
02:02:46.080 | I have so many Roombas.
02:02:47.320 | - Oh, you actually do?
02:02:48.760 | - Well, I programmed it.
02:02:49.600 | I don't use a Roomba for back off.
02:02:51.200 | People that have been to my place,
02:02:52.320 | they're like, yeah, you definitely don't use these Roombas.
02:02:55.280 | - I can't tell like the valence of this comment.
02:03:00.880 | Was it a compliment or like?
02:03:02.360 | - No, it's just a bunch of electronics everywhere.
02:03:06.080 | I have six or seven computers,
02:03:08.920 | I have robots everywhere, Lego robots,
02:03:11.080 | I have small robots and big robots,
02:03:12.720 | and just giant, just piles of robot stuff.
02:03:17.040 | And yeah.
02:03:17.960 | But including the Roombas,
02:03:22.320 | they're being used for their body and intelligence,
02:03:25.480 | but not for their purpose.
02:03:26.840 | (both laughing)
02:03:28.680 | I've changed them, repurposed them for other purposes,
02:03:32.080 | for deeper, more meaningful purposes
02:03:33.880 | than just like the butter robot.
02:03:36.400 | Yeah, which just brings a lot of people happiness, I'm sure.
02:03:41.000 | They have a camera because the thing they advertised,
02:03:44.440 | I had my own cameras too,
02:03:46.400 | but the camera on the new Roomba,
02:03:49.560 | they have like state-of-the-art poop detection
02:03:52.400 | as they advertised, which is a very difficult,
02:03:54.840 | apparently it's a big problem for vacuum cleaners,
02:03:57.680 | is if they go over like dog poop,
02:03:59.640 | it just runs it over and creates a giant mess.
02:04:02.080 | So they have like,
02:04:04.440 | apparently they collected like a huge amount of data
02:04:07.240 | and different shapes and looks and whatever of poop
02:04:09.760 | and now they're able to avoid it and so on.
02:04:12.320 | They're very proud of this.
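As a toy illustration of how a detection like that could gate the robot's motion (the classifier itself is assumed here, not iRobot's actual system):

```python
def plan_step(obstacle_probability: float, threshold: float = 0.5) -> str:
    """Gate the vacuum's next move on a hypothetical classifier output.

    obstacle_probability is the model's confidence that the patch ahead
    contains something (like pet waste) that should not be driven over.
    """
    return "reroute_around" if obstacle_probability >= threshold else "continue_forward"


print(plan_step(0.92))  # reroute_around
print(plan_step(0.05))  # continue_forward
```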
02:04:14.360 | So there is a camera,
02:04:15.480 | but you don't think of it as having a camera.
02:04:18.200 | Yeah, you don't think of it as having a camera
02:04:21.880 | because you've grown to trust it, I guess,
02:04:23.960 | 'cause our phones, at least most of us,
02:04:26.360 | seem to trust this phone,
02:04:30.440 | even though there's a camera looking directly at you.
02:04:33.040 | I think that if you trust that the company
02:04:39.640 | is taking security very seriously,
02:04:41.440 | I actually don't know how that trust
02:04:42.640 | was earned with smartphones.
02:04:44.440 | I think it just started to provide a lot of positive value
02:04:48.680 | into your life where you just took it in
02:04:50.520 | and then the company over time has shown
02:04:52.360 | that it takes privacy very seriously, that kind of stuff.
02:04:55.120 | But I just, Amazon has not always,
02:04:58.600 | in its social robots, communicated
02:05:01.400 | that this is a trustworthy thing,
02:05:03.680 | both in terms of culture and competence.
02:05:06.760 | 'Cause I think privacy is not just about
02:05:08.880 | what do you intend to do,
02:05:10.640 | but also how good are you at doing that kind of thing.
02:05:14.560 | So that's a really hard problem to solve.
02:05:16.720 | - But I mean, but a lot of us have Alexas at home
02:05:19.680 | and I mean, Alexa could be listening in the whole time,
02:05:24.120 | right, and doing all sorts of nefarious things with the data.
02:05:27.320 | You know, hopefully it's not, but I don't think it is.
02:05:32.480 | - But Amazon is not, it's such a tricky thing
02:05:35.360 | for a company to get right, which is to earn the trust.
02:05:38.360 | I don't think Alexas earn people's trust quite yet.
02:05:41.600 | - Yeah, I think it's not there quite yet.
02:05:44.880 | I agree. - And they struggle
02:05:45.800 | with this kind of stuff.
02:05:46.640 | In fact, when these topics are brought up,
02:05:48.400 | people always get nervous.
02:05:50.160 | And I think if you get nervous about it,
02:05:53.640 | I mean, the way to earn people's trust
02:05:57.680 | is not by like, ooh, don't talk about this.
02:05:59.720 | (laughs)
02:06:00.560 | It's just be open, be frank, be transparent,
02:06:03.600 | and also create a culture of where it radiates
02:06:08.000 | at every level from engineer to CEO
02:06:11.160 | that like, you're good people that have a common sense idea
02:06:16.160 | of what it means to respect basic human rights
02:06:20.960 | and the privacy of people and all that kind of stuff.
02:06:23.880 | And I think that propagates throughout the,
02:06:26.500 | that's the best PR, which is like,
02:06:29.960 | over time you understand that these are good folks
02:06:33.800 | doing good things.
02:06:35.280 | Anyway, speaking of social robots,
02:06:37.900 | have you heard about Tesla, Tesla Bot, the humanoid robot?
02:06:42.960 | - Yes, I have.
02:06:43.800 | Yes, yes, yes, but I don't exactly know
02:06:46.240 | what it's designed to do.
02:06:48.320 | Do you? - Oh.
02:06:49.140 | - You probably do.
02:06:50.240 | - No, I know it's designed to do,
02:06:51.960 | but I have a different perspective on it.
02:06:53.760 | But it's designed to, it's a humanoid form,
02:06:57.540 | and it's designed to, for automation tasks
02:07:01.340 | in the same way that industrial robot arms
02:07:04.920 | automate tasks in the factory.
02:07:06.200 | So it's designed to automate tasks in the factory.
02:07:08.100 | But I think that humanoid form,
02:07:10.840 | as we were talking about before,
02:07:12.580 | is one that we connect with as human beings.
02:07:19.560 | Anything legged, honestly,
02:07:21.280 | but the humanoid form especially,
02:07:23.200 | we anthropomorphize it most intensely.
02:07:26.760 | And so the possibility, to me, it's exciting to see
02:07:30.680 | both Atlas developed by Boston Dynamics
02:07:34.520 | and anyone, including Tesla,
02:07:38.480 | trying to make humanoid robots cheaper and more effective.
02:07:42.680 | The obvious way it transforms the world
02:07:46.040 | is social robotics to me,
02:07:48.120 | versus automation of tasks in the factory.
02:07:52.980 | So yeah, I just wanted,
02:07:54.760 | in case that was something you were interested in,
02:07:56.280 | 'cause I find its application
02:07:59.520 | in social robotics super interesting.
02:08:01.560 | - We did a lot of work with Pepper.
02:08:04.480 | Pepper the Robot a while back.
02:08:06.320 | We were like the emotion engine for Pepper,
02:08:08.680 | which is SoftBank's humanoid robot.
02:08:11.360 | - And how tall is Pepper?
02:08:12.480 | It's like--
02:08:13.400 | - Yeah, like, I don't know, like five foot maybe, right?
02:08:18.400 | - Yeah.
02:08:19.280 | - Yeah, pretty big, pretty big.
02:08:21.120 | And it was designed to be at like airport lounges
02:08:25.460 | and retail stores, mostly customer service, right?
02:08:30.300 | Hotel lobbies.
02:08:34.000 | And I mean, I don't know where the state of the robot is,
02:08:36.720 | but I think it's very promising.
02:08:37.880 | I think there are a lot of applications
02:08:39.480 | where this can be helpful.
02:08:40.400 | I'm also really interested in, yeah,
02:08:43.160 | social robotics for the home, right?
02:08:45.120 | Like that can help elderly people, for example,
02:08:48.160 | transport things from one location of the home to the other,
02:08:51.600 | or even like just have your back in case something happens.
02:08:55.600 | Yeah, I don't know.
02:08:57.760 | I do think it's a very interesting space.
02:08:59.800 | It seems early though.
02:09:00.800 | Do you feel like the timing is now?
02:09:04.580 | - I, yes, 100%.
02:09:08.580 | So it always seems early until it's not, right?
02:09:11.900 | - Right, right, right, right.
02:09:12.740 | - I think the time, well,
02:09:14.860 | I definitely think that the time is now,
02:09:21.340 | like this decade for social robots.
02:09:25.740 | Whether the humanoid form is right, I don't think so.
02:09:28.580 | - Mm-hmm, mm-hmm.
02:09:29.740 | - I don't, I think the,
02:09:32.900 | like if we just look at Jibo as an example,
02:09:37.260 | I feel like most of the problem,
02:09:41.720 | the challenge, the opportunity of social connection
02:09:44.780 | between an AI system and a human being
02:09:48.300 | does not require you to also solve the problem
02:09:51.460 | of robot manipulation and bipedal mobility.
02:09:55.220 | So I think you could do that with just a screen, honestly.
02:09:59.100 | But there's something about the interface of Jibo
02:10:00.940 | and you can rotate and so on that's also compelling.
02:10:03.700 | But you get to see all these robot companies
02:10:06.340 | that fail, incredible companies like Jibo and even,
02:10:10.660 | I mean, the iRobot in some sense is a big success story
02:10:16.500 | that it was able to find--
02:10:18.620 | - Right, a niche.
02:10:20.420 | - A niche thing and focus on it.
02:10:21.780 | But in some sense, it's not a success story
02:10:23.780 | because they didn't build any other robot.
02:10:28.020 | Like any other, it didn't expand
02:10:30.300 | to all kinds of robotics.
02:10:31.500 | Like once you're in the home,
02:10:32.740 | maybe that's what happens with Amazon
02:10:34.300 | is they'll flourish into all kinds of other robots.
02:10:36.860 | But do you have a sense, by the way,
02:10:40.460 | why it's so difficult to build a robotics company?
02:10:43.700 | Like why so many companies have failed?
02:10:47.140 | - I think it's like you're building a vertical stack, right?
02:10:50.740 | Like you are building the hardware plus the software
02:10:53.220 | and you find you have to do this at a cost that makes sense.
02:10:56.020 | So I think Jibo was retailing at like, I don't know,
02:11:01.020 | like $800, like $700, $800.
02:11:03.300 | - Yeah, something like this, yeah.
02:11:04.660 | - Which for the use case, right?
02:11:07.140 | There's a dissonance there.
02:11:09.340 | It's too high.
02:11:10.820 | So I think cost is, you know,
02:11:13.420 | cost of building the whole platform in a way that is,
02:11:17.780 | yeah, that is affordable for what value it's bringing.
02:11:20.980 | I think that's the challenge.
02:11:22.460 | I think for these home robots
02:11:25.060 | that are gonna help you do stuff around the home,
02:11:28.500 | that's a challenge too, like the mobility piece of it.
02:11:33.260 | That's hard.
02:11:34.860 | - Well, one of the things I'm really excited with TeslaBot
02:11:37.740 | is the people working on it.
02:11:40.340 | And that's probably the criticism I would apply
02:11:42.900 | to some of the other folks who worked on social robots
02:11:45.900 | is the people working on TeslaBot know how to,
02:11:49.460 | they're focused on and know how to do mass manufacture
02:11:52.260 | and create a product that's super cheap.
02:11:54.060 | - Very cool. - That's the focus.
02:11:55.460 | The engineering focus is not,
02:11:57.220 | I would say that you can also criticize them for that,
02:12:00.180 | is they're not focused on the experience of the robot.
02:12:03.780 | They're focused on how to get this thing
02:12:06.860 | to do the basic stuff that the humanoid form requires
02:12:11.260 | to do it as cheap as possible.
02:12:13.380 | Then the fewest number of actuators,
02:12:15.140 | the fewest numbers of motors,
02:12:16.460 | the increasing efficiency,
02:12:18.300 | they decrease the weight, all that kind of stuff.
02:12:20.060 | - So that's really interesting.
02:12:21.620 | - I would say that Jibo and all those folks,
02:12:24.460 | they focus on the design, the experience, all of that.
02:12:27.860 | And it's secondary how to manufacture.
02:12:30.460 | No, you have to think like the TeslaBot folks
02:12:33.700 | from first principles,
02:12:35.340 | what is the fewest number of components,
02:12:38.020 | the cheapest components?
02:12:38.980 | How can I build it as much in-house as possible
02:12:41.820 | without having to consider all the complexities
02:12:45.860 | of a supply chain, all that kind of stuff.
02:12:47.940 | - Interesting.
02:12:48.780 | - 'Cause if you have to build a robotics company,
02:12:50.700 | you have to, you're not building one robot.
02:12:53.780 | You're building hopefully millions of robots
02:12:55.580 | and you have to figure out how to do that.
02:12:57.540 | Where the final thing, I mean, if it's Jibo type of robot,
02:13:01.740 | is there a reason why Jibo,
02:13:03.700 | like we're gonna have this lengthy discussion,
02:13:05.980 | is there a reason why Jibo has to be over $100?
02:13:08.820 | - It shouldn't be.
02:13:09.860 | - Right, like the basic components.
02:13:12.460 | - Components of it, right.
02:13:13.940 | - Like you could start to actually discuss like,
02:13:16.660 | okay, what is the essential thing about Jibo?
02:13:18.980 | How much, what is the cheapest way I can have a screen?
02:13:21.300 | What's the cheapest way I can have a rotating base?
02:13:24.020 | - Right. - All that kind of stuff.
02:13:24.860 | And then you get down, continuously drive down cost.
02:13:29.540 | Speaking of which, you have launched
02:13:32.940 | an extremely successful companies.
02:13:34.660 | You have helped others, you've invested in companies.
02:13:37.860 | Can you give advice on how to start a successful company?
02:13:42.860 | - I would say have a problem
02:13:46.420 | that you really, really, really wanna solve, right?
02:13:48.620 | Something that you're deeply passionate about.
02:13:51.420 | And honestly, take the first step.
02:13:55.980 | Like that's often the hardest, and don't overthink it.
02:13:59.540 | Like, you know, like this idea of a minimum viable product
02:14:02.660 | or a minimum viable version of an idea, right?
02:14:05.140 | Like, yes, you're thinking about this,
02:14:06.660 | like a humongous, like super elegant, super beautiful thing.
02:14:10.580 | What, like reduce it to the littlest thing
02:14:12.860 | you can bring to market that can solve a problem
02:14:14.860 | that can help address a pain point that somebody has.
02:14:19.860 | They often tell you, like,
02:14:22.620 | start with a customer of one, right?
02:14:24.340 | If you can solve a problem for one person,
02:14:26.660 | then there's probably-- - Yourself
02:14:28.580 | or some other person. - Right.
02:14:29.780 | - Pick a person. - Exactly.
02:14:31.580 | - It could be you. - Yeah.
02:14:32.820 | - That's actually often a good sign
02:14:34.460 | that if you enjoy a thing. - Right.
02:14:36.460 | - Enjoy a thing where you have a specific problem
02:14:38.340 | that you'd like to solve.
02:14:39.500 | That's a good end of one to focus on.
02:14:43.020 | - Right. - What else?
02:14:44.140 | What else is there to actually,
02:14:45.620 | so step one is the hardest, but how do you,
02:14:47.900 | (Roz laughs)
02:14:49.100 | there's other steps as well, right?
02:14:51.540 | - I also think, like, who you bring around the table
02:14:56.140 | early on is so key, right?
02:14:58.340 | Like being clear on what I call, like,
02:15:00.660 | your core values or your North Star.
02:15:02.460 | It might sound fluffy, but actually it's not.
02:15:04.820 | So, and Roz and I, I feel like we did that very early on.
02:15:08.900 | We sat around her kitchen table and we said,
02:15:11.460 | okay, there's so many applications of this technology.
02:15:13.660 | How are we gonna draw the line?
02:15:15.140 | How are we gonna set boundaries?
02:15:16.940 | We came up with a set of core values
02:15:19.220 | that in the hardest of times,
02:15:21.260 | we fell back on to determine how we make decisions.
02:15:25.260 | And so I feel like just getting clarity on these core,
02:15:27.620 | like for us, it was respecting people's privacy,
02:15:30.500 | only engaging with industries where it's clear opt-ins.
02:15:33.660 | So for instance, we don't do any work
02:15:35.580 | in security and surveillance.
02:15:37.100 | So things like that, just getting,
02:15:40.060 | we're very big on, you know, one of our core values
02:15:42.500 | is human connection and empathy, right?
02:15:44.620 | And that is, yes, it's an AI company,
02:15:46.540 | but it's about people.
02:15:47.940 | Well, these are all, they become encoded in how we act,
02:15:52.740 | even if you're a small, tiny team of two or three
02:15:55.460 | or whatever.
02:15:56.300 | So I think that's another piece of advice.
02:15:59.980 | - So what about finding people, hiring people?
02:16:02.700 | If you care about people as much as you do,
02:16:04.780 | like it seems like such a difficult thing
02:16:08.180 | to hire the right people.
02:16:10.940 | - I think early on as a startup,
02:16:12.500 | you want people who have,
02:16:14.540 | who share the passion and the conviction
02:16:16.340 | because it's gonna be tough.
02:16:17.900 | Like I've yet to meet a startup
02:16:21.780 | where it was just a straight line to success, right?
02:16:24.580 | Even not just startup,
02:16:25.940 | like even in everyday people's lives, right?
02:16:28.180 | So you always like run into obstacles
02:16:31.580 | and you run into naysayers.
02:16:33.060 | And so you need people who are believers,
02:16:37.700 | whether they're people on your team or even your investors,
02:16:40.660 | you need investors who are really believers
02:16:42.700 | in what you're doing
02:16:44.140 | because that means they will stick with you.
02:16:46.100 | They won't give up at the first obstacle.
02:16:49.380 | I think that's important.
02:16:50.900 | - What about raising money?
02:16:52.660 | What about finding investors?
02:16:55.340 | First of all, raising money,
02:16:58.380 | but also raising money from the right sources
02:17:01.580 | that ultimately don't hinder you,
02:17:03.540 | but help you, empower you, all that kind of stuff.
02:17:07.060 | What advice would you give there?
02:17:08.580 | You successfully raised money many times in your life.
02:17:12.300 | - Yeah, again, it's not just about the money.
02:17:15.060 | It's about finding the right investors
02:17:17.220 | who are going to be aligned in terms of what you wanna build
02:17:21.300 | and believe in your core values.
02:17:23.060 | Like for example, especially later on,
02:17:25.900 | like I, yeah, in my latest like round of funding,
02:17:29.860 | I try to bring in investors
02:17:31.420 | that really care about like the ethics of AI, right?
02:17:35.460 | And the alignment of vision and mission
02:17:39.620 | and core values is really important.
02:17:41.020 | It's like you're picking a life partner, right?
02:17:43.900 | It's the same kind of-
02:17:45.180 | - So you take it that seriously for investors?
02:17:47.460 | - Yeah, because they're gonna have to stick with you.
02:17:50.140 | - You're stuck together.
02:17:51.540 | - For a while anyway, yeah.
02:17:53.100 | (both laughing)
02:17:54.260 | Maybe not for life, but for a while, for sure.
02:17:56.860 | - For better or worse.
02:17:57.820 | I forget what the vows usually sound like.
02:17:59.940 | For better or worse?
02:18:01.620 | - Through sick.
02:18:02.780 | - Through sick and then-
02:18:03.620 | - Through something.
02:18:04.620 | - Yeah, yeah, yeah.
02:18:05.460 | (both laughing)
02:18:07.580 | - Oh boy.
02:18:09.220 | Yeah, anyway, it's romantic and deep
02:18:11.820 | and you're in it for a while.
02:18:13.820 | So it's not just about the money.
02:18:16.900 | You tweeted about going to your first
02:18:20.340 | capital camp investing get together.
02:18:22.340 | - Oh yeah.
02:18:23.180 | - And then you learned a lot.
02:18:24.020 | So this is about investing.
02:18:27.780 | So what have you learned from that?
02:18:29.900 | What have you learned about investing in general?
02:18:32.740 | From both, because you've been on both ends of it.
02:18:35.340 | - I mean, I try to use my experience as an operator
02:18:38.260 | now with my investor hat on
02:18:40.780 | when I'm identifying companies to invest in.
02:18:44.060 | First of all, I think the good news
02:18:47.020 | is because I have a technology background, right?
02:18:49.100 | And I really understand machine learning
02:18:51.220 | and computer vision and AI, et cetera.
02:18:53.340 | I can apply that level of understanding, right?
02:18:56.580 | 'Cause everybody says they're an AI company
02:18:58.620 | or they're an AI tech.
02:18:59.780 | And I'm like, no, no, no, no, no.
02:19:01.940 | Show me the technology.
02:19:02.900 | So I can do that level of diligence, which I actually love.
02:19:05.860 | And then I have to do the litmus test of,
02:19:10.700 | if I'm in a conversation with you,
02:19:12.180 | am I excited to tell you about this new company
02:19:14.340 | that I just met, right?
02:19:15.580 | And if I'm an ambassador for that company
02:19:20.380 | and I'm passionate about what they're doing,
02:19:22.140 | I usually use that.
02:19:24.900 | Yeah, that's important to me when I'm investing.
02:19:28.260 | - So that means you actually can explain what they're doing
02:19:32.420 | and you're excited about it.
02:19:34.340 | - Exactly, exactly.
02:19:36.140 | Thank you for putting it so succinctly.
02:19:39.940 | I was rambling, but exactly, that's it.
02:19:41.700 | I understand it and I'm excited about it.
02:19:43.900 | - It's funny, but sometimes it's unclear exactly.
02:19:46.740 | I'll hear people tell me,
02:19:50.420 | they'll talk for a while and it sounds cool.
02:19:52.860 | Like they paint a picture of a world,
02:19:54.580 | but then when you try to summarize it,
02:19:56.500 | you're not exactly clear of what,
02:19:59.260 | maybe what the core powerful idea is.
02:20:03.460 | You can't just build another Facebook
02:20:05.300 | or there has to be a core simple to explain idea
02:20:10.300 | that then you can or can't get excited about,
02:20:15.780 | but it's there, it's sitting right there.
02:20:17.820 | How do you ultimately pick who you think will be successful?
02:20:25.540 | It's not just about the thing you're excited about,
02:20:27.980 | like there's other stuff.
02:20:29.540 | - Right, and then there's all the,
02:20:31.900 | with early stage companies, like pre-seed companies,
02:20:34.500 | which is where I'm investing,
02:20:36.180 | sometimes the business model isn't clear yet
02:20:40.420 | or the go-to-market strategy isn't clear.
02:20:42.260 | There's usually, like it's very early on
02:20:43.900 | that some of these things haven't been hashed out,
02:20:46.500 | which is okay.
02:20:47.820 | So the way I like to think about it is like,
02:20:49.500 | if this company's successful,
02:20:51.060 | will this be a multi-billion/trillion dollar market?
02:20:55.580 | Or company.
02:20:56.420 | And so that's definitely a lens that I use.
02:21:00.460 | - What's pre-seed?
02:21:02.020 | What are the different stages?
02:21:03.900 | And what's the most exciting stage?
02:21:05.660 | What's interesting about every stage, I guess?
02:21:09.580 | - Yeah, so pre-seed is usually when you're just starting out,
02:21:13.980 | you've maybe raised the friends and family round,
02:21:16.500 | so you've raised some money from people you know,
02:21:18.500 | and you're getting ready
02:21:19.380 | to take your first institutional check,
02:21:22.540 | like first check from an investor.
02:21:25.100 | And I love the stage.
02:21:28.820 | There's a lot of uncertainty.
02:21:30.700 | Some investors really don't like the stage
02:21:32.540 | because the financial models aren't there.
02:21:36.460 | Often the teams aren't even like formed,
02:21:38.740 | it's really, really early.
02:21:40.220 | But to me, it's like a magical stage
02:21:45.860 | because it's the time when there's so much conviction,
02:21:48.420 | so much belief, almost delusional, right?
02:21:51.660 | And there's a little bit of naivete around
02:21:53.980 | with founders at the stage, and I just love it.
02:21:57.900 | It's contagious.
02:21:59.020 | And I love that I can,
02:22:02.660 | often they're first-time founders, not always,
02:22:06.020 | but often they're first-time founders,
02:22:07.660 | and I can share my experience as a founder myself,
02:22:10.140 | and I can empathize, right?
02:22:12.580 | And I can almost, I create a safe ground where,
02:22:17.180 | 'cause you have to be careful
02:22:18.620 | what you tell your investors, right?
02:22:21.180 | And I will often say,
02:22:23.140 | I've been in your shoes as a founder,
02:22:24.780 | you can tell me if it's challenging,
02:22:26.780 | you can tell me what you're struggling with.
02:22:28.340 | It's okay to vent.
02:22:30.140 | So I create that safe ground,
02:22:31.940 | and I think that's a superpower.
02:22:35.300 | - Yeah, you have to, I guess,
02:22:37.980 | you have to figure out if this kind of person
02:22:39.860 | is gonna be able to ride the roller coaster
02:22:42.020 | like of many pivots and challenges
02:22:46.980 | and all that kind of stuff.
02:22:48.100 | And if the space of ideas they're working in
02:22:50.860 | is interesting, like the way they think about the world.
02:22:54.180 | Yeah, 'cause if it's successful,
02:22:57.660 | the thing they end up with might be very different,
02:22:59.700 | from what they started out to do.
02:23:01.500 | - Actually, I was gonna say the third,
02:23:05.140 | so the technology is one aspect,
02:23:07.340 | the market or the idea, right, is the second,
02:23:09.940 | and the third is the founder, right?
02:23:11.260 | Is this somebody who I believe has conviction,
02:23:13.940 | is a hustler, is gonna overcome obstacles.
02:23:20.660 | Yeah, I think that it's gonna be a great leader, right?
02:23:23.140 | Like as a startup, as a founder,
02:23:25.740 | you're often, you are the first person,
02:23:27.660 | and your role is to bring amazing people around you
02:23:31.060 | to build this thing.
02:23:32.780 | And so you're an evangelist, right?
02:23:36.140 | So how good are you gonna be at that?
02:23:38.020 | So I try to evaluate that too.
02:23:40.260 | - You also, in the tweet thread about it,
02:23:43.740 | mentioned, is this a known concept?
02:23:45.780 | Random rich dudes, RRDs.
02:23:48.940 | Saying that there should be like random rich women, I guess.
02:23:53.180 | What's the dudes version of women,
02:23:56.500 | the women version of dudes?
02:23:57.820 | Ladies, I don't know.
02:23:59.380 | What's, is this a technical term?
02:24:01.500 | Is this known?
02:24:02.740 | Random rich dudes?
02:24:03.580 | - Well, I didn't make that up,
02:24:04.540 | but I was at this capital camp,
02:24:07.100 | which is a get together for investors of all types,
02:24:11.260 | and there must have been maybe 400 or so attendees,
02:24:17.380 | maybe 20 were women.
02:24:19.060 | It was just very disproportionately
02:24:21.480 | male-dominated, which I'm used to.
02:24:25.700 | - I think you're used to this kind of thing.
02:24:27.020 | - I'm used to it, but it's still surprising.
02:24:29.700 | And as I'm raising money for this fund,
02:24:31.820 | so my fund partner is a guy called Rob May,
02:24:36.140 | who's done this before.
02:24:37.580 | So I'm new to the investing world,
02:24:39.500 | but he's done this before.
02:24:41.620 | Most of our investors in the fund are these,
02:24:45.180 | I mean, awesome, I'm super grateful to them,
02:24:47.420 | random, just rich guys.
02:24:48.860 | I'm like, where are the rich women?
02:24:50.340 | So I'm really adamant in both investing
02:24:52.820 | in women-led AI companies,
02:24:56.140 | but I also would love to have women investors
02:24:58.900 | be part of my fund,
02:25:00.740 | because I think that's how we drive change.
02:25:03.780 | - Yeah, so then that takes time, of course,
02:25:06.640 | but there's been quite a lot of progress,
02:25:08.560 | but yeah, for the next Mark Zuckerberg to be a woman
02:25:11.900 | and all that kind of stuff,
02:25:13.500 | that's just a huge amount of wealth generated by women,
02:25:17.980 | and then controlled by women,
02:25:19.180 | then allocated by women, and all that kind of stuff.
02:25:22.060 | And then beyond just women,
02:25:23.540 | just broadly across all different measures
02:25:25.900 | of diversity and so on.
02:25:27.320 | Let me ask you to put on your wise sage hat.
02:25:35.500 | So you already gave advice on startups,
02:25:38.340 | and just advice for women,
02:25:43.340 | but in general, advice for folks in high school
02:25:46.980 | or college today, how to have a career they can be proud of,
02:25:50.700 | how to have a life they can be proud of.
02:25:53.860 | I suppose you have to give this kind of advice to your kids.
02:25:58.060 | - Kids, yeah.
02:25:59.140 | Well, here's the number one advice that I give to my kids.
02:26:03.340 | My daughter's now 19, by the way,
02:26:04.980 | and my son's 13 and a half,
02:26:06.380 | so they're not little kids anymore.
02:26:08.940 | - Does it break your heart?
02:26:11.500 | - It does.
02:26:12.700 | Like a girl, but they're awesome.
02:26:14.260 | They're my best friends.
02:26:15.460 | Yeah, I think the number one advice I would share
02:26:19.020 | is embark on a journey without attaching to outcomes,
02:26:22.380 | and enjoy the journey, right?
02:26:25.100 | So we're often so obsessed with the end goal,
02:26:29.560 | A, that doesn't allow us to be open
02:26:33.860 | to different endings of a journey,
02:26:36.100 | or a story.
02:26:37.020 | So you become so fixated on a particular path.
02:26:41.780 | You don't see the beauty in the other alternative path.
02:26:44.660 | And then you forget to enjoy the journey
02:26:48.620 | because you're just so fixated on the goal.
02:26:50.580 | And I've been guilty of that for many, many years in my life.
02:26:54.100 | And I'm now trying to make the shift of,
02:26:57.860 | no, no, no, I'm gonna, again,
02:27:00.300 | trust that things are gonna work out,
02:27:02.460 | and it'll be amazing, and maybe even exceed your dreams.
02:27:05.540 | We have to be open to that.
02:27:07.300 | - Yeah, taking a leap into all kinds of things.
02:27:09.860 | I think you tweeted like you went on vacation by yourself
02:27:12.860 | or something like this.
02:27:13.900 | - I know.
02:27:14.740 | - And just going, just taking the leap, doing it.
02:27:19.380 | - Totally, doing it.
02:27:20.220 | - And enjoying it, enjoying the moment,
02:27:22.300 | enjoying the weeks, enjoying not looking at
02:27:25.020 | some kind of career ladder next step and so on.
02:27:29.540 | Yeah, there's something to that, like over planning too.
02:27:34.380 | I'm surrounded by a lot of people that kinda,
02:27:36.380 | so I don't plan.
02:27:37.740 | - You don't?
02:27:38.580 | - No.
02:27:39.420 | - Do you not do goal setting?
02:27:40.980 | - My goal setting is very like,
02:27:46.620 | I like the affirmations, it's very,
02:27:49.420 | it's almost, I don't know how to put it into words,
02:27:53.520 | but it's a little bit like
02:27:55.700 | what my heart yearns for, kind of.
02:28:01.580 | And I guess in the space of emotions,
02:28:03.620 | more than in the space of like,
02:28:06.020 | this will be, like in the rational space.
02:28:09.500 | 'Cause I just try to picture a world
02:28:13.620 | that I would like to be in,
02:28:15.780 | and that world is not clearly pictured,
02:28:17.580 | it's mostly in the emotional world.
02:28:19.380 | I mean, I think about that from robots,
02:28:22.060 | 'cause I have this desire, I've had it my whole life to,
02:28:27.060 | well, it took different shapes,
02:28:29.660 | but I think once I discovered AI,
02:28:32.380 | the desire was to,
02:28:34.020 | I think in the context of this conversation
02:28:38.220 | could be most easily described
02:28:40.520 | as basically a social robotics company.
02:28:42.940 | And that's something I dreamed of doing.
02:28:45.600 | And, well, there's a lot of complexity to that story,
02:28:52.060 | but that's the only thing, honestly, I dream of doing.
02:28:55.620 | So I imagine a world that I could help create,
02:29:01.580 | but it's not, there's no steps along the way.
02:29:05.660 | And I think I'm just kind of stumbling around
02:29:09.460 | and following happiness and working my ass off
02:29:13.260 | in almost random, like an ant does in random directions.
02:29:17.340 | But a lot of people, a lot of successful people around me
02:29:19.420 | say this, you should have a plan, you should have a clear goal.
02:29:21.540 | You have a goal at the end of the month,
02:29:22.700 | you have a goal at the end of the year.
02:29:24.300 | I don't, I don't, I don't.
02:29:26.580 | And there's a balance to be struck, of course,
02:29:31.020 | but there's something to be said about really making sure
02:29:36.020 | that you're living life to the fullest,
02:29:40.220 | that goals can actually get in the way of.
02:29:42.620 | - So one of the best, like kind of most,
02:29:47.300 | what do you call it when it's like challenges your brain,
02:29:52.500 | what do you call it?
02:29:53.500 | - The only thing that comes to mind,
02:29:58.100 | and this is me saying it, is the mind fuck, but yes.
02:30:00.180 | - Okay, okay, maybe, okay, something like that.
02:30:03.020 | - Yes.
02:30:03.940 | - Super inspiring talk, Kenneth Stanley,
02:30:06.620 | he was at OpenAI, he just left,
02:30:09.340 | and he has a book called "Why Greatness Cannot Be Planned."
02:30:12.460 | And it's actually an AI book.
02:30:14.100 | So, and he's done all these experiments
02:30:16.100 | that basically show that when you over optimize,
02:30:19.220 | like the trade-off is you're less creative, right?
02:30:23.700 | And to create true greatness
02:30:26.700 | and truly creative solutions to problems,
02:30:29.620 | you can't over plan it, you can't.
02:30:31.740 | And I thought that was,
02:30:33.100 | and so he generalizes it beyond AI,
02:30:35.540 | and he talks about how we apply that in our personal life
02:30:38.220 | and our organizations and our companies,
02:30:40.380 | which are over-KPI'd, right?
02:30:42.340 | Like look at any company in the world,
02:30:43.900 | and it's all like, these are the goals,
02:30:45.300 | these are the weekly goals, and the sprints,
02:30:48.940 | and then the quarterly goals, blah, blah, blah.
02:30:50.780 | And he just shows with a lot of his AI experiments
02:30:55.260 | that that's not how you create truly game-changing ideas.
02:30:59.540 | So there you go.
02:31:00.540 | - Yeah, yeah.
02:31:01.380 | - You should interview Kenneth, he's awesome.
02:31:02.940 | - Yeah, there's a balance, of course.
02:31:04.620 | 'Cause that's, yeah, many moments of genius
02:31:07.780 | will not come from planning and goals,
02:31:09.780 | but you still have to build factories,
02:31:12.900 | and you still have to manufacture,
02:31:14.260 | and you still have to deliver,
02:31:15.340 | and there's still deadlines and all that kind of stuff.
02:31:17.260 | And for that, it's good to have goals.
02:31:19.340 | - I do goal setting with my kids, we all have our goals.
02:31:22.660 | But I think we're starting to morph
02:31:25.700 | into more of these bigger picture goals,
02:31:27.820 | and not obsess about, I don't know, it's hard.
02:31:31.340 | - Well, I honestly think, especially with kids,
02:31:33.460 | it's much better to have a plan and have goals and so on,
02:31:36.260 | 'cause you have to learn the muscle
02:31:38.100 | of what it feels like to get stuff done.
02:31:40.900 | But I think once you learn that, there's flexibility for me.
02:31:43.940 | 'Cause I spent most of my life with goal setting and so on.
02:31:47.980 | So I've gotten good with grades and school.
02:31:50.580 | I mean, school, if you wanna be successful at school,
02:31:53.900 | I mean, the kind of stuff in high school and college
02:31:56.180 | that kids have to do, in terms of managing their time
02:31:59.140 | and getting so much stuff done.
02:32:01.100 | It's like, taking five, six, seven classes in college,
02:32:05.620 | they're like, that would break the spirit of most humans
02:32:09.380 | if they took one of them later in life.
02:32:12.420 | It's like really difficult stuff,
02:32:14.820 | especially in engineering curricula.
02:32:16.500 | So I think you have to learn that skill,
02:32:19.980 | but once you learn it, you can maybe,
02:32:21.900 | 'cause you can be a little bit on autopilot
02:32:24.540 | and use that momentum,
02:32:25.700 | and then allow yourself to be lost in the flow of life.
02:32:29.100 | Just kinda, or also give,
02:32:34.100 | I worked pretty hard to allow myself
02:32:38.540 | to have the freedom to do that.
02:32:39.860 | That's really, that's a tricky freedom to have.
02:32:42.900 | Because a lot of people get lost in the rat race,
02:32:45.300 | and they also, like financially,
02:32:49.660 | they, whenever you get a raise,
02:32:52.820 | they'll get like a bigger house.
02:32:54.100 | - Right, right, right.
02:32:54.940 | - Or something like this.
02:32:55.780 | I put very, so like, you're always trapped in this race.
02:32:58.620 | I put a lot of emphasis on living below my means always.
02:33:03.620 | And so there's a lot of freedom to do whatever,
02:33:08.260 | whatever the heart desires.
02:33:11.380 | That's a really, but everyone has to decide
02:33:13.220 | what's the right thing, what's the right thing for them.
02:33:15.540 | For some people, having a lot of responsibilities,
02:33:18.760 | like a house they can barely afford,
02:33:20.860 | or having a lot of kids, the responsibility side of that,
02:33:24.500 | really helps them get their shit together.
02:33:28.020 | Like, all right, I need to be really focused and good.
02:33:30.980 | Some of the most successful people I know have kids,
02:33:33.100 | and the kids bring out the best in them.
02:33:34.700 | They make them more productive, not less productive.
02:33:36.540 | - Accountability, it's an accountability thing, absolutely.
02:33:39.260 | - And almost something to actually live and fight
02:33:42.060 | and work for, like having a family.
02:33:44.900 | It's fascinating to see.
02:33:46.560 | 'Cause you would think kids would be a hit on productivity,
02:33:49.400 | but they're not, for a lot of really successful people.
02:33:52.200 | They really, they're like an engine of--
02:33:54.920 | - Right, efficiency, oh my God.
02:33:56.120 | - Yeah, it's weird.
02:33:57.840 | I mean, it's beautiful, it's beautiful to see.
02:33:59.600 | And also social happiness.
02:34:01.960 | Speaking of which, what role do you think love plays
02:34:06.960 | in the human condition, love?
02:34:09.100 | - I think love is,
02:34:15.260 | yeah, I think it's why we're all here.
02:34:19.920 | I think it would be very hard to live life
02:34:21.720 | without love in any of its forms, right?
02:34:26.160 | - Yeah, that's the most beautiful of forms
02:34:31.040 | that human connection takes, right?
02:34:35.020 | - Yeah, I feel like everybody wants to feel loved, right?
02:34:40.020 | In one way or another, right?
02:34:42.080 | - And to love.
02:34:42.920 | - Yeah, and to love too.
02:34:44.320 | - Totally, yeah, I agree with that.
02:34:46.280 | - Both of it.
02:34:47.120 | I'm not even sure what feels better.
02:34:48.860 | Both, both are like that.
02:34:52.520 | - To give love too, yeah.
02:34:54.400 | - And it is like we've been talking about,
02:34:56.920 | an interesting question, whether some of that,
02:34:59.480 | whether one day we'll be able to love a toaster.
02:35:02.340 | Get some small--
02:35:05.280 | - I wasn't quite thinking about that when I said--
02:35:07.520 | - The toaster.
02:35:08.360 | - Yeah, like we all need love and give love.
02:35:10.960 | Okay, you're right.
02:35:11.800 | - I was thinking about Brad Pitt and toasters.
02:35:12.640 | - Brad Pitt and toasters, great.
02:35:14.400 | - All right, well, I think we started on love
02:35:18.440 | and ended on love.
02:35:22.160 | This was an incredible conversation, Rana,
02:35:22.160 | thank you so much.
02:35:23.360 | You're an incredible person.
02:35:24.760 | Thank you for everything you're doing in AI,
02:35:28.080 | in the space of just caring about humanity,
02:35:32.640 | human emotion, about love,
02:35:34.680 | and being an inspiration to a huge number of people
02:35:38.200 | in robotics, in AI, in science, in the world in general.
02:35:42.240 | So thank you for talking to me, it's an honor.
02:35:44.200 | - Thank you for having me,
02:35:45.400 | and you know I'm a big fan of yours as well,
02:35:47.240 | so it's been a pleasure.
02:35:48.680 | - Thanks for listening to this conversation
02:35:52.800 | with Rana el Kaliouby.
02:35:52.800 | To support this podcast,
02:35:53.960 | please check out our sponsors in the description.
02:35:56.840 | And now, let me leave you with some words
02:35:58.720 | from Helen Keller.
02:35:59.800 | The best and most beautiful things in the world
02:36:03.560 | cannot be seen or even touched.
02:36:06.120 | They must be felt with the heart.
02:36:09.320 | Thank you for listening, and hope to see you next time.
02:36:12.400 | (upbeat music)
02:36:14.980 | (upbeat music)