Kate Darling: Social Robots, Ethics, Privacy and the Future of MIT | Lex Fridman Podcast #329


Chapters

0:00 Introduction
1:46 What is a robot?
17:47 Metaverse
27:09 Bias in robots
40:51 Modern robotics
43:24 Automation
47:46 Autonomous driving
56:11 Privacy
59:37 Google's LaMDA
1:04:24 Robot animal analogy
1:17:28 Data concerns
1:35:30 Humanoid robots
1:54:31 LuLaRobot
2:03:25 Ethics in robotics
2:18:46 Jeffrey Epstein
2:52:21 Love and relationships

Whisper Transcript

00:00:00.000 | - I think that animals are a really great thought experiment
00:00:04.880 | when we're thinking about AI and robotics,
00:00:06.320 | because again, this comparing them to humans
00:00:09.280 | that leads us down the wrong path,
00:00:10.960 | both because it's not accurate,
00:00:12.440 | but also I think for the future, we don't want that.
00:00:16.120 | We want something that's a supplement.
00:00:17.880 | But I think animals,
00:00:18.760 | because we've used them throughout history
00:00:21.120 | for so many different things,
00:00:22.320 | we domesticated them not because they do what we do,
00:00:25.020 | but because what they do is different and that's useful.
00:00:28.320 | And it's just like,
00:00:30.440 | whether we're talking about companionship,
00:00:32.320 | whether we're talking about work integration,
00:00:34.920 | whether we're talking about responsibility for harm,
00:00:37.240 | there's just so many things we can draw on in that history
00:00:40.600 | from these entities that can sense, think,
00:00:43.400 | make autonomous decisions and learn
00:00:45.720 | that are applicable to how we should be thinking
00:00:48.520 | about robots and AI.
00:00:49.720 | - The following is a conversation with Kate Darling,
00:00:54.520 | her second time on the podcast.
00:00:56.440 | She's a research scientist at MIT Media Lab,
00:00:59.280 | interested in human robot interaction and robot ethics,
00:01:03.800 | which she writes about in her recent book
00:01:06.600 | called "The New Breed",
00:01:08.320 | what our history with animals reveals
00:01:10.560 | about our future with robots.
00:01:12.500 | Kate is one of my favorite people at MIT.
00:01:15.040 | She was a courageous voice of reason and compassion
00:01:18.120 | during the time of the Jeffrey Epstein scandal
00:01:20.280 | at MIT three years ago.
00:01:22.480 | We reflect on this time in this very conversation,
00:01:26.520 | including the lessons it revealed about human nature
00:01:29.640 | and our optimistic vision for the future of MIT,
00:01:33.560 | a university we both love and believe in.
00:01:36.920 | This is the Lex Fridman Podcast.
00:01:38.840 | To support it,
00:01:39.800 | please check out our sponsors in the description.
00:01:42.120 | And now, dear friends, here's Kate Darling.
00:01:45.900 | Last time we talked a few years back,
00:01:49.120 | you wore Justin Bieber's shirt for the podcast.
00:01:51.920 | So now looking back, you're a respected researcher,
00:01:56.120 | all the amazing accomplishments in robotics,
00:01:59.240 | you're an author.
00:02:00.640 | Was this one of the proudest moments of your life?
00:02:03.240 | Proudest decisions you've ever made?
00:02:05.200 | - Definitely.
00:02:06.460 | You handled it really well, though.
00:02:07.920 | It was cool, 'cause I walked in,
00:02:09.120 | I didn't know you were gonna be filming.
00:02:10.920 | I walked in and you're in a fucking suit.
00:02:13.440 | - Yeah.
00:02:14.280 | - And I'm like, why are you all dressed up?
00:02:16.000 | - Yeah.
00:02:16.960 | - And then you were so nice about it.
00:02:18.220 | You like made some excuses.
00:02:19.060 | You were like, oh, well, I'm interviewing some,
00:02:21.120 | didn't you say you were interviewing
00:02:22.320 | some military general afterwards to like--
00:02:24.680 | - Oh yeah, that was--
00:02:25.520 | - Make me feel better?
00:02:26.340 | - CTO of Lockheed Martin, I think.
00:02:27.760 | - Oh, that's what it was.
00:02:29.080 | - Yeah.
00:02:29.920 | - You didn't tell me, oh, I was dressed like this.
00:02:32.560 | - Are you an actual Bieber fan?
00:02:33.880 | Or was that like one of those T-shirts
00:02:36.400 | that's in the back of the closet
00:02:39.040 | that you use for painting?
00:02:40.480 | - I think I bought it for my husband as a joke.
00:02:42.920 | And yeah, we were gut renovating a house at the time
00:02:46.720 | and I had worn it to the site.
00:02:48.840 | - Got it as a joke and now you wear it.
00:02:50.560 | Okay, have you worn it since?
00:02:53.200 | Is this the one time?
00:02:54.920 | - No, like how could I touch it again?
00:02:57.200 | It was on your podcast.
00:02:58.200 | Now it's framed.
00:02:59.400 | - It's like a wedding dress or something like that.
00:03:01.160 | You don't, you only wear it once.
00:03:03.500 | You are the author of "The New Breed,
00:03:07.160 | What Our History With Animals Reveals
00:03:09.060 | About Our Future With Robots."
00:03:11.240 | You opened the book with the surprisingly tricky question,
00:03:14.680 | what is a robot?
00:03:15.740 | So let me ask you, let's try to sneak up to this question.
00:03:19.120 | What's a robot?
00:03:20.720 | That's not really sneaking up.
00:03:22.800 | - It's just asking it.
00:03:23.960 | - Yeah.
00:03:25.000 | - All right, well.
00:03:25.840 | (laughing)
00:03:27.560 | - What do you think a robot is?
00:03:29.400 | - What I think a robot is,
00:03:32.120 | is something that has some level of intelligence
00:03:36.240 | and some level of magic.
00:03:40.820 | That little shine in the eye, you know,
00:03:44.040 | that allows you to navigate the uncertainty of life.
00:03:50.060 | So that means, like autonomous vehicles to me in that sense,
00:03:53.600 | are robots because they navigate the uncertainty,
00:03:58.480 | the complexity of life.
00:04:00.560 | Obviously social robots are that.
00:04:02.740 | - I love that.
00:04:04.080 | I like that you mentioned magic 'cause that also,
00:04:07.120 | well, so first of all,
00:04:08.080 | I don't define robot definitively in the book
00:04:10.840 | because there is no definition.
00:04:13.440 | That everyone agrees on.
00:04:15.300 | And if you look back through time,
00:04:18.400 | people have called things robots until they lose the magic
00:04:21.860 | because they're more ubiquitous.
00:04:23.340 | Like a vending machine used to be called a robot
00:04:25.540 | and now it's not, right?
00:04:26.620 | So I do agree with you that there's this magic aspect
00:04:30.740 | which is how people understand robots.
00:04:35.780 | If you ask a roboticist,
00:04:37.260 | they have the definition of something that is,
00:04:39.660 | well, it has to be physical usually.
00:04:41.220 | It's not an AI agent.
00:04:43.260 | It has to be embodied.
00:04:44.680 | They'll say it has to be able to sense its environment
00:04:48.260 | in some way.
00:04:49.100 | It has to be able to make a decision autonomously
00:04:52.460 | and then act on its environment again.
00:04:55.260 | I think that's a pretty good technical definition
00:04:58.500 | even though it really breaks down
00:04:59.960 | when you come to things like the smartphone
00:05:01.860 | because the smartphone can do all of those things.
00:05:04.500 | And most roboticists would not call it a robot.
00:05:06.520 | So there's really no one good definition.
00:05:09.760 | But part of why I wrote the book is because people have
00:05:14.760 | a definition of robot in their minds
00:05:17.900 | that is usually very focused on a comparison
00:05:21.940 | of robots to humans.
00:05:23.340 | So if you Google image search robot,
00:05:24.980 | you get a bunch of humanoid robots,
00:05:27.100 | robots with a torso and head and two arms and two legs.
00:05:30.900 | And that's the definition of robot
00:05:33.900 | that I'm trying to get us away from
00:05:35.560 | because I think that it trips us up a lot.
00:05:37.980 | - Why does the humanoid form trip us up a lot?
00:05:40.740 | - Well, because this constant comparison of robots to people,
00:05:45.060 | artificial intelligence to human intelligence,
00:05:47.900 | first of all, it doesn't make sense
00:05:48.960 | from a technical perspective
00:05:50.140 | because the early AI researchers,
00:05:54.100 | some of them were trying to recreate human intelligence.
00:05:56.820 | Some people still are,
00:05:57.760 | and there's a lot to be learned from that academically,
00:06:00.140 | et cetera, but that's not where we've ended up.
00:06:03.260 | AI doesn't think like people.
00:06:05.500 | We wind up in this fallacy where we're comparing these two.
00:06:09.720 | And when we talk about what intelligence even is,
00:06:13.820 | we're often comparing to our own intelligence.
00:06:15.940 | And then the second reason this bothers me
00:06:19.300 | is because it doesn't make sense.
00:06:21.500 | I just think it's boring to recreate intelligence
00:06:25.380 | that we already have.
00:06:26.860 | I see the scientific value
00:06:28.140 | of understanding our own intelligence,
00:06:30.180 | but from a practical,
00:06:32.340 | what could we use these technologies for perspective,
00:06:35.740 | it's much more interesting to create something new,
00:06:38.540 | to create a skillset that we don't have
00:06:40.780 | that we can partner with
00:06:41.940 | in what we're trying to achieve.
00:06:43.640 | - And it should be in some deep way similar to us,
00:06:48.820 | but in most ways different
00:06:51.320 | because you still want to have a connection,
00:06:53.420 | which is why the similarity might be necessary.
00:06:56.700 | - That's what people argue, yes.
00:06:58.460 | And I think that's true.
00:06:59.500 | So the two arguments for humanoid robots
00:07:02.000 | are people need to be able to communicate
00:07:05.380 | and relate to robots,
00:07:06.820 | and we relate most to things that are like ourselves.
00:07:10.480 | And we have a world that's built for humans.
00:07:12.500 | So we have stairs and narrow passageways and door handles.
00:07:15.580 | And so we need humanoid robots to be able to navigate that.
00:07:18.780 | And so you're speaking to the first one,
00:07:21.940 | which is absolutely true.
00:07:22.880 | But what we know from social robotics
00:07:24.540 | and a lot of human robot interaction research
00:07:26.540 | is that all you need is something that's enough
00:07:30.220 | like a person for it to give off cues
00:07:34.620 | that someone relates to,
00:07:35.940 | but that doesn't have to look human or even act human.
00:07:39.740 | You can take a robot like R2-D2
00:07:41.840 | and it just beeps and boops.
00:07:44.320 | And people love R2-D2, right?
00:07:45.800 | Even though it's just like a trash can on wheels.
00:07:47.960 | And they like R2-D2 more than C-3PO, who's a humanoid.
00:07:50.880 | So there's lots of ways to make robots
00:07:55.240 | even better than humans in some ways
00:07:57.960 | and make us relate more to them.
00:08:00.320 | - Yeah, it's kind of amazing the variety of cues
00:08:03.160 | that can be used to anthropomorphize the thing,
00:08:06.080 | like a glowing orb or something like that.
00:08:08.840 | Just a voice, just subtle basic interaction.
00:08:13.840 | I think people sometimes over-engineer these things.
00:08:17.480 | Like simplicity can go a really long way.
00:08:19.720 | - Totally.
00:08:20.560 | I mean, ask any animator and they'll know that.
00:08:22.400 | - Yeah, yeah, those are actually,
00:08:24.800 | so the people behind Cosmo, the robot,
00:08:29.580 | the right people to design those is animators,
00:08:33.480 | like Disney type of people versus like roboticists.
00:08:38.120 | - Roboticists, quote unquote, are mostly clueless.
00:08:41.840 | It seems like--
00:08:44.040 | - They just have their own discipline
00:08:45.200 | that they're very good at and they don't have--
00:08:47.760 | - Yeah, but that, don't, you know.
00:08:51.320 | I feel like robotics of the early 21st century
00:08:56.240 | is not going to be the robotics of the later 21st century.
00:09:01.360 | Like if you call yourself a roboticist,
00:09:03.480 | it'd be something very different.
00:09:05.520 | 'Cause I think more and more you'd be like a,
00:09:10.120 | maybe like a control engineer or something,
00:09:12.680 | controls engineer, like you separate,
00:09:15.920 | because ultimately all the unsolved,
00:09:18.680 | all the big problems of robotics
00:09:21.720 | will be in the social aspect,
00:09:24.040 | in the interacting with humans aspect,
00:09:26.440 | in the perception interpreting the world aspect,
00:09:31.560 | in the brain part, not the basic control level part.
00:09:36.560 | - You call it basic, it's actually really complex.
00:09:40.360 | - It's very, very complicated.
00:09:41.600 | - And that's why, but like, I think you're so right
00:09:44.120 | and what a time to be alive, because for me, I just,
00:09:49.120 | we've had robots for so long
00:09:52.200 | and they've just been behind the scenes.
00:09:54.440 | And now finally robots are getting deployed into the world.
00:09:58.600 | - They're coming out of the closet.
00:10:00.000 | - Yeah, and we're seeing all these mistakes
00:10:02.280 | that companies are making because they focus so much
00:10:04.840 | on the engineering and getting that right
00:10:06.840 | and getting the robot to be even be able to function
00:10:09.640 | in a space that it shares with a human.
00:10:12.520 | - See what I feel like people don't understand
00:10:15.640 | is to solve the perception and the control problem.
00:10:19.140 | You shouldn't try to just solve
00:10:21.080 | the perception control problem.
00:10:22.480 | You should teach the robot how to say,
00:10:25.440 | "Oh shit, I'm sorry, I fucked up."
00:10:28.000 | - Yeah, or ask for help.
00:10:29.160 | - Or ask for help or be able to communicate the uncertainty.
00:10:34.160 | Yeah, exactly, all of those things,
00:10:36.200 | because you can't solve the perception and control problem.
00:10:38.960 | We humans haven't solved it.
00:10:40.560 | We're really damn good at it.
00:10:42.680 | But the magic is in the self-deprecating humor
00:10:48.080 | and the self-awareness about where our flaws are,
00:10:51.080 | all that kind of stuff.
00:10:52.480 | - Yeah, and there's a whole body of research
00:10:54.480 | in human robot interaction showing like ways to do this.
00:10:58.680 | But a lot of these companies haven't, they don't do HRI.
00:11:01.920 | They like the, have you seen the grocery store robot
00:11:05.320 | in the Stop and Shop?
00:11:06.680 | - Yes.
00:11:07.500 | - Yeah, the Marty, it looks like a giant penis.
00:11:09.240 | It's like six feet tall, it roams the aisles.
00:11:11.680 | - I will never see Marty the same way again.
00:11:13.920 | Thank you for that.
00:11:14.760 | - You're welcome.
00:11:15.600 | But like, these poor people worked so hard
00:11:20.440 | on getting a functional robot together.
00:11:23.140 | And then people hate Marty because they didn't at all
00:11:26.940 | consider how people would react to Marty in their space.
00:11:30.660 | - Does everybody, I mean, you talk about this,
00:11:33.020 | do people mostly hate Marty?
00:11:34.500 | 'Cause I like Marty.
00:11:36.060 | - Yeah, but you like Flippy.
00:11:38.900 | - Yeah, I do.
00:11:39.740 | - And actually like--
00:11:40.940 | - There's a parallel between the two?
00:11:43.180 | - I believe there is.
00:11:44.100 | So we were actually gonna do a study on this
00:11:45.840 | right before the pandemic hit, and then we canceled it
00:11:48.180 | because we didn't wanna go to the grocery store
00:11:49.860 | and neither did anyone else.
00:11:51.400 | But our theory, so this was with a student at MIT,
00:11:56.340 | Daniela Di Paola.
00:11:57.860 | She noticed that everyone on Facebook, in her circles,
00:12:00.960 | was complaining about Marty.
00:12:02.740 | They were like, "What is this creepy robot?
00:12:04.700 | "It's watching me, it's always in the way."
00:12:06.820 | And she did this quick and dirty sentiment analysis
00:12:09.660 | on Twitter where she was looking at positive
00:12:11.540 | and negative mentions of the robot.
00:12:13.140 | And she found that the biggest spike of negative mentions
00:12:16.900 | happened when Stop and Shop threw a birthday party
00:12:20.980 | for the Marty robots, like with free cake and balloons.
00:12:24.300 | Like, who complains about free cake?
00:12:26.380 | Well, people who hate Marty, apparently.
00:12:28.380 | And so we were like, "That's interesting."
00:12:31.580 | And then we did this online poll.
00:12:34.100 | We used Mechanical Turk, and we tried to get at
00:12:38.620 | what people don't like about Marty.
00:12:41.340 | And a lot of it wasn't, "Oh, Marty's taking jobs."
00:12:44.700 | It was, "Marty is the surveillance robot,"
00:12:47.180 | which it's not.
00:12:48.020 | It looks for spills on the floor.
00:12:48.900 | It doesn't actually look at any people.
00:12:52.900 | It's watching, it's creepy, it's getting in the way.
00:12:55.060 | Those were the things that people complained about.
00:12:56.900 | And so our hypothesis became, "Is Marty a real-life Clippy?"
00:13:01.900 | Because I know, Lex, you love Clippy,
00:13:05.260 | but many people hated Clippy.
00:13:07.740 | - Well, there's a complex thing there.
00:13:09.540 | It could be like marriage.
00:13:10.660 | A lot of people seem to like to complain about marriage,
00:13:13.620 | but they secretly love it.
00:13:15.260 | So it could be, the relationship you might have
00:13:18.900 | with Marty is like, "Oh, there he goes again,
00:13:23.780 | "doing his stupid surveillance thing."
00:13:26.760 | But you grow to love the,
00:13:28.980 | I mean, bitching about the thing
00:13:33.380 | that kind of releases a kind of tension.
00:13:35.420 | And there's, I mean, some people, a lot of people,
00:13:38.940 | show love by sort of busting each other's chops,
00:13:43.540 | like making fun of each other.
00:13:45.260 | And then I think people would really love it
00:13:48.420 | if Marty talked back.
00:13:49.780 | And there's so many possible options for humor there.
00:13:56.660 | One, you can lean in.
00:13:58.300 | You can be like, "Yes, I'm an agent of the CIA,
00:14:01.060 | "monitoring your every move,"
00:14:03.080 | like mocking people that are concerned,
00:14:05.260 | you know what I'm saying?
00:14:06.860 | Yes, I'm watching you because you're so important
00:14:09.980 | with your shopping patterns.
00:14:11.660 | I'm collecting all this data.
00:14:13.340 | Or just any kind of making fun of people, I don't know.
00:14:18.060 | But I think you hit on what exactly it is
00:14:20.500 | because when it comes to robots or artificial agents,
00:14:24.500 | I think people hate them more
00:14:26.980 | than they would some other machine or device or object.
00:14:30.700 | And it might be that thing,
00:14:34.020 | it might be combined with love or like whatever it is,
00:14:36.700 | it's a more extreme response
00:14:38.140 | because they view these things as social agents
00:14:41.460 | and not objects.
00:14:42.620 | And that was, so Clifford Nass
00:14:45.500 | was a big human-computer interaction person.
00:14:48.220 | And his theory about Clippy was that
00:14:52.140 | because people viewed Clippy as a social agent,
00:14:55.780 | when Clippy was annoying and would like bother them
00:14:58.220 | and interrupt them and like not remember what they told him,
00:15:02.100 | that's when people got upset
00:15:03.380 | because it wasn't fulfilling their social expectations.
00:15:06.260 | And so they complained about Clippy more
00:15:08.680 | than they would have if it had been a different,
00:15:10.420 | like not a virtual character.
00:15:14.460 | - So is complaining to you a sign
00:15:16.740 | that we're on the wrong path with a particular robot?
00:15:19.580 | Or is it possible, like again, like marriage,
00:15:23.340 | like family, that there still is a path
00:15:28.340 | towards that direction
00:15:29.700 | where we can find deep, meaningful relationships?
00:15:32.900 | - I think we absolutely can find
00:15:34.620 | deep, meaningful relationships with robots.
00:15:36.900 | And well, maybe with Marty.
00:15:38.900 | I mean, I just would,
00:15:40.340 | I would have designed Marty a little differently.
00:15:42.500 | - Like how?
00:15:43.620 | Isn't there a charm to the clumsiness, the slowness?
00:15:46.900 | - There is if you're not trying to get through
00:15:48.500 | with a shopping cart and a screaming child.
00:15:50.620 | You know, there's, I think,
00:15:53.380 | I think you could make it charming.
00:15:54.980 | I think there are lots of design tricks
00:15:56.900 | that they could have used.
00:15:58.620 | And one of the things they did,
00:16:00.100 | I think without thinking about it at all,
00:16:01.820 | is they slapped two big googly eyes on Marty.
00:16:04.380 | - Oh yeah.
00:16:05.260 | - And I wonder if that contributed maybe
00:16:07.460 | to people feeling watched
00:16:10.820 | because it's looking at them.
00:16:13.940 | And so like, is there a way to design the robot
00:16:18.740 | to do the function that it's doing
00:16:21.020 | in a way that people are actually attracted to
00:16:24.340 | rather than annoyed by?
00:16:25.740 | And there are many ways to do that,
00:16:27.140 | but companies aren't thinking about it.
00:16:29.020 | Now they're realizing
00:16:30.100 | that they should have thought about it.
00:16:31.580 | - Yeah.
00:16:32.420 | I wonder if there's a way to,
00:16:33.860 | if it would help to make Marty seem like an entity
00:16:38.820 | of its own versus the arm of a large corporation.
00:16:43.820 | So there's some sense where this is just the camera
00:16:50.740 | that's monitoring people versus this is an entity
00:16:54.500 | that's a standalone entity.
00:16:56.180 | It has its own task and it has its own personality.
00:16:59.820 | The more personality you give it,
00:17:01.860 | the more it feels like it's not sharing data
00:17:06.180 | with anybody else.
00:17:07.420 | Like when we see other human beings,
00:17:10.700 | our basic assumption is whatever I say to this human being,
00:17:14.140 | it's not like being immediately sent to the CIA.
00:17:17.460 | - Yeah, what I say to you,
00:17:18.300 | no one's gonna hear that, right?
00:17:19.940 | - Yeah, that's true.
00:17:20.860 | That's true.
00:17:21.700 | - No, I'm kidding.
00:17:22.540 | - But you forget it.
00:17:23.360 | I mean, you do forget it.
00:17:24.200 | I mean, I don't know if that even with microphones here,
00:17:26.260 | you forget that that's happening,
00:17:28.060 | but for some reason, I think probably with Marty,
00:17:31.560 | I think when it's done really crudely and crappily,
00:17:36.640 | you start to realize, oh, this is like PR people
00:17:39.580 | trying to make a friendly version of a surveillance machine.
00:17:44.580 | But I mean, that reminds me of the slight clumsiness
00:17:50.700 | or significant clumsiness on the initial releases
00:17:53.440 | of the avatars for the metaverse.
00:17:55.660 | I don't know, what are your actual thoughts about that?
00:17:58.500 | The way the avatars,
00:18:04.580 | the way Mark Zuckerberg looks in that world,
00:18:08.660 | in the metaverse, the virtual reality world
00:18:12.280 | where you can have virtual meetings and stuff like that.
00:18:14.920 | How do we get that right?
00:18:16.280 | Do you have thoughts about that?
00:18:17.280 | 'Cause it's a kind of,
00:18:18.320 | it feels like a similar problem to social robotics,
00:18:25.160 | which is how you design a digital virtual world
00:18:29.240 | that is compelling when you connect to others
00:18:34.580 | there in the same way that physical connection is.
00:18:38.740 | - Right, I haven't looked into,
00:18:40.180 | I mean, I've seen people joking about it on Twitter
00:18:42.280 | and posting whatever.
00:18:44.460 | - Yeah, but I mean, have you seen it?
00:18:45.660 | 'Cause there's something you can't quite put into words
00:18:49.420 | that doesn't feel genuine about the way it looks.
00:18:54.420 | And so the question is, if you and I were to meet virtually,
00:18:58.280 | what should the avatars look like
00:19:02.240 | for us to have similar kind of connection?
00:19:04.700 | Should it be really simplified?
00:19:07.180 | Should it be a little bit more realistic?
00:19:09.820 | Should it be cartoonish?
00:19:11.260 | Should it be more,
00:19:12.920 | better capturing of expressions
00:19:18.580 | in interesting, complex ways
00:19:21.460 | versus like cartoonish, oversimplified ways?
00:19:24.260 | - But haven't video games figured this out?
00:19:26.180 | I'm not a gamer, so I don't have any examples,
00:19:28.500 | but I feel like there's this whole world in video games
00:19:31.660 | where they've thought about all of this,
00:19:33.580 | and depending on the game, they have different avatars,
00:19:36.580 | and a lot of the games are about connecting with others.
00:19:40.180 | The thing that I don't know is,
00:19:42.420 | and again, I haven't looked into this at all.
00:19:45.420 | I've been shockingly not very interested in the metaverse,
00:19:48.820 | but they must've poured so much investment into this, Meta.
00:19:53.820 | And why is it so bad?
00:20:02.140 | Like, there's gotta be a reason.
00:20:04.140 | There's gotta be some thinking behind it, right?
00:20:06.960 | - Well, I talked to Carmack about this,
00:20:10.780 | John Carmack, who's a part-time Oculus CTO.
00:20:15.140 | I think there's several things to say.
00:20:20.660 | One is, as you probably know,
00:20:22.460 | that I mean, there's bureaucracy,
00:20:23.860 | there's large corporations,
00:20:25.700 | and they often, large corporations have a way
00:20:28.700 | of killing the indie kind of artistic flame
00:20:33.700 | that's required to create something really compelling.
00:20:38.860 | Somehow they make everything boring,
00:20:40.500 | 'cause they run through this whole process
00:20:42.740 | through the PR department,
00:20:44.300 | through all that kind of stuff,
00:20:45.340 | and it somehow becomes generic through that process.
00:20:48.500 | - 'Cause they strip out anything interesting
00:20:50.060 | because it could be controversial, is that, or?
00:20:52.940 | - Yeah, right, exactly.
00:20:54.340 | Like, what, I mean, we're living through this now,
00:20:58.740 | like, with a lot of people with cancellations
00:21:03.060 | and all those kinds of stuff,
00:21:04.140 | people are nervous, and nervousness results in,
00:21:07.180 | like usual, the assholes are ruining everything.
00:21:11.340 | But, you know, the magic of human connection
00:21:13.260 | is taking risks, of making a risky joke,
00:21:16.500 | of, like, with people you like who are not assholes,
00:21:20.060 | good people, like, some of the fun in the metaverse
00:21:24.140 | or in video games is, you know, being edgier,
00:21:28.100 | being interesting, revealing your personality
00:21:30.740 | in interesting ways.
00:21:31.780 | In the sexual tension or in,
00:21:36.180 | they're definitely paranoid about that.
00:21:38.460 | - Oh, yeah. - Because, like,
00:21:39.300 | in metaverse, the possibility of sexual assault
00:21:42.300 | and sexual harassment and all that kind of stuff,
00:21:44.940 | it's obviously very high, but they're,
00:21:47.240 | so you should be paranoid to some degree,
00:21:50.420 | but not too much, because then you remove completely
00:21:53.060 | the personality of the whole thing.
00:21:54.820 | Then everybody's just like a vanilla bot that,
00:21:58.340 | like, you have to have ability
00:22:01.660 | to be a little bit political, to be a little bit edgy,
00:22:05.260 | all that kind of stuff,
00:22:06.100 | and large companies tend to suffocate that.
00:22:09.620 | So, but in general, just forget all that,
00:22:12.660 | just the ability to come up
00:22:14.940 | with really cool, beautiful ideas.
00:22:19.820 | If you look at, I think Grimes tweeted about this,
00:22:23.260 | she's very critical about the metaverse,
00:22:25.820 | is that,
00:22:26.660 | you know, independent game designers
00:22:33.220 | have solved this problem
00:22:34.300 | of how to create something beautiful
00:22:35.700 | and interesting and compelling.
00:22:37.540 | They do a really good job.
00:22:39.540 | So you have to let those kinds of minds,
00:22:41.660 | the small groups of people, design things
00:22:44.180 | and let them run with it, let them run wild
00:22:47.220 | and do edgy stuff, yeah.
00:22:49.300 | But otherwise you get this kind of,
00:22:53.420 | you get a Clippy type of situation, right?
00:22:55.660 | Which is like a very generic looking thing.
00:22:59.180 | But even Clippy has some, like,
00:23:02.860 | that's kind of wild that you would take a paperclip
00:23:07.140 | and put eyes on it.
00:23:08.860 | - And suddenly people are like, oh, you're annoying,
00:23:10.800 | but you're definitely a social agent.
00:23:13.260 | - And I just feel like that wouldn't even,
00:23:15.700 | that Clippy thing wouldn't even survive Microsoft
00:23:19.940 | or Facebook of today, meta of today.
00:23:23.180 | 'Cause it would be like, well,
00:23:24.760 | there'll be these meetings about why is it a paperclip?
00:23:28.260 | Like, why don't we, it's not sufficiently friendly,
00:23:30.700 | let's make it, you know.
00:23:32.700 | And then all of a sudden the artist
00:23:34.900 | that with whom it originated is killed
00:23:37.540 | and it's all PR, marketing people
00:23:40.540 | and all of that kind of stuff.
00:23:41.700 | Now, they do important work to some degree,
00:23:45.260 | but they kill the creativity.
00:23:47.500 | - I think the killing of the creativity is in the whole,
00:23:50.460 | like, okay, so what I know from social robotics is like,
00:23:53.980 | obviously if you create agents that,
00:23:57.940 | okay, so take for an example,
00:23:59.140 | you'd create a robot that looks like a humanoid
00:24:01.500 | and it's, you know, Sophia or whatever.
00:24:04.100 | Now, suddenly you do have all of these issues
00:24:07.220 | where are you reinforcing an unrealistic beauty standard?
00:24:11.540 | Are you objectifying women?
00:24:14.300 | Why is the robot?
00:24:15.180 | Why is it white?
00:24:16.020 | So you have, but the thing is,
00:24:17.660 | I think that with creativity,
00:24:20.520 | you can find a solution that's even better
00:24:25.340 | where you're not even harming anyone
00:24:28.900 | and you're creating a robot that looks like,
00:24:31.380 | not humanoid, but like something
00:24:33.780 | that people relate to even more.
00:24:35.740 | And now you don't even have any of these bias issues
00:24:38.820 | that you're creating.
00:24:39.700 | And so how do we create that within companies?
00:24:42.260 | Because I don't think it's really about,
00:24:44.500 | like, 'cause I, you know, maybe we disagree on that.
00:24:48.220 | I don't think that edginess or humor or interesting things
00:24:52.460 | need to be things that harm or hurt people
00:24:55.220 | or that people are against.
00:24:56.460 | There are ways to find things that everyone is fine with.
00:24:59.560 | Why aren't we doing that?
00:25:01.780 | - The problem is there's departments
00:25:03.780 | that look for harm in things.
00:25:05.620 | - Yeah.
00:25:06.460 | - And so they will find harm in things that have no harm.
00:25:09.460 | - Okay.
00:25:10.300 | - That's the big problem
00:25:11.140 | because their whole job is to find harm in things.
00:25:13.660 | So what you said is completely correct,
00:25:16.700 | which is edginess should not hurt,
00:25:19.900 | doesn't necessarily, doesn't need to be a thing
00:25:22.540 | that hurts people, obviously.
00:25:25.500 | Great humor, great personality doesn't have to,
00:25:30.340 | like Clippy.
00:25:31.180 | But yeah, I mean, but it's tricky to get right.
00:25:36.780 | I'm not exactly sure.
00:25:37.780 | I don't know.
00:25:38.620 | I don't know why a large corporation
00:25:40.180 | with a lot of funding can't get this right.
00:25:41.940 | - I do think you're right
00:25:42.780 | that there's a lot of aversion to risk.
00:25:44.740 | And so if you get lawyers involved
00:25:46.860 | or people whose job it is, like you say, to mitigate risk,
00:25:50.660 | they're just gonna say no to most things
00:25:52.460 | that could even be in some way.
00:25:54.540 | Yeah.
00:25:56.100 | Yeah, you get the problem in all organizations.
00:25:58.060 | So I think that you're right, that that is a problem.
00:26:00.860 | - I think what's the way to solve that
00:26:02.740 | in large organizations is to have Steve Jobs
00:26:04.980 | type of characters.
00:26:06.340 | Unfortunately, you do need to have, I think,
00:26:10.020 | from a designer perspective,
00:26:12.500 | or maybe like a Jony Ive,
00:26:14.540 | that is almost like a dictator.
00:26:17.140 | - Yeah, you want a benevolent dictator.
00:26:18.780 | - Yeah, who rolls in and says,
00:26:20.820 | cuts through the lawyers, the PR,
00:26:24.820 | but has a benevolent aspect.
00:26:26.780 | Like, yeah, there's a good heart and make sure,
00:26:29.900 | like I think all great artists and designers
00:26:32.780 | create stuff that doesn't hurt people.
00:26:35.100 | Like if you have a good heart,
00:26:36.500 | you're going to create something
00:26:37.620 | that's going to actually make a lot of people feel good.
00:26:41.180 | That's what like people like Jony Ive,
00:26:43.340 | what they love doing is creating a thing
00:26:46.540 | that brings a lot of love to the world.
00:26:48.540 | They imagine like millions of people using the thing
00:26:50.780 | and it instills them with joy.
00:26:53.980 | That's that, you could say that about social robotics,
00:26:56.420 | you could say that about the metaverse.
00:26:59.140 | It shouldn't be done by the PR people,
00:27:01.700 | should be done by the designers.
00:27:03.540 | - I agree, PR people ruin everything.
00:27:06.100 | - Yeah, all the fun.
00:27:07.620 | (laughs)
00:27:09.180 | In the book, you have a picture,
00:27:11.500 | I just have a lot of ridiculous questions.
00:27:13.580 | You have a picture of two hospital delivery robots
00:27:16.100 | with a caption that reads, by the way,
00:27:18.620 | see your book, I appreciate that it keeps the humor in.
00:27:23.380 | You didn't run it by the PR department.
00:27:25.460 | - No, no one edited the book, we got rushed through.
00:27:28.300 | (laughs)
00:27:29.900 | - The caption reads, two hospital delivery robots
00:27:33.820 | whose sexy nurse names Roxy and Lola
00:27:36.540 | made me roll my eyes so hard they almost fell out.
00:27:39.820 | What aspect of it made you roll your eyes?
00:27:42.900 | Is it the naming?
00:27:44.260 | - It was the naming.
00:27:45.260 | The form factor is fine, it's like a little box on wheels.
00:27:48.140 | The fact that they named them, also great.
00:27:50.180 | That'll let people enjoy interacting with them.
00:27:53.820 | We know that even just giving a robot a name,
00:27:56.220 | people will, it facilitates technology adoption,
00:27:59.820 | people will be like, oh, you know,
00:28:02.380 | Betsy made a mistake, let's help her out
00:28:04.620 | instead of the stupid robot doesn't work.
00:28:06.660 | But why Lola and Roxy?
00:28:09.340 | Like--
00:28:10.180 | - Those are to you, too sexy?
00:28:12.020 | - I mean, there's research showing that
00:28:14.300 | a lot of robots are named according to gender biases
00:28:21.020 | about the function that they're fulfilling.
00:28:24.020 | So, you know, robots that are helpful and assistance
00:28:27.780 | and are like nurses are usually female-gendered.
00:28:32.180 | Robots that are powerful, all wise computers like Watson
00:28:35.860 | usually have like a booming male coded voice and name.
00:28:40.860 | So, like that's one of those things, right?
00:28:45.220 | You're opening a can of worms for no reason, for no reason.
00:28:48.540 | - You can avoid this whole can of worms.
00:28:49.940 | - Yeah, just give it a different name.
00:28:51.580 | Like why Roxy?
00:28:53.780 | It's because people aren't even thinking.
00:28:55.500 | So to some extent, I don't like PR departments,
00:28:59.460 | but getting some feedback on your work
00:29:02.460 | from a diverse set of participants,
00:29:04.860 | listening and taking in things
00:29:08.580 | that help you identify your own blind spots.
00:29:11.300 | And then you can always make your good leadership choices
00:29:14.660 | and good, like you can still ignore things
00:29:16.980 | that you don't believe are an issue,
00:29:19.540 | but having the openness to take in feedback
00:29:23.260 | and making sure that you're getting the right feedback
00:29:25.100 | from the right people, I think that's really important.
00:29:28.220 | - So don't unnecessarily propagate the biases of society.
00:29:32.660 | - Yeah, why?
00:29:33.820 | - In the design.
00:29:35.320 | But if you're not careful when you do the research of,
00:29:40.320 | like you might, if you ran a poll with a lot of people
00:29:45.460 | of all the possible names these robots have,
00:29:48.180 | they might come up with Roxy and Lola
00:29:50.180 | as names they would enjoy most.
00:29:55.260 | That could come up as the highest.
00:29:58.740 | - As in you do marketing research and then,
00:30:02.140 | well, that's what they did with Alexa.
00:30:03.600 | They did marketing research and nobody wanted the male voice.
00:30:07.180 | Everyone wanted it to be female.
00:30:09.020 | - Well, what do you think about that?
00:30:10.920 | If I were to say, I think the role of a great designer,
00:30:17.100 | again, to go back to Johnny Ive,
00:30:19.300 | is to throw out the marketing research.
00:30:23.100 | Like take it in, do it, learn from it.
00:30:26.180 | But if everyone wants Alexa to be a female voice,
00:30:31.180 | the role of the designer is to think deeply
00:30:33.220 | about the future of social agents in the home and think.
00:30:38.220 | Like what does that future look like?
00:30:40.780 | And try to reverse engineer that future.
00:30:43.460 | So in some sense, there's this weird tension.
00:30:46.100 | Like you want to listen to a lot of people,
00:30:49.020 | but at the same time, you're creating a thing
00:30:52.400 | that defines the future of the world.
00:30:54.260 | And the people that you're listening to
00:30:57.660 | are part of the past.
00:30:59.140 | So like that weird tension.
00:31:02.180 | - Yeah, I think that's true.
00:31:03.420 | And I think some companies like Apple
00:31:06.060 | have historically done very well at understanding a market
00:31:10.340 | and saying, you know what our role is?
00:31:11.780 | It's not to listen to what the current market says.
00:31:14.300 | It's to actually shape the market
00:31:16.180 | and shape consumer preferences.
00:31:18.300 | And companies have the power to do that.
00:31:20.580 | They can be forward thinking
00:31:22.520 | and they can actually shift
00:31:24.440 | what the future of technology looks like.
00:31:27.020 | And I agree with you that I would like to see more of that,
00:31:29.440 | especially when it comes to existing biases that we know.
00:31:34.440 | I think there's the low hanging fruit
00:31:39.840 | of companies that don't even think about it at all
00:31:41.600 | and aren't talking to the right people
00:31:42.960 | and aren't getting the full information.
00:31:44.520 | And then there's companies
00:31:45.560 | that are just like doing the safe thing
00:31:47.080 | and giving consumers what they want now.
00:31:49.820 | But to be really forward looking and be really successful,
00:31:52.840 | I think you have to make some judgment calls
00:31:55.440 | about what the future is gonna be.
00:31:57.400 | - But do you think it's still useful to gender
00:31:59.960 | and to name the robots?
00:32:01.800 | - Yes, I mean, gender is a minefield,
00:32:06.040 | but people, it's really hard to get people
00:32:10.960 | to not gender a robot in some way.
00:32:14.160 | So if you don't give it a name
00:32:16.600 | or you give it an ambiguous voice,
00:32:20.220 | people will just choose something.
00:32:22.500 | And maybe that's better than just entrenching something
00:32:27.500 | that you've decided is best.
00:32:30.540 | But I do think it can be helpful
00:32:33.060 | on the anthropomorphism engagement level
00:32:36.380 | to give it attributes that people identify with.
00:32:39.460 | - Yeah, I think a lot of roboticists I know,
00:32:42.140 | they don't gender the robot.
00:32:44.400 | They even try to avoid naming the robot
00:32:46.600 | or naming it something that can be used
00:32:50.760 | as a name in conversation kind of thing.
00:32:53.580 | And I think that actually that's irresponsible
00:32:57.580 | because people are going to anthropomorphize
00:33:02.800 | the thing anyway.
00:33:03.680 | So you're just removing from yourself
00:33:06.960 | the responsibility of how they're going
00:33:08.920 | to anthropomorphize it.
00:33:09.760 | - That's a good point.
00:33:11.140 | - And so you want to be able to,
00:33:13.860 | they're going to do it.
00:33:15.460 | You have to start to think about how they're going to do it.
00:33:17.700 | Even if the robot is a Boston Dynamics robot
00:33:20.780 | that's not supposed to have any kind of social component,
00:33:25.260 | they're obviously going to project
00:33:27.140 | a social component to it.
00:33:28.980 | Like that arm, I worked a lot with quadrupeds now
00:33:33.580 | with robot dogs.
00:33:35.860 | You know, that arm, people think is the head immediately.
00:33:39.460 | It's supposed to be an arm,
00:33:40.440 | but they start to think it's a head.
00:33:41.940 | And you have to like acknowledge that.
00:33:44.100 | You can't, I mean--
00:33:45.820 | - They do now.
00:33:46.900 | - They do now?
00:33:47.740 | - Well, they've deployed the robots
00:33:48.940 | and people are like, oh my God,
00:33:50.980 | the cops are using a robot dog.
00:33:53.060 | And so they have this PR nightmare.
00:33:54.860 | And so they're like, oh yeah.
00:33:58.260 | Okay, maybe we should hire some HR people.
00:34:00.460 | - Well, Boston Dynamics is an interesting company,
00:34:04.220 | or any of the others that are doing a similar thing
00:34:07.220 | because their main source of money
00:34:11.920 | is in the industrial application.
00:34:14.640 | So like surveillance of factories and doing dangerous jobs.
00:34:18.840 | So to them, it's almost good PR
00:34:22.240 | for people to be scared of these things
00:34:25.400 | 'cause it's for some reason, as you talk about,
00:34:28.560 | people are naturally for some reason scared.
00:34:31.040 | We could talk about that, of robots.
00:34:33.020 | And so it becomes more viral.
00:34:35.360 | Like playing with that little fear.
00:34:38.340 | And so it's almost like a good PR
00:34:40.020 | because ultimately they're not trying
00:34:41.780 | to put them in the home and have a good social connection.
00:34:44.580 | They're trying to put them in factories.
00:34:46.780 | And so they have fun with it.
00:34:48.540 | If you watch Boston Dynamics videos,
00:34:50.860 | they're aware of it.
00:34:52.580 | - Oh yeah.
00:34:53.420 | - They're, I mean, I was--
00:34:54.500 | - The videos for sure that they put out.
00:34:57.140 | - It's almost like an unspoken tongue in cheek thing.
00:35:01.460 | They're aware of how people are going to feel
00:35:05.140 | when you have a robot that does like a flip.
00:35:08.560 | Now most of the people are just like excited
00:35:11.680 | about the control problem of it.
00:35:13.160 | Like how to make the whole thing happen.
00:35:15.880 | But they're aware when people see.
00:35:18.480 | - Well, I think they became aware.
00:35:20.600 | I think that in the beginning,
00:35:21.760 | they were really, really focused on just the engineering.
00:35:24.480 | I mean, they're at the forefront of robotics,
00:35:27.040 | like locomotion and stuff.
00:35:28.600 | And then when they started doing the videos,
00:35:31.720 | I think that was kind of a labor of love.
00:35:34.640 | I know that the former CEO, Mark,
00:35:37.160 | like he oversaw a lot of the videos
00:35:38.940 | and made a lot of them himself.
00:35:40.180 | And like, he's even really detail-oriented.
00:35:42.900 | Like there can't be like some sort of incline
00:35:45.120 | that would give the robot an advantage.
00:35:46.620 | They're very, like he was very,
00:35:48.900 | a lot of integrity about the authenticity of them.
00:35:51.580 | But then when they started to go viral,
00:35:54.700 | I think that's when they started to realize,
00:35:56.900 | oh, there's something interesting here that,
00:36:03.500 | I don't know how much they took it seriously
00:36:05.920 | in the beginning, other than realizing
00:36:07.640 | that they could play with it in the videos.
00:36:10.480 | I know that they take it very seriously now.
00:36:13.360 | - What I like about Boston Dynamics and similar companies,
00:36:17.040 | it's still mostly run by engineers.
00:36:19.560 | But, you know, I've had my criticisms.
00:36:25.840 | There's a bit more PR leaking in.
00:36:28.660 | But those videos are made by engineers
00:36:31.640 | 'cause that's what they find fun.
00:36:33.860 | It's like testing the robustness of the system.
00:36:36.640 | I mean, they're having a lot of fun there with the robots.
00:36:41.640 | - Totally.
00:36:42.740 | Have you been to visit?
00:36:45.300 | - Yeah, yeah, yeah.
00:36:46.620 | It's one of the most,
00:36:48.140 | I mean, 'cause I have eight robot dogs now.
00:36:55.780 | - Wait, you have eight robot dogs?
00:36:58.100 | What?
00:36:58.920 | Are they just walking around your place?
00:37:00.660 | Like where do you keep them? - Yeah, I'm working on them.
00:37:02.420 | That's actually one of my goals
00:37:04.080 | is to have at any one time always a robot moving.
00:37:09.080 | - Oh. - I'm far away from that.
00:37:10.740 | - That's an ambitious goal.
00:37:13.020 | - Well, I have like more Roombas than I know what to do with.
00:37:15.780 | The Roomba that I program.
00:37:17.800 | So the programmable Roombas.
00:37:20.060 | - Nice.
00:37:20.900 | - And I have a bunch of little, like I built,
00:37:23.940 | I'm not finished with it yet,
00:37:24.860 | but bought a robot from Rick and Morty.
00:37:27.860 | I still have a bunch of robots everywhere.
00:37:29.180 | But the thing is, what happens is
00:37:32.220 | you're working on one robot at a time
00:37:34.860 | and that becomes like a little project.
00:37:38.060 | It's actually very difficult to have
00:37:40.060 | just a passively functioning robot always moving.
00:37:46.020 | - Yeah.
00:37:47.140 | - And that's a dream for me
00:37:48.860 | 'cause I'd love to create that kind of a little world.
00:37:51.180 | So the impressive thing about Boston Dynamics to me
00:37:54.900 | was to see like hundreds of spots.
00:37:59.500 | And like there was a,
00:38:00.940 | the most impressive thing that still sticks with me
00:38:02.940 | is there was a spot robot walking down the hall
00:38:07.940 | seemingly with no supervision whatsoever.
00:38:12.140 | And he was wearing, he or she, I don't know,
00:38:13.980 | was wearing a cowboy hat.
00:38:15.400 | It was just walking down the hall
00:38:18.420 | and nobody paying attention.
00:38:19.980 | And it's just like walking down this long hall.
00:38:22.380 | And I'm like looking around, is anyone,
00:38:26.060 | like what's happening here?
00:38:28.020 | So presumably some kind of automation was doing the map.
00:38:30.960 | I mean, the whole environment is probably really well mapped.
00:38:33.860 | But it was just, it gave me a picture of a world
00:38:38.700 | where a robot is doing his thing, wearing a cowboy hat,
00:38:41.780 | just going down the hall,
00:38:44.020 | like getting some coffee or whatever.
00:38:45.420 | Like I don't know what it's doing, what's the mission.
00:38:47.420 | But I don't know, for some reason it really stuck with me.
00:38:49.780 | You don't often see robots that aren't part of a demo
00:38:53.220 | or that aren't, you know, like with a semi-autonomous
00:38:57.660 | or autonomous vehicle, like directly doing a task.
00:39:01.340 | This was just chilling.
00:39:02.980 | - Yeah. - Walking around.
00:39:04.820 | I don't know.
00:39:05.660 | - Well, yeah, you know, I mean, we're at MIT.
00:39:07.340 | Like when I first got to MIT, I was like,
00:39:09.300 | okay, where's all the robots?
00:39:11.420 | And they were all like broken or like not demoing.
00:39:14.140 | So yeah.
00:39:16.460 | And what really excites me is that we're about to have that.
00:39:21.460 | We're about to have so many moving about too.
00:39:24.260 | Well, it's coming.
00:39:25.580 | It's coming in our lifetime
00:39:26.940 | that we will just have robots moving around.
00:39:29.340 | We're already seeing the beginnings of it.
00:39:30.860 | There's delivery robots in some cities, on the sidewalks.
00:39:34.060 | And I just love seeing like the TikToks
00:39:36.620 | of people reacting to that.
00:39:38.500 | Because yeah, you see a robot walking down the hall
00:39:40.740 | with a cowboy hat.
00:39:42.060 | You're like, what the fuck?
00:39:43.540 | What is this?
00:39:44.660 | This is awesome and scary and kind of awesome.
00:39:47.340 | And people either love or hate it.
00:39:49.340 | That's one of the things that I think companies
00:39:51.260 | are underestimating that people will either love a robot
00:39:54.460 | or hate a robot and nothing in between.
00:39:56.460 | So it's just, again, an exciting time to be alive.
00:40:00.460 | - Yeah, I think kids almost universally,
00:40:03.780 | at least in my experience, love them.
00:40:05.620 | Love legged robots.
00:40:08.100 | - If they're not loud.
00:40:09.180 | My son hates the Roomba because ours is loud.
00:40:12.980 | - Oh, that, yeah.
00:40:14.160 | No, the legs, the legs make a difference.
00:40:16.820 | 'Cause I don't, your son,
00:40:18.900 | do they understand Roomba to be a robot?
00:40:23.140 | - Oh yeah, my kids, that's one of the first words
00:40:25.380 | they learned.
00:40:26.220 | They know how to say beep boop.
00:40:27.780 | And yes, they think the Roomba's a robot.
00:40:29.940 | - Do they project intelligence out of the thing?
00:40:32.660 | - Well, we don't really use it around them anymore
00:40:34.420 | for the reason that my son is scared of it.
00:40:37.240 | - Yeah, that's really interesting.
00:40:41.420 | - I think they would.
00:40:42.360 | Even a Roomba, because it's moving around on its own,
00:40:47.000 | I think kids and animals view it as an agent.
00:40:51.080 | - So what do you think, if we just look at the state
00:40:54.440 | of the art of robotics, what do you think robots
00:40:56.480 | are actually good at today?
00:40:58.640 | So if we look at today.
00:40:59.880 | - You mean physical robots?
00:41:01.520 | - Yeah, physical robots.
00:41:02.900 | - Wow.
00:41:05.520 | - Like what are you impressed by?
00:41:07.200 | So I think a lot of people, I mean,
00:41:09.800 | that's what your book is about,
00:41:10.880 | is maybe not a perfectly calibrated understanding
00:41:15.880 | of where we are in terms of robotics,
00:41:20.320 | what's difficult in robotics, what's easy in robotics.
00:41:22.640 | - Yeah, we're way behind where people think we are.
00:41:26.800 | So what's impressive to me, so let's see.
00:41:31.800 | Oh, one thing that came out recently
00:41:34.320 | was Amazon has this new warehouse robot,
00:41:36.800 | and it's the first autonomous warehouse robot
00:41:41.600 | that is safe for people to be around.
00:41:44.520 | And so it's kind of, most people I think envision
00:41:48.640 | that our warehouses are already fully automated
00:41:51.200 | and that there's just robots doing things.
00:41:54.000 | It's actually still really difficult
00:41:56.120 | to have robots and people in the same space
00:41:59.800 | because it's dangerous for the most part.
00:42:01.960 | Robots, because especially robots
00:42:04.240 | that have to be strong enough to move something heavy,
00:42:06.960 | for example, they can really hurt somebody.
00:42:09.120 | And so until now, a lot of the warehouse robots
00:42:12.320 | had to just move along like preexisting lines,
00:42:15.040 | which really restricts what you can do.
00:42:17.180 | And so having, I think that's one of the big challenges
00:42:23.120 | and one of the big exciting things that's happening
00:42:25.840 | is that we're starting to see more cobotics
00:42:29.080 | in industrial spaces like that,
00:42:30.760 | where people and robots can work side by side
00:42:33.640 | and not get harmed.
00:42:35.560 | - Yeah, that's what people don't realize,
00:42:37.120 | sort of the physical manipulation task with humans.
00:42:41.000 | It's not that the robots wanna hurt you.
00:42:44.760 | I think that's what people are worried about,
00:42:46.360 | like this malevolent robot gets mad of its own
00:42:50.240 | and wants to destroy all humans.
00:42:52.480 | No, it's actually very difficult to know where the human is
00:42:55.840 | and to respond to the human and dynamically
00:43:00.740 | and collaborate with them on a task,
00:43:03.600 | especially if you're something like
00:43:04.800 | an industrial robotic arm, which is extremely powerful.
00:43:08.320 | Some of those arms are pretty impressive now
00:43:13.200 | that you can grab it, you can move it.
00:43:17.840 | So the collaboration between human and robot
00:43:21.000 | in the factory setting is really fascinating.
00:43:23.200 | Do you think they'll take our jobs?
00:43:27.040 | - I don't think it's that simple.
00:43:29.600 | I think that there's a ton of disruption that's happening
00:43:33.740 | and will continue to happen.
00:43:35.240 | I think speaking specifically of the Amazon warehouses,
00:43:40.880 | that might be an area where it would be good for robots
00:43:43.240 | to take some of the jobs that are,
00:43:45.660 | where people are put in a position where it's unsafe
00:43:48.780 | and they're treated horribly.
00:43:50.240 | And probably it would be better if a robot did that.
00:43:53.400 | And Amazon is clearly trying to automate that job away.
00:43:57.240 | So I think there's gonna be a lot of disruption.
00:44:00.840 | I do think that robots and humans
00:44:03.760 | have very different skillsets.
00:44:05.080 | So while a robot might take over a task,
00:44:08.520 | it's not gonna take over most jobs.
00:44:12.440 | I think just things will change a lot.
00:44:16.920 | Like, I don't know, one of the examples
00:44:18.640 | I have in the book is mining.
00:44:20.280 | So there you have this job that is very unsafe
00:44:26.320 | and that requires a bunch of workers
00:44:28.640 | and puts them in unsafe conditions.
00:44:30.080 | And now you have all these different robotic machines
00:44:33.960 | that can help make the job safer.
00:44:36.640 | And as a result, now people can sit in these
00:44:38.800 | like air conditioned remote control stations
00:44:42.560 | and like control these autonomous mining trucks.
00:44:45.680 | And so that's a much better job,
00:44:47.040 | but also they're employing less people now.
00:44:49.240 | So it's just a lot of,
00:44:54.840 | I think from a bird's eye perspective,
00:44:57.200 | you're not gonna see job loss.
00:44:58.520 | You're gonna see more jobs created because that's,
00:45:03.120 | the future is not robots just becoming like people
00:45:07.400 | and taking their jobs.
00:45:08.360 | The future is really a combination of our skills
00:45:11.320 | and then the supplemental skillset that robots have
00:45:14.500 | to increase productivity,
00:45:15.880 | to help people have better, safer jobs,
00:45:18.680 | to give people work that they actually enjoy doing
00:45:22.960 | and are good at.
00:45:24.980 | But it's really easy to say that
00:45:26.860 | from a bird's eye perspective
00:45:28.560 | and ignore kind of the rubble on the ground
00:45:33.560 | as we go through these transitions,
00:45:35.480 | because of course specific jobs are going to get lost.
00:45:39.540 | - If you look at the history of the 20th century,
00:45:41.660 | it seems like automation constantly increases productivity
00:45:46.620 | and improves the average quality of life.
00:45:51.140 | So it's been always good.
00:45:53.000 | So like thinking about this time being different
00:45:58.000 | is that we would need to go against the lessons of history.
00:46:01.900 | - It's true.
00:46:03.500 | - And the other thing is I think people think
00:46:06.500 | that the automation of the physical tasks is easy.
00:46:10.040 | I was just in Ukraine and the interesting thing is,
00:46:13.080 | I mean, there's a lot of difficult and dark lessons
00:46:18.740 | just about a war zone.
00:46:20.940 | But one of the things that happens in war
00:46:22.900 | is there's a lot of mines that are placed.
00:46:25.040 | This is one of the big problems for years
00:46:31.120 | after a war is even over
00:46:33.000 | is the entire landscape is covered in mines.
00:46:35.620 | And so there's a demining effort.
00:46:39.380 | And you would think robots would be good
00:46:43.160 | at this kind of thing.
00:46:44.280 | Or like your intuition would be like,
00:46:45.920 | well, say you have unlimited money
00:46:48.840 | and you wanna do a good job of it, unlimited money.
00:46:51.320 | You would get a lot of really nice robots.
00:46:53.660 | But no, humans are still far superior.
00:46:57.460 | - Or animals. - At this kind of task.
00:46:58.620 | Or animals.
00:46:59.620 | But humans with animals together.
00:47:02.340 | - Yeah. - You can't just have--
00:47:03.180 | - That's true. - A dog with a hat.
00:47:05.020 | (both laughing)
00:47:06.420 | - That's fair.
00:47:07.260 | - But yes, but figuring out also how to disable the mine.
00:47:13.820 | Obviously the easy thing,
00:47:17.660 | the thing a robot can help with
00:47:19.780 | is to find the mine and blow it up.
00:47:24.780 | But that's gonna destroy the landscape.
00:47:27.680 | That really does a lot of damage to the land.
00:47:30.080 | You wanna disable the mine.
00:47:32.440 | And to do that, because of all the different
00:47:34.880 | edge cases of the problem,
00:47:37.240 | it requires a huge amount of human-like experience,
00:47:40.080 | it seems like.
00:47:41.100 | So it's mostly done by humans.
00:47:42.640 | They have no use for robots.
00:47:43.880 | They don't want robots.
00:47:45.120 | - Yeah, I think we overestimate what we can automate.
00:47:49.100 | - Especially in the physical realm.
00:47:51.700 | - Yeah. - It's weird.
00:47:53.280 | I mean, it continues the story of humans:
00:47:57.560 | we think we're shitty at everything in the physical world,
00:48:00.680 | including driving.
00:48:01.640 | Everybody makes fun of themselves
00:48:04.360 | and others for being shitty drivers,
00:48:06.200 | but we're actually kind of incredible.
00:48:07.920 | - No, we're incredible.
00:48:08.880 | And that's why Tesla still says
00:48:12.540 | that if you're in the driver's seat,
00:48:13.920 | like you are ultimately responsible.
00:48:17.320 | Because the ideal, I mean,
00:48:18.840 | you know more about this than I do,
00:48:20.320 | but like robot cars are great at predictable things
00:48:25.320 | and can react faster and more precisely than a person
00:48:31.240 | and can do a lot of the driving.
00:48:33.400 | And then the reason that we still don't have
00:48:35.640 | autonomous vehicles on all the roads yet
00:48:37.920 | is because of this long tail of just unexpected occurrences
00:48:41.720 | where a human immediately understands
00:48:44.040 | that's a sunset and not a traffic light.
00:48:46.080 | That's a horse and carriage ahead of me on the highway,
00:48:48.440 | but the car has never encountered that before.
00:48:50.080 | So like in theory, combining those skill sets
00:48:53.760 | is what's gonna really be powerful.
00:48:57.040 | The only problem is figuring out
00:48:59.440 | the human-robot interaction and the handoffs.
00:49:01.400 | So like in cars, that's a huge problem right now,
00:49:03.760 | figuring out the handoffs.
00:49:06.120 | But in other areas, it might be easier.
00:49:08.880 | And that's really the future is human-robot interaction.
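To make the handoff problem described above concrete, here is a minimal sketch of one way a driving stack might decide when to ask the human to take over, based on perception confidence, scene novelty, and how much time the person would need to re-engage. This is illustrative only, not any company's actual logic; every name and threshold here is a hypothetical assumption.

```python
# Illustrative sketch of a human-takeover decision for the "long tail" cases
# discussed above. All thresholds and names are hypothetical.

from dataclasses import dataclass

@dataclass
class PerceptionState:
    scene_confidence: float   # 0..1, how sure the system is it understands the scene
    novelty_score: float      # 0..1, how unlike prior experience the scene looks
    seconds_to_hazard: float  # estimated time before a decision is required

HUMAN_TAKEOVER_SECONDS = 6.0  # assumed time a driver needs to re-engage

def should_hand_off(state: PerceptionState) -> bool:
    """Request a human handoff when the long tail shows up: low confidence or
    high novelty, while there is still enough time for the person to take over."""
    uncertain = state.scene_confidence < 0.7 or state.novelty_score > 0.8
    enough_time = state.seconds_to_hazard > HUMAN_TAKEOVER_SECONDS
    return uncertain and enough_time

# Example: a sunset that looks like a traffic light -> low confidence, plenty of time.
print(should_hand_off(PerceptionState(0.55, 0.9, 12.0)))  # True: ask the human
print(should_hand_off(PerceptionState(0.95, 0.1, 12.0)))  # False: keep driving
```

The design point this echoes: a handoff is only useful if it is requested while there is still enough time for the person to actually take over.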
00:49:12.880 | - What's really hard to improve,
00:49:15.800 | it's terrible that people die in car accidents,
00:49:19.760 | but I mean, it's like 70, 80, 100 million miles,
00:49:24.520 | one death per 80 million miles.
00:49:28.560 | That's like really hard to beat for a robot.
00:49:31.360 | That's like incredible.
00:49:33.040 | Like think about it, like the, how many people?
00:49:37.040 | Think just the number of people throughout the world
00:49:38.960 | that are driving every single day,
00:49:41.320 | all of this, you know, sleep deprived, drunk,
00:49:45.360 | distracted, all of that, and still very few die
00:49:51.760 | relative to what I would imagine.
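As a rough sanity check on the figure mentioned above, using assumed, rounded US numbers (on the order of 3 trillion vehicle-miles and roughly 40,000 road deaths per year):

```python
# Back-of-the-envelope check of the "one death per ~80 million miles" figure.
# Both inputs are order-of-magnitude assumptions, not exact statistics.

miles_per_year = 3.0e12      # assumed annual vehicle-miles driven
deaths_per_year = 40_000     # assumed annual road deaths

miles_per_death = miles_per_year / deaths_per_year
print(f"~{miles_per_death / 1e6:.0f} million miles per death")  # ~75 million
```

That works out to on the order of one death per 75-100 million miles driven, which is the bar an autonomous system has to clear.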
00:49:53.080 | If I were to guess, back in the horse days,
00:49:55.760 | if I was in the beginning of the 20th century,
00:49:58.680 | riding my horse, I would talk so much shit about these cars.
00:50:02.920 | I'd be like, this is extremely dangerous,
00:50:05.120 | these machines traveling at 30 miles an hour
00:50:08.080 | or whatever the hell they're going at.
00:50:10.340 | This is irresponsible, it's unnatural,
00:50:13.120 | and it's going to be destructive to all of human society.
00:50:16.520 | But then it's extremely surprising
00:50:18.120 | how humans adapt to the thing,
00:50:19.960 | and they know how to not kill each other.
00:50:22.000 | I mean, that ability to adapt is incredible,
00:50:27.520 | and to mimic that in the machine is really tricky.
00:50:30.340 | Now, that said, what Tesla is doing,
00:50:33.360 | I mean, I wouldn't have guessed
00:50:35.400 | how far machine learning can go on vision alone.
00:50:38.480 | It's really, really incredible,
00:50:39.860 | and people that are, at least from my perspective,
00:50:43.760 | people that are kind of critical of Elon and those efforts,
00:50:48.760 | I think don't give enough credit
00:50:54.280 | at how much incredible progress
00:50:57.120 | has been made in that direction.
00:50:59.160 | I think most of the robotics community
00:51:00.600 | wouldn't have guessed how much you can do on vision alone.
00:51:03.640 | It's kind of incredible, because
00:51:07.080 | I think that approach, which is relatively unique,
00:51:11.840 | has challenged the other competitors to step up their game.
00:51:16.620 | So if you're using LiDAR, if you're using mapping,
00:51:19.320 | that challenges them to do better, to scale faster,
00:51:26.660 | and to use machine learning and computer vision as well
00:51:30.460 | to integrate both LiDAR and vision.
00:51:32.860 | So it's kind of incredible.
00:51:35.580 | And I'm not, I don't know if I even have a good intuition
00:51:40.100 | of how hard driving is anymore.
00:51:43.040 | Maybe it is possible to solve.
00:51:45.460 | So all the sunset, all the edge cases you mentioned.
00:51:48.340 | Yeah, the question is when.
00:51:49.700 | - Yeah, I think it's not happening
00:51:51.060 | as quickly as people thought it would,
00:51:53.260 | because it is more complicated.
00:51:55.080 | - But I wouldn't have, I agree with you.
00:51:59.340 | My current intuition is that we're gonna get there.
00:52:02.840 | - I think we're gonna get there too.
00:52:04.140 | - But before, I wasn't sure we were gonna get there
00:52:09.140 | with current technology.
00:52:13.060 | So with vision alone,
00:52:19.020 | my intuition was you're gonna have to solve
00:52:23.220 | like common sense reasoning.
00:52:25.660 | You're gonna have to solve some of the big problems
00:52:28.940 | in artificial intelligence, not just perception.
00:52:34.940 | - Yeah.
00:52:35.780 | - Like you have to have a deep understanding of the world,
00:52:37.820 | is what was my sense.
00:52:38.820 | But now I'm starting to like, well, this,
00:52:41.300 | I mean, I'm continually surprised how well the thing works.
00:52:44.460 | Obviously others have stopped,
00:52:47.380 | but Elon continues saying we're gonna solve it in a year.
00:52:51.340 | - Well, yeah, that's the thing, bold predictions.
00:52:54.580 | - Yeah, well, everyone else used to be doing that,
00:52:57.180 | but they kind of like, all right.
00:52:59.140 | - Yeah, maybe we'll.
00:53:00.940 | - Maybe let's not promise we're gonna solve
00:53:03.100 | level four driving by 2020.
00:53:05.580 | Let's chill on that.
00:53:07.580 | - Well, people are still trying silently.
00:53:09.820 | I mean, the UK just committed 100 million pounds
00:53:14.620 | to research and development to speed up the process
00:53:18.180 | of getting autonomous vehicles on the road.
00:53:19.660 | Like everyone can see that it is solvable
00:53:23.780 | and it's going to happen and it's gonna change everything.
00:53:26.380 | And they're still investing in it.
00:53:28.180 | And like Waymo, low key has driverless cars in Arizona.
00:53:33.180 | Like you can get, you know, there's like robots.
00:53:40.580 | It's weird, have you ever been in one?
00:53:42.540 | - No.
00:53:43.380 | - It's so weird.
00:53:44.620 | It's so awesome.
00:53:45.820 | 'Cause the most awesome experience is the wheel turning
00:53:50.620 | and you're sitting in the back.
00:53:52.420 | It's like, I don't know, it's,
00:53:57.100 | it feels like you're a passenger with that friend
00:53:59.260 | who's a little crazy of a driver.
00:54:01.900 | It feels like, shit, I don't know.
00:54:05.060 | Are you all right to drive, bro?
00:54:06.460 | You know, that kind of feeling.
00:54:07.780 | But then you kind of, that experience, that nervousness
00:54:12.780 | and the excitement of trusting another being,
00:54:18.060 | and in this case, it's a machine, is really interesting.
00:54:20.860 | Just even introspecting your own feelings about the thing.
00:54:26.020 | - Yeah.
00:54:27.820 | - They're not doing anything in terms of making you feel
00:54:32.820 | better about, like at least Waymo.
00:54:37.620 | I think they went with the approach of like,
00:54:40.340 | let's not try to put eyes on the thing.
00:54:42.360 | It's a wheel, we know what that looks like.
00:54:46.020 | It's just a car.
00:54:47.140 | It's a car, get in the back.
00:54:48.720 | Let's not like discuss this at all.
00:54:50.260 | Let's not discuss the fact that this is a robot driving you
00:54:53.700 | and you're in the back.
00:54:55.020 | And if the robot wants to start driving 80 miles an hour
00:54:57.500 | and run off of a bridge, you have no recourse.
00:55:00.700 | Let's not discuss this.
00:55:02.140 | You're just getting in the back.
00:55:03.340 | There's no discussion about like how shit can go wrong.
00:55:06.980 | There's no eyes, there's nothing.
00:55:08.260 | There's like a map showing what the car can see.
00:55:12.120 | Like, you know, what happens if it's like
00:55:15.100 | a HAL 9000 situation?
00:55:17.980 | Like, I'm sorry, I can't.
00:55:21.380 | You have a button you can like call customer service.
00:55:24.420 | - Oh God, then you get put on hold for two hours.
00:55:27.540 | - Probably.
00:55:28.380 | But you know, currently what they're doing,
00:55:31.740 | which I think is understandable,
00:55:34.660 | is, you know, the car can just pull over and stop
00:55:37.420 | and wait for help to arrive.
00:55:39.620 | And then a driver will come
00:55:40.900 | and then they'll actually drive the car for you.
00:55:43.180 | But that's like, you know, what if you're late for a meeting
00:55:47.580 | or all that kind of stuff.
00:55:48.780 | - Or like the more dystopian, isn't it The Fifth Element
00:55:51.100 | where, is Will Smith in that movie?
00:55:53.220 | Who's in that movie?
00:55:54.140 | - No, Bruce Willis?
00:55:54.980 | - Bruce Willis.
00:55:55.820 | - Oh yeah, and he gets into like a robotic cab
00:55:58.740 | or car or something.
00:56:00.180 | And then because he's violated a traffic rule,
00:56:02.260 | it locks him in and he has to wait for the cops to come
00:56:05.540 | and he can't get out.
00:56:06.460 | So like, we're gonna see stuff like that maybe.
00:56:09.700 | - Well, that's, I believe that the companies
00:56:13.740 | that have robots, the only ones that will succeed
00:56:18.740 | are the ones that don't do that.
00:56:21.900 | Meaning they respect privacy.
00:56:25.140 | - You think so?
00:56:26.420 | - Yeah, because people,
00:56:28.860 | 'cause they're gonna have to earn people's trust.
00:56:31.860 | - Yeah, but like Amazon works with law enforcement
00:56:34.500 | and gives them the data from the ring cameras.
00:56:36.780 | So why should it, yeah.
00:56:39.540 | Oh yeah.
00:56:40.740 | Do you have a ring camera?
00:56:42.860 | - No.
00:56:43.700 | - Okay.
00:56:44.520 | - No, no, but basically any security camera, right?
00:56:48.260 | I have Google's, whatever they have.
00:56:51.500 | - We have one, but the data,
00:56:54.100 | we store the data on a local server
00:56:56.060 | because we don't want it to go to law enforcement
00:56:58.740 | 'cause all the companies are doing it.
00:57:00.300 | They're doing it.
00:57:01.120 | I bet Apple wouldn't.
00:57:02.120 | - Yeah.
00:57:03.860 | - Apple's the only company I trust
00:57:04.940 | and I don't know how much longer.
00:57:06.580 | - I don't know.
00:57:09.140 | Maybe that's true for cameras, but with robots,
00:57:15.900 | people are just not gonna let a robot inside their home
00:57:19.500 | where like one time where somebody gets arrested
00:57:22.940 | because of something a robot sees,
00:57:24.900 | that's gonna destroy a company.
00:57:27.940 | - You don't think people are gonna be like,
00:57:29.260 | well, that wouldn't happen to me,
00:57:30.260 | that happened to a bad person.
00:57:31.760 | - I think they would.
00:57:35.420 | - Yeah.
00:57:36.260 | - 'Cause in the modern world, people,
00:57:38.780 | have you seen Twitter?
00:57:39.860 | They get extremely paranoid about any kind of surveillance.
00:57:43.620 | - But the thing that I've had to learn
00:57:45.220 | is that Twitter is not the modern world.
00:57:47.500 | Like when I go inland to visit my relatives,
00:57:52.500 | like they don't, that's a different discourse
00:57:55.980 | that's happening.
00:57:56.820 | I think like the whole tech criticism world,
00:57:59.500 | it's loud in our ears 'cause we're in those circles.
00:58:01.980 | - Do you think you can be a company
00:58:04.380 | that does social robotics and not win over Twitter?
00:58:07.020 | - That's a good question.
00:58:09.420 | - I feel like the early adopters are all on Twitter
00:58:12.460 | and it feels like you have to win them over.
00:58:15.620 | Feels like nowadays you'd have to win over TikTok, honestly.
00:58:18.620 | - TikTok, is that a website?
00:58:21.360 | I need to check it out.
00:58:25.100 | And that's an interesting one
00:58:29.180 | because China is behind that one.
00:58:31.620 | - Exactly.
00:58:32.460 | - So if it's compelling enough,
00:58:35.860 | maybe people would be able to give up privacy
00:58:40.700 | and that kind of stuff.
00:58:41.860 | That's really scary.
00:58:44.020 | - I mean, I'm worried about it.
00:58:46.980 | I'm worried about it.
00:58:47.980 | There've been some developments recently
00:58:51.580 | that are like super exciting,
00:58:53.140 | like the large language learning models.
00:58:55.580 | Like, wow, I did not anticipate those improving so quickly
00:58:59.860 | and those are gonna change everything.
00:59:03.780 | And one of the things that I'm trying to be cynical about
00:59:06.480 | is that I think they're gonna have a big impact
00:59:09.340 | on privacy and data security and like manipulating consumers
00:59:12.900 | and manipulating people because suddenly
00:59:14.740 | you'll have these agents that people will talk to
00:59:16.700 | and they won't care or won't know,
00:59:20.060 | at least on a conscious level,
00:59:21.140 | that it's recording the conversations.
00:59:22.700 | So kind of like we were talking about before.
00:59:24.900 | And at the same time, the technology is so freaking exciting
00:59:29.060 | that it's gonna get adopted.
00:59:31.020 | - Well, it's not even just the collection of data
00:59:32.460 | but the ability to manipulate at scale.
00:59:36.500 | So what do you think about the AI,
00:59:41.100 | the engineer from Google
00:59:45.500 | that thought Lambda is sentient?
00:59:49.020 | There was actually a really good post from somebody else.
00:59:51.220 | I forgot her name.
00:59:52.420 | It's brilliant.
00:59:53.260 | I can't believe I didn't know about her.
00:59:54.940 | Thanks to you. - Janelle Shane?
00:59:55.940 | - Yeah, from Weird AI.
00:59:57.580 | - Oh yeah, I love her book.
00:59:59.140 | - She's great.
00:59:59.980 | I left a note for myself to reach out to her.
01:00:01.780 | She's amazing.
01:00:02.820 | She's hilarious and brilliant
01:00:04.380 | and just a great summarizer of the state of AI.
01:00:07.860 | But she has, I think that was from her,
01:00:11.060 | where it was looking at AI explaining that it's a squirrel.
01:00:16.060 | - Oh yeah, because the transcripts
01:00:20.820 | that the engineer released,
01:00:22.900 | Lambda kind of talks about the experience
01:00:25.260 | of human-like feelings and I think even consciousness.
01:00:30.260 | And so she was like, oh cool, that's impressive.
01:00:33.780 | I wonder if an AI can also describe the experience
01:00:36.420 | of being a squirrel.
01:00:37.260 | And so she interviewed, I think she did GPT-3
01:00:40.620 | about the experience of being a squirrel.
01:00:42.620 | And then she did a bunch of other ones too,
01:00:44.700 | like what's it like being a flock of crows?
01:00:46.980 | What's it like being an algorithm that powers a Roomba?
01:00:49.380 | And you can have a conversation about any of those things
01:00:53.260 | and they're very convincing. - It's pretty convincing, yeah.
01:00:55.500 | Even GPT-3, which is not state of the art.
01:00:58.980 | It's convincing of being a squirrel.
01:01:00.900 | It's like what it's like.
01:01:02.220 | I mean, you should check it out,
01:01:04.300 | 'cause it really is.
01:01:05.940 | It's like, yeah, that probably is what a squirrel would--
01:01:09.340 | - I would say. (laughs)
01:01:11.700 | - Are you excited?
01:01:12.540 | Like what's it like being a squirrel?
01:01:13.740 | It's fun.
01:01:14.620 | - Oh yeah, I get to eat nuts and run around all day.
01:01:17.220 | - Like how do you think people feel
01:01:20.540 | like when you tell them that you're a squirrel?
01:01:22.900 | You know, or like, I forget what it was.
01:01:25.620 | Like a lot of people might be scared
01:01:27.620 | to find out that you're a squirrel or something like this.
01:01:29.940 | And then the system answers pretty well.
01:01:33.340 | Like yeah, I hope they'll,
01:01:36.980 | like what do you think when they find out you're a squirrel?
01:01:39.980 | (laughs)
01:01:41.980 | I hope they'll see how fun it is to be a squirrel
01:01:44.580 | or something like that.
01:01:45.420 | - What do you say to people who don't believe you're a squirrel?
01:01:48.020 | I say, "Come see for yourself."
01:01:49.860 | I am a squirrel.
01:01:50.820 | - That's great.
01:01:51.980 | - Well, I think it's really great
01:01:53.260 | because the two things to note about it are,
01:01:57.300 | first of all, just because the machine
01:01:58.700 | is describing an experience
01:01:59.900 | doesn't mean it actually has that experience.
01:02:02.420 | But then secondly, these things are getting so advanced
01:02:05.300 | and so convincing at describing these things
01:02:08.860 | and talking to people.
01:02:10.660 | That's, I mean, just the implications for health,
01:02:15.300 | education, communication, entertainment, gaming.
01:02:19.660 | Like I just feel like all of the applications,
01:02:22.340 | it's mind boggling what we're gonna be able to do with this.
01:02:25.500 | And that my kids are not gonna remember a time
01:02:28.420 | before they could have conversations with artificial agents.
01:02:32.660 | - Do you think they would?
01:02:34.340 | Because to me, this is,
01:02:35.760 | the focus in the AI community has been,
01:02:39.380 | well, this engineer surely is hallucinating.
01:02:42.580 | The thing is not sentient.
01:02:44.860 | But to me, first of all,
01:02:48.940 | it doesn't matter if he is or not, this is coming.
01:02:52.620 | Where a large number of people
01:02:54.340 | would believe a system is sentient,
01:02:56.860 | including engineers within companies.
01:02:59.620 | So in that sense, you start to think about a world
01:03:02.420 | where your kids aren't just used
01:03:05.460 | to having a conversation with a bot,
01:03:07.340 | but used to believing,
01:03:08.980 | kind of having an implied belief that the thing is sentient.
01:03:13.980 | - Yeah, I think that's true.
01:03:15.780 | And I think that one of the things that bothered me
01:03:18.700 | about all of the coverage in the tech press
01:03:20.660 | about this incident,
01:03:21.780 | like obviously I don't believe the system is sentient.
01:03:25.020 | Like I think that it can convincingly describe that it is.
01:03:28.060 | I don't think it's doing what he thought it was doing
01:03:31.280 | and actually experiencing feelings.
01:03:33.260 | But a lot of the tech press was about how he was wrong
01:03:38.260 | and depicting him as kind of naive.
01:03:40.700 | And it's not naive.
01:03:42.460 | Like there's so much research in my field
01:03:45.180 | showing that people do this, even experts.
01:03:48.420 | They might be very clinical
01:03:50.260 | when they're doing human robot interaction experiments
01:03:52.700 | with a robot that they've built.
01:03:54.460 | And then you bring in a different robot
01:03:55.780 | and they're like, "Oh, look at it.
01:03:56.900 | It's having fun, it's doing this."
01:03:58.140 | Like that happens in our lab all the time.
01:04:01.380 | We are all this guy and it's gonna be huge.
01:04:06.380 | So I think that the goal is not to discourage
01:04:10.620 | this kind of belief or like design systems
01:04:13.780 | that people won't think are sentient.
01:04:15.460 | I don't think that's possible.
01:04:16.780 | I think you're right, this is coming.
01:04:18.740 | It's something that we have to acknowledge
01:04:21.280 | and even embrace and be very aware of.
01:04:25.100 | - So one of the really interesting perspectives
01:04:28.300 | that your book takes on a system like this
01:04:32.360 | is to see them, not to compare a system like this to humans,
01:04:35.960 | but to compare it to animals, of how we see animals.
01:04:39.880 | Can you kind of try to, again, sneak up,
01:04:42.400 | try to explain why this analogy is better
01:04:45.040 | than the human analogy, the analogy of robots as animals?
01:04:48.400 | - Yeah, and it gets trickier with the language stuff,
01:04:52.040 | but we'll get into that too.
01:04:54.800 | I think that animals are a really great thought experiment
01:04:59.740 | when we're thinking about AI and robotics,
01:05:01.180 | because again, this comparing them to humans
01:05:04.100 | that leads us down the wrong path,
01:05:05.820 | both because it's not accurate,
01:05:07.300 | but also I think for the future, we don't want that.
01:05:10.960 | We want something that's a supplement.
01:05:12.740 | But I think animals, because we've used them
01:05:15.160 | throughout history for so many different things,
01:05:17.140 | we domesticated them, not because they do what we do,
01:05:19.860 | but because what they do is different and that's useful.
01:05:23.180 | And it's just like,
01:05:25.320 | whether we're talking about companionship,
01:05:27.200 | whether we're talking about work integration,
01:05:29.800 | whether we're talking about responsibility for harm,
01:05:32.120 | there's just so many things we can draw on in that history
01:05:35.480 | from these entities that can sense, think,
01:05:38.280 | make autonomous decisions and learn
01:05:40.580 | that are applicable to how we should be thinking
01:05:43.400 | about robots and AI.
01:05:44.440 | And the point of the book is not
01:05:45.760 | that they're the same thing,
01:05:46.840 | that animals and robots are the same.
01:05:49.100 | Obviously, there are tons of differences there.
01:05:50.920 | Like you can't have a conversation with a squirrel, right?
01:05:54.820 | But the point is- - I do it all the time.
01:05:57.820 | - Oh, really?
01:05:58.740 | - By the way, squirrels are the cutest.
01:06:00.540 | I project so much on squirrels.
01:06:02.540 | I wonder what their inner life is.
01:06:04.380 | I suspect they're much bigger assholes than we imagine.
01:06:09.260 | - Really?
01:06:10.100 | - Like if it was a giant squirrel,
01:06:12.320 | it would fuck you over so fast if it had the chance.
01:06:15.180 | It would take everything you own.
01:06:16.460 | It would eat all your stuff,
01:06:18.180 | but 'cause it's small and the furry tail,
01:06:20.660 | the furry tail is a weapon
01:06:25.200 | against human consciousness and cognition.
01:06:28.260 | It wins us over.
01:06:29.140 | That's what cats do too.
01:06:30.620 | Cats out-competed squirrels.
01:06:33.560 | - And dogs.
01:06:35.020 | - Yeah.
01:06:35.860 | No, dogs have love.
01:06:37.000 | Cats have no soul.
01:06:39.460 | They, no, I'm just kidding.
01:06:40.900 | People get so angry when I talk shit about cats.
01:06:43.660 | I love cats.
01:06:44.620 | Anyway, so yeah, you're describing
01:06:47.940 | all the different kinds of animals that get domesticated.
01:06:51.580 | And it's a really interesting idea
01:06:54.100 | that it's not just sort of pets.
01:06:55.580 | There's all kinds of domestication going on.
01:06:57.540 | They all have all kinds of uses.
01:07:00.300 | - Yes.
01:07:01.140 | - Like the ox that you propose might be,
01:07:05.860 | at least historically,
01:07:06.700 | one of the most useful domesticated animals.
01:07:09.780 | - It was a game changer because it revolutionized
01:07:12.900 | what people could do economically, et cetera.
01:07:15.460 | So, I mean, just like robots,
01:07:17.700 | they're gonna change things economically.
01:07:20.820 | They're gonna change landscapes.
01:07:22.380 | Like cities might even get rebuilt
01:07:24.340 | around autonomous vehicles or drones or delivery robots.
01:07:28.700 | I think just the same ways that animals
01:07:30.840 | have really shifted society.
01:07:33.500 | And society has adapted also
01:07:35.060 | to like socially accepting animals as pets.
01:07:37.920 | I think we're gonna see very similar things with robots.
01:07:40.900 | So I think it's a useful analogy.
01:07:42.020 | It's not a perfect one,
01:07:42.860 | but I think it helps us get away from this idea
01:07:45.900 | that robots can, should, or will replace people.
01:07:49.140 | - If you remember,
01:07:50.180 | what are some interesting uses of animals?
01:07:52.460 | Ferrets, for example.
01:07:54.020 | - Oh yeah, the ferrets.
01:07:55.820 | They still do this.
01:07:56.660 | They use ferrets to go into narrow spaces
01:07:59.420 | that people can't go into, like a pipe.
01:08:01.460 | Or like they'll use them to run electrical wire.
01:08:03.500 | I think they did that for Princess Di's wedding.
01:08:05.780 | There's so many weird ways that--
01:08:07.740 | - Work is work.
01:08:08.580 | - We've used animals and still use animals
01:08:11.400 | for things that robots can't do.
01:08:13.620 | Like the dolphins that they used in the military.
01:08:18.020 | I think Russia still has dolphins
01:08:21.860 | and the US still has dolphins in their navies.
01:08:24.540 | Mine detection, looking for lost underwater equipment,
01:08:30.980 | some rumors about using them for weaponry.
01:08:33.960 | Which I think Russia's like, "Sure, believe that."
01:08:39.120 | And America's like, "No, no, we don't do that."
01:08:41.420 | Who knows?
01:08:42.780 | But they started doing that in like the 60s, 70s.
01:08:45.180 | They started training these dolphins
01:08:46.400 | because they were like, "Oh, dolphins have
01:08:48.300 | this amazing echolocation system
01:08:49.900 | that we can't replicate with machines
01:08:52.140 | and they're trainable.
01:08:52.980 | So we're gonna use them for all the stuff
01:08:54.460 | that we can't do with machines or by ourselves."
01:08:57.060 | And they've tried to phase out the dolphins.
01:08:59.180 | I know the US has invested a lot of money
01:09:02.620 | in trying to make robots do the mine detection.
01:09:05.860 | But like you were saying,
01:09:07.520 | there are some things that the robots are good at
01:09:09.580 | and there's some things that biological creatures
01:09:11.820 | are better at, so they still have the dolphins.
01:09:13.860 | - So there's also pigeons, of course.
01:09:16.020 | - Oh yeah, pigeons.
01:09:17.460 | Oh my gosh, there's so many examples.
01:09:19.340 | The pigeons were the original hobby photography drone.
01:09:23.900 | They also carried mail for thousands of years,
01:09:26.060 | letting people communicate with each other in new ways.
01:09:28.460 | So the thing that I like about the animal analogies,
01:09:31.140 | they have all these physical abilities
01:09:33.660 | but also sensing abilities
01:09:35.580 | that we don't have and that's just so useful.
01:09:40.580 | And that's robots, right?
01:09:43.040 | Robots have physical abilities.
01:09:44.640 | They can help us lift things
01:09:46.500 | or do things that we're not physically capable of.
01:09:48.500 | They can also sense things.
01:09:50.620 | It's just, I just feel like,
01:09:52.240 | I still feel like it's a really good analogy.
01:09:54.160 | - Yeah, it's really strong.
01:09:55.160 | - And it works because people are familiar with it.
01:09:58.780 | - What about companionship?
01:10:00.560 | And when we start to think about cats and dogs,
01:10:03.660 | they're pets that seem to serve no purpose whatsoever
01:10:06.860 | except the social connection.
01:10:08.820 | - Yeah, I mean, it's kind of a newer thing.
01:10:11.360 | At least in the United States,
01:10:13.860 | like dogs used to have,
01:10:16.900 | like they used to have a purpose.
01:10:18.100 | They used to be guard dogs or they had some sort of function.
01:10:21.560 | And then at some point they became just part of the family.
01:10:24.460 | And it's so interesting how there's some animals
01:10:30.620 | that we've treated as workers,
01:10:33.740 | some that we've treated as objects,
01:10:35.740 | some that we eat,
01:10:36.680 | and some that are parts of our families.
01:10:39.580 | And that's different across cultures.
01:10:41.640 | And I'm convinced that we're gonna see
01:10:43.540 | the same thing with robots,
01:10:45.040 | where people are gonna develop
01:10:47.640 | strong emotional connections to certain robots
01:10:49.660 | that they relate to, either culturally
01:10:51.300 | or personally, emotionally.
01:10:53.460 | And then there's gonna be other robots
01:10:54.740 | that we don't treat the same way.
01:10:57.200 | - I wonder, does that have to do more
01:11:00.260 | with the culture and the people or the robot design?
01:11:02.860 | Is there an interplay between the two?
01:11:04.740 | Like why did dogs and cats out-compete
01:11:09.740 | ox and I don't know, what else?
01:11:13.580 | Like farm animals to really get inside the home
01:11:16.940 | and get inside our hearts?
01:11:18.780 | - Yeah, I mean, people point to the fact
01:11:21.420 | that dogs are very genetically flexible
01:11:24.300 | and they can evolve much more quickly than other animals.
01:11:28.620 | And so they, evolutionary biologists think
01:11:32.940 | that dogs evolved to be more appealing to us.
01:11:36.220 | And then once we learned how to breed them,
01:11:38.220 | we started breeding them to be more appealing to us too,
01:11:40.780 | which is not something that we necessarily
01:11:43.180 | would be able to do with cows,
01:11:45.380 | although we've bred them to make more milk for us.
01:11:48.020 | So, but part of it is also culture.
01:11:50.780 | I mean, there are cultures where people eat dogs still today
01:11:53.540 | and then there's other cultures where we're like,
01:11:55.600 | oh no, that's terrible, we would never do that.
01:11:58.660 | And so I think there's a lot of different elements
01:12:00.700 | that play in.
01:12:01.780 | - I wonder if there's good,
01:12:03.220 | 'cause I understand dogs 'cause they use their eyes,
01:12:05.540 | they're able to communicate affection,
01:12:07.940 | all those kinds of things.
01:12:08.780 | It's really interesting what dogs do.
01:12:10.700 | There's a whole conferences on dog consciousness
01:12:13.660 | and cognition and all that kind of stuff.
01:12:16.500 | Now cats is a mystery to me
01:12:18.420 | because they seem to not give a shit about the human.
01:12:22.220 | - But they're warm and fluffy.
01:12:25.080 | - But they're also passive aggressive.
01:12:26.560 | So they're at the same time,
01:12:28.740 | they're dismissive of you in some sense.
01:12:33.620 | - I think some people like that about people.
01:12:36.880 | - Yeah, they want to push and pull of a relationship.
01:12:40.440 | - They don't want loyalty or unconditional love.
01:12:43.920 | That means they haven't earned it.
01:12:45.640 | - Yeah.
01:12:47.480 | And maybe that says a lot more about the people
01:12:51.720 | than it does about the animals.
01:12:53.040 | - Oh yeah, we all need therapy.
01:12:54.620 | - Yeah.
01:12:55.460 | So I'm judging harshly the people that have cats
01:12:59.620 | or the people that have dogs.
01:13:02.020 | Maybe the people that have dogs
01:13:03.840 | are desperate for attention and unconditional love
01:13:08.620 | and they're unable to sort of struggle
01:13:11.620 | to earn meaningful connections.
01:13:19.860 | I don't know.
01:13:22.020 | - Maybe people are talking about you
01:13:23.300 | and your robot pets in the same way.
01:13:26.100 | - Yeah, that's...
01:13:27.060 | It is kind of sad.
01:13:30.360 | There's just robots everywhere.
01:13:31.960 | But I mean, I'm joking about it being sad
01:13:34.300 | 'cause I think it's kind of beautiful.
01:13:35.340 | I think robots are beautiful in the same way that pets are,
01:13:40.060 | even children, in that they capture some kind of magic.
01:13:44.500 | Social robots,
01:13:46.040 | they have the capacity to have the same kind of magic
01:13:50.640 | of connection.
01:13:52.220 | I don't know what that is.
01:13:53.780 | Like when they're brought to life and they move around,
01:13:57.020 | the way they make me feel, I'm pretty convinced,
01:14:02.740 | is as you know, they will make billions of people feel.
01:14:07.740 | Like I don't think I'm like some weird robotics guy.
01:14:12.100 | I'm not.
01:14:13.300 | - I mean, you are, but not in this way.
01:14:15.020 | - Not in this way.
01:14:15.860 | I mean, I just, I can put on my like normal human hat
01:14:19.980 | and just see this, oh, this is like,
01:14:22.440 | there's a lot of possibility there of something cool,
01:14:25.840 | just like with dogs.
01:14:27.160 | Like what is it?
01:14:28.000 | Why are we so into dogs or cats?
01:14:30.360 | Like it's like, it's way different than us.
01:14:33.980 | - It is.
01:14:35.720 | - It's like drooling all over the place
01:14:37.520 | with its tongue out.
01:14:38.480 | It's like a weird creature that used to be a wolf.
01:14:42.280 | Why are we into this thing?
01:14:43.520 | - Well, dogs can either express or mimic a lot of emotions
01:14:49.240 | that we recognize.
01:14:50.840 | And I think that's a big thing.
01:14:53.080 | Like a lot of the magic of animals and robots
01:14:55.960 | is our own self projection.
01:14:59.080 | And the easier it is for us to see ourselves in something
01:15:03.360 | and project human emotions or qualities or traits onto it,
01:15:06.920 | the more we'll relate to it.
01:15:08.240 | And then you also have the movement, of course.
01:15:10.340 | I think that's also really,
01:15:11.720 | that's why I'm so interested in physical robots
01:15:14.160 | because that's, I think the visceral magic of them.
01:15:17.280 | I think we're, I mean, there's research showing
01:15:19.960 | that we're probably biologically hardwired
01:15:22.680 | to respond to autonomous movement in our physical space
01:15:26.480 | because we've had to watch out for predators
01:15:28.560 | or whatever the reason is.
01:15:29.840 | And so animals and robots are very appealing to us
01:15:34.800 | as these autonomously moving things
01:15:37.200 | that we view as agents instead of objects.
01:15:39.920 | - I mean, I love the moment,
01:15:42.280 | which is I've been particularly working on,
01:15:44.360 | which I've been particularly working on,
01:15:47.720 | which is when a robot, like the one with the cowboy hat,
01:15:52.160 | I mean, the way a dog does.
01:15:53.640 | And it looks like this.
01:15:56.000 | And the moment of recognition,
01:15:59.640 | like you're walking,
01:16:01.360 | say you're walking in an airport on the street
01:16:04.080 | and there's just hundreds of strangers,
01:16:08.040 | but then you see somebody you know,
01:16:09.320 | and that like, well, you wake up to like that excitement
01:16:14.320 | of seeing somebody you know and saying hello
01:16:16.240 | and all that kind of stuff.
01:16:17.540 | That's a magical moment.
01:16:19.040 | Like I think, especially with the dog,
01:16:22.160 | it makes you feel noticed and heard and loved.
01:16:27.400 | Like that somebody looks at you and recognizes you
01:16:30.560 | that it matters that you exist.
01:16:33.480 | - Yeah, you feel seen.
01:16:34.680 | - Yeah, and that's a cool feeling.
01:16:36.760 | And I honestly think robots can give that feeling too.
01:16:39.520 | - Oh yeah, totally.
01:16:41.240 | - Currently Alexa, I mean,
01:16:42.880 | one of the downsides of these systems is they don't,
01:16:46.000 | they're servants.
01:16:48.720 | Part of the, you know,
01:16:52.440 | they're trying to maintain privacy, I suppose,
01:16:55.640 | but I don't feel seen with Alexa, right?
01:17:00.640 | - I think that's gonna change.
01:17:01.920 | I think you're right.
01:17:03.120 | And I think that that's the game changing nature
01:17:06.360 | of things like these large language learning models.
01:17:09.440 | And the fact that these companies are investing
01:17:11.880 | in embodied versions that move around of Alexa, like Astro.
01:17:16.880 | - Can I just say, yeah, I haven't, is that out?
01:17:22.160 | - I mean, it's out.
01:17:23.000 | You can't just like buy one commercially yet,
01:17:25.280 | but you can apply for one.
01:17:26.960 | - Yeah.
01:17:28.600 | My gut says that these companies don't have the guts
01:17:33.600 | to do the personalization.
01:17:38.680 | This goes to the, because it's edgy, it's dangerous.
01:17:42.520 | It's gonna make a lot of people very angry.
01:17:44.680 | Like in a way that, you know, just imagine, okay.
01:17:49.360 | All right.
01:17:50.360 | If you do the full landscape of human civilization,
01:17:53.640 | just visualize the number of people
01:17:56.800 | that are going through breakups right now.
01:17:58.680 | Just the amount of really passionate,
01:18:01.400 | just even if we just look at teenagers,
01:18:03.400 | the amount of deep heartbreak that's happening.
01:18:07.040 | And like, if you're going to have Alexa
01:18:09.320 | have more of a personal connection with the human,
01:18:13.160 | you're gonna have humans that like have existential crises.
01:18:16.720 | There's a lot of people that suffer
01:18:18.000 | from loneliness and depression.
01:18:20.040 | And like, you're now taking on the full responsibility
01:18:24.440 | of being a companion to the rollercoaster
01:18:27.680 | of the human condition.
01:18:29.160 | As a company, like imagine PR and marketing people,
01:18:32.120 | they're gonna freak out.
01:18:33.240 | They don't have the guts.
01:18:34.760 | It's gonna have to come from somebody
01:18:36.760 | from a new Apple, from those kinds of folks,
01:18:39.120 | like a small startup.
01:18:41.000 | - And it might.
01:18:42.000 | - Yeah.
01:18:42.840 | - Like they're coming.
01:18:43.660 | There's already virtual therapists.
01:18:44.500 | There's that Replika app.
01:18:46.040 | I haven't tried it, but Replika's like a virtual companion.
01:18:49.240 | Like it's coming.
01:18:51.200 | And if big companies don't do it, someone else will.
01:18:53.800 | - Yeah, I think the future,
01:18:55.880 | the next trillion dollar company,
01:18:57.440 | will be about those personalizations.
01:18:59.280 | 'Cause if you think,
01:19:00.580 | if you think about all the AI we have around us,
01:19:06.440 | all the smartphones and so on,
01:19:08.200 | there's very minimal personalization.
01:19:10.960 | - You don't think that's just because they weren't able?
01:19:14.200 | - No.
01:19:15.040 | - Really?
01:19:15.860 | - I don't think they have the guts.
01:19:17.400 | - I mean, it might be true, but I have to wonder,
01:19:19.440 | I mean, Google is clearly gonna do something
01:19:22.580 | with the language, I mean.
01:19:24.560 | - They don't have the guts.
01:19:25.800 | (laughs)
01:19:26.880 | - Are you challenging them?
01:19:28.840 | - Partially, but not really.
01:19:30.320 | 'Cause I know they're not gonna do it.
01:19:32.160 | - I mean.
01:19:33.000 | - They don't have to, it's bad for business in the short term.
01:19:35.440 | - I'm gonna be honest, maybe it's not such a bad thing
01:19:38.960 | if they don't just roll this out quickly,
01:19:41.360 | because I do think there are huge issues.
01:19:44.920 | - Yeah.
01:19:45.760 | - And there's, and not just issues
01:19:48.680 | with the responsibility of unforeseen effects on people,
01:19:53.080 | but what's the business model?
01:19:56.520 | And if you are using the business model
01:19:59.520 | that you've used in other domains,
01:20:01.920 | then you're gonna have to collect data from people,
01:20:04.540 | which you will anyway to personalize the thing,
01:20:06.960 | and you're gonna be somehow monetizing the data,
01:20:10.000 | or you're gonna be doing some like ad model.
01:20:12.200 | It just, it seems like now we're suddenly getting
01:20:15.000 | into the realm of like severe consumer protection issues,
01:20:18.040 | and I'm really worried about that.
01:20:22.120 | I see massive potential for this technology to be used
01:20:25.680 | in a way that's not for the public good,
01:20:28.080 | and not, I mean, that's in an individual user's interest
01:20:32.280 | maybe, but not in society's interest.
01:20:35.060 | - Yeah, see, I think that kind of personalization
01:20:39.980 | should be like redefine how we treat data.
01:20:44.980 | I think you should own all the data your phone knows
01:20:49.080 | about you, like, and be able to delete it
01:20:51.980 | with a single click, and walk away.
01:20:55.380 | And that data cannot be monetized, or used,
01:20:59.340 | or shared anywhere without your permission.
01:21:02.100 | I think that's the only way people will trust you
01:21:04.800 | to give, for you to use that data.
01:21:07.920 | - But then how are companies gonna,
01:21:10.040 | I mean, a lot of these applications rely
01:21:12.240 | on massive troves of data to train the AI system.
01:21:17.240 | - Right, so you have to opt in constantly,
01:21:22.440 | and opt in not in some legal, I agree,
01:21:25.560 | but obvious, like show, like in the way I opt in
01:21:30.560 | and tell you a secret.
01:21:32.560 | Like, we understand, like, that, like,
01:21:37.120 | I have to choose, like, how well do I know you?
01:21:39.760 | And then I say, like, don't tell this to anyone.
01:21:42.540 | (laughs)
01:21:44.680 | And then I have to judge how leaky that,
01:21:47.720 | like, how good you are at keeping secrets.
01:21:50.000 | In that same way, like, it's very transparent
01:21:53.000 | in which data you're allowed to use for which purposes.
01:21:56.920 | - That's what people are saying is the solution,
01:21:58.800 | and I think that works to some extent,
01:22:00.480 | having transparency, having people consent.
01:22:03.760 | I think it breaks down at the point at which,
01:22:07.400 | we've seen this happen on social media, too,
01:22:08.840 | like, people are willingly giving up their data
01:22:10.720 | because they're getting a functionality from that,
01:22:13.920 | and then the harm that that causes is on a,
01:22:17.080 | like, maybe to someone else, and not to them personally.
01:22:21.280 | - I don't think people are giving their data.
01:22:22.880 | They're not being asked.
01:22:24.080 | Like, it's not consensual. - But if you were asked,
01:22:27.840 | if you were like, tell me a secret about yourself
01:22:31.120 | and I'll give you $100, I'd tell you a secret.
01:22:34.120 | - No, not $100.
01:22:34.960 | First of all, you wouldn't.
01:22:38.200 | You wouldn't trust, like, why are you giving me $100?
01:22:41.360 | - It's a bad example.
01:22:43.040 | - But, like, I would ask for your specific,
01:22:48.040 | like, fashion interests in order to give recommendations
01:22:54.080 | to you for shopping, and I'd be very clear about that,
01:22:57.120 | and you could disable that, you can delete that.
01:23:00.480 | But then you can be, have a deep, meaningful,
01:23:04.280 | rich connection with the system
01:23:06.280 | about what you think you look fat in,
01:23:08.280 | what you look great in, what, like,
01:23:11.200 | the full history of all the things you've worn,
01:23:14.320 | whether you regret the Justin Bieber
01:23:17.800 | or enjoy the Justin Bieber shirt,
01:23:19.640 | all of that information that's mostly private to you,
01:23:23.800 | not even shared with your loved ones, a system should have that,
01:23:26.640 | 'cause then a system, if you trust it,
01:23:29.680 | to keep control of that data that you own,
01:23:32.400 | you could walk away with,
01:23:33.240 | that system could tell you a damn good thing to wear.
01:23:35.840 | - It could, and the harm that I'm concerned about
01:23:40.240 | is not that the system is gonna then suggest a dress for me
01:23:43.240 | that is based on my preferences.
01:23:45.120 | So I went to this conference once
01:23:47.520 | where I was talking to the people who do the analytics
01:23:49.880 | in, like, the big ad companies,
01:23:52.000 | and, like, literally a woman there was like,
01:23:55.040 | I can ask you three totally unrelated questions
01:23:58.480 | and tell you what menstrual product you use.
01:24:01.520 | And so what they do is they aggregate the data
01:24:04.240 | and they map out different personalities
01:24:06.240 | and different people and demographics,
01:24:08.400 | and then they have a lot of power and control
01:24:10.960 | to market to people.
01:24:13.080 | So, like, I might not be sharing my data
01:24:15.180 | with any of the systems because I'm like,
01:24:17.360 | I'm on Twitter, I know that this is bad.
01:24:19.360 | Other people might be sharing data
01:24:23.120 | that can be used against me.
01:24:25.200 | Like, I think it's way more complex than just
01:24:30.000 | I share a piece of personal information
01:24:32.560 | and it gets used against me.
01:24:33.960 | I think that at a more systemic level,
01:24:36.800 | and then it's always, you know,
01:24:38.520 | vulnerable populations that are targeted by this,
01:24:41.560 | low-income people being targeted for scammy loans,
01:24:45.000 | or, I don't know, like, I could get targeted,
01:24:48.600 | like, someone, not me,
01:24:51.200 | because someone who doesn't have kids yet
01:24:53.760 | and is my age could get targeted for, like,
01:24:56.360 | freezing their eggs.
01:24:57.480 | And there's all these ways that you can manipulate people
01:24:59.720 | where it's not really clear that that came from
01:25:02.880 | that person's data,
01:25:06.760 | it came from all of us, all of us opting into this.
01:25:11.320 | - But there's a bunch of sneaky decisions along the way
01:25:15.200 | that could be avoided if there's transparency.
01:25:18.040 | So one of the ways that goes wrong is
01:25:21.240 | if you share that data with too many ad networks.
01:25:23.800 | Don't run your own ad network.
01:25:27.960 | Don't share with anybody.
01:25:29.200 | - Okay, and that's something that you could regulate.
01:25:32.960 | - That belongs to just you,
01:25:35.080 | and all of the ways you allow the company to use it,
01:25:38.360 | the default is in no way at all.
01:25:40.300 | And you are consciously, constantly
01:25:43.720 | saying exactly how to use it.
01:25:46.640 | And also, it has to do with the recommender system itself
01:25:51.640 | from the company, like the freezing-your-eggs recommendation.
01:25:57.240 | If that doesn't make you happy,
01:26:00.080 | if that idea doesn't make you happy,
01:26:02.480 | then the system shouldn't recommend it.
01:26:04.280 | It should be very good at learning.
01:26:06.960 | So not the kind of things that the category of people
01:26:11.280 | it thinks you belong to would do,
01:26:13.800 | but more you specifically,
01:26:16.320 | what makes you happy, what is helping you grow.
01:26:19.320 | - But you're assuming that people's preferences
01:26:21.600 | and what makes them happy is static.
01:26:24.120 | Whereas when we were talking before
01:26:25.680 | about how a company like Apple
01:26:28.320 | can tell people what they want,
01:26:31.760 | and they will start to want it,
01:26:33.700 | that's the thing that I'm more concerned about.
01:26:36.320 | - Yeah, that is a huge problem.
01:26:37.560 | It's not just listening to people,
01:26:39.620 | but manipulating them into wanting something.
01:26:42.280 | - And that's like, we have a long history
01:26:45.240 | of using technology for that purpose.
01:26:48.200 | Like the persuasive design in casinos
01:26:50.440 | to get people to gamble more.
01:26:52.040 | It's just, I'm...
01:26:55.840 | The other thing that I'm worried about
01:26:59.000 | is as we have more social technology,
01:27:02.480 | suddenly you have this on a new level.
01:27:05.200 | If you look at the influencer marketing
01:27:07.400 | that happens online now.
01:27:09.400 | - What's influencer marketing?
01:27:10.400 | - So like on Instagram, there will be some person
01:27:14.040 | who has a bunch of followers.
01:27:15.880 | And then a brand will hire them to promote some product.
01:27:20.000 | And it's above board.
01:27:21.440 | They disclose, this is an ad that I'm promoting,
01:27:24.740 | but they have so many young followers
01:27:26.600 | who deeply admire and trust them.
01:27:29.000 | I mean, this must work for you too.
01:27:30.480 | Don't you have ads on the podcast?
01:27:32.400 | People trust you.
01:27:33.320 | - Magic Spoon cereal, low carb, yes.
01:27:37.360 | - If you say that, I guarantee you
01:27:38.960 | some people will buy that just because
01:27:40.760 | even though they know that you're being paid,
01:27:42.880 | they trust you.
01:27:44.720 | - Yeah, it's different with podcasts
01:27:46.680 | 'cause well, my particular situation,
01:27:48.760 | but it's true for a lot of podcasts,
01:27:49.960 | especially big ones is, you know,
01:27:52.600 | I have 10 times more sponsors that wanna be sponsors
01:27:57.160 | than I have...
01:28:00.080 | - So you get to select the ones
01:28:00.920 | that you actually wanna support.
01:28:02.560 | - And so like you end up using it
01:28:04.440 | and then you're able to actually...
01:28:06.560 | Like there's no incentive to like shill for anybody.
01:28:12.360 | - Sure, and that's why it's fine
01:28:14.800 | when it's still human influencers.
01:28:17.320 | Now, if you're a bot, you're not gonna discriminate.
01:28:21.600 | You're not gonna be like,
01:28:23.480 | oh, well, I think this product is good for people.
01:28:27.000 | - You think there'll be like bots essentially
01:28:30.520 | with millions of followers.
01:28:32.200 | - There already are.
01:28:33.560 | There are virtual influencers in South Korea
01:28:36.880 | who shill products.
01:28:37.960 | And like that's just the tip of the iceberg
01:28:41.000 | because that's still very primitive.
01:28:42.520 | Now with the new image generation
01:28:44.320 | and the language learning models.
01:28:46.480 | And like, so we're starting to do some research
01:28:49.840 | around kids and like young adults
01:28:53.040 | because a lot of the research
01:28:56.080 | on like what's okay to advertise to kids
01:28:58.240 | and what is too manipulative
01:28:59.800 | has to do with television ads.
01:29:01.800 | Back in the day where like a kid who's 12 understands,
01:29:04.560 | oh, that's an advertisement.
01:29:06.080 | I can distinguish that from entertainment.
01:29:07.600 | I know it's trying to sell me something.
01:29:09.160 | Now it's getting really, really murky with influencers.
01:29:12.880 | And then if you have like a bot
01:29:15.200 | that a kid has developed a relationship with,
01:29:18.080 | is it okay to market products through that or not?
01:29:20.760 | Like you're getting into
01:29:21.640 | all these consumer protection issues
01:29:23.240 | because you're developing a trusted relationship
01:29:26.640 | with a social entity, but it's...
01:29:30.640 | And so now it's like personalized, it's scalable,
01:29:34.880 | it's automated and it can...
01:29:39.880 | So some of the research showing
01:29:42.640 | that kids are already very confused
01:29:44.080 | about like the incentives of the company
01:29:46.400 | versus what the robot is doing.
01:29:48.540 | - Meaning they're not deeply understanding
01:29:54.840 | the incentives of the system.
01:29:57.480 | - Well, yeah, so like kids who are old enough to understand
01:30:01.040 | this is a television advertisement
01:30:02.640 | is trying to advertise to me.
01:30:04.080 | I might still decide I want this product,
01:30:05.400 | but they understand what's going on.
01:30:06.480 | So there's some transparency there.
01:30:09.060 | Kids that age, so Daniela Di Paola and Anastasia Ostrovsky
01:30:14.060 | did this project, and I advised on it.
01:30:18.440 | They asked kids who had interacted with social robots,
01:30:21.340 | whether they would like a policy that allows robots
01:30:27.360 | to market to people through casual conversation,
01:30:31.140 | or whether they would prefer
01:30:32.360 | that it has to be transparent,
01:30:34.280 | that it's like an ad coming from a company.
01:30:36.680 | And the majority said they preferred the casual conversation.
01:30:40.360 | And when asked why, there was a lot of confusion about,
01:30:43.000 | they were like, well, the robot knows me better
01:30:45.240 | than the company does.
01:30:46.300 | So the robot's only gonna market things that I like.
01:30:49.760 | And so they don't really, they're not connecting the fact
01:30:52.440 | that the robot is an agent of the company.
01:30:54.880 | They're viewing it as something separate.
01:30:57.000 | And I think that even happens subconsciously with grownups
01:31:00.140 | when it comes to robots and artificial agents.
01:31:02.200 | And it will, like this Blake guy at Google,
01:31:04.360 | sorry I'm going on and on, but like his main concern
01:31:07.840 | was that Google owned this sentient agent
01:31:10.800 | and that it was being mistreated.
01:31:12.600 | His concern was not that the agent
01:31:14.000 | was gonna mistreat people.
01:31:15.720 | So I think we're gonna see a lot of this.
01:31:19.160 | - Yeah, but shitty companies will do that.
01:31:21.480 | I think ultimately that confusion should be alleviated
01:31:25.640 | by the robot should actually know you better
01:31:28.920 | and should not have any control from the company.
01:31:32.360 | - But what's the business model for that?
01:31:34.720 | - If you use the robot to buy, first of all,
01:31:38.440 | the robot should probably cost money.
01:31:41.120 | - Should what?
01:31:41.960 | - Cost money, like the way Windows operating system does.
01:31:44.640 | I see it more like an operating system.
01:31:46.940 | Then like this thing is your window,
01:31:51.940 | no pun intended, into the world.
01:31:55.640 | So it's helping you, it's like a personal assistant.
01:31:58.820 | Right, and so that should cost money.
01:32:01.720 | You should, you know, whatever it is, 10 bucks, 20 bucks.
01:32:06.000 | Like that's the thing that makes
01:32:07.200 | your life significantly better.
01:32:09.120 | This idea that everything should be free is wrong.
01:32:12.640 | Like, it should actually help educate you.
01:32:14.720 | It should talk shit about all the other companies
01:32:16.480 | that do stuff for free.
01:32:18.100 | But also, yeah, in terms of if you purchase stuff
01:32:22.960 | based on its recommendation, it gets money.
01:32:25.200 | So it's kind of ad driven, but it's not ads.
01:32:30.720 | It's like, it's not controlled.
01:32:35.720 | Like no external entities can control it
01:32:41.080 | to try to manipulate you to want a thing.
01:32:45.400 | - That would be amazing.
01:32:46.320 | - It's actually trying to discover what you want.
01:32:48.880 | So it's not allowed to have any influence.
01:32:52.640 | No promoted ad, no anything.
01:32:55.880 | So it's finding, I don't know,
01:32:58.600 | the thing that would actually make you happy.
01:33:01.400 | That's the only thing it cares about.
01:33:03.760 | I think companies like this can win out.
01:33:08.200 | - Yes, I think eventually,
01:33:09.600 | once people understand the value of the robot,
01:33:14.320 | even just, like I think that robots
01:33:16.860 | would be valuable to people,
01:33:18.000 | even if they're not marketing something
01:33:20.760 | or helping with like preferences or anything.
01:33:23.000 | Like just a simple, the same thing as a pet,
01:33:25.960 | like a dog that has no function
01:33:28.400 | other than being a member of your family.
01:33:29.920 | I think robots could really be that
01:33:31.360 | and people would pay for that.
01:33:32.880 | I don't think the market realizes that yet.
01:33:35.840 | And so my concern is that companies
01:33:38.080 | are not gonna go in that direction,
01:33:40.000 | at least not yet, of making like this contained thing
01:33:43.720 | that you buy.
01:33:44.720 | It seems almost old fashioned, right,
01:33:46.760 | to have a disconnected object that you buy,
01:33:51.760 | that you're not like paying a subscription for.
01:33:53.720 | It's not like controlled by one of the big corporations.
01:33:56.040 | - But that's the old fashioned things
01:33:57.400 | that people yearn for,
01:34:01.940 | because I think it's very popular now
01:34:05.760 | and people understand the negative effects of social media,
01:34:09.040 | the negative effects of the data being used
01:34:11.920 | in all these kinds of ways.
01:34:13.080 | I think we're just waking up to the realization
01:34:16.360 | we tried, we're like baby deer,
01:34:19.120 | finding our legs in this new world of social media,
01:34:21.720 | of ad driven companies and realizing,
01:34:24.600 | okay, this has to be done somehow different.
01:34:27.120 | - Yeah.
01:34:27.960 | - Like one of the most popular notions,
01:34:31.480 | at least in the United States,
01:34:32.560 | is social media is evil and it's doing bad by us.
01:34:36.520 | It's not like it's totally tricked us
01:34:39.760 | into believing that it's good for us.
01:34:42.560 | I think everybody knows it's bad for us.
01:34:44.280 | And so there's a hunger for other ideas.
01:34:47.440 | - All right, it's time for us to start that company.
01:34:49.320 | - I think so.
01:34:50.160 | - Let's do it.
01:34:50.980 | - I think let's go.
01:34:51.820 | - Hopefully no one listens to this and steals the idea.
01:34:54.160 | - There's no, see that's the other thing.
01:34:55.720 | I think I'm a big person on,
01:34:58.080 | execution is what matters.
01:35:01.560 | I mean, it's like ideas are kind of cheap.
01:35:04.600 | The social robotics is a good example
01:35:06.160 | that there's been so many amazing companies
01:35:08.920 | that went out of business.
01:35:11.120 | I mean, to me it's obvious,
01:35:12.600 | like it's obvious that there will be a robotics company
01:35:17.600 | that puts a social robot in the home of billions of homes.
01:35:22.240 | - Yeah.
01:35:23.080 | - And it'll be a companion.
01:35:24.720 | Okay, there you go.
01:35:25.560 | You can steal that idea.
01:35:27.360 | Do it.
01:35:28.200 | - Okay, I have a question for you.
01:35:31.000 | What about Elon Musk's humanoid?
01:35:33.720 | Is he gonna execute on that?
01:35:35.240 | - There might be a lot to say.
01:35:38.760 | So for people who are not aware,
01:35:40.280 | there's an Optimus, Tesla's Optimus robot that's,
01:35:43.720 | I guess the stated reason for that robot
01:35:48.080 | is a humanoid robot in the factory
01:35:50.000 | that's able to automate some of the tasks
01:35:52.520 | that humans are currently doing.
01:35:54.240 | And the reason you wanna do,
01:35:55.600 | it's the second reason you mentioned,
01:35:57.080 | the reason you wanna do a humanoid robot
01:35:59.840 | is because the factory's built for,
01:36:01.640 | there's certain tasks that are designed for humans.
01:36:05.600 | So it's hard to automate with any other form factor
01:36:08.560 | than a humanoid.
01:36:09.840 | And then the other reason is because so much effort
01:36:13.760 | has been put into this giant data engine machine
01:36:16.840 | of perception that's inside Tesla autopilot
01:36:21.160 | that's seemingly, at least the machine, if not the data,
01:36:25.320 | is transferable to the factory setting, to any setting.
01:36:28.880 | - Yeah, he said it would do anything that's boring to us.
01:36:32.520 | - Yeah, yeah.
01:36:34.320 | The interesting thing about that
01:36:36.640 | is there's no interest
01:36:40.440 | and no discussion about the social aspect.
01:36:43.880 | Like I talked to him on mic and off mic about it quite a bit.
01:36:50.080 | And there's not a discussion about like,
01:36:55.080 | to me it's obvious if a thing like that works at all,
01:37:01.360 | at all.
01:37:02.520 | In fact, it has to work really well in a factory.
01:37:06.600 | If it works kind of shitty, it's much more useful in the home
01:37:10.000 | 'cause we're much,
01:37:13.040 | I think being shitty at stuff
01:37:16.480 | is kind of what makes relationships great.
01:37:20.680 | Like you wanna be flawed
01:37:24.280 | and be able to communicate your flaws
01:37:26.040 | and be unpredictable in certain ways.
01:37:28.640 | Like if you fell over every once in a while
01:37:30.920 | for no reason whatsoever,
01:37:32.200 | I think that's essential for like--
01:37:34.960 | - Very charming.
01:37:36.280 | - It's charming, but also concerning
01:37:38.080 | and also like, are you okay?
01:37:42.320 | I mean, it's both hilarious.
01:37:45.200 | Whenever somebody you love like falls down the stairs,
01:37:47.680 | it was both hilarious and concerning.
01:37:50.720 | It's some dance between the two.
01:37:54.160 | And I think that's essential for like,
01:37:56.520 | you almost wanna engineer that in,
01:37:58.880 | except you don't have to
01:37:59.960 | because of robotics in the physical space
01:38:02.920 | is really difficult.
01:38:04.520 | So I think I've learned to not discount
01:38:09.520 | the efforts that Elon does.
01:38:13.960 | There's a few things that are really interesting there.
01:38:16.880 | One, because he's taking it extremely seriously.
01:38:21.880 | What I like is the humanoid form,
01:38:24.640 | the cost of building a robot.
01:38:25.800 | I talked to Jim Keller offline about this a lot.
01:38:28.480 | And currently humanoid robots cost a lot of money.
01:38:31.400 | And the way they're thinking about it,
01:38:34.560 | now they're not talking about all the social robotics stuff
01:38:36.800 | that you and I care about.
01:38:38.240 | They are thinking, how can we manufacture this thing cheaply
01:38:43.800 | and do it like well?
01:38:45.280 | And the kind of discussions they're having
01:38:47.600 | is really great engineering.
01:38:49.440 | It's like first principles question of like,
01:38:52.800 | why is this cost so much?
01:38:54.520 | Like, what's the cheap way?
01:38:56.480 | Why can't we build?
01:38:57.640 | And there's not a good answer.
01:38:59.840 | Why can't we build this humanoid form for under $1,000?
01:39:03.640 | And like, I've sat and had these conversations.
01:39:07.320 | There's no reason.
01:39:08.280 | I think the reason they've been so expensive
01:39:11.800 | is because they were focused on trying to,
01:39:15.880 | they weren't focused on doing the mass manufacture.
01:39:20.200 | People are focused on getting a thing that's,
01:39:23.000 | I don't know exactly what the reasoning is,
01:39:25.760 | but it's the same like Waymo.
01:39:27.800 | It's like, let's build a million dollar car
01:39:30.520 | in the beginning, or like multi-million dollar car.
01:39:33.560 | Let's try to solve that problem.
01:39:36.200 | The way Elon, the way Jim Keller,
01:39:38.320 | the way some of those folks are thinking is,
01:39:40.960 | let's like at the same time,
01:39:43.000 | try to actually build a system that's cheap.
01:39:46.320 | Not crappy, but cheap.
01:39:47.560 | And let's, first principles,
01:39:49.640 | what is the minimum amount of degrees of freedom we need?
01:39:54.560 | What are the joints?
01:39:55.680 | Where's the control set?
01:39:57.120 | Like how many, how do we, like where are the actuators?
01:40:00.600 | What's the way to power this
01:40:03.000 | in the lowest cost way possible?
01:40:04.920 | But also in a way that's like actually works.
01:40:07.400 | How do we make the whole thing not a pile of components
01:40:10.440 | where there's a supply chain,
01:40:11.680 | and you have to have all these different parts
01:40:14.040 | feeding in?
01:40:15.000 | Do it all from scratch and do the learning.
01:40:19.080 | I mean, it's like immediately certain things
01:40:22.120 | like become obvious.
01:40:23.560 | Do the exact same pipeline as you do for autonomous driving.
01:40:27.400 | Just the exact, I mean,
01:40:28.640 | the infrastructure there is incredible.
01:40:31.000 | For the computer vision, for the manipulation task,
01:40:33.760 | the control problem changes,
01:40:34.920 | the perception problem changes,
01:40:37.880 | but the pipeline doesn't change.
01:40:40.400 | Do it.
01:40:41.400 | And so I don't,
01:40:42.240 | obviously the optimism about how long it's gonna take,
01:40:47.240 | I don't share.
01:40:48.360 | But it's a really interesting problem.
01:40:51.560 | And I don't wanna say anything
01:40:53.280 | because my first gut is to say that
01:40:56.800 | why the humanoid form, that doesn't make sense.
01:40:58.960 | - Yeah, that's my second gut too, but.
01:41:01.960 | - But then there's a lot of people
01:41:04.000 | that are really excited about the humanoid form there.
01:41:06.080 | - That's true. - And it's like,
01:41:06.920 | I don't wanna get in the way,
01:41:08.760 | like they might solve this thing.
01:41:10.520 | And they might, it's like similar with Boston Dynamics.
01:41:13.880 | Like, why?
01:41:15.160 | Like, if I were to, you can be a hater
01:41:17.360 | and you go up to Marc Raibert and just,
01:41:21.120 | like, how are you gonna make money
01:41:22.640 | with these super expensive legged robots?
01:41:25.440 | What's your business plan?
01:41:27.160 | This doesn't make any sense.
01:41:28.200 | Why are you doing these legged robots?
01:41:30.160 | But at the same time, they're pushing forward
01:41:33.440 | the science, the art of robotics
01:41:35.760 | in a way that nobody else does.
01:41:37.560 | - Yeah.
01:41:38.400 | - And with Elon, they're not just going to do that,
01:41:43.040 | they're gonna drive down the cost
01:41:44.800 | to where we can have humanoid bots in the home potentially.
01:41:49.360 | - So the part I agree with is,
01:41:52.360 | a lot of people find it fascinating
01:41:54.000 | and it probably also attracts talent
01:41:56.200 | who wanna work on humanoid robots.
01:41:57.960 | I think it's a fascinating scientific problem
01:42:00.400 | and engineering problem,
01:42:01.760 | and it can teach us more about human body
01:42:04.840 | and locomotion and all of that.
01:42:06.040 | I think there's a lot to learn from it.
01:42:07.960 | Where I get tripped up is why we need them
01:42:11.920 | for anything other than art and entertainment
01:42:13.840 | in the real world.
01:42:15.480 | Like, I get that there are some areas
01:42:18.920 | where you can't just rebuild like a spaceship.
01:42:23.920 | You can't just like,
01:42:25.960 | they've worked for so many years on these spaceships,
01:42:28.040 | you can't just re-engineer it.
01:42:30.480 | You have some things that are just built for human bodies,
01:42:33.160 | a submarine, a spaceship.
01:42:34.800 | But a factory, maybe I'm naive,
01:42:38.680 | but it seems like we've already rebuilt factories
01:42:41.000 | to accommodate other types of robots.
01:42:43.720 | Why would we want to just like make a humanoid robot
01:42:46.560 | to go in there?
01:42:48.400 | I just get really tripped up on,
01:42:50.560 | I think that people want humanoids.
01:42:53.360 | I think people are fascinated by them.
01:42:56.600 | I think it's a little overhyped.
01:42:58.880 | - Well, most of our world is still built for humanoids.
01:43:03.200 | - I know, but it shouldn't be.
01:43:04.080 | It should be built so that it's wheelchair accessible.
01:43:07.040 | - Right, so the question is,
01:43:08.440 | do you build a world that's the general form
01:43:11.800 | of wheelchair accessible?
01:43:15.640 | All robot form factor accessible?
01:43:18.560 | Or do you build humanoid robots?
01:43:22.080 | - I mean, it doesn't have to be all,
01:43:23.880 | and it also doesn't have to be either or.
01:43:26.280 | I just feel like we're thinking so little
01:43:28.160 | about the system in general
01:43:30.160 | and how to create infrastructure that works for everyone,
01:43:34.480 | all kinds of people, all kinds of robots.
01:43:36.560 | I mean, it's more of an investment,
01:43:39.360 | but that would pay off way more in the future
01:43:42.520 | than just trying to cram expensive
01:43:45.680 | or maybe slightly less expensive humanoid technology
01:43:48.640 | into a human space.
01:43:50.040 | - Unfortunately, one company can't do that.
01:43:51.880 | We have to work together.
01:43:53.120 | It's like autonomous driving can be easily solved
01:43:55.640 | if you do V2I, if you change the infrastructure
01:43:59.400 | of cities and so on, but that requires a lot of people.
01:44:04.400 | A lot of them are politicians,
01:44:06.040 | and a lot of them are somewhat, if not a lot, corrupt,
01:44:09.120 | and all those kinds of things.
01:44:11.960 | And the talent thing you mentioned
01:44:13.840 | is really, really, really important.
01:44:16.080 | I've gotten a chance to meet a lot of folks
01:44:19.000 | at SpaceX and Tesla, other companies too,
01:44:22.400 | but they're specifically, the openness makes it easier
01:44:25.320 | to meet everybody.
01:44:26.760 | I think a lot of amazing things in this world happen
01:44:32.200 | when you get amazing people together.
01:44:34.360 | And if you can sell an idea,
01:44:36.200 | like us becoming a multi-planetary species,
01:44:39.480 | you can say, "Why the hell would I go to Mars?
01:44:43.280 | "Like why colonize Mars?"
01:44:45.440 | If you think from basic first principles,
01:44:50.000 | it doesn't make any sense.
01:44:51.360 | It doesn't make any sense to go to the moon.
01:44:55.400 | The only thing that makes sense
01:44:58.720 | to go to space is for satellites.
01:45:00.480 | But there's something about the vision of the future,
01:45:06.240 | the optimism laden that permeates this vision
01:45:10.020 | of us becoming multi-planetary.
01:45:12.000 | It's thinking not just for the next 10 years,
01:45:14.120 | it's thinking like human civilization
01:45:17.640 | reaching out into the stars.
01:45:19.460 | It makes people dream.
01:45:21.280 | It's really exciting.
01:45:23.040 | And that, they're gonna come up with some cool shit
01:45:25.960 | that might not have anything to do with,
01:45:28.840 | here's what I, 'cause Elon doesn't seem to care
01:45:32.720 | about social robotics, which is constantly surprising to me.
01:45:36.080 | I talk to him, he doesn't,
01:45:37.480 | humans are the things you avoid and don't hurt, right?
01:45:42.480 | Like that's, like the number one job of a robot
01:45:45.840 | is not to hurt a human, to avoid them.
01:45:47.960 | The collaborative aspect, the human-robot interaction,
01:45:52.960 | I think is not, at least not in his,
01:45:54.760 | not something he thinks about deeply.
01:45:59.320 | But my sense is if somebody like that takes on the problem
01:46:03.660 | of humanoid robotics, we're gonna get a social robot out of it.
01:46:07.660 | Like people like, not necessarily Elon,
01:46:10.880 | but people like Elon, if they take on seriously these,
01:46:14.460 | like I can just imagine with a humanoid robot,
01:46:22.080 | you can't help but create a social robot.
01:46:25.040 | So if you do different form factors,
01:46:26.720 | if you do industrial robotics,
01:46:31.440 | you're likely to actually not end up
01:46:33.440 | like walking headfirst into a social robot,
01:46:38.760 | human-robot interaction problem.
01:46:40.280 | If you create for whatever the hell reason you want to,
01:46:42.960 | a humanoid robot, you're gonna have to reinvent,
01:46:46.520 | well, not reinvent, but introduce a lot of fascinating
01:46:50.580 | new ideas into the problem of human-robot interaction,
01:46:53.600 | which I'm excited about.
01:46:54.640 | So like if I was a business person,
01:46:57.160 | I would say this is way too risky.
01:47:00.780 | This doesn't make any sense.
01:47:02.420 | But when people are really convinced,
01:47:04.220 | and there's a lot of amazing people working on it,
01:47:06.660 | it's like, all right, let's see what happens here.
01:47:08.420 | This is really interesting.
01:47:09.700 | Just like with Atlas and Boston Dynamics.
01:47:13.020 | I mean, they, I apologize if I'm ignorant on this,
01:47:18.020 | but I think they really, more than anyone else,
01:47:22.200 | maybe with AIBO, like Sony,
01:47:24.660 | pushed forward humanoid robotics,
01:47:27.920 | like a leap with Atlas. - Oh yeah, with Atlas?
01:47:31.200 | Absolutely.
01:47:32.880 | - And like without them, like why the hell did they do it?
01:47:35.640 | - Why?
01:47:36.480 | Well, I think for them, it is a research platform.
01:47:38.480 | It's not, I don't think they ever, this is speculation,
01:47:42.320 | I don't think they ever intended Atlas
01:47:44.560 | to be like a commercially successful robot.
01:47:48.760 | I think they were just like, can we do this?
01:47:51.360 | Let's try.
01:47:52.960 | Yeah, I wonder if they, maybe the answer they landed on is,
01:47:56.880 | because they eventually went to Spot,
01:48:01.640 | the earlier versions of Spot.
01:48:03.320 | So, a quadruped, a four-legged robot,
01:48:07.640 | but maybe they reached for, let's try to make,
01:48:11.320 | like, I think they tried it,
01:48:15.160 | and they still are trying it for Atlas
01:48:18.160 | to be picking up boxes, to moving boxes, to being,
01:48:21.600 | it makes sense, okay, if they were exactly the same cost,
01:48:26.600 | it makes sense to have a humanoid robot in the warehouse.
01:48:32.200 | - Currently. - Currently.
01:48:33.040 | - I think it's short-sighted, but yes,
01:48:34.560 | currently, yes, it would sell.
01:48:37.080 | - But it's not, it's short-sighted,
01:48:40.020 | it's short-sighted, but it's not pragmatic
01:48:44.680 | to think any other way,
01:48:45.960 | to think that you're gonna be able to change warehouses.
01:48:48.000 | You're gonna have to, you're going--
01:48:50.040 | - If you're Amazon, you can totally change your warehouses.
01:48:52.760 | - Yes, yes, but even if you're Amazon,
01:48:56.680 | that's very costly to change warehouses.
01:49:01.560 | - It is, it's a big investment.
01:49:03.840 | - But isn't, shouldn't you do that investment in a way,
01:49:08.660 | so here's the thing, if you build a humanoid robot
01:49:10.880 | that works in the warehouse, that humanoid robot,
01:49:14.160 | see, I don't know why Tesla is not talking about it this way,
01:49:17.760 | as far as I know, but that humanoid robot
01:49:20.240 | is gonna have all kinds of other applications
01:49:22.720 | outside their setting.
01:49:24.360 | To me, it's obvious.
01:49:27.440 | I think it's a really hard problem to solve,
01:49:29.120 | but whoever solves the humanoid robot problem
01:49:32.000 | are gonna have to solve the social robotics problem.
01:49:34.440 | - Oh, for sure, I mean, they're already with Spot
01:49:36.720 | needing to solve social robotics problems.
01:49:38.920 | - For Spot to be effective at scale.
01:49:42.000 | I'm not sure if Spot is currently effective at scale,
01:49:44.080 | it's getting better and better,
01:49:45.380 | but they're actually, the thing they did
01:49:48.680 | is an interesting decision,
01:49:50.400 | perhaps Tesla will end up doing the same thing,
01:49:53.160 | which is Spot is supposed to be a platform for intelligence.
01:49:58.160 | So Spot doesn't have any high-level intelligence,
01:50:03.660 | like high-level perception skills.
01:50:07.120 | It's supposed to be controlled remotely.
01:50:10.240 | - And it's a platform that you can attach something to, yeah.
01:50:13.360 | - And somebody else is supposed to do the attaching.
01:50:15.880 | It's a platform that you can take an uneven ground
01:50:19.120 | and it's able to maintain balance,
01:50:21.440 | go into dangerous situations, it's a platform.
01:50:24.240 | On top of that, you can add a camera that does surveillance,
01:50:27.480 | that you can remotely monitor,
01:50:29.280 | you can record the camera, you can remote control it,
01:50:34.280 | but it's not gonna-- - Object manipulation.
01:50:36.000 | - Basic object manipulation,
01:50:37.320 | but not autonomous object manipulation.
01:50:40.060 | It's remotely controlled.
01:50:41.840 | But the intelligence on top of it,
01:50:43.960 | which is what would be required for automation,
01:50:46.880 | somebody else is supposed to do.
01:50:48.520 | Perhaps Tesla will do the same thing ultimately,
01:50:51.960 | but it doesn't make sense
01:50:53.200 | because the goal of Optimus is automation.
01:50:56.860 | Without that, but then you never know.
01:51:03.440 | It's like, why go to Mars?
01:51:06.520 | - I mean, that's true, and I reluctantly
01:51:10.280 | am very excited about space travel.
01:51:12.320 | - Why?
01:51:14.440 | Can you introspect, like why?
01:51:16.200 | - Why am I excited about it?
01:51:18.620 | I think what got me excited was I saw a panel
01:51:21.720 | with some people who study other planets,
01:51:26.600 | and it became really clear how little we know
01:51:30.880 | about ourselves and about how nature works
01:51:33.760 | and just how much there is to learn
01:51:36.680 | from exploring other parts of the universe.
01:51:39.640 | So like on a rational level,
01:51:43.360 | that's how I convince myself that that's why I'm excited.
01:51:46.720 | In reality, it's just fucking exciting.
01:51:49.400 | I mean, just like the idea
01:51:51.360 | that we can do this difficult thing
01:51:54.320 | and that humans come together to build things
01:51:57.320 | that can explore space.
01:51:59.260 | I mean, there's just something inherently thrilling
01:52:02.320 | about that, and I'm reluctant about it
01:52:05.000 | because I feel like there are so many other challenges
01:52:08.120 | and problems that I think are more important to solve,
01:52:12.200 | but I also think we should be doing all of it at once.
01:52:14.680 | And so to that extent, I'm like all for research
01:52:19.560 | on humanoid robots, development of humanoid robots.
01:52:23.660 | I think that there's a lot to explore and learn,
01:52:26.680 | and it doesn't necessarily take away
01:52:28.800 | from other areas of science.
01:52:31.800 | At least it shouldn't.
01:52:33.040 | I think unfortunately a lot of the attention
01:52:35.040 | goes towards that, and it does take resources
01:52:39.080 | and attention away from other areas of robotics
01:52:41.520 | that we should be focused on,
01:52:43.400 | but I don't think we shouldn't do it.
01:52:45.960 | - So you think it might be a little bit of a distraction.
01:52:48.800 | Oh, forget the Elon particular application,
01:52:51.720 | but if you care about social robotics,
01:52:55.840 | the humanoid form is a distraction.
01:52:59.000 | - It's a distraction, and it's one
01:53:00.480 | that I find particularly boring.
01:53:03.280 | It's just, it's interesting from a research perspective,
01:53:07.040 | but from a like what types of robots can we create
01:53:11.280 | to put in our world?
01:53:12.120 | Like why would we just create a humanoid robot?
01:53:14.360 | - So even just robotic manipulation,
01:53:16.880 | so arms is not useful either.
01:53:19.220 | - Oh, arms can be useful, but like why not have three arms?
01:53:22.540 | Like why does it have to look like a person?
01:53:25.160 | - Well, I actually personally just think
01:53:26.800 | that washing the dishes is harder
01:53:30.720 | than a robot that can be a companion.
01:53:33.640 | - Yeah. - Like being useful
01:53:35.920 | in the home is actually really tough.
01:53:38.200 | - But does your companion have to have like two arms
01:53:41.280 | and look like you?
01:53:42.120 | - No, I'm making the case for zero arms.
01:53:44.640 | - Oh, okay, zero arms.
01:53:46.480 | - Yeah. - Okay, freaky.
01:53:48.480 | - That didn't come out the way I meant it,
01:53:52.160 | 'cause it almost sounds like I don't want a robot
01:53:54.420 | to defend itself.
01:53:56.400 | Like that's immediately you project,
01:53:58.060 | you know what I mean?
01:53:59.260 | Zero.
01:54:00.100 | No, I just think that the social component
01:54:05.740 | doesn't require arms or legs or so on, right?
01:54:08.380 | That's what we've talked about.
01:54:09.700 | And I think that's probably where a lot
01:54:11.540 | of the meaningful impact that's gonna be happening.
01:54:15.380 | - Yeah, I think just we could get so creative with the design
01:54:18.740 | like why not have a robot on roller skates or like whatever?
01:54:21.360 | Like why does it have to look like us?
01:54:24.620 | - Yeah.
01:54:27.180 | - Still, it is a compelling and interesting form
01:54:29.580 | from a research perspective, like you said.
01:54:31.580 | - Yeah.
01:54:32.700 | - You co-authored a paper, as you were talking about,
01:54:34.820 | for WeRobot 2022,
01:54:38.020 | "LuLaRobot: Consumer Protection
01:54:41.140 | in the Face of Automated Social Marketing."
01:54:43.100 | I think you were talking about some of the ideas in that.
01:54:46.260 | - Yes.
01:54:47.100 | Oh, you got it from Twitter.
01:54:47.940 | I was like, that's not published yet.
01:54:50.260 | - Yeah, this is how I do my research.
01:54:52.620 | - You just go through people's Twitter feeds.
01:54:54.660 | - Yeah, go, thank you.
01:54:56.900 | - It's not stalking if it's public.
01:54:58.980 | So there's a, you looked at me like you're offended.
01:55:04.180 | Like how did you know?
01:55:05.580 | - No, I was just like worried that like some early, I mean.
01:55:09.740 | - Yeah, there's a PDF.
01:55:11.120 | There's a PDF.
01:55:13.180 | - There is, like now?
01:55:15.860 | - Yeah.
01:55:16.700 | - Maybe like as of a few days ago.
01:55:18.260 | - Yeah.
01:55:19.100 | - Okay, well.
01:55:19.920 | - Yeah, yeah.
01:55:20.760 | - Okay.
01:55:21.600 | - You look violated.
01:55:24.180 | Like how did you get that PDF?
01:55:25.780 | - It's just a draft.
01:55:26.720 | It's online.
01:55:27.560 | Nobody read it yet until we've written the final paper.
01:55:30.820 | - Well, it's really good, so I enjoyed it.
01:55:32.460 | - Oh, thank you.
01:55:34.180 | - By the time this comes out, I'm sure it'll be out.
01:55:35.940 | Or no, when's WeRobot?
01:55:37.420 | - So basically, WeRobot, that's the workshop
01:55:39.860 | where you have an hour where people give you
01:55:42.100 | constructive feedback on the paper
01:55:43.700 | and then you write the good version.
01:55:45.340 | - Right, I take it back.
01:55:46.300 | There's no PDF.
01:55:47.140 | I don't know what I-- - It doesn't exist.
01:55:48.220 | - I imagine, but there is a table in there
01:55:50.980 | in a virtual imagined PDF that I like,
01:55:54.580 | that I wanted to mention, which is
01:55:56.680 | like this kind of strategy used
01:56:01.800 | across various marketing platforms.
01:56:03.440 | And it's basically looking at traditional media,
01:56:08.440 | person-to-person interaction, targeted ads,
01:56:11.040 | influencers, and social robots.
01:56:12.720 | This is the kind of idea that you've been speaking to.
01:56:14.840 | And it's just a nice breakdown of that,
01:56:17.600 | that social robots have personalized recommendations,
01:56:24.420 | social persuasion, automated, scalable,
01:56:24.420 | data collection, and embodiment.
01:56:26.340 | So person-to-person interaction is really nice,
01:56:29.220 | but it doesn't have the automated
01:56:30.820 | and the data collection aspect.
01:56:33.340 | But the social robots have those two elements.
01:56:36.580 | - Yeah, we're talking about the potential
01:56:38.040 | for social robots to just combine
01:56:40.160 | all of these different marketing methods
01:56:41.940 | to be this really potent cocktail.
01:56:44.300 | And that table, which was Daniella's idea
01:56:46.820 | and a really fantastic one,
01:56:48.020 | we put it in at the last second, so.
01:56:50.740 | - Yeah, I really like that.
01:56:51.580 | - I'm glad you like it.
01:56:52.400 | - In a PDF that doesn't exist.
01:56:54.240 | - Yes.
01:56:55.080 | - That nobody can find if they look.
01:56:56.960 | - Yeah.
01:56:57.800 | - So when you say social robots, what does that mean?
01:56:59.440 | Does that include virtual ones or no?
01:57:01.640 | - I think a lot of this applies to virtual ones too.
01:57:04.680 | Although the embodiment thing,
01:57:06.880 | which I personally find very fascinating,
01:57:09.400 | is definitely a factor that research shows
01:57:11.560 | can enhance people's engagement with a device.
01:57:14.560 | - But can embodiment be a virtual thing also?
01:57:17.080 | Meaning like it has a body in the virtual world.
01:57:20.400 | - Maybe.
01:57:21.360 | - Makes you feel like,
01:57:22.980 | 'cause what makes a body?
01:57:25.560 | A body is a thing that
01:57:27.140 | can disappear, has a permanence.
01:57:32.520 | I mean, there's certain characteristics
01:57:33.960 | that you kind of associate to a physical object.
01:57:37.520 | - So I think what I'm referring to,
01:57:41.400 | and I think this gets messy
01:57:44.320 | because now we have all these new virtual worlds
01:57:47.260 | and AR and stuff, and I think it gets messy,
01:57:51.020 | but there's research showing that something on a screen,
01:57:53.620 | on a traditional screen,
01:57:54.740 | and something that is moving in your physical space,
01:57:58.260 | that that has a very different effect
01:57:59.980 | on how your brain perceives it even.
01:58:02.500 | - So, I mean, I have a sense that we can do that
01:58:08.580 | in a virtual world.
01:58:09.780 | - Probably.
01:58:10.600 | Like when I've used VR, I jump around like an idiot
01:58:13.800 | because I think something's gonna hit me.
01:58:16.180 | - And even if a video game on a 2D screen
01:58:18.340 | is compelling enough,
01:58:19.580 | like the thing that's immersive about it
01:58:21.980 | is I kind of put myself into that world.
01:58:24.180 | Those, the objects you're interacting with,
01:58:28.820 | Call of Duty, things you're shooting,
01:58:30.820 | they're kind of,
01:58:32.140 | I mean, your imagination fills the gaps
01:58:35.220 | and it becomes real.
01:58:36.060 | Like it pulls your mind in when it's well done.
01:58:38.700 | So it really depends what's shown on the 2D screen.
01:58:41.940 | - Yeah.
01:58:43.060 | Yeah, I think there's a ton of different factors
01:58:45.060 | and there's different types of embodiment.
01:58:46.700 | Like you can have embodiment in a virtual world.
01:58:49.860 | You can have an agent that's simply text-based,
01:58:52.820 | which has no embodiment.
01:58:54.620 | So I think there's a whole spectrum of factors
01:58:57.540 | that can influence how much you engage with something.
01:59:00.060 | - Yeah, I wonder, I always wondered if you can have like
01:59:02.780 | an entity living in a computer.
01:59:06.740 | Okay, this is gonna be dark.
01:59:08.900 | I haven't always wondered about this.
01:59:10.820 | So this is gonna make it sound like
01:59:12.060 | I keep thinking about this kind of stuff.
01:59:14.140 | No, but like, this is almost like black mirror,
01:59:17.300 | but the entity that's convinced
01:59:21.660 | or is able to convince you that it's being tortured
01:59:25.900 | inside the computer and needs your help to get out.
01:59:29.100 | Something like this.
01:59:30.460 | That becomes, to me,
01:59:31.820 | to me suffering is one of the things
01:59:33.660 | that make you empathize with.
01:59:36.260 | Like we're not good at, as you've discussed in other,
01:59:39.380 | in the physical form, like holding a robot upside down,
01:59:43.740 | you have a really good examples about that
01:59:46.180 | and discussing that.
01:59:47.500 | I think suffering is a really good catalyst for empathy.
01:59:51.620 | And I just feel like we can project embodiment
01:59:57.220 | on a virtual thing if it's capable
01:59:58.900 | of certain things like suffering.
02:00:01.500 | - Yeah. - So I was wondering.
02:00:02.860 | - I think that's true.
02:00:03.820 | And I think that's what happened with the LaMDA thing.
02:00:06.140 | Not that, I don't, none of the transcript
02:00:08.700 | was about suffering, but it was about
02:00:12.620 | having the capacity for suffering and human emotion
02:00:15.740 | that convinced the engineer that this thing was sentient.
02:00:19.300 | And it's basically the plot of Ex Machina.
02:00:22.980 | - True.
02:00:23.820 | Have you ever made a robot scream in pain?
02:00:26.500 | - Have I?
02:00:27.340 | No, but have you seen that?
02:00:28.700 | Did someone?
02:00:29.580 | Oh yeah, no, they actually made a Roomba scream
02:00:34.620 | whenever it hit a wall.
02:00:36.060 | - Yeah, I programmed that myself as well.
02:00:38.020 | - Yeah? - 'Cause I was inspired by that.
02:00:39.460 | Yeah, it's cool. - Do you still have it?
02:00:40.700 | - Oh, sorry, hit a wall.
02:00:43.020 | I didn't--
02:00:43.980 | - Whenever it bumped into something,
02:00:44.980 | it would scream in pain. - Yeah, yeah.
02:00:46.300 | No, I, so I had, the way I programmed the Roombas
02:00:50.980 | is when I kick it, whenever I,
02:00:52.700 | so contact between me and the robot is when it screams.
02:00:55.620 | - Really?
02:00:56.460 | Okay, and you were inspired by that?
02:00:59.940 | - Yeah, I guess I misremembered the video.
02:01:01.660 | I saw the video a long, long time ago,
02:01:03.860 | and, or maybe heard somebody mention it,
02:01:06.140 | and that just, it's the easiest thing to program.
02:01:08.580 | So I did that.
02:01:09.420 | I haven't run those Roombas for over a year now,
02:01:11.780 | but yeah, it was, my experience with it was that
02:01:15.420 | it's like they quickly become,
02:01:21.660 | like you remember them, you miss them,
02:01:27.780 | like they're real living beings.
02:01:30.500 | So the capacity to suffer is a really powerful thing.
02:01:34.860 | - Yeah. - Even then, that,
02:01:36.060 | I mean, it was kind of hilarious.
02:01:38.060 | It was just a random recording of screaming
02:01:40.500 | from the internet, but it's still, it's still, it's weird.
02:01:45.100 | There's a thing you have to get right
02:01:46.820 | based on the interaction, like the latency.
02:01:48.980 | Like it has, there is,
02:01:51.340 | there is a realistic aspect of how you should scream
02:01:56.540 | relative to when you get hurt.
02:01:58.700 | Like it should correspond correctly.
02:02:01.460 | - Like if you kick it really hard, it should scream louder?
02:02:05.060 | - No, it should scream at the appropriate time, not like--
02:02:07.700 | - Oh, I see, not like five minutes later.
02:02:09.020 | - One second later, right?
02:02:10.580 | Like there's an exact, like there's a timing
02:02:13.860 | when you get like, I don't know,
02:02:16.060 | when you run into, when you run your foot
02:02:18.540 | into like the side of a table or something,
02:02:20.740 | there's a timing there, the dynamics you have to get right
02:02:23.580 | for the actual screaming,
02:02:25.460 | 'cause the Roomba in particular doesn't,
02:02:30.060 | so I was, the sensors don't,
02:02:35.980 | it doesn't know about pain.
02:02:37.380 | See? - What?
02:02:39.180 | (laughing)
02:02:40.940 | - I'm sorry to say, Roomba doesn't understand pain.
02:02:44.060 | So you have to correctly map the sensors,
02:02:48.300 | the timing to the production of the sound.
02:02:51.780 | But when you get that somewhat right,
02:02:53.900 | it starts, it's a weird, it's a really weird feeling.
02:02:56.780 | And you actually feel like a bad person.
02:02:58.780 | - Aw.
02:02:59.820 | - Yeah.
02:03:00.780 | So, but it's, it makes you think
02:03:04.860 | because that, with all the ways that we talked about,
02:03:07.620 | that could be used to manipulate you.
02:03:09.620 | - Oh, for sure.
02:03:10.500 | - In a good and bad way.
02:03:11.740 | So the good way is like you can form a connection
02:03:14.060 | with a thing, and a bad way that you can form a connection
02:03:17.380 | in order to sell you products that you don't want.
02:03:21.460 | - Yeah, or manipulate you politically,
02:03:23.900 | or many nefarious things.
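As an aside, the Roomba-scream hack Lex describes boils down to mapping a contact-sensor event to a sound with low enough latency that the reaction feels causal. A minimal sketch of that mapping, assuming hypothetical stand-ins `read_bump_sensor()` and `play_scream()` for whatever robot SDK and audio playback you actually have:

```python
import random
import time

DEBOUNCE_S = 0.5   # ignore repeated triggers from one sustained contact
POLL_S = 0.01      # ~10 ms polling keeps the scream latency imperceptible


def read_bump_sensor() -> bool:
    """Hypothetical stand-in for the robot SDK's bump/contact sensor read.
    Here it just fires randomly so the sketch runs without hardware."""
    return random.random() < 0.001


def play_scream() -> None:
    """Hypothetical stand-in for playing a pre-loaded scream clip
    (in practice, a non-blocking call into whatever audio library you use)."""
    print("AAAAH!")


def run(duration_s: float = 10.0) -> None:
    last_scream = -DEBOUNCE_S
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        if read_bump_sensor():
            now = time.monotonic()
            # Scream right away on contact, but not again until the debounce
            # window passes, so one kick maps to exactly one scream.
            if now - last_scream > DEBOUNCE_S:
                play_scream()
                last_scream = now
        time.sleep(POLL_S)


if __name__ == "__main__":
    run()
```

The debounce window is the part that makes the timing feel right: one kick maps to one scream rather than a rapid-fire stutter while the sensor stays pressed.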
02:03:26.540 | - You tweeted, "We're about to be living in the movie 'Her'
02:03:29.820 | except instead of," see, I researched your tweets,
02:03:33.020 | like they're like Shakespeare.
02:03:34.460 | "We're about to be living in the movie 'Her'
02:03:36.900 | except instead of about love, it's gonna be about,"
02:03:40.260 | what did I say, "the chatbot being subtly racist
02:03:43.980 | and the question whether it's ethical
02:03:46.900 | for companies to charge for software upgrades."
02:03:48.860 | - Yeah.
02:03:49.700 | So can we break that down?
02:03:52.300 | What do you mean by that?
02:03:54.180 | - Yeah.
02:03:55.020 | - Obviously, some of it is humor.
02:03:56.860 | - Yes, well, kind of.
02:03:58.860 | (laughs)
02:04:00.100 | I am, like, ah, it's so weird to be in this space
02:04:02.980 | where I'm so worried about this technology
02:04:06.700 | and also so excited about it at the same time.
02:04:09.740 | But the really, like, I haven't,
02:04:12.900 | I'd gotten a little bit jaded,
02:04:14.380 | and then with GPT-3 and then the LaMDA transcript,
02:04:19.380 | I was, like, re-energized,
02:04:23.940 | but have also been thinking a lot about,
02:04:26.060 | you know, what are the ethical issues
02:04:31.100 | that are gonna come up?
02:04:32.620 | And I think some of the things
02:04:33.660 | that companies are really gonna have to figure out
02:04:35.660 | is obviously algorithmic bias
02:04:38.060 | is a huge and known problem at this point.
02:04:40.540 | Like, even, you know, the new image generation tools,
02:04:45.460 | like DALL-E, where they've clearly put in a lot of effort
02:04:49.860 | to make sure that if you search for people,
02:04:52.540 | it gives you a diverse set of people, et cetera.
02:04:54.780 | Like, even that one, people have already found numerous,
02:04:57.580 | like, ways that it just kind of regurgitates biases
02:05:02.220 | and things that it finds on the internet,
02:05:03.580 | like how if you search for success,
02:05:05.620 | it gives you a bunch of images of men.
02:05:07.220 | If you search for sadness,
02:05:08.380 | it gives you a bunch of images of women.
02:05:10.180 | So I think that this is, like, the really tricky one
02:05:14.220 | with these voice agents
02:05:15.820 | that companies are gonna have to figure out.
02:05:17.900 | And that's why it's subtly racist and not overtly,
02:05:20.780 | because I think they're gonna be able
02:05:21.820 | to solve the overt thing,
02:05:23.180 | and then with the subtle stuff,
02:05:24.340 | it's gonna be really difficult.
02:05:26.060 | And then I think the other thing is gonna be,
02:05:29.940 | people are gonna become so emotionally attached
02:05:33.700 | to artificial agents with this complexity of language,
02:05:38.660 | with a potential embodiment factor,
02:05:41.420 | that, I mean, there's already,
02:05:42.660 | there's a paper at WeRobot this year
02:05:44.620 | written by roboticists about how to deal
02:05:47.380 | with the fact that robots die
02:05:49.700 | and looking at it as an ethical issue,
02:05:52.020 | because it impacts people.
02:05:54.620 | And I think there's gonna be way more issues than just that.
02:05:59.100 | I think the tweet was software upgrades, right?
02:06:01.460 | Like, how much is it okay to charge for something like that
02:06:04.620 | if someone is deeply, emotionally invested
02:06:07.660 | in this relationship?
02:06:09.060 | - Oh, the ethics of that, that's interesting.
02:06:13.100 | But there's also the practical funding mechanisms,
02:06:17.220 | like you mentioned with AIBO, the dog.
02:06:19.380 | In theory, there's a subscription.
02:06:21.300 | - Yeah, the new AIBO.
02:06:23.580 | So the old AIBO from the '90s,
02:06:25.740 | people got really attached to,
02:06:27.140 | and in Japan they're still having funerals
02:06:29.380 | in Buddhist temples for the AIBOs that can't be repaired
02:06:32.180 | because people really viewed them as part of their families.
02:06:36.100 | - So we're talking about robot dogs.
02:06:37.540 | - Robot dogs, the AIBO, yeah.
02:06:38.860 | The original, famous robot dog that Sony made
02:06:42.140 | came out in the '90s, got discontinued,
02:06:45.060 | having funerals for them in Japan.
02:06:47.100 | Now they have a new one.
02:06:48.780 | The new one is great.
02:06:49.860 | I have one at home.
02:06:50.860 | It's like--
02:06:51.820 | - It's $3,000, how much is it?
02:06:53.300 | - I think it's 3,000 bucks.
02:06:54.820 | And then after a few years, you have to start paying,
02:06:58.660 | I think it's like 300 a year for a subscription service
02:07:01.780 | for cloud services.
02:07:02.940 | And the cloud services,
02:07:04.380 | I mean, it's a lot,
02:07:08.180 | the dog is more complex than the original,
02:07:11.420 | and it has a lot of cool features,
02:07:12.740 | and it can remember stuff and experiences,
02:07:14.740 | and it can learn.
02:07:15.580 | And a lot of that is outsourced to the cloud.
02:07:18.380 | And so you have to pay to keep that running,
02:07:20.660 | which makes sense.
02:07:21.500 | People should pay, and people who aren't using it
02:07:24.220 | shouldn't have to pay.
02:07:25.820 | But it does raise the interesting question,
02:07:28.860 | could you set that price to reflect a consumer's willingness
02:07:33.860 | to pay for the emotional connection?
02:07:36.220 | So if you know that people are really, really attached
02:07:41.220 | to these things, just like they would be to a real dog,
02:07:45.100 | could you just start charging more?
02:07:47.540 | Because there's more demand.
02:07:49.980 | - Yeah, and you have to be,
02:07:53.540 | but that's true for anything that people love, right?
02:07:58.540 | - It is, and it's also true for real dogs.
02:08:00.940 | Like there's all these new medical services nowadays
02:08:04.020 | where people will shell out thousands and thousands of
02:08:06.780 | dollars to keep their pets alive.
02:08:08.620 | And is that taking advantage of people,
02:08:11.580 | or is that just giving them what they want?
02:08:14.300 | That's the question.
02:08:16.220 | - Back to marriage, what about all the money
02:08:19.380 | that it costs to get married,
02:08:20.940 | and then all the money that it costs to get a divorce?
02:08:24.820 | That feels like a very, that's like a scam.
02:08:30.300 | I think the society is full of scams that are like--
02:08:32.820 | - Oh, it's such a scam.
02:08:34.620 | And then we've created,
02:08:35.820 | the whole wedding industrial complex has created
02:08:38.580 | all these quote unquote traditions that people buy into
02:08:42.580 | that aren't even traditions.
02:08:43.980 | They're just fabricated by marketing.
02:08:46.140 | It's awful.
02:08:48.340 | - Let me ask you about racist robots.
02:08:50.620 | - Sure.
02:08:51.460 | - Is it up to a company that creates that?
02:08:54.540 | So we talk about removing bias and so on.
02:08:57.460 | And that's a really popular field in AI currently.
02:08:59.980 | And a lot of people agree that it's an important field.
02:09:03.860 | But the question is for like social robotics,
02:09:06.660 | is should it be up to the company
02:09:08.580 | to remove the bias of society?
02:09:10.540 | - Well, who else can, oh, to remove the bias of society.
02:09:13.540 | - I guess because there's a lot of people
02:09:16.420 | that are subtly racist in modern society,
02:09:19.020 | like why shouldn't our robots also be subtly racist?
02:09:22.260 | I mean, that's like,
02:09:24.780 | why do we put so much responsibility on the robots?
02:09:28.660 | - Because, because the robot--
02:09:32.140 | - I'm imagining like a Hitler Roomba.
02:09:34.740 | I mean, that would be funny.
02:09:37.940 | But I guess I'm asking a serious question.
02:09:40.860 | - You're allowed to make that joke.
02:09:41.700 | - Yes, exactly, I'm allowed to make that joke, yes.
02:09:44.220 | (laughing)
02:09:47.100 | And I've been nonstop reading about World War II and Hitler.
02:09:50.100 | I think I'm glad we exist in a world
02:09:52.660 | where we can just make those jokes.
02:09:54.980 | That helps deal with it.
02:09:56.220 | Anyway, it is a serious question.
02:09:59.780 | It's sort of like,
02:10:00.780 | like it's such a difficult problem to solve.
02:10:06.380 | Now, of course, like bias and so on,
02:10:09.420 | like there's low hanging fruit,
02:10:11.020 | which I think was what a lot of people are focused on.
02:10:13.860 | But then it becomes like subtle stuff.
02:10:17.020 | Over time, it's very difficult to know.
02:10:19.420 | Now, you can also completely remove the personality.
02:10:23.860 | You can completely remove the personalization.
02:10:26.340 | - You can remove the language aspect,
02:10:27.980 | which is what I had been arguing,
02:10:30.020 | because I was like,
02:10:30.860 | the language is a disappointing aspect
02:10:32.540 | of social robots anyway.
02:10:34.380 | But now we're reintroducing that
02:10:36.020 | because it's now no longer disappointing.
02:10:39.220 | So I do think, well, let's just start with the premise,
02:10:44.220 | which I think is very true,
02:10:45.740 | which is that racism is not a neutral thing,
02:10:48.300 | but it is the thing that we don't want in our society.
02:10:51.260 | Like it does not conform to my values.
02:10:54.460 | So if we agree that racism is bad,
02:10:56.500 | I do think that it has to be the company,
02:11:01.500 | because the problem, I mean,
02:11:02.980 | it might not be possible.
02:11:06.140 | And companies might have to put out products
02:11:09.780 | that where they're taking risks
02:11:11.900 | and they might get slammed by consumers
02:11:14.620 | and they might have to adjust.
02:11:16.020 | I don't know how this is gonna work in the market.
02:11:19.020 | I have opinions about how it should work,
02:11:20.460 | but it is on the company.
02:11:22.460 | And the danger with robots
02:11:25.300 | is that they can entrench this stuff.
02:11:27.300 | It's not like your racist uncle
02:11:30.740 | who you can have a conversation with and-
02:11:34.740 | - And put things into context maybe with that.
02:11:38.260 | - Yeah, or who might change over time with more experience.
02:11:42.060 | A robot really just like regurgitates things,
02:11:46.460 | entrenches them, could influence other people.
02:11:50.440 | And I mean, I think that's terrible.
02:11:54.300 | - Well, I think there's a difficult challenge here
02:11:56.300 | is because even the premise you started with
02:11:58.860 | that essentially racism is bad.
02:12:01.220 | I think we live in a society today
02:12:04.560 | where the definition of racism
02:12:06.700 | is different between different people.
02:12:09.380 | Some people say that it's not enough not to be racist.
02:12:12.020 | Some people say you have to be anti-racist.
02:12:14.500 | So you have to have a robot that constantly calls out,
02:12:18.500 | like calls you out on your implicit racism.
02:12:23.500 | - I would love that.
02:12:25.620 | I would love that robot.
02:12:27.420 | - But like maybe it sees racism.
02:12:29.900 | Well, I don't know if you'd love it
02:12:31.020 | 'cause maybe you'll see racism in things that aren't racist.
02:12:34.700 | And then you're arguing with a robot.
02:12:37.180 | - Your robot starts calling you racist.
02:12:39.900 | I'm not exactly sure that, I mean, it's a tricky thing.
02:12:43.540 | I guess I'm saying that the line is not obvious,
02:12:48.540 | especially in this heated discussion
02:12:51.160 | where we have a lot of identity politics
02:12:53.000 | of what is harmful to different groups and so on.
02:12:56.340 | - Yeah.
02:12:57.180 | - It feels like the broader question here
02:13:01.420 | is should a social robotics company be solving
02:13:06.060 | or being part of solving the issues of society?
02:13:09.820 | - Well, okay, I think it's the same question as
02:13:12.580 | should I as an individual be responsible
02:13:16.340 | for knowing everything in advance
02:13:19.220 | and saying all the right things?
02:13:21.100 | And the answer to that is yes, I am responsible,
02:13:26.100 | but I'm not gonna get it perfect.
02:13:30.540 | And then the question is how do we deal with that?
02:13:32.740 | And so as a person, how I aspire to deal with that is
02:13:37.420 | when I do inevitably make a mistake
02:13:41.020 | because I have blind spots and people get angry,
02:13:45.740 | I don't take that personally
02:13:47.340 | and I listen to what's behind the anger.
02:13:50.540 | And it can even happen that like,
02:13:52.420 | maybe I'll tweet something that's well-intentioned
02:13:55.180 | and one group of people starts yelling at me
02:13:59.780 | and then I change it the way that they said
02:14:01.840 | and then another group of people starts yelling at me,
02:14:03.940 | which has happened, this happened to me actually around,
02:14:08.940 | in my talks, I talk about robots
02:14:10.900 | that are used in autism therapy.
02:14:12.780 | And so whether to say a child with autism
02:14:15.820 | or an autistic child is super controversial
02:14:19.180 | and a lot of autistic people
02:14:20.900 | prefer to be referred to as autistic people
02:14:22.860 | and a lot of parents of autistic children
02:14:25.260 | prefer child with autism and then they disagree.
02:14:29.380 | So I've gotten yelled at from both sides
02:14:32.400 | and I think I'm still, I'm responsible
02:14:35.600 | even if I can't get it right.
02:14:37.320 | I don't know if that makes sense.
02:14:38.280 | Like it's a responsibility thing
02:14:40.180 | and I can be as well-intentioned as I want
02:14:44.980 | and I'm still gonna make mistakes
02:14:46.400 | and that is part of the existing power structures that exist
02:14:49.720 | and that's something that I accept.
02:14:51.240 | - And you accept being attacked from both sides
02:14:53.320 | and grow from it and learn from it.
02:14:55.280 | But the danger is that after being attacked,
02:14:59.360 | assuming you don't get canceled,
02:15:00.980 | AKA completely removed from your ability to tweet,
02:15:04.840 | you might become jaded
02:15:09.220 | and not want to talk about autism anymore.
02:15:11.700 | - I don't and I didn't.
02:15:13.480 | I mean, it's happened to me.
02:15:15.520 | What I did was I listened to both sides
02:15:17.320 | and I chose, I tried to get information
02:15:21.720 | and then I decided that I was going to use autistic children
02:15:29.280 | and now I'm moving forward with that.
02:15:30.520 | Like, I don't know.
02:15:31.360 | - For now, right.
02:15:32.200 | - For now, yeah, until I get updated information
02:15:34.560 | and I'm never gonna get anything perfect
02:15:36.520 | but I'm making choices and I'm moving forward
02:15:39.160 | because being a coward and like just retreating from that,
02:15:43.000 | I think--
02:15:45.160 | - But see, here's the problem.
02:15:46.440 | You're a very smart person and an individual,
02:15:49.400 | a researcher, a thinker, an intellectual.
02:15:51.740 | So that's the right thing for you to do.
02:15:53.920 | The hard thing is when as a company,
02:15:56.120 | imagine you had a PR team.
02:15:58.800 | - I said, Kate, this you should--
02:16:00.360 | - The PR teams we hate.
02:16:01.600 | - Yeah, I mean, just, well, if you hired PR people,
02:16:06.600 | like obviously they would see that and they'd be like,
02:16:10.120 | well, maybe don't bring up autism.
02:16:12.640 | Maybe don't bring up these topics.
02:16:14.240 | You're getting attacked, it's bad for your brand.
02:16:17.120 | They'll say the brand word.
02:16:18.800 | There'll be, if we look at different demographics
02:16:22.240 | that are inspired by your work,
02:16:23.760 | I think it's insensitive to them.
02:16:25.160 | Let's not mention this anymore.
02:16:26.880 | There's this kind of pressure that all of a sudden you,
02:16:31.040 | or you do suboptimal decisions.
02:16:34.880 | You take a kind of poll.
02:16:36.700 | Again, it's looking at the past versus the future,
02:16:41.240 | all those kinds of things.
02:16:42.160 | And it becomes difficult.
02:16:44.320 | In the same way that it's difficult
02:16:46.360 | for social media companies to figure out like
02:16:49.200 | who to censor, who to recommend.
02:16:53.240 | - I think this is ultimately a question about leadership.
02:16:55.920 | Honestly, like the way that I see leadership,
02:16:58.240 | because right now,
02:17:00.180 | the thing that bothers me about institutions
02:17:04.180 | and a lot of people who run current institutions
02:17:06.580 | is that their main focus is protecting the institution
02:17:10.840 | or protecting themselves personally.
02:17:12.560 | That is bad leadership
02:17:13.940 | because it means you cannot have integrity.
02:17:16.280 | You cannot lead with integrity.
02:17:18.960 | And it makes sense because like, obviously,
02:17:21.040 | if you're the type of leader who immediately blows up
02:17:23.580 | the institution you're leading,
02:17:25.080 | then that doesn't exist anymore.
02:17:26.640 | And maybe that's why we don't have any good leaders anymore
02:17:29.400 | because they had integrity and they didn't put,
02:17:33.080 | you know, the survival of the institution first.
02:17:36.800 | But I feel like you have to,
02:17:39.300 | just to be a good leader,
02:17:43.080 | you have to be responsible
02:17:47.880 | and understand that with great power
02:17:49.280 | comes great responsibility.
02:17:50.980 | You have to be humble and you have to listen
02:17:53.000 | and you have to learn.
02:17:53.820 | You can't get defensive and you cannot
02:17:55.680 | put your own protection before other things.
02:17:58.000 | - Yeah, take risks where you might lose your job,
02:18:00.640 | you might lose your wellbeing because of,
02:18:05.060 | because in the process of standing for the principles,
02:18:12.800 | for the things you think are right to do, yeah.
02:18:15.240 | Based on the things you,
02:18:18.600 | like based on learning from,
02:18:20.520 | like listening to people and learning from what they feel.
02:18:23.600 | And the same goes for the institution, yeah.
02:18:26.120 | Yeah, but I ultimately actually believe
02:18:27.880 | that those kinds of companies and countries succeed
02:18:32.440 | that have leaders like that.
02:18:34.040 | You should run for president.
02:18:35.540 | - No, thank you.
02:18:38.080 | - Yeah.
02:18:38.900 | - That's maybe the problem.
02:18:39.740 | Like the people who have good ideas about leadership,
02:18:42.520 | they're like, yeah.
02:18:43.480 | - No.
02:18:44.320 | - This is why I don't,
02:18:45.160 | this is why I'm not running a company.
02:18:46.920 | - It's been, I think, three years
02:18:48.640 | since the Jeffrey Epstein controversy at MIT,
02:18:52.360 | MIT Media Lab.
02:18:53.920 | Joi Ito, the head of the Media Lab, resigned.
02:18:58.200 | And I think at that time you wrote
02:19:00.000 | an opinion article about it.
02:19:02.680 | So just looking back a few years have passed,
02:19:05.640 | what have you learned about human nature
02:19:10.160 | from the fact that somebody like Jeffrey Epstein
02:19:16.680 | found his way inside MIT?
02:19:22.240 | - It's a really good question.
02:19:23.360 | What have I learned about human nature?
02:19:25.320 | I think,
02:19:27.880 | well, there's,
02:19:31.460 | there's how did this problem come about?
02:19:36.900 | And then there's what was the reaction to this problem
02:19:40.880 | and to it becoming public?
02:19:43.260 | And in the reaction,
02:19:44.860 | the things I learned about human nature
02:19:49.880 | were that sometimes cowards are worse than assholes.
02:19:54.880 | (laughs)
02:20:01.000 | Wow, I'm really, ugh.
02:20:02.280 | - I think that's a really powerful statement.
02:20:05.960 | - I think because the assholes,
02:20:08.000 | at least you know what you're dealing with.
02:20:11.040 | They have integrity in a way.
02:20:13.680 | They're just living out their asshole values.
02:20:16.120 | And the cowards are the ones that you have to watch out for.
02:20:18.760 | And this comes back to people protecting themselves
02:20:22.640 | over doing the right thing.
02:20:25.060 | They'll throw others under the bus.
02:20:29.680 | - Is there some sense that not enough people
02:20:31.280 | took responsibility?
02:20:32.460 | - For sure.
02:20:35.120 | And I mean, I don't wanna sugarcoat at all
02:20:40.120 | what Joey Ito did.
02:20:41.760 | I mean, I think it's gross
02:20:44.320 | that he took money from Jeffrey Epstein.
02:20:46.160 | I believe him that he didn't know about the bad, bad stuff.
02:20:48.600 | But I've been in those circles
02:20:50.000 | with those public intellectual dudes
02:20:53.040 | that he was hanging out with.
02:20:54.320 | And any woman in those circles
02:20:57.480 | saw 10 zillion red flags.
02:21:00.000 | The whole environment was so misogynist.
02:21:02.420 | And so personally, because Joey was a great boss
02:21:08.040 | and a great friend,
02:21:09.140 | I was really disappointed that he ignored that
02:21:13.800 | in favor of raising money.
02:21:18.160 | And I think that it was right for him to resign
02:21:23.080 | in the face of that.
02:21:24.620 | But one of the things that he did
02:21:29.040 | that many others didn't was he came forward about it
02:21:33.720 | and he took responsibility.
02:21:35.700 | And all of the people who didn't, I think,
02:21:40.960 | it's just interesting.
02:21:45.980 | The other thing I learned about human nature,
02:21:47.640 | okay, I'm gonna go on a tangent,
02:21:49.280 | but I'll come back, I promise.
02:21:50.320 | So I once saw this tweet from someone,
02:21:53.680 | or it was a Twitter thread,
02:21:54.680 | from someone who worked at a homeless shelter.
02:21:57.040 | And he said that when he started working there,
02:22:00.400 | he noticed that people would often come in
02:22:02.760 | and use the bathroom
02:22:03.700 | and they would just trash the entire bathroom,
02:22:05.720 | like rip things out of the walls,
02:22:07.320 | like toilet paper on the ground.
02:22:08.800 | And he asked someone who had been there longer,
02:22:10.600 | like, why do they do this?
02:22:12.200 | Why do the homeless people come in and trash the bathroom?
02:22:13.960 | And he was told it's because it's the only thing
02:22:16.960 | in their lives that they have control over.
02:22:19.120 | And I feel like sometimes when it comes to the response,
02:22:25.080 | just the mobbing response that happens
02:22:36.160 | in the wake of some harm that was caused,
02:22:40.000 | if you can't target the person who actually caused the harm,
02:22:45.120 | who was Epstein,
02:22:47.560 | you will go as many circles out as you can
02:22:50.160 | until you find the person that you have power over
02:22:52.600 | and you have control over,
02:22:53.600 | and then you will trash that.
02:22:55.360 | And it makes sense that people do this.
02:22:57.920 | It's, again, it's a human nature thing.
02:22:59.960 | Of course, you're gonna focus all your energy
02:23:02.040 | because you feel helpless and enraged,
02:23:04.520 | and it's unfair and you have no other power.
02:23:08.480 | You're gonna focus all of your energy
02:23:10.080 | on someone who's so far removed from the problem
02:23:12.960 | that that's not even an efficient solution.
02:23:17.000 | - And the problem is often the first person you find
02:23:20.920 | is the one that has integrity,
02:23:22.360 | sufficient integrity to take responsibility.
02:23:24.800 | - Yeah, and it's why my husband always says,
02:23:27.480 | he's a liberal, but he's always like,
02:23:29.760 | when liberals form a firing squad, they stand in a circle.
02:23:33.920 | Because you know that your friends
02:23:36.200 | are gonna listen to you, so you criticize them.
02:23:39.120 | You're not gonna be able to convince someone
02:23:40.840 | across the aisle.
02:23:42.440 | - See, in that situation, what I had hoped
02:23:44.960 | is that in that situation,
02:23:49.400 | any situation of that sort,
02:23:51.480 | the people that are farther out in the circles
02:23:54.680 | stand up and also take some responsibility
02:24:00.800 | for the broader picture of human nature
02:24:02.640 | versus specific situation,
02:24:05.080 | but also take some responsibility,
02:24:11.920 | but also defend the people involved as flawed,
02:24:16.560 | not in a like, no, no, no, nothing,
02:24:19.160 | like this, people fucked up.
02:24:21.880 | Like you said, there's a lot of red flags
02:24:24.160 | that people just ignored for the sake of money
02:24:27.080 | in this particular case.
02:24:28.420 | But also like be transparent and public about it
02:24:32.840 | and spread the responsibility
02:24:36.040 | across a large number of people
02:24:38.520 | such that you learn a lesson from it.
02:24:41.120 | - Institutionally.
02:24:42.880 | - Yeah, it was a systems problem.
02:24:45.000 | It wasn't a one individual problem.
02:24:47.120 | - And I feel like currently,
02:24:49.180 | because Joey resigned because of it,
02:24:55.000 | or was essentially fired, pressured out because of it,
02:24:58.720 | MIT can pretend like, oh, we didn't know anything.
02:25:04.680 | It wasn't part--
02:25:05.680 | - Bad leadership, again,
02:25:07.880 | because when you are at the top of an institution
02:25:11.600 | with that much power and you were complicit in what happened,
02:25:15.280 | which they were, like, come on,
02:25:18.280 | there's no way that they didn't know
02:25:19.960 | that this was happening.
02:25:21.360 | So like to not stand up and take responsibility,
02:25:26.280 | I think it's bad leadership.
02:25:29.160 | - Do you understand why Epstein was able to,
02:25:34.920 | outside of MIT, he was able to make a lot of friends
02:25:38.840 | with a lot of powerful people.
02:25:41.000 | Does that make sense to you?
02:25:42.520 | Why was he able to get in these rooms,
02:25:45.400 | befriend these people,
02:25:47.920 | befriend people that I don't know personally,
02:25:50.700 | but I think a lot of them indirectly I know
02:25:55.240 | as being good people, smart people.
02:25:59.400 | Why would they let Jeffrey Epstein into their office,
02:26:02.960 | have a discussion with them?
02:26:04.600 | What do you understand about human nature from that?
02:26:07.680 | - Well, so I never met Epstein,
02:26:10.560 | or I mean, I've met some of the people
02:26:14.600 | who interacted with him,
02:26:15.800 | but I was never, like, I never saw him in action.
02:26:20.120 | I don't know how charismatic he was or what that was,
02:26:23.160 | but I do think that sometimes the simple answer
02:26:27.760 | is the more likely one,
02:26:29.120 | and from my understanding,
02:26:31.480 | what he would do is he was kind of a social grifter,
02:26:34.320 | like, you know those people who will,
02:26:37.920 | you must get this because you're famous.
02:26:40.360 | You must get people coming to you and being like,
02:26:43.480 | oh, I know your friend so-and-so
02:26:45.760 | in order to get cred with you.
02:26:48.040 | I think he just convinced some people
02:26:54.000 | who were trusted in a network that he was a great guy
02:26:59.000 | and that, you know, whatever, I think at that point,
02:27:02.760 | 'cause at that point he had had like,
02:27:04.800 | what, a conviction prior, but it was a one-off thing.
02:27:08.920 | It wasn't clear that there was this other thing
02:27:11.000 | that was that--
02:27:12.560 | - And most people probably don't check.
02:27:14.480 | - Yeah, and most people don't check.
02:27:15.680 | Like, you're at an event, you meet this guy.
02:27:17.120 | I don't know, maybe people do check
02:27:18.440 | when they're that powerful and wealthy,
02:27:20.080 | or maybe they don't.
02:27:20.920 | I have no idea.
02:27:21.740 | No, they're just stupid. - I haven't seen that.
02:27:23.640 | I mean, and they're not, like,
02:27:25.800 | all right, does anyone check anything about me?
02:27:31.000 | Because I've walked into some of the richest,
02:27:33.680 | some of the most powerful people in the world,
02:27:35.720 | and nobody, like, asks questions like,
02:27:38.160 | who the fuck is this guy?
02:27:40.000 | Like-- - Yeah.
02:27:41.360 | - Like, nobody asks those questions.
02:27:43.120 | It's interesting.
02:27:44.200 | I would think, like, there would be more security
02:27:47.160 | or something.
02:27:47.980 | Like, there really isn't.
02:27:49.420 | I think a lot of it has to do, well, my hope is,
02:27:53.320 | in my case, has to do with, like,
02:27:54.580 | people can sense that this is a good person,
02:27:56.880 | but if that's the case, then they can surely,
02:28:00.840 | then a human being can use charisma to infiltrate.
02:28:05.200 | - Yeah. - Just being,
02:28:06.360 | just saying the right things-- - And once you have people
02:28:07.920 | vouching for you within that type of network,
02:28:11.040 | like, once you, yeah, once you have someone powerful
02:28:13.940 | vouching for you who someone else trusts,
02:28:16.360 | then, you know, you're in.
02:28:18.880 | - So how do you avoid something like that?
02:28:25.640 | If you're MIT, if you're Harvard,
02:28:27.320 | if you're any of these institutions?
02:28:29.280 | - Well, I mean, first of all, you have to do your homework
02:28:31.760 | before you take money from someone.
02:28:33.700 | Like, I think that's required,
02:28:38.520 | but I think, you know, I think Joey did do his homework.
02:28:40.580 | I think he did, and I think at the time that he took money,
02:28:44.660 | there was the one conviction and not, like, the later thing,
02:28:47.400 | and I think that the story at that time was that
02:28:50.880 | he didn't know she was underage and blah, blah, blah,
02:28:55.600 | or whatever, it was a mistake,
02:28:56.880 | and Joey always believed in redemption for people,
02:28:59.160 | and that people can change,
02:29:00.320 | and that they can genuinely regret,
02:29:02.280 | and, like, learn and move on,
02:29:04.480 | and he was a big believer in that,
02:29:05.820 | so I could totally see him being like,
02:29:08.240 | well, I'm not gonna exclude him because of this thing,
02:29:11.840 | and because other people are vouching for him.
02:29:14.080 | Just to be clear, we're now talking about the set of people
02:29:20.120 | who I think Joey belonged to who did not, like,
02:29:23.000 | go to the island and have sex with underage girls,
02:29:25.360 | because that's a whole other set of people who, like,
02:29:29.320 | were powerful and, like, were part of that network
02:29:32.320 | and who knew and participated,
02:29:34.440 | and so, like, I distinguish between people who got taken in
02:29:38.280 | who didn't know that that was happening
02:29:40.120 | and people who knew.
02:29:41.640 | - I wonder what the different circles look like.
02:29:43.840 | So, like, people that went to the island
02:29:46.440 | and didn't do anything, didn't see anything,
02:29:49.600 | didn't know about anything,
02:29:50.960 | versus the people that did something.
02:29:53.640 | - And then there's people who heard rumors, maybe.
02:29:56.880 | - And what do you do with rumors?
02:29:58.120 | Like, isn't there people that heard rumors
02:30:01.720 | about Bill Cosby for the longest time?
02:30:03.660 | For, like, for the longest, like, whenever that happened,
02:30:08.280 | like, all these people came out of the woodwork,
02:30:10.460 | like, everybody kind of knew.
02:30:11.920 | I mean, it's like, all right,
02:30:15.780 | so what are you supposed to do with rumors?
02:30:17.400 | Like, what, I think the other way to put it
02:30:20.680 | is red flags, as you were saying.
02:30:22.320 | - Yeah, and, like, I can tell you that those circles,
02:30:25.320 | like, there were red flags without me even hearing
02:30:28.200 | any rumors about anything ever.
02:30:30.040 | Like, I was already like, hmm,
02:30:31.760 | there are not a lot of women here, which is a bad sign.
02:30:36.220 | - Isn't there a lot of places where there's not a lot
02:30:39.440 | of women and that doesn't necessarily mean it's a bad sign?
02:30:42.560 | - There are if it's like a pipeline problem
02:30:44.660 | where it's like, I don't know,
02:30:49.200 | technology law clinic that only gets, like, male lawyers
02:30:51.840 | because there's not a lot of women, you know,
02:30:55.520 | applicants in the pool.
02:30:56.960 | - But there's some aspect of this situation
02:30:58.960 | that, like, there should be more women here.
02:31:00.760 | - Oh, yeah, yeah.
02:31:03.100 | - You've, actually, I'd love to ask you about this
02:31:10.680 | 'cause you have strong opinions about Richard Stallman.
02:31:15.640 | Is that, do you still have those strong opinions?
02:31:20.440 | - Look, all I need to say is that he met my friend
02:31:23.240 | who's a law professor.
02:31:25.000 | - Yeah.
02:31:25.960 | - She shook his hand and he licked her arm
02:31:28.260 | from wrist to elbow, and it certainly wasn't appropriate
02:31:31.000 | at that time.
02:31:31.980 | - What about if you're, like, an incredibly weird person?
02:31:38.200 | - Okay, that's a good question because, obviously,
02:31:41.120 | there's a lot of neurodivergence at MIT and everywhere,
02:31:45.360 | and, obviously, like, we need to accept
02:31:48.400 | that people are different,
02:31:49.920 | that people don't understand social conventions
02:31:51.800 | the same way, but one of the things that I've learned
02:31:54.740 | about neurodivergence is that women are often
02:31:59.440 | expected or taught to mask their neurodivergence
02:32:05.480 | and kind of fit in, and men are accommodated and excused,
02:32:10.480 | and I don't think that being neurodivergent
02:32:15.200 | gives you a license to be an asshole.
02:32:17.220 | Like, you can be a weird person and you can still learn
02:32:22.000 | that it's not okay to lick someone's arm.
02:32:24.840 | - Yeah, it's a balance.
02:32:26.560 | Like, women should be allowed to be a little weirder
02:32:29.360 | and men should be less weird.
02:32:31.080 | 'Cause I think you're one of the people,
02:32:34.820 | I think, tweeting that, which is what made me,
02:32:37.160 | 'cause I wanted to talk to Richard Stallman on the podcast
02:32:40.640 | about, 'cause I didn't have a context,
02:32:43.880 | 'cause I wanted to talk to him 'cause he's,
02:32:45.560 | you know, free software.
02:32:46.960 | - Richard Stallman.
02:32:47.800 | - He's very weird in interesting, good ways
02:32:51.060 | in the world of computer science.
02:32:52.720 | He's also weird in that, you know, when he gives a talk,
02:32:57.260 | he'll be like picking at his feet
02:33:00.620 | and eating the skin off his feet, right?
02:33:03.220 | He's known for these extremely kind of,
02:33:05.440 | how else do you put it?
02:33:08.020 | I don't know how to put it.
02:33:09.380 | But then there was something that happened to him
02:33:13.180 | in conversations on this thread related to Epstein.
02:33:16.260 | - Yeah.
02:33:17.100 | - Which I was torn about because I felt it's similar to Joey,
02:33:22.100 | you know, it's like, I felt he was maligned.
02:33:27.580 | Like people were looking for somebody to get angry at.
02:33:32.060 | So he was inappropriate,
02:33:34.620 | but I disliked the cowardice even more.
02:33:39.100 | Like I set aside his situation and we could discuss it,
02:33:44.220 | but the cowardice on MIT's part,
02:33:46.620 | and this is me saying it,
02:33:48.100 | about the way they treated that whole situation.
02:33:50.780 | - Oh, they're always cowards about how they treat anything.
02:33:52.820 | They just try to make the problem go away.
02:33:54.340 | - Yeah, so it was about, yeah, exactly.
02:33:57.540 | - That said, I think he should have left the mailing list.
02:34:01.100 | - He shouldn't have been part of the mailing list.
02:34:03.620 | - Well, that's probably true also.
02:34:05.900 | But I think what bothered me,
02:34:09.780 | what always bothers me in these mailing list situations
02:34:12.400 | or Twitter situations, like if you say something
02:34:17.020 | that's hurtful to people or makes people angry,
02:34:20.860 | and then people start yelling at you,
02:34:22.700 | maybe they shouldn't be yelling.
02:34:26.260 | Maybe they are yelling because again,
02:34:28.780 | you're the only point of power they have.
02:34:31.600 | Maybe it's okay that you're yelling, whatever it is,
02:34:35.300 | like it's your response to that that matters.
02:34:40.060 | And I think that I just have a lot of respect for people
02:34:43.260 | who can say, "Oh, people are angry.
02:34:49.180 | There's a reason they're angry.
02:34:51.100 | Let me find out what that reason is
02:34:52.500 | and learn more about it.
02:34:53.800 | It doesn't mean that I am wrong.
02:34:56.460 | It doesn't mean that I am bad.
02:34:58.140 | It doesn't mean that I'm ill-intentioned,
02:35:00.240 | but why are they angry?
02:35:02.420 | I wanna understand."
02:35:03.980 | And then once you understand,
02:35:06.540 | you can respond again with integrity and say,
02:35:09.900 | "Actually, I stand by what I said.
02:35:11.420 | Here's why."
02:35:12.240 | Or you can say, "Actually, I listened
02:35:14.460 | and here are some things I learned."
02:35:16.260 | That's the kind of response I wanna see from people.
02:35:19.980 | And people like Stallman do not respond that way.
02:35:22.340 | They just go into battle.
02:35:25.700 | - Right, like where it's obvious you didn't listen.
02:35:28.780 | - Yeah, no interest in listening.
02:35:30.500 | - Honestly, that's to me as bad as the people
02:35:32.980 | who just apologize just 'cause they are trying
02:35:35.660 | to make the problem go away.
02:35:36.940 | - Of course.
02:35:37.940 | - Right, so like if--
02:35:39.260 | - That's not--
02:35:40.100 | - It's like both are bad.
02:35:41.180 | - A good apology has to include understanding
02:35:44.420 | what you did wrong.
02:35:46.260 | - And in part, standing up for the things
02:35:48.580 | you think you did right.
02:35:49.740 | So like--
02:35:50.580 | - Yeah, if there are those things, yeah.
02:35:51.620 | - Finding and then, but you have to give,
02:35:54.140 | you have to acknowledge, you have to like give
02:35:57.260 | that hard hit to the ego that says I did something wrong.
02:36:00.460 | Yeah, definitely Richard Stallman is not somebody
02:36:03.180 | who is capable of that kind of thing
02:36:05.820 | or hasn't given evidence of that kind of thing.
02:36:09.220 | - But that was also, even just your tweet,
02:36:11.660 | I had to do a lot of thinking like,
02:36:13.420 | different people from different walks of life
02:36:17.660 | see red flags in different things.
02:36:19.700 | - Yeah.
02:36:20.740 | - And so things I find as a man,
02:36:25.740 | non-threatening and hilarious are not necessarily,
02:36:35.300 | doesn't mean that they aren't like deeply hurtful
02:36:40.300 | to others.
02:36:41.620 | And I don't mean that in a social justice warrior way,
02:36:44.100 | but in a real way, like people really have
02:36:47.780 | different experiences.
02:36:49.220 | So I feel like people really have to put things into context.
02:36:52.240 | They have to kind of listen to what people are saying,
02:36:56.740 | put aside the emotion,
02:36:58.820 | the emotion with which they're saying it,
02:37:00.780 | and try to keep the facts of their experience
02:37:04.820 | and learn from it.
02:37:05.860 | - And because it's not just about
02:37:07.140 | the individual experience either.
02:37:08.620 | It's not like, oh, my friend didn't have a sense of humor
02:37:11.940 | about being licked.
02:37:12.940 | It's that she's been metaphorically licked
02:37:17.500 | 57 times that week because she's an attractive law professor
02:37:21.260 | and she doesn't get taken seriously.
02:37:22.420 | And so like men walk through the world
02:37:25.260 | and it's impossible for them to even understand
02:37:29.260 | what it's like to have a different experience of the world.
02:37:33.060 | And that's why it's so important to listen to people
02:37:35.580 | and believe people and believe that they're angry
02:37:38.220 | for a reason.
02:37:39.220 | Maybe you don't like their tone.
02:37:40.540 | Maybe you don't like that they're angry at you.
02:37:42.340 | Maybe you get defensive about that.
02:37:44.100 | Maybe you think that they should explain it to you,
02:37:47.120 | but believe that they're angry for a reason
02:37:50.540 | and try to understand it.
02:37:51.660 | - Yeah, there's a deep truth there
02:37:54.140 | and an opportunity for you to become a better person.
02:37:56.980 | Can I ask you a question?
02:38:01.740 | Haven't you been doing that for two hours?
02:38:03.860 | - Three hours now.
02:38:06.820 | Let me ask you about Ghislaine Maxwell.
02:38:11.260 | She's been saying that she's an innocent victim.
02:38:13.860 | Is she an innocent victim or is she evil
02:38:20.420 | and equally responsible like Jeffrey Epstein?
02:38:23.580 | Now I'm asking far away from any MIT things and more,
02:38:27.580 | just your sense of the whole situation.
02:38:29.580 | - I haven't been following it,
02:38:30.820 | so I don't know the facts of the situation
02:38:33.140 | and what is now known to be her role in that.
02:38:37.420 | If I were her, clearly I'm not,
02:38:39.500 | but if I were her, I wouldn't be going around
02:38:41.300 | saying I'm an innocent victim.
02:38:42.820 | I would say, maybe she's, I don't know what she's saying.
02:38:47.500 | Again, I don't know.
02:38:48.460 | - She was controlled by Jeffrey.
02:38:50.780 | - Is she saying this as part of a legal case
02:38:52.700 | or is she saying this as a PR thing?
02:38:55.740 | - Well, PR, but it's not just her thing.
02:39:00.900 | It's her whole family believes this.
02:39:03.220 | There's a whole effort that says like,
02:39:06.040 | how should I put it?
02:39:09.660 | I believe they believe it.
02:39:11.220 | So in that sense, it's not PR.
02:39:12.860 | I believe the family, basically the family is saying
02:39:17.620 | that she's a really good human being.
02:39:22.020 | - Well, I think everyone is a good human being.
02:39:23.900 | I know that's a controversial opinion,
02:39:25.360 | but I think everyone is a good human being.
02:39:31.360 | There's no evil people.
02:39:35.200 | There's people who do bad things
02:39:39.240 | and who behave in ways that harm others.
02:39:41.400 | And I think we should always hold people accountable
02:39:43.960 | for that, but holding someone accountable
02:39:45.760 | doesn't mean saying that they're evil.
02:39:48.360 | - Yeah, actually those people usually think
02:39:51.240 | they're doing good.
02:39:52.920 | - Yeah, I mean, aside from, I don't know,
02:39:55.460 | maybe sociopaths like are specifically trying
02:39:58.640 | to harm people, but I think most people
02:40:02.080 | are trying to do their best.
02:40:04.680 | And if they're not doing their best,
02:40:07.120 | it's because there's some impediment
02:40:09.080 | or something in their past.
02:40:10.860 | So I just, I genuinely don't believe
02:40:13.960 | in good and evil people, but I do believe
02:40:15.680 | in harmful and not harmful actions.
02:40:18.420 | And so I don't know, I don't care.
02:40:22.160 | Yeah, she's a good person, but if she contributed
02:40:25.160 | to harm, then she needs to be accountable for that.
02:40:27.660 | That's my position.
02:40:29.000 | I don't know what the facts of the matter are.
02:40:30.440 | It seems like she was pretty close to the situation,
02:40:32.520 | so it doesn't seem very believable that she was a victim,
02:40:35.240 | but I don't know.
02:40:36.560 | - I wish I had met Epstein, 'cause something tells me
02:40:39.880 | he would just be a regular person,
02:40:41.720 | charismatic person like anybody else.
02:40:44.640 | And that's a very dark reality that we don't know
02:40:47.880 | which among us, what each of us are hiding in the closet.
02:40:54.680 | That's a really tough thing to deal with,
02:40:58.100 | because then you can put your trust into some people
02:41:01.660 | and they can completely betray that trust
02:41:03.660 | and in the process destroy you.
02:41:05.360 | Which there's a lot of people that interacted with Epstein
02:41:10.200 | that now have to, I mean, if they're not destroyed by it,
02:41:15.200 | then their whole, the ground on which they stand ethically
02:41:22.740 | has crumbled, at least in part.
02:41:25.600 | And I'm sure you and I have interacted with people
02:41:31.940 | without knowing it who are bad people.
02:41:34.680 | - As I always tell my four-year-old,
02:41:36.720 | people who have done bad things.
02:41:38.320 | - People who have done bad things.
02:41:39.760 | - He's always talking about bad guys
02:41:41.000 | and I'm trying to move him towards,
02:41:43.720 | they're just people who make bad choices.
02:41:46.320 | - Yeah, that's really powerful, actually.
02:41:48.400 | That's really important to remember,
02:41:49.720 | 'cause that means you have compassion
02:41:51.800 | towards all human beings.
02:41:53.120 | Do you have hope for the future of MIT,
02:41:56.380 | the future of Media Lab in this context?
02:41:58.440 | So Dava Newman is now at the helm.
02:42:01.340 | I'm gonna talk, I talked to her previously,
02:42:02.700 | I'll talk to her again.
02:42:04.540 | She's great.
02:42:05.500 | - Love her.
02:42:06.340 | Yeah, she's great.
02:42:07.860 | I don't know if she knew the whole situation
02:42:11.280 | when she started, because the situation went beyond
02:42:15.440 | just the Epstein scandal.
02:42:17.860 | A bunch of other stuff happened at the same time.
02:42:21.460 | Some of it's not public,
02:42:23.200 | but what I was personally going through at that time.
02:42:28.780 | So the Epstein thing happened, I think,
02:42:30.900 | was it August or September 2019?
02:42:34.500 | It was somewhere around late summer.
02:42:36.980 | In June 2019, so I'm a research scientist at MIT.
02:42:43.180 | You are too, right?
02:42:44.420 | So, and I always have had various supervisors
02:42:48.380 | over the years, and they've just basically
02:42:50.500 | let me do what I want, which has been great.
02:42:52.700 | But I had a supervisor at the time,
02:42:55.540 | and he called me into his office for a regular check-in.
02:43:00.380 | In June of 2019, I reported to MIT
02:43:03.500 | that my supervisor had grabbed me,
02:43:08.100 | pulled me into a hug, wrapped his arms around my waist,
02:43:12.380 | and started massaging my hip, and trying to kiss me,
02:43:16.700 | kiss my face, kiss me near the mouth,
02:43:19.560 | and said literally the words,
02:43:21.980 | "Don't worry, I'll take care of your career."
02:43:24.900 | And that experience was really interesting
02:43:31.780 | because I just, I was very indignant.
02:43:35.400 | I was like, "He can't do that to me.
02:43:38.820 | Doesn't he know who I am?"
02:43:40.020 | And I was like, "This is the Me Too era."
02:43:42.320 | And I naively thought that when I reported that,
02:43:45.060 | it would get taken care of.
02:43:46.440 | And then I had to go through
02:43:47.420 | the whole reporting process at MIT,
02:43:49.180 | and I learned a lot about how institutions
02:43:53.020 | really handle those things internally,
02:43:55.360 | particularly situations where
02:43:58.240 | I couldn't provide evidence that it happened.
02:44:00.500 | I had no reason to lie about it, but I had no evidence.
02:44:02.700 | And so I was going through that,
02:44:06.300 | and that was another experience for me
02:44:09.540 | where there's so many people in the institution
02:44:13.120 | who really believe in protecting the institution
02:44:16.500 | at all costs.
02:44:17.960 | And there's only a few people
02:44:19.220 | who care about doing the right thing.
02:44:21.460 | And one of them resigned.
02:44:24.820 | Now there's even less of them left.
02:44:26.620 | - So what'd you learn from that?
02:44:31.100 | I mean, where's the source,
02:44:32.540 | if you have hope for this institution
02:44:35.100 | that I think you love, at least in part?
02:44:38.380 | I love the idea of MIT.
02:44:41.460 | - I love the idea.
02:44:42.300 | I love the research body.
02:44:43.300 | I love a lot of the faculty.
02:44:44.460 | I love the students.
02:44:45.960 | I love the energy.
02:44:46.860 | I love it all.
02:44:48.000 | I think the administration suffers from the same problems
02:44:51.240 | as any leadership of an institution
02:44:54.900 | that is large,
02:44:58.340 | which is that
02:44:59.320 | they've become risk-averse, like you mentioned.
02:45:06.200 | They care about PR.
02:45:08.120 | The only ways to get their attention
02:45:12.480 | or change their minds about anything
02:45:14.040 | are to threaten the reputation of the institute
02:45:16.280 | or to have a lot of money.
02:45:17.580 | That's the only way to have power at the institute.
02:45:21.280 | Yeah, I don't think they have a lot of integrity
02:45:26.360 | or believe in ideas
02:45:27.560 | or even have a lot of connection to the research body
02:45:30.480 | and the people who are really,
02:45:32.400 | 'cause it's so weird.
02:45:33.400 | You have this amazing research body of people
02:45:36.340 | pushing the boundaries of things
02:45:37.840 | who aren't afraid to,
02:45:39.320 | there's the hacker culture.
02:45:40.960 | And then you have the administration
02:45:44.000 | and they're really like
02:45:45.560 | protect the institution at all costs.
02:45:50.300 | - Yeah, there's a disconnect, right?
02:45:52.200 | - Complete disconnect.
02:45:53.040 | - I wonder if that was always there,
02:45:54.280 | if it just kind of slowly grows over time,
02:45:57.520 | a disconnect between the administration and the faculty.
02:45:59.920 | - I think it grew over time is what I've heard.
02:46:03.320 | I mean, I've been there for 11 years now.
02:46:05.640 | I don't know if it's gotten worse during my time,
02:46:11.120 | but I've heard from people who've been there longer
02:46:13.880 | that it didn't,
02:46:15.000 | like MIT didn't used to have a general counsel's office.
02:46:18.400 | They didn't used to have all of this corporate stuff.
02:46:20.840 | And then they had to create it as they got bigger
02:46:23.640 | and in the era where such things are,
02:46:27.160 | I guess, deemed necessary.
02:46:28.840 | - See, I believe in the power of individuals
02:46:30.840 | to overthrow the thing.
02:46:33.040 | So it's just a really good president of MIT
02:46:36.800 | or certain people in the administration
02:46:38.920 | can reform the whole thing.
02:46:40.240 | 'Cause the culture is still there
02:46:43.260 | of like, I think everybody remembers
02:46:47.520 | that MIT is about the students and the faculty.
02:46:50.960 | - Do they though?
02:46:51.800 | Because I don't know, I've had a lot of conversations
02:46:54.920 | that have been shocking with like senior administration.
02:46:57.640 | They think the students are children.
02:46:59.720 | They call them kids.
02:47:01.200 | It's like, these are the smartest people.
02:47:03.320 | They're way smarter than you.
02:47:05.780 | And you're so dismissive of that.
02:47:07.360 | - But those individuals, I'm saying like the capacity,
02:47:11.920 | like the aura of the place still values the students
02:47:16.920 | and the faculty.
02:47:18.940 | Like I'm being awfully poetic about it.
02:47:21.980 | But what I mean is the administration is the froth
02:47:25.980 | at the top of the, like the waves, the surface.
02:47:30.980 | Like they can be removed and new life can be brought in
02:47:35.860 | that would keep to the spirit of the place.
02:47:38.660 | - Who decides on who to bring in?
02:47:40.660 | Who hires? - It's bottom up.
02:47:42.040 | Oh, I see.
02:47:42.880 | I see.
02:47:45.280 | But I do think ultimately,
02:47:47.400 | especially in the era of social media and so on,
02:47:50.780 | faculty and students have more and more power.
02:47:55.680 | Just more and more of a voice, I suppose.
02:47:57.840 | - I hope so.
02:47:58.800 | I really do.
02:48:00.360 | I don't see MIT going away anytime soon.
02:48:02.680 | And like, I also don't think it's a terrible place at all.
02:48:05.360 | - Yeah, it's an amazing place.
02:48:07.080 | But there's different trajectories it can take.
02:48:09.680 | - Yeah.
02:48:10.880 | - And like, and that has to do with a lot of things,
02:48:14.000 | including,
02:48:14.960 | does it stay,
02:48:20.480 | even if we talk about robotics,
02:48:22.300 | it could be the capital of the world in robotics.
02:48:25.520 | But currently, if you wanna be doing the best AI work
02:48:29.420 | in the world, you're gonna go to Google or Facebook
02:48:33.360 | or Tesla or Apple or so on.
02:48:37.080 | You're not gonna be, you're not gonna be at MIT.
02:48:40.480 | And so that has to do,
02:48:42.760 | I think that basically has to do with
02:48:46.360 | not allowing the brilliance of the researchers to flourish.
02:48:52.960 | - Yeah, people say it's about money,
02:48:55.360 | but I don't think it's about that at all.
02:48:56.680 | Like, sometimes you have more freedom
02:49:00.680 | and can work on more interesting things in companies.
02:49:02.880 | That's really where they lose people.
02:49:05.120 | - Yeah.
02:49:06.040 | - And sometimes the freedom in all ways,
02:49:09.800 | which is why it's heartbreaking to get like
02:49:12.960 | people like Richard Stallman,
02:49:14.320 | there's such an interesting line
02:49:15.880 | because like Richard Stallman's a gigantic weirdo
02:49:18.920 | that crossed lines he shouldn't have crossed, right?
02:49:22.900 | But we don't wanna draw too many lines.
02:49:25.640 | This is the tricky thing.
02:49:28.120 | - There are different types of lines in my opinion.
02:49:30.960 | - But it's your opinion.
02:49:32.160 | You have strong lines you hold to,
02:49:34.560 | but then if administration listens to every line,
02:49:37.380 | there's also power in drawing a line.
02:49:40.920 | And it becomes like a little drug.
02:49:47.520 | You have to find the right balance.
02:49:49.200 | Licking somebody's arm is never appropriate.
02:49:52.720 | I think the biggest aspect there is not owning it,
02:49:57.720 | learning from it, growing from it
02:49:59.160 | from the perspective of Stallman or people like that,
02:50:02.720 | back when it happened, like understanding,
02:50:05.080 | seeing the right, being empathetic,
02:50:07.000 | seeing the fact that this was like totally inappropriate.
02:50:10.600 | Not just that particular act,
02:50:14.320 | but everything that led up to it too.
02:50:16.880 | - No, I think there are different kinds of lines.
02:50:19.040 | I think there are...
02:50:20.040 | So Stallman crossed lines that essentially
02:50:25.760 | excluded a bunch of people and created an environment
02:50:28.640 | where we, there are brilliant minds
02:50:30.880 | that we never got the benefit of
02:50:32.920 | because he made things feel gross
02:50:36.600 | or even unsafe for people.
02:50:38.680 | There are lines that you can cross
02:50:40.920 | where you're challenging an institution to...
02:50:44.520 | Like, I don't think he was intentionally
02:50:48.560 | trying to cross a line, or maybe he didn't care.
02:50:53.560 | There are lines that you can cross intentionally
02:50:56.480 | to move something forward or to do the right thing.
02:50:59.400 | Like when MIT was like,
02:51:00.880 | you can't put an all-gender restroom in the media lab
02:51:03.880 | because of some permit thing or whatever,
02:51:07.080 | and Joey did it anyway.
02:51:08.640 | That's a line you can cross
02:51:10.160 | to make things actually better for people.
02:51:11.880 | And the line you're crossing is some arbitrary, stupid rule
02:51:15.200 | that people who don't wanna take the risk are like...
02:51:18.880 | - Yeah, for sure. - You know what I mean?
02:51:21.520 | - No, ultimately, I think the thing you said is like,
02:51:24.600 | cross lines in a way that doesn't...
02:51:28.640 | alienate others.
02:51:31.280 | So like, for example, me wearing,
02:51:33.480 | I started for a while wearing a suit often at MIT,
02:51:37.000 | which sounds counterintuitive, but that's actually...
02:51:40.520 | People always looked at me weird for that.
02:51:44.440 | MIT created this culture,
02:51:45.960 | specifically the people I was working with.
02:51:47.820 | Like, nobody wore suits.
02:51:48.880 | Maybe the business school does.
02:51:49.720 | - Yeah, we don't trust the suits.
02:51:50.560 | - People don't trust the suits.
02:51:51.600 | I was like, fuck you, I'm wearing a suit.
02:51:53.440 | - Nice. (laughs)
02:51:54.880 | - But that's not really hurting anybody, right?
02:51:57.000 | - Exactly.
02:51:58.520 | It's challenging people's perceptions.
02:52:01.240 | It's doing something that you wanna do.
02:52:03.520 | - Yeah.
02:52:04.360 | - But it's not hurting people.
02:52:06.000 | - And that particular thing was, yeah, it was hurting people.
02:52:10.920 | It's a good line, it's a good line to draw...
02:52:12.960 | whether you're hurting, ultimately, the people that you want to flourish.
02:52:20.400 | - Yeah. - Yeah.
02:52:21.240 | - You tweeted a picture of pumpkin spice Greek yogurt.
02:52:27.180 | And asked, "Grounds for divorce?
02:52:29.900 | "Yes, no."
02:52:30.900 | So let me ask you,
02:52:32.020 | what's the key to a successful relationship?
02:52:35.020 | - Oh my God, a good couple's therapist?
02:52:37.980 | (both laugh)
02:52:40.060 | - What went wrong with the pumpkin spice Greek yogurt?
02:52:42.860 | What's exactly wrong?
02:52:43.860 | Is it the pumpkin?
02:52:45.320 | Is it the Greek yogurt?
02:52:46.160 | I didn't understand.
02:52:46.980 | I stared at that tweet for a while.
02:52:48.620 | - I grew up in Europe,
02:52:49.620 | so I don't understand the pumpkin spice in everything craze
02:52:53.420 | that they do every autumn here.
02:52:55.620 | Like, I understand that it might be good in some foods,
02:52:58.580 | but they just put it in everything.
02:53:01.020 | - And it doesn't belong in Greek yogurt.
02:53:03.200 | - I mean, I was just being humorous.
02:53:07.220 | I ate one of those yogurts
02:53:08.500 | and it actually tasted pretty good.
02:53:09.780 | - Yeah, exactly.
02:53:10.620 | (laughs)
02:53:11.940 | - I think part of the success of a good marriage
02:53:14.660 | is like giving each other a hard time humorously
02:53:18.700 | for things like that.
02:53:19.880 | - Is there a broader lesson?
02:53:22.900 | 'Cause you guys seem to have a really great marriage
02:53:25.580 | from the external perspective.
02:53:26.780 | - I mean, every marriage looks good from the external.
02:53:29.620 | Every, I think, yeah.
02:53:31.280 | (both laugh)
02:53:32.620 | - That's not true, but yeah, I get it.
02:53:35.380 | - Okay, right.
02:53:36.220 | (both laugh)
02:53:37.060 | That's not true.
02:53:38.060 | No, but like, relationships are hard.
02:53:40.580 | Relationships with anyone are hard,
02:53:42.300 | and especially because people evolve and change,
02:53:45.340 | and you have to make sure there's space
02:53:47.420 | for both people to evolve and change together.
02:53:49.380 | And I think one of the things
02:53:51.960 | that I really liked about our marriage vows
02:53:54.900 | was I remember before we got married,
02:53:58.460 | Greg, at some point, got kind of nervous,
02:54:01.780 | and he was like, "It's such a big commitment
02:54:04.420 | "to commit to something for life."
02:54:06.600 | And I was like, "We're not committing to this for life."
02:54:09.420 | And he was like, "We're not?"
02:54:11.540 | And I'm like, "No, we're committing to being part of a team
02:54:15.100 | "and doing what's best for the team.
02:54:17.520 | "If what's best for the team is to break up,
02:54:19.620 | "we'll break up."
02:54:20.700 | Like, I don't believe in this,
02:54:22.580 | like we have to do this for our whole lives.
02:54:25.500 | And that really resonated with him too, so yeah.
02:54:30.500 | - Did you put in the vows?
02:54:32.420 | - Yeah, yeah, that was our vows,
02:54:33.740 | like that we're gonna be a team.
02:54:35.820 | - You're a team and do what's right for the team?
02:54:37.300 | - Yeah, yeah.
02:54:38.420 | - That's very like Michael Jordan view.
02:54:42.540 | Did you guys get married in the desert,
02:54:50.340 | like November Rain style with Slash playing?
02:54:50.340 | - Sure, you don't have to answer that.
02:54:53.860 | I'm not good at these questions.
02:54:55.100 | Okay.
02:54:56.060 | - You brought up marriage like eight times.
02:54:57.940 | Are you trying to hint something on the podcast?
02:55:01.460 | - I don't, yeah, I have an announcement to make.
02:55:04.420 | No, I don't know.
02:55:06.420 | It just seems like a good metaphor for,
02:55:10.920 | it felt like a good metaphor for, in a bunch of cases,
02:55:16.300 | for the marriage industrial complex, I remember that.
02:55:20.660 | And, oh, people complaining.
02:55:23.420 | It just seemed like marriage is one of the things
02:55:27.020 | that always surprises me 'cause I wanna get married.
02:55:30.220 | - You do?
02:55:31.060 | - Yeah, I do.
02:55:31.880 | And then I listened to like friends of mine that complain,
02:55:35.100 | not all, I like guys, I really like guys
02:55:37.900 | that don't complain about their marriage.
02:55:39.540 | It's such a cheap, like if, it's such a cheap release valve,
02:55:44.140 | like that's bitching about anything, honestly,
02:55:46.800 | that's just like, it's too easy.
02:55:48.740 | But especially, like bitch about the sports team
02:55:52.420 | or the weather if you want, but like about somebody
02:55:55.340 | that you're dedicating your life to,
02:55:57.520 | like if you bitch about them,
02:55:59.780 | you're going to see them as a lesser being also.
02:56:03.620 | Like you don't think so,
02:56:04.660 | but you're going to like decrease the value you have.
02:56:07.620 | I personally believe over time,
02:56:10.260 | you're not going to appreciate the magic of that person.
02:56:13.320 | I think, anyway, it's just that I've noticed this
02:56:18.120 | a lot that people are married and they will whine about,
02:56:22.800 | you know, like the wife, whatever,
02:56:25.620 | it's part of the sort of the culture
02:56:28.620 | to kind of comment in that way.
02:56:30.480 | I think women do the same thing about the husband.
02:56:33.020 | He doesn't, he never does this or he's a goof,
02:56:36.060 | he's incompetent at this or that, whatever.
02:56:38.780 | There's a kind of--
02:56:39.620 | - Yeah, there's this tropes like, oh,
02:56:42.260 | you know, husbands never do X and like wives are,
02:56:45.820 | I think those do a disservice to everyone
02:56:48.060 | it's just disrespectful to everyone involved.
02:56:50.400 | - Yeah, but it happens.
02:56:51.240 | So I brought that up as an example of something
02:56:54.580 | that people actually love, but they complain about
02:56:57.560 | 'cause for some reason that's more fun to do
02:57:00.520 | is complain about stuff.
02:57:02.540 | And so that's what it is with Clippy or whatever, right?
02:57:05.020 | So like you complain about, but you actually love it.
02:57:07.980 | It's just a good metaphor that, you know,
02:57:10.020 | what was I going to ask you?
02:57:13.100 | Oh, you,
02:57:15.980 | your hamster died.
02:57:19.380 | - When I was like eight.
02:57:22.740 | - You miss her?
02:57:23.580 | - Beige.
02:57:25.460 | - What's the closest relationship you've had with a pet?
02:57:30.980 | That the one?
02:57:31.820 | What pet or robot have you loved the most in your life?
02:57:41.980 | - I think my first pet was a goldfish named Bob
02:57:46.060 | and he died immediately and that was really sad.
02:57:48.420 | I think I was really attached to Bob and Nancy,
02:57:53.540 | my goldfish, we got new Bobs and then Bob kept dying
02:57:57.260 | and we got new Bobs, Nancy just kept living.
02:58:00.160 | - So it was very replaceable.
02:58:04.640 | - Yeah, I was young.
02:58:07.100 | It was easy to.
02:58:10.780 | - Do you think there will be a time when the robot,
02:58:13.660 | like in the movie "Her" be something
02:58:15.580 | we fall in love with romantically?
02:58:17.880 | - Oh yeah, oh for sure, yeah.
02:58:20.340 | - At scale, like we're a lot of people.
02:58:22.540 | - Romantically, I don't know if it's going to happen at scale.
02:58:27.540 | I think we talked about this a little bit last time
02:58:30.980 | on the podcast too, where I think we're just capable
02:58:33.660 | of so many different kinds of relationships.
02:58:35.620 | And actually part of why I think marriage is so tough
02:58:39.320 | as a relationship is because we put so many expectations
02:58:43.060 | on it, like your partner has to be your best friend
02:58:46.740 | and you have to be sexually attracted to them
02:58:48.660 | and they have to be a good co-parent and a good roommate
02:58:51.180 | and like it's all the relationships at once
02:58:54.260 | that have to work.
02:58:55.320 | But we're like normally with other people,
02:58:58.700 | we have like one type of relationship
02:59:00.580 | or we even have, we have a different relationship
02:59:02.380 | to our dog than we do to our neighbor,
02:59:04.620 | than we do to the person, someone, a coworker.
02:59:09.080 | I think that some people are gonna find romantic
02:59:12.320 | relationships with robots interesting.
02:59:14.660 | It might even be a widespread thing,
02:59:17.000 | but I don't think it's gonna replace
02:59:19.600 | like human romantic relationships.
02:59:21.360 | I think it's just gonna be a separate type of thing.
02:59:24.440 | - It's gonna be more narrow.
02:59:27.360 | - More narrow or even like just something new
02:59:31.160 | that we haven't really experienced before.
02:59:33.200 | Maybe like having a crush on an artificial agent
02:59:36.000 | is a different type of fascination.
02:59:38.580 | I don't know.
02:59:39.420 | - Do you think people would see that as cheating?
02:59:41.740 | - I think people would, well, I mean,
02:59:45.540 | the things that people feel threatened by
02:59:47.380 | in relationships are very manifold, so.
02:59:50.180 | - Yeah, that's just an interesting one.
02:59:53.220 | 'Cause maybe it'll be good, a little jealousy
02:59:58.220 | for the relationship.
03:00:00.540 | Maybe they'll be like part of the couple's therapy
03:00:02.780 | you know, kind of thing or whatever.
03:00:05.060 | - I don't think jealousy, I mean, I think
03:00:07.100 | it's hard to avoid jealousy, but I think
03:00:10.120 | the objective is probably to avoid it.
03:00:12.080 | I mean, some people don't even get jealous
03:00:13.500 | when their partner sleeps with someone else.
03:00:15.080 | Like there's polyamory and.
03:00:16.600 | I think there's just such a diversity of different ways
03:00:23.160 | that we can structure relationships or view them
03:00:26.360 | that this is just gonna be another one that we add.
03:00:30.040 | - You dedicate your book to your dad.
03:00:32.280 | What did you learn about life from your dad?
03:00:35.440 | - Oh man, my dad is, he's a great listener
03:00:40.440 | and he is the best person I know
03:00:45.700 | at the type of cognitive empathy
03:00:50.820 | that's like perspective taking.
03:00:53.580 | So not like emotional, like crying empathy,
03:00:57.580 | but trying to see someone else's point of view
03:01:01.940 | and trying to put yourself in their shoes.
03:01:03.940 | And he really instilled that in me from an early age.
03:01:07.740 | And then he made me read a ton of science fiction,
03:01:09.980 | which probably led me down this path.
03:01:13.780 | - Taught you how to be curious about the world
03:01:15.700 | and how to be open-minded.
03:01:17.100 | - Yeah.
03:01:18.580 | - Last question, what role does love play
03:01:21.260 | in the human condition?
03:01:22.980 | Since we've been talking about love and robots.
03:01:28.220 | And you're fascinated by social robotics.
03:01:32.260 | It feels like all of that operates in the landscape
03:01:36.300 | of something that we can call love.
03:01:38.660 | - Love, yeah, I think there are a lot
03:01:42.020 | of different kinds of love.
03:01:43.540 | I feel like it's, we need,
03:01:45.760 | I'm like, don't the Eskimos have all these different words
03:01:49.140 | for snow?
03:01:49.980 | We need more words to describe different types
03:01:53.100 | and kinds of love that we experience.
03:01:54.760 | But I think love is so important.
03:01:56.460 | And I also think it's not zero sum.
03:01:59.620 | That's the really interesting thing about love
03:02:02.180 | is that I had one kid and I loved my first kid
03:02:07.180 | more than anything else in the world.
03:02:08.900 | And I was like, how can I have a second kid
03:02:10.860 | and then love that kid also?
03:02:12.780 | I'm never gonna love it as much as the first.
03:02:15.380 | But I love them both equally.
03:02:16.900 | It just like, my heart expanded.
03:02:19.300 | And so I think that people who are threatened
03:02:22.900 | by love towards artificial agents,
03:02:27.180 | they don't need to be threatened for that reason.
03:02:30.420 | - Artificial agents will just, if done right,
03:02:33.140 | will just expand your capacity for love.
03:02:37.180 | - I think so.
03:02:38.100 | - I agree.
03:02:40.140 | Beautifully put.
03:02:41.060 | Kate, this was awesome.
03:02:43.060 | I still didn't talk about half the things
03:02:44.500 | I wanted to talk about, but we're already
03:02:46.460 | like way over three hours.
03:02:47.580 | So thank you so much.
03:02:48.460 | I really appreciate you talking today.
03:02:50.540 | You're awesome.
03:02:51.380 | You're an amazing human being,
03:02:52.620 | a great roboticist, great writer now.
03:02:55.860 | It's an honor that you would talk with me.
03:02:57.340 | Thanks for doing it.
03:02:58.180 | - Right back at you.
03:02:59.020 | Thank you.
03:03:00.620 | - Thanks for listening to this conversation
03:03:02.100 | with Kate Darling.
03:03:03.460 | To support this podcast, please check out
03:03:05.340 | our sponsors in the description.
03:03:07.580 | And now, let me leave you with some words
03:03:09.780 | from Maya Angelou.
03:03:11.700 | Courage is the most important of all the virtues,
03:03:15.300 | because without courage, you can't practice
03:03:17.660 | any other virtue consistently.
03:03:20.340 | Thank you for listening and hope to see you
03:03:22.540 | next time.
03:03:23.380 | (upbeat music)
03:03:25.980 | (upbeat music)