Kate Darling: Social Robots, Ethics, Privacy and the Future of MIT | Lex Fridman Podcast #329
Chapters
0:00 Introduction
1:46 What is a robot?
17:47 Metaverse
27:09 Bias in robots
40:51 Modern robotics
43:24 Automation
47:46 Autonomous driving
56:11 Privacy
59:37 Google's LaMDA
64:24 Robot animal analogy
77:28 Data concerns
95:30 Humanoid robots
114:31 LuLaRobot
123:25 Ethics in robotics
138:46 Jeffrey Epstein
172:21 Love and relationships
00:00:00.000 |
- I think that animals are a really great thought experiment 00:00:12.440 |
but also I think for the future, we don't want that. 00:00:22.320 |
we domesticated them not because they do what we do, 00:00:25.020 |
but because what they do is different and that's useful. 00:00:32.320 |
whether we're talking about work integration, 00:00:34.920 |
whether we're talking about responsibility for harm, 00:00:37.240 |
there's just so many things we can draw on in that history 00:00:45.720 |
that are applicable to how we should be thinking 00:00:49.720 |
- The following is a conversation with Kate Darling, 00:00:59.280 |
interested in human-robot interaction and robot ethics, 00:01:15.040 |
She was a courageous voice of reason and compassion 00:01:22.480 |
We reflect on this time in this very conversation, 00:01:26.520 |
including the lessons it revealed about human nature 00:01:29.640 |
and our optimistic vision for the future of MIT, 00:01:39.800 |
please check out our sponsors in the description. 00:01:49.120 |
you wore Justin Bieber's shirt for the podcast. 00:01:51.920 |
So now looking back, you're a respected researcher, 00:02:00.640 |
Was this one of the proudest moments of your life? 00:02:19.060 |
You were like, oh, well, I'm interviewing some, 00:02:29.920 |
- You didn't tell me, oh, I was dressed like this. 00:02:40.480 |
- I think I bought it for my husband as a joke. 00:02:42.920 |
And yeah, we were gut renovating a house at the time 00:02:59.400 |
- It's like a wedding dress or something like that. 00:03:11.240 |
You opened the book with the surprisingly tricky question, 00:03:15.740 |
So let me ask you, let's try to sneak up to this question. 00:03:32.120 |
is something that has some level of intelligence 00:03:44.040 |
that allows you to navigate the uncertainty of life. 00:03:50.060 |
So that means, like autonomous vehicles to me in that sense, 00:03:53.600 |
are robots because they navigate the uncertainty, 00:04:04.080 |
I like that you mentioned magic 'cause that also, 00:04:08.080 |
I don't define robot definitively in the book 00:04:18.400 |
people have called things robots until they lose the magic 00:04:23.340 |
Like a vending machine used to be called a robot 00:04:26.620 |
So I do agree with you that there's this magic aspect 00:04:37.260 |
they have the definition of something that is, 00:04:44.680 |
They'll say it has to be able to sense its environment 00:04:49.100 |
It has to be able to make a decision autonomously 00:04:55.260 |
I think that's a pretty good technical definition 00:05:01.860 |
because the smartphone can do all of those things. 00:05:04.500 |
And most roboticists would not call it a robot. 00:05:09.760 |
But part of why I wrote the book is because people have 00:05:27.100 |
robots with a torso and head and two arms and two legs. 00:05:37.980 |
- Why does the humanoid form trip us up a lot? 00:05:40.740 |
- Well, because this constant comparison of robots to people, 00:05:45.060 |
artificial intelligence to human intelligence, 00:05:54.100 |
some of them were trying to recreate human intelligence. 00:05:57.760 |
and there's a lot to be learned from that academically, 00:06:00.140 |
et cetera, but that's not where we've ended up. 00:06:05.500 |
We wind up in this fallacy where we're comparing these two. 00:06:09.720 |
And when we talk about what intelligence even is, 00:06:13.820 |
we're often comparing to our own intelligence. 00:06:21.500 |
I just think it's boring to recreate intelligence 00:06:32.340 |
what could we use these technologies for perspective, 00:06:35.740 |
it's much more interesting to create something new, 00:06:43.640 |
- And it should be in some deep way similar to us, 00:06:53.420 |
which is why the similarity might be necessary. 00:07:06.820 |
and we relate most to things that are like ourselves. 00:07:12.500 |
So we have stairs and narrow passageways and door handles. 00:07:15.580 |
And so we need humanoid robots to be able to navigate that. 00:07:24.540 |
and a lot of human-robot interaction research 00:10:26.540 |
is that all you need is something that's enough 00:07:35.940 |
but that doesn't have to look human or even act human. 00:07:45.800 |
Even though it's just like a trash can on wheels. 00:07:47.960 |
And they like R2-D2 more than C-3PO, who's a humanoid. 00:08:00.320 |
- Yeah, it's kind of amazing the variety of cues 00:08:03.160 |
that can be used to anthropomorphize the thing, 00:08:13.840 |
I think people sometimes over-engineer these things. 00:08:20.560 |
I mean, ask any animator and they'll know that. 00:08:29.580 |
the right people to design those is animators, 00:08:33.480 |
like Disney type of people versus like roboticists. 00:08:38.120 |
- Roboticists, quote unquote, are mostly clueless. 00:08:45.200 |
that they're very good at and they don't have-- 00:08:51.320 |
I feel like robotics of the early 21st century 00:08:56.240 |
is not going to be the robotics of the later 21st century. 00:09:05.520 |
'Cause I think more and more you'd be like a, 00:09:26.440 |
in the perception interpreting the world aspect, 00:09:31.560 |
in the brain part, not the basic control level part. 00:09:36.560 |
- You call it basic, it's actually really complex. 00:09:41.600 |
- And that's why, but like, I think you're so right 00:09:44.120 |
and what a time to be alive, because for me, I just, 00:09:54.440 |
And now finally robots are getting deployed into the world. 00:10:02.280 |
that companies are making because they focus so much 00:10:06.840 |
and getting the robot to be even be able to function 00:10:12.520 |
- See what I feel like people don't understand 00:10:15.640 |
is to solve the perception and the control problem. 00:10:29.160 |
- Or ask for help or be able to communicate the uncertainty. 00:10:36.200 |
because you can't solve the perception control. 00:10:42.680 |
But the magic is in the self-deprecating humor 00:10:48.080 |
and the self-awareness about where our flaws are, 00:10:54.480 |
in human-robot interaction showing like ways to do this. 00:10:58.680 |
But a lot of these companies haven't, they don't do HRI. 00:11:01.920 |
They like the, have you seen the grocery store robot 00:11:07.500 |
- Yeah, the Marty, it looks like a giant penis. 00:11:09.240 |
It's like six feet tall, it roams the aisles. 00:11:23.140 |
And then people hate Marty because they didn't at all 00:11:26.940 |
consider how people would react to Marty in their space. 00:11:30.660 |
- Does everybody, I mean, you talk about this, 00:11:45.840 |
right before the pandemic hit, and then we canceled it 00:11:48.180 |
because we didn't wanna go to the grocery store 00:11:51.400 |
But our theory, so this was with a student at MIT, 00:11:57.860 |
She noticed that everyone on Facebook, in her circles, 00:12:06.820 |
And she did this quick and dirty sentiment analysis 00:12:13.140 |
And she found that the biggest spike of negative mentions 00:12:16.900 |
happened when Stop and Shop threw a birthday party 00:12:20.980 |
for the Marty robots, like with free cake and balloons. 00:12:34.100 |
We used Mechanical Turk, and we tried to get at 00:12:41.340 |
And a lot of it wasn't, "Oh, Marty's taking jobs." 00:12:52.900 |
It's watching, it's creepy, it's getting in the way. 00:12:55.060 |
Those were the things that people complained about. 00:12:56.900 |
And so our hypothesis became, "Is Marty a real-life Clippy?" 00:13:10.660 |
A lot of people seem to like to complain about marriage, 00:13:15.260 |
So it could be, the relationship you might have 00:13:18.900 |
with Marty is like, "Oh, there he goes again, 00:13:35.420 |
And there's, I mean, some people, a lot of people, 00:13:38.940 |
show love by sort of busting each other's chops, 00:13:49.780 |
And there's so many possible options for humor there. 00:13:58.300 |
You can be like, "Yes, I'm an agent of the CIA, 00:14:06.860 |
Yes, I'm watching you because you're so important 00:14:13.340 |
Or just any kind of making fun of people, I don't know. 00:14:20.500 |
because when it comes to robots or artificial agents, 00:14:26.980 |
than they would some other machine or device or object. 00:14:34.020 |
it might be combined with love or like whatever it is, 00:14:38.140 |
because they view these things as social agents 00:14:52.140 |
because people viewed Clippy as a social agent, 00:14:55.780 |
when Clippy was annoying and would like bother them 00:14:58.220 |
and interrupt them and like not remember what they told him, 00:15:03.380 |
because it wasn't fulfilling their social expectations. 00:15:08.680 |
than they would have if it had been a different, 00:15:16.740 |
that we're on the wrong path with a particular robot? 00:15:19.580 |
Or is it possible, like again, like marriage, 00:15:29.700 |
where we can find deep, meaningful relationships? 00:15:40.340 |
I would have designed Marty a little differently. 00:15:43.620 |
Isn't there a charm to the clumsiness, the slowness? 00:15:46.900 |
- There is if you're not trying to get through 00:16:01.820 |
is they slapped two big googly eyes on Marty. 00:16:13.940 |
And so like, is there a way to design the robot 00:16:21.020 |
in a way that people are actually attracted to 00:16:33.860 |
if it would help to make Marty seem like an entity 00:16:38.820 |
of its own versus the arm of a large corporation. 00:16:43.820 |
So there's some sense where this is just the camera 00:16:50.740 |
that's monitoring people versus this is an entity 00:16:56.180 |
It has its own task and it has its own personality. 00:17:10.700 |
our basic assumption is whatever I say to this human being, 00:17:14.140 |
it's not like being immediately sent to the CIA. 00:17:24.200 |
I mean, I don't know if that even with microphones here, 00:17:28.060 |
but for some reason, I think probably with Marty, 00:17:31.560 |
I think when it's done really crudely and crappily, 00:17:36.640 |
you start to realize, oh, this is like PR people 00:17:39.580 |
trying to make a friendly version of a surveillance machine. 00:17:44.580 |
But I mean, that reminds me of the slight clumsiness 00:17:50.700 |
or significant clumsiness on the initial releases 00:17:55.660 |
I don't know, what are your actual thoughts about that? 00:18:12.280 |
where you can have virtual meetings and stuff like that. 00:18:18.320 |
it feels like a similar problem to social robotics, 00:18:25.160 |
which is how you design a digital virtual world 00:18:29.240 |
that is compelling when you connect to others 00:18:34.580 |
there in the same way that physical connection is. 00:18:40.180 |
I mean, I've seen people joking about it on Twitter 00:18:45.660 |
'Cause there's something you can't quite put into words 00:18:49.420 |
that doesn't feel genuine about the way it looks. 00:18:54.420 |
And so the question is, if you and I were to meet virtually, 00:19:26.180 |
I'm not a gamer, so I don't have any examples, 00:19:28.500 |
but I feel like there's this whole world in video games 00:19:33.580 |
and depending on the game, they have different avatars, 00:19:36.580 |
and a lot of the games are about connecting with others. 00:19:42.420 |
and again, I haven't looked into this at all. 00:19:45.420 |
I've been shockingly not very interested in the metaverse, 00:19:48.820 |
but they must've poured so much investment into this meta. 00:20:04.140 |
There's gotta be some thinking behind it, right? 00:20:25.700 |
and they often, large corporations have a way 00:20:33.700 |
that's required to create something really compelling. 00:20:45.340 |
and it somehow becomes generic through that process. 00:20:50.060 |
because it could be controversial, is that, or? 00:20:54.340 |
Like, what, I mean, we're living through this now, 00:20:58.740 |
like, with a lot of people with cancellations 00:21:04.140 |
people are nervous, and nervousness results in, 00:21:07.180 |
like usual, the assholes are ruining everything. 00:21:16.500 |
of, like, with people you like who are not assholes, 00:21:20.060 |
good people, like, some of the fun in the metaverse 00:21:24.140 |
or in video games is, you know, being edgier, 00:21:28.100 |
being interesting, revealing your personality 00:21:39.300 |
in metaverse, the possibility of sexual assault 00:21:42.300 |
and sexual harassment and all that kind of stuff, 00:21:50.420 |
but not too much, because then you remove completely 00:21:54.820 |
Then everybody's just like a vanilla bot that, 00:22:01.660 |
to be a little bit political, to be a little bit edgy, 00:22:19.820 |
If you look at, I think Grimes tweeted about this, 00:23:02.860 |
that's kind of wild that you would take a paperclip 00:23:08.860 |
- And suddenly people are like, oh, you're annoying, 00:23:15.700 |
that clippy thing wouldn't even survive Microsoft 00:23:24.760 |
there'll be these meetings about why is it a paperclip? 00:23:28.260 |
Like, why don't we, it's not sufficiently friendly, 00:23:47.500 |
- I think the killing of the creativity is in the whole, 00:23:50.460 |
like, okay, so what I know from social robotics is like, 00:23:59.140 |
you'd create a robot that looks like a humanoid 00:24:04.100 |
Now, suddenly you do have all of these issues 00:24:07.220 |
where are you reinforcing an unrealistic beauty standard? 00:24:35.740 |
And now you don't even have any of these bias issues 00:24:39.700 |
And so how do we create that within companies? 00:24:44.500 |
like, 'cause I, you know, maybe we disagree on that. 00:24:48.220 |
I don't think that edginess or humor or interesting things 00:24:56.460 |
There are ways to find things that everyone is fine with. 00:25:06.460 |
- And so they will find harm in things that have no harm. 00:25:11.140 |
because their whole job is to find harm in things. 00:25:19.900 |
doesn't necessarily, doesn't need to be a thing 00:25:25.500 |
Great humor, great personality doesn't have to, 00:25:31.180 |
But yeah, I mean, but it's tricky to get right. 00:25:46.860 |
or people whose job it is, like you say, to mitigate risk, 00:25:56.100 |
Yeah, you get the problem in all organizations. 00:25:58.060 |
So I think that you're right, that that is a problem. 00:26:26.780 |
Like, yeah, there's a good heart and make sure, 00:26:37.620 |
that's going to actually make a lot of people feel good. 00:26:48.540 |
They imagine like millions of people using the thing 00:26:53.980 |
That's that, you could say that about social robotics, 00:27:13.580 |
You have a picture of two hospital delivery robots 00:27:18.620 |
see your book, I appreciate that it keeps the humor in. 00:27:25.460 |
- No, no one edited the book, we got rushed through. 00:27:29.900 |
- The caption reads, two hospital delivery robots 00:27:36.540 |
made me roll my eyes so hard they almost fell out. 00:27:45.260 |
The form factor is fine, it's like a little box on wheels. 00:27:50.180 |
That'll let people enjoy interacting with them. 00:27:53.820 |
We know that even just giving a robot a name, 00:27:56.220 |
people will, it facilitates technology adoption, 00:28:14.300 |
a lot of robots are named according to gender biases 00:28:24.020 |
So, you know, robots that are helpful and assistance 00:28:27.780 |
and are like nurses are usually female-gendered. 00:28:32.180 |
Robots that are powerful, all-wise computers like Watson 00:32:35.860 |
usually have like a booming male-coded voice and name. 00:32:45.220 |
You're opening a can of worms for no reason, for no reason. 00:28:55.500 |
So to some extent, I don't like PR departments, 00:29:11.300 |
And then you can always make your good leadership choices 00:29:23.260 |
and making sure that you're getting the right feedback 00:29:25.100 |
from the right people, I think that's really important. 00:29:28.220 |
- So don't unnecessarily propagate the biases of society. 00:29:35.320 |
But if you're not careful when you do the research of, 00:29:40.320 |
like you might, if you ran a poll with a lot of people 00:30:03.600 |
They did marketing research and nobody wanted the male voice. 00:30:10.920 |
If I were to say, I think the role of a great designer, 00:30:26.180 |
But if everyone wants Alexa to be a female voice, 00:30:33.220 |
about the future of social agents in the home and think. 00:30:43.460 |
So in some sense, there's this weird tension. 00:30:49.020 |
but at the same time, you're creating a thing 00:31:06.060 |
have historically done very well at understanding a market 00:31:11.780 |
It's not to listen to what the current market says. 00:31:27.020 |
And I agree with you that I would like to see more of that, 00:31:29.440 |
especially when it comes to existing biases that we know. 00:31:39.840 |
of companies that don't even think about it at all 00:31:49.820 |
But to be really forward looking and be really successful, 00:31:57.400 |
- But do you think it's still useful to gender 00:32:22.500 |
And maybe that's better than just entrenching something 00:32:36.380 |
to give it attributes that people identify with. 00:32:53.580 |
And I think that actually that's irresponsible 00:33:15.460 |
You have to start to think about how they're going to do it. 00:33:20.780 |
that's not supposed to have any kind of social component, 00:33:28.980 |
Like that arm, I worked a lot with quadrupeds now 00:33:35.860 |
You know, that arm, people think is the head immediately. 00:34:00.460 |
- Well, Boston Dynamics is an interesting company, 00:34:04.220 |
or any of the others that are doing a similar thing 00:34:14.640 |
So like surveillance of factories and doing dangerous jobs. 00:34:25.400 |
'cause it's for some reason, as you talk about, 00:34:41.780 |
to put them in the home and have a good social connection. 00:34:57.140 |
- It's almost like an unspoken tongue in cheek thing. 00:35:01.460 |
They're aware of how people are going to feel 00:35:21.760 |
they were really, really focused on just the engineering. 00:35:24.480 |
I mean, they're at the forefront of robotics, 00:35:42.900 |
Like there can't be like some sort of incline 00:35:48.900 |
a lot of integrity about the authenticity of them. 00:36:13.360 |
- What I like about Boston Dynamics and similar companies, 00:36:33.860 |
It's like testing the robustness of the system. 00:36:36.640 |
I mean, they're having a lot of fun there with the robots. 00:37:00.660 |
Like where do you keep them? - Yeah, I'm working on them. 00:37:04.080 |
is to have at any one time always a robot moving. 00:37:13.020 |
- Well, I have like more Roombas than I know what to do with. 00:37:20.900 |
- And I have a bunch of little, like I built, 00:37:40.060 |
just a passively functioning robot always moving. 00:37:48.860 |
'cause I'd love to create that kind of a little world. 00:37:51.180 |
So the impressive thing about Boston Dynamics to me 00:38:00.940 |
the most impressive thing that still sticks with me 00:38:02.940 |
is there was a spot robot walking down the hall 00:38:19.980 |
And it's just like walking down this long hall. 00:38:28.020 |
So presumably some kind of automation was doing the map. 00:38:30.960 |
I mean, the whole environment is probably really well mapped. 00:38:33.860 |
But it was just, it gave me a picture of a world 00:38:38.700 |
where a robot is doing his thing, wearing a cowboy hat, 00:38:45.420 |
Like I don't know what it's doing, what's the mission. 00:38:47.420 |
But I don't know, for some reason it really stuck with me. 00:38:49.780 |
You don't often see robots that aren't part of a demo 00:38:53.220 |
or that aren't, you know, like with a semi-autonomous 00:38:57.660 |
or autonomous vehicle, like directly doing a task. 00:39:05.660 |
- Well, yeah, you know, I mean, we're at MIT. 00:39:11.420 |
And they were all like broken or like not demoing. 00:39:16.460 |
And what really excites me is that we're about to have that. 00:39:21.460 |
We're about to have so many moving about too. 00:39:30.860 |
There's delivery robots in some cities, on the sidewalks. 00:39:38.500 |
Because yeah, you see a robot walking down the hall 00:39:44.660 |
This is awesome and scary and kind of awesome. 00:39:49.340 |
That's one of the things that I think companies 00:39:51.260 |
are underestimating that people will either love a robot 00:39:56.460 |
So it's just, again, an exciting time to be alive. 00:40:09.180 |
My son hates the Roomba because ours is loud. 00:40:23.140 |
- Oh yeah, my kids, that's one of the first words 00:40:29.940 |
- Do they project intelligence out of the thing? 00:40:32.660 |
- Well, we don't really use it around them anymore 00:40:42.360 |
Even a Roomba, because it's moving around on its own, 00:40:47.000 |
I think kids and animals view it as an agent. 00:40:51.080 |
- So what do you think, if we just look at the state 00:40:54.440 |
of the art of robotics, what do you think robots 00:41:10.880 |
is maybe not a perfectly calibrated understanding 00:41:20.320 |
what's difficult in robotics, what's easy in robotics. 00:41:22.640 |
- Yeah, we're way behind where people think we are. 00:41:36.800 |
and it's the first autonomous warehouse robot 00:41:44.520 |
And so it's kind of, most people I think envision 00:41:48.640 |
that our warehouses are already fully automated 00:42:04.240 |
that have to be strong enough to move something heavy, 00:42:09.120 |
And so until now, a lot of the warehouse robots 00:42:12.320 |
had to just move along like preexisting lines, 00:42:17.180 |
And so having, I think that's one of the big challenges 00:42:23.120 |
and one of the big exciting things that's happening 00:42:30.760 |
where people and robots can work side by side 00:42:37.120 |
sort of the physical manipulation task with humans. 00:42:44.760 |
I think that's what people are worried about, 00:42:46.360 |
like this malevolent robot gets mad of its own 00:42:52.480 |
No, it's actually very difficult to know where the human is 00:43:04.800 |
an industrial robotic arm, which is extremely powerful. 00:43:21.000 |
in the factory setting is really fascinating. 00:43:29.600 |
I think that there's a ton of disruption that's happening 00:43:35.240 |
I think speaking specifically of the Amazon warehouses, 00:43:40.880 |
that might be an area where it would be good for robots 00:43:45.660 |
where people are put in a position where it's unsafe 00:43:50.240 |
And probably it would be better if a robot did that. 00:43:53.400 |
And Amazon is clearly trying to automate that job away. 00:43:57.240 |
So I think there's gonna be a lot of disruption. 00:44:20.280 |
So there you have this job that is very unsafe 00:44:30.080 |
And now you have all these different robotic machines 00:44:42.560 |
and like control these autonomous mining trucks. 00:44:58.520 |
You're gonna see more jobs created because that's, 00:45:03.120 |
the future is not robots just becoming like people 00:45:08.360 |
The future is really a combination of our skills 00:45:11.320 |
and then the supplemental skillset that robots have 00:45:18.680 |
to give people work that they actually enjoy doing 00:45:35.480 |
because of course specific jobs are going to get lost. 00:45:39.540 |
- If you look at the history of the 20th century, 00:45:41.660 |
it seems like automation constantly increases productivity 00:45:53.000 |
So like thinking about this time being different 00:45:58.000 |
is that we would need to go against the lessons of history. 00:46:03.500 |
- And the other thing is I think people think 00:46:06.500 |
that the automation of the physical tasks is easy. 00:46:10.040 |
I was just in Ukraine and the interesting thing is, 00:46:13.080 |
I mean, there's a lot of difficult and dark lessons 00:46:48.840 |
and you wanna do a good job of it, unlimited money. 00:47:07.260 |
- But yes, but figuring out also how to disable the mine. 00:47:27.680 |
That really does a lot of damage to the land. 00:47:37.240 |
It requires a huge amount of human-like experience, 00:47:45.120 |
- Yeah, I think we overestimate what we can automate. 00:47:53.280 |
I mean, it continues that the story of humans, 00:47:57.560 |
we think we're shitty at everything in the physical world, 00:48:20.320 |
but like robot cars are great at predictable things 00:48:25.320 |
and can react faster and more precisely than a person 00:48:37.920 |
is because of this long tail of just unexpected occurrences 00:48:46.080 |
That's a horse and carriage ahead of me on the highway, 00:48:48.440 |
but the car has never encountered that before. 00:48:50.080 |
So like in theory, combining those skill sets 00:48:59.440 |
the human-robot interaction and the handoffs. 00:49:01.400 |
So like in cars, that's a huge problem right now, 00:49:08.880 |
And that's really the future is human-robot interaction. 00:49:15.800 |
it's terrible that people die in car accidents, 00:49:19.760 |
but I mean, it's like 70, 80, 100 million miles, 00:49:33.040 |
Like think about it, like the, how many people? 00:49:37.040 |
Think just the number of people throughout the world 00:49:41.320 |
all of this, you know, sleep deprived, drunk, 00:49:45.360 |
distracted, all of that, and still very few die 00:49:55.760 |
see, when I was like in the beginning of the 20th century, 00:49:58.680 |
riding my horse, I would talk so much shit about these cars. 00:50:13.120 |
and it's going to be destructive to all of human society. 00:50:27.520 |
and to mimic that in the machine is really tricky. 00:50:35.400 |
how far machine learning can go on vision alone. 00:50:39.860 |
and people that are, at least from my perspective, 00:50:43.760 |
people that are kind of critical of Elon and those efforts, 00:51:00.600 |
wouldn't have guessed how much you can do on vision alone. 00:51:03.640 |
It's kind of incredible, because we would be, 00:51:07.080 |
I think it's that approach, which is relatively unique, 00:51:11.840 |
has challenged the other competitors to step up their game. 00:51:16.620 |
So if you're using LiDAR, if you're using mapping, 00:51:19.320 |
that challenges them to do better, to scale faster, 00:51:26.660 |
and to use machine learning and computer vision as well 00:51:35.580 |
And I'm not, I don't know if I even have a good intuition 00:51:45.460 |
So all the sunset, all the edge cases you mentioned. 00:51:59.340 |
My current intuition is that we're gonna get there. 00:52:04.140 |
- But I didn't before, I wasn't sure we're gonna get there 00:52:13.060 |
So I was kind of, this is like with vision alone, 00:52:25.660 |
You're gonna have to solve some of the big problems 00:52:28.940 |
in artificial intelligence, not just perception. 00:52:35.780 |
- Like you have to have a deep understanding of the world, 00:52:41.300 |
I mean, I'm continually surprised how well the thing works. 00:52:44.460 |
Obviously Elon and others, others have stopped, 00:52:47.380 |
but Elon continues saying we're gonna solve it in a year. 00:52:51.340 |
- Well, yeah, that's the thing, bold predictions. 00:52:54.580 |
- Yeah, well, everyone else used to be doing that, 00:53:09.820 |
I mean, the UK just committed 100 million pounds 00:53:14.620 |
to research and development to speed up the process 00:53:23.780 |
and it's going to happen and it's gonna change everything. 00:53:28.180 |
And like Waymo, low key has driverless cars in Arizona. 00:53:33.180 |
Like you can get, you know, there's like robots. 00:53:45.820 |
'Cause the most awesome experience is the wheel turning 00:53:57.100 |
it feels like you're a passenger with that friend 00:54:07.780 |
But then you kind of, that experience, that nervousness 00:54:12.780 |
and the excitement of trusting another being, 00:54:18.060 |
and in this case, it's a machine, is really interesting. 00:54:20.860 |
Just even introspecting your own feelings about the thing. 00:54:27.820 |
- They're not doing anything in terms of making you feel 00:54:50.260 |
Let's not discuss the fact that this is a robot driving you 00:54:55.020 |
And if the robot wants to start driving 80 miles an hour 00:54:57.500 |
and run off of a bridge, you have no recourse. 00:55:03.340 |
There's no discussion about like how shit can go wrong. 00:55:08.260 |
There's like a map showing what the car can see. 00:55:21.380 |
You have a button you can like call customer service. 00:55:24.420 |
- Oh God, then you get put on hold for two hours. 00:55:34.660 |
but you know, the car just can pull over and stop 00:55:40.900 |
and then they'll actually drive the car for you. 00:55:43.180 |
But that's like, you know, what if you're late for a meeting 00:55:48.780 |
- Or like the more dystopian, isn't it The Fifth Element 00:55:55.820 |
- Oh yeah, and he gets into like a robotic cab 00:56:00.180 |
And then because he's violated a traffic rule, 00:56:02.260 |
it locks him in and he has to wait for the cops to come 00:56:06.460 |
So like, we're gonna see stuff like that maybe. 00:56:13.740 |
that have robots, the only ones that will succeed 00:56:28.860 |
'cause they're gonna have to earn people's trust. 00:56:31.860 |
- Yeah, but like Amazon works with law enforcement 00:56:34.500 |
and gives them the data from the Ring cameras. 00:56:44.520 |
- No, no, but basically any security camera, right? 00:56:56.060 |
because we don't want it to go to law enforcement 00:57:09.140 |
Maybe that's true for cameras, but with robots, 00:57:15.900 |
people are just not gonna let a robot inside their home 00:57:19.500 |
where like one time where somebody gets arrested 00:57:36.260 |
- 'Cause in the modern world, people are get, 00:57:39.860 |
They get extremely paranoid about any kind of surveillance. 00:57:52.500 |
like they don't, that's a different discourse 00:57:59.500 |
it's loud in our ears 'cause we're in those circles. 00:58:04.380 |
that does social robotics and not win over Twitter? 00:58:09.420 |
- I feel like the early adopters are all on Twitter 00:58:15.620 |
Feels like nowadays you'd have to win over TikTok, honestly. 00:58:35.860 |
maybe people would be able to give up privacy 00:58:55.580 |
Like, wow, I did not anticipate those improving so quickly 00:59:03.780 |
And one of the things that I'm trying to be cynical about 00:59:06.480 |
is that I think they're gonna have a big impact 00:59:09.340 |
on privacy and data security and like manipulating consumers 00:59:14.740 |
you'll have these agents that people will talk to 00:59:22.700 |
So kind of like we were talking about before. 00:59:24.900 |
And at the same time, the technology is so freaking exciting 00:59:31.020 |
- Well, it's not even just the collection of data 00:59:49.020 |
He had actually a really good post from somebody else. 00:59:59.980 |
I left a note for myself to reach out to her. 01:00:04.380 |
and just a great summarizer of the state of AI. 01:00:11.060 |
where it was looking at AI explaining that it's a squirrel. 01:00:25.260 |
of human-like feelings and I think even consciousness. 01:00:30.260 |
And so she was like, oh cool, that's impressive. 01:00:33.780 |
I wonder if an AI can also describe the experience 01:00:37.260 |
And so she interviewed, I think she did GPT-3 01:00:46.980 |
What's it like being an algorithm that powers a Roomba? 01:00:49.380 |
And you can have a conversation about any of those things 01:00:53.260 |
and they're very convincing. - It's pretty convincing, yeah. 01:01:05.940 |
It's like, yeah, that probably is what a squirrel would-- 01:01:14.620 |
- Oh yeah, I get to eat nuts and run around all day. 01:01:20.540 |
like when you tell them that you're a squirrel? 01:01:27.620 |
to find out that you're a squirrel or something like this. 01:01:36.980 |
like what do you think when they find out you're a squirrel? 01:01:41.980 |
I hope they'll see how fun it is to be a squirrel 01:01:45.420 |
- What do you say to people who don't believe you're a squirrel 01:01:59.900 |
doesn't mean it actually has that experience. 01:02:02.420 |
But then secondly, these things are getting so advanced 01:02:10.660 |
That's, I mean, just the implications for health, 01:02:15.300 |
education, communication, entertainment, gaming. 01:02:19.660 |
Like I just feel like all of the applications, 01:02:22.340 |
it's mind boggling what we're gonna be able to do with this. 01:02:25.500 |
And that my kids are not gonna remember a time 01:02:28.420 |
before they could have conversations with artificial agents. 01:02:48.940 |
it doesn't matter if he is or not, this is coming. 01:02:59.620 |
So in that sense, you start to think about a world 01:03:08.980 |
kind of having an implied belief that the thing is sentient. 01:03:15.780 |
And I think that one of the things that bothered me 01:03:21.780 |
like obviously I don't believe the system is sentient. 01:03:25.020 |
Like I think that it can convincingly describe that it is. 01:03:28.060 |
I don't think it's doing what he thought it was doing 01:03:33.260 |
But a lot of the tech press was about how he was wrong 01:03:50.260 |
when they're doing human-robot interaction experiments 01:04:06.380 |
So I think that the goal is not to discourage 01:04:25.100 |
- So one of the really interesting perspectives 01:04:32.360 |
is to see them, not to compare a system like this to humans, 01:04:35.960 |
but to compare it to animals, of how we see animals. 01:04:45.040 |
than the human analogy, the analogy of robots as animals? 01:04:48.400 |
- Yeah, and it gets trickier with the language stuff, 01:04:54.800 |
I think that animals are a really great thought experiment 01:05:07.300 |
but also I think for the future, we don't want that. 01:05:15.160 |
throughout history for so many different things, 01:05:17.140 |
we domesticated them, not because they do what we do, 01:05:19.860 |
but because what they do is different and that's useful. 01:05:27.200 |
whether we're talking about work integration, 01:05:29.800 |
whether we're talking about responsibility for harm, 01:05:32.120 |
there's just so many things we can draw on in that history 01:05:40.580 |
that are applicable to how we should be thinking 01:05:49.100 |
Obviously, there are tons of differences there. 01:05:50.920 |
Like you can't have a conversation with a squirrel, right? 01:06:04.380 |
I suspect they're much bigger assholes than we imagine. 01:06:12.320 |
it would fuck you over so fast if it had the chance. 01:06:40.900 |
People get so angry when I talk shit about cats. 01:06:47.940 |
all the different kinds of animals that get domesticated. 01:07:09.780 |
- It was a game changer because it revolutionized 01:07:12.900 |
what people could do economically, et cetera. 01:07:24.340 |
around autonomous vehicles or drones or delivery robots. 01:07:37.920 |
I think we're gonna see very similar things with robots. 01:07:42.860 |
but I think it helps us get away from this idea 01:07:45.900 |
that robots can, should, or will replace people. 01:08:01.460 |
Or like they'll use them to run electrical wire. 01:08:03.500 |
I think they did that for Princess Di's wedding. 01:08:13.620 |
Like the dolphins that they used in the military. 01:08:21.860 |
and the US still has dolphins in their navies. 01:08:24.540 |
Mine detection, looking for lost underwater equipment, 01:08:33.960 |
Which I think Russia's like, "Sure, believe that." 01:08:39.120 |
And America's like, "No, no, we don't do that." 01:08:42.780 |
But they started doing that in like the 60s, 70s. 01:08:54.460 |
that we can't do with machines or by ourselves." 01:09:02.620 |
in trying to make robots do the mine detection. 01:09:07.520 |
there are some things that the robots are good at 01:09:09.580 |
and there's some things that biological creatures 01:09:11.820 |
are better at, so they still have the dolphins. 01:09:19.340 |
The pigeons were the original hobby photography drone. 01:09:23.900 |
They also carried mail for thousands of years, 01:09:26.060 |
letting people communicate with each other in new ways. 01:09:28.460 |
So the thing that I like about the animal analogies, 01:09:35.580 |
that we don't have and that's just so useful. 01:09:46.500 |
or do things that we're not physically capable of. 01:09:52.240 |
I still feel like it's a really good analogy. 01:09:55.160 |
- And it works because people are familiar with it. 01:10:00.560 |
And when we start to think about cats and dogs, 01:10:03.660 |
they're pets that seem to serve no purpose whatsoever 01:10:18.100 |
They used to be guard dogs or they had some sort of function. 01:10:21.560 |
And then at some point they became just part of the family. 01:10:24.460 |
And it's so interesting how there's some animals 01:10:47.640 |
strong emotional connections to certain robots 01:11:00.260 |
with the culture and the people or the robot design? 01:11:13.580 |
Like farm animals to really get inside the home 01:11:24.300 |
and they can evolve much more quickly than other animals. 01:11:32.940 |
that dogs evolved to be more appealing to us. 01:11:38.220 |
we started breeding them to be more appealing to us too, 01:11:45.380 |
although we've bred them to make more milk for us. 01:11:50.780 |
I mean, there are cultures where people eat dogs still today 01:11:53.540 |
and then there's other cultures where we're like, 01:11:55.600 |
oh no, that's terrible, we would never do that. 01:11:58.660 |
And so I think there's a lot of different elements 01:12:03.220 |
'cause I understand dogs 'cause they use their eyes, 01:12:10.700 |
There's a whole conferences on dog consciousness 01:12:18.420 |
because they seem to not give a shit about the human. 01:12:33.620 |
- I think some people like that about people. 01:12:36.880 |
- Yeah, they want to push and pull of a relationship. 01:12:40.440 |
- They don't want loyalty or unconditional love. 01:12:47.480 |
And maybe that says a lot more about the people 01:12:55.460 |
So I'm judging harshly the people that have cats 01:13:03.840 |
are desperate for attention and unconditional love 01:13:35.340 |
I think robots are beautiful in the same way that pets are, 01:13:40.060 |
even children, in that they capture some kind of magic 01:13:46.040 |
They have the capacity to have the same kind of magic 01:13:53.780 |
Like when they're brought to life and they move around, 01:13:57.020 |
the way they make me feel, I'm pretty convinced, 01:14:02.740 |
is as you know, they will make billions of people feel. 01:14:07.740 |
Like I don't think I'm like some weird robotics guy. 01:14:15.860 |
I mean, I just, I can put on my like normal human hat 01:14:22.440 |
there's a lot of possibility there of something cool, 01:14:38.480 |
It's like a weird creature that used to be a wolf. 01:14:43.520 |
- Well, dogs can either express or mimic a lot of emotions 01:14:53.080 |
Like a lot of the magic of animals and robots 01:14:59.080 |
And the easier it is for us to see ourselves in something 01:15:03.360 |
and project human emotions or qualities or traits onto it, 01:15:08.240 |
And then you also have the movement, of course. 01:15:11.720 |
that's why I'm so interested in physical robots 01:15:14.160 |
because that's, I think the visceral magic of them. 01:15:17.280 |
I think we're, I mean, there's research showing 01:15:22.680 |
to respond to autonomous movement in our physical space 01:15:29.840 |
And so animals and robots are very appealing to us 01:15:47.720 |
is doing its own thing and then it recognizes you. 01:16:01.360 |
say you're walking in an airport on the street 01:16:09.320 |
and that like, well, you wake up to like that excitement 01:16:22.160 |
it makes you feel noticed and heard and loved. 01:16:27.400 |
Like that somebody looks at you and recognizes you 01:16:36.760 |
And I honestly think robots can give that feeling too. 01:16:42.880 |
one of the downsides of these systems is they don't, 01:16:52.440 |
they're trying to maintain privacy, I suppose, 01:17:03.120 |
And I think that that's the game changing nature 01:17:06.360 |
of things like these large language learning models. 01:17:09.440 |
And the fact that these companies are investing 01:17:11.880 |
in embodied versions that move around of Alexa, like Astro. 01:17:16.880 |
- Can I just say, yeah, I haven't, is that out? 01:17:23.000 |
You can't just like buy one commercially yet, 01:17:28.600 |
My gut says that these companies don't have the guts 01:17:38.680 |
This goes to the, because it's edgy, it's dangerous. 01:17:44.680 |
Like in a way that, you know, just imagine, okay. 01:17:50.360 |
If you do the full landscape of human civilization, 01:18:03.400 |
the amount of deep heartbreak that's happening. 01:18:09.320 |
have more of a personal connection with the human, 01:18:13.160 |
you're gonna have humans that like have existential crises. 01:18:20.040 |
And like, you're now taking on the full responsibility 01:18:29.160 |
As a company, like imagine PR and marketing people, 01:18:46.040 |
I haven't tried it, but Replica's like a virtual companion. 01:18:51.200 |
And if big companies don't do it, someone else will. 01:19:00.580 |
if you think about all the AI we have around us, 01:19:10.960 |
- You don't think that's just because they weren't able? 01:19:17.400 |
- I mean, it might be true, but I have to wonder, 01:19:33.000 |
- They don't have to, it's bad for business in the short term. 01:19:35.440 |
- I'm gonna be honest, maybe it's not such a bad thing 01:19:48.680 |
with the responsibility of unforeseen effects on people, 01:20:01.920 |
then you're gonna have to collect data from people, 01:20:04.540 |
which you will anyway to personalize the thing, 01:20:06.960 |
and you're gonna be somehow monetizing the data, 01:20:12.200 |
It just, it seems like now we're suddenly getting 01:20:15.000 |
into the realm of like severe consumer protection issues, 01:20:22.120 |
I see massive potential for this technology to be used 01:20:28.080 |
and not, I mean, that's in an individual user's interest 01:20:35.060 |
- Yeah, see, I think that kind of personalization 01:20:44.980 |
I think you should own all the data your phone knows 01:21:02.100 |
I think that's the only way people will trust you 01:21:12.240 |
on massive troves of data to train the AI system. 01:21:25.560 |
but obvious, like show, like in the way I opt in 01:21:37.120 |
I have to choose, like, how well do I know you? 01:21:39.760 |
And then I say, like, don't tell this to anyone. 01:21:50.000 |
In that same way, like, it's very transparent 01:21:53.000 |
in which data you're allowed to use for which purposes. 01:21:56.920 |
- That's what people are saying is the solution, 01:22:03.760 |
I think it breaks down at the point at which, 01:22:08.840 |
like, people are willingly giving up their data 01:22:10.720 |
because they're getting a functionality from that, 01:22:17.080 |
like, maybe to someone else, and not to them personally. 01:22:21.280 |
- I don't think people are giving their data. 01:22:24.080 |
Like, it's not consensual. - But if you were asked, 01:22:27.840 |
if you were like, tell me a secret about yourself 01:22:31.120 |
and I'll give you $100, I'd tell you a secret. 01:22:38.200 |
You wouldn't trust, like, why are you giving me $100? 01:22:43.040 |
- But, like, I need, I would ask for your specific, 01:22:48.040 |
like, fashion interest in order to give recommendations 01:22:54.080 |
to you for shopping, and I'd be very clear for that, 01:22:57.120 |
and you could disable that, you can delete that. 01:23:00.480 |
But then you can be, have a deep, meaningful, 01:23:11.200 |
the full history of all the things you've worn, 01:23:19.640 |
all of that information that's mostly private to even you, 01:23:23.800 |
not even your loved ones, that, a system should have that, 01:23:33.240 |
that system could tell you a damn good thing to wear. 01:23:35.840 |
- It could, and the harm that I'm concerned about 01:23:40.240 |
is not that the system is gonna then suggest a dress for me 01:23:47.520 |
where I was talking to the people who do the analytics 01:23:55.040 |
I can ask you three totally unrelated questions 01:24:01.520 |
And so what they do is they aggregate the data 01:24:08.400 |
and then they have a lot of power and control 01:24:25.200 |
Like, I think it's way more complex than just 01:24:38.520 |
vulnerable populations that are targeted by this, 01:24:41.560 |
low-income people being targeted for scammy loans, 01:24:45.000 |
or, I don't know, like, I could get targeted, 01:24:57.480 |
And there's all these ways that you can manipulate people 01:24:59.720 |
where it's not really clear that that came from 01:25:06.760 |
it came from all of us, all of us opting into this. 01:25:11.320 |
- But there's a bunch of sneaky decisions along the way 01:25:15.200 |
that could be avoided if there's transparency. 01:25:21.240 |
if you share that data with too many ad networks, 01:25:29.200 |
- Okay, and that's something that you could regulate. 01:25:35.080 |
and all of the ways you allow the company to use it, 01:25:46.640 |
And also, it has to do with the recommender system itself 01:25:51.640 |
from the company, which is freezing your eggs. 01:26:06.960 |
So not the kind of things that the category of people 01:26:16.320 |
what makes you happy, what is helping you grow. 01:26:19.320 |
- But you're assuming that people's preferences 01:26:33.700 |
that's the thing that I'm more concerned about. 01:26:39.620 |
but manipulating them into wanting something. 01:27:10.400 |
- So like on Instagram, there will be some person 01:27:15.880 |
And then a brand will hire them to promote some product. 01:27:21.440 |
They disclose, this is an ad that I'm promoting, 01:27:40.760 |
even though they know that you're being paid, 01:27:52.600 |
I have 10 times more sponsors that wanna be sponsors 01:28:06.560 |
Like there's no incentive to like shill for anybody. 01:28:17.320 |
Now, if you're a bot, you're not gonna discriminate. 01:28:23.480 |
oh, well, I think this product is good for people. 01:28:27.000 |
- You think there'll be like bots essentially 01:28:46.480 |
And like, so we're starting to do some research 01:29:01.800 |
Back in the day where like a kid who's 12 understands, 01:29:09.160 |
Now it's getting really, really murky with influencers. 01:29:15.200 |
that a kid has developed a relationship with, 01:29:18.080 |
is it okay to market products through that or not? 01:29:23.240 |
because you're developing a trusted relationship 01:29:30.640 |
And so now it's like personalized, it's scalable, 01:29:57.480 |
- Well, yeah, so like kids who are old enough to understand 01:30:09.060 |
That age child, so Daniela Di Paola, Anastasia Ostrovsky, 01:30:14.060 |
and I advised on this project, they did this. 01:30:18.440 |
They asked kids who had interacted with social robots, 01:30:21.340 |
whether they would like a policy that allows robots 01:30:27.360 |
to market to people through casual conversation, 01:30:36.680 |
And the majority said they preferred the casual conversation. 01:30:40.360 |
And when asked why, there was a lot of confusion about, 01:30:43.000 |
they were like, well, the robot knows me better 01:30:46.300 |
So the robot's only gonna market things that I like. 01:30:49.760 |
And so they don't really, they're not connecting the fact 01:30:57.000 |
And I think that even happens subconsciously with grownups 01:31:00.140 |
when it comes to robots and artificial agents. 01:31:04.360 |
sorry I'm going on and on, but like his main concern 01:31:21.480 |
I think ultimately that confusion should be alleviated 01:31:28.920 |
and should not have any control from the company. 01:31:41.960 |
- Cost money, like the way Windows operating system does. 01:31:55.640 |
So it's helping you, it's like a personal assistant. 01:32:01.720 |
You should, you know, whatever it is, 10 bucks, 20 bucks. 01:32:14.720 |
You should talk shit about all the other companies 01:32:18.100 |
But also, yeah, in terms of if you purchase stuff 01:32:46.320 |
- It's actually trying to discover what you want. 01:32:58.600 |
the thing that would actually make you happy. 01:33:09.600 |
once people understand the value of the robot, 01:33:20.760 |
or helping with like preferences or anything. 01:33:40.000 |
at least not yet, of making like this contained thing 01:33:51.760 |
that you're not like paying a subscription for. 01:33:53.720 |
It's not like controlled by one of the big corporations. 01:34:05.760 |
and people understand the negative effects of social media, 01:34:13.080 |
I think we're just waking up to the realization 01:34:19.120 |
finding our legs in this new world of social media, 01:34:32.560 |
is social media is evil and it's doing bad by us. 01:34:47.440 |
- All right, it's time for us to start that company. 01:34:51.820 |
- Hopefully no one listens to this and steals the idea. 01:35:12.600 |
like it's obvious that there will be a robotics company 01:35:17.600 |
that puts a social robot in the home of billions of homes. 01:35:40.280 |
there's Optimus, Tesla's Optimus robot that's, 01:36:01.640 |
there's certain tasks that are designed for humans. 01:36:05.600 |
So it's hard to automate with any other form factor 01:36:09.840 |
And then the other reason is because so much effort 01:36:13.760 |
has been put into this giant data engine machine 01:36:21.160 |
that's seemingly, at least the machine, if not the data, 01:36:25.320 |
is transferable to the factory setting, to any setting. 01:36:28.880 |
- Yeah, he said it would do anything that's boring to us. 01:36:43.880 |
Like I talked to him on mic and off mic about it quite a bit. 01:36:55.080 |
to me it's obvious if a thing like that works at all, 01:37:02.520 |
In fact, it has to work really well in a factory. 01:37:06.600 |
If it works kind of shitty, it's much more useful in the home 01:37:45.200 |
Whenever somebody you love like falls down the stairs, 01:38:13.960 |
There's a few things that are really interesting there. 01:38:16.880 |
One, because he's taking it extremely seriously. 01:38:25.800 |
I talked to Jim Keller offline about this a lot. 01:38:28.480 |
And currently humanoid robots cost a lot of money. 01:38:34.560 |
now they're not talking about all the social robotics stuff 01:38:38.240 |
They are thinking, how can we manufacture this thing cheaply 01:38:59.840 |
Why can't we build this humanoid form for under $1,000? 01:39:03.640 |
And like, I've sat and had these conversations. 01:39:15.880 |
they weren't focused on doing the mass manufacture. 01:39:20.200 |
People are focused on getting a thing that's, 01:39:30.520 |
in the beginning, or like multi-million dollar car. 01:39:49.640 |
what is the minimum amount of degrees of freedom we need? 01:39:57.120 |
Like how many, how do we, like where are the activators? 01:40:04.920 |
But also in a way that's like actually works. 01:40:07.400 |
How do we make the whole thing not part of the components 01:40:23.560 |
Do the exact same pipeline as you do for autonomous driving. 01:40:31.000 |
For the computer vision, for the manipulation task, 01:40:42.240 |
obviously the optimism about how long it's gonna take, 01:40:56.800 |
why the humanoid form, that doesn't make sense. 01:41:04.000 |
that are really excited about the humanoid form there. 01:41:10.520 |
And they might, it's like similar with Boston Dynamics. 01:41:30.160 |
But at the same time, they're pushing forward 01:41:38.400 |
- And with Elon, they're not just going to do that, 01:41:44.800 |
to where we can have humanoid bots in the home potentially. 01:41:57.960 |
I think it's a fascinating scientific problem 01:42:11.920 |
for anything other than art and entertainment 01:42:18.920 |
where you can't just rebuild like a spaceship. 01:42:25.960 |
they've worked for so many years on these spaceships, 01:42:30.480 |
You have some things that are just built for human bodies, 01:42:38.680 |
but it seems like we've already rebuilt factories 01:42:43.720 |
Why would we want to just like make a humanoid robot 01:42:58.880 |
- Well, most of our world is still built for humanoids. 01:43:04.080 |
It should be built so that it's wheelchair accessible. 01:43:30.160 |
and how to create infrastructure that works for everyone, 01:43:39.360 |
but that would pay off way more in the future 01:43:45.680 |
or maybe slightly less expensive humanoid technology 01:43:53.120 |
It's like autonomous driving can be easily solved 01:43:55.640 |
if you do V2I, if you change the infrastructure 01:43:59.400 |
of cities and so on, but that requires a lot of people. 01:44:06.040 |
and a lot of them are somewhat, if not a lot, corrupt, 01:44:22.400 |
but they're specifically, the openness makes it easier 01:44:26.760 |
I think a lot of amazing things in this world happen 01:44:39.480 |
you can say, "Why the hell I would go to Mars? 01:45:00.480 |
But there's something about the vision of the future, 01:45:06.240 |
the optimism laden that permeates this vision 01:45:12.000 |
It's thinking not just for the next 10 years, 01:45:23.040 |
And that, they're gonna come up with some cool shit 01:45:28.840 |
here's what I, 'cause Elon doesn't seem to care 01:45:32.720 |
about social robotics, which is constantly surprising to me. 01:45:37.480 |
humans are the things you avoid and don't hurt, right? 01:45:42.480 |
Like that's, like the number one job of a robot 01:45:47.960 |
The collaborative aspect, the human-robot interaction, 01:45:59.320 |
But my sense is if somebody like that takes on the problem 01:46:03.660 |
of human-robotics, we're gonna get a social robot out of it. 01:46:10.880 |
but people like Elon, if they take on seriously these, 01:46:14.460 |
like I can just imagine with a humanoid robot, 01:46:40.280 |
If you create for whatever the hell reason you want to, 01:46:42.960 |
a humanoid robot, you're gonna have to reinvent, 01:46:46.520 |
well, not reinvent, but introduce a lot of fascinating 01:46:50.580 |
new ideas into the problem of human-robot interaction, 01:47:04.220 |
and there's a lot of amazing people working on it, 01:47:06.660 |
it's like, all right, let's see what happens here. 01:47:13.020 |
I mean, they, I apologize if I'm ignorant on this, 01:47:18.020 |
but I think they really, more than anyone else, 01:47:27.920 |
like a leap with Atlas. - Oh yeah, with Atlas? 01:47:32.880 |
- And like without them, like why the hell did they do it? 01:47:36.480 |
Well, I think for them, it is a research platform. 01:47:38.480 |
It's not, I don't think they ever, this is speculation, 01:47:52.960 |
Yeah, I wonder if they, maybe the answer they landed on is, 01:48:07.640 |
but maybe they reached for, let's try to make, 01:48:18.160 |
to be picking up boxes, to moving boxes, to being, 01:48:21.600 |
it makes sense, okay, if they were exactly the same cost, 01:48:26.600 |
it makes sense to have a humanoid robot in the warehouse. 01:48:45.960 |
to think that you're gonna be able to change warehouses. 01:48:50.040 |
- If you're Amazon, you can totally change your warehouses. 01:49:03.840 |
- But isn't, shouldn't you do that investment in a way, 01:49:08.660 |
so here's the thing, if you build a humanoid robot 01:49:10.880 |
that works in the warehouse, that humanoid robot, 01:49:14.160 |
see, I don't know why Tesla is not talking about it this way, 01:49:20.240 |
is gonna have all kinds of other applications 01:49:29.120 |
but whoever solves the humanoid robot problem 01:49:32.000 |
are gonna have to solve the social robotics problem. 01:49:34.440 |
- Oh, for sure, I mean, they're already with Spot 01:49:42.000 |
I'm not sure if Spot is currently effective at scale, 01:49:50.400 |
perhaps Tesla will end up doing the same thing, 01:49:53.160 |
which is Spot is supposed to be a platform for intelligence. 01:49:58.160 |
So Spot doesn't have any high-level intelligence, 01:50:10.240 |
- And it's a platform that you can attach something to, yeah. 01:50:13.360 |
- And somebody else is supposed to do the attaching. 01:50:15.880 |
It's a platform that you can take an uneven ground 01:50:21.440 |
go into dangerous situations, it's a platform. 01:50:24.240 |
On top of that, you can add a camera that does surveillance, 01:50:29.280 |
you can record the camera, you can remote control it, 01:50:43.960 |
which is what would be required for automation, 01:50:48.520 |
Perhaps Tesla will do the same thing ultimately, 01:51:18.620 |
I think what got me excited was I saw a panel 01:51:26.600 |
and it became really clear how little we know 01:51:43.360 |
that's how I convince myself that that's why I'm excited. 01:51:54.320 |
and that humans come together to build things 01:51:59.260 |
I mean, there's just something inherently thrilling 01:52:05.000 |
because I feel like there are so many other challenges 01:52:08.120 |
and problems that I think are more important to solve, 01:52:12.200 |
but I also think we should be doing all of it at once. 01:52:14.680 |
And so to that extent, I'm like all for research 01:52:19.560 |
on humanoid robots, development of humanoid robots. 01:52:23.660 |
I think that there's a lot to explore and learn, 01:52:35.040 |
goes towards that, and it does take resources 01:52:39.080 |
and attention away from other areas of robotics 01:52:45.960 |
- So you think it might be a little bit of a distraction. 01:53:03.280 |
It's just, it's interesting from a research perspective, 01:53:07.040 |
but from a like what types of robots can we create 01:53:12.120 |
Like why would we just create a humanoid robot? 01:53:19.220 |
- Oh, arms can be useful, but like why not have three arms? 01:53:38.200 |
- But does your companion have to have like two arms 01:53:52.160 |
'cause it almost sounds like I don't want a robot 01:54:05.740 |
doesn't require arms or legs or so on, right? 01:54:11.540 |
of the meaningful impact that's gonna be happening. 01:54:15.380 |
- Yeah, I think just we could get so creative with the design 01:54:18.740 |
like why not have a robot on roller skates or like whatever? 01:54:27.180 |
- Still, it is a compelling and interesting form 01:54:32.700 |
- You co-authored a paper as you were talking about 01:54:43.100 |
I think you were talking about some of the ideas in that. 01:54:52.620 |
- You just go through people's Twitter feeds. 01:54:58.980 |
So there's a, you looked at me like you're offended. 01:55:05.580 |
- No, I was just like worried that like some early, I mean. 01:55:27.560 |
Nobody read it yet until we've written the final paper. 01:55:34.180 |
- By the time this comes out, I'm sure it'll be out. 01:55:37.420 |
- So basically, We Robot, that's the workshop 01:56:03.440 |
And it's basically looking at traditional media, 01:56:12.720 |
This is the kind of idea that you've been speaking to. 01:56:17.600 |
that social robots have personalized recommendations, 01:56:26.340 |
So person-to-person interaction is really nice, 01:56:33.340 |
But the social robots have those two elements. 01:56:57.800 |
- So when you say social robots, what does that mean? 01:57:01.640 |
- I think a lot of this applies to virtual ones too. 01:57:11.560 |
can enhance people's engagement with a device. 01:57:14.560 |
- But can embodiment be a virtual thing also? 01:57:17.080 |
Meaning like it has a body in the virtual world. 01:57:33.960 |
that you kind of associate to a physical object. 01:57:44.320 |
because now we have all these new virtual worlds 01:57:51.020 |
but there's research showing that something on a screen, 01:57:54.740 |
and something that is moving in your physical space, 01:58:02.500 |
- So, I mean, I have a sense that we can do that 01:58:10.600 |
Like when I've used VR, I jump around like an idiot 01:58:36.060 |
Like it pulls your mind in when it's well done. 01:58:38.700 |
So it really depends what's shown on the 2D screen. 01:58:43.060 |
Yeah, I think there's a ton of different factors 01:58:46.700 |
Like you can have embodiment in a virtual world. 01:58:49.860 |
You can have an agent that's simply text-based, 01:58:54.620 |
So I think there's a whole spectrum of factors 01:58:57.540 |
that can influence how much you engage with something. 01:59:00.060 |
- Yeah, I wonder, I always wondered if you can have like 01:59:14.140 |
No, but like, this is almost like Black Mirror, 01:59:21.660 |
or is able to convince you that it's being tortured 01:59:25.900 |
inside the computer and needs your help to get out. 01:59:36.260 |
Like we're not good at, as you've discussed in other, 01:59:39.380 |
in the physical form, like holding a robot upside down, 01:59:47.500 |
I think suffering is a really good catalyst for empathy. 01:59:51.620 |
And I just feel like we can project embodiment 02:00:03.820 |
And I think that's what happened with the LaMDA thing. 02:00:12.620 |
having the capacity for suffering and human emotion 02:00:15.740 |
that convinced the engineer that this thing was sentient. 02:00:29.580 |
Oh yeah, no, they actually made a Roomba scream 02:00:46.300 |
No, I, so I had, the way I programmed the Roombas 02:00:52.700 |
so contact between me and the robot is when it screams. 02:01:06.140 |
and that just, it's the easiest thing to program. 02:01:09.420 |
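[Lex doesn't go into implementation details here, but the behavior he describes is just a trigger loop: poll the bump sensor and play a scream clip whenever there's contact. A minimal illustrative sketch, assuming a hypothetical sensor interface and a stand-in for audio playback — none of these names come from the conversation or from any real Roomba SDK:]

```python
# Illustrative sketch only -- not Lex's actual code, and not a real Roomba API.
# The idea described above: poll a bump sensor and play a scream on contact.

import random
import time


class FakeBumpSensor:
    """Hypothetical stand-in for the robot's bump-sensor interface."""

    def bumped(self) -> bool:
        # A real version would read the robot's serial interface;
        # here contact is just simulated at random.
        return random.random() < 0.05


def play_scream() -> None:
    # Stand-in for playing a pre-recorded scream clip.
    print("AAAAAH!")


def main() -> None:
    sensor = FakeBumpSensor()
    while True:
        if sensor.bumped():
            play_scream()
        time.sleep(0.1)  # poll roughly ten times per second


if __name__ == "__main__":
    main()
```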
I haven't run those Roombas for over a year now, 02:01:11.780 |
but yeah, it was, my experience with it was that 02:01:30.500 |
So the capacity to suffer is a really powerful thing. 02:01:40.500 |
from the internet, but it's still, it's still, it's weird. 02:01:51.340 |
there is a realistic aspect of how it should scream 02:01:51.340 |
- Like if you kick it really hard, it should scream louder? 02:02:05.060 |
- No, it should scream at the appropriate time, not like-- 02:02:20.740 |
there's a timing there, the dynamics you have to get right 02:02:40.940 |
- I'm sorry to say, Roomba doesn't understand pain. 02:02:53.900 |
it starts, it's a weird, it's a really weird feeling. 02:03:04.860 |
because that, with all the ways that we talked about, 02:03:11.740 |
So the good way is like you can form a connection 02:03:14.060 |
with a thing, and a bad way that you can form a connection 02:03:17.380 |
in order to sell you products that you don't want. 02:03:26.540 |
- You tweeted, "We're about to be living in the movie 'Her' 02:03:29.820 |
except instead of," see, I researched your tweets, 02:03:36.900 |
except instead of about love, it's gonna be about," 02:03:40.260 |
what did I say, "the chatbot being subtly racist 02:03:46.900 |
for companies to charge for software upgrades." 02:04:00.100 |
I am, like, ah, it's so weird to be in this space 02:04:06.700 |
and also so excited about it at the same time. 02:04:14.380 |
and then with GPT-3 and then the LaMDA transcript, 02:04:14.380 |
that companies are really gonna have to figure out 02:04:40.540 |
Like, even, you know, the new image generation tools, 02:04:45.460 |
like DALL-E, where they've clearly put in a lot of effort 02:04:45.460 |
it gives you a diverse set of people, et cetera. 02:04:54.780 |
Like, even that one, people have already found numerous, 02:04:57.580 |
like, ways that it just kind of regurgitates biases 02:05:10.180 |
So I think that this is, like, the really tricky one 02:05:17.900 |
And that's why it's subtly racist and not overtly, 02:05:26.060 |
And then I think the other thing is gonna be, 02:05:29.940 |
people are gonna become so emotionally attached 02:05:33.700 |
to artificial agents with this complexity of language, 02:05:54.620 |
And I think there's gonna be way more issues than just that. 02:05:59.100 |
I think the tweet was software upgrades, right? 02:06:01.460 |
Like, how much is it okay to charge for something like that 02:06:09.060 |
- Oh, the ethics of that, that's interesting. 02:06:13.100 |
But there's also the practical funding mechanisms, 02:06:29.380 |
in Buddhist temples for the AIBOs that can't be repaired 02:06:32.180 |
because people really viewed them as part of their families. 02:06:38.860 |
The original, famous robot dog that Sony made 02:06:54.820 |
And then after a few years, you have to start paying, 02:06:58.660 |
I think it's like 300 a year for a subscription service 02:07:15.580 |
And a lot of that is outsourced to the cloud. 02:07:21.500 |
People should pay, and people who aren't using it 02:07:28.860 |
could you set that price to reflect a consumer's willingness 02:07:36.220 |
So if you know that people are really, really attached 02:07:41.220 |
to these things, just like they would be to a real dog, 02:07:53.540 |
but that's true for anything that people love, right? 02:08:00.940 |
Like there's all these new medical services nowadays 02:08:04.020 |
where people will shell out thousands and thousands of 02:08:20.940 |
and then all the money that it costs to get a divorce? 02:08:30.300 |
I think the society is full of scams that are like-- 02:08:35.820 |
the whole wedding industrial complex has created 02:08:38.580 |
all these quote unquote traditions that people buy into 02:08:57.460 |
And that's a really popular field in AI currently. 02:08:59.980 |
And a lot of people agree that it's an important field. 02:09:03.860 |
But the question is for like social robotics, 02:09:10.540 |
- Well, who else can, oh, to remove the bias of society. 02:09:19.020 |
like why shouldn't our robots also be subtly racist? 02:09:24.780 |
why do we put so much responsibility on the robots? 02:09:41.700 |
- Yes, exactly, I'm allowed to make that joke, yes. 02:09:47.100 |
And I've been nonstop reading about World War II and Hitler. 02:10:11.020 |
which I think was what a lot of people are focused on. 02:10:19.420 |
Now, you can also completely remove the personality. 02:10:23.860 |
You can completely remove the personalization. 02:10:39.220 |
So I do think, well, let's just start with the premise, 02:10:48.300 |
but it is the thing that we don't want in our society. 02:11:16.020 |
I don't know how this is gonna work in the market. 02:11:34.740 |
- And put things into context maybe with that. 02:11:38.260 |
- Yeah, or who might change over time with more experience. 02:11:42.060 |
A robot really just like regurgitates things, 02:11:46.460 |
entrenches them, could influence other people. 02:11:54.300 |
- Well, I think there's a difficult challenge here 02:12:09.380 |
Some people say that it's not enough not to be racist. 02:12:14.500 |
So you have to have a robot that constantly calls out, 02:12:31.020 |
'cause maybe you'll see racism in things that aren't racist. 02:12:39.900 |
I'm not exactly sure that, I mean, it's a tricky thing. 02:12:43.540 |
I guess I'm saying that the line is not obvious, 02:12:53.000 |
of what is harmful to different groups and so on. 02:13:01.420 |
is should a social robotics company be solving 02:13:06.060 |
or being part of solving the issues of society? 02:13:09.820 |
- Well, okay, I think it's the same question as 02:13:21.100 |
And the answer to that is yes, I am responsible, 02:13:30.540 |
And then the question is how do we deal with that? 02:13:32.740 |
And so as a person, how I aspire to deal with that is 02:13:41.020 |
because I have blind spots and people get angry, 02:13:52.420 |
maybe I'll tweet something that's well-intentioned 02:14:01.840 |
and then another group of people starts yelling at me, 02:14:03.940 |
which has happened, this happened to me actually around, 02:14:25.260 |
prefer 'child with autism' and then they disagree. 02:14:25.260 |
and that is part of the existing power structures that exist 02:14:51.240 |
- And you accept being attacked from both sides 02:15:00.980 |
AKA completely removed from your ability to tweet, 02:15:21.720 |
and then I decided that I was going to use 'autistic children' 02:15:32.200 |
- For now, yeah, until I get updated information 02:15:36.520 |
but I'm making choices and I'm moving forward 02:15:39.160 |
because being a coward and like just retreating from that, 02:15:46.440 |
You're a very smart person and an individual, 02:16:01.600 |
- Yeah, I mean, just, well, if you hired PR people, 02:16:06.600 |
like obviously they would see that and they'd be like, 02:16:14.240 |
You're getting attacked, it's bad for your brand. 02:16:18.800 |
There'll be, if we look at different demographics 02:16:26.880 |
There's this kind of pressure that all of a sudden you, 02:16:36.700 |
Again, it's looking at the past versus the future, 02:16:46.360 |
for social media companies to figure out like 02:16:53.240 |
- I think this is ultimately a question about leadership. 02:16:55.920 |
Honestly, like the way that I see leadership, 02:17:04.180 |
and a lot of people who run current institutions 02:17:06.580 |
is that their main focus is protecting the institution 02:17:21.040 |
if you're the type of leader who immediately blows up 02:17:26.640 |
And maybe that's why we don't have any good leaders anymore 02:17:29.400 |
because they had integrity and they didn't put, 02:17:33.080 |
you know, the survival of the institution first. 02:17:58.000 |
- Yeah, take risks where you might lose your job, 02:18:05.060 |
because in the process of standing for the principles, 02:18:12.800 |
for the things you think are right to do, yeah. 02:18:20.520 |
like listening to people and learning from what they feel. 02:18:27.880 |
that those kinds of companies and countries succeed 02:18:39.740 |
Like the people who have good ideas about leadership, 02:18:48.640 |
since the Jeffrey Epstein controversy at MIT, 02:18:53.920 |
Joi Ito, the head of the Media Lab, resigned. 02:18:53.920 |
So just looking back a few years have passed, 02:19:10.160 |
from the fact that somebody like Jeffrey Epstein 02:19:36.900 |
And then there's what was the reaction to this problem 02:19:49.880 |
were that sometimes cowards are worse than assholes. 02:20:02.280 |
- I think that's a really powerful statement. 02:20:13.680 |
They're just living out their asshole values. 02:20:16.120 |
And the cowards are the ones that you have to watch out for. 02:20:18.760 |
And this comes back to people protecting themselves 02:20:46.160 |
I believe him that he didn't know about the bad, bad stuff. 02:21:02.420 |
And so personally, because Joi was a great boss 02:21:02.420 |
I was really disappointed that he ignored that 02:21:18.160 |
And I think that it was right for him to resign 02:21:29.040 |
that many others didn't was he came forward about it 02:21:45.980 |
The other thing I learned about human nature, 02:21:54.680 |
from someone who worked at a homeless shelter. 02:21:57.040 |
And he said that when he started working there, 02:22:03.700 |
and they would just trash the entire bathroom, 02:22:08.800 |
And he asked someone who had been there longer, 02:22:12.200 |
Why do the homeless people come in and trash the bathroom? 02:22:13.960 |
And he was told it's because it's the only thing 02:22:19.120 |
And I feel like sometimes when it comes to the response, 02:22:40.000 |
if you can't target the person who actually caused the harm, 02:22:50.160 |
until you find the person that you have power over 02:22:59.960 |
Of course, you're gonna focus all your energy 02:23:10.080 |
on someone who's so far removed from the problem 02:23:17.000 |
- And the problem is often the first person you find 02:23:29.760 |
when liberals form a firing squad, they stand in a circle. 02:23:36.200 |
are gonna listen to you, so you criticize them. 02:23:44.960 |
is the people in the farther, in that situation, 02:23:51.480 |
the people that are farther out in the circles 02:24:11.920 |
but also defend the people involved as flawed, 02:24:24.160 |
that people just ignored for the sake of money 02:24:28.420 |
But also like be transparent and public about it 02:24:55.000 |
or essentially fired, pressured out because of it, 02:24:58.720 |
MIT can pretend like, oh, we didn't know anything. 02:25:07.880 |
because when you are at the top of an institution 02:25:11.600 |
with that much power and you were complicit in what happened, 02:25:21.360 |
So like to not stand up and take responsibility, 02:25:34.920 |
outside of MIT, he was able to make a lot of friends 02:25:47.920 |
befriend people that I don't know personally, 02:25:59.400 |
Why would they let Jeffrey Epstein into their office, 02:26:04.600 |
What do you understand about human nature from that? 02:26:15.800 |
but I was never, like, I never saw him in action. 02:26:20.120 |
I don't know how charismatic he was or what that was, 02:26:23.160 |
but I do think that sometimes the simple answer 02:26:31.480 |
what he would do is he was kind of a social grifter, 02:26:40.360 |
You must get people coming to you and being like, 02:26:54.000 |
who were trusted in a network that he was a great guy 02:26:59.000 |
and that, you know, whatever, I think at that point, 02:27:04.800 |
what, a conviction prior, but it was a one-off thing. 02:27:08.920 |
It wasn't clear that there was this other thing 02:27:21.740 |
No, they're just stupid. - I haven't seen that. 02:27:25.800 |
all right, does anyone check anything about me? 02:27:31.000 |
Because I've walked into some of the richest, 02:27:33.680 |
some of the most powerful people in the world, 02:27:44.200 |
I would think, like, there would be more security 02:27:49.420 |
I think a lot of it has to do, well, my hope is, 02:27:56.880 |
but if that's the case, then they can surely, 02:28:00.840 |
then a human being can use charisma to infiltrate. 02:28:06.360 |
just saying the right things-- - And once you have people 02:28:07.920 |
vouching for you within that type of network, 02:28:11.040 |
like, once you, yeah, once you have someone powerful 02:28:29.280 |
- Well, I mean, first of all, you have to do your homework 02:28:38.520 |
but I think, you know, I think Joi did do his homework. 02:28:38.520 |
I think he did, and I think at the time that he took money, 02:28:44.660 |
there was the one conviction and not, like, the later thing, 02:28:47.400 |
and I think that the story at that time was that 02:28:50.880 |
he didn't know she was underage and blah, blah, blah, 02:28:56.880 |
and Joi always believed in redemption for people, 02:28:56.880 |
well, I'm not gonna exclude him because of this thing, 02:29:11.840 |
and because other people are vouching for him. 02:29:14.080 |
Just to be clear, we're now talking about the set of people 02:29:20.120 |
who I think Joi belonged to who did not, like, 02:29:20.120 |
go to the island and have sex with underage girls, 02:29:25.360 |
because that's a whole other set of people who, like, 02:29:29.320 |
were powerful and, like, were part of that network 02:29:34.440 |
and so, like, I distinguish between people who got taken in 02:29:41.640 |
- I wonder what the different circles look like. 02:29:53.640 |
- And then there's people who heard rumors, maybe. 02:30:03.660 |
For, like, for the longest, like, whenever that happened, 02:30:08.280 |
like, all these people came out of the woodwork, 02:30:22.320 |
- Yeah, and, like, I can tell you that those circles, 02:30:25.320 |
like, there were red flags without me even hearing 02:30:31.760 |
there are not a lot of women here, which is a bad sign. 02:30:36.220 |
- Isn't there a lot of places where there's not a lot 02:30:39.440 |
of women and that doesn't necessarily mean it's a bad sign? 02:30:49.200 |
technology law clinic that only gets, like, male lawyers 02:30:51.840 |
because there's not a lot of women, you know, 02:31:03.100 |
- You've, actually, I'd love to ask you about this 02:31:10.680 |
'cause you have strong opinions about Richard Stallman. 02:31:15.640 |
Is that, do you still have those strong opinions? 02:31:20.440 |
- Look, all I need to say is that he met my friend 02:31:28.260 |
from wrist to elbow, and it certainly wasn't appropriate 02:31:31.980 |
- What about if you're, like, an incredibly weird person? 02:31:38.200 |
- Okay, that's a good question because, obviously, 02:31:41.120 |
there's a lot of neurodivergence at MIT and everywhere, 02:31:49.920 |
that people don't understand social conventions 02:31:51.800 |
the same way, but one of the things that I've learned 02:31:54.740 |
about neurodivergence is that women are often 02:31:59.440 |
expected or taught to mask their neurodivergence 02:32:05.480 |
and kind of fit in, and men are accommodated and excused, 02:32:17.220 |
Like, you can be a weird person and you can still learn 02:32:26.560 |
Like, women should be allowed to be a little weirder 02:32:31.080 |
'Cause I think there's a, 'cause you're one of the people, 02:32:37.160 |
'cause I wanted to talk to Richard Stallman on the podcast 02:32:52.720 |
He's also weird in that, you know, when he gives a talk, 02:33:09.380 |
But then there was something that happened to him 02:33:13.180 |
in conversations on this thread related to Epstein. 02:33:17.100 |
- Which I was torn about because I felt it's similar to Joi, 02:33:17.100 |
Like people were looking for somebody to get angry at. 02:33:39.100 |
Like I set aside his situation and we could discuss it, 02:33:48.100 |
about the way they treated that whole situation. 02:33:50.780 |
- Oh, they're always cowards about how they treat anything. 02:33:57.540 |
- That said, I think he should have left the mailing list. 02:34:01.100 |
- He shouldn't have been part of the mailing list. 02:34:09.780 |
what always bothers me in these mailing list situations 02:34:12.400 |
or Twitter situations, like if you say something 02:34:17.020 |
that's hurtful to people or makes people angry, 02:34:31.600 |
Maybe it's okay that you're yelling, whatever it is, 02:34:35.300 |
like it's your response to that that matters. 02:34:40.060 |
And I think that I just have a lot of respect for people 02:35:06.540 |
you can respond again with integrity and say, 02:35:16.260 |
That's the kind of response I wanna see from people. 02:35:19.980 |
And people like Stallman do not respond that way. 02:35:25.700 |
- Right, like where it's obvious you didn't listen. 02:35:30.500 |
- Honestly, that's to me as bad as the people 02:35:32.980 |
who just apologize just 'cause they are trying 02:35:41.180 |
- A good apology has to include understanding 02:35:54.140 |
you have to acknowledge, you have to like give 02:35:57.260 |
that hard hit to the ego that says I did something wrong. 02:36:00.460 |
Yeah, definitely Richard Stallman is not somebody 02:36:05.820 |
or hasn't given evidence of that kind of thing. 02:36:13.420 |
different people from different walks of life 02:36:25.740 |
non-threatening and hilarious are not necessarily, 02:36:35.300 |
doesn't mean that there aren't like deeply hurtful 02:36:35.300 |
And I don't mean that in a social justice warrior way, 02:36:49.220 |
So I feel like really put things into context. 02:36:52.240 |
They have to kind of listen to what people are saying, 02:37:00.780 |
and try to keep the facts of their experience 02:37:08.620 |
It's not like, oh, my friend didn't have a sense of humor 02:37:17.500 |
57 times that week because she's an attractive law professor 02:37:25.260 |
and it's impossible for them to even understand 02:37:29.260 |
what it's like to have a different experience of the world. 02:37:33.060 |
And that's why it's so important to listen to people 02:37:35.580 |
and believe people and believe that they're angry 02:37:40.540 |
Maybe you don't like that they're angry at you. 02:37:44.100 |
Maybe you think that they should explain it to you, 02:37:54.140 |
and an opportunity for you to become a better person. 02:38:11.260 |
She's been saying that she's an innocent victim. 02:38:20.420 |
and equally responsible like Jeffrey Epstein? 02:38:23.580 |
Now I'm asking far away from any MIT things and more, 02:38:33.140 |
and what is now known to be her role in that. 02:38:39.500 |
but if I were her, I wouldn't be going around 02:38:42.820 |
I would say, maybe she's, I don't know what she's saying. 02:39:12.860 |
I believe the family, basically the family is saying 02:39:22.020 |
- Well, I think everyone is a good human being. 02:39:41.400 |
And I think we should always hold people accountable 02:39:55.460 |
maybe sociopaths like are specifically trying 02:40:22.160 |
Yeah, she's a good person, but if she contributed 02:40:25.160 |
to harm, then she needs to be accountable for that. 02:40:29.000 |
I don't know what the facts of the matter are. 02:40:30.440 |
It seems like she was pretty close to the situation, 02:40:32.520 |
so it doesn't seem very believable that she was a victim, 02:40:36.560 |
- I wish I had met Epstein, 'cause something tells me 02:40:44.640 |
And that's a very dark reality that we don't know 02:40:47.880 |
which among us, what each of us are hiding in the closet. 02:40:58.100 |
because then you can put your trust into some people 02:41:05.360 |
Which there's a lot of people that interacted with Epstein 02:41:10.200 |
that now have to, I mean, if they're not destroyed by it, 02:41:15.200 |
then their whole, the ground on which they stand ethically 02:41:25.600 |
And I'm sure you and I have interacted with people 02:42:11.280 |
when she started, because the situation went beyond 02:42:17.860 |
A bunch of other stuff happened at the same time. 02:42:23.200 |
but what I was personally going through at that time. 02:42:36.980 |
In June 2019, so I'm a research scientist at MIT. 02:42:44.420 |
So, and I always have had various supervisors 02:42:55.540 |
and he called me into his office for a regular check-in. 02:43:08.100 |
pulled me into a hug, wrapped his arms around my waist, 02:43:12.380 |
and started massaging my hip, and trying to kiss me, 02:43:21.980 |
"Don't worry, I'll take care of your career." 02:43:42.320 |
And I naively thought that when I reported that, 02:43:58.240 |
I couldn't provide evidence that it happened. 02:44:00.500 |
I had no reason to lie about it, but I had no evidence. 02:44:09.540 |
where there's so many people in the institution 02:44:13.120 |
who really believe in protecting the institution 02:44:48.000 |
I think the administration suffers from the same problems 02:44:54.900 |
any leadership of an institution that is large, 02:44:59.320 |
they've become risk-averse, like you mentioned. 02:45:14.040 |
are to threaten the reputation of the institute 02:45:17.580 |
That's the only way to have power at the institute. 02:45:21.280 |
Yeah, I don't think they have a lot of integrity 02:45:27.560 |
or even have a lot of connection to the research body 02:45:33.400 |
You have this amazing research body of people 02:45:57.520 |
a disconnect between the administration and the faculty. 02:45:59.920 |
- I think it grew over time is what I've heard. 02:46:05.640 |
I don't know if it's gotten worse during my time, 02:46:11.120 |
but I've heard from people who've been there longer 02:46:15.000 |
like MIT didn't used to have a general counsel's office. 02:46:18.400 |
They didn't used to have all of this corporate stuff. 02:46:20.840 |
And then they had to create it as they got bigger 02:46:47.520 |
that MIT is about the students and the faculty. 02:46:51.800 |
Because I don't know, I've had a lot of conversations 02:46:54.920 |
that have been shocking with like senior administration. 02:47:07.360 |
- But those individuals, I'm saying like the capacity, 02:47:11.920 |
like the aura of the place still values the students 02:47:21.980 |
But what I mean is the administration is the froth 02:47:25.980 |
at the top of the, like the waves, the surface. 02:47:30.980 |
Like they can be removed and new life can be brought in 02:47:47.400 |
especially in the era of social media and so on, 02:47:50.780 |
faculty and students have more and more power. 02:48:02.680 |
And like, I also don't think it's a terrible place at all. 02:48:07.080 |
But there's different trajectories it can take. 02:48:10.880 |
- And like, and that has to do with a lot of things, 02:48:22.300 |
it could be the capital of the world in robotics. 02:48:25.520 |
But currently, if you wanna be doing the best AI work 02:48:29.420 |
in the world, you're gonna go to Google or Facebook 02:48:37.080 |
You're not gonna be, you're not gonna be at MIT. 02:48:46.360 |
not allowing the brilliance of the researchers to flourish. 02:49:00.680 |
and can work on more interesting things in companies. 02:49:15.880 |
because like Richard Stallman's a gigantic weirdo 02:49:18.920 |
that crossed lines he shouldn't have crossed, right? 02:49:28.120 |
- There are different types of lines in my opinion. 02:49:34.560 |
but then if administration listens to every line, 02:49:52.720 |
I think the biggest aspect there is not owning it, 02:49:59.160 |
from the perspective of Stallman or people like that, 02:50:07.000 |
seeing the fact that this was like totally inappropriate. 02:50:16.880 |
- No, I think there are different kinds of lines. 02:50:25.760 |
excluded a bunch of people and created an environment 02:50:40.920 |
where you're challenging an institution to... 02:50:48.560 |
trying to cross a line, or maybe he didn't care. 02:50:53.560 |
There are lines that you can cross intentionally 02:50:56.480 |
to move something forward or to do the right thing. 02:51:00.880 |
you can't put an all-gender restroom in the Media Lab 02:51:11.880 |
And the line you're crossing is some arbitrary, stupid rule 02:51:15.200 |
that people who don't wanna take the risk are like... 02:51:21.520 |
- No, ultimately, I think the thing you said is like, 02:51:33.480 |
I started for a while wearing a suit often at MIT, 02:51:37.000 |
which sounds counterintuitive, but that's actually... 02:51:54.880 |
- But that's not really hurting anybody, right? 02:52:06.000 |
- And that particular thing was, yeah, it was hurting people. 02:52:12.960 |
Hurting, ultimately, the people that you want to flourish. 02:52:21.240 |
- You tweeted a picture of pumpkin spice Greek yogurt. 02:52:40.060 |
- What went wrong with the pumpkin spice Greek yogurt? 02:52:49.620 |
so I don't understand the pumpkin spice in everything craze 02:52:55.620 |
Like, I understand that it might be good in some foods, 02:53:11.940 |
- I think part of the success of a good marriage 02:53:14.660 |
is like giving each other a hard time humorously 02:53:22.900 |
'Cause you guys seem to have a really great marriage 02:53:26.780 |
- I mean, every marriage looks good from the external. 02:53:42.300 |
and especially because people evolve and change, 02:53:47.420 |
for both people to evolve and change together. 02:54:06.600 |
And I was like, "We're not committing to this for life." 02:54:11.540 |
And I'm like, "No, we're committing to being part of a team 02:54:25.500 |
And that really resonated with him too, so yeah. 02:54:35.820 |
- You're a team and do what's right for the team? 02:54:57.940 |
Are you trying to hint something on the podcast? 02:55:01.460 |
- I don't, yeah, I have an announcement to make. 02:55:10.920 |
it felt like a good metaphor for, in a bunch of cases, 02:55:16.300 |
for the marriage industrial complex, I remember that. 02:55:23.420 |
It just seemed like marriage is one of the things 02:55:27.020 |
that always surprises me 'cause I wanna get married. 02:55:31.880 |
And then I listened to like friends of mine that complain, 02:55:39.540 |
It's such a cheap, like if, it's such a cheap release valve, 02:55:44.140 |
like that's bitching about anything, honestly, 02:55:48.740 |
But especially, like bitch about the sports team 02:55:52.420 |
or the weather if you want, but like about somebody 02:55:59.780 |
you're going to see them as a lesser being also. 02:56:04.660 |
but you're going to like decrease the value you have. 02:56:10.260 |
you're not going to appreciate the magic of that person. 02:56:13.320 |
I think, anyway, but it's like that I just noticed this 02:56:18.120 |
a lot that people are married and they will whine about, 02:56:30.480 |
I think women do the same thing about the husband. 02:56:33.020 |
He doesn't, he never does this or he's a goof, 02:56:42.260 |
you know, husbands never do X and like wives are, 02:56:48.060 |
it's just disrespectful to everyone involved. 02:56:51.240 |
So I brought that up as an example of something 02:56:54.580 |
that people actually love, but they complain about 02:57:02.540 |
And so that's what with Clippy or whatever, right? 02:57:05.020 |
So like you complain about, but you actually love it. 02:57:25.460 |
- What's the closest relationship you've had with a pet? 02:57:31.820 |
What pet or robot have you loved the most in your life? 02:57:41.980 |
- I think my first pet was a goldfish named Bob 02:57:46.060 |
and he died immediately and that was really sad. 02:57:48.420 |
I think I was really attached to Bob and Nancy, 02:57:53.540 |
my goldfish, we got new Bobs and then Bob kept dying 02:58:10.780 |
- Do you think there will be a time when the robot, 02:58:22.540 |
- Romantically, I don't know if it's going to happen at scale. 02:58:27.540 |
I think we talked about this a little bit last time 02:58:30.980 |
on the podcast too, where I think we're just capable 02:58:35.620 |
And actually part of why I think marriage is so tough 02:58:39.320 |
as a relationship is because we put so many expectations 02:58:43.060 |
on it, like your partner has to be your best friend 02:58:46.740 |
and you have to be sexually attracted to them 02:58:48.660 |
and they have to be a good co-parent and a good roommate 02:59:00.580 |
or we even have, we have a different relationship 02:59:04.620 |
than we do to the person, someone, a coworker. 02:59:09.080 |
I think that some people are gonna find romantic 02:59:21.360 |
I think it's just gonna be a separate type of thing. 02:59:27.360 |
- More narrow or even like just something new 02:59:33.200 |
Maybe like having a crush on an artificial agent 02:59:39.420 |
- Do you think people would see that as cheating? 02:59:53.220 |
'Cause maybe they'll be good, a little jealousy 03:00:00.540 |
Maybe they'll be like part of the couple's therapy 03:00:16.600 |
I think there's just such a diversity of different ways 03:00:23.160 |
that we can structure relationships or view them 03:00:26.360 |
that this is just gonna be another one that we add. 03:00:57.580 |
but trying to see someone else's point of view 03:01:03.940 |
And he really instilled that in me from an early age. 03:01:07.740 |
And then he made me read a ton of science fiction, 03:01:13.780 |
- Taught you how to be curious about the world 03:01:22.980 |
Since we've been talking about love and robots. 03:01:32.260 |
It feels like all of that operates in the landscape 03:01:45.760 |
I'm like, don't the Eskimos have all these different words 03:01:49.980 |
We need more words to describe different types 03:01:59.620 |
That's the really interesting thing about love 03:02:02.180 |
is that I had one kid and I loved my first kid 03:02:12.780 |
I'm never gonna love it as much as the first. 03:02:19.300 |
And so I think that people who are threatened 03:02:27.180 |
they don't need to be threatened for that reason. 03:02:30.420 |
- Artificial agents will just, if done right, 03:03:11.700 |
Courage is the most important of all the virtues,