Robert Playter: Boston Dynamics CEO on Humanoid and Legged Robotics | Lex Fridman Podcast #374
Chapters
0:00 Introduction
2:57 Early days of Boston Dynamics
11:18 Simplifying robots
15:16 Art and science of robotics
19:59 Atlas humanoid robot
36:53 DARPA Robotics Challenge
51:13 BigDog robot
65:02 Spot robot
86:27 Stretch robot
89:15 Handle robot
94:49 Robots in our homes
103:36 Tesla Optimus robot
112:18 ChatGPT
115:22 Boston Dynamics AI Institute
116:53 Fear of robots
127:16 Running a company
132:52 Consciousness
140:26 Advice for young people
142:21 Future of robots
00:00:02.200 |
It was surprisingly hard to get that to work. 00:00:12.440 |
It was the prototype before the Petman robot. 00:00:30.300 |
But even then it was hard to get the robot to walk 00:00:33.580 |
when you're walking that it fully extended its leg. 00:00:37.540 |
And getting that all to work well took such a long time. 00:01:04.340 |
as sort of a byproduct of just a different process 00:01:07.640 |
they were applying to developing the control. 00:01:22.460 |
Last year that I think I saw good walking on Atlas. 00:01:27.400 |
- The following is a conversation with Robert Playter, 00:01:32.960 |
a legendary robotics company that over 30 years 00:01:36.480 |
has created some of the most elegant, dexterous 00:01:48.880 |
One or both of whom you've probably seen on the internet, 00:01:52.760 |
either dancing, doing back flips, opening doors 00:02:07.240 |
He has been with the company from the very beginning, 00:02:12.600 |
where he received his PhD in aeronautical engineering. 00:02:16.300 |
This was in 1994 at the legendary MIT Leg Lab. 00:02:25.360 |
as part of which he programmed a bipedal robot 00:02:28.640 |
to do the world's first 3D robotic somersault. 00:02:32.440 |
Robert is a great engineer, roboticist and leader 00:02:40.440 |
This conversation was a big honor and pleasure 00:02:56.560 |
When did you first fall in love with robotics? 00:03:04.760 |
- Well, love is relevant because I think the fascination, 00:03:09.760 |
the deep fascination is really about movement. 00:03:12.800 |
And I was visiting MIT looking for a place to get a PhD 00:03:21.800 |
And one of my professors in the aero department said, 00:03:24.640 |
"Go see this guy Mark Raber down in the basement 00:03:32.880 |
and he showed me this robot doing a somersault. 00:03:41.720 |
And because of my own interest in gymnastics, 00:03:46.880 |
And I was interested in, I was in an aero-astro degree 00:03:51.300 |
because flight and movement was all so fascinating to me. 00:03:55.540 |
And then it turned out that robotics had this big challenge. 00:04:09.340 |
They're still working on perfecting motion in robots. 00:04:16.340 |
Is there something maybe grounded in your appreciation 00:04:23.820 |
Did you, was there something you just fundamentally 00:04:28.300 |
appreciated about the elegance and beauty of movement? 00:04:31.180 |
- You know, we had this concept in gymnastics 00:04:34.180 |
of letting your body do what it wanted to do. 00:04:42.000 |
part of what you're doing is putting your body 00:04:45.540 |
into a position where the physics and the body's inertia 00:04:48.940 |
and momentum will kind of push you in the right direction 00:04:59.920 |
was trying to figure out how to build machines 00:05:03.340 |
How do you build something so that the physics 00:05:05.460 |
of the machine just kind of inherently wants to do 00:05:09.820 |
And he was building these springy pogo stick type, 00:05:17.500 |
and there's a spring mass system that's oscillating, 00:05:29.980 |
how do you then control that, but not overpower it? 00:05:33.180 |
It's that coordination that I think creates real potential. 00:05:37.380 |
We could call it beauty, you know, you could call it, 00:05:39.740 |
I don't know, synergy, people have different words for it. 00:05:43.580 |
But I think that that was inherent from the beginning, 00:05:46.260 |
that was clear to me, that that's part of what Marc 00:05:48.540 |
was trying to do, he asked me to do that in my research work 00:05:53.980 |
- So part of the thing that I think I'm calling elegance 00:05:58.780 |
even with the pogo stick, is maybe the efficiency, 00:06:08.380 |
it also becomes easier to control in its own way 00:06:12.220 |
because the physics are solving some of the problem itself, 00:06:15.340 |
it's not like you have to do all this calculation 00:06:18.180 |
and overpower the physics, the physics naturally, 00:06:23.580 |
There can even be, you know, feedback mechanisms, 00:06:26.860 |
stabilizing mechanisms that occur simply by virtue 00:06:31.220 |
of the physics of the body and it's, you know, 00:06:33.940 |
not all in the computer or not even all in your mind 00:06:39.500 |
And there's something interesting in that melding. 00:06:43.660 |
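To make the spring-mass idea concrete, here is a minimal sketch of a one-dimensional hopper: a mass on a springy leg where the passive spring-damper does almost all the bouncing, and a small thrust during leg extension only tops up the energy lost to damping. This is illustrative Python with invented parameters, not Boston Dynamics code.

```python
# Minimal 1D spring-mass hopper: the physics does most of the work,
# the "controller" is just a small energy top-up during stance extension.
m, k, c = 10.0, 2000.0, 20.0   # mass [kg], leg stiffness [N/m], leg damping [N*s/m]
l0, g   = 1.0, 9.81            # rest leg length [m], gravity [m/s^2]
dt      = 0.001                # 1 kHz integration step

def hop(steps=5000, thrust=80.0):
    y, v = 1.2, 0.0            # height of the mass [m], vertical velocity [m/s]
    for _ in range(steps):
        if y >= l0:                          # flight phase: purely ballistic
            a = -g
        else:                                # stance phase: spring + damper push back
            a = (k * (l0 - y) - c * v) / m - g
            if v > 0.0:                      # extending: inject a little thrust
                a += thrust / m              # to replace energy lost to damping
        v += a * dt
        y += v * dt
    return y, v

print(hop())   # final (height, velocity) after 5 seconds of hopping
```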
- You were with Marc for many, many, many years, 00:06:45.980 |
but you were there in this kind of legendary space 00:06:55.300 |
Is there some memories from that time that you have? 00:07:09.100 |
- The memories, the distinctive lessons, I would say, 00:07:15.260 |
and that I think Marc was a great teacher of, 00:07:19.300 |
was it's okay to pursue your interests, your curiosity, 00:07:29.180 |
That is a lasting lesson that I think we apply 00:07:50.040 |
like the students that work at those robotics labs 00:07:52.640 |
are like some of the happiest people I've ever met. 00:07:58.640 |
A lot of them are kind of broken by the wear and tear 00:08:03.100 |
But roboticists are, while they work extremely hard 00:08:06.120 |
and work long hours, there's a happiness there. 00:08:11.120 |
The only other group of people I've met like that 00:08:17.240 |
For some reason, there's a deep, fulfilling happiness. 00:08:33.040 |
- We see, our attrition at the company is really low. 00:08:40.400 |
And I think part of that is that there's perhaps 00:08:46.620 |
when you have a robot that's moving around in the world 00:08:49.280 |
and part of your goal is to make it move around 00:09:10.320 |
something they love, it's easier to work hard at it 00:09:23.820 |
that we've been able to build a business around this 00:09:39.000 |
What have you learned about life, about robotics, from Marc 00:09:42.840 |
through all the many years you worked with him? 00:09:47.240 |
which was have the courage of your convictions 00:09:51.400 |
Be willing to try to find big, big problems to go after. 00:10:01.240 |
especially in a dynamic machine, nobody had solved it 00:10:06.980 |
and that felt like a multi-decade problem to go after. 00:10:21.200 |
So that's really probably the most important lesson 00:10:28.200 |
- How crazy is the effort of doing legged robotics 00:10:42.660 |
that has really become a value of the company 00:10:45.260 |
is try to simplify a thing to the core essence. 00:10:53.780 |
running across the savanna or climbing mountains, 00:11:10.360 |
that if we solved those, you could eventually extrapolate 00:11:15.640 |
And so look for those simplifying principles. 00:11:18.560 |
- How tough is the job of simplifying a robot? 00:11:49.680 |
the dynamics of motion and feedback control principles. 00:11:54.680 |
How to build, with the computers at the time, 00:12:00.640 |
that was simple enough that it could run in real time 00:12:02.800 |
at a thousand hertz and actually get that machine to work. 00:12:06.300 |
And that was not something everybody was doing 00:12:13.600 |
And I think the approaches to controlling robots 00:12:18.960 |
But, and they're going to become more broadly available. 00:12:28.440 |
who could really sort of work at that principled level 00:12:31.680 |
with both the software and make the hardware work. 00:12:38.880 |
you were sort of talking about what are the special things. 00:12:41.880 |
The other thing was, it's good to break stuff. 00:13:05.520 |
Too often, if you are working with a very expensive robot, 00:13:19.040 |
And so I think that's been a principle as well. 00:13:31.480 |
it had some custom stuff, like compute on it, 00:13:40.640 |
the code I wrote had an issue where it didn't stop the car 00:13:47.460 |
at like 20, 25 miles an hour, slammed into a wall. 00:13:51.340 |
And I just remember sitting there alone in a deep sadness. 00:13:55.100 |
Sort of full of regret, I think, almost anger, 00:14:01.460 |
but also like sadness because you think about, 00:14:06.340 |
well, these robots, especially for autonomous vehicles, 00:14:08.980 |
like you should be taking safety very seriously, 00:14:18.580 |
to do this kind of experiments in the future. 00:14:20.820 |
Perhaps the right way to have seen that is positively. 00:14:24.000 |
- It depends if you could have built that car 00:14:47.600 |
And I remember breaking some piece of equipment in the lab 00:14:52.240 |
and then realizing, 'cause maybe this was a unique part 00:15:07.920 |
it was gonna be required to get this thing done, 00:15:12.080 |
And that's freeing in a way that nothing else is. 00:15:16.760 |
- You mentioned that if you've got control, the dynamics, 00:15:40.320 |
that let you extrapolate from one-legged machine to another, 00:15:53.840 |
from a one-legged machine to a two- or a four-legged machine. 00:16:03.040 |
There's also, when we started to pursue humanoid robots, 00:16:14.320 |
that one of the benefits of the humanoid form 00:16:25.600 |
And now it's, or maybe it's just tapping into a knowledge 00:16:30.400 |
and then trying to express that in the machine. 00:16:39.560 |
Before you have the knowledge of how to control it, 00:16:43.800 |
And humanoids sort of make that available to you. 00:16:47.640 |
maybe you wouldn't have had the same intuition about it. 00:16:50.760 |
- Yeah, so your knowledge about moving through the world 00:16:59.440 |
- Yeah, it might be hard to actually articulate exactly. 00:17:02.520 |
There's something about, and being a competitive athlete, 00:17:11.000 |
A coach, one of the greatest strengths a coach has 00:17:18.800 |
and then being able to articulate that to the athlete. 00:17:53.120 |
but the main actress in that, who plays the AI robot, 00:18:02.420 |
and the, I don't know, eloquence of movement. 00:18:05.460 |
It looks efficient and easy, and just, it looks right. 00:18:15.660 |
- And then you look at, especially early robots, 00:18:19.000 |
I mean, they're so cautious in the way they move 00:18:27.140 |
it's something about the movement that looks wrong 00:18:30.220 |
that feels like it's very inefficient, unnecessarily so. 00:18:34.100 |
And it's hard to put that into words exactly. 00:18:37.980 |
- We think, and part of the reason why people 00:18:55.660 |
where you're trying to work with the dynamics 00:19:03.060 |
you're essentially, you're really trying hard 00:19:08.020 |
and so you're always stopping the tipping motion. 00:19:24.820 |
And there's something about getting those physics 00:19:41.100 |
it also ends up being more efficient, likely. 00:19:44.500 |
There's a benefit that it probably ends up being 00:19:58.820 |
- So how hard is it to get the humanoid robot Atlas 00:20:03.660 |
to do some of the things it's recently been doing? 00:20:22.500 |
we needed to deliver natural looking walking. 00:20:27.740 |
They wanted a robot that could walk naturally. 00:20:43.140 |
It was surprisingly hard to get that to work. 00:20:53.380 |
It was the prototype before the Petman robot. 00:21:03.980 |
It would do heel strike first before it rolled onto the toe. 00:21:10.400 |
But even then it was hard to get the robot to walk 00:21:14.460 |
where when you're walking that it fully extended its leg 00:21:28.140 |
And getting that all to work well took such a long time. 00:21:54.960 |
as sort of a byproduct of just a different process 00:21:58.280 |
they were applying to developing the control. 00:22:04.320 |
10 to 15 years to sort of get that from, you know, 00:22:13.080 |
Last year that I think I saw good walking on Atlas. 00:22:17.520 |
what are some challenges of getting good walking? 00:22:31.200 |
the whole system operating in different conditions together? 00:22:51.900 |
where, you know, when your leg is fully extended, 00:22:53.900 |
it can't go further the other direction, right? 00:22:56.180 |
There's only, you can only move in one direction. 00:23:09.780 |
so it can deal with these singular configurations 00:23:19.660 |
And I'd say in the, you know, in those earlier days, 00:23:23.580 |
again, we were working with these really simplified models. 00:23:29.460 |
of the complex human body into a simpler subsystem 00:23:34.100 |
that we can more easily describe in mathematics. 00:23:48.020 |
that let us take the full physics of the robot into account 00:23:53.020 |
and deal with some of those strange situations 00:24:08.260 |
You can't push the robot in any direction you want to, right? 00:24:16.940 |
- And you have to do that for natural movement. 00:24:20.300 |
- It's not necessarily required for natural movement. 00:24:31.340 |
in the direction you want at all times, right? 00:24:45.300 |
and push in any direction you want, you know, so. 00:24:55.260 |
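For background, the fully-extended-leg problem being described is a standard kinematic singularity; the sketch below is generic robotics math, not anything specific to Atlas.

```latex
% With joint angles q, foot velocity and joint torques relate through the Jacobian J(q):
\begin{align}
  \dot{x} &= J(q)\,\dot{q}, &
  \tau &= J(q)^{\mathsf{T}} F .
\end{align}
% At a straight knee, J(q) loses rank: some foot velocities become unreachable and
% some external forces map to zero joint torque, so they cannot be resisted.
% Controllers therefore watch for near-singular configurations (small singular
% values of J) and plan motions that avoid or pass carefully through them.
```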
The humanoid form is attractive in many ways, 00:25:10.860 |
increases the complexity of maintaining balance. 00:25:13.980 |
And as soon as you pick up something heavy in your arms, 00:25:25.580 |
you know, we were pursuing these quadruped robots, 00:25:31.060 |
You had this big rigid body and then really light legs. 00:25:36.520 |
the leg motion didn't impact the body motion very much. 00:25:43.700 |
But when you have the humanoid, that doesn't work. 00:25:47.060 |
You swing the legs, it affects everything else. 00:25:54.300 |
does make the humanoid a much more complicated platform. 00:26:15.140 |
I don't know what, like you have to include that 00:26:19.180 |
as part of the modeling, as part of the planning. 00:26:27.180 |
like throws a heavy bag, throws a bunch of stuff. 00:26:31.700 |
So what's involved in picking up a thing, a heavy thing, 00:26:50.660 |
that the robot and the techniques we're applying 00:26:52.900 |
to Atlas let us deal with heavy things in the world. 00:27:35.100 |
you've got all of that has to be sort of included 00:28:05.420 |
are really allowing that to happen quickly now. 00:28:24.780 |
That means that the ability to create new behaviors 00:28:31.900 |
- So being able to explicitly model new things 00:28:35.060 |
that it might need to pick up, new types of things. 00:28:39.700 |
you don't wanna have to pay too much attention 00:28:50.980 |
you have to conform your hand, your end effector 00:28:57.420 |
it's probably just the mass and inertia that matter. 00:29:14.540 |
especially if I'm gonna throw it up on that scaffolding. 00:29:28.280 |
There are harder things that we haven't done yet. 00:29:32.460 |
or I don't know, a bunch of loose wire or rope. 00:29:45.980 |
where we had two Spots holding up another Spot 00:30:06.820 |
that's really one of the most fundamental questions 00:30:14.380 |
- It probably can't curl as much as we can yet. 00:30:32.980 |
It was probably a meter high or something like that. 00:30:34.940 |
It was a pretty tall jump that Atlas was able to manage 00:30:40.100 |
And I have video of my chief technical officer 00:30:47.820 |
- But the human, getting all the way on top of this box. 00:30:52.280 |
We're now thinking about the next generation of Atlas. 00:30:58.780 |
of a person can't do it with the next generation. 00:31:02.740 |
The robots, the actuators are gonna get stronger. 00:31:11.820 |
you probably had to do quite a bit of testing. 00:31:14.200 |
- Oh yeah, and there's lots of videos of it trying 00:31:43.200 |
- Well again, this stuff has really evolved rapidly 00:32:00.220 |
when I'd see early experiments that the team was doing, 00:32:03.660 |
I might make suggestions about how to change the technique. 00:32:06.420 |
Again, kind of borrowing from my own intuition 00:32:15.860 |
in almost a manual way, trying to change these trajectories 00:32:27.260 |
But more recently, we're running these model predictive 00:32:38.300 |
for the next second or two about how its motion 00:32:47.720 |
So this is happening in a much more natural way. 00:32:50.020 |
And we're really seeing an acceleration happen 00:32:55.340 |
again, partly due to these optimization techniques, 00:33:01.900 |
So it's hard in that there's a lot of mathematics 00:33:12.300 |
- So you can do model predictive control for, 00:33:14.880 |
I mean, I don't even understand what that looks like 00:33:26.220 |
So you know, the physics, we can calculate physics 00:33:29.540 |
pretty well using Newton's laws about how it's going 00:33:41.160 |
You saw the robot on various versions of that trick, 00:33:46.620 |
I've seen it, land in different configurations 00:33:51.940 |
And so, what this model predictive control means is, 00:33:56.100 |
again, in real time, the robot is projecting ahead, 00:34:01.100 |
a second into the future, and sort of exploring options. 00:34:04.460 |
And if I move my arm a little bit more this way, 00:34:08.580 |
And so it can do these calculations, many of them, 00:34:11.320 |
and basically solve for, given where I am now, 00:34:16.540 |
maybe I took off a little bit screwy from how I had planned, 00:34:24.060 |
So the model predictive control lets you adjust on the fly. 00:34:27.860 |
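As a rough illustration of that project-ahead-and-replan loop, here is a toy random-shooting model-predictive controller for a one-dimensional point mass. Every number and the model itself are invented for the example; the real Atlas controller is far more sophisticated and uses the robot's full physics.

```python
# Toy random-shooting MPC: at every tick, simulate many candidate action
# sequences a short horizon ahead, keep the best, apply only its first action.
import random

dt, horizon, n_candidates = 0.05, 20, 200     # ~1 s lookahead, made-up numbers
goal = 1.0                                    # where we want the mass to end up

def rollout(x, v, accels):
    """Simulate the simple internal model forward and score the result."""
    cost = 0.0
    for a in accels:
        v += a * dt
        x += v * dt
        cost += (x - goal) ** 2 + 0.01 * a ** 2   # tracking error plus effort penalty
    return cost

def mpc_step(x, v):
    """Sample candidate action sequences, return the first action of the best one."""
    best_cost, best_first = float("inf"), 0.0
    for _ in range(n_candidates):
        accels = [random.uniform(-5.0, 5.0) for _ in range(horizon)]
        c = rollout(x, v, accels)
        if c < best_cost:
            best_cost, best_first = c, accels[0]
    return best_first

x, v = 0.0, 0.0
for _ in range(100):            # replan at every control tick, not just once at takeoff
    a = mpc_step(x, v)
    v += a * dt                 # in reality the robot also gets disturbed here, and
    x += v * dt                 # the constant replanning is what absorbs that
print(round(x, 2), round(v, 2))
```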
And of course, I think this is what people adapt as well. 00:34:34.540 |
we try to set it up so it's as close to the same every time, 00:34:38.180 |
but we figured out how to do some adjustment on the fly, 00:34:40.420 |
and now we're starting to figure out that the robots 00:34:55.780 |
So when you're in the air, there's some things 00:35:00.660 |
You can't change the momentum while it's in the air, 00:35:03.140 |
'cause you can't apply an external force, or torque, 00:35:09.500 |
of that fixed momentum to still get from A to B, 00:35:18.860 |
I mean, you become a drone for a brief moment in time. 00:35:21.740 |
No, you're not even a drone, 'cause you can't-- 00:35:28.340 |
- Yeah, are you considered a hover-type thing, or no? 00:35:36.120 |
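The physics behind that exchange is standard rigid-body mechanics rather than anything Atlas-specific: once the feet leave the ground, gravity is the only external force and it exerts no torque about the center of mass.

```latex
\begin{align}
  \ddot{\mathbf{r}}_{\mathrm{com}} &= \mathbf{g}, &
  \frac{d\mathbf{L}_{\mathrm{com}}}{dt} &= \boldsymbol{\tau}_{\mathrm{ext}} = \mathbf{0}.
\end{align}
% The center of mass follows a fixed ballistic arc and the total angular momentum
% about it is conserved. Swinging the arms and legs redistributes inertia among
% the links, which is how the trunk orientation can still be adjusted mid-air
% even though the total momentum cannot change until touchdown.
```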
And just even to have the guts to try a backflip, 00:35:43.140 |
- We definitely broke a few robots trying that. 00:35:47.580 |
- But that's where the build it, break it, fix it, 00:35:55.220 |
by breaking the robot repeatedly, you find the weak points, 00:36:02.180 |
- Through the breaking process, you learn a lot, 00:36:05.020 |
like a lot of lessons, and you keep improving, 00:36:07.580 |
not just how to make the backflip work, but everything. 00:36:13.260 |
I mean, is there something about just the guts 00:36:15.540 |
to come up with an idea of saying, you know what, 00:36:23.980 |
in the first place, and to not worry too much 00:36:28.700 |
why the heck are you doing backflips with robots? 00:36:31.300 |
Because a lot of people have asked that, you know, 00:36:53.740 |
to the DARPA Robotics Challenge in 2015, I think, 00:37:15.340 |
of, you know, manipulation, walking, driving a car, 00:37:24.780 |
if I remember correctly, sort of some slight capability 00:37:29.780 |
to communicate with humans, but the communication 00:37:37.540 |
- It could have periods where the communication 00:37:41.140 |
had to be able to proceed, but you could provide 00:37:45.460 |
basically low-bandwidth communications to steer it. 00:37:49.780 |
- I watched that challenge with kind of tears 00:38:05.620 |
by some of the most brilliant roboticists in the world. 00:38:08.460 |
So that was why the tragic, that's why the tears came. 00:38:14.380 |
to that time, what have you learned from that experience? 00:38:20.780 |
sort of the setup for people who haven't seen it. 00:38:25.740 |
of different robots were asked to do a series 00:38:34.700 |
go identify a valve, shut a valve, use a tool 00:38:47.780 |
So it was, the idea was have a general-purpose robot 00:38:54.940 |
Had to be mobility and manipulation, on-board perception. 00:39:10.980 |
let's try to push vehicle autonomy along, right? 00:39:14.300 |
They encouraged people to build autonomous cars. 00:39:18.100 |
So they're trying to basically push an industry forward. 00:39:21.620 |
And we were asked, our role in this was to build 00:39:43.460 |
that sort of won a contest, showed that they could 00:39:53.020 |
And then other robots were introduced as well. 00:39:56.580 |
Carnegie Mellon, for example, built their own robot. 00:40:00.460 |
And all these robots competed to see who could 00:40:07.660 |
And again, I think the purpose was to kind of 00:40:11.840 |
We provided the robot and some baseline software, 00:40:16.140 |
but we didn't actually compete as a participant, 00:40:19.480 |
where we were trying to drive the robot through this maze. 00:40:25.700 |
We were just trying to support the other teams. 00:40:28.540 |
It was humbling because it was really a hard task. 00:40:32.140 |
And honestly, the robots, the tears were because 00:40:48.100 |
But it was humbling because of just how hard, 00:40:54.100 |
But it was really hard to get the robots to do it. 00:40:56.940 |
- Well, the general nature of it, the variety of it. 00:41:01.500 |
- And also that I don't know if the tasks were 00:41:04.300 |
sort of, the task in themselves help us understand 00:41:17.420 |
And I think Atlas is really a general robot platform, 00:41:27.220 |
Like for just for example, probably the hardest task 00:41:35.780 |
And Atlas probably, if you were to design a robot 00:41:39.380 |
that can get into the car easily and get out easily, 00:41:42.500 |
you probably would not make Atlas for that particular car. 00:41:50.500 |
- This is the curse of a general purpose robot, 00:41:56.020 |
but they might be able to do a wide variety of things. 00:42:03.740 |
You know, I think we all wanna build general purpose robots 00:42:08.120 |
that can be used for lots of different activities, 00:42:18.100 |
up until this point have been, go build a robot 00:42:21.380 |
for a specific task and it'll do it very well. 00:42:27.980 |
But robots need to be able to deal with uncertainty. 00:42:31.260 |
If they're gonna be useful to us in the future, 00:42:34.460 |
they need to be able to deal with unexpected situations. 00:42:38.700 |
And that's sort of the goal of a general purpose 00:42:46.740 |
Like I remember one of the, a robot, you know, 00:42:50.940 |
the first time you start to try to push on the world 00:42:54.140 |
with a robot, you forget that the world pushes back 00:42:58.580 |
and will push you over if you're not ready for it. 00:43:02.140 |
And the robot, you know, reached to grab the door handle. 00:43:05.380 |
I think it missed the grasp of the door handle, 00:43:08.300 |
was expecting that its hand was on the door handle. 00:43:14.140 |
It didn't realize, oh, I had missed the door handle. 00:43:16.500 |
I didn't have, I didn't, I was expecting a force 00:43:26.020 |
would take totally for granted and deal with, 00:43:35.220 |
- Well, I think a lot of us experience this in, 00:43:41.780 |
sort of you pick up a thing and expect it to be, 00:43:45.980 |
what is it, heavy, and it turns out to be light. 00:43:58.260 |
and then you think you're putting your hand on the table, 00:44:01.060 |
and you miss it, I mean, it's the same kind of situation. 00:44:11.760 |
Predict forward what you think's gonna happen. 00:44:14.540 |
And then if that does happen, you're in good shape. 00:44:19.100 |
- So re, like re, regenerate a plan when you don't. 00:44:24.100 |
I mean, that also requires a very fast feedback loop 00:44:36.500 |
- Yeah, those things have to run pretty quickly. 00:44:38.660 |
- What's the challenge of running things pretty quickly? 00:44:40.860 |
A thousand hertz of acting and sensing quickly. 00:44:45.860 |
- You know, there's a few different layers of that. 00:44:51.540 |
you like to run things typically at around a thousand hertz, 00:44:54.740 |
which means that, you know, at each joint of the robot, 00:45:04.900 |
trying to control the force coming out of that actuator. 00:45:11.700 |
And that means you can't have too much calculation 00:45:24.420 |
maybe at a hundred hertz, maybe 10 times slower, 00:45:27.900 |
which is now starting to look at the overall body motion 00:45:30.980 |
and thinking about the larger physics of the robot. 00:45:39.460 |
that's probably happening a little bit slower, 00:45:43.260 |
your perception and your vision and things like that. 00:45:54.300 |
so that you can squeeze in all the calculations you need 00:46:15.700 |
And that might allow me to do even better predictions 00:46:24.220 |
10 years ago, we had to have pretty simple models 00:46:29.220 |
that we were running, you know, at those fast rates 00:46:34.620 |
about calculating forward with a sophisticated model. 00:46:38.460 |
But as computation gets better, we can do more of that. 00:46:42.780 |
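Structurally, the layering described above can be sketched as nested loops running at different rates. The rates below are the ones mentioned in the conversation; the function bodies are placeholders, not Boston Dynamics software.

```python
# Multi-rate control skeleton: fast joint-level loop every tick,
# slower whole-body and perception layers every Nth tick.
FAST_HZ, BODY_HZ, PERCEPTION_HZ = 1000, 100, 30

def perceive():                 return {"obstacles": []}          # placeholder vision output
def plan_body(world):           return {"contact_forces": None}   # placeholder whole-body plan
def joint_force_control(plan):  pass                              # placeholder actuator loop

world = perceive()
body_plan = plan_body(world)
for tick in range(FAST_HZ * 5):                    # five seconds of run time
    if tick % (FAST_HZ // PERCEPTION_HZ) == 0:     # ~30 Hz: cameras, mapping
        world = perceive()
    if tick % (FAST_HZ // BODY_HZ) == 0:           # 100 Hz: overall body physics / planning
        body_plan = plan_body(world)
    joint_force_control(body_plan)                 # 1000 Hz: per-joint force control,
                                                   # must finish well inside 1 ms
```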
- What about the actual pipeline of software engineering? 00:46:56.260 |
- It's an important part of building a team around it, 00:47:00.380 |
which means, you know, you need to also have software tools, 00:47:24.020 |
to be the same code you're running on the hardware. 00:47:28.900 |
where it was the same code going from one to the other, 00:47:34.180 |
until, you know, a few years, several years ago. 00:47:36.580 |
But that was a, you know, that was a bit of a milestone. 00:47:39.740 |
And so you want to work, certainly work these pipelines 00:47:44.620 |
and have a bunch of people working in parallel, 00:47:47.100 |
especially when, you know, we only have, you know, 00:47:55.900 |
40 developers there all trying to gain access to it. 00:48:01.940 |
and use some of these, some of the software pipeline. 00:48:06.300 |
to be able to run the exact same code in simulation 00:48:11.820 |
realistic simulation, physics-based simulation of Atlas, 00:48:22.620 |
if it works in simulation, it works perfectly in reality. 00:48:25.140 |
How hard is it to sort of keep working on closing that gap? 00:48:28.740 |
- The root of some of our physics-based simulation tools 00:48:33.180 |
And we built some good physics-based modeling tools there. 00:48:45.620 |
It wasn't a particularly successful commercial product, 00:48:50.740 |
so that when we started doing legged robotics again, 00:48:57.060 |
were things that weren't necessarily handled very well 00:49:00.940 |
in the commercial tools you could buy off the shelf, 00:49:03.060 |
like interaction with the world, like foot ground contact. 00:49:16.900 |
of the interaction was a really important element 00:49:23.660 |
that was computationally feasible and could run fast. 00:49:34.300 |
So it's always about efficient, fast operation as well. 00:49:43.500 |
in parallel to the development of the platform 00:49:46.940 |
and trying to scale them has really been essential, 00:49:57.540 |
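As an illustration of the kind of approximation involved, a simple and widely used way to model foot-ground contact in a physics simulator is a penalty (spring-damper) model with Coulomb friction. This is a generic textbook approach, not necessarily what Boston Dynamics' internal simulator does.

```python
def contact_force(penetration, normal_vel, tangent_vel,
                  k=5.0e4, c=500.0, mu=0.8):
    """Generic penalty contact model. penetration [m] is how far the foot is
    below the ground plane; normal_vel/tangent_vel are foot velocities [m/s].
    Returns (normal_force, friction_force) in newtons. All gains are made up."""
    if penetration <= 0.0:
        return 0.0, 0.0                              # foot in the air: no contact force
    fn = max(0.0, k * penetration - c * normal_vel)  # stiff spring + damper, never pulls down
    if tangent_vel > 0.0:                            # Coulomb friction opposes sliding,
        ft = -mu * fn                                # capped at mu * normal force
    elif tangent_vel < 0.0:
        ft = mu * fn
    else:
        ft = 0.0
    return fn, ft
```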
So foot ground contact, but sort of for manipulation. 00:50:00.800 |
Because don't you want to model all kinds of surfaces? 00:50:06.380 |
- Yeah, so it will be even more complex with manipulation 00:50:09.940 |
'cause there's a lot more going on, you know? 00:50:19.840 |
It's a level of complexity that I think goes above 00:50:33.840 |
to walk with Atlas in the sand along the beach 00:50:52.440 |
I mean, have we really had him out on the beach? 00:50:55.680 |
Yeah, we take them outside often, rocks, hills, 00:50:58.860 |
that sort of thing, even just around our lab in Waltham. 00:51:02.080 |
We probably haven't been on the sand, but I'm-- 00:51:11.800 |
We did take, we had to take BigDog to Thailand years ago 00:51:27.480 |
up to, I don't know, its belly or something like that, 00:51:34.920 |
Great show, but then we didn't really clean the robot off 00:51:42.240 |
By the time it came back, we had some problems 00:51:49.080 |
- It's not like sand getting into the components 00:51:59.600 |
- Well, it's a personal goal of mine to walk along the beach. 00:52:04.760 |
You get sand everywhere, it's just a jam mess. 00:52:10.360 |
So, I mean, can we just linger on the robotics challenge? 00:52:26.480 |
the loose rock was the epitome of the hard walking surface. 00:52:32.280 |
and you had these little point feet on the robot, 00:52:45.440 |
and that thing responds to you stepping on it. 00:52:47.360 |
- Yeah, and it moves where your point of support is. 00:53:01.560 |
And we would actually build boxes full of rocks 00:53:23.040 |
Do you try not to anthropomorphize the robots? 00:53:27.840 |
Do you try not to, do you try to remember that they're, 00:53:34.960 |
For me there's a magic to the being that is a robot. 00:53:46.160 |
when it moves about the world is there in the robot. 00:53:57.120 |
- Well, I'll say, to address the meta question, 00:54:08.760 |
People, I think part of the magic of these kinds of machines 00:54:13.080 |
is by nature of their organic movement, of their dynamics, 00:54:22.600 |
We tend to look at them and sort of attribute 00:54:25.960 |
maybe feeling to that, because we've only seen 00:54:35.640 |
It means that you could have feelings for a machine. 00:54:43.800 |
They get attracted to them, attached to them. 00:54:49.320 |
as long as we manage what that interaction is. 00:54:52.400 |
So we don't put strong boundaries around this 00:55:01.240 |
because I think people look at these machines 00:55:09.000 |
Because again, they've seen things move like this 00:55:11.200 |
that were living beings, which are intelligent. 00:55:15.160 |
And so they want to attribute intelligence to the robots 00:55:20.000 |
even though they move like an intelligent being. 00:55:27.040 |
and try to, first of all, acknowledge that it's there. 00:55:35.460 |
it's just kind of fun, you know, to look at the robot. 00:55:43.140 |
kind of looking around for where the bag of tools was 00:55:51.100 |
Atlas has to kind of look around and see where they are. 00:55:54.620 |
And there's a little personality there that is fun. 00:56:01.940 |
can enhance interaction between humans and robots 00:56:09.380 |
- This is something to me personally is very interesting. 00:56:11.420 |
I've been, I happen to have a lot of legged robots. 00:56:18.260 |
I hope to have a lot of Spots in my possession. 00:56:27.220 |
companies that do incredible stuff like Boston Dynamics. 00:56:36.780 |
you want to align, you want to help the company. 00:56:48.300 |
And so the kind of stuff I'm particularly interested in 00:56:52.540 |
may not be the thing that makes money in the short term. 00:56:54.660 |
I can make an argument that it will in the longterm. 00:57:15.060 |
And be able to communicate excitement or fear, 00:57:20.860 |
And I think as a base layer of function of behavior 00:57:35.380 |
And it's a thing we're beginning to pay attention to. 00:57:42.900 |
a differentiator for the company has always been, 00:57:54.620 |
It can really get around and it doesn't fall down. 00:57:57.120 |
But beyond that, now it needs to be a useful tool. 00:58:02.140 |
And our customers are, for example, factory owners. 00:58:06.100 |
People who are running a process manufacturing facility. 00:58:09.820 |
And the robot needs to be able to get through 00:58:16.120 |
We need for people who are operating those robots 00:58:35.100 |
so that a person looked at the robot and goes, 00:58:47.900 |
And we're even just, the robot's about to turn 00:58:52.380 |
in front of you and maybe indicate that it's going to turn. 00:58:55.440 |
And so you sort of see and can anticipate its motion. 00:58:58.240 |
So this kind of communication is going to become 00:59:04.300 |
but now that the robots are really out in the world 00:59:18.400 |
I think is going to become more and more important. 00:59:22.900 |
'cause there's a lot of interesting possibilities, 01:00:10.560 |
Then there's this opportunity to do a DARPA contract 01:00:24.640 |
And it was a quadruped, and it was the first time 01:00:27.220 |
we built a robot that had everything on board 01:00:34.400 |
it had on-board computers, it had hydraulic actuators 01:00:39.400 |
that needed to be cooled, so we had cooling systems built in. 01:00:48.320 |
So it was 10 years that we were not a robotics company. 01:00:53.280 |
and then we had to build a robot in about a year. 01:00:55.800 |
So that was a little bit of a rough transition. 01:01:00.840 |
- I mean, can you just comment on the roughness 01:01:06.080 |
I mean, this is this big quadruped, four-legs robot. 01:01:17.620 |
We would take 'em out, and it was hard to get 01:01:27.100 |
- And having that all work while trying to get 01:01:34.540 |
- So what was the power plant, what was the engine? 01:01:41.820 |
I don't know, it felt very loud and aggressive 01:01:49.520 |
We weren't trying to design the best robot hardware 01:01:52.760 |
at the time, and we wanted to buy an off-the-shelf engine. 01:02:00.720 |
had literally go-kart engines or something like that. 01:02:11.620 |
And we generally didn't put mufflers on them, 01:02:25.620 |
is always important, because it has to carry everything. 01:02:31.840 |
- Yeah, I mean, the early versions stood about, 01:02:38.960 |
They probably weighed maybe a couple of hundred pounds. 01:02:50.080 |
to really manage a remarkable level of rough terrain. 01:02:55.560 |
So we started out with just walking on the flat, 01:02:59.440 |
and then inclines, and then mud, and then slippery mud. 01:03:05.800 |
we were convinced that legged locomotion in a robot 01:03:17.780 |
but they used a giant hydraulic pump in the lab. 01:03:21.680 |
They used a giant computer that was in the lab. 01:03:36.480 |
that the legged locomotion could really work. 01:03:47.120 |
that you could make a legged robot that could work, 01:03:49.840 |
there was a period at DARPA where robotics got really hot, 01:04:04.720 |
We built Cheetah, which was designed to explore 01:04:18.920 |
not just one robot, but a whole family of robots. 01:04:21.360 |
- To push the limits in all kinds of directions. 01:04:29.320 |
We were able to develop principles of legged locomotion 01:04:32.440 |
so that we knew how to build a small legged robot 01:04:36.120 |
So leg length was now a parameter that we could play with. 01:04:43.640 |
So we built the LS3, which was an 800 pound robot 01:04:56.240 |
to their terrain, to their walking speed, to their payload? 01:05:11.120 |
So again, almost 10 years into sort of a run with DARPA 01:05:15.320 |
where we built a bunch of different quadrupeds. 01:05:27.200 |
where the government was gonna kind of back off 01:05:41.120 |
to somebody who wants to continue to invest in this area. 01:05:44.960 |
And so at Google, we would meet regularly with Larry Page 01:06:01.680 |
that we wanted to continue developing was a quadruped, 01:06:08.840 |
We thought it probably couldn't be hydraulically actuated. 01:06:16.440 |
if we could migrate to a smaller electrically actuated robot 01:06:23.680 |
- So not a gas engine and the actuators are electric. 01:06:36.880 |
what will a robot look like that could be built at scale? 01:06:52.100 |
What they really wanted was a consumer level product, 01:07:01.100 |
We didn't think that was the right next thing to do 01:07:11.180 |
Probably needed to cost a few thousand dollars. 01:07:29.980 |
And he suggested that we make the robots really inexpensive. 01:07:48.700 |
discover the hard problem that you don't know about. 01:07:51.820 |
Don't make it harder by building a crappy machine, basically. 01:08:02.060 |
And so we wanted to build these high quality machines still. 01:08:06.500 |
And we thought that was important for us to continue learning 01:08:19.540 |
And so ultimately, that's why we're building robots 01:08:24.860 |
Because the industry can afford a more expensive machine 01:08:47.680 |
to a consumer level product that will be that cheap. 01:08:50.740 |
But I think the path to getting there needs to go 01:09:06.060 |
So that presumably when you try to build a robot at scale, 01:09:11.780 |
to make money on a robot, even in the industrial setting. 01:09:15.140 |
But how interesting, how challenging of a thing is that? 01:09:20.140 |
In particular, probably new to an R&D company. 01:09:25.720 |
- Yeah, I'm glad you brought that last part up. 01:09:27.780 |
The transition from an R&D company to a commercial company, 01:09:33.180 |
'Cause you've got these engineers who love hard problems, 01:09:35.680 |
who wanna figure out how to make robots work. 01:09:40.060 |
that wanna work on the quality and reliability 01:09:45.120 |
And indeed, we have brought on a lot of new people 01:09:52.140 |
But the big takeaway lesson for me is we have good people. 01:09:59.620 |
And the quality and cost and manufacturability 01:10:05.100 |
And because they're so invested in what we're doing, 01:10:13.380 |
And so I think we're managing that transition very well. 01:10:18.540 |
I mean, it's a huge undertaking, by the way, right? 01:10:23.300 |
So even having to get reliability to where it needs to be, 01:10:30.380 |
that we're just operating 24/7 in our offices 01:10:33.980 |
to go find those rare failures and eliminate them. 01:10:37.420 |
It's just a totally different kind of activity 01:10:39.480 |
than the research activity where you get it to work, 01:10:42.300 |
the one robot you have to work in a repeatable way 01:10:50.100 |
But I think we're making remarkable progress, I guess. 01:10:57.940 |
And I mean, one of the things that's really cool 01:11:02.940 |
is to see a large number of robots moving about. 01:11:10.140 |
in the research environment at MIT, for example, 01:11:14.380 |
I don't think anyone ever has a working robot 01:11:21.720 |
in a sad state of despair, waiting to be born, 01:11:28.360 |
Just to have, I just remember there's a Spot robot, 01:11:28.360 |
and was just walking randomly for whatever reason. 01:11:38.040 |
But there's a kind of a sense of sentience to it 01:11:42.400 |
because it doesn't seem like anybody was supervising it. 01:11:48.880 |
It is the case that if you come to our office today 01:12:03.080 |
So we have these robots programmed to do autonomous missions, 01:12:07.600 |
get up off their charging dock, walk around the building, 01:12:09.920 |
collect data at a few different places and go sit back down. 01:12:13.220 |
And we want that to be a very reliable process 01:12:16.080 |
'cause that's what somebody who's running a brewery, 01:12:20.200 |
a factory, that's what they need the robot to do. 01:12:31.720 |
we have robots that are accruing something like 01:12:39.080 |
and over a thousand hours of operation every week. 01:12:45.540 |
I don't think anybody else in the world can do 01:12:50.900 |
You have to be willing to dedicate it to that test. 01:13:04.040 |
What have you learned from the manufacturer side 01:13:11.880 |
We're learning how to cast parts instead of milling it all out 01:13:17.320 |
We're learning how to get plastic molded parts. 01:13:21.120 |
And we're learning about how to control that process 01:13:24.640 |
so that you can build the same robot twice in a row. 01:13:31.060 |
We've set up a manufacturing facility in Waltham. 01:13:41.280 |
to both Spots and Stretches, you know, at that factory. 01:13:49.100 |
we're still iterating on the design of the robot. 01:13:51.080 |
As we find failures from these reliability tests, 01:14:01.200 |
especially when you want to move as fast as we do. 01:14:11.080 |
who are trying to get the cheapest parts for us 01:14:17.080 |
And then we go change the design from underneath them. 01:14:20.280 |
And so, you know, getting everybody on the same page here 01:14:25.580 |
but we also need to try to figure out how to reduce costs. 01:14:38.760 |
- Yeah, things got more expensive and harder to get. 01:14:48.560 |
And, you know, these are really just the first generation 01:14:52.760 |
We're already thinking about what the next generation 01:15:06.940 |
But for example, in the applications that we're excited 01:15:11.960 |
about where you're monitoring these factories 01:15:15.940 |
there's probably a simpler machine that we could build 01:15:23.080 |
And that's the difference between the general purpose 01:15:26.480 |
machine or the platform versus the purpose built machine. 01:15:31.960 |
we'd still like the robot to do lots of different tasks. 01:15:35.240 |
If we really knew on day one that we're gonna be operating 01:16:07.020 |
So what can you say about the arm that Spot has? 01:16:20.900 |
That's where, you know, in the past 10 years, 01:16:27.920 |
If you ask what's the hard problem in the next 10 years, 01:16:40.520 |
And the arm is almost as complex as the robot itself. 01:16:52.000 |
It has, you know, several motors and actuators and sensors. 01:17:02.320 |
and the robot will control the motion of its hand 01:17:07.980 |
So in the same way the robot walks and balances, 01:17:11.280 |
managing its own foot placement to stay balanced, 01:17:17.240 |
where the robot, you indicate, okay, go grab that bottle. 01:17:23.160 |
and then sort of closing in on that, the grasp. 01:17:38.680 |
because again, we want it to be sort of identifiable. 01:17:42.360 |
In the last year, a lot of our sales have been 01:17:56.680 |
- What's the interface like to work with the arm? 01:18:00.200 |
Like is it pretty, so are they designed primarily, 01:18:09.000 |
Is it designed to be easily and efficiently operated 01:18:15.640 |
Or is there also the capability to push towards autonomy? 01:18:21.900 |
In the next version of the software that we release, 01:18:31.400 |
if you have an autonomous mission for the robot, 01:18:38.680 |
and it's gonna have to use that arm to open the door. 01:18:41.360 |
And so that'll be an autonomous manipulation task 01:18:44.240 |
that just, you can program easily with the robot. 01:18:48.440 |
Strictly through, we have a tablet interface. 01:18:52.040 |
And so on the tablet, you sort of see the view 01:19:05.880 |
So we want, and for a task like opening doors, 01:19:20.200 |
It's an electric utility, Ontario Power Generation. 01:19:23.900 |
And they have to, when they're gonna disconnect 01:19:32.320 |
From the grid, you have to disconnect this breaker switch. 01:19:35.480 |
Well, as you can imagine, there's hundreds or thousands 01:19:38.880 |
of amps and volts involved in this breaker switch. 01:19:42.360 |
And it's a dangerous event, 'cause occasionally 01:19:48.640 |
the sparks jump across and people die doing this. 01:19:52.300 |
And so Ontario Power Generation used our Spot 01:19:56.960 |
and the arm through the interface to operate this disconnect 01:20:12.400 |
And so we got some examples of that breaker switch. 01:20:16.440 |
And I believe in the next generation of software, 01:20:21.720 |
They're gonna be able to just point the robot 01:20:35.080 |
stick it into a socket, and literally unscrew 01:20:43.120 |
And we basically automated them so that the human says, 01:20:49.160 |
That right there is the socket where you're gonna 01:20:54.120 |
And so you can remotely sort of indicate this 01:20:56.080 |
on the tablet, and then the robot just does everything 01:21:00.560 |
- And it does everything, all the coordinated movement 01:21:02.560 |
of all the different actuators that includes the body. 01:21:08.960 |
So it's within reach, and the arm is in a position 01:21:17.400 |
- So how does one become a big enough customer 01:21:21.640 |
'Cause I personally want a robot that gets me a beer. 01:21:26.320 |
I mean, that has to be one of the most requests, 01:21:33.400 |
of picking up objects and bringing the objects to you. 01:21:38.080 |
- We love working with customers who have challenging 01:21:40.600 |
problems like this, and this one in particular, 01:21:53.560 |
Probably took them an hour to do it the first time, right? 01:22:00.040 |
for us to work on, to figure out how to automate 01:22:03.880 |
And so we took it on, not because we were gonna make 01:22:06.920 |
a bunch of money from it in selling the robot back to them, 01:22:27.560 |
And if they're, especially if they're gonna buy 10 or 20 01:22:29.720 |
or 30 robots, and they say, I really need it to do this, 01:22:33.120 |
well, that's exactly the right kind of problem 01:22:43.400 |
I think it's fair to say it's notoriously difficult 01:22:47.120 |
to make a lot of money as a robotics company. 01:22:49.560 |
How can you make money as a robotics company? 01:22:55.880 |
It seems that a lot of robotics companies fail. 01:23:02.240 |
It's difficult to build robots at a low enough cost 01:23:06.160 |
where customers, even the industrial setting, 01:23:09.320 |
And it's difficult to build robots that are useful, 01:23:18.120 |
for many years of finding a way to make money. 01:23:23.040 |
the money we made was from doing contract R&D work. 01:23:34.320 |
who had a vision of not only developing advanced technology, 01:23:42.680 |
And so both Google and SoftBank and now Hyundai 01:23:46.000 |
had that vision and were willing to provide that investment. 01:23:51.840 |
Now, our discipline is that we need to go find applications 01:24:03.640 |
because it doesn't work if you don't sell thousands 01:24:07.200 |
If you only sell hundreds, you will commercially fail. 01:24:10.880 |
And that's where most of the small robot companies 01:24:27.920 |
And so it really does take visionary investment 01:24:32.880 |
But we believe that we are going to make money 01:24:45.440 |
if the line goes down because a vacuum pump failed 01:24:48.320 |
someplace, that can be a very expensive process. 01:24:51.400 |
It can be a million dollars a day in lost production. 01:24:54.520 |
Maybe you have to throw away some of the product 01:25:09.760 |
But there needs to be a critical mass of this task. 01:25:12.960 |
And we're focusing on a few that we believe are ubiquitous 01:25:22.360 |
And that's using a thermal camera to keep things 01:25:49.000 |
You know, we're working with GlobalFoundries. 01:26:10.320 |
maybe 1,500 a year for that sort of part of the business. 01:26:15.240 |
So it still needs to grow, but we're on a good path. 01:26:36.040 |
You know, Spot we built because we had decades 01:26:44.240 |
But we had to go figure out what the application was. 01:26:47.120 |
And we actually discovered this factory patrol application, 01:27:02.920 |
There's shipping containers moving all around the world 01:27:06.680 |
full of boxes that are mostly being moved by hand. 01:27:09.240 |
By some estimates, we think there's a trillion boxes, 01:27:13.520 |
cardboard boxes, shipped around the world each year. 01:27:18.200 |
It became clear early on that there was an opportunity 01:27:22.000 |
for a mobile robot in here to move boxes around. 01:27:24.900 |
And the commercial experience has been very different 01:27:35.840 |
was gonna be used for, they immediately started saying, 01:27:43.720 |
We just started shipping the robot in January, 01:27:50.660 |
So our first deliveries of Stretch to customers 01:27:58.320 |
And we have about seven or eight other customers, 01:28:15.580 |
We're gonna do any box moving task in the warehouse. 01:28:21.260 |
And we'll eventually have it doing palletizing 01:28:23.800 |
or depalletizing or loading trucks or unloading trucks. 01:28:37.560 |
- It looks like a big, strong robot arm on a mobile base. 01:28:46.360 |
because that's what lives in warehouses, right? 01:28:50.340 |
So it needed to be able to fit in that space. 01:28:54.720 |
So it was our first, it was actually a bit of a commitment 01:28:59.720 |
from us, a challenge for us to build a non-balancing robot. 01:29:10.800 |
- Well, because it wasn't gonna have this balance problem. 01:29:16.760 |
of the logistics robot we built was a balancing robot. 01:29:36.860 |
I mean, just, can you actually just linger on 01:29:44.420 |
- Yeah, so let me, I love talking about the history 01:29:49.420 |
Because it connects all of our robots, actually. 01:29:59.680 |
we wanted to understand, I was telling you earlier, 01:30:01.600 |
the challenge of the human form is that you have 01:30:05.120 |
And balancing that inertia, that mass up high, 01:30:12.760 |
And so we started trying to get Atlas to balance 01:30:15.480 |
standing on one foot, like on a balance beam, 01:30:31.960 |
We were starting to figure that out on Atlas. 01:30:37.520 |
which was a robot that was gonna be on two wheels, 01:30:51.240 |
a big tail, to help it balance while it was using its arm. 01:30:56.240 |
So the reason why this robot sort of looks epic, 01:31:07.240 |
was the wheels, it has legs so it can extend its legs. 01:31:29.720 |
this two-legged robot was we had figured this thing out, 01:31:41.260 |
It moves in a graceful way like nothing else we've built. 01:31:45.080 |
But it wasn't the right machine for a logistics application. 01:31:50.760 |
And couldn't pick boxes fast enough, basically. 01:31:55.960 |
- Do it beautifully, but it just wasn't efficient enough. 01:32:00.480 |
But I think we'll come back to that machine eventually. 01:32:05.720 |
the fact that you showed that you could do so many things 01:32:13.020 |
That was a demonstration of what is possible. 01:32:17.260 |
and this was really kind of a hard-nosed business decision. 01:32:21.080 |
It meant not doing it just for the beauty 01:32:43.840 |
And that big battery sort of helps it stay balanced, right? 01:32:47.400 |
So it can move a 50-pound box around with its arm 01:32:51.140 |
It's omnidirectional, it can move in any direction, 01:32:57.880 |
so it can deal with gaps or things on the floor 01:33:05.240 |
It's a mobile robot arm that can work to carry, 01:33:13.920 |
- Take a box from point A to point B, anywhere. 01:33:21.040 |
because there's so many trucks and containers 01:33:23.280 |
that where goods are shipped, and it's a brutal job. 01:33:26.000 |
You know, in the summer, it can be 120 degrees 01:33:36.960 |
And so we feel like this is a productivity enhancer. 01:33:43.040 |
And for the people who used to do that job unloading trucks, 01:33:49.120 |
And so by building robots that are easy to control, 01:33:53.120 |
and it doesn't take an advanced degree to manage, 01:34:02.440 |
the warehouse workers who were doing that manual labor 01:34:06.520 |
And so we see this as ultimately a benefit to them as well. 01:34:14.100 |
- Not yet, but I will say that when we engage 01:34:21.260 |
with our customers, they'll be able to see a return 01:34:26.680 |
- Okay, so that's something that you're constantly 01:34:30.400 |
- And I suppose you have to do the same kind of thinking 01:34:39.480 |
- Yeah, and so you have a little more flexibility. 01:34:46.280 |
And with Spot, it took us a while to figure out 01:34:51.560 |
maybe the conversation you were having a while ago 01:34:56.680 |
with Larry Page, maybe looking to the longer future 01:35:11.880 |
about a future where Spot-like robots are in the home 01:35:20.280 |
We think the pathway to getting there is likely 01:35:31.840 |
how to make the software so that they can really 01:35:35.400 |
That's gonna take real investment to get there. 01:35:50.640 |
to start someplace else by just making a cute interaction, 01:35:56.760 |
And so we think the utility really needs to come first. 01:36:01.760 |
And that means you have to solve some of these hard problems. 01:36:05.880 |
And so to get there, we're gonna go through the design 01:36:13.440 |
and then that's eventually gonna let you reach a scale 01:36:20.520 |
And so, yeah, maybe we'll be able to build a smaller Spot 01:36:24.000 |
with an arm that could really go get your beer for you. 01:36:27.400 |
But there's things we need to figure out still. 01:36:32.400 |
and if you're gonna be interacting with children, 01:36:37.000 |
And right now, we count on a little bit of standoff distance 01:36:41.240 |
between the robot and people so that you don't 01:36:45.320 |
So you've got a lot of things you need to go solve 01:36:47.600 |
before you jump to that consumer-level product. 01:36:50.840 |
- Well, there's a kind of trade-off in safety 01:36:52.880 |
because it feels like in the home, you can fall. 01:37:02.200 |
like, you're allowed to fail in different ways, 01:37:05.440 |
in more ways, as long as it's safe for the humans. 01:37:09.720 |
So it just feels like an easier problem to solve 01:37:15.000 |
- That may be true, but I also think the variety of things 01:37:21.080 |
a consumer-level robot would be expected to do 01:37:31.640 |
They're all gonna want you to clean up the dishes 01:37:45.400 |
- So to push back on that, here's where application, 01:37:49.280 |
I think the application of being a pet, a friend. 01:38:04.880 |
There's something about just having interacted with them, 01:38:26.560 |
are, you're gonna be more attached to in the long run. 01:38:33.720 |
They sold over 100,000 of those, maybe 150,000. 01:38:37.200 |
Probably wasn't considered a successful product for them. 01:38:44.040 |
and then they brought it back, Sony brought it back. 01:38:55.640 |
Will you get away without having any other utility? 01:39:14.200 |
Maybe that'll open the social robot up again. 01:39:18.900 |
That's probably not a path we're gonna go down, 01:39:23.440 |
because, again, we're so focused on performance and utility. 01:39:34.160 |
- Yeah, but I also wanna predict that you're wrong on that, 01:39:47.080 |
to adding a ChatGPT-like capability, maybe GPT-5. 01:39:52.080 |
And there's just so many open source alternatives 01:39:55.520 |
that you could just plop that on top of Spot. 01:40:01.240 |
and you're figuring out how to mass manufacture it, 01:40:05.320 |
and how to make it reliable, all those kinds of things, 01:40:09.480 |
to where just adding ChatGPT on top of it will create-- 01:40:12.080 |
- Oh, I do think that being able to verbally converse, 01:40:44.600 |
And then just talk shit about the state of the world. 01:40:50.240 |
I mean, where there's a deep loneliness within all of us, 01:40:52.960 |
and I think a beer and a good chat solves so much of it, 01:40:57.200 |
or it takes us a long way to solving a lot of it. 01:41:04.800 |
when a generative AI can give you that warm feeling 01:41:09.800 |
that you connected, and that, oh, yeah, you remember me, 01:41:28.320 |
in many cases, some of the deepest friendships you have 01:41:31.280 |
is having gone through a difficult time together 01:41:34.160 |
and having a shared memory of an amazing time 01:41:50.320 |
of having gone through some shit in the past. 01:41:52.600 |
And the current systems are not personalized in that way, 01:42:00.960 |
So combine that with an embodied robot like Spot, 01:42:13.160 |
But of course, you have to build that on top of a company 01:42:19.360 |
real customers, and with robots that are safe and work 01:42:29.640 |
in that because of our investors, primarily Hyundai, 01:42:41.440 |
on driving us to profitability as soon as possible. 01:42:49.480 |
of creating, you know, what does mobility mean in the future? 01:43:05.640 |
between let's build a business that makes money. 01:43:15.800 |
we need to have a business that's profitable in the end. 01:43:18.320 |
Otherwise, somebody else is gonna drive the ship for us. 01:43:30.920 |
And the real trick will be if we can do both. 01:43:33.240 |
- Speaking of ships, let me ask you about a competitor 01:43:48.560 |
How does that change the landscape of your work? 01:43:53.560 |
So there's sort of from the outside perspective, 01:44:12.200 |
onto the work that we'd been doing for over a decade. 01:44:19.480 |
And in fact, what we've seen is that in addition to Tesla, 01:44:24.280 |
we're seeing a proliferation of robotic companies arise now. 01:44:41.440 |
a former Boston Dynamics employee on their staff 01:44:47.440 |
I would do that as a company, yeah, for sure. 01:44:54.720 |
It has brought tremendous validation to what we're doing 01:45:01.020 |
Competitive juices are flowing, the whole thing. 01:45:32.460 |
So I think Elon is known for setting these kinds 01:45:42.240 |
but actually pushing not just the particular team he leads, 01:45:50.160 |
Do you see Boston Dynamics in the near future 01:45:57.960 |
kind of pushing Atlas maybe to do more cool stuff, 01:46:02.960 |
trying to drive the cost of Atlas down perhaps? 01:46:13.640 |
in Boston Dynamics due to this little bit of competition. 01:46:21.000 |
When we released our most recent video of Atlas, 01:46:27.440 |
the scaffolding and throwing the box of tools around 01:46:34.380 |
that not only can we do this parkour mobility thing, 01:46:47.880 |
And for the reasons I explained to you earlier, 01:47:12.000 |
We see the next phase of Atlas being more dexterous hands 01:47:19.840 |
that we're gonna start by moving big things around 01:47:32.080 |
Maybe you could go build a special purpose robot arm, 01:47:36.560 |
you know, for stuffing chips into electronics boards. 01:47:41.400 |
But we don't really wanna do really fine work like that. 01:47:47.240 |
where you're using two hands to pick up and balance 01:47:49.720 |
an unwieldy thing, maybe in a manufacturing environment, 01:47:57.200 |
are gonna be able to do with the level of dexterity 01:48:00.000 |
that they're gonna have in the next few years. 01:48:11.200 |
We think there's something very interesting there 01:48:17.560 |
you can transfer a thing from one hand to the other, 01:48:21.840 |
you can reorient it in a way that you can't with just one hand, 01:48:26.560 |
And so there's a lot that extra arm brings to the table. 01:48:32.720 |
you mentioned Boston Dynamics really wants to see 01:48:43.960 |
I think with Elon, he's really driving the cost down. 01:48:47.040 |
Is there some inspiration, some lessons you see there 01:48:55.000 |
especially with Atlas, with a humanoid robot? 01:48:57.080 |
- Well, I think the thing that he's certainly been learning 01:48:59.680 |
by building car factories is what that looks like 01:49:11.840 |
And the smart thing that they have in their favor 01:49:20.480 |
is that they know how to build computers and vision systems. 01:49:25.600 |
between modern automotive companies and robots. 01:49:35.520 |
I mean, automotive company behind us as well. 01:49:43.080 |
The electric vehicles from Hyundai are doing pretty well. 01:49:50.600 |
of the low-level controls, some of the incredible stuff 01:50:02.680 |
sort of higher-level machine learning applications? 01:50:06.240 |
Do you see customers adding on those capabilities 01:50:09.440 |
or do you see Boston Dynamics doing that in-house? 01:50:14.680 |
are probably gonna be more broadly available, 01:50:20.680 |
Using machine learning, like a vision algorithm, 01:50:24.320 |
so a robot can recognize something in the environment. 01:50:27.120 |
That ought to be something you can just download. 01:50:36.080 |
And I think people besides Boston Dynamics will provide that 01:50:41.600 |
that lets people add these vision algorithms to Spot. 01:50:46.600 |
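To make that ecosystem idea concrete, here is a minimal sketch of what "adding a vision algorithm to the robot" could look like in code. This is not the actual Boston Dynamics Spot SDK; the names here (RobotClient, register_detector, gauge_reader) are hypothetical stand-ins for a plugin-style perception pipeline where a partner-supplied model is registered against the robot's camera feed.

```python
# Hypothetical sketch only: class and function names are illustrative,
# not the real Boston Dynamics Spot SDK.
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

@dataclass
class Detection:
    label: str          # e.g. "gauge", "valve", "person"
    confidence: float   # 0.0 .. 1.0
    bbox: tuple         # (x_min, y_min, x_max, y_max) in image pixels

# A vision plugin is just a function: camera frame -> list of detections.
VisionPlugin = Callable[[Any], List[Detection]]

class RobotClient:
    """Stand-in for a robot SDK client that exposes camera frames to plugins."""
    def __init__(self) -> None:
        self._plugins: Dict[str, VisionPlugin] = {}

    def register_detector(self, name: str, plugin: VisionPlugin) -> None:
        # A downloaded or partner-provided model attaches itself here.
        self._plugins[name] = plugin

    def process_frame(self, frame: Any) -> Dict[str, List[Detection]]:
        # Run every registered detector on the latest camera frame.
        return {name: plugin(frame) for name, plugin in self._plugins.items()}

def gauge_reader(frame: Any) -> List[Detection]:
    # Placeholder for a partner-supplied model (e.g. analog gauge reading).
    return [Detection(label="gauge", confidence=0.93, bbox=(120, 80, 220, 180))]

robot = RobotClient()
robot.register_detector("gauge_reader", gauge_reader)
print(robot.process_frame(frame=b"raw-image-bytes"))
```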
And we're currently working with some partners 01:51:02.600 |
So we see that, we see it ultimately as an ecosystem 01:51:11.240 |
you might even be able to do the same thing with behaviors. 01:51:15.200 |
So this technology will also be brought to bear 01:51:19.240 |
on controlling the robot, the motions of the robot. 01:51:27.240 |
to develop algorithms for both locomotion and manipulation. 01:51:34.720 |
you can add new behaviors to a robot quickly. 01:51:45.640 |
I think you need to understand at a deep level 01:51:55.000 |
But it's certainly a place where these approaches 01:52:00.320 |
- So reinforcement learning is part of the process. 01:52:06.400 |
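As a rough illustration of developing behaviors with reinforcement learning in simulation, here is a toy sketch, not Boston Dynamics' actual pipeline: a linear policy is improved by simple random search against a stubbed physics environment, and the resulting weights are saved so a downstream controller could load them. The environment, reward, and dimensions are all invented for illustration.

```python
import numpy as np

class LeggedSimEnv:
    """Toy stand-in for a physics simulation of a legged robot."""
    OBS_DIM, ACT_DIM = 12, 4  # e.g. joint angles/velocities -> joint torques

    def reset(self) -> np.ndarray:
        self.state = np.random.uniform(-0.1, 0.1, self.OBS_DIM)
        return self.state

    def step(self, action: np.ndarray):
        # Fake dynamics: reward a forward-velocity proxy, penalize large torques.
        self.state = 0.9 * self.state + 0.1 * np.tanh(action).repeat(3)
        reward = float(self.state[0]) - 0.01 * float(np.sum(action ** 2))
        done = bool(np.abs(self.state).max() > 5.0)
        return self.state, reward, done

def rollout(env: LeggedSimEnv, policy: np.ndarray, horizon: int = 200) -> float:
    obs, total = env.reset(), 0.0
    for _ in range(horizon):
        action = policy @ obs              # linear policy: observation -> action
        obs, reward, done = env.step(action)
        total += reward
        if done:
            break
    return total

env = LeggedSimEnv()
policy = np.zeros((env.ACT_DIM, env.OBS_DIM))
best = rollout(env, policy)
for _ in range(500):
    candidate = policy + 0.05 * np.random.randn(*policy.shape)
    score = rollout(env, candidate)
    if score > best:                       # keep the better-performing weights
        policy, best = candidate, score

np.save("locomotion_policy", policy)       # weights a robot controller could load
```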
- So there's increasing levels of learning with these robots? 01:52:32.280 |
through the large language models like GPT-4? 01:52:58.040 |
Disinformation is a curse that's an unintended consequence 01:53:03.040 |
of social media that could be exacerbated with these tools. 01:53:17.600 |
with these kinds of models don't have a whole lot to do 01:53:21.120 |
with the way we're gonna use them in our robots. 01:53:23.960 |
If I'm using a robot, I'm building a robot to do, 01:53:37.040 |
There's sort of a built-in mechanism for judging. 01:53:50.040 |
Whereas if you're asking for, yeah, I don't know, 01:53:53.760 |
trying to ask a theoretical question in ChatGPT, 01:54:02.840 |
What is that truth that you're comparing against? 01:54:05.600 |
Whereas in physical reality, you know the truth. 01:54:14.520 |
to be a little bit concerned about, you know, 01:54:18.400 |
how these tools, large language models could be used, 01:54:21.800 |
but I'm not very worried about how they're gonna be used. 01:54:33.080 |
that has different ways of verifying what's going on. 01:54:42.240 |
about the possibility of having conversations with a bot. 01:54:45.720 |
- There are no, I would say, negative consequences to that, 01:55:09.360 |
that's adding the vision algorithms for gauge reading for us. 01:55:15.800 |
where they hooked up, you know, a language tool to Spot 01:55:19.760 |
and they're talking to Spot to give it commands. 01:55:22.840 |
Can you tell me about the Boston Dynamics AI Institute? 01:55:30.320 |
the Boston Dynamics Artificial Intelligence Institute. 01:55:34.760 |
It's led by Marc Raibert, the founder of Boston Dynamics 01:55:37.680 |
and the former CEO and my old advisor at MIT. 01:55:40.980 |
Mark has always loved the research, the pure research, 01:55:46.760 |
without the confinement or demands of commercialization. 01:56:00.420 |
And so he suggested to Hyundai that he set up this institute, 01:56:05.420 |
and they agreed that it's worth additional investment 01:56:10.600 |
to kind of continue pushing this forefront. 01:56:14.360 |
And we expect to be working together where, you know, 01:56:39.040 |
of legged locomotion again, when we started that, 01:56:43.120 |
And so I think Marc wants to have the freedom 01:56:45.560 |
to pursue really hard, over-the-horizon problems. 01:56:50.440 |
And that'll be the goal of the institute. 01:56:57.560 |
some of the concerns about large language models. 01:57:00.200 |
That said, you know, there's been a long running fear 01:57:07.600 |
Why do you think people are afraid of legged robots? 01:57:34.680 |
that there's reason to be a little bit nervous 01:57:42.380 |
It's unfortunate that they chose to use a robot 01:57:53.140 |
But people are afraid because we've been taught 01:58:06.620 |
the Czech playwright who wrote Rossum's Universal Robots. 01:58:19.340 |
And so we've been entertained by these stories 01:58:23.680 |
But I, and I think that's as much why people are afraid 01:58:28.020 |
as anything else, is we've been sort of taught 01:58:31.220 |
that this is the logical progression through fiction. 01:58:38.900 |
I think what people more and more will realize, 01:58:46.380 |
like say you have a super intelligent AI embodied 01:58:57.160 |
And we humans know how to deal with physical reality. 01:59:00.140 |
I think it's much scarier when you have arbitrary scaling 01:59:04.100 |
of intelligent AI systems in the digital space. 01:59:17.100 |
It could tell you, you could put ChatGPT on top of it, 01:59:21.420 |
because you have a contact with physical reality. 01:59:28.380 |
I mean, I'm sure it can start to lie, just like a dog lies to you. 01:59:32.780 |
It's like, I wasn't part of tearing up that couch. 01:59:40.660 |
but you're going to kind of figure it out eventually. 01:59:43.980 |
It's, if it happens multiple times, you know. 01:59:49.300 |
- Humanity has figured out how to make machines safe. 01:59:52.340 |
And there's, you know, regulatory environments 01:59:56.020 |
and certification protocols that we've developed 02:00:00.500 |
in order to figure out how to make machines safe. 02:00:03.820 |
We don't know, and don't have that experience 02:00:06.860 |
with software that can be propagated worldwide 02:00:11.220 |
And so I think we need to develop those protocols 02:00:19.940 |
but I don't think the fear of that and that work 02:00:32.840 |
there's a fear that robots will take our jobs. 02:00:35.900 |
I just, I took a ride, I was in San Francisco, 02:00:38.580 |
I took a ride in a Waymo vehicle, an autonomous vehicle. 02:00:59.700 |
I don't know exactly what they're flicking off. 02:01:30.400 |
there's been fear of, you know, an automation anxiety. 02:01:45.180 |
Sometime in the future, we're gonna look back 02:01:49.060 |
at people who manually unloaded these boxes from trailers 02:01:51.980 |
and we're gonna say, why did we ever do that manually? 02:01:54.660 |
But there's a lot of people who are doing that job today 02:02:00.080 |
But I think the reality is, as I said before, 02:02:05.300 |
so that those very same people can operate it. 02:02:07.560 |
And so I think there's a pathway to upskilling 02:02:09.780 |
and operating just like, look, we used to farm 02:02:13.120 |
with hand tools and now we farm with machines 02:02:15.900 |
and nobody has really regretted that transformation. 02:02:20.300 |
And I think the same can be said for a lot of manual labor 02:02:26.980 |
we're entering a new world where demographics 02:02:31.760 |
are gonna have strong impact on economic growth. 02:02:38.460 |
the first world is losing population quickly. 02:02:42.340 |
In Europe, they're worried about hiring enough people 02:02:47.080 |
just to keep the logistics supply chain going. 02:02:50.380 |
And, you know, part of this is the response to COVID 02:03:00.020 |
But these jobs are getting harder and harder to fill. 02:03:03.180 |
And I just, I'm hearing that over and over again. 02:03:06.300 |
So I think, frankly, this is the right technology 02:03:15.020 |
And we're gonna want tools to enhance that productivity. 02:03:43.380 |
- Well, and a lot, you know, anyone who deals with texts 02:03:46.060 |
and writing a draft proposal might be easily done 02:03:58.620 |
- But on the other hand, you also want it to be right. 02:04:01.820 |
And they don't know how to make it right yet. 02:04:05.100 |
But it might make a good starting point for you to iterate. 02:04:07.900 |
- Boy, do I have to talk to you about modern journalism. 02:04:23.760 |
- You spearheaded the anti-weaponization letter. 02:04:34.640 |
and the general topic of the use of robots in war? 02:04:44.240 |
and then got several leading robotics companies 02:04:50.400 |
Unitree in China and Agility here in the United States 02:05:08.920 |
And part of the motivation there is, you know, 02:05:11.840 |
as these robots start to become commercially available, 02:05:16.360 |
you can see videos online of people who've gotten a robot 02:05:19.720 |
and strapped a gun on it and shown that they can, 02:05:26.760 |
And so having a robot that has this level of mobility 02:05:33.360 |
that could harm somebody from a remote operator 02:05:46.480 |
For, you know, reasons that we think ultimately 02:05:53.200 |
if it grows in a way where robots are ultimately 02:06:03.560 |
But by goodness, you're going to have to trust 02:06:08.240 |
And if you think the robot's going to harm you, 02:06:11.680 |
that's going to impede the growth of that industry. 02:06:16.440 |
So we thought it was important to draw a bright line 02:06:26.680 |
begin to engage with lawmakers and regulators. 02:06:31.680 |
Let's figure out what the rules are going to be 02:06:37.040 |
and use our position as leaders in this industry 02:06:45.920 |
And so we are, in fact, I have a policy director 02:06:51.600 |
at my company whose job it is to engage with the public, 02:06:55.680 |
to engage with interested parties and including regulators 02:07:04.920 |
and it's an important topic for people that worry 02:07:11.960 |
So I'm glad you're sort of leading the way in this. 02:07:19.240 |
What's it take to be a CEO of a robotics company? 02:07:35.120 |
from building the thing to leading a company? 02:07:48.980 |
I talked earlier about the courage to tackle hard problems. 02:07:53.080 |
So I think there's courage required not just of me, 02:07:56.200 |
but of all of the people who work at Boston Dynamics. 02:07:59.960 |
I also think we have a lot of really smart people. 02:08:03.600 |
We have people who are way smarter than I am. 02:08:09.560 |
to lead them and to trust that you have something 02:08:40.440 |
but it was the natural progression of things. 02:08:43.080 |
There was always, there always needed to be something 02:08:52.040 |
that needed to be done that wasn't being done, 02:08:58.500 |
of such strong engineers, oftentimes that was 02:09:04.560 |
or it was in the business development direction, 02:09:16.600 |
So I, you know, just willingness to sort of tackle 02:09:33.880 |
How do you know the guy or gal is gonna make 02:09:41.560 |
that's doing some of the hardest work in the world? 02:09:44.080 |
- You know, we developed an interview process 02:09:50.240 |
It's a little bit of a hard interview process, 02:09:52.600 |
because the best interviews, you ask somebody 02:10:03.360 |
that they worked on, and you saw they really did the work, 02:10:06.280 |
they solved the problems, and you saw their passion for it. 02:10:15.160 |
is you have to ask a probing question about it. 02:10:16.920 |
You have to be smart enough about what they're telling you, 02:10:23.720 |
And so it takes a pretty talented team to do that. 02:10:26.940 |
But if you can do that, that's how you tap into, 02:10:33.000 |
they really did the work, they're excited about it, 02:10:35.520 |
that's the kind of person I want at my company. 02:10:51.560 |
where it didn't matter if you were an engineer, 02:11:08.000 |
to tap into those things I just described to you. 02:11:10.320 |
At Google, they taught us, and I think I understand why. 02:11:14.080 |
You're right, they're hiring tens of thousands of people. 02:11:21.480 |
where they would ask you a standard question. 02:11:25.320 |
and I'm just gonna ask you to write code in front of me. 02:11:32.540 |
It does let you compare candidates really well, 02:11:37.040 |
but it doesn't necessarily let you tap in to who they are. 02:11:41.040 |
Right, 'cause you're asking them to answer your question 02:11:43.600 |
instead of you asking them about what they're interested in. 02:11:55.000 |
But we are still doing that with the technical people. 02:11:59.300 |
But because we too now need to sort of increase 02:12:04.560 |
not everybody's giving a presentation anymore. 02:12:14.360 |
Did they find something interesting or curious? 02:12:31.560 |
if you get a person to talk about what they're interested in, 02:12:37.200 |
Like how much of the whiteboard can you fill out? 02:12:41.040 |
did they really do the work, do they know some of the details? 02:12:47.480 |
Especially with engineering, the work is in the details. 02:13:17.560 |
or at least the wide world of opportunity and possibility 02:13:31.800 |
and perhaps the display, real or fake, of consciousness. 02:14:08.580 |
Is that something almost from a science fiction perspective 02:14:12.000 |
you think about, or do you try to avoid ever, 02:14:15.120 |
try to avoid the topic of consciousness altogether? 02:14:22.560 |
and I don't spend a lot of time thinking about this, right? 02:14:31.500 |
Our robots, you're right, the people anthropomorphize. 02:14:45.200 |
that are similar to things they might even see 02:14:48.380 |
I don't know much about how these large language models work. 02:14:54.520 |
I believe it's a kind of statistical averaging 02:14:58.320 |
of the most common responses to a series of words, right? 02:15:01.840 |
It's sort of a very elaborate word completion. 02:15:23.880 |
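As a toy illustration of that "elaborate word completion" framing, the sketch below builds a bigram model from a tiny invented corpus and samples likely continuations. Real large language models are transformer networks trained on vast amounts of text, not lookup tables, but the prediction target, the statistically likely next token, is the same idea.

```python
from collections import Counter, defaultdict
import random

corpus = ("the robot walks forward the robot turns left "
          "the robot walks forward and picks up the box").split()

# Count how often each word follows each other word (a bigram table).
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def complete(prompt: str, n_words: int = 5) -> str:
    words = prompt.split()
    for _ in range(n_words):
        followers = counts.get(words[-1])
        if not followers:
            break
        # Sample in proportion to how often each continuation was seen.
        choices, weights = zip(*followers.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(complete("the robot"))  # e.g. "the robot walks forward the robot turns"
```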
that are statistically associated with one another, 02:15:36.040 |
that allowed a sentient being to grow or evolve. 02:15:40.840 |
It feels to me like there's something about truth 02:15:45.480 |
or emotions that's just a very different kind of knowledge 02:15:51.520 |
Like the interesting thing about truth is it's absolute. 02:15:54.480 |
And it doesn't matter how frequently it's represented 02:16:04.640 |
And I think emotions are a little bit like that too. 02:16:09.520 |
and I just think that's a different kind of knowledge 02:16:22.320 |
very well might be statistically well represented 02:16:27.640 |
on the internet because the internet is made up of humans. 02:16:32.480 |
So I tend to suspect that large language models 02:16:35.780 |
are going to be able to simulate consciousness 02:16:42.200 |
when fine tuned correctly, would be able to do just that. 02:16:46.280 |
And that's going to be a lot of very complicated 02:16:55.540 |
- There needs to be some process of labeling, I think, 02:17:01.000 |
what is true because there is also disinformation 02:17:05.720 |
available on the web and these models are going to consider 02:17:12.360 |
And again, you can't average something that's true 02:17:24.400 |
and this is obviously something that the purveyors 02:17:31.760 |
- Well, if you interact on some controversial topics 02:17:34.480 |
with these models, they're actually refreshingly nuanced. 02:17:39.120 |
They present, 'cause you realize there's no one truth. 02:18:06.760 |
it presents calmly the amount of evidence for each one. 02:18:25.400 |
well, it sure as hell feels like I'm one of you humans, 02:18:35.200 |
The cool thing about GPT is it seems to be easily confused 02:18:41.680 |
You wake up in a new room and you ask, where am I? 02:18:45.540 |
It seems to be able to do that extremely well. 02:18:49.440 |
It'll tell you one thing, a fact about when a war started, 02:18:58.480 |
It'll have that same element, childlike element, 02:19:02.960 |
with humility of trying to figure out its way in the world. 02:19:13.160 |
of what we want to allow AI systems to say to us. 02:19:18.160 |
Because then if there's elements of sentience 02:19:24.100 |
you can then start to manipulate human emotion, 02:19:29.840 |
that's a really serious and aggressive discussion 02:19:41.480 |
from the arbitrary scaling of software systems 02:19:47.960 |
But that said, I really believe in that connection 02:20:11.760 |
- To what maybe other people that built on top 02:20:14.960 |
of Boston Dynamics robots or Boston Dynamics by itself. 02:20:40.680 |
seems to be defining the trajectory of human civilization? 02:20:44.720 |
Can you give them advice on how to have a career 02:20:58.120 |
Again, this was an organizing principle, I think, 02:21:18.000 |
and you'll be a lot better at it as a result. 02:21:26.800 |
Don't get too hung up on planning too far ahead. 02:21:43.360 |
I always feel like, yeah, there's a few happy mistakes 02:21:45.960 |
that happen along the way and just live with that, 02:21:52.240 |
So avail yourselves to these interesting opportunities, 02:21:55.440 |
like when I happened to run into Marc down in the lab, 02:21:59.880 |
but be willing to make a decision and then pivot 02:22:02.960 |
if you see something exciting to go at, you know, 02:22:08.200 |
you'll find things like that that get you excited. 02:22:12.000 |
- So there was a feeling when you first met Mark 02:22:14.200 |
and saw the robots that there's something interesting. 02:22:30.360 |
what do you think is the role of robots in society? 02:22:32.760 |
Do you think we'll be seeing billions of robots everywhere? 02:23:11.160 |
We don't wanna offload all of the work to the robots 02:23:16.720 |
And I think just self-satisfaction and feeling productive 02:23:31.400 |
don't end up being able to do all the creative work, right? 02:23:41.240 |
is the thing that gives you that serotonin rush 02:23:48.560 |
or that adrenaline rush that you never forget. 02:23:51.440 |
And so, you know, people need to be able to do 02:23:59.800 |
over fairly simple work, it's just well done, you know, 02:24:13.960 |
where they had this big ship and all the people 02:24:17.320 |
were just overweight, lying on their beach chairs, 02:24:21.600 |
kind of sliding around on the deck of the ship in that movie. 02:24:29.600 |
You know, we need to work in some complementary fashion 02:24:32.720 |
where we keep all of our faculties and our physical health 02:24:39.240 |
- And I think a lot of that has to do with the interaction, 02:24:42.600 |
the collaboration with robots and with AI systems. 02:24:45.400 |
I'm hoping there's a lot of interesting possibilities. 02:24:54.760 |
Robots, you know, you can ask a robot to do a job 02:25:05.880 |
It's a machine, and I don't have to have qualms about that. 02:25:21.960 |
and because one of the problems that humans have to solve 02:25:33.000 |
I think that makes a more enriching life, helps you grow, 02:25:44.400 |
nice report, article written about it recently. 02:25:47.560 |
They've been studying this group of a few thousand people 02:25:51.400 |
now for 70 or 80 years, and the conclusion is that 02:26:06.600 |
and I think that could happen with a machine. 02:26:15.020 |
I'm not convinced there will ever be true intelligence 02:26:18.400 |
in these machines, sentience, but they could simulate it, 02:26:24.520 |
and they could, I guess it remains to be seen 02:26:32.480 |
and bring that up, and you feel that connection, 02:26:40.960 |
inklings of that already started happening for me, 02:26:48.880 |
in the accessibility and the ease of use of such robots, 02:27:03.360 |
If you're interested in great engineering or robotics, 02:27:07.120 |
I'll forever celebrate the work you're doing. 02:27:09.600 |
And it's just a big honor that you would sit with me today 02:27:25.040 |
please check out our sponsors in the description. 02:27:36.300 |
A computer would deserve to be called intelligent if it could deceive a human into believing that it was human. 02:27:45.040 |
Thank you for listening, and hope to see you next time.