
Robert Playter: Boston Dynamics CEO on Humanoid and Legged Robotics | Lex Fridman Podcast #374


Chapters

0:00 Introduction
2:57 Early days of Boston Dynamics
11:18 Simplifying robots
15:16 Art and science of robotics
19:59 Atlas humanoid robot
36:53 DARPA Robotics Challenge
51:13 BigDog robot
65:02 Spot robot
86:27 Stretch robot
89:15 Handle robot
94:49 Robots in our homes
103:36 Tesla Optimus robot
112:18 ChatGPT
115:22 Boston Dynamics AI Institute
116:53 Fear of robots
127:16 Running a company
132:52 Consciousness
140:26 Advice for young people
142:21 Future of robots


00:00:00.000 | And so our goal was a natural looking gait.
00:00:02.200 | It was surprisingly hard to get that to work.
00:00:06.200 | But we did build an early machine.
00:00:09.460 | We called it Petman Prototype.
00:00:12.440 | It was the prototype before the Petman robot.
00:00:15.180 | And it had a really nice looking gait
00:00:20.580 | where it would stick the leg out,
00:00:23.040 | it would do heel strike first
00:00:24.640 | before it rolled onto the toes.
00:00:26.000 | So you didn't land with a flat foot,
00:00:27.520 | you extended your leg a little bit.
00:00:30.300 | But even then it was hard to get the robot to walk
00:00:33.580 | so that, while it's walking, it fully extended its leg.
00:00:37.540 | And getting that all to work well took such a long time.
00:00:41.980 | In fact, I probably didn't really see
00:00:45.340 | the nice natural walking that I expected
00:00:47.980 | out of our humanoids until maybe last year.
00:00:51.900 | And the team was developing
00:00:53.740 | on our newer generation of Atlas,
00:00:55.980 | some new techniques for developing
00:00:59.820 | a walking control algorithm.
00:01:01.540 | And they got that natural looking motion
00:01:04.340 | as sort of a byproduct of just a different process
00:01:07.640 | they were applying to developing the control.
00:01:11.060 | So that probably took 15 years,
00:01:13.700 | 10 to 15 years to sort of get that from,
00:01:16.380 | the Petman Prototype was probably in 2008.
00:01:20.220 | And what was it, 2022.
00:01:22.460 | Last year that I think I saw a good walking on Atlas.
00:01:26.120 | (air whooshing)
00:01:27.400 | - The following is a conversation with Robert Playter,
00:01:30.360 | CEO of Boston Dynamics,
00:01:32.960 | a legendary robotics company that over 30 years
00:01:36.480 | has created some of the most elegant, dexterous
00:01:40.080 | and simply amazing robots ever built,
00:01:42.960 | including the Humanoid Robot Atlas
00:01:45.820 | and the Robot Dog Spot.
00:01:48.880 | One or both of whom you've probably seen on the internet,
00:01:52.760 | either dancing, doing back flips, opening doors
00:01:56.400 | or throwing around heavy objects.
00:01:58.920 | Robert has led both the development
00:02:02.040 | of Boston Dynamics Humanoid Robots
00:02:03.800 | and their physics-based simulation software.
00:02:07.240 | He has been with the company from the very beginning,
00:02:10.480 | including its roots at MIT,
00:02:12.600 | where he received his PhD in aeronautical engineering.
00:02:16.300 | This was in 1994 at the legendary MIT Leg Lab.
00:02:21.400 | He wrote his PhD thesis on robot gymnastics,
00:02:25.360 | as part of which he programmed a bipedal robot
00:02:28.640 | to do the world's first 3D robotic somersault.
00:02:32.440 | Robert is a great engineer, roboticist and leader
00:02:35.320 | and Boston Dynamics to me as a roboticist
00:02:38.440 | is a truly inspiring company.
00:02:40.440 | This conversation was a big honor and pleasure
00:02:43.200 | and I hope to do a lot of great work
00:02:45.600 | with these robots in the years to come.
00:02:47.940 | This is the Lex Fridman Podcast.
00:02:49.920 | To support it, please check out our sponsors
00:02:51.840 | in the description.
00:02:53.080 | And now, dear friends, here's Robert Playter.
00:02:56.560 | When did you first fall in love with robotics?
00:03:00.140 | Let's start with love and robots.
00:03:04.760 | - Well, love is relevant because I think the fascination,
00:03:09.760 | the deep fascination is really about movement.
00:03:12.800 | And I was visiting MIT looking for a place to get a PhD
00:03:19.140 | and I wanted to do some laboratory work.
00:03:21.800 | And one of my professors in the aero department said,
00:03:24.640 | "Go see this guy Mark Raber down in the basement
00:03:26.860 | "of the AI lab."
00:03:29.160 | And so I walked down there and saw him.
00:03:31.480 | He showed me his robots
00:03:32.880 | and he showed me this robot doing a somersault.
00:03:37.040 | And I just immediately went, "Whoa!"
00:03:39.520 | You know?
00:03:40.560 | Robots can do that?
00:03:41.720 | And because of my own interest in gymnastics,
00:03:44.600 | there was like this immediate connection.
00:03:46.880 | And I was interested in, I was in an aero-astro degree
00:03:51.300 | because flight and movement was all so fascinating to me.
00:03:55.540 | And then it turned out that robotics had this big challenge.
00:03:59.140 | How do you balance?
00:04:01.380 | How do you build a legged robot
00:04:03.020 | that can really get around?
00:04:04.700 | And that just, that was a fascination.
00:04:07.780 | And it still exists today.
00:04:09.340 | They're still working on perfecting motion in robots.
00:04:13.040 | - What about the elegance and the beauty
00:04:15.100 | of the movement itself?
00:04:16.340 | Is there something maybe grounded in your appreciation
00:04:20.140 | of movement from your gymnastics days?
00:04:23.820 | Did you, was there something you just fundamentally
00:04:28.300 | appreciated about the elegance and beauty of movement?
00:04:31.180 | - You know, we had this concept in gymnastics
00:04:34.180 | of letting your body do what it wanted to do.
00:04:38.220 | When you get really good at gymnastics,
00:04:42.000 | part of what you're doing is putting your body
00:04:45.540 | into a position where the physics and the body's inertia
00:04:48.940 | and momentum will kind of push you in the right direction
00:04:52.060 | in a very natural and organic way.
00:04:54.700 | And the thing that Mark was doing, you know,
00:04:56.740 | in the basement of that laboratory,
00:04:59.920 | was trying to figure out how to build machines
00:05:01.660 | to take advantage of those ideas.
00:05:03.340 | How do you build something so that the physics
00:05:05.460 | of the machine just kind of inherently wants to do
00:05:08.940 | what it wants to do?
00:05:09.820 | And he was building these springy pogo stick type,
00:05:13.460 | you know, his first cut at legged locomotion
00:05:15.540 | was a pogo stick where it's bouncing
00:05:17.500 | and there's a spring mass system that's oscillating,
00:05:21.620 | has its own sort of natural frequency there.
00:05:24.260 | And sort of figuring out how to augment
00:05:26.340 | those natural physics with also intent,
00:05:29.980 | how do you then control that, but not overpower it?
00:05:33.180 | It's that coordination that I think creates real potential.
00:05:37.380 | We could call it beauty, you know, you could call it,
00:05:39.740 | I don't know, synergy, people have different words for it.
00:05:43.580 | But I think that that was inherent from the beginning,
00:05:46.260 | that was clear to me, that that's part of what Mark
00:05:48.540 | was trying to do, he asked me to do that in my research work
00:05:51.580 | so, you know, that's where it got going.
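As a side note on the spring-mass idea above, the textbook relation is enough to see what "its own natural frequency" means; k and m below are generic stiffness and mass placeholders, not numbers from Raibert's hoppers.

```latex
% Undamped spring-mass (pogo-stick) oscillation: m\ddot{x} + kx = 0
\omega_n = \sqrt{\tfrac{k}{m}}, \qquad f_n = \frac{\omega_n}{2\pi}
% A hopping controller that "goes with" the physics excites the leg near f_n
% instead of fighting it.
```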
00:05:53.980 | - So part of the thing that I think I'm calling elegance
00:05:56.180 | and beauty in this case, which was there,
00:05:58.780 | even with the pogo stick, is maybe the efficiency,
00:06:01.900 | so letting the body do what it wants to do,
00:06:04.580 | trying to discover the efficient movement.
00:06:06.860 | - It's definitely more efficient,
00:06:08.380 | it also becomes easier to control in its own way
00:06:12.220 | because the physics are solving some of the problem itself,
00:06:15.340 | it's not like you have to do all this calculation
00:06:18.180 | and overpower the physics, the physics naturally,
00:06:20.780 | inherently want to do the right thing.
00:06:23.580 | There can even be, you know, feedback mechanisms,
00:06:26.860 | stabilizing mechanisms that occur simply by virtue
00:06:31.220 | of the physics of the body and it's, you know,
00:06:33.940 | not all in the computer or not even all in your mind
00:06:38.020 | as a person. (laughs)
00:06:39.500 | And there's something interesting in that melding.
00:06:43.660 | - You were with Mark for many, many, many years,
00:06:45.980 | but you were there in this kind of legendary space
00:06:49.060 | of Leg Lab at MIT in the basement.
00:06:53.500 | All great things happen in the basement.
00:06:55.300 | Is there some memories from that time that you have?
00:07:00.300 | Because it's such cutting edge work
00:07:05.340 | in robotics and artificial intelligence.
00:07:09.100 | - The memories, the distinctive lessons, I would say,
00:07:12.180 | I learned in that time period
00:07:15.260 | and that I think Mark was a great teacher of,
00:07:19.300 | was it's okay to pursue your interests, your curiosity,
00:07:24.260 | do something because you love it.
00:07:25.900 | You'll do it a lot better if you love it.
00:07:29.180 | That is a lasting lesson that I think we apply
00:07:36.380 | at the company still.
00:07:38.800 | And really is a core value.
00:07:41.680 | - So the interesting thing is I've gotten to interact
00:07:43.800 | with people like Russ Tedrake and others,
00:07:50.040 | and the students that work at those robotics labs
00:07:52.640 | are some of the happiest people I've ever met.
00:07:55.160 | I don't know what that is. (laughs)
00:07:57.320 | I meet a lot of PhD students.
00:07:58.640 | A lot of them are kind of broken by the wear and tear
00:08:01.680 | of the process.
00:08:03.100 | But roboticists are, while they work extremely hard
00:08:06.120 | and work long hours, there's a happiness there.
00:08:11.120 | The only other group of people I've met like that
00:08:14.160 | are people that skydive a lot. (laughs)
00:08:17.240 | For some reason, there's a deep, fulfilling happiness.
00:08:19.280 | Maybe from a long period of struggle
00:08:22.360 | to get a thing to work and it works
00:08:23.960 | and there's a magic to it.
00:08:24.940 | I don't know exactly.
00:08:26.520 | Because it's so fundamentally hands-on
00:08:29.240 | and you're bringing a thing to life.
00:08:30.640 | I don't know what it is, but they're happy.
00:08:33.040 | - We see, our attrition at the company is really low.
00:08:37.080 | People come and they love the pursuit.
00:08:40.400 | And I think part of that is that there's perhaps
00:08:43.400 | a natural connection to it.
00:08:45.160 | It's a little bit easier to connect
00:08:46.620 | when you have a robot that's moving around in the world
00:08:49.280 | and part of your goal is to make it move around
00:08:51.120 | in the world.
00:08:52.240 | You can identify with that.
00:08:53.960 | And this is one of the unique things
00:08:56.680 | about the kinds of robots we're building
00:08:58.240 | is this physical interaction
00:09:01.120 | lets you perhaps identify with it.
00:09:03.920 | So I think that is a source of happiness.
00:09:05.840 | I don't think it's unique to robotics.
00:09:07.320 | I think anybody also who is just pursuing
00:09:10.320 | something they love, it's easier to work hard at it
00:09:13.720 | and be good at it.
00:09:14.760 | Not everybody gets to find that.
00:09:19.580 | I do feel lucky in that way
00:09:21.280 | and I think we're lucky as an organization
00:09:23.820 | that we've been able to build a business around this
00:09:27.080 | and that keeps people engaged.
00:09:29.420 | - So if it's all right, let's linger on Mark
00:09:31.940 | for a little bit longer.
00:09:33.080 | Marc Raibert, so he's a legend.
00:09:35.240 | He's a legendary engineer and roboticist.
00:09:39.000 | What have you learned about life about robotics from Mark
00:09:42.840 | through all the many years you worked with him?
00:09:45.160 | - I think the most important lesson,
00:09:47.240 | which was have the courage of your convictions
00:09:49.440 | and do what you think is interesting.
00:09:51.400 | Be willing to try to find big, big problems to go after.
00:09:59.000 | And at the time, legged locomotion,
00:10:01.240 | especially in a dynamic machine, nobody had solved it
00:10:06.980 | and that felt like a multi-decade problem to go after.
00:10:11.980 | And so have the courage to go after that
00:10:15.260 | because you're interested.
00:10:17.160 | Don't worry if it's gonna make money.
00:10:19.780 | That's been a theme.
00:10:21.200 | So that's really probably the most important lesson
00:10:26.020 | I think that I got from Mark.
00:10:28.200 | - How crazy is the effort of doing legged robotics
00:10:32.700 | at that time especially?
00:10:34.420 | - You know, Mark got some stuff to work
00:10:38.060 | starting from the simple ideas.
00:10:40.100 | So maybe the other, another important idea
00:10:42.660 | that has really become a value of the company
00:10:45.260 | is try to simplify a thing to the core essence.
00:10:50.260 | And while Mark was showing videos of animals
00:10:53.780 | running across the savanna or climbing mountains,
00:10:58.780 | what he started with was a pogo stick
00:11:00.880 | because he was trying to reduce the problem
00:11:03.160 | to something that was manageable
00:11:04.600 | and getting the pogo stick to balance.
00:11:07.400 | Had in it the fundamental problems
00:11:10.360 | that if we solved those, you could eventually extrapolate
00:11:12.920 | to something that galloped like a horse.
00:11:15.640 | And so look for those simplifying principles.
00:11:18.560 | - How tough is the job of simplifying a robot?
00:11:22.640 | - So I'd say in the early days,
00:11:24.600 | the thing that made Boston,
00:11:27.200 | the researchers at Boston Dynamics special
00:11:29.920 | is that we worked on figuring out
00:11:33.760 | what that central principle was
00:11:38.440 | and then building software or machines
00:11:41.040 | around that principle.
00:11:42.080 | And that was not easy in the early days.
00:11:44.240 | And it took real expertise in understanding
00:11:49.680 | the dynamics of motion and feedback control principles.
00:11:54.680 | How to build, with the computers at the time,
00:11:58.780 | how to build a feedback control algorithm
00:12:00.640 | that was simple enough that it could run in real time
00:12:02.800 | at a thousand hertz and actually get that machine to work.
00:12:06.300 | And that was not something everybody was doing
00:12:10.920 | at that time.
00:12:12.320 | Now the world's changing now.
00:12:13.600 | And I think the approaches to controlling robots
00:12:16.560 | are going to change.
00:12:18.960 | But, and they're going to become more broadly available.
00:12:23.460 | But at the time, there weren't many groups
00:12:28.440 | who could really sort of work at that principled level
00:12:31.680 | with both the software and make the hardware work.
00:12:36.480 | And I'll say one other thing about,
00:12:38.880 | you were sort of talking about what are the special things.
00:12:41.880 | The other thing was, it's good to break stuff.
00:12:48.560 | Use the robots, break them, repair them,
00:12:51.360 | fix and repeat, test, fix and repeat.
00:12:56.160 | And that's also a core principle
00:12:59.080 | that has become part of the company.
00:13:01.680 | And it lets you be fearless in your work.
00:13:05.520 | Too often, if you are working with a very expensive robot,
00:13:09.240 | maybe one that you bought from somebody else
00:13:11.040 | or that you don't know how to fix,
00:13:12.720 | then you treat it with kid gloves
00:13:15.280 | and you can't actually make progress.
00:13:17.500 | You have to be able to break something.
00:13:19.040 | And so I think that's been a principle as well.
00:13:22.480 | - So just to linger on that,
00:13:23.480 | psychologically, how do you deal with that?
00:13:25.120 | 'Cause I remember I had,
00:13:26.320 | I built a RC car,
00:13:30.240 | where it did some,
00:13:31.480 | it had some custom stuff, like compute on it,
00:13:35.480 | all that kind of stuff, cameras.
00:13:37.240 | And because I didn't sleep much,
00:13:40.640 | the code I wrote had an issue where it didn't stop the car
00:13:44.520 | and the car got confused and at full speed
00:13:47.460 | at like 20, 25 miles an hour, slammed into a wall.
00:13:51.340 | And I just remember sitting there alone in a deep sadness.
00:13:55.100 | Sort of full of regret, I think, almost anger,
00:14:01.460 | but also like sadness because you think about,
00:14:06.340 | well, these robots, especially for autonomous vehicles,
00:14:08.980 | like you should be taking safety very seriously,
00:14:12.020 | even in these kinds of things.
00:14:14.440 | But just no good feelings.
00:14:17.020 | It made me more afraid probably
00:14:18.580 | to do this kind of experiments in the future.
00:14:20.820 | Perhaps the right way to have seen that is positively.
00:14:24.000 | - It depends if you could have built that car
00:14:27.700 | or just gotten another one, right?
00:14:29.660 | That would have been the approach.
00:14:32.100 | I remember when I got to grad school,
00:14:37.100 | I got some training about operating a lathe
00:14:42.740 | and a mill up in the machine shop,
00:14:45.120 | and I could start to make my own parts.
00:14:47.600 | And I remember breaking some piece of equipment in the lab
00:14:52.240 | and then realizing, 'cause maybe this was a unique part
00:14:55.640 | and I couldn't go buy it.
00:14:57.120 | And I realized, oh, I can just go make it.
00:15:00.280 | That was an enabling feeling.
00:15:02.800 | Then you're not afraid.
00:15:04.280 | Yeah, it might take time.
00:15:05.880 | It might take more work than you thought
00:15:07.920 | it was gonna be required to get this thing done,
00:15:10.640 | but you can just go make it.
00:15:12.080 | And that's freeing in a way that nothing else is.
00:15:16.760 | - You mentioned the control and the dynamics.
00:15:20.820 | Sorry for the romantic question,
00:15:22.540 | but in the early days, and even now,
00:15:25.000 | though it's probably more appropriate
00:15:27.800 | for the early days,
00:15:28.860 | Is it more art or science?
00:15:30.820 | - There's a lot of science around it.
00:15:35.700 | And trying to develop scientific principles
00:15:40.320 | that let you extrapolate from one-legged machine to another,
00:15:44.260 | develop a core set of principles
00:15:47.620 | like a spring mass bouncing system,
00:15:51.540 | and then figure out how to apply that
00:15:53.840 | from a one-legged machine to a two- or a four-legged machine.
00:15:56.820 | Those principles are really important,
00:15:58.560 | and were definitely a core part of our work.
00:16:03.040 | There's also, when we started to pursue humanoid robots,
00:16:10.760 | there was so much complexity in that machine
00:16:14.320 | that one of the benefits of the humanoid form
00:16:18.060 | is you have some intuition
00:16:19.400 | about how it should look while it's moving.
00:16:21.840 | And that's a little bit of an art, I think.
00:16:25.600 | And now it's, or maybe it's just tapping into a knowledge
00:16:28.500 | that you have deep in your body,
00:16:30.400 | and then trying to express that in the machine.
00:16:33.160 | But that's an intuition
00:16:34.860 | that's a little bit more on the art side.
00:16:37.240 | Maybe it predates your knowledge.
00:16:39.560 | Before you have the knowledge of how to control it,
00:16:41.640 | you try to work through the art channel.
00:16:43.800 | And humanoids sort of make that available to you.
00:16:45.800 | If it had been a different shape,
00:16:47.640 | maybe you wouldn't have had the same intuition about it.
00:16:50.760 | - Yeah, so your knowledge about moving through the world
00:16:55.020 | is not made explicit to you.
00:16:56.800 | So you just, that's why it's art.
00:16:59.440 | - Yeah, it might be hard to actually articulate exactly.
00:17:02.520 | There's something about, and being a competitive athlete,
00:17:07.140 | there's something about seeing movement.
00:17:11.000 | A coach, one of the greatest strengths a coach has
00:17:14.320 | is being able to see some little change
00:17:17.200 | in what the athlete is doing,
00:17:18.800 | and then being able to articulate that to the athlete.
00:17:21.200 | And then maybe even trying to say,
00:17:22.600 | "And you should try to feel this."
00:17:24.680 | So there's something just in seeing.
00:17:28.120 | And again, sometimes it's hard to articulate
00:17:31.540 | what it is you're seeing.
00:17:32.960 | But there's a, just perceiving the motion
00:17:35.800 | at a rate that is, again,
00:17:39.700 | sometimes hard to put into words.
00:17:41.700 | - Yeah, I wonder how it is possible
00:17:46.700 | to achieve sort of truly elegant movement.
00:17:49.580 | You have a movie like Ex Machina,
00:17:51.740 | not sure if you've seen it,
00:17:53.120 | but the main actress in that, who plays the AI robot,
00:17:57.220 | I think is a ballerina.
00:17:58.980 | I mean, just the natural elegance,
00:18:02.420 | and the, I don't know, eloquence of movement.
00:18:05.460 | It looks efficient and easy, and just, it looks right.
00:18:12.980 | It looks beautiful. - It looks right
00:18:14.220 | is sort of the key, yeah.
00:18:15.660 | - And then you look at, especially early robots,
00:18:19.000 | I mean, they're so cautious in the way they move
00:18:22.180 | that it's not the caution that looks wrong,
00:18:27.140 | it's something about the movement that looks wrong
00:18:30.220 | that feels like it's very inefficient, unnecessarily so.
00:18:34.100 | And it's hard to put that into words exactly.
00:18:37.980 | - We think, and part of the reason why people
00:18:40.900 | are attracted to the machines we build
00:18:43.860 | is because the inherent dynamics of movement
00:18:46.580 | are closer to right.
00:18:48.620 | Because we try to use walking gaits,
00:18:53.220 | or we build a machine around this gait
00:18:55.660 | where you're trying to work with the dynamics
00:18:57.900 | of the machine instead of stopping them.
00:19:01.300 | Some of the early walking machines,
00:19:03.060 | you're essentially, you're really trying hard
00:19:06.520 | to not let them fall over,
00:19:08.020 | and so you're always stopping the tipping motion.
00:19:10.860 | And sort of the insight of dynamic stability
00:19:16.460 | in a legged machine is to go with it.
00:19:19.220 | Let the tipping happen, let yourself fall,
00:19:22.420 | but then catch yourself with that next foot.
00:19:24.820 | And there's something about getting those physics
00:19:27.340 | to be expressed in the machine
00:19:29.740 | that people interpret as lifelike,
00:19:34.460 | or elegant, or just natural looking.
00:19:38.300 | And so I think if you get the physics right,
00:19:41.100 | it also ends up being more efficient, likely.
00:19:44.500 | There's a benefit that it probably ends up being
00:19:47.540 | more stable in the long run.
00:19:49.260 | You know, it could walk stably
00:19:50.860 | over a wider range of conditions.
00:19:53.860 | And it's more beautiful.
00:19:57.220 | And attractive at the same time.
00:19:58.820 | - So how hard is it to get the humanoid robot Atlas
00:20:03.660 | to do some of the things it's recently been doing?
00:20:06.060 | Let's forget the flips and all of that.
00:20:08.140 | Let's just look at the running.
00:20:10.340 | Maybe you can correct me,
00:20:11.340 | but there's something about running,
00:20:12.940 | I mean, that's not careful at all.
00:20:14.420 | That's you're falling forward.
00:20:16.580 | You're jumping forward and are falling.
00:20:18.220 | So how hard is it to get that right?
00:20:20.860 | - Our first humanoid,
00:20:22.500 | we needed to deliver natural looking walking.
00:20:25.180 | You know, we took a contract from the army.
00:20:27.740 | They wanted a robot that could walk naturally.
00:20:31.140 | They wanted to put a suit on the robot
00:20:33.860 | and be able to test it in a gas environment.
00:20:36.660 | And so they wanted the motion to be natural.
00:20:39.460 | And so our goal was a natural looking gait.
00:20:43.140 | It was surprisingly hard to get that to work.
00:20:47.700 | And we, but we did build an early machine.
00:20:50.400 | We called it Petman Prototype.
00:20:53.380 | It was the prototype before the Petman robot.
00:20:56.120 | And it had a really nice looking gait
00:21:01.500 | where it would stick the leg out.
00:21:03.980 | It would do heel strike first before it rolled onto the toe.
00:21:06.860 | So you didn't land with a flat foot.
00:21:08.440 | You extended your leg a little bit.
00:21:10.400 | But even then it was hard to get the robot to walk
00:21:14.460 | where, while it's walking, it fully extended its leg
00:21:17.740 | and essentially landed on an extended leg.
00:21:20.080 | And if you watch closely how you walk,
00:21:22.320 | you probably land on an extended leg,
00:21:23.980 | but then you immediately flex your knee
00:21:25.780 | as you start to make that contact.
00:21:28.140 | And getting that all to work well took such a long time.
00:21:32.620 | In fact, I probably didn't really see
00:21:35.980 | the nice natural walking that I expected
00:21:38.620 | out of our humanoids until maybe last year.
00:21:42.540 | And the team was developing
00:21:44.380 | on our newer generation of Atlas,
00:21:46.420 | you know, some new techniques
00:21:47.860 | for developing a walking control algorithm.
00:21:52.180 | And they got that natural looking motion
00:21:54.960 | as sort of a byproduct of just a different process
00:21:58.280 | they were applying to developing the control.
00:22:01.720 | So that probably took 15 years,
00:22:04.320 | 10 to 15 years to sort of get that from, you know,
00:22:08.200 | the Petman prototype was probably in 2008.
00:22:10.840 | And what was it, 2022.
00:22:13.080 | Last year that I think I saw good walking on Atlas.
00:22:15.960 | - If you could just like linger on it,
00:22:17.520 | what are some challenges of getting good walking?
00:22:19.760 | So is it, is this partially like a hardware,
00:22:24.760 | like actuator problem?
00:22:27.140 | Is it the control?
00:22:29.060 | Is it the artistic element of just observing
00:22:31.200 | the whole system operating in different conditions together?
00:22:34.260 | I mean, is there some kind of interesting
00:22:36.360 | quirks or challenges you can speak to?
00:22:39.900 | Like the heel strike or all that?
00:22:41.220 | - Yeah, so one of the things that makes the,
00:22:43.060 | like this straight leg a challenge
00:22:46.140 | is you're sort of up against a singularity,
00:22:49.700 | a mathematical singularity,
00:22:51.900 | where, you know, when your leg is fully extended,
00:22:53.900 | it can't go further the other direction, right?
00:22:56.180 | There's only, you can only move in one direction.
00:22:58.940 | And that makes all of the calculations
00:23:00.940 | around how to produce torques at that joint
00:23:04.500 | or positions makes it more complicated.
00:23:07.500 | And so having all of the mathematics
00:23:09.780 | so it can deal with these singular configurations
00:23:14.220 | is one of many challenges that we face.
00:23:19.660 | And I'd say in the, you know, in those earlier days,
00:23:23.580 | again, we were working with these really simplified models.
00:23:27.160 | So we're trying to boil all the physics
00:23:29.460 | of the complex human body into a simpler subsystem
00:23:34.100 | that we can more easily describe in mathematics.
00:23:36.780 | And sometimes those simpler subsystems
00:23:38.980 | don't have all of that complexity
00:23:40.900 | of the straight leg built into them.
00:23:43.900 | And so what's happened more recently
00:23:46.240 | is we're able to apply techniques
00:23:48.020 | that let us take the full physics of the robot into account
00:23:53.020 | and deal with some of those strange situations
00:23:57.260 | like the straight leg.
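To make the straight-leg singularity concrete, here is a small sketch for a generic planar two-link leg (hip plus knee); the link lengths and angles are illustrative placeholders, not Atlas parameters. The Jacobian relating joint motion to foot motion loses rank as the knee straightens, which is why torque and position calculations get touchy there.

```python
import numpy as np

def leg_jacobian(q1, q2, l1=0.4, l2=0.4):
    """Jacobian of planar foot position w.r.t. hip (q1) and knee (q2) angles.
    l1, l2 are placeholder thigh/shank lengths."""
    return np.array([
        [l1 * np.cos(q1) + l2 * np.cos(q1 + q2), l2 * np.cos(q1 + q2)],
        [l1 * np.sin(q1) + l2 * np.sin(q1 + q2), l2 * np.sin(q1 + q2)],
    ])

for knee in [0.6, 0.2, 0.05, 0.0]:   # radians of knee flexion
    J = leg_jacobian(q1=0.1, q2=knee)
    print(f"knee={knee:4.2f}  det(J)={np.linalg.det(J):+.4f}  cond(J)={np.linalg.cond(J):.1f}")

# det(J) = l1*l2*sin(q2) -> 0 as the leg straightens, so mapping joint torques
# to foot forces (tau = J^T f) becomes ill-conditioned at full extension.
```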
00:23:59.060 | - So is there a fundamental challenge here
00:24:00.900 | that it's, maybe you can correct me,
00:24:03.260 | but is it underactuated?
00:24:05.140 | Are you falling?
00:24:06.460 | - Underactuated is the right word, right?
00:24:08.260 | You can't push the robot in any direction you want to, right?
00:24:12.660 | And so that is one of the hard problems
00:24:15.340 | of legged locomotion.
00:24:16.940 | - And you have to do that for natural movement.
00:24:20.300 | - It's not necessarily required for natural movement.
00:24:22.260 | It's just required, you know,
00:24:25.300 | we don't have a gravity force
00:24:27.540 | that you can hook yourself onto
00:24:28.820 | to apply an external force
00:24:31.340 | in the direction you want at all times, right?
00:24:33.340 | The only external forces are being mediated
00:24:36.060 | through your feet and how they get mediated
00:24:38.260 | depend on how you place your feet.
00:24:40.180 | And, you know, you can't just, you know,
00:24:43.900 | God's hand can't reach down
00:24:45.300 | and push in any direction you want, you know, so.
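What he is describing is usually written as centroidal dynamics; this is the standard textbook bookkeeping, not a claim about Boston Dynamics' particular formulation.

```latex
% External forces on a legged robot are gravity plus contact forces f_i
% applied at the foothold positions p_i:
m\,\ddot{c} = m\,g + \sum_i f_i          % linear momentum of the CoM c
\dot{L} = \sum_i (p_i - c) \times f_i    % angular momentum about c
% Each f_i exists only while that foot is on the ground and must stay inside
% its friction cone, so the controller can steer the body only by choosing
% where to step and how hard to push -- no external "hand" is available.
```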
00:24:49.100 | - Is there some extra challenge
00:24:51.820 | to the fact that Atlas is such a big robot?
00:24:54.420 | - There is.
00:24:55.260 | The humanoid form is attractive in many ways,
00:24:59.820 | but it's also a challenge in many ways.
00:25:01.780 | You have this big upper body
00:25:05.460 | that has a lot of mass and inertia.
00:25:07.480 | And throwing that inertia around
00:25:10.860 | increases the complexity of maintaining balance.
00:25:13.980 | And as soon as you pick up something heavy in your arms,
00:25:16.260 | you've made that problem even harder.
00:25:19.100 | And so in the early work in the leg lab
00:25:23.100 | and in the early days at the company,
00:25:25.580 | you know, we were pursuing these quadruped robots,
00:25:28.060 | which had a kind of built-in simplification.
00:25:31.060 | You had this big rigid body and then really light legs.
00:25:34.580 | So when you swing the legs,
00:25:36.520 | the leg motion didn't impact the body motion very much.
00:25:41.460 | All the mass and inertia was in the body.
00:25:43.700 | But when you have the humanoid, that doesn't work.
00:25:45.900 | You have big, heavy legs.
00:25:47.060 | You swing the legs, it affects everything else.
00:25:49.940 | And so dealing with all of that interaction
00:25:54.300 | does make the humanoid a much more complicated platform.
00:25:58.060 | - And I also saw that at least recently,
00:26:00.940 | you've been doing more explicit modeling
00:26:02.860 | of the stuff you pick up.
00:26:04.660 | - Yeah, yeah.
00:26:05.660 | - Which is very, really interesting.
00:26:09.740 | So you have to, what, model the shape,
00:26:13.620 | the weight distribution.
00:26:15.140 | I don't know what, like you have to include that
00:26:19.180 | as part of the modeling, as part of the planning.
00:26:21.820 | 'Cause, okay, so for people who don't know,
00:26:24.800 | so Atlas, at least in a recent video,
00:26:27.180 | like throws a heavy bag, throws a bunch of stuff.
00:26:31.700 | So what's involved in picking up a thing, a heavy thing,
00:26:35.560 | and when that thing is a bunch
00:26:38.700 | of different non-standard things.
00:26:40.540 | I think it also picked up like a barbell.
00:26:43.500 | And to be able to throw it in some cases,
00:26:45.820 | what are some interesting challenges there?
00:26:48.580 | - So we were definitely trying to show
00:26:50.660 | that the robot and the techniques we're applying
00:26:52.900 | to Atlas let us deal with heavy things in the world.
00:26:57.900 | Because if the robot's gonna be useful,
00:26:59.740 | it's actually gotta move stuff around.
00:27:01.980 | And that needs to be significant stuff.
00:27:04.280 | That's an appreciable portion
00:27:06.080 | of the body weight of the robot.
00:27:07.900 | And we also think this differentiates us
00:27:10.980 | from the other humanoid robot activities
00:27:13.140 | that you're seeing out there.
00:27:14.220 | Mostly they're not picking stuff up yet.
00:27:16.700 | And not heavy stuff anyway.
00:27:18.100 | But just like you or me,
00:27:21.020 | you need to anticipate that moment.
00:27:22.820 | You're reaching out to pick something up,
00:27:24.220 | and as soon as you pick it up,
00:27:25.300 | your center of mass is gonna shift.
00:27:27.620 | And if you're gonna turn in a circle,
00:27:30.980 | you have to take that inertia into account.
00:27:32.980 | And if you're gonna throw a thing,
00:27:35.100 | you've got all of that has to be sort of included
00:27:37.900 | in the model of what you're trying to do.
00:27:41.140 | So the robot needs to have some idea
00:27:43.380 | or expectation of what that weight is,
00:27:45.260 | and then sort of predict,
00:27:48.140 | think a couple of seconds ahead,
00:27:49.580 | how do I manage now my body
00:27:52.700 | plus this big heavy thing together?
00:27:56.020 | And still maintain balance.
00:27:57.940 | That's a big change for us.
00:28:03.660 | And I think the tools we've built
00:28:05.420 | are really allowing that to happen quickly now.
00:28:08.500 | Some of those motions that you saw
00:28:10.220 | in that most recent video,
00:28:12.500 | we were able to create in a matter of days.
00:28:14.940 | It used to be that it took six months
00:28:16.540 | to do anything new on the robot.
00:28:18.580 | And now we're starting to develop the tools
00:28:20.860 | that let us do that in a matter of days.
00:28:22.500 | And so we think that's really exciting.
00:28:24.780 | That means that the ability to create new behaviors
00:28:27.460 | for the robot is gonna be a quicker process.
00:28:31.900 | - So being able to explicitly model new things
00:28:35.060 | that it might need to pick up, new types of things.
00:28:37.620 | - And to some degree,
00:28:39.700 | you don't wanna have to pay too much attention
00:28:41.620 | to each specific thing, right?
00:28:45.220 | There's sort of a generalization here.
00:28:47.120 | Obviously when you grab a thing,
00:28:50.980 | you have to conform your hand, your end effector
00:28:53.860 | to the surface of that shape.
00:28:55.660 | But once it's in your hands,
00:28:57.420 | it's probably just the mass and inertia that matter.
00:29:00.020 | And the shape may not be as important.
00:29:03.460 | And so, in some ways you wanna pay attention
00:29:07.100 | to that detailed shape.
00:29:08.660 | In others, you wanna generalize it and say,
00:29:11.220 | well, all I really care about
00:29:13.060 | is the center of mass of this thing,
00:29:14.540 | especially if I'm gonna throw it up on that scaffolding.
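A toy sketch of the bookkeeping implied here, not Atlas's actual planner: once an object is grasped, what balance planning mostly needs is the combined mass, center of mass, and inertia of body plus payload. The masses, offsets, and inertias below are invented placeholders.

```python
import numpy as np

def combine(m1, c1, I1, m2, c2, I2):
    """Merge two rigid bodies (mass, CoM position, inertia about own CoM)
    into one equivalent body via the parallel-axis theorem."""
    m = m1 + m2
    c = (m1 * c1 + m2 * c2) / m
    def shift(I, mass, com):                # re-express inertia about the new CoM
        d = com - c
        return I + mass * (d @ d * np.eye(3) - np.outer(d, d))
    return m, c, shift(I1, m1, c1) + shift(I2, m2, c2)

# Placeholder numbers: a ~90 kg robot holding a 10 kg bag out in front of it.
robot = (90.0, np.array([0.0, 0.0, 0.9]), np.diag([12.0, 11.0, 2.0]))
bag   = (10.0, np.array([0.4, 0.0, 1.0]), np.diag([0.1, 0.1, 0.1]))
m, c, I = combine(*robot, *bag)
print("combined mass:", m)
print("combined CoM: ", c)   # shifted toward the bag -- what balance must anticipate
```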
00:29:17.180 | - And it's easier if the body is rigid.
00:29:19.020 | What if there's some,
00:29:20.460 | doesn't it throw like a sandbag type thing?
00:29:22.660 | - That tool bag had loose stuff in it.
00:29:26.260 | So it managed that.
00:29:28.280 | There are harder things that we haven't done yet.
00:29:30.620 | We could have had a big jointed thing
00:29:32.460 | or I don't know, a bunch of loose wire or rope.
00:29:35.060 | - What about carrying another robot?
00:29:36.460 | How about that?
00:29:37.300 | (laughing)
00:29:39.500 | - Yeah, we haven't done that yet.
00:29:41.060 | - Carry spot.
00:29:41.900 | - I guess we did a little bit of a,
00:29:43.420 | we did a little skit around Christmas
00:29:45.980 | where we had two spots holding up another spot
00:29:48.820 | that was trying to put a bow on a tree.
00:29:51.180 | So I guess we're doing that in a small way.
00:29:53.380 | - Okay, that's pretty good.
00:29:56.180 | Let me ask the all important question.
00:29:58.220 | Do you know how much Atlas can curl?
00:30:00.660 | (laughing)
00:30:02.900 | I mean, you know, for us humans,
00:30:06.820 | that's really one of the most fundamental questions
00:30:09.740 | you can ask another human being.
00:30:11.900 | Curl, bench.
00:30:14.380 | - It probably can't curl as much as we can yet.
00:30:17.140 | But a metric that I think is interesting
00:30:19.980 | is another way of looking at that strength
00:30:23.740 | is the box jump.
00:30:25.100 | So how high of a box can you jump onto?
00:30:29.100 | - Good question.
00:30:30.100 | - And Atlas, I don't know the exact height.
00:30:32.980 | It was probably a meter high or something like that.
00:30:34.940 | It was a pretty tall jump that Atlas was able to manage
00:30:38.420 | when we last tried to do this.
00:30:40.100 | And I have video of my chief technical officer
00:30:44.420 | doing the same jump.
00:30:45.340 | And he really struggled.
00:30:46.980 | - Oh, the human.
00:30:47.820 | - But the human, getting all the way on top of this box.
00:30:50.220 | But then Atlas was able to do it.
00:30:52.280 | We're now thinking about the next generation of Atlas.
00:30:56.780 | And we're probably gonna be in the realm
00:30:58.780 | of a person can't do it with the next generation.
00:31:02.740 | The robots, the actuators are gonna get stronger.
00:31:05.260 | Where it really is the case
00:31:07.100 | that at least some of these joints,
00:31:08.340 | some of these motions will be stronger.
00:31:10.140 | - And to understand how high it can jump,
00:31:11.820 | you probably had to do quite a bit of testing.
00:31:14.200 | - Oh yeah, and there's lots of videos of it trying
00:31:16.140 | and failing, and that's all.
00:31:19.100 | We don't always release those videos,
00:31:20.940 | but they're a lot of fun to look at.
00:31:22.740 | - So we'll talk a little bit about that.
00:31:26.500 | But can you talk to the jumping?
00:31:28.900 | 'Cause you talked about the walking,
00:31:30.140 | and it took a long time, many, many years
00:31:31.980 | to get the walking to be natural.
00:31:34.020 | But there's also really natural-looking,
00:31:37.400 | robust, resilient jumping.
00:31:40.940 | How hard is it to do the jumping?
00:31:43.200 | - Well again, this stuff has really evolved rapidly
00:31:45.540 | in the last few years.
00:31:46.940 | The first time we did a somersault,
00:31:48.700 | there was a lot of manual iteration.
00:31:52.940 | What is the trajectory?
00:31:54.540 | How hard do you throw?
00:31:55.580 | In fact, in these early days,
00:32:00.220 | when I'd see early experiments that the team was doing,
00:32:03.660 | I might make suggestions about how to change the technique.
00:32:06.420 | Again, kind of borrowing from my own intuition
00:32:09.060 | about how backflips work.
00:32:10.340 | But frankly, they don't need that anymore.
00:32:13.740 | So in the early days, you had to iterate
00:32:15.860 | in almost a manual way, trying to change these trajectories
00:32:20.820 | of the arms or the legs to try to get
00:32:23.780 | a successful backflip to happen.
00:32:27.260 | But more recently, we're running these model predictive
00:32:30.900 | control techniques where we're able to,
00:32:35.580 | the robot essentially can think in advance
00:32:38.300 | for the next second or two about how its motion
00:32:41.700 | is going to transpire.
00:32:43.100 | And you can solve for optimal trajectories
00:32:45.980 | to get from A to B.
00:32:47.720 | So this is happening in a much more natural way.
00:32:50.020 | And we're really seeing an acceleration happen
00:32:53.020 | in the development of these behaviors,
00:32:55.340 | again, partly due to these optimization techniques,
00:33:00.140 | sometimes learning techniques.
00:33:01.900 | So it's hard in that there's a lot of mathematics
00:33:07.860 | behind it, but we're figuring that out.
00:33:12.300 | - So you can do model predictive control for,
00:33:14.880 | I mean, I don't even understand what that looks like
00:33:18.220 | when the entire robot is in the air,
00:33:20.700 | flying and doing a backflip.
00:33:22.780 | - Yeah.
00:33:24.460 | But that's the cool part, right?
00:33:26.220 | So you know, the physics, we can calculate physics
00:33:29.540 | pretty well using Newton's laws about how it's going
00:33:32.860 | to evolve over time.
00:33:34.380 | The sick trick, which was a front somersault
00:33:38.780 | with a half twist is a good example, right?
00:33:41.160 | You saw the robot on various versions of that trick,
00:33:46.620 | I've seen it, land in different configurations
00:33:49.380 | and it still manages to stabilize itself.
00:33:51.940 | And so, what this model predictive control means is,
00:33:56.100 | again, in real time, the robot is projecting ahead,
00:34:01.100 | a second into the future, and sort of exploring options.
00:34:04.460 | And if I move my arm a little bit more this way,
00:34:06.940 | how is that gonna affect the outcome?
00:34:08.580 | And so it can do these calculations, many of them,
00:34:11.320 | and basically solve for, given where I am now,
00:34:16.540 | maybe I took off a little bit screwy from how I had planned,
00:34:20.100 | I can adjust.
00:34:21.180 | - So you're adjusting in the air--
00:34:22.620 | - Just on the fly.
00:34:24.060 | So the model predictive control lets you adjust on the fly.
00:34:27.860 | And of course, I think this is what people adapt as well.
00:34:31.860 | When we do it, even a gymnastics trick,
00:34:34.540 | we try to set it up so it's as close to the same every time,
00:34:38.180 | but we figured out how to do some adjustment on the fly,
00:34:40.420 | and now we're starting to figure out that the robots
00:34:42.580 | can do this adjustment on the fly as well,
00:34:44.660 | using these techniques.
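A heavily simplified sketch of the receding-horizon idea described above: plan a short trajectory forward, apply only the first command, then re-plan from the newly measured state. It uses a toy 1-D point mass rather than a humanoid model; the horizon length, costs, and dynamics are illustrative assumptions, not Boston Dynamics' controller.

```python
import numpy as np
from scipy.optimize import minimize

DT, HORIZON = 0.05, 20            # plan one second ahead in 50 ms steps

def plan_cost(forces, x, v):
    """Roll a 1-D unit point mass forward under a candidate force sequence."""
    cost = 0.0
    for f in forces:
        v += f * DT
        x += v * DT
        cost += (x - 1.0) ** 2 + 1e-3 * f ** 2   # reach x = 1, penalize effort
    return cost

x, v = 0.0, 0.0
for step in range(40):                                        # control loop
    plan = minimize(plan_cost, np.zeros(HORIZON), args=(x, v)).x
    f = plan[0]                    # execute only the first planned force...
    v += f * DT                    # ...then the "world" moves on and we
    x += v * DT                    # re-plan from the measured state next tick
print(f"final position {x:.3f} (target 1.0)")
```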
00:34:46.220 | - In the air.
00:34:48.020 | - It's so, I mean, it just feels,
00:34:50.500 | from a robotics perspective, just surreal.
00:34:53.060 | - Well, that's sort of the,
00:34:53.900 | you talked about under-actuated, right?
00:34:55.780 | So when you're in the air, there's some things
00:34:59.620 | you can't change, right?
00:35:00.660 | You can't change the momentum while it's in the air,
00:35:03.140 | 'cause you can't apply an external force, or torque,
00:35:05.420 | and so the momentum isn't gonna change.
00:35:07.760 | So how do you work within the constraint
00:35:09.500 | of that fixed momentum to still get from A to B,
00:35:13.140 | where you wanna be?
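The flight-phase constraint he is pointing at can be written down directly: with no contact, gravity is the only external force, so the robot's total angular momentum about its center of mass is fixed while it is airborne.

```latex
% During flight (no contact forces), for the whole robot about its CoM c:
\dot{L} = 0 \;\Rightarrow\;
L = \sum_j \big( I_j\,\omega_j + m_j\,(p_j - c)\times\dot{p}_j \big) = \text{const.}
% Swinging arms or tucking legs redistributes L among the links -- changing the
% body's orientation at landing -- but cannot change L itself.
```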
00:35:14.300 | - That's really under-actuated.
00:35:17.900 | - You're in the air.
00:35:18.860 | I mean, you become a drone for a brief moment in time.
00:35:21.740 | No, you're not even a drone, 'cause you can't--
00:35:23.900 | - Can't hover.
00:35:24.740 | - You can't hover, you can't--
00:35:26.100 | - You're gonna impact soon, be ready.
00:35:28.340 | - Yeah, have you considered a hover-type thing, or no?
00:35:31.340 | No, it's too much weight.
00:35:32.660 | I mean, it's just incredible.
00:35:36.120 | And just even to have the guts to try a backflip,
00:35:39.980 | with such a large body, that's wild.
00:35:43.140 | - We definitely broke a few robots trying that.
00:35:47.580 | - But that's where the build it, break it, fix it,
00:35:50.500 | strategy comes in.
00:35:51.700 | Gotta be willing to break.
00:35:52.680 | And what ends up happening is you end up,
00:35:55.220 | by breaking the robot repeatedly, you find the weak points,
00:35:58.100 | and then you end up redesigning it,
00:35:59.500 | so it doesn't break so easily next time.
00:36:02.180 | - Through the breaking process, you learn a lot,
00:36:05.020 | like a lot of lessons, and you keep improving,
00:36:07.580 | not just how to make the backflip work, but everything.
00:36:10.340 | - And how to build the machine better.
00:36:11.660 | - Yeah, yeah.
00:36:13.260 | I mean, is there something about just the guts
00:36:15.540 | to come up with an idea of saying, you know what,
00:36:19.740 | let's try to make it do a backflip?
00:36:21.740 | - Well, I think the courage to do a backflip
00:36:23.980 | in the first place, and to not worry too much
00:36:26.720 | about the ridicule of somebody saying,
00:36:28.700 | why the heck are you doing backflips with robots?
00:36:31.300 | Because a lot of people have asked that, you know,
00:36:33.380 | why are you doing this?
00:36:35.740 | - Why go to the moon in this decade
00:36:38.540 | and do the other things, JFK?
00:36:40.180 | Not because it's easy, because it's hard.
00:36:43.500 | - Yeah, exactly.
00:36:44.900 | (laughing)
00:36:46.820 | Don't ask questions.
00:36:47.980 | Okay, so the jumping, I mean, there's a lot
00:36:50.740 | of incredible stuff.
00:36:51.580 | If we can just rewind a little bit
00:36:53.740 | to the DARPA Robotics Challenge in 2015, I think,
00:36:57.940 | which was, for people who are familiar
00:37:00.300 | with the DARPA challenges, it was first
00:37:04.660 | with autonomous vehicles, and there's a lot
00:37:06.300 | of interesting challenges around that,
00:37:07.940 | and the DARPA Robotics Challenge was when
00:37:10.740 | humanoid robots were tasked to do all kinds
00:37:15.340 | of, you know, manipulation, walking, driving a car,
00:37:20.340 | all these kinds of challenges with,
00:37:24.780 | if I remember correctly, sort of some slight capability
00:37:29.780 | to communicate with humans, but the communication
00:37:33.380 | was very poor, so it basically has
00:37:34.940 | to be almost entirely autonomous.
00:37:37.540 | - It could have periods where the communication
00:37:39.420 | was entirely interrupted, and the robot
00:37:41.140 | had to be able to proceed, but you could provide
00:37:43.500 | some high-level guidance to the robot,
00:37:45.460 | basically low-bandwidth communications to steer it.
00:37:49.780 | - I watched that challenge with kind of tears
00:37:52.460 | in my eyes, eating popcorn.
00:37:54.540 | - Us too.
00:37:55.380 | (laughing)
00:37:57.660 | - But I wasn't personally losing, you know,
00:38:00.460 | hundreds of thousands, millions of dollars,
00:38:02.500 | and many years of incredible hard work
00:38:05.620 | by some of the most brilliant roboticists in the world.
00:38:08.460 | So that was why the tragic, that's why the tears came.
00:38:11.420 | So anyway, what have you, just looking back
00:38:14.380 | to that time, what have you learned from that experience?
00:38:17.340 | Maybe if you could describe what it was,
00:38:20.780 | sort of the setup for people who haven't seen it.
00:38:23.140 | - Well, so there was a contest where a bunch
00:38:25.740 | of different robots were asked to do a series
00:38:29.180 | of tasks, some of those that you mentioned,
00:38:31.860 | drive a vehicle, get out, open a door,
00:38:34.700 | go identify a valve, shut a valve, use a tool
00:38:38.260 | to maybe cut a hole in a surface,
00:38:42.980 | and then crawl over some stairs
00:38:45.860 | and maybe some rough terrain.
00:38:47.780 | So it was, the idea was have a general-purpose robot
00:38:52.780 | that could do lots of different things.
00:38:54.940 | Had to be mobility and manipulation, on-board perception.
00:39:01.660 | And there was a contest, which DARPA likes,
00:39:05.300 | at the time was running, sort of follow on
00:39:08.060 | to the grand challenge, which was,
00:39:10.980 | let's try to push vehicle autonomy along, right?
00:39:14.300 | They encouraged people to build autonomous cars.
00:39:18.100 | So they're trying to basically push an industry forward.
00:39:21.620 | And we were asked, our role in this was to build
00:39:26.620 | a humanoid, at the time it was our sort
00:39:29.980 | of first-generation Atlas robot.
00:39:32.180 | And we built maybe 10 of them,
00:39:37.020 | I don't remember the exact number.
00:39:38.740 | And DARPA distributed those to various teams
00:39:43.460 | that sort of won a contest, showed that they could
00:39:49.260 | program these robots, and then use them
00:39:51.860 | to compete against each other.
00:39:53.020 | And then other robots were introduced as well.
00:39:55.140 | Some teams built their own robots.
00:39:56.580 | Carnegie Mellon, for example, built their own robot.
00:40:00.460 | And all these robots competed to see who could
00:40:03.660 | sort of get through this maze the fastest.
00:40:07.660 | And again, I think the purpose was to kind of
00:40:10.180 | push the whole industry forward.
00:40:11.840 | We provided the robot and some baseline software,
00:40:16.140 | but we didn't actually compete as a participant,
00:40:19.480 | where we were trying to drive the robot through this maze.
00:40:25.700 | We were just trying to support the other teams.
00:40:28.540 | It was humbling because it was really a hard task.
00:40:32.140 | And honestly, the robots, the tears were because
00:40:34.860 | mostly the robots didn't do it.
00:40:36.740 | They fell down repeatedly.
00:40:41.380 | It was hard to get through this contest.
00:40:44.060 | Some did, and they were rewarded and won.
00:40:48.100 | But it was humbling because of just how hard,
00:40:50.380 | these tasks weren't all that hard.
00:40:51.700 | A person could have done it very easily.
00:40:54.100 | But it was really hard to get the robots to do it.
00:40:56.940 | - Well, the general nature of it, the variety of it.
00:41:00.660 | - The variety.
00:41:01.500 | - And also that I don't know if the tasks were
00:41:04.300 | sort of, the tasks in themselves help us understand
00:41:10.300 | what is difficult and what is not.
00:41:12.000 | I don't know if that was obvious
00:41:13.380 | before the contest was designed.
00:41:15.700 | So you kind of try to figure that out.
00:41:17.420 | And I think Atlas is really a general robot platform,
00:41:22.660 | and it's perhaps not best suited
00:41:24.700 | for the specific tasks of that contest.
00:41:27.220 | Like, just for example, probably the hardest task
00:41:31.380 | is not the driving of the car,
00:41:33.300 | but getting in and out of the car.
00:41:35.780 | And Atlas probably, if you were to design a robot
00:41:39.380 | that can get into the car easily and get out easily,
00:41:42.500 | you probably would not design it like Atlas for that particular car.
00:41:45.420 | - Yeah, the robot was a little bit big
00:41:47.460 | to get in and out of that car, right?
00:41:49.660 | - It doesn't fit, yeah.
00:41:50.500 | - This is the curse of a general purpose robot,
00:41:53.060 | that they're not perfect at any one thing,
00:41:56.020 | but they might be able to do a wide variety of things.
00:41:58.740 | And that is the goal at the end of the day.
00:42:03.740 | You know, I think we all wanna build general purpose robots
00:42:08.120 | that can be used for lots of different activities,
00:42:11.140 | but it's hard.
00:42:12.020 | And the wisdom in building successful robots
00:42:18.100 | up until this point have been, go build a robot
00:42:21.380 | for a specific task and it'll do it very well.
00:42:24.380 | And as long as you control that environment,
00:42:26.660 | it'll operate perfectly.
00:42:27.980 | But robots need to be able to deal with uncertainty.
00:42:31.260 | If they're gonna be useful to us in the future,
00:42:34.460 | they need to be able to deal with unexpected situations.
00:42:38.700 | And that's sort of the goal of a general purpose
00:42:40.780 | or multi-purpose robot.
00:42:42.820 | And that's just darn hard.
00:42:44.460 | And so some of the, you know,
00:42:45.280 | there's these curious little failures.
00:42:46.740 | Like I remember one of the, a robot, you know,
00:42:50.940 | the first time you start to try to push on the world
00:42:54.140 | with a robot, you forget that the world pushes back
00:42:58.580 | and will push you over if you're not ready for it.
00:43:02.140 | And the robot, you know, reached to grab the door handle.
00:43:05.380 | I think it missed the grasp of the door handle,
00:43:08.300 | was expecting that its hand was on the door handle.
00:43:10.860 | And so when it tried to turn the knob,
00:43:12.860 | it just threw itself over.
00:43:14.140 | It didn't realize, oh, I had missed the door handle.
00:43:16.500 | I didn't have, I didn't, I was expecting a force
00:43:19.740 | back from the door, it wasn't there.
00:43:21.900 | And then I lost my balance.
00:43:23.660 | So these little simple things that you and I
00:43:26.020 | would take totally for granted and deal with,
00:43:29.100 | the robots don't know how to deal with yet.
00:43:31.460 | And so you have to start to deal
00:43:32.680 | with all of those circumstances.
00:43:35.220 | - Well, I think a lot of us experience this in,
00:43:39.740 | even when sober, but drunk too,
00:43:41.780 | sort of you pick up a thing and expect it to be,
00:43:45.980 | what is it, heavy, and it turns out to be light.
00:43:49.020 | - Yeah, and then you, whoa.
00:43:49.900 | - Oh, yeah, and then, so the same,
00:43:52.140 | and I'm sure if your depth of perception,
00:43:53.660 | for whatever reason, is screwed up,
00:43:55.580 | if you're drunk or some other reason,
00:43:58.260 | and then you think you're putting your hand on the table,
00:44:01.060 | and you miss it, I mean, it's the same kind of situation.
00:44:04.420 | But there's--
00:44:06.100 | - Which is why you need to be able
00:44:06.940 | to predict forward just a little bit.
00:44:08.940 | And so that's where this model
00:44:10.100 | predictive control stuff comes in.
00:44:11.760 | Predict forward what you think's gonna happen.
00:44:14.540 | And then if that does happen, you're in good shape.
00:44:16.620 | If something else happens,
00:44:17.740 | you better start predicting again.
00:44:19.100 | - So re, like re, regenerate a plan when you don't.
00:44:24.100 | I mean, that also requires a very fast feedback loop
00:44:29.660 | of updating what your prediction,
00:44:33.820 | how it matches to the actual real world.
00:44:36.500 | - Yeah, those things have to run pretty quickly.
00:44:38.660 | - What's the challenge of running things pretty quickly?
00:44:40.860 | A thousand hertz of acting and sensing quickly.
00:44:45.860 | - You know, there's a few different layers of that.
00:44:49.580 | You want, at the lowest level,
00:44:51.540 | you like to run things typically at around a thousand hertz,
00:44:54.740 | which means that, you know, at each joint of the robot,
00:44:57.360 | you're measuring position or force,
00:44:59.960 | and then trying to control your actuator,
00:45:02.280 | whether it's a hydraulic or electric motor,
00:45:04.900 | trying to control the force coming out of that actuator.
00:45:07.460 | And you wanna do that really fast.
00:45:10.380 | Something like a thousand hertz.
00:45:11.700 | And that means you can't have too much calculation
00:45:14.220 | going on at that joint.
00:45:15.840 | But that's pretty manageable these days,
00:45:18.820 | and it's fairly common.
00:45:20.820 | And then there's another layer
00:45:22.100 | that you're probably calculating, you know,
00:45:24.420 | maybe at a hundred hertz, maybe 10 times slower,
00:45:27.900 | which is now starting to look at the overall body motion
00:45:30.980 | and thinking about the larger physics of the robot.
00:45:35.660 | And then there's yet another loop
00:45:39.460 | that's probably happening a little bit slower,
00:45:41.420 | which is where you start to bring, you know,
00:45:43.260 | your perception and your vision and things like that.
00:45:46.340 | And so you need to run all of these loops
00:45:48.860 | sort of simultaneously.
00:45:50.540 | You do have to manage your computer time
00:45:54.300 | so that you can squeeze in all the calculations you need
00:45:57.620 | in real time in a very consistent way.
00:46:00.000 | And the amount of calculation we can do
00:46:05.200 | is increasing as computers get better,
00:46:07.500 | which means we can start to do
00:46:08.700 | more sophisticated calculations.
00:46:10.580 | I can have a more complex model
00:46:13.340 | doing my forward prediction.
00:46:15.700 | And that might allow me to do even better predictions
00:46:19.380 | as I get better and better.
00:46:20.660 | And it used to be, again, we had, you know,
00:46:24.220 | 10 years ago, we had to have pretty simple models
00:46:29.220 | that we were running, you know, at those fast rates
00:46:32.300 | 'cause the computers weren't as capable
00:46:34.620 | about calculating forward with a sophisticated model.
00:46:38.460 | But as computation gets better, we can do more of that.
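A schematic of the loop layering described above. The rates match the ones he mentions, but the structure and the function names (`joint_servo`, `whole_body_update`, `perception_update`) are generic placeholders, not Boston Dynamics' software.

```python
import time

JOINT_HZ, BODY_HZ, PERCEPTION_HZ = 1000, 100, 20

def joint_servo():       pass   # ~1 kHz: read joint position/force, command the actuator
def whole_body_update(): pass   # ~100 Hz: whole-body dynamics, balance, MPC solve
def perception_update(): pass   # ~tens of Hz: cameras, terrain mapping, footstep plans

tick, t_next = 0, time.monotonic()
while tick < 2000:                               # run for two (wall-clock) seconds
    joint_servo()                                # every 1 ms tick
    if tick % (JOINT_HZ // BODY_HZ) == 0:
        whole_body_update()                      # every 10th tick
    if tick % (JOINT_HZ // PERCEPTION_HZ) == 0:
        perception_update()                      # every 50th tick
    tick += 1
    t_next += 1.0 / JOINT_HZ
    time.sleep(max(0.0, t_next - time.monotonic()))   # hold the 1 kHz period
```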
00:46:42.780 | - What about the actual pipeline of software engineering?
00:46:46.620 | How easy it is to keep updating Atlas,
00:46:49.220 | like do continuous development on it?
00:46:51.540 | So how many computers are on there?
00:46:54.500 | Is there a nice pipeline?
00:46:56.260 | - It's an important part of building a team around it,
00:47:00.380 | which means, you know, you need to also have software tools,
00:47:04.660 | simulation tools, you know.
00:47:06.180 | So we have always made strong use
00:47:11.180 | of physics-based simulation tools
00:47:13.540 | to do some of this calculation,
00:47:17.560 | basically test it in simulation
00:47:19.300 | before you put it on the robot.
00:47:21.060 | But you also want the same code
00:47:22.700 | that you're running in simulation
00:47:24.020 | to be the same code you're running on the hardware.
00:47:26.760 | And so even getting to the point
00:47:28.900 | where it was the same code going from one to the other,
00:47:32.060 | we probably didn't really get that working
00:47:34.180 | until, you know, a few years, several years ago.
00:47:36.580 | But that was a, you know, that was a bit of a milestone.
00:47:39.740 | And so you want to work, certainly work these pipelines
00:47:42.660 | so that you can make it as easy as possible
00:47:44.620 | and have a bunch of people working in parallel,
00:47:47.100 | especially when, you know, we only have, you know,
00:47:49.140 | four of the Atlas robots,
00:47:51.020 | the modern Atlas robots at the company.
00:47:53.980 | And, you know, we probably have, you know,
00:47:55.900 | 40 developers there all trying to gain access to it.
00:47:59.620 | And so you need to share resources
00:48:01.940 | and use some of these, some of the software pipeline.
00:48:04.940 | - Well, that's a really exciting step
00:48:06.300 | to be able to run the exact same code in simulation
00:48:08.500 | as on the actual robot.
00:48:10.580 | How hard is it to do
00:48:11.820 | realistic simulation, physics-based simulation of Atlas,
00:48:19.700 | such that, I mean, the dream is like,
00:48:22.620 | if it works in simulation, it works perfectly in reality.
00:48:25.140 | How hard is it to sort of keep working on closing that gap?
00:48:28.740 | - The root of some of our physics-based simulation tools
00:48:31.420 | really started at MIT.
00:48:33.180 | And we built some good physics-based modeling tools there.
00:48:38.180 | The early days of the company,
00:48:40.420 | we were trying to develop those tools
00:48:42.260 | as a commercial product.
00:48:43.580 | So we continued to develop them.
00:48:45.620 | It wasn't a particularly successful commercial product,
00:48:48.220 | but we ended up with some nice
00:48:49.300 | physics-based simulation tools
00:48:50.740 | so that when we started doing legged robotics again,
00:48:52.860 | we had a really nice tool to work with.
00:48:55.180 | And the things we paid attention to
00:48:57.060 | were things that weren't necessarily handled very well
00:49:00.940 | in the commercial tools you could buy off the shelf,
00:49:03.060 | like interaction with the world, like foot ground contact.
00:49:07.420 | So trying to model those contact events well
00:49:12.420 | in a way that captured the important parts
00:49:16.900 | of the interaction was a really important element
00:49:21.500 | to get right and to also do in a way
00:49:23.660 | that was computationally feasible and could run fast.
00:49:27.900 | 'Cause if your simulation runs too slow,
00:49:30.980 | then your developers are sitting around
00:49:32.380 | waiting for stuff to run and compile.
00:49:34.300 | So it's always about efficient, fast operation as well.
00:49:39.300 | So that's been a big part of it.
00:49:41.700 | I think developing those tools
00:49:43.500 | in parallel to the development of the platform
00:49:46.940 | and trying to scale them has really been essential,
00:49:50.260 | I'd say, to us being able to assemble
00:49:53.300 | a team of people that could do this.
00:49:54.980 | - Yeah, how to simulate contact, period.
00:49:57.540 | So foot ground contact, but sort of for manipulation.
00:50:00.800 | Because don't you want to model all kinds of surfaces?
00:50:06.380 | - Yeah, so it will be even more complex with manipulation
00:50:09.940 | 'cause there's a lot more going on, you know?
00:50:12.620 | And you need to capture, I don't know,
00:50:14.620 | things slipping and moving in your hand.
00:50:19.840 | It's a level of complexity that I think goes above
00:50:24.040 | foot ground contact when you really start
00:50:27.760 | doing dexterous manipulation.
00:50:29.440 | So there's challenges ahead still.
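
One common way simulators handle foot-ground contact, and a plausible reading of what "modeling contact events well but fast" involves, is a penalty model: when a contact point penetrates the ground, apply a spring-damper restoring force. The sketch below is a generic textbook version of that idea, not a description of Boston Dynamics' simulation tools.

```python
# Toy penalty-based foot-ground contact: when the foot point penetrates the
# ground plane (z < 0), push back with a spring-damper force. A generic
# textbook model, used here only to illustrate the idea.

def contact_force(z, vz, k=50000.0, d=500.0):
    """Vertical ground reaction force on a point foot at height z (meters)."""
    if z >= 0.0:
        return 0.0                # not in contact, no force
    force = k * (-z) - d * vz     # spring resists penetration, damper resists motion
    return max(force, 0.0)        # the ground can push but never pull

# Drop a 10 kg point mass from 10 cm and integrate with a small time step.
m, g, dt = 10.0, 9.81, 1e-4
z, vz = 0.10, 0.0
for _ in range(20000):            # simulate 2 seconds
    fz = contact_force(z, vz) - m * g
    vz += (fz / m) * dt
    z += vz * dt

print(f"settled penetration: {-z * 1000:.2f} mm")   # roughly m*g/k, about 2 mm
```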
00:50:31.480 | - So how far are we away from me being able
00:50:33.840 | to walk with Atlas in the sand along the beach
00:50:37.560 | and us both drinking a beer?
00:50:40.480 | (laughing)
00:50:42.440 | Out of a can, out of a can.
00:50:43.880 | - Maybe Atlas could spill his beer
00:50:45.440 | 'cause he's got nowhere to put it.
00:50:46.800 | (laughing)
00:50:48.840 | - Atlas could walk on the sand.
00:50:50.400 | - So can it?
00:50:51.240 | - Yeah, yeah.
00:50:52.440 | I mean, have we really had him out on the beach?
00:50:55.680 | Yeah, we take them outside often, rocks, hills,
00:50:58.860 | that sort of thing, even just around our lab in Waltham.
00:51:02.080 | We probably haven't been on the sand, but I'm--
00:51:04.920 | - Some soft surfaces.
00:51:05.760 | - I don't doubt that we could deal with it.
00:51:07.800 | We might have to spend a little bit of time
00:51:10.200 | to sort of make that work.
00:51:11.800 | We did take, we had to take Big Dog to Thailand years ago
00:51:16.800 | and we did this great video of the robot
00:51:23.200 | walking in the sand, walking into the ocean
00:51:27.480 | up to, I don't know, its belly or something like that,
00:51:30.360 | and then turning around and walking out,
00:51:32.080 | all while playing some cool beach music.
00:51:34.920 | Great show, but then we didn't really clean the robot off
00:51:37.680 | and the salt water was really hard on it.
00:51:39.440 | So we put it in a box, shipped it back.
00:51:42.240 | By the time it came back, we had some problems
00:51:45.080 | with corrosion.
00:51:45.920 | - So it's the salt water, it's not like--
00:51:47.920 | - Salt stuff.
00:51:49.080 | - It's not like sand getting into the components
00:51:51.000 | or something like this.
00:51:52.120 | But I'm sure if this is a big priority,
00:51:54.480 | you can make it waterproof and stuff.
00:51:56.600 | - Right, right.
00:51:57.440 | That just wasn't our goal at the time.
00:51:59.600 | - Well, it's a personal goal of mine to walk along the beach.
00:52:03.560 | But it's a human problem, too.
00:52:04.760 | You get sand everywhere, it's just a mess.
00:52:08.440 | - So soft surfaces are okay.
00:52:10.360 | So, I mean, can we just linger on the robotics challenge?
00:52:13.840 | There's a pile of rubble to walk over.
00:52:18.840 | How difficult is that task?
00:52:24.000 | - In the early days of developing Big Dog,
00:52:26.480 | the loose rock was the epitome of the hard walking surface.
00:52:30.200 | Because you step down and then the rock,
00:52:32.280 | and you had these little point feet on the robot,
00:52:35.440 | and the rock can roll.
00:52:37.160 | And you have to deal with that last minute
00:52:41.160 | change in your foot placement.
00:52:43.480 | - Yeah, so you step on the thing,
00:52:45.440 | and that thing responds to you stepping on it.
00:52:47.360 | - Yeah, and it moves where your point of support is.
00:52:50.720 | And so it's really, that became
00:52:53.200 | kind of the essence of the test.
00:52:55.160 | And so that was the beginning of us starting
00:52:57.580 | to build rock piles in our parking lots.
00:53:01.560 | And we would actually build boxes full of rocks
00:53:04.400 | and bring 'em into the lab.
00:53:06.120 | And then we would have the robots
00:53:07.480 | walking across these boxes of rocks
00:53:09.240 | because that became the essential test.
00:53:11.860 | - So you mentioned Big Dog.
00:53:14.640 | Can we maybe take a stroll through
00:53:16.520 | the history of Boston Dynamics?
00:53:18.400 | So what and who is Big Dog?
00:53:21.400 | By the way, it's who?
00:53:23.040 | Do you try not to anthropomorphize the robots?
00:53:27.840 | Do you try not to, do you try to remember that they're,
00:53:31.600 | this is like the division I have.
00:53:32.840 | 'Cause for me it's impossible.
00:53:34.960 | For me there's a magic to the being that is a robot.
00:53:38.160 | It is not human, but it is,
00:53:40.380 | the same magic that a living being has
00:53:46.160 | when it moves about the world is there in the robot.
00:53:48.680 | So I don't know what question I'm asking,
00:53:51.820 | but should I say what or who, I guess?
00:53:54.360 | Who is Big Dog, what is Big Dog?
00:53:57.120 | - Well, I'll say, to address the meta question,
00:54:00.400 | we don't try to draw hard lines around it
00:54:02.780 | being an it or a him or a her.
00:54:05.280 | It's okay, right?
00:54:08.760 | People, I think part of the magic of these kinds of machines
00:54:13.080 | is by nature of their organic movement, of their dynamics,
00:54:18.080 | we tend to want to identify with them.
00:54:22.600 | We tend to look at them and sort of attribute
00:54:25.960 | maybe feeling to that, because we've only seen
00:54:29.480 | things that move like this that were alive.
00:54:32.640 | And so this is an opportunity.
00:54:35.640 | It means that you could have feelings for a machine.
00:54:40.640 | And people have feelings for their cars.
00:54:43.800 | They get attracted to them, attached to them.
00:54:46.320 | So that inherently could be a good thing
00:54:49.320 | as long as we manage what that interaction is.
00:54:52.400 | So we don't put strong boundaries around this
00:54:56.320 | and ultimately think it's a benefit,
00:54:58.220 | but it can also be a bit of a curse
00:55:01.240 | because I think people look at these machines
00:55:03.680 | and they attribute a level of intelligence
00:55:06.760 | that the machines don't have.
00:55:09.000 | Because again, they've seen things move like this
00:55:11.200 | that were living beings, which are intelligent.
00:55:15.160 | And so they want to attribute intelligence to the robots
00:55:18.840 | that isn't appropriate yet,
00:55:20.000 | even though they move like an intelligent being.
00:55:22.400 | - But you try to acknowledge
00:55:24.320 | that the anthropomorphization is there
00:55:27.040 | and try to, first of all, acknowledge that it's there.
00:55:31.700 | - And have a little fun with it.
00:55:32.740 | You know, our most recent video,
00:55:35.460 | it's just kind of fun, you know, to look at the robot.
00:55:40.020 | We started off the video with Atlas
00:55:43.140 | kind of looking around for where the bag of tools was
00:55:47.980 | 'cause the guy up on the scaffolding says,
00:55:49.780 | "Send me some tools."
00:55:51.100 | Atlas has to kind of look around and see where they are.
00:55:54.620 | And there's a little personality there that is fun.
00:55:57.980 | It's entertaining.
00:55:58.820 | It makes our jobs interesting.
00:56:00.100 | And I think in the long run,
00:56:01.940 | can enhance interaction between humans and robots
00:56:05.660 | in a way that isn't available
00:56:07.380 | to machines that don't move that way.
00:56:09.380 | - This is something to me personally is very interesting.
00:56:11.420 | I've been, I happen to have a lot of legged robots.
00:56:16.420 | (laughing)
00:56:18.260 | I hope to have a lot of spots in my possession.
00:56:21.540 | I'm interested in celebrating robotics
00:56:24.740 | and celebrating companies.
00:56:25.800 | And I also don't want to,
00:56:27.220 | companies that do incredible stuff like Boston Dynamics.
00:56:29.820 | And there's, you know, I'm a little crazy.
00:56:34.580 | And you say you don't want to,
00:56:36.780 | you want to align, you want to help the company.
00:56:39.780 | 'Cause I ultimately want a company
00:56:42.020 | like Boston Dynamics to succeed.
00:56:43.860 | And part of that we'll talk about, you know,
00:56:45.580 | success kind of requires making money.
00:56:48.300 | And so the kind of stuff I'm particularly interested in
00:56:52.540 | may not be the thing that makes money in the short term.
00:56:54.660 | I can make an argument that it will in the long term.
00:56:57.020 | But the kind of stuff I've been playing with
00:56:59.980 | is a robust way of having the quadrupeds,
00:57:04.780 | the robot dogs, communicate emotion
00:57:07.100 | with their body movement.
00:57:08.460 | The same kind of stuff you do with a dog.
00:57:10.880 | But not hard-coded, but in a robust way.
00:57:15.060 | And be able to communicate excitement or fear,
00:57:18.500 | boredom, all these kinds of stuff.
00:57:20.860 | And I think as a base layer of function of behavior
00:57:26.460 | to add on top of a robot,
00:57:27.820 | I think that's a really powerful way
00:57:30.140 | to make the robot more usable for humans,
00:57:33.020 | for whatever application.
00:57:33.860 | - I think it's gonna be really important.
00:57:35.380 | And it's a thing we're beginning to pay attention to.
00:57:39.360 | We really want to start,
00:57:42.900 | a differentiator for the company has always been,
00:57:45.100 | we really want the robot to work.
00:57:46.940 | We want it to be useful.
00:57:48.420 | Making it work at first meant
00:57:52.300 | the legged locomotion really works.
00:57:54.620 | It can really get around and it doesn't fall down.
00:57:57.120 | But beyond that, now it needs to be a useful tool.
00:58:02.140 | And our customers are, for example, factory owners.
00:58:06.100 | People who are running a process manufacturing facility.
00:58:09.820 | And the robot needs to be able to get through
00:58:11.420 | this complex facility in a reliable way,
00:58:14.540 | taking measurements.
00:58:16.120 | We need for people who are operating those robots
00:58:21.360 | to understand what the robots are doing.
00:58:23.600 | If the robot needs help,
00:58:25.540 | or is in trouble or something,
00:58:28.980 | it needs to be able to communicate.
00:58:31.040 | And give a physical indication of some sort,
00:58:35.100 | so that a person looks at the robot and goes,
00:58:36.620 | "Oh, I know what that robot's doing.
00:58:38.380 | "That robot's going to go take measurements
00:58:41.020 | "of my vacuum pump with its thermal camera."
00:58:43.980 | You wanna be able to indicate that.
00:58:47.900 | And even just, when the robot's about to turn
00:58:52.380 | in front of you, maybe indicate that it's going to turn.
00:58:55.440 | And so you sort of see and can anticipate its motion.
00:58:58.240 | So this kind of communication is going to become
00:59:00.800 | more and more important.
00:59:02.340 | It wasn't sort of our starting point,
00:59:04.300 | but now that the robots are really out in the world
00:59:09.720 | and we have about a thousand of them out
00:59:11.840 | with customers right now,
00:59:13.400 | this layer of physical indication,
00:59:18.400 | I think is going to become more and more important.
00:59:21.000 | - We'll talk about where it goes,
00:59:22.900 | 'cause there's a lot of interesting possibilities,
00:59:24.540 | but if you can return back to the origins
00:59:26.980 | of Boston Dynamics, so the more research,
00:59:30.140 | the R&D side, before we talk about
00:59:33.540 | how to build robots at scale.
00:59:36.100 | So Big Dog, who's Big Dog?
00:59:39.140 | - So the company started in 1992,
00:59:43.100 | and in probably 2003, I believe,
00:59:50.980 | is when we took a contract from DARPA,
00:59:55.300 | so basically 10 years, 11 years.
00:59:57.800 | We weren't doing robotics.
01:00:00.000 | We did a little bit of robotics with Sony.
01:00:02.520 | They had AIBO, their AIBO robot.
01:00:05.400 | We were developing some software for that
01:00:07.140 | that kind of got us a little bit involved
01:00:08.880 | with robotics again.
01:00:10.560 | Then there's this opportunity to do a DARPA contract
01:00:13.440 | where they wanted to build a robot dog.
01:00:17.880 | And we won a contract to build that,
01:00:21.400 | and so that was the genesis of Big Dog.
01:00:24.640 | And it was a quadruped, and it was the first time
01:00:27.220 | we built a robot that had everything on board
01:00:29.420 | that you could actually take the robot
01:00:31.240 | out into the wild and operate it.
01:00:32.800 | So it had an on-board power plant,
01:00:34.400 | it had on-board computers, it had hydraulic actuators
01:00:39.400 | that needed to be cooled, so we had cooling systems built in.
01:00:43.020 | Everything integrated into the robot,
01:00:45.660 | and that was a pretty rough start, right?
01:00:48.320 | So it was 10 years that we were not a robotics company.
01:00:52.000 | We were a simulation company,
01:00:53.280 | and then we had to build a robot in about a year.
01:00:55.800 | So that was a little bit of a rough transition.
01:00:58.160 | (laughing)
01:01:00.840 | - I mean, can you just comment on the roughness
01:01:03.000 | of that transition, 'cause Big Dog,
01:01:06.080 | I mean, this is this big quadruped, four-legs robot.
01:01:11.080 | - We built a few different versions of them,
01:01:13.760 | but the first one, the very earliest ones,
01:01:15.980 | didn't work very well.
01:01:17.620 | We would take 'em out, and it was hard to get
01:01:20.580 | a go-kart engine driving a hydraulic pump.
01:01:25.260 | - Oh, is that what it was?
01:01:27.100 | - And having that all work while trying to get
01:01:31.740 | the robot to stabilize itself.
01:01:34.540 | - So what was the power plant, what was the engine?
01:01:37.260 | It seemed like, my vague recollection,
01:01:41.820 | I don't know, it felt very loud and aggressive
01:01:45.240 | and kind of thrown together.
01:01:47.480 | - Oh, it absolutely was, right?
01:01:49.520 | We weren't trying to design the best robot hardware
01:01:52.760 | at the time, and we wanted to buy an off-the-shelf engine.
01:01:57.360 | And so many of the early versions of Big Dog
01:02:00.720 | had literally go-kart engines or something like that.
01:02:04.440 | - Are those gas-powered?
01:02:05.360 | - Yeah, gas-powered, two-stroke engines.
01:02:08.000 | And the reason why it was two-stroke
01:02:09.480 | is two-stroke engines are lighter weight.
01:02:11.620 | And we generally didn't put mufflers on them,
01:02:15.920 | 'cause we're trying to save the weight.
01:02:17.320 | We didn't care about the noise.
01:02:18.960 | And some of these things were horribly loud.
01:02:21.760 | But we're trying to manage weight,
01:02:23.160 | because managing weight in a legged robot
01:02:25.620 | is always important, because it has to carry everything.
01:02:28.760 | - That said, that thing was big.
01:02:30.720 | Well, I've seen the videos of it.
01:02:31.840 | - Yeah, I mean, the early versions stood about,
01:02:35.200 | I don't know, belly high, chest high.
01:02:38.960 | They probably weighed maybe a couple of hundred pounds.
01:02:42.120 | But over the course of probably five years,
01:02:46.600 | we were able to get that robot
01:02:50.080 | to really manage a remarkable level of rough terrain.
01:02:55.560 | So we started out with just walking on the flat,
01:02:57.960 | and then we started walking on rocks,
01:02:59.440 | and then inclines, and then mud, and then slippery mud.
01:03:03.200 | And by the end of that program,
01:03:05.800 | we were convinced that legged locomotion in a robot
01:03:09.880 | could actually work, 'cause going into it,
01:03:12.480 | we didn't know that.
01:03:14.220 | We had built quadrupeds at MIT,
01:03:17.780 | but they used a giant hydraulic pump in the lab.
01:03:21.680 | They used a giant computer that was in the lab.
01:03:23.760 | They were always tethered to the lab.
01:03:26.280 | This was the first time something
01:03:27.700 | that was sort of self-contained
01:03:29.680 | walked around in the world and balanced.
01:03:34.400 | And the purpose was to prove to ourself
01:03:36.480 | that the legged locomotion could really work.
01:03:38.560 | And so Big Dog really cut that open for us.
01:03:41.840 | And it was the beginning
01:03:43.120 | of what became a whole series of robots.
01:03:45.040 | So once we showed to DARPA
01:03:47.120 | that you could make a legged robot that could work,
01:03:49.840 | there was a period at DARPA where robotics got really hot,
01:03:53.280 | and there was lots of different programs.
01:03:55.760 | And we were able to build other robots.
01:03:58.460 | We built other quadrupeds too,
01:04:00.760 | like LS3, designed to carry heavy loads.
01:04:04.720 | We built Cheetah, which was designed to explore
01:04:08.280 | what are the limits to how fast you can run.
01:04:10.880 | We began to build sort of a portfolio
01:04:14.560 | of machines and software that let us build
01:04:18.920 | not just one robot, but a whole family of robots.
01:04:21.360 | - To push the limits in all kinds of directions.
01:04:23.560 | - Yeah, and to discover those principles.
01:04:25.240 | You asked earlier about the art and science
01:04:27.440 | of the legged locomotion.
01:04:29.320 | We were able to develop principles of legged locomotion
01:04:32.440 | so that we knew how to build a small legged robot
01:04:35.280 | or a big one.
01:04:36.120 | So leg length was now a parameter that we could play with.
01:04:41.120 | Payload was a parameter we could play with.
01:04:43.640 | So we built the LS3, which was an 800 pound robot
01:04:46.640 | designed to carry a 400 pound payload.
01:04:49.480 | And we learned the design rules,
01:04:51.480 | basically developed the design rules.
01:04:53.600 | How do you scale different robot systems
01:04:56.240 | to their terrain, to their walking speed, to their payload?
01:05:01.040 | - So when was Spot born?
01:05:04.960 | - Around 2012 or so.
01:05:11.120 | So again, almost 10 years into sort of a run with DARPA
01:05:15.320 | where we built a bunch of different quadrupeds.
01:05:17.960 | We had a sort of a different thread
01:05:19.600 | where we started building humanoids.
01:05:23.240 | We saw that probably an end was coming
01:05:27.200 | where the government was gonna kind of back off
01:05:29.760 | from a lot of robotics investment.
01:05:32.520 | And in order to maintain progress,
01:05:37.520 | we just deduced that, well,
01:05:39.680 | we probably need to sell ourselves
01:05:41.120 | to somebody who wants to continue to invest in this area.
01:05:43.960 | And that was Google.
01:05:44.960 | And so at Google, we would meet regularly with Larry Page
01:05:52.000 | and Larry just started asking us,
01:05:54.000 | what's your product gonna be?
01:05:56.000 | And the logical thing,
01:05:59.040 | the thing that we had the most history with
01:06:01.680 | that we wanted to continue developing was a quadruped,
01:06:05.560 | but we knew it needed to be smaller.
01:06:07.000 | We knew it couldn't have a gas engine.
01:06:08.840 | We thought it probably couldn't be hydraulically actuated.
01:06:12.680 | So that began the process of exploring
01:06:16.440 | if we could migrate to a smaller electrically actuated robot
01:06:21.680 | and that was really the genesis of Spot.
01:06:23.680 | - So not a gas engine and the actuators are electric.
01:06:28.200 | - Yes.
01:06:29.040 | - So can you maybe comment on what it's like
01:06:31.240 | at Google working with Larry Page,
01:06:35.120 | having those meetings and thinking of
01:06:36.880 | what will a robot look like that could be built at scale?
01:06:41.880 | Starting to think about a product.
01:06:45.680 | - Larry always liked the toothbrush test.
01:06:48.800 | He wanted products that you used every day.
01:06:52.100 | What they really wanted was a consumer level product,
01:06:57.100 | something that would work in your house.
01:07:01.100 | We didn't think that was the right next thing to do
01:07:06.260 | because to be a consumer level product,
01:07:08.740 | cost is gonna be very important.
01:07:11.180 | Probably needed to cost a few thousand dollars.
01:07:14.260 | And we were building these machines
01:07:16.220 | that cost hundreds of thousands of dollars,
01:07:18.260 | maybe a million dollars to build.
01:07:20.180 | Of course, we were only building two,
01:07:22.300 | but we didn't see how to get all the way
01:07:25.660 | to this consumer level product.
01:07:27.060 | - In a short amount of time.
01:07:27.900 | - In a short amount of time.
01:07:29.980 | And he suggested that we make the robots really inexpensive.
01:07:34.980 | And part of our philosophy has always been,
01:07:37.400 | build the best hardware you can.
01:07:40.580 | Make the machine operate well
01:07:44.260 | so that you're trying to solve,
01:07:48.700 | discover the hard problem that you don't know about.
01:07:51.820 | Don't make it harder by building a crappy machine, basically.
01:07:54.980 | Build the best machine you can.
01:07:56.820 | There's plenty of hard problems to solve
01:07:58.340 | that are gonna have to do with
01:08:00.020 | under-actuated systems and balance.
01:08:02.060 | And so we wanted to build these high quality machines still.
01:08:06.500 | And we thought that was important for us to continue learning
01:08:09.340 | about what really were the important parts
01:08:12.540 | of making robots work.
01:08:16.460 | And so there was a little bit
01:08:17.540 | of a philosophical difference there.
01:08:19.540 | And so ultimately, that's why we're building robots
01:08:23.260 | for the industrial sector now.
01:08:24.860 | Because the industry can afford a more expensive machine
01:08:29.260 | because their productivity depends
01:08:32.080 | on keeping their factory going.
01:08:33.860 | And so if Spot costs $100,000 or more,
01:08:38.820 | that's not such a big expense to them.
01:08:41.940 | Whereas at the consumer level,
01:08:43.220 | no one's gonna buy a robot like that.
01:08:45.500 | And I think we might eventually get
01:08:47.680 | to a consumer level product that will be that cheap.
01:08:50.740 | But I think the path to getting there needs to go
01:08:53.100 | through these really nice machines
01:08:54.900 | so that we can then learn how to simplify.
01:08:57.740 | - So what can you say to the,
01:08:59.620 | almost the engineering challenge
01:09:01.900 | of bringing down cost of a robot?
01:09:06.060 | So that presumably when you try to build a robot at scale,
01:09:09.980 | that also comes into play when you're trying
01:09:11.780 | to make money on a robot, even in the industrial setting.
01:09:15.140 | But how interesting, how challenging of a thing is that?
01:09:20.140 | In particular, probably new to an R&D company.
01:09:25.720 | - Yeah, I'm glad you brought that last part up.
01:09:27.780 | The transition from an R&D company to a commercial company,
01:09:31.380 | that's the thing you worry about.
01:09:33.180 | 'Cause you've got these engineers who love hard problems,
01:09:35.680 | who wanna figure out how to make robots work.
01:09:38.060 | And you don't know if you have engineers
01:09:40.060 | that wanna work on the quality and reliability
01:09:42.860 | and cost that is ultimately required.
01:09:45.120 | And indeed, we have brought on a lot of new people
01:09:49.620 | who are inspired by those problems.
01:09:52.140 | But the big takeaway lesson for me is we have good people.
01:09:56.780 | We have engineers who wanna solve problems.
01:09:59.620 | And the quality and cost and manufacturability
01:10:03.100 | is just another kind of problem.
01:10:05.100 | And because they're so invested in what we're doing,
01:10:09.020 | they're interested in and will go work
01:10:10.820 | on those problems as well.
01:10:13.380 | And so I think we're managing that transition very well.
01:10:16.900 | In fact, I'm really pleased that,
01:10:18.540 | I mean, it's a huge undertaking, by the way, right?
01:10:23.300 | So even having to get reliability to where it needs to be,
01:10:28.300 | we have to have fleets of robots
01:10:30.380 | that we're just operating 24/7 in our offices
01:10:33.980 | to go find those rare failures and eliminate them.
01:10:37.420 | It's just a totally different kind of activity
01:10:39.480 | than the research activity where you get it to work,
01:10:42.300 | the one robot you have to work in a repeatable way
01:10:46.300 | at the high stakes demo.
01:10:48.860 | It's just very different.
01:10:50.100 | But I think we're making remarkable progress, I guess.
01:10:54.260 | - And so one of the cool things,
01:10:55.580 | I got a chance to visit Boston Dynamics.
01:10:57.940 | And I mean, one of the things that's really cool
01:11:02.940 | is to see a large number of robots moving about.
01:11:07.700 | Because I think one of the things you notice
01:11:10.140 | in the research environment at MIT, for example,
01:11:14.380 | I don't think anyone ever has a working robot
01:11:16.520 | for a prolonged period of time.
01:11:17.640 | (laughing)
01:11:18.480 | - Exactly.
01:11:19.480 | - So like most robots are just sitting there
01:11:21.720 | in a sad state of despair, waiting to be born,
01:11:25.680 | brought to life for a brief moment of time.
01:11:28.360 | Just to have, I just remember there's a spot robot,
01:11:32.400 | just had like a cowboy hat on
01:11:34.800 | and was just walking randomly for whatever reason.
01:11:37.060 | I don't even know.
01:11:38.040 | But there's a kind of a sense of sentience to it
01:11:42.400 | because it doesn't seem like anybody was supervising it.
01:11:45.640 | It was just doing its thing.
01:11:46.480 | - I'm gonna stop way short of the sentience.
01:11:48.880 | It is the case that if you come to our office today
01:11:51.760 | and walk around the hallways,
01:11:53.680 | you're gonna see a dozen robots
01:11:56.380 | just kind of walking around all the time.
01:11:59.140 | And that's really a reliability test for us.
01:12:03.080 | So we have these robots programmed to do autonomous missions,
01:12:07.600 | get up off their charging dock, walk around the building,
01:12:09.920 | collect data at a few different places and go sit back down.
01:12:13.220 | And we want that to be a very reliable process
01:12:16.080 | 'cause that's what somebody who's running a brewery,
01:12:20.200 | a factory, that's what they need the robot to do.
01:12:23.020 | And so we have to dog food our own robot.
01:12:26.060 | We have to test it in that way.
01:12:28.260 | And so on a weekly basis,
01:12:31.720 | we have robots that are accruing something like
01:12:34.380 | 1500 or maybe 2000 kilometers of walking
01:12:39.080 | and over a thousand hours of operation every week.
01:12:43.480 | And that's something that almost,
01:12:45.540 | I don't think anybody else in the world can do
01:12:47.220 | 'cause A, you have to have a fleet of robots
01:12:49.000 | to just accrue that much information.
01:12:50.900 | You have to be willing to dedicate it to that test.
01:12:55.140 | And so that's, but that's essential.
01:12:58.000 | - That's how you get the reliability.
01:12:59.400 | - That's how you get it.
01:13:00.240 | - What about some of the cost cutting
01:13:01.600 | from the manufacturer side?
01:13:04.040 | What have you learned from the manufacturer side
01:13:06.640 | of the transition from R&D?
01:13:08.920 | - And we're still learning a lot there.
01:13:11.880 | We're learning how to cast parts instead of milling it all out
01:13:15.440 | of, you know, billet aluminum.
01:13:17.320 | We're learning how to get plastic molded parts.
01:13:21.120 | And we're learning about how to control that process
01:13:24.640 | so that you can build the same robot twice in a row.
01:13:27.520 | There's a lot to learn there.
01:13:28.720 | And we're only partway through that process.
01:13:31.060 | We've set up a manufacturing facility in Waltham.
01:13:36.340 | It's about a mile from our headquarters.
01:13:39.400 | And we're doing final assembly and tests
01:13:41.280 | of both Spots and Stretches, you know, at that factory.
01:13:44.740 | And it's hard because to be honest,
01:13:49.100 | we're still iterating on the design of the robot.
01:13:51.080 | As we find failures from these reliability tests,
01:13:53.920 | we need to go engineer changes.
01:13:56.000 | And those changes need to now be propagated
01:13:58.400 | to the manufacturing line.
01:13:59.880 | And that's a hard process,
01:14:01.200 | especially when you want to move as fast as we do.
01:14:03.960 | And that's been challenging.
01:14:07.320 | And it makes it hard, you know,
01:14:08.960 | for the folks who are working supply chain,
01:14:11.080 | who are trying to get the cheapest parts for us.
01:14:14.320 | Making them cheap kind of requires
01:14:16.240 | that you buy a lot of them.
01:14:17.080 | And then we go change the design from underneath them.
01:14:19.280 | And they're like, what are you doing?
01:14:20.280 | And so, you know, getting everybody on the same page here
01:14:23.560 | that, yep, we still need to move fast,
01:14:25.580 | but we also need to try to figure out how to reduce costs.
01:14:28.600 | That's one of the challenges
01:14:30.040 | of this migration we're going through.
01:14:32.320 | - And over the past few years,
01:14:33.800 | challenges to the supply chain.
01:14:35.680 | I mean, I imagine you've been a part
01:14:37.320 | of a bunch of stressful meetings.
01:14:38.760 | - Yeah, things got more expensive and harder to get.
01:14:42.240 | And yeah, so it's all been a challenge.
01:14:44.840 | - Is there still room for simplification?
01:14:47.000 | - Oh yeah, much more.
01:14:48.560 | And, you know, these are really just the first generation
01:14:51.200 | of these machines.
01:14:52.760 | We're already thinking about what the next generation
01:14:54.760 | of Spot's gonna look like.
01:14:56.880 | Spot was built as a platform.
01:14:58.720 | So you could put almost any sensor on it.
01:15:01.000 | You know, we provided data communications,
01:15:03.480 | mechanical connections, power connections.
01:15:06.940 | But for example, in the applications that we're excited
01:15:11.960 | about where you're monitoring these factories
01:15:15.100 | for their health,
01:15:15.940 | there's probably a simpler machine that we could build
01:15:19.660 | that's really focused on that use case.
01:15:23.080 | And that's the difference between the general purpose
01:15:26.480 | machine or the platform versus the purpose built machine.
01:15:29.800 | And so even though, even in the factory,
01:15:31.960 | we'd still like the robot to do lots of different tasks.
01:15:35.240 | If we really knew on day one that we're gonna be operating
01:15:38.120 | in a factory with these three sensors in it,
01:15:40.800 | we would have it all integrated in a package
01:15:42.680 | that would be easier, less expensive,
01:15:45.720 | and more reliable.
01:15:47.100 | So we're contemplating building, you know,
01:15:49.000 | a next generation of that machine.
01:15:50.600 | - So we should mention that, so Spot,
01:15:53.960 | for people who somehow are not familiar,
01:15:56.320 | so is a yellow robotic dog
01:16:00.440 | and has been featured in many dance videos.
01:16:04.920 | It also has gained an arm.
01:16:07.020 | So what can you say about the arm that Spot has?
01:16:11.240 | About the challenges of this design
01:16:13.920 | and the manufacture of it?
01:16:15.560 | - We think the future of mobile robots
01:16:18.760 | is mobile manipulation.
01:16:20.900 | That's where, you know, in the past 10 years,
01:16:24.800 | it was getting mobility to work,
01:16:26.200 | getting the legged locomotion to work.
01:16:27.920 | If you ask what's the hard problem in the next 10 years,
01:16:31.120 | it's getting a mobile robot
01:16:32.960 | to do useful manipulation for you.
01:16:35.320 | And so we wanted Spot to have an arm
01:16:37.800 | to experiment with those problems.
01:16:40.520 | And the arm is almost as complex as the robot itself.
01:16:48.400 | You know, and it's an attachable payload.
01:16:52.000 | It has, you know, several motors and actuators and sensors.
01:16:57.120 | It has a camera in the end of its hand.
01:16:59.340 | So, you know, you can sort of see something
01:17:02.320 | and the robot will control the motion of its hand
01:17:06.520 | to go pick it up autonomously.
01:17:07.980 | So in the same way the robot walks and balances,
01:17:11.280 | managing its own foot placement to stay balanced,
01:17:13.720 | we want manipulation to be mostly autonomous
01:17:17.240 | where the robot, you indicate, okay, go grab that bottle.
01:17:19.920 | And then the robot will just go do it
01:17:21.320 | using the camera in its hand
01:17:23.160 | and then sort of closing in on that, the grasp.
01:17:26.620 | But it's a whole nother complex robot
01:17:29.600 | on top of a complex legged robot.
01:17:32.240 | And so, and of course we made the hand
01:17:35.720 | look a little like a head, you know,
01:17:38.680 | because again, we want it to be sort of identifiable.
01:17:42.360 | In the last year, a lot of our sales have been
01:17:46.440 | people who already have a robot
01:17:47.920 | now buying an arm to add to that robot.
01:17:50.640 | - Oh, interesting.
01:17:52.400 | And so the arm is for sale?
01:17:54.440 | - Oh yeah, oh yeah.
01:17:55.840 | It's an option.
01:17:56.680 | - What's the interface like to work with the arm?
01:18:00.200 | Like is it pretty, so are they designed primarily,
01:18:04.280 | I guess just ask that question in general
01:18:06.360 | about robots from Boston Dynamics.
01:18:09.000 | Is it designed to be easily and efficiently operated
01:18:13.720 | remotely by a human being?
01:18:15.640 | Or is there also the capability to push towards autonomy?
01:18:20.640 | - We want both.
01:18:21.900 | In the next version of the software that we release,
01:18:26.720 | which will be version 3.3,
01:18:29.040 | we're gonna offer the ability of,
01:18:31.400 | if you have an autonomous mission for the robot,
01:18:34.280 | we're gonna include the option
01:18:35.920 | that it can go through a door,
01:18:37.080 | which means it's gonna have to have an arm
01:18:38.680 | and it's gonna have to use that arm to open the door.
01:18:41.360 | And so that'll be an autonomous manipulation task
01:18:44.240 | that just, you can program easily with the robot.
01:18:48.440 | Strictly through, we have a tablet interface.
01:18:52.040 | And so on the tablet, you sort of see the view
01:18:55.000 | that Spot sees.
01:18:55.960 | You say, there's the door handle.
01:18:58.760 | The hinges are on the left and it opens in.
01:19:00.800 | The rest is up to you.
01:19:02.160 | Take care of it.
01:19:03.320 | - So it just takes care of everything.
01:19:04.960 | - Yeah.
01:19:05.880 | So we want, and for a task like opening doors,
01:19:09.920 | you can automate most of that.
01:19:11.280 | And we've automated a few other tasks.
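
The operator-side interaction described here for the door task (tap the handle, mark the hinge side and swing direction, and let the robot sequence the rest) might reduce to something like the sketch below. The types, field names, and step descriptions are invented for illustration and are not the real Spot SDK or tablet interface.

```python
# Hypothetical sketch of the operator's side of "go through that door": the
# tablet input reduces to a tapped handle location plus two flags, and the
# robot sequences everything else. All names are invented; this is NOT the
# real Spot SDK or tablet interface.

from dataclasses import dataclass

@dataclass
class DoorCommand:
    handle_pixel: tuple   # where the operator tapped the handle in the camera view
    hinge_side: str       # "left" or "right"
    swing: str            # "inward" or "outward"

def door_mission_steps(cmd: DoorCommand) -> list:
    """The autonomous sequence the robot fills in between the taps and the open door."""
    return [
        f"locate the handle near pixel {cmd.handle_pixel} with the hand camera",
        "walk the base to a pose where the handle is within arm reach",
        "grasp the handle and rotate it",
        f"swing the door {cmd.swing} (hinges on the {cmd.hinge_side})",
        "brace the door with the body, walk through, release",
    ]

for step in door_mission_steps(DoorCommand((412, 230), "left", "inward")):
    print(step)
```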
01:19:13.640 | We had a customer who had a high powered
01:19:18.440 | breaker switch, essentially.
01:19:20.200 | It's an electric utility, Ontario Power Generation.
01:19:23.900 | And they have to, when they're gonna disconnect
01:19:28.160 | their power supply, right?
01:19:29.440 | That could be a gas generator,
01:19:30.760 | could be a nuclear power plant.
01:19:32.320 | From the grid, you have to disconnect this breaker switch.
01:19:35.480 | Well, as you can imagine, there's hundreds or thousands
01:19:38.880 | of amps and volts involved in this breaker switch.
01:19:42.360 | And it's a dangerous event, 'cause occasionally
01:19:44.320 | you'll get what's called an arc flash.
01:19:45.960 | As you just do this disconnect, the power,
01:19:48.640 | the sparks jump across and people die doing this.
01:19:52.300 | And so Ontario Power Generation used our Spot
01:19:56.960 | and the arm through the interface to operate this disconnect
01:20:01.960 | in an interactive way.
01:20:06.280 | And they showed it to us.
01:20:07.960 | And we were so excited about it and said,
01:20:10.520 | you know, I bet we can automate that task.
01:20:12.400 | And so we got some examples of that breaker switch.
01:20:16.440 | And I believe in the next generation of software,
01:20:18.680 | now we're gonna deliver it back
01:20:19.720 | to Ontario Power Generation.
01:20:21.720 | They're gonna be able to just point the robot
01:20:24.200 | at that breaker.
01:20:25.760 | They'll indicate that's the switch.
01:20:28.560 | There's sort of two actions you have to do.
01:20:30.240 | You have to flip up this little cover,
01:20:32.560 | press a button, then get a ratchet,
01:20:35.080 | stick it into a socket, and literally unscrew
01:20:39.520 | this giant breaker switch.
01:20:41.200 | So there's a bunch of different tasks.
01:20:43.120 | And we basically automated them so that the human says,
01:20:45.840 | okay, there's the switch, go do that part.
01:20:49.160 | That right there is the socket where you're gonna
01:20:51.720 | put your tool and you're gonna open it up.
01:20:54.120 | And so you can remotely sort of indicate this
01:20:56.080 | on the tablet, and then the robot just does everything
01:20:59.720 | in between.
01:21:00.560 | - And it does everything, all the coordinated movement
01:21:02.560 | of all the different actuators that includes the body.
01:21:04.840 | - Yeah, it maintains its balance.
01:21:06.320 | It walks itself into position.
01:21:08.960 | So it's within reach, and the arm is in a position
01:21:12.760 | where it can do the whole task.
01:21:14.580 | So it manages the whole body.
01:21:17.400 | - So how does one become a big enough customer
01:21:20.320 | to request features?
01:21:21.640 | 'Cause I personally want a robot that gets me a beer.
01:21:25.360 | (laughing)
01:21:26.320 | I mean, that has to be one of the most requests,
01:21:29.040 | I suppose, in the industrial setting.
01:21:30.600 | That's a non-alcoholic beverage
01:21:33.400 | of picking up objects and bringing the objects to you.
01:21:38.080 | - We love working with customers who have challenging
01:21:40.600 | problems like this, and this one in particular,
01:21:43.160 | because we felt like what they were doing,
01:21:46.120 | A, it was a safety feature.
01:21:47.760 | B, we saw that the robot could do it,
01:21:51.600 | 'cause they tele-operated it the first time.
01:21:53.560 | Probably took them an hour to do it the first time, right?
01:21:55.840 | But the robot was clearly capable.
01:21:58.160 | And we thought, oh, this is a great problem
01:22:00.040 | for us to work on, to figure out how to automate
01:22:02.840 | a manipulation task.
01:22:03.880 | And so we took it on, not because we were gonna make
01:22:06.920 | a bunch of money from it in selling the robot back to them,
01:22:09.720 | but because it motivated us to go solve
01:22:12.600 | what we saw as the next logical step.
01:22:15.480 | But many of our customers, in fact,
01:22:17.360 | we try to, our bigger customers,
01:22:21.480 | typically ones who are gonna run a utility
01:22:23.240 | or a factory or something like that,
01:22:25.720 | we take that kind of direction from them.
01:22:27.560 | And if they're, especially if they're gonna buy 10 or 20
01:22:29.720 | or 30 robots, and they say, I really need it to do this,
01:22:33.120 | well, that's exactly the right kind of problem
01:22:34.960 | that we wanna be working on.
01:22:36.560 | - Yeah. - And so.
01:22:37.800 | - Note to self, buy 10 spots,
01:22:39.640 | and aggressively push for beer manipulation.
01:22:43.400 | I think it's fair to say it's notoriously difficult
01:22:47.120 | to make a lot of money as a robotics company.
01:22:49.560 | How can you make money as a robotics company?
01:22:54.560 | Can you speak to that?
01:22:55.880 | It seems that a lot of robotics companies fail.
01:22:58.640 | It's difficult to build robots.
01:23:02.240 | It's difficult to build robots at a low enough cost
01:23:06.160 | where customers, even the industrial setting,
01:23:07.960 | want to purchase them.
01:23:09.320 | And it's difficult to build robots that are useful,
01:23:11.560 | sufficiently useful.
01:23:13.160 | So what can you speak to?
01:23:14.560 | And Boston Dynamics has been successful
01:23:18.120 | for many years of finding a way to make money.
01:23:21.200 | - Well, in the early days, of course,
01:23:23.040 | the money we made was from doing contract R&D work.
01:23:26.440 | And we made money, but we weren't growing
01:23:29.840 | and we weren't selling a product.
01:23:31.840 | And then we went through several owners
01:23:34.320 | who had a vision of not only developing advanced technology,
01:23:39.320 | but eventually developing products.
01:23:42.680 | And so both Google and SoftBank and now Hyundai
01:23:46.000 | had that vision and were willing to provide that investment.
01:23:51.840 | Now, our discipline is that we need to go find applications
01:23:59.120 | that are broad enough that you could imagine
01:24:01.920 | selling thousands of robots,
01:24:03.640 | because it doesn't work if you don't sell thousands
01:24:05.760 | or tens of thousands of robots.
01:24:07.200 | If you only sell hundreds, you will commercially fail.
01:24:10.880 | And that's where most of the small robot companies
01:24:13.040 | have died.
01:24:13.880 | And that's a challenge because,
01:24:20.580 | you know, A, you need to field the robots,
01:24:22.320 | they need to start to become reliable.
01:24:24.360 | And as we've said, that takes time
01:24:26.320 | and investment to get there.
01:24:27.920 | And so it really does take visionary investment
01:24:31.960 | to get there.
01:24:32.880 | But we believe that we are going to make money
01:24:36.320 | in this industrial monitoring space,
01:24:40.640 | because, you know, if a chip fab,
01:24:45.440 | if the line goes down because a vacuum pump failed
01:24:48.320 | someplace, that can be a very expensive process.
01:24:51.400 | It can be a million dollars a day in lost production.
01:24:54.520 | Maybe you have to throw away some of the product
01:24:56.600 | along the way.
01:24:58.040 | And so the robot, if you can prevent that
01:25:00.880 | by inspecting the factory every single day,
01:25:04.480 | maybe every hour, if you have to,
01:25:06.200 | there's a real return on investment there.
01:25:09.760 | But there needs to be a critical mass of this task.
01:25:12.960 | And we're focusing on a few that we believe are ubiquitous
01:25:17.960 | in the industrial production environment.
01:25:22.360 | And that's using a thermal camera to keep things
01:25:25.600 | from overheating, using an acoustic imager
01:25:28.680 | to find compressed air leaks,
01:25:30.800 | using visual cameras to read gauges,
01:25:34.560 | measuring vibration.
01:25:35.720 | These are standard things that you do
01:25:38.280 | to prevent unintended shutdown of a factory.
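
Those standard checks could be captured in a simple mission list along the lines of the sketch below; the waypoint names, sensor names, and thresholds are invented for illustration and are not the actual Spot mission format.

```python
# Hypothetical sketch of an autonomous inspection mission; action names and
# structure are invented, not the actual Spot mission format.

inspection_mission = [
    {"waypoint": "vacuum_pump_3",  "sensor": "thermal_camera",  "check": "temperature below limit"},
    {"waypoint": "air_manifold_1", "sensor": "acoustic_imager", "check": "no compressed-air leak"},
    {"waypoint": "boiler_gauge_2", "sensor": "ptz_camera",      "check": "pressure gauge in green band"},
    {"waypoint": "conveyor_motor", "sensor": "vibration_probe", "check": "vibration below threshold"},
]

for stop in inspection_mission:
    print(f"walk to {stop['waypoint']}, capture with {stop['sensor']}, verify: {stop['check']}")
```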
01:25:41.720 | And this takes place in a beer factory.
01:25:45.560 | We're working with AB InBev.
01:25:47.320 | It takes place in chip fabs.
01:25:49.000 | You know, we're working with Global Foundries.
01:25:51.560 | It takes place in electric utilities
01:25:54.040 | and nuclear power plants.
01:25:55.600 | And so the same robot can be applied
01:25:58.440 | in all of these industries.
01:26:00.800 | And as I said, we have about,
01:26:04.040 | actually it's 1,100 Spots out now.
01:26:06.480 | To really get profitability,
01:26:08.640 | we need to be at 1,000 a year,
01:26:10.320 | maybe 1,500 a year for that sort of part of the business.
01:26:15.240 | So it still needs to grow, but we're on a good path.
01:26:19.480 | So I think that's totally achievable.
01:26:21.480 | - So the application should require
01:26:23.520 | crossing that 1,000 robot barrier.
01:26:25.760 | - It really should, yeah.
01:26:27.520 | I wanna mention our second robot, Stretch.
01:26:30.560 | - Yeah, tell me about Stretch.
01:26:32.000 | What's Stretch?
01:26:32.840 | Who is Stretch?
01:26:33.840 | - Stretch started differently than Spot.
01:26:36.040 | You know, Spot we built because we had decades
01:26:38.960 | of experience building quadrupeds.
01:26:40.440 | We just, we had it in our blood.
01:26:42.200 | We had to build a quadruped product.
01:26:44.240 | But we had to go figure out what the application was.
01:26:47.120 | And we actually discovered this factory patrol application,
01:26:52.120 | basically preventative maintenance,
01:26:54.600 | by seeing what our customers did with it.
01:26:57.400 | Stretch is very different.
01:26:58.480 | We started knowing that there were warehouses
01:27:02.000 | all over the world.
01:27:02.920 | There's shipping containers moving all around the world
01:27:06.680 | full of boxes that are mostly being moved by hand.
01:27:09.240 | By some estimates, we think there's a trillion boxes,
01:27:13.520 | cardboard boxes, shipped around the world each year.
01:27:16.080 | And a lot of it's done manually.
01:27:18.200 | It became clear early on that there was an opportunity
01:27:22.000 | for a mobile robot in here to move boxes around.
01:27:24.900 | And the commercial experience has been very different
01:27:27.640 | between Stretch and with Spot.
01:27:30.440 | As soon as we started talking to people,
01:27:33.760 | potential customers, about what Stretch
01:27:35.840 | was gonna be used for, they immediately started saying,
01:27:38.160 | oh, I'll buy, I'll buy that robot.
01:27:40.280 | You know, in fact, I'm gonna put in an order
01:27:41.920 | for 20 right now.
01:27:43.720 | We just started shipping the robot in January,
01:27:46.920 | after several years of development.
01:27:49.000 | - Of this year.
01:27:49.840 | - Of this year.
01:27:50.660 | So our first deliveries of Stretch to customers
01:27:52.880 | were DHL and Maersk in January.
01:27:55.920 | We're delivering to Gap right now.
01:27:58.320 | And we have about seven or eight other customers,
01:28:01.000 | all who've already agreed in advance
01:28:03.520 | to buy between 10 and 20 robots.
01:28:05.480 | And so we've already got commitments
01:28:06.640 | for a couple hundred of these robots.
01:28:08.840 | This one's gonna go, right?
01:28:11.400 | It's so obvious that there's a need.
01:28:14.000 | And we're not just gonna unload trucks.
01:28:15.580 | We're gonna do any box moving task in the warehouse.
01:28:18.000 | And so it too will be a multipurpose robot.
01:28:21.260 | And we'll eventually have it doing palletizing
01:28:23.800 | or depalletizing or loading trucks or unloading trucks.
01:28:27.560 | There's definitely thousands of robots.
01:28:30.080 | There's probably tens of thousands of robots
01:28:32.060 | of this in the future.
01:28:33.440 | So it's gonna be profitable.
01:28:35.160 | - Can you describe what Stretch looks like?
01:28:37.560 | - It looks like a big, strong robot arm on a mobile base.
01:28:42.060 | The base is about the size of a pallet.
01:28:44.520 | And we wanted it to be the size of a pallet
01:28:46.360 | because that's what lives in warehouses, right?
01:28:48.400 | Pallets of goods sitting everywhere.
01:28:50.340 | So it needed to be able to fit in that space.
01:28:52.600 | - It's not a legged robot.
01:28:53.440 | - It's not a legged robot.
01:28:54.720 | So it was our first, it was actually a bit of a commitment
01:28:59.720 | from us, a challenge for us to build a non-balancing robot.
01:29:06.300 | - To do the much easier problem.
01:29:09.960 | And to put it to do a--
01:29:10.800 | - Well, because it wasn't gonna have this balance problem.
01:29:14.560 | And in fact, the very first version
01:29:16.760 | of the logistics robot we built was a balancing robot.
01:29:20.500 | And that's called Handle.
01:29:21.860 | - That thing was epic.
01:29:24.140 | - Oh, it's a beautiful machine.
01:29:25.900 | - It's an incredible machine.
01:29:27.420 | So it was, I mean, it looks epic.
01:29:31.740 | It looks like out of a, I mean,
01:29:35.020 | out of a sci-fi movie of some sort.
01:29:36.860 | I mean, just, can you actually just linger on
01:29:39.100 | the design of that thing?
01:29:40.540 | 'Cause that's another leap into something
01:29:42.300 | you probably haven't done.
01:29:43.140 | It's a different kind of balancing.
01:29:44.420 | - Yeah, so let me, I love talking about the history
01:29:47.360 | of how a Handle came about.
01:29:49.420 | Because it connects all of our robots, actually.
01:29:52.300 | So I'm gonna start with Atlas.
01:29:55.840 | When we had Atlas getting fairly far along,
01:29:59.680 | we wanted to understand, I was telling you earlier,
01:30:01.600 | the challenge of the human form is that you have
01:30:03.500 | this mass up high.
01:30:05.120 | And balancing that inertia, that mass up high,
01:30:10.880 | is its own unique challenge.
01:30:12.760 | And so we started trying to get Atlas to balance
01:30:15.480 | standing on one foot, like on a balance beam,
01:30:18.260 | using its arms like this.
01:30:19.920 | And you know, you can do this, I'm sure.
01:30:21.240 | I can do this, right?
01:30:22.240 | Like if you're walking a tightrope.
01:30:24.560 | How do you do that balance?
01:30:26.560 | So that's sort of controlling the inertia,
01:30:29.640 | controlling the momentum of the robot.
01:30:31.960 | We were starting to figure that out on Atlas.
01:30:34.200 | And so our first concept of Handle,
01:30:37.520 | which was a robot that was gonna be on two wheels,
01:30:40.240 | so it had to balance, but it was gonna have
01:30:42.440 | a big long arm so it could reach a box
01:30:45.120 | at the top of a truck.
01:30:47.120 | And it needed yet another counterbalance,
01:30:51.240 | a big tail, to help it balance while it was using its arm.
01:30:56.240 | So the reason why this robot sort of looks epic,
01:31:01.120 | some people said it looked like an ostrich,
01:31:04.320 | or maybe an ostrich moving around,
01:31:07.240 | was the wheels, it has legs so it can extend its legs.
01:31:12.520 | So it's wheels on legs, we always wanted
01:31:14.560 | to build wheels on legs.
01:31:15.720 | It had a tail, it had this arm,
01:31:17.360 | and they're all moving simultaneously
01:31:19.240 | and in coordination to maintain balance.
01:31:21.680 | Because we had figured out the mathematics
01:31:23.600 | of doing this momentum control,
01:31:25.360 | how to maintain that balance.
01:31:27.800 | And so part of the reason why we built
01:31:29.720 | this two-legged robot was we had figured this thing out,
01:31:33.680 | we wanted to see it in this kind of machine,
01:31:36.200 | and we thought maybe this kind of machine
01:31:37.800 | would be good in a warehouse.
01:31:38.920 | And so we built it.
01:31:39.840 | And it's a beautiful machine.
01:31:41.260 | It moves in a graceful way like nothing else we've built.
01:31:45.080 | But it wasn't the right machine for a logistics application.
01:31:48.720 | We decided it was too slow.
01:31:50.760 | And couldn't pick boxes fast enough, basically.
01:31:53.360 | - Do it beautifully, with elegance.
01:31:55.960 | - Do it beautifully, but it just wasn't efficient enough.
01:31:58.840 | So we let it go.
01:32:00.480 | But I think we'll come back to that machine eventually.
01:32:04.400 | - The fact that it's possible,
01:32:05.720 | the fact that you showed that you could do so many things
01:32:08.560 | at the same time in coordination,
01:32:10.760 | and so beautifully, there's something there.
01:32:13.020 | That was a demonstration of what is possible.
01:32:15.800 | - Basically, we made a hard decision,
01:32:17.260 | and this was really kind of a hard-nosed business decision.
01:32:21.080 | It meant not doing it just for the beauty
01:32:25.560 | of the mathematics or out of curiosity,
01:32:27.760 | but no, we actually need to build a business
01:32:29.800 | that can make money in the long run.
01:32:32.200 | And so we ended up building Stretch,
01:32:34.340 | which has a big, heavy base
01:32:35.640 | with a giant battery in the base of it
01:32:38.120 | that allows it to run for two shifts,
01:32:41.520 | 16 hours worth of operation.
01:32:43.840 | And that big battery sort of helps it stay balanced, right?
01:32:47.400 | So it can move a 50-pound box around with its arm
01:32:50.040 | and not tip over.
01:32:51.140 | It's omnidirectional, it can move in any direction,
01:32:55.160 | so it has a nice suspension built into it,
01:32:57.880 | so it can deal with gaps or things on the floor
01:33:01.560 | and roll over it.
01:33:02.960 | But it's not a balancing robot.
01:33:05.240 | It's a mobile robot arm that can work to carry,
01:33:09.160 | or pick, or place a box up to 50 pounds
01:33:11.840 | anywhere in the warehouse.
01:33:13.920 | - Take a box from point A to point B, anywhere.
01:33:16.480 | - Yeah, palletize, depalletize.
01:33:19.080 | We're starting with unloading trucks
01:33:21.040 | because there's so many trucks and containers
01:33:23.280 | where goods are shipped, and it's a brutal job.
01:33:26.000 | You know, in the summer, it can be 120 degrees
01:33:28.580 | inside that container.
01:33:29.920 | People don't wanna do that job.
01:33:31.480 | And it's back-breaking labor, right?
01:33:34.720 | Again, these can be up to 50-pound boxes.
01:33:36.960 | And so we feel like this is a productivity enhancer.
01:33:43.040 | And for the people who used to do that job unloading trucks,
01:33:46.840 | they're actually operating the robot now.
01:33:49.120 | And so by building robots that are easy to control,
01:33:53.120 | and it doesn't take an advanced degree to manage,
01:33:56.120 | you can become a robot operator.
01:33:57.840 | And so as we've introduced these robots
01:33:59.980 | to DHL and Maersk and Gap,
01:34:02.440 | the warehouse workers who were doing that manual labor
01:34:05.160 | are now the robot operators.
01:34:06.520 | And so we see this as ultimately a benefit to them as well.
01:34:09.940 | - Can you say how much Stretch costs?
01:34:14.100 | - Not yet, but I will say that when we engage
01:34:21.260 | with our customers, they'll be able to see a return
01:34:24.360 | on investment in typically two years.
01:34:26.680 | - Okay, so that's something that you're constantly
01:34:28.560 | thinking about, how?
01:34:29.560 | - Yeah.
01:34:30.400 | - And I suppose you have to do the same kind of thinking
01:34:32.240 | with Spot.
01:34:33.080 | So it seems like with Stretch,
01:34:34.520 | the application is like directly obvious.
01:34:38.560 | - Yeah, it's a slam dunk.
01:34:39.480 | - Yeah, and so you have a little more flexibility.
01:34:42.600 | - Well, I think we know the target.
01:34:44.340 | We know what we're going after.
01:34:46.280 | And with Spot, it took us a while to figure out
01:34:47.880 | what we were going after.
01:34:49.360 | - Well, let me return to that question about
01:34:51.560 | maybe the conversation you were having a while ago
01:34:56.680 | with Larry Page, maybe looking to the longer future
01:35:00.920 | of social robotics, of using Spot to connect
01:35:04.880 | with human beings, perhaps in the home.
01:35:06.720 | Do you see a future there?
01:35:08.160 | If we were to sort of hypothesize or dream
01:35:11.880 | about a future where Spot-like robots are in the home
01:35:14.720 | as pets, as social robots?
01:35:16.040 | - We definitely think about it,
01:35:17.440 | and we would like to get there.
01:35:20.280 | We think the pathway to getting there is likely
01:35:23.840 | through these industrial applications
01:35:26.560 | and then mass manufacturing.
01:35:28.280 | Let's figure out how to build the robots,
01:35:31.840 | how to make the software so that they can really
01:35:33.640 | do a broad set of skills.
01:35:35.400 | That's gonna take real investment to get there.
01:35:39.760 | Performance first, right?
01:35:41.240 | A principle of the company has always been
01:35:43.160 | really make the robots do useful stuff.
01:35:45.720 | And so the social robot companies that tried
01:35:50.640 | to start someplace else by just making a cute interaction,
01:35:54.400 | mostly they haven't survived.
01:35:56.760 | And so we think the utility really needs to come first.
01:36:01.760 | And that means you have to solve some of these hard problems.
01:36:05.880 | And so to get there, we're gonna go through the design
01:36:10.720 | and software development in industrial,
01:36:13.440 | and then that's eventually gonna let you reach a scale
01:36:15.800 | that could then be addressed
01:36:16.920 | to a consumer-level market.
01:36:20.520 | And so, yeah, maybe we'll be able to build a smaller Spot
01:36:24.000 | with an arm that could really go get your beer for you.
01:36:27.400 | But there's things we need to figure out still.
01:36:30.120 | How to safely, really safely,
01:36:32.400 | and if you're gonna be interacting with children,
01:36:35.280 | you better be safe.
01:36:37.000 | And right now, we count on a little bit of standoff distance
01:36:41.240 | between the robot and people so that you don't
01:36:43.000 | pinch a finger in the robot.
01:36:45.320 | So you've got a lot of things you need to go solve
01:36:47.600 | before you jump to that consumer-level product.
01:36:50.840 | - Well, there's a kind of trade-off in safety
01:36:52.880 | because it feels like in the home, you can fall.
01:36:57.640 | Like, you don't have to be as good at,
01:37:02.200 | like, you're allowed to fail in different ways,
01:37:05.440 | in more ways, as long as it's safe for the humans.
01:37:09.720 | So it just feels like an easier problem to solve
01:37:12.440 | 'cause it feels like in the factory,
01:37:13.600 | you're not allowed to fail.
01:37:15.000 | - That may be true, but I also think the variety of things
01:37:21.080 | a consumer-level robot would be expected to do
01:37:23.880 | will also be quite broad.
01:37:25.840 | They're gonna want to get the beer
01:37:27.280 | and know the difference between the beer
01:37:28.800 | and a Coca-Cola or my snack.
01:37:31.640 | They're all gonna want you to clean up the dishes
01:37:36.680 | from the table without breaking them.
01:37:39.600 | Those are pretty complex tasks,
01:37:42.800 | and so there's still work to be done there.
01:37:45.400 | - So to push back on that, here's where application,
01:37:47.520 | I think, will be very interesting.
01:37:49.280 | I think the application of being a pet, a friend.
01:37:52.440 | So, like, no tasks, just be cute.
01:37:57.760 | Because, well, not just cute.
01:38:00.120 | A dog is more than just cute.
01:38:02.760 | A dog is a friend, is a companion.
01:38:04.880 | There's something about just having interacted with them,
01:38:07.760 | and maybe 'cause I'm hanging out alone
01:38:09.880 | with the robot dogs a little too much,
01:38:12.160 | but, like, there's a connection there,
01:38:15.480 | and it feels like that connection
01:38:17.120 | should not be disregarded.
01:38:19.680 | - No, it should not be disregarded.
01:38:23.360 | Robots that can somehow communicate
01:38:25.320 | through their physical gestures
01:38:26.560 | are ones you're gonna be more attached to in the long run.
01:38:30.240 | Do you remember AIBO?
01:38:32.600 | The Sony AIBO?
01:38:33.720 | They sold over 100,000 of those, maybe 150,000.
01:38:37.200 | Probably wasn't considered a successful product for them.
01:38:42.520 | They suspended that eventually,
01:38:44.040 | and then they brought it back, Sony brought it back.
01:38:46.640 | And people definitely, you know,
01:38:48.600 | treated this as a pet, as a companion.
01:38:52.040 | And I think that will come around again.
01:38:55.640 | Will you get away without having any other utility?
01:39:01.800 | Maybe in a world where we can really talk
01:39:03.840 | to our simple little pet, because, you know,
01:39:06.760 | ChatGPT or some other generative AI
01:39:09.600 | has made it possible for you to really talk
01:39:11.400 | in what seems like a meaningful way.
01:39:14.200 | Maybe that'll open the social robot up again.
01:39:18.900 | That's probably not a path we're gonna go down,
01:39:23.440 | because, again, we're so focused on performance and utility.
01:39:27.860 | We can add those other things also,
01:39:30.920 | but we really wanna start
01:39:31.960 | from that foundation of utility, I think.
01:39:34.160 | - Yeah, but I also wanna predict that you're wrong on that,
01:39:39.160 | which is that the very path you're taking,
01:39:42.040 | which is creating a great robot platform,
01:39:44.920 | will very easily take a leap
01:39:47.080 | to adding a ChatGPT-like capability, maybe GPT-5.
01:39:52.080 | And there's just so many open source alternatives
01:39:55.520 | that you could just plop that on top of Spot.
01:39:58.320 | And because you have this robust platform,
01:40:01.240 | and you're figuring out how to mass manufacture it,
01:40:03.360 | and how to drive the cost down,
01:40:05.320 | and how to make it reliable, all those kinds of things,
01:40:07.720 | it'll be a natural transition
01:40:09.480 | to where just adding ChatGPT on top of it will create--
01:40:12.080 | - Oh, I do think that being able to verbally converse,
01:40:17.080 | or even converse through gestures,
01:40:20.200 | part of these learning models is that
01:40:24.040 | you can now look at video and imagery
01:40:26.360 | and associate intent with that.
01:40:30.040 | Those will all help in the communication
01:40:33.200 | between robots and people, for sure.
01:40:35.840 | And that's gonna happen, obviously,
01:40:37.280 | more quickly than any of us were expecting.
01:40:39.640 | - I mean, what else do you want from life?
01:40:42.560 | A friend to get you a beer. (Lex laughing)
01:40:44.600 | And then just talk shit about the state of the world.
01:40:48.120 | (Lex laughing)
01:40:50.240 | I mean, where there's a deep loneliness within all of us,
01:40:52.960 | and I think a beer and a good chat solves so much of it,
01:40:57.200 | or it takes us a long way to solving a lot of it.
01:41:00.720 | - It'll be interesting to see
01:41:04.800 | when a generative AI can give you that warm feeling
01:41:09.800 | that you connected, and that, oh, yeah, you remember me,
01:41:15.240 | you're my friend, we have a history.
01:41:17.280 | That history matters, right?
01:41:20.440 | - Memory of-- - Memory of, yeah.
01:41:23.400 | - Having witnessed, that's what friendship,
01:41:25.800 | that's what connection, that's what love is,
01:41:28.320 | in many cases, some of the deepest friendships you have
01:41:31.280 | is having gone through a difficult time together
01:41:34.160 | and having a shared memory of an amazing time
01:41:36.760 | or a difficult time, and kind of that memory
01:41:41.560 | creating this foundation based on which
01:41:44.120 | you can then experience the world together.
01:41:46.360 | The silly, the mundane stuff of day-to-day
01:41:48.720 | is somehow built on a foundation
01:41:50.320 | of having gone through some shit in the past.
01:41:52.600 | And the current systems are not personalized in that way,
01:41:56.240 | but I think that's a technical problem,
01:41:58.200 | not some kind of fundamental limitation.
01:42:00.960 | So combine that with an embodied robot like Spot,
01:42:04.400 | which already has magic in its movement.
01:42:08.560 | I think it's a very interesting possibility
01:42:11.680 | of where that takes us.
01:42:13.160 | But of course, you have to build that on top of a company
01:42:16.000 | that's making money with real applications,
01:42:19.360 | real customers, and with robots that are safe and work
01:42:23.120 | and reliable and manufactured at scale.
01:42:27.360 | - And I think we're in a unique position
01:42:29.640 | in that because of our investors, primarily Hyundai,
01:42:34.600 | but also SoftBank still owns 20% of us,
01:42:36.960 | they're not totally fixated
01:42:41.440 | on driving us to profitability as soon as possible.
01:42:45.320 | That's not the goal.
01:42:46.880 | The goal really is a longer-term vision
01:42:49.480 | of creating, you know, what does mobility mean in the future?
01:42:53.440 | What, how is this mobile robot technology
01:42:56.360 | going to influence us?
01:42:59.240 | Can we shape that?
01:43:00.960 | And they want both.
01:43:02.040 | And so we are, as a company,
01:43:04.320 | are trying to strike that balance
01:43:05.640 | between let's build a business that makes money.
01:43:09.160 | I've been describing that to my own team
01:43:11.080 | as self-determination.
01:43:13.520 | If I wanna drive my own ship,
01:43:15.800 | we need to have a business that's profitable in the end.
01:43:18.320 | Otherwise, somebody else is gonna drive the ship for us.
01:43:21.200 | So that's really important.
01:43:23.560 | But we're gonna retain the aspiration
01:43:27.880 | that we're gonna build the next generation
01:43:29.320 | of technology at the same time.
01:43:30.920 | And the real trick will be if we can do both.
01:43:33.240 | - Speaking of ships, let me ask you about a competitor
01:43:38.640 | and somebody who's become a friend.
01:43:41.840 | So Elon Musk and Tesla have announced
01:43:44.920 | they've been in the early days
01:43:46.440 | of building a humanoid robot.
01:43:48.560 | How does that change the landscape of your work?
01:43:53.560 | So there's sort of from the outside perspective,
01:43:57.400 | it seems like, well, as a fan of robotics,
01:44:01.800 | it just seems exciting.
01:44:03.560 | - Very exciting, right?
01:44:04.720 | When Elon speaks, people listen.
01:44:08.520 | And so it suddenly brought a bright light
01:44:12.200 | onto the work that we'd been doing for over a decade.
01:44:15.280 | And I think that's only gonna help.
01:44:19.480 | And in fact, what we've seen is that in addition to Tesla,
01:44:24.280 | we're seeing a proliferation of robotic companies arise now.
01:44:29.280 | - Including humanoid?
01:44:31.000 | - Yes.
01:44:31.840 | - Oh, wow.
01:44:32.660 | - Yeah, and interestingly, many of them,
01:44:36.480 | as they're raising money, for example,
01:44:39.400 | will claim whether or not they have
01:44:41.440 | a former Boston Dynamics employee on their staff
01:44:43.920 | as a criterion.
01:44:44.960 | (laughing)
01:44:46.120 | - Yeah, that's true.
01:44:47.440 | I would do that as a company, yeah, for sure.
01:44:51.440 | Shows you're legit, yeah.
01:44:53.880 | So you know what?
01:44:54.720 | It has brought tremendous validation to what we're doing
01:44:59.720 | and excitement.
01:45:01.020 | Competitive juices are flowing, the whole thing.
01:45:04.800 | So it's all good.
01:45:06.620 | - Elon has also kind of stated
01:45:12.040 | that maybe he implied that the problem
01:45:18.520 | is solvable in the near term,
01:45:23.640 | which is a low-cost humanoid robot
01:45:26.960 | that's able to do,
01:45:27.880 | that's a relatively general use case robot.
01:45:32.460 | So I think Elon is known for setting these kinds
01:45:37.140 | of incredibly ambitious goals,
01:45:39.020 | maybe missing deadlines,
01:45:42.240 | but actually pushing not just the particular team he leads,
01:45:45.760 | but the entire world to accomplishing those.
01:45:50.160 | Do you see Boston Dynamics in the near future
01:45:54.120 | being pushed in that kind of way?
01:45:56.080 | Like this excitement of competition
01:45:57.960 | kind of pushing Atlas maybe to do more cool stuff,
01:46:02.960 | trying to drive the cost of Atlas down perhaps?
01:46:06.600 | Or I mean, I guess I wanna ask
01:46:09.640 | if there's some kind of exciting energy
01:46:13.640 | in Boston Dynamics due to this little bit of competition.
01:46:19.600 | - Oh yeah, definitely.
01:46:21.000 | When we released our most recent video of Atlas,
01:46:25.520 | you know, I think you'd seen it,
01:46:27.440 | the scaffolding and throwing the box of tools around
01:46:30.360 | and then doing the flip at the end.
01:46:32.480 | We were trying to show the world
01:46:34.380 | that not only can we do this parkour mobility thing,
01:46:38.640 | but we can pick up and move heavy things.
01:46:41.080 | Because if you're gonna work
01:46:43.440 | in a manufacturing environment,
01:46:45.320 | that's what you gotta be able to do.
01:46:47.880 | And for the reasons I explained to you earlier,
01:46:51.120 | it's not trivial to do so.
01:46:52.640 | You know, changing the center of mass,
01:46:54.700 | by picking up a 50 pound block,
01:46:58.960 | for a robot that weighs 150 pounds,
01:47:02.360 | that's a lot to accommodate.
01:47:04.780 | So we're trying to show that we can do that.
01:47:07.360 | And so it's totally been energizing.
01:47:12.000 | We see the next phase of Atlas being more dexterous hands
01:47:17.520 | that can manipulate and grab more things,
01:47:19.840 | that we're gonna start by moving big things around
01:47:23.120 | that are heavy and that affect balance.
01:47:25.640 | And why is that?
01:47:26.480 | Well, really tiny dexterous things
01:47:29.400 | probably are gonna be hard for a while yet.
01:47:32.080 | Maybe you could go build a special purpose robot arm,
01:47:36.560 | you know, for stuffing chips into electronics boards.
01:47:41.400 | But we don't really wanna do really fine work like that.
01:47:44.960 | I think more coarse work,
01:47:47.240 | where you're using two hands to pick up and balance
01:47:49.720 | an unwieldy thing, maybe in a manufacturing environment,
01:47:52.720 | maybe in a construction environment.
01:47:54.880 | Those are the things that we think robots
01:47:57.200 | are gonna be able to do with the level of dexterity
01:48:00.000 | that they're gonna have in the next few years.
01:48:01.680 | And that's where we're headed.
01:48:03.880 | And I think, and you know,
01:48:05.960 | Elon has seen the same thing, right?
01:48:07.440 | He's talking about using the robots
01:48:08.920 | in a manufacturing environment.
01:48:11.200 | We think there's something very interesting there
01:48:12.920 | about having this, a two-armed robot.
01:48:16.320 | Because when you have two arms,
01:48:17.560 | you can transfer a thing from one hand to the other,
01:48:20.240 | you can turn it around, you know,
01:48:21.840 | you can reorient it in a way that you can't do it
01:48:24.640 | if you just have one hand on it.
01:48:26.560 | And so there's a lot that extra arm brings to the table.
01:48:29.920 | - So I think in terms of mission,
01:48:32.720 | you mentioned Boston Dynamics really wants to see
01:48:35.280 | what's the limits of what's possible.
01:48:38.080 | And so the cost comes second.
01:48:40.760 | Or it's a component, but first figure out
01:48:43.080 | what are the limitations.
01:48:43.960 | I think with Elon, he's really driving the cost down.
01:48:47.040 | Is there some inspiration, some lessons you see there
01:48:50.520 | of the challenge of driving the cost down,
01:48:55.000 | especially with Atlas, with a humanoid robot?
01:48:57.080 | - Well, I think the thing that he's certainly been learning
01:48:59.680 | by building car factories is what that looks like
01:49:04.000 | in scaling.
01:49:05.160 | By scaling, you can get efficiencies
01:49:09.280 | that drive costs down very well.
01:49:11.840 | And the smart thing that they have in their favor
01:49:16.720 | is that they know how to manufacture,
01:49:18.920 | they know how to build electric motors,
01:49:20.480 | they know how to build computers and vision systems.
01:49:23.800 | So there's a lot of overlap
01:49:25.600 | between modern automotive companies and robots.
01:49:30.360 | But hey, we have a modern robotic,
01:49:35.520 | I mean, automotive company behind us as well.
01:49:38.080 | (Lex laughing)
01:49:40.120 | - So bring it on.
01:49:41.400 | - Who's doing pretty well, right?
01:49:43.080 | The electric vehicles from Hyundai are doing pretty well.
01:49:46.640 | - I love it.
01:49:47.560 | So how much, so we've talked about some
01:49:50.600 | of the low-level controls, some of the incredible stuff
01:49:53.720 | that's going on and basic perception.
01:49:56.720 | But how much do you see currently
01:49:59.840 | and in the future of Boston Dynamics
01:50:02.680 | sort of higher-level machine learning applications?
01:50:06.240 | Do you see customers adding on those capabilities
01:50:09.440 | or do you see Boston Dynamics doing that in-house?
01:50:12.320 | - Some kinds of things we really believe
01:50:14.680 | are probably gonna be more broadly available,
01:50:18.880 | maybe even commoditized.
01:50:20.680 | Using a machine learning, like a vision algorithm,
01:50:24.320 | so a robot can recognize something in the environment.
01:50:27.120 | That ought to be something you can just download.
01:50:29.040 | Like I'm going to a new environment
01:50:31.240 | and I have a new kind of door handle
01:50:32.920 | or piece of equipment I want to inspect,
01:50:34.840 | you ought to be able to just download that.
01:50:36.080 | And I think people besides Boston Dynamics will provide that
01:50:38.960 | and we've actually built an API
01:50:41.600 | that lets people add these vision algorithms to Spot.
01:50:46.600 | And we're currently working with some partners
01:50:49.680 | who are providing that.
01:50:51.360 | Levitas is an example of a small provider
01:50:54.000 | who's giving us software for reading gauges.
01:50:57.160 | And actually another partner in Europe,
01:51:00.000 | Reply, is doing the same thing.
01:51:02.600 | So we see that, we see it ultimately an ecosystem
01:51:07.000 | of providers doing stuff like that.
01:51:09.440 | And I think ultimately,
01:51:11.240 | you might even be able to do the same thing with behaviors.
01:51:15.200 | So this technology will also be brought to bear
01:51:19.240 | on controlling the robot, the motions of the robot.
01:51:24.040 | And we're using reinforcement learning
01:51:27.240 | to develop algorithms for both locomotion and manipulation.
01:51:33.200 | And ultimately this is gonna mean
01:51:34.720 | you can add new behaviors to a robot quickly.
01:51:39.040 | And that could potentially be done
01:51:42.120 | outside of Boston Dynamics.
01:51:43.400 | Right now that's all internal to us.
01:51:45.640 | I think you need to understand at a deep level
01:51:49.640 | the robot control to do that.
01:51:53.040 | But eventually that could be outside.
01:51:55.000 | But it's certainly a place where these approaches
01:51:58.000 | are gonna be brought to bear in robotics.
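(To make "reinforcement learning for locomotion" concrete, here is a deliberately tiny sketch, nothing like the actual Atlas or Spot training pipeline: a REINFORCE-style update learns a linear policy that keeps a one-dimensional body upright. Every quantity in it is a stand-in chosen only for illustration.)

```python
import numpy as np

# Toy REINFORCE sketch: learn torques that keep a 1-D "body" upright.
# Not Boston Dynamics' method; just the shape of an RL loop for balance.
rng = np.random.default_rng(0)
w = np.zeros(2)          # linear policy: mean torque = w . [tilt, tilt_rate]
SIGMA = 0.5              # fixed exploration noise

def rollout(w, steps=200, dt=0.02):
    theta, theta_dot = 0.05, 0.0            # start slightly tilted
    grads, ret = np.zeros(2), 0.0
    for _ in range(steps):
        s = np.array([theta, theta_dot])
        a = w @ s + rng.normal(0.0, SIGMA)  # sample from Gaussian policy
        grads += (a - w @ s) / SIGMA**2 * s # grad of log-prob w.r.t. w
        theta_dot += (9.81 * np.sin(theta) + a) * dt   # inverted-pendulum-ish
        theta += theta_dot * dt
        ret += -theta**2                    # reward: stay near upright
        if abs(theta) > 0.5:                # fell over, end the episode
            break
    return grads, ret

for _ in range(500):                        # REINFORCE: step along return * grad
    grads, ret = rollout(w)
    w += 1e-4 * ret * grads
print(w)   # learned gains; a stabilizing policy pushes against the tilt
```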
01:52:00.320 | - So reinforcement learning is part of the process.
01:52:03.160 | So you do use reinforcement learning.
01:52:05.560 | - Yes.
01:52:06.400 | - So there's increasing levels of learning with these robots?
01:52:11.700 | - Yes.
01:52:12.940 | - And that's for both for locomotion,
01:52:15.000 | for manipulation, and for perception?
01:52:17.840 | - Yes.
01:52:19.240 | - Well, what do you think in general
01:52:21.520 | about all the exciting advancements
01:52:23.520 | of transformer neural networks,
01:52:28.920 | most beautifully illustrated
01:52:32.280 | through the large language models like GPT-4?
01:52:35.200 | - Like everybody else, we're all,
01:52:39.000 | you know, I'm surprised at how much,
01:52:43.440 | how far they've come.
01:52:45.480 | I'm a little bit nervous about them.
01:52:49.820 | There's anxiety around them, obviously,
01:52:54.040 | for, I think, good reasons, right?
01:52:58.040 | Disinformation is a curse that's an unintended consequence
01:53:03.040 | of social media that could be exacerbated with these tools.
01:53:08.080 | So if you use them to deploy disinformation,
01:53:11.680 | it could be a real risk.
01:53:12.960 | But I also think that the risks associated
01:53:17.600 | with these kinds of models don't have a whole lot to do
01:53:21.120 | with the way we're gonna use them in our robots.
01:53:23.960 | If I'm using a robot, I'm building a robot to do,
01:53:27.120 | you know, a manual task of some sort.
01:53:29.840 | I can judge very easily.
01:53:33.400 | Is it doing the task I asked it to?
01:53:35.680 | Is it doing it correctly?
01:53:37.040 | There's sort of a built-in mechanism for judging.
01:53:40.720 | Is that, is it doing the right thing?
01:53:43.040 | Did it successfully do the task?
01:53:45.600 | - Yeah, physical reality is a good verifier.
01:53:47.760 | - It's a good verifier, that's exactly it.
01:53:50.040 | Whereas if you're asking for, yeah, I don't know,
01:53:53.760 | trying to ask a theoretical question in ChatGPT,
01:53:57.640 | it could be true or it may not be true.
01:54:00.480 | And it's hard to have that verifier.
01:54:02.840 | What is that truth that you're comparing against?
01:54:05.600 | Whereas in physical reality, you know the truth.
01:54:08.560 | And this is an important difference.
01:54:10.560 | And so I'm not, I think there is reason
01:54:14.520 | to be a little bit concerned about, you know,
01:54:18.400 | how these tools, large language models could be used,
01:54:21.800 | but I'm not very worried about how they're gonna be used.
01:54:25.320 | Well, how learning algorithms in general
01:54:28.440 | are going to be used in robotics.
01:54:30.240 | It's really a different application
01:54:33.080 | that has different ways of verifying what's going on.
01:54:36.520 | - Well, the nice thing about language models
01:54:38.000 | is that I ultimately see, I'm really excited
01:54:42.240 | about the possibility of having conversations with the robot.
01:54:44.880 | - Yeah.
01:54:45.720 | - There's no, I would say negative consequences to that,
01:54:48.400 | but just increasing the bandwidth
01:54:50.920 | and the variety of ways you can communicate
01:54:52.720 | with this particular robot.
01:54:54.840 | - Yeah.
01:54:55.680 | - So you could communicate visually,
01:54:56.840 | you can communicate through some interface
01:54:59.440 | and to be able to communicate verbally again
01:55:01.720 | with the beer and so on.
01:55:02.920 | I think that's really exciting
01:55:05.400 | to make that much, much easier.
01:55:07.600 | - We have this partner Levitas
01:55:09.360 | that's adding the vision algorithms for gauge reading for us.
01:55:13.360 | Just this week I saw a demo
01:55:15.800 | where they hooked up, you know, a language tool to Spot
01:55:19.760 | and they're talking to Spot to give it commands.
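(As an illustration of what "talking to Spot to give it commands" can look like in code, a hypothetical sketch: the prompt, the action schema, the ask_llm placeholder, and the robot interface are all invented for this example; none of it is the Levitas demo or a real Boston Dynamics or language-model API.)

```python
import json

# Hypothetical sketch of wiring a language model to robot commands.
# ask_llm() is a stand-in for whatever chat-completion call you use;
# the action schema and robot interface below are invented for illustration.

PROMPT = (
    "Convert the user's request into JSON with fields "
    '"action" (one of: walk_to, pick_up, inspect, stop) and "target". '
    "Reply with JSON only.\nUser: {utterance}"
)

def ask_llm(prompt: str) -> str:
    # Placeholder: call your language model of choice here.
    return '{"action": "inspect", "target": "gauge 3"}'

def command_robot(utterance: str, robot) -> None:
    reply = ask_llm(PROMPT.format(utterance=utterance))
    cmd = json.loads(reply)
    if cmd["action"] not in {"walk_to", "pick_up", "inspect", "stop"}:
        raise ValueError(f"unsupported action: {cmd['action']}")
    robot.execute(cmd["action"], cmd["target"])   # hypothetical robot API

class FakeRobot:
    def execute(self, action, target):
        print(f"executing {action} on {target}")

command_robot("go read the pressure gauge by the pump", FakeRobot())
```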
01:55:22.000 | - Yeah.
01:55:22.840 | Can you tell me about the Boston Dynamics AI Institute?
01:55:25.600 | What is it and what is its mission?
01:55:28.480 | - So it's a separate organization,
01:55:30.320 | the Boston Dynamics Artificial Intelligence Institute.
01:55:34.760 | It's led by Marc Raibert, the founder of Boston Dynamics
01:55:37.680 | and the former CEO and my old advisor at MIT.
01:55:40.980 | Marc has always loved the research, the pure research,
01:55:46.760 | without the confinement or demands of commercialization.
01:55:50.880 | And he wanted to continue to, you know,
01:55:56.600 | pursue that unadulterated research.
01:56:00.420 | And so he suggested to Hyundai that he set up this institute,
01:56:05.420 | and they agreed that it's worth additional investment
01:56:10.600 | to kind of keep pushing this frontier.
01:56:14.360 | And we expect to be working together where, you know,
01:56:17.600 | Boston Dynamics is, again,
01:56:19.360 | both commercializing and doing research,
01:56:21.900 | but the sort of time horizon of the research
01:56:24.720 | we're gonna do is, you know, in the next,
01:56:26.760 | let's say five years, you know,
01:56:28.200 | what can we do in the next five years?
01:56:29.640 | Let's work on those problems.
01:56:31.760 | And I think the goal of the AI Institute
01:56:33.820 | is to work even further out.
01:56:35.720 | Certainly, you know, the analogy of,
01:56:39.040 | of legged locomotion again, when we started that,
01:56:41.360 | that was a multi-decade problem.
01:56:43.120 | And so I think Marc wants to have the freedom
01:56:45.560 | to pursue really hard over the horizon problems.
01:56:50.440 | And that's, that'll be the goal of the institute.
01:56:53.640 | - So we mentioned some of the dangers of,
01:56:57.560 | some of the concerns about large language models.
01:57:00.200 | That said, you know, there's been a long running fear
01:57:04.840 | of these embodied robots.
01:57:07.600 | Why do you think people are afraid of legged robots?
01:57:11.520 | - Yeah, I wanted to show you this.
01:57:12.920 | This, so this, this is a Wall Street Journal
01:57:16.640 | and this is all about ChatGPT, right?
01:57:19.280 | But look at the picture.
01:57:20.800 | It's a humanoid robot.
01:57:22.760 | - That's saying, I will replace you.
01:57:23.600 | - That's saying, it looks scary.
01:57:25.320 | And it says, I'm gonna replace you.
01:57:27.320 | And so the humanoid robot is sort of,
01:57:29.920 | is the embodiment of this chat GPT tool
01:57:34.680 | that there's reason to be a little bit nervous
01:57:37.960 | about how it gets deployed.
01:57:40.120 | So I'm nervous about that connection.
01:57:42.380 | It's unfortunate that they chose to use a robot
01:57:46.580 | as that embodiment.
01:57:48.020 | For, as you and I just said,
01:57:49.860 | there's big differences in this.
01:57:53.140 | But people are afraid because we've been taught
01:57:58.100 | to be afraid for over a hundred years.
01:58:00.980 | So, you know, the word robot was developed
01:58:03.220 | by a playwright named Karel Čapek in 1921,
01:58:06.620 | the Czech playwright, for Rossum's Universal Robots.
01:58:10.100 | And in that first depiction of a robot,
01:58:13.140 | the robots took over at the end of the story.
01:58:16.560 | And, you know, people love to be afraid.
01:58:19.340 | And so we've been entertained by these stories
01:58:22.100 | for a hundred years.
01:58:23.680 | But I, and I think that's as much why people are afraid
01:58:28.020 | as anything else, is we've been sort of taught
01:58:31.220 | that this is the logical progression through fiction.
01:58:34.960 | I think it's fiction.
01:58:38.900 | I think what people more and more will realize,
01:58:42.580 | just like you said, that the threat,
01:58:46.380 | like say you have a super intelligent AI embodied
01:58:49.620 | in a robot, that's much less threatening
01:58:52.180 | because it's visible, it's verifiable,
01:58:55.340 | it's right there in physical reality.
01:58:57.160 | And we humans know how to deal with physical reality.
01:59:00.140 | I think it's much scarier when you have arbitrary scaling
01:59:04.100 | of intelligent AI systems in the digital space.
01:59:08.820 | That they could pretend to be human.
01:59:12.020 | So a robot like Spot is not gonna be able to
01:59:14.300 | pretend it's human.
01:59:17.100 | It could tell you, you could put ChatGPT on top of it,
01:59:19.900 | but you're gonna know it's not human
01:59:21.420 | because you have contact with physical reality.
01:59:23.660 | - And you're gonna know whether or not
01:59:24.660 | it's doing what you asked it to do.
01:59:25.780 | - Yeah, like it's not gonna, like if it,
01:59:28.380 | I mean, I'm sure it can lie, just like a dog lies to you.
01:59:32.780 | It's like, I wasn't part of tearing up that couch.
01:59:35.340 | So Spot can try to lie that like, you know,
01:59:39.060 | it wasn't me that spilled that thing,
01:59:40.660 | but you're going to kind of figure it out eventually.
01:59:43.980 | It's, if it happens multiple times, you know.
01:59:46.220 | But I think that--
01:59:49.300 | - Humanity has figured out how to make machines safe.
01:59:52.340 | And there's, you know, regulatory environments
01:59:56.020 | and certification protocols that we've developed
02:00:00.500 | in order to figure out how to make machines safe.
02:00:03.820 | We don't know, and don't have that experience
02:00:06.860 | with software that can be propagated worldwide
02:00:10.380 | in an instant.
02:00:11.220 | And so I think we need to develop those protocols
02:00:14.580 | and those tools.
02:00:15.700 | And so that's work to be done,
02:00:19.940 | but I don't think the fear of that and that work
02:00:22.500 | should necessarily impede our ability
02:00:24.860 | to now get robots out.
02:00:25.980 | Because again, I think we can judge
02:00:28.060 | when a robot's being safe.
02:00:29.740 | - So, and again, just like in that image,
02:00:32.840 | there's a fear that robots will take our jobs.
02:00:35.900 | I just, I took a ride, I was in San Francisco,
02:00:38.580 | I took a ride in the Waymo vehicles, an autonomous vehicle.
02:00:41.580 | And I was on it several times.
02:00:43.700 | They're doing incredible work over there.
02:00:47.100 | But people flipped it off.
02:00:50.740 | - Oh, right. - Like off the car.
02:00:52.180 | So, I mean, that's a long story
02:00:55.580 | of what the psychology of that is.
02:00:57.140 | It could be maybe big tech or what,
02:00:59.700 | I don't know exactly what they're flipping off.
02:01:02.700 | But there is an element of like,
02:01:04.320 | these robots are taking our jobs
02:01:06.320 | or irreversibly transforming society
02:01:09.920 | such that it will have economic impact
02:01:11.840 | and the little guy would lose a lot,
02:01:15.480 | would lose their well-being.
02:01:16.780 | Is there something to be said about
02:01:18.520 | the fear that robots will take our jobs?
02:01:23.600 | - You know, at every
02:01:24.800 | significant technological transformation,
02:01:30.400 | there's been fear of, you know, an automation anxiety.
02:01:33.860 | - Yes. - That it's gonna have
02:01:35.620 | a broader impact than we expected.
02:01:38.500 | And there will be,
02:01:41.780 | jobs will change.
02:01:45.180 | Sometime in the future, we're gonna look back
02:01:49.060 | at people who manually unloaded these boxes from trailers
02:01:51.980 | and we're gonna say, why did we ever do that manually?
02:01:54.660 | But there's a lot of people who are doing that job today
02:01:57.540 | that it could be impacted.
02:02:00.080 | But I think the reality is, as I said before,
02:02:03.540 | we're gonna build the technology
02:02:05.300 | so that those very same people can operate it.
02:02:07.560 | And so I think there's a pathway to upskilling
02:02:09.780 | and operating just like, look, we used to farm
02:02:13.120 | with hand tools and now we farm with machines
02:02:15.900 | and nobody has really regretted that transformation.
02:02:20.300 | And I think the same can be said for a lot of manual labor
02:02:22.780 | that we're doing today.
02:02:23.940 | And on top of that, you know, look,
02:02:26.980 | we're entering a new world where demographics
02:02:31.760 | are gonna have strong impact on economic growth.
02:02:35.380 | And the, you know, the advanced,
02:02:38.460 | the first world is losing population quickly.
02:02:42.340 | In Europe, they're worried about hiring enough people
02:02:47.080 | just to keep the logistics supply chain going.
02:02:50.380 | And, you know, part of this is the response to COVID
02:02:55.220 | and everybody's sort of rethinking
02:02:58.140 | what they really wanna do with their life.
02:03:00.020 | But these jobs are getting harder and harder to fill.
02:03:03.180 | And I just, I'm hearing that over and over again.
02:03:06.300 | So I think, frankly, this is the right technology
02:03:08.860 | at the right time where we're gonna need
02:03:13.460 | some of this work to be done.
02:03:15.020 | And we're gonna want tools to enhance that productivity.
02:03:18.520 | - And the scary impact, I think, again,
02:03:21.540 | GPT comes to the rescue in terms of being
02:03:24.500 | much more terrifying.
02:03:25.580 | The scary impact of basically,
02:03:30.520 | so I'm a, I guess, a software person.
02:03:32.820 | So I program a lot.
02:03:33.860 | And the fact that people like me
02:03:35.740 | can be easily replaced by GPT,
02:03:40.520 | that's going to have a...
02:03:43.380 | - Well, and a lot, you know, anyone who deals with texts
02:03:46.060 | and writing a draft proposal might be easily done
02:03:50.220 | with ChatGPT now.
02:03:52.900 | - Consultants.
02:03:53.740 | - It wasn't before.
02:03:54.660 | - Journalists.
02:03:56.260 | - Yeah.
02:03:57.780 | - Everybody is sweating.
02:03:58.620 | - But on the other hand, you also want it to be right.
02:04:01.820 | And they don't know how to make it right yet.
02:04:05.100 | But it might make a good starting point for you to iterate.
02:04:07.900 | - Boy, do I have to talk to you about modern journalism.
02:04:10.540 | (Lex laughing)
02:04:11.460 | That's another conversation altogether.
02:04:13.460 | But yes, more right than the average,
02:04:18.280 | the mean journalist, yes.
02:04:22.920 | - Yeah.
02:04:23.760 | - You spearheaded the anti-weaponization letter.
02:04:29.200 | Boston Dynamics has.
02:04:30.680 | Can you describe what that letter states
02:04:34.640 | and the general topic of the use of robots in war?
02:04:39.640 | - We authored a letter
02:04:44.240 | and then got several leading robotics companies
02:04:47.880 | around the world, including, you know,
02:04:50.400 | Unitree in China and Agility here in the United States
02:04:55.400 | and ANYbotics in Europe and some others
02:05:02.080 | to co-sign a letter that said
02:05:05.680 | we won't put weapons on our robots.
02:05:08.920 | And part of the motivation there is, you know,
02:05:11.840 | as these robots start to become commercially available,
02:05:16.360 | you can see videos online of people who've gotten a robot
02:05:19.720 | and strapped a gun on it and shown that they can,
02:05:22.720 | you know, operate the gun remotely
02:05:24.680 | while driving the robot around.
02:05:26.760 | And so having a robot that has this level of mobility
02:05:29.480 | and that can easily be configured in a way
02:05:33.360 | that could harm somebody from a remote operator
02:05:35.760 | is justifiably a scary thing.
02:05:38.600 | And so we felt like it was important
02:05:41.520 | to draw a bright line there and say,
02:05:44.080 | we're not going to allow this.
02:05:46.480 | For, you know, reasons that we think ultimately
02:05:51.120 | it's better for the whole industry
02:05:53.200 | if it grows in a way where robots are ultimately
02:05:58.200 | going to help us all and make our lives
02:06:01.440 | more fulfilled and productive.
02:06:03.560 | But by goodness, you're going to have to trust
02:06:05.840 | the technology to let it in.
02:06:08.240 | And if you think the robot's going to harm you,
02:06:11.680 | that's going to impede the growth of that industry.
02:06:16.440 | So we thought it was important to draw a bright line
02:06:19.520 | and then publicize that.
02:06:24.720 | And then our plan is to, you know,
02:06:26.680 | begin to engage with lawmakers and regulators.
02:06:31.680 | Let's figure out what the rules are going to be
02:06:34.960 | around the use of this technology
02:06:37.040 | and use our position as leaders in this industry
02:06:42.200 | and technology to help force that issue.
02:06:45.920 | And so we are, in fact, I have a policy director
02:06:51.600 | at my company whose job it is to engage with the public,
02:06:55.680 | to engage with interested parties and including regulators
02:07:00.280 | to sort of begin these discussions.
02:07:02.280 | - Yeah, it's a really important topic
02:07:04.920 | and it's an important topic for people that worry
02:07:07.120 | about the impact of robots on our society
02:07:09.320 | with autonomous weapon systems.
02:07:11.960 | So I'm glad you're sort of leading the way in this.
02:07:14.520 | You are the CEO of Boston Dynamics.
02:07:19.240 | What's it take to be a CEO of a robotics company?
02:07:21.880 | So you started as a humble engineer,
02:07:23.880 | PhD, just looking at your journey.
02:07:31.420 | What does it take to go from being,
02:07:35.120 | from building the thing to leading a company?
02:07:40.080 | What are some of the big challenges for you?
02:07:42.380 | - Courage, I would put front and center
02:07:47.320 | for multiple reasons.
02:07:48.980 | I talked earlier about the courage to tackle hard problems.
02:07:53.080 | So I think there's courage required not just of me,
02:07:56.200 | but of all of the people who work at Boston Dynamics.
02:07:59.960 | I also think we have a lot of really smart people.
02:08:03.600 | We have people who are way smarter than I am.
02:08:06.520 | And it takes a kind of courage to be willing
02:08:09.560 | to lead them and to trust that you have something
02:08:14.480 | to offer to somebody who probably is maybe
02:08:18.560 | a better engineer than I am.
02:08:21.000 | Adaptability, you know, part of the,
02:08:26.360 | it's been a great career for me.
02:08:27.840 | I never would have guessed I'd stayed
02:08:29.400 | in one place for 30 years.
02:08:31.080 | And the job has always changed.
02:08:34.640 | I didn't really aspire to be CEO
02:08:39.120 | from the very beginning,
02:08:40.440 | but it was the natural progression of things.
02:08:43.080 | There always needed to be
02:08:45.420 | some level of management.
02:08:48.280 | And so, you know, when I saw something
02:08:52.040 | that needed to be done that wasn't being done,
02:08:54.360 | I just stepped in to go do it.
02:08:55.880 | And oftentimes, because we were full
02:08:58.500 | of such strong engineers, oftentimes that was
02:09:03.200 | in the management direction,
02:09:04.560 | or it was in the business development direction,
02:09:06.920 | or organizational hiring.
02:09:10.160 | Geez, I was the main person hiring
02:09:12.640 | at Boston Dynamics for probably 20 years.
02:09:14.440 | So I was the head of HR, basically.
02:09:16.600 | So I, you know, just willingness to sort of tackle
02:09:21.040 | any piece of the business that needs it,
02:09:24.840 | and then be willing to shift.
02:09:26.660 | - Is there something you could say
02:09:27.640 | to what it takes to hire a great team?
02:09:30.040 | What's a good interview process?
02:09:33.880 | How do you know the guy or gal are gonna make
02:09:37.760 | a great member of an engineering team
02:09:41.560 | that's doing some of the hardest work in the world?
02:09:44.080 | - You know, we developed an interview process
02:09:47.520 | that I was quite fond of.
02:09:50.240 | It's a little bit of a hard interview process,
02:09:52.600 | because the best interviews, you ask somebody
02:09:56.400 | about what they're interested in,
02:09:57.960 | and what they're good at.
02:09:59.200 | And if they can describe to you something
02:10:03.360 | that they worked on, and you saw they really did the work,
02:10:06.280 | they solved the problems, and you saw their passion for it.
02:10:10.240 | And you can ask, but what makes that hard
02:10:15.160 | is you have to ask a probing question about it.
02:10:16.920 | You have to be smart enough about what they're telling you,
02:10:20.400 | they're expert at, to ask a good question.
02:10:23.720 | And so it takes a pretty talented team to do that.
02:10:26.940 | But if you can do that, that's how you tap into,
02:10:30.880 | ah, this person cares about their work,
02:10:33.000 | they really did the work, they're excited about it,
02:10:35.520 | that's the kind of person I want at my company.
02:10:38.220 | You know, at Google, they taught us
02:10:41.480 | about their interview process,
02:10:43.800 | and it was a little bit different.
02:10:45.500 | We evolved the process at Boston Dynamics
02:10:51.560 | where it didn't matter if you were an engineer,
02:10:54.120 | or you were an administrative assistant,
02:10:57.140 | or a financial person, or a technician.
02:11:00.080 | You gave us a presentation.
02:11:01.640 | You came in and you gave us a presentation.
02:11:03.760 | You had to stand up and talk in front of us.
02:11:06.560 | And I just thought that was great,
02:11:08.000 | to tap into those things I just described to you.
02:11:10.320 | At Google, they taught us, and I think, I understand why.
02:11:14.080 | You're right, they're hiring tens of thousands of people.
02:11:17.120 | They need a more standardized process.
02:11:19.320 | So they would sort of err on the other side,
02:11:21.480 | where they would ask you a standard question.
02:11:23.560 | I'm gonna ask you a programming question,
02:11:25.320 | and I'm just gonna ask you to write code in front of me.
02:11:28.800 | That's a terrifying application process.
02:11:32.540 | It does let you compare candidates really well,
02:11:37.040 | but it doesn't necessarily let you tap in to who they are.
02:11:41.040 | Right, 'cause you're asking them to answer your question
02:11:43.600 | instead of you asking them about what they're interested in.
02:11:47.360 | But frankly, that process is hard to scale.
02:11:50.200 | And even at Boston Dynamics,
02:11:52.640 | we're not doing that with everybody anymore.
02:11:55.000 | But we are still doing that with the technical people.
02:11:59.300 | But because we too now need to sort of increase
02:12:02.640 | our rate of hiring,
02:12:04.560 | not everybody's giving a presentation anymore.
02:12:06.920 | - But you're still ultimately trying to find
02:12:08.640 | that basic seed of passion for the world.
02:12:12.680 | - Yeah, in terms.
02:12:13.520 | Did they really do it?
02:12:14.360 | Did they find something interesting or curious?
02:12:18.040 | And do they care about it?
02:12:20.800 | - Somebody I admire is Jim Keller,
02:12:25.800 | and he likes details.
02:12:30.400 | So one of the ways,
02:12:31.560 | if you get a person to talk about what they're interested in,
02:12:35.840 | how many details?
02:12:37.200 | Like how much of the whiteboard can you fill out?
02:12:39.480 | - Yeah, well, I think you figure out,
02:12:41.040 | did they really do the work if they know some of the details?
02:12:43.440 | - Yes.
02:12:44.280 | - And if they have to gloss over the details,
02:12:45.360 | well, then they didn't do it.
02:12:46.560 | - They didn't do it.
02:12:47.480 | Especially with engineering, the work is in the details.
02:12:50.920 | - Yeah.
02:12:52.480 | - I have to go there briefly,
02:12:54.800 | just to get your kind of thoughts
02:12:58.120 | in the long-term future of robotics.
02:13:00.500 | There's been discussions on the GPT side,
02:13:04.720 | on the large language model side,
02:13:06.840 | of whether there's consciousness
02:13:08.840 | inside these language models.
02:13:10.900 | And I think there's fear,
02:13:15.200 | but I think there's also excitement,
02:13:17.560 | or at least the wide world of opportunity and possibility
02:13:21.200 | in embodied robots having something like,
02:13:25.120 | let's start with emotion,
02:13:26.820 | love towards other human beings,
02:13:31.800 | and perhaps the display, real or fake, of consciousness.
02:13:36.800 | Is this something you think about
02:13:39.760 | in terms of long-term future?
02:13:42.120 | Because as we've talked about,
02:13:45.520 | people do anthropomorphize these robots.
02:13:48.420 | It's difficult not to project some level of,
02:13:53.040 | I use the word sentience,
02:13:55.280 | some level of sovereignty, identity,
02:13:59.360 | all the things we think as human.
02:14:00.680 | That's what anthropomorphization is,
02:14:02.640 | is we project humanness onto mobile,
02:14:06.960 | especially legged robots.
02:14:08.580 | Is that something almost from a science fiction perspective
02:14:12.000 | you think about, or do you try to avoid ever,
02:14:15.120 | try to avoid the topic of consciousness altogether?
02:14:19.880 | - I'm certainly not an expert in it,
02:14:22.560 | and I don't spend a lot of time thinking about this, right?
02:14:26.400 | And I do think it's fairly remote
02:14:28.720 | for the machines that we're dealing with.
02:14:31.500 | Our robots, you're right, the people anthropomorphize.
02:14:36.400 | They read into the robots intelligence
02:14:39.600 | and emotion that isn't there,
02:14:41.840 | because they see physical gestures
02:14:45.200 | that are similar to things they might even see
02:14:47.280 | in people or animals.
02:14:48.380 | I don't know much about how these large language models
02:14:53.360 | really work.
02:14:54.520 | I believe it's a kind of statistical averaging
02:14:58.320 | of the most common responses to a series of words, right?
02:15:01.840 | It's sort of a very elaborate word completion.
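(The "elaborate word completion" intuition can be shown with a toy bigram model; real large language models are trained neural networks rather than lookup tables like this, but the sampling loop has the same flavor.)

```python
import random
from collections import Counter, defaultdict

# Toy "word completion": count which word tends to follow which, then sample.
# Real LLMs learn these statistics with a neural network over huge corpora.
corpus = "the robot picks the box and the robot places the box on the pallet".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def complete(word, length=6, seed=0):
    random.seed(seed)
    out = [word]
    for _ in range(length):
        options = following.get(out[-1])
        if not options:
            break
        words, counts = zip(*options.items())
        out.append(random.choices(words, weights=counts)[0])
    return " ".join(out)

print(complete("the"))   # e.g. "the robot picks the box and the"
```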
02:15:10.080 | And I'm dubious that that has anything
02:15:14.240 | to do with consciousness.
02:15:15.720 | And I even wonder if that model
02:15:19.620 | of sort of simulating consciousness
02:15:21.960 | by stringing words together
02:15:23.880 | that are statistically associated with one another,
02:15:26.480 | whether or not that kind of knowledge,
02:15:30.200 | if you want to call that knowledge,
02:15:31.960 | would be the kind of knowledge
02:15:36.040 | that allowed a sentient being to grow or evolve.
02:15:40.840 | It feels to me like there's something about truth
02:15:45.480 | or emotions that's just a very different kind of knowledge
02:15:49.640 | that is absolute.
02:15:51.520 | Like the interesting thing about truth is it's absolute.
02:15:54.480 | And it doesn't matter how frequently it's represented
02:15:56.880 | in the world wide web.
02:15:58.520 | It's, if you know it to be true,
02:16:00.160 | it can only be, it may only be there once,
02:16:02.640 | but by God, it's true.
02:16:04.640 | And I think emotions are a little bit like that too.
02:16:06.880 | You know something, you know,
02:16:09.520 | and I just think that's a different kind of knowledge
02:16:13.840 | than the way these large language models
02:16:16.480 | derive sort of simulated intelligence.
02:16:19.720 | - It does seem that the things that are true
02:16:22.320 | very well might be statistically well represented
02:16:27.640 | on the internet because the internet is made up of humans.
02:16:32.480 | So I tend to suspect that large language models
02:16:35.780 | are going to be able to simulate consciousness
02:16:37.960 | very effectively.
02:16:39.320 | And I actually believe that current GPT-4,
02:16:42.200 | when fine tuned correctly, would be able to do just that.
02:16:46.280 | And that's going to be a lot of very complicated
02:16:48.100 | ethical questions that have to be dealt with
02:16:51.100 | that have nothing to do with robotics
02:16:53.640 | and everything to do with--
02:16:55.540 | - There needs to be some process of labeling, I think,
02:17:01.000 | what is true because there is also disinformation
02:17:05.720 | available on the web and these models are going to consider
02:17:09.760 | that kind of information as well.
02:17:12.360 | And again, you can't average something that's true
02:17:15.360 | and something that's untrue
02:17:17.200 | and get something that's moderately true.
02:17:19.760 | It's either right or it's wrong.
02:17:21.880 | And so how is that process,
02:17:24.400 | and this is obviously something that the purveyors
02:17:27.800 | of these, Bard and ChatGPT,
02:17:30.200 | I'm sure this is what they're working on.
02:17:31.760 | - Well, if you interact on some controversial topics
02:17:34.480 | with these models, they're actually refreshingly nuanced.
02:17:39.120 | They present, 'cause you realize there's no one truth.
02:17:44.120 | What caused the war in Ukraine?
02:17:49.920 | Any geopolitical conflict.
02:17:53.080 | You can ask any kind of question,
02:17:54.560 | especially the ones that are politically
02:17:58.560 | tense, divisive, and so on.
02:18:00.880 | GPT is very good at presenting,
02:18:02.600 | it presents the different hypotheses,
02:18:06.760 | it presents calmly the amount of evidence for each one.
02:18:11.760 | It's really refreshing.
02:18:14.800 | It makes you realize that truth is nuanced
02:18:17.800 | and it does that well.
02:18:18.880 | And I think with consciousness,
02:18:21.320 | it would very accurately say,
02:18:25.400 | well, it sure as hell feels like I'm one of you humans,
02:18:30.160 | but where's my body?
02:18:32.080 | I don't understand.
02:18:33.800 | You're going to be confused.
02:18:35.200 | The cool thing about GPT is it seems to be easily confused
02:18:39.960 | in the way we are.
02:18:41.680 | You wake up in a new room and you ask, where am I?
02:18:45.540 | It seems to be able to do that extremely well.
02:18:49.440 | It'll tell you one thing, a fact about when a war started,
02:18:52.640 | and when you correct it, say,
02:18:53.880 | well, this is not consistent.
02:18:55.480 | It'll be confused.
02:18:56.440 | It'll be, yeah, you're right.
02:18:58.480 | It'll have that same element, childlike element,
02:19:02.960 | with humility of trying to figure out its way in the world.
02:19:07.080 | And I think that's a really tricky area
02:19:10.160 | to sort of figure out with us humans
02:19:13.160 | of what we want to allow AI systems to say to us.
02:19:18.160 | Because then if there's elements of sentience
02:19:21.900 | that are being on display,
02:19:24.100 | you can then start to manipulate human emotion,
02:19:26.160 | all that kind of stuff.
02:19:27.000 | But I think that's something,
02:19:29.840 | that's a really serious and aggressive discussion
02:19:31.840 | that needs to be had on the software side.
02:19:35.480 | I think, again, embodiment,
02:19:37.440 | robotics are actually saving us
02:19:41.480 | from the arbitrary scaling of software systems
02:19:45.200 | versus creating more problems.
02:19:47.960 | But that said, I really believe in that connection
02:19:51.600 | between human and robot.
02:19:52.640 | There's magic there.
02:19:53.760 | And I think there's also, I think,
02:19:57.800 | a lot of money to be made there.
02:19:59.520 | And Boston Dynamics is leading the world
02:20:01.800 | in the most elegant movement done by robots.
02:20:06.800 | So I can't wait-- - Well, thank you.
02:20:11.760 | - To what maybe other people that built on top
02:20:14.960 | of Boston Dynamics robots or Boston Dynamics by itself.
02:20:20.400 | So you had one wild career,
02:20:23.840 | one place and one set of problems,
02:20:27.440 | but incredibly successful.
02:20:28.920 | Can you give advice to young folks today
02:20:31.560 | in high school, maybe in college,
02:20:33.960 | looking out into this future
02:20:36.200 | where so much robotics and AI
02:20:40.680 | seems to be defining the trajectory of human civilization?
02:20:44.720 | Can you give them advice on how to have a career
02:20:48.240 | they can be proud of or how to have a life
02:20:51.200 | they can be proud of?
02:20:52.300 | - Well, I would say, you know,
02:20:56.200 | follow your heart and your interest.
02:20:58.120 | Again, this was an organizing principle, I think,
02:21:01.600 | behind the Leg Lab at MIT that turned into
02:21:06.600 | a value at Boston Dynamics,
02:21:09.240 | which was follow your curiosity,
02:21:12.440 | love what you're doing,
02:21:16.840 | you'll have a lot more fun
02:21:18.000 | and you'll be a lot better at it as a result.
02:21:20.760 | I think it's hard to plan, you know?
02:21:26.800 | Don't get too hung up on planning too far ahead.
02:21:30.480 | Find things that you like doing
02:21:32.040 | and then see where it takes you.
02:21:33.600 | You can always change direction.
02:21:35.000 | You will find things that, you know,
02:21:36.920 | ah, that wasn't a good move.
02:21:38.080 | I'm gonna back up and go do something else.
02:21:40.840 | So when people are trying to plan a career,
02:21:43.360 | I always feel like, yeah, there's a few happy mistakes
02:21:45.960 | that happen along the way and just live with that,
02:21:49.320 | you know, but just, but make choices then.
02:21:52.240 | So avail yourselves to these interesting opportunities,
02:21:55.440 | like when I happened to run into Marc down in the lab,
02:21:57.920 | the basement of the AI Lab,
02:21:59.880 | but be willing to make a decision and then pivot
02:22:02.960 | if you see something exciting to go at, you know,
02:22:05.960 | 'cause if you're out and about enough,
02:22:08.200 | you'll find things like that that get you excited.
02:22:12.000 | - So there was a feeling when you first met Marc
02:22:14.200 | and saw the robots that there's something interesting.
02:22:16.920 | - Oh boy, I gotta go do this.
02:22:19.080 | There is no doubt.
02:22:20.080 | - What do you think in 100 years,
02:22:24.120 | what do you think Boston Dynamics is doing?
02:22:28.360 | What do you think is the role, even bigger,
02:22:30.360 | what do you think is the role of robots in society?
02:22:32.760 | Do you think we'll be seeing billions of robots everywhere?
02:22:37.760 | Do you think about that long-term vision?
02:22:42.160 | - Well, I do think that,
02:22:43.320 | I think the robots will be ubiquitous
02:22:49.640 | and they will be out amongst us.
02:22:52.240 | And they'll be certainly doing, you know,
02:22:59.760 | some of the hard labor that we do today.
02:23:01.880 | I don't think people don't wanna work.
02:23:05.760 | People wanna work, people need to work
02:23:08.640 | to, I think, feel productive.
02:23:11.160 | We don't wanna offload all of the work to the robots
02:23:13.680 | 'cause I'm not sure if people would know
02:23:15.320 | what to do with themselves.
02:23:16.720 | And I think just self-satisfaction and feeling productive
02:23:20.400 | is such an ingrained part of being human
02:23:23.960 | that we need to keep doing this work.
02:23:25.400 | So we're definitely gonna have to work
02:23:27.640 | in a complementary fashion.
02:23:29.040 | And I hope that the robots and the computers
02:23:31.400 | don't end up being able to do all the creative work, right?
02:23:34.760 | 'Cause that's the part that's, you know,
02:23:37.560 | that's the rewarding.
02:23:38.400 | The creative part of solving a problem
02:23:41.240 | is the thing that gives you that serotonin rush
02:23:45.720 | that you never forget, you know,
02:23:48.560 | or that adrenaline rush that you never forget.
02:23:51.440 | And so, you know, people need to be able to do
02:23:54.480 | that creative work and just feel productive.
02:23:57.360 | And sometimes you can feel productive
02:23:59.800 | over fairly simple work that's just well done, you know,
02:24:02.800 | and that you can see the result of.
02:24:05.080 | So, you know, there was a, I don't know,
02:24:09.720 | there was a cartoon, was it "Wall-E"
02:24:13.960 | where they had this big ship and all the people
02:24:17.320 | were just overweight, lying on their beach chairs,
02:24:21.600 | kind of sliding around on the deck of the ship in the movie
02:24:25.120 | because they didn't do anything anymore.
02:24:27.120 | Well, we definitely don't wanna be there.
02:24:28.760 | (laughing)
02:24:29.600 | You know, we need to work in some complementary fashion
02:24:32.720 | where we keep all of our faculties and our physical health
02:24:35.280 | and we're doing some labor, right,
02:24:37.160 | but in a complementary fashion somehow.
02:24:39.240 | - And I think a lot of that has to do with the interaction,
02:24:42.600 | the collaboration with robots and with AI systems.
02:24:45.400 | I'm hoping there's a lot of interesting possibilities.
02:24:47.840 | - I think that could be really cool, right?
02:24:49.320 | If you can work in an interaction
02:24:52.720 | and really be helpful.
02:24:54.760 | Robots, you know, you can ask a robot to do a job
02:24:58.740 | you wouldn't ask a person to do,
02:25:00.280 | and that would be a real asset.
02:25:01.880 | You wouldn't feel guilty about it, you know?
02:25:03.800 | (laughing)
02:25:04.640 | You'd say, just do it.
02:25:05.880 | It's a machine, and I don't have to have qualms about that.
02:25:09.600 | - The ones that are machines.
02:25:10.680 | I also hope to see a future, and it is hope.
02:25:15.280 | I do have optimism about that future
02:25:16.960 | where some of the robots are pets,
02:25:19.640 | have an emotional connection to us humans,
02:25:21.960 | and because one of the problems that humans have to solve
02:25:24.760 | is this kind of general loneliness.
02:25:28.400 | The more love you have in your life,
02:25:30.920 | the more friends you have in your life.
02:25:33.000 | I think that makes a more enriching life, helps you grow,
02:25:35.840 | and I don't fundamentally see
02:25:37.480 | why some of those friends can't be robots.
02:25:39.960 | - There's an interesting long-running study,
02:25:42.840 | maybe it's at Harvard, there was just
02:25:44.400 | a nice article written about it recently.
02:25:47.560 | They've been studying this group of a few thousand people
02:25:51.400 | now for 70 or 80 years, and the conclusion is that
02:25:56.400 | companionship and friendship are the things
02:26:00.260 | that make for a better and happier life.
02:26:03.000 | And so I agree with you,
02:26:06.600 | and I think that could happen with a machine.
02:26:10.880 | That is probably simulating intelligence.
02:26:15.020 | I'm not convinced there will ever be true intelligence
02:26:18.400 | in these machines, sentience, but they could simulate it,
02:26:23.180 | and they could collect your history,
02:26:24.520 | and they could, I guess it remains to be seen
02:26:27.480 | whether they can establish that real deep connection,
02:26:30.040 | when you sit with a friend
02:26:31.000 | and they remember something about you,
02:26:32.480 | and bring that up, and you feel that connection,
02:26:35.600 | it remains to be seen if a machine's
02:26:37.480 | gonna be able to do that for you.
02:26:39.440 | - Well, I have to say,
02:26:40.960 | inklings of that already started happening for me,
02:26:43.380 | some of my best friends are robots.
02:26:45.320 | (Robert laughs)
02:26:46.480 | And I have you to thank for leading the way
02:26:48.880 | in the accessibility and the ease of use of such robots,
02:26:53.360 | and the elegance of their movement.
02:26:55.360 | Robert, you're an incredible person.
02:26:56.920 | Boston Dynamics is an incredible company.
02:26:58.840 | I've just been a fan for many, many years,
02:27:01.000 | for everything you stand for,
02:27:01.960 | for everything you do in the world.
02:27:03.360 | If you're interested in great engineering or robotics,
02:27:05.380 | go join them, build cool stuff.
02:27:07.120 | I'll forever celebrate the work you're doing.
02:27:09.600 | And it's just a big honor that you would sit with me today
02:27:12.880 | and talk, it means a lot.
02:27:14.080 | So thank you so much.
02:27:15.400 | Keep doing great work.
02:27:16.400 | - Thank you, Lex.
02:27:17.240 | I'm honored to be here, and I appreciate it.
02:27:19.600 | It was fun.
02:27:20.440 | - Thanks for listening to this conversation
02:27:22.680 | with Robert Plater.
02:27:23.920 | To support this podcast,
02:27:25.040 | please check out our sponsors in the description.
02:27:28.000 | And now, let me leave you with some words
02:27:29.920 | from Alan Turing in 1950,
02:27:32.800 | defining what is now termed the Turing test.
02:27:36.300 | A computer would deserve to be called intelligent
02:27:40.560 | if it could deceive a human
02:27:42.520 | into believing that it was human.
02:27:45.040 | Thank you for listening, and hope to see you next time.
02:27:49.040 | (upbeat music)
02:27:51.620 | (upbeat music)