
Marc Raibert: Boston Dynamics and the Future of Robotics | Lex Fridman Podcast #412


Chapters

0:00 Introduction
1:43 Early robots
6:47 Legged robots
25:27 Boston Dynamics
28:45 BigDog
36:52 Hydraulic actuation
38:44 Natural movement
44:31 Leg Lab
51:23 AI Institute
54:41 Athletic intelligence
62:35 Building a team
65:37 Videos
73:25 Engineering
76:53 Dancing robots
81:40 Hiring
85:32 Optimus robot
94:02 Future of robotics
98:56 Advice for young people

Transcript

00:00:00.000 | - So Big Dog became LS3, which is the big load carrying one.
00:00:03.600 | - Just a quick pause, it can carry 400 pounds.
00:00:08.020 | - It was designed to carry 400,
00:00:09.560 | but we had it carrying about 1,000 pounds at one time.
00:00:12.680 | - Of course you did, just to make sure.
00:00:14.960 | - We had one carrying the other one, we had two of them,
00:00:17.200 | so we had one carrying the other one.
00:00:18.960 | - So one of the things that stands out
00:00:20.240 | about the robots Boston Dynamics have created
00:00:23.320 | is how beautiful the movement is,
00:00:25.240 | how natural the walking is and running is,
00:00:30.000 | even flipping is, throwing is.
00:00:32.200 | So maybe you can talk about what's involved
00:00:34.960 | in making it look natural.
00:00:37.660 | - Well, I think having good hardware is part of the story,
00:00:41.240 | and people who think you don't need
00:00:43.480 | to innovate hardware anymore are wrong.
00:00:45.980 | - The following is a conversation with Marc Raibert,
00:00:52.360 | a legendary roboticist, founder and longtime CEO
00:00:55.560 | of Boston Dynamics, and recently the executive director
00:00:59.440 | of the newly created Boston Dynamics AI Institute
00:01:02.920 | that focuses on research at the cutting edge,
00:01:06.320 | on creating future generations of robots
00:01:08.720 | that are far better than anything that exists today.
00:01:12.400 | He has been leading the creation of incredible legged robots
00:01:16.240 | for over 40 years at CMU, at MIT,
00:01:20.820 | the legendary MIT Leg Lab,
00:01:23.400 | and then of course Boston Dynamics,
00:01:25.240 | with amazing robots like Big Dog, Atlas, Spot, and Handle.
00:01:29.200 | This was a big honor and pleasure for me.
00:01:33.940 | This is the Lex Fridman Podcast.
00:01:36.480 | To support it, please check out our sponsors
00:01:38.400 | in the description.
00:01:39.680 | And now, dear friends, here's Marc Raibert.
00:01:42.840 | When did you first fall in love with robotics?
00:01:47.080 | - Well, I was always a builder from a young age.
00:01:50.980 | I was lucky.
00:01:51.860 | My father was a frustrated engineer,
00:01:56.160 | and by that I mean he wanted to be an aerospace engineer,
00:02:01.160 | but his mom from the old country thought
00:02:03.860 | that that would be like a grease monkey,
00:02:05.740 | and so she said no, so he became an accountant.
00:02:09.380 | But the result of that was our basement
00:02:12.260 | was always full of tools and equipment and electronics,
00:02:16.820 | and from a young age I would watch him assembling a kit,
00:02:20.980 | an EICO kit or something like that.
00:02:22.460 | I still have a couple of his old EICO kits.
00:02:24.980 | But it was really during graduate school
00:02:29.540 | when I followed a professor back from class.
00:02:34.540 | It was Berthold Horn at MIT,
00:02:36.900 | and I was taking an interim class.
00:02:40.540 | It's IAP, independent activities period,
00:02:43.200 | and I followed him back to his lab,
00:02:45.640 | and on the table was a Vicarm robot arm
00:02:50.620 | taken apart in probably 1,000 pieces,
00:02:53.300 | and when I saw that from that day on, I was a roboticist.
00:02:58.300 | - Do you remember the year?
00:02:59.740 | - 1974.
00:03:01.000 | - 1974, so there's just this arm in pieces,
00:03:04.620 | and you saw the pieces, and you saw in your vision
00:03:08.340 | the arm when it's put back together
00:03:11.220 | and the possibilities that holds.
00:03:12.800 | - Somehow it spurred my imagination.
00:03:15.740 | I was in the Brain and Cognitive Sciences Department
00:03:19.580 | as a graduate student doing neurophysiology.
00:03:21.620 | I'd been an electrical engineer
00:03:23.020 | as an undergrad at Northeastern,
00:03:25.260 | and the neurophysiology wasn't really working for me.
00:03:28.960 | It wasn't conceptual enough.
00:03:30.900 | I couldn't see really how by looking at single neurons
00:03:35.000 | you were gonna get to a place where you could understand
00:03:37.900 | control systems or thought or anything like that,
00:03:41.940 | and the AI lab was always an appealing,
00:03:45.300 | this was before CSAIL, right?
00:03:46.640 | This was in the '70s, so the AI lab
00:03:49.220 | was always an appealing idea,
00:03:51.660 | and so when I went back to the AI lab following him,
00:03:56.020 | and I saw the arm, I just thought this is it.
00:03:59.180 | - It's so interesting, the tension between the BCS,
00:04:03.040 | Brain and Cognitive Science approach
00:04:04.860 | to understanding intelligence,
00:04:07.120 | and the robotics approach to understanding intelligence.
00:04:09.700 | - Well, BCS has now morphed a bit, right?
00:04:11.960 | They have the Center for Brains, Minds, and Machines,
00:04:15.140 | which is trying to bridge that gap,
00:04:17.740 | and even when I was there, David Marr was in the AI lab.
00:04:21.060 | David Marr had models of the brain
00:04:24.380 | that were appealing both to biologists
00:04:26.620 | but also to computer people,
00:04:28.780 | so he was a visitor in the AI lab at the time,
00:04:31.380 | and I guess he became full-time there,
00:04:34.180 | so that was the first time a bridge was made
00:04:36.380 | between those two groups.
00:04:38.580 | Then the bridge kinda went away,
00:04:39.900 | and then there was another time in the '80s,
00:04:42.140 | and then recently, the last five or so years,
00:04:45.560 | there's been a stronger connection.
00:04:48.120 | - You said you were always kind of a builder.
00:04:50.300 | What stands out to you in memory of a thing you've built?
00:04:53.660 | Maybe a trivial thing that just kinda inspired you
00:04:58.420 | in the possibilities that this direction of work might hold?
00:05:02.600 | - I mean, we were just doing gadgets when we were kids.
00:05:05.100 | I had a friend, we were taking,
00:05:07.820 | you know the, I don't know if everybody remembers,
00:05:10.300 | but fluorescent lights had this little aluminum cylinder,
00:05:15.300 | I can't even remember what it's called now,
00:05:18.280 | that you needed, a starter, I think it was,
00:05:20.380 | and we would take those apart, fill 'em with match heads,
00:05:24.180 | put a tail on it, and make it into little rockets.
00:05:26.980 | - So it wasn't always about function.
00:05:29.000 | It was, well-- - Well, rocket
00:05:30.860 | was pretty much-- - I guess that is
00:05:32.580 | pretty functional, but yeah, I guess that is a question.
00:05:35.480 | How much was it about function versus
00:05:37.900 | just creating something cool?
00:05:39.680 | - I think it's still a balance between those two.
00:05:43.380 | There was a time, though, when I was a,
00:05:45.020 | I guess I was probably already a professor,
00:05:46.860 | or maybe late in graduate school,
00:05:48.180 | when I thought that function was everything,
00:05:50.620 | and that mobility, dexterity, perception, and intelligence,
00:05:55.620 | those are sort of the key functionalities for robotics,
00:05:59.580 | that that's what mattered, and nothing else mattered.
00:06:04.020 | And I even had kind of this platonic ideal
00:06:07.080 | that a robot, if you just looked at a robot,
00:06:10.400 | and it wasn't doing anything,
00:06:11.560 | it would look like a pile of junk,
00:06:12.980 | which a lot of my robots looked like in those days.
00:06:16.320 | But then when it started moving,
00:06:18.400 | you'd get the idea that it had some kind of life,
00:06:21.280 | or some kind of interest in its movement.
00:06:24.680 | And I think we purposely even designed the machines,
00:06:27.520 | not worrying about the aesthetics of the structure itself.
00:06:33.280 | But then it turns out that the aesthetics
00:06:35.760 | of the thing itself add and combine
00:06:38.660 | with the lifelike things that the robots can do.
00:06:42.420 | But the heart of it is making them do things
00:06:45.660 | that are interesting.
00:06:47.420 | - So one of the things that underlies a lot of your work
00:06:51.180 | is that the robots you create,
00:06:53.820 | the systems you have created for over 40 years now,
00:06:57.620 | have a kind of, they're not cautious.
00:07:00.220 | So a lot of robots that people know about
00:07:03.020 | move about this world very cautiously, carefully,
00:07:05.860 | very afraid of the world.
00:07:08.020 | A lot of the robots you built,
00:07:09.560 | especially in the early days,
00:07:10.680 | were very aggressive, under-actuated.
00:07:13.520 | They're hopping, they're wild, moving quickly.
00:07:18.020 | So is there a philosophy underlying that?
00:07:20.760 | - Well, let me tell you about
00:07:21.980 | how I got started on legs at all.
00:07:24.160 | When I was still a graduate student,
00:07:26.580 | I went to a conference.
00:07:28.620 | It was a biological legged locomotion conference,
00:07:31.700 | and I think it was in Philadelphia.
00:07:33.540 | So it was all biomechanics people,
00:07:35.580 | researchers who would look at muscle
00:07:37.460 | and maybe neurons and things like that.
00:07:40.220 | They weren't so much computational people,
00:07:42.460 | but they were more biomechanics.
00:07:44.180 | And maybe there were 1,000 people there.
00:07:45.820 | And I went to a talk, one of the talks,
00:07:48.980 | all the talks were about the body
00:07:51.700 | of either animals or people and respiration,
00:07:53.700 | things like that.
00:07:54.540 | But one talk was by a robotics guy,
00:07:56.860 | and he showed a six-legged robot
00:08:00.500 | that walked very slowly.
00:08:03.520 | It always had at least three feet on the ground,
00:08:06.940 | so it worked like a table or a chair with tripod stability.
00:08:10.220 | And it moved really slowly.
00:08:11.940 | And I just looked at that and said, wow, that's wrong.
00:08:16.300 | That's not anything like how people and animals work.
00:08:19.740 | Because we bounce and fly,
00:08:22.500 | we have to predict what's gonna happen
00:08:24.300 | in order to keep our balance
00:08:25.540 | when we're taking a running step or something like that.
00:08:28.780 | We use the springiness in our legs,
00:08:31.860 | in our muscles and our tendons and things like that
00:08:34.340 | as part of the story, the energy circulates.
00:08:37.140 | We don't just throw it away every time.
00:08:39.700 | So I'm not sure I understood all that when I first saw it,
00:08:42.580 | but I definitely got inspired to say,
00:08:45.380 | let's try the opposite.
00:08:48.220 | And I didn't have a clue
00:08:49.500 | as to how to make a hopping robot work,
00:08:51.820 | not balance in 3D.
00:08:54.440 | In fact, when I started,
00:08:56.820 | it was all just about the energy of bouncing.
00:08:58.860 | And I was gonna have a springy thing in the leg
00:09:01.700 | and some actuator
00:09:03.220 | so that you could get an energy regime going of bouncing.
00:09:07.980 | And the idea that balance was an important part of it
00:09:10.060 | didn't come until a little later.
00:09:12.580 | And then I made the one-legged, the pogo stick robots.
00:09:17.160 | Now I think that we need to do that in manipulation.
00:09:20.220 | If you look at robot manipulation,
00:09:22.860 | we, a community, has been working on it for 50 years.
00:09:26.840 | We're nowhere near human levels of manipulation.
00:09:30.340 | I mean, it's come along,
00:09:32.900 | but I think it's all too safe.
00:09:35.100 | And I think trying to break out of that safety thing
00:09:39.140 | of static grasping,
00:09:40.980 | if you look at a lot of work that goes on,
00:09:43.580 | it's about the geometry of the part,
00:09:45.380 | and then you figure out how to move your hand
00:09:48.180 | so that you can position it with respect to that,
00:09:50.340 | and then you grasp it carefully, and then you move it.
00:09:53.180 | Well, that's not anything like how people and animals work.
00:09:56.380 | We juggle in our hands.
00:09:58.020 | We hug multiple objects and can sort them.
00:10:01.040 | So now, to be fair,
00:10:04.880 | being more aggressive is gonna mean
00:10:07.460 | things aren't gonna work very well for a while.
00:10:09.260 | So it's a longer-term approach to the problem.
00:10:13.280 | And that's just theory now.
00:10:16.020 | Maybe that won't pay off,
00:10:17.020 | but that's sort of how I'm trying to think about it,
00:10:19.140 | trying to encourage our group to go at it.
00:10:22.900 | - Well, yeah, I mean, we'll talk about what it means to,
00:10:25.860 | what is the actual thing we're trying to optimize
00:10:29.280 | for a robot.
00:10:30.580 | Sometimes, especially with human-robot interaction,
00:10:33.020 | maybe flaws is a good thing.
00:10:36.240 | Perfection is not necessarily the right thing to be chasing.
00:10:38.580 | Just like you said, maybe being good at fumbling an object,
00:10:42.340 | being good at fumbling might be the right thing to optimize
00:10:46.060 | versus perfect modeling of the object
00:10:50.060 | and perfect movement of the arm to grasp that object,
00:10:54.180 | because maybe perfection is not supposed to exist
00:10:56.420 | in the real world.
00:10:57.260 | - I don't know if you know my friend, Matt Mason,
00:10:59.220 | who is the director of the Robotics Institute
00:11:02.780 | at Carnegie Mellon,
00:11:03.620 | and we go back to graduate school together.
00:11:05.940 | But he analyzed a movie of Julia Child
00:11:10.200 | doing a cooking thing.
00:11:12.180 | And she did, I think he said something like
00:11:14.620 | there were 40 different ways that she handled a thing,
00:11:17.820 | and none of them was grasping.
00:11:19.620 | She would nudge, roll, flatten with her knife,
00:11:24.460 | things like that, and none of them was grasping.
00:11:28.180 | - So, okay, let's go back to the early days.
00:11:30.300 | First of all, you've created and led the Leg Lab,
00:11:33.860 | the legendary Leg Lab at MIT.
00:11:35.980 | So what was that first hopping robot?
00:11:38.380 | - But first of all, the Leg Lab actually started
00:11:40.540 | at Carnegie Mellon. - Carnegie Mellon.
00:11:41.940 | - So I was a professor there starting in 1980,
00:11:45.660 | until about 1986.
00:11:48.680 | And so that's where the first hopping machines were built,
00:11:52.420 | starting, I guess we got the first one working
00:11:54.940 | in about 1982, something like that.
00:11:57.980 | That was a simplified one.
00:11:59.900 | Then we got a three-dimensional one in 1983.
00:12:03.420 | The quadruped that we built at the Leg Lab,
00:12:06.620 | the first version, was built in about 1984 or five,
00:12:10.840 | and really only got going about '86 or so,
00:12:15.020 | and it took years of development to get it to--
00:12:17.300 | - Let's just pause here.
00:12:18.380 | For people who don't know, I'm talking to Marc Raibert.
00:12:20.620 | Founder of Boston Dynamics, but before that,
00:12:23.600 | you were a professor developing some of the most
00:12:26.620 | incredible robots for 15 years,
00:12:28.580 | and before that, of course, a grad student and all that.
00:12:30.780 | So you've been doing this for a really long time.
00:12:32.300 | So you skipped over this,
00:12:34.060 | but go to the first hopping robot.
00:12:36.220 | There's videos of some of this.
00:12:38.020 | I mean, these are incredible robots.
00:12:39.580 | So you talked about the very first step
00:12:43.500 | was to get a thing hopping, up and down,
00:12:47.160 | and then you realized, well, balancing
00:12:49.580 | is a thing you should care about,
00:12:51.440 | and it's actually a solvable problem.
00:12:53.800 | Can you just go through how to create that robot?
00:12:56.300 | - Sure.
00:12:58.220 | - What was involved in creating that robot?
00:13:00.060 | - Well, I'm gonna start on the, not the technical side,
00:13:03.480 | but the, I guess we could call it the motivational side,
00:13:07.160 | or the funding side.
00:13:08.680 | So before Carnegie Mellon, I was actually at JPL,
00:13:11.980 | at Jet Propulsion Lab, for three years,
00:13:14.220 | and while I was there, I connected up with Ivan Sutherland,
00:13:18.580 | who is sometimes regarded as the father of computer graphics,
00:13:23.220 | 'cause of work he did both at MIT
00:13:25.100 | and then University of Utah, and Evans and Sutherland.
00:13:28.220 | Anyway, I got to know him,
00:13:32.180 | and at one point, he said he encouraged me
00:13:34.700 | to do some kind of project at Caltech,
00:13:38.760 | even though I was at JPL.
00:13:40.160 | You know, those are kind of related institutions,
00:13:43.920 | and so I thought about it,
00:13:47.580 | and I made up a list of three possible projects,
00:13:51.840 | and I purposely made the top one and the bottom one
00:13:54.560 | really boring-sounding,
00:13:56.000 | and in the middle, I put Pogo Stick Robot,
00:14:00.280 | and when he looked at it, you know,
00:14:02.080 | Ivan is a brilliant guy, brilliant engineer,
00:14:07.080 | and a real cultivator of people.
00:14:09.300 | He looked at it and knew right away
00:14:11.140 | which thing was worth doing,
00:14:12.820 | and so he had an endowed chair,
00:14:15.020 | so he had about $3,000 that he gave me
00:14:17.220 | to build the first model, which I went to the shop
00:14:21.220 | and with my own hands kind of made a first model,
00:14:23.860 | which didn't work and was just a beginning shot at it,
00:14:31.400 | and Ivan and I took that to Washington,
00:14:34.940 | and in those days, you could just walk into DARPA
00:14:37.340 | and walk down the hallway and see who's there.
00:14:40.460 | Ivan, who had been there in his previous life,
00:14:43.700 | and so we walked around and we looked in the offices.
00:14:46.740 | Of course, I didn't know anything.
00:14:47.900 | You know, I was basically a kid,
00:14:49.180 | but Ivan knew his way around,
00:14:50.700 | and we found Craig Fields in his office.
00:14:54.380 | Craig later became the director of DARPA,
00:14:56.180 | but in those days, he was a program manager,
00:14:58.420 | and so we went in.
00:14:59.240 | I had a little Samsonite suitcase.
00:15:01.140 | We opened and it had just the skeleton
00:15:03.980 | of this one-legged hopping robot, and we showed it to him,
00:15:07.440 | and you could almost see the drool going down his chin.
00:15:11.180 | - It was exciting. - Of excitement,
00:15:12.580 | and he sent me $250,000.
00:15:14.620 | He said, "Okay, I wanna fund this,"
00:15:19.620 | and I was between institutions.
00:15:21.220 | I was just about to leave JPL,
00:15:22.780 | and I hadn't decided yet where I was going next,
00:15:25.980 | and then when I landed at CMU, he sent $250,000,
00:15:30.420 | which in 1980 was a lot of research money.
00:15:34.060 | - Did you see the possibility of where this is going,
00:15:36.460 | why this is an important problem?
00:15:38.940 | - No. - The balance.
00:15:40.060 | I mean, it has to do with legged locomotion.
00:15:43.740 | I mean, it has to do with all these problems
00:15:45.500 | that the human body solves when we're walking, for example.
00:15:49.820 | Like, all the fundamentals are there.
00:15:51.460 | - Yeah, I mean, I think that was the motivation
00:15:54.140 | to try and get more at the fundamentals of how animals work,
00:15:57.060 | but the idea that it would result in machines
00:16:00.180 | that were anything like practical, like we're making now,
00:16:03.900 | that wasn't anywhere in my head, no.
00:16:06.060 | You know, as an academic,
00:16:07.660 | I was mostly just trying to do the next thing,
00:16:10.500 | you know, make some progress,
00:16:12.820 | impress my colleagues if I could.
00:16:14.620 | - And have fun. - And have fun.
00:16:16.020 | - Pogo stick robot. - Pogo stick robot.
00:16:18.100 | - So what was, on the technical side,
00:16:19.700 | what are some of the challenges of getting up,
00:16:22.700 | getting to the point where we saw, like in the video,
00:16:25.860 | the pogo stick robot that's actually successfully hopping,
00:16:28.620 | and then eventually doing flips and all this kind of stuff?
00:16:31.540 | - Well, in the very early days,
00:16:32.900 | I needed some better engineering than I had,
00:16:35.580 | than I could do myself, and I hired Ben Brown.
00:16:39.420 | We each had our way of contributing to the design,
00:16:42.940 | and we came up with a thing that could start to work.
00:16:46.500 | I had some stupid ideas
00:16:47.820 | about how the actuation system should work,
00:16:50.580 | and we, you know, we sorted that out.
00:16:52.860 | It wasn't that hard to make it balanced
00:16:54.660 | once you get the physical machine to be working well enough,
00:16:58.860 | and have enough control over the degrees of freedom.
00:17:01.460 | And then we very quickly, you know,
00:17:04.180 | we started out by having it floating on an inclined air table
00:17:08.100 | and then that only gave us like six foot of travel,
00:17:11.220 | so once it started working,
00:17:13.020 | we switched to a thing that could run around the room
00:17:15.940 | on another device.
00:17:17.820 | It's hard to explain these without you seeing them,
00:17:19.500 | but you probably know what I'm talking about, a planarizer.
00:17:22.500 | And then the next big step was to make it work in 3D,
00:17:26.260 | which that was really the scary part,
00:17:28.180 | with these simple things.
00:17:29.540 | You know, people had inverted pendulums at the time
00:17:31.620 | for years and they could control them
00:17:33.660 | by driving a cart back and forth,
00:17:35.700 | but could you make it work in three dimensions
00:17:38.180 | while it's bouncing and all that?
00:17:40.540 | But it turned out, you know, not to be that hard to do,
00:17:43.780 | at least at the level of performance we achieved at the time.
00:17:46.580 | - So, okay, you mentioned inverted pendulum,
00:17:48.380 | like, can you explain how a hopping stick in 3D
00:17:53.380 | can control, can balance itself?
00:17:57.900 | - Yeah, sure.
00:17:58.740 | - What does the actuation look like?
00:18:01.220 | - You know, the simple story
00:18:02.260 | is that there's three things going on.
00:18:04.420 | There's something making it bounce.
00:18:06.460 | And, you know, we had a system
00:18:08.580 | that was estimating how high the robot was off the ground
00:18:13.580 | and using that, you know,
00:18:16.340 | there's energy that can be in three places in a pogo stick.
00:18:19.740 | One is in the spring, one is in the altitude,
00:18:22.820 | and the other is in the velocity.
00:18:24.940 | And so when at the top of the hop, it's all in the height.
00:18:29.260 | And so you could just measure how high you're going
00:18:31.820 | and thereby have an idea of a lot about the cycle
00:18:35.580 | and you could decide whether to put more energy in or less.
00:18:38.780 | So that was one element.
00:18:40.300 | Then there's a part that you decide where to put the foot.
00:18:43.620 | And if you think when you're landing on the ground
00:18:45.740 | with respect to the center of mass.
00:18:47.420 | So if you think of a pole vaulter,
00:18:49.460 | the key thing the pole vaulter has to do
00:18:51.820 | is get its body to the right place when the pole gets stuck.
00:18:56.260 | If they're too far forward,
00:18:58.740 | they kind of get thrown backwards.
00:19:00.920 | If they're too far back, they go, you know, over.
00:19:03.580 | And what they need to do is get it
00:19:04.780 | so that they go mostly up to get over the thing.
00:19:07.740 | And, you know, high jumpers is the same kind of thing.
00:19:10.940 | So there's a calculation about where to put the foot.
00:19:14.080 | And we did something, you know, relatively simple.
00:19:16.500 | And then there's a third part to keep the body
00:19:18.860 | at an attitude that's upright.
00:19:20.380 | 'Cause if it gets too far, you know,
00:19:22.100 | you could hop and just keep rotating around.
00:19:24.820 | But if it gets too far,
00:19:26.380 | then you run out of motion of the joints at the hips.
00:19:29.900 | So you have to do that.
00:19:31.580 | And we did that by applying a torque
00:19:34.060 | between the legs and the body
00:19:35.420 | every time the foot's on the ground.
00:19:36.700 | You only can do it while the foot's on the ground.
00:19:38.740 | In the air, you know, the physics don't work out.
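
The three-part decomposition he lays out here (regulate hop energy, place the foot relative to the center of mass, servo body attitude during stance) can be pinned down in a few lines. The following is a minimal Python sketch; the gains, state variables, and helper are illustrative assumptions, not Boston Dynamics code:

```python
G = 9.81  # gravity, m/s^2

def apex_height(z, z_dot):
    """Energy bookkeeping: a pogo stick's energy lives in the spring, the
    altitude, or the velocity; at the top of the hop it is all height, so
    the predicted apex summarizes the whole bounce cycle."""
    return z + z_dot * z_dot / (2.0 * G)

def hop_thrust(apex, desired_apex, k_energy):
    """Part 1: regulate hopping energy. Decide how much thrust to add on
    the next stance from how high the last hop went."""
    return k_energy * (desired_apex - apex)

def foot_target(x_dot, x_dot_des, stance_time, k_speed):
    """Part 2: foot placement relative to the center of mass. Landing at
    the neutral point (x_dot * Ts / 2) holds speed; landing ahead of it
    brakes and behind it accelerates -- the pole-vaulter effect."""
    return x_dot * stance_time / 2.0 + k_speed * (x_dot - x_dot_des)

def hip_torque(pitch, pitch_rate, foot_on_ground, kp, kd):
    """Part 3: keep the body upright by torquing between leg and body,
    but only during stance; in flight that torque just swings the leg."""
    return -kp * pitch - kd * pitch_rate if foot_on_ground else 0.0
```
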
00:19:42.860 | - How far does it have to tilt before it's too late
00:19:45.320 | to be able to balance itself?
00:19:47.200 | Or it's impossible to balance itself, correct itself?
00:19:50.040 | - Well, you're asking an interesting question
00:19:52.240 | because in those days, we didn't actually optimize things.
00:19:57.240 | And they probably could've gone much further than we did
00:20:01.040 | and then had higher performance.
00:20:02.760 | And we just kind of got, you know,
00:20:04.440 | a sketch of a solution and worked on that.
00:20:06.920 | And then in years since, some people working for us,
00:20:09.320 | some people working for others,
00:20:10.840 | people came up with all kinds of equations for,
00:20:14.640 | or, you know, algorithms for how to do a better job,
00:20:17.980 | be able to go faster.
00:20:19.840 | One of my students worked on getting things to go faster.
00:20:22.200 | Another one worked on climbing over obstacles.
00:20:26.180 | 'Cause when you're running, it's one,
00:20:28.440 | on the open ground, it's one thing.
00:20:29.960 | If you're running, like, up a stair,
00:20:32.700 | you have to adjust where you are.
00:20:34.640 | Otherwise, things don't work out right.
00:20:36.080 | You land your foot on the edge of the step.
00:20:37.800 | So there's other degrees of freedom to control
00:20:40.320 | if you're getting to, you know,
00:20:41.920 | more realistic, practical situations.
00:20:44.120 | - I think it's really interesting
00:20:44.960 | to ask about the early days.
00:20:46.200 | 'Cause, you know, believing in yourself,
00:20:49.120 | believing that there's something interesting here.
00:20:51.120 | And then you mentioned finding somebody else, Ben Brown.
00:20:54.000 | What's that like, finding other people
00:20:55.600 | with whom you can build this crazy idea
00:20:59.000 | and actually make it work?
00:21:00.420 | - Probably the smartest thing I ever did
00:21:02.600 | is to find the other people.
00:21:04.200 | I mean, when I look at it now, you know,
00:21:05.540 | I look at Boston Dynamics
00:21:06.760 | and all the really excellent engineering there.
00:21:10.000 | You know, people who really make stuff work.
00:21:12.460 | You know, I'm only the dreamer.
00:21:16.320 | - So when you talk about pogo stick robot
00:21:18.740 | or legged robots, whether it's quadrupeds
00:21:22.200 | or humanoid robots, did people doubt that this is possible?
00:21:26.800 | Did you experience a lot of people around you kinda--
00:21:29.200 | - I don't know if they doubted whether it was possible,
00:21:31.440 | but I think they thought it was a waste of time.
00:21:34.560 | - Oh, it's not even an interesting problem.
00:21:36.600 | - I think for a lot of people, you know,
00:21:38.040 | people who were, I think it's been both, though.
00:21:41.640 | Some people, I think, I felt like they were saying,
00:21:45.180 | oh, you know, why are you wasting your time
00:21:47.440 | on this stupid problem?
00:21:49.240 | And then, but then I've been at many things
00:21:51.240 | where people have told me it's been an inspiration
00:21:53.960 | to go out and, you know, attack these harder things.
00:21:58.960 | And I think it has turned out,
00:22:03.140 | I think legged locomotion has turned out
00:22:04.960 | to be a useful thing.
00:22:06.400 | - Did you ever have doubt about bringing Atlas to life,
00:22:10.040 | for example, or with Big Dog, just every step of the way,
00:22:14.040 | did you have doubt, like, this is too hard of a problem?
00:22:17.960 | - I mean, at first, I wasn't an enthusiast
00:22:20.720 | for the humanoids, 'cause, again, it goes back
00:22:23.280 | to saying what's the functionality?
00:22:25.940 | And the form wasn't as important as the functionality.
00:22:28.900 | And also, you know, there's an aspect to humanoid robots
00:22:35.140 | that's about, all about the cosmetics,
00:22:38.080 | where there isn't really other functionality,
00:22:40.040 | and that kind of is off-putting for me.
00:22:42.920 | As a roboticist, I think the functionality really matters.
00:22:46.040 | So probably that's why I avoided human robots,
00:22:49.640 | humanoid robots to start with.
00:22:51.800 | But I'll tell you, now, you know,
00:22:54.680 | after we started working on 'em,
00:22:55.880 | you could see that the connection and the impact
00:22:58.400 | with other people, whether they're lay people
00:23:02.400 | or even other technical people,
00:23:04.640 | there's a special thing that goes on.
00:23:08.240 | Even though most of the humanoid robots
00:23:09.860 | aren't that much like a person.
00:23:11.520 | - But we anthropomorphize, and we see the humanity.
00:23:15.640 | But also, like, with Spot, you can see, not the humanity,
00:23:20.100 | but whatever we find compelling about social interactions,
00:23:24.100 | there in Spot as well.
00:23:25.240 | - I'll tell you, you know, I go around giving talks
00:23:26.940 | and take Spot to a lot of them, and it's amazing.
00:23:30.780 | The media likes to say that they're terrifying,
00:23:32.820 | and that people are afraid.
00:23:34.420 | And YouTube commenters like to say that it's frightening.
00:23:39.020 | But when you take a Spot out there,
00:23:41.420 | now, maybe it's self-selecting,
00:23:42.740 | but you get a crowd of people who wanna take pictures,
00:23:45.480 | wanna pose for selfies, wanna operate the robot,
00:23:48.700 | wanna pet it, wanna put clothes on it, it's amazing.
00:23:52.700 | - Yeah, I love Spot.
00:23:53.980 | So if we move around history a little bit,
00:23:56.820 | so you said, I think, in the early days of Boston Dynamics,
00:24:00.680 | that you quietly worked on making a running version
00:24:03.340 | of AIBO, Sony's robot dog.
00:24:06.580 | It's just an interesting little tidbit of history for me.
00:24:10.000 | What stands out to you in memory from that task?
00:24:15.140 | For people who don't know,
00:24:15.980 | that little dog robot moves slowly.
00:24:19.380 | How did that become Big Dog?
00:24:20.860 | What was involved there?
00:24:22.260 | What was the dance between,
00:24:23.500 | how do we make this cute little dog
00:24:25.060 | versus a thing that can actually carry a lot of payload
00:24:27.420 | and move fast and stuff like that?
00:24:29.180 | - What the connection was is that, at that point,
00:24:31.820 | Boston Dynamics was mostly
00:24:33.300 | a physics-based simulation company.
00:24:35.500 | So when I left MIT to start Boston Dynamics,
00:24:39.180 | you know, there was a few years of overlap,
00:24:40.500 | but the concept wasn't to start a robot company.
00:24:42.620 | The concept was to use this dynamic simulation tool
00:24:46.980 | that we developed to do robotics for other things.
00:24:50.360 | But working with Sony, we got back into robotics
00:24:54.660 | by doing the AIBO Runner, and we made
00:24:57.620 | some tools for programming QRIO,
00:25:00.380 | which was a small, a humanoid this big,
00:25:03.420 | that could do some dancing and other kinds of fun stuff.
00:25:06.540 | And I don't think it ever reached the market,
00:25:08.540 | even though they did show it.
00:25:10.440 | You know, when I look back,
00:25:12.820 | I say that it got us back where we belonged.
00:25:14.940 | - Yeah, you rediscovered the soul of the company.
00:25:17.780 | - That's right.
00:25:18.940 | - And so from there, it was always about robots.
00:25:21.220 | - Yeah.
00:25:22.060 | - So you started Boston Dynamics in 1992.
00:25:27.500 | - Right.
00:25:28.340 | - Some fond memories from the early days.
00:25:30.900 | - One of the robots that we built wasn't actually a robot.
00:25:36.540 | It was a surgical simulator, but it had force feedback,
00:25:40.380 | so it had all the techniques of robotics.
00:25:43.500 | And you look down into this mirror, it actually was,
00:25:47.820 | and it looked like you were looking down
00:25:49.620 | onto the body you were working on.
00:25:51.380 | Your hands were underneath the mirror,
00:25:52.780 | so they were where you were looking.
00:25:55.100 | And you had tools in your hands
00:25:56.800 | that were connected up to these force feedback devices
00:25:59.660 | made by another MIT spinout, SensAble Technologies.
00:26:03.780 | So they made the force feedback device,
00:26:05.380 | we attached the tools, and we wrote all the software,
00:26:08.380 | and did all the graphics.
00:26:09.460 | So we had 3D computer graphics.
00:26:11.700 | It was in the old days, this was in the late '90s,
00:26:14.460 | when you had a silicon graphics computer
00:26:17.460 | that was about this big.
00:26:18.720 | You know, it was the heater in the office, basically.
00:26:22.460 | - Nice.
00:26:24.300 | - And we were doing surgical operations,
00:26:26.840 | anastomosis, which was stitching tubes together,
00:26:30.400 | you know, tubes like blood vessels
00:26:31.840 | or other things in their body.
00:26:33.700 | And you could feel, and you could see the tissues move,
00:26:37.200 | and it was really exciting.
00:26:38.720 | And the idea was to make a trainer
00:26:40.560 | to teach surgeons how to do stuff.
00:26:42.860 | We built a scoring system, 'cause we interviewed surgeons
00:26:46.400 | that told us, you know, what you're supposed to do
00:26:49.280 | and what you're not supposed to do.
00:26:50.360 | You're not supposed to tear the tissue,
00:26:52.520 | you're not supposed to touch it in any place
00:26:54.360 | except for where you're trying to engage.
00:26:56.140 | There were a bunch of rules.
00:26:57.680 | So we built this thing and took it to a trade show,
00:27:00.200 | a surgical trade show.
00:27:02.480 | And the surgeons were practically lined up.
00:27:04.960 | Well, we kept the score, and we posted their scores,
00:27:07.560 | like on a video game.
00:27:08.800 | And those guys are so competitive
00:27:10.820 | that they really, really loved doing it.
00:27:13.680 | And they would come around,
00:27:14.520 | and they'd see someone's score was higher there,
00:27:16.020 | so they would come back.
00:27:18.020 | But we figured out, shortly after,
00:27:20.480 | that we thought surgeons were gonna pay us
00:27:22.920 | to get trained on these things.
00:27:24.880 | And the surgeons thought we should pay them
00:27:27.880 | in order to, so they could teach us about the thing.
00:27:31.640 | And there was no money from the surgeons.
00:27:34.160 | And we looked at it and thought,
00:27:35.360 | well, maybe we could sell it to hospitals
00:27:37.280 | that would train their surgeons.
00:27:39.280 | And then we said, well, we're this,
00:27:40.920 | at the time, we were probably a 12-person company
00:27:43.000 | or maybe 15 people, I don't remember.
00:27:44.840 | There's no way we could go after a marketing activity.
00:27:49.100 | You know, the company was all bootstrapped in those years.
00:27:51.440 | We never had investors until Google bought us,
00:27:54.560 | which was after 20 years.
00:27:56.480 | So we didn't have any resources to go after hospitals.
00:28:00.440 | So one day, Rob and I were looking at that,
00:28:05.280 | and we said, we'd built another simulator
00:28:07.820 | for knee arthroscopy, and we said, this isn't gonna work.
00:28:11.760 | And we killed it, and we moved on.
00:28:14.720 | And that was really a milestone in the company,
00:28:17.120 | because we sort of understood who we were
00:28:20.260 | and what would work and what wouldn't,
00:28:22.260 | even though technically it was really a fascinating thing.
00:28:24.940 | - What was that meeting like?
00:28:26.500 | Were you just sitting at a table, you know what?
00:28:28.900 | - Probably.
00:28:30.860 | - We're going to pivot completely.
00:28:32.500 | We're going to let go of this thing
00:28:34.900 | we put so much hard work into,
00:28:37.380 | and then go back to the thing we came from.
00:28:39.060 | - It just always felt right once we did it, you know?
00:28:42.060 | - Just look at each other and said, let's build robots.
00:28:45.340 | - What was the first robot you built
00:28:47.900 | under the flag of Boston Dynamics, Big Dog?
00:28:51.320 | - Well, there was the AIBO runner,
00:28:54.680 | but it wasn't even a whole robot.
00:28:55.880 | It was just legs that we, we took off the legs on AIBOs
00:28:59.080 | and attached legs we'd made.
00:29:01.640 | And you know, we got that working
00:29:03.960 | and showed it to the Sony people.
00:29:05.640 | We worked pretty closely with Sony in those years.
00:29:09.060 | One of the interesting things is that
00:29:11.520 | it was before the internet and Zoom and anything like that.
00:29:15.780 | So we had six ISDN lines installed,
00:29:19.420 | and we would have a telecon every week
00:29:21.700 | that worked at very low frame rates,
00:29:23.660 | something like 10 hertz.
00:29:25.260 | You know, English across the boundary with Japan
00:29:30.600 | was a challenge, trying to understand
00:29:32.660 | what each of us was saying and have meetings every week
00:29:36.520 | for several years doing that.
00:29:39.240 | And it was a pleasure working with them.
00:29:42.020 | They were really supporters.
00:29:43.620 | They seemed to like us and what we were doing.
00:29:46.700 | That was the real transition
00:29:47.860 | from us being a simulation company
00:29:49.780 | into being a robotics company again.
00:29:51.860 | - It was a quadruped, the legs were four legs or two legs?
00:29:55.060 | - Yeah, no, four legs, yeah.
00:29:56.340 | - And what did you learn from that experience
00:29:58.600 | of building basically a fast-moving quadruped?
00:30:02.880 | - Mostly we learned that something that small
00:30:07.380 | doesn't look very exciting when it's running.
00:30:09.360 | It's like it's scampering, and you had to watch a slow-mo
00:30:12.760 | for it to look like it was interesting.
00:30:14.800 | If you watch it fast, it was just like a--
00:30:17.680 | - That's funny.
00:30:18.520 | - One of my things was to show stuff in video,
00:30:21.040 | from the very early days of the hopping machines.
00:30:23.480 | And so I was always focused on
00:30:26.340 | how's this gonna look through the viewfinder.
00:30:28.000 | And running AIBO didn't look so cool through the viewfinder.
00:30:31.520 | - So what came next in terms of,
00:30:36.360 | what was a big next milestone in terms of a robot you built?
00:30:40.100 | - I mean, you gotta say that Big Dog was,
00:30:42.700 | sort of put us on the map
00:30:44.700 | and got our heads really pulled together.
00:30:47.100 | We scaled up the company.
00:30:48.800 | Big Dog was the result of Alan Rudolph at DARPA
00:30:53.460 | starting a Biodynotics program.
00:30:57.180 | And he put out a request for proposals,
00:31:00.340 | and I think there were 42 proposals written,
00:31:05.040 | and three got funded.
00:31:06.660 | One was Big Dog, one was a climbing robot, RISE.
00:31:10.220 | And you know, that put things in motion.
00:31:11.880 | We hired Martin Buehler.
00:31:14.320 | He was a professor in Montreal at McGill.
00:31:17.880 | He was incredibly important for getting Big Dog
00:31:21.480 | out of the lab and into the mud,
00:31:23.560 | which was a key step to really be willing
00:31:26.640 | to go out there and build it, break it, fix it,
00:31:30.000 | which is sort of one of our mottos at the company.
00:31:32.260 | - So testing it in the real world.
00:31:33.640 | - Testing. - For people who don't know,
00:31:35.200 | Big Dog, maybe you can correct me,
00:31:36.920 | but it's a big quadruped, four-leg robot.
00:31:40.500 | It looks big, could probably carry a lot of weight.
00:31:44.500 | Not the most weight that Boston Dynamics have built,
00:31:47.500 | but a lot.
00:31:48.720 | - Well, it was the first thing that worked.
00:31:50.160 | So let's see, if we go back to the leg lab,
00:31:51.940 | we built a quadruped that could do many of the things
00:31:54.800 | that Big Dog did, but it had a hydraulic pump
00:31:58.560 | sitting in the room with hoses connected to the robot.
00:32:01.200 | It had a VAX computer in the next room,
00:32:03.920 | it needed its own room 'cause it was this giant thing
00:32:06.360 | with air conditioning, and it had this very complicated
00:32:09.960 | bus connected to the robot.
00:32:12.120 | And the robot itself just had the actuators,
00:32:14.480 | it had gyroscopes for sensing and some other sensors,
00:32:18.720 | but all the power and computing was off-board.
00:32:22.220 | Big Dog had all that stuff integrated on the platform.
00:32:26.560 | It had a gasoline engine for power,
00:32:28.360 | which was a very complicated thing to undertake.
00:32:31.920 | It had to convert the rotation of the engine
00:32:34.760 | into hydraulic power, which is how we actuated it.
00:32:38.960 | So there was a lot of learning just on the,
00:32:42.240 | building the physical robot and the system integration
00:32:45.660 | for that, and then there was the controls of it.
00:32:49.120 | - So for Big Dog, you brought it all together
00:32:51.020 | onto one platform, and then so you could--
00:32:54.640 | - You could take it out in the woods.
00:32:55.720 | - Yeah, and you did.
00:32:57.120 | - We did.
00:32:57.960 | We spent a lot of time down at the Marine Corps base
00:33:01.280 | in Quantico where there was a trail
00:33:03.440 | called the Guadalcanal Trail, and our milestone
00:33:08.440 | that DARPA had specified was that we could go
00:33:11.080 | on this one particular trail that involved
00:33:14.200 | a lot of challenge, and we spent a lot of time,
00:33:16.560 | our team spent a lot of time down there.
00:33:19.560 | Those were fun days.
00:33:20.640 | - Hiking with the robot.
00:33:21.760 | So what did you learn about what it takes
00:33:23.760 | to balance a robot like that on a trail,
00:33:27.640 | on a hiking trail in the woods?
00:33:29.080 | Well, basically, forget the woods, just the real world.
00:33:32.080 | That's the big leap into testing in the real world.
00:33:35.160 | - Yeah, as challenging as the woods were,
00:33:39.160 | working inside of a home or in an office is really harder.
00:33:44.160 | - Yeah.
00:33:46.120 | - Because when you're in the woods,
00:33:46.960 | you can actually take any path up the hill.
00:33:49.800 | All you have to do is avoid the obstacles.
00:33:52.320 | There's no such thing as damaging the woods,
00:33:54.560 | at least to first order.
00:33:57.160 | Whereas if you're in a house, you can't leave scuff marks,
00:33:59.520 | you can't bang into the walls.
00:34:00.840 | The robots aren't very comfortable bumping into the walls,
00:34:03.040 | especially in the early days.
00:34:05.320 | So I think those were actually bigger challenges
00:34:08.080 | once we faced them.
00:34:09.480 | It was mostly getting the systems to work well enough
00:34:14.280 | to gather the hardware systems to work, and the controls.
00:34:18.000 | In those days, we did have a human operator
00:34:20.760 | who did all the visual perception
00:34:23.680 | going up the Guadalcanal Trail.
00:34:25.320 | So there was an operator who was right there
00:34:27.560 | who was very skilled at,
00:34:29.840 | even though the robot was balancing itself
00:34:31.800 | and placing its own feet,
00:34:33.760 | if the operator didn't do the right thing, it wouldn't go.
00:34:36.400 | But years later, we went back with one of the electric,
00:34:39.200 | the precursor to SPOT,
00:34:41.280 | and we had advanced the controls and everything so much
00:34:44.840 | that an amateur, complete amateur,
00:34:47.800 | could operate the robot the first time,
00:34:50.280 | up and down and up and down,
00:34:52.080 | whereas it had taken us years to get there
00:34:54.400 | in the previous robot.
00:34:55.920 | - So if you fast forward, Big Dog eventually became SPOT.
00:34:59.520 | - So Big Dog became LS3,
00:35:01.240 | which is the big load-carrying one.
00:35:03.080 | - Just a quick pause, it can carry 400 pounds.
00:35:07.480 | - It was designed to carry 400,
00:35:09.040 | but we had it carrying about 1,000 pounds at one time.
00:35:12.160 | - Of course you did, just to make sure.
00:35:14.440 | - We had one carrying the other one.
00:35:16.000 | We had two of them, so we had one carrying the other one.
00:35:18.440 | There's a little clip of that.
00:35:19.320 | We should put that out somewhere.
00:35:20.880 | That's from like 20 years ago.
00:35:22.680 | - Wow, wow.
00:35:24.320 | And it can go for very long distances.
00:35:26.360 | It can travel 20 miles.
00:35:28.200 | - Yeah, gasoline.
00:35:30.400 | - Gasoline, yeah.
00:35:31.880 | And that eventually, just, okay, sorry.
00:35:33.760 | So LS3, then what, how did that lead to SPOT?
00:35:38.080 | - So Big Dog and LS3 had
00:35:41.200 | engine power and hydraulic actuation.
00:35:46.800 | Then we made a robot that was
00:35:51.000 | electric power, so there's a battery driving a motor,
00:35:55.480 | driving a pump, but still hydraulic actuation.
00:35:59.440 | Larry sort of asked us,
00:36:00.440 | "Could you make something that weighed 60 pounds
00:36:02.400 | "that would not be so intimidating
00:36:04.400 | "if you had it in a house where there were people?"
00:36:07.240 | And that was the inspiration behind the SPOT,
00:36:10.080 | pretty much as it exists today.
00:36:11.320 | We did a prototype the same size
00:36:13.400 | that was the first all-electric, non-hydraulic robot.
00:36:18.400 | - What was the conversation with Larry Page like
00:36:20.800 | about, so here's a guy that kind of is very product-focused
00:36:25.760 | and can see a vision for what the future holds.
00:36:28.760 | That's just interesting kind of aside.
00:36:31.640 | What was the brainstorm
00:36:33.000 | about the future robotics with him like?
00:36:35.600 | - I mean, it was almost as simple as what I just said.
00:36:37.720 | He, you know, we were having a meeting.
00:36:39.760 | He said, "Yeah, geez, you know,
00:36:41.240 | "do you think you could make a smaller one
00:36:42.760 | "that wouldn't be so intimidating, like a big dog,
00:36:45.380 | "if it was in your house?"
00:36:47.920 | And I said, "Yeah, we could do that."
00:36:50.120 | And we started and did.
00:36:52.600 | - Is there a lot of technical challenges
00:36:54.080 | to go from hydraulic to electric?
00:36:57.080 | - You know, I had been in love with hydraulics
00:36:59.120 | and still love hydraulics.
00:37:02.120 | You know, it's a great technology.
00:37:03.600 | It's too bad that somehow the world out there
00:37:07.640 | looks at it like it's old-fashioned or that it's icky.
00:37:12.600 | And it's true that it is very hard to keep it
00:37:14.960 | from having some amount of dripping from time to time.
00:37:18.880 | But if you look at the performance, you know,
00:37:21.400 | how strong you can get in a lightweight package,
00:37:24.320 | and of course, we did a huge amount of innovation.
00:37:26.640 | Most of hydraulic control, that is the valve
00:37:30.680 | that controls the flow of oil,
00:37:32.640 | had been designed in the '50s for airplanes.
00:37:36.360 | It had been made robust enough, safe enough,
00:37:39.880 | that you could count on it
00:37:41.000 | so that humans could fly in airplanes.
00:37:44.620 | And very little innovation had happened.
00:37:47.560 | You know, that might not be fair to the people
00:37:49.080 | who make the valves.
00:37:49.920 | I'm sure that they did innovate.
00:37:52.240 | But the basic design had stayed the same,
00:37:54.320 | and there was so much more you could do.
00:37:56.240 | And so our engineers designed valves,
00:37:59.480 | the ones that are in Atlas, for instance,
00:38:03.480 | that had new kinds of circuits.
00:38:05.280 | They sort of did some of the computing
00:38:06.860 | that could get you much more efficient use.
00:38:09.280 | They were much smaller and lighter,
00:38:10.820 | so that the whole robot could be smaller and lighter.
00:38:13.760 | We made a hydraulic power supply
00:38:16.480 | that had a bunch of components
00:38:18.960 | integrated in this tiny package.
00:38:20.600 | It's about this big, the size of a football.
00:38:23.320 | It weighs five kilograms,
00:38:25.880 | and it produces five kilowatts of power.
00:38:28.920 | Of course, it has to have a battery operating,
00:38:31.080 | but it's got a motor, a pump, filters,
00:38:34.280 | heat exchanger to keep it cool,
00:38:36.600 | some valves, all in this tiny little package.
00:38:39.760 | So hydraulics could still have a ways to go.
00:38:44.280 | - One of the things that stands out
00:38:45.420 | about the robots Boston Dynamics have created
00:38:48.520 | is how beautiful the movement is,
00:38:50.400 | how natural the walking is, and running is,
00:38:55.120 | even flipping is, throwing is.
00:38:57.320 | So maybe you can talk about what's involved
00:39:00.080 | in making it look natural.
00:39:02.800 | - Well, I think having good hardware is part of the story,
00:39:06.400 | and people who think you don't need
00:39:08.640 | to innovate hardware anymore are wrong, in my opinion.
00:39:12.220 | So I think one of the things,
00:39:14.480 | certainly in the early years for me,
00:39:16.680 | taking a dynamic approach where you think about
00:39:19.680 | what's the evolution of the motion
00:39:22.040 | of the thing gonna be in the future,
00:39:24.920 | and having a prediction of that
00:39:26.680 | that's used at the time that you're giving signals to it,
00:39:30.260 | as opposed to it all being servoing,
00:39:32.000 | which is, servoing is sort of backward-looking.
00:39:33.880 | It says, okay, where am I now?
00:39:36.120 | I'm gonna try and adjust for that,
00:39:37.940 | but you really need to think about what's coming.
00:39:40.440 | - So how far ahead do you have to look in time?
00:39:44.040 | - It's interesting.
00:39:44.880 | I think that the number is only a couple of seconds
00:39:47.760 | for Spot, so there's a limited horizon-type approach
00:39:52.760 | where you're recalculating, assuming what's gonna happen
00:39:56.440 | in the next second or second and a half,
00:39:59.320 | and then you keep iterating.
00:40:01.080 | At the next, even though a tenth of a second later,
00:40:03.440 | you'll say, okay, let's do that again
00:40:05.280 | and see what's happening,
00:40:06.460 | and you're looking at what the obstacles are,
00:40:08.260 | where the feet are gonna be placed,
00:40:10.100 | how to, you have to coordinate a lot of things
00:40:12.760 | if you have obstacles and you're balancing at the same time,
00:40:16.220 | and it's that limited horizon-type calculation
00:40:19.260 | that's doing a lot of that,
00:40:20.940 | but if you're doing something like a somersault,
00:40:23.260 | you're looking out a lot further.
00:40:25.300 | If you wanna stick the landing,
00:40:26.860 | you have to get, you have to, at the time of launch,
00:40:30.660 | have momentum and rotation, all those things coordinated
00:40:36.060 | so that a landing is within reach.
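
What he is describing is a receding-horizon loop: plan a second or two ahead, execute only the first slice of the plan, then replan from the measured state. A toy one-dimensional version, with a made-up point-mass model and horizon numbers chosen only for illustration, shows the pattern:

```python
DT = 0.1         # replan every tenth of a second, as described above
HORIZON_S = 1.5  # each plan looks about a second and a half ahead

def plan_accel(x, v, target):
    """One crude 'plan': the constant acceleration that would put the
    point mass on the target at the end of the horizon, from solving
    target = x + v*t + 0.5*a*t^2 for a."""
    t = HORIZON_S
    return 2.0 * (target - x - v * t) / (t * t)

def run(x=0.0, v=0.0, target=1.0, steps=100):
    for _ in range(steps):
        a = plan_accel(x, v, target)     # fresh full-horizon plan each tick
        x += v * DT + 0.5 * a * DT * DT  # ...but only DT of it is executed
        v += a * DT
    return x, v

print(run())  # settles near (1.0, 0.0) even though each plan is crude
```
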
00:40:38.240 | - How hard is it to stick a landing?
00:40:40.460 | I mean, it's very much under-actuated,
00:40:44.060 | like you, once you've, in the air,
00:40:47.420 | you don't have as much control about anything,
00:40:51.300 | so how hard is it to get that to work?
00:40:54.300 | First of all, it did flips with a hopping robot.
00:40:56.660 | - If you look at the first time
00:40:58.940 | we ever made a robot do a somersault,
00:41:00.940 | it was in a planar robot, you know, it had a boom,
00:41:05.260 | so it was restricted to the surface of a sphere,
00:41:07.940 | we call that planar, so it could move fore and aft,
00:41:10.560 | it could go up and down, and it could rotate,
00:41:12.900 | and so the calculation of what you need to do
00:41:14.940 | to stick a landing isn't all that complicated.
00:41:19.100 | You have to look at, you know,
00:41:20.740 | you have to get time to make the rotation,
00:41:22.780 | so how high you jump gives you time.
00:41:25.720 | You look at how quickly you can rotate,
00:41:29.960 | and so, you know, if you get those two right,
00:41:32.140 | then when you land, you have the feet in the right place,
00:41:34.700 | and you have to get rid of all that
00:41:36.620 | rotational and linear momentum,
00:41:39.700 | but, you know, that's not too hard to figure out,
00:41:42.340 | and we made, you know, back in about 1985 or six,
00:41:46.420 | I can't remember, we had a simple robot doing somersaults.
00:41:50.420 | To do it in 3D, really the calculation is the same,
00:41:53.480 | you just have to be balancing
00:41:54.700 | in the other degrees of freedom.
00:41:55.940 | If you're just doing a somersault,
00:41:57.160 | it's just a planar thing.
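
The "height buys you rotation time" arithmetic is easy to make concrete. A back-of-envelope check, with an illustrative 0.5 m apex that is not a spec for any of these robots:

```python
import math

G = 9.81  # m/s^2

def flight_time(apex):
    """Airtime of a ballistic hop peaking at `apex` meters, launching and
    landing at the same height: t = 2 * sqrt(2h / g)."""
    return 2.0 * math.sqrt(2.0 * apex / G)

t = flight_time(0.5)       # ~0.64 s in the air
omega = 2 * math.pi / t    # ~9.8 rad/s of spin needed for one full flip
print(f"airtime {t:.2f} s, required spin {omega:.1f} rad/s")
# Tucking reduces the moment of inertia, raising the spin rate for the
# same angular momentum -- which is why the 3D somersault needed a tuck.
```
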
00:41:59.340 | When Rob was my graduate student and we were at MIT,
00:42:01.980 | which is when we made, you know,
00:42:03.820 | a two-legged robot do a 3D somersault for the first time,
00:42:06.780 | there we, in order to get enough rotation rate,
00:42:10.780 | you needed to do tucking also.
00:42:12.600 | You know, withdraw the legs in order to accelerate it,
00:42:15.980 | and he did some really fascinating work
00:42:18.300 | on how you stabilize more complicated maneuvers.
00:42:21.820 | You remember he was a gymnast, a champion gymnast,
00:42:24.100 | before he'd come to me, so he had the physical abilities,
00:42:28.660 | and he was, you know, an engineer,
00:42:30.340 | so he could translate some of that into the math
00:42:33.540 | and the algorithms that you need to do that.
00:42:37.020 | - He knew how humans do it.
00:42:38.580 | You just had to get robots to do the same.
00:42:41.140 | - Unfortunately, though,
00:42:42.620 | humans don't really know how they do it, right?
00:42:45.580 | We're coached, we have ways of learning,
00:42:49.620 | but do we really understand in a physics way
00:42:52.980 | what we're doing?
00:42:54.600 | Probably most gymnasts and athletes don't know.
00:42:57.780 | - So in some way, by building robots,
00:43:00.380 | you are in part understanding how humans do, like walking.
00:43:04.420 | Most of us walk without considering how we walk, really,
00:43:08.180 | and how we make it so natural and efficient,
00:43:10.260 | all those kinds of things.
00:43:11.100 | - Atlas still doesn't walk like a person,
00:43:13.100 | and it still doesn't walk quite as gracefully as a person,
00:43:15.540 | even though it's been getting closer and closer.
00:43:18.460 | The running might be close to a human,
00:43:21.220 | but the walking is still a challenge.
00:43:23.540 | - That's interesting, right,
00:43:24.780 | that running is closer to a human.
00:43:26.740 | It just shows that the more aggressive and kind of,
00:43:31.240 | the more you leap into the unknown,
00:43:33.940 | the more natural it is.
00:43:35.040 | I mean, walking is kind of falling always, right?
00:43:37.620 | - And something weird about the knee,
00:43:39.620 | that you can kind of do this folding and unfolding
00:43:43.160 | and get it to work out just,
00:43:44.580 | a human can get it to work out just right.
00:43:46.420 | There are compliances.
00:43:48.100 | Compliance means springiness in the design,
00:43:50.580 | and that's important to how it works.
00:43:52.780 | Well, we used to have a motto at Boston Dynamics
00:43:55.020 | in the early days,
00:43:55.860 | which was, "You have to run before you can walk."
00:43:58.360 | (Lex laughs)
00:44:00.540 | - That's a good motto.
00:44:02.020 | 'Cause you also had Wildcat,
00:44:05.100 | which was one of the, along the way towards SPOT,
00:44:07.460 | which is a quadruped that went 19 miles an hour
00:44:10.500 | on flat terrain.
00:44:12.060 | Is that the fastest you've ever built?
00:44:14.180 | - Oh, yeah.
00:44:15.020 | - Might be the fastest quadruped in the world, I don't know.
00:44:17.020 | - For a quadruped, probably.
00:44:18.900 | Of course, it was probably the loudest, too.
00:44:21.180 | So we had this little racing go-kart engine on it,
00:44:24.220 | and we would get people from three buildings away
00:44:27.540 | sending us complaints, you know, about how loud it was.
00:44:31.820 | - So at the Leg Lab,
00:44:33.540 | I believe most of the robots didn't have knees.
00:44:36.160 | (Lex laughs)
00:44:38.620 | What's the, how do you figure out
00:44:40.340 | what is the right number of actuators?
00:44:42.020 | What are the joints to have?
00:44:44.100 | What do you need to have?
00:44:45.820 | You know, we humans have knees,
00:44:47.500 | and all kinds of interesting stuff on the feet.
00:44:51.260 | The toe is an important part, I guess, for humans.
00:44:54.360 | Or maybe it's not.
00:44:55.520 | I injured my toe recently,
00:44:57.020 | and it made running very unpleasant.
00:44:59.120 | So that seems to be kind of important.
00:45:01.000 | So how do you figure out, for efficiency, for function,
00:45:04.480 | for aesthetics, how many joints to have,
00:45:07.760 | how many actuators to have?
00:45:09.240 | - Well, it's always a balance
00:45:10.680 | between wanting to get where you really wanna get
00:45:14.100 | and what's practical to do based on your resources
00:45:18.640 | or what you know and all that.
00:45:20.760 | So, I mean, the whole idea of the pogo stick
00:45:24.600 | was to do a simplification.
00:45:26.400 | Obviously, it didn't look like a human.
00:45:28.640 | I think a technical scientist could appreciate
00:45:31.920 | that we were capturing some of the things
00:45:33.920 | that are important in human locomotion
00:45:35.920 | without it looking like it,
00:45:37.960 | without having a knee, an ankle.
00:45:40.360 | I'll tell you, the first sketch that Ben Brown made
00:45:43.640 | when we were talking about building this thing
00:45:45.840 | was a very complicated thing with zillions of springs,
00:45:49.980 | lots of joints.
00:45:51.240 | It looked much more like a kangaroo
00:45:54.920 | or an ostrich or something like that,
00:45:57.500 | things we were paying a lot of attention to at the time.
00:46:00.300 | So my job was to say, okay, well,
00:46:05.160 | let's do something simpler to get started,
00:46:08.020 | and maybe we'll get there at some point.
00:46:10.320 | - I just love the idea that you two
00:46:12.800 | were studying kangaroos and ostriches.
00:46:14.680 | - Oh, yeah.
00:46:16.520 | We filmed and digitized data from horses.
00:46:21.520 | I did a dissection of an ostrich at one point,
00:46:25.340 | which has absolutely remarkable legs.
00:46:27.740 | - Dumb question.
00:46:28.820 | Do ostriches have a lot of musculature on the legs or no?
00:46:33.560 | - Most of it's up in the feathers,
00:46:35.260 | but there's a huge amount going on in the feathers,
00:46:37.220 | including a knee joint.
00:46:38.860 | The knee joint's way up there.
00:46:40.140 | The thing that's halfway down the leg
00:46:42.060 | that looks like a backwards knee is actually the ankle.
00:46:46.480 | The thing on the ground which looks like the foot
00:46:49.060 | is actually the toes.
00:46:50.320 | It's an extended toe.
00:46:52.800 | But the basic morphology is the same in all these animals.
00:46:57.800 | - What do you think is the most beautiful
00:47:02.320 | movement of an animal?
00:47:03.680 | Like what animal do you think is the coolest?
00:47:06.600 | Land animal, 'cause fish is pretty cool,
00:47:08.760 | like the way fish moves through water,
00:47:10.040 | but like legged locomotion.
00:47:11.840 | - You know, the slow-mo's of cheetahs running
00:47:13.880 | are incredible, you know, there's so much back motion
00:47:18.240 | and grace, and of course they're moving very fast.
00:47:22.700 | The animals running away from the cheetah
00:47:25.720 | are pretty exciting.
00:47:26.760 | You know, the pronghorn, which, you know,
00:47:29.520 | they do this all-four-legs-at-once jump called the pronk
00:47:33.520 | to kind of confuse the, especially if there's a group
00:47:36.560 | of them, to confuse whoever's chasing them.
00:47:39.280 | - So they do like a misdirection type of thing?
00:47:41.000 | - Yep, they do a misdirection thing.
00:47:42.920 | The front-on views of the cheetahs running fast
00:47:45.160 | where the tail is whipping around to help in the turns,
00:47:48.560 | to help stabilize in the turns, that's pretty exciting.
00:47:51.400 | - 'Cause they spend a lot of time in the air, I guess,
00:47:53.400 | as they're running that fast.
00:47:55.000 | - But they also turn very fast.
00:47:57.000 | - Is that a tail thing, or do you have to have contact
00:47:59.740 | with ground sometimes?
00:48:00.580 | - Everything in the body is probably helping turn,
00:48:02.600 | 'cause they're chasing something that's trying to get away
00:48:04.900 | that's also zigzagging around.
00:48:07.920 | But I would be remiss if I didn't say, you know,
00:48:10.720 | humans are pretty good too.
00:48:12.880 | You know, you watch gymnasts, especially these days,
00:48:15.840 | they're doing just incredible stuff.
00:48:19.400 | - Well, like, especially like Olympic-level gymnasts.
00:48:22.240 | See, but there could be cheetahs that are Olympic-level.
00:48:25.200 | We might be watching the average cheetah versus like,
00:48:28.240 | there could be like a really special cheetah
00:48:30.320 | that can do like-- - You're right.
00:48:32.320 | - When do the knees first come into play
00:48:35.340 | in you building legged robots?
00:48:37.640 | - In Big Dog. - Big Dog.
00:48:39.320 | - Yeah, Big Dog came first, and then Little Dog was later.
00:48:42.920 | And you know, there's a big compromise there.
00:48:47.140 | Human knees have multiple muscles,
00:48:50.300 | and you could argue that there's,
00:48:52.880 | I mean, it's a technical thing about negative work.
00:48:57.640 | When you're contracting a joint,
00:49:00.360 | but you're pushing out, that's negative work.
00:49:03.720 | And if you don't have a place to store that,
00:49:05.760 | it can be very expensive to do negative work.
00:49:08.040 | And in Big Dog, there was no place
00:49:10.680 | to store negative work in the knees.
00:49:14.000 | But Big Dog also had pogo stick springs down below,
00:49:19.480 | so part of the action was to comply in a bouncing motion.
00:49:23.640 | You know, later on in Spot, we took that out.
00:49:27.000 | As we got further and further away from the Leg Lab,
00:49:29.920 | we had more, you know, energy-driven controls.
00:49:33.860 | - Is there something to be said about like,
00:49:36.520 | knees that go forward versus backward?
00:49:40.720 | - Sure, there's this idea called passive dynamics,
00:49:45.480 | which says that although you can use computers
00:49:48.600 | and actuators to make a motion,
00:49:50.640 | a mechanical system can make a motion just by itself
00:49:54.200 | if it gets stimulated the right way.
00:49:56.280 | So, Tad McGeer, I think in the mid '80s,
00:50:02.760 | maybe it was in the late '80s, started to work on that.
00:50:05.960 | And he made this legged system
00:50:09.400 | that could walk down an inclined plane
00:50:11.800 | where the legs folded and unfolded and swung forward,
00:50:14.680 | you know, do the whole walking motion,
00:50:17.120 | where the only thing, there was no computer.
00:50:19.520 | There were some adjustments to the mechanics
00:50:22.160 | so that there were dampers and springs in some places
00:50:24.880 | that helped the mechanical action happen.
00:50:28.440 | It was essentially a mechanical computer.
00:50:30.720 | And the idea, the interesting idea there
00:50:33.200 | is that it's not all about the brain dictating to the body
00:50:38.200 | what the body should do.
00:50:39.400 | The body is a participant in the motion.
00:50:42.440 | - So, a great design for a robot has a mechanical component
00:50:46.200 | where the movement is efficient even without a brain.
00:50:49.640 | - Yes.
00:50:50.560 | - How do you design that?
00:50:52.160 | - I think that these days, most robots aren't doing that.
00:50:54.680 | Most robots are basically using the computer
00:50:58.600 | to govern the motion.
00:51:00.480 | Now, the brain, though, is taking into account
00:51:03.920 | what the mechanical thing can do
00:51:06.160 | and how it's gonna behave.
00:51:08.160 | Otherwise, it would have to really forcefully move
00:51:11.680 | everything around all the time,
00:51:13.560 | which probably some solutions do,
00:51:15.880 | but I think you end up with a more efficient
00:51:18.400 | and more graceful thing if you're taking into account
00:51:21.240 | what the machine wants to do.
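A minimal sketch of the passive-dynamics idea he describes, using the rimless wheel, the simplest relative of McGeer's downhill walkers; all parameters here are illustrative. Each step gains energy from the slope and loses some at the spoke-transfer collision, and the step-to-step map settles into a steady gait with no controller at all:

```python
import math

g = 9.81      # gravity (m/s^2)
l = 1.0       # spoke ("leg") length (m)
alpha = 0.2   # half-angle between adjacent spokes (rad)
gamma = 0.05  # slope of the incline (rad)

def step(omega_pre):
    """Map the pre-impact spin of one step to the pre-impact spin of the next.
    Collision: conserving angular momentum about the new contact point gives
    omega_post = omega_pre * cos(2*alpha). Stance phase: the hub drops
    2*l*sin(alpha)*sin(gamma) down the slope, adding kinetic energy.
    (Assumes enough speed to vault over the stance apex each step.)"""
    omega_post = omega_pre * math.cos(2 * alpha)
    gain = (4 * g / l) * math.sin(alpha) * math.sin(gamma)
    return math.sqrt(omega_post**2 + gain)

omega = 1.5  # initial push (rad/s)
for _ in range(15):
    omega = step(omega)
print(f"rolling speed after 15 steps: {omega:.3f} rad/s")

# The analytic fixed point of the return map, for comparison:
c2 = math.cos(2 * alpha) ** 2
omega_star = math.sqrt((4 * g / l) * math.sin(alpha) * math.sin(gamma) / (1 - c2))
print(f"analytic fixed point:         {omega_star:.3f} rad/s")
```

The point of the toy model is exactly what he describes: the gait emerges from the mechanics plus gravity, with the brain absent.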
00:51:23.240 | - So, this might be a good place to mention
00:51:25.400 | that you're now leading up the Boston Dynamics AI
00:51:30.720 | Institute, newly formed, which is focused more
00:51:35.080 | on designing the robots of the future.
00:51:37.020 | I think one of the things, maybe you can tell me
00:51:40.800 | the big vision for what's going on,
00:51:42.360 | but one of the things is this idea
00:51:46.560 | that hardware still matters with organic design and so on.
00:51:50.280 | Maybe before that, can you zoom out
00:51:52.040 | and tell me what the vision is for the AI Institute?
00:51:55.860 | - You know, I like to talk about intelligence
00:51:59.200 | having two parts, an athletic part and a cognitive part.
00:52:04.080 | And I think Boston Dynamics, in my view,
00:52:07.160 | has sort of set the standard
00:52:08.520 | for what athletic intelligence can be,
00:52:12.040 | and it has to do with all the things
00:52:13.320 | we've been talking about, the mechanical design,
00:52:17.240 | the real-time control, the energetics,
00:52:19.920 | and that kind of stuff.
00:52:21.400 | But obviously, people have another kind of intelligence,
00:52:24.560 | and animals have another kind of intelligence.
00:52:26.920 | We can make a plan.
00:52:28.180 | Our meeting started at 9:30.
00:52:31.640 | I looked up on Google Maps how long it took
00:52:34.200 | to walk over here.
00:52:35.200 | It was 20 minutes, so I decided, okay,
00:52:38.400 | I'd leave my house at nine, which is what I did.
00:52:41.660 | You know, simple intelligence,
00:52:43.960 | but we use that kind of stuff all the time.
00:52:46.440 | It's sort of what we think of as going on in our heads.
00:52:49.200 | And I think that's in short supply for robots.
00:52:53.480 | Most robots are pretty dumb.
00:52:55.560 | And as a result, it takes a lot of skilled people
00:52:58.600 | to program them to do everything they do,
00:53:01.260 | and it takes a long time.
00:53:03.840 | And if robots are gonna satisfy our dreams,
00:53:08.160 | they need to be smarter.
00:53:09.360 | So the AI Institute is designed to combine
00:53:16.160 | that physicality of the athletic side
00:53:19.480 | with the cognitive side.
00:53:21.920 | So for instance, we're trying to make robots
00:53:24.460 | that can watch a human do a task,
00:53:27.720 | understand what it's seeing, and then do the task itself.
00:53:31.200 | So sort of OJT, on-the-job training for robots as a paradigm.
00:53:36.200 | Now, you know, that's pretty hard,
00:53:39.980 | and it's sort of science fiction,
00:53:42.120 | but our idea is to work on a longer timeframe
00:53:46.040 | and work on solving those kinds of problems.
00:53:48.920 | And I have a whole list of things
00:53:50.020 | that are kind of like in that vein.
00:53:53.040 | - Maybe we can just take many of the things you mentioned,
00:53:57.480 | just take it as a tangent.
00:53:59.240 | First of all, athletic intelligence is a super cool term.
00:54:02.040 | And that really is intelligence.
00:54:05.560 | We humans kind of take it for granted
00:54:07.720 | that we're so good at walking and moving about the world.
00:54:10.160 | - And using our hands, you know?
00:54:11.880 | The mechanics of interacting with all these two things.
00:54:16.260 | - And you've never touched those things before, right?
00:54:18.580 | - Well, I've touched ones like this.
00:54:20.280 | Look at all the things I can do, right?
00:54:21.720 | I can juggle, and I'm rotating it this way.
00:54:23.480 | I can rotate it without looking.
00:54:25.680 | I could fetch these things out of my pocket
00:54:27.560 | and figure out which one was which
00:54:29.160 | and all that kind of stuff.
00:54:30.640 | And I don't think we have much of a clue
00:54:33.420 | how all that works yet.
00:54:35.300 | - Right, and that's, I really like putting that
00:54:37.480 | under the banner of athletic intelligence.
00:54:41.400 | What are the big open problems in athletic intelligence?
00:54:44.360 | So Boston Dynamics, with Spot, with Atlas,
00:54:48.800 | just have shown time and time again,
00:54:51.520 | pushed the limits of what we think is possible with robots.
00:54:54.920 | But where do we stand, actually, if we kind of zoom out?
00:54:58.440 | What are the big open problems
00:54:59.840 | on the athletic intelligence side?
00:55:01.600 | - I mean, one question you could ask that isn't my question,
00:55:04.280 | but are they commercially viable?
00:55:06.760 | Will they increase productivity?
00:55:10.160 | And I think we're getting very close to that.
00:55:12.900 | I don't think we're quite there still.
00:55:15.960 | Most of the robotics companies, it's a struggle.
00:55:20.040 | It's really the lack of the cognitive side
00:55:22.280 | that probably is the biggest barrier at the moment,
00:55:25.120 | even for the physically successful robots.
00:55:27.920 | But your question's a good one.
00:55:29.560 | I mean, you can always do a thing that's more efficient,
00:55:33.600 | lighter, more reliable.
00:55:34.960 | I'd say reliability.
00:55:36.640 | I know that Spot, they've been working very hard
00:55:40.280 | on getting the tail of the reliability curve up,
00:55:44.440 | and they've made huge progress.
00:55:46.200 | So the robots, there's 1,500 of them out there now,
00:55:50.400 | many of them being used in practical applications
00:55:54.600 | day in and day out, where they have to work reliably.
00:55:59.600 | And it's very exciting that they've done that.
00:56:02.680 | But it takes a huge effort to get
00:56:04.120 | that kind of reliability in the robot.
00:56:07.200 | There's cost, too.
00:56:08.220 | You'd like to get the cost down.
00:56:10.580 | Spots are still pretty expensive,
00:56:12.280 | and I don't think that they have to be.
00:56:16.000 | But it takes a different kind of activity to do that.
00:56:19.960 | Now that, I think, you know that Boston Dynamics
00:56:25.920 | is owned primarily by Hyundai now,
00:56:28.400 | and I think that the skills of Hyundai in making cars
00:56:32.600 | can be brought to bear in making robots
00:56:36.760 | that are less expensive and more reliable
00:56:38.720 | and those kinds of things.
00:56:39.840 | - So on the cognitive side, for the AI Institute,
00:56:43.800 | what's the trade-off between moonshot projects for you
00:56:47.880 | and maybe incremental progress?
00:56:50.600 | - That's a good question.
00:56:51.600 | I think we're using the paradigm
00:56:53.840 | called stepping stones to moonshots.
00:56:56.160 | I don't believe, that was in my original proposal
00:57:00.280 | for the Institute, stepping stones to moonshots.
00:57:02.920 | I think if you go more than a year
00:57:05.620 | without seeing a tangible status report of where you are,
00:57:10.080 | which is the stepping stone,
00:57:12.240 | and it could be a simplification, right?
00:57:14.140 | You don't necessarily have to solve all the problems
00:57:17.120 | of your target goal, even though your target goal
00:57:19.200 | is gonna take several years.
00:57:20.600 | You know, those stepping stone results give you feedback,
00:57:25.480 | give motivation, because usually there's some success
00:57:27.920 | in there, and so that's the mantra we've been working on.
00:57:32.920 | And that's pretty much how I'd say Boston Dynamics has worked
00:57:39.240 | where you make progress and show it as you go,
00:57:43.320 | show it to yourself, if not to the world.
00:57:45.940 | - What does success look like?
00:57:47.480 | What are some of the milestones you're chasing?
00:57:50.900 | - Well, with Watch, Understand, Do,
00:57:54.800 | the project I mentioned before,
00:57:56.520 | we've broken that down into getting some progress
00:58:00.000 | with what does meaningfully watching something mean,
00:58:03.180 | breaking down an observation of a person
00:58:06.800 | doing something into the components.
00:58:09.600 | You know, segmenting, you watch me do something,
00:58:12.720 | I'm gonna pick up this thing and put it down here
00:58:14.560 | and stack this on it.
00:58:15.840 | Well, it's not obvious if you just look at the raw data
00:58:18.680 | what the sequence of acts are.
00:58:22.000 | It's really a creative, intelligent act
00:58:24.280 | for you to break that down into the pieces
00:58:26.880 | and understand them in a way so you could say,
00:58:29.280 | okay, what skill do I need
00:58:31.520 | to accomplish each of those things?
00:58:34.240 | So we're working on the front end of that kind of a problem
00:58:37.420 | where we observe and translate the input,
00:58:41.340 | it may be video, it may be live,
00:58:43.500 | into a description of what we think is going on
00:58:47.120 | and then try and map that into skills to accomplish that.
00:58:49.480 | And we've been developing skills as well.
00:58:51.300 | So we have kind of multiple stabs at the pieces of doing that.
00:58:55.500 | - And this is usually video of humans
00:58:58.440 | manipulating objects with their hands kind of thing?
00:59:01.480 | - We're starting out with bicycle repair,
00:59:03.240 | some simple bicycle repair tasks.
00:59:05.380 | - That seems complicated, that seems really complicated.
00:59:07.480 | - It is, but there's some parts of it that aren't.
00:59:10.240 | Like putting the seat in:
00:59:14.280 | you have a tube that goes inside of another tube
00:59:16.440 | and there's a latch. That should be within range.
00:59:19.720 | - Is it possible to observe, to watch a video like this
00:59:23.420 | without having an explicit model
00:59:25.080 | of what a bicycle looks like?
00:59:26.640 | - I think it is.
00:59:27.880 | And I think that's the kind of thing
00:59:29.400 | that people don't recognize.
00:59:30.920 | So let me translate it to navigation.
00:59:33.240 | I think the basic paradigm for navigating a space
00:59:38.920 | is to get some kind of sensor that tells you
00:59:41.080 | where an obstacle is and what's open,
00:59:43.280 | build a map, and then go through the space.
00:59:46.240 | But if we were doing on-the-job training
00:59:48.540 | where I was giving you a task,
00:59:50.000 | I wouldn't have to say anything about the room, right?
00:59:51.960 | We came in here, all we did is adjust the chair,
00:59:54.960 | but we didn't say anything about the room
00:59:57.080 | and we could navigate it.
00:59:58.840 | So I think there's opportunities
01:00:00.200 | to build that kind of navigation skill into robots.
01:00:03.080 | And we're hoping to be able to do that.
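The "build a map, then go through the space" paradigm he contrasts with is easy to sketch; here is a minimal, purely illustrative version with a hand-made occupancy grid and breadth-first search:

```python
from collections import deque

# Toy occupancy grid built from sensing: 0 = free, 1 = obstacle.
GRID = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def plan(start, goal):
    """Breadth-first search over the grid: shortest obstacle-free path."""
    rows, cols = len(GRID), len(GRID[0])
    parent = {start: None}
    frontier = deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:  # reconstruct the path by walking parents backwards
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and GRID[nxt[0]][nxt[1]] == 0 and nxt not in parent):
                parent[nxt] = cell
                frontier.append(nxt)
    return None  # no path exists

print(plan((0, 0), (4, 4)))  # a path threading the free cells
```

The on-the-job-training alternative he describes would skip the explicit map entirely, which is what makes it interesting and hard.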
01:00:07.360 | - So operate successfully under a lot of uncertainty.
01:00:10.480 | - Yeah, and lack of specification.
01:00:13.160 | - Lack of specification.
01:00:14.640 | - I mean, that's what sort of intelligence is, right?
01:00:16.720 | Kind of dealing with, understanding a situation
01:00:20.160 | even though it wasn't explained.
01:00:22.240 | - So how big of a role does machine learning
01:00:24.400 | play in all of this?
01:00:26.800 | Is this more and more learning-based?
01:00:30.320 | - You know, since ChatGPT, which is a year ago, basically,
01:00:35.320 | there's a huge interest in that
01:00:39.600 | and a huge optimism about it.
01:00:42.520 | And I think that there's a lot of things
01:00:44.280 | that that kind of machine learning can do.
01:00:46.520 | Now, of course, there's lots of different kinds
01:00:47.760 | of machine learning.
01:00:48.840 | I think there's a lot of interest and optimism about it.
01:00:52.480 | I think the facts on the ground are
01:00:55.360 | that doing physical things with physical robots
01:00:58.640 | is a little bit different than language.
01:01:00.920 | And the tokens, you know, the tokens sort of don't exist.
01:01:04.560 | You know, pixel values aren't like words.
01:01:07.720 | But I think that there's a lot that can be done there.
01:01:12.480 | We have several people
01:01:16.400 | working on machine learning approaches.
01:01:18.360 | I don't know if you know,
01:01:19.200 | but we opened an office in Zurich recently.
01:01:22.320 | And Marco Hutter, who's one of the real leaders
01:01:25.520 | in reinforcement learning for robots,
01:01:28.720 | is the director of that office.
01:01:31.320 | He's still half-time at ETH, the university there,
01:01:36.320 | where he has an unbelievably fantastic lab.
01:01:39.320 | And then he's half-time leading,
01:01:42.300 | or will be leading, our efforts in the Zurich office.
01:01:45.480 | So we have a healthy learning component.
01:01:48.320 | But there's part of me that still says,
01:01:50.460 | if you look out in the world
01:01:51.720 | at what the most impressive performances are,
01:01:55.480 | they're still pretty much,
01:01:57.000 | I hate to use the word traditional,
01:01:59.920 | but that's what everybody's calling it,
01:02:01.000 | traditional controls, like model predictive control.
01:02:03.940 | You know, the thing, the ATLAS performances
01:02:07.880 | that you've seen are mostly model predictive control.
01:02:10.340 | They've started to do some learning stuff
01:02:11.760 | that's really incredible.
01:02:13.140 | I don't know if it's all been shown yet,
01:02:14.880 | but you'll see it over time.
01:02:17.680 | And then Marco's done some great stuff, and others.
01:02:21.560 | So especially for the athletic intelligence piece,
01:02:24.160 | the traditional approach seems to be the one
01:02:27.040 | that still performs the best.
01:02:29.200 | - I think we're gonna find a mating of the two,
01:02:31.600 | and we'll have the best of both worlds.
01:02:33.400 | And we're working on that at the institute, too.
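For readers unfamiliar with the model predictive control mentioned above: at every tick the controller predicts the system forward over a short horizon, optimizes the whole control sequence, applies only the first control, and repeats. A minimal, unconstrained sketch for a double integrator (nothing like the Atlas controllers, which are far richer):

```python
import numpy as np

# Discrete double integrator: state [position, velocity], control = acceleration.
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
H, rho = 20, 1e-2   # prediction horizon (steps) and control-effort weight

def mpc_step(x0, x_goal):
    """Solve the horizon as one least-squares problem, return the first control."""
    n = 2
    F = np.zeros((n * H, n))          # X = F @ x0 + G @ U, stacked predictions
    G = np.zeros((n * H, H))
    Ak = np.eye(n)
    for k in range(H):
        Ak = A @ Ak                   # Ak = A^(k+1)
        F[n*k:n*k+n, :] = Ak
        for j in range(k + 1):
            G[n*k:n*k+n, j:j+1] = np.linalg.matrix_power(A, k - j) @ B
    # minimize ||X - X_goal||^2 + rho * ||U||^2
    lhs = np.vstack([G, np.sqrt(rho) * np.eye(H)])
    rhs = np.concatenate([np.tile(x_goal, H) - F @ x0, np.zeros(H)])
    U = np.linalg.lstsq(lhs, rhs, rcond=None)[0]
    return U[0]                       # receding horizon: apply only u_0

x, goal = np.array([0.0, 0.0]), np.array([1.0, 0.0])
for _ in range(200):
    x = A @ x + B[:, 0] * mpc_step(x, goal)  # simulate one step
print("final state:", x)                     # approaches [1, 0]
```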
01:02:36.040 | - If I can talk to you about teams,
01:02:38.000 | you've built incredible teams,
01:02:40.040 | before at MIT and CMU, at Boston Dynamics,
01:02:43.200 | and now at the AI Institute.
01:02:45.080 | And you said that there's four components to a great team.
01:02:48.840 | Technical fearlessness, diligence,
01:02:52.240 | intrepidness, and fun, technical fun.
01:02:55.040 | Can you explain each?
01:02:56.200 | Technical fearlessness, what do you mean by that?
01:02:58.160 | - Sure, technical fearlessness means
01:03:00.800 | being willing to take on a problem
01:03:02.240 | that you don't know how to solve.
01:03:04.440 | And, you know, study it, figure out an entry point,
01:03:09.440 | you know, maybe a simplified version,
01:03:12.200 | or a simplified solution or something.
01:03:14.920 | Learn from the stepping stone,
01:03:17.720 | and go back, and eventually make a solution
01:03:22.720 | that meets your goals.
01:03:26.560 | And I think that's really important.
01:03:28.280 | - The fearlessness comes into play
01:03:29.680 | because some of it has never been done before?
01:03:32.600 | - Yeah, and you don't know how to do it.
01:03:34.440 | And, you know, there's easier stuff to do in life.
01:03:37.080 | So, you know, I mean, I don't know.
01:03:41.440 | Watch, understand, do.
01:03:42.840 | It's a mountain of a challenge.
01:03:45.900 | - So that's the really big challenge you're tackling now.
01:03:48.600 | Can we watch humans at scale and have robots,
01:03:52.360 | by watching humans, become effective actors in the world?
01:03:57.360 | - Yeah, I mean, we have others like that.
01:03:59.760 | We have one called Inspect, Diagnose, Fix.
01:04:02.400 | Like, you know, you call up the Maytag repairman.
01:04:06.520 | Okay, he's the one who you don't have to call,
01:04:08.320 | but you know, you call up the dishwasher repair person,
01:04:12.840 | and they come to your house, and they look at your machine.
01:04:16.180 | It's already been actually figured out
01:04:18.900 | that something doesn't work,
01:04:20.040 | but they have to kind of examine it
01:04:21.780 | and figure out what's wrong, and then fix it.
01:04:25.260 | And I think robots should be able to do that.
01:04:28.320 | We already, Boston Dynamics already has Spot robots
01:04:32.580 | collecting data on machines.
01:04:35.820 | Things like thermal data, reading the gauges,
01:04:38.420 | listening to them, getting sounds.
01:04:40.460 | And that data are used to determine
01:04:44.040 | whether they're healthy or not.
01:04:46.000 | But the interpretation isn't done by the robots yet,
01:04:49.120 | and certainly the fixing, the diagnosing and the fixing
01:04:53.200 | isn't done yet, but I think it could be.
01:04:55.520 | And that's bringing the AI and combining it
01:04:58.080 | with the physical skills to do it.
01:05:00.880 | - And you're referring to the fixing in the physical world.
01:05:02.920 | I can't wait until they can fix the psychological problems
01:05:05.520 | of humans and show up and just talk, do therapy.
01:05:08.840 | - Yeah, that's a different thing.
01:05:10.280 | Yeah, it's different.
01:05:11.100 | Well, it's all part of the same thing.
01:05:13.340 | Again, humanity.
01:05:14.460 | Maybe, maybe.
01:05:17.740 | - You mean convincing you it's okay
01:05:19.060 | that the dishwasher's broken?
01:05:20.220 | Just do the marketing approach?
01:05:23.860 | - Yeah, exactly.
01:05:25.060 | It's all, yeah, don't sweat the small stuff.
01:05:28.500 | Yeah, as opposed to fixing the dishwasher,
01:05:31.540 | it'll convince you that it's okay
01:05:32.620 | that the dishwasher's broken.
01:05:34.220 | It's a different approach.
01:05:35.520 | Diligence, why is diligence important?
01:05:39.920 | Well, if you want a real robot solution,
01:05:42.460 | it can't be a very narrow solution
01:05:46.900 | that's gonna break at the first variation
01:05:49.340 | in what the robot does or the environment
01:05:52.460 | if it wasn't exactly as you expected it.
01:05:55.020 | So how do you get there?
01:05:56.060 | I think having an approach that leaves you unsatisfied
01:06:01.060 | until you've embraced the bigger problem
01:06:03.700 | is the diligence I'm talking about.
01:06:06.380 | And again, I'll point at Boston Dynamics.
01:06:09.660 | I think they've done it.
01:06:10.660 | Some of the videos that we had showing the engineer
01:06:15.120 | making it hard for the robot to do its task.
01:06:17.420 | Spot opening a door and then the guy gets there
01:06:21.540 | and pushes on the door so it doesn't open
01:06:23.620 | the way it's supposed to.
01:06:24.440 | Pulling on the rope that's attached to the robot
01:06:27.380 | so its navigation has been screwed up.
01:06:30.440 | We have one where the robot's climbing stairs
01:06:32.540 | and an engineer is tugging on a rope
01:06:34.460 | that's pulling it back down the stairs.
01:06:36.820 | That's totally different than just the robot
01:06:39.380 | seeing the stairs, making a model,
01:06:41.420 | putting its feet carefully on each step.
01:06:43.700 | But that's what probably robotics needs to succeed
01:06:46.720 | and having that broader idea that you wanna come up
01:06:51.060 | with a robust solution is what I meant by diligence.
01:06:54.780 | - So really testing it in all conditions,
01:06:56.580 | perturbing the system in all kinds of ways.
01:06:59.140 | And as a result, creating some epic videos,
01:07:02.100 | the legendary-- - The fun part.
01:07:03.700 | The hockey stick.
01:07:04.780 | - And then yes, tugging on spot
01:07:06.780 | as it's trying to open the door.
01:07:09.060 | I mean, it's great testing, but it's also,
01:07:12.740 | I don't know, it's just somehow extremely compelling
01:07:18.300 | demonstration of robotics in video form.
01:07:21.260 | - I learned something very early on
01:07:22.980 | with the first three-dimensional hopping machine.
01:07:26.500 | If you just show a video of it hopping,
01:07:28.900 | it's a so what.
01:07:31.540 | If you show it falling over a couple of times
01:07:33.700 | and you can see how easily and fast it falls over,
01:07:36.880 | then you appreciate what the robot's doing
01:07:39.380 | when it's doing its thing.
01:07:41.620 | So I think the reaction you just gave
01:07:44.140 | to the robot getting kind of interfered with
01:07:48.660 | or tested while it's going through the door,
01:07:50.820 | it's showing you the scope of the solution.
01:07:53.260 | - The limits of the system,
01:07:55.340 | the challenges involved in failure,
01:07:57.660 | if it's shown both failure and success,
01:07:59.820 | makes you appreciate the success, yeah.
01:08:02.980 | And then just the way the videos are done
01:08:04.700 | in Boston Dynamics are incredible
01:08:06.100 | 'cause there's no flash, there's no extra production.
01:08:09.980 | It's just raw testing of the robot.
01:08:13.060 | - Well, I was the final editor for most of the videos
01:08:16.900 | up until about three years ago or four years ago.
01:08:21.900 | And my theory of the video is no explanation.
01:08:27.420 | If they can't see it, then it's not the right thing.
01:08:32.300 | And if you do something worth showing,
01:08:35.440 | then let them see it.
01:08:36.420 | Don't interfere with a bunch of titles that slow you down
01:08:41.420 | or a bunch of distraction.
01:08:44.300 | Just do something worth showing and then show it.
01:08:47.940 | - That's brilliant.
01:08:48.780 | - It's hard, though, for people to buy into that.
01:08:52.500 | - Yeah, I mean, people always wanna add more stuff,
01:08:55.860 | but the simplicity of just do something worth showing
01:08:59.380 | and show it, that's brilliant.
01:09:01.060 | And don't add extra stuff.
01:09:03.240 | People have criticized, especially the Big Dog videos
01:09:07.840 | where there's a human driving the robot.
01:09:11.160 | And I understand the criticism now.
01:09:13.160 | At the time, we wanted to just show,
01:09:14.600 | look, this thing's using its legs to get up the hill,
01:09:17.000 | so we focused on showing that,
01:09:18.880 | which was, we thought, the story.
01:09:22.500 | The fact that there's a human,
01:09:23.760 | so they were thinking about autonomy,
01:09:25.360 | whereas we were thinking about the mobility.
01:09:27.940 | And so we've adjusted to a lot of things
01:09:32.400 | that we see that people care about, trying to be honest.
01:09:36.280 | We've always tried to be honest.
01:09:38.520 | - But also just show cool stuff in its raw form,
01:09:42.580 | the limits of the system,
01:09:44.260 | to see the system be perturbed and be robust and resilient
01:09:47.360 | and all that kind of stuff.
01:09:49.120 | And dancing with some music.
01:09:51.520 | Intrepidness and fun, so intrepid.
01:09:57.760 | - I mean, it might be the most important ingredient.
01:10:00.480 | - And that is, you know, robotics is hard.
01:10:03.400 | It's not gonna work right, right away,
01:10:05.640 | so don't be discouraged is all it really means.
01:10:08.440 | So usually when I talk about these things,
01:10:10.960 | I show videos and I show a long string of outtakes.
01:10:14.400 | And you have to have courage to be intrepid
01:10:19.400 | when you work so hard to build your machine,
01:10:23.680 | and then you're trying and it just doesn't do
01:10:26.280 | what you thought it would do, what you want it to do.
01:10:29.460 | And, you know, you have to stick to it and keep trying.
01:10:34.460 | - How long, I mean, we don't often see that,
01:10:38.200 | the story behind Spot and Atlas.
01:10:41.560 | How long, how many failures were there along the way
01:10:44.480 | to get a working Atlas, a working Spot,
01:10:47.360 | in the early days, even working Big Dog?
01:10:49.600 | - There's a video of Atlas climbing three big steps.
01:10:53.240 | And it's very dynamic and it's really exciting,
01:10:55.700 | real accomplishment.
01:10:57.320 | It took 109 tries, and we have video of every one of them.
01:11:01.540 | You know, we shoot everything.
01:11:03.320 | Again, we, this is at Boston Dynamics.
01:11:05.440 | So it took 109 tries.
01:11:09.900 | But once it did it, it had a high percentage of success.
01:11:13.500 | So it's not like we're cheating
01:11:15.180 | by just showing the best one.
01:11:16.820 | But we do show the evolved performance,
01:11:19.940 | not everything along the way.
01:11:21.740 | But everything along the way is informative
01:11:23.700 | and it shows sort of, there's stupid things that go wrong.
01:11:28.700 | Like the robot, just when you say go
01:11:32.180 | and it collapses right there on the start.
01:11:35.000 | That doesn't have to do with the steps.
01:11:37.760 | Or the perception didn't work right,
01:11:39.440 | so you missed the target when you jump.
01:11:41.240 | Or something breaks and there's oil flying everywhere.
01:11:44.460 | But that's fun.
01:11:46.760 | - Yeah, so the hardware failures
01:11:48.920 | and maybe some soft--
01:11:50.240 | - Lots of control evolution during that time.
01:11:52.900 | I think it took six weeks to get those 109 trials.
01:11:57.000 | 'Cause there was programming going on.
01:12:00.480 | It was actually robot learning,
01:12:03.520 | but there were humans in the loop helping with the learning.
01:12:06.440 | So all data driven.
01:12:07.800 | - Okay, and you always are learning from that failure, so.
01:12:12.680 | - Right.
01:12:13.520 | - And how do you protect Atlas
01:12:18.080 | from not getting damaged from 109 attempts?
01:12:23.240 | - It's remarkable.
01:12:25.380 | One of the accomplishments of Atlas
01:12:27.580 | is that the engineers have made a machine
01:12:29.900 | that's robust enough that it can take that kind of testing
01:12:33.380 | where it's falling and stuff
01:12:34.700 | and it doesn't break every time.
01:12:36.180 | It still breaks.
01:12:37.620 | And part of the paradigm is to have people to repair stuff.
01:12:41.940 | You gotta figure that in
01:12:43.100 | if you're gonna do this kind of work.
01:12:44.940 | I sometimes criticize the people
01:12:48.500 | who have their gold plated thing
01:12:50.540 | and they keep it on the shelf
01:12:52.260 | and they're afraid to kind of use it.
01:12:54.220 | I don't think you can make progress
01:12:55.420 | if you're working that way.
01:12:57.020 | You need to be ready to have it break
01:12:59.020 | and go in there and fix it.
01:13:00.860 | It's part of the thing.
01:13:01.940 | Plan your budget so you have spare parts
01:13:04.780 | and a crew and all that stuff.
01:13:07.640 | - Yeah, if it falls 109 times, it's okay.
01:13:11.820 | So intrepid, truly.
01:13:14.940 | And that applies to Spot,
01:13:16.180 | that applies to all the other--
01:13:17.020 | - Applies to everything.
01:13:18.200 | I think it applies to everything anybody tries to do
01:13:20.220 | that's worth doing.
01:13:21.160 | - And especially with systems in the real world, right?
01:13:25.940 | And so fun.
01:13:27.020 | - Fun.
01:13:28.540 | Technical fun, I usually say.
01:13:30.020 | Have technical fun.
01:13:31.460 | I think that life as an engineer is really satisfying.
01:13:36.320 | I think you get to,
01:13:37.720 | to some degree it can be like craft work
01:13:41.700 | where you get to do things with your own hands
01:13:43.460 | or your own design or whatever your media is.
01:13:46.700 | And it's very satisfying to be able to just do the work
01:13:49.740 | unlike a lot of people who have to do something
01:13:53.140 | that they don't like doing.
01:13:54.060 | I think engineers typically get to do something
01:13:56.180 | that they like and there's a lot of satisfaction from that.
01:13:59.980 | Then there's, in many cases,
01:14:03.620 | you can have impact on the world somehow
01:14:07.100 | because you've done something that other people admire,
01:14:09.820 | which is different from the craft fun of building a thing.
01:14:13.560 | So that's a second way that being an engineer is good.
01:14:19.500 | I think the third thing is that if you're lucky
01:14:22.140 | to be working in a team where you're getting the benefit
01:14:25.540 | of other people's skills that are helping you do your thing,
01:14:29.180 | none of us has all the skills needed
01:14:31.400 | to do most of these projects.
01:14:34.520 | And if you have a team where you're working well
01:14:37.660 | with the others, that can be very satisfying.
01:14:40.060 | And then if you're an engineer, you also usually get paid.
01:14:43.060 | And so you kind of get paid four times
01:14:46.340 | in my view of the world.
01:14:47.900 | So what could be better than that?
01:14:49.540 | - Get paid to have fun.
01:14:51.020 | I mean, what do you love about engineering?
01:14:53.340 | When you say engineering, what does that mean to you exactly?
01:14:55.860 | What is this kind of big thing that we call engineering?
01:15:00.340 | - I think it's both being a scientist
01:15:03.460 | or getting to use science at the same time
01:15:06.340 | as being kind of an artist or a creator.
01:15:08.460 | 'Cause you're making some, you know,
01:15:09.780 | scientists only get to study what's out there
01:15:13.400 | and engineers get to make stuff that didn't exist before.
01:15:16.700 | And so it's really, I think, a higher calling,
01:15:18.820 | even though I think most, you know,
01:15:20.580 | the public out there thinks science is top
01:15:23.100 | and engineering is somehow secondary,
01:15:24.740 | but I think it's the other way around.
01:15:26.620 | - And at the cutting edge, I think,
01:15:28.220 | when we talk about robotics,
01:15:30.020 | there is a possibility to do art
01:15:34.540 | in that you do like the first of its kind thing.
01:15:37.420 | So then there's the production at scale,
01:15:39.900 | which is its own beautiful thing,
01:15:41.300 | but when you do the first new robot or the first new thing,
01:15:44.660 | that's the possibility to create something totally new.
01:15:47.540 | That is the art. - I mean, bringing metal
01:15:49.020 | to life or a machine to life is kind of, is fun.
01:15:52.700 | And, you know, it was fun doing the dancing videos
01:15:56.860 | where I got a huge, you know, public response.
01:16:00.300 | And we're gonna do more.
01:16:01.300 | We're gonna do some, we're doing some at the Institute
01:16:03.380 | and we'll do more.
01:16:05.560 | - Well, that metal to life moment,
01:16:07.100 | I mean, to me, that's still magical.
01:16:08.780 | Like when inanimate objects comes to life,
01:16:13.820 | that's still, like to me, to this day,
01:16:16.660 | it's still an incredible moment.
01:16:18.020 | The human intelligence can create systems
01:16:21.620 | that instill life or whatever that is
01:16:25.920 | into inanimate objects.
01:16:27.860 | It's really, it's truly magical,
01:16:30.220 | especially when it's at the scale
01:16:31.740 | that humans can perceive and appreciate, like directly.
01:16:36.180 | - But I think sort of with it going back
01:16:39.100 | to the pieces of that, you know,
01:16:41.260 | you design a linkage that turns out to be half the weight
01:16:45.100 | and just as strong, that's very satisfying.
01:16:48.420 | And, you know, there are people who do that
01:16:50.260 | and it's a creative act.
01:16:53.520 | - What to you is the most beautiful about robotics?
01:16:57.960 | Sorry for the big romantic question.
01:17:01.380 | - I think having the robots move in a way
01:17:03.260 | that's evocative of life is pretty exciting.
01:17:08.260 | - So the elegance of movement.
01:17:09.900 | - Or if it's a high performance act
01:17:11.820 | where it's doing it faster, bigger than other robots.
01:17:16.100 | Usually we're not doing it bigger, faster than people,
01:17:18.380 | but we're getting there in a few narrow dimensions.
01:17:22.060 | - So faster, bigger, smoother, more elegant, more graceful.
01:17:27.060 | - I mean, I'd like to do dancing that starts,
01:17:30.140 | you know, we're nowhere near
01:17:31.900 | the dancing capabilities of a human.
01:17:34.780 | We've been having a ballerina in
01:17:36.740 | who's kind of a well-known ballerina
01:17:39.580 | and she's been programming the robot.
01:17:41.980 | We've been working on tools that can make
01:17:43.860 | her way of talking,
01:17:47.380 | you know, her way of doing a choreography
01:17:49.460 | or something like that, more accessible,
01:17:51.340 | to get the robot to do things.
01:17:55.420 | And it's starting to produce some interesting stuff.
01:17:58.140 | - Well, we should mention that there is a choreography tool.
01:18:00.820 | - There is.
01:18:02.220 | - I mean, I guess I saw versions of it,
01:18:06.540 | which is pretty cool.
01:18:07.380 | You can kind of, at slices of time,
01:18:10.580 | control different parts at the high level,
01:18:13.220 | the movement of the robot.
01:18:15.020 | - We hope to take that forward and make it,
01:18:17.220 | you know, more tuned to how the dance world
01:18:20.540 | wants to talk, wants to communicate,
01:18:23.220 | and get better performances.
01:18:25.300 | I mean, we've done a lot, but there's still a lot possible.
01:18:28.740 | And I'd like to have performances
01:18:30.660 | where the robots are dancing with people.
01:18:33.020 | - So right now, almost everything that we've done on dancing
01:18:37.660 | is to a fixed time base.
01:18:40.140 | So once you press go, the robot does its thing
01:18:42.620 | and plays its thing.
01:18:43.700 | It's not listening, it's not watching,
01:18:46.740 | but I think it should do those things.
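A sketch of what a time-sliced, fixed-time-base choreography representation might look like; this is a hypothetical data structure in the spirit of what's described, not Boston Dynamics' actual Choreographer format. Moves are addressed by beat, per body part, and "pressing go" just plays the compiled timeline open-loop:

```python
from dataclasses import dataclass

@dataclass
class Move:
    part: str          # body part track: "legs", "body", "arm"
    name: str          # high-level move, e.g. "step", "sway", "wave"
    start_beat: float  # when the slice begins, in beats
    beats: float       # how long the slice lasts, in beats

def compile_timeline(moves, bpm=120.0):
    """Fixed time base: convert beat-addressed slices to wall-clock commands.
    Once started, the schedule plays open-loop; it doesn't listen or watch."""
    spb = 60.0 / bpm   # seconds per beat
    return sorted(
        (m.start_beat * spb, m.beats * spb, m.part, m.name) for m in moves
    )

dance = [
    Move("legs", "step", start_beat=0, beats=4),
    Move("body", "sway", start_beat=0, beats=8),
    Move("arm",  "wave", start_beat=4, beats=4),
]
for t0, dur, part, name in compile_timeline(dance):
    print(f"t={t0:4.1f}s  {part:<4} -> {name} for {dur:.1f}s")
```

Making the robot listen to the music and watch its partner, as he suggests, would mean replacing this open-loop playback with feedback.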
01:18:48.900 | - I think I would love to see a professional ballerina
01:18:53.300 | like alone in a room with a robot
01:18:54.900 | slowly teaching the robot.
01:18:56.620 | Just actually the process of a clueless robot
01:18:59.980 | trying to figure out a small little piece of a dance.
01:19:03.700 | So it's not like, 'cause right now,
01:19:05.580 | Atlas and Spot have done like perfect dancing
01:19:08.500 | to a beat and so on, you know, to a degree.
01:19:13.100 | But like the learning process of interacting with a human
01:19:17.460 | would be like incredible to watch.
01:19:19.460 | - One of the cool things going on,
01:19:21.020 | you know that there's a class at Brown University
01:19:23.820 | called Choreorobotics.
01:19:25.260 | Sydney Skybetter is a dancer, choreographer,
01:19:29.140 | and he teamed up with Stephanie Tellex,
01:19:32.260 | who's a computer science professor,
01:19:34.300 | and they taught this class,
01:19:35.660 | and I think they have some graduate students
01:19:37.420 | helping teach it, where they have two spots
01:19:40.300 | and people come in, I think it's 50/50
01:19:43.340 | of computer science people and dance people,
01:19:46.260 | and they program performances that are very interesting.
01:19:50.660 | I show some of them sometimes when I give a talk.
01:19:53.140 | - And making that process of a human teaching the robot
01:19:55.500 | more efficient and more intuitive,
01:19:57.620 | maybe partial language, part movement,
01:20:00.780 | that'd be fascinating, that'd be really fascinating,
01:20:02.540 | 'cause I mean, one of the things I've kind of realized
01:20:05.340 | is humans communicate with movement a lot.
01:20:10.340 | It's not just language.
01:20:12.700 | There's a lot, there's body language,
01:20:14.180 | there's so many intricate little things.
01:20:16.580 | - Totally.
01:20:17.940 | - And like that, you know, to watch a human and Spot
01:20:22.420 | communicate back and forth with movement,
01:20:24.540 | I mean, there's just so many wonderful possibilities there.
01:20:28.740 | - But it's also a challenge, you know,
01:20:30.620 | we get asked to have our robots perform
01:20:35.060 | with famous dancers, and they can,
01:20:39.940 | you know, they have 200 degrees of freedom or something,
01:20:42.540 | right, every little ripple and thing,
01:20:44.820 | and they have all this head and neck
01:20:46.220 | and shoulders and stuff, and the robots
01:20:48.500 | mostly don't have all that stuff,
01:20:50.220 | and it's a daunting challenge to not look stupid,
01:20:55.220 | you know, physically stupid next to them.
01:20:58.500 | And so we've pretty much avoided that kind of performance,
01:21:01.780 | but we'll get to it.
01:21:04.140 | - I think even with the limited degrees of freedom,
01:21:06.020 | we could still have some sass and flavor and so on,
01:21:09.380 | you can figure out your own thing, even if you can't.
01:21:11.860 | - And we can reverse things, like if you watch a human
01:21:14.780 | do robot animation, which is a dance style,
01:21:17.740 | where, you know, you jerk around, sort of,
01:21:19.420 | and you pop and lock and all that stuff,
01:21:23.020 | I think the robots could show up the humans
01:21:26.460 | by, you know, doing unstable oscillations and things
01:21:30.460 | that are faster than a person can.
01:21:32.460 | So that's sort of on my, you know, my plan,
01:21:36.340 | but we haven't quite gotten there yet.
01:21:38.980 | - You mentioned about building teams
01:21:40.800 | and robotics teams and so on,
01:21:42.220 | how do you find great engineers,
01:21:44.060 | how do you hire great engineers?
01:21:45.980 | - I think you need to have an environment
01:21:47.580 | where interesting, well, you know, it's a chicken and egg.
01:21:50.220 | If you have an environment where interesting engineering
01:21:52.260 | is going on, then engineers wanna work there.
01:21:55.460 | And, you know, I think it took a long time
01:21:59.900 | to develop that at Boston Dynamics.
01:22:01.780 | In fact, when we started, although, you know,
01:22:05.880 | I had the experience of building things in the leg lab,
01:22:08.900 | both at CMU and at MIT, we weren't that sophisticated
01:22:13.580 | of an engineering thing compared
01:22:15.900 | to what Boston Dynamics is now.
01:22:17.660 | But it was our ambition to do that.
01:22:20.380 | And, you know, Sarcos was another robot company.
01:22:23.940 | So I always thought of us as being this much
01:22:27.980 | on the computing side and this much on the hardware side,
01:22:31.420 | and they were like this.
01:22:32.700 | And then over the years, we, you know,
01:22:36.020 | I think we achieved the same or better levels of engineering.
01:22:40.980 | Meanwhile, you know, Sarcos got acquired
01:22:42.980 | and then they went through all kinds of changes.
01:22:44.860 | And I don't know exactly what their current status is,
01:22:48.020 | but so it took many years is part of the answer.
01:22:52.520 | I think you gotta find people who love it.
01:22:56.060 | In the early days, we paid a little less.
01:22:59.100 | So we only got people who were doing it
01:23:00.820 | 'cause they really loved it.
01:23:03.180 | We also hired people who might not have professional degrees,
01:23:06.740 | you know, people who were building bicycles
01:23:09.300 | and building kayaks.
01:23:10.860 | We have some people who come from that
01:23:13.180 | kind of the maker world.
01:23:14.740 | And that's really important for the kind of work we do
01:23:18.340 | to have that be part of the mix.
01:23:20.100 | - Whatever that is, whatever the magic ingredient
01:23:23.060 | that makes a great builder, maker,
01:23:25.540 | that's the big part of it.
01:23:26.940 | - People who repaired the cars or motorcycles
01:23:31.340 | or whatever in their garages when they were kids.
01:23:34.020 | - There's a kind of, like the robotics students,
01:23:36.460 | grad students, and just roboticists
01:23:38.100 | that I know and I hang out with,
01:23:40.420 | there's a kind of endless energy.
01:23:44.740 | Like, they're just happy.
01:23:46.660 | I compare it to another group of people
01:23:49.660 | that are like that are people that skydive professionally.
01:23:52.900 | There's just like excitement and general energy
01:23:56.340 | that I think probably has to do with the fact
01:23:58.460 | that they're just constantly,
01:24:00.620 | first of all, fail a lot.
01:24:02.500 | And then the joy of building a thing
01:24:05.060 | that eventually works.
01:24:06.820 | - Yeah, talk about being happy.
01:24:08.140 | There used to be a time when I was doing
01:24:10.380 | the machine shop work myself,
01:24:11.860 | back in those JPL and Caltech days,
01:24:14.820 | when if I came home smelling like the machine shop,
01:24:17.820 | you know, 'cause it's an oily place,
01:24:19.260 | my wife would say, "Oh, you had a good day today, huh?"
01:24:21.880 | 'Cause she could tell that that's where I'd been.
01:24:24.460 | - You've done, yeah, you've actually built something,
01:24:27.300 | you've done something in the physical world.
01:24:29.500 | Yeah, and probably the videos help, right?
01:24:32.140 | The videos help show off what robotics is.
01:24:35.500 | - Oh, you know, at Boston Dynamics,
01:24:37.820 | they put us on the map.
01:24:38.980 | I remember interviewing some sales guy,
01:24:43.820 | and he was from a company,
01:24:46.380 | and he said, "Well, no one's ever heard of my company,
01:24:48.460 | "but we have products, you know, really good products.
01:24:52.680 | "You guys, everybody knows who you are,
01:24:54.920 | "but you don't have any products at all," which was true.
01:24:58.300 | So it was, and you know, we thank YouTube for that.
01:25:01.380 | YouTube came, we caught the YouTube wave,
01:25:04.140 | and it had a huge impact on our company.
01:25:06.780 | - I mean, it's a big impact, not just on your company,
01:25:10.780 | but on robotics in general,
01:25:12.960 | helping people understand and inspire
01:25:15.580 | what is possible with robots.
01:25:18.380 | They inspire imagination, fear, everything.
01:25:21.460 | The full spectrum of human emotion was aroused,
01:25:24.060 | which is great for the entirety of humanity,
01:25:27.900 | and also probably inspiring for young people
01:25:29.900 | that wanna get into AI and robotics.
01:25:31.820 | Let me ask you about some competitors.
01:25:35.500 | - Sure.
01:25:36.320 | - You were complimentary of Elon and Tesla's work
01:25:39.060 | on the Optimus robot, with their humanoid robot,
01:25:43.420 | what do you think of their efforts there
01:25:45.380 | with the humanoid robot?
01:25:46.580 | - You know, I really admire Elon as a technologist.
01:25:52.780 | I think that what he did with Tesla,
01:25:55.220 | it was just totally mind-boggling
01:25:58.260 | that he could go from this totally niche area
01:26:02.180 | that less than 1% of people seemed to be interested in,
01:26:07.020 | to making it so that essentially every car company
01:26:10.000 | in the world is trying to do what he's done.
01:26:13.660 | So you gotta give it to him.
01:26:15.120 | Then look at SpaceX.
01:26:16.420 | He's basically replaced NASA, if you will.
01:26:20.720 | That might be a little exaggeration, but not by much.
01:26:23.380 | So you gotta admire the guy,
01:26:26.520 | and I wouldn't count him out for anything.
01:26:31.040 | I don't think Optimus today is where Atlas is,
01:26:36.040 | for instance.
01:26:37.740 | I don't know, it's a little hard to compare 'em
01:26:39.180 | to the other companies.
01:26:40.940 | You know, I visited Figure.
01:26:44.420 | I think they're doing well, and they have a good team.
01:26:47.100 | I've visited Apptronik, and I think they have a good team,
01:26:52.780 | and they're doing well.
01:26:53.940 | But Elon has a lot of resources.
01:26:58.180 | He has a lot of ambition.
01:27:00.980 | I'd like to take some credit for his ambition.
01:27:03.280 | I think if I read between the lines,
01:27:06.800 | it's hard not to think that him seeing what Atlas is doing
01:27:10.400 | is a little bit of an inspiration.
01:27:12.120 | I hope so.
01:27:13.440 | - Do you think Atlas and Optimus
01:27:14.680 | will hang out at some point?
01:27:16.980 | - I would love to host that.
01:27:18.880 | Now that I'm not at Boston Dynamics,
01:27:21.280 | I'm not officially connected.
01:27:23.600 | I am on the board, but I'm not officially connected.
01:27:26.160 | I would love to host a--
01:27:27.800 | - Robot meetups?
01:27:28.640 | - A robot meetup, yeah.
01:27:30.840 | - Does the AI Institute work with Spots and Atlas?
01:27:34.480 | Is it focused on Spots mostly right now as a platform?
01:27:37.680 | - We have a bunch of different robots.
01:27:39.040 | We bought everything we could buy.
01:27:40.600 | So we have Spots.
01:27:43.840 | I think we have a good-sized fleet of them.
01:27:45.560 | I don't know how many it is, but a good-sized fleet.
01:27:47.760 | We have a couple of ANYmal robots.
01:27:49.600 | You know, ANYmal is made by ANYbotics, a company founded by Marco Hutter,
01:27:53.360 | even though he's not that involved anymore,
01:27:55.240 | but we have a couple of those.
01:27:56.760 | We have a bunch of arms, like Frankas and Universal Robots.
01:28:01.760 | 'Cause you know, even though we have ambitions
01:28:05.080 | to build stuff, and we are starting to build stuff,
01:28:07.640 | day one, getting off the ground, we just bought stuff.
01:28:13.220 | - I love this robot playground you've built.
01:28:17.580 | - You can come over and take a look if you want.
01:28:19.180 | - That's great.
01:28:20.020 | So it's like all these kinds of robots, legged robots, arms.
01:28:23.360 | - It doesn't feel that much like,
01:28:24.560 | well, there's some areas that feel like a playground,
01:28:26.580 | but it's not like they're all frolicking together.
01:28:29.700 | - Hey, again, maybe you'll arrange a robot meetup.
01:28:35.640 | But in general, what's your view on competition
01:28:39.080 | in this space, especially like humanoid and legged robots?
01:28:42.320 | Are you excited by the competition,
01:28:45.160 | or the friendly competition?
01:28:46.880 | - I think that it doesn't, you know,
01:28:51.280 | I don't think about competition that much.
01:28:54.880 | You know, I'm not a commercial guy.
01:28:57.340 | I think for many years at Boston,
01:28:58.500 | you know, the many years I was at Boston Dynamics,
01:29:01.340 | we didn't think about competition,
01:29:02.860 | we were just kind of doing our thing.
01:29:04.420 | There wasn't, it wasn't like there were products out there
01:29:07.260 | that we were competing with, you know.
01:29:09.140 | Maybe there was some competition for DARPA funding,
01:29:12.580 | which we got a lot of, got very good at getting.
01:29:16.140 | But even there, in a couple of cases
01:29:20.500 | where we might have competed,
01:29:21.740 | we ended up just being the robot provider.
01:29:24.500 | That is, for the Little Dog program.
01:29:26.560 | You know, we just made the robots,
01:29:28.640 | we didn't participate as developers,
01:29:30.840 | except for developing the robot.
01:29:32.480 | And in the DARPA Robotics Challenge,
01:29:35.680 | we didn't compete, we provided the robots.
01:29:38.880 | So, you know, in the AI world now,
01:29:43.720 | now that we're working on cognitive stuff,
01:29:45.280 | it feels much more like a competition.
01:29:47.880 | You know, the entry requirements
01:29:52.280 | in terms of computing hardware
01:29:54.640 | and the skills of the team,
01:29:58.140 | and hiring talent, it's a much tougher place.
01:30:02.260 | So I think much more about competition
01:30:04.500 | now on the cognitive side.
01:30:06.380 | On the physical side, it doesn't feel like
01:30:08.440 | it's that much about competition yet.
01:30:10.980 | Obviously, with 10 humanoid companies out there,
01:30:13.100 | 10 or 12, I mean, there's probably others
01:30:15.260 | that I don't know about,
01:30:16.460 | they're definitely in competition, will be in competition.
01:30:22.140 | - How much room is there for a quadruped,
01:30:26.640 | and especially a humanoid robot, to become cheaper?
01:30:30.880 | So like, cutting cost, and like, how low can you go?
01:30:35.000 | And how much of it is just mass production,
01:30:38.960 | so questions of, you know, Hyundai,
01:30:42.420 | like how to produce, versus like engineering innovation,
01:30:46.160 | how to simplify?
01:30:47.860 | - I think there's a huge way to go.
01:30:49.720 | I don't think we've seen the bottom of it,
01:30:51.480 | or the bottom in terms of lower prices.
01:30:53.720 | You know, I think you should be totally optimistic
01:30:57.560 | that, at the asymptote, things don't have to be
01:31:00.200 | anything like as expensive as they are now.
01:31:02.840 | Back to competition, I wanted to say one thing.
01:31:04.740 | I think in the quadruped space,
01:31:06.520 | having other people selling quadrupeds
01:31:09.480 | is a great thing for Boston Dynamics.
01:31:11.800 | Because the question, I believe the question
01:31:14.440 | in users' minds is, which quadruped do I want?
01:31:17.480 | It's not, oh, do I want a quadruped,
01:31:20.240 | can a quadruped do my job?
01:31:22.520 | It's much more like that,
01:31:23.520 | which is a great place for it to be.
01:31:25.880 | Then you're just, you know, doing the things
01:31:29.400 | you normally do to make your product better,
01:31:31.340 | and compete, and sell, and all that stuff.
01:31:34.800 | And that'll be the way it is with humanoids at some point.
01:31:37.320 | - Well, there's a lot of humanoids,
01:31:38.480 | and you're just not even, it's like iPhone versus Android,
01:31:43.480 | and people are just buying both, and it's kind of just--
01:31:46.740 | - Yeah.
01:31:47.840 | - You're not really--
01:31:48.720 | - You're creating the category.
01:31:50.120 | - Yeah, creating the category.
01:31:50.960 | - Or the category is happening.
01:31:52.640 | I mean, right now, the use cases, you know,
01:31:55.240 | that's the key thing, having realistic use cases
01:31:59.060 | that are money-making in robotics is a big challenge.
01:32:03.560 | You know, there's the warehouse use case.
01:32:05.480 | That's probably the only thing that makes anybody
01:32:07.360 | any money in robotics at this point.
01:32:09.940 | - There's gotta be a moment.
01:32:11.320 | - There's old-fashioned robotics.
01:32:12.320 | I mean, there are fixed arms doing manufacturing.
01:32:14.840 | I don't wanna say that they're not making money.
01:32:17.560 | Industrial robotics, yes, but there's gotta be a moment
01:32:20.760 | when social robotics starts making real money,
01:32:23.320 | meaning like a Spot-type robot in the home,
01:32:26.180 | and there's tens of millions of them in the home,
01:32:28.680 | and they're like, you know, I don't know how many dogs
01:32:30.880 | there are in the United States as pets,
01:32:33.360 | but this feels-- - Many.
01:32:34.440 | - Many, it feels like there's something we love
01:33:37.720 | about having an intelligent companion with us
01:32:40.920 | that remembers us, that's excited to see us,
01:32:42.840 | all that kind of stuff.
01:32:44.100 | - But it's also true that the companies making those things,
01:32:47.200 | there've been a lot of failures in recent times, right?
01:32:49.640 | There's that one year when I think three of 'em went under.
01:32:53.000 | So it's-- - It's not easy.
01:32:53.840 | - It's not that easy to do that, right?
01:32:56.200 | Getting, you know, getting performance, safety, and cost
01:33:01.200 | all to be where they need to be at the same time is,
01:33:04.600 | that's hard.
01:33:07.200 | - But also some of it is, like you said,
01:33:09.120 | you can have a product, but people might not be aware of it,
01:33:13.020 | so like also part of it is the videos,
01:33:14.960 | or however you connect with the public, the culture,
01:33:18.720 | and create the category.
01:33:20.880 | Make people realize this is the thing you want.
01:33:22.880 | 'Cause, you know, there are a lot of negative
01:33:24.600 | perceptions you can have.
01:33:25.800 | Do you really want a system with a camera
01:33:28.560 | in your home walking around, right?
01:33:30.920 | If it's presented correctly, and if there are
01:33:35.120 | the right kinds of boundaries around it
01:33:36.720 | so that you understand how it works and so on,
01:33:38.720 | then a lot of people would want one.
01:33:40.760 | And if they don't, they might be suspicious of it.
01:33:43.400 | So that's an important one.
01:33:45.040 | Like we all use smartphones, and that has a camera
01:33:47.320 | that's looking at us, you know?
01:33:48.960 | - Yeah, it has two or three or four.
01:33:50.680 | - And it's listening.
01:33:51.520 | And very few people are, you know, suspicious about it.
01:33:56.160 | They kind of take it for granted and so on.
01:33:58.000 | And I think robots would be the same kind of way.
01:33:59.980 | - I agree.
01:34:00.820 | - So as you work on the cognitive aspect of these robots,
01:34:06.920 | do you think we'll ever get to human level
01:34:10.160 | or superhuman level intelligence?
01:34:13.160 | There's been a lot of conversations about this recently,
01:34:16.560 | given the rapid development in large language models.
01:34:19.640 | - I think that intelligence is a lot of different things.
01:34:25.200 | And I think some things,
01:34:26.720 | computers are already smarter than people.
01:34:29.080 | And some things, they're not even close.
01:34:32.040 | And, you know, I think you'd need a menu
01:34:34.600 | of detailed categories to come up with that.
01:34:41.080 | But I also think that the conversation
01:34:45.360 | that seems to be happening about AGI puzzles me.
01:34:49.320 | It's sort of, so I ask you a question.
01:34:52.480 | Do you think there's anybody smarter than you in the world?
01:34:55.200 | - Absolutely, yes.
01:34:56.960 | - Do you find that threatening?
01:34:58.280 | - No.
01:34:59.200 | - So I don't understand,
01:35:00.920 | even if computers were smarter than people,
01:35:04.160 | why we should assume that that's a threat.
01:35:06.860 | Especially since they could easily be smarter
01:35:10.520 | but still available to us or under our control,
01:35:13.680 | which is basically how computers generally are.
01:35:17.280 | - I think the fear is that they would be 10x, 100x smarter.
01:35:22.280 | And operating under different morals and ethical codes
01:35:27.760 | than humans naturally do.
01:35:30.480 | And so almost become misaligned in unintended ways.
01:35:35.480 | And therefore harm humans in ways we just can't predict.
01:35:39.440 | And even if we program them to do a thing,
01:35:43.360 | on the way to doing that thing,
01:35:45.400 | they could cause a lot of harm.
01:35:47.160 | And when they're 100x, 1,000x, 10,000x smarter than us,
01:35:51.760 | we won't be able to stop it
01:35:53.000 | or we won't be able to even see the harm
01:35:54.680 | as it's happening until it's too late.
01:35:56.880 | That kind of stuff.
01:35:57.720 | So you can construct all kinds of possible trajectories
01:36:01.160 | of how the world ends because of super intelligent systems.
01:36:04.480 | - It's a little bit like that line in the Oppenheimer movie,
01:36:09.360 | where they contemplate whether, the first time
01:36:12.600 | they set off a reaction,
01:36:14.400 | all matter on Earth is gonna go up.
01:36:19.400 | I don't remember what the verb they used was
01:36:22.000 | for the chain reaction, right?
01:36:25.440 | Yeah, I guess it's possible.
01:36:30.200 | But I personally don't think it's worth worrying about that.
01:36:35.000 | I think that it's an opportunity,
01:36:37.920 | balancing opportunities and risk.
01:36:40.240 | I think if you take any technology,
01:36:42.320 | there's opportunity and risk.
01:36:44.840 | And it's easy to, I'll point at the car.
01:36:47.100 | They pollute, and, what, about 1.25 million people
01:36:53.240 | get killed every year around the world because of them.
01:36:57.440 | Despite that, I think they're a boon to humankind,
01:37:01.880 | very useful, we all love, many of us love them.
01:37:06.120 | And those technical problems can be solved.
01:37:08.440 | I think they are becoming safer.
01:37:10.360 | I think they're becoming less polluting,
01:37:11.920 | at least some of them are.
01:37:13.220 | And every technology you can name
01:37:17.520 | has a story like that, in my opinion.
01:37:19.360 | - What's the story behind the Hawaiian shirt?
01:37:22.880 | Is it a fashion statement, a philosophical statement?
01:37:26.120 | Is it just a statement of rebellion?
01:37:27.920 | Engineering statement?
01:37:31.480 | - It was born of me being a contrarian.
01:37:34.080 | - Yes.
01:37:34.920 | - It's a symbol.
01:37:36.160 | - Someone told me once that I was wearing one
01:37:40.080 | when I only had one or two.
01:37:41.840 | And they said, oh, those things are so old-fashioned,
01:37:44.640 | you can't wear that, Marc.
01:37:46.060 | And I stopped wearing 'em for about a week.
01:37:48.440 | And then I said, I'm not gonna let them tell me what to do.
01:37:52.520 | And so, every day since, pretty much.
01:37:55.680 | - So it's like a symbol.
01:37:56.520 | - That was years ago, that was 20 years ago.
01:37:58.080 | - 20 years.
01:37:58.920 | - 15 years ago, probably.
01:38:00.480 | - That says something about your personality, that's great.
01:38:02.760 | 'Cause you're not.
01:38:04.160 | - It took me a while to realize I was a contrarian.
01:38:06.440 | But, you know, it can be a useful tool.
01:38:08.400 | - Have you had people tell you about,
01:38:11.520 | on the robotics side, that like,
01:38:13.120 | I don't think you could do this?
01:38:15.200 | The kind of negative motivation.
01:38:17.440 | - I'd rather talk about, there's a guy,
01:38:23.920 | when we were doing a lot of DARPA work,
01:38:25.280 | there was a Marine, Ed Tovar, who's still around.
01:38:28.780 | What he would always say is when someone would say,
01:38:33.320 | oh, you can't do that, he'd say, why not?
01:38:35.740 | And it's a great question.
01:38:37.960 | I ask all the time when I'm thinking,
01:38:39.920 | oh, we're not gonna do that, and I say, why not?
01:38:43.800 | And I give him credit for opening my eyes to resisting that.
01:38:48.800 | - So, yeah, yeah, the Hawaiian shirt
01:38:52.440 | is almost like a symbol of why not.
01:38:55.160 | Okay, what advice would you give to young folks
01:38:59.520 | that are trying to figure out what they wanna do
01:39:01.600 | with their life, how to have a life they can be proud of,
01:39:04.520 | they can have a career they can be proud of?
01:39:06.600 | - When I was teaching at MIT, for a while,
01:39:09.800 | I had undergraduate advisees who
01:39:13.000 | would have to meet with me once a semester or something,
01:39:17.200 | and they frequently would ask what they should do.
01:39:20.880 | And I think the advice I used to give
01:39:22.760 | was something like, well, if you had no constraints on you,
01:39:27.360 | no resource constraints, no opportunity constraints,
01:39:31.440 | and no skill constraints, what could you imagine doing?
01:39:35.440 | And I said, well, start there and see how close you can get.
01:39:39.480 | What's realistic for how close you can get?
01:39:42.360 | You know, the other version of that is, you know,
01:39:44.640 | try and figure out what you wanna do and do that.
01:39:47.040 | 'Cause I don't think, a lot of people think
01:39:49.400 | that they're in a channel, right,
01:39:50.640 | and there's only limited opportunities,
01:39:52.760 | but it's usually wider than they think.
01:39:56.880 | - Yeah, the opportunities really are limitless,
01:39:59.040 | but at the same time, you want to pick a thing, right,
01:40:04.040 | and, with diligence, really, really pursue it, right,
01:40:09.520 | like really pursue it.
01:40:10.840 | 'Cause sometimes the really special stuff
01:40:16.240 | happens after years of pursuit.
01:40:18.200 | - Yeah, oh, absolutely, it can take a while.
01:40:21.720 | - I mean, you've been doing this for 40-plus years.
01:40:24.840 | - Some people think I'm in a rut, right, what did I do?
01:40:27.280 | And in fact, some of the inspiration for the AI Institute
01:40:31.760 | is to say, okay, I've been working on locomotion
01:40:35.280 | for however many years it was, let's do something else.
01:40:39.880 | And it's a really fascinating and interesting challenge.
01:40:44.880 | - And you're hoping to show it off also,
01:40:47.600 | in the same way as--
01:40:48.520 | - Just about to start showing some stuff off, yeah.
01:40:51.800 | I hope we have a YouTube channel.
01:40:53.320 | I mean, one of the challenges is,
01:40:54.520 | it's one thing to show athletic skills on YouTube.
01:40:59.040 | Showing cognitive function is a lot harder,
01:41:02.560 | and I haven't quite figured out yet how that's gonna work.
01:41:05.580 | - There might be a way.
01:41:07.920 | - There's a way.
01:41:08.760 | - There's a way. - Why not?
01:41:10.360 | - I also do think sucking at a task is also compelling,
01:41:15.360 | like the incremental improvement,
01:41:19.040 | a robot being really terrible at a task
01:41:21.560 | and then slowly becoming better.
01:41:23.240 | Even in athletic intelligence, honestly,
01:41:25.280 | learning to walk and falling and slowly figuring that out,
01:41:29.240 | I think there's something extremely compelling about that.
01:41:32.440 | We like flaws, especially with a cognitive task.
01:41:35.440 | It's okay to be clumsy.
01:41:36.560 | It's okay to be confused and a little silly
01:41:39.640 | and all that kind of stuff.
01:41:41.760 | It feels like in that space is where we can--
01:41:45.260 | - There's charm.
01:41:46.240 | - There's charm.
01:41:47.100 | There's charm and there's something inspiring
01:41:50.860 | about a robot sucking and then becoming less terrible
01:41:54.800 | slowly at a task.
01:41:56.940 | - I think you're right.
01:41:57.960 | - That kind of reveals something about ourselves.
01:42:01.080 | Ultimately, that's what's one of the coolest things
01:42:04.080 | about robots is it's kind of a mirror
01:42:05.720 | about what makes humans special.
01:42:08.840 | Just by watching how hard it is to make a robot
01:42:11.960 | do the things that humans do,
01:42:13.600 | you realize how special we are.
01:42:15.160 | What do you think is the meaning of this whole thing?
01:42:20.120 | Why are we here, Marc?
01:42:22.840 | You ever ask about the big questions
01:42:26.420 | as you try to create these humanoid,
01:42:28.900 | human-like intelligence systems?
01:42:32.100 | - I don't know.
01:42:32.940 | I think you have to have fun while you're here.
01:42:34.980 | That's about all I know.
01:42:36.300 | It would be a waste not to, right?
01:42:40.400 | - The ride is pretty short, so might as well have fun.
01:42:43.380 | Marc, I'm a huge fan of yours.
01:42:47.260 | It's a huge honor that you would talk with me.
01:42:49.140 | This is really amazing.
01:42:50.340 | Your work for many decades has been amazing.
01:42:52.740 | I can't wait to see what you do at the AI Institute.
01:42:56.100 | I'm gonna be waiting impatiently for the videos
01:43:00.580 | and the demos and the next robot meetup
01:43:03.540 | for maybe Atlas and Optimus to hang out.
01:43:07.140 | - I would love to do that.
01:43:08.340 | That would be fun.
01:43:09.460 | - Thank you so much for talking to me.
01:43:10.800 | - Thank you.
01:43:11.640 | It was fun talking to you.
01:43:13.220 | - Thanks for listening to this conversation
01:43:14.740 | with Marc Raibert.
01:43:16.020 | To support this podcast, please check out our sponsors
01:43:18.620 | in the description.
01:43:19.980 | And now, let me leave you with some words
01:43:21.740 | from Arthur C. Clarke.
01:43:23.200 | "Whether we're based on carbon or on silicon
01:43:27.300 | "makes no fundamental difference.
01:43:29.660 | "We should each be treated with appropriate respect."
01:43:34.660 | Thank you for listening and hope to see you next time.
01:43:37.980 | (upbeat music)