
Russ Tedrake: Underactuated Robotics, Control, Dynamics and Touch | Lex Fridman Podcast #114


Chapters

0:00 Introduction
4:29 Passive dynamic walking
9:40 Animal movement
13:34 Control vs Dynamics
15:49 Bipedal walking
20:56 Running barefoot
33:01 Think rigorously with machine learning
44:05 DARPA Robotics Challenge
67:14 When will a robot become UFC champion
78:32 Black Mirror Robot Dog
94:01 Robot control
107:00 Simulating robots
120:33 Home robotics
123:40 Soft robotics
127:25 Underactuated robotics
140:42 Touch
148:55 Book recommendations
160:08 Advice to young people
164:20 Meaning of life

Whisper Transcript

00:00:00.000 | The following is a conversation with Russ Tedrake,
00:00:03.000 | a roboticist and professor at MIT
00:00:05.560 | and vice president of robotics research
00:00:07.880 | at Toyota Research Institute or TRI.
00:00:11.240 | He works on control of robots in interesting,
00:00:15.160 | complicated, underactuated, stochastic,
00:00:18.000 | difficult to model situations.
00:00:19.960 | He's a great teacher and a great person.
00:00:22.640 | One of my favorites at MIT.
00:00:25.040 | We'll get into a lot of topics in this conversation
00:00:28.280 | from his time leading MIT's DARPA Robotics Challenge Team
00:00:32.760 | to the awesome fact that he often runs
00:00:35.400 | close to a marathon a day to and from work barefoot.
00:00:40.400 | For a world-class roboticist interested
00:00:42.800 | in elegant, efficient control
00:00:44.520 | of underactuated dynamical systems like the human body,
00:00:49.300 | this fact makes Russ one of the most
00:00:51.440 | fascinating people I know.
00:00:53.200 | Quick summary of the ads.
00:00:55.800 | Three sponsors, Magic Spoon Cereal,
00:00:58.440 | BetterHelp and ExpressVPN.
00:01:00.800 | Please consider supporting this podcast
00:01:02.640 | by going to magicspoon.com/lex
00:01:05.720 | and using code LEX at checkout,
00:01:08.000 | going to betterhelp.com/lex
00:01:10.520 | and signing up at expressvpn.com/lexpod.
00:01:14.680 | Click the links in the description,
00:01:16.520 | buy the stuff, get the discount.
00:01:18.840 | It really is the best way to support this podcast.
00:01:21.840 | If you enjoy this thing, subscribe on YouTube,
00:01:24.040 | review it with five stars on Apple Podcast,
00:01:26.240 | support it on Patreon or connect with me
00:01:28.280 | on Twitter @lexfridman.
00:01:31.280 | As usual, I'll do a few minutes of ads now
00:01:33.640 | and never any ads in the middle
00:01:34.880 | that can break the flow of the conversation.
00:01:37.880 | This episode is supported by Magic Spoon,
00:01:40.880 | low-carb, keto-friendly cereal.
00:01:43.440 | I've been on a mix of keto or carnivore diet
00:01:45.800 | for a very long time now.
00:01:47.320 | That means eating very little carbs.
00:01:50.520 | I used to love cereal.
00:01:52.200 | Obviously, most have crazy amounts of sugar,
00:01:54.960 | which is terrible for you, so I quit years ago.
00:01:58.000 | But Magic Spoon is a totally new thing.
00:02:00.440 | Zero sugar, 11 grams of protein,
00:02:03.040 | and only three net grams of carbs.
00:02:05.720 | It tastes delicious.
00:02:07.240 | It has a bunch of flavors, they're all good,
00:02:09.640 | but if you know what's good for you, you'll go with cocoa,
00:02:12.560 | my favorite flavor and the flavor of champions.
00:02:15.840 | Click the magicspoon.com/lex link in the description,
00:02:19.480 | use code LEX at checkout to get the discount
00:02:22.160 | and to let them know I sent you.
00:02:24.400 | So buy all of their cereal.
00:02:26.680 | It's delicious and good for you.
00:02:28.640 | You won't regret it.
00:02:29.640 | The show is also sponsored by BetterHelp,
00:02:33.160 | spelled H-E-L-P, help.
00:02:36.040 | Check it out at betterhelp.com/lex.
00:02:39.440 | They figure out what you need
00:02:40.600 | and match you with a licensed professional therapist
00:02:43.240 | in under 48 hours.
00:02:44.960 | It's not a crisis line, it's not self-help,
00:02:47.640 | it is professional counseling done securely online.
00:02:51.040 | As you may know, I'm a bit from the David Goggins line
00:02:53.720 | of creatures and so have some demons to contend with,
00:02:57.080 | usually on long runs or all-nighters full of self-doubt.
00:03:01.560 | I think suffering is essential for creation,
00:03:04.360 | but you can suffer beautifully
00:03:06.040 | in a way that doesn't destroy you.
00:03:08.200 | For most people, I think a good therapist can help in this,
00:03:11.540 | so it's at least worth a try.
00:03:13.400 | Check out the reviews, they're all good.
00:03:15.640 | It's easy, private, affordable, available worldwide.
00:03:19.220 | You can communicate by text anytime
00:03:21.640 | and schedule weekly audio and video sessions.
00:03:25.080 | Check it out at betterhelp.com/lex.
00:03:28.500 | This show is also sponsored by ExpressVPN.
00:03:31.840 | Get it at expressvpn.com/lexpod to get a discount
00:03:35.700 | and to support this podcast.
00:03:37.680 | Have you ever watched "The Office"?
00:03:39.680 | If you have, you probably know it's based
00:03:41.920 | on a UK series also called "The Office."
00:03:45.120 | Not to stir up trouble, but I personally think
00:03:48.080 | the British version is actually more brilliant
00:03:50.320 | than the American one, but both are amazing.
00:03:53.120 | Anyway, there are actually nine other countries
00:03:56.140 | with their own version of "The Office."
00:03:58.400 | You can get access to them with no geo-restriction
00:04:01.180 | when you use ExpressVPN.
00:04:03.600 | It lets you control where you want sites
00:04:05.560 | to think you're located.
00:04:07.340 | You can choose from nearly 100 different countries,
00:04:10.380 | giving you access to content
00:04:12.120 | that isn't available in your region.
00:04:14.040 | So again, get it on any device at expressvpn.com/lexpod
00:04:19.040 | to get an extra three months free
00:04:22.060 | and to support this podcast.
00:04:25.000 | And now here's my conversation with Russ Tedrake.
00:04:28.620 | What is the most beautiful motion of an animal or robot
00:04:33.440 | that you've ever seen?
00:04:34.540 | - I think the most beautiful motion of a robot
00:04:38.280 | has to be the passive dynamic walkers.
00:04:41.120 | I think there's just something fundamentally beautiful.
00:04:43.320 | The ones in particular that Steve Collins built
00:04:45.360 | with Andy Ruina at Cornell, a 3D walking machine.
00:04:50.360 | So it was not confined to a boom or a plane.
00:04:53.720 | You put it on top of a small ramp,
00:04:57.460 | give it a little push.
00:04:59.100 | It's powered only by gravity,
00:05:00.500 | no controllers, no batteries whatsoever.
00:05:04.320 | It just falls down the ramp.
00:05:06.160 | And at the time it looked more natural, more graceful,
00:05:09.520 | more human-like than any robot we'd seen to date.
00:05:13.460 | Powered only by gravity.
00:05:15.240 | - How does it work?
00:05:16.200 | - Well, okay, the simplest model,
00:05:18.480 | it's kind of like a slinky.
00:05:19.480 | It's like an elaborate slinky.
00:05:21.520 | One of the simplest models we use to think about it
00:05:23.820 | is actually a rimless wheel.
00:05:25.340 | So imagine taking a bicycle wheel, but take the rim off.
00:05:30.080 | So it's now just got a bunch of spokes.
00:05:32.640 | If you give that a push,
00:05:33.720 | it still wants to roll down the ramp.
00:05:35.840 | But every time its foot, its spoke comes around
00:05:38.180 | and hits the ground, it loses a little energy.
00:05:40.660 | Every time it takes a step forward,
00:05:43.320 | it gains a little energy.
00:05:44.580 | Those things can come into perfect balance.
00:05:48.240 | And actually they want to, it's a stable phenomenon.
00:05:51.280 | If it's going too slow, it'll speed up.
00:05:53.760 | If it's going too fast, it'll slow down.
00:05:55.920 | And it comes into a stable periodic motion.
00:05:58.200 | Now you can take that rimless wheel,
00:06:02.160 | which doesn't look very much like a human walking,
00:06:05.080 | take all the extra spokes away, put a hinge in the middle.
00:06:08.120 | Now it's two legs.
00:06:09.740 | That's called our compass gait walker.
00:06:12.020 | That can still, you give it a little push,
00:06:13.840 | starts falling down a ramp.
00:06:15.560 | Looks a little bit more like walking.
00:06:17.280 | At least it's a biped.
00:06:18.400 | But what Steve and Andy and Tad McGeer
00:06:22.360 | started the whole exercise,
00:06:23.520 | but what Steve and Andy did was they took it
00:06:25.240 | to this beautiful conclusion
00:06:27.480 | where they built something that had knees, arms, a torso,
00:06:32.460 | the arms swung naturally, give it a little push.
00:06:36.320 | And that looked like a stroll through the park.
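
To make the rimless-wheel story above concrete, here is a minimal sketch of its return map (my illustration, not anything from the conversation; the parameters are invented, a point mass at the hub is assumed, and the regime where the wheel is too slow to vault over its stance spoke is ignored). The angular speed just after each spoke impact converges to the same stable rolling speed whether it starts too slow or too fast.

```python
import numpy as np

# Rimless wheel on a ramp: spokes of length l, half-angle alpha between spokes,
# ramp slope gamma. All numbers are made up for illustration.
g     = 9.81        # gravity, m/s^2
l     = 1.0         # spoke length, m
alpha = np.pi / 8   # half the angle between adjacent spokes, rad
gamma = 0.08        # ramp slope, rad

gain = (4 * g / l) * np.sin(alpha) * np.sin(gamma)  # speed^2 gained falling through one step
c    = np.cos(2 * alpha)                            # fraction of angular speed kept at each impact

def step(omega):
    """Angular speed just after the next spoke impact, given the speed after this one."""
    return c * np.sqrt(omega**2 + gain)

# Fixed point of the map: the steady rolling speed the wheel settles into.
omega_star = c * np.sqrt(gain / (1 - c**2))

# Too slow or too fast, the iteration converges to the same fixed point.
for omega0 in (0.2, 3.0):
    omega = omega0
    for _ in range(30):
        omega = step(omega)
    print(f"start {omega0:.1f} rad/s -> after 30 steps {omega:.3f} rad/s (fixed point {omega_star:.3f})")
```
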
00:06:38.720 | - How do you design something like that?
00:06:40.240 | I mean, is that art or science?
00:06:42.360 | - It's on the boundary.
00:06:43.800 | I think there's a science to getting close to the solution.
00:06:47.640 | I think there's certainly art in the way that they,
00:06:50.160 | they made a beautiful robot.
00:06:52.000 | But then the finesse, because this was,
00:06:56.640 | they were working with a system
00:06:57.600 | that wasn't perfectly modeled, wasn't perfectly controlled.
00:07:01.060 | There's all these little tricks
00:07:02.800 | that you have to tune the suction cups at the knees,
00:07:05.480 | for instance, so that they stick,
00:07:07.960 | but then they release at just the right time.
00:07:09.640 | Or there's all these little tricks of the trade,
00:07:12.360 | which really are art, but it was a point.
00:07:14.440 | I mean, it made the point.
00:07:16.200 | We were at that time, the walking robot,
00:07:18.800 | the best walking robot in the world was Honda's ASIMO.
00:07:21.840 | Absolute marvel of modern engineering.
00:07:24.120 | - This is 90s?
00:07:25.240 | - This was in 97 when they first released,
00:07:27.440 | it sort of announced P2 and then it went through,
00:07:29.920 | it was ASIMO by then in 2004.
00:07:34.880 | - It looks like this very cautious walking,
00:07:37.840 | like you're walking on hot coals or something like that.
00:07:41.280 | - I think it gets a bad rap.
00:07:43.760 | ASIMO is a beautiful machine.
00:07:45.340 | It does walk with its knees bent.
00:07:47.000 | Our Atlas walking had its knees bent,
00:07:49.760 | but actually ASIMO was pretty fantastic,
00:07:52.360 | but it wasn't energy efficient.
00:07:54.320 | Neither was Atlas when we worked on Atlas.
00:07:56.660 | None of our robots that have been that complicated
00:08:00.520 | have been very energy efficient.
00:08:04.060 | But there's a thing that happens when you do control,
00:08:09.060 | when you try to control a system of that complexity.
00:08:12.480 | You try to use your motors to basically counteract gravity.
00:08:16.480 | Take whatever the world's doing to you and push back,
00:08:20.680 | erase the dynamics of the world
00:08:23.520 | and impose the dynamics you want
00:08:25.040 | because you can make them simple and analyzable,
00:08:28.220 | mathematically simple.
00:08:30.780 | And this was a very sort of beautiful example
00:08:34.420 | that you don't have to do that.
00:08:36.380 | You can just let go, let physics do most of the work.
00:08:39.080 | And you just have to give it a little bit of energy.
00:08:42.180 | This one only walked down a ramp.
00:08:43.540 | It would never walk on the flat.
00:08:45.340 | To walk on the flat,
00:08:46.160 | you have to give a little energy at some point.
00:08:48.460 | But maybe instead of trying to take the forces
00:08:51.580 | imparted to you by the world and replacing them,
00:08:55.200 | what we should be doing is letting the world push us around
00:08:58.220 | and we go with the flow.
00:08:59.340 | Very zen, very zen robot.
00:09:01.260 | - Yeah, but okay, so that sounds very zen.
00:09:03.420 | But I can also imagine how many
00:09:08.220 | failed versions they had to go through.
00:09:11.340 | I would say it's probably,
00:09:14.020 | would you say it's in the thousands
00:09:15.320 | that they've had to have the system fall down
00:09:17.940 | before they figured out how to get--
00:09:19.860 | - I don't know if it's thousands, but it's a lot.
00:09:22.580 | It takes some patience, there's no question.
00:09:25.020 | - So in that sense, control might help a little bit.
00:09:28.340 | - Oh, I think everybody, even at the time,
00:09:32.140 | said that the answer is to do that with control.
00:09:35.060 | But it was just pointing out
00:09:36.380 | that maybe the way we're doing control right now
00:09:39.180 | isn't the way we should.
00:09:41.100 | - Got it, so what about on the animal side?
00:09:43.860 | The ones that figured out how to move efficiently?
00:09:46.220 | Is there anything you find inspiring or beautiful
00:09:49.460 | in the movement of any particular animal?
00:09:50.300 | - I do have a favorite example.
00:09:52.020 | - Okay.
00:09:52.860 | (laughing)
00:09:54.380 | - So it sort of goes with the passive walking idea.
00:09:57.180 | So is there, how energy efficient are animals?
00:10:01.420 | Okay, there's a great series of experiments
00:10:03.820 | by George Lauder at Harvard and Mike Triantafyllou at MIT.
00:10:07.500 | They were studying fish swimming in a water tunnel, okay?
00:10:11.820 | And one of these, the type of fish they were studying
00:10:15.260 | were these rainbow trout,
00:10:17.260 | because there was a phenomenon well understood
00:10:20.380 | that rainbow trout,
00:10:21.220 | when they're swimming upstream at mating season,
00:10:23.520 | they kind of hang out behind the rocks.
00:10:25.100 | And it looks like, I mean,
00:10:26.060 | that's tiring work swimming upstream.
00:10:28.100 | They're hanging out behind the rocks.
00:10:29.180 | Maybe there's something energetically interesting there.
00:10:31.980 | So they tried to recreate that.
00:10:33.400 | They put in this water tunnel, a rock basically,
00:10:36.420 | a cylinder that had the same sort of vortex street,
00:10:40.580 | the eddies coming off the back of the rock
00:10:42.460 | that you would see in a stream.
00:10:44.260 | And they put a real fish behind this
00:10:46.100 | and watched how it swims.
00:10:48.020 | And the amazing thing is that if you watch from above
00:10:51.980 | what the fish swims when it's not behind a rock,
00:10:53.800 | it has a particular gait.
00:10:56.100 | You can identify the fish the same way you look at a human
00:10:58.760 | walking down the street.
00:10:59.820 | You sort of have a sense of how a human walks.
00:11:02.440 | The fish has a characteristic gait.
00:11:04.200 | You put that fish behind the rock, its gait changes.
00:11:07.960 | And what they saw was that it was actually resonating
00:11:12.700 | and kind of surfing between the vortices.
00:11:15.120 | Now, here was the experiment that really was the clincher,
00:11:20.120 | because there was still,
00:11:20.960 | it wasn't clear how much of that was mechanics of the fish,
00:11:23.960 | how much of that is control, the brain.
00:11:26.900 | So the clincher experiment,
00:11:28.440 | and maybe one of my favorites to date,
00:11:29.760 | although there are many good experiments.
00:11:31.980 | They took, this was now a dead fish.
00:11:37.020 | They took a dead fish.
00:11:40.160 | They put a string that went,
00:11:41.600 | that tied the mouth of the fish to the rock.
00:11:44.120 | So it couldn't go back and get caught in the grates.
00:11:47.120 | And then they asked, what would that dead fish do
00:11:49.160 | when it was hanging out behind the rock?
00:11:51.160 | And so what you'd expect,
00:11:52.000 | it sort of flopped around like a dead fish
00:11:53.760 | in the vortex wake,
00:11:56.120 | until something sort of amazing happens.
00:11:57.760 | And this video is worth putting in.
00:12:00.960 | - What happens?
00:12:04.000 | - The dead fish basically starts swimming upstream.
00:12:06.560 | It's completely dead, no brain, no motors, no control,
00:12:12.120 | but it's somehow the mechanics of the fish
00:12:14.560 | resonate with the vortex street,
00:12:16.320 | and it starts swimming upstream.
00:12:18.240 | It's one of the best examples ever.
00:12:20.480 | - Who do you give credit for that to?
00:12:23.720 | Is that just evolution constantly just figuring out
00:12:27.960 | by killing a lot of generations of animals,
00:12:30.880 | like the most efficient motion?
00:12:32.700 | Or maybe the physics of our world completely,
00:12:37.440 | is evolution applied not only to animals,
00:12:40.900 | but just the entirety of it somehow drives to efficiency?
00:12:45.200 | Like nature likes efficiency?
00:12:48.120 | I don't know if that question even makes any sense.
00:12:50.000 | - I understand the question.
00:12:51.600 | I mean, do they co-evolve?
00:12:54.480 | - Yeah, somehow co, yeah.
00:12:56.400 | I don't know if an environment can evolve, but.
00:12:59.020 | - I mean, there are experiments that people do,
00:13:02.320 | careful experiments that show that animals
00:13:05.400 | can adapt to unusual situations and recover efficiency.
00:13:08.680 | So there seems like, at least in one direction,
00:13:11.120 | I think there is reason to believe
00:13:12.720 | that the animal's motor system,
00:13:14.540 | and probably its mechanics,
00:13:18.080 | adapt in order to be more efficient.
00:13:20.040 | But efficiency isn't the only goal, of course.
00:13:23.080 | Sometimes it's too easy to think about only efficiency.
00:13:26.160 | But we have to do a lot of other things first,
00:13:28.840 | not get eaten, and then all other things being equal,
00:13:32.820 | try to save energy.
00:13:34.080 | - By the way, let's draw a distinction
00:13:36.080 | between control and mechanics.
00:13:38.120 | Like how would you define each?
00:13:40.780 | - Yeah, I mean, I think part of the point is that
00:13:43.900 | we shouldn't draw a line as clearly as we tend to.
00:13:47.840 | But on a robot, we have motors,
00:13:51.420 | and we have the links of the robot, let's say.
00:13:54.800 | If the motors are turned off,
00:13:56.240 | the robot has some passive dynamics.
00:13:58.180 | Gravity does the work.
00:14:01.320 | You can put springs, I would call that mechanics.
00:14:03.680 | If we have springs and dampers,
00:14:04.920 | which our muscles are springs and dampers and tendons.
00:14:07.600 | But then you have something that's doing active work,
00:14:10.400 | putting energy in, which are your motors on the robot.
00:14:13.200 | The controller's job is to send commands to the motor
00:14:16.560 | that add new energy into the system.
00:14:18.600 | So the mechanics and control interplay
00:14:22.440 | somewhere the divide is around,
00:14:24.760 | did you decide to send some commands to your motor,
00:14:27.520 | or did you just leave the motors off,
00:14:28.920 | let them do their work?
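
A toy illustration of that mechanics/control divide (a sketch with invented parameters, not anything from the episode): simulate a damped pendulum once with the motor off, so only the passive dynamics act, and once with a simple PD controller driving it to the upright, and integrate tau * omega to see how much energy the controller injects into the system.

```python
import numpy as np

# Toy pendulum: theta'' = (-b*theta' - m*g*l*sin(theta) + tau) / (m*l^2).
# Parameters and gains are invented for illustration.
m, l, b, g, dt = 1.0, 1.0, 0.1, 9.81, 1e-3

def simulate(controller, theta0=0.5, seconds=10.0):
    theta, omega, work = theta0, 0.0, 0.0
    for _ in range(int(seconds / dt)):
        tau = controller(theta, omega)                               # the "control" part
        work += tau * omega * dt                                     # energy the motor adds
        acc = (-b * omega - m * g * l * np.sin(theta) + tau) / (m * l**2)
        omega += acc * dt                                            # the "mechanics" part
        theta += omega * dt                                          # (semi-implicit Euler)
    return theta, work

def motors_off(theta, omega):
    return 0.0                                    # pure passive dynamics: gravity plus damping

def pd_to_upright(theta, omega):
    return 20.0 * (np.pi - theta) - 2.0 * omega   # active control: pumps energy in

for name, ctrl in [("motors off", motors_off), ("PD to upright", pd_to_upright)]:
    theta, work = simulate(ctrl)
    print(f"{name:13s}: final angle {theta:6.3f} rad, actuator work {work:6.2f} J")
```
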
00:14:30.520 | - So would you say is most of nature
00:14:33.880 | on the dynamic side or the control side?
00:14:39.800 | So like if you look at biological systems,
00:14:42.240 | we're living in a pandemic now,
00:14:45.320 | do you think a virus is a,
00:14:46.680 | do you think it's a dynamic system,
00:14:50.040 | or is there a lot of control, intelligence?
00:14:54.060 | - I think it's both, but I think we maybe
00:14:56.160 | have underestimated how important the dynamics are.
00:14:58.720 | I mean even our bodies, the mechanics of our bodies,
00:15:04.280 | certainly with exercise they evolve.
00:15:06.240 | So I actually, I lost a finger in early 2000s,
00:15:11.000 | and it's my fifth metacarpal.
00:15:14.400 | And it turns out you use that a lot,
00:15:16.560 | in ways you don't expect when you're opening jars,
00:15:19.280 | even when I'm just walking around,
00:15:20.580 | if I bump it on something,
00:15:22.480 | there's a bone there that was used to taking contact.
00:15:26.720 | My fourth metacarpal wasn't used to taking contact,
00:15:28.800 | it used to hurt, still does a little bit.
00:15:31.060 | But actually my bone has remodeled, right?
00:15:33.900 | Over a couple of years, the geometry,
00:15:39.560 | the mechanics of that bone changed
00:15:42.080 | to address the new circumstances.
00:15:44.300 | So the idea that somehow it's only our brain
00:15:46.800 | that's adapting or evolving is not right.
00:15:48.920 | - Maybe sticking on evolution for a bit,
00:15:52.520 | 'cause it's tended to create some interesting things.
00:15:56.680 | Bipedal walking, why the heck did evolution give us,
00:16:01.680 | I think we're, are we the only mammals that walk on two feet?
00:16:06.520 | - No, I mean there's a bunch of animals that do it, a bit.
00:16:10.520 | I think we are the most successful bipeds.
00:16:13.560 | - I think I read somewhere that the reason
00:16:18.560 | the evolution made us walk on two feet
00:16:24.120 | is because there's an advantage to being able
00:16:26.780 | to carry food back to the tribe or something like that.
00:16:29.600 | So you can carry, it's kind of this communal,
00:16:33.380 | cooperative thing, so to carry stuff back
00:16:36.520 | to a place of shelter and so on to share with others.
00:16:41.640 | - Do you understand at all the value of walking on two feet
00:16:46.040 | from both a robotics and a human perspective?
00:16:49.400 | - Yeah, there are some great books written
00:16:51.680 | about evolution of, walking evolution of the human body.
00:16:56.080 | I think it's easy though to make bad evolutionary arguments.
00:17:00.600 | - Sure.
00:17:02.200 | Most of them are probably bad, but what else can we do?
00:17:05.320 | - I mean I think a lot of what dominated our evolution
00:17:11.120 | probably was not the things that worked well
00:17:15.080 | sort of in the steady state, when things are good.
00:17:20.080 | But for instance, people talk about what we should eat now
00:17:25.040 | because our ancestors were meat eaters or whatever.
00:17:28.320 | - Oh yeah, I love that, yeah.
00:17:30.240 | - But probably the reason that one pre-Homo sapien species
00:17:35.240 | versus another survived was not because of
00:17:40.700 | whether they ate well when there was lots of food,
00:17:45.320 | but when the ice age came, probably one of them
00:17:49.600 | happened to be in the wrong place,
00:17:50.960 | one of them happened to forage a food that was okay
00:17:54.200 | even when the glaciers came or something like that.
00:17:57.720 | - There's a million variables that contributed
00:18:00.560 | and we can't, and are actually, the amount of information
00:18:04.080 | we're working with in telling these stories,
00:18:06.680 | these evolutionary stories is very little.
00:18:10.220 | So yeah, just like you said, it seems like,
00:18:13.040 | if you study history, it seems like history turns
00:18:15.680 | on these little events that otherwise would seem meaningless,
00:18:20.680 | but in the grand, in retrospect, were turning points.
00:18:26.480 | - Absolutely.
00:18:28.360 | - And that's probably how, like somebody got hit
00:18:30.920 | in the head with a rock because somebody slept
00:18:33.600 | with the wrong person back in the cave days
00:18:37.020 | and somebody got angry and that turned,
00:18:40.200 | warring tribes combined with the environment,
00:18:43.280 | all those millions of things and the meat eating,
00:18:46.480 | which I get a lot of criticism because I don't know,
00:18:49.360 | I don't know what your dietary processes are like,
00:18:51.480 | but these days I've been eating only meat,
00:18:55.040 | which is, there's a large community of people who say,
00:18:59.080 | yeah, probably make evolutionary arguments
00:19:01.080 | and say you're doing a great job.
00:19:02.720 | There's probably an even larger community of people,
00:19:05.760 | including my mom, who says it's a deeply unhealthy,
00:19:08.520 | it's wrong, but I just feel good doing it.
00:19:10.760 | But you're right, these evolutionary arguments
00:19:12.980 | can be flawed, but is there anything interesting
00:19:15.440 | to pull out for--
00:19:17.320 | - There's a great book, by the way,
00:19:19.360 | well, a series of books by Nassim Nicholas Taleb
00:19:21.280 | about Fooled by Randomness and The Black Swan.
00:19:23.840 | Highly recommend them, but yeah,
00:19:26.840 | they make the point nicely that probably it was
00:19:30.320 | a few random events that, yes, maybe it was someone
00:19:35.320 | getting hit by a rock, as you say.
00:19:38.660 | - That said, do you think, I don't know how to ask
00:19:42.540 | this question or how to talk about this,
00:19:44.060 | but there's something elegant and beautiful
00:19:45.660 | about moving on two feet.
00:19:47.820 | Obviously biased, because I'm human,
00:19:50.300 | but from a robotics perspective, too,
00:19:53.300 | you work with robots on two feet.
00:19:55.420 | Is it all useful to build robots that are on two feet
00:20:00.100 | as opposed to four?
00:20:01.100 | Is there something useful about it?
00:20:02.340 | - I think the most, I mean, the reason I spent a long time
00:20:05.500 | working on bipedal walking was because it was hard,
00:20:08.980 | and it challenged control theory in ways
00:20:12.460 | that I thought were important.
00:20:13.960 | I wouldn't have ever tried to convince you
00:20:18.540 | that you should start a company around bipeds
00:20:22.420 | or something like this.
00:20:24.260 | There are people that make pretty compelling arguments.
00:20:26.100 | I think the most compelling one is that the world
00:20:28.940 | is built for the human form, and if you want a robot
00:20:32.300 | to work in the world we have today, then having a human form
00:20:36.900 | is a pretty good way to go.
00:20:38.260 | There are places that a biped can go that would be hard
00:20:42.620 | for other form factors to go, even natural places.
00:20:47.620 | But at some point in the long run, we'll be building
00:20:51.940 | our environments for our robots, probably,
00:20:54.260 | and so maybe that argument falls aside.
00:20:56.540 | - So you famously run barefoot.
00:20:58.820 | Do you still run barefoot?
00:21:02.180 | - I still run barefoot.
00:21:03.100 | - That's so awesome.
00:21:04.780 | - Much to my wife's chagrin.
00:21:06.340 | - Do you wanna make an evolutionary argument
00:21:09.340 | for why running barefoot is advantageous?
00:21:11.820 | What have you learned about human and robot movement
00:21:17.540 | in general from running barefoot?
00:21:19.840 | Human or robot and/or?
00:21:23.660 | - Well, you know, it happened the other way, right?
00:21:25.640 | So I was studying walking robots, and there's a great
00:21:30.740 | conference called the Dynamic Walking Conference,
00:21:34.300 | where it brings together both the biomechanics community
00:21:36.980 | and the walking robots community.
00:21:39.900 | And so I had been going to this for years and hearing
00:21:43.060 | talks by people who study barefoot running
00:21:45.060 | and other mechanics of running.
00:21:46.740 | So I did eventually read Born to Run.
00:21:50.280 | Most people read Born to Run and then--
00:21:51.860 | - First then.
00:21:52.700 | - Right?
00:21:54.060 | The other thing I had going for me is actually that
00:21:55.820 | I wasn't a runner before, and I learned to run
00:22:00.700 | after I had learned about barefoot running,
00:22:02.860 | or I mean, started running longer distances.
00:22:05.420 | So I didn't have to unlearn.
00:22:07.300 | And I'm definitely, I'm a big fan of it for me,
00:22:11.020 | but I'm not gonna, I tend to not try to convince
00:22:13.980 | other people.
00:22:14.800 | There's people who run beautifully with shoes on,
00:22:17.180 | and that's good.
00:22:18.260 | But here's why it makes sense for me.
00:22:24.060 | It's all about the long-term game, right?
00:22:26.380 | So I think it's just too easy to run 10 miles,
00:22:29.460 | feel pretty good, and then you get home at night
00:22:31.580 | and you realize, my knees hurt.
00:22:33.820 | I did something wrong, right?
00:22:35.500 | If you take your shoes off, then if you hit hard
00:22:41.580 | with your foot at all, then it hurts.
00:22:45.720 | You don't like run 10 miles and then realize
00:22:49.300 | you've done some damage.
00:22:50.800 | You have immediate feedback telling you that you've done
00:22:53.340 | something that's maybe suboptimal.
00:22:55.420 | And you change your gait.
00:22:56.540 | I mean, it's even subconscious.
00:22:57.720 | If I, right now, having run many miles barefoot,
00:23:00.620 | if I put a shoe on, my gait changes in a way
00:23:03.460 | that I think is not as good.
00:23:04.860 | So it makes me land softer.
00:23:09.540 | And I think my goals for running are to do it
00:23:13.780 | for as long as I can into old age, not to win any races.
00:23:18.780 | And so for me, this is a way to protect myself.
00:23:23.440 | - Yeah, I think, first of all, I've tried running barefoot
00:23:27.760 | many years ago, probably the other way,
00:23:30.520 | just reading Born to Run.
00:23:33.960 | But just to understand, because I felt like I couldn't
00:23:38.760 | put in the miles that I wanted to.
00:23:40.880 | And it feels like running, for me, and I think
00:23:44.720 | for a lot of people, was one of those activities
00:23:47.600 | that we do often and we never really try to learn
00:23:51.160 | to do correctly.
00:23:52.140 | Like, it's funny, there's so many activities we do every day
00:23:57.440 | like brushing our teeth, right?
00:24:00.320 | I think a lot of us, at least me, probably have never
00:24:03.400 | deeply studied how to properly brush my teeth, right?
00:24:07.080 | Or wash, as now with the pandemic,
00:24:09.000 | or how to properly wash our hands.
00:24:10.680 | We do it every day, but we haven't really studied,
00:24:13.840 | like, am I doing this correctly?
00:24:15.240 | But running felt like one of those things
00:24:17.120 | that it was absurd not to study how to do correctly
00:24:20.240 | 'cause it's the source of so much pain and suffering.
00:24:23.320 | Like, I hate running, but I do it.
00:24:25.720 | I do it because I hate it, but I feel good afterwards.
00:24:28.960 | But I think it feels like you need to learn
00:24:30.600 | how to do it properly, so that's where barefoot running
00:24:33.160 | came in, and then I quickly realized that my gait
00:24:35.760 | was completely wrong.
00:24:38.040 | I was taking huge, like, steps and landing hard on the heel,
00:24:43.040 | all those elements.
00:24:45.840 | And so, yeah, from that, I actually learned
00:24:47.600 | to take really small steps.
00:24:49.200 | Look, I already forgot the number,
00:24:52.320 | but I feel like it was 180 a minute or something like that.
00:24:55.640 | And I remember I actually just took songs
00:25:00.160 | that are 180 beats per minute,
00:25:03.440 | and then, like, tried to run at that beat,
00:25:05.560 | just to teach myself.
00:25:07.720 | It took a long time, and I feel like after a while,
00:25:11.160 | you learn to run, but you adjust it properly
00:25:14.360 | without going all the way to barefoot.
00:25:16.000 | I feel like barefoot is the legit way to do it.
00:25:19.440 | I mean, I think a lot of people
00:25:21.680 | would be really curious about it.
00:25:23.400 | Can you, if they're interested in trying,
00:25:25.560 | what would you, how would you recommend
00:25:27.840 | they start or try or explore?
00:25:30.760 | - Slowly. (laughs)
00:25:32.600 | That's the biggest thing people do,
00:25:33.720 | is they are excellent runners,
00:25:35.920 | and they're used to running long distances or running fast,
00:25:38.360 | and they take their shoes off, and they hurt themselves
00:25:41.520 | instantly trying to do something
00:25:42.840 | that they were used to doing.
00:25:44.280 | I think I lucked out in the sense
00:25:46.000 | that I couldn't run very far when I first started trying.
00:25:50.200 | And I run with minimal shoes, too.
00:25:51.840 | I mean, I will bring along a pair of,
00:25:54.360 | actually, like, Aqua socks or something like this
00:25:56.320 | I can just slip on, or running sandals.
00:25:58.360 | I've tried all of them.
00:26:00.400 | - What's the difference between a minimal shoe
00:26:02.640 | and nothing at all?
00:26:03.800 | What's, like, feeling-wise, what does it feel like?
00:26:07.060 | - There is a, I mean, I notice my gait changing, right?
00:26:10.040 | So, I mean, your foot has as many muscles and sensors
00:26:15.040 | as your hand does, right?
00:26:17.640 | - Sensors, ooh, okay.
00:26:20.000 | - And we do amazing things with our hands.
00:26:23.240 | And we stick our foot in a big, solid shoe, right?
00:26:26.040 | So there's, I think, you know, when you're barefoot,
00:26:29.680 | you're just giving yourself more proprioception.
00:26:33.280 | And that's why you're more aware of some of the gait flaws
00:26:35.760 | and stuff like this.
00:26:37.120 | Now, you have less protection, too.
00:26:40.760 | - Rocks and stuff.
00:26:42.440 | - I mean, yeah, so I think people who are afraid
00:26:45.200 | of barefoot running, they're worried about getting cuts
00:26:47.200 | or getting, stepping on rocks.
00:26:48.700 | First of all, even if that was a concern,
00:26:51.600 | I think those are all, like, very short-term.
00:26:54.280 | You know, if I get a scratch or something,
00:26:55.440 | it'll heal in a week.
00:26:56.560 | If I blow out my knees, I'm done running forever.
00:26:58.260 | So I will trade the short-term for the long-term, anytime.
00:27:01.760 | But even then, you know, and this, again,
00:27:04.800 | to my wife's chagrin, your feet get tough, right?
00:27:07.800 | And--
00:27:08.640 | - Your calves, okay.
00:27:09.480 | - Yeah, I can run over almost anything now.
00:27:11.560 | (laughing)
00:27:13.760 | - I mean, what, maybe, can you talk about,
00:27:17.260 | is there hint, like, is there tips or tricks
00:27:21.920 | that you have, suggestions about,
00:27:24.800 | like, if I wanted to try it?
00:27:26.600 | - You know, there is a good book, actually.
00:27:29.560 | There's probably more good books since I read them.
00:27:32.700 | But Ken Bob, Barefoot Ken Bob Saxton.
00:27:37.360 | He's an interesting guy.
00:27:38.840 | But I think his book captures the right way
00:27:42.640 | to describe running, barefoot running to somebody,
00:27:45.040 | better than any other I've seen.
00:27:46.640 | - So, you run pretty good distances, and you bike,
00:27:52.560 | and is there, you know, if we talk about bucket list items,
00:27:57.560 | is there something crazy on your bucket list,
00:28:00.240 | athletically, that you hope to do one day?
00:28:02.820 | - I mean, my commute is already a little crazy.
00:28:07.200 | - What are we talking about here?
00:28:09.040 | What distance are we talking about?
00:28:11.480 | - Well, I live about 12 miles from MIT,
00:28:14.720 | but you can find lots of different ways to get there.
00:28:16.660 | So, I mean, I've run there for many years, I've biked there.
00:28:20.560 | - Long ways?
00:28:21.520 | - Yeah, but normally I would try to run in
00:28:23.920 | and then bike home, bike in, run home.
00:28:26.000 | - But you have run there and back before?
00:28:28.200 | - Sure.
00:28:29.020 | - Barefoot?
00:28:29.860 | - Yeah, or with minimal shoes or whatever that--
00:28:32.320 | - 12 times two?
00:28:34.400 | - Yeah.
00:28:35.240 | - Okay.
00:28:36.060 | - It became kind of a game of how can I get to work.
00:28:38.400 | I've rollerbladed, I've done all kinds of weird stuff,
00:28:41.000 | but my favorite one these days,
00:28:42.680 | I've been taking the Charles River to work.
00:28:45.040 | So, I can put in a little rowboat, not so far from my house,
00:28:50.040 | but the Charles River takes a long way to get to MIT.
00:28:53.320 | So, I can spend a long time getting there.
00:28:56.400 | And it's, you know, it's not about, I don't know,
00:28:58.740 | it's just about, I've had people ask me,
00:29:02.560 | how can you justify taking that time?
00:29:05.800 | But for me, it's just a magical time to think,
00:29:10.120 | to compress, decompress, you know,
00:29:13.760 | especially I'll wake up, do a lot of work in the morning,
00:29:16.220 | and then I kind of have to just let that settle
00:29:19.180 | before I'm ready for all my meetings.
00:29:20.700 | And then on the way home, it's a great time to load it,
00:29:22.940 | sort of let that settle.
00:29:24.600 | - You lead a large group of people.
00:29:29.600 | I mean, is there days where you're like,
00:29:33.980 | oh shit, I gotta get to work in an hour.
00:29:36.620 | Like, I mean, is there a tension there?
00:29:43.660 | And like, if we look at the grand scheme of things,
00:29:47.940 | just like you said, long-term,
00:29:49.500 | that meeting probably doesn't matter.
00:29:51.700 | Like, you can always say, I'll run
00:29:54.660 | and let the meeting happen how it happens.
00:29:57.100 | Like, how do you, that zen,
00:30:01.340 | what do you do with that tension
00:30:03.580 | between the real world saying urgently,
00:30:05.580 | you need to be there, this is important,
00:30:08.200 | everything is melting down,
00:30:10.020 | how are we gonna fix this robot?
00:30:11.780 | There's this critical meeting,
00:30:14.620 | and then there's this zen beauty of just running,
00:30:17.980 | the simplicity of it, you along with nature.
00:30:20.380 | What do you do with that?
00:30:22.660 | - I would say I'm not a fast runner, particularly.
00:30:25.500 | Probably my fastest splits ever
00:30:26.940 | was when I had to get to daycare on time
00:30:29.180 | because they were gonna charge me, you know,
00:30:30.660 | some dollar per minute that I was late.
00:30:33.540 | I've run some fast splits to daycare,
00:30:35.920 | but those times are past now.
00:30:39.840 | I think work, you can find a work-life balance in that way.
00:30:44.920 | I think you just have to.
00:30:46.180 | I think I am better at work
00:30:48.660 | because I take time to think on the way in.
00:30:52.220 | So I plan my day around it,
00:30:54.400 | and I rarely feel that those are really at odds.
00:31:00.940 | - So what, the bucket list item.
00:31:03.420 | If we're talking 12 times two,
00:31:07.260 | or approaching a marathon,
00:31:08.860 | what, have you run an ultra marathon before?
00:31:15.100 | Do you do races?
00:31:16.800 | Is there, what's--
00:31:17.640 | - Not to win.
00:31:18.780 | - Not to-- (laughs)
00:31:21.660 | - I'm not gonna take a dinghy across the Atlantic
00:31:23.740 | or something if that's what you want,
00:31:24.820 | but if someone does and wants to write a book,
00:31:27.940 | I would totally read it
00:31:28.780 | 'cause I'm a sucker for that kind of thing.
00:31:31.140 | No, I do have some fun things that I will try.
00:31:33.580 | I like to, when I travel,
00:31:35.300 | I almost always bike to the Logan Airport
00:31:37.020 | and fold up a little folding bike
00:31:38.740 | and then take it with me and bike to wherever I'm going.
00:31:41.020 | And it's taken me,
00:31:42.380 | or I'll take a stand-up paddleboard these days
00:31:44.580 | on the airplane,
00:31:45.500 | and then I'll try to paddle around
00:31:46.580 | where I'm going or whatever.
00:31:47.860 | And I've done some crazy things.
00:31:49.900 | - But not for the,
00:31:52.040 | I now talk, I don't know if you know
00:31:55.660 | who David Goggins is by any chance.
00:31:57.500 | - Not well, but yeah.
00:31:58.460 | - But I talk to him now every day.
00:32:00.140 | So he's the person who made me do this stupid challenge.
00:32:05.140 | So he's insane, and he does things for the purpose,
00:32:10.160 | in the best kind of way.
00:32:11.340 | He does things like for the explicit purpose of suffering.
00:32:16.340 | Like he picks the thing that,
00:32:18.420 | like whatever he thinks he can do, he does more.
00:32:21.500 | So is that, do you have that thing in you?
00:32:26.140 | Or are you--
00:32:27.300 | - I think it's become the opposite.
00:32:29.100 | (Lex laughing)
00:32:30.660 | - So you're like that dynamical system,
00:32:32.260 | that the walker, the efficient--
00:32:34.380 | - Yeah, it's leave no pain, right?
00:32:37.840 | You should end feeling better than you started.
00:32:41.540 | But it's mostly, I think,
00:32:44.540 | and COVID has tested this 'cause I've lost my commute.
00:32:47.700 | I think I'm perfectly happy walking around town
00:32:51.940 | with my wife and kids if they could get them to go.
00:32:55.620 | And it's more about just getting outside
00:32:57.780 | and getting away from the keyboard for some time
00:32:59.980 | just to let things compress.
00:33:01.380 | - Let's go into robotics a little bit.
00:33:04.100 | What to you is the most beautiful idea in robotics?
00:33:06.740 | Whether we're talking about control,
00:33:09.740 | or whether we're talking about optimization
00:33:12.740 | and the math side of things,
00:33:14.200 | or the engineering side of things,
00:33:16.180 | or the philosophical side of things.
00:33:18.180 | - I think I've been lucky to experience something
00:33:23.520 | that not so many roboticists have experienced,
00:33:27.700 | which is to hang out
00:33:30.220 | with some really amazing control theorists.
00:33:33.540 | And the clarity of thought
00:33:39.420 | that some of the more mathematical control theory can bring
00:33:44.660 | to even very complex, messy-looking problems
00:33:47.500 | is really, it really had a big impact on me
00:33:53.140 | and I had a day even just a couple of weeks ago
00:33:57.900 | where I had spent the day on a Zoom robotics conference
00:34:01.020 | having great conversations with lots of people.
00:34:04.060 | Felt really good about the ideas
00:34:06.780 | that were flowing and the like.
00:34:09.500 | And then I had a late afternoon meeting
00:34:12.940 | with one of my favorite control theorists
00:34:15.540 | and we went from these abstract discussions
00:34:20.780 | about maybes and what ifs and what a great idea,
00:34:25.740 | to these super precise statements
00:34:29.340 | about systems that aren't that much more simple
00:34:33.860 | or abstract than the ones I care about deeply.
00:34:37.640 | And the contrast of that is,
00:34:40.380 | I don't know, it really gets me.
00:34:44.020 | I think people underestimate
00:34:49.880 | maybe the power of clear thinking.
00:34:53.520 | And so for instance, deep learning is amazing.
00:34:58.260 | I use it heavily in our work.
00:35:02.440 | I think it's changed the world unquestionable.
00:35:04.800 | It makes it easy to get things to work
00:35:09.000 | without thinking as critically about it.
00:35:10.600 | So I think one of the challenges as an educator
00:35:13.420 | is to think about how do we make sure people get a taste
00:35:17.120 | of the more rigorous thinking that I think goes along
00:35:22.120 | with some different approaches.
00:35:24.760 | - Yeah, so that's really interesting.
00:35:26.200 | So understanding the fundamentals,
00:35:29.040 | the first principles of the problem,
00:35:33.880 | or in this case, it's mechanics,
00:35:35.920 | like how a thing moves, how a thing behaves,
00:35:40.460 | like all the forces involved,
00:35:42.420 | like really getting a deep understanding of that.
00:35:44.780 | I mean, from physics, the first principle thing
00:35:47.280 | come from physics, and here it's literally physics.
00:35:51.060 | Yeah, and this applies, in deep learning,
00:35:53.880 | this applies to not just, I mean,
00:35:56.920 | it applies so cleanly in robotics,
00:35:59.200 | but it also applies to just in any data set.
00:36:03.520 | I find this true, I mean, driving as well.
00:36:07.040 | There's a lot of folks that work on autonomous vehicles
00:36:11.720 | that don't study driving, like deeply.
00:36:16.720 | I might be coming a little bit from the psychology side,
00:36:23.080 | but I remember I spent a ridiculous number of hours
00:36:28.080 | at lunch at this like lawn chair,
00:36:31.880 | and I would sit somewhere in MIT's campus,
00:36:35.680 | there's a few interesting intersections,
00:36:37.280 | and we just watch people cross.
00:36:39.380 | So we were studying pedestrian behavior,
00:36:43.240 | and I felt like, I had to record a lot of video to try,
00:36:46.600 | and then there's the computer vision
00:36:47.800 | extracts their movement, how they move their head, and so on.
00:36:50.840 | But like every time, I felt like I didn't understand enough.
00:36:55.320 | I just, I felt like I wasn't understanding
00:36:58.600 | what, how are people signaling to each other?
00:37:01.620 | What are they thinking?
00:37:03.560 | How cognizant are they of their fear of death?
00:37:07.780 | Like, what's the underlying game theory here?
00:37:11.880 | What are the incentives?
00:37:14.120 | And then I finally found a live stream of an intersection
00:37:17.800 | that's like high def that I just, I would watch,
00:37:20.280 | so I wouldn't have to sit out there.
00:37:21.720 | But that's interesting, so like, I feel--
00:37:23.600 | - That's tough, that's a tough example,
00:37:25.120 | because I mean, the learning--
00:37:27.040 | - Humans are involved.
00:37:28.760 | - Not just because human, but I think the learning mantra
00:37:33.420 | is that basically the statistics of the data
00:37:35.480 | will tell me things I need to know, right?
00:37:37.920 | And for the example you gave of all the nuances
00:37:42.840 | of eye contact or hand gestures or whatever
00:37:46.840 | that are happening for these subtle interactions
00:37:48.880 | between pedestrians and traffic, right?
00:37:51.120 | Maybe the data will tell that story.
00:37:54.560 | Maybe even one level more meta than what you're saying.
00:37:59.560 | For a particular problem, I think it might be the case
00:38:03.800 | that data should tell us the story.
00:38:06.000 | But I think there's a rigorous thinking
00:38:09.400 | that is just an essential skill for a mathematician
00:38:13.120 | or an engineer that I just don't wanna lose it.
00:38:18.120 | There are certainly super rigorous control,
00:38:22.440 | or sorry, machine learning people.
00:38:24.920 | I just think deep learning makes it so easy
00:38:27.980 | to do some things that our next generation,
00:38:31.560 | are not immediately rewarded for going through
00:38:36.560 | some of the more rigorous approaches.
00:38:38.520 | And then I wonder where that takes us.
00:38:40.760 | Well, I'm actually optimistic about it.
00:38:42.240 | I just want to do my part to try to steer
00:38:45.840 | that rigorous thinking.
00:38:48.000 | - So there's like two questions I wanna ask.
00:38:51.040 | Do you have sort of a good example of rigorous thinking
00:38:56.040 | where it's easy to get lazy and not do the rigorous thinking
00:39:00.840 | and the other question I have is like,
00:39:02.480 | do you have advice of how to practice rigorous thinking
00:39:07.480 | in all the computer science disciplines
00:39:13.680 | that we've mentioned?
00:39:14.720 | - Yeah, I mean, there are times where problems
00:39:21.360 | that can be solved with well-known mature methods
00:39:24.800 | could also be solved with a deep learning approach
00:39:30.280 | and there's an argument that you must use learning
00:39:35.280 | even for the parts we already think we know
00:39:38.320 | because if the human has touched it,
00:39:39.760 | then you've biased the system
00:39:42.420 | and you've suddenly put a bottleneck in there
00:39:44.280 | that is your own mental model.
00:39:46.280 | But something like inverting a matrix,
00:39:48.480 | you know, I think we know how to do that pretty well,
00:39:50.720 | even if it's a pretty big matrix
00:39:51.960 | and we understand that pretty well
00:39:53.120 | and you could train a deep network to do it,
00:39:55.040 | but you shouldn't probably.
00:39:57.840 | - So in that sense, rigorous thinking is understanding
00:40:02.200 | the scope and the limitations of the methods that we have,
00:40:07.200 | like how to use the tools of mathematics properly.
00:40:10.120 | - Yeah, I think, you know, taking a class on analysis
00:40:15.080 | is all I'm sort of arguing.
00:40:17.360 | Take a chance to stop and force yourself
00:40:20.060 | to think rigorously about even, you know,
00:40:23.520 | the rational numbers or something,
00:40:24.880 | you know, it doesn't have to be the end all problem,
00:40:27.760 | but that exercise of clear thinking,
00:40:31.080 | I think goes a long way
00:40:33.420 | and I just want to make sure we keep preaching it.
00:40:35.280 | - We don't lose it. - Yeah.
00:40:36.360 | - But do you think when you're doing like rigorous thinking
00:40:39.560 | or like maybe trying to write down equations
00:40:43.240 | or sort of explicitly, like formally describe a system,
00:40:47.960 | do you think we naturally simplify things too much?
00:40:51.560 | Is that a danger you run into?
00:40:53.980 | Like in order to be able to understand something
00:40:56.200 | about the system mathematically,
00:40:58.180 | we make it too much of a toy example.
00:41:01.680 | - But I think that's the good stuff, right?
00:41:04.480 | - That's how you understand the fundamentals?
00:41:07.040 | - I think so.
00:41:07.880 | I think maybe even that's a key to intelligence
00:41:10.400 | or something, but I mean, okay,
00:41:12.480 | what if Newton and Galileo had deep learning
00:41:15.120 | and they had done a bunch of experiments
00:41:18.360 | and they told the world,
00:41:20.280 | "Here's your weights of your neural network.
00:41:22.400 | "We've solved the problem."
00:41:23.520 | - Yeah.
00:41:24.360 | - Where would we be today?
00:41:25.340 | I don't think we'd be as far as we are.
00:41:28.400 | There's something to be said
00:41:29.240 | about having the simplest explanation for a phenomenon.
00:41:32.520 | So I don't doubt that we can train neural networks
00:41:37.180 | to predict even physical,
00:41:39.500 | F equals MA type equations,
00:41:44.660 | but I maybe, I want another Newton to come along
00:41:51.320 | 'cause I think there's more to do
00:41:52.920 | in terms of coming up with the simple models
00:41:56.020 | for more complicated tasks.
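
A tiny example of what a simple, parsimonious explanation for the data can buy you (my own sketch on synthetic data, not from the episode): a one-parameter least-squares fit recovers F = m*a from noisy measurements, whereas an arbitrarily complex input-to-output function fit to the same points would predict without handing you the law.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "experiments" on a 2.5 kg mass: noisy acceleration and force readings.
true_mass = 2.5
a = rng.uniform(-5.0, 5.0, size=200)
F = true_mass * a + rng.normal(0.0, 0.2, size=200)

# The parsimonious model F = m*a has one parameter; least squares recovers it,
# and the fitted law generalizes to any acceleration you plug in.
m_hat, *_ = np.linalg.lstsq(a.reshape(-1, 1), F, rcond=None)
print(f"estimated mass: {m_hat[0]:.3f} kg (true mass: {true_mass} kg)")
```
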
00:41:59.860 | - Yeah.
00:42:01.180 | Let's not offend the AI systems from 50 years
00:42:04.240 | from now that are listening to this
00:42:06.340 | that are probably better at,
00:42:08.260 | might be better at coming up
00:42:10.180 | with F equals MA equations themselves.
00:42:13.100 | - Oh, sorry, I actually think learning
00:42:15.420 | is probably a route to achieving this,
00:42:18.220 | but the representation matters, right?
00:42:21.180 | And I think having a function
00:42:24.700 | that takes my inputs to outputs
00:42:27.260 | that is arbitrarily complex may not be the end goal.
00:42:30.760 | I think there's still the most simple
00:42:34.140 | or parsimonious explanation for the data.
00:42:36.400 | Simple doesn't mean low dimensional.
00:42:39.020 | That's one thing I think that we've,
00:42:41.020 | a lesson that we've learned.
00:42:41.980 | So a standard way to do model reduction
00:42:46.080 | or system identification and controls
00:42:47.860 | is to, the typical formulation is that you try
00:42:50.100 | to find the minimal state dimension realization
00:42:53.420 | of a system that hits some error bounds
00:42:56.140 | or something like that.
00:42:57.780 | And that's maybe not, I think we're learning
00:43:00.340 | that that was, that state dimension
00:43:02.900 | is not the right metric.
00:43:04.540 | - Of complexity.
00:43:06.820 | - Of complexity.
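
One concrete version of the minimal-realization formulation he mentions (a sketch I'm adding, with an invented toy system, not his code): Hankel singular values computed from the controllability and observability Gramians rank the states by how much they matter to the input-output behavior, and a sharp drop marks the states a reduced-order model can discard while still meeting an error bound.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# A hand-made 4-state stable linear system (all numbers invented):
# the last two states are only weakly controllable and observable.
A = -np.diag([1.0, 2.0, 3.0, 4.0])
B = np.array([[1.0], [0.8], [0.01], [0.005]])
C = np.array([[1.0, 0.5, 0.02, 0.01]])

# Gramians: A Wc + Wc A' + B B' = 0 and A' Wo + Wo A + C' C = 0.
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# Hankel singular values; tiny values mark states a reduced model can drop.
hsv = np.sqrt(np.maximum(np.linalg.eigvals(Wc @ Wo).real, 0.0))
print("Hankel singular values:", np.sort(hsv)[::-1])
```
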
00:43:07.640 | But for me, I think a lot about contact,
00:43:09.460 | the mechanics of contact.
00:43:10.840 | The robot hand is picking up an object or something.
00:43:13.660 | And when I write down the equations of motion for that,
00:43:16.660 | they look incredibly complex,
00:43:19.100 | not because, actually not so much
00:43:23.420 | because of the dynamics of the hand when it's moving,
00:43:26.660 | but it's just the interactions
00:43:28.500 | and when they turn on and off, right?
00:43:30.860 | So having a high dimensional, you know,
00:43:33.300 | but simple description of what's happening out here is fine.
00:43:36.420 | But if, when I actually start touching,
00:43:38.500 | if I write down a different dynamical system
00:43:41.860 | for every polygon on my robot hand
00:43:45.420 | and every polygon on the object,
00:43:47.300 | whether it's in contact or not,
00:43:49.000 | with all the combinatorics that explodes there,
00:43:51.700 | then that's too complex.
00:43:54.460 | So I need to somehow summarize that
00:43:55.780 | with a more intuitive physics way of thinking.
00:44:00.780 | And yeah, I'm very optimistic
00:44:03.500 | that machine learning will get us there.
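
To put rough numbers on that combinatorial explosion (a back-of-the-envelope sketch with made-up counts): if every potential contact pair between the hand's collision geometry and the object gets its own on/off switch in the dynamics, the number of hybrid modes grows exponentially in the number of pairs.

```python
# Hypothetical counts, purely to show the growth rate.
hand_polygons   = 20
object_polygons = 12
contact_pairs   = hand_polygons * object_polygons   # candidate contact pairs

# Each pair is either active or not -> 2^pairs discrete dynamical modes
# (3^pairs if sticking vs. sliding friction is also distinguished).
modes_on_off   = 2 ** contact_pairs
modes_friction = 3 ** contact_pairs

print(f"{contact_pairs} candidate contact pairs")
print(f"{float(modes_on_off):.3e} on/off hybrid modes")
print(f"{float(modes_friction):.3e} modes with a stick/slide distinction")
```
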
00:44:05.700 | - First of all, I mean, I'll probably do it
00:44:08.220 | in the introduction, but you're one
00:44:10.820 | of the great robotics people at MIT.
00:44:12.900 | You're a professor at MIT.
00:44:14.300 | You've teach a lot of amazing courses.
00:44:16.480 | You run a large group and you have a important history
00:44:21.480 | for MIT, I think, as being a part
00:44:23.960 | of the DARPA Robotics Challenge.
00:44:26.340 | Can you maybe first say what is the DARPA Robotics Challenge
00:44:30.000 | and then tell your story around it, your journey with it?
00:44:35.000 | - Yeah, sure.
00:44:37.220 | So the DARPA Robotics Challenge,
00:44:41.060 | it came on the tails of the DARPA Grand Challenge
00:44:44.720 | and DARPA Urban Challenge,
00:44:45.940 | which were the challenges that brought us,
00:44:48.420 | put a spotlight on self-driving cars.
00:45:52.740 | Gill Pratt was at DARPA and pitched a new challenge
00:45:00.400 | that involved disaster response.
00:45:03.540 | It didn't explicitly require humanoids,
00:45:07.160 | although humanoids came into the picture.
00:45:09.220 | This happened shortly after the Fukushima disaster in Japan.
00:45:14.740 | And our challenge was motivated roughly by that
00:45:17.660 | because that was a case where if we had had robots
00:45:21.060 | that were ready to be sent in,
00:45:22.700 | there's a chance that we could have averted disaster.
00:45:26.580 | And certainly after the, in the disaster response,
00:45:30.620 | there were times where we would have loved
00:45:32.380 | to have sent robots in.
00:45:33.540 | So in practice, what we ended up with was a grand challenge,
00:45:39.220 | a DARPA Robotics Challenge,
00:45:42.140 | where Boston Dynamics was to make humanoid robots.
00:45:47.140 | People like me and the amazing team at MIT
00:45:52.500 | were competing first in a simulation challenge
00:45:56.740 | to try to be one of the ones that wins the right
00:45:59.420 | to work on one of the Boston Dynamics humanoids
00:46:03.300 | in order to compete in the final challenge,
00:46:06.580 | which was a physical challenge.
00:46:08.540 | - And at that point, it was already,
00:46:10.340 | so it was decided as humanoid robots early on.
00:46:13.740 | - There were two tracks.
00:46:15.100 | You could enter as a hardware team
00:46:16.860 | where you brought your own robot,
00:46:18.460 | or you could enter through the virtual robotics challenge
00:46:21.340 | as a software team that would try to win the right
00:46:24.260 | to use one of the Boston Dynamics robots.
00:46:25.940 | - Which are called Atlas. - Atlas.
00:46:27.940 | - Humanoid robots. - Yeah, it was a 400 pound
00:46:30.860 | marvel, but a pretty big, scary-looking robot.
00:46:34.440 | - Expensive too. - Expensive, yeah.
00:46:38.260 | - Okay, so, I mean, how did you feel
00:46:42.340 | the prospect of this kind of challenge?
00:46:44.780 | I mean, it seems, you know, autonomous vehicles,
00:46:48.820 | yeah, I guess that sounds hard,
00:46:51.060 | but not really from a robotics perspective.
00:46:53.980 | It's like, didn't they do it in the '80s
00:46:56.020 | is the kind of feeling I would have
00:46:57.780 | like when you first look at the problem.
00:47:00.820 | It's on wheels, but like humanoid robots,
00:47:04.900 | that sounds really hard.
00:47:08.140 | So what, psychologically speaking,
00:47:12.860 | what were you feeling, excited, scared?
00:47:15.780 | Why the heck did you get yourself involved
00:47:18.020 | in this kind of messy challenge?
00:47:19.660 | - We didn't really know for sure
00:47:21.980 | what we were signing up for,
00:47:23.420 | in the sense that you could have had something that,
00:47:26.840 | as it was described in the call for participation,
00:47:30.780 | that could have put a huge emphasis
00:47:32.700 | on the dynamics of walking and not falling down
00:47:35.700 | and walking over rough terrain,
00:47:37.380 | or the same description,
00:47:38.580 | 'cause the robot had to go into this disaster area
00:47:40.780 | and turn valves and pick up a drill
00:47:44.580 | and cut a hole through a wall.
00:47:45.780 | It had to do some interesting things.
00:47:48.420 | The challenge could have really highlighted perception
00:47:51.860 | and autonomous planning.
00:47:54.860 | It ended up that, you know,
00:47:57.860 | locomoting over a complex terrain
00:48:01.080 | played a pretty big role in the competition.
00:48:05.520 | - And the degree of autonomy wasn't clear.
00:48:08.360 | - The degree of autonomy was always
00:48:09.960 | a central part of the discussion.
00:48:11.920 | So what wasn't clear was how we would be able,
00:48:15.560 | how far we'd be able to get with it.
00:48:17.520 | So the idea was always that you want semi-autonomy,
00:48:21.640 | that you want the robot to have enough compute
00:48:24.280 | that you can have a degraded network link to a human.
00:48:27.640 | And so the same way we had degraded networks
00:48:30.640 | at many natural disasters,
00:48:33.160 | you'd send your robot in,
00:48:34.960 | you'd be able to get a few bits back and forth,
00:48:37.520 | but you don't get to have enough,
00:48:38.920 | potentially, to fully operate the robot,
00:48:42.080 | every joint of the robot.
00:48:43.320 | So, and then the question was,
00:48:46.160 | and the gamesmanship of the organizers
00:48:48.880 | was to figure out what we're capable of,
00:48:50.680 | push us as far as they could,
00:48:52.580 | so that it would differentiate the teams
00:48:55.300 | that put more autonomy on the robot
00:48:57.520 | and had a few clicks and just said,
00:48:59.320 | "Go there, do this, go there, do this,"
00:49:00.920 | versus someone who's picking every footstep
00:49:03.400 | or something like that.
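
To put rough numbers on why a degraded link pushes you toward semi-autonomy rather than joint-level teleoperation, here is a back-of-the-envelope sketch in Python. Every quantity in it (joint count, command rate, message sizes, link capacity) is an illustrative assumption, not a figure from the DRC.

```python
# Back-of-the-envelope arithmetic for the degraded-link argument.
# All numbers are illustrative assumptions, not figures from the competition.

JOINTS = 30                 # assumed number of actuated joints on the humanoid
RATE_HZ = 200               # assumed low-level command rate for direct teleop
BYTES_PER_SETPOINT = 4      # one float32 per joint per tick

joint_teleop_bps = JOINTS * RATE_HZ * BYTES_PER_SETPOINT * 8
print(f"joint-level teleop needs ~{joint_teleop_bps / 1e3:.0f} kbit/s")   # ~192 kbit/s

degraded_link_bps = 2_000   # assume the disaster-site link gives a couple of kbit/s

# Semi-autonomy sends sparse task-level commands instead:
# "walk to that door", "turn the valve", a handful of times per minute.
TASK_COMMAND_BYTES = 64
COMMANDS_PER_MINUTE = 4
task_level_bps = TASK_COMMAND_BYTES * 8 * COMMANDS_PER_MINUTE / 60
print(f"task-level commands need ~{task_level_bps:.0f} bit/s")            # ~34 bit/s

print("joint teleop fits the link?", joint_teleop_bps <= degraded_link_bps)   # False
print("task-level commands fit?   ", task_level_bps <= degraded_link_bps)     # True
```

The orders-of-magnitude gap between the two is what rewards teams that could put more autonomy on the robot and get by with "a few clicks" of task-level commands.
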
00:49:05.280 | - So what were some memories,
00:49:10.280 | painful, triumphant from the experience?
00:49:13.600 | Like what was that journey?
00:49:15.040 | Maybe if you can dig in a little deeper,
00:49:17.660 | maybe even on the technical side, on the team side,
00:49:21.120 | that whole process of,
00:49:23.000 | from the early idea stages to actually competing.
00:49:28.200 | - I mean, this was a defining experience for me.
00:49:30.600 | It came at the right time for me in my career.
00:49:33.900 | I had gotten tenure before I was due a sabbatical,
00:49:37.440 | and most people do something relaxing
00:49:39.800 | and restorative for a sabbatical.
00:49:41.880 | - So you got tenure before this?
00:49:44.480 | - Yeah, yeah, yeah.
00:49:46.160 | It was a good time for me.
00:49:48.040 | We had a bunch of algorithms that we were very happy with.
00:49:50.920 | We wanted to see how far we could push them,
00:49:52.520 | and this was a chance to really test our mettle,
00:49:54.880 | to do more proper software engineering.
00:49:56.840 | The team, we all just worked our butts off.
00:50:02.920 | We were in that lab almost all the time.
00:50:04.860 | Okay, so I mean, there were some, of course,
00:50:09.580 | high highs and low lows throughout that,
00:50:12.060 | anytime you're not sleeping and devoting your life
00:50:14.980 | to a 400-pound humanoid.
00:50:16.360 | I remember actually one funny moment
00:50:20.680 | where we're all super tired,
00:50:21.900 | and so Atlas had to walk across cinder blocks.
00:50:24.720 | That was one of the obstacles.
00:50:26.500 | And I remember Atlas was powered down,
00:50:28.420 | hanging limp on its harness,
00:50:31.260 | and the humans were there picking up
00:50:33.980 | and laying the brick down
00:50:35.180 | so that the robot could walk over it.
00:50:36.460 | And I thought, "What is wrong with this?"
00:50:38.460 | We've got a robot just watching us
00:50:41.560 | do all the manual labor so that it can take its little
00:50:44.160 | stroll across the terrain.
00:50:46.740 | I mean, even the virtual robotics challenge
00:50:52.140 | was super nerve-wracking and dramatic.
00:50:54.620 | I remember, so we were using Gazebo
00:50:59.620 | as a simulator on the cloud,
00:51:02.300 | and there was all these interesting challenges.
00:51:03.940 | I think the investment that OSRF,
00:51:08.580 | whatever they were called at that time,
00:51:10.020 | Brian Gerkey's team at Open Source Robotics,
00:51:12.220 | they were pushing on the capabilities of Gazebo
00:51:16.020 | in order to scale it to the complexity of these challenges.
00:51:20.380 | So up to the virtual competition.
00:51:23.900 | So the virtual competition was,
00:51:26.180 | you will sign on at a certain time,
00:51:28.460 | and we'll have a network connection
00:51:29.820 | to another machine on the cloud
00:51:32.060 | that is running the simulator of your robot.
00:51:34.860 | And your controller will run on this computer,
00:51:38.140 | and the physics will run on the other,
00:51:40.880 | and you have to connect.
00:51:43.020 | Now, the physics, they wanted it to run at real-time rates,
00:51:48.020 | because there was an element of human interaction.
00:51:50.700 | And humans, if you do want to tele-op,
00:51:53.260 | it works way better if it's at frame rate.
00:51:56.100 | - Oh, cool.
00:51:57.140 | - But it was very hard to simulate
00:51:58.740 | these complex scenes at real-time rate.
00:52:03.260 | So right up to days before the competition,
00:52:06.540 | the simulator wasn't quite at real-time rate.
00:52:11.060 | And that was great for me,
00:52:12.100 | because my controller was solving
00:52:13.780 | a pretty big optimization problem,
00:52:16.340 | and it wasn't quite at real-time rate.
00:52:17.800 | So I was fine.
00:52:18.900 | I was keeping up with the simulator.
00:52:20.520 | We were both running at about 0.7.
00:52:22.900 | And I remember getting this email,
00:52:24.980 | and by the way, the perception folks on our team hated that,
00:52:28.460 | because they knew that if my controller was too slow,
00:52:31.460 | the robot was gonna fall down.
00:52:32.500 | And no matter how good their perception system was,
00:52:34.920 | it wouldn't matter if I couldn't make my controller fast.
00:52:36.940 | Anyways, we get this email
00:52:37.940 | like three days before the virtual competition.
00:52:40.300 | You know, it's for all the marbles.
00:52:41.500 | We're gonna either get a humanoid robot or we're not.
00:52:44.940 | And we get an email saying, "Good news.
00:52:46.380 | "We made the robot, the simulator faster."
00:52:48.660 | It's now 1.0.
00:52:49.500 | And I was just like, "Oh man, what are we gonna do here?"
00:52:54.780 | So that came in late at night for me.
00:52:58.380 | - A few days ahead.
00:53:00.500 | - A few days ahead.
00:53:01.420 | I went over, and it happened that Frank Permenter,
00:53:03.980 | who's a very, very sharp guy,
00:53:06.820 | he was a student at the time working on optimization.
00:53:10.220 | He was still in lab.
00:53:12.160 | Frank, we need to make
00:53:15.100 | this quadratic programming solver faster.
00:53:17.340 | Not like a little faster, it's actually, you know.
00:53:19.840 | And we wrote a new solver for that QP,
00:53:24.980 | together, that night.
00:53:26.620 | - And you solved it. - It was terrifying.
00:53:29.420 | - So there's a really hard optimization problem
00:53:31.940 | that you're constantly solving.
00:53:33.500 | You didn't make the optimization problem simpler?
00:53:36.860 | You wrote a new solver?
00:53:38.540 | - So, I mean, your observation is almost spot on.
00:53:42.860 | What we did was what everybody,
00:53:44.540 | I mean, people know how to do this,
00:53:45.820 | but we had not yet done this idea of warm starting.
00:53:49.300 | So we are solving a big optimization problem
00:53:51.380 | at every time step.
00:53:52.700 | But if you're running fast enough,
00:53:54.300 | the optimization problem you're solving
00:53:55.700 | on the last time step is pretty similar
00:53:57.940 | to the optimization you're gonna solve with the next.
00:54:00.100 | We, of course, had told our commercial solver
00:54:02.300 | to use warm starting,
00:54:03.620 | but even the interface to that commercial solver
00:54:07.220 | was causing us these delays.
00:54:09.860 | So what we did was we basically wrote,
00:54:12.780 | we called it fast QP at the time.
00:54:15.420 | We wrote a very lightweight, very fast layer,
00:54:18.500 | which would basically check if nearby solutions
00:54:22.160 | to the quadratic program,
00:54:23.860 | which were very easily checked, could stabilize the robot.
00:54:28.000 | And if they couldn't, we would fall back to the solver.
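
The trick described here, reusing the structure of the previous solution and only calling the general-purpose solver when a cheap check fails, can be sketched roughly as below. This is a minimal illustration of active-set warm starting under assumed interfaces, not the team's actual fastQP code; `full_solver` stands in for whatever commercial QP solver you would fall back to.

```python
import numpy as np

def try_active_set(H, f, A, b, active_set, tol=1e-9):
    """Solve min 0.5 x'Hx + f'x  s.t.  Ax <= b, guessing which constraints are active.

    With a guessed active set the inequality QP collapses to a single linear (KKT)
    solve. The guess is accepted only if the result is primal feasible and the
    multipliers are nonnegative.
    """
    idx = list(active_set)
    n = H.shape[0]
    if not idx:                                   # guess: no constraints active
        x, lam = np.linalg.solve(H, -f), np.zeros(0)
    else:
        Aact, bact = A[idx], b[idx]
        m = Aact.shape[0]
        KKT = np.block([[H, Aact.T], [Aact, np.zeros((m, m))]])
        try:
            sol = np.linalg.solve(KKT, np.concatenate([-f, bact]))
        except np.linalg.LinAlgError:
            return np.zeros(n), False             # degenerate guess: give up on the cheap path
        x, lam = sol[:n], sol[n:]
    ok = np.all(A @ x <= b + tol) and np.all(lam >= -tol)
    return x, bool(ok)

def warm_started_qp(H, f, A, b, prev_active_set, full_solver):
    """One control tick: try the last tick's active set, fall back to the full solver."""
    x, ok = try_active_set(H, f, A, b, prev_active_set)
    if ok:
        return x, prev_active_set                 # cheap path: a single linear solve
    # Slow path: the general-purpose QP solver (stand-in for the commercial one).
    return full_solver(H, f, A, b)
```

The point of the design is that on most control ticks the active set doesn't change, so the cheap path is just one linear solve, which is what keeps the controller inside its real-time budget.
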
00:54:30.740 | - You couldn't really test this well, right?
00:54:33.140 | Or like?
00:54:34.300 | - So we always knew that if we fell back,
00:54:37.380 | if we, it got to the point where if for some reason
00:54:40.420 | things slowed down and we fell back to the original solver,
00:54:42.820 | the robot would actually literally fall down.
00:54:45.060 | So it was a harrowing sort of ledge we were sort of on.
00:54:51.180 | But I mean, actually, like the 400 pound humanoid
00:54:54.880 | could come crashing to the ground
00:54:55.840 | if your solver's not fast enough.
00:54:58.020 | But we had lots of good experiences.
00:55:01.160 | - Can I ask you a weird question I get
00:55:05.600 | about the idea of hard work?
00:55:09.460 | So actually people, like students of yours
00:55:14.320 | that I've interacted with,
00:55:15.960 | and just, and robotics people in general,
00:55:19.400 | but they have, at moments, worked harder
00:55:24.400 | than most people I know in terms of,
00:55:29.280 | if you look at different disciplines
00:55:30.580 | of how hard people work.
00:55:32.360 | But they're also like the happiest.
00:55:34.580 | Like, just like, I don't know.
00:55:36.980 | It's the same thing with like running,
00:55:39.200 | people that push themselves to like the limit,
00:55:41.380 | they also seem to be like the most like full of life somehow.
00:55:45.420 | And I get often criticized, like,
00:55:48.680 | "You're not getting enough sleep.
00:55:50.360 | "What are you doing to your body?"
00:55:51.960 | Blah, blah, blah, like this kind of stuff.
00:55:54.680 | And I usually just kind of respond like,
00:55:57.960 | "I'm doing what I love, I'm passionate about it.
00:56:00.840 | "I love it, I feel like it's invigorating."
00:56:04.840 | I actually think, I don't think the lack of sleep
00:56:07.640 | is what hurts you.
00:56:08.880 | I think what hurts you is stress
00:56:10.960 | and lack of doing things that you're passionate about.
00:56:13.280 | But in this world, yeah, I mean, can you comment about
00:56:16.960 | why the heck robotics people are
00:56:21.280 | (laughs)
00:56:23.800 | willing to push themselves to that degree?
00:56:26.200 | Is there value in that?
00:56:27.680 | And why are they so happy?
00:56:29.380 | - I think you got it right.
00:56:31.920 | I mean, I think the causality is not that we work hard.
00:56:36.440 | And I think other disciplines work very hard too.
00:56:38.520 | But I don't think it's that we work hard
00:56:40.300 | and therefore we are happy.
00:56:43.160 | I think we found something
00:56:44.720 | that we're truly passionate about.
00:56:46.600 | It makes us very happy.
00:56:50.000 | And then we get a little involved with it
00:56:52.280 | and spend a lot of time on it.
00:56:54.600 | What a luxury to have something
00:56:56.000 | that you wanna spend all your time on, right?
00:56:58.260 | - We could talk about this for many hours,
00:57:00.800 | but maybe if we could pick,
00:57:03.880 | is there something on the technical side
00:57:05.480 | on the approach that you took that's interesting
00:57:08.260 | that turned out to be a terrible failure
00:57:10.240 | or a success that you carry into your work today
00:57:13.800 | about all the different ideas that were involved
00:57:17.280 | in making, whether in the simulation or in the real world,
00:57:22.280 | making the semi-autonomous system work?
00:57:25.520 | - I mean, it really did teach me something fundamental
00:57:30.880 | about what it's gonna take to get robustness
00:57:33.560 | out of a system of this complexity.
00:57:35.320 | I would say the DARPA challenge
00:57:37.740 | really was foundational in my thinking.
00:57:41.040 | I think the autonomous driving community thinks about this.
00:57:43.720 | I think lots of people thinking about safety critical
00:57:46.640 | systems that might have machine learning in the loop
00:57:48.920 | are thinking about these questions.
00:57:50.360 | For me, the DARPA challenge was the moment where I realized,
00:57:54.240 | we've spent every waking minute running this robot.
00:57:58.920 | And again, for the physical competition,
00:58:01.440 | days before the competition,
00:58:02.540 | we saw the robot fall down in a way
00:58:04.440 | it had never fallen down before.
00:58:05.980 | I thought, how could we have found that?
00:58:09.260 | We only have one robot, it's running almost all the time.
00:58:13.600 | We just didn't have enough hours in the day
00:58:15.560 | to test that robot.
00:58:17.120 | Something has to change, right?
00:58:19.400 | And I think that, I mean, I would say that the team that won
00:58:22.760 | from KAIST was the team that had two robots
00:58:28.040 | and was able to do not only incredible engineering,
00:58:30.560 | just absolutely top-rate engineering,
00:58:33.260 | but also they were able to test at a rate
00:58:36.120 | and discipline that we didn't keep up with.
00:58:39.620 | - What does testing look like?
00:58:41.120 | What are we talking about here?
00:58:42.280 | Like what's a loop of tests?
00:58:45.040 | Like from start to finish, what is a loop of testing?
00:58:48.760 | - Yeah, I mean, I think there's a whole philosophy
00:58:51.280 | to testing.
00:58:52.120 | There's the unit tests, and you can do that on the hardware.
00:58:54.400 | You can do that in a small piece of code.
00:58:56.340 | You write one function, you should write a test
00:58:58.240 | that checks that function's inputs and outputs.
00:59:00.600 | You should also write an integration test
00:59:02.400 | at the other extreme of running the whole system together,
00:59:05.280 | where they try to turn on all of the different functions
00:59:09.080 | that you think are correct.
00:59:11.560 | It's much harder to write the specifications
00:59:13.360 | for a system-level test,
00:59:14.480 | especially if that system is as complicated
00:59:17.320 | as a humanoid robot, but the philosophy is sort of the same.
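
As a toy illustration of that split, the block below has a unit test for one small function and a system-level test that runs a whole (drastically simplified) closed loop end to end. The wrap_angle function and the 1D point-mass "robot" are invented for the example so the sketch stays self-contained; a real integration test would spin up the full simulator instead.

```python
import math

# Unit level: one small function with a precise input/output spec.
def wrap_angle(theta):
    """Wrap an angle to [-pi, pi)."""
    return (theta + math.pi) % (2 * math.pi) - math.pi

def test_wrap_angle():
    assert abs(wrap_angle(3 * math.pi) - (-math.pi)) < 1e-12
    assert abs(wrap_angle(0.1) - 0.1) < 1e-12

# System level: a stand-in for "run the whole thing together and check the outcome".
# Here the "robot" is just a 1D point mass under a proportional-derivative controller.
def simulate_point_mass(x0, goal, kp=4.0, kd=3.0, dt=0.01, steps=2000):
    x, v = x0, 0.0
    for _ in range(steps):
        u = kp * (goal - x) - kd * v     # the "controller"
        v += u * dt                       # the "physics"
        x += v * dt
    return x

def test_system_reaches_goal():
    # The spec is much coarser than a unit test: only the end behavior is asserted.
    assert abs(simulate_point_mass(x0=0.0, goal=1.0) - 1.0) < 1e-2

if __name__ == "__main__":
    test_wrap_angle()
    test_system_reaches_goal()
    print("all tests passed")
```
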
00:59:21.040 | On the real robot, it's no different,
00:59:24.160 | but on a real robot,
00:59:26.040 | it's impossible to run the same experiment twice.
00:59:28.640 | So if you see a failure,
00:59:32.480 | you hope you caught something in the logs
00:59:34.400 | that tell you what happened,
00:59:35.640 | but you'd probably never be able to run exactly
00:59:37.320 | that experiment again.
00:59:39.400 | And right now, I think our philosophy is just,
00:59:44.400 | basically, Monte Carlo estimation:
00:59:47.880 | just run as many experiments as we can,
00:59:50.880 | maybe try to set up the environment
00:59:53.080 | to make the things we are worried about
00:59:56.600 | happen as often as possible,
00:59:59.880 | but really we're relying on somewhat random search
01:00:02.280 | in order to test.
01:00:03.180 | Maybe that's all we'll ever be able to do,
01:00:05.480 | but I think, you know, 'cause there's an argument
01:00:08.160 | that the things that'll get you
01:00:10.520 | are the things that are really nuanced in the world,
01:00:14.040 | and it'd be very hard to, for instance,
01:00:15.700 | put back in a simulation.
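
A minimal sketch of that randomized, Monte Carlo style of testing, with the scenario parameters, their ranges, and the episode itself all invented for illustration:

```python
import random

def run_randomized_trials(run_episode, n_trials=1000, seed=0):
    """Monte Carlo testing: sample scenarios, run them, estimate a failure rate.

    `run_episode(params) -> bool` is assumed to return True on success.
    """
    rng = random.Random(seed)
    failures = []
    for i in range(n_trials):
        # Bias the sampling toward conditions we are worried about
        # (rough terrain, pushes, noise), as the discussion suggests.
        params = {
            "terrain_roughness": rng.uniform(0.0, 0.08),   # meters, assumed range
            "push_force": rng.uniform(0.0, 200.0),         # newtons, assumed range
            "sensor_noise": rng.uniform(0.0, 0.02),
        }
        if not run_episode(params):
            failures.append((i, params))
    return len(failures) / n_trials, failures

# Toy episode: stands in for a full simulation rollout of the robot.
def fake_episode(params):
    return params["push_force"] < 150.0 and params["terrain_roughness"] < 0.06

if __name__ == "__main__":
    rate, failures = run_randomized_trials(fake_episode)
    print(f"estimated failure rate: {rate:.1%}")
    # Log the failing parameter sets so the scenario can at least be replayed in simulation.
    for i, params in failures[:3]:
        print("failure in trial", i, params)
```

The useful byproduct is the list of failing parameter sets: when the real robot never runs the same experiment twice, a replayable failing scenario in simulation is the next best thing.
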
01:00:16.880 | - Yeah, I guess the edge cases.
01:00:19.860 | What was the hardest thing?
01:00:21.800 | Like, so you said walking over rough terrain,
01:00:24.680 | like just taking footsteps.
01:00:27.120 | I mean, people, it's so dramatic and painful
01:00:31.360 | in a certain kind of way to watch these videos
01:00:33.520 | from the DRC of robots falling.
01:00:37.600 | - Yep.
01:00:38.440 | - It's just so heartbreaking.
01:00:39.440 | I don't know.
01:00:40.280 | Maybe it's because, for me at least,
01:00:42.400 | we anthropomorphize the robot.
01:00:45.140 | Of course, it's also funny for some reason.
01:00:48.400 | Like, humans falling is funny.
01:00:50.260 | It's some dark reason.
01:00:53.400 | I'm not sure why it is so,
01:00:55.300 | but it's also like tragic and painful.
01:00:57.880 | And so speaking of which,
01:01:00.160 | I mean, what made the robots fall and fail in your view?
01:01:05.000 | - So I can tell you exactly what happened.
01:01:07.160 | I contributed one of those.
01:01:08.360 | Our team contributed one of those spectacular falls.
01:01:10.960 | Every one of those falls has a complicated story.
01:01:15.560 | I mean, at one time,
01:01:16.920 | the power effectively went out on the robot
01:01:19.160 | because it had been sitting at the door
01:01:21.700 | waiting for a green light to be able to proceed
01:01:24.400 | and its batteries, you know,
01:01:26.280 | and therefore it just fell backwards
01:01:28.060 | and smashed its head to the ground.
01:01:29.280 | And it was hilarious,
01:01:30.120 | but it wasn't because of bad software, right?
01:01:32.760 | But for ours, so the hardest part of the challenge,
01:01:37.120 | the hardest task in my view
01:01:38.680 | was getting out of the Polaris.
01:01:40.440 | It was actually relatively easy to drive the Polaris.
01:01:43.760 | - Can you tell the story?
01:01:44.600 | Sorry to interrupt. - No, of course.
01:01:45.440 | - The story of the car.
01:01:46.920 | People should watch this video.
01:01:51.200 | I mean, the thing you've come up with is just brilliant,
01:01:53.880 | but anyway, sorry.
01:01:55.120 | - Yeah, we kind of joke,
01:01:56.920 | we call it the big robot little car problem
01:01:59.040 | because somehow the race organizers
01:02:02.400 | decided to give us a 400 pound humanoid
01:02:05.360 | and that they also provided the vehicle,
01:02:07.480 | which was a little Polaris.
01:02:08.680 | And the robot didn't really fit in the car.
01:02:11.800 | So you couldn't drive the car with your feet
01:02:14.560 | under the steering column.
01:02:15.720 | We actually had to straddle the main column
01:02:19.560 | of the vehicle, and have basically one foot in the passenger seat,
01:02:23.600 | one foot in the driver's seat
01:02:25.280 | and then drive with our left hand.
01:02:27.620 | But the hard part was we had to then park the car,
01:02:31.320 | get out of the car.
01:02:33.080 | It didn't have a door, that was okay,
01:02:34.320 | but it's just getting up from crouch, from sitting
01:02:38.760 | when you're in this very constrained environment.
01:02:41.920 | - First of all, I remember after watching those videos,
01:02:44.360 | I was much more cognizant of how hard it is
01:02:47.080 | for me to get in and out of the car,
01:02:49.640 | and out of the car especially.
01:02:51.280 | Like it's actually a really difficult control problem.
01:02:54.280 | - Yeah.
01:02:55.120 | - And I'm very cognizant of it when I'm like injured
01:02:58.360 | for whatever reason. - No, it's really hard.
01:03:00.120 | - Yeah.
01:03:01.440 | So how did you approach this problem?
01:03:03.560 | - So we had, you think of NASA's operations
01:03:08.160 | and they have these checklists, pre-launch checklists
01:03:10.680 | and they're like, we weren't far off from that.
01:03:12.380 | We had this big checklist and on the first day
01:03:14.720 | of the competition, we were running down our checklist.
01:03:17.520 | And one of the things we had to do,
01:03:19.120 | we had to turn off the controller,
01:03:21.320 | the piece of software that was running
01:03:23.320 | that would drive the left foot of the robot
01:03:25.560 | in order to accelerate on the gas.
01:03:28.120 | And then we turned on our balancing controller.
01:03:30.840 | And the nerves, jitters of the first day of the competition,
01:03:34.280 | someone forgot to check that box
01:03:35.680 | and turn that controller off.
01:03:37.560 | So we used a lot of motion planning to figure out
01:03:42.280 | a sort of configuration of the robot
01:03:45.320 | that we could get up and over.
01:03:47.200 | We relied heavily on our balancing controller.
01:03:50.300 | And basically there were, when the robot was in one
01:03:53.760 | of its most precarious sort of configurations,
01:03:57.560 | trying to sneak its big leg out of the side,
01:04:00.920 | the other controller that thought it was still driving
01:04:05.040 | told its left foot to go like this.
01:04:06.880 | And that wasn't good, but it turned disastrous for us
01:04:11.880 | because what happened was a little bit of push here.
01:04:17.000 | Actually, we have videos of us running into the robot
01:04:21.100 | with a 10 foot pole and it kind of will recover.
01:04:24.720 | But this is a case where there's no space to recover.
01:04:27.840 | So a lot of our secondary balancing mechanisms
01:04:30.200 | about like take a step to recover,
01:04:32.200 | they were all disabled because we were in the car
01:04:33.800 | and there's no place to step.
01:04:35.360 | So we were relying on our just lowest level reflexes.
01:04:38.400 | And even then, I think just hitting the foot on the seat,
01:04:42.200 | on the floor, we probably could have recovered from it.
01:04:45.000 | But the thing that was bad that happened
01:04:46.440 | is when we did that and we jostled a little bit,
01:04:49.480 | the tailbone of our robot was only a little off the seat,
01:04:53.760 | it hit the seat.
01:04:54.600 | And the other foot came off the ground just a little bit.
01:04:58.280 | And nothing in our plans had ever told us what to do
01:05:02.320 | if your butt's on the seat and your feet are in the air.
01:05:05.160 | - Feet in the air.
01:05:06.080 | - And then the thing is, once you get off the script,
01:05:10.000 | things can go very wrong because even our state estimation,
01:05:12.800 | our system that was trying to collect all the data
01:05:15.240 | from the sensors and understand
01:05:16.800 | what's happening with the robot,
01:05:18.520 | it didn't know about this situation.
01:05:20.120 | So it was predicting things that were just wrong.
01:05:22.840 | And then we did a violent shake and fell,
01:05:26.560 | face first, out of the vehicle.
01:05:29.200 | - But like into the destination.
01:05:32.560 | - That's true, we fell in and we got our point for egress.
01:05:35.480 | - But so is there any hope for, that's interesting,
01:05:39.320 | is there any hope for Atlas to be able to do something
01:05:43.320 | when it's just on its butt and feet in the air?
01:05:46.360 | - Absolutely.
01:05:47.200 | No, so that is one of the big challenges.
01:05:50.960 | And I think it's still true, you know,
01:05:53.880 | Boston Dynamics and ANYmal and there's this incredible
01:05:58.880 | work on legged robots happening around the world.
01:06:02.040 | Most of them still are very good at the case
01:06:07.680 | where you're making contact with the world at your feet.
01:06:10.160 | And they have typically point feet, relatively,
01:06:12.240 | like balls on their feet, for instance.
01:06:14.560 | If those robots get in a situation where the elbow
01:06:17.920 | hits the wall or something like this,
01:06:19.960 | that's a pretty different situation.
01:06:21.320 | Now they have layers of mechanisms that will make,
01:06:24.200 | I think the more mature solutions have ways in which
01:06:28.680 | the controller won't do stupid things.
01:06:31.320 | But a human for instance, is able to leverage
01:06:34.840 | incidental contact in order to accomplish a goal.
01:06:36.840 | In fact, I might, if you push me, I might actually
01:06:38.920 | put my hand out and make a brand new contact.
01:06:42.320 | The feet of the robot are doing this on quadrupeds,
01:06:45.040 | but we mostly in robotics are afraid of contact
01:06:49.200 | on the rest of our body, which is crazy.
01:06:52.100 | There's this whole field of motion planning,
01:06:56.120 | collision free motion planning.
01:06:58.120 | And we write very complex algorithms so that the robot
01:07:00.760 | can dance around and make sure it doesn't touch the world.
01:07:04.120 | - So people are just afraid of contact
01:07:07.800 | 'cause contact is seen as a difficult problem.
01:07:09.920 | - It's still a difficult control problem
01:07:12.280 | and sensing problem.
01:07:13.420 | - Now you're a serious person.
01:07:17.080 | (laughs)
01:07:19.880 | - I'm a little bit of an idiot
01:07:21.200 | and I'm going to ask you some dumb questions.
01:07:24.160 | So I do martial arts.
01:07:27.160 | So like jujitsu, I wrestled my whole life.
01:07:30.400 | So let me ask the question,
01:07:33.280 | whenever people learn that I do any kind of AI
01:07:36.920 | or like I mentioned robots and things like that,
01:07:39.760 | they say when are we gonna have robots
01:07:41.920 | that can win in a wrestling match
01:07:46.840 | or in a fight against a human?
01:07:49.040 | So we just mentioned sitting on your butt,
01:07:52.200 | if you're in the air, that's a common position,
01:07:53.960 | jujitsu when you're on the ground,
01:07:55.440 | you're a downed opponent.
01:07:57.640 | Like how difficult do you think is the problem
01:08:03.840 | and when will we have a robot that can defeat a human
01:08:06.920 | in a wrestling match?
01:08:08.640 | And we're talking about a lot,
01:08:10.920 | I don't know if you're familiar with wrestling,
01:08:12.480 | but essentially--
01:08:14.040 | - Not very.
01:08:16.240 | - It's basically the art of contact.
01:08:19.680 | It's like, 'cause you're picking contact points
01:08:24.680 | and then using like leverage,
01:08:27.040 | like to off balance, to trick people.
01:08:31.440 | It's like you make them feel like you're doing one thing
01:08:35.660 | and then they change their balance
01:08:38.880 | and then you switch what you're doing
01:08:41.640 | and then results in a throw or whatever.
01:08:44.080 | So it's basically the art of multiple contacts.
01:08:47.880 | - Awesome, that's a nice description of it.
01:08:50.840 | So there's also an opponent in there, right?
01:08:53.040 | So if--
01:08:54.160 | - Very dynamic.
01:08:55.040 | - Right, if you are wrestling a human
01:08:58.520 | and are in a game theoretic situation with a human,
01:09:02.880 | that's still hard.
01:09:06.600 | But just to speak to the quickly reasoning
01:09:09.160 | about contact part of it, for instance.
01:09:11.360 | - Yeah, maybe even throwing the game theory out of it,
01:09:13.400 | almost like a, yeah, almost like a non-dynamic opponent.
01:09:17.700 | - Right.
01:09:18.640 | There's reasons to be optimistic,
01:09:20.080 | but I think our best understanding of those problems
01:09:22.660 | are still pretty hard.
01:09:23.940 | I have been increasingly focused on manipulation,
01:09:29.800 | partly where that's a case where the contact
01:09:31.720 | has to be much more rich.
01:09:35.800 | And there are some really impressive examples
01:09:38.240 | of deep learning policies, controllers,
01:09:41.800 | that can appear to do good things through contact.
01:09:46.800 | We've even got new examples of deep learning models
01:09:52.480 | for predicting what's gonna happen to objects
01:09:54.360 | as they go through contact.
01:09:56.200 | But I think the challenge you just offered there
01:09:59.760 | still eludes us, right?
01:10:01.480 | The ability to make a decision
01:10:03.600 | based on those models quickly.
01:10:05.280 | I have to think though, it's hard for humans too,
01:10:10.080 | when you get that complicated.
01:10:11.320 | I think probably you had maybe a slow motion version
01:10:16.080 | of where you learn the basic skills,
01:10:17.920 | and you've probably gotten better at it,
01:10:20.660 | and there's much more subtlety.
01:10:24.620 | But it might still be hard to actually,
01:10:26.820 | really on the fly, take a model of your humanoid
01:10:32.120 | and figure out how to plan the optimal sequence.
01:10:35.240 | That might be a problem we never solve.
01:10:36.680 | - Well, I mean, one of the most amazing things to me
01:10:40.360 | about the, we could talk about martial arts,
01:10:43.720 | we could also talk about dancing,
01:10:45.360 | doesn't really matter, too human.
01:10:47.680 | I think it's the most interesting study of contact.
01:10:51.200 | It's not even the dynamic element of it.
01:10:53.200 | Like when you get good at it, it's so effortless.
01:10:58.760 | Like I can just, I'm very cognizant
01:11:00.880 | of the entirety of the learning process
01:11:03.400 | being essentially like learning how to move my body
01:11:07.640 | in a way that I could throw very large weights around
01:11:12.640 | effortlessly.
01:11:14.700 | And I can feel the learning.
01:11:18.500 | Like I'm a huge believer in drilling of techniques.
01:11:21.560 | And you can just like feel your,
01:11:23.560 | you're not feeling, you're feeling, sorry,
01:11:26.760 | you're learning it intellectually a little bit,
01:11:29.760 | but a lot of it is the body learning it somehow,
01:11:32.800 | like instinctually.
01:11:33.880 | And whatever that learning is, that's really,
01:11:37.160 | I'm not even sure if that's equivalent
01:11:40.760 | to like a deep learning, learning a controller.
01:11:44.720 | I think it's something more,
01:11:46.780 | it feels like there's a lot of distributed learning
01:11:49.680 | going on.
01:11:50.520 | - Yeah, I think there's hierarchy and composition
01:11:54.480 | probably in the systems
01:11:57.960 | that we don't capture very well yet.
01:11:59.900 | You have layers of control systems,
01:12:02.440 | you have reflexes at the bottom layer,
01:12:03.960 | and you have a system that's capable of planning a vacation
01:12:08.960 | to some distant country,
01:12:11.320 | which is probably,
01:12:12.560 | you probably don't have a controller or a policy
01:12:14.840 | for every possible destination you'll ever pick, right?
01:12:18.740 | But there's something magical in the in-between.
01:12:23.480 | And how do you go from these low level feedback loops
01:12:26.400 | to something that feels like a pretty complex
01:12:30.040 | set of outcomes?
01:12:31.080 | My guess is, I think there's evidence
01:12:34.760 | that you can plan at some of these levels, right?
01:12:37.640 | So Josh Tenenbaum just showed it in his talk the other day.
01:12:41.760 | He's got a game he likes to talk about,
01:12:43.360 | I think he calls it the pick three game or something,
01:12:46.720 | where he puts a bunch of clutter down in front of a person,
01:12:50.760 | and he says, "Okay, pick three objects."
01:12:52.400 | And it might be a telephone or a shoe
01:12:55.760 | or a Kleenex box or whatever.
01:12:58.980 | And apparently you pick three items and then you pick,
01:13:01.880 | he says, "Okay, pick the first one up with your right hand,
01:13:04.120 | the second one up with your left hand.
01:13:06.400 | Now using those objects,
01:13:08.040 | now as tools, pick up the third object."
01:13:10.160 | Right, so that's down at the level of physics and mechanics
01:13:16.120 | and contact mechanics that I think we do learning,
01:13:20.520 | or we do have policies for,
01:13:21.920 | we do control for, almost feedback.
01:13:24.760 | But somehow we're able to still,
01:13:26.320 | I mean, I've never picked up a telephone
01:13:28.440 | with a shoe and a water bottle before,
01:13:30.280 | and somehow, and it takes me a little longer
01:13:32.480 | to do that the first time,
01:13:34.520 | but most of the time we can sort of figure that out.
01:13:37.300 | So, yeah, I think the amazing thing
01:13:41.240 | is this ability to be flexible with our models,
01:13:44.120 | plan when we need to,
01:13:46.160 | use our well-oiled controllers when we don't,
01:13:49.320 | when we're in familiar territory.
01:13:51.820 | Having models, I think the other thing you just said
01:13:55.560 | was something about,
01:13:57.120 | I think your awareness of what's happening
01:13:58.800 | is even changing as you improve your expertise, right?
01:14:02.360 | So maybe you have a very approximate model
01:14:04.960 | of the mechanics to begin with,
01:14:06.240 | and as you gain expertise,
01:14:09.320 | you get a more refined version of that model.
01:14:11.920 | You're aware of muscles or balanced components
01:14:17.080 | that you just weren't even aware of before.
01:14:19.680 | So how do you scaffold that?
01:14:21.760 | - Yeah, plus the fear of injury,
01:14:24.200 | the ambition of goals, of excelling,
01:14:28.800 | and fear of mortality.
01:14:32.040 | Let's see what else is in there as motivations.
01:14:34.620 | Overinflated ego in the beginning,
01:14:38.040 | and then a crash of confidence in the middle.
01:14:42.880 | All of those seem to be essential for the learning process.
01:14:46.720 | And if all that's good,
01:14:48.160 | then you're probably optimizing energy efficiency.
01:14:50.520 | - Yeah, right, so we have to get that right.
01:14:53.120 | So there was this idea that you would have robots play soccer
01:14:58.120 | better than human players by 2050.
01:15:03.840 | That was the goal.
01:15:05.720 | Basically, was the goal to beat world champion team,
01:15:10.200 | to become a World Cup,
01:15:11.400 | beat like a World Cup level team?
01:15:13.400 | So are we gonna see that first,
01:15:15.920 | or a robot, if you're familiar,
01:15:19.600 | there's an organization called UFC for mixed martial arts.
01:15:23.480 | Are we gonna see a World Cup championship soccer team
01:15:27.160 | that have robots,
01:15:28.480 | or a UFC champion mixed martial artist that's a robot?
01:15:33.480 | - I mean, it's very hard to say one thing is harder,
01:15:37.200 | some problem's harder than the other.
01:15:38.640 | What probably matters is who started the organization.
01:15:45.040 | I mean, I think RoboCup has a pretty serious following,
01:15:47.160 | and there is a history now of people playing that game,
01:15:50.880 | learning about that game,
01:15:51.840 | building robots to play that game,
01:15:53.640 | building increasingly more human robots.
01:15:55.880 | It's got momentum.
01:15:57.040 | So if you want to have mixed martial arts compete,
01:16:00.960 | you better start your organization now, right?
01:16:04.000 | I think almost independent of which problem
01:16:07.720 | is technically harder,
01:16:08.680 | 'cause they're both hard and they're both different.
01:16:11.400 | - That's a good point.
01:16:12.240 | I mean, those videos are just hilarious.
01:16:14.680 | Like, especially the humanoid robots trying to play soccer.
01:16:18.840 | I mean, they're kind of terrible right now.
01:16:23.400 | - I mean, I guess there is RoboSumo wrestling.
01:16:26.000 | There's like the RoboOne competitions
01:16:27.880 | where they do have these robots that go on a table
01:16:31.160 | and basically fight.
01:16:32.080 | So maybe I'm wrong.
01:16:33.720 | - First of all, do you have a year in mind for RoboCup,
01:16:37.160 | just from a robotics perspective?
01:16:39.080 | Seems like a super exciting possibility.
01:16:42.160 | - That, like in the physical space,
01:16:46.360 | this is what's interesting.
01:16:47.640 | I think the world is captivated.
01:16:50.600 | I think it's really exciting.
01:16:52.800 | It inspires just a huge number of people
01:16:56.440 | when a machine beats a human
01:17:00.840 | at a game that humans are really damn good at.
01:17:03.520 | So you're talking about chess and Go,
01:17:05.800 | but that's in the world of digital.
01:17:09.880 | I don't think machines have beat humans
01:17:13.360 | at a game in the physical space yet,
01:17:16.080 | but that would be just--
01:17:17.760 | - You have to make the rules very carefully, right?
01:17:20.400 | I mean, if Atlas kicked me in the shins, I'm down,
01:17:23.040 | and game over.
01:17:25.520 | So it's very subtle on what's fair.
01:17:30.520 | - I think the fighting one is a weird one, yeah,
01:17:33.320 | 'cause you're talking about a machine
01:17:35.240 | that's much stronger than you.
01:17:36.560 | But yeah, in terms of soccer, basketball,
01:17:39.200 | all those kinds. - Even soccer, right?
01:17:40.440 | I mean, as soon as there's contact or whatever,
01:17:43.480 | and there are some things that the robot will do better.
01:17:46.520 | I think if you really set yourself up to try to see
01:17:51.520 | could robots win the game of soccer
01:17:53.120 | as the rules were written,
01:17:54.560 | the right thing for the robot to do
01:17:56.920 | is to play very differently than a human would play.
01:17:59.640 | You're not gonna get the perfect soccer player robot.
01:18:04.000 | You're gonna get something that exploits the rules,
01:18:07.880 | exploits its super actuators,
01:18:10.720 | its super low bandwidth feedback loops or whatever,
01:18:14.440 | and it's gonna play the game differently
01:18:15.720 | than you want it to play.
01:18:16.960 | And I bet there's ways, I bet there's loopholes, right?
01:18:21.360 | We saw that in the DARPA challenge,
01:18:24.120 | that it's very hard to write a set of rules
01:18:28.200 | that someone can't find a way to exploit.
01:18:32.840 | - Let me ask another ridiculous question.
01:18:35.600 | I think this might be the last ridiculous question, but--
01:18:38.640 | - I doubt it.
01:18:39.480 | (laughing)
01:18:41.080 | - I aspire to ask as many ridiculous questions
01:18:44.560 | of a brilliant MIT professor.
01:18:48.040 | Okay, I don't know if you've seen The Black Mirror.
01:18:52.440 | - It's funny, I never watched the episode.
01:18:56.720 | I know when it happened, though,
01:18:58.840 | because I gave a talk to some MIT faculty one day,
01:19:03.080 | on an unassuming Monday or whatever,
01:19:05.720 | I was telling them about the state of robotics.
01:19:08.480 | And I showed some video from Boston Dynamics
01:19:10.760 | of the quadruped Spot at the time.
01:19:13.960 | It was their early version of Spot.
01:19:15.920 | And there was a look of horror that went across the room.
01:19:19.280 | And I said, "I've shown videos like this a lot of times.
01:19:23.160 | "What happened?"
01:19:24.000 | And it turns out that this video had,
01:19:26.800 | this Black Mirror episode had changed the way people watched
01:19:31.920 | the videos I was putting out.
01:19:33.120 | - The way they see these kinds of robots.
01:19:34.720 | So I talked to so many people who are just terrified
01:19:37.760 | because of that episode probably of these kinds of robots.
01:19:40.960 | I almost want to say that they almost enjoy being terrified.
01:19:44.520 | I don't even know what it is about human psychology
01:19:47.100 | that kind of imagine doomsday,
01:19:49.240 | the destruction of the universe or our society,
01:19:52.760 | and kind of enjoy being afraid.
01:19:57.320 | I don't want to simplify it,
01:19:58.360 | but it feels like they talk about it so often,
01:20:01.000 | it almost, there does seem to be an addictive quality to it.
01:20:06.000 | I talked to a guy, a guy named Joe Rogan,
01:20:09.440 | who's kind of the flag bearer
01:20:11.520 | for being terrified of these robots.
01:20:13.620 | Do you have a, two questions.
01:20:17.280 | One, do you have an understanding
01:20:18.560 | of why people are afraid of robots?
01:20:21.640 | And the second question is, in Black Mirror,
01:20:24.880 | just to tell you the episode,
01:20:26.320 | I don't even remember it that much anymore,
01:20:28.100 | but these robots, I think they can shoot
01:20:31.020 | like a pellet or something.
01:20:32.740 | They basically have, it's basically a Spot with a gun.
01:20:36.460 | And how far are we away from having robots
01:20:41.460 | that go rogue like that?
01:20:43.780 | You know, basically Spot that goes rogue for some reason
01:20:48.380 | and somehow finds a gun.
01:20:49.940 | - Right, so, I mean, I'm not a psychologist.
01:20:56.380 | I think, I don't know exactly why people react
01:21:00.740 | the way they do.
01:21:01.640 | I think we have to be careful
01:21:06.980 | about the way robots influence our society and the like.
01:21:09.860 | I think that's something, that's a responsibility
01:21:11.700 | that roboticists need to embrace.
01:21:14.000 | I don't think robots are gonna come after me
01:21:17.380 | with a kitchen knife or a pellet gun right away.
01:21:20.340 | And I mean, if they were programmed in such a way,
01:21:23.180 | but I used to joke with Atlas
01:21:25.340 | that all I had to do was run for five minutes
01:21:28.700 | and its battery would run out.
01:21:30.180 | But actually they've got a very big battery
01:21:32.460 | in there by the end.
01:21:33.300 | So it was over an hour.
01:21:34.460 | I think the fear is a bit cultural though.
01:21:39.380 | 'Cause I mean, you notice that,
01:21:42.740 | like I think in my age in the US,
01:21:45.920 | we grew up watching Terminator, right?
01:21:48.220 | If I had grown up at the same time in Japan,
01:21:50.460 | I probably would have been watching Astro Boy.
01:21:52.720 | And there's a very different reaction to robots
01:21:55.840 | in different countries, right?
01:21:57.440 | So I don't know if it's a human innate fear
01:22:00.000 | of metal marvels,
01:22:02.600 | or if it's something that we've done to ourselves
01:22:06.400 | with our sci-fi.
01:22:07.420 | - Yeah, the stories we tell ourselves through movies,
01:22:12.560 | through just, through popular media.
01:22:16.760 | But if I were to tell,
01:22:19.560 | if you were my therapist and I said,
01:22:21.520 | "I'm really terrified that we're going to have these robots
01:22:26.280 | very soon that will hurt us."
01:22:29.380 | Like, how do you approach making me feel better?
01:22:35.560 | Like, why shouldn't people be afraid?
01:22:39.600 | There's a, I think there's a video that went viral recently.
01:22:44.480 | Everything Spot and Boston Dynamics does,
01:22:46.720 | it goes viral in general.
01:22:48.360 | But usually it's like really cool stuff.
01:22:50.040 | Like they're doing flips and stuff,
01:22:51.400 | or like sad stuff,
01:22:52.720 | Atlas being hit with a broomstick or something like that.
01:22:57.280 | But there's a video where I think
01:23:00.800 | one of the new production Spot robots,
01:23:03.560 | which are awesome.
01:23:04.600 | It was like patrolling somewhere in some country.
01:23:08.520 | And people immediately were saying,
01:23:11.880 | this is the dystopian future, the surveillance state.
01:23:16.320 | For some reason, you can just have a camera.
01:23:19.440 | Something about Spot, being able to walk on four feet,
01:23:23.400 | like really terrified people.
01:23:25.920 | So what do you say to those people?
01:23:30.920 | I think there is a legitimate fear there
01:23:33.800 | because so much of our future is uncertain.
01:23:36.140 | But at the same time, technically speaking,
01:23:40.080 | it seems like we're not there yet.
01:23:41.880 | So what do you say?
01:23:42.840 | - I mean, I think technology is complicated.
01:23:48.560 | It can be used in many ways.
01:23:49.920 | I think there are purely software attacks
01:23:53.160 | somebody could use to do great damage.
01:23:59.000 | Maybe they have already.
01:24:00.480 | I think wheeled robots could be used in bad ways too.
01:24:06.480 | - Drones. - Drones, right?
01:24:10.520 | I don't think that, let's see.
01:24:16.360 | I don't want to be building technology
01:24:19.880 | just because I'm compelled to build technology
01:24:21.840 | and I don't think about it.
01:24:23.520 | But I would consider myself a technological optimist,
01:24:27.720 | I guess, in the sense that I think we should continue
01:24:32.200 | to create and evolve and our world will change.
01:24:37.200 | And we will introduce new challenges,
01:24:40.760 | we'll screw something up maybe.
01:24:42.880 | But I think also we'll invent ourselves
01:24:46.200 | out of those challenges and life will go on.
01:24:49.360 | - So it's interesting 'cause you didn't mention
01:24:51.880 | this is technically too hard.
01:24:54.520 | - I don't think robots are,
01:24:55.880 | I think people attribute a robot that looks like an animal
01:24:59.120 | as maybe having a level of self-awareness
01:25:02.120 | or consciousness or something that they don't have yet.
01:25:05.200 | I think our ability to anthropomorphize those robots
01:25:11.720 | is probably, we're assuming that they have
01:25:14.920 | a level of intelligence that they don't yet have
01:25:17.960 | and that might be part of the fear.
01:25:20.060 | So in that sense, it's too hard.
01:25:22.280 | But there are many scary things in the world, right?
01:25:25.760 | So I think we're right to ask those questions,
01:25:29.880 | we're right to think about the implications of our work.
01:25:33.600 | - Right, in the short term as we're working on it, for sure.
01:25:39.720 | Is there something long-term that scares you
01:25:43.880 | about our future with AI and robots?
01:25:47.680 | A lot of folks from Elon Musk to Sam Harris
01:25:52.400 | to a lot of folks talk about the existential threats
01:25:56.860 | about artificial intelligence.
01:25:58.880 | Oftentimes robots kind of inspire that the most
01:26:03.680 | because of the anthropomorphism.
01:26:05.840 | Do you have any fears?
01:26:07.400 | - It's an important question.
01:26:09.900 | I actually, I think I like Rod Brooks' answer
01:26:14.900 | maybe the best on this.
01:26:16.620 | I think, and it's not the only answer he's given
01:26:18.900 | over the years, but maybe one of my favorites is,
01:26:21.300 | he says, it's not gonna be,
01:26:25.420 | he's got a book, "Flesh and Machines," I believe.
01:26:27.820 | It's not gonna be the robots versus the people.
01:26:31.900 | We're all gonna be robot people.
01:26:34.260 | Because we already have smartphones,
01:26:37.980 | some of us have serious technology
01:26:40.700 | implanted in our bodies already,
01:26:41.940 | whether we have a hearing aid or a pacemaker
01:26:44.940 | or anything like this.
01:26:46.360 | People with amputations might have prosthetics.
01:26:50.880 | That's a trend I think that is likely to continue.
01:26:57.300 | I mean, this is now wild speculation.
01:27:01.380 | But I mean, when do we get to cognitive implants
01:27:05.460 | and the like?
01:27:06.580 | Yeah, with Neuralink, brain-computer interfaces.
01:27:09.460 | That's interesting.
01:27:10.300 | So there's a dance between humans and robots.
01:27:12.580 | It's going to be,
01:27:13.920 | it's going to be impossible to be scared
01:27:18.900 | of the other out there, the robot,
01:27:23.380 | because the robot will be part of us, essentially.
01:27:26.020 | It'd be so intricately sort of part of our society.
01:27:29.860 | - Yeah, and it might not even be implanted part of us,
01:27:33.040 | but just it's so much a part of our society.
01:27:37.220 | - So in that sense, the smartphone is already the robot
01:27:39.380 | we should be afraid of, yeah.
01:27:40.820 | I mean, yeah, and all the usual fears arise
01:27:44.720 | of the misinformation, the manipulation,
01:27:50.440 | all those kinds of things that,
01:27:53.520 | the problems are all the same.
01:27:57.880 | They're human problems, essentially, it feels like.
01:28:00.700 | - Yeah, I mean, I think the way we interact
01:28:03.420 | with each other online is changing the value we put on
01:28:06.740 | personal interaction, and that's a crazy big change
01:28:10.460 | that's going to happen and has already been ripping
01:28:13.080 | through our society, right?
01:28:14.180 | And that has implications that are massive.
01:28:18.100 | I don't know if they should be scared of it
01:28:19.300 | or go with the flow, but I don't see
01:28:23.700 | some battle lines between humans and robots
01:28:26.540 | being the first thing to worry about.
01:28:29.620 | - I mean, I do want to just, as a kind of comment,
01:28:33.380 | maybe you can comment about your just feelings
01:28:35.500 | about Boston Dynamics in general,
01:28:37.740 | but I love science, I love engineering.
01:28:40.340 | I think there's so many beautiful ideas in it.
01:28:42.580 | And when I look at Boston Dynamics
01:28:45.340 | or legged robots in general, I think they inspire people
01:28:50.340 | curiosity and feelings in general, excitement
01:28:56.500 | about engineering more than almost anything else
01:28:58.980 | in popular culture.
01:29:00.660 | And I think that's such an exciting responsibility
01:29:04.820 | and possibility for robotics.
01:29:06.860 | And Boston Dynamics is riding that wave pretty damn well.
01:29:10.500 | Like they found it, they've discovered that hunger
01:29:14.020 | and curiosity in the people, and they're doing magic with it.
01:29:17.580 | I don't care if they, I mean, I guess it's their company,
01:29:19.860 | they have to make money, right?
01:29:21.380 | But they're already doing incredible work
01:29:24.340 | in inspiring the world about technology.
01:29:26.980 | I mean, do you have thoughts about Boston Dynamics
01:29:30.740 | and maybe others, your own work in robotics
01:29:34.660 | and inspiring the world in that way?
01:29:36.620 | - I completely agree.
01:29:39.020 | I think Boston Dynamics is absolutely awesome.
01:29:42.660 | I think I show my kids those videos,
01:29:46.140 | and the best thing that happens is sometimes
01:29:48.620 | they've already seen them, right?
01:29:50.740 | I think, I just think it's a pinnacle of success
01:29:55.380 | in robotics that is just one of the best things
01:29:58.780 | that's happened.
01:29:59.620 | I absolutely, completely agree.
01:30:01.660 | - One of the heartbreaking things to me
01:30:05.180 | is how many robotics companies fail.
01:30:09.580 | How hard it is to make money with the robotics company.
01:30:13.060 | Like iRobot went through hell just to arrive at the Roomba
01:30:18.060 | to figure out one product.
01:30:19.820 | And then there's so many home robotics companies
01:30:23.860 | like Jibo and Anki, the cutest toy.
01:30:28.860 | That was a great robot, I thought, and it went down.
01:30:35.260 | I'm forgetting a bunch of them,
01:30:36.300 | but a bunch of robotics companies fail.
01:30:37.980 | Rod's company, Rethink Robotics.
01:30:40.580 | Like, do you have anything hopeful to say
01:30:47.220 | about the possibility of making money with robots?
01:30:50.300 | - Oh, I think you can't just look at the failures.
01:30:53.940 | You can, I mean, Boston Dynamics is a success.
01:30:55.940 | There's lots of companies that are still
01:30:57.580 | doing amazingly good work in robotics.
01:31:01.140 | I mean, this is the capitalist ecology or something, right?
01:31:05.380 | I think you have many companies, you have many startups,
01:31:07.700 | and they push each other forward, and many of them fail,
01:31:11.380 | and some of them get through,
01:31:12.500 | and that's sort of the natural--
01:31:15.580 | - Way of things.
01:31:16.420 | - Way of those things.
01:31:17.260 | I don't know that, is robotics really that much worse?
01:31:20.460 | I feel the pain that you feel too.
01:31:22.300 | Every time I read one of these, sometimes it's friends,
01:31:26.460 | and I definitely wish it went better, went differently.
01:31:31.460 | But I think it's healthy and good to have bursts of ideas,
01:31:38.340 | bursts of activities, ideas.
01:31:40.660 | If they are really aggressive, they should fail sometimes.
01:31:43.500 | Certainly that's the research mantra, right?
01:31:46.940 | If you're succeeding at every problem you attempt,
01:31:50.780 | then you're not choosing aggressively enough.
01:31:53.420 | - Is it exciting to you, the new Spot?
01:31:56.020 | - Oh, it's so good.
01:31:57.660 | - When are you getting him as a pet, or it?
01:32:00.220 | - Yeah, I mean, I have to dig up 75K right now.
01:32:03.300 | (laughing)
01:32:04.140 | - I mean, it's so cool that there's a price tag.
01:32:05.740 | You can go and then actually buy it.
01:32:08.660 | - I have a Skydio R1, love it.
01:32:11.540 | So, no, I would absolutely be a customer.
01:32:17.540 | - I wonder what your kids would think about it.
01:32:20.100 | - I actually, Zach from Boston Dynamics
01:32:24.420 | would let my kid drive it in one of their demos one time,
01:32:27.180 | and that was just so good, so good.
01:32:31.140 | So, I'll forever be grateful for that.
01:32:34.220 | - And there's something magical
01:32:35.540 | about the anthropomorphization of that arm.
01:32:38.940 | It adds another level of human connection.
01:32:42.620 | I'm not sure we understand from a control aspect
01:32:47.500 | the value of anthropomorphization.
01:32:49.540 | I think that's an understudied
01:32:54.020 | and under-understood engineering problem.
01:32:57.060 | There's been a, psychologists have been studying it.
01:33:00.180 | I think it's part, like, manipulating our mind
01:33:02.900 | to believe things is a valuable engineering tool.
01:33:06.780 | Like, this is another degree of freedom
01:33:08.860 | that can be controlled.
01:33:09.860 | - I like that, yeah, I think that's right.
01:33:11.420 | I think there's something that humans seem to do,
01:33:15.980 | or maybe my dangerous introspection is,
01:33:19.020 | I think we are able to make very simple models
01:33:23.860 | that assume a lot about the world very quickly,
01:33:27.820 | and then it takes us a lot more time, like your wrestling.
01:33:31.260 | You probably thought you knew
01:33:32.660 | what you were doing with wrestling,
01:33:33.740 | and you were fairly functional as a complete wrestler,
01:33:36.940 | and then you slowly got more expertise.
01:33:39.380 | So maybe it's natural that our first level of defense
01:33:44.380 | against seeing a new robot is to think of it
01:33:48.140 | in our existing models of how humans and animals behave.
01:33:52.500 | And it's just, as you spend more time with it,
01:33:55.140 | then you'll develop more sophisticated models
01:33:57.100 | that will appreciate the differences.
01:33:59.540 | - Exactly.
01:34:01.700 | Can you say what does it take to control a robot?
01:34:04.460 | Like, what is the control problem of a robot?
01:34:08.620 | And in general, what is a robot in your view?
01:34:11.020 | Like, how do you think of this system?
01:34:13.940 | - What is a robot?
01:34:16.060 | - What is a robot?
01:34:17.620 | - I think robotics-- - I told you
01:34:18.700 | ridiculous questions.
01:34:20.060 | - No, no, it's good.
01:34:21.540 | I mean, there's standard definitions
01:34:23.020 | of combining computation with some ability
01:34:27.500 | to do mechanical work.
01:34:29.100 | I think that gets us pretty close.
01:34:31.020 | But I think robotics has this problem
01:34:34.220 | that once things really work,
01:34:37.220 | we don't call them robots anymore.
01:34:38.940 | Like, my dishwasher at home is pretty sophisticated,
01:34:42.940 | beautiful mechanisms.
01:34:45.620 | There's actually a pretty good computer,
01:34:46.940 | probably a couple of chips in there doing amazing things.
01:34:49.580 | We don't think of that as a robot anymore,
01:34:51.620 | which isn't fair, 'cause then,
01:34:53.300 | roughly it means that robotics always has to
01:34:56.220 | solve the next problem
01:34:58.340 | and doesn't get to celebrate its past successes.
01:35:00.580 | I mean, even factory room floor robots
01:35:04.740 | are super successful.
01:35:06.860 | They're amazing.
01:35:08.260 | But that's not the ones,
01:35:09.500 | I mean, people think of them as robots,
01:35:10.860 | but they don't, if you ask what are the successes
01:35:13.300 | of robotics, somehow it doesn't come
01:35:16.140 | to your mind immediately.
01:35:17.860 | - So the definition of robot is a system
01:35:20.540 | with some level of automation that fails frequently.
01:35:23.500 | - Something like, it's the computation
01:35:25.940 | plus mechanical work and an unsolved problem.
01:35:29.940 | (laughing)
01:35:30.780 | - Unsolved problem, yeah.
01:35:32.260 | So from a perspective of control
01:35:35.380 | and mechanics, dynamics, what is a robot?
01:35:39.780 | - So there are many different types of robots.
01:35:42.340 | The control that you need for a Jibo robot,
01:35:47.340 | some robot that's sitting on your countertop
01:35:50.540 | and interacting with you,
01:35:52.380 | but not touching you, for instance,
01:35:54.660 | is very different than what you need for an autonomous car
01:35:56.820 | or an autonomous drone.
01:35:59.420 | It's very different than what you need for a robot
01:36:00.980 | that's gonna walk or pick things up with its hands.
01:36:04.700 | My passion has always been for the places
01:36:09.100 | where you're interacting more,
01:36:10.500 | you're doing more dynamic interactions with the world.
01:36:13.660 | So walking, now manipulation.
01:36:17.700 | And the control problems there are beautiful.
01:36:21.660 | I think contact is one thing that differentiates them
01:36:25.900 | from many of the control problems we've solved classically.
01:36:29.180 | Right, like modern control grew up stabilizing fighter jets
01:36:32.740 | that were passively unstable
01:36:34.020 | and there's like amazing success stories
01:36:36.380 | from control all over the place.
01:36:38.060 | Power grid, I mean, there's all kinds of,
01:36:41.300 | it's everywhere that we don't even realize,
01:36:44.620 | just like AI is now.
01:36:47.500 | - So you mentioned contact, like what's contact?
01:36:50.660 | - So an airplane is an extremely complex system
01:36:54.940 | or a spacecraft landing or whatever,
01:36:57.340 | but at least it has the luxury
01:36:59.300 | of things change relatively continuously.
01:37:03.620 | That's an oversimplification.
01:37:04.900 | But if I make a small change
01:37:07.020 | in the command I send to my actuator,
01:37:10.100 | then the path that the robot will take
01:37:12.660 | tends to change only by a small amount.
01:37:15.820 | - And there's a feedback mechanism here.
01:37:18.860 | That's what we're talking about.
01:37:19.700 | - And there's a feedback mechanism.
01:37:20.940 | And thinking about this as locally,
01:37:23.780 | like a linear system, for instance,
01:37:25.780 | I can use more linear algebra tools
01:37:29.180 | to study systems like that,
01:37:31.300 | generalizations of linear algebra
01:37:33.020 | to these smooth systems.
01:37:35.500 | What is contact?
01:37:37.300 | A robot has something very discontinuous
01:37:41.500 | that happens when it makes or breaks,
01:37:43.580 | when it starts touching the world.
01:37:45.380 | And even the way it touches or the order of contacts
01:37:48.060 | can change the outcome in potentially unpredictable ways.
01:37:53.060 | Not unpredictable, but complex ways.
01:37:55.860 | I do think there's a little bit of a,
01:38:01.420 | a lot of people will say that contact is hard in robotics,
01:38:04.540 | even to simulate.
01:38:05.580 | And I think there's a little bit of a,
01:38:08.660 | there's truth to that,
01:38:09.580 | but maybe a misunderstanding around that.
01:38:11.980 | So what is limiting is that when we think about our robots
01:38:18.500 | and we write our simulators,
01:38:21.340 | we often make an assumption that objects are rigid.
01:38:24.420 | And when it comes down to it,
01:38:28.100 | you know, that their mass,
01:38:30.180 | you know, stays in a constant position
01:38:32.140 | relative to itself.
01:38:33.740 | And that leads to some paradoxes
01:38:39.260 | when you go to try to talk about
01:38:40.500 | rigid body mechanics and contact.
01:38:43.140 | And so for instance,
01:38:45.460 | if I have a three-legged stool with just a,
01:38:49.500 | imagine it comes to a point at the legs.
01:38:51.900 | So it's only touching the world at a point.
01:38:54.420 | If I draw my physics,
01:38:56.940 | my high school physics diagram of this system,
01:39:00.340 | then there's a couple of things that I'm given
01:39:02.340 | by elementary physics.
01:39:03.860 | I know if the system, if the table is at rest,
01:39:06.380 | if it's not moving, it's zero velocities.
01:39:08.540 | That means that the normal force,
01:39:11.180 | all the forces are in balance.
01:39:13.340 | So the force of gravity is being countered
01:39:16.460 | by the forces that the ground is pushing on my table legs.
01:39:20.140 | I also know since it's not rotating
01:39:23.980 | that the moments have to balance.
01:39:29.620 | And since it's a three-dimensional table,
01:39:29.620 | it could fall in any direction.
01:39:31.220 | It actually tells me uniquely
01:39:33.140 | what those three normal forces have to be.
01:39:35.460 | If I have four legs on my table, four-legged table,
01:39:40.780 | and they were perfectly machined
01:39:43.380 | to be exactly the right, same height,
01:39:45.420 | and they're set down and the table's not moving,
01:39:48.140 | then the basic conservation laws don't tell me the answer uniquely;
01:39:52.060 | there are many solutions for the forces
01:39:54.140 | that the ground could be putting on my legs
01:39:56.700 | that would still result in the table not moving.
01:39:59.100 | Now, the reason, that seems fine.
01:40:02.340 | I could just pick one.
01:40:03.980 | But it gets funny now because if you think about friction,
01:40:06.880 | what we think about with friction is our standard model says
01:40:12.100 | the amount of force that the table will push back,
01:40:15.940 | if I were to now try to push my table sideways,
01:40:18.100 | I guess I have a table here,
01:40:19.460 | is proportional to the normal force.
01:40:24.060 | So if I'm barely touching and I push, I'll slide,
01:40:27.220 | but if I'm pushing more and I push, I'll slide less.
01:40:30.500 | It's called Coulomb friction, is our standard model.
01:40:33.780 | Now, if you don't know what the normal force is
01:40:35.580 | on the four legs and you push the table,
01:40:38.880 | then you don't know what the friction forces are gonna be.
01:40:42.440 | And so you can't actually tell,
01:40:45.620 | the laws just aren't explicit yet
01:40:48.020 | about which way the table's gonna go.
01:40:49.740 | It could veer off to the left,
01:40:51.420 | it could veer off to the right,
01:40:52.740 | it could go straight.
01:40:53.780 | So the rigid body assumption of contact
01:40:58.500 | leaves us with some paradoxes,
01:40:59.900 | which are annoying for writing simulators
01:41:02.900 | and for writing controllers.
01:41:04.300 | We still do that sometimes because soft contact
01:41:09.500 | is potentially harder numerically or whatever,
01:41:13.220 | and the best simulators do both
01:41:14.720 | or do some combination of the two.
01:41:17.060 | But anyways, because of these kinds of paradoxes,
01:41:19.380 | there's all kinds of paradoxes in contact.
01:41:22.680 | Mostly due to these rigid body assumptions.
01:41:25.400 | It becomes very hard to write the same kind of control laws
01:41:29.760 | that we've been able to be successful with
01:41:31.520 | for like fighter jets.
01:41:33.900 | We haven't been as successful writing those controllers
01:41:36.400 | for manipulation.
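A minimal numerical sketch of the stool-versus-table point above, with made-up leg positions and mass and nothing tied to any particular robotics library: with three point feet the balance equations determine the normal forces uniquely, while with four they do not, which is the indeterminacy that then infects the Coulomb friction forces.

```python
import numpy as np

def support_forces(leg_xy, com_xy, mass, g=9.81):
    """Static balance for a rigid table resting on point feet.

    Rows of A: total vertical force, moment about the y-axis, moment about
    the x-axis. Gravity acts at the center of mass (com_xy).
    Returns a particular solution for the normal forces and the dimension of
    the solution set (0 means the forces are uniquely determined).
    """
    leg_xy = np.asarray(leg_xy, dtype=float)
    A = np.vstack([np.ones(len(leg_xy)), leg_xy[:, 0], leg_xy[:, 1]])
    b = mass * g * np.array([1.0, com_xy[0], com_xy[1]])
    N, *_ = np.linalg.lstsq(A, b, rcond=None)
    return N, A.shape[1] - np.linalg.matrix_rank(A)

# Three legs: a unique set of normal forces balances gravity.
print(support_forces([(0, 0), (1, 0), (0, 1)], com_xy=(0.3, 0.3), mass=10.0))

# Four legs: rank deficiency, so infinitely many force distributions balance
# the table, and with Coulomb friction (|friction| <= mu * normal) the
# sideways behavior when you push it is not determined either.
print(support_forces([(0, 0), (1, 0), (0, 1), (1, 1)], com_xy=(0.5, 0.5), mass=10.0))
```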
01:41:37.920 | - And so you don't know what's going to happen
01:41:39.680 | at the point of contact, at the moment of contact.
01:41:41.960 | - There are situations absolutely
01:41:43.400 | where our laws don't tell us.
01:41:46.240 | So the standard approach, that's okay.
01:41:47.920 | I mean, instead of having a differential equation,
01:41:51.640 | you end up with a differential inclusion, it's called.
01:41:54.140 | It's a set valued equation.
01:41:56.560 | It says that I'm in this configuration,
01:41:58.800 | I have these forces applied on me,
01:42:00.520 | and there's a set of things that could happen.
01:42:03.960 | And you can-- - And those aren't continuous,
01:42:07.400 | I mean, what, so when you say non-smooth,
01:42:12.020 | they're not only not smooth, but this is discontinuous?
01:42:16.200 | - The non-smooth comes in when I make
01:42:18.480 | or break a new contact first,
01:42:20.400 | or when I transition from stick to slip.
01:42:22.880 | So you typically have static friction,
01:42:25.280 | and then you'll start sliding,
01:42:26.440 | and that'll be a discontinuous change
01:42:28.440 | in velocity, for instance,
01:42:31.360 | especially if you come to rest or--
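For the simplest case, a block of mass m sliding in one dimension under an applied force with normal force N, the set-valued law being described here can be sketched as a differential inclusion (a textbook form, not anything specific to Russ's tools):

$$ m\dot{v} \;\in\; F_{\text{applied}} - \mu N \,\mathrm{Sgn}(v), \qquad \mathrm{Sgn}(v) = \begin{cases} \{+1\} & v > 0 \\ [-1,\, 1] & v = 0 \\ \{-1\} & v < 0 \end{cases} $$

At v = 0 the right-hand side is a set rather than a single value, which is exactly where the stick-to-slip transition and the "set of things that could happen" come from.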
01:42:33.360 | - That's so fascinating.
01:42:34.480 | Okay, so what do you do?
01:42:37.720 | Sorry, I interrupted you. - That's fine.
01:42:39.760 | - What's the hope under so much uncertainty
01:42:44.160 | about what's going to happen?
01:42:45.440 | What are you supposed to do?
01:42:46.280 | - I mean, control has an answer for this.
01:42:48.520 | Robust control is one approach,
01:42:50.240 | but roughly, you can write controllers
01:42:52.640 | which try to still perform the right task
01:42:55.920 | despite all the things that could possibly happen.
01:42:58.120 | The world might want the table to go this way and this way,
01:43:00.000 | but if I write a controller
01:43:01.560 | that pushes a little bit more and pushes a little bit,
01:43:04.320 | I can certainly make the table go in the direction I want.
01:43:08.000 | It just puts a little bit more of a burden
01:43:10.000 | on the control system, right?
01:43:12.160 | And these discontinuities do change the control system
01:43:15.440 | because the way we write it down right now,
01:43:19.840 | every different contact configuration,
01:43:24.320 | including sticking or sliding
01:43:26.200 | or parts of my body that are in contact or not,
01:43:29.200 | looks like a different system.
01:43:30.840 | And I think of them,
01:43:31.880 | I reason about them separately or differently,
01:43:34.680 | and the combinatorics of that blow up, right?
01:43:38.000 | So I just don't have enough time to compute
01:43:41.440 | all the possible contact configurations of my humanoid.
01:43:45.040 | Interestingly, I mean, I'm a humanoid.
01:43:49.040 | I have lots of degrees of freedom, lots of joints.
01:43:51.600 | I've only been around for a handful of years.
01:43:54.960 | It's getting up there,
01:43:55.800 | but I haven't had time in my life
01:43:59.200 | to visit all of the states in my system,
01:44:02.060 | certainly all the contact configurations.
01:44:05.240 | So if step one is to consider
01:44:08.320 | every possible contact configuration that I'll ever be in,
01:44:12.160 | that's probably not a problem I need to solve, right?
01:44:16.040 | - Just as a small tangent,
01:44:18.360 | what's a contact configuration?
01:44:20.360 | Just so we can enumerate, what are we talking about?
01:44:26.240 | How many are there?
01:44:27.560 | - The simplest example maybe would be,
01:44:29.960 | imagine a robot with a flat foot.
01:44:32.680 | And we think about the phases of gait
01:44:35.400 | where the heel strikes and then the front toe strikes,
01:44:39.960 | and then you can heel up, toe off.
01:44:42.420 | Those are each different contact configurations.
01:44:46.680 | I only had two different contacts,
01:44:48.280 | but I ended up with four different contact configurations.
01:44:51.380 | Now, of course, my robot might actually have bumps on it
01:44:56.380 | or other things, so it could be much more subtle than that.
01:45:00.600 | But it's just even with one sort of box
01:45:03.120 | interacting with the ground already in the plane
01:45:06.200 | has that many, right?
01:45:07.040 | And if I was just even a 3D foot,
01:45:09.380 | then probably my left toe might touch
01:45:11.220 | just before my right toe and things get subtle.
01:45:14.320 | Now, if I'm a dexterous hand
01:45:16.440 | and I go to talk about just grabbing a water bottle,
01:45:20.700 | if I have to enumerate every possible order
01:45:26.680 | that my hand came into contact with the bottle,
01:45:30.960 | then I'm dead in the water.
01:45:32.920 | Anyway, we were able to get away with that
01:45:35.360 | in walking because we mostly touch the ground
01:45:38.440 | at no more than a small number of points, for instance,
01:45:40.920 | and we haven't been able to get dexterous hands that way.
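A tiny illustration of how that enumeration blows up; the contact names here are invented for the example:

```python
from itertools import product

def contact_modes(contacts, states=("off", "sticking", "sliding")):
    """Every assignment of a state to each potential contact is its own mode,
    so the count grows exponentially with the number of contact points."""
    return list(product(states, repeat=len(contacts)))

# A flat foot with heel and toe, each either on or off the ground:
print(len(contact_modes(["heel", "toe"], states=("off", "on"))))  # 4 gait phases

# A hand with ten candidate contact patches, each off / sticking / sliding:
print(len(contact_modes([f"patch_{i}" for i in range(10)])))      # 3**10 = 59049
```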
01:45:43.880 | - So you've mentioned that people think that contact
01:45:50.120 | is really hard and that that's the reason
01:45:55.760 | that robotic manipulation problem is really hard.
01:46:00.600 | Is there any flaws in that thinking?
01:46:06.600 | - So I think simulating contact is one aspect.
01:46:10.560 | I know people often say that we don't,
01:46:12.880 | that one of the reasons that we have a limit in robotics
01:46:16.360 | is because we do not simulate contact accurately
01:46:19.080 | in our simulators.
01:46:20.880 | And I think that is,
01:46:22.120 | the extent to which that's true is partly
01:46:26.120 | because our simulators,
01:46:27.920 | we haven't got mature enough simulators.
01:46:29.960 | There are some things that are still hard, difficult,
01:46:34.160 | that we should change.
01:46:35.980 | But we actually, we know what the governing equations are.
01:46:40.980 | They have some foibles, like this indeterminacy,
01:46:44.700 | but we should be able to simulate them accurately.
01:46:47.200 | We have incredible open source community in robotics,
01:46:51.440 | but it actually just takes a professional engineering team
01:46:54.340 | a lot of work to write a very good simulator like that.
01:46:57.740 | - Which you have, I believe, you've written Drake.
01:47:02.160 | - There's a team of people.
01:47:04.500 | I certainly spent a lot of hours on it myself.
01:47:07.300 | - Well, what is Drake?
01:47:08.620 | What does it take to create a simulation environment
01:47:13.620 | for the kind of difficult control problems
01:47:18.580 | we're talking about?
01:47:19.580 | - Right, so Drake is the simulator that I've been working on.
01:47:24.640 | There are other good simulators out there.
01:47:26.780 | I don't like to think of Drake as just a simulator,
01:47:29.700 | 'cause we write our controllers in Drake,
01:47:31.780 | we write our perception systems a little bit in Drake,
01:47:34.340 | but we write all of our low-level control
01:47:37.060 | and even planning and optimization.
01:47:40.820 | - So it has optimization capabilities as well?
01:47:42.460 | - Absolutely, yeah.
01:47:43.620 | I mean, Drake is three things, roughly.
01:47:46.000 | It's an optimization library, which is,
01:47:48.220 | sits on, it provides a layer of abstraction
01:47:52.340 | in C++ and Python for commercial solvers.
01:47:55.900 | You can write linear programs, quadratic programs,
01:48:00.740 | semi-definite programs, sums of squares programs,
01:48:03.340 | the ones we've used, mixed integer programs,
01:48:05.660 | and it will do the work to curate those
01:48:07.940 | and send them to whatever the right solver is,
01:48:09.780 | for instance, and it provides a level of abstraction.
01:48:12.500 | The second thing is a system modeling language,
01:48:18.340 | a bit like LabVIEW or Simulink,
01:48:20.900 | where you can make block diagrams out of complex systems,
01:48:24.820 | or it's like ROS in that sense,
01:48:26.660 | where you might have lots of ROS nodes
01:48:29.020 | that are each doing some part of your system,
01:48:31.980 | but to contrast it with ROS,
01:48:33.820 | we try to write, if you write a Drake system,
01:48:37.860 | then you have to, it asks you to describe
01:48:41.300 | a little bit more about the system.
01:48:43.020 | If you have any state, for instance, in the system,
01:48:46.260 | any variables that are gonna persist,
01:48:47.700 | you have to declare them.
01:48:49.140 | Parameters can be declared and the like,
01:48:51.660 | but the advantage of doing that is that you can,
01:48:54.180 | if you like, run things all on one process,
01:48:57.480 | but you can also do control design against it.
01:49:00.220 | You can do, I mean, simple things like rewinding
01:49:03.100 | and playing back your simulations, for instance.
01:49:07.780 | You know, these things, you get some rewards
01:49:09.580 | for spending a little bit more upfront cost
01:49:11.380 | in describing each system.
01:49:12.700 | And I was inspired to do that
01:49:16.900 | because I think the complexity of Atlas, for instance,
01:49:20.340 | is just so great.
01:49:22.620 | And I think, although, I mean,
01:49:24.140 | ROS has been incredible, absolute,
01:49:26.940 | huge fan of what it's done for the robotics community,
01:49:30.700 | but the ability to rapidly put different pieces together
01:49:35.480 | and have a functioning thing is very good.
01:49:37.960 | But I do think that it's hard to think clearly
01:49:42.860 | about a bag of disparate parts,
01:49:44.980 | Mr. Potato Head kind of software stack.
01:49:48.180 | And if you can, you know, ask a little bit more
01:49:53.060 | out of each of those parts,
01:49:54.180 | then you can understand the way they work better.
01:49:56.100 | You can try to verify them and the like,
01:49:59.260 | or you can do learning against them.
01:50:02.660 | And then one of those systems, the last thing,
01:50:04.460 | I said the first two things that Drake is,
01:50:06.460 | but the last thing is that there is a set
01:50:09.620 | of multi-body equations, rigid body equations,
01:50:12.520 | that is trying to provide a system that simulates physics.
01:50:16.740 | And that, we also have renderers and other things,
01:50:20.020 | but I think the physics component of Drake is special
01:50:23.260 | in the sense that we have done excessive amount
01:50:27.700 | of engineering to make sure
01:50:29.820 | that we've written the equations correctly.
01:50:31.540 | Every possible tumbling satellite or spinning top
01:50:34.100 | or anything that we could possibly write as a test
01:50:36.100 | is tested.
01:50:36.940 | We are making some, you know, I think,
01:50:40.620 | fundamental improvements on the way you simulate contact.
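To make the optimization-layer part of that description concrete, here is a small example of the kind of program that abstraction lets you write and hand to a solver; the import path and exact overloads are from memory of pydrake and may differ slightly across Drake versions:

```python
from pydrake.solvers import MathematicalProgram, Solve

# A toy quadratic program: Drake picks an appropriate solver for us.
prog = MathematicalProgram()
x = prog.NewContinuousVariables(2, "x")
prog.AddQuadraticCost((x[0] - 1.0) * (x[0] - 1.0) + (x[1] + 2.0) * (x[1] + 2.0))
prog.AddLinearConstraint(x[0] + x[1] == 1.0)

result = Solve(prog)
print(result.is_success(), result.GetSolution(x))
```

The same MathematicalProgram interface is the layer of abstraction through which the linear, quadratic, semidefinite, sums-of-squares, and mixed-integer programs mentioned above get routed to whichever solver is appropriate.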
01:50:44.220 | - Just what does it take to simulate contact?
01:50:47.580 | I mean, it just seems,
01:50:49.080 | I mean, there's something just beautiful
01:50:52.380 | with the way you were explaining contact
01:50:55.220 | and you were tapping your fingers on the table
01:50:58.340 | while you're doing it.
01:50:59.500 | Just--
01:51:00.740 | - Easily, right?
01:51:01.580 | - Easily, just not even,
01:51:04.820 | it was helping you think, I guess.
01:51:06.660 | You have this awesome demo of loading
01:51:14.020 | or unloading a dishwasher.
01:51:15.680 | Just picking up a plate,
01:51:21.460 | grasping it like for the first time.
01:51:24.020 | That just seems like so difficult.
01:51:28.180 | What, how do you simulate any of that?
01:51:32.400 | - So it was really interesting that what happened was that
01:51:36.540 | we started getting more professional
01:51:39.220 | about our software development
01:51:40.500 | during the DARPA Robotics Challenge.
01:51:42.300 | I learned the value of software engineering
01:51:46.060 | and how to bridle complexity.
01:51:48.640 | I guess that's what I want to somehow fight against
01:51:52.780 | and bring some of the clear thinking of controls
01:51:54.760 | into these complex systems we're building for robots.
01:51:58.200 | Shortly after the DARPA Robotics Challenge,
01:52:02.940 | Toyota opened a research institute,
01:52:04.600 | TRI, Toyota Research Institute.
01:52:07.260 | They put one of their, there's three locations.
01:52:10.880 | One of them is just down the street from MIT
01:52:13.040 | and I helped ramp that up
01:52:17.520 | as a part of my, the end of my sabbatical, I guess.
01:52:20.860 | So TRI has given me, the TRI Robotics effort
01:52:28.480 | has made this investment in simulation in Drake.
01:52:32.640 | And Michael Sherman leads a team there
01:52:34.480 | of just absolutely top-notch dynamics experts
01:52:37.800 | that are trying to write those simulators
01:52:40.120 | that can pick up the dishes.
01:52:41.960 | And there's also a team working on manipulation there
01:52:44.780 | that is taking problems like loading the dishwasher
01:52:48.980 | and we're using that to study these really hard corner cases
01:52:53.180 | kind of problems in manipulation.
01:52:55.280 | So for me, this, you know, simulating the dishes,
01:52:59.720 | we could actually write a controller.
01:53:01.580 | If we just cared about picking up dishes in the sink once,
01:53:05.040 | we could write a controller
01:53:05.880 | without any simulation whatsoever
01:53:07.720 | and we could call it done.
01:53:10.040 | But we want to understand like,
01:53:12.140 | what is the path you take to actually get to a robot
01:53:17.040 | that could perform that for any dish in anybody's kitchen
01:53:22.040 | with enough confidence
01:53:23.280 | that it could be a commercial product, right?
01:53:26.520 | And it has deep learning perception in the loop.
01:53:29.360 | It has complex dynamics in the loop.
01:53:31.060 | It has controller, it has a planner.
01:53:33.260 | And how do you take all of that complexity
01:53:36.340 | and put it through this engineering discipline
01:53:39.020 | and verification and validation process
01:53:42.420 | to actually get enough confidence to deploy?
01:53:46.440 | I mean, the DARPA challenge made me realize
01:53:49.840 | that that's not something you throw over the fence
01:53:52.000 | and hope that somebody will harden it for you.
01:53:54.060 | That there are really fundamental challenges
01:53:57.380 | in closing that last gap.
01:53:59.820 | - They're doing the validation and the testing.
01:54:02.320 | - I think it might even change the way we have to think
01:54:05.980 | about the way we write systems.
01:54:08.980 | What happens if you have the robot running lots of tests
01:54:13.980 | and it screws up, it breaks a dish, right?
01:54:19.040 | How do you capture that?
01:54:19.960 | I mean, you can't run the same simulation
01:54:23.600 | or the same experiment twice on a real robot.
01:54:27.020 | Do we have to be able to bring that one-off failure
01:54:31.520 | back into simulation in order to change our controllers,
01:54:34.200 | study it, make sure it won't happen again?
01:54:37.240 | Do we, is it enough to just try to add that
01:54:40.600 | to our distribution and understand that on average,
01:54:43.800 | we're gonna cover that situation again?
01:54:45.920 | There's like really subtle questions at the corner cases
01:54:49.960 | that I think we don't yet have satisfying answers for.
01:54:53.520 | - How do you find the corner cases?
01:54:55.120 | That's one kind of, is there,
01:54:57.160 | do you think that's possible to create a systematized way
01:55:01.280 | of discovering corner cases efficiently?
01:55:04.720 | - Yes.
01:55:05.560 | - And whatever the problem is?
01:55:07.600 | - Yes, I mean, I think we have to get better at that.
01:55:10.760 | I mean, control theory has, for decades,
01:55:14.920 | talked about active experiment design.
01:55:16.900 | - What's that?
01:55:18.680 | - So people call it curiosity these days.
01:55:22.080 | It's roughly this idea of trying to,
01:55:24.100 | exploration or exploitation,
01:55:25.520 | but in the active experiment design is even,
01:55:28.120 | is more specific.
01:55:29.640 | You could try to understand the uncertainty in your system,
01:55:34.120 | design the experiment that will provide
01:55:36.480 | the maximum information to reduce that uncertainty.
01:55:40.120 | If there's a parameter you wanna learn about,
01:55:42.360 | what is the optimal trajectory I could execute
01:55:45.440 | to learn about that parameter, for instance?
01:55:47.640 | Scaling that up to something that has a deep network
01:55:51.720 | in the loop and a planning in the loop is tough.
01:55:55.660 | We've done some work on,
01:55:57.000 | you know, with Matt O'Kelly and Aman Sinha,
01:56:00.280 | we've worked on some falsification algorithms
01:56:03.600 | that are trying to do rare event simulation
01:56:05.600 | that try to just hammer on your simulator.
01:56:08.120 | And if your simulator is good enough,
01:56:10.000 | you can spend a lot of time,
01:56:13.420 | you can write good algorithms
01:56:15.840 | that try to spend most of their time in the corner cases.
01:56:19.920 | So you basically imagine you're building an autonomous car
01:56:24.920 | and you wanna put it in, I don't know,
01:56:27.320 | downtown New Delhi all the time, right?
01:56:29.360 | And accelerated testing.
01:56:30.740 | If you can write sampling strategies,
01:56:33.340 | which figure out where your controller's performing badly
01:56:36.260 | in simulation and start generating
01:56:38.160 | lots of examples around that.
01:56:39.720 | You know, it's just the space of possible places
01:56:44.060 | where that can be, where things can go wrong is very big.
01:56:48.040 | So it's hard to write those algorithms.
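The flavor of those sampling strategies can be sketched with a toy cross-entropy-style search that keeps refitting its sampling distribution around whatever looked closest to failure; the scoring function below is a made-up stand-in for running the full simulator, and this is not the specific falsification algorithm from that work:

```python
import numpy as np

def closeness_to_failure(disturbance):
    # Hypothetical scoring function: in a real falsifier this would run the
    # closed-loop simulation and measure how near the rollout came to failing
    # (lower score = closer to failure).
    return 3.0 - disturbance

def falsify(iters=20, samples=200, elite_frac=0.1, seed=0):
    """Toy rare-event search: concentrate samples where failures seem likely."""
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 1.0
    for _ in range(iters):
        xs = rng.normal(mu, sigma, size=samples)
        scores = np.array([closeness_to_failure(x) for x in xs])
        elite = xs[np.argsort(scores)[: max(1, int(elite_frac * samples))]]
        mu, sigma = elite.mean(), elite.std() + 1e-6
    return mu, sigma

# The sampling distribution drifts toward the large disturbances that score
# closest to failure, so most simulation time is spent near the corner cases.
print(falsify())
```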
01:56:49.800 | - Yeah, rare event simulation
01:56:51.720 | is just like a really compelling notion.
01:56:53.720 | If it's possible.
01:56:55.800 | - We joked and we call it the black swan generator.
01:56:58.800 | - It's a black swan.
01:57:00.080 | - 'Cause you don't just want the rare events,
01:57:01.680 | you want the ones that are highly impactful.
01:57:04.000 | - I mean, that's the most,
01:57:05.660 | those are the most sort of profound questions
01:57:08.780 | we ask of our world.
01:57:10.120 | Like, what's the worst that can happen?
01:57:15.120 | But what we're really asking
01:57:18.080 | isn't some kind of like computer science,
01:57:20.800 | worst case analysis.
01:57:22.580 | We're asking like, what are the millions of ways
01:57:25.600 | this can go wrong?
01:57:27.360 | And that's like our curiosity.
01:57:31.200 | We humans, I think are pretty bad at,
01:57:34.880 | we just like run into it.
01:57:36.960 | And I think there's a distributed sense
01:57:38.520 | because there's now like 7.5 billion of us.
01:57:41.720 | And so there's a lot of them,
01:57:42.840 | and then a lot of them write blog posts
01:57:45.000 | about the stupid thing they've done.
01:57:46.480 | So we learn in a distributed way.
01:57:48.840 | There's some--
01:57:50.800 | - I think that's gonna be important for robots too.
01:57:52.720 | - Yeah.
01:57:53.560 | - I mean, that's another massive theme
01:57:55.880 | at Toyota Research for robotics
01:57:58.760 | is this fleet learning concept.
01:58:00.560 | Is the idea that I as a human,
01:58:04.760 | I don't have enough time to visit all of my states.
01:58:07.680 | Right?
01:58:08.520 | There's just a, it's very hard for one robot
01:58:10.160 | to experience all the things.
01:58:11.600 | But that's not actually the problem we have to solve.
01:58:14.680 | Right?
01:58:15.520 | We're gonna have fleets of robots
01:58:17.720 | that can have very similar appendages.
01:58:20.680 | And at some point, maybe collectively,
01:58:24.160 | they have enough data that their computational processes
01:58:29.320 | should be set up differently than ours.
01:58:30.680 | Right?
01:58:31.880 | - It's this vision of just,
01:58:34.200 | I mean, all these dishwasher unloading robots.
01:58:38.880 | I mean, that robot dropping a plate
01:58:42.600 | and a human looking at the robot, probably pissed off.
01:58:46.880 | - Yeah.
01:58:47.840 | - But that's a special moment to record.
01:58:51.200 | I think one thing in terms of fleet learning,
01:58:54.520 | and I've seen that because I've talked to a lot of folks,
01:58:57.720 | just like Tesla users or Tesla drivers,
01:59:01.200 | they're another company that's using
01:59:03.600 | this kind of fleet learning idea.
01:59:05.240 | One hopeful thing I have about humans
01:59:08.160 | is they really enjoy when a system improves, learns.
01:59:13.160 | So they enjoy fleet learning.
01:59:14.640 | And the reason it's hopeful for me
01:59:17.200 | is they're willing to put up with something
01:59:20.240 | that's kind of dumb right now.
01:59:22.600 | And they're like, if it's improving,
01:59:25.480 | they almost enjoy being part of the teaching.
01:59:29.280 | It almost like if you have kids,
01:59:30.880 | like you're teaching them something.
01:59:32.200 | - Right.
01:59:33.480 | - I think that's a beautiful thing
01:59:35.080 | 'cause that gives me hope
01:59:36.200 | that we can put dumb robots out there.
01:59:38.640 | I mean, the problem on the Tesla side with cars,
01:59:43.280 | cars can kill you.
01:59:44.440 | That makes the problem so much harder.
01:59:47.680 | Dishwasher unloading is a little safe.
01:59:50.520 | That's why home robotics is really exciting.
01:59:54.160 | And just to clarify, I mean,
01:59:55.680 | for people who might not know,
01:59:57.560 | I mean, TRI, Toyota Research Institute,
02:00:00.080 | so they're pretty well known
02:00:03.960 | for like autonomous vehicle research,
02:00:06.120 | but they're also interested in home robotics.
02:00:10.240 | - Yep, there's a big group working on,
02:00:12.760 | multiple groups working on home robotics.
02:00:14.320 | It's a major part of the portfolio.
02:00:17.440 | There's also a couple other projects
02:00:19.080 | and advanced materials discovery,
02:00:21.280 | using AI and machine learning to discover new materials
02:00:24.400 | for car batteries and the like, for instance.
02:00:27.920 | Yeah, and that's actually been an incredibly successful team.
02:00:31.520 | There's new projects starting up too.
02:00:33.640 | - Do you see a future of where like robots are in our home
02:00:38.640 | and like robots that have like actuators
02:00:43.960 | that look like arms in our home
02:00:46.600 | or like more like humanoid type robots?
02:00:49.320 | Or is this, are we gonna do the same thing
02:00:51.760 | that you just mentioned that,
02:00:53.840 | the dishwasher is no longer a robot.
02:00:55.960 | We're going to just not even see them as robots.
02:00:58.680 | But I mean, what's your vision of the home of the future?
02:01:02.480 | 10, 20 years from now, 50 years if you get crazy.
02:01:06.180 | - Yeah, I think we already have Roombas cruising around.
02:01:10.680 | We have, you know, Alexa's or Google Homes
02:01:13.680 | on our kitchen counter.
02:01:16.200 | It's only a matter of time till they sprout arms
02:01:18.040 | and start doing something useful like that.
02:01:20.760 | So I do think it's coming.
02:01:23.840 | I think lots of people have lots of motivations
02:01:27.680 | for doing it.
02:01:29.400 | It's been super interesting actually learning
02:01:31.520 | about Toyota's vision for it,
02:01:33.880 | which is about helping people age in place.
02:01:36.380 | 'Cause I think that's not necessarily the first entry,
02:01:41.600 | the most lucrative entry point,
02:01:44.360 | but it's the problem maybe that
02:01:47.680 | we really need to solve no matter what.
02:01:50.040 | And so I think there's a real opportunity.
02:01:53.920 | It's a delicate problem.
02:01:55.760 | How do you work with people, help people,
02:01:59.340 | keep them active, engaged, you know,
02:02:02.340 | but improve the quality of life
02:02:05.120 | and help them age in place, for instance.
02:02:08.360 | - It's interesting 'cause older folks are also,
02:02:12.480 | I mean, there's a contrast there because
02:02:15.880 | they're not always the folks
02:02:18.120 | who are the most comfortable with technology, for example.
02:02:20.920 | So there's a division that's interesting there
02:02:24.600 | that you can do so much good with a robot
02:02:27.120 | for older folks,
02:02:32.060 | but there's a gap to fill of understanding.
02:02:36.400 | I mean, it's actually kind of beautiful.
02:02:38.400 | Robot is learning about the human
02:02:41.160 | and the human is kind of learning about this new robot thing.
02:02:44.840 | And it's also with,
02:02:47.240 | at least with, like when I talk to my parents about robots,
02:02:51.440 | there's a little bit of a blank slate there too.
02:02:54.520 | Like you can, I mean, they don't know anything about robotics
02:02:59.360 | so it's completely like wide open.
02:03:02.640 | They don't have, they haven't,
02:03:03.880 | my parents haven't seen Black Mirror.
02:03:05.720 | So like it's a blank slate.
02:03:09.480 | Here's a cool thing, like what can it do for me?
02:03:11.960 | - Yeah.
02:03:12.800 | - So it's an exciting space.
02:03:14.360 | - I think it's a really important space.
02:03:16.360 | I do feel like, you know, a few years ago,
02:03:20.000 | drones were successful enough in academia.
02:03:22.720 | They kind of broke out and started in industry
02:03:26.000 | and autonomous cars have been happening.
02:03:29.120 | It does feel like manipulation in logistics, of course,
02:03:32.880 | first, but in the home shortly after,
02:03:35.680 | it seems like one of the next big things
02:03:37.160 | that's gonna really pop.
02:03:40.040 | - So I don't think we talked about it,
02:03:42.080 | but what's soft robotics?
02:03:44.560 | So we talked about like rigid bodies.
02:03:48.560 | If we can just linger on this whole touch thing.
02:03:52.120 | - Yeah, so what's soft robotics?
02:03:54.680 | So I told you that I really dislike the fact
02:03:59.680 | that robots are afraid of touching the world
02:04:03.180 | all over their body.
02:04:04.880 | So there's a couple reasons for that.
02:04:06.920 | If you look carefully at all the places
02:04:08.760 | that robots actually do touch the world,
02:04:11.240 | they're almost always soft.
02:04:12.560 | They have some sort of pad on their fingers
02:04:14.680 | or a rubber sole on their foot.
02:04:16.900 | But if you look up and down the arm,
02:04:19.280 | we're just pure aluminum or something.
02:04:21.680 | So that makes it hard actually.
02:04:26.640 | In fact, hitting the table with your rigid arm
02:04:30.440 | or nearly rigid arm has some of the problems
02:04:34.560 | that we talked about in terms of simulation.
02:04:37.220 | I think it fundamentally changes the mechanics of contact
02:04:39.920 | when you're soft, right?
02:04:41.640 | You turn point contacts into patch contacts,
02:04:44.980 | which can have torsional friction.
02:04:46.980 | You can have distributed load.
02:04:49.240 | If I wanna pick up an egg, right?
02:04:52.420 | If I pick it up with two points,
02:04:54.240 | then in order to put enough force
02:04:56.200 | to sustain the weight of the egg,
02:04:57.280 | I might have to put so much force that I break the egg.
02:04:59.920 | If I envelop it with contact all around,
02:05:04.420 | then I can distribute my force across the shell of the egg
02:05:07.520 | and have a better chance of not breaking it.
02:05:09.760 | So soft robotics is for me a lot about
02:05:12.440 | changing the mechanics of contact.
02:05:15.520 | - Does it make the problem a lot harder?
02:05:17.480 | (laughing)
02:05:21.360 | - Quite the opposite.
02:05:22.400 | It changes the computational problem.
02:05:26.760 | I think because of the,
02:05:28.560 | I think our world and our mathematics
02:05:31.880 | has biased us towards rigid.
02:05:34.520 | But it really should make things better in some ways, right?
02:05:37.520 | It's a, I think the future is unwritten there.
02:05:43.080 | But the other thing--
02:05:45.480 | - I think ultimately, sorry to interrupt,
02:05:46.840 | but I think ultimately it will make things simpler
02:05:49.560 | if we embrace the softness of the world.
02:05:51.560 | - It makes things smoother, right?
02:05:55.720 | So the result of small actions is less discontinuous,
02:06:02.320 | but it also means potentially less, you know,
02:06:05.080 | instantaneously bad, for instance.
02:06:08.080 | I won't necessarily contact something and send it flying off.
02:06:11.520 | The other aspect of it
02:06:14.240 | that just happens to dovetail really well
02:06:16.000 | is that soft robotics tends to be a place
02:06:18.400 | where we can embed a lot of sensors too.
02:06:20.240 | So if you change your hardware and make it more soft,
02:06:24.720 | then you can potentially have a tactile sensor,
02:06:26.680 | which is measuring the deformation.
02:06:30.680 | So there's a team at TRI that's working on soft hands,
02:06:34.880 | and you get so much more information.
02:06:38.160 | If you can put a camera behind the skin roughly
02:06:41.720 | and get fantastic tactile information,
02:06:45.760 | which is, it's super important.
02:06:49.120 | Like in manipulation,
02:06:50.080 | one of the things that really is frustrating
02:06:52.840 | is if you work super hard on your head-mounted,
02:06:55.080 | on your perception system for your head-mounted cameras,
02:06:57.720 | and then you've identified an object,
02:06:59.480 | you reach down to touch it,
02:07:00.360 | and the last thing that happens,
02:07:01.920 | right before the most important time,
02:07:04.000 | you stick your hand and you're occluding
02:07:05.760 | your head-mounted sensors, right?
02:07:07.400 | So in all the part that really matters,
02:07:10.200 | all of your off-board sensors are occluded.
02:07:13.560 | And really, if you don't have tactile information,
02:07:15.920 | then you're blind in an important way.
02:07:19.300 | So it happens that soft robotics and tactile sensing
02:07:23.160 | tend to go hand in hand.
02:07:25.080 | - I think we've kind of talked about it,
02:07:26.820 | but you taught a course on under-actuated robotics.
02:07:31.060 | I believe that was the name of it, actually.
02:07:32.780 | - That's right.
02:07:33.620 | - Can you talk about it in that context?
02:07:37.340 | What is under-actuated robotics?
02:07:40.380 | - Right, so under-actuated robotics is my graduate course.
02:07:43.740 | It's online mostly now,
02:07:46.640 | in the sense that the lectures--
02:07:47.480 | - Several versions of it, I think.
02:07:49.060 | - Right, the YouTube--
02:07:49.900 | - It's really great, I recommend it highly.
02:07:52.060 | - Look on YouTube for the 2020 versions until March,
02:07:56.180 | and then you have to go back to 2019, thanks to COVID.
02:07:59.180 | No, I've poured my heart into that class.
02:08:03.800 | And lecture one is basically explaining
02:08:06.900 | what the word under-actuated means.
02:08:08.220 | So people are very kind to show up,
02:08:10.140 | and then maybe have to learn
02:08:12.500 | what the title of the course means
02:08:13.720 | over the course of the first lecture.
02:08:15.680 | - That first lecture's really good.
02:08:17.780 | You should watch it.
02:08:19.020 | - Thanks.
02:08:20.140 | It's a strange name,
02:08:21.780 | but I thought it captured the essence
02:08:26.160 | of what control was good at doing,
02:08:28.220 | and what control was bad at doing.
02:08:30.260 | So what do I mean by under-actuated?
02:08:32.220 | So a mechanical system
02:08:34.980 | has many degrees of freedom, for instance.
02:08:39.780 | I think of a joint as a degree of freedom.
02:08:42.260 | And it has some number of actuators, motors.
02:08:46.500 | So if you have a robot that's bolted to the table
02:08:49.540 | that has five degrees of freedom, and five motors,
02:08:54.420 | then you have a fully actuated robot.
02:08:56.260 | If you take away one of those motors,
02:09:00.860 | then you have an under-actuated robot.
02:09:03.500 | Now why on earth?
02:09:05.260 | I have a good friend who likes to tease me.
02:09:07.780 | He said, "Russ, if you had more research funding,
02:09:09.780 | "would you work on fully actuated robots?"
02:09:12.100 | - Yeah. (laughs)
02:09:14.180 | - The answer's no.
02:09:15.500 | The world gives us under-actuated robots,
02:09:17.740 | whether we like it or not.
02:09:18.740 | I'm a human.
02:09:20.100 | I'm an under-actuated robot,
02:09:21.740 | even though I have more muscles
02:09:23.740 | than I have degrees of freedom,
02:09:25.420 | because I have, in some places,
02:09:27.980 | multiple muscles attached to the same joint.
02:09:30.180 | But still, there's a really important degree of freedom
02:09:34.140 | that I have, which is the location
02:09:35.620 | of my center of mass in space, for instance.
02:09:38.780 | All right, I can jump into the air,
02:09:42.740 | and there's no motor that connects my center of mass
02:09:45.460 | to the ground in that case.
02:09:47.460 | So I have to think about the implications
02:09:49.700 | of not having control over everything.
02:09:51.980 | The passive dynamic walkers are the extreme view of that,
02:09:56.820 | where you've taken away all the motors,
02:09:58.140 | and you have to let physics do the work.
02:10:00.260 | But it shows up in all of the walking robots,
02:10:02.500 | where you have to use some of the actuators
02:10:04.820 | to push and pull even the degrees of freedom
02:10:07.260 | that you don't have an actuator on.
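In the notation Russ uses in the course (a sketch, not a full statement), a mechanical system can be written in control-affine form

$$ \ddot{q} = f_1(q, \dot{q}, t) + f_2(q, \dot{q}, t)\, u $$

and it is fully actuated at a state if f_2 has full row rank, so any instantaneous joint acceleration can be commanded; it is underactuated if the rank of f_2 is less than the number of degrees of freedom, as with the five-joint arm missing a motor or the unactuated floating base of a walking or jumping robot.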
02:10:09.180 | - That's referring to walking if you're falling forward.
02:10:13.420 | Is there a way to walk that's fully actuated?
02:10:16.540 | - So it's a subtle point.
02:10:18.620 | When you're in contact, and you have your feet
02:10:22.340 | on the ground, there are still limits to what you can do.
02:10:26.780 | Unless I have suction cups on my feet,
02:10:29.380 | I cannot accelerate my center of mass
02:10:31.380 | towards the ground faster than gravity,
02:10:34.020 | 'cause I can't get a force pushing me down.
02:10:36.180 | But I can still do most of the things that I want to.
02:10:39.660 | So you can get away with basically thinking
02:10:42.260 | of the system as fully actuated,
02:10:43.700 | unless you suddenly needed to accelerate down super fast.
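The suction-cup point can be written down directly, as a rough sketch: with center of mass c, gravity g, and ground contact forces f_i whose vertical components cannot be negative (the ground pushes but does not pull),

$$ m\ddot{c} = -mg\,\hat{z} + \sum_i f_i, \qquad f_{i,z} \ge 0 \;\Rightarrow\; \ddot{c}_z \ge -g, $$

so no choice of control inputs can accelerate the center of mass toward the ground faster than gravity unless something like suction makes a downward contact force possible.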
02:10:46.540 | But as soon as I take a step,
02:10:49.260 | I get into the more nuanced territory.
02:10:52.940 | And to get to really dynamic robots,
02:10:55.740 | or airplanes, or other things,
02:10:59.220 | I think you have to embrace the under-actuated dynamics.
02:11:02.620 | Manipulation, people think,
02:11:04.460 | is manipulation under-actuated?
02:11:06.940 | Even if my arm is fully actuated, I have a motor,
02:11:10.580 | if my goal is to control the position
02:11:13.500 | and orientation of this cup,
02:11:16.020 | then I don't have an actuator for that directly.
02:11:19.220 | So I have to use my actuators over here
02:11:21.060 | to control this thing.
02:11:22.260 | Now it gets even worse,
02:11:24.340 | like what if I have to button my shirt?
02:11:26.900 | What are the degrees of freedom of my shirt?
02:11:31.340 | I suddenly, that's a hard question to think about.
02:11:34.500 | It kind of makes me queasy
02:11:36.020 | thinking about my state-space control ideas.
02:11:39.860 | But actually those are the problems
02:11:41.780 | that make me so excited about manipulation right now,
02:11:44.500 | is that it breaks some of the,
02:11:46.980 | it breaks a lot of the foundational control stuff
02:11:49.980 | that I've been thinking about.
02:11:51.500 | - What are some interesting insights you could say
02:11:54.940 | about trying to solve an under-actuated,
02:11:57.980 | a control in an under-actuated system?
02:12:02.300 | - So I think the philosophy there
02:12:04.740 | is let physics do more of the work.
02:12:08.380 | The technical approach has been optimization.
02:12:12.140 | So you typically formulate your decision-making for control
02:12:15.020 | as an optimization problem,
02:12:17.060 | and you use the language of optimal control
02:12:19.340 | and sometimes often numerical optimal control
02:12:22.700 | in order to make those decisions
02:12:24.660 | and balance these complicated equations of,
02:12:29.020 | and in order to control.
02:12:30.820 | You don't have to use optimal control
02:12:33.060 | to do under-actuated systems,
02:12:34.820 | but that has been the technical approach
02:12:36.300 | that has borne the most fruit in our,
02:12:39.060 | at least in our line of work.
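The generic shape of that optimization-based formulation, written loosely, is

$$ \min_{u(\cdot)} \; \ell_f(x(T)) + \int_0^T \ell\big(x(t), u(t)\big)\, dt \quad \text{subject to } \dot{x} = f(x, u),\; x(0) = x_0, $$

plus whatever state, input, and contact constraints the task imposes; numerical optimal control (trajectory optimization) transcribes this into a finite-dimensional program of the kind an optimization layer like Drake's can hand off to a solver.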
02:12:40.860 | - And there's some, so in under-actuated systems,
02:12:44.020 | when you say let physics do some of the work,
02:12:46.780 | so there's a kind of feedback loop
02:12:50.300 | that observes the state that the physics brought you to.
02:12:54.500 | So like you've, there's a perception there,
02:12:57.740 | there's a feedback somehow.
02:13:00.260 | Do you ever loop in like complicated perception systems
02:13:05.380 | into this whole picture?
02:13:06.860 | - Right, right around the time of the DARPA challenge,
02:13:09.580 | we had a complicated perception system
02:13:11.300 | in the DARPA challenge.
02:13:12.660 | We also started to embrace perception
02:13:15.540 | for our flying vehicles at the time.
02:13:17.300 | We had a really good project
02:13:20.060 | on trying to make airplanes fly
02:13:21.780 | at high speeds through forests.
02:13:23.340 | Sertac Karaman was on that project,
02:13:27.420 | and we had, it was a really fun team to work on.
02:13:30.700 | He's carried it farther, much farther forward since then.
02:13:34.180 | - And that's using cameras for perception?
02:13:35.940 | - So that was using cameras.
02:13:37.540 | That was, at the time, we felt like LIDAR was too heavy
02:13:41.740 | and too power hungry to be carried on a light UAV,
02:13:46.740 | and we were using cameras.
02:13:49.180 | And that was a big part of it,
02:13:50.340 | was just how do you do even stereo matching
02:13:53.060 | at a fast enough rate with a small camera,
02:13:56.140 | a small onboard compute.
02:13:57.580 | Since then, we have now,
02:14:00.660 | so the deep learning revolution unquestionably changed
02:14:04.100 | what we can do with perception for robotics and control.
02:14:08.980 | So in manipulation, we can address,
02:14:10.940 | we can use perception in, I think, a much deeper way.
02:14:14.620 | And we get into not only,
02:14:17.300 | I think the first use of it naturally
02:14:19.780 | would be to ask your deep learning system
02:14:22.900 | to look at the cameras and produce the state,
02:14:25.940 | which is like the pose of my thing, for instance.
02:14:28.860 | But I think we've quickly found out
02:14:30.420 | that that's not always the right thing to do.
02:14:34.420 | - Why is that?
02:14:35.620 | - Because what's the state of my shirt?
02:14:38.380 | Imagine--
02:14:39.220 | - Is it very noisy, you mean?
02:14:40.860 | - If the first step of me trying to button my shirt
02:14:46.140 | is estimate the full state of my shirt,
02:14:48.580 | including what's happening in the back,
02:14:50.260 | you know, whatever, whatever,
02:14:51.780 | that's just not the right specification.
02:14:55.780 | There are aspects of the state
02:14:57.500 | that are very important to the task.
02:15:00.260 | There are many that are unobservable
02:15:03.220 | and not important to the task.
02:15:05.860 | So you really need,
02:15:06.940 | it begs new questions about state representation.
02:15:10.300 | Another example that we've been playing with in lab
02:15:13.140 | has been just the idea of chopping onions, okay?
02:15:17.660 | Or carrots, turns out to be better.
02:15:19.540 | So the onions stink up the lab.
02:15:22.480 | And they're hard to see in a camera.
02:15:26.180 | But--
02:15:27.020 | (Lex laughing)
02:15:27.860 | - The details matter, yeah.
02:15:28.700 | - Details matter, you know?
02:15:32.380 | - Chop and carrot.
02:15:33.220 | - If I'm moving around a particular object, right?
02:15:35.180 | Then I think about,
02:15:36.020 | oh, it's got a position or an orientation in space,
02:15:37.940 | that's the description I want.
02:15:39.720 | Now, when I'm chopping an onion, okay,
02:15:42.660 | the first chop comes down,
02:15:44.220 | I have now 100 pieces of onion.
02:15:46.740 | Does my control system really need to understand
02:15:50.260 | the position and orientation
02:15:51.780 | and even the shape of the 100 pieces of onion
02:15:53.940 | in order to make a decision?
02:15:56.040 | Probably not, you know?
02:15:56.880 | And if I keep going, I'm just getting,
02:15:58.820 | more and more is my state space getting bigger as I cut?
02:16:02.500 | It's not right.
02:16:06.020 | So somehow there's a-- - What do you do?
02:16:08.100 | - I think there's a richer idea of state.
02:16:13.100 | It's not the state that is given to us
02:16:15.740 | by Lagrangian mechanics.
02:16:17.180 | There is a proper Lagrangian state of the system,
02:16:21.340 | but the relevant state for this
02:16:22.900 | is some latent state,
02:16:26.460 | is what we call it in machine learning.
02:16:28.540 | But there's some different state representation.
02:16:32.180 | - Some compressed representation?
02:16:34.980 | - And that's what I worry about saying compressed
02:16:37.220 | because I don't mind that it's low dimensional or not,
02:16:41.420 | but it has to be something that's easier to think about.
02:16:46.220 | - By us humans.
02:16:47.340 | - Or my algorithms.
02:16:49.300 | - Or the algorithms being like control, optimal control.
02:16:53.900 | - So for instance, if the contact mechanics
02:16:56.540 | of all of those onion pieces
02:16:58.500 | and all the permutations of possible touches
02:17:00.580 | between those onion pieces,
02:17:01.940 | you can give me a high dimensional state representation,
02:17:05.100 | I'm okay if it's linear.
02:17:06.820 | But if I have to think about all the possible
02:17:08.700 | shattering combinatorics of that,
02:17:10.780 | then my robot's gonna sit there thinking
02:17:13.900 | and the soup's gonna get cold or something.
02:17:17.380 | - So since you taught the course,
02:17:20.140 | it kinda entered my mind,
02:17:22.780 | the idea of underactuated as really compelling
02:17:26.020 | to see the world in this kind of way.
02:17:29.500 | Do you ever, if we talk about onions
02:17:32.420 | or you talk about the world with people in it in general,
02:17:35.460 | do you see the world as basically an underactuated system?
02:17:39.980 | Do you often look at the world in this way?
02:17:42.180 | Or is this overreach?
02:17:44.740 | - Underactuated is a way of life, man.
02:17:49.100 | - Exactly, I guess that's what I'm asking.
02:17:51.420 | - I do think it's everywhere.
02:17:54.900 | I think in some places,
02:17:57.260 | we already have natural tools to deal with it.
02:18:01.500 | It rears its head.
02:18:02.380 | I mean, in linear systems, it's not a problem.
02:18:04.700 | An underactuated linear system
02:18:07.260 | is really not sufficiently distinct
02:18:08.940 | from a fully actuated linear system.
02:18:10.780 | It's a subtle point about when that becomes a bottleneck
02:18:15.540 | in what we know how to do with control.
02:18:17.140 | It happens to be a bottleneck,
02:18:18.780 | although we've gotten incredibly good solutions now.
02:18:22.420 | But for a long time,
02:18:23.740 | I felt that that was the key bottleneck in Legged Robots.
02:18:27.060 | And roughly now, the underactuated course is,
02:18:29.580 | me trying to tell people everything I can
02:18:33.780 | about how to make Atlas do a backflip.
02:18:36.460 | I have a second course now
02:18:39.860 | that I teach in the other semesters,
02:18:41.220 | which is on manipulation.
02:18:43.540 | And that's where we get into now more of the,
02:18:45.820 | that's a newer class.
02:18:47.100 | I'm hoping to put it online this fall completely.
02:18:51.540 | And that's gonna have much more aspects
02:18:53.660 | about these perception problems
02:18:55.420 | and the state representation questions,
02:18:57.180 | and then how do you do control.
02:18:59.220 | And the thing that's a little bit sad is that,
02:19:03.980 | for me at least, is there's a lot of manipulation tasks
02:19:07.420 | that people wanna do and should wanna do.
02:19:09.220 | They could start a company with it and be very successful
02:19:12.700 | that don't actually require you to think that much
02:19:15.540 | about dynamics at all, even,
02:19:18.020 | but certainly underactuated dynamics.
02:19:19.980 | Once I have, if I reach out and grab something,
02:19:23.060 | if I can sort of assume it's rigidly attached to my hand,
02:19:25.660 | then I can do a lot of interesting,
02:19:26.860 | meaningful things with it
02:19:28.740 | without really ever thinking about the dynamics
02:19:30.900 | of that object.
02:19:31.940 | So we've built systems that kind of
02:19:34.940 | reduce the need for that,
02:19:38.060 | enveloping grasps and the like.
02:19:40.900 | But I think the really good problems in manipulation,
02:19:44.300 | so manipulation, by the way,
02:19:46.460 | is more than just pick and place.
02:19:49.740 | That's like, a lot of people think of that, just grasping.
02:19:53.020 | I don't mean that, I mean buttoning my shirt.
02:19:55.460 | I mean tying shoelaces.
02:19:57.700 | - Tying shoelaces. - How do you program
02:19:58.860 | a robot to tie shoelaces?
02:20:00.300 | And not just one shoe, but every shoe, right?
02:20:04.020 | That's a really good problem.
02:20:06.820 | It's tempting to write down
02:20:08.260 | like the infinite dimensional state of the laces.
02:20:11.900 | That's probably not needed to write a good controller.
02:20:16.140 | I know we could hand design a controller that would do it,
02:20:19.460 | but I don't want that.
02:20:20.300 | I wanna understand the principles
02:20:22.860 | that would allow me to solve another problem
02:20:24.540 | that's kind of like that.
02:20:26.460 | But I think if we can stay pure in our approach,
02:20:30.900 | then the challenge of tying anybody's shoes
02:20:34.940 | is a great challenge.
02:20:36.260 | - That's a great challenge.
02:20:37.180 | I mean, and the soft touch comes into play there.
02:20:40.900 | That's really interesting.
02:20:42.180 | Let me ask another ridiculous question on this topic.
02:20:47.500 | How important is touch?
02:20:49.780 | We haven't talked much about humans,
02:20:52.300 | but I have this argument with my dad
02:20:54.780 | where like I think you can fall in love with a robot
02:20:59.620 | based on language alone.
02:21:02.580 | And he believes that touch is essential.
02:21:05.360 | Touch and smell, he says.
02:21:07.780 | So in terms of robots, you know, connecting with humans,
02:21:16.540 | and we can go philosophical
02:21:18.660 | in terms of like a deep, meaningful connection, like love,
02:21:21.820 | but even just like collaborating in an interesting way,
02:21:25.540 | how important is touch?
02:21:26.900 | From an engineering perspective and a philosophical one.
02:21:32.820 | - I think it's super important.
02:21:34.460 | Even just in a practical sense,
02:21:37.060 | if we forget about the emotional part of it,
02:21:39.260 | but for robots to interact safely
02:21:43.340 | while they're doing meaningful mechanical work
02:21:46.420 | in the close contact with or vicinity of people
02:21:52.260 | that need help, I think we have to have them,
02:21:55.260 | we have to build them differently.
02:21:57.540 | They have to be not afraid of touching the world.
02:21:59.900 | So I think Baymax is just awesome.
02:22:02.860 | That's just like the movie of "Big Hero 6"
02:22:06.300 | and the concept of Baymax, that's just awesome.
02:22:08.740 | I think we should, and we have some folks at Toyota
02:22:13.060 | that are trying to, Toyota Research,
02:22:14.460 | that are trying to build Baymax roughly.
02:22:16.900 | And I think it's just a fantastically good project.
02:22:21.900 | I think it will change the way people physically interact.
02:22:25.660 | The same way, I mean, you gave a couple examples earlier,
02:22:28.020 | but if the robot that was walking around my home
02:22:31.980 | looked more like a teddy bear
02:22:34.020 | and a little less like the Terminator,
02:22:36.020 | that could change completely the way people perceive it
02:22:38.940 | and interact with it.
02:22:39.860 | And maybe they'll even wanna teach it, like you said.
02:22:43.300 | You could not quite gamify it,
02:22:47.700 | but somehow instead of people judging it
02:22:50.100 | and looking at it as if it's not doing as well as a human,
02:22:54.380 | they're gonna try to help out the cute teddy bear.
02:22:57.100 | Who knows?
02:22:57.940 | But I think we're building robots wrong
02:23:01.260 | and being more soft and more contact is important.
02:23:07.820 | - Yeah, like all the magical moments
02:23:09.900 | I can remember with robots.
02:23:12.380 | Well, first of all, just visiting your lab and seeing Atlas,
02:23:16.000 | but also Spot Mini.
02:23:18.300 | When I first saw Spot Mini in person
02:23:21.700 | and hung out with him, her, it,
02:23:26.300 | I don't have trouble gendering robots.
02:23:28.380 | I feel robotics people really always say 'it.'
02:23:31.500 | I kinda like the idea that it's a her or him.
02:23:35.820 | It's a magical moment, but there's no touching.
02:23:38.820 | I guess the question I have, have you ever been,
02:23:41.660 | like, have you had a human robot experience
02:23:44.980 | where a robot touched you?
02:23:47.980 | And it was like, wait, was there a moment
02:23:53.020 | that you've forgotten that a robot is a robot?
02:23:56.100 | And the anthropomorphization stepped in
02:24:00.860 | and for a second you forgot that it's not human?
02:24:04.940 | - I mean, I think when you're in on the details,
02:24:07.860 | then we of course anthropomorphized our work with Atlas,
02:24:12.400 | but in verbal communication and the like,
02:24:17.140 | I think we were pretty aware of it as a machine
02:24:20.460 | that needed to be respected.
02:24:21.860 | I actually, I worry more about the smaller robots
02:24:26.300 | that could still move quickly if programmed wrong
02:24:29.580 | and we have to be careful actually about safety
02:24:32.260 | and the like right now.
02:24:33.780 | And that, if we build our robots correctly,
02:24:36.420 | I think then a lot of those concerns could go away.
02:24:40.340 | And we're seeing that trend.
02:24:41.300 | We're seeing the lower cost, lighter weight arms now
02:24:44.140 | that could be fundamentally safe.
02:24:46.760 | I mean, I do think touch is so fundamental.
02:24:52.180 | Ted Adelson is great.
02:24:54.340 | He's a perceptual scientist at MIT
02:24:57.580 | and he studied vision most of his life.
02:25:01.420 | And he said, "When I had kids,
02:25:04.380 | "I expected to be fascinated
02:25:06.040 | "by their perceptual development."
02:25:07.940 | But what he really noticed, what felt
02:25:12.580 | more impressive, more dominant,
02:25:14.140 | was the way that they would touch everything
02:25:16.220 | and lick everything and pick things up,
02:25:17.780 | stick them on their tongue and whatever.
02:25:19.420 | And he said, watching his daughter
02:25:22.500 | convinced him that actually he needed to study
02:25:26.620 | tactile sensing more.
02:25:28.580 | So there's something very important.
02:25:33.540 | I think it's a little bit also of the passive
02:25:35.780 | versus active part of the world, right?
02:25:38.660 | You can passively perceive the world,
02:25:41.460 | but it's fundamentally different
02:25:45.060 | if you can do an experiment, right?
02:25:47.140 | And if you can change the world.
02:25:48.820 | And you can learn a lot more than a passive observer.
02:25:51.660 | So in dialogue, that was your initial example,
02:25:56.940 | you could have an active, experimental exchange.
02:26:00.060 | But I think if you're just a camera watching YouTube,
02:26:03.020 | I think that's a very different problem
02:26:05.860 | than if you're a robot that can apply force and touch.
02:26:10.720 | I think it's important.
02:26:15.500 | - Yeah, I think it's just an exciting area of research.
02:26:17.980 | I think you're probably right
02:26:19.220 | that this has been under-researched.
02:26:21.460 | To me, as a person who's captivated
02:26:25.740 | by the idea of human robot interaction,
02:26:27.780 | it feels like such a rich opportunity to explore touch.
02:26:32.780 | Not even from a safety perspective,
02:26:35.860 | but like you said, the emotional too.
02:26:38.060 | I mean, safety comes first,
02:26:39.700 | but the next step is like a real human connection.
02:26:46.260 | Even in the industrial setting,
02:26:51.420 | it just feels like it's nice for the robot.
02:26:55.580 | - I don't know, you might disagree with this,
02:26:58.220 | 'cause I think it's important to see robots as tools often,
02:27:04.420 | but I don't know.
02:27:06.140 | I think they're just always going to be more effective
02:27:08.620 | once you humanize them.
02:27:10.220 | It's convenient now to think of them as tools
02:27:14.420 | 'cause we wanna focus on the safety,
02:27:16.220 | but I think ultimately to create a good experience
02:27:21.220 | for the worker, for the person,
02:27:24.940 | there has to be a human element.
02:27:28.020 | I don't know, for me.
02:27:29.260 | It feels like an industrial robotic arm
02:27:33.140 | would be better if it has a human element.
02:27:34.900 | I think, like, Rethink Robotics had that idea
02:27:37.060 | with Baxter, having eyes and so on.
02:27:40.300 | I don't know, I'm a big believer in that.
02:27:43.100 | - It's not my area, but I am also a big believer.
02:27:48.100 | - Do you have an emotional connection to Atlas?
02:27:51.640 | (laughing)
02:27:53.820 | Like, do you miss him?
02:27:55.020 | - I mean, yes.
02:27:56.020 | I don't know if I do, more so than if I had
02:28:03.460 | a different science project that I'd worked on super hard.
02:28:06.300 | But yeah, I mean, the robot,
02:28:11.300 | we basically had to do heart surgery on the robot
02:28:14.820 | in the final competition 'cause we melted the core.
02:28:17.500 | Yeah, there was something about watching that robot
02:28:23.180 | hanging there, we knew we had to compete with it
02:28:24.940 | in an hour and it was getting its guts ripped out.
02:28:28.060 | - Those are all historic moments.
02:28:30.540 | I think if you look back like 100 years from now,
02:28:32.980 | yeah, I think those are important moments in robotics.
02:28:38.560 | I mean, these are the early days.
02:28:40.020 | You look at like the early days
02:28:41.220 | of a lot of scientific disciplines,
02:28:42.780 | they look ridiculous, it's full of failure.
02:28:45.380 | But it feels like robotics will be important
02:28:48.340 | in the coming 100 years.
02:28:52.460 | And these are the early days.
02:28:54.060 | So I think a lot of people look at a brilliant person
02:29:00.140 | such as yourself and are curious
02:29:02.260 | about the intellectual journey they've took.
02:29:05.180 | Is there maybe three books, technical, fiction,
02:29:09.600 | philosophical that had a big impact on your life
02:29:13.860 | that you would recommend perhaps others reading?
02:29:16.760 | - Yeah, so I actually didn't read that much as a kid,
02:29:21.740 | but I read fairly voraciously now.
02:29:24.380 | There are some recent books that if you're interested
02:29:28.860 | in this kind of topic, like "AI Superpowers" by Kai-Fu Lee
02:29:33.860 | is just a fantastic read, you must read that.
02:29:36.940 | Yuval Harari, I think that can open your mind.
02:29:44.060 | - "Sapiens." - "Sapiens" is the first one.
02:29:48.060 | "Homo Deus" is the second, yeah.
02:29:51.020 | We mentioned "The Black Swan" by Taleb.
02:29:53.500 | I think that's a good sort of mind opener.
02:29:56.080 | I actually, so there's maybe
02:30:02.220 | a more controversial recommendation I could give.
02:30:06.220 | - Great, we love controversial.
02:30:08.740 | - In some sense, it's so classical it might surprise you.
02:30:11.580 | But I actually recently read Mortimer Adler's
02:30:15.980 | "How to Read a Book."
02:30:17.600 | Not so long, it was a while ago.
02:30:19.060 | But some people hate that book.
02:30:22.340 | I loved it.
02:30:24.860 | I think we're in this time right now where,
02:30:29.020 | boy, we're just inundated with research papers
02:30:33.820 | that you could read on arXiv with limited peer review
02:30:38.620 | and just this wealth of information.
02:30:41.020 | I don't know, I think the passion
02:30:47.140 | of what you can get out of a book,
02:30:50.060 | a really good book or a really good paper if you find it,
02:30:53.880 | the attitude, the realization that you're only gonna find
02:30:56.080 | a few that really are worth all your time.
02:30:58.520 | But then once you find them, you should just dig in
02:31:02.540 | and understand it very deeply and it's worth
02:31:06.380 | marking it up and having the hard copy,
02:31:11.300 | writing in the side notes, side margins.
02:31:16.020 | I think that was really,
02:31:17.260 | I read it at the right time where I was just feeling
02:31:21.420 | just overwhelmed with really low quality stuff, I guess.
02:31:26.260 | And similarly, I'm giving more than three now.
02:31:32.420 | I'm sorry if I've exceeded my quota.
02:31:35.300 | - But on that topic just real quick is,
02:31:38.180 | so basically finding a few companions
02:31:41.700 | to keep for the rest of your life
02:31:43.740 | in terms of papers and books and so on.
02:31:45.800 | And those are the ones, like not doing,
02:31:49.420 | what is it, FOMO, fear of missing out,
02:31:52.880 | constantly trying to update yourself,
02:31:54.820 | but really deeply making a life journey
02:31:57.540 | of studying a particular paper essentially, set of papers.
02:32:01.300 | - Yeah, I think when you really find something,
02:32:06.060 | which a book that resonates with you
02:32:07.740 | might not be the same book that resonates with me,
02:32:10.400 | but when you really find one that resonates with you,
02:32:13.140 | I think the dialogue that happens,
02:32:15.180 | and that's what I loved that Adler was saying,
02:32:18.260 | I think Socrates and Plato say,
02:32:20.560 | the written word is never gonna capture
02:32:25.700 | the beauty of dialogue, right?
02:32:28.000 | But Adler says, no, no,
02:32:29.200 | a really good book is a dialogue between you and the author,
02:32:35.340 | and it crosses time and space.
02:32:37.300 | I don't know, I think it's a very romantic,
02:32:40.660 | there's a bunch of specific advice,
02:32:42.720 | which you can just gloss over,
02:32:44.360 | but the romantic view of how to read
02:32:47.220 | and really appreciate it is so good.
02:32:51.020 | And similarly, teaching,
02:32:53.860 | I thought a lot about teaching,
02:32:58.860 | so Isaac Asimov, great science fiction writer,
02:33:03.260 | has also actually spent a lot of his career
02:33:05.300 | writing nonfiction, right?
02:33:07.220 | His memoir is fantastic.
02:33:08.720 | He was passionate about explaining things, right?
02:33:12.700 | He wrote all kinds of books on all kinds of topics
02:33:14.600 | in science, he was known as the great explainer.
02:33:17.700 | And I do really resonate with his style
02:33:22.340 | and just his way of talking about,
02:33:27.100 | how communicating and explaining something
02:33:30.520 | is really the way that you learn something.
02:33:32.500 | I think about problems very differently
02:33:36.220 | because of the way I've been given the opportunity
02:33:39.180 | to teach them at MIT, and have questions asked,
02:33:43.540 | the fear of the lecture, the experience of the lecture
02:33:47.660 | and the questions I get and the interactions
02:33:50.200 | just forces me to be rock solid on these ideas
02:33:53.140 | in a way that if I didn't have that,
02:33:55.040 | I don't know I would be in a different intellectual space.
02:33:58.260 | - Also, video, does that scare you
02:34:00.420 | that your lectures are online,
02:34:02.140 | and people like me in sweatpants can sit,
02:34:04.660 | sipping coffee and watch you give lectures?
02:34:07.940 | - I think it's great.
02:34:09.140 | I do think that something's changed right now,
02:34:12.820 | which is, right now we're giving lectures over Zoom,
02:34:16.900 | I mean, giving seminars over Zoom and everything.
02:34:19.300 | I'm trying to figure out, I think it's a new medium.
02:34:24.380 | - Do you think it's-- - I'm trying to figure out
02:34:25.540 | how to exploit it. - How to exploit it.
02:34:28.020 | Yeah, I've been quite cynical
02:34:34.500 | about the human to human connection over that medium,
02:34:39.500 | but I think that's because it hasn't been explored fully,
02:34:43.580 | and teaching is a different thing.
02:34:45.820 | - Every lecture is a, I'm sorry, every seminar even,
02:34:49.140 | I think every talk I give,
02:34:51.020 | is an opportunity to give that differently.
02:34:55.020 | I can deliver content directly into your browser.
02:34:57.980 | You have a WebGL engine right there.
02:35:00.060 | I can throw 3D content into your browser
02:35:04.940 | while you're listening to me, right?
02:35:06.940 | And I can assume that you have a,
02:35:08.700 | at least a powerful enough laptop or something
02:35:11.780 | to watch Zoom while I'm doing that,
02:35:13.740 | while I'm giving a lecture.
02:35:15.100 | That's a new communication tool
02:35:18.100 | that I didn't have last year, right?
02:35:19.980 | And I think robotics can potentially benefit a lot
02:35:24.220 | from teaching that way.
02:35:25.340 | We'll see, it's gonna be an experiment this fall.
02:35:28.220 | - It's interesting. - I'm thinking a lot
02:35:29.420 | about it.
02:35:30.420 | - Yeah, and also the length of lectures,
02:35:35.420 | or the length of, there's something,
02:35:38.900 | so I guarantee you, 80% of people
02:35:42.980 | who started listening to our conversation
02:35:44.980 | are still listening to now, which is crazy to me.
02:35:48.020 | There's a patience and an interest in long-form content,
02:35:52.740 | but at the same time, there's a magic
02:35:55.300 | to forcing yourself to condense,
02:35:58.020 | an idea into as short as possible,
02:36:01.300 | the shortest possible clip.
02:36:04.740 | It can be a part of a longer thing,
02:36:06.260 | but like just a really beautifully condensed an idea.
02:36:09.700 | There's a lot of opportunity there
02:36:11.980 | that's easier to do remotely with,
02:36:14.660 | I don't know, with editing too.
02:36:19.100 | Editing is an interesting thing.
02:36:20.860 | Well, most professors don't get,
02:36:25.100 | when they give a lecture,
02:36:25.940 | they don't get to go back and edit out parts,
02:36:28.300 | like crisp it up a little bit.
02:36:31.660 | That's also, it can do magic.
02:36:34.260 | Like if you remove like five to 10 minutes
02:36:37.700 | from an hour lecture, it can actually,
02:36:41.220 | it can make something special of a lecture.
02:36:43.300 | I've seen that in myself and in others too,
02:36:47.940 | 'cause I edit other people's lectures to extract clips.
02:36:50.660 | It's like, there's certain tangents
02:36:52.820 | that lose people, they're not interesting.
02:36:54.500 | They're mumbling, they're just not,
02:36:57.260 | they're not clarifying, they're not helpful at all.
02:36:59.900 | And once you remove them, it's just, I don't know.
02:37:02.940 | Editing can be magic.
02:37:04.660 | - It can take a lot of time.
02:37:05.980 | - Yeah, it takes, it depends like what is teaching,
02:37:09.060 | you have to ask.
02:37:09.900 | - Yeah.
02:37:10.740 | - Yeah, 'cause I find the editing process
02:37:14.580 | is also beneficial, not just for teaching,
02:37:19.580 | but also for your own learning.
02:37:21.700 | I don't know if, have you watched yourself?
02:37:23.780 | - Yeah, sure.
02:37:24.620 | - Have you watched those videos?
02:37:26.220 | - I mean, not all of them.
02:37:28.380 | - It could be painful to see how to improve.
02:37:33.380 | - So do you find that, I know you segment your podcast.
02:37:37.260 | Do you think that helps people
02:37:39.700 | with the attention span aspect of it?
02:37:42.220 | Or is it--
02:37:43.060 | - Segment like sections like--
02:37:44.260 | - Yeah, we're talking about this topic, whatever.
02:37:46.380 | - Nope, nope, that just helps me.
02:37:48.300 | It's actually bad.
02:37:49.460 | So, and you've been incredible.
02:37:53.860 | So I'm learning, like I'm afraid of conversation.
02:37:56.460 | Even today, I'm terrified of talking to you.
02:37:59.180 | I mean, it's something I'm trying to remove from myself.
02:38:03.540 | Like, I mean, I learned from a lot of people,
02:38:07.460 | but really there's been a few people
02:38:10.780 | who's been inspirational to me in terms of conversation.
02:38:14.140 | Whatever people think of him, Joe Rogan
02:38:16.340 | has been inspirational to me
02:38:17.540 | because comedians have been too.
02:38:20.540 | Being able to just have fun and enjoy themselves
02:38:23.340 | and lose themselves in conversation,
02:38:25.620 | that requires you to be a great storyteller,
02:38:28.860 | to be able to pull a lot of different pieces
02:38:31.500 | of information together,
02:38:32.860 | but mostly just to enjoy yourself in conversations.
02:38:36.540 | And I'm trying to learn that.
02:38:38.100 | These notes are, you see me looking down,
02:38:41.700 | that's like a safety blanket
02:38:43.060 | that I'm trying to let go of more and more.
02:38:45.300 | - Cool.
02:38:46.300 | - So that's, people love just regular conversation.
02:38:49.500 | That's what they, the structure is like, whatever.
02:38:52.740 | I would say maybe like 10 to,
02:38:57.700 | so there's a bunch of,
02:38:58.740 | there's probably a couple thousand PhD students
02:39:03.860 | listening to this right now, right?
02:39:07.020 | And they might know what we're talking about,
02:39:09.580 | but there is somebody, I guarantee you right now in Russia,
02:39:14.580 | some kid who's just like, who's just smoked some weed,
02:39:18.020 | sitting back and just enjoying
02:39:20.580 | the hell out of this conversation.
02:39:22.540 | Not really understanding,
02:39:23.820 | he kind of watched some Boston Dynamics videos.
02:39:25.940 | He's just enjoying it.
02:39:27.300 | And I salute you, sir.
02:39:30.260 | No, but just like, there's so much variety of people
02:39:33.820 | that just have curiosity about engineering,
02:39:36.180 | about sciences, about mathematics.
02:39:38.980 | And also like I should,
02:39:41.380 | I mean, enjoying it is one thing,
02:39:45.980 | but I also often notice it inspires people to,
02:39:50.140 | there's a lot of people who are like
02:39:51.860 | in their undergraduate studies
02:39:54.780 | trying to figure out what to pursue.
02:39:57.100 | And these conversations can really spark
02:39:59.940 | the direction of their life.
02:40:02.620 | And in terms of robotics, I hope it does,
02:40:04.460 | 'cause I'm excited about the possibilities
02:40:07.380 | of what robotics brings.
02:40:08.500 | On that topic, do you have advice?
02:40:13.380 | Like what advice would you give to a young person about life?
02:40:17.700 | - A young person about life
02:40:20.140 | or a young person about life in robotics?
02:40:22.940 | - It could be in robotics.
02:40:25.100 | It could be in life in general.
02:40:26.580 | It could be career.
02:40:28.380 | It could be relationship advice.
02:40:31.260 | It could be running advice.
02:40:32.820 | Just like, that's one of the things I see,
02:40:36.540 | like when we talk to 20-year-olds.
02:40:38.580 | They're like, how do I do this thing?
02:40:42.420 | What do I do?
02:40:43.940 | If they come up to you, what would you tell them?
02:40:48.020 | - I think it's an interesting time to be a kid these days.
02:40:52.540 | Everything points to this being
02:40:57.220 | sort of a winner take all economy and the like.
02:40:59.340 | I think the people that will really excel, in my opinion,
02:41:04.340 | are gonna be the ones that can think deeply about problems.
02:41:11.240 | You have to be able to ask questions agilely
02:41:13.980 | and use the internet for everything it's good for
02:41:15.860 | and stuff like this.
02:41:16.700 | And I think a lot of people will develop those skills.
02:41:19.460 | I think the leaders, thought leaders,
02:41:24.460 | robotics leaders, whatever,
02:41:26.820 | are gonna be the ones that can do more
02:41:29.060 | and they can think very deeply and critically.
02:41:31.300 | And that's a harder thing to learn.
02:41:34.980 | I think one path to learning that is through mathematics,
02:41:38.100 | through engineering.
02:41:39.520 | I would encourage people to start math early.
02:41:44.160 | I mean, I didn't really start.
02:41:46.880 | I mean, I was always in the better math classes
02:41:50.460 | that I could take,
02:41:51.300 | but I wasn't pursuing super advanced mathematics
02:41:54.720 | or anything like that until I got to MIT.
02:41:56.680 | I think MIT lit me up
02:41:59.000 | and really started the life that I'm living now.
02:42:05.600 | But yeah, I really want kids to dig deep,
02:42:10.600 | really understand things, building things too.
02:42:12.460 | I mean, pull things apart, put them back together.
02:42:15.180 | Like that's just such a good way
02:42:17.160 | to really understand things
02:42:19.980 | and expect it to be a long journey, right?
02:42:23.780 | You don't have to know everything.
02:42:27.260 | You're never gonna know everything.
02:42:29.500 | - So think deeply and stick with it.
02:42:31.480 | - Enjoy the ride,
02:42:33.940 | but just make sure you're not, yeah.
02:42:38.060 | Just make sure you're stopping
02:42:40.580 | to think about why things work.
02:42:43.220 | - It's true.
02:42:44.060 | It's easy to lose yourself
02:42:45.420 | in the distractions of the world.
02:42:49.120 | - We're overwhelmed with content right now,
02:42:52.740 | but you have to stop and pick some of it
02:42:56.260 | and really understand.
02:42:57.880 | - Yeah, on the book point,
02:43:00.380 | I've read "Animal Farm" by George Orwell
02:43:04.980 | a ridiculous number of times.
02:43:06.180 | So for me, like that book,
02:43:07.900 | I don't know if it's a good book in general,
02:43:09.780 | but for me, it connects deeply somehow.
02:43:12.200 | It somehow connects.
02:43:15.080 | So I was born in the Soviet Union.
02:43:18.300 | So it connects to me to the entirety of the history
02:43:20.500 | of the Soviet Union and to World War II
02:43:23.220 | and to the love and hatred and suffering that went on there.
02:43:28.060 | And the corrupting nature of power and greed
02:43:33.060 | and just somehow,
02:43:35.300 | that book has taught me more about life
02:43:38.140 | than like anything else.
02:43:39.420 | Even though it's just like a silly,
02:43:41.060 | like childlike book about pigs.
02:43:46.060 | I don't know why, it just connects and inspires.
02:43:48.980 | And the same, there's a few,
02:43:50.700 | yeah, there's a few technical books too
02:43:53.820 | and algorithms that just, yeah, you return to often.
02:43:58.500 | I'm with you.
02:43:59.740 | Yeah, there's, I don't know.
02:44:02.900 | And I've been losing that because of the internet.
02:44:05.420 | I've been like going on,
02:44:07.260 | I've been going on arXiv and blog posts and GitHub
02:44:11.300 | and the new thing and you lose your ability
02:44:16.220 | to really master an idea.
02:44:18.140 | - Right.
02:44:18.980 | - Wow.
02:44:19.820 | - Exactly right.
02:44:21.140 | - What's a fond memory from childhood?
02:44:27.460 | Baby Russ Tedrake.
02:44:27.460 | - Well, I guess I just said that,
02:44:31.180 | at least my current life began when I got to MIT.
02:44:36.820 | If I have to go farther than that.
02:44:38.940 | - Yeah, was there a life before MIT?
02:44:41.340 | - Oh, absolutely.
02:44:43.140 | But let me actually tell you what happened
02:44:47.860 | when I first got to MIT.
02:44:48.900 | 'Cause that, I think might be relevant here.
02:44:52.260 | But I had taken a computer engineering degree at Michigan.
02:44:57.260 | I enjoyed it immensely, learned a bunch of stuff.
02:45:00.420 | I liked computers, I liked programming.
02:45:03.560 | But when I did get to MIT and started working
02:45:07.340 | with Sebastian Seung, theoretical physicist,
02:45:10.300 | computational neuroscientist,
02:45:11.880 | the culture here was just different.
02:45:17.220 | It demanded more of me, certainly mathematically
02:45:20.260 | and in the critical thinking.
02:45:22.660 | And I remember the day that I borrowed one of the books
02:45:27.660 | from my advisor's office and walked down
02:45:29.780 | to the Charles River and was like,
02:45:32.140 | I'm getting my butt kicked.
02:45:33.500 | And I think that's gonna happen to everybody
02:45:38.140 | who's doing this kind of stuff.
02:45:40.220 | I think, I expected you to ask me the meaning of life.
02:45:45.220 | I think that the, somehow I think that's gotta be part of it.
02:45:51.580 | This--
02:45:52.780 | - Doing hard things?
02:45:53.900 | - Yeah.
02:45:55.940 | - Did you consider quitting at any point?
02:45:58.180 | Did you consider this isn't for me?
02:45:59.740 | - No, never that.
02:46:01.700 | I mean, I was working hard, but I was loving it.
02:46:06.700 | I think there's this magical thing where you,
02:46:09.480 | I'm lucky to surround myself with people that basically,
02:46:13.300 | almost every day I'll see something,
02:46:17.860 | I'll be told something or something that I realize,
02:46:20.300 | wow, I don't understand that.
02:46:21.980 | And if I could just understand that,
02:46:24.140 | there's something else to learn,
02:46:26.000 | that if I could just learn that thing,
02:46:28.100 | I would connect another piece of the puzzle.
02:46:30.300 | And I think that is just such an important aspect
02:46:35.300 | and being willing to understand what you can and can't do
02:46:40.220 | and loving the journey of going
02:46:43.540 | and learning those other things.
02:46:44.780 | I think that's the best part.
02:46:47.300 | - I don't think there's a better way to end it, Russ.
02:46:51.460 | You've been an inspiration to me since I showed up at MIT.
02:46:55.540 | Your work has been an inspiration to the world.
02:46:57.660 | This conversation was amazing.
02:46:59.700 | I can't wait to see what you do next
02:47:01.660 | with robotics, home robots.
02:47:03.740 | I hope to see your work in my home one day.
02:47:05.700 | So thanks so much for talking today.
02:47:07.460 | It's been awesome.
02:47:08.300 | - Cheers.
02:47:09.460 | - Thanks for listening to this conversation
02:47:12.180 | with Russ Tedrake.
02:47:12.180 | And thank you to our sponsors,
02:47:14.140 | Magic Spoon Cereal, BetterHelp and ExpressVPN.
02:47:18.180 | Please consider supporting this podcast
02:47:20.140 | by going to magicspoon.com/lex
02:47:23.360 | and using code LEX at checkout.
02:47:25.460 | going to betterhelp.com/lex
02:47:27.780 | and signing up at expressvpn.com/lexpod.
02:47:32.780 | Click the links, buy the stuff, get the discount.
02:47:36.220 | It really is the best way to support this podcast.
02:47:39.380 | If you enjoy this thing, subscribe on YouTube,
02:47:41.520 | review it with five stars on Apple Podcast,
02:47:43.680 | support it on Patreon or connect with me on Twitter
02:47:46.540 | at lexfridman, spelled somehow without the E,
02:47:50.620 | just F-R-I-D-M-A-N.
02:47:53.460 | And now let me leave you with some words
02:47:55.100 | from Neil deGrasse Tyson,
02:47:56.620 | talking about robots in space
02:47:58.500 | and the emphasis we humans put
02:48:00.660 | on human-based space exploration.
02:48:03.540 | "Robots are important.
02:48:05.580 | "If I don my pure scientist hat,
02:48:07.860 | "I would say just send robots.
02:48:09.900 | "I'll stay down here and get the data.
02:48:12.240 | "But nobody's ever given a parade for a robot.
02:48:14.980 | "Nobody's ever named a high school after a robot.
02:48:17.840 | "So when I don my public educator hat,
02:48:20.060 | "I have to recognize the elements of exploration
02:48:22.680 | "that excite people.
02:48:24.060 | "It's not only the discoveries and the beautiful photos
02:48:26.880 | "that come down from the heavens.
02:48:28.900 | "It's the vicarious participation in discovery itself."
02:48:33.080 | Thank you for listening and hope to see you next time.
02:48:37.240 | (upbeat music)