Russ Tedrake: Underactuated Robotics, Control, Dynamics and Touch | Lex Fridman Podcast #114
Chapters
0:00 Introduction
4:29 Passive dynamic walking
9:40 Animal movement
13:34 Control vs Dynamics
15:49 Bipedal walking
20:56 Running barefoot
33:01 Think rigorously with machine learning
44:05 DARPA Robotics Challenge
67:14 When will a robot become UFC champion
78:32 Black Mirror Robot Dog
94:01 Robot control
107:00 Simulating robots
120:33 Home robotics
123:40 Soft robotics
127:25 Underactuated robotics
140:42 Touch
148:55 Book recommendations
160:08 Advice to young people
164:20 Meaning of life
00:00:00.000 |
The following is a conversation with Russ Tedrake, 00:00:11.240 |
He works on control of robots in interesting, 00:00:25.040 |
We'll get into a lot of topics in this conversation 00:00:28.280 |
from his time leading MIT's DARPA Robotics Challenge Team 00:00:35.400 |
close to a marathon a day to and from work barefoot. 00:00:44.520 |
of underactuated dynamical systems like the human body, 00:01:18.840 |
It really is the best way to support this podcast. 00:01:21.840 |
If you enjoy this thing, subscribe on YouTube, 00:01:54.960 |
which is terrible for you, so I quit years ago. 00:02:09.640 |
but if you know what's good for you, you'll go with cocoa, 00:02:12.560 |
my favorite flavor and the flavor of champions. 00:02:15.840 |
Click the magicspoon.com/lex link in the description, 00:02:40.600 |
and match you with a licensed professional therapist 00:02:47.640 |
it is professional counseling done securely online. 00:02:51.040 |
As you may know, I'm a bit from the David Goggins line 00:02:53.720 |
of creatures and so have some demons to contend with, 00:02:57.080 |
usually on long runs or all-nighters full of self-doubt. 00:03:08.200 |
For most people, I think a good therapist can help in this, 00:03:15.640 |
It's easy, private, affordable, available worldwide. 00:03:21.640 |
and schedule weekly audio and video sessions. 00:03:31.840 |
Get it at expressvpn.com/lexpod to get a discount 00:03:45.120 |
Not to stir up trouble, but I personally think 00:03:48.080 |
the British version is actually more brilliant 00:03:53.120 |
Anyway, there are actually nine other countries 00:03:58.400 |
You can get access to them with no geo-restriction 00:04:07.340 |
You can choose from nearly 100 different countries, 00:04:14.040 |
So again, get it on any device at expressvpn.com/lexpod 00:04:25.000 |
And now here's my conversation with Russ Tedrake. 00:04:28.620 |
What is the most beautiful motion of an animal or robot 00:04:34.540 |
- I think the most beautiful motion of a robot 00:04:41.120 |
I think there's just something fundamentally beautiful. 00:04:43.320 |
The ones in particular that Steve Collins built 00:04:45.360 |
with Andy Ruina at Cornell, a 3D walking machine. 00:05:06.160 |
And at the time it looked more natural, more graceful, 00:05:09.520 |
more human-like than any robot we'd seen to date. 00:05:21.520 |
One of the simplest models we use to think about it 00:05:25.340 |
So imagine taking a bicycle wheel, but take the rim off. 00:05:35.840 |
But every time its foot, its spoke comes around 00:05:38.180 |
and hits the ground, it loses a little energy. 00:05:48.240 |
And actually they want to, it's a stable phenomenon. 00:06:02.160 |
which doesn't look very much like a human walking, 00:06:05.080 |
take all the extra spokes away, put a hinge in the middle. 00:06:27.480 |
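The rimless wheel model described here has a simple step-to-step return map (the version presented in Tedrake's Underactuated Robotics course notes). Below is a minimal Python sketch of that map; the leg length, inter-spoke half-angle, and slope are illustrative assumptions, not values from the conversation.

```python
import math

# Rimless wheel rolling down a slope: between impacts the stance spoke acts as
# an inverted pendulum, and each impact with the next spoke scales the angular
# rate by cos(2*alpha) (angular momentum about the new contact point).
g, l = 9.81, 1.0          # gravity, spoke length (illustrative)
alpha = math.pi / 8       # half the angle between adjacent spokes
gamma = 0.08              # slope angle in radians

def next_post_impact_rate(omega):
    # Energy gained falling through one step, then the impact loss.
    gain = 2 * g / l * (math.cos(gamma - alpha) - math.cos(gamma + alpha))
    return math.cos(2 * alpha) * math.sqrt(omega**2 + gain)

omega = 0.3               # a small initial push
for step in range(10):
    omega = next_post_impact_rate(omega)
    print(f"step {step}: post-impact rate = {omega:.3f} rad/s")
# The rate converges to a fixed point: steady, passively stable rolling,
# which is the "stable phenomenon" described above.
```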
where they built something that had knees, arms, a torso, 00:06:32.460 |
the arms swung naturally, give it a little push. 00:06:36.320 |
And that looked like a stroll through the park. 00:06:43.800 |
I think there's a science to getting close to the solution. 00:06:47.640 |
I think there's certainly art in the way that they, 00:06:57.600 |
that wasn't perfectly modeled, wasn't perfectly controlled. 00:07:02.800 |
that you have to tune the suction cups at the knees, 00:07:07.960 |
but then they release at just the right time. 00:07:09.640 |
Or there's all these little tricks of the trade, 00:07:18.800 |
the best walking robot in the world was Honda's ASIMO. 00:07:27.440 |
it sort of announced P2 and then it went through, 00:07:37.840 |
like you're walking on hot coals or something like that. 00:07:56.660 |
None of our robots that have been that complicated 00:08:04.060 |
But there's a thing that happens when you do control, 00:08:09.060 |
when you try to control a system of that complexity. 00:08:12.480 |
You try to use your motors to basically counteract gravity. 00:08:16.480 |
Take whatever the world's doing to you and push back, 00:08:25.040 |
because you can make them simple and analyzable, 00:08:30.780 |
And this was a very sort of beautiful example 00:08:36.380 |
You can just let go, let physics do most of the work. 00:08:39.080 |
And you just have to give it a little bit of energy. 00:08:46.160 |
you have to give a little energy at some point. 00:08:48.460 |
But maybe instead of trying to take the forces 00:08:51.580 |
imparted to you by the world and replacing them, 00:08:55.200 |
what we should be doing is letting the world push us around 00:09:15.320 |
that they've had to have the system fall down 00:09:19.860 |
- I don't know if it's thousands, but it's a lot. 00:09:25.020 |
- So in that sense, control might help a little bit. 00:09:32.140 |
said that the answer is to do that with control. 00:09:36.380 |
that maybe the way we're doing control right now 00:09:43.860 |
The ones that figured out how to move efficiently? 00:09:46.220 |
Is there anything you find inspiring or beautiful 00:09:54.380 |
- So it sort of goes with the passive walking idea. 00:09:57.180 |
So is there, how energy efficient are animals? 00:10:03.820 |
by George Lauder at Harvard and Mike Triantafyllou at MIT. 00:10:07.500 |
They were studying fish swimming in a water tunnel, okay? 00:10:11.820 |
And one of these, the type of fish they were studying 00:10:17.260 |
because there was a phenomenon well understood 00:10:21.220 |
when they're swimming upstream at mating season, 00:10:29.180 |
Maybe there's something energetically interesting there. 00:10:33.400 |
They put in this water tunnel, a rock basically, 00:10:36.420 |
a cylinder that had the same sort of vortex street, 00:10:48.020 |
And the amazing thing is that if you watch from above 00:10:51.980 |
what the fish swims when it's not behind a rock, 00:10:56.100 |
You can identify the fish the same way you look at a human 00:10:59.820 |
You sort of have a sense of how a human walks. 00:11:04.200 |
You put that fish behind the rock, its gait changes. 00:11:07.960 |
And what they saw was that it was actually resonating 00:11:15.120 |
Now, here was the experiment that really was the clincher, 00:11:20.960 |
it wasn't clear how much of that was mechanics of the fish, 00:11:44.120 |
So it couldn't go back and get caught in the grates. 00:11:47.120 |
And then they asked, what would that dead fish do 00:12:04.000 |
- The dead fish basically starts swimming upstream. 00:12:06.560 |
It's completely dead, no brain, no motors, no control, 00:12:23.720 |
Is that just evolution constantly just figuring out 00:12:32.700 |
Or maybe the physics of our world completely, 00:12:40.900 |
but just the entirety of it somehow drives to efficiency? 00:12:48.120 |
I don't know if that question even makes any sense. 00:12:56.400 |
I don't know if an environment can evolve, but. 00:12:59.020 |
- I mean, there are experiments that people do, 00:13:05.400 |
can adapt to unusual situations and recover efficiency. 00:13:08.680 |
So there seems like, at least in one direction, 00:13:20.040 |
But efficiency isn't the only goal, of course. 00:13:23.080 |
Sometimes it's too easy to think about only efficiency. 00:13:26.160 |
But we have to do a lot of other things first, 00:13:28.840 |
not get eaten, and then all other things being equal, 00:13:40.780 |
- Yeah, I mean, I think part of the point is that 00:13:43.900 |
we shouldn't draw a line as clearly as we tend to. 00:13:51.420 |
and we have the links of the robot, let's say. 00:14:01.320 |
You can put springs, I would call that mechanics. 00:14:04.920 |
which our muscles are springs and dampers and tendons. 00:14:07.600 |
But then you have something that's doing active work, 00:14:10.400 |
putting energy in, which are your motors on the robot. 00:14:13.200 |
The controller's job is to send commands to the motor 00:14:24.760 |
did you decide to send some commands to your motor, 00:14:56.160 |
have underestimated how important the dynamics are. 00:14:58.720 |
I mean even our bodies, the mechanics of our bodies, 00:15:06.240 |
So I actually, I lost a finger in the early 2000s, 00:15:16.560 |
in ways you don't expect when you're opening jars, 00:15:22.480 |
there's a bone there that was used to taking contact. 00:15:26.720 |
My fourth metacarpal wasn't used to taking contact, 00:15:52.520 |
'cause it's tended to create some interesting things. 00:15:56.680 |
Bipedal walking, why the heck did evolution give us, 00:16:01.680 |
I think we're, are we the only mammals that walk on two feet? 00:16:06.520 |
- No, I mean there's a bunch of animals that do it, a bit. 00:16:24.120 |
is because there's an advantage to being able 00:16:26.780 |
to carry food back to the tribe or something like that. 00:16:29.600 |
So you can carry, it's kind of this communal, 00:16:36.520 |
to a place of shelter and so on to share with others. 00:16:41.640 |
- Do you understand at all the value of walking on two feet 00:16:46.040 |
from both a robotics and a human perspective? 00:16:51.680 |
about evolution of, walking evolution of the human body. 00:16:56.080 |
I think it's easy though to make bad evolutionary arguments. 00:17:02.200 |
Most of them are probably bad, but what else can we do? 00:17:05.320 |
- I mean I think a lot of what dominated our evolution 00:17:15.080 |
sort of in the steady state, when things are good. 00:17:20.080 |
But for instance, people talk about what we should eat now 00:17:25.040 |
because our ancestors were meat eaters or whatever. 00:17:30.240 |
- But probably the reason that one pre-Homo sapien species 00:17:40.700 |
whether they ate well when there was lots of food, 00:17:45.320 |
but when the ice age came, probably one of them 00:17:50.960 |
one of them happened to forage a food that was okay 00:17:54.200 |
even when the glaciers came or something like that. 00:17:57.720 |
- There's a million variables that contributed 00:18:00.560 |
and we can't, and are actually, the amount of information 00:18:13.040 |
if you study history, it seems like history turns 00:18:15.680 |
on these little events that otherwise would seem meaningless, 00:18:20.680 |
but in the grand, in retrospect, were turning points. 00:18:28.360 |
- And that's probably how, like somebody got hit 00:18:30.920 |
in the head with a rock because somebody slept 00:18:40.200 |
warring tribes combined with the environment, 00:18:43.280 |
all those millions of things and the meat eating, 00:18:46.480 |
which I get a lot of criticism because I don't know, 00:18:49.360 |
I don't know what your dietary processes are like, 00:18:55.040 |
which is, there's a large community of people who say, 00:19:02.720 |
There's probably an even larger community of people, 00:19:05.760 |
including my mom, who says it's a deeply unhealthy, 00:19:10.760 |
But you're right, these evolutionary arguments 00:19:12.980 |
can be flawed, but is there anything interesting 00:19:26.840 |
they make the point nicely that probably it was 00:19:30.320 |
a few random events that, yes, maybe it was someone 00:19:38.660 |
- That said, do you think, I don't know how to ask 00:19:55.420 |
Is it all useful to build robots that are on two feet 00:20:02.340 |
- I think the most, I mean, the reason I spent a long time 00:20:05.500 |
working on bipedal walking was because it was hard, 00:20:18.540 |
that you should start a company around bipeds 00:20:24.260 |
There are people that make pretty compelling arguments. 00:20:26.100 |
I think the most compelling one is that the world 00:20:28.940 |
is built for the human form, and if you want a robot 00:20:32.300 |
to work in the world we have today, then having a human form 00:20:38.260 |
There are places that a biped can go that would be hard 00:20:42.620 |
for other form factors to go, even natural places. 00:20:47.620 |
But at some point in the long run, we'll be building 00:21:11.820 |
What have you learned about human and robot movement 00:21:23.660 |
- Well, you know, it happened the other way, right? 00:21:25.640 |
So I was studying walking robots, and there's a great 00:21:30.740 |
conference called the Dynamic Walking Conference, 00:21:34.300 |
where it brings together both the biomechanics community 00:21:39.900 |
And so I had been going to this for years and hearing 00:21:54.060 |
The other thing I had going for me is actually that 00:21:55.820 |
I wasn't a runner before, and I learned to run 00:22:07.300 |
And I'm definitely, I'm a big fan of it for me, 00:22:11.020 |
but I'm not gonna, I tend to not try to convince 00:22:14.800 |
There's people who run beautifully with shoes on, 00:22:26.380 |
So I think it's just too easy to run 10 miles, 00:22:29.460 |
feel pretty good, and then you get home at night 00:22:35.500 |
If you take your shoes off, then if you hit hard 00:22:50.800 |
You have immediate feedback telling you that you've done 00:22:57.720 |
If I, right now, having run many miles barefoot, 00:23:09.540 |
And I think my goals for running are to do it 00:23:13.780 |
for as long as I can into old age, not to win any races. 00:23:18.780 |
And so for me, this is a way to protect myself. 00:23:23.440 |
- Yeah, I think, first of all, I've tried running barefoot 00:23:33.960 |
But just to understand, because I felt like I couldn't 00:23:40.880 |
And it feels like running, for me, and I think 00:23:44.720 |
for a lot of people, was one of those activities 00:23:47.600 |
that we do often and we never really try to learn 00:23:52.140 |
Like, it's funny, there's so many activities we do every day 00:24:00.320 |
I think a lot of us, at least me, probably have never 00:24:03.400 |
deeply studied how to properly brush my teeth, right? 00:24:10.680 |
We do it every day, but we haven't really studied, 00:24:17.120 |
that it was absurd not to study how to do correctly 00:24:20.240 |
'cause it's the source of so much pain and suffering. 00:24:25.720 |
I do it because I hate it, but I feel good afterwards. 00:24:30.600 |
how to do it properly, so that's where barefoot running 00:24:33.160 |
came in, and then I quickly realized that my gait 00:24:38.040 |
I was taking huge, like, steps and landing hard on the heel, 00:24:52.320 |
but I feel like it was 180 a minute or something like that. 00:25:07.720 |
It took a long time, and I feel like after a while, 00:25:16.000 |
I feel like barefoot is the legit way to do it. 00:25:35.920 |
and they're used to running long distances or running fast, 00:25:38.360 |
and they take their shoes off, and they hurt themselves 00:25:46.000 |
that I couldn't run very far when I first started trying. 00:25:54.360 |
actually, like, Aqua socks or something like this 00:26:00.400 |
- What's the difference between a minimal shoe 00:26:03.800 |
What's, like, feeling-wise, what does it feel like? 00:26:07.060 |
- There is a, I mean, I notice my gait changing, right? 00:26:10.040 |
So, I mean, your foot has as many muscles and sensors 00:26:23.240 |
And we stick our foot in a big, solid shoe, right? 00:26:26.040 |
So there's, I think, you know, when you're barefoot, 00:26:29.680 |
you're just giving yourself more proprioception. 00:26:33.280 |
And that's why you're more aware of some of the gait flaws 00:26:42.440 |
- I mean, yeah, so I think people who are afraid 00:26:45.200 |
of barefoot running, they're worried about getting cuts 00:26:51.600 |
I think those are all, like, very short-term. 00:26:56.560 |
If I blow out my knees, I'm done running forever. 00:26:58.260 |
So I will trade the short-term for the long-term, anytime. 00:27:04.800 |
to my wife's chagrin, your feet get tough, right? 00:27:29.560 |
There's probably more good books since I read them. 00:27:42.640 |
to describe running, barefoot running to somebody, 00:27:46.640 |
- So, you run pretty good distances, and you bike, 00:27:52.560 |
and is there, you know, if we talk about bucket list items, 00:27:57.560 |
is there something crazy on your bucket list, 00:28:02.820 |
- I mean, my commute is already a little crazy. 00:28:14.720 |
but you can find lots of different ways to get there. 00:28:16.660 |
So, I mean, I've run there for many years, I've biked there. 00:28:29.860 |
- Yeah, or with minimal shoes or whatever that-- 00:28:36.060 |
- It became kind of a game of how can I get to work. 00:28:38.400 |
I've rollerbladed, I've done all kinds of weird stuff, 00:28:45.040 |
So, I can put in a little rowboat, not so far from my house, 00:28:50.040 |
but the Charles River takes a long way to get to MIT. 00:28:56.400 |
And it's, you know, it's not about, I don't know, 00:29:05.800 |
But for me, it's just a magical time to think, 00:29:13.760 |
especially I'll wake up, do a lot of work in the morning, 00:29:16.220 |
and then I kind of have to just let that settle 00:29:20.700 |
And then on the way home, it's a great time to load it, 00:29:43.660 |
And like, if we look at the grand scheme of things, 00:30:14.620 |
and then there's this zen beauty of just running, 00:30:22.660 |
- I would say I'm not a fast runner, particularly. 00:30:39.840 |
I think work, you can find a work-life balance in that way. 00:30:54.400 |
and I rarely feel that those are really at odds. 00:31:21.660 |
- I'm not gonna take a dinghy across the Atlantic 00:31:24.820 |
but if someone does and wants to write a book, 00:31:31.140 |
No, I do have some fun things that I will try. 00:31:38.740 |
and then take it with me and bike to wherever I'm going. 00:31:42.380 |
or I'll take a stand-up paddleboard these days 00:32:00.140 |
So he's the person who made me do this stupid challenge. 00:32:05.140 |
So he's insane, and he does things for the purpose, 00:32:11.340 |
He does things like for the explicit purpose of suffering. 00:32:18.420 |
like whatever he thinks he can do, he does more. 00:32:37.840 |
You should end feeling better than you started. 00:32:44.540 |
and COVID has tested this 'cause I've lost my commute. 00:32:47.700 |
I think I'm perfectly happy walking around town 00:32:51.940 |
with my wife and kids if they could get them to go. 00:32:57.780 |
and getting away from the keyboard for some time 00:33:04.100 |
What to you is the most beautiful idea in robotics? 00:33:18.180 |
- I think I've been lucky to experience something 00:33:23.520 |
that not so many roboticists have experienced, 00:33:39.420 |
that some of the more mathematical control theory can bring 00:33:53.140 |
and I had a day even just a couple of weeks ago 00:33:57.900 |
where I had spent the day on a Zoom robotics conference 00:34:01.020 |
having great conversations with lots of people. 00:34:20.780 |
about maybes and what ifs and what a great idea, 00:34:29.340 |
about systems that aren't that much more simple 00:34:33.860 |
or abstract than the ones I care about deeply. 00:34:53.520 |
And so for instance, deep learning is amazing. 00:35:02.440 |
I think it's changed the world unquestionably. 00:35:10.600 |
So I think one of the challenges as an educator 00:35:13.420 |
is to think about how do we make sure people get a taste 00:35:17.120 |
of the more rigorous thinking that I think goes along 00:35:42.420 |
like really getting a deep understanding of that. 00:35:44.780 |
I mean, from physics, the first-principles thinking 00:35:47.280 |
comes from physics, and here it's literally physics. 00:36:07.040 |
There's a lot of folks that work on autonomous vehicles 00:36:16.720 |
I might be coming a little bit from the psychology side, 00:36:23.080 |
but I remember I spent a ridiculous number of hours 00:36:43.240 |
and I felt like, I had to record a lot of video to try 00:36:47.800 |
to extract their movement, how they move their head, and so on. 00:36:50.840 |
But like every time, I felt like I didn't understand enough. 00:36:58.600 |
what, how are people signaling to each other? 00:37:03.560 |
How cognizant are they of their fear of death? 00:37:07.780 |
Like, what's the underlying game theory here? 00:37:14.120 |
And then I finally found a live stream of an intersection 00:37:17.800 |
that's like high def that I just, I would watch, 00:37:28.760 |
- Not just because human, but I think the learning mantra 00:37:37.920 |
And for the example you gave of all the nuances 00:37:46.840 |
that are happening for these subtle interactions 00:37:54.560 |
Maybe even one level more meta than what you're saying. 00:37:59.560 |
For a particular problem, I think it might be the case 00:38:09.400 |
that is just an essential skill for a mathematician 00:38:13.120 |
or an engineer that I just don't wanna lose it. 00:38:31.560 |
are not immediately rewarded for going through 00:38:51.040 |
Do you have sort of a good example of rigorous thinking 00:38:56.040 |
where it's easy to get lazy and not do the rigorous thinking 00:39:02.480 |
do you have advice of how to practice rigorous thinking 00:39:14.720 |
- Yeah, I mean, there are times where problems 00:39:21.360 |
that can be solved with well-known mature methods 00:39:24.800 |
could also be solved with a deep learning approach 00:39:30.280 |
and there's an argument that you must use learning 00:39:42.420 |
and you've suddenly put a bottleneck in there 00:39:48.480 |
you know, I think we know how to do that pretty well, 00:39:57.840 |
- So in that sense, rigorous thinking is understanding 00:40:02.200 |
the scope and the limitations of the methods that we have, 00:40:07.200 |
like how to use the tools of mathematics properly. 00:40:10.120 |
- Yeah, I think, you know, taking a class on analysis 00:40:24.880 |
you know, it doesn't have to be the end all problem, 00:40:33.420 |
and I just want to make sure we keep preaching it. 00:40:36.360 |
- But do you think when you're doing like rigorous thinking 00:40:43.240 |
or sort of explicitly, like formally describe a system, 00:40:47.960 |
do you think we naturally simplify things too much? 00:40:53.980 |
Like in order to be able to understand something 00:41:04.480 |
- That's how you understand the fundamentals? 00:41:07.880 |
I think maybe even that's a key to intelligence 00:41:29.240 |
about having the simplest explanation for a phenomenon. 00:41:32.520 |
So I don't doubt that we can train neural networks 00:41:44.660 |
but I maybe, I want another Newton to come along 00:42:01.180 |
Let's not offend the AI systems from 50 years 00:42:27.260 |
that is arbitrarily complex may not be the end goal. 00:42:47.860 |
is to, the typical formulation is that you try 00:42:50.100 |
to find the minimal state dimension realization 00:43:10.840 |
The robot hand is picking up an object or something. 00:43:13.660 |
And when I write down the equations of motion for that, 00:43:23.420 |
because of the dynamics of the hand when it's moving, 00:43:33.300 |
but simple description of what's happening out here is fine. 00:43:49.000 |
with all the combinatorics that explodes there, 00:43:55.780 |
with a more intuitive physics way of thinking. 00:44:16.480 |
You run a large group and you have an important history 00:44:26.340 |
Can you maybe first say what is the DARPA Robotics Challenge 00:44:30.000 |
and then tell your story around it, your journey with it? 00:44:41.060 |
it came on the tails of the DARPA Grand Challenge 00:44:52.740 |
Gill Pratt was at DARPA and pitched a new challenge 00:45:09.220 |
This happened shortly after the Fukushima disaster in Japan. 00:45:14.740 |
And our challenge was motivated roughly by that 00:45:17.660 |
because that was a case where if we had had robots 00:45:22.700 |
there's a chance that we could have averted disaster. 00:45:26.580 |
And certainly after the, in the disaster response, 00:45:33.540 |
So in practice, what we ended up with was a grand challenge, 00:45:42.140 |
where Boston Dynamics was to make humanoid robots. 00:45:52.500 |
were competing first in a simulation challenge 00:45:56.740 |
to try to be one of the ones that wins the right 00:45:59.420 |
to work on one of the Boston Dynamics humanoids 00:46:10.340 |
so it was decided as humanoid robots early on. 00:46:18.460 |
or you could enter through the virtual robotics challenge 00:46:21.340 |
as a software team that would try to win the right 00:46:27.940 |
- Humanoid robots. - Yeah, it was a 400 pound 00:46:30.860 |
marvel, but a pretty big, scary looking robot. 00:46:30.860 |
I mean, it seems, you know, autonomous vehicles, 00:47:23.420 |
in the sense that you could have had something that, 00:47:26.840 |
as it was described in the call for participation, 00:47:32.700 |
on the dynamics of walking and not falling down 00:47:38.580 |
'cause the robot had to go into this disaster area 00:47:48.420 |
The challenge could have really highlighted perception 00:48:11.920 |
So what wasn't clear was how we would be able, 00:48:17.520 |
So the idea was always that you want semi-autonomy, 00:48:21.640 |
that you want the robot to have enough compute 00:48:24.280 |
that you can have a degraded network link to a human. 00:48:34.960 |
you'd be able to get a few bits back and forth, 00:49:17.660 |
maybe even on the technical side, on the team side, 00:49:23.000 |
from the early idea stages to actually competing. 00:49:28.200 |
- I mean, this was a defining experience for me. 00:49:30.600 |
It came at the right time for me in my career. 00:49:33.900 |
I had gotten tenure before I was due a sabbatical, 00:49:48.040 |
We had a bunch of algorithms that we were very happy with. 00:49:52.520 |
and this was a chance to really test our mettle, 00:50:12.060 |
anytime you're not sleeping and devoting your life 00:50:21.900 |
and so Atlas had to walk across cinder blocks. 00:50:41.560 |
do all the manual labor so that it can take its little 00:50:54.620 |
I remember, so we were using Gazebo as a simulator. 00:50:59.620 |
We were using Gazebo as a simulator on the cloud, 00:51:02.300 |
and there was all these interesting challenges. 00:51:12.220 |
they were pushing on the capabilities of Gazebo 00:51:16.020 |
in order to scale it to the complexity of these challenges. 00:51:34.860 |
And your controller will run on this computer, 00:51:43.020 |
Now, the physics, they wanted it to run at real-time rates, 00:51:48.020 |
because there was an element of human interaction. 00:52:06.540 |
the simulator wasn't quite at real-time rate. 00:52:24.980 |
and by the way, the perception folks on our team hated, 00:52:28.460 |
that they knew that if my controller was too slow, 00:52:32.500 |
And no matter how good their perception system was, 00:52:37.940 |
like three days before the virtual competition. 00:52:41.500 |
We're gonna either get a humanoid robot or we're not. 00:52:49.500 |
And I was just like, "Oh man, what are we gonna do here?" 00:53:06.820 |
he was a student at the time working on optimization. 00:53:17.340 |
Not like a little faster, it's actually, you know. 00:53:29.420 |
- So there's a really hard optimization problem 00:53:33.500 |
You didn't make the optimization problem simpler? 00:53:38.540 |
- So, I mean, your observation is almost spot on. 00:53:45.820 |
but we had not yet done this idea of warm starting. 00:53:57.940 |
to the optimization you're gonna solve with the next. 00:54:00.100 |
We, of course, had told our commercial solver 00:54:03.620 |
but even the interface to that commercial solver 00:54:15.420 |
We wrote a very lightweight, very fast layer, 00:54:18.500 |
which would basically check if nearby solutions 00:54:23.860 |
which were very easily checked, could stabilize the robot. 00:54:28.000 |
And if they couldn't, we would fall back to the solver. 00:54:37.380 |
if we, it got to the point where if for some reason 00:54:40.420 |
things slowed down and we fell back to the original solver, 00:54:42.820 |
the robot would actually literally fall down. 00:54:45.060 |
So it was a harrowing sort of ledge we were sort of on. 00:54:51.180 |
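As a rough illustration of the warm-starting idea described above, here is a hedged Python sketch: reuse the previous tick's solution when a cheap check says it still stabilizes the robot, and only fall back to the full optimizer when it does not. The functions `is_stabilizing` and `solve_balancing_qp` are hypothetical placeholders, not the team's actual code.

```python
def control_step(state, prev_solution, is_stabilizing, solve_balancing_qp):
    """One control tick with a warm-started fallback (sketch, not real code).

    is_stabilizing: cheap check that a candidate solution still satisfies the
        balancing constraints at the current state.
    solve_balancing_qp: the full optimization, seeded with an initial guess.
    """
    if prev_solution is not None and is_stabilizing(state, prev_solution):
        return prev_solution                      # fast path: reuse last answer
    # Slow path: call the full solver, warm-started from the previous solution.
    return solve_balancing_qp(state, initial_guess=prev_solution)
```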
But I mean, actually, like the 400 pound humanoid 00:55:19.400 |
but they have moments, at moments have worked harder 00:55:39.200 |
people that push themselves to like the limit, 00:55:41.380 |
they also seem to be like the most like full of life somehow. 00:55:57.960 |
"I'm doing what I love, I'm passionate about it. 00:56:04.840 |
I actually think, I don't think the lack of sleep 00:56:10.960 |
and lack of doing things that you're passionate about. 00:56:13.280 |
But in this world, yeah, I mean, can you comment about 00:56:31.920 |
I mean, I think the causality is not that we work hard. 00:56:36.440 |
And I think other disciplines work very hard too. 00:56:56.000 |
that you wanna spend all your time on, right? 00:57:05.480 |
on the approach that you took that's interesting 00:57:10.240 |
or a success that you carry into your work today 00:57:13.800 |
about all the different ideas that were involved 00:57:17.280 |
in making, whether in the simulation or in the real world, 00:57:25.520 |
- I mean, it really did teach me something fundamental 00:57:41.040 |
I think the autonomous driving community thinks about this. 00:57:43.720 |
I think lots of people thinking about safety critical 00:57:46.640 |
systems that might have machine learning in the loop 00:57:50.360 |
For me, the DARPA challenge was the moment where I realized, 00:57:54.240 |
we've spent every waking minute running this robot. 00:58:09.260 |
We only have one robot, it's running almost all the time. 00:58:19.400 |
And I think that, I mean, I would say that the team that won 00:58:28.040 |
and was able to do not only incredible engineering, 00:58:45.040 |
Like from start to finish, what is a loop of testing? 00:58:48.760 |
- Yeah, I mean, I think there's a whole philosophy 00:58:52.120 |
There's the unit tests, and you can do that on a hardware. 00:58:56.340 |
You write one function, you should write a test 00:58:58.240 |
that checks that function's input and outputs. 00:59:02.400 |
at the other extreme of running the whole system together, 00:59:05.280 |
where they try to turn on all of the different functions 00:59:17.320 |
as a humanoid robot, but the philosophy is sort of the same. 00:59:26.040 |
it's impossible to run the same experiment twice. 00:59:35.640 |
but you'd probably never be able to run exactly 00:59:39.400 |
And right now, I think our philosophy is just, 00:59:59.880 |
but really we're relying on somewhat random search 01:00:05.480 |
but I think, you know, 'cause there's an argument 01:00:10.520 |
are the things that are really nuanced in the world, 01:00:21.800 |
Like, so you said walking over rough terrain, 01:00:31.360 |
in a certain kind of way to watch these videos 01:01:00.160 |
I mean, what made the robots fall and fail in your view? 01:01:08.360 |
Our team contributed one of those spectacular falls. 01:01:10.960 |
Every one of those falls has a complicated story. 01:01:21.700 |
waiting for a green light to be able to proceed 01:01:30.120 |
but it wasn't because of bad software, right? 01:01:32.760 |
But for ours, so the hardest part of the challenge, 01:01:40.440 |
It was actually relatively easy to drive the Polaris. 01:01:51.200 |
I mean, the thing you've come up with is just brilliant, 01:02:19.560 |
of the, and have basically one foot in the passenger seat, 01:02:27.620 |
But the hard part was we had to then park the car, 01:02:34.320 |
but it's just getting up from crouch, from sitting 01:02:38.760 |
when you're in this very constrained environment. 01:02:41.920 |
- First of all, I remember after watching those videos, 01:02:51.280 |
Like it's actually a really difficult control problem. 01:02:55.120 |
- And I'm very cognizant of it when I'm like injured 01:03:08.160 |
and they have these checklists, pre-launch checklists 01:03:10.680 |
and they're like, we weren't far off from that. 01:03:12.380 |
We had this big checklist and on the first day 01:03:14.720 |
of the competition, we were running down our checklist. 01:03:28.120 |
And then we turned on our balancing controller. 01:03:30.840 |
And the nerves, jitters of the first day of the competition, 01:03:37.560 |
So we used a lot of motion planning to figure out 01:03:47.200 |
We relied heavily on our balancing controller. 01:03:50.300 |
And basically there were, when the robot was in one 01:03:53.760 |
of its most precarious sort of configurations, 01:04:00.920 |
the other controller that thought it was still driving 01:04:06.880 |
And that wasn't good, but it turned disastrous for us 01:04:11.880 |
because what happened was a little bit of push here. 01:04:17.000 |
Actually, we have videos of us running into the robot 01:04:21.100 |
with a 10 foot pole and it kind of will recover. 01:04:24.720 |
But this is a case where there's no space to recover. 01:04:27.840 |
So a lot of our secondary balancing mechanisms 01:04:32.200 |
they were all disabled because we were in the car 01:04:35.360 |
So we were relying on our just lowest level reflexes. 01:04:38.400 |
And even then, I think just hitting the foot on the seat, 01:04:42.200 |
on the floor, we probably could have recovered from it. 01:04:46.440 |
is when we did that and we jostled a little bit, 01:04:49.480 |
the tailbone of our robot was only a little off the seat, 01:04:54.600 |
And the other foot came off the ground just a little bit. 01:04:58.280 |
And nothing in our plans had ever told us what to do 01:05:02.320 |
if your butt's on the seat and your feet are in the air. 01:05:06.080 |
- And then the thing is, once you get off the script, 01:05:10.000 |
things can go very wrong because even our state estimation, 01:05:12.800 |
our system that was trying to collect all the data 01:05:20.120 |
So it was predicting things that were just wrong. 01:05:32.560 |
- That's true, we fell in and we got our point for egress. 01:05:35.480 |
- But so is there any hope for, that's interesting, 01:05:39.320 |
is there any hope for Atlas to be able to do something 01:05:43.320 |
when it's just on its butt and feet in the air? 01:05:53.880 |
Boston Dynamics and ANYmal and there's this incredible 01:05:58.880 |
work on legged robots happening around the world. 01:06:07.680 |
where you're making contact with the world at your feet. 01:06:10.160 |
And they have typically point feet relatively, 01:06:14.560 |
If those robots get in a situation where the elbow 01:06:21.320 |
Now they have layers of mechanisms that will make, 01:06:24.200 |
I think the more mature solutions have ways in which 01:06:31.320 |
But a human for instance, is able to leverage 01:06:34.840 |
incidental contact in order to accomplish a goal. 01:06:36.840 |
In fact, I might, if you push me, I might actually 01:06:38.920 |
put my hand out and make a brand new contact. 01:06:42.320 |
The feet of the robot are doing this on quadrupeds, 01:06:45.040 |
but we mostly in robotics are afraid of contact 01:06:58.120 |
And we write very complex algorithms so that the robot 01:07:00.760 |
can dance around and make sure it doesn't touch the world. 01:07:21.200 |
and I'm going to ask you some dumb questions. 01:07:33.280 |
whenever people learn that I do any kind of AI 01:07:36.920 |
or like I mentioned robots and things like that, 01:07:52.200 |
if you're in the air, that's a common position, 01:07:57.640 |
Like how difficult do you think is the problem 01:08:03.840 |
and when will we have a robot that can defeat a human 01:08:10.920 |
I don't know if you're familiar with wrestling, 01:08:19.680 |
It's like, 'cause you're picking contact points 01:08:31.440 |
It's like you make them feel like you're doing one thing 01:08:44.080 |
So it's basically the art of multiple contacts. 01:08:58.520 |
and are in a game theoretic situation with a human, 01:09:11.360 |
- Yeah, maybe even throwing the game theory out of it, 01:09:13.400 |
almost like a, yeah, almost like a non-dynamic opponent. 01:09:20.080 |
but I think our best understanding of those problems 01:09:23.940 |
I have been increasingly focused on manipulation, 01:09:35.800 |
And there are some really impressive examples 01:09:41.800 |
that can appear to do good things through contact. 01:09:46.800 |
We've even got new examples of deep learning models 01:09:56.200 |
But I think the challenge you just offered there 01:10:05.280 |
I have to think though, it's hard for humans too, 01:10:11.320 |
I think probably you had maybe a slow motion version 01:10:26.820 |
really on the fly, take a model of your humanoid 01:10:32.120 |
and figure out how to plan the optimal sequence. 01:10:36.680 |
- Well, I mean, one of the most amazing things to me 01:10:47.680 |
I think it's the most interesting study of contact. 01:10:53.200 |
Like when you get good at it, it's so effortless. 01:11:03.400 |
being essentially like learning how to move my body 01:11:07.640 |
in a way that I could throw very large weights around 01:11:18.500 |
Like I'm a huge believer in drilling of techniques. 01:11:26.760 |
you're learning it intellectually a little bit, 01:11:29.760 |
but a lot of it is the body learning it somehow, 01:11:33.880 |
And whatever that learning is, that's really, 01:11:40.760 |
to like a deep learning, learning a controller. 01:11:46.780 |
it feels like there's a lot of distributed learning 01:11:50.520 |
- Yeah, I think there's hierarchy and composition 01:12:03.960 |
and you have a system that's capable of planning a vacation 01:12:12.560 |
you probably don't have a controller or a policy 01:12:14.840 |
for every possible destination you'll ever pick, right? 01:12:18.740 |
But there's something magical in the in-between. 01:12:23.480 |
And how do you go from these low level feedback loops 01:12:26.400 |
to something that feels like a pretty complex 01:12:34.760 |
that you can plan at some of these levels, right? 01:12:37.640 |
So Josh Tenenbaum just showed it in his talk the other day. 01:12:43.360 |
I think he calls it the pick three game or something, 01:12:46.720 |
where he puts a bunch of clutter down in front of a person, 01:12:58.980 |
And apparently you pick three items and then you pick, 01:13:01.880 |
he says, "Okay, pick the first one up with your right hand, 01:13:10.160 |
Right, so that's down at the level of physics and mechanics 01:13:16.120 |
and contact mechanics that I think we do learning, 01:13:34.520 |
but most of the time we can sort of figure that out. 01:13:41.240 |
is this ability to be flexible with our models, 01:13:46.160 |
use our well-oiled controllers when we don't, 01:13:51.820 |
Having models, I think the other thing you just said 01:13:58.800 |
is even changing as you improve your expertise, right? 01:14:09.320 |
you get a more refined version of that model. 01:14:11.920 |
You're aware of muscles or balanced components 01:14:32.040 |
Let's see what else is in there as motivations. 01:14:38.040 |
and then a crash of confidence in the middle. 01:14:42.880 |
All of those seem to be essential for the learning process. 01:14:48.160 |
then you're probably optimizing energy efficiency. 01:14:53.120 |
So there was this idea that you would have robots play soccer 01:15:05.720 |
Basically, was the goal to beat world champion team, 01:15:19.600 |
there's an organization called UFC for mixed martial arts. 01:15:23.480 |
Are we gonna see a World Cup championship soccer team 01:15:28.480 |
or a UFC champion mixed martial artist that's a robot? 01:15:33.480 |
- I mean, it's very hard to say one thing is harder, 01:15:38.640 |
What probably matters is who started the organization. 01:15:45.040 |
I mean, I think RoboCup has a pretty serious following, 01:15:47.160 |
and there is a history now of people playing that game, 01:15:57.040 |
So if you want to have mixed martial arts compete, 01:16:00.960 |
you better start your organization now, right? 01:16:08.680 |
'cause they're both hard and they're both different. 01:16:14.680 |
Like, especially the humanoid robots trying to play soccer. 01:16:23.400 |
- I mean, I guess there is RoboSumo wrestling. 01:16:27.880 |
where they do have these robots that go on a table 01:16:33.720 |
- First of all, do you have a year in mind for RoboCup, 01:17:00.840 |
at a game that humans are really damn good at. 01:17:17.760 |
- You have to make the rules very carefully, right? 01:17:20.400 |
I mean, if Atlas kicked me in the shins, I'm down, 01:17:30.520 |
- I think the fighting one is a weird one, yeah, 01:17:40.440 |
I mean, as soon as there's contact or whatever, 01:17:43.480 |
and there are some things that the robot will do better. 01:17:46.520 |
I think if you really set yourself up to try to see 01:17:56.920 |
is to play very differently than a human would play. 01:17:59.640 |
You're not gonna get the perfect soccer player robot. 01:18:04.000 |
You're gonna get something that exploits the rules, 01:18:10.720 |
its super low bandwidth feedback loops or whatever, 01:18:16.960 |
And I bet there's ways, I bet there's loopholes, right? 01:18:35.600 |
I think this might be the last ridiculous question, but-- 01:18:41.080 |
- I aspire to ask as many ridiculous questions 01:18:48.040 |
Okay, I don't know if you've seen The Black Mirror. 01:18:58.840 |
because I gave a talk to some MIT faculty one day, 01:19:05.720 |
I was telling them about the state of robotics. 01:19:15.920 |
And there was a look of horror that went across the room. 01:19:19.280 |
And I said, "I've shown videos like this a lot of times. 01:19:26.800 |
this Black Mirror episode had changed the way people watched 01:19:34.720 |
So I talked to so many people who are just terrified 01:19:37.760 |
because of that episode probably of these kinds of robots. 01:19:40.960 |
I almost want to say that they almost enjoy being terrified. 01:19:44.520 |
I don't even know what it is about human psychology 01:19:49.240 |
the destruction of the universe or our society, 01:19:58.360 |
but it feels like they talk about it so often, 01:20:01.000 |
it almost, there does seem to be an addictive quality to it. 01:20:32.740 |
They basically have, it's basically a Spot with a gun. 01:20:32.740 |
You know, basically Spot that goes rogue for some reason 01:20:43.780 |
I think, I don't know exactly why people react 01:21:06.980 |
about the way robots influence our society and the like. 01:21:09.860 |
I think that's something, that's a responsibility 01:21:17.380 |
with a kitchen knife or a pellet gun right away. 01:21:20.340 |
And I mean, if they were programmed in such a way, 01:21:25.340 |
that all I had to do was run for five minutes 01:21:50.460 |
I probably would have been watching Astro Boy. 01:21:52.720 |
And there's a very different reaction to robots 01:22:02.600 |
or if it's something that we've done to ourselves 01:22:07.420 |
- Yeah, the stories we tell ourselves through movies, 01:22:21.520 |
"I'm really terrified that we're going to have these robots 01:22:29.380 |
Like, how do you approach making me feel better? 01:22:39.600 |
There's a, I think there's a video that went viral recently. 01:22:52.720 |
Atlas being hit with a broomstick or something like that. 01:23:04.600 |
It was like patrolling somewhere in some country. 01:23:11.880 |
this is the dystopian future, the surveillance state. 01:23:19.440 |
Something about Spot, being able to walk on four feet 01:23:19.440 |
I think wheeled robots could be used in bad ways too. 01:24:19.880 |
just because I'm compelled to build technology 01:24:23.520 |
But I would consider myself a technological optimist, 01:24:27.720 |
I guess, in the sense that I think we should continue 01:24:32.200 |
to create and evolve and our world will change. 01:24:49.360 |
- So it's interesting 'cause you didn't mention 01:24:55.880 |
I think people attribute a robot that looks like an animal 01:25:02.120 |
or consciousness or something that they don't have yet. 01:25:05.200 |
I think our ability to anthropomorphize those robots 01:25:14.920 |
a level of intelligence that they don't yet have 01:25:22.280 |
But there are many scary things in the world, right? 01:25:25.760 |
So I think we're right to ask those questions, 01:25:29.880 |
we're right to think about the implications of our work. 01:25:33.600 |
- Right, in the short term as we're working on it, for sure. 01:25:52.400 |
to a lot of folks talk about the existential threats 01:25:58.880 |
Oftentimes robots kind of inspire that the most 01:26:16.620 |
I think, and it's not the only answer he's given 01:26:18.900 |
over the years, but maybe one of my favorites is, 01:26:25.420 |
he's got a book, "Flesh and Machines," I believe. 01:26:27.820 |
It's not gonna be the robots versus the people. 01:26:46.360 |
People with amputations might have prosthetics. 01:26:50.880 |
That's a trend I think that is likely to continue. 01:27:01.380 |
But I mean, when do we get to cognitive implants 01:27:06.580 |
Yeah, with neural link, brain-computer interfaces. 01:27:10.300 |
So there's a dance between humans and robots. 01:27:23.380 |
because the robot will be part of us, essentially. 01:27:26.020 |
It'd be so intricately sort of part of our society. 01:27:29.860 |
- Yeah, and it might not even be implanted part of us, 01:27:37.220 |
- So in that sense, the smartphone is already the robot 01:27:57.880 |
They're human problems, essentially, it feels like. 01:28:03.420 |
with each other online is changing the value we put on 01:28:06.740 |
personal interaction, and that's a crazy big change 01:28:10.460 |
that's going to happen and has already been ripping 01:28:29.620 |
- I mean, I do want to just, as a kind of comment, 01:28:33.380 |
maybe you can comment about your just feelings 01:28:40.340 |
I think there's so many beautiful ideas in it. 01:28:45.340 |
or legged robots in general, I think they inspire people 01:28:50.340 |
curiosity and feelings in general, excitement 01:28:56.500 |
about engineering more than almost anything else 01:29:00.660 |
And I think that's such an exciting responsibility 01:29:06.860 |
And Boston Dynamics is riding that wave pretty damn well. 01:29:10.500 |
Like they found it, they've discovered that hunger 01:29:14.020 |
and curiosity in the people, and they're doing magic with it. 01:29:17.580 |
I don't care if they, I mean, I guess it's their company, 01:29:26.980 |
I mean, do you have thoughts about Boston Dynamics 01:29:39.020 |
I think Boston Dynamics is absolutely awesome. 01:29:50.740 |
I think, I just think it's a pinnacle of success 01:29:55.380 |
in robotics that is just one of the best things 01:30:09.580 |
How hard it is to make money with the robotics company. 01:30:13.060 |
Like iRobot like went through hell just to arrive at Roomba 01:30:13.060 |
And then there's so many home robotics companies 01:30:47.220 |
about the possibility of making money with robots? 01:30:50.300 |
- Oh, I think you can't just look at the failures. 01:30:53.940 |
You can, I mean, Boston Dynamics is a success. 01:31:01.140 |
I mean, this is the capitalist ecology or something, right? 01:31:05.380 |
I think you have many companies, you have many startups, 01:31:07.700 |
and they push each other forward, and many of them fail, 01:31:17.260 |
I don't know that, is robotics really that much worse? 01:31:22.300 |
Every time I read one of these, sometimes it's friends, 01:31:26.460 |
and I definitely wish it went better, went differently. 01:31:31.460 |
But I think it's healthy and good to have bursts of ideas, 01:31:40.660 |
If they are really aggressive, they should fail sometimes. 01:31:46.940 |
If you're succeeding at every problem you attempt, 01:31:50.780 |
then you're not choosing aggressively enough. 01:32:00.220 |
- Yeah, I mean, I have to dig up 75K right now. 01:32:04.140 |
- I mean, it's so cool that there's a price tag. 01:32:17.540 |
- I wonder what your kids would think about it. 01:32:24.420 |
would let my kid drive in one of their demos one time, 01:32:42.620 |
I'm not sure we understand from a control aspect 01:32:57.060 |
There's been a, psychologists have been studying it. 01:33:00.180 |
I think it's part, like, manipulating our mind 01:33:11.420 |
I think there's something that humans seem to do, 01:33:19.020 |
I think we are able to make very simple models 01:33:23.860 |
that assume a lot about the world very quickly, 01:33:27.820 |
and then it takes us a lot more time, like your wrestling. 01:33:33.740 |
and you were fairly functional as a complete wrestler, 01:33:39.380 |
So maybe it's natural that our first level of defense 01:33:48.140 |
in our existing models of how humans and animals behave. 01:33:52.500 |
And it's just, as you spend more time with it, 01:33:55.140 |
then you'll develop more sophisticated models 01:34:01.700 |
Can you say what does it take to control a robot? 01:34:04.460 |
Like, what is the control problem of a robot? 01:34:08.620 |
And in general, what is a robot in your view? 01:34:38.940 |
Like, my dishwasher at home is pretty sophisticated, 01:34:46.940 |
probably a couple of chips in there doing amazing things. 01:34:58.340 |
and doesn't get to celebrate its past successes. 01:35:10.860 |
but they don't, if you ask what are the successes 01:35:20.540 |
with some level of automation that fails frequently. 01:35:25.940 |
plus mechanical work and an unsolved problem. 01:35:39.780 |
- So there are many different types of robots. 01:35:54.660 |
is very different than what you need for an autonomous car 01:35:59.420 |
It's very different than what you need for a robot 01:36:00.980 |
that's gonna walk or pick things up with its hands. 01:36:10.500 |
you're doing more dynamic interactions with the world. 01:36:17.700 |
And the control problems there are beautiful. 01:36:21.660 |
I think contact is one thing that differentiates them 01:36:25.900 |
from many of the control problems we've solved classically. 01:36:29.180 |
Right, like modern control grew up stabilizing fighter jets 01:36:47.500 |
- So you mentioned contact, like what's contact? 01:36:50.660 |
- So an airplane is an extremely complex system 01:37:45.380 |
And even the way it touches or the order of contacts 01:37:48.060 |
can change the outcome in potentially unpredictable ways. 01:38:01.420 |
a lot of people will say that contact is hard in robotics, 01:38:11.980 |
So what is limiting is that when we think about our robots 01:38:21.340 |
we often make an assumption that objects are rigid. 01:38:56.940 |
my high school physics diagram of this system, 01:39:00.340 |
then there's a couple of things that I'm given 01:39:03.860 |
I know if the system, if the table is at rest, 01:39:16.460 |
by the forces that the ground is pushing on my table legs. 01:39:25.900 |
And since it can, it's a three-dimensional table, 01:39:35.460 |
If I have four legs on my table, four-legged table, 01:39:45.420 |
and they're set down and the table's not moving, 01:39:48.140 |
then the basic conservation laws don't tell me 01:39:56.700 |
that would still result in the table not moving. 01:40:03.980 |
But it gets funny now because if you think about friction, 01:40:06.880 |
what we think about with friction is our standard model says 01:40:12.100 |
the amount of force that the table will push back, 01:40:15.940 |
if I were to now try to push my table sideways, 01:40:24.060 |
So if I'm barely touching and I push, I'll slide, 01:40:27.220 |
but if I'm pushing more and I push, I'll slide less. 01:40:30.500 |
It's called Coulomb friction, is our standard model. 01:40:33.780 |
Now, if you don't know what the normal force is 01:40:38.880 |
then you don't know what the friction forces are gonna be. 01:41:04.300 |
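For reference, the Coulomb model being described bounds the tangential (friction) force by the normal force, and static equilibrium of a rigid four-legged table shows why the normal forces themselves are indeterminate. With leg normal forces $f_{n,i}$, leg positions $(x_i, y_i)$ relative to the center of mass, table mass $m$, and friction coefficient $\mu$:

```latex
\|f_t\| \le \mu\, f_n \qquad \text{(Coulomb friction)}
\\[4pt]
\sum_{i=1}^{4} f_{n,i} = m g,
\qquad
\sum_{i=1}^{4} x_i\, f_{n,i} = 0,
\qquad
\sum_{i=1}^{4} y_i\, f_{n,i} = 0
```

Three equilibrium equations in four unknown normal forces: the rigid-body model alone does not pin down how the weight is shared among the legs, so it also does not pin down how much friction each leg can provide.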
We still do that sometimes because soft contact 01:41:09.500 |
is potentially harder numerically or whatever, 01:41:17.060 |
But anyways, because of these kinds of paradoxes, 01:41:25.400 |
It becomes very hard to write the same kind of control laws 01:41:33.900 |
We haven't been as successful writing those controllers 01:41:37.920 |
- And so you don't know what's going to happen 01:41:39.680 |
at the point of contact, at the moment of contact. 01:41:47.920 |
I mean, instead of having a differential equation, 01:41:51.640 |
you end up with a differential inclusion, it's called. 01:42:00.520 |
and there's a set of things that could happen. 01:42:12.020 |
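In symbols, the set-valued dynamics being described replace the ordinary differential equation of classical control with a differential inclusion, where $F(x, u)$ denotes the set of velocities consistent with the contact laws:

```latex
\dot{x} = f(x, u)
\quad\longrightarrow\quad
\dot{x} \in F(x, u)
```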
they're not only not smooth, but this is discontinuous? 01:42:55.920 |
despite all the things that could possibly happen. 01:42:58.120 |
The world might want the table to go this way and this way, 01:43:01.560 |
that pushes a little bit more and pushes a little bit, 01:43:04.320 |
I can certainly make the table go in the direction I want. 01:43:12.160 |
And these discontinuities do change the control system 01:43:26.200 |
or parts of my body that are in contact or not, 01:43:31.880 |
I reason about them separately or differently, 01:43:34.680 |
and the combinatorics of that blow up, right? 01:43:41.440 |
all the possible contact configurations of my humanoid. 01:43:49.040 |
I have lots of degrees of freedom, lots of joints. 01:43:51.600 |
I've only been around for a handful of years. 01:44:08.320 |
every possible contact configuration that I'll ever be in, 01:44:12.160 |
that's probably not a problem I need to solve, right? 01:44:20.360 |
Just so we can enumerate, what are we talking about? 01:44:35.400 |
where the heel strikes and then the front toe strikes, 01:44:42.420 |
Those are each different contact configurations. 01:44:48.280 |
but I ended up with four different contact configurations. 01:44:51.380 |
Now, of course, my robot might actually have bumps on it 01:44:56.380 |
or other things, so it could be much more subtle than that. 01:45:03.120 |
interacting with the ground already in the plane 01:45:11.220 |
just before my right toe and things get subtle. 01:45:16.440 |
and I go to talk about just grabbing a water bottle, 01:45:26.680 |
that my hand came into contact with the bottle, 01:45:32.920 |
Any approach that we were able to get away with that 01:45:35.360 |
in walking because we mostly touched the ground 01:45:38.440 |
more than a small number of points, for instance, 01:45:40.920 |
and we haven't been able to get dexterous hands that way. 01:45:43.880 |
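To make the combinatorics concrete: one common way to count contact modes is to let each of $k$ candidate contact points be separated, sticking, or sliding, giving on the order of

```latex
3^{k}
```

possible modes; the heel-strike, flat-foot, toe-off example above already cycles through several of them with a single foot, and the count explodes once both feet, the hands, and incidental contacts are in play.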
- So you've mentioned that people think that contact 01:45:55.760 |
that robotic manipulation problem is really hard. 01:46:06.600 |
- So I think simulating contact is one aspect. 01:46:12.880 |
that one of the reasons that we have a limit in robotics 01:46:16.360 |
is because we do not simulate contact accurately 01:46:29.960 |
There are some things that are still hard, difficult, 01:46:35.980 |
But we actually, we know what the governing equations are. 01:46:40.980 |
They have some foibles, like this indeterminacy, 01:46:44.700 |
but we should be able to simulate them accurately. 01:46:47.200 |
We have incredible open source community in robotics, 01:46:51.440 |
but it actually just takes a professional engineering team 01:46:54.340 |
a lot of work to write a very good simulator like that. 01:46:57.740 |
- My word is, I believe you've written, Drake. 01:47:04.500 |
I certainly spent a lot of hours on it myself. 01:47:08.620 |
What does it take to create a simulation environment 01:47:19.580 |
- Right, so Drake is the simulator that I've been working on. 01:47:26.780 |
I don't like to think of Drake as just a simulator, 01:47:31.780 |
we write our perception systems a little bit in Drake, 01:47:40.820 |
- So it has optimization capabilities as well? 01:47:55.900 |
You can write linear programs, quadratic programs, 01:48:00.740 |
semi-definite programs, sums of squares programs, 01:48:07.940 |
and send them to whatever the right solver is, 01:48:09.780 |
for instance, and it provides a level of abstraction. 01:48:12.500 |
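As a concrete taste of that abstraction layer, here is a minimal sketch using Drake's Python bindings (pydrake); module paths have shifted slightly across Drake releases, so treat the exact imports as an assumption to check against the current documentation.

```python
from pydrake.solvers import MathematicalProgram, Solve

# Pose a tiny quadratic program through Drake's solver-agnostic interface.
prog = MathematicalProgram()
x = prog.NewContinuousVariables(2, "x")        # decision variables
prog.AddLinearConstraint(x[0] + x[1] == 1.0)   # a linear equality constraint
prog.AddQuadraticCost(x[0]**2 + x[1]**2)       # a quadratic cost
result = Solve(prog)                           # Drake dispatches to a suitable solver
print(result.is_success(), result.GetSolution(x))
```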
The second thing is a system modeling language, 01:48:20.900 |
where you can make block diagrams out of complex systems, 01:48:29.020 |
that are each doing some part of your system, 01:48:33.820 |
we try to write, if you write a Drake system, 01:48:43.020 |
If you have any state, for instance, in the system, 01:48:51.660 |
but the advantage of doing that is that you can, 01:48:57.480 |
but you can also do control design against it. 01:49:00.220 |
You can do, I mean, simple things like rewinding 01:49:03.100 |
and playing back your simulations, for instance. 01:49:16.900 |
because I think the complexity of Atlas, for instance, 01:49:26.940 |
huge fan of what it's done for the robotics community, 01:49:30.700 |
but the ability to rapidly put different pieces together 01:49:37.960 |
But I do think that it's hard to think clearly 01:49:48.180 |
And if you can, you know, ask a little bit more 01:49:54.180 |
then you can understand the way they work better. 01:50:02.660 |
And then one of those systems, the last thing, 01:50:09.620 |
is the implementation of multi-body equations, rigid-body equations, 01:50:12.520 |
that is trying to provide a system that simulates physics. 01:50:16.740 |
And that, we also have renderers and other things, 01:50:20.020 |
but I think the physics component of Drake is special 01:50:23.260 |
in the sense that we have done an excessive amount 01:50:31.540 |
Every possible tumbling satellite or spinning top 01:50:34.100 |
or anything that we could possibly write as a test 01:50:40.620 |
fundamental improvements on the way you simulate contact. 01:50:44.220 |
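For the physics side, a minimal hedged sketch of loading a model into Drake's multibody plant and rolling the simulation forward; the model file name here is a hypothetical placeholder, and parser method names may differ by Drake version.

```python
# Hedged sketch of simulating a multibody model with Drake's physics engine
# (pydrake); "my_robot.urdf" is a hypothetical placeholder.
from pydrake.multibody.plant import AddMultibodyPlantSceneGraph
from pydrake.multibody.parsing import Parser
from pydrake.systems.framework import DiagramBuilder
from pydrake.systems.analysis import Simulator

builder = DiagramBuilder()
# time_step > 0 selects the discrete contact solver used for contact-rich scenes.
plant, scene_graph = AddMultibodyPlantSceneGraph(builder, time_step=1e-3)
Parser(plant).AddModels("my_robot.urdf")   # hypothetical model file
plant.Finalize()

diagram = builder.Build()
simulator = Simulator(diagram)
simulator.AdvanceTo(2.0)                   # roll the physics forward two seconds
```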
- Just what does it take to simulate contact? 01:50:55.220 |
and you were tapping your fingers on the table 01:51:32.400 |
- So it was really interesting that what happened was that 01:51:48.640 |
I guess that's what I want to somehow fight against 01:51:52.780 |
and bring some of the clear thinking of controls 01:51:54.760 |
into these complex systems we're building for robots. 01:52:07.260 |
They put one of their, there's three locations. 01:52:17.520 |
as a part of my, the end of my sabbatical, I guess. 01:52:28.480 |
has made this investment in simulation in Drake. 01:52:34.480 |
of just absolutely top-notch dynamics experts 01:52:41.960 |
And there's also a team working on manipulation there 01:52:44.780 |
that is taking problems like loading the dishwasher 01:52:48.980 |
and we're using that to study these really hard corner cases 01:52:55.280 |
So for me, this, you know, simulating the dishes, 01:53:01.580 |
If we just cared about picking up dishes in the sink once, 01:53:12.140 |
what is the path you take to actually get to a robot 01:53:17.040 |
that could perform that for any dish in anybody's kitchen 01:53:23.280 |
that it could be a commercial product, right? 01:53:26.520 |
And it has deep learning perception in the loop. 01:53:36.340 |
and put it through this engineering discipline 01:53:49.840 |
that that's not something you throw over the fence 01:53:52.000 |
and hope that somebody will harden it for you. 01:53:59.820 |
- They're doing the validation and the testing. 01:54:02.320 |
- I think it might even change the way we have to think 01:54:08.980 |
What happens if you have the robot running lots of tests 01:54:23.600 |
or the same experiment twice on a real robot. 01:54:27.020 |
Do we have to be able to bring that one-off failure 01:54:31.520 |
back into simulation in order to change our controllers, 01:54:40.600 |
to our distribution and understand that on average, 01:54:45.920 |
There's like really subtle questions at the corner cases 01:54:49.960 |
that I think we don't yet have satisfying answers for. 01:54:57.160 |
do you think it's possible to create a systematized way 01:55:07.600 |
- Yes, I mean, I think we have to get better at that. 01:55:29.640 |
You could try to understand the uncertainty in your system, 01:55:36.480 |
the maximum information to reduce that uncertainty. 01:55:40.120 |
If there's a parameter you wanna learn about, 01:55:42.360 |
what is the optimal trajectory I could execute 01:55:47.640 |
Scaling that up to something that has a deep network 01:55:51.720 |
in the loop and a planning in the loop is tough. 01:56:00.280 |
we've worked on some falsification algorithms 01:56:15.840 |
that try to spend most of their time in the corner cases. 01:56:19.920 |
So you basically imagine you're building an autonomous car 01:56:33.340 |
which figure out where your controller's performing badly 01:56:39.720 |
You know, it's just the space of possible places 01:56:44.060 |
where that can be, where things can go wrong is very big. 01:56:55.800 |
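The specific algorithms aren't described here, but the core idea behind falsification can be sketched generically (this is an illustrative stand-in, not the method used at TRI): sample scenario parameters, simulate, score each rollout against a safety specification, and keep the worst performers as candidate corner cases.

```python
# Generic sketch of falsification by random search; the simulator and
# specification below are hypothetical placeholders.
import random

def falsify(simulate, sample_scenario, robustness, num_trials=10000, keep=20):
    """simulate(scenario) -> trajectory; robustness(trajectory) -> float,
    where a negative value means the safety specification was violated."""
    scored = []
    for _ in range(num_trials):
        scenario = sample_scenario()
        trajectory = simulate(scenario)
        scored.append((robustness(trajectory), scenario))
    scored.sort(key=lambda pair: pair[0])     # worst (most negative) first
    return scored[:keep]                      # candidate corner cases

# Toy usage with stand-ins for a real simulator and specification.
worst = falsify(
    simulate=lambda s: s,                                    # placeholder
    sample_scenario=lambda: {"pedestrian_speed": random.uniform(0, 3)},
    robustness=lambda traj: 1.5 - traj["pedestrian_speed"],  # placeholder spec
)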
- We joked and we call it the black swan generator. 01:57:00.080 |
- 'Cause you don't just want the rare events, 01:57:05.660 |
those are the most sort of profound questions 01:57:22.580 |
We're asking like, what are the millions of ways 01:57:50.800 |
- I think that's gonna be important for robots too. 01:58:04.760 |
I don't have enough time to visit all of my states. 01:58:11.600 |
But that's not actually the problem we have to solve. 01:58:24.160 |
they have enough data that their computational processes 01:58:34.200 |
I mean, all these dishwasher unloading robots. 01:58:42.600 |
and a human looking at the robot, probably pissed off. 01:58:51.200 |
I think one thing in terms of fleet learning, 01:58:54.520 |
and I've seen that because I've talked to a lot of folks, 01:59:08.160 |
is they really enjoy when a system improves, learns. 01:59:25.480 |
they almost enjoy being part of the teaching. 01:59:38.640 |
I mean, the problem on the Tesla side with cars, 02:00:06.120 |
but they're also interested in home robotics. 02:00:21.280 |
using AI and machine learning to discover new materials 02:00:24.400 |
for car batteries and the like, for instance. 02:00:27.920 |
Yeah, and that's actually been an incredibly successful team. 02:00:33.640 |
- Do you see a future of where like robots are in our home 02:00:55.960 |
We're going to just not even see them as robots. 02:00:58.680 |
But I mean, what's your vision of the home of the future? 02:01:02.480 |
10, 20 years from now, 50 years if you get crazy. 02:01:06.180 |
- Yeah, I think we already have Roombas cruising around. 02:01:16.200 |
It's only a matter of time till they spring arms 02:01:23.840 |
I think lots of people have lots of motivations 02:01:29.400 |
It's been super interesting actually learning 02:01:36.380 |
'Cause I think that's not necessarily the first entry, 02:02:08.360 |
- It's interesting 'cause older folks are also, 02:02:18.120 |
who are the most comfortable with technology, for example. 02:02:20.920 |
So there's a division that's interesting there 02:02:41.160 |
and the human is kind of learning about this new robot thing. 02:02:47.240 |
at least with, like when I talk to my parents about robots, 02:02:51.440 |
there's a little bit of a blank slate there too. 02:02:54.520 |
Like you can, I mean, they don't know anything about robotics 02:03:09.480 |
Here's a cool thing, like what can it do for me? 02:03:22.720 |
They kind of broke out and started in industry 02:03:29.120 |
It does feel like manipulation in logistics, of course, 02:03:48.560 |
If we can just linger on this whole touch thing. 02:04:26.640 |
In fact, hitting the table with your rigid arm 02:04:37.220 |
I think it fundamentally changes the mechanics of contact 02:04:57.280 |
I might have to put a lot of force to break the egg. 02:05:04.420 |
then I can distribute my force across the shell of the egg 02:05:34.520 |
But it really should make things better in some ways, right? 02:05:37.520 |
It's a, I think the future is unwritten there. 02:05:46.840 |
but I think ultimately it will make things simpler 02:05:55.720 |
So the result of small actions is less discontinuous, 02:06:02.320 |
but it also means potentially less, you know, 02:06:08.080 |
I won't necessarily contact something and send it flying off. 02:06:20.240 |
So if you change your hardware and make it more soft, 02:06:24.720 |
then you can potentially have a tactile sensor, 02:06:30.680 |
So there's a team at TRI that's working on soft hands, 02:06:38.160 |
If you can put a camera behind the skin roughly 02:06:52.840 |
is if you work super hard on your head-mounted, 02:06:55.080 |
on your perception system for your head-mounted cameras, 02:07:13.560 |
And really, if you don't have tactile information, 02:07:19.300 |
So it happens that soft robotics and tactile sensing 02:07:26.820 |
but you taught a course on under-actuated robotics. 02:07:40.380 |
- Right, so under-actuated robotics is my graduate course. 02:07:52.060 |
- Look on YouTube for the 2020 versions until March, 02:07:56.180 |
and then you have to go back to 2019, thanks to COVID. 02:08:46.500 |
So if you have a robot that's bolted to the table 02:08:49.540 |
that has five degrees of freedom, and five motors, 02:09:07.780 |
He said, "Russ, if you had more research funding, 02:09:30.180 |
But still, there's a really important degree of freedom 02:09:42.740 |
and there's no motor that connects my center of mass 02:09:51.980 |
The passive dynamic walkers are the extreme view of that, 02:10:00.260 |
But it shows up in all of the walking robots, 02:10:09.180 |
- That's referring to walking if you're falling forward. 02:10:13.420 |
Is there a way to walk that's fully actuated? 02:10:18.620 |
When you're in contact, and you have your feet 02:10:22.340 |
on the ground, there are still limits to what you can do. 02:10:36.180 |
But I can still do most of the things that I want to. 02:10:43.700 |
unless you suddenly needed to accelerate down super fast. 02:10:59.220 |
I think you have to embrace the under-actuated dynamics. 02:11:06.940 |
Even if my arm is fully actuated, I have a motor, 02:11:16.020 |
then I don't have an actuator for that directly. 02:11:31.340 |
I suddenly, that's a hard question to think about. 02:11:36.020 |
as thinking about my state-space control ideas. 02:11:41.780 |
that make me so excited about manipulation right now, 02:11:46.980 |
it breaks a lot of the foundational control stuff 02:11:51.500 |
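For reference, the way the course notes usually state this (a sketch in standard notation, not a quote from the conversation): write the manipulator equations with an input matrix B, and call the system underactuated when B cannot produce an arbitrary instantaneous acceleration.

```latex
% Sketch of the standard manipulator-equation form used to define
% underactuation (notation as in the Underactuated Robotics course notes).
\[
  M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} = \tau_g(q) + B\,u
\]
% The system is fully actuated in a state (q, \dot{q}) if it can command an
% arbitrary acceleration \ddot{q}, i.e. if B has full row rank; it is
% underactuated otherwise (e.g. a humanoid's floating base has no motor
% acting on it directly, so rank(B) < dim(q)).
```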
- What are some interesting insights you could say 02:12:08.380 |
The technical approach has been optimization. 02:12:12.140 |
So you typically formulate your decision-making for control 02:12:19.340 |
and sometimes often numerical optimal control 02:12:40.860 |
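A hedged sketch of the generic trajectory-optimization problem that "decision-making as optimization" usually refers to, in standard notation:

```latex
% Generic continuous-time optimal control problem, later transcribed into a
% finite-dimensional program for a numerical solver.
\[
  \min_{x(\cdot),\,u(\cdot)} \;\; \ell_f\big(x(T)\big) + \int_0^T \ell\big(x(t), u(t)\big)\,dt
  \quad \text{s.t.} \quad \dot{x}(t) = f\big(x(t), u(t)\big),\;\; x(0) = x_0,
\]
% plus any input limits, contact constraints, or obstacle constraints.
```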
- And there's some, so in under-actuated systems, 02:12:44.020 |
when you say let physics do some of the work, 02:12:50.300 |
that observes the state that the physics brought you to. 02:13:00.260 |
Do you ever loop in like complicated perception systems 02:13:06.860 |
- Right, right around the time of the DARPA challenge, 02:13:27.420 |
and we had, it was a really fun team to work on. 02:13:30.700 |
He's carried it farther, much farther forward since then. 02:13:37.540 |
That was, at the time, we felt like LIDAR was too heavy 02:13:41.740 |
and too power heavy to be carried on a light UAV, 02:14:00.660 |
so the deep learning revolution unquestionably changed 02:14:04.100 |
what we can do with perception for robotics and control. 02:14:10.940 |
we can use perception in, I think, a much deeper way. 02:14:22.900 |
to look at the cameras and produce the state, 02:14:25.940 |
which is like the pose of my thing, for instance. 02:14:30.420 |
that that's not always the right thing to do. 02:14:40.860 |
- If the first step of me trying to button my shirt 02:15:06.940 |
it begs new questions about state representation. 02:15:10.300 |
Another example that we've been playing with in lab 02:15:13.140 |
has been just the idea of chopping onions, okay? 02:15:33.220 |
- If I'm moving around a particular object, right? 02:15:36.020 |
oh, it's got a position or an orientation in space, 02:15:46.740 |
Does my control system really need to understand 02:15:51.780 |
and even the shape of the 100 pieces of onion 02:15:58.820 |
more and more is my state space getting bigger as I cut? 02:16:17.180 |
There is a proper Lagrangian state of the system, 02:16:28.540 |
But there's some different state representation. 02:16:34.980 |
- And that's what I worry about saying compressed 02:16:37.220 |
because I don't mind that it's low dimensional or not, 02:16:41.420 |
but it has to be something that's easier to think about. 02:16:49.300 |
- Or the algorithms being like control, optimal control. 02:17:01.940 |
you can give me a high dimensional state representation, 02:17:06.820 |
But if I have to think about all the possible 02:17:22.780 |
the idea of underactuated as really compelling 02:17:32.420 |
or you talk about the world with people in it in general, 02:17:35.460 |
do you see the world as basically an underactuated system? 02:17:57.260 |
we already have natural tools to deal with it. 02:18:02.380 |
I mean, in linear systems, it's not a problem. 02:18:10.780 |
It's a subtle point about when that becomes a bottleneck 02:18:18.780 |
although we've gotten incredibly good solutions now. 02:18:23.740 |
I felt that that was the key bottleneck in Legged Robots. 02:18:27.060 |
And roughly now, the underactuated course is, 02:18:43.540 |
And that's where we get into now more of the, 02:18:47.100 |
I'm hoping to put it online this fall completely. 02:18:59.220 |
And the thing that's a little bit sad is that, 02:19:03.980 |
for me at least, is there's a lot of manipulation tasks 02:19:09.220 |
They could start a company with it and be very successful 02:19:12.700 |
that don't actually require you to think that much 02:19:19.980 |
Once I have, if I reach out and grab something, 02:19:23.060 |
if I can sort of assume it's rigidly attached to my hand, 02:19:28.740 |
without really ever thinking about the dynamics 02:19:40.900 |
But I think the really good problems in manipulation, 02:19:49.740 |
That's like, a lot of people think of that, just grasping. 02:19:53.020 |
I don't mean that, I mean buttoning my shirt. 02:20:00.300 |
And not just one shoe, but every shoe, right? 02:20:08.260 |
like the infinite dimensional state of the laces. 02:20:11.900 |
That's probably not needed to write a good controller. 02:20:16.140 |
I know we could hand design a controller that would do it, 02:20:26.460 |
But I think if we can stay pure in our approach, 02:20:37.180 |
I mean, and the soft touch comes into play there. 02:20:42.180 |
Let me ask another ridiculous question on this topic. 02:20:54.780 |
where like I think you can fall in love with a robot 02:21:07.780 |
So in terms of robots, you know, connecting with humans, 02:21:18.660 |
in terms of like a deep, meaningful connection, like love, 02:21:21.820 |
but even just like collaborating in an interesting way, 02:21:26.900 |
From an engineering perspective and a philosophical one. 02:21:43.340 |
while they're doing meaningful mechanical work 02:21:46.420 |
in the close contact with or vicinity of people 02:21:52.260 |
that need help, I think we have to have them, 02:21:57.540 |
They have to not be afraid of touching the world. 02:21:57.540 |
and the concept of Baymax, that's just awesome. 02:22:08.740 |
I think we should, and we have some folks at Toyota 02:22:16.900 |
And I think it's just a fantastically good project. 02:22:21.900 |
I think it will change the way people physically interact. 02:22:25.660 |
The same way, I mean, you gave a couple examples earlier, 02:22:28.020 |
but if the robot that was walking around my home 02:22:36.020 |
that could change completely the way people perceive it 02:22:39.860 |
And maybe they'll even wanna teach it, like you said. 02:22:50.100 |
and looking at it as if it's not doing as well as a human, 02:22:54.380 |
they're gonna try to help out the cute teddy bear. 02:23:01.260 |
and being more soft and more contact is important. 02:23:12.380 |
Well, first of all, just visiting your lab and seeing Atlas, 02:23:31.500 |
I kinda like the idea that it's a her or him. 02:23:35.820 |
It's a magical moment, but there's no touching. 02:23:38.820 |
I guess the question I have, have you ever been, 02:23:53.020 |
that you've forgotten that a robot is a robot? 02:24:00.860 |
and for a second you forgot that it's not human? 02:24:04.940 |
- I mean, I think when you're in on the details, 02:24:07.860 |
then we of course anthropomorphized our work with Atlas, 02:24:17.140 |
I think we were pretty aware of it as a machine 02:24:21.860 |
I actually, I worry more about the smaller robots 02:24:26.300 |
that could still move quickly if programmed wrong 02:24:29.580 |
and we have to be careful actually about safety 02:24:36.420 |
I think then a lot of those concerns could go away. 02:24:41.300 |
We're seeing the lower cost, lighter weight arms now 02:25:22.500 |
convinced him that actually he needed to study 02:25:33.540 |
I think it's a little bit also of the passive 02:25:48.820 |
And you can learn a lot more than a passive observer. 02:25:51.660 |
So you can, in dialogue, that was your initial example, 02:25:56.940 |
you could have an active experiment exchange. 02:26:00.060 |
But I think if you're just a camera watching YouTube, 02:26:05.860 |
than if you're a robot that can apply force and touch. 02:26:15.500 |
- Yeah, I think it's just an exciting area of research. 02:26:27.780 |
it feels like such a rich opportunity to explore touch. 02:26:39.700 |
but the next step is like a real human connection. 02:26:55.580 |
- I don't know, you might disagree with this, 02:26:58.220 |
'cause I think it's important to see robots as tools often, 02:27:06.140 |
I think they're just always going to be more effective 02:27:10.220 |
It's convenient now to think of them as tools 02:27:16.220 |
but I think ultimately to create a good experience 02:27:43.100 |
- It's not my area, but I am also a big believer. 02:27:48.100 |
- Do you have an emotional connection to Atlas? 02:28:03.460 |
a different science project that I'd worked on super hard. 02:28:11.300 |
we basically had to do heart surgery on the robot 02:28:14.820 |
in the final competition 'cause we melted the core. 02:28:17.500 |
Yeah, there was something about watching that robot 02:28:23.180 |
hanging there, we know we had to compete with it 02:28:24.940 |
in an hour and it was getting its guts ripped out. 02:28:30.540 |
I think if you look back like 100 years from now, 02:28:32.980 |
yeah, I think those are important moments in robotics. 02:28:54.060 |
So I think a lot of people look at a brilliant person 02:29:05.180 |
Is there maybe three books, technical, fiction, 02:29:09.600 |
philosophical that had a big impact on your life 02:29:13.860 |
that you would recommend perhaps others reading? 02:29:16.760 |
- Yeah, so I actually didn't read that much as a kid, 02:29:24.380 |
There are some recent books that if you're interested 02:29:28.860 |
in this kind of topic, like "AI Superpowers" by Kai-Fu Lee 02:29:33.860 |
is just a fantastic read, you must read that. 02:29:36.940 |
Yuval Harari, I think that can open your mind. 02:30:02.220 |
a more controversial recommendation I could give. 02:30:08.740 |
- In some sense, it's so classical it might surprise you. 02:30:11.580 |
But I actually recently read Mortimer Adler's 02:30:29.020 |
boy, we're just inundated with research papers 02:30:33.820 |
that you could read on archive with limited peer review 02:30:50.060 |
a really good book or a really good paper if you find it, 02:30:53.880 |
the attitude, the realization that you're only gonna find 02:30:58.520 |
But then once you find them, you should just dig in 02:31:17.260 |
I read it at the right time where I was just feeling 02:31:21.420 |
just overwhelmed with really low quality stuff, I guess. 02:31:26.260 |
And similarly, I'm giving more than three now. 02:31:57.540 |
of studying a particular paper essentially, set of papers. 02:32:01.300 |
- Yeah, I think when you really find something, 02:32:07.740 |
might not be the same book that resonates with me, 02:32:10.400 |
but when you really find one that resonates with you, 02:32:15.180 |
and that's what I loved that Adler was saying, 02:32:29.200 |
a really good book is a dialogue between you and the author, 02:32:58.860 |
so Isaac Asimov, great science fiction writer, 02:33:08.720 |
He was passionate about explaining things, right? 02:33:12.700 |
He wrote all kinds of books on all kinds of topics 02:33:14.600 |
in science, he was known as the great explainer. 02:33:36.220 |
because of the way I've been given the opportunity 02:33:39.180 |
to teach them at MIT, and have questions asked, 02:33:43.540 |
the fear of the lecture, the experience of the lecture 02:33:50.200 |
just forces me to be rock solid on these ideas 02:33:55.040 |
I don't know I would be in a different intellectual space. 02:34:09.140 |
I do think that something's changed right now, 02:34:12.820 |
which is, right now we're giving lectures over Zoom, 02:34:16.900 |
I mean, giving seminars over Zoom and everything. 02:34:19.300 |
I'm trying to figure out, I think it's a new medium. 02:34:24.380 |
- Do you think it's-- - I'm trying to figure out 02:34:34.500 |
about the human to human connection over that medium, 02:34:39.500 |
but I think that's because it hasn't been explored fully, 02:34:45.820 |
- Every lecture is a, I'm sorry, every seminar even, 02:34:55.020 |
I can deliver content directly into your browser. 02:35:08.700 |
at least a powerful enough laptop or something 02:35:19.980 |
And I think robotics can potentially benefit a lot 02:35:25.340 |
We'll see, it's gonna be an experiment this fall. 02:35:44.980 |
are still listening to now, which is crazy to me. 02:35:48.020 |
There's a patience and an interest in long-form content, 02:36:06.260 |
but like just a really beautifully condensed an idea. 02:36:25.940 |
they don't get to go back and edit out parts, 02:36:47.940 |
'cause I edit other people's lectures to extract clips. 02:36:50.660 |
It's like there's certain tangents that are like, 02:36:57.260 |
they're not clarifying, they're not helpful at all. 02:36:59.900 |
And once you remove them, it's just, I don't know. 02:37:05.980 |
- Yeah, it takes, it depends like what is teaching, 02:37:33.380 |
- So do you find that, I know you segment your podcast. 02:37:44.260 |
- Yeah, we're talking about this topic, whatever. 02:37:53.860 |
So I'm learning, like I'm afraid of conversation. 02:37:56.460 |
This is even today, I'm terrified of talking to you. 02:37:59.180 |
I mean, it's something I'm trying to remove from myself. 02:38:03.540 |
A guy, I mean, I learned from a lot of people, 02:38:10.780 |
who's been inspirational to me in terms of conversation. 02:38:20.540 |
Being able to just have fun and enjoy themselves 02:38:32.860 |
but mostly just to enjoy yourself in conversations. 02:38:46.300 |
- So that's, people love just regular conversation. 02:38:49.500 |
That's what they, the structure is like, whatever. 02:38:58.740 |
there's probably a couple thousand PhD students 02:39:07.020 |
And they might know what we're talking about, 02:39:09.580 |
but there is somebody, I guarantee you right now in Russia, 02:39:14.580 |
some kid who's just like, who's just smoked some weed, 02:39:23.820 |
he kind of watched some Boston Dynamics videos. 02:39:30.260 |
No, but just like, there's so much variety of people 02:39:45.980 |
but I also often notice it inspires people to, 02:39:51.860 |
in their undergraduate studies trying to figure out what, 02:40:13.380 |
Like what advice would you give to a young person about life? 02:40:43.940 |
If they come up to you, what would you tell them? 02:40:48.020 |
- I think it's an interesting time to be a kid these days. 02:40:57.220 |
sort of a winner take all economy and the like. 02:40:59.340 |
I think the people that will really excel, in my opinion, 02:41:04.340 |
are gonna be the ones that can think deeply about problems. 02:41:13.980 |
and use the internet for everything it's good for 02:41:16.700 |
And I think a lot of people will develop those skills. 02:41:29.060 |
and they can think very deeply and critically. 02:41:34.980 |
I think one path to learning that is through mathematics, 02:41:39.520 |
I would encourage people to start math early. 02:41:46.880 |
I mean, I was always in the better math classes 02:41:51.300 |
but I wasn't pursuing super advanced mathematics 02:41:59.000 |
and really started the life that I'm living now. 02:42:10.600 |
really understand things, building things too. 02:42:12.460 |
I mean, pull things apart, put them back together. 02:43:18.300 |
So it connects to me to the entirety of the history 02:43:23.220 |
and to the love and hatred and suffering that went on there. 02:43:46.060 |
I don't know why, it just connects and inspires. 02:43:53.820 |
and algorithms that just, yeah, you return to often. 02:44:02.900 |
And I've been losing that because of the internet. 02:44:07.260 |
I've been going on archive and blog posts and GitHub 02:44:31.180 |
at least my current life began when I got to MIT. 02:44:52.260 |
But I had taken a computer engineering degree at Michigan. 02:44:57.260 |
I enjoyed it immensely, learned a bunch of stuff. 02:45:03.560 |
But when I did get to MIT and started working 02:45:17.220 |
It demanded more of me, certainly mathematically 02:45:22.660 |
And I remember the day that I borrowed one of the books 02:45:40.220 |
I think, I expected you to ask me the meaning of life. 02:45:45.220 |
I think that the, somehow I think that's gotta be part of it. 02:46:01.700 |
I mean, I was working hard, but I was loving it. 02:46:06.700 |
I think there's this magical thing where you, 02:46:09.480 |
I'm lucky to surround myself with people that basically, 02:46:17.860 |
I'll be told something or something that I realize, 02:46:30.300 |
And I think that is just such an important aspect 02:46:35.300 |
and being willing to understand what you can and can't do 02:46:47.300 |
- I don't think there's a better way to end it, Russ. 02:46:51.460 |
You've been an inspiration to me since I showed up at MIT. 02:46:55.540 |
Your work has been an inspiration to the world. 02:47:14.140 |
Magic Spoon Cereal, BetterHelp and ExpressVPN. 02:47:32.780 |
Click the links, buy the stuff, get the discount. 02:47:36.220 |
It really is the best way to support this podcast. 02:47:39.380 |
If you enjoy this thing, subscribe on YouTube, 02:47:43.680 |
support it on Patreon or connect with me on Twitter 02:47:46.540 |
at Lex Friedman spelled somehow without the E, 02:48:12.240 |
"But nobody's ever given a parade for a robot. 02:48:14.980 |
"Nobody's ever named a high school after a robot. 02:48:20.060 |
"I have to recognize the elements of exploration 02:48:24.060 |
"It's not only the discoveries and the beautiful photos 02:48:28.900 |
"It's the vicarious participation in discovery itself." 02:48:33.080 |
Thank you for listening and hope to see you next time.