
Rodney Brooks: Robotics | Lex Fridman Podcast #217


Chapters

0:00 Introduction
1:31 First robots
22:56 Brains and computers
55:45 Self-driving cars
75:55 Believing in the impossible
86:45 Predictions
97:47 iRobot
125:09 Sharing an office with AI experts
137:19 Advice for young people
141:05 Meaning of life


00:00:00.000 | The following is a conversation with Rodney Brooks,
00:00:03.200 | one of the greatest roboticists in history.
00:00:06.440 | He led the computer science
00:00:08.000 | and artificial intelligence laboratory at MIT,
00:00:10.640 | then co-founded iRobot,
00:00:13.160 | which is one of the most successful robotics companies ever.
00:00:16.520 | Then he co-founded Rethink Robotics
00:00:19.120 | that created some amazing collaborative robots
00:00:21.440 | like Baxter and Sawyer.
00:00:23.840 | Finally, he co-founded Robust.ai,
00:00:27.360 | whose mission is to teach robots common sense,
00:00:30.680 | which is a lot harder than it sounds.
00:00:33.800 | To support this podcast,
00:00:35.240 | please check out our sponsors in the description.
00:00:38.320 | As a side note, let me say that Rodney
00:00:40.400 | is someone I've looked up to for many years
00:00:43.160 | in my now over two decade journey in robotics,
00:00:46.440 | because one, he's a legit great engineer
00:00:50.600 | of real world systems,
00:00:52.160 | and two, he's not afraid to state controversial opinions
00:00:55.360 | that challenge the way we see the AI world.
00:00:59.080 | But of course, while I agree with him
00:01:01.240 | on some of his critical views of AI,
00:01:03.520 | I don't agree with some others,
00:01:05.640 | and he's fully supportive of such disagreement.
00:01:08.640 | Nobody ever built anything great by being fully agreeable.
00:01:13.600 | There's always respect and love behind our interactions,
00:01:16.760 | and when a conversation is recorded
00:01:18.560 | like it was for this podcast,
00:01:20.240 | I think a little bit of disagreement is fun.
00:01:24.000 | This is the Lex Fridman Podcast,
00:01:26.240 | and here is my conversation with Rodney Brooks.
00:01:29.760 | What is the most amazing or beautiful robot
00:01:35.320 | that you've ever had the chance to work with?
00:01:37.760 | - I think it was Domo,
00:01:39.320 | which was made by one of my grad students, Aaron Edsinger.
00:01:42.680 | It now sits in Daniela Rus's office,
00:01:45.920 | director of CSAIL,
00:01:47.600 | and it was just a beautiful robot,
00:01:49.520 | and Aaron was really clever.
00:01:51.800 | He didn't give me a budget ahead of time.
00:01:54.280 | He didn't tell me what he was gonna do.
00:01:56.440 | He just started spending money.
00:01:58.120 | He spent a lot of money.
00:02:00.000 | He and Jeff Weber, who is a mechanical engineer
00:02:03.520 | who Aaron insisted he bring with him
00:02:06.160 | when he became a grad student,
00:02:07.840 | built this beautiful, gorgeous robot, Domo,
00:02:09.960 | which is an upper torso, humanoid,
00:02:12.560 | two arms with three-fingered hands,
00:02:19.360 | and face, eyeballs, all, not the eyeballs,
00:02:24.280 | but everything else, series elastic actuators.
00:02:27.120 | You can interact with it, cable-driven.
00:02:30.200 | All the motors are inside, and it's just gorgeous.
00:02:33.840 | - The eyeballs are actuated, too, or no?
00:02:35.880 | - Oh, yeah, the eyeballs are actuated with cameras,
00:02:37.600 | and so it had a visual attention mechanism,
00:02:40.360 | looking when people came in,
00:02:42.920 | and looking in their face, and talking with them.
00:02:46.320 | - Why was it amazing?
00:02:48.160 | - The beauty of it.
00:02:49.080 | - You said what was the most beautiful?
00:02:50.640 | - The beauty, what is the most beautiful?
00:02:52.320 | - It's just mechanically gorgeous.
00:02:53.960 | As everything Aaron builds
00:02:55.800 | has always been mechanically gorgeous.
00:02:57.680 | It's just exquisite in the detail.
00:03:00.440 | - Oh, we're talking about mechanically,
00:03:01.920 | like literally the amount of actuators, the--
00:03:04.840 | - The actuators, the cables,
00:03:06.680 | he anodizes different parts, different colors,
00:03:10.160 | and it just looks like a work of art.
00:03:13.400 | - What about the face?
00:03:14.800 | Do you find the face beautiful in robots?
00:03:17.920 | - When you make a robot, it's making a promise
00:03:21.240 | for how well it will be able to interact,
00:03:23.280 | so I always encourage my students not to overpromise.
00:03:27.760 | - Even with its essence, like the thing it presents,
00:03:30.800 | it should not overpromise.
00:03:31.960 | - Yeah, so the joke I make, which I think you'll get,
00:03:34.800 | is if your robot looks like Albert Einstein,
00:03:37.320 | it should be as smart as Albert Einstein.
00:03:39.560 | So the only thing in Domo's face is the eyeballs,
00:03:44.560 | 'cause that's all it can do.
00:03:47.640 | It can look at you and pay attention.
00:03:49.480 | So there is no, it's not like one of those Japanese robots
00:03:56.120 | that looks exactly like a person at all.
00:03:59.000 | - But see, the thing is, us humans and dogs too,
00:04:02.680 | don't just use eyes as attentional mechanisms.
00:04:06.920 | They also use it to communicate,
00:04:08.760 | as part of the communication.
00:04:10.200 | Like a dog can look at you, look at another thing,
00:04:12.360 | and look back at you, and that designates
00:04:14.520 | that we're going to be looking at that thing together.
00:04:16.280 | - Yeah, or intent.
00:04:17.640 | You know, on both Baxter and Sawyer at Rethink Robotics,
00:04:21.920 | they had a screen with graphic eyes,
00:04:26.200 | so it wasn't actually where the cameras were pointing,
00:04:28.720 | but the eyes would look in the direction
00:04:31.920 | it was about to move its arm,
00:04:33.200 | so people in the factory nearby were not surprised
00:04:36.120 | by its motions, 'cause it gave that intent away.
00:04:40.640 | - Before we talk about Baxter,
00:04:41.960 | which I think is a beautiful robot,
00:04:44.520 | let's go back to the beginning.
00:04:45.880 | When did you first fall in love with robotics?
00:04:49.360 | We're talking about beauty and love,
00:04:50.680 | to open the conversation, this is great.
00:04:53.360 | - I've got these, I was born in the end of 1954,
00:04:56.920 | and I grew up in Adelaide, South Australia.
00:04:59.680 | And I have these two books that are dated 1961,
00:05:04.680 | so I'm guessing my mother found them in a store in '62 or '63,
00:05:09.320 | How and Why Wonder Books.
00:05:10.780 | How and Why Wonder Book of Electricity,
00:05:13.920 | and How and Why Wonder Book of Giant Brains and Robots.
00:05:18.560 | And I learned how to build circuits,
00:05:23.360 | when I was eight or nine, simple circuits,
00:05:25.360 | and I read, learned the binary system,
00:05:28.280 | and saw all these drawings mostly of robots,
00:05:33.280 | and then I tried to build them for the rest of my childhood.
00:05:36.820 | - Wait, '61 you said?
00:05:40.460 | - This was when the two books, I've still got them at home.
00:05:43.120 | - What does the robot mean in that context?
00:05:45.840 | - No, they were, some of the robots that they had were arms,
00:05:50.840 | you know, big arms to move nuclear material around,
00:05:53.880 | but they had pictures of welding robots
00:05:56.680 | that looked like humans under the sea,
00:05:58.560 | welding stuff underwater.
00:06:01.160 | So they weren't real robots,
00:06:03.980 | but they were what people were thinking about for robots.
00:06:07.320 | - What were you thinking about?
00:06:08.720 | Were you thinking about humanoids?
00:06:10.120 | Were you thinking about arms with fingers?
00:06:11.960 | Were you thinking about faces or cars?
00:06:14.040 | - No, actually, to be honest,
00:06:15.640 | I realized my limitation on building mechanical stuff.
00:06:19.440 | So I just built the brains mostly,
00:06:23.280 | out of different technologies as I got older.
00:06:26.600 | I built a learning system which was chemical-based,
00:06:33.140 | and I had this ice cube tray, each well was a cell,
00:06:37.940 | and by applying voltage to the two electrodes,
00:06:41.680 | it would build up a copper bridge.
00:06:43.200 | So over time, it would learn a simple network,
00:06:48.200 | so I could teach it stuff.
00:06:50.120 | And that was, mostly things were driven by my budget,
00:06:54.520 | and nails as electrodes, and an ice cream,
00:06:58.520 | I mean, an ice cube tray was about my budget at that stage.
00:07:02.280 | Later, I managed to buy transistors,
00:07:04.720 | and then I could build gates and flip-flops and stuff.
00:07:07.640 | - So one of your first robots was an ice cube tray?
00:07:11.120 | - Yeah. (laughs)
00:07:13.080 | And it was very cerebral, 'cause it went to add.
00:07:15.480 | - Very nice.
00:07:18.040 | Well, just a decade or so before, in 1950,
00:07:23.040 | Alan Turing wrote a paper that formulated the Turing test,
00:07:27.360 | and he opened that paper with the question,
00:07:29.760 | "Can machines think?"
00:07:32.480 | So let me ask you this question.
00:07:34.280 | Can machines think?
00:07:36.320 | Can your ice cube tray one day think?
00:07:40.820 | - Certainly machines can think,
00:07:42.080 | because I believe you're a machine, and I'm a machine,
00:07:44.720 | and I believe we both think.
00:07:46.840 | I think-- - Speak for yourself.
00:07:47.960 | - I think any other philosophical position
00:07:49.880 | is sort of a little ludicrous, what does think mean,
00:07:52.280 | if it's not something that we do?
00:07:54.080 | And we are machines.
00:07:56.360 | So yes, machines can,
00:07:59.160 | but do we have a clue how to build such machines?
00:08:01.960 | That's a very different question.
00:08:03.560 | Are we capable of building such machines?
00:08:07.040 | Are we smart enough?
00:08:07.880 | We think we're smart enough to do anything,
00:08:09.440 | but maybe we're not.
00:08:11.060 | Maybe we're just not smart enough to build stuff like us.
00:08:15.260 | - The kind of computer that Alan Turing was thinking about,
00:08:18.340 | do you think there is something fundamentally
00:08:22.020 | or significantly different between a computer
00:08:24.520 | between our ears, the biological computer that humans use,
00:08:28.660 | and the computer that he was thinking about,
00:08:31.260 | from a sort of high-level philosophical?
00:08:34.820 | - Yeah, I believe that it's very wrong.
00:08:37.900 | In fact, I'm halfway through a,
00:08:40.720 | I think it'll be about a 480-page book,
00:08:43.120 | titled, the working title is "Not Even Wrong."
00:08:47.160 | And if I may, I'll tell you a bit about that book.
00:08:51.400 | So there's two, well, three thrusts to it.
00:08:54.100 | One is the history of computation,
00:08:57.160 | what we call computation.
00:08:59.080 | Goes all the way back to some manuscripts in Latin
00:09:02.720 | from 1614 and 1620 by Napier and Kepler,
00:09:07.180 | through Babbage and Lovelace.
00:09:09.680 | And then Turing's 1936 paper is what we think of
00:09:14.680 | as the invention of modern computation.
00:09:20.400 | And that paper, by the way, did not set out
00:09:23.120 | to invent computation.
00:09:26.160 | It set out to negatively answer
00:09:28.880 | one of Hilbert's later set of three problems.
00:09:32.760 | He called it an effective way of getting answers.
00:09:37.760 | And Hilbert really worked with rewriting rules,
00:09:42.540 | as did Church, who also, at the same time,
00:09:47.540 | a month earlier than Turing, disproved
00:09:52.900 | one of these three hypotheses of Hilbert's.
00:09:54.600 | The other two had already been disproved by Gödel.
00:09:57.100 | So Turing set out to disprove it,
00:09:59.220 | 'cause it's always easier to disprove these things
00:10:01.700 | than to prove that there is an answer.
00:10:04.260 | And so he needed, and it really came from his professor,
00:10:09.260 | well, he was an undergrad at Cambridge,
00:10:13.420 | who turned it into, is there a mechanical process?
00:10:16.600 | So he wanted to show a mechanical process
00:10:19.200 | that could calculate numbers,
00:10:22.960 | because that was a mechanical process
00:10:25.580 | that people used to generate tables.
00:10:27.960 | They were called computers, the people at the time.
00:10:30.960 | And they followed a set of rules where they had paper,
00:10:33.740 | and they would write numbers down,
00:10:35.540 | and based on the numbers, they'd keep writing other numbers.
00:10:39.160 | And they would produce numbers for these tables,
00:10:43.040 | engineering tables, that the more iterations they did,
00:10:47.300 | the more significant digits came out.
00:10:49.880 | And so Turing, in that paper, set out to define
00:10:54.880 | what sort of machine could do that, mechanical machine,
00:10:59.440 | where it could produce an arbitrary number of digits
00:11:02.760 | in the same way a human computer did.
00:11:05.740 | And he came up with a very simple set of constraints
00:11:12.320 | where there was an infinite supply of paper.
00:11:15.200 | This is the tape of the Turing machine.
00:11:18.120 | And each Turing machine had a set of,
00:11:21.720 | came with a set of instructions
00:11:23.420 | that as a person could do with pencil and paper,
00:11:27.040 | write down things on the tape and erase them
00:11:29.200 | and put new things there.
00:11:31.040 | And he was able to show that that system
00:11:35.600 | was not able to do something that Hilbert hypothesized.
00:11:38.520 | So he disproved it.
00:11:39.920 | But he had to show that this system was good enough
00:11:44.440 | to do whatever could be done,
00:11:46.760 | but couldn't do this other thing.
00:11:48.560 | And there he said, and he says in the paper,
00:11:51.440 | I don't have any real arguments for this,
00:11:54.000 | but based on intuition.
00:11:55.960 | So that's how he defined computation.
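To make Turing's definition concrete, here is a minimal sketch of such a machine in Python: an unbounded tape of symbols, a read/write head, and a finite table of instructions. The particular instruction table (a unary incrementer) is an invented illustration, not a machine from Turing's 1936 paper.

```python
# Minimal Turing-machine sketch: a finite instruction table acting on an
# unbounded tape. The machine below appends a '1' to a unary string; it is
# an invented example for illustration only.
from collections import defaultdict

def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
    """program maps (state, symbol) -> (symbol_to_write, move 'L'/'R', next_state)."""
    cells = defaultdict(lambda: blank, enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells[head]
        new_symbol, move, state = program[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells[i] for i in range(lo, hi + 1))

# Scan right over the 1s, write one more 1 on the first blank, then halt.
increment = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run_turing_machine(increment, "111"))  # -> "1111"
```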
00:11:58.240 | And then if you look over the next, from 1936,
00:12:01.640 | up until really around 1975,
00:12:04.760 | you see people struggling with,
00:12:07.720 | is this really what computation is?
00:12:10.160 | And so Marvin Minsky, very well known in AI,
00:12:13.200 | but also a fantastic mathematician
00:12:16.040 | in his book, "Finite and Infinite Machines"
00:12:19.360 | from the mid '60s,
00:12:20.280 | which is a beautiful, beautiful mathematical book,
00:12:22.940 | says at the start of the book,
00:12:25.600 | well, what is computation?
00:12:26.800 | Turing says it's this.
00:12:27.800 | And yeah, I sort of think it's that.
00:12:29.680 | It doesn't really matter
00:12:30.600 | whether the stuff's made of wood or plastic.
00:12:32.400 | It's just, you know,
00:12:33.640 | that relatively cheap stuff can do this stuff.
00:12:36.480 | And so, yeah, it seems like computation.
00:12:40.280 | And Donald Knuth, in his first volume of his,
00:12:45.000 | you know, "Art of Computer Programming" in around 1968,
00:12:49.600 | says, well, what's computation?
00:12:52.560 | It's this stuff, like Turing says,
00:12:54.520 | that a person could do each step without too much trouble.
00:12:57.960 | And so one of his examples
00:13:00.480 | of what would be too much trouble
00:13:02.320 | was a step which required knowing
00:13:06.440 | whether Fermat's Last Theorem was true or not,
00:13:08.560 | because it was not known at the time.
00:13:10.360 | And that's too much trouble for a person to do as a step.
00:13:13.640 | And Hopcroft and Ullman
00:13:16.760 | sort of said a similar thing later that year.
00:13:19.640 | And by 1975, in the Aho, Hopcroft, and Ullman book,
00:13:23.640 | they're saying, well, you know,
00:13:25.320 | we don't really know what computation is,
00:13:27.280 | but intuition says this is sort of about right,
00:13:30.200 | and this is what it is.
00:13:32.760 | That's computation.
00:13:33.960 | It's a sort of agreed-upon thing,
00:13:37.360 | which happens to be really easy to implement in silicon.
00:13:40.920 | And then we had Moore's law, which took off,
00:13:43.360 | and it's been an incredibly powerful tool.
00:13:46.120 | I certainly wouldn't argue with that.
00:13:47.720 | The version we have of computation, incredibly powerful.
00:13:51.000 | - Can we just take a pause?
00:13:52.920 | So what we're talking about is there's an infinite tape
00:13:55.760 | with some simple rules of how to write on that tape,
00:13:58.640 | and that's what we're kind of thinking about.
00:14:00.560 | This is computation.
00:14:01.480 | - Yeah, and it's modeled after humans, how humans do stuff.
00:14:04.480 | And I think it's, Turing says in the '36 paper,
00:14:08.280 | one of the critical facts here
00:14:10.100 | is that a human has a limited amount of memory.
00:14:13.200 | So that's what we're gonna put
00:14:14.320 | onto our mechanical computers.
00:14:16.840 | So, mm, mm, mm.
00:14:19.360 | So, you know, unlike mass,
00:14:21.280 | unlike mass or charge or, you know.
00:14:23.960 | It's not given by the universe.
00:14:26.400 | It was, this is what we're gonna call computation.
00:14:29.320 | And then it has this really, you know,
00:14:31.640 | it had this really good implementation,
00:14:33.520 | which has completely changed our technological world.
00:14:37.000 | That's computation.
00:14:37.960 | Second part of the book, or argument in the book,
00:14:44.240 | I have this two-by-two matrix
00:14:46.680 | with science in the top row,
00:14:51.080 | engineering in the bottom row.
00:14:53.020 | Left column is intelligence, right column is life.
00:14:57.060 | So in the bottom row, the engineering,
00:15:00.280 | there's artificial intelligence and there's artificial life.
00:15:03.600 | In the top row, there's neuroscience and abiogenesis.
00:15:07.680 | How does living matter turn in,
00:15:10.080 | how does non-living matter become living matter?
00:15:12.800 | Four disciplines.
00:15:14.120 | These four disciplines all came into the current form
00:15:19.120 | in the period of 1945 to 1965.
00:15:22.420 | - That's interesting.
00:15:25.000 | - There was neuroscience before,
00:15:26.280 | but it wasn't effective neuroscience.
00:15:28.600 | It was, you know, there was ganglia
00:15:30.040 | and there's electrical charges,
00:15:31.280 | but no one knows what to do with it.
00:15:33.800 | And furthermore, there were a lot of players
00:15:36.200 | who are common across them.
00:15:37.720 | I've identified common players,
00:15:40.560 | except for artificial intelligence and abiogenesis.
00:15:43.400 | I don't have, but any other pair,
00:15:45.400 | I can point to people who worked on.
00:15:47.320 | And a whole bunch of them, by the way,
00:15:49.280 | were at the research lab for electronics at MIT,
00:15:52.200 | where Warren McCulloch held forth.
00:16:02.880 | And in fact, McCulloch, Pitts, Lettvin, and Maturana
00:16:02.880 | wrote the first paper on functional neuroscience
00:16:06.480 | called "What the Frog's Eye Tells the Frog's Brain,"
00:16:08.880 | where instead of it just being this bunch of nerves,
00:16:12.360 | they sort of showed what different anatomical components
00:16:16.680 | were doing and telling other anatomical components
00:16:20.640 | and generating behavior in the frog.
00:16:23.960 | - Would you put them as basically the fathers
00:16:26.400 | or one of the early pioneers
00:16:28.880 | of what are now called artificial neural networks?
00:16:31.780 | - Yeah, I mean, McCulloch and Pitts,
00:16:35.480 | Pitts was much younger than him,
00:16:38.940 | in 1943, had written a paper inspired by Bertrand Russell
00:16:43.940 | on a calculus for the ideas immanent in neural systems,
00:16:49.780 | where they had tried to, without any real proof,
00:16:55.460 | they had tried to give a formalism for neurons,
00:16:58.940 | basically in terms of logic,
00:17:01.980 | AND gates, OR gates, and NOT gates,
00:17:03.860 | with no real evidence that that was what was going on,
00:17:08.100 | but they talked about it,
00:17:09.260 | and that was picked up by Minsky for his 1953 dissertation,
00:17:13.620 | which was a neural network, we would call it today.
00:17:18.580 | It was picked up by John von Neumann
00:17:22.500 | when he was designing the EDVAC computer in 1945.
00:17:26.580 | He talked about its components being neurons,
00:17:30.060 | based on, and in references,
00:17:31.460 | he's only got three references,
00:17:32.500 | and one of them is the McCulloch-Pitts paper.
00:17:35.740 | So all these people, and then the AI people,
00:17:38.420 | and the artificial life people,
00:17:39.780 | which was John von Neumann originally.
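For readers curious what the McCulloch-Pitts formalism amounts to, here is a hedged sketch in Python: a unit fires when the weighted sum of its binary inputs reaches a threshold, and with suitable weights it behaves as an AND, OR, or NOT gate. The weights and thresholds are illustrative choices, not values taken from the 1943 paper.

```python
# McCulloch-Pitts style threshold unit: outputs 1 when the weighted sum of
# binary inputs reaches the threshold. Illustrative sketch only.
def mp_neuron(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Logic gates as special cases (weights/thresholds chosen for illustration).
AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mp_neuron([a],    [-1],   threshold=0)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
assert NOT(0) == 1 and NOT(1) == 0
```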
00:17:41.940 | - It was like overlap between all.
00:17:43.700 | - They're all going around at the same time.
00:17:45.300 | And three of these four disciplines
00:17:47.700 | turned to computation as their primary metaphor.
00:17:50.960 | So I've got a couple of chapters in the book.
00:17:54.560 | One is titled, "Wait, Computers Are People?"
00:17:58.700 | 'Cause that's where our computers came from,
00:18:01.140 | from people who were computing stuff.
00:18:05.500 | And then I've got another chapter,
00:18:07.140 | "Wait, People Are Computers?"
00:18:09.140 | Which is about computational neuroscience.
00:18:11.420 | So there's this whole circle here,
00:18:13.500 | that computation is it.
00:18:16.680 | And I have talked to people about,
00:18:19.660 | "Well, maybe it's not computation that goes on in the head."
00:18:23.100 | Of course it is.
00:18:24.620 | Okay, well, when Elon Musk's rocket goes up,
00:18:29.620 | is it computing?
00:18:31.700 | Is that how it gets into orbit?
00:18:32.900 | By computing?
00:18:34.260 | But we've got this idea, if you wanna build an AI system,
00:18:36.680 | you write a computer program.
00:18:38.180 | - Yeah, so the word computation
00:18:43.460 | very quickly starts doing a lot of work
00:18:45.740 | that it was not initially intended to do.
00:18:48.800 | Is this like in the same,
00:18:50.060 | if you talk about the universe
00:18:52.100 | as essentially performing a computation?
00:18:53.900 | - Yeah, right.
00:18:54.740 | Wolfram does this.
00:18:55.560 | He turns it into computation.
00:18:57.380 | You don't turn rockets into computation.
00:18:59.540 | - Yeah.
00:19:00.380 | By the way, when you say computation in our conversation,
00:19:03.140 | do you tend to think of computation narrowly
00:19:05.420 | in the way Turing thought of computation?
00:19:07.940 | - It's gotten very, okay,
00:19:11.040 | squishy, squishy over time.
00:19:15.940 | But computation in the way Turing thinks about it
00:19:20.820 | and the way most people think about it
00:19:23.300 | actually fits very well
00:19:25.540 | with thinking like a hunter-gatherer.
00:19:29.140 | There are places, and there can be stuff in places,
00:19:32.940 | and the stuff in places can change,
00:19:34.620 | and it stays there until someone changes it.
00:19:37.740 | And it's this metaphor of place and container,
00:19:41.900 | which is a combination of our place cells
00:19:45.900 | in our hippocampus and our cortex.
00:19:49.300 | But this is how we use metaphors
00:19:52.300 | for mostly to think about.
00:19:53.140 | And when we get outside of our metaphor range,
00:19:56.100 | we have to invent tools
00:19:57.680 | which we can sort of switch on to use.
00:20:00.100 | So calculus is an example of a tool.
00:20:02.740 | It can do stuff that our raw reasoning can't do,
00:20:05.860 | and we've got conventions of when you can use it or not.
00:20:09.660 | But sometimes, people try to, all the time,
00:20:14.100 | we always try to get physical metaphors for things,
00:20:17.140 | which is why quantum mechanics
00:20:20.180 | has been such a problem for 100 years,
00:20:22.300 | 'cause it's a particle.
00:20:23.340 | No, it's a wave.
00:20:24.160 | It's gotta be something we understand.
00:20:25.860 | And I say, no, it's some weird mathematical logic
00:20:28.260 | that's different from those, but we want that metaphor.
00:20:31.820 | Well, I suspect that 100 years or 200 years from now,
00:20:36.660 | neither quantum mechanics nor dark matter
00:20:38.940 | will be talked about in the same terms,
00:20:41.100 | in the same way that the phlogiston theory
00:20:44.500 | eventually went away,
00:20:46.440 | 'cause it just wasn't an adequate explanatory metaphor.
00:20:50.180 | That metaphor was the stuff,
00:20:53.140 | there is stuff in the burning.
00:20:56.140 | The burning is in the matter.
00:20:57.980 | But as it turns out, the burning was outside the matter.
00:21:00.060 | It was the oxygen.
00:21:01.460 | So our desire for metaphor,
00:21:03.060 | and combined with our limited cognitive capabilities,
00:21:06.620 | gets us into trouble.
00:21:08.020 | - That's my argument in this book.
00:21:09.980 | Now, and people say, well, what is it then?
00:21:11.740 | And I say, well, I wish I knew that
00:21:13.140 | if I'd spoke about that.
00:21:14.220 | But I give some ideas.
00:21:15.820 | But so, there's three things.
00:21:18.480 | Computation is sort of a particular thing we use.
00:21:21.300 | Oh, can I tell you one beautiful thing?
00:21:26.100 | One beautiful thing I found? - Yes, please.
00:21:27.900 | - You know, I used an example of a thing
00:21:29.540 | that's different from computation.
00:21:31.160 | You hit a drum and it vibrates,
00:21:33.460 | and there are some stationary points on the drum's surface,
00:21:36.660 | you know, 'cause the waves are going up and down
00:21:38.020 | the stationary points.
00:21:39.140 | Now, you could compute them to arbitrary precision,
00:21:45.840 | but the drum just knows them.
00:21:48.820 | The drum doesn't have to compute.
00:21:50.780 | What was the very first computer program ever written
00:21:53.300 | by Ada Lovelace?
00:21:54.980 | To compute Bernoulli numbers,
00:21:56.320 | and Bernoulli numbers are exactly what you need
00:21:58.220 | to find those stable points in the drum's surface.
00:22:01.440 | - Wow.
00:22:02.280 | - Anyway, and there was a bug in the program.
00:22:04.340 | The arguments to divide were reversed in one place.
00:22:10.120 | - And it still worked?
00:22:11.240 | - Well, no, she never got to run it.
00:22:12.720 | They never built the analytical engine.
00:22:14.160 | She wrote the program without a, you know.
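For context on what that first program was computing: Bernoulli numbers fall out of a short recurrence, sketched below in Python with exact rational arithmetic. This is a modern illustration, not a reconstruction of Lovelace's Note G program for the Analytical Engine.

```python
# Bernoulli numbers from the standard recurrence
#   B_0 = 1,  and for m >= 1:  sum_{j=0}^{m} C(m+1, j) * B_j = 0.
# Illustrative sketch only, using exact fractions.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return [B_0, ..., B_n] as exact fractions."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```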
00:22:16.520 | - So, computation?
00:22:21.160 | - Computation is sort of, you know,
00:22:22.680 | a thing that's become dominant as a metaphor,
00:22:26.080 | but is it the right metaphor?
00:22:28.480 | All three of these four fields adopted computation,
00:22:33.400 | and you know, a lot of it swirls around Warren McCulloch
00:22:37.720 | and all his students, and he funded a lot of people.
00:22:41.480 | And our human metaphors, our limitations to human thinking
00:22:48.680 | will play into this.
00:22:50.120 | Those are the three themes of the book.
00:22:52.880 | So, I have a little to say about computation.
00:22:55.240 | (laughing)
00:22:57.480 | - So, you're saying that there is a gap
00:23:00.840 | between the computer, or the machine
00:23:04.600 | that performs computation, and this machine
00:23:09.600 | that appears to have consciousness and intelligence.
00:23:13.440 | - Yeah.
00:23:14.280 | - Can we--
00:23:15.120 | - That piece of meat in your head.
00:23:16.200 | - Piece of meat.
00:23:17.040 | - And maybe it's not just the meat in your head,
00:23:19.360 | it's the rest of you, too.
00:23:20.900 | I mean, you actually have a neural system in your gut.
00:23:25.120 | - I tend to also believe, not believe,
00:23:27.920 | but we're now dancing around things we don't know,
00:23:30.920 | but I tend to believe other humans are important.
00:23:35.620 | Like, so, we're almost like, I just don't think
00:23:40.620 | we would ever have achieved the level of intelligence
00:23:42.880 | we have with other humans.
00:23:44.760 | I'm not saying so confidently, but I have an intuition
00:23:47.800 | that some of the intelligence is in the interaction.
00:23:51.320 | - Yeah, and I think it seems to be very likely,
00:23:55.800 | again, this is speculation, but we, our species,
00:24:00.440 | and probably Neanderthals, to some extent,
00:24:03.920 | because you can find old bones where they seem
00:24:07.180 | to be counting on them by putting notches,
00:24:10.280 | that when Neanderthals had done,
00:24:13.680 | we were able to put some of our stuff
00:24:16.320 | outside our body into the world,
00:24:18.580 | and then other people can share it,
00:24:20.520 | and then we get these tools that become shared tools,
00:24:23.160 | and so there's a whole coupling that would not occur
00:24:26.240 | in the single deep learning network,
00:24:30.440 | which was fed all of literature or something.
00:24:33.100 | - Yeah, the neural network can't step outside of itself,
00:24:38.520 | but is there some, can we explore this dark room
00:24:43.520 | a little bit and try to get at something?
00:24:47.720 | What is the magic, where does the magic come from
00:24:50.480 | in the human brain that creates the mind?
00:24:53.440 | What's your sense, as scientists that try to understand it
00:24:58.440 | and try to build it, what are the directions
00:25:01.280 | if followed might be productive?
00:25:05.340 | Is it creative, interactive robots?
00:25:08.120 | Is it creating large, deep neural networks
00:25:11.240 | that do self-supervised learning?
00:25:14.440 | And just like we'll discover that
00:25:17.400 | when you make something large enough,
00:25:19.000 | some interesting things will emerge.
00:25:20.840 | Is it through physics and chemistry and biology,
00:25:22.920 | like artificial life, and go like,
00:25:24.680 | we'll sneak up in this four quadrant matrix
00:25:28.600 | that you mentioned, is there anything,
00:25:30.480 | your most, if you had to bet all your money,
00:25:32.840 | financial, I wouldn't.
00:25:35.640 | Okay.
00:25:36.480 | - So every intelligence we know,
00:25:38.640 | and who's, animal intelligence, dog intelligence,
00:25:44.240 | octopus intelligence, which is a very different
00:25:46.600 | sort of architecture from us.
00:25:48.740 | All the intelligences we know perceive the world
00:25:55.040 | in some way and then have action in the world,
00:26:00.320 | but they're able to perceive objects in a way
00:26:05.680 | which is actually pretty damn phenomenal and surprising.
00:26:12.920 | You know, we tend to think, you know,
00:26:15.120 | that the box over here between us,
00:26:19.520 | which is a sound box, I think, is a blue box,
00:26:22.560 | but blueness is something that we construct
00:26:27.240 | with color constancy.
00:26:30.960 | It's not, the blueness is not a direct function
00:26:34.960 | of the photons we're receiving.
00:26:37.280 | It's actually context, you know,
00:26:39.160 | which is why you can turn, you know,
00:26:44.160 | maybe seen the examples where someone turns a stop sign
00:26:49.200 | into some other sort of sign
00:26:52.280 | by just putting a couple of marks on them
00:26:53.800 | and the deep learning system gets it wrong.
00:26:55.520 | And everyone says, but the stop sign's red.
00:26:58.000 | You know, why is it,
00:26:58.840 | why is it thinking it's the other sort of sign?
00:27:00.080 | Because redness is not intrinsic in just the photons.
00:27:02.920 | It's actually a construction of an understanding
00:27:05.280 | of the whole world and the relationship between objects
00:27:07.960 | to get color constancy.
00:27:10.160 | But our tendency, in order that we get an archive paper
00:27:14.480 | really quickly is you just show a lot of data
00:27:16.760 | and give the labels and hope it figures it out.
00:27:19.040 | But it's not figuring it out in the same way we do.
00:27:21.160 | We have a very complex perceptual understanding
00:27:24.320 | of the world.
00:27:25.160 | Dogs have a very different perceptual understanding
00:27:27.360 | based on smell.
00:27:28.200 | They go smell a post, they can tell how many,
00:27:32.600 | you know, different dogs have visited it
00:27:34.520 | in the last 10 hours and how long ago.
00:27:36.440 | There's all sorts of stuff that we just don't perceive
00:27:38.480 | about the world.
00:27:39.600 | And just taking a single snapshot
00:27:41.240 | is not perceiving about the world.
00:27:42.520 | It's not seeing the registration between us and the object.
00:27:47.520 | And registration is a philosophical concept.
00:27:52.240 | Brian Cantwell Smith talks about a lot,
00:27:54.560 | very difficult squirmy thing to understand.
00:27:59.400 | But I think none of our systems do that.
00:28:02.200 | We've always talked in AI about the symbol grounding problem,
00:28:05.160 | how our symbols that we talk about are grounded in the world.
00:28:08.240 | And when deep learning came along
00:28:09.680 | and started labeling images, people said,
00:28:11.520 | "Ah, the grounding problem has been solved."
00:28:13.560 | No, the labeling problem was solved
00:28:16.320 | with some percentage accuracy,
00:28:18.080 | which is different from the grounding problem.
00:28:20.720 | - So you agree with Hans Moravec
00:28:24.640 | and what's called Moravec's paradox
00:28:28.040 | that highlights this counterintuitive notion
00:28:31.200 | that reasoning is easy,
00:28:35.600 | but perception and mobility are hard.
00:28:39.560 | - Yeah, we shared an office when I was working
00:28:43.360 | on computer vision and he was working
00:28:44.800 | on his first mobile robot.
00:28:46.840 | - What were those conversations like?
00:28:48.600 | - They were great.
00:28:49.440 | (laughing)
00:28:50.280 | - So do you still kind of, maybe you can elaborate,
00:28:53.320 | do you still believe this kind of notion
00:28:55.400 | that perception is really hard?
00:28:59.520 | Can you make sense of why we humans have this poor intuition
00:29:03.520 | about what's hard and not?
00:29:04.600 | - Well, let me give a sort of another story.
00:29:09.600 | - Sure.
00:29:11.720 | - If you go back to the original teams working on AI
00:29:16.720 | from the late '50s into the '60s,
00:29:21.800 | and you go to the AI lab at MIT,
00:29:24.160 | who was it that was doing that?
00:29:27.960 | Was a bunch of really smart kids who got into MIT
00:29:31.600 | and they were intelligent.
00:29:33.480 | So what's intelligence about?
00:29:35.160 | Well, the stuff they were good at,
00:29:36.360 | playing chess, doing integrals, that was hard stuff.
00:29:41.360 | But a baby could see stuff, that wasn't intelligent.
00:29:45.800 | Anyone could do that, that's not intelligence.
00:29:48.200 | And so there was this intuition that the hard stuff
00:29:52.440 | is the things they were good at
00:29:54.920 | and the easy stuff was the stuff that everyone could do.
00:29:58.320 | - Yeah.
00:29:59.160 | - And maybe I'm overplaying it a little bit,
00:30:00.320 | but I think there's an element of that.
00:30:01.920 | - Yeah, I mean, I don't know how much truth there is to,
00:30:05.800 | like chess, for example, was for the longest time
00:30:09.440 | seen as the highest level of intellect, right?
00:30:14.440 | - Until we got computers that were better at it than people.
00:30:17.320 | And then we realized, if you go back to the '90s,
00:30:20.320 | you'll see the stories in the press
00:30:22.160 | around when Kasparov was beaten by Deep Blue.
00:30:26.520 | Oh, this is the end of all sorts of things.
00:30:28.480 | Computers are gonna be able to do anything from now on.
00:30:30.760 | And we saw exactly the same stories with AlphaZero,
00:30:33.320 | the Go playing program.
00:30:36.280 | - Yeah, but still to me, reasoning is a special thing
00:30:41.280 | and perhaps-
00:30:42.120 | - No, actually, we're really bad at reasoning.
00:30:44.760 | We just use these analogies
00:30:46.200 | based on our hunter-gatherer intuitions.
00:30:48.600 | - But why is that not,
00:30:49.760 | don't you think the ability to construct metaphor
00:30:52.560 | is a really powerful thing?
00:30:54.040 | - Oh, yeah, it is.
00:30:54.880 | - To tell stories.
00:30:55.720 | - It is, it's the constructing the metaphor
00:30:57.760 | and registering that something constant in our brains.
00:31:01.200 | - Like, isn't that what we're doing with vision too?
00:31:04.120 | And we're telling our stories.
00:31:06.280 | We're constructing good models of the world.
00:31:08.640 | - Yeah, yeah.
00:31:09.960 | But I think we jumped between what we're capable of
00:31:14.880 | and how we're doing it.
00:31:16.080 | - Right, there was a little confusion that went on.
00:31:18.320 | - Sure.
00:31:19.160 | - As we were telling each other stories.
00:31:21.760 | - Yes, exactly.
00:31:22.800 | - Trying to delude each other.
00:31:24.960 | No, I just think, I'm not exactly,
00:31:27.320 | so I'm trying to pull apart this Moravec paradox.
00:31:30.360 | - I don't view it as a paradox.
00:31:31.880 | What did evolution spend its time on?
00:31:36.480 | It spent its time on getting us to perceive
00:31:38.200 | and move in the world.
00:31:39.480 | That was 600 million years
00:31:41.440 | as multi-celled creatures doing that.
00:31:43.720 | And then it was relatively recent
00:31:46.520 | that we were able to hunt or gather,
00:31:50.640 | or even animals hunting.
00:31:53.280 | That's much more recent.
00:31:55.080 | And then anything that we,
00:31:58.160 | speech, language, those things are
00:32:02.040 | just a couple of hundred thousand years probably,
00:32:03.880 | if that long.
00:32:05.880 | And then agriculture, 10,000 years.
00:32:08.640 | All that stuff was built on top of those earlier things
00:32:12.280 | which took a long time to develop.
00:32:14.480 | - So if you then look at the engineering of these things,
00:32:17.680 | so building it into robots,
00:32:20.600 | what's the hardest part of robotics, do you think?
00:32:24.960 | As the decades that you worked on robots,
00:32:28.000 | in the context of what we're talking about,
00:32:32.080 | vision, perception,
00:32:34.240 | the actual sort of the biomechanics of movement.
00:32:38.920 | I'm kind of drawing parallels here
00:32:40.400 | between humans and machines always.
00:32:42.400 | Like, what do you think is the hardest part of robotics?
00:32:46.080 | - I sort of think all of them.
00:32:47.880 | (laughing)
00:32:49.560 | There are no easy paths to do well.
00:32:51.720 | We sort of go reductionist and we reduce it.
00:32:55.760 | If only we had all the location of all the points in 3D,
00:32:59.640 | things would be great.
00:33:00.760 | If only we had labels on the images,
00:33:05.360 | things would be great.
00:33:07.520 | But as we see, that's not good enough.
00:33:10.760 | Some deeper understanding.
00:33:13.200 | - But if I came to you and I could solve
00:33:16.640 | one category of problems in robotics instantly,
00:33:21.640 | what would give you the greatest pleasure?
00:33:24.440 | (laughing)
00:33:27.680 | You look at robots that manipulate objects.
00:33:34.200 | What's hard about that?
00:33:36.800 | Is it the perception?
00:33:39.880 | Is it the reasoning about the world,
00:33:43.120 | like common sense reasoning?
00:33:45.360 | Is it the actual building a robot
00:33:47.840 | that's able to interact with the world?
00:33:50.440 | Is it like human aspects of a robot
00:33:53.000 | that's interacting with humans
00:33:54.240 | and that game theory of how they work well together?
00:33:56.920 | - Well, let's talk about manipulation for a second,
00:33:58.720 | 'cause I had this really blinding moment.
00:34:01.680 | You know, I'm a grandfather,
00:34:04.400 | so grandfathers have blinding moments.
00:34:06.760 | Just three or four miles from here,
00:34:10.040 | last year, my 16-month-old grandson
00:34:14.120 | was in his new house, first time, right?
00:34:18.400 | First time in this house.
00:34:19.880 | And he'd never been able to get to a window before,
00:34:23.400 | but this had some low windows.
00:34:25.160 | And he goes up to this window with a handle on it
00:34:27.840 | that he's never seen before.
00:34:29.520 | And he's got one hand pushing the window
00:34:32.520 | and the other hand turning the handle to open the window.
00:34:36.800 | He knew two different hands,
00:34:40.120 | two different things he knew how to put together.
00:34:43.320 | And he's 16 months old!
00:34:45.720 | - And there you are watching in awe.
00:34:47.520 | (both laughing)
00:34:50.200 | - In an environment he'd never seen before,
00:34:54.520 | a mechanism he'd never seen. - How did he do that?
00:34:56.320 | - Yes, that's a good question.
00:34:57.800 | How did he do that?
00:34:59.040 | That's why.
00:34:59.880 | - It's like, okay, like you could see the leap of genius
00:35:04.120 | from using one hand to perform a task,
00:35:06.480 | to combining, to doing, I mean, first of all,
00:35:09.680 | in manipulation, that's really difficult.
00:35:11.640 | It's like two hands,
00:35:13.200 | both necessary to complete the action.
00:35:16.040 | - And completely different.
00:35:16.960 | And he'd never seen a window open before.
00:35:19.360 | But he inferred somehow a handle opened something.
00:35:25.280 | - Yeah, there may have been a lot of
00:35:27.120 | slightly different failure cases that you didn't see.
00:35:31.960 | - Yeah. - Not with a window,
00:35:33.040 | but with other objects of turning and twisting and handles.
00:35:36.240 | - Oh, there's a great counter to,
00:35:39.040 | reinforcement learning will just give the robot,
00:35:45.440 | or you'll give the robot plenty of time to try everything.
00:35:49.000 | - Yes. - Actually,
00:35:50.440 | can I tell a little side story here?
00:35:52.560 | So I'm in DeepMind in London,
00:35:56.680 | this is three, four years ago,
00:35:59.880 | where there's a big Google building,
00:36:03.360 | and then you go inside and you go through,
00:36:04.920 | there's more security, and then you get to DeepMind,
00:36:07.040 | where the other Google employees can't go.
00:36:08.840 | - Yeah. - And I'm in a conference room,
00:36:12.280 | Bayer conference room with some of the people,
00:36:14.320 | and they tell me about their reinforcement learning
00:36:17.520 | experiment with robots,
00:36:19.760 | which are just trying stuff out.
00:36:24.080 | And they're my robots, they're Sawyers, we sold them.
00:36:27.800 | And they really like them,
00:36:33.120 | 'cause Sawyers are compliant and can sense forces,
00:36:33.120 | so they don't break when they're bashing into walls,
00:36:35.800 | they stop and they do all this stuff.
00:36:38.600 | And so you just let the robot do stuff,
00:36:41.160 | and eventually it figures stuff out.
00:36:42.720 | - By the way, Sawyer, we're talking about robot manipulation,
00:36:45.560 | so robot arms and so on.
00:36:47.480 | - Yeah, Sawyer's a robot.
00:36:49.480 | - Just to go, what's Sawyer?
00:36:51.360 | - Sawyer's a robot arm that my company,
00:36:53.760 | Rethink Robotics built.
00:36:55.280 | - Thank you for the context. (laughs)
00:36:56.640 | - Sorry. - Okay, cool,
00:36:57.680 | so we're in DeepMind.
00:36:59.200 | - And it's in the next room,
00:37:01.040 | these robots are just bashing around
00:37:02.680 | to try and use reinforcement learning to learn how to act.
00:37:05.680 | And can I go see them?
00:37:06.880 | Oh no, they're secret.
00:37:08.520 | They're my robots, they're secret.
00:37:10.920 | - That's hilarious, okay.
00:37:12.200 | - Anyway, the point is,
00:37:13.880 | this idea that you just let reinforcement learning
00:37:17.520 | figure everything out is so counter
00:37:19.760 | to how a kid does stuff.
00:37:21.800 | So again, story about my grandson,
00:37:24.840 | I gave him this box
00:37:27.480 | that had lots of different lock mechanisms.
00:37:29.720 | He didn't randomly, you know, and he was 18 months old,
00:37:32.600 | he didn't randomly try to touch every surface
00:37:35.000 | or push everything.
00:37:35.840 | He found, he could see where the mechanism was
00:37:39.800 | and he started exploring the mechanism
00:37:41.920 | for each of these different lock mechanisms.
00:37:44.640 | And there was reinforcement, no doubt,
00:37:46.960 | of some sort going on there.
00:37:48.760 | But he applied a pre-filter,
00:37:51.480 | which cut down the search space dramatically.
00:37:54.640 | - I wonder to what level we're able
00:37:57.600 | to introspect what's going on.
00:37:59.760 | Because what's also possible
00:38:01.080 | is you have something like reinforcement learning
00:38:03.520 | going on in the mind, in the space of imagination.
00:38:05.960 | So like you have a good model of the world
00:38:08.040 | you're predicting,
00:38:09.280 | and you may be running those tens of thousands
00:38:11.960 | of like loops, but you're like, as a human,
00:38:15.680 | you're just looking at yourself,
00:38:16.880 | trying to tell a story of what happened.
00:38:18.840 | And it might seem simple,
00:38:20.400 | but maybe there's a lot of computation going on.
00:38:24.520 | - Whatever it is, but there's also a mechanism
00:38:27.120 | that's being built up.
00:38:28.040 | It's not just random search.
00:38:31.600 | That mechanism prunes it dramatically.
00:38:34.840 | - Yeah, that pruning step.
00:38:38.200 | But it doesn't, it's possible that that's,
00:38:41.840 | so you don't think that's akin to a neural network
00:38:44.680 | inside a reinforcement learning algorithm?
00:38:47.640 | Is it possible?
00:38:48.760 | - It's, yeah, it's possible.
00:38:53.760 | But I, you know,
00:38:55.360 | I'll be incredibly surprised if that happens.
00:39:00.360 | I'll also be incredibly surprised that, you know,
00:39:03.760 | after all the decades that I've been doing this,
00:39:06.120 | where every few years someone thinks,
00:39:08.520 | "Now we've got it, now we've got it."
00:39:11.680 | You know, four or five years ago, I was saying,
00:39:14.560 | "I don't think we've got it yet."
00:39:15.800 | And everyone was saying,
00:39:16.640 | "Oh, you don't understand how powerful AI is."
00:39:18.960 | I had people tell me,
00:39:19.800 | "You don't understand how powerful it is."
00:39:23.040 | You know, I sort of had a track record
00:39:27.720 | of what the world had done to think,
00:39:29.600 | "Well, this is no different from before."
00:39:31.680 | Well, we have bigger computers.
00:39:33.240 | We had bigger computers in the '90s
00:39:34.760 | and we could do more stuff.
00:39:36.400 | - But, okay, so let me push back.
00:39:40.320 | 'Cause I'm generally sort of optimistic
00:39:42.680 | and try to find the beauty in things.
00:39:44.600 | I think there's a lot of surprising and beautiful things
00:39:51.560 | that neural networks,
00:39:52.760 | this new generation of deep learning revolution has revealed
00:39:56.480 | to me has continually been very surprising
00:39:59.600 | the kind of things it's able to do.
00:40:01.440 | Now, generalizing that over saying like,
00:40:04.040 | we've solved intelligence, that's another big leap.
00:40:07.400 | But is there something surprising and beautiful to you
00:40:10.720 | about neural networks that where actually you sat back
00:40:13.360 | and said, "I did not expect this."
00:40:16.520 | - Oh, I think their performance,
00:40:20.960 | their performance on ImageNet was shocking.
00:40:24.280 | - So computer vision, those early days,
00:40:25.880 | it was just very like, "Wow, okay."
00:40:28.520 | - That doesn't mean that they're solving everything
00:40:31.200 | in computer vision we need to solve or in vision for robots.
00:40:35.720 | - What about AlphaZero and self-play mechanisms
00:40:38.280 | and reinforcement learning?
00:40:39.160 | Isn't that-
00:40:40.000 | - Yeah, that was all in Donald Michie's 1961 paper.
00:40:43.040 | Everything there was there,
00:40:45.520 | which introduced reinforcement learning.
00:40:48.520 | - No, but come on.
00:40:49.440 | So now you're talking about the actual techniques,
00:40:52.200 | but isn't this surprising to you?
00:40:54.200 | The level it's able to achieve with no human supervision
00:40:57.440 | of chess play?
00:40:59.960 | To me, there's a big, big difference between Deep Blue and-
00:41:05.920 | - Maybe what that's saying is how overblown
00:41:10.320 | our view of ourselves is.
00:41:12.240 | - You know, the chess is easy.
00:41:16.880 | - Yeah, I mean, I came across this 1946 report that,
00:41:21.880 | and I'd seen this as a kid in one of those books
00:41:29.040 | that my mother had given me, actually.
00:41:31.560 | 1946 report, which pitted someone with an abacus
00:41:36.560 | against an electronic calculator,
00:41:39.800 | and he beat the electronic calculator.
00:41:42.960 | So there, at that point was,
00:41:44.800 | well, humans are still better than machines at calculating.
00:41:49.240 | Are you surprised today that a machine can, you know,
00:41:52.440 | do a billion floating point operations a second,
00:41:55.120 | and you're puzzling for minutes through one?
00:41:59.560 | So, you know-
00:42:01.040 | - I am, I mean, I don't know,
00:42:03.160 | but I am certainly surprised.
00:42:05.720 | There's something, to me, different about learning.
00:42:09.640 | So a system that's able to learn-
00:42:12.000 | - Learning, now, see, now you're getting
00:42:13.760 | to one of the deadly sins.
00:42:15.560 | - Because of using terms overly broadly.
00:42:20.720 | - Yeah, I mean, there's so many different forms of learning.
00:42:23.320 | - Yeah.
00:42:24.160 | - And so many different forms.
00:42:25.000 | You know, I learned my way around the city.
00:42:26.520 | I learned to play chess.
00:42:28.040 | I learned Latin.
00:42:29.840 | I learned to ride a bicycle.
00:42:31.160 | All of those are, you know, very different capabilities.
00:42:35.280 | And if someone, you know, has a,
00:42:38.640 | you know, in the old days,
00:42:41.800 | people would write a paper about learning something.
00:42:44.480 | Now, the corporate press office puts out a press release
00:42:48.280 | about how company X has,
00:42:51.480 | is leading the world because they have a system that can.
00:42:57.000 | - Yeah, but here's the thing.
00:42:58.320 | Okay, so what is learning?
00:43:00.120 | When I refer to learning as many things, but-
00:43:03.480 | - It's a suitcase word.
00:43:04.800 | - It's a suitcase word, but loosely,
00:43:08.160 | there's a dumb system.
00:43:10.200 | And over time, it becomes smart.
00:43:13.840 | - Well, it becomes less dumb at the thing that it's doing.
00:43:16.440 | - Yeah.
00:43:17.280 | - Smart is a loaded word.
00:43:19.280 | - Yes, less dumb at the thing it's doing.
00:43:21.400 | - It gets better performance under some measure,
00:43:24.200 | under some set of conditions at that thing.
00:43:27.200 | And most of these learning algorithms,
00:43:29.720 | learning systems fail when you change the conditions
00:43:35.960 | just a little bit in a way that humans don't.
00:43:38.040 | So I was at DeepMind, the AlphaGo had just come out,
00:43:43.040 | and I said, "What would have happened
00:43:46.000 | "if you'd given it a 21 by 21 board
00:43:48.320 | "instead of a 19 by 19 board?"
00:43:50.080 | They said, "Fail totally."
00:43:51.760 | But a human player would actually,
00:43:53.800 | well, would actually be able to play that game.
00:43:55.840 | - And actually, funny enough,
00:43:56.760 | if you look at DeepMind's work since then,
00:43:59.680 | they're presenting a lot of algorithms
00:44:04.480 | that would do well at the bigger board.
00:44:07.800 | So they're slowly expanding this generalization.
00:44:10.560 | I mean, to me, there's a core element there.
00:44:13.520 | It is very surprising to me
00:44:15.480 | that even in a constrained game of chess or Go,
00:44:20.000 | that through self-play, by a system playing itself,
00:44:23.240 | that it can achieve superhuman level performance
00:44:28.240 | through learning alone.
00:44:30.120 | So like--
00:44:30.960 | - Okay, so you know, you didn't--
00:44:32.280 | - It's so fundamentally different in search of--
00:44:34.560 | - You didn't like it when I referred
00:44:36.280 | to Donald Michie's 1961 paper.
00:44:40.040 | There, in the second part of it, which came a year later,
00:44:44.240 | they had self-play on an electronic computer at tic-tac-toe.
00:44:48.240 | Okay, it's not as,
00:44:49.920 | but it learned to play tic-tac-toe through self-play.
00:44:52.440 | - That's not--
00:44:53.280 | - And it learned to play optimally.
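As a rough illustration of what self-play learning at tic-tac-toe can look like, here is a minimal sketch: both sides share a value table, pick moves epsilon-greedily, and nudge the values of the moves they played toward the final outcome. It is a generic sketch under those assumptions, not a reconstruction of Michie's program or of AlphaZero.

```python
# Tabular self-play learning for tic-tac-toe (illustrative sketch).
import random
from collections import defaultdict

LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
Q = defaultdict(float)  # (board_tuple, move) -> estimated value for the mover

def winner(board):
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return "draw" if " " not in board else None

def choose(state, moves, eps):
    if random.random() < eps:
        return random.choice(moves)
    return max(moves, key=lambda m: Q[(state, m)])

def self_play_episode(eps=0.2, lr=0.1):
    board, player, history = [" "] * 9, "X", []
    while True:
        moves = [i for i, c in enumerate(board) if c == " "]
        move = choose(tuple(board), moves, eps)
        history.append((tuple(board), move, player))
        board[move] = player
        result = winner(board)
        if result:
            # Push the value of every move played toward the final outcome.
            for state, m, p in history:
                target = 0.5 if result == "draw" else (1.0 if result == p else 0.0)
                Q[(state, m)] += lr * (target - Q[(state, m)])
            return result
        player = "O" if player == "X" else "X"

random.seed(0)
results = [self_play_episode() for _ in range(50000)]
print("outcomes over the last 1000 games:",
      {r: results[-1000:].count(r) for r in ("X", "O", "draw")})
```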
00:44:54.680 | - What I'm saying is,
00:44:56.320 | I, okay, I have a little bit of a bias,
00:44:59.960 | but I find ideas beautiful,
00:45:02.880 | but only when they actually realize the promise.
00:45:06.800 | That's another level of beauty.
00:45:08.160 | Like, for example,
00:45:10.240 | what Bezos and Elon Musk are doing with rockets.
00:45:13.680 | We had rockets for a long time,
00:45:15.200 | but doing reusable, cheap rockets, it's very impressive.
00:45:19.080 | In the same way, I, okay.
00:45:21.400 | - Yeah.
00:45:22.240 | - I would have not predicted, first of all,
00:45:23.840 | when I was,
00:45:24.680 | started and fell in love with AI,
00:45:28.520 | the game of Go was seen to be impossible to solve.
00:45:32.120 | Okay, so I thought maybe,
00:45:35.440 | maybe it'd be possible to maybe have big leaps
00:45:39.520 | in a Moore's law style of way in computation
00:45:41.960 | that would be able to solve it.
00:45:43.120 | But I would never have guessed
00:45:44.480 | that you could learn your way, however,
00:45:47.760 | I mean, in the narrow sense of learning,
00:45:52.320 | learn your way to beat the best people in the world
00:45:56.080 | at the game of Go without human supervision,
00:45:58.720 | not studying the game of experts.
00:46:01.280 | - Okay, so using a different learning technique.
00:46:04.760 | - Yes.
00:46:05.600 | - Arthur Samuel in the early '60s,
00:46:08.640 | and he was the first person to use machine learning,
00:46:11.080 | got, had a program that could beat
00:46:13.960 | the world champion at checkers.
00:46:16.200 | Now, so, and that at the time was considered amazing.
00:46:19.920 | By the way, Arthur Samuel had some fantastic advantages.
00:46:23.640 | Do you want to hear Arthur Samuel's advantages?
00:46:25.800 | Two things.
00:46:26.640 | One, he was at the 1956 AI conference.
00:46:30.600 | I knew Arthur later in life.
00:46:32.360 | He was at Stanford when I was graduating there.
00:46:34.640 | He wore a tie and a jacket every day.
00:46:36.440 | The rest of us didn't.
00:46:37.560 | Delightful man, delightful man.
00:46:40.720 | It turns out Claude Shannon
00:46:45.320 | in a 1950 Scientific American article
00:46:48.040 | outlined on chess playing,
00:46:51.320 | outlined the learning mechanism that Arthur Samuel used.
00:46:54.840 | And they had met in 1956.
00:46:57.240 | I assume there was some communication,
00:46:58.960 | but I don't know that for sure.
00:47:00.680 | But Arthur Samuel had been a vacuum tube engineer,
00:47:04.840 | getting reliability of vacuum tubes,
00:47:06.800 | and then had overseen
00:47:08.160 | the first transistorized computers at IBM.
00:47:11.960 | And in those days, before you shipped a computer,
00:47:14.600 | you ran it for a week to see, to get early failures.
00:47:18.280 | So he had this whole farm of computers
00:47:21.240 | running random code for hours and hours,
00:47:26.000 | a week for each computer.
00:47:28.760 | He had a whole bunch of them.
00:47:30.080 | So he ran his chess learning program with self-play
00:47:34.360 | on IBM's production line.
00:47:38.960 | He had more computation available to him
00:47:41.360 | than anyone else in the world.
00:47:43.160 | And then he was able to produce a chess playing program.
00:47:46.160 | I mean, a checkers playing program
00:47:48.040 | that could beat the world champion.
00:47:50.000 | So- - That's amazing.
00:47:51.480 | The question is, what I mean surprised,
00:47:54.560 | I don't just mean it's nice to have that accomplishment.
00:47:58.000 | Is there is a stepping towards something
00:48:01.840 | that feels more intelligent than before.
00:48:06.360 | And the question is- - Yeah, but that's
00:48:07.600 | in your view of the world.
00:48:08.840 | - Okay, well, let me then, doesn't mean I'm wrong.
00:48:12.280 | - No, no, it doesn't. (laughs)
00:48:14.680 | - So the question is, if we keep taking steps like that,
00:48:17.680 | how far that takes us.
00:48:19.320 | Are we going to build a better recommender systems?
00:48:21.920 | Are we going to build a better robot?
00:48:23.960 | Or will we solve intelligence?
00:48:26.000 | - So, I'm putting my bet on,
00:48:29.640 | we're still missing a whole lot, a lot.
00:48:34.560 | And why would I say that?
00:48:36.120 | Well, in these games, they're all 100% information games.
00:48:41.120 | But again, but each of these systems
00:48:43.920 | is a very short description of the current state,
00:48:48.120 | which is different from registering
00:48:51.320 | and perception in the world.
00:48:53.680 | Which gets back to Moravec's paradox.
00:48:55.720 | I'm definitely not saying that chess is somehow harder
00:49:00.720 | than perception.
00:49:05.120 | Any kind of robotics in the physical world,
00:49:07.760 | I definitely think, is way harder than the game of chess.
00:49:10.920 | So I was always much more impressed by like the workings
00:49:14.480 | of the human mind, it's incredible.
00:49:16.040 | The human mind is incredible.
00:49:17.640 | I believe that from the very beginning,
00:49:18.920 | I wanted to be a psychiatrist for the longest time.
00:49:20.840 | I always thought that's way more incredible
00:49:22.480 | than the game of chess.
00:49:23.320 | I think the game of chess is, I love the Olympics.
00:49:26.840 | It's just another example of us humans picking a task
00:49:29.680 | and then agreeing that a million humans
00:49:32.000 | will dedicate their whole life to that task.
00:49:34.000 | And that's the cool thing that the human mind
00:49:37.000 | is able to focus on one task
00:49:38.960 | and then compete against each other
00:49:40.920 | and achieve like weirdly incredible levels of performance.
00:49:44.600 | That's the aspect of chess that's super cool.
00:49:46.880 | Not that chess in itself is really difficult.
00:49:49.840 | It's like how Fermat's Last Theorem
00:49:51.720 | is not in itself, to me, that interesting.
00:49:53.680 | The fact that thousands of people have been struggling
00:49:56.600 | to solve that particular problem is fascinating.
00:49:58.640 | - So can I tell you my disease in this way?
00:50:00.640 | - Sure.
00:50:01.680 | - Which actually is closer to what you're saying.
00:50:03.840 | So as a child, I was building various,
00:50:06.800 | I called them computers,
00:50:07.760 | they weren't general purpose computers.
00:50:09.520 | - Ice cube tray.
00:50:10.480 | - Ice cube tray was one, but I built other machines.
00:50:12.840 | And what I liked to build was machines
00:50:15.040 | that could beat adults at a game
00:50:16.920 | and the adults couldn't beat my machine.
00:50:19.800 | - Yes.
00:50:20.640 | - So you were like, that's powerful.
00:50:22.560 | Like that's a way to rebel.
00:50:25.440 | - Yeah.
00:50:26.280 | - By the way,
00:50:27.120 | when was the first time you built something
00:50:31.560 | that outperformed you?
00:50:33.360 | Do you remember?
00:50:34.720 | - Well, I knew how it worked.
00:50:36.440 | I was probably nine years old
00:50:37.920 | and I built a thing that was a game
00:50:40.480 | where you take turns in taking matches from a pile
00:50:44.640 | and either the one who takes the last one
00:50:47.080 | or the one who doesn't take the last one wins, I forget.
00:50:49.560 | And so it was pretty easy to build that out of wires
00:50:52.040 | and nails and little coils that were like plugging
00:50:55.360 | in the number and a few light bulbs.
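As an aside on what such a wires-and-nails machine only needed to compute: for a single-pile game where each turn removes a few matches, the winning rule is a simple residue check, sketched below for both versions mentioned here (taking the last match wins, or taking the last match loses). The function name and the choice of a three-match maximum per turn are assumptions for illustration, not details of the actual machine.

```python
# Hedged sketch of the decision rule a matchstick machine could hard-wire.
# Single pile; each turn removes 1..max_take matches. "Normal" rule: whoever
# takes the last match wins. "Misere" rule: whoever takes the last match loses.
# (Which rule the childhood machine used isn't stated above, so both are shown.)

def winning_move(pile, max_take=3, misere=False):
    """Return a move (1..max_take) leaving the opponent a losing pile,
    or None if the current position is already lost against perfect play."""
    # Losing piles are multiples of (max_take + 1) in normal play; the pattern
    # shifts by one in misere play, where leaving exactly one match forces the
    # opponent to take the last one.
    target_residue = 1 if misere else 0
    for take in range(1, max_take + 1):
        if take <= pile and (pile - take) % (max_take + 1) == target_residue:
            return take
    return None

if __name__ == "__main__":
    for pile in range(1, 13):
        print(pile, "normal:", winning_move(pile),
              "misere:", winning_move(pile, misere=True))
```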
00:50:58.320 | The one I was prouder of, I was 12 when I built
00:51:02.320 | a thing out of old telephone switchboard switches
00:51:07.240 | that could always win at tic-tac-toe.
00:51:11.520 | And that was a much harder circuit to design.
00:51:14.600 | But again, it was just, it was no active components.
00:51:17.720 | It was just three-position switches, empty, X, or O,
00:51:22.080 | and nine of them and a light bulb on which move
00:51:28.360 | it wanted next and then the human would go and move that.
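For comparison, here is a compact software analogue of a tic-tac-toe player that never loses; the machine described above hard-wired the equivalent move choices in telephone switches, whereas this sketch just searches the whole game tree with plain minimax. The board encoding and function names are illustrative, not a description of that circuit.

```python
# Hedged sketch: a never-losing tic-tac-toe move chooser via exhaustive minimax.
# Tic-tac-toe is small enough that no pruning or heuristics are needed.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) from X's point of view: +1 win, 0 draw, -1 loss."""
    w = winner(board)
    if w:
        return (1 if w == 'X' else -1), None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None                      # board full: draw
    best_score, best_move = None, None
    for m in moves:
        board[m] = player
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[m] = ' '
        if best_score is None or (score > best_score if player == 'X'
                                  else score < best_score):
            best_score, best_move = score, m
    return best_score, best_move

if __name__ == "__main__":
    board = [' '] * 9
    _, move = minimax(board, 'X')           # the machine plays X; ask for its move
    print("machine's opening move:", move)
```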
00:51:31.640 | - See, there's magic in that creation.
00:51:33.200 | - There was, yeah, yeah.
00:51:34.640 | - I tend to see magic in robots that,
00:51:38.560 | like, I also think that intelligence
00:51:42.720 | is a little bit overrated.
00:51:44.840 | I think we can have deep connections with robots,
00:51:47.480 | very soon.
00:51:49.200 | And--
00:51:50.040 | - Well, we'll come back to connections with robots.
00:51:52.040 | - Sure.
00:51:52.880 | - But I do wanna say, I don't, I think too many people
00:51:57.200 | make the mistake of seeing that magic and thinking,
00:52:00.680 | well, we'll just continue, you know?
00:52:02.920 | But each one of those is a hard fought battle
00:52:05.880 | for the next step, the next step.
00:52:07.320 | - Yes, the open question here is,
00:52:09.800 | and this is why I'm playing devil's advocate,
00:52:11.440 | but I often do when I read your blog post in my mind
00:52:15.080 | because I have this eternal optimism,
00:52:17.560 | is it's not clear to me, so I don't do what, obviously,
00:52:21.400 | the journalists do or give into the hype,
00:52:24.120 | but it's not obvious to me how many steps away we are
00:52:27.160 | from a truly transformational understanding
00:52:32.160 | of what it means to build intelligent systems,
00:52:38.920 | or how to build intelligent systems.
00:52:41.360 | I'm also aware of the whole history
00:52:42.800 | of artificial intelligence, which is where
00:52:44.440 | your deep grounding of this is,
00:52:47.400 | is there has been an optimism for decades.
00:52:49.880 | And that optimism, just like reading old optimism,
00:52:53.840 | is absurd because people were like, this is,
00:52:56.880 | they were saying things are trivial for decades,
00:52:59.160 | since the '60s.
00:53:00.400 | They were saying everything is trivial,
00:53:01.520 | computer vision is trivial.
00:53:03.120 | But I think my mind is working crisply enough to where,
00:53:08.120 | I mean, we can dig into it if you want.
00:53:10.520 | I'm really surprised by the things DeepMind has done.
00:53:13.760 | I don't think they're yet close to solving intelligence,
00:53:18.760 | but I'm not sure it's not 10 years away.
00:53:22.160 | What I'm referring to is it's interesting to see
00:53:26.160 | when the engineering takes that idea to scale,
00:53:31.160 | and the idea works.
00:53:33.600 | - And no, it fools people.
00:53:35.680 | - Okay, honestly, Rodney, if it was you, me, and Demis
00:53:39.240 | inside a room, forget the press, forget all those things,
00:53:42.040 | just as a scientist, as a roboticist,
00:53:44.840 | that wasn't surprising to you that at scale,
00:53:47.680 | so we're talking about very large,
00:53:49.520 | okay, let's pick one that's the most surprising to you.
00:53:53.480 | Please don't yell at me.
00:53:54.520 | GPT-3, okay.
00:53:57.240 | Hold on a second.
00:53:58.080 | - I was gonna bring that up.
00:53:59.600 | - Okay, thank you.
00:54:00.960 | AlphaZero, AlphaGo, AlphaGo Zero, AlphaZero,
00:54:04.680 | and then AlphaFold 1 and 2.
00:54:07.760 | So do any of these kind of have this core
00:54:11.240 | of forget usefulness or application and so on,
00:54:15.120 | which you could argue for alpha fold,
00:54:17.040 | like as a scientist, was those surprising to you
00:54:21.160 | that it worked as well as it did?
00:54:24.400 | - Okay, so if we're gonna make the distinction
00:54:26.720 | between surprise and usefulness,
00:54:29.840 | and I have to explain this.
00:54:33.960 | I would say AlphaFold, and one of the problems
00:54:39.480 | at the moment with AlphaFold is,
00:54:41.760 | you know, it gets a lot of them right,
00:54:43.480 | which is a surprise to me,
00:54:44.680 | 'cause they're a really complex thing,
00:54:47.040 | but you don't know which ones it gets right,
00:54:50.040 | which then is a bit of a problem.
00:54:52.640 | Now, they've come out with a--
00:54:53.480 | - You mean the structure of the protein,
00:54:55.040 | it gets a lot of those right?
00:54:56.320 | - Yeah, it's a surprising number of them right.
00:54:58.680 | It's been a really hard problem.
00:55:00.440 | So that was a surprise how many it gets right.
00:55:03.360 | So far, the usefulness is limited
00:55:05.520 | because you don't know which ones are right or not,
00:55:07.600 | and now they've come out with a thing
00:55:10.520 | in the last few weeks,
00:55:11.400 | which is trying to get a useful tool out of it,
00:55:14.120 | and they may well do it.
00:55:15.800 | - In that sense, at least AlphaFold is different,
00:55:18.400 | because AlphaFold 2 is different,
00:55:21.960 | because now it's producing data sets
00:55:24.200 | that are actually potentially revolutionizing
00:55:27.640 | computational biology.
00:55:28.760 | Like they will actually help a lot of people.
00:55:31.280 | - You would say potentially revolutionizing,
00:55:34.560 | we don't know yet, but yeah.
00:55:36.240 | - That's true, yeah.
00:55:37.080 | - But I got you.
00:55:39.440 | I mean, this is, okay, so you know what?
00:55:41.400 | This is gonna be so fun.
00:55:42.760 | So let's go right into it.
00:55:45.960 | Speaking of robots that operate in the real world,
00:55:50.320 | let's talk about self-driving cars.
00:55:52.920 | - Oh.
00:55:53.760 | (laughing)
00:55:54.680 | - Okay, because you have built robotics companies.
00:55:58.560 | You're one of the greatest roboticists in history,
00:56:01.120 | and that's not just in the space of ideas.
00:56:05.000 | We'll also probably talk about that,
00:56:06.640 | but in the actual building and execution of businesses
00:56:11.320 | that make robots that are useful for people,
00:56:13.880 | and that actually work in the real world and make money.
00:56:16.720 | You also sometimes are critical of Mr. Elon Musk,
00:56:22.720 | or let's more specifically focus
00:56:24.440 | on this particular technology,
00:56:25.840 | which is autopilot inside Teslas.
00:56:27.920 | What are your thoughts about Tesla autopilot,
00:56:31.560 | or more generally vision-based machine learning approach
00:56:34.240 | to semi-autonomous driving?
00:56:37.000 | These are robots.
00:56:39.800 | They're being used in the real world
00:56:41.560 | by hundreds of thousands of people.
00:56:43.760 | And if you wanna go there, I can go there,
00:56:47.760 | but if that's not too much,
00:56:49.200 | let's say they're on par safety-wise
00:56:52.760 | with humans currently,
00:56:54.320 | meaning human alone versus human plus robot.
00:56:58.640 | - Okay, so first let me say,
00:57:00.760 | I really like the car I came in here today,
00:57:04.000 | which is?
00:57:04.840 | - 2021 model Mercedes E450.
00:57:11.400 | I am impressed by the machine vision.
00:57:17.360 | So, among other things, I'm impressed by what it can do.
00:57:21.840 | I'm really impressed with many aspects of it.
00:57:26.840 | And I'm-
00:57:28.640 | - It's able to stay in lane, is it?
00:57:31.520 | - Oh, yeah, it does the lane stuff.
00:57:33.560 | It's looking on either side of me.
00:57:38.440 | It's telling me about nearby cars.
00:57:40.520 | - Or blind spots and so on.
00:57:41.680 | - Yeah, when I'm going in close to something in the park,
00:57:45.960 | I get this beautiful, gorgeous, top-down view of the world.
00:57:50.000 | I am impressed up the wazoo
00:57:53.040 | at how registered and metrical that is.
00:57:57.000 | - Oh, so it's like multiple cameras
00:57:58.400 | and it's all very- - Yeah, and it has that.
00:57:59.240 | - To produce the 360 view kind of thing?
00:58:01.080 | - 360 view, you know, synthesized so it's above the car.
00:58:04.280 | I mean, it is unbelievable.
00:58:06.400 | I got this car in January.
00:58:08.160 | It's the longest I've ever owned a car without dinging it.
00:58:11.560 | So it's better than me.
00:58:13.480 | Well, me and it together are better.
00:58:16.040 | So I'm not saying technology's bad or not useful,
00:58:21.040 | but here's my point.
00:58:25.080 | - Yes.
00:58:25.920 | - It's a replay of the same movie.
00:58:31.480 | - Okay, so maybe you've seen me ask this question before,
00:58:35.880 | but when did the first car go over 55 miles an hour
00:58:40.880 | for over 10 miles on a public freeway
00:58:50.920 | with other traffic around driving completely autonomously?
00:58:56.840 | When did that happen?
00:58:59.240 | - Was it the late '80s or something?
00:59:01.200 | It was a long time ago.
00:59:02.560 | - It was actually in 1987 in Munich.
00:59:05.760 | - Oh, Munich, yeah, yeah.
00:59:06.720 | - At the Bundeswehr.
00:59:07.880 | - Yeah.
00:59:08.720 | - So they had it running in 1987.
00:59:12.400 | When do you think, and Elon has said he's gonna do this,
00:59:16.000 | when do you think we'll have the first car
00:59:18.160 | drive coast to coast in the US, hands off the wheel,
00:59:21.920 | feet off the pedals, coast to coast?
00:59:26.160 | - As far as I know, a few people have claimed to do it.
00:59:28.480 | 1995, that was Carnegie Mellon.
00:59:30.840 | - I didn't know, oh, that was the,
00:59:32.440 | yeah, they didn't claim, did they claim 100%?
00:59:35.840 | - Not 100%, not 100%, but--
00:59:38.280 | - And then there's a few marketing people
00:59:39.960 | who have claimed 100% since then.
00:59:42.040 | - But my point is that, you know,
00:59:44.560 | what I see happening again is someone sees a demo
00:59:49.200 | and they overgeneralize and say, we must be almost there.
00:59:52.440 | Well, we've been working on it for 35 years.
00:59:55.120 | - So that's demos, but this is gonna take us back
00:59:57.960 | to the same conversation with the AlphaZero.
01:00:00.280 | Are you not, okay, I'll just say what I am,
01:00:04.200 | because I thought, okay, when I first started interacting
01:00:07.360 | with the Mobileye implementation of Tesla Autopilot,
01:00:11.800 | I've driven a lot of cars, you know,
01:00:13.680 | I've been in Google stuff, driving cars since the beginning.
01:00:16.800 | I thought there was no way, before I sat and used Mobileye,
01:00:22.640 | I thought there, just knowing computer vision,
01:00:24.760 | I thought there's no way it could work as well as it was
01:00:27.120 | working, so my model of the limits of computer vision
01:00:31.920 | was way more limited than the actual implementation
01:00:35.960 | of Mobileye, so that's one example, I was really surprised.
01:00:40.200 | It's like, wow, that was incredible.
01:00:42.480 | The second surprise came when Tesla threw away Mobileye
01:00:47.040 | and started from scratch.
01:00:49.720 | I thought there's no way they can catch up to Mobileye.
01:00:53.520 | I thought what Mobileye was doing was kind of incredible,
01:00:56.120 | like the amount of work and the annotation.
01:00:57.640 | - Yeah, well, Mobileye was started by Amnon Shashua
01:01:00.480 | and used a lot of traditional, you know,
01:01:02.760 | hard-fought computer vision techniques.
01:01:05.320 | - But they also did a lot of good sort of,
01:01:08.480 | like, non-research stuff, like actual, like,
01:01:11.200 | just good, like what you do to make a successful product,
01:01:14.960 | right, it's scaled, all that kind of stuff,
01:01:17.000 | and so I was very surprised when they, from scratch,
01:01:19.080 | were able to catch up to that.
01:01:21.560 | That's very impressive, and I've talked to a lot of
01:01:23.360 | engineers that were involved; that was impressive.
01:01:27.360 | And the recent progress,
01:01:31.000 | especially with the involvement of Andrej Karpathy,
01:01:33.280 | what they're doing with the data engine,
01:01:38.280 | which is converting the driving task
01:01:40.480 | into these multiple tasks, and then doing this
01:01:42.920 | edge case discovery where they're pulling back data,
01:01:45.280 | like, the level of engineering
01:01:47.240 | made me rethink what's possible.
01:01:50.040 | I don't, I still, you know,
01:01:52.000 | I don't know to that intensity, but I always thought
01:01:55.240 | it was very difficult to solve autonomous driving
01:01:57.720 | with all the sensors, with all of the computation.
01:02:00.400 | I just thought it was a very difficult problem.
01:02:02.520 | But I've been continuously surprised
01:02:06.440 | how much you can engineer.
01:02:08.000 | First of all, the data acquisition problem,
01:02:10.120 | 'cause I thought, you know, just because I worked
01:02:12.560 | with a lot of car companies,
01:02:16.920 | they're a little bit old school
01:02:20.360 | to where I didn't think they could do this at scale,
01:02:22.800 | like AWS style data collection.
01:02:26.040 | So when Tesla was able to do that,
01:02:28.320 | I started to think, okay, so what are the limits of this?
01:02:33.120 | I still believe that
01:02:35.440 | driver, like sensing and the interaction with the driver
01:02:40.080 | and like studying the human factors,
01:02:41.920 | psychology problem is essential.
01:02:43.840 | It's always going to be there.
01:02:46.120 | It's always going to be there,
01:02:48.200 | even with fully autonomous driving.
01:02:50.320 | But I've been surprised what is the limit,
01:02:53.680 | especially a vision-based alone, how far that can take us.
01:02:57.760 | So that's my levels of surprise.
01:03:01.160 | Now, can you explain in the same way you said,
01:03:06.160 | like AlphaZero, that's a homework problem
01:03:09.000 | that's scaled large, it's chess,
01:03:11.240 | like who cares, go with it.
01:03:12.840 | Here's actual people using an actual car and driving,
01:03:16.800 | many of them drive more than half their miles
01:03:19.880 | using the system.
01:03:20.960 | - So, yeah, they're doing well with pure vision.
01:03:25.920 | - With pure vision, yeah.
01:03:27.160 | - And now no radar, which is.
01:03:30.360 | - I suspect that can't go all the way.
01:03:32.440 | And one reason is without new cameras
01:03:36.120 | that have a dynamic range closer to the human eye,
01:03:38.280 | 'cause human eye has incredible dynamic range.
01:03:41.040 | And we make use of that dynamic range
01:03:43.280 | and it's 11 orders of magnitude
01:03:46.000 | or some crazy number like that.
01:03:48.720 | The cameras don't have that,
01:03:50.000 | which is why you see the bad cases
01:03:53.560 | where the sun on a white thing blinds it,
01:03:56.680 | in a way it wouldn't blind a person.
01:03:58.480 | I think there's a bunch of things to think about
01:04:04.480 | before you say, this is so good, it's just going to work.
01:04:08.000 | (laughing)
01:04:09.120 | - Okay.
01:04:10.800 | - And I'll come at it from multiple angles.
01:04:14.320 | And I know you've got a lot of time.
01:04:15.840 | - Yeah, okay, let's do this.
01:04:17.680 | - I have thought about these things.
01:04:19.400 | - Yeah, I know.
01:04:20.840 | You've been writing a lot of great blog posts
01:04:23.160 | about it for a while before Tesla had autopilot, right?
01:04:27.360 | So you've been thinking about autonomous driving
01:04:29.320 | for a while from every angle.
01:04:31.400 | - So a few things.
01:04:33.440 | In the US, I think that the death rate
01:04:36.680 | from motor vehicle accidents is about 35,000 a year,
01:04:41.680 | which is an outrageous number.
01:04:47.360 | Not outrageous compared to COVID deaths,
01:04:49.240 | but there is no rationality.
01:04:51.240 | And that's part of the thing.
01:04:53.400 | People have said, engineers say to me,
01:04:55.400 | well, if we cut down the number of deaths by 10%
01:04:58.440 | by having autonomous driving, that's going to be great.
01:05:01.440 | Everyone will love it.
01:05:02.720 | And my prediction is that if autonomous vehicles
01:05:07.720 | kill more than 10 people a year,
01:05:09.800 | there'll be screaming and hollering,
01:05:11.560 | even though 35,000 people a year
01:05:14.360 | have been killed by human drivers.
01:05:16.880 | It's not rational.
01:05:18.520 | It's a different set of expectations.
01:05:20.560 | And that will probably continue.
01:05:22.500 | So there's that aspect of it.
01:05:26.680 | The other aspect of it is that
01:05:30.840 | when we introduce new technology,
01:05:34.120 | we often change the rules of the game.
01:05:37.120 | So when we introduced cars first,
01:05:42.120 | into our daily lives,
01:05:44.200 | we completely rebuilt our cities
01:05:46.160 | and we changed all the laws.
01:05:48.360 | Jaywalking was not an offense.
01:05:51.620 | That was pushed by the car companies
01:05:53.400 | so that people would stay off the road
01:05:54.960 | so there wouldn't be deaths from pedestrians getting hit.
01:05:58.400 | We completely changed the structure of our cities
01:06:01.000 | and had these foul smelling things everywhere around us.
01:06:06.000 | And now you see pushback in cities like Barcelona
01:06:08.840 | is really trying to exclude cars, et cetera.
01:06:12.340 | So I think that to get to self-driving,
01:06:19.240 | we will, large adoption.
01:06:24.040 | It's not going to be just take the current situation,
01:06:27.520 | take out the driver and put the same car
01:06:30.520 | doing the same stuff because the edge cases are too many.
01:06:33.740 | Here's an interesting question.
01:06:36.920 | How many fully autonomous train systems
01:06:43.240 | do we have in the US?
01:06:44.600 | - I mean, do you count them as fully autonomous?
01:06:48.680 | I don't know.
01:06:49.640 | 'Cause there's usually a driver,
01:06:51.120 | but they're kind of autonomous, right?
01:06:52.760 | - No, let's get rid of the driver.
01:06:54.820 | - Okay, I don't know.
01:06:57.400 | - It's either 15 or 16.
01:06:59.600 | Most of them are in airports.
01:07:01.040 | There's a few that go about five,
01:07:04.600 | two that go about five kilometers out of airports.
01:07:07.440 | - Yeah.
01:07:08.280 | - When is the first fully autonomous train system
01:07:15.160 | for mass transit expected to operate fully autonomously
01:07:18.400 | with no driver in any US city?
01:07:23.920 | - It was expected to operate in 2017 in Honolulu.
01:07:28.300 | It's delayed, but they will get there.
01:07:32.160 | But by the way, it was originally gonna be autonomous
01:07:34.880 | here in the Bay Area.
01:07:35.880 | - I mean, they're all very close to fully autonomous, right?
01:07:38.960 | - Yeah, but getting the closest to the thing.
01:07:41.680 | And I've often gone on a fully autonomous train in Japan,
01:07:46.680 | one that goes out to that fake island
01:07:49.600 | in the middle of Tokyo Bay.
01:07:50.880 | I forget the name of the...
01:07:53.600 | And what do you see when you look at that?
01:07:55.680 | What do you see when you go to a fully autonomous train
01:07:58.920 | in an airport?
01:08:02.240 | It's not like regular trains.
01:08:05.660 | At every station, there's a double set of doors.
01:08:10.740 | So that there's a door of the train
01:08:12.240 | and there's a door off the platform.
01:08:16.680 | And it's really visible in this Japanese one
01:08:21.040 | because it goes out in amongst buildings.
01:08:23.880 | The whole track is built so that people can't climb onto it.
01:08:28.040 | So there's an engineering that then makes the system safe
01:08:32.040 | and makes them acceptable.
01:08:33.680 | I think we'll see similar sorts of things happen in the US.
01:08:38.680 | What surprised me, I thought wrongly
01:08:42.440 | that we would have special purpose lanes on 101
01:08:48.700 | in the Bay Area, the leftmost lane,
01:08:51.620 | so that it would be normal for Teslas or other cars
01:08:57.660 | to move into that lane and then say,
01:08:59.660 | "Okay, now it's autonomous," and have that dedicated lane.
01:09:03.420 | I was expecting movement to that.
01:09:06.060 | Five years ago, I was expecting we'd have
01:09:07.460 | a lot more movement towards that.
01:09:08.980 | We haven't.
01:09:10.100 | And it may be because Tesla's been over-promising
01:09:13.300 | by saying this, calling their system fully self-driving.
01:09:17.180 | I think they may have gotten there quicker
01:09:19.540 | by collaborating to change the infrastructure.
01:09:24.540 | This is one of the problems
01:09:27.180 | with long-haul trucking being autonomous.
01:09:32.580 | I think it makes sense on freeways at night
01:09:35.660 | for the trucks to go autonomously.
01:09:38.880 | But then there's the how do you get onto
01:09:41.460 | and off of the freeway?
01:09:42.580 | What sort of infrastructure do you need for that?
01:09:45.860 | Do you need to have the human in there to do that?
01:09:48.540 | Or can you get rid of the human?
01:09:50.700 | So I think there's ways to get there,
01:09:52.580 | but it's an infrastructure argument
01:09:55.100 | because the long tail of cases is very long,
01:10:00.800 | and the acceptance of it will not be
01:10:02.980 | at the same level as human drivers.
01:10:04.840 | - So I'm with you still, and I was with you for a long time,
01:10:10.780 | but I am surprised how well,
01:10:13.660 | how many edge cases of machine learning
01:10:17.060 | and vision-based methods can cover.
01:10:19.340 | This is what I'm trying to get at is,
01:10:21.500 | I think there's something fundamentally different
01:10:26.060 | with vision-based methods and Tesla Autopilot
01:10:29.260 | and any company that's trying to do the same.
01:10:32.140 | - Okay, well, I'm not gonna argue with you
01:10:33.980 | 'cause we're speculating.
01:10:38.980 | - Yes, but--
01:10:39.820 | - And my gut feeling tells me
01:10:43.300 | it's gonna be, things will speed up
01:10:48.140 | when there is engineering of the environment
01:10:50.640 | because that's what happened with every other technology.
01:10:53.300 | - I'm a bit, I don't know about you,
01:10:54.580 | but I'm a bit cynical that infrastructure,
01:10:57.740 | which relies on government to help out in these cases.
01:11:02.740 | If you just look at infrastructure in all domains,
01:11:08.060 | it's just government always drags behind on infrastructure.
01:11:11.620 | There's so many--
01:11:13.740 | - Well, in this country.
01:11:15.100 | - In the, sorry, yes, in this country.
01:11:18.020 | And of course, there's many, many countries
01:11:19.780 | that are actually much worse on infrastructure.
01:11:21.980 | - Oh, yes, many of them are much worse,
01:11:23.460 | and there's some that, like high-speed rail,
01:11:26.180 | the other countries have done much better.
01:11:28.940 | - I guess my question is, which is at the core
01:11:32.580 | of what I was trying to think through here and ask you,
01:11:35.540 | is how hard is the driving problem as it currently stands?
01:11:40.460 | So you mentioned like, we don't want to just
01:11:42.660 | take the human out and duplicate
01:11:43.980 | whatever the human was doing.
01:11:45.260 | But if we were to try to do that,
01:11:47.180 | how hard is that problem?
01:11:51.540 | Because I used to think it's way harder.
01:11:55.580 | Like I used to think it's, with vision alone,
01:11:59.020 | it would be three decades, four decades.
01:12:02.380 | - Okay, so I don't know the answer
01:12:04.100 | to this thing I'm about to pose,
01:12:07.260 | but I do notice that on Highway 280 here in the Bay Area,
01:12:12.260 | which largely has concrete surface
01:12:15.620 | rather than blacktop surface,
01:12:17.180 | the white lines that are painted there
01:12:19.860 | now have black boundaries around them.
01:12:21.940 | And my lane drift system in my car
01:12:27.940 | would not work without those black boundaries.
01:12:30.380 | - Interesting.
01:12:31.220 | - So I don't know whether they've started doing it
01:12:33.140 | to help the lane drift,
01:12:34.700 | whether it is an instance of infrastructure
01:12:37.740 | following the technology,
01:12:40.180 | but my car would not perform as well
01:12:43.580 | without that change in the way they paint the line.
01:12:45.540 | - Unfortunately, really good lane keeping
01:12:48.540 | is not as valuable.
01:12:50.460 | Like it's orders of magnitude more valuable
01:12:53.500 | to have a fully autonomous system.
01:12:55.140 | - But for me, lane keeping is really helpful
01:12:59.740 | 'cause I'm lousy at it.
01:13:01.020 | - But you wouldn't pay 10 times.
01:13:03.820 | Like the problem is there's not financial,
01:13:07.900 | like it doesn't make sense to revamp the infrastructure
01:13:12.420 | to make lane keeping easier.
01:13:14.860 | It does make sense to revamp the infrastructure.
01:13:16.980 | - Oh, I see what you mean.
01:13:17.820 | - If you have a large fleet of autonomous vehicles,
01:13:19.860 | now you change what it means to own cars,
01:13:22.420 | you change the nature of transportation.
01:13:24.420 | I mean, but for that, you need autonomous vehicles.
01:13:29.420 | Let me ask you about Waymo then.
01:13:31.540 | I've gotten a bunch of chances to ride in a Waymo
01:13:35.260 | self-driving car and they're,
01:13:38.540 | I don't know if you'd call them self-driving, but.
01:13:41.140 | - Well, I mean, I rode in one before that called Waymo.
01:13:44.140 | - Yeah. - Still at X.
01:13:45.820 | - So there was a big leap,
01:13:48.700 | another surprising leap I didn't think would happen,
01:13:51.340 | which is they have no driver currently.
01:13:53.900 | - Yeah, in Chandler.
01:13:55.180 | - In Chandler, Arizona.
01:13:56.220 | And I think they're thinking of doing that in Austin as well.
01:13:59.140 | But they're like expanding.
01:14:00.780 | - Although, and I do an annual checkup on this.
01:14:05.780 | So as of late last year,
01:14:07.860 | they were aiming for hundreds of rides a week, not thousands.
01:14:12.860 | And there is no one in the car,
01:14:17.620 | but there's certainly safety people in the loop.
01:14:22.620 | And it's not clear how many,
01:14:24.700 | what the ratio of cars to safety people is.
01:14:27.820 | - It wasn't, obviously they're not 100% transparent
01:14:31.380 | about this, but--
01:14:32.220 | - No, none of them are 100% transparent.
01:14:33.380 | They're very untransparent.
01:14:34.580 | - But at least the way they're,
01:14:38.020 | I don't wanna make definitively,
01:14:39.260 | but they're saying there's no teleoperation.
01:14:41.460 | So like, they're, I mean, okay.
01:14:45.780 | - And that sort of fits with YouTube videos I've seen
01:14:50.140 | of people being trapped in the car by a red cone
01:14:54.580 | on the street.
01:14:56.220 | And they do have rescue vehicles that come
01:14:59.620 | and then a person gets in and drives it.
01:15:02.500 | - Yeah, but isn't it incredible to you,
01:15:05.900 | it was to me to get in a car with no driver
01:15:09.820 | and watch the steering wheel turn.
01:15:12.300 | Like for somebody who has been studying,
01:15:14.740 | at least certainly the human side of autonomous vehicles
01:15:17.380 | for many years, and you've been doing it for way longer.
01:15:20.620 | Like it was incredible to me
01:15:21.660 | that this was actually could happen.
01:15:23.140 | I don't care if that scale is 100 cars.
01:15:24.860 | This is not a demo.
01:15:26.100 | This is not, this is me as a regular human--
01:15:28.980 | - The argument I have is that people make extrapolations
01:15:32.780 | from that.
01:15:33.620 | - Extrapolations.
01:15:34.460 | - That, you know, it's here, it's done.
01:15:36.300 | You know, it's just, you know, we've solved it.
01:15:39.500 | No, we haven't yet.
01:15:41.180 | And that's my argument.
01:15:42.700 | - Okay, so I'd like to go to,
01:15:44.820 | you keep a list of predictions.
01:15:46.940 | - Yeah, okay.
01:15:47.780 | - On your amazing blog posts.
01:15:48.620 | It'd be fun to go through them.
01:15:49.940 | But before then, let me ask you about this.
01:15:52.500 | You have a harshness to you sometimes
01:15:57.500 | in your criticisms of what is perceived as hype.
01:16:04.340 | (both laughing)
01:16:06.700 | And so like, 'cause people extrapolate, like you said,
01:16:09.780 | and they kind of buy into the hype,
01:16:11.620 | and then they kind of start to think that
01:16:15.460 | the technology is way better than it is.
01:16:20.180 | But let me ask you maybe a difficult question.
01:16:23.420 | - Sure.
01:16:25.020 | - Do you think, if you look at history of progress,
01:16:28.660 | don't you think to achieve the quote, "impossible,"
01:16:31.980 | you have to believe that it's possible?
01:16:34.740 | - Absolutely, yeah.
01:16:35.980 | Look, here's two great runs.
01:16:39.940 | Great, unbelievable.
01:16:42.060 | 1903, first powered,
01:16:47.060 | you know, heavier-than-air human flight.
01:16:49.420 | - Yeah.
01:16:50.740 | - 1969, we land on the moon.
01:16:52.940 | That's 66 years.
01:16:54.140 | I'm 66 years old.
01:16:55.900 | In my lifetime, that span of my lifetime,
01:16:58.060 | we went from barely flying, I don't know what it was,
01:17:01.660 | 50 feet, the length of the first flight or something,
01:17:05.020 | to landing on the moon.
01:17:06.380 | Unbelievable.
01:17:08.220 | - Yeah. - Fantastic.
01:17:09.100 | - But that requires, by the way,
01:17:10.780 | one of the Wright brothers, both of them,
01:17:12.780 | but one of them didn't believe it's even possible
01:17:15.300 | like a year before, right?
01:17:17.220 | So like not just possible soon, but like ever.
01:17:21.180 | - So, you know.
01:17:22.660 | - How important is it to believe and be optimistic
01:17:25.140 | is what I guess.
01:17:25.980 | - Oh yeah, it is important.
01:17:26.940 | It's when it goes crazy.
01:17:28.460 | When, you know, you said that,
01:17:31.620 | what was the word you used for my bad?
01:17:34.020 | - Harshness?
01:17:34.860 | - Harshness, yes.
01:17:36.060 | (both laughing)
01:17:38.740 | I just get so frustrated.
01:17:42.820 | - Yes.
01:17:43.660 | - When people make these leaps
01:17:46.140 | and tell me that I don't understand.
01:17:49.380 | - Right.
01:17:50.220 | - Yeah.
01:17:51.980 | Just from iRobot, which I was co-founder of,
01:17:57.980 | I don't know the exact numbers now
01:17:59.300 | 'cause it's 10 years since I stepped off the board,
01:18:01.940 | but I believe it's well over 30 million
01:18:04.260 | robots cleaning houses from that one company.
01:18:06.940 | Then now there's lots of other companies.
01:18:08.380 | - Yes.
01:18:09.220 | - Was that a crazy idea that we had to believe
01:18:15.140 | in 2002 when we released it?
01:18:17.580 | Yeah, that was, we had to, you know,
01:18:21.620 | believe that it could be done.
01:18:23.020 | - Let me ask you about this.
01:18:23.860 | So iRobot, one of the greatest robotics companies ever
01:18:27.820 | in terms of manufacturing,
01:18:29.260 | creating a robot that actually works in the real world
01:18:31.820 | is probably the greatest robotics company ever.
01:18:34.500 | You were the co-founder of it.
01:18:36.100 | If the Rodney Brooks of today
01:18:42.220 | talked to the Rodney of back then,
01:18:44.820 | what would you tell him?
01:18:46.100 | 'Cause I have a sense that,
01:18:48.180 | would you pat him on the back and say,
01:18:50.780 | what you're doing is going to fail,
01:18:53.460 | but go at it anyway?
01:18:55.580 | That's what I'm referring to with the harshness.
01:18:58.700 | You've accomplished an incredible thing there.
01:19:01.300 | One of the several things we'll talk about.
01:19:04.140 | Like that's what I'm trying to get at that line.
01:19:06.740 | - No, it's when,
01:19:08.020 | my harshness is reserved for people who are not doing it,
01:19:13.260 | who claim it's just,
01:19:14.660 | well, this shows that it's just gonna happen.
01:19:16.660 | - But here's the thing.
01:19:18.020 | - This shows-
01:19:18.860 | - But you have that harshness for Elon too.
01:19:22.860 | - And no-
01:19:24.700 | - Or no, it's a different harshness.
01:19:26.140 | - No, it's a different argument with Elon.
01:19:30.380 | You know, I think SpaceX is an amazing company.
01:19:34.860 | On the other hand, you know, in one of my blog posts,
01:19:38.500 | I said, what's easy and what's hard.
01:19:40.620 | I said, SpaceX, vertical landing rockets,
01:19:44.420 | it had been done before.
01:19:46.460 | Grid fins had been done since the '60s.
01:19:48.740 | Every Soyuz has them.
01:19:49.940 | Reusable spacecraft,
01:19:54.900 | the DC-X reused those rockets that landed vertically.
01:19:58.420 | There was a whole insurance industry in place
01:20:02.620 | for rocket launches.
01:20:04.380 | There were all sorts of infrastructure.
01:20:07.180 | That was doable.
01:20:09.580 | It took a great entrepreneur, at great personal expense.
01:20:13.540 | He almost drove himself bankrupt doing it.
01:20:16.540 | A great belief to do it.
01:20:20.340 | Whereas Hyperloop,
01:20:22.740 | there's a whole bunch more stuff
01:20:25.980 | that's never been thought about, never been demonstrated.
01:20:28.380 | So my estimation is Hyperloop is a long, long,
01:20:32.660 | a lot further off.
01:20:33.700 | But, and if I've got a criticism of Elon,
01:20:37.260 | it's that he doesn't make distinctions
01:20:39.740 | between when the technology's coming along and ready,
01:20:44.740 | and then he'll go off and mouth off about other things,
01:20:48.420 | which then people go and compete about and try and do.
01:20:51.500 | - This is where I understand what you're saying.
01:20:57.700 | I tend to draw a different distinction.
01:20:59.740 | I have a similar kind of harshness
01:21:03.460 | towards people who are not telling the truth,
01:21:06.020 | who are basically fabricating stuff to make money
01:21:09.460 | or to--
01:21:10.420 | - Oh, he believes what he says.
01:21:11.540 | I just think he's wrong sometimes.
01:21:12.380 | - To me, that's a very important difference.
01:21:13.780 | - Yeah, I'm not.
01:21:15.220 | - Because I think in order to fly,
01:21:18.060 | in order to get to the moon,
01:21:19.060 | you have to believe even when most people tell you
01:21:23.140 | you're wrong and most likely you're wrong,
01:21:25.540 | but sometimes you're right.
01:21:27.060 | I mean, that's the same thing I have with Tesla Autopilot.
01:21:29.980 | I think that's an interesting one.
01:21:31.940 | I was, especially when I was at MIT
01:21:35.940 | and just the entire human factors
01:21:37.380 | in the robotics community were very negative towards Elon.
01:21:40.380 | It was very interesting for me to observe colleagues at MIT.
01:21:43.380 | I wasn't sure what to make of that.
01:21:46.780 | That was very upsetting to me
01:21:48.540 | because I understood where that's coming from.
01:21:52.020 | And I agreed with them.
01:21:53.300 | And I kind of almost felt the same thing in the beginning
01:21:56.100 | until I kind of opened my eyes
01:21:58.220 | and realized there's a lot of interesting ideas here
01:22:01.500 | that might be over hype.
01:22:02.700 | If you focus yourself on the idea
01:22:05.900 | that you shouldn't call a system full self-driving
01:22:10.900 | when it's obviously not autonomous, fully autonomous,
01:22:14.980 | you're going to miss the magic of the progress.
01:22:17.940 | - You are gonna miss the magic,
01:22:19.020 | but at the same time, there are people who buy it,
01:22:22.260 | literally pay money for it and take those words as given.
01:22:27.140 | - But I haven't, so take words as given is one thing.
01:22:33.580 | I haven't actually seen people that use autopilot
01:22:36.540 | that believe that the behavior is really important,
01:22:39.620 | like the actual action.
01:22:40.820 | So like this is to push back
01:22:43.460 | on the very thing that you're frustrated about,
01:22:45.660 | which is like journalists and general people
01:22:47.900 | buying all the hype and going out.
01:22:51.340 | In the same way, I think there's a lot of hype
01:22:53.740 | about the negatives of this too,
01:22:57.060 | that people are buying without using.
01:22:58.460 | People use the way, this opened my eyes actually,
01:23:03.220 | the way people use a product is very different
01:23:05.940 | than the way they talk about it.
01:23:08.020 | This is true with robotics, with everything.
01:23:09.620 | Everybody has dreams of how a particular product
01:23:12.260 | might be used or so on.
01:23:14.300 | And then when it meets reality,
01:23:16.020 | there's a lot of fear of robotics, for example,
01:23:18.260 | that robots are somehow dangerous
01:23:19.740 | and all those kinds of things.
01:23:20.980 | But when you actually have robots in your life,
01:23:22.780 | whether it's in the factory or in the home,
01:23:25.060 | making your life better, that's going to be,
01:23:27.460 | that's way different.
01:23:28.820 | Your perceptions of it are gonna be way different.
01:23:30.980 | And so my just tension was, was like, here's an innovator,
01:23:35.460 | what is it?
01:23:41.420 | Sorry, Super Cruise from Cadillac
01:23:42.780 | was super interesting too.
01:23:43.900 | That's a really interesting system.
01:23:45.740 | We should like be excited by those innovations.
01:23:48.020 | - Okay, so let me, can I tell you something
01:23:49.420 | that's really annoyed me recently?
01:23:51.740 | It's really annoyed me that the press
01:23:55.020 | and friends of mine on Facebook are going,
01:23:58.420 | these billionaires and their space games,
01:24:00.900 | you know, why are they doing that?
01:24:02.060 | - Yeah, that's been very frustrating.
01:24:03.340 | - Really pisses me off.
01:24:04.380 | I must say, I applaud that.
01:24:07.580 | I applaud it.
01:24:08.940 | It's the taking and not necessarily the people
01:24:12.140 | who are doing the things,
01:24:13.860 | but that I keep having to push back against
01:24:17.700 | on realistic expectations
01:24:19.620 | of when these things can become real.
01:24:22.500 | - Yeah, this was interesting,
01:24:25.540 | because there's been a particular focus for me,
01:24:27.700 | which is autonomous driving.
01:24:29.420 | Elon's prediction of when certain milestones will be hit.
01:24:32.260 | There's several things to be said there
01:24:37.220 | that I thought about,
01:24:39.140 | because whenever he said them,
01:24:40.380 | it was obvious, to me
01:24:42.700 | as a person kind of not inside the system,
01:24:47.700 | that it was unlikely to hit those.
01:24:51.020 | There's two comments I want to make.
01:24:52.860 | One, he legitimately believes it.
01:24:55.340 | And two, much more importantly,
01:24:59.380 | I think that having ambitious deadlines
01:25:04.380 | drives people to do the best work of their life,
01:25:07.460 | even when the odds of those deadlines are very low.
01:25:11.180 | - To a point, and I'm not talking about Elon here.
01:25:13.900 | I'm just saying.
01:25:14.740 | - So there's a line there, right?
01:25:15.940 | - You have to have a line,
01:25:16.780 | because you overextend and it's demoralizing.
01:25:21.580 | But I will say that there's an additional thing here,
01:25:27.060 | that those words also drive the stock market.
01:25:32.060 | And we have, because of the way that rich people
01:25:38.100 | in the past have manipulated the rubes through investment,
01:25:42.860 | we have developed laws about what you're allowed to say
01:25:47.860 | and have a promise.
01:25:50.500 | And there's an area here which is...
01:25:55.260 | I tend to be, maybe I'm naive,
01:25:57.580 | but I tend to believe that engineers, innovators,
01:26:02.580 | people like that, they're not,
01:26:05.060 | they don't think like that,
01:26:07.580 | like manipulating the stock price,
01:26:09.620 | but it's possible that I'm wrong.
01:26:14.340 | It's a very cynical view of the world,
01:26:18.140 | because I think most people that run companies
01:26:21.460 | and build, especially original founders, they...
01:26:25.460 | - Yeah, I'm not saying that's the intent.
01:26:28.900 | I'm saying it's a...
01:26:29.740 | - Eventually it's kind of,
01:26:31.060 | you fall into that kind of a behavior pattern.
01:26:35.260 | I don't know.
01:26:36.100 | I tend to...
01:26:37.260 | - I wasn't saying it's falling into that intent.
01:26:39.700 | It's just, you also have to protect investors
01:26:42.980 | in this market.
01:26:44.740 | - Yeah.
01:26:45.700 | Okay, so you have, first of all,
01:26:47.420 | you have an amazing blog that people should check out,
01:26:50.260 | but you also have this, in that blog,
01:26:52.620 | a set of predictions.
01:26:54.780 | It's such a cool idea.
01:26:55.940 | I don't know how long ago you started,
01:26:57.380 | like three, four years ago?
01:26:58.380 | - It was January 1st, 2018.
01:27:01.980 | - 18, yeah.
01:27:03.060 | - And I made these predictions,
01:27:04.740 | and I said that every January 1st,
01:27:06.700 | I was gonna check back on how my predictions had...
01:27:09.100 | - That's such a great thought experiment.
01:27:10.340 | - For 32 years.
01:27:12.060 | - Oh, so you said 32 years.
01:27:13.460 | - I said 32 years,
01:27:14.300 | 'cause I thought that'll be January 1st, 2050.
01:27:17.020 | I'll be, I will just turn 95.
01:27:20.220 | (both laughing)
01:27:24.540 | - Nice.
01:27:25.380 | And so people know that your predictions,
01:27:29.940 | at least for now,
01:27:30.860 | are in the space of artificial intelligence.
01:27:33.340 | - Yeah, I didn't say I was gonna make new predictions.
01:27:35.020 | I was just gonna measure this set of predictions
01:27:36.660 | that I made, 'cause I was sort of annoyed
01:27:39.100 | that everyone could make predictions
01:27:40.780 | that didn't come true and everyone forgot.
01:27:42.580 | So I said, "I should hold myself to a high standard."
01:27:44.980 | - Yeah, but also just putting years
01:27:46.860 | and date ranges on things,
01:27:48.860 | it's a good thought exercise.
01:27:51.020 | And reasoning your thoughts out.
01:27:53.060 | And so the topics are artificial intelligence,
01:27:56.340 | autonomous vehicles, and space.
01:27:58.380 | I was wondering if we could just go through some
01:28:03.460 | that stand out, maybe from memory,
01:28:04.860 | I can just mention to you some,
01:28:06.300 | let's talk about self-driving cars,
01:28:08.300 | some predictions that you're particularly proud of
01:28:11.220 | or are particularly interesting,
01:28:14.660 | from flying cars to,
01:28:17.260 | the other element here is how widespread the locations
01:28:22.260 | where the autonomous vehicles are deployed will be.
01:28:25.060 | And there's also just a few fun ones.
01:28:27.740 | Is there something that jumps to mind
01:28:28.980 | that you remember from the predictions?
01:28:30.940 | - Well, I think I did put in there
01:28:33.820 | that there would be a dedicated self-driving lane
01:28:37.500 | on 101 by some year,
01:28:39.620 | and I think I was over-optimistic on that one.
01:28:42.500 | - Yeah, actually, yeah, I actually do remember that.
01:28:44.300 | But I think you were mentioning difficulties
01:28:47.420 | in different cities.
01:28:48.660 | - Yeah, yeah.
01:28:49.580 | - So Cambridge, Massachusetts, I think was an example.
01:28:52.580 | - Yeah, like in Cambridgeport.
01:28:54.180 | I lived in Cambridgeport for a number of years,
01:28:56.980 | and the roads are narrow,
01:28:59.180 | and getting anywhere as a human driver
01:29:02.260 | is incredibly frustrating when you start to put,
01:29:04.940 | and people drive the wrong way on one-way streets there.
01:29:07.860 | - So your prediction was driverless taxi services
01:29:13.500 | operating on all streets in Cambridgeport, Massachusetts,
01:29:17.580 | in 2035.
01:29:21.100 | - Yeah, and that may have been too optimistic.
01:29:25.100 | - You think, so-
01:29:26.140 | - You know, I've gotten a little more pessimistic
01:29:28.420 | since I made these internally on some of these things.
01:29:31.500 | - So what,
01:29:32.340 | can you put a year to a major milestone
01:29:37.260 | of deployment of a taxi service
01:29:42.460 | in a few major cities?
01:29:43.660 | Like something where you feel like
01:29:45.500 | autonomous vehicles are here.
01:29:47.580 | - So let's take the grid streets
01:29:52.420 | of San Francisco north of Market.
01:29:56.020 | - Okay. - Okay.
01:29:57.100 | Relatively benign environment.
01:30:03.980 | The streets are wide.
01:30:05.140 | The major problem is delivery trucks stopping everywhere,
01:30:12.340 | which has made things more complicated.
01:30:14.340 | A taxi system there with
01:30:18.300 | somewhat designated pickup and drop-offs,
01:30:23.340 | unlike with Uber and Lyft,
01:30:24.620 | where you can sort of get to any place
01:30:26.940 | and the drivers will figure out how to get in there.
01:30:31.780 | We're still a few years away.
01:30:35.180 | I live in that area.
01:30:38.220 | So I see the self-driving car companies'
01:30:41.740 | cars, multiple ones, every day.
01:30:45.300 | Nowadays Cruise,
01:30:47.100 | Zoox less often, Waymo all the time,
01:30:54.940 | different ones come and go.
01:30:56.900 | - And there's always a driver.
01:30:58.740 | - There's always a driver at the moment.
01:31:00.700 | Although I have noticed that sometimes the driver
01:31:05.180 | does not have the authority to take over
01:31:07.660 | without talking to the home office
01:31:10.580 | because they will sit there waiting for a long time.
01:31:15.020 | And clearly something's going on
01:31:16.900 | where the home office is making a decision.
01:31:19.660 | - That's fascinating.
01:31:20.500 | - So they're, you know,
01:31:22.180 | and so you can see whether they've got their hands
01:31:24.740 | on the wheel or not.
01:31:25.700 | And it's the incident resolution time
01:31:29.220 | that tells you, gives you some clues.
01:31:31.460 | - So what year do you think, what's your intuition?
01:31:33.940 | What date range are you currently thinking
01:31:36.740 | San Francisco would be autonomous taxi service
01:31:41.500 | from any point A to any point B without a driver?
01:31:46.220 | Are you still, are you thinking 10 years from now,
01:31:51.500 | 20 years from now, 30 years from now?
01:31:53.140 | - Certainly not 10 years from now.
01:31:55.660 | It's going to be longer.
01:31:57.020 | If you're allowed to go south of Market, way longer.
01:31:59.620 | And unless there's re-engineering of roads.
01:32:04.420 | - By the way, what's the biggest challenge?
01:32:05.820 | You mentioned a few.
01:32:07.020 | Is it the delivery trucks?
01:32:10.340 | Is it the edge cases, the computer perception?
01:32:13.500 | - Well, here's a case that I saw outside my house
01:32:16.580 | a few weeks ago, about 8 p.m. on a Friday night.
01:32:19.940 | It was getting dark.
01:32:20.780 | It was before the solstice.
01:32:22.140 | It was a Cruise vehicle that came down the hill,
01:32:27.780 | turned right and stopped dead covering the crosswalk.
01:32:32.780 | Why did it stop dead?
01:32:35.260 | 'Cause there was a human just two feet from it.
01:32:39.500 | Now I just glanced, I knew what was happening.
01:32:41.860 | The human was a woman, was at the door of her car
01:32:46.260 | trying to unlock it with one of those things
01:32:48.260 | that you know, when you don't have a key.
01:32:50.660 | That car thought, oh, she could jump out
01:32:53.860 | in front of me any second.
01:32:55.660 | As a human, I could tell, no, she's not gonna jump out.
01:32:57.940 | She's busy trying to unlock her car, she's lost her keys.
01:33:00.420 | She's trying to get in the car.
01:33:01.900 | And it stayed there for, until I got bored.
01:33:06.300 | - Yeah.
01:33:07.140 | - And so the human driver in there did not take over.
01:33:11.940 | But here's the kicker to me.
01:33:14.460 | A guy comes down the hill with a stroller.
01:33:18.700 | I assume there's a baby in there.
01:33:20.780 | And now the crosswalk's blocked by this Cruise vehicle.
01:33:25.780 | What's he gonna do?
01:33:28.140 | Cleverly, I think he decided not to go in front of the car.
01:33:31.300 | (laughing)
01:33:33.300 | He went, but he had to go behind it.
01:33:35.100 | He had to get off the crosswalk, out into the intersection
01:33:38.220 | to push his baby around this car, which was stopped there.
01:33:41.300 | And no human driver would have stopped there
01:33:43.020 | for that length of time.
01:33:45.140 | They would have got out and out of the way.
01:33:46.980 | And that's another one of my pet peeves
01:33:51.580 | that safety is being compromised for individuals
01:33:56.140 | who didn't sign up for having this happen
01:33:58.300 | in their neighborhood.
01:34:00.620 | - Yeah, but--
01:34:01.900 | - Now you can say that's an edge case, but--
01:34:04.700 | - Yeah, well, I'm in general not a fan
01:34:08.100 | of anecdotal evidence for stuff.
01:34:12.260 | This is one of my biggest problems
01:34:14.820 | with the discussion of autonomous vehicles in general.
01:34:17.140 | People that criticize them or support them
01:34:19.100 | are using edge cases.
01:34:20.580 | - Okay.
01:34:21.420 | - Are using anecdotal evidence.
01:34:22.700 | - So let me--
01:34:23.540 | - But I got you.
01:34:24.740 | - Your question is when is it gonna happen in San Francisco?
01:34:26.940 | I say not soon, but it's gonna be one of them.
01:34:29.180 | But where it is gonna happen is in limited domains,
01:34:33.660 | campuses of various sorts, gated communities,
01:34:39.860 | where the other drivers are not arbitrary people.
01:34:44.860 | They're people who know about these things.
01:34:48.180 | They've, you know, been warned about them.
01:34:50.740 | And at velocities where it's always safe to stop dead.
01:34:55.740 | - Yeah.
01:34:57.260 | - You can't do that on the freeway.
01:34:58.860 | That I think we're gonna start to see.
01:35:00.740 | And they may not be shaped like current cars.
01:35:06.300 | They may be things like May Mobility has those things
01:35:10.900 | and various companies have these.
01:35:12.780 | - Yeah, I wonder if that's a compelling experience.
01:35:14.540 | To me, it's always important.
01:35:15.940 | It's not just about automation.
01:35:17.220 | It's about creating a product that makes your,
01:35:20.460 | it's not just cheaper, but it makes your,
01:35:22.460 | that's fun to ride.
01:35:23.620 | One of the least fun things is for a car that stops
01:35:28.580 | and like waits.
01:35:29.660 | There's something deeply frustrating for us humans,
01:35:32.860 | for the rest of the world to take advantage of us
01:35:34.780 | as we wait.
01:35:35.620 | - But think about, you know, not you as the customer,
01:35:40.620 | but someone who's in their 80s in a retirement village
01:35:47.700 | whose kids have said, "You are not driving anymore."
01:35:52.020 | And this gives you the freedom to go to the market.
01:35:54.420 | - That's a hugely beneficial thing,
01:35:56.020 | but it's a very few orders of magnitude
01:35:59.340 | less impact on the world.
01:36:00.980 | It's not, it's just a few people in a small community
01:36:03.660 | using cars as opposed to the entirety of the world.
01:36:06.460 | I like that the first time that a car equipped
01:36:11.500 | with some version of a solution to the trolley problem
01:36:14.500 | is, what's NIML stand for?
01:36:16.460 | - Not in my lifetime.
01:36:17.300 | - Not in my lifetime.
01:36:18.140 | - I define my lifetime as--
01:36:19.620 | - Up to 2050.
01:36:20.460 | - 2050.
01:36:21.820 | Yeah.
01:36:23.860 | - You know, I ask you, when have you had to decide
01:36:27.460 | which person shall I kill?
01:36:29.580 | No, you put the brakes on and you brake as hard as you can.
01:36:33.220 | I mean, you're not making that decision.
01:36:35.500 | - It is, you know, I do think autonomous vehicles
01:36:38.540 | or semi-autonomous vehicles do need to solve
01:36:40.740 | the whole pedestrian problem that has elements
01:36:43.300 | of the trolley problem within it, but it's not--
01:36:45.740 | - Yeah, well, so here's, and I talk about it
01:36:47.740 | in one of the articles or blog posts that I wrote.
01:36:50.340 | Here's, and people have told me,
01:36:53.140 | one of my coworkers has told me he does this.
01:36:55.780 | He tortures autonomously driven vehicles
01:36:59.580 | and pedestrians will torture them.
01:37:01.740 | Now, you know, once they realize that, you know,
01:37:04.380 | putting one foot off the curb makes the car think
01:37:07.180 | that they might walk into the road, kids,
01:37:09.300 | teenagers will be doing that all the time.
01:37:11.100 | They will.
01:37:12.220 | - I, by the way, one of my,
01:37:13.660 | and this is a whole nother discussion,
01:37:15.060 | 'cause my main issue with robotics
01:37:17.180 | is HRI, human robot interaction.
01:37:20.340 | I believe that robots that interact with humans
01:37:22.980 | will have to push back.
01:37:25.500 | Like they can't just be bullied
01:37:29.140 | because that creates a very uncompelling experience
01:37:31.460 | for the humans.
01:37:32.420 | - Yeah, well, you know, Waymo,
01:37:33.660 | before it was called Waymo, discovered that, you know,
01:37:36.500 | they had to do that at four-way intersections.
01:37:39.140 | They had to nudge forward to get the queue
01:37:42.140 | that they were gonna go, 'cause otherwise
01:37:44.020 | the other drivers would just beat them all the time.
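To make that behavior concrete, here is a toy sketch of the kind of assertiveness being described; it is not Waymo's actual logic, and the action names and timing thresholds are invented purely for illustration.

```python
# Toy sketch (not Waymo's actual logic) of "nudging" at a four-way stop:
# a robot car that never signals intent gets beaten by human drivers forever,
# so after stopping it creeps forward to claim its turn. Timings are invented.

def four_way_stop_action(seconds_stopped: float, cross_traffic_moving: bool) -> str:
    """Return the next action for a robot car waiting at a four-way stop."""
    if cross_traffic_moving:
        return "wait"            # yield while someone else is committed
    if seconds_stopped < 1.0:
        return "wait"            # complete the required stop first
    if seconds_stopped < 3.0:
        return "nudge_forward"   # creep a little to signal intent to go
    return "proceed"             # take the turn before others beat you to it
```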
01:37:47.500 | - So you co-founded iRobot, as we mentioned,
01:37:50.420 | one of the most successful robotics companies ever.
01:37:54.540 | What are you most proud of with that company
01:37:56.620 | and the approach you took to robotics?
01:38:01.620 | - Well, there's something I'm quite proud of there,
01:38:04.860 | which may be a surprise,
01:38:06.780 | but I was still on the board when this happened.
01:38:10.620 | It was March, 2011.
01:38:13.420 | And we sent robots to Japan,
01:38:17.420 | and they were used to help shut down
01:38:21.660 | the Fukushima Daiichi nuclear power plant,
01:38:25.860 | which was, everything was, I've been there since.
01:38:28.740 | I was there in 2014, and the robots,
01:38:31.660 | some of the robots were still there.
01:38:33.300 | I was proud that we were able to do that.
01:38:36.220 | Why were we able to do that?
01:38:37.820 | And, you know, people have said,
01:38:40.260 | well, you know, Japan is so good at robotics.
01:38:43.460 | It was because we had had about 6,500 robots
01:38:48.300 | deployed in Iraq and Afghanistan, teleoperated,
01:38:52.300 | but with intelligence, dealing with roadside bombs.
01:38:57.140 | So we had, I think it was at that time,
01:38:59.540 | nine years of in-field experience
01:39:03.060 | with the robots in harsh conditions.
01:39:05.500 | Whereas the Japanese robots, which were, you know,
01:39:08.220 | getting, you know, this goes back to what
01:39:10.860 | annoys me so much, getting all the hype.
01:39:12.660 | Look at that, look at that Honda robot.
01:39:15.180 | It can walk.
01:39:16.020 | Wow, the future's here.
01:39:18.500 | Couldn't do a thing because they weren't deployed,
01:39:21.380 | but we had deployed in really harsh conditions
01:39:24.020 | for a long time.
01:39:24.940 | And so we're able to do something very positive
01:39:29.060 | in a very bad situation.
01:39:31.020 | - What about just the simple,
01:39:33.900 | and for people who don't know,
01:39:34.900 | one of the things that iRobot has created
01:39:37.220 | is the Roomba vacuum cleaner.
01:39:40.780 | What about the simple robot that is the Roomba,
01:39:45.980 | quote unquote simple, that's deployed
01:39:48.580 | in tens of millions of homes?
01:39:52.180 | What do you think about that?
01:39:54.900 | - Well, I make the joke that I started out life
01:39:58.220 | as a pure mathematician
01:40:00.060 | and turned into a vacuum cleaner salesman.
01:40:02.980 | So if you're going to be an entrepreneur,
01:40:05.500 | be ready to do anything.
01:40:08.300 | But I was, you know,
01:40:10.700 | there was a wacky lawsuit that I got deposed for
01:40:17.740 | not too many years ago.
01:40:19.660 | And I was the only one who had emailed from the 1990s
01:40:23.900 | and no one in the company had it.
01:40:26.620 | So I went and went through my email
01:40:29.340 | and it reminded me of, you know,
01:40:32.780 | the joy of what we were doing.
01:40:35.140 | And what was I doing?
01:40:37.140 | What was I doing at the time we were building the Roomba?
01:40:40.140 | One of the things was we had this incredibly tight budget
01:40:47.660 | 'cause we wanted to put it on the shelves at $200.
01:40:51.580 | There was another home cleaning robot at the time.
01:40:55.100 | It was the Electrolux Trilobite,
01:40:59.100 | which sold for 2000 euros.
01:41:03.220 | And to us, that was not going to be a consumer product.
01:41:06.100 | So we had reason to believe that $200
01:41:09.140 | was a thing that people would buy at.
01:41:12.660 | That was our aim.
01:41:13.700 | But that meant we had, you know,
01:41:15.420 | that's on the shelf making profit.
01:41:19.060 | That means the cost of goods has to be minimal.
01:41:22.820 | So I find all these emails of me going, you know,
01:41:26.740 | I'd be in Taipei for a MIT meeting
01:41:30.420 | and I'd stay a few extra days.
01:41:31.740 | I'd go down to Hsinchu and talk to these little tiny
01:41:34.420 | companies, lots of little tiny companies outside of TSMC,
01:41:38.460 | Taiwan Semiconductor Manufacturing Corporation,
01:41:42.620 | which let all these little companies be fabless.
01:41:45.460 | They didn't have to have their own fab
01:41:46.900 | so they could innovate.
01:41:48.740 | And they were building,
01:41:51.700 | their innovations were to build stripped down 6502s.
01:41:55.500 | The 6502 was what was in an Apple I.
01:41:57.700 | Get rid of half the silicon and still have it be viable.
01:42:00.980 | And I'd previously got some of those
01:42:03.980 | for some earlier failed products of iRobot.
01:42:07.860 | And then that was in Hong Kong,
01:42:11.220 | going to all these companies that built,
01:42:14.420 | you know, they weren't gaming in the current sense.
01:42:16.980 | There were these handheld games that you would play
01:42:19.580 | or birthday cards.
01:42:22.900 | 'Cause we had about a 50 cent budget for computation.
01:42:26.060 | So I'm trekking from place to place,
01:42:29.260 | looking at their chips,
01:42:31.300 | looking at what they'd removed.
01:42:33.060 | Oh, the interrupt,
01:42:34.740 | the interrupt handling is too weak for a general purpose.
01:42:38.740 | So I was going deep technical detail.
01:42:41.660 | And then I found this one from a company called Winbond,
01:42:44.540 | which had, and I'd forgotten that it had this much RAM.
01:42:47.700 | It had 512 bytes of RAM and it was in our budget
01:42:51.620 | and it had all the capabilities we needed.
01:42:54.660 | - Yeah.
01:42:55.500 | - So.
01:42:56.340 | - And you were excited.
01:42:57.180 | - Yeah, and I was reading all these emails,
01:42:59.140 | "Colin, I found this."
01:43:00.500 | (laughs)
01:43:02.980 | - Did you think,
01:43:03.820 | did you ever think that you guys could be so successful?
01:43:06.500 | Like eventually this company would be so successful.
01:43:09.580 | Did you, could you possibly have imagined?
01:43:11.860 | - No, we never did think that.
01:43:13.980 | We'd had 14 failed business models up to 2002.
01:43:17.260 | And then we had two winners the same year.
01:43:19.500 | No, and then, you know,
01:43:23.540 | we, I remember the board,
01:43:27.940 | 'cause by this time we had some venture capital in.
01:43:31.460 | The board went along with us building
01:43:34.340 | some robots for, you know,
01:43:38.380 | aiming at the Christmas 2002 market.
01:43:41.860 | And we went three times over what they authorized
01:43:46.060 | and built 70,000 of them and sold them all in that first,
01:43:50.460 | 'cause we released on September 18th
01:43:52.780 | and they were all sold by Christmas.
01:43:55.580 | So it was,
01:43:56.860 | so we were gutsy, but.
01:43:59.900 | (laughs)
01:44:00.980 | - But yeah, you didn't think this will take over the world.
01:44:03.620 | Well, this is,
01:44:04.580 | so a lot of amazing robotics companies
01:44:10.100 | have gone under over the past few decades.
01:44:13.420 | Why do you think it's so damn hard
01:44:16.540 | to run a successful robotics company?
01:44:20.700 | - There's a few things.
01:44:23.860 | One is expectations of capabilities
01:44:27.740 | by the founders that are off base.
01:44:32.620 | - The founders, not the consumer, the founders.
01:44:34.580 | - Yeah, expectations of what can be delivered, sure.
01:44:37.180 | Mispricing, and what a customer thinks is a valid price
01:44:43.220 | is not rational necessarily.
01:44:45.900 | - Yeah.
01:44:47.540 | - And expectations of customers,
01:44:53.460 | just the
01:44:54.300 | sheer hardness of getting people to adopt a new technology.
01:45:00.700 | And I've suffered from all three of these.
01:45:03.900 | I've had more failures than successes
01:45:06.860 | in terms of companies.
01:45:09.060 | I've suffered from all three.
01:45:12.940 | - Do you think
01:45:15.180 | one day there will be a robotics company,
01:45:20.740 | and by robotics company, I mean,
01:45:22.380 | where your primary source of income is from robots,
01:45:26.220 | that will be a trillion plus dollar company?
01:45:29.620 | And if so, what would that company do?
01:45:32.780 | - I can't, you know,
01:45:38.140 | because I'm still starting robot companies.
01:45:40.420 | - Yeah.
01:45:41.260 | (both laughing)
01:45:43.220 | - I'm not making any such predictions in my own mind.
01:45:46.460 | I'm not thinking about a trillion dollar company.
01:45:48.220 | And by the way, I don't think, you know,
01:45:50.380 | in the '90s anyone was thinking
01:45:51.740 | that Apple would ever be a trillion dollar company.
01:45:53.700 | So these are very hard to predict.
01:45:57.380 | - But, sorry to interrupt,
01:45:58.980 | but don't you, 'cause I kind of have a vision
01:46:01.860 | in a small way, a big vision in a small way,
01:46:05.700 | that I see that there will be robots in the home
01:46:08.940 | at scale, like Roomba, but more.
01:46:13.660 | And that's trillion dollar.
01:46:15.380 | - Right.
01:46:16.220 | And I think there's a real market pull for them
01:46:18.900 | because of the demographic inversion.
01:46:22.260 | You know, who's gonna do all the stuff for the older people?
01:46:26.340 | There's too many, you know, I'm leading here.
01:46:29.900 | (both laughing)
01:46:31.380 | There's gonna be too many of us.
01:46:33.340 | But we don't have capable enough robots
01:46:39.220 | to make that economic argument at this point.
01:46:42.380 | Do I expect that that will happen?
01:46:44.260 | Yes, I expect it will happen.
01:46:45.460 | But I gotta tell you,
01:46:47.140 | we introduced the Roomba in 2002
01:46:49.620 | and I stayed another nine years.
01:46:52.820 | We were always trying to find
01:46:54.100 | what the next home robot would be.
01:46:55.780 | And still today, the primary product of 20 years,
01:47:00.700 | almost 20 years later, 19 years later,
01:47:02.940 | the primary product is still the Roomba.
01:47:04.540 | So iRobot hasn't found the next one.
01:47:07.940 | - Do you think it's possible for one person in the garage
01:47:10.300 | to build it versus like Google launching,
01:47:14.580 | Google self-driving car that turns into Waymo?
01:47:17.460 | Do you think it's possible?
01:47:18.620 | This is almost like what it takes
01:47:19.860 | to build a successful robotics company.
01:47:22.020 | Do you think it's possible to go from the ground up
01:47:23.900 | or is it just too much capital investment?
01:47:26.340 | - Yeah, so it's very hard to get there
01:47:29.900 | without a lot of capital.
01:47:32.620 | And we're starting to see, you know,
01:47:35.340 | fair chunks of capital for some robotics companies.
01:47:39.580 | You know, Series Bs,
01:47:41.420 | I just saw one yesterday for $80 million.
01:47:44.420 | I think it was for Covariant.
01:47:46.420 | But it can take real money to get into these things
01:47:53.460 | and you may fail along the way.
01:47:54.780 | I've certainly failed at Rethink Robotics
01:47:57.660 | and we lost $150 million in capital there.
01:48:00.980 | - So, okay, so Rethink Robotics
01:48:02.820 | is another amazing robotics company you co-founded.
01:48:06.580 | So what was the vision there?
01:48:09.100 | What was the dream?
01:48:11.180 | And what are you most proud of with Rethink Robotics?
01:48:15.780 | - I'm most proud of the fact that we got robots
01:48:19.780 | out of the cage in factories in a way that was safe,
01:48:23.180 | absolutely safe for people and robots to be next to each other.
01:48:26.300 | - So these are robotic arms.
01:48:27.820 | - Robotic arms for people to pick up stuff
01:48:29.820 | and interact with humans.
01:48:31.220 | - Yeah, and that humans could retask them
01:48:34.340 | without writing code.
01:48:35.420 | And now that's sort of become an expectation
01:48:39.100 | for a lot of other little companies
01:48:40.620 | and big companies are advertising they're doing.
01:48:43.220 | - That's both an interface problem and also a safety problem.
01:48:46.540 | - Yeah, yeah.
01:48:48.580 | So I'm most proud of that.
01:48:50.620 | I completely, I let myself be talked out of
01:48:57.220 | what I wanted to do.
01:49:00.340 | And you know, you always got, you know,
01:49:02.060 | I can't replay the tape.
01:49:03.460 | You know, I can't replay it.
01:49:05.460 | Maybe, maybe, you know, if I'd been stronger on,
01:49:09.980 | and I remember the day, I remember the exact meeting.
01:49:12.820 | - Can you take me through that meeting?
01:49:16.300 | - Yeah.
01:49:17.140 | So I'd said that, I'd set as a target for the company
01:49:22.300 | that we were gonna build $3,000 robots with force feedback
01:49:25.860 | that was safe for people to be around.
01:49:29.820 | - Wow.
01:49:30.660 | - That was my goal.
01:49:32.300 | And we built, so we started in 2008
01:49:35.980 | and we had prototypes built of plastic,
01:49:39.580 | plastic gear boxes and at a $3,000, you know, lifetime,
01:49:44.580 | oh, $3,000, I was saying, we're gonna go after
01:49:49.780 | not the people who already have robot arms in factories,
01:49:52.660 | the people who would never have a robot arm.
01:49:55.100 | We're gonna go after a different market.
01:49:57.020 | So we don't have to meet their expectations.
01:49:59.260 | And so we're gonna build it out of plastic.
01:50:03.020 | It doesn't have to have a 35,000 hour lifetime.
01:50:05.900 | It's gonna be so cheap that it's OPEX, not CAPEX.
01:50:09.620 | And so we had a prototype that worked reasonably well,
01:50:14.620 | but the control engineers were complaining
01:50:19.940 | about these plastic gear boxes
01:50:21.900 | with a beautiful little planetary gearbox,
01:50:24.580 | but we could use something called series elastic actuators.
01:50:29.580 | We embedded them in there.
01:50:31.140 | We can measure forces.
01:50:32.300 | We knew when we hit something, et cetera.
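As a minimal sketch of how a series elastic actuator lets the arm feel contact, assuming a simple torsional-spring model: the spring between the motor and the load winds up under load, and the measured deflection gives the transmitted torque. The stiffness, threshold, and function names below are illustrative, not Rethink's actual parameters or code.

```python
# Minimal sketch of series-elastic force sensing (illustrative values only).
# A spring of known stiffness sits between the motor output and the load;
# measuring how much it winds up gives the transmitted torque (Hooke's law),
# which is how the arm can "feel" that it has hit something.

SPRING_STIFFNESS_NM_PER_RAD = 350.0  # hypothetical spring constant
CONTACT_THRESHOLD_NM = 2.0           # torque above which we assume contact


def joint_torque(motor_angle_rad: float, load_angle_rad: float) -> float:
    """Estimate transmitted torque from the spring's deflection."""
    deflection = motor_angle_rad - load_angle_rad
    return SPRING_STIFFNESS_NM_PER_RAD * deflection


def in_contact(motor_angle_rad: float, load_angle_rad: float) -> bool:
    """Report contact when the sensed torque exceeds the threshold."""
    return abs(joint_torque(motor_angle_rad, load_angle_rad)) > CONTACT_THRESHOLD_NM
```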
01:50:35.260 | The control engineers were saying,
01:50:36.580 | yeah, but this is torque ripple
01:50:39.300 | 'cause these plastic gears, they're not great gears.
01:50:42.180 | And there's this ripple and trying to do force control
01:50:44.940 | around this ripple is so hard.
01:50:47.420 | And I'm not gonna name names,
01:50:51.740 | but I remember one of the mechanical engineers saying,
01:50:54.900 | we'll just build a metal gearbox with spur gears
01:50:58.140 | and it'll take six weeks, we'll be done, problem solved.
01:51:03.700 | Two years later, we got the spur gearbox working.
01:51:06.940 | We cost reduced in every possible way we could,
01:51:11.420 | but now the price went up to,
01:51:15.540 | and then the CEO at the time said,
01:51:17.500 | well, we have to have two arms, not one arm.
01:51:19.900 | So our first robot product, Baxter, now costs $25,000.
01:51:24.700 | And the only people who were gonna look at that
01:51:28.180 | were people who had arms in factories
01:51:30.420 | 'cause that was somewhat cheaper for two arms
01:51:32.260 | than arms in factories,
01:51:34.300 | but they were used to 0.1 millimeter
01:51:36.860 | reproducibility of motion and certain velocities.
01:51:41.860 | And I kept thinking, but that's not what we're giving you.
01:51:45.700 | You don't need position repeatability.
01:51:47.420 | You use force control like a human does.
01:51:49.660 | No, no, but we want that repeatability.
01:51:53.220 | We want that repeatability.
01:51:54.620 | All the other robots have that repeatability.
01:51:56.420 | Why don't you have that repeatability?
01:51:58.580 | - So can you clarify, force control is you can grab the arm
01:52:01.860 | and you can move it.
01:52:02.700 | - Yeah, well, you can move it around,
01:52:03.780 | but suppose you, can you see that?
01:52:06.980 | - Yes.
01:52:07.820 | - Suppose you want to,
01:52:09.980 | - Yes.
01:52:11.220 | - Suppose this thing is a precise thing
01:52:13.620 | that's gotta fit here in this right angle.
01:52:15.980 | Under position control, you have fixtured where this is.
01:52:21.340 | You know where this is precisely,
01:52:23.260 | and you just move it, and it goes there.
01:52:26.260 | In force control, you would do something like
01:52:28.620 | slide it over here till we feel that,
01:52:30.380 | and slide it in there.
01:52:31.620 | And that's how a human gets that precision.
01:52:34.620 | - Yeah.
01:52:35.460 | - They use force feedback.
01:52:36.340 | - Yes.
01:52:37.180 | - And get the things to mate,
01:52:38.020 | rather than just go straight to it.
01:52:40.700 | - Yeah.
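To spell out the contrast in code, here is a hedged sketch of the two strategies; the robot object and its methods (move_to, sensed_force_z, nudge, is_seated, comply_along_force) are hypothetical placeholders, not any real product's API.

```python
# Illustrative contrast between position control and force control for a
# peg-in-hole style insertion. The robot object and its methods are
# hypothetical placeholders for whatever controller is actually available.

def position_controlled_insert(robot, taught_pose):
    # Relies on sub-millimeter repeatability and a precisely fixtured part:
    # drive straight to the coordinates that were taught ahead of time.
    robot.move_to(taught_pose)


def force_controlled_insert(robot, approach_pose, contact_force_n=5.0):
    # Human-like strategy: get roughly near the hole, slide until a contact
    # force is felt, then comply with the sensed forces until the parts mate.
    robot.move_to(approach_pose)                     # coarse positioning is enough
    while robot.sensed_force_z() < contact_force_n:  # slide until we feel the surface
        robot.nudge(dx_m=0.001)                      # small 1 mm sideways step
    while not robot.is_seated():                     # wiggle in until it drops home
        robot.comply_along_force()                   # move where the forces allow
```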
01:52:41.540 | - Couldn't convince our customers who were in factories
01:52:46.940 | and were used to thinking about things a certain way,
01:52:49.540 | and they wanted it, wanted it, wanted it.
01:52:51.940 | So then we said, okay, we're gonna build an arm
01:52:54.980 | that gives you that.
01:52:56.260 | So now we ended up building a $35,000 robot
01:52:58.940 | with one arm with, oh, what are they called?
01:53:03.540 | A certain sort of gearbox made by a company whose name
01:53:09.980 | I can't remember right now,
01:53:10.980 | but it's the name of the gearbox.
01:53:12.620 | But it's got torque ripple in it.
01:53:18.020 | So now there was an extra two years
01:53:19.820 | of solving the problem of doing the force
01:53:21.780 | with the torque ripple.
01:53:22.660 | So we had to do the thing we had avoided
01:53:28.620 | for the plastic gearboxes.
01:53:29.780 | We ended up having to do, the robot was now overpriced.
01:53:33.340 | - And that was your intuition from the very beginning,
01:53:37.500 | kind of that this is not,
01:53:39.020 | you're opening a door to solve a lot of problems
01:53:42.780 | that you're eventually gonna have to solve
01:53:44.900 | this problem anyway.
01:53:45.860 | - Yeah, and also I was aiming at a low price
01:53:48.340 | to go into a different market.
01:53:49.380 | - Low price.
01:53:50.220 | - That didn't have robots.
01:53:51.060 | - $3,000 would be amazing.
01:53:52.620 | - Yeah, I think we could have done it for five.
01:53:55.340 | But you said, talked about setting the goal
01:53:58.580 | a little too far for the engineers.
01:54:00.300 | - Exactly.
01:54:01.140 | Why would you say that company not failed, but went under?
01:54:08.700 | - We had buyers and there's this thing called
01:54:14.780 | the Committee on Foreign Investment in the US, CFIUS.
01:54:17.980 | And that had previously been invoked twice
01:54:24.620 | around where the government could stop foreign money
01:54:28.020 | coming into a US company based on defense requirements.
01:54:33.020 | We went through due diligence multiple times.
01:54:37.780 | We were gonna get acquired,
01:54:39.100 | but every consortium had Chinese money in it.
01:54:43.740 | And all the bankers would say at the last minute,
01:54:46.300 | you know, this isn't gonna get past CFIUS.
01:54:49.380 | And the investors would go away.
01:54:51.940 | And then we had two buyers,
01:54:54.340 | when we were about to run out of money, two buyers.
01:54:57.300 | And one used heavy-handed legal stuff with the other one,
01:55:00.720 | said they were gonna take it and pay more,
01:55:06.900 | dropped out when we were out of cash,
01:55:08.780 | and then bought the assets at 1/30th of the price
01:55:12.100 | they had offered a week before.
01:55:13.660 | It was a tough week.
01:55:16.500 | - Do you, does it hurt to think about?
01:55:21.700 | Like an amazing company that didn't,
01:55:24.220 | you know, like iRobot didn't find a way.
01:55:29.820 | - Yeah, it was tough.
01:55:30.980 | I said I was never gonna start another company.
01:55:33.020 | I was pleased that everyone liked what we did so much
01:55:36.340 | that the team was hired by three companies within a week.
01:55:41.340 | Everyone had a job in one of these three companies.
01:55:44.900 | Some stayed in their same desks
01:55:46.700 | because another company came in and rented the space.
01:55:50.820 | So I felt good about people not being out on the street.
01:55:55.820 | - So Baxter's a screen with a face.
01:55:58.340 | That's a revolutionary idea for a robot manipulation,
01:56:04.700 | like for a robotic arm.
01:56:06.240 | How much opposition did you get?
01:56:08.980 | - Well, first, the screen was also used
01:56:10.660 | during codeless programming,
01:56:13.060 | where you taught by demonstration,
01:56:14.580 | it showed you what its understanding of the task was.
01:56:17.780 | So it had two roles.
01:56:19.260 | Some customers hated it.
01:56:23.460 | And so we made it so that when the robot was running,
01:56:26.460 | it could be showing graphs of what was happening
01:56:29.180 | and not show the eyes.
01:56:30.380 | Other people, and some of them surprised me who they were,
01:56:34.740 | were saying, "Well, this one doesn't look as human
01:56:37.300 | "as the old one.
01:56:38.140 | "We liked the human looking."
01:56:40.180 | So there was a mixed bag there.
01:56:42.380 | - But do you think that's, I don't know.
01:56:46.740 | I'm kind of disappointed whenever I talk to roboticists,
01:56:51.740 | like the best robotics people in the world,
01:56:54.080 | they seem to not want to do the eyes type of thing.
01:56:57.540 | They seem to see it as a machine,
01:56:59.800 | as opposed to a machine
01:57:00.720 | that can also have a human connection.
01:57:02.920 | I'm not sure what to do with that.
01:57:04.060 | It seems like a lost opportunity.
01:57:05.580 | I think the trillion dollar company
01:57:08.140 | will have to do the human connection very well,
01:57:10.220 | no matter what it does.
01:57:11.260 | - Yeah, I agree.
01:57:12.280 | - Can I ask you a ridiculous question?
01:57:15.580 | - Sure.
01:57:17.380 | - I give a ridiculous answer.
01:57:18.880 | - Do you think, well, maybe by way of asking the question,
01:57:23.660 | let me first mention that you're kind of critical
01:57:26.660 | of the idea of the Turing test as a test of intelligence.
01:57:29.560 | Let me first ask this question.
01:57:34.340 | Do you think we'll be able to build an AI system
01:57:38.620 | that humans fall in love with
01:57:40.540 | and it falls in love with the human, like romantic love?
01:57:45.380 | - Well, we've had that with humans falling in love with cars
01:57:48.860 | even back in the '50s.
01:57:50.180 | - It's a different love, right?
01:57:51.500 | I think there's a lifelong partnership
01:57:54.100 | where you can communicate and grow like...
01:57:58.260 | - I think we're a long way from that.
01:58:00.620 | I think we're a long, long way.
01:58:02.260 | I think Blade Runner was, you know,
01:58:06.180 | had the time scale totally wrong.
01:58:08.060 | - Yeah, but do you,
01:58:11.820 | so to me, honestly, the most difficult part
01:58:14.340 | is the thing that you said with the Moravec paradox
01:58:16.860 | is to create a human form that interacts
01:58:19.380 | and perceives the world.
01:58:20.700 | But if we just look at a voice, like the movie "Her,"
01:58:24.140 | or just like an Alexa type voice,
01:58:26.940 | I tend to think we're not that far away.
01:58:28.940 | - Well, for some people, maybe not, but I,
01:58:33.980 | you know, I,
01:58:40.260 | you know, as humans, as we think about the future,
01:58:42.460 | we always try to,
01:58:43.820 | and this is the premise of most science fiction movies,
01:58:46.740 | you've got the world just as it is today
01:58:48.300 | and you change one thing, right?
01:58:50.780 | But that's not how,
01:58:51.620 | and it's the same with the self-driving car.
01:58:53.540 | You change one thing.
01:58:54.780 | No, everything changes.
01:58:56.460 | Everything grows together.
01:58:59.100 | So surprisingly, it might be surprising to you,
01:59:01.980 | it might not, I think the best movie about this stuff
01:59:04.660 | was "Bicentennial Man."
01:59:07.380 | And what was happening there?
01:59:09.340 | It was schmaltzy and, you know,
01:59:11.380 | but what was happening there?
01:59:12.860 | As the robot was trying to become more human,
01:59:17.420 | the humans were adopting the technology of the robot
01:59:20.180 | and changing their bodies.
01:59:21.580 | So there was a convergence happening in a sense.
01:59:25.860 | So we will not be the same,
01:59:27.180 | but, you know, we're already talking about
01:59:29.060 | genetically modifying our babies.
01:59:30.980 | You know, there's a,
01:59:32.820 | you know, there's a,
01:59:34.140 | more and more stuff happening around that.
01:59:36.260 | We will want to modify ourselves even more
01:59:39.500 | for all sorts of things.
01:59:41.500 | We put all sorts of technology in our bodies
01:59:47.220 | to improve it, you know.
01:59:48.540 | I've got,
01:59:50.300 | I've got things in my ears so that I can sort of hear you.
01:59:54.220 | - Yeah. (laughs)
01:59:56.540 | - So we're always modifying our bodies.
01:59:58.060 | So, you know, I think it's hard to imagine
02:00:01.020 | exactly what it will be like in the future.
02:00:03.220 | - But on the Turing test side,
02:00:06.340 | do you think, so forget about love for a second,
02:00:09.540 | let's talk about just like the Alexa prize.
02:00:12.860 | Actually, I was invited to be a,
02:00:15.380 | what is the interviewer for the Alexa prize or whatever?
02:00:19.740 | That's in two days.
02:00:21.700 | Their idea is success looks like a person wanting to turn
02:00:27.980 | on the Alexa and talk to an AI system for a prolonged period
02:00:32.980 | of time, like 20 minutes.
02:00:34.260 | How far away are we?
02:00:37.980 | And why is it difficult to build an AI system
02:00:40.900 | with which you'd want to have a beer
02:00:43.220 | and talk for an hour or two hours?
02:00:45.900 | Like not for, to check the weather or to check music,
02:00:49.820 | but just like to talk as friends.
02:00:53.300 | - Yeah, well, you know, we saw,
02:00:54.660 | we saw Weizenbaum back in the sixties
02:00:57.860 | with his program, Eliza,
02:00:59.620 | being shocked at how much people would talk to Eliza.
02:01:03.260 | And I remember, you know, in the seventies typing,
02:01:06.500 | you know, stuff to Eliza to see what it would come back with.
02:01:09.500 | You know, I think right now,
02:01:12.700 | and this is a thing that Amazon's been trying to improve
02:01:17.700 | with Alexa, there is no continuity of topic.
02:01:24.820 | There's not, you can't refer
02:01:26.900 | to what we talked about yesterday.
02:01:28.620 | It's not the same as talking to a person
02:01:31.300 | where there seems to be an ongoing existence, which changes.
02:01:35.460 | - We share moments together
02:01:36.660 | and they last in our memory together.
02:01:38.780 | - Yeah, but there's none of that.
02:01:40.340 | And there's no sort of intention of these systems
02:01:45.340 | that they have any goal in life,
02:01:47.940 | even if it's to be happy, you know,
02:01:50.100 | they don't even have a semblance of that.
02:01:53.380 | Now, I'm not saying this can't be done.
02:01:55.180 | I'm just saying, I think this is why we don't feel
02:01:57.620 | that way about them.
02:01:59.260 | Or that's a sort of a minimal requirement.
02:02:02.980 | If you want the sort of interaction you're talking about,
02:02:06.580 | it's a minimal requirement.
02:02:07.660 | Whether it's going to be sufficient, I don't know.
02:02:10.940 | We haven't seen it yet.
02:02:12.260 | We don't know what it feels like.
02:02:14.860 | - I tend to think it's not as difficult
02:02:19.860 | as solving intelligence, for example.
02:02:22.740 | And I think it's achievable in the near term.
02:02:25.500 | But on the Turing test,
02:02:29.620 | why don't you think the Turing test
02:02:31.500 | is a good test of intelligence?
02:02:33.020 | - Oh, because, you know, again,
02:02:36.500 | the Turing, if you read the paper,
02:02:38.340 | Turing wasn't saying this is a good test.
02:02:40.620 | He was using it as a rhetorical device to argue
02:02:43.900 | that if you can't tell the difference
02:02:46.660 | between a computer and a person,
02:02:49.260 | you must say that the computer's thinking
02:02:52.020 | because you can't tell the difference when it's thinking.
02:02:56.740 | You can't say something different.
02:02:58.500 | What it has become as this sort of weird game
02:03:02.740 | of fooling people.
02:03:04.220 | So back at the AI lab in the late '80s,
02:03:10.460 | we had this thing that still goes on called the AI Olympics.
02:03:15.140 | And one of the events we had one year
02:03:17.540 | was the original imitation game
02:03:21.420 | as Turing talked about,
02:03:22.380 | 'cause he starts by saying,
02:03:24.460 | can you tell whether it's a man or a woman?
02:03:26.660 | So we did that at the lab.
02:03:28.620 | We had, you know, you'd go and type
02:03:29.940 | and the thing would come back
02:03:32.700 | and you had to tell whether it was a man or a woman.
02:03:35.900 | And the, one of the,
02:03:39.900 | one man came up with a question that he could ask,
02:03:49.500 | which was always a dead giveaway
02:03:52.340 | of whether the other person was really a man or a woman.
02:03:55.260 | You know, what he would ask them,
02:03:57.660 | did you have green plastic toy soldiers as a kid?
02:04:01.460 | Yeah, what'd you do with them?
02:04:03.340 | And a woman trying to be a man would say,
02:04:06.500 | oh, I lined them up.
02:04:07.340 | We had wars, we had battles.
02:04:08.900 | And the man just being a man would say,
02:04:10.420 | I stomped on them, I burned them.
02:04:12.100 | (laughing)
02:04:15.060 | So, you know, that's what the Turing test,
02:04:18.500 | the Turing test with computers has become.
02:04:21.620 | What's the trick question?
02:04:23.220 | What's the, that's why I say it's sort of devolved
02:04:26.580 | into this weirdness.
02:04:28.140 | - Nevertheless, conversation not formulated as a test
02:04:32.620 | is a pretty, is a fascinatingly challenging dance.
02:04:36.820 | That's a really hard problem.
02:04:38.380 | To me, conversation when not posed as a test
02:04:41.580 | is a more intuitive illustration
02:04:44.940 | how far away we are from solving intelligence
02:04:47.340 | than like computer vision.
02:04:48.900 | It's hard, computer vision is harder for me to pull apart.
02:04:53.140 | But with language, with conversation, you could see--
02:04:55.580 | - No, 'cause language is so human.
02:04:56.980 | - It's so human.
02:04:57.820 | We can so clearly see it.
02:05:02.820 | Shit, you mentioned something I was gonna go off on.
02:05:07.020 | Okay.
02:05:07.860 | I mean, I have to ask you,
02:05:10.740 | 'cause you were the head of CSAIL, AI Lab for a long time.
02:05:16.900 | You're, I don't know, to me, when I came to MIT,
02:05:20.300 | you're like one of the greats at MIT.
02:05:22.900 | So what was that time like?
02:05:24.460 | And plus you, you're, I don't know, friends with,
02:05:30.740 | but you knew Minsky and all the folks there,
02:05:33.740 | all the legendary AI people of which you're one.
02:05:38.060 | So what was that time like?
02:05:39.460 | What are memories that stand out to you from that time,
02:05:44.860 | from your time at MIT, from the AI Lab,
02:05:47.620 | from the dreams that the AI Lab represented
02:05:51.020 | to the actual like revolutionary work?
02:05:53.660 | - Let me tell you first a disappointment in myself.
02:05:56.660 | You know, as I've been researching this book
02:05:59.060 | and so many of the players were active in the '50s and '60s,
02:06:04.060 | I knew many of them when they were older.
02:06:07.060 | And I didn't ask them all the questions
02:06:08.660 | now I wish I had asked.
02:06:11.500 | I'd sit with them at our Thursday lunches,
02:06:13.680 | which we had at faculty lunch.
02:06:15.460 | And I didn't ask them so many questions
02:06:18.100 | that now I wish I had.
02:06:19.940 | - Can I ask you that question?
02:06:20.940 | 'Cause you wrote that.
02:06:22.540 | You wrote that you were fortunate to know
02:06:24.100 | and rub shoulders with many of the greats,
02:06:26.780 | those who founded AI, robotics and computer science
02:06:29.780 | and the World Wide Web.
02:06:31.540 | And you wrote that your big regret nowadays
02:06:33.740 | is that often I have questions for those who have passed on.
02:06:36.860 | - Yeah.
02:06:37.700 | - And I didn't think to ask them any of these questions.
02:06:40.620 | - Right.
02:06:41.580 | - Even as I saw them and said hello to them
02:06:43.940 | on a daily basis.
02:06:45.020 | So maybe also another question I wanna ask,
02:06:49.140 | if you could talk to them today,
02:06:51.180 | what question would you ask?
02:06:52.980 | What questions would you ask?
02:06:54.300 | - Oh, well, Licklider.
02:06:55.660 | I would ask him, you know, he had the vision
02:06:58.580 | for humans and computers working together.
02:07:02.860 | And he really founded that at DARPA.
02:07:05.100 | And he gave the money to MIT,
02:07:08.420 | which started Project MAC in 1963.
02:07:11.940 | And I would have talked to him about
02:07:14.740 | what the successes were, what the failures were,
02:07:17.220 | what he saw as progress, et cetera.
02:07:20.000 | I would have asked him more questions about that.
02:07:24.140 | 'Cause now I could use it in my book.
02:07:26.740 | (laughing)
02:07:27.700 | But I think it's lost, it's lost forever.
02:07:29.500 | A lot of the motivations are lost.
02:07:36.300 | I should have asked Marvin why he and Seymour Papert
02:07:40.700 | came down so hard on neural networks in 1969
02:07:44.940 | in their book "Perceptrons."
02:07:46.260 | Because Marvin's PhD thesis was on neural networks.
02:07:50.220 | - How do you make sense of that?
02:07:51.500 | - That book destroyed the field.
02:07:53.580 | - He probably, do you think he knew
02:07:56.100 | the effect that book would have?
02:07:57.700 | - All the theorems are negative theorems.
02:08:02.780 | - Yeah.
02:08:05.940 | - So yeah.
02:08:07.300 | - That's the way of life.
02:08:11.240 | But still, it's kind of tragic that he was both
02:08:14.740 | the proponent and the destroyer of neural networks.
02:08:17.840 | Is there other memories stand out
02:08:21.460 | from the robotics and the AI work at MIT?
02:08:25.560 | - Well, yeah, but you gotta be more specific.
02:08:31.460 | - Well, I mean, it's such a magical place.
02:08:33.260 | I mean, to me, it's a little bit also heartbreaking
02:08:35.860 | that with Google and Facebook, like DeepMind and so on,
02:08:41.860 | so much of the talent doesn't stay necessarily
02:08:46.740 | for prolonged periods of time in these universities.
02:08:50.540 | - Oh yeah, I mean, some of the companies
02:08:52.940 | are more guilty than others of paying fabulous salaries
02:08:57.780 | to some of the highest producers.
02:09:00.180 | And then just, you never hear from them again.
02:09:03.020 | They're not allowed to give public talks.
02:09:04.660 | They're sort of locked away.
02:09:06.700 | And it's sort of like collecting Hollywood stars
02:09:11.420 | or something, and they're not allowed
02:09:13.220 | to make movies anymore.
02:09:14.060 | I own them.
02:09:15.420 | - Yeah, that's tragic, 'cause I mean,
02:09:18.020 | there's an openness to the university setting
02:09:20.620 | where you do research, both in the space of ideas
02:09:23.560 | and the space, like publication, all those kinds of things.
02:09:25.660 | - Yeah, and there's the publication and all that,
02:09:29.060 | and often, although these places say they publish,
02:09:33.100 | there's pressure.
02:09:34.020 | But I think, for instance, net-net,
02:09:41.260 | I think Google buying those eight or nine robotics companies
02:09:46.140 | was bad for the field, because it locked those people away.
02:09:49.440 | They didn't have to make the company succeed anymore,
02:09:53.100 | locked them away for years, and then sort of all
02:09:58.500 | frittered away.
02:09:59.460 | - Do you have hope for MIT?
02:10:04.420 | For MIT?
02:10:07.140 | - Yeah, why shouldn't I?
02:10:08.660 | - Well, I could be harsh and say that,
02:10:11.160 | I'm not sure I would say MIT is leading the world in AI,
02:10:16.820 | or even Stanford or Berkeley.
02:10:19.580 | I would say DeepMind, Google AI, Facebook AI.
02:10:26.780 | I would take a slightly different approach,
02:10:28.620 | or a different answer.
02:10:30.580 | I'll come back to Facebook in a minute.
02:10:33.020 | But I think those other places are following a dream
02:10:38.020 | of one of the founders,
02:10:41.780 | and I'm not sure that it's well-founded, the dream,
02:10:47.460 | and I'm not sure that it's going to have the impact
02:10:50.380 | that he believes it is.
02:10:55.440 | - You're talking about Facebook and Google and so on.
02:10:57.320 | - I'm talking about Google.
02:10:58.360 | - Google.
02:10:59.200 | But the thing is, those research labs aren't,
02:11:01.840 | there's the big dream, and I'm usually a fan of,
02:11:06.280 | no matter what the dream is, a big dream is a unifier,
02:11:09.240 | because what happens is you have a lot of bright minds
02:11:12.680 | working together on a dream.
02:11:15.880 | What results is a lot of adjacent ideas.
02:11:18.920 | I mean, this is how so much progress is made.
02:11:20.840 | - Yeah, so I'm not saying they're actually leading.
02:11:23.400 | I'm not saying that the universities are leading,
02:11:26.440 | but I don't think those companies are leading in general,
02:11:29.120 | because they're,
02:11:29.960 | we saw this incredible spike in attendees at NeurIPS.
02:11:36.360 | And as I said in my January 1st review this year for 2020,
02:11:42.040 | 2020 will not be remembered as a watershed year
02:11:47.200 | for machine learning or AI.
02:11:50.320 | There was nothing surprising that happened anyway,
02:11:53.400 | unlike when deep learning hit ImageNet.
02:11:58.120 | That was a shake-up.
02:12:00.960 | And there's a lot more people writing papers,
02:12:04.920 | but the papers are fundamentally boring and uninteresting.
02:12:09.480 | - Incremental work.
02:12:10.880 | - Yeah.
02:12:11.720 | - Is there particular memories you have with Minsky
02:12:17.200 | or somebody else at MIT that stand out?
02:12:19.720 | - Funny stories.
02:12:21.320 | I mean, unfortunately, he's another one that's passed away.
02:12:24.320 | You've known some of the biggest minds in AI.
02:12:29.480 | - Yeah, and they did amazing things,
02:12:32.160 | and sometimes they were grumpy.
02:12:34.600 | - Well, he was interesting, 'cause he was very grumpy,
02:12:39.560 | but that was,
02:12:41.240 | I remember him saying in an interview
02:12:44.480 | that the key to success or to keep being productive
02:12:48.480 | is to hate everything you've ever done in the past.
02:12:51.520 | - Maybe that explains the Perceptron book.
02:12:54.000 | There it was.
02:12:55.760 | He told you exactly.
02:12:57.160 | - But he, meaning like, just like,
02:13:01.400 | I mean, maybe that's the way
02:13:02.440 | to not treat yourself too seriously,
02:13:04.640 | just always be moving forward.
02:13:08.080 | That was his idea.
02:13:08.960 | I mean, that crankiness, I mean, there's a...
02:13:11.760 | - Yeah, so let me tell you what really,
02:13:17.880 | you know, the joy memories are about
02:13:23.080 | having access to technology before anyone else has seen it.
02:13:26.880 | So, you know, I got to Stanford in 1977,
02:13:31.280 | and we had, you know,
02:13:34.080 | we had terminals that could show live video on them,
02:13:37.240 | digital sound system.
02:13:40.800 | We had a Xerox graphics printer.
02:13:45.160 | We could print, it wasn't, you know,
02:13:49.080 | it wasn't like a typewriter ball hitting characters.
02:13:52.120 | It could print arbitrary things,
02:13:53.840 | only in, you know, one bit, you know, black or white,
02:13:56.800 | but you could, arbitrary pictures.
02:13:58.560 | This was science fiction sort of stuff.
02:14:00.600 | At MIT, the Lisp machines, which, you know,
02:14:06.920 | they were the first personal computers,
02:14:09.520 | and, you know, they cost $100,000 each,
02:14:12.240 | and I could, you know, I got there early enough in the day,
02:14:14.760 | I got one for the day.
02:14:16.040 | Couldn't stand up, had to keep working.
02:14:18.680 | (both laughing)
02:14:21.960 | - So having that like direct glimpse into the future.
02:14:25.480 | - Yeah, and, you know, I've had email every day since 1977,
02:14:29.960 | and, you know, the host field was only eight bits,
02:14:34.960 | you know, there weren't that many places,
02:14:36.600 | but I could send email to other people at a few places.
02:14:40.080 | So that was pretty exciting,
02:14:42.920 | to be in that world so different
02:14:45.160 | from what the rest of the world knew.
02:14:47.000 | - Let me ask you, I'll probably edit this out,
02:14:52.560 | but just in case you have a story.
02:14:54.280 | I'm hanging out with Don Knuth for a while tomorrow.
02:15:00.240 | Did you ever get a chance,
02:15:01.360 | it's such a different world than yours.
02:15:03.400 | He's a very kind of theoretical computer science,
02:15:06.200 | the puzzle of computer science and mathematics,
02:15:09.640 | and you're so much about the magic of robotics,
02:15:12.520 | like the practice of it.
02:15:13.920 | You mentioned him earlier for like,
02:15:15.800 | not, you know, about computation.
02:15:18.040 | Did your worlds cross?
02:15:19.720 | - They did in a, you know, I know him now, we talk,
02:15:23.320 | but let me tell you my Donald Knuth story.
02:15:26.680 | So, you know, besides, you know, analysis of algorithms,
02:15:30.720 | he's well known for writing TeX, which underlies LaTeX,
02:15:34.480 | which is the academic publishing system.
02:15:37.800 | So he did that at the AI lab,
02:15:40.960 | and he would do it, he would work overnight at the AI lab.
02:15:45.120 | And one day, one night,
02:15:49.640 | the mainframe computer went down,
02:15:53.280 | and a guy named Robert Paul was there.
02:15:57.560 | He later did his PhD at the Media Lab at MIT,
02:16:00.640 | and he was an engineer.
02:16:03.520 | And so he and I, you know,
02:16:07.400 | tracked down what the problem was.
02:16:08.920 | It was one of these big refrigerator-size
02:16:11.000 | or washing machine size disk drives had failed,
02:16:13.640 | and that's what brought the whole system down.
02:16:15.680 | So we got panels pulled off,
02:16:17.600 | and we're pulling, you know, circuit cards out,
02:16:20.400 | and Donald Knuth, who's a really tall guy,
02:16:23.400 | walks in and he's looking down and says,
02:16:25.280 | "When will it be fixed?"
02:16:26.680 | You know, 'cause he wanted to get back
02:16:27.840 | to writing his TeX system.
02:16:29.280 | We're like, "It's Donald Knuth."
02:16:31.480 | And so we figured out, you know,
02:16:34.360 | it was a particular chip, 7400 series chip,
02:16:37.560 | which was socketed.
02:16:38.800 | We popped it out.
02:16:40.880 | We put a replacement in, put it back in.
02:16:43.320 | Smoke comes out, 'cause we put it in backwards,
02:16:45.880 | 'cause we were so nervous
02:16:46.960 | that Donald Knuth was standing over us.
02:16:49.600 | Anyway, we eventually got it fixed
02:16:51.320 | and got the mainframe running again.
02:16:53.760 | - So that was your little, when was that again?
02:16:56.360 | - Well, that must have been before October '79,
02:16:58.960 | 'cause we moved out of that building then.
02:17:00.400 | So sometime, probably '78, sometime early '79.
02:17:03.960 | - Yeah, all those figures are just fascinating.
02:17:07.760 | All the people who have passed through MIT
02:17:10.200 | is really fascinating.
02:17:11.840 | Is there, let me ask you to put on your big, wise man hat.
02:17:16.840 | Is there advice that you can give to young people today,
02:17:22.520 | whether in high school or college,
02:17:24.120 | who are thinking about their career,
02:17:26.320 | who are thinking about life,
02:17:28.120 | how to live a life they're proud of, a successful life?
02:17:36.680 | - Yeah, so many people ask me for advice and have asked,
02:17:40.160 | and I talk to a lot of people all the time.
02:17:42.680 | And there is no one way.
02:17:45.400 | There's a lot of pressure to produce papers
02:17:53.040 | that will be acceptable and be published.
02:17:58.320 | Maybe I come from an age where I could be a rebel
02:18:04.640 | against that and still succeed.
02:18:07.200 | Maybe it's harder today.
02:18:08.560 | But I think it's important not to get too caught up
02:18:13.480 | with what everyone else is doing.
02:18:18.400 | And well, it depends on what you want in life.
02:18:23.040 | If you wanna have real impact,
02:18:27.160 | you have to be ready to fail a lot of times.
02:18:31.200 | So you have to make a lot of unsafe decisions.
02:18:34.280 | And the only way to make that work
02:18:36.400 | is to keep doing it for a long time.
02:18:38.800 | And then one of them will work out.
02:18:40.280 | And so that will make something successful.
02:18:43.800 | - Or not.
02:18:44.640 | That's the whole point.
02:18:46.560 | - Or you just may end up having a lousy career.
02:18:50.840 | I mean, it's certainly possible.
02:18:52.240 | - Taking the risk is the thing.
02:18:53.480 | - Yeah.
02:18:54.320 | But there's no way to make all safe decisions
02:19:01.240 | and actually really contribute.
02:19:04.840 | - Do you think about your death, about your mortality?
02:19:11.240 | - I gotta say when COVID hit, I did.
02:19:15.840 | 'Cause in the early days,
02:19:17.240 | we didn't know how bad it was gonna be.
02:19:19.240 | That made me work on my book harder for a while.
02:19:22.920 | But then I'd started this company
02:19:24.400 | and now I'm doing more than full-time at the company
02:19:27.200 | so the book's on hold.
02:19:28.560 | But I do wanna finish this book.
02:19:30.440 | - When you think about it, are you afraid of it?
02:19:32.800 | - I'm afraid of dribbling.
02:19:37.440 | Of losing it.
02:19:42.320 | - The details of, okay.
02:19:44.000 | - Yeah, yeah.
02:19:45.680 | - But the fact that the ride ends?
02:19:47.480 | - I've known that for a long time.
02:19:51.360 | So it's...
02:19:53.360 | - Yeah, but there's knowing and knowing.
02:19:55.480 | It's such a...
02:19:57.480 | - Yeah, and it- - It really sucks.
02:19:59.320 | - It feels a lot closer.
02:20:00.640 | So in my blog with my predictions,
02:20:05.200 | my sort of pushback against that was I said,
02:20:09.040 | I'm gonna review these every year for 32 years.
02:20:12.040 | And that puts me into my mid-90s.
02:20:14.960 | So it was my-
02:20:15.800 | - That puts a whole, every time you write the blog posts,
02:20:19.000 | you're getting closer and closer to your own prediction.
02:20:21.680 | - That's true. - Of your death.
02:20:23.800 | - Yeah, yeah.
02:20:25.120 | - What do you hope your legacy is?
02:20:28.320 | You're one of the greatest roboticists,
02:20:30.080 | AI researchers of all time.
02:20:32.280 | - What I hope is that I actually finish writing this book
02:20:38.400 | and that there's one person who reads it
02:20:44.640 | and sees something about changing the way they're thinking.
02:20:50.280 | And that leads to the next big thing.
02:20:56.280 | And then they'll be on a podcast 100 years from now
02:20:59.280 | saying I once read that book.
02:21:00.920 | And that changed everything.
02:21:04.280 | - What do you think is the meaning of life?
02:21:08.520 | This whole thing, the existence,
02:21:10.880 | all the hurried things we do on this planet?
02:21:14.360 | What do you think is the meaning of it all?
02:21:16.000 | - Well, I think we're all really bad at it.
02:21:18.280 | - Life or finding meaning or both?
02:21:21.520 | - Yeah, we get caught up in the,
02:21:23.960 | it's easier to do the stuff that's immediate
02:21:27.360 | and not do the stuff that's not immediate.
02:21:29.520 | - The big picture, we're bad at.
02:21:32.880 | - Yeah, yeah.
02:21:33.720 | - Do you have a sense of what that big picture is?
02:21:36.240 | Like why?
02:21:37.080 | You ever look up to the stars and ask,
02:21:39.120 | why the hell are we here?
02:21:40.360 | - You know, my atheism tells me it's just random,
02:21:50.080 | but I wanna understand the way random,
02:21:54.120 | and that's what I talk about in this book,
02:21:56.120 | how order comes from disorder.
02:21:57.880 | - But it kind of sprung up,
02:22:01.800 | like most of the whole thing is random,
02:22:03.480 | but this little pocket of complexity that we call earth,
02:22:07.160 | why the hell does that happen?
02:22:10.640 | - And what we don't know is how common
02:22:13.160 | those pockets of complexity are or how often,
02:22:18.760 | 'cause they may not last forever.
02:22:21.480 | - Which is more exciting/sad to you,
02:22:25.920 | if we're alone or if there's infinite number of--
02:22:30.560 | - Oh, I think it's impossible for me to believe
02:22:34.480 | that we're alone.
02:22:35.760 | That would just be too horrible, too cruel.
02:22:40.360 | - Could be like the sad thing,
02:22:43.360 | it could be like a graveyard of intelligent civilizations.
02:22:46.440 | - Oh, everywhere, yeah.
02:22:48.160 | That may be the most likely outcome.
02:22:50.720 | - And for us, too.
02:22:51.800 | - Yeah, exactly, yeah.
02:22:53.000 | - And all of this will be forgotten,
02:22:54.800 | including all the robots you build, everything forgotten.
02:23:00.200 | - Well, on average, everyone has been forgotten in history.
02:23:05.880 | - Yeah. - Right?
02:23:07.040 | - Yeah.
02:23:07.880 | - Most people are not remembered,
02:23:09.040 | beyond a generation or two.
02:23:10.760 | - I mean, yeah, well, not just on average, basically.
02:23:15.400 | Very close to 100% of people who have ever lived
02:23:18.040 | are forgotten.
02:23:18.880 | - Yeah, I mean--
02:23:19.920 | - In a long arc of time.
02:23:20.880 | - I don't know anyone alive who remembers
02:23:22.920 | my great-grandparents, 'cause we didn't meet them.
02:23:25.480 | - Still, this life is pretty fun, somehow.
02:23:32.520 | - Yeah.
02:23:33.360 | (laughing)
02:23:34.200 | - Even the immense absurdity
02:23:36.520 | and at times meaninglessness of it all, it's pretty fun.
02:23:40.240 | And for me, one of the most fun things is robots,
02:23:43.840 | and I've looked up to your work,
02:23:45.280 | I've looked up to you for a long time.
02:23:46.800 | - Oh, that's right, Rod.
02:23:47.720 | Rod, it's an honor that you would spend
02:23:51.880 | your valuable time with me today,
02:23:53.240 | talking, it was an amazing conversation.
02:23:54.920 | Thank you so much for being here.
02:23:56.160 | - Well, thanks for talking with me.
02:23:58.000 | I've enjoyed it.
02:23:58.840 | - Thanks for listening to this conversation
02:24:01.720 | with Rodney Brooks.
02:24:02.920 | To support this podcast,
02:24:04.240 | please check out our sponsors in the description.
02:24:07.000 | And now, let me leave you with the three laws of robotics
02:24:10.400 | from Isaac Asimov.
02:24:12.720 | One, a robot may not injure a human being
02:24:15.840 | or through inaction, allow a human being to come to harm.
02:24:20.200 | Two, a robot must obey the orders given to it
02:24:23.520 | by human beings, except when such orders
02:24:25.680 | would conflict with the first law.
02:24:27.560 | And three, a robot must protect its own existence
02:24:32.560 | as long as such protection does not conflict
02:24:35.480 | with the first or the second laws.
02:24:38.840 | Thank you for listening.
02:24:39.960 | I hope to see you next time.
02:24:41.840 | (upbeat music)
02:24:44.440 | (upbeat music)