
Neil Gershenfeld: Self-Replicating Robots and the Future of Fabrication | Lex Fridman Podcast #380


Chapters

0:00 Introduction
1:29 What Turing got wrong
6:53 MIT Center for Bits and Atoms
20:00 Digital logic
26:36 Self-assembling robots
37:04 Digital fabrication
47:59 Self-reproducing machine
55:45 Trash and fabrication
1:00:41 Lab-made bioweapons
1:04:56 Genome
1:16:48 Quantum computing
1:21:19 Microfluidic bubble computation
1:26:41 Maxwell's demon
1:35:27 Consciousness
1:42:27 Cellular automata
1:46:59 Universe is a computer
1:51:45 Advice for young people
2:01:02 Meaning of life

Whisper Transcript

00:00:00.000 | The ribosome, which I mentioned a little while back,
00:00:02.400 | can make an elephant one molecule at a time.
00:00:05.360 | Ribosomes are slow, they run at about one molecule a second,
00:00:09.260 | but ribosomes make ribosomes,
00:00:11.480 | so you have trillions of them and that makes an elephant.
00:00:14.560 | In the same way, these little assembly robots I'm describing
00:00:17.240 | can make giant structures, at heart,
00:00:20.240 | because the robot can make the robot.
00:00:23.200 | So more recently, two of my students, Amira and Miana,
00:00:26.760 | had a Nature Communications paper showing
00:00:29.200 | how this robot can be made out of the parts it's making,
00:00:33.720 | so the robots can make the robots,
00:00:35.500 | so you build up the capacity of robotic assembly.
00:00:38.140 | The following is a conversation with Neil Gershenfeld,
00:00:43.480 | the director of MIT's Center for Bits and Atoms,
00:00:46.680 | an amazing laboratory that is breaking down boundaries
00:00:49.860 | between the digital and physical worlds,
00:00:52.520 | fabricating objects and machines at all scales of reality,
00:00:56.840 | including robots and automata
00:00:59.820 | that can build copies of themselves
00:01:02.160 | and self-assemble into complex structures.
00:01:06.520 | His work inspires millions across the world
00:01:09.320 | as part of the maker movement to build cool stuff,
00:01:13.160 | to create the very act that makes life
00:01:16.300 | so beautiful and fun.
00:01:19.860 | This is the Lex Fridman Podcast.
00:01:21.800 | To support it, please check out our sponsors
00:01:23.760 | in the description.
00:01:24.920 | And now, dear friends, here's Neil Gershenfeld.
00:01:28.480 | You have spent your life working at the boundary
00:01:31.840 | between bits and atoms, so the digital and the physical.
00:01:36.160 | What have you learned about engineering
00:01:38.920 | and about nature and reality from working at this divide,
00:01:43.200 | trying to bridge this divide?
00:01:45.000 | - I learned why von Neumann and Turing
00:01:48.720 | made fundamental mistakes.
00:01:52.720 | I learned the secret of life.
00:01:55.800 | - Yeah.
00:01:57.200 | - I learned how to solve many of the world's
00:02:00.120 | most important problems, which all sound presumptuous,
00:02:03.320 | but all of those are things I learned at that boundary.
00:02:05.480 | - Okay, so Turing and von Neumann, let's start there.
00:02:07.720 | Some of the most impactful, important humans
00:02:09.840 | who have ever lived in computing, why were they wrong?
00:02:12.640 | - So I worked with Andy Gleason,
00:02:16.240 | who was Turing's counterpart.
00:02:17.760 | So just for background, if anybody doesn't know,
00:02:20.160 | Turing is credited with the modern architecture
00:02:22.080 | of computing, among many other things.
00:02:25.840 | Andy Gleason was his US counterpart.
00:02:28.320 | And you might not have heard of Andy Gleason,
00:02:31.600 | but you might have heard of the Hilbert problems.
00:02:33.880 | And Andy Gleason solved the fifth one.
00:02:36.360 | So he was a really notable mathematician.
00:02:39.440 | During the war, he was Turing's counterpart.
00:02:41.300 | Then von Neumann is credited
00:02:42.920 | with the modern architecture of computing.
00:02:45.280 | And one of his students was Marvin Minsky.
00:02:47.720 | So I could ask Marvin what Johnny was thinking,
00:02:50.120 | and I could ask Andy what Alan was thinking.
00:02:52.880 | And what came out from that,
00:02:55.240 | what I came to appreciate as background,
00:02:58.020 | I never understood the difference
00:02:59.480 | between computer science and physical science.
00:03:01.880 | But Turing's machine, that's the foundation
00:03:05.880 | of modern computing, has a simple physics mistake,
00:03:10.120 | which is the head is distinct from the tape.
00:03:13.000 | So in the Turing machine, there's a head
00:03:14.840 | that programmatically moves and reads and writes a tape.
00:03:17.480 | The head is distinct from the tape,
00:03:19.120 | which means persistence of information
00:03:22.020 | is separate from interaction with information.
00:03:25.060 | Then von Neumann wrote deeply and beautifully
00:03:30.120 | about many things, but not computing.
00:03:32.680 | He wrote a horrible memo called
00:03:35.120 | the First Draft of a Report on the EDVAC,
00:03:38.600 | which is how you program a very early computer.
00:03:41.920 | In it, he essentially roughly took Turing's architecture
00:03:46.000 | and built it into a machine.
00:03:48.640 | So the legacy of that is the computer
00:03:52.680 | somebody's using to watch this
00:03:54.280 | is spending much of its effort moving information
00:03:57.300 | from storage transistors to processing transistors,
00:04:01.320 | even though they have the same computational complexity.
00:04:04.000 | So in computer science, when you learn about computing,
00:04:08.400 | there's a ridiculous taxonomy
00:04:10.860 | of about 100 different models of computation,
00:04:15.080 | but they're all fictions.
00:04:16.820 | In physics, a patch of space occupies space.
00:04:20.320 | It stores state, it takes time to transit,
00:04:23.360 | and you can interact.
00:04:24.760 | That is the only model of computation that's physical.
00:04:28.120 | Everything else is a fiction.
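The head/tape split being criticized here is easy to see in code. A minimal sketch (the example machine and its names are illustrative, not anything from the conversation):

```python
# Turing's model, as described above: the head (state + rules) is one
# object, the tape (storage) another, so persistence of information is
# kept separate from interaction with information.

def run_turing_machine(rules, tape, state="start", max_steps=1000):
    """rules maps (state, symbol) -> (new_state, written_symbol, move)."""
    cells = dict(enumerate(tape))  # storage lives here...
    head = 0                       # ...while interaction lives in the head
    while state != "halt" and max_steps > 0:
        symbol = cells.get(head, "B")  # "B" is the blank symbol
        state, cells[head], move = rules[(state, symbol)]
        head += move
        max_steps -= 1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "B")

# Example machine: flip every bit of the input, then halt on a blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "B"): ("halt",  "B",  0),
}
print(run_turing_machine(flip, "1011"))  # prints 0100
```

The physical alternative described here, a patch of space that both stores state and interacts with its neighbors, has no such split; cellular automata, which come up later in the conversation, are the classic example.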
00:04:30.240 | So I really came to appreciate that a few years back
00:04:34.780 | when I did a keynote for the annual meeting
00:04:36.400 | of the supercomputer industry,
00:04:38.440 | and then went into the halls and spent time
00:04:40.460 | with the supercomputer builders,
00:04:42.280 | and came to appreciate,
00:04:45.540 | oh, see, if you're familiar with the movie "Metropolis,"
00:04:50.080 | people would frolic upstairs in the gardens,
00:04:52.120 | and down in the basement, people would move levers.
00:04:54.520 | And that's how computing exists today,
00:04:58.080 | that we pretend software is not physical.
00:05:00.680 | It's separate from hardware.
00:05:02.600 | And the whole canon of computer science
00:05:05.000 | is based on this fiction
00:05:06.660 | that bits aren't constrained by atoms,
00:05:09.560 | but all sorts of scaling issues in computing
00:05:12.640 | come from that boundary,
00:05:13.680 | but all sorts of opportunities come from that boundary.
00:05:16.720 | And so you can trace it all the way back
00:05:18.720 | to Turing's machine making this mistake
00:05:22.020 | between the head and the tape.
00:05:23.520 | Von Neumann never called it the von Neumann architecture.
00:05:28.120 | He wrote about it in this dreadful memo,
00:05:30.120 | and then he wrote beautifully
00:05:31.040 | about other things we'll talk about.
00:05:32.880 | Now, to end a long answer,
00:05:34.980 | Turing and von Neumann both knew this.
00:05:38.080 | So all of the canon of computer science credits them
00:05:41.840 | for what was never meant to be a computer architecture.
00:05:45.240 | Both Turing and von Neumann ended their life
00:05:48.240 | studying exactly how software becomes hardware.
00:05:52.240 | So von Neumann studied self-reproducing automata,
00:05:55.240 | how a machine communicates its own construction.
00:05:58.640 | Turing studied morphogenesis, how genes give rise to form.
00:06:02.280 | They ended their life
00:06:03.400 | studying the embodiment of computation,
00:06:06.060 | something that's been forgotten by the canon of computing,
00:06:09.080 | but developed sort of off to the sides
00:06:10.840 | by a really interesting lineage.
00:06:12.880 | - So there's no distinction between the head and the tape,
00:06:16.200 | between the computer and the computation.
00:06:17.880 | It is all computation.
00:06:19.480 | - Right, so I never understood the difference
00:06:22.800 | between computer science and physical science,
00:06:25.480 | and working at that boundary helped lead to things
00:06:29.020 | like my lab was part of doing,
00:06:31.040 | with a number of interesting collaborators,
00:06:32.720 | the first faster than classical quantum computations.
00:06:36.600 | We were part of a collaboration
00:06:38.360 | creating the minimal synthetic organism
00:06:40.840 | where you design life in a computer.
00:06:42.680 | Those both involve domains
00:06:45.440 | where you just can't separate hardware from software.
00:06:48.320 | The embodiment of computation is embodied
00:06:51.120 | in these really profound ways.
00:06:52.820 | - So the first quantum computations,
00:06:56.800 | synthetic life, so in the space of biology,
00:07:00.480 | the space of physics at the lowest level,
00:07:02.640 | and the space of biology at the lowest level.
00:07:05.680 | So let's talk about CBA, Center of Bits and Atoms.
00:07:09.200 | What's the origin story of this MIT,
00:07:11.320 | legendary MIT center that you're a part of creating?
00:07:16.280 | - In high school, I really wanted to go to vocational school
00:07:19.760 | where you learn to weld and fix cars and build houses,
00:07:24.080 | and I was told, "No, you're smart.
00:07:25.440 | "You have to sit in a room."
00:07:27.160 | And nobody could explain to me
00:07:28.360 | why I couldn't go to vocational school.
00:07:32.560 | I then worked at Bell Labs,
00:07:35.120 | this wonderful place before deregulation, legendary place,
00:07:39.440 | and I would get union grievances
00:07:41.800 | because I would go into the workshop
00:07:43.920 | and try to make something, and they would say,
00:07:45.280 | "No, you're smart.
00:07:46.100 | "You have to tell somebody what to do."
00:07:48.640 | And it wasn't until MIT, and I'll explain how CBA started,
00:07:52.160 | where I could create CBA, that I came to understand
00:07:55.540 | this is a mistake that dates back to the Renaissance.
00:07:58.080 | So in the Renaissance, the liberal arts emerged,
00:08:01.900 | and liberal doesn't mean politically liberal.
00:08:04.420 | This was the path to liberation, birth of humanism.
00:08:07.600 | And so the liberal arts were the trivium, quadrivium,
00:08:10.400 | roughly language, natural science,
00:08:12.920 | and at that moment, what emerged was this dreadful concept
00:08:17.920 | of the illiberal arts.
00:08:20.600 | So anything that wasn't the liberal arts
00:08:22.580 | was for commercial gain and was just making stuff
00:08:25.540 | and wasn't valid for serious study.
00:08:28.560 | And so that's why we're left with learning to weld
00:08:31.760 | wasn't a subject for serious study.
00:08:34.080 | But the means of expression have changed
00:08:37.400 | since the Renaissance.
00:08:38.320 | So micro-machining or embedded coding
00:08:41.880 | is every bit as expressive as painting a painting
00:08:44.400 | or writing a sonnet.
00:08:45.400 | So never understanding this difference
00:08:49.400 | between computer science and physical science,
00:08:51.980 | the path that led me to create CBA with colleagues was,
00:08:59.420 | I was what's called the junior fellow at Harvard.
00:09:01.860 | I was visiting MIT through Marvin
00:09:04.180 | because I was interested in the physics
00:09:05.660 | of musical instruments.
00:09:06.940 | This will be another slight digression.
00:09:10.860 | In Cornell, I would study physics
00:09:13.780 | and then I would cross the street
00:09:15.600 | and go to the music department where I played the bassoon
00:09:17.500 | and I would trim reeds and play the reeds
00:09:19.420 | and they'd be beautiful, but then they'd get soggy.
00:09:21.820 | And then I discovered in the basement
00:09:23.300 | of the music department at Cornell was David Borden,
00:09:25.900 | who you might not have heard of,
00:09:28.220 | but is legendary in electronic music
00:09:30.100 | 'cause he was really the first electronic musician.
00:09:32.300 | So Bob Moog, who invented Moog synthesizers
00:09:36.100 | was a physics student at Cornell,
00:09:37.580 | like me crossing the street.
00:09:39.380 | And eventually he was kicked out
00:09:41.100 | and invented electronic music.
00:09:42.820 | David Borden was the first musician
00:09:45.460 | who created electronic music.
00:09:46.980 | So he's legendary for people like Phil Glass
00:09:48.940 | and Steve Reich.
00:09:50.380 | And so that got me thinking about,
00:09:52.940 | I would behave as a scientist in the music department,
00:09:55.620 | but not in the physics department.
00:09:59.220 | Got me thinking about what's the computational capacity
00:10:01.700 | of a musical instrument.
00:10:02.960 | And through Marvin, he introduced me to Tod Machover
00:10:07.140 | at the Media Lab, who was just about to start a project
00:10:10.140 | with Yo-Yo Ma that led to a collaboration
00:10:14.020 | to instrument a cello, to extract Yo-Yo's data
00:10:17.300 | and bring it out into computational environments.
00:10:20.300 | - What is the computational capacity of a musical instrument
00:10:23.380 | as we continue on this tangent
00:10:24.700 | and when we shall return to CBA?
00:10:26.140 | - Yeah, so one part of that is to understand the computing.
00:10:31.140 | And if you look at like the finest timescale
00:10:36.100 | and length scale you need to model the physics,
00:10:38.540 | it's not heroic.
00:10:40.020 | A good GPU can do teraflops today.
00:10:42.900 | That used to be a national class supercomputer,
00:10:45.060 | now it's just a GPU.
00:10:46.300 | And that's about, if you take the timescales
00:10:48.980 | and length scales relevant for the physics,
00:10:50.700 | that's about the scale of the physics computing.
00:10:53.380 | For Yo-Yo, what was really driving it
00:10:55.780 | was he's completely unsentimental about the Strad.
00:10:59.460 | It's not that it makes some magical wiggles
00:11:02.180 | in the sound wave, it's performance as a controller,
00:11:05.940 | how he can manipulate it as an interface device.
00:11:09.860 | - Interface between what and what exactly?
00:11:11.500 | - Him and sound.
00:11:12.820 | - Okay, him and sound.
00:11:13.860 | - So what it led to was, I had started by thinking
00:11:16.740 | about ops per second, but Yo-Yo's question
00:11:19.100 | was really resolution and bandwidth.
00:11:23.140 | It's how fast can you measure what he does
00:11:27.220 | and the bandwidth and the resolution
00:11:32.220 | of detecting his controls and then mapping them into sounds.
00:11:37.900 | And what we found, what he found was
00:11:41.740 | if you instrument everything he does
00:11:44.100 | and connect it to almost anything,
00:11:46.940 | it sounds like Yo-Yo, that the magic is in the control,
00:11:51.220 | not in ineffable details in how the wood wiggles.
00:11:55.740 | And so with Yo-Yo and Tod, that led to a piece
00:11:58.460 | and towards the end I asked Yo-Yo what it would take
00:12:01.500 | for him to get rid of his Strad and use our stuff.
00:12:03.820 | And his answer was just logistics.
00:12:05.460 | It was at that time, our stuff was like a rack
00:12:08.340 | of electronics and lots of cables
00:12:09.980 | and some grad students to make it work.
00:12:12.600 | Once the technology becomes as invisible as the Strad,
00:12:16.100 | then sure, absolutely, he would take it.
00:12:19.180 | And by the way, as a footnote on the footnote,
00:12:22.060 | an accident in the sensing of Yo-Yo's cello
00:12:25.140 | led to a hundred million dollar a year auto safety business
00:12:27.820 | to control airbags in cars.
00:12:29.860 | - How did that work?
00:12:31.280 | - I had to instrument the bow without interfering with it.
00:12:34.020 | So I set up local electromagnetic fields
00:12:38.060 | where I would detect how those fields interact
00:12:42.780 | with the bow he's playing.
00:12:45.380 | But we had a problem that his hand,
00:12:47.540 | whenever his hand got near the sensing fields,
00:12:49.620 | I would start sensing his hand
00:12:51.020 | rather than the materials on the bow.
00:12:53.560 | And I didn't quite understand what was going on
00:12:57.460 | with that interference.
00:13:00.140 | So my very first grad student ever, Josh Smith,
00:13:04.900 | did a thesis on tomography with electric fields,
00:13:07.380 | how to see in 3D with electric fields.
00:13:10.360 | Then through Tod and, at that point,
00:13:13.320 | a research scientist in my lab, Joe Paradiso,
00:13:15.700 | it led to a collaboration with Penn and Teller
00:13:18.420 | who, where we did a magic trick in Las Vegas
00:13:21.780 | to contact Houdini.
00:13:23.060 | And sort of these fields are sort of like,
00:13:25.100 | you know, contacting spirits.
00:13:27.120 | So we did a magic trick in Las Vegas.
00:13:30.460 | And then the crazy thing that happened after that
00:13:33.580 | was Phil Rittmuller came running into my lab.
00:13:37.780 | He worked with Honda and NEC, and this became the problem:
00:13:42.140 | airbags were killing infants in rear-facing child seats.
00:13:45.660 | Cars need to distinguish a front-facing adult
00:13:49.500 | where you'd save the life versus a bag of groceries
00:13:52.500 | where you don't need to fire the airbag
00:13:53.900 | versus a rear-facing infant where you would kill it.
00:13:56.380 | And so the seat needed, in effect, to see in 3D
00:13:59.580 | to understand the occupants.
00:14:01.100 | And so we took the Penn and Teller magic trick
00:14:05.060 | derived from Josh's thesis from Yo-Yo's cello
00:14:08.580 | to an auto show and all the car companies said,
00:14:11.100 | "Great, when can we buy it?"
00:14:12.820 | And so that became Elesys
00:14:14.460 | and it was a hundred million dollar a year business
00:14:16.780 | making sensors.
00:14:17.700 | There wasn't a lot of publicity because it was in the car
00:14:20.100 | so the car didn't kill you.
00:14:22.160 | So they didn't sort of advertise,
00:14:24.460 | we have nice sensors so the car doesn't kill you.
00:14:26.700 | But it became a leading auto safety sensor.
00:14:28.900 | - And that started from the cello
00:14:30.580 | and the question of the computational capacity
00:14:33.180 | of a musical instrument.
00:14:34.220 | - Right, so now to get back to MIT,
00:14:39.220 | I was spending a lot of outside time at IBM Research
00:14:42.260 | that had gods of the foundations of computing.
00:14:45.080 | There's just amazing people there.
00:14:48.220 | And I'd always expected to go to IBM to take over a lab,
00:14:52.300 | but at the last minute pivoted and came to MIT
00:14:57.100 | to take a position in the Media Lab
00:15:02.100 | and start what became the predecessor to CBA.
00:15:05.500 | Media Lab is well known for Nicholas Negroponte.
00:15:09.940 | What's less well known is the role of Jerry Wiesner.
00:15:13.460 | So Jerry was MIT's president
00:15:15.740 | before that Kennedy Science Advisor,
00:15:17.540 | grand old man of science.
00:15:20.120 | At the end of his life, he was frustrated
00:15:23.140 | by how knowledge was segregated.
00:15:25.940 | And so he wanted to create a department
00:15:28.240 | of none of the above.
00:15:30.140 | A department for work that didn't fit in departments.
00:15:33.540 | And the Media Lab in a sense was a cover story
00:15:37.100 | for him to hide a department.
00:15:39.740 | As MIT's president towards the end of his tenure,
00:15:42.300 | if he said, "I'm gonna make a department
00:15:44.060 | for things that don't fit in departments,"
00:15:45.620 | the departments would have screamed.
00:15:47.660 | But everybody was sort of paying attention
00:15:49.420 | to Nicholas creating the Media Lab
00:15:50.980 | and Jerry kind of hid in it a department
00:15:53.700 | called Media Arts and Sciences.
00:15:55.260 | It's really the department of none of the above.
00:15:57.900 | And Jerry explaining that and Nicholas then confirming it
00:16:00.980 | is really why I pivoted and went to MIT.
00:16:03.820 | Because my students who helped create quantum computing
00:16:07.500 | or synthetic life get degrees from Media Arts and Sciences,
00:16:11.380 | this department of none of the above.
00:16:13.520 | So that led to coming to MIT.
00:16:17.220 | With Tod Machover and Joe Paradiso, my colleagues,
00:16:21.820 | we started a consortium called Things That Think.
00:16:24.260 | And this was around the birth of Internet of Things and RFID.
00:16:29.260 | But then we started doing things like work we can discuss
00:16:32.880 | that became the beginnings of quantum computing
00:16:35.260 | and cryptography and materials.
00:16:37.300 | And logic and microfluidics.
00:16:39.860 | And those needed much more significant infrastructure
00:16:44.860 | and were much longer research arcs.
00:16:47.220 | So with a bigger team of about 20 people,
00:16:50.580 | we wrote a proposal to the NSF to assemble one of every tool
00:16:54.100 | to make anything of any size was roughly the proposal.
00:16:57.580 | - One of any tool to make anything of any size.
00:17:01.540 | - Yeah, so they're usually nanometers, micrometers,
00:17:05.540 | millimeters, meters are segregated.
00:17:07.820 | Input and output is segregated.
00:17:10.660 | We wanted to look just very literally
00:17:12.940 | at how digital becomes physical and physical becomes digital.
00:17:16.940 | And fortunately, we got NSF on a good day.
00:17:21.620 | And they funded this facility of one of almost every tool
00:17:25.540 | to make anything.
00:17:27.020 | And so with a group of core colleagues
00:17:32.820 | that included Joe Jacobson, Ike Trang, Scott Manalis,
00:17:36.940 | we launched CBA.
00:17:38.260 | - And so you're talking about nanoscale, microscale,
00:17:42.820 | nanostructures, microstructures, macrostructures,
00:17:45.540 | electron microscopes and focused ion beam probes
00:17:48.140 | for nanostructures, laser micromachining
00:17:51.580 | and x-ray microtomography for microstructures,
00:17:55.260 | multi-axis machining and 3D printing for macrostructures,
00:17:57.820 | just some examples.
00:17:59.040 | What are we talking about in terms of scale?
00:18:00.660 | How can we build tiny things and big things
00:18:04.020 | all in one place?
00:18:04.860 | How's that possible? - Yeah, so a well-equipped
00:18:06.660 | research lab has the sort of tools we're talking about,
00:18:09.900 | but they're segregated in different places.
00:18:12.280 | They're typically also run by technicians
00:18:16.420 | where you then have an account and a project and you charge.
00:18:19.540 | All of these tools are essentially
00:18:22.060 | for when you don't know what you're doing,
00:18:23.380 | not when you do know what you're doing,
00:18:25.020 | in that they're for when you need to work across length scales.
00:18:28.260 | Once projects are running in this facility,
00:18:32.980 | we don't charge for time, you don't make a formal proposal
00:18:37.980 | to schedule, and the users really run the tools,
00:18:40.820 | and it's for work that's kind of inchoate
00:18:42.900 | that needs to span these disciplines and length scales.
00:18:46.500 | And so work in the project today,
00:18:51.500 | work in CBA today ranges from developing
00:18:55.820 | zeptojoule electronics for the lowest power computing
00:18:59.180 | to micromachining diamond to make 10 million RPM bearings
00:19:03.540 | for molecular spectroscopy studies
00:19:06.300 | up to exploring robots to build
00:19:09.740 | hundred meter structures in space.
00:19:12.780 | - Okay, can we, the three things you just mentioned,
00:19:15.900 | let's start with the biggest.
00:19:17.740 | What are some of the biggest stuff you attempted
00:19:20.340 | to explore how to build in a lab?
00:19:22.300 | - Sure, so viewed from one direction,
00:19:26.100 | what we're talking about is a crazy, random-seeming set
00:19:29.940 | of almost unrelated projects.
00:19:32.700 | But if you rotate 90 degrees,
00:19:34.380 | it's really just a core thought over and over again,
00:19:37.940 | just very literally how bits and atoms relate,
00:19:41.340 | how digital and just going from digital to physical
00:19:44.660 | in many different domains,
00:19:45.820 | but it's really just the same idea over and over again.
00:19:48.740 | So to understand the biggest things,
00:19:52.060 | let me go back to bring in now Shannon
00:19:56.940 | as well as Von Neumann.
00:19:59.300 | - Claude Shannon.
00:20:00.140 | - Yeah, so what is digital?
00:20:02.620 | The casual obvious answer is digital in one and zero,
00:20:08.300 | but that's wrong.
00:20:09.700 | There's a much deeper answer,
00:20:11.100 | which is Claude Shannon at MIT
00:20:16.100 | wrote the best master's thesis ever.
00:20:18.540 | In his master's thesis,
00:20:19.780 | he invented our modern notion of digital logic.
00:20:23.580 | Where it came from was Vannevar Bush
00:20:27.300 | was a grand old man at MIT.
00:20:30.420 | He created the post-war research establishment
00:20:33.380 | that led to the National Science Foundation.
00:20:35.500 | And he made an important mistake, which we can talk about,
00:20:38.620 | but he also made the differential analyzer,
00:20:41.900 | which was the last great analog computer.
00:20:44.100 | So it was a room full of gears and pulleys
00:20:46.180 | and the longer it ran, the worse the answer was.
00:20:49.020 | And Shannon worked on it as a student
00:20:52.540 | and he got so annoyed in his master's thesis,
00:20:55.100 | he invented digital logic.
00:20:57.660 | But he then went on to Bell Labs.
00:20:59.700 | And what he did there was
00:21:01.300 | communication was beginning to expand.
00:21:05.220 | There was more demand for phone lines.
00:21:07.540 | And so there was a question about how many
00:21:11.780 | phone messages you could send down a wire.
00:21:15.420 | And you could try to just make it better and better.
00:21:17.940 | He asked a question nobody had asked,
00:21:19.820 | which is rather than make it better and better,
00:21:21.740 | what's the limit to how good it can be?
00:21:23.740 | And he proved a couple things,
00:21:26.860 | but one of the main things he proved
00:21:28.620 | was a threshold theorem for channel capacity.
00:21:32.660 | And so what he showed was,
00:21:34.780 | my voice to you right now is coming as a wave through sound.
00:21:38.460 | And the further you get, the worse it sounds.
00:21:40.900 | But people watching this are getting it
00:21:43.460 | as packets of data in a network.
00:21:47.740 | When the computer they're watching this on
00:21:50.340 | gets the packet of information,
00:21:52.780 | it can detect and correct an error.
00:21:56.940 | And what Shannon showed is if the noise in the cable
00:22:00.780 | to the people watching this
00:22:02.380 | is above a threshold, they're doomed.
00:22:04.980 | But if the noise is below a threshold,
00:22:07.420 | for a linear increase in the energy
00:22:10.300 | representing our conversation,
00:22:12.420 | the error rate goes down exponentially.
00:22:15.500 | Exponentials are fast.
00:22:17.140 | There's very few of them in engineering.
00:22:19.540 | And the exponential reduction of error
00:22:22.060 | below a threshold if you restore state
00:22:25.020 | is called a threshold theorem.
00:22:27.220 | That's what led to digital.
00:22:29.220 | That means unreliable things can work reliably.
00:22:32.300 | So Shannon did that for communication.
00:22:34.860 | Then von Neumann was inspired by that
00:22:38.620 | and applied it to computation.
00:22:40.820 | And he showed how an unreliable computer
00:22:43.060 | can operate reliably
00:22:44.860 | by using the same threshold property of restoring state.
00:22:48.100 | It was then forgotten for many years.
00:22:49.540 | We had to rediscover it in effect
00:22:51.260 | in the quantum computing era
00:22:52.900 | when things are very unreliable again.
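The threshold behavior described above can be illustrated with the simplest possible error-correcting code, a repetition code with majority vote (my choice of example, not Shannon's actual construction):

```python
from math import comb

def majority_error(p, n):
    """Probability that a majority vote over n noisy copies of a bit,
    each flipped independently with probability p, decodes wrongly."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Below the threshold (p < 0.5), a linear increase in redundancy n
# drives the decoded error rate down exponentially:
for n in (1, 11, 21, 31):
    print(n, majority_error(0.1, n))

# Above the threshold, more redundancy doesn't help -- you're doomed:
print(majority_error(0.6, 31))
```

Each added repetition costs a linear amount of extra signal, but pushes the error rate down by a roughly constant factor, which is exactly the linear-cost, exponential-benefit trade the threshold theorem names.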
00:22:54.820 | But now to go back to how does this relate
00:22:58.580 | to the biggest things I've made?
00:23:00.580 | So in fabrication,
00:23:03.980 | MIT invented computer-controlled manufacturing in 1952.
00:23:08.980 | Jet aircraft were just emerging.
00:23:12.420 | There was a limit to turning cranks on a machine,
00:23:15.940 | on a milling machine to make parts for jet aircraft.
00:23:19.740 | Now this is a messy story.
00:23:21.060 | MIT actually stole computer-controlled machining
00:23:23.700 | from an inventor who brought it to MIT,
00:23:25.980 | wanted to do a joint project with the Air Force,
00:23:28.340 | and MIT effectively stole it from him.
00:23:30.100 | So it's kind of a messy history.
00:23:32.460 | But that sounds like the birth
00:23:35.060 | of computer-controlled machining, 1952.
00:23:38.020 | There are a number of inventors of 3D printing.
00:23:42.180 | One of the companies spun off from my lab
00:23:44.260 | by Max Lobovsky is Formlabs,
00:23:45.940 | which is now a billion-dollar 3D printing company.
00:23:48.820 | That's the modern version.
00:23:50.660 | But all of that's analog,
00:23:53.220 | meaning the information is in the control computer.
00:23:56.100 | There's no information in the materials.
00:23:58.060 | And so it goes back to Vannevar Bush's analog computer.
00:24:01.620 | If you make a mistake in printing or machining,
00:24:04.420 | just the mistake accumulates.
00:24:06.860 | The real birth of computerized digital manufacturing
00:24:12.060 | is four billion years ago.
00:24:14.020 | That's the evolutionary age of the ribosome.
00:24:16.900 | So the way you're manufactured
00:24:21.060 | is there's a code that describes you, the genetic code.
00:24:26.060 | It goes to a micro-machine, the ribosome,
00:24:30.540 | which is this molecular factory
00:24:32.260 | that builds the molecules that are you.
00:24:36.300 | The key thing to know about that
00:24:38.900 | is there are about 20 amino acids that get assembled.
00:24:43.500 | And in that machinery,
00:24:45.300 | it does everything Shannon and von Neumann taught us.
00:24:47.820 | You detect and correct errors.
00:24:49.500 | So if you mix chemicals,
00:24:51.460 | the error rate is about a part in 100.
00:24:53.940 | When you elongate a protein in the ribosome,
00:24:56.940 | it's about a part in 10 to the four.
00:24:59.020 | When you replicate DNA,
00:25:00.820 | there's an extra level of error correction.
00:25:03.120 | It's a part in 10 to the eight.
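A quick check on why those error rates matter (the ~300 amino-acid chain length is an illustrative number, not from the conversation): per-step fidelity compounds over the whole chain.

```python
def fraction_correct(p_error, length):
    """Probability an assembly of `length` steps finishes with no errors,
    assuming independent errors at rate p_error per step."""
    return (1 - p_error) ** length

# A ~300-step protein chain at the quoted error rates:
print(fraction_correct(1e-2, 300))  # bulk chemistry, part in 100: ~5% come out right
print(fraction_correct(1e-4, 300))  # ribosomal proofreading, part in 10^4: ~97% right
```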
00:25:05.180 | And so in the molecules that make you,
00:25:08.460 | you can detect and correct errors,
00:25:10.620 | and you don't need a ruler to make you.
00:25:12.660 | The geometry comes from your parts.
00:25:15.260 | So now compare a child playing with Lego
00:25:19.380 | and a state-of-the-art 3D printer
00:25:21.620 | or computerized milling machine.
00:25:23.480 | The tower made by a child
00:25:27.100 | is more accurate than their motor control
00:25:29.500 | because the act of snapping the bricks together
00:25:31.820 | gives you a constraint on the joints.
00:25:34.980 | You can join bricks made out of dissimilar materials.
00:25:38.500 | You don't need a ruler for Lego
00:25:40.100 | 'cause the geometry locally gives you the global parts.
00:25:42.980 | And there's no Lego trash.
00:25:44.540 | The parts have enough information to disassemble them.
00:25:47.580 | Those are exactly the properties of a digital code.
00:25:52.200 | - The unreliable is made reliable.
00:25:54.060 | - Yes, absolutely.
00:25:55.460 | So what the ribosome figured out four billion years ago
00:25:59.620 | is how to embody these digital properties,
00:26:03.540 | but not for communication or computation in effect,
00:26:08.160 | but for construction.
00:26:09.940 | So a number of projects in my lab
00:26:13.180 | have been studying the idea of digital materials.
00:26:16.380 | And think of a digital material just as Lego bricks.
00:26:20.260 | The precise meaning is a discrete set of parts
00:26:24.320 | reversibly joined with global geometry
00:26:28.340 | determined from local constraints.
00:26:30.540 | And so it's digitizing the materials.
00:26:33.340 | And so I'm coming back to
00:26:34.580 | what are the biggest things I've made?
00:26:36.360 | My lab was working with the aerospace industry.
So Spirit AeroSystems was Boeing's factories.
They asked us how to join composites.
00:26:44.640 | When you make a composite airplane,
00:26:46.440 | you make these giant wing and fuselage parts.
00:26:48.820 | And they asked us for a better way to stick them together
00:26:52.220 | 'cause the joints were a place of failure.
00:26:54.500 | And what we discovered was instead of making a few big parts,
00:26:59.500 | if you make little loops of carbon fiber
00:27:02.860 | and you reversibly link them in joints,
00:27:06.160 | and you do it in a special geometry
00:27:08.980 | that balances being under constrained and over constrained
00:27:12.380 | with just the right degrees of freedom,
00:27:14.740 | we set the world record
00:27:16.380 | for the highest modulus ultralight material
00:27:20.080 | just by in effect making carbon fiber Lego.
00:27:23.880 | So lightweight materials are crucial for energy efficiency.
00:27:29.260 | This let us make the lightest weight high modulus material.
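The "balancing being under-constrained and over-constrained with just the right degrees of freedom" idea can be made precise with standard Maxwell rigidity counting. This is a generic structural sketch of that criterion, not the lab's actual lattice geometry:

```python
# Maxwell's counting rule for pin-jointed trusses: in 3D a frame is
# isostatic (just rigid) when bars = 3*joints - 6. This is the textbook
# criterion, used here only to illustrate "just the right degrees of
# freedom"; it is not the lab's carbon-fiber lattice.
def maxwell_state(joints: int, bars: int, dim: int = 3) -> str:
    """Classify a pin-jointed truss by Maxwell's counting rule."""
    rigid_body_dof = {2: 3, 3: 6}[dim]  # free rigid-body motions
    slack = bars - (dim * joints - rigid_body_dof)
    if slack < 0:
        return "under-constrained (floppy mechanisms remain)"
    if slack > 0:
        return "over-constrained (self-stress locked in)"
    return "isostatic (just rigid)"

print(maxwell_state(joints=8, bars=12))  # a plain cube frame: floppy
print(maxwell_state(joints=8, bars=18))  # cube plus enough diagonals
```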
00:27:33.980 | We then showed that with just a few part types,
00:27:37.700 | we can tune the material properties.
00:27:39.920 | And then you can create really wild robots
00:27:43.240 | that instead of having a tool the size of a jumbo jet
00:27:45.920 | to make a jumbo jet,
00:27:47.000 | you can make little robots
00:27:48.560 | that walk on these cellular structures
00:27:50.700 | to build the structures
00:27:52.220 | where they error correct their position on the structure
00:27:54.580 | and they navigate on the structure.
00:27:56.700 | And so using all of that with NASA,
00:28:01.700 | we made morphing airplanes,
former students Kenny Cheung and Ben Jenett
00:28:08.400 | made a morphing airplane
00:28:09.340 | the size of NASA Langley's biggest wind tunnel.
With Toyota, we've made super-efficient race cars.
00:28:15.660 | We're right now looking at projects with NASA
00:28:18.020 | to build these for things like space telescopes
and space habitats.
So the ribosome, which I mentioned a little while back,
00:28:25.180 | can make an elephant one molecule at a time.
00:28:28.140 | Ribosomes are slow.
00:28:29.940 | They run at about one molecule a second,
00:28:32.020 | but ribosomes make ribosomes.
00:28:34.240 | So you have thousands of them,
00:28:36.340 | trillions of them and that makes an elephant.
00:28:38.580 | In the same way, these little assembly robots I'm describing
00:28:41.260 | can make giant structures at heart
00:28:44.540 | because the robot can make the robot.
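The elephant arithmetic is worth a back-of-envelope check. All constants here are rough assumptions for illustration (only the one-molecule-per-second rate is from the conversation):

```python
# "Ribosomes make ribosomes, so you have trillions of them": why
# self-replication changes the scaling. Constants are order-of-magnitude
# assumptions, not measured values.
import math

RATE = 1.0          # molecules per second per ribosome (as stated)
MOLECULES = 1e27    # rough molecule count for an elephant (assumed)
POP = 1e24          # working ribosome population in an elephant (assumed)

serial_seconds = MOLECULES / RATE            # one ribosome working alone
parallel_seconds = MOLECULES / (RATE * POP)  # the whole population
doublings = math.log2(POP)                   # replications to grow 1 -> POP

print(f"one ribosome alone: {serial_seconds:.0e} s (hopeless)")
print(f"full population:    {parallel_seconds:.0e} s")
print(f"doublings needed:   {doublings:.0f}")
```

The point is that only about 80 doublings separate one replicator from an elephant's worth of them; the same logic is what lets assembly robots that make assembly robots build giant structures.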
00:28:47.500 | So more recently, two of my students,
Amira and Miana had a Nature Communications paper
00:28:52.780 | showing how this robot can be made out of the parts
00:28:56.620 | it's making so the robots can make the robots.
00:28:59.800 | So you build up the capacity of robotic assembly.
00:29:02.300 | - It can self replicate.
00:29:03.180 | Can you linger on what that robot looks like?
00:29:05.120 | What is a robot that can walk along and do error correction?
00:29:08.580 | And what is a robot that can self replicate
00:29:11.860 | from the materials that is given?
00:29:14.820 | What does that look like?
00:29:15.660 | What are we talking?
00:29:16.980 | This is fascinating.
00:29:17.820 | - Yeah, the answer is different at different length scales.
00:29:21.660 | So to explain that, in biology,
00:29:24.540 | primary structure is the code in the messenger RNA
00:29:28.540 | that says what the ribosome should build.
00:29:31.220 | Secondary structure are geometrical motifs.
00:29:33.820 | They're things like helices or sheets.
00:29:35.860 | Tertiary structures are functional elements
00:29:38.540 | like electron donors or acceptors.
00:29:40.940 | Quaternary structure is things like molecular motors
00:29:45.940 | that are moving my mouth
00:29:47.140 | or making the synapses work in my brain.
00:29:49.480 | So there's that hierarchy of primary,
00:29:51.460 | secondary, tertiary, quaternary.
00:29:53.840 | Now what's interesting is
00:29:55.480 | if you wanna buy electronics today from a vendor,
00:30:00.320 | there are hundreds of thousands of types of resistors
00:30:04.260 | or capacitors or transistors, huge inventory.
00:30:07.700 | All of biology is just made from this inventory
00:30:09.920 | of 20 parts amino acids.
00:30:11.640 | And by composing them, you can create all of life.
00:30:14.560 | And so as part of this digitization of materials,
00:30:19.560 | we're in effect trying to create
00:30:22.520 | something like amino acids for engineering,
00:30:24.520 | creating all of technology from 20 parts.
As another digression,
00:30:31.320 | I helped start an office for science in Hollywood.
00:30:34.840 | And there was a fun thing for the movie "The Martian"
00:30:38.080 | where I did a program with Bill Nye and a few others
00:30:41.160 | on how to actually build a civilization on Mars
00:30:43.560 | that they described in a way that I like
00:30:45.920 | as I was talking about how to go to Mars without luggage.
00:30:49.040 | And at heart, it's sort of how to create life
00:30:52.840 | in non-living materials.
00:30:54.280 | So if you think about this primary,
00:30:56.800 | secondary, tertiary, quaternary structure,
00:30:59.000 | in my lab, we're doing that,
00:31:02.340 | but on different length scales for different purposes.
00:31:04.440 | So we're making micro robots out of like nano bricks
00:31:08.640 | and to make the robots to build large scale structures
00:31:12.920 | in space, the elements of the robots now are centimeters
00:31:16.920 | rather than micrometers.
And so with the assembly robots for the bigger structures,
there are the cells that make up the structure,
00:31:27.840 | but then we have functional cells.
00:31:29.520 | And so cells that can process and actuate,
00:31:32.440 | each cell can like move one degree of freedom
00:31:35.320 | or attach or detach or process.
00:31:40.320 | Now, those elements I just described,
00:31:43.080 | we can make out of the still smaller parts.
00:31:45.280 | So eventually there's a hierarchy of the little parts
00:31:47.720 | make little robots that make bigger parts of bigger robots
00:31:51.080 | up through that hierarchy.
00:31:52.560 | - And in that way, you can move up the length scale.
00:31:54.400 | - Right, early on, I tried to go in a straight line
00:31:57.320 | from the bottom to the top,
00:31:58.480 | and that ended up being a bad idea.
00:32:00.400 | Instead, we're kind of doing all of these in parallel
00:32:02.880 | and then they're growing together.
00:32:04.760 | And so to make the larger scale structures,
00:32:08.280 | like there's a lot of hype right now
00:32:10.220 | about 3D printing houses
00:32:11.480 | where you have a printer the size of the house.
00:32:13.800 | We're right now working on using swarms of these
00:32:17.280 | table scale robots that walk on the structures
00:32:19.520 | to place the parts much more efficiently.
00:32:22.320 | - That's amazing.
00:32:23.680 | But you're saying you can't for now go from the very small
00:32:26.640 | to the very large.
00:32:27.840 | - That'll come, that'll come in stages.
00:32:30.760 | - Can I just linger on this idea?
00:32:32.960 | Starting from von Neumann's self-replicating automata
00:32:36.600 | that you mentioned, it's just a beautiful idea.
00:32:39.640 | - So that's at the heart of all of this.
00:32:41.600 | In the stack I described, so one student, Will Langford,
00:32:44.840 | made these micro robots out of little parts
that then we're using for Miana's bigger robots
00:32:50.740 | up through this hierarchy.
00:32:52.200 | And it's really realizing this idea
00:32:54.440 | of the self-reproducing automata.
00:32:56.040 | So von Neumann, when I complained about
00:32:59.000 | the von Neumann architecture,
00:33:01.400 | it's not fair to von Neumann 'cause he never claimed it
00:33:03.820 | as his architecture.
00:33:04.800 | He really wrote about it in this one fairly dreadful memo
00:33:07.280 | that led to all sorts of lawsuits and fights
00:33:09.920 | about the early days of computing.
00:33:11.960 | He did beautiful work on reliable computation
00:33:15.560 | and reliable devices.
00:33:17.040 | And towards the end of his life, what he studied was how,
00:33:21.480 | and I have to say this precisely,
00:33:24.040 | how a computation communicates its own construction.
00:33:27.480 | - Yeah, so beautiful.
00:33:28.960 | - So a computation can store a description
00:33:33.960 | of how to build itself.
00:33:35.920 | But now there's a really hard problem,
00:33:38.600 | which is if you have that in your mind,
00:33:43.460 | how do you transfer it and wake up a thing
00:33:48.000 | that then can contain it?
00:33:50.260 | So how do you give birth to a thing
00:33:51.700 | that knows how to make itself?
00:33:53.640 | And so with Stan Ulam, he invented cellular automata
00:33:58.400 | as a way to simulate these, but that was theoretical.
00:34:03.400 | Now the work I'm describing in my lab
00:34:05.880 | is fundamentally how to realize it,
00:34:08.440 | how to realize self-reproducing automata.
00:34:13.120 | And so this is something von Neumann thought very deeply
00:34:16.840 | and very beautifully about theoretically.
00:34:19.600 | And it's right at this intersection.
00:34:21.960 | It's not communication or computation or fabrication.
00:34:27.240 | It's right at this intersection where communication
00:34:29.340 | and computation meets fabrication.
00:34:31.340 | Now the reason self-reproducing automata
00:34:35.240 | intellectually is so important,
00:34:36.760 | 'cause this is the foundation of life.
This is really just understanding the essence of how to make life,
00:34:42.040 | and in effect, we're trying to create life
00:34:43.680 | in non-living material.
00:34:45.340 | The reason it's so important technologically
00:34:48.340 | is because that's how you scale capacity.
00:34:50.760 | That's how you can make an elephant from a ribosome,
00:34:53.480 | 'cause the assemblers make assemblers.
00:34:56.320 | - So simple building blocks that inside themselves
00:35:00.000 | contain the information how to build more building blocks
00:35:03.360 | and between each other construct
00:35:06.400 | arbitrarily complex objects.
00:35:08.200 | - Now let me give you the numbers.
00:35:09.800 | So let me relate this to, right now we're living
00:35:12.400 | in AI mania explosion time.
00:35:15.800 | Let me relate that to what we're talking about.
00:35:18.920 | 100 petaflop computer, which is a current generation
00:35:25.880 | super computer, not quite the biggest ones,
00:35:29.920 | does 10 to the 17 ops per second.
00:35:32.840 | Your brain does 10 to the 17 ops per second.
00:35:35.720 | It has about 10 to the 15 synapses
00:35:38.000 | and they run at about 100 hertz.
00:35:40.080 | So as of a year or two ago,
00:35:42.380 | the performance of a big computer matched a brain.
00:35:47.640 | So you could view AI as a breakthrough,
00:35:50.440 | but the real story is within about a year or two ago,
00:35:55.720 | and let's see, the super computer has about 10 to the 15
00:36:00.000 | transistors in the processors,
00:36:01.440 | 10 to the 15 transistors in the memory,
00:36:03.320 | which is the synapses in your brain.
00:36:05.160 | So the real breakthrough was the computers
00:36:07.720 | match the computational capacity of a brain.
00:36:11.620 | And so we'd be sort of derelict
00:36:13.080 | if they couldn't do about the same thing.
00:36:15.480 | But now the reason I'm mentioning that
00:36:17.440 | is the chip fab making the super computer
00:36:23.280 | is placing about 10 to the 10 transistors a second.
00:36:26.400 | While you're digesting your lunch right now,
00:36:29.720 | you're placing about 10 to the 18 parts per second.
00:36:33.780 | There's an eight order of magnitude difference,
00:36:39.080 | so in computational capacity, it's done, we've caught up.
00:36:42.400 | But there's eight orders of magnitude difference
00:36:45.000 | in the rate at which biology can build
00:36:47.580 | versus state of the art manufacturing can build.
00:36:50.800 | And that distinction is what we're talking about.
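The orders-of-magnitude argument just laid out checks out arithmetically (numbers as stated in the conversation, all approximate):

```python
# The eight-orders-of-magnitude argument, using the figures from the
# conversation: ~1e15 synapses at ~100 Hz vs a ~100-petaflop machine,
# and a chip fab's placement rate vs your metabolism's.
import math

brain_ops = 1e15 * 100     # synapses * firing rate (Hz)
supercomputer_ops = 1e17   # ~100 petaflops, as stated

fab_rate = 1e10            # transistors a chip fab places per second
body_rate = 1e18           # parts per second your metabolism places

compute_gap = math.log10(supercomputer_ops / brain_ops)
build_gap = math.log10(body_rate / fab_rate)

print(f"compute gap: {compute_gap:.1f} orders of magnitude")
print(f"build gap:   {build_gap:.1f} orders of magnitude")
```

Computation has caught up (gap of zero), while manufacturing throughput still trails biology by eight orders of magnitude.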
00:36:52.920 | That distinction is not analog,
00:36:56.240 | but this deep sense of digital fabrication,
00:36:58.740 | of embodying codes in construction.
00:37:01.080 | So a description doesn't describe a thing,
00:37:02.860 | but the description becomes the thing.
00:37:04.920 | - So you're saying, I mean, this is one of the cases
00:37:07.360 | you're making, and this is this third revolution.
00:37:10.420 | We've seen the Moore's law in communication,
00:37:12.480 | we've seen the Moore's law-like type of growth
00:37:15.320 | in computation, and you're anticipating
00:37:19.400 | we're going to see that in digital fabrication.
00:37:22.160 | Can you actually, first of all, describe
00:37:23.760 | what you mean by this term, digital fabrication?
00:37:26.440 | - So the casual meaning is a computer controls
00:37:29.400 | a tool to make something.
00:37:31.480 | And that was invented when MIT stole it in 1952.
00:37:35.060 | There's the deep meaning of what the ribosome does,
00:37:39.600 | of a digital description doesn't describe a thing,
00:37:44.600 | a digital description becomes the thing.
00:37:48.060 | That's the path to the Star Trek replicator,
00:37:52.980 | and that's the thing that doesn't exist yet.
00:37:56.260 | Now I think the best way to understand
00:37:58.500 | what this roadmap looks like is to now bring in Fab Labs,
00:38:03.340 | and how they relate to all of this.
00:38:04.700 | - What are Fab Labs?
00:38:05.940 | - So here's a sequence.
00:38:08.820 | With colleagues, I accidentally started a network
00:38:12.300 | of what's now 2,500 digital fabrication community labs,
00:38:16.480 | called Fab Labs, right now in 125 countries,
00:38:19.500 | and they double every year and a half.
That's called Lass's Law after Sherry Lassiter,
00:38:24.240 | who I'll explain.
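Doubling every year and a half compounds quickly; a quick projection from the ~2,500 labs mentioned (illustrative arithmetic only, not a forecast):

```python
# Lass's Law as stated: the lab count doubles every 1.5 years.
# Starting count of 2,500 is the figure from the conversation.
def labs_after(years: float, start: int = 2500, doubling: float = 1.5) -> float:
    """Projected lab count after `years` of steady doubling."""
    return start * 2 ** (years / doubling)

for y in (1.5, 3.0, 7.5, 15.0):
    print(f"after {y:4.1f} years: ~{labs_after(y):,.0f} labs")
```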
00:38:25.220 | So here's the sequence.
00:38:26.540 | We started Center for Bits and Atoms
00:38:31.100 | to do the kind of research we're talking about.
00:38:33.380 | We had all of these machines, and then had a problem,
00:38:36.900 | it would take a lifetime of classes
00:38:38.860 | to learn to use all the machines.
00:38:41.460 | So with colleagues who helped start CBA,
00:38:45.540 | we began a class modestly called
00:38:47.360 | How to Make Almost Anything.
00:38:48.880 | And there's no big agenda, it was just,
00:38:51.320 | it was aimed at a few research students to use the machines.
00:38:53.980 | And we're completely unprepared,
00:38:55.980 | for the first time we taught it,
00:38:57.800 | we were swamped by, every year since,
00:39:00.700 | hundreds of students try to take the class,
00:39:02.680 | it's one of the most oversubscribed classes at MIT.
00:39:05.560 | Students would say things like,
can you teach this at MIT? It seems too useful.
00:39:10.520 | It's just how to work these machines.
00:39:13.120 | And the students in the class,
00:39:15.740 | I would teach them all the skills to use all these tools,
00:39:18.900 | and then they would do projects integrating them,
00:39:20.980 | and they were amazing.
00:39:21.820 | So Kelly was a sculptor, no engineering background.
00:39:25.500 | Her project was, she made a device that saves up screams
00:39:28.320 | when you're mad and plays them back later.
00:39:30.660 | - And saves up screams when you're mad
00:39:33.340 | and plays them back later.
00:39:34.180 | - Right, you scream into this device,
00:39:35.640 | and it deadens the sound, records it,
00:39:38.220 | and then when it's convenient, releases your scream.
00:39:40.900 | - Can we just pause on the brilliance of that invention?
00:39:45.160 | Creation, the art, I don't know, the brilliance.
00:39:49.320 | Who is this that created this?
00:39:50.160 | - Kelly Dobson.
00:39:51.120 | - Kelly Dobson.
00:39:52.040 | - Gone on to do a number of interesting things.
Meejin, who's gone on to do a number of interesting things,
00:39:57.120 | made a dress instrumented with sensors in spines,
00:39:59.840 | and when somebody creepy comes close,
00:40:01.280 | it would defend your personal space.
00:40:02.800 | - They're also very useful.
00:40:04.840 | - Another project early on was a web browser for parrots,
00:40:07.920 | which have the cognitive ability of a young child
00:40:09.960 | and lets parrots surf the internet.
00:40:12.240 | You know, another was an alarm clock you wrestle with
00:40:15.640 | and prove you're awake.
00:40:17.320 | And what connects all of these is,
00:40:20.280 | so MIT made the first real-time computer, the Whirlwind.
00:40:23.360 | That was transistorized as the TX.
00:40:26.320 | The TX was spun off from MIT as the PDP.
00:40:30.520 | PDPs were the mini computers that created the internet.
00:40:36.440 | So outside MIT was DEC, Prime, Wang, Data General,
00:40:40.680 | the whole mini computer industry.
00:40:42.840 | The whole computing industry was there,
00:40:44.720 | and it all failed when computing became personal.
00:40:48.760 | Ken Olsen, the head of digital,
00:40:50.360 | famously said you don't need a computer at home.
00:40:52.880 | There's a little background to that,
00:40:54.120 | but DEC completely missed computing became personal.
00:40:58.680 | So I mention all of that because I was asking
00:41:01.680 | how to do digital fabrication, but not really why.
00:41:04.680 | The students in this how to make class were showing me
00:41:08.400 | that the killer app of digital fabrication
00:41:11.000 | is personal fabrication.
00:41:12.360 | - Yeah, how do you jump to the personal fabrication?
- So Kelly didn't make the ScreamBody
00:41:16.880 | because it was for a thesis.
00:41:18.160 | She wasn't writing a research paper.
00:41:20.220 | It wasn't a business model.
00:41:23.120 | She wanted, it was 'cause she wanted one.
00:41:25.440 | It was personal expression,
00:41:27.160 | going back to me in vocational school,
00:41:29.560 | it was personal expression in these new means of expression.
00:41:33.000 | So that's happened every year since.
00:41:34.880 | - It literally is called, the course is literally called
00:41:37.600 | How to Make Almost Anything, a legendary course at MIT.
00:41:42.600 | Every year.
00:41:44.480 | - And it's grown to multiple labs at MIT
00:41:48.600 | with as many people involved in teaching as taking it,
00:41:51.160 | and there's even a Harvard lab for the MIT class.
00:41:53.640 | - What have you learned about humans
00:41:57.160 | colliding with the Fab Lab,
00:41:59.660 | about what the capacity of humans to be creative and to build?
00:42:02.560 | - I mentioned Marvin.
00:42:03.960 | Another mentor at MIT,
00:42:05.440 | sadly no longer living, is Seymour Papert.
00:42:08.120 | So Papert studied with Piaget.
00:42:11.120 | He came to MIT to get access to the early,
00:42:14.640 | Piaget was a pioneer in how kids learn.
00:42:17.980 | Papert came to MIT to get access to the early computers
00:42:22.760 | with the goal of letting kids play with them.
00:42:25.280 | Piaget helped show kids are like scientists.
00:42:27.360 | They learn as scientists,
00:42:29.280 | and it gets kind of throttled out of them.
00:42:31.280 | Seymour wanted to let kids have a broader landscape to play.
00:42:34.280 | Seymour's work led with Mitch Resnick
00:42:36.200 | to Lego, Logo, Mindstorms, all of that stuff.
00:42:39.400 | As Fab Lab spread,
00:42:40.960 | and we started creating educational programs
00:42:43.520 | for kids in them,
00:42:44.440 | Seymour said something really interesting.
00:42:46.120 | He made a gesture.
00:42:47.000 | He said it was a thorn in his side
00:42:50.480 | that they invented what's called the Turtle,
an early robot kids could program,
connected to a mainframe computer.
00:42:57.920 | Seymour said the goal was not for the kids
00:43:01.280 | to program the robot.
00:43:02.480 | It was for the kids to create the robot.
00:43:04.540 | And so in that sense,
00:43:06.520 | the Fab Labs, which for me were just this accident,
00:43:09.380 | he described as sort of this fulfillment of the arc
00:43:12.360 | of kids learn by experimenting.
00:43:14.440 | It was to give them the tools to create,
00:43:16.880 | not just assemble things and program things,
00:43:19.580 | but actually create.
00:43:20.940 | So coming to your question,
00:43:23.120 | what I've learned is
00:43:27.080 | MIT, a few years back,
00:43:29.400 | somebody added up businesses spun off from MIT,
00:43:32.780 | and it's the world's 10th economy.
00:43:34.480 | It falls between India and Russia.
00:43:36.520 | And I view that in a way as a bad number
00:43:39.720 | because it's only a few thousand people,
00:43:41.640 | and these aren't uniquely the 4,000 brightest people.
00:43:44.720 | It's just a productive environment for them.
00:43:47.280 | And what we found is in rural Indian villages
00:43:50.720 | in African shanty towns in Arctic hamlets,
00:43:55.120 | I find exactly precisely that profile.
So Lyngen is a few hours above Tromsø,
way above the Arctic Circle.
00:44:05.840 | It's so far north, the satellite dishes look at the ground,
00:44:08.080 | not the sky.
00:44:09.480 | Hans Christian in the lab was considered a problem
00:44:13.280 | in the local school
00:44:14.480 | 'cause they couldn't teach him anything.
00:44:16.200 | I showed him a few projects.
00:44:17.480 | Next time I came back,
00:44:18.360 | he was designing and building little robot vehicles.
And in South Africa, I mentioned Soshanguve,
00:44:26.040 | in this apartheid township,
00:44:28.360 | the local technical institute taught kids
00:44:30.480 | how to make bricks and fold sheets.
00:44:32.200 | It was punitive.
00:44:33.800 | But Tshepiso in the Fab Lab
00:44:35.540 | was actually doing all the work of my MIT classes.
00:44:37.860 | And so over and over,
00:44:39.560 | we found precisely the same kind of bright
inventive creativity.
00:44:47.760 | And historically, the answer was,
00:44:51.800 | go, you're smart, go away.
00:44:53.160 | It's sort of like me in vocational school.
00:44:55.560 | But in this lab network,
00:44:57.280 | what we could then do is in effect,
00:44:59.120 | bring the world to them.
00:45:00.480 | Now let's look at the scaling of all of this.
00:45:02.420 | So there's one earth, a thousand cities,
00:45:07.420 | a million towns, a billion people, a trillion things.
00:45:11.100 | There was one whirlwind computer.
00:45:14.200 | MIT made the first real-time computer.
00:45:17.280 | There were thousands of PDPs.
00:45:19.840 | There were millions of hobbyist computers
00:45:21.800 | that came from that.
00:45:23.600 | Billions of personal computers,
00:45:25.120 | trillions of internet of things.
00:45:27.400 | So now if we look at this Fab Lab story,
00:45:30.160 | 1952 was the NC Mill.
00:45:32.340 | There are now thousands of Fab Labs.
And the Fab Lab has almost exactly the same
cost and complexity as the mini computer.
So the mini computer didn't fit in your pocket.
00:45:46.000 | It filled the room.
00:45:47.680 | But video games, email, word processing,
00:45:51.600 | really anything you do, the internet,
00:45:53.460 | anything you do with a computer today
00:45:55.200 | happened at that era.
00:45:56.720 | Because it got on the scale of a work group,
00:45:59.320 | not a corporation.
00:46:01.420 | In the same way, Fab Labs are like the mini computers
00:46:05.100 | inventing how does the world work
00:46:06.520 | if anybody can make anything.
00:46:08.360 | Then if you look at that scaling,
00:46:11.840 | labs today are transitioning from buying a machine
00:46:16.040 | to make machines making machines.
00:46:18.000 | So we're transitioning to you can go to a Fab Lab
00:46:20.600 | not to make a project, but to make a new machine.
00:46:24.340 | So we talked about the deep sense of self-replication.
00:46:27.280 | There's a very practical sense of Fab Lab machines
00:46:30.240 | making Fab Lab machines.
00:46:32.480 | And so that's the equivalent of the hobbyist computer era,
what was called the Altair historically.
00:46:41.440 | Then the work we spent a while talking about
00:46:43.640 | about assemblers and self-assemblers,
00:46:45.600 | that's the equivalent of smartphones and internet of things.
00:46:49.680 | That's when, so the assemblers are like the smartphone
00:46:53.920 | where a smartphone today has the capacity
00:46:55.900 | of what used to be a supercomputer in your pocket.
00:46:58.640 | And then the smart thermostat on your wall
00:47:02.360 | has the power of the original PDP computer,
00:47:07.360 | not metaphorically, but literally.
00:47:08.940 | And now there's trillions of those.
00:47:10.800 | In the same sense that when we finally merge materials
00:47:14.760 | with the machines and the self-assembly,
00:47:16.560 | that's like the internet of things stage.
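The two scaling ladders in this stretch of the conversation line up stage by stage; here is my own tabulation of them (counts are the orders of magnitude as stated):

```python
# The computing ladder (one Whirlwind -> trillions of IoT devices)
# aligned with the fabrication ladder described here. My tabulation of
# the conversation; counts are orders of magnitude, not exact figures.
computing = [("Whirlwind", 1),
             ("PDP minicomputers", 1e3),
             ("hobbyist computers (Altair)", 1e6),
             ("personal computers", 1e9),
             ("internet of things", 1e12)]
fabrication = ["1952 NC mill", "Fab Labs",
               "machines making machines", "assemblers",
               "self-assembling materials"]

for (stage, count), fab in zip(computing, fabrication):
    print(f"{count:>16,.0f}  {stage:28s} <-> {fab}")
```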
00:47:19.080 | But here's the important lesson.
00:47:21.200 | If you look at the computing analogy,
00:47:24.800 | computing expanded exponentially,
00:47:27.920 | but it really didn't fundamentally change.
00:47:32.080 | The core things happened in that transition
00:47:35.960 | in the mini computer era.
00:47:37.240 | So in the same sense, the research now,
00:47:40.120 | we spent a while talking about
00:47:41.280 | is how we get to the replicator.
00:47:43.640 | Today, you can do all of that.
00:47:45.440 | If you close your eyes
00:47:46.560 | and view the whole Fab Lab as a machine,
00:47:49.080 | in that room you can make almost anything,
00:47:51.040 | but you need a lot of inputs.
00:47:52.720 | Bit by bit, the inputs will go down
00:47:55.600 | and the size of the room will go down
00:47:57.360 | as we go through each of these stages.
00:47:59.800 | - So how difficult is it to create
00:48:02.200 | a self-replicating assembler,
00:48:04.640 | self-replicating machine that builds copies of itself
00:48:09.600 | or builds more complicated version of itself,
00:48:11.600 | which is kind of the dream towards which you're pushing
00:48:14.280 | in a generic, arbitrary sense?
00:48:16.800 | - I had a student, Nadia Peek, with Jonathan Ward,
00:48:20.280 | who for me started this idea of
00:48:22.800 | how do we use the tools in my lab
00:48:24.440 | to make the tools in the lab?
00:48:25.720 | - Yes.
00:48:26.720 | - In a very clear sense,
00:48:29.440 | they are making self-reproducing machines.
00:48:31.760 | So one of the really cool things that's happened
00:48:34.480 | is there's a whole network of machine builders
00:48:37.960 | around the world.
00:48:38.840 | So there's Daniel now in Germany and Jens in Norway.
00:48:43.040 | And each of these people has learned the skills
00:48:48.040 | to go into a Fab Lab and make a machine.
00:48:50.880 | And so we've started creating a network of super Fab,
00:48:54.000 | so the Fab Lab can make a machine,
00:48:55.480 | but it can't make a number of the precision parts
00:48:57.560 | of the machine.
00:48:58.640 | So in places like Bhutan or Kerala in the South of India,
00:49:01.920 | we've started creating super Fab Labs
00:49:03.960 | that have more advanced tools
00:49:05.840 | to make the parts of the machines
00:49:07.800 | so that the machines themselves become even cheaper.
00:49:11.200 | So that is self-reproducing machines,
00:49:15.640 | but you need to feed it things like bearings
00:49:19.400 | or microcontrollers.
00:49:20.600 | They can't make those parts.
00:49:21.840 | But other than that, they're making their own things.
00:49:24.360 | And I should note as a footnote,
00:49:26.280 | the stack I described of computers controlling machines
to machines making machines,
00:49:30.760 | to assemblers, to self-assemblers,
00:49:32.560 | view that as Fab 1, 2, 3, 4.
00:49:35.320 | So we're transitioning from Fab 1 to Fab 2,
00:49:38.200 | and the research in the lab is 3 and 4.
00:49:40.560 | At this Fab 2 stage, a big component of this
00:49:43.880 | is sustainability in the material feedstocks.
00:49:47.840 | So Alicia, a colleague in Chile, is leading a great effort
00:49:51.520 | looking at how you take forest products
00:49:53.960 | and coffee grounds and seashells
00:49:55.680 | and a range of locally available materials
00:49:58.600 | and produce the high-tech materials that go into the lab.
00:50:02.440 | So all of that is machine building today.
00:50:05.760 | Then back in the lab, what we can do today
00:50:10.760 | is we have robots that can build structures
00:50:14.160 | and can assemble more robots that build structures.
00:50:17.600 | We have finer resolution robots
00:50:20.440 | that can build micro-mechanical systems.
00:50:23.040 | So robots that can build robots
00:50:24.760 | that can walk and manipulate.
00:50:26.960 | And we're just now, we have a project
00:50:30.560 | at the layer below that where there's endless attention today
00:50:33.880 | to billion-dollar chip fab investments.
00:50:36.480 | But a really interesting thing we passed through
00:50:40.240 | is today the smallest transistors you can buy
00:50:44.120 | as a single transistor, just commercially for electronics,
00:50:47.320 | is actually the size of an early transistor
00:50:49.680 | in an integrated circuit.
00:50:51.760 | So we're using these machines-making machines,
00:50:54.520 | making assemblers, to place those parts
00:50:57.240 | to not use a billion-dollar chip fab
00:50:59.280 | to make integrated circuits,
00:51:00.440 | but actually assemble little electronic components.
00:51:03.080 | - So have a fine enough, precise enough actuators
00:51:06.800 | and manipulators that allow you to place these transistors.
00:51:10.080 | - Right, that's a research project in my lab
00:51:12.520 | called DICE, on discrete assembly of integrated electronics.
00:51:16.240 | And we're just at the point to really start
00:51:18.840 | to take seriously this notion of not having a chip fab
00:51:22.600 | make integrated electronics, but having not a 3D printer,
00:51:26.760 | but a thing that's a cross between a pick and place
00:51:30.040 | to make circuit boards in 2D.
00:51:32.200 | The 3D printer extrudes in 3D.
00:51:34.840 | We're making sort of a micromanipulator
00:51:37.000 | that acts like a printer,
00:51:38.120 | but it's placing to build electronics in 3D.
00:51:41.360 | - But this micromanipulator is distributed,
00:51:43.400 | so there's a bunch of them,
00:51:44.320 | or is this one centralized thing?
00:51:46.240 | - That's why that's a great question.
00:51:47.640 | So I have a prize that's almost but not been claimed
00:51:51.240 | for the students whose thesis can walk out of the printer.
00:51:54.640 | - Oh, nice.
00:51:55.560 | - So you have to print the thesis
00:51:58.120 | with the means to exit the printer,
00:52:01.520 | and it has to contain its description of the thesis
00:52:05.800 | that says how to do that.
00:52:07.160 | - It's a really good, I mean,
00:52:10.040 | it's a fun example of exactly the thing we're talking about.
00:52:13.480 | - And I've had a few students almost get to that.
00:52:17.400 | And so in what I'm describing,
00:52:20.880 | there's this stack where we're getting closer,
00:52:23.340 | but it's still quite a few years to really go from a,
00:52:26.660 | so there's a layer below the transistors
00:52:28.800 | where we assemble the base materials
00:52:30.460 | that become the transistor.
00:52:32.180 | We're now just at the edge of assembling the transistors
00:52:35.580 | to make the circuits.
00:52:37.720 | We can assemble the micro parts to make the micro robots.
00:52:41.620 | We can assemble the bigger robots.
00:52:43.600 | And in the coming years,
00:52:44.540 | we'll be patching together all of those scales.
00:52:47.660 | - So do you see a vision of just endless billions of robots
00:52:52.660 | at the different scales,
00:52:53.780 | self-assembling, self-replicating,
00:52:56.820 | and building complicated structures?
00:52:58.820 | - Yes.
00:53:00.460 | And the 'but' to the 'yes, but' is,
00:53:03.420 | let me clarify two things.
00:53:04.880 | One is that immediately raises King Charles' fear
00:53:09.880 | of Grey Goo, of runaway mutant self-reproducing things.
00:53:14.960 | The reason why there are many things
00:53:17.220 | I can tell you to worry about,
00:53:18.980 | but that's not one of them,
00:53:21.180 | is if you want things to autonomously self-reproduce
00:53:25.620 | and take over the world,
00:53:26.740 | that means they need to compete with nature
00:53:28.980 | on using the resources of nature,
00:53:31.180 | of water and sunlight.
00:53:32.860 | And in light of everything I'm describing,
00:53:35.540 | biology knows everything I told you.
00:53:37.860 | Every single thing I explain,
00:53:39.480 | biology already knows how to do.
00:53:41.180 | What I'm describing isn't new for biology.
00:53:45.380 | It's new for non-biological systems.
00:53:47.840 | So in the digital era,
00:53:50.420 | the economic win ended up being centralized,
00:53:53.740 | the big platforms.
00:53:55.620 | In this world of machines that can make machines,
00:53:58.260 | I'm asked, for example,
00:53:59.920 | what's the killer opportunity?
00:54:03.500 | Who's gonna make all the money?
00:54:05.580 | Who to invest in?
00:54:07.060 | But if the machine can make the machine,
00:54:08.940 | it's not a great business to invest in the machine.
00:54:12.160 | In the same way that if you can think globally,
00:54:17.820 | but produce locally,
00:54:19.700 | then the way the technology goes out into society
00:54:23.340 | isn't a function of central control,
00:54:26.120 | but is fundamentally distributed.
00:54:27.900 | Now, that raises an obvious kind of concern,
00:54:31.620 | which is, well, doesn't this mean
00:54:32.740 | you could make bombs and guns and all of that?
00:54:35.720 | The reason that's much less of a problem
00:54:38.220 | than you would think is making bombs and guns
00:54:41.980 | and all of that is a very well-met market need.
00:54:45.260 | Anywhere we go, there's a fine supply chain for weapons.
00:54:49.660 | Now, hobbyists have been making guns for ages,
00:54:51.780 | and guns are available just about anywhere.
00:54:54.300 | So you could go into the lab and make a gun.
00:54:56.300 | Today, it's not a very good gun,
00:54:58.380 | and guns are easily available.
00:55:00.100 | And so generally, we run these labs in war zones.
00:55:03.180 | What we find is people don't go to them to make weapons,
00:55:08.020 | which you can already do anyway.
00:55:09.420 | It's an alternative to making weapons.
00:55:11.540 | Coming back to your question,
00:55:12.780 | I'd say the single most important thing I've learned
00:55:15.420 | is the greatest natural resource of the planet
00:55:18.980 | is this amazing density of bright and inventive people
00:55:23.060 | whose brains are underused.
00:55:24.580 | And you could view the social engineering of this lab work
00:55:29.580 | as creating the capacity for them.
00:55:32.380 | And so in the end, the way this is gonna impact society
00:55:37.060 | isn't gonna be command and control.
00:55:38.860 | It's how the world uses it.
00:55:40.960 | And it's been really gratifying for me
00:55:43.500 | to see just how it does.
00:55:45.740 | - Yeah, but what are the different ways
00:55:48.100 | the evolution of the exponential scaling
00:55:50.860 | of digital fabrication can evolve?
00:55:52.700 | So you said, yeah, self-replicating nanobots, right?
00:55:56.700 | This is the Grey Goo fear.
00:55:59.660 | It's a caricature of a fear,
00:56:01.700 | but nevertheless, there's interesting,
00:56:03.620 | just like you said, spam and all these kinds of things
00:56:06.540 | that came with the scaling of communication and computation.
00:56:10.020 | What are the different ways that malevolent actors
00:56:13.700 | will use this technology?
00:56:15.020 | - Yeah, well, first, let me start with a benevolent story,
00:56:17.620 | which is trash is an analog concept.
00:56:22.620 | There's no trash in a forest.
00:56:25.740 | All the parts get disassembled and reused.
00:56:28.460 | Trash means something doesn't have enough information
00:56:31.500 | to tell you how to reuse it.
00:56:33.900 | It's as simple as there's no trash in a Lego room.
00:56:38.380 | When you assemble Lego, the Lego bricks
00:56:40.420 | have enough information to disassemble them.
00:56:42.820 | So as you go through this Fab 1, 2, 3, 4 story,
00:56:47.220 | one of the implications of this transition
00:56:51.100 | from printing to assembling.
00:56:53.620 | So the real breakthrough technologically
00:56:56.180 | isn't additive versus subtractive,
00:56:58.220 | which is a subject of a lot of attention and hype.
00:57:01.380 | 3D printers are useful.
00:57:03.980 | We spun off companies like Formlabs
00:57:07.860 | led by Max for 3D printing,
00:57:09.780 | but in a Fab Lab, it's one of maybe 10 machines.
00:57:12.300 | It's used, but it's only part of the machines.
00:57:15.060 | The real technological change is when we go
00:57:17.500 | from printing and cutting to assembling and disassembling,
00:57:22.100 | but that reduces inventories of hundreds of thousands
00:57:26.500 | of parts to just having a few parts to make almost anything.
00:57:29.780 | It reduces global supply chains
00:57:31.820 | to locally sourcing these building blocks.
00:57:34.100 | But one of the key implications is it gets rid
00:57:36.940 | of technological trash because you can disassemble
00:57:40.740 | and reuse the parts, not throw them away.
00:57:43.180 | And so initially that's of interest for things
00:57:45.820 | at the end of long supply chains, like satellites on orbit.
00:57:48.660 | But one of the things coming is eliminating technical trash
00:57:51.660 | through reuse of the building blocks.
00:57:54.340 | - So like when you think about 3D printers,
00:57:56.780 | you're thinking about addition and subtraction.
00:57:59.900 | When you think about the other options available to you
00:58:03.580 | in that parameter space, as you call it,
00:58:05.780 | that's going to be assembly, disassembly, cutting, you said?
00:58:09.460 | - So the 1952 NC Mill was subtractive.
00:58:14.300 | You remove material.
00:58:15.620 | And 3D printing additive, and there's a couple of claims
00:58:20.080 | to the invention of 3D printing,
00:58:21.760 | that's closer to what's called net shape,
00:58:23.580 | which is you don't have to cut away the material
00:58:25.420 | you don't need, you just put material where you do need it.
00:58:27.620 | And so that's the 3D printing revolution.
00:58:30.060 | But there are all sorts of limitations on 3D printing
00:58:34.540 | to the kinds of materials you can print,
00:58:38.140 | the kind of functionality you can print.
00:58:40.380 | We're just not going to get to making
00:58:42.700 | everything in a cell phone on a single printer.
00:58:47.820 | But I do expect to make everything in a cell phone
00:58:50.120 | with an assembler.
00:58:51.140 | And so instead of printing and cutting,
00:58:53.460 | technologically it's this transition
00:58:55.160 | to assembling and disassembling.
00:58:57.700 | Going back to Shannon and von Neumann,
00:59:00.260 | going back to the ribosome 4 billion years ago.
00:59:03.460 | Now, you come to malevolent.
00:59:06.380 | Let me tell you a story about,
00:59:08.980 | I was doing a briefing for the National Academy
00:59:13.980 | of Sciences group that advises the intelligence communities.
00:59:19.020 | And I talked about the kind of research we do.
00:59:21.740 | And at the very end, I showed a little video clip
00:59:24.880 | of Valentina in Ghana, making, a local girl,
00:59:29.600 | making surface mount electronics in the Fab Lab.
00:59:33.420 | And I showed that to this room full of people.
00:59:36.340 | One of the members of the intelligence community
00:59:39.100 | got up livid and said, how dare you waste our time
00:59:43.420 | showing us a young girl in an African village
00:59:45.560 | making surface mount electronics.
00:59:47.480 | We're looking at, we need to know about disruptive threats
00:59:51.000 | to the future of the United States.
00:59:52.740 | And somebody else got up in the room and yelled at him,
00:59:56.620 | 'You idiot, I can't think of anything
00:59:58.820 | more important than this.'
01:00:00.340 | But for two reasons.
01:00:01.800 | One reason was, because if we rely on like,
01:00:06.800 | informational superiority in the battlefield,
01:00:09.560 | it means other people could get access to it.
01:00:11.940 | But this intelligence person's point, bless him,
01:00:15.000 | wasn't that, it was getting at the root causes of conflict.
01:00:20.000 | Is if this young girl in an African village
01:00:22.220 | could actually master surface mount electronics,
01:00:25.060 | it changes some of the most fundamental things
01:00:28.080 | about recruitment for terrorism,
01:00:31.880 | impact of economic migration,
01:00:34.940 | basic assumptions about an economy.
01:00:36.840 | It's just existential for the future of the planet.
01:00:40.600 | - But we've just lived through a pandemic.
01:00:44.680 | I would love to linger on this,
01:00:46.400 | 'cause the possibilities that are positive are endless.
01:00:50.400 | But the possibilities that are negative
01:00:52.040 | are still nevertheless extremely important.
01:00:54.320 | What's both positive and negative?
01:00:56.480 | What do you do with a large number of general assemblers?
01:01:01.480 | - Yeah, with the Fab Lab, you could roughly make a biolab,
01:01:04.920 | then learn biotechnology.
01:01:06.520 | Now that's terrifying, because making self-reproducing
01:01:10.640 | grey goo that outcompetes biology,
01:01:14.280 | I consider doomed because biology knows everything
01:01:17.520 | I'm describing and is really good at what it does.
01:01:20.020 | In How to Grow (Almost) Anything,
01:01:25.840 | you learn skills in biotechnology
01:01:28.920 | that let you make serious biological threats.
01:01:32.800 | - And when you combine some of the innovations
01:01:36.520 | you see with large language models,
01:01:38.080 | some of the innovations you see with AlphaFold,
01:01:40.680 | so applications of AI for designing biological systems,
01:01:44.880 | for writing programs, which you can
01:01:48.400 | with large language models increasingly.
01:01:50.240 | So there seems to be an interesting dance here
01:01:53.040 | of automating the design stage of complex systems using AI.
01:01:58.040 | And then that's the bits.
01:02:02.440 | And you can leap, now the innovations you're talking about,
01:02:05.160 | you can leap from the complex systems in the digital space
01:02:08.520 | to the printing, to the creation, to the assembly
01:02:12.560 | at scale of complex systems in the physical space.
01:02:18.120 | - Yeah, so something to be scared about is
01:02:21.800 | a Fab Lab can make a biolab,
01:02:23.360 | a biolab can make biotechnology,
01:02:25.520 | somebody could learn to make a virus.
01:02:28.960 | That's scary.
01:02:30.040 | That's, unlike some of the things I said
01:02:32.840 | I don't worry about, that's something
01:02:34.760 | I really worry about that is scary.
01:02:37.080 | Now, how do you deal with that?
01:02:38.960 | Prior threats we dealt with command and control.
01:02:45.560 | So like early color copiers had unique codes
01:02:51.920 | and you could tell which copier made them.
01:02:53.600 | Eventually you couldn't keep up with that.
01:02:56.400 | There was a famous meeting at Asilomar
01:02:59.480 | in the early days of recombinant DNA
01:03:02.560 | where that community recognized the dangers
01:03:06.480 | of what it was doing and put in place a regime
01:03:09.800 | to help manage it.
01:03:11.240 | And so that led to the kind of research management.
01:03:14.520 | So MIT has an office that supervises research
01:03:18.080 | and it works with the national office.
01:03:19.840 | That works if you can identify who's doing it and where.
01:03:23.480 | It doesn't work in this world we're describing.
01:03:26.400 | So anybody could do this anywhere.
01:03:28.880 | And so what we found is you can't contain this.
01:03:33.880 | It's already out.
01:03:35.920 | You can't forbid because there isn't command and control.
01:03:39.280 | The most useful thing you can do
01:03:41.540 | is provide incentives for transparency.
01:03:44.300 | But really the heart of what we do is
01:03:48.360 | you could do this by yourself in a basement
01:03:51.480 | for nefarious reasons,
01:03:53.880 | or you could come into a place in the light
01:03:56.120 | where you get help and you get community
01:03:57.880 | and you get resources.
01:03:59.600 | And there's an incentive to do it in the open,
01:04:02.520 | not in the dark.
01:04:03.480 | And that might sound naive,
01:04:06.280 | but in the sort of places we're working,
01:04:10.260 | again, bad people do bad things in these places already,
01:04:16.480 | but providing openness and providing transparency
01:04:20.800 | is a key part of managing these.
01:04:22.480 | And so it transitions from regulating risks as regulation
01:04:27.280 | to soft power to manage them.
01:04:30.080 | - So there's so much potential for good,
01:04:32.040 | so much capacity for good,
01:04:33.800 | that Fab Labs and the ability
01:04:38.160 | and the tools of creation really unlock that potential.
01:04:44.240 | - Yeah, and I don't say that as sort of dewy-eyed naive.
01:04:48.120 | I say that empirically from just years
01:04:51.080 | of seeing how this plays out in communities.
01:04:52.720 | - I wonder if it's the early days of personal computers,
01:04:54.760 | though, before we get spam, right?
01:04:57.400 | - In the end, most fundamentally,
01:05:00.640 | literally the mother of all problems
01:05:05.480 | is who designed us.
01:05:09.980 | So assume success in that we're gonna transition
01:05:14.980 | to the machines making machines,
01:05:17.180 | and all of these new sort of social systems we're describing
01:05:20.460 | will help manage them and curate them and democratize them.
01:05:23.800 | If we close the gap I just led off with
01:05:28.700 | of 10 to the 10 to 10 to the 18 between a chip fab and you,
01:05:33.540 | we're ultimately in marrying communication,
01:05:37.820 | computation, and fabrication,
01:05:39.340 | gonna be able to create unimaginable complexity.
01:05:42.280 | And how do you design that?
01:05:48.280 | And so I'd say the deepest of all questions
01:05:53.280 | that I've been working on
01:05:55.380 | goes back to the oldest part of our genome.
01:06:00.520 | So in our genome, what are called Hox genes,
01:06:06.100 | and these are morphogenes.
01:06:09.680 | And nowhere in your genome is the number five.
01:06:15.580 | It doesn't store the fact that you have five fingers.
01:06:19.420 | And what it stores is what's called
01:06:20.860 | a developmental program.
01:06:22.440 | It's a series of steps, and the steps have the character
01:06:25.900 | of like grow up a gradient or break symmetry.
01:06:30.420 | And at the end of that developmental program,
01:06:32.560 | you have five fingers.
01:06:34.500 | So you are stored not as a body plan,
01:06:39.500 | but as a growth plan.
01:06:44.500 | And there's two reasons for that.
01:06:46.540 | One reason is just compression.
01:06:49.660 | Billions of genes can place trillions of cells.
01:06:53.960 | But the much deeper one is evolution
01:06:57.460 | doesn't randomly perturb.
01:07:00.380 | Almost anything you did randomly in the genome
01:07:02.660 | would be fatal or inconsequential, but not interesting.
01:07:06.940 | But when you modify things in these developmental programs,
01:07:11.040 | you go from like webs for swimming to fingers,
01:07:14.220 | or you go from walking to wings for flying.
01:07:17.180 | It's a space in which search is interesting.
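The growth-plan idea can be sketched in code. The toy below uses an L-system rewrite rule as a stand-in for a developmental program: the "genome" stores a few bytes of rewrite rule, and the "body" emerges only by running the program. The rule and symbols here are illustrative inventions, not anything from actual Hox-gene biology.

```python
def grow(axiom, rules, steps):
    """Run a developmental program: each pass rewrites every symbol in
    the body according to the genome's rules."""
    body = axiom
    for _ in range(steps):
        body = "".join(rules.get(c, c) for c in body)
    return body

# A few bytes of "genome": grow a segment (F), then branch left and right.
limb_genome = {"X": "F[+X][-X]"}

# Ten developmental steps turn one symbol into thousands of "cells";
# the organism is stored as growth steps, not as the finished body.
body = grow("X", limb_genome, 10)
print(len(body))

# Mutating the *program* (also thicken every segment each step) yields a
# coherent new morphology, unlike randomly perturbing the finished body.
webbed_genome = {"X": "F[+X][-X]", "F": "FF"}
```

A one-character change to the rule reshapes the whole morphology in a structured way, which is why search over programs is interesting while random search over finished bodies is not.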
01:07:19.700 | So this is the heart of the success of AI.
01:07:24.700 | In part, it was the scaling we talked about a while ago.
01:07:30.640 | And in part, it was the representations
01:07:33.900 | for which search is effective.
01:07:37.000 | AI has found good representations.
01:07:41.140 | It hasn't found new ways to search,
01:07:43.020 | but it's found good representations of search.
01:07:45.500 | - And you're saying that's what biology,
01:07:47.300 | that's what evolution has done,
01:07:49.860 | is created representations, structures,
01:07:52.580 | biological structures through which search is effective.
01:07:56.260 | - And so the developmental programs in the genome
01:08:01.080 | beautifully encapsulate the lessons of AI.
01:08:05.440 | And this is, it's molecular intelligence.
01:08:08.720 | It's AI embodied in our genome.
01:08:11.720 | It's every bit as profound as the cognition in our brain,
01:08:16.480 | but now this is sort of thinking in,
01:08:18.880 | molecular thinking in how you design.
01:08:22.440 | And so I'd say the most fundamental problem
01:08:26.200 | we're working on is, it's kind of tautological
01:08:29.720 | that when you design a phone, you design the phone,
01:08:33.400 | you represent the design of the phone.
01:08:35.320 | But that actually fails when you get to the sort
01:08:37.960 | of complexity that we're talking about.
01:08:40.400 | And so there's this profound transition to come.
01:08:43.480 | Once I can have self-reproducing assemblers,
01:08:46.280 | placing 10 to the 18 parts,
01:08:49.400 | you need to not sort of metaphorically,
01:08:53.180 | but create life in that you need to learn how to evolve.
01:08:58.180 | But evolutionary design has a really misleading,
01:09:01.840 | trivial meaning.
01:09:04.460 | It's not as simple as you randomly mutate things.
01:09:08.500 | It's this much more deep embodiment of AI and morphogenesis.
01:09:13.500 | - Is there a way for us to continue
01:09:16.400 | the kind of evolutionary design that led us to this place
01:09:19.020 | from the early days of bacteria, single cell organism,
01:09:21.800 | to ribosomes and the 20 amino acids?
01:09:24.460 | - You mean for human augmentation?
01:09:26.860 | - For life, I mean, what would you call assemblers
01:09:30.580 | that are self-replicating and placing parts?
01:09:32.580 | What is that?
01:09:33.420 | The dynamic complex things built with digital fabrication,
01:09:39.860 | what is that?
01:09:40.700 | That's life.
01:09:41.580 | - Yeah, so ultimately, absolutely,
01:09:44.180 | if you add up everything I'm talking about,
01:09:47.580 | it's building up to creating life in non-living materials.
01:09:51.500 | And I don't view this as copying life.
01:09:56.100 | I view it as deriving life.
01:09:58.280 | I didn't start from how does biology work
01:10:00.760 | and then I'm gonna copy it.
01:10:02.660 | I start from how to solve problems
01:10:05.940 | and then it leads me to, in a sense, rediscover biology.
01:10:09.660 | So if you go back to Valentina in Ghana
01:10:13.020 | making her circuit board,
01:10:15.060 | she still needs a chip fab very far away
01:10:17.100 | to make the processor on her circuit board.
01:10:19.860 | For her to make the processor locally,
01:10:22.660 | for all the reasons we described,
01:10:24.800 | you actually need the deep things
01:10:26.940 | we were just talking about.
01:10:29.340 | And so it really does lead you.
01:10:32.100 | So let's see, there's a wonderful series of books by Gingery.
01:10:37.100 | Book one is "How to Make a Charcoal Furnace"
01:10:41.160 | and at the end of book seven, you have "A Machine Shop."
01:10:44.680 | So it's sort of how you do
01:10:47.440 | your own personal industrial revolution.
01:10:50.340 | ISRU is what NASA calls in situ resource utilization.
01:10:54.940 | And that's how do you go to a planet
01:10:56.720 | and create a civilization.
01:10:59.880 | ISRU has essentially assumed Gingery.
01:11:02.820 | You go through the industrial revolution
01:11:04.860 | and you create the inventory of 100,000 resistors.
01:11:08.540 | What we're finding is the way,
01:11:10.500 | the minimum building blocks for a civilization
01:11:13.660 | is roughly 20 parts.
01:11:16.900 | So what's interesting about the amino acids
01:11:18.780 | is they're not interesting.
01:11:20.360 | They're hydrophobic or hydrophilic, basic or acidic.
01:11:23.780 | They have typical but not extremal properties,
01:11:26.420 | but they're good enough you can combine them to make you.
01:11:29.700 | So what this is leading towards
01:11:31.860 | is technology doesn't need enormous global supply chains.
01:11:36.860 | It just needs about 20 properties you can compose
01:11:40.140 | to create all technology as the minimum building blocks
01:11:43.740 | for a technological civilization.
01:11:45.900 | - So there's going to be 20 basic building blocks
01:11:49.420 | based on which the self-replicating assemblers can work.
01:11:52.900 | - Right, and I say that not philosophically,
01:11:55.180 | just empirically, sort of that's where it's heading.
01:11:58.380 | And I like thinking about how you bootstrap
01:12:02.900 | a civilization on Mars, that problem.
01:12:04.820 | There's a fun video on bonus material for the movie
01:12:07.980 | with a neat group of people.
01:12:09.580 | We talk about it because it has really profound implications
01:12:12.700 | back here on Earth about how we live sustainably.
01:12:16.240 | - What does that civilization on Mars look like
01:12:18.680 | that's using ISRU, that's using these 20 building blocks
01:12:22.600 | and does self-assembly?
01:12:24.120 | - Yeah, go through primary, secondary, tertiary, quaternary.
01:12:27.120 | You extract properties like conducting, insulating,
01:12:33.800 | semi-conducting, magnetic, dielectric, flexural.
01:12:39.000 | These are the kind of roughly 20 properties.
01:12:44.320 | With those, those are enough for us to assemble logic
01:12:49.320 | and they're enough for us to assemble actuation.
01:12:53.440 | With logic and actuation, we can make micro robots.
01:12:58.680 | The micro robots can build bigger robots.
01:13:03.240 | The bigger robots can then take the building block materials
01:13:07.560 | and make the structural elements that you then do
01:13:10.960 | to make construction.
01:13:12.520 | Then you boot up through the stages
01:13:14.080 | of a technological civilization.
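The bootstrap ladder just described (basic material properties, then logic and actuation, then micro robots, then bigger robots and structures) can be sketched as a composition check. The property names follow the conversation; the recipes themselves are hypothetical placeholders, not the lab's actual part inventory.

```python
# Roughly 20 basic part properties; a handful shown here.
PART_TYPES = {
    "conducting", "insulating", "semiconducting",
    "magnetic", "dielectric", "flexural",
}

# Hypothetical recipes: each level composes parts from the level below.
RECIPES = {
    "logic":     {"conducting", "insulating", "semiconducting"},
    "actuation": {"conducting", "magnetic", "flexural"},
    "microbot":  {"logic", "actuation"},
    "structure": {"flexural", "insulating"},
}

def can_build(target, available):
    """A target is buildable if it is a raw part on hand, or if every
    ingredient in its recipe is itself buildable (recursive composition)."""
    if target in available:
        return True
    needs = RECIPES.get(target)
    return needs is not None and all(can_build(n, available) for n in needs)

print(can_build("microbot", PART_TYPES))  # robots bootstrap from basic parts
```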
01:13:16.400 | - By the way, where in the span of logic and actuation
01:13:20.560 | did the sensing come in?
01:13:22.400 | - Oh, I skipped over that.
01:13:23.520 | But my favorite sensor is a step response.
01:13:28.520 | If you just make a step and measure the response
01:13:32.400 | to the electric field, that ranges from user interfaces
01:13:37.240 | to positioning to material properties.
01:13:40.160 | If you do it at higher frequencies, you get chemistry.
01:13:43.320 | And you can get all of that just from a step
01:13:45.360 | in an electric field.
01:13:46.840 | So for example, once you have time resolution in logic,
01:13:51.840 | something as simple as two electrodes
01:13:54.640 | let you do amazingly capable sensing.
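A minimal sketch of step-response sensing: treat the two electrodes as a capacitor charging through a resistor, apply a voltage step, and recover the capacitance from the rise time. The component values are assumed for illustration; the point is that a nearby finger or a change in material raises C and measurably slows the rise.

```python
import math

def step_response(t, r, c, v_step=1.0):
    """Electrode voltage t seconds after a step drives the RC pair."""
    return v_step * (1.0 - math.exp(-t / (r * c)))

def estimate_c(samples, dt, r, v_step=1.0):
    """Recover C from a sampled step response: the time to reach 63.2%
    of the step is one time constant, RC."""
    target = v_step * (1.0 - math.exp(-1.0))
    for i, v in enumerate(samples):
        if v >= target:
            return i * dt / r
    return None

r = 1e6      # assumed 1 Mohm sense resistor
dt = 1e-7    # assumed 10 MHz sampling
for c_true in (10e-12, 15e-12):   # free electrodes vs. finger nearby
    samples = [step_response(i * dt, r, c_true) for i in range(1000)]
    print(c_true, estimate_c(samples, dt, r))
```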
01:13:58.220 | So we've been talking about all the work I do.
01:14:00.660 | There's a story about how it happens.
01:14:04.360 | Where do ideas come from?
01:14:06.520 | - That's an interesting story.
01:14:07.520 | Where do ideas come from?
01:14:08.600 | So I had mentioned Vannevar Bush.
01:14:12.160 | And he wrote a really influential thing
01:14:17.000 | called The Endless Frontier.
01:14:18.560 | So science won World War II.
01:14:22.240 | The more known story is nuclear bombs.
01:14:26.360 | The less well-known story is the Rad Lab.
01:14:28.920 | So at MIT, an amazing group of people invented radar,
01:14:32.680 | which is really credited as winning the war.
01:14:35.280 | So after the war, a grand old man from MIT
01:14:39.480 | was charged with: science won the war,
01:14:44.480 | how do we maintain that edge?
01:14:48.280 | And the report he wrote led
01:14:50.640 | to the National Science Foundation.
01:14:53.520 | And the modern notion we take for granted,
01:14:55.420 | but didn't really exist before then,
01:14:57.160 | of public funding of research or research agencies.
01:15:02.600 | In it, he made, again, what I consider an important mistake,
01:15:07.240 | which is he described basic research
01:15:11.080 | leads to applied research, leads to applications,
01:15:15.100 | leads to commercialization, leads to impact.
01:15:18.560 | And so we need to invest in that pipeline.
01:15:21.960 | The reason I consider it a mistake
01:15:25.960 | is almost all of the examples we've been talking about
01:15:30.920 | in my lab went backwards,
01:15:33.320 | that the basic research came from applications.
01:15:36.660 | And further, almost all of the examples
01:15:41.640 | we've been talking about came fundamentally from mistakes.
01:15:46.240 | So essentially everything I've ever worked on has failed,
01:15:51.240 | but in failing, something better happened.
01:15:56.080 | So the way I like to describe it is ready, aim, fire
01:16:00.140 | is you do your homework,
01:16:01.640 | you aim carefully at a target you want to accomplish,
01:16:06.920 | and if everything goes right,
01:16:08.080 | you then hit the target and succeed.
01:16:10.260 | What I do, you can think of as ready, fire, aim.
01:16:15.140 | So you do a lot of work to get ready,
01:16:18.240 | then you close your eyes,
01:16:19.960 | and you don't really think about where you're aiming,
01:16:22.720 | but you look very carefully at where you did aim,
01:16:25.120 | you aim after you fire.
01:16:29.040 | And the reason that's so important is
01:16:31.940 | if you do ready, aim, fire,
01:16:35.140 | the best you can hope is hit what you aim at.
01:16:38.520 | So let me give you some examples.
01:16:40.480 | 'Cause this is a source of great--
01:16:43.560 | - You're full of good lines today.
01:16:45.440 | - Source of great frustration.
01:16:47.120 | So I mentioned the early quantum computing.
01:16:51.240 | So quantum computing is this power
01:16:53.420 | of using quantum mechanics to make computers
01:16:55.640 | that for some problems are dramatically more powerful
01:16:58.840 | than classical computers.
01:17:01.000 | Before it started,
01:17:02.720 | there was a really interesting group of people
01:17:05.240 | who knew a lot about physics and computing
01:17:09.520 | that were inventing what became quantum computing
01:17:12.560 | before it was clear anything,
01:17:14.120 | there was an opportunity there.
01:17:15.600 | It was just studying how those relate.
01:17:18.320 | Here's how it fits to the ready, fire, aim.
01:17:21.340 | I was doing really short-term work in my lab
01:17:24.240 | on shoplifting tags.
01:17:27.640 | This was really before there was modern RFID.
01:17:30.920 | And so how you put tags in objects to sense them,
01:17:35.400 | something we just take for granted commercially.
01:17:37.480 | And there was a problem of how you can sense
01:17:39.620 | multiple objects at the same time.
01:17:41.500 | And so I was studying how you can remotely sense materials
01:17:47.920 | to make low-cost tags that could let you
01:17:50.960 | distinguish multiple objects simultaneously.
01:17:53.720 | To do that, you need non-linearity
01:17:55.920 | so that the signal is modulated.
01:17:58.220 | And so I was looking for material sources of non-linearity
01:18:03.120 | and that led me to look at how nuclear spins interact.
01:18:08.120 | Just for spin resonance.
01:18:13.800 | The sort of things you use when you go in an MRI machine.
01:18:17.560 | And so I was studying how to use that.
01:18:19.640 | And it turns out that it was a bad idea.
01:18:22.120 | You couldn't remotely use it for
01:18:25.960 | shoplifting tags, but I realized you could compute.
01:18:30.960 | And so with a group of colleagues
01:18:36.120 | thinking about early quantum computing,
01:18:37.980 | like David DiVincenzo and Charlie Bennett,
01:18:39.940 | was articulating what are the properties
01:18:41.780 | you need to compute.
01:18:43.160 | And then looking at how to make the tags.
01:18:45.440 | It turns out the tags were a terrible idea
01:18:48.240 | for sensing objects in a supermarket checkout.
01:18:54.200 | But I realized they were computing.
01:18:56.440 | So with Ike Chuang and a few other people,
01:18:58.520 | we realized we could program nuclear spins to compute.
01:19:02.160 | And so that's what we used to do Grover's search algorithm.
01:19:05.640 | And then it was used for Shor's factoring algorithm.
01:19:08.920 | And it worked out.
01:19:10.200 | The systems we did it in, nuclear magnetic resonance,
01:19:12.880 | don't scale beyond a few qubits.
01:19:15.780 | But the techniques have lived on.
01:19:18.120 | And so all the current quantum computing techniques
01:19:21.940 | grew out of the ways we would talk to these spins.
01:19:26.400 | But I'm telling this whole story
01:19:28.280 | because it came from a bad way to make a shoplifting tag.
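Grover's algorithm, which those NMR experiments ran on nuclear spins, is easy to simulate classically at small sizes. The sketch below tracks the statevector amplitudes directly: the oracle flips the phase of the marked item, and the diffusion step inverts every amplitude about the mean. For N = 4, a single iteration lands on the marked item with probability 1.

```python
def grover_search(n_items, marked, iterations):
    """Classical statevector simulation of Grover's search."""
    amp = [1.0 / n_items ** 0.5] * n_items      # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]              # oracle: phase-flip the target
        mean = sum(amp) / n_items
        amp = [2.0 * mean - a for a in amp]     # diffusion: invert about mean
    return [a * a for a in amp]                 # measurement probabilities

probs = grover_search(4, marked=2, iterations=1)
print(probs)  # [0.0, 0.0, 1.0, 0.0]: one query suffices for N = 4
```

A classical search over 4 items needs about 2 queries on average; Grover finds the marked item here in 1, and the advantage grows as the square root of N.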
01:19:32.200 | - Starting with an application,
01:19:33.960 | mistakes led to-- - The fundamental science.
01:19:36.520 | - Fundamental science.
01:19:37.840 | I mean, can you just link on that?
01:19:39.360 | I mean, just using nuclear spins to do computation,
01:19:43.800 | like what gave you the guts to try to think through this?
01:19:50.800 | From a digital fabrication perspective, actually,
01:19:53.380 | how to leap from one to the other.
01:19:54.940 | - I wouldn't call it guts, I would call it collaboration.
01:19:57.060 | So at IBM, there was this amazing group of,
01:20:02.060 | like I mentioned, Charlie Bennett and David DiVincenzo
01:20:04.980 | and Rolf Landauer and Nabil Amer.
01:20:07.180 | And these were all gods of thinking
01:20:10.660 | about physics and computing.
01:20:12.980 | So I yelled at the whole computer industry
01:20:18.140 | being based on a fiction, "Metropolis."
01:20:22.840 | Programmers frolicking in the garden
01:20:24.280 | while somebody moves levers in the basement.
01:20:26.340 | There's a complete parallel history
01:20:28.760 | of Maxwell to Boltzmann to Szilard to Landauer to Bennett.
01:20:33.760 | Most people won't know most of these names,
01:20:39.200 | but this whole parallel history,
01:20:40.560 | thinking deeply about how computation and physics relate.
01:20:44.320 | So I was collaborating with that whole group of people.
01:20:48.700 | And then at MIT, I was in this high traffic environment.
01:20:53.360 | I wasn't deeply inspired to think about
01:20:55.780 | better ways to detect shoplifting tags,
01:20:57.700 | but stumbled across companies that needed help with that
01:21:01.180 | and was thinking about it.
01:21:02.660 | And then I realized those two worlds intersected
01:21:05.180 | and we could use the failed approach
01:21:07.660 | for the shoplifting tags
01:21:09.820 | to make early quantum computing algorithms.
01:21:14.560 | - And this kind of stumbling is fundamental
01:21:16.240 | to the Fab Lab idea, right?
01:21:18.040 | - Right.
01:21:18.880 | Here's one more example.
01:21:19.960 | With a student, Manu, we talked about ribosomes
01:21:23.000 | and I was trying to build a ribosome
01:21:24.880 | that worked on fluids so that I could place
01:21:29.760 | the little parts we're talking about.
01:21:31.840 | And it kept failing 'cause bubbles would come
01:21:35.500 | into our system and the bubbles would make
01:21:37.360 | the whole thing stop working.
01:21:38.960 | And we spent about half a year trying
01:21:40.340 | to get rid of the bubbles.
01:21:42.300 | Then Manu said, "Wait a minute.
01:21:44.100 | "The bubbles are actually better than what we're doing.
01:21:47.200 | "We should just use the bubbles."
01:21:48.760 | And so we invented how to do universal logic
01:21:52.460 | with little bubbles in fluid.
01:21:55.740 | - Okay, you have to explain this microfluidic bubble logic.
01:21:59.680 | Please, how does this work?
01:22:01.060 | - So--
01:22:01.900 | - It's super interesting.
01:22:03.860 | - Yeah, and so I'll come back and explain it.
01:22:06.560 | But what it led to was, we showed what fluids could do.
01:22:10.620 | It'd been known fluids could do logic,
01:22:13.900 | like your old automobile transmissions do logic,
01:22:17.520 | but that's macroscopic.
01:22:18.640 | It didn't work at little scales.
01:22:19.960 | We showed with these bubbles,
01:22:20.940 | we could do it at little scales.
01:22:22.760 | Then I'm gonna come back and explain it.
01:22:24.480 | But what came out of that is Manu then showed
01:22:27.320 | you could make a 50-cent microscope using little bubbles.
01:22:30.960 | And then the techniques we developed are what we used
01:22:34.520 | to transplant genomes to make synthetic life.
01:22:37.840 | It all came out of the failure of trying to make
01:22:42.200 | the ribosome.
01:22:43.640 | Now, so the way the bubble logic works is
01:22:46.560 | in a little channel,
01:22:50.360 | fluid at small scales is fairly viscous.
01:22:56.720 | Think of it as sort of like pushing Jell-O.
01:23:02.320 | If a bubble gets stuck, the fluid has to detour around it.
01:23:06.820 | So now imagine a channel that has two wells and one bubble.
01:23:12.920 | If the bubble is in one well,
01:23:16.880 | the fluid has to go in the other channel.
01:23:19.520 | If the bubble is in the other well,
01:23:20.920 | it has to go in the first channel.
01:23:23.240 | So the position of the bubble can switch,
01:23:28.240 | it's a switch, it can switch the fluid between two channels.
01:23:31.920 | So now we have one element of switch.
01:23:34.120 | And it's also a memory because you can detect
01:23:36.480 | whether or not a bubble is stored there.
01:23:39.080 | Then if two bubbles meet,
01:23:42.080 | if you have two channels crossing,
01:23:46.060 | a bubble can go through one way
01:23:47.640 | or a bubble can go through the other way.
01:23:49.680 | But if two bubbles come together,
01:23:52.560 | then they push on each other and one goes one way
01:23:54.720 | and one goes the other way.
01:23:56.120 | That's a logic operation, that's a logic gate.
01:23:59.200 | So we now have a switch, we have a memory
01:24:01.080 | and we have a logic gate and that's everything you need
01:24:02.880 | to make a universal computer.
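The switch, memory, and collision gate described above can be caricatured in a few lines of code — a toy boolean sketch of the logic, not of the fluid dynamics; the function names and decomposition here are illustrative assumptions based on the description above:

```python
# Toy boolean sketch of microfluidic bubble logic (not fluid dynamics).
# A bubble is a 1; absence of a bubble is a 0.

def switch(bubble_in_well: bool) -> int:
    """A bubble stuck in the well blocks its channel, diverting the
    viscous flow into the other channel: channel 1 if blocked, else 0."""
    return 1 if bubble_in_well else 0

def and_or_gate(a: bool, b: bool) -> tuple[bool, bool]:
    """Two bubbles arriving together push each other apart, so one is
    forced into the less-preferred branch; a lone bubble takes the
    preferred branch. One output carries a AND b, the other a OR b.
    Unlike electrons in a wire, bubbles are conserved: count in == count out."""
    return (a and b, a or b)

for a in (False, True):
    for b in (False, True):
        and_out, or_out = and_or_gate(a, b)
        # bubble conservation, as with billiard-ball logic
        assert int(a) + int(b) == int(and_out) + int(or_out)
        print(int(a), int(b), "->", int(and_out), int(or_out))
```

The memory is just the presence or absence of a stored bubble, so the switch plus this collision gate gives the three elements listed above.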
01:24:05.240 | - I mean the fact that you did that with bubbles
01:24:07.540 | in microfluid, just kind of brilliant.
01:24:12.160 | - Well, so to stay with that example,
01:24:14.800 | what we proposed to do was to make a fluidic ribosome
01:24:20.080 | and the project crashed and burned.
01:24:21.760 | It was a disaster.
01:24:23.280 | This is what came out of it.
01:24:26.180 | And so it was precisely ready, fire, aim
01:24:30.680 | and that we had to do a lot of homework
01:24:32.760 | to be able to make these microfluidic systems.
01:24:36.000 | The fire part was we didn't think too hard
01:24:38.920 | about making the ribosome, we just tried to do it.
01:24:41.520 | The aim part was we realized the ribosome failed
01:24:43.840 | but something better had happened.
01:24:45.560 | And if you look all across research funding,
01:24:48.740 | research management, it doesn't anticipate this.
01:24:53.480 | So fail fast is familiar,
01:24:56.360 | but fail fast tends to miss ready and aim.
01:25:00.760 | You can't just fail, you have to do your homework
01:25:03.560 | before the fail part and you have to do the aim part
01:25:06.720 | after the fail part.
01:25:08.080 | And so the whole language of research
01:25:09.960 | is about like milestones and deliverables.
01:25:12.180 | That works when you're going down a straight line
01:25:14.960 | but it doesn't work for this kind of discovery.
01:25:17.560 | And to leap to something you said that's really important
01:25:20.240 | is I view part of what the Fab Lab Network is doing
01:25:23.920 | is giving more people the opportunity to fail.
01:25:28.920 | - You've said that geometry is really important in biology.
01:25:34.000 | What does fabrication biology look like?
01:25:39.840 | Why is geometry important?
01:25:41.760 | - So molecular biology is dominated by geometry.
01:25:46.400 | That's why the protein folding is so important,
01:25:48.400 | that the geometry gives the function
01:25:53.160 | and there's this hierarchical construction
01:25:56.880 | of as you go through primary, secondary, tertiary, quaternary
01:26:00.280 | the shapes of the molecules
01:26:03.320 | make the shape of the molecular machines
01:26:05.920 | and they really are exquisite machines.
01:26:08.120 | If you look at how your muscles move,
01:26:13.640 | if you were to see a simulation of it,
01:26:15.560 | it would look like an improbable science fiction cyborg world
01:26:19.960 | of these little walking robots
01:26:22.300 | that walk on a discrete lattice.
01:26:24.040 | They're really exquisite machines.
01:26:26.240 | And then from there, there's this whole hierarchical stack
01:26:29.440 | of once you get to the top of that,
01:26:32.200 | you then start making organelles that make cells
01:26:36.000 | that make organs through the stack of that hierarchy.
01:26:39.400 | - Just stepping back, does it amaze you
01:26:43.480 | that from small building blocks
01:26:45.340 | like amino acids, the molecules you mentioned,
01:26:49.240 | let's go to the very beginning of hydrogen and helium
01:26:51.360 | at the start of this universe,
01:26:53.040 | that we're able to build up such complex
01:26:57.540 | and beautiful things like our human brain?
01:27:00.720 | - So studying thermodynamics,
01:27:04.720 | which is exactly this question: batteries run out
01:27:10.600 | and need recharging,
01:27:13.840 | equipment and cars get old and fail,
01:27:20.320 | yet life doesn't.
01:27:23.340 | And that's why there's a sense
01:27:26.400 | in which life seems to violate thermodynamics,
01:27:28.840 | although of course it doesn't.
01:27:30.120 | - It seems to resist the march towards entropy somehow.
01:27:33.300 | - Right, and so Maxwell,
01:27:36.440 | who helped give rise to the science of thermodynamics,
01:27:40.860 | posited a problem that was so infuriating
01:27:45.600 | it led to a series of suicides.
01:27:48.040 | There was a series of advisors and advisees,
01:27:52.420 | three in a row, who worked on this problem,
01:27:56.640 | that all ended up committing suicide.
01:27:59.080 | And Maxwell's demon is this simple but infamous problem
01:28:04.080 | where right now in this room we're surrounded by molecules
01:28:11.600 | and they run at different velocities.
01:28:14.480 | Imagine a container that has a wall
01:28:18.480 | and it's got gas on both sides and a little door.
01:28:21.600 | And at the door is a molecular-sized creature
01:28:26.000 | and it could watch the molecules coming,
01:28:28.280 | and when a fast molecule is coming it opens the door,
01:28:31.320 | when a slow molecule is coming it closes the door.
01:28:33.880 | After it does that for a while,
01:28:36.800 | one side is hot, one is cold.
01:28:39.160 | When one side is hot and one is cold you can make an engine
01:28:42.620 | and so you close that and you make an engine
01:28:44.880 | and you make energy.
01:28:45.920 | So the demon is violating thermodynamics
01:28:51.600 | because it's never touching the molecule
01:28:55.000 | yet by just opening and closing the door
01:28:58.600 | it can make arbitrary amounts of energy
01:29:00.840 | and power a machine.
01:29:03.520 | And in thermodynamics you can't do that.
01:29:05.320 | So that's Maxwell's demon.
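The demon's sorting can be sketched numerically — a toy Monte Carlo model, with speeds drawn from a one-dimensional Gaussian and an arbitrary threshold of 0.8 standing in for the demon's decision (both assumptions for illustration):

```python
import random

random.seed(0)

def demon_sort(speeds, threshold):
    """The demon opens the door for fast molecules and closes it for
    slow ones, sorting them without ever touching a molecule."""
    hot = [v for v in speeds if v > threshold]    # door opened
    cold = [v for v in speeds if v <= threshold]  # door closed
    return hot, cold

# molecule speeds: magnitudes of a 1-D thermal velocity distribution
speeds = [abs(random.gauss(0.0, 1.0)) for _ in range(10_000)]
hot, cold = demon_sort(speeds, threshold=0.8)

def mean_ke(vs):
    # kinetic energy goes as v^2 (mass factor dropped)
    return sum(v * v for v in vs) / len(vs)

print("hot side mean KE: ", round(mean_ke(hot), 3))
print("cold side mean KE:", round(mean_ke(cold), 3))
```

One side ends up measurably hotter than the other, which is exactly the temperature difference an engine could run on.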
01:29:09.380 | That problem is connected to everything
01:29:12.220 | we just spoke about for the last few hours.
01:29:14.300 | So Leo Szilard, around the early 1900s,
01:29:19.300 | was a deep physicist who then had a lot to do
01:29:27.500 | with also post-war anti-nuclear things.
01:29:32.820 | But he reduced Maxwell's demon to a single molecule.
01:29:38.780 | So the molecule, there's only one molecule
01:29:41.740 | and the question is which side of the partition is it on?
01:29:44.940 | That led to the idea of one bit of information.
01:29:48.980 | So Shannon credited Szilard's analysis of Maxwell's demon
01:29:53.580 | for the invention of the bit.
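Szilard's reduction can be written out in a few lines — a standard textbook derivation, not quoted from the conversation. Knowing which half of the volume $V$ the single molecule occupies (one bit) lets you insert a partition and extract work by isothermal expansion at temperature $T$, using the ideal-gas law $PV = k_B T$ for one molecule:

```latex
W = \int_{V/2}^{V} P(V') \, dV'
  = \int_{V/2}^{V} \frac{k_B T}{V'} \, dV'
  = k_B T \ln\frac{V}{V/2}
  = k_B T \ln 2 .
```

So one bit of knowledge is worth at most $k_B T \ln 2$ of extracted work, which is why Shannon's bit and Landauer's erasure cost carry the same factor.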
01:29:55.540 | For many years people tried to explain Maxwell's demon
01:30:00.960 | by like the energy in the demon looking at the molecule
01:30:04.940 | or the energy to open and close the door
01:30:08.500 | and nothing ever made sense.
01:30:11.100 | Finally, Rolf Landauer, one of the colleagues
01:30:14.580 | I mentioned at IBM, finally solved the problem.
01:30:18.400 | He showed that you can explain Maxwell's demon
01:30:23.940 | by you need the mind of the demon.
01:30:27.140 | When the demon opens and closes the door
01:30:33.100 | as long as it remembers what it did
01:30:36.120 | you can run the whole thing backwards.
01:30:39.620 | But when the demon forgets, then you can't run it backwards
01:30:44.620 | and that's where you get dissipation
01:30:49.140 | and that's where you get the violation of thermodynamics.
01:30:51.940 | And so the explanation of Maxwell's demon
01:30:54.480 | is that it's in the demon's brain.
01:30:57.540 | So then Rolf's colleague Charlie at IBM
01:31:03.600 | then shocked Rolf by showing you can compute
01:31:07.400 | with arbitrarily low energy.
01:31:10.080 | So one of the things that's not well covered
01:31:13.240 | is the big computers used for big machine learning,
01:31:18.240 | the data centers use tens of megawatts of power,
01:31:22.080 | they use as much power as a city.
01:31:23.800 | Charlie showed you can actually compute
01:31:27.780 | with arbitrarily low amounts of energy
01:31:30.860 | by making computers that can go backwards
01:31:33.140 | as well as forwards.
01:31:34.560 | And what limits the speed of the computer
01:31:39.180 | is how fast you want an answer
01:31:42.380 | and how certain you want the answer to be.
01:31:45.100 | But we're orders of magnitude away from that.
01:31:47.500 | So I have a student Cameron working with Lincoln Labs
01:31:50.400 | on making superconducting computers
01:31:53.180 | that operate near this Landauer limit
01:31:56.860 | that are orders of magnitude more efficient.
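The size of that gap can be checked on the back of an envelope. The Landauer limit itself is standard; the ~10⁻¹² J per operation figure for current large-scale hardware is an assumed round number for illustration, not from the conversation:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

# Landauer limit: minimum energy dissipated to erase one bit
landauer_limit = k_B * T * math.log(2)

# assumed rough figure for today's large-scale hardware (illustrative)
assumed_energy_per_op = 1e-12  # J

orders_of_magnitude = math.log10(assumed_energy_per_op / landauer_limit)
print(f"Landauer limit at 300 K: {landauer_limit:.2e} J/bit")
print(f"gap: ~{orders_of_magnitude:.1f} orders of magnitude")
```

With these assumptions the gap comes out to roughly eight and a half orders of magnitude, consistent with the "eight orders of magnitude" figure mentioned elsewhere in the conversation.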
01:31:59.060 | So stepping back to all of that,
01:32:01.880 | that whole tour was driven by your question about life.
01:32:05.660 | And right at the heart of it is Maxwell's demon.
01:32:10.000 | Life exists because it can locally violate thermodynamics.
01:32:14.360 | It can locally violate thermodynamics
01:32:16.460 | because of intelligence.
01:32:18.860 | And it's molecular intelligence.
01:32:21.700 | I would even go out on a limb to say
01:32:25.740 | we can already see we're beginning to come to the end
01:32:29.660 | of this current AI phase.
01:32:32.280 | So depending on how you count,
01:32:34.000 | this is I'd say the fifth AI boom-bust cycle.
01:32:36.960 | And you can already, it's exploding,
01:32:40.200 | but you can already see where it's heading,
01:32:42.840 | how it's going to saturate, what happens on the far side.
01:32:46.900 | The big thing that's not yet on the horizon
01:32:50.920 | is embodied AI, molecular intelligence.
01:32:55.920 | So to step back to this AI story,
01:32:59.980 | there was automation and that was gonna change everything.
01:33:04.980 | Then there were expert systems.
01:33:07.120 | There was then the first phase of the neural network systems.
01:33:14.000 | There've been about five of these.
01:33:15.700 | In each case, on the slope up,
01:33:19.340 | it's gonna change everything.
01:33:21.000 | Each case what happens is on the slope down,
01:33:24.780 | we sort of move the goalposts
01:33:28.360 | and it becomes sort of irrelevant.
01:33:29.920 | So a good example is going up,
01:33:33.280 | computer chess was gonna change everything.
01:33:35.340 | Once computers could play chess,
01:33:36.600 | that fundamentally changes the world.
01:33:38.540 | Now on the downside, computers play chess.
01:33:41.540 | Winning at chess is no longer seen as a unique human thing,
01:33:44.460 | but people still play chess.
01:33:48.160 | This new phase is gonna take a new chunk of things
01:33:51.240 | that we thought computers couldn't do,
01:33:53.020 | now computers will be able to do.
01:33:54.480 | They have roughly our brain capacity.
01:33:57.840 | But we'll keep thinking as well as computers.
01:34:01.180 | And as I described,
01:34:03.740 | while we've been going through these five boom busts,
01:34:06.040 | if you just look at the numbers of ops per second,
01:34:08.480 | bits of storage, bits of I/O, that's the more interesting one.
01:34:11.960 | That's been steady and that's what finally caught up
01:34:14.040 | to people.
01:34:14.960 | But as we've talked about a couple times,
01:34:17.920 | there's eight orders of magnitude to go,
01:34:20.720 | not in the intelligence in the transistors or in the brain,
01:34:24.160 | but in the embodied intelligence,
01:34:25.840 | in the intelligence in our body.
01:34:27.480 | - So the intelligent construction of physical systems
01:34:30.520 | that would embody the intelligence
01:34:32.680 | versus contain it within the computation.
01:34:34.840 | - Right, and there's a brain centrism
01:34:38.240 | that assumes our intelligence is centered in our brain.
01:34:42.840 | And in endless ways in this conversation,
01:34:45.200 | we've been talking about molecular intelligence.
01:34:47.480 | Our molecular systems do a deep kind
01:34:51.680 | of artificial intelligence.
01:34:53.040 | All the things you think of artificial intelligence doing
01:34:56.680 | in representing knowledge, storing knowledge,
01:35:01.680 | searching over knowledge, adapting to knowledge,
01:35:04.600 | our molecular systems do,
01:35:06.840 | but the output isn't just a thought, it's us.
01:35:11.080 | It's the evolution of us.
01:35:13.000 | And that's the real horizon to come is now embodying AI,
01:35:18.000 | not just a processor and a robot,
01:35:20.800 | but building systems that really can grow and evolve.
01:35:26.800 | - So we've been speaking about this boundary
01:35:29.560 | between bits and atoms.
01:35:31.760 | So let me ask you about one of the big mysteries
01:35:34.720 | of consciousness.
01:35:36.760 | Do you think it comes from somewhere between that boundary?
01:35:41.760 | - I won't name names,
01:35:43.760 | but if you know who I'm talking about,
01:35:46.080 | it's probably clear.
01:35:47.280 | I once did a drive, in fact,
01:35:49.960 | up to the Mussolini-era villa outside Torino
01:35:54.880 | in the early days of what became quantum computing
01:35:58.000 | with a famous person who thinks about
01:36:02.440 | quantum mechanics and consciousness.
01:36:04.400 | And we had the most infuriating conversation
01:36:07.680 | that went roughly along the lines of
01:36:10.280 | consciousness is weird, quantum mechanics is weird,
01:36:17.080 | therefore quantum mechanics explains consciousness.
01:36:20.520 | That was roughly the logical process.
01:36:24.320 | - And you're not satisfied with that process?
01:36:26.120 | - No, and I say that very precisely in the following sense.
01:36:29.560 | I was a program manager, somewhat by accident,
01:36:32.840 | in a DARPA program on quantum biology.
01:36:37.840 | And so biology trivially uses quantum mechanics
01:36:43.520 | in that we're made out of atoms.
01:36:45.760 | But the distinction is in quantum computing,
01:36:49.320 | quantum information, you need quantum coherence.
01:36:53.120 | And there's a lot of muddled thinking about
01:36:56.080 | collapse of the wave function
01:36:58.760 | and claims of quantum computing
01:37:00.520 | that garbles what is really just quantum coherence.
01:37:04.760 | You can think of it as a wave
01:37:09.120 | that has very special properties,
01:37:10.520 | but these wave-like properties.
01:37:12.720 | And so there's a small set of places
01:37:16.600 | where biology uses quantum mechanics in that deeper sense.
01:37:21.040 | One is how light is converted to energy in photosystems.
01:37:26.040 | It looks like one is olfaction,
01:37:29.200 | how your nose is able to tell different smells.
01:37:31.800 | Probably one has to do with how birds navigate,
01:37:37.200 | how they sense magnetic fields.
01:37:40.120 | That involves a coupling between a very weak energy
01:37:44.600 | with a magnetic field, coupling into chemical reactions.
01:37:48.040 | And there's a beautiful system.
01:37:50.640 | Standard in chemistry is magnetic fields like this
01:37:56.480 | can't influence chemistry,
01:37:58.160 | but there are biological circuits
01:38:00.000 | that are carefully balanced with two pathways
01:38:02.200 | that become unbalanced with magnetic fields.
01:38:04.520 | So each of these areas is expensive for biology.
01:38:07.960 | It has to consume resources
01:38:09.640 | to use quantum mechanics in this way.
01:38:11.480 | So those are places where we know
01:38:14.560 | there's quantum mechanics in biology.
01:38:16.640 | In cognition, there's just no evidence.
01:38:20.320 | There's no evidence of anything quantum mechanical going on
01:38:25.320 | in how cognition works.
01:38:29.520 | - Consciousness.
01:38:30.720 | - Well, I'm saying cognition, I'm not saying consciousness.
01:38:33.680 | But to get from cognition to consciousness,
01:38:37.160 | so McCulloch and Pitts made a model of neurons.
01:38:42.960 | That led to perceptrons that then through a couple
01:38:47.760 | boom busts led to deep learning.
01:38:51.120 | One of the interesting things about that sequence
01:38:53.040 | is it diverged off.
01:38:55.640 | So deep neural networks used in machine learning
01:39:00.120 | diverged from trying to understand how the brain works.
01:39:03.200 | What makes them work, what's emerged is,
01:39:09.040 | it's a really interesting story.
01:39:10.720 | This may be too much of a technical detail,
01:39:13.080 | but it has to do with function approximation.
01:39:15.320 | We talked about exponentials.
01:39:18.600 | A shallow network needs to be exponentially larger
01:39:23.600 | than a deep network to do the same function.
01:39:27.120 | And that exponential is what gives the power
01:39:29.800 | to deep networks.
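A standard concrete instance of that gap — a textbook illustration, not an example from the conversation — is the n-bit parity function: a depth-2 sum-of-products circuit needs one product term per odd-parity input, 2^(n-1) of them, while a deep tree of two-input XOR gates needs only n - 1 gates:

```python
from itertools import product

def parity(bits):
    return sum(bits) % 2

n = 8

# shallow: a depth-2 sum-of-products needs one minterm per odd-parity input
minterms = sum(1 for bits in product((0, 1), repeat=n) if parity(bits) == 1)

# deep: a balanced binary tree of 2-input XOR gates computes the same function
xor_gates = n - 1

print(f"shallow terms for {n}-bit parity: {minterms}")  # grows as 2^(n-1)
print(f"deep XOR-tree gates: {xor_gates}")              # grows as n - 1
```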
01:39:31.440 | But what's interesting is the sort of lessons
01:39:34.760 | about building these deep architectures
01:39:37.400 | and how to train them have really interesting echoes
01:39:41.280 | to how brains work.
01:39:43.240 | And there's an interesting conversation
01:39:45.280 | that's sort of coming back of neuroscientists
01:39:48.160 | looking over the shoulder of people training
01:39:50.320 | these deep networks, seeing interesting echoes
01:39:52.720 | for how the brain works, interesting parallels with it.
01:39:56.600 | And so I didn't say consciousness, I just said cognition.
01:40:01.480 | But I don't know any experimental evidence
01:40:05.120 | that points to anything in neurobiology
01:40:07.480 | that says we need quantum mechanics.
01:40:10.000 | And I view the question about whether a large language model
01:40:15.000 | is conscious as silly, in that biology is full of hacks
01:40:22.600 | and it works.
01:40:29.400 | There's no evidence we have that there's anything
01:40:33.640 | deeper going on than just this sort of stacking up
01:40:37.040 | of hacks in the brain.
01:40:38.360 | - And somehow consciousness is one of the hacks
01:40:40.800 | or an emergent property of the hacks.
01:40:42.680 | - Absolutely.
01:40:43.520 | And just numerically I said big computations
01:40:47.680 | now have the degrees of freedom of the brain.
01:40:50.440 | And they're showing a lot of the phenomenology
01:40:53.200 | of what we think is properties of what a brain can do.
01:40:57.060 | And I don't see any reason to invoke anything else.
01:41:02.360 | - That makes you wonder what kind of beautiful stuff
01:41:04.840 | digital fabrication will create.
01:41:06.820 | If biology created a few hacks on top of which
01:41:09.440 | consciousness and cognition, some of the things
01:41:12.440 | we love about human beings, were created.
01:41:15.400 | It makes you wonder what kind of beauty
01:41:19.000 | in the complexity can be created through digital fabrication.
01:41:21.880 | - There's an early peek at that, which is,
01:41:24.400 | there's a misleading term which is generative design.
01:41:29.160 | Generative design is where you don't tell a computer
01:41:32.760 | how to design something, you tell the computer
01:41:34.600 | what you want it to do.
01:41:36.400 | That doesn't work, that only works in limited subdomains.
01:41:40.920 | You can't do really complex functionality that way.
01:41:44.020 | The one place it's matured though is topology optimization
01:41:47.860 | for structure.
01:41:48.700 | So let's say you wanted to make a bicycle or a table.
01:41:52.200 | You describe the loads on it and it figures out
01:41:56.640 | how to design it.
01:41:57.800 | And what it makes are beautiful, organic looking things.
01:42:01.560 | These are things that look like they grew in a forest.
01:42:04.440 | And they look like they grew in a forest
01:42:07.040 | 'cause that's sort of exactly what they are.
01:42:08.900 | That they're solving the problem of how you handle loads
01:42:12.880 | in the same way biology does.
01:42:14.200 | And so you get things that look like trees and shells
01:42:17.000 | and all of that.
01:42:17.840 | And so that's a peek at this transition to,
01:42:20.620 | from we design to we teach the machines how to design.
01:42:27.040 | - What can you say about, 'cause you mentioned
01:42:28.520 | cellular automata earlier, about from this example
01:42:31.840 | you just gave and in general the observation you can make
01:42:34.340 | by looking at cellular automata that there's a,
01:42:37.240 | from simple rules and simple building blocks
01:42:40.320 | can emerge arbitrary complexity.
01:42:43.240 | Do you understand what that is, how that can be leveraged?
01:42:48.680 | - So understanding what it is is much easier than it sounds.
01:42:53.680 | I complained about Turing's machine
01:42:55.920 | making a physics mistake.
01:42:57.280 | But Turing never intended it to be a computer architecture.
01:43:01.600 | He used it just to prove results about uncomputability.
01:43:06.260 | What Turing did on what is computation is exquisite,
01:43:10.800 | is gorgeous.
01:43:11.800 | He gave us our notion of computational universality.
01:43:16.340 | And something that sounds deep and turns out to be trivial
01:43:21.640 | is it's really easy to show almost everything
01:43:26.640 | is computationally universal.
01:43:29.240 | So Norman Margolus wrote a beautiful paper
01:43:33.280 | with Tom Toffoli showing, in a cellular
01:43:37.720 | automata world like the game of life
01:43:41.600 | where you just move tokens around--
01:43:43.860 | They showed that modeling billiard balls
01:43:48.600 | on a billiard table with cellular automata
01:43:52.360 | is a universal computer.
01:43:53.820 | To be universal, you need a persistent state,
01:43:59.600 | you need a nonlinear operation to interact them,
01:44:03.100 | and you need connectivity.
01:44:07.540 | So that's what you need to show computational universality.
01:44:11.400 | So they showed that a CA modeling billiard balls
01:44:15.640 | is a universal computer.
01:44:17.600 | Chris Moore went on to extend this beyond chaos.
01:44:21.680 | Turing showed there are uncomputable problems,
01:44:25.600 | problems in computation that you can't solve,
01:44:29.160 | that are harder than just unpredictable.
01:44:31.280 | There's actually a deep reason that they are unsolvable.
01:44:34.640 | Chris Moore showed it's very easy to make physical systems
01:44:39.240 | that are uncomputable, that what the physics system does,
01:44:43.320 | just bouncing balls and surfaces,
01:44:45.420 | you can make systems that solve uncomputable problems.
01:44:48.800 | And so almost any non-trivial physical system
01:44:52.520 | is computationally universal.
01:44:54.680 | So the first part of the answer to your question is,
01:44:57.640 | this comes back to my comment about
01:45:00.560 | how do you bootstrap a civilization?
01:45:02.900 | You just don't need much to be computationally universal.
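The Game of Life mentioned above is a minimal concrete sketch of this: its entire rule fits in a few lines, yet it supports persistent, interacting, movable tokens — the ingredients just listed. The glider below is a five-cell pattern that reproduces itself one cell down and right every four generations; the actual universality constructions use such gliders as signals:

```python
from collections import Counter

def step(live):
    """One Game of Life generation on a set of live (row, col) cells:
    a dead cell with exactly 3 live neighbors is born; a live cell
    survives with 2 or 3 live neighbors."""
    counts = Counter((r + dr, c + dc)
                     for (r, c) in live
                     for dr in (-1, 0, 1)
                     for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):  # one full glider period
    state = step(state)

shifted = {(r + 1, c + 1) for (r, c) in glider}
print("glider translated by (1, 1):", state == shifted)
```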
01:45:06.700 | So then there isn't today a notion
01:45:10.720 | of like fabricational universality
01:45:14.040 | or fabricational complexity.
01:45:16.320 | The sort of numbers I've been giving you
01:45:18.440 | about you eating lunch versus the chip fab,
01:45:21.900 | that's in the same spirit of what Shannon did.
01:45:26.540 | But once you connect computational universality
01:45:31.540 | to kind of fabricational universality,
01:45:35.300 | you then get the ability to grow and adapt and evolve.
01:45:39.320 | - Because that evolution happens in the physical space
01:45:42.640 | and that's ultimately--
01:45:43.480 | - And so that's why, for me,
01:45:45.720 | the heart of this whole conversation is morphogenesis.
01:45:49.760 | So just to come back to that,
01:45:51.940 | what Turing ended his sadly cut short life studying
01:45:58.800 | was how genes give rise to form.
01:46:05.640 | So how the relatively small
01:46:09.320 | amount of information
01:46:11.580 | in the genome can give rise to the complexity of who you are.
01:46:15.560 | And that's where what resides is this molecular intelligence
01:46:20.560 | which is first how to describe you,
01:46:25.000 | but then how to describe you such that you can exist
01:46:29.400 | and you can reproduce and you can grow and you can evolve.
01:46:33.600 | And so that's the seat of our molecular intelligence.
01:46:38.600 | - The maker revolution in biology.
01:46:41.120 | - Yeah, it really is.
01:46:44.280 | It really is.
01:46:45.160 | And that's where you can't separate communication,
01:46:49.380 | computation and fabrication.
01:46:50.980 | You can't separate computer science and physical science.
01:46:53.760 | You can't separate hardware and software.
01:46:56.000 | They all intersect right at that place.
01:46:58.360 | - Do you think of our universe
01:47:00.960 | as just one giant computation?
01:47:02.680 | - I would even kind of say quantum computing is overhyped
01:47:08.880 | in that there's a few things quantum computing
01:47:11.360 | is gonna be good at.
01:47:12.400 | One is breaking crypto systems,
01:47:16.080 | but we know how to make new crypto systems.
01:47:16.080 | What it's really good at is modeling other quantum systems.
01:47:19.000 | So for studying nanotechnology, it's gonna be powerful.
01:47:23.260 | But quantum computing is not going to disrupt
01:47:27.200 | and change everything.
01:47:29.000 | But the reason I say that is this interesting group
01:47:33.080 | of strange people who helped invent quantum computing
01:47:36.600 | before it was clear anything was there.
01:47:39.360 | One of the main reasons they did it
01:47:42.200 | wasn't to make a computer that can break a crypto system.
01:47:46.440 | It was, you could turn this backwards.
01:47:48.840 | You could be surprised quantum mechanics can compute
01:47:52.280 | or you can go in the opposite direction
01:47:55.280 | and say if quantum mechanics can compute,
01:47:58.500 | that's a description of nature.
01:48:01.840 | So physics is written
01:48:05.240 | in terms of partial differential equations.
01:48:08.280 | That is an information technology from two centuries ago.
01:48:13.280 | The equations of physics are not--
01:48:21.200 | this will sound very strange to say--
01:48:22.680 | but the equations of physics, Schrodinger's equations
01:48:25.120 | and Maxwell's equations and all of them are not fundamental.
01:48:28.740 | They're a representation of physics
01:48:30.960 | that was accessible to us in the era
01:48:33.760 | of having a pencil and a piece of paper.
01:48:35.760 | They have a fundamental problem
01:48:41.480 | which is if you make a dot on a piece of paper,
01:48:45.520 | in traditional physics theory,
01:48:48.520 | there's infinite information in that dot.
01:48:51.800 | A point has infinite information.
01:48:55.340 | That can't be true because information
01:49:00.680 | is a fundamental resource that's connected to energy.
01:49:05.680 | And in fact, one of my favorite questions
01:49:09.920 | you can ask a cosmologist to trip them up is ask,
01:49:14.040 | is information a conserved quantity in the universe?
01:49:17.720 | Was all the information created in the Big Bang
01:49:20.200 | or can the universe create information?
01:49:21.880 | And I've yet to meet a cosmologist who doesn't stutter,
01:49:26.560 | not clearly knowing how to handle that existential question
01:49:31.560 | but sort of putting that to a side,
01:49:35.120 | in physics theory, the way it's taught,
01:49:37.980 | information comes late.
01:49:43.240 | You're taught about X, a variable
01:49:46.400 | which can contain infinite information
01:49:48.200 | but physically that's unrealistic.
01:49:50.440 | And so physics theories have to find ways to cut that off.
01:49:53.920 | So instead, there are a number of people
01:50:03.240 | who say that a theory of the universe
01:50:03.240 | should start with information and computation
01:50:06.680 | as the fundamental resources that explain nature
01:50:09.560 | and then you build up from that to something
01:50:12.280 | that looks like throwing baseballs down a slope.
01:50:15.800 | And so in that sense, the work on physics and computation
01:50:22.120 | has many applications that we've been talking about
01:50:24.640 | but more deeply, it's really getting at new ways
01:50:29.040 | to think about how the universe works.
01:50:31.200 | And there are a number of things that are hard to do
01:50:33.640 | in traditional physics that make more sense
01:50:36.320 | when you start with information and computation
01:50:39.120 | as the root of physical theory.
01:50:41.240 | - So information and computation being
01:50:43.240 | the real fundamental thing in the universe.
01:50:46.040 | - Right, that information is a resource.
01:50:49.920 | You can't have infinite information in finite space.
01:50:52.880 | Information propagates and interacts
01:50:56.320 | and from there you erect the scaffolding of physics.
01:50:58.760 | Now it happens, the words I just said
01:51:01.520 | look a lot like quantum field theories
01:51:04.240 | but there's an interesting way where instead of starting
01:51:08.560 | with differential equations to get to quantum field theories
01:51:13.040 | and from quantum field theories you get to quantization--
01:51:18.360 | If you start from computation and information
01:51:21.280 | you begin sort of quantized and you build up from there.
01:51:24.720 | And so that's the sense in which,
01:51:26.960 | absolutely I think about the universe as a computer.
01:51:31.240 | The easy way to understand that is
01:51:34.120 | just almost anything is computationally universal
01:51:38.880 | but the deep way is it's a real fundamental way
01:51:41.920 | to understand how the universe works.
01:51:43.920 | - Let me go a little bit to the personal
01:51:47.560 | and with the Center for Bits and Atoms.
01:51:50.440 | You have worked with, the students you've worked with
01:51:56.560 | have gone on to do some incredible things in this world
01:52:00.400 | including build super computers that power
01:52:02.840 | Facebook and Twitter and so on.
01:52:05.520 | What advice would you give to young people?
01:52:08.780 | What advice have you given them
01:52:11.160 | how to have one heck of a great career,
01:52:14.120 | one heck of a great life?
01:52:15.920 | - One important one is,
01:52:17.440 | if you look at junior faculty trying to get tenure
01:52:23.480 | at a place like MIT, the ones who try to figure out
01:52:27.320 | how to get tenure are miserable and don't get tenure.
01:52:30.680 | And the ones who don't try to figure it out
01:52:33.360 | are happy and do get it.
01:52:34.680 | You have to love what you're doing and believe in it
01:52:39.120 | and nothing else could possibly be
01:52:42.920 | what you wanna be doing with your life
01:52:44.440 | and it gets you out of bed in the morning.
01:52:46.520 | And again, it sounds naive, but
01:52:49.120 | within like the limited domain I'm describing now
01:52:54.240 | as getting tenure at MIT, that's a key attribute to it.
01:52:57.280 | And then same sense, if you take the sort of outliers
01:53:02.080 | students were talking about,
01:53:04.120 | 99 out of 100 come to me and say,
01:53:06.040 | "Your work is very fascinating.
01:53:07.400 | "It'd be interesting to work for you."
01:53:11.080 | And one out of 100 come and say, "You're wrong.
01:53:15.560 | "Here's your mistake.
01:53:17.960 | "Here's what you should have been doing."
01:53:20.040 | They just sort of say, "I'm here and get to work."
01:53:26.480 | And again, I don't know how far this resource goes.
01:53:29.840 | So I've said, I consider the world's greatest resource,
01:53:33.720 | this engine of bright, inventive people
01:53:36.040 | of which we only see a tiny little iceberg of it.
01:53:39.200 | And everywhere we open these labs,
01:53:41.080 | they come out of the woodwork.
01:53:43.320 | We didn't create all these educational programs,
01:53:45.920 | all these other things I'm describing.
01:53:47.760 | We tried to partner everywhere with local schools
01:53:51.160 | and local companies and kept tripping over dysfunction
01:53:53.640 | and find we had to create the environment
01:53:55.240 | where people like this can flourish.
01:53:56.800 | And so I don't know if this is everyone,
01:54:00.040 | if it's 1% of society, what the fraction is,
01:54:02.520 | but it's so many orders of magnitude bigger
01:54:05.600 | than we see today.
01:54:07.000 | We've been racing to keep up with it,
01:54:08.360 | to take advantage of that resource.
01:54:10.280 | - Something tells me it's a very large fraction
01:54:12.160 | of the population.
01:54:13.240 | - I mean, the thing that gives me most hope
01:54:15.640 | for the future is that population.
01:54:18.800 | Once a year, this whole lab network meets,
01:54:21.160 | and it's my favorite gathering, it's in Bhutan this year,
01:54:24.040 | because it's every body shape,
01:54:26.680 | it's every language, every geography,
01:54:28.800 | but it's the same person in all those packages.
01:54:31.640 | It's the same sense of bright invention, of joy and discovery.
01:54:36.920 | - If there's people listening to this
01:54:38.280 | and they're just overwhelmed with how exciting this is,
01:54:41.640 | which I think they would be,
01:54:43.200 | how can they participate, how can they help,
01:54:44.960 | how can they encourage young people or themselves
01:54:48.160 | to build stuff, to create stuff?
01:54:50.880 | - Yeah, that's a great question.
01:54:51.920 | So this is part of a much bigger maker movement
01:54:56.920 | that has a lot of embodiments.
01:55:00.520 | The part I've been involved in, this Fab Lab Network,
01:55:03.340 | you can think of as a curated part that works as a network.
01:55:06.160 | So you don't benefit in a gym
01:55:09.200 | if somebody exercises in another gym,
01:55:11.720 | but in the Fab Network, you do in a sense benefit
01:55:14.800 | when somebody works in another lab,
01:55:16.720 | in the way it functions as a network.
01:55:19.480 | So you can come to cba.mit.edu
01:55:23.520 | to see the research we're talking about.
01:55:26.280 | There's a Fab Foundation run by Sherry Lasseter
01:55:29.400 | at fabfoundation.org.
01:55:31.440 | Fablabs.io is a portal into this lab network.
01:55:36.260 | Fabacademy.org is this distributed
01:55:39.540 | hands-on educational program.
01:55:41.980 | Fab.city is the platform of cities
01:55:44.500 | producing what they consume.
01:55:45.740 | Those are all nodes in this network.
01:55:48.580 | - So you can learn with Fab Academy
01:55:50.340 | and you can perhaps launch or help launch
01:55:53.040 | or participate in launching a Fab Lab.
01:55:54.660 | - Well, and in particular, from one to a thousand,
01:55:59.180 | we carefully counted labs.
01:56:00.700 | Now we're going from a thousand to a million,
01:56:03.060 | where it ceases to become interesting to count them.
01:56:05.300 | And in the thousand to the million,
01:56:07.280 | what's interesting about that stage is technologically,
01:56:12.860 | you go to a lab not to get access to the machine,
01:56:15.260 | but you go to the lab to make the machine.
01:56:17.880 | But the other thing interesting in it
01:56:20.180 | is we have an interesting collaboration
01:56:22.580 | on a Fab Lab in a box.
01:56:26.420 | And this came out of a collaboration with SolidWorks
01:56:30.700 | on how you can put a Fab Lab in a box,
01:56:34.900 | which is not just the tools, but the knowledge.
01:56:37.740 | So you open the box and the box contains the knowledge
01:56:42.020 | of how to use it, as well as the tools within it,
01:56:45.600 | so that the knowledge can propagate.
01:56:47.260 | And so we have an interesting group of people
01:56:48.820 | working on, you know, the original Fab Labs,
01:56:51.940 | we'd have a whole team to get involved
01:56:53.940 | in the setting up and training.
01:56:56.220 | And the Fab Academy is a real in-depth,
01:56:58.660 | deep technical program in the training.
01:57:01.020 | But in this next phase, how sort of the lab itself
01:57:04.940 | knows how to do the lab, that it's, you know,
01:57:08.060 | we've talked deeply about the intelligence in fabrication,
01:57:12.980 | but in a much more accessible one,
01:57:14.540 | about how the AI in the lab, in effect,
01:57:19.500 | becomes a collaborator with you
01:57:21.980 | in this near term to help get started.
01:57:24.440 | And for people wanting to connect,
01:57:28.260 | it can seem like a big step, a big threshold,
01:57:31.100 | but we've gotten to thousands of these
01:57:32.900 | and they're doubling exactly that way,
01:57:35.260 | just from people opting in.
01:57:37.620 | - And in so doing, driving towards this kind of idea
01:57:40.500 | of personal digital fabrication.
01:57:43.900 | - Yeah, and it's not utopia, it's not free,
01:57:46.540 | but come back to today, we separately have education,
01:57:51.540 | we have big business, we have startups,
01:57:55.500 | we have entertainment, sort of each of these things
01:57:57.920 | are segregated.
01:57:59.620 | When you have global connection
01:58:01.880 | to one of these local facilities,
01:58:04.060 | in that you can do play and art and education
01:58:07.740 | and create infrastructure.
01:58:09.380 | You can make many of the things you consume.
01:58:13.660 | You could make it for yourself,
01:58:14.980 | it could be done on a community scale,
01:58:16.740 | it could be done on a regional scale.
01:58:19.120 | It really, I'd say the research we spent
01:58:24.220 | the last few hours talking about, I thought was hard.
01:58:27.740 | And in a sense, I mean, it's non-trivial,
01:58:32.740 | but in a sense, it's just sort of playing out,
01:58:35.320 | we're turning the crank.
01:58:36.520 | What I didn't think was hard is,
01:58:39.200 | if anybody can make almost anything anywhere,
01:58:43.620 | how do you live, how do you learn,
01:58:45.480 | how do you work, how do you play?
01:58:47.460 | These very basic assumptions about how society functions.
01:58:50.980 | There's a way in which it's kind of back to the future,
01:58:54.820 | in that this mode where work is money is consumption,
01:58:59.820 | and consumption is shopping by selecting,
01:59:03.680 | is only a kind of a few decade old stretch.
01:59:07.360 | In some ways, we're getting back to,
01:59:11.500 | a Sami village in North Norway is deeply sustainable.
01:59:16.500 | But rather than just reverting to living
01:59:22.020 | the way we did a few thousand years ago,
01:59:24.260 | being connected globally,
01:59:27.060 | having the benefits of modern society,
01:59:30.640 | but connecting it back to older notions of sustainability,
01:59:34.420 | I hadn't remotely anticipated just how fundamentally
01:59:39.960 | that challenges how a society functions,
01:59:42.900 | and how interesting and how hard it is
01:59:44.780 | to figure out how we can make that work.
01:59:47.820 | - And it's possible that this kind of process
01:59:51.420 | will give a deeper sense of meaning to each person.
01:59:54.900 | - Let me violently agree in two ways.
01:59:58.100 | One way is,
01:59:59.800 | this community making crosses
02:00:06.500 | many sensitive sectarian boundaries
02:00:09.380 | in many parts of the world,
02:00:10.940 | where there's just implicit or explicit conflict,
02:00:15.620 | but sort of this act of making
02:00:18.700 | seems to transcend a lot of historical divisions.
02:00:22.700 | I don't say that philosophically,
02:00:24.420 | I just say that as an observation.
02:00:26.380 | And I think there's something really fundamental
02:00:29.980 | in what you said, which is,
02:00:31.740 | deep in our brain is shaping our environment.
02:00:35.940 | A lot of what's strange about our society
02:00:41.880 | is the way that we can't do that.
02:00:45.900 | The act of shaping our environment
02:00:48.980 | touches something really, really deep
02:00:51.220 | that gets to the essence of who we are.
02:00:53.940 | That's again why I say that, in a way,
02:00:57.780 | the most important thing made in these labs
02:01:00.220 | is making itself.
02:01:01.860 | - What do you think, if the shaping of our environment
02:01:06.260 | gets to something deep,
02:01:07.220 | what do you think is the meaning of it all?
02:01:09.620 | What's the meaning of life, Neil?
02:01:12.220 | - I can tell you my insights into how life works.
02:01:17.220 | I can tell you my insights in how to make life
02:01:21.900 | meaningful and fulfilling,
02:01:23.860 | and sustainable.
02:01:26.680 | I have no idea what the meaning of life is,
02:01:31.300 | but maybe that's the meaning of life.
02:01:33.740 | - Nah, the uncertainty, the confusion.
02:01:35.660 | Because there's a magic to it all.
02:01:38.820 | Everything you've talked about,
02:01:40.100 | from starting from the basic elements with the Big Bang,
02:01:43.860 | that somehow created the sun,
02:01:45.400 | that somehow said FU to thermodynamics and created life,
02:01:50.400 | and all the ways that you've talked about,
02:01:52.700 | from ribosomes that created the machinery
02:01:54.860 | that created the machine,
02:01:55.940 | and then now the biological machine creating,
02:01:59.120 | through digital fabrication,
02:02:00.400 | more complex artificial machines.
02:02:02.380 | All of that, there's a magic to that creative process.
02:02:05.820 | And we humans are smart enough to notice the magic.
02:02:09.500 | - So, you haven't said the S word yet.
02:02:13.220 | - Which one is that?
02:02:14.060 | - Singularity.
02:02:15.140 | (laughing)
02:02:17.420 | Yeah, I'm not sure if Ray Kurzweil is listening,
02:02:19.700 | if he is, hi, Ray.
02:02:20.700 | But I have a complex relationship with Ray,
02:02:23.220 | 'cause a lot of the things he projects, I find annoying.
02:02:28.220 | But then, he does his homework,
02:02:31.740 | and then, somewhat annoyingly, he points out
02:02:34.560 | how almost everything I'm doing fits on his roadmaps.
02:02:38.380 | - Yeah.
02:02:39.780 | - And so, the question is,
02:02:44.780 | are we heading towards a singularity?
02:02:47.580 | So, I'd have to say I lean towards sigmoids
02:02:52.940 | rather than exponentials.
02:02:54.680 | - We've done pretty well with sigmoids.
02:02:58.980 | - Yeah, so sigmoids are things grow, and they taper,
02:03:02.420 | and then there can be one after it, and one after it.
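[Editor's note: a minimal sketch of the contrast Gershenfeld describes, using the standard logistic function. A single sigmoid grows and then tapers toward a ceiling, while a succession of them, each taking off as the last saturates, keeps overall growth going. The function names and parameters here are illustrative, not from the conversation.]

```python
import math

def sigmoid(t, midpoint, rate=1.0, ceiling=1.0):
    """Logistic growth curve: rises, then tapers toward its ceiling."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def stacked_sigmoids(t, n_waves=5, spacing=10.0):
    """A succession of sigmoids, each centered where the previous one
    has tapered off -- 'one after it, and one after it' -- so growth
    continues without any single curve diverging."""
    return sum(sigmoid(t, midpoint=i * spacing) for i in range(n_waves))
```

Unlike an exponential, each wave here is bounded; whether enough successive waves add up to something that effectively diverges is exactly the question Gershenfeld passes on.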
02:03:08.220 | I'll pass on whether there's enough of them
02:03:10.340 | that they diverge, but the selfish gene answer
02:03:15.340 | to the meaning of life is the meaning of life
02:03:21.100 | is the propagation of life.
02:03:23.860 | And so, it was a step for atoms to assemble into a molecule,
02:03:33.980 | for molecules to assemble into a protocell,
02:04:33.980 | for the protocells to then form organelles,
02:04:38.540 | for the organelles to form organs,
02:03:44.500 | the organs to form an organism.
02:03:46.600 | Then, it was a step for organisms to form family units,
02:03:51.140 | then family units to form villages.
02:03:53.420 | You can view each of those as a stack
02:03:56.120 | in the level of organizations.
02:03:57.560 | So, you could view everything we've spoken about
02:04:02.060 | as the imperative of life, just the next step
02:04:07.060 | in the hierarchy of that, in the fulfillment
02:04:11.020 | of the inexorable drive of the violation of thermodynamics.
02:04:14.380 | So, you could view, I'm an embodiment of the will
02:04:17.940 | of the violation of thermodynamics speaking.
02:04:20.340 | - The two of us having an odd chat, yes.
02:04:24.740 | - Yeah.
02:04:25.860 | - And so, it continues, and even then,
02:04:27.860 | the singularity's just a transition up the ladder.
02:04:31.480 | There's nothing deeper to consciousness than,
02:04:35.900 | it's a derived property of distributed problem solving.
02:04:39.940 | There's nothing deeper to life than embodied AI
02:04:46.340 | in morphogenesis.
02:04:50.420 | So, why so much of this conversation in my life
02:04:53.840 | is involved in these fab labs?
02:04:57.220 | And initially, it just started as outreach.
02:04:59.960 | Then it started as keeping up with it.
02:05:02.600 | Then it turned to, it was rewarding.
02:05:07.600 | Then it turned to, we're learning as much
02:05:11.300 | from these labs as goes out to them.
02:05:14.220 | It began as outreach, but now more knowledge
02:05:16.660 | is coming back from the labs than is going into them.
02:05:19.380 | And then finally, it ends with what I described
02:05:24.380 | as competing with myself at MIT,
02:05:27.060 | but a better way to say that is tapping
02:05:30.140 | the brain power of the planet.
02:05:31.780 | And so, I guess for me personally,
02:05:35.260 | that's the meaning of my life.
02:05:37.540 | - And maybe that's the meaning for the universe too.
02:05:39.640 | It's using us humans and our creations to understand itself
02:05:44.640 | in the way it's, whatever the creative process
02:05:49.980 | that created Earth, it's competing with itself.
02:05:53.160 | - Yeah, so you could take morphogenesis
02:05:56.360 | as a summary of this whole conversation,
02:05:58.100 | or you could take recursion, that in a sense,
02:06:02.360 | what we've been talking about is recursion all the way down.
02:06:05.500 | - And in the end, I think this whole thing is pretty fun.
02:06:08.820 | It's short, life is, but it's pretty fun.
02:06:12.340 | And so is this conversation.
02:06:13.900 | I mentioned to you offline that I'm going
02:06:15.740 | through some difficult stuff personally,
02:06:17.260 | and your passion for what you do is just really inspiring,
02:06:20.540 | and it just lights up my mood and lights up my heart.
02:06:23.740 | And you're an inspiration for, I know,
02:06:26.520 | thousands of people that work with you at MIT
02:06:28.560 | and millions of people across the world.
02:06:30.160 | It's a big honor that you sit with me today.
02:06:31.840 | This was really fun.
02:06:32.720 | - This was a pleasure.
02:06:33.820 | - Thanks for listening to this conversation
02:06:36.240 | with Neil Gershenfeld.
02:06:37.640 | To support this podcast, please check out our sponsors
02:06:40.160 | in the description.
02:06:41.600 | And now, let me leave you with some words
02:06:43.600 | from Pablo Picasso.
02:06:45.760 | Every child is an artist.
02:06:48.240 | The challenge is staying an artist when you grow up.
02:06:51.360 | Thank you for listening.
02:06:53.560 | And hope to see you next time.
02:06:55.760 | (upbeat music)
02:06:58.340 | (upbeat music)
02:07:00.920 | [BLANK_AUDIO]