
Robert Langer: Edison of Medicine | Lex Fridman Podcast #105


Chapters

0:00 Introduction
3:07 Magic and science
5:34 Memorable rejection
8:35 How to come up with big ideas in science
13:27 How to make a new drug
22:38 Drug delivery
28:22 Tissue engineering
35:22 Beautiful idea in bioengineering
38:16 Patenting process
42:21 What does it take to build a successful startup?
46:18 Mentoring students
50:54 Funding
58:08 Cookies
59:41 What are you most proud of?

Whisper Transcript | Transcript Only Page

00:00:00.000 | The following is a conversation with Bob Langer,
00:00:02.920 | professor at MIT and one of the most cited researchers
00:00:06.120 | in history, specializing in biotechnology fields
00:00:09.840 | of drug delivery systems and tissue engineering.
00:00:12.600 | He has bridged theory and practice by being a key member
00:00:16.920 | and driving force in launching many successful
00:00:19.760 | biotech companies out of MIT.
00:00:22.080 | This conversation was recorded before the outbreak
00:00:25.140 | of the coronavirus pandemic.
00:00:27.160 | His research and companies are at the forefront
00:00:29.440 | of developing treatment for COVID-19,
00:00:31.840 | including a promising vaccine candidate.
00:00:35.380 | Quick summary of the ads.
00:00:37.000 | Two sponsors, Cash App and Masterclass.
00:00:40.640 | Please consider supporting the podcast
00:00:42.400 | by downloading Cash App and using code LEXPODCAST
00:00:46.440 | and signing up at masterclass.com/lex.
00:00:49.720 | Click on the links, buy the stuff.
00:00:52.140 | It really is the best way to support this podcast
00:00:54.400 | and in general, the journey I'm on
00:00:57.040 | in my research and startup.
00:00:59.480 | This is the Artificial Intelligence Podcast.
00:01:01.920 | If you enjoy it, subscribe on YouTube,
00:01:04.200 | review it with five stars on Apple Podcast,
00:01:06.440 | support it on Patreon or connect with me
00:01:08.480 | on Twitter @lexfridman.
00:01:11.520 | As usual, I'll do a few minutes of ads now
00:01:13.680 | and never any ads in the middle that can break the flow
00:01:15.880 | of the conversation.
00:01:16.920 | This show is presented by Cash App,
00:01:20.740 | the number one finance app in the App Store.
00:01:23.380 | When you get it, use code LEXPODCAST.
00:01:26.560 | Cash App lets you send money to friends, buy Bitcoin
00:01:29.440 | and invest in the stock market with as little as $1.
00:01:32.520 | Since Cash App allows you to send
00:01:34.040 | and receive money digitally,
00:01:35.960 | let me mention a surprising fact related to physical money.
00:01:39.240 | Of all the currency in the world,
00:01:40.880 | roughly 8% of it is actual physical money.
00:01:44.580 | The other 92% of money only exists digitally.
00:01:48.620 | So again, if you get Cash App from the App Store,
00:01:51.200 | Google Play and use the code LEXPODCAST,
00:01:54.220 | you get $10 and Cash App will also donate $10 to FIRST,
00:01:58.680 | an organization that is helping to advance robotics
00:02:01.080 | and STEM education for young people around the world.
00:02:04.080 | This show is sponsored by Masterclass.
00:02:07.680 | Sign up at masterclass.com/lex to get a discount
00:02:11.280 | and to support this podcast.
00:02:13.360 | When I first heard about Masterclass,
00:02:15.520 | I thought it was too good to be true.
00:02:17.640 | For $180 a year, you get an all access pass
00:02:21.240 | to watch courses from, to list some of my favorites.
00:02:24.860 | Chris Hadfield on space exploration,
00:02:26.980 | Neil deGrasse Tyson on scientific thinking
00:02:28.780 | and communication, Will Wright,
00:02:30.660 | creator of SimCity and Sims, on game design,
00:02:33.940 | Carlos Santana on guitar,
00:02:36.260 | Europa is probably one of the most beautiful
00:02:38.260 | guitar instrumentals ever,
00:02:40.160 | Garry Kasparov on chess,
00:02:42.160 | Daniel Negreanu on poker and many more.
00:02:44.880 | Chris Hadfield explaining how rockets work
00:02:47.380 | and the experience of being launched into space alone
00:02:49.940 | is worth the money.
00:02:51.720 | You can watch it on basically any device.
00:02:54.120 | Once again, sign up at masterclass.com/lex
00:02:57.920 | to get a discount and to support this podcast.
00:03:02.600 | And now, here's my conversation with Bob Langer.
00:03:06.460 | You have a bit of a love for magic.
00:03:09.960 | Do you see a connection between magic and science?
00:03:13.160 | - I do.
00:03:14.000 | I think magic can surprise you
00:03:15.920 | and I think science can surprise you
00:03:19.320 | and there's something magical about science.
00:03:22.400 | I mean, making discoveries and things like that, yeah.
00:03:24.940 | - So, and then on the magic side,
00:03:26.680 | is there some kind of engineering scientific process
00:03:29.320 | to the tricks themselves?
00:03:31.140 | Do you see, 'cause there's a duality to it.
00:03:33.700 | One is you're the,
00:03:35.580 | you're sort of the person inside
00:03:39.160 | that knows how the whole thing works,
00:03:40.820 | how the universe of the magic trick works.
00:03:43.740 | And then from the outside observer,
00:03:45.480 | which is kind of the role of the scientist,
00:03:48.080 | you, the people that observe the magic trick
00:03:50.280 | don't know, at least initially, anything that's going on.
00:03:53.880 | Do you see that kind of duality?
00:03:55.800 | - Well, I think the duality that I see is fascination.
00:03:58.280 | You know, I think of it, you know,
00:03:59.800 | when I watch magic myself, I'm always fascinated by it.
00:04:04.380 | Sometimes it's a puzzle to think how it's done,
00:04:06.320 | but just the sheer fact that something
00:04:08.520 | that you never thought could happen does happen.
00:04:11.240 | And I think about that in science too.
00:04:13.480 | You know, sometimes you,
00:04:15.400 | it's something that you might dream about
00:04:18.120 | and hoping to discover, maybe you do in some way or form.
00:04:22.680 | - What is the most amazing magic trick you've ever seen?
00:04:26.960 | - Well, there's one I like,
00:04:27.920 | which is called the invisible pack.
00:04:30.020 | And the way it works is you have this pack
00:04:33.480 | and you hold it up.
00:04:35.800 | Well, first you say to somebody, this is an invisible deck.
00:04:39.480 | And you say, well, shuffle it.
00:04:42.880 | They shuffle it, but you know, they're sort of make-believe.
00:04:45.600 | And then you say, okay, I'd like you to pick a card,
00:04:48.080 | any card, and show it to me.
00:04:50.200 | And you show it to me and I look at it.
00:04:52.840 | And let's say it's the three of hearts.
00:04:56.640 | I said, well, put it back in the deck,
00:04:57.960 | but what I'd like you to do is turn it upside down
00:05:00.320 | from every other card in the deck.
00:05:02.360 | So they do that imaginary.
00:05:04.760 | And I said, do you want to shuffle it again?
00:05:06.200 | And they shuffle it.
00:05:07.040 | And I said, well, so there's still one card upside down
00:05:10.320 | from every other card in the deck.
00:05:11.760 | I said, what is that?
00:05:12.600 | And they said, well, three of hearts.
00:05:14.200 | So I said, well, it just so happens in my back pocket
00:05:16.160 | I have this deck, it's a real deck.
00:05:18.560 | I show it to you and I just open it up
00:05:20.760 | and there's just one card upside down.
00:05:23.080 | And it's the three of hearts.
00:05:24.520 | - And you can do this trick.
00:05:30.720 | - I can, if I don't, I would have probably brought it.
00:05:33.520 | - All right, well, beautiful.
00:05:36.360 | Let's get into the science.
00:05:39.320 | As of today, you have over 295,000 citations
00:05:43.360 | and an h-index of 269.
00:05:46.880 | You're one of the most cited people in history
00:05:48.960 | and the most cited engineer in history.
00:05:51.000 | And yet nothing great, I think,
00:05:54.880 | is ever achieved without failure.
00:05:56.640 | So the interesting part, what rejected papers, ideas,
00:06:01.400 | efforts in your life were most painful
00:06:03.680 | or had the biggest impact on your life?
00:06:05.840 | - Well, it's interesting.
00:06:06.680 | I mean, I've had plenty of rejection too.
00:06:09.240 | But I suppose one way I think about this
00:06:11.640 | is that when I first started
00:06:14.080 | and this certainly had an impact both ways.
00:06:16.120 | You know, I first started, we made two big discoveries
00:06:20.520 | and they were kind of interrelated.
00:06:22.040 | I mean, one was I was trying to isolate
00:06:24.640 | with my postdoctoral advisor, Judah Folkman,
00:06:27.200 | substances that could stop blood vessels from growing
00:06:30.120 | and nobody had done that before.
00:06:33.080 | And so that was part A, let's say.
00:06:36.320 | Part B is we had to develop a way to study that.
00:06:39.360 | And what was critical to study that
00:06:41.640 | was to have a way to slowly release those substances
00:06:45.400 | for more than a day, maybe months.
00:06:49.600 | And that had never been done before either.
00:06:51.700 | So we published the first one,
00:06:53.880 | we sent to Nature, the journal, and they rejected it.
00:06:58.480 | And then we revised it, we sent it to Science
00:07:01.840 | and they accepted it.
00:07:03.720 | And the opposite happened.
00:07:06.400 | We sent it to Science and they rejected it
00:07:08.240 | and then we sent it to Nature and they accepted it.
00:07:10.880 | But I have to tell you, when we got the rejections,
00:07:13.160 | it was really upsetting.
00:07:14.240 | I thought, you know, I'd done some really good work
00:07:16.400 | and Dr. Folkman thought we'd done some really good work.
00:07:19.360 | But it was very depressing to get rejected like that.
00:07:25.760 | - If you can linger on just the feeling
00:07:27.720 | or the thought process when you get the rejection,
00:07:30.400 | especially early on in your career,
00:07:32.840 | what, I mean, you don't know,
00:07:36.160 | now people know you as a brilliant scientist
00:07:42.160 | but at the time, I'm sure you're full of self-doubt.
00:07:45.400 | And did you believe that maybe this idea
00:07:49.480 | is actually quite terrible,
00:07:51.360 | that it could have been done much better,
00:07:53.060 | or is there underlying confidence?
00:07:54.760 | What was the feelings?
00:07:56.280 | - Well, you feel depressed and I felt the same way
00:07:59.720 | when I got grants rejected,
00:08:01.040 | which I did a lot in the beginning.
00:08:03.480 | I guess part of me, you know, you have multiple emotions.
00:08:06.680 | One is being sad and being upset
00:08:11.120 | and also being maybe a little bit angry
00:08:12.760 | 'cause you feel the reviewers didn't get it.
00:08:16.000 | But then as I thought about it more,
00:08:17.640 | I thought, well, maybe I just didn't explain it well enough.
00:08:20.320 | And you know, you go through stages.
00:08:22.920 | And so you say, well, okay,
00:08:24.720 | I'll explain it better next time.
00:08:26.160 | And certainly you get reviews
00:08:27.640 | and when you get the reviews,
00:08:28.660 | you see what they either didn't like or didn't understand
00:08:31.720 | and then you try to incorporate that into your next versions.
00:08:35.400 | - You've given advice to students to do something big,
00:08:38.520 | do something that really can change the world
00:08:40.240 | rather than something incremental.
00:08:42.080 | How did you yourself seek out such ideas?
00:08:45.720 | Is there a process?
00:08:47.760 | Is there sort of a rigorous process
00:08:49.760 | or is it more spontaneous?
00:08:52.040 | - It's more spontaneous.
00:08:53.280 | I mean, part of it's exposure to things,
00:08:56.280 | part of it's seeing other people
00:08:58.280 | like I mentioned Dr. Folkman,
00:08:59.680 | he was my postdoctoral advisor.
00:09:01.440 | He was very good at that.
00:09:02.560 | You could sort of see that he had big ideas
00:09:05.240 | and I certainly met a lot of people who didn't.
00:09:07.400 | And I think you could spot an idea
00:09:09.480 | that might have potential when you see it,
00:09:11.520 | you know, because it could have very broad implications.
00:09:14.360 | Whereas a lot of people
00:09:15.240 | might just keep doing derivative stuff.
00:09:17.680 | But it's not something that I've ever done
00:09:23.200 | systematically, I don't think.
00:09:26.400 | - So in the space of ideas,
00:09:28.200 | how many are just when you see them, it's just magic.
00:09:31.840 | It's something that you see that could be impactful
00:09:34.720 | if you dig deeper.
00:09:38.400 | - Yeah, it's sort of hard to say
00:09:39.880 | because there's multiple levels of ideas.
00:09:42.640 | One type of thing is like a new, you know, creation,
00:09:48.840 | that you could engineer tissues for the first time
00:09:51.240 | or make tissues from scratch for the first time.
00:09:53.600 | But another thing is really just deeply understanding
00:09:56.280 | something and that's important too.
00:09:59.080 | So, and that may lead to other things.
00:10:03.120 | So sometimes you could think of a new technology
00:10:06.720 | or I thought of a new technology,
00:10:08.520 | but other times things came from just the process
00:10:12.160 | of trying to discover things.
00:10:14.000 | So it's never, and you don't necessarily know,
00:10:17.800 | like people talk about aha moments,
00:10:19.800 | but I don't know if I've, I mean,
00:10:22.280 | I certainly feel like I've had some ideas
00:10:24.080 | that I really like, but it's taken me a long time
00:10:28.360 | to go from the thought process of starting it
00:10:31.600 | to all of a sudden knowing that it might work.
00:10:35.580 | - So if you take drug delivery, for example,
00:10:38.200 | is the notion, is the initial notion
00:10:41.000 | kind of a very general one,
00:10:43.280 | that we should be able to do something like this?
00:10:46.440 | - Yeah. - And then you start
00:10:47.560 | to ask the questions of, well, how would you do it?
00:10:50.400 | And then digging and digging and digging.
00:10:52.760 | - I think that's right.
00:10:53.680 | I think it depends.
00:10:54.540 | I mean, there are many different examples.
00:10:56.360 | The example I gave about delivering large molecules,
00:10:59.520 | which we used to study these blood vessel inhibitors.
00:11:03.080 | I mean, there we had to invent something that would do that.
00:11:06.320 | But other times it's different.
00:11:10.200 | Sometimes it's really understanding what goes on
00:11:12.760 | in terms of understanding the mechanisms.
00:11:14.880 | And so it's not a single thing.
00:11:17.640 | And there are many different parts to it.
00:11:20.120 | But over the years we've invented different,
00:11:23.040 | or discovered different principles for aerosols,
00:11:26.160 | for delivering genetic therapy agents,
00:11:29.720 | all kinds of things.
00:11:30.780 | - So let's explore some of the key ideas
00:11:33.280 | you've touched on in your life.
00:11:34.880 | Let's start with the basics.
00:11:37.520 | - Okay.
00:11:38.600 | - So first let me ask, how complicated is the biology
00:11:41.920 | and chemistry of the human body
00:11:43.520 | from the perspective of trying to affect some parts of it
00:11:46.800 | in a positive way?
00:11:49.000 | So that you know, for me, especially coming from
00:11:51.440 | the field of computer science
00:11:53.760 | and computer engineering and robotics,
00:11:56.420 | it seems that the human body is exceptionally complicated
00:11:59.120 | and how the heck you can figure out anything is amazing.
00:12:02.160 | - Well, I agree with you.
00:12:03.120 | I think it's super complicated.
00:12:04.560 | I mean, we're still just scratching the surface
00:12:06.720 | in many ways.
00:12:07.840 | But I feel like we have made progress in different ways.
00:12:10.660 | And some of it's by really understanding things
00:12:15.020 | like we were just talking about.
00:12:16.520 | Other times, you know, you might, or somebody might,
00:12:19.380 | we or others might invent technologies
00:12:21.600 | that might be helpful on exploring that.
00:12:24.400 | And I think over many years,
00:12:26.300 | we've understood things better and better,
00:12:27.840 | but we still have such a long ways to go.
00:12:29.720 | - Are there, I mean, if you just look,
00:12:32.680 | are there things that, are there knobs
00:12:37.160 | that are reliably controllable about the human body?
00:12:40.220 | If you consider, is there, is there,
00:12:44.680 | so if you start to think about controlling various aspects
00:12:48.080 | of, when we talk about drug delivery a little bit,
00:12:51.260 | but controlling various aspects chemically
00:12:55.220 | of the human body, is there a solid understanding
00:12:57.840 | across the populations of humans
00:12:59.920 | that are solid, reliable knobs that can be controlled?
00:13:03.400 | - I think that's hard to do.
00:13:05.440 | But on the other hand, whenever we make a new drug
00:13:07.500 | or medical device, to a certain extent, we're doing that,
00:13:10.480 | you know, in a small way, what you just said.
00:13:12.880 | But I don't know that there are great knobs.
00:13:16.400 | I mean, and we're learning about those knobs all the time.
00:13:18.960 | But if there's a biological pathway
00:13:21.400 | or something that you can affect or understand,
00:13:24.840 | I mean, then that might be such a knob.
00:13:27.760 | - So what is a pharmaceutical drug?
00:13:30.240 | How do you do, how do you discover a specific one?
00:13:33.560 | How do you test it?
00:13:34.520 | How do you understand it?
00:13:35.800 | How do you ship it?
00:13:36.960 | - Yeah, well, I'll give an example
00:13:39.720 | which goes back to what I said before.
00:13:41.540 | So when I was doing my postdoctoral work with Judah Folkman,
00:13:45.200 | we wanted to come up with drugs
00:13:46.400 | that would stop blood vessels from growing
00:13:47.920 | or alternatively make them grow.
00:13:50.360 | And actually, people didn't even believe
00:13:52.280 | that those things could happen.
00:13:55.560 | But-- - Can we pause on that
00:13:56.920 | for a second? - Sure.
00:13:57.960 | - What is a blood vessel?
00:13:59.400 | What does it mean for a blood vessel to grow and shrink?
00:14:01.960 | And why is that important?
00:14:03.120 | - Sure, so a blood vessel is,
00:14:05.320 | could be an artery or a vein or a capillary.
00:14:10.120 | And it provides oxygen, it provides nutrients,
00:14:15.120 | gets rid of waste.
00:14:16.420 | So to different parts of your body,
00:14:20.660 | so the blood vessels end up being very, very important.
00:14:25.260 | And if you have cancer, blood vessels grow into the tumor
00:14:30.260 | and that's part of what enables the tumor to get bigger.
00:14:33.100 | And that's also part of what enables the tumor
00:14:36.380 | to metastasize, which means spread throughout the body.
00:14:39.420 | And ultimately kill somebody.
00:14:41.540 | So that was part of what we were trying to do.
00:14:43.340 | We wanted to see if we could find substances
00:14:46.340 | that could stop that from happening.
00:14:48.140 | So first, I mean, there are many steps.
00:14:50.180 | First, we had to develop a bioassay
00:14:52.060 | to study blood vessel growth.
00:14:53.620 | Again, there wasn't one.
00:14:54.860 | That's where we needed the polymer systems
00:14:57.540 | because the blood vessels grew slowly, took months.
00:15:00.880 | So after we had the polymer system and we had the bioassay,
00:15:05.980 | then I isolated many different molecules initially
00:15:09.340 | from cartilage and almost all of them didn't work.
00:15:14.140 | But we were fortunate, we found one.
00:15:16.740 | It wasn't purified, but we found one that did work.
00:15:20.460 | And that paper, that was this paper I mentioned
00:15:22.700 | in Science in 1976.
00:15:24.260 | Those were really the isolation of some of the very first
00:15:27.100 | angiogenesis blood vessel inhibitors.
00:15:29.660 | - So there's a lot of words there.
00:15:31.180 | - Yeah.
00:15:32.020 | - First of all, polymer molecules, big, big molecules.
00:15:37.900 | So what are polymers?
00:15:39.920 | What's bioassay?
00:15:41.460 | What is the process of trying to isolate
00:15:45.420 | this whole thing simplified to where you can control
00:15:47.340 | and experiment with it?
00:15:48.580 | - Polymers are like plastics or rubber.
00:15:52.560 | What were some of the other questions?
00:15:55.940 | - Sorry, so a polymer is some plastics and rubber,
00:15:58.700 | and that means something that has structure
00:16:00.860 | and that could be useful for what?
00:16:03.140 | - Well, in this case, it would be something
00:16:04.860 | that could be useful for delivering a molecule
00:16:08.380 | for a long time, so it could slowly diffuse out of that
00:16:11.580 | at a controlled rate to where you wanted it to go.
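The controlled-release behavior Langer describes, a drug slowly diffusing out of a polymer over months, can be sketched with a toy model. The classic first approximation for diffusion-controlled release from a polymer matrix is the Higuchi square-root-of-time law; the rate constant below is made up purely for illustration, not taken from any real system:

```python
import math

def higuchi_release(k, t_hours):
    """Cumulative fraction of drug released at time t under the Higuchi
    approximation, where release scales with the square root of time.
    k is an illustrative rate constant (per sqrt(hour)), capped so the
    fraction never exceeds 1."""
    return min(1.0, k * math.sqrt(t_hours))

# With a made-up k, release stretches over months rather than hours
for days in (1, 30, 90):
    t = days * 24
    print(f"day {days}: fraction released ~ {higuchi_release(0.01, t):.3f}")
```

The point of the sketch is only the shape of the curve: release is fast at first and slows over time, which is why matching the polymer's properties to the desired rate matters.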
00:16:15.100 | - So then you would find, the idea is that there would be
00:16:17.620 | a particular blood vessels that you can target,
00:16:22.260 | say they're connected somehow to a tumor,
00:16:24.700 | that you could target and over a long period of time
00:16:28.780 | to be able to place the polymer there
00:16:31.620 | and it'd be delivering a certain kind of chemical.
00:16:34.660 | - That's correct, I think what you said is good.
00:16:36.700 | So that it would deliver the molecule or the chemical
00:16:40.660 | that would stop the blood vessels from growing
00:16:42.620 | over a long enough time so that it really could happen.
00:16:45.460 | So that was sort of what we call a bioassay,
00:16:48.620 | is the way that we would study that.
00:16:50.580 | - So sorry, so what is a bioassay?
00:16:52.740 | Which part is the bioassay?
00:16:54.540 | - All of it, in other words, the bioassay
00:16:57.060 | is the way you study blood vessel growth.
00:17:01.900 | - The blood vessel growth, and you can control that
00:17:04.580 | somehow with, is there an understanding
00:17:06.900 | what kind of chemicals could control the growth
00:17:08.820 | of a blood vessel?
00:17:09.660 | - Sure, well now there is, but then when I started,
00:17:11.620 | there wasn't, and that gets to your original question.
00:17:14.460 | So you go through various steps.
00:17:15.940 | We did the first steps, we showed that A,
00:17:18.240 | such molecules existed and then we developed techniques
00:17:21.100 | for studying them, and we even isolated fractions,
00:17:25.140 | you know, groups of substances that would do it.
00:17:28.500 | But what would happen over the next,
00:17:30.460 | we did that in 1976, we published that.
00:17:34.000 | What would happen over the next 28 years
00:17:36.540 | is other people would follow in our footsteps.
00:17:38.780 | I mean, we tried to do some stuff too,
00:17:40.940 | but ultimately to make a new drug takes billions of dollars.
00:17:44.820 | So what happened was there were different growth factors
00:17:48.580 | that people would isolate, sometimes using the techniques
00:17:51.420 | that we developed, and then they would figure out
00:17:55.920 | using some of those techniques ways to stop those,
00:17:58.460 | the growth factors and ways to stop
00:18:00.380 | the blood vessels from growing.
00:18:02.340 | But that, like I say, took 28 years,
00:18:03.880 | it took billions of dollars,
00:18:05.120 | and worked by many companies like Genentech.
00:18:07.680 | But in 2004, 28 years after we started,
00:18:12.420 | the first one of those, Avastin, got approved by the FDA.
00:18:17.020 | And that's become one of the top biotech-selling drugs
00:18:22.640 | in history, and it's been approved for all kinds of cancers
00:18:25.340 | and actually for many eye diseases too,
00:18:27.720 | where you have abnormal blood vessel growth, macular.
00:18:31.360 | - So in general, one of the key ways you can alleviate,
00:18:36.360 | so what's the hope in terms of tumors associated
00:18:41.080 | with cancerous tumors?
00:18:42.840 | What can you help by being able to control
00:18:46.720 | the growth of vessels?
00:18:48.440 | - So if you cut off the blood supply, you cut off the,
00:18:52.640 | it's kind of like a war almost, right?
00:18:54.760 | If you have, if the nutrition is going to the tumor,
00:18:58.840 | and you can cut it off, I mean, you starve the tumor
00:19:02.000 | and it becomes very small, it may disappear,
00:19:04.600 | or it's gonna be much more amenable to other therapies,
00:19:07.700 | because it is tiny, like chemotherapy or immunotherapy
00:19:12.320 | is gonna have a much easier time against a small tumor
00:19:15.400 | than a big one.
00:19:16.520 | - Is that an obvious idea?
00:19:18.560 | I mean, it seems like a very clever strategy in this war
00:19:22.520 | against cancer.
00:19:24.760 | - Well, in retrospect, it's an obvious idea,
00:19:27.220 | but when Dr. Folkman, my boss, first proposed it,
00:19:30.520 | it wasn't, a lot of people thought it was pretty crazy.
00:19:34.360 | - And so in what sense, if you could sort of linger on it,
00:19:39.160 | when you're thinking about these ideas at the time,
00:19:42.040 | were you feeling around in the dark?
00:19:44.480 | So how much mystery is there about the whole thing?
00:19:46.840 | How much just blind experimentation,
00:19:50.020 | if you can put yourself in that mindset from years ago?
00:19:52.620 | - Yeah, well, there was, I mean, for me,
00:19:55.560 | actually, it wasn't just the idea,
00:19:57.120 | it was that I didn't know a lot of biology or biochemistry,
00:19:59.560 | so I certainly felt I was in the dark.
00:20:01.920 | But I kept trying, and I kept trying to learn,
00:20:04.680 | and I kept plugging, but I mean,
00:20:07.240 | a lot of it was being in the dark.
00:20:09.440 | - So the human body is complicated, right?
00:20:11.440 | We'll establish this.
00:20:12.640 | Quantum mechanics and physics is a theory
00:20:15.760 | that works incredibly well,
00:20:16.980 | but we don't really necessarily understand
00:20:18.960 | the underlying nature of it.
00:20:20.840 | So are drugs the same, in that you can,
00:20:24.120 | you're ultimately trying to show that the thing works
00:20:28.280 | to do something that you try to do,
00:20:29.880 | but you don't necessarily understand
00:20:32.640 | the fundamental mechanisms by which it's doing it?
00:20:36.000 | - It really varies.
00:20:36.880 | I think sometimes people do know them,
00:20:39.280 | because they've figured out pathways
00:20:40.960 | and ways to interfere with them.
00:20:43.020 | Other times, it is shooting in the dark.
00:20:45.360 | It really has varied.
00:20:47.360 | And sometimes people make serendipitous discoveries,
00:20:49.800 | and they don't even realize what they did.
00:20:52.820 | - So what is the discovery process for a drug?
00:20:56.800 | You said a bunch of people have tried to work with this.
00:20:59.120 | Is it a kind of mix of serendipitous discovery and art,
00:21:04.120 | or is there a systematic science
00:21:10.360 | to trying different chemical reactions
00:21:13.480 | and how they affect whatever you're trying to do,
00:21:16.920 | like shrink blood vessels?
00:21:18.880 | - Yeah, I don't think there's a single way,
00:21:20.680 | you know, a single way to go about something
00:21:23.360 | in terms of characterizing
00:21:24.700 | the entire drug discovery process.
00:21:26.440 | If I look at the blood vessel one,
00:21:28.620 | yeah, there, the first step was to have the kinds of theories
00:21:33.620 | that Dr. Folkman had.
00:21:35.440 | The second step was to have the techniques
00:21:37.560 | where you could study blood vessel growth
00:21:39.240 | for the first time,
00:21:40.440 | and at least quantitate or semi-quantitate it.
00:21:42.740 | Third step was to find substances
00:21:46.720 | that would stop blood vessels from growing.
00:21:49.680 | Fourth step was to maybe purify those substances.
00:21:52.340 | There are many other steps too.
00:21:55.800 | I mean, before you have an effective drug,
00:21:57.720 | you have to show that it's safe,
00:21:58.800 | you have to show that it's effective,
00:22:00.640 | and you start with animals,
00:22:01.880 | you ultimately go to patients,
00:22:03.600 | and there are multiple kinds of clinical trials
00:22:05.200 | you have to do.
00:22:06.400 | - If you step back, is it amazing to you
00:22:08.720 | that we descendants of great apes
00:22:11.680 | are able to create things that are,
00:22:13.520 | you know, that create drugs,
00:22:17.320 | chemicals that are able to improve
00:22:19.680 | some aspects of our bodies,
00:22:21.280 | or is it quite natural that we're able
00:22:25.480 | to discover these kinds of things?
00:22:27.680 | - Well, at a high level, it is amazing.
00:22:29.800 | I mean, evolution's amazing.
00:22:31.320 | You know, the way I look at your question,
00:22:33.560 | the fact that we have evolved the way we've done,
00:22:36.360 | I mean, it's pretty remarkable.
00:22:38.640 | - So let's talk about drug delivery.
00:22:41.120 | What are the difficult problems in drug delivery?
00:22:43.640 | What is drug delivery?
00:22:47.040 | You know, starting from your early seminal work
00:22:49.680 | in the field to today.
00:22:51.240 | - Well, drug delivery is getting a drug
00:22:54.680 | to go where you want it, at the level you want it,
00:22:58.000 | in a safe way.
00:22:59.760 | Some of the big challenges, I mean, there are a lot.
00:23:02.040 | I mean, I'd say one is, could you target the right cell?
00:23:06.560 | Like we talked about cancers,
00:23:07.800 | or some way to deliver a drug just to a cancer cell
00:23:10.640 | and no other cell.
00:23:11.960 | Another challenge is to get drugs across different barriers.
00:23:15.840 | Like, could you ever give insulin orally?
00:23:17.680 | Could you, or give it passively transdermally?
00:23:21.360 | Can you get drugs across the blood-brain barrier?
00:23:24.040 | I mean, there are lots of big challenges.
00:23:26.280 | Can you make smart drug delivery systems
00:23:29.080 | that might respond to physiologic signals in the body?
00:23:32.440 | - Oh, interesting.
00:23:33.280 | So smart, they have some kind of sense,
00:23:37.080 | a chemical sensor, or is there something more
00:23:39.320 | than a chemical sensor that's able to respond
00:23:41.320 | to something in the body?
00:23:43.160 | - Could be either one.
00:23:44.080 | I mean, you know, I mean, one example might be
00:23:46.600 | if you were diabetic, if you had more,
00:23:49.900 | got more glucose, could you get more insulin?
00:23:53.320 | But I don't, but that's just an example.
00:23:57.080 | - Is there some way to control the actual mechanism
00:23:59.520 | of delivery in response to what the body's doing?
00:24:02.300 | - Yes, there is.
00:24:03.240 | I mean, one of the things that we've done
00:24:05.000 | is encapsulate what are called beta cells.
00:24:07.560 | Those are insulin-producing cells,
00:24:09.440 | in a way that they're safe and protected.
00:24:11.840 | And then what'll happen is glucose will go in
00:24:15.280 | and, you know, cells will make insulin.
00:24:20.280 | And so that's an example.
00:24:22.120 | - So from an AI robotics perspective,
00:24:25.880 | how close are these drug delivery systems
00:24:29.240 | to something like a robot?
00:24:31.040 | Or is it totally wrong to think about them
00:24:33.620 | as intelligent agents?
00:24:35.460 | And how much room is there to add that kind of intelligence
00:24:39.580 | into these delivery systems, perhaps in the future?
00:24:42.060 | - Yeah, I think it depends
00:24:43.160 | on the particular delivery system.
00:24:45.020 | You know, of course, one of the things people
00:24:46.360 | are concerned about is cost.
00:24:47.660 | And if you add a lot of bells and whistles to something,
00:24:50.180 | it'll cost more.
00:24:51.400 | But I mean, we, for example, have made
00:24:54.200 | what I'll call intelligent microchips that can,
00:24:57.300 | you know, where you can send a signal
00:24:58.840 | and you'll release drug in response to that signal.
00:25:01.840 | And I think systems like that microchip
00:25:04.180 | someday have the potential to do what you and I
00:25:06.020 | were just talking about,
00:25:07.400 | that there could be a signal like glucose
00:25:09.500 | and it could have some instruction to say
00:25:11.620 | when there's more glucose, deliver more insulin.
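The glucose-to-insulin logic described here can be sketched as a simple closed loop. This is a toy illustration only: the thresholds, gain, and dose cap below are invented numbers, not clinical values or anything from Langer's actual devices.

```python
# Toy sketch of a signal-responsive delivery rule: read a glucose
# level and decide how much insulin to release. All numbers here are
# invented for illustration, not clinical values.

def insulin_release(glucose_mg_dl: float,
                    target: float = 100.0,
                    gain: float = 0.02,
                    max_dose: float = 5.0) -> float:
    """Return an insulin dose (arbitrary units) proportional to how far
    glucose is above target; release nothing at or below target."""
    error = glucose_mg_dl - target
    if error <= 0:
        return 0.0
    return min(gain * error, max_dose)  # cap the dose for safety

if __name__ == "__main__":
    for reading in (90, 120, 180, 400):
        print(reading, insulin_release(reading))
```

A real smart delivery system would of course involve sensing chemistry and actuation hardware; the point of the sketch is only the "more glucose, more insulin" control rule.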
00:25:14.380 | - So do you think it's possible that there could be
00:25:17.220 | robotic type systems roaming our bodies sort of long term
00:25:20.720 | and be able to deliver certain kinds of drugs in the future?
00:25:23.620 | You see that kind of future?
00:25:26.320 | - Someday, I don't think we're very close to it yet,
00:25:29.000 | but someday, you know, that's nanotechnology
00:25:31.860 | and that would mean even miniaturizing
00:25:33.740 | some of the things that I just discussed.
00:25:35.780 | And we're certainly not at that point yet,
00:25:37.820 | but someday I expect we will be.
00:25:40.140 | - So some of it is just the shrinking of the technology.
00:25:44.240 | - That's a part of it, that's one of the things.
00:25:47.300 | - In general, what role do you see AI sort of,
00:25:51.900 | there's a lot of work now with using data
00:25:55.180 | to create systems
00:25:57.220 | that make intelligent decisions.
00:25:59.080 | Do you see any of that data-driven kind of
00:26:04.540 | computing systems having a role in any part of this,
00:26:09.480 | in the delivery of drugs, the design of drugs,
00:26:13.340 | in any part of the chain?
00:26:15.220 | - I do.
00:26:16.140 | I think that AI can be useful in a number of parts
00:26:19.500 | of the chain.
00:26:20.340 | I mean, one, I think if you get a large amount
00:26:22.860 | of information, you know, say you have some chemical data
00:26:26.300 | 'cause you've done high throughput screens,
00:26:29.060 | and let's, I'll just make this up,
00:26:30.580 | but let's say I'm trying to come up with a drug
00:26:33.540 | to treat disease X, whatever that disease is,
00:26:37.820 | and I have a test for that, and hopefully a fast test,
00:26:42.820 | and let's say I test 10,000 chemical substances,
00:26:47.500 | and a couple work, most of them don't work,
00:26:49.940 | some maybe work a little, but
00:26:52.620 | with the right kind of artificial intelligence,
00:26:54.940 | maybe you could look at the chemical structures
00:26:57.180 | and look at what works and see if there's
00:26:58.740 | certain commonalities, look at what doesn't work,
00:27:01.100 | and see what commonalities there are,
00:27:03.300 | and then maybe use that somehow to predict
00:27:05.780 | the next generation of things that you would test.
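The workflow Langer describes — find commonalities among the hits and the misses, then rank the next batch — can be sketched very crudely. The descriptor vectors below are made up for illustration; a real pipeline would use chemical fingerprints and a proper model, not this nearest-centroid toy.

```python
# Minimal sketch of learning from screening results: compare each new
# candidate to the "centroid" of compounds that worked vs. those that
# didn't, and rank candidates by which centroid they sit closer to.
# Feature vectors here are invented descriptors, purely illustrative.

def centroid(vectors):
    """Average feature vector of a set of compounds."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def rank_candidates(hits, misses, candidates):
    """Score each candidate by how much closer it sits to the hit
    centroid than to the miss centroid; higher score = more promising."""
    hit_c, miss_c = centroid(hits), centroid(misses)
    scored = [(distance(c, miss_c) - distance(c, hit_c), c) for c in candidates]
    return [c for _, c in sorted(scored, reverse=True)]

if __name__ == "__main__":
    hits = [[1.0, 0.9], [0.8, 1.1]]    # compounds that worked
    misses = [[0.1, 0.2], [0.0, 0.1]]  # compounds that didn't
    candidates = [[0.9, 1.0], [0.1, 0.0]]
    print(rank_candidates(hits, misses, candidates))
```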
00:27:08.540 | - As a tangent, what are your thoughts
00:27:10.780 | on our society's relationship with pharmaceutical drugs?
00:27:14.820 | Do we, and perhaps, I apologize
00:27:17.380 | as this is a philosophical, broader question,
00:27:19.900 | but do we over-rely on them?
00:27:22.260 | Do we improperly prescribe them?
00:27:24.720 | In what ways is the system working well?
00:27:26.580 | In what way can it improve?
00:27:28.020 | - Well, I think, you know, pharmaceutical drugs
00:27:32.140 | are really important, I mean, the life expectancy
00:27:35.380 | and life quality of people over many, many years
00:27:39.300 | has increased tremendously,
00:27:40.940 | and I think that's a really good thing.
00:27:42.860 | I think one thing that would also be good
00:27:44.780 | is if we could extend that more and more
00:27:46.620 | to people in the developing world,
00:27:48.440 | which is something that our lab has been doing
00:27:50.620 | with the Gates Foundation, or trying to do.
00:27:52.980 | So I think ways in which it could improve,
00:27:56.580 | I mean, if there were some way to reduce costs,
00:27:59.820 | you know, that's certainly an issue
00:28:01.220 | people are concerned about.
00:28:02.220 | If there was some way to help people in poor countries,
00:28:05.820 | that would also be a good thing.
00:28:07.380 | And then, of course, we still need to make better drugs
00:28:11.180 | for so many diseases.
00:28:12.860 | I mean, cancer, diabetes, I mean, we, you know,
00:28:15.860 | there's heart disease and rare diseases.
00:28:18.180 | There are many, many situations where it'd be great
00:28:20.860 | if we could do better and help more people.
00:28:23.420 | - Can we talk about another exciting,
00:28:25.360 | another exciting space, which is tissue engineering?
00:28:29.540 | What is tissue engineering, or regenerative medicine?
00:28:32.380 | - Yeah, so that tissue engineering or regenerative medicine
00:28:35.440 | have to do with building an organ or tissue from scratch.
00:28:38.860 | So, you know, someday maybe we can build a liver,
00:28:42.180 | you know, or make new cartilage.
00:28:45.940 | And also would enable you to, you know,
00:28:48.260 | someday create organs on a chip,
00:28:49.940 | which people, we and others are trying to do,
00:28:52.500 | which might lead to better drug testing
00:28:54.540 | and maybe less testing on animals or people.
00:28:57.900 | - Organs on a chip, that sounds fascinating.
00:29:01.220 | So what are the various ways to generate tissue?
00:29:06.100 | And how do you do it? So, you know,
00:29:08.300 | one is, of course, from stem cells.
00:29:10.580 | Are there other methods?
00:29:11.940 | What are the different possible flavors here?
00:29:14.220 | - Yeah, well, I think, I mean, there's multiple components.
00:29:17.420 | One is having generally some type of scaffold.
00:29:19.780 | That's what Jay Vacanti and I started many, many years ago.
00:29:23.800 | And then on that scaffold,
00:29:26.180 | you might put different cell types,
00:29:28.380 | which could be a cartilage cell, a bone cell,
00:29:30.620 | could be a stem cell
00:29:31.660 | that might differentiate into different things.
00:29:33.780 | Could be more than one cell.
00:29:35.500 | - And a scaffold, sorry to interrupt,
00:29:37.540 | is kind of like a canvas that,
00:29:38.900 | it's a structure that you can,
00:29:40.680 | on which the cells can grow?
00:29:42.980 | - I think that's a good explanation
00:29:44.300 | of what you just did.
00:29:45.140 | I'll have to use that.
00:29:46.100 | The canvas, that's good.
00:29:47.860 | Yeah, so I think that that's fair.
00:29:49.620 | You know, and the chip could be such a canvas.
00:29:52.580 | Could be fibers that are made of plastics
00:29:55.220 | and that you'd put in the body someday.
00:29:57.100 | - And when you say chip, do you mean electronic chip?
00:29:59.820 | Like-- - Not necessarily.
00:30:01.140 | It could be, though.
00:30:02.180 | But it doesn't have to be.
00:30:03.220 | It could just be a structure that's not in vivo,
00:30:07.900 | so to speak, that's outside the body.
00:30:10.940 | - So is there-- - Canvas is not a bad word.
00:30:13.280 | - So is there a possibility to weave into this canvas
00:30:19.660 | a computational component?
00:30:22.180 | So if we talk about electronic chips,
00:30:23.900 | some ability to sense, control some aspect
00:30:28.700 | of this growth process for the tissue?
00:30:31.380 | - I would say the answer to that is yes.
00:30:33.380 | I think right now people are working mostly
00:30:36.820 | on validating these kinds of chips for saying,
00:30:40.700 | well, it does work as effectively,
00:30:43.600 | hopefully, as just putting something in the body.
00:30:47.260 | But I think someday what you suggested,
00:30:49.420 | it certainly would be possible.
00:30:51.300 | - So what kind of tissues can we engineer today?
00:30:53.640 | What kind of tissues?
00:30:55.220 | - Well, so skin's already been made and approved by the FDA.
00:30:58.820 | There are advanced clinical trials,
00:31:00.900 | like what are called phase three trials,
00:31:03.060 | that are at complete or near completion
00:31:05.460 | for making new blood vessels.
00:31:08.420 | One of my former students, Laura Niklason,
00:31:10.500 | led a lot of that.
00:31:11.640 | - So that's amazing.
00:31:14.180 | So human skin can be grown.
00:31:16.260 | That's already approved through the entire FDA process.
00:31:20.500 | So that means,
00:31:21.620 | one, that means you can grow that tissue
00:31:27.960 | and do various kinds of experiments
00:31:30.080 | in terms of drugs and so on.
00:31:34.200 | But does that mean
00:31:35.760 | some kind of healing and treatment
00:31:38.040 | of different conditions in human beings?
00:31:41.240 | - Yes, I mean, they've been approved now for,
00:31:43.560 | I mean, different groups have made them,
00:31:45.200 | different companies and different professors.
00:31:47.640 | But they've been approved for burn victims
00:31:50.620 | and for patients with diabetic skin ulcers.
00:31:53.260 | - That's amazing.
00:31:54.100 | Okay, so skin, what else?
00:31:59.540 | - Well, at different stages,
00:32:01.940 | people are, like skin, blood vessels,
00:32:05.260 | there's clinical trials going now
00:32:07.100 | for helping patients hear better,
00:32:09.220 | for patients that might be paralyzed,
00:32:12.340 | for patients that have different eye problems.
00:32:15.500 | I mean, and different groups have worked on
00:32:18.400 | just about everything, new liver, new kidneys.
00:32:20.840 | I mean, there've been all kinds of work done in this area.
00:32:24.480 | Some of it's early,
00:32:25.400 | but there's certainly a lot of activity.
00:32:27.680 | - What about neural tissue?
00:32:29.280 | - Yeah.
00:32:31.160 | - In the nervous system and even the brain?
00:32:34.160 | - Well, there've been people that have worked on that too.
00:32:36.200 | We've done a little bit with that,
00:32:37.440 | but there are people who've done a lot on neural stem cells.
00:32:40.360 | And I know Evan Snyder,
00:32:41.860 | who's been one of our collaborators
00:32:43.260 | on some of our spinal cord work,
00:32:45.160 | has done work like that.
00:32:46.200 | And there've been other people as well.
00:32:48.060 | - Are there challenges,
00:32:51.120 | when it is part of the human body,
00:32:52.800 | are there challenges to getting the body
00:32:55.160 | to accept this new tissue that's being generated?
00:32:58.040 | How do you solve that kind of challenge?
00:33:00.120 | - There can be problems with accepting it.
00:33:02.760 | I think maybe in particular,
00:33:04.560 | you might mean rejection by the body.
00:33:07.160 | So there are multiple ways
00:33:08.640 | that people are trying to deal with that.
00:33:10.000 | One way is, which is what we've done
00:33:13.340 | and with Dan Anderson,
00:33:14.700 | who was one of my former postdocs,
00:33:16.060 | and I mentioned this a little bit before,
00:33:17.540 | for a pancreas, is encapsulating the cells.
00:33:20.860 | So immune cells or antibodies can't get in and attack them.
00:33:25.860 | So that's a way to protect them.
00:33:28.900 | Other strategies could be making the cells non-immunogenic,
00:33:34.260 | which might be done by different either techniques,
00:33:36.980 | which might mask them,
00:33:38.520 | or using some gene editing approaches.
00:33:40.760 | So there are different ways
00:33:41.800 | that people are trying to do that.
00:33:43.600 | And of course, if you use the patient's own cells
00:33:45.560 | or cells from a close relative,
00:33:48.040 | that might be another way.
00:33:50.240 | - It increases the likelihood that it'll get accepted
00:33:52.720 | if you use the patient's own cells.
00:33:54.460 | - Yes.
00:33:55.320 | And then finally, there's immunosuppressive drugs,
00:33:57.700 | which will suppress the immune response.
00:34:00.320 | That's right now what's done, say, for a liver transplant.
00:34:03.960 | - The fact that this whole thing works is fascinating,
00:34:06.240 | at least from my outside perspective.
00:34:09.880 | Will we one day be able to regenerate any organ
00:34:13.220 | or part of the human body, in your view?
00:34:16.880 | I mean, it's exciting to think about future possibilities
00:34:19.400 | of tissue engineering.
00:34:20.700 | Do you see some tissues more difficult than others?
00:34:25.100 | What are the possibilities here?
00:34:27.000 | - Yeah, well, of course, I'm an optimist,
00:34:29.160 | and I also feel a timeframe,
00:34:30.760 | if we're talking about someday,
00:34:32.320 | someday could be hundreds of years.
00:34:33.860 | But I think that, yes, someday,
00:34:36.080 | I think we will be able to regenerate many things.
00:34:39.280 | And there are different strategies that one might use.
00:34:41.840 | One might use some cells themselves.
00:34:44.840 | One might use some molecules
00:34:47.600 | that might help regenerate the cells.
00:34:49.680 | And so I think there are different possibilities.
00:34:51.920 | - What do you think that means for longevity?
00:34:54.740 | If we look, maybe not someday, but 10, 20 years out,
00:34:59.740 | the possibilities of tissue engineering,
00:35:01.840 | the possibilities of the research that you're doing,
00:35:04.440 | does it have a significant impact
00:35:06.800 | on the longevity, human life?
00:35:10.040 | - I don't know that we'll see
00:35:11.120 | a radical increase in longevity,
00:35:12.800 | but I think that in certain areas,
00:35:15.920 | we'll see people live better lives
00:35:19.400 | and maybe somewhat longer lives.
00:35:21.720 | - What's the most beautiful scientific idea
00:35:25.720 | in bioengineering that you've come across
00:35:28.220 | in your years of research?
00:35:29.800 | I apologize for the romantic notion.
00:35:33.480 | - No, that's an interesting question.
00:35:35.440 | I certainly think what's happening right now
00:35:37.880 | with CRISPR is a beautiful idea.
00:35:39.580 | That certainly wasn't my idea.
00:35:42.080 | I mean, but I think it's very interesting here.
00:35:46.000 | What people have capitalized on is that there's a mechanism
00:35:51.000 | by which bacteria are able to destroy viruses.
00:35:54.780 | And that understanding that leads to machinery
00:35:58.520 | to sort of cut and paste genes and fix a cell.
00:36:03.520 | - So that kind of, do you see a promise
00:36:09.400 | for that kind of ability to copy and paste?
00:36:13.720 | I mean, like we said, the human body's complicated.
00:36:16.880 | Is that, that seems exceptionally difficult to do.
00:36:21.880 | - I think it is exceptionally difficult to do,
00:36:25.240 | but that doesn't mean that it won't be done.
00:36:27.920 | There's a lot of companies and people trying to do it.
00:36:30.440 | And I think in some areas it will be done.
00:36:32.480 | Some of the ways that you might lower the bar
00:36:36.320 | are, you know,
00:36:38.280 | not necessarily doing it directly,
00:36:40.920 | but you know, you could take a cell that might be useful,
00:36:45.760 | but you want to give it some cancer killing capabilities,
00:36:48.720 | something like what's called a CAR T cell.
00:36:50.960 | And that might be a different way
00:36:52.360 | of somehow making a CAR T cell and maybe making it better.
00:36:56.260 | So there might be sort of easier things
00:36:58.480 | rather than just fixing the whole body.
00:37:01.520 | So the way a lot of things have moved
00:37:03.480 | with medicine over time is stepwise.
00:37:06.140 | So I can see things that might be easier to do
00:37:10.240 | than say fix a brain.
00:37:11.880 | That would be very hard to do,
00:37:13.960 | but maybe someday that'll happen too.
00:37:16.440 | - So in terms of stepwise, that's an interesting notion.
00:37:19.260 | Do you see that if you look at medicine or bioengineering,
00:37:25.200 | do you see that there is these big leaps
00:37:29.140 | that happen every decade or so or some distant period,
00:37:33.460 | or is it a lot of incremental work?
00:37:36.520 | Not, I don't mean to reduce its impact
00:37:39.020 | by saying it's incremental,
00:37:40.060 | but is there sort of phase shifts
00:37:44.740 | in the science, big leaps?
00:37:48.220 | - I think there's both.
00:37:49.380 | You know, every so often a new technique
00:37:51.260 | or a new technology comes out.
00:37:54.300 | I mean, genetic engineering was an example.
00:37:56.680 | I mentioned CRISPR.
00:37:58.340 | You know, I think every so often things happen
00:38:01.280 | that make a big difference,
00:38:03.360 | but still, to try to really make progress,
00:38:07.580 | make a new drug, make a new device.
00:38:09.800 | There's a lot of things.
00:38:11.120 | I don't know if I'd call them incremental,
00:38:12.720 | but there's a lot, a lot of work that needs to be done.
00:38:15.760 | - Absolutely.
00:38:16.960 | So you have over, numbers could be off,
00:38:20.920 | but it's a big amount.
00:38:22.040 | You have over 1,100 current or pending patents
00:38:25.360 | that have been licensed
00:38:26.400 | or sub-licensed to over 300 companies.
00:38:28.740 | What's your view?
00:38:31.160 | What in your view are the strengths
00:38:33.960 | and what are the drawbacks of the patenting process?
00:38:36.660 | - Well, I think for the most part, they're strengths.
00:38:40.000 | I think that if you didn't have patents,
00:38:43.040 | especially in medicine,
00:38:44.380 | you'd never get the funding that it takes
00:38:46.760 | to make a new drug or a new device.
00:38:48.460 | I mean, which according to Tufts,
00:38:50.240 | to make a new drug costs over $2 billion right now.
00:38:53.520 | And nobody would even come close to giving you that money,
00:38:56.620 | any of that money, if it weren't for the patent system
00:39:01.620 | because then anybody else could do it.
00:39:03.940 | That then leads to the negative though.
00:39:09.840 | Sometimes somebody does have a very successful drug
00:39:13.440 | and you certainly wanna try to make it available
00:39:15.980 | to everybody.
00:39:17.400 | And so the patent system allows it to happen
00:39:22.400 | in the first place,
00:39:24.080 | but maybe it'll impede it after a little bit
00:39:26.680 | or certainly to some people or to some companies,
00:39:29.960 | once it is out there.
00:39:32.880 | - What's the, on the point of the cost,
00:39:35.960 | what would you say is the most expensive part
00:39:39.160 | of the $2 billion of making the drug?
00:39:42.080 | - Human clinical trials.
00:39:44.060 | That is by far the most expensive.
00:39:45.560 | - In terms of money or pain or both?
00:39:48.760 | - Well, money, but as far as pain goes, it's hard to know.
00:39:51.840 | I mean, but usually proving things that are,
00:39:55.480 | proving that something new is safe and effective in people
00:39:58.960 | is almost always the biggest expense.
00:40:01.520 | - Could you linger on that for just a little longer
00:40:03.600 | and describe what it takes to prove,
00:40:07.240 | for people that don't know in general,
00:40:10.600 | what it takes to prove that something is effective
00:40:12.460 | on humans?
00:40:13.900 | - Well, you'd have to take a particular disease,
00:40:18.540 | but the process is you start out with,
00:40:21.320 | usually you start out with cells,
00:40:22.840 | then you'd go to animal models.
00:40:24.260 | Usually you have to do a couple animal models.
00:40:26.640 | And of course the animal models aren't perfect for humans.
00:40:29.520 | And then you have to do three sets of clinical trials
00:40:31.640 | at a minimum, a phase one trial to show that it's safe
00:40:36.440 | in a small number of patients,
00:40:36.440 | phase two trial to show that it's effective
00:40:38.200 | in a small number of patients,
00:40:39.920 | and a phase three trial to show that it's safe
00:40:42.020 | and effective in a large number of patients.
00:40:45.200 | And that could end up being hundreds
00:40:47.560 | or thousands of patients,
00:40:49.020 | and they have to be really carefully controlled studies.
00:40:53.320 | And you'd have to manufacture the drug.
00:40:55.720 | You'd have to really watch those patients.
00:40:59.040 | You have to be very concerned that it is gonna be safe.
00:41:03.320 | And then you look and see,
00:41:05.880 | does it treat the disease better
00:41:07.680 | than whatever the gold standard was before that?
00:41:11.080 | If there was, assuming there was one.
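The three-phase sequence Langer walks through can be summarized as data. The patient counts below are rough typical ranges drawn from his description, not regulatory definitions.

```python
# Summary of the clinical trial phases described above, as plain data.
# Patient counts are rough, typical ranges, not regulatory definitions.

TRIAL_PHASES = [
    {"phase": 1, "goal": "safety", "typical_patients": "tens"},
    {"phase": 2, "goal": "effectiveness (preliminary)",
     "typical_patients": "tens to hundreds"},
    {"phase": 3, "goal": "safety and effectiveness",
     "typical_patients": "hundreds to thousands"},
]

def describe(phase: dict) -> str:
    """Render one phase as a readable sentence."""
    return (f"Phase {phase['phase']}: show {phase['goal']} "
            f"in {phase['typical_patients']} of patients")

for p in TRIAL_PHASES:
    print(describe(p))
```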
00:41:12.980 | - That's a really interesting line.
00:41:14.620 | Show that it's safe first and then that it's effective.
00:41:18.180 | - First do no harm.
00:41:19.420 | - First do no harm, that's right.
00:41:21.580 | So how, again, if you can linger on it a little bit,
00:41:26.340 | how does the patenting process work?
00:41:28.480 | - Yeah, well, you do a certain amount of research.
00:41:32.700 | Though that doesn't necessarily have to be the case,
00:41:35.500 | but for us, usually it is.
00:41:37.020 | Usually we do a certain amount of research
00:41:40.620 | and make some findings.
00:41:42.060 | And we had a hypothesis, let's say we prove it,
00:41:46.580 | or we make some discovery, we invent some technique.
00:41:49.780 | And then we write something up, what's called a disclosure.
00:41:53.140 | We give it to MIT's Technology Transfer Office.
00:41:55.820 | They then give it to some patent attorneys,
00:41:58.040 | and they use that, and plus talking to us,
00:42:01.100 | and work on writing a patent.
00:42:03.940 | And then you go back and forth with the USPTO,
00:42:07.900 | that's the United States Patent and Trademark Office,
00:42:10.100 | and they may not allow it the first, second, or third time,
00:42:14.540 | but they will tell you why they don't.
00:42:16.500 | And you may adjust it, and maybe you'll eventually get it,
00:42:20.060 | and maybe you won't.
00:42:21.540 | - So you've been part of launching 40 companies,
00:42:24.980 | together worth, again, numbers could be outdated,
00:42:28.520 | but an estimated $23 billion.
00:42:30.900 | You've described your thoughts on a formula
00:42:35.740 | for startup success, so perhaps you can describe
00:42:38.340 | that formula, and in general, describe what does it take
00:42:41.560 | to build a successful startup?
00:42:43.280 | - Well, I'd break that down into a couple categories,
00:42:46.540 | and I'm a scientist, and certainly,
00:42:48.660 | from the science standpoint, I'll go over that.
00:42:50.700 | But I actually think that, really, the most important thing
00:42:54.180 | is probably the business people that I work with.
00:42:57.660 | And when I look back at the companies that have done well,
00:43:01.580 | it's been because we've had great business people,
00:43:03.860 | and when they haven't done as well,
00:43:05.220 | we haven't had as good business people.
00:43:06.960 | But from a science standpoint, I think about
00:43:09.700 | that we've made some kind of discovery
00:43:11.540 | that is almost what I'd call a platform,
00:43:15.340 | that you could use it for different things.
00:43:17.820 | And certainly, the drug delivery system example
00:43:20.500 | that I gave earlier is a good example of that.
00:43:22.620 | You could use it for drug A, B, C, D, E, and so forth.
00:43:25.620 | And I'd like to think that we've taken it far enough
00:43:30.380 | so that we've written at least one really good paper
00:43:33.260 | in a top journal, hopefully a number,
00:43:36.300 | that we've reduced it to practice in animal models,
00:43:39.420 | that we've filed patents, maybe had issued patents
00:43:45.420 | that have what I'll call very good and broad claims.
00:43:48.380 | That's sort of the key on a patent.
00:43:50.940 | And then in our case, a lot of times, when we've done it,
00:43:54.240 | a lot of times, it's somebody in the lab,
00:43:57.860 | like a post-doc or graduate student,
00:43:59.500 | that spent a big part of their life doing it,
00:44:01.780 | and that they want to work at that company
00:44:03.820 | 'cause they have this passion
00:44:04.860 | and they want to see something they did
00:44:06.620 | make a difference in people's lives.
00:44:08.420 | - Maybe you can mention the business component.
00:44:12.980 | It's funny to hear great scientists say
00:44:15.300 | that there's value to business folks.
00:44:17.940 | - Oh yeah, well-- - It's not always said.
00:44:20.420 | So what value, what business instinct is valuable
00:44:25.220 | to make a startup successful, a company successful?
00:44:29.180 | - I think the business aspects are,
00:44:32.180 | you have to be a good judge of people
00:44:35.780 | so that you hire the right people.
00:44:37.740 | You have to be strategic so you figure out,
00:44:40.520 | if you do, of that platform that could be used
00:44:42.500 | for all these different things.
00:44:43.940 | Which one? And knowing that medical research
00:44:46.540 | is so expensive, what thing are you gonna do first,
00:44:48.900 | second, third, fourth, and fifth?
00:44:52.020 | I think you need to have a good,
00:44:53.700 | what I'll call FDA regulatory clinical trial strategy.
00:44:57.260 | I think you have to be able to raise money, credibly.
00:45:01.620 | So there are a lot of things.
00:45:02.780 | You have to be good with people, good manager of people.
00:45:05.300 | - So the money and the people part I get,
00:45:08.500 | but the stuff before, in terms of deciding the A, B, C, D,
00:45:13.080 | if you have a platform, which drugs to first take to testing,
00:45:16.700 | you see, nevertheless, scientists as not
00:45:19.940 | always being too good at that process.
00:45:22.620 | - Well, I think they're a part of the process,
00:45:24.300 | but I'd say there's probably, I'm gonna just make this up,
00:45:28.220 | but maybe six or seven criteria that you wanna use,
00:45:31.820 | and it's not just science.
00:45:33.340 | I mean, the kinds of things that I would think about
00:45:35.220 | is, is the market big or small?
00:45:37.900 | Is the, are there good animal models for it
00:45:41.240 | so that you could test it and it wouldn't take 50 years?
00:45:44.320 | Are the clinical trials that could be set up ones
00:45:48.860 | that have clear endpoints where you can make a judgment?
00:45:54.140 | And another issue would be competition.
00:45:58.860 | Are there other ways that some companies out there
00:46:00.780 | are doing it?
00:46:01.620 | Another issue would be reimbursement.
00:46:04.820 | You know, can it get reimbursed?
00:46:07.500 | So a lot of things that you have,
00:46:08.980 | manufacturing issues you'd wanna consider.
00:46:11.820 | So I think there are really a lot of things that go into
00:46:14.820 | whether you, what you do first, second, third, or fourth.
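The "six or seven criteria" Langer lists can be pictured as a weighted scoring exercise for deciding which application of a platform to pursue first. The criteria names come from the conversation; the weights and example scores below are invented purely for illustration.

```python
# Hedged sketch: rank candidate programs with a weighted sum over the
# criteria mentioned above. Weights and example scores are invented.

CRITERIA_WEIGHTS = {
    "market_size": 0.25,
    "animal_models": 0.15,    # are there good, fast animal models?
    "clear_endpoints": 0.20,  # can trials have clear endpoints?
    "competition": 0.15,      # less crowded space scores higher
    "reimbursement": 0.15,    # can it get reimbursed?
    "manufacturing": 0.10,    # manufacturing feasibility
}

def priority_score(scores: dict) -> float:
    """Weighted sum of 0-10 scores, one per criterion."""
    return sum(CRITERIA_WEIGHTS[k] * scores[k] for k in CRITERIA_WEIGHTS)

def prioritize(programs: dict) -> list:
    """Return program names ordered from highest to lowest score."""
    return sorted(programs,
                  key=lambda name: priority_score(programs[name]),
                  reverse=True)

if __name__ == "__main__":
    programs = {
        "drug_A": {"market_size": 9, "animal_models": 7,
                   "clear_endpoints": 8, "competition": 4,
                   "reimbursement": 7, "manufacturing": 6},
        "drug_B": {"market_size": 5, "animal_models": 9,
                   "clear_endpoints": 9, "competition": 8,
                   "reimbursement": 6, "manufacturing": 8},
    }
    print(prioritize(programs))
```

In practice, as Langer notes, such judgments are made by scientists and business people together rather than by any formula; the sketch just makes the multi-criteria nature of the decision concrete.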
00:46:17.620 | - So you lead one of the largest academic labs in the world
00:46:23.060 | with over $10 million in annual grants
00:46:26.980 | and over 100 researchers,
00:46:28.460 | probably over a thousand since the lab's beginning.
00:46:31.220 | Researchers can be individualistic and eccentric.
00:46:37.060 | How do I put it nicely?
00:46:38.540 | There you go, eccentric.
00:46:39.940 | So what insights into research leadership can you give
00:46:43.100 | having to run such a successful lab
00:46:45.300 | with so much diverse talent?
00:46:47.280 | - Well, I don't know that I'm any expert.
00:46:50.740 | I think that what you do to me,
00:46:53.460 | I mean, I just want,
00:46:54.860 | I mean, this is gonna sound very simplistic,
00:46:56.340 | but I just want people in the lab to be happy,
00:46:58.900 | to be doing things that I hope will
00:47:00.900 | make the world a better place,
00:47:02.700 | to be working on science that can make the world
00:47:05.500 | a better place.
00:47:06.340 | And I guess my feeling is if we're able to do that,
00:47:09.780 | you know, it kind of runs itself.
00:47:13.900 | - So how do you make a researcher happy in general?
00:47:17.340 | - I think when people feel,
00:47:19.220 | I mean, this is gonna sound like, again,
00:47:21.420 | simplistic or maybe like motherhood and apple pie,
00:47:23.560 | but I think if people feel they're working
00:47:26.140 | on something really important
00:47:27.300 | that can affect many other people's lives
00:47:30.180 | and they're making some progress,
00:47:32.620 | they'll feel good about it.
00:47:34.100 | They'll feel good about themselves and they'll be happy.
00:47:37.260 | - But through brainstorming and so on,
00:47:39.820 | what's your role and how difficult it is as a group
00:47:43.940 | in this collaboration to arrive at these big questions?
00:47:49.420 | That might have impact.
00:47:51.100 | - Well, the big questions come from many different ways.
00:47:54.620 | Sometimes it's trying to, things that I might think of
00:47:57.420 | or somebody in the lab might think of,
00:47:59.500 | which could be a new technique
00:48:00.900 | or to understand something better.
00:48:02.900 | But gee, we've had people like Bill Gates
00:48:05.580 | and the Gates Foundation come to us
00:48:07.180 | and Juvenile Diabetes Foundation come to us and say,
00:48:10.100 | gee, could you help us on these things?
00:48:11.740 | And I mean, that's good too.
00:48:13.580 | It doesn't happen just one way.
00:48:16.900 | - And I mean, you've kind of mentioned it, happiness,
00:48:20.820 | but is there something more,
00:48:24.740 | how do you inspire a researcher
00:48:26.440 | to do the best work of their life?
00:48:28.260 | So you mentioned passion and passion is a kind of fire.
00:48:32.660 | Do you see yourself having a role to keep that fire going,
00:48:35.980 | to build it up, to inspire the researchers
00:48:39.900 | through the pretty difficult process of going from idea
00:48:44.900 | to big question to big answer?
00:48:49.140 | - I think so.
00:48:50.940 | I think I try to do that by talking to people,
00:48:54.920 | going over their ideas and their progress.
00:48:59.020 | I try to do it as an individual.
00:49:02.380 | Certainly when I talk about my own career,
00:49:04.100 | I had my setbacks at different times
00:49:06.940 | and people know that, that know me.
00:49:09.140 | And you just try to keep pushing and so forth.
00:49:14.460 | But yeah, I think I try to do that
00:49:17.900 | as the one who leads the lab.
00:49:20.940 | - So you have this exceptionally successful lab
00:49:23.300 | and one of the great institutions in the world, MIT.
00:49:27.360 | And yet sort of at least in my neck of the woods
00:49:32.680 | in computer science and artificial intelligence,
00:49:35.220 | a lot of the research is kind of,
00:49:39.660 | a lot of the great researchers, not everyone,
00:49:43.060 | but some are kind of going to industry.
00:49:46.940 | A lot of the research is moving to industry.
00:49:49.700 | What do you think about the future of science in general?
00:49:52.600 | Is there drawbacks?
00:49:54.440 | Is there strength to the academic environment
00:49:57.060 | that you hope will persist?
00:49:59.940 | How does it need to change?
00:50:02.140 | What needs to stay the same?
00:50:04.220 | What are your thoughts on this whole landscape of science
00:50:06.500 | and its future?
00:50:08.020 | - Well, first I think going into industry is good,
00:50:10.380 | but I think being in academia is good.
00:50:12.840 | I have lots of students who've done both
00:50:15.220 | and they've had great careers doing both.
00:50:18.500 | I think from an academic standpoint,
00:50:21.260 | I mean the biggest concern probably that people feel today
00:50:25.260 | at a place like MIT or other research heavy institutions
00:50:28.980 | is gonna be funding and particular funding
00:50:31.300 | that's not super directed,
00:50:34.420 | so that you can do basic research.
00:50:37.100 | I think that's probably the number one thing,
00:50:39.380 | but it would be great if we as a society
00:50:43.400 | could come up with better ways to teach,
00:50:45.920 | so that people all over could learn better.
00:50:49.200 | So I think there are a number of things
00:50:51.820 | that would be good to be able to do better.
00:50:53.960 | - So again, you're very successful in terms of funding,
00:50:58.900 | but do you still feel the pressure of that,
00:51:01.980 | of having to seek funding?
00:51:04.640 | Does it affect the science
00:51:06.000 | or can you simply focus on doing the best work of your life
00:51:11.000 | and the funding comes along with that?
00:51:14.020 | - I'd say the last 10 or 15 years,
00:51:16.240 | we've done pretty well funding,
00:51:18.460 | but I always worry about it.
00:51:19.780 | You know, it's like you're still operating
00:51:23.100 | on more soft money than hard,
00:51:25.300 | and so I always worry about it,
00:51:27.780 | but we've been fortunate that places have come to us,
00:51:32.780 | like the Gates Foundation and others,
00:51:34.900 | Juvenile Diabetes Foundation, some companies,
00:51:37.580 | and they're willing to give us funding,
00:51:39.780 | and we've gotten government money as well.
00:51:42.220 | We have a number of NIH grants, and I've always had that,
00:51:44.740 | and that's important to me too.
00:51:46.540 | So I worry about it,
00:51:50.300 | but I just view that as a part of the process.
00:51:53.760 | - Now, if you put yourself in the shoes of a philanthropist,
00:51:57.460 | like say I gave you $100 billion right now,
00:52:02.180 | but you couldn't spend it on your own research.
00:52:04.740 | So how hard is it to decide which labs to invest in,
00:52:10.580 | which ideas, which problems, which solutions?
00:52:16.460 | You know, 'cause funding is so much,
00:52:19.100 | such an important part of progression of science
00:52:22.740 | in today's society.
00:52:24.580 | So if you put yourself in the shoes of a philanthropist,
00:52:26.700 | how hard is that problem?
00:52:27.540 | How would you go about solving it?
00:52:29.300 | - Sure, well, I think what I do,
00:52:31.460 | the first thing is different philanthropists
00:52:33.020 | have different visions,
00:52:34.740 | and I think the first thing is to form a concrete vision
00:52:37.620 | of what you want.
00:52:38.460 | Some people, I mean, I'll just give you two examples
00:52:41.100 | of people that I know.
00:52:44.340 | David Koch was very interested in cancer research,
00:52:47.500 | and part of that was that he had cancer,
00:52:49.580 | and prostate cancer,
00:52:51.620 | and a number of people do that along those lines.
00:52:55.420 | They've had somebody, they've either had cancer themselves
00:52:57.980 | or somebody they loved had cancer,
00:53:00.000 | and they wanna put money into cancer research.
00:53:02.700 | Bill Gates, on the other hand,
00:53:04.140 | I think when he had got his fortune,
00:53:06.500 | I mean, he thought about it and felt,
00:53:08.300 | well, how could he have the greatest impact?
00:53:10.180 | And he thought about helping people in the developing world,
00:53:13.500 | and medicines, and different things like that,
00:53:17.260 | like vaccines that might be really helpful
00:53:19.580 | for people in the developing world.
00:53:21.100 | And so I think first you start out with that vision.
00:53:25.700 | Once you start out with that vision,
00:53:28.260 | whatever vision it is,
00:53:29.460 | then I think you try to ask the question,
00:53:33.780 | who in the world does the best work,
00:53:36.540 | if that was your goal.
00:53:38.360 | I mean, but you really, I think,
00:53:39.820 | have to have a defined vision.
00:53:41.020 | - Vision first.
00:53:41.860 | - Yeah, and I think that's what people do.
00:53:45.080 | I mean, I have never seen anybody do it otherwise.
00:53:48.340 | I mean, and that, by the way,
00:53:49.740 | may not be the best thing overall.
00:53:53.300 | I mean, I think it's good that all those things happen,
00:53:55.580 | but what you really wanna do,
00:53:57.780 | and I'll make a contrast in a second,
00:54:00.420 | in addition to funding important areas
00:54:02.660 | like what both of those people did,
00:54:05.100 | is to help young people.
00:54:07.740 | And they may be at odds with each other
00:54:10.120 | because, for example, a lab like ours,
00:54:13.100 | which is, you know, I'm older,
00:54:14.700 | is, you know, might be very good
00:54:16.500 | at addressing some of those kinds of problems,
00:54:19.240 | but, you know, I'm not young.
00:54:20.660 | I train a lot of people who are young,
00:54:22.700 | but it's not the same as helping somebody
00:54:24.460 | who's an assistant professor someplace.
00:54:26.460 | So I think what's, I think, been good about our thing,
00:54:30.940 | our society or things overall
00:54:33.260 | are that there are people who come at it
00:54:35.040 | from different ways.
00:54:36.420 | And the combination, the confluence
00:54:38.580 | of the government funding,
00:54:40.380 | the certain foundations that fund things
00:54:43.860 | and other foundations that, you know,
00:54:46.620 | wanna see disease treated,
00:54:48.260 | well, then they can go seek out people
00:54:51.500 | or they can put a request for proposals
00:54:53.220 | and see who does the best.
00:54:54.700 | You know, I'd say both David Koch
00:54:56.980 | and Bill Gates did exactly that.
00:54:58.980 | They sought out people, most of them,
00:55:01.460 | you know, or their foundations that they were involved in
00:55:04.180 | sought out people like myself,
00:55:05.840 | but they also had requests for proposals.
00:55:10.220 | - You mentioned young people,
00:55:12.700 | and that reminds me of something you said
00:55:14.140 | in an interview or wrote somewhere
00:55:17.720 | that said some of your initial struggles
00:55:21.820 | in terms of finding a faculty position or so on
00:55:26.540 | that you didn't quite, for people,
00:55:30.100 | fit into a particular bucket, a particular--
00:55:32.980 | - Right.
00:55:33.820 | - Can you speak to that?
00:55:36.320 | How, do you see limitations to the academic system
00:55:41.740 | that it does have such buckets?
00:55:43.640 | Is there, how can we allow for people
00:55:51.620 | who are brilliant but outside the disciplines
00:55:56.420 | of the previous decade?
00:55:59.620 | - Yeah, well, I think that's a great question.
00:56:01.340 | I think that, I think the departments have
00:56:03.860 | to have a vision, you know, and some of them do.
00:56:07.080 | Every so often, you know, there are institutes
00:56:11.060 | or labs that do that.
00:56:13.420 | I mean, at MIT, I think that's done sometimes.
00:56:17.500 | I know mechanical engineering department just had a search
00:56:21.260 | and they hired Gio Traverso, who was one of my,
00:56:25.460 | he was a fellow with me, but he's actually
00:56:28.220 | a molecular biologist and a gastroenterologist,
00:56:32.020 | and, you know, he's one of the best in the world,
00:56:34.180 | but he's also done some great mechanical engineering
00:56:37.220 | and designing some new pills and things like that,
00:56:39.820 | and they picked him, and boy, I give them a lot of credit.
00:56:43.900 | I mean, that's vision to pick somebody,
00:56:46.900 | and I think, you know, they'll be the richer for it.
00:56:49.880 | I think the media lab has certainly hired, you know,
00:56:52.120 | people like Ed Boyden and others who have done,
00:56:55.420 | you know, very different things, and so I think that,
00:56:58.540 | you know, that's part of the vision of the leadership
00:57:01.300 | who do things like that.
00:57:03.540 | - Do you think one day, you've mentioned David Koch
00:57:06.700 | and cancer, do you think one day we'll cure cancer?
00:57:10.520 | - Yeah, I mean, of course, one day,
00:57:12.260 | I don't know how long that day will come.
00:57:14.260 | - Soon.
00:57:15.100 | - But, yeah, soon, no, but I think--
00:57:17.540 | - So you think it is a grand challenge?
00:57:19.220 | - It is a grand challenge, it's not just solvable
00:57:21.540 | within a few years.
00:57:23.060 | - I don't think very many things are solvable
00:57:25.060 | in a few years.
00:57:25.980 | There's some good ideas that people are working on,
00:57:28.620 | but I mean, all cancers, that's pretty tough.
00:57:31.140 | - If we do get the cure, what will the cure look like?
00:57:35.020 | Do you think which mechanisms, which disciplines
00:57:38.380 | will help us arrive at that cure,
00:57:40.260 | from all the amazing work you've done
00:57:42.740 | that has touched on cancer?
00:57:44.020 | - No, I think it'll be a combination
00:57:45.560 | of biology and engineering.
00:57:46.860 | I think it'll be biology to understand
00:57:50.420 | the right genetic mechanisms to solve this problem,
00:57:54.020 | and maybe the right immunological mechanisms,
00:57:56.380 | and engineering in the sense of producing the molecules,
00:58:00.380 | developing the right delivery systems,
00:58:02.180 | targeting it, or whatever else needs to be done.
00:58:04.580 | - Well, that's a beautiful vision for engineering.
00:58:08.940 | So on a lighter topic, I've read that you love chocolate,
00:58:11.980 | and mentioned two places, Ben and Bill's Chocolate Emporium,
00:58:16.840 | and the chocolate cookies, the Soho Globs
00:58:20.300 | from Rosie's Bakery in Chestnut Hill.
00:58:22.660 | I went to their website, and I was trying to finish
00:58:25.780 | a paper last night, and there's a deadline today,
00:58:28.300 | and yet I was wasting way too much time at 3 a.m.
00:58:32.540 | instead of writing the paper,
00:58:34.240 | staring at the Rosie's Bakery cookies,
00:58:36.420 | which just look incredible.
00:58:38.380 | The Soho Globs just look incredible.
00:58:40.700 | But for me, oatmeal white raisin cookies won my heart,
00:58:44.740 | just from the pictures.
00:58:46.500 | Do you think one day we'll be able to engineer
00:58:49.060 | the perfect cookie with the help of chemistry,
00:58:52.940 | and maybe a bit of data-driven artificial intelligence,
00:58:55.460 | or is cookies something that's more art than engineering?
00:59:00.460 | - I think they're some of both.
00:59:03.640 | I think engineering will probably help someday.
00:59:06.100 | - What about chocolate?
00:59:08.640 | - Same thing, same thing.
00:59:09.940 | You have to go to see some of David Edwards' stuff.
00:59:12.180 | You know, he was one of my postdocs,
00:59:14.740 | and he's a professor at Harvard,
00:59:15.880 | but he also started Cafe ArtScience,
00:59:18.540 | and it's just a really cool restaurant around here.
00:59:22.460 | But he also has companies that do ways
00:59:26.980 | of looking at fragrances and trying
00:59:29.460 | to use engineering in new ways.
00:59:31.540 | And so I think, I mean, that's just an example,
00:59:34.300 | but I expect someday that AI and engineering
00:59:38.220 | will play a role in almost everything.
00:59:40.900 | - Including creating the perfect cookie.
00:59:42.620 | - Yes.
00:59:43.460 | - Well, I dream of that day as well.
00:59:45.260 | So when you look back at your life,
00:59:47.760 | having accomplished an incredible amount
00:59:49.380 | of positive impact on the world
00:59:51.500 | through science and engineering,
00:59:53.620 | what are you most proud of?
00:59:56.320 | - My students.
00:59:57.160 | You know, I mean, I really feel when I look at that,
00:59:59.160 | I mean, we've probably had close to 1,000 students
01:00:02.300 | go through the lab, and I mean,
01:00:04.340 | they've done incredibly well.
01:00:06.740 | I think 18 are in the National Academy of Engineering,
01:00:09.720 | 16 in the National Academy of Medicine.
01:00:12.280 | I mean, they've been CEOs of companies,
01:00:15.280 | presidents of universities.
01:00:16.700 | I mean, and they've done, I think,
01:00:19.760 | eight are faculty at MIT, maybe about 12 at Harvard.
01:00:22.940 | I mean, so it really makes you feel good
01:00:25.680 | to think that the people, you know,
01:00:27.920 | they're not my children, but they're close
01:00:29.400 | to my children in a way, and it makes you feel really good
01:00:32.720 | to see them have such great lives
01:00:34.640 | and them do so much good and be happy.
01:00:36.780 | - Well, I think that's a perfect way to end it, Bob.
01:00:40.060 | Thank you so much for talking today.
01:00:41.240 | - My pleasure.
01:00:42.080 | - Good questions, thank you.
01:00:44.060 | Thanks for listening to this conversation with Bob Langer,
01:00:48.000 | and thank you to our sponsors, Cash App and Masterclass.
01:00:52.120 | Please consider supporting the podcast
01:00:53.960 | by downloading Cash App and using code LEXPODCAST
01:00:58.640 | and signing up at masterclass.com/lex.
01:01:02.720 | Click on the links, buy all the stuff.
01:01:05.360 | It's the best way to support this podcast
01:01:07.240 | and the journey I'm on in my research and startup.
01:01:11.240 | If you enjoy this thing, subscribe on YouTube,
01:01:13.840 | review it with five stars on Apple Podcast,
01:01:16.120 | support it on Patreon, or connect with me on Twitter
01:01:19.080 | at Lex Fridman, spelled without the E, just F-R-I-D-M-A-N.
01:01:24.080 | And now let me leave you with some words
01:01:26.880 | from Bill Bryson in his book,
01:01:29.080 | "A Short History of Nearly Everything."
01:01:32.040 | If this book has a lesson,
01:01:33.880 | it is that we're awfully lucky to be here.
01:01:36.480 | And by we, I mean every living thing.
01:01:39.640 | To attain any kind of life in this universe of ours
01:01:42.500 | appears to be quite an achievement.
01:01:44.800 | As humans, we're doubly lucky, of course.
01:01:47.460 | We enjoy not only the privilege of existence,
01:01:50.080 | but also the singular ability to appreciate it,
01:01:53.440 | and even in a multitude of ways to make it better.
01:01:57.600 | It is talent we have only barely begun to grasp.
01:02:00.700 | Thank you for listening, and hope to see you next time.
01:02:04.800 | (upbeat music)