
Manolis Kellis: Human Genome and Evolutionary Dynamics | Lex Fridman Podcast #113


Chapters

0:00 Introduction
3:54 Human genome
17:47 Sources of knowledge
29:15 Free will
33:26 Simulation
35:17 Biological and computing
50:10 Genome-wide evolutionary signatures
56:54 Evolution of COVID-19
62:59 Are viruses intelligent?
72:08 Humans vs viruses
79:39 Engineered pandemics
83:23 Immune system
93:22 Placebo effect
95:39 Human genome source code
104:40 Mutation
111:46 Deep learning
118:08 Neuralink
127:07 Language
135:19 Meaning of life

Whisper Transcript

00:00:00.000 | "The following is a conversation with Manolis Kellis.
00:00:03.100 | "He's a professor at MIT
00:00:04.780 | "and head of the MIT Computational Biology Group.
00:00:08.420 | "He's interested in understanding the human genome
00:00:11.280 | "from a computational, evolutionary, biological,
00:00:14.180 | "and other cross-disciplinary perspectives.
00:00:17.020 | "He has more big, impactful papers and awards
00:00:20.160 | "than I can list, but most importantly,
00:00:22.480 | "he's a kind, curious, brilliant human being
00:00:26.240 | "and just someone I really enjoy talking to.
00:00:28.700 | "His passion for science and life in general is contagious.
00:00:32.940 | "The hours honestly flew by,
00:00:34.620 | "and I'm sure we'll talk again on this podcast soon."
00:00:37.800 | Quick summary of the ads.
00:00:39.180 | Three sponsors: Blinkist, 8Sleep, and Masterclass.
00:00:43.240 | Please consider supporting this podcast
00:00:45.040 | by going to blinkist.com/lex, 8sleep.com/lex,
00:00:49.920 | and signing up at masterclass.com/lex.
00:00:53.400 | Click the links, buy the stuff, get the discount.
00:00:56.200 | It's the best way to support this podcast.
00:00:58.920 | If you enjoy this thing, subscribe on YouTube,
00:01:01.220 | review it with five stars on Apple Podcasts,
00:01:03.440 | support it on Patreon,
00:01:04.760 | or connect with me on Twitter @lexfridman.
00:01:08.220 | As usual, I'll do a few minutes of ads now
00:01:10.360 | and never any ads in the middle
00:01:11.880 | that can break the flow of the conversation.
00:01:14.580 | This episode is supported by Blinkist,
00:01:17.240 | my favorite app for learning new things.
00:01:19.840 | Get it at blinkist.com/lex for a seven-day free trial
00:01:24.080 | and 25% off afterwards.
00:01:27.000 | Blinkist takes the key ideas
00:01:28.440 | from thousands of nonfiction books
00:01:30.400 | and condenses them down into just 15 minutes
00:01:33.240 | that you can read or listen to.
00:01:35.440 | I'm a big believer in reading at least an hour every day.
00:01:39.000 | As part of that, I use Blinkist every day
00:01:41.960 | to try out a book I may otherwise
00:01:43.960 | never have a chance to read.
00:01:45.600 | And in general, it's a great way to broaden your view
00:01:48.400 | of the idea landscape out there
00:01:50.640 | and find books that you may want to read more deeply.
00:01:54.000 | With Blinkist, you get unlimited access
00:01:56.200 | to read or listen to a massive library
00:01:58.680 | of condensed nonfiction books.
00:02:00.720 | Go to blinkist.com/lex to try it free for seven days
00:02:05.320 | and save 25% off your new subscription.
00:02:08.360 | That's blinkist.com/lex, Blinkist spelled B-L-I-N-K-I-S-T.
00:02:13.360 | This show is also sponsored by 8Sleep
00:02:18.240 | and its PodPro mattress.
00:02:19.840 | You can check out at 8Sleep.com/lex
00:02:22.480 | to get $200 off.
00:02:24.640 | It controls temperature with an app
00:02:26.760 | and can cool down to as low as 55 degrees
00:02:29.240 | on each side of the bed separately.
00:02:31.520 | Research shows that temperature has a big impact
00:02:33.880 | on the quality of our sleep.
00:02:35.720 | Anecdotally, it's been true for me,
00:02:37.880 | it's truly been a game changer.
00:02:39.880 | I love it.
00:02:40.920 | The PodPro is packed with sensors that track heart rate,
00:02:43.440 | heart rate variability, and respiratory rate,
00:02:46.120 | showing it all in their app.
00:02:48.160 | The app's health metrics are amazing,
00:02:50.120 | but the cooling alone is honestly worth the money.
00:02:52.920 | Check it out at 8Sleep.com/lex to get $200 off.
00:02:57.500 | This show is also sponsored by Masterclass.
00:03:00.320 | Sign up at masterclass.com/lex to get a discount
00:03:03.560 | and to support this podcast.
00:03:05.500 | When I first heard about Masterclass,
00:03:07.080 | I thought it was too good to be true.
00:03:09.040 | For 180 bucks a year, you get an all-access pass
00:03:12.120 | to watch courses from, to list some of my favorites,
00:03:15.480 | Chris Hadfield on space exploration,
00:03:17.720 | Neil deGrasse Tyson on scientific thinking
00:03:19.600 | and communication, Will Wright,
00:03:21.640 | one of my favorite game designers,
00:03:23.440 | Carlos Santana, one of my favorite guitar players,
00:03:26.480 | Garry Kasparov, of course,
00:03:27.960 | the greatest chess player of all time, I'm not biased,
00:03:30.800 | Daniel Negreanu on poker, and many more.
00:03:33.760 | Chris Hadfield explaining how rockets work
00:03:35.880 | and the experience of being launched into space alone
00:03:38.680 | is worth the money.
00:03:40.040 | By the way, you can watch it on basically any device.
00:03:43.120 | Once again, sign up at masterclass.com/lex
00:03:46.160 | to get a discount and to support this podcast.
00:03:49.640 | And now, here's my conversation with Manolis Kellis.
00:03:53.500 | What to you is the most beautiful aspect
00:03:56.920 | of the human genome?
00:03:58.640 | - Don't get me started.
00:04:00.080 | (both laughing)
00:04:02.040 | So-- - We got time.
00:04:03.280 | - The first answer is that the beauty of genomes
00:04:06.400 | transcends humanity.
00:04:07.680 | So it's not just about the human genome.
00:04:09.480 | Genomes in general are amazingly beautiful.
00:04:12.680 | And again, I'm obviously biased.
00:04:14.080 | So in my view, the way that I like to introduce
00:04:18.720 | the human genome and the way that I like to introduce
00:04:20.880 | genomics to my class is by telling them,
00:04:22.960 | you know, we're not the inventors
00:04:25.000 | of the first digital computer.
00:04:26.600 | We are the descendants of the first digital computer.
00:04:29.340 | Basically, life is digital.
00:04:32.380 | And that's absolutely beautiful about life.
00:04:34.640 | The fact that at every replication step,
00:04:37.120 | you don't lose any information
00:04:38.520 | because that information is digital.
00:04:40.080 | If it was analog, if it was just brought in concentrations,
00:04:43.200 | you'd lose it after a few generations.
00:04:44.680 | It would just dissolve away.
00:04:46.400 | And that's what the ancients
00:04:48.240 | didn't understand about inheritance.
00:04:50.120 | The first person to understand digital inheritance
00:04:52.160 | was Mendel, of course.
00:04:54.440 | And his theory, in fact, stayed in a bookshelf
00:04:57.600 | for like 50 years while Darwin was getting famous
00:05:00.760 | about natural selection.
00:05:02.480 | But the missing component was this digital inheritance,
00:05:05.480 | the mechanism of evolution that Mendel had discovered.
00:05:09.760 | So that aspect, in my view, is the most beautiful aspect,
00:05:13.140 | but it transcends all of life.
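A toy sketch of the digital-versus-analog point (parameters invented purely for illustration): an analog value copied with a little noise each generation drifts away, while the same noisy copy re-read as a discrete symbol survives indefinitely.

```python
import random

def analog_copy(value, noise=0.05):
    # Each replication blends in a little random error;
    # over generations the original value "dissolves away".
    return value + random.gauss(0, noise)

def digital_copy(bit, noise=0.05):
    # The same physical noise, but the copy is re-read as a discrete symbol:
    # values near 0 snap back to 0, values near 1 snap back to 1.
    return min(1, max(0, round(bit + random.gauss(0, noise))))

analog, digital = 1.0, 1
for _ in range(100):
    analog = analog_copy(analog)
    digital = digital_copy(digital)

print(f"analog after 100 generations:  {analog:.2f}")  # drifts away from 1.0
print(f"digital after 100 generations: {digital}")     # still exactly 1
```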
00:05:14.920 | - And can you elaborate maybe the inheritance part?
00:05:18.080 | What was the key thing that the ancients didn't understand?
00:05:22.520 | - So the very theory of inheritance as discrete units.
00:05:27.520 | Throughout the life of Mendel and well after his writing,
00:05:32.920 | people thought that his pea experiments
00:05:35.280 | were just a little fluke,
00:05:36.640 | that they were just a little exception
00:05:38.800 | that would normally not even apply to humans.
00:05:41.880 | That basically what they saw is this continuum of eye color,
00:05:46.880 | this continuum of skin color,
00:05:49.800 | this continuum of hair color, this continuum of height.
00:05:52.560 | And all of these continuums did not fit
00:05:55.120 | with a discrete type of inheritance
00:05:56.840 | that Mendel was describing.
00:05:58.960 | But what's unique about genomics
00:06:00.520 | and what's unique about the genome
00:06:01.680 | is really that there are two copies
00:06:03.880 | and that you get a combination of these,
00:06:06.280 | but for every trait,
00:06:08.240 | there are dozens of contributing variables.
00:06:10.760 | And it was only Ronald Fisher in the 20th century
00:06:14.080 | that basically recognized that even five Mendelian traits
00:06:19.080 | would add up to a continuum-like inheritance pattern.
00:06:25.000 | And he wrote a series of papers
00:06:27.560 | that still are very relevant today
00:06:30.400 | about sort of this Mendelian inheritance
00:06:33.000 | of continuum-like traits.
00:06:35.240 | And I think that was the missing step in inheritance.
00:06:38.600 | So well before the discovery of the structure of DNA,
00:06:41.600 | which is again another amazingly beautiful aspect,
00:06:44.480 | the double helix, what I like to call
00:06:46.280 | the most noble molecule of our time,
00:06:48.560 | holds within it the secret of that discrete inheritance.
00:06:54.160 | But the conceptualization of discrete elements
00:06:58.560 | is something that precedes that.
00:06:59.760 | - So even though it's discrete,
00:07:01.480 | when it materializes itself into actual traits that we see,
00:07:06.480 | it can be continuous,
00:07:07.520 | it can be basically arbitrarily rich and complex.
00:07:10.920 | - So if you have five genes that contribute to human height,
00:07:15.080 | and there aren't five, there's a thousand.
00:07:17.040 | If there's only five genes
00:07:18.520 | and you inherit some combination of them
00:07:20.960 | and every one of them makes you two inches taller
00:07:23.160 | or two inches shorter,
00:07:24.600 | it'll look like a continuum trait, a continuous trait.
00:07:27.520 | But instead of five, there are thousands
00:07:30.520 | and every one of them contributes less than one millimeter.
00:07:33.960 | We change in height more during the day
00:07:37.000 | than each of these genetic variants contributes.
00:07:39.400 | So by the evening, you're shorter than you were
00:07:43.200 | when you woke up.
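Fisher's reconciliation is easy to see numerically; a minimal sketch (locus count and per-allele effects are made-up numbers): summing even a handful of discrete, coin-flip Mendelian alleles already produces a bell-shaped, continuous-looking trait distribution.

```python
import random
from collections import Counter

def height_offset(n_loci=5, effect_cm=2.0):
    # Two inherited copies per locus; each copy adds +effect or -effect
    # with equal probability, like a discrete Mendelian allele.
    return sum(random.choice([-effect_cm, effect_cm]) for _ in range(2 * n_loci))

population = [height_offset() for _ in range(100_000)]
for offset, count in sorted(Counter(population).items()):
    print(f"{offset:+6.1f} cm: {'#' * (count // 1000)}")  # a binomial bell curve
```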
00:07:44.040 | - Isn't that weird then
00:07:45.360 | that we're not more different than we are?
00:07:48.120 | Why are we all so similar
00:07:49.720 | if there's so much possibility to be different?
00:07:52.640 | - Yeah, so there are selective advantages to being medium.
00:07:57.520 | If you're extremely tall or extremely short,
00:07:59.960 | you run into selective disadvantages.
00:08:02.320 | So you have trouble breathing, you have trouble running,
00:08:04.160 | you have trouble sitting if you're too tall.
00:08:06.360 | If you're too short, you might, I don't know,
00:08:08.280 | have other selective pressures acting against that.
00:08:11.040 | If you look at natural history of human population,
00:08:13.640 | there's actually selection for height in Northern Europe
00:08:17.040 | and selection against height in Southern Europe.
00:08:19.960 | So there might actually be advantages
00:08:21.840 | to actually being not super tall.
00:08:25.040 | And if you look across the entire human population,
00:08:27.800 | for many, many traits,
00:08:29.880 | there's a lot of push towards the middle.
00:08:32.120 | Balancing selection is the usual term
00:08:35.320 | for selection that sort of seeks to not be extreme
00:08:39.720 | and to sort of have a combination of alleles
00:08:43.640 | that sort of keep recombining.
00:08:46.080 | And if you look at mate selection,
00:08:48.680 | super, super tall people will not tend
00:08:51.320 | to sort of marry super, super tall people.
00:08:53.400 | Very often you see these couples
00:08:55.240 | that are kind of compensating for each other.
00:08:58.040 | And the best predictor of the kid's height
00:09:00.480 | is very often just take the average of the two parents
00:09:03.560 | and then adjust for sex and boom, you get it.
00:09:07.160 | It's extremely heritable.
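That parental-average rule is essentially the classic mid-parental height estimate; a sketch using the commonly quoted ~13 cm sex adjustment (the exact constant varies by source, and the example heights are invented):

```python
def midparental_height_cm(father_cm, mother_cm, is_male):
    # Average the two parents, then adjust for sex: roughly +6.5 cm for boys
    # and -6.5 cm for girls in the commonly quoted clinical rule of thumb.
    return (father_cm + mother_cm) / 2 + (6.5 if is_male else -6.5)

print(midparental_height_cm(180, 165, is_male=True))   # 179.0
print(midparental_height_cm(180, 165, is_male=False))  # 166.0
```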
00:09:08.560 | - Let me ask, you kind of took a step back
00:09:11.520 | to the genome outside of just humans,
00:09:13.960 | but is there something that you find beautiful
00:09:15.960 | about the human genome specifically?
00:09:18.680 | - So I think the genome,
00:09:21.440 | if more people understood the beauty of the human genome,
00:09:24.520 | there would be so many fewer wars,
00:09:26.680 | so much less anger in the world.
00:09:29.040 | I mean, what's really beautiful about the human genome
00:09:31.480 | is really the variation that teaches us
00:09:34.760 | both about individuality and about similarity.
00:09:38.280 | So any two people on the planet are 99.9% identical.
00:09:42.440 | How can you fight with someone
00:09:45.640 | who's 99.9% identical to you?
00:09:47.320 | It's just counterintuitive.
00:09:49.760 | And yet any two siblings of the same parent
00:09:53.960 | differ in millions of locations.
00:09:57.160 | So every one of them is basically two to the million unique
00:10:01.080 | from any pair of parents,
00:10:03.200 | let alone any two random parents on the planet.
00:10:05.840 | So that's, I think, something that teaches us
00:10:08.600 | about sort of the nature of humanity in many ways,
00:10:11.280 | that every one of us is as unique as any star
00:10:14.720 | and way more unique in actually many ways.
00:10:17.240 | And yet we're all brothers and sisters.
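The back-of-the-envelope arithmetic behind those two figures, using the standard ~3.2 billion base-pair genome size:

```python
import math

genome_length = 3_200_000_000   # ~3.2 billion base pairs (haploid human genome)
identity = 0.999                # any two people are ~99.9% identical

differences = int(genome_length * (1 - identity))
print(f"~{differences:,} differing positions between two random people")
# -> ~3,200,000 differing positions

# Siblings: each of the parents' ~millions of heterozygous sites is passed
# down one way or the other, so the space of possible children per couple
# is on the order of 2**1_000_000.
digits = math.floor(1_000_000 * math.log10(2)) + 1
print(f"2^1,000,000 is a number with about {digits:,} decimal digits")
```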
00:10:22.240 | - Yeah, just like stars,
00:10:23.360 | most of it is just fusion reactions.
00:10:26.200 | - Yeah, you only have a few parameters to describe stars.
00:10:28.640 | - Yeah, exactly.
00:10:29.480 | So mass, initial size, and stage of life.
00:10:33.120 | Whereas for humans, it's thousands of parameters
00:10:36.360 | scattered across our genome.
00:10:38.080 | So the other thing that makes humans unique,
00:10:41.360 | the thing that makes inheritance unique in humans,
00:10:45.240 | is that most species inherit things vertically.
00:10:50.240 | Basically, instinct is a huge part of their behavior.
00:10:54.480 | The way that, I mean, with my kids,
00:10:57.560 | we've been watching this nest of birds
00:11:01.000 | with two little eggs outside our window
00:11:03.600 | for the last few months,
00:11:05.240 | for the last few weeks as they've been growing.
00:11:07.480 | And there's so much behavior that's hard-coded.
00:11:12.280 | Birds don't just learn as they grow.
00:11:15.280 | They don't, there's no culture.
00:11:17.160 | Like a bird that's born in Boston will be the same
00:11:20.200 | as a bird that's born in California.
00:11:22.040 | So there's not as much inheritance of ideas, of customs.
00:11:27.960 | A lot of it is hard-coded in their genome.
00:11:30.560 | What's really beautiful about the human genome
00:11:32.320 | is that if you take a person from today
00:11:35.040 | and you place them back in ancient Egypt,
00:11:37.080 | or if you take a person from ancient Egypt
00:11:39.160 | and you place them here today,
00:11:41.200 | they will grow up to be completely normal.
00:11:43.300 | That is not genetics.
00:11:47.720 | This is the other type of inheritance in humans.
00:11:51.880 | So on one hand, we have genetic inheritance,
00:11:53.920 | which is vertical from your parents down.
00:11:56.200 | On the other hand, we have horizontal inheritance,
00:11:58.440 | which is the ideas that are built up at every generation
00:12:02.440 | are horizontally transmitted.
00:12:04.600 | And the huge amount of time that we spend
00:12:07.440 | in educating ourselves, a concept known as neoteny,
00:12:11.960 | neo for newborn and then teny for holding.
00:12:15.220 | So if you look at humans, I mean, the little birds,
00:12:18.360 | they were eggs two weeks ago,
00:12:20.400 | and now one of them has already flown off.
00:12:22.720 | The other one's ready to fly off.
00:12:24.560 | In two weeks, they're ready to just fend for themselves.
00:12:27.320 | Humans, 16 years.
00:12:30.040 | (both laughing)
00:12:31.280 | 18 years, 24, getting out of college.
00:12:33.040 | - I'm still learning.
00:12:34.440 | So that's so fascinating,
00:12:35.800 | this picture of a vertical and a horizontal.
00:12:38.720 | When you talk about the horizontal,
00:12:40.120 | is it in the realm of ideas?
00:12:42.000 | - Exactly.
00:12:42.840 | - Okay, so it's the actual social interactions.
00:12:45.360 | - That's exactly right, that's exactly right.
00:12:47.120 | So basically, the concept of neoteny
00:12:49.240 | is that you spend acquiring characteristics
00:12:52.800 | from your environment in an extremely malleable state
00:12:56.120 | of your brain and the wiring of your brain
00:12:58.520 | for a long period of your life.
00:13:00.680 | Compared to other primates, we are useless.
00:13:03.520 | You take any primate at seven weeks
00:13:05.360 | and any human at seven weeks, we lose the battle.
00:13:08.480 | But at 18 years, you know, all bets are off.
00:13:11.680 | Like, basically, our brain continues to develop
00:13:14.880 | in an extremely malleable form till very late.
00:13:17.800 | And this is what allows education.
00:13:20.400 | This is what allows the person from Egypt
00:13:22.520 | to do extremely well now.
00:13:24.720 | And the reason for that is that the wiring of our brain
00:13:29.720 | and the development of that wiring is actually delayed.
00:13:35.000 | So, you know, the longer you delay that,
00:13:37.520 | the more opportunity you have to pass on knowledge,
00:13:40.880 | to pass on concepts, ideals, ideas
00:13:44.280 | from the parents to the child.
00:13:46.080 | And what's really absolutely beautiful about humans today
00:13:49.160 | is that that lateral transfer of ideas and culture
00:13:52.160 | is not just from uncles and aunts and teachers at school,
00:13:55.760 | but it's from Wikipedia and review articles on the web
00:14:00.360 | and thousands of journals
00:14:02.680 | that are sort of putting out information for free
00:14:05.560 | and podcasts and video casts and all of that stuff
00:14:08.920 | where you can basically learn about any topic,
00:14:12.920 | pretty much everything that would be
00:14:15.920 | in any super advanced textbook in a matter of days,
00:14:19.560 | instead of having to go to the library of Alexandria
00:14:22.960 | and sail there to read three books
00:14:24.720 | and then sail for another few days
00:14:26.160 | to get to Athens and et cetera, et cetera, et cetera.
00:14:28.660 | So the democratization of knowledge
00:14:31.400 | and the spread, the speed of spread of knowledge
00:14:34.280 | is what defines, I think, the human inheritance pattern.
00:14:38.920 | - So you sound excited about it.
00:14:41.560 | Are you also a little bit afraid
00:14:43.760 | or are you more excited by the power
00:14:46.480 | of this kind of distributed spread of information?
00:14:49.920 | So you put it very kindly
00:14:51.320 | that most people are kind of using the internet
00:14:53.560 | and looking Wikipedia, reading articles,
00:14:56.800 | reading papers and so on,
00:14:58.120 | but if we're honest, most people online,
00:15:02.040 | especially when they're younger,
00:15:03.240 | probably looking at five second clips on TikTok
00:15:05.840 | or whatever the new social network is,
00:15:08.520 | are you, given this power of horizontal inheritance,
00:15:12.520 | are you optimistic or a little bit pessimistic
00:15:16.520 | about this new effect of the internet
00:15:21.520 | and democratization of knowledge on our,
00:15:25.560 | what would you call this?
00:15:28.120 | This genome, like would you use the term genome,
00:15:30.760 | by the way, for this?
00:15:31.600 | - Yeah, yeah.
00:15:32.420 | I think, you know, we use the genome to talk about DNA,
00:15:36.200 | but very often we say, you know, I mean, I'm Greek,
00:15:39.000 | so people ask me, "Hey, what's in the Greek genome?"
00:15:40.800 | And I'm like, "Well, yeah, what's in the Greek genome
00:15:42.760 | "is both our genes and also our ideas
00:15:44.720 | "and our ideals and our culture."
00:15:46.680 | - The poetic meaning of the word.
00:15:48.240 | - Exactly, exactly, yeah.
00:15:50.120 | So I think that there's a beauty
00:15:55.120 | to the democratization of knowledge,
00:15:57.760 | the fact that you can reach as many people
00:16:00.240 | as any other person on the planet
00:16:02.840 | and it's not who you are,
00:16:04.280 | it's really your ideas that matter,
00:16:06.680 | is a beautiful aspect of the internet.
00:16:09.700 | I think there's, of course, a danger
00:16:14.200 | of my ignorance is as important as your expertise.
00:16:18.260 | The fact that with this democratization
00:16:21.360 | comes the abolishment of respecting expertise.
00:16:25.160 | Just because you've spent, you know,
00:16:27.120 | 10,000 hours of your life studying,
00:16:29.920 | I don't know, human brain circuitry,
00:16:33.320 | why should I trust you?
00:16:34.160 | I'm just gonna make up my own theories
00:16:35.640 | and they'll be just as good as yours,
00:16:37.240 | is an attitude that sort of counteracts
00:16:39.660 | the beauty of the democratization.
00:16:42.480 | And I think that within our educational system
00:16:47.400 | and within the upbringing of our children,
00:16:49.720 | we have to not only teach them knowledge,
00:16:52.340 | but we have to teach them the means to get to knowledge.
00:16:55.780 | And that, you know, it's very similar to sort of,
00:16:58.000 | you catch a fish for a man, you feed them for one day;
00:17:01.400 | you teach them how to fish,
00:17:03.880 | you feed them for the rest of their life.
00:17:05.580 | So instead of just gathering the knowledge they need
00:17:08.240 | for any one task, we can just tell them,
00:17:10.160 | all right, here's how you Google it.
00:17:12.520 | Here's how to figure out what's real and what's not.
00:17:14.640 | Here's how you check the sources.
00:17:16.440 | Here's how you form a basic opinion for yourself.
00:17:19.320 | And I think that inquisitive nature is paramount
00:17:24.320 | to being able to sort through this huge wealth of knowledge.
00:17:29.320 | So you need a basic educational foundation
00:17:32.560 | based on which you can then add on
00:17:35.520 | the sort of domain specific knowledge,
00:17:38.240 | but that basic educational foundation
00:17:39.720 | should not just be knowledge,
00:17:42.400 | but it should also be epistemology,
00:17:45.240 | the way to acquire knowledge.
00:17:47.240 | - I'm not sure any of us know how to do that
00:17:49.760 | in this modern day.
00:17:50.600 | We're actually learning.
00:17:51.680 | One of the big surprising things to me
00:17:53.580 | about the coronavirus, for example,
00:17:57.280 | is that Twitter has been one of the best sources
00:18:01.080 | of information, basically like building your own network
00:18:04.960 | of experts, as opposed to the traditional
00:18:09.800 | centralized expertise of the WHO and the CDC
00:18:12.840 | and the, or maybe any one particular respectable person
00:18:17.840 | at the top of a department, some kind of institution.
00:18:21.800 | You instead look at 10, 20, hundreds of people,
00:18:26.520 | some of whom are young kids with just,
00:18:30.280 | that are incredibly good at aggregating data
00:18:33.760 | and plotting and visualizing that data.
00:18:35.800 | That's been really surprising to me.
00:18:37.240 | I don't know what to make of it.
00:18:38.840 | I don't know how that matures into something stable.
00:18:44.320 | I don't know if you have ideas.
00:18:46.600 | Like, if you were to try to explain to your kids
00:18:49.960 | where they should go to learn about coronavirus,
00:18:54.960 | what would you say?
00:18:56.800 | - It's such a beautiful example.
00:18:58.080 | And I think the current pandemic
00:18:59.920 | and the speed at which the scientific community has moved
00:19:04.000 | in the current pandemic,
00:19:04.840 | I think exemplifies this horizontal transfer
00:19:08.080 | and the speed of horizontal transfer of information.
00:19:10.840 | The fact that, you know, the genome was first sequenced
00:19:15.360 | in early January, the first sample having been obtained
00:19:16.400 | December 29, 2019, and that a week after the publication
00:19:20.400 | of the first genome sequence,
00:19:23.600 | Moderna had already finalized its vaccine design
00:19:26.760 | and was moving to production.
00:19:29.520 | I mean, this is phenomenal.
00:19:31.840 | The fact that we go from not knowing
00:19:34.960 | what the heck is killing people in Wuhan to, wow,
00:19:38.920 | it's SARS-CoV-2 and here's the set of genes,
00:19:41.880 | here's the genome, here's the sequence,
00:19:43.560 | here are the polymorphisms, et cetera,
00:19:45.640 | in the matter of weeks is phenomenal.
00:19:48.200 | In that incredible pace of transfer of knowledge,
00:19:52.720 | there have been many mistakes.
00:19:54.400 | So, you know, some of those mistakes
00:19:56.640 | may have been politically motivated,
00:19:57.960 | or other mistakes may have just been innocuous errors.
00:20:00.880 | Others may have been misleading the public
00:20:02.880 | for the greater good, such as don't wear masks
00:20:05.560 | because we don't want the mask to run out.
00:20:07.240 | I mean, that was very silly in my view
00:20:09.080 | and a very big mistake.
00:20:10.480 | But the spread of knowledge
00:20:15.120 | from the scientific community was phenomenal.
00:20:17.320 | And some people will point out to bogus articles
00:20:20.720 | that snuck in and made the front page.
00:20:22.920 | Yeah, they did, but within 24 hours, they were debunked
00:20:26.280 | and went out of the front page.
00:20:27.520 | And I think that's the beauty of science today.
00:20:30.200 | The fact that it's not, oh, knowledge is fixed.
00:20:33.240 | It's the ability to embrace that nothing is permanent
00:20:37.000 | when it comes to knowledge,
00:20:37.920 | that everything is the current best hypothesis
00:20:40.040 | and the current best model that best fits the current data
00:20:42.960 | and the willingness to be wrong.
00:20:45.840 | The expectation that we're gonna be wrong
00:20:48.320 | and the celebration of success based on
00:20:50.720 | how long was I not proven wrong for,
00:20:52.760 | rather than, wow, I was exactly right.
00:20:55.640 | 'Cause no one is gonna be exactly right
00:20:57.080 | with partial knowledge.
00:20:59.000 | But the arc towards perfection,
00:21:03.160 | I think is so much more important
00:21:05.320 | than how far you are in your first step.
00:21:08.720 | And I think that's what sort of
00:21:10.440 | the current pandemic has taught us.
00:21:13.440 | The fact that, yeah, no, of course,
00:21:14.800 | we're gonna make mistakes,
00:21:16.200 | but at least we're gonna learn from those mistakes
00:21:18.360 | and become better and learn better
00:21:20.360 | and spread information better.
00:21:21.360 | So if I were to answer the question of
00:21:23.360 | where would you go to learn about coronavirus?
00:21:27.760 | First, textbook.
00:21:28.880 | It all starts with a textbook.
00:21:30.000 | Just open up a chapter on virology
00:21:32.640 | and how coronaviruses work.
00:21:34.440 | Then some basic epidemiology
00:21:37.000 | and sort of how pandemics have worked in the past.
00:21:39.920 | What are the basic principles
00:21:41.080 | surrounding these first wave, second wave?
00:21:43.520 | Why do they even exist?
00:21:45.480 | Then understanding about growth,
00:21:47.280 | understanding about the R naught and R T
00:21:50.280 | at various time points.
00:21:52.440 | And then understanding the means of spread,
00:21:55.320 | how it spreads from person to person.
00:21:57.360 | Then how does it get into your cells
00:22:00.120 | from when it gets into the cells?
00:22:01.640 | What are the paths that it takes?
00:22:03.280 | What are the cell types that express
00:22:05.240 | the particular ACE2 receptor?
00:22:07.360 | How is your immune system interacting with the virus?
00:22:09.960 | And once your immune system launches its defense,
00:22:12.320 | how is that helping or actually hurting your health?
00:22:15.560 | What about the cytokine storm?
00:22:16.960 | What are most people dying from?
00:22:18.600 | Why are the comorbidities and these risk factors
00:22:22.760 | even applying?
00:22:23.920 | What makes obese people respond more
00:22:26.000 | or elderly people respond more to the virus
00:22:28.600 | while kids are completely,
00:22:30.640 | very often not even aware that they're spreading it?
00:22:36.400 | So I think there's some basic questions
00:22:41.200 | that you would start from.
00:22:42.760 | And then I'm sorry to say,
00:22:44.400 | but Wikipedia is pretty awesome.
00:22:45.960 | Google is pretty awesome.
00:22:47.040 | (laughs)
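For the R naught item on that list, a minimal numeric sketch (all parameters invented purely for illustration): R is the average number of people each case infects, R_t is the same quantity measured at time t, and crossing R = 1 flips the same formula between exponential growth and decay.

```python
def projected_cases(initial_cases, r, serial_interval_days, days):
    # Each infection "generation" multiplies case counts by R;
    # generations are spaced roughly one serial interval apart.
    generations = days / serial_interval_days
    return initial_cases * r ** generations

# Invented numbers, just to show the mechanics:
print(projected_cases(100, r=2.5, serial_interval_days=5, days=30))  # ~24,414
print(projected_cases(100, r=0.8, serial_interval_days=5, days=30))  # ~26
```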
00:22:47.880 | - There used to be a time,
00:22:48.720 | maybe five years ago,
00:22:50.560 | I forget when,
00:22:52.040 | but people kind of made fun of Wikipedia
00:22:54.280 | for being an unreliable source.
00:22:57.280 | I never quite understood it.
00:22:58.560 | I thought from the early days, it was pretty reliable.
00:23:01.280 | It was better than a lot of the alternatives.
00:23:03.680 | But at this point,
00:23:04.800 | it's kind of like a solid accessible survey paper
00:23:08.360 | on every subject ever.
00:23:10.080 | - There's an ascertainment bias and a writing bias.
00:23:14.680 | So I think this is related to sort of people saying,
00:23:17.840 | oh, so many nature papers are wrong.
00:23:19.880 | And they're like, why would you publish in nature?
00:23:22.600 | So many nature papers are wrong.
00:23:23.720 | And my answer is no, no, no.
00:23:26.160 | So many nature papers are scrutinized.
00:23:29.480 | And just because more of them are being proven wrong
00:23:32.000 | than in other articles is actually evidence
00:23:35.440 | that they're actually better papers overall
00:23:37.080 | because they're being scrutinized at a rate
00:23:39.200 | much higher than any other journal.
00:23:41.080 | So if you basically judge Wikipedia
00:23:45.600 | by not the initial content,
00:23:49.880 | but by the number of revisions,
00:23:52.480 | then of course it's gonna be the best source
00:23:53.960 | of knowledge eventually.
00:23:55.360 | It's still very superficial.
00:23:57.160 | You then have to go into the review papers,
00:23:58.800 | et cetera, et cetera, et cetera.
00:24:00.200 | But I mean, for most scientific topics,
00:24:03.400 | it's extremely superficial.
00:24:05.080 | But it is quite authoritative because it is the place
00:24:09.240 | that everybody likes to criticize as being wrong.
00:24:11.680 | - You say that it's superficial.
00:24:13.640 | On a lot of topics that I've studied a lot of,
00:24:18.400 | I find it, I don't know if superficial is the right word.
00:24:23.240 | 'Cause superficial kind of implies that it's not correct.
00:24:27.680 | - No, no, no.
00:24:29.120 | I don't mean any implication of it not being correct.
00:24:31.640 | It's just superficial.
00:24:32.840 | It's basically only scratching the surface.
00:24:35.520 | For depth, you don't go to Wikipedia.
00:24:37.120 | You go to the review articles.
00:24:38.320 | - But it can be profound in the way that articles rarely,
00:24:41.840 | one of the frustrating things to me
00:24:43.280 | about certain computer science,
00:24:46.600 | like in the machine learning world,
00:24:48.360 | articles, they don't as often take the bigger picture view.
00:24:53.360 | There's a kind of data set and you show that it works
00:24:57.440 | and you kind of show that here's an architectural thing
00:24:59.680 | that creates an improvement and so on and so forth.
00:25:02.280 | But you don't say, well, what does this mean
00:25:05.280 | for the nature of intelligence for future data sets
00:25:08.580 | we haven't even thought about?
00:25:10.080 | Or if you were trying to implement this,
00:25:11.960 | like if we took this data set of 100,000 examples
00:25:15.920 | and scale it to 100 billion examples with this method,
00:25:19.840 | like look at the bigger picture,
00:25:21.240 | which is what a Wikipedia article would actually try to do,
00:25:25.560 | which is like, what does this mean in the context
00:25:28.560 | of the broad field of computer vision or something like that?
00:25:32.360 | - Yeah, yeah.
00:25:33.200 | No, I agree with you completely,
00:25:34.840 | but it depends on the topic.
00:25:36.100 | I mean, for some topics, there's been a huge amount of work.
00:25:38.440 | For other topics, it's just a stub.
00:25:40.320 | So, you know.
00:25:41.560 | - I got it. - Yeah.
00:25:42.600 | - Well, yeah, actually, which we'll talk on,
00:25:46.400 | genomics was not--
00:25:48.080 | - Yeah, it's very shallow.
00:25:49.360 | Yeah, yeah.
00:25:50.480 | It's not wrong, it's just shallow.
00:25:51.880 | - It's shallow. - Yeah.
00:25:53.000 | Every time I criticize something,
00:25:54.720 | I should feel partly responsible.
00:25:56.360 | Basically, if more people from my community
00:25:58.280 | went there and edited, it would not be shallow.
00:26:01.160 | It's just that there's different modes of communication
00:26:04.040 | in different fields.
00:26:05.320 | And in some fields, the experts have embraced Wikipedia.
00:26:09.040 | In other fields, it's relegated
00:26:11.200 | and perhaps the reason is that
00:26:14.080 | if it was any better to start with,
00:26:16.540 | people would invest more time.
00:26:18.040 | But if it's not great to start with,
00:26:20.000 | then you need a few initial pioneers
00:26:22.400 | who will basically go in and say,
00:26:24.180 | ah, enough, we're just gonna fix that.
00:26:26.640 | And then I think it'll catch on much more.
00:26:29.160 | - So, if it's okay, before we go on to genomics,
00:26:32.260 | can we linger a little bit longer
00:26:34.280 | on the beauty of the human genome?
00:26:37.160 | You've given me a few notes.
00:26:38.600 | What else do you find beautiful about the human genome?
00:26:41.680 | - So, the last aspect of what makes the human genome unique,
00:26:44.900 | in addition to the similarity and the differences
00:26:49.900 | and the individuality, is that,
00:26:52.800 | so, very early on, people would basically say,
00:26:57.360 | oh, you don't do that experiment in human.
00:26:59.280 | You have to learn about that in fly.
00:27:01.240 | Or you have to learn about that in yeast first,
00:27:03.120 | or in mouse first, or in a primate first.
00:27:05.880 | And the human genome was, in fact,
00:27:07.800 | relegated to sort of, oh, the last place
00:27:09.960 | that you're gonna go to learn something new.
00:27:12.640 | That has dramatically changed.
00:27:14.240 | And the reason that changed is human genetics.
00:27:17.440 | We are the species in the planet
00:27:22.520 | that's the most studied right now.
00:27:24.680 | It's embarrassing to say that,
00:27:26.260 | but this was not the case a few years ago.
00:27:28.400 | It used to be, you know, first viruses,
00:27:31.840 | then bacteria, then yeast,
00:27:35.200 | then the fruit fly and the worm,
00:27:37.840 | then the mouse, and eventually, human was very far last.
00:27:42.360 | - So, it's embarrassing that it took us this long
00:27:44.680 | to focus on it, or the--
00:27:46.520 | - It's embarrassing that the model organisms
00:27:49.160 | have been taken over because of the power of human genetics.
00:27:52.600 | That, right now, it's actually simpler
00:27:54.760 | to figure out the phenotype of something
00:27:57.240 | by mining this massive amount of human data
00:28:01.360 | than by going back to any of the other species.
00:28:03.960 | And the reason for that is that if you look
00:28:05.480 | at the natural variation that happens
00:28:07.320 | in a population of seven billion,
00:28:09.660 | you basically have a mutation in almost every nucleotide.
00:28:13.360 | So, every nucleotide you wanna perturb,
00:28:15.640 | you can go find a living, breathing human being
00:28:18.760 | and go test the function of that nucleotide
00:28:20.320 | by sort of searching the database and finding that person.
00:28:22.600 | - Wait, why is that embarrassing?
00:28:23.600 | It's a beautiful data set.
00:28:24.680 | - It's a beautiful data set.
00:28:26.360 | It's embarrassing for the model organism.
00:28:29.320 | - For the flies and-- - Yeah, exactly.
00:28:31.240 | - I mean, do you feel, on a small tangent,
00:28:34.960 | is there something of value in the genome of a fly
00:28:39.960 | and other of these model organisms that you miss
00:28:43.760 | that we wish we would be looking at deeper?
00:28:47.400 | - So, directed perturbation, of course.
00:28:49.880 | So, I think the place where humans are still lagging
00:28:54.120 | is the fact that in an animal model,
00:28:55.720 | you can go and say, well, let me knock out
00:28:57.360 | this gene completely. - Got it.
00:28:58.640 | - And let me knock out these three genes completely.
00:29:00.600 | And at the moment you get into combinatorics,
00:29:02.760 | it's something you can't do in the human
00:29:04.200 | because there just simply aren't enough humans on the planet
00:29:07.080 | and, again, let me be honest,
00:29:08.840 | we haven't sequenced all seven billion people.
00:29:11.200 | It's not like we have every mutation,
00:29:12.800 | but we know that there's a carrier out there.
00:29:15.040 | So, if you look at the trend and the speed
00:29:17.480 | with which human genetics has progressed,
00:29:19.480 | we can now find thousands of genes
00:29:22.440 | involved in human cognition, in human psychology,
00:29:27.080 | in the emotions and the feelings
00:29:29.080 | that we used to think are uniquely learned.
00:29:31.800 | Turns out there's a genetic basis to a lot of that.
00:29:34.360 | So, the human genome has continued to elucidate
00:29:39.360 | through these studies of genetic variation
00:29:44.880 | so many different processes that we previously thought
00:29:47.520 | were something that, like free will.
00:29:52.320 | Free will is this beautiful concept
00:29:54.280 | that humans have had for a long time.
00:29:57.840 | You know, in the end, it's just a bunch of chemical reactions
00:29:59.880 | happening in your brain
00:30:00.760 | and the particular abundance of receptors
00:30:03.160 | that you have this day based on what you ate yesterday
00:30:06.120 | or that you have been wired with
00:30:08.400 | based on your parents and your upbringing, et cetera,
00:30:12.600 | determines a lot of that, quote unquote, free will component
00:30:15.720 | to sort of narrow and narrow, sort of slices.
00:30:20.720 | - So, on that point, how much freedom do you think we have
00:30:25.680 | to escape the constraints of our genome?
00:30:30.400 | You're making it sound like more and more
00:30:31.960 | we're discovering that our genome
00:30:33.400 | actually has a lot of the story already encoded into it.
00:30:37.760 | How much freedom do we have?
00:30:39.200 | - So, let me describe what that freedom would look like.
00:30:45.120 | That freedom would be my saying,
00:30:47.600 | ooh, I'm gonna resist the urge to eat that apple
00:30:51.520 | because I choose not to.
00:30:54.500 | But there are chemical receptors
00:30:56.460 | that made me not resist the urge
00:30:59.360 | to prove my individuality and my free will
00:31:02.360 | by resisting the apple.
00:31:04.100 | So, then the next question is,
00:31:05.580 | well, maybe now I'll resist the urge to resist the apple
00:31:08.220 | and I'll go for the chocolate instead
00:31:09.560 | to prove my individuality.
00:31:10.780 | But then, what about those other receptors that, you know?
00:31:14.420 | (laughing)
00:31:16.060 | - That might be all encoded in there.
00:31:17.860 | - So, it's kicking the can down the road
00:31:19.500 | and basically saying, well, your choice
00:31:22.060 | may have actually been driven by other things
00:31:24.940 | that you are not choosing.
00:31:26.700 | So, that's why it's very hard to answer that question.
00:31:30.060 | - It's hard to know what to do with that.
00:31:31.420 | I mean, if the genome has,
00:31:33.360 | if there's not much freedom, it's--
00:31:38.540 | - It's the butterfly effect.
00:31:40.540 | It's basically that in the short term,
00:31:42.940 | you can predict something extremely well
00:31:45.720 | by knowing the current state of the system.
00:31:48.120 | But a few steps down, it's very hard to predict
00:31:50.740 | based on the current knowledge.
00:31:52.460 | Is that because the system is truly free?
00:31:55.300 | When I look at weather patterns,
00:31:56.420 | I can predict the next 10 days.
00:31:57.940 | Is it because the weather has a lot of freedom
00:32:00.320 | and after 10 days, it chooses to do something else?
00:32:03.500 | Or is it because, in fact, the system is fully deterministic
00:32:07.360 | and there's just a slightly different magnetic field
00:32:10.180 | of the Earth, slightly more energy arriving from the sun,
00:32:12.460 | a slightly different spin
00:32:13.740 | of the gravitational pull of Jupiter
00:32:16.100 | that is now causing all kinds of tides
00:32:18.900 | and slight deviation of the moon, et cetera.
00:32:20.900 | Maybe all of that can be fully modeled.
00:32:22.980 | Maybe the fact that China is emitting
00:32:25.760 | a little more carbon today
00:32:27.300 | is actually gonna affect the weather in Egypt in three weeks
00:32:31.480 | and all of that could be fully modeled.
00:32:33.900 | In the same way, if you take a complete view
00:32:36.820 | of a human being now, I model everything about you.
00:32:41.260 | The question is, can I predict your next step?
00:32:44.940 | Probably, but at how far?
00:32:47.820 | And if it's a little further,
00:32:49.420 | is that because of stochasticity
00:32:51.340 | and sort of chaos properties of unpredictability
00:32:54.580 | of beyond a certain level
00:32:56.180 | or was that actually true free will?
00:32:58.300 | - Yeah, so the number of variables might be so,
00:33:01.300 | you might need to build an entire universe
00:33:03.780 | to be able to model. - To simulate a human
00:33:06.620 | and then maybe that human will be fully simulatable,
00:33:09.460 | but maybe aspects of free will will exist
00:33:12.260 | and where's that free will coming from?
00:33:13.420 | It's still coming from the same neurons
00:33:15.020 | or maybe from a spirit inhabiting these neurons,
00:33:17.620 | but again, it's very difficult empirically
00:33:19.760 | to sort of evaluate where does free will begin
00:33:22.580 | and sort of chemical reactions and electric signals and.
00:33:25.820 | - So on that topic, let me ask the most absurd question
00:33:29.900 | that most MIT faculty rolled their eyes on,
00:33:33.940 | but what do you think about the simulation hypothesis
00:33:38.260 | and the idea that we live in a simulation?
00:33:40.260 | - I think it's complete BS.
00:33:41.580 | (both laughing)
00:33:44.020 | - Okay. - There's no empirical evidence.
00:33:45.700 | - No, there's not. - Absolutely not.
00:33:47.060 | - Not in terms of empirical evidence, no,
00:33:49.020 | but in terms of a thought experiment,
00:33:52.380 | does it help you think about the universe?
00:33:54.860 | I mean, so if you look at the genome,
00:33:57.500 | it's encoding a lot of the information
00:33:59.180 | that is required to create some of the beautiful
00:34:01.500 | human complexity that we see around us.
00:34:04.220 | It's an interesting thought experiment.
00:34:05.940 | How many parameters do we need to have
00:34:10.940 | in order to model this full human experience?
00:34:15.300 | Like if we were to build a video game,
00:34:17.540 | how hard it would be to build a video game
00:34:19.980 | that's convincing enough and fun enough
00:34:22.660 | and has consistent laws of physics, all that stuff?
00:34:27.660 | It's not interesting to you as a thought experiment?
00:34:31.380 | - I mean, it's cute, but it's Occam's razor.
00:34:35.060 | I mean, what's more realistic,
00:34:36.820 | the fact that you're actually a machine
00:34:38.340 | or that you're a person?
00:34:39.940 | The fact that all of my experiences exist
00:34:43.340 | inside the chemical molecules that I have
00:34:45.540 | or that somebody's actually simulating all that?
00:34:48.660 | I mean, to me--
00:34:49.500 | - Well, you did refer to humans
00:34:50.860 | as a digital computer earlier, so--
00:34:52.540 | - Of course, of course, but that does not--
00:34:54.260 | - It's a kind of a machine, right?
00:34:55.260 | - I know, I know, but I think the probability
00:35:00.260 | of all that is nil and let the machines wake me up
00:35:03.500 | and just terminate me now if it's not.
00:35:05.500 | (laughing)
00:35:07.540 | I challenge you machines.
00:35:08.860 | - They're gonna wait a little bit
00:35:10.380 | to see what you're gonna do next.
00:35:12.380 | - It's fun, it's fun to watch,
00:35:14.540 | especially the clever humans.
00:35:16.280 | What's the difference to you between the way
00:35:19.540 | a computer stores information
00:35:21.300 | and the human genome stores information?
00:35:23.980 | So you also have roots in both in your work.
00:35:26.980 | Would you say you're,
00:35:28.700 | when you introduce yourself at a bar--
00:35:30.700 | - It depends who I'm talking to.
00:35:34.020 | - Would you say it's computational biology?
00:35:36.140 | Do you reveal your expertise in computer science
00:35:41.300 | or your expertise in biology?
00:35:43.300 | - It depends who I'm talking to, truly.
00:35:45.340 | I mean, basically, if I meet someone who's in computers,
00:35:47.700 | I'll say, oh, I'm a professor in computer science.
00:35:51.100 | If I meet someone who's in engineering,
00:35:52.460 | I say computer science and electrical engineering.
00:35:54.740 | If I meet someone in biology, I'll say,
00:35:56.180 | hey, I work in genomics.
00:35:57.180 | If I meet someone in medicine, I'm like,
00:35:58.460 | hey, I work on genetics.
00:36:00.740 | - You're a fun person to meet at a bar, I got you.
00:36:04.100 | - No, no, but what I'm trying to say is that I don't,
00:36:07.420 | I mean, there's no single attribute
00:36:09.060 | that I will define myself as.
00:36:10.820 | You know, there's a few things I know,
00:36:12.060 | there's a few things I study,
00:36:13.100 | there's a few things I have degrees on,
00:36:15.060 | and there's a few things that I grant degrees in.
00:36:17.940 | And, you know, I publish papers across the whole gamut,
00:36:22.660 | you know, the whole spectrum of computation
00:36:25.300 | to biology, et cetera.
00:36:26.340 | I mean, the complete answer is that I use computer science
00:36:31.340 | to understand biology.
00:36:32.860 | So I'm a, you know, I develop methods in AI
00:36:38.700 | and machine learning, statistics and algorithms, et cetera.
00:36:41.660 | But the ultimate goal of my career
00:36:44.020 | is to really understand biology.
00:36:45.660 | If these things don't advance our understanding of biology,
00:36:49.620 | I'm not as fascinated by them.
00:36:51.940 | Although there are some beautiful computational problems
00:36:54.900 | by themselves, I've sort of made it my mission
00:36:57.900 | to apply the power of computer science
00:37:01.620 | to truly understand the human genome, health, disease,
00:37:06.980 | you know, and the whole gamut of how our brain works,
00:37:10.060 | how our body works, and all of that,
00:37:11.700 | which is so fascinating.
00:37:13.900 | (laughs)
00:37:14.980 | - So the dream, there's not an equivalent
00:37:16.980 | sort of complementary dream of understanding
00:37:20.980 | human biology in order to create an artificial life,
00:37:23.380 | an artificial brain, an artificial intelligence
00:37:26.100 | that supersedes the intelligence
00:37:27.700 | and the capabilities of us humans.
00:37:29.720 | - It's an interesting question.
00:37:31.620 | It's a fascinating question.
00:37:33.300 | So understanding the human brain is undoubtedly coupled
00:37:38.300 | to how do we make better AI,
00:37:42.220 | because so much of AI has in fact been inspired by the brain.
00:37:47.220 | It may have taken 50 years
00:37:49.100 | since the early days of neural networks
00:37:51.140 | till we have, you know, all of these amazing progress
00:37:55.460 | that we've seen with, you know, deep belief networks
00:38:00.860 | and, you know, all of these advances in Go and chess,
00:38:05.860 | in image synthesis, in deep fakes, in you name it.
00:38:10.480 | And, but the underlying architecture
00:38:15.380 | is very much inspired by the human brain,
00:38:18.060 | which actually posits a very, very interesting question.
00:38:20.900 | Why are neural networks performing so well?
00:38:27.220 | And they perform amazingly well.
00:38:28.980 | Is it because they can simulate any possible function?
00:38:31.740 | And the answer is no, no.
00:38:34.420 | They simulate a very small number of functions.
00:38:37.180 | Is it because they can simulate
00:38:38.420 | every possible function in the universe?
00:38:40.540 | And that's where it gets interesting.
00:38:41.540 | The answer is actually, yeah, a little closer to that.
00:38:44.740 | And here's where it gets really fun.
00:38:46.580 | If you look at human brain and human cognition,
00:38:51.720 | it didn't evolve in a vacuum.
00:38:54.020 | It evolved in a world with physical constraints,
00:38:58.820 | like the world that inhabits us.
00:39:00.700 | It is the world that we inhabit.
00:39:02.300 | And if you look at our senses, what do they perceive?
00:39:08.220 | They perceive different, you know,
00:39:10.340 | parts of the electromagnetic spectrum.
00:39:12.780 | You know, the hearing is just different movements in air.
00:39:17.300 | The touch, et cetera.
00:39:18.860 | I mean, all of these things, we've built intuitions
00:39:21.700 | for the physical world that we inhabit.
00:39:24.020 | And our brains and the brains of all animals
00:39:26.140 | evolved for that world.
00:39:29.220 | And the AI systems that we have built
00:39:32.740 | happen to work well with images of the type
00:39:35.380 | that we encounter in the physical world that we inhabit.
00:39:38.460 | Whereas if you just take noise and you add random signal
00:39:42.480 | that doesn't match anything in our world,
00:39:44.500 | neural networks will not do as well.
00:39:47.020 | And that actually basically has this whole loop around this,
00:39:52.980 | which is this was designed by studying our own brain,
00:39:57.700 | which was evolved for our own world.
00:39:59.660 | And they happen to do well in our own world.
00:40:02.020 | And they happen to make the same types of mistakes
00:40:04.220 | that humans make many times.
00:40:07.100 | And of course you can engineer images
00:40:08.820 | by adding just the right amount of, you know,
00:40:11.140 | sort of pixel deviations to make a zebra look like a baboon
00:40:14.500 | and stuff like that, or like a table.
00:40:18.460 | But ultimately the undoctored images at least
00:40:22.780 | are very often mistaken, I don't know,
00:40:26.020 | between muffins and dogs, for example,
00:40:28.940 | in the same way that humans make those mistakes.
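Those "just right" pixel deviations are what the literature calls adversarial perturbations; a minimal sketch of the standard fast-gradient-sign idea (the gradient here is a random stand-in, since no real classifier is attached):

```python
import numpy as np

def fgsm_perturb(image, gradient, epsilon=0.01):
    # Fast Gradient Sign Method: nudge every pixel a tiny step (epsilon) in the
    # direction that most increases the classifier's loss; imperceptible to a
    # human, yet often enough to flip the predicted label.
    return np.clip(image + epsilon * np.sign(gradient), 0.0, 1.0)

image = np.random.rand(224, 224, 3)       # stand-in image with pixels in [0, 1]
gradient = np.random.randn(224, 224, 3)   # would come from backprop in practice
adversarial = fgsm_perturb(image, gradient)
print(np.abs(adversarial - image).max())  # per-pixel change is at most 0.01
```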
00:40:31.300 | So it's, you know, there's no doubt in my view
00:40:35.900 | that the more we understand about the tricks
00:40:38.660 | that our human brain has evolved
00:40:40.660 | to understand the physical world around us,
00:40:42.980 | the more we will be able to bring
00:40:44.820 | new computational primitives in our AI systems
00:40:48.820 | to again, better understand not just the world around us,
00:40:52.260 | but maybe even the world inside us.
00:40:54.460 | And maybe even the computational problems
00:40:56.420 | that arise from new types of data
00:40:58.380 | that we haven't been exposed to,
00:41:00.380 | but are yet inhabiting the same universe that we live in
00:41:03.460 | with a very tiny little subset of functions
00:41:06.100 | from all possible mathematical functions.
00:41:08.140 | - Yeah, and that small subset of functions,
00:41:10.220 | all that matters to us humans, really.
00:41:11.700 | That's what makes-
00:41:12.940 | - It's all that has mattered so far.
00:41:14.860 | And even within our scientific realm,
00:41:17.100 | it's all that seems to continue to matter.
00:41:19.740 | But I mean, I always like to think about our senses
00:41:24.740 | and how much of the physical world around us we perceive.
00:41:29.660 | And if you look at the LIGO experiment,
00:41:34.660 | which over the last year and a half has been all over the news.
00:41:38.220 | What did LIGO do?
00:41:39.660 | It created a new sense for human beings.
00:41:42.900 | A sense that has never been sensed
00:41:45.820 | in the history of our planet.
00:41:47.300 | Gravitational waves have been traversing the earth
00:41:51.620 | since its creation a few billion years ago.
00:41:55.180 | Life has evolved senses to sense things
00:42:00.340 | that were never before sensed.
00:42:02.140 | Light was not perceived by early life.
00:42:06.660 | No one cared.
00:42:07.500 | And eventually, photoreceptors evolved
00:42:12.140 | and the ability to sense colors
00:42:15.060 | by sort of catching different parts
00:42:16.900 | of that electromagnetic spectrum.
00:42:19.180 | And hearing evolved and touch evolved, et cetera.
00:42:24.180 | But no organism evolved a way to sense neutrinos
00:42:27.740 | floating through earth
00:42:28.780 | or gravitational waves flowing through earth, et cetera.
00:42:31.340 | And I find it so beautiful in the history
00:42:33.900 | of not just humanity, but life on the planet,
00:42:37.060 | that we are now able to capture additional signals
00:42:40.500 | from the physical world than we ever knew before.
00:42:43.660 | And axions, for example, have been all over the news
00:42:46.380 | in the last few weeks.
00:42:47.500 | The concept that we can capture
00:42:52.940 | and perceive more of that physical world
00:42:54.940 | is as exciting as the fact that we were blind to it
00:43:01.060 | before is traumatizing.
00:43:04.620 | Because that also tells us, you know, we're in 2020.
00:43:09.380 | Picture yourself in 3020 or in 20, you know--
00:43:12.820 | - What new senses might we discover?
00:43:15.460 | - Could it be that we're missing 9/10 of physics?
00:43:21.940 | That there's a lot of physics out there
00:43:24.020 | that we're just blind to, completely oblivious to it,
00:43:27.940 | and yet they're permeating us all the time.
00:43:29.420 | - Yeah, so it might be right in front of us.
00:43:31.580 | - So when you're thinking about premonitions,
00:43:34.260 | yeah, a lot of that is ascertainment bias.
00:43:37.620 | Like, yeah, every now and then you're like,
00:43:39.420 | "Oh, I remember my friend,"
00:43:41.060 | and then my friend doesn't appear,
00:43:42.700 | and I'll forget that I remember my friend.
00:43:44.580 | But every now and then, my friend will actually appear.
00:43:46.060 | I'm like, "Oh my God, I thought about you a minute ago.
00:43:48.340 | "You just called me, that's amazing."
00:43:50.180 | So, you know, some of that is this,
00:43:52.020 | but some of that might be that there are,
00:43:55.100 | within our brain, sensors for waves that we emit
00:44:00.100 | that we're not even aware of.
00:44:03.260 | And this whole concept of when I hug my children,
00:44:07.060 | there's such an emotional transfer there
00:44:10.500 | that we don't comprehend.
00:44:12.260 | I mean, sure, yeah, of course, we're all like hardwired
00:44:15.140 | for all kinds of touchy-feely things
00:44:16.740 | between parents and kids, it's beautiful,
00:44:18.260 | between partners, it's beautiful, et cetera.
00:44:20.700 | But then there are intangible aspects of human communication
00:44:25.700 | that I don't think it's unfathomable
00:44:30.060 | that our brain has actually evolved ways and sensors for it
00:44:32.900 | that we just don't capture.
00:44:33.980 | We don't understand the function
00:44:35.260 | of the vast majority of our neurons.
00:44:37.460 | And maybe our brain is already sensing it,
00:44:40.140 | but even worse, maybe our brain is not sensing it at all,
00:44:43.980 | and we're oblivious to this until we build a machine
00:44:46.620 | that suddenly is able to sort of capture
00:44:48.300 | so much more of what's happening in the natural world.
00:44:50.380 | - So what you're saying is we're going,
00:44:52.260 | physics is going to discover a sensor for love.
00:44:54.780 | - And maybe dogs are off scale for that.
00:45:00.220 | (Lex laughs)
00:45:01.460 | And we've been oblivious to it the whole time
00:45:04.140 | 'cause we didn't have the right sensor.
00:45:05.780 | And now you're going to have a little wrist device that says,
00:45:07.380 | "Oh my God, I feel all this love in the house.
00:45:09.620 | "I sense a disturbance in the force."
00:45:11.820 | (Lex laughs)
00:45:12.820 | - It's all around us.
00:45:13.700 | And dogs and cats will have zero.
00:45:15.740 | - None. - None.
00:45:16.580 | - None.
00:45:17.420 | - It's just, yeah. - Oh, looks like you lost it.
00:45:18.260 | (both laugh)
00:45:20.140 | - But let's take a step back to our unfortunate place.
00:45:24.540 | - To one of the 400 topics that we had actually planned for.
00:45:26.980 | (both laugh)
00:45:29.580 | - But to our sad time in 2020
00:45:31.820 | when we only have just a few sensors
00:45:33.860 | and we're very primitive early computers.
00:45:37.620 | So you have a foot in computer science
00:45:41.820 | and a foot in biology.
00:45:43.500 | In your sense, how do computers represent information
00:45:48.300 | differently than the genome or biological systems?
00:45:52.300 | - So first of all, let me correct that,
00:45:55.900 | no, we're in an amazing time in 2020.
00:45:58.300 | (both laugh)
00:46:00.340 | Computer science is totally awesome
00:46:02.460 | and physics is totally awesome
00:46:03.980 | and we have understood so much more of the natural world
00:46:06.900 | than ever before.
00:46:08.500 | So I am extremely grateful and feeling extremely lucky
00:46:13.140 | to be living in the time that we are.
00:46:16.180 | 'Cause first of all, who knows when the asteroid will hit?
00:46:20.060 | (Lex laughs)
00:46:21.860 | And second, of all times in humanity,
00:46:26.140 | this is probably the best time to be a human being
00:46:29.420 | and this might actually be the best place
00:46:31.100 | to be a human being.
00:46:31.940 | So anyway, for anyone who loves science,
00:46:34.460 | this is it, this is awesome, it's a great time.
00:46:36.980 | - At the same time, just a quick comment.
00:46:39.340 | All I meant is that if we look several hundred years from now
00:46:43.620 | and we end up somehow not destroying ourselves,
00:46:48.540 | people will probably look back at this time
00:46:50.340 | in computer science and at your work, Manolis, at MIT.
00:46:55.340 | - As infantile.
00:46:56.660 | - As infantile and silly and how ignorant it all was.
00:46:59.700 | I like to joke very often with my students
00:47:02.580 | that we've written so many papers,
00:47:04.300 | we've published so much, we've been cited so much
00:47:06.540 | and every single time I tell my students,
00:47:08.460 | the best is ahead of us.
00:47:09.780 | What we're working on now is the most exciting thing
00:47:12.540 | I've ever worked on.
00:47:13.940 | So in a way, I do have this sense of,
00:47:16.260 | yeah, even the papers I wrote 10 years ago,
00:47:18.600 | they were awesome at the time,
00:47:20.380 | but I'm so much more excited about where we're heading now.
00:47:22.460 | And I don't mean to minimize any of the stuff
00:47:24.580 | we've done in the past, but there's just this sense
00:47:28.500 | of excitement about what you're working on now
00:47:31.060 | that as soon as a paper is submitted,
00:47:33.460 | it's like, ugh, it's old.
00:47:35.260 | Like, you know, I can't talk about that anymore.
00:47:37.180 | I'm not gonna talk about it.
00:47:38.020 | - At the same time, you probably are not going
00:47:39.820 | to be able to predict what are the most impactful papers
00:47:44.300 | and ideas when people look back 200 years from now
00:47:47.380 | at your work, what would be the most exciting papers.
00:47:50.780 | And it may very well be not the thing that you expected.
00:47:54.260 | Or the things you got awards for or, you know,
00:47:58.100 | - This might be true in some fields.
00:48:00.020 | I don't know, I feel slightly differently
00:48:01.380 | about it in our field.
00:48:02.380 | I feel that I kind of know what are the important ones.
00:48:05.660 | And there's a very big difference
00:48:07.340 | between what the press picks up on
00:48:09.220 | and what's actually fundamentally important for the field.
00:48:11.660 | And I think for the fundamentally important ones,
00:48:13.420 | we kind of have a pretty good idea what they are.
00:48:15.620 | And it's hard to sometimes get the press excited
00:48:18.180 | about the fundamental advances, but you know,
00:48:21.380 | we take what we get and celebrate what we get.
00:48:24.780 | And sometimes, you know, one of our papers,
00:48:27.220 | which was in a minor journal, made the front page of Reddit
00:48:30.220 | and suddenly had like hundreds of thousands of views.
00:48:33.580 | Even though it was in a minor journal,
00:48:35.020 | because, you know, somebody pitched it the right way
00:48:37.060 | that it suddenly caught everybody's attention.
00:48:39.380 | Whereas other papers that are sort of truly fundamental,
00:48:42.060 | you know, we have a hard time getting the editors
00:48:44.500 | even excited about them when so many hundreds of people
00:48:47.900 | are already using the results and building upon them.
00:48:50.900 | So I do appreciate that there's a discrepancy
00:48:54.420 | between the perception and the perceived success
00:48:57.460 | and the awards that you get for various papers.
00:48:59.540 | But I think that fundamentally, I know that,
00:49:02.300 | you know, some paper, so when you're right--
00:49:04.500 | - So is there a paper that you're most proud of?
00:49:06.860 | See, now you just, you trapped yourself.
00:49:09.380 | - No, no, no, no.
00:49:10.220 | - I mean, is there a line of work that you have a sense
00:49:13.580 | is really powerful that you've done to date?
00:49:17.620 | You've done so much work in so many directions,
00:49:20.220 | which is interesting.
00:49:21.900 | Is there something where you think is quite special?
00:49:24.920 | - I mean, it's like asking me to say
00:49:28.820 | which of my three children I love best.
00:49:30.460 | I mean. (laughs)
00:49:32.660 | - Exactly.
00:49:34.980 | - So, I mean, and it's such a gimme question
00:49:38.600 | that it's so difficult not to brag
00:49:42.660 | about the awesome work that my team
00:49:44.820 | and my students have done.
00:49:46.140 | And I'll just mention a few off the top of my head.
00:49:50.060 | I mean, basically there's a few landmark papers
00:49:53.180 | that I think have shaped my scientific path.
00:49:56.900 | And, you know, I like to somehow describe it
00:50:00.500 | as a linear continuation of one thing led to another
00:50:03.740 | and led to another, led to another.
00:50:05.460 | And, you know, it kind of all started with,
00:50:09.140 | skip, skip, skip, skip, skip.
00:50:12.420 | Let me try to start somewhere in the middle. (laughs)
00:50:15.340 | So my first PhD paper was the first comparative
00:50:20.020 | analysis of multiple species.
00:50:21.900 | So multiple complete genomes.
00:50:23.660 | So for the first time, we basically developed a concept
00:50:27.380 | of genome-wide evolutionary signatures.
00:50:30.020 | The fact that you could look across the entire genome
00:50:32.980 | and understand how things evolve.
00:50:35.700 | And from these signatures of evolution,
00:50:38.300 | you could go back and study any one region and say,
00:50:42.460 | that's a protein coding gene.
00:50:44.060 | That's an RNA gene.
00:50:45.580 | That's a regulatory motif.
00:50:47.300 | That's a binding site and so on and so forth.
00:50:50.140 | So-- - Oh, sorry.
00:50:51.340 | So comparing different-- - Different species.
00:50:53.780 | - Species of the same, so-- - So take human, mouse,
00:50:56.300 | rat, and dog. - Yep.
00:50:58.060 | - You know, they're all animals.
00:50:59.060 | They're all mammals.
00:50:59.980 | They're all performing similar functions with their heart,
00:51:02.820 | with their brain, with their lungs, et cetera, et cetera,
00:51:05.180 | et cetera.
00:51:06.020 | So there's many functional elements
00:51:08.140 | that make us uniquely mammalian.
00:51:10.900 | And those mammalian elements are actually conserved.
00:51:14.620 | 99% of our genome does not code for protein.
00:51:17.740 | 1% codes for protein.
00:51:20.780 | The other 99%, we frankly didn't know what it does
00:51:25.100 | until we started doing these comparative genomic studies.
00:51:28.140 | So basically, these series of papers in my career
00:51:32.060 | have basically first developed that concept
00:51:34.540 | of evolutionary signatures and then applied them to yeast,
00:51:37.460 | applied them to flies, applied them to four mammals,
00:51:40.140 | applied them to 17 fungi,
00:51:41.620 | applied them to 12 Drosophila species,
00:51:43.700 | applied them to then 29 mammals, and now 200 mammals.
00:51:46.900 | - So sorry, so can we,
00:51:48.860 | so the evolutionary signatures,
00:51:50.700 | it seems like such a fascinating idea.
00:51:53.580 | I'm probably gonna linger on your early PhD work
00:51:57.380 | for two hours.
00:51:58.220 | But what is, how can you reveal something interesting
00:52:03.220 | about the genome by looking at the multiple species
00:52:09.280 | and looking at the evolutionary signatures?
00:52:11.940 | - Yeah.
00:52:12.780 | So you basically align the matching regions.
00:52:17.780 | So everything evolved from a common ancestor way, way back.
00:52:24.060 | And mammals evolved from a common ancestor
00:52:26.100 | about 60 million years back.
00:52:27.940 | So after the meteor that killed off the dinosaurs
00:52:32.940 | landed near Machu Picchu, we know the crater.
00:52:38.860 | It didn't allegedly land.
00:52:40.620 | (laughing)
00:52:41.740 | - That was the aliens, okay.
00:52:42.860 | - No, just slightly north of Machu Picchu
00:52:44.700 | in the Gulf of Mexico, there's a giant hole
00:52:47.140 | that that meteor impact.
00:52:49.100 | - Sorry, is that definitive to people?
00:52:51.380 | Have people conclusively figured out
00:52:56.380 | what killed the dinosaurs?
00:52:58.220 | - I think so.
00:52:59.260 | - So it was a meteor?
00:53:00.540 | - Well, you know, volcanic activity,
00:53:03.460 | all kinds of other stuff is coinciding.
00:53:06.820 | But the meteor is pretty unique.
00:53:09.580 | And we now have-- - That's also terrifying.
00:53:11.180 | (laughing)
00:53:13.220 | - We still have a lot of 2020 left.
00:53:14.980 | So if anything comes-- - No, no,
00:53:16.100 | but think about it this way.
00:53:17.260 | So the dinosaurs ruled the earth for 175 million years.
00:53:22.260 | We humans have been around for, what,
00:53:28.420 | less than one million years, if you're super generous
00:53:31.020 | about what you call humans.
00:53:32.940 | And you include chimps, basically.
00:53:34.580 | So we are just getting warmed up.
00:53:39.660 | And we've ruled the planet much more ruthlessly
00:53:42.540 | than Tyrannosaurus Rex.
00:53:44.100 | (laughing)
00:53:46.260 | T-Rex had much less of an environmental impact than we did.
00:53:49.580 | And if you give us another 174 million years,
00:53:54.060 | humans will look very different if we make it that far.
00:53:58.380 | So I think dinosaurs are basically a much bigger part
00:54:02.180 | of life's history on earth than we are, in all respects.
00:54:06.100 | But look at the bright side.
00:54:07.700 | When they were killed off, another life form emerged,
00:54:10.340 | mammals. - And that's that whole
00:54:12.940 | evolutionary branching that's happened.
00:54:15.100 | So you kind of have, when you have
00:54:17.340 | these evolutionary signatures, is there basically a map
00:54:21.180 | of how the genome changed?
00:54:22.660 | - Yeah, exactly. - Throughout?
00:54:23.500 | - So now you can go back to this early mammal
00:54:26.180 | that was hiding in caves, and you can basically ask
00:54:29.260 | what happened after the dinosaurs were wiped out.
00:54:31.300 | A ton of evolutionary niches opened up.
00:54:34.060 | And the mammals started populating all of these niches.
00:54:37.500 | And in that diversification, there was room for expansion
00:54:42.500 | of new types of functions.
00:54:44.820 | So some of them populated the air with bats flying,
00:54:49.820 | a new evolution of flight.
00:54:51.740 | Some populated the oceans with dolphins and whales
00:54:57.520 | going off to swim, et cetera.
00:54:58.780 | But we all are fundamentally mammals.
00:55:01.420 | So you can take the genomes of all these species
00:55:04.300 | and align them on top of each other.
00:55:06.340 | And basically create nucleotide resolution correspondences.
00:55:11.340 | What my PhD work showed is that when you do that,
00:55:14.300 | when you line up species on top of each other,
00:55:17.260 | you can see that within protein-coding genes,
00:55:19.900 | there's a particular pattern of evolution
00:55:22.040 | that is dictated by the level
00:55:24.820 | at which evolutionary selection acts.
00:55:27.760 | If I'm coding for a protein, and I change
00:55:31.580 | the third codon position of a triplet
00:55:35.180 | that codes for that amino acid,
00:55:37.420 | the same amino acid will be encoded.
00:55:39.780 | So that basically means that any kind of mutation
00:55:42.840 | that preserves that translation,
00:55:45.540 | that is invariant to that ultimate functional assessment
00:55:50.360 | that evolution will give, is tolerated.
00:55:53.180 | So for any function that you're trying to achieve,
00:55:55.860 | there's a set of sequences that encode it.
00:55:58.540 | You can now look at the mapping,
00:56:01.040 | the graph isomorphism, if you wish,
00:56:05.100 | between all of the possible DNA encodings
00:56:08.100 | of a particular function and that function.
00:56:10.460 | And instead of having just that exact sequence
00:56:13.020 | at the protein level, you can think of the set
00:56:15.660 | of protein sequences that all fulfill the same function.
00:56:18.660 | What's evolution doing?
00:56:20.020 | Evolution has two components.
00:56:21.460 | One component is random, blind, and stupid mutation.
00:56:25.940 | The other component is super smart, ruthless selection.
00:56:30.940 | That's my mom calling from Greece.
00:56:34.620 | (both laughing)
00:56:35.860 | Yes, I might be a fully grown man,
00:56:38.000 | but I am a Greek.
00:56:40.620 | - Did you just cancel the call?
00:56:42.140 | Wow, you're in trouble. - I know, I'm in trouble.
00:56:43.540 | No, she's gonna be calling the cops.
00:56:45.500 | - I'm gonna edit this clip out and send it to her.
00:56:47.980 | (both laughing)
00:56:50.740 | - So. - So yeah,
00:56:51.660 | so there's a lot of encoding
00:56:53.080 | for the same kind of function.
00:56:54.340 | - Yeah, so you now have this mapping
00:56:56.660 | between all of the set of functions
00:56:58.800 | that could all encode the same,
00:57:00.920 | all of the set of sequences
00:57:02.220 | that can all encode the same function.
00:57:04.300 | What evolutionary signatures does
00:57:06.620 | is that it basically looks at the shape
00:57:09.020 | of that distribution of sequences
00:57:11.220 | that all encode the same thing.
00:57:13.100 | And based on that shape, you can basically say,
00:57:15.260 | ooh, proteins have a very different shape
00:57:17.940 | than RNA structures, than regulatory motifs, et cetera.
00:57:21.360 | So just by scanning a sequence, ignoring the sequence,
00:57:24.520 | and just looking at the patterns of change,
00:57:26.760 | I'm like, wow, this thing is evolving like a protein.
00:57:29.500 | And that thing is evolving like a motif,
00:57:31.700 | and that thing is evolving.
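To make that idea concrete, here is a toy sketch of what "evolving like a protein" means, not the actual published method: given a codon-level alignment across species, coding regions show an excess of substitutions that preserve the amino acid. The alignment below is invented, and the sketch assumes Biopython is installed for translation.

```python
# Toy "evolutionary signature" for protein-coding regions: across
# aligned species, coding DNA tolerates third-position (synonymous)
# changes far more than amino-acid-changing ones. Illustrative only;
# not the genome-wide method from the papers.
from Bio.Seq import Seq  # pip install biopython

def synonymous_fraction(codon_alignment):
    """codon_alignment: equal-length, gap-free, in-frame DNA strings.
    Returns the fraction of codon substitutions (vs the first sequence)
    that leave the encoded amino acid unchanged."""
    ref = codon_alignment[0]
    synonymous = changed = 0
    for other in codon_alignment[1:]:
        for i in range(0, len(ref) - 2, 3):
            c1, c2 = ref[i:i + 3], other[i:i + 3]
            if c1 != c2:
                changed += 1
                if str(Seq(c1).translate()) == str(Seq(c2).translate()):
                    synonymous += 1
    return synonymous / changed if changed else 0.0

# Hypothetical aligned fragment from three species: every substitution
# sits at a third codon position and preserves the protein.
alignment = ["ATGGCTAAA", "ATGGCAAAA", "ATGGCGAAG"]
print(synonymous_fraction(alignment))  # 1.0 here; much lower in neutral DNA
```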
00:57:33.180 | So that's exactly what we just did for COVID.
00:57:35.620 | So our paper that we posted on bioRxiv about coronavirus
00:57:39.020 | basically took this concept of evolutionary signatures
00:57:42.020 | and applied it on the SARS-CoV-2 genome
00:57:45.740 | that is responsible for the COVID-19 pandemic.
00:57:48.540 | - And comparing it to?
00:57:50.500 | - To 44 Sarbecovirus species, so this is the beta.
00:57:53.700 | - What word did you just use?
00:57:56.260 | - Sarbecovirus, so SARS-related beta coronavirus.
00:58:00.500 | It's a portmanteau of a bunch.
00:58:01.460 | - So that whole family of viruses.
00:58:03.100 | - Yeah, so. - How big is that family?
00:58:05.100 | - We have 44 species that are--
00:58:07.500 | - 44 species in the family?
00:58:09.340 | - Yeah. - Virus is a clever bunch.
00:58:11.100 | - No, no, but there's just 44,
00:58:12.900 | and again, we don't call them species in viruses,
00:58:15.700 | we call them strains, but anyway, there's 44 strains,
00:58:18.140 | and that's a tiny little subset of maybe another 50 strains
00:58:22.340 | that are just far too distantly related.
00:58:24.460 | Most of those only infect bats as the host,
00:58:29.080 | and a subset of only four or five have ever infected humans.
00:58:34.040 | And we basically took all of those and we aligned them
00:58:36.800 | in the same exact way that we've aligned mammals,
00:58:39.000 | and then we looked at what proteins are,
00:58:42.360 | which of the currently hypothesized genes
00:58:45.000 | for the coronavirus genome are in fact evolving like proteins
00:58:49.040 | and which ones are not.
00:58:50.280 | And what we found is that ORF10,
00:58:52.980 | the last little open reading frame,
00:58:54.680 | the last little gene in the genome, is bogus.
00:58:57.040 | That's not a protein at all.
00:58:58.760 | - What is it?
00:58:59.960 | - It's an RNA structure.
00:59:01.840 | - That doesn't have a--
00:59:03.600 | - It doesn't get translated into amino acids.
00:59:05.720 | - And that, so it's important to narrow down
00:59:08.300 | to basically discover what's useful and what's not.
00:59:10.840 | - Exactly, basically what is even the set of genes?
00:59:13.640 | The other thing that this evolutionary signature showed
00:59:15.560 | is that within ORF3a lies a tiny little additional gene
00:59:20.560 | encoded within the other gene.
00:59:22.740 | So you can translate a DNA sequence
00:59:24.560 | in three different reading frames.
00:59:26.880 | If you start in the first one, it's ATG, et cetera.
00:59:30.160 | If you start on the second one, it's TGC, et cetera.
00:59:33.000 | And there's a gene within a gene.
00:59:36.680 | So there's a whole other protein that we didn't know about
00:59:39.480 | that might be super important.
00:59:41.200 | So we don't even know the building blocks of SARS-CoV-2.
00:59:45.680 | So if we want to understand coronavirus biology
00:59:48.340 | and eventually fight it successfully,
00:59:50.560 | we need to even have the set of genes.
00:59:51.960 | And these evolutionary signatures
00:59:53.720 | that I developed in my PhD work--
00:59:55.600 | - Are really useful here?
00:59:56.440 | - Are really useful here.
00:59:57.280 | - Recently used.
00:59:58.100 | - You know what, let's run with that tangent
00:59:59.560 | for a little bit, if it's okay.
01:00:01.160 | Can we talk about the COVID-19 a little bit more?
01:00:08.280 | What's your sense about the genome, the proteins,
01:00:13.160 | the functions that we understand about COVID-19?
01:00:16.320 | Where do we stand in your sense?
01:00:18.880 | What are the big open problems?
01:00:21.400 | And also, you kind of said it's important to understand
01:00:25.340 | what are the important proteins,
01:00:29.800 | and why is that important?
01:00:32.120 | - So what else does the comparison of these species tell us?
01:00:39.120 | What it tells us is how fast are things evolving.
01:00:43.000 | It tells us about at what level is the acceleration
01:00:46.640 | or deceleration pedal set for every one of these proteins.
01:00:50.800 | So the genome has 30-some genes.
01:00:54.120 | Some genes evolve super, super fast.
01:00:56.580 | Others evolve super, super slow.
01:00:59.020 | If you look at the polymerase gene
01:01:00.460 | that basically replicates the genome,
01:01:01.980 | that's a super slow evolving one.
01:01:04.220 | If you look at the nucleocapsid protein,
01:01:06.340 | that's also super slow evolving.
01:01:08.360 | If you look at the spike one protein,
01:01:11.460 | this is the part of the spike protein
01:01:13.380 | that actually touches the ACE2 receptor
01:01:15.740 | and then enables the virus to attach to your cells.
01:01:20.740 | - That's the thing that gives it that visual--
01:01:23.820 | - Yeah, the corona look, basically.
01:01:24.920 | - The corona look, yeah.
01:01:26.040 | - So basically, the spike protein sticks out of the virus,
01:01:28.560 | and there's a first part of the protein, S1,
01:01:31.200 | which basically attaches to the ACE2 receptor,
01:01:34.560 | and then S2 is the latch that sort of pushes and channels
01:01:39.520 | the fusion of the membranes and then the incorporation
01:01:42.400 | of the viral RNA inside our cells,
01:01:47.120 | which then gets translated into all of these 30 proteins.
01:01:50.520 | - So the S1 protein is evolving ridiculously fast.
01:01:55.520 | So if you look at S1, it's like there's this gas pedal.
01:02:00.540 | The gas pedal is all the way down.
01:02:02.240 | ORF8 is also evolving super fast,
01:02:06.180 | and ORF6 is evolving super fast.
01:02:07.640 | We have no idea what they do.
01:02:09.000 | We have some idea, but nowhere near what S1 is.
01:02:12.300 | So what the--
01:02:13.140 | - Isn't that terrifying that S1 is evolving?
01:02:15.220 | That means that's a really useful function,
01:02:17.940 | and if it's evolving fast,
01:02:19.780 | doesn't that mean new strains could be created,
01:02:21.740 | or it does something?
01:02:22.580 | - That means that it's searching for how to match,
01:02:25.220 | how to best match the host.
01:02:26.920 | So basically, anything, in general, in evolution,
01:02:29.460 | if you look at genomes,
01:02:30.300 | anything that's contacting the environment
01:02:32.500 | is evolving much faster than anything that's internal,
01:02:35.220 | and the reason is that the environment changes.
01:02:37.340 | So if you look at the evolution of these sarbecoviruses,
01:02:42.340 | the S1 protein has evolved very rapidly
01:02:44.620 | because it's attaching to different hosts each time.
01:02:47.620 | We think of them as bats,
01:02:48.720 | but there's thousands of species of bats,
01:02:50.820 | and to go from one species of bat to another species of bat,
01:02:53.100 | you have to adjust S1 to the new ACE2 receptor
01:02:56.180 | that you're gonna be facing in that new species.
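A crude way to see this per-gene "gas pedal" in code: compare aligned gene sequences from two strains and report percent identity. The gene names echo the discussion, but the sequences below are hypothetical toy strings, not real coronavirus data.

```python
# Per-gene divergence sketch: fast-evolving, outward-facing genes
# (like S1) show far lower identity between strains than slow,
# internal ones (like the polymerase). Toy alignments only.

def percent_identity(a, b):
    """Percent of matching positions between two equal-length,
    gap-free aligned sequences."""
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

toy_alignments = {  # gene -> (strain_1, strain_2)
    "polymerase":   ("ATGGCTAAAGCT", "ATGGCTAAAGCT"),
    "nucleocapsid": ("ATGGCTAAAGCT", "ATGGCTAAAGCA"),
    "S1":           ("ATGGCTAAAGCT", "ATGTCAAGAGGT"),
}
for gene, (s1, s2) in toy_alignments.items():
    print(f"{gene:12s} {percent_identity(s1, s2):5.1f}% identity")
```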
01:02:58.300 | - Sorry, quick tangent.
01:02:59.740 | - Yeah.
01:03:00.580 | - Is it fascinating to you that viruses are doing this?
01:03:03.740 | I mean, it feels like they're this intelligent organism.
01:03:07.100 | I mean, is it, like, does it give you pause
01:03:10.460 | how incredible it is that they are,
01:03:14.060 | that the evolutionary dynamics that you're describing
01:03:16.700 | is actually happening,
01:03:18.100 | and they're figuring out how to jump from bats to humans
01:03:22.340 | all in this distributed fashion,
01:03:24.380 | and then most of us don't even say
01:03:25.820 | they're alive or intelligent or whatever?
01:03:27.780 | - So intelligence is in the eye of the beholder.
01:03:31.260 | You know, stupid is as stupid does,
01:03:33.460 | as Forrest Gump would say,
01:03:35.140 | and intelligent is as intelligent does.
01:03:36.780 | So basically, if the virus is finding solutions
01:03:39.420 | that we think of as intelligent,
01:03:41.100 | yeah, it's probably intelligent,
01:03:42.460 | but that's, again, in the eye of the beholder.
01:03:44.140 | - Do you think viruses are intelligent?
01:03:45.900 | - Oh, of course not.
01:03:47.500 | - Really?
01:03:48.340 | - No.
01:03:49.180 | - It's so incredible.
01:03:50.380 | - So remember when I was talking
01:03:51.940 | about the two components of evolution?
01:03:53.540 | One is the stupid mutation, which is completely blind,
01:03:57.100 | and the other one is the super smart selection,
01:04:00.300 | which is ruthless.
01:04:01.860 | So it's not viruses who are smart.
01:04:04.820 | It's this component of evolution that's smart.
01:04:06.880 | So it's evolution that sort of appears smart.
01:04:10.380 | And how is that happening?
01:04:12.060 | By huge parallel search
01:04:15.500 | across thousands of, you know,
01:04:18.820 | parallel infections throughout the world right now.
01:04:21.700 | - Yes, but so to push back on that,
01:04:23.980 | so yes, so then the intelligence is in the mechanism.
01:04:27.980 | But then by that argument,
01:04:31.380 | viruses would be more intelligent
01:04:32.860 | because there's just more of them.
01:04:34.660 | So the search, they're basically the brute force search
01:04:38.740 | that's happening with viruses,
01:04:40.340 | because there's so many more of them than humans,
01:04:43.220 | then they're taken as a whole are more intelligent.
01:04:47.540 | I mean, so you don't think it's possible that,
01:04:50.740 | I mean, who runs, would we even be here if viruses weren't?
01:04:55.580 | I mean, who runs this thing?
01:04:58.340 | - So let me answer, yeah, let me answer your question.
01:05:03.040 | So we would not be here if it wasn't for viruses.
01:05:08.040 | And part of the reason is that
01:05:11.820 | if you look at mammalian evolution early on
01:05:14.340 | in this mammalian radiation
01:05:16.100 | that basically happened after the death of the dinosaurs,
01:05:18.580 | is that some of the viruses that we had in our genome
01:05:22.740 | spread throughout our genome
01:05:24.580 | and created binding sites
01:05:27.260 | for new classes of regulatory proteins.
01:05:30.340 | And these binding sites that landed all over our genome
01:05:33.340 | are now control elements that basically control our genes
01:05:36.860 | and sort of help the complexity of the circuitry
01:05:40.420 | of mammalian genomes.
01:05:42.220 | So, you know, everything's co-evolution.
01:05:45.100 | - That's fascinating, we're working together.
01:05:47.780 | And yet you say they're dumb. - We've co-opted them.
01:05:49.620 | No, I never said they're dumb.
01:05:51.300 | They just don't care.
01:05:53.620 | They don't care.
01:05:54.700 | Another thing, oh, is the virus trying to kill us?
01:05:56.940 | No, it's not.
01:05:58.020 | The virus is not trying to kill you.
01:05:59.940 | It's actually actively trying to not kill you.
01:06:02.820 | So when you get infected, if you die,
01:06:04.940 | "Oops, we killed him,"
01:06:07.300 | is what the reaction of the virus will be.
01:06:09.140 | Why? Because that virus won't spread.
01:06:11.020 | Many people have a misconception of,
01:06:13.740 | oh, viruses are smart or, oh, viruses are mean.
01:06:16.740 | They don't care.
01:06:17.580 | It's like you have to clean yourself
01:06:20.740 | of any kind of anthropomorphism out there.
01:06:23.220 | - I don't know. - Oh, yes.
01:06:25.420 | - So there's a sense when taken as a whole that there's a...
01:06:29.660 | - It's in the eye of the beholder.
01:06:32.940 | Stupid is as stupid does,
01:06:34.140 | intelligent is as intelligent does.
01:06:35.900 | So if you wanna call them intelligent, that's fine.
01:06:38.420 | Because the end result
01:06:40.140 | is that they're finding amazing solutions.
01:06:42.700 | I mean, I am in awe.
01:06:44.300 | - They're so dumb about it.
01:06:45.620 | They're just doing dumb. - They don't care.
01:06:46.980 | They're not dumb and they're not...
01:06:48.300 | They just don't care. - They don't care.
01:06:50.060 | The care word is really interesting.
01:06:51.820 | I mean, there could be an argument that they're conscious.
01:06:54.380 | - They're just dividing.
01:06:55.500 | They're not, they're just dividing.
01:06:57.660 | They're just a little entity
01:06:59.220 | which happens to be dividing and spreading.
01:07:02.700 | It doesn't want to kill us.
01:07:04.460 | In fact, it prefers not to kill us.
01:07:06.380 | It just wants to spread.
01:07:07.740 | And when I say wants, again, I'm anthropomorphizing,
01:07:11.060 | but it's just that if you have two versions of a virus,
01:07:15.100 | one acquires a mutation that spreads more,
01:07:17.460 | that's gonna spread more.
01:07:18.660 | One acquires a mutation that spreads less,
01:07:20.300 | that's gonna be lost.
01:07:21.820 | One acquires a mutation that enters faster,
01:07:24.060 | that's gonna be kept.
01:07:25.140 | One acquires a mutation that kills you right away,
01:07:27.060 | it's gonna be lost.
01:07:28.460 | So over evolutionary time,
01:07:30.540 | the viruses that spread super well,
01:07:32.740 | but don't kill the host,
01:07:33.980 | are the ones that are gonna survive.
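That selection logic is simple enough to simulate. A toy model with entirely made-up numbers: each variant's case count grows by its transmission rate, discounted by how often it kills the host before transmitting.

```python
# Toy model of the selection logic described above: variants differ
# only in transmission rate and in how often they remove the host
# before transmitting. All numbers are invented for illustration.

def grow(variants, generations=20):
    """variants: dict name -> (count, transmissions_per_case, p_kill)."""
    for _ in range(generations):
        for name, (n, r, p_kill) in variants.items():
            # cases that kill the host too fast never transmit
            variants[name] = (n * r * (1 - p_kill), r, p_kill)
    return {name: v[0] for name, v in variants.items()}

variants = {
    "spreads_more":       (100, 1.3, 0.01),
    "spreads_less":       (100, 1.1, 0.01),
    "kills_host_quickly": (100, 1.3, 0.40),
}
for name, n in grow(variants).items():
    print(f"{name:20s} {n:12.0f}")
# The fast spreader that spares its host dominates; the variant that
# kills quickly drives itself extinct, exactly as described above.
```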
01:07:36.300 | - Yeah, but so you brilliantly described
01:07:39.100 | the basic mechanisms of how it all happens,
01:07:41.100 | but when you zoom out and you see the,
01:07:43.380 | you know, the entirety of viruses,
01:07:46.500 | maybe across different strains of viruses,
01:07:49.980 | it seems like a living organism.
01:07:52.380 | - I am in awe of biology.
01:07:55.020 | I find biology amazingly beautiful.
01:07:58.380 | I find the design of the current coronavirus,
01:08:01.100 | however lethal it is, amazingly beautiful.
01:08:04.260 | The way that it is encoded,
01:08:06.340 | the way that it tricks your cells
01:08:08.980 | into making 30 proteins from a single RNA.
01:08:12.340 | Human cells don't do that.
01:08:14.300 | Human cells make one protein from each RNA molecule.
01:08:18.180 | They don't make two, they make one.
01:08:20.220 | We are hardwired to make only one protein
01:08:22.300 | from every RNA molecule.
01:08:23.780 | And yet this virus goes in,
01:08:25.700 | throws in a single messenger RNA.
01:08:27.620 | Just like any messenger RNA,
01:08:29.940 | we have tens of thousands of messenger RNAs
01:08:32.140 | in our cells in any one time.
01:08:34.140 | In every one of our cells.
01:08:35.820 | It throws in one RNA and that RNA is so,
01:08:40.620 | I'm gonna use your word here, not my word,
01:08:42.620 | intelligent that it hijacks the entire machinery
01:08:46.660 | of your human cell.
01:08:49.300 | It basically has at the beginning
01:08:52.460 | a giant open reading frame.
01:08:54.460 | That's a giant protein that gets translated.
01:08:57.140 | Two thirds of that RNA make a single giant protein.
01:09:01.540 | That single protein is basically
01:09:03.340 | what a human cell would make.
01:09:04.540 | It's like, oh, here's a start codon.
01:09:06.300 | I'm gonna start translating here.
01:09:07.660 | Human cells are kind of dumb, I'm sorry.
01:09:08.980 | Again, this is not the word that we'd normally use.
01:09:12.380 | But the human cell basically is,
01:09:13.380 | oh, this is an RNA, must be mine, let me translate.
01:09:15.900 | And it starts translating it and then you're in trouble.
01:09:19.260 | Because that one protein, as it's growing,
01:09:22.300 | gets cleaved into about 20 different peptides.
01:09:26.940 | The first peptide and the second peptide start interacting
01:09:30.900 | and the third one and the fourth one.
01:09:32.700 | And they shut off the ribosome of the whole cell
01:09:37.700 | to not translate human RNAs anymore.
01:09:42.780 | So the virus basically hijacks your cells
01:09:46.540 | and it cuts, it cleaves every one of your human RNAs
01:09:50.740 | to basically say to the ribosome,
01:09:52.060 | don't translate this one, junk.
01:09:53.460 | Don't look at this one, junk.
01:09:55.100 | And it only spares its own RNAs
01:09:58.660 | because they have a particular mark that it spares.
01:10:01.220 | Then all of the ribosomes that normally make protein
01:10:04.300 | in your human cells are now only able
01:10:06.780 | to translate viral RNAs.
01:10:09.180 | They have more and more and more and more of them.
01:10:11.380 | That's the first 20 proteins.
01:10:13.020 | In fact, halfway down about protein 11,
01:10:16.220 | between 11 and 12,
01:10:17.860 | you basically have a translational slippage
01:10:20.940 | where the ribosome skips reading frame
01:10:23.340 | and it translates from one reading frame
01:10:24.900 | to another reading frame.
01:10:25.740 | That means that about half of them
01:10:27.020 | are gonna be translated from one to 11
01:10:29.220 | and the other half are gonna be translated from 12 to 16.
01:10:32.700 | It's gorgeous.
01:10:34.260 | And then, then you're done.
01:10:37.380 | Then that mRNA will never translate the last 10 proteins,
01:10:40.420 | but spike is the one right after that one.
01:10:42.540 | So how does spike even get translated?
01:10:45.140 | This positive-strand RNA virus has an RNA-dependent RNA polymerase,
01:10:50.020 | an RNA-based copying enzyme.
01:10:52.340 | So from the RNA on the positive strand,
01:10:54.460 | it makes an RNA on the negative strand.
01:10:56.940 | And in between every single one of these genes,
01:10:59.620 | these open reading frames,
01:11:01.060 | there's a little signal, AACGCA or something like that,
01:11:05.580 | that basically loops over to the beginning of the RNA.
01:11:09.940 | And basically, instead of sort of having
01:11:11.700 | a single full negative strand RNA,
01:11:14.500 | it basically has a partial negative strand RNA
01:11:16.820 | that ends right before the beginning of that gene.
01:11:19.700 | And another one that ends right before
01:11:20.860 | the beginning of that gene.
01:11:21.980 | These negative strand RNAs now make positive strand RNAs
01:11:25.340 | that then look, to the human host cell,
01:11:27.460 | just like any other human mRNA.
01:11:29.740 | It's like, "Oh, great, I'm gonna translate that one."
01:11:31.740 | 'Cause it doesn't have the cleaving
01:11:32.980 | that the virus has now put on all your human genes.
01:11:36.300 | And then you've lost the battle.
01:11:38.300 | That cell is now only making proteins for the virus
01:11:42.500 | that will then create the spike protein,
01:11:45.500 | the envelope protein, the membrane protein,
01:11:47.620 | the nucleocapsid protein that will package up the RNA,
01:11:50.300 | and then sort of create new viral envelopes.
01:11:53.820 | And these will then be secreted out of that cell
01:11:57.780 | in new little packages
01:11:59.260 | that will then infect the rest of the cells.
01:12:00.580 | - Repeat the whole process again.
01:12:01.900 | - It's beautiful, right?
01:12:03.300 | It's mind-boggling.
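The programmed -1 frameshift he describes can be sketched as code: the ribosome reads codons in frame, and at a designated "slippery" position it can slip back one nucleotide and keep going in the new frame, reading past what would otherwise be a stop. Everything here, the mini codon table, sequence, and slip position, is invented for illustration; the real virus slips at a specific slippery sequence between ORF1a and ORF1b.

```python
# Toy -1 programmed ribosomal frameshift. Made-up sequence and slip
# site; in the real genome this lets one mRNA yield two overlapping
# polyproteins (roughly the "1 to 11" and "12 to 16" halves above).

CODONS = {"ATG": "M", "GCT": "A", "AAA": "K", "ATA": "I",
          "AAG": "K", "CTT": "L", "TAA": "*", "TAG": "*"}

def translate(seq, slip_at=None):
    """Translate from position 0; if slip_at is set, shift the reading
    frame back by one nucleotide once the ribosome reaches that index."""
    protein, i = [], 0
    while i + 3 <= len(seq):
        if slip_at is not None and i >= slip_at:
            i -= 1              # the -1 slip
            slip_at = None      # slip only once
            continue
        aa = CODONS.get(seq[i:i + 3], "?")
        if aa == "*":           # stop codon: release the protein
            break
        protein.append(aa)
        i += 3
    return "".join(protein)

seq = "ATGGCTAAATAAAGCTTTAG"
print(translate(seq))             # 'MAK'    -- in-frame stop, short product
print(translate(seq, slip_at=9))  # 'MAKIKL' -- slip reads past the stop
```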
01:12:04.140 | - It's hard not to anthropomorphize it.
01:12:05.660 | (laughing)
01:12:06.500 | - I know, but it's so gorgeous.
01:12:08.100 | - So there is a beauty to it.
01:12:09.900 | - Of course.
01:12:10.740 | - Is it terrifying to you?
01:12:13.980 | - So this is something that has happened throughout history.
01:12:16.900 | Humans have been nearly wiped out
01:12:19.700 | over and over and over again,
01:12:21.140 | and yet never fully wiped out.
01:12:23.260 | So I'm not concerned about the human race.
01:12:25.940 | I'm not even concerned about the impact
01:12:29.340 | on sort of our survival as a species.
01:12:32.460 | This is absolutely something,
01:12:35.620 | I mean, human life is so invaluable,
01:12:38.660 | and every one of us is so invaluable,
01:12:40.060 | but if you think of it as sort of,
01:12:42.260 | is this the end of our species?
01:12:44.100 | By no means, basically.
01:12:46.420 | So let me explain.
01:12:48.100 | The Black Death killed what, 30% of Europe?
01:12:52.220 | That has left a tremendous imprint,
01:12:55.780 | a huge hole, a horrendous hole
01:13:00.300 | in the genetic makeup of humans.
01:13:04.540 | There's been series of wiping out
01:13:07.580 | of huge fractions of entire species,
01:13:10.780 | or just entire species altogether,
01:13:13.100 | and that has a consequence on the human immune repertoire.
01:13:19.580 | If you look at how Europe was shaped,
01:13:23.300 | and how Africa was shaped by malaria, for example,
01:13:27.220 | all the individuals that carry a mutation
01:13:29.460 | that protects you from malaria
01:13:31.820 | were able to survive much more.
01:13:33.660 | And if you look at the frequency of sickle cell disease
01:13:36.420 | and the frequency of malaria,
01:13:38.180 | the maps are actually showing the same pattern,
01:13:40.860 | the same imprint on Africa,
01:13:42.860 | and that basically led people to hypothesize
01:13:44.820 | that the reason why sickle cell disease
01:13:46.300 | is so much more frequent in Americans of African descent
01:13:50.100 | is because there was selection in Africa against malaria,
01:13:54.100 | leading to sickle cell, because when the cells sickle,
01:13:57.860 | malaria is not able to replicate inside your cells as well,
01:14:01.420 | and therefore you protect against that.
01:14:03.140 | So if you look at human disease,
01:14:05.420 | all of the genetic associations
01:14:06.900 | that we do with human disease,
01:14:09.260 | you basically see the imprint
01:14:13.700 | of these waves of selection,
01:14:15.700 | killing off gazillions of humans.
01:14:18.380 | And there's so many immune processes that are coming up
01:14:23.260 | as associated with so many different diseases.
01:14:25.940 | The reason for that is similar
01:14:27.580 | to what I was describing earlier,
01:14:28.700 | where the outward facing proteins evolve much more rapidly
01:14:33.700 | because the environment is always changing.
01:14:35.940 | But what's really interesting in the human genome
01:14:37.660 | is that we have co-opted many of these immune genes
01:14:40.380 | to carry out non-immune functions.
01:14:42.460 | For example, in our brain,
01:14:44.020 | we use immune cells to cleave off neuronal connections
01:14:48.900 | that don't get used.
01:14:50.180 | This whole use it or lose it, we know the mechanism.
01:14:52.860 | It's microglia that cleave off
01:14:54.620 | neuronal synaptic connections that are just not utilized.
01:14:59.940 | When you utilize them, you mark them in a particular way
01:15:02.060 | that basically when the microglia come,
01:15:04.420 | tell it, "Don't kill this one, it's used now."
01:15:07.860 | And the microglia will go off
01:15:08.980 | and kill the ones you don't use.
01:15:10.380 | This is an immune function,
01:15:12.820 | which is co-opted to do non-immune things.
01:15:15.020 | If you look at our adipocytes,
01:15:16.820 | M1 versus M2 macrophages inside our fat
01:15:19.940 | will basically determine whether you're obese or not.
01:15:22.660 | And these are again immune cells that are resident
01:15:24.780 | and living within these tissues.
01:15:27.100 | So many disease associations.
01:15:30.260 | - Fascinating that we co-opt these kinds of things
01:15:33.700 | for incredibly complicated functions.
01:15:36.700 | - Exactly, evolution works in so many different ways,
01:15:39.900 | which are all beautiful and mysterious.
01:15:42.020 | - But not intelligent.
01:15:43.300 | - Not intelligent, it's in the eye of the beholder.
01:15:45.780 | (both laughing)
01:15:47.580 | But the point that I'm trying to make
01:15:50.340 | is that if you look at the imprint that COVID will have,
01:15:54.260 | hopefully it will not be big.
01:15:55.980 | Hopefully the US will get its act together
01:15:57.980 | and stop the virus from spreading further.
01:16:00.420 | But if it doesn't, it's having an imprint
01:16:03.500 | on individuals who have particular genetic repertoires.
01:16:07.340 | So if you look at now the genetic associations
01:16:10.080 | of blood type and immune function cells, et cetera,
01:16:13.620 | there's actually association, genetic variation
01:16:15.740 | that basically says how much more likely am I or you to die
01:16:18.540 | if we contract the virus.
01:16:20.220 | And it's through these rounds of shaping the human genome
01:16:24.540 | that humans have basically made it so far.
01:16:27.380 | And selection is ruthless and it's brutal
01:16:32.380 | and it only comes with a lot of killing.
01:16:34.380 | But this is the way that viruses and environments
01:16:38.120 | have shaped the human genome.
01:16:39.500 | Basically when you go through periods of famine,
01:16:41.420 | you select for particular genes.
01:16:43.660 | And what's left is not necessarily better,
01:16:46.540 | it's just whatever survived.
01:16:49.020 | And it may have been the surviving one back then,
01:16:51.980 | not because it was better,
01:16:53.140 | maybe the ones that ran slower survived.
01:16:54.980 | I mean, again, not necessarily better,
01:16:57.400 | but the surviving ones are basically the ones
01:17:00.020 | that then are shaped for any kind
01:17:02.420 | of subsequent evolutionary condition
01:17:05.380 | and environmental condition.
01:17:07.260 | But if you look at, for example, obesity,
01:17:09.580 | obesity was selected for,
01:17:11.780 | basically the genes that now predisposes to obesity
01:17:14.420 | were at 2% frequency in Africa.
01:17:16.660 | They rose to 44% frequency in Europe.
01:17:19.020 | - Wow, that's fascinating.
01:17:20.300 | - Because you basically went through the ice ages
01:17:22.940 | and there was a scarcity of food.
01:17:24.620 | So there was a selection to being able to store
01:17:27.140 | every single calorie you consume.
01:17:30.740 | Eventually, environment changes.
01:17:33.360 | So the better allele, which was the fat storing allele,
01:17:36.560 | became the worst allele,
01:17:38.060 | because it's the fat storing allele.
01:17:40.340 | It still has the same function.
01:17:42.500 | So if you look at my genome, speaking of mom calling,
01:17:45.660 | mom gave me a bad copy of that gene, this FTO locus.
01:17:49.940 | Basically makes me-- - The one that has to do
01:17:51.260 | with the-- - Obesity.
01:17:52.540 | - With obesity.
01:17:53.460 | - Yeah, I basically now have a bad copy from mom
01:17:56.000 | that makes me more likely to be obese.
01:17:57.940 | And I also have a bad copy from dad
01:18:00.380 | that makes me more likely to be obese.
01:18:01.580 | So I'm homozygous.
01:18:03.340 | And that's the allele, it's still the minor allele,
01:18:07.420 | but it's at 44% frequency in Southeast Asia,
01:18:10.660 | 42% frequency in Europe, even though it started at 2%.
01:18:14.260 | It was an awesome allele to have 100 years ago.
01:18:17.560 | Right now, it's a pretty terrible allele.
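The arithmetic of that rise is worth seeing. A deterministic one-locus selection sketch: an allele starting at 2% climbs past 40% given a modest fitness advantage over enough generations. The selection coefficient here is invented for illustration, not an estimate for the FTO variant.

```python
# One-locus haploid selection: allele A has relative fitness 1+s vs 1.
# Shows how a 2% allele (e.g., a calorie-storing variant during food
# scarcity) can climb toward ~50%. s and generation count are made up.

def select(p, s, generations):
    history = [p]
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        history.append(p)
    return history

traj = select(p=0.02, s=0.05, generations=80)
for g in range(0, 81, 20):
    print(f"generation {g:3d}: freq = {traj[g]:.3f}")
# 0.02 -> ~0.05 -> ~0.13 -> ~0.28 -> ~0.50 across 80 generations
```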
01:18:19.400 | So the other concept is that diversity matters.
01:18:23.540 | If we had 100 million nuclear physicists
01:18:26.780 | living on the earth right now, we'd be in trouble.
01:18:29.900 | (Lex laughs)
01:18:30.740 | You need diversity, you need artists,
01:18:32.860 | and you need musicians, and you need mathematicians,
01:18:34.940 | and you need politicians, yes, even those.
01:18:38.140 | And you need-- - Oh, let's not get crazy.
01:18:40.140 | (Lex laughs)
01:18:40.980 | But because then if a virus comes along or whatever--
01:18:44.300 | - Exactly, exactly.
01:18:45.900 | So no, there's two reasons.
01:18:47.020 | Number one, you want diversity in the immune repertoire,
01:18:49.900 | and we have built-in diversity.
01:18:51.940 | So basically, it is the most diverse system we have.
01:18:54.420 | Basically, if you look at our immune system,
01:18:55.780 | there's layers and layers of diversity.
01:18:57.880 | Like the way that you create your cells generates diversity
01:19:02.340 | because of the selection for the VDJ recombination
01:19:05.680 | that basically eventually leads
01:19:07.360 | to a huge number of repertoires.
01:19:08.940 | But that's only one small component of diversity.
01:19:10.980 | The blood type is another one.
01:19:12.340 | The major histocompatibility complex, the HLA alleles,
01:19:16.620 | are another source of diversity.
01:19:18.760 | So the immune system of humans
01:19:21.080 | is by nature incredibly diverse.
01:19:24.140 | And that basically leads to resilience.
01:19:26.600 | So basically, what I'm saying that I don't worry
01:19:28.420 | for the human species.
01:19:30.400 | Because we are so diverse immunologically,
01:19:33.580 | we are likely to be very resilient
01:19:35.940 | against so many different attacks like this current virus.
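A back-of-envelope sketch of just one of those diversity layers, using rough textbook-style counts for human antibody gene segments; the exact numbers vary by source, and these are purely illustrative:

```python
# Combinatorial layer of immune diversity from VDJ recombination.
# Segment counts are approximate textbook-style figures, used only
# to show how the combinatorics explodes; junctional insertions and
# deletions multiply this by orders of magnitude more.

v_segments, d_segments, j_segments = 40, 25, 6   # rough heavy-chain counts
heavy_chain_combos = v_segments * d_segments * j_segments
light_chain_combos = 40 * 5                      # rough light-chain V x J
paired = heavy_chain_combos * light_chain_combos

print(f"heavy-chain VDJ combinations: {heavy_chain_combos:,}")
print(f"paired heavy x light combos : {paired:,}")
# ~6,000 x 200 = 1.2 million receptors from segment choice alone,
# before junctional diversity pushes the repertoire far higher.
```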
01:19:40.020 | - So you're saying natural pandemics may not be something
01:19:43.100 | that you're really afraid of because of the diversity
01:19:45.940 | in our genetic makeup.
01:19:49.420 | What about engineered pandemics?
01:19:51.420 | Do you have fears of us messing with the makeup of viruses
01:19:56.680 | or, well, yeah, let's say with the makeup of viruses
01:19:59.880 | to create something that we can't control
01:20:01.800 | and would be much more destructive
01:20:03.760 | than it would come about naturally?
01:20:06.280 | - Remember how we were talking about how smart evolution is?
01:20:08.960 | Humans are much dumber.
01:20:10.160 | - You mean like human scientists, engineers?
01:20:12.680 | - Yeah, humans, humans just like--
01:20:14.240 | - Humans overall?
01:20:15.080 | - Yeah, humans overall.
01:20:17.080 | But I mean, even the sort of synthetic biologists.
01:20:21.540 | Basically, if you were to create
01:20:26.100 | a virus like SARS that will kill a lot of people,
01:20:30.900 | you would probably start with SARS.
01:20:33.200 | So whoever would like to design such a thing
01:20:38.620 | would basically start with a SARS tree
01:20:41.180 | or at least some relative of SARS.
01:20:43.540 | The source genome for the current virus
01:20:46.720 | was something completely different.
01:20:48.300 | It was something that has never infected humans.
01:20:50.600 | No one in their right mind would have started there.
01:20:52.860 | - But when you say source, it's like the nearest--
01:20:54.980 | - The nearest relative is in a whole other branch,
01:20:58.700 | no species of which has ever infected humans in that branch.
01:21:01.700 | So let's put this to rest.
01:21:05.300 | This was not designed by someone to kill off the human race.
01:21:08.540 | - So you don't believe it was engineered?
01:21:10.900 | - The-- - Well, likely.
01:21:13.060 | - Yeah, the path to engineering a deadly virus
01:21:16.080 | would not come from this strain that was used.
01:21:21.140 | Moreover, there's been various claims of,
01:21:26.140 | ha-ha, this was mixed and matched in a lab
01:21:29.220 | because the S1 protein has three different components,
01:21:32.480 | each of which has a different evolutionary tree.
01:21:34.620 | So a lot of popular press basically said,
01:21:37.220 | aha, this came from pangolin
01:21:39.140 | and this came from all kinds of other species.
01:21:41.820 | This is what has been happening
01:21:44.820 | throughout the coronavirus tree.
01:21:46.860 | So basically, the S1 protein has been recombining
01:21:49.300 | across species all the time.
01:21:50.380 | Remember when I was talking about the positive strand,
01:21:51.980 | the negative strand, subgenomic RNAs?
01:21:54.300 | These can actually recombine.
01:21:55.740 | And if you have two different viruses
01:21:57.100 | infecting the same cell,
01:21:58.500 | they can actually mix and match
01:21:59.740 | between the positive strand and the negative strand
01:22:01.280 | and basically create a new hybrid virus with recombination
01:22:04.660 | that now has the S1 from one
01:22:06.660 | and the rest of the genome from another.
01:22:08.740 | And this is something that happens a lot in S1,
01:22:10.540 | in ORF8, et cetera.
01:22:12.020 | And that's something that's true of the whole tree.
01:22:13.900 | - For the whole family of-- - Exactly.
01:22:14.980 | - Coronaviruses. - So it's not like
01:22:16.500 | someone has been messing with this for millions of years
01:22:19.420 | and changing the-- - This happens naturally.
01:22:21.620 | That's, again, beautiful that that somehow happens,
01:22:24.420 | that they recombine.
01:22:25.900 | So two different strands can infect the body
01:22:27.700 | and then recombine.
01:22:28.640 | So all of this magic actually happens inside hosts.
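A toy picture of that template-switching recombination in code: two parent genomes share a signal sequence, and the polymerase "jumps" from one to the other at that point, producing a hybrid. The mini-genomes and the signal marker are entirely made up, loosely echoing the intergenic signal mentioned earlier.

```python
# Toy template-switching recombination: when two related viruses
# infect the same cell, the polymerase can switch templates at a
# shared signal sequence, yielding a hybrid genome (e.g., S1 from
# one parent, the rest from the other). Everything here is invented.

def recombine(parent_a, parent_b, signal):
    """Return a hybrid: parent_a up to the signal, parent_b after it."""
    i = parent_a.find(signal)
    j = parent_b.find(signal)
    if i == -1 or j == -1:
        raise ValueError("both parents must carry the signal sequence")
    return parent_a[:i] + parent_b[j:]

# Hypothetical mini-genomes with a shared intergenic signal "AACGCA":
bat_strain   = "ORF1a-bat|AACGCA|S1-bat|rest-bat"
other_strain = "ORF1a-oth|AACGCA|S1-oth|rest-oth"
print(recombine(bat_strain, other_strain, "AACGCA"))
# -> "ORF1a-bat|AACGCA|S1-oth|rest-oth" (hybrid with a swapped spike)
```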
01:22:34.540 | - Yeah, that's why classification-wise,
01:22:39.220 | virus is not thought to be alive
01:22:40.700 | because it doesn't self-replicate, it's not autonomous.
01:22:43.020 | It's something that enters a living cell
01:22:45.740 | and then co-opts it to basically make it its own.
01:22:48.760 | But by itself, people ask me, how do we kill this bastard?
01:22:51.580 | I'm like, you stop it from replicating.
01:22:54.180 | It's not like a bacterium that will just live
01:22:57.660 | in a puddle or something.
01:23:01.060 | It's a virus.
01:23:02.460 | Viruses don't live without their host.
01:23:04.420 | And they only live without their host for very little time.
01:23:07.360 | So if you stop it from replicating,
01:23:09.100 | it'll stop from spreading.
01:23:11.260 | I mean, it's not like HIV,
01:23:12.300 | which can stay dormant for a long time.
01:23:13.900 | Basically, coronaviruses just don't do that.
01:23:15.580 | They're not integrating genomes.
01:23:16.780 | They're RNA genomes.
01:23:18.060 | So if it's not expressed, it degrades.
01:23:20.220 | RNA degrades.
01:23:21.180 | It doesn't just stick around.
01:23:23.380 | - Well, let me ask also about the immune system you mentioned.
01:23:27.340 | A lot of people kind of ask,
01:23:29.360 | how can we strengthen the immune system
01:23:34.120 | to respond to this particular virus,
01:23:36.300 | but in viruses in general?
01:23:37.740 | Do you have, from a biological perspective,
01:23:40.420 | thoughts on what we can do as humans
01:23:43.140 | to strengthen our immune system?
01:23:43.980 | - If you look at the death rates across different countries,
01:23:46.620 | people with less vaccination have been dying more.
01:23:49.700 | If you look at North Italy,
01:23:51.380 | the vaccination rates are abysmal there.
01:23:53.940 | And a lot of people have been dying.
01:23:55.860 | If you look at Greece, very good vaccination rates.
01:23:58.780 | Almost no one has been dying.
01:24:00.300 | So yes, there's a policy component.
01:24:03.580 | So Italy reacted very slowly.
01:24:05.980 | Greece reacted very fast.
01:24:07.460 | So yeah, many fewer people died in Greece.
01:24:09.760 | But there might actually be a component
01:24:11.720 | of genetic immune repertoire.
01:24:14.080 | Basically, how did people die off in the history
01:24:18.140 | of the Greek population versus the Italian population?
01:24:20.700 | - Wow.
01:24:21.540 | - There's a--
01:24:22.380 | - That's interesting to think about.
01:24:24.620 | - And then there's a component
01:24:25.980 | of what vaccinations did you have as a kid,
01:24:28.940 | and what are the off-target effects of those vaccinations?
01:24:32.480 | So basically, a vaccination can have two components.
01:24:34.900 | One is training your immune system
01:24:37.620 | against that specific insult.
01:24:39.480 | The second one is boosting up your immune system
01:24:42.140 | for all kinds of other things.
01:24:44.600 | If you look at allergies,
01:24:47.100 | Northern Europe, super clean environments,
01:24:50.200 | tons of allergies.
01:24:51.420 | Southern Europe, my kids grew up eating dirt.
01:24:54.040 | No allergies.
01:24:55.800 | (both laugh)
01:24:57.060 | So growing up, I had never even heard of what allergies are.
01:25:00.380 | Like, was it really allergies?
01:25:01.940 | And the reason is that I was playing in the garden.
01:25:03.580 | I was putting all kinds of stuff in my mouth
01:25:05.560 | from all kinds of dirt and stuff.
01:25:07.360 | Tons of viruses there, tons of bacteria there.
01:25:09.920 | My immune system was built up.
01:25:11.480 | So the more you protect your immune system from exposure,
01:25:16.480 | the less opportunity it has to learn about non-self repertoire
01:25:21.700 | in a way that prepares it for the next insult.
01:25:24.400 | - So that's the horizontal thing, too?
01:25:26.760 | So it's throughout your lifetime
01:25:28.120 | and the lifetime of the people that came before you, your ancestors?
01:25:33.120 | - Yeah, yeah, absolutely.
01:25:34.080 | - What about, so again, it returns against free will.
01:25:37.920 | On the free will side of things,
01:25:39.560 | is there something we could do
01:25:41.000 | to strengthen our immune system in 2020?
01:25:44.760 | Is there exercise, diet, all that kind of stuff?
01:25:49.760 | - So it's kind of funny.
01:25:52.880 | There's a cartoon that basically shows two windows
01:25:55.920 | with a teller in each window.
01:25:58.280 | One has a humongous line, and the other one has no one.
01:26:02.200 | The one that has no one above says health.
01:26:04.720 | No, it says exercise and diet.
01:26:07.240 | And the other one says pill.
01:26:09.000 | (Lex laughs)
01:26:10.280 | And there's a huge line for pill.
01:26:12.160 | So we're looking basically for magic bullets
01:26:13.960 | for sort of ways that we can beat cancer
01:26:17.640 | and beat coronavirus and beat this and beat that.
01:26:19.480 | And it turns out that the window with just diet and exercise
01:26:23.240 | is the best way to boost every aspect of your health.
01:26:26.120 | If you look at Alzheimer's, exercise and nutrition.
01:26:31.120 | I mean, you're like, really?
01:26:32.600 | For my brain, neurodegeneration?
01:26:34.720 | Absolutely.
01:26:36.160 | If you look at cancer, exercise and nutrition.
01:26:39.600 | (Lex laughs)
01:26:40.440 | If you look at coronavirus, exercise and nutrition.
01:26:43.800 | Every single aspect of human health gets improved.
01:26:47.280 | And one of the studies we're doing now
01:26:48.600 | is basically looking at what are the effects
01:26:51.240 | of diet and exercise?
01:26:52.920 | How similar are they to each other?
01:26:55.320 | We basically take in diet intervention
01:26:58.400 | and exercise intervention in human and in mice,
01:27:01.400 | and we're basically doing single cell profiling
01:27:03.560 | of a bunch of different tissues
01:27:04.960 | to basically understand how are the cells,
01:27:08.040 | both the stromal cells and the immune cells
01:27:10.920 | of each of these tissues,
01:27:12.360 | responding to the effect of exercise.
01:27:15.120 | What are the communication networks between different cells
01:27:18.560 | where the muscle that exercises sends signals
01:27:23.560 | through the bloodstream, through the lymphatic system,
01:27:26.000 | through all kinds of other systems
01:27:27.680 | that give signals to other cells that I have exercised
01:27:31.600 | and you should change in this particular way,
01:27:34.000 | which basically reconfigures those receiving cells
01:27:37.640 | with the effect of exercise.
01:27:39.880 | - So how well understood is those reconfigurations?
01:27:43.880 | - Very little.
01:27:44.720 | We're just starting now, basically.
01:27:47.400 | - Is the hope there to understand the effect on,
01:27:51.060 | so like the effect on the immune system?
01:27:54.360 | - On the immune system, the effect on brain,
01:27:56.280 | the effect on your liver, on your digestive system,
01:27:59.100 | on your adipocytes.
01:28:00.960 | Adipose, you know, the most misunderstood organ.
01:28:03.640 | Everybody thinks, "Ugh, fat, terrible."
01:28:05.800 | No, fat is awesome.
01:28:07.480 | Your fat cells is what's keeping you alive
01:28:09.960 | because if you didn't have your fat cells,
01:28:11.440 | all those lipids and all those calories
01:28:13.960 | would be floating around in your blood
01:28:15.440 | and you'd be dead by now.
01:28:16.960 | Your adipocytes are your best friend.
01:28:18.480 | They're basically storing all these excess calories
01:28:21.840 | so that they don't hurt all of the rest of the body.
01:28:24.960 | And they're also fat-burning in many ways.
01:28:28.960 | So, you know, again,
01:28:30.600 | when you don't have the homozygous version that I have,
01:28:33.520 | your cells are able to burn calories much more easily
01:28:36.560 | by sort of flipping a master metabolic switch
01:28:40.080 | that involves this FTO locus that I mentioned earlier
01:28:42.460 | and its target genes, IRX3 and IRX5,
01:28:45.120 | that basically switch your adipocytes
01:28:47.600 | during their three first days of differentiation
01:28:50.840 | as they're becoming mature adipocytes
01:28:52.360 | to basically become either fat-burning
01:28:54.380 | or fat-storing fat cells.
01:28:57.140 | And the fat-burning fat cells are your best friend.
01:28:59.040 | They're much closer to muscle
01:29:00.520 | than they are to white adipocytes.
01:29:02.880 | - Is there a lot of difference between people
01:29:04.600 | like that you could give,
01:29:06.400 | science could eventually give advice
01:29:09.520 | that is very generalizable?
01:29:12.260 | Or is our differences in our genetic makeup,
01:29:16.100 | like you mentioned,
01:29:16.940 | is that going to be basically something
01:29:19.080 | we have to be very specialized to individuals?
01:29:22.800 | Any advice we give in terms of diet,
01:29:24.880 | like what we were just talking about?
01:29:25.880 | - Believe it or not,
01:29:26.720 | the most personalized advice that you can give for nutrition
01:29:29.640 | doesn't have to do with your genome.
01:29:31.440 | It has to do with your gut microbiome,
01:29:34.400 | with the bacteria that live inside you.
01:29:35.900 | So most of your digestion is actually happening
01:29:37.960 | by species that are not human inside you.
01:29:40.800 | You have more non-human cells than you have human cells.
01:29:43.080 | You're basically a giant bag of bacteria
01:29:46.720 | with a few human cells along.
01:29:48.320 | - And those do not necessarily have to do
01:29:53.120 | with your genetic makeup?
01:29:54.900 | - They interact with your genetic makeup.
01:29:56.760 | They interact with your epigenome.
01:29:58.000 | They interact with your nutrition.
01:29:59.600 | They interact with your environment.
01:30:01.280 | They're basically an additional source of variation.
01:30:06.280 | So when you're thinking about
01:30:08.120 | personalized nutritional advice,
01:30:10.080 | part of that is actually how do you match your microbiome?
01:30:13.640 | And part of that is how do we match your genetics?
01:30:17.080 | But again, this is a very diverse set of contributors.
01:30:22.080 | And the effect sizes are not enormous.
01:30:24.640 | So I think the science for that is not fully developed yet.
01:30:27.920 | - Speaking of diets,
01:30:28.760 | 'cause I've wrestled in combat sports
01:30:30.640 | my whole life, where weight matters.
01:30:32.640 | So you have to cut and all that stuff.
01:30:35.320 | One thing I've learned a lot about my body,
01:30:38.200 | and it seems to be, I think, true
01:30:40.120 | about other people's bodies,
01:30:41.640 | is that you can adjust to a lot of things.
01:30:45.080 | That's the miraculous thing about this biological system.
01:30:48.240 | I fast often.
01:30:52.340 | I used to eat five, six times a day
01:30:55.000 | and thought that was absolutely necessary.
01:30:57.000 | How could you not eat that often?
01:30:59.000 | And then when I started fasting,
01:31:01.320 | your body adjusts to that,
01:31:02.720 | and you learn how to not eat.
01:31:04.200 | And it was, if you just give it a chance
01:31:07.600 | for a few weeks, actually,
01:31:09.140 | over a period of a few weeks,
01:31:10.320 | your body can adjust to anything.
01:31:11.800 | And that's such a beautiful thing.
01:31:14.120 | - So I'm a computer scientist,
01:31:15.480 | and I've basically gone through periods of 24 hours
01:31:18.040 | without eating or stopping.
01:31:19.680 | And then I'm like, ooh, must eat.
01:31:22.080 | And I eat a ton.
01:31:23.080 | I used to order two pizzas just with my brother.
01:31:27.440 | So I've gone through these extremes as well,
01:31:29.720 | and I've gone the whole intermittent fasting thing.
01:31:32.180 | So I can sympathize with you on both,
01:31:34.040 | from the seven meals a day to the zero meals a day.
01:31:36.760 | So I think when I say everything with moderation,
01:31:40.840 | I actually think your body responds interestingly
01:31:44.360 | to these different changes in diet.
01:31:47.360 | I think part of the reason why we lose weight
01:31:49.880 | with pretty much every kind of change in behavior
01:31:52.200 | is because our epigenome and the set of proteins
01:31:55.920 | and enzymes that are expressed in our microbiome
01:31:58.600 | are not well suited to that nutritional source.
01:32:02.040 | And therefore, they will not be able
01:32:03.840 | to sort of catch everything that you give them.
01:32:06.660 | And then a lot of that will go undigested.
01:32:09.160 | And that basically means that your body
01:32:10.720 | can then lose weight in the short term,
01:32:13.200 | but very quickly will adjust to that new normal.
01:32:16.160 | And then you'll perhaps be able to gain back
01:32:18.200 | a lot of weight from the same diet.
01:32:20.400 | So anyway, I mean, there's also studies in factories
01:32:24.400 | where basically people dim the lights,
01:32:27.200 | and then suddenly everybody started working better.
01:32:28.720 | It was like, wow, that's amazing.
01:32:30.160 | Three weeks later, they made the lights a little brighter.
01:32:32.840 | Everybody started working better.
01:32:34.240 | (laughs)
01:32:35.360 | So any kind of intervention has a placebo effect of,
01:32:39.420 | wow, now I'm healthier,
01:32:40.600 | now I'm gonna be running more often, et cetera.
01:32:42.120 | So it's very hard to uncouple the placebo effect
01:32:44.600 | of, wow, I'm doing something to intervene on my diet
01:32:47.080 | from the, wow, this is actually the right thing for me.
01:32:50.280 | So, you know. - Yeah, from the perspective
01:32:51.960 | from a nutrition science, psychology,
01:32:54.760 | both things I'm interested in, especially psychology,
01:32:57.080 | it seems that it's extremely difficult to do good science
01:33:01.240 | because there's so many variables involved.
01:33:04.200 | It's so difficult to control the variables.
01:33:06.560 | So difficult to do sufficiently large-scale experiments,
01:33:10.300 | both sort of in terms of number of subjects and temporal,
01:33:14.200 | like how long you do the study for,
01:33:17.020 | that it just seems like it's not even a real science for now,
01:33:21.480 | like nutrition science.
01:33:22.640 | - I wanna jump into the whole placebo effect
01:33:24.820 | for a little bit here and basically
01:33:26.960 | talk about the implications of that.
01:33:30.200 | If I give you a sugar pill and I tell you it's a sugar pill,
01:33:33.200 | you won't get any better.
01:33:35.520 | But if I tell you a sugar pill and I tell you,
01:33:38.200 | wow, this is an amazing drug
01:33:40.000 | and it actually will stop your cancer,
01:33:42.240 | your cancer will actually stop with much higher probability.
01:33:46.280 | What does that mean?
01:33:47.120 | - That's so amazing, by the way.
01:33:47.940 | - That means that if I can trick your brain
01:33:49.920 | into thinking that I'm healing you,
01:33:51.840 | your brain will basically figure out a way to heal itself,
01:33:54.580 | to heal the body.
01:33:56.040 | And that tells us that there's so much
01:33:58.600 | that we don't understand in the interplay
01:34:01.440 | between our cognition and our biology
01:34:04.120 | that if we were able to better harvest
01:34:08.400 | the power of our brain to sort of impact the body
01:34:12.320 | through the placebo effect,
01:34:14.200 | we would be so much better in so many different things.
01:34:17.300 | Just by tricking yourself into thinking
01:34:19.220 | that you're doing better, you're actually doing better.
01:34:21.560 | So there's something to be said
01:34:22.640 | about sort of positive thinking, about optimism,
01:34:25.020 | about sort of just getting your brain and your mind
01:34:30.020 | into the right mindset that helps your body
01:34:34.920 | and helps your entire biology.
01:34:37.200 | - Yeah, from a science perspective, that's just fascinating.
01:34:40.120 | Obviously, most things about the brain
01:34:41.880 | is a total mystery for now,
01:34:43.920 | but that's a fascinating interplay
01:34:46.440 | that the brain can reduce,
01:34:49.920 | the brain can help cure cancer.
01:34:53.120 | I don't even know what to do with that.
01:34:56.100 | - I mean, the way to think about that is the following.
01:34:59.400 | The converse of the equation
01:35:01.060 | is something that we are much more comfortable with.
01:35:03.420 | Like, oh, if you're stressed,
01:35:06.020 | then your heart rate might rise
01:35:08.500 | and all kinds of sort of toxins might be released
01:35:11.260 | and that can have a detrimental effect in your body,
01:35:13.920 | et cetera, et cetera, et cetera.
01:35:15.100 | So maybe it's easier to understand your body
01:35:18.340 | healing from your mind
01:35:20.340 | as your mind not killing your body,
01:35:23.340 | or at least killing it less.
01:35:24.860 | So I think that aspect of the stress equation
01:35:28.300 | is a little easier for most of us to conceptualize,
01:35:31.780 | but then the healing part is perhaps the same pathways,
01:35:35.140 | perhaps different pathways,
01:35:36.140 | but again, something that is totally untapped scientifically.
01:35:39.500 | - I think we tried to bring this question up a couple of times
01:35:42.860 | but let's return to it again.
01:35:44.580 | Is what do you think is the difference
01:35:46.460 | between the way a computer represents information,
01:35:49.460 | the human genome represents and stores information?
01:35:53.100 | And maybe broadly, what is the difference
01:35:55.500 | between how you think about computers
01:35:57.820 | and how you think about biological systems?
01:36:00.420 | - So I made a very provocative claim earlier
01:36:02.500 | that we are a digital computer,
01:36:04.340 | like that at the core lies a digital code.
01:36:06.340 | And that's true in many ways,
01:36:07.540 | but surrounding that digital core,
01:36:09.460 | there's a huge amount of analog.
01:36:11.460 | If you look at our brain, it's not really digital.
01:36:13.660 | If you look at our sort of RNA
01:36:15.780 | and all of that stuff inside ourselves,
01:36:17.260 | it's not really digital.
01:36:18.100 | It's really analog in many ways.
01:36:19.860 | But let's start with a code
01:36:22.780 | and then we'll expand to the rest.
01:36:24.820 | So the code itself is digital.
01:36:27.820 | So there's genes.
01:36:28.660 | You can think of the genes as, I don't know,
01:36:30.820 | the procedures, the functions inside your language.
01:36:33.900 | And then somehow you have to turn these functions on.
01:36:36.420 | How do you call a gene?
01:36:37.340 | How do you call that function?
01:36:39.380 | The way that you would do it in old programming languages
01:36:41.900 | is go to address, whatever in your memory,
01:36:44.580 | and then you'd start running from there.
01:36:46.260 | And modern programming languages
01:36:48.940 | have encapsulated this into functions
01:36:50.820 | and objects and all of that.
01:36:52.020 | And it's nice and cute, but in the end, deep down,
01:36:54.580 | there's still an assembly code that says
01:36:55.980 | go to that instruction and it runs that instruction.
01:36:58.700 | If you look at the human genome
01:37:01.580 | and the genome of pretty much most species out there,
01:37:04.980 | there's no go-to function.
01:37:08.140 | You just don't start transcribing
01:37:11.180 | at position 13,527 in chromosome 12.
01:37:18.780 | You instead have content-based indexing.
01:37:21.940 | So at every location in the genome,
01:37:25.140 | in front of the genes that need to be turned on,
01:37:28.820 | I don't know, when you drink coffee,
01:37:30.620 | there's a little coffee marker in front of all of them.
01:37:34.500 | And whenever your cells that metabolize coffee
01:37:38.540 | need to metabolize coffee,
01:37:39.820 | they basically see coffee and they're like,
01:37:41.260 | "Ooh, let's go turn on all the coffee marked genes."
01:37:44.780 | So there's basically these small motifs,
01:37:48.100 | these small sequences that we call regulatory motifs.
01:37:50.860 | They're like patterns of DNA.
01:37:52.100 | They're only eight characters long or so,
01:37:54.700 | like GAT, GCA, et cetera.
01:37:57.100 | And these motifs work in combinations
01:38:01.620 | and every one of them has some recruitment affinity
01:38:06.380 | for a different protein that will then come and bind it.
01:38:09.700 | And together, collections of these motifs
01:38:11.900 | create regions that we call regulatory regions
01:38:15.500 | that can be either promoters near the beginning of the gene,
01:38:19.340 | and that basically tells you
01:38:20.180 | where the function actually starts, where you call it,
01:38:22.540 | and then enhancers that are looping around the DNA
01:38:26.260 | that basically bring the machinery
01:38:28.220 | that binds those enhancers
01:38:29.820 | and then bring it onto the promoter,
01:38:32.540 | which then recruits the right,
01:38:34.660 | sort of the ribosome and the polymerase
01:38:36.780 | and all of that thing,
01:38:37.860 | which will first transcribe
01:38:39.620 | and then export and then eventually translate
01:38:41.660 | in the cytoplasm, whatever, RNA molecule.
01:38:44.540 | So the beauty of the way
01:38:50.580 | that the digital computer, that's the genome, works
01:38:55.220 | is that it's extremely fault tolerant.
01:38:58.780 | If I took your hard drive
01:39:00.340 | and I messed with 20% of the letters in it,
01:39:03.900 | of the zeros and ones, and I flipped them,
01:39:06.660 | you'd be in trouble.
01:39:08.180 | If I take the genome and I flip 20% of the letters,
01:39:11.060 | you probably won't even notice.
01:39:13.740 | And that resilience-
01:39:16.020 | - That's fascinating, yeah.
01:39:17.460 | - Is a key design principle,
01:39:19.580 | and again, I'm anthropomorphizing here,
01:39:21.460 | but it's a key driving principle
01:39:23.100 | of how biological systems work.
01:39:25.020 | They're first resilient and then anything else.
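A quick back-of-the-envelope simulation of that hard-drive comparison, assuming, as a rough and debated figure, that only a small fraction of genomic positions is under strong functional constraint:

```python
import random

# Flip 20% of positions in a genome where only ~8% of sites are assumed to
# be strongly constrained (the true constrained fraction of the human
# genome is debated; ~5-10% is a commonly cited range, assumed here).

random.seed(0)
GENOME_LEN = 100_000
FUNCTIONAL_FRACTION = 0.08
FLIP_FRACTION = 0.20

functional = set(random.sample(range(GENOME_LEN),
                               int(GENOME_LEN * FUNCTIONAL_FRACTION)))
flipped = set(random.sample(range(GENOME_LEN),
                            int(GENOME_LEN * FLIP_FRACTION)))

hits = len(functional & flipped)
print(f"flipped {len(flipped)} positions; {hits} "
      f"({hits / len(flipped):.1%}) hit constrained sites")
# Most flips land in tolerant sequence, and even hits on functional sites
# are often buffered by redundant motifs and duplicate genes.
```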
01:39:28.420 | And when you look at this incredible beauty of life
01:39:32.820 | from the most, I don't know, beautiful,
01:39:36.180 | I don't know, human genome maybe, of humanity
01:39:38.940 | and all of the ideals that should come with it,
01:39:41.500 | to the most terrifying genome, like, I don't know,
01:39:44.060 | COVID-19, SARS-CoV-2, and the current pandemic,
01:39:47.260 | you basically see this elegance
01:39:50.820 | as the epitome of clean design, but it's dirty.
01:39:55.820 | It's a mess.
01:39:57.820 | It's, you know, the way to get there is hugely messy.
01:40:02.660 | And that's something that we as computer scientists
01:40:04.940 | don't embrace.
01:40:06.660 | We like to have clean code.
01:40:08.340 | Like in engineering, they teach you
01:40:11.500 | about compartmentalization,
01:40:12.780 | about sort of separating functions,
01:40:14.260 | about modularity, about hierarchical design.
01:40:17.660 | None of that applies in biology.
01:40:19.060 | - Testing.
01:40:19.900 | (laughing)
01:40:21.340 | - Testing, sure, yeah, biology does plenty of that,
01:40:24.260 | but I mean, through evolutionary exploration.
01:40:26.820 | But if you look at biological systems,
01:40:31.100 | first, they are robust,
01:40:33.460 | and then they specialize to become anything else.
01:40:36.740 | And if you look at viruses,
01:40:38.260 | the reason why they're so elegant,
01:40:41.060 | when you look at the design of this, you know, genome,
01:40:44.620 | it seems so elegant.
01:40:46.180 | And the reason for that is that it's been stripped down
01:40:49.740 | from something much larger
01:40:51.660 | because of the pressure to keep it compact.
01:40:53.980 | So many compact genomes out there
01:40:56.060 | have ancestors that were much larger.
01:40:58.720 | You don't start small and become big.
01:41:00.820 | You go through a loop of add a bunch of stuff,
01:41:03.660 | increase complexity, and then slim it down.
01:41:07.260 | And one of my early papers was, in fact,
01:41:10.420 | on genome duplication.
01:41:12.140 | One of the things we found is that baker's yeast,
01:41:14.100 | which is the yeast that you use to make bread,
01:41:17.620 | but also the yeast that you use to make wine,
01:41:19.460 | which is basically the dominant species
01:41:20.940 | when you go in the fields of Tuscany
01:41:22.380 | and you say, you know, what's out there,
01:41:24.020 | is basically Saccharomyces cerevisiae,
01:41:26.300 | or the way my Italian friends say,
01:41:28.020 | Saccharomyces cerevisiae.
01:41:30.020 | (laughing)
01:41:32.280 | - Which means what?
01:41:34.500 | - Oh, Saccharomyces, okay, I'm sorry, I'm Greek.
01:41:36.660 | So yeah, zaharo, mykis, zaharo is sugar,
01:41:39.660 | mykis is fungus.
01:41:41.100 | - Yes.
01:41:41.940 | - Cerevisiae, cerveza, beer.
01:41:44.580 | So it means the sugar fungus of beer.
01:41:47.260 | - Yeah.
01:41:48.100 | - You know, a lot less
01:41:49.780 | glamorous-sounding than the--
01:41:51.100 | - Still poetic, yep.
01:41:52.900 | - So anyway, Saccharomyces cerevisiae,
01:41:55.020 | basically the major baker's yeast out there,
01:41:57.100 | is the descendant of a whole genome duplication.
01:42:00.500 | Why would a whole genome duplication even happen?
01:42:02.940 | When it happened is coinciding
01:42:06.140 | with about 100 million years ago
01:42:08.300 | and the emergence of fruit-bearing plants.
01:42:12.920 | Why fruit-bearing plants?
01:42:15.580 | Because animals would eat the fruit,
01:42:19.060 | would walk around and poop huge amounts of nutrients
01:42:23.660 | along with a seed for the plants to spread.
01:42:26.520 | Before that, plants were not spreading through animals,
01:42:29.060 | they were spreading through wind
01:42:30.500 | and all kinds of other ways.
01:42:32.400 | But basically, the moment you have fruit-bearing plants,
01:42:35.100 | these plants are basically creating
01:42:38.220 | this abundance of sugar in the environment.
01:42:40.400 | So there's an evolutionary niche that gets created.
01:42:43.100 | And in that evolutionary niche,
01:42:44.260 | you basically have enough sugar
01:42:46.820 | that a whole genome duplication,
01:42:48.860 | which initially is a very messy event,
01:42:51.260 | allows you to then, you know,
01:42:53.900 | relieve some of that complexity.
01:42:55.900 | - So I had to pause.
01:42:57.340 | What does genome duplication mean?
01:42:59.660 | - That basically means that instead of having eight chromosomes
01:43:03.380 | you can now have 16 chromosomes.
01:43:05.080 | - So, but the duplication,
01:43:08.940 | at first when you go to 16, you're not using that.
01:43:13.940 | - Oh yeah, you are.
01:43:15.300 | Yeah, so basically from one day to the next,
01:43:17.460 | you went from having eight chromosomes
01:43:18.780 | to having 16 chromosomes.
01:43:20.420 | Probably a non-disjunction event during a duplication,
01:43:23.060 | during a division.
01:43:24.340 | So you basically divide the cell.
01:43:25.940 | Instead of half the genome going this way
01:43:27.940 | and half the genome going the other way
01:43:29.180 | after duplication of the genome,
01:43:30.800 | you basically have all of it going to one cell.
01:43:33.180 | And then there's sufficient messiness there
01:43:36.000 | that you end up with slight differences
01:43:38.500 | that make most of these chromosomes be actually preserved.
01:43:41.400 | It's a long story short to basically--
01:43:43.260 | - But it's a big upgrade, right?
01:43:45.180 | So that's--
01:43:46.000 | - Not necessarily,
01:43:46.900 | because what happens immediately thereafter
01:43:48.640 | is that you start massively losing
01:43:50.480 | tons of those duplicated genes.
01:43:52.460 | So 90% of those genes were actually lost
01:43:55.520 | very rapidly after whole genome duplication.
01:43:58.300 | And the reason for that is that biology is not intelligent.
01:44:01.820 | It's just ruthless selection, random mutation.
01:44:06.620 | So the ruthless selection basically means that
01:44:08.820 | as soon as one of the random mutations hit one gene,
01:44:11.540 | ruthless selection just kills off that gene.
01:44:13.500 | It's just, you know,
01:44:14.940 | if you have a pressure to maintain a small compact genome,
01:44:19.660 | you will very rapidly lose the second copy
01:44:21.820 | of most of your genes.
01:44:22.820 | And a small number, 10%, were kept in two copies.
01:44:25.900 | And those had to do a lot with environment adaptation,
01:44:28.940 | with the speed of replication,
01:44:31.220 | with the speed of translation,
01:44:32.540 | and with sugar processing.
01:44:34.380 | So I'm making a long story short
01:44:36.160 | to basically say that evolution is messy.
01:44:38.860 | The only way, like so, you know,
01:44:41.160 | the example that I was giving of messing with 20%
01:44:44.060 | of your bits in your computer, totally bogus.
01:44:47.340 | Duplicating all your functions
01:44:48.860 | and just throwing them out there in the same, you know,
01:44:51.580 | function, just totally bogus.
01:44:52.980 | Like this would never work in an engineer system.
01:44:55.360 | But biological systems,
01:44:57.100 | because of this content-based indexing
01:44:59.260 | and because of this modularity that comes
01:45:02.060 | from the fact that the gene is controlled
01:45:04.400 | by a series of tags,
01:45:05.380 | and now if you need this gene in another setting,
01:45:08.360 | you just add some more tags
01:45:09.940 | that will basically turn it on also in those settings.
01:45:12.740 | So this gene is now pressured to do two different functions.
01:45:17.500 | And it builds up complexity.
01:45:19.980 | I see whole-genome duplication
01:45:21.460 | and gene duplication in general
01:45:22.780 | as a way to relieve that complexity.
01:45:24.740 | So you have this gradual buildup of complexity
01:45:26.860 | as functions get sort of added onto the existing genes.
01:45:30.940 | And then boom, you duplicate your workforce.
01:45:34.380 | And you now have two copies of this gene.
01:45:36.940 | One will probably specialize to do one,
01:45:38.980 | and the other one will specialize to do the other,
01:45:40.620 | or one will maintain the ancestral function,
01:45:42.380 | the other one will sort of be free to evolve
01:45:45.020 | and specialize while losing the ancestral function,
01:45:47.940 | and so on and so forth.
01:45:48.860 | So that's how genomes evolve.
01:45:50.180 | They're just messy things,
01:45:52.260 | but they're extremely fault-tolerant,
01:45:54.780 | and they're extremely able to deal with mutations
01:45:58.580 | because that's the very way that you generate new functions.
01:46:03.580 | So new functionalization comes
01:46:05.640 | from the very thing that breaks it.
01:46:07.920 | So even in the current pandemic,
01:46:09.340 | many people are asking me,
01:46:10.620 | "Which mutations matter the most?"
01:46:12.780 | And what I tell them is,
01:46:13.940 | "Well, we can study the evolutionary dynamics
01:46:16.420 | "of the current genome to then understand
01:46:19.640 | "which mutations have previously happened or not,
01:46:23.340 | "and which mutations happen in genes
01:46:27.040 | "that evolve rapidly or not."
01:46:29.220 | And one of the things we found, for example,
01:46:30.860 | is that the genes that evolved rapidly in the past
01:46:34.980 | are still evolving rapidly now in the current pandemic.
01:46:37.860 | The genes that evolved slowly in the past
01:46:39.820 | are still evolving slowly.
01:46:41.120 | - Which means that they're useful.
01:46:42.860 | - Which means that they're
01:46:44.140 | under the same evolutionary pressures.
01:46:46.400 | But then the question is,
01:46:47.820 | what happens in specific mutations?
01:46:50.540 | So if you look at the D614G mutation
01:46:53.440 | that's been all over the news,
01:46:54.420 | so in position 614, in the amino acid 614,
01:46:58.380 | of the S protein,
01:47:00.300 | there's a D-to-G mutation
01:47:02.660 | that has sort of crept through the population.
01:47:06.420 | That mutation, we found out through my work,
01:47:10.580 | disrupts a perfectly conserved nucleotide position
01:47:14.020 | that has never been changed in the history
01:47:16.340 | of millions of years of equivalent mammalian evolution
01:47:20.420 | of these viruses.
01:47:23.080 | That basically means
01:47:23.920 | that it's a completely new adaptation to human.
01:47:27.500 | And that mutation has now gone from 1% frequency
01:47:30.820 | to 90% frequency in almost all outbreaks.
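A standard population-genetics back-of-the-envelope turns that 1%-to-90% rise into an implied selection coefficient, assuming logistic growth of the variant's frequency. The generation count below is an assumption for illustration, not an estimate from the actual data.

```python
import math

# If a variant rises from frequency p0 to p1 over t generations under
# logistic growth, the selection coefficient s satisfies
#   logit(p1) = logit(p0) + s * t.

def logit(p):
    return math.log(p / (1 - p))

p0, p1 = 0.01, 0.90   # the frequencies quoted in the conversation
generations = 24      # assumed: ~4 months of ~5-day transmission cycles

s = (logit(p1) - logit(p0)) / generations
print(f"implied selection advantage per transmission: {s:.2f}")
# ~0.28 here, a large advantage -- though founder effects and sampling
# bias can mimic selection, so real analyses must control for those.
```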
01:47:33.820 | - So there's a mutation,
01:47:35.100 | I like how you say the 416, what was it?
01:47:39.620 | - Yeah, 614, sorry.
01:47:40.660 | - 614, all right.
01:47:42.180 | - D614 gene.
01:47:43.700 | - So literally, so what you're saying
01:47:46.580 | is this is like a chess move.
01:47:48.460 | So it just mutated one letter to another.
01:47:50.580 | - Exactly.
01:47:51.420 | - And that hasn't happened before.
01:47:52.980 | - Yeah, never.
01:47:54.300 | - And this somehow, this mutation is really useful.
01:47:58.140 | - It's really useful in the current environment
01:48:00.620 | of the genome, which is moving from human to human.
01:48:04.900 | When it was moving from bat to bat,
01:48:06.840 | it couldn't care less for that mutation.
01:48:08.680 | But it's environment specific,
01:48:09.980 | so now that it's moving from human to human,
01:48:11.780 | hoo-hoo, it's moving way better,
01:48:14.100 | like by orders of magnitude.
01:48:15.940 | - What do you, okay, so you're like tracking
01:48:18.460 | these evolutionary dynamics, which is fascinating.
01:48:22.540 | But what do you do with that?
01:48:24.220 | So what does that mean?
01:48:25.340 | What do you make of this mutation
01:48:29.160 | in trying to anticipate, I guess?
01:48:31.500 | Is one of the things you're trying to do
01:48:34.140 | is anticipate where, how this unrolls into the future,
01:48:37.860 | this evolutionary dynamic?
01:48:39.740 | - Such a great question.
01:48:40.660 | So there's two things.
01:48:42.940 | Remember when I was saying earlier,
01:48:44.740 | mutation is the path to new things,
01:48:47.140 | but also the path to break old things.
01:48:49.820 | So what we know is that this position
01:48:53.060 | was extremely preserved through gazillions of mutations.
01:48:56.700 | That mutation was never tolerated
01:48:58.540 | when it was moving from bat to bat.
01:49:00.300 | So that basically means that that mutation,
01:49:01.620 | that position is extremely important
01:49:04.120 | in the function of that protein.
01:49:05.700 | That's the first thing it tells.
01:49:07.040 | The second one is that that position
01:49:09.380 | was very well suited to bat transmission,
01:49:12.460 | but now is not well suited to human transmission,
01:49:14.800 | so it got rid of it.
01:49:15.920 | And it now has a new version of that amino acid
01:49:18.920 | that basically makes it much easier
01:49:21.000 | to transmit from human to human.
01:49:22.800 | So in terms of the evolutionary history
01:49:27.520 | teaching us about the future,
01:49:29.920 | it basically tells us here's the regions
01:49:32.040 | that are currently mutating,
01:49:34.880 | here's the regions that are most likely
01:49:36.480 | to mutate going forward.
01:49:37.960 | As you're building a vaccine,
01:49:39.480 | here's what you should be focusing on
01:49:41.760 | in terms of the most stable regions
01:49:43.620 | that are the least likely to mutate,
01:49:45.520 | or here's the newly evolved functions
01:49:48.280 | that are the most likely to be important
01:49:50.320 | because they've overcome this local maximum
01:49:54.640 | that it had reached in the bat transmission.
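One simple version of "find the most stable regions" is to score each column of a sequence alignment by Shannon entropy and prefer the low-entropy, conserved stretches. The three toy sequences below are invented; real analyses use far deeper alignments and phylogenetic weighting.

```python
import math
from collections import Counter

# Score alignment columns by Shannon entropy; perfectly conserved columns
# have entropy 0 and are the safer targets. Toy sequences, invented.

aligned = [
    "ATGGCTTTACCG",
    "ATGGCATTACCG",
    "ATGGCTTTACCG",
]

def column_entropy(column):
    counts = Counter(column)
    total = len(column)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

entropies = [column_entropy(col) for col in zip(*aligned)]
conserved = [i for i, h in enumerate(entropies) if h == 0.0]
print("perfectly conserved positions:", conserved)
# Mutable columns are risky vaccine targets; conserved ones are preferred.
```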
01:49:59.440 | So anyway, it's a tangent to basically say
01:50:01.840 | that evolution works in messy ways,
01:50:04.280 | and the thing that you would break
01:50:06.360 | is the thing that actually allows you
01:50:10.360 | to first go through a lull
01:50:12.240 | and then reach a new local maximum.
01:50:15.240 | And I often like to say that if engineers
01:50:19.000 | had basically designed evolution,
01:50:21.260 | we would still be perfectly replicating bacteria
01:50:24.560 | because it's by making the bacterium worse
01:50:29.480 | that you allow evolution to reach a new optimum.
01:50:32.240 | - That's just a pause on that,
01:50:34.560 | that's so profound.
01:50:35.860 | That's so profound for the entirety
01:50:39.400 | of this scientific and engineering disciplines.
01:50:43.720 | - Exactly.
01:50:45.520 | We as engineers need to embrace breaking things.
01:50:48.520 | We as engineers need to embrace robustness
01:50:50.960 | as the first principle beyond perfection
01:50:54.240 | 'cause nothing's gonna ever be perfect.
01:50:56.040 | And when you're sending a satellite to Mars,
01:50:58.440 | when something goes wrong, it'll break down
01:51:01.160 | as opposed to building systems that tolerate failure
01:51:04.560 | and are resilient to that,
01:51:08.840 | and in fact, get better through that.
01:51:11.080 | - So the SpaceX approach versus NASA for the...
01:51:14.600 | (laughing)
01:51:15.640 | - For example.
01:51:16.480 | - Is there something we can learn about the incredible,
01:51:21.280 | take lessons from the incredible biological systems
01:51:23.920 | in their resilience, in the mushiness, the messiness
01:51:27.600 | to our computing systems, to our computers?
01:51:31.880 | - It would basically be starting from scratch in many ways.
01:51:35.280 | It would basically be building new paradigms
01:51:38.960 | that don't try to get the right answer all the time,
01:51:42.760 | but try to get the right answer most of the time
01:51:45.600 | or a lot of the time.
01:51:47.000 | - Do you see deep learning systems
01:51:48.560 | and the whole world of machine learning
01:51:49.960 | as kind of taking a step in that direction?
01:51:52.000 | - Absolutely, absolutely.
01:51:53.640 | Basically by allowing this much more natural evolution
01:51:57.560 | of these parameters, you basically,
01:52:01.120 | and if you look at sort of deep learning systems,
01:52:03.240 | again, they're not inspired by the genome aspect of biology,
01:52:07.480 | they're inspired by the brain aspect of biology.
01:52:10.200 | And again, I want you to pause for a second
01:52:12.600 | and realize the complexity of the entire human brain
01:52:17.600 | with trillions of connections within our neurons,
01:52:22.800 | with millions of cells talking to each other,
01:52:26.720 | is still encoded within that same genome.
01:52:29.080 | (Lex laughs)
01:52:32.440 | That same genome encodes every single freaking cell type
01:52:36.160 | of the entire body.
01:52:38.000 | Every single cell is encoded by the same code.
01:52:41.120 | And yet specialization allows you to have
01:52:45.320 | the single viral-like genome that self-replicates,
01:52:50.120 | the single module, modular automaton,
01:52:54.280 | work with other copies of itself.
01:52:56.560 | It's mind-boggling.
01:52:57.560 | Create complex organs through which blood flows.
01:53:02.760 | And what is that blood?
01:53:03.800 | The same freaking genome.
01:53:05.240 | (Lex laughs)
01:53:06.760 | Create organs that communicate with each other.
01:53:10.960 | And what are these organs?
01:53:12.360 | The exact same genome.
01:53:14.280 | Create a brain that is innervated
01:53:17.640 | by massive amounts of blood pumping energy to it,
01:53:22.440 | 20% of our energetic needs,
01:53:25.440 | to the brain from the same genome.
01:53:28.240 | And all of the neuronal connections,
01:53:30.120 | all of the auxiliary cells, all of the immune cells,
01:53:33.920 | the astrocytes, the oligodendrocytes, the neurons,
01:53:35.920 | the excitatory, the inhibitory neurons,
01:53:37.360 | all of the different classes of pericytes,
01:53:39.480 | the blood-brain barrier, all of that, same genome.
01:53:42.880 | - One way to see that,
01:53:46.600 | and this one is beautiful but also sad, is thinking about
01:53:49.240 | the trillions of organisms that died to create that.
01:53:55.280 | - You mean on the evolutionary path?
01:53:56.920 | - On the evolutionary path to humans.
01:53:59.640 | - It's crazy, there's two descendant of apes
01:54:02.720 | just talking on a podcast, okay.
01:54:05.380 | (Lex laughs)
01:54:07.080 | So mind-boggling.
01:54:08.520 | - Just to boggle our minds a little bit more.
01:54:11.160 | Us talking to each other,
01:54:12.560 | we are basically generating a series of vocal utterances
01:54:18.460 | through our pulsating vocal cords, received through this.
01:54:23.480 | The people who listen to this
01:54:26.200 | are taking a completely different path
01:54:29.280 | to that information transfer, yet through language.
01:54:32.920 | But imagine if we could connect these brains
01:54:36.180 | directly to each other.
01:54:37.540 | The amount of information that I'm condensing
01:54:41.640 | into a small number of words is a huge funnel,
01:54:46.400 | which then you receive and you expand
01:54:49.500 | into a huge number of thoughts from that small funnel.
01:54:53.560 | (Lex laughs)
01:54:55.760 | In many ways, engineers would love to have
01:54:58.420 | the whole information transfer,
01:54:59.900 | just take the whole set of neurons and throw them away.
01:55:02.660 | I mean, throw them to the other person.
01:55:04.560 | This might actually not be better
01:55:07.300 | because in your misinterpretation of every word
01:55:11.780 | that I'm saying, you are creating new interpretation
01:55:14.680 | that might actually be way better
01:55:16.080 | than what I meant in the first place.
01:55:17.920 | The ambiguity of language,
01:55:20.520 | perhaps might be the secret to creativity.
01:55:24.160 | Every single time you work on a project by yourself,
01:55:28.400 | you only bounce ideas with one person
01:55:31.120 | and your neurons are basically fully cognizant
01:55:33.760 | of what these ideas are.
01:55:35.880 | But the moment you interact with another person,
01:55:37.720 | the misinterpretations that happen
01:55:41.080 | might be the most creative part of the process.
01:55:43.760 | With my students, every time we have a research meeting,
01:55:45.600 | I very often pause and say,
01:55:47.520 | let me repeat what you just said in a different way.
01:55:50.400 | And I sort of go on and brainstorm
01:55:52.400 | with what they were saying,
01:55:53.660 | but by the third time, it's not what they were saying at all.
01:55:58.000 | And when they pick up what I'm saying,
01:55:59.480 | they're like, oh, well, da-da-da,
01:56:01.160 | now they've sort of learned something very different
01:56:04.140 | from what I was saying.
01:56:05.000 | And that is the same kind of messiness
01:56:08.480 | that I'm describing in the genome itself.
01:56:10.960 | It's sort of embracing the messiness.
01:56:13.560 | - And that's a feature, not a bug.
01:56:15.360 | - Exactly.
01:56:16.200 | And in the same way, when you're thinking
01:56:17.520 | about sort of these deep learning systems
01:56:19.920 | that will allow us to sort of be more creative perhaps
01:56:23.560 | or learn better approximations of these complex functions,
01:56:27.520 | again, tuned to the universe that we inhabit,
01:56:29.760 | you have to embrace the breaking.
01:56:33.660 | You have to embrace the,
01:56:35.360 | how do we get out of these local optima?
01:56:37.960 | And a lot of the design paradigms
01:56:40.940 | that have made deep learning so successful
01:56:43.360 | are ways to get away from that,
01:56:45.360 | ways to get better training
01:56:47.320 | by sort of sending long range messages,
01:56:50.480 | these LSTM models and the sort of feed forward loops
01:56:55.480 | that sort of jump through layers
01:56:59.260 | of a convolutional neural network.
01:57:00.880 | All of these things are basically ways
01:57:04.200 | to push you out of these local maxima.
01:57:06.360 | And that's sort of what evolution does,
01:57:08.800 | that's what language does,
01:57:09.800 | that's what conversation and brainstorming does,
01:57:12.320 | that's what our brain does.
01:57:14.080 | So this design paradigm is something that's pervasive
01:57:18.280 | and yet not taught in schools,
01:57:20.520 | not taught in engineering schools
01:57:22.240 | where everything's minutely modularized
01:57:24.480 | to make sure that we never deviate
01:57:26.000 | from whatever signal we're trying to emit
01:57:28.600 | as opposed to letting all hell break loose
01:57:31.400 | 'cause that's the path to paradise.
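One of those tricks, the skip connection, is easy to show: a residual block adds a layer's input back to its output, so information (and gradients) can bypass the transformation. A minimal numpy sketch, with arbitrary shapes:

```python
import numpy as np

# Residual (skip) connection: y = x + f(x). The "+ x" term lets signal
# jump past the layer, one of the design tricks mentioned above for
# escaping optimization trouble in deep networks. Shapes are arbitrary.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W, b):
    """y = x + f(x): the skip connection is the '+ x' term."""
    return x + relu(x @ W + b)

x = rng.normal(size=(4, 16))              # batch of 4, 16 features
W = rng.normal(scale=0.1, size=(16, 16))  # layer weights
b = np.zeros(16)

y = residual_block(x, W, b)
print("output still carries the input:",
      np.allclose(y - relu(x @ W + b), x))
```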
01:57:33.960 | - The path to paradise.
01:57:35.400 | Yeah, I mean, it's difficult to know how to teach that
01:57:37.960 | and what to do with it.
01:57:39.240 | I mean, it's difficult to know how to build up
01:57:43.640 | the scientific method around messiness.
01:57:46.600 | (Lex laughing)
01:57:48.280 | - I mean, it's not all messiness.
01:57:49.920 | We need some cleanness.
01:57:51.880 | And going back to the example with Mars,
01:57:54.320 | that's probably the place where I want to sort of
01:57:56.720 | moderate error as much as possible
01:57:58.760 | and sort of control the environment as much as possible.
01:58:01.000 | But if you're trying to repopulate Mars,
01:58:03.120 | well, maybe messiness is a good thing then.
01:58:06.200 | - On that, you quickly mentioned this
01:58:09.320 | in terms of us using our vocal cords to speak on a podcast.
01:58:14.320 | So Elon Musk and Neuralink are working on trying to plug,
01:58:20.120 | as per our discussion with computers and biological systems,
01:58:24.920 | to connect the two.
01:58:25.840 | He's trying to connect our brain to a computer
01:58:30.640 | to create a brain-computer interface
01:58:32.840 | where they can communicate back and forth.
01:58:36.160 | On this line of thinking,
01:58:38.240 | do you think this is possible to bridge the gap
01:58:42.480 | between our engineered computing systems
01:58:45.240 | and the messy biological systems?
01:58:48.280 | - My answer would be absolutely.
01:58:51.240 | There's no doubt that we can understand more and more
01:58:55.040 | about what goes on in the brain,
01:58:57.160 | and we can sort of train the brain.
01:59:00.320 | I don't know if you remember the Palm Pilot.
01:59:03.600 | - Yeah, Palm Pilot, yeah.
01:59:04.720 | - Remember this whole sort of alphabet that they had created?
01:59:08.480 | Am I thinking of the same thing?
01:59:10.960 | It's basically, you had a little pen,
01:59:13.320 | and for every character, you had a little scribble
01:59:17.040 | that was unique that the machine could understand,
01:59:19.840 | and that instead of trying to teach the machine
01:59:23.680 | to recognize human characters,
01:59:25.400 | you had basically, they figured out
01:59:27.240 | that it's better and easier to train humans
01:59:30.020 | to create human-like characters
01:59:31.880 | that the machine is better at recognizing.
01:59:34.740 | So in the same way, I think what will happen
01:59:38.300 | is that humans will be trained
01:59:40.580 | to be able to create the mind pattern
01:59:43.200 | that the machine will respond to
01:59:45.140 | before the machine truly comprehends our thoughts.
01:59:47.760 | So the first human brain interfaces
01:59:50.140 | will be tricking humans to speak the machine language,
01:59:53.640 | where with the right set of electrodes,
01:59:55.620 | I can sort of trick my brain into doing this.
01:59:57.620 | And this is the same way that many people teach,
02:00:00.260 | like learn to control artificial limbs.
02:00:02.980 | You basically try a bunch of stuff,
02:00:04.540 | and eventually you figure out how your limbs work.
02:00:06.920 | That might not be very different
02:00:08.240 | from how humans learn to use their natural limbs
02:00:11.500 | when they first grow up.
02:00:13.100 | Basically, you have this neoteny period
02:00:16.460 | of this puddle of soup inside your brain,
02:00:21.340 | trying to figure out how to even make neuronal connections
02:00:23.880 | before you're born, and then learning sounds in utero
02:00:28.500 | of all kinds of echoes,
02:00:31.540 | and eventually getting out in the real world.
02:00:35.900 | And I don't know if you've seen newborns,
02:00:37.360 | but they just stare around a lot.
02:00:39.180 | One way to think about this as a machine learning person
02:00:42.900 | is, oh, they're just training their edge detectors.
02:00:46.140 | And eventually, they figure out
02:00:47.380 | how to train their edge detectors.
02:00:48.740 | They work through the second layer of the visual cortex
02:00:50.860 | and the third layer and so on and so forth.
02:00:52.700 | And you basically have this
02:00:57.920 | learning how to control your limbs
02:00:59.440 | that probably comes at the same time.
02:01:01.080 | You're sort of throwing random things there,
02:01:03.360 | and you realize that, ooh, wow,
02:01:04.760 | when I do this thing, my limb moves.
02:01:07.200 | Let's do the following experiment.
02:01:09.320 | Take a breath.
02:01:10.140 | What muscles did you flex?
02:01:13.560 | Now take another breath and think
02:01:14.920 | what muscles do I flex?
02:01:16.640 | The first thing that you're thinking
02:01:18.000 | when you're taking a breath
02:01:19.920 | is the impact that it has on your lungs.
02:01:22.400 | You're like, oh, I'm now gonna increase my lungs,
02:01:24.160 | or I'm now gonna bring air in.
02:01:25.480 | But what you're actually doing
02:01:26.420 | is just changing your diaphragm.
02:01:28.080 | That's not conscious, of course.
02:01:31.920 | You never think of the diaphragm as a thing.
02:01:35.000 | And why is that?
02:01:36.040 | That's probably the same reason
02:01:37.400 | why I think of moving my finger
02:01:38.880 | when I actually move my finger.
02:01:40.600 | I think of the effect instead of actually thinking
02:01:42.600 | of whatever muscle is twitching
02:01:44.160 | that actually causes my finger to move.
02:01:46.500 | So we basically, in our first years of life,
02:01:49.280 | build up this massive lookup table
02:01:52.400 | between whatever neuronal firing we do
02:01:55.440 | and whatever action happens in our body that we control.
02:02:00.440 | If you have a kid grow up with a third limb,
02:02:03.180 | I'm sure they'll figure out how to control them
02:02:06.600 | probably at the same rate as their natural limbs.
02:02:09.440 | - And a lot of the work would be done by the,
02:02:13.360 | if a third limb is a computer,
02:02:15.560 | you kind of have a, not a faith,
02:02:18.520 | but a thought that the brain might be able to figure out.
02:02:23.880 | Like the plasticity would come from the brain.
02:02:26.480 | Like the brain would be cleverer than the machine at first.
02:02:29.000 | - When I talk about a third limb,
02:02:30.000 | that's exactly what I'm saying.
02:02:30.920 | An artificial limb that basically just controls your mouse
02:02:33.600 | while you're typing.
02:02:34.600 | Perfectly natural thing.
02:02:36.640 | I mean, again, in a few hundred years.
02:02:39.020 | - Maybe sooner than that.
02:02:41.600 | - But basically, as long as the machine is consistent
02:02:46.080 | in the way that it will respond to your brain impulses,
02:02:49.800 | you'll figure out how to control that
02:02:51.680 | and you could play tennis with your third limb.
02:02:53.920 | And let me go back to consistency.
02:02:57.520 | People who have dramatic accidents
02:03:01.280 | that basically take out a whole chunk of their brain
02:03:03.920 | can be taught to co-opt other parts of the brain
02:03:07.040 | to then control that part.
02:03:08.560 | You can basically build up that tissue again
02:03:10.840 | and eventually train your body how to walk again
02:03:13.480 | and how to read again and how to play again
02:03:15.400 | and how to think again,
02:03:16.240 | how to speak a language again, et cetera.
02:03:18.080 | So there's a massive amount of malleability
02:03:21.280 | that happens naturally in our way of controlling our body,
02:03:26.280 | our brain, our thoughts, our vocal cords,
02:03:29.040 | our limbs, et cetera.
02:03:30.760 | And human-machine interfaces are all inevitable
02:03:35.640 | if we sort of figure out how to read these electric impulses
02:03:39.240 | but the resolution at which we can understand human thought
02:03:43.420 | right now is nil, is ridiculous.
02:03:46.560 | So how are human thoughts encoded?
02:03:49.160 | It's basically combinations of neurons that co-fire
02:03:53.560 | and these create these things called engrams
02:03:55.720 | that eventually form memories and so on and so forth.
02:03:58.940 | We know nothing of all that stuff.
02:04:01.940 | So before we can actually read into your brain
02:04:05.600 | that you wanna build a program that does this
02:04:07.120 | and this and this and that, we need a lot of neuroscience.
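For a flavor of what a distributed engram could look like, a classic textbook toy is the Hopfield network: a pattern is stored in the connections, not in any single neuron, and can be recalled from a corrupted cue. This is an illustration of the concept, not a claim about how the brain actually implements memory, which, as he says, is still open.

```python
import numpy as np

# Hopfield-style toy "engram": store one pattern in the weight matrix via
# a Hebbian rule, then recall it from a 25%-corrupted cue.

rng = np.random.default_rng(1)
N = 64
pattern = rng.choice([-1, 1], size=N)

W = np.outer(pattern, pattern).astype(float)  # cells that fire together...
np.fill_diagonal(W, 0.0)                      # no self-connections

cue = pattern.copy()
flip = rng.choice(N, size=N // 4, replace=False)
cue[flip] *= -1                               # corrupt 25% of the cue

state = cue
for _ in range(5):                            # synchronous updates
    state = np.where(W @ state >= 0, 1, -1)

print("recovered the stored pattern:", np.array_equal(state, pattern))
# The memory lives in W, the connections -- no single neuron holds it.
```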
02:04:10.960 | - Well, so to push back on that,
02:04:13.480 | do you think it's possible that without understanding
02:04:16.680 | the brain functionally, whether from the neuroscience
02:04:20.000 | or the cognitive science or psychology,
02:04:22.080 | whichever level of the brain we'll look at,
02:04:24.220 | do you think if we just connect them,
02:04:26.700 | just like per your previous point,
02:04:29.200 | if we just have a high enough resolution
02:04:30.840 | between connection between Wikipedia and your brain,
02:04:34.400 | the brain will just figure it out with less understanding?
02:04:38.160 | Because that's one of the innovations of Neuralink
02:04:40.320 | is they're increasing the number of connections to the brain
02:04:44.040 | to several thousand, which before was in the dozens
02:04:48.000 | or whatever.
02:04:48.840 | - You're still off by a few orders of magnitude,
02:04:51.040 | on the order of seven.
02:04:52.080 | (both laughing)
02:04:54.320 | - Right, but the thing is, the hope is if you increase
02:04:57.520 | that number more and more and more,
02:04:58.820 | maybe you don't need to understand anything
02:05:00.640 | about the actual, how human thought is represented
02:05:04.520 | in the brain.
02:05:05.340 | You can just let it figure it out by itself.
02:05:08.040 | - Yeah, like when Keanu Reeves wakes up and says,
02:05:09.800 | "I know kung fu."
02:05:10.720 | - Yeah, exactly.
02:05:11.560 | (both laughing)
02:05:13.200 | So yeah, sure.
02:05:14.640 | - You don't have faith in the plasticity of the brain
02:05:16.760 | to that degree?
02:05:18.280 | - It's not about brain plasticity.
02:05:19.840 | It's about the input aspect.
02:05:21.920 | Basically, I think on the output aspect,
02:05:23.760 | being able to control a machine is something
02:05:25.480 | that you can probably train your neural impulses
02:05:28.480 | that you're sending out to sort of match
02:05:30.960 | whatever response you see in the environment.
02:05:33.320 | If this thing moved every single time I thought
02:05:35.640 | a particular thought, then I could figure out,
02:05:37.360 | I could hack my way into moving this thing
02:05:39.560 | with just a series of thoughts.
02:05:40.960 | I could think, "Guitar, piano, tennis ball."
02:05:44.640 | (both laughing)
02:05:45.880 | And then this thing would be moving.
02:05:47.560 | And then I would just have the series of thoughts
02:05:50.640 | that would sort of result in the impulses
02:05:52.640 | that will move this thing the way that I want.
02:05:53.960 | And then eventually it'll become natural
02:05:55.560 | 'cause I won't even think about it.
02:05:57.640 | I mean, the same way that we control our limbs
02:05:59.120 | in a very natural way.
02:06:00.240 | But babies don't do that.
02:06:01.360 | Babies have to figure it out.
02:06:03.160 | And some of that is hard-coded,
02:06:04.860 | but some of that is actually learned
02:06:06.800 | based on whatever soup of neurons you ended up with,
02:06:10.320 | whatever connections you pruned them to,
02:06:13.440 | and eventually you were born with.
02:06:15.360 | A lot of that is coded in the genome,
02:06:17.740 | but a huge chunk of that is stochastic
02:06:19.680 | in sort of the way that you sort of create
02:06:21.320 | all these neurons, they migrate, they form connections,
02:06:23.440 | they sort of spread out,
02:06:25.140 | they have particular branching patterns,
02:06:26.520 | but then the connectivity itself,
02:06:28.200 | unique in every single new person.
02:06:30.120 | All this to say that on the output side,
02:06:34.000 | absolutely, I'm very, very hopeful
02:06:37.320 | that we can have machines that read
02:06:40.000 | thousands of these neuronal connections on the output side,
02:06:42.800 | but on the input side, oh boy.
02:06:45.580 | I don't expect any time in the near future
02:06:51.240 | we'll be able to sort of send a series of impulses
02:06:53.400 | that will tell me, oh, Earth to sun distance,
02:06:56.280 | 7.5 million, et cetera, like nowhere.
02:07:00.720 | I mean, I think language will still be the input way
02:07:04.480 | rather than sort of any kind of more complex.
02:07:07.340 | - It's a really interesting notion
02:07:08.760 | that the ambiguity of language is a feature.
02:07:11.840 | - Yeah.
02:07:12.680 | - And we evolved for millions of years
02:07:16.520 | to take advantage of that ambiguity.
02:07:19.520 | - Exactly.
02:07:20.520 | And yet no one teaches us the subtle differences
02:07:23.380 | between words that are near cognates,
02:07:26.100 | and yet evoke so much more than one from the other.
02:07:30.760 | And yet, when you're choosing words
02:07:34.520 | from a list of 20 synonyms,
02:07:36.840 | you know exactly the connotation of every single one of them.
02:07:40.040 | And that's something that is there.
02:07:42.600 | So yes, there's ambiguity,
02:07:45.100 | but there's all kinds of connotations.
02:07:46.800 | And in the way that we select our words,
02:07:48.880 | we have so much baggage that we're sending along,
02:07:51.320 | the way that we're emoting,
02:07:52.980 | the way that we're moving our hands
02:07:54.720 | every single time we speak,
02:07:56.080 | the pauses, the eye contact, et cetera,
02:07:58.880 | so much higher baud rate than just a vocal,
02:08:01.800 | you know, string of characters.
02:08:04.040 | - Well, let me just take a small tangent on that.
02:08:07.120 | - Oh, tangent, we haven't done that yet.
02:08:08.760 | - We haven't done that.
02:08:09.600 | - That's a good idea, let's do a tangent.
02:08:10.440 | (laughing)
02:08:12.320 | - We'll return to the origin of life after.
02:08:14.480 | So, I mean, you're Greek,
02:08:17.800 | but I'm going on this personal journey.
02:08:20.860 | I'm going to Paris for the explicit purpose
02:08:25.080 | of talking to one of the most famous couples
02:08:29.360 | of translators of Russian literature,
02:08:33.200 | Dostoevsky, Tolstoy, and for them,
02:08:36.280 | translation is their art.
02:08:38.560 | Everything I've learned about the translation art,
02:08:44.320 | it makes me feel
02:08:46.100 | that it's profound in a way that's so much deeper
02:08:53.240 | than the natural language processing papers
02:08:55.400 | I read in the machine learning community,
02:08:57.440 | that there's such depth to language
02:09:00.440 | that I don't know what to do with.
02:09:03.160 | I don't know if you've experienced that in your own life
02:09:05.720 | with knowing multiple languages.
02:09:07.960 | I don't know what to, I don't know how to make sense of it,
02:09:11.720 | but there's so much loss in translation
02:09:13.640 | between Russian and English, and getting a sense of that.
02:09:17.440 | Like, for example, there's like,
02:09:20.440 | just taking a single sentence from Dostoevsky,
02:09:23.400 | and there's a lot of them.
02:09:25.440 | You could talk for hours about
02:09:27.560 | how to translate that sentence properly.
02:09:30.120 | That captures the meaning, the period,
02:09:34.360 | the culture, the humor, the wit,
02:09:36.540 | the suffering that was in the context of the time,
02:09:39.760 | all of that could be a single sentence.
02:09:42.280 | You could talk forever about
02:09:45.680 | what it takes to translate that correctly.
02:09:47.160 | I don't know what to do with that.
02:09:48.720 | So being Greek, it's very hard for me to think of a sentence
02:09:53.440 | or even a word without going into the full etymology
02:09:58.200 | of that word, breaking up every single atom of that sentence,
02:10:03.200 | and every single atom of these words,
02:10:07.080 | and rebuilding it back up.
02:10:08.860 | I have three kids, and the way that I teach them Greek
02:10:13.680 | is the same way that the documentary was mentioning earlier
02:10:17.620 | about sort of understanding the deep roots
02:10:19.720 | of all of these words.
02:10:21.900 | And it's very interesting
02:10:28.880 | that every single time I hear a new word
02:10:31.320 | that I've never heard before,
02:10:33.020 | I go and figure out the etymology of that word,
02:10:34.720 | because I will never appreciate that word
02:10:36.760 | without understanding how it was initially formed.
02:10:39.260 | - Interesting.
02:10:41.000 | But how does that help?
02:10:42.080 | Because that's not the full picture.
02:10:44.080 | - No, no, of course, of course.
02:10:44.920 | But what I'm trying to say is that knowing the components
02:10:48.360 | teaches you about the context of the formation of that word
02:10:52.260 | and sort of the original usage of that word.
02:10:55.120 | And then of course, the word takes new meaning
02:10:57.360 | as you create it from its parts.
02:11:00.820 | And that meaning then gets augmented,
02:11:04.120 | and two synonyms that sort of have different roots
02:11:08.160 | will actually have implications
02:11:09.200 | that carry a lot of that baggage
02:11:11.440 | of the historical provenance of these words.
02:11:14.220 | So before working on genome evolution,
02:11:16.620 | my passion was evolution of language
02:11:19.920 | and sort of tracing cognates across different languages
02:11:23.720 | through their etymologies.
02:11:27.280 | - And that's fascinating that there's parallels between,
02:11:30.280 | I mean, the idea that there's evolutionary dynamics
02:11:34.260 | to our language.
02:11:35.500 | - Yeah.
02:11:37.920 | In every single word that you utter, parallels, parallels.
02:11:42.600 | What does parallels mean?
02:11:43.880 | Para means side by side, alleles from alleles,
02:11:47.560 | which means identical twins, parallels.
02:11:50.800 | I mean, name any word, and there's so much baggage,
02:11:54.240 | so much beauty in how that word came to be
02:11:58.040 | and how this word took a new meaning
02:12:00.040 | than the sum of its parts.
02:12:01.420 | - Yeah, and they're just words.
02:12:06.120 | They don't have any physical grounding.
02:12:07.920 | - Exactly, and now you take these words
02:12:10.240 | and you weave them into a sentence.
02:12:13.600 | The emotional invocations of that weaving are fathomless.
02:12:18.600 | - And all of those emotions all live
02:12:23.280 | in the brains of humans.
02:12:25.480 | - In the eye of the beholder.
02:12:27.060 | No, seriously, you have to embrace this concept
02:12:30.840 | of the eye of the beholder.
02:12:32.440 | It's the conceptualization that nothing takes meaning
02:12:37.440 | with one person creating it.
02:12:39.400 | Everything takes meaning in the receiving end.
02:12:42.480 | And the emergent properties
02:12:45.160 | of these communication networks,
02:12:47.760 | where every single, if you look at the network of our cells
02:12:50.960 | and how they're communicating with each other,
02:12:52.480 | every cell has its own code.
02:12:54.200 | This code is modulated by the epigenome.
02:12:56.200 | This creates a bunch of different cell types.
02:12:57.960 | Each cell type now has its own identity,
02:13:00.000 | yet they all have the common root of the stem cells
02:13:02.200 | that sort of led to them.
02:13:03.660 | Each of these identities is now communicating
02:13:06.600 | with each other.
02:13:08.120 | They take meaning in their interaction.
02:13:11.800 | There's an emergent property that comes
02:13:13.760 | from a bunch of cells being together
02:13:15.680 | that is not in any one of the parts.
02:13:17.920 | If you look at neurons communicating,
02:13:19.320 | again, these engrams don't exist in any one neuron.
02:13:23.360 | They exist in the connection, in the combination of neurons.
02:13:26.440 | And the meaning of the words that I'm telling you
02:13:29.000 | is empty until it reaches you
02:13:33.000 | and it affects you in a very different way
02:13:35.200 | than it affects whoever's listening to this conversation now.
02:13:38.480 | Because of the emotional baggage that I've grown up with,
02:13:41.400 | that you've grown up with,
02:13:42.240 | and that they've grown up with.
02:13:44.320 | And that's, I think, the magic of translation.
02:13:47.800 | If you start thinking of translation
02:13:49.720 | as just simply capturing that emotional set of reactions
02:13:54.720 | that you evoke, you need a different set of words
02:14:00.960 | to evoke that same set of reactions to a French person
02:14:04.320 | than to a Russian person,
02:14:05.680 | because of the baggage of the culture that we grew up in.
02:14:08.360 | - Yeah, I mean, there's--
02:14:09.880 | - So basically, you shouldn't just find the best word.
02:14:13.400 | Sometimes it's a completely different sentence structure
02:14:16.000 | that you will need,
02:14:17.000 | matched to the cultural context
02:14:21.960 | of the target audience that you have.
02:14:23.480 | - Yeah, I mean,
02:14:26.180 | I usually don't think about this,
02:14:27.400 | but right now there's this feeling,
02:14:29.680 | as a reminder, there's just you and I talking,
02:14:32.340 | but there are several hundred thousand people
02:14:34.960 | who will listen to this.
02:14:36.080 | There's some guy in Russia right now
02:14:38.320 | running, like in Moscow, listening to us.
02:14:43.320 | And there's somebody in India, I guarantee you,
02:14:46.360 | there's somebody in China and South America,
02:14:48.340 | there's somebody in Texas,
02:14:50.360 | and they all have different--
02:14:52.960 | - Emotional baggage.
02:14:53.960 | - They probably got angry earlier on
02:14:56.040 | about the whole discussion about coronavirus
02:14:58.240 | and about some aspect of it.
02:15:01.960 | Yeah, and there's that network effect.
02:15:04.840 | - Yeah, yeah, yeah.
02:15:05.680 | - That's--
02:15:06.840 | - It's a beautiful thing.
02:15:07.880 | - And this lateral transfer of information,
02:15:10.760 | that's what makes the collective, quote-unquote,
02:15:12.800 | genome of humanity so different from any other species.
02:15:17.800 | - So you somehow miraculously wrapped it back
02:15:22.600 | to the very beginning of when we were talking
02:15:25.120 | about the beauty of the human genome.
02:15:27.820 | So I think this is the right time,
02:15:31.220 | unless we wanna go for a six to eight hour conversation.
02:15:34.840 | We're gonna have to talk again,
02:15:35.960 | but I think for now, to wrap it up,
02:15:39.100 | this is the right time to talk about
02:15:40.900 | the biggest, most ridiculous question of all,
02:15:44.920 | meaning of life.
02:15:45.880 | Off mic, you mentioned to me that you had
02:15:50.120 | your 42nd birthday, 42nd being a very special,
02:15:55.120 | absurdly special number, and you had a kind of
02:15:59.520 | get together with friends to discuss the meaning of life.
02:16:04.360 | So let me ask you, in your, as a biologist,
02:16:08.400 | as a computer scientist, and as a human,
02:16:11.320 | what is the meaning of life?
02:16:14.660 | - I've been asking this question for a long time,
02:16:18.960 | ever since my 42nd birthday, but well before that,
02:16:22.120 | in even planning the Meaning of Life Symposium.
02:16:25.320 | And symposium, sym means together,
02:16:29.800 | posis actually means drinking.
02:16:31.560 | So symposium is actually a drinking party.
02:16:33.560 | (laughing)
02:16:35.640 | - Can you actually elaborate about this
02:16:37.320 | Meaning of Life Symposium that you put together?
02:16:39.520 | It's like the most genius idea I've ever heard.
02:16:42.320 | - So 42 is obviously the answer to life,
02:16:44.640 | the universe, and everything,
02:16:45.600 | from the Hitchhiker's Guide to the Galaxy.
02:16:47.660 | And as I was turning 42, I've had the theme
02:16:50.680 | for every one of my birthdays.
02:16:51.800 | When I was turning 32, it's one zero zero,
02:16:54.920 | zero zero zero in binary.
02:16:56.640 | So I celebrated my 100,000th binary birthday,
02:17:00.080 | and I had the theme of going back 100,000 years,
02:17:02.760 | you know, let's dress as something from the last 100,000 years.
02:17:07.160 | Anyway, I've always had these--
02:17:09.560 | - You're such an interesting human being.
02:17:12.280 | Okay, that's awesome.
02:17:13.120 | - I've always had these sort of numerology
02:17:16.240 | related announcements for my birthday party.
02:17:20.400 | (laughing)
02:17:21.740 | So what came out of that Meaning of Life Symposium
02:17:26.740 | is that I basically asked 42 of my colleagues,
02:17:29.680 | 42 of my friends, 42 of my collaborators,
02:17:33.040 | to basically give seven-minute speeches
02:17:35.480 | on the meaning of life, each from their perspective.
02:17:38.480 | And I really encourage you to go there,
02:17:40.600 | 'cause it's mind-boggling that every single person
02:17:44.200 | said a different answer.
02:17:46.280 | Every single person started with,
02:17:48.400 | "I don't know what the meaning of life is, but,"
02:17:50.920 | and then gave this beautifully
02:17:54.240 | eloquent answer.
02:17:55.440 | And they were all different,
02:17:57.280 | but they all were consistent with each other
02:18:01.300 | and mutually synergistic and together forming
02:18:04.340 | a beautiful view of what it means to be human in many ways.
02:18:07.520 | Some people talked about the loss of their loved one,
02:18:12.280 | their life partner for many, many years,
02:18:14.520 | and how their life changed through that.
02:18:16.520 | Some people talked about the origin of life.
02:18:19.260 | Some people talked about the difference
02:18:21.080 | between purpose and meaning.
02:18:24.160 | I'll maybe quote one of the answers,
02:18:28.560 | which is this linguistics professor,
02:18:30.840 | a friend of mine at Harvard,
02:18:32.480 | who basically said,
02:18:36.600 | she's Greek as well,
02:18:37.800 | that she would give a very Pythian answer.
02:18:40.120 | So Pythia was the oracle of Delphi,
02:18:42.960 | who would basically give these very cryptic answers,
02:18:45.280 | very short, but interpretable in many different ways.
02:18:48.320 | There was this whole set of priests
02:18:50.480 | who were tasked with interpreting what Pythia had said,
02:18:53.440 | and very often you would not get a clean interpretation,
02:18:56.440 | but she said, "I will be like Pythia
02:18:58.920 | "and give you a very short
02:19:00.840 | "and multiply interpretable answer,
02:19:02.480 | "but unlike her, I will actually
02:19:04.100 | "also give you three interpretations."
02:19:06.040 | And she said, "The answer to the meaning of life
02:19:09.740 | "is become one."
02:19:11.140 | And the first interpretation is,
02:19:15.080 | like a child, become one year old
02:19:17.660 | with the excitement of discovering
02:19:19.340 | everything about the world.
02:19:21.400 | Second interpretation, in whatever you take on,
02:19:25.020 | become one, the first, the best, excel,
02:19:28.840 | drive yourself to perfection for every one of your tasks.
02:19:32.760 | And third interpretation, when people are separate,
02:19:37.760 | become one, come together, learn to understand each other.
02:19:42.040 | - Damn, that's an answer.
02:19:45.400 | - And one way to summarize
02:19:46.880 | this whole meaning of life symposium
02:19:48.760 | is that the very symposium was illustrating
02:19:52.920 | the quest for meaning,
02:19:54.680 | which might itself be the meaning of life.
02:19:58.120 | This constant quest for something sublime,
02:20:01.400 | something human, something intangible,
02:20:04.900 | some aspect of what defines us
02:20:08.600 | as a species and as an individual,
02:20:11.320 | the quest of me as a person through my own life.
02:20:16.360 | But the meaning of life could also be
02:20:19.200 | the meaning of all of life.
02:20:20.840 | What is the whole point of life?
02:20:22.040 | Why life?
02:20:22.880 | Why life itself?
02:20:24.480 | 'Cause we've been talking about the history
02:20:26.680 | and evolution of life,
02:20:28.320 | but we haven't talked about why life in the first place.
02:20:31.040 | Is life inevitable?
02:20:32.480 | Is life part of physics?
02:20:35.840 | Does life transcend physics?
02:20:37.680 | By fighting against entropy,
02:20:40.280 | by compartmentalizing and increasing concentrations
02:20:42.920 | rather than diluting away,
02:20:45.280 | is life a distinct entity in the universe
02:20:50.280 | beyond the traditional, very simple physical rules
02:20:55.080 | that govern gravity and electromagnetism
02:20:58.520 | and all of these forces?
02:21:00.640 | Is life another force?
02:21:02.120 | Is there a life force?
02:21:03.100 | Is there a unique kind of set of principles that emerge,
02:21:05.920 | of course, built on top of the hardware of physics,
02:21:09.100 | but is it sort of a new layer of software
02:21:11.820 | or a new layer of a computer system?
02:21:14.400 | So that's at the level of big questions.
02:21:18.480 | There's another aspect: gratitude.
02:21:21.200 | Basically, what I like to say is,
02:21:26.200 | during this pandemic,
02:21:27.920 | I've basically worked from 6 a.m. until 7 p.m.
02:21:30.800 | every single day, nonstop, including Saturday and Sunday.
02:21:34.280 | I've basically broken all boundaries
02:21:36.440 | of where personal life begins and work life ends.
02:21:42.000 | And that has been exhilarating for me,
02:21:46.280 | just the intellectual pleasure that I get
02:21:50.480 | from a day of exhaustion,
02:21:53.820 | where at the end of the day, my brain is hurting,
02:21:55.520 | I'm telling my wife, "Wow, I was useful today."
02:21:59.400 | And there's a certain pleasure
02:22:04.720 | that comes from feeling useful.
02:22:08.360 | And there's a certain pleasure
02:22:09.880 | that comes from feeling grateful.
02:22:12.480 | So I've written this little sort of prayer for my kids
02:22:16.440 | to say at bedtime every night,
02:22:19.520 | where they basically say,
02:22:20.940 | "Thank you, God, for all you have given me
02:22:24.720 | and give me the strength to give unto others
02:22:28.580 | with the same love that you have given unto me."
02:22:31.320 | We as a species are so special,
02:22:36.560 | the only ones who worry about the meaning of life.
02:22:39.520 | And maybe that's what makes us human.
02:22:43.320 | And what I like to say to my wife and to my students
02:22:47.960 | during this pandemic work extravaganza
02:22:52.240 | is this: every now and then they ask me, "But how do you do this?"
02:22:56.400 | And I'm like, "I'm a workaholic.
02:22:58.880 | I love this.
02:23:00.880 | This is me in the most unfiltered way.
02:23:04.800 | The ability to do something useful,
02:23:07.280 | to feel that my brain's being used,
02:23:09.640 | to interact with the smartest people on the planet
02:23:12.580 | day in, day out,
02:23:14.120 | and to help them discover aspects of the human genome,
02:23:17.120 | of the human brain, of human disease and the human condition
02:23:21.940 | that no one has seen before
02:23:24.560 | with data that we're capturing
02:23:27.000 | that has never been observed."
02:23:29.860 | And there's another aspect, which is on the personal life.
02:23:34.480 | Many people say, "Oh, I'm not gonna have kids.
02:23:36.120 | Why bother?"
02:23:37.560 | I can tell you as a father,
02:23:39.160 | they're missing half the picture, if not the whole picture.
02:23:44.580 | Teaching my kids about my view of the world
02:23:50.160 | and watching through their eyes
02:23:52.360 | the naivete with which they start
02:23:54.600 | and the sophistication with which they end up,
02:23:57.040 | the understanding that they have
02:24:01.160 | of not just the natural world around them, but of me too.
02:24:05.000 | The unfiltered criticism that you get
02:24:10.000 | from your own children that knows no bounds of honesty.
02:24:16.320 | And I've grown components of my heart
02:24:22.840 | that I didn't know I had
02:24:24.840 | until you sense that fragility,
02:24:29.020 | that vulnerability of the children,
02:24:33.280 | that immense love and passion,
02:24:37.000 | the unfiltered egoism that we as adults
02:24:42.200 | learn how to hide so much better.
02:24:44.280 | It's just this bag of emotions
02:24:48.080 | that tell me about the raw materials
02:24:50.980 | that make a human being
02:24:53.000 | and how these raw materials can be arranged
02:24:55.280 | with the sophistication that we learn through life
02:24:58.600 | to become truly human adults.
02:25:02.160 | But there's something so beautiful
02:25:04.320 | about seeing that progression between them,
02:25:07.260 | the complexity of the language growing
02:25:10.600 | as more neuronal connections are formed,
02:25:12.800 | to realize that the hardware is getting rearranged
02:25:18.540 | as their software is getting implemented on that hardware,
02:25:21.580 | that their frontal cortex continues to grow
02:25:24.860 | for another 10 years.
02:25:26.140 | There are neuronal connections that are continuing to form,
02:25:29.920 | new neurons that actually get replicated and formed.
02:25:33.080 | And it's just incredible. It's not just that
02:25:38.080 | you grow the hardware for 30 years
02:25:40.680 | and then you feed it all of the knowledge.
02:25:42.640 | No, no, the knowledge is fed throughout
02:25:45.200 | and is shaping these neural connections as they're forming.
02:25:48.480 | So seeing that transformation, whether from your own blood
02:25:52.840 | or from an adopted child,
02:25:54.560 | is the most beautiful thing you can do as a human being.
02:25:57.520 | And it completes you,
02:25:58.640 | it completes that path, that journey.
02:26:00.760 | To create life, oh sure, that's at conception, that's easy.
02:26:04.880 | But to create human life, to add the human part,
02:26:08.400 | that takes decades of compassion, of sharing,
02:26:13.160 | of love and of anger and of impatience and patience.
02:26:18.160 | And as a parent,
02:26:21.880 | I think I've become a very different kind of teacher.
02:26:25.060 | Because again, I'm a professor,
02:26:27.080 | my first role is to bring adult human beings
02:26:31.040 | into a more mature level of adulthood,
02:26:34.440 | where they learn not just to do science,
02:26:37.040 | but they learn the process of discovery
02:26:39.840 | and the process of collaboration,
02:26:41.200 | the process of sharing,
02:26:42.280 | the process of conveying the knowledge,
02:26:44.840 | of encapsulating something incredibly complex
02:26:48.000 | and sort of giving it out in sort of bite-sized chunks
02:26:51.200 | that the rest of humanity can appreciate.
02:26:54.400 | I tell my students all the time,
02:26:55.680 | you know, like when an apple falls,
02:26:58.760 | when a tree falls in the forest
02:27:00.840 | and no one's there to listen, has it really fallen?
02:27:03.040 | The same way, if you do this awesome research
02:27:05.280 | but write an impenetrable paper
02:27:06.700 | that no one will understand,
02:27:08.640 | it's as if you never did the awesome research.
02:27:11.040 | So conveying of knowledge,
02:27:12.520 | conveying this lateral transfer
02:27:15.200 | that I was talking about at the very beginning
02:27:17.540 | of sort of humanity and sort of the sharing of information,
02:27:22.500 | all of that has gotten so much more rich
02:27:27.220 | by seeing human beings grow in my own home,
02:27:30.980 | because that makes me a better parent
02:27:35.100 | and that makes me a better teacher
02:27:36.980 | and a better mentor in the nurturing of my adult children,
02:27:41.980 | who are my research group.
02:27:44.020 | - First of all, beautifully put,
02:27:45.780 | connects beautifully to the vertical
02:27:49.480 | and the horizontal inheritance of ideas
02:27:52.260 | that we talked about at the very beginning.
02:27:54.460 | I don't think there's a better way to end it
02:27:57.380 | on this poetic and powerful note.
02:28:01.340 | Manolis, thank you so much for talking to us.
02:28:03.100 | A huge honor.
02:28:03.940 | We'll have to talk again about the origin of life,
02:28:07.280 | about epigenetics, epigenomics,
02:28:10.540 | and some of the incredible research you're doing.
02:28:13.620 | Truly an honor.
02:28:14.460 | Thanks so much for talking to me.
02:28:15.300 | - Thank you, such a pleasure.
02:28:16.460 | It's such a pleasure.
02:28:17.280 | I mean, your questions are outstanding.
02:28:19.100 | I've had such a blast here.
02:28:20.580 | I can't wait to be back.
02:28:21.920 | - Awesome.
02:28:23.260 | Thanks for listening to this conversation
02:28:24.860 | with Manolis Kellis,
02:28:25.980 | and thank you to our sponsors, Blinkist,
02:28:29.060 | 8sleep, and Masterclass.
02:28:31.420 | Please consider supporting this podcast
02:28:33.260 | by going to blinkist.com/lex, 8sleep.com/lex,
02:28:37.680 | and masterclass.com/lex.
02:28:41.100 | Click the links, buy the stuff, get the discount.
02:28:44.220 | It's the best way to support this podcast.
02:28:47.100 | If you enjoy this thing, subscribe on YouTube,
02:28:48.820 | review it with five stars on Apple Podcasts,
02:28:50.940 | support it on Patreon,
02:28:52.300 | or connect with me on Twitter @LexFriedman.
02:28:55.480 | And now let me leave you with some words
02:28:57.380 | from Charles Darwin
02:28:58.940 | that I think Manolis represents quite beautifully.
02:29:01.680 | "If I had my life to live over again,
02:29:04.780 | "I would have made a rule to read some poetry
02:29:07.460 | "and listen to some music at least once every week."
02:29:11.660 | Thank you for listening, and hope to see you next time.
02:29:14.900 | (upbeat music)