Manolis Kellis: Meaning of Life, the Universe, and Everything | Lex Fridman Podcast #142
Chapters
0:00 Introduction
2:18 Music and life
41:51 The number 42
47:52 The question about the meaning of life
50:32 Are humans unique in the universe?
56:16 Human civilization
68:22 Mars
70:15 Human mind and the abstraction layers of reality
81:08 Neural networks and intelligence
88:25 Ideas as organisms
97:49 Language
109:04 Legacy
123:55 Poems
00:00:00.000 |
The following is a conversation with Manolis Kellis, 00:00:06.880 |
and head of the MIT Computational Biology Group. 00:00:16.660 |
is the answer to the ultimate question of life, 00:00:20.420 |
according to the Hitchhiker's Guide to the Galaxy, 00:00:23.640 |
we decided to talk about this unanswerable question 00:00:28.120 |
in whatever way we two descendants of apes could muster, 00:00:32.000 |
from biology, to psychology, to metaphysics, and to music. 00:00:39.400 |
followed by some thoughts related to the episode. 00:00:47.000 |
grammar, sentence structure, and readability, 00:00:56.280 |
and Cash App, the app I use to send money to friends. 00:01:00.400 |
Please check out these sponsors in the description 00:01:02.440 |
to get a discount and to support this podcast. 00:01:05.680 |
As a side note, let me say that the opening 40 minutes 00:01:08.360 |
of the conversation are all about the many songs 00:01:11.480 |
that formed the soundtrack to the journey of Manolis's life. 00:01:24.840 |
of many of the songs we mention in the description 00:01:31.960 |
without listening to the songs or watching the video, 00:01:39.600 |
his singing of the little excerpts from the songs, 00:01:53.960 |
and more philosophical parts of the conversation. 00:01:56.720 |
I hope you enjoy this little experimenting conversation 00:02:13.280 |
And now, here's my conversation with Manolis Kellis. 00:02:17.360 |
You mentioned Leonard Cohen and the song "Hallelujah" 00:02:31.320 |
So there's really countless songs that have marked me, 00:02:34.660 |
that have sort of shaped me in periods of joy 00:02:50.520 |
'cause I can sort of recite hundreds of songs 00:02:54.560 |
So it's gonna be very hard to just pick a few. 00:03:03.720 |
as I told you before, the misery, the poverty, 00:03:09.000 |
So some of the songs that have really shaped me 00:03:16.560 |
And then there's also really just old traditional songs 00:03:28.840 |
And the song is painting this beautiful picture 00:03:32.200 |
about all the noises that you hear in the neighborhood, 00:03:39.960 |
and the kids crying next door and all of that. 00:03:44.640 |
I'm having trouble falling asleep and dreaming. 00:03:49.560 |
And then he was like, you know, breaking into that. 00:03:52.400 |
So it's this juxtaposition between the spirit 00:03:55.720 |
and the sublime and then the physical and the harsh reality. 00:03:59.960 |
It's just not having troubles, not being miserable. 00:04:10.240 |
being able to sort of be the captain of a ship 00:04:19.040 |
they acknowledge the cruelty, the difficulty of life, 00:04:45.800 |
I don't even know if there's a word for that in English. 00:04:51.920 |
♪ Ta heria sum megalosan ke ponesan ke matesan ♪ 00:04:59.120 |
basically like the state of being poor and misery, 00:05:02.680 |
you know, for you, I write all my songs, et cetera. 00:05:21.800 |
with nothing to lose because you've seen the worst of it. 00:05:29.960 |
So it's describing the young men as cypress trees. 00:05:34.120 |
And that's probably one of my earliest exposure 00:05:41.560 |
I was reading a story to my kids the other day 00:06:00.240 |
If you just close your eyes and listen, it's a video. 00:06:06.840 |
And he's basically showing just how much more 00:06:09.000 |
the human imagination has besides just a few images 00:06:37.000 |
And again, these songs give you so much perspective. 00:06:44.240 |
about sort of the passion and the force and the drive. 00:07:03.200 |
sort of view of romanticism of, you know, suffering. 00:07:22.960 |
and all of these sort of really popular songs 00:07:27.240 |
Just songs that I would just listen to the radio 00:07:30.760 |
And eventually, as I started learning English, 00:07:32.440 |
I was like, oh, wow, this thing has been repeating. 00:07:44.080 |
is teaching you that it's your responsibility 00:07:49.320 |
You know, if you wanna make the world a better place, 00:07:55.080 |
all of these songs, you can listen to them shallowly 00:08:00.680 |
And I think there's a certain philosophy of song 00:08:10.080 |
because they have an accident in that region of the brain 00:08:14.080 |
because it's exactly the symmetric region of the brain. 00:08:26.120 |
and rhythmic patterns and eventually language. 00:08:31.080 |
- Do you have a sense of why songs developed? 00:08:33.960 |
So you're kind of suggesting that it's possible 00:08:39.240 |
about our connection with song and with music 00:08:53.360 |
Like, basically, my view of human cognitive evolution 00:09:04.560 |
There's organized dance performances around mating. 00:09:11.440 |
I mean, that's an evolutionary drive right there. 00:09:14.080 |
So basically, if you're not able to string together 00:09:16.160 |
a complex dance as a bird, you don't get a mate. 00:09:27.800 |
and not every bird knows how to learn a complicated song. 00:09:36.200 |
and a lot of that is inherent and genetically encoded. 00:09:42.600 |
And if you look at a lot of these exotic birds of paradise 00:09:51.400 |
And I think human mating rituals of ancient tribes 00:09:56.840 |
And in my view, the sequential formation of these movements 00:10:10.600 |
not just an accidental precursor to intelligence. 00:10:16.040 |
- Well, it's sexually selected and it's a prerequisite. 00:10:36.840 |
No, I mean, I don't know if you remember that scene from, 00:10:40.160 |
oh gosh, what's that Jack Nicholson movie in New Hampshire? 00:10:56.200 |
and what's coming out of the typewriter is just gibberish. 00:10:59.040 |
And I have that image as well when I'm working. 00:11:01.600 |
And I'm like, no, basically all of these crazy, 00:11:11.960 |
This ability of sort of stretching your brain 00:11:15.600 |
is connecting your emotional self and your cognitive self. 00:11:25.520 |
- Yeah, I wonder if the world without art and music, 00:11:30.600 |
that world would be not just devoid of fun things 00:11:34.680 |
to look at or listen to, but devoid of all the other stuff, 00:11:50.480 |
No, they play the piano and they play the violin 00:12:01.840 |
and playing soccer and avoiding obstacles and all of that, 00:12:06.400 |
that forms your three-dimensional view of the world. 00:12:28.760 |
So I can't have a conversation with my students 00:12:44.600 |
- The whiteboard, yeah, that's fascinating to think about. 00:12:49.440 |
"Careless Whisper" by George Michael, 00:13:05.800 |
And I had a tape where I only had part of that song. 00:13:08.480 |
- Part of that song, you just sang it over and over. 00:13:09.320 |
- And I just played it over and over and over again. 00:13:15.120 |
That song is almost Greek, it's so heartbreaking. 00:13:26.600 |
So sorry to offend you so deeply not knowing this. 00:13:31.160 |
- So anyway, so we're moving to France when I'm 12 years old 00:13:33.040 |
and now I'm getting into the songs of Gainsbourg. 00:13:36.160 |
So Gainsbourg is this incredible French composer. 00:14:20.040 |
and then "ek" from "oikos," as in ecology, which means home. 00:14:23.360 |
So "metek" is someone who has changed homes, who is a migrant. 00:16:28.240 |
where she's actually added some English lyrics. 00:16:45.200 |
there's not many songs that show such depth of desperation 00:16:58.680 |
- And then high school, now I'm starting to learn English. 00:17:16.920 |
♪ It takes a man to suffer ignorance and smile ♪ 00:17:23.840 |
And then, ♪ Takes more than combat gear to make a man ♪ 00:17:30.720 |
♪ Confront your enemies, avoid them when you can ♪ 00:17:42.640 |
Basically says, it's not the combat gear that makes a man. 00:17:46.400 |
- Where's the part where he says, there you go. 00:17:49.560 |
♪ Gentleness, sobriety are rare in this society ♪ 00:17:53.920 |
♪ At night a candle's brighter than the sun ♪ 00:17:57.680 |
He basically says, well, you just might be the only one. 00:18:14.040 |
Don't, you know, don't let that anger get to you. 00:18:22.960 |
So again, as in Greece, I didn't even know what that meant, 00:18:26.120 |
how fragile we are, but the song was so beautiful. 00:18:34.040 |
after the Contras murdered Ben Linder in 1987. 00:18:51.400 |
And that song starts with the most beautiful poetry. 00:18:56.200 |
♪ If blood will flow when flesh and steel are one ♪ 00:19:04.600 |
♪ Tomorrow's rain will wash the stains away ♪ 00:19:08.280 |
♪ But something in our minds will always stay ♪ 00:19:37.440 |
it's just such a refined way with deep meanings, 00:19:42.000 |
but also words that rhyme just so beautifully 00:19:45.600 |
and evocations of when flesh and steel are one. 00:20:03.400 |
And again, tears from a star, how fragile we are. 00:20:06.840 |
I mean, just these rhymes are just flowing so naturally. 00:20:10.160 |
- Something, it seems that more meaning comes 00:20:16.000 |
That probably connects to exactly what you were saying. 00:20:25.120 |
And the less obvious are often the first verse 00:20:28.120 |
because it makes the second verse flow much more naturally 00:20:33.760 |
Oh, you went and found this like unusual word. 00:20:36.480 |
In "Dark Moments", the whole album of Pink Floyd 00:20:45.200 |
And there's one song that never actually made it 00:20:47.680 |
into the album, that's only there in the movie 00:20:55.000 |
And it just describes again, this vivid imagery. 00:20:58.120 |
It was just before dawn, one miserable morning in black '44, 00:21:02.920 |
when the forward commander was told to sit tight 00:21:08.440 |
And the generals gave thanks as the other ranks held back 00:21:16.040 |
And the Anzio Bridgehead was held for the price 00:21:24.640 |
So that's a theme that keeps coming back in Pink Floyd 00:21:43.480 |
From another song, it's like this whole concept 00:21:46.400 |
And there's that theme of "Us Versus Them" again 00:21:48.720 |
where the child is discovering how his father died 00:21:58.560 |
And my eyes still grow damp to remember his majesty's sign 00:22:11.760 |
He's crying because kind old King George took the time 00:22:16.000 |
to actually write mother a note about the fact 00:22:23.400 |
we are just ordinary men and of course we're disposable. 00:22:26.920 |
So I don't know if you know the root of the word pioneers, 00:22:29.840 |
but you had a chessboard here earlier, a pawn. 00:22:45.360 |
to actually be the ones that we're now treating as heroes. 00:22:48.880 |
So anyway, there's this juxtaposition of that. 00:22:50.920 |
And then the part that always just strikes me 00:22:53.680 |
is the music and the tonality totally changes. 00:23:24.800 |
And that song, even though it's not in the album, 00:23:32.080 |
It's this movie of someone being stuck in their head 00:23:37.080 |
There's no other movie that I think has captured so well 00:24:32.440 |
which is like, where he just breaks out with his guitar 00:24:36.840 |
He starts throwing stuff and then he like, you know, 00:24:39.720 |
breaks the window, he throws the chair outside 00:24:43.680 |
with his own blood like, you know, everywhere. 00:24:51.680 |
And it's this whole sort of mania versus, you know, 00:25:33.120 |
you're making me realistic. - But imagine watching this 00:26:22.120 |
There's no way that they're gonna let their children die. 00:26:34.200 |
And now that's the new national anthem, are you reading? 00:26:55.480 |
♪ I hope the Russians love their children too ♪ 00:27:27.320 |
Russians love their children too, the common humanity. 00:27:35.200 |
about the daughter who's crying for her brother 00:27:44.480 |
This Turk, here's his family, here's his children. 00:27:47.560 |
This other one, he just got married, et cetera. 00:27:56.280 |
the enemies are these monsters, they're not human. 00:28:00.880 |
We always say, they're not like us, they're different. 00:28:06.240 |
So there's this dehumanization that has to happen 00:28:10.200 |
If you realize just how close we are genetically, 00:28:14.280 |
one with the other, this whole 99.9% identical, 00:28:18.680 |
you can't bear weapons against someone who's like that. 00:28:22.120 |
- And the things that are the most meaningful to us 00:28:24.160 |
in our lives at every level is the same on all sides, 00:28:30.040 |
- So it's not just that we're genetically the same. 00:28:39.920 |
And the last one I mentioned last time we spoke, 00:29:20.080 |
♪ And then I've looked at clouds from both sides now ♪ 00:29:25.640 |
♪ And still somehow it's clouds illusions I recall ♪ 00:29:41.160 |
and about life, how it's about winning and losing, 00:29:49.240 |
♪ They shake their heads, they say I've changed ♪ 00:29:53.200 |
♪ Well, something's lost and something's gained ♪ 00:29:59.520 |
So again, that's growing up and realizing that, 00:30:04.960 |
is not necessarily that you have as an adult. 00:30:07.200 |
Remember my poem from when I was 16 years old 00:30:09.920 |
of this whole, you know, children dance now while in row, 00:30:13.440 |
and then in the end, even though the snow seems bright, 00:30:24.600 |
you see the exact same thing from a different way. 00:30:29.840 |
ah, sucks, I won't be able to go outside now. 00:30:40.800 |
I'm so glad we stumbled on how much joy you have 00:30:54.080 |
so Leonard Cohen's "Dance Me to the End of Love," 00:30:56.360 |
that was our opening song in our wedding with my wife. 00:31:03.400 |
And then another one, which is just so passionate always, 00:31:06.200 |
and we always keep referring back to it, is "I'm Your Man." 00:31:20.240 |
You can have the passion, you can have the anger, 00:31:23.280 |
you can have the love, you can have the tenderness. 00:31:40.040 |
Then if you want a boxer, I will step into the ring for you. 00:31:46.680 |
or if you want to take me for a ride, you know you can. 00:31:50.760 |
So this whole concept of you wanna drive, I'll follow. 00:32:02.160 |
he's proud of his ability to basically be any kind of man 00:32:07.920 |
for as long as he wants, as opposed to the Jacques Brel 00:32:15.040 |
for you to love me, that kind of desperation. 00:32:20.440 |
There's a verse that is perhaps not paid attention 00:32:36.520 |
in the same way that "Ne me quitte pas" is an apology song. 00:32:36.520 |
- And in the same way that the "Careless Whisper" 00:32:57.240 |
So this is an apology song, not by begging on his knees 00:33:00.080 |
or I'd crawl to you, baby, and I'd fall at your feet 00:33:03.520 |
and I'd howl at your beauty like a dog in heat 00:33:07.480 |
and I'd claw at your heart and I'd tear at your sheet. 00:33:22.320 |
♪ Or only want to walk with me a while across the sand ♪ 00:33:30.800 |
That's the last verses, which basically says, 00:33:40.200 |
If you want a father for your child, I'll be there too. 00:33:52.920 |
country song, believe it or not, "The Lucky One." 00:34:08.600 |
is the guy who got me to genomics in the first place. 00:34:27.840 |
a bodybuilder, was holding me on his shoulder. 00:34:30.240 |
And I was like doing maneuvers in the air, basically. 00:34:41.280 |
who was describing how every member of her family 00:34:50.840 |
like death and, you know, sickness and everything. 00:35:11.200 |
"But I'm only reassured because you're here with me." 00:35:18.880 |
I know you're the luckiest man on the planet." 00:35:24.240 |
where I just feel freaking lucky all the time. 00:35:30.520 |
Of course, I'm not any luckier than any other person, 00:35:32.440 |
but every time something horrible happens to me, 00:35:53.760 |
how nicely that the wind blew and the writing was erased." 00:36:11.760 |
it must open the door to a beautiful chapter. 00:36:15.280 |
- So, Alison Krauss is talking about "The Lucky One." 00:36:18.320 |
So, it's like, "Oh my God, she wrote a song for me." 00:36:25.320 |
♪ As free as the wind blowing down the road ♪ 00:36:30.200 |
♪ I'd say you were lucky 'cause you know what you've done ♪ 00:36:32.680 |
♪ Not the care in the world, not the worry inside ♪ 00:36:45.920 |
♪ You look at the world with the smiling eyes ♪ 00:36:48.160 |
♪ And laugh at the devil as his train rolls by ♪ 00:36:50.960 |
♪ I'll give you a song and a one-night stand ♪ 00:37:17.280 |
♪ But never knowing which road you're choosing ♪ 00:37:22.320 |
♪ To playing and winning is playing and losing ♪ 00:37:36.360 |
if you lose, it's okay, you had an awesome game. 00:37:43.800 |
but then there's the last verse basically says, 00:37:46.960 |
no matter where you're at, that's where you'll be, 00:37:59.960 |
And it basically tells you that freedom comes at a price. 00:38:04.360 |
Freedom comes at the price of non-commitment. 00:38:06.480 |
This whole sort of birds who love or birds who cry, 00:38:12.360 |
You can't just be the lucky one, the happy boy, la la la, 00:38:27.400 |
But at the same time, I identify with a lesson of, 00:38:30.600 |
well, you can't just be the happy merry-go-lucky all the time. 00:38:49.040 |
But if you just go all in and you just, you know, 00:38:59.680 |
I managed to narrow it down to what, 15 songs? 00:39:14.880 |
I haven't heard it before, but that's exactly right. 00:39:24.280 |
And there's something to that, but you're right. 00:39:26.920 |
It needs to be, we need to now return to the muck of life 00:39:43.440 |
things don't turn out the way you expect it to. 00:39:52.240 |
But then that feeling of things being different 00:39:55.640 |
than you expected, that you stumble in all the kinds of ways 00:39:59.760 |
that seems to be, needs to be paired with the feeling. 00:40:04.760 |
The only way not to make mistakes is to never do anything. 00:40:19.480 |
And it's just crazy for me, who's a computer scientist, 00:40:36.280 |
but it's always something worth exploring further. 00:40:44.000 |
and knowing that I'll be wrong a bunch of times, 00:40:53.560 |
is part of this whole sort of messiness of life. 00:41:10.640 |
you know, if you are a doer, you'll make mistakes. 00:41:17.600 |
you can sit back and criticize everybody else 00:41:24.320 |
And frankly, I'd rather be the criticized one 00:41:28.960 |
- Every time somebody steals my bicycle, I say, 00:41:34.400 |
And I'm like, "Aren't you happy that you have bicycles 00:41:39.200 |
I'd rather be the person stolen from than the stealer. 00:41:45.160 |
So that's, we've just talked amazingly about life 00:41:55.120 |
from perhaps other perspective and its meaning. 00:42:07.040 |
to the number 42 that our culture has elevated. 00:42:13.680 |
So this is a perfect time to talk about the meaning of life. 00:42:18.400 |
but do you think this question that's so simple 00:42:33.280 |
and trying to answer it, is that a ridiculous pursuit 00:42:39.360 |
- So first of all, I feel that we owe it to your listeners 00:42:46.040 |
- So of course, the Hitchhiker's Guide to the Galaxy 00:42:48.360 |
came up with 42 as basically a random number. 00:42:52.200 |
Just, you know, the author just pulled it out of a hat 00:43:00.400 |
But in fact, there's many numbers that are linked to 42. 00:43:18.840 |
At some point, the computer says, "I have an answer." 00:43:30.440 |
And then the irony is that they had forgotten, of course, 00:43:39.380 |
So as I was turning 42, I basically sort of researched 00:43:49.560 |
to all those guests to my 42nd birthday party 00:43:53.080 |
why we were talking about the meaning of life. 00:43:54.880 |
And I basically talked about how 42 is the angle 00:43:59.880 |
at which light reflects off of water to create a rainbow. 00:44:14.880 |
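As an aside, the ~42 degree rainbow figure can be checked numerically with Descartes' minimum-deviation construction for a single internal reflection in a water droplet. A minimal Python sketch, where the refractive index n = 1.333 for water is an assumed value and `primary_rainbow_angle` is a hypothetical helper name:

```python
import math

def primary_rainbow_angle(n=1.333):
    """Rainbow angle via Descartes' construction: one internal
    reflection in a spherical droplet with refractive index n."""
    # Incidence angle that minimizes total deviation: cos^2(i) = (n^2 - 1) / 3
    i = math.acos(math.sqrt((n**2 - 1) / 3))
    # Refraction angle inside the drop, from Snell's law
    r = math.asin(math.sin(i) / n)
    # Total deviation of the ray: 2 refractions plus 1 internal reflection
    deviation = math.degrees(2 * i - 4 * r) + 180
    # The rainbow appears at 180 - deviation from the antisolar point
    return 180 - deviation

angle = primary_rainbow_angle()  # roughly 42 degrees
```

Plugging in n = 1.333 lands within a fraction of a degree of 42, which is why the red arc of the primary rainbow sits at about that angle.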
So 42 is also the sum of all rows and columns 00:44:17.760 |
of a magic cube that contains all consecutive integers 00:44:24.640 |
between one and however many vertices there are, 00:44:42.000 |
And now it's the only one that actually has a solution. 00:44:46.160 |
42 is also one, zero, one, zero, one, zero in binary. 00:44:50.240 |
Again, the yin and the yang, the good and the evil, 00:44:54.960 |
42 is the number of chromosomes for the giant panda. 00:45:15.040 |
is exactly the strength but yet peace, and so on and so forth. 00:45:33.520 |
should be folded to reach beyond the moon. (laughs) 00:45:41.960 |
when they ask that their paper reaches for the stars. 00:45:44.560 |
I just tell them just fold it a bunch of times. 00:45:48.920 |
42 is the number of Messier object 42, which is the Orion Nebula. 00:45:58.520 |
It's, I think, also the place where we can actually see 00:46:08.280 |
Which is very useful when searching for the stars. 00:46:10.880 |
And also a regexp for life, the universe, and everything. 00:46:10.880 |
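Several of these numerical coincidences can be verified in a few lines. A minimal Python sketch, assuming the magic cube is the 3x3x3 case (integers 1 through 27) and a sheet thickness of 0.1 mm for the paper-folding claim:

```python
# 42 in binary is 101010: alternating ones and zeros, the "yin and yang"
assert format(42, "b") == "101010"

# 42 is the ASCII code of '*', the wildcard -- a regexp matching anything
assert chr(42) == "*"

# A 3x3x3 magic cube holds 1..27: total 27*28/2 = 378 over 9 rows -> 42 each
assert sum(range(1, 28)) // 9 == 42

# A 0.1 mm sheet doubles in thickness with each fold; after 42 folds
# it exceeds the mean Earth-Moon distance of about 384,400 km
thickness_km = 0.1e-6 * 2**42  # 0.1 mm expressed in kilometers
assert thickness_km > 384_400
```

Each assertion passes, so at least these four of the birthday-party facts hold up to direct computation.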
would ask 42 questions to every dying person, 00:46:25.600 |
and those answering successfully would become stars, 00:46:27.960 |
continue to give life and fuel universal growth. 00:46:30.440 |
In Judaic tradition, God ascribed the 42-lettered name 00:46:35.720 |
and trusted only to the middle-aged, pious, meek, 00:46:38.080 |
free from bad temper, sober, and not insistent on his rights. 00:46:42.120 |
And in Christian tradition, there's 42 generations 00:46:53.680 |
In Qabbalistic tradition, Eloqa, which is 42, 00:46:56.880 |
is the number with which God creates the universe, 00:47:00.000 |
starting with 25, let there be, and ending with 17, good. 00:47:10.600 |
which is the first Indian religious scripture, 00:47:14.640 |
thus introducing Buddhism to China from India. 00:47:29.560 |
A yeast cell, which is a single-celled eukaryote 00:47:42.920 |
So I guess what you're saying is just a random number. 00:47:52.160 |
So anyway, so now that we've spoken about why 42, 00:48:00.160 |
will that search ultimately lead to our destruction? 00:48:08.560 |
is something that's so inherent to human nature. 00:48:15.000 |
And that searching for meaning is actually the point. 00:48:21.680 |
Don't ever be satisfied that, you know, I've got it. 00:48:52.200 |
which is happening for humanity itself to find our meaning. 00:48:55.960 |
And we as humans like to look at animals and say, 00:49:09.440 |
You know, remember our joke with the cat and the dog. 00:49:15.200 |
And I'm noticing the yin yang symbol right here 00:49:20.640 |
with this whole panda, black and white and the zero one zero. 00:49:25.000 |
And also the ASCII value for a star symbol. 00:49:32.920 |
the search for meaning and the act of searching 00:49:37.160 |
for something more meaningful is life's meaning by itself. 00:49:42.160 |
The fact that we kind of always hope that yes, 00:49:48.040 |
but maybe humans have something that we should be doing 00:49:55.480 |
It's not just about strength and feeding, et cetera. 00:49:58.400 |
Like we're the one species that spends such a tiny little 00:50:01.080 |
minority of its time feeding that we have this enormous, 00:50:08.840 |
that we can just use for all kinds of other stuff. 00:50:12.960 |
That's where, you know, the healthy mind comes in with, 00:50:16.440 |
you know, exploring all of these different aspects 00:50:18.360 |
that are just not directly tied to a purpose. 00:50:43.400 |
- Is it possible that we're the most beautiful thing 00:50:55.120 |
The dinosaurs ruled the earth for 135 million years. 00:51:11.040 |
as tiny little creatures for 30 million years, 00:51:18.040 |
Out of these mammals came the neocortex formation. 00:51:25.600 |
which is sort of the outer layer of our brain 00:51:28.080 |
compared to our quote unquote reptilian brain, 00:51:29.880 |
which we share the structure of with all of the dinosaurs, 00:51:33.200 |
they didn't have that, and yet they ruled the planet. 00:51:36.040 |
So how many other planets have still, you know, 00:51:38.760 |
mindless dinosaurs where strength was the only dimension 00:51:45.480 |
So there was something weird that annihilated the dinosaurs. 00:51:51.080 |
of sort of God coming and wiping out his creatures 00:51:55.080 |
So the mammals basically sort of took over the planet 00:52:10.560 |
and humans among primates have just exploded that hardware. 00:52:24.200 |
It's initially selected with this very simple 00:52:28.160 |
Darwinian view of the world of random mutation, 00:52:33.800 |
and then selection for making more of yourself. 00:52:46.720 |
an enormous amount of energy on this apparatus 00:52:55.680 |
like some enormous percentage of our calories 00:53:01.600 |
No other species makes that big of a commitment. 00:53:08.840 |
for efficiency on the metabolic side for humanity 00:53:15.400 |
And our brain is both enormously more efficient 00:53:18.680 |
than other brains, but also, despite this efficiency, 00:53:33.280 |
before it could no longer go through the pelvic opening 00:53:42.200 |
effectively creating just so much more capacity. 00:53:46.640 |
The evolutionary context in which this was made 00:53:57.240 |
that we have now killed off or that have gone extinct. 00:54:00.440 |
And that has now created this weird place of humans 00:54:13.280 |
that perhaps has not been recreated elsewhere. 00:54:25.560 |
However, we're not as unique as we like to think 00:54:37.800 |
where you have a neocortex-like explosion of, 00:54:44.720 |
and we're now competing on larger and larger groups 00:54:48.800 |
and being able to coordinate and being able to have empathy. 00:54:53.800 |
The concept of empathy, the concept of an ego, 00:55:06.240 |
another person's intentions to understand what they mean 00:55:15.160 |
So me being able to sort of create a mental model 00:55:33.520 |
which basically means passion, pathos, suffering, 00:55:39.120 |
So basically empathy means feeling what you're feeling, 00:55:56.680 |
So life itself, in my view, is inevitable on every planet. 00:56:05.440 |
But the evolution of life to self-awareness and cognition 00:56:09.920 |
and all the incredible things that humans have done, 00:56:15.440 |
So if you were to sort of estimate and bet some money on it, 00:56:24.920 |
would what we got now be the most special thing 00:56:42.800 |
Basically, I feel that the longevity of dinosaurs 00:57:10.440 |
if the universe has a lot of alien life forms in it, 00:57:16.480 |
And one thought is that there's a great filter 00:57:21.240 |
that basically would destroy intelligent civilizations. 00:57:25.560 |
This thing that we, you know, this multifolding brain 00:57:28.640 |
that keeps growing may not be such a big feature. 00:57:38.440 |
that is a very short one with a quick dead end. 00:57:58.840 |
is when somebody should have been paying attention 00:58:07.260 |
is a long, long time ago in a galaxy far, far away. 00:58:27.760 |
multicellular perhaps, and so on and so forth. 00:58:31.600 |
But the fact that humanity has only been listening 00:58:39.360 |
means that any of these, you know, alien civilizations 00:58:44.100 |
to every single insignificant planet out there. 00:58:47.200 |
And, you know, again, I mean, the movie "Contact" 00:58:52.400 |
This whole concept of we don't need to travel physically. 00:58:57.720 |
We can send instructions for people to create machines 00:59:01.080 |
that will allow us to beam down light and recreate ourselves. 00:59:04.880 |
And in the book, you know, the aliens actually take over. 00:59:09.840 |
But, you know, this concept that we have to eventually 00:59:18.600 |
- So you have a hope, well, you said think, so. 00:59:35.360 |
So basically humans have won the battle for, you know, 00:59:41.400 |
It wasn't necessarily the case with dinosaurs. 00:59:50.400 |
And if you look at Jurassic Park, yeah, sure, whatever. 00:59:53.840 |
But, you know, they just don't have the hardware for it. 01:00:06.120 |
Like basically there's universes where strength won out. 01:00:11.680 |
particular version of whatever happened in this planet, 01:00:20.220 |
But it's kind of like living in Boston instead of, 01:00:36.080 |
- But the flip side of that is that this hardware 01:00:58.120 |
has dramatically reduced violence, dramatically. 01:01:08.880 |
the state basically owns the right to violence. 01:01:20.440 |
but violence has been eliminated by that state. 01:01:36.840 |
is something that has led so much to less violence, 01:02:11.480 |
- I probably imagine there were two dinosaurs 01:02:14.300 |
back in the day having this exact conversation 01:02:18.600 |
and there seems to be something like an asteroid 01:02:24.760 |
that the arc of our society of human civilization 01:02:44.160 |
You could imagine that a pandemic would be more destructive 01:02:54.480 |
which I recently learned is not that easy to-- 01:03:15.400 |
- So we have 48 years to get our act together. 01:03:18.560 |
It's not like some distant, distant hypothesis. 01:03:27.200 |
- Oh gosh, I'm so happy with where we are now. 01:03:34.040 |
the speed with which knowledge has been transferred, 01:03:38.440 |
what has led to humanity making so many advances so fast? 01:03:44.360 |
So what has led to humanity making so many advances 01:03:52.600 |
So by hardware upgrades, I basically mean our neocortex 01:03:55.520 |
and the expansion and these layers and folds of our brain 01:04:04.520 |
the hardware hasn't changed much in the last, 01:04:22.440 |
What has changed is that we are growing up in societies 01:04:35.760 |
The concept that our brain has not fully formed, 01:04:43.600 |
So we basically have a good 16 years, 18 years 01:04:51.240 |
If you look at what happened in ancient Greece, 01:05:05.680 |
This whole concept of creating abstract notions, 01:05:14.960 |
and layers of meaning and layers of abstraction 01:05:26.800 |
There's no such thing as sort of expressing these ideals 01:05:42.160 |
Your brain has trouble getting at that concept. 01:05:55.360 |
human culture, human environment, human education 01:06:00.080 |
have basically led to this enormous explosion of knowledge. 01:06:11.320 |
and the printed press, the dissemination of knowledge, 01:06:13.800 |
you basically now have this whole horizontal dispersion 01:06:17.080 |
of ideas in addition to the vertical inheritance of genes. 01:06:28.720 |
And the reason why human civilization exploded 01:06:34.560 |
So if you're looking at now where we are today, 01:06:42.520 |
A hundred years ago, it would have, but it didn't. 01:06:54.920 |
And now less than a year later, 10 months later, 01:06:58.160 |
we already have a working vaccine that's 90% effective. 01:07:05.960 |
So the asteroid, yes, could wipe us out in 48 years, 01:07:11.000 |
I mean, look at where we were 48 years ago, technologically. 01:07:22.960 |
The technological revolutions of digitization, 01:07:27.320 |
the amount of compute power we can put on any, 01:07:40.560 |
We all have our little problems going back and forth 01:07:46.240 |
and on the cognitive and on the sort of human side 01:07:49.280 |
and the societal side, but science has not slowed down. 01:07:57.360 |
So Elon is now putting rockets out from the private space. 01:08:01.680 |
I mean, that now democratization of space exploration 01:08:06.040 |
is gonna revolutionize. - It's gonna explode. 01:08:09.800 |
- In the same way that every technology has exploded, 01:08:12.600 |
this is the shift to space technology exploding. 01:08:22.600 |
- Are you excited by the possibility of a human, 01:08:28.440 |
and two, possible colonization of not necessarily Mars, 01:08:50.000 |
about two to three million people lived here. 01:08:58.280 |
Would you wait until the Declaration of Independence? 01:09:04.000 |
They'll probably get a bunch of younger people. 01:09:09.800 |
- But wisdom can be transferred horizontally. 01:09:23.720 |
I mean, just, I mean, you know you can watch a live feed 01:09:47.280 |
But I think our species is in for big, good things. 01:09:54.800 |
I think that we will overcome our little problems 01:10:01.400 |
I feel that we're definitely on the path to that. 01:10:04.360 |
And it's just not permeated through the whole universe yet, 01:10:17.640 |
How exactly are we special relative to the dinosaurs? 01:10:41.000 |
the lion is smelling the molecules in the environment. 01:10:54.520 |
The target is constantly looking around and sensing. 01:10:58.440 |
I've actually been in Kenya and I've kind of seen the hunt. 01:11:03.000 |
So I've kind of seen the sort of game of waiting. 01:11:07.360 |
And the mitochondria in the muscles of the lion 01:11:18.360 |
They're expending an enormous amount of energy. 01:21:21.520 |
The grass as it's flowing is constantly transforming 01:11:34.640 |
and eventually feeds the lions and so on and so forth. 01:11:52.520 |
The chloroplasts are only experiencing one layer. 01:12:00.640 |
like the lion always attacks against the wind 01:12:04.080 |
Like all of these things are one layer at a time. 01:12:10.160 |
And we humans somehow perceive the whole stack. 01:12:20.640 |
you basically have a physical layer that you start with. 01:12:28.720 |
you have basically gates and logic and an assembly layer. 01:12:33.720 |
you have your higher order, higher level programming. 01:12:38.600 |
you have your deep learning routine, et cetera. 01:12:41.040 |
you eventually build a cognitive system that's smart. 01:12:46.060 |
I want you to now picture this cognitive system 01:12:51.920 |
but also becoming aware of the hardware that it's made of 01:12:56.880 |
and the atoms that it's made of and so on and so forth. 01:13:03.680 |
and there's this beautiful scene in "2001: A Space Odyssey" 01:13:08.680 |
where HAL, after Dave starts disconnecting him, 01:13:13.060 |
is starting to sing a song about daisies, et cetera. 01:13:30.880 |
of knowing that the hardware is no longer there, is amazing. 01:13:48.740 |
about sort of the eventual cognitive leap to self-awareness. 01:13:57.280 |
actually breaking through these layers and saying, 01:14:00.240 |
a slightly better hardware to get me functioning better." 01:14:16.520 |
that can integrate equations and it's man-made, 01:14:22.140 |
We share this cognitive layer of playing chess. 01:14:26.560 |
We're not the only thing on the planet that plays chess. 01:14:31.280 |
- But in some sense, that particular organism, AI, 01:14:43.800 |
A bat is doing this incredible integration of signals, 01:14:50.020 |
It's basically constantly sending echo location, waves, 01:14:59.200 |
are operating at slightly different frequencies 01:15:10.760 |
All they know is that they have a 3D view of space 01:15:13.920 |
around them, just like any gazelle walking through, 01:15:19.000 |
And any baby looking around is aware of things 01:15:23.640 |
without doing the math of how am I processing 01:15:29.000 |
You're just aware of the layer that you live in. 01:15:36.120 |
we've basically managed through our cognitive layer, 01:15:39.120 |
through our perception layer, through our senses layer, 01:15:42.040 |
through our multi-organ layer, through our genetic layer, 01:15:47.040 |
through our molecular layer, through our atomic layer, 01:15:54.360 |
through even the very fabric of the space-time continuum, 01:16:00.680 |
So as we're watching that scene in the Serengeti, 01:16:07.360 |
we as, you know, anyone who's finished high school 01:16:12.120 |
of all of these different layers interplaying together. 01:16:16.720 |
in perhaps not just the galaxy, but maybe even the cosmos. 01:16:30.780 |
And that's what I love about particle physics, 01:16:47.880 |
of increasingly large particles before that explosion 01:16:53.640 |
And it's only through understanding the very large 01:16:57.220 |
that we understand the very small and vice versa. 01:17:04.880 |
As you are watching the Kilimanjaro Mountain, 01:17:27.000 |
We are aware of the eons that have happened on Earth 01:17:36.060 |
the same way that we're aware of the Big Bang 01:17:54.360 |
I mean, that would be magician stuff in ancient times. 01:17:58.800 |
So what I love about humanity and its role in the universe 01:18:05.840 |
he's like, "Finally, somebody figured it out. 01:18:08.240 |
"I've been building all these beautiful things 01:18:22.000 |
and us humans are able to convert those layers 01:18:35.200 |
whether we live in a simulation, for example. 01:18:37.640 |
I mean, realize that we are living in a simulation. 01:18:45.740 |
without any sort of person programming this is a simulation. 01:18:49.580 |
Like basically what happens inside your skull? 01:18:55.780 |
which are translated into perceptory signals, 01:18:58.580 |
which are then translated into a conceptual model 01:19:15.760 |
you can think of the reality that we live in as a matrix, 01:19:20.500 |
but we've actually broken through the matrix. 01:19:30.120 |
to basically give us the blue pill or the red pill. 01:19:32.540 |
We were able to sufficiently evolve cognitively 01:19:43.060 |
to basically get at breaking through the matrix, 01:19:47.820 |
and realizing that we are this thing in there. 01:19:51.220 |
And yet that thing in there has a consciousness 01:19:58.480 |
We are the only thing that we even can think of 01:20:13.780 |
and realizing what we're really, really made of. 01:20:16.900 |
And the next frontier is of course, cognition. 01:20:38.820 |
I mean, that's, you know, really the next frontier. 01:20:41.060 |
So in terms of these peeling off layers of complexity, 01:20:47.420 |
and the reasoning layer or the computational layer, 01:20:52.420 |
there's still some stuff to be figured out there. 01:20:56.020 |
of sort of completing our journey through that matrix. 01:21:18.400 |
in our cognitively capable artificial systems 01:21:35.440 |
this is not really intelligent because we coded it up. 01:21:38.900 |
And we've just put in these little parameters there 01:21:41.480 |
and there's like, you know, what, 6 billion parameters. 01:21:54.200 |
They're just made out of neurons and, you know, 01:22:06.400 |
compared to the complexity of our cognitive apparatus. 01:22:13.440 |
are in fact encoding all of our cognitive functions. 01:22:26.760 |
in the same way that when we build, you know, 01:22:29.360 |
these conversational systems or these cognitive systems 01:22:47.640 |
in the grocery bags when you have both cold and hot 01:22:54.560 |
No, no, you basically now just program the primitives 01:23:07.200 |
the result of it being deployed into the world 01:23:18.800 |
it's not yet able to be cognizant of all the other layers 01:23:35.040 |
on which it runs, the electricity on which it runs yet. 01:23:41.080 |
we basically have the same cognitive architecture 01:23:53.080 |
again, it's the same architecture, just more of it. 01:24:04.040 |
do you really need fundamentally different architectures 01:24:16.480 |
So, you know, there's something to be said about 01:24:34.360 |
from the great apes, except for just a ton more of it. 01:24:37.960 |
- Yeah, it's interesting that in the AI community, 01:24:44.240 |
but the notion that GPT-10 will achieve general intelligence 01:24:51.840 |
that there has to be something totally different 01:25:01.000 |
this very simple thing, this very simple architecture, 01:25:09.000 |
- And people think the same way about humanity 01:25:13.080 |
They're like, "Oh, consciousness might be quantum 01:25:18.120 |
And it's like, or it could just be a lot more 01:25:21.560 |
of the same hardware that now is sufficiently capable 01:25:25.760 |
of self-awareness just because it has the neurons to do it. 01:25:29.320 |
So maybe the consciousness that is so elusive 01:25:32.720 |
is an emergent behavior of you basically string together 01:25:38.200 |
all these cognitive capabilities that come from running, 01:25:47.240 |
All of these things are just like great lookup tables 01:25:54.760 |
of the different types of excitatory and inhibitory neurons, 01:25:57.160 |
the waveforms that sort of shine through the connections 01:26:14.200 |
which are self-organized and shaped by their environment. 01:26:19.200 |
Babies that are growing up today are listening to language 01:26:28.400 |
Basically, as soon as the auditory apparatus forms, 01:26:32.200 |
it's already getting shaped to the types of signals 01:26:37.120 |
So it's not just like, oh, have an Egyptian be born 01:26:40.440 |
It's like, no, that Egyptian would be listening in 01:26:44.200 |
to the complexity of the world and then getting born 01:26:44.200 |
and sort of seeing just how much more complex the world is. 01:26:49.520 |
So it's a combination of the underlying hardware, 01:26:57.440 |
in my view, the hardware gives you an upper bound 01:27:02.160 |
but it's the environment that makes those capabilities shine 01:27:06.880 |
So we're a combination of nature and nurture. 01:27:10.960 |
The nature is our genes and our cognitive apparatus, 01:27:15.240 |
and the nurture is the richness of the environment 01:27:18.520 |
that makes that cognitive apparatus reach its potential. 01:27:22.040 |
And we are so far from reaching our full potential, so far. 01:27:27.040 |
I think that kids being born a hundred years from now, 01:27:36.400 |
I can't believe people were not wired into this, 01:27:39.200 |
you know, virtual reality from birth as we are now, 01:27:42.640 |
'cause like they're clearly inferior and so on and so forth. 01:27:48.000 |
will continue exploding and our cognitive capabilities, 01:27:53.000 |
it's not like, oh, we're only using 10% of our brain. 01:27:56.040 |
Of course, we're using a hundred percent of our brain, 01:28:03.800 |
but the software in a quickly advancing environment, 01:28:17.120 |
will look very different a hundred years from now, 01:28:25.440 |
At the core of this is kind of a notion of ideas 01:28:34.960 |
but Richard Dawkins talks about the notion of memes 01:28:45.120 |
you know, multiplying, selecting in the minds of humans. 01:28:49.120 |
Do you ever think about ideas from that perspective, 01:28:59.520 |
I love the concept of these horizontal transfer of ideas 01:29:11.160 |
So you can think of sort of the cognitive space 01:29:24.840 |
well beyond what was thought to be ever capable 01:29:28.880 |
when the concept of a meme was created by Richard Dawkins. 01:29:39.240 |
which is the horizontal transfer of humans with fellowships. 01:30:19.640 |
So those cognitive systems that think of, you know, 01:30:32.120 |
might go to, I don't know, Stanford or CMU or MIT. 01:30:44.680 |
that love these ideas and feed on these ideas 01:30:47.240 |
and understand these ideas and appreciate these ideas, 01:30:52.080 |
So you basically have students coming to Boston to study, 01:30:59.280 |
And they're selected based on their cognitive output 01:31:25.840 |
cognitive interconnection system of the planet, 01:31:40.080 |
who came to be a student, kind of like myself, 01:31:58.000 |
surrounded by other cognitive systems of a similar age, 01:32:02.440 |
with parents who love these types of thinking and ideas. 01:32:06.320 |
And you basically have a whole interbreeding now 01:32:09.200 |
of genetically selected transfer of cognitive systems, 01:32:14.200 |
where the genes and the memes are co-evolving 01:32:37.000 |
meat cognitive systems to physical locations. 01:32:42.360 |
the biology ones cluster in a certain building too, 01:32:45.280 |
so within that there's clusters on top of clusters 01:32:55.280 |
because people now form groups on the internet 01:33:03.760 |
these cognitive systems can collect themselves 01:33:08.600 |
and breed together in different layers of spaces. 01:33:17.120 |
So basically there's the physical rearrangement, 01:33:26.760 |
Doesn't need to belong to only one community. 01:33:31.560 |
but you can also hang out in the biology department, 01:33:35.800 |
I don't know, poetry department readings and so on so forth, 01:33:45.280 |
and are now interbreeding these ideas in a whole other way. 01:34:01.720 |
- And sometimes these cognitive systems hold conferences 01:34:09.040 |
and they're all like listening and then they discuss 01:34:13.120 |
- No, but then that's where you find students 01:34:18.240 |
I go through the posters where I'm on a mission. 01:34:27.640 |
but I make it a point to just go poster after poster 01:34:31.800 |
And I find some gems and students that I speak to 01:34:37.800 |
And then sort of you're sort of creating this permeation 01:34:48.280 |
and very often of moral values, of social structures, 01:34:52.960 |
of, you know, just more imperceptible properties 01:34:57.960 |
of these cognitive systems that simply just cling together. 01:35:04.760 |
I have the luxury at MIT of not just choosing smart people, 01:35:09.120 |
but choosing smart people who I get along with, 01:35:12.680 |
who are generous and friendly and creative and smart 01:35:22.280 |
in their uninhibited behaviors and so on and so forth. 01:35:26.000 |
So you basically can choose yourself to surround, 01:35:31.240 |
with people who are not only cognitively compatible, 01:35:39.760 |
through the meta cognitive systems compatible. 01:35:43.560 |
And again, when I say compatible, not all the same. 01:35:46.720 |
Sometimes, you know, not sometimes, all the time, 01:35:50.560 |
the teams are made out of complementary components, 01:35:53.520 |
not just compatible, but very often complementary. 01:35:56.240 |
So in my own team, I have a diversity of students 01:36:13.360 |
is the fact that not only do we have a common mission, 01:36:20.360 |
view of the world, but that we're complementary 01:36:24.600 |
in our skills, in our angles with which we accommodate, 01:36:38.040 |
so you meet because there's some common thing, 01:36:40.800 |
but you stick together because you're different 01:36:48.440 |
I mean, we adore each other, like, to pieces, 01:36:51.840 |
but we're also extremely different in many ways. 01:37:00.760 |
I'm like living out there in the world of ideas 01:37:15.480 |
Basically, I need her as much as she needs me, 01:37:18.120 |
and she loves interacting with me and talking. 01:37:20.200 |
I mean, you know, last night, we were talking about this, 01:37:38.280 |
and we're like, you know, bouncing ideas, et cetera. 01:37:41.040 |
So, you know, we have extremely different perspectives, 01:37:44.560 |
but very common, you know, goals and interests, and anyway. 01:37:49.560 |
- What do you make of the communication mechanism 01:37:54.160 |
'Cause like one essential element of all of this 01:37:57.320 |
is not just that we're able to have these ideas, 01:38:06.080 |
but we seem to use language to share the ideas. 01:38:10.080 |
Maybe we share them in some much deeper way than language, 01:38:13.440 |
But what do you make of this whole mechanism, 01:38:15.800 |
and how fundamental it is to the human condition? 01:38:23.160 |
and your thoughts cannot form outside language. 01:38:35.240 |
I don't dream in words, I dream in shapes, and forms, 01:38:38.360 |
and, you know, three-dimensional space with extreme detail. 01:38:44.320 |
in the middle of the night, I actually record my dreams. 01:38:47.160 |
Sometimes I write them down in a Dropbox file. 01:38:50.040 |
Other times I'll just dictate them in, you know, audio. 01:38:53.940 |
And my wife was giving me a massage the other day, 01:39:15.480 |
you know, three-dimensional shape, and dream, and concept. 01:39:20.200 |
And in the same way, when I'm thinking of ideas, 01:39:25.320 |
I mean, I will describe them with a thousand words, 01:39:46.880 |
And the language certainly gives you the apparatus 01:40:05.800 |
that you then build upon with all kinds of other things. 01:40:09.080 |
So there's this co-evolution again of ideas and language, 01:40:16.520 |
Now let's talk about language itself, words, sentences. 01:40:34.120 |
and my face is changing through all kinds of emotions, 01:40:37.120 |
and my entire body composition posture is reshaped, 01:40:45.360 |
the softer and the louder and the this and that 01:40:59.520 |
there's a lot of grunting, there's a lot of posturing, 01:41:02.400 |
there's a lot of sort of shrieking, et cetera. 01:41:04.760 |
They have a lot of components of our human language, 01:41:21.420 |
but also of course, the, you know, GPT-3 component. 01:41:36.840 |
And what I love about humanity is that we have both. 01:41:42.400 |
We're a grunting, emotionally charged, you know, 01:42:01.160 |
- It does seem like we're able to somehow transfer 01:42:08.800 |
is always a giant knowledge base of like shared experiences, 01:42:16.520 |
but I don't know, the knowledge of who the last three, 01:42:22.240 |
and just all the, you know, 9/11, the tragedies in 9/11, 01:42:32.640 |
and somehow enrich the ability to transfer information. 01:42:41.920 |
and that evokes all these feelings that you had 01:42:45.520 |
- We're both visualizing that, maybe in different ways. 01:42:50.400 |
And not only that, but the feeling is brought back. 01:42:50.400 |
Brought back up, just like you said, with the dreams. 01:42:55.400 |
- Now let's talk about Neuralink for a second. 01:43:14.600 |
The concept of Neuralink is that I'm gonna take 01:43:25.240 |
and extremely sort of, you know, appealing concept, 01:43:29.180 |
but I see a lot of challenges surrounding that. 01:43:36.780 |
how knowledge is encoded in a person's brain. 01:43:40.140 |
I mean, I told you about this paper that we had recently 01:43:54.300 |
and select those neurons that fire by marking them, 01:43:56.820 |
and then see what happens when they first fire, 01:44:04.900 |
and then there's the memory consolidation neurons. 01:44:09.940 |
of sort of the distributed nature of knowledge encoding 01:44:18.020 |
And the concept that we'll understand that sufficiently 01:44:26.700 |
of what does that scene from Dave losing his mind, 01:44:41.440 |
But now imagine, suppose that we solve this problem, 01:44:45.840 |
and the next enormous challenge is how do I go 01:44:51.080 |
to now create the same exact neural connections? 01:44:56.360 |
So basically it's not just reading, it's now writing. 01:45:05.200 |
And number three, who says that the way that you encode, 01:45:17.320 |
Basically, maybe the way that I'm encoding it 01:45:27.120 |
and yours is twisted with your childhood memories 01:45:31.200 |
And there's no way that I can take my encoding 01:45:38.120 |
and B, be incompatible with your own unique experiences. 01:45:46.240 |
You're reminding us that there's two biological systems 01:46:04.440 |
So where one side of that is a little bit more controllable, 01:46:08.820 |
but even just that is exceptionally difficult. 01:46:13.320 |
- Let's talk about two neuronal systems talking to each other. 01:46:23.000 |
- I have 10 times more hardware, I'm ready, just feed me. 01:46:27.320 |
Is it gonna say, "Oh, here's my 10 billion parameters?" 01:46:32.720 |
The simplest way, and perhaps the fastest way, 01:46:35.080 |
for GPT-3 to transfer all its knowledge to its older body 01:46:39.860 |
is to regenerate every single possible human sentence 01:46:48.140 |
- Keep talking, and just re-encode it all together. 01:46:59.660 |
that kind of make sense in my cognitive frame, 01:47:08.580 |
might actually be the most efficient way to do it. 01:47:14.580 |
So talking back and forth, asking questions, interrupting. 01:47:29.780 |
there's all kinds of misinterpretations that happen. 01:47:33.460 |
That, you know, basically when my students speak, 01:47:44.820 |
which I know is different from what you said. 01:48:01.940 |
with full neural network parameters back and forth 01:48:09.060 |
would be far inferior because the re-encoding 01:48:18.860 |
from our unique experiences through our shared experiences, 01:48:49.060 |
that's sort of what really creates a whole new level 01:48:52.180 |
of human experience through this reasoning layer 01:48:57.180 |
and this computational layer that obviously lives 01:49:04.860 |
- So you're one of these aforementioned cognitive systems, 01:49:21.140 |
What do you, in your brief time here on Earth, 01:49:27.240 |
so what do you hope this world will remember you as? 01:49:34.680 |
- I don't think of legacy as much as maybe most people. 01:49:47.840 |
Many students tell me, oh, give us some career advice. 01:50:08.220 |
and just, there's a conscious decision we can make 01:50:13.980 |
which again goes back to the I'm the lucky one 01:50:19.540 |
Of living in the present and being happy winning 01:50:24.260 |
and there's a certain freedom that comes with that, 01:50:55.260 |
of the universe, of the awesomeness of the planet 01:51:17.700 |
You can't be upset at people if you truly love them. 01:51:25.740 |
and yet when you see them, you just see them with love, 01:51:34.580 |
when I look at my three-year-old who's screaming. 01:51:39.700 |
she's still screaming and saying, "No, no, no, no, no." 01:51:42.940 |
And I'm like, "I love you, genuinely love you." 01:51:47.060 |
But I can sort of kind of see that your brain 01:51:49.580 |
is kind of stuck in that little mode of anger. 01:51:53.860 |
And there's plenty of people out there who don't like me, 01:52:09.020 |
So there's that aspect of sort of experiencing 01:52:23.780 |
who would basically say, "Oh, I love it when I'm wrong 01:52:31.140 |
I mean, she's really one of the smartest people 01:52:32.700 |
I've ever met, and she was like, "Oh, it's such a good feeling." 01:52:36.140 |
And I love being wrong, but there's something 01:52:41.140 |
about self-improvement, there's something about sort of 01:52:46.900 |
but attempt the most rights and do the fewest wrongs, 01:52:50.860 |
but with the full knowledge that this will happen. 01:53:04.740 |
really, thanks to you and through this podcast, 01:53:09.940 |
who will basically comment, "Wow, I've been following 01:53:14.860 |
or, "Wow, this guy has inspired so many of us 01:53:17.340 |
"in computational biology," and so on and so forth. 01:53:21.980 |
"but I'm only discovering this now through this sort of 01:53:25.220 |
"sharing our emotional states and our cognitive states 01:53:32.900 |
I'm sort of realizing that, wow, maybe I've had a legacy. 01:53:39.100 |
of students from MIT, and I've put all of my courses 01:53:47.740 |
So basically, all of my video recordings of my lectures 01:53:52.220 |
So countless generations of people from across the world 01:53:57.940 |
I was at this conference where somebody heard my voice 01:54:11.400 |
which students will get where from whatever they catch 01:54:15.460 |
out of these lectures, even if what they catch 01:54:20.720 |
So there's this intangible, you know, legacy, 01:54:29.720 |
One of my friends from undergrad basically told me, 01:54:43.440 |
And there's that aspect of sort of just motivating people 01:54:47.040 |
with your kindness, with your passion, with your generosity, 01:54:50.640 |
and with your, you know, just selflessness of, you know, 01:54:58.040 |
I've been to conferences where basically people will, 01:55:05.480 |
where I asked somebody a question, they said, 01:55:07.120 |
"Oh, in fact, this entire project was inspired 01:55:09.160 |
"by your question three years ago at the same conference." 01:55:13.480 |
- And then on top of that, there's also the ripple effect. 01:55:23.960 |
that through from you just this one individual first drop. 01:55:36.680 |
and genetic variants with very recent ancestors 01:55:41.040 |
So even if I die tomorrow, my genes are still shared 01:55:49.960 |
And of course, I'm lucky enough to have my own children, 01:55:52.740 |
but even if you don't, your genes are still permeating 01:55:57.560 |
- So your genes will have the legacy there, yeah. 01:56:03.680 |
are constantly intermingling with each other. 01:56:08.760 |
100 years from now who will not be directly impacted 01:56:12.000 |
by every one of the planet living here today. 01:56:18.520 |
- That's cool to think that your ideas, Manolis Kellis, 01:56:18.520 |
would touch every single person on this planet. 01:56:33.920 |
So there's this interconnectedness of humanity. 01:56:36.320 |
And then I'm also a professor, so my day job is legacy. 01:56:41.420 |
My day job is training, not just the thousands of people 01:56:50.360 |
who basically come to MIT to learn from a bunch of us. 01:57:02.760 |
That's what makes America the beacon of the world. 01:57:05.240 |
We don't just export goods, we export people. 01:57:15.360 |
and we also export training that people born elsewhere 01:57:19.040 |
will come here to get, and will then disseminate 01:57:28.760 |
that you cannot stop with political isolation, 01:57:33.480 |
That's something that will continue to happen 01:57:35.720 |
through all the people we've touched through our universities. 01:57:40.960 |
who are basically now going off and teaching their classes, 01:57:44.280 |
and I've trained generations of computational biologists. 01:57:56.240 |
"that's what got me into the field 15 years ago." 01:58:00.200 |
And then there's the academic family that I have. 01:58:04.800 |
So the students who are actually studying with me, 01:58:09.320 |
So this sort of mentorship of ancient Greece, this. 01:58:24.400 |
this bond of you're part of the Kellis family. 02:08:24.400 |
So I've trained people who are now professors at Stanford, 01:58:40.920 |
CMU, Harvard, WashU, I mean, everywhere in the world. 01:58:58.400 |
'cause you don't have to wait for the 18 years 01:59:03.880 |
to sort of have amazing conversation with people. 01:59:07.400 |
These are fully grown humans, fully grown adults, 01:59:17.800 |
and I'm like, "I can see the touch of our lab 01:59:34.200 |
is an experience with every one of these students. 01:59:37.480 |
So I always tell them to write the whole first draft, 01:59:40.560 |
and they know that I will rewrite every word. 01:59:45.840 |
and what I do is these like joint editing sessions 01:59:56.680 |
and I'm just thinking out loud as I'm doing this, 02:00:05.080 |
I'm sort of, "Well, that's not how you write this. 02:00:24.040 |
No, what you have to do is extract the meaning, 02:00:34.640 |
which is infinite because they've now gone off on the, 02:00:50.280 |
The way that they learn is through the mens et manus, 02:00:56.000 |
It's the practical training of actually doing research. 02:00:59.360 |
And that research is a beneficial side effect 02:01:06.640 |
that will now tell other people how to think. 02:01:10.900 |
There's this paper we just posted recently on medRxiv, 02:01:10.900 |
and one of the most generous and eloquent comments about it 02:01:24.920 |
It's just so fulfilling from a person I've never met. 02:01:31.860 |
but it's "Single-Cell Dissection of Schizophrenia Reveals." 02:01:42.780 |
Like there's some individuals who are schizophrenic, 02:01:50.040 |
or initial cell state, which we believe is protective. 02:02:06.120 |
that are basically sending these inhibitory brain waves 02:02:14.920 |
there's a set of master regulators that we discovered 02:03:02.100 |
These are people who used it to write something based on it. 02:03:12.960 |
So I don't think of my legacy as I live every day. 02:03:41.320 |
we would be amiss if we did not have at least a poem or two. 02:03:57.120 |
I remember when you were talking with Eric Weinstein 02:04:03.740 |
about this comment of Leonard Cohen that says, 02:04:09.400 |
"But you don't really care for music, do ya?" 02:04:11.920 |
In "Hallelujah," that's basically kind of like 02:04:23.560 |
and there's this other friend who was coming to visit me. 02:04:27.000 |
And she said, "I will not come unless you write me a poem." 02:04:33.040 |
And I was like, "Oh, writing a poem on demand." 02:04:40.960 |
It goes, "Write me a poem," she said with a smile. 02:04:44.280 |
"Make sure it's pretty, romantic, and rhymes. 02:05:18.340 |
"Throw in some cute words, oh, here and there. 02:05:40.960 |
Three roses, white chocolate, vanilla powder, 02:05:51.540 |
"You must believe it straight from your heart. 02:06:02.880 |
"You're the stars and the moon and the birds way up high. 02:06:06.000 |
"You're my evening sweet song, my morning blue sky. 02:06:12.400 |
"You bring me my voice and scatter my thoughts. 02:06:15.560 |
"To put that love in writing, in vain, I can try. 02:06:19.080 |
"But when I'm with you, my wings want to fly. 02:06:31.280 |
- The baffled king composing, oh, that was beautiful. 02:06:42.200 |
So basically, when I write poems, I just type. 02:06:54.800 |
it's an emergent phenomenon. - It's an emergent phenomenon. 02:06:57.120 |
I just get into that mode, and then it comes out. 02:07:11.640 |
so it's very introspective in this whole concept. 02:07:16.500 |
So anyway, there's another one many years earlier 02:07:23.620 |
It's basically this whole concept of let's be friends. 02:07:56.120 |
with nothing sweet felt and nothing harsh told. 02:08:14.040 |
when you cover your passion in a bland friend's disguise. 02:08:21.080 |
Turn off the lights and rip off your fashion. 02:08:32.500 |
Don't try and protect me from love's cutting blade. 02:08:38.500 |
Don't spare me the passion to spare me the pains. 02:09:28.700 |
Manolis, thank you so much for talking today. 02:09:34.500 |
with Manolis Kellis, and thank you to our sponsors. 02:09:38.000 |
Grammarly, which is a service for checking spelling, 02:09:40.760 |
grammar, sentence structure, and readability. 02:09:50.220 |
Cash App, the app I use to send money to friends. 02:09:54.100 |
Please check out these sponsors in the description 02:09:56.360 |
to get a discount and to support this podcast. 02:09:59.380 |
If you enjoy this thing, subscribe on YouTube, 02:10:31.220 |
was muck about in the water having a good time. 02:10:34.900 |
But conversely, the dolphins had always believed 02:10:43.920 |
Thank you for listening, I hope to see you next time.