
Manolis Kellis: Meaning of Life, the Universe, and Everything | Lex Fridman Podcast #142


Chapters

0:00 Introduction
2:18 Music and life
41:51 The number 42
47:52 The question about the meaning of life
50:32 Are humans unique in the universe?
56:16 Human civilization
68:22 Mars
70:15 Human mind and the abstraction layers of reality
81:08 Neural networks and intelligence
88:25 Ideas as organisms
97:49 Language
109:04 Legacy
123:55 Poems


00:00:00.000 | The following is a conversation with Manolis Kellis,
00:00:02.960 | his fourth time on the podcast.
00:00:05.000 | He's a professor at MIT
00:00:06.880 | and head of the MIT Computational Biology Group.
00:00:10.420 | Since this is episode number 142,
00:00:14.440 | and 42, as we all know,
00:00:16.660 | is the answer to the ultimate question of life,
00:00:18.840 | the universe, and everything,
00:00:20.420 | according to the Hitchhiker's Guide to the Galaxy,
00:00:23.640 | we decided to talk about this unanswerable question
00:00:26.740 | of the meaning of life
00:00:28.120 | in whatever way we two descendants of apes could muster,
00:00:32.000 | from biology, to psychology, to metaphysics, and to music.
00:00:37.000 | Quick mention of each sponsor,
00:00:39.400 | followed by some thoughts related to the episode.
00:00:42.100 | Thanks to Grammarly,
00:00:44.160 | which is a service for checking, spelling,
00:00:47.000 | grammar, sentence structure, and readability,
00:00:49.880 | Athletic Greens, the all-in-one drink
00:00:52.520 | that I start every day with
00:00:54.080 | to cover all my nutritional bases,
00:00:56.280 | and Cash App, the app I use to send money to friends.
00:01:00.400 | Please check out these sponsors in the description
00:01:02.440 | to get a discount and to support this podcast.
00:01:05.680 | As a side note, let me say that the opening 40 minutes
00:01:08.360 | of the conversation are all about the many songs
00:01:11.480 | that formed the soundtrack to the journey of Manolis's life.
00:01:15.820 | It was a happy accident for me to discover
00:01:18.000 | yet another dimension of depth
00:01:20.440 | to the fascinating mind of Manolis.
00:01:22.720 | I include links to YouTube versions
00:01:24.840 | of many of the songs we mention in the description
00:01:28.160 | and overlay lyrics on occasion.
00:01:30.600 | But if you're just listening to this
00:01:31.960 | without listening to the songs or watching the video,
00:01:34.720 | I hope you still might enjoy, as I did,
00:01:37.020 | the passion that Manolis has for music,
00:01:39.600 | his singing of the little excerpts from the songs,
00:01:43.080 | and in general, the meaning we discuss
00:01:46.200 | that we pull from the different songs.
00:01:49.180 | If music is not your thing,
00:01:50.760 | I do give timestamps to the less musical
00:01:53.960 | and more philosophical parts of the conversation.
00:01:56.720 | I hope you enjoy this little experiment of a conversation
00:02:00.040 | about music and life.
00:02:02.940 | If you do, please subscribe on YouTube,
00:02:05.260 | review it with five stars on Apple Podcast,
00:02:07.760 | follow on Spotify, support on Patreon,
00:02:10.360 | or connect with me on Twitter @lexfridman.
00:02:13.280 | And now, here's my conversation with Manolis Kellis.
00:02:17.360 | You mentioned Leonard Cohen and the song "Hallelujah"
00:02:21.100 | as a beautiful song.
00:02:22.800 | So what are the three songs
00:02:26.120 | you draw the most meaning from about life?
00:02:29.160 | - Don't get me started.
00:02:31.320 | So there's really countless songs that have marked me,
00:02:34.660 | that have sort of shaped me in periods of joy
00:02:38.520 | and in periods of sadness.
00:02:40.280 | My son likes to joke that I have a song
00:02:43.040 | for every sentence he will say,
00:02:44.900 | 'cause very often I will break into a song
00:02:46.320 | with a sentence he'll say.
00:02:47.760 | (laughs)
00:02:48.760 | My wife calls me the radio
00:02:50.520 | 'cause I can sort of recite hundreds of songs
00:02:53.440 | that have really shaped me.
00:02:54.560 | So it's gonna be very hard to just pick a few.
00:02:56.800 | So I'm just gonna tell you a little bit
00:02:57.800 | about my song transition as I've grown up.
00:03:01.640 | In Greece, it was very much about,
00:03:03.720 | as I told you before, the misery, the poverty,
00:03:06.880 | but also overcoming adversity.
00:03:09.000 | So some of the songs that have really shaped me
00:03:11.920 | are "Kharis Alexiou," for example,
00:03:13.880 | is one of my favorite singers in Greece.
00:03:16.560 | And then there's also really just old traditional songs
00:03:19.600 | that my parents used to listen to.
00:03:21.120 | Like one of them is,
00:03:22.340 | ♪ An imoun plousios ♪
00:03:25.480 | Which is basically, oh, if I was rich.
00:03:28.840 | And the song is painting this beautiful picture
00:03:32.200 | about all the noises that you hear in the neighborhood,
00:03:34.960 | his poor neighborhood, the train going by,
00:03:37.540 | the priest walking to the church,
00:03:39.960 | and the kids crying next door and all of that.
00:03:43.080 | And he says, with all of that,
00:03:44.640 | I'm having trouble falling asleep and dreaming.
00:03:47.520 | ♪ If I was rich ♪
00:03:49.560 | And then he was like, you know, breaking into that.
00:03:52.400 | So it's this juxtaposition between the spirit
00:03:55.720 | and the sublime and then the physical and the harsh reality.
00:03:59.960 | It's just not having troubles, not being miserable.
00:04:03.360 | So basically, rich to him just means
00:04:05.200 | out of my misery, basically.
00:04:06.600 | And then also being able to travel,
00:04:10.240 | being able to sort of be the captain of a ship
00:04:12.280 | and see the world and stuff like that.
00:04:14.200 | So it's just such beautiful imagery.
00:04:16.200 | - So many of the Greek songs,
00:04:17.360 | just like the poetry we talked about,
00:04:19.040 | they acknowledge the cruelty, the difficulty of life,
00:04:22.560 | but are longing for a better life.
00:04:24.440 | - That's exactly right.
00:04:25.280 | And another one is "Phtohologia."
00:04:26.960 | And this is one of those songs
00:04:28.320 | that has a fast and joyful half
00:04:31.240 | and a slow and sad half.
00:04:33.600 | And it goes back and forth between them.
00:04:35.760 | And it's like,
00:04:36.600 | ♪ Phtohologia, ya sena kathe mou tragoudi ♪
00:04:41.200 | So poor, you know, basically,
00:04:42.740 | it's the state of being poor.
00:04:45.800 | I don't even know if there's a word for that in English.
00:04:50.160 | And then fast part is,
00:04:51.920 | ♪ Sta heria sou megalosan ke ponesan ke matosan ♪
00:04:56.240 | So then it's like, oh, you know,
00:04:59.120 | basically like the state of being poor and misery,
00:05:02.680 | you know, for you, I write all my songs, et cetera.
00:05:05.600 | And then the fast part is,
00:05:07.160 | in your arms grew up and suffered
00:05:11.600 | and, you know, stood up and, you know, rose.
00:05:16.160 | Men with clear vision.
00:05:18.400 | This whole concept of taking on the world
00:05:21.800 | with nothing to lose because you've seen the worst of it.
00:05:25.080 | This imagery of,
00:05:26.680 | ♪ Psila kiparissopula, hara sta koritsopula ♪
00:05:29.960 | So it's describing the young men as cypress trees.
00:05:34.120 | And that's probably one of my earliest exposures
00:05:35.840 | to a metaphor, to sort of, you know,
00:05:38.120 | this very rich imagery.
00:05:40.080 | And I love about the fact that
00:05:41.560 | I was reading a story to my kids the other day
00:05:43.520 | and it was dark.
00:05:44.800 | And my daughter, who's six, is like,
00:05:47.000 | oh, can I please see the pictures?
00:05:48.480 | And Jonathan, who's eight,
00:05:51.120 | so my daughter Cleo is like,
00:05:53.600 | oh, let's look at the pictures.
00:05:54.640 | And my son Jonathan, he's like,
00:05:56.320 | but Cleo, if you look at the pictures,
00:05:58.800 | it's just an image.
00:06:00.240 | If you just close your eyes and listen, it's a video.
00:06:03.280 | (laughing)
00:06:04.680 | - That's brilliant.
00:06:05.520 | - It's beautiful.
00:06:06.840 | And he's basically showing just how much more
00:06:09.000 | the human imagination has besides just a few images
00:06:12.280 | that, you know, the book will give you.
00:06:14.160 | And then another one, oh gosh,
00:06:15.760 | this one is really like miserable.
00:06:18.040 | It's called "Sto perigiali, to krifo."
00:06:23.040 | And it's basically describing how
00:06:25.680 | vigorously we took on our life
00:06:30.000 | and we pushed hard towards a direction
00:06:33.000 | that we then realized was the wrong one.
00:06:35.040 | (laughing)
00:06:37.000 | And again, these songs give you so much perspective.
00:06:39.720 | There's no songs like that in English
00:06:41.040 | that will basically, you know,
00:06:42.040 | sort of just smack you in the face
00:06:44.240 | about sort of the passion and the force and the drive.
00:06:47.480 | And then it turns out,
00:06:48.320 | ah, we just followed the wrong life.
00:06:51.160 | And it's like, whoa.
00:06:52.600 | (laughing)
00:06:54.480 | - Okay, so that was you.
00:06:55.320 | - All right, so that's like before 12.
00:06:57.120 | So, you know, growing up in sort of this
00:07:00.440 | horrendously miserable, you know,
00:07:03.200 | sort of view of romanticism of, you know, suffering.
00:07:07.080 | So then my preteen years is like, you know,
00:07:10.640 | learning English through songs.
00:07:12.800 | So basically, you know,
00:07:13.720 | listening to all the American pop songs
00:07:15.520 | and then memorizing them vocally
00:07:17.800 | before I even knew what they meant.
00:07:19.600 | (laughing)
00:07:21.000 | So, you know, Madonna and Michael Jackson
00:07:22.960 | and all of these sort of really popular songs
00:07:25.240 | and, you know, George Michael.
00:07:27.240 | Just songs that I would just listen to the radio
00:07:29.360 | and repeat vocally.
00:07:30.760 | And eventually, as I started learning English,
00:07:32.440 | I was like, oh, wow, this thing has been repeating.
00:07:34.480 | I now understand what it means
00:07:36.000 | without re-listening it.
00:07:37.800 | But just with re-repeating it,
00:07:39.040 | I was like, oh. (laughing)
00:07:41.440 | Again, Michael Jackson's "Man in the Mirror"
00:07:44.080 | is teaching you that it's your responsibility
00:07:47.160 | to just improve yourself.
00:07:49.320 | You know, if you wanna make the world a better place,
00:07:51.080 | take a look at yourself and make the change.
00:07:52.880 | This whole concept of, again, I mean,
00:07:55.080 | all of these songs, you can listen to them shallowly
00:07:56.920 | or you can just listen to them and say,
00:07:58.960 | oh, there's deeper meaning here.
00:08:00.680 | And I think there's a certain philosophy of song
00:08:04.160 | as a way of touching the psyche.
00:08:06.280 | So if you look at regions of the brain,
00:08:08.320 | people who have lost their language ability
00:08:10.080 | because they have an accident in that region of the brain
00:08:12.240 | can actually sing
00:08:14.080 | because it's exactly the symmetric region of the brain.
00:08:18.320 | And that, again, teaches you so much
00:08:19.600 | about language evolution
00:08:21.440 | and sort of the duality of musicality
00:08:26.120 | and rhythmic patterns and eventually language.
00:08:31.080 | - Do you have a sense of why songs developed?
00:08:33.960 | So you're kind of suggesting that it's possible
00:08:36.200 | that there is something important
00:08:39.240 | about our connection with song and with music
00:08:43.160 | on the level of the importance of language.
00:08:46.040 | Is it possible?
00:08:47.320 | - It's not just possible.
00:08:48.560 | In my view, language comes after music.
00:08:51.200 | Language comes after song.
00:08:52.520 | No, seriously.
00:08:53.360 | Like, basically, my view of human cognitive evolution
00:08:56.720 | is rituals.
00:08:58.800 | If you look at many early cultures,
00:09:01.360 | there's rituals around every stage of life.
00:09:04.560 | There's organized dance performances around mating.
00:09:09.560 | And if you look at mate selection,
00:09:11.440 | I mean, that's an evolutionary drive right there.
00:09:14.080 | So basically, if you're not able to string together
00:09:16.160 | a complex dance as a bird, you don't get a mate.
00:09:19.960 | And that actually forms this development
00:09:23.240 | for many song-learning birds.
00:09:25.320 | Not every bird knows how to sing,
00:09:27.800 | and not every bird knows how to learn a complicated song.
00:09:31.040 | So basically, there's birds
00:09:32.800 | that simply have the same few tunes
00:09:35.360 | that they know how to play,
00:09:36.200 | and a lot of that is inherent and genetically encoded.
00:09:39.120 | And others are birds that learn how to sing.
00:09:42.600 | And if you look at a lot of these exotic birds of paradise
00:09:47.320 | and stuff like that,
00:09:48.160 | like the mating rituals that they have
00:09:49.760 | are enormously amazing.
00:09:51.400 | And I think human mating rituals of ancient tribes
00:09:55.120 | are not very far off from that.
00:09:56.840 | And in my view, the sequential formation of these movements
00:10:01.960 | is a prelude to the cognitive capabilities
00:10:06.080 | that ultimately enable language.
00:10:08.360 | - It's fascinating to think that that's
00:10:10.600 | not just an accidental precursor to intelligence.
00:10:13.440 | - Yeah, it's sexually selected.
00:10:16.040 | - Well, it's sexually selected and it's a prerequisite.
00:10:19.560 | - Yeah.
00:10:20.400 | - It's like it's required for intelligence.
00:10:21.720 | - And even as language has now developed,
00:10:24.960 | I think the artistic expression is needed,
00:10:28.760 | like badly needed by our brain.
00:10:31.280 | So it's not just that,
00:10:33.240 | oh, our brain can kind of take a break
00:10:35.760 | and go do that stuff.
00:10:36.840 | No, I mean, I don't know if you remember that scene from,
00:10:40.160 | oh gosh, what's that Jack Nicholson movie in New Hampshire?
00:10:43.560 | All work and no play makes Jack a dull boy.
00:10:47.840 | - The dull boy, The Shining.
00:10:49.640 | - The Shining.
00:10:50.480 | (laughing)
00:10:51.520 | - Such a good movie.
00:10:52.360 | - There's this amazing scene
00:10:53.720 | where he's constantly trying to concentrate
00:10:56.200 | and what's coming out of the typewriter is just gibberish.
00:10:59.040 | And I have that image as well when I'm working.
00:11:01.600 | And I'm like, no, basically all of these crazy,
00:11:04.560 | huge number of hobbies that I have,
00:11:06.520 | they're not just tolerated by my work,
00:11:09.520 | they're required by my work.
00:11:11.960 | This ability of sort of stretching your brain
00:11:13.920 | in all these different directions
00:11:15.600 | is connecting your emotional self and your cognitive self.
00:11:20.120 | And that's a prerequisite
00:11:21.880 | to being able to be cognitively capable,
00:11:24.680 | at least in my view.
00:11:25.520 | - Yeah, I wonder if the world without art and music,
00:11:28.800 | you're just making me realize that perhaps
00:11:30.600 | that world would be not just devoid of fun things
00:11:34.680 | to look at or listen to, but devoid of all the other stuff,
00:11:38.080 | all the bridges and rockets and science.
00:11:41.120 | - Exactly, exactly.
00:11:42.360 | Creativity is not disconnected from art.
00:11:45.280 | And my kids, I mean, I could be doing
00:11:49.120 | the full math treatment to them.
00:11:50.480 | No, they play the piano and they play the violin
00:11:53.280 | and they play sports.
00:11:54.280 | I mean, this whole sort of movement
00:11:58.120 | and going through mazes and playing tennis
00:12:01.840 | and playing soccer and avoiding obstacles and all of that,
00:12:06.400 | that forms your three-dimensional view of the world.
00:12:09.120 | Being able to actually move and run
00:12:10.600 | and play in three dimensions
00:12:12.560 | is extremely important for math,
00:12:14.600 | for stringing together complicated concepts.
00:12:17.840 | It's the same underlying cognitive machinery
00:12:21.440 | that is used for navigating mazes
00:12:23.640 | and for navigating theorems
00:12:27.360 | and sort of solving equations.
00:12:28.760 | So I can't have a conversation with my students
00:12:32.360 | without sort of either using my hands
00:12:35.360 | or opening the whiteboard in Zoom
00:12:39.360 | and just constantly drawing.
00:12:41.240 | Or back when we had in-person meetings,
00:12:43.760 | just the whiteboard.
00:12:44.600 | - The whiteboard, yeah, that's fascinating to think about.
00:12:47.200 | So that's Michael Jackson's "Man in the Mirror,"
00:12:49.440 | "Careless Whisper" with George Michael,
00:12:52.200 | which is a song I like.
00:12:53.040 | You didn't say "Careless Whisper."
00:12:53.880 | - It's "Careless Whisper."
00:12:54.720 | - You didn't say that?
00:12:55.560 | I like that one.
00:12:56.400 | That's me.
00:12:57.240 | I had recorded-- - "Pogular" for you?
00:12:58.080 | - I had recorded, no, no, no.
00:12:59.960 | It's an amazing song for me.
00:13:01.640 | I had recorded a small part of it
00:13:03.560 | as it's played at the tail end of the radio.
00:13:05.800 | And I had a tape where I only had part of that song.
00:13:08.480 | - Part of that song, you just sang it over and over.
00:13:09.320 | - And I just played it over and over and over again.
00:13:11.960 | Just so beautiful.
00:13:13.760 | - It's so heartbreaking.
00:13:15.120 | That song is almost Greek, it's so heartbreaking.
00:13:17.000 | - I know.
00:13:17.840 | And George Michael is Greek.
00:13:19.080 | - Is he Greek? - He's Greek.
00:13:19.920 | - He's, no. - Of course.
00:13:20.800 | George Michaelides, I mean, he's Greek.
00:13:22.760 | - Yeah.
00:13:23.600 | (both laughing)
00:13:24.800 | - Now you know.
00:13:26.600 | So sorry to offend you so deeply not knowing this.
00:13:30.320 | So, okay, so what's--
00:13:31.160 | - So anyway, so we're moving to France when I'm 12 years old
00:13:33.040 | and now I'm getting into the songs of Gainsbourg.
00:13:36.160 | So Gainsbourg is this incredible French composer.
00:13:39.600 | He is always seen on stage,
00:13:42.000 | like not even pretending to try to please,
00:13:44.320 | just like with his cigarette,
00:13:45.320 | just like rrrr, mumbling his songs.
00:13:47.520 | But the lyrics are unbelievable.
00:13:49.840 | Like basically entire sentences will rhyme.
00:13:53.080 | He will say the same thing twice.
00:13:54.880 | And you're like, whoa.
00:13:56.600 | (laughs)
00:13:58.000 | And in fact, another, speaking of Greek,
00:14:00.440 | a French Greek, Georges Moustaki.
00:14:03.040 | This song is just magnificent.
00:14:05.240 | (speaking in foreign language)
00:14:09.160 | So with my face of,
00:14:13.600 | metek is actually a Greek word.
00:14:15.520 | It's a French word for a Greek word.
00:14:17.800 | But met comes from meta,
00:14:20.040 | and then ek from oikia, as in ecology, which means home.
00:14:23.360 | So metek is someone who has changed homes, a migrant.
00:14:26.640 | So with my face of a migrant,
00:14:28.480 | and you'll love this one,
00:14:30.000 | the (speaking in foreign language)
00:14:32.320 | of a wandering Jew,
00:14:35.400 | of a Greek shepherd.
00:14:37.800 | (laughs)
00:14:39.600 | So again, you know, the Russian Greek,
00:14:42.000 | Jew orthodox connection.
00:14:43.600 | (speaking in foreign language)
00:14:46.920 | With my hair in the four winds.
00:14:48.760 | (speaking in foreign language)
00:14:53.640 | With my eyes that are all washed out,
00:14:56.000 | who gives me the pretense of dreaming,
00:14:58.840 | but who don't dream that much anymore.
00:15:01.280 | With my hands of thief, of musician,
00:15:05.240 | and who have stolen so many gardens.
00:15:09.000 | With my mouth that has drunk,
00:15:11.320 | that has kissed, and that has bitten,
00:15:14.560 | without ever pleasing its hunger.
00:15:18.040 | With my skin that has been rubbed
00:15:22.400 | in the sun of all the summers,
00:15:25.680 | and anything that was wearing a skirt.
00:15:28.080 | With my heart,
00:15:30.560 | and then you have to listen to this verse,
00:15:32.200 | it's so beautiful.
00:15:33.400 | (speaking in foreign language)
00:15:37.320 | With my heart that knew how to make suffer
00:15:44.560 | as much as it suffered,
00:15:46.600 | but was able to,
00:15:49.080 | that knew how to make,
00:15:50.400 | in French is actually,
00:15:51.440 | (speaking in foreign language)
00:15:53.200 | that knew how to make.
00:15:54.760 | (speaking in foreign language)
00:15:57.880 | Verses that span the whole thing.
00:15:59.520 | It's just beautiful.
00:16:00.840 | - Do you know, on a small tangent,
00:16:02.960 | do you know Jacques Brel?
00:16:05.800 | - Of course, of course.
00:16:06.960 | (speaking in foreign language)
00:16:08.840 | - Yeah. - You know those songs?
00:16:09.680 | Those, that song gets me every time.
00:16:12.600 | - So there's a cover of that song
00:16:14.880 | by one of my favorite female artists.
00:16:17.160 | - Not Nina Simone.
00:16:18.440 | - No, no, no, no, no.
00:16:19.840 | - Modern? - Caro Emerald.
00:16:21.800 | She's from Amsterdam.
00:16:24.800 | And she has a version of "Ne me quitte pas"
00:16:28.240 | where she's actually added some English lyrics.
00:16:30.600 | And it's really beautiful.
00:16:33.120 | But again, "Ne me quitte pas" is just so,
00:16:35.320 | I mean, it's, you know, the promises,
00:16:38.400 | the volcanoes that will restart.
00:16:42.040 | It's just so beautiful.
00:16:43.480 | And-- - I love,
00:16:45.200 | there's not many songs that show such depth of desperation
00:16:50.200 | for another human being.
00:16:52.840 | That's so powerful.
00:16:54.000 | Unapologetic.
00:16:54.840 | (singing in foreign language)
00:16:58.680 | - And then high school, now I'm starting to learn English.
00:17:03.800 | So I moved to New York.
00:17:05.040 | So Sting's "Englishman in New York."
00:17:07.720 | - Yeah. - Magnificent song.
00:17:09.920 | And again, there's,
00:17:10.960 | ♪ If manners maketh man as someone said ♪
00:17:14.320 | ♪ Then he's the hero of the day ♪
00:17:16.920 | ♪ It takes a man to suffer ignorance and smile ♪
00:17:20.120 | ♪ Be yourself no matter what they say ♪
00:17:23.840 | And then, ♪ Takes more than combat gear to make a man ♪
00:17:27.800 | ♪ Takes more than a license for a gun ♪
00:17:30.720 | ♪ Confront your enemies, avoid them when you can ♪
00:17:34.200 | ♪ A gentleman will walk but never run ♪
00:17:37.480 | It's, again, you're talking about songs
00:17:39.920 | that teach you how to live.
00:17:41.240 | I mean, this is one of them.
00:17:42.640 | Basically says, it's not the combat gear that makes a man.
00:17:46.400 | - Where's the part where he says, there you go.
00:17:49.560 | ♪ Gentleness, sobriety are rare in this society ♪
00:17:53.920 | ♪ At night a candle's brighter than the sun ♪
00:17:56.840 | So beautiful.
00:17:57.680 | He basically says, well, you just might be the only one.
00:18:00.760 | ♪ Modesty propriety can lead to notoriety ♪
00:18:03.800 | ♪ You could end up as the only one ♪
00:18:05.920 | It's, it basically tells you,
00:18:08.160 | you don't have to be like the others.
00:18:09.880 | Be yourself, show kindness, show generosity.
00:18:14.040 | Don't, you know, don't let that anger get to you.
00:18:18.320 | You know the song "Fragile"?
00:18:19.840 | ♪ How fragile we are, how fragile we are ♪
00:18:22.960 | So again, as in Greece, I didn't even know what that meant,
00:18:26.120 | how fragile we are, but the song was so beautiful.
00:18:28.880 | And then eventually I learned English
00:18:30.000 | and I actually understand the lyrics.
00:18:31.840 | And the song is actually written
00:18:34.040 | after the Contras murdered Ben Linder in 1987.
00:18:39.680 | And the US eventually turned against
00:18:41.880 | supporting these guerrillas.
00:18:44.520 | And it was just a political song,
00:18:46.160 | but so such a realization that
00:18:48.160 | you can't win with violence, basically.
00:18:51.400 | And that song starts with the most beautiful poetry.
00:18:56.200 | ♪ If blood will flow when flesh and steel are one ♪
00:19:00.680 | ♪ Drying in the color of the evening sun ♪
00:19:04.600 | ♪ Tomorrow's rain will wash the stains away ♪
00:19:08.280 | ♪ But something in our minds will always stay ♪
00:19:11.760 | ♪ Perhaps this final act was meant ♪
00:19:14.280 | ♪ To clinch a lifetime's argument ♪
00:19:16.720 | ♪ That nothing comes from violence ♪
00:19:19.040 | ♪ And nothing ever could ♪
00:19:21.320 | ♪ For all those born beneath an angry star ♪
00:19:25.000 | ♪ Lest we forget how fragile we are ♪
00:19:28.280 | - Damn.
00:19:29.360 | - Damn, right?
00:19:30.200 | (laughing)
00:19:31.080 | I mean, that's poetry.
00:19:33.680 | It was beautiful.
00:19:35.680 | And he's using the English language,
00:19:37.440 | it's just such a refined way with deep meanings,
00:19:42.000 | but also words that rhyme just so beautifully
00:19:45.600 | and evocations of when flesh and steel are one.
00:19:49.760 | I mean, it's just mind boggling.
00:19:53.360 | And then of course,
00:19:54.320 | the refrain that everybody remembers is,
00:19:55.960 | ♪ On and on the rain will fall ♪
00:19:59.200 | Et cetera.
00:20:00.040 | But like this beginning.
00:20:00.880 | - Tears from a star, wow.
00:20:01.880 | - Yeah.
00:20:03.400 | And again, tears from a star, how fragile we are.
00:20:06.840 | I mean, just these rhymes are just flowing so naturally.
00:20:10.160 | - Something, it seems that more meaning comes
00:20:12.800 | when there's a rhythm that,
00:20:14.840 | I don't know what that is.
00:20:16.000 | That probably connects to exactly what you were saying.
00:20:18.200 | - And if you pay close attention,
00:20:19.200 | you will notice that the more obvious words
00:20:22.480 | sometimes are the second verse.
00:20:25.120 | And the less obvious are often the first verse
00:20:28.120 | because it makes the second verse flow much more naturally
00:20:31.680 | because otherwise it feels contrived.
00:20:33.760 | Oh, you went and found this like unusual word.
00:20:36.480 | In "Dark Moments", the whole album of Pink Floyd
00:20:39.800 | and the movie just marked me enormously
00:20:42.560 | as a teenager, just the wall.
00:20:45.200 | And there's one song that never actually made it
00:20:47.680 | into the album, that's only there in the movie
00:20:49.840 | about "When the Tigers Broke Free,"
00:20:51.920 | and the Tigers are the tanks of the Germans.
00:20:55.000 | And it just describes again, this vivid imagery.
00:20:58.120 | It was just before dawn, one miserable morning in black '44,
00:21:02.920 | when the forward commander was told to sit tight
00:21:05.720 | when he asked that his men be withdrawn.
00:21:08.440 | And the generals gave thanks as the other ranks held back
00:21:12.840 | the enemy tanks for a while.
00:21:16.040 | And the Anzio Bridgehead was held for the price
00:21:20.720 | of a few hundred ordinary lives.
00:21:24.640 | So that's a theme that keeps coming back in Pink Floyd
00:21:27.080 | with "Us Versus Them".
00:21:28.920 | ♪ Us and them ♪
00:21:30.560 | ♪ God only knows ♪
00:21:32.240 | ♪ That's not what we would choose to do ♪
00:21:36.400 | ♪ Forward he cried from the rear ♪
00:21:39.760 | ♪ And the front rank died ♪
00:21:43.480 | From another song, it's like this whole concept
00:21:45.120 | of "Us Versus Them".
00:21:46.400 | And there's that theme of "Us Versus Them" again
00:21:48.720 | where the child is discovering how his father died
00:21:53.000 | when he finds it one day
00:21:55.960 | in a drawer of old photographs hidden away.
00:21:58.560 | And my eyes still grow damp to remember His Majesty signed
00:22:03.200 | with his own rubber stamp.
00:22:05.240 | So it's so ironic because it seems the way
00:22:08.640 | that he's writing it, that he's not crying
00:22:10.160 | because his father was lost.
00:22:11.760 | He's crying because kind old King George took the time
00:22:16.000 | to actually write mother a note about the fact
00:22:18.680 | that his father died.
00:22:20.120 | It's so ironic 'cause it basically says,
00:22:23.400 | we are just ordinary men and of course we're disposable.
00:22:26.920 | So I don't know if you know the root of the word pioneers,
00:22:29.840 | but you had a chessboard here earlier, a pawn.
00:22:34.920 | In French, it's a pion.
00:22:36.960 | They are the ones that you send to the front
00:22:38.360 | to get murdered, slaughtered.
00:22:40.280 | This whole concept of pioneers, having taken
00:22:42.880 | these disposable ordinary men
00:22:45.360 | to actually be the ones that we're now treating as heroes.
00:22:48.880 | So anyway, there's this juxtaposition of that.
00:22:50.920 | And then the part that always just strikes me
00:22:53.680 | is the music and the tonality totally changes.
00:22:56.920 | And now he describes the attack.
00:22:59.200 | ♪ It was dark all around ♪
00:23:01.200 | ♪ There was frost in the ground ♪
00:23:03.160 | ♪ When the tigers broke free ♪
00:23:05.720 | ♪ And no one survived ♪
00:23:07.480 | ♪ From the Royal Fusiliers Company ♪
00:23:11.120 | ♪ They were all left behind ♪
00:23:13.280 | ♪ Most of them dead ♪
00:23:15.600 | ♪ The rest of them dying ♪
00:23:18.880 | ♪ And that's how the high command ♪
00:23:21.480 | ♪ Took my daddy from me ♪
00:23:24.800 | And that song, even though it's not in the album,
00:23:27.800 | explains the whole movie.
00:23:30.040 | 'Cause it's this movie of misery.
00:23:32.080 | It's this movie of someone being stuck in their head
00:23:35.080 | and not being able to get out of it.
00:23:37.080 | There's no other movie that I think has captured so well
00:23:40.120 | this prison that is someone's own mind.
00:23:44.760 | And this wall that you're stuck inside
00:23:47.240 | and this feeling of loneliness.
00:23:50.680 | And sort of, is there anybody out there?
00:23:53.080 | And you know, sort of, hello, hello,
00:23:56.680 | is there anybody in there?
00:23:59.120 | Just nod if you can hear me.
00:24:01.160 | Is there anyone home?
00:24:03.000 | Come on, yeah.
00:24:06.800 | I hear you're feeling down.
00:24:09.000 | ♪ Well, I can ease your pain ♪
00:24:12.920 | ♪ Get you on your feet again ♪
00:24:15.480 | Anyway, so--
00:24:16.840 | - Yeah, the prison of your mind.
00:24:18.080 | So those are the darker moments.
00:24:19.640 | - Exactly, these are the darker moments.
00:24:22.000 | - Yeah, in the darker moments,
00:24:23.760 | the mind does feel like you're trapped
00:24:28.040 | alone in a room with it.
00:24:30.120 | - Yeah, and there's this scene in the movie
00:24:32.440 | which is like, where he just breaks out with his guitar
00:24:35.600 | and there's this prostitute in the room.
00:24:36.840 | He starts throwing stuff and then he like, you know,
00:24:39.720 | breaks the window, he throws the chair outside
00:24:41.840 | and then you see him laying in the pool
00:24:43.680 | with his own blood like, you know, everywhere.
00:24:45.800 | And then there's these endless hours spent
00:24:48.840 | fixing every little thing and lining it up.
00:24:51.680 | And it's this whole sort of mania versus, you know,
00:24:55.240 | you can spend hours building up something
00:24:57.320 | and just destroy it in a few seconds.
00:24:59.760 | One of my turns is that song.
00:25:01.600 | And it's like,
00:25:03.240 | ♪ I feel cold as a razor blade ♪
00:25:07.560 | ♪ Tight as a tourniquet ♪
00:25:10.080 | ♪ Dry as a funeral drum ♪
00:25:13.920 | And then the music builds up, it's like,
00:25:15.800 | ♪ Run to the bedroom ♪
00:25:17.560 | ♪ In the suitcase on the left ♪
00:25:19.920 | ♪ You'll find my favourite axe ♪
00:25:23.240 | ♪ Don't look so frightened ♪
00:25:24.920 | ♪ This is just a passing phase ♪
00:25:27.400 | ♪ One of my bad days ♪
00:25:30.000 | It's just so beautiful.
00:25:31.080 | - I need to rewatch it, that's so,
00:25:33.120 | you're making me realize it. - But imagine watching this
00:25:34.480 | as a teenager. - Yeah, yeah.
00:25:35.800 | - It like, ruins your mind.
00:25:37.680 | (laughing)
00:25:39.000 | Like so many, it's just such harsh imagery.
00:25:42.200 | And then, you know, anyway,
00:25:44.600 | so there's the dark moment.
00:25:46.720 | And then again, going back to Sting,
00:25:48.800 | now it's the political songs, Russians.
00:25:50.880 | And I think that song should be
00:25:53.760 | a new national anthem for the US.
00:25:56.440 | Not for Russians, but for Red versus Blue.
00:25:58.520 | ♪ Mr. Khrushchev says we will bury you ♪
00:26:03.120 | ♪ I don't subscribe to this point of view ♪
00:26:06.760 | ♪ It'd be such an ignorant thing to do ♪
00:26:10.920 | ♪ If the Russians love their children too ♪
00:26:15.480 | What is it doing?
00:26:16.520 | It's basically saying,
00:26:17.760 | the Russians are just as human as we are.
00:26:22.120 | There's no way that they're gonna let their children die.
00:26:25.560 | And then it's just so beautiful.
00:26:27.400 | ♪ How can I save my little boy ♪
00:26:30.960 | ♪ From Oppenheimer's deadly toy ♪
00:26:34.200 | And now that's the new national anthem, are you ready?
00:26:37.480 | ♪ There is no monopoly of common sense ♪
00:26:40.800 | ♪ On either side of the political fence ♪
00:26:44.560 | ♪ We share the same biology ♪
00:26:47.920 | ♪ Regardless of ideology ♪
00:26:51.360 | ♪ Believe me when I say to you ♪
00:26:55.480 | ♪ I hope the Russians love their children too ♪
00:27:00.240 | ♪ There's no such thing as a winnable war ♪
00:27:03.680 | ♪ It's a lie we don't believe anymore ♪
00:27:08.560 | I mean, it's beautiful, right?
00:27:10.600 | And for God's sake, America, wake up.
00:27:13.640 | These are your fellow Americans.
00:27:15.680 | You share the same biology.
00:27:19.160 | There is no monopoly of common sense
00:27:21.440 | on either side of the political fence.
00:27:23.200 | It's just so beautiful.
00:27:24.320 | - There's no crisper, simpler way to say,
00:27:27.320 | Russians love their children too, the common humanity.
00:27:31.160 | - Yeah, and remember what I was telling you,
00:27:33.200 | I think in one of our first podcasts
00:27:35.200 | about the daughter who's crying for her brother
00:27:39.640 | to come back from war,
00:27:40.920 | and then the Virgin Mary appears and says,
00:27:43.120 | who should I take instead?
00:27:44.480 | This Turk, here's his family, here's his children.
00:27:47.560 | This other one, he just got married, et cetera.
00:27:51.040 | And that basically says, no, I mean,
00:27:53.760 | if you look at the Lord of the Rings,
00:27:56.280 | the enemies are these monsters, they're not human.
00:27:59.440 | And that's what we always do.
00:28:00.880 | We always say, they're not like us, they're different.
00:28:04.880 | They're not humans, et cetera.
00:28:06.240 | So there's this dehumanization that has to happen
00:28:08.920 | for people to go to war.
00:28:10.200 | If you realize just how close we are genetically,
00:28:14.280 | one with the other, this whole 99.9% identical,
00:28:18.680 | you can't bear weapons against someone who's like that.
00:28:22.120 | - And the things that are the most meaningful to us
00:28:24.160 | in our lives at every level is the same on all sides,
00:28:28.600 | on both sides. - Exactly.
00:28:30.040 | - So it's not just that we're genetically the same.
00:28:32.360 | - Yeah, we're ideologically the same.
00:28:34.400 | We love our children, we love our country.
00:28:36.360 | We will fight for our family.
00:28:39.920 | And the last one I mentioned last time we spoke,
00:28:43.880 | which is Joni Mitchell's "Both Sides Now."
00:28:47.280 | So she has three rounds, one on clouds,
00:28:50.960 | one on love, and one on life.
00:28:53.360 | And on clouds, she says,
00:28:55.520 | ♪ Rows and flows of angel hair ♪
00:28:58.920 | ♪ And ice cream castles in the air ♪
00:29:02.160 | ♪ And feather canyons everywhere ♪
00:29:04.640 | ♪ I've looked at clouds that way ♪
00:29:07.240 | ♪ But now they only block the sun ♪
00:29:10.720 | ♪ They rain and snow on everyone ♪
00:29:14.160 | ♪ So many things I would've done ♪
00:29:17.240 | ♪ But clouds got in my way ♪
00:29:20.080 | ♪ And then I've looked at clouds from both sides now ♪
00:29:23.360 | ♪ From up and down ♪
00:29:25.640 | ♪ And still somehow it's clouds illusions I recall ♪
00:29:31.200 | ♪ I really don't know clouds at all ♪
00:29:36.200 | And then she goes on about love,
00:29:37.680 | how it's super, super happy,
00:29:39.240 | or it's about misery and loss,
00:29:41.160 | and about life, how it's about winning and losing,
00:29:44.240 | and so on and so forth.
00:29:45.440 | ♪ But now old friends are acting strange ♪
00:29:49.240 | ♪ They shake their heads, they say I've changed ♪
00:29:53.200 | ♪ Well, something's lost and something's gained ♪
00:29:56.920 | ♪ In living every day ♪
00:29:59.520 | So again, that's growing up and realizing that,
00:30:02.400 | well, the view that you had as a kid
00:30:04.960 | is not necessarily that you have as an adult.
00:30:07.200 | Remember my poem from when I was 16 years old
00:30:09.920 | of this whole, you know, children dance now while in row,
00:30:13.440 | and then in the end, even though the snow seems bright,
00:30:16.240 | without you have lost their light,
00:30:17.760 | song that sang and moon that smiled.
00:30:19.680 | So this whole concept of if you have love
00:30:23.040 | and if you have passion,
00:30:24.600 | you see the exact same thing from a different way.
00:30:26.800 | You can go out running in the rain,
00:30:28.560 | or you could just stay in and say,
00:30:29.840 | ah, sucks, I won't be able to go outside now.
00:30:32.560 | - Both sides.
00:30:34.400 | - Anyway, and the last one is,
00:30:36.240 | last, last one, I promise, Leonard Cohen.
00:30:38.440 | - This is amazing, by the way.
00:30:39.800 | (Lex laughing)
00:30:40.800 | I'm so glad we stumbled on how much joy you have
00:30:45.000 | in so many avenues of life,
00:30:48.120 | and music is just one of them.
00:30:49.560 | That's amazing, but yes, Leonard Cohen.
00:30:52.200 | - Going back to Leonard Cohen,
00:30:53.160 | since that's where you started,
00:30:54.080 | so Leonard Cohen's "Dance Me to the End of Love,"
00:30:56.360 | that was our opening song in our wedding with my wife.
00:30:59.480 | - Oh no, that's good.
00:31:00.320 | - As we came out to greet the guests,
00:31:01.760 | it was "Dance Me to the End of Love."
00:31:03.400 | And then another one, which is just so passionate always,
00:31:06.200 | and we always keep referring back to it, is "I'm Your Man."
00:31:09.880 | And it goes on and on about sort of,
00:31:12.120 | I can be every type of lover for you.
00:31:14.720 | And what's really beautiful in marriage
00:31:16.720 | is that we live that with my wife every day.
00:31:20.240 | You can have the passion, you can have the anger,
00:31:23.280 | you can have the love, you can have the tenderness.
00:31:26.000 | There's just so many gems in that song.
00:31:28.400 | If you want a partner, take my hand,
00:31:31.720 | or if you want to strike me down in anger,
00:31:35.800 | here I stand, I'm your man.
00:31:40.040 | Then if you want a boxer, I will step into the ring for you.
00:31:43.920 | If you want a driver, climb inside,
00:31:46.680 | or if you want to take me for a ride, you know you can.
00:31:50.760 | So this whole concept of you wanna drive, I'll follow.
00:31:54.120 | You want me to drive, I'll drive.
00:31:55.840 | - And the difference, I would say,
00:31:58.040 | between that and "Nemeketepa" is,
00:32:00.600 | this song, he's got an attitude,
00:32:02.160 | he's proud of his ability to basically be any kind of man
00:32:07.920 | for as long as he wants, as opposed to the Jacques Brel
00:32:10.920 | like desperation of, what do I have to be
00:32:15.040 | for you to love me, that kind of desperation.
00:32:17.760 | - But notice, there's a parallel here.
00:32:20.440 | There's a verse that is perhaps not paid attention
00:32:22.960 | to as much, which says,
00:32:24.520 | "Ah, but a man never got a woman back,
00:32:29.320 | not by begging on his knees."
00:32:32.640 | So it seems that the "I'm your man"
00:32:34.760 | is actually an apology song,
00:32:36.520 | in the same way that "Nemeketepa" is an apology song.
00:32:39.920 | "Nemeketepa" basically says, I've--
00:32:43.600 | - Screwed up. - I've screwed up.
00:32:45.000 | - I'm sorry, baby.
00:32:45.920 | - And in the same way that the "Careless Whisper"
00:32:48.240 | is "I'm Screwed Up."
00:32:49.120 | - Yes, that's right.
00:32:50.440 | ♪ I'm never gonna dance again ♪
00:32:52.760 | ♪ Guilty feet have got no rhythm ♪
00:32:57.240 | So this is an apology song, not by begging on his knees
00:33:00.080 | or I'd crawl to you, baby, and I'd fall at your feet
00:33:03.520 | and I'd howl at your beauty like a dog in heat
00:33:07.480 | and I'd claw at your heart and I'd tear at your sheet.
00:33:11.240 | I'd say, please.
00:33:13.880 | And then the last one is so beautiful.
00:33:18.480 | ♪ If you want a father for your child ♪
00:33:22.320 | ♪ Or only want to walk with me a while across the sand ♪
00:33:27.320 | ♪ I'm your man ♪
00:33:30.800 | That's the last verses, which basically says,
00:33:33.480 | you want me for a day?
00:33:34.760 | I'll be there.
00:33:37.040 | Do you want me to just walk?
00:33:38.040 | I'll be there.
00:33:38.880 | You want me for life?
00:33:40.200 | If you want a father for your child, I'll be there too.
00:33:42.600 | It's just so beautiful.
00:33:43.840 | Oh, sorry.
00:33:44.840 | Remember how I told you I was gonna finish
00:33:45.960 | with a lighthearted song?
00:33:47.040 | - Yes.
00:33:47.880 | - Last one.
00:33:48.720 | You ready?
00:33:49.560 | So, Alison Krauss and Union Station,
00:33:52.920 | country song, believe it or not, the lucky one.
00:33:55.400 | So, I've never identified as much
00:34:00.760 | with the lyrics of a song as this one.
00:34:05.520 | And it's hilarious.
00:34:06.400 | My friend, Seraphim Patoglou,
00:34:08.600 | is the guy who got me to genomics in the first place.
00:34:11.240 | I owe enormously to him.
00:34:13.120 | And he's another Greek.
00:34:14.280 | We actually met dancing, believe it or not.
00:34:15.760 | So, we used to perform Greek dances.
00:34:18.200 | I was the president
00:34:19.040 | of the International Students Association.
00:34:20.120 | So, we put on these big performances
00:34:21.840 | for 500 people at MIT.
00:34:23.720 | And there's a picture on the MIT Tech
00:34:26.040 | where Seraphim, who's like, you know,
00:34:27.840 | a bodybuilder, was holding me on his shoulder.
00:34:30.240 | And I was like doing maneuvers in the air, basically.
00:34:33.840 | So, anyway, this guy, Seraphim,
00:34:36.000 | we were driving back from a conference
00:34:39.720 | and there's this Russian girl
00:34:41.280 | who was describing how every member of her family
00:34:43.600 | had been either killed by the communists
00:34:45.600 | or killed by the Germans or killed by the...
00:34:48.000 | Like, she had just like, you know, misery,
00:34:50.840 | like death and, you know, sickness and everything.
00:34:54.440 | Everyone was decimated in her family.
00:34:55.840 | She was the last standing member.
00:34:57.480 | And we stop at a, Seraphim was driving,
00:35:00.040 | and we stop at a rest area.
00:35:02.160 | And he takes me aside and he's like,
00:35:03.920 | "Manolis, we're gonna crash."
00:35:05.640 | (both laughing)
00:35:07.480 | Get her out of my car.
00:35:08.640 | And then he basically says,
00:35:11.200 | "But I'm only reassured because you're here with me."
00:35:15.120 | And I'm like, "What do you mean?"
00:35:15.960 | He's like, "From your smile,
00:35:18.880 | I know you're the luckiest man on the planet."
00:35:20.880 | (both laughing)
00:35:22.400 | So, there's this really funny thing
00:35:24.240 | where I just feel freaking lucky all the time.
00:35:28.600 | And it's a question of attitude.
00:35:30.520 | Of course, I'm not any luckier than any other person,
00:35:32.440 | but every time something horrible happens to me,
00:35:35.040 | I'm like, and in fact, even in that song,
00:35:37.080 | the song about sort of, you know,
00:35:39.040 | walking on the beach and this, you know,
00:35:40.880 | sort of taking our life the wrong way,
00:35:43.000 | and then, you know, having to turn around,
00:35:45.320 | at some point he's like, you know,
00:35:47.160 | "In the fresh sand we wrote her name,
00:35:50.240 | (speaking in foreign language)
00:35:53.760 | how nicely that the wind blew and the writing was erased."
00:35:57.760 | So again, it's this whole sort of,
00:36:00.960 | not just saying, "Oh, bummer,"
00:36:03.960 | but, "Oh, great, I just lost this.
00:36:07.040 | This must mean something."
00:36:08.840 | (both laughing)
00:36:09.680 | Right?
00:36:10.520 | - This horrible thing happened,
00:36:11.760 | it must open the door to a beautiful chapter.
00:36:15.280 | - So, Alison Krauss is talking about "The Lucky One."
00:36:18.320 | So, it's like, "Oh my God, she wrote a song for me."
00:36:20.560 | (both laughing)
00:36:21.880 | And she goes,
00:36:22.720 | ♪ You're the lucky one, I know that now ♪
00:36:25.320 | ♪ As free as the wind blowing down the road ♪
00:36:27.880 | ♪ Loved by many, hated by none ♪
00:36:30.200 | ♪ I'd say you were lucky 'cause you know what you've done ♪
00:36:32.680 | ♪ Not the care in the world, not the worry inside ♪
00:36:36.000 | ♪ Everything's gonna be all right ♪
00:36:37.720 | ♪ 'Cause you're the lucky one ♪
00:36:39.720 | And then she goes,
00:36:41.280 | ♪ You're the lucky one, always having fun ♪
00:36:43.480 | ♪ A jack of all trades, a master of none ♪
00:36:45.920 | ♪ You look at the world with the smiling eyes ♪
00:36:48.160 | ♪ And laugh at the devil as his train rolls by ♪
00:36:50.960 | ♪ I'll give you a song and a one-night stand ♪
00:36:53.720 | ♪ You'll be looking at a happy man ♪
00:36:55.640 | ♪ 'Cause you're the lucky one ♪
00:36:57.240 | It basically says,
00:36:59.000 | if you just don't worry too much,
00:37:01.440 | if you don't try to be, you know,
00:37:04.480 | a one-trick pony,
00:37:06.760 | if you just embrace the fact
00:37:09.120 | that you might suck at a bunch of things,
00:37:10.840 | but you're just gonna try a lot of things.
00:37:12.720 | And then there's another verse that says,
00:37:15.160 | ♪ Well, you're blessed, I guess ♪
00:37:17.280 | ♪ But never knowing which road you're choosing ♪
00:37:20.120 | ♪ To you, the next best thing ♪
00:37:22.320 | ♪ To playing and winning is playing and losing ♪
00:37:25.320 | It's just so beautiful
00:37:27.320 | because it basically says,
00:37:29.040 | if you try
00:37:30.840 | your best,
00:37:34.040 | but it's still playing,
00:37:36.360 | if you lose, it's okay, you had an awesome game.
00:37:38.760 | And again, superficially,
00:37:42.120 | it sounds like a super happy song,
00:37:43.800 | but then there's the last verse basically says,
00:37:46.960 | no matter where you're at, that's where you'll be,
00:37:48.760 | you can bet your luck won't follow me.
00:37:51.480 | Just give you a song and a one-night stand,
00:37:53.880 | you'll be looking at a happy man.
00:37:55.640 | And in the video of the song,
00:37:57.040 | she just walks away or he just walks away
00:37:58.600 | or something like that.
00:37:59.960 | And it basically tells you that freedom comes at a price.
00:38:04.360 | Freedom comes at the price of non-commitment.
00:38:06.480 | This whole sort of birds who love or birds who cry,
00:38:09.120 | you can't really love unless you cry.
00:38:12.360 | You can't just be the lucky one, the happy boy, la la la,
00:38:15.720 | and yet have a long-term relationship.
00:38:18.960 | So it's, on one hand,
00:38:21.000 | I identify with the shallowness of the song
00:38:23.120 | of you're the lucky one,
00:38:24.880 | jack of all trades, a master of none.
00:38:27.400 | But at the same time, I identify with a lesson of,
00:38:30.600 | well, you can't just be happy-go-lucky all the time.
00:38:34.400 | Sometimes you have to embrace loss
00:38:36.440 | and sometimes you have to embrace suffering.
00:38:38.360 | And sometimes you have to embrace that.
00:38:40.520 | If you have a safety net,
00:38:43.120 | you're not really committing enough.
00:38:45.520 | You're not, you know,
00:38:46.520 | basically you're allowing yourself to slip.
00:38:49.040 | But if you just go all in and you just, you know,
00:38:53.160 | let go of your reservations,
00:38:54.760 | that's when you truly will get somewhere.
00:38:56.800 | So anyway, that's like the,
00:38:59.680 | I managed to narrow it down to what, 15 songs?
00:39:02.920 | - Thank you for that wonderful journey
00:39:05.040 | that you just took us on.
00:39:06.280 | The darkest possible places of Greek song
00:39:12.080 | to ending on this, a country song.
00:39:14.880 | I haven't heard it before, but that's exactly right.
00:39:18.120 | I feel the same way, depending on the day.
00:39:21.440 | Is this the luckiest human on earth?
00:39:24.280 | And there's something to that, but you're right.
00:39:26.920 | It needs to be, we need to now return to the muck of life
00:39:31.920 | in order to be able to truly enjoy it.
00:39:37.160 | - What do you mean muck?
00:39:38.560 | What's muck?
00:39:39.960 | - The messiness of life.
00:39:41.320 | The things that were,
00:39:43.440 | things don't turn out the way you expect it to.
00:39:47.040 | So like to feel lucky is like focusing
00:39:49.760 | on the beautiful consequences.
00:39:52.240 | But then that feeling of things being different
00:39:55.640 | than you expected, that you stumble in all the kinds of ways
00:39:59.760 | that seems to be, needs to be paired with the feeling.
00:40:03.320 | - There's basically one way.
00:40:04.760 | The only way not to make mistakes is to never do anything.
00:40:07.880 | - Right.
00:40:08.720 | - Basically, you have to embrace the fact
00:40:11.880 | that you'll be wrong so many times.
00:40:13.520 | In so many research meetings,
00:40:15.480 | I just go off on a tangent and I say,
00:40:17.640 | let's think about this for a second.
00:40:19.480 | And it's just crazy for me, who's a computer scientist,
00:40:23.040 | to just tell my biologist friends,
00:40:24.760 | what if biology kind of worked this way?
00:40:27.440 | And they humor me, they just let me talk.
00:40:30.280 | And rarely has it not gone somewhere good.
00:40:34.840 | It's not that I'm always right,
00:40:36.280 | but it's always something worth exploring further.
00:40:39.840 | That if you're an outsider with humility
00:40:44.000 | and knowing that I'll be wrong a bunch of times,
00:40:47.360 | but I'll challenge your assumptions,
00:40:49.760 | and often take us to a better place,
00:40:53.560 | is part of this whole sort of messiness of life.
00:40:56.160 | Like if you don't try and lose and get hurt
00:40:59.320 | and suffer and cry and just break your heart
00:41:02.480 | and all these feelings of guilt and,
00:41:05.160 | wow, I did the wrong thing.
00:41:06.520 | Of course, that's part of life.
00:41:09.080 | And that's just something that,
00:41:10.640 | you know, if you are a doer, you'll make mistakes.
00:41:15.400 | If you're a criticizer, yeah, sure,
00:41:17.600 | you can sit back and criticize everybody else
00:41:19.840 | for the mistakes they make.
00:41:21.480 | Or instead, you can just be out there
00:41:22.920 | making those mistakes.
00:41:24.320 | And frankly, I'd rather be the criticized one
00:41:26.280 | than the criticizer.
00:41:27.840 | - Brilliantly put.
00:41:28.960 | - Every time somebody steals my bicycle, I say,
00:41:30.840 | well, no, my son's like,
00:41:32.480 | "Why do they steal our bicycle, dad?"
00:41:34.400 | And I'm like, "Aren't you happy that you have bicycles
00:41:37.320 | "that people can steal?"
00:41:39.200 | I'd rather be the person stolen from than the stealer.
00:41:41.720 | - Yeah, yeah.
00:41:42.560 | (Lex laughing)
00:41:43.400 | It's not the critic that counts.
00:41:45.160 | So that's, we've just talked amazingly about life
00:41:49.840 | from the music perspective.
00:41:51.680 | Let's talk about life, human life,
00:41:55.120 | from perhaps other perspective and its meaning.
00:41:57.440 | So this is episode 142.
00:42:00.760 | There is perhaps an absurdly deep meaning
00:42:07.040 | to the number 42 that our culture has elevated.
00:42:13.680 | So this is a perfect time to talk about the meaning of life.
00:42:16.760 | We've talked about it already,
00:42:18.400 | but do you think this question that's so simple
00:42:22.840 | and so seemingly absurd has value
00:42:27.560 | of what is the meaning of life?
00:42:29.160 | Is it something that raising the question
00:42:33.280 | and trying to answer it, is that a ridiculous pursuit
00:42:36.520 | or is there some value?
00:42:37.560 | Is it answerable at all?
00:42:39.360 | - So first of all, I feel that we owe it to your listeners
00:42:42.560 | to say why 42.
00:42:44.000 | - Sure.
00:42:44.840 | (Lex laughing)
00:42:46.040 | - So of course, the Hitchhiker's Guide to the Galaxy
00:42:48.360 | came up with 42 as basically a random number.
00:42:52.200 | Just, you know, the author just pulled it out of a hat
00:42:55.480 | and he's admitted so.
00:42:56.680 | He said, "Well, 42 just seemed like
00:42:59.000 | "just random numbers, any."
00:43:00.400 | But in fact, there's many numbers that are linked to 42.
00:43:05.480 | So 42, again, just to summarize,
00:43:09.320 | is the answer that this super mega computer,
00:43:13.860 | the most powerful computer in the world,
00:43:15.960 | had come up with after computing for a million years.
00:43:18.840 | At some point, the computer says, "I have an answer."
00:43:23.360 | And they're like, "What?"
00:43:25.320 | It's like, "You're not gonna like it."
00:43:27.160 | Like, "What is it?"
00:43:28.200 | "It's 42."
00:43:29.040 | (both laughing)
00:43:30.440 | And then the irony is that they had forgotten, of course,
00:43:33.040 | what the question was.
00:43:33.880 | - Yes.
00:43:34.720 | (both laughing)
00:43:35.540 | - So now they have to build bigger computers
00:43:36.380 | to figure out what the question is.
00:43:37.220 | - What the question is.
00:43:38.060 | - To which the answer is 42.
00:43:39.380 | So as I was turning 42, I basically sort of researched
00:43:43.760 | why 42 is such a cool number.
00:43:45.720 | And it turns out that, and I put together
00:43:48.240 | this little passage that was explaining
00:43:49.560 | to all those guests to my 42nd birthday party
00:43:53.080 | why we were talking about the meaning of life.
00:43:54.880 | And I basically talked about how 42 is the angle
00:43:59.880 | at which light reflects off of water to create a rainbow.
00:44:04.260 | And it's so beautiful because the rainbow
00:44:07.760 | is basically the combination of sort of,
00:44:09.880 | it's been raining, but there's hope
00:44:11.760 | 'cause the sun just came out.
00:44:13.280 | So it's a very beautiful number there.
00:44:14.880 | So 42 is also the sum of all rows and columns
00:44:17.760 | of a magic cube that contains all consecutive integers
00:44:21.360 | starting at one.
00:44:22.720 | So basically, if you fill a three-by-three-by-three cube
00:44:24.640 | with the integers one through 27,
00:44:27.360 | each row and column sums to 42.
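As a quick sanity check of that arithmetic, here is a minimal sketch in Python, assuming the standard 3x3x3 cube filled with the integers 1 through 27:

```python
# A 3x3x3 magic cube holds the integers 1..27, and each axis has
# 9 parallel lines of 3 cells that together cover all 27 cells once.
total = sum(range(1, 28))   # 1 + 2 + ... + 27 = 378
print(total // 9)           # 378 spread over 9 lines = 42 per line

# Equivalently, the order-n magic cube constant is n * (n**3 + 1) / 2.
n = 3
print(n * (n**3 + 1) // 2)  # 42
```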
00:44:28.880 | 42 was the last number left under 100
00:44:34.240 | for which the equation x^3 + y^3 + z^3 = n
00:44:36.540 | was not known to have a solution.
00:44:42.000 | And now a solution has actually been found.
00:44:46.160 | 42 is also 101010 in binary.
00:44:50.240 | Again, the yin and the yang, the good and the evil,
00:44:52.700 | one and zero, the balance of the forces.
00:44:54.960 | 42 is the number of chromosomes for the giant panda.
00:44:58.400 | The giant panda. (laughs)
00:45:00.080 | No, it's totally random.
00:45:01.920 | - Or is it? - It's an auspicious symbol
00:45:03.160 | of great strength coupled with peace,
00:45:05.600 | friendship, gentle temperament,
00:45:07.480 | harmony, balance, and friendship,
00:45:09.520 | whose black and white colors, again,
00:45:11.320 | symbolize yin and yang.
00:45:12.560 | The reason why it's the symbol for China
00:45:15.040 | is exactly the strength but yet peace, and so on and so forth.
00:45:19.840 | So 42 chromosomes.
00:45:21.680 | It takes light 10 to the minus 42 seconds
00:45:24.960 | to cross the diameter of a proton
00:45:27.880 | connecting the two fundamental dimensions
00:45:29.560 | of space and time.
00:45:31.160 | 42 is the number of times a piece of paper
00:45:33.520 | should be folded to reach beyond the moon. (laughs)
00:45:37.800 | So, which is what I assume my students mean
00:45:41.960 | when they ask that their paper reach for the stars.
00:45:44.560 | I just tell them just fold it a bunch of times.
00:45:46.120 | (both laugh)
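(A quick sanity check on the folding claim, assuming a standard sheet of paper about 0.1 mm thick:)

$$ 0.1\,\mathrm{mm} \times 2^{42} \approx 4.4 \times 10^{8}\,\mathrm{m} \;>\; 3.84 \times 10^{8}\,\mathrm{m}\ \text{(the Earth-Moon distance)}. $$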
00:45:48.920 | 42 is the number of Messier object 42, which is the Orion Nebula.
00:45:53.920 | And that's one of the most famous nebulae.
00:45:58.520 | It's, I think, also one of the nearest stellar nurseries
00:46:00.800 | that we can actually see with the naked eye.
00:46:03.000 | 42 is the numeric representation
00:46:04.560 | of the star symbol in ASCII.
00:46:07.360 | (Lex laughs)
00:46:08.280 | Which is very useful when searching for the stars.
00:46:10.880 | And also a regexp for life, the universe, and everything.
00:46:14.240 | So star. (both laugh)
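(Both of those facts are one-liners to check; a sketch in plain Python:)

```python
# 42 in binary really is alternating ones and zeros,
# and '*' really has ASCII code 42.
assert format(42, "b") == "101010"
assert ord("*") == 42

# As a glob-style wildcard, '*' matches everything --
# life, the universe, and all the rest.
import fnmatch
assert fnmatch.fnmatch("life, the universe and everything", "*")
print("all checks pass")
```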
00:46:17.160 | In Egyptian mythology, the goddess Maat,
00:46:20.120 | which was personifying truth and justice,
00:46:22.320 | would ask 42 questions to every dying person,
00:46:25.600 | and those answering successfully would become stars,
00:46:27.960 | continue to give life and fuel universal growth.
00:46:30.440 | In Judaic tradition, there's the 42-lettered name of God,
00:46:35.720 | entrusted only to the middle-aged, pious, meek,
00:46:38.080 | free from bad temper, sober, and not insistent on his rights.
00:46:42.120 | And in Christian tradition, there's 42 generations
00:46:46.480 | from Abraham, Isaac, that we talked about,
00:46:49.320 | the story of Isaac, Jacob,
00:46:51.240 | eventually Joseph, Mary, and Jesus.
00:46:53.680 | In Qabbalistic tradition, Eloqa, which is 42,
00:46:56.880 | is the number with which God creates the universe,
00:47:00.000 | starting with 25, let there be, and ending with 17, good.
00:47:04.760 | So 25 plus, you know, 17.
00:47:07.800 | There's the 42-chapter sutra,
00:47:10.600 | which is the first Indian religious scripture,
00:47:12.840 | which was translated to Chinese,
00:47:14.640 | thus introducing Buddhism to China from India.
00:47:18.800 | The 42-line Bible was the first printed book
00:47:21.920 | marking the age of printing in the 1450s
00:47:25.760 | and the dissemination of knowledge
00:47:27.240 | eventually leading to the Enlightenment.
00:47:29.560 | A yeast cell, which is a single-celled eukaryote
00:47:33.240 | and the subject of my PhD research,
00:47:34.800 | has exactly 42 million protein molecules.
00:47:38.080 | Anyway, so there's a series of 42.
00:47:39.920 | - You're on fire with this.
00:47:41.360 | These are really good.
00:47:42.920 | So I guess what you're saying is it's just a random number.
00:47:45.680 | - Yeah, basically.
00:47:47.320 | So all of these are basically backronyms.
00:47:49.200 | So, you know, after you have the number,
00:47:50.480 | you figure out why that number.
00:47:52.160 | So anyway, so now that we've spoken about why 42,
00:47:56.400 | why do we search for meaning?
00:47:58.280 | And you're asking, you know,
00:48:00.160 | will that search ultimately lead to our destruction?
00:48:02.600 | And my thinking is exactly the opposite.
00:48:04.880 | So basically that asking about meaning
00:48:08.560 | is something that's so inherent to human nature.
00:48:11.520 | It's something that makes life beautiful
00:48:13.480 | that makes it worth living.
00:48:15.000 | And that searching for meaning is actually the point.
00:48:18.520 | It's not the finding it.
00:48:19.720 | I think when you found it, you're dead.
00:48:21.680 | Don't ever be satisfied that, you know, I've got it.
00:48:26.560 | So I like to say that life is lived forward,
00:48:30.120 | but it only makes sense backward.
00:48:32.320 | And I don't remember whose quote that is,
00:48:34.480 | but the whole search itself is the meaning.
00:48:39.480 | And what I love about it is that
00:48:41.880 | there's a double search going on.
00:48:45.000 | There's a search in every one of us
00:48:47.680 | through our own lives to find meaning.
00:48:50.760 | And then there's a search,
00:48:52.200 | which is happening for humanity itself to find our meaning.
00:48:55.960 | And we as humans like to look at animals and say,
00:49:01.600 | of course they have a meaning.
00:49:03.160 | Like a dog has its meaning.
00:49:05.000 | It's just a bunch of instincts, you know,
00:49:07.040 | running around, loving everything.
00:49:09.440 | You know, remember our joke with the cat and the dog.
00:49:11.760 | - Yeah, cat has no meaning.
00:49:13.440 | (laughing)
00:49:14.360 | - No, no.
00:49:15.200 | And I'm noticing the yin yang symbol right here
00:49:20.640 | with this whole panda, black and white and the zero one zero.
00:49:23.400 | - You're on fire with that 42.
00:49:25.000 | Some of those are gold ASCII value for a star symbol.
00:49:29.560 | Damn.
00:49:30.400 | - I love it.
00:49:31.440 | So basically in my view,
00:49:32.920 | the search for meaning and the act of searching
00:49:37.160 | for something more meaningful is life's meaning by itself.
00:49:42.160 | The fact that we kind of always hope that yes,
00:49:45.960 | maybe for animals that's not the case,
00:49:48.040 | but maybe humans have something that we should be doing
00:49:51.280 | and something else.
00:49:52.120 | And it's not just about procreation.
00:49:53.560 | It's not just about dominance.
00:49:55.480 | It's not just about strength and feeding, et cetera.
00:49:58.400 | Like we're the one species that spends such a tiny little
00:50:01.080 | minority of its time feeding that we have this enormous,
00:50:05.400 | you know, huge cognitive capability
00:50:08.840 | that we can just use for all kinds of other stuff.
00:50:11.160 | And that's where art comes in.
00:50:12.960 | That's where, you know, the healthy mind comes in with,
00:50:16.440 | you know, exploring all of these different aspects
00:50:18.360 | that are just not directly tied to a purpose.
00:50:23.360 | that are not directly tied to a function.
00:50:25.880 | It's really just the playing of life.
00:50:28.600 | The, you know, not for a particular reason.
00:50:32.440 | - Do you think this thing we got,
00:50:34.320 | this mind is unique in the universe
00:50:38.200 | in terms of how difficult it is to build?
00:50:42.320 | - So--
00:50:43.400 | - Is it possible that we're the most beautiful thing
00:50:47.560 | that the universe has constructed?
00:50:49.920 | - Both the most beautiful and the most ugly,
00:50:51.200 | but certainly the most complex.
00:50:52.560 | So look at evolutionary time.
00:50:55.120 | The dinosaurs ruled the earth for 135 million years.
00:50:58.920 | We've been around for a million years.
00:51:00.960 | So one versus 135.
00:51:05.840 | So dinosaurs went extinct, you know,
00:51:07.760 | about 60 million years ago,
00:51:09.120 | and mammals that had been happily evolving
00:51:11.040 | as tiny little creatures for 30 million years,
00:51:13.320 | then took over the planet.
00:51:14.720 | And then, you know, dramatically radiated
00:51:16.680 | about 60 million years ago.
00:51:18.040 | Out of these mammals came the neocortex formation.
00:51:24.040 | So basically the neocortex,
00:51:25.600 | which is sort of the outer layer of our brain
00:51:28.080 | compared to our quote unquote reptilian brain,
00:51:29.880 | which we share the structure of with all of the dinosaurs,
00:51:33.200 | they didn't have that, and yet they ruled the planet.
00:51:36.040 | So how many other planets have still, you know,
00:51:38.760 | mindless dinosaurs where strength was the only dimension
00:51:42.800 | ruling the planet?
00:51:45.480 | So there was something weird that annihilated the dinosaurs.
00:51:49.400 | And again, you could look at biblical things
00:51:51.080 | of sort of God coming and wiping out his creatures
00:51:53.360 | and to make room for the next ones.
00:51:55.080 | So the mammals basically sort of took over the planet
00:51:59.120 | and then grew this cognitive capability
00:52:03.200 | of this general purpose machine.
00:52:06.240 | And primates pushed that to extreme,
00:52:10.560 | and humans among primates have just exploded that hardware.
00:52:15.520 | But that hardware is selected for survival.
00:52:22.240 | It's selected for procreation.
00:52:24.200 | It's initially selected with this very simple
00:52:28.160 | Darwinian view of the world of random mutation,
00:52:32.400 | ruthless selection,
00:52:33.800 | and then selection for making more of yourself.
00:52:36.200 | If you look at human cognition,
00:52:40.880 | it's gone down a weird evolutionary path
00:52:43.760 | in the sense that we are expending
00:52:46.720 | an enormous amount of energy on this apparatus
00:52:49.800 | between our ears that is wasting, what,
00:52:52.680 | 15% of our bodily energy, 20%,
00:52:55.680 | like some enormous percentage of our calories
00:52:59.760 | goes to running our brain.
00:53:01.600 | No other species makes that big of a commitment.
00:53:05.960 | That has basically taken energetic changes
00:53:08.840 | for efficiency on the metabolic side for humanity
00:53:13.240 | to basically power that thing.
00:53:15.400 | And our brain is both enormously more efficient
00:53:18.680 | than other brains, but also, despite this efficiency,
00:53:22.120 | enormously more energy consuming.
00:53:23.920 | So, and if you look at just the sheer folds
00:53:28.600 | that the human brain has,
00:53:30.400 | again, our skull could only grow so much
00:53:33.280 | before it could no longer go through the pelvic opening
00:53:36.400 | without killing the mother at birth.
00:53:39.480 | And yet the folds continued,
00:53:42.200 | effectively creating just so much more capacity.
00:53:46.640 | The evolutionary context in which this was made
00:53:51.560 | is enormously fascinating.
00:53:53.480 | And it has to do with other humans
00:53:57.240 | that we have now killed off or that have gone extinct.
00:54:00.440 | And that has now created this weird place of humans
00:54:04.280 | on the planet as the only species
00:54:06.520 | that has this enormous hardware.
00:54:08.800 | So, that can basically make us think
00:54:10.400 | that there's something very weird and unique
00:54:12.120 | that happened in human evolution
00:54:13.280 | that perhaps has not been recreated elsewhere.
00:54:15.400 | Maybe on some sister Earth the asteroid didn't hit
00:54:18.600 | and dinosaurs are still ruling,
00:54:20.720 | and any kind of proto-human gets squished
00:54:23.760 | and eaten for breakfast, basically.
00:54:25.560 | However, we're not as unique as we like to think
00:54:31.280 | because there was this enormous diversity
00:54:33.400 | of other human-like forms.
00:54:35.800 | And once you make it to that stage
00:54:37.800 | where you have a neocortex-like explosion of,
00:54:40.560 | wow, we're now competing on intelligence
00:54:42.920 | and we're now competing on social structures
00:54:44.720 | and we're now competing on larger and larger groups
00:54:48.800 | and being able to coordinate and being able to have empathy.
00:54:53.800 | The concept of empathy, the concept of an ego,
00:54:57.800 | the concept of a self, of self-awareness
00:55:01.240 | comes probably from being able to project
00:55:06.240 | another person's intentions to understand what they mean
00:55:10.880 | when you have these large cognitive groups,
00:55:13.480 | large social groups.
00:55:15.160 | So me being able to sort of create a mental model
00:55:19.080 | of how you think may have come before
00:55:22.120 | I was able to create a personal mental model
00:55:25.120 | of how I think.
00:55:26.360 | So this introspection probably came after
00:55:29.680 | this sort of projection and this empathy,
00:55:33.520 | which comes from pathos: passion, suffering,
00:55:37.720 | but basically feeling.
00:55:39.120 | So empathy means feeling what you're feeling,
00:55:42.600 | trying to project your emotional state
00:55:45.360 | onto my cognitive apparatus.
00:55:47.560 | And I think that is what eventually led
00:55:51.360 | to this enormous cognitive explosion
00:55:55.240 | that happened in humanity.
00:55:56.680 | So life itself, in my view, is inevitable on every planet.
00:56:01.680 | - Inevitable. - Inevitable.
00:56:05.440 | But the evolution of life to self-awareness and cognition
00:56:09.920 | and all the incredible things that humans have done,
00:56:12.720 | you know, that might not be as inevitable.
00:56:14.600 | - That's your intuition.
00:56:15.440 | So if you were to sort of estimate and bet some money on it,
00:56:20.440 | if we reran Earth a million times,
00:56:24.920 | would what we got now be the most special thing
00:56:29.520 | and how often would it be?
00:56:30.960 | So scientifically speaking,
00:56:33.040 | how repeatable is this experiment? (laughs)
00:56:36.280 | - So this whole cognitive revolution,
00:56:38.680 | yes. - Maybe not.
00:56:41.960 | Maybe not.
00:56:42.800 | Basically, I feel that the longevity of dinosaurs
00:56:47.400 | suggests that it was not quite inevitable
00:56:51.880 | that something like us humans would eventually emerge.
00:56:56.680 | - Well, you're also implying one thing here.
00:56:59.220 | You're implying that humans
00:57:01.120 | also don't have this longevity.
00:57:03.320 | This is the interesting question.
00:57:05.040 | So with the Fermi paradox,
00:57:06.960 | the basic question is like,
00:57:10.440 | if the universe has a lot of alien life forms in it,
00:57:14.400 | why haven't we seen them?
00:57:16.480 | And one thought is that there's a great filter
00:57:19.360 | or multiple great filters
00:57:21.240 | that basically would destroy intelligent civilizations.
00:57:25.560 | This thing that we, you know, this multifolding brain
00:57:28.640 | that keeps growing may not be such a big feature.
00:57:32.160 | It might be useful for survival,
00:57:34.280 | but it like takes us down a side road
00:57:38.440 | that is a very short one with a quick dead end.
00:57:41.160 | What do you think about that?
00:57:42.240 | - So I think the universe is enormous,
00:57:45.620 | not just in space, but also in time.
00:57:48.280 | And the pretense that, you know,
00:57:53.960 | during the last blink of an instant
00:57:56.760 | that we've been able to send radio waves,
00:57:58.840 | somebody should have been paying attention
00:58:00.360 | to our planet is a little ridiculous.
00:58:03.880 | So my, you know, what I love about Star Wars
00:58:07.260 | is a long, long time ago in a galaxy far, far away.
00:58:10.800 | It's not like some distant future,
00:58:11.920 | it's a long, long time ago.
00:58:13.520 | What I love about it is that basically says,
00:58:16.000 | you know, evolution and civilization
00:58:19.400 | are just so recent in, you know, on earth.
00:58:23.800 | Like there's countless other planets
00:58:25.120 | that have probably all kinds of life forms,
00:58:27.760 | multicellular perhaps, and so on and so forth.
00:58:31.600 | But the fact that humanity has only been listening
00:58:35.880 | and emitting for just this tiny little blink
00:58:39.360 | means that any of these, you know, alien civilizations
00:58:42.840 | would need to be paying attention
00:58:44.100 | to every single insignificant planet out there.
00:58:47.200 | And, you know, again, I mean, the movie "Contact"
00:58:50.360 | and the book is just so beautiful.
00:58:52.400 | This whole concept of we don't need to travel physically.
00:58:56.240 | We can travel as light.
00:58:57.720 | We can send instructions for people to create machines
00:59:01.080 | that will allow us to beam down light and recreate ourselves.
00:59:04.880 | And in the book, you know, the aliens actually take over.
00:59:08.160 | - Yes. - They're not as friendly.
00:59:09.840 | But, you know, this concept that we have to eventually
00:59:13.560 | go and conquer every planet.
00:59:14.780 | I mean, I think that, yes,
00:59:16.720 | we will become a galactic species.
00:59:18.600 | - So you have a hope, well, you said think, so.
00:59:21.840 | - Oh, of course, of course.
00:59:23.120 | I mean, now that we've made it so far.
00:59:25.840 | - So you feel like we've made it.
00:59:27.240 | - Oh gosh, I feel that, you know, cognition,
00:59:30.120 | cognition as an evolutionary trait
00:59:31.800 | has won out on our planet.
00:59:33.840 | There's no doubt that we've made it.
00:59:35.360 | So basically humans have won the battle for, you know,
00:59:40.280 | dominance.
00:59:41.400 | It wasn't necessarily the case with dinosaurs.
00:59:43.800 | Like, I mean, yes, you know,
00:59:46.160 | there's some claims of intelligence.
00:59:50.400 | And if you look at Jurassic Park, yeah, sure, whatever.
00:59:53.840 | But, you know, they just don't have the hardware for it.
00:59:57.280 | - Yeah.
00:59:58.120 | - And humans have the hardware.
00:59:59.320 | There's no doubt that mammals
01:00:00.960 | have a dramatically improved hardware
01:00:03.680 | for cognition over dinosaurs.
01:00:06.120 | Like basically there are universes where strength won out.
01:00:09.460 | And in our planet, in our, you know,
01:00:11.680 | particular version of whatever happened in this planet,
01:00:15.600 | cognition won out.
01:00:16.960 | And it's kind of cool.
01:00:18.040 | I mean, it's a privilege, right?
01:00:20.220 | But it's kind of like living in Boston instead of,
01:00:22.040 | I don't know, some Middle Ages place
01:00:26.600 | where everybody's like hitting each other
01:00:28.080 | with, you know, weapons and sticks.
01:00:31.160 | - You're back to the lucky one song.
01:00:32.880 | (laughing)
01:00:33.720 | - I mean, we are the lucky ones.
01:00:36.080 | - But the flip side of that is that this hardware
01:00:39.640 | also allows us to develop weapons
01:00:41.640 | and methods of destroying ourselves.
01:00:43.880 | So you--
01:00:46.120 | - Again, I wanna go back to Pinker
01:00:47.960 | and the better angels of our nature.
01:00:50.440 | The whole concept that civilization
01:00:55.120 | and the act of civilizing
01:00:58.120 | has dramatically reduced violence, dramatically.
01:01:02.920 | If you look, you know, at every scale,
01:01:06.320 | as soon as organization comes,
01:01:08.880 | the state basically owns the right to violence.
01:01:13.740 | And eventually the state gives that right
01:01:17.600 | of governance to the people,
01:01:20.440 | but interpersonal violence has been dramatically reduced by that state.
01:01:23.480 | So this whole concept of central governance
01:01:27.160 | and people agreeing to live together
01:01:29.840 | and share responsibilities and duties
01:01:34.000 | and, you know, all of that,
01:01:36.840 | is something that has led so much to less violence,
01:01:41.600 | less death, less suffering, less, you know,
01:01:44.800 | poverty, less, you know, war.
01:01:48.800 | I mean, yes, we have the capability
01:01:52.240 | to destroy ourselves,
01:01:53.560 | but the arc of civilization
01:01:56.560 | has led to much, much less destruction,
01:01:58.760 | much, much less war and much more peace.
01:02:01.080 | And of course there's blips back and forth
01:02:04.040 | and, you know, there are setbacks,
01:02:06.480 | but again, the moral arc of the universe
01:02:10.240 | seems to just--
01:02:11.480 | - I can imagine there were two dinosaurs
01:02:14.300 | back in the day having this exact conversation
01:02:17.080 | and they look up to the sky
01:02:18.600 | and there seems to be something like an asteroid
01:02:21.200 | going towards Earth.
01:02:22.040 | So while it's very true
01:02:24.760 | that the arc of our society of human civilization
01:02:28.520 | seems to be progressing
01:02:30.380 | towards a better and better life for everybody
01:02:32.880 | in the many ways that you described,
01:02:34.860 | things can change in a moment.
01:02:39.160 | And it feels like it's not just us humans
01:02:41.640 | we're living through a pandemic.
01:02:44.160 | You could imagine that a pandemic would be more destructive
01:02:47.960 | or there could be asteroids that just appear
01:02:52.520 | out of the darkness of space,
01:02:54.480 | which I recently learned is not that easy to--
01:02:58.400 | - Let me give you another number.
01:02:59.240 | - Detect them, yes.
01:03:00.400 | - So 48, what happens in 48 years?
01:03:02.940 | - I'm not sure.
01:03:05.440 | - 2068, Apophis.
01:03:06.960 | There's an asteroid that's coming.
01:03:09.600 | In 48 years, it has a small but nonzero chance
01:03:11.560 | of actually hitting us.
01:03:13.560 | - Yes.
01:03:14.400 | - Yes.
01:03:15.400 | - So we have 48 years to get our act together.
01:03:18.560 | It's not like some distant, distant hypothesis.
01:03:21.000 | - Yes.
01:03:21.840 | - Like, yeah, sure, they're hard to detect,
01:03:23.120 | but this one we know about, it's coming.
01:03:25.280 | - So how do you feel about that?
01:03:26.240 | Why are you still so optimistic?
01:03:27.200 | - Oh gosh, I'm so happy with where we are now.
01:03:29.840 | This is gonna be great.
01:03:30.880 | Seriously, if you look at progress,
01:03:32.760 | if you look at, again,
01:03:34.040 | the speed with which knowledge has been transferred,
01:03:38.440 | what has led to humanity making so many advances so fast?
01:03:43.280 | Okay?
01:03:44.360 | So what has led to humanity making so many advances
01:03:47.560 | is not just the hardware upgrades.
01:03:50.400 | It's also the software upgrades.
01:03:52.600 | So by hardware upgrades, I basically mean our neocortex
01:03:55.520 | and the expansion and these layers and folds of our brain
01:03:59.440 | and all of that.
01:04:00.280 | That's the hardware.
01:04:01.400 | The hardware, you know,
01:04:04.520 | hasn't changed much in the last,
01:04:07.120 | what, 70,000 years?
01:04:08.400 | As I mentioned last time,
01:04:12.000 | if you take a person from ancient Egypt
01:04:13.560 | and you bring them up now,
01:04:15.200 | they're just as fit.
01:04:18.000 | So hardware hasn't changed.
01:04:20.520 | What has changed is software.
01:04:22.440 | What has changed is that we are growing up in societies
01:04:27.440 | that are much more complex.
01:04:30.880 | This whole concept of neoteny
01:04:32.880 | basically allows our exponential growth.
01:04:35.760 | The concept that our brain has not fully formed,
01:04:39.120 | has not fully stabilized itself
01:04:41.680 | until after our teenage years.
01:04:43.600 | So we basically have a good 16 years, 18 years
01:04:46.680 | to sort of infuse it with the latest
01:04:48.920 | and greatest in software.
01:04:51.240 | If you look at what happened in ancient Greece,
01:04:53.920 | why did everything explode at once?
01:04:57.400 | My take on this is that it was the shift
01:04:59.680 | from the Egyptian hieroglyphic software
01:05:03.520 | to the Greek language software.
01:05:05.680 | This whole concept of creating abstract notions,
01:05:09.960 | of creating these layers of cognition
01:05:14.960 | and layers of meaning and layers of abstraction
01:05:19.480 | for words and ideals and beauty and harmony.
01:05:24.480 | How do you write harmony in hieroglyphics?
01:05:26.800 | There's no such thing as sort of expressing these ideals
01:05:29.920 | of peace and justice and these concepts of,
01:05:34.360 | or even macabre concepts of doom, et cetera.
01:05:39.680 | You don't have the language for it.
01:05:42.160 | Your brain has trouble getting at that concept.
01:05:47.160 | So what I'm trying to say is that
01:05:50.120 | these software upgrades for human language,
01:05:55.360 | human culture, human environment, human education
01:06:00.080 | have basically led to this enormous explosion of knowledge.
01:06:04.920 | And eventually after the enlightenment,
01:06:07.200 | and as I was mentioning, the 42-line Bible
01:06:11.320 | and the printed press, the dissemination of knowledge,
01:06:13.800 | you basically now have this whole horizontal dispersion
01:06:17.080 | of ideas in addition to the vertical inheritance of genes.
01:06:21.320 | So the hardware improvements
01:06:23.640 | happen through vertical inheritance.
01:06:25.480 | The software improvements happen
01:06:27.000 | through horizontal inheritance.
01:06:28.720 | And the reason why human civilization exploded
01:06:31.360 | is not a hardware change anymore,
01:06:32.680 | it's really a software change.
01:06:34.560 | So if you're looking at now where we are today,
01:06:37.200 | look at coronavirus.
01:06:40.480 | Yes, sure, it could have killed us.
01:06:42.520 | A hundred years ago, it would have, but it didn't.
01:06:46.400 | Because in January, we published the genome.
01:06:50.960 | A month later, less than a month later,
01:06:52.960 | the first vaccine designs were done.
01:06:54.920 | And now less than a year later, 10 months later,
01:06:58.160 | we already have a working vaccine that's 90% effective.
01:07:01.160 | I mean, that is ridiculous by any standards.
01:07:03.800 | And the reason is sharing.
01:07:05.960 | So the asteroid, yes, could wipe us out in 48 years,
01:07:09.400 | but 48 years?
01:07:11.000 | I mean, look at where we were 48 years ago, technologically.
01:07:16.000 | I mean, how much more we understand
01:07:18.360 | the basic foundations of space is enormous.
01:07:22.960 | The technological revolutions of digitization,
01:07:27.320 | the amount of compute power we can put
01:07:33.480 | in nail-sized hardware is enormous.
01:07:36.880 | So, and this is nowhere near ending.
01:07:40.560 | We all have our little problems going back and forth
01:07:43.800 | on the social side and on the political side
01:07:46.240 | and on the cognitive and on the sort of human side
01:07:49.280 | and the societal side, but science has not slowed down.
01:07:54.080 | Science is moving at a breakneck pace ahead.
01:07:57.360 | So Elon is now putting rockets out from the private space.
01:08:01.680 | I mean, that now democratization of space exploration
01:08:06.040 | is gonna revolutionize. - It's gonna explode.
01:08:08.680 | - Of course. - It could continue.
01:08:09.800 | - In the same way that every technology has exploded,
01:08:12.600 | this is the shift to space technology exploding.
01:08:15.920 | So 48 years is infinity from now
01:08:19.360 | in terms of space capabilities.
01:08:21.280 | So I'm not worried at all.
01:08:22.600 | - Are you excited by the possibility of a human,
01:08:25.000 | well, one, a human stepping foot on Mars
01:08:28.440 | and two, possible colonization of not necessarily Mars,
01:08:33.080 | but other planets and all that kind of stuff
01:08:34.840 | for people living in space?
01:08:36.120 | - Inevitable. - Inevitable.
01:08:37.560 | - Inevitable. - Would you do it?
01:08:39.000 | Or are you kind of like Earth? - Of course.
01:08:40.240 | Of course. (Lex laughing)
01:08:41.520 | In a heartbeat.
01:08:42.360 | - How many people will you wait for?
01:08:44.000 | I think it was about when
01:08:46.400 | the Declaration of Independence was signed,
01:08:50.000 | about two to three million people lived here.
01:08:52.560 | So would you move like before?
01:08:54.440 | Would you be like on the first boat?
01:08:57.000 | Would you be on the 10th boat?
01:08:58.280 | Would you wait until the Declaration of Independence?
01:09:00.440 | - I don't think I'll be on the short list
01:09:02.120 | because I'll be old by then.
01:09:04.000 | They'll probably get a bunch of younger people.
01:09:06.120 | - But you're, it's the-- (Lex laughing)
01:09:08.200 | It's the wisdom and the--
01:09:09.800 | - But wisdom can be transferred horizontally.
01:09:13.640 | - I gotta tell you, you are the lucky one,
01:09:15.240 | so you might be on the list.
01:09:16.160 | I don't know.
01:09:17.000 | You never know. - I mean, I kind of feel
01:09:19.200 | like I would love to see Earth from above
01:09:21.960 | just to watch our planet.
01:09:23.720 | I mean, just, I mean, you know you can watch a live feed
01:09:26.840 | of the space station.
01:09:29.320 | Watching Earth is magnificent,
01:09:32.600 | like this blue tiny little shield.
01:09:35.840 | It's so thin, our atmosphere.
01:09:38.040 | Like if you drove the distance to New York straight up,
01:09:39.160 | you'd basically be in outer space.
01:09:40.560 | I mean, it's ridiculous.
01:09:41.440 | It's just so thin.
01:09:42.480 | And it's just, again, such a privilege
01:09:45.040 | to be on this planet, such a privilege.
01:09:47.280 | But I think our species is in for big, good things.
01:09:52.280 | (both laughing)
01:09:54.800 | I think that we will overcome our little problems
01:09:58.800 | and eventually come together as a species.
01:10:01.400 | I feel that we're definitely on the path to that.
01:10:04.360 | And it just hasn't permeated
01:10:09.360 | through the whole world yet,
01:10:11.240 | through the whole Earth yet,
01:10:12.560 | but it's definitely permeating.
01:10:14.200 | - So you've talked about humans as special.
01:10:17.640 | How exactly are we special relative to the dinosaurs?
01:10:24.280 | - So I mentioned that there's this dramatic
01:10:28.200 | cognitive improvement that we've made,
01:10:31.720 | but I think it goes much deeper than that.
01:10:34.000 | So if you look at a lion attacking a gazelle
01:10:38.720 | in the middle of the Serengeti,
01:10:41.000 | the lion is smelling the molecules in the environment.
01:10:45.680 | Its hormones and neuroreceptors
01:10:50.680 | are sort of getting it ready to pounce.
01:10:54.520 | The target is constantly looking around and sensing.
01:10:58.440 | I've actually been in Kenya and I've kind of seen the hunt.
01:11:03.000 | So I've kind of seen the sort of game of waiting.
01:11:07.360 | And the mitochondria in the muscles of the lion
01:11:13.320 | are basically ready for jumping.
01:11:18.360 | They're expending an enormous amount of energy.
01:11:21.520 | The grass as it's flowing is constantly transforming
01:11:25.760 | solar energy,
01:11:30.080 | through the chloroplasts, into chemical energy,
01:11:32.200 | which eventually feeds the gazelle
01:11:34.640 | and eventually feeds the lions and so on and so forth.
01:11:37.880 | So as humans, we experience all of that,
01:11:42.880 | but the lion only experiences one layer.
01:11:49.620 | The mitochondria in its body
01:11:51.120 | are only experiencing one layer.
01:11:52.520 | The chloroplasts are only experiencing one layer.
01:11:55.560 | The photoreceptors and the smell receptors
01:11:59.800 | and the chemical receptors,
01:12:00.640 | like the lion always attacks against the wind
01:12:02.840 | so that it's not smelled.
01:12:04.080 | Like all of these things are one layer at a time.
01:12:10.160 | And we humans somehow perceive the whole stack.
01:12:14.760 | So going back to software infrastructure
01:12:17.120 | and hardware infrastructure,
01:12:18.680 | if you design a computer,
01:12:20.640 | you basically have a physical layer that you start with.
01:12:23.400 | And then on top of that physical layer,
01:12:24.840 | you have the electrical layer.
01:12:27.360 | And on top of the electrical layer,
01:12:28.720 | you have basically gates and logic and an assembly layer.
01:12:32.720 | And on top of the assembly layer,
01:12:33.720 | you have your higher order, higher level programming.
01:12:37.760 | And on top of that,
01:12:38.600 | you have your deep learning routine, et cetera.
01:12:40.200 | And on top of that,
01:12:41.040 | you eventually build a cognitive system that's smart.
01:12:46.060 | I want you to now picture this cognitive system
01:12:49.540 | becoming not just self-aware,
01:12:51.920 | but also becoming aware of the hardware that it's made of
01:12:56.880 | and the atoms that it's made of and so on and so forth.
01:13:01.180 | So it's as if your AI system,
01:13:08.680 | and there's this beautiful scene in "2001: A Space Odyssey"
01:13:08.680 | where Hal, after Dave starts disconnecting him,
01:13:13.060 | is starting to sing a song about daisies, et cetera.
01:13:17.720 | And Hal is basically saying,
01:13:20.160 | "Dave, I'm losing my mind.
01:13:22.740 | I can feel I'm losing my mind."
01:13:26.480 | It's just so beautiful.
01:13:28.160 | This concept of self-awareness,
01:13:30.880 | of knowing that the hardware is no longer there, is amazing.
01:13:35.320 | And in the same way,
01:13:36.680 | humans who have had accidents
01:13:39.120 | are aware that they've had accidents.
01:13:42.120 | So there's this self-awareness of AI
01:13:44.240 | that is this beautiful concept
01:13:48.740 | about sort of the eventual cognitive leap to self-awareness.
01:13:53.740 | But imagine now the AI system
01:13:57.280 | actually breaking through these layers and saying,
01:13:58.840 | "Wait a minute, I think I can design
01:14:00.240 | a slightly better hardware to get me functioning better."
01:14:03.520 | And that's what basically humans are doing.
01:14:05.740 | So if you look at our reasoning layer,
01:14:08.440 | it's built on top of a cognitive layer.
01:14:11.240 | And the reasoning layer we share with AI.
01:14:13.420 | It's kind of cool.
01:14:14.840 | There is another thing on the planet
01:14:16.520 | that can integrate equations and it's man-made,
01:14:19.400 | but we share computation with them.
01:14:22.140 | We share this cognitive layer of playing chess.
01:14:24.880 | We're not alone anymore.
01:14:26.560 | We're not the only thing on the planet that plays chess.
01:14:28.440 | Now we have AI that also plays chess.
01:14:31.280 | - But in some sense, that particular organism, AI,
01:14:34.200 | as it is now, only operates in that layer.
01:14:36.280 | - Exactly, exactly.
01:14:38.240 | And then most animals operate in the sort of
01:14:41.440 | cognitive layer that we're all experiencing.
01:14:43.800 | A bat is doing this incredible integration of signals,
01:14:48.040 | but it's not aware of it.
01:14:50.020 | It's basically constantly sending echo location, waves,
01:14:55.020 | and it's receiving them back.
01:14:56.920 | And multiple bats in the same cave
01:14:59.200 | are operating at slightly different frequencies
01:15:01.800 | and with slightly different pulses.
01:15:03.800 | And they're all sensing objects
01:15:05.240 | and they're doing motion planning
01:15:07.440 | in their cognitive hardware,
01:15:08.880 | but they're not even aware of all of that.
01:15:10.760 | All they know is that they have a 3D view of space
01:15:13.920 | around them, just like any gazelle walking through,
01:15:17.440 | you know, the desert.
01:15:19.000 | And any baby looking around is aware of things
01:15:23.640 | without doing the math of how am I processing
01:15:26.920 | all of these visual information, et cetera.
01:15:29.000 | You're just aware of the layer that you live in.
01:15:31.880 | I think if you look at this, at humanity,
01:15:36.120 | we've basically managed through our cognitive layer,
01:15:39.120 | through our perception layer, through our senses layer,
01:15:42.040 | through our multi-organ layer, through our genetic layer,
01:15:47.040 | through our molecular layer, through our atomic layer,
01:15:52.200 | through our quantum layer,
01:15:54.360 | through even the very fabric of the space-time continuum,
01:15:58.200 | unite all of that cognitively.
01:16:00.680 | So as we're watching that scene in the Serengeti,
01:16:04.520 | we as scientists, we as educated humans,
01:16:07.360 | we as, you know, anyone who's finished high school
01:16:10.040 | are aware of all of this beauty
01:16:12.120 | of all of these different layers interplaying together.
01:16:14.760 | And I think that's something very unique
01:16:16.720 | in perhaps not just the galaxy, but maybe even the cosmos.
01:16:20.160 | This species that has managed, in space,
01:16:25.960 | to cross through these layers from the enormous
01:16:29.420 | to the infinitely small.
01:16:30.780 | And that's what I love about particle physics,
01:16:33.240 | the fact that it actually unites everything.
01:16:35.800 | - The very small, the very big.
01:16:36.640 | - The very small and the very big.
01:16:38.000 | It's only through the very big
01:16:39.280 | that the very small gets formed.
01:16:41.320 | Like basically every atom of gold
01:16:43.080 | results from the fusion that happened
01:16:47.880 | of increasingly heavy nuclei before that stellar explosion
01:16:51.400 | that then disperses it through the cosmos.
01:16:53.640 | And it's only through understanding the very large
01:16:57.220 | that we understand the very small and vice versa.
01:16:59.560 | And that's in space.
01:17:02.000 | Then there's the time direction.
01:17:04.880 | As you are watching the Kilimanjaro Mountain,
01:17:08.720 | you can kind of look back through time
01:17:11.720 | to when that volcano was exploding
01:17:14.360 | and growing out of the tectonic forces.
01:17:17.760 | As you drive through Death Valley,
01:17:20.460 | you see these mountains on their side
01:17:23.400 | and these layers of history exposed.
01:17:27.000 | We are aware of the eons that have happened on Earth
01:17:33.260 | and the tectonic movements on Earth
01:17:36.060 | the same way that we're aware of the Big Bang
01:17:41.220 | and the early evolution of the cosmos.
01:17:44.620 | And we can also see forward in time
01:17:46.620 | as to where the universe is heading.
01:17:48.380 | We can see Apophis in 2068 coming over,
01:17:53.340 | looking ahead in time.
01:17:54.360 | I mean, that would be magician stuff in ancient times.
01:17:58.800 | So what I love about humanity and its role in the universe
01:18:02.400 | is that if there's a God watching,
01:18:05.840 | he's like, "Finally, somebody figured it out.
01:18:08.240 | "I've been building all these beautiful things
01:18:10.380 | "and somebody can appreciate it."
01:18:11.640 | - And figured me out from God's perspective,
01:18:13.780 | meaning become aware of.
01:18:15.680 | - Yeah.
01:18:16.980 | - Yeah, so it's kind of interesting
01:18:18.920 | to think of the world in this way as layers
01:18:22.000 | and us humans are able to convert those layers
01:18:25.380 | into ideas that you can then combine.
01:18:29.920 | So we're doing some kind of conversion.
01:18:32.040 | - Exactly, exactly.
01:18:33.500 | And last time you asked me about
01:18:35.200 | whether we live in a simulation, for example.
01:18:37.640 | I mean, realize that we are living in a simulation.
01:18:42.640 | We are.
01:18:44.260 | The reality that we're in
01:18:45.740 | without any sort of person programming this is a simulation.
01:18:49.580 | Like basically what happens inside your skull?
01:18:52.420 | There's this integration of sensory inputs
01:18:55.780 | which are translated into perceptory signals,
01:18:58.580 | which are then translated into a conceptual model
01:19:01.180 | of the world around you.
01:19:03.260 | And that exercise is happening seamlessly.
01:19:07.580 | And yet, if you think about sort of, again,
01:19:11.680 | this whole simulation and Neo analogy,
01:19:15.760 | you can think of the reality that we live in as a matrix,
01:19:19.060 | as the matrix,
01:19:20.500 | but we've actually broken through the matrix.
01:19:23.020 | We've actually traversed the layers.
01:19:24.620 | We didn't have to take a pill.
01:19:26.140 | Like we didn't, you know,
01:19:28.300 | Morpheus didn't have to show up
01:19:30.120 | to basically give us the blue pill or the red pill.
01:19:32.540 | We were able to sufficiently evolve cognitively
01:19:36.580 | through the hardware explosion
01:19:38.460 | and sufficiently evolve scientifically
01:19:41.220 | through the software explosion
01:19:43.060 | to basically get at breaking through the matrix,
01:19:45.820 | realizing that we live in a matrix
01:19:47.820 | and realizing that we are this thing in there.
01:19:51.220 | And yet that thing in there has a consciousness
01:19:53.780 | that lives through all these layers.
01:19:56.060 | And I think we're the only species.
01:19:58.480 | We are the only thing that we even can think of
01:20:00.460 | that has actually done that,
01:20:01.620 | has sort of permeated space and time,
01:20:06.620 | scales and layers of abstraction,
01:20:11.940 | plowing through them
01:20:13.780 | and realizing what we're really, really made of.
01:20:16.900 | And the next frontier is of course, cognition.
01:20:20.020 | So we understand so much of the cosmos,
01:20:22.540 | so much of the stuff around us,
01:20:24.320 | but the stuff inside here,
01:20:26.100 | finding the basis for the soul,
01:20:28.060 | finding the basis for the ego,
01:20:30.100 | for the self, the self-awareness.
01:20:32.900 | When does the spark happen
01:20:35.180 | that basically sort of makes you, you?
01:20:38.820 | I mean, that's, you know, really the next frontier.
01:20:41.060 | So in terms of these peeling off layers of complexity,
01:20:45.000 | somewhere between the cognitive layer
01:20:47.420 | and the reasoning layer or the computational layer,
01:20:52.420 | there's still some stuff to be figured out there.
01:20:54.460 | And I think that's the final frontier
01:20:56.020 | of sort of completing our journey through that matrix.
01:20:59.060 | - And maybe duplicating it
01:21:00.740 | in other versions of ourselves through AI,
01:21:05.540 | which is another very exciting possibility.
01:21:08.660 | - What I love about AI
01:21:10.420 | and the way that it operates right now
01:21:12.700 | is the fact that it is unpredictable.
01:21:14.940 | There's emergent behavior
01:21:18.400 | in our cognitively capable artificial systems
01:21:22.240 | that we can certainly model,
01:21:26.580 | but we don't encode directly.
01:21:30.000 | And that's a key difference.
01:21:32.480 | So we like to say, oh, of course,
01:21:35.440 | this is not really intelligent because we coded it up.
01:21:38.900 | And we've just put in these little parameters there
01:21:41.480 | and there's like, you know, what, 175 billion parameters.
01:21:43.720 | And once you've learned them, you know,
01:21:45.680 | we kind of understand the layers.
01:21:48.160 | But that's an oversimplification.
01:21:50.400 | It's like saying, oh, of course, humans,
01:21:53.280 | we understand humans.
01:21:54.200 | They're just made out of neurons and, you know,
01:21:56.680 | layers of cortex and there's a visual area.
01:22:00.440 | But every human is encoded
01:22:04.200 | by a ridiculously small number of genes
01:22:06.400 | compared to the complexity of our cognitive apparatus.
01:22:09.560 | 20,000 genes is really not that much
01:22:11.560 | out of which a tiny little fraction
01:22:13.440 | are in fact encoding all of our cognitive functions.
01:22:16.400 | The rest is emergent behavior.
01:22:19.760 | The rest is the, you know,
01:22:22.760 | the cortical layers doing their thing
01:22:26.760 | in the same way that when we build, you know,
01:22:29.360 | these conversational systems or these cognitive systems
01:22:32.160 | or these deep learning systems,
01:22:34.600 | we put the architecture in place,
01:22:36.160 | but then they do their thing.
01:22:37.640 | And in some ways, that's creating something
01:22:39.960 | that has its own identity.
01:22:41.600 | That's creating something that's not just,
01:22:43.600 | oh yeah, it's not the early AI
01:22:45.840 | where if you hadn't programmed what happens
01:22:47.640 | in the grocery bags when you have both cold and hot
01:22:50.760 | and hard and soft, you know,
01:22:53.480 | the system wouldn't know what to do.
01:22:54.560 | No, no, you basically now just program the primitives
01:22:57.520 | and then it learns from that.
01:22:59.480 | - So even though the origins are humble,
01:23:01.200 | just like it is for genetic code, for AI,
01:23:04.000 | even though the origins are humble,
01:23:07.200 | the result of it being deployed into the world
01:23:11.400 | is infinitely complex.
01:23:13.800 | And yet,
01:23:18.800 | it's not yet able to be cognizant of all the other layers
01:23:23.200 | it sits on, you know,
01:23:26.640 | it's not able to think about space and time.
01:23:32.760 | It's not able to think about the hardware
01:23:35.040 | on which it runs, the electricity on which it runs yet.
01:23:38.440 | - So if you look at humans,
01:23:41.080 | we basically have the same cognitive architecture
01:23:43.280 | as monkeys, as the great apes.
01:23:45.640 | It's just a ton more of it.
01:23:48.200 | If you look at GPT-3 versus GPT-2,
01:23:53.080 | again, it's the same architecture, just more of it.
01:23:56.400 | And yet it's able to do so much more.
01:23:59.320 | So if you start thinking about sort of
01:24:00.760 | what's the future of that, GPT-4 and GPT-5,
01:24:04.040 | do you really need fundamentally different architectures
01:24:06.240 | or do you just need a ton more hardware?
01:24:08.760 | And we do have a ton more hardware.
01:24:11.320 | Like these systems are nowhere near
01:24:13.520 | what humans have between our ears.
01:24:16.480 | So, you know, there's something to be said about
01:24:20.560 | stay tuned for emergent behavior.
01:24:23.680 | We keep thinking that general intelligence
01:24:25.480 | might just be forever away,
01:24:28.440 | but it could just simply be that
01:24:30.520 | we just need a ton more hardware
01:24:32.600 | and that humans are just not that different
01:24:34.360 | from the great apes, except for just a ton more of it.
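(To put some rough numbers on the "same architecture, just more of it" point, here's a back-of-the-envelope sketch. It assumes the standard approximation that a GPT-style decoder's parameter count is dominated by about 12 x layers x width squared, with the layer and width figures taken from the published GPT-2 XL and GPT-3 configurations:)

```python
def approx_params(n_layers: int, d_model: int) -> int:
    """Rough parameter count of a GPT-style transformer:
    each block holds ~12 * d_model^2 weights (attention + MLP)."""
    return 12 * n_layers * d_model ** 2

# Published configurations: (layers, width)
gpt2_xl = approx_params(48, 1600)    # ~1.5e9,  matches GPT-2 XL's ~1.5B
gpt3    = approx_params(96, 12288)   # ~1.7e11, matches GPT-3's ~175B

print(f"GPT-2 XL ~ {gpt2_xl:.1e} parameters")
print(f"GPT-3    ~ {gpt3:.1e} parameters")
print(f"ratio    ~ {gpt3 / gpt2_xl:.0f}x")  # same blocks, ~100x the scale
```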
01:24:37.960 | - Yeah, it's interesting that in the AI community,
01:24:41.760 | maybe there's a human-centric fear,
01:24:44.240 | but the notion that GPT-10 will achieve general intelligence
01:24:49.240 | is something that people shy away from,
01:24:51.840 | that there has to be something totally different
01:24:54.200 | and new added to this.
01:24:56.120 | And yet it's not seriously considered that
01:25:01.000 | this very simple thing, this very simple architecture,
01:25:05.040 | when scaled, might be the thing
01:25:07.360 | that achieves super intelligence.
01:25:09.000 | - And people think the same way about humanity
01:25:11.480 | and human consciousness.
01:25:13.080 | They're like, "Oh, consciousness might be quantum
01:25:15.080 | "or it might be some non-physical thing."
01:25:18.120 | And it's like, or it could just be a lot more
01:25:21.560 | of the same hardware that now is sufficiently capable
01:25:25.760 | of self-awareness just because it has the neurons to do it.
01:25:29.320 | So maybe the consciousness that is so elusive
01:25:32.720 | is an emergent behavior of basically stringing together
01:25:38.200 | all these cognitive capabilities that come from running,
01:25:41.240 | from seeing, from reacting,
01:25:43.160 | from predicting the movement of a fly
01:25:45.560 | as you're catching it through the air.
01:25:47.240 | All of these things are just like great lookup tables
01:25:49.440 | encoded in a giant neural network.
01:25:51.320 | I mean, I'm oversimplifying, of course,
01:25:52.960 | the complexity and the diversity
01:25:54.760 | of the different types of excitatory and inhibitory neurons,
01:25:57.160 | the waveforms that sort of shine through the connections
01:26:02.160 | across all these different layers,
01:26:04.640 | the amalgamation of signals, et cetera.
01:26:06.800 | The brain is enormously complex.
01:26:08.400 | I mean, of course, but again,
01:26:09.920 | it's a small number of primitives
01:26:11.520 | encoded by a tiny number of genes,
01:26:14.200 | which are self-organized and shaped by their environment.
01:26:19.200 | Babies that are growing up today are listening to language
01:26:26.360 | from conception.
01:26:28.400 | Basically, as soon as the auditory apparatus forms,
01:26:32.200 | it's already getting shaped to the types of signals
01:26:35.040 | that are out in the real world today.
01:26:37.120 | So it's not just like, oh, have an Egyptian be born
01:26:39.240 | and then ship them over.
01:26:40.440 | It's like, no, that Egyptian would be listening in
01:26:44.200 | to the complexity of the world and then getting born
01:26:46.480 | and sort of seeing just how much more complex the world is.
01:26:49.520 | So it's a combination of the underlying hardware,
01:26:53.840 | which if you think about as a geneticist,
01:26:57.440 | in my view, the hardware gives you an upper bound
01:27:00.000 | of cognitive capabilities,
01:27:02.160 | but it's the environment that makes those capabilities shine
01:27:05.160 | and reach their maximum.
01:27:06.880 | So we're a combination of nature and nurture.
01:27:10.960 | The nature is our genes and our cognitive apparatus,
01:27:15.240 | and the nurture is the richness of the environment
01:27:18.520 | that makes that cognitive apparatus reach its potential.
01:27:22.040 | And we are so far from reaching our full potential, so far.
01:27:27.040 | I think that kids being born a hundred years from now,
01:27:30.960 | they'll be looking at us now and saying
01:27:33.400 | what primitive educational systems they had.
01:27:36.400 | I can't believe people were not wired into this,
01:27:39.200 | you know, virtual reality from birth as we are now,
01:27:42.640 | 'cause like they're clearly inferior and so on and so forth.
01:27:46.000 | So I basically think that our environment
01:27:48.000 | will continue exploding and our cognitive capabilities,
01:27:53.000 | it's not like, oh, we're only using 10% of our brain.
01:27:55.200 | That's ridiculous.
01:27:56.040 | Of course, we're using a hundred percent of our brain,
01:27:57.600 | but it's still constrained
01:27:59.880 | by how complex our environment is.
01:28:02.080 | - So the hardware will remain the same,
01:28:03.800 | but the software in a quickly advancing environment,
01:28:08.680 | the software will make a huge difference
01:28:10.760 | in the nature of like the human experience,
01:28:14.280 | the human condition.
01:28:15.480 | It's fascinating to think that humans
01:28:17.120 | will look very different a hundred years from now,
01:28:19.440 | just because the environment changed,
01:28:20.720 | even though we're still the same great apes,
01:28:22.960 | the descendant of apes.
01:28:25.440 | At the core of this is kind of a notion of ideas.
01:28:28.600 | A lot of people have spoken,
01:28:32.240 | including you, eloquently about this topic,
01:28:34.960 | but Richard Dawkins talks about the notion of memes
01:28:39.480 | and let's say this notion of ideas,
01:28:45.120 | you know, multiplying, selecting in the minds of humans.
01:28:49.120 | Do you ever think about ideas from that perspective,
01:28:52.440 | ideas as organisms themselves
01:28:54.720 | that are breeding in the minds of humans?
01:28:57.600 | - I love the concept of memes.
01:28:59.520 | I love the concept of these horizontal transfer of ideas
01:29:03.640 | and sort of permeating through, you know,
01:29:06.520 | our layer of interconnected neural networks.
01:29:11.160 | So you can think of sort of the cognitive space
01:29:15.160 | that has now connected all of humanity,
01:29:18.040 | where we are now one giant information
01:29:22.480 | and idea sharing network,
01:29:24.840 | well beyond what was thought to be ever capable
01:29:28.880 | when the concept of a meme was created by Richard Dawkins.
01:29:32.600 | So, but I want to take that concept just to,
01:29:36.560 | you know, into another twist,
01:29:39.240 | which is the horizontal transfer of humans with fellowships.
01:29:44.240 | And the fact that as people apply to MIT
01:29:51.120 | from around the world,
01:29:53.080 | there's a selection that happens,
01:29:55.680 | not just for their ideas,
01:29:58.240 | but also for the cognitive hardware
01:30:00.640 | that came up with those ideas.
01:30:02.960 | So we don't just ship ideas around anymore.
01:30:05.280 | They don't evolve in a vacuum.
01:30:07.240 | The ideas themselves influence
01:30:10.120 | the distribution of cognitive systems,
01:30:12.520 | i.e. humans and brains, around the planet.
01:30:15.520 | - Yeah, we ship them to different locations
01:30:17.160 | based on their properties.
01:30:18.360 | - That's exactly right.
01:30:19.640 | So those cognitive systems that think of, you know,
01:30:23.440 | physics, for example, might go to CERN,
01:30:26.120 | and those that think of genomics
01:30:28.320 | might go to the Broad Institute,
01:30:30.480 | and those that think of computer science
01:30:32.120 | might go to, I don't know, Stanford or CMU or MIT.
01:30:35.840 | And you basically have this co-evolution now
01:30:38.680 | of memes and ideas,
01:30:41.240 | and the cognitive conversational systems
01:30:44.680 | that love these ideas and feed on these ideas
01:30:47.240 | and understand these ideas and appreciate these ideas,
01:30:50.160 | now coming together.
01:30:52.080 | So you basically have students coming to Boston to study,
01:30:56.000 | because that's the place where
01:30:57.320 | these type of cognitive systems thrive.
01:30:59.280 | And they're selected based on their cognitive output
01:31:05.600 | and their idea output.
01:31:08.000 | But once they get into that place,
01:31:10.640 | the boiling and interbreeding of these memes
01:31:15.000 | becomes so much more frequent.
01:31:17.680 | That what comes out of it is so far beyond
01:31:21.560 | if ideas were evolving in a vacuum
01:31:24.000 | of an already established hardware
01:31:25.840 | cognitive interconnection system of the planet,
01:31:28.560 | where now you basically have the ideas
01:31:32.800 | shaping the distribution of these systems,
01:31:35.080 | and then the genetics kick in as well.
01:31:37.680 | You basically have now these people
01:31:40.080 | who came to be a student, kind of like myself,
01:31:42.000 | who now stuck around and are now professors,
01:31:45.240 | bringing up our own genetically encoded
01:31:49.880 | and genetically related cognitive systems.
01:31:53.280 | Mine are eight, six, and three years old,
01:31:56.080 | who are now growing up in an environment
01:31:58.000 | surrounded by other cognitive systems of a similar age,
01:32:02.440 | with parents who love these types of thinking and ideas.
01:32:06.320 | And you basically have a whole interbreeding now
01:32:09.200 | of genetically selected transfer of cognitive systems,
01:32:14.200 | where the genes and the memes are co-evolving
01:32:21.320 | in the same soup of ever-improving knowledge
01:32:25.280 | and societal cross-fertilization
01:32:30.160 | of these ideas.
01:32:31.800 | - So this beautiful image,
01:32:34.760 | so these are shipping these actual
01:32:37.000 | meat cognitive systems to physical locations.
01:32:39.480 | They tend to cluster in,
01:32:42.360 | the biology ones cluster in a certain building too,
01:32:45.280 | so within that there's clusters on top of clusters
01:32:49.480 | on top of clusters.
01:32:50.320 | What about in the online world?
01:32:52.680 | Do you also see that kind of thing there,
01:32:55.280 | because people now form groups on the internet
01:32:58.880 | that stick together,
01:33:00.120 | so that, sort of,
01:33:03.760 | these cognitive systems can collect themselves
01:33:08.600 | and breed together in different layers of spaces?
01:33:13.600 | It doesn't just have to be physical space.
01:33:15.880 | - Absolutely, absolutely.
01:33:17.120 | So basically there's the physical rearrangement,
01:33:19.680 | but there's also the conglomeration
01:33:21.800 | of the same cognitive system.
01:33:24.280 | A cognitive system, i.e. a human,
01:33:25.800 | (both laughing)
01:33:26.760 | doesn't need to belong to only one community.
01:33:29.120 | So yes, you might be a member
01:33:30.320 | of the computer science department,
01:33:31.560 | but you can also hang out in the biology department,
01:33:33.760 | but you might also go online into,
01:33:35.800 | I don't know, poetry department readings and so on so forth,
01:33:38.640 | or you might be part of a group
01:33:40.600 | that only has 12 people in the world,
01:33:42.800 | but that are connected through their ideas
01:33:45.280 | and are now interbreeding these ideas in a whole other way.
01:33:49.360 | So this co-evolution of genes and memes
01:33:53.160 | is not just physically instantiated,
01:33:55.240 | it's also sort of rearranged
01:33:58.000 | in this cognitive space as well.
01:34:01.720 | - And sometimes these cognitive systems hold conferences
01:34:05.080 | and they all gather around
01:34:06.800 | and there's like one of them is like talking
01:34:09.040 | and they're all like listening and then they discuss
01:34:11.320 | and then they have free lunch and so on.
01:34:13.120 | - No, but then that's where you find students
01:34:15.960 | where, you know, when I go to a conference,
01:34:18.240 | I go through the posters where I'm on a mission.
01:34:20.920 | Basically my mission is to read
01:34:22.880 | and understand what every poster is about.
01:34:25.080 | And for a few of them,
01:34:25.920 | I'll dive deeply and understand everything,
01:34:27.640 | but I make it a point to just go poster after poster
01:34:29.920 | in order to read all of them.
01:34:31.800 | And I find some gems and students that I speak to
01:34:35.440 | that sometimes eventually join my lab.
01:34:37.800 | And then you're sort of creating this permeation
01:34:41.800 | of, you know, the transfer of ideas,
01:34:46.400 | of ways of thinking,
01:34:48.280 | and very often of moral values, of social structures,
01:34:52.960 | of, you know, just more imperceptible properties
01:34:57.960 | of these cognitive systems that simply just cling together.
01:35:02.680 | Basically, you know, there's,
01:35:04.760 | I have the luxury at MIT of not just choosing smart people,
01:35:09.120 | but choosing smart people who I get along with,
01:35:12.680 | who are generous and friendly and creative and smart
01:35:17.680 | and, you know, excited and childish
01:35:22.280 | in their uninhibited behaviors and so on and so forth.
01:35:26.000 | So you basically can choose yourself to surround,
01:35:29.800 | you can choose to surround yourself
01:35:31.240 | with people who are not only cognitively compatible,
01:35:36.120 | but also, you know, imperceptibly
01:35:39.760 | through the meta-cognitive systems compatible.
01:35:43.560 | And again, when I say compatible, not all the same.
01:35:46.720 | Sometimes, you know, not sometimes, all the time,
01:35:50.560 | the teams are made out of complementary components,
01:35:53.520 | not just compatible, but very often complementary.
01:35:56.240 | So in my own team, I have a diversity of students
01:35:59.600 | who come from very different backgrounds.
01:36:01.920 | There's a whole spectrum of biology
01:36:03.480 | to computation, of course,
01:36:04.880 | but within biology, there's a lot of realms,
01:36:06.680 | within computation, there's a lot of realms.
01:36:08.840 | And what makes us click so well together
01:36:13.360 | is the fact that not only do we have a common mission,
01:36:16.640 | a common passion, and a common, you know,
01:36:20.360 | view of the world, but that we're complementary
01:36:24.600 | in our skills, in the angles with which we approach problems,
01:36:27.480 | and so on and so forth,
01:36:28.320 | and that's sort of what makes it click.
01:36:29.720 | - Yeah, it's fascinating that the stickiness
01:36:32.880 | of multiple cognitive systems together
01:36:36.040 | includes both the commonality,
01:36:38.040 | so you meet because there's some common thing,
01:36:40.800 | but you stick together because you're different
01:36:45.240 | in all the useful ways.
01:36:46.760 | - Yeah, yeah, and my wife and I,
01:36:48.440 | I mean, we adore each other, like, to pieces,
01:36:51.840 | but we're also extremely different in many ways.
01:36:54.320 | - Careful. - And that's beautiful.
01:36:55.920 | - She's gonna be listening to this.
01:36:57.120 | - But I love that about us.
01:36:59.040 | I love the fact that, you know,
01:37:00.760 | I'm like living out there in the world of ideas
01:37:03.600 | and I forget what day it is,
01:37:05.600 | and she's like, "Well, at 8 a.m.,
01:37:07.040 | the kids better be at school."
01:37:08.200 | (both laughing)
01:37:09.280 | And, you know, I do get yelled at,
01:37:13.920 | but I need it.
01:37:15.480 | Basically, I need her as much as she needs me,
01:37:18.120 | and she loves interacting with me and talking.
01:37:20.200 | I mean, you know, last night, we were talking about this,
01:37:23.560 | and I showed her the questions,
01:37:24.600 | and we were bouncing ideas off each other,
01:37:26.560 | and it was just beautiful.
01:37:28.040 | Like, we basically have these, you know,
01:37:30.120 | basically cognitive, you know,
01:37:34.000 | let it all loose kind of dates,
01:37:36.120 | where, you know, we just bring papers,
01:37:38.280 | and we're like, you know, bouncing ideas, et cetera.
01:37:41.040 | So, you know, we have extremely different perspectives,
01:37:44.560 | but very common, you know, goals and interests, and anyway.
01:37:49.560 | - What do you make of the communication mechanism
01:37:52.240 | that we humans use to share those ideas?
01:37:54.160 | 'Cause like one essential element of all of this
01:37:57.320 | is not just that we're able to have these ideas,
01:38:01.520 | but we're also able to share them.
01:38:03.760 | We tend to, maybe you can correct me,
01:38:06.080 | but we seem to use language to share the ideas.
01:38:10.080 | Maybe we share them in some much deeper way than language,
01:38:12.600 | I don't know.
01:38:13.440 | But what do you make of this whole mechanism,
01:38:15.800 | and how fundamental it is to the human condition?
01:38:18.720 | - So some people will tell you
01:38:20.120 | that your language dictates your thoughts,
01:38:23.160 | and your thoughts cannot form outside language.
01:38:26.280 | I tend to disagree.
01:38:27.920 | I see thoughts as much more abstract,
01:38:32.920 | as, you know, basically when I dream,
01:38:35.240 | I don't dream in words, I dream in shapes, and forms,
01:38:38.360 | and, you know, three-dimensional space with extreme detail.
01:38:42.400 | I was describing, so when I wake up
01:38:44.320 | in the middle of the night, I actually record my dreams.
01:38:47.160 | Sometimes I write them down in a Dropbox file.
01:38:50.040 | Other times I'll just dictate them in, you know, audio.
01:38:53.940 | And my wife was giving me a massage the other day,
01:38:57.440 | 'cause like my left side was frozen,
01:39:00.000 | and I started playing the recording.
01:39:02.760 | And as I was listening to it, I was like,
01:39:05.440 | I don't remember any of that.
01:39:06.280 | And I was like, of course!
01:39:08.000 | And then the entire thing came back.
01:39:10.240 | But then there's no way any other person
01:39:12.400 | could have recreated that entire, sort of,
01:39:15.480 | you know, three-dimensional shape, and dream, and concept.
01:39:20.200 | And in the same way, when I'm thinking of ideas,
01:39:22.560 | there's so many ideas I can't put to words.
01:39:25.320 | I mean, I will describe them with a thousand words,
01:39:27.120 | but the idea itself is much more precise,
01:39:29.880 | or much more sort of abstract,
01:39:31.520 | or much more something, you know, different.
01:39:34.080 | It's either less abstract or more abstract,
01:39:36.400 | and it's either, you know, basically,
01:39:39.080 | there's a projection that happens
01:39:42.760 | from the three-dimensional ideas
01:39:44.640 | into, let's say, a one-dimensional language.
01:39:46.880 | And the language certainly gives you the apparatus
01:39:51.040 | to think about concepts
01:39:52.460 | that you didn't realize existed before.
01:39:54.960 | And with my team, we often create new words.
01:39:57.640 | I'm like, well, now we're gonna call this
01:39:59.440 | the regulatory plexus of a gene.
01:40:01.360 | And that gives us now the language
01:40:02.920 | to sort of build on that as one concept
01:40:05.800 | that you then build upon with all kinds of other things.
01:40:09.080 | So there's this co-evolution again of ideas and language,
01:40:13.120 | but they're not one-to-one with each other.
01:40:16.520 | Now let's talk about language itself, words, sentences.
01:40:21.520 | This is a very distant construct
01:40:25.200 | from where language actually began.
01:40:27.520 | So if you look at how we communicate,
01:40:29.520 | as I'm speaking, my eyes are shining,
01:40:34.120 | and my face is changing through all kinds of emotions,
01:40:37.120 | and my entire body composition and posture are reshaped,
01:40:42.120 | and my intonation, the pauses that I make,
01:40:45.360 | the softer and the louder and the this and that
01:40:48.360 | are conveying so much more information.
01:40:51.800 | And if you look at early human language,
01:40:55.300 | and if you look at how, you know,
01:40:57.640 | the great apes communicate with each other,
01:40:59.520 | there's a lot of grunting, there's a lot of posturing,
01:41:01.360 | there's a lot of emotions,
01:41:02.400 | there's a lot of sort of shrieking, et cetera.
01:41:04.760 | They have a lot of components of our human language,
01:41:09.760 | just not the words.
01:41:11.560 | So I think of human communication
01:41:15.960 | as combining the ape component,
01:41:21.420 | but also of course, the, you know, GPT-3 component.
01:41:25.220 | So basically there's the cognitive layer
01:41:27.280 | and the reasoning layer that we share
01:41:30.120 | with different parts of our relatives.
01:41:32.680 | There's the AI relatives,
01:41:34.280 | but there's also the grunting relatives.
01:41:36.840 | And what I love about humanity is that we have both.
01:41:40.040 | We're not just a conversational system.
01:41:42.400 | We're a grunting, emotionally charged, you know,
01:41:46.800 | weirdly interconnected system
01:41:51.080 | that also has the ability to reason.
01:41:53.600 | And when we communicate with each other,
01:41:56.620 | there's so much more than just language.
01:41:59.000 | There's so much more than just words.
01:42:01.160 | - It does seem like we're able to somehow transfer
01:42:03.760 | even more than the body language.
01:42:06.680 | It seems that in the room with us
01:42:08.800 | is always a giant knowledge base of like shared experiences,
01:42:13.800 | different perspectives on those experiences,
01:42:16.520 | but I don't know, the knowledge of who the last three,
01:42:20.200 | four presidents of the United States were,
01:42:22.240 | and just, you know, the tragedies of 9/11,
01:42:25.680 | all the beautiful and terrible things
01:42:29.120 | that happened in the world,
01:42:29.960 | they're somehow both in our minds
01:42:32.640 | and somehow enrich the ability to transfer information.
01:42:37.080 | - What I love about it is I can talk to you
01:42:39.080 | about 2001 Audience of Space
01:42:40.440 | and mention a very specific scene,
01:42:41.920 | and that evokes all these feelings that you had
01:42:44.520 | when you first watched it.
01:42:45.520 | - We're both visualizing that, maybe in different ways.
01:42:48.200 | - Exactly.
01:42:49.040 | - But in the, yeah.
01:42:50.400 | And not only that, but the feeling is brought back,
01:42:55.400 | brought back up, just like you said, with the dreams.
01:42:58.160 | We both have that feeling arise in some form
01:43:01.160 | as you bring up, you know, Hal
01:43:04.600 | facing his own mortality.
01:43:07.160 | It's fascinating that we're able to do that,
01:43:09.280 | but I don't know.
01:43:10.600 | - Now let's talk about Neuralink for a second.
01:43:12.520 | So what's the concept of Neuralink?
01:43:14.600 | The concept of Neuralink is that I'm gonna take
01:43:17.080 | whatever knowledge is encoded in my brain,
01:43:19.720 | directly transfer it into your brain.
01:43:22.140 | So this is a beautiful, fascinating,
01:43:25.240 | and extremely sort of, you know, appealing concept,
01:43:29.180 | but I see a lot of challenges surrounding that.
01:43:32.180 | The first one is we have no idea
01:43:34.900 | how to even begin to understand
01:43:36.780 | how knowledge is encoded in a person's brain.
01:43:40.140 | I mean, I told you about this paper that we had recently
01:43:41.900 | with Li-Hui Tsai and Asaf Marko,
01:43:45.420 | that basically was looking at these engrams
01:43:47.980 | that are formed with combinations of neurons
01:43:50.620 | that co-fire when a stimulus happens,
01:43:53.180 | where we can go into a mouse
01:43:54.300 | and select those neurons that fire by marking them,
01:43:56.820 | and then see what happens when they first fire,
01:43:58.620 | and then select the neurons that fire again
01:44:00.820 | when the experience is repeated.
01:44:02.780 | These are the recall neurons,
01:44:04.900 | and then there's the memory consolidation neurons.
01:44:07.660 | So we're starting to understand a little bit
01:44:09.940 | of sort of the distributed nature of knowledge encoding
01:44:14.140 | and experience encoding in the human brain
01:44:16.600 | and in the mouse brain.
01:44:18.020 | And the concept that we'll understand that sufficiently
01:44:22.460 | one day to be able to take a snapshot
01:44:26.700 | of that scene, of Hal losing his mind
01:44:31.700 | and talking to Dave,
01:44:34.340 | how is that scene encoded in your mind?
01:44:39.440 | Imagine the complexity of that.
01:44:41.440 | But now imagine, suppose that we solve this problem,
01:44:45.840 | and the next enormous challenge is how do I go
01:44:48.840 | and modify the next person's brain
01:44:51.080 | to now create the same exact neural connections?
01:44:54.360 | So that's an enormous challenge right there.
01:44:56.360 | So basically it's not just reading, it's now writing.
01:44:59.000 | And again, what if something goes wrong?
01:45:02.200 | I don't wanna even think about that.
01:45:04.120 | That's number two.
01:45:05.200 | And number three, who says that the way that you encode,
01:45:09.600 | "Dave, I'm losing my mind,"
01:45:11.560 | and I encode, "Dave, I'm losing my mind,"
01:45:14.760 | is anywhere near each other.
01:45:17.320 | Basically, maybe the way that I'm encoding it
01:45:20.040 | is twisted with my childhood memories
01:45:23.320 | of running through the pebbles in Greece,
01:45:27.120 | and yours is twisted with your childhood memories
01:45:29.320 | of growing up in Russia.
01:45:31.200 | And there's no way that I can take my encoding
01:45:34.720 | and put it into your brain,
01:45:35.840 | 'cause it'll A, mess things up,
01:45:38.120 | and B, be incompatible with your own unique experiences.
01:45:43.000 | - So that's telepathic communications.
01:45:44.540 | From human to human, that's fascinating.
01:45:46.240 | You're reminding us that there's two biological systems
01:45:51.240 | on both ends of that communication.
01:45:54.000 | The easier thing to do, I guess,
01:45:56.280 | maybe half as difficult,
01:45:59.680 | and the hope with Neuralink, is that
01:46:01.860 | we can communicate with an AI system.
01:46:04.440 | So where one side of that is a little bit more controllable,
01:46:08.820 | but even just that is exceptionally difficult.
01:46:13.320 | - Let's talk about two neuronal systems talking to each other.
01:46:16.840 | Suppose that GPT-4 tells GPT-3,
01:46:19.240 | "Hey, give me all your knowledge."
01:46:21.320 | Right? - Yeah, that's hilarious.
01:46:23.000 | - I have 10 times more hardware, I'm ready, just feed me.
01:46:26.040 | What's GPT-3 gonna do?
01:46:27.320 | Is it gonna say, "Oh, here's my 175 billion parameters?"
01:46:30.920 | - No. - No way.
01:46:32.720 | The simplest way, and perhaps the fastest way,
01:46:35.080 | for GPT-3 to transfer all its knowledge to its newer body
01:46:37.960 | that has a lot more hardware,
01:46:39.860 | is to regenerate every single possible human sentence
01:46:44.860 | that it can possibly create.
01:46:46.780 | - Just keep talking.
01:46:48.140 | - Keep talking, and just re-encode it all together.
01:46:50.820 | So maybe what language does is exactly that.
01:46:53.420 | It's taking one generative cognitive model,
01:46:56.780 | it's running it forward to emit utterances
01:46:59.660 | that kind of make sense in my cognitive frame,
01:47:01.820 | and it's re-encoding them into yours
01:47:04.100 | through the parsing of that same language.
01:47:07.300 | - And I think the conversation
01:47:08.580 | might actually be the most efficient way to do it.
01:47:11.180 | So not just talking, but interactive.
01:47:14.580 | So talking back and forth, asking questions, interrupting.
01:47:18.380 | - So GPT-4 will constantly be interrupted.
01:47:20.780 | - Interrupted, just annoying.
01:47:22.660 | Annoying it. - Annoyingly.
01:47:24.260 | - Yeah.
01:47:25.100 | - But the beauty of that is also that
01:47:27.740 | as we're interrupting each other,
01:47:29.780 | there's all kinds of misinterpretations that happen.
01:47:33.460 | That, you know, basically when my students speak,
01:47:36.620 | I will often know that I'm misunderstanding
01:47:38.940 | what they're saying, and I'll be like,
01:47:41.380 | hold that thought for a second.
01:47:43.060 | Let me tell you what I think I understood,
01:47:44.820 | which I know is different from what you said.
01:47:46.740 | Then I'll say that, and then someone else
01:47:48.980 | in the same Zoom meeting will basically say,
01:47:51.580 | well, you know, here's another way to think
01:47:54.100 | about what you just said.
01:47:55.500 | And then by the third iteration,
01:47:57.260 | we're somewhere completely different,
01:47:59.220 | such that if we could actually communicate
01:48:01.940 | with full neural network parameters back and forth,
01:48:06.060 | that transfer of knowledge and ideas and encodings
01:48:09.060 | would be far inferior, because the re-encoding,
01:48:14.020 | with our own, as we said last time,
01:48:16.380 | emotional baggage and cognitive baggage
01:48:18.860 | from our unique experiences, through our shared experiences,
01:48:23.860 | of distinct encodings in the context
01:48:27.940 | of all our unique experiences, is leading
01:48:30.940 | to so much more diversity of perceptions
01:48:35.940 | and perspectives, and again, going back
01:48:38.060 | to this whole concept of this entire network
01:48:41.940 | of all of human cognitive systems connected
01:48:44.380 | to each other and sort of how ideas
01:48:47.060 | and memes permeate through that,
01:48:49.060 | that's sort of what really creates a whole new level
01:48:52.180 | of human experience through this reasoning layer
01:48:57.180 | and this computational layer that obviously lives
01:49:02.740 | on top of our cognitive layer.
01:49:04.860 | - So you're one of these aforementioned cognitive systems,
01:49:09.860 | mortal but thoughtful, and you're connected
01:49:16.500 | to a bunch, like you said, students,
01:49:18.980 | your wife, your kids.
01:49:21.140 | What do you, in your brief time here on Earth,
01:49:24.740 | this is a Meaning of Life episode,
01:49:27.240 | so what do you hope this world will remember you as?
01:49:32.780 | What do you hope your legacy will be?
01:49:34.680 | - I don't think of legacy as much as maybe most people.
01:49:41.820 | - I thought all Greeks think of legacy.
01:49:44.100 | - Oh, it's kind of funny.
01:49:44.940 | I'm consciously living the present.
01:49:47.840 | Many students tell me, oh, give us some career advice.
01:49:51.060 | I'm like, I'm the wrong person.
01:49:52.520 | I've never made a career plan.
01:49:54.300 | I still have to make one.
01:49:55.540 | (laughing)
01:49:59.260 | It's funny to be both experiencing the past
01:50:03.500 | and the present and the future,
01:50:05.580 | but also consciously living in the present
01:50:08.220 | and just, there's a conscious decision we can make
01:50:12.020 | to not worry about all that,
01:50:13.980 | which again goes back to the I'm the lucky one
01:50:16.740 | kind of thing.
01:50:17.580 | (laughing)
01:50:19.540 | Of living in the present and being happy winning
01:50:22.960 | and being happy losing,
01:50:24.260 | and there's a certain freedom that comes with that,
01:50:28.500 | but again, a certain sort of, I don't know,
01:50:32.460 | ephemerality of living for the present.
01:50:35.800 | But if you step back from all of that,
01:50:39.700 | where basically my current modus operandi
01:50:44.580 | is live for the present,
01:50:46.340 | make every day the best you can make,
01:50:50.460 | and just make this local blip a local maximum
01:50:55.260 | of the universe, of the awesomeness of the planet
01:50:58.700 | and the town and the family that we live in,
01:51:02.020 | both academic family and biological family.
01:51:05.820 | Make it a little more awesome
01:51:08.660 | by being generous to your friends,
01:51:09.860 | being generous to the people around you,
01:51:11.260 | being kind to your enemies,
01:51:13.580 | and just showing love all around.
01:51:17.700 | You can't be upset at people if you truly love them.
01:51:21.500 | If somebody yells at you and insults you
01:51:23.340 | every time you say the slightest thing,
01:51:25.740 | and yet when you see them, you just see them with love,
01:51:28.900 | it's a beautiful feeling.
01:51:31.620 | It's like, I'm feeling exactly like
01:51:34.580 | when I look at my three-year-old who's screaming.
01:51:37.580 | Even though I love her and I want her good,
01:51:39.700 | she's still screaming and saying, "No, no, no, no, no."
01:51:42.940 | And I'm like, "I love you, genuinely love you."
01:51:47.060 | But I can sort of kind of see that your brain
01:51:49.580 | is kind of stuck in that little mode of anger.
01:51:53.860 | And there's plenty of people out there who don't like me,
01:51:58.260 | and I see them with love as a child
01:52:01.460 | that is stuck in a cognitive state
01:52:04.540 | that they're eventually gonna snap out of,
01:52:06.980 | or maybe not, and that's okay.
01:52:09.020 | So there's that aspect of sort of experiencing
01:52:13.020 | life with the best intentions.
01:52:17.020 | And I love when I'm wrong.
01:52:20.420 | I had a friend who was like
01:52:21.980 | one of the smartest people I've ever met,
01:52:23.780 | who would basically say, "Oh, I love it when I'm wrong
01:52:26.300 | "'cause it makes me feel human."
01:52:27.940 | (Lex laughs)
01:52:29.740 | And it's so beautiful.
01:52:31.140 | I mean, she's really one of the smartest people
01:52:32.700 | I've ever met, and she was like, "Oh, it's such a good feeling."
01:52:36.140 | And I love being wrong, but there's something
01:52:41.140 | about self-improvement, there's something about sort of
01:52:43.900 | how do I not make the most mistakes,
01:52:46.900 | but attempt the most rights and do the fewest wrongs,
01:52:50.860 | but with the full knowledge that this will happen.
01:52:53.140 | That's one aspect.
01:52:54.420 | So through this life in the present,
01:53:00.300 | what's really funny is, and that's something
01:53:02.700 | that I've experienced more and more,
01:53:04.740 | really, thanks to you and through this podcast,
01:53:07.820 | is this enormous number of people
01:53:09.940 | who will basically comment, "Wow, I've been following
01:53:13.060 | "this guy for so many years now,"
01:53:14.860 | or, "Wow, this guy has inspired so many of us
01:53:17.340 | "in computational biology," and so on and so forth.
01:53:19.540 | And I'm like, "I don't know any of that,
01:53:21.980 | "but I'm only discovering this now through this sort of
01:53:25.220 | "sharing our emotional states and our cognitive states
01:53:29.260 | "with a wider audience," where suddenly,
01:53:32.900 | I'm sort of realizing that, wow, maybe I've had a legacy.
01:53:36.260 | - Yes.
01:53:37.100 | - Like basically, I've trained generations
01:53:39.100 | of students from MIT, and I've put all of my courses
01:53:43.580 | freely online since 2001.
01:53:47.740 | So basically, all of my video recordings of my lectures
01:53:50.140 | have been online since 2001.
01:53:52.220 | So countless generations of people from across the world
01:53:56.100 | will meet me at a conference and say,
01:53:57.940 | I was at this conference where somebody heard my voice
01:54:01.060 | and was like, "I know this voice.
01:54:02.540 | "I've been listening to your lectures."
01:54:04.540 | And it's just such a beautiful thing
01:54:06.260 | where we're sharing widely, and who knows
01:54:11.400 | which students will get where from whatever they catch
01:54:15.460 | out of these lectures, even if what they catch
01:54:17.340 | is just inspiration and passion and drive.
01:54:20.720 | So there's this intangible, you know, legacy,
01:54:25.340 | quote unquote, that every one of us has
01:54:27.920 | through the people we touch.
01:54:29.720 | One of my friends from undergrad basically told me,
01:54:31.880 | "Oh, my mom remembers you vividly
01:54:33.560 | "from when she came to campus."
01:54:34.520 | I'm like, "I didn't even meet her."
01:54:36.080 | She's like, "No, but she sort of saw you
01:54:38.800 | "interacting with people and said,
01:54:40.040 | "Wow, he's exuding this positive energy."
01:54:43.440 | And there's that aspect of sort of just motivating people
01:54:47.040 | with your kindness, with your passion, with your generosity,
01:54:50.640 | and with your, you know, just selflessness of, you know,
01:54:55.360 | just give, it doesn't matter where it goes.
01:54:58.040 | I've been to conferences where basically people will,
01:55:01.400 | you know, I'll ask them a question,
01:55:03.160 | and then they'll come back to,
01:55:04.120 | or like there was a conference
01:55:05.480 | where I asked somebody a question, they said,
01:55:07.120 | "Oh, in fact, this entire project was inspired
01:55:09.160 | "by your question three years ago at the same conference."
01:55:11.560 | - Yes. - I'm like, "Wow."
01:55:13.480 | - And then on top of that, there's also the ripple effect.
01:55:15.560 | So you're speaking to the direct influence
01:55:17.680 | of inspiration or education,
01:55:19.200 | but there's also like the follow-on things
01:55:22.120 | that happen after that, and there's this ripple
01:55:23.960 | that goes out from you, just this one individual, the first drop.
01:55:27.120 | - And from every one of us, from everyone.
01:55:29.240 | That's what I love about humanity.
01:55:30.920 | The fact that every one of us shares genes
01:55:36.680 | and genetic variants with very recent ancestors
01:55:39.800 | with everyone else.
01:55:41.040 | So even if I die tomorrow, my genes are still shared
01:55:45.640 | through my cousins and through my uncles
01:55:47.440 | and through my, you know, immediate family.
01:55:49.960 | And of course, I'm lucky enough to have my own children,
01:55:52.740 | but even if you don't, your genes are still permeating
01:55:55.680 | through all of the layers of your family.
01:55:57.560 | - So your genes will have the legacy there, yeah.
01:56:00.240 | - Every one of us.
01:56:01.960 | - Yeah. - Number two, our ideas
01:56:03.680 | are constantly intermingling with each other.
01:56:05.720 | So there's no person living in the planet
01:56:08.760 | 100 years from now who will not be directly impacted
01:56:12.000 | by every one of the planet living here today.
01:56:14.120 | - Yeah. - Through genetic inheritance
01:56:15.880 | and through meme inheritance.
01:56:18.520 | - That's cool to think that your ideas, Manolis Kellis,
01:56:22.000 | would touch every single person on this planet.
01:56:27.000 | It's interesting. - But not just mine.
01:56:28.760 | Joe Smith, who's looking at this right now,
01:56:30.760 | his ideas will also touch everybody.
01:56:33.920 | So there's this interconnectedness of humanity.
01:56:36.320 | And then I'm also a professor, so my day job is legacy.
01:56:41.420 | My day job is training, not just the thousands of people
01:56:46.240 | who watch my videos on the web,
01:56:47.960 | but the people who are actually in my class,
01:56:50.360 | who basically come to MIT to learn from a bunch of us.
01:56:55.080 | - The cognitive systems that were shipped
01:56:57.720 | to this particular location.
01:56:59.040 | - And who will then disperse back
01:57:00.600 | into all of their home countries.
01:57:02.760 | That's what makes America the beacon of the world.
01:57:05.240 | We don't just export goods, we export people.
01:57:10.240 | - Cognitive systems.
01:57:12.200 | - We export people who are born here,
01:57:15.360 | and we also export training that people born elsewhere
01:57:19.040 | will come here to get, and will then disseminate
01:57:21.960 | not just whatever knowledge they got,
01:57:24.000 | but whatever ideals they learned.
01:57:26.200 | And I think that's a legacy of the US,
01:57:28.760 | that you cannot stop with political isolation,
01:57:31.320 | you cannot stop with economic isolation.
01:57:33.480 | That's something that will continue to happen
01:57:35.720 | through all the people we've touched through our universities.
01:57:38.320 | So there's the students who took my classes,
01:57:40.960 | who are basically now going off and teaching their classes,
01:57:44.280 | and I've trained generations of computational biologists.
01:57:46.480 | No one in genomics who's gone through MIT
01:57:48.680 | hasn't taken my class.
01:57:50.120 | So basically there's this impact through,
01:57:53.080 | I mean, there's so many people in biotechs
01:57:54.840 | who are like, "Hey, I took your class,
01:57:56.240 | "that's what got me into the field 15 years ago."
01:57:58.440 | And it's just so beautiful.
01:58:00.200 | And then there's the academic family that I have.
01:58:04.800 | So the students who are actually studying with me,
01:58:07.800 | who are my trainees.
01:58:09.320 | So this sort of mentorship of ancient Greece, this.
01:58:13.120 | (Lex laughing)
01:58:15.040 | So I basically have an academic family,
01:58:17.760 | and we are a family.
01:58:20.200 | There's this such strong connection,
01:58:24.400 | this bond of, you're part of the Kellis family.
01:58:27.840 | So I have a biological family at home,
01:58:30.160 | and I have an academic family on campus.
01:58:32.280 | And that academic family
01:58:33.920 | has given me great grandchildren already.
01:58:36.480 | So I've trained people who are now professors at Stanford,
01:58:40.920 | CMU, Harvard, WashU, I mean, everywhere in the world.
01:58:45.920 | And these people have now trained people
01:58:49.440 | who are now having their own faculty jobs.
01:58:53.040 | So there's basically people who see me
01:58:55.120 | as their academic grandfather.
01:58:57.320 | And it's just so beautiful,
01:58:58.400 | 'cause you don't have to wait for the 18 years
01:59:00.400 | of cognitive hardware development
01:59:03.880 | to sort of have amazing conversation with people.
01:59:07.400 | These are fully grown humans, fully grown adults,
01:59:09.600 | who are cognitively super ready,
01:59:13.800 | and who are shaped by,
01:59:15.640 | and I see some of these beautiful papers,
01:59:17.800 | and I'm like, "I can see the touch of our lab
01:59:21.080 | in those papers."
01:59:21.920 | It's just so beautiful, 'cause you're like,
01:59:23.600 | "I've spent hours with these people,
01:59:25.520 | teaching them not just how to do a paper,
01:59:27.560 | but how to think."
01:59:28.800 | And this whole concept of,
01:59:31.960 | the first paper that we write together
01:59:34.200 | is an experience with every one of these students.
01:59:37.480 | So I always tell them to write the whole first draft,
01:59:40.560 | and they know that I will rewrite every word.
01:59:43.320 | But the act of them writing it,
01:59:45.840 | and what I do is these like joint editing sessions
01:59:48.320 | where I'm like, "Let's co-edit."
01:59:50.080 | And with this co-editing, we basically have-
01:59:53.040 | - Creative destruction.
01:59:55.040 | - So I share my Zoom screen,
01:59:56.680 | and I'm just thinking out loud as I'm doing this,
01:59:59.600 | and they're learning from that process,
02:00:01.200 | as opposed to like come back two days later,
02:00:03.200 | and they see a bunch of red on a page.
02:00:05.080 | I'm sort of, "Well, that's not how you write this.
02:00:08.120 | That's not how you think about this.
02:00:09.480 | That's not, you know, what's the point?"
02:00:10.840 | Like this morning I was having,
02:00:12.560 | yes, this morning between six and 8 a.m.,
02:00:14.680 | I had a two-hour meeting,
02:00:16.800 | going through one of these papers,
02:00:18.400 | and then saying, "What's the point here?
02:00:20.800 | Why do you even show that?
02:00:22.200 | It's just a bunch of points on a graph."
02:00:24.040 | No, what you have to do is extract the meaning,
02:00:26.640 | do the homework for them.
02:00:28.240 | And there's this nurturing, this mentorship
02:00:32.440 | that sort of creates now a legacy,
02:00:34.640 | which is infinite, because they've now gone off
02:00:39.400 | on their own, you know, and all of that is just humanity.
02:00:42.120 | Then, of course, there's the papers I write.
02:00:45.160 | 'Cause yes, my day job is training students,
02:00:48.240 | but it's a research university.
02:00:50.280 | The way that they learn is through the mens et manus,
02:00:54.400 | mind and hand.
02:00:56.000 | It's the practical training of actually doing research.
02:00:59.360 | And that research has a beneficial side effect
02:01:03.860 | of producing these awesome papers
02:01:06.640 | that will now tell other people how to think.
02:01:10.900 | There's this paper we just posted recently on medRxiv,
02:01:13.240 | and one of the most generous and eloquent comments about it
02:01:16.200 | was like, "Wow, this is a masterclass
02:01:19.320 | in scientific writing, in analysis,
02:01:22.760 | in biological interpretation," and so forth.
02:01:24.920 | It's just so fulfilling from a person I've never met.
02:01:27.480 | - Can you say the title of the paper, by the way?
02:01:29.680 | - I don't remember the title,
02:01:31.860 | but it's "Single-Cell Dissection of Schizophrenia Reveals..."
02:01:36.760 | And so the first of the two points that we found
02:01:39.920 | was this whole transcriptional resilience.
02:01:42.780 | Like there's some individuals who are schizophrenic,
02:01:46.600 | but who have an additional cell type,
02:01:50.040 | or an additional cell state, which we believe is protective.
02:01:53.560 | And that cell state, when they have it,
02:01:55.680 | will cause other cells
02:01:56.900 | to have normal gene expression patterns.
02:01:58.920 | It's just beautiful.
02:02:00.440 | And then that cell is connected
02:02:03.980 | with some of the PV interneurons
02:02:06.120 | that are basically sending these inhibitory brain waves
02:02:09.520 | through the brain.
02:02:10.880 | And basically there's another component of,
02:02:14.920 | there's a set of master regulators that we discovered
02:02:18.440 | who are controlling many of the genes
02:02:20.740 | that are differentially expressed.
02:02:22.600 | And these master regulators are themselves
02:02:24.480 | genetic targets of schizophrenia.
02:02:27.120 | And they are themselves involved
02:02:28.680 | in both synaptic connectivity
02:02:31.800 | and also in early brain development.
02:02:34.720 | So there's this sort of interconnectedness
02:02:36.860 | between synaptic development axis
02:02:40.280 | and also this transcriptional resilience.
02:02:41.640 | So I mean, we basically made up a title
02:02:43.120 | that combines all these concepts.
02:02:44.840 | - You have all these concepts,
02:02:45.760 | all these people working together,
02:02:46.920 | and ultimately these minds condense it down
02:02:49.640 | into a beautifully written little document
02:02:51.760 | that lives on forever. - Exactly.
02:02:52.600 | And that document now has its own life.
02:02:55.040 | Our work has 120,000 citations.
02:02:59.920 | I mean, that's not just people who read it.
02:03:02.100 | These are people who used it to write something based on it.
02:03:05.760 | I mean, that to me is just so fulfilling
02:03:10.640 | to basically say, wow, I've touched people.
02:03:12.960 | So I don't think of my legacy as I live every day.
02:03:17.800 | I just think of the beauty of the present
02:03:20.300 | and the power of interconnectedness.
02:03:22.440 | And just, I feel like a kid in a candy shop
02:03:25.000 | where I'm just like constantly,
02:03:28.040 | where do I, what package do I open first?
02:03:30.820 | And-- - You're the lucky one.
02:03:33.960 | (both laughing)
02:03:35.360 | - A jack of all trades, a master of none.
02:03:37.880 | - I think for a "Meaning of Life" episode,
02:03:41.320 | we would be amiss if we did not have at least a poem or two.
02:03:45.400 | Do you mind if we end in a couple of poems?
02:03:49.400 | Maybe a happy, maybe a sad one.
02:03:51.760 | - I would love that.
02:03:52.720 | So thank you for the luxury.
02:03:54.180 | The first one is kind of,
02:03:57.120 | I remember when you were talking with Eric Weinstein
02:04:03.740 | about this comment of Leonard Cohen that says,
02:04:09.400 | "But you don't really care for music, do ya?"
02:04:11.920 | In "Hallelujah," that's basically kind of like
02:04:13.760 | mocking its reader.
02:04:16.320 | So one of my poems is a little like that.
02:04:18.560 | So I had just broken up with my girlfriend
02:04:23.560 | and there's this other friend who was coming to visit me.
02:04:27.000 | And she said, "I will not come unless you write me a poem."
02:04:30.400 | (both laughing)
02:04:33.040 | And I was like, "Oh, writing a poem on demand."
02:04:37.540 | So this poem is called "Write Me a Poem."
02:04:40.960 | It goes, "Write me a poem," she said with a smile.
02:04:44.280 | "Make sure it's pretty, romantic, and rhymes.
02:04:47.120 | "Make sure it's worthy of that bold flame
02:04:49.480 | "that love uniting us beyond a mere game."
02:04:52.640 | And she took off without more words,
02:04:54.920 | rushed for the bus, and traveled the world.
02:04:57.840 | "A poem," I thought, "this is sublime.
02:05:00.600 | "What better way for passing the time?
02:05:03.080 | "What better way to count up the hours
02:05:05.340 | "before she comes back to my lonely tower?
02:05:08.400 | "Waiting for joy to fill up my heart,
02:05:10.360 | "let's write a poem for when we're apart.
02:05:13.240 | "How does a poem start?" I inquired.
02:05:15.980 | "Give me a topic, cook up a style.
02:05:18.340 | "Throw in some cute words, oh, here and there.
02:05:20.780 | "Throw in some passion, love, and despair.
02:05:23.700 | "Love, three eggs, one pound of flour,
02:05:26.680 | "three cups of water, and bake for an hour.
02:05:28.860 | "Love is no recipe, as I understand.
02:05:32.240 | "You can't just cook up a poem on demand."
02:05:34.760 | And as I was twisting all this in my mind,
02:05:37.040 | I looked at the page.
02:05:38.520 | By golly, it rhymed.
02:05:40.960 | Three roses, white chocolate, vanilla powder,
02:05:43.960 | some beautiful rhymes, and maybe a flower.
02:05:46.580 | "No, be romantic," the young girl insisted.
02:05:49.300 | "Do this, do that, don't be so silly.
02:05:51.540 | "You must believe it straight from your heart.
02:05:53.320 | "If you don't feel it, we're better apart."
02:05:55.940 | "Oh, my sweet thing, what can I say?
02:05:59.460 | "You bring me the sun all night and all day.
02:06:02.880 | "You're the stars and the moon and the birds way up high.
02:06:06.000 | "You're my evening sweet song, my morning blue sky.
02:06:09.380 | "You are my muse, your spell has me caught.
02:06:12.400 | "You bring me my voice and scatter my thoughts.
02:06:15.560 | "To put that love in writing, in vain, I can try.
02:06:19.080 | "But when I'm with you, my wings want to fly.
02:06:22.320 | "So I put down the pen and drop my defenses,
02:06:25.560 | "give myself to you and fill up my senses."
02:06:31.280 | - The baffled king composing, oh, that was beautiful.
02:06:35.040 | - What I love about it is that I did not
02:06:38.160 | bring up a dictionary of rhymes.
02:06:39.760 | I did not sort of work hard.
02:06:42.200 | So basically, when I write poems, I just type.
02:06:45.560 | I never go back, I just.
02:06:46.840 | So when my brain gets into that mode,
02:06:50.880 | it actually happens like I wrote it.
02:06:52.800 | - Oh, wow, so the rhyme just kind of,
02:06:54.800 | it's an emergent phenomenon. - It's an emergent phenomenon.
02:06:57.120 | I just get into that mode, and then it comes out.
02:07:00.960 | - That's a beautiful one.
02:07:02.040 | - And it's basically, as you got it,
02:07:06.760 | it's basically saying it's not a recipe,
02:07:08.120 | and then I'm throwing in the recipes,
02:07:09.860 | and as I'm writing it, I'm like,
02:07:11.640 | so it's very introspective in this whole concept.
02:07:16.500 | So anyway, there's another one many years earlier
02:07:19.560 | that is darker.
02:07:23.620 | It's basically this whole concept of let's be friends.
02:07:26.760 | I was like, "Ugh!"
02:07:29.760 | No, let's be friends, just like, you know.
02:07:32.480 | So the last words are shout out, I love you,
02:07:34.640 | or send me to hell.
02:07:36.240 | So the title is "Burn Me Tonight."
02:07:40.080 | Lie to me, baby.
02:07:43.200 | Lie to me now.
02:07:44.680 | Tell me you love me.
02:07:45.920 | Break me a vow.
02:07:47.080 | Give me a sweet word, a promise, a kiss.
02:07:49.920 | Give me the world, a sweet taste to miss.
02:07:52.760 | Don't let me lay here, inert, ugly, cold,
02:07:56.120 | with nothing sweet felt and nothing harsh told.
02:07:59.140 | Give me some hope, false, foolish, yet kind.
02:08:02.360 | Make me regret, I'll leave you behind.
02:08:05.120 | Don't pity my soul, but torture it right.
02:08:08.240 | Treat it with hatred, start up a fight,
02:08:11.040 | for it's from mildness that my soul dies
02:08:14.040 | when you cover your passion in a bland friend's disguise.
02:08:18.600 | Kiss me now, baby, show me your passion.
02:08:21.080 | Turn off the lights and rip off your fashion.
02:08:23.640 | Give me my life's joy this one night.
02:08:26.720 | Burn all my matches for one blazing light.
02:08:30.140 | Don't think of tomorrow and let today fade.
02:08:32.500 | Don't try and protect me from love's cutting blade.
02:08:35.580 | Your razor will always rip off my veins.
02:08:38.500 | Don't spare me the passion to spare me the pains.
02:08:42.100 | Kiss me now, honey, or spit in my face.
02:08:44.620 | Throw me an insult I'll gladly embrace.
02:08:47.700 | Tell me now clearly that you never cared.
02:08:49.980 | Say it now loudly like you never dared.
02:08:52.820 | I'm ready to hear it.
02:08:54.100 | I'm ready to die.
02:08:55.500 | I'm ready to burn and start a new life.
02:08:58.520 | I'm ready to face the rough burning truth
02:09:01.420 | rather than waste the rest of my youth.
02:09:04.380 | So tell me, my lover, should I stay or go?
02:09:07.260 | The answer to love is one, yes or no.
02:09:09.800 | There's no I like you, no let's be friends,
02:09:12.480 | shout out I love you, or send me to hell.
02:09:15.140 | (laughing)
02:09:18.420 | - I don't think there's a better way
02:09:19.840 | to end a discussion of the meaning of life,
02:09:23.320 | whatever the heck the meaning is,
02:09:25.740 | go all in as that poem says.
02:09:28.700 | Manolis, thank you so much for talking today.
02:09:30.500 | - Thanks, I look forward to next time.
02:09:32.800 | - Thanks for listening to this conversation
02:09:34.500 | with Manolis Kellis, and thank you to our sponsors.
02:09:38.000 | Grammarly, which is a service for checking spelling,
02:09:40.760 | grammar, sentence structure, and readability.
02:09:43.740 | Athletic Greens, the all-in-one drink
02:09:46.260 | that I start every day with
02:09:47.860 | to cover all my nutritional bases.
02:09:50.220 | Cash App, the app I use to send money to friends.
02:09:54.100 | Please check out these sponsors in the description
02:09:56.360 | to get a discount and to support this podcast.
02:09:59.380 | If you enjoy this thing, subscribe on YouTube,
02:10:01.880 | review it with Five Stars on Apple Podcast,
02:10:04.040 | follow on Spotify, support on Patreon,
02:10:06.520 | or connect with me on Twitter @LexFriedman.
02:10:09.640 | And now, let me leave you with some words
02:10:11.460 | from Douglas Adams in his book,
02:10:13.480 | "Hitchhiker's Guide to the Galaxy."
02:10:16.120 | On the planet Earth, man had always assumed
02:10:19.440 | that he was more intelligent than dolphins
02:10:22.160 | because he had achieved so much.
02:10:24.460 | The wheel, New York, wars, and so on.
02:10:28.720 | Whilst all the dolphins had ever done
02:10:31.220 | was muck about in the water having a good time.
02:10:34.900 | But conversely, the dolphins had always believed
02:10:38.560 | that they were far more intelligent than man
02:10:41.460 | for precisely the same reasons.
02:10:43.920 | Thank you for listening, I hope to see you next time.
02:10:47.600 | (upbeat music)