
Lisa Feldman Barrett: Love, Evolution, and the Human Brain | Lex Fridman Podcast #140


Chapters

0:00 Introduction
2:10 Falling in love
19:54 Love at first sight
34:32 Romantic
38:32 Writing process
49:15 Evolution of the human brain
63:24 Nature of evil
72:07 Love is an evolutionary advantage
76:43 Variation in species
82:24 Does evolution have a direction?
100:03 Love with an inanimate object
104:21 Just be yourself is confusing advice
114:32 Consciousness
121:10 Book recommendations


00:00:00.000 | The following is a conversation with Lisa Feldman Barrett,
00:00:03.600 | her second time on the podcast.
00:00:05.740 | She's a neuroscientist at Northeastern University
00:00:09.060 | and one of my favorite people.
00:00:11.240 | Her new book called "Seven and a Half Lessons
00:00:13.400 | "About the Brain" is out now as of a couple days ago,
00:00:16.760 | so you should definitely support Lisa by buying it
00:00:19.520 | and sharing with friends if you like it.
00:00:21.800 | It's a great short intro to the human brain.
00:00:25.460 | Quick mention of each sponsor,
00:00:27.040 | followed by some thoughts related to the episode.
00:00:29.880 | Athletic Greens, the all-in-one drink
00:00:32.080 | that I start every day with
00:00:33.960 | to cover all my nutritional bases.
00:00:36.520 | Eight Sleep, a mattress that cools itself
00:00:39.480 | and gives me yet another reason to enjoy sleep.
00:00:42.360 | Masterclass, online courses that I enjoy
00:00:45.160 | from some of the most amazing people in history.
00:00:48.660 | And BetterHelp, online therapy with a licensed professional.
00:00:52.680 | Please check out these sponsors in the description
00:00:55.040 | to get a discount and to support this podcast.
00:00:58.040 | As a side note, let me say that Lisa,
00:01:00.200 | just like Manolis Kellis,
00:01:02.380 | is a local brilliant mind and friend
00:01:05.360 | and someone I can see talking to many more times.
00:01:08.160 | Sometimes it's fun to talk to a scientist
00:01:10.760 | not just about their field of expertise,
00:01:12.620 | but also about random topics, even silly ones,
00:01:15.880 | from love to music to philosophy.
00:01:19.200 | Ultimately, it's about having fun,
00:01:21.520 | something I know nothing about.
00:01:23.800 | This conversation is certainly that.
00:01:25.840 | It may not always work, but it's worth a shot.
00:01:28.760 | I think it's valuable to alternate
00:01:31.040 | along all kinds of dimensions,
00:01:32.980 | like between deeper technical discussions
00:01:35.720 | and more fun random discussion,
00:01:37.920 | from liberal thinker to conservative thinker,
00:01:40.760 | from musician to athlete,
00:01:43.040 | from CEO to junior engineer,
00:01:45.920 | from friend to stranger.
00:01:48.020 | Variety makes life and conversation more interesting.
00:01:51.480 | Let's see where this little podcast journey goes.
00:01:54.840 | If you enjoy this thing, subscribe on YouTube,
00:01:57.040 | review it with Five Stars on Apple Podcasts,
00:01:59.280 | follow on Spotify, support it on Patreon,
00:02:02.000 | or connect with me on Twitter @lexfridman.
00:02:05.240 | And now, here's my conversation with Lisa Feldman Barrett.
00:02:10.000 | Based on the comments in our previous conversation,
00:02:13.800 | I think a lot of people would be very disappointed,
00:02:17.040 | I should say, to learn that you are, in fact, married.
00:02:20.920 | As they say, all the good ones are taken.
00:02:22.960 | Okay, so I'm a fan of your husband as well, Dan.
00:02:27.960 | He's a programmer, a musician,
00:02:29.760 | so a man after my own heart.
00:02:32.080 | Can I ask a ridiculously over-romanticized question
00:02:36.160 | of when did you first fall in love with Dan?
00:02:40.960 | - It's actually, it's a really romantic story, I think.
00:02:45.380 | So I was divorced by the time I was 26, 27, 26, I guess.
00:02:50.740 | And I was in my first academic job,
00:02:52.660 | which was Penn State University,
00:02:54.940 | which is in the middle of Pennsylvania,
00:02:57.540 | surrounded by mountains.
00:02:58.700 | So you have, it's four hours to get anywhere,
00:03:00.820 | to get to Philadelphia, New York, Washington.
00:03:03.540 | I mean, you're basically stuck, you know?
00:03:05.580 | And I was very fortunate to have
00:03:10.060 | a lot of other assistant professors
00:03:11.980 | who were hired at the same time as I was.
00:03:13.900 | So there were a lot of us, we were all friends,
00:03:15.900 | which was really fun.
00:03:17.620 | But I was single, and I didn't wanna date a student.
00:03:21.420 | And there were no,
00:03:24.580 | and I wasn't gonna date somebody in my department,
00:03:26.780 | that's just a recipe for disaster.
00:03:29.220 | - Yeah.
00:03:30.060 | - So--
00:03:30.900 | - But even at 20, whatever you were,
00:03:32.540 | you were already wise enough to know that.
00:03:34.300 | - Yeah, a little bit, maybe, yeah.
00:03:36.460 | I wouldn't call me wise at that age.
00:03:38.180 | But anyways, not sure that I would say that I'm wise now.
00:03:41.180 | But, and so,
00:03:44.940 | and so, after, you know,
00:03:49.940 | I was spending probably 16 hours a day in the lab
00:03:53.740 | because it was my first year,
00:03:55.700 | and as an assistant professor, and there's a lot to do.
00:03:58.820 | And I was also bitching and moaning to my friends
00:04:02.660 | that I hadn't had sex in I don't know how many months.
00:04:06.380 | And I was starting to become unhappy with my life.
00:04:10.580 | And I think at a certain point,
00:04:13.060 | they just got tired of listening to me bitch and moan
00:04:15.580 | and said, "Just do something about it then,
00:04:18.300 | "like do, you know, if you're unhappy."
00:04:20.380 | And so the first thing I did was I made friends
00:04:23.740 | with a sushi chef in town.
00:04:25.740 | And this is like a State College, Pennsylvania
00:04:28.860 | in the early '90s was there was like a pizza shop
00:04:32.140 | and a sub shop and actually a very good bagel shop
00:04:36.260 | and one good coffee shop and maybe one nice restaurant.
00:04:39.100 | I mean, there was really,
00:04:40.180 | but there was the second son of a Japanese sushi chef
00:04:44.460 | who was not going to inherit the restaurant.
00:04:47.220 | And so he moved to Pennsylvania and was giving sushi lessons.
00:04:51.220 | So I met this guy, the sushi chef,
00:04:54.260 | and we decided to throw a sushi party at the coffee shop.
00:04:57.420 | So we basically, it was the goal was to invite
00:04:59.700 | every eligible bachelor really within like a 20 mile radius.
00:05:04.700 | We had a totally fun time.
00:05:07.020 | I wore an awesome crushed velvet burgundy dress,
00:05:11.060 | it was beautiful dress.
00:05:12.540 | And I didn't meet any, I met a lot of new friends,
00:05:16.540 | but I did not meet anybody.
00:05:17.740 | So then I thought, okay, well,
00:05:18.700 | maybe I'll try the personals ads,
00:05:20.940 | which I had never used before in my life.
00:05:23.420 | And I first tried the paper personals ads.
00:05:28.420 | - Like in the newspaper?
00:05:30.020 | - Like in the newspaper, that didn't work.
00:05:33.020 | And then a friend of mine said,
00:05:33.860 | "Oh, you know, there's this thing called net news."
00:05:36.620 | This is like 1992 maybe.
00:05:40.140 | So there was this anonymous, you could do it anonymously.
00:05:43.820 | So you would read, you could post or you could read ads
00:05:48.820 | and then respond to an address, which was anonymous.
00:05:54.900 | And that was yoked to somebody's real address.
00:05:57.540 | And there was always a lag
00:06:01.020 | because it was this like a bulletin board sort of thing.
00:06:04.860 | So at first I read them over
00:06:08.900 | and I decided to respond to one or two.
00:06:12.700 | And it was interesting.
00:06:15.300 | - Sorry, this is not on the internet.
00:06:16.980 | - Yeah, this is totally on the internet.
00:06:18.300 | - But it takes, there's a delay of a couple of days
00:06:20.180 | or whatever.
00:06:21.020 | - Yeah, right, right.
00:06:21.860 | It's 1992.
00:06:23.060 | There's no web.
00:06:24.180 | - No pictures.
00:06:25.180 | - There's no pictures.
00:06:26.020 | The web doesn't exist.
00:06:27.140 | It's all done in ASCII format sort of.
00:06:29.780 | But the ratio--
00:06:32.300 | - Loving ASCII.
00:06:33.300 | - But the ratio of men to women was like 10 to one.
00:06:38.300 | I mean, there were many more men
00:06:39.860 | because it was basically academics and the government.
00:06:42.900 | That was it.
00:06:43.740 | That was no, I mean, I think AOL maybe
00:06:46.060 | was just starting to become popular.
00:06:47.980 | And so the first person I met told me
00:06:55.700 | that he was a scientist who worked for NASA.
00:07:00.060 | And--
00:07:00.900 | - Impressive.
00:07:01.740 | - Yeah.
00:07:03.420 | Anyways, it turned out that he didn't actually.
00:07:06.700 | - Yeah.
00:07:07.540 | This is how they brag is you elevate your,
00:07:11.220 | as opposed to saying you're taller than you are,
00:07:12.820 | you say like your position is higher.
00:07:14.020 | - Yeah, and I actually, I would have been fine
00:07:16.740 | dating somebody who wasn't a scientist.
00:07:18.380 | It's just that they have, it's just that whoever I date
00:07:21.820 | has to just accept that I am and that I was pretty ambitious
00:07:26.820 | and was trying to make my career.
00:07:30.340 | And that's not, I think it's maybe more common now
00:07:34.820 | for men to maybe accept that in their female partners,
00:07:38.540 | but at that time, not so common.
00:07:40.500 | - Could be intimidating, I guess.
00:07:41.940 | - Yes, that has been said.
00:07:44.140 | And so then the next one I actually corresponded with,
00:07:49.140 | and we actually got to the point of talking on the phone
00:07:51.660 | and we had this really kind of funny conversation
00:07:53.900 | where we're chatting and he said,
00:07:57.540 | he introduces the idea that he's really looking
00:08:00.980 | for a dominant woman and I'm thinking,
00:08:03.780 | I'm a psychologist by training,
00:08:05.260 | so I'm thinking, oh, he means sex roles.
00:08:07.140 | Like, I'm like, no, I'm very assertive
00:08:09.020 | and I'm glad you think that, you know, okay.
00:08:10.380 | Anyways, long story short, that's not really what he meant.
00:08:13.980 | (laughing)
00:08:16.380 | - Okay, got it.
00:08:17.220 | - Yeah, so, and I just, you know,
00:08:20.420 | that will just show you my level of naivete.
00:08:22.340 | Like I was like, I didn't completely understand,
00:08:24.620 | but I was like, well, yeah, you know, no.
00:08:27.500 | At one point he asked me how I felt
00:08:29.260 | about him wearing my lingerie and I was like,
00:08:33.540 | I don't even share my lingerie with my sister.
00:08:35.700 | Like, I don't share my lingerie with anybody, you know?
00:08:40.820 | The third one I interacted with was a banker
00:08:45.300 | who lived in Singapore and that conversation
00:08:50.300 | didn't last very long because he made an,
00:08:55.180 | I guess he made an analogy between me
00:08:57.740 | and a character in "The Fountainhead,"
00:09:01.740 | the woman who's raped in "The Fountainhead,"
00:09:06.220 | and I was like, okay, that's not.
00:09:08.660 | - That's not a good-- - That's not a good,
00:09:10.220 | no, that's not a good one.
00:09:11.500 | - Not that part, not that scene.
00:09:12.980 | - Not that scene.
00:09:13.820 | So then I was like, okay, you know what?
00:09:16.260 | I'm gonna post my own ad.
00:09:17.780 | And so I did, I posted, well, first I wrote my ad
00:09:20.140 | and then I, of course, I checked it with my friends
00:09:22.620 | who were all also assistant professors
00:09:24.740 | who are like my little Greek chorus,
00:09:26.460 | and then I posted it and I got something like,
00:09:30.200 | I don't know, 80-something responses in 24 hours.
00:09:34.300 | I mean, it was very--
00:09:35.140 | - Do you remember the pitch?
00:09:36.540 | Like how you, I guess, condensed yourself?
00:09:40.940 | - I don't remember it exactly, although Dan has it.
00:09:43.900 | But actually for our 20th wedding anniversary,
00:09:48.100 | he took our exchanges and he printed them off
00:09:51.660 | and put them in a leather-bound book for us to read,
00:09:54.660 | which was really sweet.
00:09:55.900 | Yeah, I think I was just really direct.
00:09:58.740 | Like I'm almost 30, I'm a scientist,
00:10:01.260 | I'm not looking to, you know,
00:10:02.820 | I'm looking for something serious and, you know.
00:10:04.980 | But the thing is I forgot to say where my location was
00:10:09.020 | and my age, which I forgot.
00:10:12.020 | So I got lots of, I mean, I will say,
00:10:14.700 | so I printed off all of the responses
00:10:17.900 | and I had all my friends over and we were, you know,
00:10:21.620 | had a big, I made a big pot of gumbo
00:10:24.520 | and we drank through several bottles of wine
00:10:27.300 | reading these responses.
00:10:28.660 | And I would say for the most part, they were really sweet,
00:10:32.500 | like earnest and genuine,
00:10:35.220 | as much as you could tell that somebody's being genuine.
00:10:37.380 | I mean, it seemed, you know,
00:10:38.420 | there were a couple of really funky ones,
00:10:40.340 | like, you know, this one couple who told me
00:10:42.500 | that I was their soulmate, the two of them,
00:10:44.500 | then they were looking for, you know, a third person
00:10:47.180 | and I was like, "Oh, okay."
00:10:48.780 | But mostly super, seemed like super genuine people.
00:10:54.020 | And so I chose five men to start corresponding with
00:10:57.640 | and I was corresponding with them.
00:10:58.940 | And then about a week later, I get this other email.
00:11:03.200 | And okay, and then I post something the next day
00:11:05.040 | that said, "Okay, you know, thank you so much."
00:11:06.800 | And I'm gonna, I answered every person back.
00:11:10.360 | But then after that I said,
00:11:11.400 | "Okay, and I'm not gonna answer anymore."
00:11:13.400 | You know, 'cause it was, they were still coming in
00:11:15.120 | and I couldn't, you know, I have a job
00:11:16.800 | and, you know, a house to take care of and stuff.
00:11:18.760 | So, and then about a week later, I get this other email
00:11:22.720 | and he says, you know, he just describes himself
00:11:27.500 | like I'm this, I'm this, I'm this, I'm a chef,
00:11:30.260 | I'm a scientist, I'm a this, I'm a this.
00:11:32.540 | And so I emailed him back and I said,
00:11:35.900 | "You know, you seem interesting.
00:11:37.540 | "You can write me at my actual address if you want.
00:11:40.020 | "Here's my address.
00:11:40.860 | "I'm not really responding,
00:11:42.100 | "I'm not really responding to other people anymore,
00:11:43.660 | "but you seem interesting.
00:11:44.740 | "You know, you can write to me if you want."
00:11:46.860 | And then he wrote to me and I,
00:11:51.780 | then I wrote him back and I,
00:11:52.760 | it was a nondescript kind of email and I wrote him back
00:11:55.600 | and I said, "Thanks for responding.
00:11:56.800 | "You know, I'm really busy right now.
00:11:58.760 | "I was in the middle of writing my first slate
00:12:01.260 | "of grant applications, so I was really consumed."
00:12:04.520 | And I said, "I'll get back to you in a couple of days."
00:12:06.920 | And so I did, I waited a couple of days
00:12:09.240 | till my grants were, you know, safe,
00:12:11.240 | grant application safely out the door.
00:12:13.400 | And then I emailed him back and then he emailed me
00:12:16.920 | and then really across two days, we sent 100 emails.
00:12:21.500 | - And text only?
00:12:23.740 | Was there pictures or anything of that?
00:12:25.220 | - Text only, text only.
00:12:27.500 | And then, so this was like a Thursday and a Friday.
00:12:30.740 | And then Friday, he said,
00:12:33.740 | "Let's talk on the weekend on the phone."
00:12:35.620 | And I said, "Okay."
00:12:37.040 | And he wanted to talk Sunday night
00:12:39.820 | and I had a date Sunday night.
00:12:42.460 | So I said, "Okay, sure, we can talk Sunday night."
00:12:46.780 | And then I was like, "Well, you know,
00:12:49.340 | "I don't really wanna cancel my date,
00:12:50.600 | "so I'm just gonna call him on Saturday."
00:12:52.400 | So I just called, I cold called him on Saturday
00:12:55.440 | and a woman answered.
00:12:57.240 | - Oh, wow.
00:12:58.080 | That's not cool.
00:12:59.960 | - Not cool.
00:13:01.640 | And so she says, you know, "Hello."
00:13:04.520 | And I say, "Oh, you know, is Dan there?"
00:13:06.880 | And she said, "Sure, can I ask who's calling?"
00:13:09.760 | And I said, "Tell him it's Lisa."
00:13:11.720 | And she went, "Oh my God, oh my God, I'm just a friend.
00:13:14.400 | "I'm just a friend, I just need to tell you,
00:13:16.180 | "I'm just a friend."
00:13:17.760 | And I was like, this is adorable, right?
00:13:21.760 | And then he gets on the phone, not "hi, nice to meet you."
00:13:24.480 | The first thing he says to me, "She's just a friend."
00:13:27.400 | So I was just so charmed, really, by the whole thing.
00:13:32.400 | So it was Yom Kippur, it was the Jewish Day of Atonement
00:13:37.480 | that was ending and they were baking cookies
00:13:39.120 | and going to a break fast.
00:13:40.240 | So people, as you know, fast all day
00:13:42.480 | and then they go to a party and they break fast.
00:13:44.840 | So I thought, okay, I'll just cancel my date.
00:13:49.840 | So I did and I stayed home and we talked for eight hours
00:13:55.000 | and then the next night for six hours.
00:13:58.600 | And it basically, it just went on like that.
00:14:00.440 | And then by the end of the week, he flew to State College.
00:14:05.440 | And we'd gone through this whole thing where I'd said,
00:14:09.560 | we're gonna take it slow,
00:14:10.840 | we're gonna get to know each other.
00:14:13.160 | And then really by, I think we talked like two
00:14:15.600 | or three times, these like really long conversations.
00:14:18.320 | And then he said, "I'm just gonna fly there."
00:14:20.440 | And then, so of course there's,
00:14:23.080 | I don't even know that there were fax machines
00:14:26.760 | at that point, maybe there were, but I don't think so.
00:14:31.280 | Anyway, so we decided we'll exchange pictures.
00:14:34.320 | And so he, I take my photograph
00:14:37.980 | and I give it to my secretary and I say to my secretary.
00:14:40.780 | - Fax this.
00:14:42.720 | - I say that, send this priority mail.
00:14:45.200 | - Priority mail.
00:14:46.040 | - And he goes, okay, I'll send a priority mail.
00:14:47.400 | Let me, it's a priority mail.
00:14:48.480 | He's like, I know, priority mail, okay.
00:14:50.440 | And then, so I get Dan's photograph in the mail
00:14:54.800 | and it's him in shorts and you can see
00:15:00.360 | that he's probably somewhere like the Bahamas
00:15:02.680 | or something like that and it's like cropped.
00:15:05.080 | So clearly what he's done is he's taken a photograph
00:15:07.480 | where he's in it with someone else
00:15:10.340 | who turned out to be his ex-wife.
00:15:11.940 | So I'm thinking, well, this is awesome.
00:15:14.420 | I've hit the jackpot.
00:15:15.540 | He's very appealing to me, very attractive.
00:15:18.700 | And then my photograph doesn't show up
00:15:23.100 | and it doesn't show up.
00:15:24.380 | And so like one day and then two days
00:15:27.380 | and then he's like, I said,
00:15:30.740 | well, I asked my secretary to send a priority.
00:15:34.180 | I mean, I don't know what he did.
00:15:37.260 | And he's like, I said, I'm like, well, you don't have to,
00:15:40.980 | you know, you don't have to come.
00:15:41.940 | And he's like, no, no, no, I'm gonna, you know,
00:15:43.640 | we've had like five dates,
00:15:45.440 | the equivalent of five dates practically.
00:15:47.540 | And then, so he's supposed to fly on a Thursday or Friday,
00:15:52.380 | I can't remember.
00:15:53.580 | And I get a call like maybe an hour
00:15:56.400 | before his flight's supposed to leave.
00:15:58.140 | And he says, hi.
00:15:59.040 | And I say, and it's just something in his voice, right?
00:16:00.980 | And I say, 'cause at this point,
00:16:02.360 | I think I've talked to him like for 25 hours, I don't know.
00:16:05.440 | And he says, hi.
00:16:07.300 | And I'm like, you got the picture?
00:16:09.280 | And he's like, yeah.
00:16:10.540 | And I'm like, you don't like it?
00:16:12.660 | And he's like, well, I'm sure it's not,
00:16:17.660 | I'm sure it's your, I'm sure it's just not a good,
00:16:21.580 | you know, it's probably not your best.
00:16:24.260 | - Oh, no.
00:16:25.220 | - You know, you don't have to come.
00:16:28.060 | And he's like, no, no, no, I'm coming.
00:16:29.420 | And I'm like, no, you don't have to come.
00:16:30.660 | And he's like, no, no, I really wanna,
00:16:32.220 | I'm getting on the plane.
00:16:33.860 | I'm like, you don't have to get on the plane.
00:16:35.860 | He's like, no, I'm getting on the plane.
00:16:38.240 | And so I go down to my, I go,
00:16:40.640 | I'm in my office, this is happening, right?
00:16:42.040 | So I go downstairs to my, one of my closest friends,
00:16:44.680 | who's still actually one of my closest friends,
00:16:47.040 | who is one of my colleagues and Kevin.
00:16:51.280 | And I say, Kevin, and I go to Kevin, I go, Kevin, Kevin,
00:16:54.280 | Kevin, he doesn't like the photograph.
00:16:55.760 | And Kevin's like, well, which photograph did you send?
00:16:57.480 | And I'm like, well, you know,
00:16:58.560 | the one where we're shooting pool?
00:16:59.900 | And he's like, you sent that photograph?
00:17:03.900 | That's a horrible photograph.
00:17:05.040 | I'm like, yeah, but it's the only one that I had
00:17:07.340 | that was like, where my hair was kind of similar
00:17:09.620 | to what it is now.
00:17:10.460 | And he's like, Lisa, do I have to check everything for you?
00:17:14.780 | You should not have sent that.
00:17:17.420 | - But still, he flew over.
00:17:20.340 | - So he flew.
00:17:21.300 | - Where from, by the way?
00:17:22.580 | - He was in graduate school at Amherst,
00:17:25.900 | yeah, at UMass Amherst.
00:17:27.940 | So he flew and I picked him up at the airport
00:17:32.940 | and he was happy.
00:17:36.520 | So whatever the concern was, was gone.
00:17:40.040 | And I was dressed, you know, I carefully, carefully dressed.
00:17:43.800 | - Were you nervous?
00:17:44.840 | - I was really, really nervous.
00:17:46.960 | 'Cause I don't really believe in fate
00:17:50.560 | and I don't really think there's only one person
00:17:53.000 | that you can be with.
00:17:54.700 | But I think, you know, people who,
00:17:59.700 | some people are curvy, they're kind of complicated.
00:18:04.320 | And so the number of people who fit them
00:18:06.400 | is maybe less than.
00:18:08.080 | - I like it, mathematically speaking, yeah.
00:18:10.800 | - And so when I was going to pick him up at the airport,
00:18:13.160 | I was thinking, well, this could,
00:18:15.760 | I could be going to pick up the person I'm gonna marry.
00:18:19.100 | Or not.
00:18:21.520 | I mean, like I really, but I really, you know,
00:18:24.720 | like our conversations were just very authentic
00:18:27.840 | and very moving and we really connected.
00:18:32.840 | And I really felt like he understood me, actually,
00:18:37.720 | in a way that a lot of people don't.
00:18:41.680 | And what was really nice was at the time,
00:18:46.680 | you know, the airport was this tiny little airport
00:18:53.200 | out in a cornfield, basically.
00:18:54.760 | And so driving back to the town,
00:18:58.480 | we were in the car for 15 minutes,
00:19:00.200 | completely in the dark as I was driving.
00:19:02.420 | And so it was very similar to,
00:19:04.500 | we had just spent, you know,
00:19:06.460 | 20 something hours on the telephone,
00:19:09.200 | sitting in the dark, talking to each other.
00:19:11.320 | So it was very familiar.
00:19:14.540 | And we basically spent the whole weekend together
00:19:16.440 | and he met all my friends and we had a big party.
00:19:21.000 | And at the end of the weekend, I said, okay,
00:19:25.800 | you know, if we're gonna give this a shot,
00:19:30.200 | we probably shouldn't see other people.
00:19:33.160 | So it's a risk, you know?
00:19:35.680 | - Commitment.
00:19:36.520 | - But I just didn't see how it would work
00:19:40.220 | if we were dating people locally
00:19:42.220 | and then also seeing each other at a distance.
00:19:44.200 | 'Cause I've had long distance relationships before
00:19:46.360 | and they're hard and they take a lot of effort.
00:19:50.280 | And so we decided we'd give it three months
00:19:51.960 | and see what happened.
00:19:53.320 | And that was it.
00:19:54.640 | - This is an interesting thing.
00:19:57.280 | Like we're all, what is it?
00:19:58.880 | There's several billion of us
00:20:00.480 | and we're kind of roaming this world.
00:20:02.400 | And then you kind of stick together.
00:20:04.480 | You find somebody that just like gets you.
00:20:07.680 | And it's interesting to think about,
00:20:10.040 | there's probably thousands, if not millions,
00:20:12.160 | of people that would be sticky to you,
00:20:14.960 | depending on the curvature of your space.
00:20:17.760 | But what is the, could you speak to the stickiness?
00:20:22.760 | Like to the, just the falling in love?
00:20:25.960 | Like seeing that somebody really gets you?
00:20:29.920 | Maybe by way of telling, do you think,
00:20:34.920 | do you remember there was a moment
00:20:36.400 | when you just realized, damn it, I think I'm,
00:20:40.600 | like I think this is the guy.
00:20:42.800 | I think I'm in love.
00:20:44.280 | - We were having these conversations actually
00:20:46.600 | from the really from the second weekend we were together.
00:20:49.560 | So he flew back the next weekend to State College
00:20:51.560 | 'cause it was my birthday.
00:20:52.400 | It was my 30th birthday.
00:20:53.240 | My friends were throwing me a party.
00:20:55.200 | And we went hiking and we hiked up some mountain
00:20:58.080 | and we were sitting on a cliff over this overlook
00:21:01.520 | and talking to each other.
00:21:02.480 | And I was thinking, and I actually said to him,
00:21:04.040 | I'm like, I haven't really known you very long,
00:21:06.880 | but I feel like I'm falling in love with you,
00:21:08.520 | which can't possibly be happening.
00:21:10.160 | I must be projecting.
00:21:11.600 | - Must be projecting.
00:21:12.920 | - But it certainly feels that way, right?
00:21:14.800 | Like I don't believe in love at first sight.
00:21:16.640 | So this can't really be happening,
00:21:18.880 | but it sort of feels like it is.
00:21:20.200 | And he was like, I know what you mean.
00:21:21.680 | And so for the first three months or four months,
00:21:24.480 | we would say things to each other like,
00:21:26.280 | I feel like I'm in love with you,
00:21:28.080 | but you know, but that can't,
00:21:32.640 | but things don't really work like that.
00:21:34.160 | So, but you know, so, and then it became a joke.
00:21:37.040 | Like, I feel like I'm in love with you.
00:21:38.520 | And then eventually, you know, I think,
00:21:41.680 | but I think that was one moment
00:21:43.280 | where we were talking about, I don't know, just,
00:21:47.960 | you know, not just all the great aspirations you have
00:21:53.880 | or all the things,
00:21:54.720 | but also things you don't like about yourself,
00:21:56.560 | things that you're worried about,
00:21:57.920 | things that you're scared of.
00:21:59.720 | And then I think that was sort of solidified
00:22:03.200 | the relationship.
00:22:04.040 | And then there was one weekend
00:22:06.400 | where we went to Maine in the winter,
00:22:09.120 | which I mean, I really love the beach always,
00:22:12.720 | but in the winter, particularly.
00:22:15.680 | - 'Cause it's just beautiful and calm and whatever.
00:22:18.560 | - Yeah, and I also, I do find beauty in starkness sometimes.
00:22:23.560 | Like, so there's this grand majestic scene of, you know,
00:22:28.880 | this very powerful ocean
00:22:30.400 | and it's all these like beautiful blue grays
00:22:33.040 | and it's just stunning.
00:22:35.680 | And so we were sitting on this huge rock in Maine
00:22:39.440 | and where we'd gone for the weekend, it was freezing cold.
00:22:42.440 | And I honestly can't remember what he said
00:22:45.880 | or what I said or what,
00:22:48.560 | but I definitely remember having this feeling of,
00:22:52.920 | I absolutely wanna stay with this person.
00:22:57.160 | And I don't know what my life will be like
00:22:58.840 | if I'm not with this person.
00:23:00.120 | Like, I need to be with this person.
00:23:02.400 | - Can we, from a scientific and a human perspective,
00:23:05.680 | dig into your belief that love at first sight is not possible,
00:23:11.520 | you don't believe in it?
00:23:13.200 | 'Cause there is, you don't think there's like a magic
00:23:15.840 | where you see somebody in the Jack Kerouac way
00:23:19.720 | and you're like, wow, that's something.
00:23:22.920 | That's a special little glimmer or something.
00:23:26.800 | - Oh, I definitely think you can connect with someone
00:23:29.720 | instantly, in an instant.
00:23:32.160 | And I definitely think you can say,
00:23:34.760 | oh, there's something there
00:23:35.800 | and I'm really clicking with that person.
00:23:37.780 | Romantically, but also just with friends,
00:23:39.640 | it's possible to do that.
00:23:40.800 | You recognize a mind that's like yours
00:23:44.160 | or that's compatible with yours.
00:23:47.560 | There are ways that you feel like you're being understood
00:23:50.480 | or that you understand something about this person
00:23:52.760 | or maybe you see something in this person
00:23:54.480 | that you find really compelling or intriguing.
00:23:58.280 | But I think, you know, your brain is a predictive organ.
00:24:02.840 | Right?
00:24:03.680 | You're using your past.
00:24:05.520 | - You're projecting.
00:24:06.640 | - You're using your past to make predictions
00:24:10.160 | and I mean, not deliberately.
00:24:12.920 | That's how your brain is wired.
00:24:14.400 | That's what it does.
00:24:15.320 | And so it's filling in all of the gaps that you,
00:24:20.240 | there are lots of gaps of information that you don't,
00:24:24.760 | information you don't have.
00:24:26.320 | And so your brain is filling those in and--
00:24:28.880 | - But isn't that what love is?
00:24:32.640 | - No, I don't think so, actually.
00:24:35.200 | I mean, to some extent, sure.
00:24:36.720 | You always, there's research to show
00:24:39.400 | that people who are in love always see the best
00:24:42.960 | in each other and they, you know,
00:24:45.000 | when there's a negative interpretation
00:24:48.280 | or a positive interpretation,
00:24:49.480 | you know, they choose the positive ones.
00:24:50.840 | There's a little bit of positive illusion there,
00:24:52.840 | you know, going on.
00:24:54.080 | That's what the research shows.
00:24:55.520 | But I think,
00:24:56.400 | I think that when you find somebody
00:25:04.000 | who not just appreciates your feelings
00:25:09.240 | and your faults, but loves you for them, actually,
00:25:12.640 | you know, like maybe even doesn't see them as a fault,
00:25:16.600 | that's, so you have to be honest enough
00:25:20.520 | about what your faults are.
00:25:24.000 | So it's easy to love someone for all the things that they,
00:25:26.960 | for all the wonderful characteristics they have.
00:25:34.080 | It's harder, I think, to love someone despite their faults
00:25:37.920 | or maybe even the faults that they see
00:25:39.840 | aren't really faults at all to you.
00:25:41.120 | They're actually something really special.
00:25:43.840 | - But isn't that, can't you explain that
00:25:45.680 | by saying the brain kind of, like you're projecting,
00:25:48.600 | it's your, you have a conception of a human being
00:25:53.600 | or just a spirit that really connects with you
00:25:58.120 | and you're projecting that onto that person
00:26:01.240 | and within that framework, all their faults
00:26:04.800 | then become beautiful, like little--
00:26:06.400 | - Maybe, but you just have to pay attention
00:26:09.160 | to the prediction error.
00:26:10.400 | - No, but maybe that's what love,
00:26:13.760 | like maybe you start ignoring the prediction error.
00:26:17.640 | Maybe love is just your ability--
00:26:19.960 | - To ignore the prediction error?
00:26:22.800 | Well, I think that there's some research
00:26:24.920 | that might say that, but that's not my experience, I guess.
00:26:29.920 | But there is some research that says,
00:26:32.000 | I mean, there's some research that says
00:26:33.320 | you have to have an optimal margin of illusion,
00:26:35.840 | which means that you put a positive spin on smaller things,
00:26:40.840 | but you don't ignore the bigger things, right?
00:26:45.120 | And I think without being judgmental at all,
00:26:48.400 | when someone says to me, you're not who I thought you were,
00:26:52.720 | I mean, nobody has said that to me in a really long time,
00:26:55.040 | but certainly when I was younger,
00:26:56.560 | that was, you're not who I thought you were.
00:26:58.520 | My reaction to that was, well, whose fault is that?
00:27:01.440 | (Lex laughs)
00:27:04.520 | I'm a pretty upfront person.
00:27:07.280 | I mean, I will though say that in my experience,
00:27:11.200 | people don't lie to you about who they are.
00:27:15.880 | They lie to themselves in your presence.
00:27:18.420 | - Yeah.
00:27:22.040 | - And so, you don't wanna get tied up in that,
00:27:27.040 | tangled up in that.
00:27:30.320 | And I think from the get-go,
00:27:32.540 | Dan and I were just, for whatever reason,
00:27:34.480 | maybe it's 'cause we both have been divorced already,
00:27:36.240 | and he told me who he thought he was,
00:27:41.240 | and he was pretty accurate as far as I could--
00:27:46.800 | - He was accurate?
00:27:47.640 | - Pretty much, actually.
00:27:48.600 | I mean, there's very,
00:27:50.200 | I can't say that I've ever come across a characteristic
00:27:54.820 | in him that really surprised me in a bad way.
00:27:58.520 | - It's hard to know yourself.
00:28:00.440 | - It is hard to know yourself.
00:28:01.280 | - And to communicate that.
00:28:02.600 | - For sure.
00:28:03.440 | And I'll say, I had the advantage of training
00:28:07.640 | as a therapist, which meant for five years,
00:28:09.840 | I was under a fucking microscope.
00:28:11.680 | - Yeah.
00:28:12.840 | - When I was training as a therapist,
00:28:14.880 | it was hour for hour supervision,
00:28:17.040 | which meant if you were in a room with a client for an hour,
00:28:20.600 | you had an hour with a supervisor.
00:28:23.880 | So that supervisor was behind the mirror for your session,
00:28:28.320 | and then you went and had an hour of discussion
00:28:30.720 | about what you said, what you didn't say,
00:28:33.200 | learning to use your own feelings and thoughts
00:28:37.920 | as a tool to probe the mind of the client and so on.
00:28:42.280 | And so you can't help but learn a lot of,
00:28:45.720 | you can't help but learn a lot about yourself
00:28:47.760 | in that process.
00:28:48.920 | - Do you think knowing or learning how the sausage is made
00:28:53.920 | ruins the magic of the actual experience?
00:28:58.120 | Like, you as a neuroscientist who studies the brain,
00:29:01.600 | do you think it ruins the magic of love at first sight?
00:29:05.640 | Are you consciously still able
00:29:09.240 | to lose yourself in the moment?
00:29:11.320 | - I'm definitely able to lose myself in the moment.
00:29:13.680 | - Is wine involved?
00:29:14.800 | - Not always.
00:29:16.640 | Chocolate?
00:29:17.480 | I mean, some kind of wine, I'll drink some substance, right?
00:29:20.320 | But yeah, for sure.
00:29:23.200 | I mean, I guess what I would say though is that,
00:29:25.600 | for me, part of the magic is the process.
00:29:31.240 | Like, so I remember a day,
00:29:35.200 | well, I was working on this book of essays.
00:29:38.280 | I was in New York.
00:29:39.800 | I can't remember why I was in New York,
00:29:42.840 | but I was in New York for something,
00:29:44.680 | and I was in Central Park,
00:29:46.920 | and I was looking at all the people with their babies,
00:29:50.400 | and I was thinking,
00:29:51.580 | each one of these, there's a tiny little brain
00:29:57.440 | that's wiring itself right now.
00:30:00.160 | And I just, I felt in that moment,
00:30:03.600 | I was like, I am never gonna look at an infant
00:30:06.040 | in the same way ever again.
00:30:08.320 | And so to me, I mean, honestly,
00:30:11.200 | before I started learning about brain development,
00:30:14.360 | I thought babies were cute, but not that interesting
00:30:17.760 | until they could interact with you and do things.
00:30:21.400 | Of course, my own infant,
00:30:22.520 | I thought was extraordinarily interesting,
00:30:24.360 | but they're kind of like lumps.
00:30:27.160 | That's until they can interact with you,
00:30:30.000 | but they are anything but lumps.
00:30:31.600 | I mean, so, and part of the,
00:30:34.960 | I mean, all I can say is I have deep affection now
00:30:38.920 | for tiny little babies in a way that I didn't really before
00:30:43.920 | because of the, I'm just so curious.
00:30:51.600 | - But the actual process, the mechanisms of the wiring
00:30:55.440 | of the brain, the learning,
00:30:56.480 | all the magic of the neurobiology.
00:30:58.360 | - Yeah, and or something like,
00:31:01.120 | when you make eye contact with someone directly,
00:31:05.960 | sometimes you feel something, right?
00:31:10.960 | And what is it?
00:31:14.440 | And what is that?
00:31:15.800 | And so to me, that's not backing away from the moment.
00:31:20.240 | That's like expanding the moment.
00:31:22.040 | It's like, that's incredibly cool.
00:31:26.640 | I'll just say that when I was in graduate school,
00:31:30.480 | I also was in therapy because it's almost a given
00:31:34.800 | that you're gonna be in therapy yourself
00:31:36.920 | if you're gonna become a therapist.
00:31:38.400 | And I had a deal with my therapist,
00:31:42.040 | which was that I could call time out
00:31:44.120 | at any moment that I wanted to,
00:31:46.200 | as long as I was being responsible about it.
00:31:48.240 | And I wasn't using it as a way to get out of something.
00:31:50.680 | And he could tell me, no, he could decline and say,
00:31:54.400 | no, you're using this to get out of something.
00:31:56.960 | But I could call time out whenever I want and say,
00:31:59.440 | what are you doing right now?
00:32:01.280 | Here's what I'm experiencing.
00:32:02.400 | What are you trying to do?
00:32:03.760 | I wanted to use my own experience to interrogate
00:32:07.480 | what the process was.
00:32:10.920 | And that made it more helpful in a way.
00:32:15.920 | Do you know what I mean?
00:32:18.880 | So yeah, I don't think learning how something works
00:32:21.640 | makes it less magical, actually.
00:32:23.440 | But that's just me, I guess.
00:32:25.360 | I don't know, would you?
00:32:26.560 | - Yes.
00:32:28.640 | I tend to have two modes.
00:32:32.160 | One is an engineer and one is romantic.
00:32:35.600 | And I'm conscious of like, there's two rooms.
00:32:40.600 | You can go into the one, the engineer room,
00:32:43.760 | and I think that ruins the romance.
00:32:45.840 | So I tend to, there's two rooms.
00:32:48.560 | One is the engineering room.
00:32:50.480 | Think from first principles.
00:32:51.920 | How do we build the thing
00:32:53.400 | that creates this kind of behavior?
00:32:56.080 | And then you go into the romantic room
00:32:58.000 | where you're like emotional, it's a roller coaster.
00:33:00.200 | And then you're, the thing is, let's take it slow.
00:33:03.880 | And then you get married the next night.
00:33:05.880 | Then you're just this giant mess and you write a song
00:33:08.680 | and then you cry and then you send a bunch of texts
00:33:12.200 | and anger and whatever.
00:33:14.160 | And somehow you're in Vegas and there's random people
00:33:17.160 | and you're drunk and whatever, all that.
00:33:18.760 | Like in poetry, just mess of it.
00:33:21.040 | Fighting, yeah, that's not, those are two rooms.
00:33:24.880 | And you go back between them.
00:33:27.040 | But I think the way you put it is quite poetic.
00:33:29.600 | I think you're much better at adulting with love
00:33:34.600 | than perhaps I am, because there's a magic to children.
00:33:40.200 | I also think like of adults as children.
00:33:45.200 | It's kind of cool to see, it's a cool thought experiment
00:33:48.680 | to look at adults and think like that used to be a baby.
00:33:53.360 | And then that's like a fully wired baby.
00:33:56.280 | And it's just walking around pretending to be like
00:33:58.520 | all serious and important, wearing a suit or something.
00:34:01.840 | But that used to be a baby.
00:34:03.680 | And then you think of like the parenting
00:34:05.500 | and all the experiences they had.
00:34:07.520 | Like it's cool to think of it that way.
00:34:09.920 | But then I started thinking of it like
00:34:11.460 | from a machine learning perspective.
00:34:13.500 | But once you're like the romantic moments,
00:34:16.280 | all that kind of stuff, all that falls away.
00:34:19.120 | I forget about all that, I don't know.
00:34:21.640 | That's the Russian thing.
00:34:23.460 | - Maybe, maybe.
00:34:24.720 | But I also think it might be an age thing
00:34:26.480 | or maybe an experience thing.
00:34:28.080 | So I think we all, I mean,
00:34:33.040 | if you're exposed to Western culture at all,
00:34:35.320 | you are exposed to the sort of idealized,
00:34:40.040 | stereotypic, romantic exchange.
00:34:45.640 | And what does it mean to be romantic?
00:34:48.080 | And so here's a test.
00:34:50.100 | I'm gonna see how to phrase it.
00:34:55.000 | Okay, so not really a test,
00:34:56.960 | but this tells you something about
00:34:58.720 | your own ideas about romance.
00:35:00.380 | For Valentine's Day one year,
00:35:06.340 | my husband bought me a six-way plug.
00:35:09.760 | Is that romantic or not romantic?
00:35:14.960 | - Like, sorry, six-way plug, that's like an outlet.
00:35:17.960 | - Yeah, like to put it in an outlet.
00:35:19.420 | Is that romantic or not romantic?
00:35:21.600 | - I mean, depends the look in his eyes when he does it.
00:35:28.000 | I mean, it depends on the conversation
00:35:31.520 | that led up to that point.
00:35:33.760 | Depends how much, it's like the music,
00:35:38.120 | 'cause you have a very, you're both from the,
00:35:42.040 | my experience is with you as a fan,
00:35:44.300 | you have both a romantic nature,
00:35:45.720 | but you have a very pragmatic,
00:35:46.960 | like you cut through the bullshit of the fuzziness.
00:35:51.260 | And there's something about a six-way plug
00:35:53.320 | that cuts through the bullshit,
00:35:54.360 | that connects to the human,
00:35:55.360 | like he understands who you are.
00:35:57.360 | - Exactly.
00:35:58.200 | - Yeah.
00:35:59.080 | - Exactly.
00:36:00.500 | That was the most romantic gift he could have given me
00:36:03.120 | because he knows me so well.
00:36:05.960 | He has a deep understanding of me,
00:36:08.000 | which is that I will sit and suffer and complain
00:36:12.160 | about the fact that I have to plug and unplug things,
00:36:15.580 | and I will bitch and moan until the cows come home,
00:36:17.940 | but it would never occur to me
00:36:20.480 | to go buy a bloody six-way plug.
00:36:23.380 | Whereas for him, he bought it, he plugged it in,
00:36:27.120 | he arranged, he taped up all my wires,
00:36:29.340 | he made it like really usable.
00:36:31.020 | (both laughing)
00:36:32.020 | And for me, that was the best present.
00:36:37.020 | - The most romantic thing.
00:36:38.340 | - It was the most romantic thing
00:36:40.500 | because he understood who I was,
00:36:43.600 | and he did something very, or just the casual,
00:36:47.200 | like we moved into a house that we went
00:36:50.280 | from having a two-car garage to a one-car garage.
00:36:52.920 | And I said, "Okay, I'm from Canada,
00:36:54.600 | I'm not bothered by snow."
00:36:56.040 | Well, I mean, I'm a little bothered by snow,
00:36:57.600 | but he's very bothered by snow.
00:36:59.240 | So I'm like, "Okay, you can park your car in the garage,
00:37:01.840 | it's fine."
00:37:03.280 | Every day when it snows, he goes out and cleans my car.
00:37:06.500 | Every day.
00:37:09.020 | I never asked him to do it, he just does it
00:37:12.060 | because he knows that I'm cutting it really close
00:37:15.220 | in the morning, when we all used to go to work.
00:37:17.880 | I have it timed to the second
00:37:20.860 | so that I can get up as late as possible,
00:37:23.460 | work out as long as possible,
00:37:25.620 | and make it into my office a minute before my first meeting.
00:37:29.220 | And so if it snows unexpectedly or something, I'm screwed
00:37:32.420 | because now that's an added 10 or 15 minutes
00:37:35.200 | and I'm gonna be late.
00:37:36.740 | Anyways, it's just these little tiny things.
00:37:39.400 | He's a really easygoing guy,
00:37:43.640 | and he doesn't look like somebody
00:37:45.360 | who pays attention to detail.
00:37:47.380 | He doesn't fuss about detail,
00:37:50.520 | but he definitely pays attention to detail.
00:37:53.160 | And it is very, very romantic in the sense that
00:37:57.520 | he loves me despite my little details.
00:38:04.680 | - And understands you.
00:38:05.840 | - Yeah, he understands me.
00:38:06.680 | - But it is kind of hilarious that that is,
00:38:09.540 | the six-way plug is the most fulfilling,
00:38:14.540 | richest display of romance in your life.
00:38:19.180 | I love it.
00:38:20.020 | I love it.
00:38:20.840 | - That's what I mean about romance.
00:38:21.680 | Romance is really, it's not all about chocolates and flowers
00:38:24.380 | and whatever.
00:38:25.500 | I mean, those are all nice too, but--
00:38:28.220 | - Sometimes it's about the six-way plug.
00:38:29.740 | - Sometimes it's about the six-way plug.
00:38:32.260 | So maybe one way I could ask
00:38:35.520 | before we talk about the details,
00:38:36.680 | you also have the author of another book
00:38:38.520 | as we talked about how emotions are made.
00:38:41.280 | So it's interesting to talk about the process of writing.
00:38:44.040 | You mentioned you were in New York.
00:38:46.000 | What have you learned from writing these two books
00:38:48.200 | about the actual process of writing?
00:38:50.400 | And maybe, I don't know what's the most interesting thing
00:38:53.400 | to talk about there, maybe the biggest challenges
00:38:55.880 | or the boring, mundane, systematic,
00:38:58.360 | like day-to-day of what worked for you,
00:39:00.360 | like hacks or even just about the neuroscience
00:39:04.000 | that you've learned through the process
00:39:06.980 | of trying to write them.
00:39:08.360 | - Here's the thing I learned.
00:39:09.680 | If you think that it's gonna take you a year
00:39:11.760 | to write your book, it's going to take you three years
00:39:14.160 | to write your book.
00:39:15.240 | That's the first thing I learned,
00:39:17.120 | is that no matter how organized you are,
00:39:22.120 | it's always gonna take way longer than what you think
00:39:28.760 | in part because very few people make an outline
00:39:33.760 | and then just stick to it.
00:39:35.220 | Some of the topics really take on a life of their own
00:39:39.020 | and to some extent, you wanna let them have their voice.
00:39:43.920 | You wanna follow leads until you feel satisfied
00:39:46.960 | that you've dealt with the topic appropriately.
00:39:51.960 | But I, and that part is actually fun.
00:39:54.460 | It's not fun to feel like you're constantly behind
00:39:57.040 | the eight ball in terms of time,
00:39:59.480 | but it is the exploration and the foraging for information
00:40:02.880 | is incredibly fun for me anyways.
00:40:05.760 | I found it really enjoyable.
00:40:07.040 | And if I wasn't also running a lab at the same time
00:40:09.160 | and trying to keep my family going,
00:40:12.520 | it would have been, the whole thing would have just been fun.
00:40:15.920 | But I would say the hardest thing about,
00:40:18.440 | the most important thing I think I learned
00:40:20.200 | is also the hardest thing for me,
00:40:22.200 | which is knowing what to leave out.
00:40:27.200 | A really good storyteller knows what to leave out.
00:40:31.720 | In academic writing, you shouldn't leave anything out.
00:40:37.540 | All the details should be there.
00:40:40.400 | I've written or participated in writing over 200
00:40:51.240 | peer-reviewed papers.
00:40:54.640 | So I'm pretty good with detail.
00:40:57.680 | Knowing what to leave out,
00:40:59.720 | knowing what to leave out
00:41:00.720 | and not harming the validity of the story.
00:41:04.160 | That is a tricky, tricky thing.
00:41:06.840 | It was tricky when I wrote "How Emotions Are Made,"
00:41:10.160 | but that's a standard popular science book.
00:41:13.780 | So it's 300 something pages.
00:41:15.400 | And then, it has like a thousand end notes
00:41:18.280 | and then each of the end notes is attached to a web note,
00:41:22.400 | which is also long.
00:41:23.920 | So I mean, it's,
00:41:25.800 | and it start, and I mean the final draft,
00:41:30.520 | I mean, I wrote three drafts of that book actually,
00:41:33.620 | and the final draft, and then I had to cut by a third.
00:41:36.960 | I mean, or, I mean, I, you know,
00:41:38.660 | it was like 150,000 words or something
00:41:42.560 | and I had to cut it down to like 110.
00:41:44.960 | So obviously, I struggle with what to leave out.
00:41:49.020 | You know, brevity is not my strong suit.
00:41:50.520 | I'm always telling people that, it's a warning.
00:41:52.840 | So that's why this book was,
00:41:55.120 | I, you know, I'd always been really fascinated with essays.
00:41:58.280 | I love reading essays.
00:41:59.960 | And after reading a small set of essays by Anne Fadiman
00:42:04.960 | called "At Large and at Small,"
00:42:07.800 | which I just loved these little essays.
00:42:10.320 | - What's the topic of those essays?
00:42:12.280 | - They are, they're called familiar essays.
00:42:15.280 | So the topics are like everyday topics,
00:42:18.240 | like mail, coffee, chocolate.
00:42:22.160 | I mean, just like,
00:42:23.120 | and what she does is she weaves her own experience.
00:42:26.160 | It's a little bit like these conversations
00:42:28.060 | that you're so good at curating actually.
00:42:30.560 | You're weaving together history and philosophy and science
00:42:36.240 | and also personal reflections.
00:42:38.600 | And a little bit you feel like you're like eavesdropping
00:42:43.600 | on someone's train of thought in a way.
00:42:47.680 | It's really, they're really compelling to me.
00:42:51.600 | - Even if it's just like a mundane topic.
00:42:53.360 | - Yeah, but it's so interesting to learn about
00:42:58.360 | like all of these little stories
00:43:02.400 | in the wrapping of the history of like mail.
00:43:08.120 | Like that's really interesting.
00:43:10.080 | And so I read these essays
00:43:12.040 | and then I wrote to her a little fangirl email.
00:43:15.440 | This was many years ago.
00:43:16.960 | And I said, "I just love this book.
00:43:21.680 | "And how did you learn to write essays like this?"
00:43:24.000 | And she gave me a reading list of essays that I should read,
00:43:26.840 | like writers.
00:43:27.720 | And so I read them all.
00:43:29.040 | And anyway, so I decided
00:43:32.160 | it would be a really good challenge for me
00:43:34.360 | to try to write something really brief
00:43:37.240 | where I could focus on, you know,
00:43:41.720 | one or two really fascinating tidbits of neuroscience.
00:43:46.720 | Connect it to, connect each one to something philosophical
00:43:51.800 | or, you know, like just a question about human nature.
00:43:56.600 | Do it in a really brief format
00:43:58.720 | without violating the validity of the science.
00:44:05.200 | And that was a, I just set myself this,
00:44:07.280 | what I thought of as a really, really big challenge
00:44:09.520 | in part because it was an incredibly hard thing
00:44:11.520 | for me to do in the first book.
00:44:13.360 | - Yeah, we should say that this is,
"Seven and a Half Lessons" is a very short book.
00:44:18.000 | I mean, it's like it embodies brevity, right?
00:44:22.880 | The whole point throughout is just,
00:44:25.720 | I mean, you could tell that there's editing,
00:44:27.960 | like there's pain in trying to bring it
00:44:31.240 | as brief as possible, as clean as possible, yeah.
00:44:35.080 | - Yeah, so it's, the way I think of it is,
00:44:37.720 | you know, it's a little book of big science and big ideas.
00:44:41.400 | - Yeah, really big ideas in brief little packages.
00:44:45.000 | - And, you know, I wrote it so that people could read it.
00:44:49.880 | I love reading on the beach.
00:44:51.840 | I love reading essays on the beach.
00:44:53.520 | I wrote it so people could read it on the beach
00:44:55.900 | or in the bathtub or, you know, a subway stop.
00:44:58.960 | - Even if the beach is frozen over in the snow.
00:45:02.080 | - Yeah, so my husband, Dan,
00:45:04.280 | calls it the first neuroscience beach read.
00:45:06.760 | That's his phrasing, yeah.
00:45:10.080 | - Yeah, and like you said, you learn a lot about writing
00:45:13.400 | from your husband, like you were saying offline.
00:45:15.320 | - Well, he is, of the two of us, he is the better writer.
00:45:20.320 | He is a masterful writer.
00:45:22.320 | He's also, I mean, you know,
00:45:27.120 | he's a PhD in computer science.
00:45:28.480 | He's a software engineer,
00:45:30.000 | but he's also really good at organization of knowledge.
00:45:35.000 | So he built, for a company he used to work for,
00:45:38.880 | he built one of the first knowledge management systems.
00:45:41.720 | And he now works at Google
00:45:44.240 | where he does engineering education.
00:45:46.600 | Like he understands how to tell a good story
00:45:50.080 | just, you know, about anything really.
00:45:54.800 | He's got impeccable timing.
00:45:57.320 | He's really funny.
00:45:59.160 | And luckily for me,
00:46:00.760 | he knows very little about psychology or neuroscience.
00:46:03.720 | Well, now he knows more, obviously, but so, you know,
he was really, when "How Emotions Are Made,"
00:46:09.920 | you know, he was really, really helpful to me
00:46:13.920 | because the first draft of every chapter
00:46:17.200 | was me talking to him about what, you know,
00:46:19.760 | I would talk out loud about what I wanted to say
00:46:22.560 | and the order in which I wanted to say it.
00:46:24.840 | And then I would write it,
00:46:27.600 | and then he would read it
00:46:28.880 | and tell me all the bits that could be excised.
00:46:32.040 | And sometimes we would, you know, I should say,
00:46:35.480 | I mean, we don't, he and I don't really argue about much
00:46:39.040 | except directions in the car.
00:46:41.400 | Like that's, if we're gonna have an argument,
that's gonna be where it's gonna happen.
00:46:46.880 | - What's the nature of the argument
00:46:48.600 | about directions exactly?
00:46:49.920 | - I don't really know.
00:46:50.920 | It's just that we're very,
00:46:52.480 | I think it's that spatially, you know,
00:46:56.600 | I use egocentric space.
00:46:58.680 | So I wanna say, you know, turn left.
00:47:01.400 | Like I'm reasoning in relation
00:47:03.880 | to like my own physical corporeal body.
00:47:06.680 | So, you know, you walk to the church
00:47:08.200 | and you turn left and you, then, you know, whatever.
00:47:11.080 | You know, I'm always like,
00:47:12.120 | and his, you know, he gives directions allocentrically,
00:47:16.280 | which means organized around North, South, East, West.
00:47:21.040 | - So to you, the Earth is at the center of the solar system
00:47:24.320 | and to him, reasonably. - No, I'm at the center.
00:47:26.520 | - I'm at the center. - You're at the center
00:47:28.000 | of the solar system.
00:47:29.000 | Okay, so. - Anyway, so we,
00:47:32.160 | but here we, you know,
00:47:34.400 | we had some really rip roaring arguments,
00:47:37.640 | like really rip roaring arguments
00:47:39.560 | where he would say like, "Who is this for?
00:47:41.880 | Is this for the 1%?"
00:47:44.320 | And I'd be like, 1% meaning not, you know, not wealth,
00:47:47.880 | but like civilians versus academics.
00:47:50.760 | You know, so are these for the scientists
00:47:52.160 | or is this for the civilians, right?
00:47:54.120 | - So he speaks for the people, for the civilians.
00:47:56.040 | - He speaks for the people and I'd be like,
00:47:57.760 | "No, you have to."
00:47:59.200 | And so he made, you know,
00:48:00.880 | after one terrible argument that we had
00:48:03.120 | where it was really starting to affect our relationship
00:48:05.840 | because we were so mad at each other all the time,
he made these little signs: "Writing" and "Science."
00:48:14.080 | And we only use them, this was like,
00:48:16.800 | when you pulled out a sign, that's it.
00:48:20.520 | Like the other person just wins
00:48:22.240 | and you have to stop fighting about it.
00:48:24.320 | And that's it. - Great.
00:48:25.240 | - And so we just did that.
00:48:26.600 | And we didn't really have to use it too much for this book
00:48:30.200 | 'cause this book was in some ways,
00:48:33.120 | you know, I didn't have to learn a lot of new things
00:48:37.680 | for this book, I had to learn some,
00:48:39.080 | but a lot of what I learned
00:48:44.080 | for "How Emotions Are Made"
00:48:47.480 | really stood me in good stead for this book.
00:48:50.520 | So there was a little bit,
00:48:51.400 | each essay was a little bit of learning.
A couple were a little more than that small amount,
00:48:56.640 | but I didn't have so much trouble here.
00:48:59.960 | I had a lot of trouble with the first book,
00:49:03.800 | but still even here, you know,
00:49:05.800 | he would tell me that I could take something out
00:49:09.520 | and I really wanted to keep it.
00:49:11.040 | And I think we only used the signs once.
00:49:15.320 | - Well, if we could dive in some aspects of the book,
00:49:17.960 | I would love that.
00:49:19.120 | Can we talk about, so one of the essays,
00:49:23.280 | it looks at evolution.
00:49:24.640 | Let me ask the big question,
00:49:30.920 | did the human brain evolve to think?
That's essentially the question that you address in the essay.
00:49:38.360 | Can you speak to it?
00:49:39.480 | - Sure, you know, the big caveat here is that
00:49:43.360 | we don't really know why brains evolved.
00:49:45.760 | The big why questions are called teleological questions.
00:49:49.800 | And in general, scientists should avoid those questions
00:49:54.800 | because we don't know really why, we don't know the why.
00:49:58.640 | However, for a very long time,
00:50:03.640 | the assumption was that evolution worked
00:50:07.080 | in a progressive upward scale,
00:50:09.400 | that you start off with simple organisms
00:50:11.200 | and those organisms get more complex
00:50:13.160 | and more complex and more complex.
00:50:14.920 | Now, obviously that's true in some like
00:50:17.600 | really general way, right?
00:50:19.640 | That life started off as single cell organisms
00:50:22.720 | and things got more complex.
00:50:24.440 | But the idea that brains evolved in some upward trajectory
00:50:29.440 | from simple brains in simple animals
00:50:34.700 | to complex brains in complex animals
00:50:37.040 | is called a phylogenetic scale.
00:50:39.340 | And that phylogenetic scale is embedded
00:50:44.360 | in a lot of evolutionary thinking,
00:50:46.000 | including Darwin's actually.
00:50:48.880 | And it's been seriously challenged, I would say,
00:50:53.120 | by modern evolutionary biology.
00:50:56.340 | And so, you know, thinking is something that,
00:51:01.540 | rationality is something that humans,
00:51:04.320 | at least in the West, really prize
00:51:07.340 | as a great human achievement.
00:51:10.640 | And so the idea that the most common evolutionary story
00:51:15.880 | is that brains evolved in like sedimentary rock
00:51:20.360 | with a layer for instincts, that's your lizard brain,
00:51:25.520 | and a layer on top of that for emotions,
00:51:30.480 | that's your limbic system, limbic meaning border.
00:51:33.260 | So it borders the parts that are for instincts.
00:51:36.480 | - Oh, interesting.
00:51:37.320 | - And then the neocortex or new cortex
00:51:42.840 | where rationality is supposed to live.
00:51:46.200 | That's the sort of traditional story.
00:51:48.480 | - It just keeps getting layered on top by evolution.
00:51:52.040 | - Right, and so you can think about, you know,
00:51:54.840 | I mean, sedimentary rock is the way
00:51:57.080 | typically people describe it.
00:51:58.480 | The way I sometimes like to think about it is,
00:52:01.560 | you know, thinking about the cerebral cortex
00:52:03.360 | like icing on an already baked cake, you know,
00:52:07.800 | where, you know, the cake is your inner beast.
00:52:11.160 | These like boiling, you know, roiling instincts
00:52:14.000 | and emotions that have to be contained.
00:52:15.880 | And by the cortex, and it's just, it's a fiction.
00:52:20.880 | It's a myth.
00:52:23.880 | It's a myth that you can trace all the way back
00:52:26.280 | to stories about morality in ancient Greece.
00:52:31.080 | But what you can do is look at the scientific record
00:52:35.640 | and say, well, there are other stories
00:52:38.160 | that you could tell about brain evolution
00:52:40.200 | and the context in which brains evolved.
00:52:45.200 | So when you look at creatures who don't have brains
00:52:50.600 | and you look at creatures who do, what's the difference?
00:52:55.040 | And you can look at, you know, some animals.
00:53:00.960 | So we call, scientists call an environment
00:53:05.680 | that an animal lives in a niche, their environmental niche.
00:53:09.320 | What are the things, what are the parts of the environment
00:53:11.320 | that matter to that animal?
00:53:13.200 | And so there's some animals whose niche hasn't changed
00:53:16.600 | in 400 million years.
00:53:18.440 | So they're not, these creatures are modern creatures,
00:53:21.560 | but they're living in a niche that hasn't changed much.
00:53:24.520 | And so their biology hasn't changed much.
00:53:27.240 | And you can kind of verify that by looking at the genes
00:53:30.440 | that lurk deep, you know,
00:53:32.280 | in the molecular structure of cells.
00:53:35.480 | And so you can, by looking at various animals
00:53:39.840 | in their developmental state,
00:53:41.720 | meaning not, you don't look at adult animals,
00:53:43.840 | you look at embryos of animals and developing animals,
00:53:47.600 | you can see, you can piece together a different story.
00:53:50.400 | And that story is that brains evolved
00:53:54.640 | under the selection pressure of hunting.
00:53:58.600 | That in the Cambrian period,
00:54:00.760 | hunting emerged on the scene
00:54:02.480 | where animals deliberately ate one another.
00:54:05.620 | And so, you know, before the Cambrian period,
00:54:11.720 | the animals didn't really have,
00:54:15.040 | well, they didn't have brains,
00:54:16.480 | but they also didn't have senses, really,
00:54:19.600 | only very, very rudimentary senses.
00:54:21.520 | So the animal that I wrote about in "Seven and a Half Lessons"
00:54:26.520 | is called an amphioxus or a lancelet.
00:54:29.720 | And little amphioxus has no eyes,
00:54:34.720 | it has no ears, it has no nose,
00:54:37.480 | it has a couple of cells
00:54:41.580 | for detecting light and dark for circadian rhythm purposes.
00:54:46.580 | So, and it can't hear,
00:54:50.460 | it has a vestibular cell to keep its body upright.
00:54:53.560 | It has a very rudimentary sense of touch,
00:54:57.540 | and it doesn't really have any internal organs
00:55:00.460 | other than this, like, basically stomach.
00:55:03.060 | It's like a, just like a,
00:55:04.820 | it doesn't have an enteric nervous system,
00:55:06.980 | it doesn't have, like, a gut that, you know, moves,
00:55:10.700 | like we do, it just has basically a tube.
00:55:13.620 | - Yeah, a little container.
00:55:15.540 | - Like a little container, yeah.
00:55:16.740 | And so, and really, it doesn't move very much.
00:55:20.140 | It can move, it just sort of wriggles,
00:55:22.160 | it doesn't have very sophisticated movement.
00:55:24.740 | And it's this really sweet little animal,
00:55:27.820 | it sort of wriggles its way to a spot
00:55:30.580 | and then plants itself in the sand
00:55:33.560 | and just filters food as the food goes by.
00:55:36.860 | And then when the food concentration decreases,
00:55:41.580 | it just ejects itself, wriggles to some spot randomly,
00:55:46.580 | where probabilistically there will be more food,
00:55:50.500 | and plants itself again.
00:55:51.980 | So it's not really aware,
00:55:56.340 | very aware that it has an environment.
00:55:58.380 | It has a niche, but that niche is very small
00:56:00.740 | and it's not really experiencing that niche very much.
00:56:05.260 | So it's basically like a little stomach on a stick.
00:56:08.100 | That's really what it is.
00:56:09.860 | And, but when animals start to literally hunt each other,
00:56:20.420 | all of a sudden it becomes important
00:56:23.180 | to be able to sense your environment.
00:56:25.860 | 'Cause you need to know, is that blob up ahead
00:56:29.260 | gonna eat me or should I eat it?
00:56:31.760 | And so all of a sudden you want,
00:56:34.860 | distance senses are very useful.
00:56:36.980 | And so in the water, distance senses are vision
00:56:41.180 | and a little bit hearing,
00:56:44.100 | olfaction, smelling, and touch.
00:56:49.500 | 'Cause in the water, touch is a distance sense
00:56:51.780 | 'cause you can feel the vibration.
00:56:53.540 | So on land, vision is a distance sense,
00:56:59.900 | touch not so much, but for elephants maybe.
00:57:03.020 | - The vibrations. - Vibrations.
00:57:06.180 | Olfaction definitely because of the concentration of,
00:57:10.580 | the more concentrated something is,
00:57:12.260 | the more likely it is to be close to you.
00:57:14.820 | So animals developed senses.
00:57:17.940 | They developed a head, like a literal head.
00:57:20.340 | So amphioxus doesn't even have a head really.
00:57:22.140 | It's just a long--
00:57:23.700 | - What's the purpose of a head?
00:57:25.780 | - That's a great question.
00:57:27.380 | - Is it to have a jaw?
00:57:29.620 | - That's a great question.
00:57:30.820 | So jaw, so yes, jaws are a major--
00:57:35.580 | - Useful feature?
00:57:36.700 | - Yeah, I would say they're a major adaptation
00:57:39.260 | after there's a split between vertebrates and invertebrates.
00:57:42.620 | So amphioxus is thought to be very, very similar
00:57:45.340 | to the animal that's before that split.
00:57:48.740 | But then after the development,
00:57:50.140 | very quickly after the development of a head
00:57:52.660 | is the development of a jaw, which is a big thing.
00:57:56.060 | And what goes along with that is the development of a brain.
00:58:01.060 | - It's weird, is that just a coincidence
00:58:04.140 | that the thing, the part of our body,
00:58:07.820 | of the mammal, I think, body that we eat with
00:58:12.500 | and attack others with is also the thing
00:58:15.300 | that contains the majority of the brain type of stuff?
00:58:20.300 | - Well, actually, the brain goes
00:58:23.100 | with the development of a head
00:58:24.980 | and the development of a visual system
00:58:27.740 | and an auditory system and an olfactory system and so on.
00:58:31.420 | So your senses are developing
00:58:34.780 | and the other thing that's happening, right,
00:58:38.740 | is that animals are getting bigger.
00:58:40.500 | Because they're, and also their niche is getting bigger.
00:58:44.580 | - Well, this is the, just sorry to take a tiny tangent
00:58:47.860 | on the niche thing is it seems like the niche
00:58:50.380 | is getting bigger, but not just bigger,
00:58:53.380 | like more complicated, like shaped in weird ways.
00:58:56.660 | So like predation seems to create,
00:58:59.000 | like the whole world becomes your oyster, whatever.
00:59:03.380 | But like you also start to carve out
00:59:05.980 | the places in which you can operate the best.
00:59:08.380 | - Yeah, and in fact, that's absolutely right.
00:59:10.500 | And in fact, some scientists think that theory of mind,
00:59:15.060 | your ability to make inferences about the inner life
00:59:18.660 | of other creatures actually developed
00:59:22.100 | under the selection pressure of predation.
00:59:24.300 | Because it makes you a better predator.
00:59:28.140 | - Do you ever look at, you just said you looked at babies
00:59:31.220 | as these wiring creatures.
00:59:35.020 | Do you ever think of humans as just clever predators?
00:59:39.340 | Like that there is under, underneath it all is this,
00:59:43.240 | the Nietzschean will to power in all of its forms?
00:59:49.500 | Or are we now friendlier?
00:59:52.100 | - Yeah, so it's interesting.
00:59:54.260 | I mean, there are zeitgeists
00:59:57.300 | in how humans think about themselves, right?
00:59:59.700 | And so if you look in the 20th century,
01:00:02.480 | you can see that the idea of an inner beast
01:00:08.060 | that we're just predators, we're just basically animals,
01:00:10.900 | base animals, violent animals
01:00:13.340 | that have to be contained by culture
01:00:15.260 | and by our prodigious neocortex,
01:00:17.760 | really took hold particularly after World War I
01:00:24.100 | and really held sway for much of that century.
01:00:30.340 | And then around, at least in Western writing, I would say,
01:00:36.980 | we're talking mainly about Western scientific writing,
01:00:41.140 | Western philosophical writing.
01:00:42.780 | And then late '90s maybe,
01:00:47.160 | you start to see books and articles about our social nature,
01:00:52.200 | that we're social animals.
01:00:53.540 | And we are social animals, but what does that mean exactly?
01:00:56.860 | About--
01:01:00.480 | - It's us carving out different niches
01:01:02.020 | in the space of ideas, it looks like.
01:01:03.460 | - I think so, I think so.
01:01:06.460 | So, do humans, can humans be violent?
01:01:15.540 | Can humans be really helpful?
01:01:18.220 | Yes, actually.
01:01:19.720 | And humans are interesting creatures
01:01:22.920 | because other animals can also be helpful to one another.
01:01:27.920 | In fact, there's a whole literature, booming literature
01:01:30.940 | on how other animals are,
01:01:35.340 | support one another.
01:01:38.580 | They regulate each other's nervous systems
01:01:40.660 | in interesting ways
01:01:41.740 | and they will be helpful to one another, right?
01:01:43.580 | So for example, there's a whole literature on rodents
01:01:46.440 | and how they signal to one another what is safe to eat.
01:01:51.440 | And they will perform acts of generosity
01:01:57.040 | to their conspecifics that are related to them
01:02:03.220 | or who they were raised with.
01:02:05.400 | So if another animal was raised in the litter
01:02:08.100 | that they were raised in,
01:02:09.900 | although not even at the same time,
01:02:11.500 | they'll be more likely to help that animal.
01:02:13.460 | So there's always some kind of physical relationship
01:02:16.940 | between animals that predicts whether or not
01:02:20.620 | they'll help one another.
01:02:22.060 | For humans, humans, you know,
01:02:27.060 | we have ways of categorizing who's in our group
01:02:31.560 | and who isn't by non-physical ways, right?
01:02:34.860 | Even by just something abstract like an idea.
01:02:38.060 | And we are much more likely to extend help
01:02:41.980 | to people in our own group,
01:02:43.920 | whatever that group may be at that moment,
01:02:47.020 | whatever feature you're using to define who's in your group
01:02:51.060 | and who isn't, we're more likely to help those people
01:02:55.860 | than even members of our own family at times.
01:02:59.260 | So humans are much more flexible in their,
01:03:03.280 | in the way that they help one another,
01:03:07.000 | but also in the way that they harm one another.
01:03:08.900 | So I don't, I don't think I subscribe to,
01:03:13.900 | you know, we are primarily this or we are primarily that.
01:03:21.040 | I don't think humans have essences in that way, really.
01:03:24.340 | - I apologize to take us in this direction
01:03:27.320 | for a brief moment, but I've been really deep
01:03:29.860 | on Stalin and Hitler recently in terms of reading.
01:03:34.100 | And is there something that you think about
01:03:37.900 | in terms of the nature of evil
01:03:41.620 | from a neuroscience perspective?
01:03:44.060 | Is there some lessons that are sort of hopeful
01:03:49.060 | about human civilization that we can find
01:03:57.260 | in our brain with regard to the Hitlers of the world?
01:04:00.280 | Do you think about the nature of evil?
01:04:05.380 | - Yeah, I do.
01:04:07.460 | I don't know that what I have to say is so useful from a,
01:04:12.460 | I don't know that I can say as a neuroscientist,
01:04:14.140 | well, here's a study that, you know,
01:04:17.740 | so I sort of have to take off my lab coat, right?
01:04:20.220 | And now I'm gonna now conjecture as a human
01:04:22.840 | who just also, who has opinions,
01:04:24.700 | but who also maybe has some knowledge
01:04:26.700 | about neuroscience, but I'm not speaking
01:04:29.300 | as a neuroscientist when I say this
01:04:30.660 | 'cause I don't think neuroscientists know enough, really,
01:04:33.720 | to be able to say, but I guess,
01:04:36.100 | the kinds of things I think about are,
01:04:38.040 | what, so I have always thought,
01:04:44.260 | even before I knew anything about neuroscience,
01:04:47.260 | I've always thought that,
01:04:49.700 | I don't think anybody could become Hitler,
01:04:54.980 | but I think the majority of people can be,
01:04:58.520 | are capable of doing very bad things.
01:05:02.760 | It's just, the question is really,
01:05:06.660 | how much encouragement does it take from the environment
01:05:09.240 | to get them to do something bad?
01:05:11.300 | - That's what I, kind of when I look at the life of Hitler,
01:05:14.460 | it seems like there's so many places where--
01:05:19.460 | - Something could have intervened.
01:05:20.980 | - Intervened, no, it could change completely the person.
01:05:23.380 | I mean, there's the caricature,
01:05:25.260 | like the obvious places where he was an artist,
01:05:28.500 | and if he wasn't rejected as an artist,
01:05:30.380 | he was a reasonably good artist,
01:05:32.060 | so that could have changed, but just his entire,
01:05:34.420 | like where he went in Vienna and all these kinds of things,
01:05:37.740 | like little interactions could have changed,
01:05:39.980 | and there's probably millions of other people
01:05:44.260 | who are capable, who the environment may be able to mold
01:05:49.260 | in the same way it did this particular person
01:05:51.760 | to create this particular kind of charismatic leader
01:05:55.940 | in this particular moment of time.
01:05:57.540 | - Absolutely, and I guess the way that I would say it,
01:06:01.340 | I would agree 100%,
01:06:02.660 | and I guess the way that I would say it is like this.
01:06:05.300 | In the West, we have a way of reasoning
01:06:10.380 | about causation, which focuses on single,
01:06:17.100 | simple causes for things.
01:06:20.380 | There's an essence to Hitler,
01:06:22.680 | there's an essence to his character.
01:06:24.800 | He was born with that essence,
01:06:26.640 | or it was forged very, very early in his life,
01:06:30.180 | and that explains the landscape of his,
01:06:35.180 | the horrible landscape of his behavior,
01:06:37.880 | but there's another way to think about it,
01:06:41.120 | a way that actually is much more consistent
01:06:42.820 | with what we know about biology,
01:06:45.760 | how biology works in the physical world,
01:06:49.080 | and that is that most things are complex,
01:06:52.160 | not as in, wow, this is really complex and hard,
01:06:54.280 | but complex as in complexity,
01:06:56.880 | that is more than the sum of their parts,
01:06:59.800 | and that most phenomena have many, many
01:07:03.880 | weak, nonlinear interacting causes,
01:07:08.760 | and so little things that we might not even be aware of
01:07:13.760 | can shift someone's developmental trajectory
01:07:17.200 | from this to that, and that's enough
01:07:20.000 | to take it on a whole set of other paths,
01:07:23.280 | and that these things are happening all the time.
01:07:28.400 | So it's not random, and it's not really,
01:07:31.240 | it's not deterministic in the sense
01:07:32.840 | that everything you do determines your outcome,
01:07:35.940 | but it's a little more like
01:07:37.880 | you're nudging someone from one set of possibilities
01:07:44.520 | to another set of possibilities,
01:07:46.640 | but I think the thing is,
01:07:48.560 | the thing that I find optimistic
01:07:50.000 | is that the other side of that coin is also true, right?
01:07:55.000 | So look at all the people who risked their lives
01:08:00.920 | to help people they didn't even know.
01:08:05.320 | I mean, I just watched "Borat," the new "Borat" movie,
01:08:10.000 | and the thing that I came away with,
01:08:12.460 | but you know, the thing I came away with was
01:08:16.080 | look at how generous people were in that,
01:08:19.720 | or 'cause he's making,
01:08:20.720 | there are a lot of people he makes fun of, and that's fine,
01:08:23.460 | but think about those two guys, those--
01:08:27.000 | - The Trump supporter guys.
01:08:28.440 | - The Trump supporter guys.
01:08:29.960 | Those guys-- - That was cool.
01:08:31.120 | - The kindness in them, right?
01:08:33.180 | - They took a complete stranger in a pandemic
01:08:38.180 | into their house.
01:08:40.380 | Who does that?
01:08:42.920 | Like, that's a really nice thing,
01:08:44.720 | or there's one scene,
01:08:46.160 | I mean, I don't wanna spoil it for people
01:08:47.800 | who haven't seen it,
01:08:49.320 | but there's one scene where he goes in,
01:08:51.920 | he dresses up as a Jew.
01:08:53.620 | I laughed myself sick at that scene, seriously,
01:08:58.520 | but he goes in, and there are these two old Jewish ladies.
01:09:03.080 | What a bunch of sweethearts, oh my gosh, like, really?
01:09:09.160 | I mean, that was what I was struck by, actually.
01:09:12.240 | I mean, there are other ones, or like the babysitter, right?
01:09:15.440 | I mean, she was really kind,
01:09:18.560 | and yeah, so that's really what I was more struck by.
01:09:22.840 | Sure, there are other people who do very bad things,
01:09:28.640 | or say bad things, or whatever,
01:09:30.520 | but there's one guy who's completely stoic,
01:09:35.160 | like the guy who's doing the,
01:09:38.280 | sending the messages, I don't know if it's fax or whatever.
01:09:41.760 | He's just completely stoic,
01:09:43.440 | but he's doing his job, actually.
01:09:45.600 | You don't know what he was thinking inside his head.
01:09:48.880 | You don't know what he was feeling,
01:09:49.840 | but he was totally professional doing his job.
01:09:52.140 | So I guess I just, I had a bit of a different view, I guess,
01:09:58.080 | so I also think that about people.
01:10:00.420 | I think everybody is capable of kindness,
01:10:04.020 | but the question is, how much does it take,
01:10:08.820 | and what are the circumstances?
01:10:09.880 | So for some people, it's gonna take a lot,
01:10:12.080 | and for some people, it only takes a little bit,
01:10:14.360 | but are we actually cultivating an environment
01:10:19.360 | for the next generation that provides opportunities
01:10:26.680 | for people to go in the direction of caring and kindness?
01:10:32.280 | - Yeah.
01:10:33.120 | - Or, and I'm not saying that as like a Pollyanna-ish person.
01:10:39.840 | I think there's a lot of room for competition
01:10:42.600 | and debate and so on,
01:10:45.240 | but I don't see Hitler as an anomaly,
01:10:49.100 | and I never have.
01:10:50.200 | That was even before I learned anything about neuroscience,
01:10:52.680 | and now, I would say, knowing what we know
01:10:55.000 | about developmental trajectories and life histories
01:10:57.280 | and how important that is,
01:10:58.640 | knowing what we know, the whole question
01:11:03.600 | of nature versus nurture is a completely wrong question.
01:11:07.860 | We have the kind of nature that requires nurture.
01:11:11.000 | We have the kind of genes that allow infants to be born
01:11:14.960 | with unfinished brains, where their brains are wired
01:11:19.800 | across a 25-year period with wiring instructions
01:11:23.040 | from the world that is created for them,
01:11:25.960 | and so I don't think Hitler is an anomaly.
01:11:29.920 | Even if it's less probable that that would happen,
01:11:37.020 | it's possible that it could happen again,
01:11:39.160 | and it's not like he's a bad seed.
01:11:43.280 | I mean, that doesn't, I just wanna say,
01:11:45.640 | of course, he's completely 100% responsible for his actions
01:11:48.720 | and all the bad things that happen,
01:11:50.120 | so I'm not in any way, this is not me saying--
01:11:53.400 | - But the environment is also responsible, in part,
01:11:56.240 | for creating the evil in this world,
01:11:59.240 | so like Hitlers in different versions of,
01:12:04.440 | more subtle, smaller-scale versions of evil,
01:12:07.500 | but I tend to believe that there's a much stronger,
01:12:12.500 | I don't like to talk about evolutionary advantages,
01:12:16.340 | but it seems like it makes sense for love
01:12:20.500 | to be a more powerful, emergent phenomena
01:12:25.220 | of our collective intelligence
01:12:26.700 | versus hate and evil and destruction,
01:12:30.540 | because from a survival, from a niche perspective,
01:12:34.380 | it seems to be, like, in my own life,
01:12:38.600 | in my thinking about the intuition
01:12:40.280 | about the way humans work together to solve problems,
01:12:44.580 | it seems that love is a very useful tool.
01:12:47.780 | - I definitely agree with you,
01:12:50.080 | but I think the caveat here is that, you know, humans,
01:12:55.080 | the research suggests that humans are capable
01:13:00.680 | of great acts of kindness and great acts of generosity
01:13:04.600 | to people in their in-group.
01:13:06.520 | - Right.
01:13:08.200 | - And-- - So we're also tribal.
01:13:10.880 | - Yeah, I mean, that's the kitschy way to say it.
01:13:14.420 | We're tribes, we're tribal, yeah.
01:13:16.600 | So that's the kitschy way to say it.
01:13:18.300 | What I would say is that, you know,
01:13:22.360 | there are a lot of features
01:13:24.620 | that you can use to describe yourself.
01:13:28.760 | You don't have one identity,
01:13:30.160 | you don't have one self, you have many selves,
01:13:32.200 | you have many identities.
01:13:33.620 | Sometimes you're a man, sometimes you're a scientist,
01:13:37.440 | sometimes you're a, do you have a brother or a sister?
01:13:40.160 | - Yeah, brother.
01:13:41.000 | - So sometimes you're a brother, you know, you,
01:13:43.680 | sometimes you're a friend.
01:13:45.160 | - Sometimes you're a human, so you can keep zooming out.
01:13:47.680 | - Yes, exactly. - Living organism on Earth.
01:13:49.700 | - Yes, exactly, that's exactly, that's exactly right.
01:13:53.440 | And so there are some people who,
01:13:59.360 | there is research which suggests
01:14:01.860 | that there are some people who will tell you,
01:14:05.200 | I think it's appropriate and better to help,
01:14:08.680 | I should help my family more than I should help my neighbors
01:14:11.460 | and I should help my neighbors more
01:14:12.880 | than I should help the average stranger
01:14:15.400 | and I should help, you know,
01:14:18.420 | the average stranger in my country
01:14:20.040 | more than I should help somebody outside my country
01:14:22.160 | and I should help humans more than I should help,
01:14:25.520 | you know, other animals and I should, right?
01:14:27.080 | So there's a clear hierarchy of helping.
01:14:29.240 | And there are other people who, you know,
01:14:33.160 | they are, their niche is much more inclusive, right?
01:14:37.160 | And that they're humans first, right?
01:14:40.960 | Or creatures of the Earth first, let's say.
01:14:43.700 | And I don't think we know how flexible those attitudes are
01:14:50.440 | because I don't think the research really tells us that,
01:14:54.000 | but in any case, there are, you know,
01:14:56.880 | and there are beliefs, people also have beliefs about,
01:15:00.000 | there's this really interesting research
01:15:02.280 | in really in anthropology that looks at
01:15:06.760 | what are cultures particularly afraid of?
01:15:12.440 | Like what, the people in a particular culture
01:15:15.200 | are organizing their social systems
01:15:17.400 | to prevent certain types of problems.
01:15:20.160 | So what are the problems that they're worried about?
01:15:21.880 | And so there are some cultures
01:15:23.880 | that are much more hierarchical
01:15:25.980 | and some cultures that are, you know, much more egalitarian.
01:15:30.560 | There are some cultures that, you know,
01:15:32.420 | in the debate of like getting along versus getting ahead,
01:15:35.740 | there are some cultures that really prioritize
01:15:38.700 | the individual over the group.
01:15:40.140 | And there are other cultures that really prioritize
01:15:41.900 | the group over the individual.
01:15:43.340 | You know, it's not like one of these is right
01:15:45.380 | and one of these is wrong.
01:15:46.660 | It's that, you know, different combinations
01:15:48.940 | of these features are different solutions
01:15:51.460 | that humans have come up with for living in groups,
01:15:55.660 | which is a major adaptive advantage of our species.
01:15:58.320 | And it's not the case that one of these is better
01:16:02.700 | and one of these is worse.
01:16:03.860 | Although as a person, of course, I have opinions about that.
01:16:07.380 | And as a person, I can say,
01:16:10.660 | I would very much prefer certain,
01:16:12.860 | I have certain beliefs and I really want everyone
01:16:15.340 | in the world to live by those beliefs, you know.
01:16:17.400 | But as a scientist, I know that it's not really the case
01:16:21.420 | that for the species, any one of these is better
01:16:25.940 | than any other.
01:16:26.980 | There are different solutions that work differentially well
01:16:29.960 | in particular, you know, ecological parts of the world.
01:16:34.960 | But for individual humans, there are definitely
01:16:40.900 | some systems that are better
01:16:42.140 | and some systems that are worse, right?
01:16:43.860 | But when anthropologists or when neuroscientists
01:16:46.860 | or biologists are talking, they're
01:16:48.860 | not usually talking about the lives of individual people,
01:16:52.060 | they're talking about, you know, the species,
01:16:54.460 | what's better for the species,
01:16:55.700 | the survivability of the species.
01:16:57.420 | And what's better for the survivability of the species
01:17:00.020 | is variation, that we have lots of cultures
01:17:03.700 | with lots of different solutions
01:17:05.820 | because if the environment were to change drastically,
01:17:09.900 | some of those solutions will work better than others.
01:17:17.620 | And you can see that happening with COVID.
01:17:21.540 | - Right, so some people might be more susceptible
01:17:23.820 | to this virus than others.
01:17:26.300 | And so variation is very useful.
01:17:28.420 | Say COVID was much, much more destructive than it is
01:17:32.060 | and like, I don't know, 20% of the population died.
01:17:36.040 | You know, it's good to have variability
01:17:40.220 | because then at least some percent will survive.
01:17:42.900 | - Yeah, I mean, you know, the way that I used to describe it
01:17:46.580 | was, you know, using, you know, those movies
01:17:51.340 | like "The War of the Worlds" or "Pacific Rim,"
01:17:55.220 | you know, where like aliens come down from outer space
01:17:58.400 | and they, you know, wanna kill humans.
01:18:01.260 | And so all the humans band together as a species,
01:18:04.220 | like, and they all, like all the, you know,
01:18:06.540 | little squabbling from countries and whatever,
01:18:08.940 | all, you know, goes away.
01:18:10.500 | And everyone is just one big, you know,
01:18:13.380 | well, that, you know, that doesn't happen.
01:18:18.380 | I mean, 'cause COVID is, you know, a virus like COVID-19
01:18:24.740 | is like a creature from outer space.
01:18:29.980 | And that's not what you see happening.
01:18:31.820 | What you do see happening, it is true that some people,
01:18:35.340 | I mean, we could use this as an example
01:18:36.860 | of essentialism also.
01:18:37.900 | So just to say like exposure to the virus
01:18:40.580 | does not mean that you will become infected
01:18:43.180 | with a disease.
01:18:44.300 | So, I mean, in controlled studies,
01:18:48.220 | one of which was actually a coronavirus, not COVID,
01:18:51.580 | but this was, these are studies from
01:18:53.540 | 10 or so years ago, you know,
01:18:55.460 | only somewhere between 20 and 40% of people
01:18:58.660 | developed respiratory illness
01:19:02.220 | when a virus was placed in their nose.
01:19:04.700 | - Yeah.
01:19:05.540 | - And so--
01:19:07.180 | - And there's a dose question, all those--
01:19:09.100 | - Well, not in these studies, actually.
01:19:10.700 | So in these studies, the dose was consistent
01:19:13.220 | across all people.
01:19:14.540 | And everything, you know, they were sequestered
01:19:17.900 | in hotel rooms and what they ate was, you know,
01:19:21.420 | measured out by scientists and so on.
01:19:23.260 | And so when you hold dose, I mean,
01:19:25.660 | the dose issue is a real issue in the real world,
01:19:28.020 | but in these studies, that was controlled.
01:19:31.980 | And only somewhere between 20, depending on the study,
01:19:35.420 | between 20 and 40% of people became infected with a disease.
01:19:38.740 | So exposure to a virus doesn't mean de facto
01:19:43.340 | that you will develop an illness.
01:19:46.340 | You will be a carrier and you will spread the virus
01:19:49.500 | to other people, but you yourself may not,
01:19:52.540 | your immune system may be in a state
01:19:55.820 | that you can make enough antibodies
01:19:58.740 | to not show symptoms, not develop symptoms.
01:20:03.460 | And so, of course, what this means is,
01:20:08.300 | again, is that, you know, like if I asked you,
01:20:11.340 | do you think a virus is the cause of a common cold,
01:20:16.340 | or, you know, most people, if I asked this question,
01:20:19.780 | I can tell you, 'cause I asked this question.
01:20:21.740 | So do you think a virus is the cause of a cold?
01:20:25.740 | Most people would say, yes, I think it is.
01:20:27.740 | And then I say, yeah, well, only 20 to 40% of people
01:20:30.560 | develop respiratory illness upon exposure to a virus.
01:20:34.260 | So clearly it is a necessary cause,
01:20:37.980 | but it's not a sufficient cause.
01:20:39.620 | And there are other causes, again,
01:20:41.020 | so not simple single causes for things, right?
01:20:44.060 | Multiple interacting influences.
01:20:47.500 | So it is true that individuals vary
01:20:50.380 | in their susceptibility to illness upon exposure,
01:20:53.520 | but different cultures have different sets of norms
01:20:57.860 | and practices that allow, that will slow
01:21:02.220 | or speed the spread.
01:21:05.780 | And that's the point that I was actually trying to make here
01:21:08.780 | that, you know, when the environment changes,
01:21:13.780 | that is, there's a mutation of a virus
01:21:18.700 | that is incredibly infectious,
01:21:21.580 | some cultures will succumb,
01:21:25.500 | people in some cultures will succumb faster
01:21:27.560 | because of the particular norms and practices
01:21:32.340 | that they've developed in their culture
01:21:35.580 | versus other cultures.
01:21:36.580 | Now, there could be some other, you know,
01:21:40.540 | thing that changes that,
01:21:41.980 | where those other cultures, you know, would do better.
01:21:46.420 | So very individualistic cultures like ours
01:21:49.820 | may do much better under other types of selection pressures.
01:21:53.900 | But for COVID, for things like COVID, you know,
01:21:58.140 | my colleague, Michelle Gelfand,
01:22:00.040 | her research shows that she looks at like loose cultures
01:22:04.000 | and tight cultures, so cultures that have very, very strict
01:22:07.640 | rules versus cultures that are much more individualistic
01:22:10.440 | and where personal freedoms are more valued.
01:22:14.220 | And she, you know, her research suggests
01:22:17.200 | that for pandemic circumstances,
01:22:20.520 | tight cultures actually, the people survive better.
01:22:23.480 | - So, still lingering a little bit longer.
01:22:27.200 | We started this part of the conversation talking about,
01:22:30.640 | you know, did humans evolve to think,
01:22:33.360 | did the human brain evolve to think,
01:22:36.000 | implying is there like a progress to the thing
01:22:39.480 | that's always improving?
01:22:41.480 | (laughing)
01:22:42.320 | - That's right, we never, yeah.
01:22:43.600 | And so the answer is no.
01:22:46.040 | - But let me sort of push back.
01:22:47.840 | But so your intuition is very strong here,
01:22:51.000 | not your intuition, the way you described this,
01:22:54.000 | but is it possible there's a direction to this evolution?
01:22:58.560 | Like, do you think of this evolution as having a direction?
01:23:01.920 | Like it's like walking along a certain path
01:23:04.620 | towards something?
01:23:06.520 | 'Cause we, you know, what is it?
01:23:08.960 | (laughing)
01:23:11.200 | Is it Elon Musk said like the earth got bombarded
01:23:16.720 | with photons and then all of a sudden,
01:23:20.680 | like a Tesla was launched into space or whatever,
01:23:23.200 | a rocket started coming.
01:23:24.440 | Like, is there a sense in which,
01:23:26.800 | even though in the, like within the system,
01:23:30.800 | the evolution seems to be this mess of variation,
01:23:33.160 | we're kind of trying to find our niches and so on.
01:23:36.000 | But do you think there ultimately when you zoom out,
01:23:38.800 | there is a direction that's strong,
01:23:40.900 | that does tend towards greater complexity and intelligence?
01:23:45.900 | - No.
01:23:50.120 | (laughing)
01:23:50.960 | So, I mean, and I, and again, what I would say is,
01:23:53.640 | I'm really just echoing people who are much smarter
01:23:58.280 | than I am about this.
01:24:00.120 | - I see you're saying smarter.
01:24:01.600 | I thought it doesn't, there's no,
01:24:03.400 | I thought there's no smarter.
01:24:04.960 | - No, I didn't say there's no smarter.
01:24:06.160 | I said there's no direction.
01:24:07.560 | - Okay.
01:24:08.400 | - So I think the thing to say,
01:24:10.320 | or what I understand to be the case is that
01:24:13.040 | there's variation, it's not unbounded variation.
01:24:17.400 | And there are selectors.
01:24:18.760 | There are pressures that will select.
01:24:22.540 | And so not anything is possible because we live on a planet
01:24:26.840 | that has certain physical realities to it, right?
01:24:30.200 | But those physical realities are what constrain
01:24:33.920 | the possibilities, the physical realities of our genes
01:24:40.160 | and the physical realities of our corporeal bodies
01:24:43.960 | and the physical realities of life on this planet.
01:24:48.960 | So what I would say is that there's no direction
01:24:57.060 | but there is, it's not infinite possibility
01:25:02.060 | because we live on a particular planet
01:25:07.020 | that has particular statistical regularities in it
01:25:10.220 | and some things will never happen.
01:25:12.380 | And so all of those things are interacting
01:25:15.980 | with our genes and so on and our,
01:25:20.980 | the physical nature of our bodies
01:25:23.820 | to make some things more possible
01:25:25.460 | and some things less possible.
01:25:26.740 | Look, I mean, humans have very complex brains
01:25:29.340 | but birds have complex brains.
01:25:31.060 | And octopuses have very complex brains too,
01:25:36.060 | and all three of those brains
01:25:40.660 | are somewhat different from one another.
01:25:43.500 | Some birds have very complex brains.
01:25:47.320 | Some even have rudimentary language.
01:25:48.820 | They have no cerebral cortex.
01:25:51.140 | I mean, they admittedly, they have,
01:25:53.140 | this is now lesson two, right?
01:25:54.900 | They have, is it lesson two or lesson one?
01:25:56.580 | Let me think.
01:25:57.420 | No, this is lesson one.
01:25:58.900 | They have the same neurons,
01:26:03.900 | the same neurons that in a human
01:26:07.620 | become the cerebral cortex, birds have those neurons.
01:26:10.980 | They just don't form themselves into a cerebral cortex.
01:26:13.660 | But I mean, crows, for example,
01:26:15.260 | are very sophisticated animals.
01:26:17.140 | They can do a lot of the things that humans can do.
01:26:19.780 | In fact, all of the things that humans do
01:26:22.340 | that are very special, that seem very special,
01:26:24.900 | there's at least one other animal on the planet
01:26:26.940 | that can do those things too.
01:26:29.040 | What's special about the human brain
01:26:30.780 | is that we put them all together.
01:26:33.060 | So we learn from one another.
01:26:35.900 | We don't have to experience everything ourselves.
01:26:37.620 | We can watch another animal or another human
01:26:40.760 | experience something and we can learn from that.
01:26:42.600 | Well, there are many other animals
01:26:44.060 | who can learn by copying.
01:26:45.820 | That we communicate with each other very, very efficiently.
01:26:48.620 | We have language.
01:26:49.460 | But we're not the only animals
01:26:51.080 | who are efficient communicators.
01:26:52.900 | There are lots of other animals
01:26:54.280 | who can efficiently communicate, like bees, for example.
01:26:57.080 | We cooperate really well with one another to do grand things
01:27:01.700 | but there are other animals that cooperate too.
01:27:03.420 | And so every innovation that we have,
01:27:06.180 | other animals have too.
01:27:07.740 | What we have is we have all of those together
01:27:11.300 | interwoven in this very complex dance
01:27:14.900 | in a brain that is not unique, exactly,
01:27:20.900 | but it does have some features
01:27:25.300 | that make it particularly useful for us
01:27:29.920 | to do all of these things,
01:27:32.560 | to have all of these things intertwined.
01:27:35.700 | So our brains are, actually the last time we talked,
01:27:40.700 | I made a mistake 'cause I said,
01:27:43.640 | in my enthusiasm I said,
01:23:46.680 | our brains are not larger,
01:23:50.480 | relative to our bodies,
01:23:51.640 | our brains are not larger than other primates'.
01:23:55.800 | And that's actually not true.
01:23:57.400 | Our brain relative to our body size is somewhat larger.
01:24:01.520 | So an ape that's not a human,
01:24:05.440 | their brain is larger relative to their body size
01:24:09.760 | than, say, a smaller monkey's.
01:28:13.320 | And a human's brain is larger
01:28:15.880 | relative to its body size than a gorilla's.
01:28:18.160 | - So that's a good approximation of your,
01:28:21.000 | of whatever, of the bunch of stuff
01:28:23.480 | that you can shove in there.
01:28:25.200 | - But, well what I was gonna say is,
01:28:26.600 | but our cerebral cortex is not larger
01:28:29.560 | than what you would expect for a brain of its size.
01:28:33.560 | So relative to say an ape, like a gorilla or a chimp,
01:28:38.560 | or even a mammal like a dolphin or an elephant,
01:28:44.440 | you know, our brains, our cerebral cortex
01:28:49.280 | is as large as you would expect it to be
01:28:51.640 | for a brain of our size.
01:28:54.160 | So there's nothing special about our cerebral cortex.
01:28:58.000 | And this is something I explain in the book,
01:29:00.240 | where I say, okay, you know, like by analogy,
01:29:03.320 | if you walk into somebody's house
01:29:05.640 | and you see that they have a huge kitchen,
01:29:08.340 | you know, you might think, well, maybe, you know,
01:29:10.720 | maybe this is a place I really definitely wanna eat dinner
01:29:13.520 | at because, you know, these people must be gourmet cooks.
01:29:16.680 | But you don't know anything about what the size
01:29:18.360 | of their kitchen means unless you consider it
01:29:20.160 | in relation to the size of the rest of the house.
01:29:23.120 | If it's a big kitchen in a really big house,
01:29:25.800 | it's not telling you anything special, right?
01:29:29.320 | If it's a big kitchen in a small house,
01:29:32.000 | then that might be a place where
01:29:33.920 | you wanna stay for dinner, because it's more likely
01:29:36.760 | that that kitchen is large for a special reason.
01:29:39.400 | And so the cerebral cortex of a human brain
01:29:43.680 | isn't in and of itself special because of its size.
01:29:48.520 | However, there are some genetic changes
01:29:53.520 | that have happened in the human brain as it's grown
01:29:58.000 | to whatever size is, you know,
01:30:01.640 | typical for the whole brain size, right?
01:30:04.200 | There are some changes that do give the human brain
01:30:07.660 | slightly more of some capacities.
01:30:12.460 | They're not special, but there's just,
01:30:14.800 | they just, you know, we can do some things
01:30:17.360 | much better than other animals.
01:30:21.620 | And, you know, correspondingly,
01:30:22.900 | other animals can do some things much better than we can.
01:30:25.800 | We can't grow back limbs,
01:30:27.000 | we can't lift 50 times our own body weight.
01:30:29.020 | Well, I mean, maybe you can,
01:30:30.040 | but I can't lift 50 times my own body weight.
01:28:31.880 | - Ants in that regard are very impressive.
01:30:34.120 | And then you're saying with the frontal cortex,
01:30:36.880 | like that's, the size is not always the right measure
01:30:40.760 | of capability, I guess.
01:30:44.140 | So size isn't everything.
01:30:46.520 | - Size isn't everything.
01:30:48.320 | - That's a good quote. You know,
01:30:49.800 | people like it when I disagree,
01:30:51.080 | so let me disagree with you on something
01:30:53.800 | or just like play devil's advocate a little bit.
01:30:56.360 | So you've painted a really nice picture
01:30:58.620 | that evolution doesn't have a direction,
01:31:00.840 | but is it possible if we just ran earth over and over again,
01:31:06.380 | like this video game,
01:31:08.660 | that the final result would be the same?
01:31:11.840 | So in the sense that we're,
01:31:14.160 | eventually there'll be an AGI type HAL 9000 type system
01:31:18.760 | that just like flies and colonizes nearby earth-like planets.
01:31:23.760 | And it always will be the same.
01:31:26.920 | And the different organisms
01:31:29.040 | and the different evolution of the brain,
01:31:31.640 | like it doesn't feel like it has like a direction,
01:31:35.200 | but given the constraints of earth
01:31:37.380 | and whatever this imperative,
01:31:40.640 | whatever the hell is running this universe,
01:31:43.260 | like it seems like it's running towards something,
01:31:46.920 | is it possible that it will always be the same?
01:31:51.780 | Thereby, there will be a direction.
01:31:51.780 | - Yeah, I think, you know,
01:31:54.660 | as you know better than anyone else
01:31:57.620 | that the answer to that question is,
01:31:59.280 | of course, there's some probability
01:32:00.800 | that that could happen, right?
01:32:03.020 | It's not a yes or no answer.
01:32:04.340 | It's what's the probability that that would happen?
01:32:07.600 | And there's a whole distribution of possibilities.
01:32:12.600 | So maybe we end up,
01:32:16.280 | what's the probability we end up
01:32:17.420 | with exactly the same complement of creatures, including us?
01:32:22.420 | What's the likelihood that we end up with, you know,
01:32:25.940 | creatures that are similar to humans that are,
01:32:29.980 | but you know, similar in certain ways, let's say,
01:32:33.100 | but not exactly humans or, you know,
01:32:35.300 | all the way to a completely different
01:32:37.820 | distribution of creatures?
01:32:41.460 | - What's your intuition?
01:32:42.300 | Like if you were to bet money,
01:32:43.860 | what does that distribution look like
01:32:45.420 | if we ran earth over and over and over again?
01:32:47.300 | - I would say given the,
01:32:49.680 | you're now asking me questions that--
01:32:51.180 | - This is not science.
01:32:52.220 | - This is not science.
01:32:53.440 | But I would say, okay, well,
01:32:54.620 | what's the probability that it's gonna be a carbon life form?
01:32:58.700 | Probably high,
01:33:00.700 | but that's because I don't know anything about--
01:33:03.180 | - Alternatives?
01:33:04.020 | - Yeah, you know, I don't,
01:33:05.100 | I'm not really well versed in that.
01:33:07.900 | What's the probability that, you know,
01:33:09.380 | so what's the probability that the animals will begin
01:33:12.160 | in the ocean and crawl out onto land?
01:33:14.740 | - Versus the other way.
01:33:15.580 | - Versus the, I would say probably high.
01:33:18.280 | I don't know, but, you know,
01:33:20.620 | but do I think, what's the likelihood
01:33:22.800 | that we would end up with exactly the same or very similar?
01:33:26.780 | I think it's low, actually.
01:33:28.700 | I wouldn't say it's zero,
01:33:29.840 | but I would say it's not 100%
01:33:32.140 | and I'm not even sure it's 50%.
01:33:34.540 | You know, I would say,
01:33:36.440 | I don't think that we're here by accident
01:33:38.980 | because I think, like I said, there are constraints.
01:33:41.700 | Like, there are some physical constraints about Earth.
01:33:44.800 | Now, of course, if you were a cosmologist,
01:33:46.600 | you could say, well, the fact that the Earth is,
01:33:49.220 | if you were to do the Big Bang over again
01:33:51.300 | and keep doing it over and over and over again,
01:33:53.060 | would you still get the same solar systems?
01:33:56.340 | Would you still get the same planets?
01:33:58.240 | Would, you know, would you still get the same galaxies,
01:34:00.820 | the same solar systems, the same planets?
01:34:02.800 | You know, I don't know, but my guess is probably not
01:34:05.780 | because there are random things that happen
01:34:08.540 | that can, again, send things in one direction, you know,
01:34:12.580 | make one set of trajectories possible
01:34:14.500 | and another set impossible.
01:34:15.900 | So, but I guess my,
01:34:19.040 | if I were gonna bet something, money or something valuable,
01:34:25.740 | I would probably say it's not zero and it's not 100%
01:34:30.740 | and it's probably not even 50%.
01:34:33.280 | So, there's some probability, but I don't know.
01:34:34.120 | - That it will be similar.
01:34:35.680 | - That it would be similar, but I don't think,
01:34:37.320 | I just think there are too many degrees of freedom.
01:34:40.560 | There are too many degrees of freedom.
01:34:42.840 | I mean, one of the real tensions in writing this book
01:34:47.660 | is to, on the one hand, there's some truth in saying
01:34:52.720 | that humans are not special.
01:34:57.680 | We are just, you know, we're not special
01:35:01.720 | in the animal kingdom.
01:35:03.020 | All animals are well-adapted; if they've survived,
01:35:08.020 | they're well-adapted to their niche.
01:35:11.100 | It does happen to be the case that our niche is large.
01:35:15.520 | For any individual human, your niche is whatever it is,
01:35:18.380 | but for the species, right, we live almost everywhere,
01:35:22.580 | not everywhere, but almost everywhere on the planet,
01:35:25.180 | but not in the ocean.
01:35:28.540 | And actually, other animals like bacteria, for example,
01:35:32.100 | have us beat by miles, you know, hands down, right?
01:35:35.540 | So, by any definition, we're not special.
01:35:40.540 | We're just, you know, adapted to our environment.
01:35:46.660 | - But bacteria don't have a podcast.
01:35:48.340 | - Exactly, exactly.
01:35:50.300 | - They're not able to introspect.
01:35:51.620 | - So, that's the tension, right?
01:35:53.060 | So, on the one hand, you know, we're not special animals.
01:35:55.320 | We're just, you know, particularly well-adapted to our niche.
01:35:58.260 | On the other hand, our niche is huge,
01:36:00.040 | and we don't just adapt to our environment.
01:36:03.060 | We add to our environment.
01:36:04.740 | We make stuff up, give it a name, and then it becomes real.
01:36:08.460 | And so, no other animal can do that.
01:36:10.820 | And so, I think the way to think about it
01:36:14.460 | from my perspective, or the way I made sense of it,
01:36:16.780 | is to say, you can look at any individual
01:36:20.620 | single characteristic that a human has
01:36:23.020 | that seems remarkable,
01:36:26.420 | and you can find that in some other animal.
01:36:28.620 | What you can't find in any other animal
01:36:33.780 | is all of those characteristics together
01:36:37.240 | in a brain that is souped up in particular ways,
01:36:43.060 | like ours is, and if you combine these things,
01:36:46.340 | multiple interacting causes, right?
01:36:48.820 | Not one essence, like your cortex, your big neocortex,
01:36:53.720 | but, which isn't really that big.
01:36:56.160 | I mean, it's just big for your big brain,
01:36:59.860 | for the size of your big brain.
01:37:01.220 | It's the size it should be.
01:37:02.860 | If you add all those things together,
01:37:05.980 | and they interact with each other,
01:37:07.360 | that produces some pretty remarkable results.
01:37:10.100 | And if you're aware of that,
01:37:14.420 | then you can start asking different kinds of questions
01:37:19.420 | about what it means to be human,
01:37:22.620 | and what kind of a human you wanna be,
01:37:25.180 | and what kind of a world do you wanna curate
01:37:28.800 | for the next generation of humans?
01:37:31.100 | I think that's the goal anyways, right?
01:37:33.780 | It's just to have a glimpse of,
01:37:36.220 | instead of thinking about things in a simple, linear way,
01:37:42.780 | just to have a glimpse of some of the things that matter,
01:37:45.780 | that evidence suggests matter,
01:37:49.060 | to the kind of brain, and the kind of bodies that we have.
01:37:54.060 | Once you know that, you can work with it a little bit.
01:37:58.780 | - You write, "Words have power over your biology."
01:38:02.100 | Right now, I can text the words, "I love you,"
01:38:05.300 | from the United States to my close friend in Belgium,
01:38:08.500 | and even though she cannot hear my voice or see my face,
01:38:12.060 | I will change her heart rate,
01:38:13.940 | her breathing, and her metabolism.
01:38:16.660 | By the way, beautifully written.
01:38:18.260 | Or someone could text something ambiguous to you,
01:38:22.060 | like, "Is your door locked?"
01:38:24.740 | And odds are that it would affect your nervous system
01:38:27.380 | in an unpleasant way.
01:38:29.500 | So, I mean, there's a lot of stuff to talk about here,
01:38:33.060 | but just one way to ask is,
01:38:37.460 | why do you think words have so much power over our brain?
01:38:42.260 | - Well, I think we just have to look at the anatomy
01:38:46.260 | of the brain to answer that question.
01:38:48.220 | So, if you look at the parts of the brain,
01:38:52.780 | the systems that are important for processing language,
01:38:57.780 | you can see that some of these regions
01:39:03.060 | are also important for controlling your major organ systems,
01:39:06.540 | and your autonomic nervous system
01:39:08.540 | that controls your cardiovascular system,
01:39:11.020 | your respiratory system, and so on,
01:39:13.420 | that these regions control your endocrine system,
01:39:18.420 | your immune system, and so on.
01:39:21.340 | So, and you can actually see this in other animals, too.
01:39:24.140 | So, in birds, for example,
01:39:26.140 | the neurons that are responsible for bird song
01:39:29.340 | also control the systems of a bird's body.
01:39:32.020 | And the reason why I bring that up
01:39:33.260 | is that some scientists think that the anatomy
01:39:38.260 | of a bird's brain that control bird song
01:39:43.100 | are homologous or structurally have a similar origin
01:39:46.860 | to the human system for language.
01:39:49.540 | So, the parts of the brain that are important
01:39:52.140 | for processing language are not unique
01:39:54.460 | and specialized for language.
01:39:57.580 | They do many things.
01:39:59.180 | And one of the things they do
01:40:00.340 | is control your major organ systems.
01:40:03.700 | - Do you think we can fall in love,
01:40:05.260 | I have arguments about this all the time,
01:40:07.860 | do you think we can fall in love based on words alone?
01:40:10.580 | - Well, I think people have been doing it for centuries.
01:40:14.020 | I mean, it used to be the case
01:40:15.500 | that people wrote letters to each other,
01:40:17.460 | and then that was how they communicated.
01:40:22.180 | - I guess that's how you and Dan got--
01:40:24.180 | - Exactly, exactly, exactly, yeah, exactly.
01:40:28.100 | - So, is the answer a clear yes there?
01:40:31.180 | Because I get a lot of pushback from people often
01:40:34.060 | that you need the touch and the smell
01:40:37.860 | and the bodily stuff.
01:40:42.140 | - I think the touch and the smell
01:40:43.500 | and the bodily stuff helps.
01:40:45.620 | - Okay.
01:40:46.440 | - But I don't think it's necessary.
01:40:47.900 | - Do you think you can have a lifelong monogamous
01:40:50.700 | relationship with an AI system that only communicates
01:40:54.620 | with you on text, romantic relationship?
01:40:57.700 | - Well, I suppose that's an empirical question
01:41:00.820 | that hasn't been answered yet, but--
01:41:02.740 | - So, yeah.
01:41:03.580 | - I guess what I would say is,
01:41:05.020 | I don't think I could.
01:41:09.320 | Could any human, could the average human,
01:41:14.460 | could, you know, so, if I,
01:41:18.060 | I wanna even modify that and say,
01:41:25.580 | I'm thinking now of Tom Hanks and the movie--
01:41:30.580 | - Castaway?
01:41:31.940 | - Yeah, you know, with Wilson.
01:41:33.300 | - Yeah.
01:41:34.140 | - I think if that was, if you had to make that work,
01:41:37.460 | if you had to make that work--
01:41:39.060 | - With the volleyball, yeah.
01:41:40.420 | - If you had to make it work, could you,
01:41:43.180 | could you, prediction and simulation, right?
01:41:45.340 | So, if you had to make it work, could you make it work?
01:41:49.500 | Using simulation and, you know, your past experience,
01:41:53.460 | could you make it work?
01:41:55.620 | Could you make it work, you as a human, could you,
01:41:59.020 | could you, like--
01:41:59.860 | - Could you have a relationship,
01:42:01.500 | literally with an inanimate object,
01:42:03.620 | and have it sustain you in the way
01:42:05.900 | that another human could?
01:42:07.740 | - Yeah.
01:42:08.860 | - Your life would probably be shorter,
01:42:11.020 | because you wouldn't actually derive
01:42:12.900 | the body budgeting benefits from, right?
01:42:15.380 | So, we've talked about, you know,
01:42:18.940 | how your brain, its most important job
01:42:22.660 | is to control your body, and you can describe that
01:42:25.980 | as your brain running a budget for your body.
01:42:28.220 | And there are metaphorical, you know,
01:42:31.860 | deposits and withdrawals into your body budget,
01:42:34.660 | and you also make deposits and withdrawals
01:42:37.420 | in other people's body budgets, figuratively speaking.
01:42:40.420 | So, you wouldn't have that particular benefit,
01:42:43.840 | so your life would probably be shorter,
01:42:48.300 | but I think it would be harder for some people
01:42:51.100 | than for other people.
01:42:52.420 | - Yeah, I tend to, my intuition is that
01:42:53.860 | you can have a deep, fulfilling relationship
01:42:56.700 | with a volleyball.
01:42:57.860 | I think a lot of the environments that we set up,
01:43:04.020 | I think that's a really good example,
01:43:05.460 | like the constraints of your particular environment
01:43:10.460 | define the, like, I believe like scarcity
01:43:14.060 | is a good catalyst for deep, meaningful connection
01:43:19.060 | with other humans and with inanimate objects.
01:43:21.980 | So, the less you have, the more fulfilling
01:43:24.700 | those relationships are, and I would say
01:43:27.540 | a relationship with a volleyball,
01:43:29.260 | the sex is not great, but everything else,
01:43:31.620 | I feel like it could be a very fulfilling relationship,
01:43:34.660 | which I don't know, from an engineering perspective,
01:43:37.260 | what to do with that.
01:43:38.780 | Just like you said, it is an empirical question, but.
01:43:41.500 | - But there are places to learn about that, right?
01:43:43.540 | So, for example, think about children and their blankets.
01:43:48.140 | Right, so there, there's something tactile
01:43:51.220 | and there's something olfactory,
01:43:53.660 | and it's very comforting.
01:43:55.380 | I mean, even for non-human little animals, right?
01:44:00.220 | Like puppies and, so I don't know about cats, but.
01:44:03.100 | - Cats are cold-hearted, there's no,
01:44:07.620 | there's nothing going on there.
01:44:08.820 | - I don't know, there are some cats
01:44:10.460 | that are very dog-like, I mean, really, so.
01:44:14.100 | - Some cats identify as dogs, yes.
01:44:15.820 | - I think that's true, yeah, they're species fluid.
01:44:19.340 | (both laughing)
01:44:21.380 | - So, you also write, "When it comes to human minds,
01:44:26.100 | "variation is the norm, and what we call, quote,
01:44:29.420 | "human nature is really many human natures."
01:44:32.860 | Again, many questions I can ask here,
01:44:36.300 | but maybe an interesting one to ask is,
01:44:39.300 | I often hear, you know, we often hear this idea
01:44:42.700 | of be yourself.
01:44:43.820 | Is this possible, to be yourself?
01:44:48.160 | Is it a good idea to strive to be yourself?
01:44:51.580 | Is it, does that even have any meaning?
01:44:54.500 | - It's a very Western question, first of all,
01:44:57.340 | because which self are you talking about?
01:44:59.540 | You don't have one self,
01:45:01.900 | there is no self that's an essence of you.
01:45:04.580 | You have multiple selves, actually,
01:45:06.480 | there is research on this.
01:45:09.020 | You know, to quote the great social psychologist,
01:45:12.540 | Hazel Markus, you're never,
01:45:13.900 | you cannot be a self by yourself.
01:45:16.820 | You, you know, you, and so different contexts
01:45:21.080 | pull for or draw on different features of your,
01:45:26.000 | of who you are or what you believe, what you feel,
01:45:28.960 | what your actions are.
01:45:30.140 | Different contexts, you know, will put certain things,
01:45:35.120 | or make more, some features be more in the foreground
01:45:37.840 | and some in the background.
01:45:39.840 | It takes us back right to our discussion earlier
01:45:42.120 | about Stalin and Hitler and so on.
01:45:46.100 | The thing that I would caution,
01:45:48.560 | in addition to the fact that there is no single self,
01:45:51.420 | you know, that you have multiple selves, who you can be,
01:45:54.340 | and you can certainly choose the situations
01:45:59.080 | that you put yourself in to some extent.
01:46:01.240 | Not everybody has complete choice,
01:46:02.880 | but everybody has a little bit of choice.
01:46:04.480 | And I think I said this to you before,
01:46:06.420 | that one of the pieces of advice that we gave Sophia,
01:46:10.240 | you know, when she went, our daughter,
01:46:11.960 | when she was going off to college was,
01:46:14.560 | try to spend time around people,
01:46:18.440 | choose relationships that allow you to be your best self.
01:46:21.340 | We should have said your best selves, but--
01:46:26.840 | - The pool of selves given the environment.
01:46:31.600 | - Yeah, but the one thing I do wanna say is that
01:46:35.400 | the risk of saying be yourself, just be yourself,
01:46:38.320 | is that that can be used as an excuse.
01:46:42.440 | Well, this is just the way that I am, I'm just like this.
01:46:45.540 | And that, I think, we should tremendously resist.
01:46:50.540 | - So that's one, that's for the excuse side,
01:46:54.360 | but you know, I'm really self-critical often,
01:46:57.640 | I'm full of doubt, and people often tell me,
01:47:00.600 | just don't worry about it, just be yourself, man.
01:47:04.120 | And the thing is, it's not, from an engineering perspective,
01:47:09.880 | does not seem like actionable advice,
01:47:12.480 | because I guess constantly worrying about who,
01:47:17.480 | what are the right words to say
01:47:24.080 | to express how I'm feeling is, I guess, myself.
01:47:29.080 | There's a kind of line, I guess,
01:47:32.000 | that this might be a Western idea,
01:47:34.500 | but something that feels genuine
01:47:37.520 | and something that feels not genuine.
01:47:39.420 | And I'm not sure what that means,
01:47:42.600 | 'cause I would like to be fully genuine and fully open,
01:47:45.820 | but I'm also aware, like this morning,
01:47:49.160 | I was very silly and giddy, I was just being funny
01:47:54.160 | and relaxed and light, like there's nothing
01:47:58.320 | that could bother me in the world,
01:48:01.120 | I was just smiling and happy.
01:48:02.680 | And then I remember last night,
01:48:04.080 | I was just feeling very grumpy,
01:48:06.000 | like stuff was bothering me,
01:48:09.120 | like certain things were bothering me.
01:48:10.960 | And what are those, are those the different selves?
01:48:14.520 | Like what, who am I in that, and what do I do?
01:48:17.720 | Because if we take Twitter as an example,
01:48:20.560 | if I actually send a tweet last night
01:48:23.280 | and a tweet this morning,
01:48:24.360 | it's gonna be two very different people tweeting that.
01:48:28.960 | And I don't know what to do with that,
01:48:30.600 | because one does seem to be more me than the other,
01:48:35.000 | but that's maybe because there's a narrative,
01:48:36.960 | the story that I'm trying,
01:48:38.320 | there's something I'm striving to be,
01:48:40.520 | like the ultimate human that I might become.
01:48:43.240 | I have maybe a vision of that,
01:48:44.640 | and I'm trying to become that,
01:48:46.500 | but it does seem like there's a lot
01:48:49.000 | of different minds in there.
01:48:50.440 | And they're all like having a discussion
01:48:54.520 | and a battle for who's gonna win.
01:48:56.800 | - I suppose you could think of it that way,
01:48:58.200 | but there's another way to think of it, I think,
01:49:00.160 | and that is that maybe the more Buddhist way
01:49:03.080 | to think of it, right,
01:49:04.000 | or a more contemplative way to think about it,
01:49:05.760 | which is not that you have multiple personalities
01:49:08.800 | inside your head, but you have,
01:49:11.800 | your brain has this amazing capacity.
01:49:16.800 | It has a population of experiences that you've had
01:49:23.160 | that it can regenerate, reconstitute.
01:49:27.840 | And it can even take bits and pieces
01:49:30.960 | of those experiences and combine them into something new.
01:49:35.020 | And it's often doing this to predict
01:49:39.200 | what's going to happen next and to plan your actions,
01:49:42.240 | but it's also happening, this also happens just,
01:49:46.320 | that's what mind-wandering is,
01:49:47.840 | or just internal thought and so on.
01:49:50.240 | It's the same mechanism, really.
01:49:52.320 | And so a lot of times we hear the saying,
01:49:57.040 | just think, if you think differently,
01:49:58.840 | you'll feel differently.
01:50:00.840 | But your brain is having a conversation
01:50:04.200 | continually with your body.
01:50:06.540 | And your body, your brain's trying to control your body,
01:50:10.640 | well, trying, your brain is controlling your body,
01:50:13.000 | your body is sending information back to the brain.
01:50:16.120 | And in part, the information that your body sends back
01:50:19.760 | to your brain, just like the information
01:50:23.600 | coming from the world, initiates the next volley
01:50:27.760 | of predictions or simulations.
01:50:30.280 | So in some ways, you could also say,
01:50:32.300 | the way that you feel, I think we talked before
01:50:37.240 | about affective feeling or mood coming from the sensations
01:50:41.960 | of body budgeting, you know, influences what you think.
01:50:50.240 | And as much as, so feelings influence thought
01:50:54.800 | as much as thought influences feeling, and maybe more.
01:50:58.720 | - But just, the whole thing doesn't seem stable.
01:51:01.440 | - Well, it's a dynamic system, Mr. Engineer.
01:51:04.360 | - Yeah.
01:51:05.200 | - Right, it's a dynamic, it's a dynamical system, right?
01:51:07.760 | Non-linear dynamical system.
01:51:09.360 | And I think that's, I'm actually writing a paper
01:51:11.780 | with a bunch of engineers about this, actually.
01:51:14.980 | But I mean, other people have talked about the brain
01:51:17.520 | as a dynamical system before, but you know,
01:51:20.120 | the real tricky bit is trying to figure out
01:51:22.260 | how do you get mental features out of that system?
01:51:24.880 | I guess it's one thing to figure out how you get
01:51:26.220 | a motor movement out of that system,
01:51:27.760 | it's another thing to figure out how you get
01:51:29.320 | a mental feature, like a feeling of being loved
01:51:32.920 | or a feeling of being worthwhile, or a feeling of,
01:51:36.800 | you know, just basically feeling like shit.
01:51:38.600 | How do you get a feeling, a mental feature,
01:51:41.440 | out of that system?
01:51:42.840 | So what I would say is that you aren't,
01:51:48.020 | the Buddhist thing to say is that you're not one person
01:51:50.760 | and you're not many people.
01:51:52.620 | You are, you are the sum of your experiences
01:51:57.620 | and who you are in any given moment,
01:52:00.860 | meaning what your actions will be,
01:52:05.000 | is influenced by the state of your body
01:52:07.680 | and the state of the world that you've put yourself in.
01:52:10.280 | And you can change either of those things.
01:52:12.880 | One is a little easier to change than the other, right?
01:52:15.200 | You can change your environment by literally getting up
01:52:17.820 | and moving, or you can change it by paying attention
01:52:21.000 | to some things differently and letting other,
01:52:23.400 | some features come to the fore
01:52:26.080 | and other features be backgrounded.
01:52:28.000 | Like I'm looking around your place.
01:52:30.040 | - Oh no, this is not something you should do.
01:52:32.600 | - No, but I'm gonna say one thing.
01:52:34.960 | No green plants.
01:52:36.680 | No green plants.
01:52:39.640 | - 'Cause green plants mean a home
01:52:41.380 | and I want this to be temporary.
01:52:43.160 | - Fair, fair, but--
01:52:45.880 | - What goes through your mind
01:52:47.360 | when you see no green plants?
01:52:48.500 | - No, I'm just making the point that what if you,
01:52:53.500 | again, not everybody has control over their environment.
01:52:59.280 | Some people don't have control over the noise
01:53:01.520 | or the temperature or any of those things.
01:53:04.500 | But everybody has a little bit of control
01:53:07.100 | and you can place things in your environment,
01:53:10.280 | photographs, plants, anything that's meaningful to you
01:53:16.300 | and use it as a shift of environment when you need it.
01:53:20.280 | You can also do things to change
01:53:22.700 | the conditions of your body.
01:53:24.960 | When you exercise every day,
01:53:26.720 | you're making an investment in your body.
01:53:29.780 | Actually, you're making an investment in your brain too.
01:53:32.140 | It makes you, even though it's unpleasant
01:53:34.720 | and there's a cost to it, if you replenish,
01:53:38.140 | if you invest, if you make a deposit
01:53:40.580 | and make up what you've spent,
01:53:45.220 | you're basically making an investment
01:53:47.140 | in making it easier for your brain
01:53:49.660 | to control your body in the future.
01:53:52.180 | So you can make sure you're hydrated, drink water.
01:53:56.400 | You don't have to drink bottled water.
01:53:57.820 | You can drink water from the tap.
01:53:59.380 | This is in most places, maybe not everywhere,
01:54:02.420 | but most places in the developed world.
01:54:07.060 | You can try to get enough sleep.
01:54:10.420 | Not everybody has that luxury,
01:54:11.980 | but everybody can do something to make their body budgets
01:54:16.980 | a little more solvent.
01:54:18.540 | And that will also make it more likely
01:54:22.340 | that certain thoughts will emerge
01:54:24.280 | from that prediction machine.
01:54:27.660 | - That's the control you do have,
01:54:29.420 | is being able to control the environment.
01:54:32.100 | That's really well put.
01:54:33.260 | I don't think we've talked about this,
01:54:37.380 | so let's go to the biggest unanswerable questions
01:54:39.680 | of consciousness.
01:54:41.600 | What is, you just rolled your eyes.
01:54:44.100 | - I did, that was my, yeah.
01:54:45.940 | - So what is consciousness from a neuroscience perspective?
01:54:49.220 | I know you, I mean.
01:54:50.400 | - I made notes, you know,
01:54:54.500 | 'cause you gave me some questions in advance
01:54:56.660 | and I made notes for every single.
01:54:58.300 | - Oh, except that one?
01:54:59.140 | - Yeah, well, that one I had, what the fuck?
01:55:01.380 | And then I took it out.
01:55:02.620 | - So is there something interesting,
01:55:06.140 | because you're so pragmatic,
01:55:07.760 | is there something interesting to say
01:55:09.060 | about intuition building about consciousness?
01:55:13.440 | Or is this something that we're just totally clueless about,
01:55:16.440 | that this is, let's focus on the body,
01:55:20.840 | the brain listens to the body,
01:55:22.340 | the body speaks to the brain,
01:55:24.720 | and let's just figure this piece out,
01:55:27.120 | and then consciousness will probably emerge somehow
01:55:29.400 | after that.
01:55:30.640 | - No, I think, you know, well, first of all,
01:55:33.360 | I'll just say up front,
01:55:35.320 | I am not a philosopher of consciousness,
01:55:37.840 | and I'm not a neuroscientist who focuses on consciousness.
01:55:40.560 | I mean, in some sense, I do study it
01:55:42.060 | because I study affect and mood,
01:55:44.680 | and that is the,
01:55:46.720 | you know, to use the phrase,
01:55:51.120 | that is the hard problem of consciousness.
01:55:54.280 | How is it that your brain is modeling your body?
01:55:58.160 | Your brain is modeling the sensory conditions of your body,
01:56:00.900 | and it's being updated,
01:56:04.040 | that model is being updated by the sense data
01:56:06.640 | that's coming from your body,
01:56:08.000 | and it's happening continuously your whole life,
01:56:10.400 | and you don't feel those sensations directly.
01:56:15.800 | What you feel is a general sense of pleasantness
01:56:19.360 | or unpleasantness, comfort, discomfort,
01:56:21.360 | feeling worked up, feeling calm.
01:56:22.580 | So we call that affect, you know,
01:56:24.800 | most people call it mood.
01:56:26.600 | So how is it that your brain gives you
01:56:29.080 | this very low-dimensional feeling of mood or affect
01:56:34.040 | when it's presumably receiving
01:56:36.080 | a very high-dimensional array of sense data,
01:56:39.440 | and the model that the brain is running of the body
01:56:42.840 | has to be high-dimensional
01:56:44.720 | because there's a lot going on in there, right?
01:56:48.000 | You're not aware, but as you're sitting there quietly,
01:56:50.180 | as your listeners, as our viewers are sitting--
01:56:54.800 | - They might be working out, running now,
01:56:56.720 | or as many of them write to me--
01:56:58.320 | - That's fair.
01:56:59.160 | - They're laying in bed,
01:56:59.980 | smoking weed with their eyes closed.
01:57:01.160 | - (laughs) That's fair.
01:57:02.240 | So maybe we should say that bit again then.
01:57:04.200 | (both laugh)
01:57:05.040 | So if, so some people may be working out,
01:57:07.800 | some people may be--
01:57:09.720 | - Relaxing.
01:57:10.560 | - Relaxing, but even if you're sitting very still
01:57:14.100 | while you're watching this or listening to this,
01:57:16.920 | there's a whole drama going on inside your body
01:57:19.600 | that you're largely unaware of,
01:57:21.400 | yet your brain makes you aware
01:57:26.160 | or gives you a status report in a sense
01:57:29.640 | by virtue of these mental features of feeling pleasant,
01:57:32.680 | feeling unpleasant, feeling comfortable,
01:57:34.440 | feeling uncomfortable, feeling energetic,
01:57:36.400 | feeling tired, and so on.
01:57:38.120 | And so how the hell is it doing that?
01:57:41.300 | That is the basic question of consciousness.
01:57:46.300 | - And like the status reports seem to be,
01:57:49.240 | in the way we experience them, seem to be quite simple.
01:57:52.280 | It doesn't feel like there's a lot of data.
01:57:56.600 | - Yeah, no, there isn't.
01:57:57.680 | So when you feel discomfort,
01:58:02.680 | when you're feeling basically like shit,
01:58:04.840 | you feel like shit, what does that tell you?
01:58:06.960 | Like what are you supposed to do next?
01:58:08.360 | What caused it?
01:58:09.320 | I mean, the thing is not one thing caused it, right?
01:58:12.160 | It's multiple factors probably influencing
01:58:15.560 | your physical state.
01:58:16.920 | Your body budget-- - You said it's
01:58:17.760 | very high dimensional, yeah.
01:58:18.600 | - It's very high dimensional.
01:58:20.200 | And that, and the,
01:58:25.480 | there are different temporal scales of influence, right?
01:58:27.960 | So the state of your gut is not just influenced
01:58:32.960 | by what you ate five minutes ago.
01:58:34.680 | It's also what you ate a day ago
01:58:36.680 | and two days ago and so on.
01:58:38.600 | So I think the,
01:58:45.260 | I'm not trying to weasel out of the question.
01:58:50.300 | I just think it's the hardest question, actually.
01:58:55.120 | - Do you think we'll ever understand it?
01:58:57.120 | As scientists.
01:59:03.280 | - I think that we will understand it
01:59:06.680 | as well as we understand other things
01:59:09.720 | like the birth of the universe
01:59:13.200 | or the nature of the universe, I guess I would say.
01:59:18.200 | So do I think we can get to that level of an explanation?
01:59:21.920 | I do, actually, but I think that we have to start asking
01:59:24.860 | somewhat different questions and approaching the science
01:59:28.080 | somewhat differently than we have in the past.
01:59:30.280 | - I mean, it's also possible that consciousness
01:59:32.040 | is much more difficult to understand
01:59:33.600 | than the nature of the universe.
01:59:35.320 | - It is, but I wasn't necessarily saying
01:59:37.680 | that it was a question that was of equivalent complexity.
01:59:40.800 | I was saying that I do think that we could get to some,
01:59:45.240 | I am optimistic that, I would not,
01:59:51.040 | I would be very willing to invest my time on this earth
01:59:56.020 | as a scientist in trying to answer that question
01:59:58.380 | if I could do it the way that I wanna do it,
02:00:01.920 | not in the way that it's currently being done.
02:00:04.380 | - So like rigorously?
02:00:05.600 | - I don't wanna say unrigorously.
02:00:07.980 | I just wanna say that there are a certain set of assumptions
02:00:10.340 | that, you know, scientists have what I would call
02:00:13.420 | ontological commitments.
02:00:14.720 | They're commitments about the way the world is
02:00:17.060 | or the way that nature is.
02:00:19.720 | And these commitments lead scientists, sometimes blindly.
02:00:24.720 | Sometimes scientists are aware
02:00:27.220 | of these commitments, but sometimes they're not.
02:00:29.380 | And these commitments nonetheless influence
02:00:31.580 | how scientists ask questions, what they measure,
02:00:35.500 | how they measure, and I just have very different views
02:00:40.100 | than a lot of my colleagues
02:00:41.660 | about the ways to approach this.
02:00:43.740 | Not everybody, but the way that I would approach it
02:00:47.020 | would be different and it would cost more
02:00:50.920 | and it would take longer.
02:00:53.160 | It doesn't fit very well
02:00:54.840 | into the current incentive structure of science.
02:00:56.920 | And so do I think that doing science
02:00:59.680 | the way science is currently done
02:01:01.120 | with the budget that it currently has
02:01:02.860 | and the incentive structure that it currently has,
02:01:04.840 | will we have an answer?
02:01:05.760 | No, I think absolutely not.
02:01:07.320 | Good luck is what I would say.
02:01:08.820 | - People love book recommendations.
02:01:13.040 | Let me ask what three books?
02:01:15.160 | - Well, you can't just give me three.
02:01:17.780 | I mean, like really, three?
02:01:19.180 | - Then what seven and a half books can you recommend?
02:01:23.100 | So you're also the author of "Seven and a Half
02:01:25.140 | Lessons About the Brain."
02:01:26.640 | You're the author of "How Emotions Are Made."
02:01:29.780 | Okay, so definitely those are the top two recommendations
02:01:33.340 | of all, the two greatest books of all time.
02:01:35.300 | Other than that, are there books that,
02:01:37.940 | technical, fiction, philosophical, that you've enjoyed
02:01:41.140 | and you might recommend to others?
02:01:42.780 | - Yes.
02:01:44.740 | Actually, every PhD student,
02:01:46.900 | when they graduate with their PhD,
02:01:50.420 | I give them a set, like a little library,
02:01:52.940 | like a set of books, some of which they've already read,
02:01:55.980 | some of which I want them to read.
02:01:57.820 | But I think nonfiction books, I would read,
02:02:04.060 | the things I would recommend are "The Triple Helix"
02:02:08.220 | by Richard Lewontin.
02:02:10.860 | It's a little book published in 2000,
02:02:14.640 | which is, I think, a really good introduction
02:02:18.100 | to complexity and population thinking,
02:02:23.100 | as opposed to essentialism.
02:02:25.720 | So this idea, essentialism is this idea that, you know,
02:02:28.780 | there's an essence to each person,
02:02:30.480 | whether it's a soul or your genes or what have you,
02:02:33.420 | as opposed to this idea that you,
02:02:35.980 | we have the kind of nature that requires nurture.
02:02:38.700 | We are, we are, you are the product of a complex dance
02:02:44.000 | between an environment,
02:02:47.060 | between a set of genes and an environment
02:02:49.460 | that turns those genes on and off
02:02:52.580 | to produce your brain and your body,
02:02:54.460 | and really who you are at any given moment.
02:02:57.180 | - It's a good title for that, "Triple Helix."
02:02:59.300 | So playing on the double helix,
02:03:00.500 | where it's just the biology, it's bigger than the biology.
02:03:03.960 | - Exactly.
02:03:05.400 | It's a wonderful book.
02:03:06.280 | I've read it probably six or seven times
02:03:08.300 | over the years.
02:03:09.140 | He has another book, too, which is,
02:03:11.780 | it's more, I think scientists would find it,
02:03:14.360 | I don't know, I've loved it.
02:03:15.500 | It's called "Biology as Ideology."
02:03:18.600 | And it really is all about,
02:03:20.860 | I wouldn't call it one of the best books of all time,
02:03:22.860 | but I love the book because it really does point out,
02:03:26.360 | you know, that science as it's currently practiced,
02:03:31.640 | I mean, the book was written in 1991,
02:03:33.160 | but it actually, I think, still holds,
02:03:34.920 | that science as it's currently practiced
02:03:36.980 | has a set of ontological commitments
02:03:38.920 | which are somewhat problematic.
02:03:41.600 | - So the assumptions are limiting.
02:03:43.540 | - Yeah, in ways that you,
02:03:44.820 | it's like you're a fish in water and you don't,
02:03:47.300 | like, okay, so, yeah, so here's--
02:03:49.060 | - David Foster Wallace and stuff.
02:03:50.500 | - Well, but, you know, but here's a really cool thing
02:03:52.700 | I just learned recently.
02:03:55.180 | Is it okay to go off on this tangent for a minute?
02:03:57.940 | - Yeah, yeah, let's go tangency, great.
02:04:00.580 | - I was just gonna say that I just learned recently
02:04:02.620 | that we don't have water receptors on our skin.
02:04:06.140 | So how do you know when you're sweating?
02:04:07.740 | How do you know when a raindrop,
02:04:10.620 | when it's gonna rain and, you know,
02:04:12.500 | like a raindrop hits your skin
02:04:13.780 | and you can feel that little drop of wetness?
02:04:16.620 | How is it that you feel that drop of wetness
02:04:18.500 | when we don't have water receptors in our skin?
02:04:22.340 | And I was, when I--
02:04:23.180 | - My mind's blown already.
02:04:24.900 | - Yeah, that was, I have my reaction too, right?
02:04:27.220 | I was like, of course we don't
02:04:29.260 | because we evolved in the water.
02:04:31.500 | Like, why would we need, you know, it just,
02:04:33.300 | it was just this, like, you know,
02:04:34.500 | you have these moments where you're like, oh,
02:04:36.420 | of course, let me just like go, yeah, so--
02:04:38.300 | - And you'll never see rain the same way again.
02:04:40.740 | - So the answer is it's a combination
02:04:44.260 | of temperature and touch.
02:04:47.780 | But it's a complex sense that's only computed in your brain.
02:04:52.580 | There's no receptor for it.
02:04:54.340 | Anyways.
02:04:55.380 | - Yeah, that's why, like, snow versus cold rain
02:04:58.500 | versus warm rain all feel different
02:05:00.580 | 'cause you're trying to infer stuff from the temperature
02:05:03.660 | and the size of the droplets, it's fascinating.
02:05:05.900 | - Yeah, your brain is a prediction machine.
02:05:07.660 | It's using lots and lots of information combining it.
02:05:11.220 | Anyways, so, "Biology as Ideology"
02:05:16.220 | is, I wouldn't say it's one of the greatest books
02:05:18.580 | of all time, but it is a really useful book.
02:05:22.580 | There's a book by, if you're interested in psychology
02:05:25.700 | or the mind at all, there's a wonderful book,
02:05:28.460 | a little, it's a fairly small book
02:05:33.020 | called "Naming the Mind" by Kurt Danziger
02:05:36.300 | who's a historian of psychology.
02:05:38.580 | Everybody in my lab reads both of these books.
02:05:42.500 | - So what's the book?
02:05:43.980 | - It's about the origin of the,
02:05:45.620 | where did we get the theory of mind that we have
02:05:49.620 | that the human mind is populated by thoughts and feelings
02:05:53.500 | and perceptions, and where did those categories come from?
02:05:57.860 | Because they don't exist in all cultures.
02:06:00.580 | - Oh, so this isn't, that's a cultural construct?
02:06:05.980 | - The idea that you have thoughts and feelings
02:06:08.060 | and they're very distinct is definitely a cultural construct.
02:06:11.300 | - It's another mind-blowing thing, just like the rain?
02:06:15.260 | - So Kurt Danziger is a, the opening chapter in that book
02:06:21.220 | is absolutely mind-blowing.
02:06:26.260 | I love it, I love it.
02:06:29.420 | I just think it's fantastic.
02:06:32.140 | And I would say that there are many, many
02:06:35.980 | popular science books that I could recommend
02:06:39.060 | that I think are extremely well-written in their own way.
02:06:42.940 | Before I, maybe I said this to you,
02:06:44.580 | but before I undertook writing "How Emotions Are Made,"
02:06:49.180 | I read, I don't know, somewhere on the order of 50 or 60
02:06:53.060 | popular science books to try to figure out
02:06:56.580 | how to write a popular science book
02:07:00.500 | because while there are many books about writing,
02:07:03.980 | Stephen King has a great book about--
02:07:05.980 | - On writing? - On writing.
02:07:07.380 | And where he gives tips interlaced
02:07:11.500 | with his own personal history.
02:07:13.380 | That was where I learned you write for a specific person.
02:07:17.500 | You have a specific person in mind.
02:07:19.340 | And that's, for me, that person is Dan.
02:07:22.740 | - That's fascinating.
02:07:23.580 | I mean, that's a whole 'nother conversation
02:07:24.780 | to have like which popular science books,
02:07:27.980 | like what you learned from that search.
02:07:31.100 | Because there's, I have, for me,
02:07:34.580 | some popular science books that I just roll my eyes,
02:07:37.020 | like this is too, it's the same with TED Talks.
02:07:41.780 | Like some of them go too much into the flowery
02:07:45.140 | and I would say don't give enough respect
02:07:48.860 | to the intelligence of the reader.
02:07:50.660 | But this is my own bias, very specific.
02:07:55.140 | - I completely agree with you.
02:07:56.660 | And in fact, I have a colleague,
02:07:59.220 | his name is Van Yang, who,
02:08:03.220 | he produced a cinematic lecture
02:08:08.020 | of how emotions are made
02:08:09.300 | that we wrote together with Joseph Fridman, no relation.
02:08:13.740 | - Well, we're all related.
02:08:16.060 | - Well, I mean, you and I are probably,
02:08:17.380 | you know, have some, yeah.
02:08:18.660 | - Yeah, I remember.
02:08:21.260 | The memories are in there somewhere.
02:08:22.780 | - Yeah, it's from many, many, many generations ago.
02:08:26.100 | Well, half my family is Russian, so from--
02:08:28.700 | - The good half.
02:08:29.540 | - The good half, right.
02:08:30.380 | (both laughing)
02:08:33.060 | But, you know, he, his goal actually is to produce
02:08:38.060 | you know, videos and lectures that are beautiful
02:08:48.740 | and educational and that don't dumb the material down.
02:08:54.140 | And he's really remarkable at it, actually.
02:08:57.300 | I mean, just, but again, you know,
02:09:00.220 | that requires a bit of a paradigm shift.
02:09:02.380 | We could have a whole conversation
02:09:03.540 | about the split between entertainment
02:09:06.460 | and education in this country
02:09:07.900 | and why it is the way it is,
02:09:09.140 | but that's another conversation.
02:09:11.620 | - To be continued.
02:09:12.460 | - But I would say, if I were to pick one book
02:09:15.420 | that I think is a really good example
02:09:18.620 | of good science writing,
02:09:20.220 | it would be "The Beak of the Finch,"
02:09:21.940 | which won a Pulitzer Prize a number of years ago.
02:09:26.940 | And I'm not, I'm not remembering the author's name.
02:09:31.980 | I'm blanking.
02:09:32.820 | - But the, I'm guessing, is it focusing on birds
02:09:37.820 | and the evolution of birds?
02:09:40.620 | - Actually, there's also "The Evolution of Beauty."
02:09:43.220 | - That's, yeah.
02:09:44.060 | - Yeah, which is also a great book.
02:09:45.980 | But no, "The Beak of the Finch" is,
02:09:49.220 | "The Beak of the Finch" is, it's a,
02:09:52.420 | it has two storylines that are interwoven.
02:09:56.820 | One is about Darwin and Darwin's explorations
02:10:01.820 | in the Galapagos Island.
02:10:04.180 | And then modern day researchers from Princeton
02:10:07.820 | who have a research program in the Galapagos
02:10:10.900 | looking at Darwin's finches.
02:10:13.060 | And it's just a really, first of all,
02:10:18.300 | there's top-notch science in there.
02:10:21.380 | And really science, like, you know,
02:10:24.820 | evolutionary biology that a lot of people don't know.
02:10:28.460 | And it's told really, really well.
02:10:30.860 | - It sounds like there also, there's a narrative in there.
02:10:34.180 | There's, it's like storytelling too.
02:10:35.820 | - Yeah, I think all good popular science books
02:10:38.700 | are storytelling for just, you know,
02:10:40.740 | but storytelling grounded, constrained by,
02:10:42.980 | you know, the evidence.
02:10:44.660 | And then I just want to say that there are,
02:10:47.220 | for fiction, I'm a really big fan of love stories,
02:10:51.300 | just to return us to the topic that we began with.
02:10:54.980 | And so my, some of my favorite love stories
02:10:59.620 | are "Major Pettigrew's Last Stand" by Helen Simonson.
02:11:04.240 | It's a love story about people
02:11:08.700 | who you wouldn't expect to fall in love
02:11:11.260 | and all the people around them
02:11:12.980 | who have to overcome their prejudices.
02:11:16.100 | And I love this book.
02:11:19.500 | - What do you like?
02:11:20.340 | Like what makes a good love story?
02:11:21.900 | - There isn't one thing, you know,
02:11:24.540 | there are many different things that make a good love story,
02:11:26.860 | but I think in this case,
02:11:28.500 | you can feel,
02:11:32.020 | you can feel the journey.
02:11:36.140 | You can feel the journey that these characters are on
02:11:39.100 | and all the people around them are on this journey too,
02:11:42.860 | basically to come to grips
02:11:44.460 | with this really unexpected love,
02:11:47.700 | really profound love that develops
02:11:50.260 | between these two characters
02:11:52.020 | who are very unlikely to have fallen in love, but they do.
02:11:55.660 | And it's just, it's very gentle.
02:11:59.940 | Another book like that is
02:12:01.580 | "The Storied Life of A.J. Fikry,"
02:12:07.000 | which is also a love story,
02:12:11.080 | but in this case, it's a love story
02:12:12.880 | between a little girl and her adopted dad.
02:12:17.880 | And the dad is this like real curmudgeony,
02:12:23.000 | you know, guy,
02:12:26.460 | but of course there's a story there.
02:12:28.100 | And it's just a beautiful love story.
02:12:31.740 | But it also, it's like everybody in this community
02:12:36.620 | falls in love with him because he falls in love with her.
02:12:40.940 | And he, you know, she just gets left at his store,
02:12:44.420 | his bookstore, he has this failing bookstore.
02:12:46.840 | And he discovers that, you know,
02:12:51.820 | he feels like inexplicably this need
02:12:54.440 | to take care of this little baby.
02:12:56.540 | And this whole life emerges out of that one decision,
02:13:00.380 | which is really beautiful, actually.
02:13:03.960 | It's very poignant.
02:13:06.660 | - Do you think the greatest stories have a happy ending
02:13:10.320 | or a heartbreak at the end?
02:13:14.060 | - That's such a Russian question.
02:13:15.460 | It's like Russian tragedies, you know?
02:13:18.140 | - So I would say the answer to that for me,
02:13:20.020 | there has to be heartbreak.
02:13:21.860 | - Yeah, I really don't like heartbreak.
02:13:24.180 | I don't like heartbreak.
02:13:25.620 | I want there to be a happy ending,
02:13:27.580 | or at least a hopeful ending.
02:13:32.320 | But, you know, like Dr. Zhivago,
02:13:35.300 | like, or the English patient.
02:13:37.940 | Oh my goodness, like why?
02:13:40.820 | Oh, it's just, yeah, no.
02:13:44.100 | - Well, I don't think there's a better way to end it
02:13:47.760 | on a happy note like this.
02:13:51.000 | Lisa, like I said, I'm a huge fan of yours.
02:13:53.000 | Thank you for wasting yet more time with me talking again.
02:13:57.720 | People should definitely get your book.
02:13:59.520 | And maybe one day I can't wait
02:14:02.040 | to talk to your husband as well.
02:14:03.520 | - Well, right back at you, Lexi.
02:14:07.720 | - Thanks for listening to this conversation
02:14:09.240 | with Lisa Feldman Barrett,
02:14:10.880 | and thank you to our sponsors.
02:14:12.760 | Athletic Greens, the all-in-one drink
02:14:15.400 | that I start every day with
02:14:16.760 | to cover all my nutritional bases.
02:14:19.280 | 8 Sleep, a mattress that cools itself
02:14:21.980 | and gives me yet another reason to enjoy sleep.
02:14:24.960 | Masterclass, online courses that I enjoy
02:14:28.240 | from some of the most amazing humans in history.
02:14:31.520 | And BetterHelp, online therapy with a licensed professional.
02:14:36.140 | Please check out these sponsors in the description
02:14:38.640 | to get a discount and to support this podcast.
02:14:42.040 | If you enjoy this thing, subscribe on YouTube,
02:14:44.220 | review it with five stars on Apple Podcasts,
02:14:46.600 | follow on Spotify, support on Patreon,
02:14:49.160 | or connect with me on Twitter @lexfridman.
02:14:52.440 | And now, let me leave you with some words
02:14:54.840 | from Sun Tzu in "The Art of War."
02:14:58.320 | There are not more than five musical notes,
02:15:01.000 | yet the combination of these five give rise
02:15:03.600 | to more melodies than can ever be heard.
02:15:06.680 | There are not more than five primary colors,
02:15:09.640 | yet in combination they produce more hues
02:15:12.920 | than can ever be seen.
02:15:15.140 | There are not more than five cardinal tastes,
02:15:18.840 | and yet combinations of them yield more flavors
02:15:22.920 | than can ever be tasted.
02:15:24.400 | Thank you for listening, and hope to see you next time.
02:15:28.700 | (upbeat music)
02:15:31.280 | (upbeat music)
02:15:33.860 | [BLANK_AUDIO]