
Charan Ranganath: Human Memory, Imagination, Deja Vu, and False Memories | Lex Fridman Podcast #430


Chapters

0:00 Introduction
1:03 Experiencing self vs remembering self
14:44 Creating memories
24:16 Why we forget
31:53 Training memory
42:22 Memory hacks
54:10 Imagination vs memory
63:29 Memory competitions
73:18 Science of memory
88:33 Discoveries
99:37 Deja vu
104:54 False memories
124:59 False confessions
128:45 Heartbreak
136:19 Nature of time
144:00 Brain–computer interface (BCI)
158:04 AI and memory
168:18 ADHD
175:15 Music
185:00 Human mind


00:00:00.000 | the act of remembering can change the memory.
00:00:02.920 | If you remember some event
00:00:05.240 | and then I tell you something about the event,
00:00:08.240 | later on when you remember the event,
00:00:11.020 | you might remember some original information from the event
00:00:14.360 | as well as some information about what I told you.
00:00:17.600 | And sometimes if you're not able to tell the difference,
00:00:21.200 | that information that I told you
00:00:22.640 | gets mixed into the story that you had originally.
00:00:25.760 | So now I give you some more misinformation
00:00:28.160 | or you're exposed to some more information,
00:00:29.800 | somewhere else, and eventually your memory
00:00:32.400 | becomes totally detached from what happened.
00:00:34.820 | - The following is a conversation with Charan Ranganath,
00:00:40.240 | a psychologist and neuroscientist at UC Davis,
00:00:43.560 | specializing in human memory.
00:00:46.680 | He's the author of "Why We Remember:
00:00:49.640 | Unlocking Memory's Power to Hold On to What Matters."
00:00:53.340 | This is the Lex Fridman Podcast.
00:00:55.600 | To support it, please check out our sponsors
00:00:57.680 | in the description.
00:00:59.080 | And now, dear friends, here's Charan Ranganath.
00:01:02.760 | Danny Kahneman describes the experiencing self
00:01:06.920 | and the remembering self.
00:01:09.600 | And that happiness and satisfaction you gain
00:01:12.840 | from the outcomes of your decisions
00:01:14.280 | does not come from what you've experienced,
00:01:16.640 | but rather from what you remember of the experience.
00:01:19.240 | So can you speak to this interesting difference
00:01:22.520 | that you write about in your book
00:01:23.920 | of the experiencing self and the remembering self?
00:01:27.000 | - Danny really impacted me
00:01:28.680 | 'cause I was an undergrad at Berkeley
00:01:30.280 | and I got to take a class from him
00:01:32.360 | long before he won the Nobel Prize or anything,
00:01:34.340 | and it was just a mind-blowing class.
00:01:36.920 | But this idea of the remembering self
00:01:39.320 | and the experiencing self,
00:01:40.800 | I got into it because it's so much about memory,
00:01:44.480 | even though he doesn't study memory.
00:01:46.200 | So we're right now having this experience, right?
00:01:49.040 | And people can watch it, presumably on YouTube,
00:01:52.040 | or listen to it on audio.
00:01:53.800 | But if you're talking to somebody else,
00:01:55.720 | you could probably describe this whole thing in 10 minutes,
00:01:58.640 | but that's going to miss a lot of what actually happened.
00:02:02.820 | And so the idea there is that the way we remember things
00:02:06.760 | is not the replay of the experience,
00:02:09.660 | it's something totally different.
00:02:11.400 | And it tends to be biased by the beginning and the end,
00:02:14.280 | and he talks about the peaks,
00:02:16.200 | but there's also the best parts, the worst parts, et cetera.
00:02:20.720 | And those are the things that we remember.
00:02:22.900 | And so when we make decisions,
00:02:25.360 | we usually consult memory,
00:02:27.840 | and we feel like our memory is a record
00:02:30.280 | of what we've experienced, but it's not.
00:02:32.040 | It's this kind of very biased sample,
00:02:35.640 | but it's biased in an interesting,
00:02:37.400 | and I think biologically relevant way.
00:02:39.640 | - So in the way we construct a narrative about our past,
00:02:43.120 | you say that it gives us an illusion of stability.
00:02:49.160 | Can you explain that?
00:02:50.880 | - Basically, I think that a lot of learning in the brain
00:02:55.320 | is driven towards being able to make sense.
00:02:58.320 | I mean, really, memory is all about
00:02:59.780 | the present and the future.
00:03:01.240 | The past is done, so biologically speaking,
00:03:04.440 | it's not important unless there's something
00:03:06.800 | from the past that's useful.
00:03:08.940 | And so what our brains are really optimized for
00:03:11.600 | is to learn about the stuff from the past
00:03:14.660 | that's going to be most useful in understanding the present
00:03:17.840 | and predicting the future, right?
00:03:19.960 | And so cause-effect relationships, for instance,
00:03:22.320 | that's a big one.
00:03:23.620 | Now, my future is completely unpredictable
00:03:26.320 | in the sense that you could, in the next 10 minutes,
00:03:29.480 | pull a knife on me and slit my throat, right?
00:03:31.440 | - I was planning on it.
00:03:32.280 | - Exactly, but having seen some of your work,
00:03:36.080 | I just generally, my expectations about life,
00:03:39.600 | I'm not expecting that.
00:03:41.160 | I have a certainty that everything's gonna be fine
00:03:43.160 | and we're gonna have a great time talking today, right?
00:03:45.880 | But we're often right.
00:03:47.360 | It's like, okay, so I go to see a band on stage.
00:03:51.080 | I know they're gonna make me wait,
00:03:53.040 | the show's gonna start late.
00:03:54.720 | And then, you know, they come on,
00:03:56.900 | there's a very good chance there's gonna be an encore.
00:03:59.520 | I have a memory, so to speak, for that event
00:04:01.820 | before I've even walked into the show, right?
00:04:04.080 | There's gonna be people holding up their camera phones,
00:04:06.580 | trying to take videos of it now,
00:04:08.160 | 'cause this is kind of the world we live in.
00:04:10.880 | So that's like everyday fortune telling that we do, though.
00:04:14.120 | It's not real, it's imagined.
00:04:17.200 | And it's amazing that we have this capability
00:04:19.200 | and that's what memory is about.
00:04:22.060 | But it can also give us this illusion
00:04:24.000 | that we know everything that's about to happen.
00:04:26.520 | And I think what's valuable about that illusion
00:04:30.640 | is when it's broken, it gives us the information, right?
00:04:34.480 | So, I mean, I'm sure being in AI,
00:04:36.800 | you know about information theory.
00:04:38.480 | And the idea is the information
00:04:40.080 | is what you didn't already have.
00:04:42.080 | And so those prediction errors that we make based on,
00:04:45.240 | you know, we make a prediction based on memory
00:04:46.840 | and the errors are where the action is.
00:04:49.240 | The error is where the learning happens.
00:04:52.980 | - Exactly, exactly.
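
To put the exchange about information and prediction error in its standard form: in information theory the information carried by an outcome is its surprisal, and simple error-driven learning rules update only on the mismatch between prediction and observation. These are textbook definitions, not anything specific to this conversation:

    I(x) = -\log_2 p(x)                                  % surprisal: the less expected the outcome, the more information it carries
    \Delta w = \alpha \, (x_{\text{observed}} - \hat{x}_{\text{predicted}})   % error-driven learning: zero prediction error, zero update
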
00:04:55.020 | - Well, just to linger on Danny Kahneman
00:04:58.380 | and just this whole idea of experiencing self
00:05:03.260 | versus remembering self, I was hoping you can give
00:05:07.820 | a simple answer of how we should live life.
00:05:10.100 | Based on the fact that our memories
00:05:15.960 | could be a source of happiness
00:05:17.780 | or could be the primary source of happiness,
00:05:20.460 | that an event, when experienced,
00:05:24.460 | bears its fruits the most when it's remembered
00:05:28.140 | over and over and over and over.
00:05:29.540 | And maybe there is some wisdom in the fact
00:05:32.420 | that we can control, to some degree, how we remember it,
00:05:36.260 | how we evolve our memory of it,
00:05:39.020 | such that it can maximize the long-term happiness
00:05:42.840 | of that repeated experience.
00:05:45.300 | - Okay, well, first I'll say,
00:05:46.580 | I wish I could take you on the road with me,
00:05:48.380 | 'cause that was such a great description.
00:05:50.420 | - Can I be your opening act?
00:05:52.820 | - Oh my God, no, I'm gonna open for you, dude.
00:05:56.020 | Otherwise, it's like, you know,
00:05:57.220 | everybody leaves after you're done.
00:05:58.960 | (both laughing)
00:06:01.100 | Believe me, I did that in Columbus, Ohio once.
00:06:03.960 | It wasn't fun.
00:06:04.880 | Like, the opening acts, like, drank our bar tab.
00:06:07.820 | We spent all this money going all the way there.
00:06:09.900 | There was only the, everybody left
00:06:11.860 | after the opening acts were done
00:06:13.260 | and there was just that stoner dude
00:06:15.020 | with the dreadlocks hanging out.
00:06:16.860 | And then next thing you know,
00:06:17.820 | we blew, like, our savings on getting a hotel room.
00:06:21.140 | (both laughing)
00:06:21.980 | - So we should, as a small tangent,
00:06:23.700 | you're a legit touring act.
00:06:26.580 | - When I was in grad school, I played in a band.
00:06:28.660 | And yeah, we traveled, we would play shows.
00:06:30.580 | It wasn't like, we were in a hardcore touring band,
00:06:33.400 | but we did some touring and had some fun times.
00:06:36.460 | And yeah, we did a movie soundtrack.
00:06:39.380 | - Nice.
00:06:40.220 | - "Henry Portrait of a Serial Killer."
00:06:41.540 | So that's a good movie.
00:06:42.860 | We were on the soundtrack for the sequel,
00:06:44.660 | "Henry II, Mask of Sanity," which is a terrible movie.
00:06:48.300 | - How's the soundtrack?
00:06:49.420 | It's pretty good.
00:06:50.260 | - It's badass.
00:06:51.080 | At least that one part where the guy
00:06:52.260 | throws up the milkshake, which is my song.
00:06:54.940 | - We're gonna have to see, we're gonna have to see it.
00:06:56.900 | - All right, we'll get you back to life advice.
00:06:59.100 | - And happiness, yeah.
00:07:00.940 | - One thing that I try to live by, especially nowadays,
00:07:03.820 | and since I wrote the book,
00:07:04.980 | I've been thinking more and more about this,
00:07:06.700 | is how do I want to live a memorable life?
00:07:11.300 | I think if we go back to the pandemic, right?
00:07:15.540 | How many people have memories from that period?
00:07:19.180 | Aside from the trauma of being locked up
00:07:22.200 | and seeing people die and all this stuff.
00:07:24.480 | I think it's like one of these things
00:07:27.740 | where we were stuck inside looking at screens all day,
00:07:31.980 | doing the same thing with the same people.
00:07:34.900 | And so I don't remember much from that
00:07:37.740 | in terms of those good memories
00:07:39.100 | that you're talking about, right?
00:07:40.940 | You know, when I was growing up,
00:07:41.940 | my parents worked really hard for us,
00:07:44.060 | and we went on some vacations, but not very often.
00:07:48.860 | And I really try to do now vacations to interesting places
00:07:52.580 | as much as possible with my family,
00:07:54.540 | because those are the things that you remember, right?
00:07:58.260 | So I really do think about what's going to be something
00:08:03.260 | that's memorable, and then just do it,
00:08:05.700 | even if it's a pain in the ass,
00:08:06.980 | because the experiencing self will suffer for that,
00:08:10.060 | but the remembering self will be like,
00:08:11.580 | yes, I'm so glad I did that.
00:08:13.540 | - Do things that are very unpleasant in the moment,
00:08:16.380 | because those can be reframed and enjoyed
00:08:19.060 | for many years to come.
00:08:20.900 | That's probably good advice,
00:08:24.060 | or at least when you're going through shit,
00:08:25.500 | it's a good way to see the silver lining of it.
00:08:29.340 | - Yeah, I mean, I think it's one of these things
00:08:31.320 | where if you have people who you've gone through,
00:08:35.580 | since you said it, I'll just say it,
00:08:36.660 | since you've gone through shit with someone,
00:08:38.560 | and it's like, that's a bonding experience often, you know?
00:08:43.100 | I mean, that can really bring you together.
00:08:45.360 | I like to say it's like, there's no point in suffering
00:08:47.580 | unless you get a story out of it.
00:08:49.100 | (laughs)
00:08:50.000 | So in the book, I talk about the power
00:08:53.200 | of the way we communicate with others
00:08:55.440 | and how that shapes our memories.
00:08:57.200 | And so I had this near-death experience,
00:08:59.600 | at least that's how I remember it, on this paddleboard,
00:09:02.480 | where just everything that could have gone wrong
00:09:04.500 | did go wrong, almost.
00:09:06.760 | So many mistakes were made and ended up at some point
00:09:11.760 | just basically away from my board,
00:09:15.920 | pinned in a current in this corner,
00:09:18.840 | not a super good swimmer.
00:09:21.000 | And my friend who came with me, Randy,
00:09:23.340 | who's a computational neuroscientist,
00:09:24.940 | and he had just been pushed down past me
00:09:27.720 | and so he couldn't even see me.
00:09:29.480 | And I'm just like, if I die here,
00:09:33.120 | I mean, no one's around, it's like you're just all alone.
00:09:37.120 | And so I just said, well, failure's not an option.
00:09:39.680 | And eventually I got out of it and froze and got cut up.
00:09:44.680 | And I mean, the things that we were going through
00:09:47.440 | were just insane.
00:09:48.440 | But a short version of this is my wife and my daughter
00:09:53.440 | and Randy's wife, they gave us all sorts of hell about this
00:09:58.200 | 'cause they were ready to send out a search party.
00:10:01.740 | So they were giving me hell about it.
00:10:04.440 | And then I started to tell people in my lab about this
00:10:07.400 | and then friends.
00:10:08.240 | And it just became a better and better story every time.
00:10:11.260 | And we actually had some photos of just the crazy things
00:10:14.200 | like this generator that was hanging over the water
00:10:17.080 | and we're like ducking under these metal gratings
00:10:20.040 | and I'm like going flat.
00:10:21.880 | And it was just nuts, but it became a great story.
00:10:26.360 | And it was definitely, I mean, Randy and I were already
00:10:28.040 | tight, but that was a real bonding experience for us.
00:10:30.560 | And yeah, I mean, and I learned from that,
00:10:33.480 | that it's like, I don't look back on that enough actually,
00:10:37.560 | because I think we often, at least for me,
00:10:41.800 | I don't necessarily have the confidence to think
00:10:44.000 | that things will work out,
00:10:45.520 | that I'll be able to get through certain things.
00:10:47.760 | But my ability to actually get something done in that moment
00:10:52.760 | is better than I give myself credit for, I think.
00:10:55.680 | And that was the lesson of that story
00:10:58.200 | that I really took away.
00:10:59.440 | - Well, actually just for me, you're making me realize now
00:11:03.000 | that it's not just those kinds of stories,
00:11:05.720 | but even things like periods of depression
00:11:10.040 | or really low points.
00:11:12.200 | To me at least, it feels like a motivating thing
00:11:15.960 | that the darker it gets, the better the story will be
00:11:19.760 | if you emerge on the other side.
00:11:21.640 | That to me feels like a motivating thing.
00:11:24.200 | So maybe if people are listening to this
00:11:26.040 | and they're going through some shit, as we said,
00:11:29.480 | one thing that could be a source of light
00:11:33.320 | is that it'll be a hell of a good story when it's all over,
00:11:36.520 | when you emerge on the other side.
00:11:38.280 | Let me ask you about decisions.
00:11:41.480 | You've already talked about it a little bit,
00:11:43.080 | but when we face the world
00:11:45.480 | and we're making different decisions,
00:11:47.320 | how much does our memory come into play?
00:11:52.200 | Is it the kind of narratives that we've constructed
00:11:55.280 | about the world that are used to make predictions
00:11:58.320 | that's fundamentally part of the decision-making?
00:12:01.040 | - Absolutely, yeah.
00:12:02.080 | So let's say after this,
00:12:03.520 | you and I decided we're gonna go for a beer, right?
00:12:05.960 | How do you choose where to go?
00:12:07.480 | You're probably gonna be like,
00:12:08.320 | "Oh yeah, this new bar opened up near me.
00:12:10.440 | "I had a great time there.
00:12:11.940 | "They had a great beer selection."
00:12:13.480 | Or you might say, "Oh, we went to this place
00:12:15.600 | "and it was totally crowded
00:12:16.960 | "and they were playing this horrible EDM or whatever."
00:12:19.720 | And so right there, valuable source of information, right?
00:12:24.800 | And then you have these things
00:12:26.240 | like where you do this counterfactual stuff like,
00:12:28.960 | "Well, I did this previously,
00:12:30.800 | "but what if I had gone somewhere else and said,
00:12:32.880 | "maybe I'll go to this other place
00:12:34.140 | "because I didn't try it the previous time."
00:12:36.420 | So there's all that kind of reasoning that goes into it too.
00:12:39.960 | I think even if you think about
00:12:42.600 | the big decisions in life, right?
00:12:44.600 | It's like you and I were talking
00:12:46.240 | before we started recording
00:12:47.800 | about how I got into memory research and you got into AI.
00:12:52.400 | And it's like we all have these personal reasons
00:12:55.520 | that guide us in these particular directions.
00:12:57.640 | And some of it's the environment and random factors in life,
00:13:01.440 | and some of it is memories of things that we wanna overcome
00:13:05.960 | or things that we build on in a positive way.
00:13:10.100 | But either way, they define us.
00:13:12.440 | - And probably the earlier in life the memories happen,
00:13:15.960 | the more defining power they have
00:13:19.280 | in terms of determining who we become.
00:13:21.700 | I mean, I do feel like adolescence is much more important
00:13:25.160 | than I think people give credit for.
00:13:27.240 | I think that there is this kind of a sense
00:13:28.840 | like the first three years of life
00:13:31.480 | is the most important part,
00:13:33.040 | but the teenage years are just so important for the brain.
00:13:38.040 | And so that's where a lot of mental illness
00:13:41.360 | starts to emerge.
00:13:43.120 | Now we're thinking of things like schizophrenia
00:13:45.000 | as a neurodevelopmental disorder
00:13:47.280 | because it just emerges during that period of adolescence
00:13:50.560 | and early adulthood.
00:13:52.080 | So, and I think the other part of it is,
00:13:54.760 | is that, I guess I was a little bit too firm
00:13:57.900 | in saying that memory determines who we are.
00:14:00.520 | It's really, the self is an evolving construct.
00:14:03.380 | I think we kind of underestimate that.
00:14:05.740 | And when you're a parent,
00:14:07.540 | you feel like every decision you make is consequential
00:14:11.940 | in forming this child and plays a role,
00:14:15.220 | but so do the child's peers.
00:14:17.260 | And so do, there's so much,
00:14:20.220 | I mean, that's why I think the big part of education,
00:14:22.660 | I think that's so important,
00:14:24.420 | is not the content you learn.
00:14:25.820 | I mean, think of how much dumb stuff
00:14:27.920 | we learned in school, right?
00:14:29.880 | But a lot of it is learning how to get along with people
00:14:34.200 | and learning who you are and how you function.
00:14:38.260 | And that can be terribly traumatizing
00:14:41.060 | even if you have perfect parents working on you.
00:14:45.240 | - Is there some insight into the human brain
00:14:48.720 | that explains why we don't seem to remember anything
00:14:52.040 | from the first few years of life?
00:14:53.860 | - Yeah, yeah.
00:14:55.000 | In fact, actually, I was just talking
00:14:56.540 | to my really good friend and colleague,
00:14:58.560 | Simona Getty, who studies the neuroscience
00:15:01.220 | of child development, and so we were talking about this.
00:15:04.060 | And so there are a bunch of reasons, I would say.
00:15:06.880 | So one reason is there's an area of the brain
00:15:10.600 | called the hippocampus, which is very, very important
00:15:13.200 | for remembering events or episodic memory.
00:15:16.160 | And so the first two years of life,
00:15:18.440 | there's a period called infantile amnesia.
00:15:21.400 | And then the next couple years of life after that,
00:15:24.460 | there's a period called childhood amnesia.
00:15:26.480 | And the difference is that basically in the lab
00:15:29.680 | and even during childhood and afterwards,
00:15:33.360 | children basically don't have any episodic memories
00:15:37.640 | for those first two years.
00:15:39.320 | The next two years, it's very fragmentary,
00:15:41.360 | and that's why they call it childhood amnesia.
00:15:43.120 | So there's some, but it's not much.
00:15:44.640 | So one reason is that the hippocampus
00:15:47.300 | is taking some time to develop,
00:15:49.240 | but another is the neocortex,
00:15:51.240 | so the whole folded stuff of gray matter
00:15:53.720 | all around the hippocampus,
00:15:55.340 | is developing so rapidly and changing,
00:15:59.560 | and a child's knowledge of the world
00:16:01.400 | is just massively being built up, right?
00:16:03.800 | So I'm gonna probably embarrass myself,
00:16:07.000 | but it's like if you showed like,
00:16:09.360 | you know, you trained a neural network
00:16:11.080 | and you give it like the first couple of patterns
00:16:12.800 | or something like that,
00:16:14.000 | and then you bombard it with another,
00:16:15.600 | like, you know, years worth of data,
00:16:17.880 | try to get back those first couple of patterns, right?
00:16:20.320 | It's like everything changes.
00:16:22.400 | And so the brain is so plastic,
00:16:25.120 | the cortex is so plastic during that time,
00:16:28.160 | and we think that memories for events
00:16:30.520 | are very distributed across the brain.
00:16:32.720 | So imagine you're trying to get back
00:16:34.680 | that pattern of activity that happened
00:16:36.520 | during this one moment,
00:16:38.120 | but the roads that you would take to get there
00:16:40.240 | have been completely rerouted, right?
00:16:42.660 | So I think that's my best explanation.
00:16:45.260 | The third explanation is a child's sense of self
00:16:47.840 | takes a while to develop.
00:16:49.860 | And so their experience of learning
00:16:53.060 | might be more learning what happened
00:16:55.200 | as opposed to having this first-person experience
00:16:57.960 | of, "Ah, I remember I was there."
00:17:00.480 | - Well, I think somebody once said to me
00:17:05.480 | that kind of loosely, philosophically,
00:17:09.440 | that the reason we don't remember
00:17:12.460 | the first few years of life, infantile amnesia,
00:17:16.180 | is because how traumatic it is.
00:17:18.540 | Basically, the error rate that you mentioned,
00:17:21.660 | when your brain's prediction doesn't match reality,
00:17:25.780 | the error rate in the first few years of life,
00:17:28.020 | your first few months, certainly,
00:17:29.800 | is probably crazy high.
00:17:31.380 | It's just nonstop freaking out.
00:17:34.500 | The collision between your model of the world
00:17:37.400 | and how the world works is just so high
00:17:39.700 | that you want whatever the trauma of that is
00:17:42.640 | not to linger around.
00:17:44.720 | I always thought that's an interesting idea
00:17:46.960 | because just imagine the insanity
00:17:50.320 | of what's happening in a human brain
00:17:52.020 | in the first couple years.
00:17:53.600 | Just, you don't know anything.
00:17:56.160 | And there's just this string of knowledge
00:17:58.080 | and we're somehow, given how plastic everything is,
00:18:00.920 | it just kind of molds and figures it out.
00:18:03.100 | But it's like an insane waterfall of information.
00:18:09.080 | - I wouldn't necessarily describe it as a trauma.
00:18:10.860 | We can get into this whole stages of life thing,
00:18:12.860 | which I just love.
00:18:14.760 | And basically, those first few years,
00:18:16.940 | there are, I mean, think about it,
00:18:18.780 | a kid's internal model of their body is changing.
00:18:23.100 | It's like just learning to move.
00:18:25.140 | I mean, if you ever have a baby,
00:18:27.900 | you'll know that the first three months,
00:18:29.740 | they're discovering their toes.
00:18:31.540 | It's just nuts.
00:18:33.140 | So everything is changing.
00:18:34.880 | But what's really fascinating is,
00:18:36.860 | and I think this is one of those,
00:18:38.500 | this is not at all me being a scientist,
00:18:40.540 | but it's like one of those things that people talk about
00:18:42.420 | when they talk about the positive aspects of children
00:18:47.420 | is that they're exceptionally curious
00:18:49.900 | and they have this kind of openness towards the world.
00:18:53.740 | And so that prediction error
00:18:55.580 | is not a negative traumatic thing, I think.
00:18:58.680 | It's like a very positive thing
00:19:00.860 | because it's what they use, they're seeking information.
00:19:04.300 | One of the areas that I'm very interested in
00:19:06.300 | is the prefrontal cortex.
00:19:07.820 | It's an area of the brain that,
00:19:09.860 | I mean, I could talk all day about it,
00:19:11.580 | but it helps us use our knowledge to say,
00:19:15.340 | "Hey, this is what I want to do now.
00:19:17.180 | "This is my goal.
00:19:18.540 | "So this is how I'm going to achieve it,"
00:19:20.820 | and focus everything towards that goal, right?
00:19:24.100 | The prefrontal cortex takes forever to develop in humans.
00:19:27.920 | The connections are still being tweaked and reformed
00:19:30.860 | like into late adolescence, early adulthood,
00:19:34.440 | which is when you tend to see mental illness pop up, right?
00:19:38.040 | So it's being massively reformed.
00:19:40.800 | Then you have about 10 years maybe
00:19:43.080 | of prime functioning of the prefrontal cortex,
00:19:45.680 | and then it starts going down again
00:19:47.480 | and you end up being older
00:19:48.840 | and you start losing all that frontal function.
00:19:51.500 | So I look at this and you'd say, okay,
00:19:53.740 | from you sit around episodic memory talks,
00:19:56.320 | we'll always say children are worse
00:19:57.560 | than adults at episodic memory.
00:19:59.320 | Older adults are worse than young adults at episodic memory.
00:20:01.880 | And I always would say, God, that's so weird.
00:20:04.920 | Why would we have this period of time
00:20:07.040 | that's so short when we're perfect, right?
00:20:09.600 | Or optimal.
00:20:10.440 | And I like to use that word optimal now
00:20:12.480 | because there's such a culture of optimization right now.
00:20:15.720 | And it's like, I realized I have to redefine what optimal is
00:20:19.640 | because for most of the human condition,
00:20:23.660 | I think we had a series of stages of life
00:20:27.800 | where you have basically adults saying, okay,
00:20:31.280 | young adults saying, I've got a child
00:20:34.120 | and I'm part of this village
00:20:35.800 | and I have to hunt and forage and get things done.
00:20:38.560 | I need a prefrontal cortex so I can stay focused
00:20:40.940 | on the big picture and long haul goals.
00:20:43.780 | Now I'm a child, I'm in this village,
00:20:47.120 | I'm kind of wandering around and I've got some safety
00:20:50.840 | and I need to learn about this culture
00:20:52.960 | because I know so little.
00:20:54.600 | What's the best way to do that?
00:20:55.620 | Let's explore.
00:20:56.560 | I don't wanna be constrained by goals as much.
00:20:59.000 | I wanna really be free, play and explore and learn.
00:21:03.080 | So you don't want a super tight prefrontal cortex.
00:21:05.480 | You don't even know what the goals should be yet, right?
00:21:07.960 | It's like, if you're trying to design a model
00:21:10.600 | that's based on a bad goal, it's not gonna work well, right?
00:21:15.000 | So then you go late in life and you say,
00:21:17.480 | oh, why don't you have a great prefrontal cortex then?
00:21:20.680 | But I think, I mean, if you go back and you think
00:21:23.400 | how many species actually stick around naturally
00:21:27.200 | long after their childbearing years are over,
00:21:29.800 | after their reproductive years are over?
00:21:31.320 | Like menopause, from what I understand,
00:21:33.320 | menopause is not all that common in the animal world, right?
00:21:36.720 | So why would that happen?
00:21:38.760 | And so I saw Alison Gopnik said something about this
00:21:43.120 | so I started to look into this, about this idea
00:21:45.680 | that really when you're older in most societies,
00:21:49.760 | your job is no longer to form new episodic memories.
00:21:53.640 | It's to pass on the memories that you already have,
00:21:56.200 | this knowledge about the world,
00:21:57.400 | or what we call semantic memory,
00:21:59.200 | to pass on that semantic memory to the younger generations,
00:22:02.520 | to pass on the culture.
00:22:03.960 | You know, even now in indigenous cultures,
00:22:06.160 | that's the role of the elders.
00:22:07.400 | They're respected, they're not seen as, you know,
00:22:10.240 | people who are past it and losing it.
00:22:12.920 | And I thought that was a very poignant thing
00:22:15.680 | that memory is doing what it's supposed to
00:22:19.160 | throughout these stages of life.
00:22:21.240 | - So it is always optimal in a sense.
00:22:23.840 | - Yeah.
00:22:24.680 | - It's just optimal for that stage of life.
00:22:26.400 | - Yeah, and for the ecology of the system.
00:22:28.880 | So you've got, so I looked into this
00:22:30.600 | and it's like another species that has menopause is orcas.
00:22:34.200 | Orca pods are led by the grandmothers, right?
00:22:37.000 | So not the young adults, not the parents or whatever,
00:22:39.720 | the grandmothers.
00:22:41.080 | And so they're the ones that pass on the traditions
00:22:44.200 | to the, I guess, the younger generation of orcas.
00:22:47.480 | And if you look from what little I understand,
00:22:50.360 | different orca pods have different traditions.
00:22:53.640 | They hunt for different things,
00:22:55.040 | they have different play traditions.
00:22:56.880 | And that's a culture, right?
00:22:59.400 | And so in social animals,
00:23:02.720 | evolution, I think, is designing brains
00:23:06.040 | that are really around, you know,
00:23:07.880 | it's obviously optimized for the individual,
00:23:11.380 | but also for kin.
00:23:13.160 | And I think that the kin are part of this,
00:23:15.880 | like when they're a part of this intense social group,
00:23:18.800 | the brain development should parallel
00:23:21.520 | the nature of the ecology.
00:23:22.880 | - Well, it's just fascinating to think
00:23:25.200 | of the individual orca or human
00:23:28.880 | throughout its life, in stages,
00:23:31.080 | doing a kind of optimal wisdom development.
00:23:35.840 | So in the early days, you don't even know what the goal is,
00:23:38.720 | and you figure out the goal,
00:23:39.680 | and you kind of optimize for that goal,
00:23:41.280 | and you pursue that goal,
00:23:42.200 | and then all the wisdom you collect through that,
00:23:44.520 | then you share with the others in the system,
00:23:47.000 | with the other individuals.
00:23:48.240 | And as a collective, then you kind of converge
00:23:50.960 | towards greater wisdom throughout the generation.
00:23:55.880 | So in that sense, it's optimal.
00:23:58.560 | Us humans and orcas got something going on.
00:24:01.240 | It works. - Oh, yeah.
00:24:02.600 | Apex predators.
00:24:03.820 | - I just got a Megalodon tooth,
00:24:07.600 | speaking of apex predators. - Oh, man.
00:24:10.240 | - It's, just imagine the size of that thing.
00:24:14.520 | Anyway, how does the brain forget,
00:24:19.520 | and how and why does it remember?
00:24:22.080 | So maybe some of the mechanisms.
00:24:24.520 | You mentioned the hippocampus.
00:24:25.560 | What are the different components involved here?
00:24:28.520 | - So we can think about this on a number of levels.
00:24:30.440 | Maybe I'll give you the simplest version first,
00:24:32.580 | which is, we tend to think of memories
00:24:34.480 | as these individual things, and we can just access them,
00:24:37.580 | maybe a little bit like photos on your phone
00:24:40.000 | or something like that.
00:24:41.360 | But in the brain, the way it works
00:24:43.360 | is you have this distributed pool of neurons,
00:24:46.120 | and the memories are kind of shared
00:24:49.320 | across different pools of neurons.
00:24:51.060 | And so what you have is competition,
00:24:53.720 | where sometimes memories that overlap
00:24:56.020 | can be fighting against each other, right?
00:24:58.480 | So sometimes we forget
00:25:00.320 | because that competition just wipes things out.
00:25:03.800 | Sometimes we forget
00:25:04.900 | because there aren't the biological signals
00:25:07.160 | which we can get into that would promote long-term retention.
00:25:11.040 | And lots of times we forget
00:25:12.560 | because we can't find the cue
00:25:15.200 | that sends us back to the right memory.
00:25:17.640 | And we need the right cue to be able to activate it, right?
00:25:20.360 | So for instance, in a neural network,
00:25:23.600 | there is no, you wouldn't go and you'd say,
00:25:25.680 | "This is the memory," right?
00:25:27.200 | It's like the whole ecosystem of memories
00:25:31.340 | is in the weights of the neural network.
00:25:33.080 | And in fact, you could extract entirely new memories
00:25:35.520 | depending on how you feed it.
00:25:37.200 | - You have to have the right query, the right prompt
00:25:39.560 | to access that, whatever the part you're looking for.
00:25:42.280 | - That's exactly right, that's exactly right.
00:25:44.320 | And in humans, you have this more complex set
00:25:46.560 | of ways memory works.
00:25:48.000 | There's, as I said, the knowledge
00:25:49.800 | or what you call semantic memory.
00:25:51.240 | And then there's these memories for specific events,
00:25:53.960 | which we call episodic memory.
00:25:55.480 | And so there's different pieces of the puzzle
00:25:58.680 | that require different kinds of cues.
00:26:01.280 | So that's a big part of it too,
00:26:03.120 | is just this kind of what we call retrieval failure.
00:26:06.180 | - You mentioned episodic memory,
00:26:07.440 | you mentioned semantic memory.
00:26:08.840 | What are the different separations here?
00:26:10.640 | What's working memory, short-term memory, long-term memory?
00:26:14.680 | What are the interesting categories of memory?
00:26:17.520 | - Yeah, and so memory researchers,
00:26:19.680 | we love to cut things up and say,
00:26:22.520 | is memory one thing or is it two things?
00:26:24.560 | Is it two things or is it three things?
00:26:25.880 | And so one of the things that, there's value in that,
00:26:29.820 | and especially experimental value
00:26:32.040 | in terms of being able to dissect things.
00:26:34.240 | In the real world, it's all connected.
00:26:36.540 | To speak to your question, working memory,
00:26:38.760 | it was a term that was coined by Alan Baddeley.
00:26:41.040 | It's basically thought to be this ability
00:26:43.560 | to keep information online in your mind
00:26:46.680 | right in front of you at a given time
00:26:48.560 | and to be able to control the flow of that information,
00:26:51.220 | to choose what information is relevant,
00:26:53.520 | to be able to manipulate it and so forth.
00:26:56.200 | And one of the things that Alan did
00:26:58.120 | that was quite brilliant was he said,
00:27:00.560 | there's this ability to kind of passively store information,
00:27:03.840 | see things in your mind's eye or hear your internal monologue
00:27:07.600 | but we have that ability to keep information in mind.
00:27:12.160 | But then we also have this separate,
00:27:13.920 | what he called a central executive,
00:27:17.120 | which is identified a lot with the prefrontal cortex.
00:27:19.640 | It's this ability to control the flow of information
00:27:24.180 | that's being kept active based on what it is you're doing.
00:27:27.680 | Now, a lot of my early work was basically saying
00:27:30.280 | that this working memory,
00:27:31.440 | which some memory researchers would call short-term memory,
00:27:34.680 | is not at all independent from long-term memory.
00:27:38.120 | That is that a lot of executive function requires learning
00:27:42.120 | and you have to have synaptic change for that to happen.
00:27:45.180 | But there's also transient forms of memory.
00:27:48.360 | So one of the things I've been getting into lately
00:27:51.600 | is the idea that we form internal models of events.
00:27:56.600 | The obvious one that I always use is birthday parties.
00:27:59.640 | So you go to a child's birthday party,
00:28:01.760 | once the cake comes out and they start,
00:28:04.240 | you just see a candle,
00:28:05.720 | you can predict the whole frame,
00:28:08.120 | set of events that happens later.
00:28:10.160 | And up till that point where the child blows out the candle,
00:28:12.720 | you have an internal model in your head of what's going on.
00:28:16.240 | And so if you follow people's eyes,
00:28:18.300 | it's not actually on what's happening.
00:28:19.820 | It's going where the action's about to happen,
00:28:22.160 | which is just fascinating, right?
00:28:24.200 | So you have this internal model
00:28:25.480 | and that's a kind of a working memory product.
00:28:28.660 | It's something that you're keeping online
00:28:30.820 | that's allowing you to interpret this world around you.
00:28:33.720 | Now to build that model though,
00:28:35.440 | you need to pull out stuff
00:28:36.720 | from your general knowledge of the world,
00:28:39.480 | which is what we call semantic memory.
00:28:41.580 | And then you'd want to be able to pull out memories
00:28:44.260 | for specific events that happened in the past,
00:28:46.400 | which we call episodic memory.
00:28:48.400 | So in a way they're all connected,
00:28:51.880 | even though it's different.
00:28:54.520 | The things that we're focusing on
00:28:56.100 | and the way we organize information in the present,
00:28:58.440 | which is working memory,
00:28:59.800 | will play a big role in determining
00:29:01.600 | how we remember that information later,
00:29:03.520 | which people typically call long-term memory.
00:29:05.400 | - So if you have something like a birthday party
00:29:07.780 | and you've been to many before,
00:29:09.880 | you're gonna load that from disk into working memory,
00:29:13.540 | this model, and then you're mostly operating on the model.
00:29:16.660 | And if it's a new task, you don't have a model,
00:29:21.660 | so you're more in the data collection?
00:29:24.560 | - Yes, one of the fascinating things
00:29:26.320 | that we've been studying,
00:29:27.280 | and we're not at all the first to do this.
00:29:30.160 | Jeff Sachs was a big pioneer in this,
00:29:33.080 | and I've been working with many other people,
00:29:34.960 | Ken Norman, Leila Davaci at Columbia
00:29:38.240 | has done some interesting stuff with this,
00:29:40.440 | is this idea that we form these internal models
00:29:44.520 | at particular points of high prediction error,
00:29:47.180 | or points of, I believe also points of uncertainty,
00:29:50.560 | points of surprise or motivationally significant periods.
00:29:54.080 | And those points are when it's maximally optimal
00:29:58.340 | to encode an episodic memory.
00:30:00.480 | So I used to think,
00:30:01.440 | oh, well, we're just encoding episodic memories constantly,
00:30:04.480 | boom, boom, boom, boom, boom.
00:30:06.160 | But think about how much redundancy there is
00:30:08.300 | in all that, right?
00:30:09.680 | It's just a lot of information that you don't need.
00:30:13.400 | But if you capture an episodic memory
00:30:15.840 | at the point of maximum uncertainty
00:30:19.240 | for the singular experience, right?
00:30:20.980 | You're just, it's only gonna happen once.
00:30:23.280 | But if you capture it at the point of maximum uncertainty
00:30:25.600 | or maximum surprise,
00:30:27.480 | you have the most useful point in your experience
00:30:30.440 | that you've grabbed.
00:30:31.720 | And what we see is that the hippocampus
00:30:34.160 | and these other networks that are involved
00:30:37.640 | in generating these internal models of events,
00:30:40.520 | they show a heightened period of connectivity
00:30:43.680 | or correlated activity during those breaks
00:30:46.720 | between different events, which we call event boundaries.
00:30:49.500 | These are the points where you're like surprised
00:30:51.280 | or you cross from one room to another and so forth.
00:30:54.400 | And that communication is associated
00:30:56.360 | with a bump of activity in the hippocampus
00:30:58.300 | and better memory.
00:30:59.360 | And so if people have a very good internal model
00:31:04.360 | throughout that event,
00:31:06.560 | you don't need to do much memory processing,
00:31:09.200 | you're in a predictive mode, right?
00:31:10.720 | And so then at these event boundaries, you encode
00:31:13.340 | and then you retrieve and you're like,
00:31:14.560 | okay, wait a minute, what's going on here?
00:31:17.400 | Ranganath's now talking about orcas, what's going on?
00:31:19.960 | And maybe you have to go back and remember reading my book
00:31:22.240 | to pull out the episodic memory to make sense
00:31:24.300 | of whatever it is I'm babbling about, right?
00:31:26.600 | And so there's this beautiful dynamics
00:31:29.140 | that you can see in the brain of these different networks
00:31:33.080 | that are coming together and then de-affiliating
00:31:35.760 | at different points in time that are allowing you
00:31:38.320 | to go into these modes.
00:31:39.480 | And so to speak to your original question,
00:31:42.720 | to some extent when we're talking about semantic memory
00:31:45.280 | and episodic memory and working memory,
00:31:47.400 | you can think about it as these processes
00:31:49.600 | that are unfolding as these networks kind of come together
00:31:52.200 | and pull apart.
00:31:53.820 | - Can memory be trained and improved?
00:31:57.080 | This beautiful connected system that you've described,
00:32:01.120 | what aspect of it is a mechanism
00:32:04.740 | that can be improved through training?
00:32:06.940 | - I think improvement, it depends on what your definition
00:32:10.120 | of optimal is.
00:32:11.000 | So what I say in the book is that you don't wanna remember
00:32:14.880 | more, you wanna remember better, which means focusing
00:32:18.400 | on the things that are important.
00:32:20.240 | And that's what our brains are designed to do.
00:32:22.000 | So if you go back to the earliest quantitative studies
00:32:25.460 | of memory by Ebbinghaus, what you see is that he was trying
00:32:29.600 | so hard to memorize this arbitrary nonsense,
00:32:32.800 | and within a day he lost about 60% of that information.
00:32:37.160 | And he was basically using a very, very generous way
00:32:40.080 | of measuring it, right?
00:32:41.400 | So as far as we know, nobody has managed to violate
00:32:46.100 | those basics of having people forget
00:32:48.640 | most of their experiences.
00:32:50.220 | So if your expectation is that you should remember
00:32:52.240 | everything and that's what your optimal is,
00:32:54.560 | you're already off, because this is not what human brains
00:32:57.240 | are designed to do.
00:32:58.680 | On the other hand, what we see over and over again
00:33:01.680 | is that the brain does, basically, one of the cool things
00:33:05.260 | about the design of the brain is it's always less is more,
00:33:08.340 | less is more, right?
00:33:09.400 | It's like, I mean, I've seen estimates that the human brain
00:33:12.500 | uses something like 12 to 20 watts, you know, in a day.
00:33:15.560 | I mean, that's just nuts, the low power consumption, right?
00:33:18.960 | So it's all about reusing information and making the most
00:33:22.820 | of what we already have.
00:33:24.080 | And so that's why, basically, again,
00:33:28.040 | what you see biologically is, you know, neuromodulators,
00:33:31.900 | for instance, these chemicals in the brain,
00:33:33.640 | like norepinephrine, dopamine, serotonin.
00:33:37.440 | These are chemicals that are released during moments
00:33:40.660 | that tend to be biologically significant,
00:33:42.960 | surprise, fear, stress, et cetera.
00:33:46.560 | And so these chemicals promote lasting plasticity, right?
00:33:51.560 | Essentially, some mechanisms by which the brain can say,
00:33:54.240 | prioritize the information that you carry with you
00:33:57.000 | into the future.
00:33:58.280 | Attention is a big factor as well,
00:34:00.520 | our ability to focus our attention on what's important.
00:34:03.280 | And so there's different schools of thought
00:34:08.600 | on training attention, for instance.
00:34:10.640 | So one of my colleagues, Amishi Jha,
00:34:13.760 | she wrote a book called "Peak Mind"
00:34:15.360 | and talks about mindfulness as a method
00:34:17.560 | for improving attention and focus.
00:34:21.080 | So she works a lot with military, like Navy SEALs and stuff
00:34:23.960 | to do this kind of work with mindfulness meditation.
00:34:30.760 | Adam Gazzaley, another one of my friends and colleagues,
00:34:30.760 | has work on kind of training through video games, actually,
00:34:34.460 | as a way of training attention.
00:34:36.160 | And so it's not clear to me, you know,
00:34:39.720 | one of the challenges, though, in training
00:34:41.500 | is you tend to overfit to the thing
00:34:44.920 | that you're trying to optimize, right?
00:34:46.560 | So you tend to, if I'm looking at a video game,
00:34:50.600 | I can definitely get better at paying attention
00:34:52.840 | in the context of the video game,
00:34:54.020 | but you transfer it to the outside world.
00:34:56.420 | That's very controversial.
00:34:58.080 | - The implication there is that attention
00:35:00.760 | is a fundamental component of remembering something,
00:35:03.820 | allocating attention to it,
00:35:05.360 | and then attention might be something that you could train,
00:35:09.080 | how you allocate attention
00:35:11.040 | and how you hold attention on a thing.
00:35:13.600 | - I can say that, in fact, we do in certain ways, right?
00:35:16.720 | So if you are an expert in something,
00:35:20.220 | you are training attention.
00:35:21.840 | So we did this one study of expertise in the brain.
00:35:25.520 | And so people used to think,
00:35:27.640 | let's say if you're a bird expert or something, right,
00:35:29.520 | people will go, like,
00:35:30.680 | if you get really into this world of birds,
00:35:33.400 | you start to see the differences
00:35:35.220 | in your visual cortex is tuned up,
00:35:37.140 | and it's all about plasticity of the visual cortex.
00:35:39.400 | And vision researchers love to say everything's visual,
00:35:42.600 | but it's like, we did this study of attention
00:35:45.400 | and working memory and expertise.
00:35:48.740 | And one of the things that surprised us
00:35:50.200 | were the biggest effects as people became experts
00:35:52.920 | in identifying these different kinds
00:35:55.160 | of just crazy objects that we made up.
00:35:57.880 | As they developed this expertise
00:35:59.440 | of being able to identify
00:36:00.480 | what made them different from each other
00:36:02.200 | and what made them unique,
00:36:03.460 | we were actually seeing massive increases in activity
00:36:05.960 | in the prefrontal cortex.
00:36:07.720 | And this fits with some of the studies of chess experts
00:36:09.960 | and so forth that it's not so much
00:36:12.120 | that you learn the patterns passively,
00:36:14.980 | you learn what to look for.
00:36:17.000 | You learn what's important, what's not, right?
00:36:19.200 | And you can see this in any kind
00:36:21.080 | of expert professional athlete.
00:36:23.520 | They're looking three steps ahead
00:36:25.760 | of where they're supposed to be.
00:36:27.280 | So that's a kind of a training of attention.
00:36:29.920 | And those are also what you'd call expert memory skills.
00:36:32.920 | So if you take the memory athletes,
00:36:35.440 | I know that's something we're both interested in.
00:36:37.840 | So these are people who train in these competitions
00:36:40.540 | and they'll memorize a deck of cards
00:36:42.620 | in a really short amount of time.
00:36:45.220 | There's a great memory athlete,
00:36:48.280 | her name I think is pronounced Yanjaa Wintersoul,
00:36:52.300 | but she, so I think she's got like a giant Instagram
00:36:55.400 | following and so she had this YouTube video that went viral
00:36:58.800 | where she had memorized an entire Ikea catalog, right?
00:37:02.260 | And so how do people do this?
00:37:05.120 | By all accounts from people who become memory athletes,
00:37:08.680 | they weren't born with some extraordinary memory,
00:37:11.540 | but they practice strategies over and over and over again.
00:37:15.200 | The strategy that they use for memorizing
00:37:17.240 | a particular thing, it can become automatic
00:37:20.120 | and you can just deploy it in an instant, right?
00:37:22.960 | So again, it's not necessarily gonna,
00:37:24.960 | one strategy for learning the order of a deck of cards
00:37:28.400 | might not help you for something else that you need,
00:37:30.820 | like remembering your way around Austin, Texas,
00:37:34.020 | but it's gonna be these, whatever you're interested in,
00:37:37.880 | you can optimize for that.
00:37:39.400 | And that's just a natural by-product of expertise.
00:37:43.000 | - There's certain hacks.
00:37:44.440 | There's something called the memory palace
00:37:46.000 | that I played with, I don't know if you're familiar
00:37:47.840 | with that whole technique, and it works.
00:37:51.200 | It's interesting.
00:37:52.360 | So another thing I recommend for people a lot
00:37:55.080 | is I use Anki a lot every day.
00:37:58.240 | It's an app that does spaced repetition.
00:38:01.640 | So I think medical students and students use this a lot
00:38:04.120 | to remember a lot of different things.
00:38:05.600 | - Oh yeah, okay, we can come back to this, but yeah, go on.
00:38:07.800 | - Sure, it's the whole concept of spaced repetition.
00:38:09.920 | You just, when the thing is fresh,
00:38:13.340 | you kind of have to remind yourself of it a lot
00:38:15.880 | and then over time, you can wait a week, a month, a year
00:38:20.880 | before you have to recall the thing again.
00:38:23.840 | And that way, you essentially have something
00:38:26.120 | like note cards that you can have tens of thousands of
00:38:29.720 | and can only spend 30 minutes a day
00:38:31.440 | and actually be refreshing all of that information,
00:38:34.720 | all of that knowledge.
00:38:35.800 | It's really great.
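
A minimal sketch of the expanding-interval idea behind spaced repetition described here. Anki's actual scheduler (historically based on SM-2) is more elaborate; the starting interval and ease multiplier below are illustrative assumptions, not Anki's real parameters:

    from datetime import date, timedelta

    def next_interval(interval_days: float, remembered: bool, ease: float = 2.5) -> float:
        # Expanding-interval scheduling: each successful recall stretches the gap,
        # a failed recall resets the card to a short interval.
        if not remembered:
            return 1.0
        return interval_days * ease

    # Example: one card reviewed successfully six times in a row.
    interval = 1.0
    due = date.today()
    for review in range(1, 7):
        due += timedelta(days=round(interval))
        print(f"review {review}: due {due} (interval ~{interval:.0f} days)")
        interval = next_interval(interval, remembered=True)

Successful reviews push the card out from about a day to weeks and then months, which is what lets a 30-minute daily session keep tens of thousands of cards refreshed.
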
00:38:36.640 | And then the memory palace is a technique
00:38:40.040 | that allows you to remember things like the Ikea catalog
00:38:43.400 | or by placing them visually in a place
00:38:47.080 | that you're really familiar with.
00:38:48.320 | Like I'm really familiar with this place,
00:38:49.920 | so I can put numbers or facts
00:38:54.560 | or whatever you wanna remember.
00:38:55.680 | You can walk along that little palace and it reminds you.
00:38:58.840 | It's cool.
00:38:59.680 | Like there's stuff like that that I think athletes,
00:39:03.600 | memory athletes could use,
00:39:05.080 | but I think also regular people can use.
00:39:07.120 | One of the things I have to solve for myself
00:39:09.440 | is how to remember names.
00:39:10.920 | I'm horrible at it.
00:39:13.140 | I think it's because when people introduce themselves,
00:39:15.800 | I have the social anxiety of the interaction
00:39:21.400 | where I'm like, I know I should be remembering that,
00:39:24.300 | but I'm freaking out internally
00:39:26.480 | about social interaction in general.
00:39:30.200 | And so therefore, I forget immediately.
00:39:32.440 | So I'm looking for good tricks for that.
00:39:34.520 | - So I feel like we've got a lot in common
00:39:39.040 | because when people introduce themselves to me,
00:39:41.760 | it's almost like I have this just blank blackout
00:39:45.860 | for a moment and then I'm just looking at them like,
00:39:48.100 | what happened?
00:39:49.060 | I look away or something, what's wrong with me?
00:39:51.820 | So, I mean, I'm totally with you on this.
00:39:54.060 | The reason why it's hard is that there's no reason
00:39:57.280 | we should be able to remember names
00:39:59.940 | because when you say remembering a name,
00:40:01.700 | you're not really remembering a name.
00:40:03.100 | Maybe in my case you are,
00:40:04.100 | but most of the time you're associating a name
00:40:06.260 | with a face and an identity.
00:40:09.140 | And that's a completely arbitrary thing, right?
00:40:11.920 | I mean, maybe in the olden days, somebody named Miller,
00:40:14.560 | it's like they're actually making flour
00:40:16.040 | or something like that, but for the most part,
00:40:19.360 | it's like these names are just utterly arbitrary.
00:40:22.400 | So you have no thing to latch onto.
00:40:25.280 | And so it's not really a thing that our brain does very well
00:40:28.720 | is to learn meaningless, arbitrary stuff.
00:40:31.160 | So what you need to do is build connections somehow,
00:40:35.360 | visualize a connection.
00:40:36.800 | And sometimes it's obvious or sometimes it's not.
00:40:40.880 | I'm trying to think of a good one for you now,
00:40:43.000 | but the first thing I think of is Lex Luthor, but-
00:40:45.240 | - That's great.
00:40:46.080 | - Yeah, so you think of Lex Luthor.
00:40:46.920 | - Doesn't Lex Luthor wear a suit, I think?
00:40:50.160 | - I know he has a shaved head though, or he's bald,
00:40:53.460 | which you're not, you've got a great head
00:40:54.920 | if I trade hair with you any day.
00:40:56.640 | But like, you know, something like that.
00:40:58.880 | But if I can come up with something, like I could say,
00:41:01.280 | okay, so Lex Luthor is this criminal mastermind
00:41:04.160 | and then I just imagine you-
00:41:05.000 | - And we talked about stabbing or whatever earlier.
00:41:07.240 | - Yeah, exactly, right.
00:41:08.080 | - So I'm just kind of connected and that's it.
00:41:09.680 | - Yeah, yeah, but I'm serious though
00:41:12.040 | that these kinds of weird associations,
00:41:14.260 | now I'm building a richer network.
00:41:16.120 | I mean, one of the things that I find is if I've,
00:41:19.080 | like you can have somebody's name
00:41:20.740 | that's just totally generic, like John Smith or something,
00:41:23.720 | not that, no offense to people, that name,
00:41:26.120 | but you know, if I see a generic name like that,
00:41:29.680 | but I've read John Smith's papers academically
00:41:32.960 | and then I meet John Smith at a conference,
00:41:35.480 | I can immediately associate that name with that face
00:41:38.000 | 'cause I have this preexisting network
00:41:39.840 | to lock everything into, right?
00:41:41.960 | And so you can build that network
00:41:43.520 | and that's what the method of loci
00:41:45.400 | or the memory palace technique is all about
00:41:47.920 | is you have a preexisting structure in your head
00:41:50.420 | of like your childhood home or this mental palace
00:41:54.200 | that you've created for yourself.
00:41:55.960 | And so now you can put arbitrary pieces of information
00:42:00.200 | in different locations in that mental structure of yours
00:42:04.580 | and then you could walk through the different path
00:42:07.520 | and find all the pieces of information you're looking for.
00:42:10.640 | So the method of loci is a great method
00:42:12.680 | for just learning arbitrary things
00:42:15.100 | because it allows you to link them together
00:42:17.540 | and get that cue that you need to pop in
00:42:20.800 | and find everything, right?
00:42:22.420 | - We should maybe linger on this memory palace thing
00:42:27.120 | just to make obvious 'cause when people were describing
00:42:30.040 | to me a while ago what this is, it seems insane.
00:42:34.080 | You literally think of a place like a childhood home
00:42:40.000 | or a home that you're really visually familiar with
00:42:44.160 | and you literally place in that three-dimensional space
00:42:50.880 | facts or people or whatever you wanna remember
00:42:54.680 | and you just walk in your mind along that place visually
00:42:59.960 | and you can remember,
00:43:02.580 | remind yourself of the different things.
00:43:04.480 | One of the limitations is there is a sequence to it.
00:43:07.600 | So it's, I think your brain somehow,
00:43:10.040 | you can't just like go upstairs right away or something.
00:43:12.680 | You have to like walk along the room.
00:43:15.000 | So it's really great for remembering sequences
00:43:16.880 | but it's also not great for remembering
00:43:19.000 | like individual facts out of context.
00:43:20.760 | So the full context of the tour I think is important.
00:43:23.760 | But it's fascinating how the mind is able to do that
00:43:28.240 | when you ground these pieces of knowledge
00:43:30.720 | into something that you remember well already,
00:43:34.880 | especially visually.
00:43:36.360 | Fascinating.
00:43:37.200 | And you can just do that for any kind of sequence.
00:43:39.320 | I'm sure she used something like this for the IKEA catalog.
00:43:43.160 | - Oh yeah, absolutely, absolutely.
00:43:45.880 | And I think the principle here is,
00:43:49.280 | again, I was telling you this idea
00:43:50.640 | that memories can compete with each other, right?
00:43:53.320 | Well, I like to use this example
00:43:56.100 | and maybe someday I'll regret this
00:43:57.520 | but I've used it a lot recently is like,
00:43:59.920 | imagine if this were my desk,
00:44:01.560 | it could be cluttered with a zillion different things, right?
00:44:03.800 | So imagine it's just cluttered
00:44:04.880 | with a whole bunch of yellow post-it notes.
00:44:07.080 | And one of them, I put my bank password on it, right?
00:44:09.980 | Well, it's gonna take me forever to find it.
00:44:11.680 | I might, you know, it's just gonna be buried
00:44:13.720 | under all these other post-it notes.
00:44:15.360 | But if it's like hot pink, it's gonna stand out
00:44:18.560 | and I find it really easily, right?
00:44:19.960 | And so that's one way in which if things are distinctive,
00:44:23.600 | if you've processed information in a very distinctive way,
00:44:27.320 | then you can have a memory that's gonna last.
00:44:31.400 | And that's very good, for instance,
00:44:33.120 | for name-face associations.
00:44:34.880 | If I get something distinctive about you,
00:44:37.200 | you know, that it's like, that you've got very short hair
00:44:39.920 | and maybe I can make the association
00:44:41.280 | with Lex Luthor that way or something like that, right?
00:44:43.560 | You know, but I get something very specific.
00:44:45.620 | That's a great cue.
00:44:47.120 | But the other part of it is,
00:44:48.400 | what if I just organized my notes
00:44:50.840 | so that I have my finances in one pile
00:44:53.080 | and I have my like reminders, my to-do list in one pile?
00:44:57.000 | And so forth, so I organize them.
00:44:58.880 | Well, then I know exactly if I'm going for my banking,
00:45:02.160 | you know, my bank password,
00:45:03.800 | I could go to the finance pile, right?
00:45:06.040 | So the method of loci works or memory palaces work
00:45:10.080 | because they give you a way of organizing.
00:45:12.180 | There's a school of thought that says
00:45:14.640 | that episodic memory evolved
00:45:17.440 | from this like kind of knowledge of space
00:45:19.720 | and, you know, basically this primitive abilities
00:45:22.800 | to figure out where you are.
00:45:24.000 | And so people explain the method of loci that way.
00:45:26.920 | And, you know, whether or not
00:45:29.160 | the evolutionary argument is true,
00:45:31.640 | the method of loci is not at all special.
00:45:33.400 | So if you're not a good visualizer,
00:45:35.560 | stories are a good one.
00:45:37.840 | So a lot of memory athletes will use stories
00:45:41.240 | and they'll go like, if you're memorizing a deck of cards,
00:45:44.480 | they have a little code for the different,
00:45:46.480 | like the king and the jack and the 10 and so forth.
00:45:50.520 | And they'll make up a story about things that they're doing
00:45:53.080 | and that'll work.
00:45:54.040 | Songs are a great one, right?
00:45:56.080 | I mean, it's like, I can still remember
00:45:58.240 | there was this obscure episode of the TV show "Cheers."
00:46:01.080 | One of the characters sings a song about Albania
00:46:02.640 | that he uses to memorize all these facts about Albania.
00:46:05.800 | I could still sing that song to you.
00:46:08.160 | It's just, I saw it on a TV show, you know?
00:46:12.160 | - So you mentioned spaced repetition.
00:46:13.800 | So what, do you like this process?
00:46:16.120 | Maybe can you explain it?
00:46:17.520 | - Oh yeah.
00:46:18.360 | If I'm trying to memorize something,
00:46:19.880 | let's say if I have an hour to memorize
00:46:22.160 | as many Spanish words as I can,
00:46:24.500 | if I just try to do like half an hour
00:46:27.400 | and then later in the day I do half an hour,
00:46:30.160 | I won't retain that information as long
00:46:32.680 | as if I do half an hour today
00:46:34.680 | and half an hour one week from now.
00:46:37.120 | And so doing that extra spacing
00:46:40.040 | should help me retain the information better.
00:46:42.800 | Now there's an interesting boundary condition,
00:46:45.920 | which is it depends on when you need that information.
00:46:49.360 | So many of us, you know, for me,
00:46:51.480 | like I can't remember so much from college and high school
00:46:54.580 | 'cause I crammed,
00:46:55.420 | 'cause I just did everything at the last minute.
00:46:57.580 | And sometimes I would literally study like, you know,
00:47:01.240 | in the hallway right before the test.
00:47:03.620 | And that was great because what would happen is,
00:47:06.300 | is I just had that information right there.
00:47:08.900 | And so actually not spacing can really help you
00:47:12.460 | if you need it very quickly, right?
00:47:14.900 | But the problem is
00:47:15.780 | that you tend to forget it later on.
00:47:18.260 | But on the other hand, if you space things out,
00:47:20.860 | you get a benefit for later on retention.
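As a rough illustration of spacing in practice, here is a minimal expanding-interval review scheduler of the kind flashcard apps use; the starting gap and the multiplier are arbitrary choices, not values from this conversation:

```python
from datetime import date, timedelta

# Expanding-interval review: each successful recall roughly doubles the gap
# before the next review, and a failed recall resets it to a short gap.

def next_gap(last_gap_days, recalled, multiplier=2.0, first_gap=1):
    if not recalled:
        return first_gap
    return max(first_gap, round(last_gap_days * multiplier))

gap, day = 1, date.today()
for recalled in [True, True, False, True, True]:
    day += timedelta(days=gap)
    print(f"review on {day}: recalled={recalled}, next gap={next_gap(gap, recalled)} days")
    gap = next_gap(gap, recalled)
```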
00:47:23.860 | And so there's many different explanations.
00:47:27.080 | We have a computational model of this.
00:47:29.060 | It's currently under revision.
00:47:31.420 | But in our computer model,
00:47:32.820 | what we say is that an easy,
00:47:35.140 | maybe a good way of thinking about this is
00:47:37.820 | this conversation that you and I are having,
00:47:40.660 | it's associated with a particular context,
00:47:43.100 | a particular place in time.
00:47:45.140 | And so all of these little cues that are in the background,
00:47:47.740 | these little guitar sculptures that you have
00:47:49.980 | and that big light umbrella thing, right?
00:47:52.180 | All these things are part of my memory
00:47:53.820 | for what we're talking about, the content.
00:47:56.380 | So now later on, you're sitting around
00:48:00.140 | and you're at home drinking a beer and you're thinking,
00:48:02.620 | God, what a strange interview that was, right?
00:48:04.820 | So now you're trying to remember it,
00:48:06.940 | but the context is different.
00:48:08.780 | So your current situation doesn't match up
00:48:12.980 | with the memory that you pulled up.
00:48:14.740 | There's error.
00:48:15.600 | There's a mismatch between what you pulled up
00:48:18.320 | and your current context.
00:48:20.020 | And so in our model, what you start to do
00:48:21.860 | is you start to erase or alter the parts of the memory
00:48:25.820 | that are associated with a specific place in time,
00:48:28.740 | and you heighten the information about the content.
00:48:32.100 | And so if you remember this information
00:48:34.940 | in different times and different places,
00:48:37.460 | it's more accessible at different times and different places
00:48:41.060 | because it's not overfitted
00:48:42.980 | in an AI kind of way of thinking about things.
00:48:45.660 | It's not overfitted to one particular context.
00:48:47.940 | But that's also why the memories that we call upon the most
00:48:51.660 | also feel kind of like they're just things
00:48:53.860 | that we read about almost.
00:48:54.900 | You don't vividly re-imagine them, right?
00:48:57.180 | It's like they're just these things
00:48:58.680 | that just come to us like facts, right?
00:49:01.340 | And it's a little bit different than semantic memory,
00:49:03.820 | but it's like basically these events
00:49:05.860 | that we have recalled over and over and over again,
00:49:10.140 | we keep updating that memory
00:49:11.860 | so it's less and less tied to the original experience.
00:49:15.020 | But then we have those other ones,
00:49:16.300 | which it's like you just get a reminder
00:49:18.020 | of that very specific context.
00:49:20.060 | You smell something, you hear a song,
00:49:22.700 | you see a place that you haven't been to in a while,
00:49:25.540 | and boom, it just comes back to you.
00:49:27.580 | And that's the exact opposite
00:49:28.900 | of what you get with spacing, right?
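A toy sketch of the idea described here, and not the lab's actual computational model: a memory trace with separate content and context features, where each retrieval in a new context washes out the context features while the content is strengthened by the mismatch signal:

```python
import numpy as np

rng = np.random.default_rng(0)
content = np.ones(5)       # the "what" of the memory
context = np.ones(5)       # the "where and when" it was encoded

def retrieve(content, context, current_context, lr=0.3):
    mismatch = current_context - context                 # error between stored and current context
    context = context + lr * mismatch                    # context trace drifts toward the present
    content = content + lr * np.abs(mismatch).mean()     # content is boosted by the error signal
    return content, context

for _ in range(5):                                       # recall the event in five new situations
    content, context = retrieve(content, context, rng.normal(size=5))

print("content strength:", content.round(2))             # grows with repeated retrieval
print("context trace   :", context.round(2))             # averaged out, tied to no single place or time
```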
00:49:30.340 | - That's so fascinating.
00:49:31.300 | So with spaced repetition, one of its powers
00:49:34.300 | is that you lose attachment to a particular context,
00:49:37.160 | but then it loses the intensity
00:49:42.740 | of the flavor of the memory.
00:49:44.980 | That's interesting, that's so interesting.
00:49:47.460 | - Yeah, but at the same time, it becomes stronger
00:49:50.180 | in the sense that the content becomes stronger.
00:49:52.340 | - Yeah, so it's used for learning languages,
00:49:54.860 | for learning facts, for learning,
00:49:57.100 | for that generic semantic information type of memory.
00:49:59.940 | - Yeah, and I think this falls into a category.
00:50:02.740 | We've done other modeling.
00:50:04.700 | One of these is a published study
00:50:06.020 | in PLOS Computational Biology,
00:50:08.200 | where we showed that another way,
00:50:11.260 | which is, I think, related to the spacing effect,
00:50:14.700 | is what's called the testing effect.
00:50:16.220 | So the idea is that if you're trying to learn words,
00:50:20.620 | let's say in Spanish or something like that,
00:50:22.500 | and this doesn't have to be words, it could be anything,
00:50:25.140 | you test yourself on the words,
00:50:27.540 | and that act of testing yourself
00:50:29.460 | helps you retain it better over time
00:50:31.780 | than if you just studied it, right?
00:50:33.840 | And so from traditional learning theories,
00:50:37.860 | some learning theories anyway, this seems weird,
00:50:40.020 | why would you do better giving yourself this extra error
00:50:44.020 | from testing yourself,
00:50:45.340 | rather than just giving yourself perfect input
00:50:48.540 | that's a replica of what it is that you're trying to learn?
00:50:51.860 | And I think the reason is that you get better retention
00:50:55.260 | from that error, that mismatch that we talked about, right?
00:50:58.700 | So what's happening in our model,
00:51:00.860 | it's actually conceptually kind of similar
00:51:03.060 | to what happens with backprop in AI or neural networks.
00:51:07.460 | And so the idea is that you expose,
00:51:10.020 | here's the bad connections and here's the good connections.
00:51:13.660 | And so we can keep the parts of this cell assembly
00:51:17.340 | that are good for the memory
00:51:18.500 | and lose the ones that are not so good.
00:51:20.860 | But if you don't stress test the memory,
00:51:22.700 | you haven't exposed it to the error fully.
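A toy contrast between restudying and testing yourself, loosely in the spirit of the error-driven account sketched here (not the published model): restudy adds the same increment no matter what the memory would actually produce, while testing retrieves first and learns only from the mismatch:

```python
import numpy as np

rng = np.random.default_rng(1)
cue = np.array([1.0, 0.0, 1.0])
target = np.array([0.0, 1.0, 1.0])

W_restudy = rng.normal(0, 0.1, (3, 3))   # cue-to-target association weights
W_test = W_restudy.copy()
lr = 0.2

for _ in range(20):
    # restudy: co-present cue and target and strengthen everything blindly
    W_restudy += lr * np.outer(target, cue)
    # test: attempt retrieval, then update only in proportion to the error
    retrieved = W_test @ cue
    W_test += lr * np.outer(target - retrieved, cue)

print("restudy retrieval error:", np.abs(W_restudy @ cue - target).sum().round(3))
print("testing retrieval error:", np.abs(W_test @ cue - target).sum().round(3))
```

The error-driven version converges on the target because the mismatch exposes which connections help and which hurt, which is the point being made about stress-testing a memory.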
00:51:25.660 | And so that's why I think this is kind of,
00:51:28.020 | this is a thing that I come back to over and over again,
00:51:30.200 | is that you will retain information better
00:51:34.200 | if you're constantly pushing yourself to your limit, right?
00:51:37.980 | If you are feeling like you're coasting,
00:51:41.420 | then you're actually not learning.
00:51:44.020 | So it's like-
00:51:45.020 | - So you should always be stress testing the memory system.
00:51:50.020 | - Yeah, and feel good about it.
00:51:52.020 | Even though everyone tells me, oh, my memory's terrible,
00:51:55.280 | in the moment, they're overconfident
00:51:57.140 | about what they'll retain later on.
00:51:59.340 | So it's fascinating.
00:52:01.020 | And so what happens is when you test yourself,
00:52:03.800 | you're like, oh my God, I thought I knew that, but I don't.
00:52:06.740 | And so it can be demoralizing until you get around that
00:52:11.500 | and you realize, hey, this is the way that I learned.
00:52:14.020 | This is how I learned best.
00:52:16.660 | It's like if you're trying to star in a movie
00:52:20.580 | or something like that,
00:52:21.460 | you don't just sit around reading the script.
00:52:23.400 | You actually act it out.
00:52:24.700 | And you're gonna botch those lines from time to time, right?
00:52:27.340 | - You know, there's an interesting moment
00:52:28.580 | you probably experienced this.
00:52:29.780 | I remember a good friend of mine, Joe Rogan,
00:52:32.900 | I was on his podcast.
00:52:34.900 | And we were randomly talking about soccer, football.
00:52:39.900 | Somebody I grew up watching, Diego Armando Maradona,
00:52:45.760 | one of the greatest soccer players of all time.
00:52:47.940 | And we were talking about him and his career and so on.
00:52:52.200 | And Joe asked me if he's still around.
00:52:57.040 | Now, and I said, yeah.
00:53:01.700 | I don't know why I thought yeah,
00:53:06.200 | because that was a perfect example of memories.
00:53:08.860 | He passed away.
00:53:10.320 | I tweeted about it, how heartbroken I was,
00:53:14.300 | all this kind of stuff, like a year before.
00:53:17.060 | I know this, but in my mind,
00:53:19.520 | I went back to the thing I've done many times in my head,
00:53:22.740 | visualizing some of the epic runs he had on goal and so on.
00:53:26.420 | So for me, he's alive.
00:53:28.100 | And so I'm, and part of the, also the conversation
00:53:30.740 | when you're talking to Joe, you're just stressing.
00:53:33.300 | The focus is allocated,
00:53:34.860 | the attention is allocated in a particular way.
00:53:37.700 | But when I walked away, I was like,
00:53:40.620 | in which world was Diego Maradona still alive?
00:53:44.660 | Like in which, 'cause I was sure in my head
00:53:47.500 | that he was still alive.
00:53:49.100 | There was a, it's a moment that sticks with me.
00:53:51.700 | I've had a few like that in my life,
00:53:53.620 | where it just kinda, like obvious things
00:53:58.300 | just disappear from mind.
00:53:59.980 | And it's cool, like it shows actually the power
00:54:02.580 | of the mind in a positive sense
00:54:04.500 | to erase memories you want erased, maybe.
00:54:07.220 | But I don't know.
00:54:09.280 | I don't know if there's a good explanation for that.
00:54:10.980 | - One of the cool things that I found is this,
00:54:13.980 | that some people really just revolutionize a field
00:54:18.420 | by creating a problem that didn't exist before.
00:54:22.140 | It's kinda like why I love science is like,
00:54:24.500 | engineering is like solving other people's problems,
00:54:27.180 | and science is about creating problems.
00:54:29.620 | I'm just much more like I wanna break things
00:54:31.860 | and create problems.
00:54:34.340 | Not necessarily move fast though.
00:54:36.040 | (laughing)
00:54:36.980 | But one of my former mentors, Marcia Johnson,
00:54:39.540 | who in my opinion is one of the greatest
00:54:40.940 | memory researchers of all time.
00:54:42.880 | She comes up as a young woman in the field,
00:54:45.300 | and it's mostly a guy field.
00:54:47.380 | And she gets into this idea of how do we tell the difference
00:54:51.460 | between things that we've imagined
00:54:53.220 | and things that we actually remember?
00:54:55.140 | How do we tell, I get some mental experience.
00:54:57.420 | Where did that mental experience come from, right?
00:55:00.180 | And it turns out this is a huge problem
00:55:02.980 | because essentially our mental experience
00:55:04.900 | of remembering something that happened,
00:55:06.900 | our mental experience of thinking about something,
00:55:10.180 | how do you tell the difference?
00:55:11.340 | They're both largely constructions in our head.
00:55:15.380 | And so it is very important.
00:55:18.260 | And the way that you do it is, I mean it's not perfect,
00:55:22.720 | but the way that we often do it and succeed
00:55:25.420 | is by again using our prefrontal cortex
00:55:27.620 | and really focusing on the sensory information
00:55:31.680 | or the place and time and the things that put us back
00:55:34.700 | into when this information happened.
00:55:37.500 | And if it's something you thought about,
00:55:39.200 | you're not gonna have all of that vivid detail
00:55:41.740 | as you do for something that actually happened.
00:55:44.380 | But it doesn't work all the time.
00:55:45.740 | But that's a big thing that you have to do.
00:55:47.420 | But it takes time, it's slow, and it's again, effortful.
00:55:51.260 | But that's what you need to remember accurately.
00:55:53.700 | But what's cool, and I think this is what you alluded to
00:55:55.800 | about how that was an interesting experience,
00:55:57.840 | is imagination's exactly the opposite.
00:56:00.420 | Imagination is basically saying,
00:56:03.100 | I'm just gonna take all this information from memory,
00:56:06.300 | recombine it in different ways and throw it out there.
00:56:09.220 | And so for instance, Dan Schachter and Donna Addis
00:56:13.060 | have done cool work on this.
00:56:14.360 | Demis Hassabis did work on this
00:56:16.520 | with Eleanor Maguire at UCL.
00:56:19.400 | And this goes back actually to this guy, Frederick Bartlett,
00:56:23.220 | who was this revolutionary memory researcher.
00:56:25.760 | Bartlett, he actually rejected the whole idea
00:56:29.360 | of quantifying memory.
00:56:30.560 | He said, "There's no statistics in my book."
00:56:32.760 | He came from this anthropology perspective.
00:56:35.480 | And short version of the stories,
00:56:38.080 | he just asked people to recall things.
00:56:40.080 | He would give people stories and poems,
00:56:42.200 | ask people to recall them.
00:56:43.840 | And what he found was people's memories
00:56:45.760 | didn't reflect all of the details
00:56:48.100 | of what they were exposed to.
00:56:50.160 | And they did reflect a lot more,
00:56:52.160 | they were filtered through this lens of prior knowledge.
00:56:55.120 | The cultures that they came from,
00:56:57.480 | the beliefs that they had, the things they knew.
00:57:00.160 | And so what he concluded was that
00:57:01.880 | he called remembering an imaginative construction.
00:57:05.760 | Meaning that we don't replay the past,
00:57:09.080 | we imagine how the past could have been
00:57:11.220 | by taking bits and pieces that come up in our heads.
00:57:14.400 | And likewise, he wrote this beautiful paper on imagination
00:57:17.460 | saying when we imagine something and create something,
00:57:20.340 | we're creating it from these specific experiences
00:57:23.060 | that we've had and combining it with our general knowledge.
00:57:25.660 | But instead of trying to focus it on being accurate
00:57:28.380 | and getting at one thing,
00:57:29.760 | you're just ruthlessly recombining things
00:57:31.960 | without any necessary kind of goal in mind.
00:57:36.960 | I mean, or at least that's one kind of creation.
00:57:39.380 | - So imagination is fundamentally coupled with memory
00:57:44.380 | in both directions?
00:57:48.300 | - I think so.
00:57:49.140 | I mean, it's not clear that it is in everyone,
00:57:52.580 | but one of the things that's been studied
00:57:54.500 | is some patients who have amnesia, for instance,
00:57:56.860 | they have brain damage, say, to the hippocampus.
00:58:00.360 | And if you ask them to imagine things
00:58:03.620 | that are not in front of them,
00:58:04.780 | like imagine what could happen
00:58:06.220 | after I leave this room, right?
00:58:08.420 | They find it very difficult to give you a scenario
00:58:11.700 | of what could happen.
00:58:13.020 | Or if they do, it'd be more stereotyped,
00:58:14.820 | like yes, this would happen, this would.
00:58:16.460 | But it's not like they can come up with anything
00:58:18.220 | that's very vivid and creative in that sense.
00:58:21.340 | And it's partly 'cause when you have amnesia,
00:58:23.140 | you're stuck in the present.
00:58:25.200 | Because to get a very good model of the future,
00:58:28.340 | it really helps to have episodic memories to draw upon.
00:58:32.020 | And so that's the basic idea.
00:58:35.100 | And in fact, one of the most impressive things,
00:58:38.020 | when people started to scan people's brains
00:58:40.580 | and ask people to remember past events,
00:58:43.980 | what they found was there was this big network of the brain
00:58:46.260 | called the default mode network.
00:58:47.900 | It gets a lot of press
00:58:49.140 | because it's thought to be important.
00:58:50.780 | It's engaged during mind-wandering.
00:58:52.740 | And if I ask you to pay attention to something,
00:58:55.420 | it only comes on when you stop paying attention.
00:58:58.020 | So people said, oh, it's just this kind of daydreaming network.
00:59:02.540 | And I thought, this is just ridiculous research.
00:59:04.420 | Who cares?
00:59:06.260 | But then what people found
00:59:07.740 | was when people recall episodic memories,
00:59:10.100 | this network gets active.
00:59:11.940 | And so we started to look into it,
00:59:15.140 | and this network of areas is really closely,
00:59:18.540 | functionally interacting with the hippocampus.
00:59:21.300 | And so, in fact, some would say the hippocampus
00:59:23.820 | is part of this default network.
00:59:26.220 | And if you look at brain images of people,
00:59:29.680 | or brain maps of activation, so to speak,
00:59:31.580 | of people imagining possible scenarios
00:59:34.220 | of things that could happen in the future,
00:59:36.140 | even things that couldn't really be very plausible,
00:59:39.500 | they look very similar.
00:59:41.020 | I mean, to the naked eye,
00:59:42.340 | they look almost the same as maps of brain activation
00:59:45.340 | when people remember the past.
00:59:47.420 | According to our theory,
00:59:48.940 | and we've got some data to support this,
00:59:50.540 | we've broken up this network into various sub-pieces,
00:59:53.620 | is that basically it's kind of taking apart
00:59:55.980 | all of our experiences
00:59:57.660 | and creating these little Lego blocks out of them.
00:59:59.980 | And then you can put them back together
01:00:02.020 | if you have the right instructions
01:00:03.420 | to recreate these experiences that you've had,
01:00:06.260 | but you could also reassemble them into new pieces
01:00:08.620 | to create a model of an event that hasn't happened yet.
01:00:11.820 | And that's what we think happens.
01:00:13.460 | Our common ground that we're establishing in language
01:00:18.420 | requires using those building blocks
01:00:20.620 | to put together a model of what's going on.
01:00:23.380 | - Well, there's a good percentage of time
01:00:25.100 | I personally live in the imagined world.
01:00:28.260 | I think of, I do thought experiments a lot.
01:00:32.860 | I take the absurdity of human life as it stands
01:00:37.860 | and play it forward in all kinds of different directions.
01:00:44.020 | Sometimes it's rigorous thought experiments,
01:00:47.540 | sometimes it's fun ones.
01:00:48.500 | So I imagine that that has an effect
01:00:52.380 | on how I remember things.
01:00:54.540 | And I suppose I have to be a little bit careful
01:00:56.700 | to make sure stuff happened
01:00:58.340 | versus stuff that I just imagined happened.
01:01:02.300 | And this also, I mean, some of my best friends
01:01:05.900 | are characters inside books that never even existed.
01:01:09.620 | And there's some degree
01:01:13.540 | to which they actually exist in my mind.
01:01:16.300 | Like these characters exist, authors exist,
01:01:18.300 | Dostoevsky exists, but also Brothers Karamazov.
01:01:22.620 | - I love that book.
01:01:23.700 | - Yeah.
01:01:24.540 | - It's one of the few books I've read.
01:01:25.700 | (laughing)
01:01:26.820 | One of the few literature books that I've read,
01:01:28.700 | I should say.
01:01:29.540 | I read a lot in school that I don't remember,
01:01:31.020 | but Brothers Karamazov.
01:01:32.300 | - But they exist.
01:01:33.380 | They exist, and I have almost conversations with them.
01:01:36.060 | It's interesting.
01:01:36.900 | It's interesting to allow your brain
01:01:39.740 | to kind of play with ideas of the past,
01:01:43.100 | of the imagined, and see it all as one.
01:01:46.060 | - Yeah, there was actually this famous mnemonist.
01:01:49.060 | He's kind of like back then
01:01:50.260 | the equivalent of a memory athlete,
01:01:51.780 | except he would go to shows and do this.
01:01:54.580 | That was described by this really famous
01:01:57.100 | neuropsychologist from Russia named Luria.
01:02:00.340 | And so this guy was named Solomon Shereshevsky,
01:02:03.780 | and he had this condition called synesthesia
01:02:06.460 | that basically created these weird associations
01:02:09.380 | between different senses that normally wouldn't go together.
01:02:12.860 | So that gave him this incredibly vivid imagination
01:02:17.860 | that he would use to basically imagine
01:02:20.820 | all sorts of things that he would need to memorize,
01:02:24.180 | and he would just imagine,
01:02:25.780 | just create these incredibly detailed things in his head
01:02:29.100 | that allowed him to memorize all sorts of stuff.
01:02:32.420 | But it also really haunted him by some reports
01:02:35.300 | that basically it was like he was at some point,
01:02:39.100 | and again, who knows if the drinking was part of this,
01:02:41.140 | but at some point had trouble
01:02:42.980 | differentiating his imagination from reality, right?
01:02:46.300 | And this is interesting because it's like,
01:02:48.940 | I mean, that's what psychosis is in some ways,
01:02:52.340 | is you, first of all, you're just learning connections
01:02:56.940 | from prediction errors that you probably shouldn't learn,
01:03:00.260 | and the other part of it is that your internal signals
01:03:04.180 | are being confused with actual things
01:03:06.740 | in the outside world, right?
01:03:08.780 | - Well, that's why a lot of this stuff
01:03:10.220 | is both feature and bug.
01:03:11.700 | It's a double-edged sword.
01:03:13.140 | - Yeah, I mean, it might be why
01:03:13.980 | there's such an interesting relationship
01:03:15.680 | between genius and psychosis.
01:03:17.600 | - Yeah, maybe they're just two sides of the same coin.
01:03:22.420 | Humans are fascinating, aren't they?
01:03:25.940 | - I think so, sometimes scary, but mostly fascinating.
01:03:29.820 | - Can we just talk about memory sport a little longer?
01:03:33.220 | There's something called the USA Memory Championship.
01:03:36.460 | Like, what are these athletes like?
01:03:38.940 | What does it mean to be like elite level at this?
01:03:42.620 | Have you interacted with any of them,
01:03:44.160 | or reading about them,
01:03:45.380 | what have you learned about these folks?
01:03:47.100 | - There's a guy named Henry Roediger
01:03:49.140 | who's studying these guys,
01:03:50.460 | and there's actually a book by Joshua Foer
01:03:52.980 | called "Moonwalking with Einstein"
01:03:54.800 | where he talks about,
01:03:56.340 | he actually, as part of this book,
01:03:58.020 | just decided to become a memory athlete.
01:04:00.940 | They often have these life events
01:04:02.860 | that make them go, "Hey, why don't I do this?"
01:04:05.660 | So there was a guy named Scott Hagwood,
01:04:07.340 | who I write about,
01:04:08.520 | who thought that he was getting chemo for cancer.
01:04:13.520 | And so he decided, like, because the chemo,
01:04:18.020 | there's a well-known thing called chemo brain
01:04:20.780 | where people become like,
01:04:21.820 | they just lose a lot of their sharpness.
01:04:24.900 | And so he wanted to fight that
01:04:27.000 | by learning these memory skills.
01:04:28.760 | So he bought a book,
01:04:29.760 | and this is the story you hear in a lot of memory athletes
01:04:32.600 | is they buy a book by other memory athletes,
01:04:35.640 | or other memory experts, so to speak,
01:04:38.320 | and they just learn those skills
01:04:40.080 | and practice them over and over again.
01:04:42.160 | They start by winning bets and so forth,
01:04:44.120 | and then they go into these competitions.
01:04:45.760 | And the competitions are typically things
01:04:48.040 | like memorizing long strings of numbers,
01:04:50.880 | or memorizing orders of cards, and so forth.
01:04:54.500 | So they tend to be pretty arbitrary things,
01:04:56.820 | not things where you'd be able
01:04:59.100 | to bring a lot of prior knowledge.
01:05:01.820 | But they build the skills
01:05:03.860 | that you need to memorize arbitrary things.
01:05:06.060 | - Yeah, that's fascinating.
01:05:07.140 | I've gotten a chance to work with something called
01:05:09.780 | n-back tasks, so there's all these kinds of tasks,
01:05:12.620 | memory recall tasks that are used to kinda load up
01:05:16.300 | the quote-unquote working memory.
01:05:17.900 | - Yeah, yeah.
01:05:18.740 | - And to see, a psychologist used it
01:05:21.620 | to test all kinds of stuff,
01:05:22.700 | like to see how well you're good at multitasking.
01:05:25.580 | We used it in particular for the task of driving,
01:05:28.300 | like if you fill up your brain
01:05:30.580 | with intensive working memory tasks,
01:05:35.340 | how good are you at also not crashing?
01:05:38.980 | That kind of stuff.
01:05:39.980 | So it's fascinating, but again,
01:05:42.700 | those tasks are arbitrary,
01:05:44.380 | and they're usually about recalling a sequence of numbers
01:05:47.300 | in some kinda semi-complex way.
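For reference, a minimal n-back trial generator and scorer of the sort used to load up working memory; n, sequence length, letters, and match rate are arbitrary choices here:

```python
import random

def make_sequence(n=2, length=20, match_prob=0.3, letters="BCDFGH"):
    """Generate letters, occasionally forcing an n-back repeat."""
    seq = []
    for i in range(length):
        if i >= n and random.random() < match_prob:
            seq.append(seq[i - n])
        else:
            seq.append(random.choice(letters))
    return seq

def accuracy(seq, responses, n=2):
    """responses[i] is True if the participant called trial i a match."""
    hits = sum((i >= n and seq[i] == seq[i - n]) == said
               for i, said in enumerate(responses))
    return hits / len(seq)

seq = make_sequence()
perfect = [i >= 2 and seq[i] == seq[i - 2] for i in range(len(seq))]
print(seq, accuracy(seq, perfect))   # a perfect responder scores 1.0
```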
01:05:50.580 | Are you, do you have any favorite tasks of this nature
01:05:53.140 | in your own studies?
01:05:55.860 | - I've really been most excited
01:05:58.620 | about going in the opposite direction
01:06:00.700 | and using things that are more and more naturalistic.
01:06:03.980 | And the reason is that we've moved in that direction
01:06:07.900 | because what we found is that memory works
01:06:11.020 | very, very differently when you study it,
01:06:14.260 | when you study memory in the way
01:06:16.300 | that people typically remember.
01:06:18.780 | And so it goes into a much more predictive mode,
01:06:22.460 | and you have these event boundaries, for instance,
01:06:25.260 | and you have, but a lot of what happens
01:06:29.340 | is this kind of fascinating mix
01:06:31.380 | that we've been talking about,
01:06:32.500 | a mix of interpretations and imagination with perception.
01:06:37.220 | And so, and the new direction we're going in
01:06:41.140 | is understanding navigation and memory for places.
01:06:45.180 | And the reason is that there's a lot of work
01:06:47.500 | that's done in rats, which is very good work.
01:06:50.380 | They have a rat, and they put it in a box,
01:06:52.900 | and the rat goes, chases cheese in a box,
01:06:55.140 | and you'll find cells in the hippocampus
01:06:57.380 | that fire when a rat is in different places in the box.
01:07:01.260 | And so the conventional wisdom is that the hippocampus
01:07:04.820 | forms this map of the box.
01:07:07.540 | And I think that probably may happen
01:07:11.540 | when you have absolutely no knowledge of the world, right?
01:07:16.540 | But I think one of the cool things about human memory
01:07:19.380 | is we can bring to bear our past experiences
01:07:22.020 | to economically learn new ones.
01:07:24.860 | And so, for instance, if you learn a map of an IKEA,
01:07:29.420 | let's say if I go to the IKEA in Austin,
01:07:31.220 | I'm sure there's one here,
01:07:32.940 | I probably could go to this Ikea and find my way
01:07:36.300 | to where the wine glasses are
01:07:39.820 | without having to even think about it
01:07:41.780 | because it's got a very similar layout.
01:07:43.540 | Even though IKEA is a nightmare to get around,
01:07:45.980 | once I learn my local IKEA, I can use that map everywhere.
01:07:49.500 | Why form a brand new one for a new place?
01:07:53.140 | And so that kind of ability to reuse information
01:07:57.180 | really comes into play when we look at things
01:08:00.460 | that are more naturalistic tasks.
01:08:03.600 | And another thing that we're really interested in
01:08:07.020 | is this idea of what if instead of basically mapping out
01:08:11.740 | every coordinate in a space,
01:08:13.820 | you form a pretty economical graph
01:08:17.020 | that connects basically the major landmarks together
01:08:20.260 | and being able to use that as emphasizing the things
01:08:24.020 | that are most important, the places that you go for food
01:08:26.900 | and the places that are landmarks that help you get around.
01:08:30.740 | And then filling in the blanks for the rest
01:08:32.900 | because I really believe that cognitive maps
01:08:36.380 | or mental maps of the world,
01:08:38.420 | just like our memories for events, are not photographic.
01:08:42.060 | I think they're this combination
01:08:43.740 | of actual verifiable details
01:08:46.980 | and then a lot of inference that you make.
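A sketch of the "economical graph" idea, assuming a handful of invented landmarks: only the important places and the links between them are stored, and a route is inferred by searching over that sparse structure rather than read off a coordinate map:

```python
from collections import deque

LANDMARKS = {
    "home":           ["coffee shop", "intersection A"],
    "coffee shop":    ["home", "market"],
    "intersection A": ["home", "market", "office"],
    "market":         ["coffee shop", "intersection A"],
    "office":         ["intersection A"],
}

def route(start, goal):
    """Breadth-first search over landmarks: the path is filled in, not stored."""
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in LANDMARKS[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])

print(route("home", "office"))   # ['home', 'intersection A', 'office']
```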
01:08:49.740 | - So what have you learned
01:08:50.820 | about this kind of spatial mapping of places?
01:08:54.900 | How do people represent locations?
01:08:57.580 | - There's a lot of variability, I think,
01:09:00.220 | and there's a lot of disagreement
01:09:01.980 | about how people represent locations.
01:09:04.580 | In a world of GPS and physical maps,
01:09:07.900 | people can learn it from basically
01:09:10.300 | what they call survey perspective,
01:09:12.140 | being able to see everything.
01:09:13.380 | And so that's one way in which humans can do it
01:09:16.780 | that's a little bit different.
01:09:18.820 | There's one way in which we can memorize routes.
01:09:21.780 | I know how to get from here to,
01:09:23.820 | let's say if I walk here from my hotel,
01:09:26.380 | I can just rigidly follow that route back, right?
01:09:29.420 | And there's another more integrative way,
01:09:31.740 | which would be what's called a cognitive map,
01:09:34.500 | which would be kind of a sense
01:09:36.700 | of how everything relates to each other.
01:09:38.860 | And so there's lots of people who believe
01:09:42.460 | that these maps that we have in our head
01:09:45.660 | are isomorphic with the world.
01:09:47.140 | They're like these literal coordinates
01:09:49.900 | that follow Euclidean space.
01:09:51.660 | And as you know, Euclidean mathematics
01:09:54.060 | is very constrained, right?
01:09:55.940 | And I think that we are actually much more generative
01:09:59.580 | in our maps of space
01:10:01.140 | so that we do have these bits and pieces.
01:10:03.260 | And we've got a small dataset as it is right now,
01:10:06.940 | not yet, like we need to do some work on it
01:10:10.140 | for further analysis.
01:10:11.260 | But one of the things we're looking at
01:10:13.340 | is these signals called ripples in the hippocampus,
01:10:17.300 | which are these bursts of activity that you see
01:10:20.540 | that are synchronized with areas in the neocortex
01:10:23.580 | in the default network actually.
01:10:25.540 | And so what we find is that those ripples
01:10:28.340 | seem to increase at navigationally important points
01:10:31.380 | when you're making a decision
01:10:32.700 | or when you reach a goal.
01:10:34.660 | So it speaks to the emotion thing, right?
01:10:37.020 | 'Cause if you have limited choices,
01:10:39.180 | if I'm walking down a street,
01:10:41.000 | I could really just get a mental map of the neighborhood
01:10:45.200 | with a more minimal kind of thing
01:10:46.780 | by just saying here's the intersections
01:10:48.980 | and here's the directions I take to get in between them.
01:10:51.700 | And what we found in general in our MRI studies
01:10:54.100 | is basically the more people can reduce the problem,
01:10:59.660 | whether it's space or any kind of decision-making problem,
01:11:03.260 | the less the hippocampus encodes.
01:11:06.100 | It really is very economical
01:11:08.100 | towards the points of highest information content and value.
01:11:13.100 | - So can you describe the encoding in the hippocampus
01:11:17.100 | and the ripples you were talking about?
01:11:19.060 | What's the signal in which we see the ripples?
01:11:23.780 | - Yeah, so this is really interesting.
01:11:25.300 | There are these oscillations, right?
01:11:27.420 | So there's these waves that you basically see,
01:11:29.900 | and these waves are points of very high excitability
01:11:33.900 | and low excitability.
01:11:35.740 | And at least during,
01:11:37.660 | they happen actually during slow wave sleep too.
01:11:39.860 | So the deepest stages of sleep
01:11:41.500 | when you're just zonked out, right?
01:11:43.660 | You see these very slow waves where it's like very excitable
01:11:47.000 | and then very unexcitable, it goes up and down.
01:11:49.660 | And on top of them,
01:11:50.680 | you'll see these little sharp wave ripples.
01:11:53.220 | And when there's a ripple in the hippocampus,
01:11:56.260 | you tend to see a sequence of cells
01:11:58.940 | that resemble a sequence of cells that fire
01:12:02.420 | when an animal's actually doing something in the world.
01:12:06.100 | So it almost is like a little, people call it replay.
01:12:09.960 | I think it's a little bit, I don't like that term,
01:12:11.900 | but it's basically a little bit of a compressed play
01:12:16.580 | of the sequence of activity in the brain
01:12:19.620 | that was taking place earlier.
01:12:21.520 | And during those moments,
01:12:23.340 | there's a little window of communication
01:12:25.260 | between the hippocampus and these areas in the neocortex.
01:12:28.220 | And so that, I think, helps you form new memories,
01:12:32.820 | but it also helps you, I think, stabilize them,
01:12:35.820 | but also really connect different things together in memory
01:12:39.020 | and allows you to build bridges
01:12:40.480 | between different events that you've had.
01:12:43.020 | And so this is one of, at least our theories of sleep
01:12:46.620 | and its real role in helping you see the connections
01:12:49.720 | between different events that you've experienced.
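As a sketch of how ripples are typically picked out of a recording (a generic pipeline, not this lab's): band-pass the signal in the ripple band, take its envelope, and flag stretches where the envelope exceeds a threshold. The band edges and threshold below are common choices, not values from the episode, and the trace is simulated:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                    # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
lfp = np.random.randn(t.size)                  # stand-in for a recorded hippocampal trace

b, a = butter(3, [150 / (fs / 2), 250 / (fs / 2)], btype="band")
ripple_band = filtfilt(b, a, lfp)              # keep only the 150-250 Hz band
envelope = np.abs(hilbert(ripple_band))        # instantaneous amplitude of that band

threshold = envelope.mean() + 3 * envelope.std()
candidates = envelope > threshold              # putative ripple samples
print(f"{int(candidates.sum())} samples above threshold")
```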
01:12:52.380 | - So during sleep is when the connections are formed.
01:12:55.220 | - The connections between different events, right?
01:12:58.380 | So it's like, you see me now, you see me next week,
01:13:01.900 | you see me a month later.
01:13:03.780 | You start to build a little internal model
01:13:05.940 | of how I behave and what to expect of me.
01:13:09.620 | And with sleep, one of the things that allows you to do
01:13:12.780 | is figure out those connections and connect the dots
01:13:16.340 | and find the signal and the noise.
01:13:18.360 | - So you mentioned fMRI.
01:13:21.380 | What is it and how is it used in studying memory?
01:13:24.380 | - This is actually the reason why I got
01:13:26.180 | into this whole field of sciences.
01:13:28.620 | When I was in grad school, fMRI was just really taking off
01:13:32.840 | as a technique for studying brain activity.
01:13:35.580 | And what's beautiful about it is you can study
01:13:38.380 | the whole human brain and there's lots of limits to it,
01:13:42.440 | but you can basically do it in person
01:13:44.940 | without sticking anything into their brains
01:13:47.580 | and very non-invasive.
01:13:49.300 | I mean, for me, being in an MRI scanner
01:13:50.940 | is like being in the womb, I just fall asleep.
01:13:53.420 | If I'm not being asked to do anything,
01:13:55.420 | I get very sleepy, you know?
01:13:57.660 | But you can have people watch movies
01:14:00.140 | while they're being scanned,
01:14:01.260 | or you can have them do tests of memory,
01:14:03.140 | like giving them words and so forth to memorize.
01:14:06.460 | But what MRI is itself is just this technique
01:14:10.180 | where you put people in a very high magnetic field.
01:14:15.060 | Typical ones we would use would be three Tesla
01:14:17.380 | to give you an idea.
01:14:18.860 | So a three Tesla magnet, you put somebody in,
01:14:21.740 | and what happens is you get this very weak,
01:14:24.340 | but measurable magnetization in the brain.
01:14:28.300 | And then you apply a radiofrequency pulse,
01:14:31.140 | which is basically a different electromagnetic field.
01:14:33.740 | And so you're basically using water,
01:14:36.060 | the water molecules in the brain as a tracer, so to speak.
01:14:40.100 | And part of it in fMRI is the fact
01:14:43.900 | that these magnetic fields that you mess with
01:14:47.740 | by manipulating these radiofrequency pulses
01:14:52.100 | and the static field,
01:14:53.500 | and you have things called gradients
01:14:55.420 | which change the strength of the magnetic field
01:14:57.460 | in different parts of the head.
01:14:58.900 | So they're all, we tweak them in different ways,
01:15:01.220 | but the basic idea that we use in fMRI
01:15:04.100 | is that blood is flowing to the brain.
01:15:07.060 | And when you have blood that doesn't have oxygen on it,
01:15:10.700 | it's a little bit more magnetizable than blood that does,
01:15:14.060 | 'cause you have hemoglobin that carries the oxygen,
01:15:16.660 | the iron, basically, in the blood that makes it red.
01:15:20.100 | And so that hemoglobin, when it's deoxygenated,
01:15:23.540 | actually has different magnetic field properties
01:15:27.500 | than when it has oxygen.
01:15:29.220 | And it turns out when you have an increase
01:15:32.020 | in local activity in some part of the brain,
01:15:34.540 | the blood flows there, and as a result,
01:15:37.220 | you get a lower concentration of hemoglobin
01:15:42.020 | that is not oxygenated.
01:15:45.140 | - And then that gives you more signal.
01:15:47.580 | So I gave you, I think I sent you a gif,
01:15:51.620 | as you like to say.
01:15:52.860 | - Yeah, we had an off-record intense argument
01:15:56.140 | about if it's pronounced jif or gif,
01:15:58.900 | but we shall set that aside as friends.
01:16:02.140 | - We could've called it a stern rebuke, perhaps.
01:16:05.060 | - Rebuke, yeah.
01:16:06.500 | I drew a hard line.
01:16:08.220 | It is true, the creator of gifs said it's pronounced jif,
01:16:12.260 | but that's the only person that pronounces jif.
01:16:15.140 | Anyway, yes, you sent a gif of--
01:16:19.420 | - This would be basically a whole movie of fMRI data.
01:16:24.220 | And so when you look at it, it's not very impressive.
01:16:26.220 | It looks like these very pixelated maps of the brain,
01:16:29.780 | but it's mostly kind of like white.
01:16:31.220 | But these tiny changes in the intensity of those signals
01:16:35.720 | that you probably wouldn't be able to visually perceive,
01:16:38.380 | like about 1% can be statistically
01:16:40.940 | very, very large effects for us.
01:16:43.060 | And that allows us to see,
01:16:44.740 | hey, there's an increase in activity
01:16:46.900 | in some part of the brain when I'm doing some task,
01:16:50.180 | like trying to remember something.
01:16:52.020 | And I can use those changes to even predict,
01:16:55.220 | is a person going to remember this later or not?
01:16:58.420 | And the coolest thing that people have done
01:17:00.340 | is to decode what people are remembering
01:17:04.820 | from the patterns of activity.
01:17:07.260 | 'Cause maybe when I'm remembering this thing,
01:17:09.100 | like I'm remembering the house where I grew up,
01:17:12.540 | I might have one pixel that's bright in the hippocampus
01:17:15.580 | and one that's dark.
01:17:17.020 | And if I'm remembering something more like the car
01:17:21.900 | that I used to drive when I was 16,
01:17:23.420 | I might see the opposite pattern
01:17:24.860 | where a different pixel's bright.
01:17:26.620 | And so all that little stuff that we used to think of noise,
01:17:30.100 | we can now think of almost like a QR code for memory,
01:17:32.940 | so to speak, where different memories
01:17:34.700 | have a different little pattern
01:17:36.740 | of bright pixels and dark pixels.
01:17:38.900 | And so this really revolutionized my research.
01:17:40.980 | So there's fancy research out there where people really,
01:17:44.460 | I mean, not even that,
01:17:45.300 | I mean, by your standards, this would be Stone Age,
01:17:47.460 | but applying machine learning techniques
01:17:49.900 | to do decoding and so forth.
01:17:51.500 | And now there's a lot of forward encoding models
01:17:54.260 | and you can go to town with this stuff, right?
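A sketch of what decoding from patterns looks like in code, using simulated "voxel" patterns for two remembered categories and an off-the-shelf linear classifier; the data and the category labels are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 50
labels = np.repeat([0, 1], n_trials // 2)            # e.g., remembering a house vs. a car
signature = rng.normal(size=n_voxels)                # a category-specific pattern
patterns = rng.normal(size=(n_trials, n_voxels)) + 0.5 * np.outer(labels, signature)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, patterns, labels, cv=5)
print("cross-validated decoding accuracy:", scores.mean().round(2))   # chance would be about 0.5
```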
01:17:56.980 | And I'm much more old school of designing experiments
01:18:01.300 | where you basically say, okay,
01:18:02.900 | here's a whole web of memories that overlap
01:18:06.660 | in some way, shape, or form.
01:18:08.660 | Do memories that occurred in the same place
01:18:11.500 | have a similar QR code?
01:18:14.260 | And do memories that occurred in different places
01:18:15.780 | have a different QR code?
01:18:16.860 | And you can just use things like correlation coefficients
01:18:20.540 | or cosine distance to measure that stuff, right?
01:18:23.180 | Super simple, right?
01:18:24.780 | And so what happens is you can start
01:18:26.820 | to get a whole state space of how a brain area
01:18:30.220 | is indexing all these different memories.
01:18:32.580 | And it's super fascinating because what we could see
01:18:34.980 | is this little like separation
01:18:37.300 | between how certain brain areas are processing memory
01:18:40.580 | for who was there and other brain areas
01:18:43.060 | are processing information about where it occurred
01:18:45.700 | or the situation that's kind of unfolding.
01:18:48.100 | And some are giving you information about
01:18:50.700 | what are my goals that are involved and so forth.
01:18:53.620 | And the hippocampus is just putting it all together
01:18:56.740 | into these unique things that just are about
01:18:59.300 | when and where it happened.
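A sketch of the simpler, "old school" comparison described just above: compute a similarity measure such as cosine similarity between the activity patterns for pairs of memories, and ask whether events from the same place look more alike than events from different places. The patterns here are simulated:

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
place_A = rng.normal(size=100)                        # shared "place" component
place_B = rng.normal(size=100)

event_1 = place_A + rng.normal(scale=0.5, size=100)   # two events at place A
event_2 = place_A + rng.normal(scale=0.5, size=100)
event_3 = place_B + rng.normal(scale=0.5, size=100)   # one event at place B

print("same place     :", round(cosine(event_1, event_2), 2))   # higher
print("different place:", round(cosine(event_1, event_3), 2))   # near zero
```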
01:19:00.780 | - So there is a separation between spatial information,
01:19:06.220 | concepts, like literally there's distinct,
01:19:11.220 | as you said, QR codes for these?
01:19:13.340 | - So to speak.
01:19:14.180 | Let me try a different analogy too
01:19:15.860 | that might be more accessible for people,
01:19:18.500 | which would be like you've got a folder on your computer,
01:19:21.420 | right, and you open it up, there's a bunch of files there.
01:19:24.060 | I can sort those files by alphabetical order.
01:19:28.700 | And now things that both start with letter A
01:19:30.780 | are lumped together.
01:19:32.180 | And things that start with Z versus A are far apart, right?
01:19:35.820 | And so that is one way of organizing the folder,
01:19:38.620 | but I could do it by date.
01:19:40.180 | And if I do it by date,
01:19:41.180 | things that were created close together in time are close
01:19:44.700 | and things that are far apart in time are far.
01:19:47.060 | So every, like you can think of how a brain area
01:19:51.500 | or a network of areas contributes to memory
01:19:54.540 | by looking at what the sorting scheme is.
01:19:57.220 | And these QR codes that we're talking about
01:19:59.980 | that you get from fMRI allow you to do that.
01:20:02.620 | And you can do the same thing if you're recording
01:20:04.220 | from massive populations of neurons in an animal.
01:20:09.140 | And you can do it for recording local potentials
01:20:12.660 | in the brain, you know, so little waves of activity
01:20:16.740 | in let's say a human who has epilepsy
01:20:19.340 | and they stick electrodes in their brain
01:20:20.860 | to try to find the seizures.
01:20:22.240 | So that's some of the work that we're doing now.
01:20:24.860 | But all of these techniques basically allow you to say,
01:20:27.340 | hey, what's the sorting scheme?
01:20:29.820 | And so we've found that some networks of the brain
01:20:32.500 | sort information in memory according to who was there.
01:20:36.480 | So I might have, like we've actually shown
01:20:39.460 | in one of my favorite studies of all time
01:20:41.540 | that was done by a former postdoc, Zach Reagh.
01:20:44.260 | And Zach did the study where we had a bunch of movies
01:20:47.860 | with different people in my lab.
01:20:50.020 | There were two different people and he filmed them
01:20:52.100 | at two different cafes and two different supermarkets.
01:20:55.420 | And what he could show is in one particular network,
01:20:58.380 | you could find the same kind of pattern of activity
01:21:02.620 | more or less, a very similar pattern of activity
01:21:05.700 | every time I saw Alex in one of these movies,
01:21:09.040 | no matter where he was, right?
01:21:10.900 | And I could see another one that was like a common pattern
01:21:15.100 | that happened every time I saw
01:21:16.860 | this particular supermarket, Nugget, you know?
01:21:19.880 | And so, and it didn't matter whether you're watching a movie
01:21:23.180 | or whether you're recalling the movie,
01:21:25.180 | it's the same kind of pattern that comes up, right?
01:21:27.540 | - That's so fascinating. - It is fascinating.
01:21:29.200 | And so now you have those building blocks
01:21:31.340 | for assembling a model of what's happening in the present,
01:21:35.820 | imagining what could happen and remembering things
01:21:39.180 | very economically from putting together all these pieces
01:21:41.740 | so that all the hippocampus has to do
01:21:43.760 | is get the right kind of blueprint
01:21:46.020 | for how to put together all these building blocks.
01:21:48.620 | - These are all like beautiful hints
01:21:51.700 | at a super interesting system
01:21:54.540 | that makes me wonder on the other side of it,
01:21:56.340 | how to build it.
01:21:57.740 | But it's like, it's fascinating.
01:22:00.140 | Like the way it does the encoding
01:22:01.740 | is really, really fascinating.
01:22:03.420 | Or I guess the symptoms, the results of that encoding
01:22:06.500 | are fascinating to study from this.
01:22:08.180 | Just as a small tangent,
01:22:09.660 | you mentioned sort of the measuring local potentials
01:22:13.820 | with electrodes versus fMRI.
01:22:16.900 | - Oh yeah.
01:22:17.740 | - What are some interesting like limitations,
01:22:21.020 | possibilities of fMRI?
01:22:22.920 | Maybe, the way you explained it is like brilliant
01:22:25.460 | with blood and detecting the activations
01:22:30.460 | or the excitation because blood flows to that area.
01:22:34.260 | What's like the latency of that?
01:22:36.300 | Like what's the blood dynamics in the brain?
01:22:39.020 | - Yeah, yeah.
01:22:39.860 | - Like how quickly can it,
01:22:40.980 | how quickly can the task change and all that kind of stuff?
01:22:44.780 | - Yeah, I mean, it's very slow.
01:22:46.860 | To the brain, 50 milliseconds is like, you know,
01:22:51.540 | like it's an eternity.
01:22:53.940 | Maybe not 50, you know, maybe like, you know,
01:22:57.380 | let's say half a second, 500 milliseconds.
01:23:00.300 | Just so much back and forth stuff
01:23:02.340 | happens in the brain in that time, right?
01:23:04.460 | So in fMRI, you can measure these magnetic field responses
01:23:09.460 | about six seconds after that burst of activity
01:23:12.940 | would take place.
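A small worked example of why the measured response lags the neural event: convolve a brief burst of activity with a canonical double-gamma hemodynamic response and the signal peaks several seconds later. The parameter values are the commonly used defaults, not anything specific to this study:

```python
import numpy as np
from scipy.stats import gamma

dt = 0.1
t = np.arange(0, 30, dt)
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6   # peak near 5 s, late undershoot near 15 s
hrf /= hrf.max()

neural = np.zeros_like(t)
neural[int(1 / dt)] = 1.0                      # a brief neural event at t = 1 s

bold = np.convolve(neural, hrf)[: t.size]
print("neural event at 1.0 s; BOLD peak at", round(float(t[bold.argmax()]), 1), "s")  # about 6 s
```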
01:23:14.160 | All these things, it's like, is it a feature
01:23:16.220 | or is it a bug, right?
01:23:17.580 | So one of the interesting things that's been discovered
01:23:19.540 | about fMRI is it's not so tightly related
01:23:23.140 | to the spiking of the neurons.
01:23:25.900 | So we tend to think of the computation, so to speak,
01:23:29.060 | as being driven by spikes,
01:23:30.980 | meaning like there's just a burst of,
01:23:32.940 | it's either on or it's off,
01:23:34.540 | and the neuron's like going up or down.
01:23:36.500 | But sometimes what you can have is these states
01:23:41.060 | where the neuron becomes a little bit more excitable
01:23:43.720 | or less excitable.
01:23:45.460 | And so fMRI is very sensitive
01:23:47.700 | to those changes in excitability.
01:23:50.420 | Actually, one of the fascinating things about fMRI
01:23:54.100 | is where does that, how is it we go from neural activity
01:23:58.820 | to essentially blood flow to oxygen, all this stuff.
01:24:03.820 | It's such a long chain of going from neural activity
01:24:06.900 | to magnetic fields.
01:24:08.580 | And one of the theories that's out there
01:24:10.380 | is most of the cells in the brain are not neurons.
01:24:14.620 | They're actually these support cells called glial cells.
01:24:17.540 | And one big one is astrocytes,
01:24:19.200 | and they play this big role in regulating,
01:24:22.180 | kind of being a middle man, so to speak, with the neurons.
01:24:25.140 | So if you, for instance, one neuron's talking to another,
01:24:28.860 | you release a neurotransmitter, like let's say glutamate,
01:24:31.940 | and that gets another neuron,
01:24:33.660 | starts getting active after you release it
01:24:36.900 | in the gap between the two neurons called synapse.
01:24:39.740 | So what's interesting is if you leave that,
01:24:43.380 | imagine you're just flooded with this liquid in there.
01:24:46.820 | If you leave it in there too long,
01:24:48.500 | you just excite the other neuron too much,
01:24:50.440 | and you can start to basically get seizure activity.
01:24:52.400 | You don't want this, so you gotta suck it up.
01:24:54.800 | And so actually what happens is these astrocytes,
01:24:57.880 | one of their functions is to suck up the glutamate
01:25:01.440 | from the synapse, and that is a massively,
01:25:04.720 | and then break it down and then feed it back
01:25:06.820 | into the neuron so that you can reuse it.
01:25:08.960 | But that cycling is actually very energy-intensive.
01:25:12.280 | And what's interesting is at least according to one theory,
01:25:15.440 | they need to work so quickly that they're working
01:25:18.900 | on metabolizing the glucose that comes in
01:25:21.820 | without using oxygen, kind of like anaerobic metabolism.
01:25:26.540 | So they're not using oxygen as fast
01:25:30.140 | as they are using glucose.
01:25:32.140 | So what we're really seeing in some ways may be in fMRI,
01:25:37.060 | not the neurons themselves being active,
01:25:39.820 | but rather the astrocytes,
01:25:42.220 | which are meeting the metabolic demands
01:25:44.380 | of the process of keeping the whole system going.
01:25:47.120 | - It does seem to be that fMRI is a good way
01:25:50.800 | to study activation.
01:25:52.040 | So with these astrocytes, even though there's a latency,
01:25:57.040 | it's pretty reliably coupled to the activations.
01:26:01.120 | - Oh, well, this gets me to the other part of it.
01:26:03.080 | So now let's say, for instance,
01:26:05.120 | if I'm just kind of like, I'm talking to you,
01:26:06.960 | but I'm kind of paying attention to your cowboy hat,
01:26:09.120 | right, so I'm looking off to the,
01:26:10.240 | or I'm thinking about it, right,
01:26:11.900 | even if I'm not looking at it.
01:26:13.540 | What you'd see is that there'd be this little elevation
01:26:16.840 | in activity in areas in the visual cortex,
01:26:21.520 | which process vision, around that point in space, okay?
01:26:26.520 | So if then something happened,
01:26:28.960 | like all of a sudden a light flashed in that part of,
01:26:32.440 | right in front of your cowboy hat,
01:26:34.360 | I would have a bigger response to it.
01:26:36.880 | But what you see in fMRI is even if I'm not,
01:26:39.160 | even if I don't see that flash of light,
01:26:40.800 | there's a lot of activity that I can measure
01:26:43.440 | because you're kind of keeping it excitable
01:26:45.460 | and that in and of itself,
01:26:46.780 | even though I'm not seeing anything there
01:26:49.780 | that's particularly interesting,
01:26:51.540 | there's still this increase in activity.
01:26:53.580 | And so it's more sensitive with fMRI.
01:26:55.420 | So there, is that a feature or is it a bug?
01:26:57.620 | You know, some people,
01:26:58.460 | people who study spikes in neurons would say,
01:27:00.900 | well, that's terrible, we don't want that, you know?
01:27:03.940 | Likewise, it's slow and that's terrible
01:27:07.040 | for measuring things that are very fast.
01:27:09.740 | But one of the things that we've found in our work
01:27:12.260 | is when we give people movies
01:27:13.760 | and when we give people stories to listen to,
01:27:17.320 | a lot of the action is in the very, very slow stuff.
01:27:20.300 | It's in, because if you're thinking about like a story,
01:27:23.800 | let's say you're listening to a podcast
01:27:26.400 | or you're listening to the Lex Friedman podcast, right?
01:27:29.000 | You're putting this stuff together
01:27:30.680 | and building this internal model over several seconds,
01:27:34.000 | which is basically, we filter that out
01:27:36.080 | when we look at electrical activity in the brain
01:27:38.440 | because we're interested in this millisecond scale.
01:27:40.320 | It's almost massive amounts of information, right?
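As a rough sketch of that timescale point (frequencies, cutoff, and amplitudes below are arbitrary choices, not values from the conversation): a signal with both slow, story-level structure and fast activity, where the kind of high-pass filtering that is routine for fast electrical recordings removes most of the slow component.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Illustrative only: a "recording" with slow narrative-scale structure plus fast
# activity. High-pass filtering keeps the fast part and throws away the slow part.
fs = 1000.0                                   # samples per second
t = np.arange(0, 30, 1 / fs)                  # 30 seconds
slow = np.sin(2 * np.pi * 0.05 * t)           # ~20-second fluctuation (story level)
fast = 0.3 * np.sin(2 * np.pi * 40 * t)       # 40 Hz fast activity
signal = slow + fast

sos = butter(4, 1.0, btype="highpass", fs=fs, output="sos")   # 1 Hz high-pass
filtered = sosfiltfilt(sos, signal)

print(f"variance of the slow component before filtering: {np.var(slow):.3f}")
print(f"variance of what's left of it after filtering:    {np.var(filtered - fast):.3f}")
```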
01:27:43.420 | So the way I see it is every technique
01:27:46.480 | gives you a little limited window into what's going on.
01:27:49.980 | fMRI has huge problems.
01:27:51.500 | You know, people lie down in the scanner.
01:27:53.860 | There's parts of the brain where you,
01:27:55.420 | I'll show you in some of these images
01:27:57.100 | where you'll see kind of gaping holes
01:27:59.380 | because you can't keep the magnetic field stable
01:28:03.260 | in those spots.
01:28:04.760 | You'll see parts where it's like there's a vein
01:28:06.940 | and so it just produces big increase
01:28:09.260 | and decrease in signal or respiration
01:28:11.160 | that causes these changes.
01:28:12.360 | There's lots of artifacts and stuff like that, you know.
01:28:15.280 | Every technique has its limits.
01:28:17.200 | If I'm lying down in an MRI scanner, I'm lying down.
01:28:20.120 | I'm not interacting with you in the same way
01:28:23.400 | that I would in the real world.
01:28:25.160 | But at the same time, I'm getting data
01:28:27.940 | that I might not be able to get otherwise.
01:28:30.320 | And so different techniques
01:28:31.800 | give you different kinds of advantages.
01:28:33.480 | - What kind of big scientific discoveries,
01:28:35.880 | maybe the flavor of discoveries have been done
01:28:39.160 | throughout the history of the science of memory,
01:28:41.860 | the studying of memory?
01:28:43.100 | What kind of things have been understood?
01:28:48.100 | - Oh, there's so many.
01:28:49.700 | It's really so hard to summarize it.
01:28:52.740 | I mean, I think it's funny because it's like
01:28:55.220 | when you're in the field,
01:28:56.260 | you can get kind of blase about this stuff.
01:28:58.460 | But then once I started to write the book,
01:29:00.020 | I was like, oh my God, this is really interesting.
01:29:02.260 | How did we do all this stuff?
01:29:07.600 | I would say that some of the,
01:29:09.460 | I mean, from the first studies,
01:29:11.060 | just showing how much we forget is very important.
01:29:14.900 | Showing how much schemas,
01:29:16.900 | which is our organized knowledge about the world,
01:29:19.500 | increase our ability to remember information,
01:29:23.460 | just massively increase it.
01:29:25.900 | Studies of expertise showing how experts like chess experts
01:29:29.980 | can memorize so much in such a short amount of time
01:29:33.080 | because of the schemas they have for chess.
01:29:36.260 | But then also showing that those lead
01:29:38.040 | to all sorts of distortions in memory.
01:29:40.520 | The discovery that the act of remembering
01:29:43.440 | can change the memory, can strengthen it,
01:29:46.080 | but it can also distort it
01:29:47.840 | if you get misinformation at the time.
01:29:50.240 | And it can also strengthen or weaken other memories
01:29:53.100 | that you didn't even recall.
01:29:54.880 | So just this whole idea of memory as an ecosystem,
01:29:57.320 | I think was a big discovery.
01:29:58.940 | I could go, this idea of like breaking up
01:30:03.440 | our continuous experience into these discrete events,
01:30:07.580 | I think was a major discovery.
01:30:09.140 | - So the discreteness of our encoding of events?
01:30:11.980 | - Maybe, yeah.
01:30:12.820 | I mean, you know, and again,
01:30:13.700 | there's controversial ideas about this, right?
01:30:15.940 | But it's like, yeah, this idea that,
01:30:18.140 | and this gets back to just this common experience
01:30:20.900 | of you walk into the kitchen and you're like,
01:30:22.980 | why am I here?
01:30:23.900 | And you just end up grabbing some food from the fridge.
01:30:26.100 | And then you go back and you're like,
01:30:27.260 | oh, wait a minute, I left my watch in the kitchen.
01:30:29.420 | That's what I was looking for.
01:30:31.140 | And so what happens is,
01:30:33.040 | is that you have a little internal model
01:30:34.880 | of where you are, what you're thinking about.
01:30:37.240 | And when you cross from one room to another,
01:30:39.720 | those models get updated.
01:30:41.520 | And so now when you're in the kitchen,
01:30:42.960 | you have to go back and mentally time travel
01:30:45.240 | back to this earlier point
01:30:47.200 | to remember what it was that you went there for.
01:30:50.360 | And so these event boundaries,
01:30:51.960 | turns out like in our research,
01:30:54.120 | and again, I don't wanna make it sound like
01:30:55.280 | we've figured out everything,
01:30:56.600 | but in our research,
01:30:57.480 | one of the things that we found is,
01:31:00.260 | is that basically as people get older,
01:31:04.100 | the activity in the hippocampus at these event boundaries
01:31:07.880 | tends to go down.
01:31:09.100 | But independent of age,
01:31:13.140 | if I give you outside of the scanner,
01:31:15.180 | you're done with the scanner,
01:31:16.020 | I just scan you while you're watching a movie,
01:31:17.660 | you just watch it.
01:31:18.840 | You come out, I give you a test of memory for stories.
01:31:22.380 | What happens is you find this incredible correlation
01:31:25.940 | between the activity in the hippocampus
01:31:28.420 | at these singular points in time, these event boundaries,
01:31:32.100 | and your ability to just remember a story
01:31:35.180 | outside of the scanner later on.
01:31:36.580 | So it's marking this ability to encode memories,
01:31:39.500 | just these little snippets of neural activity.
01:31:42.660 | So I think that's a big one.
01:31:44.820 | There's all sorts of work in animal models
01:31:47.540 | that I can get into, sleep,
01:31:50.020 | I think there's so much interesting stuff
01:31:52.080 | that's being discovered in sleep right now.
01:31:55.420 | Being able to just record from large populations of cells
01:32:00.420 | and then be able to relate that.
01:32:02.580 | You know what I think the coolest thing
01:32:03.980 | gets back to this QR code thing,
01:32:05.780 | because what we can do now is I can take fMRI data
01:32:10.780 | while you're watching a movie.
01:32:12.700 | Or let's do better than that.
01:32:14.340 | Let me get fMRI data while you use a joystick
01:32:16.700 | to move around in virtual reality.
01:32:18.740 | So you're in the metaverse, whatever, right?
01:32:20.540 | But it's kind of a crappy metaverse
01:32:21.980 | 'cause there's only so much metaverse
01:32:23.340 | you can do in an MRI scanner.
01:32:24.900 | So you're doing this crappy metaversing.
01:32:27.340 | So now I can take a rat, record from his hippocampus
01:32:32.060 | and prefrontal cortex and all these areas
01:32:34.260 | with these really new electrodes,
01:32:35.700 | you get massive amounts of data,
01:32:37.700 | and have it move around on a trackball in virtual reality
01:32:42.140 | in the same metaverse that I did
01:32:44.420 | and record that rat's activity.
01:32:46.380 | I can get a person with epilepsy
01:32:48.520 | who we have electrodes in their brain anyway
01:32:50.620 | to try to figure out where the seizures are coming from.
01:32:53.340 | And if it's a healthy part of the brain,
01:32:54.580 | record from that person, right?
01:32:57.140 | And I can get a computational model.
01:33:00.220 | And one of the brand new members in my lab,
01:33:03.500 | Tyler Bond, is just doing some great stuff.
01:33:05.300 | He relates computer vision models
01:33:08.420 | and looks at the weaknesses of computer vision models
01:33:10.620 | and relates it to what the brain does well.
01:33:12.820 | And so you can actually take a ground truth
01:33:20.300 | code for the metaverse, basically,
01:33:22.380 | and you can feed in the visual information,
01:33:25.220 | let's say the sensory information
01:33:26.500 | or whatever that's coming in,
01:33:28.220 | to a computational model that's designed
01:33:31.300 | to take real-world inputs, right?
01:33:34.020 | And you could basically tie them all together
01:33:37.300 | by virtue of the state spaces that you're measuring
01:33:40.520 | in neural activity, in these different formats,
01:33:43.180 | in these different species,
01:33:44.340 | and in the computational model,
01:33:46.140 | which I just find that mind-blowing.
01:33:48.540 | You could do different kinds of analyses on language
01:33:52.340 | and basically come up with just like,
01:33:54.300 | basically, it's the guts of LLMs, right?
01:33:56.540 | You could do analyses on language
01:34:00.860 | and you could do analyses on sentiment analyses
01:34:04.660 | of emotions and so forth.
01:34:05.860 | Put all this stuff together.
01:34:07.980 | I mean, it's almost too much.
01:34:10.260 | But if you do it right and you do it in a theory-driven way,
01:34:14.460 | as opposed to just throwing all the data at the wall
01:34:16.780 | and see what sticks,
01:34:17.740 | I mean, that to me is just exceptionally powerful.
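One common way to tie recordings in different formats together, in the spirit of what is described here though not necessarily this lab's exact method, is representational similarity analysis: for the same set of conditions (say, locations visited in the shared virtual environment), build each system's condition-by-condition dissimilarity matrix and then correlate those matrices. The data shapes and numbers below are random stand-ins.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Random stand-in data: the same set of conditions measured three different ways.
rng = np.random.default_rng(1)
n_conditions = 40

human_fmri  = rng.normal(size=(n_conditions, 500))   # conditions x voxels
rat_spikes  = rng.normal(size=(n_conditions, 120))   # conditions x neurons
model_units = rng.normal(size=(n_conditions, 256))   # conditions x model units

def rdm(activity):
    """Condition-by-condition dissimilarities (upper triangle, flattened)."""
    return pdist(activity, metric="correlation")

pairs = {"human fMRI vs rat spikes": (human_fmri, rat_spikes),
         "human fMRI vs model units": (human_fmri, model_units)}
for name, (a, b) in pairs.items():
    rho, _ = spearmanr(rdm(a), rdm(b))
    print(f"{name}: representational similarity rho = {rho:.2f}")
```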
01:34:20.660 | - So you can take fMRI data across species
01:34:25.260 | and across different types of humans
01:34:27.300 | or conditions of humans, and
01:34:30.180 | construct models that help you find the commonalities
01:34:36.220 | or like the core thing that makes somebody navigate
01:34:39.780 | through the metaverse, for example?
01:34:41.340 | - Yeah, yeah, I mean, more or less.
01:34:42.860 | I mean, there's a lot of details, but yes.
01:34:45.100 | I think, and not just fMRI,
01:34:46.660 | but you can relate it to, like I said,
01:34:48.820 | recordings from large populations of neurons
01:34:51.780 | that could be taken in a human
01:34:53.020 | or even in a non-human animal
01:34:55.860 | that is where you think it's an anatomical homologue.
01:35:00.580 | So that's just mind-blowing to me.
01:35:02.980 | - What's the similarities in humans and mice?
01:35:06.420 | (Lex laughing)
01:35:08.140 | That's what, Smashing Pumpkins,
01:35:10.460 | we're all just rats in a cage.
01:35:12.220 | Is that Smashing Pumpkins?
01:35:13.740 | - Despite all of your rage.
01:35:15.660 | - Is that Smashing Pumpkins, I think?
01:35:17.340 | - Despite all of your rage at GIFs,
01:35:19.380 | you're still just a rat in a cage.
01:35:21.260 | (Lex laughing)
01:35:22.100 | - Oh, yeah.
01:35:22.940 | All right, good callback.
01:35:23.760 | - Good callback.
01:35:24.600 | See, these memory retrieval exercises I'm doing
01:35:26.860 | are actually helping you build
01:35:28.860 | a lasting memory of this conversation.
01:35:31.420 | - And it's strengthening the visual thing I have of you
01:35:34.940 | with James Brown on stage.
01:35:36.260 | (Lex laughing)
01:35:37.100 | It's just becoming stronger and stronger by the second.
01:35:40.380 | But anyway. - It's hot.
01:35:41.780 | (both laughing)
01:35:42.780 | - But animal studies work here as well?
01:35:45.100 | - Yeah, yeah.
01:35:45.940 | So, okay, so let's go to the...
01:35:47.900 | So, I think, I've got great colleagues who I talk to
01:35:52.620 | who study memory in mice.
01:35:55.100 | And there's some,
01:35:56.060 | one of the valuable things in those models
01:35:59.420 | is you can study neural circuits
01:36:01.760 | in an enormously targeted way
01:36:03.380 | because you could do these genetic studies, for instance,
01:36:06.340 | where you can manipulate particular groups of neurons.
01:36:10.540 | And it's just getting more and more targeted
01:36:12.460 | to the point where you can actually turn on
01:36:15.460 | particular kind of memory
01:36:17.700 | just by activating a particular set of neurons
01:36:20.820 | that was active during an experience, right?
01:36:23.380 | So, there's a lot of conservation
01:36:27.140 | of some of these neural circuits
01:36:28.780 | across evolution in mammals, for instance.
01:36:33.140 | And then some people would even say
01:36:35.260 | that there's genetic mechanisms for learning
01:36:37.180 | that are conserved even going back far, far before.
01:36:40.900 | But let's go back to the mice and humans question, right?
01:36:44.740 | There's a lot of differences.
01:36:45.940 | So, for one thing,
01:36:46.860 | the sensory information is very different.
01:36:49.260 | Mice and rats explore the world
01:36:52.340 | largely through smelling, olfaction.
01:36:56.460 | But they also have vision
01:36:57.660 | that's kind of designed to kind of catch death from above.
01:37:00.260 | So, it's like a very big view of the world.
01:37:02.900 | And we move our eyes around
01:37:05.040 | in a way that focuses on particular spots in space
01:37:08.380 | where you get very high resolution
01:37:10.260 | from a very limited set of spots in space.
01:37:12.540 | So, that makes us very different in that way.
01:37:15.820 | We also have all these other structures as social animals
01:37:18.780 | that allow us to respond differently.
01:37:22.580 | There's language, there's like, you know, so you name it,
01:37:26.140 | there's obviously gobs of differences.
01:37:28.060 | Humans aren't just giant rats.
01:37:29.540 | There's much more complexity to us.
01:37:30.980 | Time scales are very important.
01:37:32.940 | So, primate brains and human brains
01:37:35.900 | are especially good at integrating
01:37:38.620 | and holding on to information
01:37:41.300 | across longer and longer periods of time, right?
01:37:44.340 | And also, you know, finally,
01:37:46.260 | it's like our history of training data, so to speak,
01:37:48.980 | is very, very different than, you know,
01:37:51.260 | I mean, a human's world is very different
01:37:53.860 | than a wild mouse's world.
01:37:56.620 | And a lab mouse's world is extraordinarily impoverished
01:37:59.740 | relative to an adult human, you know?
01:38:01.780 | - But still, what can you understand by studying mice?
01:38:03.820 | I mean, just basic, almost behavioral stuff about memory?
01:38:07.460 | - Well, yes, but that's very important, right?
01:38:10.260 | So, you can understand, for instance,
01:38:12.540 | how do neurons talk to each other?
01:38:14.780 | That's a really big, big question.
01:38:17.580 | Neural computation, in and of itself,
01:38:19.700 | you'd think it's the most simple question, right?
01:38:22.180 | Not at all.
01:38:23.020 | I mean, it's a big, big question.
01:38:25.180 | And understanding how two parts of the brain interact,
01:38:30.180 | meaning that it's not just one area speaking.
01:38:33.020 | It's not like, you know, it's not like Twitter,
01:38:35.140 | where one area of the brain is shouting,
01:38:36.940 | and then another area of the brain's
01:38:38.380 | just stuck listening to this crap.
01:38:40.220 | It's like they're actually interacting
01:38:42.220 | on a millisecond scale, right?
01:38:43.900 | How does that happen?
01:38:45.340 | And how do you regulate those interactions,
01:38:47.540 | these dynamic, you know, interactions?
01:38:50.500 | We're still figuring that out,
01:38:51.780 | but that's gonna be coming largely from model systems
01:38:54.740 | that are easier to understand.
01:38:57.660 | You can do manipulations, like drug manipulations,
01:39:00.620 | to manipulate circuits and, you know,
01:39:03.260 | use viruses and so forth and lasers to turn on circuits
01:39:06.620 | that you just can't do in humans.
01:39:08.900 | So I think there's a lot that can be learned from mice.
01:39:11.940 | There's a lot that can be learned from non-human primates.
01:39:14.740 | And there's a lot that you need to learn from humans.
01:39:17.340 | And I think, unfortunately,
01:39:19.380 | some of the people in the National Institutes of Health
01:39:22.580 | think you can learn everything from the mouse.
01:39:24.540 | It's like, why study memory in humans
01:39:26.780 | when I could study learning in a mouse?
01:39:28.300 | And just like, oh my God,
01:39:30.060 | I'm gonna get my funding from somewhere else.
01:39:32.220 | So... (laughs)
01:39:34.140 | - Well, let me ask you some random fascinating questions.
01:39:36.180 | - Yeah, sure.
01:39:37.020 | - How does deja vu work?
01:39:40.860 | - So deja vu is, it's actually,
01:39:44.940 | one of these things I think that some of the surveys
01:39:47.660 | suggest that like 75% of people report
01:39:50.620 | having a deja vu experience one time or another.
01:39:53.340 | I don't know where that came from,
01:39:54.540 | but I've polled people in my class
01:39:56.860 | and most of them say they've experienced deja vu.
01:40:00.260 | It's this kind of sense
01:40:01.420 | that I've experienced this moment sometime before,
01:40:04.260 | I've been here before.
01:40:06.060 | And actually, there's all sorts of variants of this.
01:40:08.620 | The French have all sorts of names
01:40:10.020 | for various versions of this, the jamais vu,
01:40:13.100 | presque vu, I don't know. (laughs)
01:40:15.820 | All these different vus.
01:40:17.380 | But deja vu is the sense that it can be like
01:40:22.260 | almost disturbing, intense sense of familiarity.
01:40:26.100 | So there is a researcher named Wilder Penfield.
01:40:29.820 | Actually, this goes back even earlier
01:40:31.420 | to some of the earliest,
01:40:32.700 | like Hughlings Jackson was this neurologist
01:40:35.260 | who did a lot of the early characterizations of epilepsy.
01:40:39.460 | And one of the things he notices in epilepsy patients,
01:40:42.620 | some group of them, right before they would get a seizure,
01:40:45.940 | they would have this intense sense of deja vu.
01:40:49.140 | So it's this artificial sense of familiarity.
01:40:52.980 | It's a sense of having a memory that's not there, right?
01:40:57.860 | And so what was happening was there's electrical activity
01:41:01.100 | in certain parts of these brains.
01:41:02.260 | So this guy Penfield later on,
01:41:04.260 | when he was trying to look for
01:41:07.220 | how do we map out the brain to figure out
01:41:09.740 | which parts we wanna remove and which parts don't we,
01:41:12.540 | he would stimulate parts of the temporal lobes of the brain
01:41:15.260 | and find you could elicit the sense of deja vu.
01:41:17.740 | Sometimes you'd actually get a memory
01:41:19.740 | that a person would re-experience
01:41:21.100 | just from electrically stimulating some parts.
01:41:23.940 | Sometimes though, they just have this intense feeling
01:41:26.620 | of being somewhere before.
01:41:28.020 | And so one theory which I really like is,
01:41:32.100 | is that in higher order areas of the brain,
01:41:35.180 | they're integrating from many, many different
01:41:38.220 | sources of input.
01:41:39.820 | What happens is is that they're tuning themselves up
01:41:43.860 | every time you process a similar input, right?
01:41:47.260 | And so that allows you to just get
01:41:50.120 | this kind of a fluent sense that I'm very familiar,
01:41:53.340 | you're very familiar with this place, right?
01:41:55.380 | And so just being here,
01:41:57.180 | you're not going to be moving your eyes all over the place
01:41:59.740 | 'cause you kind of have an idea of where everything is.
01:42:01.740 | And that fluency gives you a sense of like, I'm here.
01:42:04.660 | Now I wake up in my hotel room
01:42:06.660 | and I have this very unfamiliar sense of where I am, right?
01:42:10.580 | But there's a great set of studies done by Ann Cleary
01:42:13.780 | at Colorado State,
01:42:15.220 | where she created these virtual reality environments.
01:42:17.740 | And we'll go back to the metaverse.
01:42:19.740 | Imagine you go through a virtual museum, right?
01:42:23.780 | And then she would put people in virtual reality
01:42:25.940 | and have them go through a virtual arcade.
01:42:29.460 | But the map of the two places was exactly the same.
01:42:32.300 | She just put different skins on them.
01:42:34.020 | So one looks different than the other,
01:42:35.700 | but they've got same landmarks and the same places,
01:42:38.260 | same objects and everything,
01:42:39.420 | but carpeting, colors, theme, everything's different.
01:42:43.000 | People will often not have any conscious idea
01:42:46.700 | that the two are the same,
01:42:48.340 | but they could report this very intense sense of deja vu.
01:42:52.140 | So it's like a partial match
01:42:53.740 | that's eliciting this kind of a sense of familiarity.
01:42:56.980 | And that's why in patients who have epilepsy
01:43:01.140 | that affects memory,
01:43:02.540 | you get this artificial sense of familiarity that happens.
01:43:06.420 | And so we think that,
01:43:08.060 | and again, this is just one theory amongst many,
01:43:10.660 | but we think that we get a little bit of that feeling.
01:43:14.340 | It's not enough to necessarily give you deja vu,
01:43:17.660 | even for very mundane things, right?
01:43:20.540 | So it's like if I tell you the word rutabaga,
01:43:24.580 | your brain's gonna work a little bit harder to catch it
01:43:27.040 | than if I give you a word like apple, right?
01:43:30.180 | And that's 'cause you hear apple lots.
01:43:32.100 | Your brain's very tuned up to process it efficiently,
01:43:34.500 | but rutabaga takes a little bit longer and more intense.
01:43:37.300 | And you can actually see a difference in brain activity
01:43:40.580 | in areas in the temporal lobe when you hear a word
01:43:42.820 | just based on how frequent it is in the English language.
01:43:46.400 | So we think it's tied to this basic,
01:43:49.840 | it's basically a by-product of our mechanism
01:43:53.060 | of just learning, doing this error-driven learning
01:43:55.960 | as we go through life to become better and better and better
01:43:58.620 | to process things more and more efficiently.
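One family of memory models treats familiarity as a global match between the current input and everything stored in memory. Here is a rough sketch in that spirit (loosely inspired by models like MINERVA 2, not the specific theory described above), with invented features and numbers: a probe that partially matches many stored experiences, like a new place with an old layout, can produce strong familiarity even though no single trace matches well enough to be recollected.

```python
import numpy as np

# All traces, probes, and numbers are illustrative stand-ins.
rng = np.random.default_rng(2)
n_traces, n_features = 200, 64

prototype = rng.choice([-1.0, 1.0], size=n_features)   # e.g., a layout you've seen many times

def noisy_copy(vec, flip_prob=0.4):
    flips = rng.random(len(vec)) < flip_prob
    return np.where(flips, -vec, vec)

memory = np.array([noisy_copy(prototype) for _ in range(n_traces)])  # many related experiences

def probe_memory(probe):
    sims = memory @ probe / n_features          # similarity of the probe to each trace
    familiarity = float((sims ** 3).sum())      # global "echo" summed over all traces
    best_match = float(sims.max())              # how well any single trace matches
    return familiarity, best_match

probes = {
    "exact repeat": memory[0],
    "brand new": rng.choice([-1.0, 1.0], size=n_features),
    "partial match (same layout, new skin)": prototype,
}
for name, probe in probes.items():
    fam, best = probe_memory(probe)
    print(f"{name:40s} familiarity={fam:6.2f}  best single match={best:.2f}")
```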
01:44:00.860 | - So I guess deja vu is just thinking extra elevated,
01:44:04.340 | the stuff coming together,
01:44:06.680 | firing for this artificial memory as if it's the real memory.
01:44:10.780 | I mean, why does it feel so intense?
01:44:15.220 | - Well, it doesn't happen all the time,
01:44:16.800 | but I think what may be happening is it's such a,
01:44:20.280 | it's a partial match to something that we have,
01:44:22.960 | and it's not enough to trigger that sense of,
01:44:25.580 | you know, that ability to pull together all the pieces,
01:44:28.760 | but it's a close enough match to give you
01:44:30.920 | that intense sense of familiarity
01:44:33.360 | without the recollection of exactly what happened when.
01:44:36.980 | - But it's also like a spatiotemporal familiarity.
01:44:41.240 | So like, it's also in time.
01:44:43.240 | Like, there's a weird blending of time that happens.
01:44:46.660 | And we'll probably talk about time,
01:44:49.080 | 'cause I think that's a really interesting idea,
01:44:50.720 | how time relates to memory.
01:44:52.420 | But you also kind of, artificial memory brings to mind
01:44:57.420 | this idea of false memories
01:45:00.480 | that comes in all kinds of contexts,
01:45:02.740 | but how do false memories form?
01:45:05.860 | - Well, I like to say there's no such thing
01:45:08.160 | as true or false memories, right?
01:45:10.200 | It's like Johnny Rotten from the Sex Pistols,
01:45:13.120 | he had a saying that's like,
01:45:13.960 | "I don't believe in false memories
01:45:15.440 | "any more than I believe in false songs," right?
01:45:18.120 | It's like, and so the basic idea is that
01:45:21.720 | we have these memories that reflect bits and pieces
01:45:24.420 | of what happened, as well as our inferences and theories,
01:45:28.240 | right, so I'm a scientist and I collect data,
01:45:30.920 | but I use theories to make sense of that data.
01:45:34.880 | And so a memory is kind of a mix of all these things.
01:45:38.260 | So where memories can go off the deep end
01:45:40.400 | and become what we would call conventionally
01:45:42.320 | as false memories, are sometimes little distortions
01:45:47.320 | where we filled in the blanks, the gaps in our memory,
01:45:50.880 | based on things that we know,
01:45:52.840 | but don't actually correspond to what happened, right?
01:45:55.780 | So if I were to tell you that I'm like, you know,
01:46:02.520 | a story about this person who's like,
01:46:05.800 | worried that they have cancer or something like that,
01:46:08.680 | and then, you know, they see a doctor and the doctor says,
01:46:11.560 | "Well, things are very much like you would have expected,"
01:46:14.520 | or like, you know, "what you were afraid of," or something.
01:46:17.280 | When people remember that, they'll often remember,
01:46:19.200 | "Well, the doctor told the patient that he had cancer,"
01:46:23.120 | even if that wasn't in the story,
01:46:24.560 | because they're infusing meaning into that story, right?
01:46:27.840 | So that's a minor distortion.
01:46:29.960 | But what happens is, is that sometimes things
01:46:32.000 | can really get out of hand where people have trouble
01:46:35.800 | telling the difference in things that they've imagined
01:46:37.960 | versus things that happen, but also, as I told you,
01:46:41.040 | the act of remembering can change the memory.
01:46:44.920 | And so what happens then is you can actually be exposed
01:46:48.800 | to some misinformation.
01:46:50.160 | And so Elizabeth Loftus was a real pioneer in this work,
01:46:53.320 | and there's lots of other work that's been done since.
01:46:56.560 | But basically, it's like, if you remember some event,
01:47:00.120 | and then I tell you something about the event,
01:47:03.100 | later on, when you remember the event,
01:47:05.880 | you might remember some original information
01:47:08.040 | from the event, as well as some information
01:47:10.980 | about what I told you.
01:47:12.480 | And sometimes, if you're not able to tell the difference,
01:47:16.060 | that information that I told you gets mixed
01:47:18.200 | into the story that you had originally.
01:47:20.640 | So now, I give you some more misinformation,
01:47:23.040 | or you're exposed to some more information somewhere else,
01:47:25.640 | and eventually, your memory becomes totally detached
01:47:28.640 | from what happened.
01:47:30.000 | And so sometimes you can have cases where people,
01:47:33.800 | this is very rare, but you can do it in lab, too,
01:47:37.480 | where a significant, not everybody,
01:47:40.700 | but a chunk of people will fall for this,
01:47:43.580 | where you can give people misinformation
01:47:46.980 | about an event that never took place.
01:47:49.820 | And as they keep trying to remember that event more and more,
01:47:53.140 | what happens is they start to imagine,
01:47:54.620 | they start to pull up things
01:47:56.140 | from other experiences they've had,
01:47:58.220 | and eventually, they can stitch together a vivid memory
01:48:01.800 | of something that never happened.
01:48:03.940 | Because they're not remembering an event that happened,
01:48:07.300 | they're remembering the act
01:48:09.400 | of trying to remember what happened,
01:48:11.800 | and basically putting it together into the wrong story.
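A toy version of that updating idea, with an invented mixing weight and random stand-in vectors: if each act of remembering re-stores a blend of the retrieved trace and whatever post-event information is around, repeated retrievals gradually pull the memory away from the original event and toward the misinformation.

```python
import numpy as np

# Purely illustrative: random feature vectors and a made-up blending weight.
rng = np.random.default_rng(3)
n_features = 50

original_event = rng.normal(size=n_features)
misinformation = rng.normal(size=n_features)

def match(a, b):
    """Cosine similarity between two traces."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

memory = original_event.copy()
blend = 0.25   # fraction of post-event information mixed in at each retrieval (made up)
for retrieval in range(1, 9):
    memory = (1 - blend) * memory + blend * misinformation   # the re-stored trace
    print(f"after retrieval {retrieval}: "
          f"match to original = {match(memory, original_event):.2f}, "
          f"match to misinformation = {match(memory, misinformation):.2f}")
```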
01:48:14.920 | - So it's fascinating, 'cause this could probably happen
01:48:17.920 | at a collective level.
01:48:21.560 | Like, this is probably what successful
01:48:23.400 | propaganda machines aim to do,
01:48:25.640 | is creating false memory across thousands,
01:48:28.600 | if not millions, of minds.
01:48:30.320 | - Yeah, absolutely.
01:48:31.940 | I mean, this is exactly what they do.
01:48:34.480 | And so, all these kind of foibles of human memory
01:48:37.740 | get magnified when you start to have social interactions.
01:48:40.520 | There's a whole literature
01:48:41.800 | on something called social contagion,
01:48:44.280 | which is basically when misinformation spreads like a virus.
01:48:47.800 | Like, you remember the same thing that I did,
01:48:51.040 | but I give you a little bit of wrong information,
01:48:53.400 | then that becomes part of your story of what happened.
01:48:56.560 | Because once you and I share a memory,
01:48:59.260 | like I tell you about something I've experienced,
01:49:01.240 | and you tell me about your experience of the same event,
01:49:04.080 | it's no longer your memory or my memory, it's our memory.
01:49:07.560 | And so, now the misinformation spreads.
01:49:10.200 | And the more you trust someone,
01:49:12.040 | or the more powerful that person is,
01:49:14.600 | the more of a voice they have in shaping that narrative.
01:49:18.040 | And there's all sorts of interesting ways
01:49:21.640 | in which misinformation can happen.
01:49:23.040 | There's a great example of when John McCain
01:49:27.000 | and George Bush Jr. were in a primary,
01:49:32.080 | and there were these polls where they would do these,
01:49:35.440 | like, I guess they were like not robocalls,
01:49:37.600 | but real calls where they would poll voters.
01:49:40.240 | But they actually inserted some misinformation
01:49:43.760 | about McCain's beliefs on taxation, I think,
01:49:46.560 | and maybe it was something about illegitimate children.
01:49:48.520 | I don't really remember.
01:49:49.920 | But they included misinformation
01:49:52.120 | in the question that they asked.
01:49:53.720 | Like, how do you feel about the fact
01:49:56.480 | that he wants to do this or something?
01:49:58.600 | And so, people would end up becoming convinced
01:50:01.240 | he had these policy things or these personal things
01:50:05.000 | that were not true, just based on the polls
01:50:07.400 | that were being used.
01:50:08.560 | So, it was a case where, interestingly enough,
01:50:12.360 | the people who were using misinformation
01:50:16.240 | were actually ahead of the curve
01:50:18.080 | relative to the scientists who were trying
01:50:19.760 | to study these effects in memory.
01:50:21.520 | - Yeah, it's really interesting.
01:50:26.720 | So, it's not just about truth and falsehoods,
01:50:29.360 | like us as intelligent reasoning machines,
01:50:34.360 | but it's the formation of memories
01:50:37.680 | where they become, like, visceral.
01:50:39.680 | You can rewrite history.
01:50:41.400 | If you just look throughout the 20th century,
01:50:44.000 | some of the dictatorships with Nazi Germany,
01:50:47.600 | with the Soviet Union, effective propaganda machines
01:50:52.320 | can rewrite our conceptions of history,
01:50:55.080 | how we remember our own culture,
01:50:56.600 | our upbringing, all this kind of stuff.
01:50:59.120 | You could do quite a lot of damage in this way.
01:51:01.160 | And then, there's probably some kind
01:51:02.560 | of social contagion happening there.
01:51:05.120 | Like, certain ideas that may be initiated
01:51:10.440 | by the propaganda machine can spread faster than others.
01:51:13.200 | You could see that in modern day,
01:51:15.040 | certain conspiracy theories,
01:51:17.080 | there's just something about them
01:51:18.280 | that they are, like, really effective at spreading.
01:51:20.800 | There's something sexy about them to people,
01:51:24.360 | to where something about the human mind eats it up
01:51:28.560 | and then uses that to construct memories
01:51:31.800 | as if they almost were there to witness
01:51:35.600 | whatever the content of the conspiracy theory is.
01:51:38.240 | It's fascinating, 'cause once you feel
01:51:40.000 | like you remember a thing, I feel like there's a certainty.
01:51:44.040 | There's a, it emboldens you to, like, say stuff.
01:51:47.640 | Like, you really, like, it's not just you believe
01:51:51.080 | an idea's true or not, you're like,
01:51:53.520 | it's at the core of your being that you,
01:51:57.760 | you feel like you were there to watch the thing happen.
01:52:00.880 | - Yeah, I mean, there's so much in what you're saying.
01:52:03.880 | I mean, one of the things is that people's sense
01:52:07.160 | of collective identity is very much tied to shared memories.
01:52:11.720 | If we have a shared narrative of the past,
01:52:13.920 | or even better, if we have a shared past,
01:52:16.220 | we will feel more socially connected with each other,
01:52:19.000 | and I will feel part of this group.
01:52:20.660 | They're part of my tribe if I remember the same things
01:52:23.540 | in the same way.
01:52:24.820 | And you brought up this weaponization of history,
01:52:27.460 | and, you know, it really speaks to, I think,
01:52:29.660 | one of the parts of memory, which is that
01:52:31.900 | if you have a belief, you will find,
01:52:35.780 | and you have a goal in mind, you will find stuff in memory
01:52:39.040 | that aligns with it, and you won't see the parts
01:52:41.660 | in memory that don't.
01:52:43.140 | So a lot of the stories we put together
01:52:44.980 | are based on our perspectives, right?
01:52:47.740 | And so let's just zoom out for the moment
01:52:51.340 | from misinformation, take something even more fascinating,
01:52:55.620 | but not as scary.
01:52:57.780 | I was reading Viet Thanh Nguyen,
01:53:01.260 | and he wrote a book about the collective memory
01:53:04.260 | of the Vietnam War.
01:53:05.220 | He's a Vietnamese immigrant who was flown out
01:53:08.500 | after the war was over, and so he went back
01:53:11.460 | to his family to get their stories about the war,
01:53:14.660 | and they called it the American War,
01:53:17.500 | not the Vietnam War, right?
01:53:19.780 | And that just kind of blew my mind,
01:53:22.540 | having grown up in the US, and I've always heard about it
01:53:25.180 | as the Vietnam War, but of course they call it
01:53:26.980 | the American War, 'cause that's what happened.
01:53:28.580 | America came in, right?
01:53:30.300 | And that's based on their perspective,
01:53:32.620 | which is a very valid perspective.
01:53:35.800 | And so that just gives you this idea
01:53:39.700 | of the way we put together these narratives
01:53:42.200 | based on our perspectives.
01:53:44.020 | And I think the opportunities that we can have in memory
01:53:49.020 | is if we bring groups together from different perspectives
01:53:54.220 | and we allow them to talk to each other
01:53:56.420 | and we allow ourselves to listen.
01:53:58.140 | I mean, right now you'll hear a lot of just yammering,
01:54:02.620 | people going blah, blah, blah about free speech,
01:54:04.820 | but they just wanna listen to themselves, right?
01:54:06.980 | It's like, let's face it, the old days
01:54:09.620 | before people were supposedly woke,
01:54:11.620 | they were trying to ban two live crew or, you know,
01:54:14.780 | just think about, Lenny Bruce got canceled for cursing.
01:54:17.540 | Jesus Christ, you know?
01:54:19.300 | It's like, this is nothing new.
01:54:21.780 | People don't like to hear things that disagree with them.
01:54:25.020 | But if you're in a, I mean, you can see two situations
01:54:30.020 | in groups with memory.
01:54:32.700 | One situation is you have people who are very dominant
01:54:36.580 | who just take over the conversation.
01:54:38.540 | And they, basically what happens is the group
01:54:40.940 | remembers less from the experience
01:54:42.860 | and they remember more of what
01:54:44.420 | the dominant narrator says, right?
01:54:46.720 | Now, if you have a diverse group of people,
01:54:48.820 | and I don't mean diverse in necessarily
01:54:51.220 | the human resources sense of the word.
01:54:53.140 | I mean, diverse in any way you wanna take it, right?
01:54:55.540 | But diverse in every way, hopefully.
01:54:57.700 | And you give everyone a chance to speak
01:54:59.700 | and everyone's being appreciated
01:55:02.060 | for their unique contribution.
01:55:03.820 | You get more accurate memories
01:55:05.380 | and you get more information from it, right?
01:55:08.460 | Even two people who come from very similar backgrounds,
01:55:10.940 | if you can appreciate the unique contributions
01:55:13.380 | that each one has, you can do a better job
01:55:16.020 | of generating information from memory.
01:55:18.460 | And that's a way to inoculate ourselves,
01:55:20.580 | I believe, from misinformation in the modern world.
01:55:23.240 | But like everything else,
01:55:25.500 | it requires a certain tolerance for discomfort.
01:55:28.300 | And I think when we don't have much time
01:55:31.260 | and I think when we're stressed out
01:55:33.300 | and when we are just tired,
01:55:36.220 | it's very hard to tolerate discomfort.
01:55:38.980 | - And I mean, social media has a lot of opportunity
01:55:42.000 | for this because it enables
01:55:44.660 | this distributed one-on-one interaction
01:55:46.900 | that you're talking about where everybody has a voice,
01:55:49.500 | but still our natural inclination,
01:55:52.420 | you see this on social media,
01:55:53.820 | is there's a natural clustering of people and opinions
01:55:56.100 | and you just kind of form these kind of bubbles.
01:55:59.900 | I think that's, to me personally,
01:56:02.380 | I think that's a technology problem that could be solved.
01:56:04.460 | If there's a little bit of interaction,
01:56:07.940 | kind, respectful, compassionate interaction
01:56:09.940 | with people that have a very different memory,
01:56:13.020 | that respectful interaction will start
01:56:16.660 | to intermix the memories and ways of thinking
01:56:20.020 | to where you're slowly moving towards truth.
01:56:23.540 | But that's a technology problem
01:56:24.980 | because naturally, left to our own devices,
01:56:28.620 | we want to cluster up in a tribe.
01:56:30.980 | - Yeah, and that's the human problem.
01:56:33.220 | I think a lot of the problems that come up with technology
01:56:37.260 | aren't the technology itself
01:56:38.860 | as much as the fact that people adapt to the technology
01:56:42.820 | in maladaptive ways.
01:56:44.980 | I mean, one of my fears about AI is not what AI will do,
01:56:49.420 | but what people will do.
01:56:50.540 | I mean, take text messaging, right?
01:56:52.340 | It's like pain in the ass to text people, at least for me.
01:56:55.460 | And so what happens is the communication
01:56:57.500 | becomes very spartan and devoid of meaning, right?
01:57:00.700 | It's just very telegraphic.
01:57:02.580 | And that's people adapting to the medium, right?
01:57:05.020 | I mean, look at you.
01:57:05.860 | You've got this keyboard, right,
01:57:07.700 | that's got these dome-shaped things,
01:57:10.100 | and you've adapted to that to communicate, right?
01:57:12.820 | That's not the technology adapting to you.
01:57:14.900 | It's you adapting to the technology.
01:57:17.260 | And I think one of the things I learned
01:57:18.940 | when Google started to introduce autocomplete in emails,
01:57:22.620 | I started to use it, and about a third of the time,
01:57:25.300 | I was like, "This isn't what I want to say."
01:57:27.940 | A third of the time, I'd be like,
01:57:29.060 | "This is exactly what I wanted to say."
01:57:31.020 | And a third of the time, I was saying,
01:57:32.220 | "Well, this is good enough.
01:57:33.140 | "I'll just go with it," right?
01:57:35.500 | And so what happens is it's not that the technology
01:57:38.380 | necessarily is doing anything so bad,
01:57:41.060 | as much as it's just going to constrain my language
01:57:45.020 | because I'm just doing what's being suggested to me.
01:57:48.860 | And so this is why I say, kind of like my mantra
01:57:52.620 | for some of what I've learned about everything in memory
01:57:56.660 | is to diversify your training data, basically,
01:57:59.740 | because otherwise you're going to...
01:58:01.780 | So humans have this capability to be so much more creative
01:58:05.900 | than anything generative AI will put together,
01:58:08.460 | at least right now.
01:58:09.500 | Who knows where this goes?
01:58:11.060 | But it can also go the opposite direction
01:58:14.180 | where people could become much, much less creative
01:58:17.180 | if they just become more and more resistant to discomfort
01:58:22.180 | and resistant to exposing themselves to novelty,
01:58:26.300 | to cognitive dissonance, and so forth.
01:58:29.060 | - I think there is a dance
01:58:30.140 | between natural human adaptation of technology
01:58:32.820 | and the people that design the engineering
01:58:35.740 | of that technology.
01:58:37.060 | So I think there's a lot of opportunity
01:58:38.580 | to create, like this keyboard,
01:58:40.580 | things that are on net positive for human behavior.
01:58:45.300 | So we adapt and all this kind of stuff,
01:58:47.300 | but when you look at the long arc of history
01:58:51.100 | across years and decades,
01:58:52.700 | has humanity been flourishing?
01:58:55.860 | Are humans creating more awesome stuff?
01:58:58.660 | Are humans happier? All that kind of stuff.
01:59:00.580 | And so there, I think technology on net has been,
01:59:04.660 | and I think, maybe hope,
01:59:07.380 | will always be on net a positive thing.
01:59:10.860 | - Do you think people are happier now
01:59:12.300 | than they were 50 years ago or 100 years ago?
01:59:14.700 | - Yes, yes.
01:59:16.140 | - I don't know about that.
01:59:17.420 | - I think humans in general like to reminisce about the past,
01:59:22.420 | like the times were better. - That's true.
01:59:24.860 | - And complain about the weather today
01:59:27.580 | or complain about whatever today,
01:59:29.060 | 'cause we, there's this kind of complainy engine
01:59:32.980 | that just, there's so much pleasure in saying,
01:59:35.500 | you know, life sucks for some reason.
01:59:37.860 | (Lyle laughs)
01:59:39.620 | - That's why I love punk rock.
01:59:41.180 | - Exactly, I mean, there's something in humans
01:59:43.860 | that loves complaining, even about trivial things,
01:59:48.220 | but complaining about change, complaining about everything.
01:59:52.100 | But ultimately, I think on net,
01:59:55.260 | on every measure, things are getting better.
01:59:59.500 | Life is getting better.
02:00:00.940 | - Oh, life is getting better,
02:00:02.100 | but I don't know necessarily
02:00:03.220 | that it tracks people's happiness, right?
02:00:06.100 | I mean, I would argue that maybe, and who knows?
02:00:08.220 | I don't know this, but I wouldn't be surprised
02:00:10.340 | if people in hunter-gatherer societies are happier.
02:00:14.220 | I mean, I wouldn't be surprised if they're happier
02:00:16.220 | than people who have access to modern medicine
02:00:19.980 | and email and cell phones.
02:00:23.020 | - Well, I don't think there's a question
02:00:24.340 | whether you take hunter-gatherer folks
02:00:26.900 | and put them into modern day
02:00:28.540 | and give them enough time to adapt,
02:00:29.860 | they would be much happier.
02:00:31.060 | The question is in terms of every single problem they've had
02:00:35.260 | is now solved.
02:00:36.660 | There's now food, there is guarantee of survival,
02:00:39.340 | shelter, and all this kind of stuff.
02:00:40.540 | So what you're asking is a deeper
02:00:42.300 | sort of biological question.
02:00:44.180 | Do we want to be, oh, Werner Herzog in the movie
02:00:47.940 | "Happy People: A Year in the Taiga,"
02:00:49.940 | do we want to be busy 100% of our time
02:00:52.660 | hunting, gathering, surviving, worried about the next day?
02:00:57.380 | Maybe that constant struggle
02:01:00.540 | ultimately creates a more fulfilling life.
02:01:02.780 | I don't know, but I do know this modern society
02:01:06.720 | allows us to, when we're sick,
02:01:11.020 | to find medicine, to find cures.
02:01:13.780 | When we're hungry, to get food,
02:01:15.860 | much more than we did even 100 years ago.
02:01:19.740 | And there's many more activities.
02:01:22.500 | I think you could perform a creative,
02:01:24.060 | all this kind of stuff that enables the flourishing
02:01:27.620 | of humans at the individual level.
02:01:29.860 | Whether that leads to happiness,
02:01:31.540 | I mean, that's a very deep philosophical question.
02:01:34.180 | Maybe struggle, deep struggle is necessary for happiness.
02:01:39.180 | - Or maybe cultural connection.
02:01:43.300 | Maybe it's about functioning in social groups
02:01:47.300 | that are meaningful and having time.
02:01:49.900 | But I do think there's an interesting
02:01:51.560 | memory-related thing, which is that if you look
02:01:54.700 | at things like reinforcement learning, for instance,
02:01:57.820 | you're not learning necessarily every time
02:02:01.460 | you get a reward, if it's the same reward,
02:02:04.100 | you're not learning that much.
02:02:06.320 | You mainly learn if it deviates from your expectation
02:02:08.820 | of what you're supposed to get, right?
02:02:10.100 | So it's like, you get a paycheck every month
02:02:13.280 | from MIT or whatever, right?
02:02:15.140 | And it's like, you probably don't even get excited
02:02:18.700 | about it when you get the paycheck.
02:02:20.280 | But if they cut your salary, you're gonna be pissed.
02:02:22.500 | And if they increase your salary,
02:02:23.940 | oh, good, I got a bonus, you know?
02:02:26.140 | And that adaptation and that ability
02:02:30.220 | that basically you learn to expect these things,
02:02:33.820 | I think, is a major source of, I guess,
02:02:38.120 | it's a major way in which we're kind of more,
02:02:40.620 | in my opinion, wired to strive and not be happy,
02:02:45.060 | to be in a state of wanting.
02:02:46.460 | And so people talk about dopamine, for instance,
02:02:48.920 | being this pleasure chemical.
02:02:50.740 | And it's like, there's a lot of compelling research
02:02:53.420 | to suggest it's not about pleasure at all.
02:02:57.180 | It's about the discomfort that energizes you
02:03:01.100 | to get things, to seek a reward, right?
02:03:04.620 | And so you could give an animal
02:03:06.160 | that's been deprived of dopamine a reward,
02:03:08.700 | and oh, yeah, I enjoy it, it's pretty good.
02:03:11.600 | But they're not gonna do anything to get it, you know?
02:03:14.940 | And just one of the weird things in our research
02:03:17.900 | is I got into curiosity from a postdoc in my lab,
02:03:22.340 | Matthias Gruber, and one of the things that we found
02:03:25.780 | is when we gave people a question,
02:03:27.860 | like a trivia question that they wanted the answer to,
02:03:31.380 | that question, the more curious people were
02:03:34.380 | about the answer, the more activity
02:03:36.500 | in these dopamine-related circuits
02:03:38.460 | in the brain we would see.
02:03:39.740 | And again, that was not driven by the answer per se,
02:03:43.100 | but by the question.
02:03:44.740 | So it was not about getting the information,
02:03:47.460 | it was about the drive to seek the information.
02:03:50.280 | But it depends on how you take that.
02:03:53.660 | If you get this uncomfortable gap
02:03:55.500 | between what you know and what you want to know,
02:03:58.740 | you could either use that to motivate you and energize you,
02:04:01.660 | or you could use it to say,
02:04:03.420 | I don't wanna hear about this,
02:04:04.640 | this disagrees with my beliefs,
02:04:06.260 | I'm gonna go back to my echo chamber, you know?
02:04:08.660 | - Yeah.
02:04:10.820 | I like what you said, that maybe we're designed
02:04:14.900 | to be in a kind of constant state of wanting,
02:04:17.980 | which, by the way, is a pretty good either band name
02:04:21.340 | or rock song name, state of wanting.
02:04:25.420 | - That's like a hardcore band name, yeah, yeah, yeah.
02:04:28.220 | - Yeah, it's pretty good.
02:04:29.060 | - But I also like the hedonic treadmill.
02:04:31.180 | - Hedonic treadmill's pretty good.
02:04:32.580 | - Yeah, yeah, we could use that
02:04:33.700 | for like our techno project, I think.
02:04:37.060 | - You mean the one we're starting?
02:04:38.300 | - Yeah, exactly. - Okay, great.
02:04:39.900 | We're going on tour soon.
02:04:43.620 | (laughing)
02:04:45.580 | This is our announcement.
02:04:47.540 | - We could build a false memory of a show, in fact,
02:04:49.980 | if you want, let's just put it all together.
02:04:51.740 | So we don't even have to do all the work to play the show,
02:04:54.500 | we can just create a memory of it and it might as well
02:04:56.860 | happen, 'cause the remembering self is in charge anyway.
02:05:00.300 | - So let me ask you about, we talked about false memories,
02:05:02.700 | but in the legal system, false confessions.
02:05:05.660 | I remember reading 1984 where,
02:05:09.460 | sorry for the dark turn of our conversation,
02:05:12.280 | but through torture, you can make people say anything
02:05:17.280 | and essentially remember anything.
02:05:20.460 | I wonder to what degree there's truth to that,
02:05:22.500 | if you look at the torture that happened in the Soviet Union
02:05:26.020 | for confessions, all that kind of stuff.
02:05:28.140 | How much can you really get people to really,
02:05:31.460 | yeah, to force false memories, I guess?
02:05:36.240 | - Yeah, I mean, I think there's a lot of history of this
02:05:40.340 | actually in the criminal justice system.
02:05:43.960 | You might've heard the term, the third degree.
02:05:46.720 | If you actually look it up, historically,
02:05:49.160 | it was a very intense set of beatings and starvation,
02:05:54.120 | physical demands that they would place on people
02:05:58.360 | to get them to talk.
02:05:59.480 | And there was certainly a lot of work that's been done
02:06:03.260 | by the CIA in terms of enhanced interrogation techniques.
02:06:07.320 | And from what I understand, the research actually shows
02:06:11.340 | that they just produce what people want to hear,
02:06:15.420 | not necessarily the information that is being looked for.
02:06:19.260 | And the reason is that, I mean, there's different reasons.
02:06:22.960 | I mean, one is people just get tired of being tortured
02:06:25.060 | and just say whatever.
02:06:26.420 | But another part of it is that you create
02:06:29.700 | a very interesting set of conditions
02:06:32.180 | where there's an authority figure telling you something,
02:06:36.020 | that you did this, we know you did this,
02:06:37.800 | we have witnesses saying you did this.
02:06:39.640 | So now you start to question yourself.
02:06:42.100 | Then they put you under stress.
02:06:44.080 | Maybe they're not feeding you.
02:06:46.000 | Maybe they're kind of like making you be cold
02:06:48.360 | or exposing you to music that you can't stand or something,
02:06:53.200 | whatever it is, right?
02:06:54.680 | It's like they're creating this physical stress.
02:06:58.440 | And so stress starts to act on,
02:07:01.520 | starts to down-regulate the prefrontal cortex.
02:07:03.880 | They're not necessarily as good at monitoring
02:07:05.740 | the accuracy of stuff.
02:07:07.700 | Then they start to get nice to you and they say,
02:07:09.420 | imagine, okay, I know you don't remember this,
02:07:11.940 | but maybe we can walk you through how it could have happened
02:07:14.940 | and they feed you the information.
02:07:17.120 | And so you're in this weakened mental state
02:07:20.340 | and you're being encouraged to imagine things
02:07:22.660 | by people who give you a plausible scenario.
02:07:25.640 | And at some point, certain people can be very coaxed
02:07:29.520 | into creating a memory for something that never happened.
02:07:33.020 | And there's actually some pretty convincing cases out there
02:07:35.740 | where you don't know exactly the truth.
02:07:38.780 | There's a sheriff, for instance,
02:07:40.340 | who came to believe that he had a false memory.
02:07:44.000 | I mean, that he had a memory of doing sexual abuse
02:07:46.940 | based on, you know, essentially, I think it was,
02:07:49.900 | you know, I'm not gonna tell the story
02:07:52.780 | because I don't remember it well enough
02:07:54.460 | to necessarily accurately give it to you,
02:07:57.020 | but people could look this stuff up.
02:07:58.860 | There are definitely stories out there like this,
02:08:00.960 | where people confess to crimes that they just didn't do
02:08:03.820 | and objective evidence came out later on.
02:08:06.480 | But there's a basic recipe for it,
02:08:10.020 | which is you feed people the information
02:08:12.660 | that you want them to remember, you stress them out,
02:08:16.620 | you have an authority figure
02:08:18.300 | kind of like pushing this information on them,
02:08:21.500 | or you motivate them to produce the information
02:08:24.160 | you're looking for, and that pretty much over time
02:08:27.860 | gives you what you want.
02:08:29.640 | - It's really tragic that centralized power
02:08:34.640 | can use these kinds of tools to destroy lives.
02:08:41.260 | Since there's a theme about music
02:08:47.220 | throughout this conversation,
02:08:49.180 | one of the best topics for songs is heartbreak,
02:08:53.500 | love in general, but heartbreak.
02:08:56.400 | Why and how do we remember and forget heartbreak?
02:08:59.680 | Asking for a friend.
02:09:00.720 | - Oh, God, that's so hard to,
02:09:03.340 | asking for a friend, I love that.
02:09:05.000 | Oh, it's such a hard one.
02:09:08.840 | Well, so, I mean, part of this is we tend to go back
02:09:13.840 | to particular times that are the more
02:09:17.040 | emotionally intense periods,
02:09:19.080 | and so that's a part of it, and again,
02:09:24.840 | memory is designed to kind of capture these things
02:09:27.500 | that are biologically significant,
02:09:29.340 | and attachment is a big part of biological significance
02:09:33.140 | for humans, right?
02:09:33.980 | Human relationships are super important,
02:09:36.940 | and sometimes that heartbreak comes with massive changes
02:09:40.740 | in your beliefs about somebody,
02:09:42.060 | say if they cheated on you or something like that,
02:09:45.220 | or regrets, and you kind of ruminate about things
02:09:47.780 | that you've done wrong.
02:09:48.940 | There's really so many reasons, though,
02:09:53.760 | but I mean, I've had this.
02:09:55.700 | My first pet I had was,
02:10:01.460 | we got her as a wedding present, a cat,
02:10:03.220 | and got her after, but she died of FIP
02:10:07.180 | when she was four years old,
02:10:08.860 | and I just would see her everywhere around the house.
02:10:13.580 | We got another cat, then we got a dog.
02:10:16.020 | Dog eventually died of cancer,
02:10:17.860 | and the cat just died recently,
02:10:20.220 | and so we got a new dog because I kept seeing
02:10:24.120 | the dog around, and I was just so heartbroken about this,
02:10:27.640 | but I still remember the pets that died.
02:10:30.760 | It just comes back to you.
02:10:32.080 | I mean, it's part of this.
02:10:33.240 | I think there's also something about attachment
02:10:36.120 | that's just so crucial that drives, again,
02:10:39.240 | these things that we want to remember,
02:10:41.640 | and that gives us that longing sometimes.
02:10:44.520 | Sometimes it's also not just about the heartbreak,
02:10:47.600 | but about the positive aspects of it,
02:10:50.400 | 'cause the loss comes from not only the fact
02:10:53.240 | that the relationship is over,
02:10:55.640 | but you had all of these good things before
02:10:58.760 | that you can now see in a new light,
02:11:01.840 | and so part of, one of the things that I found
02:11:05.520 | from my clinical background that really, I think,
02:11:07.500 | gave me a different perspective on memory
02:11:09.960 | is so much of the therapy process
02:11:12.300 | was guided towards reframing
02:11:14.880 | and getting people to look at the past in a different way,
02:11:19.240 | not by imposing, changing people's memories,
02:11:21.600 | or not by imposing an interpretation,
02:11:24.200 | but just offering a different perspective,
02:11:26.600 | and maybe one that's kind of more optimized
02:11:28.400 | towards learning, and an appreciation, maybe,
02:11:33.080 | or gratitude, whatever it is, right,
02:11:35.320 | that gives you a way of taking,
02:11:37.400 | I think you said it in the beginning, right,
02:11:39.080 | where you can have this kind of like dark experiences,
02:11:42.360 | and you can use it as training data
02:11:45.720 | to grow in new ways, but it's hard.
02:11:49.720 | - I often go back to this moment,
02:11:53.520 | this show "Louie" with Louis C.K.,
02:11:56.560 | where he's all heartbroken about a breakup
02:11:59.160 | with a woman he loves,
02:12:01.120 | and an older gentleman tells him
02:12:05.940 | that that's actually the best part, that heartbreak,
02:12:09.660 | because you get to intensely experience
02:12:12.120 | how valuable this love was.
02:12:14.280 | He says the worst part is forgetting it,
02:12:17.320 | is actually when you get over the heartbreak,
02:12:19.680 | that's the worst part.
02:12:21.120 | So I sometimes think about that,
02:12:23.480 | because having the love and losing it,
02:12:28.360 | like the losing it is when you sometimes
02:12:32.800 | feel it the deepest, which is an interesting way
02:12:37.160 | to celebrate the past and relive it.
02:12:40.640 | It sucks that you don't have a thing,
02:12:42.620 | but when you don't have a thing,
02:12:43.920 | it's a good moment to viscerally experience
02:12:48.880 | the memories of something that you now appreciate even more.
02:12:53.280 | - So you don't believe that an owner of a lonely heart
02:12:55.840 | is much better than an owner of a broken heart?
02:12:58.200 | You think an owner of a broken heart
02:13:01.000 | is better than the owner of a lonely heart?
02:13:02.740 | - Yes, for sure, I think so, I think so.
02:13:05.200 | But I'm gonna have to, day by day,
02:13:07.640 | I don't know, I'm gonna have to listen
02:13:09.080 | to some more Bruce Springsteen to figure that one out.
02:13:12.520 | - Well, you know, it's funny because it's like,
02:13:14.000 | after I turned 50, I think of death all the time.
02:13:17.040 | I just think that I probably have fewer years
02:13:22.040 | ahead of me than I have behind me, right?
02:13:24.560 | So I think about one thing, which is,
02:13:27.240 | what are the memories that I wanna carry with me
02:13:29.580 | for the next period of time?
02:13:32.200 | And also just the fact that, all around me,
02:13:35.960 | I know more people who are dying for various reasons.
02:13:40.800 | And so, I'm not, I'm not that old, right?
02:13:45.320 | But it's something that I think about a lot,
02:13:49.400 | and I'm reminded of how I talked to somebody
02:13:53.800 | who's a Buddhist, and I was like,
02:13:56.680 | the whole idea of Buddhism is renouncing attachments.
02:13:59.520 | In some ways, the idea of Buddhism is staying out
02:14:02.900 | of the world of memory and staying in the moment, right?
02:14:05.860 | And they talked about, you know, it's like,
02:14:07.860 | how do you renounce attachments
02:14:10.860 | to the people that you love, right?
02:14:12.680 | And they're just saying, well, I appreciate
02:14:14.580 | that I have this moment with them,
02:14:16.060 | and knowing that they will die makes me appreciate
02:14:18.500 | this moment that much more.
02:14:20.700 | I mean, you said something similar, right,
02:14:22.380 | in your daily routine that you think about things this way,
02:14:25.260 | right?
02:14:26.100 | - Yeah, I meditate on mortality every day.
02:14:30.860 | But, I don't know, at the same time,
02:14:33.180 | that really makes you appreciate the moment
02:14:34.660 | and live in the moment, and I also appreciate
02:14:38.740 | the full, deep rollercoaster of suffering
02:14:41.740 | involved in life, the little and the big, too.
02:14:45.320 | So, I don't know.
02:14:46.660 | The Buddhist kind of removing yourself from the world,
02:14:50.620 | or the Stoic, removing yourself from the world,
02:14:52.680 | the world of emotion, I'm torn about that one.
02:14:56.020 | I'm not sure.
02:14:56.940 | - Well, you know, this is where Hinduism and Buddhism,
02:14:59.740 | or at least some strains of Hinduism and Buddhism differ.
02:15:02.220 | In Hinduism, like if you read the Bhagavad Gita,
02:15:06.420 | the philosophy is not one of renouncing the world,
02:15:11.140 | because the idea is that not doing something
02:15:14.520 | is no different than doing something, right?
02:15:17.260 | So, what they argue, and again, you could interpret it
02:15:20.580 | in different ways, positive and negative,
02:15:22.580 | but the argument is that you don't want
02:15:25.020 | to renounce action, but you want to renounce
02:15:28.020 | the fruits of the action.
02:15:29.980 | You don't do it because of the outcome,
02:15:32.460 | you do it because of the process,
02:15:34.760 | because the process is part of the balance
02:15:37.460 | of the world that you're trying to preserve, right?
02:15:40.020 | And of course, you could take that different ways,
02:15:41.860 | but I really think about that from time to time
02:15:44.380 | in terms of like, you know, letting go of this idea
02:15:48.220 | of does this book sell, or trying to, you know,
02:15:52.460 | like impress you and get you to laugh at my jokes
02:15:55.460 | or whatever, and just be more like I'm sharing
02:15:57.800 | this information with you and, you know,
02:16:00.300 | getting to know you or whatever it is.
02:16:01.900 | But it's hard, right?
02:16:03.780 | It's like, 'cause we're so driven by the reinforcer,
02:16:07.260 | the outcome.
02:16:08.340 | - You're just part of the process of telling the joke,
02:16:12.380 | and if I laugh or not, that's up to the universe to decide.
02:16:16.820 | - Yep, it's my dharma.
02:16:18.220 | - How does studying memory affect your understanding
02:16:22.660 | of the nature of time?
02:16:23.980 | So like, we've been talking about us living in the present
02:16:28.980 | and making decisions about the future,
02:16:33.380 | standing on the foundation of these memories
02:16:35.900 | and narratives about the memories that we've constructed.
02:16:38.660 | So it feels like it does weird things to time.
02:16:43.420 | - Yeah, and the reason is that, in some sense,
02:16:46.540 | I think, especially the farther we go back,
02:16:49.540 | I mean, there's all sorts of interesting things
02:16:51.420 | that happen, so your sense of like,
02:16:54.220 | if I ask you how different does one hour ago feel
02:16:58.260 | from two hours ago, you'd probably say pretty different.
02:17:01.860 | But if I ask you, okay, go back one year ago
02:17:04.620 | versus one year and one hour ago,
02:17:06.420 | it's the same difference in time,
02:17:07.860 | it won't feel very different, right?
02:17:09.340 | So there's this kind of compression that happens
02:17:11.860 | as you look back farther in time.
02:17:14.940 | So that it's kind of like why when you're older,
02:17:17.460 | the difference between somebody who's like 50
02:17:20.060 | and 45 doesn't seem as big as the difference
02:17:23.420 | between like 10 and five or something, right?
02:17:25.220 | When you're 10 years old, everything seems like
02:17:27.660 | it's a long period of time.
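As an aside for readers, the compression described here behaves roughly like a logarithmic scale on elapsed time. A minimal sketch, assuming a simple Weber/Fechner-style log model purely for illustration (this is my framing, not a model endorsed in the conversation):

```python
import math

def subjective_gap(t1_hours: float, t2_hours: float) -> float:
    """Toy log-compression model: how different two past moments feel,
    given how long ago each one was (in hours)."""
    return abs(math.log(t2_hours) - math.log(t1_hours))

# One hour ago vs. two hours ago: the gap feels large (~0.69).
print(subjective_gap(1, 2))

# One year ago vs. one year and one hour ago: the same one-hour gap
# is nearly indistinguishable after compression (~0.0001).
year_hours = 365 * 24
print(subjective_gap(year_hours, year_hours + 1))
```

Under this toy model, the same one-hour gap is large when it is recent and vanishingly small when it sits a year in the past, which matches the intuition described above.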
02:17:29.220 | Here's the point: one of the interesting things
02:17:32.220 | that I found when I was working on the book actually
02:17:34.460 | was during the pandemic, I just decided to ask people
02:17:38.180 | in my class when we were doing the remote instruction.
02:17:40.340 | So one of the things I did was I'd poll people.
02:17:42.860 | And so I just asked people, do you feel like the days
02:17:46.380 | are moving by slower or faster or about the same?
02:17:51.060 | Almost everyone in the class said that the days
02:17:53.460 | were moving by slower.
02:17:54.820 | So then I would say, okay, so do you feel like the weeks
02:17:59.620 | are passing by slower, faster or the same?
02:18:02.900 | And the majority of them said that the weeks
02:18:04.660 | were passing by faster.
02:18:06.660 | So according to the laws of physics,
02:18:08.860 | I don't think that makes any sense, right?
02:18:11.540 | But according to memory, it did because what happened
02:18:14.740 | was people were doing the same thing over and over
02:18:17.620 | in the same context.
02:18:19.340 | And without that change in context,
02:18:22.140 | their feeling was that they were
02:18:25.020 | in one long monotonous event.
02:18:28.220 | And so, but then at the end of the week,
02:18:30.580 | you look back at that week and you say, well, what happened?
02:18:33.700 | No memories of what happened.
02:18:35.500 | So the week must have just gone by
02:18:37.660 | without even my noticing it.
02:18:39.420 | But that week went by during the same amount of time
02:18:42.820 | as an eventful week where you might've been going out,
02:18:45.140 | hanging out with friends on vacation or whatever, right?
02:18:48.060 | It's just that nothing happened
02:18:50.300 | because you're doing the same thing over and over.
02:18:52.260 | So I feel like memory really shapes our sense of time,
02:18:56.660 | but it does so in part because context
02:18:59.540 | is so important for memory.
02:19:01.460 | - Well, that compression you mentioned,
02:19:03.660 | it's an interesting process.
02:19:05.740 | 'Cause when I think about when I was like 12, 15,
02:19:11.780 | I just fundamentally feel like the same person.
02:19:15.340 | It's interesting what that compression does.
02:19:17.020 | It makes me feel like we're all connected,
02:19:19.740 | not just amongst humans and spatially,
02:19:22.100 | but in terms, back in time,
02:19:25.580 | there's a kind of eternal nature,
02:19:29.420 | like the timelessness, I guess, to life.
02:19:32.460 | It could be also a genetic thing just for me.
02:19:34.740 | I don't know if everyone agrees to this view of time,
02:19:38.540 | but to me, it all feels the same.
02:19:40.500 | - Like you don't feel the passage of time or?
02:19:43.140 | - No, I feel the passage of time
02:19:44.300 | in the same way that your students did from day to day.
02:19:47.980 | There's certain markers that let you know
02:19:52.060 | that time has passed, you celebrate birthdays and so on.
02:19:55.060 | But the core of who I am and who others I know are,
02:19:58.540 | or events, that compression of my understanding
02:20:02.420 | of the world removes time
02:20:06.140 | 'cause time is not useful for the compression.
02:20:08.260 | So the details of that time, at least for me,
02:20:11.180 | is not useful to understanding the core of the thing.
02:20:14.860 | - Maybe what it is is that you really like
02:20:17.940 | to see connections between things.
02:20:20.020 | This is really what motivates me in science, actually, too.
02:20:23.520 | But it's like when you start recalling the past
02:20:26.100 | and seeing the connections between the past and present,
02:20:30.660 | now you have this kind of web
02:20:32.740 | of interconnected memories, right?
02:20:35.700 | And so I can imagine, in that sense,
02:20:38.580 | there is this kind of the present is with you, right?
02:20:41.420 | But what's interesting about what you said, too,
02:20:44.420 | that struck me is that your 16-year-old self
02:20:48.460 | was probably very complex.
02:20:50.820 | And by the way, I'm the same way,
02:20:52.740 | but it's like it really is the source
02:20:54.420 | of a lot of darkness for me.
02:20:56.180 | But when you can look back at,
02:21:01.220 | let's say you hear a song that you used to play
02:21:04.900 | before you would go do a sports thing
02:21:07.340 | or something like that,
02:21:08.180 | and you might not think of yourself as an athlete,
02:21:09.820 | but once you get back to that,
02:21:12.140 | you mentally time travel to that particular thing,
02:21:14.720 | you open up this little compartment of yourself
02:21:17.220 | that wasn't there before, right,
02:21:18.740 | that didn't seem accessible before.
02:21:20.980 | Dan Schacter's lab did this really cool study
02:21:23.700 | where they would ask people
02:21:25.580 | to either remember doing something altruistic
02:21:29.900 | or imagine doing something altruistic.
02:21:32.980 | And that act made them more likely
02:21:36.540 | to want to do things for other people.
02:21:39.740 | So that act of mental time travel
02:21:43.140 | can change who you are in the present.
02:21:45.020 | And we tend to think of,
02:21:46.220 | this goes back to that illusion of stability,
02:21:48.240 | and we tend to think of memory
02:21:50.460 | in this very deterministic way,
02:21:52.020 | that I am who I am because I have this past.
02:21:54.840 | But we have a very multifaceted past
02:21:57.540 | and can access different parts of it
02:22:00.560 | and change in the moment
02:22:02.780 | based on whatever part we want to reach for, right?
02:22:06.460 | - How does nostalgia connect into this,
02:22:09.220 | like this desire and pleasure associated with going back?
02:22:14.220 | - Yeah, so my friend Felipe De Brigard wrote about this,
02:22:20.620 | and it just like blew my mind:
02:22:23.260 | the word nostalgia was coined by a Swiss physician
02:22:26.820 | who was actually studying traumatized soldiers.
02:22:29.100 | And so he described nostalgia as a disease.
02:22:32.420 | And the idea was it was bringing these people
02:22:34.500 | extraordinary unhappiness
02:22:36.240 | because they're remembering how things used to be.
02:22:38.740 | And I think it's very complex.
02:22:42.900 | So as people get older, for instance,
02:22:45.540 | nostalgia can be an enormous source of happiness, right?
02:22:49.740 | And being nostalgic can improve people's moods
02:22:52.540 | in the moment.
02:22:53.860 | But it just depends on what they do with it,
02:22:59.060 | because what you can sometimes see is that
02:23:01.460 | nostalgia has the opposite effect, leading you to think
02:23:01.460 | those were the good old days and those days are over, right?
02:23:04.220 | It's like America used to be so great and now it sucks.
02:23:07.700 | Or, you know, my life used to be so great when I was a kid
02:23:11.140 | and now it's not, right?
02:23:12.940 | And you're selectively remembering things,
02:23:15.900 | and we don't realize how selective
02:23:18.080 | our remembering self is.
02:23:20.140 | And so, you know, I lived through the '70s.
02:23:22.600 | It sucked.
02:23:23.440 | Partly it sucked more for me,
02:23:26.900 | but I would say that even otherwise,
02:23:29.380 | it's like there's all sorts of problems going on.
02:23:31.580 | Gas lines, people were like, you know,
02:23:33.780 | worried about like Russia, nuclear war, blah, blah, blah.
02:23:37.660 | So, I mean, it's just this idea that people have
02:23:42.100 | about the past can be very useful
02:23:45.220 | if it brings you happiness in the present,
02:23:48.100 | but if it narrows your worldview in the present
02:23:50.580 | and you're not aware of those biases that you have,
02:23:54.080 | then
02:23:55.860 | it can be toxic, right?
02:23:57.340 | Either at a personal level or at a collective level.
02:24:01.160 | - Let me ask you both a practical question
02:24:04.020 | and an out there question.
02:24:05.620 | So, let's start with the more practical one.
02:24:07.420 | What are your thoughts about BCIs,
02:24:11.460 | brain computer interfaces,
02:24:12.900 | and the work that's going on with Neuralink?
02:24:15.440 | We talked about electrodes and different ways
02:24:17.620 | of measuring the brain,
02:24:18.860 | and here Neuralink is working on basically
02:24:21.280 | two-way communication with the brain.
02:24:22.900 | And the more out there question would be like,
02:24:24.500 | where does this go?
02:24:25.500 | But more practically in the near term,
02:24:27.860 | what do you think about Neuralink?
02:24:29.660 | - Yeah, I mean, I can't say specifics about the company
02:24:32.740 | 'cause I haven't studied it that much,
02:24:34.340 | but I mean, I think there's two parts of it.
02:24:36.760 | So, one is they're developing
02:24:37.900 | some really interesting technology,
02:24:39.420 | I think with these surgical robots and things like that.
02:24:42.960 | BCI, though, has a whole lot of innovation going on.
02:24:48.820 | I'm not necessarily seeing any scientific evidence
02:24:52.620 | from Neuralink, and maybe that's just
02:24:54.740 | 'cause I'm not looking for it,
02:24:55.820 | but I'm not seeing the evidence that they're anywhere near
02:24:58.660 | where the scientific community is.
02:25:00.700 | And there's lots of startups
02:25:01.820 | that are doing incredibly innovative stuff.
02:25:03.860 | One of my colleagues, Sergey Stavisky,
02:25:05.620 | is just like a genius in this area,
02:25:07.420 | and they're working on it.
02:25:09.020 | I think speech prosthetics that are incorporating,
02:25:13.100 | you know, decoding techniques with AI,
02:25:15.260 | and, you know, movement prosthetics,
02:25:16.860 | it's just like the rate of progress is just enormous.
02:25:21.380 | So, part of the technology is having good enough data
02:25:24.820 | and understanding which data to use
02:25:26.900 | and what to do with it, right?
02:25:28.400 | And then the other part of it, then,
02:25:30.780 | is the algorithms for decoding it and so forth.
02:25:33.780 | And I think part of that has really resulted
02:25:35.960 | in some real breakthroughs in neuroscience as a result.
02:25:38.900 | So, there's lots of new technologies like Neuropixels,
02:25:43.220 | for instance, that allow you to harvest activity
02:25:45.560 | from many, many neurons from a single electrode.
02:25:48.860 | I know Neuralink has some technologies
02:25:51.420 | that are also along these lines,
02:25:53.020 | but I haven't seen it, again, because they do their own stuff.
02:25:56.980 | The scientific community doesn't see it, right?
02:25:59.340 | But I think BCI is much, much bigger than Neuralink,
02:26:03.940 | and there's just so much innovation happening.
02:26:07.300 | I think the interesting question,
02:26:09.280 | which we may be getting into,
02:26:11.060 | is I was talking to Sergey a while ago about, you know,
02:26:13.460 | so, a lot of language is not just what we hear
02:26:16.860 | and what we speak, but also our intentions
02:26:20.980 | and our internal models.
02:26:22.300 | And, you know, so are you really gonna be able
02:26:24.620 | to restore language without dealing with that part of it?
02:26:27.680 | And he brought up a really interesting question,
02:26:29.820 | which is the ethics of reading out people's intentions
02:26:34.140 | and understanding of the world,
02:26:36.400 | as opposed to the more, you know,
02:26:39.140 | the more concrete parts of hearing
02:26:41.940 | and producing movements, right?
02:26:43.740 | - Just so we're clear,
02:26:44.700 | 'cause you said a few interesting things.
02:26:46.460 | When we talk about language and BCIs,
02:26:48.780 | what we mean is getting signal from the brain
02:26:51.620 | and generating the language,
02:26:54.380 | say you're not able to actually speak,
02:26:56.320 | it's as a kind of linguistic prosthetic.
02:27:01.740 | It's able to speak for you exactly what you wanted to say.
02:27:06.740 | And then the deeper question is,
02:27:08.380 | well, saying something isn't just the letters,
02:27:12.820 | the words that you're saying,
02:27:14.620 | it's also the intention behind it,
02:27:15.980 | the feeling behind it, all that kind of stuff.
02:27:18.700 | And is it ethical to reveal that full shebang,
02:27:21.820 | the full context of what's going on in our brain?
02:27:25.660 | That's really, that's really interesting.
02:27:28.900 | That's really interesting.
02:27:29.740 | I mean, our thoughts.
02:27:31.140 | Is it ethical for anyone to have access to our thoughts?
02:27:34.860 | Because right now the resolution is so low
02:27:40.460 | that we're okay with it,
02:27:41.460 | even doing studies and all this kind of stuff.
02:27:43.580 | But if neuroscience has a few breakthroughs
02:27:46.460 | to where you can start to map out the QR codes
02:27:48.780 | for different thoughts, for different kinds of thoughts,
02:27:52.220 | maybe political thoughts, you know, McCarthyism.
02:27:56.660 | What if I'm getting a lot of them communist thoughts
02:28:00.300 | or however we want to categorize or label it?
02:28:03.260 | That's interesting.
02:28:06.540 | That's really interesting.
02:28:07.780 | I think ultimately this always,
02:28:10.420 | the more transparency there is about the human mind,
02:28:15.420 | the better it is.
02:28:18.140 | But there could be always intermediate battles
02:28:22.260 | with how much control does a centralized entity have,
02:28:25.500 | like a government and so on.
02:28:26.500 | What is the regulation?
02:28:27.620 | What are the rules?
02:28:28.460 | What are the, what's legal and illegal?
02:28:30.500 | You know, if you talk about the police
02:28:32.340 | whose job is to track down criminals and so on,
02:28:36.260 | and you look at all the history,
02:28:38.020 | how the police could abuse its power
02:28:41.300 | to control the citizenry, all that kind of stuff.
02:28:43.820 | So people are always paranoid and rightfully so.
02:28:46.340 | It's fascinating.
02:28:47.660 | It's really fascinating.
02:28:49.380 | You know, we talk about freedom of speech,
02:28:51.540 | you know, freedom of thought,
02:28:54.940 | which is also a very important liberty
02:29:00.400 | at the core of this country and probably humanity,
02:29:03.380 | and that starts to get awfully tricky
02:29:05.380 | when you start to be able to collect those thoughts.
02:29:08.420 | But what I wanted to actually ask you is,
02:29:11.140 | do you think for fun and for practical purposes,
02:29:16.140 | you'll be able to, we would be able to modify memories?
02:29:21.880 | So how difficult is it to,
02:29:26.540 | how far away we are from understanding
02:29:29.220 | the different parts of the brains,
02:29:30.760 | everything we've been talking about,
02:29:33.260 | in order to figure out how could we adjust this memory
02:29:36.540 | at the crude level from unpleasant to pleasant?
02:29:39.980 | You talked about how we can remember the mall and the people,
02:29:43.860 | like the location of the people.
02:29:45.380 | Can we keep the people and change the place?
02:29:47.500 | Like this kind of stuff.
02:29:48.700 | How difficult is that?
02:29:50.200 | - Well, I mean, in some sense,
02:29:51.660 | we know we can do it just behaviorally, right?
02:29:54.420 | - Behaviorally, yes.
02:29:55.260 | - Just like, you know,
02:29:56.940 | under certain conditions anyway,
02:29:58.540 | I can give you the misinformation
02:30:00.620 | and then you can change the people, places and so forth.
02:30:03.580 | On the crude level, there's a lot of work that's being done
02:30:08.340 | on a phenomenon called reconsolidation,
02:30:11.060 | which is this idea that essentially when I recall a memory,
02:30:14.680 | what happens is that the connections between the neurons
02:30:18.460 | and that cell assembly that give you the memory
02:30:21.500 | are going to be like more modifiable.
02:30:25.980 | And so some people have used techniques to try to like,
02:30:29.100 | for instance, with fear memories,
02:30:30.980 | to reduce that physical visceral component of the memory
02:30:35.140 | when it's being activated.
02:30:36.860 | Right now, as an outsider looking at the data,
02:30:40.260 | I think it's like mixed results.
02:30:42.820 | And part of it, and this speaks to the more complex issue,
02:30:46.800 | is that you need somebody to actually fully recall
02:30:51.800 | that traumatic memory in the first place
02:30:54.680 | in order to actually modify it.
02:30:57.860 | And then, what is the memory?
02:30:59.500 | That is the key part of the problem.
02:31:01.900 | So if we go back to reading people's thoughts,
02:31:03.900 | what is the thought?
02:31:05.700 | I mean, people can sometimes look at this like behaviorists
02:31:08.300 | and go, well, the memory is like,
02:31:10.300 | I've given you A and you produce B.
02:31:12.180 | But I think that's a very bankrupt concept about memory.
02:31:15.700 | I think it's much more complicated than that.
02:31:17.940 | And one of the things that when we started studying
02:31:20.740 | naturalistic memory, like memory from movies,
02:31:23.560 | that was so hard was we had to change
02:31:25.500 | the way we did the studies.
02:31:26.700 | Because if I show you a movie and I watch the same movie
02:31:31.700 | and you recall everything that happened
02:31:34.940 | and I recall everything that happened,
02:31:36.960 | we might take a different amount of time to do it.
02:31:39.160 | We might use different words.
02:31:41.040 | And yet to an outside observer,
02:31:42.580 | we might've recalled the same thing, right?
02:31:45.380 | So it's not about the words necessarily,
02:31:47.780 | and it's not about how long we spent or whatever.
02:31:50.780 | There's something deeper that is there that's this idea,
02:31:54.220 | but it's like, how do you understand that thought?
02:31:57.420 | I encounter a lot of concrete thinking that it's like,
02:32:01.220 | if I show a model, like the visual information
02:32:06.020 | that a person sees when they drive,
02:32:08.340 | I can basically reverse engineer driving.
02:32:10.860 | Well, that's not really how it works.
02:32:13.380 | I once saw a talk by somebody,
02:32:15.300 | or I saw somebody talking in this discussion
02:32:18.060 | between neuroscientists and AI people.
02:32:20.580 | And he was saying that the problem with self-driving cars
02:32:24.180 | that they had in cities, as opposed to highways,
02:32:27.220 | was that the car was okay at doing the things
02:32:31.580 | it's supposed to, but when there were pedestrians around,
02:32:34.620 | it couldn't predict the intentions of people.
02:32:37.500 | And so that unpredictability of people was the problem
02:32:40.920 | that they were having in the self-driving car design,
02:32:44.180 | 'cause it didn't have a good enough internal model
02:32:47.980 | of what the people were, what they were doing,
02:32:51.480 | what they wanted.
02:32:53.060 | Now, what do you think about that?
02:32:54.380 | - Well, I spent a huge amount of time
02:32:56.660 | watching pedestrians, thinking about pedestrians,
02:33:01.220 | thinking about what it takes to solve the problem of
02:33:03.820 | measuring, detecting the intention of a pedestrian,
02:33:12.980 | really of a human being in this particular context
02:33:15.540 | of having to cross a street.
02:33:18.020 | And it's fascinating.
02:33:20.340 | I think it's a window into how complex social systems are
02:33:25.340 | that involve humans, because I would just stand there
02:33:33.220 | and watch intersections for hours.
02:33:36.420 | And what you start to figure out is
02:33:38.660 | every single intersection has its own personality.
02:33:42.220 | So there's a history to that intersection,
02:33:44.620 | like jaywalking, certain intersections
02:33:48.300 | allow jaywalking a lot more,
02:33:52.020 | because what happens is we're leaders and followers.
02:33:56.660 | So there's a regular, let's say,
02:33:58.120 | who gets off the subway
02:33:59.740 | and starts crossing on a red light,
02:34:01.620 | and they do this every single day.
02:34:03.900 | And then there's people that don't show up
02:34:05.180 | to that intersection often,
02:34:06.380 | and they're looking for cues
02:34:07.460 | of how we're supposed to behave here.
02:34:09.460 | And if a few people start to jaywalk and cross on a red light,
02:34:13.240 | they will also, they will follow.
02:34:15.820 | And there's just a dynamic to that intersection.
02:34:18.420 | There's a spirit to it.
02:34:19.900 | If you look at Boston versus New York,
02:34:23.140 | versus a rural town, even Boston, San Francisco,
02:34:26.260 | or here in Austin, there's different personalities city-wide
02:34:30.100 | but there's different personalities area-wide, region-wide,
02:34:32.740 | and there's different personalities, different intersections.
02:34:35.540 | And it's just fascinating.
02:34:37.260 | For a car to be able to determine that is tricky.
02:34:41.560 | Now, what machine learning systems are able to do well
02:34:44.380 | is collect a huge amount of data.
02:34:46.220 | So for us, it's tricky because we get to understand the world
02:34:50.980 | with very limited information and make decisions grounded
02:34:55.500 | in this big foundation model that we've built
02:34:58.380 | of understanding how humans work.
02:35:00.900 | AI could literally, in the context of driving,
02:35:04.220 | this is where I've often been really torn
02:35:06.660 | in both directions.
02:35:07.500 | If you just collect a huge amount of data,
02:35:09.900 | all of that information,
02:35:12.420 | and then compress it into a representation
02:35:15.660 | of how humans cross streets, it's probably all there.
02:35:20.620 | In the same way that you have a Noam Chomsky who says,
02:35:23.500 | "No, no, no, AI can't talk,
02:35:25.900 | "can't write convincing language
02:35:27.740 | "without understanding language."
02:35:30.620 | And more and more, you see large language models
02:35:33.260 | without quote-unquote understanding
02:35:35.900 | can generate very convincing language.
02:35:38.440 | But I think what the process of compression
02:35:40.620 | from a huge amount of data compressing
02:35:42.380 | into a representation is doing is in fact understanding.
02:35:47.180 | Deeply, in order to be able to generate one letter
02:35:49.940 | at a time, one word at a time,
02:35:52.300 | you have to understand the cruelty of Nazi Germany
02:35:57.300 | and the beauty of sending humans to space.
02:36:03.580 | And you have to understand all of that
02:36:05.060 | in order to generate, I'm going to the kitchen
02:36:07.780 | to get an apple and do that grammatically correctly.
02:36:10.700 | You have to have a world model
02:36:11.960 | that includes all of human behavior.
02:36:13.700 | - You're thinking LLM is building that world model.
02:36:16.100 | - It has to in order to be good at generating
02:36:19.740 | one word at a time, a convincing sentence.
02:36:22.260 | And in the same way, I think AI that drives a car,
02:36:27.180 | if it has enough data, will be able to form a world model
02:36:31.700 | that will be able to predict correctly
02:36:33.520 | what a pedestrian does.
02:36:35.260 | But when we, as humans are watching pedestrians,
02:36:38.740 | we slowly realize, damn, this is really complicated.
02:36:42.300 | In fact, when you start to self-reflect on driving,
02:36:45.420 | you realize driving is really complicated.
02:36:49.000 | There's like subtle cues we take about like just,
02:36:52.820 | there's a million things I could say,
02:36:55.460 | but like one of them determining who around you
02:36:58.180 | is an asshole, aggressive driver, potentially dangerous.
02:37:00.860 | - Yes, yes, I was just thinking about this, yes.
02:37:02.420 | - Or like--
02:37:03.260 | - You can read it, once you become a great driver,
02:37:06.860 | you can see it a mile away, this guy is gonna pull
02:37:09.620 | an asshole move in front of you.
02:37:11.660 | He's like way back there, but you know it's gonna happen.
02:37:14.340 | - And I don't know what, 'cause we're ignoring
02:37:16.940 | all the other cars, but for some reason,
02:37:19.180 | the asshole, like a red, like a glowing obvious symbol
02:37:23.540 | is just like right there, even in peripheral vision.
02:37:26.240 | 'Cause we're, again, we're usually when we're driving
02:37:29.260 | just looking forward, but we're like using
02:37:32.100 | our peripheral vision to figure stuff out,
02:37:34.060 | and it's like a little puzzle that we're usually
02:37:36.580 | only allocating a small amount of our attention to,
02:37:39.660 | at least like cognitive attention to.
02:37:41.540 | And it's fascinating, but I think AI just has
02:37:45.380 | a fundamentally different suite of sensors
02:37:47.780 | in terms of the bandwidth of data that's coming in
02:37:50.240 | that allows you to form the representation
02:37:53.660 | and perform inference
02:37:56.220 | using the representation you form.
02:37:58.780 | That for the case of driving, I think it could be
02:38:01.300 | quite effective.
02:38:04.100 | But one of the things that's currently missing,
02:38:06.500 | even though OpenAI just recently announced adding memory,
02:38:11.080 | and I did wanna ask you like how important it is,
02:38:15.500 | how difficult is it to add some of the memory mechanisms
02:38:19.620 | that you've seen in humans to AI systems?
02:38:22.340 | - I would say superficially not that hard,
02:38:27.180 | but then in a deeper level, very, very hard,
02:38:29.620 | because we don't understand episodic memory, right?
02:38:31.980 | And so one of the ideas I talk about in the book
02:38:34.620 | is one of the oldest kind of dilemmas
02:38:38.020 | in computational neuroscience
02:38:39.460 | is what Steve Grossberg called
02:38:41.220 | the stability plasticity dilemma, right?
02:38:43.140 | When do you say something is new
02:38:46.660 | and overwrite your preexisting knowledge
02:38:48.660 | versus going with what you had before
02:38:51.020 | and making incremental changes?
02:38:52.660 | And so part of the problem,
02:38:58.700 | you know what I mean,
02:38:59.980 | part of the problem of things
02:39:01.060 | like if you're trying to design an LLM
02:39:02.660 | or something like that, is that, especially for English,
02:39:05.120 | there's so many exceptions to the rules, right?
02:39:08.100 | And so if you wanna rapidly learn the exceptions,
02:39:11.460 | you're gonna lose the rules.
02:39:13.340 | And if you wanna keep the rules,
02:39:14.940 | you have a harder time learning the exception.
02:39:17.620 | And so David Marr was one of the early pioneers
02:39:21.500 | in computational neuroscience.
02:39:23.540 | And then Jay McClelland and my colleague, Randy O'Reilly,
02:39:27.460 | some other people like Neal Cohen,
02:39:29.820 | all these people started to come up with the idea
02:39:32.500 | that maybe that's part of what we need
02:39:35.020 | in what the human brain is doing
02:39:37.460 | is we have this kind of a,
02:39:39.400 | actually a fairly dumb system,
02:39:41.300 | which just says this happened once at this point in time,
02:39:44.900 | which we call episodic memory, so to speak.
02:39:47.260 | And then we have this knowledge that we've accumulated
02:39:49.820 | from our experiences as semantic memory.
02:39:52.140 | So now when we
02:39:55.940 | encounter a situation that's surprising
02:39:58.500 | and violates all our previous expectations,
02:40:01.100 | what happens is that now we can form
02:40:03.420 | an episodic memory here.
02:40:05.180 | And the next time we're in a similar situation, boom,
02:40:08.580 | we could supplement our knowledge with this information
02:40:11.060 | from episodic memory and reason about
02:40:12.980 | what the right thing to do is, right?
02:40:15.100 | So it gives us this enormous amount of flexibility
02:40:18.660 | to stop on a dime and change
02:40:20.780 | without having to erase everything we've already learned.
02:40:25.300 | And that solution is incredibly powerful
02:40:28.300 | because it gives you the ability to learn
02:40:31.820 | from so much less information really, right?
02:40:34.460 | And it gives you that flexibility.
02:40:37.000 | So one of the things I think that makes humans great
02:40:39.820 | is having both episodic and semantic memory.
02:40:43.140 | Now, can you build something like that?
02:40:46.740 | I mean, computational neuroscience people would say,
02:40:49.460 | well, yeah, you just record a moment
02:40:51.780 | and you just get it and you're done, right?
02:40:53.940 | But when do you record that moment?
02:40:55.980 | How much do you record?
02:40:57.260 | What's the information you prioritize
02:40:58.980 | and what's the information you don't?
02:41:00.900 | These are the hard questions.
02:41:02.300 | When do you use episodic memory?
02:41:04.140 | When do you just throw it away?
02:41:06.240 | And these are the hard questions
02:41:07.800 | we're still trying to figure out in people.
02:41:09.940 | And then you start to think about all these mechanisms
02:41:13.220 | that we have in the brain for figuring out
02:41:14.920 | some of these things.
02:41:15.760 | And it's not just one, but it's many of them
02:41:17.980 | that are interacting with each other.
02:41:19.460 | And then you just take not only the episodic
02:41:22.300 | and the semantic, but then you start to take
02:41:24.000 | the motivational survival things, right?
02:41:26.860 | It's just like the fight or flight responses
02:41:29.240 | that we associate with particular things
02:41:31.340 | or the kind of like reward motivation
02:41:34.660 | that we associate with certain things, so forth.
02:41:37.420 | And those things are absent from AI.
02:41:39.540 | I frankly don't know if we want it.
02:41:41.300 | I don't necessarily want a self-motivated LLM, right?
02:41:45.580 | And then there's the problem of how do you even
02:41:51.540 | build the motivations that should guide
02:41:53.700 | a proper reinforcement learning kind of thing,
02:41:55.940 | for instance, so a friend of mine, Sam Gershman,
02:41:59.660 | I might be missing the quote exactly,
02:42:02.180 | but he basically said, if I wanted to train
02:42:05.340 | like a typical AI model to make me as much money
02:42:08.580 | as possible, first thing I might do is sell my house.
02:42:11.620 | So it's not even just about having one goal
02:42:15.620 | or one objective, but just having all these
02:42:17.660 | competing goals and objectives, right?
02:42:19.980 | And then things start to get really complicated.
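To make the episodic/semantic split discussed above concrete, here is a rough sketch of the complementary-learning-systems idea: a slow, incremental "semantic" learner plus a fast, one-shot "episodic" store that can capture a single surprising exception and override the general rule at recall. The class names, vectors, and distance threshold are illustrative assumptions of mine, not anything from McClelland, O'Reilly, or Ranganath's actual models.

```python
import numpy as np

class SemanticMemory:
    """Slow learner: one prototype per label, nudged slightly by each example,
    so accumulated knowledge is stable but changes only gradually."""
    def __init__(self, dim: int, lr: float = 0.05):
        self.dim, self.lr = dim, lr
        self.prototypes = {}  # label -> running prototype vector

    def learn(self, x: np.ndarray, label: str) -> None:
        p = self.prototypes.setdefault(label, np.zeros(self.dim))
        p += self.lr * (x - p)  # small incremental update toward the example

    def predict(self, x: np.ndarray):
        if not self.prototypes:
            return None
        return min(self.prototypes,
                   key=lambda k: np.linalg.norm(x - self.prototypes[k]))

class EpisodicMemory:
    """Fast one-shot store: keeps individual episodes and answers by nearest
    neighbor, so a single exception is usable immediately."""
    def __init__(self, radius: float = 0.5):
        self.radius = radius
        self.episodes = []  # list of (vector, label) pairs

    def store(self, x: np.ndarray, label: str) -> None:
        self.episodes.append((x, label))

    def predict(self, x: np.ndarray):
        if not self.episodes:
            return None
        vec, label = min(self.episodes, key=lambda e: np.linalg.norm(x - e[0]))
        return label if np.linalg.norm(x - vec) < self.radius else None

def recall(x, episodic: EpisodicMemory, semantic: SemanticMemory):
    """A remembered exception (episodic hit) overrides the general rule."""
    hit = episodic.predict(x)
    return hit if hit is not None else semantic.predict(x)
```

A caller would train SemanticMemory over many examples, store only the surprising ones in EpisodicMemory, and let recall() prefer an episodic hit; the hard questions raised in the conversation, like when to record a moment and how much of it to keep, are exactly what this sketch leaves unanswered.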
02:42:22.500 | - Well, it's all interconnected.
02:42:23.860 | I mean, just even the thing you've mentioned is the moment.
02:42:27.860 | You know, if we record a moment,
02:42:29.940 | it's difficult to express concretely what a moment is,
02:42:36.180 | like how deeply connected it is to the entirety of it.
02:42:42.100 | Maybe to record a moment, you have to make
02:42:45.540 | a universe from scratch.
02:42:47.460 | You have to include everything.
02:42:49.580 | You have to include all the emotions involved,
02:42:51.020 | all the context, all the things it built around it,
02:42:53.100 | all the social connections, all the visual experiences,
02:42:58.100 | all the sensory experience, all of that.
02:43:00.220 | All the history that came before that moment is built on.
02:43:03.740 | And we somehow take all of that and we compress it
02:43:06.620 | and keep the useful parts and then integrate it
02:43:09.940 | into the whole thing, into our whole narrative.
02:43:13.260 | And then each individual has their own little version
02:43:16.020 | of that narrative, and then we collide in a social way,
02:43:18.660 | and we adjust it, and we evolve.
02:43:21.380 | - Yeah, yeah.
02:43:22.220 | I mean, well, even if we wanna go super simple, right?
02:43:24.860 | Like Tyler Bonnen, who's a post-doc
02:43:27.740 | who's collaborating with me,
02:43:29.140 | he actually studied a lot of computer vision at Stanford.
02:43:34.140 | And so one of the things he was interested in
02:43:36.420 | is some people who have brain damage in areas of the brain
02:43:38.860 | that were thought to be important for memory.
02:43:41.100 | But they also seem to have some perception problems
02:43:44.900 | with particular kinds of object perception.
02:43:47.420 | And this is super controversial,
02:43:48.980 | and some people found this effect, some didn't.
02:43:51.420 | And he went back to computer vision,
02:43:54.380 | and he said, "Let's take the best state-of-the-art
02:43:56.340 | "computer vision models, and let's give them
02:43:58.700 | "the same kinds of perception tests
02:44:00.540 | "that we were giving to these people."
02:44:02.820 | And then he would find the images
02:44:04.380 | where the computer vision models would just struggle,
02:44:06.980 | and you would find that they just didn't do well.
02:44:09.460 | Even if you add more parameters,
02:44:11.180 | you add more layers, on and on and on, it doesn't help,
02:44:13.900 | right, the architecture didn't matter,
02:44:15.300 | it was just there, the problem.
02:44:16.980 | And then he found those were the exact ones
02:44:19.540 | where these humans with particular damage
02:44:21.740 | to this area called the perirhinal cortex,
02:44:24.380 | that was where they were struggling.
02:44:26.580 | So somehow this brain area was important
02:44:30.580 | for being able to do these things
02:44:32.080 | that were adversarial to these computer vision models.
02:44:36.060 | So then he found that it only happened
02:44:40.180 | if people had enough time;
02:44:42.580 | with time, they could make those discriminations.
02:44:44.780 | But without enough time, if they just get a glance,
02:44:46.980 | they're just like the computer vision models.
02:44:49.100 | So then what he started to say was,
02:44:50.340 | "Well, maybe let's look at people's eyes," right?
02:44:52.800 | So a computer vision model sees every pixel all at once,
02:44:55.780 | right, it's not, you know, and we don't,
02:44:58.540 | we never see every pixel all at once.
02:45:00.880 | Even if I'm looking at a screen with pixels,
02:45:02.540 | I'm not seeing every pixel all at once.
02:45:04.580 | I'm grabbing little points on the screen
02:45:08.140 | by moving my eyes around
02:45:09.900 | and getting a very high-resolution picture
02:45:12.080 | of what I'm focusing on,
02:45:13.860 | and kind of a lower-resolution information
02:45:16.100 | about everything else.
02:45:17.800 | But I'm not necessarily choosing,
02:45:20.140 | I'm directing that exploration.
02:45:23.940 | And allowing people to move their eyes
02:45:26.420 | and integrate that information
02:45:28.060 | gave them something
02:45:29.860 | that the computer vision models weren't able to do.
02:45:33.420 | So somehow integrating information across time
02:45:37.100 | and getting less information at each step
02:45:40.260 | gave you more out of the process.
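A rough sketch of the contrast being described: a standard vision model ingests every pixel at once, while a human takes a sequence of fixations, each one a high-resolution foveal patch plus a coarse peripheral summary, and integrates them over time. The patch size, downsampling factor, and fixation points below are arbitrary assumptions for illustration, not anything from the study mentioned.

```python
import numpy as np

def glimpse(image: np.ndarray, fixation: tuple, patch: int = 16, coarse: int = 4):
    """One 'fixation': a full-resolution crop around the fixation point (fovea)
    plus a heavily downsampled view of the whole scene (periphery)."""
    h, w = image.shape
    y, x = fixation
    y0 = min(max(0, y - patch // 2), h - patch)
    x0 = min(max(0, x - patch // 2), w - patch)
    fovea = image[y0:y0 + patch, x0:x0 + patch]
    periphery = image[::h // coarse, ::w // coarse]  # crude low-res summary
    return fovea, periphery

# Accumulate evidence across several fixations instead of
# processing every pixel at once.
rng = np.random.default_rng(0)
scene = rng.random((128, 128))
fixations = [(32, 32), (64, 96), (100, 40)]
evidence = [glimpse(scene, f) for f in fixations]
print(len(evidence), evidence[0][0].shape, evidence[0][1].shape)  # 3 (16, 16) (4, 4)
```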
02:45:42.580 | - I mean, the process of allocating attention
02:45:47.580 | across time seems to be a really important process.
02:45:52.780 | Even the breakthroughs that you get
02:45:55.660 | with machine learning mostly has to do,
02:46:00.380 | attention is all you need.
02:46:01.780 | It's about attention, transform is about attention.
02:46:04.080 | So attention is a really interesting one,
02:46:07.380 | but then like, yeah, how you allocate that attention,
02:46:12.500 | again, is like, is at the core of like,
02:46:17.500 | what it means to be intelligent,
02:46:19.940 | what it means to process the world,
02:46:22.780 | integrate all the important things,
02:46:25.060 | discard all the unimportant things.
02:46:27.200 | Attention is at the core of it,
02:46:30.900 | it's probably at the core of memory too.
02:46:33.700 | 'Cause there's so much sensory information,
02:46:35.380 | there's so much going on.
02:46:36.700 | There's so much going on.
02:46:37.740 | To filter it down to almost nothing
02:46:39.900 | and just keep those parts,
02:46:41.740 | and then,
02:46:44.020 | whenever there's an error, to adjust the model
02:46:46.140 | such that you can allocate attention even better
02:46:49.180 | to new things,
02:46:52.620 | maybe maximize the chance of confirming the model
02:46:55.060 | or disconfirming the model that you have
02:46:56.700 | and adjusting it from there.
02:46:58.720 | Yeah, attention is a weird one.
02:47:01.660 | I was always fascinated.
02:47:04.020 | I mean, I got a chance to study peripheral vision for a bit
02:47:09.500 | and indirectly study attention through that.
02:47:11.980 | It's just fascinating how humans,
02:47:14.100 | how good humans are at looking around
02:47:15.900 | and gathering information.
02:47:17.860 | - Yeah, at the same time, people are terrible
02:47:20.220 | at detecting changes that can happen in the environment
02:47:23.740 | if they're not attending in the right way,
02:47:26.080 | if their predictive model is too strong, you know?
02:47:29.100 | So you have these weird things where like,
02:47:31.420 | the machines can do better than the people.
02:47:33.460 | It's not that it's like, you know,
02:47:35.220 | so this is the thing is people go,
02:47:36.580 | oh, the machines can do this stuff that's just like humans.
02:47:39.860 | It's like, well, the machines make different kinds
02:47:43.100 | of mistakes than the people do.
02:47:45.340 | And I will never be convinced unless that, you know,
02:47:48.580 | we've replicated human,
02:47:50.900 | I don't even like the term intelligence
02:47:52.700 | 'cause I think it's a stupid concept,
02:47:54.740 | but it's like, I don't think we've replicated
02:47:57.020 | human intelligence unless I know that the simulator
02:48:01.540 | is making exactly the same kinds of mistakes that people do
02:48:05.020 | 'cause people make characteristic mistakes.
02:48:07.620 | They have characteristic biases.
02:48:09.420 | They have characteristic, like, you know,
02:48:11.940 | heuristics that we use.
02:48:13.420 | And those, I have yet to see evidence
02:48:18.740 | that ChatGPT will do that.
02:48:18.740 | - Since we're talking about attention,
02:48:20.640 | is there an interesting connection to you
02:48:23.420 | between ADHD and memory?
02:48:26.500 | - Well, it's interesting for me because when I was a child,
02:48:30.420 | I was actually told by my school,
02:48:33.180 | I don't know if it came from a school psychologist.
02:48:35.300 | They did do some testing on me, I know,
02:48:37.640 | for like IQ and stuff like that.
02:48:39.460 | Or if it just came from teachers who hated me,
02:48:43.100 | but they told my parents that I had ADHD.
02:48:46.780 | And so this was, of course, in the '70s.
02:48:48.940 | So basically they said like, you know,
02:48:51.380 | he has poor motor control and he's got ADHD.
02:48:54.860 | And so, and, you know, there was social issues.
02:48:57.940 | So like I could have been put a year ahead in school,
02:49:00.980 | but then they said, oh, but he doesn't have the social,
02:49:04.300 | he doesn't have the social capabilities.
02:49:06.620 | So I still ended up being like, you know,
02:49:08.900 | an outcast, even in my own grade, but then like I was,
02:49:14.700 | so then my parents said, okay, well,
02:49:16.620 | they got me on a diet free of artificial colors and flavors,
02:49:20.260 | because that was the thing
02:49:21.340 | that people talked about back then.
02:49:22.740 | So I'm interested in this topic
02:49:25.020 | because I've come to appreciate now
02:49:27.460 | that I have many of the characteristics,
02:49:29.860 | if not, you know, full blown.
02:49:31.540 | It's like, I'm definitely time blindness,
02:49:34.780 | rejection sensitive, you name it, they talk about it.
02:49:37.620 | It's like impulsive behavior.
02:49:39.500 | I can tell you about all sorts of fights
02:49:41.020 | I've gotten into in the past, just you name it.
02:49:44.260 | But yeah, so ADHD is fascinating though,
02:49:48.820 | because right now we're seeing like
02:49:50.780 | more and more diagnosis of it.
02:49:52.660 | And I don't know what to say about that.
02:49:55.660 | I don't know how much of that is based on
02:50:00.260 | kind of inappropriate expectations, especially for children,
02:50:04.660 | and how much of that is based on true
02:50:06.740 | kind of like maladaptive kinds of tendencies.
02:50:10.660 | But what we do know is this,
02:50:11.900 | is that ADHD is associated with differences
02:50:14.860 | in prefrontal function,
02:50:16.860 | so that attention can be both more,
02:50:20.580 | you're more distractible,
02:50:21.860 | you have harder time focusing your attention
02:50:24.140 | on what's relevant.
02:50:25.300 | And so you shift too easily,
02:50:26.700 | but then once you get on something
02:50:28.100 | that you're interested in, you can get stuck.
02:50:30.820 | And so, you know, attention is this beautiful balance
02:50:34.740 | of being able to focus when you need to focus
02:50:37.580 | and shift when you need to shift.
02:50:39.820 | And so it's that flexibility plus stability again.
02:50:43.100 | And that balance seems to be disrupted in ADHD.
02:50:48.460 | And so as a result, memory tends to be poor in ADHD,
02:50:52.860 | but it's not necessarily because
02:50:54.300 | there's a traditional memory problem,
02:50:56.940 | but it's more because of this attentional issue, right?
02:51:01.420 | And so, and people with ADHD often will have great memory
02:51:05.780 | for the things that they're interested in,
02:51:08.260 | and just no memory for the things
02:51:10.060 | that they're not interested in.
02:51:11.460 | - Is there advice from your own life
02:51:14.060 | on how to learn and succeed from that?
02:51:17.060 | From just how the characteristics of your own brain
02:51:20.780 | with ADHD and so on,
02:51:23.380 | how do you learn?
02:51:25.220 | How do you remember information?
02:51:29.900 | How do you flourish in this sort of education context?
02:51:34.140 | - I'm still trying to figure out the flourishing per se,
02:51:36.580 | but education, I mean, being in science
02:51:39.460 | is enormously enabling of ADHD.
02:51:42.420 | It's like, you're constantly looking for new things,
02:51:45.180 | you're constantly seeking that dopamine hit,
02:51:47.420 | and that's great, you know?
02:51:50.460 | And they tolerate your being late for things.
02:51:53.860 | Nothing is really, nobody's gonna die if you screw up,
02:51:56.300 | you know, but it's nice.
02:51:57.940 | It's not like being a doctor or something
02:51:59.540 | where you have to be much more responsible and focused.
02:52:03.220 | You can just freely follow your curiosity,
02:52:05.700 | which is just great.
02:52:06.860 | But what I would say is that I'm learning now
02:52:11.900 | about so many things,
02:52:14.460 | like about how to structure my activities more,
02:52:18.700 | and basically say, "Okay, if I'm going to be,"
02:52:23.460 | email's like the big one that kills me right now.
02:52:25.860 | I'm just constantly shifting between email and my activities.
02:52:29.980 | And what happens is that I don't actually get to the email.
02:52:33.020 | I just look at my email and I get stressed,
02:52:34.860 | 'cause I'm like, "Oh, I have to think about this.
02:52:36.900 | "Let me get back to it."
02:52:38.140 | And I go back to something else,
02:52:39.340 | and so I've just got fragmentary memories of everything.
02:52:43.180 | So what I'm trying to do is set aside a time
02:52:45.260 | where I'm like, "This is my email time.
02:52:46.620 | "This is my writing time.
02:52:49.580 | "This is my goofing off time."
02:52:52.140 | And so blocking these things off,
02:52:54.420 | you give yourself the goofing off time.
02:52:56.140 | Sometimes I do that, and sometimes I have to be flexible
02:52:59.420 | and go like, "Okay, I'm definitely not focusing.
02:53:01.660 | "I'm going to give myself the downtime."
02:53:03.980 | And it's an investment.
02:53:05.140 | It's not like wasting time.
02:53:06.620 | It's an investment in my attention later on.
02:53:10.300 | - And I'm very much with Cal Newport on this.
02:53:13.140 | He wrote "Deep Work" and a lot of other amazing books.
02:53:16.100 | He talks about task switching
02:53:18.580 | as the thing that really destroys productivity.
02:53:22.380 | So switching, it doesn't even matter from what to what,
02:53:27.060 | but checking social media, checking email,
02:53:29.380 | maybe switching to a phone call,
02:53:31.260 | and then doing work, and then switching.
02:53:32.780 | Even switching between, if you're reading a paper,
02:53:35.280 | switching from paper to paper to paper,
02:53:38.900 | because curiosity and whatever the dopamine hit
02:53:43.700 | from the attention switch, limiting that,
02:53:46.300 | 'cause otherwise your brain is just not capable
02:53:48.700 | to really load it in and really do that deep deliberation.
02:53:53.700 | I think that's required to remember things
02:53:58.940 | and to really think through things.
02:54:00.900 | - Yeah, I mean, you probably see this, I imagine,
02:54:03.780 | in AI conferences, but definitely
02:54:05.820 | in neuroscience conferences, it's now the norm
02:54:08.860 | that people have their laptops out during talks.
02:54:12.260 | And conceivably, they're writing notes.
02:54:15.820 | But in fact, what often happens if you look at people,
02:54:18.700 | and we can speak from a little bit of personal experience,
02:54:22.060 | is you're checking email, or I'm working on my own talk,
02:54:27.060 | but often it's like you're doing things
02:54:28.980 | that are not paying attention.
02:54:30.580 | I have this illusion, well, I'm paying attention,
02:54:32.660 | and then I'm going back.
02:54:33.940 | And then what happens is I don't remember anything
02:54:36.620 | from that day, it just kind of vanished.
02:54:38.460 | Because what happens is I'm creating
02:54:40.580 | all these artificial event boundaries.
02:54:42.740 | I'm losing all this executive function.
02:54:45.740 | Every time I switch, I'm getting a few seconds slower,
02:54:50.100 | and I'm catching up mentally to what's happening.
02:54:53.300 | And so instead of being in a model
02:54:54.780 | where you're meaningfully integrating everything
02:54:57.020 | and predicting and generating this kind of rich model,
02:55:00.580 | I'm just catching up, you know?
02:55:02.860 | And so yeah, there's great research
02:55:05.020 | by Melina Uncapher and Anthony Wagner on multitasking
02:55:08.260 | that people can look up that talks about
02:55:10.700 | just how bad it is for memory,
02:55:12.300 | and it's becoming worse and worse of a problem.
02:55:15.980 | - So you're a musician, take me through,
02:55:19.420 | how'd you get into music?
02:55:20.540 | What made you first fall in love with music,
02:55:23.020 | with creating music?
02:55:24.740 | - I, yeah, so I started playing music
02:55:27.260 | just when I was doing trumpet in school for a school band,
02:55:31.740 | and I would just read music and play,
02:55:33.580 | and it was pretty decent at it,
02:55:36.180 | not great, but it was decent.
02:55:37.700 | - How'd you go from trumpet to--
02:55:40.020 | - Guitar.
02:55:40.860 | - To guitar, especially the kind of music you're into?
02:55:43.420 | - Yeah, so basically in high school,
02:55:46.060 | yeah, so I kind of was a late bloomer to music,
02:55:50.060 | but just kind of MTV grew up with me.
02:55:53.420 | I grew up with MTV,
02:55:54.940 | and so then you started seeing all this stuff,
02:55:57.020 | and then I got into, metal was kind of like my early genre,
02:56:01.100 | and I always reacted to just things that were loud
02:56:04.100 | and had a beat like, you know, ADHD, right?
02:56:07.540 | It's like, you know, everything from Sgt. Pepper's
02:56:11.580 | by the Beatles to Led Zeppelin II, my dad had both,
02:56:16.540 | my parents had both those albums,
02:56:18.100 | so I listened to them a lot,
02:56:19.620 | and then like The Police, Ghost in the Machine,
02:56:23.620 | but then I got into metal, Def Leppard,
02:56:25.540 | and, you know, AC/DC, Metallica,
02:56:29.420 | went way down the rabbit hole of speed metal,
02:56:32.120 | and that time was kind of like,
02:56:35.700 | oh, why don't I play guitar?
02:56:37.380 | I can do this, and I had friends who were doing that,
02:56:40.260 | and I just never got it.
02:56:42.340 | Like, I took lessons and stuff like that,
02:56:45.080 | but it was different because when I was doing trumpet,
02:56:47.900 | I was reading sheet music,
02:56:49.260 | and this was like, I was learning by looking,
02:56:51.940 | there's a thing called tablature, you know this,
02:56:53.700 | where it's like you see a drawing of the fretboard
02:56:57.060 | with numbers, and that's where you're supposed to put your fingers,
02:56:58.860 | it's kind of like paint by numbers, right?
02:57:00.900 | And so, I learned it in a completely different way,
02:57:05.440 | but I was still terrible at it,
02:57:07.740 | and I didn't get it, it's actually taken me a long time
02:57:11.620 | to understand exactly what the issue was,
02:57:13.420 | but it wasn't until I really got into punk,
02:57:16.260 | and I saw bands like, I saw Sonic Youth,
02:57:18.620 | I remember especially, and it just blew my mind,
02:57:21.720 | because they violated the rules
02:57:24.180 | of what I thought music was supposed to be.
02:57:25.780 | I was like, this doesn't sound right,
02:57:28.360 | these are not power chords,
02:57:30.540 | and this doesn't just have like a shouty verse
02:57:33.580 | and then a chorus part,
02:57:35.540 | this is just like weird, and then it occurred to me,
02:57:39.940 | you don't have to write music
02:57:42.780 | the way people tell you it's supposed to sound.
02:57:46.160 | It just opened up everything for me,
02:57:47.460 | and I was playing in a band,
02:57:48.820 | and I was struggling with writing music,
02:57:51.460 | because I would try to write like,
02:57:53.580 | you know, whatever was popular at the time,
02:57:56.700 | or whatever sounded like other bands
02:57:58.740 | that I was listening to,
02:58:00.060 | and somehow I kind of morphed into just like,
02:58:03.380 | just grabbing a guitar and just doing stuff,
02:58:06.660 | and I realized that part of my problem
02:58:08.460 | with doing music before was,
02:58:10.780 | I didn't enjoy trying to play stuff
02:58:12.820 | that other people played,
02:58:13.780 | I just enjoyed music just dripping out of me,
02:58:16.460 | and just, you know, spilling out and just doing stuff,
02:58:19.820 | and so then I started to say,
02:58:21.580 | what if I don't play a chord?
02:58:23.540 | What if I just play like notes that shouldn't go together,
02:58:26.700 | and just mess around with stuff?
02:58:28.660 | Then I said, well, what if I don't do four beats,
02:58:30.900 | go na-na-na-na, one-two-three-four,
02:58:32.620 | one-two-three-four, one-two-three-four,
02:58:34.380 | what if I go, one-two-three-four-five,
02:58:36.180 | one-two-three-four-five,
02:58:37.780 | and started messing around with time signatures.
02:58:39.620 | Then I was playing in this band with a great musician,
02:58:43.700 | Brent Ritzel,
02:58:46.740 | and he taught me about arranging songs,
02:58:49.180 | and it was like, what if we take this part,
02:58:51.220 | and instead of make it go like back and forth,
02:58:53.820 | we make it like a circle,
02:58:55.180 | or what if we make it like a straight line,
02:58:57.780 | you know, or a zigzag, you know,
02:58:59.420 | just make it like non-linear in these interesting ways.
02:59:03.340 | And then the next thing you know,
02:59:04.300 | it's like the whole world sort of opens up.
02:59:08.100 | And then what I started to realize,
02:59:09.580 | especially, so you could appreciate this as a musician,
02:59:11.940 | I think, so time signatures, right?
02:59:14.140 | So we are so brainwashed to think in 4/4, right?
02:59:17.900 | Every rock song you could think of almost is in 4/4.
02:59:20.900 | I know you're a Floyd fan,
02:59:22.620 | so think of "Money" by Pink Floyd, right?
02:59:24.700 | - Yeah.
02:59:25.740 | - ♪ Ba-na-na bum bum bum bum bum ♪
02:59:29.220 | - Yeah.
02:59:30.060 | - You feel like it's in 4/4 because it resolves itself,
02:59:33.460 | but basically it resolves
02:59:37.060 | on the first note of the next measure.
02:59:40.780 | So it's got seven beats instead of eight
02:59:43.060 | where the riff is actually happening.
02:59:44.860 | - Interesting.
02:59:45.700 | - But you're thinking in four,
02:59:47.700 | because that's how we're used to thinking.
02:59:49.820 | So the music flows a little bit faster
02:59:52.820 | than it's supposed to,
02:59:54.700 | and you're getting a little bit of prediction error
02:59:56.940 | every time this is happening.
02:59:59.300 | And once I got used to that, I was like,
03:00:01.500 | I hate writing in 4/4
03:00:03.180 | because it was like, everything just feels better
03:00:05.380 | if I do it in 7/4 or if I alternate between 4 and 3
03:00:08.740 | and doing all this stuff.
03:00:10.580 | And then it's like, you just,
03:00:12.500 | jazz music is like that, you know,
03:00:14.220 | they just do so much interesting stuff with this.
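The "Money" example comes down to simple beat arithmetic: the riff cycles every seven beats while a listener habituated to 4/4 keeps predicting downbeats every eight, so the expected and actual downbeats drift apart and only realign after 56 beats. Below is a minimal Python sketch of that mismatch, assuming an idealized listener who counts strictly in eights; the numbers are illustrative and not something discussed in the conversation.

```python
# Prediction-error toy model: a listener expects a new phrase every 8 beats
# (two bars of 4/4), but the riff actually repeats every 7 beats (one bar of 7/4).
expected_cycle = 8  # listener's habitual count
actual_cycle = 7    # the riff's real length in beats

# 56 = lcm(7, 8): the two cycles only line up again after 56 beats.
for beat in range(56):
    predicted_downbeat = (beat % expected_cycle == 0)
    actual_downbeat = (beat % actual_cycle == 0)
    if predicted_downbeat != actual_downbeat:
        print(f"beat {beat:2d}: expectation and riff disagree -> small prediction error")
```

On this toy reading, each disagreement is a small "surprise" that keeps the riff feeling like it moves a beat faster than expected, which is the prediction-error effect described above.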
03:00:17.100 | - So playing with those time signatures
03:00:18.580 | allows you to really break it all open and just,
03:00:22.020 | I guess there's something about that
03:00:23.540 | where it allows you to actually have fun.
03:00:25.620 | - Yeah, yeah.
03:00:26.460 | And it's like, so actually,
03:00:28.940 | one of the genres we used to play in was math rock,
03:00:33.620 | is what they called it.
03:00:34.460 | It was just like, there's just so many weird time signatures.
03:00:36.460 | - What is math?
03:00:37.300 | Oh, interesting.
03:00:38.140 | - Yeah, so.
03:00:39.180 | - So what's the math part of rock,
03:00:42.900 | the mathematical disturbances of it or what?
03:00:45.220 | - Yeah, I guess it would be like,
03:00:46.260 | so instead of, you might go like,
03:00:47.820 | instead of playing four beats in every measure,
03:00:50.380 | na-na-na-na-na-na-na-na, you go,
03:00:52.780 | na-na-na-na-na-na-na-na-na-na-na-na-na-na,
03:00:55.540 | you know, and just do these things.
03:00:57.340 | And then you might arrange it in weird ways
03:00:59.620 | so that there might be three measures of verse
03:01:03.300 | and then one, you know, and then five measures of chorus
03:01:07.260 | and then two measures.
03:01:08.180 | So you could just mess around with everything, right?
03:01:10.020 | - What does that feel like to listen to?
03:01:12.580 | There's something about symmetry
03:01:14.580 | or like patterns that feel good
03:01:18.420 | and like relaxing for us or whatever.
03:01:20.500 | It feels like home.
03:01:21.580 | And disturbing, that can be quite disturbing.
03:01:24.220 | - Yeah.
03:01:25.060 | - So is that, is that the feeling you would have
03:01:27.300 | if you keep messing with it, if you were doing math rock?
03:01:29.900 | I mean, it's stressing me out just listening.
03:01:32.700 | - Well, yeah, yeah.
03:01:33.540 | - Learning about it.
03:01:34.380 | - So, I mean, it depends.
03:01:35.460 | So a lot of my style of songwriting
03:01:39.140 | is very much built around repetitive themes,
03:01:43.660 | but messing around with structure
03:01:46.100 | 'cause I'm not a great guitarist technically.
03:01:48.700 | And so I don't play complicated stuff.
03:01:50.820 | And there's things that you can hear stuff
03:01:52.420 | where it's just like so complicated, you know?
03:01:54.780 | But often what I find is, it's like having a melody
03:01:59.620 | and then adding some dissonance to it, just enough.
03:02:03.380 | And then adding some complexity
03:02:05.420 | that gets you going just enough.
03:02:08.180 | But I have a high tolerance
03:02:09.580 | for that kind of dissonance and prediction error.
03:02:12.180 | And I think I have a theory, a pet theory
03:02:14.420 | that it's like basically you can explain
03:02:15.980 | most of human behavior as some people are lumpers
03:02:18.980 | and some people are splitters, you know?
03:02:21.140 | And so it's like some people are very kind of excited
03:02:24.860 | when they get this dissonance
03:02:26.180 | and they want to like go with it.
03:02:27.620 | And some people are just like, no, I want to lump every,
03:02:29.980 | you know, I don't know,
03:02:30.820 | maybe that's even a different thing.
03:02:32.140 | But it's like, basically it's like,
03:02:33.700 | I think some people get scared of that discomfort.
03:02:37.380 | And I really gravitate towards it, you know?
03:02:41.020 | - I love it.
03:02:42.740 | What's the name of your band now?
03:02:44.660 | - The cover band I play in is a band called Pavlov's Dogs.
03:02:48.420 | And so it's a band, unsurprisingly,
03:02:53.420 | of mostly memory researchers, neuroscientists.
03:02:55.900 | - I love this.
03:02:56.740 | I love this so much.
03:02:57.940 | - Actually, one of your MIT colleagues,
03:02:59.620 | Earl Miller, plays bass.
03:03:01.100 | - Plays bass?
03:03:01.940 | So you play, like, do you play rhythm or lead?
03:03:03.980 | - You could compete if you want.
03:03:05.300 | Maybe we could audition you.
03:03:06.300 | - For audition?
03:03:07.140 | Oh, yeah.
03:03:08.140 | I'm coming for you, Earl.
03:03:09.220 | (laughing)
03:03:10.580 | - Earl's gonna kill me.
03:03:13.540 | He's, like, very precise, though.
03:03:15.060 | - I'll play triangle or something.
03:03:16.540 | (laughing)
03:03:18.460 | Or the cowbell, yeah.
03:03:20.300 | I'll be the cowbell guy.
03:03:22.860 | What kind of songs do you guys do?
03:03:24.500 | - So it's mostly late '70s punk
03:03:28.180 | and '80s new wave and post-punk.
03:03:31.580 | Blondie, Ramones, Clash.
03:03:35.820 | I sing Age of Consent by New Order
03:03:39.060 | and Love Will Tear Us Apart.
03:03:40.340 | - You said you have a female singer now?
03:03:41.940 | - Yeah, yeah, yeah, Carrie Hoffman
03:03:43.660 | and also Paula Croxon.
03:03:47.020 | And so they do, yeah, so Carrie does Blondie amazingly well.
03:03:52.020 | And we do, like, Gigantic by the Pixies.
03:03:55.220 | Paula does that one.
03:03:56.860 | - Which song do you love to play the most?
03:03:58.660 | What kind of song is super fun for you?
03:04:00.620 | - Of someone else's?
03:04:02.940 | - Yeah, cover, yeah.
03:04:04.460 | - Cover, okay.
03:04:05.300 | And it's one we do with Pavlov's Dogs.
03:04:08.940 | I really enjoy playing I Wanna Be Your Dog
03:04:12.540 | by Iggy and the Stooges.
03:04:14.020 | - Yeah, it's a good song.
03:04:14.860 | - Which is perfect 'cause we're Pavlov's Dogs.
03:04:17.180 | And Pavlov, of course,
03:04:18.180 | like basically created learning theory.
03:04:20.300 | So, you know, there's that.
03:04:22.020 | But also it's like,
03:04:22.860 | but I mean, Iggy and the Stooges, that song,
03:04:25.820 | so I play and sing on it,
03:04:27.060 | but it's just like, it devolves into total noise.
03:04:29.460 | And I just like fall on the floor and generate feedback.
03:04:33.580 | I've like, I think in the last version,
03:04:35.900 | it might've been that or a Velvet Underground cover
03:04:37.780 | in our last show, I actually,
03:04:39.580 | I have a guitar made of aluminum that I got made.
03:04:42.660 | And I thought this thing's indestructible.
03:04:44.700 | And so I kind of like was just, you know,
03:04:47.180 | moving it around,
03:04:48.100 | had it upside down and all this stuff to generate feedback.
03:04:51.020 | And I think I broke one of the,
03:04:52.220 | I broke one of the tuning pegs.
03:04:54.140 | - Oh, wow.
03:04:54.980 | - Yeah, so I've managed to break an all metal guitar.
03:04:58.740 | Go figure.
03:04:59.620 | - A bit of a big, ridiculous question,
03:05:02.180 | but let me ask you,
03:05:03.140 | we've been talking about neuroscience in general.
03:05:06.420 | What do you,
03:05:07.980 | you've been studying the human mind for a long time.
03:05:11.500 | What do you love most about the human mind?
03:05:15.140 | Like when you look at it,
03:05:17.180 | whether you look at the fMRI,
03:05:18.420 | just the scans and the behavioral stuff,
03:05:21.060 | the electrodes, the psychology aspect,
03:05:24.380 | reading the literature on the biology side,
03:05:26.540 | neurobiology, all of it.
03:05:28.060 | When you look at it, what is most beautiful to you?
03:05:33.060 | - I think the most beautiful,
03:05:35.380 | but incredibly hard to put your finger on
03:05:39.900 | is this idea of the internal model.
03:05:43.060 | That it's like, there's everything you see,
03:05:45.420 | and there's everything you hear and touch and taste,
03:05:47.940 | you know, every breath you take, whatever.
03:05:50.660 | But it's all connected by this like dark energy
03:05:55.660 | that's holding that whole universe of your mind together.
03:06:00.900 | And without that, it's just a bunch of stuff.
03:06:04.020 | And somehow we put that together
03:06:06.500 | and it forms so much of our experience.
03:06:11.500 | And being able to figure out where that comes from
03:06:14.660 | and how things are connected to me is just amazing.
03:06:19.540 | But just this idea of like the world in front of us,
03:06:22.820 | we're only sampling this little bit
03:06:24.460 | and trying to take so much meaning from it.
03:06:26.940 | And we do a really good job, not perfect, I mean, you know.
03:06:30.660 | But that ability to me is just amazing.
03:06:34.020 | - Yeah, it's an incredible mystery, all of it.
03:06:36.100 | It's funny you said dark energy,
03:06:37.380 | 'cause the same in astrophysics.
03:06:39.700 | You look out there, you look at dark matter and dark energy,
03:06:43.020 | which is this loose term assigned to a thing
03:06:46.220 | we don't understand, which helps make the equations work
03:06:50.780 | in terms of gravity and the expansion of the universe.
03:06:53.100 | In the same way, it seems like there's that kind of thing
03:06:56.020 | in the human mind that we're like striving to understand.
03:06:59.500 | - Yeah, yeah, you know, it's funny that you mentioned that.
03:07:01.540 | So one of the reasons I wrote the book, amongst many,
03:07:03.900 | is that I really felt like people needed to hear
03:07:06.660 | from scientists, and COVID was just a great example of this,
03:07:10.900 | because people weren't hearing from scientists.
03:07:13.860 | One of the things I think that people didn't get
03:07:16.300 | was the uncertainty of science and how much we don't know.
03:07:20.580 | And I think every scientist lives
03:07:22.980 | in this world of uncertainty.
03:07:24.700 | And when I was writing the book,
03:07:28.900 | I just became aware of all of these things we don't know.
03:07:31.900 | And so I think of physics a lot.
03:07:33.980 | I think of this idea that the overwhelming majority
03:07:38.460 | of the stuff that's in our universe
03:07:40.580 | cannot be directly measured.
03:07:42.660 | I used to think, ha-ha, I hate physics,
03:07:44.580 | so physicists get the Nobel Prize
03:07:47.380 | for doing whatever stupid thing,
03:07:48.940 | 'cause it's like there's 10 physicists out there.
03:07:50.820 | I'm just kidding.
03:07:51.820 | Just being disingenuous. - Strong words.
03:07:53.540 | - Yeah, no, no, no, I'm kidding.
03:07:55.300 | The physicists who do neuroscience
03:07:56.980 | could be rather opinionated,
03:07:58.340 | so sometimes I like to dish on that.
03:07:59.740 | - It's all love.
03:08:00.700 | - It's all love, that's right.
03:08:01.980 | This is the ADHD talking.
03:08:04.540 | But at some point I had this aha moment
03:08:09.380 | where I was like, to be aware of that much
03:08:14.340 | that we don't know and have a bead on it
03:08:17.700 | and be able to go towards it,
03:08:19.500 | that's one of the biggest scientific successes
03:08:22.380 | that I could think of.
03:08:23.940 | You are aware that you don't know
03:08:26.140 | about this gigantic section,
03:08:28.500 | overwhelming majority of the universe, right?
03:08:31.260 | And I think the more,
03:08:33.260 | what keeps me going to some extent
03:08:36.460 | is realizing, changing the scope of the problem
03:08:41.180 | and figuring out, oh my God,
03:08:42.340 | there's all these things we don't know,
03:08:44.460 | and I thought I knew this,
03:08:46.460 | 'cause science is all about assumptions, right?
03:08:49.340 | So have you ever read
03:08:50.180 | "The Structure of Scientific Revolutions" by Thomas Kuhn?
03:08:53.300 | - Yes.
03:08:54.140 | - That's my only philosophy, really, that I've read.
03:08:57.260 | But it's so brilliant in the way
03:08:59.740 | that he frames this idea of assumptions
03:09:03.780 | being core to the scientific process,
03:09:06.340 | and the paradigm shift
03:09:07.340 | comes from changing those assumptions.
03:09:10.020 | And this idea of finding out this whole zone
03:09:13.740 | of what you don't know, to me,
03:09:15.900 | is the exciting part, you know?
03:09:18.300 | - Well, you are a great scientist
03:09:20.940 | and you wrote an incredible book,
03:09:23.660 | so thank you for doing that.
03:09:25.420 | And thank you for talking today.
03:09:27.180 | You've decreased the amount of uncertainty I have
03:09:32.100 | just a tiny little bit today
03:09:35.380 | and revealed the beauty of memory.
03:09:37.220 | This is a fascinating conversation.
03:09:38.660 | Thank you for talking today.
03:09:39.660 | - Oh, thank you.
03:09:40.500 | It's been a blast. (chuckles)
03:09:43.420 | - Thanks for listening to this conversation
03:09:45.260 | with Charan Ranganath.
03:09:46.980 | To support this podcast,
03:09:48.180 | please check out our sponsors in the description.
03:09:51.140 | And now, let me leave you with some words
03:09:53.140 | from Haruki Murakami.
03:09:55.140 | "Most things are forgotten over time.
03:09:59.460 | "Even the war itself,
03:10:01.220 | "the life and death struggle people went through
03:10:03.900 | "is now like something from the distant past.
03:10:06.980 | "We're so caught up in our everyday lives
03:10:08.940 | "that events of the past are no longer in orbit
03:10:11.060 | "around our minds.
03:10:12.780 | "There are just too many things
03:10:15.060 | "we have to think about every day,
03:10:16.900 | "too many new things we have to learn.
03:10:19.460 | "But still, no matter how much time passes,
03:10:22.980 | "no matter what takes place in the interim,
03:10:26.020 | "there are some things we can never assign to oblivion,
03:10:29.780 | "memories we can never rub away.
03:10:32.460 | "They remain with us forever like a touchstone."
03:10:36.260 | Thank you for listening.
03:10:38.740 | I hope to see you next time.
03:10:40.700 | (upbeat music)