Charan Ranganath: Human Memory, Imagination, Deja Vu, and False Memories | Lex Fridman Podcast #430
Chapters
0:00 Introduction
1:03 Experiencing self vs remembering self
14:44 Creating memories
24:16 Why we forget
31:53 Training memory
42:22 Memory hacks
54:10 Imagination vs memory
63:29 Memory competitions
73:18 Science of memory
88:33 Discoveries
99:37 Deja vu
104:54 False memories
124:59 False confessions
128:45 Heartbreak
136:19 Nature of time
144:00 Brain–computer interface (BCI)
158:04 AI and memory
168:18 ADHD
175:15 Music
185:00 Human mind
00:00:00.000 |
the act of remembering can change the memory. 00:00:05.240 |
and then I tell you something about the event, 00:00:11.020 |
you might remember some original information from the event 00:00:14.360 |
as well as some information about what I told you. 00:00:17.600 |
And sometimes if you're not able to tell the difference, 00:00:22.640 |
gets mixed into the story that you had originally. 00:00:34.820 |
- The following is a conversation with Charan Ranganath, 00:00:40.240 |
a psychologist and neuroscientist at UC Davis, 00:00:49.640 |
unlocking memory's power to hold on to what matters. 00:00:59.080 |
And now, dear friends, here's Charan Ranganath. 00:01:02.760 |
Danny Kahneman describes the experiencing self 00:01:16.640 |
but rather from what you remember of the experience. 00:01:19.240 |
So can you speak to this interesting difference 00:01:23.920 |
of the experiencing self and the remembering self? 00:01:32.360 |
long before he won the Nobel Prize or anything, 00:01:40.800 |
I got into it because it's so much about memory, 00:01:46.200 |
So we're right now having this experience, right? 00:01:49.040 |
And people can watch it, presumably on YouTube, 00:01:55.720 |
you could probably describe this whole thing in 10 minutes, 00:01:58.640 |
but that's going to miss a lot of what actually happened. 00:02:02.820 |
And so the idea there is that the way we remember things 00:02:11.400 |
And it tends to be biased by the beginning and the end, 00:02:16.200 |
but there's also the best parts, the worst parts, et cetera. 00:02:39.640 |
- So in the way we construct a narrative about our past, 00:02:43.120 |
you say that it gives us an illusion of stability. 00:02:50.880 |
- Basically, I think that a lot of learning in the brain 00:03:08.940 |
And so what our brains are really optimized for 00:03:14.660 |
that's going to be most useful in understanding the present 00:03:19.960 |
And so cause-effect relationships, for instance, 00:03:26.320 |
in the sense that you could, in the next 10 minutes, 00:03:29.480 |
pull a knife on me and slit my throat, right? 00:03:32.280 |
- Exactly, but having seen some of your work, 00:03:36.080 |
I just generally, my expectations about life, 00:03:41.160 |
I have a certainty that everything's gonna be fine 00:03:43.160 |
and we're gonna have a great time talking today, right? 00:03:47.360 |
It's like, okay, so I go to see a band on stage. 00:03:56.900 |
there's a very good chance there's gonna be an encore. 00:04:01.820 |
before I've even walked into the show, right? 00:04:04.080 |
There's gonna be people holding up their camera phones, 00:04:10.880 |
So that's like everyday fortune telling that we do, though. 00:04:17.200 |
And it's amazing that we have this capability 00:04:24.000 |
that we know everything that's about to happen. 00:04:26.520 |
And I think what's valuable about that illusion 00:04:30.640 |
is when it's broken, it gives us the information, right? 00:04:42.080 |
And so those prediction errors that we make based on, 00:04:45.240 |
you know, we make a prediction based on memory 00:04:58.380 |
and just this whole idea of experiencing self 00:05:03.260 |
versus remembering self, I was hoping you can give 00:05:24.460 |
bears its fruits the most when it's remembered 00:05:32.420 |
that we can control, to some degree, how we remember it, 00:05:39.020 |
such that it can maximize the long-term happiness 00:05:52.820 |
- Oh my God, no, I'm gonna open for you, dude. 00:06:01.100 |
Believe me, I did that in Columbus, Ohio once. 00:06:04.880 |
Like, the opening acts, like, drank our bar tab. 00:06:07.820 |
We spent all this money going all the way there. 00:06:17.820 |
we blew, like, our savings on getting a hotel room. 00:06:26.580 |
- When I was in grad school, I played in a band. 00:06:30.580 |
It wasn't like, we were in a hardcore touring band, 00:06:33.400 |
but we did some touring and had some fun times. 00:06:44.660 |
"Henry II, Mask of Sanity," which is a terrible movie. 00:06:54.940 |
- We're gonna have to see, we're gonna have to see it. 00:06:56.900 |
- All right, we'll get you back to life advice. 00:07:00.940 |
- One thing that I try to live by, especially nowadays, 00:07:11.300 |
I think if we go back to the pandemic, right? 00:07:15.540 |
How many people have memories from that period? 00:07:27.740 |
where we were stuck inside looking at screens all day, 00:07:44.060 |
and we went on some vacations, but not very often. 00:07:48.860 |
And I really try to do now vacations to interesting places 00:07:54.540 |
because those are the things that you remember, right? 00:07:58.260 |
So I really do think about what's going to be something 00:08:06.980 |
because the experiencing self will suffer for that, 00:08:13.540 |
- Do things that are very unpleasant in the moment, 00:08:25.500 |
it's a good way to see the silver lining of it. 00:08:29.340 |
- Yeah, I mean, I think it's one of these things 00:08:31.320 |
where if you have people who you've gone through, 00:08:38.560 |
and it's like, that's a bonding experience often, you know? 00:08:45.360 |
I like to say it's like, there's no point in suffering 00:08:59.600 |
at least that's how I remember it, on this paddleboard, 00:09:02.480 |
where just everything that could have gone wrong 00:09:06.760 |
So many mistakes were made and ended up at some point 00:09:33.120 |
I mean, no one's around, it's like it's just you, alone. 00:09:37.120 |
And so I just said, well, failure's not an option. 00:09:39.680 |
And eventually I got out of it and froze and got cut up. 00:09:44.680 |
And I mean, the things that we were going through 00:09:48.440 |
But a short version of this is my wife and my daughter 00:09:53.440 |
and Randy's wife, they gave us all sorts of hell about this 00:09:58.200 |
'cause they were ready to send out a search party. 00:10:04.440 |
And then I started to tell people in my lab about this 00:10:08.240 |
And it just became a better and better story every time. 00:10:11.260 |
And we actually had some photos of just the crazy things 00:10:14.200 |
like this generator that was hanging over the water 00:10:17.080 |
and we're like ducking under these metal gratings 00:10:21.880 |
And it was just nuts, but it became a great story. 00:10:26.360 |
And it was definitely, I mean, Randy and I were already 00:10:28.040 |
tight, but that was a real bonding experience for us. 00:10:33.480 |
that it's like, I don't look back on that enough actually, 00:10:41.800 |
I don't necessarily have the confidence to think 00:10:45.520 |
that I'll be able to get through certain things. 00:10:47.760 |
But my ability to actually get something done in that moment 00:10:52.760 |
is better than I give myself credit for, I think. 00:10:59.440 |
- Well, actually just for me, you're making me realize now 00:11:12.200 |
To me at least, it feels like a motivating thing 00:11:15.960 |
that the darker it gets, the better the story will be 00:11:26.040 |
and they're going through some shit, as we said, 00:11:33.320 |
is that it'll be a hell of a good story when it's all over, 00:11:52.200 |
Is it the kind of narratives that we've constructed 00:11:55.280 |
about the world that are used to make predictions 00:11:58.320 |
that's fundamentally part of the decision-making? 00:12:03.520 |
you and I decided we're gonna go for a beer, right? 00:12:16.960 |
"and they were playing this horrible EDM or whatever." 00:12:19.720 |
And so right there, valuable source of information, right? 00:12:26.240 |
like where you do this counterfactual stuff like, 00:12:30.800 |
"but what if I had gone somewhere else and said, 00:12:36.420 |
So there's all that kind of reasoning that goes into it too. 00:12:47.800 |
about how I got into memory research and you got into AI. 00:12:52.400 |
And it's like we all have these personal reasons 00:12:55.520 |
that guide us in these particular directions. 00:12:57.640 |
And some of it's the environment and random factors in life, 00:13:01.440 |
and some of it is memories of things that we wanna overcome 00:13:05.960 |
or things that we build on in a positive way. 00:13:12.440 |
- And probably the earlier in life the memories happen, 00:13:21.700 |
I mean, I do feel like adolescence is much more important 00:13:33.040 |
but the teenage years are just so important for the brain. 00:13:43.120 |
Now we're thinking of things like schizophrenia 00:13:47.280 |
because it just emerges during that period of adolescence 00:14:00.520 |
It's really, the self is an evolving construct. 00:14:07.540 |
you feel like every decision you make is consequential 00:14:20.220 |
I mean, that's why I think the big part of education, 00:14:29.880 |
But a lot of it is learning how to get along with people 00:14:34.200 |
and learning who you are and how you function. 00:14:41.060 |
even if you have perfect parents working on you. 00:14:48.720 |
that explains why we don't seem to remember anything 00:15:01.220 |
of child development, and so we were talking about this. 00:15:04.060 |
And so there are a bunch of reasons, I would say. 00:15:06.880 |
So one reason is there's an area of the brain 00:15:10.600 |
called the hippocampus, which is very, very important 00:15:21.400 |
And then the next couple years of life after that, 00:15:26.480 |
And the difference is that basically in the lab 00:15:33.360 |
children basically don't have any episodic memories 00:15:41.360 |
and that's why they call it childhood amnesia. 00:16:11.080 |
and you give it like the first couple of patterns 00:16:17.880 |
try to get back those first couple of patterns, right? 00:16:38.120 |
but the roads that you would take to get there 00:16:45.260 |
- The third explanation is a child's sense of self 00:16:55.200 |
as opposed to having this first-person experience 00:17:12.460 |
the first few years of life, infantile amnesia, 00:17:18.540 |
Basically, the error rate that you mentioned, 00:17:21.660 |
when your brain's prediction doesn't match reality, 00:17:25.780 |
the error rate in the first few years of life, 00:17:34.500 |
The collision between your model of the world 00:17:58.080 |
and we're somehow, given how plastic everything is, 00:18:03.100 |
But it's like an insane waterfall of information. 00:18:09.080 |
I wouldn't necessarily describe it as a trauma. 00:18:10.860 |
We can get into this whole stages of life thing, 00:18:18.780 |
a kid's internal model of their body is changing. 00:18:40.540 |
but it's like one of those things that people talk about 00:18:42.420 |
when they talk about the positive aspects of children 00:18:49.900 |
and they have this kind of openness towards the world. 00:19:00.860 |
because it's what they use, they're seeking information. 00:19:20.820 |
and focus everything towards that goal, right? 00:19:24.100 |
The prefrontal cortex takes forever to develop in humans. 00:19:27.920 |
The connections are still being tweaked and reformed 00:19:34.440 |
which is when you tend to see mental illness pop up, right? 00:19:43.080 |
of prime functioning of the prefrontal cortex, 00:19:48.840 |
and you start losing all that frontal function. 00:19:59.320 |
Older adults are worse than young adults at episodic memory. 00:20:01.880 |
And I always would say, God, that's so weird. 00:20:12.480 |
because there's such a culture of optimization right now. 00:20:15.720 |
And it's like, I realized I have to redefine what optimal is 00:20:27.800 |
where you have basically adults saying, okay, 00:20:35.800 |
and I have to hunt and forage and get things done. 00:20:38.560 |
I need a prefrontal cortex so I can stay focused 00:20:47.120 |
I'm kind of wandering around and I've got some safety 00:20:56.560 |
I don't wanna be constrained by goals as much. 00:20:59.000 |
I wanna really be free, play and explore and learn. 00:21:03.080 |
So you don't want a super tight prefrontal cortex. 00:21:05.480 |
You don't even know what the goals should be yet, right? 00:21:07.960 |
It's like, if you're trying to design a model 00:21:10.600 |
that's based on a bad goal, it's not gonna work well, right? 00:21:17.480 |
oh, why don't you have a great prefrontal cortex then? 00:21:20.680 |
But I think, I mean, if you go back and you think 00:21:23.400 |
how many species actually stick around naturally 00:21:27.200 |
long after their childbearing years are over, 00:21:33.320 |
menopause is not all that common in the animal world, right? 00:21:38.760 |
And so I saw Alison Gopnik said something about this 00:21:43.120 |
so I started to look into this, about this idea 00:21:45.680 |
that really when you're older in most societies, 00:21:49.760 |
your job is no longer to form new episodic memories. 00:21:53.640 |
It's to pass on the memories that you already have, 00:21:59.200 |
to pass on that semantic memory to the younger generations, 00:22:07.400 |
They're respected, they're not seen as, you know, 00:22:30.600 |
and it's like another species that has menopause is orcas. 00:22:34.200 |
Orca pods are led by the grandmothers, right? 00:22:37.000 |
So not the young adults, not the parents or whatever, 00:22:41.080 |
And so they're the ones that pass on the traditions 00:22:44.200 |
to the, I guess, the younger generation of orcas. 00:22:47.480 |
And if you look from what little I understand, 00:22:50.360 |
different orca pods have different traditions. 00:23:15.880 |
like when they're a part of this intense social group, 00:23:35.840 |
So in the early days, you don't even know what the goal is, 00:23:42.200 |
and then all the wisdom you collect through that, 00:23:44.520 |
then you share with the others in the system, 00:23:48.240 |
And as a collective, then you kind of converge 00:23:50.960 |
towards greater wisdom throughout the generation. 00:24:25.560 |
What are the different components involved here? 00:24:28.520 |
- So we can think about this on a number of levels. 00:24:30.440 |
Maybe I'll give you the simplest version first, 00:24:34.480 |
as these individual things, and we can just access them, 00:24:43.360 |
is you have this distributed pool of neurons, 00:25:00.320 |
because that competition just wipes things out. 00:25:07.160 |
which we can get into that would promote long-term retention. 00:25:17.640 |
And we need the right cue to be able to activate it, right? 00:25:33.080 |
And in fact, you could extract entirely new memories 00:25:37.200 |
- You have to have the right query, the right prompt 00:25:39.560 |
to access that, whatever the part you're looking for. 00:25:42.280 |
- That's exactly right, that's exactly right. 00:25:44.320 |
And in humans, you have this more complex set 00:25:51.240 |
And then there's these memories for specific events, 00:25:55.480 |
And so there's different pieces of the puzzle 00:26:03.120 |
is just this kind of what we call retrieval failure. 00:26:10.640 |
What's working memory, short-term memory, long-term memory? 00:26:14.680 |
What are the interesting categories of memory? 00:26:25.880 |
And so one of the things that, there's value in that, 00:26:38.760 |
it was a term that was coined by Alan Baddeley. 00:26:48.560 |
and to be able to control the flow of that information, 00:27:00.560 |
there's this ability to kind of passively store information, 00:27:03.840 |
see things in your mind's eye or hear your internal monologue 00:27:07.600 |
but we have that ability to keep information in mind. 00:27:17.120 |
which is identified a lot with the prefrontal cortex. 00:27:19.640 |
It's this ability to control the flow of information 00:27:24.180 |
that's being kept active based on what it is you're doing. 00:27:27.680 |
Now, a lot of my early work was basically saying 00:27:31.440 |
which some memory researchers would call short-term memory, 00:27:34.680 |
is not at all independent from long-term memory. 00:27:38.120 |
That is that a lot of executive function requires learning 00:27:42.120 |
and you have to have synaptic change for that to happen. 00:27:48.360 |
So one of the things I've been getting into lately 00:27:51.600 |
is the idea that we form internal models of events. 00:27:56.600 |
The obvious one that I always use is birthday parties. 00:28:10.160 |
And up till that point where the child blows out the candle, 00:28:12.720 |
you have an internal model in your head of what's going on. 00:28:19.820 |
It's going where the action's about to happen, 00:28:25.480 |
and that's a kind of a working memory product. 00:28:30.820 |
that's allowing you to interpret this world around you. 00:28:41.580 |
And then you'd want to be able to pull out memories 00:28:44.260 |
for specific events that happened in the past, 00:28:56.100 |
and the way we organize information in the present, 00:29:03.520 |
which people typically call long-term memory. 00:29:05.400 |
- So if you have something like a birthday party 00:29:09.880 |
you're gonna load that from disk into working memory, 00:29:13.540 |
this model, and then you're mostly operating on the model. 00:29:16.660 |
And if it's a new task, you don't have a model, 00:29:33.080 |
and I've been working with many other people, 00:29:40.440 |
is this idea that we form these internal models 00:29:44.520 |
at particular points of high prediction error, 00:29:47.180 |
or points of, I believe also points of uncertainty, 00:29:50.560 |
points of surprise or motivationally significant periods. 00:29:54.080 |
And those points are when it's maximally optimal 00:30:01.440 |
oh, well, we're just encoding episodic memories constantly, 00:30:09.680 |
It's just a lot of information that you don't need. 00:30:23.280 |
But if you capture it at the point of maximum uncertainty 00:30:27.480 |
you have the most useful point in your experience 00:30:37.640 |
in generating these internal models of events, 00:30:40.520 |
they show a heightened period of connectivity 00:30:46.720 |
between different events, which we call event boundaries. 00:30:49.500 |
These are the points where you're like surprised 00:30:51.280 |
or you cross from one room to another and so forth. 00:30:59.360 |
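A minimal sketch of that idea of boundary detection: mark an "event boundary" wherever a prediction-error signal crosses a threshold. The signal values and threshold below are made up for illustration, not taken from any study mentioned here.

```python
# Crude event-segmentation sketch: an "event boundary" is flagged whenever a
# prediction-error signal exceeds a threshold. Numbers are made-up examples.
prediction_error = [0.1, 0.2, 0.1, 0.9, 0.2, 0.1, 0.3, 0.8, 0.2]

def find_boundaries(errors, threshold=0.7):
    return [i for i, e in enumerate(errors) if e > threshold]

print(find_boundaries(prediction_error))  # -> [3, 7]
```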
And so if people have a very good internal model 00:31:10.720 |
And so then at these event boundaries, you encode 00:31:17.400 |
Ranganath's now talking about orcas, what's going on? 01:31:19.960 |
And maybe you have to go back and remember reading my book 00:31:22.240 |
to pull out the episodic memory to make sense 00:31:29.140 |
that you can see in the brain of these different networks 00:31:33.080 |
that are coming together and then de-affiliating 00:31:35.760 |
at different points in time that are allowing you 00:31:42.720 |
to some extent when we're talking about semantic memory 00:31:49.600 |
that are unfolding as these networks kind of come together 00:31:57.080 |
This beautiful connected system that you've described, 00:32:06.940 |
- I think improvement, it depends on what your definition 00:32:11.000 |
So what I say in the book is that you don't wanna remember 00:32:14.880 |
more, you wanna remember better, which means focusing 00:32:20.240 |
And that's what our brains are designed to do. 00:32:22.000 |
So if you go back to the earliest quantitative studies 00:32:25.460 |
of memory by Ebbinghaus, what you see is that he was trying 00:32:32.800 |
and within a day he lost about 60% of that information. 00:32:37.160 |
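As a rough sketch of the forgetting curve being described, retention is often summarized as a simple exponential decay; the stability constant below is an arbitrary choice picked only to roughly match the "about 60% lost within a day" figure, not Ebbinghaus's actual data.

```python
import math

def retention(t_hours: float, stability: float = 28.0) -> float:
    """Fraction of material retained after t hours, using a toy exponential
    forgetting curve R = exp(-t / stability). The stability value is an
    arbitrary illustration, not a fitted parameter."""
    return math.exp(-t_hours / stability)

print(f"After 24 hours: {retention(24):.0%} retained")  # roughly 42%
```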
And he was basically using a very, very generous way 00:32:41.400 |
So as far as we know, nobody has managed to violate 00:32:50.220 |
So if your expectation is that you should remember 00:32:54.560 |
you're already off, because this is not what human brains 00:32:58.680 |
On the other hand, what we see over and over again 00:33:01.680 |
is that the brain does, basically, one of the cool things 00:33:05.260 |
about the design of the brain is it's always less is more, 00:33:09.400 |
It's like, I mean, I've seen estimates that the human brain 00:33:12.500 |
uses something like 12 to 20 watts, you know, in a day. 00:33:15.560 |
I mean, that's just nuts, the low power consumption, right? 00:33:18.960 |
So it's all about reusing information and making the most 00:33:28.040 |
what you see biologically is, you know, neuromodulators, 00:33:37.440 |
These are chemicals that are released during moments 00:33:46.560 |
And so these chemicals promote lasting plasticity, right? 00:33:51.560 |
Essentially, some mechanisms by which the brain can say, 00:33:54.240 |
prioritize the information that you carry with you 00:34:00.520 |
our ability to focus our attention on what's important. 00:34:21.080 |
So she works a lot with military, like Navy SEALs and stuff 00:34:23.960 |
to do this kind of work with mindfulness meditation. 00:34:28.560 |
Adam Ghazali, another one of my friends and colleagues, 00:34:30.760 |
has work on kind of training through video games, actually, 00:34:46.560 |
So you tend to, if I'm looking at a video game, 00:34:50.600 |
I can definitely get better at paying attention 00:35:00.760 |
is a fundamental component of remembering something, 00:35:05.360 |
and then attention might be something that you could train, 00:35:13.600 |
- I can say that, in fact, we do in certain ways, right? 00:35:21.840 |
So we did this one study of expertise in the brain. 00:35:27.640 |
let's say if you're a bird expert or something, right, 00:35:37.140 |
and it's all about plasticity of the visual cortex. 00:35:39.400 |
And vision researchers love to say everything's visual, 00:35:42.600 |
but it's like, we did this study of attention 00:35:50.200 |
were the biggest effects as people became experts 00:36:03.460 |
we were actually seeing massive increases in activity 00:36:07.720 |
And this fits with some of the studies of chess experts 00:36:17.000 |
You learn what's important, what's not, right? 00:36:29.920 |
And those are also what you'd call expert memory skills. 00:36:35.440 |
I know that's something we're both interested in. 00:36:37.840 |
So these are people who train in these competitions 00:36:48.280 |
her name I think is pronounced Yanjaa Wintersoul, 00:36:52.300 |
but she, so I think she's got like a giant Instagram 00:36:55.400 |
following and so she had this YouTube video that went viral 00:36:58.800 |
where she had memorized an entire Ikea catalog, right? 00:37:05.120 |
By all accounts from people who become memory athletes, 00:37:08.680 |
they weren't born with some extraordinary memory, 00:37:11.540 |
but they practice strategies over and over and over again. 00:37:20.120 |
and you can just deploy it in an instant, right? 00:37:24.960 |
one strategy for learning the order of a deck of cards 00:37:28.400 |
might not help you for something else that you need, 00:37:30.820 |
like remembering your way around Austin, Texas, 00:37:34.020 |
but it's gonna be these, whatever you're interested in, 00:37:39.400 |
And that's just a natural by-product of expertise. 00:37:46.000 |
that I played with, I don't know if you're familiar 00:37:52.360 |
So another thing I recommend for people a lot 00:38:01.640 |
So I think medical students and students use this a lot 00:38:05.600 |
- Oh yeah, okay, we can come back to this, but yeah, go on. 00:38:07.800 |
- Sure, it's the whole concept of spaced repetition. 00:38:13.340 |
you kind of have to remind yourself of it a lot 00:38:15.880 |
and then over time, you can wait a week, a month, a year 00:38:26.120 |
like note cards that you can have tens of thousands of 00:38:31.440 |
and actually be refreshing all of that information, 00:38:40.040 |
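A toy sketch of the scheduling idea behind those flashcard apps (this is not Anki's actual SM-2 algorithm, just the general shape of spaced repetition): the review interval grows after each successful recall and resets after a failure.

```python
from dataclasses import dataclass

@dataclass
class Card:
    prompt: str
    answer: str
    interval_days: float = 1.0   # wait this long before the next review

def review(card: Card, recalled: bool, growth: float = 2.5) -> float:
    """Toy spaced-repetition update: lengthen the interval after a successful
    recall, reset it to one day after a failure."""
    card.interval_days = card.interval_days * growth if recalled else 1.0
    return card.interval_days

card = Card("hippocampus", "structure critical for episodic memory")
for outcome in [True, True, False, True]:
    days = review(card, outcome)
    print(f"recalled={outcome} -> next review in {days:.1f} days")
```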
that allows you to remember things like the Ikea catalog 00:38:55.680 |
You can walk along that little palace and it reminds you. 00:38:59.680 |
Like there's stuff like that that I think athletes, 00:39:13.140 |
I think it's because when people introduce themselves, 00:39:21.400 |
where I'm like, I know I should be remembering that, 00:39:39.040 |
because when people introduce themselves to me, 00:39:41.760 |
it's almost like I have this just blank blackout 00:39:45.860 |
for a moment and then I'm just looking at them like, 00:39:49.060 |
I look away or something, what's wrong with me? 00:39:54.060 |
The reason why it's hard is that there's no reason 00:40:04.100 |
but most of the time you're associating a name 00:40:09.140 |
And that's a completely arbitrary thing, right? 00:40:11.920 |
I mean, maybe in the olden days, somebody named Miller, 00:40:16.040 |
or something like that, but for the most part, 00:40:19.360 |
it's like these names are just utterly arbitrary. 00:40:25.280 |
And so it's not really a thing that our brain does very well 00:40:31.160 |
So what you need to do is build connections somehow, 00:40:36.800 |
And sometimes it's obvious or sometimes it's not. 00:40:40.880 |
I'm trying to think of a good one for you now, 00:40:43.000 |
but the first thing I think of is Lex Luthor, but- 00:40:50.160 |
- I know he has a shaved head though, or he's bald, 00:40:58.880 |
But if I can come up with something, like I could say, 00:41:01.280 |
okay, so Lex Luthor is this criminal mastermind 00:41:05.000 |
- And we talked about stabbing or whatever earlier. 00:41:08.080 |
- So I'm just kind of connected and that's it. 00:41:16.120 |
I mean, one of the things that I find is if I've, 00:41:20.740 |
that's just totally generic, like John Smith or something, 00:41:26.120 |
but you know, if I see a generic name like that, 00:41:29.680 |
but I've read John Smith's papers academically 00:41:35.480 |
I can immediately associate that name with that face 00:41:47.920 |
is you have a preexisting structure in your head 00:41:50.420 |
of like your childhood home or this mental palace 00:41:55.960 |
And so now you can put arbitrary pieces of information 00:42:00.200 |
in different locations in that mental structure of yours 00:42:04.580 |
and then you could walk through the different path 00:42:07.520 |
and find all the pieces of information you're looking for. 00:42:22.420 |
- We should maybe linger on this memory palace thing 00:42:27.120 |
just to make obvious 'cause when people were describing 00:42:30.040 |
to me a while ago what this is, it seems insane. 00:42:34.080 |
You literally think of a place like a childhood home 00:42:40.000 |
or a home that you're really visually familiar with 00:42:44.160 |
and you literally place in that three-dimensional space 00:42:50.880 |
facts or people or whatever you wanna remember 00:42:54.680 |
and you just walk in your mind along that place visually 00:43:04.480 |
One of the limitations is there is a sequence to it. 00:43:10.040 |
you can't just like go upstairs right away or something. 00:43:15.000 |
So it's really great for remembering sequences 00:43:20.760 |
So the full context of the tour I think is important. 00:43:23.760 |
But it's fascinating how the mind is able to do that 00:43:30.720 |
into something that you remember well already, 00:43:37.200 |
And you can just do that for any kind of sequence. 00:43:39.320 |
I'm sure she used something like this for IKEA catalog. 00:43:50.640 |
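A toy illustration of the structure being described: an ordered walk through a familiar place, with one to-be-remembered item "placed" at each stop, so the route itself carries the sequence. The loci and items below are made up.

```python
# Toy method-of-loci store: the order of locations encodes the order of items.
palace = [
    ("front door",  "pick up dry cleaning"),
    ("hallway",     "call the dentist"),
    ("kitchen",     "buy coffee filters"),
    ("living room", "email the landlord"),
]

def recall_walk(palace):
    """Recall items by mentally walking the route in order."""
    for location, item in palace:
        print(f"At the {location}, I placed: {item}")

recall_walk(palace)
```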
that memories can compete with each other, right? 00:44:01.560 |
it could be cluttered with a zillion different things, right? 00:44:07.080 |
And one of them, I put my bank password on it, right? 00:44:15.360 |
But if it's like hot pink, it's gonna stand out 00:44:19.960 |
And so that's one way in which if things are distinctive, 00:44:23.600 |
if you've processed information in a very distinctive way, 00:44:27.320 |
then you can have a memory that's gonna last. 00:44:37.200 |
you know, that it's like, that you've got very short hair 00:44:41.280 |
with Lex Luthor that way or something like that, right? 00:44:53.080 |
and I have my like reminders, my to-do list in one pile? 00:44:58.880 |
Well, then I know exactly if I'm going for my banking, 00:45:06.040 |
So the method of loci works or memory palaces work 00:45:19.720 |
and, you know, basically this primitive abilities 00:45:24.000 |
And so people explain the method of loci that way. 00:45:41.240 |
and they'll go like, if you're memorizing a deck of cards, 00:45:46.480 |
like the king and the jack and the 10 and so forth. 00:45:50.520 |
And they'll make up a story about things that they're doing 00:45:58.240 |
there was this obscure episode of the TV show "Cheers." 00:46:02.640 |
that he uses to memorize all these facts about Albania. 00:46:40.040 |
should help me retain the information better. 00:46:42.800 |
Now there's an interesting boundary condition, 00:46:45.920 |
which is it depends on when you need that information. 00:46:51.480 |
like I can't remember so much from college and high school 00:46:55.420 |
'cause I just did everything at the last minute. 00:46:57.580 |
And sometimes I would literally study like, you know, 00:47:03.620 |
And that was great because what would happen is, 00:47:08.900 |
And so actually not spacing can really help you 00:47:18.260 |
But on the other hand, if you space things out, 00:47:45.140 |
And so all of these little cues that are in the background, 00:48:00.140 |
and you're at home drinking a beer and you're thinking, 00:48:02.620 |
God, what a strange interview that was, right? 00:48:15.600 |
There's a mismatch between what you pulled up 00:48:21.860 |
is you start to erase or alter the parts of the memory 00:48:25.820 |
that are associated with a specific place in time, 00:48:28.740 |
and you heighten the information about the content. 00:48:37.460 |
it's more accessible at different times and different places 00:48:42.980 |
in an AI kind of way of thinking about things. 00:48:45.660 |
It's not overfitted to one particular context. 00:48:47.940 |
But that's also why the memories that we call upon the most 00:49:01.340 |
And it's a little bit different than semantic memory, 00:49:05.860 |
that we have recalled over and over and over again, 00:49:11.860 |
so it's less and less tied to the original experience. 00:49:22.700 |
you see a place that you haven't been to in a while, 00:49:34.300 |
is that you lose attachment to a particular context, 00:49:47.460 |
- Yeah, but at the same time, it becomes stronger 00:49:50.180 |
in the sense that the content becomes stronger. 00:49:57.100 |
for that generic semantic information type of memory. 00:49:59.940 |
- Yeah, and I think this falls into a category. 00:50:11.260 |
which is, I think, related to the spacing effect, 00:50:16.220 |
So the idea is that if you're trying to learn words, 00:50:22.500 |
and this doesn't have to be words, it could be anything, 00:50:37.860 |
some learning theories anyway, this seems weird, 00:50:40.020 |
why would you do better giving yourself this extra error 00:50:45.340 |
rather than just giving yourself perfect input 00:50:48.540 |
that's a replica of what it is that you're trying to learn? 00:50:51.860 |
And I think the reason is is that you get better retention 00:50:55.260 |
from that error, that mismatch that we talked about, right? 00:51:03.060 |
to what happens with backprop in AI or neural networks. 00:51:10.020 |
here's the bad connections and here's the good connections. 00:51:13.660 |
And so we can keep the parts of this cell assembly 00:51:28.020 |
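As a loose analogy for "the mismatch drives the update" (not a model of the brain or of the specific mechanism being described), here is a minimal error-driven learning rule on a single linear unit; all values are arbitrary.

```python
# Minimal delta-rule sketch: the weight change is proportional to the
# prediction error, so bigger mismatches produce bigger adjustments.
w = 0.2            # initial connection strength
x, target = 1.0, 1.0
lr = 0.5           # learning rate

for step in range(4):
    prediction = w * x
    error = target - prediction     # the mismatch drives the update
    w += lr * error * x
    print(f"step {step}: prediction={prediction:.2f}, error={error:.2f}, w={w:.2f}")
```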
this is a thing that I come back to over and over again, 00:51:34.200 |
if you're constantly pushing yourself to your limit, right? 00:51:45.020 |
- So you should always be stress testing the memory system. 00:51:52.020 |
Even though everyone tells me, oh, my memory's terrible, 00:52:01.020 |
And so what happens is when you test yourself, 00:52:03.800 |
you're like, oh my God, I thought I knew that, but I don't. 00:52:06.740 |
And so it can be demoralizing until you get around that 00:52:11.500 |
and you realize, hey, this is the way that I learned. 00:52:16.660 |
It's like if you're trying to star in a movie 00:52:21.460 |
you don't just sit around reading the script. 00:52:24.700 |
And you're gonna botch those lines from time to time, right? 00:52:34.900 |
And we were randomly talking about soccer, football. 00:52:39.900 |
Somebody I grew up watching, Diego Armando Maradona, 00:52:45.760 |
one of the greatest soccer players of all time. 00:52:47.940 |
And we were talking about him and his career and so on. 00:53:06.200 |
because that was a perfect example of memories. 00:53:19.520 |
I went back to the thing I've done many times in my head, 00:53:22.740 |
visualizing some of the epic runs he had on goal and so on. 00:53:28.100 |
And so I'm, and part of the, also the conversation 00:53:30.740 |
when you're talking to Joe, you're just stressing. 00:53:34.860 |
the attention is allocated in a particular way. 00:53:40.620 |
in which world was Diego Maradona still alive? 00:53:49.100 |
There was a, it's a moment that sticks with me. 00:53:59.980 |
And it's cool, like it shows actually the power 00:54:09.280 |
I don't know if there's a good explanation for that. 00:54:10.980 |
- One of the cool things that I found is this, 00:54:13.980 |
that some people really just revolutionize a field 00:54:18.420 |
by creating a problem that didn't exist before. 00:54:24.500 |
engineering is like solving other people's problems, 00:54:36.980 |
But one of my former mentors, Marsha Johnson, 00:54:47.380 |
And she gets into this idea of how do we tell the difference 00:54:55.140 |
How do we tell, I get some mental experience. 00:54:57.420 |
Where did that mental experience come from, right? 00:55:06.900 |
our mental experience of thinking about something, 00:55:11.340 |
They're both largely constructions in our head. 00:55:18.260 |
And the way that you do it is, I mean it's not perfect, 00:55:27.620 |
and really focusing on the sensory information 00:55:31.680 |
or the place and time and the things that put us back 00:55:39.200 |
you're not gonna have all of that vivid detail 00:55:41.740 |
as you do for something that actually happened. 00:55:47.420 |
But it takes time, it's slow, and it's again, effortful. 00:55:51.260 |
But that's what you need to remember accurately. 00:55:53.700 |
But what's cool, and I think this is what you alluded to 00:55:55.800 |
about how that was an interesting experience, 00:56:03.100 |
I'm just gonna take all this information from memory, 00:56:06.300 |
recombine it in different ways and throw it out there. 00:56:09.220 |
And so for instance, Dan Schachter and Donna Addis 00:56:19.400 |
And this goes back actually to this guy, Frederick Bartlett, 00:56:23.220 |
who was this revolutionary memory researcher. 00:56:25.760 |
Bartlett, he actually rejected the whole idea 00:56:52.160 |
they were filtered through this lens of prior knowledge. 00:56:57.480 |
the beliefs that they had, the things they knew. 00:57:01.880 |
he called remembering an imaginative construction. 00:57:11.220 |
by taking bits and pieces that come up in our heads. 00:57:14.400 |
And likewise, he wrote this beautiful paper on imagination 00:57:17.460 |
saying when we imagine something and create something, 00:57:20.340 |
we're creating it from these specific experiences 00:57:23.060 |
that we've had and combining it with our general knowledge. 00:57:25.660 |
But instead of trying to focus it on being accurate 00:57:36.960 |
I mean, or at least that's one kind of creation. 00:57:39.380 |
- So imagination is fundamentally coupled with memory 00:57:49.140 |
I mean, it's not clear that it is in everyone, 00:57:54.500 |
is some patients who have amnesia, for instance, 00:57:56.860 |
they have brain damage, say, to the hippocampus. 00:58:08.420 |
They find it very difficult to give you a scenario 00:58:16.460 |
But it's not like they can come up with anything 00:58:18.220 |
that's very vivid and creative in that sense. 00:58:21.340 |
And it's partly 'cause when you have amnesia, 00:58:25.200 |
Because to get a very good model of the future, 00:58:28.340 |
it really helps to have episodic memories to draw upon. 00:58:35.100 |
And in fact, one of the most impressive things, 00:58:43.980 |
what they found was there was this big network of the brain 00:58:52.740 |
And if I ask you to pay attention to something, 00:58:55.420 |
it only comes on when you stop paying attention. 00:58:58.020 |
So people, oh, it's just this kind of daydreaming network. 00:59:02.540 |
And I thought, this is just ridiculous research. 00:59:18.540 |
functionally interacting with the hippocampus. 00:59:21.300 |
And so, in fact, some would say the hippocampus 00:59:36.140 |
even things that couldn't really be very plausible, 00:59:42.340 |
they look almost the same as maps of brain activation 00:59:50.540 |
we've broken up this network into various sub-pieces, 00:59:57.660 |
and creating these little Lego blocks out of them. 01:00:03.420 |
to recreate these experiences that you've had, 01:00:06.260 |
but you could also reassemble them into new pieces 01:00:08.620 |
to create a model of an event that hasn't happened yet. 01:00:13.460 |
Our common ground that we're establishing in language 01:00:32.860 |
I take the absurdity of human life as it stands 01:00:37.860 |
and play it forward in all kinds of different directions. 01:00:54.540 |
And I suppose I have to be a little bit careful 01:01:02.300 |
And this also, I mean, some of my best friends 01:01:05.900 |
are characters inside books that never even existed. 01:01:18.300 |
Dostoevsky exists, but also Brothers Karamazov. 01:01:26.820 |
One of the few literature books that I've read, 01:01:29.540 |
I read a lot in school that I don't remember, 01:01:33.380 |
They exist, and I have almost conversations with them. 01:01:46.060 |
- Yeah, there was actually this famous mnemonist. 01:02:00.340 |
And so this guy was named Solomon Shereshevsky, 01:02:06.460 |
that basically created these weird associations 01:02:09.380 |
between different senses that normally wouldn't go together. 01:02:12.860 |
So that gave him this incredibly vivid imagination 01:02:20.820 |
all sorts of things that he would need to memorize, 01:02:25.780 |
just create these incredibly detailed things in his head 01:02:29.100 |
that allowed him to memorize all sorts of stuff. 01:02:32.420 |
But it also really haunted him by some reports 01:02:35.300 |
that basically it was like he was at some point, 01:02:39.100 |
and again, who knows if the drinking was part of this, 01:02:42.980 |
differentiating his imagination from reality, right? 01:02:48.940 |
I mean, that's what psychosis is in some ways, 01:02:52.340 |
is you, first of all, you're just learning connections 01:02:56.940 |
from prediction errors that you probably shouldn't learn, 01:03:00.260 |
and the other part of it is that your internal signals 01:03:17.600 |
- Yeah, maybe they're just two sides of the same coin. 01:03:25.940 |
- I think so, sometimes scary, but mostly fascinating. 01:03:29.820 |
- Can we just talk about memory sport a little longer? 01:03:33.220 |
There's something called the USA Memory Championship. 01:03:38.940 |
What does it mean to be like elite level at this? 01:04:02.860 |
that make them go, "Hey, why don't I do this?" 01:04:08.520 |
who thought that he was getting chemo for cancer. 01:04:18.020 |
there's a well-known thing called chemo brain 01:04:29.760 |
and this is the story you hear in a lot of memory athletes 01:04:59.100 |
you'd be able to bring a lot of prior knowledge. 01:05:07.140 |
I've gotten a chance to work with something called 01:05:09.780 |
n-back tasks, so there's all these kinds of tasks, 01:05:12.620 |
memory recall tasks that are used to kinda load up 01:05:22.700 |
like to see how well you're good at multitasking. 01:05:25.580 |
We used it in particular for the task of driving, 01:05:44.380 |
and they're usually about recalling a sequence of numbers 01:05:50.580 |
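A small sketch of what an n-back check looks like (here 2-back): for each item in the stream, the correct response is "match" if it equals the item presented n steps earlier. The stimulus stream is made up and is not the specific task used in any study mentioned here.

```python
def n_back_targets(stream, n=2):
    """Return the indices where the current item matches the item n steps back."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

stream = [3, 8, 3, 1, 7, 1, 4, 9, 4]
print(n_back_targets(stream, n=2))  # -> [2, 5, 8]
```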
Are you, do you have any favorite tasks of this nature 01:06:00.700 |
and using things that are more and more naturalistic. 01:06:03.980 |
And the reason is is that we've moved in that direction 01:06:18.780 |
And so it goes into a much more predictive mode, 01:06:22.460 |
and you have these event boundaries, for instance, 01:06:32.500 |
a mix of interpretations and imagination with perception. 01:06:41.140 |
is understanding navigation in our memory first places. 01:06:45.180 |
And the reason is is that there's a lot of work 01:06:47.500 |
that's done in rats, which is very good work. 01:06:57.380 |
that fire when a rat is in different places in the box. 01:07:01.260 |
And so the conventional wisdom is that the hippocampus 01:07:11.540 |
when you have absolutely no knowledge of the world, right? 01:07:16.540 |
But I think one of the cool things about human memory 01:07:24.860 |
And so, for instance, if you learn a map of an Ikea, 01:07:32.940 |
I probably could go to this Ikea and find my way 01:07:43.540 |
Even though Ikea is a nightmare to get around, 01:07:45.980 |
once I learn my local Ikea, I can use that map everywhere. 01:07:53.140 |
And so that kind of ability to reuse information 01:07:57.180 |
really comes into play when we look at things 01:08:03.600 |
And another thing that we're really interested in 01:08:07.020 |
is this idea of what if instead of basically mapping out 01:08:17.020 |
that connects basically the major landmarks together 01:08:20.260 |
and being able to use that as emphasizing the things 01:08:24.020 |
that are most important, the places that you go for food 01:08:26.900 |
and the places that are landmarks that help you get around. 01:08:38.420 |
just like our memories for events, are not photographic. 01:08:50.820 |
about this kind of spatial mapping of places? 01:09:13.380 |
And so that's one way in which humans can do it 01:09:18.820 |
There's one way in which we can memorize routes. 01:09:26.380 |
I can just rigidly follow that route back, right? 01:09:31.740 |
which would be what's called a cognitive map, 01:09:55.940 |
And I think that we are actually much more generative 01:10:03.260 |
And we've got a small task as it's right now, 01:10:13.340 |
is these signals called ripples in the hippocampus, 01:10:17.300 |
which are these bursts of activity that you see 01:10:20.540 |
that are synchronized with areas in the neocortex 01:10:28.340 |
seem to increase at navigationally important points 01:10:41.000 |
I could really just get a mental map of the neighborhood 01:10:48.980 |
and here's the directions I take to get in between them. 01:10:51.700 |
And what we found in general in our MRI studies 01:10:54.100 |
is basically the more people can reduce the problem, 01:10:59.660 |
whether it's space or any kind of decision-making problem, 01:11:08.100 |
towards the points of highest information content and value. 01:11:13.100 |
- So can you describe the encoding in the hippocampus 01:11:19.060 |
What's the signal in which we see the ripples? 01:11:27.420 |
So there's these waves that you basically see, 01:11:29.900 |
and these waves are points of very high excitability 01:11:37.660 |
they happen actually during slow wave sleep too. 01:11:43.660 |
You see these very slow waves where it's like very excitable 01:11:47.000 |
and then very unexcitable, it goes up and down. 01:11:53.220 |
And when there's a ripple in the hippocampus, 01:12:02.420 |
when an animal's actually doing something in the world. 01:12:06.100 |
So it almost is like a little, people call it replay. 01:12:09.960 |
I think it's a little bit, I don't like that term, 01:12:11.900 |
but it's basically a little bit of a compressed play 01:12:25.260 |
between the hippocampus and these areas in the neocortex. 01:12:28.220 |
And so that, I think, helps you form new memories, 01:12:32.820 |
but it also helps you, I think, stabilize them, 01:12:35.820 |
but also really connect different things together in memory 01:12:43.020 |
And so this is one of, at least our theories of sleep 01:12:46.620 |
and its real role in helping you see the connections 01:12:49.720 |
between different events that you've experienced. 01:12:52.380 |
- So during sleep is when the connections are formed. 01:12:55.220 |
- The connections between different events, right? 01:12:58.380 |
So it's like, you see me now, you see me next week, 01:13:09.620 |
And with sleep, one of the things that allows you to do 01:13:12.780 |
is figure out those connections and connect the dots 01:13:21.380 |
What is it and how is it used in studying memory? 01:13:28.620 |
When I was in grad school, fMRI was just really taking off 01:13:35.580 |
And what's beautiful about it is you can study 01:13:38.380 |
the whole human brain and there's lots of limits to it, 01:13:50.940 |
is like being in the womb, I just fall asleep. 01:14:03.140 |
like giving them words and so forth to memorize. 01:14:06.460 |
But what MRI is itself is just this technique 01:14:10.180 |
where you put people in a very high magnetic field. 01:14:15.060 |
Typical ones we would use would be three Tesla 01:14:18.860 |
So a three Tesla magnet, you put somebody in, 01:14:31.140 |
which is basically a different electromagnetic field. 01:14:36.060 |
the water molecules in the brain as a tracer, so to speak. 01:14:43.900 |
that these magnetic fields that you mess with 01:14:55.420 |
which change the strength of the magnetic field 01:14:58.900 |
So they're all, we tweak them in different ways, 01:15:07.060 |
And when you have blood that doesn't have oxygen on it, 01:15:10.700 |
it's a little bit more magnetizable than blood that does, 01:15:14.060 |
'cause you have hemoglobin that carries the oxygen, 01:15:16.660 |
the iron, basically, in the blood that makes it red. 01:15:20.100 |
And so that hemoglobin, when it's deoxygenated, 01:15:23.540 |
actually has different magnetic field properties 01:15:52.860 |
- Yeah, we had an off-record intense argument 01:16:02.140 |
- We could've called it a stern rebuke, perhaps. 01:16:08.220 |
It is true, the creator of gifs said it's pronounced jif, 01:16:12.260 |
but that's the only person that pronounces jif. 01:16:19.420 |
- This would be basically a whole movie of fMRI data. 01:16:24.220 |
And so when you look at it, it's not very impressive. 01:16:26.220 |
It looks like these very pixelated maps of the brain, 01:16:31.220 |
But these tiny changes in the intensity of those signals 01:16:35.720 |
that you probably wouldn't be able to visually perceive, 01:16:46.900 |
in some part of the brain when I'm doing some task, 01:16:55.220 |
is a person going to remember this later or not? 01:17:07.260 |
'Cause maybe when I'm remembering this thing, 01:17:09.100 |
like I'm remembering the house where I grew up, 01:17:12.540 |
I might have one pixel that's bright in the hippocampus 01:17:17.020 |
And if I'm remembering something more like the car 01:17:26.620 |
And so all that little stuff that we used to think of noise, 01:17:30.100 |
we can now think of almost like a QR code for memory, 01:17:38.900 |
And so this really revolutionized my research. 01:17:40.980 |
So there's fancy research out there where people really, 01:17:45.300 |
I mean, by your standards, this would be Stone Age, 01:17:51.500 |
And now there's a lot of forward encoding models 01:17:54.260 |
and you can go to town with this stuff, right? 01:17:56.980 |
And I'm much more old school of designing experiments 01:18:14.260 |
And do memories that occurred in different places 01:18:16.860 |
And you can just use things like correlation coefficients 01:18:20.540 |
or cosine distance to measure that stuff, right? 01:18:26.820 |
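A minimal sketch of that kind of pattern-similarity comparison, using made-up voxel activity patterns rather than real fMRI data:

```python
import numpy as np

# Compare two (made-up) voxel activity patterns with a correlation coefficient
# and with cosine similarity, as a stand-in for comparing memories in fMRI.
pattern_a = np.array([0.2, 1.1, 0.7, 0.3, 0.9])
pattern_b = np.array([0.1, 1.0, 0.8, 0.4, 1.0])

correlation = np.corrcoef(pattern_a, pattern_b)[0, 1]
cosine = pattern_a @ pattern_b / (np.linalg.norm(pattern_a) * np.linalg.norm(pattern_b))

print(f"correlation: {correlation:.3f}, cosine similarity: {cosine:.3f}")
```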
to get a whole state space of how a brain area 01:18:32.580 |
And it's super fascinating because what we could see 01:18:37.300 |
between how certain brain areas are processing memory 01:18:43.060 |
are processing information about where it occurred 01:18:50.700 |
what are my goals that are involved and so forth. 01:18:53.620 |
And the hippocampus is just putting it all together 01:19:00.780 |
- So there is a separation between spatial information, 01:19:18.500 |
which would be like you've got a folder on your computer, 01:19:21.420 |
right, and you open it up, there's a bunch of files there. 01:19:24.060 |
I can sort those files by alphabetical order. 01:19:32.180 |
And things that start with Z versus A are far apart, right? 01:19:35.820 |
And so that is one way of organizing the folder, 01:19:41.180 |
things that were created close together in time are close 01:19:44.700 |
and things that are far apart in time are far. 01:19:47.060 |
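A toy version of that folder analogy: the same files sit at different "distances" from each other depending on whether you sort them by name or by creation time. The file names and timestamps are invented.

```python
# Each entry is (file name, creation time). Distance between items depends on
# which organizing dimension you sort by.
files = [("apple.txt", 300), ("zebra.txt", 310), ("mango.txt", 100), ("beta.txt", 900)]

by_name = sorted(name for name, _ in files)
by_time = [name for name, created in sorted(files, key=lambda x: x[1])]

print("sorted by name:", by_name)  # apple and zebra far apart
print("sorted by creation time:", by_time)  # apple and zebra adjacent
```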
So every, like you can think of how a brain area 01:20:02.620 |
And you can do the same thing if you're recording 01:20:04.220 |
from massive populations of neurons in an animal. 01:20:09.140 |
And you can do it for recording local potentials 01:20:12.660 |
in the brain, you know, so little waves of activity 01:20:22.240 |
So that's some of the work that we're doing now. 01:20:24.860 |
But all of these techniques basically allow you to say, 01:20:29.820 |
And so we've found that some networks of the brain 01:20:32.500 |
sort information in memory according to who was there. 01:20:44.260 |
And Zach did the study where we had a bunch of movies 01:20:50.020 |
There were two different people and he filmed them 01:20:52.100 |
at two different cafes and two different supermarkets. 01:20:55.420 |
And what he could show is in one particular network, 01:20:58.380 |
you could find the same kind of pattern of activity 01:21:02.620 |
more or less, a very similar pattern of activity 01:21:05.700 |
every time I saw Alex in one of these movies, 01:21:10.900 |
And I could see another one that was like a common pattern 01:21:16.860 |
this particular supermarket nugget, you know? 01:21:19.880 |
And so, and it didn't matter whether you're watching a movie 01:21:25.180 |
it's the same kind of pattern that comes up, right? 01:21:27.540 |
- That's so fascinating. - It is fascinating. 01:21:31.340 |
for assembling a model of what's happening in the present, 01:21:35.820 |
imagining what could happen and remembering things 01:21:39.180 |
very economically from putting together all these pieces 01:21:46.020 |
for how to put together all these building blocks. 01:21:54.540 |
that makes me wonder on the other side of it, 01:22:03.420 |
Or I guess the symptoms, the results of that encoding 01:22:09.660 |
you mentioned sort of the measuring local potentials 01:22:17.740 |
- What are some interesting like limitations, 01:22:22.920 |
Maybe, the way you explained it is like brilliant 01:22:30.460 |
or the excitation because blood flows to that area. 01:22:40.980 |
how quickly can the task change and all that kind of stuff? 01:22:46.860 |
To the brain, 50 milliseconds is like, you know, 01:22:53.940 |
Maybe not 50, you know, maybe like, you know, 01:23:04.460 |
So in fMRI, you can measure these magnetic field responses 01:23:09.460 |
about six seconds after that burst of activity 01:23:17.580 |
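A sketch of that delay using a common "double-gamma" hemodynamic response function, which peaks several seconds after the underlying burst of neural activity; the parameter choices below are typical defaults, not values from the conversation.

```python
import numpy as np
from scipy.stats import gamma

# Canonical-style double-gamma HRF: an early peak minus a small late undershoot.
t = np.arange(0, 30, 0.1)                       # seconds after neural activity
hrf = gamma.pdf(t, 6) - 0.1 * gamma.pdf(t, 16)  # peak term minus undershoot term

print(f"HRF peaks about {t[np.argmax(hrf)]:.1f} s after the activity")
```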
So one of the interesting things that's been discovered 01:23:25.900 |
So we tend to think of the computation, so to speak, 01:23:36.500 |
But sometimes what you can have is these states 01:23:41.060 |
where the neuron becomes a little bit more excitable 01:23:50.420 |
Actually, one of the fascinating things about fMRI 01:23:54.100 |
is where does that, how is it we go from neural activity 01:23:58.820 |
to essentially blood flow to oxygen, all this stuff. 01:24:03.820 |
It's such a long chain of going from neural activity 01:24:10.380 |
is most of the cells in the brain are not neurons. 01:24:14.620 |
They're actually these support cells called glial cells. 01:24:22.180 |
kind of being a middle man, so to speak, with the neurons. 01:24:25.140 |
So if you, for instance, one neuron's talking to another, 01:24:28.860 |
you release a neurotransmitter, like let's say glutamate, 01:24:36.900 |
in the gap between the two neurons called synapse. 01:24:43.380 |
imagine you're just flooded with this liquid in there. 01:24:50.440 |
and you can start to basically get seizure activity. 01:24:52.400 |
You don't want this, so you gotta suck it up. 01:24:54.800 |
And so actually what happens is these astrocytes, 01:24:57.880 |
one of their functions is to suck up the glutamate 01:25:08.960 |
But that cycling is actually very energy-intensive. 01:25:12.280 |
And what's interesting is at least according to one theory, 01:25:15.440 |
they need to work so quickly that they're working 01:25:21.820 |
without using oxygen, kind of like anaerobic metabolism. 01:25:32.140 |
So what we're really seeing in some ways may be in fMRI, 01:25:44.380 |
of the process of keeping the whole system going. 01:25:52.040 |
So with these astrocytes, even though there's a latency, 01:25:57.040 |
it's pretty reliably coupled to the activations. 01:26:01.120 |
- Oh, well, this gets me to the other part of it. 01:26:05.120 |
if I'm just kind of like, I'm talking to you, 01:26:06.960 |
but I'm kind of paying attention to your cowboy hat, 01:26:13.540 |
What you'd see is that there'd be this little elevation 01:26:21.520 |
which process vision, around that point in space, okay? 01:26:28.960 |
like all of a sudden a light flashed in that part of, 01:26:58.460 |
people who study spikes in neurons would say, 01:27:00.900 |
well, that's terrible, we don't want that, you know? 01:27:09.740 |
But one of the things that we've found in our work 01:27:13.760 |
and when we give people stories to listen to, 01:27:17.320 |
a lot of the action is in the very, very slow stuff. 01:27:20.300 |
It's in, because if you're thinking about like a story, 01:27:26.400 |
or you're listening to the Lex Fridman podcast, right? 01:27:30.680 |
and building this internal model over several seconds, 01:27:36.080 |
when we look at electrical activity in the brain 01:27:38.440 |
because we're interested in this millisecond scale. 01:27:40.320 |
It's almost massive amounts of information, right? 01:27:46.480 |
gives you a little limited window into what's going on. 01:27:59.380 |
because there's, you can't keep the magnetic field stable 01:28:04.760 |
You'll see parts where it's like there's a vein 01:28:12.360 |
There's lots of artifacts and stuff like that, you know. 01:28:17.200 |
If I'm lying down in an MRI scanner, I'm lying down. 01:28:35.880 |
maybe the flavor of discoveries have been done 01:28:39.160 |
throughout the history of the science of memory, 01:29:00.020 |
I was like, oh my God, this is really interesting. 01:29:11.060 |
just showing how much we forget is very important. 01:29:16.900 |
which is our organized knowledge about the world, 01:29:19.500 |
increase our ability to remember information, 01:29:25.900 |
Studies of expertise showing how experts like chess experts 01:29:29.980 |
can memorize so much in such a short amount of time 01:29:50.240 |
And it can also strengthen or weaken other memories 01:29:54.880 |
So just this whole idea of memory as an ecosystem, 01:30:03.440 |
our continuous experience into these discrete events, 01:30:09.140 |
- So the discreteness of our encoding of events? 01:30:13.700 |
there's controversial ideas about this, right? 01:30:18.140 |
and this gets back to just this common experience 01:30:20.900 |
of you walk into the kitchen and you're like, 01:30:23.900 |
And you just end up grabbing some food from the fridge. 01:30:27.260 |
oh, wait a minute, I left my watch in the kitchen. 01:30:34.880 |
of where you are, what you're thinking about. 01:30:47.200 |
to remember what it was that you went there for. 01:31:04.100 |
the activity in the hippocampus at these event boundaries 01:31:16.020 |
I just scan you while you're watching a movie, 01:31:18.840 |
You come out, I give you a test of memory for stories. 01:31:22.380 |
What happens is you find this incredible correlation 01:31:28.420 |
at these singular points in time, these event boundaries, 01:31:36.580 |
So it's marking this ability to encode memories, 01:31:39.500 |
just these little snippets of neural activity. 01:31:55.420 |
Being able to just record from large populations of cells 01:32:05.780 |
because what we can do now is I can take fMRI data 01:32:14.340 |
Let me get fMRI data while you use a joystick 01:32:27.340 |
So now I can take a rat, record from his hippocampus 01:32:37.700 |
and have it move around on a trackball in virtual reality 01:32:50.620 |
to try to figure out where the seizures are coming from. 01:33:08.420 |
and looks at the weaknesses of computer vision models 01:33:34.020 |
And you could basically tie them all together 01:33:37.300 |
by virtue of the state spaces that you're measuring 01:33:40.520 |
in neural activity, in these different formats, 01:33:48.540 |
You could do different kinds of analyses on language 01:34:00.860 |
and you could do analyses on sentiment analyses 01:34:10.260 |
But if you do it right and you do it in a theory-driven way, 01:34:14.460 |
as opposed to just throwing all the data at the wall 01:34:17.740 |
I mean, that to me is just exceptionally powerful. 01:34:30.180 |
construct models that help you find the commonalities 01:34:36.220 |
or like the core thing that makes somebody navigate 01:34:55.860 |
that is where you think it's an anatomical homologue. 01:35:02.980 |
- What are the similarities in humans and mice? 01:35:02.980 |
See, these memory retrieval exercises I'm doing 01:35:31.420 |
- And it's strengthening the visual thing I have of you 01:35:37.100 |
It's just becoming stronger and stronger by the second. 01:35:47.900 |
So, I think, I've got great colleagues who I talk to 01:36:03.380 |
because you could do these genetic studies, for instance, 01:36:06.340 |
where you can manipulate particular groups of neurons. 01:36:17.700 |
just by activating a particular set of neurons 01:36:37.180 |
that are conserved even going back far, far before. 01:36:40.900 |
But let's go back to the mice and humans question, right? 01:36:57.660 |
that's kind of designed to kind of catch death from above. 01:37:05.040 |
in a way that focuses on particular spots in space 01:37:12.540 |
So, that makes us very different in that way. 01:37:15.820 |
We also have all these other structures as social animals 01:37:22.580 |
There's language, there's like, you know, so you name it, 01:37:41.300 |
across longer and longer periods of time, right? 01:37:46.260 |
it's like our history of training data, so to speak, 01:37:56.620 |
And a lab mouse's world is extraordinarily impoverished 01:38:01.780 |
- But still, what can you understand by studying mice? 01:38:03.820 |
I mean, just basic, almost behavioral stuff about memory? 01:38:07.460 |
- Well, yes, but that's very important, right? 01:38:19.700 |
you'd think it's the most simple question, right? 01:38:25.180 |
And understanding how two parts of the brain interact, 01:38:30.180 |
meaning that it's not just one area speaking. 01:38:33.020 |
It's not like, you know, it's not like Twitter, 01:38:51.780 |
but that's gonna be coming largely from model systems 01:38:57.660 |
You can do manipulations, like drug manipulations, 01:39:03.260 |
use viruses and so forth and lasers to turn on circuits 01:39:08.900 |
So I think there's a lot that can be learned from mice. 01:39:11.940 |
There's a lot that can be learned from non-human primates. 01:39:14.740 |
And there's a lot that you need to learn from humans. 01:39:19.380 |
some of the people in the National Institutes of Health 01:39:22.580 |
think you can learn everything from the mouse. 01:39:30.060 |
I'm gonna get my funding from somewhere else. 01:39:34.140 |
- Well, let me ask you some random fascinating questions. 01:39:44.940 |
one of these things I think that some of the surveys 01:39:50.620 |
having a deja vu experience one time or another. 01:39:56.860 |
and most of them say they've experienced deja vu. 01:40:01.420 |
that I've experienced this moment sometime before, 01:40:06.060 |
And actually, there's all sorts of variants of this. 01:40:22.260 |
almost disturbing, intense sense of familiarity. 01:40:26.100 |
So there is a researcher named Wilder Penfield. 01:40:35.260 |
who did a lot of the early characterizations of epilepsy. 01:40:39.460 |
And one of the things he noticed in epilepsy patients, 01:40:42.620 |
some group of them, right before they would get a seizure, 01:40:45.940 |
they would have this intense sense of deja vu. 01:40:49.140 |
So it's this artificial sense of familiarity. 01:40:52.980 |
It's a sense of having a memory that's not there, right? 01:40:57.860 |
And so what was happening was there's electrical activity 01:41:09.740 |
which parts we wanna remove and which parts don't we, 01:41:12.540 |
he would stimulate parts of the temporal lobes of the brain 01:41:15.260 |
and find you could elicit the sense of deja vu. 01:41:21.100 |
just from electrically stimulating some parts. 01:41:23.940 |
Sometimes though, they just have this intense feeling 01:41:39.820 |
What happens is that they're tuning themselves up 01:41:43.860 |
every time you process a similar input, right? 01:41:50.120 |
this kind of a fluent sense that I'm very familiar, 01:41:57.180 |
you're not going to be moving your eyes all over the place 01:41:59.740 |
'cause you kind of have an idea of where everything is. 01:42:01.740 |
And that fluency gives you a sense of like, I'm here. 01:42:06.660 |
and I have this very unfamiliar sense of where I am, right? 01:42:10.580 |
But there's a great set of studies done by Ann Cleary 01:42:15.220 |
where she created these virtual reality environments. 01:42:19.740 |
Imagine you go through a virtual museum, right? 01:42:23.780 |
And then she would put people in virtual reality 01:42:29.460 |
But the map of the two places was exactly the same. 01:42:35.700 |
but they've got same landmarks and the same places, 01:42:39.420 |
but carpeting, colors, theme, everything's different. 01:42:43.000 |
People will often not have any conscious idea 01:42:48.340 |
but they could report this very intense sense of deja vu. 01:42:53.740 |
that's eliciting this kind of a sense of familiarity. 01:43:02.540 |
you get this artificial sense of familiarity that happens. 01:43:08.060 |
and again, this is just one theory amongst many, 01:43:10.660 |
but we think that we get a little bit of that feeling. 01:43:14.340 |
It's not enough to necessarily give you deja vu, 01:43:20.540 |
So it's like if I tell you the word rutabaga, 01:43:24.580 |
your brain's gonna work a little bit harder to catch it 01:43:32.100 |
Your brain's very tuned up to process it efficiently, 01:43:34.500 |
but rutabaga takes a little bit longer and more intense. 01:43:37.300 |
And you can actually see a difference in brain activity 01:43:40.580 |
in areas in the temporal lobe when you hear a word 01:43:42.820 |
just based on how frequent it is in the English language. 01:43:53.060 |
of just learning, doing this error-driven learning 01:43:55.960 |
as we go through life to become better and better and better 01:44:00.860 |
- So I guess deja vu is just this extra elevated, 01:44:06.680 |
firing for this artificial memory as if it's the real memory. 01:44:16.800 |
but I think what may be happening is it's such a, 01:44:20.280 |
it's a partial match to something that we have, 01:44:22.960 |
and it's not enough to trigger that sense of, 01:44:25.580 |
you know, that ability to pull together all the pieces, 01:44:33.360 |
without the recollection of exactly what happened when. 01:44:36.980 |
- But it's also like a spatiotemporal familiarity. 01:44:43.240 |
Like, there's a weird blending of time that happens. 01:44:49.080 |
'cause I think that's a really interesting idea, 01:44:52.420 |
But you also kind of, artificial memory brings to mind 01:45:10.200 |
It's like Johnny Rotten from the Sex Pistols, 01:45:15.440 |
"any more than I believe in false songs," right? 01:45:21.720 |
we have these memories that reflect bits and pieces 01:45:24.420 |
of what happened, as well as our inferences and theories, 01:45:28.240 |
right, so I'm a scientist and I collect data, 01:45:30.920 |
but I use theories to make sense of that data. 01:45:34.880 |
And so a memory is kind of a mix of all these things. 01:45:42.320 |
as false memories, are sometimes little distortions 01:45:47.320 |
where we filled in the blanks, the gaps in our memory, 01:45:52.840 |
but don't actually correspond to what happened, right? 01:45:55.780 |
So if I were to tell you that I'm like, you know, 01:46:05.800 |
worried that they have cancer or something like that, 01:46:08.680 |
and then, you know, they see a doctor and the doctor says, 01:46:11.560 |
"Well, things are very much like you would have expected," 01:46:14.520 |
or like, you know, "what you were afraid of," or something. 01:46:17.280 |
When people remember that, they'll often remember, 01:46:19.200 |
"Well, the doctor told the patient that he had cancer," 01:46:24.560 |
because they're infusing meaning into that story, right? 01:46:29.960 |
But what happens is that sometimes things 01:46:32.000 |
can really get out of hand where people have trouble 01:46:35.800 |
telling the difference in things that they've imagined 01:46:37.960 |
versus things that happen, but also, as I told you, 01:46:41.040 |
the act of remembering can change the memory. 01:46:44.920 |
And so what happens then is you can actually be exposed 01:46:50.160 |
And so Elizabeth Loftus was a real pioneer in this work, 01:46:53.320 |
and there's lots of other work that's been done since. 01:46:56.560 |
But basically, it's like, if you remember some event, 01:47:00.120 |
and then I tell you something about the event, 01:47:12.480 |
And sometimes, if you're not able to tell the difference, 01:47:23.040 |
or you're exposed to some more information somewhere else, 01:47:25.640 |
and eventually, your memory becomes totally detached 01:47:30.000 |
And so sometimes you can have cases where people, 01:47:33.800 |
this is very rare, but you can do it in the lab, too, 01:47:49.820 |
And as they keep trying to remember that event more and more, 01:47:58.220 |
and eventually, they can stitch together a vivid memory 01:48:03.940 |
Because they're not remembering an event that happened, 01:48:11.800 |
and basically putting it together into the wrong story. 01:48:14.920 |
- So it's fascinating, 'cause this could probably happen 01:48:34.480 |
And so, all these kind of foibles of human memory 01:48:37.740 |
get magnified when you start to have social interactions. 01:48:44.280 |
which is basically when misinformation spreads like a virus. 01:48:47.800 |
Like, you remember the same thing that I did, 01:48:51.040 |
but I give you a little bit of wrong information, 01:48:53.400 |
then that becomes part of your story of what happened. 01:48:59.260 |
like I tell you about something I've experienced, 01:49:01.240 |
and you tell me about your experience of the same event, 01:49:04.080 |
it's no longer your memory or my memory, it's our memory. 01:49:14.600 |
the more of a voice they have in shaping that narrative. 01:49:32.080 |
and there were these polls where they would do these, 01:49:40.240 |
But they actually inserted some misinformation 01:49:46.560 |
and maybe it was something about illegitimate children. 01:49:58.600 |
And so, people would end up becoming convinced 01:50:01.240 |
he had these policy things or these personal things 01:50:08.560 |
So, it was a case where, interestingly enough, 01:50:26.720 |
So, it's not just about truth and falsehoods, 01:50:41.400 |
If you just look throughout the 20th century, 01:50:47.600 |
with the Soviet Union, effective propaganda machines 01:50:59.120 |
You could do quite a lot of damage in this way. 01:51:10.440 |
by the propaganda machine can spread faster than others. 01:51:18.280 |
that they are, like, really effective at spreading. 01:51:24.360 |
to where something about the human mind eats it up 01:51:35.600 |
whatever the content of the conspiracy theory is. 01:51:40.000 |
like you remember a thing, I feel like there's a certainty. 01:51:44.040 |
There's a, it emboldens you to, like, say stuff. 01:51:47.640 |
Like, you really, like, it's not just you believe 01:51:57.760 |
you feel like you were there to watch the thing happen. 01:52:00.880 |
- Yeah, I mean, there's so much in what you're saying. 01:52:03.880 |
I mean, one of the things is that people's sense 01:52:07.160 |
of collective identity is very much tied to shared memories. 01:52:16.220 |
we will feel more socially connected with each other, 01:52:20.660 |
They're part of my tribe if I remember the same things 01:52:24.820 |
And you brought up this weaponization of history, 01:52:35.780 |
and you have a goal in mind, you will find stuff in memory 01:52:39.040 |
that aligns with it, and you won't see the parts 01:52:51.340 |
from misinformation, take something even more fascinating, 01:53:01.260 |
but he wrote a book about the collective memory 01:53:05.220 |
He's a Vietnamese immigrant who was flown out 01:53:08.500 |
as after the war was over, and so he went back 01:53:11.460 |
to his family to get their stories about the war, 01:53:22.540 |
having grown up in the US, and I've always heard about it 01:53:25.180 |
as the Vietnam War, but of course they call it 01:53:26.980 |
the American War, 'cause that's what happened. 01:53:44.020 |
And I think the opportunities that we can have in memory 01:53:49.020 |
is if we bring groups together from different perspectives 01:53:58.140 |
I mean, right now you'll hear a lot of just yammering, 01:54:02.620 |
people going blah, blah, blah about free speech, 01:54:04.820 |
but they just wanna listen to themselves, right? 01:54:11.620 |
they were trying to ban 2 Live Crew or, you know, 01:54:14.780 |
just think about, Lenny Bruce got canceled for cursing. 01:54:21.780 |
People don't like to hear things that disagree with them. 01:54:25.020 |
But if you're in a, I mean, you can see two situations 01:54:32.700 |
One situation is you have people who are very dominant 01:54:38.540 |
And they, basically what happens is the group 01:54:53.140 |
I mean, diverse in any way you wanna take it, right? 01:55:08.460 |
Even two people who come from very similar backgrounds, 01:55:10.940 |
if you can appreciate the unique contributions 01:55:20.580 |
I believe, from misinformation in the modern world. 01:55:25.500 |
it requires a certain tolerance for discomfort. 01:55:38.980 |
- And I mean, social media has a lot of opportunity 01:55:46.900 |
that you're talking about where everybody has a voice, 01:55:53.820 |
is there's a natural clustering of people and opinions 01:55:56.100 |
and you just kind of form these kind of bubbles. 01:56:02.380 |
I think that's a technology problem that could be solved. 01:56:09.940 |
with people that have a very different memory, 01:56:16.660 |
to intermix the memories and ways of thinking 01:56:33.220 |
I think a lot of the problems that come up with technology 01:56:38.860 |
as much as the fact that people adapt to the technology 01:56:44.980 |
I mean, one of my fears about AI is not what AI will do, 01:56:52.340 |
It's like pain in the ass to text people, at least for me. 01:56:57.500 |
becomes very spartan and devoid of meaning, right? 01:57:02.580 |
And that's people adapting to the medium, right? 01:57:10.100 |
and you've adapted to that to communicate, right? 01:57:18.940 |
when Google started to introduce autocomplete in emails, 01:57:22.620 |
I started to use it, and about a third of the time, 01:57:35.500 |
And so what happens is it's not that the technology 01:57:41.060 |
as much as it's just going to constrain my language 01:57:45.020 |
because I'm just doing what's being suggested to me. 01:57:48.860 |
And so this is why I say, kind of like my mantra 01:57:52.620 |
for some of what I've learned about everything in memory 01:57:56.660 |
is to diversify your training data, basically, 01:58:01.780 |
So humans have this capability to be so much more creative 01:58:05.900 |
than anything generative AI will put together, 01:58:14.180 |
where people could become much, much less creative 01:58:17.180 |
if they just become more and more resistant to discomfort 01:58:22.180 |
and resistant to exposing themselves to novelty, 01:58:30.140 |
between natural human adaptation of technology 01:58:40.580 |
things that are positive for human behavior. 01:59:00.580 |
And so there, I think technology on that has been, 01:59:12.300 |
than they were 50 years ago or 100 years ago? 01:59:17.420 |
- I think humans in general like to reminisce about the past, 01:59:29.060 |
'cause we, there's this kind of complainy engine 01:59:32.980 |
that just, there's so much pleasure in saying, 01:59:41.180 |
- Exactly, I mean, there's something in humans 01:59:43.860 |
that loves complaining, even about trivial things, 01:59:48.220 |
but complaining about change, complaining about everything. 02:00:06.100 |
I mean, I would argue that maybe, and who knows? 02:00:08.220 |
I don't know this, but I wouldn't be surprised 02:00:10.340 |
if people in hunter-gatherer societies are happier. 02:00:14.220 |
I mean, I wouldn't be surprised if they're happier 02:00:16.220 |
than people who have access to modern medicine 02:00:31.060 |
The question is in terms of every single problem they've had 02:00:36.660 |
There's now food, there is guarantee of survival, 02:00:44.180 |
Do we want to be, oh, Werner Herzog in the movie 02:00:52.660 |
hunting, gathering, surviving, worried about the next day? 02:01:02.780 |
I don't know, but I do know this modern society 02:01:24.060 |
all this kind of stuff that enables the flourishing 02:01:31.540 |
I mean, that's a very deep philosophical question. 02:01:34.180 |
Maybe struggle, deep struggle is necessary for happiness. 02:01:43.300 |
Maybe it's about functioning in social groups 02:01:51.560 |
memory-related thing, which is that if you look 02:01:54.700 |
at things like reinforcement learning, for instance, 02:02:06.320 |
You mainly learn if it deviates from your expectation 02:02:15.140 |
And it's like, you probably don't even get excited 02:02:20.280 |
But if they cut your salary, you're gonna be pissed. 02:02:30.220 |
that basically you learn to expect these things, 02:02:38.120 |
it's a major way in which we're kind of more, 02:02:40.620 |
in my opinion, wired to strive and not be happy, 02:02:46.460 |
And so people talk about dopamine, for instance, 02:02:50.740 |
And it's like, there's a lot of compelling research 02:03:11.600 |
But they're not gonna do anything to get it, you know? 02:03:14.940 |
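A minimal sketch of the prediction-error idea described above, assuming a simple Rescorla-Wagner-style update with made-up reward values and learning rate (illustrative only, not a model from Ranganath's work):

```python
# Minimal Rescorla-Wagner / temporal-difference style update:
# learning is driven by the gap between what you received and what you expected.
def update_expectation(expected, received, learning_rate=0.1):
    prediction_error = received - expected   # the "surprise" signal
    return expected + learning_rate * prediction_error

# A salary paid reliably every month: the prediction error shrinks toward zero,
# so the same reward eventually teaches (and excites) you less and less.
expected = 0.0
for month in range(48):
    expected = update_expectation(expected, received=100.0)
print(round(expected, 1))   # ~99.4: the reward is now almost fully expected

# A pay cut is a large negative prediction error -- the "pissed off" signal --
# even though 80 would have felt great back when expectations were near zero.
print(round(80.0 - expected, 1))   # ~ -19.4: the negative surprise
```

On this kind of account, once a good outcome is fully expected it carries almost no positive surprise, which is one way to read the point about being wired to strive rather than to stay satisfied.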
And just one of the weird things in our research 02:03:17.900 |
is I got into curiosity from a postdoc in my lab, 02:03:22.340 |
Matthias Gruber, and one of the things that we found 02:03:27.860 |
like a trivia question that they wanted the answer to, 02:03:39.740 |
And again, that was not driven by the answer per se, 02:03:47.460 |
it was about the drive to seek the information. 02:03:55.500 |
between what you know and what you want to know, 02:03:58.740 |
you could either use that to motivate you and energize you, 02:04:06.260 |
I'm gonna go back to my echo chamber, you know? 02:04:10.820 |
I like what you said, that maybe we're designed 02:04:14.900 |
to be in a kind of constant state of wanting, 02:04:17.980 |
which, by the way, is a pretty good either band name 02:04:25.420 |
- That's like a hardcore band name, yeah, yeah, yeah. 02:04:47.540 |
- We could build a false memory of a show, in fact, 02:04:51.740 |
So we don't even have to do all the work to play the show, 02:04:54.500 |
we can just create a memory of it and it might as well 02:04:56.860 |
happen, 'cause the remembering self is in charge anyway. 02:05:00.300 |
- So let me ask you about, we talked about false memories, 02:05:12.280 |
but through torture, you can make people say anything 02:05:20.460 |
I wonder to what degree there's truth to that, 02:05:22.500 |
if you look at the torture that happened in the Soviet Union 02:05:28.140 |
How much can you really get people to really, 02:05:36.240 |
- Yeah, I mean, I think there's a lot of history of this 02:05:43.960 |
You might've heard the term, the third degree. 02:05:49.160 |
it was a very intense set of beatings and starvation, 02:05:54.120 |
physical demands that they would place on people 02:05:59.480 |
And there was certainly a lot of work that's been done 02:06:03.260 |
by the CIA in terms of enhanced interrogation techniques. 02:06:07.320 |
And from what I understand, the research actually shows 02:06:11.340 |
that they just produce what people want to hear, 02:06:15.420 |
not necessarily the information that is being looked for. 02:06:19.260 |
And the reason is that, I mean, there's different reasons. 02:06:22.960 |
I mean, one is people just get tired of being tortured 02:06:32.180 |
where there's an authority figure telling you something, 02:06:46.000 |
Maybe they're kind of like making you be cold 02:06:48.360 |
or exposing you to music that you can't stand or something, 02:06:54.680 |
It's like they're creating this physical stress. 02:07:01.520 |
starts to down-regulate the prefrontal cortex. 02:07:03.880 |
They're not necessarily as good at monitoring 02:07:07.700 |
Then they start to get nice to you and they say, 02:07:09.420 |
imagine, okay, I know you don't remember this, 02:07:11.940 |
but maybe we can walk you through how it could have happened 02:07:20.340 |
and you're being encouraged to imagine things 02:07:25.640 |
And at some point, certain people can be very coaxed 02:07:29.520 |
into creating a memory for something that never happened. 02:07:33.020 |
And there's actually some pretty convincing cases out there 02:07:40.340 |
who came to believe that he had a false memory. 02:07:44.000 |
I mean, that he had a memory of doing sexual abuse 02:07:46.940 |
based on, you know, essentially, I think it was, 02:07:58.860 |
There are definitely stories out there like this, 02:08:00.960 |
where people confess to crimes that they just didn't do 02:08:12.660 |
that you want them to remember, you stress them out, 02:08:18.300 |
kind of like pushing this information on them, 02:08:21.500 |
or you motivate them to produce the information 02:08:24.160 |
you're looking for, and that pretty much over time 02:08:34.640 |
can use these kinds of tools to destroy lives. 02:08:49.180 |
one of the best topics for songs is heartbreak, 02:08:56.400 |
Why and how do we remember and forget heartbreak? 02:09:08.840 |
Well, so, I mean, part of this is we tend to go back 02:09:24.840 |
memory is designed to kind of capture these things 02:09:29.340 |
and attachment is a big part of biological significance 02:09:36.940 |
and sometimes that heartbreak comes with massive changes 02:09:42.060 |
say if they cheated on you or something like that, 02:09:45.220 |
or regrets, and you kind of ruminate about things 02:10:08.860 |
and I just would see her everywhere around the house. 02:10:20.220 |
and so we got a new dog because I kept seeing 02:10:24.120 |
the dog around, and I was just so heartbroken about this, 02:10:33.240 |
I think there's also something about attachment 02:10:44.520 |
Sometimes it's also not just about the heartbreak, 02:11:01.840 |
and so part of, one of the things that I found 02:11:05.520 |
from my clinical background that really, I think, 02:11:14.880 |
and getting people to look at the past in a different way, 02:11:28.400 |
towards learning, and an appreciation, maybe, 02:11:39.080 |
where you can have this kind of like dark experiences, 02:12:05.940 |
that that's actually the best part, that heartbreak, 02:12:17.320 |
is actually when you get over the heartbreak, 02:12:32.800 |
feel it the deepest, which is an interesting way 02:12:48.880 |
the memories of something that you now appreciate even more. 02:12:53.280 |
- So you don't believe that an owner of a lonely heart 02:12:55.840 |
is much better than an owner of a broken heart? 02:13:09.080 |
to some more Bruce Springsteen to figure that one out. 02:13:12.520 |
- Well, you know, it's funny because it's like, 02:13:14.000 |
after I turned 50, I think of death all the time. 02:13:17.040 |
I just think that I'm in, I probably have fewer years 02:13:27.240 |
what are the memories that I wanna carry with me 02:13:32.200 |
And also about just the fact that everything around me 02:13:35.960 |
I know more people who are dying for various reasons. 02:13:40.800 |
And so, I'm not lost, I'm not that old, right? 02:13:56.680 |
the whole idea of Buddhism is renouncing attachments. 02:13:59.520 |
Some way, the idea of Buddhism is staying out 02:14:02.900 |
of the world of memory and staying in the moment, right? 02:14:16.060 |
and knowing that they will die makes me appreciate 02:14:22.380 |
in your daily routine that you think about things this way, 02:14:34.660 |
and live in the moment, and I also appreciate 02:14:41.740 |
involved in life, the little and the big, too. 02:14:46.660 |
The Buddhist kind of removing yourself from the world, 02:14:50.620 |
or the Stoic, removing yourself from the world, 02:14:52.680 |
the world of emotion, I'm torn about that one. 02:14:56.940 |
- Well, you know, this is where Hinduism and Buddhism, 02:14:59.740 |
or at least some strains of Hinduism and Buddhism differ. 02:15:02.220 |
In Hinduism, like if you read the Bhagavad Gita, 02:15:06.420 |
the philosophy is not one of renouncing the world, 02:15:17.260 |
So, what they argue, and again, you could interpret 02:15:37.460 |
of the world that you're trying to preserve, right? 02:15:40.020 |
And of course, you could take that different ways, 02:15:41.860 |
but I really think about that from time to time 02:15:44.380 |
in terms of like, you know, letting go of this idea 02:15:48.220 |
of does this book sell, or trying to, you know, 02:15:52.460 |
like impress you and get you to laugh at my jokes 02:15:55.460 |
or whatever, and just be more like I'm sharing 02:16:03.780 |
It's like, 'cause we're so driven by the reinforcer, 02:16:08.340 |
- You're just part of the process of telling the joke, 02:16:12.380 |
and if I laugh or not, that's up to the universe to decide. 02:16:18.220 |
- How does studying memory affect your understanding 02:16:23.980 |
So like, we've been talking about us living in the present 02:16:35.900 |
and narratives about the memories that we've constructed. 02:16:38.660 |
So it feels like it does weird things to time. 02:16:43.420 |
- Yeah, and the reason is that, in some sense, 02:16:49.540 |
I mean, there's all sorts of interesting things 02:16:54.220 |
if I ask you how different does one hour ago feel 02:16:58.260 |
from two hours ago, you'd probably say pretty different. 02:17:09.340 |
So there's this kind of compression that happens 02:17:14.940 |
So that it's kind of like why when you're older, 02:17:17.460 |
the difference between somebody who's like 50 02:17:23.420 |
between like 10 and five or something, right? 02:17:25.220 |
When you're 10 years old, everything seems like 02:17:29.220 |
Here's the point: one of the interesting things 02:17:32.220 |
that I found when I was working on the book actually 02:17:34.460 |
was during the pandemic, I just decided to ask people 02:17:38.180 |
in my class when we were doing the remote instruction. 02:17:40.340 |
So one of the things I did was I'd poll people. 02:17:42.860 |
And so I just asked people, do you feel like the days 02:17:46.380 |
are moving by slower or faster or about the same? 02:17:51.060 |
Almost everyone in the class said that the days 02:17:54.820 |
So then I would say, okay, so do you feel like the weeks 02:18:11.540 |
But according to memory, it did because what happened 02:18:14.740 |
was people were doing the same thing over and over 02:18:30.580 |
you look back at that week and you say, well, what happened? 02:18:39.420 |
But that week went by during the same amount of time 02:18:42.820 |
as an eventful week where you might've been going out, 02:18:45.140 |
hanging out with friends on vacation or whatever, right? 02:18:50.300 |
because you're doing the same thing over and over. 02:18:52.260 |
So I feel like memory really shapes our sense of time, 02:19:05.740 |
'Cause what I think about when I was like 12, 15, 02:19:11.780 |
I just fundamentally feel like the same person. 02:19:32.460 |
It could be also a genetic thing just for me. 02:19:34.740 |
I don't know if everyone agrees to this view of time, 02:19:40.500 |
- Like you don't feel the passage of time or? 02:19:44.300 |
in the same way that your students did from day to day. 02:19:52.060 |
that time has passed, you celebrate birthdays and so on. 02:19:55.060 |
But the core of who I am and who others I know are, 02:19:58.540 |
or events, that compression of my understanding 02:20:06.140 |
'cause time is not useful for the compression. 02:20:08.260 |
So the details of that time, at least for me, 02:20:11.180 |
is not useful to understanding the core of the thing. 02:20:20.020 |
This is really what motivates me in science, actually, too. 02:20:23.520 |
But it's like when you start recalling the past 02:20:26.100 |
and seeing the connections between the past and present, 02:20:38.580 |
there is this kind of the present is with you, right? 02:20:41.420 |
But what's interesting about what you said, too, 02:21:01.220 |
let's say you hear a song that you used to play 02:21:08.180 |
and you might not think of yourself as an athlete, 02:21:12.140 |
you mentally time travel to that particular thing, 02:21:14.720 |
you open up this little compartment of yourself 02:21:20.980 |
Dan Schacter's lab did this really cool study 02:21:25.580 |
to either remember doing something altruistic 02:21:46.220 |
this goes back to that illusion of stability, 02:22:02.780 |
based on whatever part we want to reach for, right? 02:22:09.220 |
like this desire and pleasure associated with going back? 02:22:14.220 |
- Yeah, so my friend Felipe De Brigard wrote this, 02:22:23.260 |
where the word nostalgia was coined by a Swiss physician 02:22:26.820 |
who was actually studying traumatized soldiers. 02:22:32.420 |
And the idea was it was bringing these people 02:22:36.240 |
because they're remembering how things used to be. 02:22:45.540 |
nostalgia can be an enormous source of happiness, right? 02:22:49.740 |
And being nostalgic can improve people's moods 02:22:59.060 |
nostalgia has the opposite effect of thinking 02:23:01.460 |
those were the good old days and those days are over, right? 02:23:04.220 |
It's like America used to be so great and now it sucks. 02:23:07.700 |
Or, you know, my life used to be so great when I was a kid 02:23:12.940 |
And you're selectively remembering the things 02:23:29.380 |
it's like there's all sorts of problems going on. 02:23:33.780 |
worried about like Russia, nuclear war, blah, blah, blah. 02:23:37.660 |
So, I mean, it's just this idea that people have 02:23:48.100 |
but if it narrows your worldview in the present, 02:23:50.580 |
you're not aware of those biases that you have, 02:23:57.340 |
Either at a personal level or at a collective level. 02:24:15.440 |
We talked about electrodes and different ways 02:24:22.900 |
And the more out there question would be like, 02:24:29.660 |
- Yeah, I mean, I can't say specifics about the company 02:24:39.420 |
I think with these surgical robots and things like that. 02:24:42.960 |
BCI, though, has a whole lot of innovation going on. 02:24:48.820 |
I'm not necessarily seeing any scientific evidence 02:24:55.820 |
but I'm not seeing the evidence that they're anywhere near 02:25:09.020 |
I think speech prosthetics that are incorporating, 02:25:16.860 |
it's just like the rate of progress is just enormous. 02:25:21.380 |
So, part of the technology is having good enough data 02:25:30.780 |
is the algorithms for decoding it and so forth. 02:25:35.960 |
in some real breakthroughs in neuroscience as a result. 02:25:38.900 |
So, there's lots of new technologies like Neuropixels, 02:25:43.220 |
for instance, that allow you to harvest activity 02:25:45.560 |
from many, many neurons from a single electrode. 02:25:53.020 |
but I haven't, again, because they do their own stuff. 02:25:56.980 |
The scientific community doesn't see it, right? 02:25:59.340 |
But I think BCI is much, much bigger than Neuralink, 02:26:03.940 |
and there's just so much innovation happening. 02:26:11.060 |
is I was talking to Sergei a while ago about, you know, 02:26:13.460 |
so, a lot of language is not just what we hear 02:26:22.300 |
And, you know, so are you really gonna be able 02:26:24.620 |
to restore language without dealing with that part of it? 02:26:27.680 |
And he brought up a really interesting question, 02:26:29.820 |
which is the ethics of reading out people's intentions 02:26:48.780 |
what we mean is getting signal from the brain 02:27:01.740 |
It's able to speak for you exactly what you wanted to say. 02:27:08.380 |
well, saying something isn't just the letters, 02:27:15.980 |
the feeling behind it, all that kind of stuff. 02:27:18.700 |
And is it ethical to reveal that full shebang, 02:27:21.820 |
the full context of what's going on in our brain? 02:27:31.140 |
Is it ethical for anyone to have access to our thoughts? 02:27:41.460 |
even doing studies and all this kind of stuff. 02:27:46.460 |
to where you can start to map out the QR codes 02:27:48.780 |
for different thoughts, for different kinds of thoughts, 02:27:52.220 |
maybe political thoughts, you know, McCarthyism. 02:27:56.660 |
What if I'm getting a lot of them communist thoughts 02:28:00.300 |
or however we want to categorize or label it? 02:28:10.420 |
the more transparency there is about the human mind, 02:28:18.140 |
But there could always be intermediate battles 02:28:22.260 |
with how much control does a centralized entity have, 02:28:32.340 |
whose job is to track down criminals and so on, 02:28:41.300 |
to control the citizenry, all that kind of stuff. 02:28:43.820 |
So people are always paranoid and rightfully so. 02:29:00.400 |
at the core of this country and probably humanity, 02:29:05.380 |
when you start to be able to collect those thoughts. 02:29:11.140 |
do you think for fun and for practical purposes, 02:29:16.140 |
you'll be able to, we would be able to modify memories? 02:29:33.260 |
in order to figure out how could we adjust this memory 02:29:36.540 |
at the crude level from unpleasant to pleasant? 02:29:39.980 |
You talked about we can remember the mall and the people, 02:29:51.660 |
we know we can do it just behaviorally, right? 02:30:00.620 |
and then you can change the people, places and so forth. 02:30:03.580 |
On the crude level, there's a lot of work that's being done 02:30:11.060 |
which is this idea that essentially when I recall a memory, 02:30:14.680 |
what happens is that the connections between the neurons 02:30:18.460 |
and that cell assembly that give you the memory 02:30:25.980 |
And so some people have used techniques to try to like, 02:30:30.980 |
to reduce that physical visceral component of the memory 02:30:36.860 |
Right now, I think I've, as an outsider looking at the data, 02:30:42.820 |
And part of it is, and this speaks to the more complex issue, 02:30:46.800 |
is that you need somebody to actually fully recall 02:31:01.900 |
So if we go back to reading people's thoughts, 02:31:05.700 |
I mean, people can sometimes look at this like behaviorists 02:31:12.180 |
But I think that's a very bankrupt concept about memory. 02:31:15.700 |
I think it's much more complicated than that. 02:31:17.940 |
And one of the things that when we started studying 02:31:20.740 |
naturalistic memory, like memory from movies, 02:31:26.700 |
Because if I show you a movie and I watch the same movie 02:31:36.960 |
we might take a different amount of time to do it. 02:31:47.780 |
and it's not about how long we spent or whatever. 02:31:50.780 |
There's something deeper that is there that's this idea, 02:31:54.220 |
but it's like, how do you understand that thought? 02:31:57.420 |
I encounter a lot of concrete thinking that it's like, 02:32:01.220 |
if I show a model, like the visual information 02:32:20.580 |
And he was saying that the problem with self-driving cars 02:32:24.180 |
that they had in cities, as opposed to highways, 02:32:27.220 |
was that the car was okay at doing the things 02:32:31.580 |
it's supposed to, but when there were pedestrians around, 02:32:34.620 |
it couldn't predict the intentions of people. 02:32:37.500 |
And so that unpredictability of people was the problem 02:32:40.920 |
that they were having in the self-driving car design, 02:32:44.180 |
'cause it didn't have a good enough internal model 02:32:47.980 |
of what the people were, what they were doing, 02:32:56.660 |
watching pedestrians, thinking about pedestrians, 02:33:01.220 |
thinking about what it takes to solve the problem of 02:33:03.820 |
measuring, detecting the intention of a pedestrian, 02:33:12.980 |
really of a human being in this particular context 02:33:20.340 |
I think it's a window into how complex social systems are 02:33:25.340 |
that involve humans, because I would just stand there 02:33:38.660 |
every single intersection has its own personality. 02:33:52.020 |
because what happens is we're leaders and followers. 02:34:09.460 |
And if a few people start to jaywalk and cross on a red light, 02:34:15.820 |
And there's just a dynamic to that intersection. 02:34:23.140 |
versus a rural town, even Boston, San Francisco, 02:34:26.260 |
or here in Austin, there's different personalities city-wide 02:34:30.100 |
but there's different personalities area-wide, region-wide, 02:34:32.740 |
and there's different personalities, different intersections. 02:34:37.260 |
For a car to be able to determine that is tricky. 02:34:41.560 |
Now, what machine learning systems are able to do well 02:34:46.220 |
So for us, it's tricky because we get to understand the world 02:34:50.980 |
with very limited information and make decisions grounded 02:34:55.500 |
in this big foundation model that we've built 02:35:00.900 |
AI could literally, in the context of driving, 02:35:15.660 |
of how humans cross streets, it's probably all there. 02:35:20.620 |
In the same way that you have a Noam Chomsky who says, 02:35:30.620 |
And more and more, you see large language models 02:35:42.380 |
into a representation is doing is in fact understanding. 02:35:47.180 |
Deeply, in order to be able to generate one letter 02:35:52.300 |
you have to understand the cruelty of Nazi Germany 02:36:05.060 |
in order to generate, I'm going to the kitchen 02:36:07.780 |
to get an apple and do that grammatically correctly. 02:36:13.700 |
- You're thinking the LLM is building that world model. 02:36:16.100 |
- It has to in order to be good at generating 02:36:22.260 |
And in the same way, I think AI that drives a car, 02:36:27.180 |
if it has enough data, will be able to form a world model 02:36:35.260 |
But when we, as humans are watching pedestrians, 02:36:38.740 |
we slowly realize, damn, this is really complicated. 02:36:42.300 |
In fact, when you start to self-reflect on driving, 02:36:49.000 |
There's like subtle cues we take about like just, 02:36:55.460 |
but like one of them determining who around you 02:36:58.180 |
is an asshole, aggressive driver, potentially dangerous. 02:37:00.860 |
- Yes, yes, I was just thinking about this, yes. 02:37:03.260 |
- You can read it, once you become a great driver, 02:37:06.860 |
you can see it a mile away, this guy is gonna pull 02:37:11.660 |
He's like way back there, but you know it's gonna happen. 02:37:14.340 |
- And I don't know what, 'cause we're ignoring 02:37:19.180 |
the asshole, like a red, like a glowing obvious symbol 02:37:23.540 |
is just like right there, even in your peripheral vision. 02:37:26.240 |
'Cause we're, again, we're usually when we're driving 02:37:34.060 |
and it's like a little puzzle that we're usually 02:37:36.580 |
only allocating a small amount of our attention to, 02:37:41.540 |
And it's fascinating, but I think AI just has 02:37:47.780 |
in terms of the bandwidth of data that's coming in 02:37:53.660 |
and perform inference on the representation you, 02:37:58.780 |
That for the case of driving, I think it could be 02:38:04.100 |
But one of the things that's currently missing, 02:38:06.500 |
even though OpenAI just recently announced adding memory, 02:38:11.080 |
and I did wanna ask you like how important it is, 02:38:15.500 |
how difficult is it to add some of the memory mechanisms 02:38:29.620 |
because we don't understand episodic memory, right? 02:38:31.980 |
And so one of the ideas I talk about in the book 02:38:52.660 |
And so part of the problem with going through like massive, 02:39:02.660 |
or something like that is especially for English, 02:39:05.120 |
there's so many exceptions to the rules, right? 02:39:08.100 |
And so if you wanna rapidly learn the exceptions, 02:39:14.940 |
you have a harder time learning the exception. 02:39:17.620 |
And so David Marr is one of the early pioneers 02:39:23.540 |
And then Jay McClelland and my colleague, Randy O'Reilly, 02:39:29.820 |
all these people started to come up with the idea 02:39:41.300 |
which just says this happened once at this point in time, 02:39:47.260 |
And then we have this knowledge that we've accumulated 02:40:05.180 |
And the next time we're in a similar situation, boom, 02:40:08.580 |
we could supplement our knowledge with this information 02:40:15.100 |
So it gives us this enormous amount of flexibility 02:40:20.780 |
without having to erase everything we've already learned. 02:40:37.000 |
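A toy sketch of the division of labor being described here, the fast one-shot episodic record sitting alongside slowly accumulated general knowledge; the dictionary structure and the restaurant example are invented for illustration and are not the McClelland/O'Reilly models themselves:

```python
# Slow "semantic" knowledge: general expectations built up over many experiences.
semantic_knowledge = {"restaurant": "order at the table, pay at the end"}

# Fast "episodic" store: one-shot records of specific episodes, keyed by context,
# written immediately without retraining or overwriting the general rule.
episodic_store = {}

def record_episode(context, what_happened):
    episodic_store[context] = what_happened   # learned from a single exposure

def decide(context, category="restaurant"):
    rule = semantic_knowledge[category]
    if context in episodic_store:
        # A matching episode supplements the general knowledge with the exception.
        return f"{rule} -- except last time at {context}: {episodic_store[context]}"
    return rule

record_episode("the cafe on 5th", "you pay at the counter first")
print(decide("the cafe on 5th"))    # one-shot exception recalled and applied
print(decide("a brand-new diner"))  # general rule left untouched
```

Keeping the two stores separate is what buys the flexibility mentioned above: a single exception can be used the next time a similar situation comes up without erasing the slowly learned rule.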
So one of the things I think that makes humans great 02:40:46.740 |
I mean, computational neuroscience people would say, 02:41:09.940 |
And then you start to think about all these mechanisms 02:41:34.660 |
that we associate with certain things, so forth. 02:41:41.300 |
I don't necessarily want a self-motivated LLM, right? 02:41:45.580 |
And then there's the problem of how do you even 02:41:53.700 |
a proper reinforcement learning kind of thing, 02:41:55.940 |
for instance, so a friend of mine, Sam Gershman, 02:42:05.340 |
like a typical AI model to make me as much money 02:42:08.580 |
as possible, first thing I might do is sell my house. 02:42:19.980 |
And then things start to get really complicated. 02:42:23.860 |
I mean, just even the thing you've mentioned is the moment. 02:42:29.940 |
it's difficult to express concretely what a moment is, 02:42:36.180 |
like how deeply connected it is to the entirety of it. 02:42:49.580 |
You have to include all the emotions involved, 02:42:51.020 |
all the context, all the things it built around it, 02:42:53.100 |
all the social connections, all the visual experiences, 02:43:00.220 |
All the history that came before that moment is built on. 02:43:03.740 |
And we somehow take all of that and we compress it 02:43:06.620 |
and keep the useful parts and then integrate it 02:43:09.940 |
into the whole thing, into our whole narrative. 02:43:13.260 |
And then each individual has their own little version 02:43:16.020 |
of that narrative, and then we collide in a social way, 02:43:22.220 |
I mean, well, even if we wanna go super simple, right? 02:43:29.140 |
he actually studied a lot of computer vision at Stanford. 02:43:34.140 |
And so one of the things he was interested in 02:43:36.420 |
is some people who have brain damage in areas of the brain 02:43:38.860 |
that were thought to be important for memory. 02:43:41.100 |
But they also seem to have some perception problems 02:43:48.980 |
and some people found this effect, some didn't. 02:43:54.380 |
and he said, "Let's take the best state-of-the-art 02:44:04.380 |
where the computer vision models would just struggle, 02:44:06.980 |
and you would find that they just didn't do well. 02:44:11.180 |
you add more layers, on and on and on, it doesn't help, 02:44:32.080 |
that were adversarial to these computer vision models. 02:44:44.780 |
But without enough time, if they just get a glance, 02:44:46.980 |
they're just like the computer vision models. 02:44:50.340 |
"Well, maybe let's look at people's eyes," right? 02:44:52.800 |
So a computer vision model sees every pixel all at once, 02:45:29.860 |
that the computer vision models weren't able to do. 02:45:33.420 |
So somehow integrating information across time 02:45:42.580 |
- I mean, the process of allocating attention 02:45:47.580 |
across time seems to be a really important process. 02:46:01.780 |
It's about attention, the transformer is about attention. 02:46:07.380 |
but then like, yeah, how you allocate that attention, 02:46:44.020 |
whenever there's an error, to adjust the model 02:46:46.140 |
such that you can allocate attention even better 02:46:52.620 |
maybe maximize the chance of confirming the model 02:47:04.020 |
I mean, I got a chance to study peripheral vision for a bit 02:47:17.860 |
- Yeah, at the same time, people are terrible 02:47:20.220 |
at detecting changes that can happen in the environment 02:47:26.080 |
if their predictive model is too strong, you know? 02:47:36.580 |
oh, the machines can do this stuff that's just like humans. 02:47:39.860 |
It's like, well, the machines make different kinds 02:47:45.340 |
And I will never be convinced unless that, you know, 02:47:54.740 |
but it's like, I don't think we've replicated 02:47:57.020 |
human intelligence unless I know that the simulator 02:48:01.540 |
is making exactly the same kinds of mistakes that people do 02:48:26.500 |
- Well, it's interesting for me because when I was a child, 02:48:33.180 |
I don't know if it came from a school psychologist. 02:48:39.460 |
Or if it just came from teachers who hated me, 02:48:54.860 |
And so, and, you know, there was social issues. 02:48:57.940 |
So like I could have been put a year ahead in school, 02:49:00.980 |
but then they said, oh, but he doesn't have the social, 02:49:08.900 |
an outcast, even in my own grade, but then like I was, 02:49:16.620 |
they got me on a diet free of artificial colors and flavors, 02:49:34.780 |
rejection sensitive, you name it, they talk about it. 02:49:41.020 |
I've gotten into in the past, just you name it. 02:50:00.260 |
kind of inappropriate expectations, especially for children, 02:50:06.740 |
kind of like maladaptive kinds of tendencies. 02:50:28.100 |
that you're interested in, you can get stuck. 02:50:30.820 |
And so, you know, attention is this beautiful balance 02:50:34.740 |
of being able to focus when you need to focus 02:50:39.820 |
And so it's that flexibility plus stability again. 02:50:43.100 |
And that balance seems to be disrupted in ADHD. 02:50:48.460 |
And so as a result, memory tends to be poor in ADHD, 02:50:56.940 |
but it's more because of this attentional issue, right? 02:51:01.420 |
And so, and people with ADHD often will have great memory 02:51:17.060 |
From just how the characteristics of your own brain 02:51:29.900 |
How do you flourish in this sort of education context? 02:51:34.140 |
- I'm still trying to figure out the flourishing per se, 02:51:42.420 |
It's like, you're constantly looking for new things, 02:51:50.460 |
And they tolerate your being late for things. 02:51:53.860 |
Nothing is really, nobody's gonna die if you screw up, 02:51:59.540 |
where you have to be much more responsible and focused. 02:52:06.860 |
But what I would say is that I'm learning now 02:52:14.460 |
like about how to structure my activities more, 02:52:18.700 |
and basically say, "Okay, if I'm going to be," 02:52:23.460 |
email's like the big one that kills me right now. 02:52:25.860 |
I'm just constantly shifting between email and my activities. 02:52:29.980 |
And what happens is that I don't actually get the email. 02:52:34.860 |
'cause I'm like, "Oh, I have to think about this. 02:52:39.340 |
and so I've just got fragmentary memories of everything. 02:52:56.140 |
Sometimes I do that, and sometimes I have to be flexible 02:52:59.420 |
and go like, "Okay, I'm definitely not focusing. 02:53:10.300 |
- And I'm very much with Cal Newport on this. 02:53:13.140 |
He wrote "Deep Work" and a lot of other amazing books. 02:53:18.580 |
as the thing that really destroys productivity. 02:53:22.380 |
So switching, it doesn't even matter from what to what, 02:53:32.780 |
Even switching between, if you're reading a paper, 02:53:38.900 |
because curiosity and whatever the dopamine hit 02:53:46.300 |
'cause otherwise your brain is just not capable 02:53:48.700 |
to really load it in and really do that deep deliberation. 02:54:00.900 |
- Yeah, I mean, you probably see this, I imagine, 02:54:05.820 |
in neuroscience conferences, it's now the norm 02:54:08.860 |
that people have their laptops out during talks. 02:54:15.820 |
But in fact, what often happens if you look at people, 02:54:18.700 |
and we can speak from a little bit of personal experience, 02:54:22.060 |
is you're checking email, or I'm working on my own talk, 02:54:30.580 |
I have this illusion, well, I'm paying attention, 02:54:33.940 |
And then what happens is I don't remember anything 02:54:45.740 |
Every time I switch, I'm getting a few seconds slower, 02:54:50.100 |
and I'm catching up mentally to what's happening. 02:54:54.780 |
where you're meaningfully integrating everything 02:54:57.020 |
and predicting and generating this kind of rich model, 02:55:05.020 |
by Melina Uncapher and Anthony Wagner on multitasking 02:55:05.020 |
and it's becoming worse and worse of a problem. 02:55:27.260 |
just when I was doing trumpet in school for a school band, 02:55:40.860 |
- To guitar, especially the kind of music you're into? 02:55:46.060 |
yeah, so I kind of was a late bloomer to music, 02:55:54.940 |
and so then you started seeing all this stuff, 02:55:57.020 |
and then I got into, metal was kind of like my early genre, 02:56:01.100 |
and I always reacted to just things that were loud 02:56:07.540 |
It's like, you know, everything from Sgt. Pepper's 02:56:11.580 |
by the Beatles to Led Zeppelin II, my dad had both, 02:56:19.620 |
and then like The Police, Ghost in the Machine, 02:56:29.420 |
went way down the rabbit hole of speed metal, 02:56:37.380 |
I can do this, and I had friends who were doing that, 02:56:45.080 |
but it was different because when I was doing trumpet, 02:56:49.260 |
and this was like, I was learning by looking, 02:56:51.940 |
there's a thing called tablature, you know this, 02:56:53.700 |
where it's like you see a drawing of the fretboard 02:56:57.060 |
with numbers, and that's where you're supposed to put your, 02:57:00.900 |
And so, I learned it in a completely different way, 02:57:07.740 |
and I didn't get it, it's actually taken me a long time 02:57:18.620 |
I remember especially, and it just blew my mind, 02:57:30.540 |
and this isn't just like a shouty verse, 02:57:30.540 |
this is just like weird, and then it occurred to me, 02:57:42.780 |
the way people tell you it's supposed to sound. 02:58:00.060 |
and somehow I kind of morphed into just like, 02:58:13.780 |
I just enjoyed music just dripping out of me, 02:58:16.460 |
and just, you know, spilling out and just doing stuff, 02:58:23.540 |
What if I just play like notes that shouldn't go together, 02:58:28.660 |
Then I said, well, what if I don't do four beats, 02:58:37.780 |
and started messing around with time signatures. 02:58:39.620 |
Then I was playing in this band with a great musician 02:58:51.220 |
and instead of make it go like back and forth, 02:58:59.420 |
just make it like non-linear in these interesting ways. 02:59:04.300 |
it's like the whole world sort of opens up as like the, 02:59:09.580 |
especially, so you could appreciate this as a musician, 02:59:14.140 |
So we are so brainwashed to think in 4/4, right? 02:59:17.900 |
Every rock song you could think of almost is in 4/4. 02:59:30.060 |
- You feel like it's in 4/4 because it resolves itself, 02:59:37.060 |
basically it resolves on the first note of the next measure. 02:59:54.700 |
and you're getting a little bit of prediction error 03:00:03.180 |
because it was like, everything just feels better 03:00:05.380 |
if I do it in 7/4 or if I alternate between 4 and 3 03:00:14.220 |
they just do so much interesting stuff with this. 03:00:18.580 |
allows you to really break it all open and just, 03:00:28.940 |
one of the genres we used to play in was math rock, 03:00:34.460 |
It was just like, there's just so many weird time signatures. 03:00:47.820 |
instead of playing four beats in every measure, 03:00:59.620 |
so that there might be three measures of verse 03:01:03.300 |
and then one, you know, and then five measures of chorus 03:01:08.180 |
So you could just mess around with everything, right? 03:01:21.580 |
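A small numerical illustration of the meter idea in this stretch of conversation (hedged: the 4 + 3 grouping and the listener model are invented for illustration): a 7/4 bar phrased as 4 + 3 keeps landing accents off the grid of a listener who is predicting strict groups of four, which is one way to cash out the earlier point about prediction error in music:

```python
# Beat positions where the accents land for a repeating grouping pattern.
def accent_positions(group_sizes, n_bars):
    positions, beat = [], 0
    for _ in range(n_bars):
        for group in group_sizes:
            positions.append(beat)
            beat += group
    return positions

four_four  = accent_positions([4], 8)      # 0, 4, 8, 12, ... every four beats
seven_four = accent_positions([4, 3], 4)   # 0, 4, 7, 11, 14, 18, 21, 25

# A listener predicting strict 4/4 expects accents only at multiples of four;
# the 4 + 3 grouping keeps putting them where they "shouldn't" be.
expected  = set(range(0, 32, 4))
surprises = [p for p in seven_four if p not in expected]
print(surprises)   # [7, 11, 14, 18, 21, 25] -- the accumulating mismatches
```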
And disturbing, that can be quite disturbing. 03:01:25.060 |
- So is that, is that the feeling you would have 03:01:27.300 |
if you kept messing with it, if you were playing math rock? 03:01:27.300 |
I mean, it's stressing me out just listening. 03:01:39.140 |
is very much like in terms of like repetitive themes, 03:01:46.100 |
'cause I'm not a great guitarist technically. 03:01:52.420 |
where it's just like so complicated, you know? 03:01:54.780 |
But often what I find is like having a melody 03:01:54.780 |
or, and then adding some dissonance to it, just enough. 03:02:15.980 |
most of human behavior is some people are lumpers 03:02:21.140 |
And so it's like some people are very kind of excited 03:02:27.620 |
And some people are just like, no, I want to lump every, 03:02:33.700 |
I think some people get scared of that discomfort. 03:02:44.660 |
- The cover band I play in is a band called Pavlov's Dogs. 03:02:53.420 |
of mostly memory researchers, neuroscientists. 03:03:01.940 |
So you play, like, do you play rhythm or lead? 03:03:01.940 |
And so they do, yeah, so Carrie does Blondie amazingly well. 03:04:14.860 |
- Which is perfect 'cause we're Pavlov's Dogs. 03:04:27.060 |
but it's just like, it devolves into total noise. 03:04:29.460 |
And I just like fall on the floor and generate feedback. 03:04:35.900 |
it might've been that or a Velvet Underground cover 03:04:39.580 |
I have a guitar made of aluminum that I got made. 03:04:48.100 |
had it upside down and all this stuff to generate feedback. 03:04:54.980 |
- Yeah, so I've managed to break an all metal guitar. 03:05:03.140 |
we've been talking about neuroscience in general. 03:05:07.980 |
you've been studying the human mind for a long time. 03:05:28.060 |
When you look at it, what is most beautiful to you? 03:05:45.420 |
and there's everything you hear and touch and taste, 03:05:50.660 |
But it's all connected by this like dark energy 03:05:55.660 |
that's holding that whole universe of your mind together. 03:06:00.900 |
And without that, it's just a bunch of stuff. 03:06:11.500 |
And being able to figure out where that comes from 03:06:14.660 |
and how things are connected to me is just amazing. 03:06:19.540 |
But just this idea of like the world in front of us, 03:06:26.940 |
And we do a really good job, not perfect, I mean, you know. 03:06:34.020 |
- Yeah, it's an incredible mystery, all of it. 03:06:39.700 |
You look out there, you look at dark matter and dark energy, 03:06:46.220 |
we don't understand, which helps make the equations work 03:06:50.780 |
in terms of gravity and the expansion of the universe. 03:06:53.100 |
In the same way, it seems like there's that kind of thing 03:06:56.020 |
in the human mind that we're like striving to understand. 03:06:59.500 |
- Yeah, yeah, you know, it's funny that you mentioned that. 03:07:01.540 |
So one of the reasons I wrote the book, amongst many, 03:07:03.900 |
is that I really felt like people needed to hear 03:07:06.660 |
from scientists, and COVID was just a great example of this, 03:07:10.900 |
because people weren't hearing from scientists. 03:07:13.860 |
One of the things I think that people didn't get 03:07:16.300 |
was the uncertainty of science and how much we don't know. 03:07:28.900 |
I just became aware of all of these things we don't know. 03:07:33.980 |
I think of this idea of like overwhelming majority 03:07:48.940 |
'cause it's like there's 10 physicists out there. 03:08:19.500 |
that's one of the biggest scientific successes 03:08:28.500 |
overwhelming majority of the universe, right? 03:08:36.460 |
is realizing, changing the scope of the problem 03:08:46.460 |
'cause science is all about assumptions, right? 03:08:50.180 |
"The Structure of Scientific Revolutions" by Thomas Kuhn? 03:08:54.140 |
- That's my only philosophy, really, that I've read. 03:09:27.180 |
You've decreased the amount of uncertainty I have 03:09:48.180 |
please check out our sponsors in the description. 03:10:01.220 |
"the life and death struggle people went through 03:10:03.900 |
"is now like something from the distant past. 03:10:08.940 |
"that events of the past are no longer in orbit 03:10:26.020 |
"there are some things we can never assign to oblivion, 03:10:32.460 |
"They remain with us forever like a touchstone."