Ari Wallach: Create Your Ideal Future Using Science-Based Protocols
Chapters
0:00 Ari Wallach
1:58 Sponsors: David, Helix Sleep & ROKA
6:13 Mental Time Travel; Technology & Present
15:46 Technology; Tools: Transgenerational Empathy; Bettering Today
22:00 Tool: Empathy for Others
26:09 Empathy for Future Generations, Emotion & Logic
31:48 Tool: Emotion to Guide Action
36:50 Sponsor: AG1
38:02 Tools: Perfect Day Exercise; Cathedral Thinking, Awe & Future Generations
43:52 Egoic Legacy, Modeling Behavior
51:13 Social Media, Time Capsule, Storytelling
60:06 Sponsor: LMNT
61:18 Short-Term Thinking; Life Purpose, Science & Religion
69:23 Longpath, Telos, Time Perception
75:19 Tools: Photo Frames; Behavior & Legacy; Life in Weeks
83:02 Tool: Visualizing Future You
90:17 Death, Western Society
96:20 Tool: Writing Letter to Future Self
101:01 Society, Future Harmony
107:03 Traditional Institutions, Family, Future Consciousness; “Protopia”
118:48 Tool: Behavior & Modeling for the Future
128:11 Tool: “Why Tuesdays?”, Examining Self
134:58 Zero-Cost Support, YouTube, Spotify & Apple Follow & Reviews, Sponsors, YouTube Feedback, Protocols Book, Social Media, Neural Network Newsletter
Welcome to the Huberman Lab Podcast, where we discuss science and science-based tools for everyday life. I'm Andrew Huberman, 00:00:10.400 |
and I'm a Professor of Neurobiology and Ophthalmology 00:00:17.760 |
Ari Wallach is an Adjunct Associate Professor 00:00:30.240 |
on perhaps one of the most important questions 00:00:32.560 |
that any and all of us have to ask ourselves at some point, 00:00:36.240 |
which is how is it that we are preparing this planet 00:00:42.080 |
if we happen to have children or want children, 00:00:47.840 |
is capable of orienting its thoughts and its memories 00:00:50.960 |
to the past, to the present, or to the future. 00:00:56.800 |
to think about the future that they are creating 00:00:58.940 |
on this planet and in culture, within our families, et cetera 00:01:02.540 |
for the next generation and generations that follow them. 00:01:19.180 |
in order to best ensure the thriving of our species. 00:01:30.060 |
So during today's episode, Ari Wallach spells out for us, 00:01:33.260 |
not just the aspirations, not just what we want, 00:01:35.860 |
but how to actually create that positive future and legacy 00:01:39.540 |
for ourselves, for our families, and for society at large. 00:01:43.380 |
It's an extremely interesting take on how to live now 00:01:46.300 |
in a way that is positively building toward the future. 00:01:50.500 |
you will have a unique perspective on how your brain works, 00:02:00.420 |
that this podcast is separate from my teaching 00:02:12.460 |
I'd like to thank the sponsors of today's podcast. 00:02:30.060 |
This is 50% higher than the next closest protein bar. 00:02:38.740 |
but then again, I also like the chocolate-flavored one, 00:02:46.380 |
I try to get most of my calories from whole foods. 00:02:49.020 |
However, when I'm in a rush, or I'm away from home, 00:02:51.860 |
or I'm just looking for a quick afternoon snack, 00:02:59.340 |
of high-quality protein with the calories of a snack, 00:03:02.140 |
which makes it very easy to hit my protein goals 00:03:04.180 |
of one gram of protein per pound of body weight, 00:03:10.300 |
As I mentioned before, they are incredibly delicious. 00:03:21.580 |
I was never a big fan of bars until I discovered David bars. 00:03:24.980 |
If you give them a try, you'll know what I mean. 00:03:31.660 |
Again, the link is davidprotein.com/huberman. 00:03:35.820 |
Today's episode is also brought to us by Helix Sleep. 00:03:40.820 |
that are customized to your unique sleep needs. 00:03:43.740 |
I've spoken many times before on this and other podcasts 00:03:46.140 |
about the fact that getting a great night's sleep 00:03:51.620 |
Now, the mattress we sleep on makes an enormous difference 00:03:54.020 |
in terms of the quality of sleep that we get each night. 00:04:00.060 |
one that is neither too soft nor too hard for you, 00:04:04.060 |
and that won't be too warm or too cold for you. 00:04:11.620 |
do you sleep on your back, your side, or your stomach? 00:04:13.380 |
Do you tend to run hot or cold during the night? 00:04:16.780 |
Maybe you know the answers to those questions, 00:04:22.480 |
For me, that turned out to be the Dusk mattress, D-U-S-K. 00:04:40.540 |
that's customized for your unique sleep needs. 00:04:52.240 |
Today's episode is also brought to us by Roka. 00:05:07.340 |
to be able to see clearly from moment to moment. 00:05:11.700 |
and has designed all of their eyeglasses and sunglasses 00:05:14.220 |
with the biology of the visual system in mind. 00:05:20.020 |
in particular for things like running and cycling, 00:05:26.460 |
you don't even remember that you're wearing them, 00:05:28.220 |
and they're also designed so that they don't slip off 00:05:31.900 |
Now, even though Roka eyeglasses and sunglasses 00:05:35.620 |
they now have many different frames and styles, 00:05:41.260 |
to work, essentially any time and any setting. 00:05:48.060 |
and I wear Roka sunglasses in the middle of the day 00:05:50.340 |
anytime it's too bright for me to see clearly. 00:05:52.180 |
My eyes are somewhat sensitive, so I need that. 00:05:57.300 |
which I have as eyeglasses and now as sunglasses too. 00:06:06.280 |
Again, that's roka.com/huberman to get 20% off. 00:06:19.060 |
and I think that's a good way to frame today's conversation, 00:06:21.600 |
not by talking about our history by any stretch, 00:06:34.060 |
the human brain is capable of this amazing thing, 00:06:50.420 |
provided all our mental faculties are intact. 00:06:53.180 |
One of the key aspects to brain function, however, 00:06:57.900 |
is to use that ability to try and set goals, reach goals, 00:07:09.020 |
we operate on short timeframe reward schedules, 00:07:18.100 |
we generally have ways of getting it pretty quickly, 00:07:30.000 |
A lot of your work is focused on linking our perception 00:07:38.560 |
and trying to project our current decision-making 00:07:43.620 |
into the future to try and create a better future. 00:07:46.880 |
And that's some pretty heavy mental gymnastics, 00:08:01.100 |
without letting their health get out of control, 00:08:04.240 |
or I should say their illness get out of control, 00:08:31.880 |
- It's a great question, or a great series of questions. 00:08:34.680 |
One of the things that Homo sapiens do extremely well 00:08:53.800 |
that what separates us out from almost every other species, 00:08:56.800 |
as far as we know, the ones we can talk to, mostly us, 00:09:01.660 |
We can do mental time travel towards the future, right? 00:09:04.540 |
We can think about different possible outcomes, 00:09:24.320 |
as far as we know, we're the only ones who do it, 00:09:38.500 |
Which, one thing about the hippocampus that's amazing 00:09:47.580 |
of episodic memories that have happened in the past, 00:09:50.620 |
reassembles them so that we can mentally time travel 00:09:54.480 |
and then figure out these different future scenarios 00:09:57.180 |
So if we take Ari and Andy, 150,000 years ago. 00:10:25.680 |
And we're now at a point where we're no longer singular, 00:10:30.100 |
but we're within a kind of a small tribal structure. 00:10:33.440 |
We wanna start hunting larger and larger game. 00:10:44.240 |
We have to start thinking about different scenarios. 00:10:48.840 |
is really coming from our desire for more protein to exist 00:10:55.960 |
the super energy-intensive thing called the human brain. 00:11:04.200 |
and been successful in the past or not successful 00:11:27.500 |
And fast forward to right now, on my way in here, 00:11:40.860 |
So we're working on 200,000- to 300,000-year-old hardware. 00:11:44.780 |
At the same time, we have a cultural substrate that is, 00:11:54.020 |
to make us, A, want that immediate gratification, 00:12:10.780 |
but to actually think about the immediate present. 00:12:23.740 |
for us as individuals, as societies, as civilization, 00:12:28.740 |
in the way that you and I may have done 150,000 years ago, 00:12:35.320 |
where are we gonna move our family and our tribe 00:12:37.060 |
or our clan, and we would go to warmer climates. 00:12:44.840 |
for us to break out of this presentist moment. 00:12:47.580 |
- I really appreciate your answer for a couple of reasons. 00:13:09.620 |
But what you're describing is actually too much 00:13:12.060 |
being present, what you're calling presentism. 00:13:15.840 |
And of course, it depends on what's happening 00:13:31.340 |
they're spending too much time worrying about the future, 00:13:41.660 |
And as you said, we're in this sort of hall of mirrors 00:13:46.960 |
And I don't want today's discussion to be doom and gloom, 00:13:53.360 |
and what Jonathan Haidt, who has been on this podcast, 00:14:03.600 |
yes, the human brain can focus on past, present, 00:14:12.040 |
of our technologies and our human interactions, 00:14:18.020 |
has us really locked in the present in stimulus response. 00:14:40.720 |
we absolutely need to take five to 10 minutes each day 00:14:46.280 |
typically by closing one's eyes and just looking inward. 00:14:48.520 |
It doesn't even have to be called meditation, 00:14:50.760 |
in order to understand what our greater wishes are, 00:14:54.600 |
how to link our current thinking and behavior 00:15:08.460 |
So to link these concepts in a more coherent way, 00:15:16.500 |
either the traditional type of notifications on your phone, 00:15:20.600 |
but that we're basically just living in stimulus response 00:15:24.460 |
And if so, what direction is that taking ourselves 00:15:39.100 |
how bad is it to just be focused on managing the day-to-day? 00:15:42.740 |
Or maybe that's a better way to go about life. 00:15:47.380 |
There are people like me who are full-time futurists. 00:15:51.700 |
'cause what we tend to do is think more in the future 00:16:02.660 |
dealing with what's right in front of you, that's great. 00:16:06.900 |
what I call kind of transgenerational empathy. 00:16:12.580 |
So we know empathy, you've had guests on that. 00:16:18.220 |
starts with empathy and compassion for yourself. 00:16:21.860 |
Then we move into empathy for those who came before, 00:16:25.940 |
which then allows us to build empathy for the future, 00:16:28.460 |
future Ari, future Andy, but then future generations. 00:16:34.500 |
- Yeah, maybe we could just parse each of those one by one. 00:16:43.460 |
It's recognizing you're doing the best you can 00:16:51.620 |
and I'm guilty of this, of images and quotes and books 00:16:56.620 |
of how to live your best life, how to be amazing. 00:17:00.140 |
And anything below that metric of perfection, 00:17:05.580 |
And you start to kind of ruminate over what you, 00:17:11.500 |
And you forget that you're only able to handle 00:17:19.580 |
And you can't hold yourself up to this idealized yardstick. 00:17:28.040 |
And from when he learned to when he passed away 00:17:35.020 |
And for a lot of that time, I was kind of in denial, right? 00:17:45.200 |
In fact, and we're not gonna, we won't go into this. 00:17:49.380 |
We were working together that summer at a summer camp. 00:17:59.060 |
And then I realized, and this is a self-compassion, 00:18:01.660 |
like 18 year old Ari was only at a place emotionally 00:18:06.420 |
and psychologically to be able to do what I did. 00:18:10.020 |
And it wasn't the older 30 or 40 year old Ari 00:18:14.620 |
So empathy for yourself really, really centers. 00:18:18.300 |
It doesn't mean you let yourself off the hook. 00:18:23.740 |
It means you recognize that who you were even yesterday 00:18:28.700 |
is in many ways different than who you are today 00:18:32.780 |
So transgenerational empathy has to start with yourself. 00:18:36.140 |
It has to start with being able to look in the mirror 00:18:46.420 |
into my birth family or family that you choose. 00:18:55.220 |
but you have to start there because so many times 00:18:57.820 |
I work with people and I talk to people and they say, 00:19:00.140 |
I wanna have empathy for the past and for the future, 00:19:06.780 |
it becomes very, very difficult to spread out 00:19:15.020 |
is to get you to spread that out into the future. 00:19:19.660 |
because I've heard it before in other contexts, 00:19:25.220 |
I think, yeah, there's two phrases that come to mind. 00:19:30.220 |
There's a book called "A Fighter's Heart" by Sam Sheridan. 00:19:38.500 |
of all the different forms of martial arts and fighting. 00:19:43.620 |
where he says, you can't have your 20th birthday 00:19:50.460 |
but it's actually a pretty profound statement. 00:19:56.180 |
He has an interesting lineage in his own right. 00:19:59.580 |
he claims he just painted and smoked cigarettes. 00:20:10.580 |
you can't have your 20th birthday until you're 19 00:20:21.180 |
And so I like to think he was in agreement with you 00:20:28.300 |
that comes to mind is that I, like many people, 00:20:46.060 |
And there's a guy, I'll put it in the show note captions. 00:21:01.500 |
for a lot of the reasons that we're talking about, 00:21:03.020 |
its ability to flip from past to present to future, et cetera. 00:21:09.380 |
to one actionable step per day or per morning, 00:21:20.540 |
what am I going to do today to make my day better? 00:21:25.540 |
Not to be better than I was yesterday, right? 00:21:32.020 |
because like yesterday could have been an amazing day. 00:21:34.340 |
You might not be as good as yesterday, right? 00:21:39.940 |
on these circadian biology units of 24 hours. 00:21:44.940 |
So I like this concept of what can I do today 00:21:48.180 |
to make my life and hopefully the lives of others better? 00:21:54.260 |
and it's really focused on the unit of the day, 00:22:03.300 |
that we're always doing the best we can with what we've got, 00:22:05.840 |
but that there's a striving kind of woven into that statement 00:22:11.500 |
At what point do we start to develop empathy for others? 00:22:19.340 |
I mean, that's the kind of traditional definition. 00:22:21.780 |
we start off with kind of cognitive intellectual empathy, 00:22:36.180 |
and you want to bring, if they're feeling bad, 00:22:40.500 |
If they're feeling good, you can be there with them. 00:22:43.060 |
At a fundamental level, this is mirror neurons. 00:22:46.340 |
And I'm connecting with you and you're connecting with me, 00:22:48.980 |
and there's a genetic adaptive fitness for that, right? 00:23:09.940 |
It can be, look, it can be taxing, don't get me wrong, 00:23:12.700 |
but ultimately that is what self-compassion can give you 00:23:20.060 |
you are no longer fundamentally disconnected. 00:23:27.620 |
over the past several decades, if not centuries, 00:23:29.860 |
is disconnection, disconnection from ourselves, 00:23:34.420 |
and disconnection from nature and the planet. 00:23:37.460 |
So anything we can do to further that connection 00:23:40.380 |
is going to benefit us today in the current moment. 00:23:47.220 |
into the requirements for empathy and connection, 00:23:59.220 |
a beautiful fern, or a dog, or a significant other, 00:24:02.900 |
or another human being that we happen to encounter, 00:24:12.460 |
- Right, can't be in the past, can't be in the future, 00:24:16.020 |
to really touch into the details of the experience. 00:24:25.100 |
to leave whatever kind of pressures are on us 00:24:32.060 |
Like every neural circuit we know has a push and a pull. 00:24:34.580 |
Like in order to get A, you need to suppress B. 00:24:36.660 |
And this is the way neural circuits work generally. 00:24:39.380 |
You know, flexors and extensors in the muscles 00:24:47.340 |
your tricep is essentially relaxing and vice versa, 00:24:53.300 |
The PTs are going to dive all over me for that one. 00:24:55.140 |
But that's sort of how neural circuits in the brain work. 00:25:03.420 |
and their difference is actually what allows us 00:25:14.340 |
And then we need to be able to return to our own, 00:25:16.940 |
you know, self-attention in order to be functional. 00:25:20.180 |
And I think that, I think this is where the challenge is 00:25:31.660 |
we've got so many pressures upon us every day, all day, 00:25:37.980 |
to be empathic and to build this idealized future 00:25:43.540 |
But on the other hand, I hear you and other people saying, 00:25:46.420 |
well, things are so much better than they were 00:25:48.500 |
even 50 years ago in terms of health outcomes, 00:25:54.180 |
the status of people having shelter, et cetera. 00:26:02.660 |
Well, they were, people suffering were elsewhere. 00:26:08.260 |
So there are a couple of levels of question here, 00:26:11.460 |
but the first one is perhaps, are we much better off, 00:26:20.340 |
there's so much incoming that we miss the fact 00:26:26.420 |
preventing us from seeing that we actually have so much 00:26:28.940 |
that we're, you know, 100 times better off than we were 00:26:34.940 |
'Cause I feel like a lot of the debates that I see online 00:26:37.780 |
about climate change, about health, about longevity, 00:26:40.700 |
it's like, it's overwhelming because I feel like 00:26:42.420 |
people aren't agreeing on the first principles. 00:26:45.300 |
So let's start with this, are human beings better off 00:26:49.460 |
in terms of health and longevity than we were? 00:26:57.700 |
because we can find peaks and valleys, right, 00:27:02.020 |
there's no better time to be alive as a homo sapien 00:27:08.940 |
- I mean, according to what metrics, like happiness? 00:27:13.340 |
even as we backslide in this country, being a woman, 00:27:23.740 |
and you stepped on a rusty nail 100 years ago, 00:27:31.940 |
that we don't even have to put anything on it, 00:27:33.020 |
we can just put it underneath high pressure water 00:27:39.100 |
So net net, this is the best time to be alive. 00:27:42.900 |
All the markers, you can go to Gapminder if you want, 00:27:50.580 |
The issue is that we are now at an inflection point 00:28:11.980 |
not only the next several years and several decades, 00:28:15.820 |
So you've hit it, we're being bombarded by information. 00:28:28.180 |
If we saw this beautiful tree, aesthetically, 00:28:32.060 |
and we saw maybe a tree over here that was on fire, 00:28:41.120 |
That being said, if you and I run a major media company, 00:28:44.720 |
you and I both know that the more negative stories 00:28:48.080 |
that we put out, the more hits we're gonna get. 00:28:49.900 |
- Not this media company. - Not this media company. 00:28:59.220 |
in the negativity and there's a real thirst and a hunger 00:29:06.800 |
But that negativity bias is still part of us, right? 00:29:09.660 |
I think one of the issues that we have to confront 00:29:17.260 |
the prefrontal cortex parts of us that are amazing, 00:29:20.180 |
that build microphones, that have conversations, 00:29:27.580 |
there are parts of us that happen below the surface 00:29:33.540 |
And we often wanna either be up here and say, 00:29:37.820 |
or we wanna wallow in the kind of the death and despair 00:29:40.540 |
and the horrific things that we can do to one another. 00:29:43.020 |
You know, my personal past on my father's side 00:29:47.740 |
in Homo sapien behavior, and that was not that long ago. 00:29:51.540 |
So if we wanna move into a place that allows us to ask 00:29:57.380 |
what I think is the fundamental question of our time, 00:30:00.860 |
which is how do we become the great ancestors 00:30:05.360 |
We need to find a way to both tap into the elephant 00:30:09.460 |
and the rider, which you'll do a better job than me 00:30:28.660 |
doesn't know or care about the clock or the calendar. 00:30:34.340 |
It doesn't care about whether or not that feeling 00:30:35.780 |
is relevant to the past, the present, or the future. 00:30:38.260 |
It just has a job, which is just to bring out 00:30:42.020 |
- You're jumping ahead a little bit, but that's okay. 00:30:43.740 |
Because what you're jumping into is when we ask 00:30:50.940 |
we wanna have empathy with future generations. 00:30:59.820 |
So if I ask someone, what do you want the future 00:31:02.660 |
to be like for your great grandkids in the 2080s? 00:31:06.140 |
And they give me a list of kind of bullet points, 00:31:08.580 |
but they're usually externalized bullet points. 00:31:15.420 |
much smarter than you and I have done this in studies. 00:31:16.700 |
Yaacov Trope at NYU is the one who taught me this. 00:31:25.300 |
this is the somatic marker hypothesis, right? 00:31:29.420 |
Where if you really want something to happen, 00:31:37.820 |
to the emotional amygdala sense of what that is 00:31:50.260 |
but also, as the kids say, sorry, not sorry, 00:31:58.180 |
some really brilliant work creating practices 00:32:00.900 |
where when one is not feeling what they want to feel, 00:32:07.100 |
like, are you supposed to feel your feelings? 00:32:08.500 |
Are you supposed to create new feelings in place of them, 00:32:11.460 |
And it's like, there's no clear answer to that 00:32:13.180 |
because it's complicated, infinite number of variables. 00:32:17.580 |
But she does have this interesting practice whereby, 00:32:31.700 |
to think back to a time when you felt particularly blank, 00:32:36.620 |
like a time when you felt particularly empowered 00:32:38.780 |
or particularly curious, it can be very specific, 00:32:45.580 |
and the idea is that in anchoring to the emotion state first, 00:32:50.140 |
you call to mind a bunch of potential action steps. 00:33:03.500 |
are linked to a bunch of action step possibilities, 00:33:07.820 |
where if you go into the room called sadness, 00:33:11.780 |
there are a bunch of action steps associated with 00:33:14.620 |
It's like curling up in the fetal position, et cetera. 00:33:16.300 |
You go into the room that's called excitement, 00:33:19.900 |
and there's all this idea about getting in vehicles 00:33:27.260 |
I believe, thinking about the emotional states of others, 00:33:37.980 |
cultivating some action steps that you can take 00:33:53.100 |
So it's not saying, "I want my kids to be happy. 00:33:58.160 |
It's feeling what it would be to be happy, no trauma. 00:34:10.420 |
What it does is, but it places it, it's like a kedge anchor. 00:34:12.700 |
So if you and I were sailors, which we're not, 00:34:17.260 |
And a kedge anchor is this anchor that you throw 00:34:19.580 |
30, 40 meters off to the side, it hits the bottom, 00:34:31.620 |
So time and time again, when we intellectualize 00:34:34.700 |
and we become overly cognitive in terms of futures 00:34:38.380 |
that we wanna see happen for ourselves, future Ari, 00:34:51.460 |
we have to actually connect the emotional state 00:34:58.300 |
that Marty Seligman says, that Freud got it wrong. 00:35:08.380 |
and that was neuroses and anxiety and depression. 00:35:11.180 |
Emotions are there to help us make better decisions 00:35:21.000 |
So what emotions do, it's not meant to be like, 00:35:26.140 |
I feel so terrible, and then I'm gonna go to my therapist, 00:35:30.300 |
or talk about all that stuff that happened in the past. 00:35:38.380 |
whatever you just did that had you in that situation, 00:35:43.300 |
Because if you do, you're gonna feel a certain way. 00:35:49.180 |
who had just been in a kind of a quasi-long-term relationship 00:35:57.540 |
is it's important to define the relationship. 00:36:07.100 |
and another group actually just got acetaminophen, 00:36:24.540 |
that emotions are there to guide future action. 00:36:35.140 |
as what we're connected to the future generations 00:36:38.100 |
that we wanna see, how we wanna see them flourish, 00:36:46.460 |
at an intellectual kind of two-dimensional level. 00:36:59.380 |
The reason for that is AG1 is the highest quality 00:37:02.820 |
of the foundational nutritional supplements available. 00:37:08.340 |
but also probiotics, prebiotics, and adaptogens 00:37:21.460 |
it's very difficult for me to get enough fruits 00:37:25.260 |
micronutrients, and adaptogens from food alone. 00:37:28.340 |
For that reason, I've been taking AG1 daily since 2012, 00:37:31.740 |
and often twice a day, once in the morning or mid-morning, 00:37:36.440 |
When I do that, it clearly bolsters my energy, 00:37:52.020 |
Right now, they're giving away five free travel packs 00:38:02.660 |
- I really like this because it gets to so many themes 00:38:05.660 |
that have been discussed on this podcast previously 00:38:08.220 |
and that exist in the neuroscience literature of, 00:38:11.700 |
like, yes, emotions don't know the clock or the calendar. 00:38:17.740 |
And oftentimes it's discussed as a bad thing. 00:38:21.300 |
you're not able to access the parts of your brain 00:38:26.340 |
except in light of what's immediately pressing. 00:38:29.100 |
I mean, I would say that stress in the short term 00:38:54.140 |
also leverages the fact that emotions don't know 00:39:05.180 |
then action steps born out of those emotions, 00:39:20.980 |
that has largely existed, at least to my knowledge, 00:39:29.100 |
in part because of an exercise that she's included in, 00:39:54.420 |
Most people don't have too much trouble doing that exercise. 00:39:58.660 |
I think you're supposed to take a little break 00:40:01.580 |
and then you do a perfect day exercise where there are no rules. 00:40:08.380 |
and you can imagine your day includes anything you want. 00:40:14.060 |
The room can morph from one country to the next, 00:40:18.820 |
And you also experience the sensations in your body. 00:40:23.700 |
And in that second exercise, it's remarkable, 00:40:29.180 |
there are little seeds of things kind of pop out 00:40:36.360 |
And they're not outside the bounds of reality. 00:40:40.000 |
And those are things that then you write down, 00:40:43.300 |
and that at least in my life have all borne out. 00:40:48.260 |
So this is something, an exercise you do routinely. 00:40:50.700 |
And when I first heard about this, I was like, 00:41:00.760 |
"I'm a neuroscience professor, you got to be kidding me." 00:41:06.560 |
And the reason I bring it up now in discussion with you 00:41:10.100 |
is I think you and Martha arrived at a similar place 00:41:16.740 |
you're talking about specifically toward building a future 00:41:28.940 |
is in a story that I heard a very long time ago. 00:41:34.480 |
That being said, this story exists in many cultures. 00:41:49.660 |
"How long will it be until this carob tree bears fruit 00:41:53.700 |
And he goes, "Oh, it'll be at least 40 years." 00:42:05.980 |
"So it's my job to plant this carob tree now." 00:42:15.980 |
is by planting carob trees whose shade we will never know. 00:42:20.820 |
And look, I can give you a bunch of examples. 00:42:34.940 |
it took a really long time to build great things. 00:42:37.300 |
So you go back 200, 300 years ago or even further, 00:42:42.020 |
and oftentimes, the architect and the original stonemason 00:42:52.740 |
It's doing things whose fruits you will not be around 00:43:14.660 |
- Like, I mean, I've seen some amazing architecture. 00:43:17.420 |
And I was like, "Okay, like, it'll be a beautiful building." 00:43:22.500 |
- But that whoa that you felt is what we call awe. 00:43:30.960 |
is what I am advocating for us to build in the world today, 00:43:35.380 |
is so that when our descendants look back and they say, 00:43:42.700 |
It's not because we necessarily built cathedrals. 00:43:56.420 |
I don't know who built the Blue Mosque specifically. 00:44:03.620 |
And even, you know, earlier this year, we were in Sydney. 00:44:11.680 |
I can tell you that the architect was Danish, 00:44:25.620 |
And gosh, this is the opposite of social media, right? 00:44:30.620 |
Social media, it's all about getting credit, you know? 00:44:34.360 |
And yet in science where people care a lot about credit 00:44:42.860 |
- It's also a business model of academic science right now. 00:44:45.100 |
- Right, which is that with the exception of Einstein 00:45:00.660 |
and I know a lot about the scientists that were ahead of him. 00:45:13.840 |
Talk about the discovery, people will build on it. 00:45:17.620 |
for which you won't get credit in the long run. 00:45:23.600 |
that's more relevant to everybody, not just scientists, 00:45:29.860 |
on these short-term contingencies, reward schedules, 00:45:33.600 |
where, you know, we achieve something, we get credit. 00:45:41.060 |
It's like, you know, podium, you know, bronze, silver, gold. 00:45:59.300 |
I mean, I don't wanna give too many examples, 00:46:02.260 |
for which there's an endowment the size of a country. 00:46:08.300 |
The buildings have names on the side of them. 00:46:10.500 |
The reason they have names on the side of them 00:46:26.460 |
So if people knew that if they gave half their wealth 00:46:31.900 |
and their name might be scraped off a building 00:46:33.660 |
in 200 years, they might feel differently about it. 00:46:57.540 |
how do we get ourselves working on short-term contingencies 00:47:04.660 |
for the next generation and let go of our need for credit? 00:47:08.340 |
- Great series of points and questions brought up. 00:47:10.340 |
So part of what you're talking about is egoic legacy, right? 00:47:15.460 |
It could be at any building at any major university. 00:47:25.180 |
Proof that you can bounce around and still be successful, 00:47:32.420 |
- Sproul Plaza, seed of the free speech movement, 00:47:35.180 |
although now you could argue not so free speech movement. 00:47:40.260 |
Sproul Plaza, like I can't tell you who Sproul was. 00:47:45.940 |
I can tell you that it was a free speech movement. 00:47:47.620 |
I can tell you that I saw certain bands play there. 00:47:49.900 |
I can tell you that it's supposed to be a place 00:47:52.060 |
where you can say anything and be exempt from 00:47:59.900 |
Maybe that's still true, but I don't think it is. 00:48:05.580 |
So Sproul Plaza, let's say 250 years from now, 00:48:10.580 |
that name will probably, it may or may not be there, 00:48:22.940 |
on the side of a building, that's one form of legacy. 00:48:29.540 |
That being said, if, you know, I have three children. 00:48:33.500 |
So let's say they continue on at 2.2 children or whatever, 00:48:57.300 |
And by the way, if you want to keep giving money 00:48:58.700 |
to put your name on the side of buildings, please do so. 00:49:07.620 |
people think of it as like, oh, people, egoic legacy. 00:49:10.060 |
Sure, also pays for hundreds of thousands of scholarships, 00:49:27.620 |
'cause remember, I'm not the kind of futurist 00:49:40.580 |
It's to help folks make better decisions today 00:49:44.020 |
so that we have better futures in the near term, 00:49:49.580 |
So what's going to impact those 50,000 Wallach descendants? 00:49:54.140 |
It's not gonna be anything that I did egoically 00:50:00.140 |
What's going to impact them, and we know this in many ways 00:50:06.060 |
what's going to impact them is going to be how I am 00:50:22.380 |
Not internet memes where I watch a lot of those, 00:50:25.300 |
but true memes, these cultural units that we hand off 00:50:33.620 |
to other generations, especially those closest to us. 00:50:39.860 |
Reduce your carbon footprint, give money, vote this. 00:50:42.860 |
I want all of those to happen in a positive way. 00:50:45.980 |
But at the end of the day, it's monkey see, monkey do. 00:50:56.100 |
but then everyone who's listening or viewing, 00:50:57.500 |
how they are with the person who hands them the coffee, 00:51:05.460 |
is going to impact the future in a greater way, 00:51:13.980 |
- I totally agree, and I think I'm old enough, 00:51:21.100 |
that I can make statements about being old enough 00:51:23.220 |
to know that, like, I believe that our species 00:51:28.020 |
I feel like most people, if raised in a low trauma 00:51:40.400 |
There are exceptions, and there may be sociopaths 00:51:43.300 |
that are born with really disrupted neural circuitry 00:51:45.740 |
that they just have to do evil or feel, you know, 00:51:48.580 |
but I think it's clear that trauma and challenge 00:51:53.580 |
can rewire behavior, and certainly the brain, 00:51:57.800 |
to create, you know, what we see as evil, right? 00:52:18.900 |
I'll probably live, hopefully, to be about 100, 00:52:21.940 |
Bullet, bus, or cancer, I'm going to give it what I got. 00:52:24.380 |
- It depends on whether or not you read your book fully. 00:52:27.620 |
That, there's a response to that that could go either way. 00:52:31.040 |
The, I like to think that reading the book fully 00:52:37.580 |
- But if nothing else, maybe it'll cure insomnia. 00:52:40.300 |
The idea here is that if we're going to invest 00:52:49.420 |
one would hope that other people will respond to that 00:53:07.460 |
but like spontaneous etiquette, more genuine etiquette, 00:53:14.860 |
- And I have a theory and I'll go through this quickly. 00:53:19.500 |
I saw a documentary recently about the history of game shows 00:53:26.080 |
when DiMaggio was making a run on the home run record. 00:53:29.420 |
So they used a sports game that was televised 00:53:35.080 |
which were basically commercials for the products. 00:53:49.260 |
is the reality TV show and we're all able to opt in 00:54:06.500 |
I've tried and I think managed to some extent to do so too. 00:54:15.260 |
in this reality TV show that we are all in on social media 00:54:19.160 |
by just being super nice to everybody and being, 00:54:29.360 |
'Cause it's less interesting, there's less drama. 00:54:33.360 |
But I do think that there are pockets of that. 00:54:39.040 |
where people are rewarded for being benevolent, 00:54:45.920 |
And I say social media because I think so much of life now 00:54:50.720 |
And that's the opportunity to reach people across continents 00:55:03.080 |
that is not just on the half-life of like 12 hours, 00:55:06.560 |
what was tweeted, et cetera, what was retweeted? 00:55:10.200 |
and even the highest virality social media posts 00:55:15.200 |
have a half-life of about six months to a year. 00:55:26.960 |
So is there a time capsule sort of version of social media? 00:55:35.560 |
Because I look on the internet, like on YouTube, 00:55:51.780 |
maybe Brené Brown's TED Talk on vulnerability. 00:56:18.080 |
But I like to think that these podcast episodes 00:56:19.920 |
are gonna project forward 30, 40 years into the future. 00:56:22.360 |
But if we look at the history of what's on YouTube 00:56:25.200 |
and we look at the half-life of any social media post, 00:56:33.060 |
One would hope that they morph into something that lasts, 00:56:40.440 |
to teach the sorts of principles that you're talking about? 00:56:43.920 |
- In the show that I just did, "A Brief History of the Future," 00:56:54.720 |
because what these caves have in them side by side 00:57:03.520 |
It's one of the few places where they exist side by side. 00:57:07.960 |
we have to talk about what that really is, is storytelling. 00:57:12.960 |
And we're trying, in social media as we know it right now, 00:57:16.360 |
we're trying to tell the world a story about who we are 00:57:25.960 |
But when we go back to that cave that I stood in 00:57:27.800 |
where those drawings were from 40, 50,000 years ago, 00:57:39.480 |
you should expect to see these animals in this area, right? 00:57:47.240 |
So when you, because that's the way we used to think 00:57:50.360 |
from 40,000 years to the agricultural revolution, 00:57:54.960 |
to probably up until a couple of hundred years ago, 00:57:59.060 |
The minute hand only existed on the analog clock 00:58:06.660 |
We barely thought, look, the clock as we know it, 00:58:11.400 |
only comes about during the industrial revolution. 00:58:14.520 |
And especially then when we start to have trains, 00:58:18.200 |
- It was Stonehenge, it was the sundial, it was seasons, right? 00:58:24.240 |
wait, when people say, "Oh, Ari, you're a futurist." 00:58:29.180 |
No, the idea of the future that is this thing out there 00:58:38.020 |
'Cause up until a couple of hundred years ago, 00:58:45.060 |
There was no kind of evolution in social structure. 00:58:50.180 |
- I guess it could be argued I've done a lot of things 00:58:53.100 |
He is a scientist and there are other domains of life. 00:59:02.560 |
- Yeah, so my dad would say, "You'd open the paper." 00:59:05.560 |
from behind when I wanted his attention, yeah. 00:59:08.080 |
- We can talk about that in a second, the attention part. 00:59:15.760 |
when I started answering your question about social media, 00:59:22.000 |
I wanna say, why is it that we're doing what we're doing? 00:59:27.560 |
so that we can potentially go in a different direction 00:59:39.380 |
so that we can flourish and move forward as a species. 01:00:11.600 |
that has everything you need and nothing you don't. 01:00:13.760 |
That means the electrolytes, sodium, magnesium, 01:00:15.960 |
and potassium in the correct ratios, but no sugar. 01:00:19.640 |
Now, I and others on the podcast have talked a lot 01:00:31.580 |
It's also important that you get adequate electrolytes 01:00:33.920 |
in order for your body and brain to function at their best. 01:00:36.640 |
The electrolytes, sodium, magnesium, and potassium 01:00:51.400 |
and I drink that basically first thing in the morning. 01:00:55.680 |
during any kind of physical exercise I'm doing, 01:01:19.040 |
I mean, one of the reasons I fell in love with biology 01:01:32.940 |
it's a core truth about us way back when and now, 01:01:40.000 |
And of course, technologies will modify that, 01:01:49.840 |
that I described on the podcast about viewing sunlight, 01:01:53.760 |
has been core to our biology and our wellbeing 01:01:59.560 |
and very likely it will be core to our biology 01:02:08.440 |
that shortens up our timescale of motivation and reward. 01:02:20.440 |
so I am not anti-social media by any stretch. 01:02:38.920 |
people are going to say I'm a gambling addict, 01:02:47.320 |
And I often do pretty well for whatever reason, 01:03:01.600 |
that there are these other longer timescales. 01:03:04.160 |
- That's why there's no natural light in most casinos. 01:03:06.000 |
- There's no lights, there's no clocks in many of them. 01:03:13.560 |
that's there is designed to keep you playing. 01:03:16.680 |
And I would argue that a lot of social media is like that. 01:03:25.280 |
They want to fight 'cause they like that emotion, 01:03:29.900 |
so that they shorten up your temporal window. 01:03:35.480 |
oh, we're walking around with a little slot machine 01:03:37.080 |
in our pocket all day long with our smartphone, 01:03:43.400 |
where that casino harbors all sorts of different games 01:03:46.320 |
and they're gonna find the one that you like. 01:03:57.020 |
A friend of mine who's actually an addiction counselor, 01:04:04.080 |
Because the next time really could change everything. 01:04:09.540 |
where the next time is just gonna take you further down. 01:04:12.040 |
In gambling, there is the realistic possibility 01:04:29.280 |
in order to get into these longer-term investments 01:04:52.840 |
about what was happening at the debates from other people? 01:05:01.280 |
is I can get caught in the little eddy of the tide pool 01:05:08.060 |
or the debate about the debate about the debate. 01:05:19.560 |
much less get into this longer-term thinking. 01:05:21.880 |
And maybe this is why David Goggins is always out running 01:05:26.260 |
even though he's used it to good end to share his message. 01:05:36.160 |
to disengage from that short-term contingency reward mindset 01:05:54.400 |
we need to tend to our kids, we need to tend to our health, 01:05:56.200 |
we need to get our sleep, we need to get our... 01:06:02.360 |
which doesn't leave a whole lot of time afterwards anyway. 01:06:07.880 |
Like, where are the story, where should the stories go? 01:06:12.840 |
I feel really impassioned by this because, you know, 01:06:19.440 |
And I teach biology because I believe it's fundamental 01:06:21.840 |
and transcends time, but I care about the future. 01:06:26.720 |
And I'm well aware that, you know, in 30 years, 01:06:31.440 |
the idea that there was a guy on the internet 01:06:32.960 |
talking about the importance of getting morning sunlight, 01:06:44.960 |
enjoyed the research, enjoyed the day-to-day, 01:06:48.160 |
I feel good about the research contributions we made, 01:06:51.760 |
but that I knew that people weren't gonna be like, 01:06:57.720 |
because I had already forgotten the people 32 years, 01:07:01.320 |
in my head, and I know the literature really well. 01:07:04.280 |
So, like, how do you square these different mental frames? 01:07:13.960 |
- This is the fundamental question of our time, 01:07:16.640 |
is what is the purpose of our species being here on earth? 01:07:21.640 |
And for thousands of years, that was answered by religion. 01:07:25.720 |
The idea about who we are and why we are here, 01:07:29.760 |
more often than not, was answered in the afterlife. 01:07:33.640 |
But then along came our friend, rationality and logic, 01:07:41.360 |
And as Nietzsche said, I'll give you the full quote, 01:07:43.960 |
"God is dead, and now we're basically screwed." 01:07:49.520 |
I mean, I've gone on record saying that before. 01:07:54.160 |
but it still is difficult to navigate the day-to-day. 01:07:56.640 |
- Because I wanna separate out what scientific rationality 01:08:05.040 |
What it actually did was it killed the structures 01:08:07.680 |
that arose to intermediate between us and God, 01:08:12.520 |
And this is not a conversation about theology. 01:08:14.600 |
This is a conversation about structures and about power. 01:08:21.840 |
It destroyed the stories that religion told us 01:08:30.600 |
"Well, science destroyed God and destroyed religion 01:08:47.280 |
going back 13.7 billion years ago to the Big Bang, 01:08:52.640 |
up to today, science is telling us how we got to this point. 01:09:02.720 |
And so, what, I'm not saying God should be telling us 01:09:10.840 |
- You're not gonna argue you can tell God what to tell us? 01:09:26.840 |
this mindset that I'm advocating for, I call long path. 01:09:35.520 |
These are the kind of, to use your nomenclature, 01:09:41.440 |
Empathy with yourself, empathy with the past, 01:09:53.120 |
'Cause we often think of the future as a noun, 01:10:04.840 |
is this idea of telos, ultimate aim, ultimate goal. 01:10:10.200 |
So we all suffer from what I call a lifespan bias. 01:10:13.440 |
So the most important unit of time to Andrew Huberman 01:10:25.080 |
I grew up, and I wanna be a geneticist, right? 01:10:41.080 |
and it has been going back hundreds of thousands of years, 01:10:52.400 |
There's massive overlaps in terms of the culture, 01:10:55.080 |
the emotional, the psychology of what I got from them, 01:11:00.200 |
But what ends up happening in a lifespan bias society, 01:11:06.320 |
We have lost the ultimate aim or goal or purpose 01:11:09.480 |
for our species, for our civilization on this planet. 01:11:15.200 |
What I am gonna say is when you don't have that, 01:11:21.400 |
we flounder about, and we're looking for metrics to judge. 01:11:27.440 |
Will people know who I am 200 years from now? 01:11:29.640 |
Is my sense of purpose connected to anything larger? 01:11:35.600 |
And without these larger religious structures 01:11:37.800 |
that we had for thousands of years, the answer is no. 01:11:40.560 |
- But there are still many people on the planet 01:11:45.640 |
- Yes, more than there are that aren't religious. 01:11:59.200 |
and this is where I'm gonna get a lot of hate mail, 01:12:01.000 |
is mostly about power and coercion and control. 01:12:07.400 |
- And I would say that for every major religion. 01:12:16.920 |
from the human condition to connect to something larger, 01:12:21.560 |
The problem is when the business models get in the way, 01:12:30.200 |
I mean, I know a lot about the business models of science. 01:12:33.400 |
Science, it's no longer like pure Medici-type science 01:12:46.440 |
What I'm asking for when we have a conversation 01:12:48.720 |
about our telos is to rise up out of this current moment 01:12:52.880 |
and say, most mammals kind of have about a million years 01:12:57.880 |
that they exist on earth from kind of when they rise up 01:13:01.760 |
We're in the first third of this ballgame, right? 01:13:13.440 |
Well, you finally said something that gives me, 01:13:15.280 |
Lots of things that you've said give me confidence 01:13:21.680 |
sorry to interrupt, but I'm going to compliment you. 01:13:54.120 |
- And high salience, high stress, high excitement, 01:13:57.280 |
life and thinking shrinks the aperture, right? 01:14:04.360 |
and makes us very good at dealing with things 01:14:06.480 |
in the present, get to the next day or the next hour, 01:14:08.720 |
collapse, go and continue, repeat, repeat, repeat. 01:14:24.320 |
we'll have to edit this in, the Asatoma Prayer, 01:14:56.760 |
it works to bring your level of autonomic arousal down, 01:15:02.280 |
But it is the hyper-rare individual who thinks, 01:15:06.080 |
"Well, look, this is linked to some larger time scale." 01:15:14.920 |
about this dynamic relationship with that horizon. 01:15:27.520 |
to actually create useful tools for the future? 01:15:31.720 |
Like, so for instance, before we started recording, 01:15:34.040 |
we were talking about the notion of time capsules. 01:15:36.280 |
I've been keeping a time capsule for a long time. 01:15:38.160 |
The first idea for this came when I was a kid. 01:15:39.840 |
We used to build skateboard ramps in the backyard. 01:15:41.960 |
And I'll never forget that right before we put down 01:15:43.640 |
the first layer of plywood, we put a time capsule in there. 01:15:46.800 |
We all, like, wrote little notes and did things, I think. 01:15:49.960 |
Someone put some candy in there or something. 01:15:53.440 |
But social media, to me, does not seem like a time capsule. 01:16:01.360 |
What are the real time capsules of human experience? 01:16:10.720 |
But those are the big three, Bible, Koran, Torah. 01:16:15.200 |
Then we've got literature, music, poetry, visual art. 01:16:28.000 |
- So let's bring this down to the individual. 01:16:48.200 |
I haven't invited you or you just, I don't know. 01:16:54.840 |
So we have a shelf with a bunch of family photos. 01:17:01.140 |
And, you know, there's photos of my grandparents, 01:17:08.760 |
there's actually, and people are always like, 01:17:17.720 |
Those, you know, I have three kids, they're young, 01:17:19.760 |
but that blank photo frame represents my grandkids 01:17:26.440 |
It's just something that I can immediately see 01:17:36.140 |
or I have a conversation with you or anything like that, 01:17:39.100 |
and I immediately have this stimulus arousal response 01:17:43.580 |
but I actually want to see the bigger picture. 01:17:51.340 |
I'll say like, what are we really trying to do here? 01:17:55.780 |
And that, because I've been doing this long enough, 01:17:59.260 |
And when I see that third empty picture frame, 01:18:01.680 |
it always reminds me that I'm here for this one segment. 01:18:17.000 |
is you're concretizing a process, a protocol, if you will, 01:18:21.960 |
And I would argue that the shift from printed photos, 01:18:25.560 |
largely from printed photos to electronic photos 01:18:35.160 |
as opposed to having to actually take photographs 01:18:53.960 |
were saved by my father who was in World War II, 01:18:58.320 |
made his way through Europe to Cuba to Mexico, 01:19:00.600 |
where he eventually met my mom and I was born. 01:19:05.080 |
he had kept in his wallet for several decades 01:19:08.200 |
and he had them kind of reconstructed and turned in. 01:19:13.440 |
- Okay, and then you're married, you have three kids, 01:19:39.120 |
- I don't think this was a name I gave to him. 01:19:39.960 |
- Ari and I have known each other since we were little kids. 01:19:43.840 |
Actually, he hurt himself when he was younger 01:19:51.720 |
kind of like David Goggins would treadmill on his hands, 01:19:56.160 |
Okay, so chances are you'll meet your grandkids. 01:19:59.680 |
- Yeah, God willing, you'll meet your grandkids. 01:20:13.200 |
like in the sense that this big lump of cells 01:20:18.200 |
will probably not meet my great-grandchildren, 01:20:22.960 |
I'm 100% sure of is the way that I've modeled 01:20:30.740 |
be they my wife, my children, business, colleagues, 01:20:35.740 |
that modeling, my kids will be in the room sometimes 01:20:51.740 |
That is how I'm gonna meet my great-grandkids. 01:20:59.780 |
That 50,000 descendants that I talked about earlier, 01:21:27.860 |
probably at least, you know, I think 30, 50 years out, 01:21:31.580 |
if you Google your name or whatever it's called 01:21:36.340 |
People go, "Why don't you talk about a different?" 01:21:40.420 |
Unless you use DuckDuckGo 'cause you're afraid 01:21:43.340 |
So when someone comes up with a truly better one, 01:21:53.980 |
They could hear this conversation, this very conversation. 01:21:57.620 |
why people go on social media, not just to be consumers, 01:22:02.420 |
They're probably not thinking about it consciously, 01:22:03.820 |
but they wanna leave something for the future. 01:22:10.580 |
He has this Your Life in Weeks, I think it's called. 01:22:26.580 |
unless there's some technology that would allow me 01:22:30.740 |
So, and you mark off that you fill in these little squares. 01:22:37.300 |
And, you know, I'm not quite halfway through, 01:22:41.260 |
And it's an interesting thing to see your life 01:22:46.100 |
Oh, wow, it can inspire better decision-making 01:22:49.380 |
because we can lose track of where we are in time. 01:22:55.020 |
People that have ever waited for me on an appointment 01:22:57.940 |
I don't, I track, I'm very oriented in space, 01:23:11.860 |
But the problem is that they're not in the forefront 01:23:15.100 |
of our consciousness throughout the day, right? 01:23:21.020 |
I didn't even think about it again until now. 01:23:26.380 |
in some cases, we have the opportunity to step back 01:23:28.340 |
and say, okay, look, in the bigger arc of things, 01:23:30.700 |
I got to go left here, even though I want to go right. 01:23:32.940 |
This is the right thing for my, the bigger picture. 01:23:40.540 |
is there maybe a technology that actually serves us 01:23:43.500 |
to anchor us to best decision-making for a given, 01:23:47.640 |
best time bin, we would call it in neuroscience, 01:23:55.700 |
- I think you need to ask yourself a question. 01:23:58.740 |
not should I have turkey or chicken for lunch, 01:24:01.980 |
but maybe a slightly, or maybe that question too. 01:24:04.620 |
Just ask yourself, am I being a great ancestor? 01:24:10.180 |
How will descendants look back on this decision, 01:24:21.780 |
named Hal Hershfield, smart, great guy at UCLA. 01:24:21.780 |
functional MRI to see kind of where the flow is. 01:24:37.180 |
And he asked them, he did a series of questions 01:24:41.100 |
where it's like, think about yourself right now 01:24:48.260 |
I think he used Matt Damon and Natalie Portman 01:24:52.140 |
And he said, I want you to think about yourself 01:24:56.200 |
The part of the brain that lit up for the celebrities, 01:24:58.580 |
Natalie and Matt, was the same part that lit up 01:25:03.780 |
So you got a vague idea of who future Ari was, 01:25:06.580 |
but you weren't totally connected to them, right? 01:25:19.100 |
and then puts them into a 3D virtual reality. 01:25:25.460 |
As you walk across the room, you see a mirror 01:25:35.340 |
- Very cool, does this intervention, pulls them out, 01:25:40.220 |
And he has them hypothetically put money away 01:25:42.700 |
for savings account, you know exactly what happens. 01:25:45.540 |
The people who saw a version of their age self 01:25:47.980 |
put more money away for a future retirement account 01:25:51.860 |
So the question is, not only are we disconnected 01:26:02.300 |
So what I've done, and you'll see this in the show, 01:26:06.900 |
and you'll always look like your dad when you do this, 01:26:09.100 |
is even though we've been bagging on social media, 01:26:17.980 |
and you can send it to your partner and everybody laughs. 01:26:22.420 |
- Everybody laughs as opposed to saying you look great. 01:26:26.340 |
And so once I read about Hal's research many years ago, 01:26:31.340 |
I printed that out, you know, my little home printer, 01:26:48.620 |
It's about also, how do I take care, you know, 01:26:52.900 |
You know, you get it at the end of the night, 01:26:54.660 |
you wanna just brush your teeth and go to bed. 01:27:00.460 |
- And I learned from the dentist right before sleep. 01:27:01.780 |
- The most important way to take care of future self 01:27:14.060 |
But if you look at your mouth 20 years from now, 01:27:20.000 |
a little bit more wrinkles, you're gonna do it. 01:27:47.580 |
Can I just ask you a question real quickly before here? 01:27:58.220 |
not as a way to scare yourself into better health habits, 01:28:11.740 |
and that person needs care and in an environment 01:28:20.240 |
I feel like barring accident or injury or disease, 01:28:36.420 |
and would see him occasionally around Palo Alto 01:28:38.860 |
and then read the Walter Isaacson biography about him 01:29:03.460 |
and to discard with a lot of kind of like popular convention 01:29:08.860 |
but I think most people celebrate him for it. 01:29:11.060 |
I guess he had some sense of how long he was gonna live. 01:29:15.060 |
And then at one point maybe that sense was inflated 01:29:21.780 |
Do you think that that gave you a perspective 01:29:23.980 |
that at any moment you could be four months out, 01:29:32.480 |
Like did it shape your thinking about the future? 01:29:35.500 |
I mean, my dad's now, I'm not saying this as a, 01:29:39.640 |
that there may have been a distinct advantage, 01:29:43.600 |
but to the idea that it really creates this sense of urgency 01:30:03.440 |
maybe they feel less of a sense of urgency, right? 01:30:07.320 |
Parents are alive, vigorous, okay, that's a blessing. 01:30:12.520 |
in a way that's really linked to your futures, 01:30:16.400 |
So do you think that we have an intuitive sense 01:30:18.560 |
or an unconscious sense of how long we are likely to live, 01:30:35.360 |
that keeps us from thinking about the far future 01:30:47.200 |
means that you're gonna have to think about a moment 01:30:54.200 |
which you'll know all about the book based on the title, 01:30:59.280 |
And Becker's contention was that we're the only species 01:31:06.700 |
that we are only here for a short period of time, 01:31:09.460 |
but more than anything, at one point in time, we will die. 01:31:15.860 |
that everything, religion, culture, laptops, convertibles, 01:31:21.200 |
everything that we create is our way of pushing back 01:31:28.560 |
- I could not agree more, and I'm so, so grateful 01:31:31.640 |
that you mentioned this book and this idea from Becker, 01:31:37.000 |
every single addiction, is based in a fear of death. 01:31:40.440 |
And an attempt to shorten the timescale of thinking, 01:31:45.480 |
shorten the timescale of everything to avoid that reality. 01:31:49.720 |
And it's a reality that we learn of at a very early age, 01:31:57.240 |
especially in the Western world, we push back from death. 01:32:00.040 |
We do everything we can to avoid, even old people, 01:32:12.760 |
because older people, I would argue, remind us of death, 01:32:17.320 |
And so until we can reconcile ourselves truly 01:32:20.600 |
at an individual, and maybe even at a collective level, 01:32:23.040 |
that we will cease to exist, it becomes extremely 01:32:25.800 |
and is extremely difficult to future, to future properly, 01:32:28.840 |
to future in the way that I'm advocating for, 01:32:35.800 |
And so in the work that I've done and in the show 01:32:38.840 |
that I did, I did something, people were very confused. 01:32:41.880 |
The show about the future, "Beef History of the Future," 01:32:46.440 |
"all this cool technology, blah, blah, blah, blah." 01:32:50.360 |
But in the middle of the show, in episode four, 01:32:56.640 |
but I go to the high mountain desert outside of Tucson, 01:32:59.600 |
and I sit with Alua Arthur, a death doula. 01:33:05.600 |
when we think of a doula, we think of someone helping birth 01:33:20.000 |
She does something called a death meditation. 01:33:21.720 |
And in the show I do it, and you can find these online, 01:33:25.360 |
where you literally go through a guided meditation 01:33:29.200 |
where you go from breathing to cessation of breath 01:33:32.160 |
to literally just becoming one with the soil. 01:33:38.160 |
But I went through a version of the death meditation, 01:33:41.200 |
as you've alluded to, when I was 18 years old. 01:33:43.800 |
'Cause I literally am the one who picked up the phone 01:33:52.440 |
I picked up the phone, and I said, "This is his son." 01:33:54.480 |
'Cause who else was calling at two in the morning? 01:34:15.320 |
of his mortality and my own mortality very, very abruptly. 01:34:20.320 |
Other people have their own early brushes with death. 01:34:32.200 |
But when you've come close to seeing what that looks 01:34:34.560 |
and feels like, you all of a sudden become free 01:35:09.960 |
90% of the companies that are over 1,000 years old are in Japan, 01:35:19.600 |
and it has a lot to do with how they think and respect elders and death, 01:35:22.000 |
and they understand that we don't need to exist forever, that we are part of 01:35:29.040 |
a great chain of being, those who came before, 01:35:32.600 |
the pros and cons of that, the baggage of that, 01:35:35.080 |
and then it's my role to decide what I wanna keep 01:35:39.080 |
and then what I wanna transmit to the next generation. 01:35:47.320 |
that I think we need back in Western society, 01:36:00.200 |
what we do or do not do about synthetic biology, 01:36:02.400 |
what we do or do not do about artificial intelligence, 01:36:05.040 |
'cause right now, especially on the last two, 01:36:09.720 |
and we don't need more smartness, we need more wisdom, 01:36:15.560 |
by us integrating the fact that you alluded to 01:36:25.520 |
whoever ends up in that empty frame to have a better life, 01:36:34.240 |
Like I think most people assume once it's lights out, 01:36:39.880 |
for something that they don't have the ability to imagine 01:36:45.620 |
So in other words, if we have a hard enough time 01:36:47.680 |
imagining ourselves in the future, you gave us a tool, 01:36:52.280 |
I love that, and if there's a website that will do that, 01:36:54.660 |
we can put a link to it in the show note captions. 01:37:01.200 |
and try and live for the wellbeing of that person 01:37:15.240 |
the futures approach, the verbing of the future 01:37:24.000 |
and for people that we don't even really know 01:37:31.160 |
let's double click on the individual incentive. 01:37:34.120 |
So we talked about the aging photo that you can do. 01:37:37.680 |
There's also another thing you can do that's very powerful, 01:37:40.860 |
which is writing a letter to your future self. 01:37:55.600 |
people have me come and talk to large groups. 01:38:06.320 |
"is I want you to write a letter to your future self. 01:38:08.080 |
"It's gonna be delivered in five years from now." 01:38:13.520 |
because I've been doing it from a very early age, 01:38:18.360 |
- Yeah, I can't, I mean, maybe once or twice. 01:38:26.680 |
The change occurs not when you receive the letter, 01:38:34.120 |
about future you in a way that you normally don't, 01:38:42.720 |
is people come up to me afterwards, 01:38:42.720 |
"Like, what's that arc of what I wanna kinda connect to? 01:38:57.840 |
that if you can't look at a photo of yourself aged, 01:39:00.220 |
at the very least, write a letter to your future self. 01:39:23.960 |
and you can't be it if you can't see it, right? 01:39:30.680 |
but what you really wanna see aspirationally in that letter 01:39:33.800 |
now starts creating a roadmap to getting there, 01:39:38.360 |
because at the very kind of bottom of the pyramid 01:39:42.400 |
is visualizing what that success looks like, right? 01:39:58.760 |
I started running the four-by-100, which is a relay race. 01:40:01.360 |
And what I learned from my coach, Coach Ted Tillian, 01:40:10.760 |
But where that race is won or lost is in the transition zone, 01:40:17.360 |
And so when you write a letter to your future self, 01:40:30.400 |
and you're carrying a baton that was handed to you 01:40:45.080 |
and what we do or do not do in this intertidal, 01:40:50.300 |
that is homo sapien, planetary, flourishing culture, 01:40:53.580 |
is gonna matter much more than we think it does 01:41:10.980 |
which some of these protocols will help you do, 01:41:14.600 |
that we wanna actually tackle the question of to what end, 01:41:27.700 |
some sort of guarantee that we would go on to heaven or hell, 01:41:31.100 |
now that that is no longer there for a lot of people, 01:41:34.700 |
and it still helps them make better decisions, 01:41:44.680 |
that the decisions that we make or do not make 01:41:59.140 |
Or we can say we wanna be part of a much larger project. 01:42:12.400 |
we're kind of at the bottom or the top of the third. 01:42:14.340 |
We have at least several hundred thousand more years to go. 01:42:17.080 |
I am not as focused as to whether or not we leave Earth 01:42:42.480 |
So instead of there being like a hundred great heroes 01:42:47.760 |
you know, like the Dalai Lama or Mother Teresa 01:43:00.320 |
how do you build transgenerational empathy with the past? 01:43:03.300 |
Read people's biographies, especially autobiographies, 01:43:11.840 |
are, of course, through their own lives, right? 01:43:17.160 |
"God, that person was kind of an asshole," right? 01:43:28.560 |
are at this heightened sense of kind of intellectual 01:43:45.760 |
whatever the population of Homo sapiens is on planet Earth 01:43:48.280 |
over the next several centuries or millennia, 01:43:54.000 |
that's beyond what science fiction has ever even showed us, 01:44:00.040 |
what Andy, what Andrew Huberman is doing in his work, 01:44:02.440 |
what Ari Wallach is doing, is contributing to that, 01:44:02.440 |
that we are now sorely lacking in a social media world 01:44:14.800 |
of instant buying of crap that we don't need on the internet. 01:44:20.200 |
and it's just a shorter timescale reward thing. 01:44:26.880 |
or the pleasure that we get in our lifespan or day is bad. 01:44:31.040 |
I don't think, you know, I'm a capitalist too. 01:44:36.420 |
it is but one time window of kind of operations. 01:44:39.580 |
I just think it's good to have flexibility, right? 01:44:46.180 |
- It's not about balance, it's about harmony. 01:44:52.340 |
- So I love it, and I also know that a lot of people love it 01:45:15.160 |
I don't know how people feel about politicians nowadays, 01:45:19.840 |
but, you know, but the people building technologies 01:45:32.120 |
That they will take care of it for next generations, right? 01:45:36.460 |
Just like there were those, the Edisons and the Einsteins 01:45:55.760 |
like the appreciation of our relationship with animals 01:45:59.200 |
to our own understanding of ourselves and our planet, 01:46:03.060 |
So, you know, those people ushered in the life 01:46:07.800 |
that I've had and I feel pretty great about that. 01:46:26.320 |
they carry forward the patterns and the traits 01:46:29.400 |
and certainly the responses that they observe 01:46:33.160 |
in their parents, what's okay, what's not okay. 01:46:35.320 |
You know, starting in the '80s and in the '90s 01:46:37.400 |
in this country, there were many more divorces 01:46:39.720 |
and fractured homes than there were previously. 01:46:43.080 |
As a consequence, there's also been a fracturing 01:46:47.120 |
of the kind of collective celebration of holidays. 01:46:51.160 |
Like the things that have anchored us through time 01:47:00.380 |
You know, people were getting Christmas presents 01:47:02.840 |
So, you know, do you think that the kind of fracturing 01:47:17.140 |
- Look, I think it's the fracturing of the institutions 01:47:22.060 |
that have been with us for the past several hundred years 01:47:31.520 |
Maybe for a moment, we could just talk about universities. 01:47:34.080 |
These days, in part because of the distrust of science 01:47:37.640 |
and in part because of the distrust in government 01:47:39.840 |
and in part because of the distrust in traditional media, 01:47:42.640 |
there are more and more ideas being kicked around 01:47:47.200 |
that, you know, formal education is not as valuable 01:47:52.280 |
as it used to be, and people always cite the examples 01:47:52.280 |
of famous dropouts, but I would argue they got in and chose to leave. 01:47:58.640 |
They didn't drop out, and they are rare individuals. 01:48:10.320 |
that needs to stay in college, with rare exception, 01:48:16.120 |
that needs to be tended to because nowhere else in life, 01:48:20.080 |
is there such a clear designated set of steps 01:48:22.640 |
that can take you from, you know, point A to point B 01:48:32.840 |
but I would also argue that academic institutions 01:48:44.740 |
so we are having a harder time relying on them 01:48:49.200 |
You saw a lot of presidents of major universities step down, 01:48:56.600 |
but also Harvard and other places for different reasons, 01:49:06.160 |
They have new ones in, and so there's a lot of distrust, 01:49:13.640 |
Like if it's not, if people are having less faith 01:49:16.080 |
in religion, less faith in academic institutions, 01:49:26.440 |
but not about what the systems we wanted them to be, 01:49:29.160 |
because going back several hundred years ago, 01:49:33.880 |
especially, well, Renaissance into the Enlightenment, 01:49:42.660 |
and the ability to kind of understand the world 01:49:45.360 |
by breaking it down into its component parts. 01:49:51.440 |
and we're at the point now where we're really good 01:49:54.640 |
at saying what doesn't work, but very, very bad 01:49:57.720 |
about saying what does work and what we do want, 01:50:02.360 |
that we have to put forth some sort of meta-narrative, 01:50:12.360 |
which is scary for a lot of people, 01:50:16.760 |
'cause it's one thing to say that doesn't work, 01:50:21.280 |
What you're describing has incredible parallels to health. 01:50:26.360 |
and even before when I was posting on social media, 01:50:30.520 |
and it was like all this fear about everything, 01:50:32.240 |
and I said, "Listen, like, I can't solve this larger issue 01:50:40.280 |
"People are stressed, stress is bad when it's chronic. 01:50:51.100 |
So a lot of the backbone of the "Huberman Lab" podcast 01:50:58.520 |
So what you're describing is essentially a field 01:51:00.040 |
that consists of, like, breaking things down, 01:51:04.600 |
and I think that people love potential solutions. 01:51:08.980 |
"Look, this might not solve every sleep issue," 01:51:19.800 |
And of course, those for whom the tools don't work 01:51:22.760 |
and they need to go to more extreme measures. 01:51:25.080 |
But I hear you saying that religion provided the solutions, 01:51:32.480 |
People are not looking at that as much anymore. 01:51:35.240 |
The big institutions, like academic institutions, 01:51:41.480 |
regardless of which side of the aisle one sits on, 01:51:41.480 |
it's like a 12-hour news cycle designed to just point fingers 01:51:46.040 |
rather than say what they really believe in a clear, tangible way. 01:51:49.840 |
There are those that do that a bit more than others, 01:52:00.120 |
I feel like family units and values and structures 01:52:04.740 |
at least in the traditional view of the family. 01:52:07.040 |
- Let's remember- - Two parents, kids, et cetera, 01:52:18.740 |
but that part of the work of being a human being 01:52:22.460 |
now and going forward is to learn this futures approach? 01:52:31.500 |
We have to critically assess where we came from 01:52:37.980 |
The idea that your children would be "sleep trained," 01:52:46.300 |
where you would put your kids in another room, is a relatively recent one. 01:52:49.140 |
Because if you go back to most indigenous cultures, 01:52:56.540 |
- Yeah, or in one big room or in a long house. 01:53:07.780 |
Look, I'm gonna say this in a nonjudgmental way, 01:53:16.100 |
being pushed by a seemingly healthy adult, right? 01:53:19.860 |
The kid is detached and they're in this kind of this buggy, 01:53:25.560 |
But if you look at most cultures around the world 01:53:38.420 |
and the baby would be wrapped and be held very close to them. 01:53:42.740 |
- No, the baby bjorn, you put the baby in front of you, 01:53:45.500 |
When you really wrap them with like a 20 yard wrap, 01:53:52.700 |
like everything, there's a reason for everything. 01:54:00.260 |
For a human baby to be born as cognitively, intellectually, and physically ready 01:54:03.260 |
as a baby chimpanzee would take 18 months of gestation, 01:54:21.940 |
because when we went from walking on all fours to walking upright, the pelvis narrowed, 01:54:26.940 |
and there's only so much room for that baby to come out, 01:54:29.940 |
- Yeah, if the brain had completed development internally, 01:54:38.540 |
that many mothers and babies died in childbirth 01:54:49.620 |
But what that means is the baby has to be attached 01:54:51.660 |
and close to the mother because it's totally helpless. 01:55:02.420 |
I would argue that breakdown isn't happening now. 01:55:04.740 |
That breakdown happened when we started to move 01:55:17.820 |
that after the female goes through menopause, 01:55:21.940 |
Basically elephants, whales, and humans, right? 01:55:25.240 |
'Cause those are the species where you need others, 01:55:29.780 |
because of the aforementioned early birthing. 01:55:33.140 |
- But maybe it's also the propagation of story, 01:55:36.620 |
as you said earlier, that can inform better decisions. 01:55:41.660 |
- Wisdom is like spoken cave paintings, basically. 01:55:46.500 |
about what does it mean to have a proper family structure, 01:55:50.580 |
whether it's a nuclear family of four or five 01:56:04.340 |
Now we're at this point, in this intertidal moment, 01:56:16.820 |
And your question was, well, why do they wanna do that? 01:57:00.660 |
and I look at their bookshelf, 15-year-old twin daughters, 01:57:09.140 |
All the futures they know are the "Hunger Games," 01:57:22.060 |
People are gonna be attracted to reading about those things. 01:57:28.980 |
there's always a love interest in a teenage thing, 01:57:30.740 |
but it's always, the backdrop is always dystopia. 01:57:42.340 |
- Dystopian stories can act as an early warning system. 01:57:45.380 |
If you keep doing this one thing that you're doing, 01:57:47.940 |
and extrapolate out a few decades, it'll look like this. 01:57:54.100 |
What we're missing are the stories about what if we get it right, 01:58:03.740 |
Dystopia we talked about is a terrible, terrible world. 01:58:06.500 |
A protopia, this idea put forth by Kevin Kelly, 01:58:22.900 |
In tomorrows where not everything is perfect, 01:58:27.340 |
It won't be perfect, there'll still be divorces, 01:58:31.100 |
But if we start backdropping our future visions 01:58:34.820 |
in worlds that are better than they are today, 01:58:49.440 |
Because I think a lot of people listening to this 01:58:56.340 |
that the shift from the notion of building a better future 01:59:03.260 |
rather you can make it almost like pro-self-and-others 01:59:15.520 |
contraction and dilation of your time window, 01:59:20.520 |
but you take care of the future generations as well. 01:59:23.460 |
Like for that empty frame, the now empty frame. 01:59:33.140 |
"Okay, well, at best I could do that for myself 01:59:39.540 |
It's gonna be hard to do that as a greater good 01:59:44.420 |
"Well, that does contribute to the greater good." 02:00:01.460 |
And they get the sense and you already have the sense 02:00:04.040 |
'cause you have the experience to know, like, 02:00:16.580 |
And you're gonna spend the next five years of your life 02:00:19.540 |
Maybe three, but probably five years of your life. 02:00:23.860 |
like, do you ditch that project and go for something else? 02:00:28.900 |
you get to put your brick on the wall, but it's a brick. 02:00:46.500 |
But, you know, what we're saying here is, you know, 02:00:56.580 |
And by the way, I love the protocols that you offer, 02:00:58.540 |
the empty frame, the journaling to future self, 02:01:03.180 |
projecting your present thinking into the future, the aging of self. 02:01:09.180 |
I plan to do them and I think they're very valuable. 02:01:14.660 |
you are interested in creating a movement of sorts 02:01:18.620 |
where many, if not everybody, is thinking this way. 02:01:24.500 |
"Okay, well, the Elons will take care of it for us." 02:01:43.860 |
how does one, you know, join up with other people 02:02:05.480 |
And I mean that in a good way, not in a selfish way. 02:02:12.240 |
I'm advocating for how do we optimize society? 02:02:21.480 |
unlike when we think of scale being, you know, 02:02:26.880 |
this is really a one plus one plus one plus one at infinity. 02:02:42.700 |
within their closest sphere, and you go out, right? 02:02:46.520 |
So right now, your listeners have the potential 02:02:54.720 |
they're thinking about their purpose in the world 02:02:57.080 |
as nested within the larger purpose of our species 02:03:01.280 |
to allow for more mass flourishing in the future 02:03:07.520 |
If you think about your listeners and how they interact 02:03:09.920 |
and how they model behavior, and their spheres, 02:03:18.540 |
And what we know about social and emotional contagion 02:03:33.200 |
where you're not going to, you know, just add powder 02:03:37.520 |
and it all of a sudden will create this optimal future 02:03:43.180 |
for everyone because only one person does it. 02:03:51.160 |
when they're done doing it to take a few minutes 02:03:54.040 |
and think about what kind of futures do I want 02:03:56.800 |
for myself, for my family, for the generations to come? 02:04:12.240 |
or maybe they've been thinking about it for years. 02:04:14.760 |
Even in their smallest interactions, they start doing it. 02:04:19.680 |
the Santa Fe Institute and complexity theory. 02:04:26.600 |
we don't need a march for long-termism, right? 02:04:29.400 |
We don't need bumper stickers. 02:04:32.580 |
There will be no bumper stickers on my 4Runner. 02:04:40.280 |
and our actions within the realm of possibility 02:04:59.040 |
that you are modeling a way of being in the world 02:05:07.960 |
who knows if anyone will listen to your podcast. 02:05:11.160 |
'cause I'm sure it's probably already happened, 02:05:16.520 |
something, you know, what we call AI right now, 02:05:56.760 |
and how to become the best of or the worst of ourselves. 02:06:02.360 |
is going to become what these machines think of 02:06:09.580 |
And going back to the higher education example 02:06:13.860 |
like many institutions, as AI is what we call that, 02:06:18.720 |
fully comes online is going to radically, radically change. 02:06:22.400 |
And it will be a Cambridge or an Oxford tutor 02:06:31.820 |
to receive information will start to dissipate 02:06:39.080 |
is not just the intellectual and the cognitive, 02:06:42.260 |
but also the psychological and the emotional core 02:06:49.740 |
You know, there was a guest on this podcast previously, 02:06:59.060 |
I think now she's the Dean of Arts and Sciences, 02:07:10.900 |
First, to kind of, to help students manage the stress 02:07:20.800 |
which is to try and teach emotional development, 02:07:32.260 |
And I love the way that you're describing this. 02:07:39.640 |
it's a lens into human experience that's very dynamic, 02:07:54.080 |
There will be parts of your day, no doubt today, 02:07:59.160 |
And then the ability to dilate your consciousness 02:08:05.220 |
and to solve for things that are more long-term, 02:08:12.680 |
how can we incentivize people to be good, to do good? 02:08:16.600 |
And how can we incentivize people to do this on a backdrop 02:08:20.720 |
of a lot of short-term carrots and short-term horizons? 02:08:33.160 |
I'm journaling into the future, writing to future self, 02:08:49.600 |
Meaning, is there anything that we should all be doing? 02:09:03.940 |
encourage people to do to be the best version of themselves 02:09:16.720 |
why is it we do and are the way that we are, right? 02:09:21.440 |
Do you know why in this country we vote on Tuesday? 02:09:25.560 |
- So most advanced democracies vote over the weekend 02:09:49.800 |
because then people can still watch Monday night football. 02:09:53.000 |
- No, this is long before Monday night football. 02:10:03.280 |
and have become as individuals and as a society. 02:10:05.680 |
I'm a big fan of cognitive behavioral therapy, of CBT. 02:10:09.960 |
I think partially because what it does is it has this, 02:10:15.660 |
because you can't just say stop doing something, 02:10:20.800 |
What I've tried to do with some of our time here today 02:10:25.000 |
and what I want people to partially take away, 02:10:28.400 |
more than partially to really take away and bring in, 02:10:34.240 |
What are those stories that you've inherited? 02:10:40.840 |
like you are defined by society by what you own, 02:10:40.840 |
by the badge on your car that says how successful you are. 02:10:47.720 |
That's a story, it's a story that's been fed to us. 02:10:50.440 |
There are other stories that are very personal. 02:10:52.740 |
These are stories that can sometimes be very private 02:10:57.880 |
And then to understand some of those stories serve us, 02:11:12.480 |
Isn't going to be answered by a religion or a God 02:11:25.080 |
about how it is you got here, what really matters, 02:11:29.480 |
and where you want to contribute and help move us forward 02:11:37.300 |
not as a passenger, but as crew on this vessel 02:11:37.300 |
"I'm going to write these stories that serve me. 02:12:04.600 |
And those stories may be very intra-personal, 02:12:09.000 |
they may be interpersonal, they may be political, 02:12:11.500 |
they may be business, they may be what you buy, 02:12:13.520 |
what you consume, but you have to have agency. 02:12:17.000 |
You have to instill a sense of hope into your own life 02:12:19.880 |
and a sense of awe and a sense of really just empathy 02:12:26.040 |
if we want to collectively move forward into the futures 02:12:30.840 |
that will allow our descendants to look back on us 02:12:37.160 |
And I also just want to highlight the importance 02:12:40.320 |
of record-keeping, of putting things down on paper 02:12:48.680 |
creating time capsules for the future generations. 02:12:51.720 |
Because I think a lot of what people probably are thinking 02:12:57.200 |
"Okay, I can do all this stuff to try and make things better 02:12:59.440 |
and even give up the desire for any kind of credit," 02:13:02.440 |
but not feeling like it will be of any significance. 02:13:07.080 |
But what I've learned from you today is that, 02:13:09.920 |
it starts with the self and then it radiates out 02:13:12.760 |
to the people we know and that maybe we cohabitate with. 02:13:17.760 |
But even if we don't cohabitate with anybody, 02:13:21.200 |
it radiates out from us and that it is important 02:13:26.640 |
so that people can feel like they have some significance 02:13:35.620 |
but to really like send those ripples forward 02:13:37.640 |
and get the sense that those ripples are moving forward. 02:13:40.360 |
So for that reason, and especially given the nature 02:13:45.360 |
of this podcast, for the reason that you gave 02:13:50.960 |
that we've highlighted in the timestamps, of course, 02:13:55.360 |
as tools, as protocols, I really want to thank you 02:13:59.080 |
because oftentimes discussions about past, present, 02:14:01.960 |
and future can get a bit abstract and a bit vague 02:14:05.700 |
for people and you've done us all a great service 02:14:11.160 |
That's so much of what this podcast is about. 02:14:12.920 |
It's one part information, one part option for action. 02:14:19.220 |
I'm certainly going to adopt some of these protocols. 02:14:22.320 |
And also for taking the time to come to talk with us today, 02:14:26.320 |
share your wisdom and share what you're doing in many ways. 02:14:32.480 |
it is absolutely part of what you're describing, 02:14:38.580 |
toward how things can be better now and in the future. 02:14:59.040 |
- Thank you for joining me for today's discussion 02:15:02.200 |
To find links to his book, to his television show, 02:15:09.080 |
If you're learning from and or enjoying this podcast, 02:15:13.360 |
That's a terrific zero cost way to support us. 02:15:24.840 |
at the beginning and throughout today's episode. 02:15:29.640 |
If you have questions for me or comments about the podcast 02:15:32.280 |
or guests or topics that you'd like me to consider 02:15:35.640 |
please put those in the comment section on YouTube. 02:15:43.600 |
It's entitled "Protocols, An Operating Manual for the Human Body." 02:15:43.600 |
And it covers protocols for everything from sleep, 02:16:01.360 |
And of course, I provide the scientific substantiation 02:16:06.800 |
The book is now available by presale at protocolsbook.com. 02:16:15.800 |
"Protocols, An Operating Manual for the Human Body." 02:16:19.180 |
If you're not already following me on social media, 02:16:21.160 |
I am Huberman Lab on all social media platforms. 02:16:24.140 |
So that's Instagram, X, formerly known as Twitter, 02:16:35.240 |
but much of which is distinct from the content 02:16:38.320 |
Again, that's Huberman Lab on all social media channels. 02:16:47.760 |
that includes podcast summaries as well as protocols 02:17:00.000 |
deliberate cold exposure, deliberate heat exposure. 02:17:11.300 |
and all of which, again, is completely zero cost. 02:17:16.300 |
go to the Menu tab up in the upper right corner, 02:17:18.940 |
scroll down to Newsletter, and provide your email. 02:17:21.880 |
that we do not share your email with anybody.