
Ari Wallach: Create Your Ideal Future Using Science-Based Protocols


Chapters

0:00 Ari Wallach
1:58 Sponsors: David, Helix Sleep & ROKA
6:13 Mental Time Travel; Technology & Present
15:46 Technology; Tools: Transgenerational Empathy; Bettering Today
22:00 Tool: Empathy for Others
26:09 Empathy for Future Generations, Emotion & Logic
31:48 Tool: Emotion to Guide Action
36:50 Sponsor: AG1
38:02 Tools: Perfect Day Exercise; Cathedral Thinking, Awe & Future Generations
43:52 Egoic Legacy, Modeling Behavior
51:13 Social Media, Time Capsule, Storytelling
60:06 Sponsor: LMNT
61:18 Short-Term Thinking; Life Purpose, Science & Religion
69:23 Longpath, Telos, Time Perception
75:19 Tools: Photo Frames; Behavior & Legacy; Life in Weeks
83:02 Tool: Visualizing Future You
90:17 Death, Western Society
96:20 Tool: Writing Letter to Future Self
101:01 Society, Future Harmony
107:03 Traditional Institutions, Family, Future Consciousness; “Protopia”
118:48 Tool: Behavior & Modeling for the Future
128:11 Tool: “Why Tuesdays?”, Examining Self
134:58 Zero-Cost Support, YouTube, Spotify & Apple Follow & Reviews, Sponsors, YouTube Feedback, Protocols Book, Social Media, Neural Network Newsletter


00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.240 | where we discuss science
00:00:03.680 | and science-based tools for everyday life.
00:00:05.900 | I'm Andrew Huberman,
00:00:10.400 | and I'm a Professor of Neurobiology and Ophthalmology
00:00:13.560 | at Stanford School of Medicine.
00:00:15.560 | My guest today is Ari Wallach.
00:00:17.760 | Ari Wallach is an Adjunct Associate Professor
00:00:20.200 | at Columbia University School
00:00:21.880 | of International and Public Affairs.
00:00:24.040 | He is also the host of a new TV series,
00:00:26.640 | A Brief History of the Future.
00:00:28.620 | Today's discussion focuses
00:00:30.240 | on perhaps one of the most important questions
00:00:32.560 | that any and all of us have to ask ourselves at some point,
00:00:36.240 | which is how is it that we are preparing this planet
00:00:39.380 | for the future?
00:00:40.840 | Not just for our children,
00:00:42.080 | if we happen to have children or want children,
00:00:44.400 | but for all people.
00:00:46.440 | The human brain, as we know,
00:00:47.840 | is capable of orienting its thoughts and its memories
00:00:50.960 | to the past, to the present, or to the future.
00:00:54.540 | But few people actually take the time
00:00:56.800 | to think about the future that they are creating
00:00:58.940 | on this planet and in culture, within our families, et cetera
00:01:02.540 | for the next generation and generations that follow them.
00:01:06.020 | Ari Wallach is an expert in this topic,
00:01:08.300 | and he has centered his work
00:01:09.620 | around what he calls Longpath Labs,
00:01:11.980 | which is a focus on long-term thinking
00:01:13.840 | and coordinated behavior at the individual,
00:01:16.740 | organizational, and societal level
00:01:19.180 | in order to best ensure the thriving of our species.
00:01:22.520 | And while that may sound a bit aspirational,
00:01:25.060 | it is both aspirational and grounded
00:01:28.120 | in specific actions and logic.
00:01:30.060 | So during today's episode, Ari Wallach spells out for us,
00:01:33.260 | not just the aspirations, not just what we want,
00:01:35.860 | but how to actually create that positive future and legacy
00:01:39.540 | for ourselves, for our families, and for society at large.
00:01:43.380 | It's an extremely interesting take on how to live now
00:01:46.300 | in a way that is positively building toward the future.
00:01:49.100 | So by the end of today's episode,
00:01:50.500 | you will have a unique perspective on how your brain works,
00:01:53.940 | how you frame time perception,
00:01:55.660 | and indeed how you frame your entire life.
00:01:58.620 | Before you begin, I'd like to emphasize
00:02:00.420 | that this podcast is separate from my teaching
00:02:02.340 | and research roles at Stanford.
00:02:03.940 | It is, however, part of my desire and effort
00:02:06.100 | to bring zero-cost-to-consumer information
00:02:08.020 | about science and science-related tools
00:02:10.020 | to the general public.
00:02:11.340 | In keeping with that theme,
00:02:12.460 | I'd like to thank the sponsors of today's podcast.
00:02:15.100 | Our first sponsor is David.
00:02:16.940 | David makes a protein bar unlike any other.
00:02:19.620 | It has 28 grams of protein,
00:02:21.580 | only 150 calories, and zero grams of sugar.
00:02:25.100 | That's right, 28 grams of protein,
00:02:26.900 | and 75% of its calories come from protein.
00:02:30.060 | This is 50% higher than the next closest protein bar.
00:02:33.620 | These bars from David also taste incredible.
00:02:36.180 | My favorite bar is the cake-flavored one,
00:02:38.740 | but then again, I also like the chocolate-flavored one,
00:02:40.700 | and I like the berry-flavored one.
00:02:41.980 | Basically, I like all the flavors.
00:02:43.540 | They're all incredibly delicious.
00:02:45.300 | Now, for me personally,
00:02:46.380 | I try to get most of my calories from whole foods.
00:02:49.020 | However, when I'm in a rush, or I'm away from home,
00:02:51.860 | or I'm just looking for a quick afternoon snack,
00:02:54.060 | I often find that I'm looking
00:02:55.140 | for a high-quality protein source.
00:02:57.020 | And with David, I'm able to get 28 grams
00:02:59.340 | of high-quality protein with the calories of a snack,
00:03:02.140 | which makes it very easy to hit my protein goals
00:03:04.180 | of one gram of protein per pound of body weight,
00:03:06.860 | and it allows me to do so
00:03:08.180 | without taking on an excess of calories.
00:03:10.300 | As I mentioned before, they are incredibly delicious.
00:03:12.380 | In fact, they're surprisingly delicious.
00:03:14.060 | Even the consistency is great.
00:03:15.420 | It's more like a cookie consistency,
00:03:17.180 | kind of a chewy cookie consistency,
00:03:18.940 | which is unlike other bars,
00:03:20.020 | which I tend to kind of saturate on.
00:03:21.580 | I was never a big fan of bars until I discovered David bars.
00:03:24.980 | If you give them a try, you'll know what I mean.
00:03:27.300 | So if you'd like to try David,
00:03:28.580 | you can go to davidprotein.com/huberman.
00:03:31.660 | Again, the link is davidprotein.com/huberman.
00:03:35.820 | Today's episode is also brought to us by Helix Sleep.
00:03:38.980 | Helix Sleep makes mattresses and pillows
00:03:40.820 | that are customized to your unique sleep needs.
00:03:43.740 | I've spoken many times before on this and other podcasts
00:03:46.140 | about the fact that getting a great night's sleep
00:03:48.540 | is the foundation of mental health,
00:03:50.100 | physical health, and performance.
00:03:51.620 | Now, the mattress we sleep on makes an enormous difference
00:03:54.020 | in terms of the quality of sleep that we get each night.
00:03:56.500 | We need a mattress that is matched
00:03:58.300 | to our unique sleep needs,
00:04:00.060 | one that is neither too soft nor too hard for you,
00:04:02.660 | one that breathes well
00:04:04.060 | and that won't be too warm or too cold for you.
00:04:06.940 | If you go to the Helix website,
00:04:08.420 | you can take a brief two-minute quiz
00:04:09.940 | and it asks you questions such as,
00:04:11.620 | do you sleep on your back, your side, or your stomach?
00:04:13.380 | Do you tend to run hot or cold during the night?
00:04:15.460 | Things of that sort.
00:04:16.780 | Maybe you know the answers to those questions,
00:04:18.640 | maybe you don't.
00:04:19.480 | Either way, Helix will match you
00:04:20.860 | to the ideal mattress for you.
00:04:22.480 | For me, that turned out to be the Dusk mattress, D-U-S-K.
00:04:25.560 | I've been sleeping on a Dusk mattress for,
00:04:27.540 | gosh, now more than four years.
00:04:29.740 | And the sleep that I've been getting
00:04:31.040 | is absolutely phenomenal.
00:04:32.880 | So if you'd like to try Helix,
00:04:34.200 | you can go to helixsleep.com/huberman.
00:04:37.340 | Take that two-minute sleep quiz
00:04:38.780 | and Helix will match you to a mattress
00:04:40.540 | that's customized for your unique sleep needs.
00:04:43.220 | Right now, Helix is giving up to 25% off
00:04:45.940 | all mattress orders.
00:04:47.060 | Again, that's helixsleep.com/huberman
00:04:50.200 | to get up to 25% off.
00:04:52.240 | Today's episode is also brought to us by Roka.
00:04:55.220 | Roka makes eyeglasses and sunglasses
00:04:57.340 | that are of the absolute highest quality.
00:04:59.740 | I've spent a lifetime working on the biology
00:05:01.460 | of the visual system,
00:05:02.420 | and I can tell you that your visual system
00:05:03.860 | has to contend with an enormous number
00:05:05.580 | of different challenges in order for you
00:05:07.340 | to be able to see clearly from moment to moment.
00:05:09.940 | Roka understands all of that
00:05:11.700 | and has designed all of their eyeglasses and sunglasses
00:05:14.220 | with the biology of the visual system in mind.
00:05:16.860 | Roka eyeglasses and sunglasses
00:05:18.340 | were first designed for use in sport,
00:05:20.020 | in particular for things like running and cycling,
00:05:22.300 | and as a consequence,
00:05:23.380 | Roka frames are extremely lightweight,
00:05:25.380 | so much so that most of the time
00:05:26.460 | you don't even remember that you're wearing them,
00:05:28.220 | and they're also designed so that they don't slip off
00:05:30.380 | even if you get sweaty.
00:05:31.900 | Now, even though Roka eyeglasses and sunglasses
00:05:33.860 | were initially designed for sport,
00:05:35.620 | they now have many different frames and styles,
00:05:37.560 | all of which can be used not just for sport,
00:05:39.620 | but also for wearing out to dinner,
00:05:41.260 | to work, essentially any time and any setting.
00:05:44.060 | I wear Roka readers at night,
00:05:45.540 | or Roka eyeglasses if I'm driving at night,
00:05:48.060 | and I wear Roka sunglasses in the middle of the day
00:05:50.340 | anytime it's too bright for me to see clearly.
00:05:52.180 | My eyes are somewhat sensitive, so I need that.
00:05:54.460 | I particularly like the Hunter 2.0 frames,
00:05:57.300 | which I have as eyeglasses and now as sunglasses too.
00:06:00.760 | If you'd like to try Roka,
00:06:01.940 | you can go to roka.com/huberman
00:06:04.660 | to get 20% off your purchase.
00:06:06.280 | Again, that's roka.com/huberman to get 20% off.
00:06:10.300 | And now for my discussion with Ari Wallach.
00:06:13.420 | Ari Wallach, welcome.
00:06:15.180 | - Andrew Huberman, thank you for having me.
00:06:17.300 | - You and I go way back,
00:06:19.060 | and I think that's a good way to frame today's conversation,
00:06:21.600 | not by talking about our history by any stretch,
00:06:25.480 | but because really what I want to understand
00:06:28.980 | is about time and time perception.
00:06:32.140 | So without going into a long dialogue,
00:06:34.060 | the human brain is capable of this amazing thing,
00:06:38.640 | of being able to think about the past,
00:06:40.080 | the present, or the future,
00:06:41.660 | or some combination of the three.
00:06:43.320 | If other animals and insects do that,
00:06:46.980 | I wouldn't be surprised, but we do that.
00:06:49.460 | And we do it pretty well,
00:06:50.420 | provided all our mental faculties are intact.
00:06:53.180 | One of the key aspects to brain function, however,
00:06:57.900 | is to use that ability to try and set goals, reach goals,
00:07:02.900 | and that's a neurochemical process.
00:07:05.360 | And I would say these days, more than ever,
00:07:09.020 | we operate on short timeframe reward schedules,
00:07:14.020 | meaning we want something,
00:07:18.100 | we generally have ways of getting it pretty quickly,
00:07:20.900 | or at least the information
00:07:22.000 | about how we might get it pretty quickly,
00:07:25.180 | and we either get it or we don't.
00:07:26.940 | And of course, it involves dopamine
00:07:28.200 | and a bunch of other things as well.
00:07:30.000 | A lot of your work is focused on linking our perception
00:07:34.600 | of what we're doing in the present
00:07:36.760 | with knowledge about the past,
00:07:38.560 | and trying to project our current decision-making
00:07:43.620 | into the future to try and create a better future.
00:07:46.880 | And that's some pretty heavy mental gymnastics,
00:07:50.760 | especially when many, perhaps most,
00:07:53.320 | but certainly many, many people worldwide
00:07:56.580 | are just trying to get through their day
00:07:59.620 | without feeling overly anxious,
00:08:01.100 | without letting their health get out of control,
00:08:04.240 | or I should say their illness get out of control,
00:08:07.020 | and on and on.
00:08:08.120 | So to kick the ball out,
00:08:10.760 | I've got this long-winded question,
00:08:13.440 | and it is indeed a question,
00:08:15.040 | which is how do we navigate this conundrum?
00:08:20.040 | Like if we really care about the future,
00:08:22.280 | what do we want to do?
00:08:25.960 | Where do we want to place our mental frame,
00:08:28.080 | and how do we start going about doing that?
00:08:31.880 | - It's a great question, or a great series of questions.
00:08:34.680 | One of the things that Homo sapiens do extremely well
00:08:38.040 | is what we call mental time travel.
00:08:39.840 | We're able to actually take ourselves
00:08:43.160 | in the current moment and project out.
00:08:45.440 | In fact, Marty Seligman,
00:08:47.760 | kind of the father of positive psychology,
00:08:51.000 | put forth this idea in this great book
00:08:52.600 | called "Homo Prospectus,"
00:08:53.800 | that what separates us out from almost every other species,
00:08:56.800 | as far as we know, the ones we can talk to, mostly us,
00:08:59.240 | is that we do two things extremely well.
00:09:01.660 | We can do mental time travel towards the future, right?
00:09:04.540 | We can think about different possible outcomes,
00:09:07.280 | different possible scenarios,
00:09:09.200 | and we can collaborate to make the ones
00:09:12.080 | that we want to see manifest, manifest.
00:09:14.720 | Now, that involves language,
00:09:16.560 | that involves social interaction,
00:09:17.980 | a whole bunch of other things.
00:09:19.400 | But at the end of the day,
00:09:21.400 | what we do extremely well,
00:09:24.320 | as far as we know, we're the only ones who do it,
00:09:25.960 | and I think this is part of the reasons
00:09:27.280 | why we're so good at what we do
00:09:28.800 | as a dominant species on this planet,
00:09:30.720 | is to project out into futures that we want.
00:09:33.340 | We know where this comes from, mostly.
00:09:36.180 | It's coming from the hippocampus, right?
00:09:38.500 | Which, one thing about the hippocampus that's amazing
00:09:40.860 | is that it's almost atemporal.
00:09:43.060 | It doesn't actually have a timestamp.
00:09:44.900 | And so what it does is it takes snapshots
00:09:47.580 | of episodic memories that have happened in the past,
00:09:50.620 | reassembles them so that we can mentally time travel
00:09:54.480 | and then figure out these different future scenarios
00:09:56.340 | of what might happen.
00:09:57.180 | So if we take Ari and Andy, 150,000 years ago.
00:10:01.040 | - He calls me Andy, folks, but it's Andrew.
00:10:03.400 | No, it's okay.
00:10:04.240 | Just stick with Andy, but-
00:10:05.480 | - I'm gonna stick with Andy.
00:10:06.320 | - I'm giving you permission
00:10:07.140 | for at least the duration of this episode.
00:10:08.520 | - For the duration of this episode.
00:10:09.960 | So, Andrew, no, Andy.
00:10:12.880 | Look, here's the thing.
00:10:14.480 | If Ari and Andy are out on the Serengeti,
00:10:17.360 | 150,000 years ago, right?
00:10:19.480 | Homo sapiens, about 200,000 years ago,
00:10:21.200 | about 150,000 years ago,
00:10:22.560 | we're kind of starting to spread
00:10:23.600 | out of the Rift Valley into Africa.
00:10:25.680 | And we're now at a point where we're no longer singular,
00:10:30.100 | but we're within a kind of a small tribal structure.
00:10:33.440 | We wanna start hunting larger and larger game.
00:10:35.800 | We're no longer reactive.
00:10:37.060 | So if we wanna go after that game,
00:10:39.020 | it's not a foregone conclusion
00:10:41.140 | that when we go after something,
00:10:43.180 | it's gonna do what we want it to do.
00:10:44.240 | We have to start thinking about different scenarios.
00:10:46.340 | So that first kind of mental time travel
00:10:48.840 | is really coming from our desire for more protein to exist
00:10:53.740 | and to grow the group and really to feed
00:10:55.960 | the super energy-intensive thing called the human brain.
00:10:59.200 | That's where mental time travel starts.
00:11:00.760 | And the hippocampus takes different memories
00:11:02.560 | of different ways we've hunted
00:11:04.200 | and been successful in the past or not successful
00:11:06.960 | and starts to put together scenarios.
00:11:09.480 | Now, fast forward.
00:11:11.420 | So that's a very long time ago.
00:11:13.440 | You take us through the Middle East,
00:11:17.880 | into Europe, into Asia, 20,000 years ago,
00:11:22.040 | our ancestors crossed Beringia,
00:11:23.940 | which is now the Bering Strait,
00:11:25.900 | and we're in North America.
00:11:27.500 | And fast forward to right now, on my way in here,
00:11:30.300 | I get a notification on my phone, ding.
00:11:32.300 | And I immediately pick up the phone to see,
00:11:35.460 | and you've covered this before,
00:11:37.020 | what's that new information?
00:11:38.860 | What is it that I have to react to?
00:11:40.860 | So we're working on 200,000- to 300,000-year-old hardware.
00:11:44.780 | At the same time, we have a cultural substrate that is,
00:11:49.740 | for lack of better words,
00:11:51.140 | has hacked into that older part of us
00:11:54.020 | to make us, A, want that immediate gratification,
00:11:56.920 | and B, force us to now react in a way
00:12:01.920 | where that mental time travel
00:12:04.580 | has closed that temporal horizon.
00:12:07.500 | We're now training ourselves
00:12:08.740 | no longer to think about the far future,
00:12:10.780 | but to actually think about the immediate present.
00:12:13.420 | And I don't mean present in a Buddhist way.
00:12:16.060 | I mean presentism as in a hall of mirrors.
00:12:18.580 | There is no past, there is no future.
00:12:20.220 | There's only this moment.
00:12:21.700 | And so it's becoming extremely difficult
00:12:23.740 | for us as individuals, as societies, as civilization,
00:12:26.740 | to think about the long-term
00:12:28.740 | in the way that you and I may have done 150,000 years ago,
00:12:32.100 | because winter was coming.
00:12:34.480 | And we would start thinking,
00:12:35.320 | where are we gonna move our family and our tribe
00:12:37.060 | or our clan, and we would go to warmer climates.
00:12:39.860 | We don't even do that anymore, right?
00:12:41.180 | We're so in this moment
00:12:43.380 | that it's becoming extremely difficult
00:12:44.840 | for us to break out of this presentist moment.
00:12:47.580 | - I really appreciate your answer for a couple of reasons.
00:12:50.420 | Through the '90s and early 2000s,
00:12:54.260 | and maybe even until 2020,
00:12:56.980 | there was a growing movement within science,
00:13:01.500 | but also outside of science,
00:13:03.060 | towards encouraging people to be mindful,
00:13:07.180 | this whole notion of being present, right?
00:13:09.620 | But what you're describing is actually too much
00:13:12.060 | being present, what you're calling presentism.
00:13:15.840 | And of course, it depends on what's happening
00:13:17.660 | in the present.
00:13:19.100 | But in the '80s, in the '90s, in the 2000s,
00:13:23.860 | up to about 2020,
00:13:25.340 | 'cause of course we're still in the 2000s,
00:13:28.180 | there was this notion of future tripping.
00:13:30.180 | Like people are future tripping,
00:13:31.340 | they're spending too much time worrying about the future,
00:13:33.160 | too much time worrying about the future.
00:13:35.140 | I feel like the horizon on our cognition
00:13:39.320 | has really come closer in now.
00:13:41.660 | And as you said, we're in this sort of hall of mirrors
00:13:44.300 | where it's constant stimulus and response.
00:13:46.960 | And I don't want today's discussion to be doom and gloom,
00:13:48.720 | we're going to talk about solutions.
00:13:50.280 | But I think between what you're saying
00:13:53.360 | and what Jonathan Haidt, who is on this podcast,
00:13:55.520 | author of "The Anxious Generation,"
00:13:56.800 | "The Coddling of the American Mind,"
00:13:57.800 | professor at NYU, et cetera, has said,
00:13:59.620 | I'm starting to really believe that,
00:14:03.600 | yes, the human brain can focus on past, present,
00:14:07.880 | or future, or some combination,
00:14:09.840 | but that something about the architecture
00:14:12.040 | of our technologies and our human interactions,
00:14:14.460 | 'cause those are so closely interwoven,
00:14:16.220 | that's taking place now,
00:14:18.020 | has us really locked in the present in stimulus response.
00:14:22.060 | And I'm going to just briefly reference
00:14:24.920 | a previous episode of the podcast I did.
00:14:26.500 | It's one of my favorite conversations ever,
00:14:28.900 | on or off microphone, which was, excuse me,
00:14:32.220 | with Dr. James Hollis,
00:14:33.460 | an 84-year-old Jungian psychoanalyst,
00:14:35.820 | where he had many important messages there,
00:14:37.820 | but one of them was, we need,
00:14:40.720 | we absolutely need to take five to 10 minutes each day
00:14:44.120 | to exit stimulus response mode,
00:14:46.280 | typically by closing one's eyes and just looking inward.
00:14:48.520 | It doesn't even have to be called meditation,
00:14:50.760 | in order to understand what our greater wishes are,
00:14:54.600 | how to link our current thinking and behavior
00:14:56.960 | to the future and to the past.
00:14:59.000 | And I think he's qualified to say this
00:15:02.040 | because he's an analyst,
00:15:03.440 | that that process actually is a reflection
00:15:07.360 | of the unconscious mind.
00:15:08.460 | So to link these concepts in a more coherent way,
00:15:12.260 | is it possible that we are just overwhelmed
00:15:15.040 | with notifications,
00:15:16.500 | either the traditional type of notifications on your phone,
00:15:20.600 | but that we're basically just living in stimulus response
00:15:23.060 | all the time now?
00:15:24.460 | And if so, what direction is that taking ourselves
00:15:28.860 | as individuals, as families, as communities,
00:15:32.300 | and as a species?
00:15:35.140 | I'm basically validating what you just said,
00:15:36.640 | even though you don't need my validation.
00:15:38.060 | And just asking like,
00:15:39.100 | how bad is it to just be focused on managing the day-to-day?
00:15:42.740 | Or maybe that's a better way to go about life.
00:15:45.560 | - You need to manage the day-to-day.
00:15:47.380 | There are people like me who are full-time futurists.
00:15:50.020 | We tend to be very anxious,
00:15:51.700 | 'cause what we tend to do is think more in the future
00:15:53.900 | and aren't as present as we should be.
00:15:55.420 | That being said,
00:15:57.060 | if 90% of your day is going about your day,
00:16:02.660 | dealing with what's right in front of you, that's great.
00:16:04.740 | What I'm advocating for is
00:16:06.900 | what I call kind of transgenerational empathy.
00:16:11.140 | It's a mouthful.
00:16:12.580 | So we know empathy, you've had guests on that.
00:16:15.340 | Transgenerational empathy first and foremost
00:16:18.220 | starts with empathy and compassion for yourself.
00:16:21.860 | Then we move into empathy for those who came before,
00:16:25.940 | which then allows us to build empathy for the future,
00:16:28.460 | future Ari, future Andy, but then future generations.
00:16:32.540 | And we can get into how to do that.
00:16:34.500 | - Yeah, maybe we could just parse each of those one by one.
00:16:36.740 | So how do you define empathy for self?
00:16:39.780 | - So empathy for yourself is,
00:16:41.420 | in many ways it's almost self-compassion.
00:16:43.460 | It's recognizing you're doing the best you can
00:16:45.940 | with what you have.
00:16:47.460 | Part of the issue is we surround ourselves,
00:16:51.620 | and I'm guilty of this, of images and quotes and books
00:16:56.620 | of how to live your best life, how to be amazing.
00:17:00.140 | And anything below that metric of perfection,
00:17:04.360 | you start to feel terrible.
00:17:05.580 | And you start to kind of ruminate over what you,
00:17:08.460 | you lie in bed at night and you think,
00:17:09.780 | how could I have done that?
00:17:10.620 | How could I have done that?
00:17:11.500 | And you forget that you're only able to handle
00:17:16.500 | what you can at that time.
00:17:19.580 | And you can't hold yourself up to this idealized yardstick.
00:17:22.300 | Look, I dealt with this for a long time.
00:17:24.740 | We learned my father had stage four cancer.
00:17:26.780 | I was 18 years old.
00:17:28.040 | And from when he learned to when he passed away
00:17:32.100 | was only four months.
00:17:33.340 | - Four months.
00:17:34.180 | - Four months.
00:17:35.020 | And for a lot of that time, I was kind of in denial, right?
00:17:40.020 | Like I wasn't actually there with him
00:17:43.300 | as much as I should have been.
00:17:45.200 | In fact, and we're not gonna, we won't go into this.
00:17:47.980 | I was actually with you that summer.
00:17:49.380 | We were working together that summer at a summer camp.
00:17:52.780 | Now for years, I beat myself up.
00:17:55.120 | How could I have done that?
00:17:56.140 | I should have been home with him.
00:17:57.400 | It was only gonna be four months.
00:17:59.060 | And then I realized, and this is a self-compassion,
00:18:01.660 | like 18 year old Ari was only at a place emotionally
00:18:06.420 | and psychologically to be able to do what I did.
00:18:10.020 | And it wasn't the older 30 or 40 year old Ari
00:18:12.740 | of now being like, of having these regrets.
00:18:14.620 | So empathy for yourself really, really centers.
00:18:18.300 | It doesn't mean you let yourself off the hook.
00:18:20.100 | It doesn't mean you can go willy nilly
00:18:21.380 | and treat people terribly.
00:18:23.740 | It means you recognize that who you were even yesterday
00:18:28.700 | is in many ways different than who you are today
00:18:30.740 | and what you've learned.
00:18:32.780 | So transgenerational empathy has to start with yourself.
00:18:36.140 | It has to start with being able to look in the mirror
00:18:39.260 | and say, I'm not perfect.
00:18:41.680 | I was born into this world, into a family,
00:18:46.420 | into my birth family or family that you choose.
00:18:51.420 | And they were born into something.
00:18:53.920 | And you work with what you have,
00:18:55.220 | but you have to start there because so many times
00:18:57.820 | I work with people and I talk to people and they say,
00:19:00.140 | I wanna have empathy for the past and for the future,
00:19:03.420 | but they don't have it for themselves.
00:19:04.380 | So if you don't start there,
00:19:06.780 | it becomes very, very difficult to spread out
00:19:10.620 | first obviously going backwards.
00:19:12.100 | And then ultimately the goal of my work
00:19:15.020 | is to get you to spread that out into the future.
00:19:17.580 | - I love this concept of empathy for self
00:19:19.660 | because I've heard it before in other contexts,
00:19:21.500 | but I haven't heard it operationalized
00:19:23.700 | the way that you describe it.
00:19:25.220 | I think, yeah, there's two phrases that come to mind.
00:19:30.220 | There's a book called "A Fighter's Heart" by Sam Sheridan.
00:19:35.400 | And it's a pretty interesting account
00:19:38.500 | of all the different forms of martial arts and fighting.
00:19:41.260 | And there's an interesting part of the book
00:19:43.620 | where he says, you can't have your 20th birthday
00:19:46.500 | until you're 19, which is a big giant duh,
00:19:50.460 | but it's actually a pretty profound statement.
00:19:52.260 | And by the way, he went to Harvard.
00:19:53.340 | He's a smart kid.
00:19:54.300 | His father was in the SEAL teams.
00:19:56.180 | He has an interesting lineage in his own right.
00:19:58.540 | And I think at Harvard,
00:19:59.580 | he claims he just painted and smoked cigarettes.
00:20:02.420 | So, you know, it's a bit of an iconoclast.
00:20:07.180 | In any case, I think that statement,
00:20:10.580 | you can't have your 20th birthday until you're 19
00:20:12.400 | is something that we forget
00:20:14.260 | because of the immense amount of attention
00:20:16.100 | that we pay to trying to be like others
00:20:19.160 | and satisfy external metrics.
00:20:21.180 | And so I like to think he was in agreement with you
00:20:24.200 | if I may.
00:20:25.040 | The other thing that happened to me recently
00:20:28.300 | that comes to mind is that I, like many people,
00:20:31.700 | peruse Instagram.
00:20:33.260 | I teach on Instagram, et cetera.
00:20:35.140 | And there are a lot of these quote accounts
00:20:37.540 | or like life inspiration accounts.
00:20:39.440 | And I would argue that the half-life
00:20:40.960 | of any one of those posts is pretty short,
00:20:44.080 | but some are pretty interesting.
00:20:46.060 | And there's a guy, I'll put it in the show note captions.
00:20:48.580 | I don't remember off the top of my head.
00:20:50.720 | Not a huge account, not a small account.
00:20:53.300 | I think he lives in Austin.
00:20:55.100 | And he goes through this long discourse
00:20:58.860 | about the challenges of the human mind
00:21:01.500 | for a lot of the reasons that we're talking about,
00:21:03.020 | its ability to flip from past to present to future, et cetera.
00:21:06.320 | But then he says, it basically distills down
00:21:09.380 | to one actionable step per day or per morning,
00:21:12.980 | which is at some point,
00:21:16.460 | if you want to grow and be more functional,
00:21:18.600 | you have to ask yourself,
00:21:20.540 | what am I going to do today to make my day better?
00:21:25.540 | Not to be better than I was yesterday, right?
00:21:29.020 | Which is also a fine statement,
00:21:30.440 | but that one never really resonated for me
00:21:32.020 | because like yesterday could have been an amazing day.
00:21:34.340 | You might not be as good as yesterday, right?
00:21:36.180 | Every day is kind of its own unique unit.
00:21:38.420 | And our biology really does function
00:21:39.940 | on these circadian biology units of 24 hours.
00:21:43.420 | There's no negotiating that.
00:21:44.940 | So I like this concept of what can I do today
00:21:48.180 | to make my life and hopefully the lives of others better?
00:21:51.680 | Because it implies a verb, an action step,
00:21:54.260 | and it's really focused on the unit of the day,
00:21:56.060 | which is really what we've got.
00:21:57.620 | So that resonated.
00:21:59.700 | So according to your definition,
00:22:01.100 | empathy for self starts with understanding
00:22:03.300 | that we're always doing the best we can with what we've got,
00:22:05.840 | but that there's a striving kind of woven into that statement
00:22:09.060 | that there is a need for striving.
00:22:11.500 | At what point do we start to develop empathy for others?
00:22:14.040 | And what does that look like?
00:22:15.300 | Like is empathy for somebody else
00:22:18.180 | feeling what they feel?
00:22:19.340 | I mean, that's the kind of traditional definition.
00:22:20.940 | - Yeah, I mean, look,
00:22:21.780 | we start off with kind of cognitive intellectual empathy,
00:22:24.180 | right?
00:22:25.020 | So you can kind of think it.
00:22:26.900 | But where you really want to be able to be
00:22:28.900 | is at a place where their feelings
00:22:33.900 | are feelings that you can feel,
00:22:36.180 | and you want to bring, if they're feeling bad,
00:22:38.740 | you want to bring some resolution to that.
00:22:40.500 | If they're feeling good, you can be there with them.
00:22:43.060 | At a fundamental level, this is mirror neurons.
00:22:46.340 | And I'm connecting with you and you're connecting with me,
00:22:48.980 | and there's a genetic adaptive fitness for that, right?
00:22:52.060 | We all want to kind of be in sync
00:22:53.700 | because the tribe that works together
00:22:55.740 | flourishes together and thrives together.
00:22:57.380 | So it makes sense at that level.
00:22:59.020 | But when I'm feeling empathy for another,
00:23:04.380 | their state of being can be as important
00:23:07.900 | as my own state of being.
00:23:09.940 | It can be, look, it can be taxing, don't get me wrong,
00:23:12.700 | but ultimately that is what self-compassion can give you
00:23:16.100 | because it can give you a state of being
00:23:18.820 | where those around you,
00:23:20.060 | you are no longer fundamentally disconnected.
00:23:22.420 | And I think one of the great errors
00:23:25.660 | of where we have taken this civilization
00:23:27.620 | over the past several decades, if not centuries,
00:23:29.860 | is disconnection, disconnection from ourselves,
00:23:32.820 | disconnection from each other,
00:23:34.420 | and disconnection from nature and the planet.
00:23:37.460 | So anything we can do to further that connection
00:23:40.380 | is going to benefit us today in the current moment.
00:23:43.740 | - I agree completely.
00:23:45.540 | If we were to break that down
00:23:47.220 | into the requirements for empathy and connection,
00:23:52.700 | one, it seems like presence.
00:23:54.860 | Like we need to be present.
00:23:55.980 | Like if we're going to appreciate a fern,
00:23:59.220 | a beautiful fern, or a dog, or a significant other,
00:24:02.900 | or another human being that we happen to encounter,
00:24:05.700 | we have to be present.
00:24:08.380 | If we're going to have empathy,
00:24:09.820 | our mind can't be someplace else.
00:24:11.620 | - Can't be wandering.
00:24:12.460 | - Right, can't be in the past, can't be in the future,
00:24:14.380 | or we're not going to be able
00:24:16.020 | to really touch into the details of the experience.
00:24:19.660 | So that seems like requirement number one.
00:24:22.220 | The second is that we need to be able
00:24:25.100 | to leave whatever kind of pressures are on us
00:24:29.820 | to tend to other things, right?
00:24:32.060 | Like every neural circuit we know has a push and a pull.
00:24:34.580 | Like in order to get A, you need to suppress B.
00:24:36.660 | And this is the way neural circuits work generally.
00:24:39.380 | You know, flexors and extensors in the muscles
00:24:41.180 | are a good analogy for, which by the way,
00:24:45.340 | like if you're going to flex your bicep,
00:24:47.340 | your tricep is essentially relaxing and vice versa,
00:24:50.940 | in so many words.
00:24:53.300 | The PTs are going to dive all over me for that one.
00:24:55.140 | But that's sort of how neural circuits in the brain work.
00:24:57.580 | We can actually see all around us
00:24:59.820 | by virtue of neurons that respond
00:25:01.620 | to either increments or decrements in light,
00:25:03.420 | and their difference is actually what allows us
00:25:05.620 | to see boundaries, borders, visually.
00:25:07.980 | So we need to suppress like our thoughts
00:25:11.340 | about where we need to be that day
00:25:12.540 | or other things that are going on for us.
00:25:14.340 | And then we need to be able to return to our own,
00:25:16.940 | you know, self-attention in order to be functional.
00:25:20.180 | And I think that, I think this is where the challenge is
00:25:24.020 | and where the next question arises,
00:25:25.780 | which is on the one hand,
00:25:27.700 | I could imagine that, okay,
00:25:31.660 | we've got so many pressures upon us every day, all day,
00:25:35.340 | that it's getting much harder to be present,
00:25:37.980 | to be empathic and to build this idealized future
00:25:40.980 | or better future.
00:25:43.540 | But on the other hand, I hear you and other people saying,
00:25:46.420 | well, things are so much better than they were
00:25:48.500 | even 50 years ago in terms of health outcomes,
00:25:51.300 | believe it or not, in terms of, you know,
00:25:54.180 | the status of people having shelter, et cetera.
00:25:57.220 | And this is a shock to a lot of people.
00:25:58.880 | They're like, wait a second,
00:25:59.720 | I didn't see homeless people on the street
00:26:00.980 | when I was a kid and now I do.
00:26:02.660 | Well, they were, people suffering were elsewhere.
00:26:06.380 | You didn't, perhaps didn't see them.
00:26:08.260 | So there are a couple of levels of question here,
00:26:11.460 | but the first one is perhaps, are we much better off,
00:26:15.540 | but we are worse off in the sense that
00:26:20.340 | there's so much incoming that we miss the fact
00:26:23.300 | that we're better off?
00:26:24.740 | Like, you know, is it like notifications
00:26:26.420 | preventing us from seeing that we actually have so much
00:26:28.940 | that we're, you know, 100 times better off than we were
00:26:33.300 | as a species 50 years ago?
00:26:34.940 | 'Cause I feel like a lot of the debates that I see online
00:26:37.780 | about climate change, about health, about longevity,
00:26:40.700 | it's like, it's overwhelming because I feel like
00:26:42.420 | people aren't agreeing on the first principles.
00:26:45.300 | So let's start with this, are human beings better off
00:26:49.460 | in terms of health and longevity than we were?
00:26:52.860 | Let's go short scale, 50 years ago.
00:26:55.660 | So look, in aggregate,
00:26:57.700 | because we can find peaks and valleys, right,
00:26:59.860 | when we zoom in, if we pull back,
00:27:02.020 | there's no better time to be alive as a Homo sapien
00:27:04.980 | on planet Earth than right now.
00:27:06.540 | Now, someone's gonna argue right now
00:27:08.100 | and they're gonna say, no, no, no, no.
00:27:08.940 | - I mean, according to what metrics, like happiness?
00:27:11.100 | - Health, infant mortality,
00:27:13.340 | even as we backslide in this country, being a woman,
00:27:16.740 | education, kind of the calories that we get,
00:27:20.940 | across the, look, if you and I go outside
00:27:23.740 | and you stepped on a rusty nail 100 years ago,
00:27:26.300 | good chance you would die.
00:27:27.700 | Right now, we just go to the drugstore
00:27:30.220 | and put something on it, or we even know
00:27:31.940 | that we don't even have to put anything on it,
00:27:33.020 | we can just put it underneath high pressure water
00:27:35.340 | for 30 seconds and that'll clean out
00:27:37.300 | because we now know germ theory, right?
00:27:39.100 | So net net, this is the best time to be alive.
00:27:42.900 | All the markers, you can go to Gapminder if you want,
00:27:45.580 | and you can see that we are doing better,
00:27:48.820 | we are progressing.
00:27:50.580 | The issue is that we are now at an inflection point
00:27:55.580 | because the things that we do or do not do
00:27:58.740 | across the major issues of our day
00:28:01.140 | and how we deal with them, climate change,
00:28:03.820 | artificial intelligence, synthetic biology,
00:28:06.720 | what we do or do not do will dictate
00:28:11.980 | not only the next several years and several decades,
00:28:13.940 | potentially the next several centuries.
00:28:15.820 | So you've hit it, we're being bombarded by information.
00:28:19.780 | Most of the information we're attracted to
00:28:21.900 | is the negative negativity bias.
00:28:24.240 | You and I on this, we're gonna go back
00:28:25.740 | to R and Andy 150,000 years ago.
00:28:28.180 | If we saw this beautiful tree, aesthetically,
00:28:32.060 | and we saw maybe a tree over here that was on fire,
00:28:35.340 | you and I would zoom in on the tree on fire
00:28:37.260 | and focus on the negative
00:28:38.660 | 'cause negative things hurt and kill us.
00:28:41.120 | That being said, if you and I run a major media company,
00:28:44.720 | you and I both know that the more negative stories
00:28:48.080 | that we put out, the more hits we're gonna get.
00:28:49.900 | - Not this media company. - Not this media company.
00:28:52.700 | - I'm not kidding. - But all the other ones.
00:28:54.420 | Well, I would argue some of your success
00:28:57.220 | comes from the fact that you don't wallow
00:28:59.220 | in the negativity and there's a real thirst and a hunger
00:29:01.940 | and desire to learn more about who we are
00:29:04.340 | and how we can make ourselves better.
00:29:06.800 | But that negativity bias is still part of us, right?
00:29:09.660 | I think one of the issues that we have to confront
00:29:14.120 | as a society is that there are parts of us,
00:29:17.260 | the prefrontal cortex parts of us that are amazing,
00:29:20.180 | that build microphones, that have conversations,
00:29:22.160 | that stream across the internet.
00:29:23.540 | And then there are parts of us,
00:29:24.980 | this is Jonathan's elephant in the rider,
00:29:27.580 | there are parts of us that happen below the surface
00:29:30.100 | that have hundreds of thousands,
00:29:31.560 | if not millions of years of legacy.
00:29:33.540 | And we often wanna either be up here and say,
00:29:36.220 | "Oh, we're so smart, we're so great,"
00:29:37.820 | or we wanna wallow in the kind of the death and despair
00:29:40.540 | and the horrific things that we can do to one another.
00:29:43.020 | You know, my personal past on my father's side
00:29:46.260 | is I think some of the darkest moments
00:29:47.740 | in Homo sapien behavior, and that was not that long ago.
00:29:51.540 | So if we wanna move into a place that allows us to ask
00:29:57.380 | what I think is the fundamental question of our time,
00:30:00.860 | which is how do we become the great ancestors
00:30:03.660 | the future needs us to be?
00:30:05.360 | We need to find a way to both tap into the elephant
00:30:09.460 | in the rider, which you'll do a better job of me
00:30:11.660 | in explaining than I will.
00:30:12.500 | - No, I love this idea.
00:30:13.540 | I mean, we could map it to neural circuits,
00:30:15.180 | but I love this idea of high-level concepts
00:30:17.460 | and then neural circuits that are very,
00:30:20.300 | what Dr. Paul Conti who was on this podcast,
00:30:22.820 | psychiatrist, brilliant psychiatrist,
00:30:24.340 | said the limbic system, the emotional system
00:30:28.660 | doesn't know or care about the clock or the calendar.
00:30:31.860 | It just elicits feeling.
00:30:34.340 | It doesn't care about whether or not that feeling
00:30:35.780 | is relevant to the past, the present, or the future.
00:30:38.260 | It just has a job, which is just to bring out
00:30:40.900 | a particular feeling.
00:30:42.020 | - You're jumping ahead a little bit, but that's okay.
00:30:43.740 | Because what you're jumping into is when we ask
00:30:47.700 | and we want to have an empathic connection,
00:30:50.940 | we wanna have empathy with future generations.
00:30:53.820 | We don't want it to just be cognitive.
00:30:55.700 | We don't want it just to be intellectual.
00:30:57.460 | We actually want it to be emotional.
00:30:59.820 | So if I ask someone, what do you want the future
00:31:02.660 | to be like for your great grandkids in the 2080s?
00:31:06.140 | And they give me a list of kind of bullet points,
00:31:08.580 | but they're usually externalized bullet points.
00:31:10.340 | - Shelter, healthcare.
00:31:12.060 | - Yeah, and then I follow up.
00:31:14.140 | And we've done this in other people
00:31:15.420 | much smarter than you have done this in studies.
00:31:16.700 | We say Jakob Trope at NYU is the one who taught me this.
00:31:19.500 | How do you want them to feel?
00:31:22.100 | That's different, right?
00:31:24.180 | This is Damasio's,
00:31:25.300 | this is somatic markers hypothesis theory, right?
00:31:29.420 | Where if you really want something to happen,
00:31:33.180 | it's not just about visualizing it.
00:31:35.740 | It's about visualizing it and connecting it
00:31:37.820 | to the emotional amygdala sense of what that is
00:31:40.700 | to actually move towards the actions
00:31:42.460 | and changing behaviors that you want.
00:31:44.300 | Madison Avenue understands this.
00:31:45.660 | Marketing understands this.
00:31:47.660 | - But the general public tends not to,
00:31:49.300 | sorry, I keep interrupting you,
00:31:50.260 | but also it was the kids say, sorry, not sorry,
00:31:52.740 | in the sense that I wanna make sure
00:31:54.020 | that I highlight something.
00:31:55.940 | Martha Beck is somebody who I think has done
00:31:58.180 | some really brilliant work creating practices
00:32:00.900 | where when one is not feeling what they want to feel,
00:32:04.220 | there's this kind of question,
00:32:07.100 | like, are you supposed to feel your feelings?
00:32:08.500 | Are you supposed to create new feelings in place of them,
00:32:10.380 | especially if they're unpleasant?
00:32:11.460 | And it's like, there's no clear answer to that
00:32:13.180 | because it's complicated, infinite number of variables.
00:32:17.580 | But she does have this interesting practice whereby,
00:32:20.620 | it's a bit like a meditation
00:32:22.060 | where if you're struggling with something,
00:32:25.100 | like maybe you're struggling with boredom
00:32:26.420 | or not knowing where to go with your life,
00:32:28.020 | or you're not happy,
00:32:29.300 | or you just feel some underlying anxiety,
00:32:31.700 | to think back to a time when you felt particularly blank,
00:32:36.620 | like a time when you felt particularly empowered
00:32:38.780 | or particularly curious, it can be very specific,
00:32:42.260 | particularly amused because,
00:32:45.580 | and the idea is that in anchoring to the emotion state first,
00:32:50.140 | you call to mind a bunch of potential action steps.
00:32:54.580 | And the reason I like this approach
00:32:57.380 | is that that is at least one way
00:32:59.940 | that "the brain works,"
00:33:01.660 | which is that the emotion states
00:33:03.500 | are linked to a bunch of action step possibilities,
00:33:06.100 | kind of like a magic library
00:33:07.820 | where if you go into the room called sadness,
00:33:11.780 | there are a bunch of action steps associated with it
00:33:13.540 | that go beyond crying.
00:33:14.620 | It's like curling up in the fetal position, et cetera.
00:33:16.300 | You go into the room that's called excitement,
00:33:19.900 | and there's all this idea about getting in vehicles
00:33:22.340 | and going places and things of that sort.
00:33:24.260 | So what you're talking about is,
00:33:27.260 | I believe, thinking about the emotional states of others,
00:33:33.460 | and then from there,
00:33:36.020 | I think this is where you're gonna go,
00:33:37.980 | cultivating some action steps that you can take
00:33:40.260 | to ensure that that future generation
00:33:43.540 | can access those emotions.
00:33:45.300 | - Yes, but with a slight correction,
00:33:47.060 | because it's not about thinking
00:33:48.700 | about their future emotional states,
00:33:50.300 | it's actually feeling them.
00:33:52.260 | - I see.
00:33:53.100 | So it's not saying, "I want my kids to be happy.
00:33:55.500 | I want them to have no trauma."
00:33:58.160 | It's feeling what it would be to be happy, no trauma.
00:34:03.160 | - Yes.
00:34:05.100 | - Right, okay.
00:34:05.940 | So that becomes an anchor, right?
00:34:09.300 | She's 100% correct.
00:34:10.420 | What it does is, but it places it, it's like a kedge anchor.
00:34:12.700 | So if you and I were sailors, which we're not,
00:34:15.660 | there's a thing called a kedge anchor.
00:34:17.260 | And a kedge anchor is this anchor that you throw
00:34:19.580 | 30, 40 meters off to the side, it hits the bottom,
00:34:22.660 | and you use the rope to pull yourself there.
00:34:25.140 | Emotions will pull us towards those futures.
00:34:29.740 | It will alter the behaviors.
00:34:31.620 | So time and time again, when we intellectualize
00:34:34.700 | and we become overly cognitive in terms of futures
00:34:38.380 | that we wanna see happen for ourselves, future Ari,
00:34:41.860 | or future Wallach family, or future society,
00:34:44.480 | or future global planetary civilization,
00:34:47.620 | if we think about it, that's one thing.
00:34:49.260 | But to actually execute on those goals,
00:34:51.460 | we have to actually connect the emotional state
00:34:54.340 | that we wanna be in to drive that function.
00:34:57.020 | Remember, look, this is one of the things
00:34:58.300 | that Marty Seligman says, that Freud got it wrong.
00:35:00.660 | Freud felt, as Marty says, that emotions
00:35:05.060 | were these things that happened in the past
00:35:07.020 | that we would use to dwell on,
00:35:08.380 | and that was neuroses and anxiety and depression.
00:35:10.340 | No, no, no, no.
00:35:11.180 | Emotions are there to help us make better decisions
00:35:15.460 | for the future.
00:35:16.700 | We are future-oriented mammals and species.
00:35:21.000 | So what emotions do, it's not meant to be like,
00:35:23.420 | oh, you know, I had this terrible breakup,
00:35:26.140 | I feel so terrible, and then I'm gonna go to my therapist,
00:35:30.300 | or talk about all that stuff that happened in the past.
00:35:32.300 | That's one way of looking at it.
00:35:33.260 | The other way is, your body is telling you
00:35:35.820 | in a very, very visceral way,
00:35:38.380 | whatever you just did that had you in that situation,
00:35:41.880 | don't do it again.
00:35:43.300 | Because if you do, you're gonna feel a certain way.
00:35:44.540 | You know, they did this study where they,
00:35:46.940 | at a college campus, they found people
00:35:49.180 | who had just been in a kind of a quasi-long-term relationship
00:35:52.500 | that had gone through a breakup.
00:35:53.780 | - Quasi-long-term.
00:35:54.980 | - Six months, six months.
00:35:56.700 | - What I've learned in life
00:35:57.540 | is it's important to define the relationship.
00:35:59.180 | - About six months.
00:36:00.620 | And people who had gone through the breakup,
00:36:04.420 | they gave one group a placebo,
00:36:07.100 | and another group actually just got acetaminophen,
00:36:09.140 | got Tylenol.
00:36:10.380 | And the group that got the acetaminophen
00:36:12.340 | actually felt better.
00:36:15.020 | Because we actually feel emotions.
00:36:17.820 | We actually feel pain.
00:36:19.340 | Some of the same circuits are being tripped.
00:36:21.900 | And so that says to me
00:36:24.540 | that emotions are there to guide future action.
00:36:27.340 | So if we can have pro-social emotions,
00:36:29.900 | awe and empathy and compassion,
00:36:33.180 | and this one we call love,
00:36:35.140 | as what we're connected to the future generations
00:36:38.100 | that we wanna see, how we wanna see them flourish,
00:36:40.140 | we are much more likely to see that happen
00:36:42.260 | than if we just have a vision
00:36:45.100 | of what tomorrow will look like
00:36:46.460 | at an intellectual kind of two-dimensional level.
00:36:50.020 | I'd like to take a quick break
00:36:51.260 | and acknowledge our sponsor, AG1.
00:36:53.940 | By now, many of you have heard me say
00:36:55.500 | that if I could take just one supplement,
00:36:57.300 | that supplement would be AG1.
00:36:59.380 | The reason for that is AG1 is the highest quality
00:37:01.900 | and most complete
00:37:02.820 | of the foundational nutritional supplements available.
00:37:05.540 | What that means is that it contains
00:37:06.980 | not just vitamins and minerals,
00:37:08.340 | but also probiotics, prebiotics, and adaptogens
00:37:11.740 | to cover any gaps you may have in your diet
00:37:14.180 | and provide support for a demanding life.
00:37:16.460 | For me, even if I eat mostly whole foods
00:37:18.300 | and minimally processed foods,
00:37:19.500 | which I do for most of my food intake,
00:37:21.460 | it's very difficult for me to get enough fruits
00:37:23.420 | and vegetables, vitamins and minerals,
00:37:25.260 | micronutrients, and adaptogens from food alone.
00:37:28.340 | For that reason, I've been taking AG1 daily since 2012,
00:37:31.740 | and often twice a day, once in the morning or mid-morning,
00:37:34.420 | and again in the afternoon or evening.
00:37:36.440 | When I do that, it clearly bolsters my energy,
00:37:38.860 | my immune system, and my gut microbiome.
00:37:41.440 | These are all critical to brain function,
00:37:43.260 | mood, physical performance, and much more.
00:37:45.860 | If you'd like to try AG1,
00:37:47.300 | you can go to drinkag1.com/huberman
00:37:50.440 | to claim their special offer.
00:37:52.020 | Right now, they're giving away five free travel packs
00:37:54.140 | plus a year supply of vitamin D3K2.
00:37:56.980 | Again, that's drinkag1.com/huberman
00:38:00.600 | to claim that special offer.
00:38:02.660 | - I really like this because it gets to so many themes
00:38:05.660 | that have been discussed on this podcast previously
00:38:08.220 | and that exist in the neuroscience literature of,
00:38:11.700 | like, yes, emotions don't know the clock or the calendar.
00:38:16.300 | And that sounds like a bad thing.
00:38:17.740 | And oftentimes it's discussed as a bad thing.
00:38:19.500 | Like, oh, when you're feeling stressed,
00:38:21.300 | you're not able to access the parts of your brain
00:38:23.980 | that can make better decisions.
00:38:25.300 | We know that's true,
00:38:26.340 | except in light of what's immediately pressing.
00:38:29.100 | I mean, I would say that stress in the short term
00:38:32.340 | makes us much better thinkers and movers
00:38:34.860 | for sake of survival.
00:38:35.980 | In the long-term, it's problematic.
00:38:39.740 | But the way that you're describing emotions
00:38:44.220 | as a kedge anchor, is that what it's called?
00:38:47.040 | Kedge with a K? - Yep.
00:38:48.460 | - Yeah, kedge anchor, interesting.
00:38:51.380 | As a kedge anchor, to pull us forward,
00:38:54.140 | also leverages the fact that emotions don't know
00:38:56.460 | about the clock or the calendar.
00:38:58.060 | And that the order of operations here
00:39:01.500 | seems to be emotions first,
00:39:05.180 | then action steps born out of those emotions,
00:39:08.600 | and then future state hopefully arrived at,
00:39:11.560 | if it's set along the right path.
00:39:14.300 | I like that a lot.
00:39:17.140 | And again, it maps to some of the work
00:39:20.980 | that has largely existed, at least to my knowledge,
00:39:23.700 | in popular psychology,
00:39:25.780 | or whatever you wanna call it, self-help.
00:39:27.580 | Again, I'm a big Martha Beck fan,
00:39:29.100 | in part because of an exercise that she's included in,
00:39:33.520 | I think several, if not all her books,
00:39:35.280 | of this perfect day exercise.
00:39:37.980 | Have you ever done this exercise?
00:39:38.820 | - No. - It's a very interesting exercise.
00:39:40.260 | You first sit with your eyes closed
00:39:44.420 | and you imagine like really terrible stuff,
00:39:46.940 | and you experience it in your body,
00:39:48.940 | and you experience it in your mind,
00:39:50.140 | and you just pay attention to how it feels,
00:39:51.660 | and it sucks, it doesn't feel good.
00:39:54.420 | Most people don't have too much trouble doing that exercise.
00:39:57.160 | Then you shift over,
00:39:58.660 | I think you're supposed to take a little break
00:39:59.880 | or maybe move around a little bit,
00:40:01.580 | and then you do a perfect day exercise where no rules.
00:40:06.260 | You lie down or sit down, close your eyes,
00:40:08.380 | and you can imagine your day includes anything you want.
00:40:12.440 | You can be anywhere you want.
00:40:14.060 | The room can morph from one country to the next,
00:40:17.460 | it doesn't matter.
00:40:18.820 | And you also experience the sensations in your body.
00:40:23.700 | And in that second exercise, it's remarkable,
00:40:27.660 | I've done it several times now,
00:40:29.180 | there are little seeds of things kind of pop out
00:40:33.140 | where you go, "Oh, I didn't realize
00:40:34.740 | "that would be part of my perfect day."
00:40:36.360 | And they're not outside the bounds of reality.
00:40:40.000 | And those are things that then you write down,
00:40:43.300 | and that at least in my life have all borne out.
00:40:48.260 | So this is something, an exercise you do routinely.
00:40:50.700 | And when I first heard about this, I was like,
00:40:52.100 | "Okay, this seems like weird self-hypnosis,
00:40:55.460 | "self-help-y, woo stuff, I'm not, come on."
00:40:59.100 | I'm like, at that time, I'm like,
00:41:00.760 | "I'm a neuroscience professor, you got to be kidding me."
00:41:03.900 | And it's a remarkable exercise.
00:41:06.560 | And the reason I bring it up now in discussion with you
00:41:10.100 | is I think you and Martha arrived at a similar place
00:41:14.480 | or a similar avenue, but in your case,
00:41:16.740 | you're talking about specifically toward building a future
00:41:19.060 | that's not necessarily for you to live in,
00:41:21.820 | but for someone else to live in.
00:41:23.980 | - Oh, look, the core of my philosophy
00:41:28.940 | is in a story that I heard a very long time ago.
00:41:33.460 | It comes from the Talmud.
00:41:34.480 | That being said, this story exists in many cultures.
00:41:37.260 | And so there's a man named Honi walking,
00:41:42.020 | and he comes across a much older man
00:41:43.500 | who's planting a carob tree.
00:41:45.500 | And he says to the older man,
00:41:47.260 | "Why are you planting a carob tree?
00:41:49.660 | "How long will it be until this carob tree bears fruit
00:41:52.500 | "or even has shade?"
00:41:53.700 | And he goes, "Oh, it'll be at least 40 years."
00:41:55.180 | And he goes, "Well, why plant it?
00:41:57.180 | "You won't be around for that."
00:41:59.140 | And the old man says, "When I was young,
00:42:02.180 | "I played in the shade of a carob tree.
00:42:03.960 | "I ate from the carob tree.
00:42:05.980 | "So it's my job to plant this carob tree now."
00:42:09.600 | This is how societies move forward.
00:42:13.140 | This is how we become great,
00:42:15.980 | is by planting carob trees whose shade we will never know.
00:42:20.820 | And look, I can give you a bunch of examples.
00:42:22.740 | The Panama Canal, right?
00:42:24.160 | That was a great, you know.
00:42:25.300 | But another way that we think about this,
00:42:27.200 | we call this cathedral thinking.
00:42:28.660 | So now, when we're in California,
00:42:31.780 | they'll put up a home in three or four days.
00:42:33.940 | But back in the day,
00:42:34.940 | it took a really long time to build great things.
00:42:37.300 | So you go back two, 300 years ago or even further,
00:42:42.020 | and oftentimes, the architect and the original stonemason
00:42:45.260 | who would plant the keystone
00:42:46.980 | would not be alive to see this cathedral
00:42:49.700 | or mosque fully built.
00:42:51.600 | That's cathedral thinking.
00:42:52.740 | It's doing things whose fruits you will not be around
00:42:57.740 | to take advantage of, to reap,
00:43:00.600 | and to have as part of your life.
00:43:03.140 | - And I love it.
00:43:04.780 | And I love the notion of cathedral thinking,
00:43:07.140 | just the visual there, or mosque thinking.
00:43:10.580 | I went to the Blue Mosque years ago.
00:43:13.000 | - In Istanbul? - Yeah.
00:43:13.840 | - Yeah.
00:43:14.660 | - Like, I mean, I've seen some amazing architecture.
00:43:16.300 | I love architecture.
00:43:17.420 | And I was like, "Okay, like, it'll be a beautiful building."
00:43:19.580 | And I was like, "Whoa."
00:43:22.500 | - But that whoa that you felt is what we call awe.
00:43:26.780 | - Yeah.
00:43:27.620 | - And that sense of awe at what they built
00:43:30.960 | is what I am advocating for us to build in the world today,
00:43:35.380 | is so that when our descendants look back and they say,
00:43:38.900 | "What did Ari, what did Andy do?"
00:43:41.860 | They have awe.
00:43:42.700 | It's not because we necessarily built cathedrals.
00:43:44.500 | It's because we took actions,
00:43:46.420 | both very small and very large,
00:43:49.200 | to ensure that they would flourish,
00:43:50.620 | that they would have those carob trees.
00:43:52.140 | - And I think what I realize is that
00:43:56.420 | I don't know who built the Blue Mosque specifically.
00:44:01.000 | I don't know who the architect was.
00:44:02.600 | I should, you know?
00:44:03.620 | And even, you know, earlier this year, we were in Sydney.
00:44:06.860 | I went to the Sydney Opera House.
00:44:08.060 | We did a live there.
00:44:08.900 | It's a beautiful building.
00:44:09.840 | I learned it had been built
00:44:10.680 | over a very long period of time.
00:44:11.680 | I can tell you that the architect was Danish,
00:44:14.260 | but I can't remember his name.
00:44:15.600 | So part of what we're talking about here
00:44:17.780 | is giving up our need for attribution.
00:44:22.660 | - Yep.
00:44:23.500 | - Giving up our need for credit.
00:44:25.620 | And gosh, this is the opposite of social media, right?
00:44:30.620 | Social media, it's all about getting credit, you know?
00:44:34.360 | And yet in science where people care a lot about credit
00:44:37.240 | while they're alive,
00:44:38.860 | and my scientist colleagues hate this,
00:44:41.420 | but they know it deeply too.
00:44:42.860 | - It's also a business model of academic science right now.
00:44:45.100 | - Right, which is that with the exception of Einstein
00:44:49.460 | and a few others,
00:44:52.660 | most people will not be associated
00:44:54.700 | with their incredible discoveries,
00:44:56.060 | even the textbook discoveries 20 years out.
00:44:58.860 | And I know this 'cause my dad's a scientist
00:45:00.660 | and I know a lot about the scientists that were ahead of him.
00:45:03.680 | And he taught me this early on.
00:45:05.580 | He just said, you know, with rare exception,
00:45:07.900 | you know, the discoveries are not,
00:45:09.600 | you know, no one's going to say,
00:45:11.480 | oh, that's the discovery of so-and-so.
00:45:13.840 | They'll talk about the discovery, and people will build on it.
00:45:16.280 | So you're part of a process
00:45:17.620 | for which you won't get credit in the long run.
00:45:19.900 | You will get credit in the short run.
00:45:21.520 | And that brings me around to perhaps a point
00:45:23.600 | that's more relevant to everybody, not just scientists,
00:45:25.760 | which is that we are all trained to work
00:45:29.860 | on these short-term contingencies, reward schedules,
00:45:33.600 | where, you know, we achieve something, we get credit.
00:45:35.940 | You get an A, you get a B, you get a trophy.
00:45:38.300 | And we just came from the Olympic track
00:45:39.340 | and field trials in Oregon.
00:45:41.060 | It's like, you know, podium, you know, bronze, silver, gold.
00:45:45.140 | And so, yes, you're part of a larger legacy.
00:45:49.220 | You're building toward a larger legacy
00:45:50.660 | in the examples that you give.
00:45:52.140 | But part of it is understanding
00:45:53.940 | that you're not gonna get credit.
00:45:56.580 | You're not gonna have your name huge
00:45:58.140 | on the side of the building.
00:45:59.300 | I mean, I don't wanna give too many examples,
00:46:00.780 | but I work at a university
00:46:02.260 | for which there's an endowment the size of a country.
00:46:05.660 | Right?
00:46:06.500 | We're very blessed to have that endowment.
00:46:08.300 | The buildings have names on the side of them.
00:46:10.500 | The reason they have names on the side of them
00:46:12.380 | is because people gave money,
00:46:14.740 | typically gave money to the university
00:46:17.220 | to have their name on the side of a building
00:46:18.980 | to be immortalized.
00:46:20.100 | What's interesting for many reasons,
00:46:22.760 | both sociopolitical, but also other reasons,
00:46:25.140 | those names change over time.
00:46:26.460 | So if people knew that if they gave half their wealth
00:46:31.900 | and their name might be scraped off a building
00:46:33.660 | in 200 years, they might feel differently about it.
00:46:35.900 | So short-term contingencies are important.
00:46:38.020 | Then again, we call it Rockefeller Plaza.
00:46:40.340 | - Yep. - Right?
00:46:41.500 | It's Lincoln Center named after a Lincoln.
00:46:43.980 | - Yeah, probably, yeah.
00:46:44.800 | I'm not sure it is. - You're the New Yorker.
00:46:46.860 | - Yeah. - And so on and so forth.
00:46:48.860 | So like if people,
00:46:51.220 | how do we get the everyday person,
00:46:56.140 | and I consider myself an everyday person,
00:46:57.540 | how do we get ourselves working on short-term contingencies
00:47:00.700 | for a future that we can visualize as better
00:47:04.660 | for the next generation and let go of our need for credit?
00:47:08.340 | - Great series of points and questions brought up.
00:47:10.340 | So part of what you're talking about is egoic legacy, right?
00:47:13.340 | So you mentioned a building.
00:47:15.460 | It could be at any building at any major university.
00:47:17.700 | The name is put there on marble.
00:47:19.340 | You said 200 years. - You went to Berkeley.
00:47:21.340 | - I went to Berkeley. - You went to a bunch
00:47:22.420 | of places. - Yeah.
00:47:23.260 | - But you bounced around, folks.
00:47:25.180 | Proof that you can bounce around and still be successful,
00:47:27.820 | but maybe you should eventually finish.
00:47:29.460 | We'll talk about that later.
00:47:30.300 | But Sproul Plaza. - Yes.
00:47:32.420 | - Sproul Plaza, seed of the free speech movement,
00:47:35.180 | although now you could argue not so free speech movement.
00:47:38.020 | That's my, I said that.
00:47:39.380 | Yes, I said that.
00:47:40.260 | Sproul Plaza, like I can't tell you who Sproul was.
00:47:42.860 | Do you know who Sproul was? - No.
00:47:43.940 | - Exactly.
00:47:44.940 | I can tell you the Arches.
00:47:45.940 | I can tell you that it was a free speech movement.
00:47:47.620 | I can tell you that I saw certain bands play there.
00:47:49.900 | I can tell you that it's supposed to be a place
00:47:52.060 | where you can say anything and be exempt from
00:47:55.500 | being put in jail, basically anything.
00:47:59.900 | Maybe that's still true, but I don't think it is.
00:48:02.500 | But I can't tell you who Sproul is.
00:48:03.980 | - The question of legacy is very important.
00:48:05.580 | So Sproul Plaza, let's say 250 years from now,
00:48:10.580 | that name will probably, it may or may not be there,
00:48:13.780 | the plaza, but the name will,
00:48:15.780 | maybe it was renamed by someone else.
00:48:17.900 | So for titans of industry that can put down
00:48:21.140 | several million dollars and put their name
00:48:22.940 | on the side of a building, that's one form of legacy.
00:48:26.060 | That is not the every person.
00:48:29.540 | That being said, if, you know, I have three children.
00:48:33.500 | So let's say they continue on at 2.2 children or whatever,
00:48:37.900 | you know, my descendants.
00:48:39.020 | In 250 years, Sproul Plaza may or may not
00:48:42.940 | still be called that, but in 250 years,
00:48:46.620 | I will have roughly 50,000 descendants.
00:48:50.540 | - That's a scary thought. - For my wife.
00:48:51.660 | I know. - This is an exciting thought.
00:48:52.940 | - It's an exciting and scary thought.
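As a rough illustration of how a figure like that compounds, here is a minimal back-of-envelope sketch; the 25-year generation length and the children-per-descendant rates are illustrative assumptions, not numbers from the conversation:

```python
# Back-of-envelope compounding of descendants across generations.
# All rates are illustrative assumptions, not figures from the episode.

def descendants_after(years, children_per_person=3.0,
                      years_per_generation=25, starting_children=3):
    """Rough size of the youngest living generation of descendants after `years`."""
    generations = max(1, years // years_per_generation)
    return round(starting_children * children_per_person ** (generations - 1))

print(descendants_after(250))                           # ~59,000 at 3 children per descendant
print(descendants_after(250, children_per_person=2.2))  # ~3,600 at 2.2 children per descendant
```

Whether 250 years lands near 50,000 descendants or a few thousand turns entirely on the assumed birth rate and generation length; the point of the figure is the compounding, not the precise count.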
00:48:54.340 | So what is going to impact the future?
00:48:57.300 | And by the way, if you want to keep giving money
00:48:58.700 | to put your name on the side of buildings, please do so.
00:49:00.500 | - Oh yeah, no, please do that.
00:49:01.340 | - Please do so, please.
00:49:02.820 | Please do that.
00:49:03.660 | - I should just be very clear.
00:49:05.060 | Philanthropy at universities and elsewhere,
00:49:07.620 | people think of it as like, oh, people, egoic legacy.
00:49:10.060 | Sure, also pays for hundreds of thousands of scholarships,
00:49:14.100 | the opportunity for people to--
00:49:14.940 | - And research, and you need to do it 100%.
00:49:17.700 | - It's vital, it's vital. - It's vital.
00:49:19.980 | But for the everyday person like you or me,
00:49:22.860 | if I want to impact the future, which I do,
00:49:27.620 | 'cause remember, I'm not the kind of futurist
00:49:30.020 | who predicts the future.
00:49:32.140 | My job at this point in time as I manifested
00:49:36.180 | in this biological entity called Ari Wallach
00:49:38.820 | is not to predict the future.
00:49:40.580 | It's to help folks make better decisions today
00:49:44.020 | so that we have better futures in the near term,
00:49:47.180 | the medium term, and the far off tomorrows.
00:49:49.580 | So what's going to impact those 50,000 Wallach descendants?
00:49:54.140 | It's not gonna be anything that I did egoically
00:49:58.140 | in terms of getting a recognition.
00:50:00.140 | What's going to impact them, and we know this in many ways
00:50:04.580 | from across multiple disciplines,
00:50:06.060 | what's going to impact them is going to be how I am
00:50:11.060 | with my children and my wife and my partner
00:50:14.660 | and the behaviors that I model
00:50:16.700 | because those become the memes, right?
00:50:20.780 | Susan Blackmore has meme theory, right?
00:50:22.380 | Not internet memes, though I watch a lot of those,
00:50:25.300 | but true memes, these cultural units that we hand off
00:50:29.140 | both laterally and forward longitudinally
00:50:33.620 | to other generations, especially those closest to us.
00:50:36.020 | If you want to impact the future,
00:50:38.220 | there's a bunch of things you can do, right?
00:50:39.860 | Reduce your carbon footprint, give money, vote this.
00:50:42.860 | I want all of those to happen in a positive way.
00:50:45.980 | But at the end of the day, it's monkey see, monkey do.
00:50:49.580 | How you and I interact right now
00:50:51.420 | will obviously impact our relationship,
00:50:53.940 | everyone who's listening or viewing,
00:50:56.100 | but then everyone who's listening or viewing,
00:50:57.500 | how they are with the person who hands them the coffee,
00:51:00.940 | the barista, or they are with their partner,
00:51:02.940 | how they model those behaviors
00:51:05.460 | is going to impact the future in a greater way,
00:51:08.740 | I will argue, than most of the ways
00:51:10.980 | we egoically think about having a legacy.
00:51:13.980 | - I totally agree, and I think I'm old enough,
00:51:18.980 | and frankly, I'm excited to be old enough
00:51:21.100 | that I can make statements about being old enough
00:51:23.220 | to know that, like, I believe that our species
00:51:25.700 | is, for the most part, benevolent.
00:51:28.020 | I feel like most people, if raised in a low trauma
00:51:33.020 | environment with adequate resources,
00:51:38.940 | will behave really well.
00:51:40.400 | There are exceptions, and there may be sociopaths
00:51:43.300 | that are born with really disrupted neural circuitry
00:51:45.740 | that they just have to do evil or feel, you know,
00:51:48.580 | but I think it's clear that trauma and challenge
00:51:53.580 | can rewire behavior, and certainly the brain,
00:51:57.800 | to create, you know, what we see as evil, right?
00:52:01.780 | So, but I think most people are good.
00:52:03.980 | - Yeah.
00:52:04.820 | - Most people are of genuine goodness,
00:52:06.900 | and I do think that we model behavior.
00:52:10.140 | I think that etiquette is something that,
00:52:13.140 | I guess, as a 49-year-old person,
00:52:16.500 | I guess, does that make me middle-aged?
00:52:17.780 | I'm of middle age.
00:52:18.900 | I'll probably live, hopefully, to be about 100,
00:52:20.660 | but we'll see.
00:52:21.940 | Bullet, bus, or cancer, I'm going to give it what I got.
00:52:24.380 | - It depends on whether or not you read your book fully.
00:52:26.780 | - Right.
00:52:27.620 | That, there's a response to that that could go either way.
00:52:31.040 | The, I like to think that reading the book fully
00:52:34.900 | will extend life as opposed to shorten life.
00:52:36.740 | - Yes.
00:52:37.580 | - But if nothing else, maybe it'll cure insomnia.
00:52:40.300 | The idea here is that if we're going to invest
00:52:47.580 | in being our best selves,
00:52:49.420 | one would hope that other people will respond to that
00:52:51.460 | the way that you said,
00:52:52.680 | that we'll kind of mirror each other.
00:52:54.500 | Good behavior breeds good behavior.
00:52:55.900 | In my lifetime, I've seen a real increase
00:52:59.980 | in the number of rules and regulations
00:53:01.980 | and a decrease in etiquette.
00:53:04.420 | Like, and what I would call, and I don't,
00:53:05.880 | this isn't a real term, I don't think,
00:53:07.460 | but like spontaneous etiquette, more genuine etiquette,
00:53:10.220 | like people being kind just to be kind,
00:53:11.860 | not because they're afraid of a consequence.
00:53:14.860 | - And I have a theory and I'll go through this quickly.
00:53:19.500 | I saw a documentary recently about the history of game shows
00:53:22.960 | where I learned that the first commercial
00:53:24.700 | was during the World Series where,
00:53:26.080 | when DiMaggio was making a run on the home run record.
00:53:29.420 | So they used a sports game that was televised
00:53:31.980 | and on the radio to have a first commercial.
00:53:33.940 | Then they had game shows,
00:53:35.080 | which were basically commercials for the products.
00:53:37.500 | That's what they were.
00:53:38.340 | And they used human interaction as a way
00:53:39.980 | to make it more interesting
00:53:41.260 | between the contestants and the host.
00:53:43.400 | And then came reality TV shows.
00:53:46.060 | And then now I would argue that social media
00:53:49.260 | is the reality TV show and we're all able to opt in
00:53:52.760 | and cast ourselves in it.
00:53:53.980 | And that the way that people get more,
00:53:57.220 | let's just say presence on the show,
00:53:59.600 | is to do things that are more hyperbolic.
00:54:03.260 | - Yeah, more outlandish.
00:54:04.100 | - Right, like it's very hard.
00:54:06.500 | I've tried and I think managed to some extent to do so too.
00:54:10.500 | It's very hard to create a very,
00:54:12.720 | very popular social media channel
00:54:15.260 | in this reality TV show that we are all in on social media
00:54:19.160 | by just being super nice to everybody and being,
00:54:24.080 | you can, but it's much harder
00:54:26.360 | than if you're a high friction player.
00:54:29.360 | 'Cause it's less interesting, there's less drama.
00:54:31.920 | It takes more attention.
00:54:33.360 | But I do think that there are pockets of that.
00:54:35.360 | So Lex Friedman used to talk about this.
00:54:37.520 | Like, is there a social media platform
00:54:39.040 | where people are rewarded for being benevolent,
00:54:42.900 | for modeling good etiquette,
00:54:44.520 | because they genuinely like that.
00:54:45.920 | And I say social media because I think so much of life now
00:54:49.880 | is taking place there.
00:54:50.720 | And that's the opportunity to reach people across continents
00:54:53.280 | and far away in time as well, right?
00:54:57.360 | To timestamp down things.
00:54:58.960 | So here's my question.
00:55:00.840 | Is there a version of social media
00:55:03.080 | that is not just on the half-life of like 12 hours,
00:55:06.560 | what was tweeted, et cetera, what was retweeted?
00:55:09.120 | 'Cause I would argue that,
00:55:10.200 | and even the highest virality social media posts
00:55:15.200 | have a half-life of about six months to a year.
00:55:19.960 | Maybe not even that.
00:55:21.080 | There are a few memes like the guy looking
00:55:22.800 | at the other girl walking the other way,
00:55:24.240 | those kinds of memes that seem to persist,
00:55:25.720 | but most of them don't.
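Treating that "half-life" loosely as exponential decay gives a quick feel for the claim; the six-month figure is the rough estimate offered here, and the decay model itself is an assumption for illustration:

```python
# Toy exponential-decay model of a post's attention, assuming a fixed half-life.
def remaining_attention(months, half_life_months=6.0):
    """Fraction of a post's initial engagement left after `months`."""
    return 0.5 ** (months / half_life_months)

print(f"{remaining_attention(12):.0%} left after one year")     # ~25%
print(f"{remaining_attention(36):.1%} left after three years")  # ~1.6%
```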
00:55:26.960 | So is there a time capsule sort of version of social media?
00:55:35.560 | Because I look on the internet, like on YouTube,
00:55:38.680 | and I would say there are probably
00:55:40.800 | three or four YouTube videos,
00:55:42.560 | namely the Steve Jobs commencement speech
00:55:44.960 | at Stanford in 2005,
00:55:46.880 | maybe Last Lecture by Randy Pausch
00:55:49.520 | before he died of pancreatic cancer,
00:55:51.780 | maybe Brené Brown's TED Talk on vulnerability.
00:55:54.220 | I'm thinking mainly in the self-help space,
00:55:55.740 | personal development space here.
00:55:57.680 | And frankly, aside from that,
00:56:00.300 | and most things as popular as they may seem,
00:56:05.360 | 100 million views, 200 million views,
00:56:07.520 | compared to literature, compared to music,
00:56:10.920 | compared to poetry, compared to visual arts,
00:56:14.360 | it's gonna be gone.
00:56:18.080 | But I like to think that these podcast episodes
00:56:19.920 | are gonna project forward 30, 40 years into the future.
00:56:22.360 | But if we look at the history of what's on YouTube
00:56:25.200 | and we look at the half-life of any social media post,
00:56:28.860 | it may not be the case.
00:56:30.760 | In fact, it's very likely it's not the case.
00:56:33.060 | One would hope that they morph into something that lasts,
00:56:35.040 | but the question here is,
00:56:37.000 | is there a version of social media
00:56:38.480 | that acts as a time capsule
00:56:40.440 | to teach the sorts of principles that you're talking about?
00:56:43.920 | - In the show that I just did, "A Brief History of the Future,"
00:56:45.960 | one of the places I visit
00:56:47.680 | are these caves in the South of Spain,
00:56:51.160 | 300 feet below the surface,
00:56:53.240 | that are extremely rare
00:56:54.720 | because what these caves have in them side by side
00:56:58.160 | are both kind of hand paintings
00:57:00.880 | done by both Neanderthals and Homo sapiens.
00:57:03.520 | It's one of the few places where they exist side by side.
00:57:06.640 | So before we talk about social media,
00:57:07.960 | we have to talk about what that really is, is storytelling.
00:57:12.960 | And we're trying, in social media as we know it right now,
00:57:16.360 | we're trying to tell the world a story about who we are
00:57:18.920 | and what I stand for.
00:57:19.840 | Why am I here and why do I matter?
00:57:22.640 | And notice me, my life meant something.
00:57:25.960 | But when we go back to that cave that I stood in
00:57:27.800 | where those drawings were from 40, 50,000 years ago,
00:57:31.580 | it was, these are the animals that are here,
00:57:33.720 | here's when they come by.
00:57:35.200 | This is going back
00:57:36.040 | to the very beginning of our conversation.
00:57:37.760 | This is a time of year
00:57:39.480 | you should expect to see these animals in this area, right?
00:57:42.440 | And it was what Nancy Barducci calls
00:57:44.080 | horticultural time versus mechanical time.
00:57:47.240 | So when you, because that's the way we used to think
00:57:50.360 | from 40,000 years to the agricultural revolution,
00:57:52.760 | 12,000, 10, 12,000 years ago,
00:57:54.960 | to probably up until a couple of hundred years ago,
00:57:58.120 | we didn't. Remember,
00:57:59.060 | the minute hand only existed on the analog clock
00:58:03.120 | starting about 200 years ago.
00:58:04.360 | - Really?
00:58:05.200 | - Yeah, we didn't think in minutes.
00:58:06.660 | We barely thought, look, the clock as we know it,
00:58:09.800 | the mechanical clock as we know it
00:58:11.400 | only comes about during the industrial revolution.
00:58:14.520 | And especially then when we start to have trains,
00:58:16.080 | remember the transcontinental rail--
00:58:16.920 | - Is it all sundial then?
00:58:18.200 | - It was Stonehenge, it was sundials, it was seasons, right?
00:58:21.760 | The way we would think about the future,
00:58:24.240 | wait, when people say, "Oh, Ari, you're a futurist."
00:58:26.520 | Like, "People like you have always existed."
00:58:29.180 | No, the idea of the future that is this thing out there
00:58:34.180 | that's gonna roil over us is relatively new.
00:58:38.020 | 'Cause up until a couple of hundred years ago,
00:58:39.380 | Ari and Andy, we did exactly what our,
00:58:41.660 | probably what our fathers did.
00:58:43.140 | And our kids would do exactly what we did.
00:58:45.060 | There was no kind of evolution in social structure.
00:58:49.340 | But at the advent as we--
00:58:50.180 | - I guess it could be argued I've done a lot of things
00:58:52.220 | that my father did.
00:58:53.100 | He is a scientist and there are other domains of life.
00:58:55.380 | But yeah, yeah. - This goes back
00:58:56.440 | to modeling behavior. - Right.
00:58:57.620 | - Right, the number one predictor
00:58:59.920 | if someone's gonna read the newspaper
00:59:01.280 | is if their parents read the newspaper.
00:59:02.560 | - Yeah, so my dad would say, "You'd open the paper."
00:59:04.720 | - Do you-- - And I'd poke it
00:59:05.560 | from behind when I wanted his attention, yeah.
00:59:08.080 | - We can talk about that in a second, the attention part.
00:59:11.720 | And so when I, look, when I look at,
00:59:15.760 | when I started answering your question about social media,
00:59:17.320 | I look at it as an anthropologist from Mars.
00:59:20.000 | That's how I go into every situation.
00:59:22.000 | I wanna say, why is it that we're doing what we're doing?
00:59:24.320 | How did that come about?
00:59:25.600 | And how might we learn from that
00:59:27.560 | so that we can potentially go in a different direction
00:59:29.840 | if we choose?
00:59:31.360 | All of storytelling is really a way
00:59:33.920 | of doing cultural transmission of memes,
00:59:37.320 | of ideas, of ways of being,
00:59:39.380 | so that we can flourish and move forward as a species.
00:59:43.040 | So then if you take that at its truth,
00:59:47.840 | what is social media right now,
00:59:50.080 | but nothing but a kind of a hall of mirrors
00:59:52.480 | of our culture right now?
00:59:53.640 | What will they say 200 years from now
00:59:56.200 | when they look at these posts with the likes
00:59:58.800 | and things that, the metrics that we use
01:00:02.160 | to judge ourselves individually
01:00:04.140 | and say, what happened to this species?
01:00:06.800 | - I'd like to take a brief break
01:00:07.880 | to thank one of our sponsors, Element.
01:00:10.200 | Element is an electrolyte drink
01:00:11.600 | that has everything you need and nothing you don't.
01:00:13.760 | That means the electrolytes, sodium, magnesium,
01:00:15.960 | and potassium in the correct ratios, but no sugar.
01:00:19.640 | Now, I and others on the podcast have talked a lot
01:00:21.800 | about the critical importance of hydration
01:00:23.760 | for proper brain and bodily function.
01:00:25.920 | Research shows that even a slight degree
01:00:27.640 | of dehydration can really diminish cognitive
01:00:30.320 | and physical performance.
01:00:31.580 | It's also important that you get adequate electrolytes
01:00:33.920 | in order for your body and brain to function at their best.
01:00:36.640 | The electrolytes, sodium, magnesium, and potassium
01:00:39.200 | are critical for the functioning
01:00:40.560 | of all the cells in your body,
01:00:41.800 | especially your neurons or nerve cells.
01:00:43.760 | To make sure that I'm getting proper amounts
01:00:45.220 | of hydration and electrolytes,
01:00:46.840 | I dissolve one packet of Element
01:00:48.280 | in about 16 to 32 ounces of water
01:00:50.200 | when I wake up in the morning,
01:00:51.400 | and I drink that basically first thing in the morning.
01:00:53.960 | I also drink Element dissolved in water
01:00:55.680 | during any kind of physical exercise I'm doing,
01:00:57.840 | especially on hot days if I'm sweating a lot
01:00:59.800 | and losing water and electrolytes.
01:01:01.960 | If you'd like to try Element,
01:01:03.140 | you can go to drinkLMNT.com/huberman,
01:01:05.880 | spelled drink-L-M-N-T.com/huberman,
01:01:09.720 | to claim a free Element sample pack
01:01:11.440 | with the purchase of any Element drink mix.
01:01:13.720 | Again, that's drinkLMNT.com/huberman
01:01:16.840 | to claim a free sample pack.
01:01:19.040 | I mean, one of the reasons I fell in love with biology
01:01:21.360 | is that, yes, we are evolving as a species,
01:01:24.960 | but I would argue slowly enough
01:01:26.800 | that any fundamental knowledge about biology
01:01:30.520 | of the human body,
01:01:32.940 | it's a core truth about us way back when and now,
01:01:38.260 | and very likely into the future.
01:01:40.000 | And of course, technologies will modify that,
01:01:41.840 | medicine will modify our biology, et cetera,
01:01:44.480 | but I get great peace from that.
01:01:48.160 | And most of the so-called protocols
01:01:49.840 | that I described on the podcast about viewing sunlight,
01:01:51.760 | et cetera, circadian rhythmicity, et cetera,
01:01:53.760 | has been core to our biology and our wellbeing
01:01:57.360 | 100,000 years ago,
01:01:59.560 | and very likely it will be core to our biology
01:02:01.760 | 100,000 years from now.
01:02:03.300 | I therefore worry about any technology
01:02:08.440 | that shortens up our timescale of motivation and reward.
01:02:19.160 | And I use social media,
01:02:20.440 | so I am not anti-social media by any stretch.
01:02:22.840 | In fact, I'm quite pro,
01:02:24.680 | provided it's kept in check,
01:02:26.720 | a la Jonathan Haidt's ideas.
01:02:29.280 | I really like those.
01:02:30.600 | But let me put it this way.
01:02:33.160 | If I go to Las Vegas,
01:02:34.320 | which I do enjoy doing from time to time,
01:02:36.420 | I'm not a gambling addict.
01:02:37.760 | I guess if I say that enough times,
01:02:38.920 | people are going to say I'm a gambling addict,
01:02:40.280 | but I enjoy playing a little bit of roulette
01:02:41.920 | or a little bit of slots.
01:02:43.080 | I play all the low-level stuff
01:02:44.320 | that doesn't require any thinking.
01:02:47.320 | And I often do pretty well for whatever reason,
01:02:49.840 | 'cause I know when to leave probably.
01:02:53.460 | But Vegas is all about short-term thinking
01:02:56.080 | and short-term reward contingency.
01:02:58.040 | It's actually designed in every respect
01:03:00.000 | to get you to forget
01:03:01.600 | that there are these other longer timescales.
01:03:04.160 | - That's why there's no natural light in most casinos.
01:03:06.000 | - There's no lights, there's no clocks in many of them.
01:03:09.160 | The random intermittent reward schedule
01:03:13.560 | that's there is designed to keep you playing.
01:03:16.680 | And I would argue that a lot of social media is like that.
01:03:19.240 | Not all of it, but a lot of it is like that.
01:03:21.760 | Reward likes and responses,
01:03:23.520 | in some cases fighting is what people want.
01:03:25.280 | They want to fight 'cause they like that emotion,
01:03:27.520 | that the algorithms figure you out
01:03:29.900 | so that they shorten up your temporal window.
01:03:32.880 | And so when people say,
01:03:35.480 | oh, we're walking around with a little slot machine
01:03:37.080 | in our pocket all day long with our smartphone,
01:03:39.520 | I actually think that's right.
01:03:40.780 | I think it's right.
01:03:41.620 | It's more like a casino, however,
01:03:43.400 | where that casino harbors all sorts of different games
01:03:46.320 | and they're gonna find the one that you like.
01:03:48.120 | Some people like playing roulette.
01:03:49.920 | I happen to like playing roulette.
01:03:52.160 | Some people like craps.
01:03:52.160 | Some people like poker.
01:03:53.500 | Some people like to bet on a game
01:03:54.700 | where you get to sit the whole game
01:03:55.920 | with the possibility of winning.
01:03:57.020 | A friend of mine who's actually an addiction counselor,
01:03:59.320 | he said, you know, the gambling addiction
01:04:00.920 | is the absolute worst of all the addictions.
01:04:04.080 | Because the next time really could change everything.
01:04:07.000 | Unlike alcoholism or drug addiction
01:04:08.600 | or other forms of addiction,
01:04:09.540 | where the next time is just gonna take you further down.
01:04:12.040 | In gambling, there is the realistic possibility
01:04:14.880 | that the next time could change everything
01:04:17.280 | and that destroys lives.
01:04:19.760 | So if we are walking around
01:04:21.760 | with a sort of casino in our pocket,
01:04:23.520 | how do we get out of that mindset,
01:04:26.600 | much less use that tool
01:04:29.280 | in order to get into these longer-term investments
01:04:31.840 | for the future?
01:04:32.680 | This is what I wanna know.
01:04:33.500 | How do we get into the metaphorical,
01:04:35.640 | you know, cave painting scenario?
01:04:37.760 | Because what it means is that the stories
01:04:39.440 | that I'm seeing on social media today
01:04:41.640 | probably are meaningless toward my future.
01:04:45.800 | Probably.
01:04:46.740 | - More than likely, yes.
01:04:47.760 | - But I need to be informed.
01:04:49.440 | But, you know, I saw the debates.
01:04:51.200 | Like, how much more do I need to hear
01:04:52.840 | about what was happening at the debates from other people?
01:04:56.560 | Probably zero.
01:04:57.400 | Like, there's no new information there.
01:04:59.380 | The only thing that can happen
01:05:01.280 | is I can get caught in the little eddy of the tide pool
01:05:04.820 | that is the debate about the debate
01:05:08.060 | or the debate about the debate about the debate.
01:05:10.160 | So, I mean, it takes a strong, strong mind
01:05:14.900 | to divorce oneself from all of that,
01:05:19.560 | much less get into this longer-term thinking.
01:05:21.880 | And maybe this is why David Goggins is always out running
01:05:24.600 | and hates social media so much,
01:05:26.260 | even though he's used it to good end to share his message.
01:05:30.960 | I mean, what is it that we can do
01:05:36.160 | to disengage from that short-term contingency reward mindset
01:05:41.160 | and behaviors?
01:05:42.040 | And what in the world can we do instead?
01:05:45.680 | Is it go paint, like, on the side of a cave?
01:05:49.000 | Is it write a book?
01:05:50.600 | Is it, I mean, how do we do that?
01:05:53.000 | And let's check off the box of, like,
01:05:54.400 | we need to tend to our kids, we need to tend to our health,
01:05:56.200 | we need to get our sleep, we need to get our...
01:05:57.760 | Let's just assume that we're taking care
01:05:59.040 | of the fundamentals of health and wellbeing,
01:06:02.360 | which doesn't leave a whole lot of time afterwards anyway.
01:06:06.720 | What do we do?
01:06:07.880 | Like, where are the story, where should the stories go?
01:06:11.760 | Where do we put them?
01:06:12.840 | I feel really impassioned by this because, you know,
01:06:17.280 | I devote my life to this, right?
01:06:19.440 | And I teach biology because I believe it's fundamental
01:06:21.840 | and transcends time, but I care about the future.
01:06:26.720 | And I'm well aware that, you know, in 30 years,
01:06:31.440 | the idea that there was a guy on the internet
01:06:32.960 | talking about the importance of getting morning sunlight,
01:06:35.600 | sure, that might happen, you know,
01:06:37.640 | but probably no one will care.
01:06:39.440 | Just like I realized about halfway
01:06:41.240 | through my scientific career that, sure,
01:06:42.660 | I was tenured at Stanford, won some awards,
01:06:44.960 | enjoyed the research, enjoyed the day-to-day,
01:06:47.080 | but I realized, okay, there's some...
01:06:48.160 | I feel good about the research contributions we made,
01:06:51.760 | but that I knew that people weren't gonna be like,
01:06:56.440 | oh, Huberman discovered this,
01:06:57.720 | because I had already forgotten the people from 30 years back,
01:07:01.320 | in my head, and I know the literature really well.
01:07:04.280 | So, like, how do you square these different mental frames?
01:07:09.280 | It's a conundrum.
01:07:13.960 | - This is the fundamental question of our time,
01:07:16.640 | is what is the purpose of our species being here on earth?
01:07:21.640 | And for thousands of years, that was answered by religion.
01:07:25.720 | The idea about who we are and why we are here,
01:07:29.760 | more often than not, was answered in the afterlife.
01:07:33.640 | But then along came our friend, rationality and logic,
01:07:38.480 | and the Renaissance and the Enlightenment.
01:07:41.360 | And as Nietzsche said, I'll give you the full quote,
01:07:43.960 | "God is dead, and now we're basically screwed."
01:07:46.560 | - But I don't believe that.
01:07:48.160 | I mean, I believe in God.
01:07:49.520 | I mean, I've gone on record saying that before.
01:07:51.440 | So, and there are many people who believe
01:07:52.800 | in God in the afterlife,
01:07:54.160 | but it still is difficult to navigate the day-to-day.
01:07:56.640 | - Because I wanna separate out what scientific rationality
01:08:01.000 | and the scientific method did,
01:08:02.800 | is it didn't actually kill God.
01:08:05.040 | What it actually did was it killed the structures
01:08:07.680 | that arose to intermediate between us and God,
01:08:11.400 | AKA the church.
01:08:12.520 | And this is not a conversation about theology.
01:08:14.600 | This is a conversation about structures and about power.
01:08:18.480 | - So, science destroyed religion?
01:08:20.600 | - 100%.
01:08:21.840 | It destroyed the stories that religion told us
01:08:24.920 | about our larger purpose.
01:08:26.960 | Because what ended up happening,
01:08:28.200 | look, oftentimes folks will say,
01:08:30.600 | "Well, science destroyed God and destroyed religion
01:08:35.520 | "because it told us where we came from.
01:08:37.320 | "We're not coming from seven days, right,
01:08:39.400 | "where God spun the earth
01:08:40.760 | "and created the heavens in seven days."
01:08:43.360 | I think we're at a point now
01:08:44.400 | where we're starting to realize
01:08:45.520 | that science actually tells us,
01:08:47.280 | going back 13.7 billion years ago to the Big Bang,
01:08:50.840 | we can quibble with that number,
01:08:52.640 | up to today, science is telling us how we got to this point.
01:08:57.400 | What science cannot do
01:08:59.040 | and what technology cannot do
01:09:00.920 | is tell us where we should be going.
01:09:02.720 | And so, what, I'm not saying God should be telling us
01:09:07.320 | what we should be doing or spirituality.
01:09:09.480 | What I'm saying is--
01:09:10.840 | - You're not gonna argue you can tell God what to tell us?
01:09:14.560 | - No, I'm not gonna argue.
01:09:16.400 | - But wait, but the term you just said,
01:09:18.520 | that science and technology cannot tell us
01:09:20.760 | where we need to go.
01:09:22.360 | - No, look, here, we started off by,
01:09:24.720 | we started off, so the work that I do,
01:09:26.840 | this mindset that I'm advocating for, I call long path.
01:09:31.840 | Long path sits on three pillars.
01:09:35.520 | These are the kind of, to use your nomenclature,
01:09:37.960 | there are three protocols.
01:09:39.800 | One, transgenerational empathy.
01:09:41.440 | Empathy with yourself, empathy with the past,
01:09:44.040 | and then empathy with the future.
01:09:45.840 | You need those three.
01:09:47.080 | The second pillar is futures thinking.
01:09:49.320 | You'll notice it's future with an S
01:09:51.440 | as opposed to the singular future.
01:09:53.120 | 'Cause we often think of the future as a noun,
01:09:55.080 | this thing that's out there,
01:09:56.440 | as opposed to what the future really is,
01:09:57.840 | which is a verb, it's something that we do.
01:10:00.240 | Then the final pillar,
01:10:01.880 | the one that is the most difficult
01:10:03.320 | for us to wrap our head around,
01:10:04.840 | is this idea of telos, ultimate aim, ultimate goal.
01:10:08.520 | What are we here for?
01:10:10.200 | So we all suffer from what I call a lifespan bias.
01:10:13.440 | So the most important unit of time to Andrew Huberman
01:10:18.200 | is from your birth to your death.
01:10:20.040 | We're all wired that way,
01:10:21.680 | because that's the literature,
01:10:23.280 | the science that I grew up with.
01:10:25.080 | I grew up, and I wanna be a geneticist, right?
01:10:27.720 | That's where I started.
01:10:29.560 | What the literature tells us
01:10:31.360 | about us as a biological entity
01:10:34.200 | is that the most important unit of time
01:10:35.960 | is from my birth to my death.
01:10:37.320 | But the reality is, for our species,
01:10:41.080 | and it has been going back hundreds of thousands of years,
01:10:43.400 | is that these things actually overlap.
01:10:45.920 | I come from my parents, then I am here,
01:10:49.640 | and now my children.
01:10:50.720 | These are not distinct units.
01:10:52.400 | There's massive overlaps in terms of the culture,
01:10:55.080 | the emotional, the psychology of what I got from them,
01:10:58.280 | what I'm giving to my kids.
01:11:00.200 | But what ends up happening in a lifespan bias society,
01:11:03.360 | the one that we exist in right now,
01:11:04.680 | is we have lost the telos.
01:11:06.320 | We have lost the ultimate aim or goal or purpose
01:11:09.480 | for our species, for our civilization on this planet.
01:11:13.800 | I'm not gonna tell you what that is.
01:11:15.200 | What I am gonna say is when you don't have that,
01:11:17.640 | because God is no longer in the picture,
01:11:19.160 | religion is no longer in the picture,
01:11:21.400 | we flounder about, and we're looking for metrics to judge.
01:11:24.200 | Am I doing the right thing?
01:11:26.240 | Do I matter?
01:11:27.440 | Will people know who I am 200 years from now?
01:11:29.640 | Is my sense of purpose connected to anything larger?
01:11:35.600 | And without these larger religious structures
01:11:37.800 | that we had for thousands of years, the answer is no.
01:11:40.560 | - But there are still many people on the planet
01:11:44.080 | who believe in God and are religious.
01:11:45.640 | - Yes, more than there are that aren't religious.
01:11:48.000 | - So does that mean that they're immune
01:11:50.280 | from this confusion?
01:11:52.600 | - Well, no, because there's other confusions
01:11:54.280 | that come from it, right?
01:11:55.120 | There's other, religion as it's practiced
01:11:57.840 | in majority parts of the world,
01:11:59.200 | and this is where I'm gonna get a lot of hate mail,
01:12:01.000 | is mostly about power and coercion and control.
01:12:05.240 | - Not at its essence.
01:12:06.560 | - Not at its essence.
01:12:07.400 | - And I would say that for every major religion.
01:12:08.920 | - Yes.
01:12:09.760 | - I would say for every religion.
01:12:10.960 | The essence of it is about love.
01:12:13.920 | - The essence is about love and emancipation
01:12:16.920 | from the human condition to connect to something larger,
01:12:19.720 | to connect to the divine.
01:12:21.560 | The problem is when the business models get in the way,
01:12:24.240 | right?
01:12:25.080 | - Right, like with anything.
01:12:25.920 | - Like with anything.
01:12:27.280 | And so--
01:12:28.840 | - But that's true of science too.
01:12:30.200 | I mean, I know a lot about the business models of science.
01:12:32.000 | - You referenced it earlier, right?
01:12:33.400 | Science, it's no longer like pure Medici-type science
01:12:37.160 | where you're doing these things in a lab.
01:12:38.880 | It's published, it's perished.
01:12:40.480 | There's business models.
01:12:41.320 | Can we take it from the lab to the, can we?
01:12:44.040 | 100%, and that is part of where we are.
01:12:46.440 | What I'm asking for when we have a conversation
01:12:48.720 | about our toes is to rise up out of this current moment
01:12:52.880 | and say, most mammals kind of have about a million years
01:12:57.880 | that they exist on earth from kind of when they rise up
01:13:00.440 | to when they go extinct.
01:13:01.760 | We're in the first third of this ballgame, right?
01:13:05.120 | - That's reassuring.
01:13:05.960 | - Yeah, we're in the--
01:13:06.800 | - 'Cause I keep hearing about, you know,
01:13:08.000 | the fact that we're almost dead.
01:13:09.040 | So we're about a third of the way through?
01:13:10.720 | - We're in the bottom of the third inning.
01:13:11.760 | - Oh, goodness.
01:13:12.600 | Yeah, all right.
01:13:13.440 | Well, you finally said something that gives me,
01:13:14.440 | I'm just kidding.
01:13:15.280 | Lots of things that you've said give me confidence
01:13:18.160 | in our future.
01:13:19.000 | Most notably that you're talking about this,
01:13:21.680 | sorry to interrupt, but I'm going to compliment you.
01:13:23.400 | So maybe it's okay.
01:13:24.240 | - I'll stop talking now.
01:13:25.240 | [laughter]
01:13:27.240 | - That most notably that, you know,
01:13:30.200 | I think you're the first person outside of
01:13:33.760 | the sub-branch of neuroscience,
01:13:37.520 | which is a very small sub-branch,
01:13:39.280 | people that study time perception,
01:13:41.400 | to really call to people's consciousness
01:13:45.640 | that the human brain can expand
01:13:50.320 | or contract its time perception.
01:13:52.000 | And we do this all day long.
01:13:53.280 | - All the time.
01:13:54.120 | - And high salience, high stress, high excitement,
01:13:57.280 | life and thinking shrinks the aperture, right?
01:14:02.560 | It contracts the aperture
01:14:04.360 | and makes us very good at dealing with things
01:14:06.480 | in the present, get to the next day or the next hour,
01:14:08.720 | collapse, go and continue, repeat, repeat, repeat.
01:14:12.240 | It's the opposite of what the Buddhists
01:14:15.240 | traditionally said, which was to be present
01:14:17.160 | in order to see the timelessness.
01:14:19.480 | This is why I'm a big fan of the,
01:14:21.080 | I forget the name, it's, Rob,
01:14:24.320 | we'll have to edit this in, the Asatoma Prayer,
01:14:28.720 | which talks about release me
01:14:30.120 | from the time-bound nature of consciousness
01:14:32.080 | to timelessness.
01:14:33.280 | Sounds very mystical,
01:14:34.200 | but what they're really talking about
01:14:35.160 | is get me out of the mode of stress,
01:14:36.520 | into the mode of relaxation that allows me
01:14:38.560 | to see how the now links with the past
01:14:41.040 | and relates to the future.
01:14:42.480 | Impossible to do when we're under stress,
01:14:44.240 | trying to figure out, like,
01:14:46.040 | how we're gonna get some place in traffic
01:14:47.680 | to pick up the kids so they're not waiting
01:14:49.440 | outside the school alone.
01:14:50.760 | Impossible, you just can't,
01:14:54.160 | the two deep breaths and the long exhale,
01:14:56.760 | it works to bring your level of autonomic arousal down,
01:14:59.920 | make you navigate that situation better.
01:15:02.280 | But it is the hyper-rare individual who thinks,
01:15:06.080 | "Well, look, this is linked to some larger time scale."
01:15:09.800 | Like, when we are stressed,
01:15:11.640 | the horizon gets right up close.
01:15:13.680 | So you're one of the first people to talk
01:15:14.920 | about this dynamic relationship with that horizon.
01:15:17.400 | Is there a way that we can leverage
01:15:22.760 | the immediacy of our experience, that fact,
01:15:27.520 | to actually create useful tools for the future?
01:15:31.720 | Like, so for instance, before we started recording,
01:15:34.040 | we were talking about the notion of time capsules.
01:15:36.280 | I've been keeping a time capsule for a long time.
01:15:38.160 | The first idea for this came when I was a kid.
01:15:39.840 | We used to build skateboard ramps in the backyard.
01:15:41.960 | And I'll never forget that right before we put down
01:15:43.640 | the first layer of plywood, we put a time capsule in there.
01:15:46.800 | We all, like, wrote little notes and did things, I think.
01:15:49.960 | Someone put some candy in there or something.
01:15:51.480 | It's kind of a cool concept, right?
01:15:53.440 | But social media, to me, does not seem like a time capsule.
01:15:58.280 | I feel like it's just gonna get turned over,
01:16:00.520 | turned over, turned over.
01:16:01.360 | What are the real time capsules of human experience?
01:16:03.640 | So you said religion, religious doctrine,
01:16:06.320 | Bible, Koran, Torah being the big three.
01:16:09.560 | And there are others, of course.
01:16:10.720 | But those are the big three, Bible, Koran, Torah.
01:16:12.960 | Those are big three time capsules, okay.
01:16:15.200 | Then we've got literature, music, poetry, visual art.
01:16:20.200 | So paintings, drawings, and sculpture.
01:16:25.200 | What else do we have?
01:16:28.000 | - So let's bring this down to the individual.
01:16:31.260 | Like, what one of my practices is,
01:16:34.880 | or I'll go through a couple of them.
01:16:36.680 | And so one of them, if you come to my home,
01:16:40.440 | which hopefully, you know, you'll come over.
01:16:42.440 | - I've been to your home.
01:16:43.280 | - Yeah, but, you know.
01:16:44.240 | - It's been a while.
01:16:45.080 | - It's been a while.
01:16:45.900 | - That was a complaint.
01:16:46.740 | - That was a, you know, I don't know if,
01:16:48.200 | I haven't invited you or you just, I don't know.
01:16:50.200 | We'll talk about it afterwards.
01:16:51.160 | - Whenever I make it to Manhattan,
01:16:52.280 | I have a hard time getting out of Manhattan.
01:16:54.000 | - It's true.
01:16:54.840 | So we have a shelf with a bunch of family photos.
01:17:01.140 | And, you know, there's photos of my grandparents,
01:17:04.320 | my parents, myself, my kids.
01:17:07.920 | And then to the right of that,
01:17:08.760 | there's actually, and people are always like,
01:17:10.520 | why didn't you, you know, take care of this?
01:17:13.360 | There's always, there's a blank photo frame.
01:17:15.560 | Just blank.
01:17:17.720 | Those, you know, I have three kids, they're young,
01:17:19.760 | but that blank photo frame represents my grandkids
01:17:24.680 | or future generations.
01:17:26.440 | It's just something that I can immediately see
01:17:27.920 | what I think about the decisions.
01:17:29.640 | That's why I said long path is a mindset.
01:17:31.340 | So there's all these complicated things
01:17:32.620 | and it's also a mantra.
01:17:34.300 | So when I get into an argument with my wife
01:17:36.140 | or I have a conversation with you or anything like that,
01:17:39.100 | and I immediately have this stimulus arousal response
01:17:41.940 | where I want to act in the short term,
01:17:43.580 | but I actually want to see the bigger picture.
01:17:47.120 | And again, this is highly self-referential.
01:17:49.460 | I understand that.
01:17:50.300 | I'll just say long path.
01:17:51.340 | I'll say like, what are we really trying to do here?
01:17:53.720 | What is this actually all about?
01:17:55.780 | And that, because I've been doing this long enough,
01:17:58.420 | brings me back.
01:17:59.260 | And when I see that third empty picture frame,
01:18:01.680 | it always reminds me that I'm here for this one segment.
01:18:06.400 | There was a segment before
01:18:07.240 | and there's a segment coming after me.
01:18:08.520 | And so how I am in my daily interactions
01:18:11.520 | is going to impact that.
01:18:13.160 | - How far, so just a few questions
01:18:14.680 | more specifically about you,
01:18:15.840 | because I think what you're doing here
01:18:17.000 | is you're concretizing a process, a protocol, if you will,
01:18:20.280 | that anyone can use.
01:18:21.960 | And I would argue that the shift from printed photos,
01:18:25.560 | largely from printed photos to electronic photos
01:18:27.800 | has made this problematic.
01:18:29.280 | I mean, it's made certain things simpler.
01:18:32.360 | Like if you change relationships,
01:18:33.620 | you can just delete a folder
01:18:35.160 | as opposed to having to actually take photographs
01:18:37.240 | from a previous relationship
01:18:38.240 | and make sure they're not around
01:18:39.440 | in case your next relationship
01:18:40.720 | would understandably take issue with that.
01:18:43.040 | I'm not speaking from experience here.
01:18:45.760 | But how far back do your photos go?
01:18:48.980 | - It's interesting.
01:18:49.820 | The photos of my grandparents
01:18:52.140 | who both perished in the Holocaust
01:18:53.960 | were saved by my father who was in World War II,
01:18:57.120 | fought with the Jewish underground,
01:18:58.320 | made his way through Europe to Cuba to Mexico,
01:19:00.600 | where he eventually met my mom and I was born.
01:19:02.900 | The photos that we have,
01:19:05.080 | he had kept in his wallet for several decades
01:19:08.200 | and he had them kind of reconstructed and turned in.
01:19:10.640 | That's as far back as we go.
01:19:12.120 | - So grandparents. - Yeah.
01:19:13.440 | - Okay, and then you're married, you have three kids,
01:19:16.120 | and then you have this--
01:19:19.200 | - Empty photo frame. - Empty photo frame.
01:19:20.960 | And you're the same age as me.
01:19:24.600 | You're 50 or you're 49?
01:19:25.680 | - 49. - 49.
01:19:26.600 | - Thank you.
01:19:27.440 | - But you seem to be in good health.
01:19:29.440 | - Yes, and seemingly young, right?
01:19:31.960 | - Yeah, you have energy.
01:19:32.800 | You've always had a lot of energy.
01:19:34.420 | You used to call yourself Ari Ferrari.
01:19:36.960 | You said you were like a Ferrari.
01:19:38.040 | That's why they named you Ari.
01:19:39.120 | - I don't think this was a name I gave to him.
01:19:39.960 | - Ari and I have known each other since we were little kids.
01:19:42.360 | He's always had a ton of energy.
01:19:43.840 | Actually, he hurt himself when he was younger
01:19:46.120 | and he was in full traction,
01:19:47.280 | like cast of his whole lower body.
01:19:49.600 | And he would dance on the floor on his arms,
01:19:51.720 | kind of like David Goggins would treadmill on his hands,
01:19:54.740 | even when he can't move his legs.
01:19:56.160 | Okay, so chances are you'll meet your grandkids.
01:19:58.840 | - Hopefully.
01:19:59.680 | - Yeah, God willing, you'll meet your grandkids.
01:20:01.880 | But probably not your great-grandkids.
01:20:06.080 | - Probably not.
01:20:06.920 | - Okay, well, I have a different tool.
01:20:08.520 | - But let me say something.
01:20:09.760 | I will not probably meet them biologically,
01:20:13.200 | like in the sense that this big lump of cells
01:20:18.200 | will probably not meet my great-grandchildren,
01:20:20.840 | but we'll meet them.
01:20:22.960 | What I'm 100% sure of is the way that I've modeled
01:20:27.600 | being in the world to partners,
01:20:30.740 | be they my wife, my children, business, colleagues,
01:20:35.740 | that modeling, my kids will be in the room sometimes
01:20:37.860 | when I'm on work calls, right?
01:20:39.860 | You know, nothing confidential.
01:20:41.180 | And they'll hear in the background,
01:20:43.140 | they'll hear how I interact,
01:20:44.180 | how I am in the current human moment.
01:20:48.080 | They are learning, they are receiving.
01:20:51.740 | That is how I'm gonna meet my great-grandkids.
01:20:53.980 | That's how I will be in the room with them.
01:20:55.700 | How I have been is going to impact
01:20:58.380 | 30 or 40 generations out.
01:20:59.780 | That 50,000 descendants that I talked about earlier,
01:21:03.540 | 250 years from now, I will meet them.
01:21:05.800 | I will be with them.
01:21:06.860 | They may not know my name, who I am,
01:21:09.420 | but hopefully the way they treat a stranger
01:21:12.260 | or they interact with their partners
01:21:14.780 | comes about how I did it,
01:21:17.220 | that modeled behavior, that transmission.
01:21:19.180 | - Yeah, I get it.
01:21:20.020 | And it's interesting because I think that,
01:21:23.260 | well, and you're on the internet,
01:21:25.320 | so people will see you on the internet,
01:21:27.860 | probably at least, you know, I think 30, 50 years out,
01:21:31.580 | if you Google your name or whatever it's called
01:21:33.380 | at that point, Googling, you know.
01:21:34.940 | I get in trouble whenever I say Googling.
01:21:36.340 | People go, "Why don't you talk about a different one?"
01:21:38.020 | 'Cause that's the one everyone uses.
01:21:40.420 | Unless you use DuckDuckGo 'cause you're afraid
01:21:42.180 | of what people might.
01:21:43.340 | So when someone comes up with a truly better one,
01:21:47.940 | maybe it'll get replaced.
01:21:48.780 | But meanwhile, Google.
01:21:49.960 | So they'll get to, your great-grandkids
01:21:52.580 | could possibly know you there.
01:21:53.980 | They could hear this conversation, this very conversation.
01:21:56.780 | I think that's part of the reason
01:21:57.620 | why people go on social media, not just to be consumers,
01:22:00.020 | but they wanna leave something.
01:22:02.420 | They're probably not thinking about it consciously,
01:22:03.820 | but they wanna leave something for the future.
01:22:05.800 | I use a tool that I learned from a friend.
01:22:10.580 | He has this Your Life in Weeks, I think it's called.
01:22:15.580 | And it's this, you know, fill-in chart
01:22:18.460 | where you put your birthday,
01:22:20.460 | you put your predicted lifespan.
01:22:22.340 | So for me, I put 100, it feels good to me.
01:22:24.580 | I'm not interested in living much past 100
01:22:26.580 | unless there's some technology that would allow me
01:22:28.460 | to do that with a lot of vigor
01:22:29.340 | and my friends would be around.
01:22:30.740 | So you mark off, you fill in these little squares.
01:22:35.140 | And I did this this morning, actually.
01:22:37.300 | And, you know, I'm not quite halfway through,
01:22:39.660 | but I'm about halfway through.
01:22:41.260 | And it's an interesting thing to see your life
01:22:43.580 | in that representation.
01:22:46.100 | Oh, wow, it can inspire better decision-making
01:22:49.380 | because we can lose track of where we are in time.
01:22:52.020 | And some of us, including me,
01:22:53.220 | are not very good at tracking time.
01:22:55.020 | People that have ever waited for me on an appointment
01:22:57.100 | know this.
01:22:57.940 | I don't, I track, I'm very oriented in space,
01:22:59.980 | not well-oriented in time.
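For anyone who wants to try the Life in Weeks exercise described here, the arithmetic behind the chart is simple enough to sketch in a few lines of Python. This is a minimal illustration, not anything prescribed in the conversation; the birth date, the 100-year horizon, and the 52-weeks-per-year rounding are all assumed example values.

```python
# Minimal sketch of the "Life in Weeks" arithmetic: weeks lived so far
# versus total squares on a chart with an assumed 100-year horizon.
from datetime import date

def life_in_weeks(birth, expected_years=100, today=None):
    """Return (weeks_lived, total_weeks) for a fill-in-the-squares chart."""
    today = today or date.today()
    weeks_lived = (today - birth).days // 7   # whole weeks already lived
    total_weeks = expected_years * 52         # rough number of squares on the chart
    return weeks_lived, total_weeks

# Hypothetical birthday, used purely for illustration.
lived, total = life_in_weeks(date(1975, 9, 26))
print(f"{lived} of roughly {total} squares filled in ({lived / total:.0%})")
```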
01:23:02.060 | So here's the problem with these charts,
01:23:07.060 | or photos on the shelf, I would argue.
01:23:10.920 | They have great utility,
01:23:11.860 | but the problem is that they're not at the forefront
01:23:15.100 | of our consciousness throughout the day, right?
01:23:19.660 | Like I filled out that chart,
01:23:21.020 | I didn't even think about it again until now.
01:23:23.020 | And when we are pressed with a decision,
01:23:26.380 | in some cases, we have the opportunity to step back
01:23:28.340 | and say, okay, look, in the bigger arc of things,
01:23:30.700 | I got to go left here, even though I want to go right.
01:23:32.940 | This is the right thing for my, the bigger picture.
01:23:36.340 | - The bigger picture, the long path, yes.
01:23:37.940 | - So, you know, is there a way,
01:23:40.540 | is there maybe a technology that actually serves us,
01:23:43.500 | that anchors us to the best decision-making for a given
01:23:47.640 | time bin, as we would call it in neuroscience,
01:23:52.380 | the best mode of time binning,
01:23:54.820 | for a given decision?
01:23:55.700 | - I think you need to ask yourself a question.
01:23:57.180 | When you're facing a decision, you know,
01:23:58.740 | not should I have turkey or chicken for lunch,
01:24:01.980 | but maybe a slightly bigger one, or maybe that question too.
01:24:04.620 | Just ask yourself, am I being a great ancestor?
01:24:07.200 | What will allow me to be a great ancestor?
01:24:10.180 | How will descendants look back on this decision,
01:24:12.500 | go left or right?
01:24:13.380 | That's going to elevate you.
01:24:14.900 | Look, I talked about that.
01:24:15.740 | You talked about, you know, deleting photos
01:24:17.460 | and then stuff like that.
01:24:18.300 | So I'll tell you about the work.
01:24:19.860 | One of my, on my advisory board is a guy
01:24:21.780 | named Hal Hershfield, smart, great guy at UCLA.
01:24:25.420 | He does a lot of future you work.
01:24:27.740 | And so what he did was,
01:24:29.700 | and I'll do the short version of this,
01:24:32.060 | he put a bunch of people into an fMRI,
01:24:34.420 | a functional MRI, to see kind of where the blood flow is.
01:24:37.180 | And he asked them, he did a series of questions
01:24:41.100 | where it's like, think about yourself right now,
01:24:42.700 | and one part of your brain lit up.
01:24:44.620 | And then he goes, okay,
01:24:45.460 | I want you to think about this celebrity.
01:24:48.260 | I think he used Matt Damon and Natalie Portman
01:24:50.220 | and another part of their brain lit up.
01:24:52.140 | And he said, I want you to think about yourself
01:24:53.620 | 10 years from now.
01:24:55.080 | And guess what?
01:24:56.200 | The part of the brain that lit up for the celebrities,
01:24:58.580 | Natalie and Matt, was the same part that lit up
01:25:01.180 | when thinking about you 10 years from now.
01:25:03.780 | So you got a vague idea of who future Ari was,
01:25:06.580 | but you weren't totally connected to them, right?
01:25:09.780 | It was like a stranger to you.
01:25:11.620 | He pulled them out; one group did nothing.
01:25:13.460 | Another group, he took a photo of them,
01:25:16.140 | aged the photo,
01:25:19.100 | and then put them into a 3D virtual reality.
01:25:22.300 | And you're in a room, and at one point,
01:25:24.100 | and you don't know this is gonna happen.
01:25:25.460 | As you walk across the room, you see a mirror
01:25:27.340 | and you look at yourself in the mirror,
01:25:28.300 | and it's a photo of you, but aged 10 years.
01:25:30.400 | So you're seeing an older version of you.
01:25:33.180 | - Yikes, I mean, and cool.
01:25:35.340 | - Very cool. He does this intervention, pulls them out,
01:25:38.100 | brings them back, I think two weeks later.
01:25:40.220 | And he has them hypothetically put money away
01:25:42.700 | for a savings account, and you know exactly what happens.
01:25:45.540 | The people who saw a version of their aged self
01:25:47.980 | put more money away for a future retirement account
01:25:50.780 | than the folks that didn't.
01:25:51.860 | So the question is, not only are we disconnected
01:25:55.220 | from the future, my future descendants,
01:25:58.660 | I'm disconnected from my future self.
01:26:02.300 | So what I've done, and you'll see this in the show,
01:26:05.260 | it's scary 'cause I look just like my dad,
01:26:06.900 | and you'll always look like your dad when you do this,
01:26:09.100 | is even though we've been bagging on social media,
01:26:12.400 | you can go on Snap or other places
01:26:15.020 | where they'll age you, right?
01:26:16.420 | It'll make you look 10, 15 years older,
01:26:17.980 | and you can send it to your partner and everybody laughs.
01:26:20.080 | So I took a screenshot and I-
01:26:22.420 | - Everybody laughs as opposed to saying you look great.
01:26:24.820 | - No, no, no, everyone's like, "Oh my God."
01:26:26.340 | And so once I read about Hal's research many years ago,
01:26:31.340 | I printed that out, you know, my little home printer,
01:26:34.700 | cut it out, and it's on my bathroom mirror.
01:26:37.100 | And every day I spend two or three seconds
01:26:40.540 | staring at future older Ari in his 70s.
01:26:43.520 | That's how I make better decisions today.
01:26:45.580 | And those better decisions aren't just about
01:26:47.220 | putting money for retirement.
01:26:48.620 | It's also about how I take care of myself, you know,
01:26:51.600 | do I floss or not?
01:26:52.900 | You know, you get it at the end of the night,
01:26:54.660 | you wanna just brush your teeth and go to bed.
01:26:56.020 | - No, you need to floss at night.
01:26:57.100 | - You need to floss at night.
01:26:58.220 | - We did an episode on oral health.
01:26:59.620 | - Yeah, yeah.
01:27:00.460 | - And I learned from the dentist right before sleep.
01:27:01.780 | - The most important way to take care of future self
01:27:03.620 | is flossing, by the way, just to be clear.
01:27:05.500 | I've learned this from many people.
01:27:06.980 | - It's actually true.
01:27:08.360 | - No, it's true.
01:27:09.200 | - It's so key for brain and body health.
01:27:10.460 | - It's unbelievably key.
01:27:11.300 | - The dentists are gonna thank you.
01:27:12.780 | - But we don't do it.
01:27:14.060 | But if you look at your mouth 20 years from now,
01:27:16.300 | staring at you as you're smiling
01:27:17.580 | with the older version of Andy with you,
01:27:19.100 | you know, a little bit less hair,
01:27:20.000 | a little bit more wrinkles, you're gonna do it.
01:27:22.740 | This is what Hal's work has shown.
01:27:24.300 | So that's another thing that I've done
01:27:25.460 | is just look at that image of future you
01:27:28.780 | and connect with it.
01:27:29.880 | That's about having compassion for yourself.
01:27:33.500 | That's part of this kind of
01:27:34.380 | transgenerational empathy component.
01:27:37.060 | The one thing I wanna circle back on
01:27:39.500 | because we could quickly fly past it
01:27:41.660 | is this idea of futures thinking
01:27:44.180 | versus the singular future.
01:27:46.220 | - Yeah, I definitely wanna touch on that.
01:27:47.580 | Can I just ask you a question real quickly before here?
01:27:49.980 | - Of course.
01:27:50.800 | - This notion of, let's say, a protocol
01:27:54.180 | for imagining future self
01:27:56.100 | or actually visualizing future self,
01:27:58.220 | not as a way to scare yourself into better health habits,
01:28:00.660 | although if it works, great,
01:28:02.260 | but as a way to really get your mind
01:28:06.380 | into the reality that if you survive,
01:28:08.980 | you're gonna get older by definition
01:28:11.740 | and that person needs care and in an environment
01:28:15.660 | and your kids are gonna grow up too.
01:28:17.940 | We know this.
01:28:18.780 | Okay, so that's all obvious.
01:28:20.240 | I feel like barring accident or injury or disease,
01:28:26.200 | most people have a kind of intuitive sense
01:28:29.300 | of how long they're going to live.
01:28:30.980 | And the reason I say this is,
01:28:33.100 | I remember when Steve Jobs was alive
01:28:34.700 | 'cause I was a postdoc in Palo Alto then
01:28:36.420 | and would see him occasionally around Palo Alto
01:28:38.860 | and then read the Walter Isaacson biography about him
01:28:43.860 | and it seemed like he had a very clear sense
01:28:46.820 | that someday he would die.
01:28:48.460 | And he lived his life
01:28:49.380 | essentially according to that principle.
01:28:51.900 | And in some sense may have justified
01:28:53.480 | being a little bit outrageous at times
01:28:55.900 | and a little bit high friction at times
01:28:57.860 | through the sense of urgency.
01:29:00.780 | Like it was important to get things done
01:29:02.080 | and get them done right
01:29:03.460 | and to discard with a lot of kind of like popular convention
01:29:06.140 | and he's kind of celebrated for it.
01:29:07.460 | I'm sure a few people dislike him
01:29:08.860 | but I think most people celebrate him for it.
01:29:11.060 | I guess he had some sense of how long he was gonna live.
01:29:15.060 | And then at one point maybe that sense was inflated
01:29:17.460 | and then boom.
01:29:19.120 | Your dad died when you were very young.
01:29:21.780 | Do you think that that gave you a perspective
01:29:23.980 | that at any moment you could be four months out,
01:29:29.200 | you could get the four months notice
01:29:30.680 | that you're gonna be dead in four months?
01:29:32.480 | Like did it shape your thinking about the future?
01:29:35.500 | I mean, my dad's alive now, and I'm not saying this as a,
01:29:38.240 | I mean, no, it's interesting
01:29:39.640 | that there may have been a distinct advantage,
01:29:41.480 | of course, not to his dying, of course,
01:29:43.600 | but to the idea that it really creates this sense of urgency
01:29:47.320 | about not just the present, but the future.
01:29:49.440 | I remember when we were very young,
01:29:50.440 | you're like, "I wanna have kids."
01:29:51.260 | You got going on a family like,
01:29:52.520 | I think first among all of us, really early.
01:29:56.280 | And for those whose parents are still alive
01:30:00.840 | and seem to be vigorous,
01:30:03.440 | maybe they feel less of a sense of urgency, right?
01:30:05.720 | Which sounds wonderful.
01:30:07.320 | Parents are alive, vigorous, okay, that's a blessing.
01:30:10.300 | But if it prevents you from living your life
01:30:12.520 | in a way that's really linked to your futures,
01:30:15.180 | that's not good.
01:30:16.400 | So do you think that we have an intuitive sense
01:30:18.560 | or an unconscious sense of how long we are likely to live,
01:30:22.240 | like a kind of a range?
01:30:23.600 | 'Cause Steve kind of argued that
01:30:25.200 | in some of his writings and speaking.
01:30:27.160 | - So look, let's talk about death.
01:30:29.840 | So it's my contention that one of the things
01:30:35.360 | that keeps us from thinking about the far future
01:30:37.680 | and acting and behaving in a way
01:30:39.520 | that will alter it for the better
01:30:41.960 | is the fact that to truly think
01:30:44.280 | and feel yourself into the far future
01:30:47.200 | means that you're gonna have to think about a moment
01:30:49.320 | where you no longer exist.
01:30:51.640 | In 1973, Ernest Becker wrote a book,
01:30:54.200 | which you'll know all about the book based on the title,
01:30:56.120 | called "The Denial of Death."
01:30:57.240 | He won the Pulitzer Prize for it.
01:30:59.280 | And Becker's contention was that we're the only species
01:31:03.200 | that at a very early age recognizes
01:31:06.700 | that we are only here for a short period of time,
01:31:09.460 | but more than anything, at one point in time, we will die.
01:31:11.800 | We will cease to exist.
01:31:13.400 | And it was Becker's contention further
01:31:15.860 | that everything, religion, culture, laptops, convertibles,
01:31:21.200 | everything that we create is our way of pushing back
01:31:23.880 | the very understanding that at one point
01:31:25.880 | we will cease to exist, and it horrifies us.
01:31:28.560 | - I could not agree more, and I'm so, so grateful
01:31:31.640 | that you mentioned this book and this idea from Becker,
01:31:34.520 | because I would argue that every addiction,
01:31:37.000 | every single addiction, is based in a fear of death.
01:31:40.440 | And an attempt to shorten the timescale of thinking,
01:31:44.120 | shorten the timescale of reward,
01:31:45.480 | shorten the timescale of everything to avoid that reality.
01:31:49.720 | And it's a reality that we learn of at a very early age,
01:31:53.440 | intuitively, 'cause we see death around us.
01:31:55.440 | More and more now in America,
01:31:57.240 | especially in the Western world, we push back from death.
01:32:00.040 | We do everything we can to avoid it, even old people,
01:32:05.040 | whom we put in old age homes.
01:32:08.800 | It used to be we lived together, right,
01:32:10.720 | in these multi-generational homes,
01:32:12.760 | because older people, I would argue, remind us of death,
01:32:15.520 | remind us of our own mortality.
01:32:17.320 | And so until we can reconcile ourselves truly
01:32:20.600 | at an individual, and maybe even at a collective level,
01:32:23.040 | that we will cease to exist, it becomes extremely
01:32:25.800 | and is extremely difficult to future, to future properly,
01:32:28.840 | to future in the way that I'm advocating for,
01:32:31.040 | which is about being a great ancestor
01:32:32.880 | to future descendants and generations.
01:32:35.800 | And so in the work that I've done and in the show
01:32:38.840 | that I did, I did something, people were very confused.
01:32:41.880 | The show about the future, "A Brief History of the Future,"
01:32:45.280 | everyone's like, "Oh, you're gonna go see
01:32:46.440 | "all this cool technology, blah, blah, blah, blah."
01:32:48.840 | That's part of what we do.
01:32:50.360 | But in the middle of the show, in episode four,
01:32:54.120 | I go to the high mountain desert,
01:32:55.800 | we travel all over the world,
01:32:56.640 | but I go to the high mountain desert outside of Tucson,
01:32:59.600 | and I sit with Alua Arthur, a death doula.
01:33:02.920 | And what she does, you know, most of the time
01:33:05.600 | when we think of a doula, we think of someone helping birth
01:33:08.440 | a child into the world.
01:33:10.640 | What a death doula does is help us
01:33:12.800 | and help our loved ones exit this world.
01:33:15.720 | And she does something extraordinary.
01:33:17.640 | Other cultures, some religions have this.
01:33:20.000 | She does something called a death meditation.
01:33:21.720 | And in the show I do it, and you can find these online,
01:33:25.360 | where you literally go through a guided meditation
01:33:29.200 | where you go from breathing to cessation of breath
01:33:32.160 | to literally just becoming one with the soil.
01:33:34.680 | It's a very intense thing to go through.
01:33:38.160 | But I went through a version of the death meditation,
01:33:41.200 | as you've alluded to, when I was 18 years old.
01:33:43.800 | 'Cause I literally am the one who picked up the phone
01:33:46.680 | from the hospital at two in the morning.
01:33:48.080 | I was home from college, and I picked it up.
01:33:50.360 | I didn't even say hello.
01:33:52.440 | I picked up the phone, and I said, "This is his son."
01:33:54.480 | 'Cause who else was calling at two in the morning?
01:33:56.640 | And it was a charge nurse, and she goes,
01:33:57.960 | "I want to bring you up to speed."
01:33:59.720 | It was the late stage of cancer.
01:34:01.240 | "Your father is not responding.
01:34:02.680 | "We've been doing CPR.
01:34:03.760 | "There are no orders on what to do.
01:34:05.080 | "What do you want us to do?"
01:34:06.480 | So I made that call, 'cause it was obvious
01:34:09.200 | where it was going.
01:34:11.120 | That was my way of confronting the salience
01:34:15.320 | of his mortality and my own mortality very, very abruptly.
01:34:20.320 | Other people have their own early brushes with death.
01:34:24.200 | I would argue that there is a certain level,
01:34:26.760 | and you touched on this, of emancipation
01:34:29.520 | when you've come close.
01:34:30.360 | You don't want to wish it on anyone.
01:34:32.200 | But when you've come close to seeing what that looks
01:34:34.560 | and feels like, you all of a sudden become free
01:34:37.320 | from the burdens that society places on you
01:34:40.200 | in the Ernest Beckerian way
01:34:42.760 | of trying to push back mortality,
01:34:44.720 | 'cause you no longer give a shit,
01:34:46.920 | 'cause you now know where it's all gonna go,
01:34:49.400 | and you've seen it.
01:34:50.360 | As a society in the West, in America,
01:34:55.240 | we do the exact opposite of that.
01:34:57.280 | We inject things into our body,
01:34:59.200 | into everything we can to push it back,
01:35:01.560 | 'cause we want more quantity,
01:35:03.760 | but we don't think about the quality
01:35:05.720 | of the life that we want.
01:35:07.640 | Now, that being said, you go to Japan.
01:35:09.960 | 90% of the companies that are over 1,000 years old
01:35:14.840 | on planet Earth right now are in Japan.
01:35:16.760 | So part of it is our culture.
01:35:18.000 | Part of it is different cultures
01:35:19.600 | and how they think and respect elders and death,
01:35:22.000 | and they understand that we don't need to exist
01:35:25.240 | within this own lifespan bias,
01:35:26.880 | but we're actually part of a chain,
01:35:29.040 | a great chain of being those who came before,
01:35:32.600 | the pros and cons of that, the baggage of that,
01:35:35.080 | and then it's my role to decide what I wanna keep
01:35:38.080 | and what I wanna let go,
01:35:39.080 | and then what I wanna transmit to the next generation.
01:35:41.720 | That larger purpose, that larger telos
01:35:45.720 | is what's missing right now
01:35:47.320 | that I think we need back in Western society,
01:35:50.240 | not just so that we're grounded and happy,
01:35:52.560 | that's yes, and we're content,
01:35:54.560 | but because we need to be able to do that
01:35:56.280 | as we confront what we do or do not do
01:35:59.360 | about climate change,
01:36:00.200 | what we do or do not do about synthetic biology,
01:36:02.400 | what we do or do not do about artificial intelligence,
01:36:05.040 | 'cause right now, especially on the last two,
01:36:07.360 | the technology is telling us what to do,
01:36:09.720 | and we don't need more smartness, we need more wisdom,
01:36:12.960 | and part of that wisdom is gonna come about
01:36:15.560 | by us integrating the fact that you alluded to
01:36:18.760 | that at one point we won't be here.
01:36:20.840 | - How do we do this?
01:36:21.720 | I mean, like we can do it conceptually,
01:36:23.800 | like you wanna set the stage for that,
01:36:25.520 | whoever ends up in that empty frame to have a better life,
01:36:30.520 | but it's hard to do.
01:36:34.240 | Like I think most people assume once it's lights out,
01:36:36.840 | who knows what happens next,
01:36:38.020 | but it's very hard to get them working
01:36:39.880 | for something that they don't have the ability to imagine
01:36:43.600 | and the people that they don't even know.
01:36:45.620 | So in other words, if we have a hard enough time
01:36:47.680 | imagining ourselves in the future, you gave us a tool,
01:36:50.640 | look at the aged version of yourself.
01:36:52.280 | I love that, and if there's a website that will do that,
01:36:54.660 | we can put a link to it in the show note captions.
01:36:56.360 | Put a reminder that you will get older,
01:36:58.620 | you are getting older in this very moment,
01:37:01.200 | and try and live for the wellbeing of that person
01:37:04.840 | and the people around them and look at it.
01:37:07.080 | So that creates a protocol for the self.
01:37:09.880 | How do we protocol the future setting,
01:37:15.240 | the futures approach, the verbing of the future
01:37:20.560 | or into the future for people around us
01:37:24.000 | and for people that we don't even really know
01:37:25.840 | and that we probably will never even meet?
01:37:27.480 | - Great question.
01:37:29.920 | Before we go into that,
01:37:31.160 | let's double click on the individual incentive.
01:37:34.120 | So we talked about the aging photo that you can do.
01:37:37.680 | There's also another thing you can do that's very powerful.
01:37:39.840 | You touched on this earlier,
01:37:40.860 | which is writing a letter to your future self.
01:37:43.120 | So you can do this at longpath.org,
01:37:47.760 | you can find future me, websites.
01:37:49.640 | - You have a-- - Yeah, yeah.
01:37:51.120 | It's the number one tool that we use.
01:37:52.760 | So when I give talks, I give, shockingly,
01:37:55.600 | people have me come and talk to large groups.
01:37:57.720 | - Not shockingly, come on.
01:37:58.800 | - What I say to them is,
01:38:00.900 | we'll kind of go through a version
01:38:02.360 | of a different conversation like this.
01:38:04.300 | And I'll say, "Now, what I want you to do
01:38:06.320 | "is I want you to write a letter to your future self.
01:38:08.080 | "It's gonna be delivered in five years from now."
01:38:10.720 | And I thought this was a common practice
01:38:13.520 | because I've been doing it from a very early age,
01:38:15.920 | but apparently it's not,
01:38:16.880 | to write a letter to your future self.
01:38:18.360 | - Yeah, I can't, I mean, maybe once or twice.
01:38:20.400 | - We did it.
01:38:21.240 | And so I'll let you in on a little secret.
01:38:26.680 | The change occurs not when you receive the letter,
01:38:29.520 | but when you actually write it.
01:38:31.640 | Because you're actually thinking, in a way,
01:38:34.120 | about future you in a way that you normally don't,
01:38:36.660 | which is, "Who's gonna receive this letter?
01:38:38.260 | "Where do I want them to be?"
01:38:39.660 | And what I find, more often than not,
01:38:42.720 | is people come up to me afterwards,
01:38:44.880 | and they go, "To write it, I'd never even thought.
01:38:47.680 | "Who do I wanna be in five or 10 years?
01:38:49.540 | "Like, what's that arc of what I wanna kinda connect to?
01:38:52.260 | "What am I optimizing for?
01:38:53.860 | "How do I make myself better in that way?"
01:38:55.600 | So I wanna make sure people understand
01:38:57.840 | that if you can't look at a photo of yourself aged,
01:39:00.220 | the very least, write a letter to your future self.
01:39:02.240 | - And what does the letter include?
01:39:04.480 | - Dear Andy, dear Ari,
01:39:08.040 | and then whatever you wanna put in, right?
01:39:10.960 | This is a one-to-one private conversation
01:39:13.280 | with your future self.
01:39:14.640 | What are your hopes?
01:39:15.480 | What are your dreams?
01:39:16.300 | What are your desires?
01:39:17.140 | What are you afraid of?
01:39:17.960 | What do you wanna see happen?
01:39:19.440 | Because until you put it out there,
01:39:23.960 | you can't be it if you can't see it, right?
01:39:26.160 | You have to actually visualize what that is,
01:39:28.760 | and putting in, not the negative,
01:39:30.680 | but what you really wanna see aspirationally in that letter
01:39:33.800 | now starts creating a roadmap to getting there,
01:39:38.360 | because at the very kind of bottom of the pyramid
01:39:41.160 | of what that roadmap is,
01:39:42.400 | is visualizing what that success looks like, right?
01:39:45.000 | So I was, in high school, I ran track,
01:39:50.120 | and I started off by doing the 100,
01:39:53.760 | very kind of an individual sport.
01:39:56.000 | And then eventually, as I went forward,
01:39:58.760 | I started running the four-by-100, which is a relay race.
01:40:01.360 | And what I learned from my coach, Coach Ted Tillian,
01:40:04.260 | was that the four-by-100,
01:40:06.880 | it's very important that all four runners
01:40:08.600 | run very, very fast, obviously.
01:40:10.760 | But where that race is won or lost is in the transition zone,
01:40:14.920 | is in the passing of the baton.
01:40:17.360 | And so when you write a letter to your future self,
01:40:20.640 | yes, you're connecting to your future,
01:40:22.280 | but what it's really also helping you do
01:40:25.600 | is realize that life is not a 100-yard dash.
01:40:29.040 | It's actually a relay,
01:40:30.400 | and you're carrying a baton that was handed to you
01:40:32.420 | that you are now gonna hand off.
01:40:34.000 | And I'm arguing that we, right now,
01:40:36.520 | are in what I call this intertidal moment
01:40:38.900 | between kind of what was and what will be
01:40:40.680 | as a planetary civilization.
01:40:43.560 | We are in this transition zone,
01:40:45.080 | and what we do or do not do in this intertidal,
01:40:47.720 | in this transition zone, with the baton,
01:40:50.300 | that is homo sapien, planetary, flourishing culture,
01:40:53.580 | is gonna matter much more than we think it does
01:40:58.800 | in the current moment of social media pings.
01:41:01.320 | So that's touching on the individual.
01:41:03.480 | Let's go up to that collective.
01:41:05.040 | We have to decide, as individuals,
01:41:10.980 | which some of these protocols will help you do,
01:41:12.380 | but we have to decide, as a society,
01:41:14.600 | that we wanna actually tackle the question of to what end,
01:41:19.340 | because in the erasure of God,
01:41:21.260 | in the erasure of the afterlife,
01:41:23.060 | in what was given to us by religion
01:41:25.520 | for hundreds, thousands of years,
01:41:27.700 | some sort of guarantee that we would go on to heaven or hell,
01:41:31.100 | now that that is no longer there for a lot of people,
01:41:33.340 | for some it still is,
01:41:34.700 | and it still helps them make better decisions,
01:41:36.620 | I would argue, in the day-to-day,
01:41:38.180 | but for those who no longer have that,
01:41:40.880 | we have to decide that,
01:41:42.780 | and this can be from an egoic level,
01:41:44.680 | that the decisions that we make or do not make
01:41:47.220 | are either gonna set up future generations
01:41:50.260 | in a great way or not.
01:41:53.380 | We can be in those three categories.
01:41:54.900 | We can be one or two.
01:41:55.760 | It doesn't matter.
01:41:56.600 | Who cares?
01:41:57.420 | I'm just gonna YOLO.
01:41:59.140 | Or we can say we wanna be part of a much larger project.
01:42:02.840 | I talk about this a lot.
01:42:05.260 | You can tell my bias here.
01:42:08.700 | I don't say human.
01:42:09.540 | I don't say being a project.
01:42:10.520 | I think, like I said,
01:42:12.400 | we're kind of at the bottom or the top of the third.
01:42:14.340 | We have at least several hundred thousand more years to go.
01:42:17.080 | I am not as focused as to whether or not we leave Earth
01:42:20.560 | and we go to Mars
01:42:21.400 | and we become an interstellar species.
01:42:22.840 | I'm more focused on who we are,
01:42:26.440 | 'cause I've met, like you,
01:42:28.580 | I've met great hearts and minds,
01:42:31.140 | and I think that as a society,
01:42:34.600 | if we take care of everyone's basic needs,
01:42:36.900 | if we look at kind of the best of humanity,
01:42:39.040 | the best of the humans that we've met,
01:42:40.800 | we can all rise to that level.
01:42:42.480 | So instead of there being like a hundred great heroes
01:42:45.680 | in the world who are just so heartfelt,
01:42:47.760 | you know, like the Dalai Lama or Mother Teresa
01:42:49.760 | or even Einstein,
01:42:51.120 | that that could actually be--
01:42:52.760 | - Are those three still in touch
01:42:54.000 | or have they been canceled yet?
01:42:55.080 | - No, they're still with us.
01:42:56.440 | They're still with me.
01:42:57.680 | But look, even when you get into their,
01:42:58.880 | look, you asked one of the ways,
01:43:00.320 | how do you build transgenerational empathy with the past?
01:43:03.300 | Read people's biographies, especially autobiographies,
01:43:05.600 | and you see they had it really tough,
01:43:07.480 | and they're not as perfect and as saintly
01:43:08.920 | as we think they are.
01:43:09.840 | - And those, right, and the autobiographies
01:43:11.840 | are, of course, through their own lens, right?
01:43:13.040 | - Through their own lens.
01:43:13.880 | And so the biographies give you,
01:43:14.700 | or you read their letters to their lovers
01:43:16.320 | or to their partners, and you're like,
01:43:17.160 | "God, that person was kind of an asshole," right?
01:43:19.640 | But at the end of the day,
01:43:22.440 | if we as a society want to find ourselves
01:43:26.000 | where more of us than less of us
01:43:28.560 | are at this heightened sense of kind of intellectual
01:43:31.420 | and spiritual and emotional activation,
01:43:35.480 | that's not gonna happen overnight.
01:43:37.480 | But if we say that's the goal that we want,
01:43:39.800 | we want to see, people will argue,
01:43:43.680 | nine billion, seven billion, three billion,
01:43:45.760 | whatever the population of Homo sapiens is on planet Earth
01:43:48.280 | over the next several centuries or millennia,
01:43:52.280 | if we want to see them flourishing in a way
01:43:54.000 | that's beyond what science fiction has ever even showed us,
01:43:57.840 | if we make that decision that your life,
01:44:00.040 | what Andy, what Andrew Huberman is doing in his work,
01:44:02.440 | what Ari Wallach is doing, is contributing to that,
01:44:05.300 | that gives you a sense of purpose
01:44:07.640 | that I think religion used to give us
01:44:09.800 | that we are now sorely lacking in a social media world
01:44:14.800 | of instant buying of crap that we don't need on the internet.
01:44:18.800 | - Yeah, or that we do need,
01:44:20.200 | and it's just a shorter timescale reward thing.
01:44:23.240 | Like, I don't believe that everything
01:44:25.040 | that happens on social media or that we buy
01:44:26.880 | or the pleasure that we get in our lifespan or day is bad.
01:44:31.040 | I don't think, you know, I'm a capitalist too.
01:44:33.240 | What I think is that it's just one,
01:44:36.420 | it is but one time window of kind of operations.
01:44:39.580 | I just think it's good to have flexibility, right?
01:44:42.300 | It's sort of like in nutrition,
01:44:43.580 | they talk about metabolic flexibility.
01:44:46.180 | - It's not about balance, it's about harmony.
01:44:47.980 | How are we in harmony with the future?
01:44:50.420 | That is what I'm advocating for.
01:44:52.340 | - So I love it, and I also know that a lot of people love it
01:44:56.900 | even if they don't know they love it,
01:44:59.380 | meaning they perhaps haven't heard it framed
01:45:02.460 | the way that you describe it in your book,
01:45:04.560 | on your show, and today.
01:45:05.840 | But I think a lot of people just are hoping
01:45:09.960 | that these super high achievers, right?
01:45:11.940 | The Steve Jobses, the Elons, the,
01:45:15.160 | I don't know how people feel about politicians nowadays,
01:45:19.840 | but, you know, but the people building technologies
01:45:22.680 | who seem to really care about the future.
01:45:24.320 | I mean, say what you want about Elon,
01:45:25.700 | but the guy is building stuff for the now
01:45:27.400 | and for the future.
01:45:28.680 | I mean, he's doing it.
01:45:32.120 | That they will take care of it for next generations, right?
01:45:36.460 | Just like there were those, the Edisons and the Einsteins
01:45:40.600 | and the, you know, you have to be careful
01:45:44.400 | with names these days 'cause almost everyone
01:45:46.160 | has something associated with them
01:45:47.380 | where you're going to trigger someone,
01:45:48.920 | but I'll just be, you know, relaxed about it
01:45:52.400 | and say like, I would even say like,
01:45:54.560 | you know, even like a Jane Goodall,
01:45:55.760 | like the appreciation of our relationship with animals
01:45:58.080 | and what they have to contribute
01:45:59.200 | to our own understanding of ourselves and our planet,
01:46:02.220 | that kind of thing.
01:46:03.060 | So, you know, those people ushered in the life
01:46:07.800 | that I've had and I feel pretty great about that.
01:46:12.800 | So many people are probably saying,
01:46:15.800 | okay, makes sense for my family,
01:46:18.240 | but, you know, what do I have to contribute?
01:46:21.760 | And you give the example of the fact
01:46:24.260 | that children are always observing,
01:46:26.320 | they carry forward the patterns and the traits
01:46:29.400 | and certainly the responses that they observe
01:46:33.160 | in their parents, what's okay, what's not okay.
01:46:35.320 | You know, starting in the '80s and in the '90s
01:46:37.400 | in this country, there were many more divorces
01:46:39.720 | and fractured homes than there were previously.
01:46:43.080 | As a consequence, there's also been a fracturing
01:46:47.120 | of the kind of collective celebration of holidays.
01:46:51.160 | Like the things that have anchored us through time
01:46:53.760 | are happening less frequently now.
01:46:56.000 | Many of these have become commercialized,
01:46:58.200 | but that was always the case.
01:47:00.380 | You know, people were getting Christmas presents
01:47:01.760 | one way or another.
01:47:02.840 | So, you know, do you think that the kind of fracturing
01:47:07.840 | of the family unit has contributed
01:47:10.240 | to some of this lack of, let's just call it,
01:47:13.440 | longer path thinking and decision-making?
01:47:17.140 | - Look, I think it's the fracturing of the institutions
01:47:22.060 | that have been with us for the past several hundred years
01:47:24.280 | that is leading to an exponential rise
01:47:28.360 | in short-term behavior.
01:47:29.680 | - Okay, so you mentioned religion.
01:47:31.520 | Maybe for a moment, we could just talk about universities.
01:47:34.080 | These days, in part because of the distrust of science
01:47:37.640 | and in part because of the distrust in government
01:47:39.840 | and in part because of the distrust in traditional media,
01:47:42.640 | there's more and more ideas being kicked around
01:47:47.200 | that, you know, formal education is not as valuable
01:47:52.280 | as it used to be, and people always cite the examples
01:47:54.720 | of the Mark Zuckerbergs and others
01:47:57.280 | who didn't finish college,
01:47:58.640 | but I would argue they got in and chose to leave.
01:48:00.860 | They took leave of absence.
01:48:02.120 | They didn't drop out, and they are rare individuals.
01:48:05.880 | Ryan Holiday said it best, I think.
01:48:07.480 | If you are struggling in college,
01:48:09.080 | you're absolutely the kind of person
01:48:10.320 | that needs to stay in college, with rare exception,
01:48:13.680 | unless there's like a mental health issue
01:48:14.920 | or something, a physical health issue
01:48:16.120 | that needs to be tended to because nowhere else in life,
01:48:18.880 | except perhaps the military,
01:48:20.080 | is there such a clear designated set of steps
01:48:22.640 | that can take you from, you know, point A to point B
01:48:26.840 | with a credential that you can leverage
01:48:28.760 | in the real world for builds,
01:48:31.200 | and I completely agree with that,
01:48:32.840 | but I would also argue that academic institutions
01:48:37.840 | and financial institutions have changed,
01:48:41.180 | political institutions have changed,
01:48:42.720 | and there's a deep distrust,
01:48:44.740 | so we are having a harder time relying on them
01:48:47.740 | to make good decisions.
01:48:49.200 | You saw a lot of presidents of major universities
01:48:51.760 | fired recently, including Stanford.
01:48:55.060 | There, I said it, it happened,
01:48:56.600 | but also Harvard and other places for different reasons,
01:49:00.520 | and fired might not be the correct term.
01:49:03.500 | They decided to resign, whatever it was.
01:49:05.320 | They're no longer there.
01:49:06.160 | They have new ones in, and so there's a lot of distrust,
01:49:09.940 | so what can we rely on?
01:49:13.640 | Like if it's not, if people are having less faith
01:49:16.080 | in religion, less faith in academic institutions,
01:49:18.480 | less faith in, like, what do we got?
01:49:21.240 | - We got really good in academia,
01:49:23.040 | at least on the social sciences side,
01:49:24.400 | at saying what was wrong with the systems,
01:49:26.440 | but not at saying what we wanted the systems to be,
01:49:29.160 | because going back several hundred years ago,
01:49:32.680 | coming, you know, through the Enlightenment,
01:49:33.880 | especially, well, Renaissance into the Enlightenment,
01:49:35.720 | the Enlightenment gave us back
01:49:37.760 | this idea of a new meta-narrative
01:49:39.560 | based on rationality and logos,
01:49:42.660 | and the ability to kind of understand the world
01:49:45.360 | by breaking it down into its component parts.
01:49:47.320 | That's science.
01:49:48.400 | Fast forward several hundred years,
01:49:51.440 | and we're at the point now where we're really good
01:49:54.640 | at saying what doesn't work, but very, very bad
01:49:57.720 | about saying what does work and what we do want,
01:50:00.280 | because by saying what we do want means
01:50:02.360 | that we have to put forth some sort of meta-narrative,
01:50:04.880 | some thread, some official future
01:50:07.360 | that we can hang ourselves on.
01:50:09.080 | - And it tells us a lot about,
01:50:10.600 | it's sort of like a declaration of values,
01:50:12.360 | which is scary for a lot of people,
01:50:16.760 | 'cause it's one thing to say that doesn't work,
01:50:18.360 | that's no good, that's no good.
01:50:19.600 | It's easy to be a critic.
01:50:21.280 | What you're describing has incredible parallels to health.
01:50:24.840 | Like, you know, when I started the podcast,
01:50:26.360 | and even before when I was posting on social media,
01:50:28.240 | it was during the lockdowns,
01:50:30.520 | and it was like all this fear about everything,
01:50:32.240 | and I said, "Listen, like, I can't solve this larger issue
01:50:35.160 | "related to what may or may not be going on,
01:50:38.740 | "but what's obvious?
01:50:40.280 | "People are stressed, stress is bad when it's chronic.
01:50:43.300 | "People aren't sleeping, that's bad,
01:50:45.400 | "especially when it's chronic,
01:50:46.960 | "and I've got some potential solutions,
01:50:49.660 | "some tools, some zero-cost tools."
01:50:51.100 | So a lot of the backbone of the "Huberman Lab" podcast
01:50:53.920 | is about the things you do
01:50:56.380 | more so than the things you don't do.
01:50:58.520 | So what you're describing is essentially a field
01:51:00.040 | that consists of, like, breaking things down,
01:51:02.160 | but isn't offering solutions.
01:51:03.520 | So it sounds very similar,
01:51:04.600 | and I think that people love potential solutions.
01:51:07.620 | Even if one acknowledges,
01:51:08.980 | "Look, this might not solve every sleep issue,"
01:51:11.860 | it very well could make, you know,
01:51:14.280 | positive ground towards some of it,
01:51:15.880 | or make it 50% better, or 20% better,
01:51:17.960 | in some cases, 100% better.
01:51:19.800 | And of course, there are those for whom the tools don't work
01:51:22.760 | and they need to go to more extreme measures.
01:51:25.080 | But I hear you saying that religion provided the solutions,
01:51:30.000 | not just pointing to problems.
01:51:32.480 | People are not looking at that as much anymore.
01:51:35.240 | The big institutions, like academic institutions,
01:51:39.040 | political institutions, let's face it,
01:51:41.480 | regardless of where one sits on one side of the aisle,
01:51:44.260 | or the other, they're constantly fighting.
01:51:46.040 | It's like 12-hour news cycle designed to just point fingers
01:51:48.480 | so that nobody actually has to say
01:51:49.840 | what they really believe in a clear, tangible way.
01:51:53.280 | There are those that do that a bit more than others,
01:51:55.000 | but it's a mess.
01:51:56.880 | And then in terms of the family unit,
01:51:58.720 | this is what I was alluding to before,
01:52:00.120 | I feel like family units and values and structures
01:52:03.160 | are becoming more rare,
01:52:04.740 | at least in the traditional view of the family.
01:52:07.040 | - Let's remember- - Two parents, kids, et cetera,
01:52:10.440 | which is by no means a requirement
01:52:11.880 | to call something a family.
01:52:12.880 | - But remember- - So are you saying
01:52:15.100 | that we all have to look at this as,
01:52:17.180 | it obviously starts with the individual,
01:52:18.740 | but that part of the work of being a human being
01:52:22.460 | now and going forward is to learn this futures approach?
01:52:27.460 | - We have to be future conscious.
01:52:29.260 | But again, this goes back
01:52:30.340 | to the transgenerational component.
01:52:31.500 | We have to critically assess where we came from
01:52:34.860 | and why we're at this point.
01:52:36.020 | So let's talk about the nuclear family.
01:52:37.980 | The idea that your children would be "sleep trained"
01:52:42.620 | and put into another room is relatively new.
01:52:44.860 | That's from the Victorian era,
01:52:46.300 | where you would put your kids in another room.
01:52:49.140 | Because if you go back to most indigenous cultures,
01:52:52.020 | everyone slept together.
01:52:53.420 | And this happened for thousands of years,
01:52:55.060 | and the kids did- - In a big pile?
01:52:56.540 | - Yeah, or in one big room or in a long house.
01:52:59.780 | - Like piglets?
01:53:00.620 | - Huh? - Like piglets.
01:53:01.440 | - I don't know if they were like piglets,
01:53:02.280 | but they definitely all slept together.
01:53:04.220 | And look, everyone can...
01:53:07.780 | Look, I'm gonna say this in a nonjudgmental way,
01:53:10.100 | but it's gonna sound very judgmental.
01:53:11.740 | I walk down the street sometimes
01:53:13.060 | and I see kids in strollers
01:53:16.100 | being pushed by a seemingly healthy adult, right?
01:53:19.860 | The kid is detached and they're in this kind of this buggy,
01:53:22.540 | which comes from 17th, 18th century England.
01:53:25.560 | But if you look at most cultures around the world
01:53:27.780 | for thousands of years,
01:53:28.700 | what they did was they wore their babies
01:53:31.700 | for what we call the fourth trimester.
01:53:33.900 | Usually the mother,
01:53:35.100 | so a bunch of patriarchal reasons for that.
01:53:37.020 | But they literally would have a wrap on
01:53:38.420 | and the baby would be wrapped and be held very close to them.
01:53:40.980 | - This is the baby bjorn thing?
01:53:42.740 | - No, the baby bjorn, you put the baby in front of you,
01:53:44.340 | but it's facing out.
01:53:45.500 | When you really wrap them with like a 20 yard wrap,
01:53:48.300 | it's skin to skin, right?
01:53:50.020 | And look, and there's a reason,
01:53:52.700 | like everything, there's a reason for everything.
01:53:55.440 | For a human baby to come out of the mother
01:54:00.260 | as cognitively, intellectually, and physically ready
01:54:03.260 | as a baby chimpanzee would take 18 months of gestation,
01:54:07.100 | but we only do nine.
01:54:08.260 | You know why, right?
01:54:10.260 | Well, we do it because our brains got so big
01:54:14.020 | 'cause of all that protein,
01:54:14.980 | because Ari and Andy were hunting together
01:54:16.500 | using our prospection earlier on this story,
01:54:18.940 | that the baby has to come out at nine months
01:54:21.940 | because when we went from walking on all fours
01:54:23.740 | to being bipedal, the female pelvis closes
01:54:26.940 | and there's only so much room for that baby to come out,
01:54:28.740 | so they come out early.
01:54:29.940 | - Yeah, if the brain had completed development internally,
01:54:32.620 | you'd have only stillborn.
01:54:33.900 | I mean, presumably there was a branch
01:54:35.300 | of an earlier version of our species
01:54:38.540 | where many mothers and babies died in childbirth
01:54:43.340 | because of this, they were deselected,
01:54:45.460 | but that's not the proper term.
01:54:46.300 | - And so we found the optimal balance
01:54:48.020 | of nine months, roughly, right?
01:54:49.620 | But what that means is the baby has to be attached
01:54:51.660 | and close to the mother because it's totally helpless.
01:54:55.060 | The point is that so much of what we do,
01:54:58.300 | we don't critically examine.
01:54:59.960 | So you're talking about the breakdown
01:55:01.580 | of the family structure.
01:55:02.420 | I would argue that breakdown isn't happening now.
01:55:04.740 | That breakdown happened when we started to move
01:55:07.140 | from tribes and clans of raising children
01:55:10.960 | and move into a Victorian-era mindset
01:55:13.300 | where we take the grandparent, you know.
01:55:15.380 | There's very few species on planet Earth
01:55:17.820 | that after the female goes through menopause,
01:55:20.900 | they still live.
01:55:21.940 | Basically elephants, whales, and humans, right?
01:55:25.240 | 'Cause those are the species where you need others,
01:55:27.900 | elders, to help care for the young
01:55:29.780 | because of the aforementioned early birthing.
01:55:33.140 | - But maybe it's also the propagation of story,
01:55:36.620 | as you said earlier, that can inform better decisions.
01:55:39.980 | - So we need new stories.
01:55:41.660 | - Wisdom is like spoken cave paintings, basically.
01:55:44.620 | - Yeah, and so we need, so those stories
01:55:46.500 | about what does it mean to have a proper family structure,
01:55:50.580 | whether it's a nuclear family of four or five
01:55:52.380 | or 20 of aunts and uncles and around.
01:55:55.140 | Look, we did pretty well
01:55:56.380 | for the first couple hundred thousand years,
01:55:58.100 | and then there was all these things
01:55:59.340 | that religion disrupted, right?
01:56:00.900 | Taking the children away from the mom.
01:56:02.260 | These all come from puritanical beliefs.
01:56:04.340 | Now we're at this point, in this intertidal moment,
01:56:06.540 | where we have to critically examine
01:56:08.580 | why is it we do what we do?
01:56:10.940 | What are the things that we wanna keep?
01:56:14.100 | And what are the things we wanna let go of,
01:56:15.420 | and how do we move forward?
01:56:16.820 | And your question was, well, why do they wanna do that?
01:56:20.500 | What's the incentive structure?
01:56:23.180 | And I'm arguing that the incentive structure
01:56:25.100 | for us to do that is that we actually care
01:56:29.120 | about where we take our species,
01:56:30.900 | where we move forward in the universe,
01:56:35.220 | given the fact that so much had to go right
01:56:38.420 | to get us to this point.
01:56:39.660 | Right, I'm often asked this question.
01:56:42.120 | God, how did we get so messed up,
01:56:45.980 | and what is it gonna look like?
01:56:47.540 | - Wait, are we so messed up because you said
01:56:49.700 | we're about a third of the way through,
01:56:50.980 | or things are better than ever?
01:56:52.460 | - Yeah, so I get the question.
01:56:54.660 | Like, how is it that we messed up?
01:56:55.500 | And I always say, we didn't mess up.
01:56:56.940 | We're actually doing much better.
01:56:58.980 | Look, I walk into my daughter's room,
01:57:00.660 | and I look at their bookshelf, 15-year-old twin daughters,
01:57:04.180 | and every piece of fiction that takes place
01:57:06.580 | somewhat in the future is dystopian.
01:57:09.140 | All the futures they know are the "Hunger Games,"
01:57:12.340 | are the "100," are the "Maze Runner,"
01:57:15.620 | a world that has gone bad.
01:57:17.260 | I understand, we talked about this earlier,
01:57:20.980 | there's a negativity bias.
01:57:22.060 | People are gonna be attracted to reading about those things.
01:57:24.260 | - Kids read that stuff now?
01:57:25.380 | - Oh my, those are the bestsellers.
01:57:26.900 | The bestsellers are all these dystopian,
01:57:28.980 | there's always a love interest in a teenage thing,
01:57:30.740 | but it's always, the backdrop is always dystopia.
01:57:34.700 | And we're attracted to that in the same way
01:57:36.580 | we're attracted to a dumpster fire,
01:57:38.380 | because we wanna see the things,
01:57:40.420 | dystopias can act as a-- - So sad.
01:57:42.340 | - Dystopian stories can act as an early warning system.
01:57:45.380 | If you keep doing this one thing that you're doing,
01:57:47.940 | and extrapolate out a few decades, it'll look like this.
01:57:50.980 | What we're missing, and you just,
01:57:52.940 | you hit the nail on the head,
01:57:54.100 | are the stories about what if we get it right,
01:57:56.860 | what we call protopia.
01:57:58.460 | So utopia is this perfect world
01:58:01.260 | that always collapses on itself.
01:58:02.660 | It's really dystopian in size.
01:58:03.740 | Dystopia we talked about is a terrible, terrible world.
01:58:06.500 | A protopia, this idea put forth by Kevin Kelly,
01:58:09.620 | is a better tomorrow, not perfect,
01:58:11.740 | but one where we're making progress.
01:58:13.620 | So it's unbelievably important,
01:58:15.420 | and this is how I'm answering your question
01:58:16.940 | from a few minutes ago,
01:58:18.300 | that we start setting stories in protopias,
01:58:21.140 | in better tomorrows.
01:58:22.900 | In tomorrows where not everything is perfect,
01:58:25.260 | but where we have made significant progress.
01:58:27.340 | It won't be perfect, there'll still be divorces,
01:58:29.220 | and maybe murders, and mayhem.
01:58:31.100 | But if we start backdropping our future visions
01:58:34.820 | in worlds that are better than they are today,
01:58:37.780 | I would argue that will be the stories
01:58:40.500 | that start acting as a kedge
01:58:42.300 | to help pull us through this narrow moment
01:58:44.500 | of flux and chaos that is this intertidal.
01:58:47.780 | - How do we do it at scale?
01:58:49.440 | Because I think a lot of people listening to this
01:58:52.260 | will say, "Okay, that all sounds great."
01:58:53.900 | Like I, for one, say, you know,
01:58:56.340 | that the shift from the notion of building a better future
01:59:00.540 | through self-sacrifice,
01:59:03.260 | to making it almost like a pro-self-and-others
01:59:08.020 | endeavor, the way you've described it.
01:59:09.300 | Empathy for self, empathy for others,
01:59:11.660 | getting some control over the, you know,
01:59:15.520 | contraction and dilation of your time window,
01:59:17.980 | making sure that, you know,
01:59:18.860 | you take good care of yourself,
01:59:20.520 | but you take care of the future generations as well.
01:59:23.460 | Like for that empty frame, the now empty frame.
01:59:26.780 | And then moving from dystopia to protopia,
01:59:29.440 | that all sounds great.
01:59:31.020 | But I think a lot of people might think,
01:59:33.140 | "Okay, well, at best I could do that for myself
01:59:36.780 | and the people that I know.
01:59:39.540 | It's gonna be hard to do that as a greater good
01:59:42.580 | for the greater good."
01:59:43.580 | And you could say,
01:59:44.420 | "Well, that does contribute to the greater good."
01:59:45.700 | This is actually very similar
01:59:46.900 | to what we tell graduate students
01:59:48.820 | when they get their first round of data.
01:59:52.020 | You go, "Okay, well, the data oftentimes,
01:59:54.940 | not always, but oftentimes you say,
01:59:56.980 | "Oh, well, the data are cool.
01:59:58.100 | Like if it continues this way,
01:59:59.900 | that'd be an interesting story."
02:00:01.460 | And they get the sense and you already have the sense
02:00:04.040 | 'cause you have the experience to know, like,
02:00:06.260 | the best case scenario is a nice solid paper
02:00:09.340 | that your three reviewers
02:00:13.620 | and maybe 20 other people will read.
02:00:16.580 | And you're gonna spend the next five years of your life
02:00:18.640 | on this thing.
02:00:19.540 | Maybe three, but probably five years of your life.
02:00:21.500 | And you'll get your PhD.
02:00:22.900 | And there's always this question,
02:00:23.860 | like, do you ditch that project and go for something else?
02:00:26.780 | Or do you stay with that project?
02:00:27.860 | In other words, what you're saying is
02:00:28.900 | you get to put your brick on the wall, but it's a brick.
02:00:32.420 | Whereas, you know, there are other projects
02:00:35.220 | and you go, "Whoa, like that's, you know,
02:00:37.460 | that's like one wing of the cathedral."
02:00:39.540 | And it's a rare instance where that happens.
02:00:42.620 | And a lot of it's luck
02:00:43.540 | and it doesn't always work out anyway.
02:00:46.500 | But, you know, what we're saying here is, you know,
02:00:49.860 | how hard people are willing to work
02:00:52.100 | is often related to what they feel
02:00:53.700 | the potential payoff will be.
02:00:55.340 | If they can sense the payoff.
02:00:56.580 | And by the way, I love the protocols that you offer,
02:00:58.540 | the empty frame, the journaling to future self,
02:01:01.860 | this notion of time capsuling,
02:01:03.180 | your present thinking into the future, the aging of self.
02:01:07.300 | These are very actionable things.
02:01:09.180 | I plan to do them and I think they're very valuable.
02:01:12.740 | But if I understand correctly,
02:01:14.660 | you are interested in creating a movement of sorts
02:01:18.620 | where many, if not everybody, is thinking this way.
02:01:22.540 | Because the other model is,
02:01:24.500 | "Okay, well, the Elons will take care of it for us."
02:01:27.180 | Or the system is so broken, like,
02:01:31.620 | "There's nothing I can do.
02:01:32.460 | I'm just trying to make ends meet."
02:01:34.300 | So how does one create like a reward system
02:01:39.300 | or a social media platform or, you know,
02:01:43.860 | how does one, you know, join up with other people
02:01:46.820 | who are trying to do this?
02:01:48.680 | - So the question you're getting at is,
02:01:51.440 | in a lot of the work that I've read
02:01:58.480 | and listened to on this podcast,
02:01:59.960 | oftentimes it's about how do we, you know,
02:02:02.480 | obviously, how do we optimize the self?
02:02:05.480 | And I mean that in a good way, not in a selfish way.
02:02:08.280 | How do we make ourselves better, right?
02:02:10.280 | That's where you have to start.
02:02:12.240 | I'm advocating for how do we optimize society?
02:02:16.760 | How do we optimize civilization?
02:02:18.640 | And this is a clear case where,
02:02:21.480 | unlike when we think of scale being, you know,
02:02:24.460 | make more widgets at a cheaper price,
02:02:26.880 | this is really a one plus one plus one plus one at infinity.
02:02:30.320 | So it's ad infinitum.
02:02:31.680 | If we think about, just for example,
02:02:35.000 | how many listeners and viewers there are
02:02:37.480 | of this podcast, millions, right?
02:02:40.400 | And how many people they interact with
02:02:42.700 | within their closest sphere, and you go out, right?
02:02:46.520 | So right now, your listeners have the potential
02:02:49.600 | to live and act long pathian in this way
02:02:52.840 | where they're doing something for a greater,
02:02:54.720 | they're thinking about their purpose in the world
02:02:57.080 | as nested within the larger purpose of our species
02:03:01.280 | to allow for more mass flourishing in the future
02:03:05.300 | for generations to come.
02:03:07.520 | If you think about your listeners and how they interact
02:03:09.920 | and how they model behavior in their spheres,
02:03:13.860 | you're at 30, 40, 50 million people, right?
02:03:16.440 | That's a very, very large number.
02:03:18.540 | And what we know about social and emotional contagion
02:03:24.760 | is that these things are contagious.
02:03:26.640 | They are memes.
02:03:27.480 | This is Susan Blackmore's work.
02:03:28.840 | That's how it scales.
02:03:31.840 | It actually is one of those things
02:03:33.200 | where you're not going to, you know, just add powder
02:03:37.520 | and all of a sudden it will create this optimal future
02:03:43.180 | for everyone just because one person does it.
02:03:45.040 | We all have a role to play in it.
02:03:46.480 | It's like literally, what I would want
02:03:48.520 | is anyone who's listening or watching this
02:03:51.160 | when they're done doing it to take a few minutes
02:03:54.040 | and think about what kind of futures do I want
02:03:56.800 | for myself, for my family, for the generations to come?
02:03:59.840 | And what is my role in that great play?
02:04:02.280 | What do I have to do?
02:04:03.240 | And yes, you need the protocols
02:04:04.440 | to kind of bring you back into there, right?
02:04:06.000 | For me, it's easy 'cause I wrote the book.
02:04:07.720 | I did the show.
02:04:08.540 | I can just think long path.
02:04:09.380 | I can do it.
02:04:10.220 | For others, this is gonna be the first time
02:04:11.160 | they're thinking about this
02:04:12.240 | or maybe they've been thinking about it for years.
02:04:14.760 | Even in their smallest interactions, they start doing it.
02:04:18.040 | And we, you know, this gets into kind of
02:04:19.680 | the Santa Fe Institute and complexity theory.
02:04:21.760 | This stuff starts to actually reverberate.
02:04:24.600 | That's how we do it.
02:04:25.760 | You know, we don't need a march for long-termism, right?
02:04:29.400 | We don't need bumper stickers.
02:04:31.740 | - Thank you.
02:04:32.580 | There will be no bumper stickers on my 4Runner.
02:04:33.420 | - There will be no bumper stickers.
02:04:34.900 | There will be no bumper stickers.
02:04:36.560 | It's about placing our very essence
02:04:40.280 | and our actions within the realm of possibility
02:04:44.600 | for the futures that we want
02:04:46.120 | and our role in that and then the purpose.
02:04:47.880 | So I don't care if you're a barista,
02:04:49.420 | if you're a surfing instructor,
02:04:51.000 | if you're a, you know, brilliant podcaster,
02:04:53.960 | whatever it is that you do,
02:04:56.440 | do it with the intention and recognition
02:04:59.040 | that you are modeling a way of being in the world
02:05:02.640 | that has ramifications and reverberations
02:05:04.620 | beyond this current moment.
02:05:06.160 | And you said earlier, well, you know,
02:05:07.960 | who knows if anyone will listen to your podcast.
02:05:09.680 | What I can tell you with certainty,
02:05:11.160 | 'cause I'm sure it's probably already happened,
02:05:13.040 | is a large language model, an LLM,
02:05:16.520 | something, you know, what we call AI right now,
02:05:19.200 | is already, or will at some point,
02:05:21.240 | ingest the Huberman Lab podcast.
02:05:23.200 | - Yeah, we have one.
02:05:24.040 | We have a Huberman Lab AI.
02:05:25.040 | - There you go.
02:05:25.860 | - We haven't advertised it very heavily,
02:05:26.800 | but it's there.
02:05:27.640 | You can ask me questions.
02:05:28.960 | It's pretty good.
02:05:29.800 | It sounds a bit like me.
02:05:31.800 | The jokes are dry.
02:05:33.000 | - They're dry.
02:05:33.840 | - And not funny.
02:05:34.660 | - I was gonna say mostly funny,
02:05:35.640 | but I'll give you some more.
02:05:37.340 | But eventually that will percolate out.
02:05:40.960 | So at the speed things are going,
02:05:42.400 | three or four years from now,
02:05:43.720 | this very conversation, how we're modeled,
02:05:46.760 | what I learned in school, discourse ethics,
02:05:48.800 | how we talk to one another,
02:05:50.160 | that is teaching these machines
02:05:54.460 | how to think and act and who and what we are
02:05:56.760 | and how to become the best of or the worst of ourselves.
02:05:59.080 | What we put out there,
02:06:00.400 | the kind of public-facing content,
02:06:02.360 | is going to become what these machines think of
02:06:05.780 | as how they should be.
02:06:07.000 | And we're modeling it for them.
02:06:09.580 | And going back to the higher education example
02:06:11.640 | for a second, I think higher education,
02:06:13.860 | like many institutions, as AI, as we call it,
02:06:18.720 | fully comes online, is going to radically, radically change.
02:06:22.400 | And it will be a Cambridge or an Oxford tutor
02:06:26.760 | in everyone's ear, and this idea
02:06:29.120 | that you kind of come together
02:06:31.820 | to receive information will start to dissipate
02:06:34.440 | from higher education.
02:06:35.480 | But what higher education will start to do,
02:06:37.240 | and I think we'll need to focus on,
02:06:39.080 | is not just the intellectual and the cognitive,
02:06:42.260 | but also the psychological and the emotional core
02:06:45.640 | of who you are and helping you develop that.
02:06:47.980 | - Well, amen to that.
02:06:49.740 | You know, there was a guest on this podcast previously,
02:06:56.100 | Dr. Wendy Suzuki, who's a professor at NYU.
02:06:59.060 | I think now she's the Dean of Arts and Sciences,
02:07:02.300 | if that's the correct title.
02:07:03.900 | And, you know, she's trying to bring
02:07:05.620 | some of her laboratory's data on the value
02:07:07.980 | of even very brief meditations
02:07:09.460 | to stress management in college.
02:07:10.900 | First, to help students manage the stress
02:07:14.300 | that is college and being in your early 20s.
02:07:17.080 | But I think there's a larger theme there,
02:07:20.800 | which is to try and teach emotional development,
02:07:23.200 | to teach self-regulation,
02:07:25.380 | because many people don't get that.
02:07:27.520 | I mean, you know, or they get it,
02:07:29.980 | but then there are big gaps.
02:07:32.260 | And I love the way that you're describing this.
02:07:35.080 | Basically, it's a lens, if I may,
02:07:39.640 | it's a lens into human experience that's very dynamic,
02:07:43.560 | and is really in concert with the fact
02:07:47.440 | that the human brain has the capacity
02:07:49.720 | for this dynamic representation of time,
02:07:52.480 | like focus on, like solve for the now.
02:07:54.080 | There will be parts of your day, no doubt today,
02:07:56.160 | where you just have to solve for the now.
02:07:57.340 | You're not thinking about the greater good.
02:07:59.160 | And then the ability to dilate your consciousness
02:08:03.280 | in the temporal sense,
02:08:05.220 | and to solve for things that are more long-term,
02:08:07.440 | make these investments towards the future.
02:08:09.900 | I wonder though, you know,
02:08:12.680 | how can we incentivize people to be good, to do good?
02:08:16.600 | And how can we incentivize people to do this on a backdrop
02:08:20.720 | of a lot of short-term carrots and short-term horizons?
02:08:25.200 | I think you've given us some answers,
02:08:27.520 | and they're very powerful ones,
02:08:29.380 | such as the aging self-image exercise.
02:08:33.160 | the journaling into the future, writing to future self,
02:08:35.480 | the empty frame exercise,
02:08:40.480 | linking up with our ancestors,
02:08:42.920 | and thinking about where we're at now,
02:08:44.080 | and where we want to go.
02:08:45.740 | Is there anything else that you want to add?
02:08:49.600 | Meaning, is there anything that we should all be doing?
02:08:52.720 | Should we all be reading more biography?
02:08:54.920 | Should we, if I look back through history,
02:08:57.520 | it's both dark and light.
02:09:01.320 | Like, is there anything else that you really
02:09:03.940 | encourage people to do to be the best version of themselves
02:09:07.440 | for this life and the ones that come next?
02:09:11.840 | - I've touched on this.
02:09:13.180 | We need to examine in ourselves,
02:09:16.720 | why is it we do and are the way that we are, right?
02:09:21.440 | Do you know why in this country we vote on Tuesday?
02:09:24.200 | - I don't have any idea.
02:09:25.560 | - So most advanced democracies vote over the weekend
02:09:28.640 | or a couple of weekends.
02:09:30.320 | In America, we vote on Tuesday
02:09:32.720 | because that was the time that was necessary
02:09:37.720 | for someone to leave church on Sunday,
02:09:41.200 | ride on horseback into the big city,
02:09:43.540 | vote on Tuesday, and ride back
02:09:46.160 | before market day on Wednesday.
02:09:48.240 | - So glad you're going to tell me it's not
02:09:49.800 | because then people can still watch Monday Night Football.
02:09:53.000 | - No, this is long before Monday Night Football.
02:09:56.160 | And so I think why we vote on Tuesday,
02:09:59.800 | it's a metaphor for so much of who we are
02:10:03.280 | and have become as individuals and as a society.
02:10:05.680 | I'm a big fan of cognitive behavioral therapy, of CBT.
02:10:09.960 | I think partially because what it does is it has you
02:10:12.080 | look at what are those negative stories
02:10:13.960 | that we tell ourselves, but then,
02:10:15.660 | because you can't just say stop doing something,
02:10:17.440 | you can't just extinguish a behavior,
02:10:18.560 | you have to add and put in a positive story.
02:10:20.800 | What I've tried to do with some of our time here today
02:10:25.000 | and what I want people to partially take away,
02:10:28.400 | more than partially to really take away and bring in,
02:10:31.080 | is examine the why Tuesdays.
02:10:34.240 | What are those stories that you've inherited?
02:10:38.200 | Some of them are going to be macro-social,
02:10:40.840 | like you are defined by society by what you own,
02:10:44.760 | by the badge on your car that says how successful you are.
02:10:47.720 | That's a story, it's a story that's been fed to us.
02:10:50.440 | There are other stories that are very personal.
02:10:52.740 | These are stories that can sometimes be very private
02:10:55.400 | and go back generations within a family.
02:10:57.880 | And then to understand some of those stories serve us,
02:11:00.640 | some of those stories don't serve us.
02:11:03.200 | But after discerning that,
02:11:05.320 | we then have to write a new story.
02:11:07.780 | We have to write a new story for ourselves.
02:11:09.720 | Who am I?
02:11:10.640 | Why am I here?
02:11:12.480 | That isn't going to be answered by a religion or a God
02:11:16.600 | or a book or a podcast or a futurist.
02:11:19.060 | It's going to be answered
02:11:20.480 | by looking and searching inside of yourself
02:11:25.080 | about how it is you got here, what really matters,
02:11:29.480 | and where you want to contribute and help move us forward
02:11:33.240 | as a species on spaceship Earth,
02:11:37.300 | not as a passenger, but as crew on this vessel,
02:11:41.960 | and how we're going to move forward.
02:11:43.760 | So the stories have served us well
02:11:48.600 | and they have not served us well.
02:11:50.400 | And to move forward, it's okay now to say,
02:11:54.560 | "I'm going to write these stories that serve me.
02:11:57.240 | I'm going to see the future, not as a noun,
02:11:58.960 | not as this thing that I'm heading towards
02:12:01.800 | or that's going to tumble over me,
02:12:03.300 | but that I'm going to create."
02:12:04.600 | And those stories may be very intra-personal,
02:12:09.000 | they may be interpersonal, they may be political,
02:12:11.500 | they may be business, they may be what you buy,
02:12:13.520 | what you consume, but you have to have agency.
02:12:17.000 | You have to instill a sense of hope into your own life
02:12:19.880 | and a sense of awe and a sense of really just empathy
02:12:23.520 | for who you are and where we are
02:12:26.040 | if we want to collectively move forward into the futures
02:12:30.840 | that will allow our descendants to look back on us
02:12:33.000 | and say, "They were great ancestors."
02:12:36.320 | - I love it.
02:12:37.160 | And I also just want to highlight the importance
02:12:40.320 | of record-keeping, of putting things down on paper
02:12:45.320 | or maybe in electronic form,
02:12:48.680 | creating time capsules for the future generations.
02:12:51.720 | Because I think a lot of what people probably are thinking
02:12:55.920 | or worried about a little bit is like,
02:12:57.200 | "Okay, I can do all this stuff to try and make things better
02:12:59.440 | and even give up the desire for any kind of credit,"
02:13:02.440 | but not feeling like it will be of any significance.
02:13:07.080 | But what I've learned from you today is that,
02:13:09.920 | it starts with the self and then it radiates out
02:13:12.760 | to the people we know and that maybe we cohabitate with.
02:13:17.760 | But even if we don't cohabitate with anybody,
02:13:21.200 | it radiates out from us and that it is important
02:13:24.320 | to get a sort of time capsule going
02:13:26.640 | so that people can feel like they have some significance
02:13:31.560 | in the future that they may not ever have
02:13:34.520 | immediate experience of,
02:13:35.620 | but to really like send those ripples forward
02:13:37.640 | and get the sense that those ripples are moving forward.
02:13:40.360 | So for that reason, and especially given the nature
02:13:45.360 | of this podcast, for the reason that you gave
02:13:47.840 | these very concrete protocols, if you will,
02:13:50.960 | that we've highlighted in the timestamps, of course,
02:13:55.360 | as tools, as protocols, I really want to thank you
02:13:59.080 | because oftentimes discussions about past, present,
02:14:01.960 | and future can get a bit abstract and a bit vague
02:14:05.700 | for people and you've done us all a great service
02:14:08.520 | by making them very concrete and actionable.
02:14:11.160 | That's so much of what this podcast is about.
02:14:12.920 | It's one part information, one part option for action.
02:14:16.720 | We don't tell people what to do,
02:14:17.720 | but we give them the option for action.
02:14:19.220 | I'm certainly going to adopt some of these protocols.
02:14:22.320 | And also for taking the time to come to talk with us today,
02:14:26.320 | share your wisdom and share what you're doing in many ways.
02:14:30.360 | Well, it is not in many ways,
02:14:32.480 | it is absolutely part of what you're describing,
02:14:34.580 | which is putting your best self
02:14:38.580 | toward how things can be better now and in the future.
02:14:41.780 | It's also a great pleasure to sit down
02:14:43.960 | with somebody I've known for so many years
02:14:46.120 | and learn from you.
02:14:49.240 | So it's a real honor and a privilege.
02:14:51.680 | And I know everyone else listening to
02:14:54.360 | and watching this feels the same way.
02:14:55.720 | So thank you so much.
02:14:57.520 | - Thank you for having me.
02:14:59.040 | - Thank you for joining me for today's discussion
02:15:00.960 | with Ari Wallach.
02:15:02.200 | To find links to his book, to his television show,
02:15:04.800 | and other resources related to Longpath,
02:15:07.000 | please see the show note captions.
02:15:09.080 | If you're learning from and or enjoying this podcast,
02:15:11.440 | please subscribe to our YouTube channel.
02:15:13.360 | That's a terrific zero cost way to support us.
02:15:15.680 | In addition, please subscribe to the podcast
02:15:17.880 | on both Spotify and Apple.
02:15:19.500 | And on both Spotify and Apple,
02:15:20.920 | you can leave us up to a five-star review.
02:15:23.300 | Please check out the sponsors mentioned
02:15:24.840 | at the beginning and throughout today's episode.
02:15:27.000 | That's the best way to support this podcast.
02:15:29.640 | If you have questions for me or comments about the podcast
02:15:32.280 | or guests or topics that you'd like me to consider
02:15:34.140 | for the Huberman Lab podcast,
02:15:35.640 | please put those in the comment section on YouTube.
02:15:37.940 | I do read all the comments.
02:15:39.640 | For those of you that haven't heard,
02:15:40.780 | I have a new book coming out.
02:15:41.980 | It's my very first book.
02:15:43.600 | It's entitled "Protocols, An Operating Manual
02:15:46.120 | for the Human Body."
02:15:47.160 | This is a book that I've been working on
02:15:48.320 | for more than five years,
02:15:49.480 | and that's based on more than 30 years
02:15:51.800 | of research and experience.
02:15:53.360 | And it covers protocols for everything from sleep,
02:15:56.400 | to exercise, to stress control,
02:15:58.900 | protocols related to focus and motivation.
02:16:01.360 | And of course, I provide the scientific substantiation
02:16:04.720 | for the protocols that are included.
02:16:06.800 | The book is now available by presale at protocolsbook.com.
02:16:10.700 | There you can find links to various vendors.
02:16:13.080 | You can pick the one that you like best.
02:16:14.840 | Again, the book is called
02:16:15.800 | "Protocols, An Operating Manual for the Human Body."
02:16:19.180 | If you're not already following me on social media,
02:16:21.160 | I am Huberman Lab on all social media platforms.
02:16:24.140 | So that's Instagram, X, formerly known as Twitter,
02:16:26.720 | Threads, Facebook, and LinkedIn.
02:16:28.600 | And on all those platforms,
02:16:30.160 | I discuss science and science-related tools,
02:16:32.280 | some of which overlaps with the content
02:16:33.760 | of the Huberman Lab podcast,
02:16:35.240 | but much of which is distinct from the content
02:16:37.120 | on the Huberman Lab podcast.
02:16:38.320 | Again, that's Huberman Lab on all social media channels.
02:16:41.380 | If you haven't already subscribed
02:16:42.540 | to our Neural Network Newsletter,
02:16:44.340 | our Neural Network Newsletter
02:16:45.740 | is a zero-cost monthly newsletter
02:16:47.760 | that includes podcast summaries as well as protocols
02:16:50.420 | in the form of brief one-to-three-page PDFs.
02:16:53.260 | Those protocol PDFs are on things
02:16:55.540 | like neuroplasticity and learning,
02:16:57.420 | optimizing dopamine, improving your sleep,
02:17:00.000 | deliberate cold exposure, deliberate heat exposure.
02:17:02.180 | We have a foundational fitness protocol
02:17:04.260 | that describes a template routine
02:17:06.420 | that includes cardiovascular training
02:17:07.920 | and resistance training with sets and reps,
02:17:09.880 | all backed by science,
02:17:11.300 | and all of which, again, is completely zero cost.
02:17:13.740 | To subscribe, simply go to HubermanLab.com,
02:17:16.300 | go to the Menu tab up in the upper right corner,
02:17:18.940 | scroll down to Newsletter, and provide your email.
02:17:21.040 | And I should emphasize
02:17:21.880 | that we do not share your email with anybody.
02:17:24.380 | Thank you once again for joining me
02:17:25.720 | for today's discussion with Ari Wallach.
02:17:28.140 | And last, but certainly not least,
02:17:30.420 | thank you for your interest in science.
02:17:32.440 | [upbeat music]