
How Hormones & Status Shape Our Values & Decisions | Dr. Michael Platt


Chapters

0:00 Dr. Michael Platt
2:12 Humans, Old World Primates & Decision-Making; Swiss Army Knife Analogy
7:52 Sponsors: Our Place & Wealthfront
11:01 Attention Allocation, Resource Foraging
16:40 Social Media; Marginal Value Theorem, Distraction
22:22 Tool: Remove Phone from Room; Attention & Urgency
25:23 Tool: Self Conversation; Visual Input, Attention as a Skill
29:29 Warming-Up Focus, Tool: Visual Aperture & Attention
38:57 Sponsor: AG1
40:13 Control of Attention, Tool: Changing Environment
44:07 Attention Continuum, Professions, Measuring Business Skill with Neuroscience
53:06 Theory of Mind, Covert Attention, Attentional Spotlights
60:05 Primates, Hormone Status, Brain Size, Monogamy
69:31 Monkeys, Neuronal Multiplexing & Context; Equitable Relationships
80:05 Sponsor: BetterHelp
81:11 Relationships, Power Dynamics, Neuroethology
89:34 Humans, Females & Hormone Status; Monkeys, Social Images, Hormones
98:3 Humans, Attractiveness, Value-Based Decision Making
104:32 Altruism, Group Selection & Cooperation, Selflessness
109:08 Males, Testosterone, Behavior Changes
115:46 Sponsor: Function
117:34 Oxytocin, Pro-Social Behaviors, Behavioral Synchrony
128:13 MDMA, Oxytocin, Anxiety; Social Touch, Despair & Isolation
137:12 Isolation, Social Connections & Strangers, Tool: Deep Conversation Questions
141:17 Bridging the Divide, Tribes & Superficial Biases
146:58 Testosterone, Risk-Taking Behavior
150:52 Decision-Making, Tool: Accurate or Fast?
158:31 Decision-Making, Impact of Time & Fatigue
165:23 Advertising, Status, Celebrity, Monkeys
172:19 Hierarchy; Abundance & Scarcity, Money & Happiness, Loss Aversion
182:47 Meme Coins, Celebrity Endorsement, Social Sensitivity
192:22 Decisions & Urgency; Bounded & Ecological Rationality
198:09 Longevity Movement; Mortality & Motivation
204:48 Retirement?, Serial Pursuits & Pivoting
210:17 Apple or Samsung?, Brand Loyalty, Empathy
218:15 Political Affiliation, Empathy
226:22 Zero-Cost Support, YouTube, Spotify & Apple Follow & Reviews, Sponsors, YouTube Feedback, Protocols Book, Social Media, Neural Network Newsletter

Whisper Transcript

00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.240 | where we discuss science
00:00:03.660 | and science-based tools for everyday life.
00:00:05.900 | I'm Andrew Huberman,
00:00:10.280 | and I'm a professor of neurobiology and ophthalmology
00:00:13.500 | at Stanford School of Medicine.
00:00:15.480 | My guest today is Dr. Michael Platt.
00:00:18.020 | Dr. Michael Platt is a professor of neuroscience
00:00:20.220 | and psychology at the University of Pennsylvania.
00:00:22.960 | His laboratory focuses on decision-making,
00:00:25.500 | more specifically, how we make decisions,
00:00:27.760 | and the impact of power dynamics,
00:00:29.560 | such as hierarchies in a given organization or group,
00:00:32.440 | as well as hormones on decision-making.
00:00:35.120 | We also discuss valuation,
00:00:36.820 | that is how we place value on things, on people.
00:00:40.040 | And what you'll find is that there are many factors
00:00:42.820 | that impact whether or not we think something is good,
00:00:45.280 | very good, bad, or very bad,
00:00:47.520 | that operate below our conscious awareness.
00:00:50.400 | In fact, today's discussion will teach you
00:00:51.960 | how you make decisions,
00:00:53.360 | how to make better decisions in the context of everything
00:00:56.200 | from picking out a watch or a pair of shoes,
00:00:59.080 | all the way up to something as important
00:01:00.700 | as picking a life mate.
00:01:02.600 | Indeed, hormones, hierarchies,
00:01:04.920 | and specific things that are operating within you
00:01:07.200 | and adjacent to nearby the things that you're evaluating,
00:01:10.880 | whether or not those things are people or objects,
00:01:13.500 | are powerfully shaping the neural circuits
00:01:15.660 | that lead you to make specific decisions.
00:01:17.940 | So today you're going to learn how all of that works
00:01:20.360 | and, as I mentioned, how to make better decisions.
00:01:23.360 | Dr. Platt also explains how we are evaluating
00:01:25.720 | the hormone levels of other people,
00:01:27.800 | both same sex and opposite sex,
00:01:29.720 | and the implications that has for relationships of all kinds.
00:01:33.100 | It's an incredibly interesting and unique conversation,
00:01:36.180 | certainly unique among the conversations I've had
00:01:38.400 | with any of my neuroscience colleagues over the decades.
00:01:40.920 | And I know that the information you're going to learn today
00:01:43.240 | is going to be both fascinating to you,
00:01:45.140 | it certainly was to me,
00:01:46.600 | and that it will impact the way that you think about
00:01:49.440 | all decisions at every level in everyday life.
00:01:52.580 | Before we begin, I'd like to emphasize that this podcast
00:01:55.420 | is separate from my teaching and research roles at Stanford.
00:01:58.060 | It is, however, part of my desire and effort
00:02:00.200 | to bring zero cost to consumer information about science
00:02:02.820 | and science-related tools to the general public.
00:02:05.320 | In keeping with that theme,
00:02:06.600 | this episode does include sponsors.
00:02:09.160 | And now for my discussion with Dr. Michael Platt.
00:02:12.320 | Dr. Michael Platt, welcome.
00:02:14.880 | - Thanks, it's awesome to be here.
00:02:16.840 | - I've been following your work
00:02:17.840 | since I was a graduate student,
00:02:19.600 | and it's really interesting.
00:02:22.340 | You're an anthropologist by training,
00:02:24.860 | turned neuroscientist,
00:02:26.300 | turned practitioner of applied neuroscience
00:02:29.760 | in fields relevant to everybody,
00:02:32.600 | as it relates to business, decision-making,
00:02:35.840 | social interactions, hormones.
00:02:37.840 | You've worked on a lot of different things.
00:02:40.780 | The first question I have is,
00:02:43.900 | let's all agree, we're old world primates.
00:02:47.720 | - Yes.
00:02:48.760 | - Most people don't even think of us as old world primates,
00:02:51.560 | but we are all old world primates.
00:02:53.980 | And we share many similarities
00:02:55.480 | in terms of the neural circuits that we have in our skulls
00:02:59.200 | with some of the other old world primates,
00:03:01.120 | like macaque monkeys, for instance.
00:03:03.460 | When you step back and look at a process
00:03:06.600 | like decision-making or marketing out in the world,
00:03:10.280 | or how people interact with one another,
00:03:13.920 | gauge the value of objects, relationships,
00:03:18.440 | even their own value, if I may,
00:03:22.480 | how much of what you see in human old world primates
00:03:26.620 | do you think is reflected by the interactions
00:03:31.140 | of old world primates, like rhesus macaque monkeys,
00:03:33.580 | and vice versa?
00:03:34.740 | I mean, in other words, how primitive are we,
00:03:37.300 | and/or how sophisticated are the other old world primates?
00:03:40.660 | - That's a great way of putting it,
00:03:42.500 | 'cause I think it's both.
00:03:43.660 | I always like to say there's a little monkey in all of us.
00:03:46.380 | And I believe that going in,
00:03:49.260 | having spent, actually, my formative years studying,
00:03:53.400 | just watching monkeys.
00:03:54.500 | And I worked at the Cleveland Zoo when I was in college,
00:03:57.020 | and I took every opportunity I could get to go,
00:03:59.540 | I went to the field.
00:04:00.580 | I watched monkeys in South America and in Mexico.
00:04:03.580 | And I think we all get that.
00:04:07.020 | But over the course of my career,
00:04:09.260 | I'm astonished at how deep that goes.
00:04:12.580 | And basically, for every behavioral, cognitive,
00:04:16.760 | emotional phenomenon that we've trained our lens on,
00:04:20.580 | it looks almost exactly the same in people and monkeys.
00:04:26.260 | Now, obviously, we're not just monkeys,
00:04:28.060 | and we can talk, and we're doing this,
00:04:29.860 | and that's a big, big difference.
00:04:32.300 | But all the things that you talked about,
00:04:34.440 | decision-making, social interaction,
00:04:37.080 | the way that we explore the world,
00:04:41.580 | the fountain of creativity,
00:04:43.500 | not only the neural circuits,
00:04:45.980 | but the actual expression is so similar.
00:04:50.420 | We have monkeys and people
00:04:51.540 | do the exact same things in the lab.
00:04:53.900 | And if I didn't label the videos,
00:04:57.380 | the outputs of the avatars and whatnot in games,
00:04:59.820 | you couldn't tell the difference.
00:05:01.460 | - What's striking about what you just said is that,
00:05:05.740 | I recall, I guess at that time it was called a tweet,
00:05:09.780 | and I think it was from Elon,
00:05:11.660 | that said that we're basically a species
00:05:15.180 | that got a supercomputer placed on top of a monkey brain.
00:05:18.980 | So in thinking about it the other way,
00:05:21.540 | what aspects of being human,
00:05:25.460 | this old world primate that we are,
00:05:28.540 | think is distinctly different than, say, a macaque monkey,
00:05:32.460 | aside from language?
00:05:34.700 | - I don't know that anything really is.
00:05:36.180 | I mean, so actually, it's an interesting time
00:05:38.420 | to have you ask me that question,
00:05:40.140 | 'cause this spring semester I teach a seminar
00:05:42.940 | for the psychology department at Penn called Being Human.
00:05:46.700 | And the whole idea of that,
00:05:48.020 | each week we tackle an aspect of who we are
00:05:52.420 | that has, at one point or another,
00:05:54.420 | been considered to be uniquely human or close to, right?
00:05:58.220 | And that could be something like art and creativity,
00:06:01.460 | or theory of mind, right?
00:06:04.700 | Or economics and markets and things like that.
00:06:09.380 | And when you take a look at these things
00:06:11.140 | through the lenses of neuroscience and anthropology,
00:06:14.020 | this is how we do it,
00:06:15.180 | economics, psychology, neurology, and on and on and on,
00:06:19.460 | you start to really see that there's a lot more continuity
00:06:23.500 | than discontinuity.
00:06:25.580 | And that's kind of pretty shocking.
00:06:28.700 | And I wanna go back to that Elon tweet, if I may,
00:06:33.700 | because I think that's where we go
00:06:35.780 | a little bit astray, too,
00:06:36.940 | in thinking about the brain as a computer, right?
00:06:40.660 | So it's, well, obviously it's not built on silicon, right?
00:06:43.860 | It's made of meat and fat,
00:06:46.660 | and it's subject to all of the constraints
00:06:49.060 | that go along with that.
00:06:52.180 | And what I think, instead, is a better metaphor
00:06:55.340 | is that we've got a 30-million-year-old
00:06:58.140 | Swiss Army knife in our heads, right?
00:07:00.100 | So yes, you can learn to do all kinds of different things,
00:07:02.500 | but you've got a brain
00:07:04.260 | that's got essentially specific tools in it.
00:07:07.900 | It's like having a knife and a corkscrew,
00:07:10.220 | which is the most important one,
00:07:12.460 | a nail file, a saw, et cetera,
00:07:14.980 | and a monkey's got those, too.
00:07:16.540 | Now, ours might be a little bigger and sharper,
00:07:20.180 | but they look pretty similar,
00:07:23.620 | and they do the job in a very similar way.
00:07:26.620 | And I think once we appreciate that,
00:07:29.700 | then that opens up a lot of territory for applications,
00:07:34.700 | not just trying to understand
00:07:36.340 | how some of those tools might get broken or dull
00:07:38.860 | as a result of illness or injury or disorders, et cetera,
00:07:43.580 | but also how we can measure them
00:07:46.220 | and how we can develop them better,
00:07:48.100 | because some of those we use all the time, say, in business.
00:07:51.100 | - I'd like to take a quick break
00:07:53.340 | and acknowledge our sponsor, Our Place.
00:07:55.900 | Our Place makes my favorite pots, pans, and other cookware.
00:07:59.380 | Surprisingly, toxic compounds such as PFASs,
00:08:02.180 | or forever chemicals,
00:08:03.500 | are still found in 80% of nonstick pans,
00:08:06.440 | as well as utensils, appliances,
00:08:08.020 | and countless other kitchen products.
00:08:10.060 | Now, I've talked before on this podcast
00:08:11.500 | about these PFASs, or forever chemicals, like Teflon,
00:08:14.940 | which have been linked to major health issues,
00:08:16.700 | such as hormone disruption, gut microbiome disruption,
00:08:19.860 | fertility issues, and many other health problems,
00:08:22.380 | so it's really important to avoid them.
00:08:24.380 | This is why I'm a huge fan of Our Place.
00:08:26.500 | Our Place products are made
00:08:27.620 | with the highest quality materials
00:08:29.140 | and are all PFAS and toxin-free.
00:08:31.680 | I particularly love their Titanium Always Pan Pro.
00:08:34.620 | It's the first nonstick pan made
00:08:36.060 | with zero chemicals and zero coating.
00:08:38.380 | Instead, it uses pure titanium.
00:08:40.380 | This means it has no harmful forever chemicals,
00:08:42.780 | and it also doesn't degrade
00:08:44.140 | or lose its nonstick effect over time.
00:08:46.420 | It's extremely durable,
00:08:47.660 | and it's also beautiful to look at.
00:08:49.500 | I cook eggs in my Titanium Always Pan Pro
00:08:51.740 | almost every morning.
00:08:53.140 | The design allows for the eggs to cook perfectly
00:08:55.540 | without sticking to the pan.
00:08:57.120 | I also cook burgers and steaks in it,
00:08:58.860 | and it puts a really nice sear on the meat.
00:09:00.780 | But again, nothing sticks to the pan,
00:09:02.260 | so it's really easy to clean,
00:09:03.620 | and it's even dishwasher safe.
00:09:05.360 | I love it, and I use it every day.
00:09:07.220 | For a limited time,
00:09:08.120 | Our Place is offering an exclusive 20% discount
00:09:11.060 | on the Titanium Always Pan Pro.
00:09:13.100 | If you go to the website fromourplace.com/huberman
00:09:16.660 | and use the code SAVEHUBERMAN20,
00:09:18.940 | you can claim the offer.
00:09:20.460 | With a 100-day risk-free trial,
00:09:22.020 | free shipping, and free returns,
00:09:24.020 | you can experience this fantastic cookware
00:09:25.980 | with absolutely zero risk.
00:09:27.720 | Again, that's fromourplace.com/huberman to get 20% off.
00:09:32.100 | Today's episode is also brought to us by Wealthfront.
00:09:34.820 | I've been using Wealthfront for my savings
00:09:36.420 | and my investing for nearly a decade,
00:09:38.180 | and I absolutely love it.
00:09:39.780 | At the start of every year, I set new goals,
00:09:41.900 | and one of my goals for 2025 is to focus on saving money.
00:09:45.580 | Since I have Wealthfront,
00:09:46.720 | I'll keep that savings in my Wealthfront Cash account,
00:09:49.220 | where I'm able to earn 4% annual percentage yield
00:09:51.500 | on my deposits, and you can as well.
00:09:53.860 | With Wealthfront, you can earn 4% APY on your cash
00:09:56.500 | from partner banks until you're ready
00:09:58.380 | to either spend that money or invest it.
00:10:00.620 | With Wealthfront, you also get free instant withdrawals
00:10:02.800 | to eligible accounts every day,
00:10:04.620 | even on weekends and holidays.
00:10:06.620 | The 4% APY is not a promotional rate,
00:10:09.060 | and there's no limit to what you can deposit and earn.
00:10:11.140 | And you can even get protection for up to $8 million
00:10:13.940 | through FDIC insurance provided
00:10:15.600 | through Wealthfront's partner banks.
00:10:17.540 | Wealthfront gives you free instant withdrawals
00:10:19.740 | where it takes just minutes to transfer your money
00:10:21.860 | to eligible external accounts.
00:10:23.740 | It also takes just minutes to transfer your cash
00:10:26.220 | from the cash account
00:10:27.380 | to any of Wealthfront's automated investment accounts
00:10:29.780 | when you're ready to invest.
00:10:31.400 | There are already a million people using Wealthfront
00:10:33.500 | to save more, earn more, and build long-term wealth.
00:10:36.620 | Earn 4% APY on your cash today.
00:10:39.380 | If you'd like to try Wealthfront,
00:10:40.740 | go to wealthfront.com/huberman
00:10:43.260 | to receive a free $50 bonus with a $500 deposit
00:10:46.620 | into your first cash account.
00:10:48.300 | That's wealthfront.com/huberman to get started now.
00:10:52.020 | This has been a paid testimonial of Wealthfront.
00:10:54.300 | Wealthfront brokerage isn't a bank.
00:10:56.180 | The APY is subject to change.
00:10:58.180 | For more information, see the episode description.
00:11:01.700 | - So if we were to start at what us neuroscientists
00:11:05.860 | would call kind of more low-level functioning,
00:11:07.980 | even though it's pretty high level,
00:11:09.980 | with something like attention.
00:11:12.120 | You know, we are very visual creatures
00:11:14.980 | for those of us that are sighted.
00:11:16.260 | Most humans are sighted.
00:11:18.280 | We rely on vision to assess the world around us,
00:11:21.340 | to assess emotions of others, et cetera.
00:11:24.420 | And so are the other old world primates, right?
00:11:29.180 | How do we allocate attention?
00:11:31.800 | Like what grabs our attention?
00:11:33.620 | And maybe in this discussion, we could also touch on,
00:11:36.140 | 'cause I know you've worked on this,
00:11:37.700 | what underlies some deficits in attention?
00:11:40.880 | So yeah, if we could just explore this
00:11:44.760 | from the perspective of, okay, you go into an environment,
00:11:47.780 | let's say it's a familiar environment.
00:11:49.220 | You wake up in the room, you wake up in each day.
00:11:51.660 | What grabs your attention?
00:11:53.920 | What keeps your attention?
00:11:55.900 | And if we do, in fact, have control over our attention,
00:11:59.980 | which we do to some extent,
00:12:01.580 | why is it so difficult for many of us to decide,
00:12:05.320 | you know what, I'm just gonna put everything away,
00:12:07.700 | and I'm just gonna focus on this task for the next hour?
00:12:11.920 | Why is that so challenging for so many people,
00:12:14.940 | regardless of whether they have a diagnosis
00:12:17.580 | of attention deficit hyperactivity disorder?
00:12:20.320 | - Okay, there's a lot in that question,
00:12:22.260 | many questions in there.
00:12:23.480 | And let's talk about what attention is, right?
00:12:25.460 | It is a prioritization, right?
00:12:27.620 | Or an amplification of what you're focusing on, right?
00:12:31.480 | And we do that by where we point our eyes, right?
00:12:34.340 | And then that, it gets turned up in the brain
00:12:37.300 | with a lot of consequences.
00:12:39.300 | And really, why do we have attention?
00:12:42.300 | Because you can't do everything at once, right?
00:12:44.660 | So it's in the name of efficiency.
00:12:47.420 | What we attend to is a product of two things.
00:12:53.120 | It's what we're looking for
00:12:55.180 | and what the world looks like, right?
00:12:57.740 | And that kind of what the world looks like part
00:13:01.060 | is importantly shaped by what our ancestors experienced,
00:13:06.060 | and also what we experienced
00:13:07.820 | when we were developing or growing up.
00:13:10.220 | So things that are bright or shiny or moving fast, right?
00:13:13.980 | Or loud or whatever, that grabs our attention.
00:13:16.820 | Things that stand out, that are different.
00:13:18.920 | And for us as primates,
00:13:22.380 | one thing that's super important
00:13:23.700 | and kind of really deeply baked in is other people.
00:13:27.860 | So if there are faces,
00:13:29.900 | if there are people in the environment doing something,
00:13:32.820 | then that naturally just grabs our attention
00:13:37.100 | unless we happen to be an individual
00:13:39.520 | who's sort of wired a little bit differently,
00:13:41.620 | like folks on the autism spectrum disorder
00:13:43.900 | or schizophrenia, things like that,
00:13:46.400 | where that prioritization is not quite the same.
00:13:51.900 | So that's kind of how our experience as primates,
00:13:56.900 | and just the design principles of the way our brains work
00:14:01.040 | to overcome some of these limitations
00:14:03.760 | in the name of efficiency come about.
00:14:06.100 | And then as you mentioned,
00:14:07.420 | what we can control our attention to a certain degree.
00:14:10.380 | And that's super important for a lot of,
00:14:13.580 | I think overcoming a lot of the challenges that we have.
00:14:16.860 | And we can talk about that like in decision-making,
00:14:20.060 | for example, because you, or learning,
00:14:22.500 | because you can't control what you're attending to,
00:14:26.060 | that gets turned up in the brain, right?
00:14:28.180 | And that affects what we choose
00:14:30.220 | and it affects what we learn,
00:14:32.220 | it affects what we remember as well.
00:14:36.780 | So now I'm trying to kind of go back to like,
00:14:39.060 | then the end part of your question.
00:14:42.460 | Oh, so that had to do with multitasking
00:14:44.460 | or just things in the environment.
00:14:47.120 | And that gets at this question
00:14:49.320 | or topic of, in my view, of foraging, right?
00:14:53.140 | And so I think that attention,
00:14:56.040 | this is the argument we've made,
00:14:58.260 | operates according to essentially the same rules
00:15:01.700 | and principles that our bodies do
00:15:05.980 | when we are searching the environment for resources.
00:15:10.000 | So all mobile animals search for food,
00:15:14.420 | search for mates, search for water,
00:15:16.620 | for the resources that they need to survive
00:15:18.460 | and to reproduce.
00:15:20.900 | And as it turns out, it's that kind of decision,
00:15:25.420 | the one The Clash made very memorable:
00:15:28.180 | Should I stay or should I go?
00:15:30.320 | That's the key thing.
00:15:31.300 | So when you encounter something,
00:15:33.220 | like the question is like, do I take it?
00:15:34.780 | Do I stick with it?
00:15:35.780 | Even though it might be depleting, getting worse,
00:15:38.700 | or should I take a risk and invest time and energy
00:15:40.860 | and go look for something else?
00:15:42.420 | All animals have to do that.
00:15:47.260 | It turns out there's an optimal solution to that,
00:15:50.020 | which was written out
00:15:50.860 | by one of the great mathematical ecologists,
00:15:53.820 | Eric Chernoff, in a paper in 1976.
00:15:57.060 | And so he wrote this out.
00:15:58.340 | And what's cool about it is it's very simple.
00:16:01.580 | It's basically you leave,
00:16:04.220 | you abandon the thing that you're harvesting
00:16:08.700 | when what you're getting from it
00:16:10.400 | falls below the average for the environment.
00:16:12.200 | That just makes sense.
00:16:13.220 | The marginal returns, right?
00:16:16.100 | - And this could be a social interaction.
00:16:17.660 | - Could be a social interaction.
00:16:18.660 | It could be food, could be water,
00:16:19.620 | could be the money that you're making in the moment,
00:16:22.140 | could be the information that you're getting
00:16:24.580 | from a book or from a website or whatnot.
00:16:28.040 | And we, from studies done over the last,
00:16:32.260 | whatever that is now, 50 years,
00:16:33.640 | have shown that every animal that's ever been observed
00:16:36.620 | behaves as if they're performing that computation.
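[Editor's note: the leave-when-below-average rule described here can be sketched as a short simulation. The exponentially depleting gain curve and all the numbers below are illustrative assumptions for the sketch, not details from the episode.]

```python
import math

# Illustrative sketch of the marginal value theorem's leave rule
# (Charnov, 1976): abandon the current patch the moment its
# instantaneous return falls below the environment's average rate.

def patch_gain_rate(time_in_patch: float, initial_rate: float = 10.0,
                    depletion: float = 0.5) -> float:
    """Instantaneous return from the current patch; it depletes the
    longer you harvest it (the apples get harder to reach)."""
    return initial_rate * math.exp(-depletion * time_in_patch)

def time_to_leave(environment_average: float, dt: float = 0.01) -> float:
    """Step time forward until the patch's gain rate drops below the
    average rate for the environment as a whole, then leave."""
    t = 0.0
    while patch_gain_rate(t) > environment_average:
        t += dt
    return t

# A richer environment (higher average) means abandoning each patch
# sooner: the lone-apple-tree vs. orchard intuition from the conversation.
print(time_to_leave(environment_average=2.0))  # poor environment: stay long
print(time_to_leave(environment_average=8.0))  # rich environment: leave fast
```

In this toy version, raising the environment average from 2.0 to 8.0 cuts the time spent in each patch from roughly 3.2 time units to under 0.5, which is the computation animals behave as if they are performing.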
00:16:39.660 | - Could you give an example in the context
00:16:41.620 | of let's say social media?
00:16:42.860 | And as we were walking into record today,
00:16:46.280 | we were comparing and contrasting X
00:16:48.780 | as a platform versus Instagram.
00:16:50.980 | And it occurred to me now,
00:16:52.300 | based on what you said a few moments ago,
00:16:54.420 | that Instagram is very visual.
00:16:56.220 | So you see faces.
00:16:57.620 | Many accounts on X, either the icon is so small
00:17:00.940 | or people even just have cartoons or whatever,
00:17:04.300 | avatars there that aren't really faces in many cases.
00:17:07.300 | And it does seem that on X,
00:17:09.140 | there's a kind of a elevated level of emotionality
00:17:14.140 | to what people write.
00:17:15.500 | That's what tends to grab attention.
00:17:16.780 | And I wonder whether or not that's
00:17:17.940 | because of the absence of faces.
00:17:19.660 | I mean, when somebody is on an Instagram post
00:17:21.780 | and they're kind of ranting a bit,
00:17:22.980 | in fact, I saw this yesterday.
00:17:24.540 | Tim Ferriss, another podcaster,
00:17:27.580 | had the investor Chris Sacca on.
00:17:29.660 | And Chris was talking about environmentalism and the fires.
00:17:34.540 | And he had opinions about AI.
00:17:36.140 | He's very, very smart, very opinionated guy.
00:17:38.020 | But people were commenting.
00:17:39.740 | I don't know how he felt, how could I?
00:17:42.220 | But people were commenting, "He's so angry.
00:17:44.740 | He's so angry."
00:17:45.580 | And he was just being passionate and emphatic.
00:17:47.980 | Maybe he was angry, I don't know.
00:17:49.020 | But he was clearly very alert,
00:17:50.780 | leaning forward into the camera.
00:17:52.220 | And people were paying,
00:17:53.660 | most of their comments were paying attention
00:17:56.100 | to the emotion behind what he was saying.
00:17:59.300 | And whereas on X, I feel like if you just took the text
00:18:02.500 | of what he was saying and you put it there,
00:18:05.220 | it would be kind of below the average emotionality on X.
00:18:09.140 | And so when you say that we are drawn to faces
00:18:13.780 | or that faces are,
00:18:14.860 | we naturally forge towards faces versus other things,
00:18:18.440 | that feels very true.
00:18:21.220 | And do you feel like elevated levels of emotion in faces
00:18:24.860 | are what harness the most attention?
00:18:27.100 | And by parallel, if you get a bunch of monkeys together
00:18:30.420 | and one of them is really upset,
00:18:31.860 | do they all look at that monkey?
00:18:34.700 | - Speculating a little bit here.
00:18:36.180 | It's not thought about in the context of say,
00:18:39.540 | X versus Instagram.
00:18:40.740 | But I think you're right on.
00:18:42.620 | I mean, I think that's spot on.
00:18:44.720 | You're just combining, like you're turning,
00:18:46.780 | the volume gets turned up because there are faces there.
00:18:49.500 | And if they're more emotional,
00:18:50.420 | they're just gonna be much more salient.
00:18:52.180 | Grab your attention.
00:18:53.020 | And that's something that's really important
00:18:54.100 | to pay attention to,
00:18:54.940 | because somebody who's very aroused, right?
00:18:58.900 | That's activation, that's sort of pre-activation
00:19:02.340 | before they do something.
00:19:03.460 | Like they might attack you
00:19:05.500 | or they might take something from you.
00:19:07.500 | Who knows, right?
00:19:08.340 | Something could happen there.
00:19:10.380 | But I want to take this back a little bit.
00:19:12.460 | I'm older than you.
00:19:14.340 | And I want to take this idea of different sources,
00:19:19.340 | like where you could place your attention,
00:19:21.040 | take it back a little bit more in time.
00:19:22.740 | Because what's been shown, and it's interesting,
00:19:24.900 | computer science picked up on this marginal value theorem
00:19:29.100 | from mathematical ecology around 2000 or so,
00:19:33.100 | and began to investigate how people search the web.
00:19:36.300 | And it turned out people would leave a website
00:19:39.460 | the moment their information intake rate
00:19:42.780 | fell below the average for sort of all the websites
00:19:46.260 | that they were encountering.
00:19:48.020 | - The average is determined by your behavior
00:19:50.820 | in the what, the preceding bin of time, like 10 minutes,
00:19:54.380 | until you arrive at a site, or within a site?
00:19:57.140 | - So that's less well known,
00:19:58.700 | but we're now learning that it is pretty short term, right?
00:20:02.740 | So it seems to be driven by reinforcement learning processes
00:20:07.420 | that kind of are telling you how rich that environment is.
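[Editor's note: the short-term, reinforcement-learning estimate of environment richness described here can be sketched with a simple delta rule, i.e. an exponential moving average of recent rewards. The learning rule and numbers are illustrative assumptions; the episode doesn't specify the exact process.]

```python
class ForagingTracker:
    """Toy estimate of environment richness: an exponential moving
    average of recent rewards. A high learning rate keeps the estimate
    short-term, weighting recent experience most heavily."""

    def __init__(self, learning_rate: float = 0.3):
        self.learning_rate = learning_rate
        self.estimated_rate = 0.0  # running estimate of the environment average

    def update(self, reward: float) -> None:
        # Delta-rule update: nudge the estimate toward each new reward.
        self.estimated_rate += self.learning_rate * (reward - self.estimated_rate)

    def should_leave(self, current_source_rate: float) -> bool:
        # Marginal-value-style decision: abandon the current source once
        # its return falls below the learned environment average.
        return current_source_rate < self.estimated_rate

tracker = ForagingTracker()
for reward in [2.0, 3.0, 10.0, 9.0]:  # the environment suddenly gets richer
    tracker.update(reward)
print(tracker.should_leave(current_source_rate=4.0))  # True: 4.0 now looks poor
```

In this framing, a trick like turning the phone monochrome lowers the return rate of one source until `should_leave` fires sooner, which matches the advice later in the conversation.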
00:20:10.500 | And so one of the things about the marginal value theorem
00:20:12.900 | I think is really, really profound
00:20:15.660 | for understanding our current predicament
00:20:19.540 | is that it says that if you're in a really poor environment,
00:20:22.540 | like you, let's say you forge for apples, right?
00:20:26.260 | And there's one apple tree for the next 10 miles.
00:20:28.500 | You stay in that apple tree until you picked every apple,
00:20:31.340 | rotten or not rotten, not ripe, right?
00:20:33.660 | Before you move on.
00:20:34.820 | If you were in an orchard with apple trees everywhere,
00:20:37.180 | you just pick the ones that are easiest to get
00:20:38.940 | and then you move on.
00:20:40.140 | So now think about it in the context of surfing the web.
00:20:45.140 | Like when you were, you know,
00:20:46.900 | if you're coming up when I did,
00:20:49.220 | you know, I was in graduate school or as an undergraduate,
00:20:52.940 | the way I accessed the internet was through a dial-up modem.
00:20:55.940 | So it was very slow, it was a very poor environment.
00:20:59.940 | You're sitting there waiting for the information
00:21:02.500 | to load up, right?
00:21:03.700 | And it might take 30 seconds or longer.
00:21:05.980 | You don't abandon that.
00:21:08.660 | You read the whole thing, you might print it out,
00:21:11.060 | put it in your file cabinet, right?
00:21:13.300 | Now you get like super high speed internet.
00:21:17.500 | - Yeah, you can have 12 tabs open, 50 tabs open.
00:21:19.020 | - And so you spend, like, you know,
00:21:21.100 | half a second or a couple of seconds on any one.
00:21:23.260 | You don't, you certainly don't scroll down
00:21:25.060 | beneath the fold, right?
00:21:26.940 | So it totally makes sense.
00:21:28.420 | Now think about all the devices you might have.
00:21:30.300 | Or it could be tabs, it could be,
00:21:31.780 | most people are sitting around with a TV on,
00:21:34.500 | you know, their phone, a tablet, a laptop, whatnot.
00:21:37.060 | - Yeah, I'm guilty of having, I have three phones.
00:21:40.300 | - Yeah, so you're just cycling.
00:21:41.820 | You are doing exactly what you're designed to do, right?
00:21:45.980 | Which is to move between these resources
00:21:50.380 | quickly and easily because it's so easy.
00:21:52.540 | So, going back to your question
00:21:54.380 | about why it's so hard:
00:21:56.420 | you've got to be really, really deliberate.
00:21:57.900 | You have to either reduce, you know,
00:22:01.100 | make it a harder environment, I guess is the idea.
00:22:03.980 | You would have to actually put things away
00:22:06.300 | or make the return rate that you're getting
00:22:08.820 | from any of them much worse.
00:22:10.260 | Like for example, if you turn your phone monochrome,
00:22:13.140 | which we know works, right?
00:22:15.100 | It helps you to stop checking your phone
00:22:17.140 | and spend less time on it
00:22:18.980 | because it's just not as good of a source.
00:22:22.380 | - Yeah, the information feels really depleted.
00:22:25.820 | You reposted a paper result recently,
00:22:29.980 | and I did as well after I saw it on your ex account
00:22:33.260 | that if you look at working memory,
00:22:37.460 | the ability to keep information online in real time
00:22:39.860 | and work with it, it seems that working memory is worst
00:22:44.860 | when your phone is right next to you.
00:22:48.640 | If it's somewhere else in the room that you're working in,
00:22:52.300 | while we're trying to do real work of some sort,
00:22:55.980 | your performance is slightly better
00:22:59.140 | than if it's right next to you.
00:23:00.700 | But if the phone is completely outside of the room,
00:23:03.300 | improvements in working memory
00:23:05.700 | are statistically significant.
00:23:07.340 | In other words, get the phone completely out of the room.
00:23:09.700 | It's not sufficient to have it next to you turned face down
00:23:12.340 | or even in your backpack behind you.
00:23:14.060 | It needs to be in a completely separate environment
00:23:16.420 | in order to maximize this effect.
00:23:18.700 | - Yeah, I mean, it's completely consistent
00:23:20.420 | with what we're saying here with regard to foraging.
00:23:22.860 | But if I take my phone and I put it,
00:23:24.820 | I don't have my phone here under the chair,
00:23:26.260 | but let's say I did, and this result suggests
00:23:28.260 | that some component of our neural circuitry
00:23:30.420 | is operating in the background thinking,
00:23:32.400 | well, I guess something could be on there.
00:23:34.100 | Maybe I got a text or maybe there's a tweet
00:23:36.540 | I should look at or an Instagram post.
00:23:38.860 | It suggests that we are multitasking
00:23:42.260 | even when we think we are not multitasking.
00:23:46.100 | - Yeah, I think you're absolutely right.
00:23:47.500 | It's beneath our awareness, right?
00:23:49.860 | So that's where I think the kind of comparative psychology,
00:23:54.780 | comparative neurobiology is really important here
00:23:57.900 | because I don't necessarily impute conscious awareness
00:24:02.900 | to all these critters that are out there doing these things,
00:24:07.060 | behaving exactly the same way we are.
00:24:09.580 | And so to me, that just indicates that all that hardware,
00:24:14.340 | those same routines are just running under the hood,
00:24:17.060 | running under the surface, and we're not aware of it.
00:24:19.260 | So when your phone is somewhere within the sphere
00:24:23.140 | that could be accessed, the brain's aware of that,
00:24:26.380 | and it's including that in the calculations
00:24:29.500 | about what to do next.
00:24:31.900 | And it actually reminds me now of,
00:24:36.700 | there's actually a couple of papers
00:24:37.780 | that we published some time ago on foraging.
00:24:40.900 | And one of the things that's really interesting about it
00:24:42.660 | is that as you are considering your options
00:24:49.140 | and you're experiencing sort of these depleting rewards
00:24:52.580 | or whatnot, you see this urgency signal kind of building up
00:24:57.060 | in a part of the brain, the anterior cingulate cortex
00:24:59.660 | that we know is important for moving on,
00:25:01.860 | for switching, for searching for something new.
00:25:06.060 | And it does, you know,
00:25:07.380 | I don't know what the emotional component of that is.
00:25:10.060 | We never explored that.
00:25:11.560 | But it seems reasonable to imagine that that's tied to,
00:25:15.380 | you know, this sense of like,
00:25:16.840 | I really, I'm gonna turn my phone over
00:25:20.980 | and check what's going on there.
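The patch-leaving logic described here is the marginal value theorem: leave a patch when its instantaneous return drops to the best long-run average rate available elsewhere. A minimal sketch of that decision rule, with illustrative parameter values (an assumption for illustration, not numbers from the papers being discussed):

```python
import math

def gain(t, g_max=10.0, deplete=0.5):
    # Cumulative reward from a patch; intake slows as the patch depletes.
    return g_max * (1 - math.exp(-deplete * t))

def optimal_leave_time(travel_time, g_max=10.0, deplete=0.5, dt=0.001):
    # Marginal value theorem: the best leaving time maximizes the long-run
    # average rate, gain(t) / (time in patch + travel time between patches).
    best_t, best_avg, t = dt, 0.0, dt
    while t < 60.0:
        avg = gain(t, g_max, deplete) / (t + travel_time)
        if avg > best_avg:
            best_avg, best_t = avg, t
        t += dt
    return best_t

# When other patches are costlier to reach (longer travel),
# the optimal policy is to stay longer in the current patch.
t_near = optimal_leave_time(travel_time=1.0)
t_far = optimal_leave_time(travel_time=5.0)
```

The qualitative prediction, stay longer when switching is costlier, is the same logic as making the phone a "worse patch" by putting it in another room.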
00:25:22.360 | - Are there any data that suggest
00:25:24.340 | that just being able to maintain a thought train,
00:25:28.460 | independent of visual input,
00:25:30.380 | can help us get better at maintaining attention?
00:25:32.940 | So for instance, this morning I woke up very early,
00:25:35.860 | unusually early for me,
00:25:36.940 | 'cause I went to bed unusually early for me.
00:25:39.100 | And I decided to try something,
00:25:41.860 | which is something that actually our colleague
00:25:45.900 | in neuroscience, Karl Deisseroth, had mentioned he does,
00:25:48.500 | and a previous guest on this podcast, Josh Waitzkin,
00:25:51.720 | who is a former chess champion,
00:25:54.660 | has described something like this.
00:25:55.820 | I decided to try it, which was to keep my eyes closed
00:25:59.300 | and just try and think in complete sentences,
00:26:04.000 | not let my mind drift off topic for a while,
00:26:06.400 | have a conversation with myself in my head.
00:26:09.620 | But with the constant redirect
00:26:13.000 | of trying to stay in a thought train.
00:26:14.500 | And it's actually much more difficult
00:26:16.460 | than I thought it would be, right?
00:26:18.460 | There's no other input, my eyes are closed,
00:26:20.260 | I was comfortable with the temperature of the room,
00:26:23.100 | et cetera, I was well-rested, no phone, no input.
00:26:27.140 | And you get one sentence of thought out, then the next.
00:26:30.020 | It's a bit like writing, except here, no visual input.
00:26:32.940 | So I would have thought it's a lot easier
00:26:35.580 | because you don't have a set of tabs across the top
00:26:39.420 | or even a word doc with a, like,
00:26:40.640 | do you want to change it to bold, et cetera,
00:26:42.220 | like no other input competing for one's attention.
00:26:46.100 | And I found that after about 10 minutes,
00:26:48.260 | it became pretty easy.
00:26:50.140 | But it took me about 10 minutes
00:26:51.460 | to get into this redirect of focus.
00:26:53.940 | And then at one point I thought I better stop this
00:26:55.940 | 'cause it's seeming kind of weird,
00:26:57.100 | but that was very different, I would say,
00:27:00.120 | than sitting down to say meditate
00:27:03.700 | and think about my breath,
00:27:04.580 | which is a physical phenomenon that's tangible
00:27:06.940 | at the level of feeling one's breath.
00:27:08.660 | - So how do you feel about practices
00:27:10.700 | that teach us to maintain attention
00:27:13.480 | and redirect our attention
00:27:15.360 | that are very deprived of visual input
00:27:17.980 | as a kind of training ground for being able to harness
00:27:21.660 | and maintain visual input when we need to get work done,
00:27:25.380 | work on problem sets, write,
00:27:27.120 | do like what I call real work
00:27:30.100 | or Cal Newport would call deep work.
00:27:32.760 | - So I've never tried that.
00:27:34.180 | And it sounds fascinating.
00:27:35.460 | And I'm gonna try to give it a shot,
00:27:37.620 | you know, tomorrow morning.
00:27:39.060 | At first I was thinking this sounds a lot like meditation,
00:27:44.820 | right, but there are a whole variety,
00:27:46.500 | I'm no expert on meditation,
00:27:47.940 | but there are a whole variety
00:27:48.780 | of different kinds of meditation.
00:27:50.260 | Some, as you mentioned, you know,
00:27:52.020 | you're focusing on breath work, physical stimulus,
00:27:56.260 | but there are others that are not
00:27:58.980 | and that are much more kind of cognitively focused.
00:28:02.940 | So for example, like loving kindness meditation
00:28:07.940 | is one where you're kind of thinking
00:28:09.860 | about a particular person, you're imagining them
00:28:11.580 | and you're imagining something
00:28:13.260 | really good happening to them, right?
00:28:15.500 | So this sort of one of these, you know,
00:28:18.140 | self transcendent types of meditation,
00:28:20.340 | which are not, I don't think,
00:28:21.820 | really tied to any external input coming in,
00:28:24.620 | although it's an internal input, right?
00:28:26.900 | That's based on your memory
00:28:29.900 | or awe based meditation.
00:28:33.100 | So maybe it's more similar to those, but I-
00:28:37.460 | - But it's like thematically anchored.
00:28:39.580 | - Exactly, exactly.
00:28:40.860 | - As opposed to visually anchored,
00:28:41.940 | like staring at a flame or concentrating on one's breath.
00:28:44.620 | Yeah, I didn't have a, it was like free,
00:28:47.100 | in terms of putting in language of foraging,
00:28:49.660 | it was like, I didn't have a plan.
00:28:50.660 | I wasn't writing a paragraph.
00:26:52.020 | It was just, can I stay in a conversation with myself
00:26:55.020 | where there's no moment
00:26:58.780 | that some external voice or input
00:27:04.220 | or thought about something else in the room intrudes?
00:29:06.060 | You know, just, can I just kind of stay in there?
00:29:08.820 | Can I just stay in there?
00:29:10.180 | That was really the question.
00:29:12.100 | - Yeah, I think that that makes complete sense
00:29:14.820 | because it's kind of like you're foraging for apples
00:29:18.220 | in that tree that's, you know,
00:29:19.860 | in the middle of the Serengeti somewhere, right?
00:29:21.700 | And there's nothing anywhere around you.
00:29:23.780 | And so you're going to stick with that
00:29:25.660 | and just keep mining it until there's nothing left.
00:29:29.340 | One of the reasons that I brought up this example
00:29:32.180 | was I noticed that anything that has to do with attention,
00:29:36.260 | whether or not it's visual attention or, you know,
00:29:39.180 | needing to write or cognitive attention
00:29:42.180 | and redirecting attention,
00:29:43.380 | unless there's some high level of,
00:29:45.700 | as you call it arousal or emotionality,
00:29:48.500 | I find there's always a kind of warmup period required
00:29:51.580 | and that this isn't taught to us in school.
00:29:54.060 | And that so many people who think
00:29:55.220 | that they have a hard time maintaining attention,
00:29:57.860 | I have this hypothesis that they are training
00:30:00.940 | non-attention or brief attention by, you know,
00:30:04.500 | scrolling through movies on a, you know,
00:30:07.540 | social media platform is basically training,
00:30:09.740 | redirecting your attention every couple of seconds
00:30:13.420 | or maybe every few minutes.
00:30:16.180 | So you get good at that.
00:30:17.500 | You get good at scrolling.
00:30:18.940 | You get good at what you do.
00:30:21.140 | But also I think it was always the case
00:30:25.060 | that sitting down to do something difficult
00:30:27.900 | or learn or write or pay careful auditory attention,
00:30:30.820 | maybe even to a podcast,
00:30:32.580 | that there's a kind of a warming up period.
00:30:34.540 | What is the evidence that neural circuits in the brain
00:30:37.500 | are kind of, here I'm using very top contour language
00:30:41.260 | in front of another card-carrying neuroscientist,
00:30:43.740 | but that neural circuits are kind of more dispersed
00:30:47.220 | in their activation patterns,
00:30:49.580 | but that over time we can drop into a trench,
00:30:51.940 | not just of attention,
00:30:52.900 | but that then the signal to noise of that circuit
00:30:56.020 | required for attention
00:30:57.620 | and the other components of the task
00:30:59.380 | gets much greater compared to the background noise.
00:31:02.540 | Is there evidence for that?
00:31:03.380 | In the same way that warming up to work out,
00:31:06.140 | no one expects to walk in and train with their work weight
00:31:09.340 | or to run at the speed that they would in mile three, right?
00:31:13.220 | You know, that you warm up.
00:31:15.140 | It's like, but this notion of warming up the brain
00:31:18.420 | for specific cognitive activities
00:31:20.580 | doesn't seem as abundant out there.
00:31:24.620 | And I think part of the reason might be,
00:31:26.180 | and I'd like your thoughts on this,
00:31:27.700 | that we are all familiar with something super exciting
00:31:31.580 | or scary grabbing our attention in an instant.
00:31:34.340 | But then I would say, well, you can sprint into the street
00:31:37.340 | to save your kid from getting hit by a car.
00:31:38.980 | You didn't warm up for that,
00:31:40.860 | but that's not how you exercise
00:31:42.660 | because there isn't the same level of urgency.
00:31:45.220 | - That's a deep question.
00:31:46.540 | I think, and I, you know, it's funny to me too,
00:31:49.340 | because it, I don't warm up often before I work out.
00:31:54.340 | And that's like, so-
00:31:56.140 | - You seem to be in great shape.
00:31:57.140 | - No, but it's like, it's funny, you know,
00:31:58.340 | I've been doing CrossFit for like 17 years.
00:32:00.540 | - Oh, wow.
00:32:01.380 | And you're still uninjured?
00:32:02.220 | You're one of the few- - Oh, no, I've got
00:32:03.180 | plenty of injuries.
00:32:04.020 | I, you know, I've had, you know,
00:32:05.940 | a couple of hernia surgeries and maybe,
00:32:08.460 | maybe just like five or six minutes of mobility work.
00:32:11.180 | You know, we have a lot of episodes on that.
00:32:12.580 | - No, no, the mobility is really good.
00:32:14.380 | And I actually, what I, what I have, you know,
00:32:16.740 | periodically it's like, take like, you know,
00:32:18.300 | many months off to do just purely mobility, PT,
00:32:21.380 | because, and like, I did Pilates intensively
00:32:24.300 | for a year and a half after, after one injury,
00:32:27.900 | and I loved it.
00:32:28.900 | And it's cool to see what it does to your body,
00:32:31.380 | 'cause it totally refashioned it.
00:32:32.620 | I was, 'cause I've always been like big guy up here.
00:32:35.660 | And then you do Pilates for, or yoga for a long time,
00:32:37.940 | went through yoga period too.
00:32:39.220 | And suddenly it's all core, you know,
00:32:41.260 | and you become like a very different, very different human.
00:32:45.140 | - Yeah, so there's this issue of warming up.
00:32:46.660 | You don't like warming up, which explains your injuries.
00:32:48.620 | - I like warming up.
00:32:49.620 | It's more a question of time.
00:32:50.780 | The reason why, and that's why I needed CrossFit
00:32:52.940 | in the first place, is because I could do a workout
00:32:55.580 | in 10 minutes or under that left me, you know,
00:32:58.500 | dead on the floor.
00:32:59.340 | - I'm telling you. - It was super awesome.
00:33:00.620 | - I'm telling you, 100 jumping jacks,
00:33:03.340 | just like in PE class is still the best warmup I'm aware of.
00:33:06.020 | - It's amazing.
00:33:06.860 | - Like people laugh at me, you know,
00:33:08.460 | it's like, it's so old school,
00:33:09.500 | but you do 100 jumping jacks before you do any kind
00:33:13.740 | of cardiovascular or resistance training.
00:33:13.740 | And I don't, I haven't run a study on this,
00:33:16.460 | but you greatly diminish your chance of injury,
00:33:19.820 | probably because of just raising core body temperature.
00:33:22.300 | But so the question is what, okay,
00:33:24.140 | well then let's pose it in this parallel fashion.
00:33:26.980 | What is the equivalent of the 100 jumping jacks
00:33:30.140 | for cognitive work, right?
00:33:32.180 | For me, it's like internally going like,
00:33:35.780 | what's wrong with you, Andrew?
00:33:36.700 | Why is it so hard for you to like punch out
00:33:38.320 | these 10 paragraphs?
00:33:40.460 | But if someone on my team says,
00:33:41.660 | "Hey, we need this in eight minutes."
00:33:43.920 | I could do that anywhere.
00:33:45.820 | Unless I'm actually driving a vehicle,
00:33:48.300 | I can work anywhere, anytime.
00:33:50.980 | But I would say we don't have the equivalent
00:33:55.500 | of 100 jumping jacks for cognitive work,
00:33:57.140 | but we need it, we need that.
00:33:58.500 | I think people need that and they need the understanding
00:34:00.360 | that it can help them get into that trench of attention.
00:34:03.580 | - I have a bunch of disconnected thoughts on this.
00:34:05.540 | - Please.
00:34:06.380 | - So one would be the converse of that,
00:34:09.220 | which is the, which you kind of alluded to earlier,
00:34:12.400 | which is the not warming up,
00:34:13.720 | but the opposite of warming up, like the distraction.
00:34:16.840 | So there have been some really interesting studies done
00:34:20.380 | in sort of more business-y settings,
00:34:24.240 | management settings about, that looked at foraging, okay?
00:34:29.240 | And think of it this way.
00:34:30.920 | It is more like a measure of creativity,
00:34:32.760 | your proclivity to explore, to try new things,
00:34:35.580 | to go to, you know, to be the opposite of focused, okay?
00:34:39.600 | So, and you can measure that, for example,
00:34:42.640 | like an anagram task.
00:34:44.860 | So you get a bunch of letters,
00:34:45.920 | make as many words as you can.
00:34:48.040 | At some point, you decide to dump them
00:34:50.560 | and get new letters, right?
00:34:51.780 | And so that's sort of an, you know,
00:34:53.240 | you're taking a risk and you're exploring
00:34:54.800 | and you're getting a new set.
00:34:56.320 | You don't know what's gonna happen, right?
00:34:58.160 | And really cool studies showed that
00:35:01.400 | if you precede that task
00:35:04.400 | with a task where people are foraging
00:35:06.740 | for points on a screen, there's hidden,
00:35:09.140 | it's like a visual kind of thing
00:35:10.460 | and you're just looking for stuff.
00:35:12.740 | If the points are really dispersed and spread out,
00:35:17.460 | then people, we don't know how long
00:35:19.580 | that kind of after effect lasts,
00:35:22.340 | but then people are way more kind of hyper explorers.
00:35:26.380 | - With the words in the session.
00:35:27.220 | - With the word thing later.
00:35:28.420 | And if they're doing, if they have to like decide,
00:35:30.900 | if they're playing virtual fishing
00:35:32.820 | and the number of, you know,
00:35:34.820 | the rate at which you catch fish in a pond is declining
00:35:37.840 | and you can press a button and take a time out
00:35:41.100 | to travel to another pond,
00:35:42.840 | people are much more willing to move on, okay?
00:35:45.820 | When they do that,
00:35:47.260 | whereas if you put all the points kind of together,
00:35:48.940 | which is essentially related to what you're saying,
00:35:51.180 | cognitively warming up by focusing,
00:35:53.820 | literally instead of having your filter,
00:35:57.620 | you know, your aperture, your lens like this,
00:36:00.200 | it's now like this,
00:36:01.180 | even though it's a different task that you're going to do.
00:36:04.880 | - Oh, I love this.
00:36:06.120 | - Then you're much more focused on that.
00:36:10.120 | - Okay, I've sat here and done many, many podcasts
00:36:14.920 | and I have to say, it's rare that I say I love this,
00:36:16.800 | probably the first time.
00:36:17.960 | I absolutely love this.
00:36:19.000 | 'Cause as a person who's worked
00:36:20.760 | on a variety of topics in neuroscience,
00:36:22.380 | but visual neuroscience has really been my first home
00:36:25.700 | and continues to be the way that I think
00:36:27.880 | about a lot of this.
00:36:29.180 | - You know, there are a couple of really interesting papers
00:36:32.480 | that have led to some practices,
00:36:33.880 | mainly in China where students focus on a fixation point
00:36:37.360 | before they sit down to do cognitive work.
00:36:39.460 | And it improves their attention
00:36:42.840 | and performance on cognitive work.
00:36:44.360 | And it sounds so silly to people.
00:36:45.680 | People think, oh, okay, I'm gonna stare at a dot
00:36:48.160 | and then you're gonna like stare at a dot
00:36:49.460 | at the given distance that I'm gonna do my work.
00:36:51.240 | How lame is that?
00:36:52.140 | Well, I think it's incredible
00:36:54.440 | because what you just said fully supports this idea
00:36:59.440 | that we're, well, we all agree here,
00:37:02.260 | and there's two of us, that we're mainly visual,
00:37:04.980 | even those of us that like to listen to music
00:37:06.860 | and things like that.
00:37:07.680 | And we're very somatic or, you know, very visual creatures
00:37:10.380 | and that where we place our visual attention
00:37:12.540 | and the size of the aperture of that attention,
00:37:15.180 | whether or not we're looking at a small box or a big box,
00:37:17.900 | not metaphorically, but literally determines the aperture
00:37:22.180 | of our attention going forward.
00:37:24.100 | In other words, I think this is such an important thing
00:37:26.700 | because when we look at a horizon
00:37:27.980 | or we walk through a city, you know,
00:37:29.860 | there's information flowing past us, you know,
00:37:32.180 | and all kinds of, you know,
00:37:33.740 | without us placing our eyes on any one particular point.
00:37:38.580 | And that people don't notice until they do this
00:37:42.620 | and they hear this, but that's very relaxing.
00:37:44.380 | We look at a horizon, it relaxes us.
00:37:45.780 | And that's because panoramic vision, non-foveated vision,
00:37:48.980 | is associated with a decrease in autonomic arousal.
00:37:53.100 | So has this been leveraged toward teaching kids
00:37:56.420 | and adults how to attend better?
00:37:58.020 | Because I think this is immensely valuable.
00:38:01.260 | I mean, this is a behaviorally driven pharmacology
00:38:05.660 | as I like to call it, because clearly there's a change
00:38:08.060 | in our chemistry when we do this sort of thing.
00:38:09.740 | - I mean, other than what you just said
00:38:11.460 | about the work that's done in, you know,
00:38:13.740 | what they're doing in China,
00:38:14.700 | which is entirely consistent with what I just said,
00:38:17.500 | I'm unaware of any utilization.
00:38:21.100 | I think it could be.
00:38:23.060 | I mean, I love that phrase that you just used, right?
00:38:26.500 | Which is, when we understand the underlying neurochemistry,
00:38:30.380 | let's say, that's great, but you're not gonna go in
00:38:33.660 | and directly manipulate people's neurochemistry.
00:38:35.980 | - No.
00:38:36.820 | - But if you can change the environment they're in,
00:38:39.300 | or you can change the state that they're in,
00:38:42.860 | behavioral state, cognitive state, emotional state,
00:38:45.300 | then that's an effective, potentially effective,
00:38:49.340 | practical, ethical, right?
00:38:52.300 | Way of having this kind of same or similar impact.
00:38:55.980 | - I'd like to take a quick break
00:38:58.780 | and thank our sponsor, AG-1.
00:39:00.940 | AG-1 is an all-in-one vitamin, mineral,
00:39:03.060 | probiotic drink with adaptogens.
00:39:05.740 | I've been taking AG-1 daily since 2012,
00:39:08.620 | so I'm delighted that they're sponsoring this podcast.
00:39:11.100 | The reason I started taking AG-1,
00:39:12.860 | and the reason I still take AG-1,
00:39:14.580 | is because it is the highest quality
00:39:16.060 | and most complete foundational nutritional supplement.
00:39:18.940 | What that means is that AG-1 ensures
00:39:20.940 | that you're getting all the necessary vitamins,
00:39:23.180 | minerals, and other micronutrients
00:39:25.060 | to form a strong foundation for your daily health.
00:39:27.740 | AG-1 also has probiotics and prebiotics
00:39:30.380 | that support a healthy gut microbiome.
00:39:32.340 | Your gut microbiome consists of trillions
00:39:34.180 | of microorganisms that line your digestive tract
00:39:37.060 | and impact things such as your immune system status,
00:39:39.540 | your metabolic health, your hormone health, and much more.
00:39:42.540 | So I've consistently found that when I take AG-1 daily,
00:39:45.540 | my digestion is improved, my immune system is more robust,
00:39:48.740 | and my mood and mental focus are at their best.
00:39:51.420 | In fact, if I could take just one supplement,
00:39:53.620 | that supplement would be AG-1.
00:39:55.820 | If you'd like to try AG-1,
00:39:57.220 | you can go to drinkag1.com/huberman
00:40:00.440 | to claim a special offer.
00:40:01.700 | They'll give you five free travel packs
00:40:03.420 | plus a year's supply of vitamin D3+K2 with your order of AG-1.
00:40:07.880 | Again, go to drinkag1.com/huberman
00:40:11.220 | to claim this special offer.
00:40:12.700 | Yeah, I think that so many people, including myself,
00:40:17.220 | think, okay, what's a way that I can increase my level
00:40:20.300 | of alertness and attention?
00:40:21.460 | Well, I have this gallery of caffeine.
00:40:23.740 | Actually, the middle one's water,
00:40:25.000 | for those that are just listening.
00:40:25.940 | I've got a mate gourd here, plenty of caffeine in there.
00:40:28.780 | I had a cold brew mate, plenty of caffeine in there.
00:40:30.820 | I had several, actually, and then water in the center.
00:40:33.700 | But caffeine raises our level of alertness
00:40:36.980 | and thereby attentional capabilities.
00:40:39.900 | But I think that most people are not familiar
00:40:41.680 | with using behavior as a way to increase
00:40:45.020 | their endogenous release of the neurochemicals
00:40:47.140 | that increase arousal and attention.
00:40:49.560 | And we just tend to over-rely on pharmacology,
00:40:53.220 | and I'm not against that.
00:40:54.340 | I use it, obviously.
00:40:55.640 | But what do you think it is?
00:40:59.020 | I mean, now I'm asking you to be a bit
00:41:00.380 | of a cultural anthropologist.
00:41:02.900 | What do you think it is that has led people
00:41:07.340 | in the United States and Europe to mainly focus
00:41:12.300 | on this idea that if you can't attend easily,
00:41:15.340 | that it's a pharmacologic issue,
00:41:18.580 | that behavioral tools are not as useful?
00:41:22.140 | Because the experiment you described is so cool, right?
00:41:25.100 | Look at dots that are close together.
00:41:27.700 | Then cognitive space becomes kind of more bundled
00:41:32.140 | into a tighter bundle.
00:41:33.500 | Look at dots that are more dispersed,
00:41:35.860 | and you tend to kind of disperse your cognition.
00:41:38.740 | It becomes almost like more of a creative exploration.
00:41:42.180 | Maybe this is why my friend Rick Rubin,
00:41:43.980 | whose name is sort of synonymous with creativity,
00:41:46.200 | 'cause he wrote that amazing book, "The Creative Act,"
00:41:47.980 | is so into sky and clouds and sunsets and space, open space.
00:41:52.980 | Rarely have I ever heard Rick say,
00:41:54.740 | "Hey, you know, you should stare into a little soda straw."
00:41:58.460 | I'd love for you to just kind of riff
00:41:59.960 | on what you think some of the better tools are
00:42:02.380 | for improving attention and focus,
00:42:05.340 | and whether or not you think we're really as challenged
00:42:08.540 | in that as many people assume.
00:42:11.660 | - Well, I don't think we're that challenged.
00:42:14.820 | I think, as I mentioned earlier,
00:42:16.620 | our brains are just performing the computations
00:42:19.760 | that they have been endowed with
00:42:21.580 | by millions of years of evolution,
00:42:24.060 | which is to allocate attention, to allocate behavior,
00:42:27.300 | to allocate focus according to how rich, I'll call it rich,
00:42:32.300 | or poor the environment is,
00:42:35.680 | how many different sources are there.
00:42:38.380 | And so, those are the rules your brain lives by,
00:42:42.780 | and you're not really going to change those.
00:42:44.540 | I mean, you could kind of modulate up and down a little bit,
00:42:47.200 | whether that's through neurochemistry
00:42:48.700 | or other kinds of things,
00:42:49.580 | but ultimately, it's, in this case,
00:42:51.980 | the brain in the environment that it's in.
00:42:54.300 | So, from my perspective,
00:42:56.060 | the best thing you could do is just change the environment,
00:42:59.800 | put those devices away, you know,
00:43:02.620 | to enable you to focus, right?
00:43:04.980 | And so, anyway, I don't know
00:43:08.580 | if I had that much more to say on that topic.
00:43:10.460 | - No, I think what's great about this
00:43:12.780 | is that you're essentially pointing to the fact
00:43:14.700 | that we have control.
00:43:16.600 | We're not somehow deficient or messed up
00:43:20.760 | if we find ourselves having a hard time
00:43:23.860 | directing our attention,
00:43:25.320 | because we've been training ourselves to scroll.
00:43:29.420 | We've been training ourselves
00:43:30.540 | to redirect our attention constantly to new things.
00:43:33.820 | I mean, as you can probably tell,
00:43:35.020 | I'm a big fan of intervening in that process
00:43:37.640 | so that one has the ability to drop into focused work.
00:43:40.780 | I do feel as if progress in life, you know,
00:43:45.580 | scales fairly directly with the ability to focus
00:43:48.820 | on one thing for some period of time
00:43:50.660 | for sake of, you know, learning in school,
00:43:54.780 | for sake of sport, for sake of relationships,
00:43:57.060 | the ability to have like a real connection to somebody,
00:43:59.920 | you know, and we're going to get into a discussion
00:44:03.740 | about social interactions in a bit.
00:44:06.100 | But when it comes to foraging,
00:44:11.100 | do you find that people fall out
00:44:13.060 | into different kind of clusters
00:44:14.620 | of how they forage for information?
00:44:16.420 | And what are some of the themes of that
00:44:18.660 | or kind of signatures of the different groups?
00:44:22.940 | - Yeah, that's a great question.
00:44:23.780 | We haven't really approached it
00:44:25.500 | with the idea that there are clusters,
00:44:27.140 | but rather that there's, let's say, a continuum
00:44:31.540 | of being either, you know,
00:44:33.860 | most people are somewhere in the middle, of course,
00:44:36.540 | but some folks hyper-focused, right?
00:44:39.340 | And you might just metaphorically imagine them
00:44:41.660 | at the extreme of like obsessive-compulsive almost, right?
00:44:44.900 | You can't get unstuck from a routine.
00:44:47.020 | And at the other end would be folks
00:44:49.020 | who explore too readily, right?
00:44:51.940 | So folks who we would say have attention deficit
00:44:55.380 | hyperactivity disorder.
00:44:56.860 | And so folks fall somewhere along that distribution.
00:45:00.060 | Now, we've seen that there are differences between species
00:45:04.140 | in terms of where they are on that.
00:45:06.300 | Difference is a function of age in humans.
00:45:10.740 | So you kind of move from being more hyper-exploratory
00:45:14.460 | toward more focused as you get older.
00:45:17.980 | - Oh, good.
00:45:18.820 | - And that also one of the things
00:45:21.620 | that we've talked about a lot
00:45:23.460 | is that that variation where you're on that continuum
00:45:29.060 | might make you more or less suited
00:45:32.460 | to different types of careers, different types of jobs.
00:45:35.860 | It's not to say that people can't change,
00:45:37.460 | but think of it this way.
00:45:39.140 | For, you've got a dial that goes from super-focused
00:45:42.420 | to a major explorer, and creativity goes along with that.
00:45:46.380 | One person might come with their dial set at three,
00:45:50.340 | another person at seven.
00:45:52.260 | And you could help that person at a three,
00:45:55.420 | maybe turn theirs to five, but probably not to 10, right?
00:45:58.180 | The person who's at seven,
00:45:59.580 | you could turn them up to nine, right?
00:46:01.220 | So through various kinds of practices.
00:46:03.820 | 'Cause I think it's really important to just recognize
00:46:05.860 | that people do vary, and that variation we pick up on
00:46:10.860 | in the sort of neurological context of like issues,
00:46:16.740 | problems that people experience
00:46:20.500 | like with focus in school, et cetera, like that.
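The focus-to-explorer "dial" maps naturally onto the exploration temperature used in standard explore/exploit models of choice. This is a hypothetical illustration of that analogy, not a model from Dr. Platt's lab; the option values and temperatures are made up for the example:

```python
import math
import random

def softmax_choice(values, temperature):
    # Higher temperature = dial turned toward exploration;
    # lower temperature = sticking with the best-known option.
    weights = [math.exp(v / temperature) for v in values]
    total = sum(weights)
    r = random.random() * total
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(values) - 1

random.seed(0)
option_values = [1.0, 0.5, 0.2]  # estimated payoffs of three options
focused = sum(softmax_choice(option_values, 0.1) == 0 for _ in range(1000))
explorer = sum(softmax_choice(option_values, 2.0) == 0 for _ in range(1000))
# The low-temperature agent almost always exploits the best-known option;
# the high-temperature agent samples all three far more evenly.
```

In this framing, "turning a three up to a five" is a modest shift in temperature, not a wholesale change in the underlying choice rule.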
00:46:22.900 | - People are no doubt wondering,
00:46:25.340 | well, if I am good at dropping into a trench
00:46:28.540 | and focusing my attention for long periods of time,
00:46:31.820 | maybe it's more obvious what types of careers
00:46:33.740 | that person would be better at.
00:46:36.340 | You know, maybe it's programming or writing
00:46:38.060 | or who knows, painting.
00:46:40.020 | But when you have somebody whose attention
00:46:42.740 | tends to flip between different things,
00:46:45.260 | what sorts of professions do they align well with?
00:46:48.580 | - Yeah, that aligns with creative professions.
00:46:50.620 | So, and also being entrepreneurs.
00:46:52.660 | Actually, if you look at the data on entrepreneurs,
00:46:56.140 | the rate of attention problems is two, three, four X
00:47:02.020 | the general population. You also see that
00:47:06.620 | it's often comorbid with other issues
00:47:08.660 | related to anxiety, bipolar, et cetera.
00:47:11.820 | So they've kind of like all clustered there
00:47:13.980 | with a real issue on that sort of focus.
00:47:17.940 | And we work with a team out in Berkeley, actually,
00:47:22.580 | that provides support to entrepreneurs
00:47:27.060 | so that they can do their best, do their thing,
00:47:32.060 | which is to be like wildly creative, right?
00:47:35.140 | And innovative, I should say.
00:47:38.540 | But when they need that focus, so they can have it.
00:47:41.700 | And we have a big research project going on right now,
00:47:45.140 | looking at entrepreneurs in California
00:47:46.780 | and also MBA students at Wharton
00:47:49.140 | to just kind of try to identify
00:47:51.820 | the prevalence of these issues
00:47:53.540 | and then to potentially provide support for them.
00:47:58.420 | And that support could take any number of different forms.
00:48:00.780 | It could be true psychiatric support
00:48:02.900 | in the sense of like maybe attention-focusing
00:48:07.340 | pharmaceuticals, drugs like Ritalin, Adderall,
00:48:10.620 | which can be used appropriately,
00:48:13.320 | but that doesn't rob those individuals of their mojo.
00:48:17.340 | But in other cases, it's gonna be more like changing their,
00:48:20.020 | providing an ecosystem, right?
00:48:21.980 | So where they can learn focusing practices,
00:48:24.940 | as we've already talked about,
00:48:26.680 | where when they build their teams,
00:48:29.220 | they can build complementary strengths
00:48:32.660 | in the people that surround them
00:48:34.700 | so that they're much more likely to be successful.
00:48:37.540 | And our economy depends on those people being successful,
00:48:41.540 | right, so that's where the vast majority
00:48:43.040 | of economic activity is coming from,
00:48:45.280 | is people who start small businesses,
00:48:47.520 | who are entrepreneurs and who are innovators.
00:48:49.820 | So it makes all the sense in the world to do that.
00:48:53.980 | I think we've been neglecting all this.
00:48:55.380 | Now, actually, the thing I wanted to say earlier about this
00:48:59.420 | and that where I think neuroscience gives us a new tool
00:49:04.420 | to approach a lot of these business questions
00:49:09.020 | is that let's imagine you're hiring, right?
00:49:12.300 | And you're hiring, well, we need a creative type, okay?
00:49:15.020 | So you put an ad out and you get, you know,
00:49:19.700 | resumes and responses and people come in for interviews.
00:49:22.340 | How do you measure that creativity, typically?
00:49:25.440 | Are you gonna say, "Oh, how creative are you?"
00:49:26.940 | And you're like, "You really want the job."
00:49:28.300 | So you're like, "Yeah, I'm super creative," you know?
00:49:30.880 | Or you give them a personality test, for example,
00:49:32.900 | or, you know, like Myers-Briggs or something like that.
00:49:35.460 | And we know those are not particularly accurate
00:49:40.340 | and self-report can be not only inaccurate,
00:49:44.940 | but biased and biased by the context.
00:49:49.400 | Why am I here?
00:49:50.240 | Who's asking me a question?
00:49:51.300 | How is that question asked?
00:49:53.180 | Whereas the neuroscience, neuroscience gives us tools
00:49:56.680 | to kind of measure those things directly.
00:49:59.320 | And in some cases, you could measure it directly
00:50:01.060 | from the brain and we do that,
00:50:02.580 | but that's not going to be practical,
00:50:05.100 | not gonna be scalable, right?
00:50:07.740 | Not gonna be something a lot of people want to, you know,
00:50:11.860 | embrace, let's say, as applicants.
00:50:14.140 | But find ways to interrogate the brain
00:50:18.120 | that are not asking people to assess themselves.
00:50:21.900 | - For instance, what would a small number of questions
00:50:25.220 | be that- - Well, not even questions.
00:50:26.980 | One of the things that we've done is develop games,
00:50:29.980 | like brief, little, very engaging games
00:50:32.820 | that are based on specific tasks
00:50:36.980 | that we know interrogate specific circuits in the brain,
00:50:40.300 | like foraging, for example,
00:50:42.820 | where, you know, people are literally harvesting berries,
00:50:46.620 | let's say, okay, and they're going along,
00:50:49.820 | and the goal is to kind of get as many as you can.
00:50:52.720 | And from their behavior, we can figure out exactly
00:50:56.140 | where they are on that continuum, mathematically,
00:51:00.460 | and say, okay, well, in the dashboard that we create,
00:51:03.620 | like, okay, you are pitched a bit more
00:51:06.340 | toward being an innovator and creative type,
00:51:09.260 | and explorer, and less, so less likely to be, say,
00:51:12.820 | a good manager, who would need to, you know,
00:51:15.460 | sort of have a higher degree of focus.
00:51:18.020 | And we do that for a number of different aspects
00:51:23.020 | of cognitive and emotional performance.
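The patch-foraging logic behind that game, harvest from a patch with diminishing returns, then decide when to leave for a fresh one, can be sketched as a toy model. Everything below is a hypothetical illustration for intuition only: the function names, the decay constant, and the 0–10 "dial" mapping are assumptions, not the actual game or scoring that Platt's group uses.

```python
import math

# Toy patch-foraging model: the marginal value theorem says to leave a patch
# when its instantaneous yield falls to the environment's long-run reward rate.

def patch_yield(t, r0=10.0, decay=0.3):
    """Instantaneous berries/second after t seconds in a patch (decays over time)."""
    return r0 * math.exp(-decay * t)

def total_harvest(t, r0=10.0, decay=0.3):
    """Cumulative berries after staying t seconds (integral of patch_yield)."""
    return (r0 / decay) * (1 - math.exp(-decay * t))

def optimal_leave_time(travel=5.0, r0=10.0, decay=0.3):
    """Scan candidate leave times; maximize overall rate = harvest / (stay + travel)."""
    best_t, best_rate = 0.0, 0.0
    for i in range(1, 2000):
        t = i * 0.01
        rate = total_harvest(t, r0, decay) / (t + travel)
        if rate > best_rate:
            best_t, best_rate = t, rate
    return best_t

def explore_exploit_score(observed_stays, travel=5.0):
    """Map a player's mean patch-residence time to a 0-10 dial:
    leaving earlier than optimal -> explorer (low), overstaying -> focused (high)."""
    t_opt = optimal_leave_time(travel)
    mean_stay = sum(observed_stays) / len(observed_stays)
    ratio = mean_stay / t_opt  # 1.0 means MVT-optimal patch-leaving
    return max(0.0, min(10.0, 5.0 * ratio))
```

The point of the sketch is just that a few patch-leaving decisions are enough to locate someone on the continuum mathematically: a player whose stays average half the optimal time scores toward the explorer end, one who overstays scores toward the focused end.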
00:51:26.300 | So things like, in terms of social competence, for example,
00:51:31.300 | and so we have a little, actually a little game.
00:51:33.740 | It mimics soccer.
00:51:35.780 | And we've had monkeys play it, humans play it.
00:51:37.580 | We know exactly what it, what it kind of elicits
00:51:40.860 | from the brain and what circuits it relies on.
00:51:43.540 | And that allows us to numerically, you know,
00:51:48.540 | identify, like, your strategic planning abilities,
00:51:51.780 | or something like your theory of mind,
00:51:54.100 | getting in the head of an opponent.
00:51:56.940 | And those games, we found, it's really been very gratifying
00:52:01.940 | to demonstrate that those predict performance
00:52:07.160 | in a number of different jobs, in high-performance jobs,
00:52:10.020 | like soccer players, actual soccer players,
00:52:12.860 | but also in the military, in cyber operations.
00:52:17.860 | And so we're now exploring, and we've helped to stand up
00:52:21.860 | a startup company in Philadelphia that is actually,
00:52:25.340 | you know, that's their mission, is to go out
00:52:28.100 | and try to use those tools to see if they can do better
00:52:32.240 | than basically a whole bunch of questions.
00:52:34.900 | - Yeah, it certainly goes way beyond
00:52:37.200 | kind of typical Myers-Briggs
00:52:38.680 | or Enneagram-type personality tests,
00:52:42.920 | which I think have certain value.
00:52:42.920 | If nothing else, they, you know,
00:52:44.840 | people like to know about themselves.
00:52:46.240 | And I do think categorizing oneself a little bit,
00:52:49.520 | according to, like, are you a three on the Enneagram,
00:52:51.760 | or a four, or an eight, or you know what,
00:52:54.280 | certainly gives you a frame of reference.
00:52:57.440 | But yeah, it doesn't seem very useful
00:52:59.700 | for the kinds of work environments that you're describing,
00:53:03.280 | whereas what you're describing sounds
00:53:04.620 | much more sophisticated.
00:53:06.740 | You mentioned theory of mind.
00:53:08.380 | We should talk about theory of mind,
00:53:10.160 | because here we are back to visual neuroscience,
00:53:12.900 | but I have the understanding,
00:53:16.180 | you can tell me if I'm right or wrong,
00:53:17.620 | that as old world primates,
00:53:19.400 | one of the more impressive features that we've developed
00:53:23.300 | is the ability to attend to a location with our eyes,
00:53:26.900 | but pay attention to something else in the periphery.
00:53:29.660 | People used to refer to this
00:53:31.120 | as the other cocktail party effect.
00:53:34.860 | The cocktail party effect is the ability
00:53:36.260 | to pay attention to a conversation
00:53:37.740 | while there's stuff in the background,
00:53:38.820 | but this is the other cocktail party effect
00:53:40.900 | that sometimes, with some chuckles,
00:53:44.820 | gets described as, you know,
00:53:47.500 | you're out to dinner with somebody,
00:53:50.400 | and you're listening to them,
00:53:51.860 | and you're paying attention to them,
00:53:52.920 | but you're also paying attention
00:53:54.240 | to the conversation next to you,
00:53:55.820 | or maybe someone else at the bar.
00:53:58.740 | You know, you can fill in the blanks there.
00:54:01.620 | This is an amazing ability,
00:54:03.460 | regardless of what it's used for,
00:54:06.740 | that a lot of other primate species don't have.
00:54:10.260 | - I mean, as far as I know, no other species have,
00:54:14.020 | so this seems to be,
00:54:16.340 | we know macaques can do this, for example,
00:54:18.180 | and humans do this routinely.
00:54:20.220 | We assume all apes do this,
00:54:22.940 | and the adaptive explanation is, I think,
00:54:27.740 | exactly what you're alluding to,
00:54:28.940 | which is the fact that when you live
00:54:31.300 | in a complex, multilevel society
00:54:34.260 | with differentiated relationships,
00:54:35.980 | where the things that matter to you
00:54:37.860 | are your family, your rank, your status,
00:54:42.860 | your friends, your enemies, all those kinds of things,
00:54:45.840 | that then creates a really complex environment
00:54:50.660 | for, as you said, devoting your attention,
00:54:52.220 | because where we look is,
00:54:56.360 | typically, the focus of our attention
00:54:58.140 | and what's turned up, and other brains know that, right?
00:55:02.140 | And so now, let's imagine you're a baboon, right?
00:55:05.320 | And you're not the highest-ranking baboon,
00:55:09.460 | and the high-ranking, you know, the alpha's over there,
00:55:11.940 | and so you train your gaze on that alpha baboon,
00:55:15.540 | but there's a really attractive female over here
00:55:17.460 | that you want to know where she's heading,
00:55:19.940 | because that's a good mating opportunity later.
00:55:22.780 | So it's that ability to kind of split attention
00:55:24.620 | from your overt attention, what your gaze is pointed at,
00:55:28.980 | and covertly, what you're amplifying
00:55:32.300 | and tracking in the environment.
00:55:34.700 | And there's this, you know,
00:55:37.220 | to tie this back to theory of mind,
00:55:39.020 | there's, I think, it's reasonable
00:55:43.100 | and consistent with some of the data,
00:55:45.200 | that theory of mind, which is a sense
00:55:48.620 | of being able to infer what somebody else knows,
00:55:52.860 | what they can see, right, what they want,
00:55:56.020 | their state of mind, which might be different from yours,
00:56:00.460 | that it develops through the way that,
00:56:03.220 | as infants and young children,
00:56:05.420 | our experience of first gazing at a caregiver,
00:56:10.420 | maintaining attention with them,
00:56:11.660 | and then learning to follow their gaze
00:56:14.820 | when they look somewhere and they say,
00:56:16.140 | "Hey, that's a, you know, that's an apple," or whatever,
00:56:19.420 | that you do the same thing,
00:56:20.940 | and that gaze following, then,
00:56:22.340 | is a precursor to joint attention,
00:56:25.340 | and joint attention being really important
00:56:27.520 | for the development of this, of the theory of mind,
00:56:31.380 | which is our sense of being able to understand,
00:56:34.580 | make predictions, make inferences
00:56:36.020 | about what's going on in somebody else's head.
00:56:38.500 | - I feel like the overlap of covert attention
00:56:43.500 | and theory of mind, as you described,
00:56:46.360 | comes from this assumption that I have,
00:56:49.040 | which is that we have effectively
00:56:52.100 | two spotlights of attention, and that we can merge them,
00:56:55.420 | so I can place all my attention on you
00:56:57.340 | and what we're talking about in your face, et cetera,
00:56:59.140 | or I can split my attention between you
00:57:01.280 | and, you know, something over there in the corner,
00:57:03.960 | or I can take that second spotlight of attention
00:57:06.140 | and place it on myself, like, oh, you know,
00:57:08.540 | like I need to move to the side
00:57:10.300 | 'cause I've got a little, you know,
00:57:11.940 | you know, maybe an itch on my thigh or something like that,
00:57:15.140 | but I don't think we have three spotlights
00:57:17.140 | that we can work with very easily anyway.
00:57:19.680 | Maybe we could train that up,
00:57:21.200 | but that we don't naturally have
00:57:23.120 | more than two spotlights of attention.
00:57:25.240 | We can merge these two spotlights of attention,
00:57:27.360 | and I feel like, and I've done some practice at this,
00:57:30.600 | just 'cause I'm a neuroscientist and I like to try things,
00:57:33.480 | of ramping up my level of focus,
00:57:35.620 | just trying to really, like I'm doing it right now,
00:57:37.860 | I'm looking at you and like the contour
00:57:39.920 | of your shape against the background,
00:57:41.560 | like I can really decide to emphasize those borders.
00:57:44.200 | I'm not really doing anything behaviorally.
00:57:46.680 | It's different than I was a few moments ago,
00:57:48.820 | but then I could also bring that spotlight of attention
00:57:51.160 | kind of down a little bit in an intensity.
00:57:53.140 | So I feel like we have two spotlights of attention
00:57:55.360 | that we can ramp up in intensity,
00:57:56.880 | and we don't normally do this so consciously.
00:57:58.680 | Normally we're more in stimulus response,
00:58:01.300 | and I think about this a lot nowadays because,
00:58:03.600 | and forgive me for referencing previous podcasts,
00:58:05.600 | but we had this brilliant, absolutely brilliant,
00:58:08.160 | 84-year-old psychoanalyst, Jungian analyst
00:58:11.360 | named James Hollis on the podcast,
00:58:16.360 | and he talked about what it is to be human
00:58:19.680 | and to create a life.
00:58:20.620 | And it boiled down basically to two things,
00:58:22.400 | which was to acknowledge that we're in stimulus response
00:58:25.380 | a lot of the day and how to be functional in that domain
00:58:27.480 | was a lot of that conversation,
00:58:28.620 | but that there's this essential aspect to life,
00:58:31.800 | which is to get out of stimulus response
00:58:33.920 | and bring those spotlights of attention inward
00:58:36.360 | and to think and to reflect,
00:58:38.480 | and then go back into stimulus response.
00:58:40.920 | And when we just sleep, wake up
00:58:42.400 | and go into stimulus response all day,
00:58:44.200 | or if we go meditate all day
00:58:46.120 | and are not in stimulus response, neither is good.
00:58:48.380 | So it's that balance.
00:58:49.860 | And so this notion of two spotlights of attention,
00:58:52.720 | I'd love for you to tell me this is like complete BS
00:58:55.480 | or that it works.
00:58:56.600 | I don't need to be validated here.
00:58:58.560 | We're putting it out there as a hypothesis
00:59:00.140 | because it feels true to me,
00:59:01.960 | but that's obviously just a feeling.
00:59:04.480 | - Well, I think that feeling, as far as I know,
00:59:08.560 | is consistent with what we understand
00:59:10.920 | about how attention can,
00:59:13.640 | how it amplifies the visual signals or other signals
00:59:17.400 | that are coming into our brains
00:59:18.960 | and the ways in which we can kind of,
00:59:21.160 | I don't know if it's divided purely
00:59:23.920 | or if it sort of bleeds over,
00:59:25.720 | what that really exactly looks like.
00:59:28.120 | But the landscape,
00:59:28.960 | let's imagine it's a landscape of neural activity
00:59:31.560 | and you can kind of raise up two humps or just one hump.
00:59:35.680 | And it doesn't feel like you can go beyond that.
00:59:39.760 | That's really, really hard to measure.
00:59:42.920 | And I think our best data on that
00:59:45.960 | comes from recording the activity of neurons
00:59:49.040 | in macaques and monkeys while they are doing
00:59:53.360 | these sorts of attention and visual discrimination tasks.
00:59:55.960 | And I think that'd be really, really hard
00:59:58.080 | to actually elicit that kind of behavior from them.
01:00:03.080 | - Well, we both agree.
01:00:08.320 | I know because we were talking before,
01:00:09.560 | we started recording that certain types of stimuli
01:00:13.280 | really grab our attention and influence our decisions
01:00:17.800 | and our valuation of things out there in the world.
01:00:20.800 | So talk to me about monkey porn.
01:00:23.320 | - Okay, we never called it monkey porn,
01:00:26.400 | but a lot of people- - We're calling it
01:00:27.560 | monkey porn here. - A lot of people have said
01:00:29.080 | that essentially, no matter what else I do in my career,
01:00:33.120 | that's gonna be on my tombstone.
01:00:35.760 | - This man worked on monkey,
01:00:37.080 | this man unpacked the neurobiology of monkey porn.
01:00:39.960 | - Okay, so let's go back in the way back machine.
01:00:42.960 | And so back when I was an anthropologist
01:00:45.840 | and I'm going out, I'm watching monkeys
01:00:47.600 | and it's very clear that there are certain things
01:00:52.440 | in the world that are important to them
01:00:54.840 | that they prioritize.
01:00:55.680 | And those, they're the same things that we do.
01:00:59.440 | So they pay attention to each other, to their faces,
01:01:02.840 | but also to other cues.
01:01:05.840 | And these cues seem to have adaptive significance,
01:01:10.840 | that they're relevant for your ability
01:01:13.520 | to survive and reproduce,
01:01:15.660 | which is the name of the game for evolution.
01:01:17.480 | That's all that really counts.
01:01:19.640 | Okay, and what are those things?
01:01:20.860 | Well, they're cues to status.
01:01:23.320 | Like, so who's dominant, who's subordinate?
01:01:26.040 | Who can take my stuff?
01:01:27.240 | Who do I gotta watch out for?
01:01:28.560 | Who can I dominate and take stuff from?
01:01:31.280 | And cues to sort of mate quality, mating opportunities.
01:01:36.280 | And if you look at non-human primates,
01:01:39.360 | they display those things very conspicuously, right?
01:01:42.160 | So males have these big canines
01:01:44.960 | and they have sort of physical dominant features,
01:01:47.200 | very square jaw, all that kind of stuff.
01:01:49.160 | And females, for example, in macaques,
01:01:52.960 | display their state, their hormonal state,
01:01:57.240 | how receptive they are to mating
01:01:58.880 | and likelihood of ovulating at that time
01:02:02.480 | through the sort of the swelling
01:02:04.360 | and coloration on their perineum.
01:02:07.120 | Here's a good word for your listeners, perineum,
01:02:09.560 | which we introduced, I think,
01:02:11.000 | into the neuroscience literature.
01:02:12.800 | And that's just the sort of anogenital region.
01:02:15.240 | So that's where they're putting a lot of--
01:02:17.200 | - Someone else on here.
01:02:18.240 | - Signaling, taint.
01:02:19.360 | - Listen, another card-carrying researcher,
01:02:23.800 | Dr. Shana Swan, excuse me,
01:02:27.520 | came on here to talk about phthalates
01:02:29.400 | and microplastics and endocrine disruptors.
01:02:31.360 | She spent a career working on this.
01:02:32.600 | She's a serious scientist.
01:02:34.360 | And she talked about how taint sizes
01:02:37.440 | are diminishing in males
01:02:41.640 | by virtue of endocrine disruptors
01:02:44.960 | accessing the fetus during pregnancy.
01:02:46.800 | This is a statistically very robust effect.
01:02:50.080 | I know we're gonna get into a discussion
01:02:51.400 | about fertility later
01:02:53.200 | because you've worked on this issue as well.
01:02:55.080 | So we can say the perineum, taint,
01:02:57.760 | and now everyone knows what we're talking about.
01:02:59.840 | So the females display their perineum region
01:03:03.680 | differently when they're ovulating.
01:03:06.720 | - Yeah, so it becomes redder, fuller, et cetera.
01:03:09.080 | So if you go to the zoo and you just, you could say,
01:03:11.400 | you see the monkeys with the red butt, big red butts,
01:03:13.980 | they're the ones who are, the females who are,
01:03:16.000 | it turns out the males do that too.
01:03:18.040 | So males signal kind of their circulating testosterone levels
01:03:22.800 | by how red their taint is.
01:03:26.160 | And actually even you can just see the physical size
01:03:28.540 | of their testes is a pretty good proxy in a cue.
01:03:32.360 | And in rhesus macaques,
01:03:34.040 | there's also kind of these signals around the eyes
01:03:36.520 | that get a little bit darker.
01:03:38.500 | The theory is that humans,
01:03:42.880 | so for a long time people said,
01:03:44.040 | oh, humans don't display their,
01:03:46.800 | anything about their hormonal biological state,
01:03:50.160 | to promote monogamy and all kinds of stuff like that.
01:03:53.360 | Even though it seems that monogamy,
01:03:57.680 | monogamy in terms of mating,
01:03:59.680 | does not seem to be the dominant strategy in humans.
01:04:04.680 | Let's call it that.
01:04:07.360 | - Yeah, but just to make sure that I'm clear on this,
01:04:10.040 | it used to be said,
01:04:11.360 | you are saying that it used to be said
01:04:13.960 | that humans don't signal their hormonal status.
01:04:17.360 | And the reason people were saying that
01:04:20.120 | is because it was a promotion of monogamous behavior,
01:04:23.680 | which is actually not true in humans?
01:04:26.760 | - Well, so this goes back to Darwin, really,
01:04:29.800 | who sort of theorized that humans during human evolution,
01:04:34.760 | that as monogamy became more adaptive for whatever reason,
01:04:39.760 | it's all speculation, right?
01:04:41.660 | That these sort of cues were hidden.
01:04:45.820 | So that males couldn't,
01:04:48.480 | you wouldn't be encouraged
01:04:51.820 | to find other mating opportunities
01:04:53.980 | outside your monogamous relationship.
01:04:56.660 | And so it would kind of keep the focus,
01:04:59.960 | to get back to that, on your partner.
01:05:03.220 | But all the data that's out there,
01:05:05.140 | both from when societies were encountered
01:05:08.940 | by Western scientists,
01:05:10.700 | like whether polygyny was practiced or not,
01:05:13.240 | to just what we understand about
01:05:17.680 | extra-pair matings, offspring, et cetera.
01:05:23.020 | Strict monogamy does not seem
01:05:26.880 | to have been the dominant strategy.
01:05:32.940 | Now that's also consistent with the observation
01:05:35.020 | that we are a sexually dimorphic species.
01:05:38.020 | So if, when you look at the animal kingdom,
01:05:41.420 | you're in primates in particular,
01:05:43.540 | those that are obligate pair bonded monogamous primates,
01:05:48.500 | males and females don't really differ much.
01:05:50.220 | Like if you look at marmosets or tamarins.
01:05:51.900 | - In terms of body size.
01:05:52.940 | - Body size, coloration,
01:05:54.980 | conspicuous sexual characteristics.
01:05:57.380 | - Brain structure as well.
01:05:58.780 | - That's another interesting point,
01:06:01.100 | which we can circle back to.
01:06:02.820 | But even if you just look at, well, we can,
01:06:04.560 | even if you just look at brain size,
01:06:06.020 | relative brain size relative to body size,
01:06:09.180 | that is smallest in pair bonded monogamous species.
01:06:14.180 | - The difference in brain size.
01:06:18.500 | - Not between males and females, but just overall.
01:06:20.820 | And it sort of scales up with group size
01:06:25.060 | and group complexity.
01:06:26.340 | It's slightly different, but there's a point there,
01:06:28.720 | which is that, well, pair bonded monogamous species
01:06:32.540 | look very, very different, right?
01:06:35.100 | Different, I'm sorry, different than us.
01:06:36.900 | - It's very unusual, let's just say this.
01:06:38.260 | So it's very unusual, right?
01:06:41.180 | In mammals overall, it's very unusual in primates.
01:06:43.740 | There's only a few, you know.
01:06:45.060 | - Monogamous primates.
01:06:46.060 | - Monogamous obligate pair bonded primates.
01:06:48.840 | And in general, their behavior is not as complicated
01:06:54.740 | or complex as individuals that live in societies
01:06:58.260 | where there's a lot more going on
01:07:01.780 | in terms of strategizing to attain mating opportunities
01:07:06.780 | through, you know, either through sort of physical challenge
01:07:12.260 | or through, you know, being sneaky or, you know,
01:07:16.620 | or making friends, et cetera, et cetera.
01:07:18.420 | There's this sort of proliferation of different strategies
01:07:22.300 | that requires a lot more mental calculation, apparently,
01:07:25.220 | that goes hand in hand with an increase in brain size
01:07:30.380 | and brain size, cortex size, et cetera.
01:07:33.940 | - Which makes sense from the standpoint
01:07:35.420 | of like more prefrontal cortex,
01:07:37.060 | more context-dependent strategy setting and decision-making.
01:07:40.180 | And it could be based on,
01:07:41.700 | it seems that with more prefrontal cortex,
01:07:44.500 | one can, a species can incorporate
01:07:48.540 | different valuations of mates.
01:07:50.660 | It can be about hormonal status.
01:07:52.540 | And I want to make sure we get back to that,
01:07:53.780 | how humans signal hormonal status.
01:07:56.820 | But it could also be about, you know,
01:08:00.340 | reproductive potential as it relates to resource allocation
01:08:04.000 | or whether or not there'll be a good caretaker.
01:08:06.020 | I mean, a lot of additional factors can be incorporated in
01:08:11.020 | and working with more variables flexibly
01:08:15.100 | requires more neural real estate,
01:08:17.380 | mostly in prefrontal cortex, right?
01:08:19.860 | - Absolutely.
01:08:20.700 | Although I will, based on a paper we published last year
01:08:24.180 | in Nature, I would say that our notions
01:08:26.980 | of sort of the breakdowns of like where stuff is in the brain
01:08:30.980 | and how it's encoded, I think is going to change a lot.
01:08:34.020 | And there are a number of other studies
01:08:35.580 | that have come out in the last year or so that echo this.
01:08:38.780 | And so this was a paper in which we did something
01:08:42.180 | unthinkable, I think, in the sort of history of neuroscience,
01:08:45.420 | which is all about reduction.
01:08:46.500 | Let's make the experiment as simple as possible,
01:08:51.500 | only very one thing, right?
01:08:53.580 | And we're going to find where that one thing is in the brain.
01:08:56.180 | And that's the tradition going back to Hubel and Wiesel,
01:08:58.620 | right?
01:08:59.460 | - Hubel and Wiesel, folks, my scientific great-grandparents.
01:09:02.060 | No, we were bound to do it sooner or later.
01:09:03.740 | They won the Nobel prize for their understanding,
01:09:05.940 | for their parsing of the neural basis of vision,
01:09:09.580 | neuroplasticity, et cetera.
01:09:11.060 | Torsten's still alive.
01:09:14.140 | I think he's like a hundred now.
01:09:16.100 | Last time I saw him, he was 96
01:09:18.020 | and he was still jogging and doing art.
01:09:20.420 | David passed away.
01:09:21.660 | Amazing, you can look it up.
01:09:22.660 | H and W, we call them, Hubel and Wiesel.
01:09:24.260 | They're among our,
01:09:25.980 | they're on the Mount Rushmore of neuroscience.
01:09:29.620 | And we'll get back to this.
01:09:30.700 | So, please, yeah, explain to us what this paper showed.
01:09:35.700 | And then we will then talk about
01:09:40.180 | how humans signal their hormonal status.
01:09:42.460 | - And we'll go all the way back to monkey porn, I hope,
01:09:44.020 | because it's really near and dear to my heart.
01:09:46.700 | - We won't leave monkey porn in the past.
01:09:48.660 | - So near and dear to my heart.
01:09:50.940 | - Okay, so, Hubel and Wiesel,
01:09:53.500 | let's, we're gonna really simplify
01:09:57.140 | because that's how we figure out exactly how it works.
01:10:00.980 | But it's not what our brains do.
01:10:02.940 | That's not the environment our brains are in.
01:10:04.340 | I mean, when you're out there in the world,
01:10:06.180 | you've got this just incredibly complex visual environment,
01:10:10.500 | social environment.
01:10:11.340 | And what you do in any moment
01:10:12.700 | depends on what you experienced recently,
01:10:15.020 | what you think might happen next,
01:10:17.540 | what might've happened last week in a similar circumstance.
01:10:19.900 | It's super complicated.
01:10:21.380 | And it reflects all these different
01:10:23.220 | competing interests and values.
01:10:26.660 | And that's true for monkeys too, okay?
01:10:29.580 | And so, we did the dream, my dream experiment
01:10:32.980 | from back when I was an anthropologist,
01:10:34.540 | which was to get rid of the lab, okay?
01:10:38.020 | And instead, we recorded wirelessly
01:10:42.060 | from thousands of neurons in the brain
01:10:45.860 | in prefrontal cortex, which you mentioned,
01:10:48.140 | and we tend to think of as being
01:10:50.100 | important for decision-making
01:10:51.460 | and kind of setting goals and context.
01:10:54.500 | And also, the sort of high-level visual area
01:10:59.500 | in the temporal lobe that's important for sensing objects
01:11:04.060 | and maybe faces and things like that.
01:11:07.420 | So, seemingly, you know, one at like an input level
01:11:09.780 | and one at like a higher order level.
01:11:12.380 | We did this mostly because of
01:11:14.100 | some of the technological limitations.
01:11:16.420 | But it turned out to be really like a good thing
01:11:19.100 | in the end, 'cause it told us something really unusual.
01:11:22.460 | So, what we did then is we let monkeys
01:11:24.780 | just be monkeys with each other, okay?
01:11:27.060 | So, we'd have a male with his female friend
01:11:32.060 | or alone with a female friend on the other side
01:11:36.140 | of a sort of plexiglass divider.
01:11:39.300 | And then there could be other monkeys present
01:11:41.540 | like as observers, like who are like
01:11:43.220 | watching what they're doing or not.
01:11:45.500 | And then we also introduced challenges to them.
01:11:48.380 | Like, so basically, my graduate student would come in
01:11:51.060 | and like, you know, threaten one of the monkeys.
01:11:52.820 | And this elicits a lot of agitation and arousal.
01:11:56.820 | - We're gonna have to say how you threaten a monkey.
01:11:58.380 | - Monkeys, you know, look, we're just like
01:12:00.460 | big kind of not as hairy monkeys to them.
01:12:03.380 | And, you know, that makes--
01:12:05.100 | - To threaten them, you look at them directly.
01:12:06.460 | - Yeah, you look at them and you--
01:12:07.500 | - Yeah, so if you go to the zoo, folks,
01:12:09.300 | and you look directly at a monkey and you smile,
01:12:11.700 | that's a threat.
01:12:13.100 | If you wanna be friendly with the monkeys,
01:12:15.420 | lip smack, it's an affiliation thing.
01:12:19.860 | It almost looked like we were blowing kisses at one another,
01:12:21.700 | you know, so we both looked away.
01:12:22.540 | - It's probably where it comes from.
01:12:23.940 | - That's right.
01:12:24.780 | - That's probably where it comes from.
01:12:25.860 | - So you got a naturalistic experiment.
01:12:28.100 | - So you got a natural experiment.
01:12:28.980 | And so rather than having one, you know,
01:12:31.660 | varying one thing, these monkeys engaged in like 27, 28
01:12:34.660 | different kinds of behaviors, okay?
01:12:36.780 | They would forage, they'd scratch, they'd groom each other,
01:12:39.900 | they'd threaten, they'd mount,
01:12:40.980 | they'd do everything that monkeys do, right?
01:12:43.140 | And then we also, you know,
01:12:44.340 | as we were varying the context as well.
01:12:46.740 | And so that, like, blows the lid off
01:12:50.180 | of the complexity in a typical experiment.
01:12:52.460 | And what did we find?
01:12:53.460 | We found that neurons in both these areas,
01:12:55.980 | and they were indistinguishable, were modulated,
01:12:59.580 | that is, their firing rates,
01:13:01.380 | their activity, were affected by the behaviors
01:13:05.140 | that the animals engaged in
01:13:06.980 | and what the other animals engaged in too.
01:13:08.780 | Also who's around, who's watching me?
01:13:11.180 | Is it like male, you know, X or female Y?
01:13:14.820 | And that, and what was really surprising,
01:13:18.820 | so first of all, you see these signals,
01:13:20.100 | they're basically the same.
01:13:21.100 | These two parts of the brain
01:13:22.020 | are supposed to be very, very different.
01:13:24.060 | And the average neuron cared about,
01:13:29.060 | you know, something like seven things
01:13:31.500 | rather than, you know, like one or two.
01:13:33.780 | Okay, like a grandmother cell, you know,
01:13:35.900 | which was kind of one idea
01:13:37.580 | for how the brain encoded things.
01:13:38.740 | Like there's one neuron
01:13:39.860 | that only responds to your grandmother, right?
01:13:41.740 | Something like that.
01:13:42.780 | - Jennifer Aniston cells.
01:13:43.900 | - Jennifer Aniston cells, very famous.
01:13:46.260 | - Barack Obama cells.
01:13:47.380 | - Barack Obama cells.
01:13:48.220 | - And now there's this question
01:13:49.220 | about whether or not they're in a relationship.
01:13:50.820 | So that's why I brought it up.
01:13:51.860 | But that was actually in the paper.
01:13:53.700 | There were neurons in the cortex
01:13:55.860 | that responded to Jennifer Aniston specifically.
01:13:59.740 | Jennifer Aniston cells, Barack Obama specifically.
01:14:03.500 | I'm guessing there are Donald Trump neurons.
01:14:06.540 | - There's probably quite a few.
01:14:07.700 | - Right.
01:14:08.820 | And I'm guessing there were Biden neurons.
01:14:12.500 | - Maybe.
01:14:13.340 | - Maybe.
01:14:14.380 | So you're saying that two very distinct brain areas
01:14:19.380 | can respond very similarly to the same things.
01:14:23.540 | And that, so that's one interesting finding.
01:14:26.620 | And the second interesting finding, as I understand,
01:14:29.340 | is that neurons are paying attention,
01:14:32.540 | not just to what you're looking at,
01:14:35.660 | where the monkey is looking at,
01:14:36.780 | but also who's looking at them,
01:14:39.100 | who else is around, what the goal is.
01:14:42.540 | So individual neurons are multitasking.
01:14:45.460 | - They're multitasking.
01:14:46.380 | - Got it.
01:14:47.220 | - And, or as we say, multiplexing,
01:14:49.140 | but it's really the same thing as multitasking.
01:14:51.820 | And that raises a lot of really interesting questions.
01:14:55.860 | Why are these signals all over the place?
01:14:58.180 | Which it seems to be the case, right?
01:15:00.020 | And one idea that's out there is that,
01:15:04.220 | because the, you know, if you, let's say it's a visual area,
01:15:07.740 | those visual neurons might need to know the context
01:15:10.900 | in which something is happening
01:15:12.180 | in order to appropriately like encode that stimulus, right?
01:15:17.180 | 'Cause it matters.
01:15:19.020 | The meaning of that stimulus, it's another monkey.
01:15:21.140 | Like when I'm looking at you,
01:15:22.740 | it matters that we're in this setting here in California
01:15:25.940 | and I flew out here yesterday and all that stuff.
01:15:27.780 | Might be really, really important
01:15:28.780 | for what my brain does with that information.
01:15:31.900 | Like what, how I, how I encode it,
01:15:34.060 | what I put into memory, et cetera.
01:15:35.860 | And so that's sort of one hypothesis
01:15:39.500 | that I think that we're all entertaining.
01:15:43.020 | 'Cause it would be, I mean, it would be heresy to say
01:15:46.340 | that, like, actually it's more of, another name drop,
01:15:49.660 | a Carl Lashley kind of view of the brain,
01:15:52.220 | that it's just one big mush.
01:15:54.260 | - Yeah, so in 30 seconds,
01:15:56.580 | Carl Lashley ran a really critical experiment
01:15:59.100 | where it was the equipotentiality of cortex experiment,
01:16:02.280 | where basically there'd been decades of experiments
01:16:06.020 | with people lesioning a given area of the brain
01:16:09.360 | and seeing some deficit in behavior.
01:16:11.860 | Lashley decided to do the same experiment
01:16:14.740 | and found that regardless of which area of the cortex,
01:16:17.700 | this is important that it was the cortex specifically
01:16:19.580 | that he scooped out, lesioned, got rid of,
01:16:22.060 | set it in a dish,
01:16:24.260 | in his case, I think it was rats.
01:16:26.700 | He didn't observe deficits in that behavior,
01:16:29.060 | at least that persisted.
01:16:30.480 | But you see this in the monkey and human data.
01:16:32.840 | You can lesion a brain area, see a huge deficit.
01:16:35.500 | I know I'm telling you what you already know, Michael,
01:16:37.080 | but I think most people don't realize this.
01:16:39.400 | A brain lesion can lead to a huge deficit in behavior
01:16:42.740 | that is recovered later over time through plasticity,
01:16:47.580 | unless you start digging into the deeper stuff of the brain
01:16:50.780 | where lesions lead to permanent deficits
01:16:53.540 | unless some intervention is provided.
01:16:55.180 | - Yeah, the cortex seems to be maybe a little bit,
01:16:57.900 | I don't wanna say equipotential, but it's very plastic.
01:17:01.720 | It's very flexible and very adaptive.
01:17:05.040 | So this was a really cool finding, I thought.
01:17:09.560 | And we could decode from the population of neurons
01:17:14.560 | exactly what each of the animals was doing
01:17:18.680 | and who was around and who's watching, right?
01:17:21.560 | I mean, which, I mean, to me, it was very gratifying.
01:17:25.440 | But the thing that was most exciting to me,
01:17:28.460 | the most exciting, I think that finding
01:17:30.740 | is really cool for neuroscientists,
01:17:32.820 | but for the primatologist, the anthropologist in me,
01:17:36.020 | the finding that was most exciting
01:17:37.940 | was that we discovered the account,
01:17:42.800 | the mental account for our social relationships, okay?
01:17:47.800 | So for monkeys, a large fraction of
01:17:52.820 | the way that they build and maintain relationships
01:17:55.380 | is through grooming each other.
01:17:56.440 | So they go and they pick through each other's fur,
01:18:00.000 | and that's how you make friends, okay?
01:18:02.480 | That's how you make allies.
01:18:04.200 | And what has been observed going back
01:18:08.120 | to when monkeys were first being watched
01:18:10.920 | is that they tend to be really equitable, right?
01:18:14.520 | They're like, if I invest two minutes in you,
01:18:16.400 | you will eventually invest two minutes back in me.
01:18:18.800 | It might not happen right away,
01:18:20.840 | but we're gonna balance out.
01:18:22.160 | It's gonna come even.
01:18:23.320 | And that raised the idea,
01:18:25.220 | which most people thought was like ridiculous,
01:18:27.820 | that, wow, they're actually tracking
01:18:30.500 | and keeping notes on all this.
01:18:32.100 | They've got a ledger for their investments
01:18:36.220 | and withdrawals in this social relationship.
01:18:38.060 | So to make it more salient for the listeners,
01:18:43.060 | like think of it as like when you're texting,
01:18:45.900 | and you text a friend, and they text you back,
01:18:48.060 | and then you text them, and you text them again,
01:18:49.620 | and you text them again, and you're like,
01:18:51.260 | am I getting ghosted?
01:18:52.100 | What the heck's going on here?
01:18:53.060 | Why are you not texting?
01:18:53.900 | You start to feel that sense of like urgency, betrayal,
01:18:56.820 | like, I'm not gonna text you now.
01:18:58.340 | I'm gonna wait.
01:18:59.180 | I'm gonna wait until you text me back.
01:19:00.980 | It's the same kind of thing.
01:19:01.820 | We have this sense of, and in fact,
01:19:03.700 | when we think about now all the stuff that's going on,
01:19:06.900 | socio-politically, in terms of equitable relationships,
01:19:11.820 | I think that this bears on that.
01:19:14.480 | So we did something that had never been done before,
01:19:17.300 | which is we tracked every single grooming interaction
01:19:19.540 | that ever happened between these monkeys over months,
01:19:22.340 | 'cause we could.
01:19:23.180 | We had cameras on them,
01:19:24.020 | and we used computer vision to do all that tracking.
01:19:28.580 | And yeah, they were perfectly equitable.
01:19:33.500 | But sometimes it would take minutes to balance it,
01:19:36.820 | sometimes it took weeks.
01:19:38.620 | Like you owe me, you owe me.
01:19:40.660 | And then it would come back.
01:19:42.700 | What we found is that in the brain,
01:19:44.820 | in both of these brain areas,
01:19:46.980 | neurons were carrying that mental account
01:19:49.260 | that precisely tracked who owed whom what.
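That running balance can be sketched as a toy ledger in Python. This is purely illustrative, with made-up monkey names and numbers, and is not the analysis code from the study:

```python
from collections import defaultdict

class GroomingLedger:
    """Toy model of the grooming 'mental account': track minutes of
    grooming exchanged between pairs and report who owes whom."""

    def __init__(self):
        # minutes each groomer has invested in each groomee
        self._invested = defaultdict(float)

    def groom(self, groomer, groomee, minutes):
        self._invested[(groomer, groomee)] += minutes

    def balance(self, a, b):
        # positive result: b still owes a grooming time
        return self._invested[(a, b)] - self._invested[(b, a)]

ledger = GroomingLedger()
ledger.groom("Flo", "Max", 2.0)       # Flo invests two minutes in Max
ledger.groom("Max", "Flo", 0.5)       # Max returns only 30 seconds
print(ledger.balance("Flo", "Max"))   # 1.5, so Max still owes Flo
ledger.groom("Max", "Flo", 1.5)       # weeks later, the account evens out
print(ledger.balance("Flo", "Max"))   # 0.0
```

A one-for-one exchange drives each pair's balance back toward zero, which is the equitability described above; the power-differential case discussed later would amount to applying a conversion rate before adding to the account.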
01:19:53.800 | - Amazing.
01:19:54.640 | - How much grooming they, you know.
01:19:56.780 | That blew, I mean, that blew my mind.
01:19:58.660 | It's like, 'cause we all feel that, right?
01:20:01.060 | It's like one of the most salient things
01:20:02.780 | there is in our lives.
01:20:04.440 | - I'd like to take a quick break
01:20:06.520 | and acknowledge our sponsor, BetterHelp.
01:20:08.740 | BetterHelp offers professional therapy
01:20:10.500 | with a licensed therapist carried out entirely online.
01:20:14.180 | Now, I personally have been doing therapy weekly
01:20:16.020 | for well over 30 years.
01:20:17.620 | Initially, I didn't have a choice.
01:20:18.920 | It was a condition of being allowed to stay in school,
01:20:21.260 | but pretty soon I realized that therapy
01:20:22.860 | is an extremely important component to one's overall health.
01:20:25.980 | There are essentially three things
01:20:27.180 | that great therapy provides.
01:20:28.340 | First of all, it provides a good rapport with somebody
01:20:30.980 | that you can trust and talk to
01:20:32.540 | about pretty much any issue with.
01:20:34.220 | Second of all, it can provide support
01:20:36.260 | in the form of emotional support and directed guidance.
01:20:39.340 | And third, expert therapy can provide useful insights.
01:21:42.540 | BetterHelp makes it very easy to find an expert therapist
01:21:45.100 | with whom you resonate,
01:20:46.320 | and that can provide you those three benefits
01:20:48.300 | that come from effective therapy.
01:20:49.900 | Also, because BetterHelp allows for therapy
01:20:51.780 | to be done entirely online,
01:20:53.660 | it's super time-efficient
01:20:54.820 | and easy to fit into a busy schedule.
01:20:56.720 | There's no commuting to a therapist's office
01:20:58.460 | or sitting in a waiting room,
01:21:00.000 | no traffic, no parking issues, et cetera.
01:21:02.640 | If you'd like to try BetterHelp,
01:21:03.960 | you can go to betterhelp.com/huberman
01:21:06.940 | to get 10% off your first month.
01:21:08.820 | Again, that's betterhelp.com/huberman.
01:21:11.960 | This explains why occasionally I'll get a text
01:21:16.880 | from a friend that says, "Nice conversation,"
01:21:20.120 | which means they texted me a bunch before
01:21:22.360 | and I didn't respond.
01:21:24.000 | And part of that has to do with, for me,
01:21:25.920 | the way that texts are archived.
01:21:28.160 | They can kind of drift down and then they're hard to find.
01:21:30.560 | And, you know, and I'm a known long latency response person,
01:21:35.560 | but then I barrage, not intentionally,
01:21:38.040 | but I'll like get on a plane and be like,
01:21:39.180 | "Oh, that's right, I'm going to get to these texts
01:21:41.240 | from a couple of weeks ago and respond to them
01:21:42.680 | or a couple of days ago."
01:21:43.580 | And I find voice memos to be a good solution to this.
01:21:47.360 | I have a couple of people in my life
01:21:48.720 | with whom I mainly communicate through voice memo,
01:21:51.920 | but it is very interesting that, you know,
01:21:53.440 | my team here, we have what feels like
01:21:56.800 | a very consistent cadence and balance of accounts.
01:22:00.920 | Like even the text duration, like, you know,
01:22:04.160 | like I'm fine with a one word or even one letter text
01:22:08.080 | and I'm fine with an essay.
01:22:09.360 | Like, but certain relationships, you just don't do that.
01:22:11.920 | So what is this, just because I can't help myself,
01:22:14.820 | what is the brain area that's tracking this account
01:22:17.060 | or is it a network?
01:22:18.260 | - It's a network.
01:22:19.100 | - Okay, great.
01:22:19.920 | Then we don't have to lock people down
01:22:21.240 | with people's brain areas.
01:22:23.100 | Well, this is amazing.
01:22:24.100 | I mean, I think that rather than, you know,
01:22:25.980 | people talk about love languages, right?
01:22:27.580 | Like people, is it physical touch?
01:22:29.740 | Is it acts of service?
01:22:31.500 | Is it words of affirmation?
01:22:34.740 | I'm guessing that some people are tracking these
01:22:36.420 | very carefully too, in humans, and balancing the account.
01:22:40.740 | And that kind of love language idea seems like,
01:22:43.800 | like are five acts of support
01:22:47.520 | or five physical contact events,
01:22:51.480 | whatever you want to call it.
01:22:52.760 | I know I'm really sounding like a scientist, a nerd.
01:22:56.540 | Are those equivalent to, you know,
01:22:59.560 | five sentences of affirmation?
01:23:01.840 | What I'm gathering is that the brain
01:23:03.040 | is probably calculating these things on an individual basis.
01:23:07.700 | And so it's not like five sentences
01:23:10.380 | equals five acts of service.
01:23:12.880 | But that, maybe it is,
01:23:16.380 | where there's some like internal valuation
01:23:17.980 | that is like very mathematical.
01:23:19.580 | You're just trying to balance the checkbook.
01:23:20.940 | - I think it is very mathematical,
01:23:22.380 | but I want to point out that in the pairs of monkeys,
01:23:26.740 | we've now expanded this to multiple monkeys
01:23:29.220 | in a big open field,
01:23:31.660 | but they're equal kind of partners, right?
01:23:34.540 | So it made sense that that balance was sort of one for one.
01:23:38.960 | And we know from studies of, you know,
01:23:41.760 | wild monkeys, wild primates,
01:23:44.080 | that the sort of conversion, you know,
01:23:47.520 | like, you know, dollars to pesos or whatever,
01:23:50.020 | is not one-to-one if there's something else
01:23:54.360 | in that relationship.
01:23:55.360 | So if there's a power differential,
01:23:57.340 | it's like if you're beta male
01:24:00.560 | and you're grooming alpha male, right?
01:24:04.080 | It might be a hundred minutes of grooming that alpha male
01:24:08.500 | that you get like one groom back.
01:24:10.720 | Or more importantly, you groom that alpha male
01:24:14.640 | for months and years on end,
01:24:17.160 | and then he comes and saves your life
01:24:20.120 | when you are in an aggressive encounter
01:24:22.440 | with another individual.
01:24:24.000 | So you see how that there is this,
01:24:25.820 | I think that's what you're getting at with me,
01:24:27.400 | with the love languages,
01:24:28.320 | which is that there is this underlying currency,
01:24:32.800 | but the value of that currency for each individual
01:24:37.080 | like varies depending on what they,
01:24:39.760 | I don't know where that variation comes from,
01:24:41.760 | but depending on what's most important,
01:24:45.400 | what's most salient for them,
01:24:46.480 | and then also probably what that relationship is like,
01:24:51.040 | and if there's a power differential,
01:24:52.560 | if there's any other kind of differential as well.
01:24:55.320 | - The math of power dynamics online
01:24:58.140 | is really interesting to observe on X,
01:25:01.400 | where people tend to be a bit more combative at times,
01:25:04.160 | not everybody, but I've noticed this,
01:25:05.880 | like this notion of like, don't feed the trolls, right?
01:25:08.200 | Like someone says something that's insulting
01:25:09.940 | and you don't honor them with a response.
01:25:13.360 | You just let it go by,
01:25:14.280 | that that would be somehow completing
01:25:16.120 | some sort of reward circuitry,
01:25:17.520 | because what they really want
01:25:18.680 | is not to harm your reputation,
01:25:20.580 | but to be acknowledged that their opinion matters.
01:25:23.040 | And social media,
01:25:24.520 | as long as people have access to an account,
01:25:26.880 | effectively levels the field.
01:25:29.780 | Although then there's this prioritization
01:25:31.760 | of like high follower accounts and what used to be,
01:25:34.720 | when blue checks became purchasable, right?
01:25:38.080 | That a lot of people were upset
01:25:40.040 | because it was essentially like, you know,
01:25:41.640 | equalizing the status playing field somewhat, right?
01:25:44.640 | But it's very interesting to see how this stuff plays out,
01:25:48.000 | you know, like, do you honor somebody with a response,
01:25:50.340 | or do you ignore somebody's insult,
01:25:52.480 | the classic Mad Men Don Draper response, you know,
01:25:55.160 | in the elevator that has turned into a meme,
01:25:57.080 | that, well, I don't think of you,
01:25:58.520 | I don't think about you at all,
01:25:59.940 | being the ultimate sort of display of his power
01:26:02.140 | that in terms of, you know,
01:26:05.580 | not even allowing his neural circuits
01:26:07.460 | to keep track of an account,
01:26:09.460 | it's like zero for you, you know,
01:26:13.540 | is essentially what he was saying.
01:26:15.060 | So is it the case that power dynamics
01:26:18.700 | are tracked across contexts, for conflict, for collaboration?
01:26:23.500 | We talked about love languages, which is a collaboration,
01:26:27.140 | you know, some people do seem in life
01:26:32.000 | to be very transactional, is the word we assign to it.
01:26:35.440 | They're tracking like, what, you did this and I did this,
01:26:37.600 | no, you know, you paid last time,
01:26:39.120 | I paid this, this kind of thing.
01:26:41.400 | Or they're elevated by the idea that,
01:26:44.440 | oh yeah, you know, they did this and this,
01:26:45.760 | therefore the relationship must be much tighter
01:26:48.760 | than perhaps the other person
01:26:50.640 | in the relationship thinks it is.
01:26:52.000 | These are complex features,
01:26:53.560 | but the idea that we are old world primates
01:26:55.640 | and that there's a brain network tracking this stuff,
01:26:57.820 | to me, makes really good sense.
01:27:00.780 | And I think it's wonderful that you've identified
01:27:02.580 | a physiological anatomical substrate for it.
01:27:04.860 | I think it lends a lot of support
01:27:07.020 | to like thousands of years of observations.
01:27:09.520 | [laughing]
01:27:10.780 | - Well, thanks.
01:27:11.620 | No, I think you're spot on there in the sense that,
01:27:16.620 | and at some point it's all really transactional
01:27:20.500 | in the calculus of evolution, right?
01:27:24.820 | So ultimately, it's if your calculations do the right thing
01:27:29.820 | so that you get resources and mating opportunities
01:27:35.360 | and translate that into offspring,
01:27:38.040 | and that your offspring do the same as well,
01:27:40.400 | then those, whatever the biological substrate was
01:27:44.020 | that did that is going to proliferate
01:27:46.520 | and potentially become honed
01:27:49.080 | and really specialized for doing that job.
01:27:53.080 | And that's actually the argument, one other argument
01:27:55.360 | for why we study primates
01:27:58.800 | because we're so closely related to them.
01:28:02.200 | We share all these features of our biology and our behavior,
01:28:05.760 | but also because, and this is where I think,
01:28:08.720 | for example, personally, I find it much more compelling
01:28:13.720 | to study animals like rhesus macaques
01:28:16.360 | as opposed to say marmosets,
01:28:18.320 | which we've talked about a little bit,
01:28:20.500 | in the sense that if we're thinking about the forces
01:28:25.340 | that have made us who we are, right,
01:28:28.020 | which we just talked about,
01:28:29.340 | you see it displayed on X every day,
01:28:31.820 | like attending to all these things,
01:28:33.560 | tracking all these different relationships,
01:28:35.860 | deciding whether or not to give somebody your attention,
01:28:39.140 | the purest form of generosity, as it was said,
01:28:41.940 | that's what monkeys have to do too.
01:28:45.140 | And so this argument from what we call neuroethology,
01:28:49.580 | and ethology being the science
01:28:52.260 | of basically trying to understand behavior
01:28:55.980 | as a product of evolution, right,
01:28:58.620 | that it's designed just like physical features,
01:29:03.620 | just like the wing of a bird, right,
01:29:06.500 | that our mental processes and the underlying mechanisms
01:29:11.500 | are designed to serve very specific functions.
01:29:15.700 | And so if we want to understand
01:29:17.220 | how we got to be the way that we are,
01:29:20.020 | we should look toward animals
01:29:22.700 | that seem to be doing the same kind of things,
01:29:25.940 | facing the same kinds of pressures in the environment,
01:29:29.620 | in particular the social environment,
01:29:31.060 | which seems to be the one that's most important for us.
01:29:33.820 | - How do humans signal their hormone status?
01:29:39.060 | This is on a very different end of the spectrum,
01:29:41.220 | but everything we're talking about,
01:29:44.140 | which is fairly high level and in the brain,
01:29:47.780 | exists, as I like to think about it,
01:29:49.500 | on a kind of a water level or a tide
01:29:53.460 | that's set by our levels of autonomic arousal,
01:29:56.020 | like thinking, feeling, action changes
01:29:57.980 | when levels of autonomic arousal are very high,
01:30:00.700 | aka stress, alertness, versus when we're sleepy.
01:30:04.740 | And hormones certainly influence autonomic arousal
01:30:07.020 | and a bunch of other things too.
01:30:08.300 | Hormones is a broad category,
01:30:09.660 | but let's just stay with the ones
01:30:10.980 | that most people are familiar with.
01:30:12.500 | So what are the data on how females signal,
01:30:17.500 | let's just say, testosterone, estrogen,
01:30:21.300 | and other relevant hormones, and for males as well,
01:30:26.300 | what are the external signals or behavioral signals?
01:30:30.500 | - Yeah, so that's a really important point that you made
01:30:33.380 | because both of those things go together.
01:30:36.400 | So it's been most controversial for females,
01:30:39.940 | but in my view, the data is pretty clear
01:30:42.620 | and it aligns, I think, with our own intuitions
01:30:46.660 | just from daily life, which is,
01:30:49.080 | well, some things are apparently not consciously perceptible.
01:30:54.080 | It's hard to report, but through studies
01:30:58.220 | where you just ask males for like,
01:31:00.140 | okay, how attractive is this woman, or et cetera,
01:31:02.820 | that there are changes in the face, for example,
01:31:06.700 | and that's been one argument is that,
01:31:08.580 | and this is gonna sound funny, but that the signals
01:31:11.180 | that in non-human primates are in the rear
01:31:14.860 | are, because we're walking upright,
01:31:16.580 | you can't see that really, so now it's kind of in the face,
01:31:19.940 | and so these changes that happen,
01:31:22.620 | that the ovulatory cycle is reflected in the turgidity,
01:31:27.060 | how tight the skin is in the face
01:31:28.380 | because it gets a little plumper and a little bit redder,
01:31:31.700 | and we may not be consciously aware of that,
01:31:34.180 | but that it's there, right, and it shows up
01:31:37.800 | in sort of preference data when you ask heterosexual males,
01:31:42.540 | how attractive is this woman, et cetera,
01:31:44.100 | so that seems to be the case, and also behavioral,
01:31:49.100 | so sort of flirtatious behavior--
01:31:54.460 | - Increases around the time of ovulation.
01:31:56.900 | - Yeah, yeah, yeah, there is a classic study
01:32:00.500 | that exotic dancers, strippers,
01:32:06.420 | would actually get bigger tips, more tips
01:32:09.160 | when they were ovulating than when they're not ovulating.
01:32:12.480 | - Interesting.
01:32:13.320 | - So there may be--
01:32:14.280 | - And it could be by virtue of their behavior,
01:32:17.760 | but it could be the way they dance, proximity to the,
01:32:21.360 | what, I guess the observers, clients,
01:32:23.080 | whatever you call them.
01:32:24.040 | - I don't recall that being quantified,
01:32:26.960 | but it suggests that there's a latent signal there.
01:32:32.800 | And that men are unconsciously processing this.
01:32:37.800 | They're not saying, "Oh, her cheeks are particularly
01:32:40.920 | "plump and red right now,"
01:32:44.500 | but that if you measure their ratings
01:32:49.380 | or their scores of attractiveness, when she's ovulating,
01:32:53.520 | it's these features that might be drawing out that response.
01:32:56.740 | - Correct, we can take this back to the monkey porn studies,
01:32:59.140 | which was our first real foray
01:33:03.200 | into trying to quantify the value
01:33:07.580 | of various kinds of social information
01:33:12.540 | for guiding decisions.
01:33:13.780 | And we already came into this with a sense that like,
01:33:17.720 | yeah, things like status, physical prowess, mating status,
01:33:22.420 | you look like a good mate, bad mate,
01:33:25.280 | are you in mating condition, et cetera.
01:33:29.200 | And so when you think about that,
01:33:32.360 | like how do you ask a monkey that question?
01:33:35.040 | You could ask them, they're not gonna tell you
01:33:36.680 | 'cause they can't talk,
01:33:38.700 | but you have to develop a behavioral way to elicit that.
01:33:41.600 | And so what we did, I think it was pretty clever,
01:33:45.120 | was to riff on the studies that I had already done
01:33:50.120 | looking at varying the expected value of two options.
01:33:53.400 | So this was the work I did as a postdoc with Paul Glimcher,
01:33:56.840 | where we revealed economic signals in the brain,
01:34:01.840 | in the parietal cortex,
01:34:03.280 | an area between where visual signals come in
01:34:05.940 | and where you make a choice to make a behavioral response.
01:34:10.880 | And we varied, like in this case,
01:34:14.200 | monkeys don't work for money, though they work for juice.
01:34:17.080 | Okay, it's been actually, it's really fun.
01:34:18.800 | You spend a lot of time figuring out what juice
01:34:20.240 | they really love best.
01:34:21.760 | And then economically, you would vary like the size
01:34:25.160 | of the juice reward that each of the two offered
01:34:28.060 | or its probability while maintaining size constant.
01:34:31.400 | That when you combine those, you multiply those together,
01:34:33.560 | you get expected value.
01:34:34.840 | That's the first model of economic decision-making
01:34:37.900 | that was really ever developed, right?
01:34:39.040 | You compute the expected value, different options,
01:34:40.760 | you choose the one that has the highest value.
01:34:42.960 | It doesn't work all the time, but it's sort of a rough proxy
01:34:46.360 | and we showed that, yeah,
01:34:47.240 | neurons in parietal cortex signal that.
01:34:48.760 | Monkeys are good economists.
01:34:49.920 | They choose the one that has a higher expected value.
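The expected-value rule being described, the product of reward size and probability, with the larger product chosen, fits in a few lines of Python; the juice amounts here are invented for illustration, not taken from the study:

```python
def expected_value(reward_size, probability):
    # classic first model of economic decision-making:
    # value = magnitude of reward x probability of getting it
    return reward_size * probability

# two hypothetical juice options: a small sure reward vs. a larger gamble
options = {
    "left":  expected_value(reward_size=0.2, probability=1.0),  # EV = 0.20
    "right": expected_value(reward_size=0.5, probability=0.5),  # EV = 0.25
}

# a 'good economist' monkey picks the option with the higher expected value
choice = max(options, key=options.get)
print(choice)  # right
```

As noted in the conversation, this rule is only a rough proxy for real behavior, but it is the comparison that the parietal neurons were found to reflect.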
01:34:52.060 | Okay, so now take that experiment.
01:34:54.480 | I'm gonna have monkeys choosing between two options
01:34:57.720 | that vary in how much juice they pay out.
01:35:00.200 | But I'm also gonna pop up a picture
01:35:03.140 | when they choose one of them, okay?
01:35:05.900 | And they don't know what picture's coming up,
01:35:07.680 | but the picture's gonna be, it could be a nothing burger,
01:35:11.320 | just like some gray square, it doesn't mean anything.
01:35:14.040 | Or it could be the perineum of a female,
01:35:18.040 | if it were males that we were studying.
01:35:19.200 | We did this with males, sorry,
01:35:20.620 | females making choices eventually as well.
01:35:22.780 | Could be face of a dominant male,
01:35:24.760 | face of a subordinate male, face of female.
01:35:26.920 | - What's the equivalent of the swollen taint
01:35:30.480 | of a female monkey, if you reverse the experiment
01:35:34.960 | and it's the female monkey who's making a choice
01:35:36.840 | about male monkeys, what do they find really attractive
01:35:40.120 | in a male monkey?
01:35:40.960 | - Yeah, so it's the taint of the male monkey
01:35:42.800 | 'cause it's providing a signal about how much--
01:35:45.160 | - Monkeys looking at taints of other monkeys.
01:35:46.000 | - Yeah, how much testosterone is circulating,
01:35:49.600 | that they've got on board basically,
01:35:51.400 | which is a good predictor of their status.
01:35:54.680 | It's a good predictor of their fighting ability,
01:35:56.940 | all that kind of stuff.
01:35:57.780 | And if you're a female,
01:35:58.880 | that's a reasonable kind of choice to make
01:36:02.240 | 'cause if you have male offspring
01:36:03.560 | and females are predisposed to choose that,
01:36:05.460 | then your male offspring are gonna do pretty well.
01:36:08.560 | So that's what we did.
01:36:09.760 | And we varied how much juice.
01:36:11.600 | So sometimes monkeys would get paid,
01:36:13.560 | they'd have to give up juice to see the pictures,
01:36:16.000 | sometimes they get paid more to see the pictures.
01:36:18.640 | And what we did then is we construct a choice curve
01:36:22.040 | and we use the differential.
01:36:23.880 | If it's not 50/50, if it slides one way or the other,
01:36:27.640 | it tells us that monkeys are paying X amount
01:36:31.840 | to see certain kinds of pictures
01:36:33.900 | or you have to overpay them, right?
01:36:35.560 | And so what did we find?
01:36:36.680 | It was really, I think, scientifically revealing,
01:36:39.520 | but it's pretty fun, people got it immediately.
01:36:42.740 | They will pay.
01:36:44.820 | - Juice.
01:36:45.660 | - Juice, they will give up juice.
01:36:47.140 | They will pay it to see pictures of the perineum,
01:36:50.580 | the hindquarters of females.
01:36:52.820 | This was an original study, it was in male monkeys.
01:36:54.900 | They will pay to see the faces of dominant males
01:36:59.340 | and you had to pay them
01:37:00.820 | to see the faces of subordinate males.
01:37:03.600 | Okay, so females will give up juice
01:37:07.780 | to see the taints of testosterone-rich male monkeys
01:37:12.780 | and male monkeys will pay juice
01:37:16.700 | to see the swollen taints of female monkeys
01:37:21.700 | which, because of the swelling,
01:37:25.420 | indicate better reproductive competence.
01:37:28.140 | - Yes, better, you know, the time is ripe, okay, to mate.
01:37:33.140 | But just in general, it's a signal that is like,
01:37:35.860 | what we would say is it's important,
01:37:37.260 | it has value.
01:37:38.420 | - Monkey porn.
01:37:39.260 | - It's something you should try.
01:37:40.700 | And in fact, yeah, they're paying for it.
01:37:42.300 | So, you know, it just blew up on the internet.
01:37:44.220 | Even back then, it suddenly got millions of views,
01:37:46.180 | every website was like,
01:37:47.300 | oh, you've proven monkey porn, blah, blah, blah.
01:37:50.180 | It was kind of a fun ride.
01:37:51.460 | It did, it was a New York Times idea of the year in 2005,
01:37:56.380 | which was, again, kind of shocking.
01:37:59.500 | You know, it's a little weird,
01:38:02.180 | but to people, it makes sense.
01:38:03.940 | And the thing I want to point out is that
01:38:07.060 | we ran this same experiment in people,
01:38:09.420 | not with unclothed humans.
01:38:12.860 | So we had to create our own stimulus set,
01:38:18.580 | because all the stimulus sets that were out there
01:38:20.940 | for visual studies of humans were like,
01:38:24.380 | a bunch of German people looking very dour,
01:38:28.060 | they were very well controlled,
01:38:29.180 | and we wanted something that was more natural.
01:38:30.420 | So we downloaded thousands of photos
01:38:33.300 | from this website, hotornot.com.
01:38:35.300 | I don't know if you recall that,
01:38:36.720 | but it was a website where you could upload pictures
01:38:38.660 | and people would rate you.
01:38:39.660 | I mean, now, that's just like-
01:38:42.620 | - Probably wouldn't be allowed now.
01:38:43.540 | I remember rate my pet.
01:38:46.460 | - Rate my pet, rate my professor,
01:38:48.020 | I think, which is still around.
01:38:49.260 | - We were saying rate.
01:38:50.540 | - Rate, yes.
01:38:51.380 | - Rate. - Rate.
01:38:52.620 | - With a T, my pet.
01:38:54.660 | - Yeah.
01:38:55.500 | - But this was hotornot.com.
01:38:56.860 | So you get all these really natural looking,
01:38:59.220 | and then we had, this was really funny though, too.
01:39:02.060 | So we had a group of, separate groups of raters
01:39:07.060 | from the people who we actually tested in the experiment.
01:39:10.380 | So we had a group of males, heterosexual males,
01:39:13.900 | rating the female photos and vice versa.
01:39:16.780 | And that was interesting in its own right.
01:39:18.980 | So we were just trying to establish,
01:39:21.260 | we're not saying why they're attractive or anything like that,
01:39:23.440 | just like, let's measure it, okay?
01:39:25.940 | And it was really fun because,
01:39:28.140 | and it took, it was hard work.
01:39:29.340 | You're having to do one every three seconds,
01:39:31.260 | and it took like an hour.
01:39:32.560 | And the, you know, when the women were done rating,
01:39:35.940 | they were like, "Okay, I'm glad that's over."
01:39:38.500 | The hour's over, and our male raters were like,
01:39:41.960 | "Did you have any more?"
01:39:43.260 | You know, "I'd be happy to sit here
01:39:45.660 | "and rate more photographs for you."
01:39:48.080 | - Interesting.
01:39:48.980 | So women got sort of like,
01:39:51.780 | they got tired of rating males for attractiveness.
01:39:55.140 | - Yes.
01:39:55.980 | - Males did not tire of rating females for attractiveness.
01:39:58.060 | - They did not at all. That's anecdotal,
01:40:00.580 | but it's still, I think it's revealing.
01:40:02.780 | Then we ran the pay-per-view experiment,
01:40:05.580 | just like in monkeys, now in humans.
01:40:08.340 | - Pay-per-view.
01:40:09.180 | - And we also ran a couple of other economic,
01:40:13.580 | you know, standard economic tasks.
01:40:15.120 | One would be, how long are you willing to wait?
01:40:17.100 | So that's a delayed discounting.
01:40:18.660 | Like, in general, you will wait longer
01:40:21.180 | for a bigger reward than for a smaller one.
01:40:23.660 | And also, how hard would you work?
01:40:25.140 | And we, the work was like,
01:40:26.580 | you had to alternate pressing two keys on a keyboard.
01:40:28.700 | It was really just menial, laborious, you know, et cetera.
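The waiting task he mentions has a standard formalization: hyperbolic discounting, where subjective value falls off with delay. The amounts and the impulsivity parameter k below are made up for illustration:

```python
def discounted_value(amount, delay, k=0.2):
    """Hyperbolic discounting: subjective value = amount / (1 + k * delay).
    k is an individual 'impatience' parameter (made up here)."""
    return amount / (1.0 + k * delay)

small_now = discounted_value(5, delay=0)         # worth 5.0 right now
big_later = discounted_value(10, delay=4)        # still beats 5.0
big_much_later = discounted_value(10, delay=10)  # no longer worth the wait
```

With k = 0.2, a reward of 10 four time-units away still beats 5 now, but at ten units it no longer does — the "in general, you will wait longer for a bigger reward" pattern.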
01:40:32.980 | So, the two interesting,
01:40:35.740 | just sociologically it's interesting,
01:40:37.460 | what comes out of this.
01:40:39.500 | Our female subjects basically wouldn't give up money.
01:40:44.500 | They were working for money.
01:40:47.580 | They were hearing the sound of coins
01:40:48.860 | coming out of a slot machine,
01:40:49.980 | which was proportional to how much money they actually got.
01:40:52.380 | - Real money.
01:40:53.220 | - Real money.
01:40:54.040 | If you ignored the pictures,
01:40:55.020 | you'd go home with like $17 extra,
01:40:58.160 | compared to if you were influenced by them.
01:41:00.540 | And the females did really well economically.
01:41:02.900 | So they pretty much kind of ignored the pictures
01:41:05.140 | of the males, even though they were rated,
01:41:07.380 | even the ones that were super hot,
01:41:09.780 | they were not very concerned with that.
01:41:13.300 | For the males, it was the exact opposite.
01:41:15.100 | So the males are giving up, essentially.
01:41:17.040 | They're paying, and they had thousands of trials.
01:41:19.700 | They're paying somewhere between a half
01:41:21.380 | and three quarters of a cent to see images
01:41:25.380 | of women who were rated in the top third of attractiveness.
01:41:29.940 | They also would wait significantly longer,
01:41:31.820 | and they would work really hard.
01:41:33.100 | It was like rats pressing for cocaine,
01:41:35.020 | quite literally, to keep those pictures up on the screen.
01:41:39.140 | Okay, so that's the setting.
01:41:41.260 | We've established in monkeys and in people
01:41:43.900 | similar economic principles that are guiding social,
01:41:48.300 | you call it attention, social valuation, whatever.
01:41:51.040 | So we're like, okay, let's go look in the brain.
01:41:53.460 | So we did an MRI experiment, fMRI experiment,
01:41:56.140 | measured blood flow to different parts of the brain.
01:41:59.600 | We only tested males because they were the ones
01:42:06.140 | who displayed differential preferences there.
01:42:10.180 | And what we found is that parts of the visual system
01:42:14.020 | that are involved in encoding faces responded,
01:42:15.540 | and then the reward system was activated
01:42:19.840 | and tracked linearly how much money
01:42:23.660 | these guys were paying to see images.
01:42:26.900 | There's basically the trade-off value, the currency,
01:42:29.940 | the translation of pictures into money, okay?
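"Tracked linearly" here just means an ordinary linear relationship between payment and the neural signal. A minimal least-squares sketch, with fabricated numbers standing in for the fMRI data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Fabricated stand-in data: cents paid per image (x) versus
# reward-system fMRI signal in arbitrary units (y).
paid = [0.00, 0.25, 0.50, 0.75]
signal = [1.0, 1.6, 2.1, 2.7]

slope, intercept = fit_line(paid, signal)
```

A reliably positive slope is the "linear tracking" claim: more money given up, stronger reward-system response.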
01:42:34.040 | Then in monkeys, we studied all the same areas,
01:42:36.780 | but now we could record from individual neurons
01:42:39.180 | in those areas rather than looking at blood flow,
01:42:40.940 | which is a crude proxy.
01:42:43.120 | And we found exactly the same thing,
01:42:45.940 | which is that neurons in the reward system
01:42:50.000 | were spontaneously and strongly activated
01:42:52.660 | by those pictures, which made sense, right?
01:42:57.660 | So the pictures of the perinea of females
01:43:00.840 | and by dominant male faces.
01:43:04.160 | And that correspondence, I thought,
01:43:08.080 | was pretty compelling, right?
01:43:11.760 | - So these are brain areas that are involved
01:43:13.720 | in value-based decision-making.
01:43:17.160 | - Yeah.
01:43:18.440 | - Not unlike the value-based decision-making
01:43:21.680 | of tracking how many grooming events
01:43:25.480 | one received versus needs to give,
01:43:28.200 | or texts one has received or gives,
01:43:30.860 | or acts of service one trades for some other love language.
01:43:35.860 | I mean, here I'm extrapolating to a lot of different themes,
01:43:38.760 | but I mean, talk about transactional.
01:43:42.640 | I mean, this implies that our neural circuitry,
01:43:46.320 | while flexible, we can trade two of those for one of those,
01:43:50.820 | or we can decide, you know,
01:43:51.720 | I'm just going to be a selfless giver,
01:43:53.760 | that that's a decision.
01:43:56.940 | And that altruism, well, it certainly exists.
01:44:02.560 | I mean, we fortunately see acts of altruism a lot,
01:44:05.560 | probably not as much as humanity would be served by,
01:44:09.160 | but it exists, altruism exists.
01:44:11.520 | But nonetheless, there's a formula
01:44:15.840 | that's maintained in the brain,
01:44:18.000 | like I'm going to do all this for nothing.
01:44:20.780 | And the circuit kind of understands that
01:44:23.400 | versus I'm going to do this,
01:44:24.680 | but there's an expectation,
01:44:27.080 | maybe with a long latency that at some point
01:44:29.920 | it's going to be paid back.
01:44:30.760 | I expect to be paid back.
01:44:32.320 | - The idea of altruism has been very controversial
01:44:34.840 | within kind of evolutionary biology for a long time
01:44:39.840 | because it's kind of hard to imagine a scenario
01:44:44.240 | in which being purely selfless could persist
01:44:49.160 | if there was a genetic, you know, part of that, right?
01:44:53.520 | If it were heritable.
01:44:55.040 | So that's why we have ideas like kin selection,
01:44:57.320 | like I will give up my life for, you know,
01:44:59.640 | eight of my cousins, for example.
01:45:01.260 | - Well, right.
01:45:02.100 | And I was saying in parenting and taking care of young,
01:45:04.280 | like we give selflessly,
01:45:06.480 | but there's this like unconscious
01:45:08.480 | or semi-conscious backdrop,
01:45:09.740 | which is you want your own offspring to proliferate,
01:45:12.080 | to survive and flourish.
01:45:14.200 | And so it's not quote unquote really selfless,
01:45:16.360 | although in the short term it can appear selfless.
01:45:18.520 | That's, I guess,
01:45:19.360 | suppose the real evolutionary biology argument.
01:45:21.760 | I would say that in terms of just pure acts of giving
01:45:25.340 | where we don't expect anything in return,
01:45:27.840 | I think most people that do that say,
01:45:32.420 | certainly I've had this experience, right?
01:45:33.880 | It feels good.
01:45:35.360 | So there is a return on investment.
01:45:37.680 | It's just that the return doesn't come
01:45:39.480 | from somebody else doing something to reciprocate
01:45:41.760 | in the same domain, but it feels good.
01:45:43.960 | You know, there's nothing more impressive
01:45:45.440 | than an anonymous donor, right?
01:45:48.000 | You know, actually I don't wanna take us too far off track,
01:45:50.520 | but there's this idea in a lot of Europe
01:45:53.440 | that if somebody donates a lot of money to a cause that,
01:45:55.960 | you know, they're doing philanthropy,
01:45:57.100 | that they're like trying to hide something.
01:45:59.160 | Whereas in this country, that tends to be not the case.
01:46:02.800 | Although it's sort of growing this idea that,
01:46:04.800 | oh, if somebody is giving a lot of money to a university,
01:46:06.920 | they want their name on the side of a building,
01:46:08.400 | they're really looking to kind of either hide
01:46:10.640 | other features of their life and/or they want respect,
01:46:14.600 | right, they want fame.
01:46:18.880 | So it's kind of interesting.
01:46:20.360 | I like to believe in pure altruism.
01:46:23.000 | I just, it feels good to me to believe in true altruism.
01:46:25.720 | - So I, you know, I don't think this is settled.
01:46:28.600 | And I think this is where there's another feature
01:46:31.680 | of humans and human evolution that's relevant here,
01:46:37.880 | which is that we may be one of the only organisms
01:46:40.760 | in which something called group selection might happen,
01:46:44.440 | right, and that's this idea that like groups
01:46:46.560 | are competing with each other,
01:46:48.360 | in addition to individuals
01:46:50.160 | competing and collaborating.
01:46:52.680 | And so that evolution might favor groups
01:46:57.560 | in which there are certain individuals
01:46:59.160 | who are in a sense wired to be selfless.
01:47:03.360 | And there's one of my colleagues at Penn,
01:47:06.520 | a guy named Duncan Watts,
01:47:07.600 | has done these really interesting experiments
01:47:09.920 | where he ran these massive online,
01:47:12.960 | like Prisoner's Dilemma games,
01:47:15.040 | where, you know, people are having to decide
01:47:16.920 | whether to, you know, to either support, you know,
01:47:21.320 | their partner or defect, essentially.
01:47:26.320 | And, but what was unusual about these games
01:47:28.520 | is he let people play them over and over again,
01:47:31.800 | hundreds and hundreds of times.
01:47:32.720 | What typically happens is once you've experienced
01:47:35.200 | the fact that like, if you cooperate,
01:47:38.120 | you're gonna get screwed eventually,
01:47:40.360 | then everybody just says,
01:47:42.440 | I'm just gonna, I'm just screwing the other guy
01:47:45.000 | from here on out.
01:47:46.040 | But he identified that there's a population,
01:47:49.040 | like 20% of people, I think, something like that,
01:47:52.400 | who are persistent cooperators,
01:47:54.440 | who cooperate no matter what their experience.
01:47:58.440 | And that is resonant with this idea
01:48:01.280 | kind of from group selection,
01:48:02.880 | that groups that had individuals who were cooperators,
01:48:06.600 | who were selfless no matter what,
01:48:08.320 | might out-compete other groups, right?
01:48:12.200 | And I think that's a really interesting idea.
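A toy version of those repeated-game dynamics, under assumed payoffs and strategies (the group sizes, payoff values, and the "grim" rule are my stand-ins, not Watts's actual design): a group seeded with unconditional cooperators ends up with a higher total payoff than one without them, even though the cooperators themselves get exploited.

```python
import itertools

# Standard PD payoffs, from the row player's view: (mine, theirs) -> my payoff
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def play(agents, rounds=50):
    """Round-robin iterated Prisoner's Dilemma; returns total group payoff.
    'grim'       cooperates until betrayed once, then defects forever.
    'persistent' cooperates no matter what (the ~20% cohort he mentions).
    'defector'   always defects."""
    total = 0
    for i, j in itertools.combinations(range(len(agents)), 2):
        burned = {i: False, j: False}  # has this player been betrayed yet?
        for _ in range(rounds):
            moves = {}
            for a in (i, j):
                if agents[a] == 'defector' or (agents[a] == 'grim' and burned[a]):
                    moves[a] = 'D'
                else:
                    moves[a] = 'C'
            total += PAYOFF[(moves[i], moves[j])] + PAYOFF[(moves[j], moves[i])]
            if moves[j] == 'D':
                burned[i] = True
            if moves[i] == 'D':
                burned[j] = True
    return total

# Two 10-player groups, each with two defectors; group B swaps in two
# persistent cooperators for two grim players.
group_a = ['grim'] * 8 + ['defector'] * 2
group_b = ['grim'] * 6 + ['persistent'] * 2 + ['defector'] * 2
```

The persistent cooperators lose individually against defectors, but each exploited round still produces more total payoff (0 + 5) than mutual defection (1 + 1), so the group's aggregate goes up — the group-selection intuition in miniature.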
01:48:15.200 | I wanna circle back to what you were saying
01:48:17.080 | about the feel good, like when you give,
01:48:19.720 | there's a real substrate to that.
01:48:21.760 | If we can engage in a little reverse inference,
01:48:24.320 | which is that, and this was shown actually
01:48:27.800 | a couple decades ago by a neuroeconomist named Bill Harbaugh
01:48:32.120 | for the first time, which is that,
01:48:34.480 | when you give to a charity that you love,
01:48:38.840 | you see activation of reward circuitry
01:48:41.280 | that looks just like if you got the reward yourself, right?
01:48:45.160 | So it's like if I give to whatever,
01:48:47.120 | March of Dimes or something, and that's what I love,
01:48:50.160 | then it, in essence, feels good to me.
01:48:53.160 | And that reward system activation, right,
01:48:55.200 | is the thing that, through dopamine, reinforces behavior.
01:49:00.440 | So when you have that warm glow,
01:49:01.960 | it makes you more likely to do that again in the future.
01:49:06.040 | It's a self-reinforcing signal.
01:49:08.280 | - I love that those sorts of circuits exist
01:49:11.160 | 'cause they seem to serve the greater good,
01:49:12.960 | and I'm not trying to rub away our more,
01:49:17.960 | I don't know, harsher features of primate brain wiring,
01:49:23.940 | but they're all in there.
01:49:25.000 | So speaking of which,
01:49:28.980 | are there external signals besides muscularity,
01:49:33.300 | jaw shape, et cetera,
01:49:35.320 | that relate to levels of testosterone in male humans
01:49:39.820 | that are transient?
01:49:43.040 | The male hormones don't cycle as robustly as female hormones
01:49:47.800 | because of the lack of a menstrual cycle.
01:49:49.840 | They might change with age, et cetera,
01:49:51.600 | but is there anything that signals testosterone
01:49:56.100 | or free testosterone level?
01:49:58.800 | Certainly stress hormone level is signaled,
01:50:01.620 | quaking of hands, that kind of thing.
01:50:05.900 | But what about testosterone signaling
01:50:08.040 | that is independent of the kind of like vigor display stuff
01:50:11.800 | that we normally hear about?
01:50:13.200 | - It's a good question.
01:50:14.900 | I think it's important, as you pointed out,
01:50:16.940 | that it doesn't vary too much over weeks or months
01:50:21.540 | or anything like that.
01:50:22.380 | It's pretty stable.
01:50:24.180 | But one thing we can think about
01:50:27.260 | is work done by my colleague, Gideon Nave,
01:50:30.480 | who's in the marketing department at Wharton
01:50:33.800 | and working with Colin Camerer, actually out here at Caltech.
01:50:37.760 | They did a number of studies,
01:50:38.720 | not where they are measuring testosterone,
01:50:40.540 | but doing very well-controlled placebo trials
01:50:45.540 | of applying testosterone gel
01:50:48.120 | versus a placebo gel, so that
01:50:51.340 | you didn't know which arm you were getting.
01:50:53.240 | - So testosterone versus placebo.
01:50:55.160 | - Versus placebo, yeah.
01:50:56.600 | And measuring things like
01:50:58.440 | desire for conspicuous consumption,
01:51:05.720 | so buying luxury cars or things like that,
01:51:08.380 | or other things like their cognitive reflection:
01:51:13.380 | they get really bad at it,
01:51:16.840 | they start to fail on things that require
01:51:19.640 | more than just giving the simple answer.
01:51:21.800 | They become more risk-taking.
01:51:23.760 | So there's a number of features
01:51:24.760 | that we kind of, I think, collectively, anecdotally,
01:51:27.760 | think of as being like hyper-masculine
01:51:31.000 | associated with testosterone.
01:51:32.280 | Like you want a signal, like you're a big guy,
01:51:34.760 | you take more risks.
01:51:36.680 | - And you're less reflective.
01:51:38.200 | - You display more and you're less reflective.
01:51:39.960 | So that, and I'm gonna screw it up right now
01:51:43.000 | if I actually tried to give you the question,
01:51:45.280 | but it's like a bat and a ball cost $1.10 together.
01:51:48.600 | I'm gonna screw it up and then you're gonna say,
01:51:51.120 | well, you lack cognitive reflection.
01:51:53.080 | Let's just leave that aside.
01:51:54.760 | But then they're much more likely to give the wrong answer,
01:51:57.660 | go to the, jump to the conclusion,
01:52:01.160 | which seems obvious, but it's wrong.
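The item he's reaching for is the classic Cognitive Reflection Test question: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. How much is the ball? The reflexive answer, 10 cents, satisfies the total but not the difference; the algebra gives 5 cents:

```python
# bat + ball = 1.10 and bat = ball + 1.00, so
# (ball + 1.00) + ball = 1.10  ->  ball = (1.10 - 1.00) / 2 = 0.05
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

# The intuitive answer fails the second constraint: if the ball were
# $0.10, the bat would be $1.00, only $0.90 more than the ball.
reflexive_ball, reflexive_bat = 0.10, 1.00
```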
01:52:03.760 | - So higher testosterone, more impulsive with responses,
01:52:08.760 | less reflective, tend to be wrong more often.
01:52:11.120 | - Yes, but more confident.
01:52:12.800 | - But more confident, more risk-taking,
01:52:15.920 | that kind of fit. - More risk-taking.
01:52:16.760 | - That's kind of a, okay, fully expected one.
01:52:21.400 | And I guess the purchasing, you know,
01:52:25.400 | items that signal wealth or status.
01:52:29.240 | - It's a display.
01:52:30.080 | So, you know, I think of it as like the chimpanzee.
01:52:33.200 | So when researchers first went out to study chimpanzees,
01:52:37.260 | you know, in Africa,
01:52:39.600 | and then they had like generators or whatnot around
01:52:42.800 | and they had these big gasoline cans or whatever.
01:52:45.400 | And the male chimps, one of the male chimps, you know,
01:52:48.320 | discovered that he could take these cans
01:52:50.880 | and run around the group, banging them together
01:52:54.760 | and getting a lot of attention, you know,
01:52:56.040 | which is similar, you see them up in a tree.
01:52:57.960 | - Vigor display.
01:52:58.800 | - Vigor display, displaying, you know,
01:53:00.720 | just grabbing, I think, so much of its attention.
01:53:03.640 | Just look at me, look at me, look at me.
01:53:05.680 | And I think that's what you've got going on
01:53:08.080 | with this sort of, you know, buying a Jaguar or whatever,
01:53:11.400 | you know, it's like, it's-
01:53:14.040 | - Or people are trying to signal
01:53:17.040 | what they don't actually have, right?
01:53:18.560 | - I mean, it's tricky because we,
01:53:21.360 | now you can buy things on credit.
01:53:22.760 | Like, you know, there are a bunch of jokes
01:53:24.760 | about Los Angeles that can be made here.
01:53:26.360 | You know, I grew up in the Bay Area
01:53:28.200 | and there are areas of the Bay Area
01:53:30.320 | where there's a tremendous amount of wealth.
01:53:32.720 | My dad used to always say, you know,
01:53:34.240 | up here, wealth is really kind of hidden
01:53:36.420 | back in the trees, literally.
01:53:38.600 | You know, you go to LA
01:53:39.440 | and there's all this display through stuff.
01:53:41.800 | It depends on where you are in LA.
01:53:43.520 | But it's largely true.
01:53:46.000 | You see, let me put it this way.
01:53:47.120 | You see a lot more yellow Lamborghinis here
01:53:50.420 | than you do in Portola Valley,
01:53:52.520 | but I'll be willing to bet that there's far more money
01:53:55.840 | in Portola Valley than there is
01:53:58.920 | in all of Rodeo Drive in Los Angeles.
01:54:01.560 | If you really just looked at actual net worth,
01:54:05.560 | not an experiment I want to run,
01:54:07.400 | but I'd be willing to bet one entire limb
01:54:10.860 | of somebody's choice to run that experiment.
01:54:13.920 | And so there is this kind of strange thing
01:54:17.160 | where the display of vigor is so flexible in humans,
01:54:21.080 | right, like, it's like, you know, nowadays,
01:54:26.960 | there's a lot of discussion about billionaires
01:54:29.680 | signaling more traditional or primitive forms of vigor,
01:54:33.440 | like fighting ability or muscle versus, you know,
01:54:37.400 | like it's almost like,
01:54:38.440 | and I think part of the reason for that is that
01:54:42.660 | the concept of a billion dollars
01:54:44.620 | is very hard for most people to conceptualize
01:54:47.100 | as like an operational thing,
01:54:49.420 | like what they would do with it
01:54:50.460 | and how it would impact their level of happiness,
01:54:52.540 | which is probably actually very little, et cetera.
01:54:55.100 | But we can assess physical qualities so readily.
01:54:59.260 | Like, and so, anyway, I guess that this is really
01:55:04.020 | just my way of taking us back to this idea of valuation,
01:55:06.660 | like how we place value on a potential mate
01:55:10.820 | or a friend or a coworker.
01:55:12.860 | It sounds so transactional,
01:55:14.180 | but clearly the brain is performing these operations
01:55:17.260 | all the time, and it's highly variable
01:55:20.860 | depending on who you are, the social context you live in.
01:55:24.180 | And yet these hormones,
01:55:26.020 | especially testosterone and estrogen,
01:55:28.100 | seem to really be playing with the volume
01:55:31.260 | or the gain on all of this stuff.
01:55:33.060 | - Yep, that's exactly, and in fact,
01:55:34.660 | that's how I think about all, you know,
01:55:36.620 | we can think about oxytocin the same way
01:55:38.100 | as like a volume knob for pro-social interactions
01:57:42.860 | in general, and testosterone likewise.
01:55:44.420 | So I think that works.
01:55:46.020 | I'd like to take a quick break
01:55:48.180 | and acknowledge one of our sponsors, Function.
01:55:50.940 | Last year, I became a Function member
01:55:52.540 | after searching for the most comprehensive approach
01:55:54.840 | to lab testing.
01:55:56.300 | Function provides over 100 advanced lab tests
01:55:59.260 | that give you a key snapshot of your entire bodily health.
01:56:02.700 | This snapshot offers you with insights
01:56:04.340 | on your heart health, hormone health,
01:56:06.140 | immune functioning, nutrient levels, and much more.
01:56:09.340 | They've also recently added tests for toxins,
01:56:11.420 | such as BPA exposure from harmful plastics,
01:56:13.980 | and tests for PFASs, or forever chemicals.
01:56:16.980 | Function not only provides testing
01:56:18.540 | of over 100 biomarkers key to your physical
01:56:20.740 | and mental health, but it also analyzes these results
01:56:23.660 | and provides insights from top doctors
01:56:25.820 | who are expert in the relevant areas.
01:56:27.900 | For example, in one of my first tests with Function,
01:56:30.580 | I learned that I had elevated levels
01:56:32.100 | of mercury in my blood.
01:56:33.700 | Function not only helped me detect that,
01:56:35.660 | but offered insights into how best
01:56:37.140 | to reduce my mercury levels,
01:56:38.680 | which included limiting my tuna consumption,
01:56:41.060 | I'd been eating a lot of tuna,
01:56:42.460 | while also making an effort to eat more leafy greens
01:56:44.680 | and supplementing with NAC, N-acetylcysteine,
01:56:47.040 | both of which can support glutathione production
01:56:49.080 | and detoxification.
01:56:50.540 | And I should say, by taking a second Function test,
01:56:53.060 | that approach worked.
01:56:54.060 | Comprehensive blood testing is vitally important.
01:56:56.540 | There's so many things related to your mental
01:56:58.440 | and physical health that can only be detected
01:57:00.960 | in a blood test.
01:57:02.100 | The problem is blood testing has always been
01:57:03.740 | very expensive and complicated.
01:57:05.520 | In contrast, I've been super impressed
01:57:07.260 | by Function's simplicity and its level of cost.
01:57:10.500 | It is very affordable.
01:57:11.780 | As a consequence,
01:57:12.620 | I decided to join their scientific advisory board,
01:57:14.940 | and I'm thrilled that they're sponsoring the podcast.
01:57:17.580 | If you'd like to try Function,
01:57:18.940 | you can go to functionhealth.com/huberman.
01:57:21.900 | Function currently has a wait list of over 250,000 people,
01:57:25.020 | but they're offering early access
01:57:27.140 | to Huberman podcast listeners.
01:57:28.820 | Again, that's functionhealth.com/huberman
01:57:32.080 | to get early access to Function.
01:57:34.640 | Let's talk about oxytocin.
01:57:37.480 | We hear about it as the love hormone,
01:57:39.540 | the affiliative hormone.
01:57:40.900 | Folks, it's a neurohormone,
01:57:42.580 | so it's somewhere in between a neuromodulator and a hormone.
01:57:47.080 | Let's set all that aside, all the mechanistic stuff,
01:57:49.220 | and I'd love to know your knowledge
01:57:53.820 | about what changing levels of oxytocin does
01:57:58.380 | to perception, to behavior, humans.
01:58:02.300 | - So yeah, oxytocin, we've been interested in it
01:58:05.680 | for a long time because, as you said,
01:58:07.440 | it seems to be a dial that can turn up or turn down
01:58:12.240 | certain aspects of social behavior
01:58:13.600 | and other aspects of mental and emotional function.
01:58:17.400 | It's important to point out that oxytocin
01:58:19.600 | and its sister neurohormone vasopressin,
01:58:23.880 | arginine vasopressin, which is maybe a little more important
01:58:28.000 | in males than in females,
01:58:29.160 | and females, oxytocin's a little more important,
01:58:30.760 | but they're in both, and they've been around a long time.
01:58:34.520 | They actually arose, you know,
01:58:35.440 | very early in invertebrate evolution.
01:58:38.920 | In mammals, oxytocin has the primary role, right,
01:58:43.200 | of helping to build bonds between mom and baby.
01:58:47.360 | So oxytocin's released during childbirth,
01:58:50.320 | it's released when mom is nursing,
01:58:52.760 | and it seems that in humans and some other social,
01:58:56.080 | you know, really, really social creatures,
01:58:58.160 | it's now been co-opted to kind of have
01:59:01.320 | that similar kind of role in the relationships
01:59:05.280 | you have with other people who are not your offspring
01:59:07.800 | or your pairmate, right?
01:59:10.240 | So 'cause oxytocin, for example,
01:59:11.520 | is released, you know, when you orgasm,
01:59:13.160 | and so then that's, you know,
01:59:14.720 | thought to be why that sort of pillow talk afterward
01:59:17.120 | is, you know, it's like, it's more engaging,
01:59:19.640 | and, you know, people feel things at that time
01:59:22.280 | that they might not, that are different
01:59:23.640 | from what they would have felt--
01:59:24.880 | - It fosters attachment.
01:59:26.200 | - It fosters attachment, that's a good way of putting it.
01:59:29.760 | So oxytocin levels are hard to measure, right?
01:59:34.600 | You can measure it at a distance
01:59:37.320 | in the periphery, in the blood,
01:59:38.640 | but it's not exactly like one-to-one correlated
01:59:41.040 | with what's going on in the brain,
01:59:42.720 | and in general, we don't wanna put like a, you know,
01:59:45.760 | a pump or a little thing in your brain
01:59:48.160 | that we could measure how much is in there.
01:59:51.000 | So we can look at, instead, what is often done
01:59:55.640 | is to look at what happens if you introduce oxytocin,
01:59:59.440 | more oxytocin than you normally have, like, into the brain.
02:00:02.960 | You can't inject it or anything like that,
02:00:04.680 | and the way that it's typically applied
02:00:07.040 | is to squirt up your nose, or inhale it intranasally,
02:00:11.400 | so it then is taken up by the nerves
02:00:14.560 | that are in your sinuses and whatnot,
02:00:15.980 | and then goes into the brain.
02:00:18.180 | That was what it was thought to do.
02:00:19.440 | I think that we were the first to show
02:00:20.880 | that that's actually how it works.
02:00:22.880 | We did all the work in monkeys
02:00:24.240 | where it's, all these things are just sort of easier to do,
02:00:29.240 | and the behavior's a little bit less complex,
02:00:31.600 | so our readouts are, I think, a bit more straightforward.
02:00:35.900 | In the human studies, there's a lot of, you know,
02:00:38.320 | it's controversial 'cause there's a lot of like,
02:00:40.360 | there's some crap studies,
02:00:41.600 | and there's just a lot of variability
02:00:42.740 | in the effects across studies.
02:00:43.720 | I think some of that's just because you ask people
02:00:46.460 | to squirt it up their own noses,
02:00:48.120 | and so there's a lot of, that introduces variation
02:00:51.120 | in just how good they were at getting it in the right place.
02:00:54.140 | With the monkeys, what we did instead is we used
02:00:56.000 | what's called a nebulizer or aerosolizer.
02:00:58.440 | Like, I had noticed when like my kid had pneumonia,
02:01:01.880 | and I took him to the ER, they put this mask on him,
02:01:03.880 | and they, you know, they misted this albuterol,
02:01:06.760 | which opened up his airways.
02:01:07.760 | Oh, we could do that with oxytocin too,
02:01:09.120 | so that's what we do with the monkeys.
02:01:10.040 | It makes sure they get like a really good dose,
02:01:12.760 | and then we show that that gets right into the brain.
02:01:14.640 | Okay, now that puts us in a position
02:01:16.160 | to ask questions of what does it do.
02:01:18.600 | Well, one of the first things that oxytocin does
02:01:20.840 | is it relaxes you.
02:01:22.940 | So just overall, you know,
02:01:25.480 | you were talking about autonomic function,
02:01:26.920 | it's a relaxer, it's an anxiolytic.
02:01:30.640 | So it, and in monkeys what that does
02:01:34.080 | is it reduces their vigilance to sort of any threats.
02:01:36.920 | So they're just a lot more chill.
02:01:39.880 | So that's sort of a primary thing.
02:01:42.040 | And then we've looked at how it affects their behavior
02:01:45.920 | in males and females separately,
02:01:48.080 | because as I said before, they sort of,
02:01:49.960 | first of all, males and females have different strategies
02:01:52.740 | and behaviors and the expression
02:01:55.760 | of where oxytocin receptors are in the brain, et cetera,
02:01:59.240 | and vasopressin receptors are a little bit different.
02:02:01.480 | And in male monkeys, it's super interesting,
02:02:04.280 | because, you know, we've been talking about how,
02:02:06.400 | you know, dominance and that really,
02:02:08.160 | like Rhesus macaques have this really steep hierarchy.
02:02:11.880 | And one of the things we found right away
02:02:13.440 | is that you give oxytocin and it just flattens the hierarchy.
02:02:17.280 | So the dominant male monkeys
02:02:19.200 | become super chill and friendly,
02:02:21.600 | and the subordinate ones become a bit bolder,
02:02:26.260 | perhaps because, you know,
02:02:30.260 | if I've dosed you with oxytocin and changed your behavior,
02:02:32.740 | that would change my behavior,
02:02:34.140 | so it reverberates across individuals.
02:02:36.860 | So it flattens the hierarchy.
02:02:39.560 | They spend more time making eye contact.
02:02:41.540 | They pay more attention to the other individual.
02:02:43.900 | And we've shown that-
02:02:45.220 | - It's Burning Man.
02:02:46.100 | - Yeah, it's true.
02:02:46.940 | - I've never been to Burning Man.
02:02:47.780 | - I've never either, but it's-
02:02:49.060 | - This is what I hear.
02:02:49.900 | - No, I think that's the right point,
02:02:51.620 | and I'll circle back to that,
02:02:53.180 | because we also showed that in a task-based situation
02:02:57.500 | where a monkey can choose, we gave monkeys choices
02:02:59.620 | of whether they could give a reward to themselves,
02:03:01.400 | to another monkey, to a bottle that could collect reward,
02:03:06.020 | you know, in case they just like to see juice dripping out.
02:03:09.640 | And they would become more pro-social,
02:03:11.580 | so they're much more likely to give a reward
02:03:14.540 | to another monkey.
02:03:15.380 | They're more altruistic, as you, you know,
02:03:18.980 | as we talked about earlier.
02:03:20.740 | So that's like, it looks like a real pro-social
02:03:24.320 | kind of thing, right?
02:03:25.460 | Which I think is super interesting.
02:03:27.860 | In females, it's a little bit different.
02:03:30.440 | Females become kind of nicer to each other,
02:03:34.620 | and we see that greater eye contact, et cetera,
02:03:36.740 | but they become more aggressive toward males.
02:03:39.660 | And we speculate, I think, it's the hypothesis
02:03:43.580 | that because oxytocin is released
02:03:47.340 | when you've got an infant, basically, for females,
02:03:50.960 | males are a bigger threat then,
02:03:54.300 | because in many primate societies and other mammals,
02:03:59.060 | males sometimes can be infanticidal
02:04:03.100 | because if they kill off a female's infant,
02:04:07.180 | that's, you know, then that will bring that female
02:04:10.580 | into receptivity for mating much more quickly.
02:04:15.420 | And so, that's sort of the--
02:04:17.100 | - Brutal. - Evolutionary, yeah.
02:04:18.260 | It is brutal, the evolutionary rationale behind that.
02:04:20.900 | So that's kind of our supposition.
02:04:23.380 | The other thing that I thought is really interesting
02:04:27.860 | as well is we find a greater,
02:04:31.540 | or an increase in the synchronization of behavior.
02:04:34.040 | So when I do, you know, this idea of mirroring,
02:04:38.380 | which has been talked about in business context
02:04:40.580 | for a long time, you know, it's a real thing.
02:04:43.740 | And it's a marker of a good relationship,
02:04:46.460 | a strong relationship.
02:04:47.460 | If you have good rapport with somebody,
02:04:48.780 | you tend to adopt similar movements and postures.
02:04:53.100 | If you do those things-- - Shirts.
02:04:54.700 | - Shirts, exactly.
02:04:56.540 | We didn't coordinate here. - Similar clothes, yeah.
02:04:58.260 | - You just happen to be a great dresser.
02:04:59.500 | - Oh, well, you know, same here.
02:05:01.220 | So when you have that, you know,
02:05:03.180 | actually, if you do those things,
02:05:04.380 | if I subtly mirror you, and I'm in a job interview,
02:05:07.420 | I'm more likely to get the job,
02:05:08.540 | gonna get a higher salary, et cetera.
02:05:10.060 | - Really? - All those sort
02:05:10.900 | of good things.
02:05:11.720 | So oxytocin turns up behavioral synchrony.
02:05:15.500 | And one of the things, this is like something
02:05:17.360 | I've been fascinated in for the last decade,
02:05:20.640 | and we and a lot of other people have been working on,
02:05:22.440 | is that this synchrony, at the behavioral and neural level,
02:05:26.040 | physiological synchrony, is kind of,
02:05:30.000 | it's this black magic of social behavior.
02:05:33.480 | It's the glue that allows us to live and work together.
02:05:38.020 | So the observation is that if you and I,
02:05:40.960 | we have a good rapport here, let's say,
02:05:42.280 | if we were measuring activity in our brains right now,
02:05:45.960 | we'd see that they were coming into alignment.
02:05:48.280 | So they might've been very disparate
02:05:49.880 | when I arrived here and you arrived here today.
02:05:52.840 | And as we've grown closer and we've discovered things
02:05:57.120 | that are similar about us, you know,
02:06:00.400 | our mindsets and our emotional sets are more overlapping.
02:06:03.840 | So we see this world more similarly,
02:06:05.680 | we feel more similarly about it.
02:06:07.760 | We're more likely to take similar decisions
02:06:10.520 | and then that reverb, the coolest thing
02:06:13.800 | is this reverberates down to your body.
02:06:15.480 | So if our brains begin to align,
02:06:19.360 | our hearts actually begin to beat together.
02:06:21.820 | You know, if we have different resting heart rates,
02:06:24.400 | you begin to breathe together
02:06:25.600 | and you start to move together.
02:06:26.480 | You start to look at the same things in the environment.
02:06:28.040 | We've talked about attention.
02:06:28.880 | When you look at something, the same thing,
02:06:30.960 | you're getting the same data and that feedback loop,
02:06:35.280 | which I think now you can see that that is a way
02:06:38.520 | to coordinate behavior.
02:06:40.720 | And that is the essence of sort of,
02:06:42.560 | that's our secret sauce as a species,
02:06:44.640 | which is that we can collaborate and do things together.
02:06:48.480 | And it seems to like oxytocin, vasopressin
02:06:51.080 | are involved in this as a way of kind of turning up
02:06:53.720 | the dial on synchrony.
02:06:55.840 | It seemed to turn up the so-called social brain network.
02:06:59.280 | And then that synchrony is the glue.
02:07:03.480 | And it's a biomarker, a biological marker
02:07:06.720 | of a close relationship that predicts better communication,
02:07:10.200 | increased trust, better teamwork,
02:07:12.600 | whether your marriage is gonna last.
02:07:15.360 | I mean, the things that it predicts,
02:07:18.480 | group decision-making.
02:07:19.480 | So we showed that in like in a business context,
02:07:22.060 | committees that are more in sync with each other,
02:07:25.160 | that their hearts are beating together,
02:07:27.520 | are more likely to reach the right decision
02:07:30.420 | in a really difficult problem than committees that are not.
02:07:34.000 | The cool thing is that now that you have a biomarker,
02:07:37.600 | you can hack that, right?
02:07:39.640 | In the sense that now we can start looking
02:07:41.420 | at all those trust-building exercises or anything else
02:07:44.160 | that are supposed to turn things up,
02:07:46.800 | turn up the dial on teamwork or communication,
02:07:50.360 | and we have a readout.
02:07:51.480 | And we could say, yeah, that's working.
02:07:53.120 | That's actually doing the thing.
02:07:55.000 | It's not BS, right?
02:07:57.380 | You should invest your time and energy in that
02:07:59.880 | rather than something else.
02:08:01.360 | And there's like, now we've been working through this list
02:08:03.560 | as well as others.
02:08:04.400 | There's a whole host of things
02:08:05.960 | that seem to actually turn up synchrony.
02:08:09.400 | And that's a shortcut to team chemistry.
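[Editor's note: the synchrony readout described above, two people's heart rates coming into alignment, can be sketched as a simple windowed-correlation index. This is an illustrative toy, not the actual analysis from the studies discussed; the window size and heart-rate traces below are invented.]

```python
# Toy physiological-synchrony index: sliding-window Pearson correlation
# between two people's heart-rate traces. Higher mean correlation is
# read here as "more in sync." All numbers are made up for illustration.

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / ((vx * vy) ** 0.5)

def synchrony_index(hr_a, hr_b, window=5):
    """Mean windowed correlation between two equal-length heart-rate series."""
    scores = [pearson(hr_a[i:i + window], hr_b[i:i + window])
              for i in range(len(hr_a) - window + 1)]
    return sum(scores) / len(scores)

# Two people drifting into alignment: person B starts out of phase
# with person A, then tracks A's rhythm in the second half.
a = [60, 62, 65, 63, 61, 60, 62, 65, 63, 61]
b = [70, 68, 66, 67, 69, 70, 72, 75, 73, 71]
early = synchrony_index(a[:5], b[:5])  # out of phase: strongly negative
late = synchrony_index(a[5:], b[5:])   # in phase: near 1.0
```

Real studies of this kind use richer measures (cross-correlation at multiple lags, phase coupling), but the core idea, a continuous numerical readout of how aligned two physiological signals are, is the same.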
02:08:13.640 | - So interesting.
02:08:14.680 | I'm sure you're familiar with the molecule MDMA,
02:08:21.480 | AKA ecstasy.
02:08:22.720 | - Never taken it.
02:08:24.160 | - I have. - High on my list.
02:08:25.280 | - Yeah, I have.
02:08:26.720 | It's an illegal drug, but if you are part
02:08:30.260 | of a clinical trial exploring MDMA,
02:08:32.400 | then you can do it legally.
02:08:34.080 | If you're not, you're breaking the law.
02:08:36.080 | So methylenedioxymethamphetamine,
02:08:40.560 | it's very interesting because it dramatically increases
02:08:44.720 | dopamine, but not nearly as much as it increases serotonin.
02:08:48.600 | And it also leads to enormous increases in oxytocin.
02:08:52.780 | And it's not really a classic psychedelic.
02:08:55.520 | It's an empathogen.
02:08:58.560 | It has unique properties in that it raises dopamine
02:09:02.720 | and serotonin simultaneously.
02:09:04.440 | That's unusual; compounds like amphetamine mainly raise
02:09:08.600 | dopamine and epinephrine, while psilocybin mainly targets serotonin.
02:09:12.560 | So broadly speaking, there's a really nice experiment
02:09:15.600 | that was done trying to isolate the effects
02:09:18.440 | of dopamine versus serotonin versus oxytocin
02:09:21.140 | on the empathogenic effect.
02:09:23.380 | And by administering different drugs
02:09:27.920 | and in the case of oxytocin, oxytocin directly,
02:09:30.340 | what they basically concluded was that oxytocin
02:09:33.760 | has very little, if anything, to do
02:09:35.240 | with the empathogenic aspects of MDMA.
02:09:39.400 | But if I recall correctly,
02:09:40.560 | and I have to go back and look at this,
02:09:41.640 | but if I recall correctly, it had a profound impact on,
02:09:46.640 | as you pointed out, the reducing anxiety.
02:09:51.440 | And that reduction in anxiety brings us back to this idea
02:09:55.520 | that as we change the tide of autonomic arousal,
02:09:59.200 | things become more or less available to us
02:10:01.560 | in terms of emotions and behavior.
02:10:03.520 | So I find oxytocin to just be
02:10:05.560 | a spectacularly interesting compound for so many reasons,
02:10:09.440 | but perhaps for that reason more than all the others,
02:10:14.440 | that it's like it's our own affiliative,
02:10:18.320 | as you said, anxiolytic, is that how you pronounce that?
02:10:20.800 | - Yeah, yeah, yeah.
02:10:22.520 | - To, I never actually said that word out loud.
02:10:25.040 | I've written it many, many times.
02:10:27.040 | - It's kind of, when I said it,
02:10:28.360 | I was worried that like, maybe I'm saying the opposite.
02:10:30.760 | - Or anxiolytic, or is it anxio, anyway.
02:10:33.480 | Reduces anxiety, folks.
02:10:35.880 | - Chills you out. - Chills you out.
02:10:37.400 | And I think that's so interesting
02:10:38.600 | that oxytocin can be evoked
02:10:40.000 | by all these different types of stimuli.
02:10:42.840 | So as you mentioned, it's like post-coital or post-orgasmic.
02:10:46.360 | But it can be elicited by non-sexual affiliative touch,
02:10:51.480 | by, there's actually really interesting evidence
02:10:56.480 | that, and this led to this question
02:10:58.840 | about whether or not cesarean sections
02:11:00.440 | versus, you know, traditional vaginal births are,
02:11:05.440 | are they truly equal in terms of their effect on the fetus?
02:11:09.280 | And it does seem to be, at least in rodent models,
02:11:11.760 | that the passage through the vaginal canal during birth
02:11:14.560 | helps stimulate oxytocin,
02:11:17.300 | that it has a bidirectional effect
02:11:19.000 | on the mother-infant relationship.
02:11:22.400 | Is there any evidence of that in primates as well?
02:11:24.900 | - I know the evidence that you're talking about.
02:11:28.080 | I don't know of evidence in primates for that.
02:11:32.100 | But I think I'd like to circle back
02:11:36.360 | to what you talked about in terms of social touch,
02:11:40.120 | which I think is a really, especially right now, today,
02:11:44.320 | I think is a very important topic to consider.
02:11:48.140 | So we, like other primates, have these
02:11:53.140 | unspecialized sensors in our skin,
02:11:58.040 | in the hairy parts of our skin,
02:11:59.240 | like your arm,
02:12:01.000 | and they provide input essentially
02:12:04.760 | to the system that releases oxytocin directly.
02:12:08.200 | And that's basically all they do.
02:12:09.560 | They're really bad at telling you exactly where,
02:12:12.280 | how, you know, what's being done or how much pressure,
02:12:15.440 | but they operate best at body temperature.
02:12:19.580 | So you're being touched with a body temperature stimulus.
02:12:23.600 | And in a way that's very,
02:12:24.940 | what we would consider to be very pleasant,
02:12:27.060 | like getting tickies, you know, it's like grooming.
02:12:29.340 | Like it's the same thing as grooming in monkeys.
02:12:33.220 | And so it tells us that this is an ancient part
02:12:37.640 | of our heritage to building relationships,
02:12:40.300 | which is actually through social touch, right?
02:12:44.020 | And it's been said, and I think reasonably,
02:12:47.300 | that we're living through an epidemic
02:12:49.500 | of the loss of social touch for a lot of good reasons,
02:12:53.980 | right, because of raising awareness
02:12:57.060 | of inappropriate touch, et cetera.
02:12:59.240 | But now it's almost as if we've swung the pendulum
02:13:02.620 | too hard in one direction,
02:13:04.420 | which is that we're being robbed
02:13:06.800 | of this very natural intrinsic signaling mechanism
02:13:12.340 | for building bonds that humans,
02:13:16.060 | you know,
02:13:17.900 | in the past would have benefited greatly from.
02:13:21.100 | And it's, you know, it's not clear how we move forward
02:13:24.760 | in terms of like replacing that,
02:13:26.620 | but I do think it's possibly part of the constellation
02:13:32.920 | of forces of losses that is making us very sick
02:13:41.100 | as a species and as a society, you know,
02:13:44.700 | namely the loneliness epidemic,
02:13:46.380 | the sort of antisocial century with concomitant,
02:13:51.380 | you know, with basically all these follow-ons
02:13:55.300 | in terms of anxiety and depression and-
02:13:57.980 | - Despair. - Despair, exactly, despair.
02:14:00.780 | It's such a heavy word.
02:14:02.260 | - Yeah, it captures so much.
02:14:03.980 | A couple of reflections about this.
02:14:07.060 | I mean, 'cause I think about this a lot.
02:14:08.860 | I'll never forget when I was traveling overseas in 2019,
02:14:12.460 | so this is like pre-lockdowns and all that.
02:14:15.140 | You would see in certain areas of the world,
02:14:19.340 | men walking, holding hands.
02:14:21.180 | - Right.
02:14:22.460 | - And, you know, I didn't know their sexual orientation,
02:14:24.940 | but my assumption was that they were heterosexual men
02:14:27.460 | holding hands 'cause it was like just very much
02:14:30.220 | part of the culture over there.
02:14:32.220 | The other thing was if you,
02:14:33.940 | and I have gone to South America,
02:14:35.740 | you'll see school kids walking home,
02:14:38.300 | all holding hands, boys and girls,
02:14:40.500 | just walking, holding hands.
02:14:41.940 | It's very, you know, casual, you know,
02:14:44.620 | non-romantic hand holding, a lot more hugging,
02:14:49.300 | a lot of like, I wouldn't say long, firm embrace,
02:14:52.580 | but I'd say like vigorous embrace
02:14:54.580 | upon meeting kind of thing.
02:14:56.420 | And I grew up in the era of, you know,
02:14:58.660 | like fist bumps and side hugs,
02:15:00.660 | you know, that was like a thing over here.
02:15:03.380 | And as you point out,
02:15:05.980 | I think that the lack of physical touch of that sort,
02:15:10.980 | meaning just whatever is culturally acceptable,
02:15:16.980 | consensual, casual, physical touch,
02:15:20.060 | definitely according to the literature that I'm aware of,
02:15:25.460 | signals to the rest of the nervous system and body,
02:15:29.700 | isolation, even if we're surrounded by people.
02:15:32.580 | - And I watched that Chimp Empire series on Netflix,
02:15:37.300 | where they talk about this allopathic grooming,
02:15:39.900 | this collaborative grooming,
02:15:40.900 | like I'll trade, you know, five, you know,
02:15:43.620 | pick your back for a while, you pick mine.
02:15:45.500 | And when they decide that they're going to ostracize
02:15:48.580 | a given member of their troop for whatever reason,
02:15:51.540 | sometimes it's because the chimp misbehaved,
02:15:54.460 | other times it's more diabolical than that.
02:15:57.180 | They're trying to really get rid of,
02:15:59.740 | they're trying to adjust the power balance in the troop
02:16:03.420 | for other reasons.
02:16:05.020 | They basically just leave that chimp
02:16:07.860 | to try and groom itself,
02:16:09.580 | and then the parasites start to eat away at it,
02:16:11.860 | it develops these immune issues,
02:16:13.420 | and then they often just go off on their own and die.
02:16:16.740 | It's an incredibly hard thing to watch.
02:16:19.940 | And what the underlying reasons are in each case
02:16:22.380 | are not made completely clear.
02:16:24.620 | But I think about this whole thing of like deaths of despair
02:16:29.780 | and, you know, not long ago,
02:16:30.980 | you were talking about group selection.
02:16:33.020 | I feel like these two themes might be related.
02:16:35.700 | I feel like right now,
02:16:37.580 | politically and culturally in this country,
02:16:39.900 | and now starting in Europe as well,
02:16:41.860 | it really is, it has become an us versus them
02:16:45.660 | kind of scenario.
02:16:46.620 | There doesn't seem to be a middle at all.
02:16:48.660 | It's like a big trough.
02:16:50.660 | And even the suggestion
02:16:51.660 | that somebody could kind of switch between groups
02:16:53.980 | is kind of like a no,
02:16:55.540 | because they believe and have said and done this,
02:16:59.860 | no, because they believe and have said and done this.
02:17:01.860 | And very strong opinions from both sides.
02:17:03.740 | So I don't think we're in a just hug it out
02:17:08.740 | kind of landscape right now.
02:17:12.600 | And so I'm curious what forms
02:17:15.200 | of non-physical affiliative behavior exist out there.
02:17:19.140 | There are social media accounts out there like Upworthy,
02:17:21.860 | which, you know, just consistently puts out positive content.
02:17:25.660 | There are people who are very positive
02:17:27.500 | in their online behavior.
02:17:30.900 | But, and there's encouragement exists online,
02:17:34.420 | but it seems to be swamped
02:17:37.220 | by these like high salience like attacks.
02:17:40.260 | Like, what's the deal?
02:17:41.580 | What can we do?
02:17:42.400 | - Yeah, I mean, this is a fundamental question
02:17:45.640 | for our age, I think.
02:17:46.620 | And we're on a trajectory toward,
02:17:48.360 | I mean, I don't wanna give the impression
02:17:51.020 | that I'm a complete pessimist,
02:17:52.260 | but I could, I was about to say toward oblivion
02:17:55.080 | between like the despair that's drive,
02:17:57.540 | it has been driving people to either commit suicide
02:18:02.540 | or to, you know, develop severe mental illness
02:18:06.860 | or physical health issues, cardiovascular disease,
02:18:11.860 | diabetes, et cetera, that are, I think,
02:18:14.540 | a consequence of being, in some cases,
02:18:16.940 | a consequence of being isolated
02:18:18.460 | because you are not interacting.
02:18:20.180 | That's part of who we are as a species and we don't thrive.
02:18:23.860 | I mean, the work is very clear that like being isolated,
02:18:27.220 | being alone is worse for your health
02:18:28.500 | than smoking 15 cigarettes a day.
02:18:30.360 | I mean, it's just really, really bad.
02:18:32.820 | And it scales, it's almost linear
02:18:35.340 | to how many contacts you have per week or per month.
02:18:39.100 | So that's all really bad.
02:18:41.320 | And I do believe that's also driving,
02:18:43.540 | that's a big driver for not just the deaths of despair,
02:18:46.580 | but like the lack of coupling and the lack,
02:18:49.740 | and crashing rates of fertility,
02:18:51.540 | which is also a real thing and it is happening.
02:18:54.940 | And if we don't counter it, it's gonna be bad.
02:18:57.580 | Getting back to syncrony, one of the most effective ways
02:19:00.680 | to get in sync with somebody that you're out of sync with
02:19:04.020 | or that you don't know, right?
02:19:05.900 | Who's different from you is through conversation,
02:19:09.460 | but deep conversation, okay?
02:19:11.300 | And there's a couple of parts to this.
02:19:12.300 | You have to make the time and the space to do this.
02:19:14.220 | You have to have an intentional mindset.
02:19:16.540 | And we and other scientists have worked with,
02:19:19.580 | there are these structured sets of questions
02:19:22.100 | that have been developed.
02:19:23.340 | There's one called "Fast Friends"
02:19:24.540 | developed by the Arons in the late 1990s.
02:19:26.420 | There's commercially available decks online
02:19:28.580 | that you can get.
02:19:30.180 | And they're cool because they, each question,
02:19:33.780 | you can kind of take it a superficial level or a deep level,
02:19:36.200 | but they're designed to kind of like break the ice
02:19:38.260 | and then get you really fast into like really deep questions.
02:19:42.500 | - Is this like 100 questions to fall in love type thing
02:19:45.380 | that was published in the New York Times?
02:19:46.740 | - Yeah, it's very similar to that.
02:19:49.440 | But in this case, it's about connecting,
02:19:53.440 | like deep connection.
02:19:54.480 | I think it's more about deep connection
02:19:55.960 | than sort of romance part of this.
02:19:58.760 | And what happens during that,
02:20:00.760 | and my good friend and colleague, Emily Falk
02:20:03.680 | at the Annenberg School had a really nice paper recently
02:20:07.600 | that showed that by measuring brain activity itself
02:20:11.080 | in people who don't know each other,
02:20:12.600 | as they work through these questions
02:20:15.040 | and their brain, you know, one brain is in this space,
02:20:17.600 | another brain is this space,
02:20:18.680 | and they, over time, come into really close alignment,
02:20:23.520 | and that's associated with all this good stuff,
02:20:25.960 | like I like you more, I feel closer to you,
02:20:28.000 | I value you more, et cetera, et cetera.
02:20:30.280 | And once you're in that kind of alignment,
02:20:32.440 | now you're set to sort of do things together.
02:20:35.760 | And now I think that gets back to your question,
02:20:37.680 | like we can't hug it out,
02:20:38.960 | but we have to somehow create space.
02:20:43.440 | And when I say space, like give people the space to do that,
02:20:46.940 | like I'm gonna talk to, you know,
02:20:48.500 | somebody from the other political party or from the whatever,
02:20:52.160 | that's not a bad thing, right?
02:20:53.840 | In fact, that's what we need to do,
02:20:55.160 | but instead we're, and especially online,
02:20:57.480 | reinforcing and making the barriers harder
02:21:01.280 | to have those conversations,
02:21:03.360 | which are the necessary thing, I think,
02:21:05.680 | to establish the glue that keeps us together.
02:21:09.100 | - Yeah, I feel like unless there's a organized effort
02:21:13.640 | to try and create a bridge,
02:21:16.280 | it ain't gonna happen.
02:21:17.240 | I just feel like there's,
02:21:18.480 | I don't wanna take us too far off course,
02:21:20.640 | but maybe this is a good segue
02:21:21.620 | into the neuroscience of decision-making
02:21:24.160 | and value-based decision-making,
02:21:26.800 | which is so much of the work that you've done.
02:21:28.920 | But I feel like there's this property of the human brain
02:21:32.080 | that there's evidence for.
02:21:33.640 | I've seen a beautiful neuron paper
02:21:36.200 | showing that like confirmation of our beliefs
02:21:38.200 | leads to activation
02:21:40.820 | of a reward-based mechanism.
02:21:42.980 | Basically, we're getting a little bit of dopamine
02:21:45.160 | for confirming our biases essentially about others.
02:21:49.340 | And then of course,
02:21:50.180 | if we then experience more affiliative behavior
02:21:53.720 | from our group, we feel more protected,
02:21:56.440 | and then there's a tendency to do more of that.
02:21:59.300 | And I feel like with the knowledge that we have
02:22:02.400 | about dopamine incentive schemes,
02:22:06.800 | group selection behavior,
02:22:09.980 | there ought to be a program that could be established
02:22:13.400 | that isn't hug it out, but that is designed to,
02:22:17.360 | again, that word exploit is so loaded,
02:22:20.640 | to leverage the same neural circuits
02:22:23.360 | that led to the divide to try and bridge this divide.
02:22:27.440 | And what it has to do though,
02:22:30.960 | is it has to break with the value system of both groups.
02:22:35.960 | I mean, let's just be frank,
02:22:36.960 | we're talking about the left and the right here.
02:22:38.160 | I mean, I don't wanna dance around the margins.
02:22:41.600 | And somehow acknowledge that there's good and bad
02:22:45.960 | within both of those groups,
02:22:47.680 | which itself, as I say, is like a heretical statement,
02:22:49.960 | like people are gonna...
02:22:50.800 | I mean, there's just so many assumptions
02:22:52.860 | made just on the basis of that,
02:22:54.980 | but create a new value-based system
02:22:59.980 | that is self-rewarding and allows for group selection
02:23:03.380 | to fill in the gap,
02:23:05.120 | or at least come up with a third option.
02:23:07.800 | If not politically, then in terms of sociology.
02:23:12.800 | - Yeah, so the solution is Independence Day, that movie.
02:23:17.560 | So we need an alien invasion,
02:23:19.640 | so there's an out-group that we can all
02:23:23.040 | identify with each other as,
02:23:25.720 | okay, we have to come together to fight.
02:23:28.680 | 'Cause I think that's really at the root of this,
02:23:31.720 | which is that because of group selection,
02:23:33.760 | humans are sort of very tribal by nature.
02:23:36.420 | We are wired to connect, to glue together
02:23:40.860 | with the people who are in our tribe,
02:23:42.280 | but that means, almost by definition,
02:23:43.920 | there's another tribe, right?
02:23:45.360 | So that we're over here
02:23:48.320 | and we're defending ourselves against them.
02:23:50.280 | Now, it's not like complete, right?
02:23:52.080 | People have been engaging in long-distance trade
02:23:54.760 | for 100,000 years plus.
02:23:58.040 | There was interbreeding between Homo sapiens
02:24:01.120 | and Neanderthals and Denisovans.
02:24:03.360 | So there's some flexibility in those rules,
02:24:08.360 | but in general, yeah, I mean, to have an in-group,
02:24:11.540 | that means you have to have an out-group.
02:24:14.000 | And if we wanna take the left and right
02:24:15.400 | and put them together, in some ways,
02:24:16.620 | it's like the easiest way to do that
02:24:18.760 | is if we had a third out-group
02:24:21.900 | that we needed to unite against,
02:24:23.820 | such as drones from over New Jersey,
02:24:27.380 | or aliens, or who knows what, but--
02:24:31.140 | - Well, these go back to classic psychology experiments,
02:24:33.180 | right, as I recall, where the best way
02:24:34.760 | to build affiliations, have a common goal and/or enemy.
02:24:39.760 | - Yeah.
02:24:41.460 | - Like, unfortunately, being under attack,
02:24:44.240 | when two opposing groups are both under attack,
02:24:46.340 | they form alliances.
02:24:47.180 | - So it's the classic minimal group experiments
02:24:49.780 | of Tajfel in the '60s, which I love,
02:24:52.460 | and I teach on this all the time,
02:24:53.560 | because it's relevant for all these tribal biases.
02:24:58.020 | And so, and what he did was, like,
02:24:59.900 | he'd take the random people off the street
02:25:01.540 | and go like, "Okay, you're on the red team,
02:25:02.680 | "you're on the blue team, you're on the red team,
02:25:03.720 | "you're on the blue team, okay, in five minutes,
02:25:06.760 | "you're gonna have to compete against the other team."
02:25:09.100 | And immediately, the people on the red team are like,
02:25:10.700 | "I don't like the people on the blue team,
02:25:11.820 | "they're stupid, and they're ugly,
02:25:13.200 | "and you don't know anything about them," right?
02:25:16.300 | But you end up immediately forming a tribe,
02:25:19.940 | even though you might not have had anything in common.
02:25:22.780 | And what I think is really interesting and relevant here
02:25:26.100 | is that any number of different biases
02:25:29.420 | that are sort of superficial based on race
02:25:34.420 | or ethnic group or whatnot, which have been shown to,
02:25:37.480 | you know, even though people say, like,
02:25:39.000 | "Oh, I feel, you know, if I see you in pain,
02:25:41.080 | "like, you're getting stuck with a needle."
02:25:42.360 | Like, "Oh, I feel the same for anyone, doesn't matter."
02:25:46.080 | But it tends to be selective for your own tribe
02:25:47.780 | when you measure the brain activity.
02:25:49.880 | But if you now put the emphasis on team,
02:25:52.640 | like, literally, you do that Tajfel experiment,
02:25:54.800 | that minimal group experiment, and I put you in a red,
02:25:57.640 | or like, we're both wearing black T-shirts,
02:25:59.320 | so you're gonna work with the other people
02:26:01.400 | in black T-shirts, it doesn't matter who you are,
02:26:03.840 | that, and I think the way it does this is through attention,
02:26:06.200 | is put my attention on what's shared
02:26:09.080 | rather than what's different.
02:26:10.560 | So now we're on the same team,
02:26:11.800 | and now that kind of recovers, restores
02:26:15.920 | that empathy that I didn't feel toward you before.
02:26:22.240 | And that's interesting when you think about,
02:26:25.440 | say, in the U.S., the first places,
02:26:29.040 | the first groups that became integrated
02:26:32.320 | were, like, military and sports, right?
02:26:36.120 | And what's common amongst those?
02:26:38.280 | They wear uniforms, right?
02:26:39.680 | So the uniforms say we're on a team
02:26:42.760 | that takes your attention away
02:26:44.120 | from the things that are different.
02:26:46.360 | - And the Stanford prison experiment, the famous Zimbardo study,
02:26:49.520 | where, you know, assigning people to prisoner versus guard,
02:26:51.880 | and that led where it led.
02:26:53.600 | - Exactly.
02:26:54.440 | - That occurred not but a short distance
02:26:55.960 | from where my lab was.
02:26:57.920 | - So we have this anxiety-lowering,
02:27:02.040 | pro-affiliative oxytocin thing,
02:27:05.960 | activated by touch affiliation,
02:27:07.720 | and it's bi-directional, like it promotes more touch,
02:27:09.800 | which promotes more feelings of safety,
02:27:11.500 | which lowers anxiety further,
02:27:12.960 | and then we have testosterone,
02:27:16.320 | which signals certain things about others,
02:27:18.360 | and seems to play a role in the hierarchy.
02:27:21.120 | And you mentioned that when oxytocin is given,
02:27:25.820 | that it kind of flattens the hierarchy.
02:27:27.820 | And my understanding of testosterone
02:27:30.700 | from Robert Sapolsky and others
02:27:33.500 | is that testosterone tends to exacerbate
02:27:37.020 | existing traits in people.
02:27:39.060 | It doesn't turn nice people into jerks,
02:27:41.380 | or jerks into nice people,
02:27:44.620 | but rather it turns jerks into super jerks,
02:27:46.860 | and nice people into super nice people,
02:27:49.420 | which fits well with my idea
02:27:52.260 | that testosterone makes effort feel good,
02:27:55.580 | and what type of effort feels good
02:27:57.540 | depends on a lot of complex features
02:28:01.060 | within us as humans,
02:28:03.780 | like too many things to explain by molecules.
02:28:06.760 | So I feel like the primate literature
02:28:10.900 | and the human literature map so well to one another,
02:28:13.980 | and I think this is a good segue
02:28:15.340 | to take us into value-based decision-making,
02:28:17.160 | because I do recall a paper published,
02:28:19.620 | I think it was in Proceedings
02:28:20.460 | of the National Academy of Sciences USA,
02:28:23.980 | I should point that out,
02:28:24.800 | there are other Proceedings in other countries,
02:28:27.000 | that showed that if day traders
02:28:30.940 | or people on the stock market floor took testosterone,
02:28:34.540 | they tended to be more aggressive
02:28:37.100 | and impulsive in their decision-making,
02:28:41.140 | or if you just looked at performance,
02:28:43.860 | and then you measured testosterone,
02:28:45.380 | that it tended to fall out on a pretty nice correlation
02:28:50.020 | between higher testosterone
02:28:51.560 | and basically more aggressive decision-making,
02:28:54.200 | more risk-taking.
02:28:55.360 | So is that all still true?
02:28:57.480 | - I mean, that's my read of literature,
02:28:59.320 | that is still true,
02:29:01.760 | and it does raise, I think, a worrying specter of,
02:29:06.760 | 'cause I don't know how much of a phenomenon it is now,
02:29:10.000 | but it was the case maybe a decade ago or so
02:29:13.480 | that a lot of guys who were traders
02:29:16.320 | who were feeling like they were losing their mojo
02:29:19.440 | after 40 or whatnot, declining testosterone,
02:29:23.600 | so they decide they're gonna start juicing,
02:29:27.280 | okay, put some Androgel on,
02:29:28.880 | and if that then takes you
02:29:31.280 | above typical levels,
02:29:36.280 | then what might that do in terms of markets
02:29:39.680 | if enough people are actually,
02:29:42.320 | or even if they're juicing just for physical performance,
02:29:47.160 | and they're engaged in trading?
02:29:51.120 | That could have a lot of bad effects, right,
02:29:54.760 | as it cascaded through the market.
02:29:56.600 | - Yeah, I would say that probably the dominant effect
02:29:58.800 | of exogenous androgens and all this TRT nowadays
02:30:02.440 | is it's very clear that it allows people
02:30:04.920 | to maintain moderate to high testosterone levels,
02:30:08.400 | even if they're not sleeping as much,
02:30:10.480 | it enhances recovery.
02:30:11.840 | So if people have their behaviors right,
02:30:14.840 | their nutrition, their sleep, et cetera,
02:30:16.400 | it really does give them a significant advantage.
02:30:19.960 | If they don't have their behaviors right,
02:30:23.680 | it gives them the significant advantage
02:30:25.880 | of not having to deal with the normal fluctuations
02:30:29.160 | caused by minimal sleep, et cetera.
02:30:31.520 | But the decision-making process,
02:30:33.800 | like to say yes, no, maybe, or maybe later,
02:30:37.440 | is reliant on things like good sleep, being rested,
02:30:41.380 | things other than testosterone.
02:30:43.200 | Like this is the idea of a committee
02:30:45.560 | as opposed to one individual, you know,
02:30:47.480 | recklessly driving the decision
02:30:49.000 | based on state of mind or androgens.
02:30:51.800 | So if we could zoom out and in for a moment
02:30:55.040 | on some of the work that you did with Paul Glimcher
02:30:58.080 | when you were a postdoc in his lab,
02:30:59.360 | but also in your own laboratory.
02:31:01.640 | When I sit down to make a decision,
02:31:03.400 | should I do something, should I not do something?
02:31:05.840 | Let's say I have some general sense
02:31:08.360 | of what the potential payoff is within a range,
02:31:12.060 | the potential payoff of not doing it within a range,
02:31:15.440 | and I always think of like some like kind of tension
02:31:19.380 | or pressure as it relates to time.
02:31:21.820 | Like for instance, I've been considering buying a house.
02:31:25.440 | I really like the house.
02:31:26.980 | It's a bit of a reach for me for a number of reasons.
02:31:29.680 | And I'm trying to make this decision, right?
02:31:34.680 | And I'm trying to gauge whether or not other people
02:31:36.480 | are looking at this house also.
02:31:38.040 | What do we know about how we start to establish
02:31:42.600 | an internal representation of that?
02:31:44.560 | And I give that example as just one example.
02:31:49.360 | This could really translate to any number
02:31:51.240 | of different scenarios about whether or not
02:31:53.320 | to get married or not, whether or not to stay
02:31:54.720 | in a relationship or not, whether or not to move,
02:31:56.560 | whether or not to have another kid, and on and on and on.
02:32:00.080 | What are the core mechanics of value-based decision-making
02:32:04.040 | as it relates to outcomes and time?
02:32:07.720 | - Yeah, so I think we understand this system pretty well
02:32:12.720 | at this point.
02:32:13.960 | The last 25, 30 years have been enormously productive.
02:32:17.760 | So we have a good sketch of the circuitry that does this.
02:32:20.440 | And essentially what happens is you're confronting
02:32:22.640 | a situation, and it doesn't really matter whether,
02:32:24.920 | it seems to be the same process.
02:32:26.920 | Doesn't matter whether you're trying to decide
02:32:28.200 | between eating a donut or an apple,
02:32:31.040 | or buying this house versus renting an apartment,
02:32:33.760 | or marrying this person, you know, proposing or not.
02:32:37.000 | It's sort of all the same system.
02:32:39.960 | And what happens is you come to the situation
02:32:43.340 | and your brain takes in evidence about the alternatives.
02:32:48.200 | What are the options that are available to me?
02:32:50.640 | What do I know about them from their stimulus properties
02:32:53.320 | and from, you know, maybe prior encounters
02:32:56.000 | or just other information?
02:32:57.440 | And it takes that evidence and it weighs it against
02:32:59.840 | stored information about things you'd done in the past,
02:33:03.800 | other decisions you'd made, and then begins to assign value,
02:33:07.360 | computes the expected value of those different options
02:33:10.640 | in terms of what it will return to you.
02:33:13.220 | And then essentially that is the basis along which
02:33:19.080 | that decision gets made.
02:33:21.400 | So it's, you know, it's a softmax function, as we say,
02:33:26.080 | so it's not like a hard deterministic one.
02:33:28.200 | So there's some statistical noise in there,
02:33:31.880 | and, you know, we could talk about what the reason for that might be.
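The softmax choice rule mentioned here can be sketched in a few lines of code (the option values and the temperature below are made-up numbers for illustration, not figures from the conversation):

```python
import math

def softmax_choice_probs(values, temperature=1.0):
    """Turn expected values into choice probabilities.
    Higher temperature -> noisier, more exploratory choices;
    lower temperature -> almost always pick the best option."""
    exps = [math.exp(v / temperature) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

# Two options with illustrative expected values,
# e.g. "buy the house" vs. "keep renting".
probs = softmax_choice_probs([2.0, 1.0], temperature=0.5)
noisy_probs = softmax_choice_probs([2.0, 1.0], temperature=5.0)
```

Raising the temperature pushes the probabilities toward 50/50, which is one simple way to capture the statistical noise in choice that Dr. Platt describes.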
02:33:34.000 | You make a choice, and whenever you make a choice,
02:33:37.060 | in any behavior that you're engaging in,
02:33:39.000 | your brain is making a forecast of what's gonna happen next
02:33:42.320 | as a result of that.
02:33:43.580 | And your brain then determines, computes,
02:33:47.240 | whether things went exactly as predicted, right?
02:33:50.120 | Is it better than predicted or is it worse than predicted?
02:33:52.760 | And then that signal gets fed back into the system
02:33:55.840 | to update it so that it hopefully performs that job
02:33:59.640 | better in the future, right?
02:34:01.380 | So like, oh, actually that was,
02:34:03.640 | it went way better than expected.
02:34:05.600 | And you should assign that a higher value
02:34:07.760 | and do that thing.
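The forecast-and-feedback loop described here is the classic reward-prediction-error update. A minimal sketch (the learning rate and reward values are illustrative assumptions):

```python
def update_value(value, reward, learning_rate=0.1):
    """Rescorla-Wagner-style update: nudge the stored value
    toward the observed outcome by a fraction of the surprise."""
    prediction_error = reward - value  # better than forecast -> positive
    return value + learning_rate * prediction_error

# An option that keeps paying off better than forecast:
# its stored value climbs toward the true payoff.
v = 0.0
for _ in range(20):
    v = update_value(v, reward=1.0)
```

When the outcome exactly matches the forecast, the prediction error is zero and the stored value stays put, which is why only surprises drive learning.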
02:34:09.760 | Again, this process of weighing up the evidence takes time.
02:34:14.760 | And that's why we have this speed-accuracy trade-off
02:34:17.960 | in decision-making, where we observe that the faster you go,
02:34:21.920 | the more mistakes you tend to make.
02:34:24.040 | - Been there.
02:34:24.880 | - (laughs) Exactly, we've all made split-second decisions
02:34:27.880 | that we regretted later.
02:34:29.920 | - Oh yeah, or slightly sleep-deprived.
02:34:32.360 | - Sleep-deprived, exactly.
02:34:34.160 | The more time you take,
02:34:35.800 | the more evidence you can accumulate.
02:34:38.560 | And when you, you have to recognize
02:34:40.920 | that the data your brain is taking in
02:34:42.800 | from the environment is noisy, right?
02:34:45.240 | It's not perfect.
02:34:46.960 | It's noisy because of the environment.
02:34:48.360 | It's noisy because the wetware of the brain
02:34:50.480 | is statistical and biological.
02:34:54.000 | So you can make the wrong choice
02:34:57.000 | by virtue of the noise dominating the signal.
02:35:01.360 | And that happens when you go too quickly, right?
02:35:03.840 | And one of the things that's,
02:35:05.560 | so there's a good mantra from that,
02:35:08.200 | which is if you want to make really good decisions
02:35:10.160 | or if it's really important,
02:35:11.560 | you kind of have to decide ahead of time,
02:35:12.960 | like, do I need to be accurate or do I need to be fast?
02:35:16.560 | And if accuracy is important, you need to slow down.
02:35:19.820 | Take your time, take as much time as needed
02:35:23.360 | to get the most information that you can.
02:35:26.760 | And even in the moment, doing simple strategies,
02:35:31.660 | like breathing, or having, you know, a mantra that says, like,
02:35:35.680 | you know, it's not what matters.
02:35:37.120 | You know, every little decision is not what counts,
02:35:39.000 | but it's the long run.
02:35:40.640 | That helps to turn, we've talked about arousal a lot here.
02:35:44.000 | And that turns down arousal.
02:35:45.800 | One of the things you can think of arousal as doing,
02:35:47.640 | we keep talking about volume knobs.
02:35:49.080 | It's like a volume knob for the stuff
02:35:52.780 | that's coming into your brain
02:35:54.840 | that could be signal or noise.
02:35:56.920 | So it can turn up noise too.
02:35:59.160 | So you could count as evidence
02:36:03.360 | toward the value of an option,
02:36:05.320 | something that is not actually, you know, evidence,
02:36:08.680 | and then you make the wrong decision.
02:36:10.000 | So by turning down arousal, slowing down,
02:36:15.000 | you're relying more on evidence than on noise.
02:36:18.440 | - Does increasing arousal increase the likelihood
02:36:20.920 | of false positives, that is thinking something's there
02:36:23.340 | that's not, generally speaking,
02:36:25.800 | as well as false negatives, you know,
02:36:28.440 | thinking that something's absent
02:36:29.720 | when actually it's present?
02:36:31.080 | - I haven't thought about it that way before,
02:36:33.560 | but it seems to me like that's a, yeah,
02:36:37.480 | that seems consistent with my understanding.
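One way to make the "volume knob" idea concrete is a signal-detection sketch: if arousal turns up the gain on noise relative to signal, both false alarms and misses rise for a fixed decision criterion. A minimal simulation (all parameters here are illustrative assumptions, not values from any study):

```python
import random

def error_rates(noise_gain, signal=2.0, criterion=1.0, trials=5000, seed=0):
    """Yes/no detection: on signal trials the observation is
    signal + noise; on blank trials it is noise alone. We report
    "present" when the observation exceeds the criterion.
    Arousal is modeled as a gain on the noise.
    Returns (false_alarm_rate, miss_rate)."""
    rng = random.Random(seed)
    false_alarms = sum(rng.gauss(0, noise_gain) > criterion
                       for _ in range(trials)) / trials
    misses = sum(signal + rng.gauss(0, noise_gain) < criterion
                 for _ in range(trials)) / trials
    return false_alarms, misses

calm_fa, calm_miss = error_rates(noise_gain=0.5)
aroused_fa, aroused_miss = error_rates(noise_gain=1.5)
```

With the noise amplified, observations from "nothing there" trials more often cross the criterion (false positives) and observations from "something there" trials more often fall below it (false negatives), consistent with the intuition raised here.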
02:36:40.080 | - Just by way of example,
02:36:41.640 | one of the things that's been really different for me
02:36:45.720 | in the last few years is how quickly you move
02:36:48.280 | to publication when you podcast
02:36:51.640 | or when you're doing social media.
02:36:54.520 | You just click, it's out in the world,
02:36:57.960 | versus, you know, the way I was weaned was, you know,
02:37:00.040 | spend two, three, four years on a project.
02:37:02.680 | Maybe it doesn't go anywhere.
02:37:03.680 | Maybe it does, goes to multiple papers, gets reviewed.
02:37:06.080 | So by the time it comes out, you know,
02:37:08.440 | it's been proofread and you've read the proof.
02:37:10.880 | So it's been vetted by a number of,
02:37:13.040 | hopefully, expert sources,
02:37:14.280 | usually really good sources of feedback,
02:37:18.800 | as opposed to nowadays,
02:37:20.000 | where you can just kind of move immediately to publication.
02:37:23.160 | And I used to have this saying, which was in the lab,
02:37:28.160 | because sometimes, you know,
02:37:29.960 | you have two months to do a revision or something.
02:37:31.600 | It's never really two months.
02:37:32.600 | It always takes five times as long.
02:37:34.520 | I used to say, "I go as fast as I carefully can."
02:37:38.560 | And I used to tell my students and postdocs,
02:37:40.160 | we go as fast as we carefully can,
02:37:41.840 | because the moment you start going fast,
02:37:43.240 | you start making mistakes.
02:37:44.080 | You start making mistakes, you definitely pay for it later.
02:37:46.720 | And the mistakes that I've made podcasting
02:37:48.560 | were a product of going fast and/or fatigue.
02:37:51.360 | And the two things kind of relate to one another,
02:37:53.860 | or occasionally somebody will highlight
02:37:57.160 | conflicting evidence.
02:37:58.080 | And then nowadays, you can go back
02:37:59.800 | and repair things with AI, you can, you know,
02:38:02.960 | you put things in.
02:38:03.800 | But I feel like so much of life
02:38:06.360 | in terms of decision-making is trying to make decisions
02:38:09.520 | when most of the time we think we don't have more time,
02:38:13.960 | but most of the time we do have more time.
02:38:15.740 | Unless somebody's hemorrhaging, we usually have more time.
02:38:19.340 | But then there are some real things
02:38:22.400 | where we don't always have more time.
02:38:24.400 | I mean, we are biological aging machines,
02:38:27.400 | and there is such a thing as too late.
02:38:29.600 | - Yeah, yeah.
02:38:30.520 | - So how do you think these systems change
02:38:35.200 | as a function of, you know,
02:38:36.700 | playing a game for some money in the lab,
02:38:38.600 | we can get caught up in it.
02:38:41.160 | But there's this like tremendous backdrop of context.
02:38:44.920 | You know, $100 might be fun for one person,
02:38:47.220 | might be the difference between making rent
02:38:48.700 | and not making rent for another person.
02:38:50.600 | You know, the decision to stay in a relationship
02:38:55.120 | or leave a relationship when you're in your teens or 20s
02:38:58.260 | is fundamentally different than when somebody's,
02:39:00.760 | for instance, at the, near the transition zone
02:39:03.700 | of having versus losing their fertility.
02:39:06.680 | I mean, these are like, yeah.
02:39:08.120 | And those change, all sorts of,
02:39:10.200 | these pressures are so real.
02:39:12.520 | And yet, if we only have one system in the brain
02:39:14.720 | that handles this similarly to the reward system,
02:39:18.200 | it seems like we ought to learn in school
02:39:20.460 | how to like work with and update our decision-making process
02:39:25.460 | based on immediate term, short term,
02:39:30.120 | like all the different timescales.
02:39:32.120 | To be able to do that seems really important.
02:39:35.080 | Are there any ways to train that up?
02:39:37.000 | - Yeah, I think it's, so there's a few things in here
02:39:39.960 | that I think are worth unpacking.
02:39:43.460 | I mean, one is what you brought up about fatigue,
02:39:47.220 | which I think is really critical.
02:39:49.760 | And we did some work with the wrestling team at Penn.
02:39:53.200 | Coach came to us, and I had had a few of the wrestlers
02:39:56.600 | working in my lab, and he said,
02:39:58.120 | you know, we're having this problem,
02:40:00.160 | which is that, and I don't know if you've ever wrestled.
02:40:02.240 | I wrestled, my middle son has.
02:40:04.120 | - One match.
02:40:05.200 | - It's the worst six minutes of your life.
02:40:06.600 | - Well, I didn't quit because I lost that match,
02:40:08.480 | and I did lose that match, it was seventh grade.
02:40:10.400 | I quit because my dad gave me a choice.
02:40:13.720 | I could either continue to wrestle,
02:40:15.880 | or I could play this other sport
02:40:18.020 | that I really wanted to play.
02:40:19.420 | He said, you can't do both,
02:40:21.020 | because it was going to impact my grades negatively.
02:40:23.300 | And so I opted for the other sport.
02:40:25.700 | - What was the other sport?
02:40:26.540 | - Soccer.
02:40:27.360 | - Okay, yeah.
02:40:28.200 | - And it just, yeah, and I love soccer.
02:40:30.440 | But, you know, losing that one wrestling match
02:40:34.600 | was informative.
02:40:35.740 | The guy just dead fished on me the whole time,
02:40:38.340 | and he deserved to win.
02:40:39.300 | Like, it was a really good strategy.
02:40:40.420 | He just like dead fished on me, yeah.
02:40:42.740 | You know, and I couldn't get myself out of there.
02:40:45.420 | - But it is the worst six minutes of your life.
02:40:47.880 | You're exhausted within like 30 seconds.
02:40:50.700 | It's incredibly grueling.
02:40:52.700 | And what the coach observed was that their guys,
02:40:57.380 | it was the men's wrestling team,
02:40:59.000 | was they were performing very well in the first two periods.
02:41:02.240 | And they got to the third period,
02:41:03.140 | and they started making really dumb mistakes, bad decisions.
02:41:06.440 | And so we, so he said, what's going on?
02:41:08.860 | I said, well, it's about the speed-accuracy trade-off,
02:41:10.780 | but we have to investigate how it's related to fatigue.
02:41:14.060 | So what we did, this was a really fun experiment.
02:41:16.460 | So we go to the wrestling room,
02:41:18.220 | and we wire these guys up.
02:41:19.700 | They got wearable EEG, heart rate monitors,
02:41:21.900 | the whole nine yards.
02:41:23.060 | And what we do, we gave them like this simple little
02:41:26.620 | decision-making/impulse control task.
02:41:28.780 | It's just like a controlled response task.
02:41:30.940 | Here's a, you know, a trade-off.
02:41:35.200 | If you go too fast, you make mistakes, okay?
02:41:37.700 | So it's like a go/no-go.
02:41:37.700 | And so they do it.
02:41:39.760 | Then we run them through two minutes
02:41:41.820 | of CrossFit exercises, really brutal.
02:41:43.700 | Then they come back off
02:41:44.580 | and they have to do the same thing again.
02:41:45.620 | And we do that three times,
02:41:46.460 | and then they have to wrestle each other.
02:41:47.980 | - Oh, so it's cognitive and physical.
02:41:49.340 | - Yeah, cognitive and physical.
02:41:50.160 | - Not unlike chess boxing,
02:41:51.060 | which is not a sport I recommend.
02:41:52.340 | Have you seen this?
02:41:53.700 | Where they play a round, they play some chess,
02:41:55.580 | and then they literally fight.
02:41:56.960 | And then they, it's crazy.
02:41:58.460 | It's like switching between these
02:41:59.500 | two very different states of mind.
02:42:00.340 | - Yeah, it's insane, but also somehow really appealing.
02:42:03.220 | You know?
02:42:04.060 | - Well, I think for the neuroscientists in you and me,
02:42:06.060 | and I think we're all neuroscientists to some extent,
02:42:08.620 | we want to understand the brain and ourselves.
02:42:10.820 | This notion of very disparate behaviors,
02:42:15.820 | boxing and playing chess,
02:42:18.300 | being associated with very disparate types of arousal,
02:42:21.660 | and how those map onto one another,
02:42:25.340 | I think is interesting.
02:42:26.420 | - I think the confluence of chess boxing is fencing,
02:42:31.420 | which is very much like chess.
02:42:35.100 | My youngest son fenced for a number of years,
02:42:37.740 | and so mentally it's like that,
02:42:39.440 | but it has the physicality.
02:42:41.540 | - Or jiu-jitsu.
02:42:42.380 | My friends who do Brazilian jiu-jitsu tell me
02:42:44.180 | that it's like there's an infinite number of options
02:42:47.060 | that become constrained in certain dynamics.
02:42:49.860 | - Yeah, yeah, yeah.
02:42:51.700 | - Very similar.
02:42:52.540 | So this was really cool,
02:42:53.360 | because what we found was that speed-accuracy trade-off,
02:42:56.920 | the more fatigue they got, the more calories they spent,
02:43:00.120 | the faster they would slide down
02:43:04.120 | to emphasizing speed over accuracy.
02:43:06.740 | They just started like, just got to get done,
02:43:10.520 | just got to get, I don't know what they were feeling,
02:43:11.920 | but that they were just not deliberating,
02:43:15.720 | not really being focused.
02:43:17.720 | They just lost the capability of doing that.
02:43:21.200 | And aside from the, I could say,
02:43:22.600 | well, we could help you guys,
02:43:24.720 | you could become more physically fit,
02:43:26.760 | maybe you wouldn't fatigue as fast,
02:43:28.040 | but they're about as fit as they could be.
02:43:30.560 | They said, well, why don't we do this?
02:43:31.940 | Why don't we offload the decision
02:43:34.320 | in the third period to the coach?
02:43:37.020 | As soon as you, in the third period,
02:43:39.180 | you're gonna just look at the coach at some cadence
02:43:43.540 | or whenever he's gonna yell at you to look,
02:43:45.200 | and you do what the coach tells you.
02:43:47.340 | So, and I think this is really interesting,
02:43:48.840 | 'cause you think about it in other contexts,
02:43:51.460 | like in a business context or something,
02:43:53.140 | when if somebody's really fatigued or your unit's fatigued,
02:43:56.900 | maybe you have an external person then
02:43:58.980 | who takes over making the decision
02:44:02.380 | that you just execute, in a sense, right?
02:44:06.140 | The other thing I wanted to say about this all, too,
02:44:08.680 | which it gets to your point about,
02:44:10.840 | well, in the lab, it's like, it's one thing,
02:44:13.200 | you got an undergraduate gambling for 10 bucks over an hour,
02:44:18.120 | and that's, how well does that map onto the real world
02:44:22.080 | where there are all these other things going on?
02:44:23.360 | And I think that that's the challenge.
02:44:26.600 | When I teach business school
02:44:28.940 | classes, MBA students or executives through exec ed,
02:44:32.580 | they all wanna know, like, give me the five-step formula.
02:44:36.540 | (laughs)
02:44:37.380 | And it's like, that's supposed to apply,
02:44:39.540 | how do I take into all, and it's like,
02:44:41.220 | well, we mostly know about one,
02:44:43.300 | this dimension or that dimension or that dimension,
02:44:45.140 | and not how, in the real world,
02:44:47.980 | in a real complex environment, to put that all together.
02:44:52.300 | So that is a, I think that's a gap.
02:44:54.340 | That's a, and one that we're trying to fill,
02:44:56.820 | which is to study decision-making,
02:45:00.160 | whether it's individual or collective decision-making,
02:45:02.560 | in real-world environments, right,
02:45:04.480 | to where all of these factors,
02:45:08.940 | context and the various priorities that are coming in,
02:45:12.760 | are more natural, they're not controlled.
02:45:15.560 | And how then, I mean, we think we know how that works,
02:45:19.080 | but we haven't really proven it yet.
02:45:21.600 | So often we think that we know how we feel about something,
02:45:26.480 | but some of the work that you've done,
02:45:28.320 | in monkeys and in humans,
02:45:30.940 | has really highlighted the extent to which
02:45:34.040 | we base our evaluation of things and people
02:45:37.340 | on things that are in proximity
02:45:39.680 | to those things and people.
02:45:41.240 | Could you tell us about this experiment?
02:45:43.320 | And I swore I wasn't gonna say the words,
02:45:46.280 | highly processed foods, during this episode,
02:45:49.300 | but I think we gotta talk about monkeys and Doritos.
02:45:52.160 | - Ah. (laughs)
02:45:53.720 | Right, I thought, this could have gone,
02:45:55.240 | there's a number of different,
02:45:56.580 | and I will, I do wanna bring up one study
02:45:59.080 | that I think people will find interesting,
02:46:02.000 | that gets at this difference
02:46:03.200 | between what we think we know and feel,
02:46:05.640 | and what our brains are actually telling us.
02:46:08.760 | So we talked about how monkeys and people,
02:46:12.360 | their brains are, I don't wanna say hardwired,
02:46:16.360 | but they're tuned, tuned to value social information,
02:46:21.360 | and particular kinds of social information,
02:46:24.300 | like information about high-status individuals,
02:46:27.720 | and information about sexy individuals,
02:46:30.880 | attractive individuals, right?
02:46:32.560 | And it's baked into the same circuitry,
02:46:34.680 | or attention circuitry, or reward circuitry.
02:46:37.680 | And that, once we observed that, of course,
02:46:40.880 | it led me to wonder, well, okay,
02:46:42.920 | there's this really weird phenomenon in humans,
02:46:46.160 | that in marketing, that we use celebrity status,
02:46:53.720 | celebrity and sex, to sell to people,
02:46:56.700 | like, why should they ever care about,
02:46:59.860 | you know, Brad Pitt likes this thing,
02:47:01.700 | or Jennifer Aniston likes smart water,
02:47:03.540 | when, you know, you're never gonna meet them,
02:47:06.260 | are they, do they really know a lot about water,
02:47:08.820 | or whatever, like, what's the point of all that?
02:47:12.620 | - If George Clooney's selling espresso.
02:47:15.180 | - Yeah, who cares, right?
02:47:16.420 | But now when you think about it in the context of like,
02:47:19.100 | oh, our brains are wired, tuned,
02:47:21.780 | to attend to, and process more deeply,
02:47:25.640 | and value information about others
02:47:29.440 | that are essentially high status, celebrity, sexy,
02:47:32.480 | you know, and like, George Clooney is
02:47:33.700 | all of those things put together,
02:47:36.140 | now that starts to make some sense.
02:47:39.160 | And so we thought, well, given that monkeys
02:47:42.760 | and humans are so alike in this regard,
02:47:46.120 | I bet we could run an advertising campaign on monkeys
02:47:49.080 | that's based on sex and celebrity.
02:47:51.720 | So that's what we did.
02:47:53.040 | We just basically had monkeys, you know,
02:47:54.820 | they were just sitting there in their own colony,
02:47:57.600 | and we had a television monitor, computer monitor in there
02:48:00.760 | that would display like, you know,
02:48:03.280 | the Doritos logo next to, you know,
02:48:05.920 | high status monkey A, and maybe the Cheetos logo
02:48:09.160 | next to low status monkey B,
02:48:12.080 | or, you know, Coke next to like sexy monkey butt,
02:48:18.320 | and, you know, Pepsi next to, you know,
02:48:21.200 | the front end of, you know,
02:48:22.160 | or backside of monkey that's not so sexy.
02:48:23.980 | Okay, so you just do that, just do it, just pairing.
02:48:26.520 | And so it's just association, simple association.
02:48:30.200 | And then what we did is we then gave the monkeys
02:48:32.600 | choices between brand logos that had either been endorsed
02:48:38.440 | by, you know, essentially celebrity monkeys,
02:48:40.920 | sexy monkeys, or peon monkeys, right?
02:48:43.720 | And they got the same reward no matter what.
02:48:45.660 | No matter what they chose,
02:48:46.560 | they always got the same banana pellet.
02:48:49.560 | But the monkeys favored the brands
02:48:54.000 | that had been paired with celebrity and sexy monkeys,
02:48:56.920 | just like you see in people.
02:48:59.020 | You know, I just keep saying this,
02:49:01.480 | there's a little monkey in all of us.
02:49:04.080 | I'm shaking my head because it says a couple of things
02:49:08.940 | to me, but one of the things that it says to me
02:49:11.360 | as a neuroscientist is that it's almost like the bins
02:49:16.880 | like the map of valuation in the brain,
02:49:20.680 | there's overlap of, I'm gonna get into lingo here
02:49:24.320 | for a second, and then I'll explain of the receptive fields.
02:49:26.480 | So like you mentioned Hubel and Wiesel, H and W,
02:49:30.200 | and I mean, they basically won the Nobel Prize
02:49:32.200 | for a couple of things, but not the least of which
02:49:34.840 | was the identification of like,
02:49:36.800 | what are the specific qualities and positions of light
02:49:40.360 | and shapes of light that activate a given neuron,
02:49:43.000 | which eventually led to the Jennifer Aniston,
02:49:45.560 | Barack Obama cells.
02:49:47.600 | And by the way, their coexistence in the same sentence
02:49:50.600 | does not mean that I have knowledge of their dating,
02:49:53.160 | I have no knowledge. - And that paper was,
02:49:54.400 | that study was done a long time ago.
02:49:56.240 | - Right, right, but it speaks to the same principle,
02:49:59.120 | which is that when we see two things next to one another,
02:50:01.640 | sometimes there's a merging of those in our cognitive space
02:50:05.880 | or our memory, when in fact,
02:50:07.440 | there's no overlap conceptually, right?
02:50:11.080 | You know where you see this very dramatically
02:50:15.920 | is that if there's a podcast
02:50:17.920 | with a male and a female guest host pairing,
02:50:21.520 | I guarantee that 25% of the comments
02:50:26.120 | are theories about their dating and/or sleeping together.
02:50:29.800 | It's just, it's incredible.
02:50:30.920 | It's like people see male and female together,
02:50:33.520 | and they just like start doing this thing of like,
02:50:35.440 | oh, they're dating or they're, you know,
02:50:36.760 | they see flirtation where it may or may not have existed.
02:50:40.320 | You know, it's just wild.
02:50:41.440 | And so that when I hear about this experiment
02:50:43.320 | that you did of pairing products
02:50:45.600 | with sexy monkey or non-sexy monkey
02:50:48.080 | or high status or low status monkey,
02:50:50.160 | I can't help but feel that the area of the brain
02:50:51.880 | that's involved in valuation
02:50:53.760 | is just taking visual images, conceptual images,
02:50:56.100 | 'cause it'd be visual, it could be any number of things,
02:50:58.440 | and that there's just overlap
02:50:59.660 | in the maps of these in the brain,
02:51:01.680 | and then that the effect is born out of that overlap.
02:51:05.520 | That's one interpretation.
02:51:06.800 | The other interpretation, I suppose,
02:51:08.560 | and they're not mutually exclusive,
02:51:09.760 | is that we want to go up the hierarchy.
02:51:12.760 | And that's kind of an assumption
02:51:14.240 | that maybe we could just like poke at,
02:51:16.760 | like a couple of nerd academics for a second.
02:51:19.680 | Because like, I like my life very, very much.
02:51:23.680 | There are people that live near me
02:51:25.120 | that have far more resources than I do.
02:51:27.760 | And I never, for a nanosecond,
02:51:29.720 | wish for their home over my home.
02:51:31.360 | I've tried to make it a point in life
02:51:32.740 | to either have the life I want
02:51:34.660 | or be aspiring to the life I want.
02:51:37.120 | You know, have the things I want
02:51:38.640 | or aspire to the things I want.
02:51:41.340 | But I've never found myself in a mode of like,
02:51:44.320 | oh, I want to be working on that experiment,
02:51:47.560 | or I wanna be living in that house.
02:51:49.560 | If I see a beautiful house or a beautiful thing
02:51:52.560 | or some feature of someone's life,
02:51:54.960 | it inspires me to want to go
02:51:56.360 | try and create something similar.
02:51:57.960 | And so, it's not that I'm without competitive spirit in me.
02:52:02.040 | I am, like anyone else,
02:52:03.820 | but I feel like that's so far and away different
02:52:06.080 | than the notion of a hierarchy,
02:52:07.280 | where for me to move up, someone else has to move down.
02:52:10.160 | And for somebody to be above me in any domain,
02:52:13.880 | that means that, you know, I'm quote-unquote below.
02:52:18.200 | So, can we talk about hierarchies
02:52:19.880 | as they exist in Old World primates,
02:52:21.480 | like the macaques, versus us? - Sure, sure.
02:52:23.080 | - Because I don't wanna map this on anything political,
02:52:25.560 | but oftentimes this will get mapped onto the political.
02:52:28.920 | Some people live through the lens of abundance.
02:52:31.320 | There's plenty to go around.
02:52:35.000 | Some people live through the lens of scarcity.
02:52:38.040 | Their win is my loss.
02:52:40.780 | Their loss is my win, that kind of thing.
02:52:44.280 | Do you see this in monkeys too?
02:52:46.020 | - Again, it's really hard, you know,
02:52:48.360 | you can ask the monkey, but he won't necessarily tell you
02:52:50.560 | 'cause he doesn't know what you're asking them.
02:52:52.640 | But I, you know, I think it is,
02:52:56.560 | well, first of all, across primate species,
02:52:59.160 | there's different degrees of the steepness of hierarchy.
02:53:02.560 | So, in rhesus macaques, they're really despotic.
02:53:04.700 | They have a very steep hierarchy.
02:53:06.480 | In like Barbary macaques, which live in North Africa
02:53:10.760 | and in Gibraltar, very relaxed society,
02:53:14.760 | even though they're macaques, they're all the same genus.
02:53:18.600 | So, why that's so, we don't really know.
02:53:22.720 | The general idea is it has something to do
02:53:25.280 | with how rich the environment is,
02:53:28.400 | the resources that are available
02:53:30.420 | and how monopolizable they are.
02:53:33.240 | So, if you can monopolize resources,
02:53:37.520 | then that can help to create a steeper hierarchy.
02:53:42.320 | If they're not, like, let's imagine you eat grass
02:53:45.500 | for a living, you know, you're like a cow or whatever,
02:53:49.280 | and there are some monkeys that do that, eat grass.
02:53:52.640 | Like, I can't hoard all the grass to myself.
02:53:56.380 | It's just everywhere.
02:53:57.220 | And so, everybody can just spread out and kind of eat grass.
02:53:59.600 | It's a very boring life, and you spend all your time
02:54:01.800 | digesting and fermenting in this extended gut,
02:54:05.080 | which is kind of a gross thing to do, set aside.
02:54:09.440 | But I think you can see that that spans a continuum
02:54:12.960 | for what you're saying, from abundance to scarcity,
02:54:16.640 | and has a lot more to do with
02:54:17.840 | whether it's sort of monopolizable.
02:54:20.840 | And does that make sense?
02:54:22.480 | So, if you can monopolize something,
02:54:24.600 | then you have something that other monkeys need, right?
02:54:31.520 | And you're creating that scarcity.
02:54:34.120 | - Yeah, so let's drill into this,
02:54:35.520 | 'cause I think this is, everybody is operating
02:54:38.000 | from a certain frame in this context.
02:54:40.280 | And so, for instance, there are billionaires.
02:54:44.520 | Some people,
02:54:46.000 | like Elon, have hundreds of billions of dollars.
02:54:48.120 | He doesn't seem to care much about money for money's sake,
02:54:50.160 | or I think he's sold all his homes or whatever.
02:54:52.320 | You know, he's motivated by clearly other things as well,
02:54:55.320 | if money at all.
02:54:56.420 | And then there are people who are in destitute poverty.
02:55:00.320 | I think many people will say,
02:55:03.080 | why does anyone need that much money?
02:55:05.920 | They'll say this about billionaires.
02:55:07.720 | What's been interesting is one of the more prominent themes
02:55:12.560 | in pop psychology that is supported by research
02:55:16.800 | is this idea that past a certain level of income,
02:55:20.720 | your happiness doesn't scale upwards linearly
02:55:24.120 | with the increase in income, or maybe at all.
02:55:27.740 | And the number that's thrown around
02:55:29.460 | is like past $75,000 a year,
02:55:32.360 | you know, your happiness doesn't grow.
02:55:33.480 | I would argue that indeed money can't buy happiness,
02:55:36.080 | but it absolutely can buffer stress
02:55:38.360 | or certain kinds of stress.
02:55:40.080 | Let's just give an example of a single mother
02:55:41.680 | raising three kids on her own
02:55:43.600 | versus a single mother raising three kids
02:55:46.680 | with three night nurses when they're infants and nannies,
02:55:51.080 | different level of output required.
02:55:53.100 | Like you just can't argue between those two.
02:55:54.920 | Now, whether or not one is happier than the other
02:55:56.520 | is a different discussion altogether.
02:55:59.540 | But I think, you know, the cow example makes a lot of sense.
02:56:03.280 | The hierarchies within primate troops make sense.
02:56:05.220 | But as humans, I think that I observe tremendous variation
02:56:10.100 | as to whether or not people say,
02:56:13.260 | oh, wow, this person is a millionaire or billionaire,
02:56:16.560 | but I'm good with what I've got.
02:56:18.820 | Or this person has so much more and I resent them for it.
02:56:24.580 | - And I guess we don't really think about
02:56:27.920 | there being a limited amount of money
02:56:31.000 | in the same way that we think of grass or resources.
02:56:33.760 | Now, if we were to talk about mates,
02:56:35.360 | that's a whole other thing,
02:56:36.920 | but you just have to go to a bar
02:56:40.060 | with a particular bias towards having more men or women.
02:56:45.000 | And then, you know, like that starts
02:56:46.080 | to play out immediately, right?
02:56:48.480 | But let's keep it simpler.
02:56:50.200 | Do you think that this whole stance
02:56:52.960 | about abundance versus scarcity is dynamic?
02:56:55.740 | Like if you're surrounded by people
02:56:59.520 | that make more or less the same amount of money as you,
02:57:01.260 | do you feel better than if you're surrounded
02:57:03.000 | by billionaires that have yachts?
02:57:04.880 | - No, I think the fundamental drive
02:57:06.980 | to climb the hierarchy is more or less baked in.
02:57:11.980 | Again, with a lot of variation across individuals
02:57:15.300 | and probably across cultures,
02:57:16.960 | which I'll get to in a moment.
02:57:19.760 | Going back to that 75,000 being kind of like where,
02:57:22.660 | you know, it just asymptotes.
02:57:24.620 | There's a number of papers that came out
02:57:26.600 | from colleagues at Penn and Wharton.
02:57:29.860 | So a guy named Matt Killingsworth showed
02:57:31.580 | in a famous paper five or six years ago,
02:57:34.360 | that in fact, it actually continues,
02:57:36.320 | like happiness just keeps going up with income.
02:57:39.160 | And then there was a back and forth
02:57:40.860 | with Danny Kahneman about that.
02:57:42.120 | And then they worked on a paper together.
02:57:44.040 | And what it looks like is this,
02:57:44.920 | it kind of goes up, flattens out for a while.
02:57:47.620 | And then like above another level,
02:57:50.800 | wow, happiness really goes up
02:57:52.100 | when you got a lot, a lot of money.
02:57:54.220 | - Ah, so that study isn't discussed as much.
02:57:56.200 | - So that's new, well, it's new.
02:57:57.440 | It's like in the last year or two.
02:57:59.360 | So it--
02:58:00.200 | - So being very, very wealthy
02:58:01.240 | does increase your level of happiness?
02:58:02.940 | - Yeah, I mean, for a variety of reasons, right?
02:58:05.680 | So, you know, sure, it's a buffer of stress,
02:58:07.960 | but it also allows you access
02:58:09.180 | to lots and lots of different things
02:58:11.440 | that can make your life just easier, right?
02:58:14.920 | So that's, I think, part of it.
02:58:17.940 | But the other part, and I think this gets back
02:58:19.920 | to that question of what makes us human,
02:58:22.360 | is that we can intentionally,
02:58:25.880 | just like you said about yourself,
02:58:27.240 | it's like, well, I'm just gonna chill.
02:58:28.760 | I'm happy with what I've got.
02:58:30.520 | And there's lots of, you know,
02:58:31.680 | ascetic traditions in a variety of cultures,
02:58:35.360 | especially, you know, Eastern cultures
02:58:37.400 | that have taken that approach,
02:58:41.500 | or even in the West, like, you know,
02:58:43.320 | early Christianity, et cetera.
02:58:45.320 | And I'm trying to remember the name of the book
02:58:48.760 | that was recommended to me, I haven't read it yet,
02:58:51.160 | but that, for example, in India,
02:58:53.800 | in amongst some of the most extreme poverty in the world,
02:58:58.800 | you have people who are kind of ecstatically happy,
02:59:02.200 | and they're very, very happy with being alive,
02:59:06.120 | and being alive where they are, when they are,
02:59:08.360 | and with the people that they're happy with.
02:59:11.780 | How does that happen?
02:59:13.640 | I don't know, but here's my guess,
02:59:17.280 | or part of my guess, I guess,
02:59:19.320 | which is, gets back to what we talked about
02:59:22.620 | in terms of attention.
02:59:24.600 | So what you attend to is being turned up in the brain,
02:59:27.360 | and what you're not attending to is being turned down.
02:59:29.680 | It's kind of like glass half full, glass half empty.
02:59:33.160 | And if you're paying attention to the sort of good things,
02:59:38.020 | then those are getting kind of priority
02:59:40.240 | of access to your brain.
02:59:41.140 | So you're kind of getting like, oh, it's magnifying.
02:59:44.440 | Every little small positive surprise
02:59:46.980 | is amplified in your brain,
02:59:48.280 | and you get a bigger dopamine hit for that,
02:59:50.800 | rather than the sort of small negative surprises.
02:59:53.400 | Now, I'll put that into another context,
02:59:55.040 | which is, we've done a number of studies on loss aversion,
02:59:58.760 | right, and loss aversion is this observation
03:00:00.880 | that if I give you a 50/50 gamble,
03:00:02.440 | like win some money, lose some money,
03:00:05.280 | in general, for most people,
03:00:06.720 | I have to offer them a lot more to win than to lose
03:00:09.600 | for them to take the gamble,
03:00:10.800 | which doesn't make any sense rationally and economically.
03:00:13.680 | It should be even, it's even chances.
03:00:16.440 | So people are loss averse.
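The 50/50 gamble described here can be sketched numerically. This is a minimal illustration assuming a standard prospect-theory loss-aversion coefficient (lambda, commonly estimated around 2), not the specific computational model used in the studies discussed:

```python
# Minimal sketch of loss aversion in a 50/50 win/lose gamble.
# lam is the loss-aversion coefficient: losses loom lam times larger
# than equivalent gains. lam ~ 2 is a commonly cited estimate.

def accepts_gamble(win, loss, lam=2.0):
    """Return True if a loss-averse agent takes the 50/50 gamble."""
    expected_utility = 0.5 * win - 0.5 * lam * loss
    return expected_utility > 0

# A purely "rational" agent (lam = 1) accepts any gamble where win > loss,
# but a loss-averse agent needs the potential win to exceed lam * loss.
print(accepts_gamble(110, 100))           # -> False (rejects)
print(accepts_gamble(210, 100))           # -> True  (win > 2x loss)
print(accepts_gamble(110, 100, lam=1.0))  # -> True  ("rational" accepts)
```

This is why, as described above, people must be offered much more on the winning side than the losing side before they take an objectively even bet.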
03:00:17.760 | There's been a lot of theories about why.
03:00:20.440 | Danny Kahneman famously thought
03:00:23.000 | that people feel the pain of loss
03:00:24.360 | more than the pleasure of winning, right?
03:00:27.160 | So, and I think that's true.
03:00:29.320 | We investigated that using a combination of modeling,
03:00:33.600 | computational modeling, and we looked at people's behavior.
03:00:36.160 | We did eye tracking,
03:00:37.200 | 'cause we're measuring where people attend.
03:00:39.280 | Your average person, most people,
03:00:41.080 | attend to what they might lose
03:00:42.880 | rather than what they might win.
03:00:44.280 | And the longer they focus on what they might lose,
03:00:46.420 | the more loss averse they are,
03:00:49.200 | and that tends to be associated with people
02:59:53.120 | who have negative affect.
03:00:53.120 | So if you're anxious or you're depressed,
03:00:54.840 | you're in a negative state,
03:00:57.600 | then you're looking more for what you could lose
03:00:59.620 | than what you could win.
03:01:01.280 | So that sets up a really interesting test,
03:01:04.440 | causal test, which we were like,
03:01:05.840 | well, where you look is a function
03:01:08.480 | of what you're looking for.
03:01:09.400 | They're looking for what might hurt them
03:01:11.520 | and also what the world looks like.
03:01:12.840 | So let's just manipulate the visual display.
03:01:14.960 | We made the wins bigger font or brighter than the losses.
03:01:19.960 | Okay, when you do that, that attracts people's attention.
03:01:23.520 | They look at the wins, what they could,
03:01:26.920 | the good things they could get rather than the bad things.
03:01:29.360 | - Just by changing the font.
03:01:30.520 | - Just by changing the font size or the brightness.
03:01:32.840 | They look at it more.
03:01:34.320 | This gets turned up in the brain
03:01:35.720 | and now they're not loss averse anymore.
03:01:37.440 | Now they're willing to take the gamble.
03:01:39.020 | So that's what I'm talking about
03:01:40.040 | in terms of like what you focus on.
03:01:42.400 | So if, and that's a way to do it.
03:01:44.880 | I mean, obviously that you could take advantage
03:01:47.440 | of people by doing that, but with their consent.
03:01:50.720 | So for example, we started that work
03:01:52.960 | on behalf of a financial services company
03:01:55.060 | who was saying we're having trouble with our customers,
03:01:57.980 | older customers, to get them to take good risks
03:02:01.360 | like that could really pay off for them
03:02:02.880 | 'cause they're too afraid.
03:02:05.040 | And so we did some basic work and then we tested
03:02:09.160 | that we could actually causally change that.
03:02:11.080 | We could shift that like.
03:02:13.000 | So with their consent, yeah, if we amplify,
03:02:15.080 | we just make what you could win,
03:02:18.840 | instead of what you could lose, more obvious.
03:02:22.320 | People pay more attention to that
03:02:23.580 | and then that will subtly shift the decisions that they make.
03:02:28.080 | - Wow.
03:02:29.000 | We are so malleable when it comes to changing the context
03:02:34.000 | and thereby the variables that shape our decision-making.
03:02:41.040 | But I'm always struck by the way
03:02:43.440 | that it comes in below our conscious detection.
03:02:46.840 | This might be, this is the appropriate time
03:02:52.280 | to ask about meme coins, right?
03:02:55.440 | Because, you know, we all grow up learning about, you know,
03:03:00.260 | the US dollar or euro or whatever backed by something, right?
03:03:04.840 | Backed by the Fed, but also, you know,
03:03:07.160 | backed by real world physical objects of gold.
03:03:11.600 | That's what we're told anyway, right?
03:03:13.280 | You know, and this is, you know,
03:03:15.760 | why just printing more money is never the solution, right?
03:03:18.640 | Because meme coins born out of the kind of larger theme
03:03:23.640 | of cryptocurrency and Bitcoin are an interesting
03:03:30.840 | kind of derivative of cryptocurrency
03:03:34.560 | whereby you're pairing reputation of a person
03:03:37.920 | or in some cases, a Shiba Inu dog, right?
03:03:42.040 | With a currency that has no intrinsic value
03:03:44.880 | except for the person's reputation.
03:03:47.880 | Plus whatever backing, whatever value backing it's obtained
03:03:52.480 | when people decide to purchase that coin.
03:03:55.380 | So I don't know how many listeners, you know,
03:03:57.720 | track cryptocurrency and I am by no means an expert on this.
03:04:00.660 | But, you know, one thing that people get excited about
03:04:03.640 | is how much money is flowing into a coin,
03:04:06.960 | not just the value of the coin, you know, on a given day.
03:04:11.400 | So, you know, essentially how much has been invested
03:04:13.720 | in that coin as something of potential value.
03:03:16.960 | So when we hear about the Hawk Tuah girl coin,
03:03:21.560 | the Hawk Tuah coin, or there's a Trump coin now, I think,
03:04:25.280 | there's a Melania coin, there's a Doge coin
03:04:29.240 | that was developed long before the idea
03:03:32.240 | of a Department of Government Efficiency, DOGE, the Shiba Inu coin.
03:04:37.240 | Is this all just exploiting, again, there comes that word,
03:04:44.080 | leveraging this proximity between reputation and value?
03:04:49.080 | So I think that's partially it,
03:04:51.280 | but it may be even simpler than that,
03:04:54.320 | which is it's leveraging, it's harnessing our wiring
03:05:00.440 | to attend to what other people are doing
03:05:03.560 | and what they're getting or losing.
03:05:05.320 | So we care a lot about, you know, when we're in a group,
03:05:08.520 | the behaviors of other people.
03:05:09.840 | So let's think about how we learn something,
03:05:11.480 | the value of something.
03:05:12.960 | If you're a simple animal, you learn it
03:05:14.400 | from direct experience, and that's reinforcement learning,
03:05:17.040 | you know, reinforcement-learning driven,
03:05:19.040 | the dopamine system, et cetera.
03:05:21.400 | You can also learn from what you didn't choose,
03:05:23.240 | counterfactual, fictive learning.
03:05:24.700 | And then in groups, you have this rich source
03:05:26.640 | of information of what other people are doing.
03:05:29.020 | Like I could watch you try that food
03:05:31.000 | and if you die from eating it, then I won't eat it, right?
03:05:33.760 | So that's, we're deeply wired to pay attention
03:05:38.120 | to the decisions that other people are making.
03:05:40.240 | And if they look good or if they, you know,
03:05:43.160 | then we start to copy what they're doing.
03:05:45.600 | And you see this in, it's not just these meme coins,
03:05:49.360 | but like meme stocks, you know, like GameStop.
03:05:52.640 | This is very similar to the FTX phenomenon/debacle,
03:05:57.680 | where celebrities joined in and people had trust
03:06:01.100 | in these celebrities, admiration of these celebrities
03:06:03.260 | and invested a lot of money in what turned out to be,
03:06:06.220 | you know, in the end, a failure.
03:06:07.980 | So how often is this happening in advertising?
03:06:12.500 | Like if we really step back and we go like,
03:06:14.580 | is the BMW really the better choice compared to the,
03:06:18.340 | you know, compared to the Range Rover?
03:06:23.300 | Like, are we really basing our decisions
03:06:25.700 | on the thing that we're purchasing as much as we think?
03:06:28.560 | - No, I don't think so at all.
03:06:30.800 | And there's a few things we could kind of unpack there.
03:06:33.400 | I think in terms of meme coins, meme stocks,
03:06:37.520 | there's probably two things,
03:06:38.520 | a confluence of two things going on.
03:06:40.060 | So one is this sort of celebrity endorser.
03:06:43.920 | And we have studied that also as well.
03:06:46.000 | We talked about the monkey stuff,
03:06:47.160 | but we looked at, we did eye-tracking studies
03:06:50.720 | of people making choices amongst products and brands
03:06:54.200 | that had been endorsed either by celebrities or not,
03:06:56.100 | just paired with them, right?
03:06:57.860 | And one of the things we found is that when people
03:07:00.500 | chose a product or a brand
03:07:06.060 | that was sort of unfamiliar to them,
03:07:08.900 | if it had been endorsed by a celebrity that pupils,
03:07:12.580 | their pupils didn't dilate.
03:07:13.740 | Normally they would dilate
03:07:14.660 | because that's like overcoming your default
03:07:17.460 | and mental effort and arousal goes up
03:07:19.500 | because it's sort of surprising.
03:07:21.380 | And so the pupil staying still
03:07:24.520 | is an indicator of kind of enhanced confidence and trust,
03:07:27.840 | if you will, that I'm not making a mistake.
03:07:30.440 | I'm putting a lot of words here,
03:07:31.600 | but like, you know, that was the impact
03:07:35.840 | in a very subliminal way of that celebrity endorser.
03:07:38.800 | So I think that could be going on
03:07:40.900 | as well as this other process I was talking about
03:07:45.900 | in terms of what we pay attention
03:07:47.680 | to what other people are doing,
03:07:49.240 | which seems to be a major driver of bubbles
03:07:51.440 | in stock markets.
03:07:52.620 | Like that goes all the way back to like Isaac Newton
03:07:54.420 | losing his fortune in, you know,
03:07:56.520 | the South Sea bubble.
03:07:58.480 | You know, he famously said, like, you know,
03:08:00.160 | I can divine the mechanics of the planets and the heavens,
03:08:04.060 | but I can't understand the minds of men
03:08:05.760 | or something like that.
03:08:06.920 | He just couldn't help himself.
03:08:09.440 | He got out first and then he got back in
03:08:11.280 | when he saw his friends were continuing to make money
03:08:13.520 | and then he got wiped out.
03:08:15.160 | So we were like super interested in this.
03:08:17.800 | And we ran an experiment with MBA students at Wharton
03:08:22.800 | and they were playing a stock market game.
03:08:25.200 | Actually, it turns out it's a stock market
03:08:26.620 | we developed for monkeys.
03:08:27.600 | We had monkeys play the exact same stock market.
03:08:29.400 | They're buying, selling, they, you know,
03:08:30.920 | they've got a portfolio that they can trade in for juice.
03:08:35.920 | Humans get money for this, okay.
03:08:38.720 | This was based on some studies that Colin Camerer
03:08:41.080 | and his colleagues and Benedetto De Martino
03:08:44.120 | did a while ago.
03:08:45.940 | In the MBA students, we used a standard psychometric scale,
03:08:50.940 | you know, a questionnaire that's used to test people
03:08:56.480 | for sort of social impairments, okay.
03:09:00.600 | And then what we did is we looked at how their likelihood
03:09:04.440 | of getting caught in a bubble market
03:09:05.760 | related to social sensitivity,
03:09:07.940 | how attuned they were to other people.
03:09:10.000 | And basically, the more dialed in you were to other people,
03:09:12.640 | the higher your likelihood of losing everything in a bubble.
03:09:16.040 | And it was those people who were like, you know,
03:09:19.180 | socially impaired, who did the best.
03:09:21.840 | They never got sucked into bubbles.
03:09:23.400 | Now, what was cool is we found the same thing in monkeys,
03:09:25.480 | okay, so monkeys in the same stock market, okay.
03:09:30.320 | If they're playing alone,
03:09:31.760 | they're making pretty good decisions, okay.
03:09:34.080 | As soon as you put another monkey in the market
03:09:35.680 | that they can see, they see,
03:09:37.480 | they watch what that monkey did.
03:09:38.720 | That monkey buys GameStop or whatever.
03:09:42.360 | Then I buy GameStop and he sees me do that.
03:09:45.060 | And then he does the same thing.
03:09:46.080 | And it just goes back and forth, back and forth.
03:09:47.480 | They create this bubble and then you get this crash.
03:09:50.680 | It was like really phenomenal.
03:09:52.880 | And we found that the brain circuit
03:09:57.440 | that is essentially involved in theory of mind,
03:10:00.360 | but is about controlling your attention to others
03:10:03.400 | and registering what they're doing is driving that, okay.
03:10:07.280 | And it was really funny.
03:10:09.040 | It was like the bigger the portfolio imbalance
03:10:12.080 | between what I've got and what you got,
03:10:13.800 | the higher the signal in this area.
03:10:18.560 | The monkeys are like, "Shit."
03:10:20.800 | I don't know if I can say that.
03:10:21.640 | - You can say whatever you want on here.
03:10:22.480 | - Okay, well, fuck, I'm losing relative to you.
03:10:25.240 | So I'm paying even more attention to what you're doing
03:10:28.200 | and what you've got in your portfolio.
03:10:29.520 | And I'm gonna be much more likely to copy you
03:10:32.000 | and do what you're doing.
03:10:33.020 | So again, like there's a little monkey in all of us.
03:10:36.940 | I see very little difference between what people are doing
03:10:41.720 | with GameStop and what monkeys are doing in that market.
03:10:45.620 | - So when we hear about these, for lack of a better phrase,
03:10:48.360 | pump and dump type things where,
03:10:50.680 | like I'll never forget in 2017,
03:10:54.800 | a friend who's a spectacularly successful investor said,
03:10:59.800 | "You should put 2% of your investable earnings
03:11:02.560 | "into Bitcoin."
03:11:03.840 | And I was like, "Well, I don't know about that."
03:11:06.600 | And then not long after that,
03:11:09.960 | there was some press releases about who was buying Bitcoin
03:11:13.840 | and the price shot up.
03:11:15.920 | And then I went back to them and said,
03:11:20.760 | "I have to be really careful here,"
03:11:22.160 | and said, "You were right."
03:11:24.440 | They said, "Yeah, but whenever you read
03:11:29.440 | "about who's buying Bitcoin,
03:11:31.860 | "it's not clear when they bought that."
03:11:34.320 | A lot of those purchases were likely made a long time ago.
03:11:37.640 | So there are ways that people kind of,
03:11:40.620 | build some hydraulic pressure
03:11:45.080 | through social interaction on these things, right?
03:11:48.200 | He's not, it's very different
03:11:51.600 | than someone picking up the phone
03:11:53.040 | and the whole notion of insider trading, right?
03:11:56.120 | Very different.
03:11:57.060 | If people kind of create a swell around something,
03:12:01.480 | "This is great," or, "Let's make it real estate."
03:12:03.960 | It's a little more tangible for people.
03:12:05.920 | That neighborhood is really terrific.
03:12:07.540 | We're all gonna move there.
03:12:08.600 | And then people start moving there.
03:12:11.200 | And then you realize that they've actually owned
03:12:12.680 | that very inexpensive property for a long time.
03:12:15.440 | And they're actually the seller.
03:12:17.800 | You get a very different impression
03:12:18.960 | of the advice that you got.
03:12:21.760 | So, and I'm not a finance guy,
03:12:24.360 | but I think about these things
03:12:25.840 | in terms of the neuroscience and the human psychology.
03:12:27.920 | I mean, it just, again, I'm just struck by
03:12:30.600 | how our notion of valuation is adjusted in the short term
03:12:36.600 | by virtue of proximity, probably also in the long-term,
03:12:40.540 | but that how we kind of lose ourselves in these things,
03:12:44.280 | that we just become less than rational
03:12:47.540 | based on things like arousal,
03:12:52.060 | the relationship between hormones and arousal.
03:12:53.820 | What I love about what we're doing today is,
03:12:55.860 | in case people haven't caught on already,
03:12:57.380 | is that we've got multiple mechanisms and themes here
03:12:59.980 | that are starting to converge.
03:13:01.860 | As arousal goes too high,
03:13:04.460 | it's mostly what we're exploring,
03:13:05.900 | you start making, you start speeding up,
03:13:08.040 | you start misjudging information,
03:13:11.340 | you think noise is signal,
03:13:13.980 | and you start correlating things
03:13:17.260 | that are like not correlated in reality.
03:13:20.260 | And then you can quickly find yourself
03:13:21.760 | down the path of bad decisions.
03:13:23.340 | I think one of the best pieces of advice I ever got was,
03:13:25.520 | if somebody ever wants you to make a decision
03:13:28.080 | very, very quickly, and it's not clearly an emergency,
03:13:31.760 | like you don't see them hemorrhaging,
03:13:34.320 | chances are it's a scam.
03:13:35.560 | This is the best thing to tell
03:13:38.960 | anyone that's older, let's say,
03:13:42.000 | because they'll get these calls from people,
03:13:43.480 | and it's like, this is urgent, this is,
03:13:44.880 | the urgency usually is suggestive of it being false,
03:13:49.880 | like using time pressure on people.
03:13:53.240 | I need this money now.
03:13:54.780 | I'm going to miss my bus,
03:13:55.620 | and my kid's going to be waiting for me, this kind of thing.
03:13:57.400 | I mean, and it like pulls on you, right?
03:13:58.920 | You don't want some kid waiting out in the middle of nowhere,
03:14:00.960 | but if this is somebody you don't know,
03:14:03.360 | then you could say, well, maybe five bucks,
03:14:04.960 | I'm willing to lose it.
03:14:06.600 | Maybe that's probably the calculation I would do.
03:14:08.480 | I'm willing to lose it.
03:14:09.640 | If they're lying, okay.
03:14:12.640 | If they're telling the truth, great.
03:14:14.800 | But when it starts getting higher stakes,
03:14:17.280 | it gets kind of scary.
03:14:19.560 | - I think we should address the word rational,
03:14:23.480 | or rationality, that you used,
03:14:25.340 | which is like, oh, we're being irrational.
03:14:27.560 | It seems like we lose our rationality.
03:14:31.280 | - That's a word that's bandied about a lot, right?
03:14:35.680 | And especially in economics,
03:14:37.080 | and that kind of makes the assumption
03:14:38.680 | that we are essentially a computer,
03:14:41.560 | and we just kind of weigh things up dispassionately,
03:14:45.920 | and we have complete access to all information
03:14:49.680 | here and now and into the future.
03:14:51.640 | There are other concepts here.
03:14:53.360 | One is called bounded rationality,
03:14:55.920 | which this guy Gerd Gigerenzer kind of came up with,
03:14:59.960 | which is the idea that there are constraints,
03:15:02.840 | there are brain constraints that are built in.
03:15:04.560 | We've got energetic constraints,
03:15:06.000 | you know, which actually limit
03:15:08.280 | how much information we can process,
03:15:10.840 | which is why we fall prey to choice fatigue
03:15:13.080 | and decoy effects and things like that,
03:15:14.640 | why we see visual illusions in some ways.
03:15:17.360 | And then there's another concept of ecological rationality,
03:15:23.240 | which takes that bounded rationality,
03:15:24.880 | and it puts it into what you might call
03:15:27.120 | the environment of evolutionary adaptation,
03:15:29.160 | which kind of we've talked about before.
03:15:30.160 | Like, what's the environment our brains are designed for?
03:15:33.780 | And it's not the one we're in right now.
03:15:35.840 | So our brains are designed for, I mean,
03:15:37.560 | probably 130 million years ago,
03:15:38.920 | but let's say 200,000 years ago,
03:15:40.480 | our species, right, Homo sapiens.
03:15:44.900 | And what was that environment like?
03:15:46.960 | Well, we lived in small groups with face-to-face contact
03:15:51.120 | of somewhere between probably 20
03:15:52.860 | and no more than 100 people.
03:15:55.240 | You saw, you knew all of them.
03:15:56.920 | You know, you talk to them every day.
03:15:59.140 | Things didn't really move faster than an antelope
03:16:02.360 | or change faster than the seasons, okay?
03:16:05.640 | There was very little wealth inequality, okay?
03:16:11.360 | People were physically active all day long,
03:16:14.820 | and they ate natural food, right?
03:16:19.820 | And so what are we like now?
03:16:22.760 | We're in these so-called weird environments,
03:16:25.280 | Western, educated, industrialized, rich, democratic,
03:16:29.260 | but we're in these industrialized societies.
03:16:32.120 | We have money.
03:16:33.620 | We're in markets.
03:16:34.460 | We're interacting with thousands of people,
03:16:36.420 | perhaps millions of people.
03:16:37.740 | Their behaviors, their thoughts,
03:16:39.900 | everything are impinging on us.
03:16:42.180 | Stuff is changing super, super fast, right?
03:16:44.980 | We sit on our butts all day long.
03:16:46.980 | We're not active, right?
03:16:48.660 | It's like you have to be intentional
03:16:51.260 | and have enough resources to be active,
03:16:53.500 | and we eat garbage.
03:16:55.260 | And for those reasons,
03:16:57.860 | I think that's the source of a lot of the misery
03:16:59.900 | that we have is that now,
03:17:02.300 | now I'm not saying we should go back to being subsistence,
03:17:05.020 | you know, hunter-gatherers or horticulturalists.
03:17:06.700 | Maybe we should.
03:17:07.940 | It'll be a painful process to get there,
03:17:09.740 | and we may end up getting there
03:17:11.100 | given some of the trajectories that we're on.
03:17:14.820 | But people who live in those environments
03:17:16.900 | seem to generally be healthier and happier.
03:17:21.460 | For example, you know, like studies of brain and body
03:17:26.460 | in subsistence hunter-gatherers and horticulturalists.
03:17:30.740 | People who are in their 70s look like people,
03:17:33.060 | you know, Westerners in their 30s or younger.
03:17:35.740 | They're incredibly fit, lean,
03:17:37.900 | no evidence of cardiovascular disease,
03:17:39.780 | no evidence of anything like, you know,
03:17:41.700 | dementias, major cognitive decline.
03:17:45.140 | - And we're always trying to hack that.
03:17:46.540 | - We're trying to hack it.
03:17:47.380 | - We're trying to look at blue zones,
03:17:48.200 | and then people say, well, it's the diet.
03:17:49.220 | No, it's the wine, by the way.
03:17:51.340 | It's not the wine.
03:17:52.180 | - It's not the wine.
03:17:53.020 | - Now, finally, after years,
03:17:56.020 | I'm not going to say I told you so,
03:17:57.300 | but alcohol's not good.
03:17:59.580 | A little bit maybe every once in a while,
03:18:01.100 | but not more than a little bit.
03:18:03.020 | - I'll respond to that.
03:18:04.140 | - Yeah, but the social component seems critical.
03:18:07.380 | - Yeah, yeah.
03:18:08.620 | - What are your thoughts on the longevity movement,
03:18:11.340 | if you will?
03:18:12.180 | Like, I always assumed if I do well
03:18:14.700 | that I'll probably live to be somewhere between 85 and 102.
03:18:20.380 | And my hope is that my last five years,
03:18:23.940 | I like the Peter Attia thing,
03:18:25.060 | like, what is the quality of your final decade,
03:18:28.060 | will be at least as vigorous as my dad's.
03:18:31.820 | My dad's 80, gosh, he's 81,
03:18:34.780 | and he's doing great mentally, physically.
03:18:38.080 | He was a guest on this podcast, actually.
03:18:40.700 | He's a scientist.
03:18:41.540 | Yeah, we, it was, and so,
03:18:44.700 | but he's always been very, very moderate about his drinking.
03:18:49.500 | He'll have like a half a glass of wine now and again.
03:18:51.300 | He just never ate too much.
03:18:52.340 | He never exercised too much.
03:18:53.500 | He worked nine to five, nine to six, just consistently.
03:18:57.260 | But he never was a, you know,
03:18:58.260 | like burn the midnight oil type,
03:18:59.420 | but he's just, his consistency is what's so impressive.
03:19:02.540 | So I think that might have something to do with it.
03:19:03.980 | But what are your thoughts on the economics
03:19:08.420 | of decision-making as it relates to live fast, die young,
03:19:13.300 | versus be more monastic and try and live a very long time?
03:19:19.220 | - Well, that's a personal preference, isn't it?
03:19:21.120 | So I, you know, and that kind of, it's interesting,
03:19:25.140 | 'cause it maps on to concepts in ecology
03:19:28.460 | that typically we use to describe different species,
03:19:31.500 | which are like R-selected and K-selected.
03:19:34.220 | I don't know if you've heard of this before,
03:19:35.500 | but R-selected are species that are limited
03:19:38.780 | just by their pure reproductive rate.
03:19:40.780 | Think about weeds or rabbits, you know, something like that.
03:19:43.900 | And K-selected are like oak trees.
03:19:46.660 | And, you know, I don't know.
03:19:49.060 | - Whales. - Whales.
03:19:50.460 | And humans are sort of like this mix, right?
03:19:54.100 | And so that's where we are very plastic and flexible.
03:19:58.380 | So in some environments, you can be more R-selected,
03:20:02.240 | like especially if conditions are really not very favorable
03:20:05.140 | toward investing in the long-term,
03:20:08.180 | then it's kind of like kicking up your reproductive output.
03:20:11.060 | But if conditions can be favorable, right,
03:20:13.820 | that investment is worthwhile, then you, you know,
03:20:18.660 | then you can do that and be more like the whale
03:20:20.460 | or the oak tree or something like that.
03:20:22.580 | Now, you know, yeah, you're right.
03:20:25.040 | I think that does map on to economics in a certain way,
03:20:28.620 | because, you know, certain people,
03:20:31.120 | by virtue of what they know and what they have,
03:20:35.360 | can invest in trying to live the longest, healthiest life.
03:20:38.500 | And other people who may not either have the wherewithal
03:20:42.720 | or the knowledge are going to be invested
03:20:46.220 | in surviving until the next day, right?
03:20:49.300 | And so humans are sort of, you know,
03:20:52.580 | exist in that whole space in between.
03:20:55.000 | My dad died at 55, so I've outlived him by two years.
03:20:58.580 | You know, so for me, every day is like gravy,
03:21:02.460 | but I also don't have this sense of a long time horizon,
03:21:07.300 | which is, you know, just being a little weirdly self,
03:21:10.820 | you know, just introspective.
03:21:12.600 | Maybe, you know, part of my drive to like work a lot.
03:21:17.480 | - And it might've served you well.
03:21:18.740 | I mean, I've read and listened to Steve Jobs's biographies
03:21:23.700 | by Walter Isaacson several times,
03:21:25.580 | in part because I grew up seeing that stuff happening
03:21:28.460 | as I was born and raised in the South Bay.
03:21:30.380 | And Steve used to come into the toy store/skateboard shop
03:21:35.120 | that I worked at to get new roller blade wheels.
03:21:37.060 | And so like, I would like see him around
03:21:38.700 | and see him at this little shop called Shady Lane,
03:21:41.740 | which is like little trinkets.
03:21:42.980 | Like he was around town a lot.
03:21:44.860 | And so then of course he became Steve Jobs, right?
03:21:47.280 | And, or he was Steve Jobs and he stayed Steve Jobs.
03:21:49.860 | But in that book he talked about,
03:21:53.260 | or it was said about him, that
03:21:55.820 | humans' knowledge that we are going to die someday
03:21:58.680 | can be the ultimate motivator.
03:22:01.100 | I mean, I think I look at some of the mistakes I made with,
03:22:04.680 | you know, bringing myself to places of physical risk
03:22:07.140 | in my life, and it's not like I thought I was immortal,
03:22:10.700 | but I didn't really have a good sense of time.
03:22:13.580 | And I think as I get older, I'm 49 now,
03:22:16.580 | so I can finally say that,
03:22:18.260 | I think my sense of the passage of time and mortality
03:22:21.700 | is much more visible to me in my psychology.
03:22:25.340 | So yeah, this is the ultimate time scale
03:22:28.660 | over which one has to make decisions.
03:22:31.740 | So actually let's talk about this.
03:22:35.840 | So if your dad died two years younger than you are now,
03:22:40.060 | do you have the assumption that you'll make it
03:22:41.720 | to a given age or are you just trying to maximize
03:22:43.560 | on the day, the week, the month?
03:22:44.820 | What's your unit of time scale?
03:22:48.180 | - So I think I did not anticipate,
03:22:50.920 | like 55 was a magic number for me,
03:22:54.420 | the double nickel, you know, like Michael Jordan.
03:22:56.820 | I didn't know what I would do or think about when I,
03:23:02.880 | if I got past that, and I got past it.
03:23:05.500 | And I was like, okay, I got past it.
03:23:06.620 | But now I'm kinda, I'm confused, I guess.
03:23:11.620 | I mean, physically, I showed no evidence
03:23:16.280 | of like rapid decline.
03:23:17.800 | - No, you seem, you appear very healthy.
03:23:20.880 | Not just for your age, but you're like very physically fit.
03:23:24.580 | You're cognitively fit, clearly.
03:23:27.300 | - So that started to, you know,
03:23:30.620 | I think that I'm opening up and I'm trying to
03:23:34.860 | look at some wisdom that's out there about like,
03:23:37.840 | hey, yeah, you know, probably got a lot of life ahead of me.
03:23:41.540 | And if I keep doing what I'm doing, what do I want to do?
03:23:46.020 | That's another part of it.
03:23:47.100 | Because, probably because of the focusing on that 55,
03:23:52.100 | and I'm like, oh God, everything that's come my way,
03:23:56.520 | every opportunity that's come my way, I've taken it.
03:23:58.860 | And I just keep adding.
03:23:59.980 | I've never, I don't subtract.
03:24:01.460 | - Ooh.
03:24:02.300 | (laughs)
03:24:03.120 | - We could talk about this.
03:24:03.960 | I'm just adding more and more things.
03:24:05.900 | You can look at the diversity of papers that I've published
03:24:09.220 | or the other things that I'm doing,
03:24:10.820 | and it's just getting wider and wider.
03:24:13.060 | And I keep taking more things on
03:24:15.480 | and reluctant to give anything up.
03:24:18.460 | But at some point, I think that's not the recipe
03:24:22.260 | for success.
03:24:23.220 | Like there's going to have to be some winnowing.
03:24:27.220 | And I, you know, okay, 57.
03:24:28.620 | So when's that going to happen?
03:24:30.740 | I don't know, 62, 65.
03:24:32.500 | I mean, I, you know, I don't plan on retiring.
03:24:34.880 | Although, you know, also that was like, wow,
03:24:37.560 | people in my family never got the chance to retire.
03:24:39.560 | (laughs)
03:24:40.400 | Because they didn't live long enough.
03:24:42.040 | - This is a-
03:22:42.880 | - But that's the short way to death and decline though, I think, too,
03:22:44.680 | like retiring, if you don't do something else.
03:24:47.680 | - Yeah.
03:24:48.520 | - And so-
03:24:49.340 | - I might have a solution for you.
03:24:50.180 | - Okay, good.
03:24:51.020 | Give it up.
03:24:51.840 | - I've thought about this a ton.
03:24:52.680 | - Give it up.
03:24:53.520 | - And I think about time perception.
03:24:54.340 | - Yeah.
03:24:55.180 | - Constantly.
03:24:56.020 | And people can laugh because I'm always late,
03:24:57.400 | but that's because I'm really enveloped
03:24:58.760 | in whatever I was doing previously.
03:25:00.520 | - Yeah.
03:25:01.360 | - So I'm thinking about this conversation tonight
03:25:02.620 | and tomorrow morning when I wake up, for sure.
03:25:04.820 | - Right.
03:25:05.660 | - Okay, a couple of reflections and then ideas about this.
03:25:08.340 | So previous guest on this podcast, Josh Waitzkin,
03:25:13.240 | Grandmaster Chess Champion at a very young age,
03:25:17.060 | then realized at some point, started asking the question
03:25:22.060 | of whether or not his love for the game was gone
03:25:25.260 | or whether or not it was taken away from him.
03:25:26.620 | Was it the fame?
03:25:27.540 | Was it the, 'cause a lot of things came to him young
03:25:29.700 | around chess, and he spent two years
03:25:33.800 | asking himself that question and then cut ties
03:25:36.500 | with chess forever, never picked up a chess piece again.
03:25:39.680 | But pivoted into martial arts, investing,
03:25:44.300 | now foiling, you know, this like.
03:25:48.160 | - Yeah.
03:25:49.000 | - But then had a near death experience.
03:25:52.080 | He had a drowning event, survived fine,
03:25:55.360 | decided then to move his family down to Costa Rica
03:25:57.760 | where he now spends four and a half hours a day
03:26:00.180 | or more foiling, raising his sons.
03:26:02.660 | I, you know, that struck such a chord with me.
03:26:08.140 | I've been involved in a number of things.
03:26:09.460 | I don't want to make this about me,
03:26:10.460 | but like, you know, early on it was like fish.
03:26:12.140 | I was obsessed with fish and birds and skateboarding
03:26:14.300 | and then firefighting.
03:26:15.380 | Then eventually it was like neuroscience.
03:26:17.020 | And then now I do this.
03:26:18.020 | And so I would say I've read a lot about people
03:26:21.540 | who need something to bite down into, they can't retire.
03:26:27.460 | And it seems to me my informal read of this
03:26:30.580 | is that the ones that are happiest who don't die young
03:26:34.180 | or in their fifties, like Steve Jobs did,
03:26:36.300 | tend to be for lack of a better way to describe it,
03:26:41.940 | kind of serial monogamists as it relates to their pursuits,
03:26:46.940 | which is the way I would describe myself.
03:26:48.780 | Like I'm, you know, like super into
03:26:51.500 | whatever it is professionally.
03:26:52.900 | And then after about anywhere from five to 15 years,
03:26:56.940 | it's like done and kind of move forward
03:27:01.220 | some of the elements and the learnings
03:27:02.660 | from that into the next thing.
03:27:03.740 | But Josh Waitzkin is the ultimate example of this,
03:27:06.700 | of achieved like world champion status in multiple things.
03:27:10.220 | And then now seems very, very much
03:27:13.020 | to achieve world champion status at like family life.
03:27:17.220 | And he's got his, you know,
03:27:19.700 | economic and professional life intact
03:27:21.860 | from the previous stuff, but also he's still involved.
03:27:23.700 | He's, you know, like he coaches for the Celtics
03:27:25.560 | and he's not the head coach, but he coached and so on.
03:27:28.420 | So I feel like the serial monogamy version of this
03:27:31.300 | is the ultimate.
03:27:33.020 | And then the question is when to cut
03:27:35.300 | and pivot into the next thing.
03:27:38.020 | But I'm not telling you not to go broad.
03:27:40.380 | But I was looking over your CV and your papers.
03:27:42.300 | And I was like,
03:27:43.140 | this is gonna be a really interesting conversation
03:27:44.740 | 'cause you have worked
03:27:45.580 | on a tremendous number of different things.
03:27:47.540 | Adding in more isn't the thing, but then again,
03:27:50.180 | maybe some of us are just designed to be involved
03:27:52.500 | in a ton of different stuff.
03:27:53.780 | And your vigor is undeniable, right?
03:27:56.160 | Like you're super fit mentally and physically.
03:27:58.600 | So I don't know, I'm not gonna tell you what to do.
03:28:01.460 | I just, I offer Josh as an example of one extreme.
03:28:05.320 | You're sort of at the other extreme, I suppose.
03:28:07.040 | And then, and I suppose I'm kind of in the middle.
03:28:10.080 | - So I think, well, I've done a bit of that over my career.
03:28:14.720 | And the way that has happened
03:28:16.280 | is through external leadership opportunities.
03:28:22.280 | Not really opportunities.
03:28:24.560 | It was like, you need to do this.
03:28:26.700 | Like, you know, so as your, kind of, you know,
03:28:30.060 | broad portfolio is getting bigger.
03:28:31.740 | And then somebody says, I want you to direct this thing.
03:28:34.300 | And then I would say, well, then I have to,
03:28:36.860 | I have to cut some of this out
03:28:38.220 | and then go back and narrow again.
03:28:40.460 | And then the problem is then I start to do this again.
03:28:43.340 | And then, okay, then I moved to, you know, Penn and Wharton
03:28:46.340 | and I got to narrow again.
03:28:47.460 | But now it's bigger than it ever has been.
03:28:50.940 | So the question is like, at some point,
03:28:53.000 | does that just fall apart?
03:28:56.640 | Or can I, I think intentionally, you know, at some point,
03:29:01.260 | and maybe some of those decisions, you know,
03:29:03.180 | the other thing, the other thing one can do
03:29:05.900 | is just allow the universe
03:29:09.900 | to make some of those decisions for you, right?
03:29:13.180 | It might be the case that some of the research I do
03:29:15.780 | will not be fundable at some point, right?
03:29:18.780 | And then that decision is made for me.
03:29:20.260 | I know I can have lots of other things I can do.
03:29:23.280 | You know, we have, you know, for example,
03:29:25.260 | if like outside of the pure neuroscience,
03:29:27.220 | basic and clinical and technology development,
03:29:30.940 | you know, we've got all this corporate facing work,
03:29:34.320 | funded work through Wharton,
03:29:36.500 | which is a totally new space, a new opportunity.
03:29:38.740 | So if, you know, if one, you know,
03:29:40.820 | I don't want it to be taken away, but if it is,
03:29:42.940 | then, you know, there's plenty left to do.
03:29:45.580 | - Well, you're clearly one of the few people
03:29:47.300 | that I'm aware of that is, you know,
03:29:49.740 | a true card carrying research neuroscientist,
03:29:51.920 | highly respected in the domain of like real neuroscience,
03:29:56.020 | who's also involved in like business school type stuff.
03:29:59.660 | And, you know, people on both sides of that
03:30:03.100 | take you really seriously
03:30:04.260 | because there's real rigor there.
03:30:06.060 | And that vigor perhaps goes along with it.
03:30:08.580 | It's hard to know what's causal there,
03:30:10.940 | which comes first, but in any event,
03:30:14.500 | maybe we parse this over a coffee sometime.
03:30:17.380 | Apple and Samsung.
03:30:20.620 | - Apple and Samsung.
03:30:22.060 | - I'm an Apple guy.
03:30:23.180 | - Me too.
03:30:24.620 | - But I heard that the camera's better on the Samsung phone.
03:30:27.220 | - How much do you love your iPhone?
03:30:30.560 | What's your loyalty?
03:30:31.540 | What's your brand loyalty for Apple?
03:30:33.900 | - I grew up near the original Apple store.
03:30:36.340 | It was in a different location.
03:30:37.900 | Gosh, I love Apple products.
03:30:42.160 | I don't like that they keep changing the ports.
03:30:44.620 | - I know, that's annoying.
03:30:45.460 | - That's super annoying,
03:30:47.140 | but they seem to be like hovering on USB-C, you know?
03:30:50.820 | I love the ease and simplicity.
03:30:55.340 | And yeah, I have a bit of a kind of like a historical
03:30:59.340 | South Bay relationship to it.
03:31:00.980 | So yeah, I would say,
03:31:02.180 | could you get me to use a Samsung phone?
03:31:06.140 | - Yeah, so, I mean, this is the interesting observation,
03:31:08.180 | like loyalty for, let's just talk about smartphones
03:31:11.740 | and Apple and Samsung are the dominant players
03:31:13.380 | in the U.S. market.
03:31:14.480 | They're basically the same device, you know?
03:31:16.640 | I mean, they're both little handheld computers
03:31:18.700 | that can do a million things and amazing stuff.
03:31:22.540 | And yet the loyalty amongst the Apple users
03:31:25.420 | is through the roof.
03:31:26.260 | It's near 100% year in, year out.
03:31:29.120 | And that's not true for Samsung.
03:31:30.940 | - Despite Steve Jobs passing away.
03:31:33.140 | - Yeah, despite, I mean, it's just a legacy, right?
03:31:35.200 | But I think that reflects a lot of the design
03:31:39.060 | and the emphasis I think that he put into the product,
03:31:44.060 | but also like trying to understand it
03:31:46.500 | through the lens of the customer, right?
03:31:48.460 | So that's like an empathy.
03:31:49.820 | I mean, it really is empathy.
03:31:51.580 | This was one of the first questions I got
03:31:52.980 | when I came to business school.
03:31:55.220 | I was like, what the hell accounts for empathy for a brand?
03:31:57.840 | It's not a person or this connection.
03:32:00.000 | Like, why do I have loyalty to a thing
03:32:03.360 | that's not a human being?
03:32:05.420 | You know, a product and a brand, a company, right?
03:32:07.980 | What in the world is that all about?
03:32:09.140 | Doesn't make sense.
03:32:10.020 | And there's this idea in marketing that actually,
03:32:11.980 | and it makes sense, is that what's happening
03:32:14.740 | is we're applying, or leveraging,
03:32:19.180 | the hardware in our brains that's used to connect to people.
03:32:22.160 | And now it's connecting to brands
03:32:23.880 | and to the brand community.
03:32:25.940 | And you could see that kind of in like the words
03:32:27.980 | we use to talk about brands, for example.
03:32:30.780 | That's a rugged brand, that's a creative brand.
03:32:32.300 | You know, we use personality words to talk about them.
03:32:34.620 | Say, I love my brand.
03:32:35.900 | I hate that brand, whatever.
03:32:37.020 | - And Steve understood this.
03:32:38.700 | He talked about the Apple icon needing to look a certain way
03:32:41.780 | that it was like friendly, but technically right.
03:32:44.300 | But you know, balance.
03:32:45.260 | I mean, this is the idea that objects or images
03:32:49.180 | could look friendly when they weren't.
03:32:51.060 | Objects or images of faces or bodies
03:32:53.380 | is a very interesting concept.
03:32:54.820 | - Yeah, yeah.
03:32:55.660 | So we decided to test this idea.
03:32:57.580 | So we brought people into the lab.
03:32:58.920 | We've done now like, I don't know, 10 studies on this.
03:33:02.060 | We brought people in who are Apple or Samsung users.
03:33:05.800 | And the first experiment we did
03:33:07.840 | was a brain imaging experiment.
03:33:09.780 | And first we just asked all the standard marketing questions.
03:33:13.180 | How long have you had the product?
03:33:16.020 | How much do you love it?
03:33:17.020 | What's your loyalty?
03:33:17.860 | What's your identification with it?
03:33:20.020 | You know, et cetera.
03:33:20.860 | And it was equivalent, Apple and Samsung.
03:33:23.260 | They said the same things about their brands.
03:33:26.800 | Okay, so now we bring them into the lab,
03:33:29.700 | put them in the MRI machine,
03:33:31.140 | and we're gonna expose them to news bits
03:33:34.140 | about each of the two brands,
03:33:35.540 | like something good happened to Apple,
03:33:36.980 | something bad happened to Apple or Samsung, et cetera.
03:33:39.820 | And we asked them to rate it.
03:33:40.980 | How do you feel?
03:33:41.860 | Good, neutral, bad about that.
03:33:44.080 | And then we go through the whole thing.
03:33:44.920 | We scan their brains,
03:33:45.760 | take pictures of what's going on in their brains.
03:33:47.680 | And it was really interesting
03:33:49.520 | 'cause what we found is that behaviorally,
03:33:51.380 | in terms of their responses,
03:33:52.380 | they both expressed empathy for their brands,
03:33:54.320 | which hadn't been really measured before.
03:33:55.540 | I feel good for good news about my brand,
03:33:57.720 | bad for bad news about my brand.
03:33:59.740 | And for Apple customers, they said,
03:34:01.180 | yeah, that's really true for Apple,
03:34:02.580 | but I don't feel so much about Samsung.
03:34:04.260 | Samsung customers said,
03:34:05.620 | yeah, I feel really strongly about my brand.
03:34:07.300 | Oh, and they had reverse empathy
03:34:08.900 | or schadenfreude for Apple,
03:34:10.340 | which was part of the story.
03:34:11.940 | So they felt really good
03:34:13.260 | when something bad happened to Apple,
03:34:14.620 | and they felt really bad
03:34:15.780 | when something good happened to Apple.
03:34:17.500 | Talk about tribalism.
03:34:18.560 | That's tribalism right there.
03:34:21.120 | When you look at their brains, it's totally different.
03:34:24.680 | So Apple customers show empathy in their brains for Apple.
03:34:29.620 | You get activation of areas that are active
03:34:32.860 | for reward for myself,
03:34:33.900 | reward for my kid winning the spelling bee for good news,
03:34:36.660 | and the extended pain network,
03:34:38.940 | pain for me, pain for my loved one,
03:34:41.380 | for things that happen to Apple.
03:34:42.580 | If I'm an Apple customer, it's silent for Samsung.
03:34:45.020 | You look at Samsung customers,
03:34:47.060 | and if you're the CMO of Samsung,
03:34:48.840 | you should be worried
03:34:49.680 | because they show absolutely nothing,
03:34:52.020 | no feelings towards anything that happened to Samsung.
03:34:56.100 | The only thing you see is this schadenfreude,
03:34:59.620 | this reverse empathy.
03:35:00.560 | So you see pain for good news about Apple
03:35:03.920 | and joy for bad news about Apple.
03:35:08.340 | So the first take-home message is it's all about Apple.
03:35:11.500 | Apple customers choose Apple 'cause they love Apple
03:35:14.100 | and they wanna be part of something bigger,
03:35:16.060 | and I'll get to that in a second.
03:35:17.420 | And Samsung users choose Samsung 'cause they hate Apple.
03:35:20.300 | So that's sort of that--
03:35:21.860 | - Or they hate winners.
03:35:22.980 | - They hate winners, whatever.
03:35:23.820 | There's a whole thing.
03:35:24.640 | They don't wanna be part of the community.
03:35:27.040 | They don't want something bigger than themselves.
03:35:28.860 | There is something, too, in our data
03:35:30.640 | that essentially Apple is kind of like a cult.
03:35:33.020 | You could say it's a family.
03:35:34.700 | - I would say it's the dominant culture now, though.
03:35:37.180 | - Yeah, well--
03:35:38.020 | - They're not the niche-like thing.
03:35:39.420 | Samsung's the niche thing, right?
03:35:41.140 | They're, yeah.
03:35:42.140 | - Well, what's really fascinating now is that,
03:35:45.460 | and this is by virtue of things that Apple,
03:35:48.820 | smart things Apple has done to reinforce
03:35:51.420 | this sense of in-group, be part of that community,
03:35:54.160 | like the green text bubble thing.
03:35:57.780 | So now it's like 91% of teenagers
03:36:00.620 | are choosing Apple over Samsung
03:36:02.140 | 'cause they don't wanna be left out.
03:36:03.540 | They don't wanna be ostracized.
03:36:05.320 | Now, we talked about synchrony before,
03:36:07.940 | and synchrony is this marker of community and closeness,
03:36:11.020 | and we're all on the same team.
03:36:13.260 | So we use the EEG to measure brainwaves in people
03:36:18.220 | while they're, in a number of conditions,
03:36:20.820 | while they're getting news about Apple and Samsung,
03:36:24.320 | also while they're watching the commercials.
03:36:26.520 | You remember that spectacular Apple commercial
03:36:28.600 | where they crushed all those beautiful instruments
03:36:31.400 | and whatnot and turned into an iPad or whatever,
03:36:34.120 | and then there was the Samsung response to that.
03:36:36.880 | So we measure EEG activity, and what we found
03:36:38.600 | is that Apple people are all in sync with each other.
03:36:41.600 | Their brains are humming along at the exact same rhythm.
03:36:45.360 | To news of the world, to ads about Apple and Samsung,
03:36:50.680 | each Samsung person is like an island unto themselves.
03:36:53.220 | They're just not in sync at all.
03:36:54.980 | - These are like the incels of technology.
03:36:58.080 | - You said it, I didn't say it.
03:37:00.080 | Probably gonna catch a lot of flack for this,
03:37:02.200 | but look, the data is the data.
03:37:04.640 | And so Apple's this sort of extended family,
03:37:08.600 | and they're all in sync with each other.
03:37:10.640 | They're like a real team,
03:37:12.860 | and you don't see that in Samsung.
03:37:15.200 | So Apple people are seeing the world through similar eyes
03:37:18.480 | and feeling similar things.
03:37:22.400 | And beyond that, I said, well, if this is all true,
03:37:27.040 | it's a question of now, are Apple people wired that way
03:37:30.320 | at birth, in essence, or what's the balance
03:37:36.200 | of who they are versus what Apple has done
03:37:39.400 | through their marketing and design activities?
03:37:44.880 | We can't do that experiment, it's really hard.
03:37:47.200 | But when we looked at the structural MRI data,
03:37:50.040 | we found something really interesting,
03:37:51.140 | which is the parts of the brain
03:37:52.640 | that are really intimately involved
03:37:54.300 | in managing our social relationships.
03:37:57.200 | So the parts that are like involved
03:37:58.320 | in theory of mind and empathy.
03:37:59.800 | So those are physically larger in Apple people
03:38:04.960 | than in Samsung people.
03:38:07.040 | They're physically larger.
03:38:08.240 | In monkeys, monkeys who have more friends,
03:38:10.760 | those same brain areas are bigger
03:38:12.960 | than monkeys who have fewer friends.
03:38:14.800 | - Please tell me you've run this
03:38:15.960 | on politically leaning left
03:38:17.400 | versus politically leaning right.
03:38:18.760 | - So we started to do that experiment
03:38:21.020 | and then the pandemic hit.
03:38:22.120 | So we haven't gotten that back off the ground.
03:38:23.800 | - Nice excuse, Platt.
03:38:24.640 | - It is, I know it's an excuse.
03:38:26.000 | We haven't gotten that back off the ground.
03:38:26.840 | - I'm just kidding. - No, it's true.
03:38:28.120 | - I would be too afraid to run that experiment.
03:38:30.240 | Not because I'd be concerned about the result
03:38:32.480 | or what people would say if I shared the result,
03:38:34.200 | but here's why.
03:38:35.420 | I feel like when I was growing up,
03:38:39.500 | it was like, if you were a rebel,
03:38:42.440 | you were associated with anything like indie music,
03:38:44.960 | punk rock, you were associated with hip hop,
03:38:47.880 | anything that was kind of outside the mainstream,
03:38:49.920 | which at the time, this was like the '80s and '90s,
03:38:52.600 | we had a mix of Republican and Democrat governments
03:38:56.040 | at that time, depending on which four-year segment
03:38:58.160 | we're talking about.
03:38:59.240 | But there was this idea that if you liked anything
03:39:04.240 | about the government, this is kind of the carryover,
03:39:07.340 | I think, from the Vietnam era and the post-Vietnam era,
03:39:10.140 | that if you liked anything associated with government,
03:39:12.400 | that you were a conformist.
03:39:14.160 | And if you didn't, you were an iconoclast, right?
03:39:17.440 | Now I feel like it's become very issue-specific, right?
03:39:21.760 | Like who's in power basically that the party,
03:39:25.600 | like politics has, it was always split into two.
03:39:28.800 | It used to be you agree with the establishment
03:39:31.380 | or you don't agree with the establishment.
03:39:32.920 | Now it's like, depending on who's in power,
03:39:36.160 | people say, well, they're the establishment.
03:39:37.740 | So it's like the game has changed.
03:39:40.200 | It's sort of subdivided itself and changed.
03:39:42.600 | - And so if one were to run the experiment
03:39:44.840 | of kind of like affiliation, I would assume,
03:39:48.680 | my prediction would be that within the right,
03:39:51.520 | there's a lot of affiliation.
03:39:52.640 | Within the left, there's a lot of affiliation,
03:39:54.840 | but that you wouldn't necessarily see a difference
03:39:57.080 | in terms of activation of affiliative neural circuitry.
03:39:59.840 | It depends on with whom they're sitting.
03:40:02.360 | - Absolutely.
03:40:03.200 | - Which is very different than the phone situation
03:40:05.280 | that the Samsung versus Apple thing is a lot more
03:40:07.440 | like when I was growing up.
03:40:11.520 | And it's complicated because what used to be niche
03:40:14.300 | and rebellious inevitably becomes mainstream.
03:40:17.840 | Like I remember the movie "Revenge of the Nerds,"
03:40:20.360 | which of course was about like the nerds being marginalized
03:40:23.120 | and then being like the popular ones and on and on.
03:40:27.040 | Everything was like a John Hughes film,
03:40:28.360 | jocks versus rockers versus nerds.
03:40:32.920 | Things really blended together for 20 years or so.
03:40:36.980 | And then now it's very divided along the lines of politics.
03:40:41.620 | Whereas before it was politics versus non-conformists.
03:40:46.020 | Now it's like, depends on which camp,
03:40:48.460 | like literally what color you're wearing.
03:40:50.540 | Yeah, it's like gang warfare.
03:40:51.960 | - It is, yeah.
03:40:53.660 | - It's like blues, it's blues versus reds.
03:40:55.900 | - Jets and sharks.
03:40:57.260 | - Jets and sharks.
03:40:58.320 | So it feels very like the experiment,
03:41:02.300 | that's why I'd be afraid to run the experiment.
03:41:04.100 | I wouldn't know how to design the experiment.
03:41:06.860 | - Yeah, I mean, I think, well, it would be interesting
03:41:10.180 | just at the outset to demonstrate that,
03:41:14.140 | like a very easy way to elicit these sorts of empathy signals
03:41:18.820 | is to just create a fake video,
03:41:20.380 | which is what a former postdoc of mine
03:41:22.800 | did in some studies.
03:41:24.160 | You show, like, a fake needle stick to the cheek.
03:41:27.200 | And you get generally this sort of activation
03:41:31.140 | of empathy signals in the brain,
03:41:34.040 | but it tends to be tribal specific
03:41:36.400 | or ethnic group specific, which is like,
03:41:38.640 | even though people say I feel just as much pain
03:41:41.640 | for these two different people,
03:41:43.920 | the brain signals, which we know are what actually predicts
03:41:46.400 | what you'll do next, it predicts your behavior.
03:41:49.400 | The brain signals are specific to within your group.
03:41:52.160 | So I think that's what, that was in fact,
03:41:53.680 | the experiment we were gonna do,
03:41:54.520 | which is like people are gonna be,
03:41:56.440 | we'll have these videos of like a proud Republican
03:41:59.440 | or proud Democrat or whatever you wanna say on the hat.
03:42:02.320 | And then they're getting stuck with a needle.
03:42:04.960 | And then we ask you what you feel
03:42:07.080 | and then we measure your brain activity.
03:42:09.740 | And I think it would be obviously very highly specific.
03:42:12.740 | You might say you feel pain for that person
03:42:15.520 | who's from the other political party.
03:42:16.800 | Maybe now you wouldn't anymore.
03:42:17.900 | Maybe you'd be like, yay.
03:42:20.000 | - You see a lot of, gosh, you see a lot of people
03:42:24.680 | take enjoyment in other people's suffering
03:42:27.660 | when the person suffering is sort of perceived
03:42:32.660 | by a lot of others as a winner.
03:42:34.380 | - Yeah, we saw that with the fires
03:42:36.460 | with rich people's houses burning down
03:42:38.660 | and a lot of people piling on, oh, yeah, you know.
03:42:41.580 | - Well, the media was very skewed there.
03:42:43.340 | Like we were hearing about people's, you know,
03:42:48.340 | first of three homes burning.
03:42:50.220 | And that's hard for people that have very little.
03:42:53.260 | At the same time, you know, for anyone experiencing loss,
03:42:57.420 | it's loss, it's a tough one.
03:43:00.620 | - It's tough.
03:43:01.460 | - It's a tough one.
03:43:02.280 | Man, this conversation has given me a ton more
03:43:08.140 | to think about, which means it's a great conversation.
03:43:11.620 | I have to say, you know, in our business of research science
03:43:16.260 | there's that term, you know,
03:43:19.660 | he or she is a serious scientist.
03:43:23.700 | I feel like there are very, very few serious scientists
03:43:27.420 | doing experiments in the real world
03:43:29.860 | or trying to map to the real world.
03:43:31.460 | I probably just offended about 300 scientists,
03:43:33.520 | but hey, listen, we only have a limited number of guests
03:43:35.700 | we can bring on here anyway, so no, I'm just kidding.
03:43:38.580 | There are others certainly, but I have to just applaud you
03:43:41.700 | for the range of things that you've embraced
03:43:45.820 | and taken on at the level of neuroscience, anthropology,
03:43:51.100 | sociology, psychology, like, you know, endocrinology.
03:43:55.900 | This is a big field that you're trying
03:43:59.460 | to get your arms around, a big set of questions.
03:44:02.260 | And yet it's clear you are a serious scientist.
03:44:05.380 | You do like real experiments with isolating variables
03:44:09.300 | and all the necessary controls that are required
03:44:12.540 | to really tease out mechanism and larger themes.
03:44:16.260 | So, whereas a few minutes ago we were talking about
03:44:18.700 | maybe you taking on less.
03:44:19.940 | I would say, first of all, who am I to tell you what to do?
03:44:22.900 | And I'm not,
03:44:24.620 | I hope I didn't give that impression.
03:44:26.660 | And second of all, like,
03:44:28.700 | what a service to the world you're doing,
03:44:30.380 | because certainly in researching for this podcast
03:44:33.420 | and even with guests, you know,
03:44:35.620 | oftentimes it's really a struggle to try and figure out
03:44:38.260 | how to talk to someone who's really down
03:44:40.460 | at the level of mechanism,
03:44:41.580 | who's not working on small animal models,
03:44:44.100 | or even if they are, how to map that to everyday experience.
03:44:47.380 | And today, you know, we've been talking about
03:44:49.200 | potential mate valuation, meme coins, politics,
03:44:53.780 | hierarchies, decision-making, time scales.
03:44:57.140 | I mean, all through the lens of real serious science.
03:45:00.580 | So, first of all, thank you so much for coming here
03:45:04.420 | and spending these hours with us, educating us.
03:45:07.060 | And right alongside that,
03:45:09.420 | thank you for doing the work that you're doing.
03:45:11.220 | It's really spectacular.
03:45:12.900 | I knew we were gonna get into a number of these things,
03:45:15.900 | but I didn't really anticipate
03:45:19.080 | just how much it was gonna geyser out of this
03:45:22.200 | in terms of changing my way of thinking.
03:45:24.440 | And I'm certain that's changing the way
03:45:25.840 | that other people are thinking now
03:45:27.320 | and are going to think about their decisions
03:45:28.960 | and just kind of themselves and the world.
03:45:31.560 | I'd be very grateful if you'd come back again
03:45:33.760 | and talk to us about the next round
03:45:35.520 | of amazing experiments before too long.
03:45:37.500 | - Well, I would love that.
03:45:38.880 | And thanks for having me.
03:45:40.200 | And this has been a really stimulating conversation.
03:45:43.640 | I've enjoyed it.
03:45:44.480 | There is a lot more that we could cover,
03:45:46.820 | which would be super fun.
03:45:48.220 | - Surely.
03:45:49.060 | And your endurance is something to behold.
03:45:53.220 | Thank you.
03:45:55.780 | Please do come back again.
03:45:56.980 | Thank you for the work that you're doing.
03:45:58.380 | We will provide links to all the resources
03:46:00.780 | and places to find out more about your book,
03:46:02.820 | the work that you're doing,
03:46:03.660 | and some of these tests that you were talking about earlier
03:46:06.300 | where they go beyond like standard personality tests
03:46:08.740 | so that people can answer those critical questions
03:46:10.500 | about where they are perhaps best placed
03:46:13.700 | in the landscape between creativity
03:46:15.700 | and strategy implementation in a different way.
03:46:18.580 | So thanks so much.
03:46:20.100 | This was a real thrill for me.
03:46:21.620 | - Thank you.
03:46:22.760 | - Thank you for joining me for today's discussion
03:46:24.820 | with Dr. Michael Platt.
03:46:26.420 | To learn more about Dr. Platt's work
03:46:28.020 | and to find links to his books,
03:46:29.720 | please see the show note captions.
03:46:31.660 | If you're learning from and or enjoying this podcast,
03:46:34.040 | please subscribe to our YouTube channel.
03:46:36.020 | That's a terrific zero cost way to support us.
03:46:38.380 | In addition, please click follow for the podcast
03:46:40.860 | on both Spotify and Apple.
03:46:42.460 | And on both Spotify and Apple,
03:46:43.940 | you can leave us up to a five-star review.
03:46:46.260 | If you have questions for me or comments about the podcast
03:46:48.460 | or guests or topics you'd like me to consider
03:46:50.260 | for the Huberman Lab podcast,
03:46:51.900 | please put those in the comment section on YouTube.
03:46:54.340 | I do read all the comments.
03:46:56.220 | Please also check out the sponsors mentioned
03:46:57.980 | at the beginning and throughout today's episode.
03:47:00.180 | That's the best way to support this podcast.
03:47:02.580 | For those of you that haven't heard,
03:47:03.740 | I have a new book coming out.
03:47:04.940 | It's my very first book.
03:47:06.540 | It's entitled "Protocols, An Operating Manual
03:47:09.060 | for the Human Body."
03:47:10.080 | This is a book that I've been working on
03:47:11.260 | for more than five years,
03:47:12.420 | and that's based on more than 30 years
03:47:14.740 | of research and experience.
03:47:16.300 | And it covers protocols for everything from sleep,
03:47:19.360 | to exercise, to stress control,
03:47:21.860 | protocols related to focus and motivation.
03:47:24.300 | And of course, I provide the scientific substantiation
03:47:27.660 | for the protocols that are included.
03:47:29.740 | The book is now available by presale at protocolsbook.com.
03:47:33.660 | There you can find links to various vendors.
03:47:36.020 | You can pick the one that you like best.
03:47:37.780 | Again, the book is called "Protocols,
03:47:39.580 | An Operating Manual for the Human Body."
03:47:42.180 | If you're not already following me on social media,
03:47:44.240 | I am Huberman Lab on all social media platforms.
03:47:47.340 | So that's Instagram, X, formerly known as Twitter,
03:47:50.060 | Facebook, LinkedIn, and Threads.
03:47:52.300 | And on all those platforms,
03:47:53.480 | I discuss science and science-related tools,
03:47:55.540 | some of which overlaps with the content
03:47:57.100 | of the Huberman Lab podcast,
03:47:58.540 | but much of which is distinct from the content
03:48:00.580 | on the Huberman Lab podcast.
03:48:02.100 | Again, that's Huberman Lab on all social media platforms.
03:48:05.520 | And if you haven't already subscribed
03:48:06.820 | to our Neural Network newsletter,
03:48:08.420 | the Neural Network newsletter
03:48:09.700 | is a zero cost monthly newsletter
03:48:11.760 | that includes podcast summaries,
03:48:13.100 | as well as what we call protocols
03:48:14.920 | in the form of one to three page PDFs
03:48:17.200 | that cover everything from how to optimize your sleep,
03:48:19.500 | how to optimize dopamine, deliberate cold exposure.
03:48:22.300 | We have a foundational fitness protocol
03:48:24.020 | that covers cardiovascular training and resistance training.
03:48:27.100 | All of that is available completely zero cost.
03:48:29.580 | You simply go to hubermanlab.com,
03:48:31.300 | go to the menu tab in the top right corner,
03:48:33.200 | scroll down to newsletter and enter your email.
03:48:35.700 | And I should emphasize
03:48:36.540 | that we do not share your email with anybody.
03:48:39.100 | Thank you once again for joining me
03:48:40.420 | for today's discussion with Dr. Michael Platt.
03:48:43.180 | And last, but certainly not least,
03:48:45.260 | thank you for your interest in science.
03:48:47.260 | [upbeat music]