Dr. Jamil Zaki: How to Cultivate a Positive, Growth-Oriented Mindset
Chapters
0:00 Dr. Jamil Zaki
2:12 Sponsors: Maui Nui, Joovv & Waking Up
6:59 Cynicism
12:38 Children, Attachment Styles & Cynicism
17:29 Cynicism vs. Skepticism, Complexity
23:30 Culture Variability & Trust
26:28 Sponsor: AG1
27:40 Negative Health Outcomes; Cynicism: Perception & Intelligence
35:59 Stereotypes, Threats
39:48 Cooperative Environments, Collaboration & Trust
44:05 Competition, Conflict, Judgement
48:46 Cynics, Awe, “Moral Beauty”
55:26 Sponsor: Function
57:13 Cynicism, Creativity & Workplace
64:19 Assessing Cynicism; Assumptions & Opportunities
71:11 Social Media & Cynicism, “Mean World Syndrome”
78:35 Negativity Bias, Gossip
84:03 Social Media & Cynicism, Polarization, “Hopeful Skepticism”
92:59 AI, Bias Correction
99:05 Tools: Mindset Skepticism; Reciprocity Mindset; Social Savoring
106:05 Tools: Leaps of Faith; Forecasting; Encounter Counting
111:33 Tool: Testing & Sharing Core Beliefs
118:09 Polarization vs. Perceived Polarization, Politics
126:06 Challenging Conversations, Questioning Perceptions
134:04 Zero-Cost Support, YouTube, Spotify & Apple Follow & Reviews, Sponsors, YouTube Feedback, Protocols Book, Social Media, Neural Network Newsletter
and I'm a professor of neurobiology and ophthalmology 00:00:23.440 |
of the Social Neuroscience Laboratory at Stanford. 00:00:26.360 |
His laboratory focuses on key aspects of the human experience 00:00:31.700 |
which lie at the heart of our ability to learn 00:00:38.400 |
Today, you'll learn the optimal mindsets to adopt 00:00:40.640 |
when trying to understand how to learn conflict resolution 00:00:44.520 |
and how to navigate relationships of all kinds 00:00:48.780 |
including personal relationships and in the workplace. 00:00:54.520 |
is that he's able to take laboratory research 00:01:00.480 |
for things like how to set personal boundaries, 00:01:05.360 |
and sometimes even uncomfortable environments, 00:01:13.480 |
and how to collaborate with others in more effective ways. 00:01:16.880 |
I wanna be very clear that today's discussion, 00:01:18.940 |
while focused on cynicism, trust, and empathy, 00:01:40.220 |
This is very important because there's a lot of confusion 00:01:45.040 |
But I can assure you that by the end of today's discussion, 00:01:47.480 |
you will have new frameworks and indeed new tools, 00:01:52.800 |
to better navigate situations and relationships of all kinds 00:02:00.600 |
has authored a terrific new book entitled "Hope for Cynics, 00:02:06.720 |
And I've read this book and it is spectacular. 00:02:09.700 |
There's a link to the book in the show note captions. 00:02:12.240 |
Before we begin, I'd like to emphasize that this podcast 00:02:15.120 |
is separate from my teaching and research roles at Stanford. 00:02:26.640 |
I'd like to thank the sponsors of today's podcast. 00:02:37.580 |
about the fact that most of us should be seeking 00:02:44.240 |
That protein provides critical building blocks 00:02:57.740 |
is to make sure that you're getting enough quality protein 00:03:02.320 |
Maui Nui venison has an extremely high quality 00:03:14.020 |
Also, Maui Nui venison is absolutely delicious. 00:03:23.060 |
In fact, I probably eat a Maui Nui venison burger 00:03:26.720 |
and occasionally I'll swap that for a Maui Nui steak. 00:03:29.520 |
And if you're traveling a lot or simply on the go, 00:03:31.880 |
they have a very convenient Maui Nui venison jerky, 00:03:34.640 |
which has 10 grams of quality protein per stick 00:03:39.020 |
While Maui Nui offers the highest quality meat available, 00:03:47.440 |
means that they will not go beyond harvest capacity. 00:03:50.240 |
Signing up for a membership is therefore the best way 00:04:00.200 |
to get 20% off your membership or first order. 00:04:06.960 |
Today's episode is also brought to us by Joovv. 00:04:09.680 |
Joovv makes medical grade red light therapy devices. 00:04:13.040 |
Now, if there's one thing I've consistently emphasized 00:04:24.520 |
on improving numerous aspects of cellular and organ health, 00:04:31.080 |
even improvements in acne, reducing pain and inflammation, 00:04:39.280 |
and why they're my preferred red light therapy devices, 00:04:42.040 |
is that they use clinically proven wavelengths, 00:04:44.180 |
meaning it uses specific wavelengths of red light 00:04:56.120 |
so it's super portable and convenient to use. 00:05:00.080 |
and I use that about three or four times per week. 00:05:04.000 |
you can go to Joovv, spelled J-O-O-V-V.com/huberman. 00:05:22.000 |
Today's episode is also brought to us by Waking Up. 00:05:26.720 |
that offers hundreds of guided meditation programs, 00:05:29.180 |
mindfulness trainings, yoga nidra sessions, and more. 00:05:41.440 |
that emphasize how useful mindfulness meditation can be 00:05:44.680 |
for improving our focus, managing stress and anxiety, 00:06:08.120 |
is that it has a lot of different meditations 00:06:10.920 |
and those meditations are of different durations. 00:06:25.360 |
and you can always fit meditation into your schedule, 00:06:28.080 |
even if you only have two or three minutes per day 00:06:33.640 |
or what is sometimes called non-sleep deep rest, 00:06:42.000 |
that some people experience when they wake up 00:06:56.280 |
And now for my discussion with Dr. Jamil Zaki. 00:07:18.760 |
but of these fields in the science that you've done, 00:07:21.440 |
is that people default to some complicated states 00:07:26.280 |
and emotions sometimes that in some ways serve them well, 00:07:34.080 |
of the individual and interactions between pairs 00:07:49.160 |
What does it serve in terms of its role in the human mind? 00:07:53.640 |
- The way that psychologists think of cynicism these days 00:08:01.440 |
It's the idea that generally people at their core 00:08:11.400 |
will deny that somebody could act kindly, for instance, 00:08:14.880 |
could donate to charity, could help a stranger, 00:08:22.720 |
is a thin veneer covering up who we really are, 00:08:30.880 |
you know, there are these ancient philosophical questions 00:08:45.140 |
- I believe in your book, you quote Kurt Vonnegut, 00:08:53.400 |
"so we need to be careful who we pretend to be." 00:09:07.740 |
because it expresses the idea of self-fulfilling prophecies. 00:09:12.420 |
You know, there's this subjective sense that people have 00:09:28.460 |
We each construct our own version of the world. 00:09:32.340 |
And so, for instance, if you think about cynicism, right? 00:09:45.380 |
But it turns out that the way you answer that 00:09:52.040 |
the life that you live, the decisions that you make. 00:09:59.760 |
but it's about who they pretend everybody else is, right? 00:10:03.580 |
If you decide that other people are selfish, for instance, 00:10:12.620 |
when they're put in situations with new people, 00:10:34.800 |
what most of us need from social connections. 00:10:38.120 |
There's one really classic and very sad study 00:10:41.600 |
where people were forced to give an extemporaneous speech 00:10:54.760 |
not an actual cheerleader, but a friendly stranger 00:10:59.880 |
saying, "You've got this, I know you can do it. 00:11:06.180 |
As you know, one of the great things about social support 00:11:14.080 |
when they had this friendly person by their side, 00:11:17.000 |
their blood pressure, as they prepared for the speech, 00:11:19.520 |
went up only half as much as when they were alone. 00:11:23.240 |
But cynical people had a spike in their blood pressure 00:11:29.720 |
whether or not a person was by their side or not. 00:11:35.440 |
social connection is a deep and necessary form 00:11:43.640 |
making the decision that most people can't be trusted, 00:11:47.400 |
stops you from being able to metabolize those calories, 00:11:59.160 |
to pretend that others are selfish, greedy, and dishonest 00:12:18.480 |
But of course, other people can tell how we're treating them 00:12:35.840 |
and then ending up stuck living in that story. 00:12:41.840 |
I'm thinking about "Sesame Street" characters, 00:12:44.460 |
which to me embody different neural circuits. 00:13:13.520 |
And then in, you know, essentially every fairy tale 00:13:27.620 |
the celebration that one would otherwise have. 00:13:31.340 |
But even though kids are learning about cynicism 00:13:40.380 |
I often think about those phenotypes in older folks 00:13:53.560 |
how early can you observe classically defined cynicism? 00:14:05.400 |
because you typically measure it through self-report. 00:14:07.800 |
So people have to have relatively well-developed, 00:14:16.060 |
That said, one early experience and one early phenotype 00:14:20.640 |
that's very strongly correlated with generalized mistrust 00:14:30.980 |
So for instance, you might know, but just for listeners, 00:14:43.220 |
as the strange situation where a one-year-old 00:15:03.580 |
Couple minutes after that, their mother leaves the room 00:15:16.560 |
and what researchers look at is a few things. 00:15:19.420 |
One, how comfortable is the child exploring a space 00:15:24.820 |
Two, how comfortable are they when other people are around? 00:15:28.140 |
Three, how do they react when their caregiver leaves? 00:15:35.700 |
And the majority of kids, approximately 2/3 of them, 00:15:38.580 |
are securely attached, meaning that they are comfortable 00:15:42.380 |
exploring a new space, they get really freaked out, 00:15:45.820 |
of course, as you might when their caregiver leaves, 00:15:48.420 |
but then they soothe quickly when their caregiver returns. 00:15:51.880 |
The remaining 1/3 or so of kids are insecurely attached, 00:15:56.780 |
meaning that they're skittish in new environments 00:15:59.180 |
even when their parent or caregiver is there. 00:16:02.140 |
They really freak out when their caregiver leaves 00:16:04.940 |
and they're not very soothed upon their return. 00:16:08.500 |
Now, for a long time, attachment style was viewed 00:16:13.980 |
it is an emotional reaction first and foremost, 00:16:17.000 |
but researchers more recently have started to think about, 00:16:22.780 |
What are the underpinnings, the ways that children think 00:16:26.380 |
when they are securely or insecurely attached? 00:16:31.880 |
Looking time in kids is a metric of what surprises them. 00:16:40.540 |
And researchers found that insecurely attached kids, 00:16:55.100 |
they looked longer as though that was surprising. 00:16:58.300 |
Kids who were securely attached didn't look very long 00:17:03.820 |
but looked longer at interactions that were unstable. 00:17:16.580 |
And insecure attachment is a signal coming early in life, 00:17:33.480 |
I can think of some places where they might overlap, 00:17:49.020 |
- That's a very sharp way of thinking about it, actually. 00:18:05.500 |
And I would argue that one is much more useful 00:18:12.500 |
Again, cynicism is a theory that's kind of locked in, 00:18:27.420 |
that ultimately people are red in tooth and claw. 00:18:30.380 |
Skepticism is instead the, I guess, restlessness 00:18:36.380 |
with our assumptions, a desire for new information. 00:18:42.040 |
One way I often think about it is that cynics 00:18:46.960 |
They have a decision that they've already made about you 00:18:54.620 |
And when evidence comes in that doesn't support their point, 00:19:00.140 |
that cynical people will offer more ulterior motives 00:19:03.920 |
when they see an act of kindness, for instance. 00:19:08.380 |
In that way, I think cynics actually are quite similar 00:19:40.980 |
is the belief that you can never truly know anything. 00:19:46.520 |
it's more the desire for evidence to underlie any claim 00:20:04.500 |
but it allows you to update and learn from specific acts, 00:20:43.340 |
You'll change that in the course of our discussion. 00:20:56.480 |
certainly any notion of excitement by complexity. 00:21:11.020 |
To what extent can you hold different versions of the world, 00:21:19.400 |
what you believe based on the best evidence available? 00:21:25.140 |
to learn about the world and about the social world, 00:21:39.140 |
then puts us in a position where we can't learn very much. 00:22:03.860 |
and that new information allows you to update your priors 00:22:07.300 |
into a posterior distribution, into a new set of beliefs. 00:22:14.780 |
to adapt to new information and new circumstances. 00:22:18.440 |
A wicked learning environment is where your priors 00:22:24.020 |
that you would need to confirm or disconfirm them. 00:22:26.920 |
So think about mistrust, for instance, right? 00:22:49.460 |
we never are able to learn whether the people 00:22:52.100 |
who we are mistrusting would have been trustworthy or not. 00:23:03.180 |
Or more often than not, the data turn out to show us 00:23:18.700 |
that trusting people incorrectly, you do learn from, 00:23:22.460 |
but mistrusting people incorrectly, you don't learn from, 00:23:26.140 |
because the missed opportunities are invisible to us. 00:23:34.740 |
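A minimal simulation sketch of this asymmetry (the trust threshold, the update rule, and the assumed 80% rate of trustworthy partners are illustrative assumptions, not figures from the conversation): an agent only receives feedback about the people it chooses to trust, so a cynical starting belief never gets the data that would correct it.

```python
import random

def simulate(belief, n_partners=1000, true_trustworthy_rate=0.8, seed=0):
    """Agent meets partners one at a time. It only observes an outcome
    on trials where it chose to trust; withheld trust yields no data."""
    rng = random.Random(seed)
    trusted = withheld = 0
    for _ in range(n_partners):
        partner_trustworthy = rng.random() < true_trustworthy_rate
        if belief >= 0.5:                        # trust only if belief clears a threshold
            trusted += 1
            outcome = 1.0 if partner_trustworthy else 0.0
            belief += 0.05 * (outcome - belief)  # simple incremental update toward outcomes
        else:
            withheld += 1                        # no interaction -> no feedback -> belief frozen
    return belief, trusted, withheld

for start in (0.6, 0.4):  # mildly trusting vs. mildly cynical prior
    final, trusted, withheld = simulate(start)
    print(f"start {start:.2f} -> final {final:.2f} (trusted {trusted}, withheld {withheld})")
```

Run as written, the mildly trusting agent drifts toward the true trustworthiness rate, while the mildly cynical agent never trusts and so never updates, which is the "wicked learning environment" in miniature.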
So you pointed out that some degree of cynicism 00:23:45.620 |
do we find cultures where it's very hard to find cynics, 00:23:50.500 |
and there could be any number of reasons for this, 00:23:56.420 |
do we find cultures where there really isn't even a word 00:24:14.180 |
a lot of research on this is done in an American context. 00:24:17.280 |
But that said, there's a lot of data on generalized trust, 00:24:21.920 |
which you could say is an inverse of cynicism, right? 00:24:24.280 |
So for instance, there are national and international 00:24:35.220 |
And there's a lot of variance around the world. 00:24:38.260 |
In general, the cultures that are most trusting 00:24:48.600 |
So there's a lot of great work from Kate Pickett 00:24:57.740 |
where they look at inequality across the world 00:25:07.780 |
So you can look at not just are there places or cultures 00:25:16.060 |
And in the US, that's sadly a story of decline. 00:25:25.340 |
And by 2018, that had fallen to about a third of Americans. 00:25:29.460 |
And that's a drop as big, just to put it in perspective, 00:25:33.200 |
as the stock market took in the financial collapse of 2008. 00:25:44.900 |
but one of the seeming characteristics of cultures 00:25:54.760 |
that when you are in a highly unequal society, 00:25:58.460 |
economically, there's a sense of zero-sum competition 00:26:07.100 |
And if you have that inherent sense of zero-sum competition, 00:26:17.520 |
because you might think, well, in order to survive, 00:26:24.260 |
They have to try to make me fail for themselves to succeed. 00:26:37.540 |
The reason for that is AG1 is the highest quality 00:26:40.980 |
of the foundational nutritional supplements available. 00:26:46.540 |
but also probiotics, prebiotics, and adaptogens 00:27:06.500 |
For that reason, I've been taking AG1 daily since 2012 00:27:09.940 |
and often twice a day, once in the morning or mid-morning, 00:27:14.620 |
When I do that, it clearly bolsters my energy, 00:27:30.200 |
Right now, they're giving away five free travel packs 00:27:42.560 |
between cynicism and happiness or lack of happiness? 00:27:46.760 |
When I think of somebody who's really cynical, 00:27:54.380 |
I'm thinking specifically about what you said earlier 00:27:58.960 |
from certain forms of learning that are important 00:28:06.180 |
my dad, who went to classic boarding schools, 00:28:11.500 |
but he went to these boarding schools that were very strict. 00:28:22.220 |
people would accuse you of being kind of dumb. 00:28:27.860 |
and you acted a little bored with everything, 00:28:30.180 |
people thought that you were more discerning, 00:28:34.720 |
for going through life because it veered into cynicism. 00:28:52.540 |
but he experiences joy and pleasure in daily, 00:28:57.380 |
So clearly he rescued himself from the forces 00:29:00.620 |
that were kind of pushing him down that path. 00:29:03.660 |
But I use that question more as a way to frame up 00:29:15.220 |
in shifting somebody towards a happier affect. 00:29:23.820 |
And when I think about people who are not very cynical, 00:29:25.460 |
I think of them as kind of cheerful and curious. 00:29:29.920 |
They might not be Tigger-like in their affect, 00:29:45.340 |
I want to try to pull on a couple of threads here, 00:29:48.600 |
First, and this one is pretty straightforward, 00:29:53.860 |
is just really documented and quite negative. 00:30:05.460 |
that measure cynicism and then measure life outcomes 00:30:11.140 |
And the news is pretty bleak for cynics, right? 00:30:20.580 |
greater incidence of depression, greater loneliness. 00:30:24.500 |
But it's not just the neck up that cynicism affects. 00:30:31.300 |
also tend to have greater degrees of cellular inflammation, 00:30:37.720 |
and they even have higher rates of all-cause mortality, 00:30:43.660 |
And again, this might sound like, wait a minute, 00:30:53.940 |
so I don't wanna draw too many causal claims, 00:31:15.020 |
to really touch in to that type of connection, 00:31:18.580 |
it stands to reason that things like chronic stress 00:31:21.700 |
and isolation would impact not just your mind, 00:31:25.280 |
but all through your body and your organ systems. 00:31:31.420 |
and I often think about one of the best encapsulations 00:31:34.820 |
of a cynical view of life comes from Thomas Hobbes, 00:31:39.060 |
the philosopher, who, in his book "Leviathan," 00:31:52.700 |
the lives of cynics themselves more than most people. 00:31:59.780 |
is that there is this pretty stark negative correlation 00:32:27.820 |
It would have a skull and crossbones on the bottle. 00:32:47.340 |
if you trust people, that kind of seems dull. 00:32:54.580 |
And there is that strong relationship in our stereotypes, 00:33:10.260 |
And how able do they seem to accomplish hard things? 00:33:16.260 |
people's perception is that these are inversely correlated. 00:33:19.160 |
That if you're warm, maybe you're not that competent. 00:33:22.120 |
And if you're competent, maybe you shouldn't be that warm. 00:33:29.680 |
they'll often respond by being a little bit less nice, 00:33:32.980 |
a little bit less warm than they would be otherwise. 00:33:38.820 |
you know, where people are presented in surveys 00:33:45.900 |
they really think that people are great overall, 00:33:58.060 |
who should we pick for this difficult intellectual task? 00:34:06.580 |
over a non-cynic for difficult intellectual tasks. 00:34:09.780 |
85% of people think that cynics are socially wiser, 00:34:14.280 |
that they'd be able, for instance, to detect who's lying 00:34:22.280 |
who don't have a lot of faith in people, ironically. 00:34:25.580 |
And even more ironically, we're wrong to do so. 00:34:28.860 |
Olga Stavrova, this great psychologist who studies cynicism, 00:34:32.340 |
has this paper called "The Cynical Genius Illusion," 00:34:38.760 |
the way that we think cynics are bright and wise, 00:34:42.220 |
and then uses national data, tens of thousands of people, 00:35:00.580 |
this is not from Olga Stavrova, but from others, 00:35:03.380 |
that actually, cynics do less well than non-cynics 00:35:08.000 |
Because if you have a blanket assumption about people, 00:35:11.840 |
you're not actually attending to evidence in a sharp way. 00:35:15.380 |
You're not actually taking in new information 00:35:20.060 |
- In other words, cynics are not being scientific. 00:35:27.300 |
but they're not looking at the data equally, right? 00:35:33.700 |
Every great experiment starts with a question, 00:35:45.660 |
and you see if you prove or disprove the hypothesis. 00:36:00.200 |
because I would think that if we view cynics as smarter, 00:36:08.380 |
You're saying cynics are not more intelligent, right? 00:36:16.560 |
then why do we send cynics in kind of like razors 00:36:25.400 |
Is that because we'd rather have others deployed 00:36:32.520 |
Is it that we're willing to accept some false negatives? 00:36:52.680 |
that filter of cynicism is only going to allow in people 00:36:59.640 |
you know, there are probably two or three candidates 00:37:06.560 |
as opposed to having someone get through the filter 00:37:12.780 |
Like we're willing to let certain opportunities go 00:37:15.100 |
by being cynical or by deploying a cynic as the, 00:37:18.420 |
you know, I'm imagining the person with the clipboard, 00:37:32.180 |
you said that if we know that cynics aren't smarter 00:37:38.660 |
meaning you and I know this and scientists know this, 00:37:41.660 |
but the data show that most people don't know this, 00:37:44.580 |
that we maintain the stereotype in our culture 00:37:50.300 |
means that you've been around the block enough times, 00:38:00.580 |
when we deploy cynics out in the field, you know, 00:38:05.020 |
but I want somebody who's really pretty negative, 00:38:12.380 |
I think that's a really, again, understandable instinct, 00:38:18.460 |
You know, we are built to pay lots of attention 00:38:21.780 |
to threats in our environment and threats to our community. 00:38:28.660 |
just to do some back of the envelope evolutionary psychology, 00:38:32.100 |
if you wind the clock back 100, 150,000 years, 00:38:37.780 |
what is the greatest threat to early communities? 00:38:51.000 |
But that collaboration means that a free rider, 00:38:59.540 |
anything that they want, can do exceptionally well. 00:39:07.100 |
And if you select then for that type of person, 00:39:18.180 |
from that perspective, from a threat mitigation perspective, 00:39:34.360 |
meaning that we're more scared of negative outcomes 00:39:53.000 |
meaning if somebody is cynical in one environment, 00:40:03.040 |
so things can go this way or that way, depending on, 00:40:18.180 |
our levels of cynicism tend to be pretty stable over time. 00:40:25.640 |
contra the stereotype of the curmudgeonly older person. 00:40:36.840 |
and this makes sense if you look at questionnaires 00:40:41.800 |
people are honest chiefly through fear of getting caught, 00:40:45.380 |
or most people really don't like helping each other. 00:40:48.360 |
I mean, if you're answering those questions positively, 00:40:57.940 |
this is an old scale developed by a couple of psychologists 00:41:01.340 |
named Walter Cook and Donald Medley in the 1950s. 00:41:04.540 |
If you answer the Cook-Medley hostility scale, 00:41:14.560 |
have less trust in your romantic partnerships, 00:41:21.940 |
So this is sort of an all-purpose view of the world, 00:41:25.380 |
at least as Cook and Medley first thought about it. 00:41:30.180 |
But I do wanna build on a great intuition you have, 00:41:55.760 |
There are two fishing villages in Southeastern Brazil. 00:42:00.900 |
They're similar in socioeconomic status, religion, culture, 00:42:19.540 |
where fishermen strike out on small boats alone, 00:42:38.460 |
These were not with fellow fishermen, but with strangers. 00:42:41.500 |
Games like, would you trust somebody with some money 00:42:44.420 |
and see if they then want to share dividends with you? 00:42:49.420 |
would you like to share some of it with another person? 00:42:52.860 |
And they found that when they start in their careers, 00:42:56.260 |
lake fishermen and ocean fishermen were equally trusting 00:43:03.380 |
But over the course of their careers, they diverged. 00:43:08.820 |
where people must count on one another to survive 00:43:12.340 |
made people over time more trusting and more trustworthy. 00:43:17.020 |
Being in a competitive zero-sum environment over time 00:43:20.660 |
made people less trusting and less trustworthy. 00:43:23.900 |
Now, one thing that always amazes me about this work 00:43:26.220 |
is that people in both of these environments are right. 00:43:32.700 |
you don't trust and you're right to not trust. 00:43:40.220 |
And this is from the point of view of economic games, 00:43:47.140 |
well, which of these environments do we want to be in? 00:43:49.820 |
I think the cost in terms of wellbeing and relationships 00:43:53.000 |
is quite obvious if you're in a competitive environment. 00:43:57.500 |
how do we put ourselves in the type of environment 00:44:00.540 |
that we want knowing that that environment will change 00:44:11.500 |
like we're all gonna sit around and listen to a story 00:44:18.040 |
it evolves into more independent learning and competition. 00:44:23.200 |
That's largely the distribution of individual scores. 00:44:28.300 |
Like I think I've never been to business school, 00:44:30.100 |
but I think they form small groups and work on projects. 00:44:32.300 |
It's true in computer science at the undergraduate level 00:44:36.180 |
But to what extent do you think having a mixture 00:44:40.220 |
of cooperative learning, still competition perhaps 00:44:44.500 |
between groups, as well as individual learning 00:44:47.940 |
and competition can foster kind of an erosion of cynicism? 00:44:57.820 |
but they're probably already hard on themselves 00:45:14.300 |
And that there's something really big to be gained 00:45:18.420 |
from anybody who decides to embrace novel ideas, 00:45:21.980 |
even if they decide to stick with their original decision 00:45:27.220 |
Provided they explore the data in an open-minded way, 00:45:33.660 |
You gave a long-term example of these two fishing scenarios. 00:45:40.980 |
but we know neuroplasticity can be pretty quick. 00:45:48.260 |
that it's not gonna erode all of their cynicism, 00:45:59.180 |
One, I am not here to judge or impugn cynics. 00:46:03.300 |
I should confess that I myself struggle with cynicism 00:46:15.940 |
and to see if it is possible to unlearn cynicism 00:46:28.220 |
I think that another point that you're bringing out 00:46:41.060 |
isn't the same as saying that we should never compete. 00:46:50.420 |
when they are at odds trying to best one another. 00:46:57.300 |
when we focus on the great things that we can do. 00:47:03.340 |
by people we respect who are trying to be greater than us. 00:47:09.860 |
of a very healthy social structure and a very healthy life. 00:47:18.820 |
at the level of a task or at the level of the person. 00:47:23.820 |
In fact, there's a lot of work in the science of conflict 00:47:28.040 |
and conflict resolution that looks at the difference 00:47:38.340 |
for what direction they wanna take a project in. 00:47:41.780 |
Well, that's great if it leads to healthy debate 00:48:01.140 |
That's when we go from healthy skeptical conflict 00:48:14.540 |
and some of the people that they respect the most 00:48:23.540 |
literally battling, but they can have immense 00:48:33.480 |
between competition that's oriented on tasks, 00:48:37.020 |
which can help us be the best version of ourselves 00:48:51.020 |
Maybe I'm bridging two things that don't belong together, 00:49:01.160 |
But the young brain learns a number of things 00:49:05.960 |
It handles heart rate, digestion, et cetera, unconsciously. 00:49:28.040 |
So I'm thinking about the sort of classic example 00:49:38.940 |
And they, at a certain age, a very young age, 00:49:43.220 |
And then you bring it back and then they're amazed. 00:49:45.340 |
And then at some point along their developmental trajectory, 00:49:51.900 |
And then we hear that characters like Santa Claus are real, 00:49:56.040 |
and then eventually we learn that they're not, 00:49:59.700 |
In many ways, we go from being completely non-cynical 00:50:19.700 |
or among the very best magicians on this podcast, 00:50:31.180 |
they seem to defy the laws of physics in real time. 00:50:35.180 |
And it just blows your mind to the point where you like, 00:50:38.940 |
that can't be, but you sort of want it to be. 00:50:41.220 |
And at some point you just go, you know what? 00:50:45.020 |
So it seems to me that cynics apply almost physics 00:50:52.160 |
Like that they talk in terms of like first principles 00:51:07.860 |
as opposed to any kind of blending of understanding 00:51:13.780 |
And one can see how that would be a really useful heuristic, 00:51:16.180 |
but as we're learning, it's not good in the sense 00:51:20.360 |
but it's not good if our goal is to learn more 00:51:22.900 |
about the world or learn the most information 00:51:29.020 |
I also try to avoid good, bad language or moral judgment, 00:51:45.900 |
that cynicism can block your way towards them. 00:51:53.540 |
And there is almost a philosophical certainty. 00:51:58.360 |
Maybe it's not a happy philosophical certainty, 00:52:11.740 |
And the laws of physics are some of our most reliable, 00:52:15.700 |
And really we all use theories to predict the world, right? 00:52:22.140 |
We don't think objects with mass attract one another, 00:52:24.880 |
but we know if we drop a bowling ball on our foot, 00:52:30.580 |
So we use theories to provide explanatory simplicity 00:52:40.300 |
And absolutely, I think cynicism has a great function 00:52:45.260 |
in simplifying, but of course in simplifying, 00:52:56.900 |
And I do wanna, your beautiful description of kids 00:53:01.820 |
and their sort of sense of, I suppose, perennial surprise 00:53:12.120 |
which is the ability to witness the beauty of human action 00:53:42.500 |
that most commonly produce awe in a large sample, 00:53:52.500 |
my first go-to is Carl Sagan's "Pale Blue Dot," 00:53:59.900 |
or sort of cluster basically, stardust really. 00:54:09.580 |
"and every king and tyrant and mother and father, 00:54:16.820 |
"and every person who's ever had their heart broken, 00:54:20.300 |
I go to that, I show that to my kids all the time. 00:54:27.860 |
I think of drone footage of the Himalayas, right? 00:54:36.340 |
the number one category is what he calls moral beauty, 00:54:48.700 |
This is also related to what Dacher and John Haidt 00:54:53.700 |
witnessing positive actions that actually make us feel 00:55:03.140 |
If you are open to it, it is the most common thing 00:55:05.500 |
that will make you feel the vastness of our species. 00:55:09.820 |
And to have a lawful, physics-like prediction 00:55:14.320 |
about the world that blinkers you from seeing that, 00:55:17.740 |
that gives you tunnel vision and prevents you 00:55:31.900 |
after searching for the most comprehensive approach 00:55:37.100 |
I really wanted to find a more in-depth program 00:55:43.580 |
my hormone status, my immune system regulation, 00:55:46.380 |
my metabolic function, my vitamin and mineral status, 00:55:49.640 |
and other critical areas of my overall health and vitality. 00:55:53.060 |
Function not only provides testing of over 100 biomarkers 00:55:59.780 |
and provides insights from top doctors on your results. 00:56:03.340 |
For example, in one of my first tests with Function, 00:56:06.180 |
I learned that I had too high levels of mercury in my blood. 00:56:24.540 |
while also making an effort to eat more leafy greens 00:56:26.880 |
and supplementing with NAC, N-acetylcysteine, 00:56:29.780 |
both of which can support glutathione production 00:56:31.860 |
and detoxification and worked to reduce my mercury levels. 00:56:40.380 |
I've always found it to be overly complicated and expensive. 00:56:48.220 |
as well as how comprehensive and how actionable 00:56:51.100 |
the tests are, that I recently joined their advisory board, 00:56:54.520 |
and I'm thrilled that they're sponsoring the podcast. 00:57:01.380 |
Function currently has a wait list of over 250,000 people, 00:57:04.980 |
but they're offering early access to Huberman Lab listeners. 00:57:13.680 |
- I love that your examples of both pale blue dot 00:57:28.780 |
You know, this has long fascinated me about the human brain 00:57:31.720 |
and presumably other animals' brains as well, 00:57:34.240 |
which is that, you know, we can sharpen our aperture 00:57:41.860 |
and pay attention to just like the immense beauty. 00:57:53.360 |
And I thought, I'm just gonna like watch what they do. 00:58:02.540 |
but then you look up from there and you're like, 00:58:08.020 |
interactions like that must be going on everywhere 00:58:13.420 |
And, you know, it frames up that the aperture 00:58:16.380 |
of our cognition in space and in time, you know, 00:58:19.020 |
covering small distances quickly or small distances slowly. 00:58:24.580 |
and think about us on this ball in space, right? 00:58:29.580 |
You know, and that ability I think is incredible. 00:58:33.460 |
And that awe can be captured at these different extremes 00:58:42.280 |
is that cynicism and awe are also opposite ends 00:58:45.340 |
And that's taking us in a direction slightly different 00:58:52.300 |
because to me it feels like it's a more extreme example 00:59:00.500 |
if there's any examples of research on this, you know, 00:59:04.620 |
touch on to what extent a sense of cynicism divorces us 00:59:08.460 |
from delight and awe, or I guess their collaborator, 00:59:16.140 |
To me, everything you're saying about cynicism 00:59:22.020 |
by definition, you're eliminating possibility. 00:59:24.580 |
And creativity, of course, is the unique original combination 00:59:28.660 |
of existing things or the creation of new things 00:59:48.140 |
and a lot of it comes actually in the context 01:00:03.960 |
that make people more or less able to trust one another. 01:00:07.140 |
One version of this is what's known as stack ranking. 01:00:12.780 |
managers are forced to pick the highest-performing 01:00:21.380 |
who are at the bottom 10% every six or 12 months. 01:00:26.900 |
mostly fallen out of favor in the corporate world, 01:00:41.100 |
And the idea, again, was if you want people to be creative, 01:00:57.100 |
Really, stack ranking is a social Darwinist approach 01:01:02.100 |
And the idea is, well, great, if you threaten people, 01:01:15.620 |
I mean, stack-ranked workplaces, of course, are miserable. 01:01:26.700 |
pertains to what stack ranking does to creativity. 01:01:36.600 |
then the last thing you want to do is take a creative risk. 01:01:46.580 |
If other people are going to go after you for doing that, 01:01:50.640 |
and if you screw up, or if it doesn't go well, 01:02:02.880 |
I, of course, don't mean politically conservative. 01:02:04.760 |
I mean conservative in terms of the types of choices 01:02:15.620 |
of what we might call group creativity, right? 01:02:19.120 |
A lot of our best ideas come not from our minds, 01:02:33.320 |
people are less willing to share knowledge and perspective 01:02:36.840 |
because doing so amounts to helping your enemy succeed, 01:03:00.520 |
that looks at group intelligence, collective intelligence. 01:03:10.120 |
and have various forms of intelligence as well. 01:03:17.340 |
and especially creative problem-solving intelligence, 01:03:20.120 |
that goes above and beyond the sum of their parts, 01:03:24.800 |
and actually in some cases is almost orthogonal 01:03:27.980 |
to the intelligence of the individuals in that group, right? 01:03:31.340 |
Controlling for the intelligence of individuals, 01:03:36.400 |
And so Anita Woolley and others have looked at, 01:03:38.640 |
well, what predicts that type of collective intelligence? 01:03:53.480 |
But another is their willingness to, in essence, 01:03:55.800 |
pass the mic, to share the conversation, and to collaborate. 01:04:06.180 |
both at the individual and at the group level, 01:04:14.240 |
and where we feel that contributing to somebody else 01:04:22.880 |
all of this in the context of neuroplasticity. 01:04:25.320 |
I feel like one of the holy grails of neuroscience 01:04:29.880 |
you know, what are the gates to neuroplasticity? 01:04:31.840 |
We understand a lot about the cellular mechanisms. 01:04:33.760 |
We know it's possible throughout the lifespan. 01:04:44.760 |
and emotional stance, not technical, not a technical term, 01:04:53.000 |
Like to me, curiosity is an interest in the outcome 01:04:56.760 |
with no specific emotional attachment to the outcome. 01:05:02.280 |
with the hope of getting a certain result, you know, 01:05:05.320 |
But there is something about that childlike mind, 01:05:12.920 |
And it seems like the examples that you're giving 01:05:15.160 |
keep bringing me back to these developmental themes 01:05:21.400 |
exclude a lot of data that could be useful to them, 01:05:24.220 |
it seems that the opportunities for neuroplasticity 01:05:32.680 |
to what extent are we all a little bit cynical? 01:05:52.680 |
is give you that classic questionnaire from Cook and Medley, 01:05:55.920 |
which would just ask you about your theories of the world. 01:06:00.360 |
Do you think that people are generally honest? 01:06:02.860 |
Do you think that they are generally trustworthy? 01:06:05.160 |
- So it loads the questions or it's open-ended 01:06:07.440 |
where I would, would you say, what are people like? 01:06:09.720 |
And then I would just kind of free associate about that? 01:06:16.040 |
do you agree or disagree with each of these statements? 01:06:20.760 |
have adapted Cook-Medley and made it a shorter scale 01:06:29.440 |
But generally speaking, these are discrete questions 01:06:43.880 |
You know, the trust game being the number one 01:07:17.960 |
So they can be exactly fair and give 15 back, 01:07:21.720 |
in which case both people end up pretty much better off 01:07:24.800 |
than they would have without an active trust. 01:07:35.460 |
They can say, well, I started out with nothing, 01:07:39.260 |
And this is one terrific behavioral measure of trust 01:07:42.980 |
and it can be played in a couple of different ways. 01:07:47.700 |
Andrew, you can send $10 to an internet stranger 01:08:10.700 |
In that type of study, what percentage of trustees 01:08:33.780 |
There's a great study by Fetchenhauer and Dunning 01:08:39.500 |
that found that people, when they're asked to forecast, 01:08:43.160 |
they say, I bet 52, 55% of people will send this money back, 01:08:52.660 |
make the pro-social and trustworthy decision. 01:08:55.840 |
And again, what Fetchenhauer and Dunning found 01:09:04.740 |
and therefore less likely to learn that we were wrong. 01:09:13.780 |
you're interesting because you had the belief 01:09:15.460 |
that it's a 50% chance, but you still chose to trust. 01:09:21.020 |
when that person actually sent the money back, 01:09:23.300 |
which they would have an 80% chance of doing, 01:09:33.700 |
But without any evidence, you can't update your perception. 01:09:41.600 |
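A small sketch of the payoff arithmetic in the trust game as described here. The tripling multiplier is an assumption (chosen to be consistent with sending $10 and both parties coming out ahead when $15 is returned), and the roughly 55% forecast versus 80% observed return rates are the approximate figures mentioned in the conversation.

```python
def trust_game(sent, returned, endowment=10, multiplier=3):
    """Payoffs in a standard trust game; the 3x multiplier is assumed,
    picked to match 'send $10, get $15 back, both better off'."""
    pot = sent * multiplier              # the transferred amount grows in transit
    sender = endowment - sent + returned
    trustee = pot - returned
    return sender, trustee

print(trust_game(sent=10, returned=15))  # (15, 15): both better off than (10, 0)
print(trust_game(sent=10, returned=0))   # (0, 30): trust exploited
print(trust_game(sent=0, returned=0))    # (10, 0): no trust, no gain

# Forecast vs. reality gap (approximate figures from the conversation):
forecast, observed = 0.55, 0.80
print(f"forecast return rate ~{forecast:.0%}, observed ~{observed:.0%}")
```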
that when asked to estimate how friendly, trustworthy, 01:09:49.100 |
people's estimates come in much lower than data suggests. 01:09:52.620 |
And this to me is both the tragedy of cynical thinking, 01:09:58.500 |
and a major opportunity for so many of us, right? 01:10:21.340 |
to the extent that we can open ourselves to the data, 01:10:26.780 |
The social world is full of a lot more positive 01:10:30.860 |
and helpful and kind people than we realize, right? 01:10:34.620 |
The average person underestimates the average person. 01:10:42.060 |
people who do awful things every day around the world. 01:10:45.980 |
But we take those extreme examples and over-rotate on them. 01:10:50.060 |
We assume that the most toxic, awful examples 01:10:53.900 |
that we see are representative when they're not. 01:11:01.500 |
opens people to gaining more of those opportunities, 01:11:06.260 |
to using them, and to finding out more accurate 01:11:09.620 |
and more hopeful information about each other. 01:11:13.860 |
about negative interactions or somebody stealing from us 01:11:32.980 |
in the form of comments and clapbacks and retweets. 01:11:37.060 |
And there certainly is benevolence on social media, 01:11:40.020 |
but what if any data exists about how social media 01:11:51.120 |
And I should say that there's also the kind of, 01:11:54.760 |
I have to be careful, I'm trying not to be cynical. 01:11:58.740 |
I maintain the view that certain social media platforms 01:12:11.980 |
I'm trying to think of accounts on Instagram like Upworthy, 01:12:15.620 |
which its whole basis is to promote positive stuff. 01:12:27.980 |
To what extent is just being on social media, 01:12:30.580 |
regardless of platform, increasing or decreasing cynicism? 01:12:45.540 |
Social media has been a tectonic shift in our lives. 01:12:52.860 |
but as you know, history's not an experiment. 01:13:01.020 |
That said, my own intuition and a lot of the data 01:13:23.140 |
- Approximately the height of the Statue of Liberty, yeah. 01:13:28.380 |
of scrolling a day, much of it doom scrolling, 01:13:37.260 |
what are we seeing when we scroll for that long? 01:13:43.380 |
And are they representative of what people are really like? 01:13:53.300 |
is not representative of the human population. 01:14:11.060 |
formerly known as Twitter, when people tweet in outrage, 01:14:17.360 |
and when they tweet about, in particular, immorality, 01:14:24.260 |
those tweets are broadcast further, they're shared more. 01:14:37.580 |
using a kind of reinforcement learning model, right? 01:14:40.540 |
Reinforcement learning is where you do something, 01:14:43.180 |
you're rewarded, and that reward makes you more likely 01:14:51.500 |
that when people tweet in outrage and then get egged on, 01:14:56.100 |
and oftentimes I should say this is tribal in nature, 01:14:58.700 |
it's somebody tweeting against somebody who's an outsider, 01:15:03.580 |
who they consider to be part of their group, right? 01:15:06.320 |
When that happens, that person is more likely 01:15:12.200 |
on that outrage and on that moral outrage in particular. 01:15:31.940 |
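A rough sketch of the reinforcement-learning account described here; the reward magnitudes, learning rate, and softmax choice rule are illustrative assumptions. The point it shows: if outraged posts reliably earn more engagement, a simple value-learning rule drifts the poster toward posting more outrage.

```python
import math, random

def simulate_poster(days=200, lr=0.1, seed=1):
    """Value learning over two post types. Assumed: outraged posts earn
    more engagement (reward) on average than neutral ones."""
    rng = random.Random(seed)
    q = {"outrage": 0.0, "neutral": 0.0}
    outrage_count = 0
    for _ in range(days):
        # Softmax choice: the higher a post type's learned value, the likelier it is chosen.
        p_outrage = math.exp(q["outrage"]) / (math.exp(q["outrage"]) + math.exp(q["neutral"]))
        choice = "outrage" if rng.random() < p_outrage else "neutral"
        reward = rng.gauss(2.0, 0.5) if choice == "outrage" else rng.gauss(1.0, 0.5)
        q[choice] += lr * (reward - q[choice])   # engagement nudges the value estimate up
        outrage_count += (choice == "outrage")
    return outrage_count / days

print(f"share of outraged posts after learning: {simulate_poster():.0%}")
```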
I mean, 90 plus percent of tweets are created 01:15:41.980 |
And these are probably not representative, these folks, 01:15:45.540 |
not representative of the rest of us in terms 01:15:47.500 |
of how extreme and maybe how bitter their opinions are. 01:15:55.580 |
that Statue of Liberty's worth of information, 01:16:00.980 |
We think that we're seeing our fellow citizens. 01:16:14.880 |
This is, by the way, not just part of social media, 01:16:19.860 |
Communication theorists talk about something called 01:16:25.500 |
Where the more time that you spend looking at the news, 01:16:28.260 |
for instance, the more you think violent crime 01:16:33.020 |
the more you think you're in danger of violent crime, 01:16:36.340 |
even during years when violent crime is decreasing. 01:16:39.820 |
I'm old enough to remember when "Stranger Danger" 01:16:48.860 |
of a kid who had been kidnapped by a stranger. 01:16:53.580 |
how many kids are being kidnapped by strangers in the US, 01:16:57.460 |
they would, in many cases, say 50,000 children 01:17:14.220 |
is an absolute tragedy, but there's a big difference here. 01:17:21.140 |
we end up with these enormously warped perceptions 01:17:24.620 |
where we think that the world is much more dangerous 01:17:34.820 |
so much more often than stories of everyday goodness, 01:17:39.920 |
but it's not winning right now in the social media wars. 01:17:47.060 |
And so this leaves us all absolutely exhausted 01:18:11.180 |
about this battle royale that we've put ourselves in. 01:18:19.300 |
We're paying so much attention to a tiny minority 01:18:36.140 |
I have, I suppose, a mixed relationship to social media. 01:18:56.260 |
in this warring nature that they see on social media. 01:19:08.720 |
- And then when I'm away from it, I feel better. 01:19:14.100 |
sometimes we'll get sucked into the highly salient nature 01:19:27.120 |
This mean world syndrome, what's the inverse of that? 01:19:41.020 |
Things like Bluesky, which has other aspects to it as well. 01:19:50.500 |
It seems like people aren't really interested 01:19:52.940 |
in being on there as much as they are these other platforms. 01:20:03.700 |
are characterized by what we would call negativity bias. 01:20:08.180 |
Right, negative events and threats loom larger in our minds. 01:20:19.500 |
in that we'd prefer to avoid a negative outcome 01:20:23.700 |
That's the classic work of Kahneman and Tversky, 01:20:27.380 |
The impressions that we form are often negatively skewed. 01:20:31.460 |
So classic work in psychology going back to the 1950s 01:20:35.380 |
shows that if you teach somebody about a new person 01:20:39.220 |
who they've never met and you list three positive qualities 01:20:42.060 |
that this person has and three negative qualities, 01:20:50.220 |
And also remember more about their negative qualities 01:20:55.620 |
And again, you can see why this would be part of who we are 01:21:05.580 |
but speak and share in a negatively biased way. 01:21:09.540 |
In my lab, we had a study where people witnessed 01:21:12.780 |
other groups of four playing an economic game 01:21:15.460 |
where they could be selfish or they could be positive. 01:21:20.460 |
And we asked them, okay, we're gonna ask you to share 01:21:24.380 |
a piece of information about one of the people 01:21:33.140 |
And when somebody in a group acted in a selfish way, 01:21:36.500 |
people shared information about them three times more often 01:22:01.420 |
But we further found that when we actually showed 01:22:07.820 |
and we asked, hey, how generous and how selfish 01:22:11.940 |
they vastly underestimated that group's generosity. 01:22:16.740 |
In other words, in trying to protect our communities, 01:22:24.860 |
and give other people the wrong idea of who we are. 01:22:42.300 |
Why don't positive networks, positive information, 01:22:48.460 |
I think it's because of these ingrained biases in our mind. 01:22:52.060 |
And I understand that that can sound fatalistic 01:22:54.900 |
because it's like, oh, maybe this is just who we are. 01:22:57.740 |
But I don't think that we generally accept our instincts 01:23:12.380 |
towards people who look like us versus people who don't, 01:23:17.860 |
None of us, I think, or a few of us sit here and say, 01:23:22.500 |
so I guess I'm always going to be racially biased. 01:23:37.180 |
and to interpret new information they receive 01:23:45.180 |
I want to fight the default settings in my mind. 01:23:53.460 |
So to say that this toxic environment that we're in 01:24:07.980 |
to be able to live one's life in the most adaptive way 01:24:30.820 |
that we learn about on social media are simply wrong. 01:24:37.300 |
We're made to fear something that actually is not happening, 01:24:42.260 |
made to fear a group of people who are not as dangerous 01:24:50.140 |
Of course, I think being informed about the world around us 01:25:01.940 |
makes you avoidant of taking chances on people, 01:25:07.020 |
anybody who's different from you ideologically, 01:25:13.180 |
that's going to limit your life in very important ways. 01:25:26.460 |
is its own form of danger over a longer time horizon. 01:25:33.580 |
in which in the attempt to stay safe right now, 01:25:56.460 |
but chief among them is striving and pushing oneself. 01:26:07.780 |
because most people are basically spending time 01:26:13.220 |
and doing a lot less, just literally doing a lot less, 01:26:19.660 |
Although, by the way, he's in school to become a paramedic. 01:26:24.260 |
and is always doing a bunch of other things as well. 01:26:31.260 |
Now, I don't know if I agree with him completely, 01:26:37.820 |
You know, if social media is bringing out our cynicism, 01:26:49.280 |
at least to some extent, taking away our time 01:26:54.080 |
where we could be generative, writing, thinking, 01:27:09.360 |
to decrease cynicism or, as you referred to it, 01:27:24.960 |
Like, if we were just gonna do the Gedanken experiment here, 01:27:28.160 |
like, what would a feed on social media look like 01:27:30.760 |
that fed hopeful skepticism as opposed to cynicism? 01:27:39.600 |
so I'm gonna try to take it to a logical conclusion 01:27:42.720 |
that would never actually occur in real life, 01:27:49.840 |
and hopeful skepticism, and by hopeful skepticism, 01:27:54.780 |
a scientific mindset, a scientific perspective, 01:28:00.960 |
And in the hopeful piece, I simply mean skepticism 01:28:09.640 |
so that I'm going to be open and I'm going to realize 01:28:18.440 |
that I don't have to listen to it all the time. 01:28:22.780 |
I think that what I would want in a social media feed 01:28:35.240 |
to post to social media about what they're doing today, 01:28:39.020 |
about what they're thinking, about what they want, 01:28:48.560 |
And then people's feed was a representative sample 01:28:55.720 |
Real people, and people who over time, right, 01:29:00.520 |
as I scroll through my Statue of Liberty now, 01:29:05.140 |
I see the people who are extreme and negative and toxic, 01:29:10.820 |
who's driving her grandkid to hockey practice. 01:29:14.080 |
I see a nurse who's coming in to help an elderly patient. 01:29:18.820 |
I see somebody who's made an unlikely connection 01:29:32.340 |
that has struck me most over the last few years 01:29:42.100 |
as kind of dim, naive, a rose-colored pair of glasses. 01:29:55.620 |
And actually the best way to make people more hopeful 01:30:08.720 |
that we've tried at Stanford, in our own backyard. 01:30:15.120 |
been surveying as many Stanford undergraduates as we can 01:30:28.240 |
we asked thousands of undergraduates to describe 01:30:32.040 |
both themselves and the average Stanford student 01:30:37.960 |
How empathic is the average Stanford student? 01:30:40.280 |
How much do you like helping people who are struggling? 01:30:42.880 |
What do you think the average Stanford student 01:30:45.480 |
How much do you want to meet new people on campus? 01:30:47.640 |
How do you think the average student would respond? 01:30:50.080 |
And we discovered not one, but two Stanfords. 01:31:01.160 |
who want to help their friends when they're struggling. 01:31:04.340 |
The second Stanford existed in students' minds. 01:31:07.840 |
Their imagination of the average undergraduate 01:31:10.980 |
was much less friendly, much less compassionate, 01:31:14.320 |
much pricklier and more judgmental than real students were. 01:31:20.920 |
between what people perceive and social reality. 01:31:24.600 |
We found that students who underestimated their peers 01:31:32.480 |
or confide in a friend when they were struggling. 01:31:34.920 |
And that left them more isolated and lonelier. 01:31:37.920 |
This is the kind of vicious cycle of cynicism, right? 01:31:41.440 |
But more recently, my lab led by a great postdoc, Rui Pei, 01:31:48.120 |
And the intervention was as simple as you can imagine. 01:32:04.020 |
would like to help their friends who are struggling. 01:32:12.360 |
a one-unit class that most first-year students take 01:32:20.440 |
And we found that when students learned this information, 01:32:28.160 |
they were more likely to have a greater number of friends, 01:32:36.800 |
but then there's a virtuous cycle that can replace it 01:32:44.440 |
a social media feed where everybody has to post 01:32:47.520 |
and you see an actually representative sample of the world. 01:32:50.040 |
But if we could, I do think that that would generate 01:33:06.840 |
The reason I ask this is I'm quite excited about 01:33:11.920 |
I'm not one of these, I don't know what you call them, 01:33:19.800 |
And I've started using AI in a number of different 01:33:23.100 |
realms of life and I find it to be incredible. 01:33:29.240 |
and Google search with PubMed and it's fascinating. 01:33:37.700 |
is that it mimics a human lack of perfectness well enough 01:33:49.600 |
You could imagine that given the enormous amount 01:33:56.040 |
that some of the large language models that make up AI 01:34:04.560 |
that were overly stringent on certain topics. 01:34:07.640 |
You also wouldn't want AI that was not stringent enough, 01:34:11.960 |
right, because we are already and soon to be using AI 01:34:19.480 |
and the last thing we want are errors in that information. 01:34:22.280 |
So if we were to take what we know from humans 01:34:28.080 |
and others have collected about ways to shift ourselves 01:34:34.040 |
do you think that's something that could be laced 01:34:38.460 |
I'm not talking about at the technical level, 01:34:43.280 |
but could you build an AI version of yourself 01:34:50.320 |
you know, tune down the cynicism a little bit 01:34:58.640 |
of being you than you and then therefore make you better? 01:35:11.520 |
I think one roadblock that I don't think is insurmountable 01:35:18.040 |
in that really fascinating goal is that AI models 01:35:22.800 |
are, of course, products of the data that we feed them. 01:35:25.960 |
And so if, you know, basically AI models eat the internet, 01:35:29.960 |
right, swallow it, and then give it back to us in some form, 01:35:33.080 |
to the extent that the internet is asymmetrically weighting, 01:35:38.040 |
right, is overweighting negative content and cynical content, 01:35:43.640 |
then AIs that swallow that will reflect it as well. 01:35:50.040 |
and it's blowing my mind in real time to think about, 01:35:58.280 |
that AI takes information to account for negativity bias 01:36:01.900 |
and to correct, I mean, this is what you're getting at, 01:36:04.560 |
I think, right, to correct for that negativity bias 01:36:07.600 |
and then produce an inference that is less biased, 01:36:15.780 |
and then give that as a kind of digest to people, right? 01:36:19.040 |
So don't make me go through my social media feed. 01:36:23.080 |
Go through it for me, correct, right, de-bias it, 01:36:27.520 |
and then give it to me in a more accurate way. 01:36:41.880 |
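A toy sketch of the de-biasing digest being imagined here (the sentiment labels, target mix, and sampling scheme are all assumptions for illustration, not any real platform's API): re-sample an engagement-skewed feed so the digest approximates a more representative mix of posts.

```python
import random

def debiased_digest(feed, target_negative_share=0.2, k=10, seed=0):
    """Re-sample an engagement-ranked feed so the returned digest matches
    a more representative mix of positive and negative posts.
    `feed` is a list of (text, sentiment) pairs, sentiment in {"pos", "neg"}."""
    rng = random.Random(seed)
    neg = [p for p in feed if p[1] == "neg"]
    pos = [p for p in feed if p[1] == "pos"]
    n_neg = min(len(neg), round(k * target_negative_share))
    n_pos = min(len(pos), k - n_neg)
    digest = rng.sample(neg, n_neg) + rng.sample(pos, n_pos)
    rng.shuffle(digest)
    return digest

# Hypothetical engagement-skewed feed: 80% outraged posts, 20% everyday kindness.
feed = ([(f"outraged post {i}", "neg") for i in range(80)]
        + [(f"everyday kindness {i}", "pos") for i in range(20)])
print([sentiment for _, sentiment in debiased_digest(feed)])
```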
versus, I guess, awe, and I'll use the following examples. 01:36:46.140 |
I subscribed to an Instagram account that I like very much, 01:36:53.680 |
of beautiful animals in their ultimate essence. 01:37:03.240 |
and he's created what's called the photo arc. 01:37:04.920 |
He's trying to get images of all the world's animals 01:37:26.820 |
I subscribe to an animal account called Nature is Metal. 01:37:29.840 |
We've actually collaborated with Nature is Metal 01:37:37.720 |
that I didn't take, but someone I was with took, 01:37:46.320 |
And then I think about the Planet Earth series 01:37:51.480 |
which sort of has a mixture of beautiful ducklings, 01:38:23.240 |
and you get the impression that animals are just beautiful. 01:38:31.800 |
the essence of insects, reptiles, and mammals, 01:38:41.600 |
onto the landscape of real life, non-virtual life, 01:38:56.240 |
they're skewed toward this view that like life is hard, 01:39:07.280 |
Earlier, you described how it can be domain-specific. 01:39:18.200 |
they're outside the sort of developmental plasticity range, 01:39:21.260 |
what are the things that they can do on a daily basis 01:39:47.160 |
- It's a brilliant question, and you're right. 01:39:52.720 |
And heavy metal is great, but life is not all metal. 01:40:07.380 |
I've been trying to counteract it in myself and in others. 01:40:11.260 |
So I've focused on practical everyday things that I can do. 01:40:16.260 |
And I guess they come in a bunch of categories. 01:40:25.220 |
and the ways that we approach our own thinking. 01:40:38.880 |
taking tools from cognitive behavioral therapy 01:40:50.380 |
It's ironic, given what I study, but there we are. 01:40:59.020 |
wondering if they might take advantage of me. 01:41:01.700 |
And what I do these days that I didn't do in the past 01:41:18.940 |
I find that the evidence is thin to non-existent, right? 01:41:28.400 |
And can tap into a little bit of intellectual humility. 01:41:34.820 |
is apply what my lab and I call a reciprocity mindset. 01:42:02.440 |
And when you mistrust people, they become less trustworthy. 01:42:06.280 |
So in my lab, we found that when you teach people this, 01:42:19.080 |
then of course the other person reciprocates, 01:42:24.020 |
So I try, when I make a decision as to whether or not 01:42:33.540 |
And I try to rotate that a little bit and say, 01:42:39.480 |
Is this act of trust maybe a gift to this other person? 01:42:43.060 |
How can it positively influence who they become 01:42:48.260 |
And then a third thing on the sort of mindset side, 01:42:59.260 |
Savoring is a general term for appreciating good things 01:43:06.080 |
but gratitude is more appreciating the things 01:43:08.700 |
that have happened to us in the past that are good. 01:43:11.080 |
Savoring is, let's grab this moment right now 01:43:21.540 |
So I'll say, today we're gonna do an ice cream eating class, 01:43:40.980 |
and eat ice cream slowly, you know, not so that it melts, 01:43:44.320 |
but we'll say, you know, what are you enjoying about this? 01:43:48.700 |
What do you wanna remember about this moment? 01:43:51.060 |
And I noticed more recently while working on this book 01:43:56.900 |
Sunsets, somersaults, ice cream, you name it. 01:44:02.340 |
And what they were hearing from me about other people 01:44:16.220 |
who are politely following traffic laws all around us, 01:44:22.860 |
And so I started a practice of social savoring 01:44:28.820 |
positive things that I notice about other people. 01:44:46.580 |
if you're trying to tell somebody about something, 01:44:49.700 |
you look for examples that you can tell them about. 01:45:02.260 |
adopting a reciprocity mindset and social savoring, 01:45:19.580 |
because there's so much data to support gratitude practices. 01:45:31.480 |
is equally powerful towards our neurochemistry 01:45:41.960 |
and I'm sure people are as excited about them as I am 01:45:44.860 |
because all this knowledge from the laboratory 01:45:49.880 |
But of course we always want to know what can we do 01:45:57.800 |
in order to make ourselves smarter, better, happier 01:46:00.960 |
and in touch with awe on a more regular basis. 01:46:11.920 |
that I've talked about as thinking more like a scientist 01:46:17.060 |
then the second step to me is to act more like a scientist 01:46:32.200 |
You know, in this moment, you could interrupt the defaults. 01:46:42.740 |
And I encourage other people to do that as well. 01:46:47.860 |
taking leaps of faith on other people, right? 01:47:01.020 |
that I share my bank information with a prince 01:47:07.220 |
You need to be smart and safe in the risks that you take. 01:47:17.100 |
And there are lots of ways that I try to do this 01:47:22.420 |
One is to just be more open to the social world. 01:47:27.420 |
Andrew, I think you've said you're an introvert as well. 01:47:32.980 |
we tend to think that the social world is maybe tiring 01:47:45.820 |
where I underestimate the joy of social contact. 01:47:49.620 |
You know, there's so many times that before a dinner party, 01:47:59.740 |
but I would feel so relieved if they canceled. 01:48:16.320 |
And then afterwards, I'm so grateful to have done so. 01:48:23.900 |
If you ask them to forecast what it would be like 01:48:32.000 |
to express gratitude, to try to help somebody, 01:48:35.120 |
even to have a disagreement on ideological grounds, 01:48:38.980 |
people forecast that these conversations would be awful, 01:48:46.900 |
and in the case of disagreement, harmful even. 01:48:50.780 |
This is work from Nick Epley, Juliana Schroeder, 01:49:05.660 |
Nick, Juliana, and others then challenge people. 01:49:08.420 |
They say, "Go and do this, have this conversation, 01:49:12.380 |
And people's actual experiences are vastly more positive 01:49:21.620 |
I try to realize when my forecasts are too risk-averse 01:49:25.500 |
and too negative and say, "Let me just jump in. 01:49:44.500 |
So in essence, gathering new data from the world is great, 01:49:53.060 |
I try to really remember when a social encounter 01:50:13.620 |
And so I invited her to have this conversation 01:50:17.460 |
And we did not agree by the end of the conversation, 01:50:20.400 |
but it was an immensely deep and meaningful conversation, 01:50:43.740 |
try to lock in that learning from the social world 01:51:00.500 |
because many times I'll be listening to an audio book 01:51:04.220 |
and I'll put it into my voice memos or notes in my phone, 01:51:09.700 |
or another similar to it, and I'll go back and read it. 01:51:12.460 |
But many things don't get passed through the filters 01:51:21.460 |
to solidify information is to think about experiences 01:51:29.220 |
for emotional learning and our own personal evolution. 01:51:32.700 |
- Which brings me to another example of somebody 01:51:37.500 |
in the sort of philosophy, wellness, self-help space, 01:51:40.300 |
you mentioned Pema Chodron, wonderful writer. 01:51:45.300 |
There's someone else more or less in that space, 01:51:51.060 |
is about challenging beliefs by simply asking questions 01:51:55.820 |
This is something that I've started to explore a bit, 01:51:58.280 |
like one could have the idea that good people always, 01:52:09.940 |
for me, everything starts 10 minutes after the hour, 01:52:11.900 |
so we're consistently on time but late, right? 01:52:18.400 |
which is five minutes early is on time, on time is late, 01:52:26.220 |
In any event, the practice that she promotes, 01:52:42.240 |
as a way to really deconstruct one's own core beliefs, 01:52:44.880 |
which is, I think, a bit of what you're talking about, 01:52:47.040 |
and I feel like this could go in at least two directions. 01:52:53.740 |
that you can deconstruct by just simply asking questions. 01:52:59.320 |
Are there ever instances where that's not true? 01:53:07.000 |
where we tend to err toward hopeful skepticism 01:53:15.040 |
to explore hopeful skepticism also as a scientist, right? 01:53:21.780 |
can really get us into trouble, for instance? 01:53:24.900 |
Anyway, obviously, I haven't run a study on this 01:53:28.420 |
just because I came up with this example on the fly, 01:53:30.880 |
but does what I just described fit more or less 01:53:47.140 |
A person who's socially anxious might tell their therapist, 01:53:55.880 |
It might affect every decision that they make. 01:53:58.120 |
And the therapist might challenge them and say, 01:53:59.540 |
"Well, wait, what's the evidence that you have for that? 01:54:10.720 |
So this is the bedrock of one of the most successful 01:54:22.260 |
zoom in on something that you're sharing there 01:54:26.080 |
'Cause I think that in addition to testing our core beliefs, 01:54:36.560 |
And I think oftentimes, we think that we are more alone 01:54:42.940 |
So this is true in our politics, for instance, 01:54:50.840 |
who want more compromise, more peace, and less conflict 01:54:55.840 |
is north of 80% in surveys that my lab has conducted. 01:55:02.480 |
And so the lack of evidence, the lack of data 01:55:19.740 |
with school systems, hospital systems, businesses. 01:55:28.220 |
And I ask, how much do you value empathy and collaboration? 01:55:33.000 |
How much would you prefer a workplace or community 01:55:40.220 |
And invariably, and I'm talking about some places 01:55:43.500 |
where you might imagine people would be competitive, 01:55:56.720 |
Much more than they want competition or isolation. 01:56:03.500 |
hey, look, here's some data, look around you. 01:56:06.820 |
Here you've got 90% of people in this organization 01:56:12.220 |
So if you just take a look in your periphery, 01:56:15.420 |
almost everybody around you wants that as well. 01:56:25.380 |
And so I say, you have underestimated each other, 01:56:34.380 |
that we can take if we're in a leadership position anywhere. 01:56:38.180 |
Right, I think that looking for more data is great. 01:56:41.220 |
If you're a leader, you can collect those data 01:56:46.300 |
You can unveil the core beliefs of your community. 01:56:50.540 |
And oftentimes those core beliefs are incredibly beautiful 01:56:55.220 |
and surprising to the people in those communities 01:56:59.220 |
and give them what I would call not peer pressure, 01:57:02.200 |
but peer permission to express who they've been all along. 01:57:07.340 |
And one of the things that we've done on this podcast 01:57:12.980 |
critique and so forth in the comment section on YouTube. 01:57:17.980 |
And I always say, and I do read all the comments, 01:57:25.860 |
yes, they can be toxic in certain environments, 01:57:32.340 |
not just for the reader, but for the commenter. 01:57:36.140 |
And to see what people's core beliefs are really about. 01:57:39.900 |
Now, oftentimes comments are of a different form 01:57:45.340 |
But I think that because of the anonymity involved, 01:57:53.820 |
for people to really share their core beliefs 01:57:55.460 |
about something that can be really informative 01:58:04.220 |
where people are doing this in real time face-to-face 01:58:14.960 |
what are the data saying about the current state of affairs? 01:58:33.620 |
what do your data and understanding about cynicism 01:58:37.340 |
and hopeful skepticism tell us about that whole process 01:58:42.340 |
and how the two camps are presenting themselves? 01:58:51.820 |
but like so many of the themes in this conversation, 01:59:02.200 |
and I'm gonna talk about perceived polarization as well, 01:59:07.220 |
One, it's tragic because we are underestimating one another. 01:59:15.180 |
because the delta between the world that we think we're in 01:59:22.420 |
So there's a bunch of work on political perceptions. 01:59:25.340 |
This is work done by folks like Mina Cikara at Harvard, 01:59:29.340 |
my colleague Rob Willer in sociology at Stanford, 01:59:33.640 |
And a lot of this focuses on what people think 01:59:38.640 |
the average member of the other side is like. 01:59:45.400 |
what do you think the average Democrat believes? 01:59:49.800 |
what do you think the average Republican is like? 01:59:54.480 |
and Democrats here because a lot of these data 01:59:56.480 |
are bipartisan, the biases are pretty even across camps. 02:00:03.560 |
we are dead wrong about who's on the other side. 02:00:10.600 |
For instance, Democrats think that 25% of Republicans 02:00:21.900 |
But the stereotype of Republicans that Democrats hold 02:00:27.240 |
Republicans vastly overestimate the percentage of Democrats 02:00:31.080 |
who are part of the LGBTQ community, for instance. 02:00:36.640 |
So we're wrong about even who's on the other side, 02:00:40.480 |
but we're even more wrong about what they believe 02:00:44.120 |
So data suggests that there is perceived polarization, 02:00:47.680 |
that is what we think the other side believes, 02:00:53.520 |
I mean, first of all, we are divided, let's stipulate that. 02:01:08.820 |
My late friend, Emile Bruneau, collected some data 02:01:11.860 |
where he gathered Republicans and Democrats' views 02:01:15.820 |
He said, what would you want immigration to look like 02:01:24.000 |
And he plotted the distributions of what that looks like. 02:01:30.180 |
what do you think the other side would respond 02:01:40.500 |
And if you're a Republican, what would Democrats want? 02:01:55.420 |
but they're not that far apart, first of all, the means, 02:01:57.860 |
and there's a lot of overlap in the distributions. 02:02:01.040 |
The distributions of our perceptions are two hills 02:02:07.460 |
Republicans think that Democrats want totally open borders 02:02:11.180 |
and Democrats think Republicans want totally closed borders. 02:02:14.100 |
And the same pattern plays out for all sorts of issues 02:02:18.340 |
where we think the other side is much more extreme, 02:02:21.580 |
we think the average member of the other side 02:02:28.260 |
What do you think the other side thinks about you? 02:02:42.460 |
that my grad student Louisa Santos collected, 02:02:55.100 |
would support violence to advance their aims? 02:03:01.260 |
So we think that the average person on the other side 02:03:13.100 |
as violent extremists who want to burn down the system. 02:03:18.100 |
And again, we've talked about the warped media ecosystem 02:03:21.500 |
that we're in, and that probably contributes here. 02:03:26.580 |
are making all the problems that we fear worse. 02:03:35.460 |
And so we're caught in this almost cycle of escalation 02:03:44.700 |
that I'm not saying that we don't have actual disagreements. 02:03:52.380 |
our political spectrum are all peaceable and all kind. 02:03:55.860 |
There are absolutely extreme and violent people 02:03:59.540 |
around our country that represent their political views 02:04:08.220 |
that the average person underestimates the average person. 02:04:16.740 |
And so again, to me, this is a tragedy and an opportunity. 02:04:24.020 |
that when you ask people to actually pay attention 02:04:29.100 |
"Hey, actually, the other side fears violence 02:04:34.460 |
When you show them that actually the other side 02:04:43.140 |
that pulls back all of these escalatory impulses. 02:04:51.020 |
by showing them who the other side really is. 02:05:08.000 |
But I do think it's worth noting how wrong we are 02:05:13.900 |
can at least open a door, maybe let our minds wander 02:05:17.700 |
towards a place of greater compromise and peace, 02:05:32.700 |
And I confess, I didn't know that the landscape 02:05:35.060 |
was as close to the center as it turns out it is. 02:05:40.060 |
I have also many theories about how media and social media 02:05:47.760 |
and podcasts for that matter might be contributing 02:05:50.560 |
to this perceived polarization as opposed to the reality. 02:06:01.520 |
to remedy our understanding of what's going on out there. 02:06:07.460 |
can some of the same tools that you described 02:06:16.460 |
and in small groups be used to sort of defragment 02:06:21.460 |
some of the cynicism circuitry that exists in us 02:06:28.220 |
perceived highly polarized political landscape? 02:06:38.760 |
There is lots of evidence that we are actively avoiding 02:06:42.320 |
having conversations in part because of who we think 02:06:49.600 |
during Thanksgiving of 2016, which as you may recall, 02:06:53.940 |
was directly after a very polarizing election 02:06:58.940 |
and researchers used geo-tracking on people's cell phones 02:07:04.060 |
to examine whether in order to go to Thanksgiving dinner, 02:07:07.000 |
they crossed from a blue county into a red county 02:07:16.320 |
and I'm using air quotes here, quote unquote, 02:07:23.620 |
they're having dinner with people they disagree with. 02:07:26.120 |
And it turns out that people who crossed county lines, 02:07:29.440 |
who crossed into enemy territory, again in quotes, 02:07:33.560 |
they had dinners that were 50 minutes shorter 02:07:42.640 |
So we're talking about forsaking pie, Andrew. 02:07:52.660 |
about these conversations because if you believe 02:07:55.580 |
that the other side is a bunch of bloodthirsty marauders, 02:08:06.500 |
The truth though is that when we can collect better data, 02:08:12.160 |
oftentimes we end up with better perceptions. 02:08:22.040 |
Now again, I want to say that there are real threats 02:08:25.760 |
I'm not asking anybody to make themselves unsafe in any way. 02:08:29.920 |
But in our lab, again, my wonderful graduate student, 02:08:34.080 |
Louisa Santos, ran a study where we had about 160 people, 02:08:45.200 |
about gun control, immigration, and climate change, 02:08:59.360 |
And the forecasts went from neutral to negative. 02:09:02.100 |
Some people thought it won't make any difference, 02:09:04.800 |
and other people thought it will be counterproductive. 02:09:07.520 |
Some folks in our survey said dialogue is dead, 02:09:10.240 |
there's no point in any of these conversations. 02:09:15.800 |
Oh, and I should say, among the people who were cynical 02:09:20.760 |
that they would go poorly, was us, the research team. 02:09:28.400 |
or dox each other, or look up each other's addresses. 02:09:32.180 |
You know, Andrew, that we have institutional review boards 02:09:34.800 |
that make sure that we're keeping human subjects safe, 02:09:37.080 |
and the IRB wanted all sorts of safeguards in place, 02:09:40.840 |
because we all thought that these conversations 02:09:50.520 |
to rate how positive they were on a one to 100 scale. 02:10:00.620 |
And it wasn't just that they liked the conversation, 02:10:04.360 |
they were shocked by how much they liked the conversation. 02:10:13.180 |
not just for the person that they talked with, 02:10:15.520 |
and they reported more intellectual humility, 02:10:18.640 |
more openness to questioning their own views. 02:10:22.120 |
So here are conversations that we as a culture 02:10:30.420 |
but we don't know that, and we don't give ourselves chances 02:10:33.440 |
to learn that we're wrong, because we don't collect the data. 02:10:40.960 |
take that social risk, we are shocked and humbled, 02:10:51.740 |
that there can be some way out of this toxic environment 02:11:01.140 |
thank you so much for sharing your incredible, 02:11:12.980 |
I mean, to be a cynic is one potential aspect 02:11:22.780 |
There is plasticity over this aspect of ourselves. 02:11:25.460 |
If we adopt the right mindsets, apply the right practices, 02:11:29.540 |
and it's so clear based on everything you've shared today 02:11:44.260 |
and in the context of being happier individuals, 02:11:49.500 |
that to really take a hard look at how cynical we are, 02:11:53.900 |
and to start to make even minor inroads into that 02:12:01.940 |
that what I really feel you're encouraging us to do, 02:12:04.940 |
correct me if I'm wrong, is to do both internal 02:12:11.340 |
to move us away from internal and external polarization. 02:12:15.660 |
And I can't think of any higher calling than that. 02:12:25.820 |
These aren't just ideas, they are data-supported ideas. 02:12:29.500 |
And I just want to thank you for your incredible generosity 02:12:33.100 |
in coming here today to talk about those ideas. 02:12:40.900 |
And what you've shared with us today is phenomenal. 02:12:46.020 |
to talk about another topic that you are an expert in, 02:12:55.380 |
So once again, I just want to thank you for your time, 02:13:04.220 |
So on behalf of myself and everyone listening and watching, 02:13:08.980 |
- Andrew, this has been an absolutely delightful conversation. 02:13:11.860 |
And I will say my forecast of it was very high, 02:13:19.180 |
I also just want to take a moment to thank you 02:13:28.700 |
to generate knowledge, but also to share knowledge, 02:13:35.420 |
of the most important services that we can do 02:13:37.700 |
as folks who have been trained and learned all this stuff 02:13:41.060 |
to bring that information to as many people as we can. 02:13:44.260 |
And I think it's just, it's an incredible mission 02:13:49.220 |
So it's an honor to be part of that conversation 02:13:55.760 |
And it's a labor of love and an honor and a privilege 02:14:04.980 |
- Thank you for joining me for today's discussion 02:14:09.740 |
and to find a link to his new book, "Hope for Cynics," 02:14:12.560 |
please see the links in the show note captions. 02:14:15.000 |
If you're learning from and/or enjoying this podcast, 02:14:19.180 |
That's a terrific zero cost way to support us. 02:14:23.580 |
is to follow the podcast on both Spotify and Apple. 02:14:32.060 |
at the beginning and throughout today's episode. 02:14:36.940 |
If you have questions for me or comments about the podcast 02:14:39.620 |
or guests or topics that you'd like me to consider 02:14:43.100 |
please put those in the comment section on YouTube. 02:15:00.940 |
And it covers protocols for everything from sleep 02:15:08.940 |
And of course, I provide the scientific substantiation 02:15:14.380 |
The book is now available by presale at protocolsbook.com. 02:15:27.020 |
If you're not already following me on social media, 02:15:29.200 |
I'm Huberman Lab on all social media platforms. 02:15:31.900 |
So that's Instagram, X, formerly known as Twitter, 02:15:42.920 |
but much of which is distinct from the content 02:15:46.440 |
Again, that's Huberman Lab on all social media channels. 02:15:53.660 |
is a zero-cost monthly newsletter that has protocols, 02:15:59.460 |
that describe things like optimizing your sleep, 02:16:02.540 |
how to optimize your dopamine, deliberate cold exposure. 02:16:17.780 |
Again, you can find all that at completely zero cost 02:16:23.760 |
scroll down to newsletter, you put in your email, 02:16:28.660 |
Thank you once again for joining me for today's discussion