
Dr. Jamil Zaki: How to Cultivate a Positive, Growth-Oriented Mindset


Chapters

0:00 Dr. Jamil Zaki
2:12 Sponsors: Maui Nui, Joovv & Waking Up
6:59 Cynicism
12:38 Children, Attachment Styles & Cynicism
17:29 Cynicism vs. Skepticism, Complexity
23:30 Cultural Variability & Trust
26:28 Sponsor: AG1
27:40 Negative Health Outcomes; Cynicism: Perception & Intelligence
35:59 Stereotypes, Threats
39:48 Cooperative Environments, Collaboration & Trust
44:05 Competition, Conflict, Judgement
48:46 Cynics, Awe, “Moral Beauty”
55:26 Sponsor: Function
57:13 Cynicism, Creativity & Workplace
64:19 Assessing Cynicism; Assumptions & Opportunities
71:11 Social Media & Cynicism, “Mean World Syndrome”
78:35 Negativity Bias, Gossip
84:03 Social Media & Cynicism, Polarization, “Hopeful Skepticism”
92:59 AI, Bias Correction
99:05 Tools: Mindset Skepticism; Reciprocity Mindset; Social Savoring
106:05 Tools: Leaps of Faith; Forecasting; Encounter Counting
111:33 Tool: Testing & Sharing Core Beliefs
118:09 Polarization vs. Perceived Polarization, Politics
126:06 Challenging Conversations, Questioning Perceptions
134:04 Zero-Cost Support, YouTube, Spotify & Apple Follow & Reviews, Sponsors, YouTube Feedback, Protocols Book, Social Media, Neural Network Newsletter


00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.280 | where we discuss science
00:00:03.680 | and science-based tools for everyday life.
00:00:05.880 | I'm Andrew Huberman,
00:00:10.240 | and I'm a professor of neurobiology and ophthalmology
00:00:13.480 | at Stanford School of Medicine.
00:00:15.440 | My guest today is Dr. Jamil Zaki.
00:00:18.080 | Dr. Jamil Zaki is a professor of psychology
00:00:20.600 | at Stanford University.
00:00:22.120 | He is also the director
00:00:23.440 | of the Social Neuroscience Laboratory at Stanford.
00:00:26.360 | His laboratory focuses on key aspects of the human experience
00:00:29.500 | such as empathy and cynicism,
00:00:31.700 | which lie at the heart of our ability to learn
00:00:34.120 | and can be barriers to learning,
00:00:36.320 | as is the case with cynicism.
00:00:38.400 | Today, you'll learn the optimal mindsets to adopt
00:00:40.640 | when trying to understand how to learn conflict resolution
00:00:44.520 | and how to navigate relationships of all kinds
00:00:47.280 | and in all contexts,
00:00:48.780 | including personal relationships and in the workplace.
00:00:52.120 | What sets Dr. Zaki's work apart from others
00:00:54.520 | is that he's able to take laboratory research
00:00:56.840 | and apply that to real-world scenarios
00:00:58.920 | to direct optimal strategies
00:01:00.480 | for things like how to set personal boundaries,
00:01:03.080 | how to learn information in uncertain
00:01:05.360 | and sometimes even uncomfortable environments,
00:01:08.120 | and then how to bring that to bear
00:01:10.040 | in terms of your relationship to yourself,
00:01:12.080 | your relationship to others,
00:01:13.480 | and how to collaborate with others in more effective ways.
00:01:16.880 | I wanna be very clear that today's discussion,
00:01:18.940 | while focused on cynicism, trust, and empathy,
00:01:21.800 | is anything but squishy.
00:01:23.040 | In fact, it focuses on experimental data
00:01:25.820 | derived from real-world contexts.
00:01:28.040 | So it is both grounded in solid research
00:01:30.520 | and it is very practical,
00:01:32.240 | such that by the end of today's episode,
00:01:34.280 | you'll be armed with new knowledge
00:01:35.760 | about what cynicism is and is not,
00:01:38.380 | what empathy is and is not.
00:01:40.220 | This is very important because there's a lot of confusion
00:01:42.680 | about these words and what they mean.
00:01:45.040 | But I can assure you that by the end of today's discussion,
00:01:47.480 | you will have new frameworks and indeed new tools,
00:01:49.920 | protocols that you can use as strategies
00:01:52.800 | to better navigate situations and relationships of all kinds
00:01:56.480 | and indeed to learn better.
00:01:58.760 | I'd also like to mention that Dr. Zaki
00:02:00.600 | has authored a terrific new book entitled "Hope for Cynics,
00:02:04.480 | the Surprising Science of Human Goodness."
00:02:06.720 | And I've read this book and it is spectacular.
00:02:09.700 | There's a link to the book in the show note captions.
00:02:12.240 | Before we begin, I'd like to emphasize that this podcast
00:02:15.120 | is separate from my teaching and research roles at Stanford.
00:02:17.900 | It is, however, part of my desire and effort
00:02:20.180 | to bring zero-cost-to-consumer information
00:02:22.160 | about science and science-related tools
00:02:24.200 | to the general public.
00:02:25.560 | In keeping with that theme,
00:02:26.640 | I'd like to thank the sponsors of today's podcast.
00:02:29.580 | Our first sponsor is Maui Nui.
00:02:31.600 | Maui Nui venison is the most nutrient dense
00:02:33.840 | and delicious red meat available.
00:02:36.000 | I've spoken before on this podcast
00:02:37.580 | about the fact that most of us should be seeking
00:02:39.240 | to get about one gram of quality protein
00:02:41.680 | per pound of body weight every day.
00:02:44.240 | That protein provides critical building blocks
00:02:46.160 | for things like muscle repair and synthesis,
00:02:48.400 | but also promotes overall health
00:02:49.960 | given the importance of muscle as an organ.
00:02:52.320 | Eating enough quality protein each day
00:02:53.920 | is also a terrific way to stave off hunger.
00:02:56.560 | One of the key things, however,
00:02:57.740 | is to make sure that you're getting enough quality protein
00:03:00.160 | without ingesting excess calories.
00:03:02.320 | Maui Nui venison has an extremely high quality
00:03:04.760 | protein to calorie ratio,
00:03:06.640 | such that getting that one gram of protein
00:03:08.480 | per pound of body weight is both easy
00:03:10.600 | and doesn't cause you to ingest
00:03:12.240 | an excess amount of calories.
00:03:14.020 | Also, Maui Nui venison is absolutely delicious.
00:03:17.160 | They have venison steaks,
00:03:18.640 | ground venison, and venison bone broth.
00:03:21.000 | I personally like and eat all of those.
00:03:23.060 | In fact, I probably eat a Maui Nui venison burger
00:03:25.440 | pretty much every day,
00:03:26.720 | and occasionally I'll swap that for a Maui Nui steak.
00:03:29.520 | And if you're traveling a lot or simply on the go,
00:03:31.880 | they have a very convenient Maui Nui venison jerky,
00:03:34.640 | which has 10 grams of quality protein per stick
00:03:37.240 | at just 55 calories.
00:03:39.020 | While Maui Nui offers the highest quality meat available,
00:03:41.920 | their supplies are limited.
00:03:43.680 | Responsible population management
00:03:45.360 | of the Axis deer on the island of Maui
00:03:47.440 | means that they will not go beyond harvest capacity.
00:03:50.240 | Signing up for a membership is therefore the best way
00:03:52.440 | to ensure access to their high quality meat.
00:03:55.140 | If you'd like to try Maui Nui venison,
00:03:56.960 | you can go to mauinuivenison.com/huberman
00:04:00.200 | to get 20% off your membership or first order.
00:04:03.120 | Again, that's mauinuivenison.com/huberman.
00:04:06.960 | Today's episode is also brought to us by Joovv.
00:04:09.680 | Joovv makes medical-grade red light therapy devices.
00:04:13.040 | Now, if there's one thing I've consistently emphasized
00:04:15.060 | on this podcast,
00:04:16.160 | it's the incredible impact
00:04:17.840 | that light can have on our biology.
00:04:19.760 | Now, in addition to sunlight,
00:04:21.160 | red light and near-infrared light
00:04:22.880 | have been shown to have positive effects
00:04:24.520 | on improving numerous aspects of cellular and organ health,
00:04:27.520 | including faster muscle recovery,
00:04:29.160 | improved skin health and wound healing,
00:04:31.080 | even improvements in acne, reducing pain and inflammation,
00:04:34.200 | improving mitochondrial function,
00:04:35.760 | and even improving vision itself.
00:04:37.840 | What sets Joovv lights apart,
00:04:39.280 | and why they're my preferred red light therapy devices,
00:04:42.040 | is that they use clinically proven wavelengths,
00:04:44.180 | meaning it uses specific wavelengths of red light
00:04:46.460 | and near-infrared light in combination
00:04:48.520 | to trigger the optimal cellular adaptations.
00:04:51.020 | Personally, I use the Joovv handheld light,
00:04:53.160 | both at home and when I travel.
00:04:54.660 | It's only about the size of a sandwich,
00:04:56.120 | so it's super portable and convenient to use.
00:04:58.360 | I also have a Joovv whole body panel,
00:05:00.080 | and I use that about three or four times per week.
00:05:02.760 | If you'd like to try Joovv,
00:05:04.000 | you can go to Joovv, spelled J-O-O-V-V.com/huberman.
00:05:08.960 | Joovv is offering an exclusive discount
00:05:10.780 | to all Huberman Lab listeners,
00:05:12.360 | with up to $400 off select Joovv products.
00:05:15.280 | Again, that's Joovv, J-O-O-V-V.com/huberman,
00:05:19.200 | to get $400 off select Joovv products.
00:05:22.000 | Today's episode is also brought to us by Waking Up.
00:05:25.200 | Waking Up is a meditation app
00:05:26.720 | that offers hundreds of guided meditation programs,
00:05:29.180 | mindfulness trainings, yoga nidra sessions, and more.
00:05:32.560 | I started practicing meditation
00:05:34.000 | when I was about 15 years old,
00:05:35.840 | and it made a profound impact on my life.
00:05:38.480 | And by now, there are thousands
00:05:39.860 | of quality peer-reviewed studies
00:05:41.440 | that emphasize how useful mindfulness meditation can be
00:05:44.680 | for improving our focus, managing stress and anxiety,
00:05:47.440 | improving our mood, and much more.
00:05:49.720 | In recent years, I started using
00:05:51.200 | the Waking Up app for my meditations,
00:05:53.040 | because I find it to be a terrific resource
00:05:55.180 | for allowing me to really be consistent
00:05:56.940 | with my meditation practice.
00:05:58.740 | Many people start a meditation practice
00:06:00.840 | and experience some benefits,
00:06:02.160 | but many people also have challenges
00:06:03.880 | keeping up with that practice.
00:06:05.540 | What I and so many other people love
00:06:07.080 | about the Waking Up app
00:06:08.120 | is that it has a lot of different meditations
00:06:10.040 | to choose from,
00:06:10.920 | and those meditations are of different durations.
00:06:13.440 | So it makes it very easy to keep up
00:06:15.040 | with your meditation practice,
00:06:16.560 | both from the perspective of novelty,
00:06:18.440 | you never get tired of those meditations,
00:06:20.160 | there's always something new to explore
00:06:21.720 | and to learn about yourself
00:06:23.000 | and about the effectiveness of meditation,
00:06:25.360 | and you can always fit meditation into your schedule,
00:06:28.080 | even if you only have two or three minutes per day
00:06:30.860 | in which to meditate.
00:06:31.960 | I also really like doing yoga nidra,
00:06:33.640 | or what is sometimes called non-sleep deep rest,
00:06:36.000 | for about 10 or 20 minutes,
00:06:37.560 | because it is a great way to restore mental
00:06:39.800 | and physical vigor without the tiredness
00:06:42.000 | that some people experience when they wake up
00:06:43.440 | from a conventional nap.
00:06:44.720 | If you'd like to try the Waking Up app,
00:06:46.280 | please go to wakingup.com/huberman,
00:06:49.120 | where you can access a free 30-day trial.
00:06:51.200 | Again, that's wakingup.com/huberman
00:06:54.080 | to access a free 30-day trial.
00:06:56.280 | And now for my discussion with Dr. Jamil Zaki.
00:06:59.600 | Dr. Jamil Zaki, welcome.
00:07:02.080 | - Thanks so much for having me.
00:07:03.480 | - Delighted to have you here.
00:07:05.480 | And to learn from you,
00:07:07.120 | you have decided to tackle
00:07:09.520 | an enormous number of very interesting
00:07:12.840 | and challenging topics.
00:07:14.720 | Challenging because my read of it,
00:07:17.880 | not just your book,
00:07:18.760 | but of these fields in the science that you've done,
00:07:21.440 | is that people default to some complicated states
00:07:26.280 | and emotions sometimes that in some ways serve them well,
00:07:29.880 | in some ways serve them less well.
00:07:32.560 | So I'd like to talk about this at the level
00:07:34.080 | of the individual and interactions between pairs
00:07:38.120 | and larger groups and so on.
00:07:40.120 | But just to kick things off,
00:07:43.040 | what is cynicism?
00:07:45.320 | You know, I have my own ideas,
00:07:47.240 | but what is cynicism?
00:07:49.160 | What does it serve in terms of its role in the human mind?
00:07:53.640 | - The way that psychologists think of cynicism these days
00:07:57.400 | is as a theory, a theory about human beings.
00:08:01.440 | It's the idea that generally people at their core
00:08:05.680 | are selfish, greedy, and dishonest.
00:08:08.880 | Now, that's not to say that a cynical person
00:08:11.400 | will deny that somebody could act kindly, for instance,
00:08:14.880 | could donate to charity, could help a stranger,
00:08:17.720 | but they would say all of that,
00:08:20.240 | all of that kind and friendly behavior
00:08:22.720 | is a thin veneer covering up who we really are,
00:08:26.480 | which is self-interested.
00:08:29.120 | Another way of putting this is,
00:08:30.880 | you know, there are these ancient philosophical questions
00:08:33.000 | about people.
00:08:34.080 | Are we good or bad?
00:08:35.960 | Kind or cruel?
00:08:37.320 | Caring or callous?
00:08:39.080 | And cynicism is answering all of those
00:08:42.340 | in the relatively bleak way that you might.
00:08:45.140 | - I believe in your book, you quote Kurt Vonnegut,
00:08:50.160 | who says, "We are who we pretend to be,
00:08:53.400 | "so we need to be careful who we pretend to be."
00:08:56.760 | What do you think that quote means?
00:08:58.200 | How do you interpret that quote?
00:09:00.380 | - Thanks for bringing that up.
00:09:01.220 | Kurt Vonnegut, one of my favorite authors,
00:09:04.140 | and to me, that quote is enormously powerful
00:09:07.740 | because it expresses the idea of self-fulfilling prophecies.
00:09:12.420 | You know, there's this subjective sense that people have
00:09:17.460 | that our version of the world is the world,
00:09:20.060 | that we are passively taking in information,
00:09:22.900 | veridically, dispassionately,
00:09:26.380 | and in fact, that's not the case.
00:09:28.460 | We each construct our own version of the world.
00:09:32.340 | And so, for instance, if you think about cynicism, right?
00:09:36.180 | Are people kind or cruel?
00:09:37.420 | That's pretty much an unanswerable question
00:09:40.080 | at the level of science.
00:09:41.140 | It's a philosophical, some could argue,
00:09:43.380 | even a theological question.
00:09:45.380 | But it turns out that the way you answer that
00:09:47.700 | goes a long way in constructing and shaping
00:09:52.040 | the life that you live, the decisions that you make.
00:09:56.100 | So cynics, maybe it's not so much
00:09:58.140 | about who they pretend to be,
00:09:59.760 | but it's about who they pretend everybody else is, right?
00:10:03.580 | If you decide that other people are selfish, for instance,
00:10:07.700 | you'll be far less likely to trust them.
00:10:10.340 | And there's a lot of evidence that cynics,
00:10:12.620 | when they're put in situations with new people,
00:10:15.140 | even when they interact with their friends,
00:10:18.260 | romantic partners, and families,
00:10:20.780 | that they still have their guard up,
00:10:22.560 | that they're not able to make trusting
00:10:25.440 | and deep connections with other people.
00:10:27.780 | But guess what?
00:10:28.620 | When you treat other people in that way,
00:10:30.220 | a couple of things happen.
00:10:32.040 | One, you're not able to receive
00:10:34.800 | what most of us need from social connections.
00:10:38.120 | There's one really classic and very sad study
00:10:41.600 | where people were forced to give an extemporaneous speech
00:10:45.320 | about a subject they don't know much about,
00:10:47.020 | a very stressful experience
00:10:49.300 | that raised people's blood pressure.
00:10:52.320 | Some of these folks had a cheerleader,
00:10:54.760 | not an actual cheerleader, but a friendly stranger
00:10:58.440 | who was with them while they prepared,
00:10:59.880 | saying, "You've got this, I know you can do it.
00:11:02.140 | "I'm in your corner."
00:11:03.840 | Other people had no support.
00:11:06.180 | As you know, one of the great things about social support
00:11:09.280 | is that it buffers us from stress.
00:11:12.100 | So most people,
00:11:14.080 | when they had this friendly person by their side,
00:11:17.000 | their blood pressure, as they prepared for the speech,
00:11:19.520 | went up only half as much as when they were alone.
00:11:23.240 | But cynical people had a spike in their blood pressure
00:11:26.480 | that was indistinguishable in magnitude
00:11:29.720 | whether or not a person was by their side or not.
00:11:33.120 | One way that I think about this is,
00:11:35.440 | social connection is a deep and necessary form
00:11:39.120 | of psychological nourishment.
00:11:41.480 | And living a cynical life,
00:11:43.640 | making the decision that most people can't be trusted,
00:11:47.400 | stops you from being able to metabolize those calories,
00:11:51.480 | leaves you malnourished in a social way.
00:11:55.740 | A second thing that happens when you choose
00:11:59.160 | to pretend that others are selfish, greedy, and dishonest
00:12:03.700 | is that you bring out the worst in them.
00:12:06.920 | There's a lot of research that finds
00:12:08.880 | that cynical people tend to do things
00:12:11.220 | like monitoring others, spying on them,
00:12:13.980 | or threatening them to make sure
00:12:15.560 | that that other person doesn't betray them.
00:12:18.480 | But of course, other people can tell how we're treating them
00:12:22.200 | and they reciprocate our kindness
00:12:24.560 | and retaliate against our unkindness.
00:12:27.840 | So cynical people end up bringing out
00:12:30.600 | the most selfish qualities of others,
00:12:33.260 | telling a story full of villains
00:12:35.840 | and then ending up stuck living in that story.
00:12:38.640 | - How early in life does cynicism show up?
00:12:41.840 | I'm thinking about "Sesame Street" characters,
00:12:44.460 | which to me embody different neural circuits.
00:12:47.720 | You know, you've got Cookie Monster,
00:12:51.000 | some strong dopaminergic drive there.
00:12:53.960 | Knows what he wants, knows what he likes,
00:12:55.480 | and he's gonna get it.
00:12:56.360 | - That great prefrontal system, maybe.
00:12:58.280 | - Right, even if he has to eat the box
00:13:01.160 | in order to get to the cookie quicker.
00:13:03.780 | You have Elmo, who's all loving,
00:13:07.920 | and you have Oscar the Grouch.
00:13:09.440 | Somewhat cynical, but certainly grouchy.
00:13:13.520 | And then in, you know, essentially every fairy tale
00:13:18.520 | or every Christmas story or, you know,
00:13:23.220 | there seems to be sort of a skeptic
00:13:25.960 | or somebody that can't be brought on board
00:13:27.620 | the celebration that one would otherwise have.
00:13:31.340 | But even though kids are learning about cynicism
00:13:35.540 | and grouchiness and curmudgeons,
00:13:40.380 | I often think about those phenotypes in older folks
00:13:43.640 | because that's how they've been written
00:13:45.080 | into most of those stories.
00:13:47.000 | I guess Oscar the Grouch is,
00:13:48.840 | we don't know how old Oscar is.
00:13:50.520 | If one observes children,
00:13:53.560 | how early can you observe classically defined cynicism?
00:13:58.500 | - That's a great question.
00:14:01.080 | Classically defined cynicism would be hard
00:14:04.000 | to measure very early in life
00:14:05.400 | because you typically measure it through self-report.
00:14:07.800 | So people have to have relatively well-developed,
00:14:10.820 | elaborated stories that they can tell you
00:14:13.300 | about their version of the world.
00:14:16.060 | That said, one early experience and one early phenotype
00:14:20.640 | that's very strongly correlated with generalized mistrust
00:14:24.580 | and unwillingness to count on other people
00:14:27.660 | would be insecure attachment early in life.
00:14:30.980 | So for instance, you might know, but just for listeners,
00:14:35.440 | insecure attachment is a way of describing
00:14:38.680 | how kids experience the social world.
00:14:41.580 | It's often tested using something known
00:14:43.220 | as the strange situation where a one-year-old
00:14:45.980 | is brought to a lab with their caregiver,
00:14:49.560 | mother, father, whoever is caring for them.
00:14:52.860 | They're in a novel environment
00:14:54.480 | and researchers are observing
00:14:56.340 | how much do they explore the space?
00:14:57.740 | How comfortable do they seem?
00:14:59.780 | Then after that, a stranger enters the room.
00:15:03.580 | Couple minutes after that, their mother leaves the room
00:15:07.020 | or their caregiver leaves the room,
00:15:09.000 | which is of course incredibly strange
00:15:11.340 | and stressful for most one-year-olds.
00:15:13.340 | The caregiver then returns after a minute
00:15:16.560 | and what researchers look at is a few things.
00:15:19.420 | One, how comfortable is the child exploring a space
00:15:22.700 | with their caregiver present?
00:15:24.820 | Two, how comfortable are they when other people are around?
00:15:28.140 | Three, how do they react when their caregiver leaves?
00:15:31.060 | And four, how do they react at the reunion
00:15:34.020 | with their caregiver?
00:15:35.700 | And the majority of kids, approximately 2/3 of them,
00:15:38.580 | are securely attached, meaning that they are comfortable
00:15:42.380 | exploring a new space, they get really freaked out,
00:15:45.820 | of course, as you might when their caregiver leaves,
00:15:48.420 | but then they soothe quickly when their caregiver returns.
00:15:51.880 | The remaining 1/3 or so of kids are insecurely attached,
00:15:56.780 | meaning that they're skittish in new environments
00:15:59.180 | even when their parent or caregiver is there.
00:16:02.140 | They really freak out when their caregiver leaves
00:16:04.940 | and they're not very soothed upon their return.
00:16:08.500 | Now, for a long time, attachment style was viewed
00:16:11.260 | in very emotional terms and it is,
00:16:13.980 | it is an emotional reaction first and foremost,
00:16:17.000 | but researchers more recently have started to think about,
00:16:19.540 | well, what are the cognitive schemas?
00:16:22.780 | What are the underpinnings, the ways that children think
00:16:26.380 | when they are securely or insecurely attached?
00:16:28.860 | And one brilliant study used looking time.
00:16:31.880 | Looking time in kids is a metric of what surprises them.
00:16:36.820 | If something really surprising happens,
00:16:38.320 | they look for a very long time.
00:16:40.540 | And researchers found that insecurely attached kids,
00:16:43.420 | when they saw a video of a reunion,
00:16:47.700 | of a caregiver and infant acting in a way
00:16:52.100 | that felt loving and stable,
00:16:55.100 | they looked longer as though that was surprising.
00:16:58.300 | Kids who were securely attached didn't look very long
00:17:01.140 | at those stable interactions,
00:17:03.820 | but looked longer at interactions that were unstable.
00:17:06.940 | - Interesting.
00:17:07.780 | - It's almost as though there is a setup
00:17:09.960 | that kids develop very early.
00:17:11.780 | Can I count on people?
00:17:13.520 | Am I safe with people?
00:17:16.580 | And insecure attachment is a signal coming early in life,
00:17:19.780 | no, you're not safe with people,
00:17:21.380 | that I think, well, and the data show,
00:17:23.980 | elaborates later in life into mistrust
00:17:26.980 | in other relationships.
00:17:28.640 | - How different is cynicism from skepticism?
00:17:33.480 | I can think of some places where they might overlap,
00:17:38.060 | but cynicism seems to carry something
00:17:41.780 | of a lack of anticipation
00:17:44.340 | about any possibility of a positive future.
00:17:47.380 | Is that one way to think about it?
00:17:49.020 | - That's a very sharp way of thinking about it, actually.
00:17:51.700 | And I wish that people knew more
00:17:54.500 | about the discrepancy between these two ways
00:17:58.500 | of viewing the world.
00:17:59.500 | Cynicism and skepticism,
00:18:01.380 | people often use them interchangeably.
00:18:03.900 | In fact, they're quite different.
00:18:05.500 | And I would argue that one is much more useful
00:18:08.480 | for learning about the world
00:18:10.020 | and building relationships than the other.
00:18:12.500 | Again, cynicism is a theory that's kind of locked in,
00:18:16.840 | that no matter what people show you,
00:18:19.120 | their true colors are, again, untrustworthy
00:18:23.780 | and self-oriented.
00:18:25.420 | It's a hyper-Darwinian view, right,
00:18:27.420 | that ultimately people are red in tooth and claw.
00:18:30.380 | Skepticism is instead the, I guess, restlessness
00:18:36.380 | with our assumptions, a desire for new information.
00:18:42.040 | One way I often think about it is that cynics
00:18:44.980 | think a little bit like lawyers, right?
00:18:46.960 | They have a decision that they've already made about you
00:18:49.580 | and about everybody.
00:18:51.060 | And they're just waiting for evidence
00:18:52.900 | that supports their point.
00:18:54.620 | And when evidence comes in that doesn't support their point,
00:18:57.180 | they explain it away, right?
00:18:59.020 | And you see this, actually,
00:19:00.140 | that cynical people will offer more ulterior motives
00:19:03.920 | when they see an act of kindness, for instance.
00:19:06.580 | They'll explain it away.
00:19:08.380 | In that way, I think cynics actually are quite similar
00:19:11.940 | to the naive, trusting, gullible folks
00:19:14.800 | that they love to make fun of, right?
00:19:17.260 | Naivete, gullibility, is trusting people
00:19:20.620 | in a credulous, unthinking way.
00:19:23.420 | I would say cynicism is mistrusting people
00:19:26.300 | in a credulous and unthinking way.
00:19:29.340 | So if cynics then think like lawyers,
00:19:32.020 | sort of in the prosecution against humanity,
00:19:35.100 | skeptics think more like scientists.
00:19:38.180 | Skepticism, classically in philosophy,
00:19:40.980 | is the belief that you can never truly know anything.
00:19:44.760 | But as we think about it now,
00:19:46.520 | it's more the desire for evidence to underlie any claim
00:19:51.300 | that you believe.
00:19:53.020 | And the great thing about skepticism
00:19:55.020 | is it doesn't require an ounce of naivete.
00:19:58.300 | You can be absolutely sharp in deciding,
00:20:00.980 | I don't want to trust this person,
00:20:02.300 | or I do want to trust this person,
00:20:04.500 | but it allows you to update and learn from specific acts,
00:20:08.900 | specific instances, and specific people.
00:20:11.740 | - When I think about scientists,
00:20:13.340 | one of the first things I think about
00:20:14.820 | is not just their willingness,
00:20:17.000 | but their excitement to embrace complexity.
00:20:19.520 | - Yes.
00:20:20.360 | - Like, okay, these two groups disagree,
00:20:22.400 | or these two sets of data disagree,
00:20:25.340 | and it's the complexity of that interaction
00:20:28.480 | that excites them.
00:20:29.960 | Whereas when I think of cynics
00:20:32.900 | in the way that it's framed up in my mind,
00:20:34.780 | which I'm getting more educated now,
00:20:36.780 | but admittedly my understanding of cynicism
00:20:39.980 | is still rather superficial.
00:20:43.340 | You'll change that in the course of our discussion.
00:20:46.200 | But that cynics are not embracing
00:20:50.980 | the complexity of disagreement.
00:20:54.060 | They are moving away from the,
00:20:56.480 | certainly any notion of excitement by complexity.
00:20:59.860 | It seems like it's a heuristic,
00:21:01.640 | it's a way to simplify the world around you.
00:21:03.960 | - That's exactly right.
00:21:05.700 | Phil Tetlock has a great term for this
00:21:08.740 | called integrative complexity.
00:21:11.020 | To what extent can you hold different versions of the world,
00:21:14.580 | different arguments in mind?
00:21:16.340 | To what extent can you pick from each one
00:21:19.400 | what you believe based on the best evidence available?
00:21:22.620 | And integrative complexity is a great way
00:21:25.140 | to learn about the world and about the social world,
00:21:27.900 | whereas cynicism, as you rightly point out,
00:21:30.020 | is much more of a heuristic.
00:21:32.220 | It's a black and white form of thinking.
00:21:35.020 | And the really sad thing is that cynicism
00:21:39.140 | then puts us in a position where we can't learn very much.
00:21:43.840 | This is what, in learning theory,
00:21:45.220 | is called a wicked learning environment,
00:21:47.940 | where, and I don't wanna get too nerdy.
00:21:49.940 | Well, I guess I can get nerdy here.
00:21:51.180 | - You can get as nerdy as you want.
00:21:52.780 | This audience likes nerdy.
00:21:54.500 | - So let's think in Bayesian terms, right?
00:21:57.140 | So Bayesian statistics is where you have
00:22:00.060 | a set of beliefs about the world,
00:22:01.760 | you take new information in,
00:22:03.860 | and that new information allows you to update your priors
00:22:07.300 | into a posterior distribution, into a new set of beliefs.
00:22:11.460 | And that's great.
00:22:12.300 | That's a great way to learn about the world,
00:22:14.780 | to adapt to new information and new circumstances.
00:22:18.440 | A wicked learning environment is where your priors
00:22:21.620 | prevent you from gathering the information
00:22:24.020 | that you would need to confirm or disconfirm them.
00:22:26.920 | So think about mistrust, for instance, right?
00:22:30.720 | It's easy to understand why people mistrust.
00:22:33.540 | Some of us are insecurely attached,
00:22:35.380 | and we've been hurt in the past.
00:22:37.240 | We're trying to stay safe.
00:22:38.980 | We don't want to be betrayed.
00:22:40.560 | This is a completely natural response.
00:22:43.100 | It's a totally understandable response.
00:22:45.860 | But when we decide to mistrust,
00:22:49.460 | we never are able to learn whether the people
00:22:52.100 | who we are mistrusting would have been trustworthy or not.
00:22:56.380 | When we trust, we can learn
00:22:58.460 | whether we've been right or not, right?
00:22:59.940 | Somebody can betray us, and that hurts,
00:23:01.540 | and we remember it for years.
00:23:03.180 | Or more often than not, the data turn out to show us
00:23:07.220 | they can honor that trust.
00:23:08.580 | We can build a relationship.
00:23:10.340 | We can start a collaboration.
00:23:13.580 | We can live a full social life.
00:23:16.140 | And it turns out that the problem is
00:23:18.700 | that trusting people incorrectly, you do learn from,
00:23:22.460 | but mistrusting people incorrectly, you don't learn from,
00:23:26.140 | because the missed opportunities are invisible to us.
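To make the asymmetry Dr. Zaki describes concrete, here is a minimal Python sketch of a "wicked learning environment." It is an illustration only, not code or data from any study he cites: an agent that keeps trusting accumulates observations and updates a Beta prior toward the true rate of trustworthy partners, while an agent that refuses to trust collects no feedback, so its belief never moves. The 70% base rate, the Beta-Bernoulli model, and the names (simulate, trust_policy) are assumptions chosen for the example.

```python
import random

def simulate(trust_policy, true_rate=0.7, rounds=200, seed=0):
    """Beta-Bernoulli updating of a belief about how often people honor trust.

    trust_policy(belief) -> True if the agent chooses to trust this round.
    Only rounds where the agent trusts produce an observation; declined
    rounds teach nothing, which is the "wicked" part of the environment.
    """
    rng = random.Random(seed)
    honored, betrayed = 1, 1                      # Beta(1, 1) prior: no information yet
    for _ in range(rounds):
        belief = honored / (honored + betrayed)   # current posterior mean
        if trust_policy(belief):
            if rng.random() < true_rate:
                honored += 1                      # trust was honored: update upward
            else:
                betrayed += 1                     # trust was betrayed: update downward
        # if the agent never trusts, no evidence arrives and the prior stays frozen
    return honored / (honored + betrayed)

skeptic = simulate(lambda belief: True)           # always gathers evidence
cynic = simulate(lambda belief: belief > 0.9)     # demands near-certainty first, so never starts
print(f"skeptic's belief ~ {skeptic:.2f}")        # ends up near the true 0.7
print(f"cynic's belief   ~ {cynic:.2f}")          # stuck at the 0.5 starting prior
```

Running the sketch, the trusting agent's estimate drifts toward the assumed true rate while the mistrusting agent's estimate never leaves its prior: the missed opportunities stay invisible, which is the point made above in code form.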
00:23:30.740 | - Wow, there's certainly a lot there
00:23:32.060 | that maps to many people's experience.
00:23:34.740 | So you pointed out that some degree of cynicism
00:23:37.920 | likely has roots in insecure attachment.
00:23:40.940 | That said, if one looks internationally,
00:23:45.620 | do we find cultures where it's very hard to find cynics,
00:23:50.500 | and there could be any number of reasons for this,
00:23:53.740 | or perhaps even more interestingly,
00:23:56.420 | do we find cultures where there really isn't even a word
00:23:59.660 | for cynicism?
00:24:01.100 | - Wow, I love that question.
00:24:03.580 | There is a lot of variance in,
00:24:06.060 | and the data on cynicism are much more local
00:24:09.340 | to the US typically.
00:24:10.540 | I mean, for better and for worse,
00:24:14.180 | a lot of research on this is done in an American context.
00:24:17.280 | But that said, there's a lot of data on generalized trust,
00:24:21.920 | which you could say is an inverse of cynicism, right?
00:24:24.280 | So for instance, there are national and international
00:24:27.020 | samples of major surveys which ask people
00:24:31.740 | whether they agree or disagree
00:24:33.420 | that most people can be trusted.
00:24:35.220 | And there's a lot of variance around the world.
00:24:38.260 | In general, the cultures that are most trusting
00:24:41.060 | have a couple of things in common.
00:24:43.020 | One, they are more economically equal
00:24:46.660 | than untrusting cultures.
00:24:48.600 | So there's a lot of great work from Kate Pickett
00:24:52.660 | and Richard Wilkinson that,
00:24:56.260 | they have a book called "The Spirit Level"
00:24:57.740 | where they look at inequality across the world
00:25:00.180 | and relate it to public health outcomes,
00:25:01.820 | and one of them is trust.
00:25:03.540 | There's also variance in trust over time.
00:25:07.780 | So you can look at not just are there places or cultures
00:25:10.900 | that trust more than others,
00:25:12.100 | but when does a culture trust more or less?
00:25:16.060 | And in the US, that's sadly a story of decline.
00:25:19.740 | In 1972, about half of Americans believed
00:25:23.360 | that most people can be trusted.
00:25:25.340 | And by 2018, that had fallen to about a third of Americans.
00:25:29.460 | And that's a drop as big, just to put it in perspective,
00:25:33.200 | as the stock market took in the financial collapse of 2008.
00:25:36.780 | So there's a lot of variance here,
00:25:39.680 | both across space and time.
00:25:42.340 | And one of the, not the only,
00:25:44.900 | but one of the seeming characteristics of cultures
00:25:48.860 | that tracks that is how unequal they are.
00:25:52.180 | In part, because research suggests
00:25:54.760 | that when you are in a highly unequal society,
00:25:58.460 | economically, there's a sense of zero-sum competition
00:26:01.580 | that develops.
00:26:02.420 | There's a sense that, wait a minute,
00:26:03.760 | anything that another person gets, I lose.
00:26:07.100 | And if you have that inherent sense of zero-sum competition,
00:26:12.100 | then it's very difficult to form bonds.
00:26:15.300 | It's very difficult to trust other people
00:26:17.520 | because you might think, well, in order to survive,
00:26:20.220 | this person has to try to outrun me.
00:26:23.220 | They have to try to trip me.
00:26:24.260 | They have to try to make me fail for themselves to succeed.
00:26:28.220 | I'd like to take a quick break
00:26:29.460 | and acknowledge our sponsor, AG1.
00:26:32.100 | By now, many of you have heard me say
00:26:33.700 | that if I could take just one supplement,
00:26:35.460 | that supplement would be AG1.
00:26:37.540 | The reason for that is AG1 is the highest quality
00:26:40.080 | and most complete
00:26:40.980 | of the foundational nutritional supplements available.
00:26:43.740 | What that means is that it contains
00:26:45.180 | not just vitamins and minerals,
00:26:46.540 | but also probiotics, prebiotics, and adaptogens
00:26:49.940 | to cover any gaps you may have in your diet
00:26:52.340 | and provide support for a demanding life.
00:26:54.620 | For me, even if I eat mostly whole foods
00:26:56.460 | and minimally processed foods,
00:26:57.700 | which I do for most of my food intake,
00:26:59.620 | it's very difficult for me
00:27:00.780 | to get enough fruits and vegetables,
00:27:02.420 | vitamins and minerals, micronutrients,
00:27:04.300 | and adaptogens from food alone.
00:27:06.500 | For that reason, I've been taking AG1 daily since 2012
00:27:09.940 | and often twice a day, once in the morning or mid-morning,
00:27:12.620 | and again in the afternoon or evening.
00:27:14.620 | When I do that, it clearly bolsters my energy,
00:27:17.040 | my immune system, and my gut microbiome.
00:27:19.620 | These are all critical to brain function,
00:27:21.460 | mood, physical performance, and much more.
00:27:24.020 | If you'd like to try AG1,
00:27:25.500 | you can go to drinkag1.com/huberman
00:27:28.620 | to claim their special offer.
00:27:30.200 | Right now, they're giving away five free travel packs
00:27:32.340 | plus a year supply of vitamin D3K2.
00:27:35.180 | Again, that's drinkag1.com/huberman
00:27:38.820 | to claim that special offer.
00:27:40.540 | - What is the relationship, if any,
00:27:42.560 | between cynicism and happiness or lack of happiness?
00:27:46.760 | When I think of somebody who's really cynical,
00:27:48.540 | I think of an Oscar the Grouch
00:27:50.300 | or a curmudgeon-like character.
00:27:53.160 | And as I ask this question,
00:27:54.380 | I'm thinking specifically about what you said earlier
00:27:56.620 | about how cynicism prevents us
00:27:58.960 | from certain forms of learning that are important
00:28:02.300 | and very valuable to us.
00:28:03.260 | Here's the reason why.
00:28:04.100 | I'll give just a little bit of context.
00:28:05.260 | I remember when I was a kid,
00:28:06.180 | my dad, who went to classic boarding schools,
00:28:10.260 | he grew up in South America,
00:28:11.500 | but he went to these boarding schools that were very strict.
00:28:15.900 | And he was taught, he told me,
00:28:18.140 | that to be cheerful and happy,
00:28:22.220 | people would accuse you of being kind of dumb.
00:28:26.280 | Whereas if you were cynical
00:28:27.860 | and you acted a little bored with everything,
00:28:30.180 | people thought that you were more discerning,
00:28:32.520 | but that he felt it was a terrible model
00:28:34.720 | for going through life because it veered into cynicism.
00:28:39.300 | My dad happens to be a scientist.
00:28:41.300 | He's, I think, a relatively happy person.
00:28:44.800 | Sorry, dad.
00:28:45.640 | A happy person, seems happy,
00:28:46.940 | but meaning he's a person who has happiness
00:28:49.940 | and he has other emotions too.
00:28:51.140 | I wouldn't say he's happy all the time,
00:28:52.540 | but he experiences joy and pleasure in daily,
00:28:55.680 | small things and big things in life.
00:28:57.380 | So clearly he rescued himself from the forces
00:29:00.620 | that were kind of pushing him down that path.
00:29:02.020 | But that's the anecdote.
00:29:03.660 | But I use that question more as a way to frame up
00:29:07.300 | the possible collaboration between cynicism
00:29:10.220 | and exuding boredom or a challenge
00:29:15.220 | in shifting somebody towards a happier affect.
00:29:20.060 | Because when I think about cynics,
00:29:21.340 | I think that they're kind of unhappy people.
00:29:23.820 | And when I think about people who are not very cynical,
00:29:25.460 | I think of them as kind of cheerful and curious.
00:29:27.980 | And there's some ebullience there.
00:29:29.920 | They might not be Tigger-like in their affect,
00:29:33.200 | but they kind of veer that direction.
00:29:36.020 | - Andrew, I love this trip down memory lane.
00:29:37.840 | I'm having all these childhood memories
00:29:39.760 | of Tigger and Sesame Street.
00:29:43.840 | There's so much in what you're saying.
00:29:45.340 | I want to try to pull on a couple of threads here,
00:29:47.460 | if that's okay.
00:29:48.600 | First, and this one is pretty straightforward,
00:29:51.560 | the effect of cynicism on well-being
00:29:53.860 | is just really documented and quite negative.
00:29:57.460 | So there are large prospective studies
00:30:00.860 | with tens of thousands of people,
00:30:04.100 | several of these studies,
00:30:05.460 | that measure cynicism and then measure life outcomes
00:30:08.740 | in the years and decades afterwards.
00:30:11.140 | And the news is pretty bleak for cynics, right?
00:30:14.100 | So absolutely, lower levels of happiness,
00:30:18.020 | flourishing, satisfaction with life,
00:30:20.580 | greater incidence of depression, greater loneliness.
00:30:24.500 | But it's not just the neck up that cynicism affects.
00:30:29.140 | Cynics over the course of their lives
00:30:31.300 | also tend to have greater degrees of cellular inflammation,
00:30:35.420 | more incidence of heart disease,
00:30:37.720 | and they even have higher rates of all-cause mortality,
00:30:41.340 | so shorter lives than non-cynics.
00:30:43.660 | And again, this might sound like, wait a minute,
00:30:45.540 | you go from a philosophical theory
00:30:47.300 | to a shorter life.
00:30:49.500 | The answer is, yeah, you do, because,
00:30:51.940 | and again, these are correlational studies,
00:30:53.940 | so I don't wanna draw too many causal claims,
00:30:56.180 | but they're quite rigorous and control
00:30:58.100 | for a lot of other factors.
00:31:00.300 | But I would say that this is consistent
00:31:02.420 | with the idea that, really,
00:31:04.100 | one of the great protectors of our health
00:31:07.540 | is our sense of connection to other people.
00:31:09.940 | And if you are unable or unwilling
00:31:13.120 | to be vulnerable around others,
00:31:15.020 | to really touch in to that type of connection,
00:31:18.580 | it stands to reason that things like chronic stress
00:31:21.700 | and isolation would impact not just your mind,
00:31:25.280 | but all through your body and your organ systems.
00:31:28.760 | So again, the news here is not great,
00:31:31.420 | and I often think about one of the best encapsulations
00:31:34.820 | of a cynical view of life comes from Thomas Hobbes,
00:31:39.060 | the philosopher, who, in his book "Leviathan,"
00:31:42.020 | said we need a restrictive government.
00:31:44.980 | Because left to our own devices,
00:31:47.620 | human life is nasty, brutish, and short.
00:31:50.500 | And ironically, I think that might describe
00:31:52.700 | the lives of cynics themselves more than most people.
00:31:57.700 | So that's point one, right,
00:31:59.780 | is that there is this pretty stark negative correlation
00:32:03.740 | between cynicism and a lot of life outcomes
00:32:06.680 | that we might want for ourselves.
00:32:08.860 | But point two, I think,
00:32:10.540 | is related to what your dad also noticed,
00:32:13.460 | which is that if cynicism hurts us so much,
00:32:17.380 | why would we adopt it?
00:32:19.540 | If it was a pill, if there was a pill
00:32:21.460 | that as its side effects listed depression,
00:32:24.180 | loneliness, heart disease, and early death,
00:32:26.460 | it would be a poison, right?
00:32:27.820 | It would have a skull and crossbones on the bottle.
00:32:30.580 | But yet we're swallowing it.
00:32:32.080 | More of us are swallowing it
00:32:33.540 | than we did in years and decades past.
00:32:36.880 | Well, one of the answers, I think,
00:32:39.460 | is because our culture glamorizes cynicism.
00:32:42.220 | It's because of the very stereotype
00:32:44.140 | that your father pointed out,
00:32:45.340 | which is that if you're happy-go-lucky,
00:32:47.340 | if you trust people, that kind of seems dull.
00:32:50.460 | It seems like maybe you're not that sharp.
00:32:52.160 | Maybe you don't understand the world.
00:32:54.580 | And there is that strong relationship in our stereotypes,
00:32:58.980 | in our models of the world.
00:33:01.020 | Susan Fiske and many other psychologists
00:33:03.080 | have studied warmth and competence, right?
00:33:06.260 | How friendly and caring does somebody seem?
00:33:10.260 | And how able do they seem to accomplish hard things?
00:33:13.900 | And it turns out that in many studies,
00:33:16.260 | people's perception is that these are inversely correlated.
00:33:19.160 | That if you're warm, maybe you're not that competent.
00:33:22.120 | And if you're competent, maybe you shouldn't be that warm.
00:33:24.380 | And in fact, if you tell people
00:33:26.520 | to act as competently as they can,
00:33:29.680 | they'll often respond by being a little bit less nice,
00:33:32.980 | a little bit less warm than they would be otherwise.
00:33:36.300 | There's also data that find that,
00:33:38.820 | you know, where people are presented in surveys
00:33:41.100 | with a cynic and a non-cynic.
00:33:43.420 | They're told about, here's one person,
00:33:45.900 | they really think that people are great overall,
00:33:48.940 | and they tend to be trusting.
00:33:50.420 | Here's another person who thinks that people
00:33:52.660 | are kind of out for themselves
00:33:53.900 | and really doesn't trust most folks.
00:33:56.420 | And then they'll ask those people,
00:33:58.060 | who should we pick for this difficult intellectual task?
00:34:02.180 | And 70% of respondents pick a cynical person
00:34:06.580 | over a non-cynic for difficult intellectual tasks.
00:34:09.780 | 85% of people think that cynics are socially wiser,
00:34:14.280 | that they'd be able, for instance, to detect who's lying
00:34:17.160 | and who's telling the truth.
00:34:18.900 | So most of us put a lot of faith in people
00:34:22.280 | who don't have a lot of faith in people, ironically.
00:34:25.580 | And even more ironically, we're wrong to do so.
00:34:28.860 | Olga Stavrova, this great psychologist who studies cynicism,
00:34:32.340 | has this paper called "The Cynical Genius Illusion,"
00:34:35.780 | where she documents all these biases,
00:34:38.760 | the way that we think cynics are bright and wise,
00:34:42.220 | and then uses national data, tens of thousands of people,
00:34:46.460 | to show that actually, cynics do less well
00:34:49.620 | on cognitive tests, on mathematical tests,
00:34:52.460 | that trust is related with things
00:34:54.740 | like intelligence and education,
00:34:57.100 | and that, in other work,
00:35:00.580 | this is not from Olga Stavrova, but from others,
00:35:03.380 | that actually, cynics do less well than non-cynics
00:35:06.580 | in detecting liars.
00:35:08.000 | Because if you have a blanket assumption about people,
00:35:11.840 | you're not actually attending to evidence in a sharp way.
00:35:15.380 | You're not actually taking in new information
00:35:17.660 | and making wise decisions.
00:35:20.060 | - In other words, cynics are not being scientific.
00:35:24.500 | Their hypothesis is cast,
00:35:27.300 | but they're not looking at the data equally, right?
00:35:30.120 | And we should remind people
00:35:31.300 | that a hypothesis is not a question.
00:35:33.700 | Every great experiment starts with a question,
00:35:35.900 | and then you generate a hypothesis,
00:35:37.500 | which is a theory or conclusion,
00:35:41.500 | essentially made up front,
00:35:43.580 | and then you go collect data,
00:35:45.660 | and you see if you prove or disprove the hypothesis.
00:35:49.340 | And you can never really prove a hypothesis.
00:35:51.020 | You can only support it or not support it
00:35:54.220 | with the data that you collect,
00:35:55.420 | depending on the precision of your tools.
00:35:56.980 | But that's very interesting,
00:36:00.200 | because I would think that if we view cynics as smarter,
00:36:05.200 | which clearly they're not as a group, right?
00:36:08.380 | You're saying cynics are not more intelligent, right?
00:36:11.020 | I believe that's covered in your book.
00:36:13.060 | And if one knows that,
00:36:16.560 | then why do we send cynics in kind of like razors
00:36:20.740 | to assess what the environment is like?
00:36:25.400 | Is that because we'd rather have others deployed
00:36:29.940 | for us to kind of like weed people out?
00:36:32.520 | Is it that we're willing to accept some false negatives?
00:36:35.660 | Meaning for those,
00:36:38.220 | I guess we're using a little bit
00:36:39.120 | of a semi-technical language here,
00:36:40.600 | false negatives would be,
00:36:42.140 | you're trying to assess a group of people
00:36:43.580 | that would be terrific employees.
00:36:45.880 | And you send in somebody, interview them,
00:36:48.360 | that's very cynical.
00:36:50.480 | So presumably in one's mind,
00:36:52.680 | that filter of cynicism is only going to allow in people
00:36:55.520 | that are really, really right for the job.
00:36:58.040 | And we're willing to accept that,
00:36:59.640 | you know, there are probably two or three candidates
00:37:02.060 | that would also be right for the job,
00:37:03.380 | but we're willing to let them go.
00:37:04.580 | Some false negatives,
00:37:06.560 | as opposed to having someone get through the filter
00:37:11.380 | who really can't do the job.
00:37:12.780 | Like we're willing to let certain opportunities go
00:37:15.100 | by being cynical or by deploying a cynic as the,
00:37:18.420 | you know, I'm imagining the person with the clipboard,
00:37:20.120 | you know, very rigid.
00:37:21.900 | - Yeah. - Like cynicism
00:37:22.740 | and rigidity seem to go together.
00:37:24.180 | So that's why I'm lumping
00:37:25.340 | these kinds of psychological phenotypes.
00:37:27.500 | - No, I think that's absolutely right.
00:37:29.180 | And so a couple of things, one, you know,
00:37:32.180 | you said that if we know that cynics aren't smarter
00:37:34.980 | than non-cynics, why are we deploying them?
00:37:36.900 | Well, let's be clear, we know this,
00:37:38.660 | meaning you and I know this and scientists know this,
00:37:41.660 | but the data show that most people don't know this,
00:37:44.580 | that we maintain the stereotype in our culture
00:37:47.860 | that being negative about people
00:37:50.300 | means that you've been around the block enough times,
00:37:52.780 | that it is a form of wisdom.
00:37:54.040 | So that's a stereotype that I think we need
00:37:56.220 | to dispel first of all.
00:37:58.100 | But I do think that to your point,
00:38:00.580 | when we deploy cynics out in the field, you know,
00:38:03.380 | when we say, I'm gonna be nice,
00:38:05.020 | but I want somebody who's really pretty negative,
00:38:07.820 | who's really pretty suspicious to protect me
00:38:10.540 | or to protect my community,
00:38:12.380 | I think that's a really, again, understandable instinct,
00:38:15.780 | almost from an evolutionary perspective.
00:38:18.460 | You know, we are built to pay lots of attention
00:38:21.780 | to threats in our environment and threats to our community.
00:38:25.300 | And in the early social world, you know,
00:38:27.740 | if you wind, I mean,
00:38:28.660 | just to do some back of the envelope evolutionary psychology,
00:38:32.100 | if you wind the clock back 100, 150,000 years,
00:38:36.700 | what's, you know,
00:38:37.780 | what is the greatest threat to early communities?
00:38:39.860 | It's people, right?
00:38:41.660 | It's people who would take advantage
00:38:44.060 | of our communal nature, right?
00:38:45.820 | The thing that allows human beings to thrive
00:38:48.620 | is that we collaborate.
00:38:51.000 | But that collaboration means that a free rider,
00:38:54.460 | somebody who chooses to not pitch in,
00:38:56.960 | but still take out from the common pool,
00:38:59.540 | anything that they want, can do exceptionally well.
00:39:02.660 | They can live a life of leisure on the backs
00:39:05.500 | of a community that's working hard.
00:39:07.100 | And if you select then for that type of person,
00:39:10.680 | if that type of person proliferates,
00:39:12.260 | then the community collapses.
00:39:14.420 | So it makes sense that we depend on cynics
00:39:18.180 | from that perspective, from a threat mitigation perspective,
00:39:22.100 | from a risk aversion perspective.
00:39:24.900 | But it doesn't make sense
00:39:26.680 | from the perspective of trying to optimize
00:39:28.600 | our actual social lives, right?
00:39:30.600 | And I think that oftentimes, you know,
00:39:33.120 | we are risk averse in general,
00:39:34.360 | meaning that we're more scared of negative outcomes
00:39:36.820 | than we are enticed by positive outcomes.
00:39:40.320 | But in the social world,
00:39:41.520 | that risk aversion is, I think,
00:39:44.520 | quite harmful in a lot of demonstrable ways.
00:39:46.860 | - Is cynicism domain specific?
00:39:50.900 | And there again, I'm using jargon,
00:39:53.000 | meaning if somebody is cynical in one environment,
00:39:57.180 | like cynical about the markets,
00:39:58.540 | like, well, things are up now,
00:39:59.900 | but, you know, have an election come,
00:40:03.040 | so things can go this way or that way, depending on,
00:40:04.780 | you know, do they tend to be cynical
00:40:07.060 | about other aspects of life, other domains?
00:40:10.140 | - So there's a little bit of data on this,
00:40:13.360 | and it suggests a couple of things.
00:40:15.380 | One, left to our own devices,
00:40:18.180 | our levels of cynicism tend to be pretty stable over time.
00:40:22.820 | And also decline in older adulthood,
00:40:25.640 | contra the stereotype of the curmudgeonly older person.
00:40:29.400 | But another is that cynicism
00:40:31.320 | does tend to be pretty domain general.
00:40:33.760 | So for instance, cynics, you know,
00:40:36.840 | and this makes sense if you look at questionnaires
00:40:39.160 | that assess cynicism, which are things like,
00:40:41.800 | people are honest chiefly through fear of getting caught,
00:40:45.380 | or most people really don't like helping each other.
00:40:48.360 | I mean, if you're answering those questions positively,
00:40:51.120 | you're just not a fan of,
00:40:52.260 | you're probably not great at parties,
00:40:53.980 | you're not a fan of people.
00:40:55.820 | And it turns out that people who answer the,
00:40:57.940 | this is an old scale developed by a couple of psychologists
00:41:01.340 | named Walter Cook and Donald Medley in the 1950s.
00:41:04.540 | If you answer the Cook-Medley hostility scale,
00:41:07.940 | if you answer these questions positively,
00:41:09.900 | you tend to be less trusting of strangers,
00:41:12.580 | but you also tend to, for instance,
00:41:14.560 | have less trust in your romantic partnerships,
00:41:17.940 | you have less trust in your friends,
00:41:20.220 | and you have less trust in your colleagues.
00:41:21.940 | So this is sort of an all-purpose view of the world,
00:41:25.380 | at least as Cook and Medley first thought about it.
00:41:30.180 | But I do wanna build on a great intuition you have,
00:41:32.420 | which is that different environments
00:41:35.200 | might bring out cynicism or tamp it down.
00:41:38.860 | And it turns out that that's also very true.
00:41:40.880 | As trait-like as cynicism can be,
00:41:43.480 | there's lots of evidence that the type of
00:41:46.140 | social environment we're in matters a lot.
00:41:48.940 | One of my favorite studies in this domain
00:41:52.420 | came from Southeastern Brazil.
00:41:55.760 | There are two fishing villages in Southeastern Brazil.
00:41:58.260 | They're separated by about 30, 40 miles.
00:42:00.900 | They're similar in socioeconomic status, religion, culture,
00:42:04.780 | but there's one big difference between them.
00:42:07.520 | One of the villages sits on the ocean,
00:42:09.980 | and in order to fish on the ocean,
00:42:11.840 | you need big boats, heavy equipment.
00:42:14.140 | You can't do it alone.
00:42:15.420 | You must work together.
00:42:17.660 | The other village is on a lake,
00:42:19.540 | where fishermen strike out on small boats alone,
00:42:23.060 | and they compete with one another.
00:42:25.520 | About 10 years ago, economists,
00:42:27.580 | this was a study led by Andreas Leibbrandt,
00:42:29.460 | a really great economist.
00:42:31.000 | They went to these villages,
00:42:32.780 | and they gave the folks who worked there
00:42:35.800 | a bunch of social games to play.
00:42:38.460 | These were not with fellow fishermen, but with strangers.
00:42:41.500 | Games like, would you trust somebody with some money
00:42:44.420 | and see if they then want to share dividends with you?
00:42:47.380 | Or give in some money yourself,
00:42:49.420 | would you like to share some of it with another person?
00:42:52.860 | And they found that when they start in their careers,
00:42:56.260 | lake fishermen and ocean fishermen were equally trusting
00:43:00.300 | and equally trustworthy as well.
00:43:03.380 | But over the course of their careers, they diverged.
00:43:06.580 | Being in a collaborative environment
00:43:08.820 | where people must count on one another to survive
00:43:12.340 | made people over time more trusting and more trustworthy.
00:43:17.020 | Being in a competitive zero-sum environment over time
00:43:20.660 | made people less trusting and less trustworthy.
00:43:23.900 | Now, one thing that always amazes me about this work
00:43:26.220 | is that people in both of these environments are right.
00:43:30.620 | If you're in a competitive environment,
00:43:32.700 | you don't trust and you're right to not trust.
00:43:35.940 | If you're in a collaborative environment,
00:43:37.700 | you do trust and you're right to trust.
00:43:40.220 | And this is from the point of view of economic games,
00:43:42.780 | and I think much more broadly construed as well.
00:43:45.440 | So one question then becomes,
00:43:47.140 | well, which of these environments do we want to be in?
00:43:49.820 | I think the cost in terms of wellbeing and relationships
00:43:53.000 | is quite obvious if you're in a competitive environment.
00:43:55.100 | And then the second question of course is,
00:43:57.500 | how do we put ourselves in the type of environment
00:44:00.540 | that we want knowing that that environment will change
00:44:03.180 | who we are over the course of our lives?
00:44:05.980 | - So much of schooling in this country
00:44:08.820 | is based on at first cooperation,
00:44:11.500 | like we're all gonna sit around and listen to a story
00:44:13.540 | and then we're gonna work in small groups.
00:44:15.580 | But in my experience over time,
00:44:18.040 | it evolves into more independent learning and competition.
00:44:21.180 | They post the distribution of scores.
00:44:23.200 | That's largely the distribution of individual scores.
00:44:26.820 | There are exceptions to this, of course.
00:45:28.300 | Like, I've never been to business school,
00:44:30.100 | but I think they form small groups and work on projects.
00:44:32.300 | It's true in computer science at the undergraduate level
00:44:35.340 | and so on.
00:44:36.180 | But to what extent do you think having a mixture
00:44:40.220 | of cooperative learning, still competition perhaps
00:44:44.500 | between groups, as well as individual learning
00:44:47.940 | and competition can foster kind of an erosion of cynicism?
00:44:52.940 | Because it sounds like being cynical is,
00:44:55.900 | I don't wanna be hard on the cynics here,
00:44:57.820 | but they're probably already hard on themselves
00:45:00.100 | and everybody else.
00:45:01.300 | We know they're hard on everybody else.
00:45:02.700 | But, oh, there was my presumption.
00:45:05.140 | Okay, I'm gonna stay open-minded.
00:45:06.820 | Maybe they're not, you'll tell me.
00:45:09.020 | That they are on average less intelligent
00:45:13.260 | is what I'm hearing.
00:45:14.300 | And that there's something really big to be gained
00:45:18.420 | from anybody who decides to embrace novel ideas,
00:45:21.980 | even if they decide to stick with their original decision
00:45:25.020 | about others or something.
00:45:27.220 | Provided they explore the data in an open-minded way,
00:45:30.700 | even transiently, it sounds like
00:45:32.260 | there's an opportunity there.
00:45:33.660 | You gave a long-term example of these two fishing scenarios.
00:45:38.300 | So the neuroplasticity takes years,
00:45:40.980 | but we know neuroplasticity can be pretty quick.
00:45:42.980 | I would imagine if you expose a cynic
00:45:44.820 | to a counterexample to their belief
00:45:48.260 | that it's not gonna erode all of their cynicism,
00:45:50.700 | but it might make a little dent
00:45:51.940 | in that neural circuit for cynicism.
00:45:54.340 | - Yeah, this is a great perspective.
00:45:56.460 | And a couple of things I wanna be clear on.
00:45:59.180 | One, I am not here to judge or impugn cynics.
00:46:03.300 | I should confess that I myself struggle with cynicism
00:46:07.020 | and have for my entire life.
00:46:08.980 | Part of my journey to learn more about it
00:46:12.180 | and even to write this book
00:46:13.580 | was an attempt to understand myself
00:46:15.940 | and to see if it is possible to unlearn cynicism
00:46:20.940 | because frankly, I wanted to.
00:46:22.940 | So you will get no judgment from me
00:46:25.580 | of people who feel like it's hard to trust.
00:46:28.220 | I think that another point that you're bringing out
00:46:33.260 | that I wanna cosign is that saying
00:46:35.860 | that competition over the long term,
00:46:38.820 | zero-sum competition can erode our trust
00:46:41.060 | isn't the same as saying that we should never compete.
00:46:43.540 | Competition is beautiful.
00:46:44.860 | I mean, the Olympics are going on right now
00:46:47.660 | and it's amazing to see what people do
00:46:50.420 | when they are at odds trying to best one another.
00:46:53.820 | Incredible feats are accomplished
00:46:57.300 | when we focus on the great things that we can do.
00:47:00.540 | And oftentimes we are driven to greatness
00:47:03.340 | by people we respect who are trying to be greater than us.
00:47:07.300 | So absolutely competition can be part
00:47:09.860 | of a very healthy social structure and a very healthy life.
00:47:14.460 | I think that the broader question
00:47:16.840 | is whether we construe that competition
00:47:18.820 | at the level of a task or at the level of the person.
00:47:23.820 | In fact, there's a lot of work in the science of conflict
00:47:28.040 | and conflict resolution that looks at the difference
00:47:30.740 | between task conflict and personal conflict.
00:47:34.820 | You can imagine in a workplace,
00:47:36.780 | two people have different ideas
00:47:38.340 | for what direction they wanna take a project in.
00:47:41.780 | Well, that's great if it leads to healthy debate
00:47:45.460 | and if that is mutually respectful.
00:47:48.540 | But the minute that that turns into
00:47:50.780 | blanket judgments about the other person,
00:47:52.820 | oh, the reason that they want this direction
00:47:54.660 | is because they're not so bright
00:47:56.940 | or because they don't have vision
00:47:59.340 | or because they're trying to gain favor.
00:48:01.140 | That's when we go from healthy skeptical conflict
00:48:05.740 | into cynical and destructive conflict.
00:48:09.500 | And you see this with athletes as well.
00:48:11.860 | Athletes often are very good friends
00:48:14.540 | and some of the people that they respect the most
00:48:16.980 | are the folks who they're battling.
00:48:19.260 | In the case of contact sports and boxing,
00:48:23.540 | literally battling, but they can have immense
00:48:26.500 | and positive regard for one another
00:48:28.700 | outside of the ring in those contexts.
00:48:31.340 | So I think that there's a huge difference
00:48:33.480 | between competition that's oriented on tasks,
00:48:37.020 | which can help us be the best version of ourselves
00:48:39.780 | and competition that bleeds into judgment,
00:48:44.020 | suspicion and mistrust.
00:48:46.340 | - I'd like to take us back just briefly
00:48:48.620 | to these developmental stages.
00:48:51.020 | Maybe I'm bridging two things that don't belong together,
00:48:54.580 | but I'm thinking about the young brain,
00:48:56.840 | which of course is hyperplastic
00:48:58.980 | and comparing that to the older brain.
00:49:01.160 | But the young brain learns a number of things
00:49:04.820 | while it does a number of things.
00:49:05.960 | It handles heart rate, digestion, et cetera, unconsciously.
00:49:09.180 | And then in many ways, the neuroplasticity
00:49:12.300 | that occurs early in life
00:49:13.380 | is to establish these maps of prediction.
00:49:16.620 | If things fall down, not up in general,
00:49:20.140 | things fall down, not up and so on.
00:49:23.920 | So that mental real estate can be used
00:49:26.140 | for other things and learning new things.
00:49:28.040 | So I'm thinking about the sort of classic example
00:49:31.500 | of object permanence.
00:49:33.100 | You show a baby a block or a toy,
00:49:36.780 | and then you hide that toy.
00:49:38.940 | And they, at a certain age, a very young age,
00:49:41.260 | will look as if it's gone.
00:49:43.220 | And then you bring it back and then they're amazed.
00:49:45.340 | And then at some point along their developmental trajectory,
00:49:47.820 | they learn object permanence.
00:49:49.500 | They know that it's behind your back, okay?
00:49:51.900 | And then we hear that characters like Santa Claus are real,
00:49:56.040 | and then eventually we learn that they're not,
00:49:58.640 | and so on and so on.
00:49:59.700 | In many ways, we go from being completely non-cynical
00:50:06.000 | about the physical world to being,
00:50:09.580 | one could sort of view it as cynical
00:50:12.540 | about the physical world, right?
00:50:14.000 | Like I love to see magic.
00:50:16.900 | In fact, we had probably the world's best
00:50:19.700 | or among the very best magicians on this podcast,
00:50:23.500 | Asi Wind, he's a mentalist and magician.
00:50:25.820 | And to see him do magic,
00:50:27.580 | even as an adult who understands
00:50:29.500 | that the laws of physics apply,
00:50:31.180 | they seem to defy the laws of physics in real time.
00:50:35.180 | And it just blows your mind to the point where you like,
00:50:38.940 | that can't be, but you sort of want it to be.
00:50:41.220 | And at some point you just go, you know what?
00:50:42.820 | It's what we call magic.
00:50:45.020 | So it seems to me that cynics apply almost physics-like
00:50:49.780 | rules to social interaction.
00:50:52.160 | Like that they talk in terms of like first principles
00:50:56.220 | of human interactions, right?
00:50:58.260 | They talk about this group always this,
00:51:00.980 | and that group always that, right?
00:51:02.700 | These like strict categories,
00:51:05.060 | thick black lines between categories,
00:51:07.860 | as opposed to any kind of blending of understanding
00:51:11.780 | or a blending of rules.
00:51:13.780 | And one can see how that would be a really useful heuristic,
00:51:16.180 | but as we're learning, it's not good in the sense
00:51:19.540 | that we don't want to judge,
00:51:20.360 | but it's not good if our goal is to learn more
00:51:22.900 | about the world or learn the most information
00:51:24.900 | about the world.
00:51:25.720 | Can we say that?
00:51:26.560 | - Yes, and I appreciate you saying, yeah.
00:51:29.020 | I also try to avoid good, bad language or moral judgment,
00:51:33.600 | but I think that many of us have the goals
00:51:35.780 | of having strong relationships
00:51:37.800 | and of flourishing psychologically
00:51:40.900 | and of learning accurately about the world.
00:51:43.220 | And if those are your goals,
00:51:44.460 | I think it's fair to say
00:51:45.900 | that cynicism can block your way towards them.
00:51:49.200 | I love this.
00:51:50.180 | I've never thought about it in this way,
00:51:51.940 | but I love that perspective.
00:51:53.540 | And there is almost a philosophical certainty.
00:51:58.360 | Maybe it's not a happy philosophical certainty,
00:52:01.540 | but we love to, right?
00:52:03.620 | Human beings love explanatory power.
00:52:06.580 | We love to be able to have laws
00:52:09.700 | that determine what will happen.
00:52:11.740 | And the laws of physics are some of our most reliable,
00:52:14.860 | right?
00:52:15.700 | And really we all use theories to predict the world, right?
00:52:19.220 | I mean, we all have a theory of gravity
00:52:20.980 | that lives inside our head.
00:52:22.140 | We don't think, "objects with mass attract one another,"
00:52:24.880 | but we know if we drop a bowling ball on our foot,
00:52:27.460 | we're probably not gonna walk
00:52:28.980 | for the next week at least, right?
00:52:30.580 | So we use theories to provide explanatory simplicity
00:52:35.580 | to a vast and overwhelmingly complex world.
00:52:40.300 | And absolutely, I think cynicism has a great function
00:52:45.260 | in simplifying, but of course in simplifying,
00:52:48.660 | we lose a lot of the detail.
00:52:50.940 | We lose a lot of the wonder
00:52:53.300 | that maybe we experienced earlier in life.
00:52:56.900 | And I do wanna, your beautiful description of kids
00:53:01.820 | and their sort of sense of, I suppose, perennial surprise
00:53:07.180 | makes me think about another aspect
00:53:09.820 | of what we lose to cynicism,
00:53:12.120 | which is the ability to witness the beauty of human action
00:53:17.120 | and human kindness.
00:53:19.240 | My friend, Dacher Keltner, studies awe,
00:53:22.820 | this emotion of experiencing something vast
00:53:27.180 | and also experiencing ourselves as small
00:53:31.220 | and a part of that vastness.
00:53:33.540 | And he wrote a great book on awe.
00:53:36.420 | And in it, he talks about his research
00:53:38.420 | where he cataloged what are the experiences
00:53:42.500 | that most commonly produce awe in a large sample,
00:53:46.980 | large representative sample of people.
00:53:49.180 | Now, I don't know about you, Andrew,
00:53:50.940 | but when I think about awe,
00:53:52.500 | my first go-to is Carl Sagan's "Pale Blue Dot,"
00:53:57.420 | this image of a kind of nebula band
00:53:59.900 | or sort of cluster basically, stardust really.
00:54:04.900 | And there's one dot in it with an arrow,
00:54:06.540 | and Carl Sagan says, "That dot is Earth,
00:54:09.580 | "and every king and tyrant and mother and father,
00:54:13.700 | "every person who's ever fallen in love,
00:54:16.820 | "and every person who's ever had their heart broken,
00:54:18.480 | "they're all on that tiny dot there."
00:54:20.300 | I go to that, I show that to my kids all the time.
00:54:23.460 | When I think of awe, I think of outer space.
00:54:25.860 | I think of groves of redwood trees.
00:54:27.860 | I think of drone footage of the Himalayas, right?
00:54:31.140 | But Dacher finds that if you ask people
00:54:33.580 | what they experience awe in response to,
00:54:36.340 | the number one category is what he calls moral beauty,
00:54:40.900 | everyday acts of kindness, giving,
00:54:46.060 | compassion, and connection.
00:54:48.700 | This is also related to what Dacher and Jon Haidt
00:54:51.500 | talk about in terms of moral elevation,
00:54:53.700 | witnessing positive actions that actually make us feel
00:54:57.820 | like we're capable of more.
00:55:00.100 | And moral beauty is everywhere.
00:55:03.140 | If you are open to it, it is the most common thing
00:55:05.500 | that will make you feel the vastness of our species.
00:55:09.820 | And to have a lawful, physics-like prediction
00:55:14.320 | about the world that blinkers you from seeing that,
00:55:17.740 | that gives you tunnel vision and prevents you
00:55:21.180 | from experiencing moral beauty
00:55:23.620 | seems like a tragic form of simplicity.
00:55:26.220 | - I'd like to take a quick break
00:55:27.540 | and thank one of our sponsors, Function.
00:55:30.260 | I recently became a Function member
00:55:31.900 | after searching for the most comprehensive approach
00:55:34.180 | to lab testing.
00:55:35.180 | While I've long been a fan of blood testing,
00:55:37.100 | I really wanted to find a more in-depth program
00:55:39.080 | for analyzing blood, urine, and saliva
00:55:41.700 | to get a full picture of my heart health,
00:55:43.580 | my hormone status, my immune system regulation,
00:55:46.380 | my metabolic function, my vitamin and mineral status,
00:55:49.640 | and other critical areas of my overall health and vitality.
00:55:53.060 | Function not only provides testing of over 100 biomarkers
00:55:56.260 | key to physical and mental health,
00:55:57.900 | but it also analyzes these results
00:55:59.780 | and provides insights from top doctors on your results.
00:56:03.340 | For example, in one of my first tests with Function,
00:56:06.180 | I learned that I had too-high levels of mercury in my blood.
00:56:09.300 | This was totally surprising to me.
00:56:10.620 | I had no idea prior to taking the test.
00:56:13.420 | Function not only helped me detect this,
00:56:15.280 | but offered medical doctor-informed insights
00:56:17.780 | on how to best reduce those mercury levels,
00:56:20.460 | which included limiting my tuna consumption,
00:56:22.780 | because I had been eating a lot of tuna,
00:56:24.540 | while also making an effort to eat more leafy greens
00:56:26.880 | and supplementing with NAC, N-acetylcysteine,
00:56:29.780 | both of which can support glutathione production
00:56:31.860 | and detoxification and worked to reduce my mercury levels.
00:56:35.600 | Comprehensive lab testing like this
00:56:37.180 | is so important for health.
00:56:38.580 | And while I've been doing it for years,
00:56:40.380 | I've always found it to be overly complicated and expensive.
00:56:43.300 | I've been so impressed by Function,
00:56:44.780 | both at the level of ease of use,
00:56:46.680 | that is getting the tests done,
00:56:48.220 | as well as how comprehensive and how actionable
00:56:51.100 | the tests are, that I recently joined their advisory board,
00:56:54.520 | and I'm thrilled that they're sponsoring the podcast.
00:56:56.760 | If you'd like to try Function,
00:56:58.080 | go to functionhealth.com/huberman.
00:57:01.380 | Function currently has a wait list of over 250,000 people,
00:57:04.980 | but they're offering early access to Huberman Lab listeners.
00:57:07.800 | Again, that's functionhealth.com/huberman
00:57:11.060 | to get early access to Function.
00:57:13.680 | - I love that your examples of both pale blue dot
00:57:17.060 | and everyday compassion bridge the two,
00:57:21.600 | what I think of as time domains that the,
00:57:24.780 | or I should say space-time domains
00:57:26.560 | that the brain can encompass.
00:57:28.780 | You know, this has long fascinated me about the human brain
00:57:31.720 | and presumably other animals' brains as well,
00:57:34.240 | which is that, you know, we can sharpen our aperture
00:57:39.120 | to, you know, something so, so small
00:57:41.860 | and pay attention to just like the immense beauty.
00:57:44.380 | And, you know, like I have a lot of ants
00:57:45.980 | in my yard right now,
00:57:46.860 | and lately I've been watching them interact
00:57:48.260 | 'cause they were driving me crazy.
00:57:49.740 | They were just like, you know,
00:57:50.760 | they're like everywhere this summer
00:57:52.300 | and they're climbing on me.
00:57:53.360 | And I thought, I'm just gonna like watch what they do.
00:57:55.080 | And clearly there's a structure there.
00:57:57.260 | I know Debra Gordon at Stanford
00:57:59.140 | has studied ant behavior in others.
00:58:00.900 | And it's like, there's a lot going on there,
00:58:02.540 | but then you look up from there and you're like,
00:58:03.700 | wow, there's a big yard.
00:58:04.780 | And then the sense of awe for me is that
00:58:08.020 | interactions like that must be going on everywhere
00:58:11.740 | in this yard.
00:58:13.420 | And, you know, it frames up that the aperture
00:58:16.380 | of our cognition in space and in time, you know,
00:58:19.020 | covering small distances quickly or small distances slowly.
00:58:22.400 | And then we can zoom out literally
00:58:24.580 | and think about us on this ball in space, right?
00:58:29.580 | You know, and that ability I think is incredible.
00:58:33.460 | And that awe can be captured at these different extremes
00:58:36.820 | of space, time, cognition.
00:58:39.900 | Amazing.
00:58:41.100 | It seems to me that what you're saying
00:58:42.280 | is that cynicism and awe are also opposite ends
00:58:44.520 | of the continuum.
00:58:45.340 | And that's taking us in a direction slightly different
00:58:48.180 | than I was going to try and take us.
00:58:50.300 | But I love that we're talking about awe
00:58:52.300 | because to me it feels like it's a more extreme example
00:58:55.660 | of delight.
00:58:56.500 | And I'd like you to perhaps,
00:59:00.500 | if there's any examples of research on this, you know,
00:59:04.620 | touch on to what extent a sense of cynicism divorces us
00:59:08.460 | from delight and awe, or I guess their collaborator,
00:59:13.460 | which is creativity.
00:59:16.140 | To me, everything you're saying about cynicism
00:59:18.860 | makes it sound anti-creative because you're,
00:59:22.020 | by definition, you're eliminating possibility.
00:59:24.580 | And creativity, of course, is the unique original combination
00:59:28.660 | of existing things or the creation of new things
00:59:30.580 | altogether.
00:59:32.700 | So what, if anything, has been studied
00:59:34.780 | about the relationship between cynicism,
00:59:38.060 | I guess we call it open-mindedness,
00:59:40.460 | and creativity and/or awe?
00:59:43.140 | - Yeah, great questions.
00:59:44.700 | And there is some work on this,
00:59:48.140 | and a lot of it comes actually in the context
00:59:50.220 | of the workplace, right?
00:59:52.000 | So you can examine, I mean,
00:59:53.780 | these Brazilian fishing villages
00:59:55.660 | were, after all, workplaces, right?
00:59:57.740 | That led people to more or less cynicism.
01:00:01.020 | But other workplaces also have structures
01:00:03.960 | that make people more or less able to trust one another.
01:00:07.140 | One version of this is what's known as stack ranking.
01:00:09.900 | And, you know, this is where people,
01:00:12.780 | managers are forced to pick the highest-performing
01:00:16.460 | and lowest-performing members of their team
01:00:18.740 | and, in essence, eliminate the people
01:00:21.380 | who are at the bottom 10% every six or 12 months.
01:00:25.160 | Stack ranking has, thankfully,
01:00:26.900 | mostly fallen out of favor in the corporate world,
01:00:30.340 | but it was very de rigueur in the late 20th
01:00:34.860 | and early 21st century, you know,
01:00:36.900 | up until 10 or so years ago.
01:00:38.740 | And it still exists in some places.
01:00:41.100 | And the idea, again, was if you want people to be creative,
01:00:45.820 | if you want them to do their best,
01:00:48.260 | tap into who they really are.
01:00:50.100 | And who are we really?
01:00:51.420 | We are really a hyper-individualistic,
01:00:55.020 | again, Darwinian species.
01:00:57.100 | Really, stack ranking is a social Darwinist approach
01:01:00.860 | to management.
01:01:02.100 | And the idea is, well, great, if you threaten people,
01:01:05.180 | if you make them want to defeat one another,
01:01:07.820 | they will be at their most creative
01:01:09.700 | when they are trying to do that, right?
01:01:12.340 | That it will bring out their best.
01:01:14.660 | The opposite is true.
01:01:15.620 | I mean, stack-ranked workplaces, of course, are miserable.
01:01:19.400 | The people in them are quite unhappy
01:01:22.460 | and more likely to leave their jobs.
01:01:25.060 | But some of the more interesting work
01:01:26.700 | pertains to what stack ranking does to creativity.
01:01:29.680 | Because it turns out that if your job
01:01:33.940 | is to just not be at the bottom of the pile,
01:01:36.600 | then the last thing you want to do is take a creative risk.
01:01:42.140 | You do not want to go out on a limb.
01:01:43.940 | You do not want to try something new.
01:01:46.580 | If other people are going to go after you for doing that,
01:01:50.640 | and if you screw up, or if it doesn't go well,
01:01:54.400 | you're eliminated from the group, right?
01:01:57.120 | So I think you're exactly right
01:01:59.080 | that these cynical environments
01:02:01.040 | are also highly conservative.
01:02:02.880 | I, of course, don't mean politically conservative.
01:02:04.760 | I mean conservative in terms of the types of choices
01:02:08.080 | that people make.
01:02:09.480 | And that's sort of, I think,
01:02:10.440 | at the level of individual creativity.
01:02:13.560 | But there's also a cost at the level
01:02:15.620 | of what we might call group creativity, right?
01:02:19.120 | A lot of our best ideas come not from our minds,
01:02:23.440 | but from the space between us,
01:02:25.280 | from dialogue, or from group conversation.
01:02:29.320 | And it turns out that in stack-ranked,
01:02:31.680 | zero-sum environments,
01:02:33.320 | people are less willing to share knowledge and perspective
01:02:36.840 | because doing so amounts to helping your enemy succeed,
01:02:40.840 | which is the same as helping yourself fail.
01:02:43.300 | So to the extent that creativity requires
01:02:46.680 | a sort of collaborative mindset,
01:02:49.000 | then cynicism is preventative of that.
01:02:53.840 | And there's actually some terrific work
01:02:55.960 | by Anita Woolley and colleagues
01:03:00.520 | that looks at group intelligence, collective intelligence.
01:03:04.060 | This is the idea that, of course,
01:03:06.200 | people have levels of intelligence
01:03:08.120 | that can be measured in various ways,
01:03:10.120 | and have various forms of intelligence as well.
01:03:12.720 | But groups, when they get together,
01:03:15.160 | have a type of intelligence,
01:03:17.340 | and especially creative problem-solving intelligence,
01:03:20.120 | that goes above and beyond the sum of their parts,
01:03:23.120 | that can't be explained,
01:03:24.800 | and actually in some cases is almost orthogonal
01:03:27.980 | to the intelligence of the individuals in that group, right?
01:03:31.340 | Controlling for the intelligence of individuals,
01:03:33.920 | there's a group factor that still matters.
01:03:36.400 | And so Anita Woolley and others have looked at,
01:03:38.640 | well, what predicts that type of collective intelligence?
01:03:43.080 | And a couple of factors matter.
01:03:45.520 | One is people's ability
01:03:49.000 | to understand each other's emotions,
01:03:51.440 | so interpersonal sensitivity.
01:03:53.480 | But another is their willingness to, in essence,
01:03:55.800 | pass the mic, to share the conversation, and to collaborate.
01:04:00.500 | And so again, succeeding, thriving,
01:04:03.560 | optimizing, and being creative,
01:04:06.180 | both at the individual and at the group level,
01:04:09.520 | require environments where we feel free,
01:04:12.280 | and where we feel safe,
01:04:14.240 | and where we feel that contributing to somebody else
01:04:17.160 | can also contribute to ourselves.
01:04:18.900 | - It's so interesting to think about
01:04:22.880 | all of this in the context of neuroplasticity.
01:04:25.320 | I feel like one of the holy grails of neuroscience
01:04:28.320 | is to finally understand,
01:04:29.880 | you know, what are the gates to neuroplasticity?
01:04:31.840 | We understand a lot about the cellular mechanisms.
01:04:33.760 | We know it's possible throughout the lifespan.
01:04:35.440 | We know that there's surely an involvement
01:04:38.160 | of different neuromodulators and so on,
01:04:40.560 | but at the level of kind of human behavior
01:04:44.760 | and emotional stance, not technical, not a technical term,
01:04:49.320 | but I'll use it, of say, being curious.
01:04:53.000 | Like to me, curiosity is an interest in the outcome
01:04:56.760 | with no specific emotional attachment to the outcome.
01:05:00.160 | But of course, we could say you're curious
01:05:02.280 | with the hope of getting a certain result, you know,
01:05:04.200 | so one could modify it.
01:05:05.320 | But there is something about that childlike mind,
01:05:10.000 | so-called beginner's mind,
01:05:11.200 | where you're open to different outcomes.
01:05:12.920 | And it seems like the examples that you're giving
01:05:15.160 | keep bringing me back to these developmental themes
01:05:17.680 | because if it's true that cynics, you know,
01:05:21.400 | exclude a lot of data that could be useful to them,
01:05:24.220 | it seems that the opportunities for neuroplasticity
01:05:28.800 | are reduced for cynics.
01:05:31.360 | To flip it on its head,
01:05:32.680 | to what extent are we all a little bit cynical?
01:05:38.440 | And how would we explore that?
01:05:40.060 | Like if I were in your laboratory
01:05:41.780 | and you had 10 minutes with me,
01:05:44.280 | what questions would you ask me
01:05:46.280 | to determine how cynical I might be
01:05:48.600 | or how not cynical I might be?
01:05:51.660 | - Well, the first thing that I would do
01:05:52.680 | is give you that classic questionnaire from Cook and Medley,
01:05:55.920 | which would just ask you about your theories of the world.
01:05:58.720 | What do you think people are like?
01:06:00.360 | Do you think that people are generally honest?
01:06:02.860 | Do you think that they are generally trustworthy?
01:06:05.160 | - So does it load the questions, or is it open-ended,
01:06:07.440 | where you would say, what are people like?
01:06:09.720 | And then I would just kind of free-associate about that?
01:06:11.600 | - No, it's a series of 50 statements.
01:06:14.240 | And you're asked in a binary way,
01:06:16.040 | do you agree or disagree with each of these statements?
01:06:18.840 | Since then, Olga Stavrova and others
01:06:20.760 | have adapted Cook-Medley and made it a shorter scale
01:06:23.960 | and turned the questions into continuous
01:06:26.240 | one to nine or one to seven answers.
01:06:29.440 | But generally speaking, these are discrete questions
01:06:33.120 | that numerically or quantitatively
01:06:36.000 | tap our general theories of people.
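For readers who want to see the structure of such a scale concretely, here is a minimal sketch, in Python, of how a short, Cook-Medley-style cynicism measure might be scored. The two statements are the ones quoted above; the 1-to-7 response format mirrors the continuous adaptations mentioned, and the averaging rule is only an illustrative placeholder, not the published instrument or its scoring key.

```python
# Illustrative sketch of scoring a short, Cook-Medley-style cynicism measure.
# The two items are the statements quoted in the conversation; the 1-7 format
# and simple averaging are placeholders, not the published scale or its scoring.

ITEMS = [
    "People are honest chiefly through fear of getting caught.",
    "Most people really don't like helping each other.",
]

def cynicism_score(responses: list[int]) -> float:
    """Average agreement across items, each rated 1 (disagree) to 7 (agree)."""
    if len(responses) != len(ITEMS):
        raise ValueError("one response per item is required")
    if any(not 1 <= r <= 7 for r in responses):
        raise ValueError("responses must be on the 1-7 scale")
    return sum(responses) / len(responses)

# Example: strong agreement with the cynical statements yields a high score.
print(cynicism_score([6, 7]))  # 6.5 on a 1-7 scale
```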
01:06:39.700 | If you were in my lab, I might also ask you
01:06:41.960 | to play some different economic games.
01:06:43.880 | You know, the trust game being the number one
01:06:46.840 | that we might use here.
01:06:47.980 | So I can explain it.
01:06:50.200 | So the trust game involves two players
01:06:52.640 | and one of them is an investor.
01:06:55.840 | They start out with some amounts of money,
01:06:57.840 | let's just say $10.
01:06:59.380 | They can send as much of that money
01:07:01.240 | as they want to a trustee.
01:07:03.440 | The money is then tripled in value.
01:07:06.240 | So if the investor sends $10,
01:07:09.240 | in the hands of the trustee, it becomes $30.
01:07:12.860 | The trustee can then choose to give back
01:07:15.240 | whatever amount they want to the investor.
01:07:17.960 | So they can be exactly fair and give 15 back,
01:07:21.720 | in which case both people end up pretty much better off
01:07:24.800 | than they would have without an act of trust.
01:07:27.720 | The trustee can keep all $30 themselves,
01:07:31.020 | betraying the investor,
01:07:33.200 | or the trustee can give more than 50% back.
01:07:35.460 | They can say, well, I started out with nothing,
01:07:37.020 | why don't you take 2/3 back?
01:07:39.260 | And this is one terrific behavioral measure of trust
01:07:42.980 | and it can be played in a couple of different ways.
01:07:44.640 | One is binary, where I would say,
01:07:47.700 | Andrew, you can send $10 to an internet stranger
01:07:51.940 | or you can send nothing
01:07:53.420 | and they can choose to send you back half
01:07:55.340 | or they can choose to send you back nothing.
01:07:58.200 | Would you do it?
01:07:59.040 | Actually, I'm curious, would you do that?
01:08:01.480 | - Oh, I absolutely zip it over to them.
01:08:03.680 | Yeah, I'm curious.
01:08:04.520 | - Great.
01:08:05.340 | - And I'm willing to lose the money.
01:08:07.640 | So I suppose that factors in as well.
01:08:09.040 | - Yeah.
01:08:09.880 | Follow-up question.
01:08:10.700 | In that type of study, what percentage of trustees
01:08:14.120 | do you think make the trustworthy decision
01:08:16.200 | of sending back the money?
01:08:17.500 | - Gosh.
01:08:24.500 | (silence)
01:08:28.520 | - Yeah, so your prediction there
01:08:31.140 | is quite aligned with most people's.
01:08:33.780 | There's a great study by Fetchenhauer and Dunning
01:08:39.500 | that found that people, when they're asked to forecast,
01:08:43.160 | they say, I bet 52, 55% of people will send this money back,
01:08:47.700 | will make this binary trust decision.
01:08:50.300 | In fact, 80% of trustees
01:08:52.660 | make the pro-social and trustworthy decision.
01:08:55.840 | And again, what Fetchenhauer and Dunning found
01:08:58.160 | is that when we have negative assumptions,
01:09:02.260 | we're less likely to send over the money,
01:09:04.740 | and therefore less likely to learn that we were wrong.
01:09:07.440 | And so that's another example
01:09:10.800 | of where cynical beliefs block that learning. I mean,
01:09:13.780 | you're interesting because you had the belief
01:09:15.460 | that it's a 50% chance, but you still chose to trust.
01:09:18.660 | So from a Bayesian perspective,
01:09:21.020 | when that person actually sent the money back,
01:09:23.300 | which they would have an 80% chance of doing,
01:09:25.580 | and if I were to ask you again,
01:09:27.140 | what percentage of people give back,
01:09:29.440 | you might update your perception.
01:09:31.780 | - Absolutely.
01:09:32.620 | - Right?
01:09:33.700 | But without any evidence, you can't update your perception.
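To make the game and the belief update concrete, here is a minimal sketch in Python. The payoffs follow the $10-tripled example above, and the roughly 50% prior and 80% reciprocation rate follow the figures just mentioned; the beta-distribution update is only one standard way to formalize "updating your perception," not the analysis used in the studies discussed.

```python
# Toy sketch of the binary trust game and a simple Bayesian belief update.
# Numbers follow the example in the conversation ($10 stake, money tripled,
# ~50% prior guess, 80% of trustees actually reciprocating); the beta-prior
# update is an illustrative choice, not the published analysis.

def trust_game(invest: bool, trustee_returns: bool, stake: float = 10.0):
    """Return (investor_payoff, trustee_payoff) for the binary trust game."""
    if not invest:
        return stake, 0.0          # keep the $10, trustee gets nothing
    pot = stake * 3                # the transferred money triples to $30
    if trustee_returns:
        return pot / 2, pot / 2    # even split: $15 each
    return 0.0, pot                # betrayal: trustee keeps all $30

# Belief update: treat "share of trustworthy trustees" as a beta distribution.
# A vague prior centered near 50% ...
alpha, beta = 2.0, 2.0
print("prior mean:", alpha / (alpha + beta))          # 0.50

# ... then observe ten trustees, 8 of whom send money back (the ~80% rate).
returned, kept = 8, 2
alpha, beta = alpha + returned, beta + kept
print("posterior mean:", alpha / (alpha + beta))      # ~0.71, moving toward 0.8

# The cynic's trap: if you never invest, you never observe the trustee's
# choice, so alpha and beta never change and the pessimistic prior persists.
print(trust_game(invest=True, trustee_returns=True))  # (15.0, 15.0)
```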
01:09:36.940 | So, and this is just one of many examples.
01:09:39.220 | It turns out that there's a lot of evidence
01:09:41.600 | that when asked to estimate how friendly, trustworthy,
01:09:46.180 | compassionate, or open-minded others are,
01:09:49.100 | people's estimates come in much lower than data suggests.
01:09:52.620 | And this to me is both the tragedy of cynical thinking,
01:09:56.940 | those heuristics that we're using,
01:09:58.500 | and a major opportunity for so many of us, right?
01:10:01.780 | It's a tragedy because we're coming up
01:10:04.340 | with these simple black and white,
01:10:06.700 | physics-like predictions about the world,
01:10:09.500 | and they're often wrong.
01:10:11.060 | They're often unduly negative.
01:10:13.620 | An opportunity because to the extent
01:10:16.420 | that we can tap into a more scientific
01:10:18.940 | or curious mindset,
01:10:21.340 | to the extent that we can open ourselves to the data,
01:10:23.900 | pleasant surprises are everywhere.
01:10:26.780 | The social world is full of a lot more positive
01:10:30.860 | and helpful and kind people than we realize, right?
01:10:34.620 | The average person underestimates the average person.
01:10:38.660 | This is not to say that there aren't awful people,
01:10:42.060 | people who do awful things every day around the world.
01:10:44.340 | There, of course, are.
01:10:45.980 | But we take those extreme examples and over-rotate on them.
01:10:50.060 | We assume that the most toxic, awful examples
01:10:53.900 | that we see are representative when they're not.
01:10:57.540 | So we miss all these opportunities,
01:10:58.980 | but understanding that, I hope,
01:11:01.500 | opens people to gaining more of those opportunities,
01:11:06.260 | to using them, and to finding out more accurate
01:11:09.620 | and more hopeful information about each other.
01:11:12.100 | - There does seem to be a salience
01:11:13.860 | about negative interactions or somebody stealing from us
01:11:18.660 | or doing something that we consider cruel
01:11:21.900 | to us or to others.
01:11:23.580 | Nowadays, with social media,
01:11:26.380 | we get a window into, gosh,
01:11:30.020 | probably billions of social interactions
01:11:32.980 | in the form of comments and clapbacks and retweets.
01:11:37.060 | And there certainly is benevolence on social media,
01:11:40.020 | but what if any data exists about how social media
01:11:45.020 | either feeds or impedes cynicism,
01:11:48.700 | or maybe it doesn't change it at all?
01:11:51.120 | And I should say that there's also the kind of,
01:11:54.760 | I have to be careful, I'm trying not to be cynical.
01:11:58.740 | I maintain the view that certain social media platforms
01:12:05.900 | encourage a bit more negativity than others.
01:12:10.400 | And certainly there are accounts,
01:12:11.980 | I'm trying to think of accounts on Instagram like Upworthy,
01:12:15.620 | which its whole basis is to promote positive stuff.
01:12:20.020 | I like that account very much.
01:12:21.580 | But certainly you can find the full array
01:12:25.980 | of emotions on social media.
01:12:27.980 | To what extent is just being on social media,
01:12:30.580 | regardless of platform, increasing or decreasing cynicism?
01:12:35.340 | It's a terrific question.
01:12:38.220 | It's hard to provide a very clear answer,
01:12:41.060 | and I don't want to get out over my skis
01:12:43.020 | with what is known and what's not known.
01:12:45.540 | Social media has been a tectonic shift in our lives.
01:12:49.540 | It has coincided with a rise in cynicism,
01:12:52.860 | but as you know, history's not an experiment.
01:12:55.300 | So you can't take two temporal trends
01:12:57.460 | that are coincident with one another
01:12:58.840 | and say that one caused the other.
01:13:01.020 | That said, my own intuition and a lot of the data
01:13:04.660 | suggest that in at least some ways,
01:13:07.260 | social media is a cynicism factory, right?
01:13:10.260 | I mean, so let's first stipulate
01:13:12.100 | how much time we're spending on there.
01:13:13.760 | I mean, the average person goes through
01:13:17.260 | 300 feet of social media feed a day.
01:13:21.140 | - Is that right? - Yeah.
01:13:22.020 | - They've measured it in feet?
01:13:23.140 | - Approximately the height of the Statue of Liberty, yeah.
01:13:25.660 | So we're doing one Statue of Liberty worth
01:13:28.380 | of scrolling a day, much of it doom scrolling,
01:13:31.460 | if you're anything like me, at least.
01:13:34.540 | And so then the question becomes,
01:13:37.260 | what are we seeing when we scroll for that long?
01:13:41.640 | Who are we seeing?
01:13:43.380 | And are they representative of what people are really like?
01:13:47.940 | And the answer in a lot of ways is no.
01:13:51.980 | That what we see on social media
01:13:53.300 | is not representative of the human population.
01:13:56.780 | So there's a lot of evidence.
01:13:59.220 | A lot of this comes from William Brady,
01:14:02.020 | now at Northwestern, and Molly Crockett,
01:14:04.560 | that when people tweet, for instance,
01:14:08.780 | I mean, a lot of this is done on the site
01:14:11.060 | formerly known as Twitter, when people tweet in outrage,
01:14:15.780 | and when they tweet negatively,
01:14:17.360 | and when they tweet about, in particular, immorality,
01:14:20.760 | right, moral outrage, that algorithmically,
01:14:24.260 | those tweets are broadcast further, they're shared more.
01:14:29.060 | And this does a couple of things.
01:14:30.380 | One, it reinforces the people
01:14:32.620 | who are already tweeting in that way.
01:14:34.780 | So William Brady has this great work
01:14:37.580 | using a kind of reinforcement learning model, right?
01:14:40.540 | Reinforcement learning is where you do something,
01:14:43.180 | you're rewarded, and that reward makes you more likely
01:14:45.720 | to do that same thing again.
01:14:47.580 | And it turns out that Brady found
01:14:51.500 | that when people tweet in outrage and then get egged on,
01:14:56.100 | and oftentimes I should say this is tribal in nature,
01:14:58.700 | it's somebody tweeting against somebody who's an outsider,
01:15:01.900 | and then being rewarded by people
01:15:03.580 | who they consider to be part of their group, right?
01:15:06.320 | When that happens, that person is more likely
01:15:08.900 | in their future tweets to turn up the volume
01:15:12.200 | on that outrage and on that moral outrage in particular.
01:15:15.440 | So there's a sort of ratchet effect, right,
01:15:18.540 | on the people who are sharing.
01:15:20.520 | But a second question becomes,
01:15:22.780 | well, what about the people watching?
01:15:24.700 | What about the rest of us?
01:15:26.780 | Claire Robertson has a great paper on this
01:15:28.840 | where she documents that a vast majority,
01:15:31.940 | I mean, 90 plus percent of tweets are created
01:15:35.500 | by the 10% of the most active users, right?
01:15:40.060 | And this is in the political sphere.
01:15:41.980 | And these are probably not representative, these folks,
01:15:45.540 | not representative of the rest of us in terms
01:15:47.500 | of how extreme and maybe how bitter their opinions are.
01:15:52.500 | And so we, when we're scrolling,
01:15:55.580 | that Statue of Liberty's worth of information,
01:15:58.340 | we think that we're seeing the world.
01:16:00.980 | We think that we're seeing our fellow citizens.
01:16:02.900 | We think that we're getting a picture
01:16:04.620 | of what people are like.
01:16:06.060 | In fact, we're pulling from the fringes.
01:16:08.460 | And what this leads to is a misconstrual
01:16:12.500 | of what the world is really like.
01:16:14.880 | This is, by the way, not just part of social media,
01:16:17.700 | it's also part of legacy media.
01:16:19.860 | Communication theorists talk about something called
01:16:22.940 | the mean world syndrome, right?
01:16:25.500 | Where the more time that you spend looking at the news,
01:16:28.260 | for instance, the more you think violent crime
01:16:31.380 | is up in your area,
01:16:33.020 | the more you think you're in danger of violent crime,
01:16:36.340 | even during years when violent crime is decreasing.
01:16:39.820 | I'm old enough to remember when "Stranger Danger"
01:16:43.260 | was this big, massive story.
01:16:45.340 | And every time you wanted cereal,
01:16:47.100 | the milk carton would have a picture
01:16:48.860 | of a kid who had been kidnapped by a stranger.
01:16:51.540 | And during that time, if you asked people
01:16:53.580 | how many kids are being kidnapped by strangers in the US,
01:16:57.460 | they would, in many cases, say 50,000 children
01:17:00.100 | are being kidnapped each year in the US.
01:17:02.100 | Can you imagine what the world would be?
01:17:05.580 | There would be SWAT teams on every corner.
01:17:08.020 | The real number in those years
01:17:09.640 | was closer to 100 kids per year.
01:17:12.260 | Now, let me be clear, each one of those
01:17:14.220 | is an absolute tragedy, but there's a big difference here.
01:17:17.940 | And oftentimes, when we tune into media,
01:17:21.140 | we end up with these enormously warped perceptions
01:17:24.620 | where we think that the world is much more dangerous
01:17:26.960 | than it really is.
01:17:27.960 | We think that people are much more extreme
01:17:30.380 | than they really are.
01:17:31.220 | And because stories of immorality go viral
01:17:34.820 | so much more often than stories of everyday goodness,
01:17:38.260 | I mean, I love "Upworthy" as well,
01:17:39.920 | but it's not winning right now in the social media wars.
01:17:44.780 | - Not yet.
01:17:45.620 | - Not yet, not yet.
01:17:47.060 | And so this leaves us all absolutely exhausted
01:17:51.060 | and also feeling alone.
01:17:52.740 | People who feel like, wow,
01:17:54.220 | I actually don't feel that much outrage
01:17:57.540 | or I don't want to feel that much outrage.
01:17:59.380 | I actually don't want to hate everybody
01:18:02.500 | who's different from me, for instance.
01:18:04.400 | I'm just exhausted by all this.
01:18:07.380 | We feel like, well, I guess I'm the only one
01:18:09.340 | because everybody else seems really excited
01:18:11.180 | about this battle royale that we've put ourselves in.
01:18:14.540 | But in fact, most people are just
01:18:17.500 | like the exhausted majority, right?
01:18:19.300 | We're paying so much attention to a tiny minority
01:18:22.980 | of what the journalist, Amanda Ripley,
01:18:25.820 | calls conflict entrepreneurs,
01:18:28.500 | people who stoke conflict on purpose
01:18:30.780 | that we're confusing them with the average.
01:18:32.980 | - Ooh, so much there.
01:18:36.140 | I have, I suppose, a mixed relationship to social media.
01:18:41.300 | I teach there and I learn there.
01:18:43.340 | And I also have to be very discerning
01:18:45.980 | in terms of how I interact with it.
01:18:47.460 | And you made this point
01:18:49.860 | that I've never heard anyone make before,
01:18:52.060 | which is that many people feel alone
01:18:54.500 | by virtue of the fact that they don't share
01:18:56.260 | in this warring nature that they see on social media.
01:19:00.900 | It's almost like sometimes I feel
01:19:03.260 | like I'm watching a combat sport
01:19:04.940 | that I don't feel quite cut out for.
01:19:07.880 | - Yeah.
01:19:08.720 | - And then when I'm away from it, I feel better.
01:19:12.860 | But I, like everybody else,
01:19:14.100 | sometimes we'll get sucked into the highly salient nature
01:19:18.420 | of a combat between groups on social media.
01:19:22.620 | It can be very alluring in the worst ways.
01:19:27.120 | This mean world syndrome, what's the inverse of that?
01:19:33.220 | The kind world syndrome, I suppose.
01:19:35.500 | But attempts at creating those sorts
01:19:37.880 | of social media platforms have been made.
01:19:41.020 | Things like Blue Sky, which has other aspects to it as well.
01:19:44.100 | But, and while it may be thriving,
01:19:46.380 | I don't know, I haven't checked recently.
01:19:50.500 | It seems like people aren't really interested
01:19:52.940 | in being on there as much as they are these other platforms.
01:19:55.460 | Clearly the numbers play out that way.
01:19:57.980 | Why do you think that is?
01:20:00.260 | - Well, we as a species, I think,
01:20:03.700 | are characterized by what we would call negativity bias.
01:20:08.180 | Right, negative events and threats loom larger in our minds.
01:20:12.420 | And that happens in a number of domains.
01:20:15.460 | Our decision making is negatively biased
01:20:19.500 | in that we'd prefer to avoid a negative outcome
01:20:22.260 | than to pursue a positive outcome.
01:20:23.700 | That's the classic work of Kahneman and Tversky,
01:20:26.060 | for instance.
01:20:27.380 | The impressions that we form are often negatively skewed.
01:20:31.460 | So classic work in psychology going back to the 1950s
01:20:35.380 | shows that if you teach somebody about a new person
01:20:39.220 | who they've never met and you list three positive qualities
01:20:42.060 | that this person has and three negative qualities,
01:20:45.060 | people will very much judge the person
01:20:48.500 | on their worst qualities.
01:20:50.220 | And also remember more about their negative qualities
01:20:53.420 | than about their positive qualities.
01:20:55.620 | And again, you can see why this would be part of who we are
01:20:59.740 | because we need to protect one another.
01:21:01.700 | We also tend to, by the way, not just think
01:21:04.300 | in a negatively biased way,
01:21:05.580 | but speak and share in a negatively biased way.
01:21:09.540 | In my lab, we had a study where people witnessed
01:21:12.780 | other groups of four playing an economic game
01:21:15.460 | where they could be selfish or they could be positive.
01:21:20.460 | And we asked them, okay, we're gonna ask you to share
01:21:24.380 | a piece of information about one of the people
01:21:26.100 | you were playing this game with
01:21:27.700 | for a future generation of participants.
01:21:31.500 | Who would you like to share about?
01:21:33.140 | And when somebody in a group acted in a selfish way,
01:21:36.500 | people shared information about them three times more often
01:21:41.660 | than when they acted in a generous way.
01:21:44.420 | So we gossip negatively.
01:21:46.780 | And again, that gossip is pro-social.
01:21:49.940 | The idea is if there's somebody out there
01:21:51.860 | harming my community, of course,
01:21:53.780 | I'm gonna shout about them from the rooftops
01:21:55.780 | because I wanna protect my friends.
01:21:57.860 | It's a very noble instinct in a way.
01:22:01.420 | But we further found that when we actually showed
01:22:03.740 | a new generation of participants
01:22:05.460 | the gossip that the first generation shared,
01:22:07.820 | and we asked, hey, how generous and how selfish
01:22:10.260 | were people in that first generation,
01:22:11.940 | they vastly underestimated that group's generosity.
01:22:15.580 | Does that make sense?
01:22:16.740 | In other words, in trying to protect our communities,
01:22:20.180 | we send highly biased information
01:22:23.420 | about who's in our community
01:22:24.860 | and give other people the wrong idea of who we are.
01:22:29.180 | And I see that unfolding on social media
01:22:32.260 | every day of my life.
01:22:33.260 | Every day that I'm on social media,
01:22:34.620 | I do try to take breaks.
01:22:36.460 | But when I'm on there, I see it.
01:22:38.340 | And to your question, what do we do here?
01:22:42.300 | Why don't positive networks, positive information,
01:22:45.420 | why doesn't it proliferate more?
01:22:48.460 | I think it's because of these ingrained biases in our mind.
01:22:52.060 | And I understand that that can sound fatalistic
01:22:54.900 | because it's like, oh, maybe this is just who we are.
01:22:57.740 | But I don't think that we generally accept our instincts
01:23:01.260 | and biases as a life sentence, as destiny.
01:23:06.260 | A lot of us, well, human beings in general,
01:23:09.780 | have the instinct to trust and be kinder
01:23:12.380 | towards people who look like us versus people who don't,
01:23:15.340 | for instance, who share our racial makeup.
01:23:17.860 | None of us, I think, or a few of us sit here and say,
01:23:20.860 | well, I have that bias in my mind,
01:23:22.500 | so I guess I'm always going to be racially biased.
01:23:26.580 | We try to counteract those instincts.
01:23:29.780 | We try to become aware of those biases.
01:23:33.060 | Depressed people have the bias
01:23:35.540 | to see themselves as worthless
01:23:37.180 | and to interpret new information they receive
01:23:39.620 | through that framework.
01:23:41.100 | Well, therapy is the attempt to say,
01:23:43.340 | I don't want to feel this way anymore.
01:23:45.180 | I want to fight the default settings in my mind.
01:23:48.820 | I want to try to explore curiosity,
01:23:51.380 | to explore something new.
01:23:53.460 | So to say that this toxic environment that we're in
01:23:56.620 | corresponds with some of our biases
01:23:59.060 | is, to me, not the same as saying
01:24:00.500 | we are destined to remain in that situation.
01:24:03.140 | - Do you think it's possible
01:24:04.580 | to be adequately informed about threats
01:24:07.980 | to be able to live one's life in the most adaptive way
01:24:13.260 | while not being on social media,
01:24:16.740 | none of the social media platforms?
01:24:18.540 | Can you have a great life that way,
01:24:22.580 | a safe life?
01:24:24.100 | - This is a quasi-philosophical question,
01:24:26.460 | but from my perspective, absolutely.
01:24:29.340 | I mean, I think some of the threats
01:24:30.820 | that we learn about on social media are simply wrong.
01:24:34.780 | They're phantom threats.
01:24:37.300 | We're made to fear something that actually is not happening,
01:24:42.260 | made to fear a group of people who are not as dangerous
01:24:47.140 | as they're made out to be on social media.
01:24:50.140 | Of course, I think being informed about the world around us
01:24:53.380 | matters to staying safe.
01:24:55.380 | But again, I think we can also more broadly
01:24:57.740 | construe what safety is.
01:25:00.100 | You know, if being on social media
01:25:01.940 | makes you avoidant of taking chances on people,
01:25:05.580 | if it makes you feel as though
01:25:07.020 | anybody who's different from you ideologically,
01:25:09.340 | for instance, is bloodthirsty and extreme,
01:25:13.180 | that's going to limit your life in very important ways.
01:25:17.500 | And you can talk about being safe
01:25:19.220 | in terms of safe from acute threats.
01:25:22.380 | But as we've talked about,
01:25:23.820 | living a diminished and disconnected life
01:25:26.460 | is its own form of danger over a longer time horizon.
01:25:30.340 | So really, you know, there are a lot of ways
01:25:33.580 | in which in the attempt to stay safe right now,
01:25:37.860 | we introduce ourselves to long-term danger.
01:25:40.300 | - I'm not anti-social media,
01:25:43.660 | but I have to circle back on this yet again.
01:25:48.460 | Former guest on this podcast,
01:25:49.860 | one of our most popular episodes
01:25:51.260 | is with a former Navy SEAL, David Goggins,
01:25:53.460 | who's known for many things,
01:25:56.460 | but chief among them is striving and pushing oneself.
01:26:01.460 | And David has said many times that nowadays
01:26:05.780 | it's easier than ever to be extraordinary
01:26:07.780 | because most people are basically spending time
01:26:09.580 | just consuming experiences on social media
01:26:13.220 | and doing a lot less, just literally doing a lot less,
01:26:17.300 | not just exercising and running as he does.
01:26:19.660 | Although, by the way, he's in school to become a paramedic.
01:26:22.140 | So he's essentially gone to medical school
01:26:24.260 | and is always doing a bunch of other things as well.
01:26:27.500 | So he's also an intellectual learner.
01:26:31.260 | Now, I don't know if I agree with him completely,
01:26:34.780 | but it's an interesting statement.
01:26:37.820 | You know, if social media is bringing out our cynicism,
01:26:43.040 | polarizing us, and perhaps taking away,
01:26:47.100 | I would probably agree with David,
01:26:49.280 | at least to some extent, taking away our time
01:26:54.080 | where we could be generative, writing, thinking,
01:26:58.600 | socializing, building in other ways
01:27:01.120 | that one builds their life,
01:27:04.280 | then I guess an important question is,
01:27:07.080 | do you think social media could be leveraged
01:27:09.360 | to decrease cynicism or, as you referred to it,
01:27:14.360 | to generate hopeful skepticism?
01:27:17.700 | Like this notion of hopeful skepticism
01:27:20.500 | as a replacement for cynicism
01:27:21.960 | is something that is really intriguing.
01:27:23.900 | Like, what would that look like?
01:27:24.960 | Like, if we were just gonna do the Gedanken experiment here,
01:27:28.160 | like, what would a feed on social media look like
01:27:30.760 | that fed hopeful skepticism as opposed to cynicism?
01:27:35.760 | - Here's a far out example.
01:27:37.720 | I mean, I love this train of thought,
01:27:39.600 | so I'm gonna try to take it to a logical conclusion
01:27:42.720 | that would never actually occur in real life,
01:27:45.080 | but a great way to generate more accurate
01:27:49.840 | and hopeful skepticism, and by hopeful skepticism,
01:27:52.400 | I mean skepticism as we've described,
01:27:54.780 | a scientific mindset, a scientific perspective,
01:27:57.640 | and a curiosity, a hunger for information.
01:28:00.960 | And in the hopeful piece, I simply mean skepticism
01:28:04.660 | that begins with the understanding
01:28:06.460 | that our defaults are often too negative,
01:28:09.640 | so that I'm going to be open and I'm going to realize
01:28:12.300 | that my gut instinct is probably leading me
01:28:16.460 | towards the negative and can be challenged,
01:28:18.440 | that I don't have to listen to it all the time.
01:28:20.480 | So just as a working definition,
01:28:22.780 | I think that what I would want in a social media feed
01:28:28.600 | would be for it to have more data.
01:28:31.040 | If you could compel every person on Earth
01:28:35.240 | to post to social media about what they're doing today,
01:28:39.020 | about what they're thinking, about what they want,
01:28:41.340 | about their values, right?
01:28:43.420 | If you could compel each, of course,
01:28:45.420 | that's dystopic in many ways,
01:28:46.620 | but just as a thought experiment.
01:28:48.560 | And then people's feed was a representative sample
01:28:52.820 | of real people on the planet, right?
01:28:55.720 | Real people, and people who over time, right,
01:29:00.520 | as I scroll through my Statue of Liberty now,
01:29:02.940 | I see what people are really like.
01:29:05.140 | I see the people who are extreme and negative and toxic,
01:29:08.340 | but I also see a grandmother
01:29:10.820 | who's driving her grandkid to hockey practice.
01:29:14.080 | I see a nurse who's coming in to help an elderly patient.
01:29:18.820 | I see somebody who's made an unlikely connection
01:29:22.380 | with somebody who they disagree with.
01:29:24.500 | A veridical, accurate feed,
01:29:28.540 | I think would drive hopeful skepticism.
01:29:30.460 | And that's, again, one of the things
01:29:32.340 | that has struck me most over the last few years
01:29:34.880 | of doing this research,
01:29:36.460 | is that we stereotype hope and positivity,
01:29:40.340 | as you were saying earlier,
01:29:42.100 | as kind of dim, naive, a rose-colored pair of glasses.
01:29:47.100 | But in fact, I think what the data show us
01:29:50.900 | is that we're all wearing a pair
01:29:52.900 | of soot-colored glasses all the time.
01:29:55.620 | And actually the best way to make people more hopeful
01:29:58.540 | is to ask them to look more carefully,
01:30:00.980 | not to look away, but look towards
01:30:04.240 | in a more accurate and open fashion.
01:30:06.840 | And there's one version of this
01:30:08.720 | that we've tried at Stanford, in our own backyard.
01:30:12.760 | So my lab and I, we've, for years,
01:30:15.120 | been surveying as many Stanford undergraduates as we can
01:30:19.360 | about their social health, right?
01:30:21.280 | So how connected are they?
01:30:23.280 | How mentally healthy are they?
01:30:24.840 | And a couple of years ago,
01:30:28.240 | we asked thousands of undergraduates to describe
01:30:32.040 | both themselves and the average Stanford student
01:30:34.840 | on a number of dimensions.
01:30:36.400 | For instance, how empathic are you?
01:30:37.960 | How empathic is the average Stanford student?
01:30:40.280 | How much do you like helping people who are struggling?
01:30:42.880 | What do you think the average Stanford student
01:30:44.320 | would respond to that?
01:30:45.480 | How much do you want to meet new people on campus?
01:30:47.640 | How do you think the average student would respond?
01:30:50.080 | And we discovered not one, but two Stanfords.
01:30:54.160 | The first was made up of real students
01:30:56.040 | who are enormously compassionate,
01:30:58.220 | who really want to meet new friends,
01:31:01.160 | who want to help their friends when they're struggling.
01:31:04.340 | The second Stanford existed in students' minds.
01:31:07.840 | Their imagination of the average undergraduate
01:31:10.980 | was much less friendly, much less compassionate,
01:31:14.320 | much pricklier and more judgmental than real students were.
01:31:18.800 | So again, we've got this discrepancy
01:31:20.920 | between what people perceive and social reality.
01:31:24.600 | We found that students who underestimated their peers
01:31:28.160 | were less willing to do things like
01:31:30.120 | strike up a conversation with a stranger
01:31:32.480 | or confide in a friend when they were struggling.
01:31:34.920 | And that left them more isolated and lonelier.
01:31:37.920 | This is the kind of vicious cycle of cynicism, right?
01:31:41.440 | But more recently, my lab led by a great postdoc, Rui Pei,
01:31:46.240 | tried an intervention.
01:31:48.120 | And the intervention was as simple as you can imagine.
01:31:50.600 | It was show students the real data.
01:31:53.640 | We put posters in a number of dorms,
01:31:56.080 | experimental dorms we called them,
01:31:58.160 | that simply said, hey, did you know,
01:32:00.400 | 95% of students at Stanford
01:32:04.020 | would like to help their friends who are struggling.
01:32:06.000 | 85% want to make friends with new students.
01:32:09.560 | We also worked with Frosh 101,
01:32:12.360 | a one-unit class that most first-year students take
01:32:15.640 | and show them the data.
01:32:17.520 | We're just showing students to each other.
01:32:20.440 | And we found that when students learned this information,
01:32:23.960 | they were more willing to take social risks.
01:32:26.800 | And six months later,
01:32:28.160 | they were more likely to have a greater number of friends,
01:32:30.880 | to be more socially integrated.
01:32:32.800 | So here again is a tragic and vicious cycle,
01:32:36.800 | but then there's a virtuous cycle that can replace it
01:32:39.460 | if we just show people better information.
01:32:42.320 | Again, I don't imagine that there'll ever be
01:32:44.440 | a social media feed where everybody has to post
01:32:47.520 | and you see an actually representative sample of the world.
01:32:50.040 | But if we could, I do think that that would generate
01:32:53.680 | a more hopeful perspective because the truth
01:32:57.040 | is more hopeful than what we're seeing.
01:32:59.040 | - Do you think there's a version of AI
01:33:02.080 | that is less cynical than people tend to be?
01:33:06.840 | The reason I ask this is I'm quite excited about
01:33:10.240 | and hopeful about AI.
01:33:11.920 | I'm not one of these, I don't know what you call them,
01:33:14.360 | but AI doomers. - Doomers, yeah.
01:33:17.300 | - And it's here, it's happening.
01:33:18.520 | It's happening in the background now.
01:33:19.800 | And I've started using AI in a number of different
01:33:23.100 | realms of life and I find it to be incredible.
01:33:25.820 | It seems to me to combine neural networks
01:33:29.240 | and Google search with PubMed and it's fascinating.
01:33:33.840 | It's not perfect, it's far from perfect,
01:33:36.080 | but that's also part of its beauty
01:33:37.700 | is that it mimics a human lack of perfectness well enough
01:33:42.320 | that it feels something kind of like
01:33:45.720 | brain-like, personality-like.
01:33:49.600 | You could imagine that given the enormous amount
01:33:53.440 | of cynicism that's out there,
01:33:56.040 | that some of the large language models that make up AI
01:33:59.720 | would be somewhat cynical, would put filters
01:34:04.560 | that were overly stringent on certain topics.
01:34:07.640 | You also wouldn't want AI that was not stringent enough,
01:34:11.960 | right, because we are already and soon to be using AI
01:34:16.720 | to bring us information extremely quickly
01:34:19.480 | and the last thing we want are errors in that information.
01:34:22.280 | So if we were to take what we know from humans
01:34:26.560 | and the data that you've collected
01:34:28.080 | and others have collected about ways to shift ourselves
01:34:31.240 | from cynicism to hopeful skepticism,
01:34:34.040 | do you think that's something that could be laced
01:34:36.560 | into these large language models?
01:34:38.460 | I'm not talking about at the technical level,
01:34:40.320 | that's certainly beyond my understanding,
01:34:43.280 | but could you build an AI version of yourself
01:34:46.160 | that could forage the internet for news
01:34:48.120 | and what's going on out there that is,
01:34:50.320 | you know, tune down the cynicism a little bit
01:34:54.380 | since it's difficult to be less cynical?
01:34:56.840 | In other words, could it do a better job
01:34:58.640 | of being you than you and then therefore make you better?
01:35:02.200 | - Wow, I love that question.
01:35:06.120 | I think that there is, I could imagine
01:35:09.880 | an opportunity for that.
01:35:11.520 | I think one roadblock that I don't think is insurmountable
01:35:16.080 | but that you would need to face
01:35:18.040 | in that really fascinating goal is that AI models
01:35:22.800 | are, of course, products of the data that we feed them.
01:35:25.960 | And so if, you know, basically AI models eat the internet,
01:35:29.960 | right, swallow it, and then give it back to us in some form,
01:35:33.080 | to the extent that the internet is asymmetrically weighted,
01:35:38.040 | right, is overweighting negative content and cynical content,
01:35:43.640 | then AIs that swallow that will reflect it as well.
01:35:48.200 | I think that, and I could imagine,
01:35:50.040 | and it's blowing my mind in real time to think about,
01:35:53.320 | but you could imagine retuning the way
01:35:58.280 | that AI takes information to account for negativity bias
01:36:01.900 | and to correct, I mean, this is what you're getting at,
01:36:04.560 | I think, right, to correct for that negativity bias
01:36:07.600 | and then produce an inference that is less biased,
01:36:12.840 | more accurate, and less cynical,
01:36:15.780 | and then give that as a kind of digest to people, right?
01:36:19.040 | So don't make me go through my social media feed.
01:36:23.080 | Go through it for me, correct, right, de-bias it,
01:36:27.520 | and then give it to me in a more accurate way.
01:36:32.000 | That's an incredible idea.
01:36:34.360 | - I mean, that's what I want.
01:36:36.280 | I was thinking about my Instagram feed
01:36:38.400 | and cynicism versus hopeful skepticism
01:36:41.880 | versus, I guess, awe, and I'll use the following examples.
01:36:46.140 | I subscribed to an Instagram account that I like very much,
01:36:51.400 | which essentially just gives me images
01:36:53.680 | of beautiful animals in their ultimate essence.
01:36:58.680 | It's an account by a guy named Joel Sartore
01:37:01.800 | who works for National Geographic,
01:37:03.240 | and he's created what's called the photo arc.
01:37:04.920 | He's trying to get images of all the world's animals
01:37:09.920 | that really capture their essence,
01:37:12.480 | and many of them are endangered
01:37:14.640 | and some very close to extinction.
01:37:16.600 | Others are more prolific right now.
01:37:21.400 | Nonetheless, I think of that account
01:37:22.940 | as all goodness, all benevolence.
01:37:25.680 | And then at the other extreme,
01:37:26.820 | I subscribe to an animal account called Nature is Metal.
01:37:29.840 | We've actually collaborated with Nature is Metal
01:37:33.560 | on a great white shark grabbing a tuna video
01:37:37.720 | that I didn't take, but someone I was with took,
01:37:41.360 | and we got their permission to post it.
01:37:43.400 | In any event, Nature is Metal
01:37:44.480 | is all about the harshness of nature.
01:37:46.320 | And then I think about the Planet Earth series
01:37:49.060 | hosted by David Attenborough and so forth,
01:37:51.480 | which sort of has a mixture of beautiful ducklings,
01:37:54.660 | but then also animals hunting each other
01:37:57.940 | and dying of old age or of starvation,
01:38:00.280 | and so the full array.
01:38:01.340 | So I think about that as an example of,
01:38:04.720 | if you look at Nature is Metal long enough,
01:38:07.960 | and it's a very cool account.
01:38:09.920 | I highly recommend people follow
01:38:11.160 | all three of these accounts.
01:38:12.760 | But if you look at it long enough,
01:38:13.800 | you get the impression like nature is hard.
01:38:16.960 | Life is hard out there.
01:38:18.800 | And it can be.
01:38:20.080 | You look at the Sartore account
01:38:23.240 | and you get the impression that animals are just beautiful.
01:38:25.940 | They're just being them, right?
01:38:27.820 | And he has a gift for capturing
01:38:31.800 | the essence of insects, reptiles, and mammals,
01:38:34.160 | and everything in between.
01:38:35.760 | So when I think about social media,
01:38:39.040 | or I even just think about our outlook
01:38:41.600 | onto the landscape of real life, non-virtual life,
01:38:46.520 | I feel like the human brain
01:38:48.400 | potentially can like all these things.
01:38:51.340 | But what you're describing in cynicism
01:38:53.200 | is the people that, for whatever reason,
01:38:56.240 | they're skewed toward this view that like life is hard,
01:38:59.880 | and therefore I need to protect myself
01:39:01.600 | and protect others at all times.
01:39:04.600 | In reality, how dynamic is cynicism?
01:39:07.280 | Earlier, you described how it can be domain-specific.
01:39:10.280 | But if somebody is pretty cynical,
01:39:14.460 | and they're older than 25,
01:39:18.200 | they're outside the sort of developmental plasticity range,
01:39:21.260 | what are the things that they can do on a daily basis
01:39:25.640 | to either tune down their cynicism
01:39:28.300 | or create room for this hopeful skepticism
01:39:31.160 | in a way that enriches them?
01:39:32.680 | Let's just start with them,
01:39:33.900 | because after all, they're cynics.
01:39:35.780 | Like we can't bait them
01:39:37.960 | with the good that they'll do for the world,
01:39:39.880 | but they'll do that too.
01:39:41.120 | What are some tools that we can all apply
01:39:44.760 | towards being less cynical?
01:39:47.160 | - It's a brilliant question, and you're right.
01:39:49.200 | I mean, I think a lot of us are very tuned
01:39:51.180 | into the metal side of life.
01:39:52.720 | And heavy metal is great, but life is not all metal.
01:39:57.120 | So how do we retune ourselves?
01:39:59.880 | I think about this a lot,
01:40:01.880 | in part because over the last several years,
01:40:04.700 | I haven't just been studying cynicism.
01:40:07.380 | I've been trying to counteract it in myself and in others.
01:40:11.260 | So I've focused on practical everyday things that I can do.
01:40:16.260 | And I guess they come in a bunch of categories.
01:40:19.380 | I'm gonna try to tick through them,
01:40:21.100 | but I really wanna hear your thoughts.
01:40:23.180 | The first has to do with our mindsets
01:40:25.220 | and the ways that we approach our own thinking.
01:40:29.140 | So I like to engage in a practice
01:40:32.220 | that I call being skeptical of my cynicism.
01:40:35.320 | So that is, in essence,
01:40:38.880 | taking tools from cognitive behavioral therapy
01:40:41.820 | and applying them to my cynical inferences.
01:40:44.780 | So again, my default mode,
01:40:46.500 | my factory settings are pretty suspicious.
01:40:48.800 | I wanna lay my cards on the table.
01:40:50.380 | It's ironic, given what I study, but there we are.
01:40:53.360 | So I often find myself in new situations,
01:40:56.620 | suspecting people, mistrusting people,
01:40:59.020 | wondering if they might take advantage of me.
01:41:01.700 | And what I do these days that I didn't do in the past
01:41:05.420 | is say, well, wait a minute, Zaki,
01:41:07.420 | where is this coming from?
01:41:09.620 | You're a scientist, defend your inference,
01:41:13.060 | defend your hypothesis, right?
01:41:16.260 | What evidence do you have to back it up?
01:41:17.660 | And very often,
01:41:18.940 | I find that the evidence is thin to non-existent, right?
01:41:22.520 | So that challenge, that just unearthing of,
01:41:25.660 | wait a minute, are you sure?
01:41:27.560 | No, you're not.
01:41:28.400 | And that can tap into a little bit of intellectual humility.
01:41:32.300 | A second thing that I try to do
01:41:34.820 | is apply what my lab and I call a reciprocity mindset.
01:41:39.820 | That is understanding that yes,
01:41:43.460 | people vary in how trustworthy they are,
01:41:45.920 | but what you do also matters.
01:41:48.060 | Research finds that when you trust people,
01:41:52.100 | they're more likely to become trustworthy
01:41:54.680 | because they wanna reciprocate.
01:41:56.400 | You've honored them in this small way,
01:41:58.080 | and so they step up.
01:41:59.420 | It's known as earned trust in economics.
01:42:02.440 | And when you mistrust people, they become less trustworthy.
01:42:06.280 | So in my lab, we found that when you teach people this,
01:42:11.000 | when you teach people to own the influence
01:42:13.860 | that they have on others,
01:42:15.760 | they're more willing to be trusting.
01:42:17.940 | And when you're more trusting,
01:42:19.080 | then of course the other person reciprocates,
01:42:21.860 | which again, turns into this positive cycle.
01:42:24.020 | So I try, when I make a decision as to whether or not
01:42:27.220 | I'm gonna trust somebody,
01:42:28.820 | I think the default is to say,
01:42:30.060 | whoa, I'm taking on this risk.
01:42:31.960 | Is this a good choice for me?
01:42:33.540 | And I try to rotate that a little bit and say,
01:42:36.840 | what am I doing for the relationship here?
01:42:39.480 | Is this act of trust maybe a gift to this other person?
01:42:43.060 | How can it positively influence who they become
01:42:45.880 | in the course of this interaction?
01:42:48.260 | And then a third thing on the sort of mindset side,
01:42:51.000 | and then we can get to some behaviors,
01:42:53.320 | is what I call social savoring.
01:42:57.260 | I do this a lot with my kids, actually.
01:42:59.260 | Savoring is a general term for appreciating good things
01:43:03.660 | while they happen.
01:43:04.740 | It's related to gratitude,
01:43:06.080 | but gratitude is more appreciating the things
01:43:08.700 | that have happened to us in the past that are good.
01:43:11.080 | Savoring is, let's grab this moment right now
01:43:13.300 | and think about it.
01:43:15.040 | So my kids and I started savoring practices
01:43:19.340 | a couple of years ago.
01:43:20.420 | I call it classes.
01:43:21.540 | So I'll say, today we're gonna do an ice cream eating class,
01:43:25.100 | or we're gonna do a sunset watching class.
01:43:27.620 | - Cool, I wanna, are you adopting children?
01:43:29.460 | (laughing)
01:43:30.900 | - Applications are coming in now.
01:43:33.100 | We're evaluating them on a rolling basis.
01:43:35.340 | - I've already graduated college.
01:43:36.660 | (laughing)
01:43:38.700 | - But so we'll just sit there, you know,
01:43:40.980 | and eat ice cream slowly, you know, not so that it melts,
01:43:44.320 | but we'll say, you know, what are you enjoying about this?
01:43:46.660 | Is it the texture?
01:43:47.500 | Is it the flavor?
01:43:48.700 | What do you wanna remember about this moment?
01:43:51.060 | And I noticed more recently while working on this book
01:43:54.100 | that all of this was sensory.
01:43:56.900 | Sunsets, somersaults, ice cream, you name it.
01:44:00.300 | But it wasn't very social.
01:44:02.340 | And what they were hearing from me about other people
01:44:05.340 | was negatively skewed
01:44:06.780 | because gossip is negatively skewed, right?
01:44:09.120 | If somebody cut me off in traffic
01:44:10.820 | while I'm driving them to summer camp,
01:44:12.500 | they learn all about that person,
01:44:14.460 | but they don't learn about the people
01:44:16.220 | who are politely following traffic laws all around us,
01:44:19.400 | right, which is 90 plus percent of drivers.
01:44:22.860 | And so I started a practice of social savoring
01:44:25.580 | where I try to share with my kids
01:44:28.820 | positive things that I notice about other people.
01:44:31.220 | You could call it positive gossip as well.
01:44:33.820 | And one thing that I noticed
01:44:35.500 | is that that habit of savoring for them
01:44:39.500 | changed my mental processing, right?
01:44:43.580 | It actually changed what I noticed
01:44:45.740 | because of course,
01:44:46.580 | if you're trying to tell somebody about something,
01:44:49.700 | you look for examples that you can tell them about.
01:44:52.400 | So a habit of action, of speech in that case,
01:44:56.020 | became a habit of mind.
01:44:57.700 | So those three things,
01:45:00.020 | being skeptical of my cynicism,
01:45:02.260 | adopting a reciprocity mindset and social savoring,
01:45:04.940 | those are three of the psychological pieces.
01:45:08.560 | And I can get to some actions,
01:45:10.140 | but yeah, I wonder what you think of these.
01:45:12.740 | - Oh, I love those three.
01:45:13.900 | And I love the distinguishing features
01:45:17.860 | of savoring versus gratitude
01:45:19.580 | because there's so much data to support gratitude practices.
01:45:23.600 | And I don't think I've ever heard
01:45:26.000 | those two distinguished from one another,
01:45:27.820 | but clearly savoring things is
01:45:31.480 | equally powerful for our neurochemistry
01:45:34.500 | and our wellbeing.
01:45:35.340 | And I love that you include both sensory
01:45:38.640 | and interpersonal aspects to this.
01:45:40.640 | These are highly actionable
01:45:41.960 | and I'm sure people are as excited about them as I am
01:45:44.860 | because all this knowledge from the laboratory
01:45:48.500 | is indeed wonderful.
01:45:49.880 | But of course we always want to know what can we do
01:45:53.000 | now that you've made such a strong case
01:45:54.540 | for tuning down our cynicism a little bit
01:45:57.800 | in order to make ourselves smarter, better, happier
01:46:00.960 | and in touch with awe on a more regular basis.
01:46:05.020 | Would love to hear about some of the actions
01:46:06.560 | one can take as well.
01:46:07.800 | - Yeah, so if you imagine the mindset shifts
01:46:11.920 | that I've talked about as thinking more like a scientist
01:46:15.380 | about the social world,
01:46:17.060 | then the second step to me is to act more like a scientist
01:46:20.300 | in the social world.
01:46:22.280 | The nun and author, Pema Chodron,
01:46:24.500 | this great, great writer.
01:46:26.300 | - That's wonderful.
01:46:27.300 | - Has written beautifully
01:46:29.520 | about treating your life like an experiment.
01:46:32.200 | You know, in this moment, you could interrupt the defaults.
01:46:37.000 | You could interrupt the patterns
01:46:39.020 | and look around more carefully.
01:46:41.220 | And I try to do that.
01:46:42.740 | And I encourage other people to do that as well.
01:46:45.320 | You know, one form of this is what I call
01:46:47.860 | taking leaps of faith on other people, right?
01:46:50.340 | Collecting more social data requires risk.
01:46:53.560 | So I try to do that.
01:46:54.940 | I try to take more risks,
01:46:56.220 | become less risk averse in a social context.
01:46:58.820 | Now, this is not to say, you know,
01:47:01.020 | that I share my bank information with a prince
01:47:03.100 | who's gonna wire me $14 million, right?
01:47:05.980 | You need to be calculated.
01:47:07.220 | You need to be smart and safe in the risks that you take.
01:47:12.060 | But I would argue that many of us
01:47:13.980 | are far too risk averse in the social world.
01:47:17.100 | And there are lots of ways that I try to do this
01:47:19.960 | and lots of ways that people can do this.
01:47:22.420 | One is to just be more open to the social world.
01:47:26.360 | I'm an introvert.
01:47:27.420 | Andrew, I think you've said you're an introvert as well.
01:47:29.460 | Is that true?
01:47:30.300 | - I am.
01:47:31.140 | - Yeah, and so as introverts,
01:47:32.980 | we tend to think that the social world is maybe tiring
01:47:36.740 | and we need to recharge on our own.
01:47:38.940 | It's completely valid.
01:47:39.860 | I experience that all the time.
01:47:41.840 | I think that sometimes my introversion
01:47:44.040 | morphs into something else
01:47:45.820 | where I underestimate the joy of social contact.
01:47:49.620 | You know, there's so many times that before a dinner party,
01:47:52.820 | I would pay an embarrassing amount of money
01:47:55.800 | for the other party to cancel on me.
01:47:58.080 | I don't wanna be the person to cancel,
01:47:59.740 | but I would feel so relieved if they canceled.
01:48:02.460 | But then while I'm there and afterwards,
01:48:05.540 | I feel totally fulfilled by the experience.
01:48:08.260 | It's a little bit like running.
01:48:09.740 | Running is another thing that I love,
01:48:11.180 | but there are many times that before a run,
01:48:13.660 | I think, gosh, I really don't wanna do this.
01:48:16.320 | And then afterwards, I'm so grateful to have done so.
01:48:19.540 | There's a bunch of research that finds
01:48:21.180 | that people in general are like this.
01:48:23.900 | If you ask them to forecast what it would be like
01:48:26.340 | to talk with a stranger,
01:48:28.460 | to open up about a problem
01:48:30.580 | that they're having with a friend,
01:48:32.000 | to express gratitude, to try to help somebody,
01:48:35.120 | even to have a disagreement on ideological grounds,
01:48:38.980 | people forecast that these conversations would be awful,
01:48:43.180 | awkward, cringe, painful,
01:48:46.900 | and in the case of disagreement, harmful even.
01:48:50.780 | This is work from Nick Epley, Juliana Schroeder,
01:48:53.500 | and many others, by the way,
01:48:54.540 | on something known as under-sociality.
01:48:57.300 | And because we have these forecasts,
01:48:59.220 | we simply don't pursue the conversations.
01:49:02.060 | We don't go deeper.
01:49:03.300 | We stay on the surface.
01:49:05.660 | Nick, Juliana, and others then challenge people.
01:49:08.420 | They say, "Go and do this, have this conversation,
01:49:11.100 | "and then report back."
01:49:12.380 | And people's actual experiences are vastly more positive
01:49:16.340 | and more fulfilling than their forecasts.
01:49:19.200 | So I try to remember this in my own life.
01:49:21.620 | I try to realize when my forecasts are too risk-averse
01:49:25.500 | and too negative and say, "Let me just jump in.
01:49:28.380 | "Let me take this chance.
01:49:30.040 | "If it goes badly, well, fine.
01:49:32.060 | "And if it goes well, even better."
01:49:35.220 | The second piece here, though,
01:49:36.340 | is not just to take those risks,
01:49:38.400 | but to document their effects.
01:49:40.300 | I call this encounter counting.
01:49:44.500 | So in essence, gathering new data from the world is great,
01:49:48.660 | but if you forget those data,
01:49:50.600 | well, then the effects might be short-lived.
01:49:53.060 | I try to really remember when a social encounter
01:49:56.340 | is a mismatch with my expectations.
01:49:58.540 | I have a relative who, for instance,
01:50:01.720 | I disagree with politically quite a bit.
01:50:03.780 | And when I was working on this book, I said,
01:50:06.220 | "Let me take a chance.
01:50:07.340 | "We've known each other for 30 years.
01:50:09.800 | "We've never talked politics.
01:50:11.740 | "Let me try."
01:50:13.620 | And so I invited her to have this conversation
01:50:15.900 | about an issue we really disagree on.
01:50:17.460 | And we did not agree by the end of the conversation,
01:50:20.400 | but it was an immensely deep and meaningful conversation,
01:50:24.180 | and I actually felt like I knew her better,
01:50:26.620 | even though we've been close for decades.
01:50:29.340 | And I could just say, "Well, that was nice,"
01:50:31.540 | and then forget all about it
01:50:32.660 | and imagine that any future conversations
01:50:34.660 | on disagreement would be terrible.
01:50:36.560 | But I tried to write down in my journal
01:50:39.900 | sort of this is what happened,
01:50:41.420 | this is how it counteracted my expectations,
01:50:43.740 | try to lock in that learning from the social world
01:50:47.860 | so that pleasant surprises
01:50:49.660 | hopefully aren't as surprising anymore.
01:50:51.660 | - I love those practices.
01:50:55.120 | And thank you for reinforcing the process
01:50:58.300 | of reinforcing the experiences,
01:51:00.500 | because many times I'll be listening to an audio book
01:51:02.920 | or I'll think of something when I'm running
01:51:04.220 | and I'll put it into my voice memos or notes in my phone,
01:51:07.780 | and then I move them to this very notebook
01:51:09.700 | or another similar to it, and I'll go back and read it.
01:51:12.460 | But many things don't get passed through the filters
01:51:16.360 | and I forget them because I didn't do that.
01:51:20.040 | And we know this is one of the best ways
01:51:21.460 | to solidify information is to think about experiences
01:51:24.180 | and information after being exposed to it.
01:51:27.020 | This is true studying, this is true clearly
01:51:29.220 | for emotional learning and our own personal evolution.
01:51:32.700 | - Which brings me to another example of somebody
01:51:36.300 | from the, I don't know what to call them,
01:51:37.500 | is it sort of philosophy, wellness, self-help space,
01:51:40.300 | you mentioned Pema Chodron, wonderful writer.
01:51:45.300 | There's someone else more or less in that space,
01:51:48.540 | Byron Katie, who a lot of her work
01:51:51.060 | is about challenging beliefs by simply asking questions
01:51:54.020 | about our core beliefs.
01:51:55.820 | This is something that I've started to explore a bit,
01:51:58.280 | like one could have the idea that good people always,
01:52:03.280 | I don't know, show up on time,
01:52:05.360 | and wouldn't we all love to be punctual?
01:52:07.400 | And as an academic, I confess,
01:52:09.940 | for me, everything starts 10 minutes after the hour,
01:52:11.900 | so we're consistently on time but late, right?
01:52:14.640 | So the non-academics:
01:52:16.240 | My friends from the military have a saying,
01:52:18.400 | which is five minutes early is on time, on time is late,
01:52:22.240 | and if you're late, you better bring lunch,
01:52:25.280 | so that kind of thing.
01:52:26.220 | In any event, the practice that she promotes,
01:52:30.740 | in essence, is to take a core belief
01:52:35.240 | and then just start challenging it
01:52:36.560 | from a number of different directions.
01:52:38.000 | Is that always true?
01:52:39.280 | Are there cases where that's not true?
01:52:40.640 | What would that look like, et cetera,
01:52:42.240 | as a way to really deconstruct one's own core beliefs,
01:52:44.880 | which is, I think, a bit of what you're talking about,
01:52:47.040 | and I feel like this could go in at least two directions.
01:52:50.800 | You can have a core belief
01:52:51.680 | that leads in the direction of cynicism,
01:52:53.740 | that you can deconstruct by just simply asking questions.
01:52:56.760 | Is that always true?
01:52:59.320 | Are there ever instances where that's not true?
01:53:01.640 | And what would it mean if that weren't true
01:53:04.120 | in a given instance, this sort of thing?
01:53:05.520 | And then on the other side,
01:53:07.000 | where we tend to err toward hopeful skepticism
01:53:10.360 | as opposed to cynicism, there too,
01:53:13.200 | I could imagine it would be useful
01:53:15.040 | to explore hopeful skepticism also as a scientist, right?
01:53:18.920 | Are there cases where hopeful skepticism,
01:53:20.680 | here, I'm gonna be cynical,
01:53:21.780 | can really get us into trouble, for instance?
01:53:24.900 | Anyway, obviously, I haven't run a study on this
01:53:28.420 | just because I came up with this example on the fly,
01:53:30.880 | but does what I just described fit more or less
01:53:33.180 | into the framework that you're describing?
01:53:34.660 | - Absolutely.
01:53:35.500 | I think that it's, in essence,
01:53:37.060 | being skeptical about our beliefs,
01:53:39.660 | putting them through their paces, right?
01:53:41.500 | Kicking the tires on our own beliefs.
01:53:43.660 | And again, this reminds me
01:53:45.100 | of cognitive behavioral therapy, right?
01:53:47.140 | A person who's socially anxious might tell their therapist,
01:53:51.060 | "I think all my friends secretly hate me."
01:53:54.040 | They might believe that to their core.
01:53:55.880 | It might affect every decision that they make.
01:53:58.120 | And the therapist might challenge them and say,
01:53:59.540 | "Well, wait, what's the evidence that you have for that?
01:54:02.540 | "Are there any instances in your entire life
01:54:04.700 | "where that seemed to not be true?"
01:54:06.660 | And to your point from Byron Katie,
01:54:08.540 | what would it mean if it weren't true?
01:54:10.720 | So this is the bedrock of one of the most successful
01:54:14.560 | forms of therapy for depression and anxiety
01:54:16.680 | and phobia in the world.
01:54:19.120 | You know, I do wanna also, I guess,
01:54:22.260 | zoom in on something that you're sharing there
01:54:24.260 | about our core beliefs.
01:54:26.080 | 'Cause I think that in addition to testing our core beliefs,
01:54:29.100 | one thing that I wish we would do more
01:54:31.000 | is share our core beliefs.
01:54:32.980 | Because I don't think we know
01:54:34.840 | what each other's core beliefs are.
01:54:36.560 | And I think oftentimes, we think that we are more alone
01:54:39.980 | in our core beliefs than we actually are.
01:54:42.940 | So this is true in our politics, for instance,
01:54:46.140 | like the amount of people on,
01:54:48.780 | from every part of the political spectrum
01:54:50.840 | who want more compromise, more peace, and less conflict
01:54:55.840 | is north of 80% in surveys that my lab has conducted.
01:55:00.420 | But people don't know that.
01:55:02.480 | And so the lack of evidence, the lack of data
01:55:05.340 | about what other people want is a hindrance
01:55:08.140 | to the goals that we actually all share.
01:55:11.220 | This is also true in workplaces.
01:55:13.980 | So in the course of my work,
01:55:16.340 | I've done sort of some different projects
01:55:19.740 | with school systems, hospital systems, businesses.
01:55:23.180 | And one of the things I love doing
01:55:24.940 | is starting with an anonymous survey
01:55:26.580 | of everybody in the community.
01:55:28.220 | And I ask, how much do you value empathy and collaboration?
01:55:33.000 | How much would you prefer a workplace or community
01:55:35.820 | defined by cooperation versus competition?
01:55:40.220 | And invariably, and I'm talking about some places
01:55:43.500 | where you might imagine people would be competitive,
01:55:45.760 | invariably, a super majority of individuals
01:55:49.260 | in those communities want compassion,
01:55:52.580 | cooperation, and collaboration, right?
01:55:56.720 | Much more than they want competition or isolation.
01:56:00.580 | So one of the things that I love to do
01:56:01.900 | when I speak for those groups is to say,
01:56:03.500 | hey, look, here's some data, look around you.
01:56:06.820 | Here you've got 90% of people in this organization
01:56:10.340 | who want more cooperation.
01:56:12.220 | So if you just take a look in your periphery,
01:56:15.420 | almost everybody around you wants that as well.
01:56:18.360 | I also survey these communities and say,
01:56:20.660 | what do you think the average person
01:56:22.220 | would respond to these questions?
01:56:23.320 | And invariably, they're wrong.
01:56:25.380 | And so I say, you have underestimated each other,
01:56:28.740 | and now I'm giving you permission to stop.
01:56:30.840 | And I think this is one of the other actions
01:56:34.380 | that we can take if we're in a leadership position anywhere.
01:56:38.180 | Right, I think that looking for more data is great.
01:56:41.220 | If you're a leader, you can collect those data
01:56:43.580 | and you can show people to themselves.
01:56:46.300 | You can unveil the core beliefs of your community.
01:56:50.540 | And oftentimes those core beliefs are incredibly beautiful
01:56:55.220 | and surprising to the people in those communities
01:56:59.220 | and give them what I would call not peer pressure,
01:57:02.200 | but peer permission to express who they've been all along.
01:57:06.500 | - I love that.
01:57:07.340 | And one of the things that we've done on this podcast
01:57:09.540 | is to always invite comments and questions,
01:57:12.980 | critique and so forth in the comment section on YouTube.
01:57:17.980 | And I always say, and I do read all the comments,
01:57:20.580 | and sometimes it takes me a while
01:57:21.860 | and I'm still sifting through them.
01:57:22.940 | But I think comment sections can be,
01:57:25.860 | yes, they can be toxic in certain environments,
01:57:27.860 | in certain contexts,
01:57:28.980 | but they can also be tremendously enriching,
01:57:32.340 | not just for the reader, but for the commenter.
01:57:36.140 | And to see what people's core beliefs are really about.
01:57:39.900 | Now, oftentimes comments are of a different form
01:57:43.180 | and that's okay, that's all right.
01:57:45.340 | But I think that because of the anonymity involved,
01:57:49.300 | I think I can see that now through the lens
01:57:51.660 | of what you're saying as a license
01:57:53.820 | for people to really share their core beliefs
01:57:55.460 | about something that can be really informative
01:57:58.300 | and really enriching.
01:57:59.660 | Although I much prefer, I confess,
01:58:01.780 | the model that you're presenting
01:58:04.220 | where people are doing this in real time face-to-face
01:58:06.860 | as opposed to just online.
01:58:08.180 | As long as we're talking about polarization
01:58:12.980 | and the wish for less polarization,
01:58:14.960 | what are the data saying about the current state of affairs?
01:58:19.500 | We're recording this, you know, about what,
01:58:21.940 | three months or so out from an election
01:58:23.700 | or 90 some days or so from an election,
01:58:27.220 | presidential election.
01:58:28.940 | So without getting into discussions
01:58:30.580 | about political camps per se,
01:58:33.620 | what do your data and understanding about cynicism
01:58:37.340 | and hopeful skepticism tell us about that whole process
01:58:42.340 | and how the two camps are presenting themselves?
01:58:47.620 | - There is so much to say about this.
01:58:49.060 | I'm gonna try to not give a lecture here,
01:58:51.820 | but like so many of the themes in this conversation,
01:58:56.820 | I think that the headline for me
01:58:59.780 | when I look at the data on polarization,
01:59:02.200 | and I'm gonna talk about perceived polarization as well,
01:59:05.420 | is twofold.
01:59:07.220 | One, it's tragic because we are underestimating one another.
01:59:12.220 | And two, there's a lot of opportunity here
01:59:15.180 | because the delta between the world that we think we're in
01:59:17.860 | and the one that we're actually in is great,
01:59:20.180 | and it's positive as well.
01:59:22.420 | So there's a bunch of work on political perceptions.
01:59:25.340 | This is work done by folks like Mina Cikara at Harvard,
01:59:29.340 | my colleague Rob Willer in sociology at Stanford,
01:59:32.200 | our colleague, Rob Willer.
01:59:33.640 | And a lot of this focuses on what people think
01:59:38.640 | the average member of the other side is like.
01:59:44.000 | So if you're a Republican,
01:59:45.400 | what do you think the average Democrat believes?
01:59:48.000 | What do you think they're like?
01:59:48.840 | If you're a Democrat,
01:59:49.800 | what do you think the average Republican is like?
01:59:52.440 | And so I'll stop talking about Republicans
01:59:54.480 | and Democrats here because a lot of these data
01:59:56.480 | are bipartisan, the biases are pretty even across camps.
02:00:01.120 | And it turns out that in all cases,
02:00:03.560 | we are dead wrong about who's on the other side.
02:00:06.960 | We're even wrong demographically
02:00:08.680 | about who's on the other side.
02:00:10.600 | For instance, Democrats think that 25% of Republicans
02:00:15.600 | make more than $250,000 a year.
02:00:18.560 | The actual number is 2%.
02:00:21.900 | But the stereotype of Republicans that Democrats hold
02:00:25.020 | is that they're wealthy, I suppose.
02:00:27.240 | Republicans vastly overestimate the percentage of Democrats
02:00:31.080 | who are part of the LGBTQ community, for instance.
02:00:34.040 | Again, it's just a cultural stereotype.
02:00:36.640 | So we're wrong about even who's on the other side,
02:00:40.480 | but we're even more wrong about what they believe
02:00:42.680 | and what they want.
02:00:44.120 | So data suggests that there is perceived polarization,
02:00:47.680 | that is what we think the other side believes,
02:00:50.880 | is much greater than real polarization.
02:00:53.520 | I mean, first of all, we are divided, let's stipulate that.
02:00:56.160 | And those divisions can be really dangerous
02:00:58.740 | and are in some cases existential.
02:01:02.580 | But the division in our mind is much greater
02:01:06.500 | than the division that we actually have.
02:01:08.820 | My late friend, Emile Bruneau, collected some data
02:01:11.860 | where he gathered Republicans and Democrats' views
02:01:14.860 | on immigration.
02:01:15.820 | He said, what would you want immigration to look like
02:01:18.500 | where zero is the borders are totally closed
02:01:21.980 | and 100 is they're totally open?
02:01:24.000 | And he plotted the distributions of what that looks like.
02:01:27.580 | He also asked people on either side,
02:01:30.180 | what do you think the other side would respond
02:01:33.060 | if asked that same question?
02:01:34.940 | And he plotted those distributions as well.
02:01:36.860 | - Other side meaning which group?
02:01:38.400 | - If you're a Democrat,
02:01:39.240 | what do you think Republicans would want?
02:01:40.500 | And if you're a Republican, what would Democrats want?
02:01:43.580 | And the distributions are totally different.
02:01:46.180 | The distributions of our actual preferences
02:01:48.780 | are like a hill with two peaks, right?
02:01:51.180 | So Republicans want more closed borders,
02:01:53.820 | Democrats want them more open,
02:01:55.420 | but they're not that far apart, first of all, the means,
02:01:57.860 | and there's a lot of overlap in the distributions.
02:02:01.040 | The distributions of our perceptions are two hills
02:02:04.500 | on opposite sides of a landscape.
02:02:07.460 | Republicans think that Democrats want totally open borders
02:02:11.180 | and Democrats think Republicans want totally closed borders.
02:02:14.100 | And the same pattern plays out for all sorts of issues
02:02:18.340 | where we think the other side is much more extreme,
02:02:21.580 | we think the average member of the other side
02:02:23.740 | is much more extreme than they really are.
02:02:26.140 | There's also work on meta-perceptions.
02:02:28.260 | What do you think the other side thinks about you?
02:02:32.220 | And it turns out that people on both sides
02:02:34.580 | imagine that their rivals hate them
02:02:37.860 | twice as much as their rivals really do.
02:02:40.940 | There's work on democratic norms
02:02:42.460 | that my grad student Louisa Santos collected,
02:02:45.820 | where we overestimate how anti-democratic
02:02:48.300 | the other side is by two times.
02:02:51.160 | And Rob has collected data on violence.
02:02:53.780 | How much do you think the other side
02:02:55.100 | would support violence to advance their aims?
02:02:57.780 | And here, the overestimates are 400%.
02:03:01.260 | So we think that the average person on the other side
02:03:03.860 | is four times as enthusiastic about violence
02:03:07.700 | as they really are.
02:03:09.360 | We have an image in our mind of the other
02:03:13.100 | as violent extremists who want to burn down the system.
02:03:18.100 | And again, we've talked about the warped media ecosystem
02:03:21.500 | that we're in, and that probably contributes here.
02:03:23.780 | But the fact is that those misperceptions
02:03:26.580 | are making all the problems that we fear worse.
02:03:29.420 | Because if you think that the other side
02:03:31.060 | is gearing up for war, what do you do?
02:03:33.220 | You have to defend yourself.
02:03:35.460 | And so we're caught in this almost cycle of escalation
02:03:39.940 | that really very few of us want.
02:03:42.940 | Now, I want to be really clear here
02:03:44.700 | that I'm not saying that we don't have actual disagreements.
02:03:48.140 | I'm also not saying that people across
02:03:52.380 | our political spectrum are all peaceable and all kind.
02:03:55.860 | There are absolutely extreme and violent people
02:03:59.540 | around our country that represent their political views
02:04:02.500 | in horrible and toxic ways.
02:04:04.440 | But that's not the average.
02:04:06.580 | And again, I want to get back to this point
02:04:08.220 | that the average person underestimates the average person.
02:04:11.560 | Not that we underestimate everybody,
02:04:14.020 | but that we're wrong about most people.
02:04:16.740 | And so again, to me, this is a tragedy and an opportunity.
02:04:21.100 | Rob and Mina and lots of other people find
02:04:24.020 | that when you ask people to actually pay attention
02:04:27.520 | to the data, when you show them,
02:04:29.100 | "Hey, actually, the other side fears violence
02:04:32.540 | "just as much as you do."
02:04:34.460 | When you show them that actually the other side
02:04:36.260 | is terrified of losing our democracy.
02:04:38.900 | When you show them that the other side
02:04:40.140 | doesn't actually hate you, that mitigates,
02:04:43.140 | that pulls back all of these escalatory impulses.
02:04:46.760 | In essence, you can decrease the threat
02:04:49.180 | that people feel from the other side
02:04:51.020 | by showing them who the other side really is.
02:04:54.320 | I understand this is such a massive
02:04:56.300 | and toxic sort of environment that we're in.
02:04:59.080 | I'm not saying that hopeful skepticism
02:05:01.220 | will solve our divided political landscape,
02:05:06.220 | will solve our problems.
02:05:08.000 | But I do think it's worth noting how wrong we are
02:05:11.540 | and that being a little bit less wrong
02:05:13.900 | can at least open a door, maybe let our minds wander
02:05:17.700 | towards a place of greater compromise and peace,
02:05:20.580 | which is what most people actually want.
02:05:23.140 | - Wow.
02:05:25.360 | I say that for several reasons.
02:05:27.860 | First of all, I've never heard the landscape
02:05:31.340 | described that way.
02:05:32.700 | And I confess, I didn't know that the landscape
02:05:35.060 | was as close to the center as it turns out it is.
02:05:40.060 | I have also many theories about how media and social media
02:05:47.760 | and podcasts for that matter might be contributing
02:05:50.560 | to this perceived polarization as opposed to the reality.
02:05:55.560 | And there's certainly a lot to explore
02:05:58.560 | in terms of what we can each and all do
02:06:01.520 | to remedy our understanding of what's going on out there.
02:06:05.700 | As a consequence, I'll ask,
02:06:07.460 | can some of the same tools that you described
02:06:10.140 | to better interact with one's own children,
02:06:13.400 | with one's own self, with other individuals
02:06:16.460 | and in small groups be used to sort of defragment
02:06:21.460 | some of the cynicism circuitry that exists in us
02:06:25.260 | around this polarized, excuse me,
02:06:28.220 | perceived highly polarized political landscape?
02:06:32.160 | - I love that clarification.
02:06:33.560 | Yeah, absolutely.
02:06:34.660 | I think that the answer is yes.
02:06:38.760 | There is lots of evidence that we are actively avoiding
02:06:42.320 | having conversations in part because of who we think
02:06:45.760 | the other side is.
02:06:47.120 | There is an amazing study that was conducted
02:06:49.600 | during Thanksgiving of 2016, which as you may recall,
02:06:53.940 | was directly after a very polarizing election
02:06:58.940 | and researchers used geo-tracking on people's cell phones
02:07:04.060 | to examine whether in order to go to Thanksgiving dinner,
02:07:07.000 | they crossed between a blue county into a red county
02:07:12.000 | or a red county into a blue county.
02:07:14.560 | In other words, are they going into,
02:07:16.320 | and I'm using air quotes here, quote unquote,
02:07:18.520 | enemy territory for Thanksgiving dinner.
02:07:21.120 | And they used that as a proxy of whether
02:07:23.620 | they're having dinner with people they disagree with.
02:07:26.120 | And it turns out that people who crossed county lines,
02:07:29.440 | who crossed into enemy territory, again in quotes,
02:07:31.960 | this is perceived polarization,
02:07:33.560 | they had dinners that were 50 minutes shorter
02:07:37.680 | than people who were dining with folks
02:07:39.960 | who presumably they agreed with.
02:07:42.640 | So we're talking about forsaking pie, Andrew.
02:07:46.480 | They're giving up pie in order to not talk
02:07:49.400 | with people they disagree with.
02:07:50.800 | And I think a lot of us are very skittish
02:07:52.660 | about these conversations because if you believe
02:07:55.580 | that the other side is a bunch of bloodthirsty marauders,
02:07:59.920 | why would you want to talk with them?
02:08:02.240 | Why have a beer with a fascist?
02:08:04.400 | That's just not a great plan.
02:08:06.500 | The truth though is that when we can collect better data,
02:08:12.160 | oftentimes we end up with better perceptions.
02:08:16.440 | And I mean better in two ways,
02:08:18.960 | one more positive and two more accurate.
02:08:22.040 | Now again, I want to say that there are real threats
02:08:24.760 | in our political environment.
02:08:25.760 | I'm not asking anybody to make themselves unsafe in any way.
02:08:29.920 | But in our lab, again, my wonderful graduate student,
02:08:34.080 | Louisa Santos, ran a study where we had about 160 people,
02:08:37.720 | these are folks from all over the country,
02:08:39.840 | who took part in Zoom conversations.
02:08:43.280 | We made sure that they really disagreed
02:08:45.200 | about gun control, immigration, and climate change,
02:08:48.960 | and they talked about those issues.
02:08:51.640 | We asked them to forecast
02:08:53.440 | what those conversations would be like,
02:08:55.040 | and we asked other folks to forecast
02:08:57.220 | what those conversations would be like.
02:08:59.360 | And the forecasts went from neutral to negative.
02:09:02.100 | Some people thought it won't make any difference,
02:09:04.800 | and other people thought it will be counterproductive.
02:09:07.520 | Some folks in our survey said dialogue is dead,
02:09:10.240 | there's no point in any of these conversations.
02:09:13.140 | We then brought these folks together.
02:09:15.800 | Oh, and I should say, among the people who were cynical
02:09:18.800 | about these conversations and who forecasted
02:09:20.760 | that they would go poorly, was us, the research team.
02:09:24.440 | Louisa and I spent hours talking about,
02:09:26.520 | what if people start to threaten each other,
02:09:28.400 | or dox each other, or look up each other's addresses.
02:09:32.180 | You know, Andrew, that we have institutional review boards
02:09:34.800 | that make sure that we're keeping human subjects safe,
02:09:37.080 | and the IRB wanted all sorts of safeguards in place,
02:09:40.840 | because we all thought that these conversations
02:09:43.760 | might go really poorly.
02:09:46.120 | After the conversations occurred,
02:09:48.100 | we asked folks who had taken part of them
02:09:50.520 | to rate how positive they were on a one to 100 scale.
02:09:54.760 | And the most common, the modal response
02:09:57.760 | that people gave us was 100 out of 100.
02:10:00.620 | And it wasn't just that they liked the conversation,
02:10:04.360 | they were shocked by how much they liked the conversation.
02:10:07.600 | They also reported less negative emotion
02:10:11.120 | for the other side as a whole,
02:10:13.180 | not just for the person that they talked with,
02:10:15.520 | and they reported more intellectual humility,
02:10:18.640 | more openness to questioning their own views.
02:10:22.120 | So here are conversations that we as a culture
02:10:24.700 | are actively avoiding because of our priors.
02:10:28.640 | Our priors are wrong given the data,
02:10:30.420 | but we don't know that, and we don't give ourselves chances
02:10:33.440 | to learn that we're wrong, because we don't collect the data.
02:10:37.100 | And when we do collect the data,
02:10:38.400 | when we step in and take that leap of faith,
02:10:40.960 | take that social risk, we are shocked and humbled,
02:10:46.380 | and feel more positive, and maybe even feel
02:10:49.620 | a slightly greater sense of hope
02:10:51.740 | that there can be some way out of this toxic environment
02:10:55.300 | that we're all trapped in.
02:10:57.260 | - Well, Jamil, Dr. Zaki,
02:11:01.140 | thank you so much for sharing your incredible,
02:11:06.140 | like what can only be described as wisdom
02:11:08.860 | into this area of humanity, right?
02:11:12.980 | I mean, to be a cynic is one potential aspect
02:11:17.700 | of being human, but you've made very clear
02:11:21.140 | that we have control.
02:11:22.780 | There is plasticity over this aspect of ourselves.
02:11:25.460 | If we adopt the right mindsets, apply the right practices,
02:11:29.540 | and it's so clear based on everything you've shared today
02:11:33.820 | that humans are operating rationally,
02:11:38.580 | and yet irrationally at the same time.
02:11:40.500 | I'm certainly not the first to say that,
02:11:42.620 | but in the context of cynicism,
02:11:44.260 | and in the context of being happier individuals,
02:11:46.620 | and families, and couples, and groups,
02:11:49.500 | that to really take a hard look at how cynical we are,
02:11:53.900 | and to start to make even minor inroads into that
02:11:58.620 | through belief testing.
02:12:00.100 | You know, I wrote down as we were talking
02:12:01.940 | that what I really feel you're encouraging us to do,
02:12:04.940 | correct me if I'm wrong, is to do both internal
02:12:08.620 | and external reality testing in an effort
02:12:11.340 | to move us away from internal and external polarization.
02:12:15.660 | And I can't think of any higher calling than that.
02:12:20.100 | And you're giving us the tools,
02:12:23.340 | and those tools are supported by data.
02:12:25.820 | These aren't just ideas, they are data-supported ideas.
02:12:29.500 | And I just want to thank you for your incredible generosity
02:12:33.100 | in coming here today to talk about those ideas.
02:12:35.500 | Your book is phenomenal.
02:12:36.940 | I already learned so much from it,
02:12:38.700 | and I highly encourage people to read it.
02:12:40.900 | And what you've shared with us today is phenomenal.
02:12:43.820 | And I do hope to have you back again
02:12:46.020 | to talk about another topic that you are expert in,
02:12:49.500 | which is empathy, but we'll have to all wait
02:12:52.860 | with bated breath for that, myself included.
02:12:55.380 | So once again, I just want to thank you for your time,
02:12:57.540 | the incredible work that you're doing,
02:12:59.620 | and the evolution that you're taking us on.
02:13:04.220 | So on behalf of myself and everyone listening and watching,
02:13:07.500 | thank you ever so much.
02:13:08.980 | - Andrew, this has been an absolutely delightful conversation.
02:13:11.860 | And I will say my forecast of it was very high,
02:13:15.660 | and it has exceeded that forecast.
02:13:19.180 | I also just want to take a moment to thank you
02:13:21.740 | for your work as a science communicator.
02:13:24.220 | As somebody who believes in not just trying
02:13:28.700 | to generate knowledge, but also to share knowledge,
02:13:32.620 | I think that it's absolutely one
02:13:35.420 | of the most important services that we can do
02:13:37.700 | as folks who have been trained and learned all this stuff
02:13:41.060 | to bring that information to as many people as we can.
02:13:44.260 | And I think it's just, it's an incredible mission
02:13:47.020 | and clearly has had such wonderful impact.
02:13:49.220 | So it's an honor to be part of that conversation
02:13:52.100 | and to be part of that effort.
02:13:54.080 | - Oh, well, thank you.
02:13:54.920 | I'll take that in.
02:13:55.760 | And it's a labor of love and an honor and a privilege
02:13:59.280 | to sit here today with you.
02:14:00.860 | So thank you ever so much.
02:14:02.020 | And please do come back again.
02:14:03.520 | - I would love that.
02:14:04.980 | - Thank you for joining me for today's discussion
02:14:06.900 | with Dr. Jamil Zaki.
02:14:08.520 | To learn more about his work
02:14:09.740 | and to find a link to his new book, "Hope for Cynics,"
02:14:12.560 | please see the links in the show note captions.
02:14:15.000 | If you're learning from and or enjoying this podcast,
02:14:17.500 | please subscribe to our YouTube channel.
02:14:19.180 | That's a terrific zero cost way to support us.
02:14:21.660 | Another terrific zero cost way to support us
02:14:23.580 | is to follow the podcast on both Spotify and Apple.
02:14:26.500 | And on both Spotify and Apple,
02:14:27.940 | you can leave us up to a five-star review.
02:14:30.340 | Please check out the sponsors mentioned
02:14:32.060 | at the beginning and throughout today's episode.
02:14:34.180 | That's the best way to support this podcast.
02:14:36.940 | If you have questions for me or comments about the podcast
02:14:39.620 | or guests or topics that you'd like me to consider
02:14:41.580 | for the Huberman Lab podcast,
02:14:43.100 | please put those in the comment section on YouTube.
02:14:45.500 | I do read all the comments.
02:14:47.220 | For those of you that haven't heard,
02:14:48.380 | I have a new book coming out.
02:14:49.580 | It's my very first book.
02:14:51.180 | It's entitled "Protocols,
02:14:52.580 | an Operating Manual for the Human Body."
02:14:54.740 | This is a book that I've been working on
02:14:55.900 | for more than five years,
02:14:57.060 | and that's based on more than 30 years
02:14:59.380 | of research and experience.
02:15:00.940 | And it covers protocols for everything from sleep
02:15:04.000 | to exercise to stress control,
02:15:06.500 | protocols related to focus and motivation.
02:15:08.940 | And of course, I provide the scientific substantiation
02:15:12.300 | for the protocols that are included.
02:15:14.380 | The book is now available by presale at protocolsbook.com.
02:15:18.280 | There you can find links to various vendors.
02:15:20.660 | You can pick the one that you like best.
02:15:22.420 | Again, the book is called "Protocols,
02:15:24.180 | an Operating Manual for the Human Body."
02:15:27.020 | If you're not already following me on social media,
02:15:29.200 | I'm Huberman Lab on all social media platforms.
02:15:31.900 | So that's Instagram, X, formerly known as Twitter,
02:15:34.740 | Threads, Facebook, and LinkedIn.
02:15:36.380 | And on all those platforms,
02:15:38.060 | I cover science and science-related tools,
02:15:40.020 | some of which overlaps with the content
02:15:41.620 | of the Huberman Lab podcast,
02:15:42.920 | but much of which is distinct from the content
02:15:45.140 | on the Huberman Lab podcast.
02:15:46.440 | Again, that's Huberman Lab on all social media channels.
02:15:49.600 | If you haven't already subscribed
02:15:50.780 | to our Neural Network newsletter,
02:15:52.260 | our Neural Network newsletter
02:15:53.660 | is a zero-cost monthly newsletter that has protocols,
02:15:57.220 | which are one- to three-page PDFs
02:15:59.460 | that describe things like optimizing your sleep,
02:16:02.540 | how to optimize your dopamine, deliberate cold exposure.
02:16:05.300 | We have a foundational fitness protocol
02:16:07.220 | that describes resistance training,
02:16:08.820 | sets and reps, and all of that,
02:16:10.300 | as well as cardiovascular training
02:16:11.700 | that's supported by the scientific research.
02:16:13.860 | And we have protocols related
02:16:15.220 | to neuroplasticity and learning.
02:16:17.780 | Again, you can find all that at completely zero cost
02:16:19.980 | by going to hubermanlab.com,
02:16:21.660 | go to the menu tab in the right corner,
02:16:23.760 | scroll down to newsletter, you put in your email,
02:16:26.080 | and we do not share your email with anybody.
02:16:28.660 | Thank you once again for joining me for today's discussion
02:16:31.060 | with Dr. Jamil Zaki.
02:16:32.700 | And last, but certainly not least,
02:16:34.840 | thank you for your interest in science.
02:16:36.980 | (upbeat music)
02:16:39.560 | (upbeat music)