
Chris Voss: FBI Hostage Negotiator | Lex Fridman Podcast #364


Chapters

0:00 Introduction
0:59 Negotiation
6:50 Reason vs Emotion
21:45 How to listen
30:35 Negotiation with terrorists
32:43 Brittney Griner
34:21 Putin and Zelenskyy
41:41 Donald Trump
48:52 When to walk away
53:06 Israel and Palestine
60:45 Al-Qaeda
66:15 Three voices of negotiation
74:40 Strategic umbrage
77:46 Mirroring
80:57 Labeling
88:24 Exhaustion
90:38 The word "fair"
93:34 Closing the deal
95:32 Manipulation and lying
97:26 Conversation vs Negotiation
108:45 The 7-38-55 Rule
112:45 Chatbots
122:07 War
123:39 Advice for young people


00:00:00.000 | a crazy thing in the kidnap business.
00:00:02.760 | We used to get asked by FBI leadership,
00:00:07.120 | when is this gonna be over?
00:00:10.240 | And the answer would be when the bad guys
00:00:11.800 | feel like they've gotten everything they can.
00:00:13.960 | Now dissecting that statement,
00:00:16.680 | you're talking about when they feel
00:00:19.840 | like they got everything they can.
00:00:21.240 | So the key to kidnapping negotiations
00:00:23.640 | are the feelings of the bad guys.
00:00:26.000 | We're talking about feelings, kidnappers' feelings.
00:00:29.720 | Which drives everything.
00:00:32.200 | Doesn't matter what human endeavor it is.
00:00:34.320 | The following is a conversation with Chris Voss,
00:00:39.760 | former FBI hostage and crisis negotiator,
00:00:43.000 | and author of "Never Split the Difference:
00:00:46.360 | Negotiating As If Your Life Depended On It."
00:00:49.520 | This is the Lex Fridman Podcast.
00:00:51.440 | To support it, please check out our sponsors
00:00:53.420 | in the description.
00:00:54.760 | And now, dear friends, here's Chris Voss.
00:00:59.400 | What is it like negotiating for a hostage
00:01:01.760 | with a kidnapper?
00:01:02.760 | What is the toughest part of that process?
00:01:04.860 | - The toughest part is if it looks bad from the beginning.
00:01:08.780 | And you gotta engage in the process anyway.
00:01:12.640 | - What are the factors that make it bad?
00:01:15.240 | What is it that makes you nervous,
00:01:16.720 | that if you were to observe a situation
00:01:18.240 | where there's general negotiation
00:01:19.760 | or it's a hostage negotiation,
00:01:21.500 | what makes you think that this is going to be difficult?
00:01:26.080 | - If they wanna make it look like they're negotiating,
00:01:27.840 | but they're not.
00:01:28.720 | Like in the 2004 timeframe,
00:01:30.840 | al-Qaeda in Iraq was executing people on camera
00:01:36.400 | for the publicity.
00:01:38.440 | And they wanted to make it look like they were negotiating.
00:01:41.200 | So they'd come on and they'd say,
00:01:43.120 | "If you don't get all the women out of,
00:01:45.280 | Iraqi women out of the jails in Iraq in 72 hours,
00:01:48.920 | we're gonna kill a hostage."
00:01:50.360 | That was one of the demands in one of the cases
00:01:53.240 | in that timeframe.
00:01:54.960 | Now, first of all, even if we'd have been willing,
00:01:56.840 | the US government, the coalition,
00:01:58.020 | would have been willing to do that,
00:01:59.120 | it wouldn't have been able to happen in 72 hours.
00:02:02.080 | So is it an impossible ask from the beginning?
00:02:04.760 | And so then that looks really bad.
00:02:07.760 | Like they're trying to make it look like
00:02:10.420 | they're talking reasonably, but they're not.
00:02:13.620 | So your hostage is in bad shape there.
00:02:15.600 | If they've made a demand that you just,
00:02:18.560 | even if you wanted to do, you couldn't do.
00:02:20.780 | So then what makes that very difficult is,
00:02:23.960 | in kidnappings especially,
00:02:26.680 | you're working with family members, you're coaching people.
00:02:29.700 | Bad guys are in touch with family members,
00:02:32.540 | or if they're not directly in touch with family members,
00:02:34.580 | the other thing that Al-Qaeda was doing at that time was,
00:02:37.940 | they didn't give us a way to talk to them.
00:02:39.980 | They're making statements in the media,
00:02:42.180 | but then not leaving their phone number, if you will.
00:02:44.880 | So that's one more thing.
00:02:48.060 | They're intentionally blocking you.
00:02:49.380 | They're asking you to do something you can't do,
00:02:52.100 | they're not giving you a way to talk to them.
00:02:54.380 | So you gotta get with the family
00:02:56.420 | and discuss with the family how you're gonna approach things.
00:02:58.840 | Now the family definitely wants to know,
00:03:00.760 | is this gonna help?
00:03:02.000 | So a bunch of cases like that in that timeframe.
00:03:05.160 | And you gotta be honest with them.
00:03:07.160 | It's a long shot.
00:03:08.260 | Our chances here are slim and none.
00:03:10.520 | And when it's slim and none, I'll take slim,
00:03:12.440 | but it's still very, very slim.
00:03:15.000 | And there were a number of people that were killed
00:03:16.440 | in that timeframe before the tide finally got turned,
00:03:19.240 | and it was hard dealing with the families at the time.
00:03:22.360 | - Can you negotiate in public
00:03:24.040 | versus like a direct channel in private?
00:03:26.040 | - Oh yeah.
00:03:26.940 | Bad guys pick the media.
00:03:28.420 | They're making statements in the media.
00:03:31.040 | So, and that's a big clue.
00:03:32.400 | Their channel of choice tells you an awful lot.
00:03:37.200 | And if they're choosing the media,
00:03:39.860 | then that means there's people they're trying to appeal to.
00:03:42.640 | That means in their view,
00:03:43.800 | there's such a thing as good media.
00:03:45.320 | So if there's good media, there's bad media.
00:03:47.160 | How do you make it bad?
00:03:48.680 | And we made it bad for them.
00:03:49.920 | It just, unfortunately,
00:03:51.680 | it had to go through a number of iterations
00:03:53.360 | before they got the message and quit.
00:03:55.720 | - In that negotiation,
00:03:57.400 | do you think about the value of human life?
00:04:02.160 | Is there a dollar figure?
00:04:03.320 | Is there, how do you enumerate, not enumerate,
00:04:07.320 | quantify the value of human life?
00:04:09.160 | - Yeah, that's like beauty seen in the eye of the beholder.
00:04:13.300 | So that was the first lesson on any hostage negotiation,
00:04:17.680 | really any negotiation.
00:04:19.360 | Like it doesn't matter what it is to you,
00:04:20.960 | matters what it is to the other side.
00:04:22.420 | One of the things, especially in your conversation
00:04:24.960 | I listened to with Andrew.
00:04:26.540 | By the way, you guys,
00:04:29.080 | another thing I really liked about that conversation,
00:04:31.900 | first of all, I think the world of him.
00:04:33.680 | - Andrew Huberman.
00:04:34.760 | - Yeah, Andrew Huberman.
00:04:36.160 | And you released it on my birthday, I appreciate that.
00:04:39.800 | It was a nice birthday present to me.
00:04:40.640 | - I tried to tie it perfectly just for you.
00:04:42.320 | - Yeah, nice job, thank you.
00:04:43.720 | But empathy is in the eye of the beholder
00:04:47.360 | in every negotiation,
00:04:50.680 | whether it's over a car, a house,
00:04:54.440 | collaboration in your company with the bad guys.
00:04:57.400 | How does the other side see it?
00:04:59.600 | Now, the nice thing about kidnapping for ransom,
00:05:02.000 | if there's an actual ransom demand,
00:05:04.560 | it's an actual demand,
00:05:06.200 | is it's a mercenary's business.
00:05:10.200 | They're gonna take what they could get.
00:05:12.640 | And they tend to be really good
00:05:15.480 | at figuring out how much money somebody has.
00:05:18.320 | So, and again, I'll keep drawing business analogies.
00:05:23.320 | You're looking for a job with an employer.
00:05:26.020 | There's a market price of the job,
00:05:29.480 | and then there's what the employer can pay you.
00:05:31.840 | Now, maybe the market price of the job market's 150 grand.
00:05:36.960 | Employer can pay you 120, but it's a great job.
00:05:41.040 | We were talking about Elon a minute ago.
00:05:44.680 | Like I'd work minimum wage to follow him around.
00:05:47.800 | That would be worth it.
00:05:49.920 | What are the value other than the dollars?
00:05:53.040 | And how hard is it to get the dollars?
00:05:55.600 | And how quickly can you get to them?
00:05:57.700 | These are all things that the bad guys are good,
00:06:00.740 | in kidnapping, are good at figuring out.
00:06:02.480 | So the value of human life to them
00:06:05.960 | is gonna be what can they get.
00:06:07.680 | A crazy thing in the kidnap business.
00:06:11.680 | We used to get asked by FBI leadership,
00:06:14.640 | when is this gonna be over?
00:06:17.760 | And the answer would be when the bad guys
00:06:19.320 | feel like they've gotten everything they can.
00:06:21.480 | Now dissecting that statement,
00:06:24.200 | you're talking about when they feel
00:06:27.360 | like they got everything they can.
00:06:28.800 | So the key to kidnapping negotiations
00:06:31.160 | are the feelings of the bad guys.
00:06:33.520 | We're talking about feelings, kidnappers' feelings.
00:06:37.280 | Which drives everything.
00:06:39.760 | Doesn't matter what human endeavor it is.
00:06:42.720 | - So it's not reason, it's emotion.
00:06:46.260 | - There's no such thing as reason.
00:06:48.980 | (laughing)
00:06:50.680 | - I should say, for a little bit of context,
00:06:53.040 | I just talked yesterday with a guy named Sam Harris.
00:06:58.040 | - Yeah.
00:06:59.000 | - I don't know if you know Sam.
00:06:59.960 | But Sam, and because I was preparing
00:07:02.940 | for a conversation with you,
00:07:04.800 | I talked to him about empathy versus reason.
00:07:08.200 | And he lands heavily on reason.
00:07:10.760 | - Yeah.
00:07:11.600 | - Empathy is somewhere between useless and erroneous
00:07:16.480 | and leads you astray and is not effective.
00:07:19.800 | That reason is the only way forward.
00:07:23.360 | - Well, let's draw some fine lines there.
00:07:25.080 | And the two fine lines I would draw is,
00:07:28.040 | first, what is your definition of empathy?
00:07:29.760 | And then secondly, how do people actually make up their minds?
00:07:34.760 | And I'm gonna flip it,
00:07:36.080 | I'm gonna go with how people make up their minds.
00:07:37.860 | You make up your mind based on what you care about, period.
00:07:41.040 | That makes reason emotion-based.
00:07:46.840 | - Yeah.
00:07:47.680 | - What do you care about?
00:07:48.500 | You start with what you care about.
00:07:49.880 | You see some guy swimming out off the coast of the ocean
00:07:54.240 | and you see a shark coming up behind him.
00:07:56.920 | Who are you cheering for?
00:07:58.160 | If it's Adolf Hitler out there,
00:08:01.280 | you're cheering for the shark.
00:08:02.400 | You might actually feel bad for the shark
00:08:03.920 | 'cause it's gonna taste bad.
00:08:07.480 | Who do you care about?
00:08:08.540 | - You mean the human will taste bad?
00:08:10.100 | - Yeah, he eats Adolf Hitler.
00:08:12.380 | You're gonna leave a bad taste in your mouth,
00:08:14.540 | even if you're a shark.
00:08:15.980 | So you're making up your mind on every circumstance
00:08:18.220 | is based on what you care about.
00:08:20.060 | So then what does that do to reason?
00:08:21.900 | Your reason is based on what you care about
00:08:23.620 | from the beginning.
00:08:24.820 | Now then, empathy.
00:08:25.980 | If you define it as sympathy,
00:08:28.700 | which it was never meant to be sympathy, ever.
00:08:32.540 | Etymology, I think is the word.
00:08:37.120 | I keep getting etymology and entomology mixed up.
00:08:39.740 | Etymology being, right, where words came from,
00:08:43.340 | the origin, entomology being bugs.
00:08:45.740 | - Got it.
00:08:48.180 | - So I like etymology.
00:08:50.620 | Where did something come from?
00:08:52.100 | I also like entomology.
00:08:53.340 | Anyway, etymology.
00:08:55.040 | My understanding from my research,
00:08:58.340 | the original definition of empathy
00:09:01.740 | was an interpretation of a German word
00:09:04.460 | where people were trying to figure out
00:09:05.700 | what the artist was trying to convey.
00:09:07.400 | It was about assessing art.
00:09:10.220 | And so it was always about understanding
00:09:12.220 | where somebody was coming from,
00:09:13.480 | but not sharing necessarily that same thing.
00:09:18.480 | So then when I was with the FBI
00:09:21.260 | and I first started collaborating with Harvard,
00:09:23.240 | Bob Mnookin wrote a book, "Beyond Winning,"
00:09:26.700 | second chapter is "The Tension
00:09:28.040 | Between Empathy and Assertiveness."
00:09:29.940 | Still the best chapter on empathy I've ever read anywhere.
00:09:35.660 | And Bob writes in his book,
00:09:37.020 | Bob was the head of the program on negotiation.
00:09:39.360 | He's also agreed to be interviewed for a documentary
00:09:43.420 | about me and my company that hasn't been released yet,
00:09:46.780 | but it should be released sometime this year.
00:09:48.660 | - What's the name of the documentary?
00:09:49.980 | - "Tactical Empathy."
00:09:52.220 | - Good name.
00:09:53.040 | - So Bob's definition of empathy said,
00:09:57.180 | "Not agreeing or even liking the other side."
00:10:02.180 | Don't even gotta like 'em, don't gotta agree with 'em.
00:10:04.720 | Just straight understanding where they're coming from
00:10:08.700 | and articulating it,
00:10:09.680 | which requires no agreement whatsoever.
00:10:11.700 | That becomes a very powerful tool,
00:10:15.260 | like ridiculously powerful.
00:10:16.740 | And if sympathy or compassion or agreement are not included,
00:10:21.320 | you can be empathic with anybody.
00:10:22.660 | I was thinking about this
00:10:24.260 | when I was getting ready to sit down and talk to you,
00:10:28.540 | 'cause you use the word empathy a lot.
00:10:31.800 | Putin.
00:10:32.640 | I can be empathic with Putin, easy.
00:10:36.280 | It's easy.
00:10:37.680 | I don't agree with where he's coming from.
00:10:39.840 | I don't agree with his methodology.
00:10:43.160 | Early on, the Ukraine-Russian War,
00:10:47.960 | I saw an article that was very dismissive of Russia
00:10:51.400 | that said, "Russia's basically Europe's gas station."
00:10:55.840 | And I thought, all right.
00:10:59.440 | So if you're in charge,
00:11:01.320 | and the way you feed your people
00:11:05.300 | is via an industry that the entire world is trying to quit.
00:11:10.140 | The whole world is trying to get out of fossil fuels.
00:11:14.980 | If that's how you feed your people,
00:11:16.920 | if you don't come up with an answer to that,
00:11:19.160 | the people that you've taken responsibility for
00:11:21.140 | are gonna die alone in the cold and the dark.
00:11:24.140 | They're gonna freeze and they're gonna die.
00:11:26.300 | All right, so that doesn't mean that I agree
00:11:28.860 | with where he's coming from or any of his means.
00:11:31.260 | But how does this guy see things in his distorted world?
00:11:35.860 | You're never gonna get through to somebody like that
00:11:38.780 | in a conversation unless you can demonstrate to them
00:11:41.100 | you understand where they're coming from,
00:11:42.420 | whether or not you agree.
00:11:43.740 | Early '90s, last century.
00:11:46.540 | I'm a last century guy.
00:11:47.620 | I'm an old dude.
00:11:48.460 | Refer to myself as a last century guy.
00:11:51.740 | Also a deeply flawed human.
00:11:54.420 | So terrorist case, New York City, civilian court.
00:11:59.420 | Terrorism does not have to be tried in military tribunals.
00:12:04.740 | That's a very bad idea.
00:12:06.060 | It was always bad.
00:12:08.060 | The FBI was always against it.
00:12:09.660 | I'm getting ready.
00:12:12.140 | We have Muslims testifying in open court
00:12:15.340 | against a legitimate Muslim cleric.
00:12:18.520 | The guy that was on trial had the credentials
00:12:22.060 | as a legitimate Muslim cleric.
00:12:23.740 | The people that were testifying against him
00:12:26.060 | didn't think he should be advocating
00:12:27.420 | murder of innocent people.
00:12:28.760 | We'd sit down with them, Arab Muslims, Egyptians, mostly.
00:12:34.040 | And I would say to them,
00:12:37.140 | you believe that there's been a succession
00:12:39.580 | of American governments for the last 200 years
00:12:41.800 | that are anti-Islamic?
00:12:43.280 | And they'd shake their head and go, yeah.
00:12:48.380 | And that'd be the start of the conversation.
00:12:50.960 | That's empathy.
00:12:52.700 | You believe this to be the case.
00:12:54.940 | I never said I agreed.
00:12:56.060 | I never said I disagreed.
00:12:57.460 | But I'd showed them that I wasn't afraid of their beliefs.
00:13:00.940 | I was so unafraid of them
00:13:02.840 | that I was willing to just state them
00:13:05.140 | and not disagree or contradict.
00:13:07.100 | 'Cause I would say that and then I'd shut up
00:13:09.140 | and let them react.
00:13:10.860 | And I never had to say, here's why you're wrong.
00:13:13.940 | I never gave my point of view.
00:13:15.940 | Every single one of them that testified, that's empathy.
00:13:18.380 | Not agreeing with where the other side is coming from.
00:13:21.380 | I'm not sure how Sam would define it.
00:13:24.060 | But common vernacular is it's sympathy and it's compassion.
00:13:27.420 | And that's when it becomes useless.
00:13:29.820 | - And there's a gray area, maybe you can comment on it,
00:13:33.760 | is sometimes a drop of compassion
00:13:35.980 | helps make that empathy more effective in the conversation.
00:13:41.220 | So you just saying you believe X
00:13:48.180 | doesn't quite form a strong of a bond with the other person.
00:13:53.140 | - You're imagining it doesn't.
00:13:55.140 | - Maybe you're right.
00:13:55.980 | Yes, I'm imagining it doesn't.
00:13:57.180 | I'm imagining you need to show
00:13:59.840 | that you're on the same side.
00:14:01.380 | That you need to signal a little bit
00:14:05.140 | about your actual beliefs, at least in that moment.
00:14:07.980 | Even if that signaling is not as deep as it sounds.
00:14:12.980 | But at first, basically patting the person on the back
00:14:17.100 | and saying we're on the same side, brother.
00:14:18.860 | - You know, that's what most people,
00:14:21.300 | when they're really learning the concept,
00:14:25.500 | that's the basic human reaction.
00:14:27.900 | And in application,
00:14:32.140 | especially in highly adversarial situations.
00:14:34.680 | Like I need a regular guy, Muslim,
00:14:40.820 | but how's that guy gonna say, buy it,
00:14:44.660 | if I like, you know, dude, I'm on your side.
00:14:47.540 | I've been there, I feel you.
00:14:50.060 | No, no, no, no, no, no.
00:14:52.140 | People get conned by that so much.
00:14:54.740 | Like if we're on opposite sides of the table
00:14:56.860 | and I try to act like I'm not on the opposite side
00:14:58.820 | of the table, that makes me disingenuous.
00:15:00.960 | So I would rather be honest.
00:15:07.300 | My currency's integrity.
00:15:10.240 | And at some point in time, if you go like,
00:15:13.420 | you know where I'm coming from?
00:15:14.780 | My answer's gonna be like, look,
00:15:16.460 | I can agree on maybe where we're going,
00:15:19.480 | but if we're talking about, you know,
00:15:22.180 | am I on your side now?
00:15:24.140 | As a human being, I wanna see you survive and thrive,
00:15:26.860 | not at my expense.
00:15:28.620 | I think the world is full of opportunity.
00:15:31.560 | I'm optimistic.
00:15:33.980 | I got more than enough reason for saying that.
00:15:36.980 | It's enough for here for both of us.
00:15:39.580 | So I got no problem with you getting yours.
00:15:42.060 | You know, just don't take it out of my hide.
00:15:43.780 | And I'm gonna be honest about it,
00:15:45.540 | about both of those things.
00:15:46.780 | I'm not interested in you taking it out of my hide.
00:15:49.140 | I think there's plenty here for both of us.
00:15:51.080 | Now, I don't need to be on your side,
00:15:53.100 | except in a human sense.
00:15:56.780 | But do I have to side with you over the war?
00:16:00.180 | Or how we're distributing the stock
00:16:03.940 | or how much you get paid or how much you make off this car.
00:16:09.440 | I think people, my experience as a layman,
00:16:14.080 | is that empathy's not got a downside.
00:16:17.400 | That you don't need me to act like I'm on your side
00:16:21.600 | for us to make a great deal.
00:16:23.600 | - Great deal.
00:16:24.680 | Well, we'll talk about two things,
00:16:26.260 | a great deal and a great conversation.
00:16:28.920 | - Right.
00:16:29.760 | - They're often going to be the same thing,
00:16:32.680 | but at times, they're going to be different.
00:16:35.040 | That's, you mentioned Vladimir Putin.
00:16:37.440 | There is some Zoom level at which you do want to say
00:16:41.640 | we're on the same side.
00:16:42.520 | You said the human level.
00:16:44.880 | It's possible to say, kind of Zoom out
00:16:47.360 | and say that we're all in this together.
00:16:50.520 | Not we Slavic people, we Europeans, but we human beings.
00:16:55.520 | - We're on the same planet.
00:16:57.440 | - Same planet.
00:16:58.400 | - Right.
00:16:59.620 | Several years ago, and his name has evidently been mud now,
00:17:05.320 | but he was very nice to me.
00:17:07.320 | Lawyer here in town named Tom Girardi.
00:17:09.680 | And no shortage of bad reporting on him now.
00:17:13.240 | I have absolutely no idea if any of it's true.
00:17:16.120 | I do know that in my interaction with him,
00:17:18.040 | he was always a gentleman to me and was very generous.
00:17:20.740 | When he'd get into conversations with people,
00:17:24.800 | he'd always say, "Let's look at 10 years from now
00:17:28.440 | "where we could both be in a phenomenal place together.
00:17:34.220 | "Now let's work our way back from there."
00:17:36.300 | (Lex laughing)
00:17:37.300 | - That's a good line.
00:17:38.460 | - Yeah, and then I saw him do it in simulations.
00:17:42.460 | I was teaching at USC, and we were at a function together,
00:17:46.780 | and a gentleman at the time told me who he was,
00:17:50.580 | and he was really influential.
00:17:51.860 | So I walked up to the guy cold, and I said,
00:17:53.460 | "Hey, how about coming and talking to my class at USC?"
00:17:56.580 | He didn't know me other than the fact
00:17:58.940 | that we had a mutual acquaintance,
00:18:00.920 | and he graciously consented to come in.
00:18:03.860 | And he said, "What do you want me to talk about?"
00:18:05.780 | And I said, "Look, dude, just from your success here,
00:18:09.020 | "it doesn't matter what you talk about.
00:18:10.980 | "Either I'm gonna agree or I'm gonna disagree
00:18:14.280 | "or I'm gonna learn from it.
00:18:15.120 | "My students are gonna learn from it."
00:18:17.180 | So students wanna role play with him.
00:18:18.900 | They dispute, let's do a negotiation.
00:18:20.740 | Every single time, he'd go to pick a point in the future
00:18:23.860 | where we're both happy 10 years, 20 years from now,
00:18:29.620 | and let's work our way back.
00:18:30.860 | Now, hostage negotiator, same thing.
00:18:33.220 | I call into a bank.
00:18:35.380 | Bad guy picks up on the phone.
00:18:39.080 | And I'm gonna say, "I want you to live.
00:18:42.620 | "I wanna see you survive this."
00:18:45.880 | Whatever else goes with that,
00:18:49.380 | let's pick a point in the future that we're both good with,
00:18:52.880 | and then we work our way back.
00:18:55.300 | And people make also, we were talking before
00:18:58.180 | about emotion and what you care about,
00:19:00.380 | people make their decisions based on a vision of the future.
00:19:03.860 | Like, without question.
00:19:05.400 | I think there's a Hindu temple in the United States
00:19:10.180 | has been or being assembled the same way
00:19:13.700 | that the Hindu temples were in India 1,000 years ago,
00:19:16.500 | by hand, volunteers, by hand.
00:19:19.120 | These people are knocking themselves out
00:19:22.900 | for a place in paradise, a vision of the future.
00:19:25.300 | What you will go through today if the future portends
00:19:30.220 | what you want, you'll go through incredible things today.
00:19:34.340 | So it's a vision of the future.
00:19:35.900 | - So you have to try to paint a vision of the future
00:19:40.020 | that the person you're negotiating with will like.
00:19:43.820 | Just tough to do.
00:19:44.980 | - Let's find out what their vision of the future is,
00:19:47.620 | and then remove yourself as a threat.
00:19:50.500 | - Sure.
00:19:51.340 | - You know, if we can collaborate together, at all,
00:19:54.580 | if you think that I could do anything at all
00:19:56.540 | to help you to that point, and integrity's my currency,
00:20:01.120 | I'm not gonna lie to you, which gets back before
00:20:04.580 | did I lie to you about whether or not I'm on your side.
00:20:06.860 | You know, right now, at the moment,
00:20:08.500 | we're on opposite sides of the fence.
00:20:11.020 | That's not gonna stop us from being together in the future.
00:20:14.300 | Inside, you're gonna say,
00:20:15.700 | well, you didn't lie to me about today,
00:20:17.820 | maybe you won't lie to me about tomorrow.
00:20:19.860 | - So going back to world leaders, for example,
00:20:23.460 | whether it's Volodymyr Zelenskyy or Vladimir Putin,
00:20:27.780 | you don't think it closes off their mind
00:20:29.420 | to show that you have a different opinion?
00:20:33.260 | - Depend upon when you showed it.
00:20:36.120 | Are you arguing from the beginning,
00:20:39.020 | or are you displaying understanding from the beginning?
00:20:41.340 | I don't think it stops you from being adversarial.
00:20:44.180 | There was a thing about Mnuchin's chapter in his book,
00:20:49.180 | The Tension Between Empathy and Assertiveness.
00:20:53.220 | I remember reading that name of the chapter,
00:20:57.340 | thinking like, eh, you know,
00:20:59.420 | in my business, there is no tension.
00:21:03.000 | And then I got into it, and I read,
00:21:05.300 | I thought, this is a red herring.
00:21:08.180 | He's drawing people in, because his entire chapter
00:21:11.840 | is that empathy puts you in a position to assert,
00:21:15.700 | and that there is no tension.
00:21:17.580 | It's a sequencing issue.
00:21:19.820 | And that's why, again, I think it was written for lawyers.
00:21:22.940 | - Yeah, sequencing issue.
00:21:24.580 | So timing is everything.
00:21:26.020 | So you emphasize the importance of,
00:21:29.480 | in terms of sequencing and priority of listening,
00:21:33.860 | of truly listening to the other person.
00:21:36.380 | - I'm sorry, what'd you say?
00:21:37.780 | That was a bad joke, sorry. - I forgot.
00:21:39.220 | (Chris laughs)
00:21:40.460 | Your timing is just perfect.
00:21:43.500 | How do you listen?
00:21:46.140 | How do you truly listen to another human being?
00:21:48.220 | How do you notice them?
00:21:49.100 | How do you really hear them?
00:21:50.500 | - I always hated the term active listening.
00:21:54.120 | If anything, it's proactive.
00:21:56.160 | And as soon as you start trying to anticipate
00:21:59.580 | where somebody's going, you're dialed in more.
00:22:01.880 | Because along the way,
00:22:05.940 | either you're congratulating yourself for being right,
00:22:09.260 | or when suddenly they say something that surprises you,
00:22:12.700 | you really notice it.
00:22:13.740 | Like, that's not what I expected.
00:22:15.740 | You're dialed in, you're listening.
00:22:17.420 | So it's proactive.
00:22:20.100 | And then one of the reasons,
00:22:23.460 | you know, we named the book "Tactical Empathy."
00:22:26.300 | Named the book "Never Split the Difference,"
00:22:28.300 | but we're talking about tactical empathy.
00:22:30.420 | Calibrated emotional intelligence.
00:22:35.460 | What's it calibrated by?
00:22:37.260 | First, it was experienced as hostage negotiators,
00:22:40.060 | and we've come to find out
00:22:41.100 | that our experience as hostage negotiators
00:22:43.940 | is backed up by neuroscience.
00:22:45.300 | That's another reason why I listen
00:22:46.300 | to Andrew Huberman's podcast all the time.
00:22:48.220 | Heavy, heavy, heavy, heavy on the neuroscience.
00:22:50.580 | And so then emotional intelligence calibrated
00:22:56.980 | by what we know about neuroscience.
00:22:58.420 | What do we know about neuroscience?
00:23:00.000 | And I'll talk about it from a layman's perspective,
00:23:02.300 | and to even say we is an arrogant thing, you know,
00:23:05.460 | human beings.
00:23:06.300 | I didn't do the research.
00:23:08.300 | I'm scooping up as much of it as I can as a layman.
00:23:12.100 | The brain's largely negative.
00:23:14.620 | I think there's ample evidence.
00:23:17.460 | People will argue with you as to what the wiring is
00:23:20.820 | and what does what, and the limbic system,
00:23:24.460 | and all of that, but the brain is basically 75% negative.
00:23:29.460 | As a layman, I make that contention, number one.
00:23:31.820 | Number two, the best way to deactivate negativity
00:23:35.940 | is by calling it out.
00:23:37.100 | And I could say, look, I don't want you to be offended
00:23:41.060 | by what I'm getting ready to say.
00:23:43.180 | That's a denial.
00:23:44.540 | Your guard is up.
00:23:45.620 | You're getting ready to get mad.
00:23:47.740 | If I say, what I'm getting ready to say
00:23:49.380 | is probably gonna offend you.
00:23:51.620 | Now you relax a little bit, and you go,
00:23:54.540 | all right, what is it?
00:23:56.220 | And then I say it, whatever it is,
00:23:58.240 | and you're gonna be like, oh, that wasn't that bad.
00:24:00.580 | Because we knew from hostage negotiation,
00:24:04.620 | by calling out the negativity, deactivate it,
00:24:06.940 | and then a number of neuroscience experiments
00:24:10.300 | have been done right and left
00:24:11.500 | by calling out negativity, deactivating the negativity.
00:24:14.500 | - So calling out ahead of time.
00:24:17.220 | So like acknowledging that this is,
00:24:18.980 | that this is, ahead of time, that this is going to hurt.
00:24:24.320 | - The experiments that I've seen
00:24:25.980 | have been when the negativity was inflicted,
00:24:29.300 | and then having a person that it was being inflicted upon,
00:24:31.860 | simply identify it.
00:24:33.300 | - Just identify.
00:24:34.300 | - Yeah, what are you feeling?
00:24:35.140 | I'm angry, and the anger goes away.
00:24:39.700 | - It's tough because I've had a few,
00:24:42.860 | and again, we're dancing between things,
00:24:44.600 | but I've had a few conversations
00:24:47.440 | where anger arose in the guests I spoke with.
00:24:51.140 | - Yeah.
00:24:53.300 | - And I'm not sure identifying it.
00:24:56.220 | That's like leaning into it and going into the depths,
00:25:01.740 | 'cause that's going to the depths of some emotional,
00:25:07.260 | psychological thing they're going through
00:25:09.700 | that I'm not sure I wanna explore that iceberg
00:25:12.620 | with the little ship we got.
00:25:14.020 | It's a, you have to decide.
00:25:17.540 | Do you want to avoid it,
00:25:18.740 | or do you wanna lean into it?
00:25:21.980 | It's a tough choice.
00:25:22.820 | - It's the elephant in the room.
00:25:23.780 | - It is an elephant in the room.
00:25:24.940 | It is an elephant, especially when,
00:25:27.500 | I think that's the big difference
00:25:28.980 | between conversations and negotiations.
00:25:32.180 | Negotiation ultimately is looking for closure and resolution.
00:25:36.500 | I think general conversations like this is more exploring.
00:25:41.500 | There's not necessarily a goal.
00:25:44.740 | Like if you were to put,
00:25:45.900 | like if I had to put a goal for this conversation,
00:25:48.460 | there's no real goal.
00:25:50.300 | It's curiously exploring ideas.
00:25:53.640 | So that gives you freedom to not call out the elephant
00:25:57.340 | for a time.
00:25:59.500 | You could be like, all right, let's go to the next room,
00:26:01.220 | get a snack, and come back to the elephant.
00:26:03.020 | - Right.
00:26:04.820 | All right, so I'd make a tiny adjustment
00:26:07.460 | on the negotiation definition.
00:26:08.940 | - Sure.
00:26:09.780 | - 'Cause you said, I think, seeking closure.
00:26:13.260 | You used two words, and closure was one of 'em.
00:26:18.620 | - Goal is maybe another.
00:26:19.980 | Well, yeah, what is negotiation?
00:26:21.940 | - Well, I would say seeking collaboration.
00:26:24.040 | And 'cause closure kind of puts a little bit
00:26:27.540 | of a finality to it.
00:26:28.500 | And the real problem in any negotiation
00:26:30.300 | is always implementation.
00:26:32.820 | It's why we say, yes is nothing without how.
00:26:36.840 | And yes, at its very best, is only temporary aspiration.
00:26:42.100 | It's aspirational.
00:26:43.100 | It's usually counterfeit.
00:26:44.940 | So if you're looking for--
00:26:46.020 | (Lex laughing)
00:26:47.100 | - That's a good line.
00:26:48.020 | Yes is usually counterfeit.
00:26:49.260 | It's aspirational without the how.
00:26:51.660 | - Yeah. - It's just a good line.
00:26:52.780 | Yeah. - Thank you.
00:26:53.620 | I've been working on it.
00:26:54.860 | I was practicing in front of the mirror
00:26:56.820 | before I came in today. - You're doing pretty good.
00:26:58.220 | (both laughing)
00:27:00.620 | You got a bright future ahead of you.
00:27:02.100 | (Lex laughing)
00:27:04.060 | You should write a book or something, right?
00:27:06.300 | - Yeah. (Lex laughing)
00:27:08.500 | - Your book is excellent, by the way.
00:27:10.580 | - Thanks, appreciate that.
00:27:11.900 | What am I doing here anyway?
00:27:13.540 | - This, on Earth, in general?
00:27:15.140 | - On you, with you.
00:27:16.300 | - I don't know.
00:27:17.300 | (Lex laughing)
00:27:18.340 | We're collaborating.
00:27:20.220 | - Why me, though?
00:27:21.060 | Why'd you wanna talk to me?
00:27:23.660 | - I've heard you speak in a few places.
00:27:25.660 | I was like, this is a fascinating human.
00:27:28.040 | I think on Clubhouse and different places,
00:27:31.980 | and I listen to some YouTube stuff,
00:27:34.420 | and this is just, you meet people that are interesting.
00:27:39.420 | That's what I love doing with this podcast,
00:27:42.760 | is just exploring the mind of an interesting person.
00:27:46.640 | You notice people. - I think.
00:27:47.740 | - Sometimes there's a homeless person outside of 7-Eleven.
00:27:51.780 | I notice, who are you?
00:27:53.580 | - Yeah, yeah, yeah, yeah.
00:27:54.540 | - It's fascinating.
00:27:55.380 | It doesn't, I don't look at the resumes
00:27:57.060 | and the credentials and stuff like that.
00:27:58.540 | It's just being able to notice a person.
00:28:00.900 | As I've been leafing through the different choices
00:28:03.500 | of the podcast, the young lady that does OnlyFans,
00:28:08.300 | and the sex workers, that's a fascinating human being.
00:28:12.340 | Like, I wanna know what makes that person tick at 1,000%.
00:28:16.180 | - The fascinating thing about her is her worldview
00:28:19.180 | is almost entirely different than mine,
00:28:21.380 | and that's always interesting to talk to a person
00:28:23.420 | who just is happy, flourishing, but sees the world
00:28:28.380 | and the set of values she has is completely different.
00:28:31.580 | And is also not argumentative,
00:28:34.580 | is accepting of other worldviews.
00:28:37.340 | It's beautiful to explore that.
00:28:38.860 | - Yeah, no kidding, I would agree.
00:28:40.780 | And then, yeah, thought-provoking,
00:28:42.060 | 'cause I consider myself,
00:28:43.380 | the word I was looking for before was abundant.
00:28:48.100 | I think it's an abundant world.
00:28:49.660 | So I'm pretty optimistic.
00:28:51.300 | I consider myself, I don't know,
00:28:53.940 | happy exactly describes it, but yeah.
00:28:57.060 | - So then, if I'm happy, optimistic, abundant,
00:28:59.980 | I got a worldview, and then you run into somebody
00:29:02.060 | that has a vastly different worldview,
00:29:03.740 | and they're happy, and they think it's abundant too.
00:29:06.940 | And you're like, what is going on in your head?
00:29:08.900 | Or mine, or what am I missing?
00:29:11.160 | Yeah, so that's fascinating.
00:29:12.380 | - And the pie grows, which is useful for kinda negotiation
00:29:15.820 | when you paint a picture of a future,
00:29:18.180 | if you're optimistic about that future,
00:29:20.260 | there's a kinda feeling like we're both gonna win here.
00:29:23.860 | - Exactly. - And that's easy.
00:29:24.860 | We live in a world where both people can win.
00:29:26.940 | - Yeah, and in point of fact, that's the case.
00:29:29.140 | Although a lot of people want us to think otherwise,
00:29:32.100 | mostly because of the negativity
00:29:34.900 | that I was talking about before.
00:29:36.340 | - So the brain is generally cynical.
00:29:38.700 | - Yeah, my description of it is,
00:29:40.880 | the pessimistic caveman survived,
00:29:44.540 | and we're descendants of the pessimists.
00:29:46.660 | The optimistic guy got eaten by a saber-toothed tiger.
00:29:49.340 | - Yeah, but on the flip side,
00:29:52.540 | the optimists seem to be the ones
00:29:53.860 | that actually build stuff these days.
00:29:55.900 | - There's the switch.
00:29:57.140 | So at what point in time do we catch on?
00:30:00.820 | 'Cause the difference between survival and success mindset,
00:30:03.780 | the success mindset is highly optimistic.
00:30:08.180 | So where do we switch, or how do we stay switched
00:30:12.580 | from survival to success?
00:30:14.140 | That's the challenge.
00:30:15.280 | - Yeah, somewhere we stopped being eaten by saber-toothed
00:30:19.180 | tigers and started building bridges and buildings
00:30:22.060 | and computers and companies.
00:30:25.140 | We started to experience, we got enough data back
00:30:27.820 | to collaborate, and we stopped listening to our amygdala
00:30:32.980 | and we started listening to our gut.
00:30:35.380 | - Let me just return briefly to terrorists.
00:30:38.220 | What do you think about the policy
00:30:39.820 | of not negotiating with terrorists?
00:30:41.540 | - Well, that's not the policy, first of all.
00:30:44.260 | Now, everybody thinks that's the policy.
00:30:46.660 | It hasn't been the policy since 2002,
00:30:49.740 | when Bush 43 signed a National Security
00:30:52.820 | Presidential Directive, NSPD, at the time it was NSPD 12,
00:30:57.100 | which basically said, "We won't make concessions.
00:31:03.340 | "That doesn't mean we won't talk."
00:31:05.300 | So I'm in Colombia at the same time,
00:31:08.380 | and I had been intimately involved with the signing,
00:31:12.540 | him signing that document.
00:31:14.260 | I knew exactly what it said,
00:31:17.420 | and he didn't inherit it from somebody else, he signed it.
00:31:21.900 | And I'm in Colombia, and the number two in the embassy says,
00:31:26.900 | "Last night on TV, the President of the United States said
00:31:30.860 | "we don't negotiate with terrorists.
00:31:32.540 | "Are you calling a President of the United States a liar?"
00:31:36.220 | And I remember thinking, like, all right, so,
00:31:38.420 | he probably said that,
00:31:42.140 | and that's not on the document that he signed.
00:31:44.740 | So I said, look, I'm familiar with what he's signed,
00:31:50.860 | and that's not what it says.
00:31:52.460 | Well, you know, and so the argument,
00:31:54.220 | but that's always been the soundbite that everybody likes.
00:31:56.540 | We don't negotiate with terrorists.
00:31:58.500 | Depends upon your definition of negotiation.
00:32:00.540 | If it's just communication,
00:32:01.780 | we negotiate with them all the time, number one.
00:32:03.780 | And number two, like, every President has made
00:32:08.100 | some boneheaded deal with the bad guys.
00:32:11.340 | Like, Obama released five high-level Taliban leaders
00:32:18.100 | from Guantanamo in exchange for an AWOL soldier
00:32:22.660 | that we immediately threw in jail.
00:32:24.860 | And I thought that was a horrible deal.
00:32:27.140 | And that's putting terrorists back on the battlefield.
00:32:29.900 | And then Trump turned around and topped it
00:32:33.260 | by putting 5,000 terrorists back on the battlefield.
00:32:36.700 | So we haven't had a President that has stuck to that
00:32:39.100 | on either side of the aisle
00:32:40.620 | since people started throwing that out as a soundbite.
00:32:43.100 | - What do you think of that negotiation?
00:32:45.100 | Forget terrorists, but the global negotiation,
00:32:47.660 | like with Vladimir Putin,
00:32:49.740 | the recent negotiation over prisoners,
00:32:52.860 | the exchange, the Brittney Griner one.
00:32:55.980 | Is there a way to do that negotiation successfully?
00:32:58.420 | - First of all, I agree with the idea
00:32:59.980 | that she was wrongfully detained
00:33:01.380 | and that she didn't deserve to be in jail
00:33:04.020 | and that there should be no second-class citizens ever.
00:33:09.020 | And whether you're a WNBA player
00:33:12.420 | or you're just some bonehead
00:33:14.100 | that walked into the wrong situation,
00:33:16.260 | your government should not abandon you ever, ever.
00:33:19.020 | Now what they do in the meantime,
00:33:23.100 | there should have been a negotiation.
00:33:26.780 | They were desperate to make a deal at a bad time.
00:33:29.140 | They'd been offered far better deals
00:33:30.820 | than prisoner swaps earlier and turned them down.
00:33:33.860 | And then he gets turned up,
00:33:35.620 | And then the heat gets turned up,
00:33:37.420 | and thank God for Brittney Griner
00:33:40.380 | They kept pressure on the administration.
00:33:42.100 | They made a deal.
00:33:44.620 | Now governments wanna make those kind of deals,
00:33:47.020 | that's fine as long as it,
00:33:49.660 | 'cause that was basically a political negotiation.
00:33:52.260 | You're putting 5,000 Taliban back on a battlefield.
00:33:54.940 | That ain't negotiating with another government.
00:33:57.500 | You're putting five of them back on a battlefield.
00:33:59.340 | That ain't negotiating with another government
00:34:00.900 | that's directly contradicting this thing that you claimed,
00:34:03.620 | and those were all bad deals.
00:34:04.860 | Now was the Brittney Griner thing a bad deal?
00:34:07.340 | I think it was great for her.
00:34:09.300 | If I was in the middle of it, it would have been better,
00:34:11.740 | and she still would have come home.
00:34:13.180 | - Yeah, there's some technical aspects of that negotiation.
00:34:15.860 | What do you think is the value, just to linger on it,
00:34:17.940 | of meeting in person for the negotiation?
00:34:20.220 | - I think it's a great idea.
00:34:21.860 | - Can I just follow that tangent along?
00:34:25.500 | There's a war in Ukraine now.
00:34:26.980 | It's been going on over a year.
00:34:29.060 | It's, for me personally, given my life stories,
00:34:34.580 | is a deeply personal one,
00:34:37.460 | and I'm returning back to that area of the world
00:34:40.380 | that was there.
00:34:42.180 | - Volodymyr Zelensky said he doesn't want to talk
00:34:45.940 | to Vladimir Putin.
00:34:48.660 | Do you think they could get in a room together
00:34:52.820 | and say you were there in a room with Putin and Zelensky,
00:34:57.820 | and Biden is sitting in the back drinking a cocktail,
00:35:02.140 | or maybe he is at the table participating.
00:35:04.700 | How is it possible through negotiation,
00:35:07.860 | through the art of conversation,
00:35:09.820 | to find peace in this very tense geopolitical conflict?
00:35:14.820 | - I think it's eminently possible.
00:35:18.540 | I think getting people together in person
00:35:20.980 | has always been a good idea.
00:35:22.940 | Now, who's getting them together, under what circumstances,
00:35:25.900 | and how many times are you getting them together?
00:35:28.180 | The documentary, "The Human Factor,"
00:35:31.540 | about the Mideast peace negotiations,
00:35:33.700 | mostly through the '90s,
00:35:36.460 | mostly into the Clinton administration,
00:35:39.020 | got kicked off under Bush 41,
00:35:44.020 | and then the documentary continues through Trump,
00:35:47.580 | but just touching, basically, on it.
00:35:50.700 | But they're getting Arafat
00:35:52.100 | and the different Israeli prime ministers
00:35:54.900 | together in person,
00:35:55.860 | and these guys do not want to talk to each other,
00:35:59.300 | and depending upon the prime minister,
00:36:01.460 | the mere thought of being on the same planet with Arafat
00:36:05.580 | was offensive,
00:36:07.420 | and they started getting these guys together
00:36:08.940 | in person regularly,
00:36:10.780 | and they started seeing each other as human beings,
00:36:14.020 | and they started realizing
00:36:15.180 | that there was enough room on the planet for 'em,
00:36:17.940 | and that people dying was stupid,
00:36:21.400 | and they would slowly work things out
00:36:24.620 | by getting these guys together in person.
00:36:27.260 | So how long does it take?
00:36:29.140 | Who's hosting it?
00:36:30.280 | But it's a good idea.
00:36:32.640 | - But the skill of achieving that thing
00:36:36.220 | that you talk about a lot,
00:36:37.300 | which is empathy,
00:36:38.420 | and I would say, in that case,
00:36:40.540 | not just empathy, but empathy plus,
00:36:43.580 | you might disagree with this,
00:36:44.580 | but a drop of compassion in there.
00:36:47.220 | - I think compassion is helpful,
00:36:49.740 | but it's not essential.
00:36:52.540 | If you just know where I'm coming from,
00:36:55.660 | the feeling of being understood--
00:37:01.420 | - Yeah, heard and understood, that's powerful.
00:37:03.540 | - Is, yeah, and again,
00:37:05.860 | I know I picked the vast majority of this up
00:37:08.020 | on Andrew's podcast,
00:37:11.340 | but I picked it up in other places,
00:37:12.780 | 'cause early on,
00:37:13.840 | when we were putting a book together,
00:37:16.860 | Tall Roz, the writer,
00:37:18.860 | Tahl Raz, the writer,
00:37:22.060 | so the book's really a collaboration between me,
00:37:24.940 | my son, Brandon, and Tahl Raz,
00:37:26.860 | and we're driving for that's right.
00:37:30.140 | When somebody feels like what you've said
00:37:33.020 | is completely their position,
00:37:34.420 | they say that's right.
00:37:36.380 | Not you're right, but that's right.
00:37:38.740 | So Tahl says, "I think what's happening here
00:37:41.700 | "is you're triggering a subtle epiphany in somebody."
00:37:45.260 | So I'm like, "All right, I'll buy that."
00:37:47.260 | So I start looking up the neuroscience
00:37:50.100 | of the feeling of epiphany,
00:37:51.760 | getting a hit of oxytocin and serotonin.
00:37:56.700 | Oxytocin is a bonding drug.
00:38:03.100 | You bond to me.
00:38:04.780 | I don't bond to you.
00:38:05.740 | When you feel completely understood by me,
00:38:08.340 | you bond to me.
00:38:09.480 | Then in one of the relationship podcasts
00:38:13.660 | that I'm listening to on Andrew,
00:38:16.340 | it says oxytocin inclines people to tell the truth.
00:38:20.320 | You're more honest.
00:38:22.900 | All right, so you feel deeply understood by me,
00:38:24.820 | you bond to me,
00:38:26.780 | and you start getting more honest with me.
00:38:30.380 | Serotonin, the neurochemical of satisfaction.
00:38:34.380 | Epiphany, you feel oxytocin and serotonin.
00:38:38.660 | Being understood.
00:38:40.940 | All right, I got you bonding to me,
00:38:42.940 | I got you being more honest with me,
00:38:44.900 | and I got you feeling more satisfied so you want less.
00:38:48.140 | What more do you want out of a negotiation?
00:38:51.420 | - Of course, there's already with leaders
00:38:55.620 | and great negotiators, there's walls built up,
00:38:58.820 | defense mechanisms against that.
00:39:00.620 | You're resisting.
00:39:02.780 | You're resisting this basic chemistry,
00:39:04.380 | but yes, you should have that.
00:39:06.780 | You should work towards that kind of empathy.
00:39:09.280 | And I personally believe,
00:39:10.460 | I don't actually understand why,
00:39:12.980 | but I've observed it time and time again,
00:39:14.580 | but getting in a room together
00:39:16.060 | and really talking, whether privately or publicly,
00:39:20.300 | but really talking.
00:39:22.140 | And like this, so I'll comment on this.
00:39:25.180 | So right now, this is being recorded,
00:39:27.920 | and a few folks will hear this,
00:39:30.180 | but when you really do a good job
00:39:32.300 | of this kind of conversation, you forget there's cameras.
00:39:35.300 | And that's much better than there being
00:39:37.380 | even a third person in the room,
00:39:39.060 | but often when world leaders meet,
00:39:42.020 | there's press or there's others in the room.
00:39:45.180 | Man to man or man to woman,
00:39:49.180 | you have to meet in a saloon,
00:39:52.840 | just the two of you, and talk.
00:39:55.040 | There's some intimacy and power to that,
00:39:57.680 | to achieve that if you're also willing
00:40:00.700 | to couple that with empathy,
00:40:02.540 | to really hear the other person.
00:40:04.660 | I don't know what that is.
00:40:05.500 | That's like a deep, deep intimacy that happens.
00:40:08.220 | - And I think there's actually,
00:40:09.800 | 'cause we get asked this in a Black Swan group all the time,
00:40:13.000 | like how didn't, you know, Zoom, that's bad,
00:40:16.540 | you know, 'cause you don't have
00:40:18.120 | the same visual feedback on Zoom.
00:40:21.920 | And that's not true.
00:40:23.100 | Like you and I, I see you from the waist up right now.
00:40:25.600 | If we were on Zoom, I'd be looking at you
00:40:26.980 | from the waist up.
00:40:27.820 | - I'm not wearing pants, yeah.
00:40:29.260 | (laughing)
00:40:30.100 | With the internet.
00:40:31.820 | I apologize for that.
00:40:33.020 | Sorry, yeah, yeah, yeah.
00:40:37.100 | You only see a small portion.
00:40:38.760 | - Usually, that's usually where I go, but anyway.
00:40:43.900 | - I'm glad we're both at ridiculous sentences.
00:40:46.940 | I appreciate it.
00:40:47.780 | - But what makes us different in person?
00:40:51.500 | I actually think, I think there's energy
00:40:55.500 | that we don't have the instrumentation to define yet.
00:40:58.700 | And I think that there's a feel.
00:41:00.980 | I think there's an actual energetic feel that changes.
00:41:04.900 | And just 'cause we don't, again,
00:41:06.620 | just 'cause we can't measure it,
00:41:10.340 | doesn't mean it's not there.
00:41:11.740 | - Yeah, I would love to figure out what that is.
00:41:15.940 | Folks that are working on virtual reality
00:41:18.100 | are trying to figure out what that is.
00:41:20.060 | During the pandemic, everybody was on Zoom.
00:41:22.700 | Zoom and Microsoft, everybody was trying to figure out
00:41:25.280 | how do we replicate that.
00:41:26.900 | I'm trying to understand how to replicate that
00:41:28.940 | because it sure is not fun to travel across the world
00:41:31.700 | just to talk to Snowden or Putin or Zelensky.
00:41:35.180 | I'd love to do it over Zoom, but it's not the same.
00:41:38.540 | It's not the same.
00:41:39.380 | - No, it's not the same.
00:41:40.420 | I'd go in a room with Putin.
00:41:43.900 | - You would go in a room with Putin?
00:41:44.740 | - I would, yeah, 1,000%.
00:41:47.220 | I'd get a that's right out of him.
00:41:49.200 | - That's right.
00:41:50.040 | Well, first you would give him a that's right, probably.
00:41:53.620 | - Ah, getting and giving.
00:41:55.140 | - See, and here's the issue
00:41:57.480 | that trips everybody up in negotiation.
00:41:59.220 | The difference between hearing and speaking,
00:42:01.940 | the same words are vastly different.
00:42:03.900 | And what I'm looking for is the responses
00:42:07.100 | I'm getting out of you.
00:42:08.940 | 'Cause if you can't, first, that's right especially,
00:42:11.300 | like if you can't appreciate what that really means,
00:42:14.260 | hearing it is unsatisfying.
00:42:16.420 | - So those two words are really important to you.
00:42:18.140 | You talk about this in your book.
00:42:20.020 | Why is that, what does that's right mean?
00:42:22.500 | Why is it important?
00:42:23.540 | - Well, it means that what you just heard
00:42:25.140 | you think is unequivocably the truth.
00:42:29.200 | Like it's dead on, it hit the target, it's a bullseye.
00:42:33.540 | And there's been a topic of discussion,
00:42:37.180 | especially between my son and I a lot, like what happens?
00:42:40.720 | This oxytocin bonding moment.
00:42:44.200 | And his contention has always been like,
00:42:48.140 | Donald Trump is the poster child of what it means
00:42:51.900 | because Donald Trump's an address in an audience,
00:42:55.420 | he's in a debate with Hillary
00:42:56.860 | or he's giving a speech someplace.
00:43:00.020 | And when the people that are devoted to him,
00:43:02.900 | when they believe that what he's just said
00:43:05.560 | is completely right, it's insightful.
00:43:08.740 | They look at him or they look at the TV
00:43:11.900 | and they go, that's right.
00:43:14.260 | And it's what people say when they're bought in
00:43:19.060 | to what they just heard.
00:43:21.780 | Now, if you're not convinced of the way
00:43:24.820 | that Donald Trump's followers are bonded to him,
00:43:27.620 | and he also just like this, in my view,
00:43:31.860 | destroys the idea of common ground.
00:43:34.340 | Because when he first started to run for president,
00:43:37.380 | the pundits all said, he's a New Yorker.
00:43:41.420 | Nobody in the Republican Party is gonna like him.
00:43:43.380 | It's middle America, it's blue collar,
00:43:46.380 | it's regular common folks, factory workers.
00:43:51.020 | They're not gonna like Trump 'cause he's from New York
00:43:53.180 | and he went to Wharton, he's an Ivy Leaguer
00:43:55.780 | and he's a son of a wealthy real estate mogul
00:43:59.820 | and he had a million dollars handed to him
00:44:02.580 | when he got out of college.
00:44:04.060 | He's born with a silver spoon in his mouth.
00:44:08.500 | The rank and file Republicans
00:44:10.500 | are never gonna accept this guy based on common ground.
00:44:14.740 | Look how smart that was.
00:44:18.260 | - Do you think he's a good negotiator?
00:44:19.580 | Do you think Donald Trump is a good negotiator?
00:44:22.300 | - No, I think he's a great marketer.
00:44:24.780 | If you look at his negotiation track record,
00:44:28.340 | all right, so I started following Donald Trump
00:44:30.260 | in the '80s when I was in New York.
00:44:32.260 | I'm a last century guy, he's a last century guy.
00:44:34.140 | We've got mutual acquaintances.
00:44:36.460 | The minister that married him to Marla Maples
00:44:38.460 | was a friend of mine, a close friend of mine,
00:44:41.700 | and in 1998, I threw a fundraiser in his apartment
00:44:46.180 | at Trump Tower that he attended.
00:44:49.560 | So no shortage of mutual friends.
00:44:52.260 | We went to the same church.
00:44:54.300 | Still have mutual acquaintances, friends.
00:44:57.300 | And I've watched his track record in negotiation history,
00:45:02.220 | which is exactly his track record with North Korea.
00:45:06.820 | Where are we with North Korea?
00:45:08.540 | What was the deal that he made with North Korea?
00:45:10.940 | See, your answer is the same as everybody else's.
00:45:16.140 | Well, I remember it started out with a lot of fanfare,
00:45:20.440 | but I don't know what happened,
00:45:23.160 | 'cause nothing ever happens.
00:45:24.800 | - It's more public fanfare,
00:45:26.400 | so marketing-minded presentation of the message.
00:45:29.040 | - Starts out with a bang.
00:45:30.320 | If he doesn't cut the deal in a short period,
00:45:32.920 | a really short period of time, he moves on,
00:45:35.760 | and everybody wonders what had happened
00:45:37.520 | because there was so much fanfare at the beginning.
00:45:40.120 | Now, at the beginning, him even opening that dialogue
00:45:44.160 | with North Korea was masterful.
00:45:45.980 | I was such a fan when you got a president
00:45:50.280 | of the United States that is willing to sit down
00:45:53.540 | and talk with the leader of another nation.
00:45:55.680 | When every other president, all their advisors are saying,
00:45:59.820 | "The leader of North Korea is beneath you.
00:46:01.620 | "You cannot dignify him by responding to him directly."
00:46:04.660 | And consequently, the Trump administration
00:46:08.060 | inherits a can of worms that has been simmering
00:46:11.740 | for 30 years.
00:46:13.780 | He didn't get a sense of that,
00:46:14.740 | and he opened up a dialogue
00:46:16.860 | where nobody else was capable of opening a dialogue,
00:46:19.260 | and then it just went away.
00:46:21.100 | Nobody knows what happened.
00:46:22.460 | And there was no deal made.
00:46:25.660 | Great negotiators make deals.
00:46:29.020 | - What do you think about these accusations
00:46:31.580 | that he's a narcissist?
00:46:32.860 | If you're a narcissist, does that help you or hurt you?
00:46:37.500 | - Is there a more popular term these days than narcissist?
00:46:40.500 | Like, everybody's a narcissist.
00:46:41.500 | - Everybody you don't like is a narcissist.
00:46:42.980 | - Like the homeless guy down on the corner.
00:46:44.900 | He's a narcissist.
00:46:45.740 | That's why he's there.
00:46:47.180 | - Yeah, it's lost meaning for you a little bit?
00:46:49.220 | - Yeah, and first of all, most psychological terms,
00:46:52.320 | as a hostage negotiator,
00:46:55.180 | and really, we were never into psychology,
00:46:57.700 | and we steered away from it,
00:47:00.220 | 'cause psychology, at best, is a soft science.
00:47:05.220 | If it's not informed these days,
00:47:07.500 | if it's not informed by real studies
00:47:10.620 | or neuroscience, the guys that I'm impressed with these days
00:47:15.620 | are psychologists and neuroscientists.
00:47:17.260 | Now, I'm interested in that guy or gal.
00:47:20.300 | But then, psychology convention.
00:47:23.620 | Do you get 'em all together and they all agree?
00:47:25.980 | - No.
00:47:26.860 | But also, the interesting thing about psychology
00:47:28.660 | is each individual person is way more complicated
00:47:32.300 | than the category psychology tries to create.
00:47:34.860 | And there's something about the human brain.
00:47:36.820 | The moment you classify somebody as a narcissist
00:47:39.500 | or depressed or bipolar or insane in any kind of way,
00:47:44.460 | for some reason, you give yourself a convenient excuse
00:47:48.060 | not to see them as a complicated human being,
00:47:50.420 | to empathize with them.
00:47:51.900 | I had that when I was talking to,
00:47:53.420 | I did an interview with Kanye West,
00:47:55.860 | and then there's a lot of popular opinions
00:47:58.580 | about him being mentally unwell and so on.
00:48:03.340 | And I felt that that kind of way of thinking
00:48:06.820 | is a very convenient way of thinking,
00:48:08.380 | to ignore the fact that he's a human being
00:48:11.300 | that, again, wants to be understood and heard.
00:48:15.080 | And that's the only way you can have that conversation.
00:48:19.580 | - Yeah, I agree completely.
00:48:21.460 | That's right.
00:48:22.660 | (laughing)
00:48:24.300 | I feel so close to you now.
00:48:25.700 | (laughing)
00:48:27.380 | - It might be 'cause I'm not wearing pants.
00:48:28.940 | (laughing)
00:48:30.780 | All right, so what we're--
00:48:32.500 | - You're funnier than I am.
00:48:33.580 | That bothers me.
00:48:34.460 | - All right.
00:48:35.300 | (laughing)
00:48:36.780 | I'll say something stupid soon enough.
00:48:38.340 | Don't worry about it.
00:48:39.380 | But you said, we were talking about terrorists
00:48:42.940 | and not negotiating with terrorists.
00:48:45.300 | Is there something--
00:48:46.140 | - Nice job going all the way back
00:48:47.300 | to where that rabbit hole started.
00:48:49.100 | - There's where Alice in Wonderland right now.
00:48:51.740 | Is there something about walking away of not negotiating?
00:48:57.780 | Is there power in that?
00:48:59.940 | - All right, so it depends upon whether or not
00:49:02.780 | you're doing it with integrity or a tactic to start with.
00:49:06.820 | And then also, hostage negotiators are successful
00:49:11.820 | 93% of the time, kind of across the board.
00:49:16.060 | Which means that 7% of the time it's gonna go bad.
00:49:20.660 | And that was my old boss, Gary Noesner.
00:49:25.660 | I learned so much from Gary.
00:49:27.100 | But a phrase that he used over and over and over again
00:49:30.940 | 'til I finally worked the case and went bad
00:49:33.300 | was this is gonna be the best chance of success,
00:49:35.860 | best chance of success.
00:49:37.060 | And then something went bad, and I remember thinking like,
00:49:39.380 | well, best chance of success is no guarantee of success.
00:49:42.680 | So your question is, are there negotiations
00:49:46.100 | you should walk away from?
00:49:47.380 | If you got no shot at success, then don't negotiate.
00:49:50.400 | And you have to accept the fact
00:49:53.340 | there's some deals you're never gonna make.
00:49:55.220 | You know, we teach in my company,
00:49:56.460 | it's not a sin to not get the deal,
00:49:58.540 | it's a sin to take a long time to not get the deal.
00:50:02.300 | And Gary, in his infinite wisdom,
00:50:04.960 | they realized that there was something
00:50:08.820 | called suicide by cop.
00:50:10.200 | And that it might have,
00:50:14.340 | Gary was very much into clusters of behavior.
00:50:16.380 | He kept us away from psychological terms,
00:50:18.340 | and there would be clusters of behavior
00:50:20.900 | that would be high-risk indicators.
00:50:22.900 | And he wrote a block of instruction
00:50:24.900 | called high-risk indicators.
00:50:26.400 | Which meant if you start seeing this stuff show up,
00:50:29.960 | this thing's probably going bad.
00:50:32.220 | And you're gonna need to recognize that
00:50:34.660 | from the very beginning and adjust accordingly.
00:50:37.180 | And it's the same way in business and personal life.
00:50:40.420 | I'm talking to the head of a marketing company
00:50:43.420 | I have tremendous respect for.
00:50:44.820 | I admire what this guy and his company does.
00:50:47.820 | Started from scratch.
00:50:49.260 | He borrowed space in the back of a drugstore
00:50:54.540 | to start his company.
00:50:56.680 | And now it's hugely successful.
00:51:00.220 | And he's laying out to me
00:51:02.380 | that he finally had to confront a potential client
00:51:06.700 | and walk away from him.
00:51:08.740 | And he said, "How do you think I handle this?"
00:51:12.940 | My answer was, "1000% correct."
00:51:15.620 | And as a matter of fact,
00:51:17.120 | the behavior that he indicated, he's a type.
00:51:20.940 | And you should have walked away sooner than you did.
00:51:24.340 | Because this guy was playing you the whole time.
00:51:27.020 | Al-Qaeda, 2004, they're playing us.
00:51:29.140 | They're not negotiating.
00:51:30.340 | We called them out on it.
00:51:33.500 | We don't think you're negotiating.
00:51:35.300 | You wouldn't say it exactly like that,
00:51:36.620 | but that was absolutely the approach.
00:51:39.260 | Confront people on their behavior in a respectful way.
00:51:44.140 | - And signal that you're willing to walk away.
00:51:47.060 | - And mean it, 1000%.
00:51:48.660 | - And mean it.
00:51:49.500 | Isn't that terrifying?
00:51:51.220 | I mean, it's scary 'cause you don't want to really walk away.
00:51:53.660 | Or do you have to really want to walk away?
00:51:55.820 | - Well, this gets to core values, your view of reality.
00:51:59.980 | If it's an abundant world, it's not scary to walk away.
00:52:03.500 | If it's a finite world with limited opportunities,
00:52:06.300 | then it's horrifying.
00:52:07.380 | - But you have to use that worldview
00:52:13.100 | to be willing to actually walk away.
00:52:15.100 | - Yeah.
00:52:19.300 | - It could be walking away from a lot of money.
00:52:21.600 | It could be walking away from something
00:52:24.180 | that's gonna hurt people.
00:52:26.160 | 'Cause if you lose a hostage.
00:52:27.920 | - Yeah, well, but if they're not gonna let the hostage out.
00:52:32.000 | - Yeah.
00:52:32.840 | - Suicide by cop, they ain't letting them go.
00:52:35.000 | - The 7%, how do most negotiations fail?
00:52:41.320 | - The bad guys were never there
00:52:44.480 | to make a deal in the first place.
00:52:45.840 | If it was suicide by cop.
00:52:48.080 | If they were there to, if they're on a killing journey,
00:52:54.080 | it's an Israeli phrase.
00:52:55.420 | If they're on a killing journey,
00:52:58.840 | and the actions that they're currently engaged in
00:53:01.920 | are part of that killing journey.
00:53:04.760 | - Killing journey.
00:53:06.200 | Is there advice you can give about,
00:53:08.600 | you mentioned Israel, Palestine, the Middle East.
00:53:12.560 | Taking on a few conversations on that topic,
00:53:15.440 | is there hope for that part of the world?
00:53:17.280 | And from that hope, is there some advice you could lend?
00:53:21.400 | - Yeah, I think there's hope.
00:53:23.000 | There's, then I got friends on both sides.
00:53:26.880 | And also, when I got my, after I left the FBI,
00:53:31.880 | most people listening to this
00:53:35.400 | probably not gonna remember who Rodney Dangerfield was.
00:53:38.160 | - Oh, come on.
00:53:39.000 | - But he's a comedian.
00:53:39.840 | - Still doesn't get any respect, yeah.
00:53:41.200 | - Yeah, yeah, and--
00:53:42.880 | - New Yorker?
00:53:43.920 | Is he a New Yorker?
00:53:44.840 | - I think he was a New York guy.
00:53:45.680 | - Or like Jersey or something, yeah.
00:53:47.920 | - Yeah, and he did a movie a long time ago
00:53:50.440 | called "Back to School."
00:53:51.320 | He went back to school.
00:53:52.160 | He was an old guy, "Back to School."
00:53:53.200 | So I went back to school after I left the FBI.
00:53:55.720 | I did get a master's at Harvard Kennedy.
00:53:59.080 | And that's where I'm running across
00:54:02.480 | people on both sides of that.
00:54:03.920 | And when they could talk, they said,
00:54:08.160 | "Let's start from the premise that we,
00:54:10.000 | "both sides want a better life for our kids."
00:54:13.080 | Which is this version that I was telling you earlier
00:54:16.900 | from Tom Girardi.
00:54:18.400 | Let's pick a point in the future
00:54:20.320 | that we're both happy with.
00:54:22.400 | And they found that they could talk.
00:54:24.200 | All right, so it might not be better for us.
00:54:26.520 | How do we make it better for our kids?
00:54:28.420 | And that's where the hope derives from.
00:54:31.520 | Because I think both sides ultimately
00:54:33.920 | want it to be better for their kids,
00:54:35.400 | which is why they still engage in interactions,
00:54:39.280 | and which is why I think the leadership,
00:54:41.240 | regardless of how compromised they might be
00:54:45.120 | on either side, there are few straight players
00:54:48.000 | in the game in the Middle East.
00:54:50.540 | Or anywhere, for that matter.
00:54:52.560 | But they want a better future for their kids.
00:54:56.460 | You get people to agree that you want a better future
00:54:58.220 | for your kids, now you can start talking about,
00:54:59.580 | "Well, how do we work our way back from that?"
00:55:02.180 | And then, all right, so we got a mutual point in the future.
00:55:05.020 | The Israeli-Palestinian negotiations
00:55:07.140 | are also, for me, interesting.
00:55:11.100 | 'Cause you mentioned Clubhouse
00:55:12.140 | about almost two years ago now,
00:55:13.940 | when Israel was shelling Gaza.
00:55:18.300 | They hit the UPI office.
00:55:19.980 | They were hitting, they got fed up
00:55:23.300 | with the rocket attacks from Hamas.
00:55:25.580 | And of course Hamas is putting rockets in the UPI office,
00:55:29.220 | or the AP office, whichever press office it was there.
00:55:32.140 | How's that office gonna be there otherwise?
00:55:34.100 | Hamas is running the show.
00:55:35.520 | You're not gonna run that office
00:55:37.780 | unless you let them store weapons there.
00:55:40.420 | That's just part of the game.
00:55:41.860 | And are they gonna store 'em
00:55:45.220 | in specially designated ammunition dumps?
00:55:47.460 | No, they're gonna put 'em in schools,
00:55:49.060 | they're gonna put 'em in hospitals,
00:55:50.420 | they're gonna put 'em in all places
00:55:52.380 | that when Israel hits 'em, they're gonna look really bad.
00:55:55.980 | So after a while, Israel gets fed up
00:55:57.620 | and they start shelling Gaza,
00:55:58.820 | and they're hitting these places.
00:56:00.420 | Friend of mine, Nicole Benham,
00:56:03.500 | is hosting rooms on Clubhouse,
00:56:06.620 | and she says, "You gotta come on.
00:56:08.820 | "The vitriol is killing me.
00:56:10.040 | "These are all turning into screaming matches.
00:56:12.080 | "Nobody's talking to anybody."
00:56:14.380 | I said, "All right, cool, we'll go on, we'll do it."
00:56:17.180 | And watch, we won't have a single argument.
00:56:19.620 | We'll invite people on from both sides.
00:56:21.620 | There was one rule.
00:56:25.060 | Before you started to describe
00:56:28.340 | what you thought of the other side,
00:56:30.140 | you had to say, "Before I disagree with you,
00:56:33.860 | "here's what I think your position is."
00:56:36.460 | And you gotta continue to state the other side's position
00:56:39.380 | until they agree that you've got it right.
00:56:41.560 | Now, what happened?
00:56:45.620 | No agreement and no arguments.
00:56:48.260 | That was what we were really going for.
00:56:50.540 | We wanted to show that people on both sides,
00:56:54.140 | in one of their emotional timeframes,
00:56:56.740 | if your only requirement was
00:56:58.560 | you had to state the other side's position first,
00:57:01.120 | nobody got out of control.
00:57:04.660 | - Did it work?
00:57:06.140 | - That's exactly what happened.
00:57:07.940 | We wanted to show people that you can have conversations
00:57:11.300 | that do not devolve into screaming matches with vitriol,
00:57:16.300 | talking about how you're dedicated
00:57:20.460 | to the destruction of the other side. Just first,
00:57:23.620 | see if you can outline where they're coming from.
00:57:26.300 | - That's really impressive
00:57:27.220 | because I've just, having seen on Clubhouse,
00:57:30.860 | people, which part of the reason I liked Clubhouse,
00:57:34.220 | 'cause you get to hear voices from all sides,
00:57:36.900 | they were emotionally intense.
00:57:40.620 | - Right.
00:57:41.460 | (laughing)
00:57:43.540 | - It was, I mean, I'm sweating just
00:57:46.820 | in the buildup of your story here.
00:57:48.500 | I thought it could go to hell,
00:57:51.340 | but you're saying it kind of worked.
00:57:52.820 | - Not one person lost control.
00:57:55.140 | Now, of the two sides,
00:57:56.980 | the people that were speaking on behalf of the Israelis
00:58:00.060 | were a little better at articulating
00:58:04.620 | supportive positions for the Palestinians.
00:58:07.100 | Most of the people that wanna speak up
00:58:09.500 | on behalf of the Palestinians,
00:58:11.260 | they just, they'd wanna start it like,
00:58:13.860 | "You're doing this," and I'd say,
00:58:15.380 | "No, no, no, no, no, you can go there, just not yet.
00:58:18.340 | "Before you go there, you can say that all you want.
00:58:21.440 | "Before you go there, you've gotta try to articulate
00:58:25.340 | "to them where they're coming from.
00:58:26.840 | "They gotta tell you you got it right."
00:58:28.180 | And what would consistently happen
00:58:30.740 | is there's a leveling out of a person
00:58:35.140 | to try to see the other side's perspective and articulate it.
00:58:38.780 | It's enormously beneficial to the person
00:58:40.860 | who's trying to do it,
00:58:42.980 | which was really the point that we were trying to make.
00:58:44.940 | - It's a really interesting exercise,
00:58:46.180 | I mean, by way of advice.
00:58:49.020 | So if it works at clubhouse,
00:58:50.620 | for people who don't know,
00:58:51.500 | that's like a voice app where you can be anonymous.
00:58:55.780 | So it's really regular people,
00:58:57.620 | but regular people who can also be anonymous.
00:58:59.220 | It's just, it can be chaos.
00:59:02.200 | If it works there, that's really interesting.
00:59:04.140 | For when you sit down for a conversation
00:59:06.900 | and cross the table from somebody,
00:59:08.700 | don't have them even steel man the other side.
00:59:12.380 | Have them just state the other side.
00:59:16.260 | Just explain your understanding of it.
00:59:21.260 | That's it.
00:59:22.100 | - And every now and then I would jump in.
00:59:23.900 | Somebody's supporting Israel,
00:59:28.660 | whoever the heck they were,
00:59:30.780 | and they'd say a couple things.
00:59:34.340 | And the Palestinian guy would be like,
00:59:36.420 | or gal, or supporters of them,
00:59:38.340 | would say, you know, you missed some stuff.
00:59:39.940 | And I'd say, let me jump in.
00:59:41.900 | First of all, I know what the Nakba is.
00:59:43.860 | The Nakba is a catastrophe.
00:59:47.580 | That's the day Israel was born.
00:59:49.600 | You, you know, for the rest of the world,
00:59:51.300 | it's the birth of Israel, for you it's the Nakba.
00:59:53.460 | I said, you've got members of your family
00:59:57.060 | that are still walking around
01:00:00.100 | carrying keys to the front door of the house they abandoned.
01:00:04.100 | And they'd be like, yeah.
01:00:06.580 | And I'd say, you feel bad that,
01:00:11.980 | in point of fact, that in World War II,
01:00:14.680 | the world stood back and watched
01:00:19.900 | while the Nazis threw the Jews off a building.
01:00:23.460 | The only problem was they landed on you.
01:00:26.780 | And they'd be like, yeah.
01:00:30.140 | That's where they're coming from.
01:00:32.940 | (inhales)
01:00:35.180 | So articulating, you know, deeply
01:00:38.420 | what the other side feels is transformative
01:00:40.940 | for both people involved in the process.
01:00:43.240 | - What's the toughest negotiation
01:00:47.100 | you've ever been a part of,
01:00:48.300 | or maybe observed or heard of?
01:00:50.600 | What's a difficult case that just stands out to you?
01:00:55.800 | Or maybe just one of many.
01:00:58.880 | - Well, the stuff we went through with Al-Qaeda
01:01:02.720 | in and around Iraq, Iraq and Saudi,
01:01:07.760 | first one was in Saudi in 2004 timeframe.
01:01:10.340 | The hardest part about that was working with family members
01:01:14.840 | and not deceiving them about the possible outcome.
01:01:21.560 | - Yeah, how do you talk to family members?
01:01:23.280 | Is that part of the negotiation?
01:01:25.000 | - Yeah, empathy, learning empathy the hard way.
01:01:28.680 | And then being able to take it up to higher levels.
01:01:31.240 | 'Cause at its base level,
01:01:33.760 | a guy that we're working with now
01:01:35.040 | that's coaching us in the US,
01:01:36.640 | and is a business partner, his name is Jonathan Smith,
01:01:38.840 | he pointed out to us that there's a Shu-Ha-Ri concept.
01:01:43.720 | Are you familiar with Shu-Ha-Ri?
01:01:45.280 | It's a martial arts concept.
01:01:46.740 | And Shu is: do it exactly as the master's
01:01:51.320 | telling you to do it.
01:01:52.160 | Wax on, wax off, karate kid stuff.
01:01:56.720 | Ha is when you've done the repetitions enough times,
01:02:01.480 | you're getting a feel for it,
01:02:02.880 | and you begin to see the same lessons
01:02:05.800 | coming from other masters.
01:02:07.200 | You're seeing the same things show up in other places.
01:02:09.860 | And at the Ri level, you're still in the discipline,
01:02:15.440 | but you're making up your own rules.
01:02:17.880 | It's almost a flow state.
01:02:19.840 | And you don't realize that you're making up your own rules.
01:02:25.960 | And if somebody asks you where you learned that,
01:02:28.520 | you'd probably say, "My sensei taught it to me.
01:02:32.440 | "My master taught it to me."
01:02:33.880 | This will come back around to negotiating
01:02:37.920 | with families pretty quick.
01:02:39.280 | We did this once because there's a bunch of people
01:02:42.560 | that we coach, business people that are scared
01:02:46.000 | of the amount of money that they're losing
01:02:47.520 | if we're not coaching them regularly.
01:02:50.040 | One of these guys, Michael, we're interviewing him
01:02:53.240 | for a social media posting about two years ago.
01:02:56.360 | And Michael says, "Yeah, you gotta gather data
01:02:58.600 | "with your eyes."
01:03:00.240 | And I went, "Ooh, I like that."
01:03:05.240 | I said, "Where did you hear that before?"
01:03:07.480 | And he goes, "I don't know, I heard it from you, I think."
01:03:10.240 | And I'm like, "No, no, no, no, no."
01:03:12.040 | (laughing)
01:03:13.280 | I'd have remembered saying that.
01:03:15.160 | That's the first time I've heard that.
01:03:16.600 | He's in Ri.
01:03:18.160 | So what's this got to do with families?
01:03:20.640 | Empathy at its base level, at a Shu level,
01:03:23.480 | I learned it on the suicide hotline,
01:03:25.320 | is saying like, you sound angry.
01:03:27.920 | I'm just calling out the elephant in the room.
01:03:31.860 | Your emotions, what's driving you?
01:03:35.700 | I'm throwing a label on your affect.
01:03:39.640 | And I'm saying you sound, or it sounds like you are,
01:03:43.560 | 'cause that's the basic karate kid wax on, wax off approach.
01:03:49.520 | Now there are a lot of hostage negotiators
01:03:51.080 | that'll tell you empathy doesn't work at home.
01:03:52.560 | Not true, they've never gotten out of shoe.
01:03:54.840 | You're getting ready to talk to your significant other
01:04:00.160 | and you wanna go someplace
01:04:01.760 | that you know is gonna make her angry.
01:04:05.440 | You wanna go do something.
01:04:06.720 | Now that's real negotiation right there.
01:04:09.160 | You could say to her, you sound angry,
01:04:11.680 | in which case she's gonna blow up
01:04:13.360 | 'cause her reaction is, "You made me angry, bozo.
01:04:18.880 | "Can you act like you're an innocent third party
01:04:21.200 | "or that you were independent of how I feel bad?"
01:04:24.800 | And you learn a little bit more and you say,
01:04:27.460 | the Ha level is, "This is probably gonna make you angry."
01:04:32.000 | And then what I did with families,
01:04:36.580 | I knew how they felt before I walked in the door.
01:04:41.240 | I knew that they were scared to death.
01:04:45.600 | You find out that your husband, your father,
01:04:48.780 | your brother has been grabbed by Al-Qaeda
01:04:51.440 | who are in the business of chopping people's heads off,
01:04:55.040 | you're gonna be horrified.
01:04:56.440 | I can't walk into them and go like, "You sound angry."
01:04:59.960 | Of course I'm angry, you idiot.
01:05:02.140 | But knowing what they are,
01:05:04.880 | I used to walk into families' houses
01:05:06.680 | and I'd say, "I know you're angry."
01:05:11.400 | Now what do the circumstances dictate
01:05:13.960 | that they should also feel?
01:05:16.360 | They're gonna feel abandoned by their government.
01:05:18.900 | They're gonna feel totally alone.
01:05:21.380 | They're gonna be scared and they're gonna be angry
01:05:24.580 | because they feel the government abandoned them.
01:05:28.440 | Now, in point of fact, is this an accurate statement?
01:05:32.260 | That their loved one voluntarily went into a war zone
01:05:35.340 | and voluntarily went someplace
01:05:37.660 | their government told them not to go.
01:05:40.380 | Are the facts that the government abandoned them?
01:05:42.780 | Absolutely not, as a matter of fact,
01:05:43.940 | the government tried to get them to not go
01:05:45.740 | and they went anyway.
01:05:47.280 | But that doesn't change how they felt in the moment.
01:05:50.520 | And I'd walk into a house and I'd go, "I know you're angry.
01:05:53.800 | "I know you feel abandoned and alone.
01:05:55.760 | "And I know you're horrified.
01:05:58.180 | "And I know you feel the United States government
01:06:00.180 | "has abandoned you."
01:06:01.280 | And they would look at me and go like,
01:06:04.420 | "Yeah, what do we do now?
01:06:10.600 | "Now we're ready to rock."
01:06:15.440 | - Is there, with Al Qaeda or in general,
01:06:18.120 | is there a language barrier too?
01:06:20.400 | It could be just barriers
01:06:21.600 | of different communication styles.
01:06:23.320 | I mean, you got like a New Yorker way about it.
01:06:26.520 | (Lex laughing)
01:06:28.140 | That might make somebody from like,
01:06:30.200 | I don't know, Laguna Beach uncomfortable.
01:06:33.120 | (Lex laughing)
01:06:36.000 | Do you feel that language barrier in communication?
01:06:38.640 | That language and communication style
01:06:42.560 | in itself creating a barrier.
01:06:46.160 | - You got a barrier when you think
01:06:48.120 | that your way is the way.
01:06:49.840 | - Sure, that's the biggest barrier.
01:06:51.200 | - Yeah, and that happens all the time.
01:06:53.380 | When people talk about,
01:06:55.840 | "What about cross-cultural negotiations?
01:06:57.880 | "What hand do I gotta shake hands with
01:07:02.280 | "so that I can get my way?"
01:07:04.000 | Well, if you strip it all down,
01:07:07.080 | we're all basically the same blank slate when we were born.
01:07:10.880 | Everybody's got a limbic system.
01:07:13.000 | Everybody's limbic system works pretty much the same way.
01:07:16.520 | People are driven by the same sorts of decisions.
01:07:18.680 | How does this affect my future?
01:07:21.120 | What am I at risk of losing?
01:07:22.680 | How does this affect my identity?
01:07:24.840 | Whether you're a kidnapper,
01:07:27.120 | You're a New York City businessman.
01:07:29.100 | You're a tobacco farmer in the South.
01:07:31.400 | All making those same decisions based on those same things.
01:07:35.360 | So as soon as I start to navigate that,
01:07:38.040 | and I tailor my approach, which is what empathy is,
01:07:43.040 | to how you see things.
01:07:44.860 | So I can be the biggest goofball ever from,
01:07:51.120 | if you live in the South, yeah, maybe I'm a New Yorker,
01:07:54.720 | or I'm somebody from LA, or somebody from Chicago.
01:07:59.060 | But my geography is foreign to you,
01:08:01.920 | but as soon as I start dialing in on how you see things,
01:08:05.160 | suddenly you're listening.
01:08:08.060 | - What about the three voices you talk about?
01:08:12.500 | The different voices you can use in that communication?
01:08:15.020 | - Right, the assertive voice, direct and honest.
01:08:17.980 | I'm a natural born assertive.
01:08:20.260 | - Natural born.
01:08:22.460 | I thought we're all blank slate.
01:08:23.780 | Is your born-- - Yeah, well,
01:08:25.420 | stop catching me on what I said.
01:08:26.700 | How dare you accuse me of what I've said?
01:08:29.280 | To quote Bono, I stand accused of what I've said,
01:08:31.520 | the things I've said.
01:08:32.740 | - That's a good line.
01:08:33.740 | (laughing)
01:08:34.580 | He's got a few good lines.
01:08:36.740 | - Yeah.
01:08:37.580 | - So assertive voice, you're born that way.
01:08:40.100 | Which one, what are the other ones?
01:08:42.660 | - Analyst.
01:08:43.500 | You're an analyst.
01:08:45.900 | - And I can tell you're assertive.
01:08:47.500 | (laughing)
01:08:50.220 | - Yeah.
01:08:51.220 | - What's an analyst voice?
01:08:53.180 | - Well, an analyst is close to--
01:08:55.420 | - Smarter?
01:08:56.780 | More thoughtful?
01:08:57.620 | (laughing)
01:08:59.900 | - No, as a matter of fact.
01:09:01.200 | - Look, you ever do a decision tree?
01:09:06.100 | - Yeah.
01:09:06.940 | - See, you like it too, don't you?
01:09:08.780 | - So decision trees, I'm a computer scientist,
01:09:12.260 | so I like mathematical, systematic ways of seeing the world.
01:09:17.260 | - He's an analyst.
01:09:19.780 | You think Donald Trump would ever say that?
01:09:23.320 | - Unlikely.
01:09:25.180 | - Well, is he more the assertive kind?
01:09:27.900 | - He's a natural born assertive, yeah.
01:09:29.460 | - Yeah.
01:09:30.300 | Are all New Yorkers like this?
01:09:32.740 | Is this something in the water?
01:09:34.260 | - No, that's the craziest thing.
01:09:35.340 | I mean, there's an affect that a city can have.
01:09:37.780 | - Yeah.
01:09:39.300 | - And New York's Northeast, not just New York,
01:09:43.500 | but the Northeast is a little more the affect of the area,
01:09:48.500 | of the culture of the area.
01:09:50.940 | The individuals still boil down
01:09:52.940 | into the three types, across the board.
01:09:55.820 | - What's the third one?
01:09:57.740 | - Accommodator, smiling, optimistic, hopeful.
01:10:00.620 | I'm a thousand percent convinced that the phrase
01:10:04.180 | hope is not a strategy is designed at people's frustration
01:10:09.180 | over a third of the population being accommodators
01:10:12.860 | that are hope-driven.
01:10:15.020 | I hope this works out.
01:10:16.220 | And they're very relationship,
01:10:21.860 | on the surface, they're very relationship oriented.
01:10:24.180 | They tend to appear to be very positive, and they are,
01:10:26.980 | but it's really built around hope.
01:10:29.500 | - And the idea is you can adopt these three voices.
01:10:32.460 | - You can, yeah, you can learn 'em.
01:10:33.660 | They're all learnable.
01:10:34.740 | Analysts are often mistaken for accommodators,
01:10:40.540 | because as you said before,
01:10:43.580 | analysts are more introspective, more analytical.
01:10:47.780 | They're looking at the systems at work.
01:10:50.420 | And if they like to learn,
01:10:55.380 | they notice that accommodators make more deals
01:10:59.500 | than they make.
01:11:00.580 | They also notice that there's a higher failure rate
01:11:02.580 | of the deals, but since they notice stuff
01:11:05.460 | and they think about it, they catch on faster
01:11:08.500 | than assertives do that the pleasant nature
01:11:11.820 | of an accommodator contributes strongly
01:11:15.420 | to them making deals.
01:11:16.620 | Like my daughter-in-law is an analyst.
01:11:22.380 | Another descriptor we have is that
01:11:24.740 | analysts are assassins.
01:11:26.340 | An analyst will snipe you from a thousand yards out
01:11:29.940 | in the middle of the night,
01:11:31.580 | and you never know what hits you,
01:11:32.980 | and they're really happy with that.
01:11:34.780 | (Lex laughing)
01:11:36.380 | - But how has assertiveness, the assertive voice
01:11:40.700 | served you in negotiation?
01:11:42.500 | - Poorly.
01:11:43.340 | The assertive voice is almost always counterproductive.
01:11:48.500 | It feels like getting hit in the face with a brick.
01:11:50.940 | And that's almost always counterproductive.
01:11:55.180 | So for me to be more effective,
01:11:57.020 | especially in a negotiation,
01:11:59.100 | I'll need to slow down and smile.
01:12:02.540 | - You know, I heard that Teddy Roosevelt
01:12:06.300 | was a good negotiator and that he was extremely stubborn,
01:12:11.220 | and perhaps the right term for that would be assertive,
01:12:14.420 | but he picked his battles.
01:12:15.700 | Is there some value to holding strong to principles?
01:12:20.700 | So I don't even know if that's probably
01:12:23.580 | the opposite of empathy.
01:12:24.600 | Are there times when you can just stick,
01:12:28.000 | be extremely stubborn to your principles?
01:12:30.780 | To win a negotiation?
01:12:31.620 | - Oh, we do it all the time.
01:12:32.460 | We just, you know, we're just nice about it.
01:12:35.220 | - Okay, it helps to be nice, you're saying.
01:12:37.140 | - Well, yes, because I need you to hear me.
01:12:39.580 | And the assertive tone of voice,
01:12:42.080 | so when we do our training,
01:12:43.980 | typically we do an exercise called 60 seconds or she dies.
01:12:46.980 | And I play the bad guy bank robber,
01:12:50.420 | and I ask you to be the hostage negotiator.
01:12:52.940 | And your job is to,
01:12:56.020 | I'll give you the four real world constraints,
01:12:59.040 | and then you're gonna try and negotiate me out of the bank.
01:13:02.220 | Now we're doing this,
01:13:03.520 | now the first voice that I always use in that exercise
01:13:07.040 | is the assertive voice,
01:13:08.380 | which is the commanding voice.
01:13:11.920 | It's the voice that all police officers
01:13:13.440 | are taught to use in the street.
01:13:15.040 | Issue loud and clear commands.
01:13:18.000 | You know, it doesn't,
01:13:19.680 | to me, I don't feel like I'm attacking you.
01:13:23.540 | I just feel like I'm being direct and honest and clear.
01:13:27.440 | You on the other hand feel attacked.
01:13:30.760 | Now we're doing this exercise in Austin,
01:13:33.240 | couple of years ago.
01:13:34.480 | The first participant has an Apple watch on.
01:13:37.120 | He tells us afterwards,
01:13:39.920 | that sitting still,
01:13:43.520 | not even answering,
01:13:45.460 | when he first gets hit in the face
01:13:47.240 | with the assertive voice,
01:13:48.520 | his heart rate jumped to 170,
01:13:50.220 | which is a typical fight or flight reaction.
01:13:53.400 | I come at you like I'm fighting you.
01:13:55.200 | Your fight flight mechanisms all kick into gear,
01:14:00.480 | which clouds your thinking.
01:14:02.280 | You're automatically dumber in the moment.
01:14:04.460 | So if I wanna make a great long-term deal with you,
01:14:09.000 | highly profitable,
01:14:10.400 | I'm agnostic to you being profitable.
01:14:14.160 | If it helps you be profitable?
01:14:15.160 | Well, that's fine.
01:14:16.200 | I'm here to make money for me.
01:14:17.800 | Me making you dumber will always hurt me.
01:14:23.740 | Me making you feel attacked will always hurt me.
01:14:28.740 | So there's never a value in being,
01:14:32.780 | in you making me afraid.
01:14:36.240 | There's never a long-term value in it.
01:14:39.260 | It's another thing that Tal Roz,
01:14:42.780 | when we were writing a book,
01:14:44.580 | braced me on.
01:14:45.480 | 'Cause he said,
01:14:47.400 | "There's scientific data out there
01:14:48.620 | "that's called strategic umbrage."
01:14:51.580 | Well, there's data.
01:14:52.780 | Well, whether or not it's scientific,
01:14:54.300 | I would call that into question.
01:14:56.540 | But he said,
01:14:57.360 | "There's studies out there that show
01:14:59.620 | "that strategic umbrage works."
01:15:02.160 | And another thing that I also enjoy,
01:15:07.280 | you probably get tired of me saying
01:15:09.640 | wonderful things about Andrew.
01:15:11.820 | He taught me--
01:15:12.660 | - There's never enough wonderful things to say
01:15:15.700 | about the great Andrew Huberman,
01:15:17.700 | the host of the Huberman Lab Podcast
01:15:19.460 | that everybody should subscribe to.
01:15:20.900 | You should talk to Andrew.
01:15:21.860 | - You're funnier than he is, though.
01:15:23.100 | I'll give you that.
01:15:23.940 | - Hear that, Andrew?
01:15:24.900 | - He's funny accidentally.
01:15:26.580 | He makes me laugh all the time.
01:15:28.180 | Not when he's trying to be funny.
01:15:29.980 | - He's a really,
01:15:31.220 | he's one of the people in this world
01:15:32.740 | that's truly legit.
01:15:35.060 | He's a really strong scientist
01:15:36.900 | and a really strong communicator
01:15:38.500 | and a good human being.
01:15:40.020 | Those together don't come often.
01:15:43.140 | And it's nice to see.
01:15:44.660 | - Yeah, yeah, yeah.
01:15:46.220 | - Yeah, he's a treasure, national treasure.
01:15:47.740 | Anyway, you were saying?
01:15:48.620 | - Well, he sort of taught me how to think about data
01:15:51.100 | and studies and science
01:15:53.140 | and also from different books that he's turned me onto.
01:15:57.940 | He's really helped me think about this stuff.
01:15:59.660 | So the studies about strategic umbrage
01:16:02.060 | were done, the ones that I've seen,
01:16:05.460 | that show it's effective.
01:16:06.740 | There were simulated negotiations with college students.
01:16:10.980 | Now, here's the problem with that.
01:16:12.580 | A simulated negotiation with a college student,
01:16:15.620 | a college student's gonna sit down
01:16:16.980 | as part of their assignment.
01:16:18.500 | They're gonna sit down one time.
01:16:20.780 | They're gonna sit down for 45 minutes
01:16:23.380 | and they're gonna think that
01:16:24.220 | if they didn't come to a deal at all, that they failed.
01:16:27.180 | And there's no ongoing implementation.
01:16:30.500 | There's just a deal
01:16:31.500 | and then they walk away of a pretend situation.
01:16:34.540 | So they got no actual real skin in the game.
01:16:37.420 | There's no deal on earth.
01:16:40.060 | Do you sit down and come to an agreement in 45 minutes
01:16:43.540 | and never see each other again
01:16:44.660 | 'cause there's the implementation of the deal,
01:16:46.540 | even if it's only payment.
01:16:50.300 | So the data is flawed based on the way it was collected.
01:16:53.980 | It's a highly flawed study.
01:16:56.340 | And all data is flawed, as you know, as a scientist.
01:16:59.380 | You just gotta be aware of what the flaws are
01:17:01.100 | and decide whether or not that destroys the study.
01:17:03.700 | Or what do you think?
01:17:05.660 | Take a look at the data.
01:17:07.420 | There's no such thing as perfect data.
01:17:09.460 | Look at the data, see what you think of it.
01:17:10.940 | The data that says that strategic umbrage works
01:17:14.740 | is based on flawed circumstances.
01:17:18.420 | - Can you explain strategic umbrage?
01:17:20.300 | - Getting mad, scaring the other side into a deal.
01:17:22.780 | Getting mad at using anger strategically
01:17:26.460 | to bully the other side into an agreement.
01:17:29.800 | - That's nice to hear in some sense.
01:17:34.500 | It's nice to hear that empathy is the right way
01:17:37.740 | in almost all situations.
01:17:39.320 | - Best chance is success.
01:17:42.160 | Not that it works every time,
01:17:44.700 | just it works more than anything else does.
01:17:46.940 | - What is the technique of mirroring?
01:17:48.500 | There's a lot of cool stuff in your book
01:17:49.900 | that just kinda jump around.
01:17:51.260 | What's mirroring?
01:17:52.100 | - Mirroring is like,
01:17:53.220 | it's one of the most fun skills
01:17:56.440 | because it's the simplest to execute.
01:17:58.580 | You just repeat one to three-ish words
01:18:01.820 | of what somebody said.
01:18:02.660 | Usually the last one to three words.
01:18:04.420 | What I've found about it is
01:18:08.980 | the people that really like mirroring
01:18:12.580 | love it because it's so simple
01:18:14.140 | and so effortless and invisible.
01:18:17.060 | They typically, for lack of a better term,
01:18:19.940 | tend to be both high IQ and high EQ.
01:18:22.300 | I'm not a high IQ guy.
01:18:23.780 | I'm an average dude.
01:18:25.700 | I like to think that I can learn.
01:18:27.380 | In EQ, emotional intelligence is a skill you can build
01:18:32.380 | and I'm always working on building it.
01:18:34.380 | But a lot of really regular average people
01:18:38.300 | will be like, "Mirroring, that's stupid.
01:18:40.500 | "I'm not doing that."
01:18:42.180 | And I don't know why they don't like it.
01:18:44.140 | But when I find somebody that loves to mirror,
01:18:46.700 | I'll always ask them, "How'd you score on IQ?"
01:18:53.020 | And typically their IQ's pretty high.
01:18:55.840 | Now, I don't know why that combination
01:18:58.140 | attracts people to mirroring
01:18:59.580 | 'cause there's nine skills.
01:19:01.620 | Eight from hostage negotiation
01:19:04.740 | and the ninth really was tone of voice
01:19:06.980 | and we just define that as a skill.
01:19:09.860 | And each one is different
01:19:11.420 | and focuses on different components of the conversation.
01:19:15.280 | And a lot of people don't like to mirror.
01:19:18.700 | They found it so awkward.
01:19:19.780 | Like I don't particularly,
01:19:21.700 | I'm not particularly strong in mirroring.
01:19:23.300 | I gotta do it intentionally.
01:19:24.300 | I'm good at labeling.
01:19:25.620 | - But does it almost always work?
01:19:27.260 | - Oh yeah.
01:19:29.660 | - Yeah, it feels maybe awkward,
01:19:32.340 | but it's true there's gotta be ways
01:19:35.220 | to signal that you're truly listening.
01:19:37.220 | - That's part of it.
01:19:39.140 | - I think you can do body language.
01:19:40.420 | You can, yeah, there's a lot of ways to signal that,
01:19:42.980 | but mirroring is probably just this trivial little hack.
01:19:46.500 | - It kinda is.
01:19:48.220 | - You know what?
01:19:49.940 | There's a situation,
01:19:50.760 | I had a conversation with Stephen Kotkin.
01:19:52.420 | He's this historian.
01:19:54.100 | And he would say my name a lot throughout the conversation.
01:19:59.100 | He would be like, "Well, you have to understand, Lex,
01:20:02.260 | "is that, and for some reason
01:20:03.900 | "that was making me feel really good.
01:20:06.240 | "I was like, he cares about me.
01:20:08.140 | "And I wonder if that key, if everyone has that key,
01:20:11.660 | "that could be a name, just using people's name
01:20:14.220 | "could be powerful."
01:20:16.700 | - Using the name is really context-driven.
01:20:20.660 | It can be extremely powerful with someone who's genuine,
01:20:26.460 | and it comes across in their demeanor,
01:20:28.560 | and it's used in a way that you can tell
01:20:32.580 | is meant to encourage you as opposed to exploit you.
01:20:37.180 | - Sure.
01:20:38.140 | - And the people that are really into exploiting
01:20:41.340 | will also use it, do the same thing.
01:20:44.300 | - So you have to be, you have to avoid using the things
01:20:47.740 | that people that are exploiters, manipulators use,
01:20:51.540 | 'cause it might signal to others
01:20:53.860 | that this person is trying to trick me.
01:20:56.060 | - Gotta be very conscious of it, yeah.
01:20:57.940 | - What's labeling that you mentioned, the thing you like?
01:21:00.060 | - Well, I said earlier that old progression
01:21:02.820 | from you sound angry to this is probably
01:21:04.700 | gonna make you angry to I know you're angry.
01:21:07.540 | Labeling is hanging a label on an emotion or an affect,
01:21:12.320 | and then just calling it out.
01:21:14.320 | - Is that almost always good?
01:21:18.100 | Could it be a source of frustration
01:21:19.780 | when a person's being angry and you put a label on it?
01:21:22.720 | Call out the elephant.
01:21:26.560 | Is it possible that that will lead to escalation
01:21:33.020 | of that feeling versus a resolution?
01:21:35.060 | - Well, what would make it bad?
01:21:39.100 | If I'm pointing out that blatantly obvious,
01:21:44.240 | like if I say, look, I need you to get up
01:21:47.540 | and go down to the bank and make the deposit.
01:21:49.500 | Let's say I'm talking to somebody who works in my company.
01:21:53.980 | I need you to get on the phone with this person
01:21:56.140 | and make the appointment.
01:21:57.900 | And they go, sounds like you want me to talk to this person.
01:22:01.180 | I'm like, yeah, I'm gonna do that.
01:22:30.620 | - You weren't actually listening.
01:22:32.740 | And the label indicates that you're not listening.
01:22:37.740 | You know, I'm teaching at USC,
01:22:41.020 | and I'm teaching labels,
01:22:44.180 | and one of the kids in the class,
01:22:46.540 | he just wants to take the skills and make his deals
01:22:49.300 | and just hustle 'em.
01:22:50.680 | And he's just looking for a hustle.
01:22:54.180 | So he writes up a paper about,
01:22:56.260 | you know, he goes, there's some malls,
01:22:58.660 | I think over by Palm Springs or someplace,
01:23:00.700 | some malls, a lot of people go to buy suits.
01:23:02.940 | So he goes in there,
01:23:05.420 | and he immediately starts the bargaining
01:23:07.420 | that my book teaches, with no empathy.
01:23:11.740 | And he's like, throws a price to the guy,
01:23:14.700 | and the guy's like, no.
01:23:16.660 | And he throws another price to the guy,
01:23:17.940 | and the guy's like, no.
01:23:19.100 | And then he says to the guy behind the counter,
01:23:22.780 | sounds like we can make a deal.
01:23:25.060 | Like, no, it doesn't.
01:23:26.860 | I just shot down everything that you just said.
01:23:29.100 | If anything, it sounds like we're never gonna make a deal.
01:23:32.300 | But he tried to use this label for manipulation.
01:23:35.900 | Now, the guy didn't get mad on the other side,
01:23:38.460 | but it's like, clearly this dude is not listening to me.
01:23:41.220 | - And at the core of everything,
01:23:42.540 | you have a bunch of like, you know,
01:23:45.140 | almost like hacks, like techniques you can use,
01:23:49.620 | but at the core of it is empathy.
01:23:51.380 | - At the core of it is empathy, yeah.
01:23:52.940 | - That's the main thing,
01:23:53.780 | you can be able to just sit there and listen.
01:23:56.740 | And perceive, yeah, and look for insights.
01:24:00.740 | - You know what, I like silence.
01:24:02.460 | Or like, you're both sitting there chilling with a drink,
01:24:05.340 | looking up at the stars.
01:24:07.100 | There's a moment, the silence makes you kind of zoom out,
01:24:12.100 | realize you're in this together.
01:24:14.100 | As opposed to playing a game,
01:24:15.800 | or some kind of like chess game of negotiation,
01:24:18.980 | you're in it together.
01:24:19.820 | I don't know, there's some intimacy to the silence.
01:24:22.900 | And like, I'll ask a question,
01:24:25.480 | and just let the other person sit there in silence
01:24:31.620 | before they answer, or vice versa.
01:24:33.420 | They ask me a question, I sit there in silence.
01:24:35.540 | That's a big, feels like a big intimate thing.
01:24:39.620 | - Yes, and the other two types,
01:24:44.540 | until they've experienced that, are afraid of it.
01:24:47.980 | And what I'm actually gonna do is,
01:24:51.060 | for whatever reason, I'm really comfortable with silence.
01:24:53.820 | I think because I've experienced its effectiveness.
01:24:56.780 | And also my son, Brandon, he's the king of dynamic silence.
01:25:00.620 | Like, he coaches people, he says,
01:25:03.180 | go silent, count thousands to yourself.
01:25:05.940 | Don't stop 'til you run out of numbers.
01:25:07.940 | (laughing)
01:25:08.780 | - That's a good line.
01:25:10.260 | He's also good, full of good lines.
01:25:12.740 | - He is, that he is.
01:25:14.460 | And so, there's so much to it.
01:25:20.020 | But the other two types have natural wiring against it,
01:25:22.700 | until they've experienced it.
01:25:24.100 | And your gut intuition's giving you data
01:25:27.740 | once you've experienced it.
01:25:29.300 | But your amygdala's kicking into gear.
01:25:30.900 | Again, sorry, I realize it's more complicated than that.
01:25:33.740 | Until you've experienced it.
01:25:35.780 | So, accommodators, hope-based.
01:25:38.120 | How do they signal fury?
01:25:41.080 | The silent treatment.
01:25:42.140 | So when you go silent, they're scared to death you're furious.
01:25:49.340 | 'Cause that's how they indicate it.
01:25:51.900 | The assertive thinks, when the analyst goes silent,
01:25:56.500 | that you want 'em to talk some more.
01:25:58.460 | When in point of fact, you're either,
01:26:03.940 | you're thinking or, and I love your description,
01:26:07.660 | the feeling of intimacy in silence,
01:26:09.940 | and experiencing the moment.
01:26:12.220 | 'Cause I'm actually gonna factor that into trying to get,
01:26:16.300 | the accommodators love shared intimacy.
01:26:19.620 | They would love to experience a moment.
01:26:21.900 | And I can see that being very compelling,
01:26:24.420 | them be willing to cross that chasm
01:26:27.140 | and experience silence and see how it works for 'em.
01:26:29.780 | - Yeah, it's nerve-wracking, which is why it's intimate.
01:26:34.100 | 'Cause you start thinking,
01:26:34.920 | what's the other person thinking?
01:26:35.760 | Are we actually gonna do this?
01:26:36.700 | Are we gonna sit here for 10 seconds and count?
01:26:39.060 | I mean, there's tricks to it, I guess,
01:26:40.620 | like Brandon says, is to just count it out
01:26:43.420 | and realize through data that there's intimacy to it.
01:26:47.100 | I had a friend of mine, he lost his voice,
01:26:52.100 | 'cause he's singing, so he couldn't,
01:26:55.700 | the doctor says he can't talk for a week,
01:26:58.460 | just to heal the voice, the vocal cords.
01:27:02.100 | But he hung out with other people, with friends,
01:27:06.940 | and didn't talk to them.
01:27:08.140 | He just hung out, and he said it was really intimate.
01:27:11.540 | They both didn't talk to each other.
01:27:14.660 | They just sat there and enjoyed time together.
01:27:18.020 | I don't know, it's a wake-up call.
01:27:20.100 | It's a thing to try, maybe, with people in your life.
01:27:22.900 | Just hang out and don't say anything.
01:27:24.620 | Like, as an experiment, don't say anything the entire day.
01:27:28.660 | But spend-- - Or try, yeah, definitely.
01:27:31.340 | - It's interesting.
01:27:32.180 | I haven't tried it myself. (laughs)
01:27:35.100 | It seems, it's kinda like a silent retreat,
01:27:38.780 | but more active as part of regular, everyday life.
01:27:43.780 | Anyway. (laughs)
01:27:46.220 | Is there other interesting techniques
01:27:48.300 | we can talk about here?
01:27:49.260 | So, for example, creating the illusion of control.
01:27:53.780 | - Yeah, it's principally by asking what and how questions.
01:27:57.580 | 'Cause people love to tell others what to do
01:28:00.740 | or how to do it.
01:28:01.580 | It does a lot.
01:28:04.300 | That was really the way, when the book was first written,
01:28:08.680 | that we really thought about what and how questions.
01:28:11.320 | Is it giving the other side the illusion of control?
01:28:13.880 | And there's a lot more to it than that,
01:28:17.120 | that we've discovered.
01:28:18.200 | I mean, it triggers deep thinking, it wears people down.
01:28:21.400 | Deep thinking can be exhausting.
01:28:23.240 | - And you want, so what's the role
01:28:25.680 | of exhaustion in negotiation?
01:28:27.240 | Is that ultimately what-- - You gotta be careful
01:28:28.600 | with that.
01:28:29.800 | Some people exhaust intentionally.
01:28:34.560 | One of my negotiation heroes, a guy now who's,
01:28:38.800 | unfortunately, suffering from dementia and Alzheimer's,
01:28:43.140 | Giandomenico Picco is the UN hostage negotiator
01:28:46.000 | that got all the Western hostages out of Beirut in the '80s.
01:28:49.200 | And he wrote a book called "Man Without a Gun."
01:28:52.880 | And I'm acquainted with Johnny.
01:28:55.240 | At this point in time, I don't think he has any memory
01:28:57.640 | of who I am at all.
01:28:58.960 | But he writes in his book,
01:29:04.360 | one of the great secrets of negotiation
01:29:06.080 | is exhausting the other side.
01:29:07.940 | Political negotiations, that was Johnny.
01:29:10.800 | He was very deferential.
01:29:12.760 | He was in the middle of, in the '80s,
01:29:15.480 | leading up to about 1986-ish,
01:29:17.960 | every negotiation involving warring parties
01:29:24.200 | in the Middle East that you can imagine.
01:29:27.400 | He was in Cyprus, he was in Afghanistan,
01:29:33.680 | Iraq and Iran.
01:29:35.080 | The Iranian government had tremendous trust in him
01:29:39.720 | as a Westerner, a representative of the UN.
01:29:42.920 | Got all the Westerners out of Beirut.
01:29:45.020 | And he was just ridiculously patient.
01:29:48.860 | And which the other side would often find exhausting.
01:29:53.540 | - So exhaustion can be a component
01:29:57.960 | of finding resolution in a negotiation.
01:30:02.320 | - If it tamps down the negative emotions,
01:30:04.760 | often exhaustion will tamp down negative emotions.
01:30:07.440 | The real trick is really getting negative emotions
01:30:12.080 | out of the way, 'cause you're dumber
01:30:14.120 | in a negative frame of mind.
01:30:15.520 | - So the goal is always positive emotion,
01:30:18.160 | as you talk about.
01:30:19.000 | That's what you're always chasing together.
01:30:20.840 | - I think so, yeah.
01:30:22.560 | - And that's what the that's right is about.
01:30:24.720 | - Yes.
01:30:25.560 | - Whatever you're triggering,
01:30:28.280 | whatever the chemistry you're triggering in your brain,
01:30:30.560 | like, yeah, yeah, we're doing good here.
01:30:32.700 | - I think so, long-term, for long-term success, absolutely.
01:30:36.680 | - How's the word fair used and abused?
01:30:40.720 | - The F-bomb.
01:30:41.600 | - The F-bomb, as you call it.
01:30:43.840 | How's it used and abused in negotiation?
01:30:46.080 | - It's usually used, it's most frequently used as a weapon.
01:30:51.080 | It's abused as a point of manipulation.
01:30:54.240 | It's what people say when they feel backed into a corner
01:30:57.880 | and they can't come up with any legitimate reason
01:31:01.840 | as to why they're being backed into a corner.
01:31:04.640 | Like, nobody uses the word F, the F-bomb,
01:31:06.960 | nobody uses the word fair
01:31:08.200 | when they've got criteria to back 'em up.
01:31:10.240 | So consequently, when somebody starts dropping it,
01:31:14.120 | you gotta realize the other side's got
01:31:15.500 | no legitimate outside criteria.
01:31:18.160 | They're feeling very vulnerable.
01:31:20.280 | They can't explain it, but they feel defensive.
01:31:23.440 | And saying, "Hey, look, I've given you a fair offer,"
01:31:27.960 | is a way for me to knock you off your game
01:31:30.200 | if you're not aware of it.
01:31:32.280 | So a lot of cutthroat negotiators
01:31:37.080 | are gonna use it on you to knock you off your game.
01:31:40.080 | The NFL strike, it's probably been a good 10 years ago now,
01:31:47.280 | and maybe even longer than that.
01:31:53.800 | One of the sticking points was
01:31:56.160 | the owners were not opening their books to the players.
01:31:58.120 | Players wanted to see the numbers.
01:31:59.880 | And in order to not open their books,
01:32:05.120 | they just sent a rep to the press conference
01:32:07.600 | saying, "We've given players a fair offer."
01:32:09.760 | Well, if it was fair, you'd open your books.
01:32:13.920 | (laughs)
01:32:15.280 | - Yeah.
01:32:16.120 | - If you gave 'em a fair offer
01:32:18.840 | and it was justified by what was in your books,
01:32:21.320 | you'd open 'em to prove your point.
01:32:23.400 | So what ends up happening, though,
01:32:26.680 | is that "well, the owners gave the players a fair offer"
01:32:29.280 | starts to get picked up in the media.
01:32:31.920 | And then it starts getting repeated.
01:32:33.280 | And now different people on the players' side
01:32:35.880 | are going like, "Yeah, maybe they have given us
01:32:38.000 | "a fair offer." It caused people to be insecure
01:32:41.160 | about their own positions.
01:32:42.680 | It's an enormously powerful word
01:32:44.960 | that can be used and abused.
01:32:46.200 | And it almost always comes up in every negotiation.
01:32:48.640 | It's shocking the number of times it comes up.
01:32:51.080 | And with people who don't really understand
01:32:53.680 | how or why it's coming up.
01:32:55.560 | - So usually it's a signal of a not a good place
01:32:59.400 | in the negotiation.
01:33:00.600 | - Without question, I'm completely convinced
01:33:03.920 | that if the person is using the word
01:33:06.320 | as a means of getting what they want,
01:33:09.080 | then either accidentally or on purpose,
01:33:13.800 | either in their gut, or they know they got a bad position,
01:33:16.440 | or their gut is afraid that they are.
01:33:19.800 | Do I use the word?
01:33:21.680 | What I'll say is, I want you to feel
01:33:23.680 | like I've treated you fairly.
01:33:25.760 | And if at any given point in time,
01:33:27.960 | you think I'm not treating you fairly,
01:33:29.440 | I want you to stop me.
01:33:30.800 | And we're gonna address it.
01:33:33.080 | - Big ridiculous question, but how do you close the deal?
01:33:39.560 | How do you take the negotiation to its end?
01:33:45.080 | Is the implementation ultimately--
01:33:47.600 | - You gotta pivot to agreed upon implementation
01:33:50.480 | to really move out on the negotiation.
01:33:52.800 | And I may say, how do you wanna proceed?
01:33:55.880 | And if you don't know, I might ask
01:33:59.880 | a no-oriented question: is it a ridiculous idea
01:34:02.440 | if I share with you some ideas of how to proceed?
01:34:04.880 | - And then you agree on the actual steps,
01:34:08.320 | and that's the implementation.
01:34:09.800 | It's not just the philosophical agreement,
01:34:12.200 | it's actual steps.
01:34:13.800 | - The big problem in all negotiations
01:34:16.100 | is a lack of discussion of next steps.
01:34:18.760 | - That's deep.
01:34:23.080 | Who's the best negotiator you've ever met?
01:34:25.680 | - Yeah, actually, probably my son, Brandon.
01:34:28.120 | - Yeah? - Yeah, he's ridiculously
01:34:29.680 | talented, I mean, he's ridiculously talented.
01:34:32.880 | And he's, you know, and what was it,
01:34:37.560 | Coyle's book, "The Talent Code," says that,
01:34:40.920 | you know, people just noticed it
01:34:42.240 | and started getting good at it.
01:34:43.600 | There's no such thing as a child prodigy.
01:34:45.680 | Just got interested when they were a kid.
01:34:47.360 | I mean, Brandon started learning how to negotiate
01:34:50.240 | when he was two years old.
01:34:51.540 | And he's been in it and immersed in it,
01:34:55.040 | you know, since he could make complete sentences,
01:34:56.680 | even before he could make complete sentences.
01:34:58.760 | He's ridiculously talented.
01:35:00.600 | - What's his future, what's he want to do?
01:35:03.680 | - He's gonna, he has been involved,
01:35:06.440 | he ran and built my company,
01:35:08.780 | and now he's gonna be an affiliated licensee,
01:35:12.160 | run his own operation.
01:35:13.680 | He's pretty much gonna end up doing very much,
01:35:16.080 | he's gonna open his entrepreneurial opportunities
01:35:20.800 | to do whatever he wants and not have his dad say no.
01:35:23.900 | (laughing)
01:35:25.400 | - And do a better job than his dad.
01:35:26.880 | - Most likely. - Yeah.
01:35:28.800 | Okay.
01:35:31.280 | Do you see some of the techniques
01:35:33.960 | that you talk about as manipulative?
01:35:36.320 | - Manipulation is whether or not
01:35:39.120 | I'm trying to exploit you or hurt you.
01:35:43.560 | Am I trying to manipulate a bank robber
01:35:45.680 | into letting me save his life?
01:35:47.240 | Yeah.
01:35:48.940 | So manipulation is like, what am I trying to do to you?
01:35:54.040 | - Yeah.
01:35:54.880 | So you don't see the negative connotation.
01:35:58.080 | If you're trying to bring a better future,
01:36:01.720 | it's not manipulation?
01:36:02.860 | - If I'm trying to bring a better future,
01:36:05.680 | if I'm being genuine and honest,
01:36:07.960 | like, I compliment you.
01:36:10.880 | If my compliment is genuine, that's not manipulation.
01:36:15.800 | But if I think,
01:36:18.820 | you got a pair of shoes
01:36:24.280 | that are the dumbest looking things I've ever seen.
01:36:27.760 | And I go, wow, those are great shoes.
01:36:29.360 | No, that's manipulation.
01:36:30.560 | - So there's guys like Warren Buffett
01:36:33.960 | who are big on integrity and honesty.
01:36:36.600 | What's the role of lying in effective--
01:36:41.600 | - Lying is a bad idea.
01:36:43.320 | Lying is just a bad idea for a variety of reasons.
01:36:46.580 | First of all,
01:36:48.620 | there's a really good chance the other side
01:36:52.720 | is a better liar than you are,
01:36:53.800 | they're gonna spot it right off the bat.
01:36:55.440 | - Yeah.
01:36:56.760 | - Secondly, they could be luring you into a trap
01:36:59.840 | to see if you will lie.
01:37:01.000 | Thirdly, the chances are they're gonna find out
01:37:04.440 | that you lied to 'em.
01:37:05.280 | Eventually, it's really high.
01:37:08.760 | And then the penalties and the taxes
01:37:11.760 | are gonna be way higher than what you had in the first place.
01:37:15.320 | - So long-term, you wanna have a reputation
01:37:17.320 | of somebody with integrity.
01:37:18.520 | And the more you lie,
01:37:19.560 | the harder it is to maintain that reputation.
01:37:21.560 | - Yeah, exactly.
01:37:22.400 | And we're just gonna get out.
01:37:24.500 | - Yeah.
01:37:25.700 | So what's the, we can just return to that question.
01:37:29.120 | What's the difference between a good conversation
01:37:32.520 | and a good negotiation?
01:37:34.480 | Can we, because I think just reading your work,
01:37:39.400 | listening to you,
01:37:41.120 | there's a sense I have that the thing we're doing now
01:37:45.560 | and just conversation on podcasts and so on
01:37:47.760 | is different than negotiation.
01:37:49.640 | It feels like the purpose is different.
01:37:51.600 | And yet, having some of the same awareness
01:37:55.280 | of the value of empathy is extremely important.
01:37:59.860 | But it feels like the goals are different.
01:38:01.960 | Or no?
01:38:04.980 | - Really close, fine line.
01:38:06.420 | I mean, I ruled in here, not having any expectations,
01:38:11.180 | not looking for anything
01:38:12.180 | other than to have an interesting conversation.
01:38:14.900 | And to hear what was behind the questions
01:38:19.700 | that you were asking me and what interests you.
01:38:23.140 | And then also your description of silence
01:38:26.260 | and the power of silence,
01:38:27.100 | something I'm gonna take away as a learning point
01:38:29.380 | and learn to teach others.
01:38:32.180 | But I didn't come in here,
01:38:33.820 | I suppose a negotiation is when we're both aware
01:38:36.660 | of a problem we're trying to solve.
01:38:39.060 | - Right, there's no problem in the room to solve,
01:38:43.040 | except maybe like the human condition.
01:38:45.380 | - Insight, you know, wisdom.
01:38:47.740 | - Insight.
01:38:48.660 | - Learn.
01:38:49.500 | - How do you train to become better at negotiating?
01:38:56.060 | In business, in life?
01:39:00.940 | - Yeah, just small stakes practice
01:39:02.700 | for high stakes results.
01:39:03.780 | I mean, decide what kind of negotiating resonates with you.
01:39:07.660 | - What's that mean, small stakes practice
01:39:09.600 | for high stakes or small stakes?
01:39:11.220 | So small, little, incremental,
01:39:14.860 | picking up girls at a bar, what are we talking about?
01:39:16.980 | - Well, it can be.
01:39:18.100 | For some people, that's high stakes practice.
01:39:20.380 | Well, you know, labels and mirrors.
01:39:24.260 | What are the basic tools of great negotiation?
01:39:26.620 | Labels, mirrors, paraphrasing, summarizing.
01:39:29.980 | So you start labeling and mirroring people
01:39:33.780 | that you just have regular interactions with
01:39:36.620 | just to gain a feel for whether or not
01:39:39.300 | you can read somebody's affect
01:39:40.620 | or how accurate your read is to get better at it.
01:39:43.180 | And so, you know, label the Lyft driver
01:39:49.380 | or the grocery store clerk
01:39:52.340 | or the person behind the airline counter at the airport.
01:39:58.260 | - So putting a label on their affect.
01:39:59.980 | - Or throwing something at 'em that,
01:40:02.820 | 'cause negotiation's a perishable skill.
01:40:05.220 | Emotional intelligence is perishable.
01:40:07.060 | So seeing if you can indicate
01:40:09.820 | that you understand their label.
01:40:11.940 | One of my favorite labels to throw out on somebody,
01:40:14.460 | which, you know, maybe re-level,
01:40:17.420 | I might look at somebody who looks distressed
01:40:19.340 | and I'll go, "Tough day?"
01:40:20.540 | So several years ago, I'm at the counter at LAX.
01:40:27.300 | Well, I'm waiting in line to get to the counter.
01:40:30.300 | And a lady behind the counter is clearly making it a point
01:40:33.980 | to not meet my eyes so that I don't approach.
01:40:38.160 | And she looks, and so, you know when you're next in line
01:40:42.980 | and they're making sure that you don't meet eyes.
01:40:45.840 | And I'm thinking to myself,
01:40:48.180 | all right, so they're having a bad day.
01:40:49.580 | So I walk up and as soon as I approach the counter,
01:40:52.260 | I go, "Tough day?"
01:40:53.860 | And she kinda snaps around.
01:40:56.940 | And she goes, "No, no, no, how can I help you?"
01:40:59.420 | And goes out of her way to help me.
01:41:01.140 | Now I'm practicing, but I also know it made her feel better.
01:41:04.900 | It relieved some of the stress.
01:41:05.940 | So now I'm going through TSA.
01:41:07.340 | I wanna look for people who are having a tough day.
01:41:09.860 | - That's a good place to find them.
01:41:10.700 | - It's a good place to find them, practice.
01:41:13.300 | And I'm rolling through the line
01:41:14.860 | and I realize I haven't tossed a label out
01:41:16.900 | on any one of these guys.
01:41:18.060 | And there's this guy watching the bags
01:41:20.420 | come out of the x-ray machine.
01:41:22.140 | And he's just kinda got an indifferent look on his face.
01:41:25.020 | And I go, "Tough day?"
01:41:26.220 | And he kinda goes, I could see from his body language,
01:41:29.380 | like, "No."
01:41:30.220 | And I go, "Just another day, huh?"
01:41:33.500 | And he goes, "Yeah, just another day."
01:41:35.860 | You know, he felt seen, but I missed and I'm practicing.
01:41:39.860 | And I'm trying to stay sharp.
01:41:41.060 | So these are the small-- - Just a few words.
01:41:42.500 | With just a few words, you're trying to like
01:41:44.740 | quickly localize the affect.
01:41:47.060 | - Nice. - And put a label on it.
01:41:48.300 | - Very, very, very analytically said, thank you.
01:41:53.540 | I'm not letting it go.
01:41:55.340 | - I love it.
01:41:56.180 | (laughing)
01:41:58.420 | Does the same apply to just conversation in general?
01:42:02.660 | Just how to get better at conversation?
01:42:04.060 | I think a lot of people struggle.
01:42:05.740 | They have insecurities, they have anxiety about conversation.
01:42:08.780 | As funny as this to say,
01:42:10.300 | I have a lot of anxiety about conversation.
01:42:13.340 | Is that you basically do the same kind of practice,
01:42:16.540 | practice some of the techniques--
01:42:17.620 | - Yeah, genuinely, just trying to make sure
01:42:19.340 | you heard somebody out.
01:42:20.740 | - Yeah.
01:42:21.820 | What's the best conversation you've ever been in?
01:42:23.780 | Except this one, of course.
01:42:25.100 | (laughing)
01:42:27.580 | - Wow.
01:42:28.420 | - I mean, not the best conversation,
01:42:31.140 | but what stands out to you
01:42:33.300 | as conversation that changed you as a person, maybe?
01:42:36.460 | - Well, there's probably been a lot of them along the way.
01:42:38.940 | I mean, but one that I remember on a regular basis,
01:42:43.620 | actually there's two.
01:42:44.940 | But when I was in the Bureau, I'm at Quantico,
01:42:48.700 | I'm there for an in-service.
01:42:49.540 | There's another guy from New York, a buddy of mine named Lionel.
01:42:53.660 | And we're both trying to decide whether or not
01:42:55.620 | we wanna be trying to get into profiling or negotiation.
01:42:59.340 | 'Cause they're both about human dynamics,
01:43:00.900 | and both of us really like human dynamics.
01:43:03.740 | And we're sitting around talking about it,
01:43:05.340 | and we're talking about several things, and he labels me.
01:43:08.220 | And I knew he didn't know what he was doing.
01:43:09.820 | I think he was just, he had picked it up.
01:43:12.420 | And I'd been talking about my family quite a few things.
01:43:17.500 | And he said to me, and I never said this directly,
01:43:21.100 | that we were close.
01:43:22.380 | But he said to me, "It sounds like your family's
01:43:24.740 | "really close."
01:43:26.140 | And I can remember in a moment, like this feeling,
01:43:29.340 | just like I felt great in the moment.
01:43:32.140 | I mean, what he said just drew together everything
01:43:35.180 | that I'd been saying, and nailed the essence of it.
01:43:38.020 | And I have a very clear recollection
01:43:40.460 | of how good that felt in a moment.
01:43:42.520 | So a couple years later, I'm on a suicide hotline.
01:43:46.260 | Now I got this line in the back of my head.
01:43:48.420 | You know, line, technique, reaction, read,
01:43:52.260 | whatever you wanna call it.
01:43:53.980 | Guy calls in on a hotline, and I could tell the dude
01:43:56.900 | is rattled by his tone of voice.
01:43:58.340 | I mean, just amped up.
01:44:00.140 | And he goes, "You know, I'm just trying to put a lid
01:44:03.460 | "on the day.
01:44:04.300 | "I need your help putting a lid on the day.
01:44:05.620 | "I gotta put a lid on the day."
01:44:07.740 | And I go, "You sound anxious."
01:44:13.660 | And he goes, "Yeah."
01:44:16.140 | And he came down a little bit, and he was a guy
01:44:18.620 | that was telling me about, he was battling
01:44:21.500 | a disease of paranoia.
01:44:24.540 | And he's gonna go on a car trip with his family
01:44:26.360 | the next day, and he knew that on the car trip,
01:44:29.100 | he was gonna twist himself into knots.
01:44:31.900 | And so the night before, he's twisting himself into knots.
01:44:34.740 | And he's laying out everything that he's done
01:44:37.880 | to try to beat paranoia, and how much his family's
01:44:42.740 | helping him.
01:44:43.580 | And he's going on a car trip with the family,
01:44:47.140 | 'cause they're gonna take him to see a doctor.
01:44:49.860 | And so I hit him with the same thing that my buddy
01:44:52.580 | Lionel said.
01:44:53.420 | I said, "Sounds like your family's close."
01:44:55.780 | He goes, "Yeah, we are close."
01:44:57.260 | And he leveled out a little bit more.
01:44:59.500 | And then he started ticking off all the things
01:45:02.280 | that he was doing to try to beat paranoia,
01:45:04.100 | and he sounded determined.
01:45:05.520 | And so I said, "You sound determined."
01:45:09.940 | And he goes, "Yeah, I am determined.
01:45:12.780 | "I'll be fine tomorrow.
01:45:16.000 | "Thanks."
01:45:16.900 | And that was all I said.
01:45:20.020 | So those two conversations, which are overlapping
01:45:22.260 | conversations, those two things really stick out
01:45:24.460 | in my mind.
01:45:25.700 | - Do those things, through all the different negotiations
01:45:28.220 | and conversations you've had, do they kinda echo
01:45:33.220 | throughout, like you basically...
01:45:38.220 | - Because when you empathize with other human beings,
01:45:40.700 | you start to realize we're all the same.
01:45:42.820 | And so you can start to pick little phrases here and there
01:45:49.180 | that you've heard from others, little experiences.
01:45:52.580 | They were all about, like we all want to be...
01:45:55.220 | To be close with other human beings.
01:45:59.940 | We all want love.
01:46:02.020 | They were all...
01:46:02.860 | I think we're all deeply lonely inside.
01:46:05.980 | (laughs)
01:46:06.820 | We're all looking for connection.
01:46:08.420 | We're just, if we're honest about it.
01:46:12.580 | And so all humans have that same,
01:46:15.660 | all the same different components of...
01:46:17.820 | of what makes them tick.
01:46:20.900 | So you kind of see yourself basically just saying
01:46:25.720 | the same things to connect with another human being.
01:46:28.660 | - Yeah, there aren't that many different things
01:46:31.100 | that we're looking for understanding on,
01:46:33.000 | or connection on, or satisfaction of.
01:46:35.220 | There just aren't that many of them, regardless.
01:46:38.420 | And so yeah, you're looking for it to manifest itself
01:46:42.980 | in some form or another.
01:46:45.540 | And you're willing to take a guess on whether or not
01:46:47.420 | that's what you're seeing or hearing.
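The "labeling" moves in the hotline story above ("You sound anxious," "You sound determined") follow a simple pattern: take a guess at the emotion you think you're hearing and hand it back as an observation. A minimal sketch, with cue phrases invented purely for illustration, not from Voss or any real system:

```python
# Toy sketch of "labeling": guess the emotion behind a message and
# hand it back as a tentative observation ("You sound ...").
# The cue phrases below are invented assumptions for illustration.
from typing import Optional

EMOTION_CUES = {
    "anxious": ("gotta", "need your help", "right now"),
    "determined": ("i will", "i'm going to", "no matter what"),
}

def label(message: str) -> Optional[str]:
    """Return a tentative label, or None if no guess is worth making."""
    text = message.lower()
    for emotion, cues in EMOTION_CUES.items():
        if any(cue in text for cue in cues):
            return f"You sound {emotion}."
    return None

print(label("I gotta put a lid on the day"))  # You sound anxious.
```

As in the story, the label is offered as a guess, and a wrong guess costs little: the other person simply corrects it.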
01:46:49.220 | - What advice would you give to me
01:46:51.700 | to be better at these conversations?
01:46:56.040 | To me and to other people that do kinda interviews
01:47:00.700 | and podcasts and so on.
01:47:02.860 | - Wow.
01:47:04.740 | - 'Cause I really care about empathy as well.
01:47:07.620 | Is there a kind of, is it a lifelong journey in this process?
01:47:11.140 | - Yeah, well I would advise you to take that approach,
01:47:13.980 | which is the approach that you're taking.
01:47:15.340 | You care about it, you're very curious about it.
01:47:18.420 | You see it as a lifelong journey, you're fascinated by it.
01:47:21.380 | You enjoy learning about it.
01:47:26.280 | And you definitely do see it as a lifelong journey,
01:47:29.940 | as opposed to, this is what I can,
01:47:32.060 | if I can acquire this, then I can manipulate people.
01:47:34.300 | - No, I mean, I fall in love with people I talk to.
01:47:37.020 | There's a kinda deep connection and it lingers with you,
01:47:40.140 | especially when I'm preparing.
01:47:41.680 | The more material there is in a person,
01:47:45.020 | the more you get to fall in love with them ahead of time.
01:47:47.540 | I think you get to really understand, not understand,
01:47:50.660 | but what I mean by fall in love is--
01:47:55.060 | - Well, appreciate, huh?
01:47:56.260 | - Appreciate, but also become deeply curious.
01:47:58.960 | That's what I mean by fall in love.
01:48:00.700 | You appreciate the things you know,
01:48:02.120 | but you start to see, like Alice in Wonderland,
01:48:04.380 | you start to see that there's all this cool stuff
01:48:07.200 | you can learn if you keep interacting with them.
01:48:09.700 | And then when you show up and you actually meet,
01:48:12.140 | you realize it's like more and more and more and more.
01:48:15.460 | It's like in physics, the more you learn,
01:48:17.120 | the more you realize you don't know.
01:48:18.360 | And it's really exciting.
01:48:20.180 | And then it can also be heartbreaking
01:48:21.700 | 'cause you have to say goodbye.
01:48:23.460 | The goodbye, I hate goodbyes.
01:48:25.840 | I hate goodbyes.
01:48:26.680 | - Seems terminal, right?
01:48:28.060 | - Yeah, it reminds me that I'm gonna die one day.
01:48:31.860 | (Lex laughing)
01:48:33.180 | Like things end, good things end.
01:48:35.260 | It sucks.
01:48:36.140 | But then it makes the moment more delicious, you know,
01:48:39.520 | that you do get to spend together.
01:48:41.260 | - Yeah.
01:48:42.100 | (Lex laughing)
01:48:43.840 | - Okay.
01:48:44.680 | I just wanna, I completely forgot I wanted to ask you
01:48:48.180 | about this, the 7-38-55 rule.
01:48:52.180 | (Lex laughing)
01:48:53.020 | This is really interesting.
01:48:54.340 | Does this, is there at all truth to it?
01:48:57.140 | That 7% of a message is conveyed from the words used,
01:49:00.420 | 38% from the tone of voice, and 55% from body language.
01:49:04.440 | Is there really truth to that?
01:49:07.020 | - All right, so Albert Mehrabian,
01:49:09.660 | I think, is the name of the UCLA professor
01:49:13.860 | that originally proposed the 7-38-55 ratio
01:49:17.980 | and discussed it in terms of that it wasn't the message,
01:49:21.800 | but how much, he called it liking.
01:49:25.020 | Like are you, not that you're,
01:49:26.620 | the meaning is coming across,
01:49:27.860 | but your liking of the message.
01:49:29.700 | And so it's been extrapolated heavily by people like me
01:49:34.700 | to this meaning of the meaning in 7-38-55,
01:49:41.500 | from liking to the meaning.
01:49:42.860 | What I've seen regularly is people that communicate verbally,
01:49:48.780 | if the speaker's Tony Robbins, a 7-38-55 guy,
01:49:54.260 | he throws the ratio out there,
01:49:58.020 | goes, "That's it exactly."
01:50:00.020 | That's exactly how the message comes across.
01:50:02.420 | This is how we gotta balance it.
01:50:04.500 | This is how we gotta do it.
01:50:05.860 | Those that communicate principally in writing,
01:50:09.820 | the meaning of the words are much more important to them.
01:50:14.180 | So they're deeply uncomfortable with seven being the words.
01:50:19.860 | 'Cause the content, the words, the meaning of the words,
01:50:21.860 | when you're writing, it's so important
01:50:24.800 | that you hate to poo-poo it that way.
01:50:27.860 | So I, first of all, I 1000% believe it's an accurate ratio,
01:50:32.260 | but the real critical issue is,
01:50:34.100 | not what the ratio of those three things are,
01:50:36.900 | it's what's the message when they're out of line?
01:50:39.820 | Like what's the message when the tone of voice
01:50:42.700 | is out of line with the words?
01:50:44.820 | Like it don't matter what your ratio is.
01:50:46.460 | You got a problem if their tone does not match their words.
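The ratio, and Voss's point that misalignment matters more than the weights themselves, can be written down as simple arithmetic. This is a toy sketch only; the numeric channel scores and the alignment tolerance are invented assumptions, not anything from Mehrabian's studies:

```python
# Toy sketch of the 7-38-55 ratio as a weighted "liking" score.
# Scoring each channel on 0.0-1.0 is an invented assumption;
# Mehrabian's work concerned liking of a speaker, not meaning.

WEIGHTS = {"words": 0.07, "tone": 0.38, "body": 0.55}

def liking_score(words: float, tone: float, body: float) -> float:
    """Weighted combination of the three channels."""
    return (WEIGHTS["words"] * words
            + WEIGHTS["tone"] * tone
            + WEIGHTS["body"] * body)

def channels_aligned(words: float, tone: float, body: float,
                     tol: float = 0.3) -> bool:
    """Voss's real point: the message breaks when channels disagree."""
    scores = [words, tone, body]
    return max(scores) - min(scores) <= tol

# Friendly words delivered in a hostile tone and posture.
print(liking_score(0.9, 0.2, 0.3))      # ~0.30: low despite friendly words
print(channels_aligned(0.9, 0.2, 0.3))  # False -> mixed message
```

The second function is the one that matters in Voss's telling: whatever the exact weights, a tone that contradicts the words signals a problem.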
01:50:51.060 | - And that's hard to really put a measure on exactly.
01:50:55.620 | Even in writing, there's a tone.
01:50:57.860 | I mean, it's not just, even in writing,
01:50:59.460 | it's not just the words.
01:51:00.460 | There's the words, but there's like a style
01:51:02.500 | underneath the whole thing.
01:51:04.140 | And there's something like a body language,
01:51:06.060 | the presentation of the whole thing.
01:51:07.140 | I mean, yeah, I'm a big fan of constraint
01:51:11.100 | mediums of communication, which writing is,
01:51:13.180 | or voice, like Clubhouse.
01:51:15.880 | There's a personality to a human being
01:51:17.420 | when you just hear their voice.
01:51:18.980 | It's not just, you could say it's the tone of voice,
01:51:23.360 | but there's, you can like, what is it?
01:51:25.260 | The imagination fills in the rest.
01:51:28.020 | Like when I'm listening to somebody,
01:51:29.580 | I'm like, I'm imagining some amorphous being, right?
01:51:33.660 | Doing things.
01:51:34.480 | When they get angry, I'm imagining anger.
01:51:36.620 | I don't know what exactly I'm visualizing.
01:51:38.740 | - Well, and so you made me think of a funny story
01:51:41.060 | 'cause we were talking about your buddy Elon before.
01:51:43.220 | And I told you about, you know,
01:51:45.140 | that I'd interacted with some of the senior executives.
01:51:48.340 | So I know that they love working with him
01:51:51.860 | and I think he's an interesting guy
01:51:53.460 | and they realize that he can be funny
01:51:54.900 | and he jokes around.
01:51:56.340 | So they're telling me,
01:51:58.820 | they're on this conference call, just words,
01:52:01.860 | and a guy on the other end of the line says something,
01:52:05.000 | you know, that was wrong but wasn't bad.
01:52:10.620 | And so they said, they're on a phone,
01:52:13.740 | and Elon goes, "You're fired."
01:52:15.100 | And then everybody in the room with him
01:52:16.620 | can see that he's joking.
01:52:17.900 | But the person on the other side can't,
01:52:21.220 | and they all go to him, "Wait, wait, wait, wait,
01:52:22.780 | they can't see your look on your face right now.
01:52:24.340 | You gotta stop, you gotta stop
01:52:25.300 | 'cause the guy on the other side is dying right now.
01:52:27.620 | He doesn't realize you're joking."
01:52:29.580 | So there was, you know, there were the words
01:52:31.580 | and the tone of voice,
01:52:32.780 | but it lacked the visual to go with it.
01:52:35.100 | - Nevertheless, it was probably funny.
01:52:39.100 | - I'm sure it was very funny at the time.
01:52:42.820 | - Maybe not to him.
01:52:43.820 | Just as an interesting task,
01:52:46.980 | I don't know if you're following along the developments
01:52:49.140 | of large language models,
01:52:50.780 | there's been something called ChatGPT.
01:52:52.940 | There's just more and more sophisticated
01:52:57.020 | and effective and impressive chatbots, essentially,
01:53:01.140 | that can talk,
01:53:03.420 | and they're becoming more and more human-like.
01:53:05.820 | Do you think it's possible in the future
01:53:09.640 | that AI will be able to be better negotiators than humans?
01:53:14.640 | Do you think about that kind of stuff?
01:53:18.020 | - Well, so definition of better versus less flawed.
01:53:22.340 | Like, you know, chatbots have been out there for a long time
01:53:26.460 | and probably about five years ago now,
01:53:31.360 | a company approached us
01:53:33.900 | 'cause they were doing a negotiation chatbot.
01:53:36.820 | And they said two things.
01:53:38.460 | First of all, I said, "Why are you talking to us?"
01:53:41.420 | Said, "Well, in point of fact,
01:53:43.380 | we already spoke to the people that are teaching,
01:53:45.420 | quote, the Harvard methodology,
01:53:48.620 | and the rational approach to negotiation just doesn't work.
01:53:52.060 | Rational approach just does not work.
01:53:54.180 | Our chatbots are not getting anywhere."
01:53:56.520 | But we're showing in around about 80% of the interactions
01:54:01.520 | a higher success outcome with these chatbots.
01:54:05.860 | And they showed me what they were doing,
01:54:08.220 | and it was still a lot deeply flawed
01:54:11.300 | emotional intelligence-wise,
01:54:13.460 | but the reason why that they were having
01:54:15.260 | higher success rates is the chatbots
01:54:17.420 | were never in a bad mood.
01:54:18.620 | And you could reach out for chatbot
01:54:21.180 | in the middle of the night.
01:54:23.380 | So if you were talking to somebody that was never upset
01:54:26.140 | and was always available,
01:54:27.540 | then you're gonna have a higher success rate.
01:54:30.620 | Negotiations go bad when people
01:54:32.140 | are in a negative frame of mind.
01:54:34.020 | - So the natural ability of a chatbot to be positive
01:54:39.020 | is just going to give you a higher success rate.
01:54:43.220 | - Yeah, and they're not gonna get mad and argue with you.
01:54:46.220 | You know, you say to a chatbot,
01:54:49.220 | you know, "Your price is too high."
01:54:51.220 | Chatbot is designed to come back with a smiley face.
01:54:54.060 | - Yeah.
01:54:54.900 | - You say to a person, "Your price is too high."
01:54:56.460 | They go, "How dare you?
01:54:57.540 | I'm trying to make a living."
01:54:58.820 | You know, they're gonna go off the deep end.
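The property described here, a counterpart that never escalates no matter how negative the input, can be sketched as a trivial reply policy. The trigger phrases and canned responses below are invented for illustration, not taken from any real negotiation chatbot:

```python
# Hedged sketch of the property Voss describes: a reply policy that
# never escalates, regardless of how negative the incoming message is.
# Trigger phrases and canned responses are invented assumptions.

NEGATIVE_CUES = ("too high", "ridiculous", "how dare", "waste of")

def calm_reply(message: str) -> str:
    """Always label and redirect; never argue back."""
    text = message.lower()
    if any(cue in text for cue in NEGATIVE_CUES):
        # A human seller might take offense; the bot stays level.
        return "It sounds like this isn't working for you. What would need to change?"
    return "Happy to help. What can I do for you?"

print(calm_reply("Your price is too high"))
```

Being available at any hour is the other half of the advantage Voss mentions: the policy above gives the same level response at 3 a.m. as at noon.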
01:55:00.660 | - Unfortunately or fortunately,
01:55:02.260 | I think the way chatbots are going now,
01:55:03.940 | they will come back negative
01:55:05.780 | because they're becoming more and more human-like.
01:55:08.900 | That's the whole point.
01:55:09.980 | To be able to pass the Turing test,
01:55:11.540 | you have to be negative.
01:55:12.860 | You have to be an asshole.
01:55:14.180 | You have to have boundaries.
01:55:15.260 | You have to be insecure.
01:55:16.380 | You have to have some uncertainty.
01:55:18.900 | - Well, there's a difference between having boundaries
01:55:21.420 | and being negative.
01:55:22.820 | Like I can, you threw a proposal to me.
01:55:26.140 | You know, before I say no, I'm gonna say,
01:55:31.320 | "Look, I'm sorry, that just doesn't work for me."
01:55:34.460 | I'm gonna set up a real clear boundary
01:55:37.180 | without being negative.
01:55:38.900 | - Sure.
01:55:39.740 | - So, and a lot of people really struggle
01:55:44.100 | with setting boundaries without being negative,
01:55:47.220 | without name-calling, without indignation,
01:55:49.620 | without getting upset.
01:55:50.760 | - But see, there's a, when you are,
01:55:54.540 | when you show that you're not getting upset,
01:55:58.580 | I'm not just seeing that.
01:56:00.940 | I'm seeing a flawed human that has underneath it a temper,
01:56:07.160 | underneath it the ability to get upset,
01:56:10.040 | but chooses not to get upset.
01:56:12.280 | And a chatbot has to demonstrate that.
01:56:14.120 | So, it's not just going to be cold
01:56:16.320 | and be this kind of corporate, blank, empty,
01:56:21.320 | sort of like vapid creature that just says,
01:56:27.340 | "Oh, thank you.
01:56:28.440 | "Thank you for saying that."
01:56:29.920 | No, it's basically, you have to,
01:56:33.720 | the chatbot has to be able to be mean
01:56:38.320 | and choose not to be.
01:56:39.740 | - Interesting.
01:56:41.440 | I don't know.
01:56:42.400 | - Maybe not.
01:56:43.240 | - I'd be willing to see that play out
01:56:48.080 | and see how it plays out.
01:56:49.800 | - But I guess what I'm saying is to be a good negotiator,
01:56:52.080 | you have to have the capacity to be a bad person
01:56:56.040 | and choose not to. - Really?
01:56:57.240 | - I think so.
01:56:58.760 | - See, I think you just gotta have the capacity
01:57:00.760 | to set a boundary and stick to it.
01:57:03.460 | - Interesting.
01:57:04.700 | 'Cause I think it's hard for me to trust a person
01:57:07.500 | who's not aware of their own demons.
01:57:09.460 | 'Cause if you say you don't have any demons,
01:57:13.380 | if you don't have any flaws, I can't trust you.
01:57:16.500 | - Yeah, well, first of all, it's a lie, right?
01:57:18.060 | So, somebody's lying.
01:57:19.100 | - Right.
01:57:19.940 | - This is back to lying.
01:57:20.900 | - Yes.
01:57:21.740 | So, you have to have a self-awareness about that,
01:57:23.980 | but to be able to control it,
01:57:26.420 | demonstrate to be able to control it.
01:57:27.900 | I mean, this is humans,
01:57:28.900 | I just think humans, intelligent, effective humans,
01:57:32.340 | they're able to do this well,
01:57:33.820 | and chatbots are not yet,
01:57:36.780 | and they're moving in that direction.
01:57:38.220 | So, it makes me think about what is actually required
01:57:42.580 | for effective negotiation.
01:57:43.860 | That's what AI systems do,
01:57:45.580 | is they make you ask yourself,
01:57:49.180 | what is it that makes humans special,
01:57:51.420 | in any discipline?
01:57:52.960 | What is it that makes humans special at chess and Go, games,
01:57:56.280 | which AI systems are able to beat humans at now?
01:58:00.000 | What is it that makes them effective at negotiation?
01:58:03.340 | What is it that makes them effective at
01:58:05.140 | something that's extremely difficult,
01:58:08.940 | which is navigating physical spaces?
01:58:12.100 | So, doing things that we take for granted,
01:58:13.700 | like making yourself a cup of coffee,
01:58:15.620 | is exceptionally difficult problem for robots.
01:58:19.180 | - Yeah.
01:58:20.000 | - Because of all the complexities
01:58:22.420 | involved in navigating physical reality.
01:58:25.900 | We have so much common sense reasoning built in.
01:58:30.200 | Just about how gravity works,
01:58:32.240 | about how objects move,
01:58:37.000 | what kind of objects there are in the world.
01:58:41.680 | It's really difficult to describe,
01:58:44.440 | because it all seems so damn trivial,
01:58:46.420 | but it's not trivial.
01:58:47.640 | - Right.
01:58:48.480 | - Because a lot of that we just learn as babies.
01:58:50.920 | We keep running into things,
01:58:52.400 | and we'll learn about that.
01:58:54.160 | And so, AI systems help us understand
01:58:56.200 | what is it that makes humans really,
01:58:58.560 | what is the wisdom we have in our heads?
01:59:00.840 | And negotiation, to me, is super interesting.
01:59:03.560 | 'Cause negotiation is not just about business,
01:59:06.520 | it's about geopolitics, it's about running government.
01:59:11.320 | It's basically negotiating how do we,
01:59:13.960 | the different policies, different bills,
01:59:17.040 | and programs, and so on.
01:59:18.920 | How do we allocate money?
01:59:20.360 | How do we reallocate resources?
01:59:23.320 | All that kind of stuff.
01:59:24.440 | That seems like AI, in the future, could be better at that.
01:59:28.320 | But maybe not.
01:59:29.280 | Maybe you have to be a messy, weird,
01:59:33.200 | insecure, uncertain human,
01:59:35.640 | and debate each other, and yell at each other on Twitter.
01:59:39.320 | Maybe you have to have the red and the blue teams
01:59:42.520 | that yell at each other,
01:59:43.940 | in the process of figuring out what is true.
01:59:48.400 | Maybe AI systems will not be able to do that,
01:59:51.040 | and figure out the full mess of human civilization.
01:59:55.120 | - Yeah, interesting.
01:59:55.960 | Well, I mean, the two thoughts that I had along the way
01:59:58.240 | was, I mean, anytime you're talking about systems,
02:00:01.880 | or scaling, you know, you're talking,
02:00:06.320 | my belief is, chatbots, systems,
02:00:10.920 | things that don't require decision-making,
02:00:13.320 | just following instructions,
02:00:14.920 | at least 80% of what's going on.
02:00:18.320 | Now, the remaining percentage, whatever it is,
02:00:22.720 | does it require the human interaction, and what's required?
02:00:27.040 | Like, I'm not, I'm not, I'm not like,
02:00:29.360 | I am not pro-conflict, and I also know
02:00:32.440 | that there's a case to be made in the creative world,
02:00:35.720 | that some of the best thinking came out of conflict.
02:00:38.320 | Reading interviews of Bono, of U2.
02:00:42.220 | You know, their admiration that some of the Beatles'
02:00:47.200 | best music came when they were fighting with each other.
02:00:50.240 | And the song "One,"
02:00:53.200 | which is, I believe, from the album "Achtung Baby,"
02:00:56.120 | those guys were fighting.
02:00:57.840 | I mean, they were on the verge of breaking up.
02:01:00.200 | And their appreciation that conflict
02:01:01.960 | could create something beautiful.
02:01:03.600 | And then when I was in the crisis negotiation unit,
02:01:06.840 | you know, my last seven years in the FBI,
02:01:09.280 | there was a guy named Vince, brilliant dude,
02:01:12.640 | brilliant, brilliant negotiator.
02:01:14.440 | And he and I used to argue all the time.
02:01:17.320 | And then when we had a change in the guy who was in charge,
02:01:22.560 | the guy who was in charge took me off to the side,
02:01:25.200 | and he's like, "You know, I can't take you
02:01:26.560 | "and Vince fighting all the time."
02:01:28.320 | And I said, "Well, I got news for you.
02:01:30.860 | "I think we come up with much better stuff
02:01:32.680 | "as a result of our battles."
02:01:34.800 | And he said, "You know, Vince said the same thing to me."
02:01:37.600 | And I'm like, "So if we don't have a problem fighting,
02:01:39.660 | "why do you have a problem with it?"
02:01:42.360 | But you know, there is something there
02:01:44.760 | that sometimes the most difficult insights,
02:01:49.160 | you rack your brains as to why someone is so dug in
02:01:53.960 | on something that you think is so wrong.
02:01:57.640 | Yeah, maybe there's something to it.
02:01:58.840 | I think there's something to it.
02:02:00.120 | - There's something about conflict, even drama,
02:02:02.840 | that might be a feature, not a bug of our society.
02:02:05.960 | - Interesting.
02:02:06.800 | - Do you think there will always be war in the world?
02:02:10.560 | - Yeah.
02:02:11.400 | - So there will always be a need for negotiators
02:02:18.000 | and negotiating.
02:02:18.920 | - Well, as it turns out.
02:02:23.080 | - Why do you think there will always be war?
02:02:25.160 | What's your intuition about human nature there?
02:02:27.880 | - Yeah, just because we're basically 75% negative.
02:02:30.800 | And then, for lack of a better term,
02:02:34.480 | I call it two lines of code.
02:02:35.960 | Everybody, when we were little,
02:02:39.080 | somebody planted in two lines into our head.
02:02:41.720 | We don't know when it got in there.
02:02:43.480 | But somebody said something to us that stuck.
02:02:47.320 | And there are a lot of people
02:02:50.040 | that had some really negative garbage
02:02:52.560 | dumped in their brain when they were little.
02:02:56.360 | And just based on the numbers,
02:03:02.840 | what kind of opportunity they were given afterwards,
02:03:06.080 | did they ever have an epiphany moment
02:03:08.800 | when they genuinely believed
02:03:10.320 | they can get themselves out of it?
02:03:12.520 | Like, what is it, one of Joe Dispenza's books
02:03:15.280 | is "Breaking the Habit of Being Yourself."
02:03:17.680 | - Yeah.
02:03:18.600 | - You know, like, how do you get at that two lines of code
02:03:21.440 | that either mean or well-intentioned,
02:03:25.560 | but stupidly speaking, adult said to you
02:03:27.880 | at the wrong moment and planted in your brain?
02:03:30.320 | Like, the chances of everybody on Earth
02:03:32.760 | getting that out, even a majority of people on Earth
02:03:35.440 | getting that out of their heads is really small.
02:03:37.840 | - What advice would you give to a young person today
02:03:42.880 | about how to have a career they could be proud of or a life?
02:03:46.560 | Maybe somebody in high school, college,
02:03:49.720 | trying to figure out their way in this world.
02:03:51.960 | - It's probably a take on a cliche of do what you love.
02:03:57.400 | But if you figure out your ideals
02:04:02.400 | and pursue your ideals and stick to 'em when it costs you.
02:04:08.480 | Like, a guy I admire very much, Michael Mogill,
02:04:13.880 | runs this operation, Crisp Video, in Atlanta.
02:04:19.000 | In one of his talks, he would say,
02:04:20.120 | "Core values are what you stick to that costs you money."
02:04:23.920 | It's not a value that really matters to you
02:04:27.800 | unless it's costing you.
02:04:30.160 | And stick to your values.
02:04:32.860 | Now, when I was in the FBI, I worked really hard at,
02:04:37.040 | you know, the number one core mission of the FBI
02:04:39.440 | is to protect and defend the American people.
02:04:41.680 | So I could pursue that value at all times, which I did,
02:04:49.360 | or I could follow the rules.
02:04:51.360 | You don't have time to do both.
02:04:54.360 | (Lex laughing)
02:04:57.400 | - When did you know you fall in love?
02:05:01.120 | When did you fall in love with whatever this process is
02:05:08.480 | that is negotiating?
02:05:09.620 | - I think it was in a conversation on the suicide hotline
02:05:15.240 | that I was telling you about earlier
02:05:16.480 | with the guy who was paranoid.
02:05:18.120 | When I thought, I can have that significant of an impact
02:05:22.360 | on another human being in this short of a period of time.
02:05:25.800 | That's really cool.
02:05:28.560 | - How hard is it to talk somebody off the ledge?
02:05:32.020 | So this question, this is a big question.
02:05:35.160 | Why the hell live at all?
02:05:37.080 | How do you have that kind of deeply philosophical,
02:05:43.120 | deeply psychological, and also practical conversation
02:05:46.360 | with somebody and convince them they should stick around?
02:05:49.400 | - Well, it's more clearing the clutter in their head
02:05:54.400 | and let them make up their own mind.
02:05:58.160 | That was what volunteering on a suicide hotline
02:06:00.520 | was really about.
02:06:01.360 | Just let me see how quickly I can clear out the clutter
02:06:05.000 | in your head, if you're willing to have it cleared out.
02:06:07.820 | Like, did you call here 'cause you were actually
02:06:11.840 | looking for help, or did you call here
02:06:15.240 | to fulfill some other agenda?
02:06:18.360 | So, are you willing to clear the clutter in your head?
02:06:24.680 | Not everybody is.
02:06:26.200 | - So once you clear out the clutter,
02:06:28.300 | is it at least a somewhat hopeful chance
02:06:31.480 | that you'll continue for another day?
02:06:34.320 | - Yeah.
02:06:35.520 | And if you step back, very few people that commit suicide
02:06:43.960 | physically are up against it that hard.
02:06:47.420 | Like, most of them, by and large,
02:06:50.800 | are pretty intact physically, as human beings.
02:06:53.920 | They're struggling with emotional stuff.
02:06:55.920 | But it's an emotional issue.
02:06:59.960 | It's not a physical issue.
02:07:01.760 | So if you were to be a complete mercenary,
02:07:03.880 | like, a guy I'm a very big fan of,
02:07:06.220 | a guy named Mark Pollock, a born great athlete,
02:07:10.360 | lost his eyesight and then became paralyzed.
02:07:13.560 | Like, he's an emotional leader.
02:07:18.360 | He's about helping people thrive and live great lives.
02:07:22.960 | Like, Mark was born, he was a spectacular athlete.
02:07:28.640 | And first he lost his sight in one eye,
02:07:31.780 | then he lost his sight in the other eye,
02:07:34.220 | and then he fell out a window in a tragic experience.
02:07:38.000 | Like, if there was ever a dude that was saying,
02:07:39.880 | like, living sucks.
02:07:43.400 | You know, and if there's any doubt in my mind,
02:07:45.160 | something worse happens to me every few years.
02:07:48.040 | But Mark's about being alive and inspiring other people.
02:07:51.720 | So the hard part with navigating
02:07:54.520 | with somebody who's tossing it in
02:07:56.000 | because there's a chemical imbalance,
02:07:58.280 | or it's the way they're interpreting the world.
02:08:01.240 | There's clutter in their head.
02:08:02.740 | Like, can you help clear that clutter in their head?
02:08:06.760 | - And help them, by themselves,
02:08:10.480 | inspire them to reinterpret that world.
02:08:12.880 | - Yeah. - As one worth living in.
02:08:14.840 | - Yeah.
02:08:15.680 | - What do you think is the meaning of life?
02:08:19.200 | - Well-- - As far as, why live?
02:08:21.720 | (laughs)
02:08:22.560 | What's a good reason?
02:08:25.520 | - Well, I have very strong religious beliefs.
02:08:30.040 | Spiritual, you know, I don't, 1,000%.
02:08:36.000 | If you were to try to confine me in a box,
02:08:38.300 | I'd be a Christian.
02:08:41.800 | I have tremendous respect for the Jewish.
02:08:43.840 | I don't think any religion's got it nailed, exactly.
02:08:46.400 | Again, I keep mentioning, I'm kind of a Bono Christian.
02:08:50.840 | I think Bono's, he's like, what?
02:08:53.120 | And I'm gonna butcher it, but my belief in Jesus
02:08:58.840 | is what I've got after Christianity leaves the room.
02:09:01.440 | You know, the dogma of man's application
02:09:03.480 | of spiritual beliefs.
02:09:04.800 | So, but that being said, I truly believe
02:09:08.460 | that my life was a gift and there's a purpose here.
02:09:11.240 | And, you know, for my creator decided
02:09:14.360 | that I woke up in the morning 'cause he still had
02:09:16.280 | some cool, interesting things for me to do.
02:09:18.440 | - And you have gratitude for having the opportunity
02:09:22.760 | to live that day.
02:09:23.680 | - Yeah.
02:09:24.520 | - Well, you do one heck of a good job at living those days.
02:09:29.800 | I really appreciate your work.
02:09:31.040 | I appreciate the person you are.
02:09:33.080 | Thank you for just everything you've done today
02:09:36.880 | for just being empathic, honestly.
02:09:39.080 | You're a great listener, you're a great conversationalist.
02:09:42.160 | It's just an honor to meet you and to talk to you.
02:09:44.320 | This was really awesome, Chris.
02:09:45.160 | - My pleasure.
02:09:46.120 | - Thanks for listening to this conversation with Chris Voss.
02:09:49.760 | To support this podcast, please check out our sponsors
02:09:52.280 | in the description.
02:09:53.720 | And now, let me leave you with some words
02:09:55.500 | from John F. Kennedy.
02:09:56.960 | "Let us never negotiate out of fear,
02:10:00.240 | "but let us never fear to negotiate."
02:10:05.200 | Thank you for listening and hope to see you next time.
02:10:08.040 | (upbeat music)
02:10:10.620 | (upbeat music)