
Stephen Schwarzman: Going Big in Business, Investing, and AI | Lex Fridman Podcast #96


Chapters

0:00 Introduction
4:17 Going big in business
7:34 How to recognize an opportunity
16:00 Solving problems that people have
25:26 Philanthropy
32:51 Hope for the new College of Computing at MIT
37:32 Unintended consequences of technological innovation
42:24 Education systems in China and United States
50:22 American AI Initiative
59:53 Starting a business is a rough ride
64:26 Love and family


00:00:00.000 | The following is a conversation with Stephen Schwarzman,
00:00:03.120 | CEO and co-founder of Blackstone,
00:00:05.760 | one of the world's leading investment firms
00:00:08.240 | with over $530 billion of assets under management.
00:00:12.960 | He's one of the most successful business leaders in history.
00:00:17.880 | I recommend his recent book called "What It Takes"
00:00:20.920 | that tells stories and lessons from his personal journey.
00:00:24.440 | Stephen is a philanthropist
00:00:26.320 | and one of the wealthiest people in the world.
00:00:28.680 | He recently signed the Giving Pledge,
00:00:31.200 | thereby committing to give the majority of his wealth
00:00:33.840 | to philanthropic causes.
00:00:36.040 | As an example, in 2018, he donated $350 million to MIT
00:00:41.040 | to help establish its new College of Computing,
00:00:45.120 | the mission of which promotes interdisciplinary,
00:00:47.640 | big, bold research in artificial intelligence.
00:00:51.880 | For those of you who know me,
00:00:53.240 | know that MIT is near and dear to my heart
00:00:55.840 | and always will be.
00:00:57.400 | It was and is a place where I believe
00:01:00.400 | big, bold, revolutionary ideas have a home.
00:01:03.520 | And that is what is needed in artificial intelligence
00:01:06.120 | research in the coming decades.
00:01:08.320 | Yes, there's institutional challenges,
00:01:11.120 | but also there's power
00:01:13.200 | in the passion of individual researchers.
00:01:15.600 | From undergrad to PhD,
00:01:17.600 | from young scientists to senior faculty,
00:01:20.480 | I believe the dream to build intelligent systems
00:01:23.560 | burns brighter than ever in the halls of MIT.
00:01:27.360 | This conversation was recorded recently,
00:01:29.400 | but before the outbreak of the pandemic.
00:01:31.800 | For everyone feeling the burden of this crisis,
00:01:33.920 | I'm sending love your way.
00:01:35.840 | Stay strong, we're in this together.
00:01:38.060 | This is the Artificial Intelligence Podcast.
00:01:41.720 | If you enjoy it, subscribe on YouTube,
00:01:43.900 | review it with five stars on Apple Podcasts,
00:01:46.160 | support on Patreon, or simply connect with me on Twitter
00:01:49.240 | at Lex Fridman, spelled F-R-I-D-M-A-N.
00:01:53.360 | As usual, I'll do a few minutes of ads now
00:01:55.840 | and never any ads in the middle
00:01:57.400 | that can break the flow of the conversation.
00:01:59.620 | I hope that works for you
00:02:00.920 | and doesn't hurt the listening experience.
00:02:02.800 | Quick summary of the ads.
00:02:04.320 | Two sponsors, Masterclass and ExpressVPN.
00:02:07.920 | Please consider supporting the podcast
00:02:09.560 | by signing up to Masterclass at masterclass.com/lex
00:02:13.800 | and getting ExpressVPN at expressvpn.com/lexpod.
00:02:18.800 | This show is sponsored by Masterclass.
00:02:22.900 | Sign up at masterclass.com/lex
00:02:25.680 | to get a discount and support this podcast.
00:02:29.040 | When I first heard about Masterclass,
00:02:30.760 | I thought it was too good to be true.
00:02:32.720 | For $180 a year, you get an all-access pass
00:02:36.160 | to watch courses from, to list some of my favorites,
00:02:39.320 | Chris Hadfield on space exploration,
00:02:41.480 | Neil deGrasse Tyson on scientific thinking and communication,
00:02:44.520 | Will Wright, creator of SimCity and The Sims, on game design,
00:02:48.960 | Carlos Santana on guitar,
00:02:50.960 | Garry Kasparov on chess,
00:02:52.740 | Daniel Negreanu on poker, and many, many more.
00:02:56.200 | Chris Hadfield explaining how rockets work
00:02:58.860 | and the experience of being launched into space alone
00:03:01.580 | is worth the money.
00:03:03.300 | By the way, you can watch it on basically any device.
00:03:06.620 | Once again, sign up at masterclass.com/lex
00:03:10.180 | to get a discount and to support this podcast.
00:03:12.720 | This show is sponsored by ExpressVPN.
00:03:16.500 | Get it at expressvpn.com/lexpod
00:03:20.340 | to get a discount and to support this podcast.
00:03:23.220 | I've been using ExpressVPN for many years.
00:03:25.740 | I love it.
00:03:26.700 | It's easy to use.
00:03:27.860 | Press the big power on button and your privacy is protected.
00:03:31.260 | And if you like, you can make it look like your location
00:03:34.580 | is anywhere else in the world.
00:03:36.380 | I might be in Boston now, but it can make it look like
00:03:39.380 | I'm in New York, London, Paris,
00:03:42.260 | or anywhere else in the world.
00:03:44.380 | This has a large number of obvious benefits.
00:03:46.860 | Certainly, it allows you to access international versions
00:03:49.780 | of streaming websites like the Japanese Netflix
00:03:52.580 | or the UK Hulu.
00:03:54.500 | ExpressVPN works on any device you can imagine.
00:03:57.460 | I use it on Linux, shout out to Ubuntu 20.04,
00:04:01.140 | Windows, Android, but it's available everywhere else too.
00:04:05.580 | Once again, get it at expressvpn.com/lexpod
00:04:09.920 | to get a discount and to support this podcast.
00:04:13.180 | And now, here's my conversation with Stephen Schwarzman.
00:04:16.700 | Let's start with a tough question.
00:04:19.760 | What idea do you believe, whether grounded in data
00:04:23.420 | or in intuition, that many people you respect
00:04:26.060 | disagree with you on?
00:04:27.560 | - Well, there isn't all that much anymore
00:04:32.180 | since the world's so transparent.
00:04:34.860 | But one of the things I believe in,
00:04:38.980 | put it in the book, what it takes is,
00:04:42.420 | if you're gonna do something, do something very consequential
00:04:46.220 | and do something that's quite large, if you can,
00:04:49.460 | that's unique.
00:04:51.040 | Because if you operate in that kind of space,
00:04:54.240 | when you're successful, it's a huge impact.
00:04:57.760 | The prospect of success enables you to recruit people
00:05:02.440 | who wanna be part of that.
00:05:04.320 | And those type of large opportunities
00:05:06.840 | are pretty easily described.
00:05:09.480 | And so, not everybody likes to operate at scale.
00:05:14.480 | Some people like to do small things
00:05:16.180 | because it is meaningful for them emotionally.
00:05:21.180 | And so, occasionally, you get a disagreement on that.
00:05:25.380 | But those are life choices rather than commercial choices.
00:05:30.020 | - That's interesting.
00:05:30.980 | What good and bad comes with going big?
00:05:34.860 | See, we often, in America, think big is good.
00:05:39.360 | What's the benefit, what's the cost,
00:05:44.060 | in terms of not just business,
00:05:45.940 | but life, happiness, the pursuit of happiness?
00:05:49.200 | - Well, you do things that make you happy.
00:05:51.680 | It's not mandated.
00:05:53.680 | And everybody's different.
00:05:56.060 | And some people, if they have talent,
00:06:00.760 | like playing pro football,
00:06:02.500 | other people just like throwing the ball around,
00:06:06.220 | not even being on a team.
00:06:09.040 | What's better?
00:06:10.800 | Depends what your objectives are,
00:06:12.360 | depends what your talent is,
00:06:15.440 | depends what gives you joy.
00:06:19.720 | - So, in terms of going big,
00:06:21.560 | is it both for impact on the world
00:06:24.240 | and because you personally, it gives you joy?
00:06:27.640 | - Well, it makes it easier to succeed, actually.
00:06:32.000 | Because if you catch something,
00:06:34.400 | for example, that's cyclical, that's a huge opportunity,
00:06:39.400 | then you usually can find some place
00:06:42.720 | within that huge opportunity
00:06:44.400 | where you can make it work.
00:06:46.960 | If you're prosecuting a really small thing
00:06:51.480 | and you're wrong, you don't have many places to go.
00:06:56.240 | So, I've always found that the easy place to be,
00:07:00.440 | and the ability where you can concentrate human resources,
00:07:06.280 | get people excited about doing really impactful big things,
00:07:13.280 | and you can afford to pay them, actually,
00:07:16.560 | because the bigger thing can generate
00:07:20.160 | much more in the way of financial resources.
00:07:24.920 | So, that brings out people of talent to help you.
00:07:28.500 | And so, all together, it's a virtuous circle, I think.
00:07:34.320 | - How do you know an opportunity
00:07:36.280 | when you see one in terms of the one you wanna go big on?
00:07:40.240 | Is it intuition?
00:07:41.560 | Is it facts?
00:07:43.680 | Is it back and forth deliberation with people you trust?
00:07:48.680 | What's the process?
00:07:50.680 | Is it art, is it science?
00:07:52.680 | - Well, it's pattern recognition.
00:07:54.320 | And how do you get to pattern recognition?
00:07:57.760 | First, you need to understand the patterns
00:08:00.240 | and the changes that are happening.
00:08:02.960 | And that's either, it's observational on some level.
00:08:08.800 | You can call it data, or you can just call it listening
00:08:13.800 | to unusual things that people are saying
00:08:18.080 | that they haven't said before.
00:08:20.000 | And I've always tried to describe this.
00:08:24.400 | It's like seeing a piece of white lint on a black dress,
00:08:29.400 | but most people disregard that piece of lint.
00:08:33.320 | They just see the dress.
00:08:34.960 | I always see the lint.
00:08:37.080 | And I'm fascinated by how did something get someplace
00:08:41.720 | it's not supposed to be?
00:08:43.480 | So it doesn't even need to be a big discrepancy.
00:08:47.120 | But if something shouldn't be someplace
00:08:49.880 | in a constellation of facts that sort of made sense
00:08:54.880 | in a traditional way, I've learned that if you focus
00:09:01.320 | on why one discordant note is there,
00:09:05.920 | that's usually a key to something important.
00:09:09.480 | And if you can find two of those discordant notes,
00:09:14.480 | that's usually a straight line to someplace.
00:09:17.660 | And that someplace is not where you've been.
00:09:20.460 | And usually when you figure out that things are changing
00:09:24.340 | or have changed, and you describe them,
00:09:27.340 | which you have to be able to do,
00:09:29.140 | 'cause it's not some odd intuition.
00:09:33.240 | It's just focusing on facts.
00:09:35.680 | It's almost like a scientific discovery, if you will.
00:09:39.040 | When you describe it to other people in the real world,
00:09:42.280 | they tend to do absolutely nothing about it.
00:09:46.840 | And that's because humans are comfortable
00:09:51.180 | in their own reality.
00:09:53.120 | And if there's no particular reason at that moment
00:09:57.440 | to shake them out of their reality, they'll stay in it,
00:10:01.880 | even if they're ultimately completely wrong.
00:10:05.360 | And I've always been stunned that when I explain
00:10:09.080 | where we're going, what we're doing, and why,
00:10:13.040 | almost everyone just says, "That's interesting."
00:10:18.040 | And they continue doing what they're doing.
00:10:20.640 | And so I think it's pretty easy to do that.
00:10:24.960 | But what you need is a huge data set.
00:10:29.040 | So before AI and people's focus on data,
00:10:33.280 | I've sort of been doing this mostly my whole life.
00:10:36.200 | I'm not a scientist, let alone a computer scientist.
00:10:40.420 | And you can just hear what people are saying
00:10:43.360 | when somebody says something or you observe something
00:10:46.300 | that simply doesn't make sense.
00:10:48.520 | That's when you really go to work.
00:10:50.560 | The rest of it's just processing.
00:10:52.960 | - You know, on a quick tangent,
00:10:55.360 | pattern recognition is a term often used
00:10:57.400 | throughout the history of AI.
00:10:59.000 | That's the goal of artificial intelligence,
00:11:01.520 | is pattern recognition, right?
00:11:03.440 | But there's, I would say, various flavors of that.
00:11:08.440 | So usually pattern recognition refers to the process
00:11:12.760 | of the, what you said, dress and the lint on the dress.
00:11:17.360 | Pattern recognition is very good at identifying the dress,
00:11:21.280 | is looking at the pattern that's always there,
00:11:25.040 | that's very common, and so on.
00:11:27.160 | You almost refer to a pattern that's like an,
00:11:29.840 | what's called outlier detection in computer science, right?
00:11:34.840 | The rare thing, the small thing.
00:11:38.360 | Now, AI is not often good at that.
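(For listeners who want the computer science term made concrete: what Lex calls outlier detection is, in its simplest form, just flagging observations that sit far from the bulk of the data. Below is a minimal, hypothetical sketch in plain Python; the data, threshold, and function name are illustrative and not from the conversation.)

```python
# Minimal sketch of outlier detection: flag the "lint on the dress",
# the observations that sit far from the familiar pattern.
# Hypothetical example; the data and threshold are illustrative.

def find_outliers(values, threshold=2.0):
    """Return values more than `threshold` standard deviations from the mean."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    if std == 0:
        return []  # every observation is identical; nothing stands out
    return [v for v in values if abs(v - mean) / std > threshold]

# The 1.0-ish readings are the "dress" (the common pattern);
# the lone 9.7 is the discordant note worth investigating.
readings = [1.0, 1.1, 0.9, 1.2, 1.0, 9.7, 1.1]
print(find_outliers(readings))  # -> [9.7]
```

Real systems use more robust statistics or learned models, but the spirit is the same: describe the common pattern well enough that the exceptions stand out.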
00:11:42.000 | Do you, just almost philosophically,
00:11:46.440 | the kind of decisions you made in your life
00:11:49.080 | based scientifically almost on data,
00:11:52.240 | do you think AI in the future will be able to do?
00:11:55.200 | Is it something that could be put down into code?
00:11:59.600 | Or is it still deeply human?
00:12:01.640 | - It's tough for me to say,
00:12:03.920 | since I don't have domain knowledge in AI
00:12:08.920 | to know everything that could or might occur.
00:12:14.800 | I know, sort of in my own case,
00:12:19.720 | that most people don't see any of that.
00:12:22.840 | I just assumed it was motivational.
00:12:28.320 | But it's also sort of, it's hard wiring.
00:12:32.960 | What are you wired or programmed to be finding
00:12:37.200 | or looking for?
00:12:38.600 | It's not what happens every day.
00:12:42.000 | That's not interesting, frankly.
00:12:44.680 | I mean, that's what people mostly do.
00:12:47.120 | I do a bunch of that too,
00:12:49.200 | because that's what you do in normal life.
00:12:52.560 | But I've always been completely fascinated
00:12:57.120 | by the stuff that doesn't fit.
00:13:00.240 | Or the other way of thinking about it,
00:13:02.800 | it's determining what people want without them saying it.
00:13:07.800 | That's a different kind of pattern.
00:13:14.560 | You can see everything they're doing.
00:13:17.200 | There's a missing piece.
00:13:18.400 | They don't know it's missing.
00:13:20.400 | You think it's missing, given the other facts.
00:13:23.440 | You know about them, and you deliver that,
00:13:27.080 | and then that becomes sort of very easy to sell to them.
00:13:32.080 | - To linger on this point a little bit,
00:13:35.320 | you've mentioned that in your family,
00:13:37.760 | when you were growing up,
00:13:38.680 | nobody raised their voice in anger or otherwise.
00:13:41.720 | And you said that this allows you to learn to listen
00:13:44.240 | and hear some interesting things.
00:13:46.040 | Can you elaborate, as you have been, on that idea?
00:13:50.560 | What do you hear about the world if you listen?
00:13:54.240 | - Well, you have to listen really intensely
00:13:57.400 | to understand what people are saying,
00:14:01.120 | as well as what people are intending,
00:14:03.560 | because it's not necessarily the same thing.
00:14:06.720 | People mostly give themselves away,
00:14:14.120 | no matter how clever they think they are.
00:14:16.600 | Particularly if you have the full array of inputs.
00:14:22.440 | In other words, if you look at their face,
00:14:24.600 | you look at their eyes, which are the window on the soul,
00:14:28.200 | it's very difficult to conceal what you're thinking.
00:14:33.200 | You look at facial expressions and posture.
00:14:36.580 | You listen to their voice, which changes.
00:14:39.480 | When you're talking about something
00:14:44.900 | you're comfortable with or not, are you speaking faster?
00:14:48.720 | Is the amplitude of what you're saying higher?
00:14:52.080 | Most people just give away what's really on their mind.
00:14:55.800 | They're not that clever.
00:14:58.560 | They're busy spending their time thinking
00:15:00.280 | about what they're in the process of saying.
00:15:03.480 | And so if you just observe that, not in a hostile way,
00:15:07.920 | but just in an evocative way,
00:15:10.320 | and just let them talk for a while,
00:15:12.920 | they'll more or less tell you, almost completely,
00:15:16.000 | what they're thinking, even the stuff
00:15:19.520 | they don't want you to know.
00:15:21.600 | And once you know that, of course,
00:15:24.640 | it's sort of easy to play that kind of game,
00:15:29.120 | because they've already told you
00:15:31.040 | everything you need to know.
00:15:32.440 | And so it's easy to get to a conclusion,
00:15:37.000 | if there's meant to be one, an area of common interest,
00:15:40.440 | since you know almost exactly what's on their mind.
00:15:44.760 | And so that's an enormous advantage,
00:15:48.000 | as opposed to just walking in someplace
00:15:51.800 | and somebody telling you something
00:15:53.960 | and you believing what they're saying.
00:15:56.360 | There's so many different levels of communication.
00:15:59.860 | - So a powerful approach to life,
00:16:02.920 | you discuss in the book on the topic of listening
00:16:06.280 | and really hearing people, is figuring out
00:16:09.080 | what the biggest problem bothering
00:16:10.560 | a particular individual or group is,
00:16:12.920 | and coming up with a solution to that problem,
00:16:15.520 | and presenting them with a solution, right?
00:16:18.700 | In fact, you brilliantly describe a lot of simple things
00:16:24.240 | that most people just don't do.
00:16:26.480 | It's kind of obvious.
00:16:28.000 | Find the problem that's bothering somebody deeply.
00:16:31.200 | And as you said, I think you've implied
00:16:33.520 | that they will usually tell you what the problem is.
00:16:36.280 | But can you talk about this process
00:16:39.280 | of seeing what the biggest problem for a person is,
00:16:43.060 | trying to solve it,
00:16:44.320 | and maybe a particularly memorable example?
00:16:47.560 | - Yeah, sure.
00:16:48.400 | You know, if you know you're gonna meet somebody,
00:16:52.020 | there are two types of situations.
00:16:54.000 | Chance meetings, and you know,
00:16:57.280 | the second is you know you're gonna meet somebody.
00:16:59.280 | So let's take the easiest one,
00:17:01.280 | which is you know you're gonna meet somebody.
00:17:04.120 | And you start trying to make pretend you're them.
00:17:09.120 | It's really easy.
00:17:11.260 | What's on their mind?
00:17:12.760 | What are they thinking about in their daily life?
00:17:16.300 | What are the big problems they're facing?
00:17:19.060 | So if they're, you know, to make it a really easy example,
00:17:23.720 | you know, make pretend, you know,
00:17:26.300 | they're like President of the United States.
00:17:28.780 | Doesn't have to be this president.
00:17:30.260 | It can be any president.
00:17:31.720 | So you sort of know what's more or less on their mind
00:17:34.460 | because the press keeps reporting it.
00:17:37.260 | And you see it on television.
00:17:39.480 | You hear it.
00:17:40.940 | People discuss it.
00:17:42.420 | So you know if you're gonna be running into somebody
00:17:45.160 | in that kind of position,
00:17:47.420 | you sort of know what they look like already.
00:17:49.760 | You know what they sound like.
00:17:52.100 | You know what their voice is like.
00:17:56.000 | And you know what they're focused on.
00:17:58.340 | And so if you're gonna meet somebody like that,
00:18:01.420 | what you should do is take the biggest unresolved issue
00:18:05.360 | that they're facing and come up with
00:18:09.100 | a few interesting solutions
00:18:11.660 | that basically haven't been out there
00:18:14.980 | or that you haven't heard anybody else was thinking about.
00:18:19.980 | So just to give you an example,
00:18:21.660 | I was sort of in the early 1990s
00:18:24.380 | and I was invited to something at the White House
00:18:26.420 | which was a big deal for me
00:18:27.820 | because I was like, you know, a person from no place.
00:18:30.940 | And you know, I had met the president once before
00:18:35.340 | because it was President Bush
00:18:37.220 | because his son was in my dormitory.
00:18:40.420 | So I had met him at Parents' Day.
00:18:43.060 | I mean it's just like the oddity of things.
00:18:45.420 | So I knew I was gonna see him
00:18:47.820 | because you know that's where the invitation came from.
00:18:50.580 | And so there was something going on
00:18:54.060 | and I just thought about, you know,
00:18:56.300 | two or three ways to approach that issue.
00:19:00.980 | And you know at that point I was separated
00:19:04.460 | and so I had brought a date to the White House
00:19:08.780 | and so I saw the president
00:19:12.300 | and we sort of went over in a corner for about 10 minutes
00:19:16.100 | and discussed whatever this issue was.
00:19:18.500 | And I later, you know, went back to my date.
00:19:22.060 | It was a little rude
00:19:22.900 | but it was meant to be a confidential conversation
00:19:25.820 | and I barely knew her.
00:19:27.740 | And you know she said,
00:19:29.580 | "What were you talking about all that time?"
00:19:31.580 | I said, "Well you know,
00:19:33.380 | "there's something going on in the world
00:19:35.100 | "and I've thought about different ways
00:19:37.900 | "of perhaps approaching that."
00:19:39.700 | And he was interested.
00:19:40.860 | And the answer is of course he was interested.
00:19:44.100 | Why wouldn't he be interested?
00:19:45.420 | There didn't seem to be an easy outcome.
00:19:47.860 | And so you know conversations of that type,
00:19:51.420 | once somebody knows you're really thinking
00:19:53.840 | about what's good for them and good for the situation,
00:19:58.660 | has nothing to do with me.
00:20:01.900 | I mean it's really about being in service,
00:20:04.780 | you know to the situation,
00:20:08.380 | then people trust you and they'll tell you other things
00:20:12.060 | because they know your motives are basically very pure.
00:20:17.060 | You're just trying to resolve a difficult situation
00:20:20.580 | and help somebody do it.
00:20:21.920 | So these types of things,
00:20:23.840 | you know that's a planned situation, that's easy.
00:20:27.400 | It's sometimes you just come upon somebody
00:20:29.700 | and they start talking and you know that requires
00:20:32.900 | you know like different skills.
00:20:34.500 | You can ask them, "What have you been working on lately?
00:20:39.360 | "What are you thinking about?"
00:20:41.460 | You can ask them, "Has anything been particularly difficult?"
00:20:46.460 | You can ask most people if they trust you for some reason,
00:20:51.620 | they'll tell you.
00:20:55.540 | And then you have to instantly go to work on it.
00:20:58.000 | And you know that's not as good
00:21:02.120 | as having some advanced planning.
00:21:03.660 | But you know almost everything going on is like out there.
00:21:08.660 | And people who are involved with interesting situations,
00:21:15.340 | they're playing in the same ecosystem.
00:21:20.960 | They just have different roles in the ecosystem.
00:21:25.500 | And you know you can do that with somebody
00:21:29.620 | who owns a pro football team that loses all the time.
00:21:34.220 | We specialize in those in New York.
00:21:37.260 | And you know you already have analyzed
00:21:41.420 | why they're losing, right?
00:21:43.220 | Inevitably it's because they don't have a great quarterback,
00:21:48.180 | they don't have a great coach,
00:21:50.300 | and they don't have a great general manager
00:21:52.380 | who knows how to hire the best talent.
00:21:55.140 | So those are the three reasons why a team fails, right?
00:21:59.180 | Because there are salary caps.
00:22:01.140 | So every team pays a certain amount of money
00:22:03.380 | for all their players.
00:22:04.820 | So it's gotta be those three positions.
00:22:07.000 | So if you're talking with somebody like that,
00:22:09.500 | inevitably, even though it's not structured,
00:22:12.940 | you'll know how their team's doing
00:22:16.820 | and you'll know pretty much why.
00:22:19.260 | And if you start asking questions about that,
00:22:22.460 | they're typically very happy to talk about it
00:22:24.980 | 'cause they haven't solved that problem.
00:22:27.020 | In some case, they don't even know that's the problem.
00:22:29.620 | It's pretty easy to see it.
00:22:31.300 | So you know, I do stuff like that,
00:22:33.620 | which I find is intuitive as a process,
00:22:38.620 | but you know, leads to really good results.
00:22:43.540 | - Well the funny thing is when you're smart,
00:22:46.660 | for smart people, it's hard to escape their own ego
00:22:51.420 | and the space of their own problems,
00:22:53.540 | which is what's required
00:22:55.700 | to think about other people's problems.
00:22:58.220 | It requires for you to let go of the fact
00:23:01.060 | that your own problems are all important.
00:23:03.780 | And then to talk about your,
00:23:05.700 | I think while it seems obvious,
00:23:09.260 | and I think quite brilliant,
00:23:11.900 | it's a difficult leap for many people,
00:23:14.580 | especially smart people,
00:23:16.340 | to empathize with, truly empathize
00:23:18.780 | with the problems of others.
00:23:20.180 | - Well, I have a competitive advantage.
00:23:23.260 | - Come on, I like it.
00:23:25.460 | - Which is, I don't think I'm so smart.
00:23:27.660 | - That's good.
00:23:29.580 | - So you know, it's not a problem for me.
00:23:31.820 | - Well the truly smartest people I know
00:23:33.500 | say that exact same thing.
00:23:34.820 | Yeah, being humble is really useful,
00:23:39.580 | competitive advantage, as you said.
00:23:41.460 | How do you stay humble?
00:23:44.460 | - Well I haven't changed much.
00:23:46.980 | - Since?
00:23:47.820 | - Since I was in my mid-teens, you know,
00:23:51.900 | I was raised partly in the city
00:23:54.820 | and partly in the suburbs.
00:23:56.340 | And you know, whatever the values I had at that time,
00:24:03.300 | those are still my values.
00:24:05.740 | I call 'em like middle-class values,
00:24:08.340 | that's how I was raised.
00:24:10.340 | And I've never changed, why would I?
00:24:14.780 | That's who I am.
00:24:16.300 | And so the accoutrement of, you know,
00:24:21.180 | the rest of your life has gotta be put on the same,
00:24:25.620 | you know, like solid foundation of who you are.
00:24:28.500 | Because if you start losing who you really are,
00:24:31.260 | who are you?
00:24:32.740 | So I've never had the desire to be somebody else,
00:24:37.220 | I just do other things now that I wouldn't do
00:24:40.780 | as a, you know, sort of as a middle-class kid
00:24:43.980 | from Philadelphia.
00:24:45.360 | I mean my life has morphed on a certain level,
00:24:48.700 | but part of the strength of having integrity
00:24:52.580 | of personality is that you can remain in touch
00:24:57.580 | with everybody who comes from that kind of background.
00:25:02.740 | And you know, even though I do some things
00:25:07.300 | that aren't like that, you know,
00:25:09.020 | in terms of people I'd meet or situations I'm in,
00:25:12.340 | I always look at it through the same lens.
00:25:15.060 | And that's very psychologically comfortable.
00:25:18.780 | And doesn't require me to make any real adjustments
00:25:22.540 | in my life and I just keep plowing ahead.
00:25:25.220 | - There's a lot of activity and progress in recent years
00:25:29.540 | around effective altruism.
00:25:31.340 | I wanted to bring up this topic with you
00:25:34.540 | 'cause it's an interesting one from your perspective.
00:25:37.200 | You can put it in any kind of terms,
00:25:40.020 | but it's philanthropy that focuses on maximizing impact.
00:25:44.340 | How do you see the goal of philanthropy,
00:25:47.420 | both from a personal motivation perspective
00:25:50.100 | and a societal big picture impact perspective?
00:25:53.620 | - Yeah, I don't think about philanthropy
00:25:56.000 | the way you would expect me to, okay?
00:25:59.020 | I look at, you know, sort of solving big issues,
00:26:04.020 | addressing big issues, starting new organizations to do it,
00:26:09.480 | much like we do in our business.
00:26:12.100 | You know, we keep growing our business
00:26:14.340 | not by taking the original thing and making it larger,
00:26:17.660 | but continually seeing new things and building those.
00:26:22.300 | And, you know, sort of marshaling financial resources,
00:26:26.420 | human resources, and in our case,
00:26:30.460 | because we're in the investment business,
00:26:32.140 | we find something new that looks like
00:26:34.020 | it's gonna be terrific.
00:26:35.580 | And we do that and it works out really well.
00:26:38.780 | All I do in what you would call philanthropy
00:26:42.340 | is look at other opportunities to help society.
00:26:47.340 | And I end up starting something new,
00:26:54.340 | marshaling people, marshaling a lot of money.
00:26:57.080 | And then at the end of that kind of creative process,
00:27:00.820 | so somebody typically will ask me to write a check.
00:27:03.700 | I don't wake up and say,
00:27:06.020 | how can I like give large amounts of money away?
00:27:09.500 | I look at issues that are important for people.
00:27:14.500 | In some cases, I do smaller things
00:27:17.460 | because it's important to a person.
00:27:19.420 | And, you know, I have, you know,
00:27:22.780 | sort of I can relate to that person.
00:27:24.780 | There's some unfairness that's happened to them.
00:27:28.180 | And so in situations like that,
00:27:30.620 | I'd give money anonymously and help them out.
00:27:33.340 | And, you know, that's, it's like a miniature version
00:27:39.420 | of addressing something really big.
00:27:41.260 | So, you know, at MIT, I've done a big thing,
00:27:46.260 | you know, helping to start this new school of computing.
00:27:49.900 | And I did that because, you know,
00:27:51.980 | I saw that, you know, there's sort of like a global race on
00:27:56.380 | in AI, quantum and other major technologies.
00:28:00.740 | And I thought that the US could use more enhancement
00:28:06.900 | from a competitive perspective.
00:28:09.740 | And I also, because I get to China a lot
00:28:12.380 | and I travel around a lot compared to a regular person,
00:28:16.080 | you know, I can see the need to have control
00:28:21.040 | of these types of technologies.
00:28:23.180 | So when they're introduced, we don't create a mess
00:28:26.360 | like we did with the internet and with social media,
00:28:30.420 | an unintended consequence, you know,
00:28:33.460 | that's creating all kinds of issues
00:28:35.260 | on freedom of speech and the functioning
00:28:37.780 | of liberal democracies.
00:28:39.380 | So with AI, it was pretty clear
00:28:41.580 | that there was enormous difference of views
00:28:44.980 | around the world by the relatively few practitioners
00:28:48.740 | in the world who really knew what was going on.
00:28:51.480 | And by accident, I knew a bunch of these people, you know,
00:28:56.260 | who were like big famous people.
00:28:59.940 | And I could talk to them and say,
00:29:01.900 | "Why do you think this is a force for bad?"
00:29:05.140 | And someone else, "Why do you feel this is a force for good?
00:29:08.700 | And how do we move forward with the technology
00:29:13.540 | but the same, by the same time,
00:29:15.580 | make sure that whatever is potentially, you know,
00:29:19.820 | sort of on the bad side of this technology with, you know,
00:29:23.540 | for example, disruption of workforces and things like that,
00:29:27.860 | that could happen much faster
00:29:29.700 | than the industrial revolution.
00:29:32.100 | What do we do about that?
00:29:33.380 | And how do we keep that under control
00:29:35.980 | so that the really good things about these technologies,
00:29:39.540 | which will be great things,
00:29:41.860 | not good things, are allowed to happen?"
00:29:44.980 | So to me, you know, this was one of the great issues
00:29:49.980 | facing society.
00:29:53.900 | The number of people who were aware of it were very small.
00:29:57.400 | I just accidentally got sucked into it.
00:30:00.420 | And as soon as I saw it, I went, "Oh my God, this is mega."
00:30:05.020 | Both on a competitive basis globally,
00:30:09.560 | but also in terms of protecting society
00:30:13.620 | and benefiting society.
00:30:15.520 | So that's how I got involved.
00:30:17.620 | And at the end, you know, sort of the right thing
00:30:20.140 | that we figured out was, you know,
00:30:22.380 | sort of double MIT's computer science faculty
00:30:26.780 | and basically create the first AI-enabled university
00:30:30.900 | in the world.
00:30:32.500 | And, you know, in effect, be an example,
00:30:35.380 | a beacon to the rest of the research community
00:30:38.500 | around the world academically,
00:30:40.900 | and create, you know, a much more robust US situation,
00:30:45.900 | competitive situation among the universities.
00:30:52.900 | Because if MIT was gonna raise a lot of money
00:30:55.820 | and double its faculty, well, you could bet that,
00:30:59.340 | you know, a number of other universities
00:31:02.460 | were gonna do the same thing.
00:31:03.420 | At the end of it, it would be great for knowledge creation,
00:31:08.260 | you know, great for the United States, great for the world.
00:31:12.940 | And so I like to do things that I think are
00:31:17.180 | really positive things that other people aren't acting on,
00:31:22.940 | that I see for whatever the reason.
00:31:25.940 | First, it's just people I meet and what they say,
00:31:29.220 | and I can recognize when something really profound
00:31:32.980 | is about to happen or needs to.
00:31:35.380 | And I do it, and at the end of the situation,
00:31:39.100 | somebody says, "Can you write a check to help us?"
00:31:43.300 | And then the answer is, "Sure."
00:31:44.860 | I mean, because if I don't, the vision won't happen.
00:31:48.260 | But it's the vision of whatever I do
00:31:52.420 | that is compelling.
00:31:54.380 | - And essentially, I love that idea of
00:31:57.380 | whether it's small at the individual level or really big,
00:32:01.940 | like the gift to MIT to launch the College of Computing,
00:32:06.940 | it starts with a vision, and you see philanthropy as,
00:32:11.980 | the biggest impact you can have
00:32:15.780 | is by launching something new,
00:32:18.380 | especially on an issue that others aren't really addressing.
00:32:22.500 | And I also love the notion, and you're absolutely right,
00:32:25.740 | that there's other universities,
00:32:27.460 | Stanford, CMU, I'm looking at you,
00:32:31.700 | that would essentially, the seed will create other,
00:32:36.100 | it'll have a ripple effect that potentially might help
00:32:40.860 | U.S. be a leader or continue to be a leader in AI,
00:32:44.340 | this potentially very transformative research direction.
00:32:49.340 | Just to linger on that point a little bit,
00:32:52.220 | what is your hope long-term for the impact
00:32:56.020 | the college here at MIT might have
00:32:58.900 | in the next five, 10, even 20,
00:33:01.300 | or let's get crazy, 30, 50 years?
00:33:03.620 | - Well, it's very difficult to predict the future
00:33:06.140 | when you're dealing with knowledge production
00:33:08.420 | and creativity.
00:33:11.140 | MIT has obviously some unique aspects globally,
00:33:16.140 | and there's four big sort of academic surveys,
00:33:21.720 | I forget whether it was QS, there's the Times in London,
00:33:27.700 | the U.S. News, and whatever.
00:33:31.100 | One of these recently, MIT was ranked number one
00:33:34.860 | in the world, right?
00:33:37.020 | So leave aside whether you're number three somewhere else,
00:33:41.100 | in the great sweep of humanity,
00:33:44.500 | this is pretty amazing, right?
00:33:46.900 | So you have a really remarkable aggregation
00:33:50.900 | of human talent here.
00:33:54.220 | And where it goes, it's hard to tell.
00:33:58.380 | You have to be a scientist to have the right feel.
00:34:01.680 | But what's important is you have a critical mass of people,
00:34:08.620 | and I think it breaks into two buckets.
00:34:12.500 | One is scientific advancement,
00:34:14.920 | and if the new college can help,
00:34:19.860 | sort of either serve as a convening force
00:34:23.780 | within the university, or help sort of coordination
00:34:28.780 | and communication among people, that's a good thing.
00:34:34.320 | Absolute good thing.
00:34:36.740 | The second thing is in the AI ethics area,
00:34:41.700 | which is in a way equally important,
00:34:46.700 | because if the science side creates blowback,
00:34:53.860 | so that science is a bit crippled in terms of going forward,
00:35:04.820 | because society's reaction to knowledge advancement
00:35:09.620 | in this field becomes really hostile,
00:35:12.620 | then you've sort of lost the game
00:35:15.580 | in terms of scientific progress and innovation.
00:35:19.100 | And so the AI ethics piece is super important,
00:35:23.020 | because in a perfect world,
00:35:26.580 | MIT would serve as a global convener,
00:35:32.500 | because what you need is you need the research universities,
00:35:37.500 | you need the companies that are driving AI and quantum work,
00:35:43.300 | you need governments who will ultimately be regulating
00:35:50.700 | certain elements of this, and you also need the media
00:35:54.900 | to be knowledgeable and trained,
00:35:59.540 | so we don't get sort of overreactions to one situation,
00:36:04.540 | which then goes viral, and it ends up shutting down avenues
00:36:12.620 | that are perfectly fine to be walking down
00:36:18.060 | or running down that avenue,
00:36:20.660 | but if enough discordant information,
00:36:25.620 | not even correct necessarily,
00:36:29.500 | sort of gets,
00:36:30.500 | is pushed around society,
00:36:36.060 | then you can end up with a really hostile
00:36:38.020 | regulatory environment and other things,
00:36:40.660 | so you have four drivers that have to be sort of integrated.
00:36:45.660 | And so if the new school of computing
00:36:54.140 | can be really helpful in that regard,
00:36:58.660 | then that's a real service to science,
00:37:02.260 | and it's a service to MIT.
00:37:05.180 | So that's why I wanted to get involved for both areas.
00:37:10.180 | - And the hope is, for me, for others,
00:37:12.860 | for everyone, for the world,
00:37:14.340 | is for this particular college of computing
00:37:18.140 | to be a beacon and a connector for these ideas.
00:37:22.980 | - Yeah, that's right.
00:37:23.820 | I mean, I think MIT is perfectly positioned to do that.
00:37:28.820 | - So you've mentioned the media, social media,
00:37:34.380 | the internet as this complex network of communication
00:37:39.380 | with flaws, perhaps, perhaps you can speak to them,
00:37:44.260 | but I personally think that science and technology
00:37:50.500 | has its flaws, but ultimately is,
00:37:53.740 | one, sexy, exciting.
00:37:58.220 | It's the way for us to explore
00:38:00.700 | and understand the mysteries of our world.
00:38:03.020 | And two, perhaps more importantly for some people,
00:38:06.860 | it's a huge way to, a really powerful way
00:38:09.740 | to grow the economy, to improve the quality of life
00:38:12.580 | for everyone.
00:38:13.660 | So how do we get, how do you see the media,
00:38:18.100 | social media, the internet as a society
00:38:21.700 | having a healthy discourse about science?
00:38:26.700 | First of all, one that's factual
00:38:30.260 | and two, one that finds science exciting,
00:38:33.140 | that invests in science, that pushes it forward,
00:38:36.540 | especially in this science fiction,
00:38:40.340 | fear-filled field of artificial intelligence?
00:38:43.580 | - Well, I think that's a little above my pay grade
00:38:45.860 | because trying to control social media
00:38:50.060 | to make it do what you want to do
00:38:51.940 | appears to be beyond almost anybody's control.
00:38:56.700 | And the technology is being used
00:39:00.300 | to create what I call the tyranny of the minorities.
00:39:03.700 | Okay, a minority is defined as two or three people
00:39:07.940 | on a street corner.
00:39:09.300 | Doesn't matter what they look like,
00:39:11.420 | doesn't matter where they came from,
00:39:13.500 | they're united by that one issue that they care about
00:39:18.500 | and their job is to enforce their views on the world.
00:39:24.940 | And in the political world,
00:39:30.260 | people just are manufacturing truth
00:39:33.380 | and they throw it all over and it affects all of us.
00:39:38.260 | And sometimes people are just hired
00:39:43.340 | to do that, I mean, it's amazing.
00:39:46.820 | And you think it's one person,
00:39:48.740 | it's really just sort of a front
00:39:52.460 | for a particular point of view.
00:39:55.260 | And this has become exceptionally disruptive
00:39:59.860 | for society and it's dangerous
00:40:03.500 | and it's undercutting the ability
00:40:06.140 | of liberal democracies to function.
00:40:09.060 | And I don't know how to get a grip on this
00:40:11.980 | and I was really surprised when we,
00:40:15.580 | you know, I was up here for the announcement last spring
00:40:19.220 | of the College of Computing
00:40:24.340 | and they had all these famous scientists,
00:40:27.020 | some of whom were involved with the invention
00:40:29.260 | of the internet and almost every one of them got up
00:40:35.100 | and said, "I think I made a mistake."
00:40:39.140 | And as a non-scientist,
00:40:40.740 | I never thought I'd hear anyone say that.
00:40:44.100 | And what they said is, more or less, to make it simple,
00:40:48.620 | "We thought this would be really cool,
00:40:51.540 | "inventing the internet.
00:40:52.620 | "We could connect everyone in the world,
00:40:54.900 | "we can move knowledge around, it was instantaneous,
00:40:58.900 | "it's a really amazing thing."
00:41:01.260 | He said, "I don't know that there was anyone
00:41:04.300 | "who ever thought about social media coming out of that
00:41:07.440 | "and the actual consequences for people's lives."
00:41:11.160 | You know, there's always some younger person,
00:41:17.000 | I just saw one of these yesterday,
00:41:19.760 | it was reported on the National News,
00:41:21.440 | he killed himself when people used social media
00:41:25.800 | to basically, you know, sort of ridicule him
00:41:28.740 | or something of that type.
00:41:30.520 | This is dead.
00:41:31.820 | This is dangerous.
00:41:35.800 | And, you know, so I don't have a solution for that
00:41:40.800 | other than going forward,
00:41:45.320 | you can't end up with this type of outcome.
00:41:49.000 | Using AI to make this kind of mistake twice
00:41:53.120 | is unforgivable.
00:41:55.880 | So interestingly, at least in the West,
00:41:59.200 | and parts of China, people are quite sympathetic
00:42:04.360 | to, you know, sort of the whole concept of AI ethics
00:42:08.920 | and what gets introduced when,
00:42:11.640 | and cooperation within your own country,
00:42:15.360 | within your own industry, as well as globally,
00:42:19.520 | to make sure that the technology is a force for good.
00:42:24.320 | - On that really interesting topic, since 2007,
00:42:27.120 | you've had a relationship with senior leadership
00:42:30.200 | with a lot of people in China,
00:42:32.840 | and an interest in understanding modern China,
00:42:36.040 | their culture, their world,
00:42:38.200 | much like with Russia, I'm from Russia, originally,
00:42:42.240 | Americans are told a very narrow,
00:42:44.000 | one-sided story about China
00:42:45.880 | that I'm sure misses a lot of fascinating complexity,
00:42:50.440 | both positive and negative.
00:42:53.360 | What lessons about Chinese culture,
00:42:55.800 | its ideas as a nation, its future,
00:42:57.600 | do you think Americans should know about,
00:43:00.620 | deliberate on, think about?
00:43:02.980 | - Well, it's sort of a wide question
00:43:06.160 | that you're asking about.
00:43:09.340 | You know, China's a pretty unusual place.
00:43:11.960 | First, it's huge.
00:43:13.800 | You know, you got, it's physically huge.
00:43:17.640 | It's got a billion three people,
00:43:19.960 | and the character of the people
00:43:22.580 | isn't as well understood in the United States.
00:43:27.400 | Chinese people are amazingly energetic.
00:43:32.400 | If you're one of a billion three people,
00:43:37.260 | one of the things you've gotta be focused on
00:43:40.140 | is how do you make your way, you know,
00:43:43.480 | through a crowd of a billion 2.99999 other people.
00:43:48.480 | - Now, the word for that is competitive.
00:43:52.940 | - Yes, they are individually highly energetic,
00:43:57.140 | highly focused, always looking for some opportunity
00:44:01.980 | for themselves because they need to,
00:44:06.860 | because there's an enormous amount
00:44:10.780 | of just literally people around.
00:44:12.500 | And so, you know, what I've found is
00:44:16.420 | they'll try and find a way to win for themselves.
00:44:21.860 | And their country is complicated
00:44:24.700 | because it basically doesn't have
00:44:26.700 | the same kind of functional laws
00:44:29.620 | that we do in the United States and the West.
00:44:34.020 | And the country is controlled really
00:44:38.740 | through a web of relationships you have with other people,
00:44:43.500 | and the relationships that those other people
00:44:47.240 | have with other people.
00:44:48.860 | So it's an incredibly dynamic culture
00:44:53.900 | where if somebody knocks off somebody at the top
00:44:57.620 | who's three levels above you
00:44:59.660 | and is in effect protecting you,
00:45:01.340 | then you're like a sort of a floating molecule there
00:45:06.340 | without tethering except the one or two layers above you,
00:45:12.220 | but that's gonna get affected.
00:45:14.140 | So it's a very dynamic system,
00:45:15.940 | and getting people to change is not that easy
00:45:19.980 | because if there aren't really functioning laws,
00:45:23.780 | it's only the relationships that everybody has.
00:45:27.420 | And so when you decide to make a major change
00:45:30.940 | and you sign up for it,
00:45:32.360 | something is changing in your life.
00:45:36.340 | There won't necessarily be all the same people on your team,
00:45:39.940 | and that's a very high-risk enterprise.
00:45:43.740 | So when you're dealing with China,
00:45:46.340 | it's important to know almost
00:45:49.380 | what everybody's relationship is with somebody.
00:45:52.800 | So when you suggest doing something differently,
00:45:55.900 | you line up these forces.
00:45:59.140 | In the West, it's usually you talk to a person
00:46:02.820 | and they figure out what's good for them.
00:46:04.980 | It's a lot easier.
00:46:06.380 | And in that sense, in a funny way,
00:46:08.820 | it's easier to make change in the West,
00:46:11.500 | just the opposite of what people think.
00:46:13.760 | But once the Chinese system
00:46:16.820 | adjusts to something that's new,
00:46:18.660 | everybody's on the team.
00:46:22.500 | It's hard to change 'em, but once they're changed,
00:46:25.340 | they are incredibly focused in a way
00:46:28.780 | that it's hard for the West to do
00:46:31.900 | in a more individualistic culture.
00:46:35.100 | So there are all kinds of fascinating things.
00:46:38.920 | One thing that might interest the people who are listening,
00:46:46.000 | we're more technologically-based than some other group.
00:46:50.780 | I was with one of the top people in the government
00:46:54.700 | a few weeks ago, and he was telling me
00:46:58.620 | that every school child in China
00:47:03.180 | is going to be taught computer science.
00:47:07.340 | Now imagine 100% of these children.
00:47:13.820 | This is such a large number of human beings.
00:47:20.180 | Now that doesn't mean that every one of 'em
00:47:22.740 | will be good at computer science,
00:47:25.340 | but if it's sort of like in the West,
00:47:28.860 | if it's like math or English, everybody's gonna take it.
00:47:33.500 | Not everybody's great at English.
00:47:35.620 | They don't write books, they don't write poetry.
00:47:38.300 | And not everybody's good at math.
00:47:41.260 | Somebody like myself, I sort of evolved to the third grade,
00:47:44.900 | and I'm still doing flashcards.
00:47:46.640 | I didn't make it further in math.
00:47:50.280 | But imagine everybody in their society
00:47:54.600 | is gonna be involved with computer science.
00:47:58.400 | - I'd just even pause on that.
00:48:00.080 | I think computer science involves
00:48:05.200 | at the basic beginner level programming,
00:48:07.920 | and the idea that everybody in the society
00:48:11.400 | would have some ability to program a computer is incredible.
00:48:18.680 | For me, it's incredibly exciting,
00:48:20.360 | and I think that should give the United States pause.
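(To make the point about "basic beginner level programming" concrete: the kind of first exercise a schoolchild might see is only a few lines. This is a hypothetical illustration in Python, not something discussed in the conversation.)

```python
# A hypothetical first-lesson exercise: greet the user, then add up some numbers.
name = input("What is your name? ")
print("Hello,", name)

total = 0
for number in range(1, 11):  # the numbers 1 through 10
    total += number
print("The sum of 1 through 10 is", total)
```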
00:48:24.160 | Talking about philanthropy and launching things,
00:48:31.320 | there's nothing like launching,
00:48:33.800 | sort of investing in the young, the youth, the education system,
00:48:38.200 | because that's where everything launches.
00:48:40.400 | - Yes, well, we've got a complicated system
00:48:42.640 | because we have over 3,000 school districts
00:48:45.720 | around the country.
00:48:47.420 | China doesn't worry about that as a concept.
00:48:50.920 | They make a decision at the very top of the government
00:48:55.280 | that that's what they want to have happen,
00:48:57.400 | and that is what will happen.
00:48:59.480 | And we're really handicapped by this distributed power
00:49:04.480 | in the education area, although some people
00:49:09.320 | involved with that area will think it's great.
00:49:13.680 | But you would know better than I do
00:49:16.840 | what percent of American children
00:49:19.000 | have computer science exposure.
00:49:22.840 | My guess, no knowledge, would be 5% or less.
00:49:27.840 | And if we're gonna be going into a world
00:49:33.080 | where the other major economic power,
00:49:36.500 | sort of like ourselves, has got like 100%, and we got five,
00:49:42.600 | and the whole computer science area is the future,
00:49:47.400 | then we're purposely, or accidentally actually,
00:49:53.000 | handicapping ourselves, and our system doesn't allow us
00:49:58.000 | to adjust quickly to that.
00:50:00.360 | So, you know, issues like this, I find fascinating.
00:50:05.360 | And if you're lucky enough to go to other countries,
00:50:10.520 | which I do, and you learn what they're thinking,
00:50:14.600 | then it informs what we ought to be doing
00:50:18.720 | in the United States.
00:50:21.520 | - So the current administration, Donald Trump,
00:50:25.600 | has released an executive order
00:50:28.400 | on artificial intelligence.
00:50:29.880 | Not sure if you're familiar with it, in 2019.
00:50:33.220 | Looking several years ahead, how does America,
00:50:37.920 | sort of, we've mentioned in terms of the big impact,
00:50:41.800 | we hope your investment in MIT will have a ripple effect,
00:50:46.800 | but from a federal perspective,
00:50:49.480 | from a government perspective,
00:50:51.760 | how does America establish, with respect to China,
00:50:55.680 | leadership in the world at the top
00:50:57.840 | for research and development in AI?
00:51:00.280 | - I think that you have to get the federal government
00:51:04.120 | in the game in a big way.
00:51:07.680 | And that this leap forward, technologically,
00:51:12.680 | which is gonna happen with or without us,
00:51:16.260 | you know, really should be with us.
00:51:20.000 | And it's an opportunity, in effect,
00:51:23.540 | for another moonshot kind of mobilization
00:51:27.500 | by the United States.
00:51:30.600 | I think the appetite actually is there to do that.
00:51:36.600 | At the moment, what's getting in the way
00:51:41.400 | is the kind of poisonous politics we have.
00:51:44.880 | But if you go below the lack of cooperation,
00:51:51.000 | which is almost the defining element
00:51:56.760 | of American democracy right now in the Congress,
00:52:00.960 | if you talk to individual members, they get it.
00:52:04.640 | And they would like to do something.
00:52:08.040 | Another part of the issue is we're running huge deficits.
00:52:11.040 | We're running trillion-dollar-plus deficits.
00:52:13.800 | So how much money do you need for this initiative?
00:52:18.800 | Where does it come from?
00:52:21.440 | Who's prepared to stand up for it?
00:52:24.300 | Because if it involves taking away resources
00:52:27.120 | from another area, our political system
00:52:30.160 | is not real flexible to do that.
00:52:34.440 | If you're creating this kind of initiative,
00:52:39.440 | which we need, where does the money come from?
00:52:44.000 | And trying to get money when you've got
00:52:48.000 | trillion-dollar deficits, in a way, could be easy.
00:52:50.880 | What's the difference of a trillion and a trillion?
00:52:52.840 | A little more.
00:52:54.240 | But, you know, it's hard with the mechanisms of Congress.
00:52:58.640 | But what's really important is this is not a,
00:53:04.160 | this is not an issue that is unknown.
00:53:08.800 | And it's viewed as a very important issue.
00:53:12.480 | And there's almost no one in the Congress
00:53:15.920 | when you sit down and explain what's going on
00:53:18.940 | who doesn't say we've gotta do something.
00:53:22.120 | - Let me ask the impossible question.
00:53:24.080 | You didn't endorse Donald Trump.
00:53:28.460 | But after he was elected, you have given him advice.
00:53:33.480 | Which seems to me a great thing to do,
00:53:38.480 | no matter who the president is,
00:53:42.080 | to contribute, positively contribute
00:53:44.600 | to this nation by giving advice.
00:53:47.120 | And yet, you've received a lot of criticism for this.
00:53:50.580 | So on the previous topic of science and technology
00:53:54.160 | and government, how do we have a healthy discourse,
00:53:59.720 | give advice, get excited, conversation with the government
00:54:04.720 | about science and technology without it
00:54:07.680 | becoming politicized?
00:54:09.640 | - Well, it's very interesting.
00:54:11.140 | So when I was young, before there was a moonshot,
00:54:17.120 | we had a president named John F. Kennedy
00:54:21.080 | from Massachusetts here.
00:54:23.120 | And in his inaugural address as president,
00:54:27.760 | he asked not what your country can do for you,
00:54:30.840 | but what you can do for your country.
00:54:34.680 | Now, we had a generation of people my age,
00:54:38.800 | basically people who grew up with that credo.
00:54:43.800 | And sometimes you don't need to innovate.
00:54:49.120 | You can go back to basic principles.
00:54:52.720 | And that's a good basic principle.
00:54:55.200 | What can we do?
00:54:57.060 | Americans have GDP per capita of around $60,000.
00:55:02.440 | It's not equally distributed, but it's big.
00:55:08.200 | And people have, I think, an obligation
00:55:13.200 | to help their country.
00:55:17.120 | And I do that.
00:55:19.160 | And apparently, I take some grief from some people
00:55:25.000 | who project on me things I don't even vaguely believe.
00:55:30.000 | But I'm quite simple.
00:55:35.600 | I tried to help the previous president, President Obama.
00:55:41.120 | He was a good guy.
00:55:42.760 | And he was a different party.
00:55:44.460 | And I tried to help President Bush.
00:55:46.720 | And he's a different party.
00:55:48.640 | And I sort of don't care that much.
00:55:54.400 | About what the parties are.
00:55:57.200 | I care about, even though I'm a big donor
00:56:00.520 | for the Republicans, but what motivates me
00:56:04.960 | is what are the problems we're facing?
00:56:08.240 | Can I help people get to sort of a good outcome
00:56:13.240 | that'll stand any test?
00:56:15.760 | But we live in a world now where sort of the filters
00:56:22.480 | and the hostility is so unbelievable.
00:56:27.480 | In the 1960s, when I went to school,
00:56:33.480 | in university, I went to Yale.
00:56:35.280 | We had so much stuff going on.
00:56:39.040 | We had a war called the Vietnam War.
00:56:43.860 | We had sort of black power starting.
00:56:47.200 | And we had a sexual revolution
00:56:51.200 | with the birth control pill.
00:56:52.760 | And there was one other major thing going on:
00:56:59.480 | the drug revolution.
00:57:04.260 | There hasn't been a generation that had more stuff
00:57:10.040 | going on in a four-year period than my era.
00:57:17.480 | Yet, there wasn't this kind of instant hostility
00:57:22.480 | if you believed something different.
00:57:26.280 | Everybody lived together and respected the other person.
00:57:31.280 | And I think that this type of change needs to happen.
00:57:37.200 | And it's gotta happen from the leadership
00:57:41.600 | of our major institutions.
00:57:44.440 | And I don't think that leaders can be bullied
00:57:49.440 | by people who are against sort of the classical version
00:57:55.340 | of free speech and open expression and inquiry.
00:58:00.340 | That's what universities are for.
00:58:03.340 | Among other things, the Socratic method.
00:58:06.880 | And so, in the midst of this
00:58:13.980 | onslaught of oddness,
00:58:18.980 | I still believe in the basic principles
00:58:23.040 | and we're gonna have to find a way to get back to that.
00:58:26.600 | And that doesn't start with the people
00:58:30.560 | sort of in the middle to the bottom
00:58:32.240 | who are using these kinds of screens
00:58:36.400 | to shout people down
00:58:38.320 | and create an uncooperative environment.
00:58:41.280 | It's gotta be done at the top with core principles
00:58:45.960 | that are articulated.
00:58:47.440 | And ironically, if people don't sign on
00:58:53.360 | to these kinds of core principles, where people are equal
00:58:56.880 | and speech can be heard
00:59:00.480 | and you don't have these enormous shout-down biases,
00:59:04.440 | subtly or out loud,
00:59:06.980 | then they don't belong at those institutions.
00:59:09.640 | They're violating the core principles.
00:59:12.200 | And that's how you end up making change.
00:59:16.720 | But you have to have courageous people
00:59:20.200 | who are willing to lay that out
00:59:24.160 | for the benefit of not just their institutions,
00:59:27.820 | but for society as a whole.
00:59:31.580 | So I believe that will happen,
00:59:35.760 | but it needs the commitment of senior people
00:59:40.760 | to make it happen.
00:59:42.720 | - Courage.
00:59:43.640 | And I think for such great leaders, great universities,
00:59:47.840 | there's a huge hunger for it.
00:59:49.320 | So I too am very optimistic that it will come.
00:59:53.240 | I'm now personally taking a step
00:59:55.080 | into building a startup first time,
00:59:57.760 | hoping to change the world, of course.
00:59:59.680 | There are thousands, maybe more,
01:00:02.560 | maybe millions of other first time entrepreneurs like me.
01:00:06.080 | What advice, you've gone through this process,
01:00:09.440 | you've talked about the suffering,
01:00:12.880 | the emotional turmoil it all might entail.
01:00:15.560 | What advice do you have for those people taking that step?
01:00:18.820 | - I'd say it's a rough ride.
01:00:22.280 | And you have to be psychologically prepared
01:00:28.840 | for things going wrong with frequency.
01:00:33.040 | You have to be prepared to be put in situations
01:00:37.560 | where you're being asked to solve problems
01:00:40.000 | you didn't even know existed.
01:00:43.040 | For example, renting space,
01:00:45.440 | it's not really a problem unless you've never done it.
01:00:48.560 | You have no idea what a lease looks like, right?
01:00:52.040 | You don't even know the relevant rent in a market.
01:00:56.800 | So everything is new.
01:00:58.560 | Everything has to be learned.
01:01:00.760 | What you realize is that it's good
01:01:03.660 | to have other people with you who've had some experience
01:01:07.120 | in areas where you don't know what you're doing.
01:01:09.760 | Unfortunately, an entrepreneur starting
01:01:13.120 | doesn't know much of anything,
01:01:14.620 | so everything is something new.
01:01:17.400 | - Yeah.
01:01:18.240 | - And I think it's important not to be alone
01:01:23.040 | because it's sort of overwhelming.
01:01:28.260 | And you need somebody to talk to
01:01:31.180 | other than a spouse or a loved one
01:01:34.580 | because even they get bored with your problems.
01:01:37.860 | And so getting a group, if you look at Alibaba,
01:01:42.620 | Jack Ma was telling me they basically were
01:01:47.900 | like at financial death's door at least twice.
01:01:51.440 | And the fact that it wasn't just Jack,
01:01:55.980 | I mean people think it is 'cause he became
01:01:58.420 | the sort of public face and the driver,
01:02:02.500 | but a group of people who can give advice,
01:02:07.500 | share situations to talk about, that's really important.
01:02:13.380 | - And that's not just referring to the small details
01:02:15.920 | like renting space.
01:02:17.140 | - No.
01:02:17.980 | - It's also the psychological burden.
01:02:20.540 | - Yeah, and because most entrepreneurs
01:02:24.140 | at some point question what they're doing
01:02:25.780 | 'cause it's not going so well.
01:02:27.820 | Or they're screwing it up and they don't know
01:02:29.500 | how to unscrew it up because we're all learning.
01:02:34.140 | And it's hard to be learning when there are
01:02:37.260 | like 25 variables going on.
01:02:39.380 | If you're missing four big ones,
01:02:41.940 | you can really make a mess.
01:02:43.900 | And so the ability to in effect have either an outsider
01:02:48.900 | who's really smart that you can rely on
01:02:53.340 | for certain types of things,
01:02:55.700 | or other people who are working with you on a daily basis.
01:02:59.520 | Most people who haven't had experience
01:03:05.820 | believe in the myth of the one person,
01:03:08.380 | the one great person who, you know,
01:03:12.980 | creates outcomes that are positive.
01:03:15.600 | Most of us, it's not like that.
01:03:18.660 | If you look back over a lot of the big
01:03:21.380 | successful tech companies,
01:03:24.140 | it's not typically one person.
01:03:26.400 | And you will know these stories better than I do
01:03:30.940 | 'cause it's your world, not mine.
01:03:32.460 | But even I know that almost every one of them
01:03:35.340 | had two people.
01:03:36.700 | I mean, if you look at Google, that's what they had.
01:03:40.420 | And that was the same at Microsoft at the beginning.
01:03:43.700 | And it was the same at Apple.
01:03:46.600 | People have different skills and they need
01:03:50.740 | to play off of other people.
01:03:53.300 | So, you know, the advice that I would give you
01:03:58.300 | is make sure you understand that
01:04:01.780 | so you don't head off in some direction as a lone wolf
01:04:05.780 | and find that either you can't invent all the solutions
01:04:10.260 | or you make bad decisions on certain types of things.
01:04:15.740 | This is a team sport.
01:04:18.060 | Entrepreneur means you're alone, in effect.
01:04:22.460 | And that's the myth.
01:04:24.300 | But it's mostly a myth.
01:04:25.740 | - Yeah, I think, and you talk about this in your book
01:04:29.900 | and I could talk to you about it forever,
01:04:31.900 | the harshly self-critical aspect to your personality
01:04:36.700 | and to mine as well in the face of failure.
01:04:39.780 | It's a powerful tool, but it's also a burden
01:04:42.860 | that's very interesting,
01:04:46.260 | very interesting to walk that line.
01:04:49.220 | But let me ask in terms of people around you,
01:04:53.020 | in terms of friends, in the bigger picture of your own life,
01:04:57.540 | where do you put the value of love, family, friendship
01:05:02.540 | in the big picture journey of your life?
01:05:06.500 | - Well, ultimately all journeys are alone.
01:05:09.820 | It's great to have support.
01:05:16.220 | And when you go forward and say
01:05:21.220 | your job is to make something work
01:05:25.100 | and that's your number one priority
01:05:27.100 | and you're gonna work at it to make it work,
01:05:31.500 | it's like superhuman effort.
01:05:33.880 | People don't become successful as part-time workers.
01:05:38.580 | It doesn't work that way.
01:05:40.580 | And if you're prepared to make that 100 to 120%
01:05:46.580 | effort, you're gonna need support
01:05:49.940 | and you're gonna have to have people involved
01:05:53.180 | with your life who understand
01:05:55.420 | that that's really part of your life.
01:05:59.180 | Sometimes you're involved with somebody
01:06:01.580 | and they don't really understand that
01:06:04.900 | and that's a source of conflict and difficulty.
01:06:09.180 | But if you're involved with the right people,
01:06:13.580 | whether it's a dating relationship
01:06:16.980 | or a spousal relationship,
01:06:21.360 | you have to involve them in your life
01:06:27.180 | but not burden them with every minor triumph or mistake.
01:06:33.660 | They actually get bored with it after a while
01:06:40.980 | and so you have to set up different types of ecosystems.
01:06:45.820 | You have your home life, you have your love life,
01:06:49.980 | you have children and that's like
01:06:52.700 | the enduring part of what you do.
01:06:55.820 | And then on the other side, you've got the
01:06:59.300 | sort of unpredictable nature of this type of work.
01:07:04.300 | What I say to people at my firm who are younger,
01:07:10.660 | usually, well, everybody's younger,
01:07:14.100 | but people who are of an age where
01:07:19.100 | they're just having their first child
01:07:21.140 | or maybe they have two children,
01:07:24.460 | that it's important to make sure they go away
01:07:29.460 | with their spouse at least once every two months
01:07:36.140 | to just some lovely place where there are no children,
01:07:40.180 | no issues, sometimes once a month
01:07:44.060 | if they're sort of energetic and clever.
01:07:47.660 | - Escape the craziness of it all.
01:07:51.740 | - Yeah, and reaffirm your values as a couple.
01:07:56.740 | And you have to have fun.
01:08:00.900 | If you don't have fun with the person you're with
01:08:04.580 | and all you're doing is dealing with issues,
01:08:07.460 | then that gets pretty old.
01:08:09.580 | And so you have to protect the fun element
01:08:13.780 | of your life together.
01:08:15.540 | And the way to do that isn't by hanging around the house
01:08:18.740 | and dealing with sort of more problems.
01:08:22.540 | You have to get away and reinforce
01:08:25.780 | and reinvigorate your relationship.
01:08:28.340 | And whenever I tell one of our younger people about that,
01:08:31.760 | they sort of look at me and it's like the scales
01:08:34.380 | are falling off of their eyes and they're saying,
01:08:36.820 | "Geez, I hadn't thought about that.
01:08:39.020 | "I'm so enmeshed in all these things."
01:08:41.260 | But that's a great idea.
01:08:42.780 | And that's something as an entrepreneur,
01:08:45.220 | you also have to do.
01:08:46.800 | You just can't let relationships slip
01:08:50.700 | because you're half overwhelmed.
01:08:52.420 | - Beautifully put, and I think there's no better place
01:08:56.860 | to end it.
01:08:57.700 | Steve, thank you so much.
01:08:58.740 | I really appreciate it.
01:08:59.580 | It was an honor to talk to you.
01:09:01.080 | - My pleasure.
01:09:01.920 | - Thanks for listening to this conversation
01:09:04.540 | with Stephen Schwarzman.
01:09:05.780 | And thank you to our sponsors,
01:09:07.380 | ExpressVPN and Masterclass.
01:09:09.940 | Please consider supporting the podcast
01:09:11.660 | by signing up to Masterclass at masterclass.com/lex
01:09:16.340 | and getting ExpressVPN at expressvpn.com/lexpod.
01:09:21.220 | If you enjoy this podcast, subscribe on YouTube,
01:09:23.840 | review it with 5 Stars on Apple Podcast,
01:09:26.140 | support it on Patreon,
01:09:27.300 | or simply connect with me on Twitter @LexFriedman.
01:09:30.660 | And now let me leave you with some words
01:09:33.820 | from Stephen Schwarzman's book, "What It Takes."
01:09:37.900 | "It's as hard to start and run a small business
01:09:41.300 | "as it is to start a big one.
01:09:43.600 | "You will suffer the same toll financially
01:09:45.400 | "and psychologically as you bludgeon it into existence.
01:09:50.180 | "It's hard to raise the money
01:09:51.620 | "and to find the right people.
01:09:53.360 | "So if you're going to dedicate your life to a business,
01:09:56.440 | "which is the only way it will ever work,
01:09:59.560 | "you should choose one with a potential to be huge."
01:10:03.860 | Thank you for listening and hope to see you next time.
01:10:06.980 | (upbeat music)