Stephen Schwarzman: Going Big in Business, Investing, and AI | Lex Fridman Podcast #96
Chapters
0:00 Introduction
4:17 Going big in business
7:34 How to recognize an opportunity
16:00 Solving problems that people have
25:26 Philanthropy
32:51 Hope for the new College of Computing at MIT
37:32 Unintended consequences of technological innovation
42:24 Education systems in China and United States
50:22 American AI Initiative
59:53 Starting a business is a rough ride
64:26 Love and family
The following is a conversation with Stephen Schwarzman, 00:00:08.240 |
CEO and co-founder of Blackstone, one of the world's leading investment firms, with over $530 billion of assets under management. 00:00:12.960 |
He's one of the most successful business leaders in history. 00:00:17.880 |
I recommend his recent book called "What It Takes" 00:00:20.920 |
that tells stories and lessons from his personal journey. 00:00:26.320 |
He is also one of the wealthiest people in the world, 00:00:31.200 |
and has committed to give the majority of his wealth to philanthropic causes. 00:00:36.040 |
As an example, in 2018, he donated $350 million to MIT 00:00:41.040 |
to help establish its new College of Computing, 00:00:45.120 |
the mission of which promotes interdisciplinary, 00:00:47.640 |
big, bold research in artificial intelligence. 00:01:03.520 |
And that is what is needed in artificial intelligence 00:01:20.480 |
I believe the dream to build intelligent systems 00:01:23.560 |
burns brighter than ever in the halls of MIT. 00:01:31.800 |
For everyone feeling the burden of this crisis, 00:01:46.160 |
support it on Patreon, or simply connect with me on Twitter @lexfridman. 00:02:09.560 |
You can support this podcast by signing up to Masterclass at masterclass.com/lex 00:02:13.800 |
and getting ExpressVPN at expressvpn.com/lexpod. 00:02:36.160 |
With Masterclass, you get an all-access pass to watch courses from, to list some of my favorites, 00:02:41.480 |
Neil deGrasse Tyson on scientific thinking and communication, 00:02:44.520 |
Will Wright, creator of SimCity and The Sims, on game design, 00:02:52.740 |
Daniel Negreanu on poker, and many, many more. 00:02:58.860 |
Chris Hadfield explaining how rockets work, and the experience of being launched into space alone is worth the money. 00:03:03.300 |
By the way, you can watch it on basically any device. 00:03:10.180 |
Sign up at masterclass.com/lex to get a discount and to support this podcast. 00:03:20.340 |
This show is also sponsored by ExpressVPN. Get it at expressvpn.com/lexpod to get a discount and to support this podcast. 00:03:27.860 |
Press the big power on button and your privacy is protected. 00:03:31.260 |
And if you like, you can make it look like your location is anywhere else in the world. 00:03:36.380 |
I might be in Boston now, but it can make it look like I'm somewhere else entirely. 00:03:46.860 |
Certainly, it allows you to access international versions 00:03:49.780 |
of streaming websites like the Japanese Netflix 00:03:54.500 |
ExpressVPN works on any device you can imagine. 00:04:01.140 |
I use it on Linux (shout out to Ubuntu), Windows, Android, but it's available everywhere else too. 00:04:09.920 |
Once again, get it at expressvpn.com/lexpod to get a discount and to support this podcast. 00:04:13.180 |
And now, here's my conversation with Stephen Schwarzman. 00:04:19.760 |
What idea do you believe, whether grounded in data 00:04:23.420 |
or in intuition, that many people you respect disagree with you on? 00:04:42.420 |
- If you're gonna do something, do something very consequential 00:04:46.220 |
and do something that's quite large, if you can, 00:04:51.040 |
Because if you operate in that kind of space, 00:04:57.760 |
The prospect of success enables you to recruit people who wanna be part of that. 00:05:09.480 |
And so, not everybody likes to operate at scale. 00:05:16.180 |
Some people prefer smaller things, because it is meaningful for them emotionally. 00:05:21.180 |
And so, occasionally, you get a disagreement on that. 00:05:25.380 |
But those are life choices rather than commercial choices. 00:05:34.860 |
See, we often, in America, think big is good. 00:05:45.940 |
but life, happiness, the pursuit of happiness? 00:06:02.500 |
other people just like throwing the ball around, 00:06:24.240 |
and because, for you personally, it gives you joy? 00:06:27.640 |
- Well, it makes it easier to succeed, actually. 00:06:34.400 |
for example, that's cyclical, that's a huge opportunity, 00:06:51.480 |
and you're wrong, you don't have many places to go. 00:06:56.240 |
So, I've always found that the easy place to be, 00:07:00.440 |
and the ability where you can concentrate human resources, 00:07:06.280 |
get people excited about doing really impactful big things, 00:07:24.920 |
So that brings out people of talent to help you. 00:07:28.500 |
And so, all together, it's a virtuous circle, I think. 00:07:36.280 |
when you see one in terms of the one you wanna go big on? 00:07:43.680 |
Is it back and forth deliberation with people you trust? 00:08:02.960 |
And that's either, it's observational on some level. 00:08:08.800 |
You can call it data, or you can just call it listening 00:08:24.400 |
It's like seeing a piece of white lint on a black dress, 00:08:29.400 |
but most people disregard that piece of lint. 00:08:37.080 |
And I'm fascinated by how did something get someplace 00:08:43.480 |
So it doesn't even need to be a big discrepancy. 00:08:49.880 |
But when there's a discordant note in a constellation of facts that sort of made sense 00:08:54.880 |
in a traditional way, I've learned that if you focus on that note, it usually leads somewhere interesting. 00:09:09.480 |
And if you can find two of those discordant notes, 00:09:20.460 |
And usually when you figure out that things are changing 00:09:35.680 |
It's almost like a scientific discovery, if you will. 00:09:39.040 |
When you describe it to other people in the real world, 00:09:53.120 |
And if there's no particular reason at that moment 00:09:57.440 |
to shake them out of their reality, they'll stay in it, 00:10:05.360 |
And I've always been stunned that when I explain 00:10:09.080 |
where we're going, what we're doing, and why, 00:10:13.040 |
almost everyone just says, "That's interesting." 00:10:33.280 |
I've sort of been doing this mostly my whole life. 00:10:36.200 |
I'm not a scientist, let alone a computer scientist. 00:10:43.360 |
when somebody says something or you observe something 00:11:03.440 |
But there's, I would say, various flavors of that. 00:11:08.440 |
So usually pattern recognition refers to the process 00:11:12.760 |
of recognizing, in your example, the dress rather than the lint on the dress. 00:11:17.360 |
Pattern recognition is very good at identifying the dress, 00:11:21.280 |
at looking at the pattern that's always there, 00:11:27.160 |
whereas you're almost referring to 00:11:29.840 |
what's called outlier detection in computer science, right? 00:11:52.240 |
Do you think AI in the future will be able to do that? 00:11:55.200 |
Is it something that could be put down into code? 00:12:08.920 |
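(For listeners curious what the outlier-detection idea above might look like when it is "put down into code," here is a minimal, purely illustrative sketch. Nothing here is from the conversation: the function name, threshold, and example data are all hypothetical, and real systems use far more sophisticated methods.)

```python
# Illustrative only: flag the "lint on the dress" -- values that sit far
# from the bulk of the data. Uses the median and the median absolute
# deviation (MAD), which are robust to the very outliers we want to find.
from statistics import median

def find_outliers(values, threshold=3.5):
    """Return values whose distance from the median, in MAD units, exceeds the threshold."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        # All points essentially identical: anything different stands out.
        return [v for v in values if v != med]
    # 0.6745 rescales MAD so the score is roughly comparable to a z-score.
    return [v for v in values if 0.6745 * abs(v - med) / mad > threshold]

# Five ordinary observations and one discordant note.
observations = [2000, 2050, 1980, 2100, 2025, 9500]
print(find_outliers(observations))  # -> [9500]
```

Pattern recognition in the usual sense would model the five ordinary values (the dress); the discordant sixth value is what gets flagged.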
to know everything that could or might occur. 00:12:32.960 |
What are you wired or programmed to be finding 00:13:02.800 |
it's determining what people want without them saying it. 00:13:20.400 |
You think it's missing, given the other facts. 00:13:27.080 |
and then that becomes sort of very easy to sell to them. 00:13:38.680 |
nobody raised their voice in anger or otherwise. 00:13:41.720 |
And you said that this allows you to learn to listen 00:13:46.040 |
Can you elaborate, as you have been, on that idea? 00:13:50.560 |
What do you hear about the world if you listen? 00:14:16.600 |
Particularly if you have the full array of inputs. 00:14:24.600 |
you look at their eyes, which are the window on the soul, 00:14:28.200 |
it's very difficult to conceal what you're thinking. 00:14:44.900 |
you're comfortable with or not, are you speaking faster? 00:14:48.720 |
Is the amplitude of what you're saying higher? 00:14:52.080 |
Most people just give away what's really on their mind. 00:15:03.480 |
And so if you just observe that, not in a hostile way, 00:15:12.920 |
they'll more or less tell you, almost completely, 00:15:37.000 |
if there's meant to be one, an area of common interest, 00:15:40.440 |
since you know almost exactly what's on their mind. 00:15:56.360 |
There's so many different levels of communication. 00:16:02.920 |
you discuss in the book on the topic of listening 00:16:12.920 |
and coming up with a solution to that problem, 00:16:18.700 |
In fact, you brilliantly describe a lot of simple things 00:16:28.000 |
Find the problem that's bothering somebody deeply. 00:16:33.520 |
And they will usually tell you what the problem is. 00:16:39.280 |
of seeing what the biggest problem for a person is, 00:16:48.400 |
You know, if you know you're gonna meet somebody, 00:16:57.280 |
the second is you know you're gonna meet somebody. 00:17:01.280 |
which is you know you're gonna meet somebody. 00:17:04.120 |
And you start trying to make pretend you're them. 00:17:12.760 |
What are they thinking about in their daily life? 00:17:19.060 |
So if they're, you know, to make it a really easy example, 00:17:31.720 |
So you sort of know what's more or less on their mind 00:17:42.420 |
So you know if you're gonna be running into somebody 00:17:47.420 |
you sort of know what they look like already. 00:17:58.340 |
And so if you're gonna meet somebody like that, 00:18:01.420 |
what you should do is take the biggest unresolved issue they face, and offer an idea they haven't thought of 00:18:14.980 |
or that you haven't heard anybody else was thinking about. 00:18:24.380 |
and I was invited to something at the White House 00:18:27.820 |
because I was like, you know, a person from no place. 00:18:30.940 |
And you know, I had met the president once before 00:18:47.820 |
because you know that's where the invitation came from. 00:19:04.460 |
and so I had brought a date to the White House 00:19:12.300 |
and we sort of went over in a corner for about 10 minutes 00:19:22.900 |
but it was meant to be a confidential conversation 00:19:40.860 |
And the answer is of course he was interested. 00:19:53.840 |
about what's good for them and good for the situation, 00:20:08.380 |
then people trust you and they'll tell you other things 00:20:12.060 |
because they know your motives are basically very pure. 00:20:17.060 |
You're just trying to resolve a difficult situation 00:20:23.840 |
you know that's a planned situation, that's easy. 00:20:29.700 |
and they start talking and you know that requires 00:20:34.500 |
You can ask them, "What have you been working on lately? 00:20:41.460 |
You can ask them, "Has anything been particularly difficult?" 00:20:46.460 |
You can ask most people, if they trust you for some reason, what's really bothering them. 00:20:55.540 |
And then you have to instantly go to work on it. 00:21:03.660 |
But you know almost everything going on is like out there. 00:21:08.660 |
And people who are involved with interesting situations, 00:21:20.960 |
They just have different roles in the ecosystem. 00:21:29.620 |
who owns a pro football team that loses all the time. 00:21:43.220 |
Inevitably it's because they don't have a great quarterback, a great coach, or a great general manager. 00:21:55.140 |
So those are the three reasons why a team fails, right? 00:22:07.000 |
So if you're talking with somebody like that, 00:22:19.260 |
And if you start asking questions about that, 00:22:22.460 |
they're typically very happy to talk about it 00:22:27.020 |
In some cases, they don't even know that's the problem. 00:22:46.660 |
for smart people, it's hard to escape their own ego 00:23:56.340 |
And you know, whatever the values I had at that time, 00:24:21.180 |
the rest of your life has gotta be put on the same, 00:24:25.620 |
you know, like solid foundation of who you are. 00:24:28.500 |
Because if you start losing who you really are, 00:24:32.740 |
So I've never had the desire to be somebody else, 00:24:37.220 |
I just do other things now that I wouldn't do 00:24:40.780 |
as a, you know, sort of as a middle-class kid growing up in Philadelphia. 00:24:45.360 |
I mean my life has morphed on a certain level, 00:24:52.580 |
of personality is that you can remain in touch 00:24:57.580 |
with everybody who comes from that kind of background. 00:25:09.020 |
in terms of people I'd meet or situations I'm in, 00:25:18.780 |
And it doesn't require me to make any real adjustments 00:25:25.220 |
- There's a lot of activity and progress in recent years in what's called effective altruism. 00:25:34.540 |
I wanted to ask you about it, 'cause it's an interesting one from your perspective. 00:25:40.020 |
I'm not sure if you're familiar with it, but it's philanthropy that focuses on maximizing impact. 00:25:50.100 |
How do you see your own philanthropy, from both a personal and a societal big picture impact perspective? 00:25:59.020 |
I look at, you know, sort of solving big issues, 00:26:04.020 |
addressing big issues, starting new organizations to do it, 00:26:14.340 |
not by taking the original thing and making it larger, 00:26:17.660 |
but continually seeing new things and building those. 00:26:22.300 |
And, you know, sort of marshaling financial resources, 00:26:42.340 |
is look at other opportunities to help society. 00:26:54.340 |
marshaling people, marshaling a lot of money. 00:26:57.080 |
And then at the end of that kind of creative process, 00:27:00.820 |
so somebody typically will ask me to write a check. 00:27:06.020 |
It's not that I'm asking myself, how can I, like, give large amounts of money away? 00:27:09.500 |
I look at issues that are important for people. 00:27:24.780 |
There's some unfairness that's happened to them. 00:27:30.620 |
I'd give money anonymously and help them out. 00:27:33.340 |
And, you know, that's like a miniature version of the bigger things I do now, 00:27:46.260 |
you know, helping to start this new school of computing. 00:27:51.980 |
I saw that, you know, there's sort of like a global race on in AI. 00:28:00.740 |
And I thought that the US could use more enhancement 00:28:12.380 |
and I travel around a lot compared to a regular person, 00:28:23.180 |
So when these technologies are introduced, we don't create a mess 00:28:26.360 |
like we did with the internet and with social media, 00:28:44.980 |
around the world by the relatively few practitioners 00:28:48.740 |
in the world who really knew what was going on. 00:28:51.480 |
And by accident, I knew a bunch of these people, you know, 00:29:05.140 |
And someone else, "Why do you feel this is a force for good? 00:29:08.700 |
And how do we move forward with the technology 00:29:15.580 |
make sure that whatever is potentially, you know, 00:29:19.820 |
sort of on the bad side of this technology, with, you know, 00:29:23.540 |
for example, disruption of workforces and things like that, gets addressed, 00:29:35.980 |
so that the really good things about these technologies can actually happen, 00:29:44.980 |
So to me, you know, this was one of the great issues 00:29:53.900 |
The number of people who were aware of it were very small. 00:30:00.420 |
And as soon as I saw it, I went, "Oh my God, this is mega." 00:30:17.620 |
And at the end, you know, sort of the right thing to do was to 00:30:22.380 |
sort of double MIT's computer science faculty 00:30:26.780 |
and basically create the first AI-enabled university 00:30:35.380 |
a beacon to the rest of the research community 00:30:40.900 |
and create, you know, a much more robust US situation, 00:30:45.900 |
competitive situation among the universities. 00:30:52.900 |
Because if MIT was gonna raise a lot of money 00:30:55.820 |
and double its faculty, well, you could bet that other universities would respond, 00:31:03.420 |
At the end of it, it would be great for knowledge creation, 00:31:08.260 |
you know, great for the United States, great for the world. 00:31:17.180 |
really positive things that other people aren't acting on, 00:31:25.940 |
First, it's just people I meet and what they say, 00:31:29.220 |
and I can recognize when something really profound 00:31:35.380 |
And I do it, and at the end of the situation, 00:31:39.100 |
somebody says, "Can you write a check to help us?" 00:31:44.860 |
I mean, because if I don't, the vision won't happen. 00:31:57.380 |
whether it's small at the individual level or really big, 00:32:01.940 |
like the gift to MIT to launch the College of Computing, 00:32:06.940 |
it starts with a vision, and you see philanthropy as a way to have impact, 00:32:18.380 |
especially on an issue that others aren't really addressing. 00:32:22.500 |
And I also love the notion, and you're absolutely right, 00:32:31.700 |
that the seed will essentially create other efforts, 00:32:36.100 |
it'll have a ripple effect that potentially might help 00:32:40.860 |
the U.S. be a leader or continue to be a leader in AI, 00:32:44.340 |
this potentially very transformative research direction. 00:33:03.620 |
- Well, it's very difficult to predict the future 00:33:06.140 |
when you're dealing with knowledge production 00:33:11.140 |
MIT has obviously some unique aspects globally, 00:33:16.140 |
and there's four big sort of academic surveys, 00:33:21.720 |
I forget whether it was QS, there's the Times in London, 00:33:31.100 |
One of these recently, MIT was ranked number one 00:33:37.020 |
So leave aside whether you're number three somewhere else, 00:33:58.380 |
You have to be a scientist to have the right feel. 00:34:01.680 |
But what's important is you have a critical mass of people, 00:34:23.780 |
within the university, or help sort of coordination 00:34:28.780 |
and communication among people, that's a good thing. 00:34:46.700 |
because if the science side creates blowback, 00:34:53.860 |
so that science is a bit crippled in terms of going forward, 00:35:04.820 |
because society's reaction to knowledge advancement 00:35:15.580 |
in terms of scientific progress and innovation. 00:35:19.100 |
And so the AI ethics piece is super important, 00:35:32.500 |
because what you need is you need the research universities, 00:35:37.500 |
you need the companies that are driving AI and quantum work, 00:35:43.300 |
you need governments who will ultimately be regulating 00:35:50.700 |
certain elements of this, and you also need the media to understand these technologies, 00:35:59.540 |
so we don't get sort of overreactions to one situation, 00:36:04.540 |
which then goes viral, and it ends up shutting down avenues of progress. 00:36:40.660 |
so you have four drivers that have to be sort of integrated. 00:37:05.180 |
So that's why I wanted to get involved for both areas. 00:37:18.140 |
to be a beacon and a connector for these ideas. 00:37:23.820 |
I mean, I think MIT is perfectly positioned to do that. 00:37:28.820 |
- So you've mentioned the media, social media, 00:37:34.380 |
the internet as this complex network of communication 00:37:39.380 |
with flaws, perhaps, perhaps you can speak to them, 00:37:44.260 |
but I personally think that science and technology 00:38:03.020 |
And two, perhaps more importantly for some people, 00:38:09.740 |
to grow the economy, to improve the quality of life 00:38:33.140 |
that invests in science, that pushes it forward, 00:38:40.340 |
especially in the fear-filled field of artificial intelligence? 00:38:43.580 |
- Well, I think that's a little above my pay grade 00:38:51.940 |
appears to be beyond almost anybody's control. 00:39:00.300 |
to create what I call the tyranny of the minorities. 00:39:03.700 |
Okay, a minority is defined as two or three people 00:39:13.500 |
they're united by that one issue that they care about 00:39:18.500 |
and their job is to enforce their views on the world. 00:39:33.380 |
and they throw it all over and it affects all of us. 00:40:15.580 |
you know, I was up here for the announcement last spring 00:40:27.020 |
some of whom were involved with the invention 00:40:29.260 |
of the internet and almost every one of them got up 00:40:44.100 |
And what they said is, more or less, to make it simple, 00:40:54.900 |
"we can move knowledge around, it was instantaneous, 00:41:04.300 |
"who ever thought about social media coming out of that 00:41:07.440 |
"and the actual consequences for people's lives." 00:41:11.160 |
You know, there's always some younger person you read about, 00:41:21.440 |
who killed himself when people used social media to bully him. 00:41:35.800 |
And, you know, so I don't have a solution for that 00:41:59.200 |
and parts of China, people are quite sympathetic 00:42:04.360 |
to, you know, sort of the whole concept of AI ethics 00:42:15.360 |
within your own industry, as well as globally, 00:42:19.520 |
to make sure that the technology is a force for good. 00:42:24.320 |
- On that really interesting topic, since 2007, 00:42:27.120 |
you've had a relationship with senior leadership 00:42:32.840 |
and an interest in understanding modern China, 00:42:38.200 |
much like with Russia, I'm from Russia, originally, 00:42:45.880 |
there's a standard American narrative about China that I'm sure misses a lot of fascinating complexity, 00:43:22.580 |
and that complexity isn't as well understood in the United States. 00:43:43.480 |
through a crowd of more than a billion other people. 00:43:52.940 |
- Yes, they are individually highly energetic, 00:43:57.140 |
highly focused, always looking for some opportunity 00:44:16.420 |
they'll try and find a way to win for themselves. 00:44:29.620 |
that we do in the United States and the West. 00:44:38.740 |
through a web of relationships you have with other people, 00:44:43.500 |
and the relationships that those other people 00:44:53.900 |
where if somebody knocks somebody off at the top, 00:45:01.340 |
then you're like a sort of a floating molecule there 00:45:06.340 |
without tethering except the one or two layers above you, 00:45:15.940 |
and getting people to change is not that easy 00:45:19.980 |
because if there aren't really functioning laws, 00:45:23.780 |
it's only the relationships that everybody has. 00:45:27.420 |
And so when you decide to make a major change 00:45:36.340 |
There won't necessarily be all the same people on your team, 00:45:49.380 |
what everybody's relationship is with somebody. 00:45:52.800 |
So when you suggest doing something differently, 00:45:59.140 |
In the West, it's usually you talk to a person 00:46:22.500 |
It's hard to change 'em, but once they're changed, 00:46:35.100 |
So there are all kinds of fascinating things. 00:46:38.920 |
One thing that might interest the people who are listening, 00:46:46.000 |
we're more technologically-based than some other group. 00:46:50.780 |
I was with one of the top people in the government 00:47:28.860 |
if it's like math or English, everybody's gonna take it. 00:47:35.620 |
They don't write books, they don't write poetry. 00:47:41.260 |
Somebody like myself, I sort of evolved differently, but the idea that somebody in the third grade 00:48:11.400 |
would have some ability to program a computer is incredible. 00:48:20.360 |
and I think that should give the United States pause. 00:48:24.160 |
Talking about philanthropy and launching things, 00:48:33.800 |
sort of investing in the young, the youth, the education system, 00:48:50.920 |
They make a decision at the very top of the government 00:48:59.480 |
And we're really handicapped by this distributed power 00:49:09.320 |
involved with that area will think it's great. 00:49:36.500 |
But if another big country, sort of like ourselves, has got like 100%, and we got five, 00:49:42.600 |
and the whole computer science area is the future, 00:49:47.400 |
then we're purposely, or accidentally actually, 00:49:53.000 |
handicapping ourselves, and our system doesn't allow us 00:50:00.360 |
So, you know, issues like this, I find fascinating. 00:50:05.360 |
And if you're lucky enough to go to other countries, 00:50:10.520 |
which I do, and you learn what they're thinking, 00:50:21.520 |
- So the current administration, Donald Trump, launched the American AI Initiative. 00:50:29.880 |
Not sure if you're familiar with it, in 2019. 00:50:33.220 |
Looking several years ahead, how does America, 00:50:37.920 |
sort of, we've mentioned in terms of the big impact, 00:50:41.800 |
we hope your investment in MIT will have a ripple effect, 00:50:51.760 |
how does America establish, with respect to China, a leadership position in AI? 00:51:00.280 |
- I think that you have to get the federal government involved. 00:51:30.600 |
I think the appetite actually is there to do that. 00:51:56.760 |
Even given the state of American democracy right now, in the Congress, 00:52:00.960 |
if you talk to individual members, they get it. 00:52:08.040 |
Another part of the issue is we're running huge deficits. 00:52:13.800 |
So how much money do you need for this initiative? 00:52:39.440 |
which we need, where does the money come from? 00:52:48.000 |
With trillion-dollar deficits, in a way, it could be easy. 00:52:50.880 |
What's the difference of a trillion and a trillion? 00:52:54.240 |
But, you know, it's hard with the mechanisms of Congress. 00:52:58.640 |
But what's really important is this is not a partisan issue; 00:53:15.920 |
when you sit down and explain what's going on 00:53:28.460 |
But after he was elected, you have given him advice. 00:53:47.120 |
And yet, you've received a lot of criticism for this. 00:53:50.580 |
So on the previous topic of science and technology 00:53:54.160 |
and government, how do we have a healthy discourse, 00:53:59.720 |
give advice, get excited, conversation with the government 00:54:11.140 |
So when I was young, before there was a moonshot, 00:54:27.760 |
President Kennedy asked not what your country can do for you, but what you can do for your country. 00:54:38.800 |
We were basically people who grew up with that credo. 00:54:57.060 |
Americans have GDP per capita of around $60,000. 00:55:19.160 |
And apparently, I take some grief from some people 00:55:25.000 |
who project on me things I don't even vaguely believe. 00:55:35.600 |
I tried to help the previous president, President Obama. 00:56:08.240 |
Can I help people get to sort of a good outcome 00:56:15.760 |
But we live in a world now where sort of the filters 00:56:52.760 |
And there was one other major thing going on. 00:57:04.260 |
There hasn't been a generation that had more stuff 00:57:17.480 |
Yet, there wasn't this kind of instant hostility 00:57:26.280 |
Everybody lived together and respected the other person. 00:57:31.280 |
And I think that this type of change needs to happen. 00:57:44.440 |
And I don't think that leaders can be bullied 00:57:49.440 |
by people who are against sort of the classical version 00:57:55.340 |
of free speech and open expression and inquiry. 00:58:23.040 |
and we're gonna have to find a way to get back to that. 00:58:41.280 |
It's gotta be done at the top with core principles 00:58:53.360 |
And if people won't commit to these kind of core principles, where people are equal 00:59:00.480 |
and you don't have these enormous shout-down biases, 00:59:06.980 |
then they don't belong at those institutions. 00:59:24.160 |
for the benefit of not just their institutions, but society as a whole. 00:59:43.640 |
And I think for such great leaders, great universities, 00:59:49.320 |
So I, too, am very optimistic that it will come. 01:00:02.560 |
maybe millions of other first-time entrepreneurs like me. 01:00:06.080 |
What advice, you've gone through this process, 01:00:15.560 |
What advice do you have for those people taking that step? 01:00:33.040 |
You have to be prepared to be put in situations 01:00:45.440 |
it's not really a problem unless you've never done it. 01:00:48.560 |
You have no idea what a lease looks like, right? 01:00:52.040 |
You don't even know the relevant rent in a market. 01:01:03.660 |
to have other people with you who've had some experience 01:01:07.120 |
in areas where you don't know what you're doing. 01:01:34.580 |
because even they get bored with your problems. 01:01:37.860 |
And so getting a group, if you look at Alibaba, 01:01:47.900 |
they were at, like, a financial death's door at least twice. 01:02:07.500 |
Having people who share situations to talk about, that's really important. 01:02:13.380 |
- And that's not just referring to the small details 01:02:27.820 |
Or they're screwing it up and they don't know 01:02:29.500 |
how to unscrew it up because we're all learning. 01:02:43.900 |
And so the ability to in effect have either an outsider 01:02:55.700 |
or other people who are working with you on a daily basis to talk things through with is really important. 01:03:26.400 |
And you will know these stories better than I do 01:03:32.460 |
But even I know that almost every one of them had partners at the start. 01:03:36.700 |
I mean, if you look at Google, that's what they had. 01:03:40.420 |
And that was the same at Microsoft at the beginning. 01:03:53.300 |
So, you know, the advice that I would give you 01:04:01.780 |
so you don't head off in some direction as a lone wolf 01:04:05.780 |
and find that either you can't invent all the solutions 01:04:10.260 |
or you make bad decisions on certain types of things. 01:04:25.740 |
- Yeah, I think, and you talk about this in your book 01:04:31.900 |
the harshly self-critical aspect to your personality 01:04:49.220 |
But let me ask in terms of people around you, 01:04:53.020 |
in terms of friends, in the bigger picture of your own life, 01:04:57.540 |
where do you put the value of love, family, friendship 01:05:33.880 |
People don't become successful as part-time workers. 01:05:40.580 |
And if you're prepared to make that 100 to 120 percent commitment, 01:05:49.940 |
you're gonna have to have people involved with your life who accept that, 01:06:04.900 |
and if they don't, that's a source of conflict and difficulty. 01:06:09.180 |
But if you're involved with the right people, 01:06:27.180 |
but not burden them with every minor triumph or mistake. 01:06:33.660 |
They actually get bored with it after a while 01:06:40.980 |
and so you have to set up different types of ecosystems. 01:06:45.820 |
You have your home life, you have your love life, 01:06:59.300 |
sort of unpredictable nature of this type of work. 01:07:04.300 |
What I say to people at my firm who are younger is 01:07:24.460 |
that it's important to make sure they go away 01:07:29.460 |
with their spouse at least once every two months 01:07:36.140 |
to just some lovely place where there are no children, 01:07:51.740 |
- Yeah, and reaffirm your values as a couple. 01:08:00.900 |
If you don't have fun with the person you're with 01:08:15.540 |
And the way to do that isn't by hanging around the house 01:08:28.340 |
And whenever I tell one of our younger people about that, 01:08:31.760 |
they sort of look at me and it's like the scales 01:08:34.380 |
are falling off of their eyes and they're saying, 01:08:52.420 |
- Beautifully put, and I think there's no better place to end it. 01:09:11.660 |
Thank you for listening to this conversation with Stephen Schwarzman. You can support this podcast by signing up to Masterclass at masterclass.com/lex 01:09:16.340 |
and getting ExpressVPN at expressvpn.com/lexpod. 01:09:21.220 |
If you enjoy this podcast, subscribe on YouTube, 01:09:27.300 |
review it with five stars on Apple Podcast, support it on Patreon, or simply connect with me on Twitter @lexfridman. 01:09:33.820 |
And now, let me leave you with some words from Stephen Schwarzman's book, "What It Takes." 01:09:37.900 |
"It's as hard to start and run a small business 01:09:45.400 |
"and psychologically as you bludgeon it into existence. 01:09:53.360 |
"So if you're going to dedicate your life to a business, 01:09:59.560 |
"you should choose one with a potential to be huge." 01:10:03.860 |
Thank you for listening and hope to see you next time.