
Ray Dalio: Principles, the Economic Machine, AI & the Arc of Life | Lex Fridman Podcast #54


Chapters

0:00 Introduction
1:33 Sponsor
2:56 What is truth
5:30 The 5-step process
7:41 The call to adventure
8:39 The shapers
13:28 Being A players
15:09 Confidence and open-mindedness
17:10 Delusion
19:15 The Abyss
21:00 Economic Depression
24:38 Idea meritocracy
27:39 The economy
29:23 Credit
32:59 Money
40:36 Central bank
42:57 The principles
44:35 Disruptions
46:23 AI
51:32 Principles

Whisper Transcript

00:00:00.000 | The following is a conversation with Ray Dalio.
00:00:03.360 | He's the founder, co-chairman,
00:00:05.120 | and co-chief investment officer of Bridgewater Associates,
00:00:08.560 | one of the world's largest
00:00:09.880 | and most successful investment firms
00:00:11.920 | that is famous for the principles of radical truth
00:00:14.680 | and transparency that underlie its culture.
00:00:17.920 | Ray is one of the wealthiest people in the world
00:00:20.760 | with ideas that extend far beyond the specifics
00:00:23.200 | of how he made that wealth.
00:00:24.940 | His ideas that are applicable to everyone
00:00:27.400 | are brilliantly summarized in his book "Principles."
00:00:30.680 | They're also even further condensed
00:00:32.760 | on several other platforms, including YouTube,
00:00:35.280 | where, for example, the 30-minute video titled
00:00:38.560 | "How the Economic Machine Works"
00:00:40.760 | is one of the best educational videos
00:00:43.440 | I personally have ever seen on YouTube.
00:00:46.580 | Once again, you may have noticed
00:00:49.060 | that the people I've been speaking with
00:00:50.600 | are not just computer scientists,
00:00:52.520 | but philosophers, mathematicians, writers,
00:00:55.080 | psychologists, physicists, economists, investors,
00:00:57.560 | and soon, much more.
00:00:59.600 | To me, AI is much bigger than deep learning,
00:01:03.040 | bigger than computing.
00:01:04.440 | It is our civilization's journey
00:01:06.200 | into understanding the human mind
00:01:08.160 | and creating echoes of it in the machine.
00:01:10.880 | That journey includes the mechanisms of our economy,
00:01:14.000 | of our politics, and the leaders
00:01:15.880 | that shape the future of both.
00:01:18.080 | This is the Artificial Intelligence Podcast.
00:01:21.200 | If you enjoy it, subscribe on YouTube,
00:01:23.560 | give it five stars on Apple Podcast,
00:01:25.640 | support it on Patreon,
00:01:27.080 | or simply connect with me on Twitter,
00:01:29.280 | @lexfridman, spelled F-R-I-D-M-A-N.
00:01:32.680 | This show is presented by Cash App,
00:01:35.960 | the number one finance app in the App Store.
00:01:38.480 | I personally use Cash App to send money to friends,
00:01:41.220 | but you can also use it to buy, sell, and deposit Bitcoin.
00:01:44.640 | Most Bitcoin exchanges take days
00:01:46.440 | for a bank transfer to become investable.
00:01:48.700 | Through Cash App, it takes seconds.
00:01:51.040 | Cash App also has a new investing feature.
00:01:54.360 | You can buy fractions of a stock,
00:01:56.120 | which to me is a really interesting concept.
00:01:58.880 | So you can buy $1 worth,
00:02:00.720 | no matter what the stock price is.
00:02:03.120 | Brokerage services are provided by Cash App Investing,
00:02:06.080 | a subsidiary of Square and member SIPC.
00:02:09.840 | I'm excited to be working with Cash App
00:02:11.680 | to support one of my favorite organizations
00:02:14.360 | that many of you may know and have benefited from,
00:02:17.320 | called FIRST, best known for their FIRST Robotics
00:02:20.760 | and Lego competitions.
00:02:22.440 | They educate and inspire hundreds of thousands of students
00:02:26.120 | in over 110 countries
00:02:28.120 | and have a perfect rating on Charity Navigator,
00:02:30.560 | which means the donated money
00:02:32.160 | is used to maximum effectiveness.
00:02:35.500 | When you get Cash App from the App Store or Google Play
00:02:38.520 | and use code LEXPODCAST, you'll get $10
00:02:42.040 | and Cash App will also donate $10 to FIRST,
00:02:44.780 | which again is an organization
00:02:46.760 | that I've personally seen inspire girls and boys
00:02:49.560 | to dream of engineering a better world.
00:02:52.440 | And now here's my conversation with Ray Dalio.
00:02:55.900 | Truth, or more precisely, an accurate understanding
00:02:59.560 | of reality, is the essential foundation of any good outcome.
00:03:03.320 | I believe you've said that.
00:03:05.680 | Let me ask an absurd sounding question
00:03:08.580 | at the high philosophical level.
00:03:10.840 | So what is truth?
00:03:12.760 | When you're trying to do something different
00:03:14.920 | than everybody else is doing
00:03:16.720 | and perhaps something that has not been done before,
00:03:19.760 | how do you accurately analyze the situation?
00:03:23.460 | How do you accurately discover the truth,
00:03:25.400 | the nature of things?
00:03:27.280 | - Almost the way you're asking the question
00:03:29.040 | implies that truth and newness are almost at odds.
00:03:34.040 | And I just want to say that
00:03:36.840 | I don't think that that's true, right?
00:03:39.000 | So what I mean by truth is,
00:03:42.600 | you know, what is the reality?
00:03:43.560 | How does the reality work?
00:03:45.440 | And so if you're doing something new
00:03:47.760 | that has never been done before,
00:03:49.040 | which is exciting and I like to do,
00:03:51.280 | the way that you would start with that
00:03:53.960 | is experimenting on what are the realities
00:03:57.120 | and the premises that you're using on that
00:03:59.400 | and how to stress test those types of things.
00:04:01.840 | I think what you're talking about is instead
00:04:04.320 | the fact of how do you deal with something
00:04:06.280 | that's never been done before
00:04:08.040 | and deal with the associated probabilities.
00:04:11.160 | And so I think in that,
00:04:13.280 | don't let something that's never been done before
00:04:15.700 | stand in the way of you doing that particular thing.
00:04:20.140 | You have a, because almost the only way
00:04:22.680 | that you understand what truth is
00:04:25.120 | is through experimentation.
00:04:27.240 | And so when you go out and experiment,
00:04:29.440 | you're going to learn a lot more about what truth is.
00:04:32.280 | But the essence of what I'm saying
00:04:34.320 | is that when you take a look at that,
00:04:38.000 | use truth to find out what the realities are
00:04:40.440 | as a foundation.
00:04:41.720 | Do the independent thinking.
00:04:43.560 | Do the experimentation to find out what's true
00:04:47.160 | and change and keep going after that.
00:04:50.400 | So I think that when you're thinking about it
00:04:54.320 | the way you're thinking about it,
00:04:56.040 | that almost implies that you're letting people
00:05:00.600 | almost say that they're reliant on
00:05:03.540 | what's been discovered before to find out what's true.
00:05:06.480 | And what's been discovered before is often not true.
00:05:10.720 | Conventional view of what is true is very often wrong.
00:05:15.720 | It'll go in ups and downs and there are fads
00:05:19.800 | and okay, this thing, it goes this way and that way.
00:05:22.360 | And so definitions of truths that are conventional
00:05:25.880 | are not the thing to go by.
00:05:28.720 | - How do you know, for the thing that has not been done before,
00:05:32.280 | that it might succeed?
00:05:33.880 | - It's to do whatever homework that you have
00:05:37.840 | in order to try to get a foundation.
00:05:40.160 | And then to go into worlds of not knowing
00:05:43.100 | and you go into the world of not knowing
00:05:45.200 | but not stupidly, not naively.
00:05:48.920 | You go into that world of not knowing
00:05:51.160 | and then you do experimenting and you learn
00:05:54.080 | what truth is and what's possible through that process.
00:05:57.680 | I describe it as a five step process.
00:05:59.960 | The first step is you go after your goals.
00:06:02.080 | The second step is you identify the problems
00:06:04.780 | that stand in the way of you getting to your goals.
00:06:07.180 | The third step is you diagnose those
00:06:09.320 | to get at the root cause of those.
00:06:11.360 | Then the fourth step is then now that you know
00:06:13.780 | the exact root cause, you design a way to get around those
00:06:17.920 | and then you follow through and do the designs
00:06:21.080 | you set out to do and it's the experimentation.
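(A rough sketch of the five-step loop described above, written out as code. The function names and arguments are hypothetical placeholders, not anything from the conversation or the book; it only illustrates the structure of goals, problems, diagnosis, design, and follow-through.)

```python
# A minimal sketch of the 5-step process, assuming hypothetical helper
# functions for each step. Step 1 is setting the goal that gets passed in.

def five_step_process(goal, identify_problems, diagnose, design, execute):
    """Loop through steps 2-5 until nothing stands in the way of the goal."""
    while True:
        problems = identify_problems(goal)             # step 2: what stands in the way
        if not problems:
            return goal                                # nothing left blocking the goal
        root_causes = [diagnose(p) for p in problems]  # step 3: get at the root causes
        plan = design(goal, root_causes)               # step 4: design a way around them
        goal = execute(plan)                           # step 5: follow through, then repeat
```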
00:06:23.920 | I think that what happens to people mostly
00:06:27.480 | is that they try to decide whether they're gonna be
00:06:31.120 | successful or not ahead of doing it.
00:06:34.040 | And they don't know how to do the process well
00:06:36.160 | because the nature of your questions are along those lines
00:06:39.560 | like how do you know?
00:06:40.800 | Well, you don't know but a practical person
00:06:43.680 | who is also used to making dreams happen
00:06:47.220 | knows how to do that process.
00:06:48.840 | I've given personality tests to shapers.
00:06:52.580 | So the person, what I mean by a shaper
00:06:54.840 | is a person who can take something from visualization,
00:06:59.840 | they have an audacious goal and then they go
00:07:02.960 | from visualization to actualization, building it out.
00:07:06.140 | That includes Elon Musk, I gave him the personality test,
00:07:10.320 | I've given it to Bill Gates and I give it
00:07:12.180 | to many, many such shapers.
00:07:15.280 | And they know that process that I'm talking about.
00:07:18.740 | They experience it which is a process essentially
00:07:21.920 | of knowing how to go from an audacious goal
00:07:24.400 | but not in a ridiculous way, not a dream
00:07:27.800 | and then to do that learning along the way
00:07:31.600 | that allows them in a very practical way
00:07:34.360 | to learn very rapidly as they're moving toward that goal.
00:07:38.220 | - So the call to adventure, the adventure starts
00:07:42.240 | not trying to analyze the probabilities of the situation
00:07:44.680 | but using what, instinct?
00:07:48.120 | How do you dive in?
00:07:49.400 | So let's talk about--
00:07:50.240 | - It is being a, it's simultaneously being a dreamer
00:07:54.960 | and a realist.
00:07:55.860 | It's to know how to do that well.
00:07:58.680 | The pull comes from a pull to adventure.
00:08:01.840 | For whatever reason, I can't tell you how much
00:08:05.160 | of it's genetics and how much it's environment
00:08:07.680 | but there's a early on, it's exciting.
00:08:11.440 | That notion is exciting, being creative is exciting
00:08:15.440 | and so one feels that.
00:08:17.600 | Then one gets in the habit of doing that.
00:08:20.360 | Okay, how do I know, how do I learn very well
00:08:23.520 | and then how do I imagine and then how do I experiment
00:08:27.880 | to go from that imagination?
00:08:30.840 | So it's that process that one, and then one,
00:08:34.800 | the more one does it, the better one becomes at it.
00:08:38.960 | - You mentioned shapers, Elon Musk, Bill Gates.
00:08:43.620 | Who are the shapers that you find yourself thinking about
00:08:48.900 | when you're constructing these ideas,
00:08:51.120 | the ones that define the archetype of a shaper for you?
00:08:54.400 | - Well, as I say, a shaper for me is somebody
00:08:57.400 | who comes up with a great visualization,
00:09:01.120 | usually a really unique visualization
00:09:04.720 | and then actually builds it out and makes the world
00:09:08.240 | different, changes the world in that kind of a way.
00:09:10.800 | So when I look at it, Marc Benioff with Salesforce,
00:09:14.640 | Chris Anderson with TED, Muhammad Yunus
00:09:17.920 | with Social Enterprise and Philanthropy,
00:09:20.480 | Geoffrey Canada and the Harlem Children's Zone.
00:09:23.520 | There are, all domains have shapers
00:09:27.400 | who have the ability to visualize
00:09:30.040 | and make extraordinary things happen.
00:09:32.440 | - What are the commonalities in some of them?
00:09:34.980 | - The commonalities are, first of all,
00:09:38.960 | the excitement of something new, that call to adventure
00:09:43.280 | and then again, that practicality, the capacity to learn.
00:09:47.920 | The capacity then, they're able to be in many ways
00:09:52.480 | full range, that means they're able to go from
00:09:56.080 | the big, big picture down to the detail.
00:09:59.320 | So let's say for example, Elon Musk,
00:10:02.360 | he describes, he gets a lot of money from selling PayPal,
00:10:07.360 | his interest in PayPal, he said,
00:10:09.920 | "Why isn't anybody going to Mars or outer space?
00:10:13.920 | "What are we gonna do if the planet goes to hell?
00:10:16.020 | "And how do we gonna get that?
00:10:18.120 | "And nobody's paying attention to that."
00:10:20.040 | He doesn't know much about it,
00:10:21.680 | he then reads and learns and so on,
00:10:24.440 | says, "I'm gonna take, okay, half of my money
00:10:27.440 | "and I'm gonna put it in there
00:10:29.540 | "and I'm gonna do this thing."
00:10:30.720 | And he learns and blah, blah, blah, blah, blah,
00:10:32.240 | and he's got creative, okay, that's one dimension.
00:10:35.040 | So he gave me the keys to his car,
00:10:39.520 | which was just early days in Tesla
00:10:42.360 | and he then points out the details.
00:10:45.680 | Okay, if you push this button here,
00:10:48.120 | it's this, the detail.
00:10:49.780 | So he's simultaneously talking about the big,
00:10:54.780 | the big, big, big picture, okay,
00:10:56.360 | when is humanity going to abandon the planet?
00:10:59.920 | But he will then be able to take it down into the detail
00:11:03.480 | so he can go, let's call it helicoptering.
00:11:06.280 | He can go up, he can go down
00:11:07.840 | and see things at those types of perspective.
00:11:09.840 | - And then you've seen that with the other shapers as well.
00:11:11.600 | - And that's a common thing that they can do that.
00:11:14.120 | Another important difference that they have in mind
00:11:18.460 | is how they deal with people.
00:11:21.120 | I mean, meaning there's nothing more important
00:11:24.280 | than achieving the mission.
00:11:26.160 | And so what they have in common is that there's a test
00:11:30.960 | that I give these personality tests
00:11:34.120 | 'cause they're very helpful for understanding people.
00:11:36.400 | And so I gave it to all these shapers.
00:11:38.720 | And one of the things in this workplace inventory test
00:11:42.280 | is that it has a category called
00:11:44.880 | concern for others.
00:11:47.120 | They're all low on concern for others.
00:11:49.580 | This includes Muhammad Yunus who invented microfinance,
00:11:53.540 | social enterprise, impact investing. Muhammad Yunus
00:11:56.800 | received the Nobel Peace Prize for this,
00:11:59.600 | Congressional Medal of Honor.
00:12:01.640 | Fortune determined him
00:12:04.240 | one of the 10 greatest entrepreneurs of our time.
00:12:08.920 | He's built all sorts of businesses to give back money
00:12:12.360 | in social enterprise, a remarkable man.
00:12:15.000 | There's nobody that I know who practically
00:12:17.440 | could have more concern for others.
00:12:20.320 | He lives a life of a saint.
00:12:22.380 | I mean, very modest lifestyle and he puts all his money
00:12:25.720 | into trying to help others.
00:12:27.280 | And he tests low on what's called concern for others
00:12:30.120 | because really, the questions under that
00:12:33.240 | are questions about conflict to get at the mission.
00:12:36.860 | So they all, Geoffrey Canada who changed Harlem Children's
00:12:40.080 | Zone and developed that to take children in Harlem
00:12:42.880 | and get them well taken care of,
00:12:44.160 | not only just in their education, but their whole lives.
00:12:47.040 | Him also, low on concern for others.
00:12:49.560 | What that means is that they can see
00:12:53.680 | whether those individuals are performing at a level,
00:12:58.680 | an extremely high level that's necessary
00:13:01.660 | to make those dreams happen.
00:13:03.520 | So when you think of, let's say, Steve Jobs was famous
00:13:06.600 | for being difficult with people and so on.
00:13:09.840 | And I didn't know Steve Jobs
00:13:11.160 | so I can't speak personally to that.
00:13:13.120 | But his comments on, do you have A players?
00:13:15.800 | And if you have A players, if you put in B players,
00:13:17.800 | pretty soon you'll have C players and so on.
00:13:19.840 | That is a common element of them,
00:13:22.040 | holding people to high standards
00:13:24.000 | and not letting anybody stand in the way of the mission.
00:13:28.320 | - What do you think about that kind of idea?
00:13:30.120 | Sorry to pause on that for a second,
00:13:31.600 | that the A, B and C players and the importance of,
00:13:36.600 | so when you have a mission to really only have A players
00:13:41.120 | and be sort of aggressively filtering for that.
00:13:45.280 | - Yes, but I think that there are all different ways
00:13:47.400 | of being A players.
00:13:49.600 | And I think, and what a great team.
00:13:53.180 | You have to appreciate all the differences
00:13:55.840 | in ways of being A players, okay?
00:13:59.320 | That's the first thing.
00:14:00.520 | And then you always have to be super excellent,
00:14:04.680 | in my opinion, you always have to be really excellent
00:14:07.400 | with people to help them understand themselves
00:14:11.920 | and get in sync with them about what's true about them
00:14:15.940 | and their circumstances and how they're doing
00:14:18.080 | so that they're having a fabulous
00:14:19.800 | personal development experience
00:14:22.120 | at the same time as you're dealing with them.
00:14:24.440 | So when I say that they're all different ways,
00:14:27.720 | this is one of the then qualities.
00:14:30.600 | You asked me, what are the qualities?
00:14:32.480 | So the third quality that I would say
00:14:35.520 | is to know how to deal well with your not knowing
00:14:40.520 | and to be able to get the best expertise
00:14:46.480 | so that you're a great orchestrator of different ways
00:14:50.400 | so that the people who are really, really successful,
00:14:53.880 | unlike most people believe that they're successful
00:14:56.760 | because of what they know,
00:14:58.520 | they're even more successful by being able
00:15:02.600 | to effectively learn from others
00:15:04.680 | and tapping into the skills of people
00:15:06.840 | who see things different from them.
00:15:09.000 | - Brilliant, so how do you have that personality of being,
00:15:12.440 | first of all, open to the fact that
00:15:14.960 | other people see things differently than you
00:15:18.200 | and at the same time have supreme confidence in your vision?
00:15:22.340 | What's the psychology of that?
00:15:24.920 | Do you see a tension there between the confidence
00:15:27.300 | and the open-mindedness?
00:15:28.560 | - And now it's funny because I think we grow up
00:15:31.220 | thinking that there's a tension there, right?
00:15:33.280 | That there's a confidence and the more confidence
00:15:37.060 | that you have, there's a tension with the open-mindedness
00:15:40.560 | and not being sure, okay?
00:15:42.760 | Confident and accurate are almost negatively correlated
00:15:48.620 | in many people.
00:15:50.120 | They're extremely confident and they're often inaccurate.
00:15:53.840 | And so I think one of the greatest tragedies of people
00:15:57.360 | is not realizing how those things go together
00:16:00.220 | because instead it's really that by saying,
00:16:04.200 | I know a lot and how do I know I'm still not wrong
00:16:08.600 | and how do I take that best thinking available to me
00:16:13.480 | and then raise my probability of learning?
00:16:16.660 | All these people think for themselves, okay?
00:16:19.600 | I mean, meaning they're smart,
00:16:21.520 | but they take in like vacuum cleaners,
00:16:24.000 | they take in ideas of others,
00:16:26.200 | they stress test their ideas with others,
00:16:28.840 | they assess what comes back to them
00:16:31.140 | in the form of other thinking
00:16:33.660 | and they also know what they're not good at
00:16:36.340 | and what other people who are good at the things
00:16:38.840 | that they're not good at, they know how to get those people
00:16:41.520 | and be successful all around
00:16:43.620 | because nobody has enough knowledge in their heads.
00:16:46.960 | And that I think is one of the great differences.
00:16:49.600 | So the reason my company has been successful
00:16:53.400 | in terms of this is 'cause of an idea
00:16:55.200 | meritocratic decision-making, a process
00:16:58.000 | by which you can get the best ideas.
00:17:00.900 | You know, what's an idea meritocracy?
00:17:02.840 | An idea meritocracy is to get the best ideas
00:17:05.860 | that are available out there
00:17:07.080 | and to work together with other people
00:17:09.300 | in the team to achieve that.
00:17:10.760 | - That's an incredible process
00:17:12.000 | that you describe in several places to arrive at the truth.
00:17:15.400 | But I apologize if I'm romanticizing the notion,
00:17:18.640 | but let me linger on it.
00:17:19.840 | Just having enough self-belief,
00:17:23.680 | you don't think there's a self-delusion there
00:17:27.580 | that's necessary, especially in the beginning
00:17:30.040 | you talk about in the journey, maybe the trials or the abyss.
00:17:34.640 | Do you think there is value to deluding yourself?
00:17:39.640 | - I think what you're calling delusion
00:17:44.280 | is a bad word for uncertainty, okay?
00:17:48.880 | So I mean, in other words, 'cause we keep going back
00:17:52.000 | to the question, how would you know,
00:17:54.000 | and all of those things.
00:17:55.040 | No, I think that delusion is not gonna help you,
00:17:58.640 | that you have to find out truth, okay?
00:18:01.460 | To deal with uncertainty, not saying,
00:18:03.900 | listen, I have this dream,
00:18:05.340 | and I don't know how I'm going to get that dream.
00:18:08.140 | I mentioned in my book, "Principles,"
00:18:10.140 | and described the process in a more complete way
00:18:12.920 | than we're gonna be able to go here.
00:18:15.060 | But what happens is I say, you form your dreams first,
00:18:19.780 | and you can't judge whether you're going
00:18:22.460 | to achieve those dreams because you haven't learned
00:18:26.900 | the things that you're going to learn
00:18:28.700 | on the way toward those dreams, okay?
00:18:31.420 | So that isn't delusion, I wouldn't use delusion.
00:18:35.660 | I think you're overemphasizing the importance
00:18:38.740 | of knowing whether you're going to succeed or not.
00:18:41.660 | Get rid of that, okay?
00:18:43.220 | If you can get rid of that and say,
00:18:45.300 | okay, no, I can have that dream,
00:18:48.380 | but I'm so realistic in the notion of finding out.
00:18:52.700 | I'm curious, I'm a great learner, I'm a great experimenter.
00:18:56.460 | Along the way, you'll do those experiments
00:18:58.780 | which will teach you more truths
00:19:00.780 | and more learning about the reality
00:19:03.220 | so that you can get your dreams.
00:19:04.980 | Because if you still live in that world of delusion, okay,
00:19:08.420 | and you think delusion's helpful,
00:19:10.380 | no, the delusion isn't,
00:19:12.260 | don't confuse delusion with not knowing.
00:19:14.540 | - Yes, but nevertheless,
00:19:17.380 | so if we look at the abyss,
00:19:18.940 | we can look at your own that you described.
00:19:22.060 | It's difficult psychologically for people.
00:19:24.220 | So many people quit, many people choose a path
00:19:27.500 | that is more comfortable.
00:19:28.960 | The heartbreak of that breaks people.
00:19:34.660 | So if you have the dream
00:19:36.180 | and there's this cycle of learning,
00:19:37.940 | setting a goal and so on,
00:19:40.060 | what's your value for the psychology
00:19:42.180 | of just being broken by these difficult moments?
00:19:45.440 | - Well, that's classically the defining moment.
00:19:48.380 | It's almost like evolution taking care of,
00:19:51.720 | okay, now you crash, you're in the abyss,
00:19:55.460 | oh my God, that's bad.
00:19:56.980 | And then the question is, what do you do?
00:19:59.300 | And it sorts people, okay?
00:20:01.260 | And that's what, some people get off the field
00:20:04.720 | and they say, oh, I don't like this and so on.
00:20:07.300 | And some people learn and they have a metamorphosis
00:20:12.780 | and it changes their approach to learning.
00:20:16.140 | The number one thing it should give them is uncertainty.
00:20:19.520 | You should take an audacious, dreamer,
00:20:23.920 | guy who wants to change the world, crash, okay?
00:20:28.660 | And then come out of that crashing and saying,
00:20:31.700 | okay, I can be audacious
00:20:35.160 | and scared that I'm gonna be wrong at the same time.
00:20:39.640 | And then how do I do that?
00:20:40.980 | Because that's the key.
00:20:42.360 | When you don't lose your audaciousness
00:20:44.520 | and you keep going after your big goal,
00:20:46.860 | and at the same time you say,
00:20:48.380 | hey, I'm worried that I'm gonna be wrong,
00:20:50.600 | you gain your radical open-mindedness
00:20:53.740 | that allows you to take in the things
00:20:56.080 | that allows you to go to the next level of being successful.
00:20:59.540 | - So your own process, I mean,
00:21:01.360 | you've talked about it before,
00:21:02.540 | but it would be great if you can describe it
00:21:05.220 | because our darkest moments
00:21:06.580 | are perhaps the most interesting.
00:21:07.940 | So your own and with the prediction
00:21:10.180 | of another depression.
00:21:13.240 | - Economic depression.
00:21:14.440 | - Yes, I apologize, economic depression.
00:21:17.120 | Can you talk to what you were feeling,
00:21:20.600 | thinking, planning, strategizing at those moments?
00:21:24.600 | - Yeah, that was my biggest moment, okay?
00:21:28.840 | Building my little company.
00:21:30.880 | This is in 1981, '82.
00:21:34.120 | I had calculated that American banks
00:21:36.720 | had lent a lot more money,
00:21:40.040 | a lot more money to Latin American countries
00:21:42.120 | than those countries were gonna pay back,
00:21:44.060 | and that they would have a debt crisis,
00:21:46.720 | and that this would send the economy tumbling,
00:21:49.880 | and that was an extremely controversial point of view.
00:21:53.520 | Then it started to happen,
00:21:54.920 | and it happened when Mexico defaulted in August 1982.
00:21:58.800 | I thought that there was gonna be an economic collapse
00:22:02.440 | that was gonna follow
00:22:03.380 | because there was a series of the other countries,
00:22:05.640 | it was just playing out as I had imagined,
00:22:08.160 | and that couldn't have been more wrong.
00:22:11.000 | That was the exact bottom in the stock market
00:22:14.080 | because central banks, monetary policy, blah, blah, blah,
00:22:17.440 | and I couldn't have been more wrong,
00:22:19.120 | and I was very publicly wrong, and all of that,
00:22:21.940 | and I lost money for me, and I lost money for my clients,
00:22:24.800 | and I only had a small company then,
00:22:27.880 | but these were close people, I had to let them go.
00:22:32.080 | I was down to me as the last person.
00:22:35.680 | I was so broke I had to borrow $4,000 from my dad
00:22:40.000 | to help to pay for my family bills, very painful,
00:22:43.160 | and at the same time, I would say it definitely was
00:22:47.600 | one of the best things that ever happened to me,
00:22:50.480 | maybe the best thing that ever happened to me
00:22:52.360 | because it changed my approach to decision making.
00:22:55.100 | It's what I'm saying.
00:22:56.360 | In other words, I kept saying,
00:22:58.360 | okay, how do I know whether I'm right?
00:23:01.040 | How do I know I'm not wrong?
00:23:02.880 | It gave me that, and it didn't give up my audaciousness
00:23:07.520 | because I was in a position, what am I gonna do?
00:23:10.720 | Am I gonna go back, put on a tie, go to Wall Street,
00:23:14.720 | and just do those things?
00:23:16.560 | No, I can't bring myself to do that, so I'm at a juncture.
00:23:19.640 | How do I deal with my risk, and how do I deal with that?
00:23:22.200 | And it told me how to deal with my uncertainties,
00:23:25.240 | and that taught me, for example, a number of techniques.
00:23:28.800 | First, to find the smartest people I could find
00:23:31.780 | who disagreed with me, and to have quality disagreement.
00:23:35.240 | I learned the art of thoughtful disagreement.
00:23:37.440 | I learned how to produce diversification.
00:23:39.720 | I learned how to do a number of things.
00:23:41.840 | That is what led me to create an idea meritocracy.
00:23:46.120 | In other words, person by person, I hired them,
00:23:48.480 | and I wanted the smartest people
00:23:50.360 | who would be independent thinkers
00:23:51.800 | who would disagree with each other and me well
00:23:55.040 | so that we could be independent thinkers
00:23:57.600 | to go off to produce those audacious dreams
00:24:00.240 | 'cause you have to be an independent thinker to do that,
00:24:02.540 | and to do that not only independently of the consensus,
00:24:05.540 | but independently of each other,
00:24:06.940 | and then work ourselves through that
00:24:08.820 | 'cause who knows whether you're gonna have the right answer,
00:24:11.460 | and by doing that, then that was the key to our success,
00:24:15.780 | and the things that I wanna pass along to people.
00:24:18.280 | The reason I'm doing this podcast with you
00:24:20.900 | is I'm 70 years old, and that is a magical way
00:24:25.700 | of achieving success if you can create
00:24:28.940 | an idea meritocracy.
00:24:30.940 | It's so much better in terms of achieving success
00:24:34.060 | and also quality relationships with people,
00:24:36.700 | but that's what that experience gave me.
00:24:38.960 | - So if we can linger on a little bit longer,
00:24:41.580 | the idea of an idea meritocracy, it's fascinating,
00:24:44.940 | but especially because it seems to be rare,
00:24:48.060 | not just in companies but in society.
00:24:50.960 | So there's a lot of people on Twitter
00:24:53.100 | and public discourse and politics and so on
00:24:55.800 | that are really stuck in certain sets of ideas,
00:24:58.260 | whatever they are.
00:24:59.940 | So when you're confronted with an idea
00:25:03.460 | that's different than your own about a particular topic,
00:25:08.460 | what kind of process do you go through mentally?
00:25:12.740 | Are you arguing through the idea with the person?
00:25:16.100 | Sort of present it almost like a debate,
00:25:18.340 | or do you sit on it and consider the world
00:25:21.020 | sort of empathetically if this is true?
00:25:23.980 | Then what does that world look like?
00:25:27.300 | Does that world make sense and so on?
00:25:28.820 | So what's the process of considering
00:25:30.520 | those conflicting ideas for you?
00:25:32.860 | - I'm gonna answer that question after saying first,
00:25:36.820 | almost implicit in your question is it's not common.
00:25:41.340 | What's common produces only common results.
00:25:45.760 | So don't judge anything that is good
00:25:50.100 | based on whether it's common,
00:25:51.620 | 'cause it's only gonna give you common results.
00:25:53.620 | If you want unique, you have a unique approach.
00:25:57.220 | And so that art of thoughtful disagreement
00:26:00.460 | is the capacity to hold two things
00:26:04.580 | in your mind at the same time.
00:26:06.460 | The, gee, I think this makes sense,
00:26:09.740 | and then saying, I'm not sure it makes sense,
00:26:13.380 | and then try to say, why does it make sense,
00:26:17.620 | and then to triangulate with others.
00:26:20.180 | So if I'm having a discussion like that
00:26:22.500 | and I work myself through and I'm not sure,
00:26:26.580 | then I have to do that in a good way.
00:26:29.220 | So I always give attention, for example,
00:26:32.220 | let's start off, what does the other person know
00:26:34.340 | relative to what I know?
00:26:36.120 | So if a person has a higher expertise or things,
00:26:39.820 | I'm much more inclined to ask questions.
00:26:42.240 | I'm always asking questions.
00:26:43.820 | If you wanna learn, you're asking questions,
00:26:46.620 | you're not arguing, okay?
00:26:48.460 | You're taking in, you're assessing when it comes into you.
00:26:51.820 | Does that make sense?
00:26:52.980 | Are you learning something?
00:26:54.100 | Are you getting epiphanies and so on?
00:26:56.020 | And I try to then do that.
00:26:57.540 | If the conversation, as we're trying to decide what is true,
00:27:02.540 | and we're trying to do that together,
00:27:04.580 | and we see truth different,
00:27:06.220 | then I might even call in another
00:27:09.060 | really smart, capable person and try to say,
00:27:12.360 | what is true and how do we explore that together?
00:27:15.420 | And you go through that same thing.
00:27:17.660 | So I would, I said, I describe it as having open-mindedness
00:27:22.580 | and assertiveness at the same time.
00:27:24.980 | That you can simultaneously be open-minded
00:27:28.580 | and take in with that curiosity
00:27:30.940 | and then also be assertive and say,
00:27:32.620 | but that doesn't make sense.
00:27:33.840 | Why would this be the case?
00:27:35.660 | And you do that back and forth.
00:27:37.460 | - And when you're doing that kind of back and forth
00:27:41.540 | on a topic like the economy, which you have,
00:27:44.800 | to me, perhaps I'm naive, but it seems both incredible
00:27:49.540 | and incredibly complex, the economy, the trading,
00:27:53.580 | the transactions, that these transactions
00:27:56.140 | between two individuals somehow add up
00:27:58.540 | to this giant mechanism.
00:28:00.300 | You've put out a 30-minute video.
00:28:03.180 | You have a lot of incredible videos online
00:28:06.500 | that people should definitely watch on YouTube.
00:28:10.080 | But you've put out this 30-minute video
00:28:11.660 | titled How the Economic Machine Works.
00:28:14.620 | That is probably one of the best, if not the best video
00:28:19.100 | I've seen on the internet in terms of educational videos.
00:28:22.420 | So people should definitely watch it,
00:28:24.260 | especially because it's not that the individual components
00:28:29.260 | of the video are somehow revolutionary,
00:28:31.940 | but the simplicity and the clarity
00:28:33.600 | of the different components just makes you,
00:28:36.180 | there's a few light bulb moments there
00:28:37.740 | about how the economy works as a machine.
00:28:40.880 | So as you described, there's three main forces
00:28:43.180 | that drive the economy, productivity growth,
00:28:45.180 | short-term debt cycle, long-term debt cycle.
00:28:48.380 | The former, productivity growth, is
00:28:52.140 | how much value people create, the valuable things people create.
00:28:57.140 | The latter is people borrowing from their future selves
00:29:02.100 | to hopefully create those valuable things faster.
00:29:05.220 | So this is an incredible system to me.
00:29:07.420 | Maybe we can linger on it a little bit.
00:29:09.980 | But you've also said what most people think about
00:29:14.180 | as money is actually credit.
00:29:15.700 | Total amount of credit in the US is $50 trillion.
00:29:20.580 | Total amount of money is $3 trillion.
00:29:23.060 | That's just crazy to me.
00:29:26.060 | Maybe I'm silly, maybe you can educate me,
00:29:28.540 | but that seems crazy.
00:29:29.920 | It gives me just pause that the human civilization
00:29:33.460 | has been able to create a system that has so much credit.
00:29:37.140 | So that's a long way to ask,
00:29:40.900 | do you think credit is good or bad for society?
00:29:45.060 | That system that's so fundamentally based on credit.
00:29:49.200 | - I think credit is great,
00:29:51.300 | even though people often overdo it.
00:29:54.500 | Credit is that somebody has earned money.
00:29:58.580 | What happens is they lend it to somebody else
00:30:04.820 | who's got better ideas and they cut a deal.
00:30:07.500 | Then that person with the better ideas
00:30:09.500 | is gonna pay it back.
00:30:10.540 | And if it works well, it helps resource allocations go well,
00:30:15.540 | providing people like the entrepreneurs
00:30:18.340 | and all of those, they need capital.
00:30:20.520 | They don't have capital themselves.
00:30:22.220 | And so somebody is gonna give them capital
00:30:24.060 | and they'll give them credit and along those lines.
00:30:26.620 | Then what happens is it's not managed well
00:30:30.220 | in a variety of ways.
00:30:31.900 | So I did another book on principles,
00:30:34.500 | "Principles of Big Debt Crises," that goes into that.
00:30:38.700 | And it's free by the way, I put it free online as a PDF.
00:30:43.700 | So if you go online and you look principles
00:30:47.100 | of big debt crisis is under my name,
00:30:48.900 | you can download it in a PDF
00:30:50.660 | or you can buy a print book of it.
00:30:52.900 | And it goes through that particular process.
00:30:55.780 | And so you always have it overdone in always the same way.
00:31:00.780 | Everything by the way, almost everything happens
00:31:04.200 | over and over again for the same reasons.
00:31:06.300 | Okay, so these debt crises all happen over and over again
00:31:09.700 | for the same reasons.
00:31:11.100 | They get it overdone.
00:31:12.420 | In the book it explains how you identify
00:31:14.400 | whether it's overdone or not.
00:31:16.100 | They get it overdone and then you go through the process
00:31:19.280 | of making the adjustments according to that.
00:31:21.580 | And then, and it explains how they can use the levers
00:31:24.460 | and so on.
00:31:25.340 | If you didn't have credit, then you would be sort of,
00:31:29.180 | everybody sort of be stuck.
00:31:30.860 | So credit is a good thing, but it can easily be overdone.
00:31:35.860 | So now we get into the question, what is money?
00:31:39.800 | What is credit?
00:31:41.060 | Okay, you get into money and credit.
00:31:43.260 | So if you're holding credit and you think that's worthwhile,
00:31:46.540 | keep in mind that the central bank,
00:31:48.800 | let's say it can print the money.
00:31:50.500 | What is that problem?
00:31:51.980 | You have an IOU and the IOU says you're gonna get
00:31:55.660 | a certain number of dollars, let's say, or yen or euros.
00:32:00.260 | And that is what the IOU is.
00:32:02.660 | And so the question is, will you get that money
00:32:06.220 | and what will it be worth?
00:32:07.900 | And then also you have a government
00:32:10.540 | which is a participant in that process.
00:32:13.060 | 'Cause they are on the hook, they owe money.
00:32:16.420 | And then will they print the money to make it easy
00:32:19.460 | for everybody to pay?
00:32:20.980 | So you have to pay attention to those two.
00:32:23.180 | I would suggest like you recommend to other people,
00:32:26.460 | just take that 30 minutes and it comes across
00:32:30.740 | pretty clearly.
00:32:31.580 | But my conclusion is that of course you want it.
00:32:35.180 | And even if you understand it and the cycles well,
00:32:39.900 | you can benefit from those cycles rather than
00:32:43.140 | to be hurt by those cycles.
00:32:45.380 | Because I don't know the way the cycle works
00:32:48.660 | is somebody gets over indebted,
00:32:50.460 | they have to sell an asset, okay, then I don't know,
00:32:53.620 | that's when assets become cheaper.
00:32:55.740 | How do you acquire the asset?
00:32:57.420 | It's a whole process.
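(A toy illustration of how a few trillion dollars of money can support tens of trillions of credit, in the spirit of the $3 trillion versus $50 trillion figures mentioned earlier. The re-lending fraction and number of rounds are made-up assumptions, not anything Dalio states; it only shows that repeatedly lending the same base money stacks up IOUs much larger than the money itself.)

```python
# Hypothetical sketch: each deposit is partly re-lent, and every loan is an
# IOU that adds to total credit outstanding. All numbers are illustrative.

def total_credit(base_money, relend_fraction, rounds):
    deposit = base_money
    credit = 0.0
    for _ in range(rounds):
        loan = deposit * relend_fraction   # a lender extends credit against the deposit
        credit += loan                     # the IOU adds to credit outstanding
        deposit = loan                     # the borrower's spending becomes a new deposit
    return credit

# With 95% re-lent each round, $3 (trillion) of money supports roughly $54
# (trillion) of credit, on the order of the figures quoted in the conversation.
print(total_credit(base_money=3.0, relend_fraction=0.95, rounds=60))
```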
00:32:59.260 | - So again, maybe another dumb question but--
00:33:03.860 | - There are no such things as dumb questions.
00:33:06.020 | - There you go.
00:33:06.860 | But what is money?
00:33:09.460 | So you've mentioned credit and money.
00:33:12.060 | Another thing that if I just zoom out
00:33:16.460 | from an alien perspective and look at human civilization,
00:33:19.420 | it's incredible that we've created a thing
00:33:22.740 | that's not, that only works because currency,
00:33:27.380 | because we all agree it has value.
00:33:30.460 | So I guess my question is how do you think about money
00:33:36.380 | as this emergent phenomenon?
00:33:38.940 | And what do you think is the future of money?
00:33:41.740 | You've commented on Bitcoin, other forms.
00:33:44.460 | What do you think is its history and future?
00:33:48.580 | How do you think about money?
00:33:51.320 | - There are two things that money is for.
00:33:55.100 | It's a medium of exchange and it's a storehold of wealth.
00:33:59.820 | - Yes.
00:34:01.100 | - So money, so you could say something's a medium
00:34:05.500 | of exchange and then you could say
00:34:07.500 | is it a storehold of wealth, okay?
00:34:10.300 | So those, and money is that vehicle that is those things
00:34:15.300 | and can be used to pay off your debt.
00:34:20.800 | So when you have a debt and you provide it,
00:34:25.020 | it pays off your debt.
00:34:26.840 | So that's that process.
00:34:28.740 | - And it's, I apologize to interrupt,
00:34:31.180 | but it only can be a medium of exchange or store wealth
00:34:35.100 | when everybody recognizes it to be a value.
00:34:37.700 | - That's right.
00:34:38.660 | And so you see in the history around the world
00:34:41.420 | and you go to places, I was in an island in the Pacific
00:34:46.420 | in which they had as money these big stones
00:34:53.340 | and literally they were taking a boat,
00:34:56.780 | this big carved stone, and they were taking it
00:35:00.140 | from one of the islands to the other
00:35:01.580 | and it sank, the piece of this big stone piece of money
00:35:06.580 | that they had, and it went to the bottom
00:35:10.120 | and they still perceived it as having value
00:35:13.580 | so that it was, even though it was in the bottom
00:35:15.940 | and it's this big hunk of rock,
00:35:17.820 | the fact that somebody owned it,
00:35:19.260 | they would say, oh, I'll own it for this and that.
00:35:21.940 | I've seen beads in different places,
00:35:25.460 | shells converted to this and mediums of exchange.
00:35:28.840 | And when we look at what we've got,
00:35:30.860 | you're exactly right, it is the notion
00:35:33.500 | that if I give it to you, I can then take it
00:35:36.180 | and I can buy something with it.
00:35:37.900 | And that's, so it's a matter of perception, okay.
00:35:41.500 | And then we go through then the history of money
00:35:45.020 | and the vulnerabilities of money.
00:35:47.500 | And what we have is, there's through history,
00:35:50.820 | there's been two types of money,
00:35:52.860 | those that are claims on something of value,
00:35:58.180 | like the connection to gold or something.
00:36:01.620 | - That's right.
00:36:02.460 | - That would be, or they just are money
00:36:05.660 | without any connection, and then we have a system now,
00:36:08.840 | which is a fiat monetary system.
00:36:10.940 | So that's what money is.
00:36:12.700 | Then it will last as long as it's kept a value
00:36:17.700 | and it works that way.
00:36:19.140 | So let's say central banks, when they get in the position
00:36:23.780 | of like they owe a lot of money,
00:36:26.220 | like we have in the case, it's increasingly the case,
00:36:29.780 | and they also are a bind and they have the printing press
00:36:33.700 | to print the money and get out of that.
00:36:36.700 | And you have a lot of people might be in that position.
00:36:39.500 | Then you can print it and then it could be devalued
00:36:42.460 | in there, and so history has shown, forget about today,
00:36:46.900 | history has shown that no currency has,
00:36:51.160 | every currency has either ended as being a currency,
00:36:55.900 | or devalued as a currency over periods of time,
00:36:59.900 | long periods of time.
00:37:01.420 | So it evolves and it changes, but everybody needs
00:37:04.500 | that medium of exchange, and everybody needs
00:37:07.260 | that storehold of wealth, so it keeps changing
00:37:10.200 | what is money over a period of time.
00:37:12.340 | - But so much is being digitized today,
00:37:16.780 | and there's this ideas that are based
00:37:19.340 | on the blockchain of Bitcoin and so on.
00:37:22.580 | So if all currencies, like all empires come to an end,
00:37:27.580 | what do you think, do you think something like Bitcoin
00:37:30.660 | might emerge as a common store of value,
00:37:34.340 | a store of wealth, and a medium of exchange?
00:37:37.860 | - The problem with Bitcoin is that it's not
00:37:42.860 | an effective medium of exchange, like it's not easy
00:37:45.740 | for me to go in there and buy things with it,
00:37:49.100 | and then it's not an effective storehold of value
00:37:52.060 | because it has a volatility that's based on speculation
00:37:56.020 | and the like, so it's not a very effective saving.
00:38:00.020 | That's very different from Facebook's idea
00:38:03.500 | of a stable-value currency, which would be effective
00:38:06.820 | as both a medium of exchange and a storehold of wealth,
00:38:11.140 | because if you were to hold it, and the way it's linked to,
00:38:14.780 | number of things that it's linked to,
00:38:16.900 | would mean that it could be a very effective
00:38:19.340 | storehold of wealth, and then you have a digital currency
00:38:22.500 | that could be a very effective medium of exchange
00:38:24.980 | and storehold of wealth.
00:38:26.780 | So in my opinion, some digital currencies
00:38:29.940 | are likely to succeed more or less
00:38:32.820 | based on that ability to do it.
00:38:35.060 | Then the question is, what happens?
00:38:37.940 | Okay, what happens is, do central banks
00:38:40.740 | allow that to happen?
00:38:42.660 | I really do believe it's possible to get a better form
00:38:46.540 | of money that central banks don't control, okay?
00:38:54.540 | A better form of money that central banks don't control,
00:38:54.540 | but then that's not yet happened,
00:38:57.820 | and we also have to, and so they've got to go through
00:39:01.060 | that evolutionary process.
00:39:03.460 | In order to go through that evolutionary process,
00:39:05.700 | first of all, governments have got to allow that to happen,
00:39:09.140 | which is to some extent a threat to them
00:39:11.500 | in terms of their power, and that's an issue,
00:39:14.620 | and then you have to also build the confidence
00:39:17.540 | in all of the components of it to say,
00:39:21.420 | okay, that's going to be effective
00:39:23.580 | because I won't have problems owning it.
00:39:28.580 | So I think that digital currencies
00:39:30.940 | have some element of potential,
00:39:35.220 | but there's a lot of hurdles that are going
00:39:38.260 | to have to be gotten over.
00:39:39.900 | I think that it'll be a very long time,
00:39:42.820 | possibly never, but anyway, a very long time
00:39:46.140 | before we have that, let's say, get into a position
00:39:48.980 | where it would be an effective means relative to gold,
00:39:53.980 | let's say, if you were to think of that,
00:39:56.300 | because gold has a track record of thousands of years
00:40:00.620 | all across countries.
00:40:03.300 | It has its mobility.
00:40:04.800 | It has the ability to put it down.
00:40:06.460 | It has certain abilities.
00:40:07.540 | It's got disadvantages relative to digital currencies,
00:40:12.100 | but central banks will hold it.
00:40:14.260 | There are central banks that worry about others.
00:40:18.380 | Other countries' central banks might worry
00:40:20.180 | about whether the US is going to print the dollar,
00:40:23.100 | and so the thing they're going to go to
00:40:24.820 | is not going to be the digital currency.
00:40:26.940 | Thing they're going to go to is gold or something else,
00:40:30.740 | some other currency.
00:40:31.580 | They got to pick it, and so I think it's a long way to go.
00:40:35.380 | - Well, you think it's possible that one day
00:40:37.260 | we don't even have a central bank
00:40:39.140 | because of a currency that cannot be controlled
00:40:44.140 | by the central bank is the primary currency,
00:40:47.920 | or does that seem very unlikely?
00:40:50.620 | - It would be very remote possibility
00:40:57.940 | or very long in the future.
00:41:00.420 | - Got it.
00:41:01.460 | Again, maybe a dumb question, but a romanticized one.
00:41:04.220 | When you sit back and you look,
00:41:06.060 | you describe these transactions between individuals
00:41:09.820 | somehow creating short-term debt cycles,
00:41:13.900 | long-term debt cycles, this productivity growth.
00:41:17.460 | Does it amaze you that this whole thing works,
00:41:22.380 | that there's however many millions,
00:41:25.340 | hundreds of millions of people in the United States,
00:41:27.780 | globally over seven billion people,
00:41:30.180 | that this thing between individual transactions,
00:41:35.020 | it works?
00:41:36.180 | - Yeah, it amazes me.
00:41:38.540 | I go back and forth between being in it,
00:41:43.380 | and then I think, how did a credit card,
00:41:48.020 | how is that really possible?
00:41:49.620 | I still use it, I take out a credit card, I put it down,
00:41:52.540 | the guy doesn't know me.
00:41:53.900 | - Yeah, it's all strangers.
00:41:56.260 | - Okay, we're making the digital entries.
00:41:58.660 | Is that really secure enough and that kind of thing?
00:42:01.100 | And then it goes back, and it goes this,
00:42:03.180 | and it clears, and it all happens.
00:42:05.540 | And what I marvel at that, and those types of things,
00:42:09.700 | is because of the capacity of the human mind
00:42:14.260 | to create abstractions that are true.
00:42:19.020 | It's imagination, and then the ability to go from one level,
00:42:23.380 | and then if these things are true,
00:42:25.660 | then you go to the next level,
00:42:27.100 | and if those things are true,
00:42:28.880 | then you go to the next level.
00:42:30.180 | And all those miracles that we almost become common,
00:42:33.700 | it's like when I'm flying in a plane,
00:42:36.420 | or when I'm looking at all of the things that happen.
00:42:39.540 | When I get communications in the middle of,
00:42:42.620 | I don't know, Africa or Antarctica,
00:42:45.060 | and we're communicating in the ways
00:42:47.220 | where I see the face on my iPad of somebody,
00:42:50.460 | my grandkid in someplace else,
00:42:52.540 | and I look at this and I say, wow, yes, it all amazes me.
00:42:58.100 | - So while being amazing, do you have a sense,
00:43:01.620 | the principles you described,
00:43:02.940 | that the whole thing is stable somehow also?
00:43:06.140 | Or is this, are we just lucky?
00:43:09.340 | So the principles that you described,
00:43:11.620 | are those describing a system that is stable,
00:43:16.180 | robust, and will remain so,
00:43:18.580 | or is it a lucky accident of our early history?
00:43:22.940 | - My area of expertise is economics and markets,
00:43:26.000 | so I get down to a real nitty gritty.
00:43:28.860 | I can't tell you whether the plane
00:43:30.260 | is gonna fall out of the sky
00:43:32.300 | because of its particular fundamentals.
00:43:34.980 | I don't know enough about that,
00:43:36.380 | but it happens over and over again,
00:43:37.980 | and so on, it gives me faith, okay?
00:43:39.780 | So without me knowing it.
00:43:41.300 | In the markets and the economy,
00:43:43.800 | I know those things well enough, in a sense,
00:43:47.440 | to say that by and large, that structure is right,
00:43:52.440 | what we're seeing is right.
00:43:55.120 | Now, whether there are disruptions,
00:43:58.060 | and it has effects that can come,
00:44:01.200 | not because that structure is right,
00:44:03.180 | and I believe it's right,
00:44:04.600 | but whether it can be hurt by, let's say,
00:44:08.100 | connectivity or journal entries.
00:44:11.200 | They could take all the money away from you
00:44:13.500 | through your digital entries.
00:44:15.340 | There's all sorts of things that can happen in various ways
00:44:19.100 | that means that that money is worthless,
00:44:22.000 | or the system falls,
00:44:23.620 | but from what I see in terms of its basic structure,
00:44:26.860 | and those complexities that still take my breath away,
00:44:30.180 | I would say knowing enough about the mechanics of them,
00:44:33.580 | that doesn't worry me.
00:44:34.860 | - Have you seen disruptions in your lifetime
00:44:37.040 | that really surprised you?
00:44:38.260 | - Oh, all the time.
00:44:39.800 | This is one of the great lessons of my life,
00:44:42.180 | is that many times I've seen things
00:44:46.520 | that I was very surprised about,
00:44:50.460 | and that I realized almost all of those
00:44:54.920 | I was surprised about because they were just
00:44:58.140 | the first time it happened to me.
00:45:00.260 | They didn't happen in my lifetime before,
00:45:02.620 | but when I researched them,
00:45:04.660 | they happened in other places or other people's lifetimes.
00:45:08.220 | So, for example, I remember 1971, the dollar,
00:45:13.220 | there was no such thing as a devaluation of a currency.
00:45:16.740 | I hadn't experienced it,
00:45:18.540 | and the dollar was connected to gold,
00:45:21.340 | and I was watching events happen,
00:45:23.220 | and then you get on, and that definition of money
00:45:26.580 | all of a sudden went out the window
00:45:29.100 | because it was not tied to gold,
00:45:31.380 | and then you have this devaluation.
00:45:33.740 | And then, or the first oil shock,
00:45:37.100 | or the second oil shock, or so many of these things.
00:45:41.060 | But almost always, I realized that they,
00:45:47.180 | when I looked in history, they happened before.
00:45:50.100 | They just happened in other people's lifetimes,
00:45:53.020 | which led me to realize that I needed to study history
00:45:58.020 | and what happened in other people's lifetimes
00:46:00.900 | and what happened in other countries and places
00:46:04.140 | so that I would have timeless and universal principles
00:46:07.780 | for dealing with that thing.
00:46:09.360 | So, oh yeah, I've been, the implausible happening.
00:46:13.860 | But it's like a one in a hundred year storm.
00:46:17.660 | - Right. - Okay?
00:46:19.060 | Or it's, or-- - They've happened before.
00:46:22.180 | - Yeah, they've happened. - Just not to you.
00:46:24.420 | Let me talk about, if we could, about AI a little bit.
00:46:28.180 | So we've, Bridgewater Associates manages
00:46:32.900 | about $160 billion in assets,
00:46:36.640 | and our artificial intelligence systems,
00:46:40.300 | algorithms are pretty good with data.
00:46:43.620 | What role in the future do you see AI play
00:46:47.980 | in analysis and decision making
00:46:50.260 | in this kind of data-rich and impactful area of investment?
00:46:55.260 | - I'm gonna answer that not only in investment,
00:46:59.900 | but I give a more all-encompassing rule for AI.
00:47:04.900 | As I think you know, for the last 25 years,
00:47:11.500 | we have taken our thinking and put them in algorithms,
00:47:15.700 | and so we make decisions.
00:47:17.900 | The computer takes those criteria, algorithms,
00:47:22.180 | and they put them, they're in there,
00:47:24.140 | and it takes data, and they operate
00:47:27.500 | as an independent decision maker
00:47:29.500 | in parallel with our decision making.
00:47:31.660 | So for me, it's like there's a chess game playing,
00:47:35.420 | and I'm a person with my chess game,
00:47:38.420 | and I'm saying it made that move,
00:47:39.940 | and I'm making the move, and how do I compare
00:47:42.060 | those two moves, so we've done a lot.
00:47:43.940 | But let me give you a rule.
00:47:45.980 | If the future can be different from the past,
00:47:49.980 | and you don't have deep understanding,
00:47:55.080 | you should not rely on AI, okay?
00:48:01.180 | Those two things.
00:48:03.260 | - Deep understanding of?
00:48:05.180 | - The cause-effect relationships
00:48:07.100 | that are leading you to place that bet in anything, okay?
00:48:11.020 | Anything important.
00:48:12.940 | Let's say if it was do surgeries,
00:48:14.980 | and you would say, how do I do surgeries?
00:48:17.420 | I think it's totally fine to watch all the doctors
00:48:20.780 | do the surgeries.
00:48:21.620 | You can put it on, take a digital camera and do that,
00:48:26.620 | convert that into AI algorithms that go to robots,
00:48:32.980 | and have them do surgeries,
00:48:34.980 | and I'd be comfortable with that.
00:48:36.740 | Because if it keeps doing the same thing over and over again
00:48:40.920 | and you have enough of that, that would be fine,
00:48:43.940 | even though you may not understand the algorithms,
00:48:46.700 | because if the thing's happening over and over again,
00:48:49.440 | and you can assume the future will be the same,
00:48:52.020 | that appendicitis or whatever it is
00:48:54.220 | will be handled the same way in surgery, that's fine.
00:48:57.620 | However, what happens with AI, for the most part,
00:49:03.000 | is it takes a lot of data,
00:49:05.160 | with a high enough sample size,
00:49:09.600 | and then it puts together its own algorithms.
00:49:13.240 | Okay, there are two ways you can come up with algorithms.
00:49:16.200 | You can either take your thinking
00:49:18.360 | and express them in algorithms,
00:49:20.640 | or you can put the data in and say, what is the algorithm?
00:49:25.640 | That's machine learning.
00:49:27.920 | And when you have machine learning,
00:49:30.620 | it'll give you equations,
00:49:32.080 | which quite often are not understandable.
00:49:35.120 | If you would try to say,
00:49:36.160 | okay, now describe what it's telling you,
00:49:38.400 | it's very difficult to describe,
00:49:40.160 | and so they can escape understanding.
00:49:42.960 | And so it's very good for doing those things
00:49:46.280 | that could be done over and over again,
00:49:47.960 | where you're watching the same thing repeat.
00:49:49.960 | But, again,
00:49:53.100 | if the future is different from the past
00:49:56.880 | and you don't have deep understanding,
00:49:59.080 | you're gonna get in trouble.
00:50:00.740 | And so that's the main thing.
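A minimal sketch of the distinction being drawn here, with hypothetical thresholds, data, and scikit-learn's LogisticRegression standing in as the learner: a hand-written criterion stays readable and explainable, while a rule fit from data yields coefficients that can be hard to interpret, which is the risk flagged when the future differs from the past.

```python
# Hypothetical sketch: made-up thresholds and data; scikit-learn chosen
# only as one possible learner. Not Bridgewater's actual system.
from sklearn.linear_model import LogisticRegression

def explicit_principle(growth, inflation):
    """Hand-written criterion: the decision logic is readable and explainable."""
    return "buy" if growth > 0.02 and inflation < 0.03 else "hold"

# Machine-learned alternative: hand over the data and let the model
# produce its own equation from it.
X = [[0.03, 0.02], [0.01, 0.05], [0.04, 0.01], [0.00, 0.04]]  # [growth, inflation]
y = [1, 0, 1, 0]                                              # 1 = buy, 0 = hold
learned_rule = LogisticRegression().fit(X, y)

print(explicit_principle(0.03, 0.02))  # "buy", and you can say exactly why
print(learned_rule.coef_)              # the fitted equation, much harder to explain
```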
00:50:03.160 | As far as AI is concerned,
00:50:05.360 | AI, and let's say computer replications
00:50:08.660 | of thinking in various ways,
00:50:09.960 | I think it's particularly good for processing.
00:50:12.840 | But the notion of what you want to do
00:50:18.300 | is better most of the time determined by the human mind.
00:50:23.020 | That is, what are the principles?
00:50:24.500 | Like, okay, how should I raise my children?
00:50:27.540 | It's gonna be a long time before
00:50:29.980 | you're going to say AI has good enough judgment to do that.
00:50:33.140 | Who should I marry?
00:50:34.740 | All of those things.
00:50:35.620 | Maybe you can get the computer to help you,
00:50:37.460 | but if you just took data and do machine learning,
00:50:40.140 | it's not gonna find it.
00:50:41.340 | If you were to then take what are my criteria
00:50:44.460 | for any of those questions,
00:50:47.460 | and then say, put them into an algorithm,
00:50:49.740 | you'd be a lot better off than if you took AI to do it.
00:50:53.180 | But by and large, the mind should be used
00:50:56.980 | for inventing and those creative things.
00:51:00.540 | And then the computer should be used for processing
00:51:03.980 | 'cause it could process a lot more information,
00:51:06.940 | a lot faster, a lot more accurately,
00:51:10.260 | and a lot less emotionally.
00:51:12.220 | So any notion of thinking
00:51:15.140 | in the form of processing type thinking
00:51:17.700 | should be done by a computer.
00:51:19.620 | And anything that is in the notion of doing
00:51:22.220 | that other type of thinking
00:51:23.460 | should be operating with the brain.
00:51:26.820 | Operating in a way where you can say, ah, that makes sense.
00:51:31.380 | - The process of reducing your understanding
00:51:35.340 | down to principles is kind of like the process,
00:51:38.860 | the first one you mentioned, type of AI algorithm,
00:51:43.300 | where you're encoding your expertise.
00:51:45.620 | You're trying to program, write a program.
00:51:47.940 | The human is trying to write a program.
00:51:50.380 | How do you think that's attainable?
00:51:53.540 | The process of reducing principles to a computer program.
00:51:58.540 | Or when you say, when you write about,
00:52:03.620 | when you think about principles,
00:52:05.820 | is there still a human element
00:52:07.460 | that's not reducible to an algorithm?
00:52:10.900 | - My experience has been that almost all things,
00:52:17.180 | including those things that I thought
00:52:19.620 | were pretty much impossible to express,
00:52:23.780 | I've been able to express in algorithms.
00:52:27.660 | But that doesn't constitute all things.
00:52:31.140 | So you can, whew, you can express far more
00:52:36.140 | than you can imagine you'll be able to express.
00:52:39.420 | So I use the example of, okay, it's not,
00:52:41.900 | how do you raise your children?
00:52:43.940 | Okay, you will be able to take it one piece by piece.
00:52:48.300 | Okay, at what age, what school?
00:52:51.660 | And the way to do that, in my experience,
00:52:54.940 | is to take that, and when you're in the moment
00:52:59.940 | of making a decision, or just past making a decision,
00:53:05.020 | to take the time and to write down your criteria
00:53:09.060 | for making that decision in words.
00:53:11.880 | Okay, that way you'll get your principles down on paper.
00:53:16.380 | I created an app,
00:53:19.860 | it's right now just on the iPhone, it'll be on Android.
00:53:23.420 | - I tried getting it on Android, come on, now.
00:53:25.340 | Let's get it on Android.
00:53:26.660 | - It'll be, in a few months it'll be on Android.
00:53:28.620 | - Awesome.
00:53:29.460 | - But it has a tool in there that helps people
00:53:33.020 | write down their own principles.
00:53:35.140 | 'Cause this is very powerful.
00:53:37.160 | So when you're in that moment where you've just,
00:53:40.540 | you're thinking about it and you're thinking
00:53:42.060 | your criteria for choosing the school for your child,
00:53:46.060 | or whatever that might be, and you write down your criteria,
00:53:49.660 | or whatever they are, those principles,
00:53:52.180 | you write down and you, that will, at that moment,
00:53:57.180 | make you articulate your principles in a very valuable way.
00:54:02.140 | And the way that we operate,
00:54:04.860 | you have easy access, so the next time
00:54:07.580 | that comes along, you can go to that,
00:54:09.900 | or you can show those principles to others
00:54:11.980 | to see if they're the right principles.
00:54:13.920 | You will get a clarity of that principle
00:54:16.920 | that's really invaluable in words,
00:54:19.060 | and that'll help you a lot.
00:54:21.100 | But then, you start to think, how do I express that in data?
00:54:26.020 | And it'll shock you about how you can do that.
00:54:29.260 | You'll form an equation that will show the relationship
00:54:33.320 | between these particular parts, and then the,
00:54:36.180 | essentially the variables that are going to go
00:54:39.500 | into that particular equation,
00:54:41.540 | and you will be able to do that.
00:54:43.100 | And you take that little piece,
00:54:45.140 | and you put it into the computer.
00:54:48.020 | And then take the next little piece,
00:54:49.860 | and you put that into the computer.
00:54:51.940 | And before you know it, you will have
00:54:54.780 | a decision-making system that's of the sort
00:54:57.020 | that I'm describing.
00:54:58.100 | - So you're almost making an argument
00:55:00.540 | against an earlier statement you've made.
00:55:03.140 | You're convincing me, at first you said,
00:55:05.380 | there's no way a computer could raise a child, essentially.
00:55:08.880 | But now you've described, making me think of it,
00:55:12.140 | if you have that kind of idea meritocracy,
00:55:16.020 | you have this rigorous approach that Bridgewater
00:55:17.900 | takes to investment, and apply it to raising a child.
00:55:21.880 | It feels like, through the process you just described,
00:55:24.900 | we could, as a society, arrive at a set of principles
00:55:29.140 | for raising a child, and encode it into a computer.
00:55:33.900 | - That originality will not come from machine learning.
00:55:38.900 | - The first time you do, so that the original,
00:55:42.000 | yes. - That's what I'm referring to.
00:55:43.180 | - But eventually, as we together develop it,
00:55:45.940 | and then we can automate it.
00:55:47.280 | - That's why I'm saying the processing
00:55:50.180 | can be done by the computer.
00:55:51.820 | So we're saying the same thing, we're not inconsistent.
00:55:54.940 | We're saying the same thing, that the processing
00:55:57.620 | of that information and those algorithms
00:55:59.820 | can be done by the computer in a very, very effective way.
00:56:03.580 | You don't need to sit there and process,
00:56:05.300 | and try to weigh all those things in your equation,
00:56:07.700 | and all those things.
00:56:09.060 | But that notion of, okay, how do I get at that principle?
00:56:13.820 | - And you're saying you'd be surprised
00:56:15.700 | - Yes, you can do that. - how much you can express.
00:56:18.900 | - That's right.
00:56:19.740 | You can do that.
00:56:22.100 | So this is where I think you're going to see the future.
00:56:25.920 | Right now, we go to our devices,
00:56:30.980 | and we get information, to a large extent.
00:56:35.640 | And then we get some guidance, we have our GPS and the like.
00:56:39.660 | In my opinion, principles, principles, principles,
00:56:43.380 | principles, I want to emphasize that.
00:56:44.960 | You write them down, you've got those principles.
00:56:48.300 | They will be converted into algorithms for decision making.
00:56:52.700 | And they're going to also have the benefit
00:56:55.220 | of collective decision making.
00:56:57.380 | Because right now, individuals,
00:56:59.740 | based on what's stuck in their heads,
00:57:01.900 | are making their decisions in very ignorant ways.
00:57:05.180 | They're not the best decision makers,
00:57:06.960 | they don't have the best criteria, and they're operating.
00:57:10.060 | When those principles are written down,
00:57:12.520 | and converted into algorithms,
00:57:14.840 | it's almost like you'll look at that
00:57:16.560 | and follow the instructions,
00:57:18.240 | and it'll give you better results.
00:57:20.220 | Medicine will be much more like this.
00:57:23.760 | You can go to your local doctor,
00:57:25.560 | and you could ask his point of view, and whatever.
00:57:27.840 | And he's rushed, and he may not be the best doctor around.
00:57:31.520 | And you're going to go to this thing,
00:57:33.480 | and get that same information,
00:57:35.140 | or just automatically have an input in that.
00:57:37.900 | And it's going to tell you,
00:57:39.260 | okay, here's what you should go do,
00:57:41.460 | and it's going to be much better than your local doctor.
00:57:44.260 | And that, the converting of information into intelligence,
00:57:49.260 | okay, intelligence, is the thing.
00:57:52.700 | We're coming out with, again, I'm 70,
00:57:55.740 | and I want to pass all these things along.
00:57:57.900 | So, all these tools that I've found
00:58:02.260 | and needed to develop over all these periods of time,
00:58:04.940 | all those things, I want to make them available.
00:58:07.440 | And what's going to happen,
00:58:08.640 | as they're going to see this,
00:58:10.940 | they're going to see these tools
00:58:13.400 | operating much more that way.
00:58:15.000 | The idea of converting data into intelligence.
00:58:20.000 | Intelligence, for example, on what they are like.
00:58:23.660 | On what are your strengths and weaknesses.
00:58:25.420 | Intelligence on who do I work well with,
00:58:28.300 | under what circumstances.
00:58:29.460 | - Personalized. - Intelligence.
00:58:31.340 | We're going to go from what are called systems of record,
00:58:35.700 | which are a lot of, okay,
00:58:37.020 | information organized in the right way,
00:58:39.520 | to intelligence.
00:58:41.600 | And we're going to, that'll be the next big move,
00:58:45.540 | in my opinion.
00:58:46.660 | And so you will get intelligence back.
00:58:49.900 | - And that intelligence comes from
00:58:51.460 | reducing things down to principles and to--
00:58:54.120 | - That's how it happens.
00:58:55.500 | - So, what's your intuition,
00:58:57.220 | if we look at future societies,
00:58:58.900 | do you think we'll be able to reduce
00:59:02.620 | a lot of the details of our lives
00:59:07.300 | down to principles that would be
00:59:08.820 | further and further automated?
00:59:10.220 | - I think the real question hinges
00:59:12.760 | on people's emotions and irrational behaviors.
00:59:17.760 | I think that there's subliminal things that we want, okay?
00:59:25.940 | And then there's cerebral, conscious logic.
00:59:30.020 | And the two often are at odds.
00:59:32.940 | So there's almost like two yous in you, right?
00:59:36.700 | And so let's say, what do you want?
00:59:39.540 | Your mind will answer one thing,
00:59:42.260 | your emotions will answer something else.
00:59:44.660 | So when I think about it, I think emotions are,
00:59:47.900 | I want inspiration, I want love is a good thing,
00:59:52.300 | being able to have a good impact.
00:59:54.020 | But it is in the reconciliation of your subliminal wants
00:59:59.020 | and your intellectual wants,
01:00:02.700 | so that you really say they're aligned.
01:00:05.780 | And so to do that in a way to get what you want.
01:00:08.860 | So irrationality is a bad thing
01:00:13.340 | if it means that it doesn't make sense
01:00:16.460 | in getting you what you want,
01:00:18.180 | but you better decide which you you're satisfying.
01:00:20.180 | Is it the lower level you, emotional, subliminal one?
01:00:23.380 | Or is it the other?
01:00:24.580 | But if you can align them.
01:00:26.060 | So what I find is that,
01:00:30.700 | you experience the decision, do this thing subliminally.
01:00:34.740 | And that's the thing I want.
01:00:35.980 | It comes to the surface.
01:00:37.680 | I find that if I can align that
01:00:40.340 | with what my logical me wants
01:00:42.620 | and do the double check between them
01:00:45.900 | and I get the same sort of thing,
01:00:47.900 | that that helps me a lot.
01:00:49.740 | I find, for example, meditation is one of the things
01:00:53.060 | that helps to achieve that alignment.
01:00:55.180 | It's fantastic for achieving that alignment.
01:00:58.460 | And often then I also wanna not just do it in my head.
01:01:02.140 | I wanna say, does that make sense?
01:01:03.580 | Help you, and so I do it with other people.
01:01:06.220 | And I say, okay, well, let's say I want this thing
01:01:08.580 | and whatever, does that make sense?
01:01:10.700 | And when you do that kind of triangulation,
01:01:13.580 | your two yous, and you do that with also the other way,
01:01:17.820 | then you certainly wanna be rational, right?
01:01:21.220 | But rationality has to be defined by those things.
01:01:24.600 | - And then you discover sort of new ideas
01:01:28.740 | that drive your future.
01:01:30.780 | So you're always at the edge
01:01:32.660 | of the set of principles you've developed.
01:01:34.340 | You're doing new things always.
01:01:35.940 | - Right. - So that's where
01:01:36.780 | the intellect is needed.
01:01:38.300 | - Well, and the inspiration.
01:01:40.580 | The inspiration is needed to do that, right?
01:01:43.580 | Like, what are you doing it for?
01:01:45.260 | It's the excitement. - So what is that thing?
01:01:46.940 | What is that thing? - The adventure,
01:01:47.980 | the curiosity, the hunger.
01:01:49.780 | What's, if you can be Freud for a second,
01:01:52.740 | what's in that subconscious?
01:01:55.020 | What's the thing that drives us?
01:01:57.020 | - I think you can't generalize about us.
01:02:00.020 | I think different people are driven by different things.
01:02:02.620 | There's not a common one, right?
01:02:04.800 | So like if you would take the shapers,
01:02:07.980 | I think it is a combination of subliminally,
01:02:12.980 | it's a combination of excitement, curiosity.
01:02:18.420 | - Is there a dark element there?
01:02:19.980 | Is there demons?
01:02:21.820 | Is there fears?
01:02:22.840 | Is there, in your sense,
01:02:24.580 | something dark that drives them?
01:02:25.980 | - Most of the ones that I'm dealing with,
01:02:27.940 | I have not seen that.
01:02:29.780 | I see the, what I really see is,
01:02:33.260 | whoo, if I can do that, that would be the dream.
01:02:37.700 | And then the act of creativity, and you say, ooh.
01:02:41.380 | So excitement is one of the things.
01:02:44.180 | Curiosity is a big pull, okay?
01:02:47.940 | And then tenacity, okay, to do those things.
01:02:51.780 | But definitely, emotions are entering into it.
01:02:55.380 | Then there's an intellectual component of it too, okay?
01:02:58.780 | It may be empathy.
01:03:01.140 | Can I have an impact?
01:03:02.780 | Can I have an impact?
01:03:04.060 | The desire to have an impact, that's an emotional thrill,
01:03:08.300 | but it also has empathy.
01:03:10.300 | And then you start to see spirituality.
01:03:12.820 | By the spirituality, I mean the connectedness to the whole.
01:03:16.020 | You start to see people operate those things.
01:03:18.380 | Those tend to be the things that you see the most of.
01:03:21.460 | - And I think you're gonna shut down this idea completely,
01:03:25.540 | but there's a notion that some of these shapers
01:03:29.600 | really walk the line between sort of madness and genius.
01:03:33.540 | Do you think madness has a role in any of this?
01:03:37.000 | Or do you still see Steve Jobs and Elon Musk
01:03:40.300 | as fundamentally rational?
01:03:42.220 | - Yeah, there's a continuum there.
01:03:43.980 | And what comes to my mind is that genius
01:03:48.980 | is often at the edge, in some cases,
01:03:53.940 | imaginative genius is at the edge of insanity.
01:03:58.940 | And it's almost like a radio that I think,
01:04:02.860 | okay, if I can tune it just right, it's playing right,
01:04:06.740 | but if I go a little bit too far, it goes off, okay?
01:04:11.900 | And so you can see this.
01:04:14.740 | Kay Jamison was studying bipolar disorder.
01:04:17.380 | What it shows is that that's definitely the case,
01:04:22.000 | 'cause when you're going out there,
01:04:23.620 | that imagination, whatever,
01:04:25.460 | can be near the edge sometimes.
01:04:29.020 | Doesn't have to always be.
01:04:31.260 | - So let me ask you about automation.
01:04:33.780 | That's been a part of public discourse recently.
01:04:37.220 | What's your view on the impact of automation
01:04:41.820 | of whether we're talking about AI
01:04:44.260 | or more basic forms of automation on the economy
01:04:46.940 | in the short term and the long term?
01:04:49.140 | Do you have concerns about it, as some do,
01:04:52.740 | or do you think it's overblown?
01:04:55.380 | - It's not overblown.
01:04:56.220 | I mean, it's a giant thing.
01:04:57.500 | It'll come at us in a very big way in the future.
01:05:01.300 | We're right at the edge of even really accelerating it.
01:05:04.420 | It's had a big impact, and it will have a big impact.
01:05:07.740 | And it's a two-edged sword,
01:05:09.420 | because it'll have tremendous benefits,
01:05:14.420 | and at the same time, it has profound benefits
01:05:18.580 | in employment and distributions of wealth,
01:05:21.420 | because the way I think about it is
01:05:25.180 | there are certain things human beings can do,
01:05:27.540 | and over time, we've evolved to go to
01:05:32.340 | almost higher and higher levels,
01:05:34.380 | and now we're almost like we're at this level.
01:05:37.300 | It used to be your labor, and you would then do your labor,
01:05:41.420 | and okay, we can get past the labor.
01:05:42.980 | We got tractors and things, and you go up, up, up, up, up,
01:05:46.220 | and we're up over here, and up to the point in our minds
01:05:50.020 | where, okay, anything related to mental processing,
01:05:55.020 | the computer can probably do better, and we can find that.
01:05:58.580 | And so other than almost inventing,
01:06:01.940 | you're at a point where the machines
01:06:05.860 | and the automation will probably do it better,
01:06:09.700 | and that's accelerating, and that's a force,
01:06:12.860 | and that's a force for the good,
01:06:15.540 | and at the same time, what it does is it displaces people
01:06:20.540 | in terms of employment and changes,
01:06:22.860 | and it produces wealth gaps and all of that.
01:06:25.300 | So I think the real issue is that that has to be viewed
01:06:29.340 | as a national emergency.
01:06:32.180 | In other words, I think the wealth gap,
01:06:34.820 | the income gap, the opportunity gap, all of those things,
01:06:39.260 | that force is creating the problems
01:06:42.980 | that we're having today, a lot of the problems,
01:06:45.540 | the great polarity, the disenfranchised,
01:06:49.060 | not anything approaching equality of education,
01:06:54.780 | all of these problems, a lot of problems
01:06:57.620 | are coming as a result of that,
01:06:59.660 | and so it needs to be viewed, really,
01:07:02.540 | as an emergency situation in which there's a
01:07:07.540 | good plan worked out for how to deal with that effectively
01:07:13.420 | so that it's dealt with effectively.
01:07:17.540 | Because it's good for the average,
01:07:21.620 | it's good for the impact, but it's not good for everyone,
01:07:24.460 | and it creates that polarity, so it's gotta be dealt with.
01:07:27.820 | - Yeah, and you've talked about the American dream,
01:07:30.020 | and that that's something that all people
01:07:32.740 | should have an opportunity for,
01:07:34.100 | and that we need to reform capitalism
01:07:36.700 | to give that opportunity for everyone.
01:07:39.380 | Let me ask on one of the ideas in terms of safety nets
01:07:44.380 | that support that kind of opportunity.
01:07:47.860 | There's been a lot of discussion
01:07:49.060 | of universal basic income amongst people,
01:07:51.780 | so there's Andrew Yang, who's running on that,
01:07:55.180 | he's a political candidate running for president,
01:07:57.860 | on the idea of universal basic income.
01:08:00.460 | What do you think about that,
01:08:01.860 | giving $1,000 or some amount of money to everybody
01:08:05.620 | as a way to give them the padding,
01:08:09.300 | the freedom to sort of take leaps,
01:08:12.340 | to take the call for adventure,
01:08:14.180 | to take the crazy pursuits?
01:08:16.740 | - Before I get right into my thoughts
01:08:19.140 | on universal basic income,
01:08:20.460 | I wanna start with the notion that opportunity,
01:08:26.740 | education, development, creating equality,
01:08:31.740 | so that people have equal opportunity,
01:08:36.220 | is the most important thing,
01:08:39.720 | and then to figure out
01:08:43.080 | how you are going to provide that:
01:08:45.020 | how do you get the money into a public school system,
01:08:50.260 | how do you get the teaching?
01:08:52.380 | Fleshing out that plan to create equal opportunity
01:08:57.380 | in all of its various forms is the most pressing thing to do,
01:09:02.660 | and so that is that.
01:09:06.220 | - The opportunity, the most important one
01:09:08.320 | you're kind of implying is the earlier the better,
01:09:11.420 | sort of like opportunity to education,
01:09:14.440 | so in the early development of a human being
01:09:17.780 | is when you should have the equal opportunities,
01:09:20.060 | that's the most important.
01:09:21.060 | - Right, in the first phase of your life,
01:09:24.220 | which goes from birth until you're on your own,
01:09:27.820 | and you're an adult and you're now out there,
01:09:30.520 | and you deal with early childhood development,
01:09:34.660 | and you take the brain, and you say what's important?
01:09:38.020 | The child care, it makes a world of difference,
01:09:42.660 | for example, if you have good parents
01:09:44.700 | who are trying to think about instilling the stability
01:09:48.740 | in a non-traumatic environment to provide them,
01:09:52.060 | so I would say the good guidance
01:09:53.760 | that normally comes from parents,
01:09:55.820 | and the good education that they're receiving
01:09:59.500 | are the most important things in that person's development.
01:10:06.180 | The ability to be able to be prepared to go out there,
01:10:10.380 | and then to go into a market
01:10:12.780 | that's an equal opportunity job market,
01:10:15.860 | to be able to then go into that kind of market
01:10:19.020 | is a system that creates not only fairness,
01:10:22.780 | anything else is not fair,
01:10:24.980 | and then in addition to that,
01:10:27.140 | it also is a more effective economic system,
01:10:30.420 | because the consequences of not doing that
01:10:33.720 | to a society are devastating.
01:10:35.740 | If you look at what the difference in outcomes
01:10:38.580 | for somebody who completes high school
01:10:40.940 | or doesn't complete high school,
01:10:42.540 | or does each one of those state changes,
01:10:45.860 | and you look at what that means
01:10:47.780 | in terms of their costs to society,
01:10:50.660 | not only themselves, but their cost in incarceration costs,
01:10:54.700 | and crimes, and all of those things,
01:10:57.500 | it's economically better for the society,
01:11:00.740 | and it's fairer if they can get those particular things.
01:11:05.740 | Once they have those things,
01:11:07.760 | then you move on to other things,
01:11:09.300 | but yes, from birth all the way through that process,
01:11:13.180 | anything less than that is bad,
01:11:16.820 | is a tragedy, and so on.
01:11:18.940 | So that's what I, yeah,
01:11:20.420 | those are the things that I'm emphasizing.
01:11:22.260 | And so my, what I would want above all else
01:11:25.900 | is to provide that.
01:11:27.420 | So with that in mind,
01:11:28.780 | now we'll talk about universal basic income.
01:11:30.900 | - Start with that, now we can talk about--
01:11:32.820 | - Well, right, because you have to have that.
01:11:34.940 | Now the question is what's the best way to provide that?
01:11:38.260 | Okay, so when I look at UBI,
01:11:41.380 | what I really think about is what is going to happen
01:11:43.900 | with that $1,000, okay?
01:11:46.300 | And will that $1,000 come from another program?
01:11:50.540 | Does that come from an early childhood developmental program?
01:11:54.220 | Who are you giving the $1,000 to,
01:11:57.180 | and what will they do for that $1,000?
01:11:59.700 | I mean, like my reaction would be,
01:12:01.900 | I think it's a great thing
01:12:03.220 | that everybody should have almost $1,000 in their bank,
01:12:06.940 | and so on.
01:12:07.780 | But when do they get to make decisions,
01:12:09.620 | or who's the parent?
01:12:10.700 | A lot of times you can give $1,000 to somebody,
01:12:14.020 | and it could have a negative result.
01:12:15.780 | It can have, you know,
01:12:16.820 | they can use that money detrimentally,
01:12:19.340 | not just productively.
01:12:21.100 | And if that money's coming away
01:12:22.940 | from some of those other things
01:12:24.500 | that are gonna produce the things I want,
01:12:26.460 | and you're shifted to,
01:12:28.340 | let's say, to come in and give a check,
01:12:30.420 | doesn't mean its outcomes are going to be good
01:12:32.580 | in providing those things
01:12:33.740 | that I think are so fundamental important.
01:12:36.540 | If it was just everybody can have $1,000 and use it,
01:12:39.580 | so when the time comes--
01:12:40.420 | - Use it well, right. - And use it well,
01:12:41.940 | that would be really, really good,
01:12:44.260 | because it's almost like everybody,
01:12:46.620 | you'd wish everybody could have $1,000 worth
01:12:48.740 | of wiggle room in their lives, okay?
01:12:51.660 | And I think that would be great.
01:12:54.180 | I love that.
01:12:55.040 | But I wanna make sure that these other things
01:12:58.380 | that are taken care of,
01:12:59.420 | so if it comes out of that budget,
01:13:01.380 | and you know, I don't want it to come out of that budget,
01:13:04.140 | that's gonna be doing those things,
01:13:05.980 | and so you have to figure it out.
01:13:08.900 | - And you have a certain skepticism
01:13:10.540 | that human nature will use,
01:13:14.020 | may not always, in fact, frequently may not use
01:13:17.460 | that $1,000 for the optimal,
01:13:19.680 | to support the optimal trajectory--
01:13:21.980 | - Some will and some won't.
01:13:23.800 | One of the advantages of universal basic income
01:13:27.380 | is that if you put it in the hands,
01:13:29.660 | let's say, of parents who know how to do the right things
01:13:32.420 | and make the right choices for their children,
01:13:34.940 | 'cause they're responsible,
01:13:36.100 | and you say, "I'm gonna give them $1,000 wiggle room
01:13:38.860 | "to use for the benefit of their children."
01:13:41.460 | Wow, that sounds great.
01:13:43.300 | If you put it in the hands of, let's say,
01:13:46.380 | an alcoholic or drug-addicted parent
01:13:49.700 | who is not making those choices well for their children,
01:13:53.860 | and what they do is they take that $1,000
01:13:56.340 | and they don't use it well,
01:13:58.100 | then that's gonna produce more harm than good.
01:14:01.040 | - Well, you're, if I may say so,
01:14:03.820 | one of the richest people in the world,
01:14:05.780 | so you're a good person to ask, does money buy happiness?
01:14:11.300 | - No, it's been shown that once you get over
01:14:16.300 | a basic level of income,
01:14:18.800 | so that you can take care of the pain,
01:14:22.420 | you can, health and whatever,
01:14:24.220 | there's no correlation between the level of happiness
01:14:28.100 | that one has and the level of money that one has.
01:14:32.620 | That thing that has the highest correlation
01:14:35.820 | is quality relationships with others and community.
01:14:39.720 | If you look at surveys of these things
01:14:41.780 | across all surveys in all societies,
01:14:44.780 | it's a sense of community and interpersonal relationships.
01:14:49.700 | That is not in any way correlated with money.
01:14:52.640 | You can go down to native tribes in very poor places,
01:14:57.360 | or you can go in all different communities,
01:15:00.520 | and so they have the opportunity to have that.
01:15:03.220 | I'm very lucky in that I started with nothing,
01:15:07.020 | so I had the full range.
01:15:08.340 | I can tell you, by not having money,
01:15:11.200 | and then having quite a lot of money,
01:15:13.940 | and I did that in the right order.
01:15:16.060 | - So you started from nothing in Long Island.
01:15:17.900 | - Yeah, and my dad was a jazz musician,
01:15:20.660 | but I had all really that I needed,
01:15:23.220 | because I had two parents who loved me
01:15:25.820 | and took good care of me,
01:15:27.340 | and I went to a public school
01:15:28.940 | that was a good public school,
01:15:30.760 | and basically, you don't need much more than that
01:15:34.080 | in order to, that's the equal opportunity part.
01:15:36.940 | Anyway, what I'm saying is, I experienced the range,
01:15:40.740 | and there are many studies on the answer to your question.
01:15:44.620 | No, money does not bring happiness.
01:15:46.820 | Money gives you an ability to make choices.
01:15:52.580 | - Does it get in the way, in any way,
01:15:57.540 | of forming those deep, meaningful relationships?
01:16:00.580 | - It can.
01:16:01.580 | There are lots of ways that it can be negative.
01:16:03.340 | That's one of them.
01:16:04.460 | It could stand in the way of that, yes, okay,
01:16:08.500 | but I could almost list the ways
01:16:11.740 | that it could be a problem.
01:16:13.300 | - What does it buy?
01:16:15.420 | So if you can elaborate, you mentioned a bit of freedom.
01:16:19.580 | - At the most fundamental level,
01:16:21.420 | it doesn't take a whole lot,
01:16:23.420 | but it takes enough that
01:16:26.700 | you can take care of yourself and your family
01:16:31.500 | to be able to learn, do the basics,
01:16:36.500 | have the relationships, have healthcare,
01:16:41.780 | the basics of those types of things.
01:16:44.500 | You can cover the payments,
01:16:46.580 | and then to have maybe enough security,
01:16:50.300 | but maybe not too much security.
01:16:52.340 | - That's right, yeah.
01:16:53.420 | - That you essentially are okay.
01:16:57.740 | Okay, that's really good.
01:17:00.660 | And you don't, that's what money will get you.
01:17:04.140 | - And everything else could go either way.
01:17:06.540 | There's no correlation. - Well, no, there's more.
01:17:07.940 | - There's more.
01:17:08.780 | - Okay, then beyond that, what it then starts to do,
01:17:13.460 | that's the most important thing.
01:17:15.100 | But beyond that, what it starts to do
01:17:17.980 | is to help to make your dreams happen in various ways.
01:17:22.820 | Okay, so for example, like in my case,
01:17:27.500 | those dreams might not be just my own dreams,
01:17:32.100 | they're impact on others' dreams, okay?
01:17:35.740 | So my own dreams might be,
01:17:39.900 | I don't know, I can pass along these,
01:17:42.580 | at my stage in life,
01:17:43.620 | I can pass along these principles to you,
01:17:45.340 | and I can give those things,
01:17:46.860 | or I can do whatever, I can go on an adventure,
01:17:51.900 | I can start a business, I can do those other things,
01:17:56.900 | be productive, I can self-actualize
01:18:00.100 | in ways that might not be possible otherwise.
01:18:04.260 | That's my own belief.
01:18:08.020 | And then I can also help others.
01:18:10.940 | I mean, this is, to the extent,
01:18:12.740 | when you get older, and with time, and whatever,
01:18:15.420 | you start to feel connected, spirituality,
01:18:18.820 | that's what I'm referring to,
01:18:19.960 | you can start to have an effect on others,
01:18:21.860 | it's beneficial, and so on, gives you the ability.
01:18:24.500 | I could tell you that people who are very wealthy,
01:18:29.100 | who have that, feel that they don't have enough money.
01:18:32.340 | Bill Gates will feel almost broke,
01:18:35.180 | because relative to the things he'd like to accomplish,
01:18:39.020 | through the Gates Foundation and things like that,
01:18:41.740 | you know, oh my God, he doesn't have enough money
01:18:44.160 | to accomplish the things he wishes for.
01:18:46.420 | But those things are not,
01:18:48.940 | you know, they're not the most fundamental things.
01:18:51.580 | So I think that people sometimes think money has value.
01:18:56.260 | Money doesn't have value.
01:18:58.460 | Money is, like you say, just a medium of exchange
01:19:01.500 | and a storehold of wealth.
01:19:02.940 | And so what you have to say is,
01:19:05.020 | what is it that you're going to buy?
01:19:07.380 | Now, there are other people who get their gratification
01:19:10.100 | in ways that are different from me,
01:19:12.460 | but I think in many cases,
01:19:14.620 | let's say somebody who used money to have a status symbol.
01:19:18.860 | I would say that's probably unhealthy.
01:19:22.260 | But then, I don't know, somebody who says,
01:19:25.140 | I love a great, gorgeous painting,
01:19:28.140 | and it's gonna cost lots of money.
01:19:29.980 | In my priorities, I can't get there.
01:19:33.900 | But that doesn't mean, who am I to judge others
01:19:38.540 | in terms of, let's say, their element of the freedom
01:19:41.100 | to do those things.
01:19:42.200 | So it's a little bit complicated.
01:19:43.940 | But by and large, that's my view on money and wealth.
01:19:48.780 | - So let me ask you in terms of the idea of,
01:19:53.780 | so much of your passions in life has been
01:19:56.660 | through something you might be able to call work.
01:19:59.900 | Alan Watts has this quote.
01:20:02.020 | He said that the real key to life, secret to life,
01:20:06.460 | is to be completely engaged with what you're doing
01:20:08.980 | in the here and now.
01:20:10.620 | And instead of calling it work, realize it is play.
01:20:13.360 | So I'd like to ask, what is the role of work
01:20:17.960 | in your life's journey, or in a life's journey?
01:20:21.660 | And what do you think about this modern idea
01:20:24.060 | of kind of separating work and work-life balance?
01:20:28.120 | - I have a principle that I believe in is,
01:20:31.240 | make your work and your passion the same thing.
01:20:34.100 | Okay, so that's similar view.
01:20:37.260 | In other words, if you can make your work and your passion,
01:20:40.400 | it's just gonna work out great.
01:20:42.020 | And then, of course, people have different purposes of work.
01:20:45.900 | And I don't wanna be theoretical about that.
01:20:48.680 | People have to take care of their family.
01:20:51.040 | So money, at a certain point, is an important component
01:20:56.040 | of that work, so you look beyond that.
01:20:58.920 | What is the money gonna get you,
01:21:00.240 | and what are you trying to achieve?
01:21:02.100 | But the most important thing, I agree,
01:21:04.840 | is meaningful work and meaningful relationships.
01:21:07.680 | Like, if you can get into the thing that you're,
01:21:10.400 | your mission that you're on, and you are excited
01:21:13.760 | about that mission that you're on,
01:21:15.840 | and then you can do that with people
01:21:18.420 | who you have the meaningful relationships with,
01:21:21.960 | you have meaningful work and meaningful relationships,
01:21:25.340 | I mean, that is fabulous for most people.
01:21:29.000 | - And it seems that many people struggle to get there,
01:21:34.660 | not out of, not necessarily because they're constrained
01:21:39.020 | by the fact that they have the financial constraints
01:21:41.280 | of having to provide for their family and so on,
01:21:45.780 | but it's, I mean, you know, this idea's out there
01:21:50.100 | that there needs to be a work-life balance,
01:21:51.880 | which means that most people,
01:21:53.780 | and we're gonna return to the same thing,
01:21:55.580 | is most doesn't mean optimal,
01:21:57.820 | but most people seem to not be doing their life's passion,
01:22:01.940 | not be, not unifying work and passion.
01:22:04.940 | Why do you think that is?
01:22:06.260 | - Well, the work-life balance,
01:22:08.780 | there's a life arc that you go through.
01:22:12.900 | Starts at zero and ends somewhere in the vicinity of 80,
01:22:17.900 | and there is a phase, and there's a,
01:22:19.860 | and you could look at the different degrees of happiness
01:22:22.300 | that happen in those phases, I can go through that
01:22:24.780 | if that was interesting,
01:22:25.620 | but we don't have time probably for it.
01:22:27.640 | But you get in the part of the life,
01:22:30.200 | that part of the life which has the lowest level
01:22:34.340 | of happiness is age 45 to 55,
01:22:38.620 | and because as you move into this second phase of your life,
01:22:43.620 | now the first phase of your life
01:22:46.420 | is when you're learning dependent on others.
01:22:48.980 | Second phase of your life is when you're working
01:22:51.740 | and others are dependent on you,
01:22:53.500 | and you're trying to be successful.
01:22:55.600 | And in that phase of one's life,
01:22:58.140 | you encounter the work-life balance challenge
01:23:01.460 | because you're trying to be successful at work
01:23:04.040 | and successful at parenting and successful and successful
01:23:07.300 | and all those things that take your demand.
01:23:10.140 | And they get into that, and I understand that problem
01:23:13.620 | in the work-life balance.
01:23:15.380 | The issue is primarily to know how to approach that, okay?
01:23:20.140 | So I understand it's stressful, it produces stress,
01:23:24.220 | and it produces bad results,
01:23:25.740 | and it produces the lowest level of happiness in one's life.
01:23:29.180 | It's interesting, as you get later in life,
01:23:31.940 | the levels of happiness rise,
01:23:33.780 | and the highest level of happiness
01:23:35.900 | is between ages 70 and 80,
01:23:37.660 | which is interesting for other reasons.
01:23:39.420 | But in that spot, and the key to work-life balance
01:23:44.420 | is to realize and to learn
01:23:48.980 | how to get more out of an hour of life, okay?
01:23:53.740 | Because an hour of work,
01:23:56.820 | what people are thinking is that they have to make a choice
01:24:00.540 | between one thing and another, and of course they do,
01:24:04.720 | but they don't realize that if they develop the skill
01:24:08.300 | to get a lot more out of an hour,
01:24:11.320 | it's the equivalent of having many more hours in your life.
01:24:14.660 | And so, that's why in the book "Principles"
01:24:17.460 | I try to go into, okay, now how can you get a lot more
01:24:21.060 | out of an hour that allows you to get more life
01:24:25.080 | into your life, and it reduces the work-life balance problem.
01:24:28.020 | - And that's the primary struggle in that 35 to 45.
01:24:32.900 | If you could linger on that,
01:24:35.140 | so what are the ups and downs of life
01:24:37.860 | in terms of happiness in general,
01:24:39.960 | and perhaps in your own life
01:24:42.960 | when you look back at the moments, the peaks?
01:24:46.580 | - It's pretty much the same pattern.
01:24:50.040 | Early in one's life tends to be a very happy period
01:24:54.400 | all the way up, and 16 is like a really great, happy age,
01:24:58.640 | you know, I think, like myself,
01:25:00.920 | you start to get elements of freedom,
01:25:03.400 | you get your driver's license, whatever, but 16 is there.
01:25:07.660 | Junior year in high school quite often
01:25:10.560 | could be a stressful period to try to get things
01:25:12.520 | about the high school.
01:25:13.760 | You go into college, tends to be very high happiness,
01:25:17.280 | generally speaking.
01:25:18.440 | - Freedom.
01:25:19.280 | - And then freedom, yeah, friendships,
01:25:21.960 | all of that freedom is a big thing.
01:25:24.200 | And then 23 is a peak point kind of in happiness.
01:25:30.640 | That freedom.
01:25:31.640 | Then sequentially one has a great time,
01:25:34.960 | they date, they go out, and so on,
01:25:36.560 | you find the love of your life,
01:25:38.900 | you begin to develop a family,
01:25:41.540 | and then with that as time happens,
01:25:44.400 | you have more of your work-life balance challenges
01:25:48.000 | that come and your responsibilities,
01:25:50.280 | and then as you get there in that mid part of your life,
01:25:53.880 | that is the biggest struggle.
01:25:56.480 | Chances are you will crash in that period of time.
01:25:59.720 | You'll have your series of failures, that's that.
01:26:03.640 | That's when you go into the abyss.
01:26:05.520 | You learn, you hopefully learn from those mistakes,
01:26:09.000 | you have a metamorphosis, you come out, you change,
01:26:12.440 | you hopefully become better,
01:26:14.520 | and you take more responsibilities and so on.
01:26:17.200 | And then when you get to the later part
01:26:20.740 | as you are starting to approach the transition
01:26:24.520 | in that late part of the second phase of your life
01:26:28.520 | before you go into the third phase of your life,
01:26:31.160 | second phase is you're working, trying to be successful.
01:26:33.760 | Third phase of your life is you want people
01:26:37.600 | to be successful without you, okay?
01:26:40.400 | - Yes.
01:26:41.600 | - You want your kids to be successful without you
01:26:43.880 | because when you're at that phase,
01:26:46.360 | they're making their transition from the first phase
01:26:49.480 | to the second phase, and they're trying to be successful,
01:26:51.960 | and you want them to be successful without you,
01:26:54.040 | and you have, your parents are gone,
01:26:56.520 | and then you have freedom, and then you have freedom again.
01:27:00.600 | And with that freedom, then you have these,
01:27:03.680 | as history has shown, you have friendships,
01:27:06.740 | you have perspective on life, you have different things,
01:27:09.640 | and that's one of the reasons that that later part
01:27:12.200 | of the life can be really good, on average, actually.
01:27:15.320 | It's the highest.
01:27:16.160 | Very interesting thing.
01:27:17.600 | There are surveys that ask,
01:27:21.120 | "How good do you look, and how good do you feel?"
01:27:24.540 | And that's the highest in those surveys.
01:27:27.840 | The person, now they're not looking the best,
01:27:30.400 | and they're not feeling the best, right?
01:27:32.160 | Maybe it's 35 that they're actually looking the best
01:27:35.080 | and feeling the best, but they rank the highest
01:27:38.480 | at that point in the survey results.
01:27:41.080 | - That's so fascinating.
01:27:41.920 | - In that 70 to 80 period of time,
01:27:44.200 | because it has to do with an attitude on life.
01:27:47.200 | Then you start to have grandkids.
01:27:48.920 | Oh, grandkids are great.
01:27:50.760 | And you start to experience that transition well.
01:27:54.660 | So that's what the arc of life pretty much looks like,
01:27:57.780 | and I'm experiencing it.
01:28:00.940 | - You've learned that when you meditate,
01:28:03.940 | we're all human, we're all mortal.
01:28:05.740 | When you meditate on your own mortality,
01:28:09.540 | having achieved a lot of success on whatever dimension,
01:28:14.540 | what do you think is the meaning of it all?
01:28:16.980 | The meaning of our short existence on Earth
01:28:20.260 | as human beings?
01:28:22.100 | - I think that evolution is the greatest force
01:28:25.120 | of the universe, and that we're all tiny bits
01:28:30.120 | of an evolutionary type of process,
01:28:33.460 | where it's just matter and machines that go through time,
01:28:38.000 | and that we all have a deeply embedded inclination
01:28:43.000 | to evolve and contribute to evolution.
01:28:48.480 | So I think it's to personally evolve
01:28:52.000 | and contribute to evolution.
01:28:54.280 | - I could have predicted you would answer that way.
01:28:56.200 | It's brilliant and exactly right.
01:28:58.280 | And I think we've said it before, but I'll say it again.
01:29:02.960 | You have a lot of incredible videos out there
01:29:05.440 | that people should definitely watch.
01:29:07.540 | I don't say this often.
01:29:08.800 | I mean, it's literally the best spend of time.
01:29:11.560 | And in terms of reading principles,
01:29:14.000 | and reading basically anything you write on LinkedIn
01:29:16.120 | and so on, is a really good use of time.
01:29:18.480 | It's a lot of light bulb moments,
01:29:20.640 | a lot of transformative ideas in there.
01:29:22.680 | So Ray, thank you so much.
01:29:24.760 | It's been an honor.
01:29:25.640 | I really appreciate it.
01:29:26.640 | - It's been a pleasure for me too.
01:29:28.080 | I'm happy to hear it's of use to you and others.
01:29:31.340 | - Thanks for listening to this conversation with Ray Dalio.
01:29:35.920 | And thank you to our presenting sponsor, Cash App.
01:29:38.840 | Download it, use code LEXPODCAST.
01:29:41.360 | You'll get $10, and $10 will go to FIRST,
01:29:44.480 | a STEM education nonprofit that inspires hundreds
01:29:47.400 | of thousands of young minds to learn
01:29:50.040 | and to dream of engineering our future.
01:29:52.600 | If you enjoy this podcast, subscribe on YouTube,
01:29:55.540 | get five stars on Apple Podcast, support on Patreon,
01:29:59.000 | or connect with me on Twitter.
01:30:00.880 | Finally, closing words of advice from Ray Dalio.
01:30:04.240 | Pain plus reflection equals progress.
01:30:08.280 | Thank you for listening, and hope to see you next time.
01:30:11.360 | (upbeat music)
01:30:13.940 | (upbeat music)