
Yaron Brook: Ayn Rand and the Philosophy of Objectivism | Lex Fridman Podcast #138


Chapters

0:00 Introduction
2:39 Principles of a life well lived
10:46 Free will
17:01 Nature of reality
25:39 Ayn Rand
57:22 Objectivism
82:40 Gödel Incompleteness Theorem
87:47 Capitalism
117:33 Virtue of selfishness
127:37 Win-win
133:42 Anarchy
152:35 Tribalism and division
156:53 Objectivism and Jordan Peterson on personal responsibility


00:00:00.000 | The following is a conversation with Yaron Brook,
00:00:03.240 | one of the best known objectivist philosophers
00:00:05.640 | and thinkers in the world.
00:00:07.200 | Objectivism is the philosophical system
00:00:09.520 | developed by Ayn Rand that she first expressed
00:00:12.920 | in her fiction books, "The Fountainhead" and "Atlas Shrugged"
00:00:16.720 | and later in nonfiction essays and books.
00:00:19.840 | Yaron is the current chairman of the board
00:00:22.880 | at the Ayn Rand Institute,
00:00:24.560 | host of the "Yaron Brook Show"
00:00:26.800 | and the co-author of "Free Market Revolution",
00:00:30.000 | "Equal is Unfair" and several other books
00:00:33.360 | where he analyzes systems of government,
00:00:36.080 | human behavior and the human condition
00:00:38.960 | from the perspective of objectivism.
00:00:41.800 | Quick mention of each sponsor,
00:00:43.360 | followed by some thoughts related to the episode.
00:00:46.280 | Blinkist, an app I use for reading
00:00:49.120 | through summaries of books.
00:00:51.080 | ExpressVPN, the VPN I've used for many years
00:00:54.640 | to protect my privacy on the internet
00:00:57.240 | and Cash App, the app I use to send money to friends.
00:01:01.440 | Please check out these sponsors in the description
00:01:03.640 | to get a discount and to support this podcast.
00:01:07.320 | As a side note, let me say that I first read
00:01:09.760 | "Atlas Shrugged" and "The Fountainhead" early in college,
00:01:13.480 | along with many other literary and philosophical works
00:01:16.960 | from Nietzsche, Heidegger, Kant, Locke, Foucault,
00:01:21.320 | Wittgenstein and of course, all the great existentialists
00:01:25.160 | from Kierkegaard to Camus.
00:01:27.680 | I always had an open mind, curious to learn
00:01:30.200 | and explore the ideas of thinkers throughout history,
00:01:33.120 | no matter how mundane or radical
00:01:36.760 | or even dangerous they were considered to be.
00:01:39.560 | Ayn Rand was, and I think still is, a divisive figure.
00:01:44.520 | Some people love her, some people dislike
00:01:46.960 | or even dismiss her.
00:01:49.240 | I prefer to look past what some may consider
00:01:51.480 | to be the flaws of the person
00:01:53.800 | and consider with an open mind, the ideas she presents
00:01:57.920 | and Yaron now describes and applies
00:02:00.800 | in his philosophical discussions.
00:02:03.000 | In general, I hope that you will be patient
00:02:05.440 | and understanding as I venture out across the space
00:02:08.840 | of ideas and the ever widening Overton window,
00:02:12.800 | pulling at the thread of curiosity,
00:02:15.120 | sometimes saying stupid things,
00:02:17.160 | but always striving to understand
00:02:19.280 | how we can better build a better world together.
00:02:23.340 | If you enjoy this thing, subscribe on YouTube,
00:02:25.760 | review it with five stars on Apple Podcasts,
00:02:28.000 | follow on Spotify, support on Patreon
00:02:30.680 | or connect with me on Twitter @lexfridman.
00:02:34.440 | And now here's my conversation with Yaron Brook.
00:02:38.320 | Let me ask the biggest possible question first.
00:02:42.640 | - Sure.
00:02:43.480 | - What are the principles of a life well lived?
00:02:47.320 | - I think it's to live with thought,
00:02:51.960 | that is to live a rational life, to think it through.
00:02:55.960 | I think so many people are in a sense zombies out there.
00:02:59.600 | They're alive, but they're not really alive
00:03:01.920 | 'cause their mind is not focused.
00:03:03.960 | Their mind is not focused on what do I need to do
00:03:07.960 | in order to live a great life?
00:03:10.040 | So too many people just go through the motions of living
00:03:13.400 | rather than really embrace life.
00:03:16.760 | So I think the secret to living a great life
00:03:19.760 | is to take it seriously.
00:03:22.720 | And what it means to take it seriously
00:03:24.460 | is to use the one tool that makes us human,
00:03:27.000 | the one tool that provides us with all the values
00:03:29.080 | that we have, our mind, our reason,
00:03:31.680 | and to use it, apply it to living.
00:03:34.880 | People apply it to their work,
00:03:36.780 | they apply it to their math problems,
00:03:38.880 | to science, to programming.
00:03:41.920 | But imagine if they used that same energy,
00:03:43.660 | that same focus, that same concentration
00:03:45.640 | to actually living life and choosing values
00:03:48.720 | that they should pursue.
00:03:51.080 | That would change the world
00:03:54.280 | and it would change their lives.
00:03:56.200 | - Yeah, actually, I wear this silly suit and tie.
00:04:00.320 | It symbolizes to me always, it makes me feel
00:04:03.120 | like I'm taking the moment really seriously.
00:04:06.440 | - I think that's really, that's right.
00:04:08.480 | And each one of us has different ways
00:04:10.940 | to kind of condition our consciousness.
00:04:14.440 | I'm serious now, and for you, it's a suit and tie.
00:04:17.780 | It's a conditioning of your consciousness
00:04:19.600 | to now I'm focused, now I'm at work,
00:04:21.560 | now I'm doing my thing.
00:04:22.920 | And I think that's terrific
00:04:26.040 | and I wish everybody took that.
00:04:28.020 | Look, I mean, it's a cliche, but we only live once.
00:04:32.080 | Every minute of your life, you're never gonna live again.
00:04:34.840 | This is really valuable.
00:04:36.580 | And when people don't have that deep respect
00:04:41.020 | for their own life, for their own time, for their own mind,
00:04:44.860 | and if they did, again, one could only imagine,
00:04:49.140 | look at how productive people are.
00:04:50.900 | Look at the amazing things they produce
00:04:52.620 | and they do in their work.
00:04:54.540 | And if they apply that to everything, wow.
00:04:58.540 | - So you kind of talk about reason.
00:05:00.260 | Where does the kind of existentialist idea
00:05:05.140 | of experience maybe, fully experiencing all the moments
00:05:10.140 | versus fully thinking through?
00:05:13.740 | Is there an interesting line to separate the two?
00:05:17.260 | Like why such an emphasis on reason for a life well-lived
00:05:21.820 | versus just enjoy, like experience?
00:05:26.480 | - Well, because I think experience in a sense
00:05:28.360 | is the easy part.
00:05:31.460 | I'm not saying it's unimportant, how we experience the life that we live.
00:05:36.460 | And yes, I'm all for taking time to value what you value,
00:05:41.980 | but I don't think that's the problem of people out there.
00:05:46.700 | I don't think the problem is they're not taking time
00:05:48.660 | to appreciate where they are and what they do.
00:05:51.060 | I think it's that they don't use their mind
00:05:53.360 | in this one respect in planning their life,
00:05:57.780 | in thinking about how to live.
00:06:00.100 | So the focus is on reason is because
00:06:02.380 | it's our only source of knowledge.
00:06:03.900 | There's no other source of knowledge.
00:06:05.260 | We don't know anything that does not come
00:06:10.100 | from our senses and our mind,
00:06:12.500 | the integration of the evidence of our senses.
00:06:15.220 | Now we know stuff about ourselves,
00:06:16.700 | and I think it's important to know oneself
00:06:18.300 | through introspection.
00:06:19.740 | And I consider that part of reasoning is to introspect.
00:06:24.740 | But I think reason is undervalued,
00:06:28.380 | which is funny to say because it's our means of survival.
00:06:31.340 | It's how human beings survive.
00:06:33.000 | We cannot, see, this is why I disagree
00:06:35.740 | with so many scientists and people like Sam Harris.
00:06:38.620 | You mentioned Sam Harris before the show.
00:06:40.620 | We're not programmed to know how to hunt.
00:06:46.140 | We're not programmed to do agriculture.
00:06:49.020 | We're not programmed to build computers
00:06:51.660 | and build networks on which we can podcast
00:06:53.860 | and do our shows.
00:06:55.320 | All of that requires effort.
00:06:58.100 | It requires focus.
00:06:59.740 | It requires energy, and it requires will.
00:07:03.360 | It requires somebody to will it.
00:07:05.940 | It requires somebody to choose it.
00:07:08.540 | And once you make that choice,
00:07:11.200 | you have to engage. That choice means
00:07:13.580 | that you're choosing to engage your reason
00:07:15.940 | in discovery, in integration,
00:07:19.240 | and then in work to change the world in which we live.
00:07:23.140 | And human beings have to discover, figure out,
00:07:27.500 | solve the problem of hunting.
00:07:29.140 | Hunting, everybody thinks, oh, that's easy.
00:07:31.620 | I've seen the movie.
00:07:32.660 | But human beings had to figure out how to do it.
00:07:36.220 | You can't run down a bison and bite into it.
00:07:40.740 | You're not gonna catch it.
00:07:41.580 | You have no fangs to bite into it.
00:07:44.100 | You have to build weapons.
00:07:45.560 | You have to build tools.
00:07:46.460 | You have to create traps.
00:07:47.700 | You have to have a strategy.
00:07:49.520 | All of that requires reason.
00:07:50.940 | So the most important thing that allows human beings
00:07:56.540 | to survive and to thrive in every value,
00:07:58.780 | from the most simple to the most sophisticated,
00:08:01.420 | from the most material to, I believe,
00:08:03.140 | the most spiritual, requires thinking.
00:08:06.340 | So stopping and appreciating the moment
00:08:10.260 | is something that I think is relatively easy
00:08:14.500 | once you have a plan, once you've thought it through,
00:08:17.620 | once you know what your values are.
00:08:19.660 | There is a mistake people make.
00:08:21.940 | They attain their values,
00:08:23.620 | and they don't take a moment to savor that
00:08:26.660 | and to appreciate that and to even pat themselves
00:08:29.060 | on the back that they did it.
00:08:31.380 | But that's not what's screwing up the world.
00:08:33.220 | What's screwing up the world
00:08:34.340 | is that people have the wrong values,
00:08:35.780 | and they don't think about them,
00:08:37.340 | and they don't really focus on them,
00:08:39.020 | and they don't have a plan for their own life
00:08:41.500 | and how to live it.
00:08:42.600 | - If we look at human nature,
00:08:44.300 | you're saying the fundamental big thing
00:08:46.780 | that we need to consider is our capacity,
00:08:49.100 | like a capability to reason.
00:08:51.620 | - I'd say, to me, reason is this massive
00:08:55.220 | evolutionary achievement, right, in quotes, right?
00:08:58.700 | If you think about any other sophisticated animal,
00:09:03.340 | everything has to be coded.
00:09:04.840 | Everything has to be written in the hardware.
00:09:08.300 | It has to be there.
00:09:09.980 | And they have to have a solution for every outcome,
00:09:12.180 | and if there's no solution, the animal dies, typically,
00:09:14.380 | or the animal suffers in some way.
00:09:16.740 | Human beings have this capacity to self-program.
00:09:19.660 | They have this capacity.
00:09:21.460 | There's not, it's not a tabula rasa
00:09:25.020 | in the sense that there's nothing there.
00:09:26.380 | Obviously, we have a nature.
00:09:27.900 | Obviously, our minds, our brains
00:09:30.300 | are structured in a particular way.
00:09:32.300 | But given that, we have the ability
00:09:36.180 | to turn it on or turn it off.
00:09:38.420 | We have the ability to commit suicide,
00:09:40.460 | to reject our nature, to work against our interests,
00:09:44.740 | not to use the tool that evolution has provided us,
00:09:49.500 | which is this mind, which is reason.
00:09:51.900 | So that choice, that fundamental choice,
00:09:54.780 | Hamlet says it, right, to be or not to be.
00:09:58.460 | But to be or not to be is to think or not to think,
00:10:01.900 | to engage or not to engage, to focus or not to focus.
00:10:05.540 | In the morning when you get up, you're not really
00:10:09.940 | completely there, you're kind of out of focus and stuff.
00:10:12.740 | It requires an act of will to say,
00:10:14.940 | okay, I'm awake, I've got stuff to do.
00:10:17.540 | Some people never do that.
00:10:19.340 | Some people live in that haze.
00:10:21.620 | And they never engage that mind.
00:10:23.460 | And when you're sitting and trying to solve
00:10:25.860 | a complex computer problem or a math problem,
00:10:29.460 | you have to turn something on.
00:10:31.940 | You have to, in a sense, exert certain energy
00:10:36.220 | to focus on the problem, to do it.
00:10:38.100 | And that is not determined in a sense
00:10:41.180 | that you have to focus.
00:10:43.180 | You choose to focus.
00:10:44.500 | And you could choose not to focus.
00:10:46.260 | - And that choice is more powerful
00:10:47.700 | than any other parts of our brain
00:10:50.700 | that we've borrowed from fish
00:10:52.180 | and from our evolutionary origins.
00:10:54.660 | Like this, whatever this crazy little leap in evolution is
00:10:58.080 | that allowed us to think is more powerful than anything else.
00:11:00.700 | - So I think neuroscientists pretend they know
00:11:05.180 | a lot more about the brain than they really do.
00:11:07.500 | - Yeah.
00:11:08.340 | Shots fired.
00:11:10.620 | I agree with you.
00:11:12.620 | - And we don't know that much yet
00:11:14.820 | about how the brain functions and what's a fish
00:11:16.980 | and all this stuff.
00:11:18.820 | So I think what exists there is a lot of potentialities.
00:11:23.240 | But the beauty of the human brain is its potentialities
00:11:28.780 | that we have to manifest through our choices.
00:11:31.880 | It's there, it's sitting there.
00:11:34.540 | And yes, there's certain things that are gonna evoke
00:11:37.900 | certain senses, certain feelings.
00:11:42.700 | I'm not even saying emotions
00:11:43.860 | 'cause I think emotions are too complex
00:11:45.400 | to have been programmed into our mind.
00:11:48.100 | But I don't think so.
00:11:49.980 | There's this big issue of evolutionary psychology,
00:11:52.100 | it's huge right now and it's a big issue.
00:11:54.620 | I find it, to a large extent, way too early,
00:12:01.100 | and it's ex-post storytelling about stuff.
00:12:07.820 | We still don't, so for example,
00:12:10.460 | I would like to see evolutionary psychology
00:12:12.340 | differentiate between things like inclinations,
00:12:17.080 | feelings, emotions, sensations, thoughts, concepts, ideas.
00:12:22.080 | What of those are programmed and what of those
00:12:26.340 | are developed and chosen and a product of reason?
00:12:29.560 | I think anything from emotion to abstract ideas
00:12:32.920 | is all chosen, is all a product of reason.
00:12:36.800 | And everything before that, we might've been programmed for.
00:12:42.460 | But the fact is, so clearly a sensation
00:12:45.060 | is not a product of, is something that we feel
00:12:48.740 | because that's how our biology works.
00:12:50.740 | So until we have these categories
00:12:54.260 | and until we can clearly specify what is what
00:12:58.740 | and where did they come from,
00:13:01.340 | the whole discussion in evolutionary psychology
00:13:03.260 | seems to be rambling.
00:13:04.340 | It doesn't seem to be scientific.
00:13:06.420 | So we have to define our terms,
00:13:08.900 | which is the basis of science.
00:13:10.020 | You have to have some clear definitions
00:13:12.140 | about what we're talking about.
00:13:13.740 | When you ask them these questions,
00:13:16.380 | there's never really a coherent answer
00:13:18.540 | about what it is exactly.
00:13:20.460 | And everybody is afraid of the issue of free will.
00:13:22.800 | And I think to some extent, I mean,
00:13:25.000 | Harris has this, and I don't wanna misrepresent anything
00:13:28.300 | Harris has 'cause I'm a fan and I like a lot of his stuff.
00:13:32.000 | But on the one hand, he is obviously intellectually active
00:13:37.180 | and wants to change our minds.
00:13:38.740 | So he believes that we have some capacity to choose.
00:13:41.900 | On the other hand, he's undermining that capacity to choose
00:13:44.780 | by saying it's just determinism.
00:13:45.820 | You're gonna choose what you choose.
00:13:47.380 | You have no say in it.
00:13:48.740 | There's actually no you.
00:13:50.260 | So it's, and that's to me completely unscientific.
00:13:55.900 | That's completely him pulling it out of nowhere.
00:14:00.460 | We all experience the fact that we have an "I."
00:14:03.660 | - That kind of certainty saying
00:14:05.340 | that we do not have that fundamental choice
00:14:07.820 | that reason provides is unfounded currently.
00:14:12.020 | - Look, there's a sense in which
00:14:13.260 | it can never be contradicted
00:14:15.860 | because it's a product of your experience.
00:14:20.860 | It's not a product of your experience.
00:14:21.940 | You can experience it directly.
00:14:24.140 | So no science will ever prove that this table isn't here.
00:14:28.260 | I can see it, it's here, right?
00:14:31.140 | I can feel it.
00:14:32.220 | I know I have free will 'cause I can introspect it.
00:14:36.400 | In a sense, I can see it.
00:14:37.980 | I can see myself engaging it.
00:14:41.120 | And that is as valid as the evidence of my senses.
00:14:47.400 | Now I can't point at it
00:14:49.020 | so that you can see the same thing I'm seeing,
00:14:51.420 | but you can do the same thing in your own consciousness
00:14:53.580 | and you can identify the same thing.
00:14:55.220 | And to deny that in the name of science
00:14:59.220 | is to get things upside down.
00:15:00.620 | You start with that and that's the beginning of science.
00:15:04.780 | And the beginning of science is the identification
00:15:07.320 | that I choose and that I can reason.
00:15:10.720 | And now I need to figure out the mechanism,
00:15:13.320 | the rules of reasoning, the rules of logic.
00:15:16.720 | How does this work?
00:15:17.680 | And that's where science comes from.
00:15:19.560 | - Of course, it's possible that science,
00:15:21.240 | like from my place of AI, would be able to,
00:15:24.620 | if we were able to engineer consciousness or understand,
00:15:30.720 | I mean, it's very difficult
00:15:32.120 | 'cause we're so far away from it now,
00:15:33.440 | but understand how the actual mechanism
00:15:36.500 | of that consciousness emerges,
00:15:38.580 | that in fact this table is not real,
00:15:40.580 | that we can determine that it,
00:15:43.520 | exactly how our mind constructs the reality
00:15:47.460 | that we perceive, then you can start to make interesting.
00:15:51.620 | - But our mind doesn't construct the reality that we perceive.
00:15:54.460 | The reality we perceive is there.
00:15:56.300 | We perceive a reality that exists.
00:15:59.780 | Now, and we perceive it in particular ways
00:16:02.140 | given the nature of our senses, right?
00:16:05.020 | A bat perceives this table differently,
00:16:07.180 | but it's still the same table
00:16:08.500 | with the same characteristics and the same identity.
00:16:12.860 | It's just a matter of, we use eyes,
00:16:16.100 | they use a radar system to,
00:16:17.980 | they use sound waves to perceive it,
00:16:19.840 | but it's still there.
00:16:20.780 | Existence exists whether we exist or not.
00:16:22.940 | And so you could create, I mean, I don't know how,
00:16:27.020 | and I don't know if it's possible,
00:16:28.460 | but let's say you could create a consciousness, right?
00:16:30.940 | And I suspect that to do that,
00:16:33.620 | you would have to use biology, not just electronics,
00:16:37.140 | but way outside my expertise.
00:16:39.380 | Because consciousness, as far as we know,
00:16:42.700 | is a phenomenon of life,
00:16:43.700 | and you would have to figure out how to create life
00:16:45.420 | before you created consciousness, I think.
00:16:48.420 | But if you did that, then that wouldn't change anything.
00:16:51.900 | All it would say is we have another conscious being, cool.
00:16:54.300 | That's great.
00:16:55.120 | But it wouldn't change the nature of our consciousness.
00:16:58.140 | Our consciousness is what it is.
00:17:00.500 | - But, so that's very interesting.
00:17:02.340 | I think this is a good way to set the table
00:17:05.660 | for discussion of objectivism is,
00:17:08.260 | let me at least challenge a thought experiment,
00:17:12.340 | which is, I don't know if you're familiar
00:17:14.220 | with Donald Hoffman's work about reality.
00:17:17.260 | So his idea is that we're just,
00:17:20.460 | our perception is just an interface to reality.
00:17:23.460 | - So Donald Hoffman is the guy at UC Irvine?
00:17:26.580 | - Yeah.
00:17:27.420 | - Yes, I've met Donald, and I've seen his video.
00:17:28.860 | And look, Donald has not invented anything new.
00:17:31.860 | This goes back to ancient philosophy.
00:17:34.060 | - Let me just state in case people aren't familiar.
00:17:38.180 | I mean, it's a fascinating thought experiment to me,
00:17:41.500 | like of out-of-the-box thinking, perhaps literally,
00:17:44.140 | is that there's a gap between the world as we perceive it
00:17:49.140 | and the world as it actually exists.
00:17:52.800 | And I think that's, for the philosophy of objectivism,
00:17:55.700 | there's a really important gap to close.
00:17:59.780 | So can you maybe at least try to entertain the idea
00:18:03.660 | that there is more to reality than our minds can perceive?
00:18:08.660 | - Well, I don't understand what more means, right?
00:18:13.420 | Of course there's more to reality
00:18:14.820 | than what our senses perceive.
00:18:16.460 | That is, for example, I don't know,
00:18:19.380 | certain elements have radiation, right?
00:18:24.220 | Uranium has radiation.
00:18:25.180 | I can't perceive radiation.
00:18:27.140 | The beauty of human reason is I can, through experimentation,
00:18:32.140 | discover the phenomena of radiation,
00:18:34.340 | then actually measure radiation,
00:18:36.780 | and I don't worry about it.
00:18:37.660 | I can't perceive the world
00:18:39.060 | the way a bat perceives the world,
00:18:40.420 | and I might not be able to see certain things.
00:18:43.180 | But I can, we've created radar,
00:18:44.980 | so A, we understand how a bat perceives the world,
00:18:47.860 | and I can mimic it through a radar screen
00:18:50.340 | and create images like the bat,
00:18:53.320 | its consciousness somehow perceives it, right?
00:18:55.780 | So the beauty of human reason is our capacity
00:19:00.280 | to understand the world beyond
00:19:02.860 | what our senses give us directly.
00:19:05.220 | At the end, everything comes in through our senses,
00:19:07.860 | but we can understand things
00:19:10.380 | that our senses don't provide us.
00:19:11.540 | But what he's doing is he's doing something very different.
00:19:14.940 | He is saying what our senses provides us
00:19:17.420 | might have nothing to do with the reality out there.
00:19:20.700 | That is just a random, arbitrary, nonsensical statement.
00:19:25.700 | And he actually has a whole evolutionary explanation for it.
00:19:28.880 | He runs some simulations.
00:19:30.720 | The simulations seem, I mean,
00:19:32.440 | I'm not an expert in this field,
00:19:33.760 | but they seem silly to me.
00:19:35.360 | They don't seem to reflect.
00:19:36.760 | And look, all he's doing is taking Immanuel Kant's philosophy
00:19:41.080 | which articulate exactly the same cause,
00:19:43.240 | and he's giving it a veneer of evolutionary ideas.
00:19:48.240 | I'm not an expert on evolution,
00:19:50.400 | and I'm not an expert on epistemology,
00:19:52.300 | which is what this is.
00:19:53.620 | So to me, as a semi-layman, it doesn't make any sense.
00:19:58.620 | And I'm actually, I have this Yaron Brook Show.
00:20:03.820 | I don't know if I'm allowed to pitch it,
00:20:05.540 | but I've got this Yaron Brook Show on.
00:20:06.380 | - Oh, please, first of all, let me pause.
00:20:08.020 | - On YouTube. - I'm a huge fan
00:20:09.340 | of the Yaron Brook Show. - Thank you.
00:20:11.140 | - I listen to it very often.
00:20:12.820 | As a small aside, the cool thing about reason,
00:20:18.020 | which you practice, is you have a systematic way
00:20:21.460 | of thinking through basically anything.
00:20:24.400 | - Yes.
00:20:25.480 | - And that's so fun to listen to.
00:20:27.600 | I mean, it's rare that I think there's flaws in your logic,
00:20:32.600 | but even then, it's fun,
00:20:34.560 | 'cause I'm like disagreeing with the screen.
00:20:37.320 | - And it's great when somebody disagrees with me,
00:20:39.120 | and they give good arguments,
00:20:40.560 | because that makes it challenging.
00:20:42.280 | - Anyway, sorry.
00:20:43.120 | - So one of the shows I wanna do in the next few weeks
00:20:46.340 | is bring one of my philosopher friends
00:20:49.140 | to discuss the video that Hoffman,
00:20:51.820 | where he presents his theory,
00:20:52.820 | because it surprises me how seductive it is.
00:20:57.820 | And it seems to be so, first of all,
00:21:00.780 | completely counterintuitive,
00:21:02.820 | because somehow we managed to cross the road
00:21:05.860 | and not get hit by the car,
00:21:07.020 | and if our senses did not provide us any information
00:21:10.080 | about what's actually going on in reality,
00:21:12.180 | how do we do that?
00:21:13.660 | And not to mention build computers.
00:21:16.060 | Not to mention fly to the moon
00:21:17.600 | and actually land on the moon.
00:21:18.640 | And if reality's not giving us information about the moon,
00:21:21.800 | if our senses are not giving us information about the moon,
00:21:24.620 | how did we get there?
00:21:25.820 | And where did we go?
00:21:27.120 | Maybe we didn't go anywhere.
00:21:28.860 | It's just, it's nonsensical to me,
00:21:30.920 | and it's a very bad place philosophically,
00:21:35.920 | because it basically says
00:21:38.680 | there is no objective standard for anything.
00:21:40.840 | There is no objective reality.
00:21:42.480 | You can come up with anything.
00:21:43.600 | You could argue anything,
00:21:44.680 | and there's no methodology, right?
00:21:46.400 | I believe that at the end of the day,
00:21:48.160 | what reason allows us to do
00:21:50.040 | is provide us with a methodology for truth.
00:21:52.240 | And at the end of the day, for every claim that I make,
00:21:54.760 | I should be able to boil it down to,
00:21:57.880 | see, look, the evidence of the senses is right there.
00:22:02.520 | And once you take that away,
00:22:03.840 | knowledge is gone, and truth is gone,
00:22:06.240 | and that opens it up to complete disaster.
00:22:09.680 | - So to me, why it's compelling
00:22:12.740 | to at least entertain this idea,
00:22:16.220 | first of all, it shakes up the mind a little bit
00:22:18.960 | to force you to go back to first principles
00:22:23.960 | and ask the question, what do I really know?
00:22:27.420 | And the second part of that that I really enjoy
00:22:31.500 | is it's a reminder that we know very little,
00:22:35.420 | to be a little bit more humble.
00:22:37.380 | So if reality doesn't exist at all,
00:22:40.580 | before you start thinking about it,
00:22:43.060 | I think it's a really nice wake-up call to think,
00:22:46.660 | wait a minute, I don't really know much about this universe,
00:22:51.180 | that humbleness.
00:22:52.680 | I think something I'd like to ask you about
00:22:54.980 | in terms of reason,
00:22:56.500 | when you, you can become very confident
00:22:59.940 | in your ability to understand the world
00:23:03.060 | if you practice reason often.
00:23:04.900 | And I feel like it can lead you astray
00:23:07.940 | because you can start to think,
00:23:11.020 | so I love psychology,
00:23:12.940 | and psychologists have this certainty
00:23:15.380 | about understanding the human condition,
00:23:17.860 | which is undeserved.
00:23:19.540 | You run a study with 50 people
00:23:21.580 | and you think you can understand
00:23:23.700 | the source of all these psychiatric disorders,
00:23:25.620 | all these kinds of things.
00:23:27.100 | That's a similar kind of trouble I feel like you can get into
00:23:31.820 | when you overreach with reason.
00:23:35.300 | - So I don't think there is such a thing
00:23:36.540 | as overreaching with reason,
00:23:38.260 | but there are bad applications of reason.
00:23:40.540 | There are bad uses of reason,
00:23:42.140 | or the pretense of using reason.
00:23:44.420 | I think a lot of these psychological studies
00:23:46.820 | are pretense of using reason,
00:23:48.220 | and these psychologists have never really taken
00:23:51.120 | a serious stat class or a serious econometrics class,
00:23:53.660 | so they use statistics in weird ways
00:23:55.780 | that just don't make any sense.
00:23:57.360 | And that's a misuse, that's not reason, right?
00:23:59.500 | That's just bad thinking, right?
00:24:01.140 | So I don't think you can do too much good thinking,
00:24:05.940 | and that's what reason is, it's good thinking.
00:24:08.620 | Now, the fact that you try to use reason
00:24:13.620 | does not guarantee you won't make mistakes.
00:24:17.060 | It doesn't guarantee you won't be wrong.
00:24:18.880 | It doesn't guarantee you won't go down a rabbit hole
00:24:21.820 | and completely get it wrong,
00:24:24.360 | but it does give you the only existing mechanism to fix it,
00:24:29.220 | which is going back to reality,
00:24:30.380 | going back to facts, going back to reason,
00:24:32.640 | and getting out of the rabbit hole
00:24:34.780 | and getting back to reality.
00:24:37.380 | So I agree with you that it's interesting
00:24:40.220 | to think about these, what I consider crazy ideas,
00:24:44.700 | because it, oh wait, what is my argument about them?
00:24:47.840 | If I don't really have a good argument about them,
00:24:49.900 | then do I know what I know?
00:24:51.120 | So in that sense, it's always nice to be challenged
00:24:53.940 | and pushed and oriented.
00:24:56.020 | You know, the nice thing about objectivism
00:24:57.620 | is everybody's doing that to me all the time, right?
00:25:00.320 | Because nobody agrees with me on anything,
00:25:01.820 | so I'm constantly being challenged,
00:25:04.180 | whether it's by Hoffman on metaphysics and epistemology,
00:25:08.260 | right, on the very foundations of my knowledge,
00:25:10.020 | in ethics, everybody constantly,
00:25:12.260 | and in politics all the time.
00:25:14.060 | So I find that it's part of, you know,
00:25:18.540 | I prefer that everybody,
00:25:19.580 | there's a sense in which I prefer
00:25:20.900 | that everybody agreed with me, right?
00:25:22.580 | Because I think we'd live in a better world,
00:25:24.460 | but there's a sense in which that disagreement makes it,
00:25:27.680 | at least up to a point, makes it interesting and challenging
00:25:31.300 | and forces you to be able to rethink
00:25:35.620 | or to confirm your own thinking
00:25:37.300 | and to challenge that thinking.
00:25:39.540 | - Can you try to do the impossible task
00:25:42.220 | and give a whirlwind introduction to Ayn Rand,
00:25:45.760 | the many sides of Ayn Rand?
00:25:49.260 | So Ayn Rand, the human being, Ayn Rand, the novelist,
00:25:53.820 | and Ayn Rand, the philosopher.
00:25:56.040 | So who was Ayn Rand?
00:25:57.780 | - Sure, so her life story is one that I think is fascinating
00:26:02.780 | but it also lends itself to this integration
00:26:07.420 | of all of these things.
00:26:08.860 | She was born in St. Petersburg, Russia in 1905
00:26:12.620 | to kind of a middle-class family, Jewish family.
00:26:16.460 | They owned a pharmacy, her father owned a pharmacy.
00:26:19.740 | And, you know, she grew up, she was a very,
00:26:27.420 | she knew what she wanted to do
00:26:29.020 | and what she wanted to be from a very young age.
00:26:31.220 | I think from the age of nine,
00:26:32.380 | she knew she wanted to be a writer.
00:26:33.620 | She wanted to write stories.
00:26:35.020 | That was the thing she wanted to do.
00:26:37.380 | And, you know, she focused her life after that
00:26:41.520 | on this goal of, I want to be a novelist, I want to write.
00:26:44.800 | And the philosophy was incidental to that, in a sense,
00:26:50.020 | at least until some point in her life.
00:26:52.700 | She witnessed the Russian Revolution,
00:26:55.700 | literally it happened outside.
00:26:57.020 | They lived in St. Petersburg
00:26:59.340 | where the first kind of demonstrations
00:27:01.380 | and of the revolution happened.
00:27:03.460 | So she witnessed it.
00:27:04.540 | She lived through it as a teenager,
00:27:06.740 | went to school under the Soviets.
00:27:10.900 | For a while, they were under kind of the,
00:27:14.260 | on the Black Sea where the opposition government was ruling
00:27:18.100 | and then they would go back and forth
00:27:19.780 | between the communists and the whites.
00:27:21.500 | But she experienced what communism was like.
00:27:23.820 | She saw the pharmacy being taken away from her family.
00:27:26.700 | She saw their apartment being taken away
00:27:28.900 | or other families being brought into the apartment
00:27:31.700 | they already lived in.
00:27:32.800 | And it was very clear, given her nature,
00:27:38.620 | given her views, even at a very young age,
00:27:42.080 | that she would not survive this system.
00:27:44.660 | So a lot of effort was put into how do we get,
00:27:47.260 | how does she get out?
00:27:48.380 | And her family was really helpful in this.
00:27:51.260 | And she had a cousin in Chicago
00:27:54.100 | and she had been studying kind of film at the university.
00:27:58.420 | - This is in her 20s?
00:28:00.460 | - This is in her 20s, early 20s.
00:28:03.220 | And Lenin, there was a small window
00:28:06.820 | where Lenin was allowing some people
00:28:09.260 | to leave under certain circumstances.
00:28:12.460 | And she managed to get out to go do research on film
00:28:15.820 | in the United States.
00:28:17.140 | Everybody knew, everybody who knew her
00:28:19.180 | knew she would never come back,
00:28:21.120 | that this was a one-way ticket.
00:28:22.500 | And she got out, she made it to Chicago,
00:28:24.500 | spent a few weeks in Chicago, and then headed to Hollywood.
00:28:28.820 | She wanted to write scripts.
00:28:30.180 | That was the goal.
00:28:32.440 | Here's this short woman from Russia
00:28:36.380 | with a strong accent, learning English,
00:28:39.740 | showing up in Hollywood, and I wanna be a script writer.
00:28:43.540 | - In English?
00:28:44.460 | - In English, writing in English.
00:28:46.540 | And this is kind of one of these fairy tale stories,
00:28:51.160 | but it's true, she shows up at the Cecil B. DeMille Studios.
00:28:55.460 | And she has a letter of introduction
00:28:58.380 | from her cousin in Chicago who owns a movie theater.
00:29:02.060 | And this is in the late 1920s.
00:29:05.940 | And she shows up there with this letter,
00:29:07.660 | and they say, "Don't call us, we'll call you," kind of thing.
00:29:10.420 | And she steps out, and there's this massive convertible.
00:29:15.420 | And in the convertible is Cecil B. DeMille.
00:29:18.060 | And he's driving slowly past her
00:29:20.140 | right at the entrance of the studio,
00:29:21.540 | and she stares at him, and he stops the car,
00:29:23.260 | and he says, "Why are you staring at me?"
00:29:25.760 | And she says, she tells him she's from Russia,
00:29:28.220 | and I wanna make it in the movies,
00:29:30.100 | I wanna be a script writer one day.
00:29:31.580 | And he says, "Well, if you want that, get in the car."
00:29:35.060 | She gets in the car, and he takes her
00:29:37.060 | to the back lot of his studio
00:29:38.940 | where they're filming "The King of Kings,"
00:29:40.380 | the story of Jesus.
00:29:41.980 | And he says, "Here's a pass for a week.
00:29:45.180 | If you wanna write for the movies,
00:29:47.300 | you better know how movies are made."
00:29:49.700 | And she basically spends a week,
00:29:51.540 | and then she spends more time there.
00:29:53.140 | She managed to get an extension.
00:29:54.500 | She lands up being an extra in the movie,
00:29:56.100 | so you can see Ayn Rand there,
00:29:58.260 | as one of the masses when Jesus is walking by.
00:30:03.020 | She meets her future husband on the sets
00:30:05.580 | of "The King of Kings."
00:30:07.340 | She lands up getting married,
00:30:09.180 | getting her American citizenship that way.
00:30:12.140 | And she lands up doing odds and ends jobs in Hollywood,
00:30:15.860 | living in a tiny little apartment.
00:30:19.380 | Somehow making a living.
00:30:20.700 | Her husband was an actor.
00:30:22.020 | He was a struggling actor, difficult times.
00:30:26.620 | And in the evenings, studying English,
00:30:28.780 | writing, writing, writing, writing,
00:30:30.460 | and studying, and studying, and studying.
00:30:31.900 | She finally makes it by writing a play
00:30:34.700 | that is successful in LA,
00:30:39.420 | and ultimately goes to Broadway.
00:30:41.240 | And she writes her first novel,
00:30:45.420 | a novel called "We the Living,"
00:30:47.720 | which is the most autobiographical of all her novels.
00:30:50.820 | It's about a young woman in the Soviet Union.
00:30:54.380 | It's a powerful story, a very moving story,
00:30:58.140 | and probably, if not the best,
00:31:01.300 | one of the best portrayals of life under communism.
00:31:05.020 | - So you would recommend the book?
00:31:06.500 | - Definitely recommend "We the Living."
00:31:08.020 | It's her first novel.
00:31:09.460 | She wrote it in the '30s, and it didn't go anywhere,
00:31:13.260 | because if you think about the intelligentsia,
00:31:16.060 | the people who mattered, the people who wrote book reviews,
00:31:20.320 | this is a time of Duranty,
00:31:23.400 | who's the "New York Times" guy in Moscow,
00:31:25.940 | who's praising Stalin to the hills, and the success.
00:31:29.760 | So the novel fails, but she's got a novel out.
00:31:34.280 | She writes a small novelette called "Anthem."
00:31:36.840 | A lot of people have read that,
00:31:38.200 | and it's read in high schools.
00:31:39.840 | It's kind of a dystopian novel,
00:31:42.200 | and it doesn't get published in the US.
00:31:45.840 | It gets published in the UK.
00:31:47.120 | UK is very interested in dystopian novels.
00:31:50.200 | "Animal Farm" in 1984, '84,
00:31:54.680 | is published a couple of years after, I think, after "Anthem."
00:31:58.920 | There's reason to believe he read "Anthem."
00:32:01.700 | - And George Orwell read that for "Animal Farm."
00:32:06.560 | - Yeah.
00:32:07.680 | - Just as a small aside, "Animal Farm" is probably top.
00:32:11.140 | I mean, it's weird to say,
00:32:13.000 | but I would say it's my favorite book, which is--
00:32:14.920 | - Have you seen this movie out now called "Mr. Jones?"
00:32:17.600 | - No.
00:32:18.440 | - Oh, you've got to see "Mr. Jones."
00:32:19.600 | - What's "Mr. Jones?"
00:32:21.080 | - It's a-- - Sorry for my ignorance.
00:32:22.960 | - No, no, it's a movie, and it hasn't got any publicity,
00:32:25.640 | which is tragic, 'cause it's a really good movie.
00:32:28.200 | It's both brilliantly made.
00:32:29.660 | It's made by a Polish director, but it's in English.
00:32:32.960 | It's a true story, and George Orwell's "Animal Farm"
00:32:36.440 | is featured in it, in the sense that during the story,
00:32:40.240 | George Orwell was writing "Animal Farm,"
00:32:42.560 | and the narrator is reading off sections of "Animal Farm"
00:32:47.360 | as the movie is progressing.
00:32:49.360 | And the movie is a true story
00:32:50.960 | about the first Western journalist to discover
00:32:55.720 | and to write about the famine in Ukraine.
00:32:57.760 | And so he goes to Moscow, and then he gets on a train,
00:33:01.040 | and he finds himself in Ukraine,
00:33:02.240 | and it's beautifully and horrifically made.
00:33:05.760 | So the horror of the famine is brilliantly conveyed,
00:33:10.480 | and it's a true story, so it's a very moving story,
00:33:13.000 | very powerful story, and just very well-made movie.
00:33:16.560 | So it's tragic, in my view, that not more people are seeing it.
00:33:19.880 | - That's interesting.
00:33:20.720 | I was actually recently just complaining
00:33:23.240 | that there's not enough content on the famine,
00:33:26.920 | the '30s and stuff.
00:33:29.440 | There's so much on Hitler.
00:33:30.440 | Like, I love the reading.
00:33:32.520 | I'm reading, it's so long.
00:33:34.640 | It's been taking me forever.
00:33:35.880 | "The Rise and Fall of the Third Reich," yeah, I love it.
00:33:39.440 | I've got the book to complement that that you have to read.
00:33:42.160 | It's called "The Ominous Parallels."
00:33:44.160 | It's by Leonard Peikoff, and it's "The Ominous Parallels,"
00:33:47.080 | and it's about the causes of the rise of Hitler,
00:33:52.080 | but the philosophical causes.
00:33:54.240 | So whereas "The Rise and Fall" is more of a kind of
00:33:58.480 | existential account of what happened,
00:34:00.880 | this really delves into the intellectual currents
00:34:07.720 | that led to the rise of Hitler.
00:34:09.240 | - And maybe-- - Highly recommend that.
00:34:11.280 | - And basically suggesting how it might rise another--
00:34:15.800 | - That's "The Ominous Parallels."
00:34:17.200 | So the parallel he draws is to the United States,
00:34:20.480 | and he says those same intellectual forces
00:34:22.640 | are rising in the United States,
00:34:23.840 | and this was published, I think, in '82.
00:34:28.840 | It was published in '82.
00:34:30.260 | So it was published a long time ago,
00:34:31.760 | and yet you look around us,
00:34:34.600 | and it's unbelievably predictive, sadly,
00:34:37.220 | about the state of the world.
00:34:38.740 | - So I haven't finished the Ayn Rand story.
00:34:40.120 | I don't know if you want me to--
00:34:41.360 | - No, no, but on that point, I'll have to,
00:34:44.360 | let's please return to it, but let's now,
00:34:46.800 | for now, let's talk--
00:34:47.640 | - But let me also say, just because I don't want
00:34:50.160 | to forget about Mr. Jones, it is true, the point you made,
00:34:54.440 | that there are tons of movies that are anti-fascist, anti-Nazi,
00:34:58.520 | and that's good, but there are way too few movies
00:35:02.640 | that are anti-communist, just almost none.
00:35:05.520 | And it's very interesting, and if you remind me later,
00:35:08.000 | I'll tell you a story about that.
00:35:09.300 | But so she publishes "Anthem," and then she starts,
00:35:13.920 | and she's doing okay in Hollywood,
00:35:15.720 | and she's doing okay with the play,
00:35:18.060 | and then she starts on the book "The Fountainhead,"
00:35:21.620 | and she writes "The Fountainhead," and it comes out,
00:35:25.260 | she finishes it in 1945, and she sends it to publishers,
00:35:30.260 | and publisher after publisher after publisher turn it down.
00:35:37.100 | And it takes 12 publishers before this editor reads it
00:35:41.860 | and says, "I want to publish this book."
00:35:44.800 | And he basically tells his bosses,
00:35:47.040 | "If you don't publish this book, I'm leaving."
00:35:49.820 | And they don't really believe in the book,
00:35:54.740 | so they publish just a few copies, they don't do a mat,
00:35:58.640 | and the book becomes a bestseller from word of mouth,
00:36:00.860 | and they land up having to publish more and more and more.
00:36:03.620 | And she's basically gone from this immigrant
00:36:07.540 | who comes here with very little command of English,
00:36:10.060 | and to all kinds of odds and ends jobs in Hollywood,
00:36:14.860 | to writing one of the seminal, I think, American books.
00:36:19.860 | She is an American author.
00:36:24.060 | I mean, if you read "The Fountainhead," it's not Russian.
00:36:27.540 | This is not Dostoyevsky.
00:36:29.220 | - It feels like a symbol of what America is
00:36:32.780 | in the 20th century.
00:36:34.380 | And I mean, probably, maybe you can...
00:36:38.260 | So there's a famous kind of sexual rape scene in there.
00:36:42.500 | Is that like a lesson you want to throw in
00:36:44.780 | some controversial stuff to make
00:36:47.060 | your philosophical books work out?
00:36:49.340 | I mean, why was it so popular?
00:36:51.980 | Do you have a sense?
00:36:53.580 | - Well, because I think it illustrated...
00:36:55.780 | First of all, 'cause I think the character's fantastic.
00:36:58.940 | It's got a real hero.
00:37:01.340 | And I think the whole book is basically
00:37:03.820 | illustrating this massive conflict
00:37:05.860 | that I think went on in America then, is going on today,
00:37:09.300 | and it goes on in a big scale politics,
00:37:12.460 | all the way down to the scale of the choices
00:37:14.720 | you make in your life.
00:37:16.160 | And the issue is individualism versus collectivism.
00:37:20.160 | Should you live for yourself?
00:37:22.380 | Should you live for your values?
00:37:23.580 | Should you pursue your passions?
00:37:25.460 | Or should you do what your mother tells you?
00:37:29.020 | Should you follow your mother's passions?
00:37:31.580 | And that's...
00:37:33.700 | And it's very, very much a book about individuals,
00:37:38.700 | and people relate to that.
00:37:42.600 | But it obviously has this massive implications
00:37:45.660 | to the world outside.
00:37:47.060 | And at the time of collectivism just having been defeated,
00:37:50.980 | communism, well, fascism,
00:37:53.580 | and the United States representing individualism
00:37:58.620 | has defeated collectivism.
00:38:01.420 | But where collectivist ideas are still popular
00:38:03.820 | in the form of socialism and communism,
00:38:06.260 | and for the individual,
00:38:08.100 | this constant struggle between what people tell me to do,
00:38:10.940 | what society tells me to do,
00:38:12.100 | what my mother tells me to do,
00:38:13.260 | and what I think I should do,
00:38:15.260 | I think it's unbelievably appealing,
00:38:17.580 | particularly to young people
00:38:18.860 | who are trying to figure out what they wanna do in life,
00:38:21.500 | trying to figure out what's important in life.
00:38:23.740 | It had this enormous appeal.
00:38:26.420 | It's romantic, it's bigger than life.
00:38:28.300 | The characters are big heroes.
00:38:29.940 | It's very American in that sense.
00:38:31.660 | It's about individualism.
00:38:32.940 | It's about the triumph of individualism.
00:38:35.500 | And so I think that's what related.
00:38:38.900 | And it had this big romantic element from the...
00:38:42.980 | I mean, when I use romantic,
00:38:44.260 | I use it kind of in the sense of a movement in art.
00:38:49.260 | But it also has this romantic element
00:38:51.620 | in the sense of a relationship between a man and a woman
00:38:54.180 | who's, that's very intriguing.
00:38:55.640 | It's not only that there's a,
00:38:58.140 | I would say, almost rape scene, right?
00:39:00.420 | I would say, but it's also that this woman
00:39:03.620 | is hard to understand.
00:39:04.940 | I mean, I've read it more than once,
00:39:06.860 | and I still can't quite figure out Dominique, right?
00:39:09.460 | Because she loves him and she wants to destroy him,
00:39:11.620 | and she marries other people.
00:39:12.980 | I mean, think about that too.
00:39:14.220 | Here, she's writing a book in the 1940s.
00:39:16.220 | There's lots of sex.
00:39:19.820 | There's a woman who marries more than one person,
00:39:23.660 | is having sex with more than one person.
00:39:25.840 | Very unconventional.
00:39:27.320 | She's having sex with Roark,
00:39:29.860 | even though she's not married to Roark.
00:39:31.080 | This is 1945.
00:39:33.100 | And it's very jarring to people.
00:39:36.920 | It's very unexpected, but it's also a book of its time.
00:39:39.840 | It's about individuals pursuing their passion,
00:39:42.400 | pursuing their life, and not caring about convention
00:39:45.560 | and what people think, but doing what they think is right.
00:39:52.780 | So I think it's, I encourage everybody to read it,
00:39:55.500 | obviously.
00:39:56.340 | - So was that the first time she articulated
00:39:59.380 | something that sounded like a philosophy of individualism?
00:40:04.940 | - I mean, the philosophy's there in "We the Living," right?
00:40:08.680 | Because at the end of the day, the woman is,
00:40:12.660 | the hero of "We the Living" is this individualist
00:40:16.100 | stuck in the Soviet Union, so she's struggling
00:40:18.340 | with these things.
00:40:20.300 | So the theme is there already.
00:40:22.440 | It's not as fleshed out.
00:40:23.960 | It's not as articulated philosophically.
00:40:26.400 | And it's certainly there in "Anthem,"
00:40:27.720 | which is a dystopian novel, where this dystopia
00:40:31.400 | in the future has, there's no I.
00:40:35.720 | Everything is we.
00:40:36.720 | And it's about one guy who breaks out of that.
00:40:40.960 | I don't wanna give it away, but breaks out of that.
00:40:43.500 | So these themes are running, and then we have,
00:40:48.080 | and they've been published, some of the early Ayn Rand
00:40:50.800 | stories that she was writing in preparation
00:40:53.960 | for writing her novel, stories she was writing
00:40:56.000 | when she first came to America.
00:40:57.320 | And you can see these same philosophical elements,
00:41:01.160 | even in the male-female relationships,
00:41:03.840 | and the compassion, and the, you know,
00:41:07.160 | in the conflict, you see them even in those early pieces.
00:41:10.880 | And she's just developing them.
00:41:12.360 | It's the same philosophically.
00:41:13.920 | She's developing her philosophy with her literature.
00:41:17.800 | And of course, after "The Fountainhead,"
00:41:20.040 | she starts on what turns out to be her magnum opus,
00:41:22.520 | which is "Atlas Shrugged," which takes her 12 years
00:41:25.500 | to publish.
00:41:26.340 | By the time, of course, she brings that out,
00:41:28.680 | every publisher in New York wants to publish it,
00:41:31.040 | because "The Fountainhead" has been such a huge success.
00:41:34.240 | They don't quite understand it.
00:41:35.360 | They don't know what to do with "Atlas Shrugged,"
00:41:37.120 | but they're eager to get it out there.
00:41:39.760 | And indeed, when it's published,
00:41:41.320 | it becomes an instant bestseller.
00:41:43.560 | And the thing about the, particularly "The Fountainhead"
00:41:45.600 | and "Atlas Shrugged," but true of even "Anthem"
00:41:47.920 | and "We the Living," she is one of the only dead authors
00:41:52.920 | that sell more after they've died
00:41:55.400 | than when they were alive.
00:41:56.320 | Now, you know, that's true maybe in music.
00:41:58.520 | We listen to more Beethoven than when he was alive,
00:42:00.440 | but it's not true typically of novelists.
00:42:03.000 | And yet here we are, you know, what is it, 50,
00:42:08.000 | you know, 60 years after, 63 years after the publication
00:42:11.520 | of "Atlas Shrugged," and it sells probably more today
00:42:15.200 | than it sold when it was a bestseller
00:42:16.520 | when it first came out.
00:42:17.600 | - Is it true that it's like one of the most sold books
00:42:21.600 | in history?
00:42:22.440 | - No. - Okay.
00:42:23.840 | I've heard this kind of statement.
00:42:25.200 | - Any Tom Clancy book comes out,
00:42:27.200 | sells more than "Atlas Shrugged."
00:42:28.600 | - But I've never read, I've heard statements like this.
00:42:31.680 | - So there was a very, and I shouldn't say this,
00:42:33.880 | but it's the truth, so I'll say it,
00:42:35.320 | a very unscientific study done by the Smithsonian Institute,
00:42:40.320 | probably in the early '90s,
00:42:42.760 | that basically surveyed CEOs and asked them,
00:42:47.280 | what was the most influential book on you?
00:42:49.340 | And "Atlas Shrugged" came out as number two,
00:42:53.520 | the second most influential book on CEOs in the country.
00:42:57.080 | But there's so many flaws in the study.
00:42:58.720 | One was, you want to guess what the number one book?
00:43:01.480 | - Bible? - The Bible.
00:43:03.200 | But the Bible was like, you know,
00:43:05.760 | so maybe they surveyed 100 people.
00:43:07.080 | I don't know what the exact numbers were,
00:43:07.920 | but let's say it's 100 people,
00:43:09.880 | and 60 said the Bible, and 10 said "Atlas Shrugged,"
00:43:12.960 | and there were a bunch of books over here.
00:43:15.040 | So, you know, I don't--
00:43:16.280 | - That's, again, the psychology discussion
00:43:18.080 | we're having right now.
00:43:18.920 | - Exactly, well, and it's, one thing I've learned,
00:43:21.600 | and maybe COVID has taught me,
00:43:23.200 | and nobody, you know, there are very few people
00:43:27.040 | who know how to do statistics,
00:43:29.560 | and almost nobody knows how to think probabilistically.
00:43:33.360 | That is, think in terms of probabilities,
00:43:35.560 | that it is a skill, it's a hard skill.
00:43:38.220 | And everybody thinks they know it,
00:43:39.600 | but I see doctors thinking they're statisticians,
00:43:42.320 | and giving whole analyses of the data on COVID,
00:43:45.400 | and they don't have a clue what they're talking about.
00:43:46.960 | Not because they're not good doctors,
00:43:48.520 | but because they're not good statisticians.
00:43:49.960 | It's not, you know, people think that they have one skill,
00:43:53.680 | and therefore it translates immediately into another skill,
00:43:55.960 | and it's just not true.
00:43:57.280 | So I've been astounded at how bad people are at that.
00:44:01.840 | - For people who haven't read any of the books
00:44:05.800 | that we were just discussing,
00:44:09.240 | what would you recommend,
00:44:11.480 | what book would you recommend they read?
00:44:14.120 | And maybe also just elaborate,
00:44:17.160 | what mindset should they enter
00:44:20.640 | the reading of that book with?
00:44:22.820 | - So I would recommend everybody read
00:44:25.240 | "Fountainhead" and "Atlas Shrugged."
00:44:26.920 | And in one-- - In that order?
00:44:29.000 | - So it would depend on where you are in life, right?
00:44:31.640 | So it depends on who you are and what you are.
00:44:35.280 | So "Fountainhead" is a more personal story.
00:44:38.200 | For many people, it's their favorite,
00:44:39.680 | and for many people, it was their first book,
00:44:41.440 | and they wouldn't replace that, right?
00:44:44.000 | If "Atlas Shrugged" is a, it's about the world.
00:44:50.840 | It's about what impacts the world,
00:44:54.040 | how the world functions, how it's a bigger book
00:44:57.760 | in the sense of the scope.
00:44:59.200 | If you're interested in politics,
00:45:01.600 | and you're interested in the world,
00:45:03.640 | read "Atlas Shrugged" first.
00:45:05.480 | If you're mainly focused on your life, your career,
00:45:08.120 | what you wanna do with yourself, start with "Fountainhead."
00:45:10.400 | I still think you should read both,
00:45:12.040 | because I think they are, I mean, to me,
00:45:14.520 | they were life-altering, and to many, many people,
00:45:17.880 | they're life-altering, and you should go into reading them
00:45:20.760 | with an open mind, I'd say, and with a,
00:45:24.520 | put aside everything you've heard about Ayn Rand.
00:45:27.040 | Put aside any, even if it's true, just put it aside.
00:45:30.560 | Even what I just said about Ayn Rand, put it aside.
00:45:33.280 | Just read the book as a book, and let it move you,
00:45:36.920 | and let your thoughts, let it shape how you think,
00:45:41.720 | and it'll have, it either have,
00:45:46.440 | you'll either have a response to it, or you won't.
00:45:49.000 | But I think most people have a very strong response to it.
00:45:52.160 | And then the question is, do they,
00:45:55.560 | are they willing to respond to the philosophy?
00:45:57.320 | Are they willing to integrate the philosophy?
00:45:58.720 | Are they willing to think through the philosophy, or not?
00:46:01.720 | Because I know a lot of people who completely disagree
00:46:03.960 | with the philosophy, right?
00:46:06.360 | Here in Hollywood, right?
00:46:07.640 | Lots of people here in Hollywood love "The Fountainhead."
00:46:11.400 | - Interesting.
00:46:12.240 | - Oliver Stone, who is, I think, an avowed Marxist, right?
00:46:16.760 | I think he's admitted to being a Marxist.
00:46:19.440 | He is, his movies certainly reflect the Marxist theme,
00:46:23.240 | is a huge fan of "The Fountainhead,"
00:46:27.320 | and is actually, his dream project, he has said in public,
00:46:30.400 | his dream project is to make "The Fountainhead."
00:46:33.080 | Now, he would completely change it, as movie directors do,
00:46:37.560 | and he's actually outlined what his script would look like,
00:46:40.080 | and it would be a disaster for the ideas of the book,
00:46:43.520 | but he loves the story, because to him,
00:46:45.280 | the story is about artistic integrity.
00:46:47.520 | - Yeah.
00:46:49.320 | - And that's what he latches onto.
00:46:50.240 | And what he hates about the story is the individualism.
00:46:53.120 | And I think that his movie ends with Howard Roark
00:46:56.920 | joining some kind of commune of architects
00:46:59.520 | that do it for the love and don't do it for the money.
00:47:02.120 | - Interesting.
00:47:03.080 | So yeah, so he can connect with you without the philosophy.
00:47:05.160 | And before we get into the philosophy,
00:47:07.800 | staying on Ayn Rand,
00:47:09.160 | I'll tell you sort of my own personal experience,
00:47:12.760 | and I think it's one that people share.
00:47:15.120 | I've experienced this with two people, Ayn Rand and Nietzsche.
00:47:18.120 | When I brought up Ayn Rand when I was in my early 20s,
00:47:24.000 | the number of eye rolls I got from advisors and so on,
00:47:30.800 | that kind of dismissal.
00:47:34.280 | I've seen that later in life about more specific concepts
00:47:38.200 | in artificial intelligence and technical fields,
00:47:39.920 | where people decide that this set of ideas
00:47:43.400 | is acceptable, and that set of ideas is not.
00:47:46.320 | And they dismissed Ayn Rand
00:47:48.880 | without giving me any justification
00:47:52.960 | of why they dismissed her,
00:47:55.680 | except, oh, that's something you're into
00:47:58.800 | when you're 19 or 20.
00:48:01.800 | Same thing people say about Nietzsche.
00:48:03.480 | Well, that's just something you do when you're in college
00:48:06.520 | and you take an intro to philosophy course.
00:48:09.600 | So I've never really heard anybody cleanly articulate
00:48:14.600 | their opposition to Ayn Rand
00:48:18.400 | in my own private little circles and so on.
00:48:21.240 | Maybe one question I just want to ask is,
00:48:23.840 | why is there such opposition to Ayn Rand?
00:48:28.600 | And maybe another way to ask the same thing is,
00:48:31.560 | what's misunderstood about Ayn Rand?
00:48:35.120 | - So we haven't talked about the philosophy,
00:48:37.400 | so it's harder to answer right now.
00:48:39.000 | - We can return to it if you think
00:48:40.160 | that's the right way to go.
00:48:41.400 | - Well, let me give a broad answer,
00:48:43.320 | and then we'll do the philosophy,
00:48:45.120 | and then we'll return to it,
00:48:45.960 | 'cause I think it's important to know
00:48:47.440 | something about her ideas.
00:48:49.400 | She, I think her philosophy challenges everything.
00:48:54.400 | It really does.
00:48:56.840 | It shakes up the world.
00:48:58.200 | It challenges so many of our preconceptions.
00:49:01.520 | It challenges so many of the things
00:49:03.520 | that people take for granted as truth.
00:49:06.440 | From religion to morality to politics to almost everything,
00:49:11.640 | there's never quite been a thinker like her
00:49:13.760 | in the sense of really challenging everything
00:49:17.240 | and doing it systematically,
00:49:18.520 | and having a complete philosophy
00:49:21.400 | that is a challenge to everything that has come before her.
00:49:23.960 | Now, I'm not saying there aren't threads that connect.
00:49:27.680 | There are, right?
00:49:28.520 | In politics, there might be a thread,
00:49:30.120 | and in morality, there might be a thread,
00:49:31.760 | but on everything, there's just never been like it,
00:49:34.800 | and people are afraid of that
00:49:37.960 | because it challenges them to the core.
00:49:39.800 | She's basically telling you to rethink almost everything,
00:49:42.640 | and that is, that people reject.
00:49:47.800 | The other thing that it does,
00:49:49.560 | and this goes to this point about,
00:49:51.560 | oh yeah, that's what you do when you're 14, 15, right?
00:49:55.600 | She points out to them that they've lost something.
00:49:59.540 | They've lost their idealism.
00:50:02.040 | They've lost their youthful idealism.
00:50:05.960 | What makes youthfulness meaningful,
00:50:10.000 | other than that we're in better physical shape,
00:50:13.320 | which I'm starting to feel, 'cause I'm getting older?
00:50:15.440 | When we're young, sometime in the teen years,
00:50:21.400 | there's something that happens to human consciousness.
00:50:24.560 | We almost awaken anew, right?
00:50:27.240 | We suddenly discover that we can think for ourselves.
00:50:30.800 | We suddenly discover that not everything
00:50:33.400 | our parents and our teachers tell us is true.
00:50:36.160 | We suddenly discover that this tool, our minds,
00:50:39.440 | is suddenly available to us to discover the world
00:50:42.680 | and to discover truth, and it is a time of idealism.
00:50:46.640 | It's a time of, whoa, I wanna, you know,
00:50:49.760 | the better teenagers, I wanna know about the world.
00:50:52.240 | I wanna go out there.
00:50:53.200 | I don't believe my parents.
00:50:54.320 | I don't believe my teachers, and this is healthy.
00:50:56.360 | This is fantastic, and I wanna go out there and experiment,
00:50:59.800 | and that gets us into trouble, right?
00:51:01.440 | We do stupid things when we're teenagers.
00:51:03.400 | Why? 'Cause we're experimenting.
00:51:05.080 | It's the experiential part of it, right?
00:51:06.740 | We wanna go and experience life, but we're learning.
00:51:10.000 | It's part of the learning process,
00:51:11.400 | and we become risk-takers because we wanna experience,
00:51:15.400 | but the risk is something we need to learn,
00:51:16.960 | 'cause we need to learn where the boundaries are,
00:51:19.360 | and one of the damages that helicopter parents do
00:51:21.780 | is they prevent us from taking those risks
00:51:23.200 | so we don't learn about the world
00:51:24.400 | and we don't learn about where the boundaries are,
00:51:26.560 | so the teenage years are these years of wonder.
00:51:30.400 | They're depressing when you're in them
00:51:32.720 | for a variety of reasons,
00:51:33.720 | which I think primarily have to do with the culture,
00:51:35.440 | but also with oneself, but they are exciting,
00:51:39.640 | the periods of discovery,
00:51:42.000 | and people get excited about ideas,
00:51:45.440 | and good ideas, bad ideas, all kinds of ideas,
00:51:48.640 | and then what happens?
00:51:50.280 | We settle, we compromise.
00:51:53.760 | Whether that happens in college,
00:51:55.800 | where we're taught that nothing exists and nothing matters,
00:51:58.160 | and start being, be a nihilist, be a cynic, be whatever,
00:52:01.900 | or whether it happens when we get married and get a job
00:52:04.400 | and have kids and are too busy
00:52:05.680 | and can't think about our ideals and forget
00:52:07.480 | and just get into the norm of conventional life,
00:52:10.460 | or whether it's because a mother pesters us
00:52:14.000 | to get married and have kids
00:52:15.060 | and do all the things that she wanted us to do,
00:52:17.920 | we give up on those ideals,
00:52:20.360 | and there's a sense in which Ayn Rand reminds them
00:52:24.880 | that they gave up.
00:52:25.920 | - That's beautifully, that's so beautifully put,
00:52:27.840 | and so true.
00:52:28.920 | It's worth pausing on,
00:52:34.320 | that this dismissal,
00:52:38.520 | people forget the beauty of that curiosity.
00:52:41.640 | That's true in the scientific field too,
00:52:47.440 | that youthful joy of everything is possible
00:52:51.880 | and we can understand it with the tools of our mind.
00:52:56.200 | - Yes, and that's what it's all about.
00:52:57.880 | That's what Ayn Rand's ideas at the end of the day
00:52:59.580 | all boil down to, is that confidence and that passion
00:53:02.560 | and that curiosity and that interest.
00:53:05.320 | And if you think about what academia does
00:53:08.840 | to so many of us.
00:53:10.320 | We go into academia and we're excited about it.
00:53:12.760 | We're gonna learn stuff, we're gonna discover things.
00:53:16.200 | And then they stick you into a sub-sub-field,
00:53:18.400 | examining some minutiae that's insignificant
00:53:21.520 | and unimportant.
00:53:22.840 | And to get published, you have to be conventional,
00:53:25.560 | you have to do what everybody else does.
00:53:27.080 | And then there's the tenure process of seven years
00:53:29.640 | where they put you through this torture to write papers
00:53:32.200 | that fit into a certain mold.
00:53:34.160 | And by the time you're done, you're in your mid-30s
00:53:38.080 | and you've done nothing, you discovered nothing.
00:53:39.900 | You're all in this minutiae in this stuff
00:53:43.040 | and it's destructive.
00:53:44.320 | And with holding onto that passion,
00:53:48.080 | holding onto that knowledge and that confidence is hard.
00:53:52.200 | And when people do away with it, they become cynical.
00:53:55.480 | And they become part of the system
00:53:57.320 | and they inflict the same pain on the next guy
00:54:00.320 | that they suffered because that's part of how it works.
00:54:03.440 | - Yeah, this happens in artificial intelligence.
00:54:06.020 | This happens when a young person shows up
00:54:08.920 | and with fire in their eyes and they say,
00:54:11.600 | I wanna understand the nature of intelligence.
00:54:14.600 | And everybody rolls their eyes.
00:54:17.120 | Well, for these same reasons,
00:54:20.320 | because they've spent so many years
00:54:21.880 | on the very specific set of questions
00:54:25.120 | that they compete over and they write papers over
00:54:30.120 | and they have conferences about.
00:54:31.720 | And it's true, that incremental research
00:54:34.100 | is the way you make progress,
00:54:35.280 | answering the question of what is intelligence
00:54:37.200 | is exceptionally difficult.
00:54:38.880 | But when you mock it, you actually destroy the realities.
00:54:43.880 | When we look centuries from now,
00:54:47.920 | we'll look back at this time
00:54:49.360 | for this particular field of artificial intelligence,
00:54:52.680 | and the people who will be remembered
00:54:55.560 | will be the people who asked the question,
00:54:58.080 | who made it their life's journey to ask what intelligence is,
00:55:01.560 | and actually had a chance to succeed.
00:55:04.800 | Most who ask that question will fail,
00:55:06.880 | but the ones who had a chance of succeeding
00:55:09.520 | are the ones who carried it throughout their whole life.
00:55:11.840 | And I suppose the same is true for philosophy.
00:55:15.120 | - It's in every field.
00:55:16.080 | It's asking the big questions and staying curious
00:55:20.200 | and staying passionate and staying excited
00:55:22.880 | and accepting failure, right?
00:55:26.120 | Accepting that you're not gonna get it first time,
00:55:27.920 | you're not gonna get the whole thing.
00:55:29.640 | But and sometimes you have to do the minutiae work
00:55:31.880 | and I'm not here to say nobody should specialize
00:55:34.320 | and you shouldn't do the minutiae, you have to do that.
00:55:36.680 | But there has to be a way to do that work
00:55:38.800 | and keep the passion and keep it all integrated.
00:55:41.920 | That's another thing.
00:55:42.760 | I mean, we don't live in a culture that integrates, right?
00:55:46.440 | We live in a culture that is all about this minutiae
00:55:51.040 | and medicine is another field
00:55:53.560 | where you specialize in the kidney.
00:55:55.360 | I mean, the kidney is connected to other things.
00:55:57.000 | You've gotta, and we don't have a holistic view
00:55:59.880 | of these things and I'm sure in artificial intelligence,
00:56:02.240 | you're not gonna make the big leaps forward
00:56:05.000 | without a holistic view of what it is
00:56:08.000 | you're trying to achieve.
00:56:08.880 | And maybe that's the question, what is intelligence?
00:56:10.720 | But that's the kind of questions you have to ask
00:56:14.320 | to make big leaps forward, to really move the field
00:56:17.400 | in a positive direction.
00:56:19.160 | And it's the people who can think that way,
00:56:21.960 | who move fields and move technology, move anything.
00:56:25.640 | Anything is, everything is like--
00:56:27.360 | - Which is just like you said, it's painful
00:56:28.840 | because underlying that kind of questioning is,
00:56:32.660 | well, maybe the work I've done for the past 20 years
00:56:35.360 | was a dead end and you have to kind of face that.
00:56:40.100 | Even just, it might not be true,
00:56:42.080 | but even just facing that reality,
00:56:44.220 | it's just, it's a painful feeling.
00:56:47.760 | - Absolutely, but that's part of the reason
00:56:50.600 | why it's important to enjoy the work that you do.
00:56:52.880 | So that even if it doesn't completely work out,
00:56:54.800 | at least you enjoyed the process.
00:56:55.640 | - It was never a waste.
00:56:56.480 | - It was not a waste because you enjoyed the process.
00:56:59.320 | And if you learn, as any entrepreneur knows this,
00:57:02.780 | and if you learn from the waste of time,
00:57:05.540 | from the errors, from the mistakes,
00:57:07.580 | then you can build on them and make things even better.
00:57:10.620 | And so in the next 20 years, I'm a massive success.
00:57:15.620 | - Can we, another impossible task.
00:57:18.860 | So you did wonderfully on talking about Ayn Rand.
00:57:22.660 | The other impossible task of giving a whirlwind overview
00:57:25.980 | of the philosophy of objectivism, the philosophy of Ayn Rand.
00:57:30.640 | - Yeah, so luckily she did it in an essay.
00:57:33.640 | She talks about doing her philosophy on one foot.
00:57:36.080 | But let me integrate it with the literature
00:57:39.560 | and with her life a little bit.
00:57:41.820 | She wanted to be a writer,
00:57:44.660 | but her goal, she had a particular goal in her writing.
00:57:47.420 | She was an idealist, right?
00:57:50.200 | She wanted to portray the ideal man.
00:57:55.160 | So one of the things you do when you wanna do something
00:57:57.900 | is what is an ideal man?
00:57:58.960 | You have to ask that question.
00:58:00.240 | What does that mean?
00:58:01.200 | You might have a sense of it.
00:58:03.140 | You might have certain glimpses of it
00:58:06.520 | in other people's literature, but what is it?
00:58:08.780 | So she starts reading philosophy to try to figure out
00:58:12.100 | what do philosophers say about the ideal man.
00:58:15.100 | And what she finds horrifies her
00:58:16.940 | in terms of the view of most philosophers of man.
00:58:19.040 | And she's attracted, certainly when she's young, to Nietzsche
00:58:23.500 | because Nietzsche at least has a vision
00:58:26.700 | of grandeur for man, even though his philosophy
00:58:30.380 | is very flawed and has other problems
00:58:31.980 | and contradicts Ayn Rand in many ways.
00:58:34.020 | But at least he has that vision of what is possible to man.
00:58:38.020 | And she's attracted to that romantic vision,
00:58:40.100 | that idealistic vision.
00:58:42.100 | So she discovers in writing,
00:58:43.620 | and particularly in writing "Atlas Shrugged,"
00:58:45.020 | but even in "The Fountainhead,"
00:58:46.660 | that she's gonna have to develop her own philosophy.
00:58:49.300 | She's gonna have to discover these ideas for herself
00:58:52.780 | because they're not fully articulated anywhere else.
00:58:55.860 | There are glimpses of it again in Aristotle, in Nietzsche,
00:59:00.340 | but they're not fully fleshed out.
00:59:02.060 | So to a large extent, she develops a philosophy
00:59:05.820 | for a very practical purpose, to write,
00:59:08.900 | to write a novel about the ideal man.
00:59:11.300 | And "Atlas Shrugged" is the manifestation of that.
00:59:14.580 | - By the way, sorry to interrupt.
00:59:16.540 | As a little aside, she does, when you say man,
00:59:21.000 | you mean human.
00:59:22.460 | And because we'll bring this up often,
00:59:26.340 | she does, I mean, maybe you can elaborate
00:59:28.860 | of how she specifically uses man and he in the work.
00:59:33.380 | We live in a time now of gender.
00:59:36.260 | - Well, she did that in the sense that everybody did it
00:59:40.340 | during her period of time, right?
00:59:41.520 | It's only in modern times where we do he/she, right?
00:59:45.380 | Historically, when you said he, you meant a human being,
00:59:48.620 | unless the particular context implied that it was a...
00:59:51.600 | But in Ayn Rand's case, in this case, in this one sentence,
00:59:55.720 | she probably meant man.
00:59:57.360 | Not that because she viewed that there are differences
01:00:02.040 | between men and women; they were not the same,
01:00:03.840 | which I know comes as a shock to many people,
01:00:06.080 | but she--
01:00:11.080 | - She's working on a character.
01:00:12.400 | - She was working on a particular vision, right?
01:00:15.480 | She considered herself a man worshiper,
01:00:18.900 | and a man, not human being, a male.
01:00:23.060 | She worshipped manhood, if you will, the hero in man.
01:00:28.060 | And she wanted to fully understand what that was.
01:00:32.020 | Now, it has massive implications for ideal woman,
01:00:35.300 | and I think she does portray the ideal woman
01:00:36.940 | in Atlas Shrugged and the character of Dagny.
01:00:40.860 | But her goal is, I think her selfish goal
01:00:45.860 | for what she wanted to get out of the novel
01:00:49.200 | is that excitement, partially sexual,
01:00:52.620 | about seeing your ideal manifest in reality
01:00:56.240 | of what you perceive as that which you would be
01:01:00.960 | attracted to, fully, intellectually, physically,
01:01:04.720 | sexually, in every aspect of your life.
01:01:06.640 | That's what she's trying to bring into the novel.
01:01:08.160 | - So there was no ambiguity of gender,
01:01:09.840 | so there was a masculinity and a femininity in her work.
01:01:12.840 | - Very much so.
01:01:14.360 | And if you read the novels, you see that.
01:01:17.080 | Now, remember, this is in the context of,
01:01:19.960 | in Atlas Shrugged, she is portraying a woman
01:01:22.840 | who runs a railroad, the most masculine
01:01:26.520 | of all jobs you could imagine, right?
01:01:28.480 | Running a railroad, better than any man could run it.
01:01:31.760 | And achieving huge success,
01:01:33.440 | better than any other man out there.
01:01:35.840 | But, but for her, even Dagny needs somebody to,
01:01:40.840 | needs a man, in some sense, to look up to.
01:01:47.360 | - Yeah.
01:01:48.280 | - And that's the character whose name I will not mention,
01:01:51.360 | because it gives away too much of the plot.
01:01:53.200 | But there has to be--
01:01:54.800 | - I like how you do that, you're good.
01:01:57.120 | You're not, a lot of practice, a lot of practice.
01:01:59.680 | - Nothing, brilliant, 'cause you convey
01:02:01.720 | all the important things without giving away plot lines.
01:02:04.560 | - That's beautiful, you're a master.
01:02:05.880 | - So she's, so she's very much,
01:02:09.360 | she, she described herself once as a male chauvinist.
01:02:14.360 | - Okay.
01:02:16.240 | - She very, she likes the idea of a man opening a door for her.
01:02:19.600 | But more metaphysically, she identifies something
01:02:25.960 | in the difference between a way a man relates to a woman
01:02:29.040 | and a woman relates to a man.
01:02:30.200 | It's not the same.
01:02:32.160 | - And let's not take too far of a tangent,
01:02:35.360 | but I just, as a side comment,
01:02:37.480 | to me, she represented, she was a feminist to me.
01:02:43.200 | Perhaps, technically,
01:02:45.800 | philosophically, you disagree with that, whatever.
01:02:47.600 | But, you know, to me that represented strength;
01:02:52.600 | she had some of the strongest female characters
01:02:55.400 | in the history of literature.
01:02:56.800 | - Again, this is a woman running a railroad in 1957.
01:03:00.200 | - Yeah.
01:03:01.080 | - And not just a woman running a railroad,
01:03:02.680 | and this is true of the Fountainhead as well.
01:03:05.080 | A woman who is sexually, in a sense, assertive,
01:03:09.800 | sexually open.
01:03:11.280 | This is, this is not a woman who, you know,
01:03:15.760 | this is a woman who embraces her sexuality.
01:03:20.480 | And, you know, sex is important in life.
01:03:22.800 | This is why it keeps coming up, right?
01:03:24.600 | It was important to Ayn Rand.
01:03:25.960 | It was, it's important in the novels.
01:03:27.440 | It's important in life.
01:03:29.000 | And for her, one's attitude towards sex
01:03:32.360 | is a reflection of one's attitude towards life.
01:03:34.520 | And, you know, and what attitude towards pleasure,
01:03:36.720 | which is an important part of life.
01:03:38.400 | And she thought that was an incredibly important thing.
01:03:41.920 | And so she has these assertive, powerful, sexual women
01:03:46.920 | who live their lives on their terms 100%,
01:03:53.920 | who seek a man to look up to.
01:03:56.960 | - Yeah.
01:03:57.800 | - And now, this is psychologically complex.
01:04:00.760 | It's more psychology than philosophy, right?
01:04:02.320 | It's psychologically complex.
01:04:03.680 | And, you know, not my area of expertise,
01:04:06.480 | but this is, there's something,
01:04:08.640 | and she would argue there's something
01:04:11.640 | fundamentally different about a male and a woman,
01:04:14.400 | about a male and female,
01:04:16.160 | psychologically in their attitude towards one another.
01:04:18.880 | - Yeah, but as a side note, I say that,
01:04:22.000 | I would say that, I don't know, philosophically,
01:04:25.520 | if her ideas about gender are interesting,
01:04:28.960 | I think her other philosophical ideas
01:04:30.760 | are much more interesting.
01:04:32.360 | But reading-wise, like the stories it created,
01:04:36.240 | the tension it created, that was pretty powerful.
01:04:39.520 | I mean, that was, that's pretty powerful stuff.
01:04:43.360 | - I'll speculate that the reason it's so powerful
01:04:45.680 | is because it reflects something in reality.
01:04:47.400 | - Yeah, that's true.
01:04:48.680 | There's a thread that at least it pulls.
01:04:50.160 | - And look, it's really important to say,
01:04:53.400 | I think she was the first feminist in a sense.
01:04:56.120 | I think in a sense, the feminist that provoked feminism
01:04:59.360 | into something that it shouldn't be.
01:05:00.800 | But in the sense of men and women are capable,
01:05:04.220 | she was the first one who really put that
01:05:08.600 | into a novel and showed it.
01:05:10.680 | - To me, as a boy, when I was reading "Atlas Shrugged,"
01:05:15.560 | I think I read that before "Fountainhead,"
01:05:18.200 | that was one of the early introductions,
01:05:20.520 | at least of an American woman;
01:05:21.840 | I had examples in my own life of Russian women,
01:05:24.240 | but of like a badass lady.
01:05:26.960 | Like I admire, like I love engineering.
01:05:30.240 | I had loved it that she could,
01:05:32.440 | here's a lady that's running the show.
01:05:34.400 | So that at least to me was an example
01:05:36.720 | of a really strong woman.
01:05:37.960 | But objectivism.
01:05:39.040 | - Objectivism.
01:05:39.880 | So, and so she developed it for a novel.
01:05:42.080 | She spent the latter part of her life
01:05:44.040 | after the publication of "Atlas Shrugged"
01:05:45.440 | really articulating her philosophy.
01:05:46.880 | So that's what she did.
01:05:47.800 | She applied it to politics, to life, to gender,
01:05:50.800 | to all these issues from 1957 until she died in 1982.
01:05:54.280 | - So the objectivism was born
01:05:56.120 | out of the later parts of "Atlas Shrugged."
01:05:57.920 | - Yes, definitely.
01:05:59.320 | It was there all the time,
01:06:00.480 | but it was fleshed out during the latter parts
01:06:03.000 | of "Atlas Shrugged" and then articulated
01:06:04.360 | for the next 20 years.
01:06:05.200 | - So what is objectivism?
01:06:06.680 | - So objectivism, so there are five branches in philosophy.
01:06:09.920 | And so I'm gonna just go through the branches.
01:06:13.200 | She starts with, you start with metaphysics,
01:06:15.080 | the nature of reality.
01:06:16.800 | And objectivism argues that reality is what it is.
01:06:20.240 | It kind of harkens back to Aristotle,
01:06:22.800 | law of identity.
01:06:23.800 | A is A.
01:06:24.880 | You can wish it to be B,
01:06:27.000 | but wishes do not make something real.
01:06:29.440 | Reality is what it is and it is the primary.
01:06:32.640 | And it's not manipulated, directed by consciousness.
01:06:37.400 | Consciousness is there to observe,
01:06:40.400 | to give us information about reality.
01:06:45.880 | That is the purpose of consciousness.
01:06:48.200 | That is the nature of it.
01:06:50.360 | So in metaphysics, existence exists.
01:06:54.600 | The law of identity, the law of causality,
01:06:57.080 | that things act based on their nature,
01:07:01.120 | not randomly, not arbitrarily, but based on their nature.
01:07:04.640 | And then we have the tool to know reality.
01:07:08.480 | This is epistemology, the theory of knowledge.
01:07:11.480 | A tool to know reality is reason.
01:07:14.400 | It's our senses and our capacity
01:07:16.640 | to integrate the information we get from our senses
01:07:19.160 | and to integrate it into new knowledge
01:07:20.720 | and to conceptualize it.
01:07:22.560 | And that is uniquely human.
01:07:25.240 | We don't know the truth from revelation.
01:07:31.240 | We don't know truth from our emotions.
01:07:35.240 | Our emotions are interesting.
01:07:36.880 | Our emotions tell us something about ourselves.
01:07:39.560 | But our emotions are not tools of cognition.
01:07:42.360 | They don't tell us the truth about what's out there,
01:07:45.200 | about what's in reality.
01:07:47.680 | So reason is a means of knowledge,
01:07:50.880 | and therefore reason is a means of survival.
01:07:53.360 | Only individuals reason,
01:07:56.160 | just in the same way that only individuals can eat.
01:07:59.160 | We don't have a collective stomach.
01:08:00.920 | Nobody can eat for me,
01:08:03.440 | and therefore nobody can think for me.
01:08:05.720 | We don't have a collective mind.
01:08:07.280 | There's no collective consciousness.
01:08:09.400 | It's bizarre that people talk about
01:08:11.800 | these collectivized aspects of the mind.
01:08:14.960 | They don't talk about collective feet
01:08:16.680 | and collective stomachs and collective things.
01:08:18.800 | But so we all think for ourselves,
01:08:21.800 | and it is our fundamental basic responsibility
01:08:25.320 | to live our lives, to live, to choose.
01:08:29.880 | Once we choose to live,
01:08:31.040 | to live our lives to the best of our ability.
01:08:34.160 | So in morality, she is an egoist.
01:08:38.080 | She believes that the purpose of morality
01:08:40.440 | is to provide you with a code of values and virtues,
01:08:43.560 | to guide your life for the purpose of your own success,
01:08:47.800 | your own survival, your own thriving, your own happiness.
01:08:51.120 | Happiness is the moral purpose of your life.
01:08:54.360 | The purpose of morality is to guide you
01:08:55.960 | towards a happy life.
01:08:57.760 | - Your own happiness.
01:08:58.800 | - Your own happiness, absolutely, your own happiness.
01:09:01.880 | So she rejects the idea that she should live for other people,
01:09:04.880 | that you should live for the purpose
01:09:06.240 | of other people's happiness.
01:09:07.760 | Your purpose is not to make them happy
01:09:09.360 | or to make them anything.
01:09:10.240 | Your purpose is your own happiness.
01:09:12.040 | But she also rejects the idea
01:09:14.560 | that you could argue maybe the Nietzschean idea
01:09:18.080 | of you should use other people for your own purposes.
01:09:20.960 | So every person is an end in himself.
01:09:24.480 | Every person's moral responsibility is their own happiness.
01:09:28.200 | And you shouldn't use other people for your own,
01:09:30.160 | shouldn't exploit other people for your own happiness,
01:09:32.080 | and you shouldn't allow yourself
01:09:33.320 | to be exploited for other people.
01:09:34.960 | Every individual is responsible for themselves.
01:09:37.320 | And what is it that allows us to be happy?
01:09:40.840 | What is it that facilitates human flourishing,
01:09:44.720 | human success, human survival?
01:09:47.000 | Well, it's the use of our minds, right?
01:09:49.080 | Goes back to reason.
01:09:50.380 | And what does reason require in order to be successful,
01:09:56.160 | in order to work effectively?
01:09:58.620 | It requires freedom.
01:10:02.240 | So the enemy of reason, the enemy of reason is force.
01:10:07.040 | The enemy of reason is coercion.
01:10:09.160 | The enemy of reason is authority, right?
01:10:12.880 | The Catholic Church doing what they did to Galileo, right?
01:10:16.840 | That restricts Galileo's thinking, right?
01:10:19.200 | When he's in house arrest,
01:10:20.440 | is he gonna come up with a new theory?
01:10:21.680 | Is he gonna discover new truths?
01:10:23.760 | No, the punishment is too, it's too dangerous.
01:10:28.760 | So force, coercion are enemies of reason.
01:10:34.160 | And what reason needs is to be free,
01:10:39.080 | to think, to discover, to innovate,
01:10:42.720 | to break out of convention.
01:10:44.700 | So we need to create an environment
01:10:48.680 | in which individuals are free to reason, free to think.
01:10:52.400 | And to do that, we come up with a concept,
01:10:55.640 | historically we've come up with a concept
01:10:57.200 | of individual rights.
01:10:58.760 | Individual rights define the scope of,
01:11:01.240 | define the fact that we should be left alone,
01:11:05.560 | free to pursue our values, using our reason,
01:11:09.400 | free of what?
01:11:10.240 | Free of coercion, force, authority.
01:11:12.400 | And that the job of government
01:11:14.840 | is to make sure that we are free.
01:11:17.160 | The whole point of government,
01:11:18.400 | the whole point of when we come in a social context,
01:11:21.660 | the whole point of establishing a government
01:11:24.600 | in that context is to secure that freedom.
01:11:29.600 | It's to make sure that I don't use coercion on you.
01:11:34.640 | The government is supposed to stop me,
01:11:36.040 | supposed to intervene before I can do that,
01:11:38.040 | or if I've already done it,
01:11:40.440 | to prevent me from doing it again.
01:11:42.680 | So the purpose of government is to protect our freedom
01:11:47.020 | to think and to act based on our thoughts.
01:11:49.940 | It's to leave individuals free to pursue their values,
01:11:53.360 | to pursue their happiness, to pursue their rational thought,
01:11:58.240 | and to be left alone to do it.
01:12:01.540 | And so she rejects socialism,
01:12:04.160 | which basically assumes some kind of collective goal,
01:12:07.800 | assumes the sacrifice of the individual to the group,
01:12:11.000 | assumes that your moral purpose in life
01:12:13.080 | is the well-being of other people rather than your own.
01:12:15.840 | And she rejects all forms of statism,
01:12:20.560 | all forms of government that are overly involved,
01:12:24.560 | that are involved in any aspect
01:12:28.680 | other than protecting us from force, coercion and authority.
01:12:33.600 | And she rejects anarchy, and we can talk about that.
01:12:36.600 | I think you had a question in the list of questions
01:12:39.480 | you sent me about anarchy, so I'm happy to discuss that.
01:12:41.560 | - I just talked to Michael Malice about anarchy,
01:12:43.520 | so I don't know if you're familiar with him.
01:12:45.720 | - Yes, I'm familiar with him.
01:12:46.760 | So yeah, so she would completely reject anarchy.
01:12:49.840 | Anarchy is completely inconsistent with her point of view,
01:12:52.440 | and we can talk about why if you want.
01:12:54.120 | - So there is some perfect place
01:12:55.680 | where freedom is maximized,
01:12:57.200 | so systems of government that--
01:12:58.920 | - Absolutely, and she thought
01:13:00.400 | that the American system of government
01:13:01.580 | came close in its idea,
01:13:04.080 | obviously founded with original sin,
01:13:06.240 | with the sin of slavery, but in its conception,
01:13:10.040 | the Declaration of Independence
01:13:11.320 | is about as perfect a political document as one could write,
01:13:14.620 | I think the greatest political document in human history,
01:13:17.100 | but really articulated almost perfectly and beautifully.
01:13:21.920 | And that the American system of government
01:13:23.160 | with the checks and balances,
01:13:25.120 | which is with its emphasis on individual rights,
01:13:27.560 | with its emphasis on freedom,
01:13:29.480 | with its emphasis on leaving individual free
01:13:32.120 | to pursue their happiness,
01:13:33.360 | an explicit recognition of happiness as a goal,
01:13:36.640 | individual happiness, was the model.
01:13:39.240 | It wasn't perfect, there were a lot of problems,
01:13:41.640 | to a large extent because the founders
01:13:43.080 | had mixed philosophical premises,
01:13:45.440 | so there were alien premises introduced
01:13:50.440 | into the founding of the country,
01:13:52.260 | slavery obviously being the biggest problem,
01:13:55.120 | but it was close, and we need to build on that
01:13:59.040 | to create an ideal political system
01:14:01.280 | that will, yes, maximize the freedom of individuals
01:14:06.240 | to do exactly this.
01:14:07.380 | And then of course she had,
01:14:10.680 | so that's the manifestation of this individualism
01:14:15.440 | in a political realm, and she had a theory of art,
01:14:18.040 | she had a theory of aesthetics,
01:14:19.560 | which is the fifth branch of,
01:14:21.620 | she had metaphysics, epistemology, ethics, and politics,
01:14:25.080 | and the fifth branch is aesthetics,
01:14:26.880 | and she viewed art as an essential human need,
01:14:31.720 | a fuel for the human spirit,
01:14:34.160 | and that just like any human need,
01:14:36.240 | it had certain principles that it had to abide by,
01:14:40.220 | that is just like there's nutrition, right,
01:14:42.760 | so some food is good for you and some food is bad for you,
01:14:45.180 | some food, some stuff is poison.
01:14:47.760 | She believed the same is true of art,
01:14:49.640 | that art had an identity,
01:14:51.640 | which is very controversial today, right,
01:14:54.560 | today if you put a frame around it, it is art, right,
01:14:57.280 | if you put a urinal in a museum, it becomes art,
01:15:01.080 | which she thought was evil and ludicrous,
01:15:05.080 | and she rejected completely,
01:15:07.160 | that art had an identity,
01:15:09.120 | and that it served a certain function
01:15:11.440 | that human beings needed it,
01:15:13.600 | and if it didn't have, not only did it have the identity,
01:15:17.180 | but that function was served well by some art
01:15:20.080 | and poorly by other art.
01:15:22.640 | And then there's a whole realm of stuff that's not art,
01:15:24.840 | basically, all of what today is considered modern art,
01:15:28.720 | she would consider as not being art,
01:15:31.360 | splashing paint on a canvas, not art.
01:15:34.080 | So she had very clear ideas,
01:15:40.120 | she articulated them not,
01:15:42.520 | so I would say not in conventional philosophical form,
01:15:47.540 | so she didn't write philosophical essays
01:15:50.300 | using the philosopher's language,
01:15:52.840 | which is partially why I think philosophers
01:15:55.360 | have never taken it seriously;
01:15:57.160 | her essays are actually accessible to us,
01:15:59.240 | we can actually read them,
01:16:01.480 | and she integrates the philosophy
01:16:04.000 | in what I think are amazing ways with psychology,
01:16:07.020 | with history, with economics, with politics,
01:16:10.440 | with what's going on in the world,
01:16:12.520 | and she has dozens and dozens and dozens of essays
01:16:15.640 | that she wrote, many of them were aggregated into books,
01:16:20.280 | I particularly recommend books like
01:16:22.420 | "The Virtue of Selfishness,"
01:16:25.960 | "Capitalism: The Unknown Ideal,"
01:16:28.280 | and "Philosophy: Who Needs It."
01:16:32.920 | And I think it's a beautiful philosophy,
01:16:37.920 | I know you're big on love,
01:16:39.760 | I think it's a philosophy of love,
01:16:41.660 | we can talk about that,
01:16:42.640 | essentially it's about love,
01:16:44.140 | that's what the philosophy is all about,
01:16:45.700 | and when it apply in terms of it applying to self,
01:16:49.140 | and I think it's sad that so few people read it,
01:16:54.140 | and so few intellectuals take it seriously
01:16:57.420 | and are willing to engage with it.
01:16:58.740 | - Let me ask, that was incredible,
01:17:01.980 | but after that beautiful whirlwind overview,
01:17:04.300 | let me ask the most shallow of questions,
01:17:06.260 | which is the name objectivism.
01:17:09.440 | How should people think about where the name is rooted?
01:17:16.640 | Why not individualism? What are the options,
01:17:19.340 | if we, like, had a branding meeting right now?
01:17:21.260 | - Sure, so she actually had a branding meeting,
01:17:23.900 | so she did this, she went through the exercise,
01:17:26.020 | objectivism, I do not think,
01:17:27.660 | I don't know all the details,
01:17:28.940 | but I don't think objectivism was the first
01:17:31.660 | name she came with,
01:17:32.640 | the problem was that the other names were taken,
01:17:35.300 | and they were not positive implications.
01:17:38.280 | So for example, rationalism could have been a good word,
01:17:41.260 | because she's an advocate of rational thought,
01:17:44.060 | or reasonism, but reasonism sounds weird,
01:17:47.060 | right, the ism, because of too many S's I guess.
01:17:50.100 | Rationalism, but it was already a philosophy,
01:17:52.960 | and it was a philosophy inconsistent with hers,
01:17:55.480 | because it was what she considered a false view
01:17:59.080 | of reason, of rationality.
01:18:01.120 | Realityism, you know, just doesn't work.
01:18:05.640 | So she came on objectivism, and I think actually,
01:18:08.680 | it's a great word, it's a great name,
01:18:12.520 | because it has two aspects to it,
01:18:16.660 | and this is a unique view of what objectivity
01:18:18.640 | actually means.
01:18:19.480 | In objectivism, in objectivity is the idea
01:18:23.680 | of an independent reality.
01:18:26.020 | There is truth, there's actually something out there,
01:18:29.840 | and then there's the role of consciousness, right,
01:18:33.520 | there is the role of figuring out the truth.
01:18:37.240 | The truth doesn't just hit you.
01:18:41.000 | The truth is not in the thing.
01:18:43.840 | You have to discover it.
01:18:45.400 | It's that a consciousness applied to,
01:18:49.280 | that's what objectivity is, right?
01:18:52.900 | It's you discovering the truth in reality.
01:18:56.880 | It's your consciousness interacting.
01:19:00.160 | - And thereby opposing the individual in that sense.
01:19:02.640 | - And only the individual could do it.
01:19:03.840 | Now the problem with individualism
01:19:06.200 | is it would have made the philosophy too political.
01:19:09.980 | - Right.
01:19:10.920 | - And she always said, so she said,
01:19:13.840 | she said, "I'm an advocate for capitalism
01:19:15.840 | "because I'm really an advocate for rational egoism,
01:19:19.460 | "but I'm an advocate for rational egoism
01:19:23.680 | "really because I'm an advocate for reason."
01:19:26.120 | So she viewed the essential of her philosophy
01:19:29.040 | as being this reason and her particular view of reason,
01:19:34.040 | and she has a whole book, she has a book called
01:19:36.720 | Introduction to Objectivist Epistemology,
01:19:39.480 | which I encourage any scientist, mathematician,
01:19:42.160 | anybody interested in science to read
01:19:43.760 | because it is a tour de force on,
01:19:47.440 | in a sense, what it means to hold concepts
01:19:53.000 | and what it means to make new discoveries
01:19:56.920 | and to use concepts and how we use concepts.
01:20:01.920 | And she has a theory of concepts that is completely new,
01:20:09.200 | that is completely revolutionary,
01:20:11.240 | and I think is essential for the philosophy of science,
01:20:14.360 | and therefore ultimately,
01:20:16.040 | the more abstract we get with scientific discoveries,
01:20:18.840 | the easier it is to detach them from reality
01:20:23.240 | and to detach them from truth,
01:20:25.400 | the easier it is to be inside our heads
01:20:27.900 | instead of about what's real,
01:20:30.560 | and there are probably examples
01:20:32.520 | from modern physics that fit that.
01:20:34.960 | And I think what she teaches in the book
01:20:37.900 | is how to ground your concepts
01:20:39.800 | and how to bring them into grounding in reality.
01:20:42.640 | So Introduction to Objectivist Epistemology,
01:20:44.600 | and note that it's only an introduction
01:20:46.520 | 'cause one of the things she realized,
01:20:48.040 | one of the things that I think a lot of her critics
01:20:50.400 | don't give enough credit for,
01:20:52.080 | is that philosophy is, there's no end, right?
01:20:56.080 | It's always growing, there are always new discoveries,
01:20:58.280 | there's always, it's like science,
01:21:00.360 | there's always new things,
01:21:01.360 | and there's a ton of work to do in philosophy,
01:21:07.400 | and particularly in epistemology, in the theory of knowledge
01:21:09.520 | that she was actually,
01:21:10.960 | given your interest in mathematics,
01:21:12.720 | she actually saw a lot of parallels
01:21:15.560 | between math and concept formation,
01:21:18.340 | and she was actually, in the years before she died,
01:21:23.600 | she was taking private lessons in mathematics,
01:21:26.000 | in algebra and calculus,
01:21:28.920 | because she believed that there was real insight
01:21:31.300 | in understanding algebra and calculus
01:21:33.640 | to philosophy, into epistemology,
01:21:38.640 | and she also was very interested in neuroscience,
01:21:42.800 | 'cause she believed that that had a lot to tell us
01:21:45.540 | about epistemology, but also about music,
01:21:48.840 | therefore about aesthetics.
01:21:50.760 | So, I mean, she recognized the importance
01:21:55.520 | of all these different fields,
01:21:56.960 | and the beauty of philosophy
01:21:58.720 | is it should be integrating all of them,
01:22:00.600 | and one of the sad things about the world in which we live
01:22:03.140 | is, again, we view these things as silos.
01:22:05.460 | We don't view them as integrating.
01:22:06.940 | We don't have teams of people from different arenas,
01:22:10.840 | different fields, discovering things.
01:22:13.680 | We become like ants, specialized.
01:22:17.280 | So, she was definitely like that,
01:22:19.920 | and she was constantly curious,
01:22:22.240 | constantly interested in new discoveries,
01:22:25.280 | and new ideas, and how this could expand
01:22:29.040 | the scope of her philosophy
01:22:30.620 | and the application of her philosophy.
01:22:32.080 | - There's like a million topics I could talk to you,
01:22:33.920 | but since you mentioned math, I'm almost curious.
01:22:35.640 | - We've only got three hours.
01:22:36.880 | - You know, I'm almost curious.
01:22:40.720 | I don't know if you're familiar
01:22:41.600 | with Gödel's incompleteness theorem.
01:22:44.160 | - I'm not, unfortunately.
01:22:45.080 | - Okay, it was a powerful proof
01:22:49.240 | that in any consistent axiomatic system
01:22:51.100 | rich enough to express arithmetic,
01:22:53.840 | there will be, in that system,
01:22:57.880 | true statements that provably cannot be proven.
01:23:00.920 | So, that was this painful stab
01:23:04.520 | in the idea of mathematics that,
01:23:07.440 | if we start with a set of assumptions,
01:23:09.940 | kind of like Ayn Rand started with objectivism,
01:23:12.960 | there will have to be at least one truth we can't settle within the system.
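For reference, here is a standard informal statement of the theorem under discussion, included as an editorial sketch alongside the transcript rather than as part of the conversation:

```latex
% Godel's incompleteness theorems, informal statement (editorial aside)
\textbf{Theorem (G\"odel, 1931, first incompleteness theorem).}
Let $F$ be a consistent, effectively axiomatized formal system that can
express elementary arithmetic. Then there is a sentence $G_F$ (the
G\"odel sentence of $F$) such that
\[
  F \nvdash G_F ,
\]
yet $G_F$ is true in the standard model of arithmetic; in particular,
$F$ is incomplete rather than inconsistent.
\textbf{Second incompleteness theorem.} Such an $F$ also cannot prove
its own consistency:
\[
  F \nvdash \mathrm{Con}(F).
\]
```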
01:23:16.520 | - See, I intuitively am gonna say that's false.
01:23:20.960 | - Philosophically, but in math, it's just true.
01:23:25.920 | - It's a question about how you define,
01:23:28.200 | again, definitions matter,
01:23:30.200 | and you have to be careful on how you define axioms,
01:23:32.920 | and you have to be careful about
01:23:34.320 | what you define as unprovable,
01:23:35.880 | and what it means to say something can't be proven within the system.
01:23:38.640 | And I don't know, I'm not gonna say more than that,
01:23:40.280 | 'cause I don't know, but I'm suspicious
01:23:42.640 | that there is some, and this is the power of philosophy,
01:23:47.640 | and this is why I said before,
01:23:49.040 | concept formation is so important,
01:23:50.640 | and understanding concept formation is so important,
01:23:53.040 | for particularly, again, mathematics,
01:23:54.240 | because it's such an abstract field,
01:23:55.880 | and it's so easy to lose grounding in reality,
01:24:00.360 | that if you properly define axioms,
01:24:03.720 | and you properly define what you're doing in math,
01:24:05.640 | it's not clear whether that is true, and I don't think it is.
01:24:08.280 | - This is, yeah, we'll leave it as an open mystery,
01:24:11.200 | 'cause actually, this audience,
01:24:13.480 | you know, there are literally over 100,000 people
01:24:17.680 | who have PhDs, and so they know
01:24:20.040 | Gödel's incompleteness theorem.
01:24:21.400 | I have this intuition that there is something different
01:24:25.080 | between mathematics and philosophy,
01:24:27.280 | that I'd love to hear from people.
01:24:28.880 | Like, what exactly is that difference?
01:24:31.480 | Because there's a precision to mathematics
01:24:36.000 | that philosophy doesn't have,
01:24:39.120 | but that precision gets you in trouble.
01:24:42.680 | It somehow, it actually takes you away from truth.
01:24:46.320 | Like, the very constraints of the language used
01:24:49.840 | in mathematics actually put a constraint
01:24:53.400 | on the truth that it's able to capture.
01:24:56.200 | - I'm gonna argue that that is a total product
01:25:00.560 | of the way you're conceptualizing
01:25:02.840 | the terms within mathematics.
01:25:05.760 | It's not in reality.
01:25:07.660 | - Yeah, so you would argue it's in the fact
01:25:10.480 | that mathematics, in as much as it's detached from reality,
01:25:13.800 | that you can do these kinds of things.
01:25:15.240 | - Yes, and that mathematicians have come up with concepts
01:25:21.240 | that they haven't grounded in reality properly
01:25:26.240 | that allows them to go off in places
01:25:32.120 | that don't lead to truth, that's right,
01:25:34.200 | that don't lead to truth.
01:25:35.360 | But I encourage you then, I encourage you
01:25:38.040 | to do one of these podcasts with one of our philosophers
01:25:42.720 | who know more about this stuff.
01:25:46.160 | And if you move to Austin, I've got somebody
01:25:48.080 | I'd recommend to you.
01:25:49.600 | - Can you throw a name out, or no?
01:25:54.120 | - Yeah, I mean, I would talk to Greg Salmieri.
01:25:54.120 | - When you say our, can you say what you mean by our?
01:25:57.640 | - I'd say people who are affiliated
01:26:00.100 | with the Ayn Rand Institute are philosophers
01:26:02.220 | who are affiliated with objectivism, right?
01:26:04.960 | And Greg is one of our brightest, and he's in Austin.
01:26:08.240 | He's just got a position at UT,
01:26:10.400 | so at the University of Texas.
01:26:13.240 | And Onkar Ghate would be another one,
01:26:15.720 | who actually works at the Institute
01:26:17.520 | and is the chief philosophy officer at the Institute.
01:26:20.440 | - That's awesome.
01:26:21.280 | - And there are others who specialize
01:26:23.640 | in philosophy of science who I think Greg
01:26:26.760 | could probably give you a lead.
01:26:28.720 | But these are unbelievably smart people
01:26:31.340 | who know this part of the philosophy much better than I do.
01:26:34.920 | - Can you just briefly perhaps say
01:26:36.620 | what is the Ayn Rand Institute?
01:26:38.560 | - Yeah, so the Ayn Rand Institute was an organization
01:26:40.600 | founded three years after Ayn Rand died.
01:26:44.760 | She died in 1982, and it was founded in 1985
01:26:49.760 | to promote her ideas, to make sure that her ideas
01:26:53.080 | and her novels continued in the culture and were relevant.
01:26:58.080 | Well, they're relevant, but the people saw the relevance.
01:27:01.880 | So our mission is to get people to read her books,
01:27:04.360 | to engage in the ideas.
01:27:06.320 | We teach, we have the Objectivist Academic Center
01:27:10.080 | where we teach the philosophy,
01:27:12.720 | primarily to graduate students and others
01:27:14.800 | who take the idea seriously and who really want
01:27:16.840 | a deep understanding of the philosophy.
01:27:20.840 | And we apply the ideas.
01:27:22.640 | So we take the ideas and apply them to ethics,
01:27:25.000 | to philosophy, to issues of the day,
01:27:28.160 | which is more my strength and more what I tend to do.
01:27:31.120 | I've never formally studied philosophy.
01:27:34.720 | So all my education philosophy is informal.
01:27:39.600 | And I'm an engineer.
01:27:42.360 | And a finance guy, that's my background.
01:27:44.560 | So I'm a no-biz guy.
01:27:45.760 | - Well, let me, I feel pretty under-educated.
01:27:49.760 | I have a pretty open mind,
01:27:54.800 | which sometimes can be painful on the internet
01:27:57.080 | because people mock me or, you know,
01:28:02.080 | if I say something nuanced about communism,
01:28:05.480 | people immediately kind of put you in a bin
01:28:09.080 | or something like that.
01:28:10.760 | It hurts to be open-minded, to say, I don't know,
01:28:13.920 | to ask the question, why is communism or Marxism
01:28:18.400 | so problematic, why is capitalism problematic and so on?
01:28:21.920 | But let me nevertheless go into that direction with you.
01:28:25.180 | Maybe let's talk about capitalism a little bit.
01:28:29.840 | How does objectivism compare, relate
01:28:33.720 | to the idea of capitalism?
01:28:36.080 | - Well, first we have to define what capitalism is
01:28:37.920 | 'cause again, people use capitalism in all kinds of ways.
01:28:40.960 | And I know you had Ray Dalio on your show once.
01:28:43.600 | I need to listen to that episode.
01:28:46.360 | But Ray has no clue what capitalism is.
01:28:48.720 | And that's his big problem.
01:28:51.680 | So when he says there are real problems today in capitalism,
01:28:56.880 | he's not talking about capitalism.
01:28:58.240 | He's talking about problems in the world today.
01:28:59.880 | And I agree with many of the problems,
01:29:01.480 | but they have nothing to do with capitalism.
01:29:04.920 | Capitalism is a social, political, economic system
01:29:08.680 | in which all property is privately owned
01:29:13.360 | and in which the only role of government
01:29:15.280 | is the protection of individual rights.
01:29:18.040 | I think it's the ideal system.
01:29:19.920 | I think it's the right system
01:29:21.040 | for the reasons we talked about earlier.
01:29:22.320 | It's a system that leaves you as an individual
01:29:24.440 | to pursue your values, your life, your happiness,
01:29:27.400 | free of coercion or force.
01:29:28.920 | And you get to decide what happens to you.
01:29:32.120 | And I get to decide if to help you or not.
01:29:34.440 | Let's say you fall flat on your face.
01:29:35.600 | People always say, "Well, what about the poor?"
01:29:37.280 | Well, if you care about the poor, help them.
01:29:39.480 | Just don't, you know, what do you need a government for?
01:29:43.760 | You know, I always ask audiences,
01:29:45.440 | okay, if there's a poor kid who can't afford to go to school
01:29:49.760 | and all the schools are private
01:29:50.800 | because capitalism is being instituted,
01:29:53.480 | and he can't go to school,
01:29:55.200 | would you be willing to participate in a fund
01:29:56.920 | that pays for his education?
01:29:58.920 | Every hand in the room goes up.
01:30:00.440 | So what do you need government for?
01:30:02.480 | Just let's get all the money together
01:30:04.880 | and pay for his schooling.
01:30:05.960 | So the point is that what capitalism does
01:30:08.360 | is leave individuals free to make their own decisions.
01:30:11.200 | And as long as they're not violating other people's rights,
01:30:14.160 | in other words, as long as they're not using coercion force
01:30:17.360 | on other people, then leave them alone.
01:30:20.280 | And people are going to make mistakes
01:30:21.920 | and people are gonna screw up their lives
01:30:23.120 | and people are gonna commit suicide.
01:30:24.480 | People are gonna do terrible things to themselves.
01:30:27.360 | That is fundamentally their problem.
01:30:29.120 | And if you want to help,
01:30:30.920 | you under capitalism are free to help.
01:30:33.840 | It's just the only thing that doesn't happen under capitalism
01:30:37.000 | is you don't get to impose your will on other people.
01:30:40.560 | How's that a bad thing?
01:30:41.800 | - So the question then is how does the implementation
01:30:46.800 | of capitalism deviate from its ideal in practice?
01:30:57.480 | I mean, this is the question with a lot of systems:
01:31:00.880 | how does it start to fail?
01:31:00.880 | So one thing maybe you can correct me or inform me,
01:31:05.000 | it seems like information is very important.
01:31:10.560 | Like being able to make decisions, to be free,
01:31:15.560 | you have to have access,
01:31:18.360 | full access of all the information you need
01:31:21.200 | to make rational decisions.
01:31:23.680 | - No, that can't be.
01:31:25.800 | Because it can't be right.
01:31:26.880 | 'Cause none of us has full access
01:31:28.800 | to all the information we need.
01:31:31.240 | I mean, what does that even mean?
01:31:32.360 | And how much of the scope do you want to do?
01:31:35.680 | - Let's just start there.
01:31:36.520 | Yeah, we don't.
01:31:37.360 | - So you need to have access to information.
01:31:39.840 | So one of the big criticisms of capitalism
01:31:41.920 | is this asymmetrical information.
01:31:44.280 | The drug maker has more information about the drug
01:31:46.800 | than the drug buyer, right?
01:31:48.440 | Pharmaceutical drugs.
01:31:49.680 | True, it's a problem.
01:31:53.800 | Well, I wonder if one can think about,
01:31:55.800 | an entrepreneur can think about how to solve that problem.
01:31:58.080 | See, I view any one of these challenges to capitalism
01:32:01.200 | as an opportunity for entrepreneur to make money.
01:32:03.320 | - And they have the freedom to do it.
01:32:04.880 | - Yeah, so imagine an entrepreneur steps in and says,
01:32:07.360 | "I will test all the drugs that drug companies make.
01:32:11.400 | "And I will provide you for a fee with the answer.
01:32:15.820 | "And how do I know he's not gonna be corrupted?
01:32:18.400 | "Well, there'll be other ones and they'll compete.
01:32:21.620 | "And who am I to tell which one of these is the right one?"
01:32:25.080 | Well, it won't be you really getting
01:32:26.760 | the information from them.
01:32:28.840 | It'll be your doctor.
01:32:30.560 | The doctors need that information.
01:32:32.260 | So the doctor who has some expertise in medicine
01:32:36.000 | will be evaluating which rating agency to use
01:32:39.280 | to evaluate the drugs and which ones then
01:32:41.480 | to recommend to you.
01:32:43.080 | So do we need an FDA?
01:32:45.480 | Do we need a government that siphons all the information
01:32:48.360 | to one source that does all the research, all the thing,
01:32:51.080 | and has a clear incentive, by the way,
01:32:53.000 | not to approve drugs.
01:32:55.080 | Because they don't make any money from it.
01:32:57.520 | Nobody pays them for the information.
01:32:59.240 | Nobody pays them to be accurate.
01:33:00.720 | They're bureaucrats at the end of the day.
01:33:02.440 | And what is a bureaucrat?
01:33:04.120 | What's the main focus of a bureaucrat?
01:33:06.680 | Even if they go in with the best of intentions,
01:33:09.000 | which I'm sure all the scientists at the FDA
01:33:10.840 | have the best of intentions, what's their incentive?
01:33:13.360 | The system builds in this incentive not to screw up.
01:33:17.040 | Because if one drug gets approved and does damage,
01:33:22.000 | you lose your job.
01:33:23.800 | But if 100 drugs that could kill cancer tomorrow
01:33:27.040 | don't ever get to market,
01:33:29.120 | nobody's gonna come after you.
01:33:31.560 | - Yeah, and you're saying that's not a mechanism
01:33:35.840 | that's conducive to--
01:33:37.960 | - You see, the marketplace is competition.
01:33:39.440 | So if you won't approve the drug,
01:33:41.200 | if I still think it's possible, I will,
01:33:43.760 | and it's not zero-one.
01:33:45.240 | You see, the other thing that happens with the FDA
01:33:47.200 | is it's zero-one.
01:33:48.040 | It's either approved or it's not approved.
01:33:50.360 | Oh, it's approved for this, but it's not approved for that.
01:33:52.600 | But what if a drug came out and you said,
01:33:57.000 | you told the doctors,
01:33:58.120 | this drug in 10% of the cases
01:34:03.360 | can cause patients an increased risk of heart disease?
01:34:07.520 | You and your patients should,
01:34:09.720 | we're not forcing you, but you should, right?
01:34:12.640 | It's your medical responsibility to evaluate that
01:34:15.520 | and decide if the drug is appropriate or not.
01:34:17.480 | Why don't I get to make that choice
01:34:19.080 | if I wanna take on the 10% risk of heart disease?
01:34:21.760 | So there was a drug, and right now I forget the name,
01:34:24.520 | but it was a drug against pain,
01:34:26.360 | particularly for arthritic pain, and it worked.
01:34:29.320 | It reduced pain dramatically, right?
01:34:31.600 | And some people tried everything,
01:34:33.240 | and this was the only drug that reduced their pain.
01:34:35.800 | And it turned out that in 10% of the cases,
01:34:39.340 | it caused the elevated risk.
01:34:42.320 | Didn't kill people necessarily,
01:34:43.900 | but it caused elevated risk of heart disease.
01:34:46.160 | Okay, what did the FDA do?
01:34:49.080 | It banned the drug.
01:34:50.920 | Some people, I know a lot of people who said,
01:34:53.600 | living with pain is much worse than taking on a 10% risk.
01:34:58.440 | Again, probabilities, right?
01:34:59.560 | People don't think in those numbers.
01:35:01.360 | 10% risk of maybe getting heart disease.
01:35:03.200 | Why don't I get to make that choice?
01:35:04.660 | Why does some bureaucrat make that choice for me?
01:35:07.520 | That's capitalism.
01:35:08.900 | Capitalism gives you the choice,
01:35:11.140 | not you as an ignorant person,
01:35:13.080 | you with your doctor and a whole marketplace,
01:35:17.040 | which is now created to provide you with information.
01:35:20.120 | And think about a world where we didn't have
01:35:23.680 | all these regulations and controls.
01:35:25.800 | The amount of opportunities that would exist
01:35:31.960 | to create, to provide information,
01:35:35.000 | to educate you about that information
01:35:36.920 | would mushroom dramatically.
01:35:39.480 | Bloomberg, the billionaire Bloomberg,
01:35:41.720 | how did he make his money?
01:35:42.960 | He made his money by providing financial information,
01:35:45.560 | by creating this service called Bloomberg
01:35:47.480 | that you buy a terminal
01:35:49.080 | and you get all this amazing information.
01:35:50.840 | And he was before computers, desktop computers.
01:35:53.720 | I mean, he was very early on
01:35:55.240 | in that whole computing revolution.
01:35:57.360 | But his focus was providing financial information
01:35:59.800 | to professionals.
01:36:00.740 | And you hire a professional to manage your money.
01:36:04.380 | That's the way it's supposed to be.
01:36:06.080 | So you as an individual cannot have all the knowledge
01:36:11.880 | you need in medicine,
01:36:12.840 | all the knowledge you need in finance,
01:36:14.120 | all the knowledge you need in every aspect of your life.
01:36:16.280 | You can't do that.
01:36:17.280 | You have to delegate.
01:36:19.560 | And you hire a doctor.
01:36:21.160 | Now you should be able to figure out
01:36:23.080 | if the doctor's good or not.
01:36:24.240 | You should be able to ask doctors for reasons
01:36:26.840 | for why you have to make the decision at the end.
01:36:28.960 | But that's why you have a doctor.
01:36:29.800 | That's why you have a financial advisor.
01:36:31.080 | That's why you have different people
01:36:32.840 | who you're delegating certain aspects of your life to.
01:36:36.440 | But you want choices.
01:36:38.440 | And what the marketplace provides is those choices.
01:36:41.560 | - So let's, let me then, this is what I do.
01:36:45.560 | I'll make a dumb case for things,
01:36:47.680 | and then you shut me down,
01:36:48.960 | and then the internet says how dumb Lex is.
01:36:51.160 | This is good.
01:36:52.000 | This is how it works.
01:36:52.840 | - I'm good at shutting down.
01:36:54.160 | (laughing)
01:36:55.000 | And they're foolish in blaming you for the question
01:36:59.840 | because you're here to ask me questions.
01:37:02.840 | - Let's make the, let me make a case for socialism.
01:37:06.460 | (laughing)
01:37:09.680 | - It's gonna be bad because that's the only case
01:37:11.680 | there is for socialism.
01:37:12.520 | - All right. - That's reality.
01:37:13.640 | - So, and then perhaps it's not a case for socialism,
01:37:16.920 | but just a certain notion that inequality,
01:37:21.860 | the wealth inequality, that the bigger the gap
01:37:26.840 | between the poorest or the average and the richest,
01:37:30.440 | the more painful it is to be average.
01:37:35.000 | Psychologically speaking, if you know that there is,
01:37:38.480 | the CEOs of companies make 300, 1,000,
01:37:43.720 | 1 million times more than you do,
01:37:46.000 | that makes life for a large part of the population
01:37:50.520 | less fulfilling, that there's a relative notion
01:37:53.120 | to the experience of our life,
01:37:55.240 | that even though everybody's life has gotten better
01:37:58.240 | over the past decades and centuries,
01:38:02.080 | it may feel actually worse because you know
01:38:06.600 | that life could be so, so much better
01:38:10.320 | in the life of these CEOs that, yeah,
01:38:13.720 | that gap is fundamentally a thing
01:38:17.240 | that is undesirable in a society.
01:38:19.740 | - Everything about that is wrong.
01:38:23.040 | (laughing)
01:38:25.080 | I like to start off like that, yeah.
01:38:27.040 | Which, so, I mean, so my wife likes to remind me
01:38:32.040 | that as well as we've done in life,
01:38:36.140 | we are actually from a wealth perspective
01:38:38.240 | closer to a homeless person than we are to Bill Gates.
01:38:41.120 | Just the math, right, just the math, right?
01:38:43.120 | (laughing)
01:38:44.880 | - It's a good ego check.
01:38:46.160 | - When I look at Bill Gates, I get a smile on my face.
01:38:49.280 | I love Bill Gates, I've never met Bill Gates.
01:38:51.400 | I love Bill Gates.
01:38:52.440 | I love what he stands for.
01:38:54.800 | I love that he has $100 billion.
01:38:56.720 | I love that he has built a trampoline room in his house
01:39:01.160 | where his kids can jump up and down in a trampoline
01:39:03.560 | in a safe environment.
01:39:04.920 | - Can we take another billionaire?
01:39:06.640 | Because I'm not sure if you're paying attention,
01:39:09.400 | but there's all kinds of conspiracy theories
01:39:12.360 | about Bill Gates.
01:39:13.800 | - Well, but that's part of the story, right?
01:39:15.560 | They have to pull him down because people resent him.
01:39:17.920 | - And that's strange.
01:39:19.120 | - That's strange, but yes, we can take Jeff Bezos.
01:39:21.520 | We can take my favorite, historically,
01:39:24.880 | just 'cause I like a lot about him, was Steve Jobs.
01:39:28.740 | I mean, I love these people.
01:39:32.720 | And I can't, there are very few billionaires I don't love.
01:39:36.360 | In a sense that I appreciate everything they've done for me,
01:39:41.240 | for people I cherish and love,
01:39:46.020 | they've made the world a better place.
01:39:48.360 | Why would it ever cross my mind
01:39:51.800 | that they make me look bad because they're richer than me
01:39:55.920 | or that I don't have what they have?
01:39:58.360 | They've made me so much richer
01:40:02.520 | that they've made inventions that used to cost
01:40:07.400 | millions and millions and millions of dollars
01:40:10.280 | accessible to me.
01:40:12.120 | I mean, this is a super computer in my pocket.
01:40:15.980 | Now, but think about it, right?
01:40:18.520 | What is the difference between,
01:40:21.040 | and I'll get to the essence of your point in a minute,
01:40:23.560 | but think about what the difference is
01:40:25.660 | between me and Bill Gates in terms of,
01:40:28.600 | because it's true that in terms of wealth,
01:40:30.320 | I'm closer to the homeless person.
01:40:31.820 | But in terms of my day-to-day life, I'm closer to Bill Gates.
01:40:34.980 | We both live in a nice house.
01:40:37.840 | His is nicer, but we live in a nice house.
01:40:40.160 | His is bigger, but mine is plenty big.
01:40:43.100 | We both drive cars.
01:40:44.560 | His is nicer, but we both drive cars.
01:40:46.700 | Cars, 100 years ago, what cars?
01:40:50.160 | We both can fly, get on a plane in Los Angeles
01:40:54.280 | and fly to New York and get there in about the same time.
01:40:57.100 | We're both flying private.
01:40:58.540 | The only difference is my private plane,
01:41:01.280 | I share with 300 other people.
01:41:03.320 | And his, but it's accessible.
01:41:07.160 | It's relatively comfortable.
01:41:08.700 | Again, in the perspective of 50 years ago, 100 years ago,
01:41:11.400 | it's unimaginable that I could fly like that
01:41:13.800 | for such a low fee.
01:41:15.480 | We live very similar lives in that sense.
01:41:18.840 | So I don't resent him.
01:41:20.160 | So first of all, I'm an exception to the supposed rule
01:41:23.800 | that people resent.
01:41:24.840 | I don't think anybody, I don't think people do resent
01:41:26.840 | unless they're taught to resent.
01:41:28.320 | And this is the key.
01:41:29.680 | People are taught, and I've seen this in America.
01:41:33.120 | And this is, to me, the most horrible, shocking thing
01:41:37.520 | that has happened in America over the last 40 years.
01:41:40.400 | I came to America, so I'm an immigrant.
01:41:42.320 | I came to America from Israel in 1987.
01:41:45.760 | And I came here because I thought this was the place
01:41:48.440 | where I could, where I'd had the most opportunities.
01:41:50.880 | And it is, most opportunities.
01:41:52.920 | And I came here 'cause I believed
01:41:54.280 | there was a certain American spirit of individualism,
01:41:58.840 | and exactly the opposite of what you just described,
01:42:01.160 | a sense of, I live my life, it's my happiness.
01:42:06.160 | I'm not looking at my neighbor.
01:42:07.680 | I'm not competing with the Joneses.
01:42:09.600 | The American dream is my dream, my two kids,
01:42:12.860 | my dog, my station wagon, not because other people have it,
01:42:16.440 | it's because I want it, in that sense.
01:42:18.360 | And when I came here in the '80s, you had that.
01:42:22.840 | You had, you still had it.
01:42:25.280 | It was less than I think it had been in the past.
01:42:28.120 | But you had that spirit.
01:42:29.480 | There was no envy, there was no resentment.
01:42:31.280 | There were rich people, and they were celebrated.
01:42:34.180 | There was still this admiration for entrepreneurs
01:42:37.560 | and admiration for success.
01:42:39.720 | Not by everybody, certainly not by the intellectuals,
01:42:42.680 | but by the average person.
01:42:45.120 | I have witnessed, particularly over the last 10 years,
01:42:47.960 | a complete transformation,
01:42:50.140 | and America's become like Europe.
01:42:52.500 | I know, are you Russian?
01:42:54.280 | - Yeah. - Yeah.
01:42:55.120 | It's become Russian, in a sense where,
01:42:59.480 | you know, they've always done these studies.
01:43:01.680 | I'll give you $100 and your neighbor $100,
01:43:06.840 | or I'll give you, what is it?
01:43:09.400 | Or I'll give you $1,000, but your neighbor gets $10,000.
01:43:14.040 | And a Russian will always choose the $100, right?
01:43:16.560 | He wants equality above being better himself.
01:43:20.720 | Americans would always choose that gap.
01:43:24.540 | - And that's changing. - My sense is not anymore.
01:43:26.780 | And it's changing because we've been told it should change.
01:43:31.780 | - And morally, you're saying that doesn't make any sense.
01:43:35.020 | So there's no sense in which, let me put another spin,
01:43:38.900 | I forget the book, but the sense of,
01:43:41.780 | if you're working for Steve Jobs, and your hands,
01:43:45.500 | you're the engineer behind the iPhone,
01:43:48.260 | and there's a sense in which his salary
01:43:51.340 | is stealing from your efforts.
01:43:53.960 | Because, I forget the book, right?
01:43:57.340 | That's literally the terminology is used, right?
01:43:59.940 | - This is straight out of Karl Marx.
01:44:02.420 | - Sure, it's also straight, but out of Karl Marx.
01:44:05.260 | But there's no sense, morally speaking,
01:44:08.180 | that you see that as the best--
01:44:09.700 | - The other way around.
01:44:11.060 | That engineer's stealing off of,
01:44:12.660 | and it's not stealing, right?
01:44:14.260 | It's not.
01:44:15.740 | But the engineer's getting more from Steve Jobs,
01:44:18.420 | by a lot, not by a little bit,
01:44:20.420 | than Steve Jobs is getting from the engineer.
01:44:23.340 | The engineer, even if they're a great engineer,
01:44:26.140 | there are probably other great engineers
01:44:27.380 | that could replace him.
01:44:28.540 | Would he even have a job without Steve Jobs?
01:44:32.140 | Would the industry exist without Steve Jobs?
01:44:34.980 | Without the giants that carry these things forward?
01:44:38.500 | Let me ask you this.
01:44:39.340 | I mean, you're a scientist.
01:44:41.140 | Do you resent Einstein for being smarter than you?
01:44:44.640 | (laughing)
01:44:45.920 | I mean, do you envy him?
01:44:47.360 | Are you angry with him?
01:44:48.360 | Would you feel negative towards him
01:44:51.520 | if he was in the room right now?
01:44:52.720 | Or would you, if you came into the room,
01:44:54.040 | you'd say, "Oh my God."
01:44:55.400 | I mean, you interview people who I think some of them
01:44:58.320 | are probably smarter than you and me.
01:45:00.040 | - For sure.
01:45:00.880 | - And your attitude towards them is one of reverence.
01:45:03.280 | - Well, one interesting little side question there
01:45:07.080 | is what is the natural state of being for us humans?
01:45:10.920 | You kind of implied education has polluted our minds,
01:45:15.680 | but like if I, 'cause you're referring to jealousy,
01:45:19.640 | the Einstein question, the Steve Jobs question,
01:45:22.640 | I wonder which way, if we're left without education,
01:45:26.000 | would we naturally go?
01:45:27.480 | - So there is no such thing as the natural state
01:45:30.100 | in that sense, right?
01:45:31.940 | This is the myth of Rousseau's "Noble Savage"
01:45:37.920 | and of John Rawls' "veil of ignorance."
01:45:42.520 | Well, if you're ignorant, you're ignorant.
01:45:44.720 | You can't make any decisions, you're just ignorant.
01:45:48.120 | There is no human nature that determines
01:45:54.520 | how you will relate to other people.
01:45:56.540 | You will relate to other people based on the conclusions
01:45:58.960 | you come to about how to relate to other people.
01:46:01.920 | You can relate to other people as values,
01:46:06.920 | to use your terminology, from the perspective of love.
01:46:10.320 | This other human being is a value to me
01:46:13.600 | and I want to trade with them
01:46:15.760 | and trade the beauty of trade is it's win-win.
01:46:19.060 | I wanna benefit and they are going to benefit.
01:46:21.440 | I don't wanna screw them, I don't want them to screw me,
01:46:24.080 | I want this to be win-win.
01:46:25.560 | Or you can deal with other people as threats, as enemies.
01:46:31.400 | Much of human history we have done that
01:46:34.560 | and therefore is a zero-sum world.
01:46:37.320 | What they have, I want, I will take it.
01:46:41.880 | I will use force to take it,
01:46:43.080 | I will use political force to take it,
01:46:44.600 | I will use the force of my arm to take it,
01:46:46.200 | I will just take it.
01:46:47.640 | So those are two options, right?
01:46:51.080 | And they will determine whether we live
01:46:52.680 | in civilization or not.
01:46:54.640 | And they are determined by conclusions people come to
01:46:57.480 | about the world and the nature of reality
01:46:59.360 | and the nature of morality and the nature of politics
01:47:01.480 | and all these things.
01:47:02.880 | They're determined by philosophy.
01:47:05.320 | And this is why philosophy is so important
01:47:07.680 | because philosophy shapes, evolution doesn't do this.
01:47:12.400 | It doesn't just happen.
01:47:14.520 | Ideas shape how we relate to other people.
01:47:17.200 | And you say, "Well, little children do it."
01:47:19.960 | Well, little children don't have a frontal cortex.
01:47:22.600 | It's not relevant, right?
01:47:24.440 | What happens as you develop a frontal cortex,
01:47:27.020 | as you develop the brain, you learn ideas.
01:47:32.160 | And those ideas will shape how you relate to other people.
01:47:35.240 | And if you learn good ideas,
01:47:37.000 | you relate to other people in a healthy, productive win-win.
01:47:40.440 | And if you develop bad ideas,
01:47:43.840 | you will resent other people and you will want their stuff.
01:47:47.800 | And the thing is that human progress depends
01:47:50.240 | on the win-win relationship.
01:47:52.480 | It depends on civilization, depends on peace.
01:47:55.480 | It depends on allowing people,
01:47:57.820 | going back to what we talked about earlier,
01:47:59.240 | allowing people the freedom to think for themselves.
01:48:02.640 | And anytime you try to interrupt that,
01:48:05.040 | you're causing damage.
01:48:06.160 | So this change in America is not some reversion
01:48:09.480 | to a natural state.
01:48:11.400 | It's a shift in ideas.
01:48:12.800 | We still live, the better part of American society
01:48:19.520 | and the world still lives on the remnants of the Enlightenment.
01:48:24.360 | The Enlightenment ideas,
01:48:26.840 | the ideas that brought about the scientific revolution,
01:48:30.760 | the ideas that brought about the creation of this country.
01:48:33.120 | And it's the same basic ideas that led to both of those.
01:48:36.480 | And as those ideas get more distant,
01:48:41.120 | as those ideas are not defended,
01:48:43.480 | as those ideas disappear, as Enlightenment goes away,
01:48:47.360 | we will become more violent, more resentful, more tribal,
01:48:53.000 | more obnoxious, more unpleasant, more primitive.
01:48:58.200 | - A very specific example of this that bothers me,
01:49:02.680 | I'd be curious to get your comment on.
01:49:04.960 | So Elon Musk is a billionaire.
01:49:08.760 | And one of the things that really,
01:49:14.760 | maybe it's almost a pet peeve,
01:49:16.240 | it really bothers me when the press
01:49:18.520 | and the general public will say,
01:49:21.320 | "Well, all those rockets they're sending up there,
01:49:24.880 | "those are just like the toys,
01:49:26.460 | "the games that billionaires play."
01:49:28.600 | That to me, billionaire has become a dirty word to use,
01:49:34.960 | like as if money can buy or has anything to do with genius.
01:49:41.520 | Like, I'm trying to articulate a specific
01:49:47.640 | line of question here, because it just bothers me.
01:49:53.960 | I guess the question is like,
01:49:55.960 | why, how do we get here and how do we get out of that?
01:49:58.480 | Because Elon Musk is doing some of the most incredible things
01:50:02.240 | that a human being has ever participated in,
01:50:05.120 | in mostly, he doesn't build the rockets himself,
01:50:07.480 | he's getting a bunch of other geniuses together that have--
01:50:10.560 | - That takes genius.
01:50:11.400 | - That takes genius, but why, where do we go
01:50:14.840 | and how do we get back to where Elon Musk
01:50:17.640 | is an inspiring figure as opposed to
01:50:20.160 | a billionaire playing with some toys?
01:50:23.160 | - So this is the role of philosophy.
01:50:25.080 | It goes back to the same place.
01:50:26.520 | It goes back to our understanding of the world
01:50:28.520 | and our role in it.
01:50:30.280 | And if you understand that the only way
01:50:32.640 | to become a billionaire, for example,
01:50:34.800 | is to create value, value for whom?
01:50:37.560 | Value for people who are gonna consume it.
01:50:39.760 | The only way to become a billionaire,
01:50:41.240 | the only way Elon Musk became a billionaire is through PayPal.
01:50:45.600 | Now, PayPal is something we all use.
01:50:47.560 | PayPal is an enormous value to all of us.
01:50:50.040 | It's why it's worth several billions of dollars,
01:50:52.680 | which Elon Musk could then earn.
01:50:57.680 | But you cannot become a billionaire in a free society
01:51:01.640 | by exploiting people.
01:51:02.920 | You cannot, because you'll be laughed at,
01:51:05.840 | nobody will deal with you.
01:51:06.880 | Nobody will have any interactions with you.
01:51:09.160 | The only way to become a billionaire
01:51:10.640 | is to do billions of win-win transactions.
01:51:14.200 | So the only way to become a billionaire in a free society
01:51:18.440 | is to change the world to make it a better place.
01:51:21.720 | Billionaires are the great humanitarians of our time,
01:51:24.640 | not because they give charity,
01:51:26.360 | but because they make them billions.
01:51:29.200 | And it's true that money and genius
01:51:32.760 | are not necessarily correlated,
01:51:35.440 | but you cannot become a billionaire
01:51:37.360 | without being super smart.
01:51:38.960 | You cannot become a billionaire without figuring something out
01:51:42.400 | that nobody else has figured out,
01:51:44.560 | in whatever realm it happens to be.
01:51:46.160 | And that thing that you figure out
01:51:48.400 | has to be something that provides immense value
01:51:50.680 | to other people.
01:51:51.560 | Where do we go wrong?
01:51:54.400 | We go wrong, our culture goes wrong,
01:51:56.760 | because it views billionaires as self-interested, as selfish.
01:52:01.760 | And there's a sense in which, and not a sense,
01:52:04.800 | it's absolutely true.
01:52:06.400 | The billionaire doesn't ask for my opinion
01:52:08.680 | on what product to launch.
01:52:11.520 | Elon Musk doesn't ask others
01:52:13.840 | what they think you should spend his money on,
01:52:15.960 | what the greatest social well-being will be.
01:52:18.520 | Elon, I mean, there's a sense
01:52:19.960 | in which the rockets are his toys.
01:52:21.880 | There's a sense in which he chose
01:52:24.480 | that he would be inspired the most.
01:52:28.240 | He would have the most fun
01:52:30.160 | by going to Mars and building rockets.
01:52:32.680 | And he's probably dreamt of rockets from when he was a kid
01:52:35.400 | and probably always played with rockets.
01:52:37.360 | And now he has the funds, the capital,
01:52:39.040 | to be able to deploy it.
01:52:40.760 | So he's being selfish.
01:52:43.840 | Obviously, he's being self-interested.
01:52:45.600 | This is what Elon Musk is about.
01:52:47.280 | I mean, the same with Jeff Bezos.
01:52:50.440 | There's no committee to decide
01:52:52.320 | whether to invest in cloud computing or not.
01:52:57.000 | Bezos decided that.
01:52:58.960 | And at the end of the day, they are the bosses.
01:53:01.840 | They pursue the values they believe are good.
01:53:03.720 | They create the wealth.
01:53:06.080 | It's their decisions.
01:53:07.240 | It's their mind.
01:53:08.960 | And the fact is we live in a world
01:53:10.960 | where for 2,000-plus years, self-interest,
01:53:15.960 | even though we all do it, to a greater or lesser extent,
01:53:20.600 | we deem it as morally abhorrent.
01:53:24.000 | It's bad. It's wrong.
01:53:26.160 | I mean, your mother probably taught you
01:53:27.720 | the same thing my mother taught me.
01:53:29.240 | Think of others first.
01:53:31.000 | Think of yourself last.
01:53:32.760 | The good stuff is kept for the guests.
01:53:35.680 | You never get to use the good stuff.
01:53:38.040 | You know, it's others.
01:53:39.720 | That's what the focus of morality is.
01:53:41.600 | Now, no mother, even no Jewish mother,
01:53:44.960 | actually believes that, right?
01:53:47.040 | Because they don't really want you to be last.
01:53:50.640 | They want you to be first, and they push you to be first.
01:53:53.480 | But morally, they've been taught their entire lives,
01:53:56.400 | and they believe that the right thing to say
01:53:59.800 | and to some extent do
01:54:02.600 | is to argue for sacrifice for other people, right?
01:54:06.840 | So most people, 99% of people, are torn.
01:54:12.480 | They know they should be selfless,
01:54:17.920 | sacrifice, live for other people.
01:54:20.000 | They don't really want to,
01:54:21.920 | so they act selfishly in their day-to-day life,
01:54:25.040 | and they feel guilty, and they can't be happy.
01:54:28.480 | They can't be happy, and Jewish mothers
01:54:30.000 | and Catholic mothers are excellent
01:54:31.440 | at using that guilt to manipulate you.
01:54:33.920 | But the guilt is inevitable
01:54:35.120 | because you've got these two conflicting things,
01:54:38.360 | the way you want to live
01:54:39.360 | and the way you've been taught to live.
01:54:42.040 | And what objectivism does is it, at the end of the day,
01:54:45.480 | provides you with a way to unite morality,
01:54:49.160 | a proper morality, with what you want
01:54:52.720 | and to think about what you really want,
01:54:55.080 | to conceptualize what you really want properly.
01:54:58.120 | So what you want is really good for you,
01:55:00.080 | and what you want will really lead to your happiness.
01:55:03.360 | So, you know, we reject the idea of sacrifice.
01:55:06.760 | We reject the idea of living for other people.
01:55:09.400 | But you see, if you believe that the purpose of morality
01:55:14.160 | is to sacrifice for other people,
01:55:16.200 | and you look at Jeff Bezos,
01:55:18.920 | when was the last time he sacrificed anything, right?
01:55:22.160 | He's living pretty well.
01:55:23.560 | He's got billions that he could give it all away,
01:55:26.240 | and yet he doesn't.
01:55:27.720 | How dare he?
01:55:28.840 | You know, in my talks, I often pose this question,
01:55:33.000 | and I'm gonna use Gates here, sorry, guys.
01:55:36.760 | Drop the conspiracy theories.
01:55:38.000 | They're all BS, complete and utter nonsense.
01:55:41.360 | There's not a shred of truth.
01:55:42.920 | You know, I disagree with Bill Gates on everything political.
01:55:47.440 | I think he politically is a complete ignoramus,
01:55:51.000 | but the guy's a genius when it comes to technology.
01:55:54.200 | And he's just thoughtful, even in his philanthropy.
01:55:58.000 | He just uses his mind, and I respect that,
01:56:00.360 | even though politically he's terrible.
01:56:01.720 | Anyway, think about this.
01:56:04.040 | Who had a bigger impact on the lives
01:56:06.920 | of poor people in the world, Bill Gates or Mother Teresa?
01:56:10.400 | - Bill Gates. - It's not even close.
01:56:13.640 | And Mother Teresa lived this altruistic life to the core.
01:56:17.440 | She lived it consistently,
01:56:19.240 | and yet she was miserable, pathetic, horrible.
01:56:21.720 | She hated her life.
01:56:23.000 | She was miserable, and most of the people she helped
01:56:26.600 | didn't do very well, because she just helped them not die.
01:56:30.040 | Right? - Yeah.
01:56:30.880 | - And then Bill Gates changed the world,
01:56:32.840 | and he helped a lot by providing technology.
01:56:35.120 | Even the philanthropy gets to them,
01:56:37.240 | the food gets to them, much faster, more efficiently.
01:56:39.600 | Yet, who is the moral saint?
01:56:41.000 | Sainthood is not determined based on
01:56:44.680 | what you do for other people.
01:56:46.160 | Sainthood is based on how much pain you suffer.
01:56:49.140 | I like to ask people to go to a museum
01:56:52.120 | and look at all the paintings of saints,
01:56:54.680 | how many of them are smiling and are happy?
01:56:57.440 | They've usually got arrows through them
01:56:59.320 | and holes in their body,
01:57:00.800 | and they're just suffering a horrible death.
01:57:02.740 | The whole point of the morality we are taught
01:57:05.200 | is that happiness is immorality,
01:57:10.160 | that happy people cannot be good people,
01:57:15.240 | and that good people suffer,
01:57:17.320 | and that suffering is necessary for morality.
01:57:20.800 | Morality is about sacrifice, self-sacrifice and suffering.
01:57:25.100 | And at the end of the day,
01:57:28.420 | almost all the problems in the world
01:57:30.040 | boil down to that false view.
01:57:34.220 | - So, can we try to talk about,
01:57:37.620 | part of it is the problem of the word selfishness,
01:57:39.860 | but let's talk about the virtue of selfishness.
01:57:42.980 | So, let's start at the fact that, for me,
01:57:45.220 | I really enjoy doing stuff for other people.
01:57:48.180 | I enjoy being, cheering on the success of others.
01:57:53.180 | - Why?
01:57:55.420 | - I don't know.
01:57:56.500 | It's deep in the-- - Well, think about it.
01:57:59.300 | 'Cause I think you do know.
01:58:01.460 | - If I were to really think,
01:58:03.660 | I don't wanna resort to evolutionary arguments,
01:58:08.700 | or this is some how-- - No, some of it's evolutionary.
01:58:10.460 | - So, I think--
01:58:13.140 | - So, I can tell you why I enjoy helping others.
01:58:17.020 | - Maybe you can go there.
01:58:18.500 | One thing, 'cause we should talk about love a little bit.
01:58:21.140 | I'll tell you, there's a part of me
01:58:23.020 | that's a little bit not rational.
01:58:26.980 | There's a gut that I follow
01:58:29.660 | that not everything I do is perfectly rational.
01:58:32.420 | Like, for example, my dad criticizes me.
01:58:36.700 | He says, "You should always have a plan.
01:58:39.020 | "It should make sense.
01:58:40.000 | "You have to have a strategy."
01:58:41.540 | And I say that I stepped down
01:58:45.420 | from my full-salary position at MIT.
01:58:47.980 | There's so many things I did without a plan.
01:58:50.320 | It's the gut.
01:58:51.160 | It's like, I wanna start a company.
01:58:53.780 | Well, you know how many companies fail?
01:58:55.460 | I don't know.
01:58:56.300 | - 90%.
01:58:57.140 | - It's the gut.
01:58:58.900 | And the same thing with being kind to others is a gut.
01:59:02.900 | I watch the way that karma works in this world,
01:59:06.540 | that the people like us,
01:59:07.940 | one guy I look up to is Joe Rogan,
01:59:09.980 | that he does stuff for others,
01:59:12.300 | and that the joy he experiences,
01:59:15.460 | the way he sees the world,
01:59:16.940 | like just the glimmer in his eyes
01:59:20.860 | because he does stuff for others
01:59:22.740 | that creates a joyful experience.
01:59:24.300 | And that somehow seems to be an instructive way to,
01:59:27.940 | that to me is inspiring of a life well-lived.
01:59:31.660 | - But you probably know a lot of people
01:59:33.100 | who have done stuff for others who are not happy.
01:59:35.600 | - True.
01:59:38.100 | - So I don't think it's the doing stuff for others
01:59:40.060 | that brings the happiness.
01:59:41.220 | It's why you do stuff for others
01:59:42.640 | and what else you're doing in your life
01:59:44.260 | and what is the proportion.
01:59:48.540 | But it's why at the end of the day,
01:59:50.260 | which is, and it's the same.
01:59:52.500 | Look, you can maybe through a gut feeling,
01:59:55.180 | say I want to start a company,
01:59:56.780 | but you better start doing thinking about how and what
01:59:59.700 | and all of that.
02:00:00.540 | And to some extent the why,
02:00:01.900 | because if you really want to be happy doing this,
02:00:03.840 | you'd better make sure you're doing it
02:00:05.360 | for the right reason.
02:00:06.860 | So I'm not, there's something called fast thinking,
02:00:10.340 | Kahneman, the-
02:00:11.180 | - Daniel Kahneman?
02:00:13.780 | - Yes.
02:00:14.620 | Daniel Kahneman talks about,
02:00:15.700 | and there is, it's,
02:00:18.420 | all the integrations you've made so far in your life
02:00:22.060 | cause you to have specialized knowledge in certain things
02:00:25.180 | and you can think very fast.
02:00:27.160 | And your gut tells you what the right answer is.
02:00:31.500 | But it's not.
02:00:32.820 | Your mind is constantly evaluating and constantly working.
02:00:35.900 | You want to make it as rational as you can,
02:00:39.340 | not in the sense that I have to think through
02:00:41.020 | every time I make a decision,
02:00:42.520 | but that I've so programmed my mind in a sense
02:00:45.900 | that the answers are the right answers,
02:00:48.600 | when I get them.
02:00:51.720 | So, you know, I like,
02:00:56.720 | I view other people as a value.
02:00:59.220 | Other people contribute enormously to my life.
02:01:02.980 | Whether it's a romantic love relationship
02:01:07.360 | or whether it's a friendship relationship
02:01:09.440 | or whether it's just, you know,
02:01:12.560 | Jeff Bezos creating Amazon
02:01:15.000 | and delivering goodies to my home when I get them.
02:01:18.520 | And people do all that, right?
02:01:20.640 | It's not just Jeff Bezos.
02:01:22.160 | He gets the most credit,
02:01:23.080 | but everybody in that chain of command,
02:01:24.760 | everybody at Amazon is working for me.
02:01:27.360 | I love that.
02:01:28.560 | I love the idea of a human being.
02:01:31.560 | I love the idea that there are people capable
02:01:34.520 | of being an Einstein, of being, you know,
02:01:37.600 | and creating and building and making stuff
02:01:40.400 | that makes my life so good.
02:01:42.400 | You know, most of us like,
02:01:45.480 | this is not a good room for an example.
02:01:47.440 | Most of us like plants, right?
02:01:50.020 | We like pets.
02:01:51.400 | I don't particularly, but people like pets.
02:01:54.080 | We like to see life.
02:01:55.760 | Human beings are life on steroids, right?
02:01:59.800 | They're life with a brain.
02:02:01.040 | It's amazing, right, what they can do.
02:02:03.560 | I love people.
02:02:05.260 | Now, that doesn't mean I love everybody
02:02:07.160 | 'cause there's some,
02:02:08.040 | there are really bad people out there who I hate, right?
02:02:10.680 | And I do hate.
02:02:11.960 | And there are people out there that are just,
02:02:14.120 | I have no opinion about.
02:02:15.520 | But generally, the idea of a human being
02:02:18.680 | to me is a phenomenal idea.
02:02:20.000 | When I see a baby, I light up
02:02:22.600 | because to me, there's a potential, you know,
02:02:26.060 | there's this magnificent potential that is embodied in that.
02:02:31.100 | And when I see people struggling and need help,
02:02:34.120 | I think they're human beings.
02:02:36.640 | They embody that potential.
02:02:38.640 | They embody that goodness.
02:02:40.820 | They might turn out to be bad,
02:02:43.240 | but why would I ever give the presumption of that?
02:02:45.360 | I give them the presumption of the positive
02:02:46.980 | and I cheer them on.
02:02:48.880 | And I enjoy watching people succeed.
02:02:52.200 | I enjoy watching people get to the top of the mountain
02:02:54.760 | and produce something.
02:02:56.160 | Even if I don't get anything directly from it,
02:02:59.080 | I enjoy that because it's part of my enjoyment of life.
02:03:03.360 | - So the word, to you, the morality of selfishness,
02:03:08.280 | this kind of love of other human beings,
02:03:10.360 | the love of life fits into a morality of selfishness.
02:03:13.920 | - Can't not.
02:03:15.860 | Because there's no context in which you can truly
02:03:20.060 | love yourself without loving life
02:03:23.300 | and loving what it means to be human.
02:03:25.140 | So, you know, the love of yourself is gonna manifest itself
02:03:30.180 | differently in different people, but it's core.
02:03:32.900 | What do you love about yourself?
02:03:34.940 | First of all, I love.
02:03:35.920 | I love that I'm alive.
02:03:37.020 | I love that, you know, I love this world
02:03:39.380 | and the opportunities it provides me
02:03:40.980 | and the fun and the excitement of discovering something new
02:03:45.020 | and meeting a new person and having a conversation.
02:03:48.200 | You know, all of this is immensely enjoyable,
02:03:52.580 | but behind all of that is a particular human capability
02:03:55.540 | that not only I have, other people have.
02:03:57.700 | And the fact that they have it makes my life
02:03:59.940 | so much more fun because, so it's,
02:04:02.900 | you cannot view, you know, it's all integrated
02:04:07.380 | and you cannot view yourself in isolation.
02:04:09.340 | Now that doesn't place a moral commandment on me.
02:04:14.660 | Help everybody who's poor that you happen
02:04:17.020 | to meet in the street.
02:04:18.580 | It doesn't place a burden on me in a sense
02:04:21.540 | that now I have this moral duty to help everybody.
02:04:25.660 | It leaves me free to make decisions
02:04:27.820 | about who I help and who I don't.
02:04:28.940 | There's some people who I will not help.
02:04:31.580 | There's some people who I do not wish positive things upon.
02:04:35.520 | Bad people should have bad outcomes.
02:04:39.500 | Bad people should suffer.
02:04:42.640 | - And then you have the freedom to choose
02:04:44.140 | who's good, who's bad within your home.
02:04:45.620 | - It's your decision based on your values.
02:04:47.740 | Now, I think there's an objectivity to it.
02:04:49.780 | There's a standard by which you should evaluate
02:04:52.100 | good versus bad, and that standard should be
02:04:54.420 | to what extent do they contribute or hurt human life?
02:04:57.580 | The standard is human life.
02:04:59.640 | And so when I say, look at Jeff Bezos,
02:05:01.620 | I say he's contributed to human life, good guy.
02:05:04.220 | I might disagree with him on stuff.
02:05:05.860 | We might disagree about politics.
02:05:07.300 | We might disagree about women.
02:05:08.940 | I don't know what we agree, but overall, big picture,
02:05:12.980 | he is pro-life, right?
02:05:15.340 | I look at somebody like, you know,
02:05:16.900 | to take like 99.9% of our politicians,
02:05:20.620 | and they are pro-death.
02:05:22.900 | They are pro-destruction.
02:05:25.580 | They are pro-cutting corners in ways
02:05:28.060 | that destroy human life and human potential
02:05:30.220 | and human ability.
02:05:31.240 | So I literally hate almost every politician out there.
02:05:34.340 | And I wish ill on them, right?
02:05:38.100 | I don't want them to be successful or happy.
02:05:40.180 | I want them all to go away, right?
02:05:42.260 | Just leave me alone.
02:05:43.100 | So I believe in justice.
02:05:45.260 | I believe good things should happen to good people
02:05:46.860 | and bad things should happen to bad people.
02:05:48.020 | So I make those generalizations based on this one,
02:05:52.660 | you know, on the other hand,
02:05:54.180 | I shouldn't say all politicians, right?
02:05:55.780 | So if I, you know, I love Thomas Jefferson
02:05:57.580 | and George Washington, right?
02:05:59.460 | I love Abraham Lincoln.
02:06:00.580 | I love people who fought for freedom
02:06:02.740 | and who believed in freedom, who had these ideas
02:06:04.800 | and lived up to, at least in parts of their lives,
02:06:07.600 | to those principles.
02:06:08.440 | Now, do I think Thomas Jefferson was flawed
02:06:10.420 | because he held slaves?
02:06:11.420 | Absolutely.
02:06:12.980 | But the virtues way outweigh that in my view.
02:06:15.620 | And I understand people who don't accept that.
02:06:17.660 | - You don't have to also love and hate the entirety
02:06:20.580 | of the person.
02:06:21.400 | There's parts of that person that you're attracted to.
02:06:23.500 | - The major part is pro-life
02:06:25.020 | and therefore I'm pro that person.
02:06:26.620 | And I think, and I said earlier
02:06:28.620 | that objectivism is philosophy of love.
02:06:30.140 | And I believe that because objectivism is about your life,
02:06:35.140 | about loving your life, about embracing your life,
02:06:38.300 | about engaging with the world,
02:06:40.020 | about loving the world in which you live,
02:06:42.660 | about win-win relationships with other people,
02:06:44.860 | which means to a large extent loving the good
02:06:48.340 | in other people and the best in other people
02:06:50.780 | and encouraging that and supporting that and promoting that.
02:06:53.520 | So I know selfishness is a harsh word
02:06:56.500 | because the culture has given it that harshness.
02:06:59.060 | Selfishness is a harsh word
02:07:00.380 | because the people who don't like selfishness
02:07:02.020 | want you to believe it's a harsh word.
02:07:04.300 | But it's not.
02:07:05.700 | What does it mean?
02:07:06.620 | It means focus on self.
02:07:09.060 | It means take care of self.
02:07:10.900 | It means make yourself your highest priority,
02:07:13.440 | not your only priority, because in taking care of self,
02:07:16.820 | what would I be without my wife?
02:07:20.420 | What would I be without the people who support me,
02:07:24.540 | who help me, who I have these love relationships with?
02:07:28.620 | So other people are crucial.
02:07:31.980 | What would my life be without Steve Jobs?
02:07:36.100 | Right?
02:07:37.020 | - A lot of things you mentioned here are just beautiful.
02:07:41.420 | So one is win-win.
02:07:42.940 | So one key thing about this selfishness
02:07:45.900 | and the idea of objectivism as a philosophy of love
02:07:48.820 | is that you don't want parasitism.
02:07:52.340 | So that is unethical.
02:07:54.500 | So you actually, first of all, you say win-win a lot,
02:07:58.380 | and I just like that terminology
02:08:00.320 | because it's a good way to see life.
02:08:02.340 | It's trying to maximize the number of win-win interactions.
02:08:06.620 | That's a good way to see business, actually.
02:08:08.420 | - Well, life generally.
02:08:09.580 | I think every aspect of life,
02:08:11.020 | you want to have a win-win relationship with your wife.
02:08:13.880 | Imagine if it was win-lose.
02:08:16.100 | Either way, if you win and she loses,
02:08:18.300 | how long is that going to sustain?
02:08:20.500 | So win-lose relationships are not in equilibrium.
02:08:25.300 | What they turn into is lose-lose.
02:08:27.180 | Like win-lose turns into lose-lose.
02:08:29.880 | And so the only alternative to lose-lose is win-win.
02:08:34.360 | And you win and the person you love wins.
02:08:36.960 | What's better than that, right?
02:08:38.480 | - That's the way to maximize.
02:08:39.980 | So like the selfishness is you're trying
02:08:43.360 | to maximize the win, but the way to maximize the win
02:08:46.060 | is to maximize the win-win.
02:08:48.220 | - Yes, and it turns out,
02:08:49.640 | and Adam Smith understood this a long time ago,
02:08:51.840 | that if you focus on your own winning
02:08:54.120 | while respecting other people as human beings,
02:08:58.000 | then everybody wins.
02:08:59.160 | And the beauty of capitalism,
02:09:00.280 | if we go back to capitalism for a second,
02:09:02.240 | the beauty of capitalism is you cannot be successful
02:09:05.120 | in capitalism without producing values
02:09:08.800 | that other people appreciate
02:09:10.240 | and therefore willing to buy from you.
02:09:12.120 | And they buy them, and this goes back to that question
02:09:14.960 | about the engineer and Steve Jobs.
02:09:16.520 | Why is the engineer working there?
02:09:18.960 | Because he's getting paid more than his time is worth to him.
02:09:22.560 | I know people don't like to think in those terms,
02:09:24.280 | but that's the reality.
02:09:25.720 | If his time is worth more to him
02:09:27.160 | than what he's getting paid, he would leave.
02:09:29.880 | So he's winning.
02:09:30.840 | And is Apple winning?
02:09:33.120 | Yes, because they're getting more productivity from him,
02:09:35.280 | they're getting more value from him
02:09:36.600 | than what they're actually paying him.
02:09:40.240 | - It's tough because there's human psychology
02:09:44.120 | and imperfect information.
02:09:45.720 | It just makes it a little messier
02:09:47.820 | than the clarity of thinking you have about this.
02:09:50.360 | It's just, you know, because for sure,
02:09:54.240 | but not everything in life is an economic transaction.
02:09:56.920 | It ultimately is close.
02:10:00.480 | - Even if it's not an economic transaction,
02:10:02.240 | even if it's a relationship transaction,
02:10:05.560 | when you get to a point with a friend
02:10:08.640 | where you're not gaining from the relationship,
02:10:11.400 | friendship's gonna be over.
02:10:12.560 | Not immediately, because it takes time for these things
02:10:14.720 | to manifest itself and to really absorb into,
02:10:17.200 | but we change friendships, we change our loves, right?
02:10:20.200 | We fall in and out of love.
02:10:22.000 | We fall out of love because we're not,
02:10:23.880 | love, so let's go back to love, right?
02:10:27.000 | Love is the most selfish of all emotions.
02:10:29.160 | Love is about what you do to me, right?
02:10:33.000 | So I love my wife 'cause she makes me
02:10:34.760 | feel better about myself.
02:10:36.240 | So, you know, the idea of selfless love is bizarre.
02:10:41.440 | So Ayn Rand used to say, before you say, "I love you,"
02:10:45.840 | you have to say the "I."
02:10:47.240 | And you have to know who you are
02:10:51.400 | and you have to appreciate yourself.
02:10:52.960 | If you hate yourself,
02:10:54.440 | what does it mean to love somebody else?
02:10:56.400 | So I love my wife 'cause she makes me
02:10:59.960 | feel great about the world.
02:11:01.440 | And she loves me for the same reason.
02:11:04.800 | And so Ayn Rand used to use this example.
02:11:07.800 | Imagine you go up to your spouse-to-be
02:11:11.080 | the night before the wedding, and you say,
02:11:13.480 | "I get nothing out of this relationship.
02:11:16.720 | "I'm doing this purely as an act of noble self-sacrifice."
02:11:21.040 | (laughing)
02:11:22.240 | She would slap you.
02:11:23.720 | And she should, right?
02:11:25.520 | So, no, we know this intuitively that love is selfish,
02:11:30.120 | but we are afraid to admit it to ourselves.
02:11:32.080 | And why?
02:11:32.960 | Because the other side has convinced us
02:11:35.240 | that selfishness is associated with exploiting other people.
02:11:38.760 | Selfishness means lying, cheating, stealing,
02:11:41.400 | walking on corpses, backstabbing people.
02:11:44.440 | But is that ever in your self-interest?
02:11:48.240 | Truly, right?
02:11:50.760 | I'll often be in front of an audience and say,
02:11:52.320 | "Okay, how many people here have lied?"
02:11:54.600 | I'm kidding, right?
02:11:57.080 | How many of you think that if you did that consistently,
02:12:00.880 | that would make your life better?
02:12:02.520 | Nobody thinks that, right?
02:12:05.000 | Because everybody's experienced how shitty lying is,
02:12:09.840 | not just because of how it makes you feel
02:12:11.400 | out of a sense of guilt.
02:12:12.480 | Existentially, it's just a bad strategy, right?
02:12:15.360 | You get caught, you have to create other lies
02:12:18.240 | to cover up the previous lie.
02:12:19.960 | It screws up with your own psychology
02:12:21.720 | and your own cognition.
02:12:23.560 | The mind, to some extent like a computer, right,
02:12:27.920 | is an integrating machine.
02:12:29.720 | And in computer science, I understand
02:12:31.160 | there's a term called garbage in, garbage out.
02:12:33.800 | Lying is garbage in.
02:12:35.080 | - Yeah.
02:12:36.040 | - So it's not good strategy, cheating,
02:12:40.600 | screwing your customers in a business,
02:12:42.800 | not paying your suppliers as a businessman,
02:12:45.160 | not good business practices,
02:12:47.080 | not good practices for being alive.
02:12:49.360 | So win-win is both moral and practical.
02:12:53.000 | And the beauty of Ayn Rand's philosophy,
02:12:55.080 | and I think this is really important,
02:12:57.280 | is that the moral is the practical
02:12:58.800 | and the practical is the moral.
02:13:00.360 | And therefore, if you are moral, you will be happy.
02:13:03.320 | - Yeah, that's why the application
02:13:08.080 | of the philosophy of objectivism is so easy to practice,
02:13:11.880 | so like, or to discuss, or possible to discuss.
02:13:15.240 | That's why you talk about all--
02:13:16.600 | - So clear cut.
02:13:17.640 | - Yeah.
02:13:18.480 | - I'm not ambiguous about my view.
02:13:19.400 | - And it's fundamentally practical.
02:13:20.960 | I mean, that's the best of philosophies is practical.
02:13:24.480 | - It's in a sense teaching you how to live a good life.
02:13:27.760 | And it's teaching you how to live a good life,
02:13:30.200 | not just as you, but as a human being.
02:13:33.320 | And therefore, the principles that apply to you
02:13:35.480 | probably apply to me as well.
02:13:37.280 | And if we both share the same principles
02:13:40.080 | of how to live a good life,
02:13:41.320 | we're not gonna be enemies.
02:13:42.680 | - You brought up anarchy earlier.
02:13:46.880 | It's an interesting question
02:13:49.600 | because you've kind of said politicians,
02:13:52.060 | I mean, part of it is a little bit joking,
02:13:54.040 | but politicians are not good people.
02:13:57.440 | - Yeah.
02:13:58.280 | - But we should have some?
02:14:02.120 | So you have an opposition to anarchism.
02:14:05.880 | - So first of all, they weren't always bad people.
02:14:08.400 | That is, I gave examples of people
02:14:10.880 | who engage in political life
02:14:11.920 | who I think were good people basically.
02:14:13.880 | But I think they get worse over time
02:14:17.160 | if the system is corrupt.
02:14:20.000 | And I think the system, unfortunately,
02:14:21.960 | even the American system, as good as it was,
02:14:24.600 | was founded on quicksand and had corruption built in.
02:14:27.400 | They didn't quite get it.
02:14:30.400 | And they needed Ayn Rand to get it.
02:14:31.960 | So I'm not blaming them.
02:14:32.960 | I don't think they share any blame.
02:14:34.480 | You needed a philosophy in order to completely fulfill
02:14:39.280 | the promise that is America,
02:14:40.840 | or the promise that is the founding of America.
02:14:42.280 | - So the place where the corruption sneaked in
02:14:45.360 | is the lack in some way of the philosophy
02:14:48.440 | underlying the nation?
02:14:49.640 | - Absolutely.
02:14:50.480 | So it's Christianity.
02:14:53.120 | It's, you know, not to hit on another controversial topic.
02:14:57.160 | It's religion, which undercut their morality.
02:15:01.840 | So the founders were explicitly Christian
02:15:05.920 | and altruistic in their morality.
02:15:09.100 | Implicitly, in terms of their actions,
02:15:11.820 | they were completely secular,
02:15:12.900 | and they were very secular anyway.
02:15:15.260 | But in their morality, even, they were secular.
02:15:17.740 | So there's nothing in Christianity that says
02:15:20.060 | that you have an inalienable right to pursue happiness.
02:15:23.240 | That's unbelievably self-interested
02:15:25.180 | and based on kind of a moral philosophy,
02:15:27.980 | of egoistic moral philosophy.
02:15:30.100 | But they didn't know that.
02:15:31.180 | And they didn't know how to ground it.
02:15:33.100 | They implicitly, they had that fast thinking, that gut,
02:15:36.100 | that told them that this was right.
02:15:37.580 | And the whole enlightenment,
02:15:39.020 | that period from John Locke on to really to Hume,
02:15:43.460 | that period is about pursuit of happiness
02:15:47.300 | using reason in pursuit of the good life, right?
02:15:50.300 | But they can't ground it.
02:15:51.540 | They don't really understand what reason is,
02:15:53.300 | and they don't really understand what happiness requires,
02:15:56.860 | and they can't detach themselves from Christianity.
02:16:00.080 | They're not allowed to politically,
02:16:01.420 | and I think conceptually,
02:16:02.740 | you just can't make that big break.
02:16:05.140 | Rand is an enlightenment thinker in that sense.
02:16:07.180 | She is what should have followed right after, right?
02:16:10.980 | She should have come in, grounded them in the secular
02:16:15.860 | and in the egoistic and Aristotelian view of morality
02:16:19.940 | as a code of values to basically to guide your life,
02:16:24.940 | to guide your life towards happiness.
02:16:27.260 | That's Aristotle's view, right?
02:16:28.820 | So they didn't have that.
02:16:34.020 | So I think that government is necessary.
02:16:38.420 | It's not a necessary evil, it's a necessary good
02:16:40.860 | 'cause it does something good.
02:16:42.940 | And the good that it does
02:16:45.860 | is it eliminates coercion from society.
02:16:48.060 | It eliminates violence from society.
02:16:50.020 | It eliminates the use of force
02:16:52.700 | between individuals from society.
02:16:55.380 | And that--
02:16:56.940 | - But see, the argument that Michael Malice would make.
02:16:59.380 | (laughing)
02:17:00.740 | Give me a chance here.
02:17:03.340 | Why can't you apply the same kind of reasoning
02:17:05.940 | that you've effectively used for the rest
02:17:08.740 | of mutually agreed upon institutions
02:17:12.140 | that are driven by capitalism,
02:17:14.020 | that we can't also hire forces to protect us
02:17:18.020 | from the violence to ensure the stability of society
02:17:21.520 | that protects us from the violence?
02:17:23.260 | Why draw the line at this particular place, right?
02:17:28.420 | - Well, because there is no other place to draw a line
02:17:30.780 | and there is a line.
02:17:32.460 | And by the way, we draw lines other places, right?
02:17:34.960 | We don't vote.
02:17:40.340 | We don't have...
02:17:43.180 | We don't determine truth in science based on competition.
02:17:48.000 | - Right, so that's a line.
02:17:51.380 | - That's a line.
02:17:52.220 | - But first of all, some people might say--
02:17:53.540 | - I mean, there's competition in a sense
02:17:55.220 | that you have alternate theories,
02:17:57.220 | but at the end of the day,
02:17:58.940 | whether you decide that he's right or he's right
02:18:01.700 | is not based on the market,
02:18:04.660 | it's based on facts, on reality, on objective reality.
02:18:09.180 | And some people will never accept that this person is right
02:18:14.300 | because they don't see the strength.
02:18:16.980 | So first of all, what they reject,
02:18:19.140 | what most anarchists reject,
02:18:20.480 | even if they don't admit it or recognize it,
02:18:23.620 | is they object, they reject objective reality.
02:18:26.660 | - In which sense?
02:18:29.860 | So like, okay. - I'll get to it, right?
02:18:31.340 | So there's a whole...
02:18:32.940 | So the whole realm of law
02:18:36.300 | is a scientific realm
02:18:40.700 | to define, for example,
02:18:44.380 | the boundaries of private property.
02:18:46.140 | It's not an issue of competition.
02:18:49.560 | It's not an issue of I have one system
02:18:54.020 | and you have another system.
02:18:55.900 | It's an issue of objective reality.
02:18:57.980 | And now it's more difficult than science in a sense
02:19:00.900 | because it's more difficult to prove
02:19:03.660 | that my conception of property is correct
02:19:05.580 | and you're correct.
02:19:07.740 | But there is a correct one.
02:19:10.260 | In reality, there's a correct vision.
02:19:12.340 | It's more abstract.
02:19:13.340 | But look, somebody has to decide what property is.
02:19:19.580 | So I have defined, my property is defined
02:19:22.700 | by certain boundaries.
02:19:25.540 | And I have a police force
02:19:27.700 | and I have a judiciary system that backs my vision.
02:19:30.580 | And you have a claim against my property.
02:19:34.500 | You have a claim against my property.
02:19:35.660 | And you have a police force and a judicial system
02:19:38.780 | that backs your claim.
02:19:39.900 | Who's right?
02:19:42.300 | - So our definitions of property are different.
02:19:45.740 | - Yes, our definitions of property
02:19:46.940 | or our claim on the property is different.
02:19:49.540 | - So why can't we just agree
02:19:52.020 | on the definition of property and--
02:19:54.740 | - But why should we agree, right?
02:19:55.940 | Your judicial system is one definition of property.
02:19:58.940 | My judicial system has another.
02:20:00.540 | You think that there's no such thing
02:20:03.140 | as intellectual property rights.
02:20:05.420 | And your whole system believes that.
02:20:07.580 | And my whole system believes there is such thing.
02:20:09.780 | So you are duplicating my books
02:20:12.380 | and handing them out to all your friends
02:20:14.460 | and not paying me a royalty.
02:20:16.140 | And I think that's wrong.
02:20:19.820 | And my judicial system and my police force
02:20:21.900 | think that's wrong.
02:20:24.180 | And we're both living in the same geographic area, right?
02:20:28.100 | So we have overlapping jurisdictions.
02:20:31.780 | Now, the anarchists would say, well, we'll negotiate.
02:20:34.820 | Why should we negotiate?
02:20:35.980 | My system is actually right.
02:20:37.660 | There is such a thing as intellectual property rights.
02:20:39.500 | There's no negotiation here.
02:20:40.780 | You're wrong.
02:20:41.780 | And you should either pay a fine or go to jail.
02:20:44.500 | - Yeah, but why can't, 'cause it's a community,
02:20:46.540 | it's multiple, there's multiple parties
02:20:48.700 | and it's like a majority vote.
02:20:50.620 | They'll hire different forces that says,
02:20:52.820 | yeah, Yaron is onto something here
02:20:54.900 | with the definition of property and we'll go with that.
02:20:57.260 | - So anarchists are pro-democracy in the majority-rule sense?
02:21:01.460 | - Well, I think so.
02:21:02.300 | I think anarchy promotes like emergent democracy, right?
02:21:07.300 | - No, it doesn't.
02:21:09.140 | I'll tell you what it promotes.
02:21:11.380 | It promotes emergent strife and civil war and violence,
02:21:16.140 | constant uninterrupted violence.
02:21:18.420 | 'Cause the only way to settle the dispute between us,
02:21:21.060 | since we both think that we are right
02:21:23.460 | and we have guns behind us to protect that.
02:21:27.100 | And we have a legal system, we have a whole theory of ideas
02:21:30.340 | is you're stealing my stuff.
02:21:33.900 | How do I get it back?
02:21:35.740 | I invade you, right?
02:21:37.620 | I take over, you know, and who's gonna win that battle?
02:21:42.060 | The smartest guy?
02:21:43.220 | No, the guy with the biggest guns.
02:21:44.580 | - See, but the anarchists would say
02:21:46.540 | that they're using implied,
02:21:48.460 | like the state uses implied force.
02:21:51.860 | They're already doing violence.
02:21:53.180 | - Because they take the state as it is today
02:21:56.220 | and they refuse to engage in the conversation
02:21:58.940 | about what a state should and could look like
02:22:01.380 | and how we can create mechanisms
02:22:04.260 | to protect us from the state using those.
02:22:07.220 | But look, my view of anarchy is very simple.
02:22:10.540 | It's a ridiculous position.
02:22:12.620 | It's infantile.
02:22:13.740 | I mean, I really mean this, right?
02:22:15.140 | And I'm sorry to Michael,
02:22:16.660 | and all the other very, very smart,
02:22:19.300 | very, very smart anarchists,
02:21:20.700 | 'cause among anarchists,
02:22:22.260 | you won't find a dumb anarchist.
02:22:25.620 | - Right.
02:22:26.460 | - Because dumb people know it wouldn't work.
02:22:28.740 | You have to have, it's absolutely true.
02:22:31.620 | You have to have a certain IQ to be an anarchist.
02:22:35.740 | - That's true, they're all really intelligent.
02:22:37.340 | - All intelligent, and the reason is
02:22:39.660 | that you have to create such a mythology in your head.
02:22:45.780 | You have to create so many rationalizations.
02:22:49.420 | Any Joe in the street knows it doesn't work
02:22:52.620 | because they can understand what happens
02:22:55.540 | when two people who are armed are in the street
02:22:59.220 | and have a dispute,
02:23:00.220 | and there's no mechanism to resolve that dispute.
02:23:03.700 | - Yeah.
02:23:04.620 | - That's objective, and this is where it gets subjective.
02:23:07.860 | That's objective.
02:23:09.420 | The whole point of government is
02:23:11.300 | that it is the objective authority
02:23:14.820 | for determining the truth in one regard,
02:23:18.860 | in regard to force,
02:23:20.340 | because the only alternative to determining it
02:23:25.620 | when it comes to force is through force.
02:23:27.940 | The only way to resolve disputes is through force,
02:23:31.140 | or through this negotiation, which is unjust,
02:23:33.180 | because if one party's right and one party's wrong,
02:23:34.780 | why negotiate?
02:23:35.620 | And this is the point.
02:23:38.620 | I'm not against competition of governance.
02:23:41.780 | I'm all for competition of governance.
02:23:43.740 | We do that all the time.
02:23:44.660 | It's called countries.
02:23:46.260 | The United States has a certain governance structure.
02:23:49.140 | The Soviet Union had a governance structure.
02:23:50.820 | Mexico has a governance structure,
02:23:52.820 | and they're competing.
02:23:54.180 | And we can observe the competition.
02:23:56.580 | In my world, you could move freely
02:23:58.660 | from one governance to another.
02:24:00.300 | If you didn't like your governance,
02:24:01.580 | you would move to a better governance system,
02:24:03.860 | but they have to have autonomy within a geographic area.
02:24:07.300 | Otherwise, what you get is complete and utter civil war.
02:24:10.780 | The law needs to be objective,
02:24:13.140 | and there needs to be one law over a piece of ground.
02:24:15.380 | And if you disagree with that law,
02:24:16.860 | you can move somewhere else where they may have a different one.
02:24:18.580 | This is why federalism is such a beautiful system.
02:24:21.540 | Even within the United States, we have states.
02:24:23.980 | And on certain issues,
02:24:25.300 | we're allowed to disagree between states,
02:24:27.100 | like the death penalty.
02:24:27.980 | Some states do, some states don't.
02:24:30.020 | Fine.
02:24:30.940 | And now I can move from one state if I don't like it.
02:24:33.780 | But there's certain issues you cannot have disagreement.
02:24:36.100 | Slavery, for example.
02:24:37.140 | This is why we had a civil war.
02:24:39.100 | But let me, one other argument against anarchy.
02:24:43.620 | Markets exist where force has been eliminated.
02:24:47.980 | - Sorry, can you say that again?
02:24:50.020 | Markets exist where the rule of force has been eliminated.
02:24:54.380 | - The rule of force?
02:24:57.060 | - Yes.
02:24:57.900 | - Can you elaborate that?
02:24:58.740 | - So a market will exist if we know
02:25:02.700 | that you can't pull a gun on me and just take my stuff.
02:25:05.700 | I am willing to engage in transaction with you
02:25:08.140 | if we have an implicit understanding
02:25:10.900 | we're not gonna use force against each other.
02:25:13.220 | - So force has something special to it.
02:25:15.620 | - Yes.
02:25:16.460 | - It's a special, it overrides,
02:25:18.820 | 'cause we're still agreeing we can manipulate each other.
02:25:21.820 | - Yes.
02:25:22.860 | But force we can't.
02:25:23.700 | - Force kinda, so there's something fundamental
02:25:27.140 | about violence.
02:25:28.340 | - Force is a fundamental force.
02:25:30.660 | It's the anti-reason.
02:25:32.500 | It's the anti-life.
02:25:34.460 | It's the anti-force against another person.
02:25:38.660 | And what it does is shuts down the mind.
02:25:41.620 | - Right.
02:25:43.020 | - So in order to have a market, you have to extract force.
02:25:48.020 | - That's fascinating.
02:25:50.100 | - How can you have a market in force?
02:25:51.500 | - There's an Instagram channel called Nature's Metal
02:25:56.500 | where it has all these videos of animals
02:26:00.940 | basically having a market of force.
02:26:03.180 | - Yes.
02:26:04.020 | - But that shuts down the ability to reason.
02:26:06.060 | And animals don't need to because they can't.
02:26:08.020 | - Exactly, so the innovation that is human beings
02:26:11.020 | is our capacity to reason.
02:26:12.500 | And therefore, the relegation of force to the animals.
02:26:16.060 | We don't do force.
02:26:17.500 | Civilization is where we don't have force.
02:26:20.260 | And so you cannot have a market in the very thing
02:26:25.260 | that a market requires the elimination of.
02:26:29.020 | And I don't debate formally these guys,
02:26:32.300 | but I interact with them all the time, right?
02:26:34.300 | And you get these absurd arguments where,
02:26:37.180 | David Friedman will say, that's Milton Friedman's son,
02:26:40.060 | he will say something like, well, in Somalia,
02:26:42.860 | in the Northern part of Somalia,
02:26:44.060 | where they have no government,
02:26:45.460 | you have all these wonderful,
02:26:46.660 | you have these tribunals of these tribes
02:26:51.300 | and they resolve disputes.
02:26:52.980 | Yeah, barbarically, they use Sharia law.
02:26:57.180 | They have no respect for individual rights,
02:26:58.820 | no respect for property.
02:27:00.380 | And the only reason they have any authority
02:27:02.580 | is because they have guns and they have power
02:27:04.980 | and they have force and they do it barbarically.
02:27:08.660 | There's nothing civilizing about the courts of Somalia.
02:27:13.660 | And they write about pirates too, because
02:27:18.220 | they don't view force as something unique
02:27:20.420 | that must be extracted from human life.
02:27:23.260 | And that's why anarchy has to devolve into violence
02:27:26.220 | because it treats force as just, what's the big deal?
02:27:29.220 | We're negotiating over guns.
02:27:32.460 | So we covered a lot of high level philosophy,
02:27:34.660 | but I'd like to touch on the troubles,
02:27:39.660 | the chaos of the day.
02:27:41.500 | A couple of things,
02:27:43.860 | and I'm really trying to find a hopeful pathway out.
02:27:48.860 | So one is the current coronavirus pandemic,
02:27:55.100 | or in particular, not the virus, but our handling of it.
02:28:00.540 | Is there something philosophically, politically
02:28:04.860 | that you would like to see, that you would like to recommend,
02:28:08.180 | that you would like to maybe give a hopeful message
02:28:11.100 | if we take that kind of trajectory
02:28:12.820 | we might be able to get out?
02:28:14.300 | Because I'm kind of worried about the economic pain
02:28:18.260 | that people are feeling, that there's this quiet suffering.
02:28:22.100 | - I mean, I agree with you completely.
02:28:23.620 | There is a quiet suffering, it's horrible.
02:28:26.140 | I mean, I know people, I go to a lot of restaurants.
02:28:29.460 | One of the things we love to do is eat out.
02:28:31.740 | My wife doesn't like cooking anymore.
02:28:33.860 | We don't have kids in the house anymore,
02:28:35.980 | so she doesn't have to.
02:28:36.820 | So we go out a lot, we go to restaurants.
02:28:38.140 | And because we have our favorites,
02:28:39.740 | so we go to them a lot,
02:28:40.580 | we get to know the owners of the restaurant, the chef.
02:28:43.820 | And it's just heartbreaking.
02:28:46.940 | These people put their life, their blood, sweat, and tears,
02:28:51.100 | I mean, real blood, sweat, and tears into these projects.
02:28:54.020 | Restaurants are super difficult to manage.
02:28:56.980 | Most of them go bankrupt anyway.
02:28:59.580 | And the restaurants, we go to a good restaurant,
02:29:01.780 | so they've done a good job,
02:29:03.100 | and they offer unique value.
02:29:06.580 | And they shut them down.
02:29:09.460 | And many of them will never open.
02:29:12.940 | Something like, they estimate 50, 60% of restaurants
02:29:16.180 | in some places won't open.
02:29:17.660 | These are people's lives, these are people's capital,
02:29:19.580 | these are people's effort, these are people's love.
02:29:22.060 | Talk about love, they love what they do,
02:29:24.300 | particularly if they're the chef as well.
02:29:26.500 | And it's gone, and it's disappeared.
02:29:28.340 | What are they gonna do with their lives now?
02:29:29.540 | They're gonna live off the government
02:29:30.620 | the way our politicians would like them to?
02:29:32.540 | Bigger and bigger stimulus plans,
02:29:34.180 | so we can hand checks to people
02:29:35.740 | to get them used to living off of us,
02:29:37.340 | rather than, it's disgusting, and it's offensive,
02:29:40.500 | and it's unbelievably sad.
02:29:42.780 | And this is where it comes to this,
02:29:44.460 | I care about other people.
02:29:45.420 | I mean, this idea that objectivists don't care.
02:29:46.940 | I mean, I love these people who provide me with pleasure
02:29:50.700 | of eating wonderful food in a great environment.
02:29:54.780 | - And there's something inspiring about them too.
02:29:56.700 | Like when I see a great restaurant,
02:29:58.500 | I wanna do better with my own stuff.
02:30:00.460 | - Yeah, exactly, they're inspiring.
02:30:02.980 | Anybody who does it is excellent.
02:30:04.740 | I love sports, because it's the one realm
02:30:07.020 | in which we still value and celebrate excellence.
02:30:10.820 | But I try to celebrate excellence in everything in my life.
02:30:13.300 | So I try to be nice to these people,
02:30:16.660 | and with COVID, we went more to restaurants,
02:30:20.260 | if you believe it or not.
02:30:21.100 | And we did more takeout stuff.
02:30:23.220 | We made an effort, particularly at the restaurants,
02:30:25.420 | we really loved to keep them going,
02:30:27.380 | to encourage them, to support them.
02:30:29.140 | The problem is, the problem is philosophy drives the world.
02:30:34.180 | The response to COVID has been worse than pathetic.
02:30:38.620 | And it's driven by philosophy.
02:30:42.140 | It's driven by disrespect to science,
02:30:46.260 | ignorance and disrespect of statistics,
02:30:48.980 | a disrespect of individual human decision-making.
02:30:53.060 | Government has to decide everything for us.
02:30:55.220 | And just throughout the process,
02:30:58.620 | and a disrespect of markets,
02:30:59.940 | because we didn't let markets work
02:31:01.380 | to facilitate what we needed
02:31:03.820 | in order to deal with this virus.
02:31:05.900 | If you look at the, it's interesting
02:31:08.100 | that the only places on the planet
02:31:09.420 | that have done well with this are parts of Asia, right?
02:31:12.660 | Taiwan did phenomenally with this.
02:31:15.180 | And the vice president of Taiwan is an epidemiologist.
02:31:18.900 | So he knew what he was doing.
02:31:20.820 | And they got it right from the beginning.
02:31:22.860 | South Korea did amazing.
02:31:25.140 | Even Hong Kong and Singapore.
02:31:26.780 | Hong Kong had just very few deaths.
02:31:30.420 | And the economy wasn't shut down in any of those places.
02:31:35.140 | There were no lockdowns in any of those places.
02:31:37.580 | The CDC had plans before this happened
02:31:43.860 | on how to deal with this, good plans.
02:31:45.780 | Indeed, if you ask people around the world
02:31:48.260 | before the pandemic,
02:31:49.140 | which country is best prepared for a pandemic,
02:31:52.100 | they would have said the United States.
02:31:53.740 | Because of the CDC's plans
02:31:55.420 | and all of our emergency reserves and all that,
02:31:57.540 | and the wealth.
02:31:58.380 | And yet all of that went out the window
02:32:02.700 | because people panicked.
02:32:05.260 | People didn't think, go back to reason.
02:32:08.660 | People were arrogant,
02:32:10.220 | refused to use the tools that they had
02:32:14.340 | at their disposal to deal with this.
02:32:16.260 | So you deal with pandemics.
02:32:17.460 | It's very simple how you deal with pandemics.
02:32:19.060 | And this is how South Korea and Taiwan,
02:32:21.260 | you deal with them by testing,
02:32:24.380 | tracing, and isolating.
02:32:27.380 | That's it.
02:32:28.580 | And you do it well.
02:32:29.940 | And you do it vigorously.
02:32:30.980 | And you do it on scale if you have to.
02:32:32.940 | And you scale up to do it.
02:32:34.140 | And we have the wealth to do that.
02:32:35.780 | - So one question I have,
02:32:39.060 | it's a difficult one.
02:32:41.220 | So I talk about love a lot.
02:32:43.860 | And you've just talked about Donald Trump.
02:32:45.620 | I guarantee you this particular segment
02:32:47.900 | will be full of division from the internet.
02:32:51.260 | But I believe that should be and can be fixed.
02:32:56.260 | What I'm referring to in particular is the division
02:33:00.460 | because we've talked about the value of reason.
02:33:03.340 | And what I've noticed on the internet
02:33:06.100 | is the division shuts down reason.
02:33:10.220 | So when people hear you say Trump,
02:33:12.500 | actually the first sentence you said about Trump,
02:33:14.620 | they'll hear Trump and their ears will perk up.
02:33:17.260 | And they'll immediately start in that first sentence,
02:33:19.700 | they'll say, is he a Trump supporter or a Trump--
02:33:22.740 | - They're not interested in anything else after that.
02:33:24.540 | - And then after that, that's it.
02:33:26.460 | And what, how do, so my question is,
02:33:29.380 | you as one of the beacons of intellectualism,
02:33:34.340 | maybe, to be quite honest, I mean, it sounds silly to say,
02:33:37.540 | but you are a beacon of reason.
02:33:40.660 | How do we bring people together
02:33:43.580 | long enough to where we can reason?
02:33:48.340 | - I mean, there's no easy way out of this
02:33:51.020 | because of the fact that people have become tribal,
02:33:54.540 | and they have, very tribal.
02:33:56.380 | And in the tribe, reason doesn't matter.
02:34:02.580 | It's all about emotion.
02:34:04.620 | It's all about belonging or not belonging.
02:34:06.380 | And you don't wanna stand out.
02:34:07.980 | You don't wanna have a different opinion.
02:34:10.140 | You wanna belong.
02:34:11.220 | And it's all about belonging.
02:34:13.420 | It took us decades to get back to tribalism
02:34:18.340 | where we were hundreds of years ago.
02:34:20.580 | It took millennia to get out of tribalism.
02:34:23.140 | It took the enlightenment to get us
02:34:24.740 | to the point of individualism where we think for ourselves,
02:34:26.780 | and reason, respect for reason.
02:34:28.380 | Before that, we were all tribal.
02:34:30.180 | So it took the enlightenment to get us out of it.
02:34:31.980 | We've been in the enlightenment for about 250 years
02:34:34.460 | influenced by the enlightenment and it's fading.
02:34:38.060 | The impact is fading.
02:34:39.900 | So what would we need to get out of it?
02:34:42.260 | We need self-esteem.
02:34:43.380 | People join a tribe because they don't trust their own mind.
02:34:48.980 | People join a tribe because they're afraid
02:34:52.780 | to stand on their own two feet.
02:34:54.140 | They're afraid to think for themselves.
02:34:55.900 | They're afraid to be different.
02:34:57.140 | They're afraid to be unique.
02:34:58.340 | They're afraid to be an individual.
02:35:00.820 | People need self-esteem.
02:35:02.420 | To gain self-esteem, they have to have respect
02:35:07.420 | for rationality.
02:35:08.780 | They have to think and they have to achieve
02:35:10.780 | and they have to recognize that achievement.
02:35:12.980 | To do that, they have to have respect for thinking.
02:35:19.740 | They have to have respect for reason.
02:35:22.140 | And we have to, and think about the schools.
02:35:24.820 | We have to have schools that teach people to think,
02:35:27.620 | teach people to value their mind.
02:35:29.900 | We have schools that teach people to feel
02:35:32.700 | and value their feelings.
02:35:33.820 | We have groups of six-year-olds sitting around a circle
02:35:36.260 | discussing politics.
02:35:37.620 | What?
02:35:38.460 | They don't know anything.
02:35:39.660 | They're ignorant.
02:35:40.980 | See, you don't know anything when you're ignorant.
02:35:43.260 | Yes, you can feel, but your feelings are useless
02:35:46.420 | as decision-making tools.
02:35:49.140 | But we emphasize emotion.
02:35:51.340 | It's all about socialization and emotion.
02:35:53.700 | This is why they talk about this generation of snowflakes.
02:35:57.500 | They can't hear anything that they're opposed to
02:36:00.500 | because they've not learned how to use their mind,
02:36:03.420 | how to think.
02:36:04.300 | So it boils down to teaching people how to think,
02:36:09.020 | two things, how to think and how to care about themselves.
02:36:13.100 | So it's thinking and self-esteem, and they're connected,
02:36:16.260 | because when you think, you achieve,
02:36:18.620 | which gains you self-esteem.
02:36:20.860 | When you have self-esteem, it's easier to think for yourself.
02:36:23.860 | And I don't know how you do that quickly.
02:36:28.060 | I mean, I think leadership matters.
02:36:31.380 | So, you know, part of what I try to do
02:36:33.860 | is try to encourage people to do those things,
02:36:36.940 | but I am a small voice.
02:36:38.980 | You asked me early on,
02:36:40.420 | you said we should talk about why I'm not more famous.
02:36:42.820 | I'm not famous.
02:36:43.980 | My following is not big.
02:36:45.140 | It's very small in the scope of things.
02:36:48.660 | - Well, yours and objectivism, and that question,
02:36:51.500 | could you linger on it for a moment?
02:36:53.580 | Why isn't objectivism more famous?
02:36:57.980 | - I think because it's so challenging.
02:37:00.380 | It's not challenging to me, right?
02:37:03.100 | When I first encountered objectivism,
02:37:05.140 | it's like after the first shock
02:37:07.900 | and after the first kind of,
02:37:10.100 | none of this can be true, this is all BS,
02:37:12.820 | and fighting it, once I got it, it was easy.
02:37:17.740 | It required years of studying,
02:37:19.140 | but it was easy in the sense of, yes, this makes sense.
02:37:22.700 | But it's challenging because it upends everything.
02:37:25.860 | It really says what my mother taught me is wrong.
02:37:28.820 | And what my politicians say, left and right, is wrong.
02:37:32.700 | All of them.
02:37:33.540 | There's not a single politician
02:37:35.420 | that I agree with on almost anything, right?
02:37:39.220 | Because on the fundamentals, we disagree.
02:37:42.140 | And what my teachers are telling me is wrong,
02:37:45.300 | and what Jesus said is wrong, and it's hard.
02:37:50.300 | - But the thing is, so you talk about politics
02:37:53.580 | and all that kind of stuff, but most people don't care.
02:37:56.260 | The more powerful thing about objectivism
02:37:58.740 | is the practical impact on my life,
02:38:02.180 | of how I revolutionized my life.
02:38:04.860 | And that feels like a very important and appealing thing,
02:38:09.860 | you know, get your shit together.
02:38:12.180 | - Yeah, but this is why Jordan Peterson
02:38:14.580 | is so much more successful than we are, right?
02:38:16.300 | - Why is that?
02:38:17.380 | Make your bed, or whatever.
02:38:19.100 | - What's that?
02:38:19.940 | - Make your bed, or whatever he says.
02:38:20.780 | - Yeah, because his personal responsibility is shallow.
02:38:23.380 | It's make your bed, stand up straight.
02:38:25.700 | It's what my mother told me when I was growing up.
02:38:27.340 | There's nothing new about Jordan Peterson.
02:38:29.660 | He says, embrace Christianity, Christianity's fine, right?
02:38:33.940 | Religion is okay.
02:38:36.020 | Just do these few things and you'll be fine.
02:38:38.060 | And by the way, he says, happiness, you know,
02:38:42.380 | you either have it or you don't.
02:38:43.660 | You know, it's random.
02:38:44.700 | You don't actually, you can't bring about your own happiness.
02:38:47.180 | So he's giving people an easy out.
02:38:49.140 | People want easy outs.
02:38:50.220 | People buy self-help books that give them five principles
02:38:53.900 | for living a, you know, shallow life. I'm telling them,
02:38:57.460 | think, stand on your own two feet, be independent.
02:39:02.640 | Don't listen to your mother.
02:39:04.980 | Do your own thing, but thoughtfully, not based on emotions.
02:39:09.480 | - So you're responsible not just for a set
02:39:12.420 | of particular habits and so on.
02:39:14.900 | You're responsible for everything.
02:39:17.260 | - Yes, and you're responsible, here's the big one, right?
02:39:20.140 | You're responsible for shaping your own soul.
02:39:25.580 | Your consciousness, you get to decide
02:39:30.300 | what it's gonna be like.
02:39:32.060 | - And the only tool you have is your mind.
02:39:34.460 | - Your only tool is your mind.
02:39:36.580 | Well, your emotions are a tool
02:39:38.220 | when they're properly cultivated.
02:39:39.580 | They play a role in that.
02:39:41.420 | And the tools you have is thinking, experiencing, living,
02:39:45.340 | coming to the right conclusions, you know,
02:39:47.580 | listening to great music and watching good movies.
02:39:50.540 | And art is very important in shaping your own soul
02:39:54.260 | and helping you do this.
02:39:56.220 | It's got a crucial role in that, but it's work.
02:40:01.720 | And it's lonely work,
02:40:04.580 | because it's work you do with yourself.
02:40:05.980 | Now, if you find somebody who you love,
02:40:08.060 | who shares these values and you can do with them,
02:40:10.220 | that's great, but it's mostly lonely work.
02:40:12.980 | It's hard, it's challenging, it upends your world.
02:40:16.660 | The reward is unbelievable.
02:40:18.700 | But even at the, think about the enlightenment, right?
02:40:23.700 | So up until the enlightenment, where was truth?
02:40:26.820 | Truth came from a book.
02:40:27.980 | And there were a few people who understood the book.
02:40:30.740 | Most of us couldn't read and they conveyed it to us.
02:40:33.460 | And they just told us what to do.
02:40:34.700 | And in that sense, life's easy.
02:40:36.140 | It sucks.
02:40:37.300 | And we die young and we have nothing and we don't enjoy it,
02:40:41.020 | but it's easy.
02:40:42.540 | And the enlightenment comes around and says,
02:40:45.260 | "We've got this tool, it's called reason.
02:40:49.100 | "And it allows us to discover truth about the world.
02:40:51.500 | "It's not in a book.
02:40:52.860 | "It's actually your reason allows you
02:40:54.780 | "to discover stuff about the world."
02:40:56.500 | And I consider the first, really the first figure
02:41:00.140 | of the enlightenment is Newton, not Locke, right?
02:41:02.460 | It's a scientist.
02:41:03.500 | Because he teaches us the laws of mechanics,
02:41:07.940 | like how does stuff work?
02:41:10.060 | And people go, "Oh, wow, this is cool.
02:41:13.160 | "I can use my mind.
02:41:14.540 | "I can discover truth.
02:41:16.120 | "Isn't that amazing?"
02:41:18.100 | And everything opens up once you do that.
02:41:19.900 | Hey, if I can discover, if I understand the laws of motion,
02:41:23.940 | if I can understand truth in the world,
02:41:25.620 | how come I can't decide who I marry?
02:41:27.620 | I mean, everything was fixed in those days.
02:41:29.940 | How come I can't decide what profession I should be in?
02:41:33.100 | Right, everybody belonged to a guild.
02:41:35.180 | How come I can't decide who my political leader should be?
02:41:38.500 | That's, so it's all reason.
02:41:40.580 | It's all, once you understand the efficacy of your own mind
02:41:43.260 | to understand truth, to understand reality,
02:41:45.220 | discover truth, not understand truth, discover it,
02:41:48.220 | everything opens up.
02:41:49.300 | Now you can take responsibility for your own life
02:41:51.300 | 'cause now you have the tool to do it.
02:41:53.980 | But we are living in an era where postmodernism tells us
02:41:57.240 | there is no truth, there is no reality,
02:41:59.140 | and our mind is useless anyway.
02:42:00.860 | Critical race theory tells us that you're determined
02:42:04.420 | by your race and your race shapes everything
02:42:06.980 | and your free will is meaningless
02:42:08.620 | and your reason doesn't matter
02:42:10.260 | 'cause reason is just shaped by your genes
02:42:12.780 | and shaped by your color of your skin.
02:42:15.040 | It's the most racist theory of all.
02:42:17.140 | And you've got our friend at UC Irvine telling them,
02:42:21.140 | "Oh, your senses don't tell you anything about reality.
02:42:24.260 | "Anyway, reality is what it is.
02:42:25.400 | "So, you know, what's the purpose of reason?
02:42:28.160 | "It's to invent stuff, it's to make stuff up.
02:42:29.980 | "Then what use is that?
02:42:30.960 | "It's complete fantasy."
02:42:32.840 | You've basically got every philosophical,
02:42:35.780 | intellectual voice in the culture
02:42:37.620 | telling them their reason is impotent.
02:42:42.020 | There's like a Steven Pinker who tries,
02:42:44.740 | and I love Pinker and he's really good
02:42:47.140 | and I love his books,
02:42:48.180 | but he needs to be stronger about this.
02:42:53.420 | And there's a few people on kind of,
02:42:55.180 | there's a few people partially in the intellectual dark web
02:42:57.620 | in other ways who are big on reason,
02:42:59.560 | but not consistent enough and not full understanding
02:43:02.600 | of what it means or what it implies.
02:43:05.160 | And then there's little old me.
02:43:06.720 | (laughing)
02:43:08.780 | And it's me against the world in a sense
02:43:10.680 | because I'm not only willing to accept,
02:43:13.040 | to articulate the case for reason,
02:43:16.600 | but then what that implies.
02:43:18.560 | It implies freedom, it implies capitalism,
02:43:20.600 | it implies taking personal responsibility
02:43:22.240 | over your own life.
02:43:23.240 | And there are other intellectual dark web people who
02:43:25.160 | get to reason and then, "Oh, politics, you can be whatever."
02:43:28.860 | No, you can't, you can't be a socialist and for reason.
02:43:32.340 | It doesn't actually, those are incompatible.
02:43:35.080 | And you can't be a determinist and for reason.
02:43:38.260 | Reason and determinism don't go together.
02:43:40.700 | The whole point of reason is that it's an achievement
02:43:43.620 | and it requires effort and it requires engagement
02:43:45.620 | and it requires choice.
02:43:47.340 | So it is, it does feel like a little old me
02:43:49.700 | because that's it.
02:43:51.540 | The allies I have are allies,
02:43:53.520 | I have allies among some libertarians over economics.
02:43:56.760 | I have some allies in the intellectual dark web
02:43:58.680 | maybe over reason,
02:44:00.000 | but none of them are allies in the full sense.
02:44:02.800 | My allies are the other objectivists,
02:44:04.540 | but there just aren't a lot of us.
02:44:06.400 | - For people listening to this,
02:44:10.280 | for the few folks kind of listening to this
02:44:12.360 | and thinking about the trajectory of their own life,
02:44:17.760 | I guess the takeaway is reason is a difficult project,
02:44:22.760 | but a project that's worthy of taking on.
02:44:27.300 | - Yeah, and difficult is,
02:44:29.620 | I don't know if difficult is the right word
02:44:31.140 | 'cause difficult sounds like it's,
02:44:33.020 | I have to push this boulder up a hill.
02:44:35.220 | It's not difficult in that sense.
02:44:37.020 | It's difficult in the sense that it requires energy
02:44:39.020 | and focus, it requires effort,
02:44:41.720 | but it's immediately rewarding.
02:44:43.980 | It's fun to do and its rewards are immediate, pretty quick.
02:44:48.980 | It takes a while to undo all the garbage that you have,
02:44:53.960 | that we all have, that I had. It took me years
02:44:56.640 | and years and years to get rid of certain concepts
02:44:58.600 | and certain emotions that I had that didn't make any sense,
02:45:01.960 | but it takes a long time to fully integrate that.
02:45:04.960 | So I don't want it to sound like it's a burden,
02:45:09.240 | like it's hard in that sense.
02:45:11.520 | It does require focus and energy.
02:45:13.760 | And I don't want to sound like a Mr. Spock.
02:45:16.760 | I don't want to say, and I don't think I do
02:45:18.640 | because I'm a pretty passionate guy,
02:45:20.360 | but I don't want it to appeal like,
02:45:22.060 | oh, just forget about emotions.
02:45:24.400 | Emotions are how you experience the world.
02:45:26.900 | You want to have strong emotions.
02:45:29.700 | You wanna live, you wanna experience life strongly
02:45:33.760 | and passionately.
02:45:35.760 | You just need to know that emotions are not cognition.
02:45:39.580 | It's another realm.
02:45:40.700 | It's like, don't mix the realms.
02:45:42.460 | Think about outcomes and then experience them.
02:45:45.360 | And sometimes your emotions won't coincide
02:45:47.320 | with what you think should be.
02:45:49.920 | And that means there's still more integration to be done.
02:45:52.680 | - Yaron, as I told you offline,
02:45:55.960 | I've been a fan of yours for a long time.
02:45:58.200 | It's been, I was a little starstruck early on,
02:46:01.720 | getting a little more comfortable now.
02:46:02.920 | - I believe that's gone.
02:46:04.120 | - I highly recommend that people
02:46:08.480 | that haven't heard your work, listen to it.
02:46:13.180 | The Yaron Brook Show.
02:46:14.180 | The times I've disagreed with something I've heard you say
02:46:18.820 | have usually been a first step on a journey
02:46:21.980 | of learning a lot more about that thing,
02:46:24.420 | about that viewpoint.
02:46:25.780 | And that's been so fulfilling.
02:46:27.300 | It's been a gift.
02:46:28.140 | The passion, you talk about reason a lot,
02:46:32.140 | but the passion radiates in a way
02:46:35.420 | that's just contagious and awe-inspiring.
02:46:38.180 | So thank you for everything you've done for this world.
02:46:41.020 | It's truly an honor and a pleasure to talk to you.
02:46:43.560 | - Well, thank you.
02:46:44.400 | And it's, my reward is that if I've had an impact on you
02:46:48.560 | and people like you, wow, I mean, that's amazing.
02:46:51.440 | When you wrote to me an email saying you being a fan,
02:46:54.200 | I was blown away 'cause I had no idea
02:46:56.840 | and completely unexpected.
02:46:58.440 | And I, every few months I discover,
02:47:02.360 | hey, I had an impact on this world
02:47:03.920 | and people that I would have never thought.
02:47:05.720 | And so the only way to change the world
02:47:10.800 | is to change it one mind at a time.
02:47:12.700 | And when you have an impact on a good mind
02:47:18.140 | and a mind that cares about the world
02:47:20.200 | and a mind that goes out and does something about it,
02:47:22.600 | then you get the exponential growth.
02:47:24.840 | So through you, I've impacted other people
02:47:27.840 | and that's how you get,
02:47:29.140 | that's how you ultimately change everything.
02:47:31.840 | And so I'm, in spite of everything,
02:47:34.280 | I'm optimistic in a sense that I think
02:47:37.080 | that the progress we've made today
02:47:39.800 | is so universally accepted,
02:47:41.880 | that scientific progress, the technological progress,
02:47:44.360 | it can't just vanish like it did when Rome collapsed.
02:47:48.800 | And whether it's in the United States or somewhere,
02:47:51.200 | progress will continue.
02:47:52.720 | The human project for human progress will continue.
02:47:57.800 | And I think these ideas,
02:47:58.960 | the ideas of reason and individualism
02:48:00.800 | will always be at the heart of it.
02:48:02.520 | And what we are doing
02:48:05.040 | is continuing the project of the enlightenment.
02:48:07.040 | And it's the project that will save the human race
02:48:12.040 | and allow it to, for Elon Musk
02:48:14.880 | and for Jeff Bezos to reach the stars.
02:48:19.040 | - Thank you for masterfully ending on a hopeful note.
02:48:22.560 | Yaron, a pleasure and an honor.
02:48:24.320 | Thanks.
02:48:25.700 | Thanks for listening to this conversation
02:48:27.240 | with Yaron Brook.
02:48:28.400 | And thank you to our sponsors,
02:48:30.480 | Blinkist, an app I use for reading
02:48:32.520 | through summaries of books,
02:48:34.080 | ExpressVPN, the VPN I've used for many years
02:48:37.400 | to protect my privacy on the internet,
02:48:39.960 | and Cash App, the app I use to send money to friends.
02:48:43.440 | Please check out these sponsors in the description
02:48:45.900 | to get a discount and to support this podcast.
02:48:49.400 | If you enjoy this thing, subscribe on YouTube,
02:48:51.800 | review it with five stars on Apple Podcast,
02:48:54.040 | follow on Spotify, support on Patreon,
02:48:56.760 | or connect with me on Twitter @LexFriedman.
02:49:00.040 | And now let me leave you with some words from Ayn Rand.
02:49:04.000 | Do not let your fire go out,
02:49:06.480 | spark by irreplaceable spark in the hopeless swamps
02:49:11.160 | of the not quite, the not yet, and the not at all.
02:49:15.960 | Do not let the hero in your soul perish
02:49:19.040 | in lonely frustration for the life you deserved
02:49:22.280 | but have never been able to reach.
02:49:24.860 | The world you desire can be won.
02:49:27.400 | It exists.
02:49:28.720 | It is real.
02:49:30.040 | It is possible.
02:49:31.640 | It is yours.
02:49:33.980 | Thank you for listening and hope to see you next time.
02:49:37.560 | (upbeat music)
02:49:40.140 | (upbeat music)