
Zev Weinstein: The Next Generation of Big Ideas and Brave Minds | Lex Fridman Podcast #158


Chapters

0:00 Introduction
1:38 Philosophy becomes dangerous in difficult times
7:01 The power of radical ideas
11:52 Changing your mind
16:06 Fear
17:45 Labels
22:48 Thomas Aquinas
27:18 Nietzsche
31:49 Nature of truth
34:09 Jordan Peterson
39:41 Mediums of communication
48:10 Free will
52:34 Simulation
56:51 Transcending the limits of human life
59:56 Elon Musk
64:17 Aliens
67:48 Is math invented or discovered?
70:22 Theory of everything
72:17 Eric Weinstein as a dad
88:04 Music
95:19 Advice for young people
97:18 Mortality
101:05 Meaning of life


00:00:00.000 | The following is a conversation with Zev Weinstein,
00:00:03.160 | a young man with a brilliant, bold, and hopeful mind
00:00:06.400 | that I had the great fortune of talking to
00:00:09.000 | on a recent afternoon.
00:00:10.640 | He happens to be Eric Weinstein's son,
00:00:13.720 | but I invited Zev not because of that,
00:00:15.800 | but because I got a chance to listen to him speak
00:00:17.800 | on a few occasions and was captivated
00:00:20.560 | by how deeply he thought about this world
00:00:22.920 | at such a young age.
00:00:24.360 | And I thought that it might be fun
00:00:25.960 | to explore this world of ours together with him
00:00:29.160 | for a time through this conversation.
00:00:31.800 | Quick mention of our sponsors,
00:00:33.480 | ExpressVPN, Grammarly Grammar Assistant,
00:00:37.160 | SimpliSafe Home Security,
00:00:39.000 | and Magic Spoon Low Carb Cereal.
00:00:41.800 | So the choice is privacy, grammar, safety, or health.
00:00:46.160 | Choose wisely, my friends.
00:00:47.560 | And if you wish, click the sponsor links below
00:00:49.960 | to get a discount and to support this podcast.
00:00:52.920 | As a side note, let me say that Zev acknowledges
00:00:56.440 | the fear associated with participating in public discourse
00:00:59.960 | and is brave enough to join in at a young age
00:01:03.160 | to push forward, to change his mind publicly,
00:01:05.800 | to learn, to articulate difficult, nuanced ideas,
00:01:09.280 | and grow from the conversations that follow.
00:01:11.920 | In this, I hope he leads the next generation of minds
00:01:15.120 | that is joining and steering the collective intelligence
00:01:18.080 | of this big ant colony we think of
00:01:21.120 | as our human civilization.
00:01:23.440 | If you enjoy this thing, subscribe on YouTube,
00:01:25.880 | review it on Apple Podcasts, follow us on Spotify,
00:01:29.000 | support on Patreon, or connect with me
00:01:31.200 | on Twitter @lexfridman.
00:01:33.720 | And now, here's my conversation with Zev Weinstein.
00:01:37.740 | You've said that philosophy becomes more dangerous
00:01:42.160 | in difficult times.
00:01:43.520 | What do you mean by that?
00:01:44.720 | - Interestingly, I think I mean two things by that.
00:01:48.440 | And I think firstly, I should clarify,
00:01:51.400 | when I say philosophy, I sort of mean
00:01:53.560 | in a very traditional sense, just thinking, ideation,
00:01:58.560 | and that could be reconsidering our notions of self
00:02:03.160 | in a very traditional sense, which we consider philosophy,
00:02:06.560 | or that could be like technological innovation.
00:02:10.720 | I think it's important to recognize all of these
00:02:13.060 | as philosophies; we cannot question
00:02:16.500 | whether it's important to promote thought.
00:02:18.800 | I think the other thing I should clarify
00:02:21.040 | is when I say difficult times,
00:02:23.160 | I mean times when nothing is growing,
00:02:25.960 | and so the risk for real conflict is much greater
00:02:30.960 | because people are incentivized to fight over the things
00:02:34.840 | which already exist.
00:02:37.200 | I think when times are not difficult,
00:02:41.040 | the people with the greatest power are usually the people
00:02:46.320 | who are very creative, generating a lot,
00:02:48.920 | and that really requires ideation or philosophy
00:02:52.240 | of some sort.
00:02:53.400 | I think when times become stagnant,
00:02:56.600 | the important successful people become the people
00:03:00.680 | who are very good at protecting their own pieces of the pie
00:03:04.320 | and taking others.
00:03:05.800 | I think that those people have to be very opposed
00:03:11.640 | to any sort of thinking that could restructure society
00:03:16.080 | or conventions about who should succeed.
00:03:21.920 | And so firstly, I mean by that
00:03:25.200 | that it becomes much more dangerous for a person
00:03:30.200 | to think deeply and question during a time
00:03:35.320 | when the important people are those concerned
00:03:38.600 | with making sure no one rocks the boat.
00:03:41.560 | One example of this would be Socrates and his execution
00:03:45.640 | because everyone was happy enough to sit
00:03:47.800 | through his questions before there was war and poverty
00:03:51.360 | and distress, and afterwards it just became too dangerous.
00:03:54.560 | The other thing I mean by that is that the consequences
00:03:58.640 | of thinking deeply carry much greater potential
00:04:03.120 | for real catastrophe when everyone is desperate.
00:04:06.960 | So like for example, the Communist Manifesto
00:04:11.800 | was probably much more dangerous during early 1900s Russia
00:04:16.800 | than it was during the 1848 revolutions
00:04:19.800 | because I think people were in much worse shape
00:04:22.880 | and desperate people are very willing to dive
00:04:27.520 | into anything new that might bring the future
00:04:31.000 | without fully calculating whatever the consequences
00:04:34.920 | or risks might be.
00:04:36.600 | So it is both more dangerous for a person
00:04:39.160 | to have creative ideas and those ideas are more dangerous
00:04:43.680 | when times are tough.
00:04:47.000 | - And by dangerous you mean it challenges the people
00:04:51.120 | with power who want to maintain that power
00:04:55.120 | in times of stagnation when there's not much growth,
00:04:59.440 | innovation, creativity, all that kind of stuff.
00:05:01.880 | - Right, and we know that if nothing new is created,
00:05:06.280 | people have promises that they've made
00:05:10.280 | about what will be paid to whom, what debt structure is.
00:05:14.560 | The only possibility if stagnation lasts for long enough
00:05:19.560 | is really some kind of great conflict, great war
00:05:24.280 | because people have to take from others
00:05:25.960 | to make good on their own promises.
00:05:28.720 | So we know that by denying any sort of grand ideation,
00:05:33.720 | we are accepting that there will be some kind
00:05:37.440 | of great catastrophe.
00:05:39.800 | And so we have to understand that philosophy
00:05:42.840 | is the most important when we've seen too much stagnation
00:05:47.840 | for too long.
00:05:49.600 | It is also very dangerous and it's dangerous
00:05:53.920 | for the people who are doing it and it's dangerous
00:05:56.320 | for the people who believe it,
00:05:58.000 | but it's kind of our only way out ever.
00:06:00.360 | - And again, by philosophy you mean the bigger,
00:06:03.760 | so it's not academic philosophy or this kind of games played
00:06:07.760 | in the space of just like moral philosophy
00:06:10.560 | and all those metaphysics, all that kind of stuff.
00:06:12.840 | You mean just thinking deeply about this world,
00:06:15.680 | thinking from first principles.
00:06:17.600 | I think your like Twitter line involves something about--
00:06:22.120 | - Trying to piece everything together
00:06:23.560 | from first principles.
00:06:24.840 | - So that's fundamentally what being philosophical
00:06:27.800 | about this world is and that's where the people
00:06:31.200 | who are thinking deeply about this world
00:06:33.160 | are the ones who are feeding,
00:06:36.080 | who are the catalyst of this growth in society and so on.
00:06:39.200 | - Yeah, I mean I also think that the real implication
00:06:43.080 | of moral philosophy can be something
00:06:48.080 | that most would consider like a real political implication.
00:06:52.640 | So I think all philosophy really ties together
00:06:56.000 | because there has to be some sort of grand structure
00:06:58.840 | to all thought and how it relates.
00:07:01.800 | - Do you think this growth and innovation
00:07:06.800 | and improvement can last forever?
00:07:09.440 | We've seen some incredible,
00:07:12.460 | the things that humans have been able to accomplish
00:07:16.000 | over the past several hundred years,
00:07:19.160 | it's just I mean awe-inspiring
00:07:21.480 | and every moment in that history,
00:07:24.920 | it almost seemed like no more could be done.
00:07:27.800 | Like we've solved all the problems that are to be solved.
00:07:30.840 | I mean there's just historically
00:07:32.480 | there's all these kind of ridiculous
00:07:34.240 | Bill Gates style quotes where like,
00:07:37.200 | it's obvious that this new cool thing's
00:07:40.440 | not gonna take off and yet it does.
00:07:43.360 | And so there's a feeling of the same kind of pattern
00:07:47.160 | that we see in Moore's Law.
00:07:48.200 | There's constant growth in different technologies
00:07:51.160 | in the modern day era, in any kind of automation
00:07:54.120 | over the past hundred years.
00:07:55.580 | Do you think it's possible that we'll keep growing this way
00:07:58.960 | if we give power to the philosophers of our society?
00:08:02.360 | - I think the only way that we can keep growing this way
00:08:05.400 | is if we give power to real thinkers.
00:08:08.260 | And there's no guarantee that that will work
00:08:11.120 | but we sort of don't have any other choice.
00:08:13.800 | And I think you're entirely right that this period
00:08:17.120 | of both understanding the universe at a rate
00:08:22.120 | which has never been seen before
00:08:25.160 | and invention and creativity,
00:08:30.000 | these past hundred years have been sort of uncharacteristic
00:08:33.760 | for the level of growth that we've seen in all of history.
00:08:38.320 | We've never seen anything like this.
00:08:41.160 | And I think a lot of our promises
00:08:43.640 | rest on this sort of thing continuing.
00:08:51.820 | I think that's very dangerous.
00:08:53.580 | But the one thing that can get us out of this
00:08:55.400 | is philosophy and being ready to radically restructure
00:09:00.400 | all of our notions about what should be, what is.
00:09:05.560 | I think that's very important.
00:09:06.800 | - So you think deeply about this world.
00:09:08.800 | You are clearly this embodiment of a thinker,
00:09:10.920 | of a philosopher.
00:09:11.980 | Your dad is also one such guy, Eric Weinstein.
00:09:16.120 | Do you have big disagreements with him
00:09:20.080 | on this topic in particular?
00:09:21.600 | I think people should know he also happens
00:09:24.800 | to be in the room, but the mics can't pick him up
00:09:28.360 | so he can heckle, it doesn't even matter.
00:09:30.920 | But do you have disagreements with him on this point?
00:09:34.600 | Let me try to summarize his argument
00:09:37.680 | that we were actually based a lot of our American society
00:09:42.680 | on the belief that things will keep growing.
00:09:45.680 | And yet it seems that however you break it apart,
00:09:50.640 | maybe from an economics perspective,
00:09:52.880 | that they're not growing currently.
00:09:55.160 | And so that's where a lot of our troubles are at.
00:09:57.680 | Do you have the same sense that there's a stagnation period
00:10:00.880 | that we're living through over the past couple of decades?
00:10:03.480 | - I think stagnation, modern stagnation,
00:10:06.600 | is completely undeniable, particularly scientifically.
00:10:11.120 | And I think there have been a few fields
00:10:13.200 | where tremendous progress has been made very recently.
00:10:18.000 | I think my dad might feel that there is sort of
00:10:23.000 | an inevitability to the ending of this period.
00:10:29.720 | And I'm not so certain that the fall of this great time
00:10:34.720 | is completely inevitable because I don't know
00:10:39.680 | what thoughts we're capable of producing,
00:10:42.560 | what we're able to reconsider.
00:10:45.160 | I think we really have to be open to the possibility
00:10:49.000 | that all of our standard frameworks where,
00:10:54.000 | he will talk about embedded growth obligations.
00:10:57.640 | If we continue within the same framework,
00:11:00.400 | then we're very susceptible to the dangers
00:11:02.160 | of whatever these embedded growth obligations are.
00:11:05.200 | I think if we break the frameworks,
00:11:06.920 | we have no reason to believe that the problems
00:11:09.040 | we're experiencing with our current frameworks
00:11:11.640 | will follow us.
00:11:13.160 | And I think that's the importance of radical thought
00:11:15.520 | is we don't know what the solution is,
00:11:17.880 | but if there is a solution,
00:11:19.160 | it will be born from some very fundamental thinking.
00:11:22.680 | And so I have great hope.
00:11:24.760 | - So you have optimism about sort of the power
00:11:27.720 | of a single radical idea or a single radical thinker
00:11:31.160 | to break our frameworks and break us out
00:11:36.160 | of this like spiral down due to whatever
00:11:39.760 | the economic forces that are creating
00:11:43.760 | this current stagnation.
00:11:45.300 | - Yeah, I'm very, very hopeful.
00:11:47.720 | - The optimism of youth.
00:11:48.920 | Well, I share your optimism.
00:11:52.060 | So let me come back to something you've also talked about.
00:11:56.080 | You have very little stuff out there currently,
00:11:59.200 | but the things you have out there, your thoughts,
00:12:02.600 | you could just tell how deeply you think about this world.
00:12:05.320 | And one of the things you mentioned is,
00:12:07.620 | as you learn about this world, as you read,
00:12:12.500 | as you sort of go through different experiences,
00:12:15.760 | that you're open to changing your mind.
00:12:21.440 | How often do you find yourself changing your mind?
00:12:23.880 | Do you think Zev from 10 years into the future
00:12:26.380 | will look back at this conversation we're having now
00:12:29.040 | and disagree completely with everything you just said?
00:12:33.160 | - It's entirely possible.
00:12:34.520 | And that's one of the things that scares me so much
00:12:36.880 | about appearing publicly.
00:12:39.440 | I think that the internet can be very intolerant
00:12:43.280 | of inconsistency.
00:12:44.880 | And I am entirely prepared to be very inconsistent
00:12:49.400 | because I know that whatever beliefs I have
00:12:54.200 | when subjected to scrutiny may change
00:12:56.900 | because that's really the only way
00:12:59.200 | to form your truest, most fundamental conceptions
00:13:04.200 | about the world around you.
00:13:06.680 | And it would take an infinite amount of time
00:13:09.320 | to subject every single one of your beliefs to scrutiny.
00:13:12.960 | And so that's a process that must follow me
00:13:16.120 | throughout my entire life.
00:13:18.740 | And I know that means that my opinions and perspectives
00:13:23.600 | are always to be changing.
00:13:26.560 | I'm prepared to accept that about myself.
00:13:31.200 | Whether other people are prepared to accept
00:13:34.880 | that my public opinions may change very greatly over time
00:13:39.880 | is something I don't know.
00:13:46.000 | I don't know how tolerant the world will be.
00:13:48.560 | But I'm very prepared to change anything I believe in
00:13:53.560 | if I think deeply enough about it
00:13:55.440 | or a good enough argument is made so that I might reconsider.
00:13:59.680 | - Well, there's certainly is currently an intolerance
00:14:02.240 | and that's one of the problems of our age.
00:14:04.920 | There's an intolerance towards change.
00:14:07.320 | I'll also ask you about labels.
00:14:09.000 | You talked about sort of we like to bin each other
00:14:11.080 | into different categories, blue or red
00:14:14.160 | or whatever the different categorization is.
00:14:17.020 | But it seems like the task before you
00:14:20.720 | as a young person defining our future
00:14:23.200 | is to make a tolerance of change the norm.
00:14:28.200 | Doing this podcast, for example,
00:14:30.320 | and then changing your mind one or two years later
00:14:33.040 | and doing so publicly without a big dramatic thing
00:14:36.520 | or maybe changing it on a daily basis
00:14:38.960 | and just being open about it
00:14:41.400 | and being transparent about your thought process.
00:14:43.720 | Maybe that is the beacon of hope for the philosophical way,
00:14:48.320 | the path of the philosopher.
00:14:51.700 | So that's your task in a sense
00:14:55.320 | is to change your mind openly and bravely.
00:14:58.960 | - You're right and maybe I will just have to
00:15:01.160 | endure some sort of criticism for doing that.
00:15:04.320 | But I think that's very important.
00:15:05.440 | I think this ties back to this previous facet
00:15:08.880 | of our conversation where we were discussing
00:15:11.640 | whether thinkers would win over systems
00:15:15.960 | that are devoted to preventing radical thought,
00:15:19.720 | who will win, the systems or the thinkers.
00:15:25.960 | I think it's crucial that my generation
00:15:29.160 | take up a hand in this fight
00:15:34.240 | and I think it's important that I'm a part of that
00:15:37.340 | because I know that I have some opportunity to.
00:15:41.080 | I think it is my obligation as a member of a generation
00:15:49.520 | whose only real hope is to think outside of a system
00:15:53.300 | because whatever systems exist are collapsing.
00:15:57.120 | I think it is really my obligation
00:15:59.480 | to try to play some role, whatever role I can
00:16:02.800 | and being an instrument in that change.
00:16:06.600 | - Are you, as a young mind, do you have a sense of fear
00:16:11.080 | about just like how afraid were you
00:16:13.680 | to do this podcast conversation?
00:16:15.560 | Do you have a sense of fear of thinking publicly?
00:16:18.540 | - Yeah, I don't even think that that fear is irrational.
00:16:23.280 | It's very difficult to exist publicly in any form now
00:16:27.880 | because it's very easy for anyone to take cheap shots
00:16:33.380 | at something which is difficult
00:16:36.280 | and as I said, the people who are trying
00:16:41.280 | to have the difficult ideas in conversations
00:16:45.200 | are perhaps putting others in actual danger
00:16:49.680 | because everyone is so desperate
00:16:51.520 | that they might be willing to try anything.
00:16:55.400 | So there's a certain amount of responsibility
00:16:59.120 | which one has to take going before the public
00:17:03.440 | and there is a certain amount of ridicule
00:17:07.800 | which will be completely unwarranted
00:17:10.620 | that anyone must endure for it.
00:17:14.280 | And I think that means that one has to be afraid
00:17:20.120 | because they could both ruin the world
00:17:22.040 | and be ruined by the world
00:17:23.760 | in an unwarranted and undeserved fashion.
00:17:28.060 | I would like to believe in myself enough
00:17:34.040 | to try to accept this as a task
00:17:36.000 | because I think people need to try
00:17:39.760 | or there's no getting out of this
00:17:41.040 | and we will end in some kind of crazy, brilliant war.
00:17:44.760 | - Beautifully put.
00:17:45.600 | You've said also that in these times we can't have labels
00:17:48.520 | because it holds us back.
00:17:51.440 | Maybe we've already talked about it a little bit
00:17:53.580 | but this idea of labels is really interesting.
00:17:56.600 | Why do you think labels hold us back?
00:17:59.580 | - Well, I think many underestimate the extent
00:18:04.160 | to which language and communication really impacts
00:18:09.160 | and shapes the ideas and thoughts
00:18:12.640 | which are being communicated.
00:18:14.780 | And I think if we're willing to accept imperfect labels
00:18:19.780 | to categorize particular people or thoughts,
00:18:24.980 | in some sense we are corrupting an abstraction
00:18:29.260 | in order to represent it and communicate about it.
00:18:32.340 | And I think as we've discussed,
00:18:35.260 | those abstractions are particularly important
00:18:38.420 | when everything is on fire.
00:18:43.900 | We should not be sacrificing grand thought
00:18:47.900 | for the ability to express it.
00:18:52.740 | I think everyone should work much harder,
00:18:56.620 | including myself, to really be thinking abstractly
00:18:59.300 | in abstract terms instead of using concrete terms
00:19:03.020 | to discuss abstraction while ruining it slightly.
00:19:07.700 | - Yeah, it's kind of a skill, actually.
00:19:10.620 | So one really difficult example in the recent time
00:19:15.620 | that maybe you can comment on
00:19:19.780 | if you have been thinking about it is just politics.
00:19:23.460 | And there's a lot of labels in politics
00:19:25.620 | that it takes a lot of skill to be able
00:19:29.080 | to communicate difficult ideas
00:19:31.420 | without labels being attached to you.
00:19:34.020 | That's something I've been sort of thinking about a lot
00:19:37.300 | in trying to express, for example,
00:19:40.140 | how much I love various aspects
00:19:42.220 | of the foundational ideas of this country,
00:19:45.260 | like freedom, and just saying I love America,
00:19:49.420 | a simple statement, I love the ideas
00:19:51.820 | that we're finding to America.
00:19:53.260 | Well, often in the current time,
00:19:54.940 | people will try, they'll desperately try
00:19:57.780 | to attach a label to me, for example,
00:20:00.300 | for saying I love America,
00:20:01.500 | that I'm a Republican, a Donald Trump supporter.
00:20:05.500 | And it takes elegance and grace and skill
00:20:08.620 | to avoid those labels so that people can actually listen
00:20:13.620 | to the contents of your words
00:20:15.700 | versus the summarization that results
00:20:20.040 | from just the labels that they can pin on you.
00:20:24.420 | Are you cognizant of the skill required there
00:20:27.140 | of being able to communicate
00:20:28.420 | without being branded a Republican or a Democrat
00:20:31.540 | in this particular set of conversations?
00:20:33.980 | I'm sure there's other dangerous labels
00:20:35.940 | that could be attached.
00:20:38.020 | - I don't think there's any way
00:20:39.220 | of avoiding that right now.
00:20:42.300 | It might not be anyone's best effort to really try.
00:20:47.300 | I think the thing I can say which will most speak to that,
00:20:52.140 | which I truly believe, is that participating
00:20:57.140 | in modern conventional politics
00:21:01.020 | is not being inherently political in a generative sense.
00:21:06.460 | It's this repeated trope where politics now
00:21:11.460 | is not about creating new political ideologies.
00:21:14.900 | It's about defending ideologies which already exist
00:21:17.700 | so that everyone can keep what they have.
00:21:20.580 | And that's where all of the name calling
00:21:23.260 | and the labeling really comes in.
00:21:26.500 | It's an attempt to constrict whatever may be generated
00:21:31.900 | to standard conversations and discussions
00:21:36.900 | so that arguments can be strawmanned and defeated
00:21:42.460 | and people can keep what they have
00:21:43.940 | because everyone's very, very scared.
00:21:47.020 | I want to be very political,
00:21:51.660 | but not in a standard political sense
00:21:54.680 | where I'm defending a particular party
00:21:57.520 | or place on a spectrum.
00:22:01.220 | I would like to play some role in inventing new spectrums,
00:22:04.740 | and I think that's most important politically
00:22:07.280 | because above most else, politics is about real power
00:22:12.280 | and conventional politicians have real power.
00:22:17.140 | And that power will find terrible outlets
00:22:21.340 | if new spectrums for that power to live are not invented.
00:22:25.220 | - So you're not afraid of politics, political discourse,
00:22:29.860 | at the deepest, richest level
00:22:33.380 | of what political discourse is supposed to mean?
00:22:35.100 | - Actually, I'm very afraid of it,
00:22:36.920 | but once again, we have no--
00:22:39.140 | - That's not paralyzing for you,
00:22:40.660 | that you feel like it's a responsibility,
00:22:42.340 | you're ready to take it on.
00:22:43.620 | - Yes.
00:22:44.820 | - This is a good sign.
00:22:46.360 | This is, you're a special human.
00:22:48.580 | Okay, let's talk maybe fun, maybe profound.
00:22:51.720 | We talked about philosophers, philosophy.
00:22:55.420 | Who's your favorite philosopher?
00:22:57.540 | Like somebody in your current time,
00:22:59.940 | who's either influential or you just enjoy
00:23:03.060 | his/her ideas or writing or anything like that.
00:23:08.920 | - Weirdly, I'll give an answer which sort of doesn't have
00:23:13.920 | much to do with whom I might imagine myself to be.
00:23:18.600 | I like Thomas Aquinas at the moment.
00:23:21.700 | I think he's very inspirational to me
00:23:24.300 | given what we're going through.
00:23:26.340 | And that's not because his particular ideas of religion
00:23:31.340 | or God or unmoved movers are particularly inspirational
00:23:37.700 | to me and I don't even think they were necessarily right.
00:23:44.120 | But he was introducing aspects of the scientific method
00:23:50.340 | during one of the darkest periods in human history
00:23:55.480 | when we had lost all hope and reason
00:23:57.980 | and ability to think logically.
00:24:01.300 | So I think he was really something of a light in the dark
00:24:06.140 | and I think we need to look to people like that
00:24:09.420 | at the moment.
00:24:10.580 | The other reason why I think I need to learn from him
00:24:15.580 | is that even though he was doing something
00:24:18.700 | which really needed to be done
00:24:20.700 | and introducing scientific thought and reason
00:24:25.700 | to a time that lacked it,
00:24:29.100 | he was not saying anything that would have been offensive
00:24:35.040 | to whatever powers were in play during his time.
00:24:39.160 | He was writing about the importance of faith in God
00:24:43.600 | and how we could prove it.
00:24:45.640 | And so it's important to remember, I suppose,
00:24:50.020 | that having ideas that shape the world
00:24:55.020 | and which bring the world closer to what we can prove
00:24:58.760 | it's supposed to be and how it's supposed to work
00:25:01.120 | does not always take some sort of grand contradiction
00:25:04.160 | of whatever's in play.
00:25:05.540 | And the most courageous thing to do
00:25:09.640 | may not always be the most helpful thing to do.
00:25:13.400 | And I think it's very easy for anyone
00:25:17.440 | with ideas about how everything is broken
00:25:21.660 | to become very cynical and say,
00:25:23.480 | "Oh, the system, man, they're all wrong."
00:25:26.000 | I think it takes another kind of discipline
00:25:32.600 | to be a person with real ideas
00:25:35.320 | and to make the world better
00:25:38.380 | without stepping on anyone's toes or contradicting anyone.
00:25:41.960 | I have real respect for that.
00:25:43.140 | - So being able to be,
00:25:44.500 | when it's within your principles to operate,
00:25:46.340 | within the current system of thought.
00:25:48.620 | - Yeah, and not offend anyone,
00:25:51.300 | not say anything outlandish,
00:25:54.380 | but introduce the method by which progress must be achieved.
00:25:59.380 | I think that takes a kind of maturity,
00:26:03.580 | which is found very rarely now.
00:26:07.540 | And I really look to him for inspiration
00:26:10.320 | despite whatever disagreements I may have
00:26:13.140 | with the minute details of his philosophy.
00:26:16.140 | - Yeah, it takes a lot of skill,
00:26:18.300 | a lot of character and yeah, deep thinking
00:26:22.020 | to be able to operate within the system when needed
00:26:25.180 | and having the fortitude and just the boldness
00:26:29.820 | to step outside and to burn the system down when needed,
00:26:33.300 | but rarely, and opportune moments
00:26:36.420 | that would actually have impact.
00:26:38.080 | I mean, it's ultimately about impact
00:26:40.540 | within the society that you live in,
00:26:42.500 | not just making a statement that has no impact.
00:26:46.500 | - Yeah, and we were talking about how dangerous it is
00:26:50.100 | to do real philosophy at dangerous, broken times.
00:26:55.100 | He was going through the most broken time in history
00:26:59.020 | and he questioned the methods
00:27:04.020 | which made a broken system able to survive.
00:27:10.140 | And he was so skilled and so graceful
00:27:12.140 | that he became a saint in that tradition.
00:27:15.700 | And there's something for me to really learn from there.
00:27:19.100 | - Do you draw any inspiration, have any interest
00:27:21.260 | in the sort of more modern philosophers,
00:27:23.340 | maybe the existentialists?
00:27:25.340 | I mean, Nietzsche is one of the early ones.
00:27:27.660 | Do you have thoughts on the guy in general
00:27:31.660 | or any of the other existentialists?
00:27:34.060 | - Well, with regard to Nietzsche,
00:27:36.500 | I think Yeats might've said that he's the worst.
00:27:40.340 | You know, he was certainly filled with passionate intensity.
00:27:44.060 | I think-- - Was that a compliment?
00:27:47.720 | He was the worst or a criticism?
00:27:51.980 | - Yeats had this big line,
00:27:53.620 | that the best lack all conviction,
00:27:55.500 | the worst are filled with passionate intensity.
00:27:58.680 | So I think Nietzsche was destroyed
00:28:07.940 | by the horrors of everything that went on around him.
00:28:12.940 | I think he never really recovered from it.
00:28:17.740 | I think that's because if you think about Nietzsche's
00:28:22.740 | philosophy, he was very opposed to any sort of acceptance
00:28:28.680 | of what one had.
00:28:29.840 | One should always envy those who have more
00:28:32.260 | and use that envy to fuel their growth
00:28:36.660 | and accept whatever the human condition and desires are
00:28:41.660 | and use those desires to want more and more
00:28:46.700 | and make use of your greed.
00:28:49.900 | I think it's very difficult to be truly happy
00:28:54.900 | if the thing which you pride yourself most on
00:29:04.700 | is never being satisfied.
00:29:08.260 | And I think Nietzsche was never satisfied
00:29:09.980 | and that was the danger of his philosophy.
00:29:12.920 | I think also with his amoralism,
00:29:15.780 | you know, there is no good or evil.
00:29:18.700 | I sort of disagree with that
00:29:20.460 | on a pretty fundamental basis.
00:29:23.340 | I think that our notion of morality
00:29:28.340 | is by no means subjective.
00:29:30.480 | It's really the proxy for the fitness of a society.
00:29:35.480 | I think whatever we consider ethical,
00:29:39.780 | like don't steal, don't murder, don't do this,
00:29:42.500 | societies have a very difficult time running.
00:29:47.940 | It's very hard to run a civilization
00:29:49.520 | when everyone is stealing from everyone else
00:29:52.360 | and people are murdering each other
00:29:54.660 | and committing these things
00:29:57.260 | which we would consider atrocities.
00:30:01.080 | So I think we also, we know this
00:30:04.280 | because I think very similar notions of morality
00:30:07.820 | have evolved convergently from different traditions.
00:30:12.820 | I think good is a proxy for a civilization's fitness
00:30:18.400 | and the good news is that that means
00:30:26.000 | that evil in being anathema to that good
00:30:30.640 | must therefore be the opposite of stable
00:30:36.520 | in whatever way that it's evil
00:30:39.080 | and that means that good will always be more stable
00:30:41.940 | than evil and the only way evil can really win
00:30:45.640 | is like if everyone dies.
00:30:47.280 | - So wait, can you say that again?
00:30:51.380 | Good is a proxy for society's what?
00:30:54.240 | - Good is a proxy for the stability
00:30:56.600 | and fitness of a civilization.
00:30:58.440 | - I believe, damn, that's a good definition.
00:31:01.000 | - Thank you.
00:31:01.840 | - So you're throwing some bombs today.
00:31:02.920 | Okay, all right.
00:31:03.760 | (laughing)
00:31:06.720 | Okay, this is exciting.
00:31:08.880 | Sorry, sorry to interrupt your flow there
00:31:11.240 | but it's just a damn good line.
00:31:12.840 | - Thank you.
00:31:13.680 | - So in that sense, that's a kind of optimistic view
00:31:18.320 | that if by definition good is a proxy for stability
00:31:21.520 | then it's going to be stable
00:31:22.960 | unless the entire world just blows itself up.
00:31:25.920 | So good wins in the end by definition.
00:31:28.400 | - Yeah.
00:31:29.520 | - Or no, well, good wins unless it all goes
00:31:34.520 | to complete destruction.
00:31:39.120 | That's beautifully put.
00:31:40.320 | - Thank you.
00:31:41.160 | - On the topic of sort of good and evil being human illusions
00:31:46.160 | you've said that more broadly than that
00:31:52.760 | about truth that it is easier in some ways
00:31:55.360 | to be unified under truth because it is universal
00:31:58.600 | than it is to be unified under belief
00:32:00.560 | which at times can be completely subjective.
00:32:02.960 | So what is the nature of truth to you?
00:32:06.560 | Can we understand the world objectively
00:32:09.800 | or is most of what we can understand about the world
00:32:12.960 | is just subjective opinions that we kind of all agree on
00:32:17.960 | in these little collectives
00:32:20.320 | and over time it kind of evolves,
00:32:22.840 | completely detached from objective reality?
00:32:26.560 | - I think this is the greatest argument for objectivity
00:32:31.120 | is that something that is objectively true
00:32:35.520 | cannot be true to me and untrue to you.
00:32:40.240 | You can feel that it's untrue
00:32:42.440 | but that would be unproductive
00:32:45.560 | and create unnecessary tension and conflict.
00:32:50.560 | I think this is one reason for the importance of science
00:32:55.560 | as a tool for stability.
00:32:58.800 | If science is the search for truth
00:33:03.440 | and truth can never really be, I shouldn't say that,
00:33:08.760 | truth should never be an engine of conflict
00:33:12.840 | because no two people should disagree on something
00:33:15.800 | which is objectively true
00:33:18.480 | then in some sense search for truth
00:33:21.480 | is searching for a common ground
00:33:24.560 | where we can all exist and live without contradicting
00:33:29.560 | or attacking each other.
00:33:32.400 | - Do you have a hope that there is a lot of common ground
00:33:34.680 | to be discovered?
00:33:36.360 | - Sure, I mean if we continue scientifically
00:33:41.080 | we are discovering truth and in that discovering
00:33:44.320 | common ground on which we can all agree.
00:33:46.560 | That's one reason why I think caring about science,
00:33:51.240 | if you have a culture which cares very deeply about science
00:33:54.480 | that's a culture which is not necessarily bound
00:33:59.160 | to endure unwarranted internal conflict.
00:34:04.120 | I think that's one reason that I'm so passionate
00:34:06.000 | about science is its search for universal ground.
00:34:09.360 | - Let me just throw out an example
00:34:11.600 | of a modern day philosophical thinker.
00:34:14.440 | We'll keep your dad, Eric Weinstein,
00:34:16.640 | out of the picture for a sec.
00:34:18.640 | But he does happen to be an example of one.
00:34:21.080 | But Jordan Peterson is an example of another,
00:34:23.600 | somebody who thinks deeply about this world.
00:34:25.800 | His ideas, by a certain percent of the population,
00:34:30.400 | sort of speaking of truth, are labeled as dangerous.
00:34:33.640 | Why do you think his ideas or just ideas
00:34:37.700 | of these kinds of deep thinkers in general
00:34:40.800 | are labeled as dangerous in our modern world?
00:34:43.960 | Is it similar to what you've been discussing
00:34:46.360 | that in difficult times philosophers become dangerous?
00:34:50.720 | Or is there something specific
00:34:52.360 | about these particular thinkers in our time?
00:34:54.980 | - Well, I think Jordan Peterson is very anti-establishment
00:34:58.920 | in a lot of his beliefs.
00:35:01.360 | He's an unconventional thinker and I think we need,
00:35:07.360 | regardless of whatever Jordan's particular views
00:35:12.360 | and beliefs are and if they bring about more danger
00:35:16.720 | than truth or if they don't,
00:35:21.440 | it's very important to have fundamental thinkers
00:35:25.240 | who exist outside of a conventional framework.
00:35:29.400 | So do I think that he's dangerous?
00:35:33.920 | I think by existing outside of a system which is known,
00:35:38.920 | he is dangerous and I think we have to,
00:35:45.920 | in some sense we have to welcome danger in that capacity
00:35:50.320 | because it will be our only way out of this.
00:35:53.880 | So I'm, regardless of whether his beliefs
00:35:59.280 | are right or wrong, I'm pretty adamant about the fact
00:36:04.280 | that we need to support thought which may rescue us.
00:36:11.200 | - And that thought can appear radical or dangerous at times
00:36:16.440 | but ultimately if you allow for it,
00:36:19.320 | this is kind of the difficult discussion of free speech
00:36:22.320 | and so on, is ultimately difficult ideas will pave
00:36:27.320 | the way for progress.
00:36:30.760 | - Yeah and I'd actually, I'd like to slow you down there
00:36:34.320 | because I think like one of the issues we were discussing
00:36:38.840 | previously was the fact that language often destroys
00:36:43.840 | our ability to think.
00:36:46.480 | When we're talking about whether his ideas are radical,
00:36:52.120 | I don't know if we mean radical in the traditional sense
00:36:55.980 | of having to do with the root of a problem
00:37:00.120 | or in the more modern sense of being very extreme.
00:37:05.120 | And I think that's completely by design,
00:37:09.800 | I think fundamental thought which semantically
00:37:14.800 | would once be considered radical thought
00:37:18.900 | became very dangerous and now it's become synonymous
00:37:22.400 | with extreme or dangerous thought
00:37:24.480 | which means that anyone who considers themselves
00:37:27.100 | a radical thinker is semantically also a dangerous
00:37:32.100 | or extreme thinker.
00:37:36.440 | - These are not helpful labels in a sense
00:37:38.640 | that the moment you say radical or extremist thinker
00:37:42.960 | then you're just, well how do I put it,
00:37:46.800 | you're not helping the public discourse exchange of ideas.
00:37:52.680 | - But through no fault of our own,
00:37:54.840 | the concept of radical as having to do with a root
00:37:58.160 | is it's an obvious concept for which there must be language
00:38:03.160 | and a lot of the attack on thought has to do
00:38:08.560 | with attacking language which communicates conceptually.
00:38:13.160 | So like this is an example of how our world
00:38:17.980 | is becoming increasingly Orwellian,
00:38:20.680 | it's just language is being used to destroy
00:38:23.160 | our ability to think.
00:38:25.360 | I think, I can't remember exactly what the numbers are
00:38:27.840 | but I read some statistic about how greatly
00:38:30.660 | the average English vocabulary has decreased since 1960.
00:38:35.660 | It was like some incredible number, it really baffled me.
00:38:39.480 | It's like how are people less able to think in a time
00:38:43.960 | when the world is supposed to be growing
00:38:46.040 | at a never before seen rate.
00:38:49.960 | We can't sustain this growth
00:38:54.000 | if we destroy everyone's ability to think
00:38:58.040 | because growth requires thinking
00:39:00.560 | and we're ruining the tools for it.
00:39:05.040 | I watched your podcast with Noam Chomsky
00:39:09.080 | and I think one interesting thing which he discussed
00:39:13.680 | was how language is more used to develop thoughts
00:39:18.080 | within our own head than it is used
00:39:20.640 | to communicate those thoughts with others.
00:39:23.720 | If the language doesn't change, even if its usage changes,
00:39:27.760 | then when language is destroyed in communication
00:39:32.360 | it also stymies our ability to think reasonably
00:39:37.360 | and I'm very, very worried.
00:39:40.720 | - But the language in communication requires a medium
00:39:47.000 | and there's a lot of different mediums.
00:39:48.280 | So there's social media, there's Twitter,
00:39:51.440 | there's writing books, there's blog posts,
00:39:54.880 | there's podcasts, there's YouTube videos,
00:39:59.880 | all of which you have dipped a toe in
00:40:04.800 | in your exploration of different mediums of communication.
00:40:08.320 | Which do you see yourself, this might be just a poetic way
00:40:12.700 | of asking are you gonna do a podcast,
00:40:14.200 | but broader picture, what do you think
00:40:18.160 | as an intellectual in this world, for you personally,
00:40:22.880 | would be the path for communicating your ideas to the world?
00:40:26.600 | What are the mediums you are currently drawn to?
00:40:30.060 | Out of the ones I mentioned, maybe something I didn't.
00:40:33.400 | - To answer your question concretely before abstractly,
00:40:38.400 | I'm scared but I need to do a podcast.
00:40:42.360 | It's important, it is my obligation
00:40:45.920 | as a member of my generation.
00:40:47.760 | I really hope that more people my age start to do this
00:40:51.180 | because we will be the people in charge of new ideas
00:40:56.180 | which either sink or swim.
00:40:59.160 | - How upset would your dad be when your podcast
00:41:02.240 | quickly becomes more popular than his?
00:41:04.720 | - I think he would be negatively upset.
00:41:07.240 | - I'll say you'd be proud, he's a good dad.
00:41:09.680 | - I really think so, yeah.
00:41:11.960 | - Sorry to interrupt.
00:41:13.680 | Yeah, so but then zooming out, do you think podcasts,
00:41:17.040 | are you excited by the possibility of other mediums
00:41:21.000 | outside of podcasting to communicate ideas?
00:41:24.120 | - I would be if people still read books
00:41:27.400 | or did things like that.
00:41:28.880 | I'm somewhat guilty of this.
00:41:33.480 | A lot of the books I read are very technical
00:41:39.000 | and then to absorb really deep, modern conversations,
00:41:44.000 | I listen to podcasts and I don't really read many books
00:41:49.840 | on the matters that we're discussing, for example.
00:41:55.480 | - It's fascinating because you're making me think
00:41:57.480 | of something that I align with you very much
00:42:00.840 | of how I consume deep thinkers currently.
00:42:03.520 | So what happens is somebody who thinks deeply
00:42:05.520 | about the world will write a book
00:42:08.720 | now, Jordan Peterson for example,
00:42:10.360 | instead of reading their book,
00:42:12.040 | I'll just listen to podcast conversations
00:42:14.360 | of them talking about the book, which I find to,
00:42:17.800 | this is really sad, but I find that to be
00:42:22.280 | a more compelling way to think about their ideas
00:42:25.720 | because they're often challenged in certain ways
00:42:28.840 | in those conversations and they're forced to,
00:42:31.840 | after having boiled them down and really thought
00:42:34.240 | them through enough to write a book.
00:42:36.240 | So it's almost like they needed to go through the process
00:42:38.520 | of writing a book just so they can think through,
00:42:41.320 | convert the language in their minds
00:42:43.080 | into something more concrete
00:42:44.480 | and then the actual exchange of ideas,
00:42:47.640 | the actual communication of ideas with the public happens,
00:42:50.000 | not with the book, but after the book
00:42:53.000 | with that person going on a book tour
00:42:55.280 | and communicating the ideas.
00:42:58.160 | - Well, there are two meanings I make
00:43:00.920 | of why not too many people spend much
00:43:03.920 | of their time reading anymore.
00:43:05.960 | One interpretation is that we've lost our attention spans
00:43:10.440 | to our phones, people can't concentrate on a page
00:43:13.600 | if it takes them a minute to read,
00:43:15.360 | we're too busy watching TikToks or whatever people do.
00:43:18.960 | The other interpretation would be that language
00:43:24.320 | and verbal communication has, as well as some amount
00:43:29.320 | of communication, which is done through facial expression,
00:43:33.880 | tone of voice, et cetera, these are means of communication
00:43:38.000 | that have evolved along with humanity
00:43:42.160 | over thousands and thousands of years.
00:43:44.560 | So we know that we are built to communicate in this way.
00:43:49.560 | We have had writing for much less time.
00:43:56.920 | It is a system that we invented, not a system which evolved
00:44:02.480 | and is innately part of humanity or the human mind.
00:44:07.480 | And so we are designed to consume conversation
00:44:16.560 | by our own evolution.
00:44:18.600 | We are designed to consume writing by some process
00:44:23.600 | of symbols that's evolved over a couple thousand years.
00:44:31.800 | It makes sense to me why many are much more compelled
00:44:35.840 | to listen to podcasts, for example,
00:44:38.600 | than they are to read books.
00:44:41.280 | It could be that this is simply a technological progression
00:44:46.280 | which has displaced reading conventionally
00:44:52.640 | instead of some sort of maladaptation of our minds,
00:44:59.120 | which has corrupted our attention spans.
00:45:02.880 | Likely there's some combination which determines
00:45:06.880 | why people spend much less time reading.
00:45:09.120 | But I don't think it's necessarily
00:45:11.120 | because we're all broken.
00:45:12.360 | It may simply have to do with the fact
00:45:14.320 | that we are designed to listen through our ears
00:45:18.280 | and speak through our mouths,
00:45:19.440 | and we are not innately designed
00:45:22.960 | to communicate over a page.
00:45:25.720 | - Yeah, there's an exciting coupling to me
00:45:28.260 | between like few second TikTok videos
00:45:32.160 | that are fun and addicting,
00:45:33.760 | and then the three, four hour podcasts,
00:45:36.360 | which are both really popular in our current time.
00:45:40.500 | So people are both hungry for the visual stimulation
00:45:44.040 | of internet humor and memes, a huge fan of,
00:45:48.200 | and also slow moving deep conversations.
00:45:53.120 | And that might, you know, there's a lot of,
00:45:56.360 | I mean, it's part of your generation to define
00:45:58.460 | what that looks like moving forward.
00:45:59.940 | Where a lot of people, like Joe Rogan's one of the people
00:46:02.860 | that kind of started, accidentally stumbled
00:46:06.940 | into the discovery that this is like a thing.
00:46:10.300 | And now people are kind of scrambling to figure out
00:46:13.900 | why is this a thing?
00:46:15.420 | Like why is there so much hunger
00:46:16.820 | for long form conversations?
00:46:18.760 | And how do we optimize that medium
00:46:21.140 | for further, further expression of deep ideas
00:46:23.540 | and all that kind of stuff?
00:46:24.700 | And YouTube is a really interesting medium for that as well.
00:46:28.900 | Like video, sharing of videos.
00:46:31.820 | Mostly YouTube is used with a spirit of like
00:46:35.260 | the TikTok spirit, if I can put it in that way,
00:46:37.460 | which is like, how do I have quick moving things
00:46:41.100 | that even if you're expressing difficult ideas,
00:46:43.100 | they should be quick and exciting and visual and switching.
00:46:46.340 | But there's a lot of exploration there to see,
00:46:48.740 | what, can we do something deeper?
00:46:50.660 | And nobody knows.
00:46:52.420 | And you're part of the, you have a YouTube channel
00:46:56.660 | releasing one video every few years.
00:46:58.980 | So your momentum is currently quite slow,
00:47:03.500 | but perhaps it'll accelerate.
00:47:05.100 | You're one of the people that gets to define that medium.
00:47:08.860 | Is that, do you enjoy that, the visual YouTube
00:47:12.340 | medium of communication as well?
00:47:14.420 | - I know that when the topic of conversation
00:47:21.500 | or the means by which a conversation is communicated
00:47:26.500 | or an idea is communicated,
00:47:28.740 | if that is sufficiently interesting to me,
00:47:32.260 | I will read a book on it.
00:47:35.780 | I would listen to a podcast on it.
00:47:37.380 | I would watch a video on it.
00:47:40.740 | I think if I'm very curious about something,
00:47:42.940 | I will consume it however possible.
00:47:46.380 | I think when I have to consume things
00:47:48.620 | which really don't interest me very much,
00:47:51.300 | I'm indeed much more ready to consume them
00:47:55.740 | through some sort of video or discussion
00:47:57.860 | than I am through a long, tedious book.
00:48:01.820 | - So for the breadth of acquiring knowledge,
00:48:05.940 | video is good.
00:48:06.940 | For the depth, the medium doesn't matter.
00:48:09.580 | I think it'd be fun to ask you about some
00:48:13.580 | big philosophical questions to see
00:48:16.540 | if you have an opinion on them.
00:48:18.980 | Do you think there's a free will
00:48:21.420 | or is free will just an illusion?
00:48:24.140 | - Well, I think classical mechanics would tell us
00:48:28.660 | that if we were to know every piece of information
00:48:33.660 | about a system and understand the rules
00:48:36.220 | which govern that system,
00:48:38.180 | we would be completely able to predict the future
00:48:42.380 | with complete accuracy.
00:48:44.260 | So if something could know everything about our lives,
00:48:49.260 | it could freeze time and understand the position
00:48:53.380 | of every neuron in my mind about to fire,
00:48:56.780 | no decision could be unpredictable.
00:49:02.700 | In some sense, there is that sort of fate.
00:49:07.940 | I think that doesn't make the decisions we make
00:49:11.140 | illegitimate even if some grand supercomputer
00:49:14.460 | could understand what decisions we would make
00:49:19.100 | beforehand with complete certainty.
00:49:21.580 | I think we're making legitimate decisions
00:49:24.340 | within a system that has no freedom.
00:49:26.180 | - We're making legitimate decisions
00:49:29.180 | within a system that has no freedom.
00:49:30.780 | Can you explain what you mean by that?
00:49:33.620 | - Yeah, so if we were to have just a simple pendulum
00:49:40.140 | and I told you how long the rope was,
00:49:44.780 | we froze it at a particular point
00:49:47.820 | and I told you how high above the ground the weight was
00:49:52.460 | and the motion of a pendulum is something
00:49:57.060 | which is easy for everyone to imagine.
00:49:59.700 | If we had all of that information,
00:50:04.900 | you could ask me, what will the pendulum do
00:50:08.540 | six and a half minutes from now?
00:50:10.180 | And we would have a precise answer.
00:50:12.860 | That's an example of a very simple system
00:50:16.100 | with a very simple Lagrangian.
00:50:17.980 | And we could completely predict the future.
00:50:23.180 | The pendulum has no ability to do anything
00:50:26.740 | that would surprise us.
00:50:28.500 | Weirdly, that's true of whatever this four-dimensional,
00:50:35.020 | crazy world we live in looks like.
00:50:39.460 | If we were to understand where every piece of this system
00:50:43.300 | was at any given time and we understand the laws of motion,
00:50:47.300 | how everything worked, if we could compute
00:50:50.660 | all of that information somehow,
00:50:52.420 | which we will never be able to do,
00:50:54.220 | every decision you will ever make
00:51:00.500 | could be predicted by that computer.
00:51:03.020 | That doesn't mean that your decisions are illegitimate.
00:51:05.580 | You are really making those decisions
00:51:08.060 | but with a completely predictable outcome.
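The pendulum prediction described here can be sketched in a few lines of Python — an illustrative aside with made-up numbers, assuming the small-angle approximation: freeze the present state (angle, angular velocity) plus the system's parameters, and the laws of motion fix the angle at any future time, six and a half minutes included.

```python
import math

def pendulum_angle(theta0, omega0, length, t, g=9.81):
    """Small-angle pendulum: the angle at time t is fully determined
    by the frozen present state (theta0, omega0) and the parameters."""
    w = math.sqrt(g / length)                  # natural frequency
    amp = math.sqrt(theta0**2 + (omega0 / w)**2)
    phase = math.atan2(-omega0 / w, theta0)
    return amp * math.cos(w * t + phase)

# Freeze the system now (angle in radians, angular velocity in rad/s)...
now = (0.1, 0.0)
# ...and ask what the pendulum does six and a half minutes from now.
future = pendulum_angle(*now, length=2.0, t=390.0)
# Rerun the "prediction" from the same frozen state: identical answer.
assert future == pendulum_angle(*now, length=2.0, t=390.0)
```

Given the same state and the same laws, the prediction never varies — which is the sense in which the pendulum, like the larger system, "has no ability to do anything that would surprise us."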
00:51:11.020 | - So I'm just sort of a little bit high at the moment
00:51:14.580 | on the poetry of a system within a system
00:51:19.420 | that has no freedom.
00:51:21.280 | So the human experience is the system we've created
00:51:25.660 | within the system that has no freedom.
00:51:27.260 | But that system that we've created has a feeling of freedom.
00:51:33.260 | That to us, ants feels as much more real
00:51:38.260 | than the physics as we understand it
00:51:45.780 | of the underlying base system.
00:51:48.940 | So it's almost like not important
00:51:51.500 | what the physics of the base system is.
00:51:53.800 | That for what we've created,
00:51:56.100 | the nature of the human experience is there is a free will.
00:52:01.580 | (laughing)
00:52:02.540 | - Or there is something that feels close enough
00:52:05.420 | to a free will that it may not be worth
00:52:08.740 | spending too much time on the fact
00:52:15.340 | that it's something of an illusion.
00:52:16.980 | We will never build a computer that knows everything
00:52:19.300 | about every piece of the universe at a given time.
00:52:23.340 | And so for all intents and purposes,
00:52:25.540 | our decisions are up to us.
00:52:28.700 | We just happen to know that their outcomes
00:52:30.900 | could be predicted with enough information.
00:52:33.420 | - So speaking of supercomputers,
00:52:36.380 | they can predict every single thing
00:52:37.780 | about what's going to ever happen.
00:52:40.620 | What do you think about the philosophical thought experiment
00:52:46.420 | of us living in a simulation?
00:52:48.940 | Do you often find yourself pondering
00:52:52.140 | of us living in a simulation?
00:52:54.140 | Of this question, do you think it is
00:52:55.620 | at all a useful thought experiment?
00:52:58.020 | - I think it's very easy to become fascinated
00:53:02.060 | with all of these possibilities.
00:53:03.940 | And they're completely legitimate possibilities.
00:53:07.480 | Is there some validity to solipsism?
00:53:15.180 | Well, it can never be falsified or disproven.
00:53:18.500 | So, I mean, sure you could be a figment of my imagination.
00:53:24.900 | It doesn't mean that I will act
00:53:27.020 | according to this possibility.
00:53:28.980 | I'm not gonna call you mean names.
00:53:30.940 | - Just to test the system,
00:53:34.280 | to see how robust it is to distortions?
00:53:37.300 | - Yeah, so, I mean, all of these existential
00:53:40.700 | thought experiments are completely possible.
00:53:42.700 | We could be brains in jars.
00:53:44.600 | It doesn't mean that our experience
00:53:48.500 | will feel any less valid.
00:53:51.580 | And so it doesn't make a difference to me
00:53:53.860 | if you are some number of ones and zeros
00:53:58.860 | or you are a figment of my imagination
00:54:02.540 | which lives in a stored away brain.
00:54:05.880 | It will never really change my experience
00:54:10.580 | knowing that that's a possibility.
00:54:12.580 | And so I try to avoid making decisions
00:54:17.580 | based on such contemplations.
00:54:21.620 | You know, if we take this previous issue of free will,
00:54:25.380 | I could decide that because I have no choice in my life,
00:54:32.220 | if I lie around in bed all day and eat chips,
00:54:39.260 | I was destined to do that thing.
00:54:41.180 | And if I make that decision
00:54:42.220 | that I was destined to do that thing,
00:54:44.140 | it would be a really poor decision for me to make.
00:54:47.060 | I have school and a dozen commitments.
00:54:51.220 | - There's somebody listening to this right now,
00:54:53.660 | probably hundreds of people sitting down eating chips
00:54:57.700 | and feeling terrible about themselves.
00:54:59.060 | So how dare you, sir?
00:55:00.580 | - If they're listening to this,
00:55:02.380 | they're clearly curious about possibilities of thought.
00:55:07.380 | It's not the bed and the chips that makes the man.
00:55:12.180 | - It's not the bed or the chips that makes the man.
00:55:15.220 | Yet another quotable from Zev Weinstein.
00:55:18.020 | Okay, but you don't think of it
00:55:21.500 | as a useful thought experiment
00:55:22.980 | from an engineering perspective of virtual reality,
00:55:27.980 | of thinking how we can create
00:55:29.860 | further and further immersive worlds.
00:55:31.940 | Like, would it be possible to create worlds
00:55:34.180 | that are so immersive that we would rather live
00:55:37.060 | in that world versus the real world?
00:55:39.780 | I mean, that's another possible trajectory
00:55:42.060 | of the world that you're growing up in
00:55:44.580 | is we're more and more immersing ourselves
00:55:49.060 | into the digital world.
00:55:50.460 | For now, it's screens and looking at the screens
00:55:52.860 | and socializing with the screens,
00:55:54.660 | but it's possible to potentially create a world
00:55:56.900 | that's also visually for all of our human senses
00:56:00.740 | as immersive as the physical world.
00:56:03.700 | And then, to me, it's an engineering question
00:56:07.500 | of how difficult is it to create a world
00:56:10.020 | that's as immersive and more fun
00:56:13.140 | than the world we're currently living in.
00:56:16.100 | - It's a terrifying concept, and I hate to say it.
00:56:18.300 | We might live happier lives in a virtual reality headset
00:56:21.860 | 30 years from now than we are currently living.
00:56:25.700 | - This future, the digital future worries you.
00:56:28.980 | - It worries me.
00:56:31.180 | On the other hand, it may be a better alternative
00:56:37.620 | to fighting for whatever people are clinging onto
00:56:43.540 | in our non-virtual world, or at least the world
00:56:47.780 | that we don't yet know is virtual.
00:56:50.060 | - So embrace the future.
00:56:51.980 | We've been talking a lot about thinkers.
00:56:55.280 | Now, in the broad definition of philosophy,
00:57:00.340 | you kind of included innovators of all form.
00:57:03.940 | Do you find it useful to draw a distinction
00:57:06.500 | between thinkers and doers?
00:57:08.440 | - I think that the most important gift
00:57:12.700 | we've ever been given is our ability
00:57:16.140 | to observe the universe and think deductively
00:57:20.560 | about whatever principles transcend humanity.
00:57:24.240 | Because as we discussed, that's the closest thing
00:57:29.200 | we will ever have to a universal experience
00:57:32.680 | is understanding things, which must be true everywhere.
00:57:37.280 | In order for that, so I think if we're deciding
00:57:42.280 | that life is meaningful and the human experience
00:57:46.480 | is meaningful, you could make a very convincing argument
00:57:50.560 | that its greatest meaning will be understanding
00:57:54.480 | whatever transcends it.
00:57:56.000 | I think that's only sustainable if people are happy
00:58:05.880 | and well-fed and things of market value are invented.
00:58:10.880 | And so I think we really need both to live meaningful
00:58:19.560 | and successful and possible lives.
00:58:24.280 | In terms of who my greatest heroes are,
00:58:29.200 | I can't decide between figures like Einstein
00:58:34.480 | and Newton and Feynman and on the other hand,
00:58:38.120 | figures like Kary Mullis, for example.
00:58:42.920 | I think people like Einstein make our lives meaningful
00:58:48.720 | and people like Kary Mullis, who's probably responsible
00:58:53.480 | for saving hundreds of millions of lives,
00:58:56.160 | make our lives possible and good.
00:59:02.640 | So in terms of where I would like to find myself
00:59:07.160 | with these two different notions of achievement,
00:59:14.920 | I don't know what I would more like to achieve.
00:59:18.840 | I have an inclination that it will be something scientific
00:59:22.280 | because I would like to bring meaning to humanity
00:59:25.160 | instead of sustenance, but I think both are very important.
00:59:31.360 | We can't sustain our lives
00:59:33.520 | if we don't keep growing technologically.
00:59:36.200 | I think people like you are making that possible
00:59:38.680 | with computing because that's one of the few things
00:59:42.240 | that's really moving forward in a clear sense.
00:59:46.480 | I think about this a great deal.
00:59:53.440 | So I think both are very important.
00:59:56.400 | - So one example that's modern day,
00:59:59.600 | inspiring figure on the latter part,
01:00:01.640 | in the engineering part on the sustenance is Elon Musk.
01:00:05.960 | Is that somebody you draw inspiration from?
01:00:09.440 | What are your thoughts in general
01:00:11.080 | about the kind of unique speck of human
01:00:16.080 | that's creating so much inspiring innovation
01:00:23.080 | in this world so boldly?
01:00:25.760 | - I know that we will not survive without people like that.
01:00:29.560 | Elon is a ridiculous and sensational example
01:00:36.960 | of one of these figures.
01:00:38.780 | I don't know if he's the best example or the worst example,
01:00:44.640 | but he is of his own kind.
01:00:48.240 | He is radically individualistic,
01:00:50.920 | and those are the people who will allow us
01:00:54.480 | to continue as humans.
01:00:58.000 | I'm very happy that we have people like that in this world.
01:01:01.040 | - You said this thing about if we are to say
01:01:05.480 | that life has meaning or life is meaningful,
01:01:10.480 | then you could argue that it is a worthy pursuit
01:01:15.240 | to transcend life.
01:01:18.960 | Do you see that, another just,
01:01:22.080 | I'm gonna have to go back and sleep on that one.
01:01:26.300 | Do you draw some, speaking of Elon,
01:01:32.240 | some inspiration of us transcending Earth,
01:01:37.240 | of us moving outside of this particular planet
01:01:45.880 | that we've called home for a long time
01:01:48.220 | and colonizing other planets,
01:01:50.400 | and perhaps one day expanding outside the solar system
01:01:53.880 | and expanding, colonizing our galaxy and beyond?
01:01:58.640 | - Honestly, I know very little about space exploration.
01:02:01.800 | I think it makes complete sense to me
01:02:05.760 | why we are starting to think very seriously about it.
01:02:09.820 | It's an amazing and baffling and innovative solution
01:02:15.340 | to a lot of problems we see as a world population.
01:02:20.340 | I can't really offer very much of interest on the topic.
01:02:26.160 | I think when I'm talking about transcending humanity
01:02:31.920 | and transcending Earth,
01:02:33.400 | I'm talking usually about deriving truth,
01:02:39.800 | and that's one of the things
01:02:41.960 | that makes theoretical math and physics
01:02:45.180 | so interesting.
01:02:46.580 | It's like I really, really love biology, for example,
01:02:51.140 | but biology is a combination of whatever principles
01:02:56.140 | ensure evolution and whatever weird coincidences
01:03:01.380 | happened billions of years ago.
01:03:03.020 | - So to you, it's more interesting to understand
01:03:05.940 | the fundamental mechanisms of evolution, for example,
01:03:08.500 | than it is the results, the messy results of its processes.
01:03:13.060 | - I can't say which is more interesting.
01:03:14.760 | I can say which I think is more deep.
01:03:18.000 | I think theory and abstraction,
01:03:22.200 | which can be achieved completely deductively,
01:03:25.200 | is deeper because it has nothing to do with circumstance
01:03:30.360 | and everything to do with logic and thought.
01:03:34.940 | So if we were ever to interact with aliens, for example,
01:03:41.800 | we would not have our biology in common.
01:03:46.800 | If these were some sort of really intelligent life form,
01:03:52.020 | we would have math and physics in common
01:03:56.940 | because the laws of physics will be the same
01:04:01.520 | everywhere in the universe.
01:04:03.660 | Our particular anatomy and biology
01:04:09.760 | pertains only to life on this planet,
01:04:14.040 | and the principles may apply more ubiquitously.
01:04:17.360 | - Do you ever think about aliens,
01:04:18.920 | like what they might look like?
01:04:20.480 | - I try to, when I deal with thought experiments like these,
01:04:24.760 | I try to keep a very abstract mindset,
01:04:29.760 | and I notice that whenever I try to instantiate
01:04:35.120 | these abstractions, I corrupt whatever thoughts
01:04:40.120 | there are for which they're useful.
01:04:43.760 | - It's kind of like the labels discussion.
01:04:45.320 | So the moment you try to make it concrete,
01:04:48.360 | it's probably gonna look like some cute version of a human.
01:04:51.360 | It's the little green fellas with the eyes and so on,
01:04:56.400 | or whatever.
01:04:57.240 | Whatever the movies have instilled,
01:04:59.520 | like your cultural upbringing,
01:05:01.320 | you're going to project onto that
01:05:02.760 | and the assumptions you have.
01:05:04.840 | - Exactly. - That's interesting.
01:05:05.680 | So you prefer to step away and think and abstract notions
01:05:09.240 | of what it means to be intelligent,
01:05:10.560 | what it means to be a living life form
01:05:12.960 | and all that kind of stuff.
01:05:14.280 | - I try to, I almost try to pretend I'm blind and I'm deaf
01:05:18.880 | and I'm only a mind with no inductive reasoning capacity
01:05:23.880 | when I'm trying to think about thought experiments
01:05:28.280 | like these because I know that if I incorporate
01:05:33.560 | whatever my eyes instruct my brain,
01:05:37.920 | I will impede my ability to think as deeply as possible.
01:05:42.920 | Because once again, the thing which shallows our thought
01:05:51.120 | can be the incorporation of circumstance and coincidence.
01:05:54.480 | And for particular kinds of thought, that's very important.
01:05:57.400 | I'm not discounting the use of inductive reasoning
01:06:00.640 | in many humanities and in many sciences,
01:06:03.680 | but for the deepest of thoughts,
01:06:07.400 | once again, I feel it's important to try to transcend
01:06:11.000 | whatever methods of observation
01:06:14.000 | characterize human experience.
01:06:15.480 | - See, but within that, that's all really beautifully put.
01:06:18.160 | I wonder if there is a common mathematics
01:06:22.600 | and a common physics between us and alien beings,
01:06:27.880 | we still have to make concrete
01:06:30.440 | the methods of communication.
01:06:32.700 | - Yeah.
01:06:33.880 | - And that's a fascinating question of like,
01:06:36.500 | while remaining in these abstract fundamental ideas,
01:06:39.440 | how do we communicate with them?
01:06:41.880 | I mean, I suppose that that question could be applied
01:06:43.960 | to different cultures on earth.
01:06:47.280 | But it's finding a common language.
01:06:50.920 | Do you think about that kind of problem
01:06:53.000 | of basically communicating abstract fundamental ideas?
01:06:57.280 | - My least favorite aspect of math or physics
01:07:00.880 | or any of these really deep sciences
01:07:02.880 | is the symbolic component.
01:07:05.600 | You know, I'm dyslexic.
01:07:07.440 | I don't like looking at symbols.
01:07:10.440 | They're too often a source of ambiguity.
01:07:14.400 | And I think you're entirely right
01:07:15.960 | that if one thing holds us back with
01:07:19.040 | communication with something that behaves
01:07:26.600 | or looks nothing like us,
01:07:28.380 | I think if one thing holds us back,
01:07:31.320 | it will be symbols and the communication of deep thought.
01:07:37.620 | Because as I said, I think communication
01:07:40.200 | frequently compromises thought by intention
01:07:43.640 | or by just theoretical inadequacy.
01:07:48.520 | - So on this topic, actually,
01:07:50.040 | it'd be fun to see what your thoughts are.
01:07:52.320 | Do you think math is invented or discovered?
01:07:56.960 | So you said that math,
01:07:58.320 | we might share ideas of mathematics and physics
01:08:03.080 | with alien life forms.
01:08:04.640 | So it's uniform in some sense of uniform
01:08:07.440 | throughout the universe.
01:08:08.640 | Do you think this thing that we call mathematics
01:08:14.800 | is something that's kind of fundamental
01:08:18.280 | to the world we live in?
01:08:19.440 | Or is it just some kind of pretty axioms
01:08:23.840 | and theorems we've come up with
01:08:25.120 | to try to describe the patterns we see in the world?
01:08:28.280 | - I think it's completely discovered
01:08:31.560 | and completely fundamental to all experience.
01:08:34.840 | I think the only component of mathematics
01:08:38.260 | that has been invented is the expression of it.
01:08:41.200 | And I think in some sense,
01:08:44.360 | there's almost an arrogance required
01:08:48.580 | to believe that whatever aspect we invent
01:08:52.240 | having to do with math and physics and theory,
01:09:00.440 | there is an arrogance required to truly believe
01:09:05.120 | that that belongs on any sort of stage
01:09:07.320 | with the actual beauty of the matters being discovered.
01:09:12.140 | So we need our minds and our intellects,
01:09:17.140 | and in some sense,
01:09:19.960 | our pens to be able to play with these things
01:09:24.600 | and communicate about them.
01:09:28.360 | And those hands and those pens are the things
01:09:33.360 | which smudge the most beautiful thing
01:09:36.440 | that humanity can ever experience.
01:09:40.200 | And maybe if we interact with some intelligent life form,
01:09:45.860 | they will have their own unique smudges.
01:09:50.860 | But the canvas, which is beautiful,
01:09:55.100 | must be identical because that is universal
01:09:57.780 | and ubiquitous truth.
01:09:59.240 | And that's what makes it deep and meaningful
01:10:01.560 | is that it's so much more important
01:10:03.800 | than whatever we're programmed to enjoy
01:10:07.460 | as an aspect of human experience.
01:10:10.140 | - Yeah, that's really beautifully put.
01:10:13.260 | That the human language is these messy smudges
01:10:17.740 | of trying to express something underlying
01:10:19.780 | that is beautiful.
01:10:21.960 | Speaking of that, on the physics side,
01:10:26.620 | do you think the pursuit of a theory
01:10:30.420 | of everything in physics,
01:10:31.900 | as we may call it in our current times,
01:10:34.500 | of understanding the basic fabric of reality
01:10:37.940 | from a physics perspective is an important pursuit?
01:10:41.780 | - I think it's essential.
01:10:43.620 | As I've said, I think ideation is our only escape
01:10:49.220 | from the constraints of human condition.
01:10:54.220 | And I think that it's important
01:10:57.100 | that all great thoughts and ideas are bound together.
01:11:01.700 | And I think the math is beautiful
01:11:04.660 | and it ensures that the things which bind great ideas,
01:11:10.380 | which have already been had in great discoveries together,
01:11:13.280 | it ensures that those strings will be beautiful.
01:11:18.280 | I think it's very important to unify all theories
01:11:23.380 | that have brought us to where we are.
01:11:26.020 | - Do you think humans can do it?
01:11:28.700 | Do you think humans can solve this puzzle?
01:11:30.260 | Is it possible that we,
01:11:32.100 | with our limited cognitive capacity,
01:11:33.660 | will never be able to truly understand this deep,
01:11:37.540 | like deeply understand this underlying canvas?
01:11:42.460 | - I think if not, it will be people like you
01:11:46.180 | who invent some sort of,
01:11:51.180 | I don't know, we'll call it computation for now,
01:11:55.340 | that will be able to not only discover
01:12:00.340 | that which transcends humanity,
01:12:05.300 | but to transcend human methods
01:12:08.180 | of discovering that which is above it.
01:12:10.740 | - So superintelligent systems, AGI and so on,
01:12:14.660 | that are better physicists than us.
01:12:17.260 | I wonder if you might be able to comment,
01:12:19.820 | so your dad does happen to be somebody
01:12:21.540 | who boldly seeks this kind of deep understanding of physics,
01:12:26.540 | the underlying nature of reality from a physics perspective,
01:12:30.220 | from a mathematical physics perspective.
01:12:34.900 | Do you have hope your dad figures it out?
01:12:37.100 | - I have great hope.
01:12:38.780 | It's not supposed to be my journey,
01:12:41.220 | it's supposed to be his journey,
01:12:42.620 | it's supposed to be his to express to the world.
01:12:47.220 | Obviously, I'm so proud that I'm connected
01:12:51.180 | to someone who is determined to do such a thing.
01:12:54.940 | And on the other hand, maybe in some sense,
01:12:58.940 | I feel bad for him for having to,
01:13:02.980 | if he's gonna be the thing which discovers
01:13:07.100 | some sort of grand unified theory and expresses it,
01:13:11.340 | I feel sorry that he will have to smudge
01:13:14.420 | whatever canvas this thing is because--
01:13:18.940 | - Because he's human.
01:13:19.900 | - Really, I think, I know, I've seen a little bit
01:13:24.220 | of what I think great math and great physics looks like,
01:13:27.820 | and it's unbelievably beautiful.
01:13:31.020 | And then you have to present it to a world
01:13:33.500 | with market constraints and all of this messy sloppiness.
01:13:38.500 | I feel bad, in some sense, for my dad
01:13:43.620 | because he has to go back and forth
01:13:45.540 | between this beautiful world of math
01:13:47.900 | and whatever the messiness is of his human life.
01:13:52.900 | - And then the scientific community broadly
01:13:56.940 | with egos and tensions and just the--
01:13:58.860 | - Exactly.
01:13:59.700 | - Dynamics of what makes us human.
01:14:03.620 | - He's also very lucky that he gets to play
01:14:05.380 | with these sorts of things.
01:14:07.180 | It's a mixed bag.
01:14:10.060 | I both feel a little sorry for him
01:14:12.420 | for having to deal with the beauty
01:14:14.340 | as well as the smudging and the sloppiness
01:14:19.260 | of human expression.
01:14:21.380 | And I think it's difficult not to envy such a,
01:14:27.220 | such a beautiful insight or life or vision.
01:14:32.220 | - Well, that's your own path as well
01:14:36.980 | is this kind of struggle of, as you mentioned,
01:14:41.180 | exploring the beauty of different ideas
01:14:44.100 | while having to communicate those ideas
01:14:47.500 | with the best smudges you can in a world
01:14:51.060 | that wants to put labels, that wants to misinterpret,
01:14:53.420 | that wants to destroy the beauty of those ideas.
01:14:57.820 | And that's, you seem to at this time
01:15:00.500 | with your youthful enthusiasm embracing that struggle
01:15:05.020 | despite the fear, in the face of fear.
01:15:07.020 | Your dad also carries that same youthful enthusiasm as well.
01:15:14.260 | But that said, your dad, Eric Weinstein,
01:15:18.020 | he's a powerful voice, I would say,
01:15:19.580 | powerful intellect in public discourse.
01:15:22.060 | Is this a burden for you or an inspiration or both
01:15:27.060 | as a young mind yourself?
01:15:30.500 | - I think, as I said, there's this weird contrast of,
01:15:36.380 | you know, I know that he has ideas
01:15:40.720 | which I think are very beautiful
01:15:42.460 | and I know he has to deal with the sort of,
01:15:48.580 | there's something you have to sacrifice in beauty
01:15:52.180 | when you bring it to a world which is not always beautiful.
01:15:57.660 | And there's an aspect of that which sort of scares me
01:16:04.860 | about this kind of thing.
01:16:07.540 | I also think that,
01:16:09.060 | especially since I'm trying to think about
01:16:14.240 | how I should appear publicly,
01:16:16.780 | my dad has been very inspirational
01:16:20.100 | in that I think he brings a sort of fastidious care
01:16:24.980 | to very difficult conversations that--
01:16:27.700 | - What does fastidious mean?
01:16:29.580 | - Like, just very careful and thoughtful.
01:16:34.100 | He brings that sort of attitude to,
01:16:38.260 | I think, really difficult conversations.
01:16:42.860 | And I know that I don't have that skill yet.
01:16:45.940 | I don't think I'm terrible, but--
01:16:47.860 | - The care, the nuance,
01:16:49.700 | and yet not being afraid to push forward.
01:16:52.740 | - Yeah, I would really like to learn from my dad there.
01:16:55.620 | I think also my dad has been very important to my life
01:17:00.620 | just because I've always been
01:17:03.100 | a sort of very idiosyncratic thinker.
01:17:06.580 | And I think I don't always know how to interact
01:17:13.180 | with the world for those sorts of reasons.
01:17:16.500 | And I think my dad has always been similar.
01:17:21.500 | And if not for my dad,
01:17:23.100 | I don't know if I would just believe
01:17:25.060 | that I was stupid or something.
01:17:27.080 | Because I wouldn't know how to,
01:17:30.900 | I don't know if I would know how to interpret
01:17:33.540 | my differences from convention.
01:17:35.780 | - So he gave you the power to be different.
01:17:42.440 | And use that as a superpower.
01:17:44.820 | - Yeah, I guess you could put it that way.
01:17:49.060 | I don't know who I would believe I am
01:17:52.020 | if I didn't have my dad telling me
01:17:56.420 | that it wasn't my own stupidity,
01:17:58.540 | which alienated me from certain aspects of standard life.
01:18:02.960 | So I'm very, very thankful for that.
01:18:06.020 | - Is there a fond memory you have
01:18:07.660 | about an interaction with your dad,
01:18:09.380 | either funny, profound, that kind of sticks with you now?
01:18:14.380 | - A lot.
01:18:16.320 | - Part of the reason I ask that,
01:18:19.960 | of course, is just fascinating
01:18:22.520 | to see somebody as brilliant as you,
01:18:24.700 | see how the people that you interact with,
01:18:28.060 | how they form the mind that you have.
01:18:30.720 | But also to give an insight of another public figure
01:18:35.200 | like your dad to see from your perspective
01:18:38.280 | of what kind of little magical moments
01:18:41.120 | happen in private life.
01:18:42.660 | - I would say, I remember,
01:18:45.720 | I think I just posted about this on Instagram or something.
01:18:49.280 | - Otherwise it didn't happen if you didn't post that, yeah.
01:18:53.840 | - One person who's always sort of mattered
01:18:56.760 | to whatever weird life and experience I've had
01:19:00.500 | has been this comedian, Tom Lehrer.
01:19:02.580 | Do you know him?
01:19:04.800 | - Yes.
01:19:05.640 | - Yeah.
01:19:06.600 | - I love him very much.
01:19:08.120 | - Likewise.
01:19:09.320 | Anyway, I remember, I think I was five or something,
01:19:12.920 | my dad came home with the CD, this Tom Lehrer CD,
01:19:17.120 | and he told me to listen to it.
01:19:19.080 | And it was all of this bizarre satirical writing
01:19:24.080 | about prostitution and cutting up babies
01:19:29.120 | and all kinds of ridiculously vile content
01:19:33.000 | for a five-year-old.
01:19:35.880 | I think beyond just my love of Tom Lehrer,
01:19:40.880 | I think it was a way for my dad to express
01:19:46.520 | that from a very young age,
01:19:48.680 | he was really ready to treat me like an adult
01:19:53.000 | and he was ready to trust me
01:19:55.680 | and share his life and his enjoyments with me
01:20:05.240 | in a way that was unconventional
01:20:07.480 | because he was willing to discard tradition
01:20:12.480 | for the chance at a really unique
01:20:17.200 | and meaningful parental relationship.
01:20:21.360 | - So trusting that his particular brand of weirdness
01:20:24.960 | is something you can understand at a young age
01:20:27.000 | and embrace and learn from it.
01:20:28.640 | Tom Lehrer, we should clarify,
01:20:30.420 | is not all about, what is it, murder and prostitution.
01:20:33.120 | He's one of the wittiest, most brilliant musical artists.
01:20:36.120 | If you haven't listened to his work, you should.
01:20:40.160 | He's just a rare intellect
01:20:43.840 | who's able to sort of in catchy rhyme
01:20:46.760 | express some really difficult ideas through satire,
01:20:50.160 | I suppose, that still, even though it's decades ago,
01:20:55.160 | still resonates today, some of the ideas that he expressed.
01:20:58.080 | - I will say also that I think I am probably
01:21:02.880 | a more cultured person having listened to Tom Lehrer
01:21:06.560 | than I would have been without.
01:21:08.920 | I think a lot of his comedy draws upon a canon
01:21:13.360 | that I was really driven to research
01:21:15.440 | by saying, oh, what does this mean?
01:21:16.800 | I don't understand that reference.
01:21:18.640 | There are a lot of references there
01:21:20.000 | to really inspirational things,
01:21:25.000 | which he sort of assumes going into a lot of his songs.
01:21:28.040 | And for many of us, like me,
01:21:29.480 | you have to piece those things together.
01:21:31.760 | We're looking at Wikipedia pages and whatnot.
01:21:33.920 | But to tie this back to the original question,
01:21:38.040 | I think there's sort of a break it,
01:21:43.040 | you bought it notion of parenting.
01:21:47.360 | I think, really, if you're not gonna accept a standard,
01:21:52.360 | you have to invent your own.
01:21:55.400 | And I think in some ways that was my dad's way
01:21:57.920 | of telling me that if I was too unstandard as a child,
01:22:02.920 | he would invent his own way of parenting me
01:22:07.800 | because that was worth it to him.
01:22:09.520 | And I think that was very meaningful to me.
01:22:11.640 | - I know you're young.
01:22:13.080 | This is a weird time to ask this question.
01:22:15.280 | Are you cognizant on the role of love
01:22:20.040 | in your relationship with your dad?
01:22:21.880 | Are you at a place mentally as a man yourself
01:22:26.040 | to admit that you love the guy?
01:22:28.360 | - I love my dad with the connection
01:22:32.640 | that I think I've had to very few things in the world.
01:22:36.200 | I think my dad is one of the people
01:22:37.760 | that's allowed me to see myself.
01:22:40.600 | And I don't know who I would imagine myself to be,
01:22:45.480 | if not for my dad.
01:22:46.360 | That isn't to say that I agree with him on everything,
01:22:49.760 | but I think he's given me courage to accept myself
01:22:54.760 | and to believe that I can teach myself
01:22:59.720 | where I'm unable to learn from convention.
01:23:03.040 | So I have a very, I love my dad very dearly, yes.
01:23:07.920 | - Is there ways in which you wish you could be a better son?
01:23:12.920 | - Firstly, I'd like to say I'm sure
01:23:15.560 | before I figure out exactly what those are.
01:23:18.680 | I think whenever I come to conclusions on what that means,
01:23:23.840 | I'm eager to take them.
01:23:26.740 | - What do you mean by that?
01:23:31.040 | What do you mean by conclusions?
01:23:32.440 | - If I have an idea for how to be a better son,
01:23:35.400 | I think I'm inclined to try to be that person.
01:23:39.140 | I think that's true of almost anything.
01:23:41.080 | I think if I have ideas for improvement,
01:23:44.880 | it would be wasteful not to act on them.
01:23:49.400 | So I suppose one thing I could say is that
01:23:54.400 | I think idealism and what could almost be considered naivete
01:24:03.680 | is not necessarily a lacking of maturity,
01:24:10.420 | but instead an obligation to be a better person.
01:24:19.440 | Those older than us who have lived and seen too much
01:24:24.440 | to fully believe in what is naive and right
01:24:32.760 | without the assistance of the young
01:24:39.480 | to re-inspire traditional idealism.
01:24:46.600 | And so perhaps instead of trying to be more mature
01:24:51.600 | all the time, I should spend some time
01:24:55.520 | trying to be an idealistic form of hope
01:24:59.880 | in the lives of people who maybe have seen too much
01:25:04.320 | to retain all of that original hope.
01:25:07.760 | So that's something that's difficult,
01:25:11.640 | but especially appearing in public
01:25:15.120 | as someone as young as I am,
01:25:17.840 | I think anything I do which is juvenile by choice
01:25:21.320 | will be held against me.
01:25:22.880 | But maybe that's a sacrifice that I have to make.
01:25:27.000 | I have to retain some sort of youthful hope and optimism.
01:25:30.800 | - Yeah, I can't.
01:25:32.200 | I mean, I'm gonna get teary-eyed now, but I have allergies.
01:25:37.000 | But also, this is pretty powerful what you're saying.
01:25:39.080 | I certainly share your ideas.
01:25:40.440 | It's something I struggle with just by instinct.
01:25:44.200 | You should read "The Idiot" by Dostoevsky.
01:25:46.360 | By instinct, I love being naive
01:25:49.720 | and seeing the world from a hopeful perspective,
01:25:54.560 | from an optimistic perspective.
01:25:56.440 | And it's sad that that is something you pay a price for
01:26:01.440 | in this world.
01:26:03.040 | Like in the academic world, especially as you're coming up
01:26:06.680 | through schooling, but just actually it's a hit
01:26:09.200 | on your reputation throughout your life.
01:26:12.160 | And it's a sad truth, but you have to,
01:26:14.600 | like for many things, if it's a principle you hold,
01:26:18.840 | you have to be willing to pay the costs.
01:26:21.760 | And ultimately, I believe that in part a hopeful view
01:26:26.760 | will help you realize the best version of yourself
01:26:31.840 | because optimism is a kind of, optimism is productive.
01:26:37.760 | Like believing that the world is and can be amazing
01:26:42.200 | allows you to create a more amazing world somehow.
01:26:47.360 | I mean, I'm not sure if it's a human nature
01:26:51.080 | or a fundamental law of physics, I don't know.
01:26:53.120 | But believing the impossible in the sense
01:26:55.160 | being optimistic about the thing.
01:26:57.960 | It's similar, like going back to what you've said,
01:27:00.920 | is like believing that a radical,
01:27:02.520 | that a powerful single idea, that a single individual
01:27:05.720 | can revolutionize some framework that we're operating in
01:27:10.720 | that will change the world for the better.
01:27:12.920 | Believing that allows you to have the chance to create that.
01:27:17.800 | And so I'm with you on the optimism,
01:27:19.940 | but you may have to pay a cost of optimism
01:27:23.560 | and naive hopefulness.
01:27:26.800 | - I mean, in some sense, optimism limits freedom.
01:27:29.880 | I think if we don't really have much choice
01:27:34.480 | in choosing what is perfect, if it exists as an ideal,
01:27:39.480 | then there isn't much room for creativity.
01:27:45.680 | And that's a danger of optimism
01:27:47.160 | as someone who would like to be creative.
01:27:50.700 | I think it was Warren Zevon who said,
01:27:54.320 | "Except in dreams, you're never really free."
01:27:56.300 | And that's something I think about a lot.
01:27:58.560 | He's an interesting guy also, I really like him.
01:28:04.360 | - On that topic, you do have a bit of an appreciation
01:28:08.880 | and connection with music.
01:28:09.880 | I saw you play some guitar a few months ago.
01:28:12.080 | Can you put in like a philosophical sense
01:28:18.600 | your connection to music?
01:28:20.940 | What insights about life,
01:28:22.680 | about just the way you see the world,
01:28:24.840 | do you get from music?
01:28:26.080 | - I think the role music has played in my life
01:28:29.080 | was originally motivated by sort of wanting to prove things
01:28:34.480 | to myself, I really have no ear for music.
01:28:37.520 | I have a terrible sense of pitch.
01:28:41.120 | And I think a lot of music relies on
01:28:44.120 | very standard teaching.
01:28:45.160 | If you think about lessons, for example, music lessons,
01:28:50.160 | there's sort of a routine to them,
01:28:53.960 | which is so archaic and traditional
01:28:56.400 | that there's no room for deviation.
01:28:59.320 | I think all of that suggested to me
01:29:03.440 | that I would never have a relationship with music.
01:29:06.480 | I loved listening to music, it was just,
01:29:08.720 | it was difficult to me, it sort of saddened me.
01:29:11.340 | I wanted to know if there was any way
01:29:14.760 | I could build a connection to music,
01:29:17.240 | given who I am, my own idiosyncrasies,
01:29:21.120 | what challenges I have.
01:29:24.740 | I decided to try to learn music theory
01:29:27.840 | before I touched an instrument.
01:29:32.880 | I think that gave me a very unique opportunity
01:29:35.160 | instead of spending my time fruitlessly
01:29:38.280 | at the beginning on the syntax of a particular instrument.
01:29:41.120 | This is how you, this is your posture on the piano,
01:29:43.680 | this is how you hold your fingers.
01:29:46.040 | I tried instead to learn what made music work.
01:29:50.560 | And the wonderful thing about that was
01:29:52.920 | I'm pretty sure that any instrument with discrete notes
01:29:57.080 | is mine for the taking within a day or so
01:30:00.480 | of having the ability to play with it.
01:30:03.200 | So I think approaching music abstractly
01:30:08.000 | gave me the ability to instantiate it everywhere.
01:30:12.320 | And I think it also taught me something about self-teaching.
01:30:18.240 | Like recently I've tried getting into classical music
01:30:22.120 | because at least traditionally this is the thing
01:30:25.400 | which is thought to require the most rigor,
01:30:30.840 | and traditional teaching.
01:30:33.960 | I think it's essentially taught me,
01:30:37.920 | even if I'll never be a great classical performer,
01:34:41.640 | that there is nothing one can't really teach themselves
01:30:46.200 | in this era.
01:30:48.400 | So I've been enjoying whatever connection I have with music.
01:30:53.400 | The other thing I'll say about it
01:30:56.160 | is that it's a very rewarding learning process.
01:31:00.400 | We know, for example, that music
01:31:02.960 | accesses our neurochemicals very directly.
01:31:09.240 | And if you teach yourself a little bit of theory
01:31:14.240 | and are able to instantiate it on an instrument
01:31:19.320 | without wasting your time or spending your time
01:31:22.800 | tediously on learning the particulars of that instrument,
01:31:27.800 | you can instantly sit down and access your own dopamine loops
01:31:32.360 | and so you don't really need to motivate yourself with music
01:31:36.040 | because you're giving your brain drugs.
01:31:39.440 | Who needs motivation to give themselves drugs
01:31:43.320 | and learn something?
01:31:45.960 | So I think more people should be playing music
01:31:51.960 | and I think a lot of people don't realize
01:31:54.640 | how easy it can be to approach
01:31:57.000 | if you take a sort of unstandard approach.
01:32:00.440 | - And the unstandard approach in your sense
01:32:03.080 | was understanding the theory first
01:32:05.200 | and then just from the foundation of the theory
01:32:08.600 | be able to then just take on any instrument
01:32:13.120 | and start creating something that sounds reasonably good.
01:32:17.720 | - Yeah.
01:32:18.560 | - Or learning something that sounds reasonably good
01:32:19.960 | and then plugging into the, as you call them,
01:32:23.960 | the dopamine loops of your brain,
01:32:26.600 | allowing yourself to enjoy the process.
01:32:28.680 | - Yeah.
01:32:29.520 | - What about the pain in the ass
01:32:32.080 | rigorous process of practice?
01:32:34.320 | So is there something about my dopamine loops, for example,
01:32:37.580 | that enjoys doing the same thing
01:32:38.920 | over and over and over again and watching myself improve?
01:32:42.520 | - I think that's because music is more effective
01:32:46.080 | at accessing us when it's played correctly
01:32:49.880 | and I think you play,
01:32:52.040 | I'm positive that you play music
01:32:53.920 | much more correctly than I do.
01:32:56.040 | So if you are going to sit down and play something
01:32:59.400 | that you've learned,
01:33:00.520 | that piece will be much more satisfying
01:33:03.440 | to your ears and to your brain
01:33:05.840 | than if I were to play that piece
01:33:07.800 | just sitting down with an instrument.
01:33:11.340 | But it's sort of a trade-off with freedom and rigor
01:33:18.480 | because even if I should be spending more of my time
01:33:23.000 | practicing rigorously,
01:33:25.000 | I know I don't have to to make me happy.
01:33:28.360 | - Well, Jocko Willink, I think,
01:33:30.400 | has this saying that discipline is freedom.
01:33:33.680 | So maybe the repetition of the disciplined repetition
01:33:38.680 | is actually one of the mechanisms of achieving freedom.
01:33:43.000 | It's another way to get to freedom.
01:33:45.440 | That it doesn't have to be a constraint
01:33:47.720 | but in a sense unlocks greater sets of opportunity
01:33:52.600 | that then results in a deeper experience of freedom.
01:33:56.280 | - Maybe, I mean, particularly if you're thinking
01:33:58.600 | about discipline and method for improvisation,
01:34:03.600 | there are a million pieces that you could improvise
01:34:10.280 | with the same discipline
01:34:12.880 | and how to approach that improvisation.
01:34:16.220 | So I think that it's true that discipline promotes freedom
01:34:21.220 | if you insert a layer of indirection
01:34:28.760 | because I think if you're trying to learn one piece
01:34:33.240 | that was written 400 years ago
01:34:35.400 | and you're playing it over and over again,
01:34:38.920 | there is nothing personable, sorry,
01:34:40.880 | there's nothing personal or creative about that process
01:34:46.120 | even if it's beautiful and satisfying.
01:34:49.600 | There has to be some sort of discipline applied
01:34:53.240 | to the creativity of self.
01:34:56.060 | So I think that is the layer of indirection
01:35:01.060 | which reconciles both approaches to freedom and discipline
01:35:06.580 | and enjoyment of music.
01:35:08.440 | - Discipline applied to the creativity of self.
01:35:15.380 | - Damn, Zev.
01:35:16.660 | - Thank you.
01:35:18.220 | - Now, as an aging man yourself,
01:35:22.500 | if you were to give an advice to young folks today
01:35:25.860 | of how to approach life and maybe advice to yourself,
01:35:30.180 | is there some way you could condense a set of principles,
01:35:35.180 | a set of advices you would give to yourself
01:35:38.820 | and to other young folks of how to live life?
01:35:43.740 | - Sure, I would say that with the collapse of systems
01:35:48.740 | that have existed for thousands of years,
01:35:54.820 | like whatever is happening with universities
01:35:57.720 | might be an example of some system
01:35:59.400 | that may or may not be decaying.
01:36:02.580 | I think with the destruction of important systems,
01:36:09.780 | there is a unique opportunity to invest in oneself
01:36:14.780 | and I think that is always the right approach
01:36:19.580 | provided that the investment one makes in his self
01:36:23.620 | is obligated towards humanity as a whole.
01:36:28.620 | And I think that is the great struggle of my generation.
01:36:33.980 | Will we create our own paths
01:36:38.580 | that are capable of saving whatever is collapsing
01:36:42.540 | or will we be squashed by the debris?
01:36:46.620 | And I hope to articulate what patterns
01:36:51.620 | I see this struggle taking over the years
01:36:55.680 | that my generation becomes particularly active in the world
01:36:59.540 | as an important force.
01:37:02.540 | I think already we're important as a demographic
01:37:06.380 | to particular markets, but I should hope
01:37:08.860 | that our voices will matter as well starting very soon.
01:37:12.860 | So I would try to think about that.
01:37:16.260 | That would be my advice.
01:37:17.460 | - Do you, it's a silly question to ask perhaps,
01:37:21.580 | but a bit of a Russian one.
01:37:25.140 | It's silly because you're young,
01:37:27.780 | but I don't think it's actually silly because you're young.
01:37:31.780 | Do you ponder your mortality
01:37:34.460 | and are you just afraid of death in general?
01:37:38.860 | - So tying us back to our previous conversations
01:37:45.100 | about abstraction versus experience,
01:37:49.820 | which is determining our notions of our life and our world,
01:37:58.300 | death is interesting in that it is obviously
01:38:04.620 | hyper important to a person's life.
01:38:07.420 | And it is something that for the most part,
01:38:09.300 | no human will really experience
01:38:11.420 | and be able to reflect upon.
01:38:15.660 | So our notions of death are sort of proof
01:38:20.500 | that if we want to make the most of our lives,
01:38:22.820 | we have to think abstractly and relying not at all at times
01:38:29.660 | on experiential thought and understandings
01:38:34.660 | because we can't really experience death
01:38:43.620 | and reflect upon it, and use it to motivate us.
01:38:43.620 | It has to remain some sort of abstraction.
01:38:45.820 | And I think if we have trouble
01:38:49.620 | comprehending true abstraction,
01:38:53.300 | we tend to view ourselves as nearly immortal.
01:38:58.500 | And I think that's very dangerous.
01:39:00.300 | So one concrete implication for my belief in abstraction
01:39:05.300 | would be that we all need to be aware of our own deaths
01:39:12.060 | and we need to understand concretely
01:39:19.300 | the boundaries of our lifetimes.
01:39:22.680 | And no amount of experience can really motivate that.
01:39:26.980 | It has to be driven by thought and abstraction in theory.
01:39:31.780 | - That's one of the deepest elements
01:39:34.940 | of what it means to be human
01:39:36.060 | is our ability to form abstractions
01:39:37.820 | about our mortality versus animals.
01:39:41.300 | I think there's just something really fundamental
01:39:44.500 | about our interaction with the abstractions of death.
01:39:49.460 | And there's a lot of philosophers
01:39:54.420 | that say that that's actually core
01:39:57.500 | to everything we create in this world,
01:40:01.860 | which is like us struggling with this impossible
01:40:05.140 | to understand idea of mortality.
01:40:10.000 | And I mean, I'm drawn to this idea
01:40:13.460 | because both the mystery of it,
01:40:17.340 | but also just from the human experience perspective,
01:40:19.700 | it seems that you get a lot of meaning from stuff ending.
01:40:24.220 | It's kind of sad, the flip side of that,
01:40:27.140 | to think that stuff won't be as meaningful
01:40:30.340 | if it doesn't end, if it's not finite.
01:40:33.260 | But it seems like resources gain value from being finite.
01:40:38.260 | And that's true for time,
01:40:40.140 | that's true for the deliciousness of ice cream,
01:40:42.980 | that's true for love, for everything,
01:40:45.300 | for music and so on.
01:40:47.380 | And yeah, it seems deeply human to try to,
01:40:52.380 | as you said, concretize the abstractions of mortality,
01:40:58.140 | even though we can never truly experience it,
01:41:00.540 | 'cause that's the whole point of it.
01:41:02.500 | Once it ends, you can't experience it.
01:41:04.540 | - Yeah.
01:41:05.880 | - Again, another ridiculous question.
01:41:07.900 | - Okay.
01:41:08.740 | - What do you think is the meaning of it all?
01:41:13.100 | What's the meaning of life?
01:41:14.980 | From your deep thinking about this world,
01:41:19.980 | is there a good way to answer any of the why questions
01:41:23.740 | about this existence here on Earth?
01:41:25.760 | - And as I said, we're here in part by principle
01:41:28.900 | and in part by accident.
01:41:31.060 | And a lot of the things which bring us joy
01:41:34.820 | are programmed to bring us joy
01:41:38.740 | to ensure our evolutionary success.
01:41:42.380 | And so, I would not necessarily consider
01:41:47.380 | all of the things which bring us joy to be meaningful.
01:41:56.040 | I think they play a very obvious role in a clear pattern,
01:42:02.340 | and we don't have much choice in that.
01:42:06.260 | I think that rules out the idea of joy being the meaning
01:42:12.340 | of life.
01:42:13.180 | I think it's a nice thing we get to have,
01:42:18.180 | even if it's not inherently meaningful.
01:42:22.460 | I think the most wonderful thing
01:42:27.460 | that we have ever been given
01:42:34.460 | has been our ability to, as I said,
01:42:42.260 | observe what transcends us as humans.
01:42:47.260 | And I think to live a meaningful life is to see that
01:42:52.220 | and hopefully contribute to that.
01:42:54.540 | - So to try to understand what makes us human
01:42:58.980 | and to transcend that,
01:43:00.260 | and in some small way contribute to it
01:43:03.660 | in the finite time we have here.
01:43:08.460 | Yeah, those are some powerful words.
01:43:13.460 | - Thank you.
01:43:14.420 | - You're a truly special human being.
01:43:16.020 | It's really an honor to talk to you.
01:43:17.820 | I'm a newborn fan of yours,
01:43:23.180 | and I can't wait to see what you bring to the world.
01:43:25.540 | Please embrace the fear you feel and be bold.
01:43:30.500 | And I think you will do some special things in this world.
01:43:34.980 | I'm confident if this world doesn't destroy you,
01:43:37.580 | and I hope it doesn't.
01:43:38.620 | Be strong, be brave.
01:43:41.300 | You're an inspiration.
01:43:42.600 | Keep doing your thing.
01:43:44.300 | And thanks for talking today.
01:43:46.060 | - Thank you so much, Lex.
01:43:47.300 | - Thanks for listening to this conversation
01:43:50.020 | with Zev Weinstein, and thank you to our sponsors,
01:43:53.260 | ExpressVPN, Grammarly Grammar Assistant,
01:43:56.620 | Simply Safe Home Security,
01:43:58.500 | and Magic Spoon Low Carb Cereal.
01:44:00.980 | So the choice is privacy, grammar, safety, or health.
01:44:04.620 | Choose wisely, my friends.
01:44:06.020 | And if you wish, click the sponsor links below
01:44:08.660 | to get a discount and to support this podcast.
01:44:11.820 | And now let me leave you with some words from Aristotle.
01:44:15.340 | Knowing yourself is the beginning of all wisdom.
01:44:20.020 | Thank you for listening, and hope to see you next time.
01:44:22.860 | (upbeat music)