
Lex Fridman: Ask Me Anything - AMA January 2021 | Lex Fridman Podcast


Chapters

0:00 Introduction
0:43 Will AGI suffer from depression?
5:20 Love is an escape from the muck of life
11:20 What questions would you ask an alien?
20:04 How to pivot careers to computer science
27:12 What will robots look like in the future?
30:00 Disagreement with Einstein about happiness
35:56 How I pick podcast guests
45:32 How to stay optimistic about the future
53:15 Major topics I changed my mind on
1:00:45 Benefits of keto diet
1:10:08 Darkest time in my life

Whisper Transcript

00:00:00.000 | The following is an AMA episode where I answer a few questions that folks asked on Patreon,
00:00:06.040 | YouTube, and other social networks.
00:00:08.720 | I'll try to do these episodes on occasion if it's of interest to anyone at all.
00:00:14.240 | Quick mention of our sponsors.
00:00:16.840 | Brooklinen sheets, Indeed the hiring website, ExpressVPN, and the Theragun muscle recovery device.
00:00:24.400 | So the choice is sleep, employment, privacy, or muscle recovery.
00:00:30.480 | Choose wisely my friends.
00:00:31.480 | And if you wish, click the sponsor links below to get a discount and to support this podcast.
00:00:38.400 | And now, on to the questions and the answers.
00:00:43.440 | The question is, Lex, I'm a young man that has battled with depression.
00:00:47.840 | Do you think when trying to develop a human-like AI, we will reach a stumbling point where
00:00:53.160 | the AI themselves suffer from depression and other complex mental issues?
00:00:58.360 | Do you think it will be a simple fix like rewriting a piece of code or a new patch or
00:01:03.240 | update?
00:01:04.360 | Or maybe when trying to create something human-like with high fidelity, you need to leave in the
00:01:09.880 | possibility of the AI suffering from such complex mental issues that a human can.
00:01:16.400 | What are your thoughts generally and philosophically about AI suffering from depression?
00:01:22.680 | - I think that suffering is a deep fundamental property of consciousness.
00:01:30.040 | I would like to probably say quite a bit about depression.
00:01:32.920 | I have friends who suffer from depression, but that's for another time.
00:01:38.720 | That's for when we talk about depression in humans.
00:01:42.080 | I think depression is just one flavor of suffering that is part of the human condition.
00:01:48.560 | I see it as a kind of dark side street on the path to intelligence.
00:01:58.600 | In terms of robot suffering, if we were to create systems that are truly intelligent
00:02:04.080 | in the way that they're able to interact in intelligent and deeply meaningful ways with
00:02:09.560 | other humans, it's going to have many of the properties, many of the characteristics of
00:02:14.880 | the human condition, of the full human experience.
00:02:17.880 | I think depression is part of that.
00:02:19.840 | There's of course a part in us humans that longs to remove all that is cruel in this
00:02:25.720 | world.
00:02:27.240 | That's why, for people who believe in God, often the biggest question is why God allows
00:02:33.920 | there to be suffering in the world.
00:02:36.360 | There's this longing to understand why there is so much unfairness in this world.
00:02:44.160 | Because of that, there's an inclination to engineer into our systems something
00:02:50.160 | that is devoid of those things whose place in the human condition we cannot understand.
00:02:56.120 | But I think it is intricately part of the experience that is to be human.
00:03:01.600 | I think if we were to build intelligent systems that are interacting with humans, there has
00:03:07.160 | to be in some ways properties of consciousness baked in.
00:03:12.440 | If we were to have properties of consciousness baked in, we have to have the full mystery
00:03:17.000 | and uncertainty of the human experience, which yes, includes all the different flavors of
00:03:21.520 | suffering of which depression is part.
00:03:25.520 | I think the yin and the yang in all of its versions, the ups and downs of moods, but
00:03:31.120 | also the more rational, intellectual interpretations of different concepts that are less dramatic,
00:03:39.640 | all have to oscillate back and forth.
00:03:41.400 | I think that's where the interesting aspect of interactions happens.
00:03:46.400 | Just like when I have conversations in the podcast, the interesting stuff happens when
00:03:50.280 | there's disagreements, when there's a bit of turmoil, when there's a push and pull,
00:03:55.280 | when there's a changing of minds, or even just a morphing of your own opinions about
00:04:00.520 | something, your own thoughts.
00:04:01.960 | I think that's part of it.
00:04:03.240 | So I really do think all of that mess of humanity has to be engineered into AI systems that
00:04:10.680 | are interacting with humans and are trying to create meaningful interactions with those
00:04:15.680 | humans.
00:04:16.680 | There's of course a huge amount of AI systems that are going to be more intelligent than
00:04:21.280 | humans at particular tasks.
00:04:23.280 | Those do not need to have those properties of the human experience, like suffering and
00:04:26.640 | all those kinds of things.
00:04:28.200 | But for the ones that move among us, I think unfortunately depression has to be part of
00:04:34.640 | the experience, or the possibility of depression has to be part of the experience.
00:04:38.440 | Of course I tend to focus on the positive aspects of the human experience, like love,
00:04:46.280 | beauty, joy, all those kinds of things.
00:04:50.480 | But it's the yin and the yang.
00:04:52.980 | They go together.
00:04:54.560 | They're lifelong partners, unfortunately, I think.
00:04:58.680 | Now of course all of this is just hypothesis, and most of my answers to all of these questions
00:05:02.340 | are going to be just my own thoughts.
00:05:04.640 | But I am thinking about all of this from an engineering perspective, and maybe I'll have
00:05:09.000 | more to say in the future about how we actually build these kinds of things into our AI systems
00:05:16.280 | that interact with humans.
00:05:18.840 | Thanks for the great question.
00:05:19.840 | It's a tough one.
00:05:20.840 | The question is, "Lex, I was wondering if you would be willing to talk about your immigrant
00:05:26.200 | experience.
00:05:27.200 | I myself started off as an international student studying and working in America.
00:05:32.880 | Not from Russia, I'm from India, but there was a constant push and pull that I experienced
00:05:38.920 | given my life circumstance.
00:05:40.640 | I would be curious to hear how you assimilated.
00:05:44.160 | Do you feel like you belong, et cetera?
00:05:46.720 | Thank you for the AMA."
00:05:49.560 | Your question, "Do you feel like you belong?"
00:05:51.760 | hit hard for some reason.
00:05:54.080 | Maybe it's because it's late at night, maybe because I'm a bit over-caffeinated.
00:06:00.040 | Maybe what pops to mind to focus on is the aspect of loneliness, the aspect of belonging.
00:06:07.680 | I think a lot of us in the early teenage years go through that process of feeling like an
00:06:14.680 | outsider, an outcast of different kinds.
00:06:17.400 | I think it hit me the hardest personally because I was a popular kid in Russia, and when we
00:06:23.120 | moved here, I went to the opposite of being popular.
00:06:27.960 | In that position, I felt like an outcast.
00:06:32.080 | The place I moved to in America placed more of an emphasis, maybe it's a cultural thing,
00:06:38.080 | on material possessions over two things that were deeply meaningful to me,
00:06:43.760 | which are human connection, like friendship, and knowledge, like mathematics and scientific
00:06:50.920 | discovery, all those kinds of things.
00:06:53.240 | It's just the emphasis of what was valued was different, and that for me was a catalyst
00:06:58.080 | to feel like a total outcast, as opposed to being this person who looks out into the world
00:07:02.560 | and enjoys the beauty of the world.
00:07:05.520 | I went to this brooding phase of, first of all, learning the English language, but starting
00:07:11.640 | to read books, more philosophical books.
00:07:14.680 | The first one I remember reading in English was The Giver.
00:07:18.240 | That sort of helped me start thinking about this world.
00:07:21.320 | I was so fortunate to be so in love with people for so long and have close friends in Russia
00:07:30.600 | that I didn't notice in my childhood how deeply alone we all are.
00:07:39.480 | For me, the immigrant experience involved, in a small way at least at first, realizing
00:07:46.320 | that hard human truth that we are all born alone, live alone, die alone.
00:07:55.320 | Even when we're in the arms of somebody we love, we're still somehow fundamentally alone
00:08:03.360 | with our thoughts, with our hopes, with our fears, trapped in this conscious meat vessel
00:08:13.280 | between our ears.
00:08:15.440 | I think the immigrant experience for me was the catalyst to realizing and being terrified
00:08:23.480 | and also liberated by the idea that I'm alone in this world.
00:08:31.960 | At the same time was the realization that this beautiful feeling I felt from the connection
00:08:38.200 | to other humans was this gift that took me away from this dark realization.
00:08:49.560 | It's almost that love is a kind of escape from the reality of life, from the muck of
00:08:58.320 | life.
00:09:00.800 | The journey began in that way, to think about this world in this way, both the burden of
00:09:05.120 | being alone coupled with the frequent escape from that feeling by being lost in the company
00:09:13.480 | of friends, loved ones.
00:09:16.800 | Early on, coupled with this love of the human mind and curiosity about the human mind was
00:09:22.640 | the love of programming and actually building little programs and engineering systems, of
00:09:27.760 | course, building robots in college and so on.
00:09:30.320 | I think the gift of the immigrant experience of feeling like the outcast was the love of
00:09:39.160 | experiencing the deep connection with others, like a deep appreciation of it when it's there.
00:09:45.640 | I guess because it was taken away, because I was ripped out of it through moving here,
00:09:51.440 | I got to really appreciate it and start becoming cognizant of it to where I can start looking
00:09:56.560 | for it and being more grateful when I do have it.
00:10:00.000 | At the same time, a kind of curiosity started boiling up of the perspective on artificial
00:10:05.960 | intelligence systems from that kind of longing for a connection.
00:10:11.000 | As opposed to looking at robots or AI systems or even just programs that accomplish a particular
00:10:17.680 | task, can these programs accomplish the same richness of task and richness of experience
00:10:25.560 | that I came to appreciate as a human being?
00:10:28.800 | When I talk about love, there are echoes of that in my longing for the kind of experiences
00:10:38.520 | I would like to create in artificial intelligence systems, a longing born out of the immigrant
00:10:45.520 | experience, out of the loss of childlike innocence, out of starting
00:10:54.760 | to read books and think deeply about this world, all of it combined.
00:10:59.640 | I really think sometimes, unfortunately, the first step of deep gratitude is loss.
00:11:08.160 | So for me, I lost quite a bit during that time, and through that loss, I was able to
00:11:14.000 | discover the things that I truly appreciate about life.
00:11:18.400 | So let me leave it at that.
00:11:21.480 | Question is, if you were able to ask an alien some questions, what would they be?
00:11:27.880 | This is a really good question, and I find it to be actually a really good thought experiment.
00:11:33.600 | Let me put out some candidate questions out there and see what sticks.
00:11:38.440 | So first I'll probably ask for advice for the human species as a whole, for our civilization,
00:11:45.520 | of what we might do to survive and prosper for a long time to come, assuming the alien
00:11:55.840 | is from a civilization that's far older than ours or far wiser.
00:12:01.600 | I think there could be some really interesting, clear statements about the things we're doing
00:12:07.040 | here on Earth that are getting us into trouble from an alien perspective.
00:12:11.560 | So I think that's the number one thing, and maybe I'll bring up along those lines, bring
00:12:17.280 | up questions of great filters.
00:12:20.520 | If you look at the history of your civilization, when did you almost destroy the entirety of
00:12:26.440 | your species?
00:12:27.440 | It would be informative from a historical perspective to see.
00:12:32.320 | For us, it's currently the nuclear age and the few moments in the history that could
00:12:37.320 | have resulted in an all-out nuclear war.
00:12:40.160 | It'd be interesting to see if they mentioned something about AGI, something about viruses
00:12:47.720 | or wars or just things that we don't even think about.
00:12:52.600 | So I guess question number one would be some basic life advice, hoping that this alien
00:12:57.940 | is a Naval type character who can, in a crisp, short way, give some profound advice.
00:13:04.800 | Second, I would probably ask, now this is a very selfish conversation because it's just
00:13:10.960 | following along the things on top of my head that follow my curiosity.
00:13:15.760 | I would ask about the difference between their civilization and ours.
00:13:21.320 | I would ask whether they have some of these things that make us human, like love.
00:13:28.920 | Do you guys have love where you come from?
00:13:31.320 | Do you have death, mortality?
00:13:34.560 | I suspect it's possible to have mortality not even be a concept that makes any sense
00:13:40.320 | to an alien species, where of course everybody's immortal and there might be some kind of enforced
00:13:45.560 | selection mechanism, like evolution in general.
00:13:49.120 | I would ask about consciousness, try to tease apart this question of subjective
00:13:57.080 | experience: is it some kind of self-centered, weird, over-dramatized quirk of evolution
00:14:06.480 | that's not actually special at all, that we just make a big deal of,
00:14:11.880 | some useful feature of our brain for thinking of ourselves as individuals
00:14:17.840 | that's ultimately completely silly?
00:14:19.840 | It'd be interesting to try to tease apart whether they have consciousness and what form
00:14:25.720 | their intelligence takes that is distinct from consciousness in the way that we think
00:14:32.080 | of humans as being conscious entities that are also able to do intelligent things.
00:14:37.200 | Are those intricately connected, are those separate?
00:14:40.600 | It'd be interesting to sort of tease that apart of how their alien minds work.
00:14:46.380 | So that includes intelligence, consciousness, love, and death, all the greatest hits.
00:14:52.160 | Okay, then I would probably go to physics.
00:14:54.680 | Of course, you gotta ask about physics.
00:14:57.240 | I would look into the alien's eyes, if they have eyes, and try to determine if we can
00:15:05.040 | actually even find a common language of mathematics, of physics, of the sciences.
00:15:10.040 | In general, I would probably ask about the big mysteries of physics and science of what's
00:15:16.720 | outside our universe.
00:15:19.040 | Why is there something rather than nothing?
00:15:22.240 | Why is there stuff?
00:15:23.780 | And what's outside the stuff we think of as stuff?
00:15:26.920 | So like what's outside the universe?
00:15:29.440 | I'd be hesitant to ask the why questions, but I'll try a few out to see maybe there
00:15:34.200 | is a good answer to the why questions of like, why did it start?
00:15:39.560 | Like why is there something rather than nothing?
00:15:42.480 | Then I would probably ask slightly more detailed about what's the universe made of?
00:15:46.960 | What's up with this dark matter and dark energy stuff?
00:15:50.840 | What are the basic building blocks of reality?
00:15:54.160 | And what are the laws of physics that govern that reality?
00:15:58.560 | So I would of course ask, kind of sneak in there just like casually, can you maybe give
00:16:04.080 | a few hints of how to unify?
00:16:06.400 | First of all, are we on the right track in terms of quantum mechanics and general relativity?
00:16:12.920 | And then how do you unify all the laws of physics?
00:16:16.800 | Maybe sneak in there from a different angle, trying to ask about the singularity in a
00:16:22.720 | black hole, or maybe what happens at the very beginning of the big bang, like where those
00:16:28.160 | laws are all unified.
00:16:29.880 | Maybe try to get a sense of what are the kind of physics required to fully describe these
00:16:36.800 | events.
00:16:37.960 | I think the physics discussion would be a good time to ask, is there a God?
00:16:42.720 | Maybe not use the G word, but instead say, is there a kind of a centralized designer
00:16:48.840 | or team of designers that have like launched the universe and are actively managing the
00:16:55.880 | universe?
00:16:57.380 | And of course, as another version of asking that, I would probably talk about the simulation,
00:17:01.960 | about looking at the universe as we see it as a computation, as a computer that's doing
00:17:07.380 | information processing, and see if that rings a bell for the alien.
00:17:12.400 | If there's a connection to that, in general, I would ask about what kind of computers you
00:17:15.400 | have and also what kind of computer games; that'd be really useful.
00:17:18.760 | Like, what do you do for fun?
00:17:20.960 | "You come here often?", but those are the usual icebreakers.
00:17:23.920 | Of course, I'm not mentioning those.
00:17:25.320 | That's just like chatter at the bar.
00:17:27.400 | So I guess outside the big physics questions, I would ask the more engineering-centric questions.
00:17:32.640 | First, my interest in AI: about superintelligence, how do we build superintelligent systems,
00:17:39.520 | systems that are far more intelligent than humans?
00:17:43.440 | How do we travel close to the speed of light or faster than the speed of light?
00:17:49.920 | Like, how did the aliens get to where we are, such that we're meeting and talking?
00:17:54.640 | Related to that would be a question of energy.
00:17:56.480 | How do we harness the energy of a sun or multiple suns or all of the suns in our galaxy?
00:18:03.280 | And then also kind of an engineering question, can we travel through time?
00:18:08.240 | And if we can, how do we build a time traveling machine?
00:18:12.240 | And is it a good idea?
00:18:13.240 | I think a lot of these questions will be appended with a sort of caveat of like, if you know
00:18:19.200 | the answer to this question, will I be better off if you told me this answer?
00:18:25.320 | Sometimes knowledge is not power.
00:18:27.840 | Sometimes knowledge is a burden that leads to self destruction.
00:18:32.720 | So we want to be careful about that.
00:18:34.280 | Of course, as the alien gets tired of talking to me at this intergalactic bar, probably
00:18:40.520 | gets up sort of politely, starts walking away, I would definitely ask some questions from
00:18:47.600 | my own personal knowledge bank.
00:18:52.320 | Is P equal to NP?
00:18:53.800 | Good question.
00:18:55.600 | Theoretical computer science, one of the big open questions in all of mathematics.
00:18:59.880 | I just need to know the answer.
00:19:00.880 | Just give me the answer, I'll work from there.
00:19:03.240 | Okay, we'll figure out the rest, just the answer.
00:19:05.280 | So yes or no.
00:19:08.120 | I probably won't ask him for investment advice.
00:19:10.760 | He probably thinks that the whole concept of money is silly, but I might ask about Bitcoin.
00:19:17.760 | Good long term investment or bad?
00:19:19.400 | What do you think?
00:19:21.600 | The digital currency in general.
00:19:23.600 | And of course, I'll probably ask, is Elon Musk one of you guys or a different species?
00:19:29.200 | Do you know which galaxy, which group of planets he came from?
00:19:34.320 | It'd be nice to sort of localize things.
00:19:36.200 | Are there others like him that visit and build companies?
00:19:39.960 | Just get some of the details.
00:19:42.240 | This AMA has suddenly become ridiculous.
00:19:45.080 | But I think this is a really nice thought experiment.
00:19:48.320 | And I'll think about this a little bit more.
00:19:50.080 | I'm sure there is a list of really precise questions that could most efficiently unlock
00:19:58.040 | the mysteries before the human race that are both useful for our progress and useful for
00:20:03.640 | our survival.
00:20:05.920 | Question is, what advice would you give a 36-year-old at an intermediate life stage who wants
00:20:12.360 | to career pivot from medical technology and research to computer science?
00:20:17.960 | So first, by computer science, I think you mean the broad field that includes software
00:20:24.080 | engineering, machine learning, robotics, just computing in general, maybe with less emphasis
00:20:30.000 | on the mathematical side, like theoretical computer science.
00:20:33.280 | I think the best advice on this that I could give is find a simple project to get excited
00:20:40.280 | about and allow yourself to get really excited by it.
00:20:45.640 | Have fun, fall in love with it, be proud of the thing you create.
00:20:51.320 | And I should say there's a big emphasis on the simple.
00:20:54.200 | Don't go super ambitious.
00:20:56.240 | I believe that most people, if they allow themselves, can derive a huge amount of joy
00:21:01.720 | from creating some simple little things.
00:21:04.360 | Even if it's following a tutorial, if you just allow yourself to experience the joy
00:21:08.800 | of creation, it's there for you.
00:21:12.560 | That's one of the magical things about computer science, is it allows you to create things
00:21:18.600 | that are almost like entities on their own.
00:21:21.180 | That's what programs are.
00:21:23.200 | So I think a career in computer science starts first with allowing yourself to be passionate
00:21:30.280 | and getting that, stoking that flame and allowing it to build.
00:21:36.120 | So it's not about any of the practical, like which job do I get, what thing I work on,
00:21:40.160 | it's just really giving yourself over to the simple passion of creating stuff.
00:21:47.320 | There's a quick set of steps that I think I followed early on that I would
00:21:53.600 | also recommend you at least consider following.
00:21:56.960 | First is basic software engineering.
00:21:59.600 | So find maybe Python or JavaScript, a super popular, accessible programming language,
00:22:06.680 | and build just like a Hello World program or something just a little bit more complicated,
00:22:11.280 | but not much more.
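That first step really is tiny. A minimal Python version of it (assuming Python is the language you picked) is just:

```python
# The classic first program: proves your editor, interpreter,
# and terminal all work together end to end.
print("Hello, World!")

# "Just a little bit more complicated": greet someone by name.
name = "Lex"
print(f"Hello, {name}!")
```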
00:22:13.440 | Beyond that, using that newly acquired set of programming tools, build something that
00:22:22.080 | automates something you do on the computer.
00:22:25.440 | Maybe another way to phrase that is just like scripts that are helping you in your interaction
00:22:30.280 | with the computer.
00:22:31.280 | So maybe finding different files in your computer that you try to look for often, or reorganizing
00:22:38.440 | things in an automated way, like folder structures, or maybe renaming files.
00:22:44.080 | Like I have a script that finds all the files that have spaces in the file name and it renames
00:22:51.880 | them after confirmation to underscores, all those kinds of things.
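The actual space-to-underscore script isn't shown in the episode, but a minimal Python sketch of that kind of helper might look like this (the function name and exact behavior are my own illustration):

```python
import os

def rename_spaces_to_underscores(directory="."):
    """Find files whose names contain spaces and, after asking for
    confirmation, rename them with underscores instead."""
    for name in sorted(os.listdir(directory)):
        if " " not in name:
            continue  # nothing to fix for this file
        new_name = name.replace(" ", "_")
        old_path = os.path.join(directory, name)
        new_path = os.path.join(directory, new_name)
        if os.path.exists(new_path):
            print(f"Skipping {name!r}: {new_name!r} already exists")
            continue
        answer = input(f"Rename {name!r} -> {new_name!r}? [y/N] ")
        if answer.strip().lower() == "y":
            os.rename(old_path, new_path)

# Usage: rename_spaces_to_underscores("/path/to/some/folder")
```

Asking for confirmation before each rename, as described, keeps a one-off script like this from making surprising changes in bulk.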
00:22:57.560 | There's a bunch of little helpful scripts I have all over the place, and those are just
00:23:00.640 | really joyful because you get to use them every day and it's something that you've created
00:23:05.120 | that made your life a little bit easier.
00:23:08.040 | For me at least, that's a source of joy that helps feed that love of programming, of just
00:23:16.200 | being a part of the computing world, the computer science world.
00:23:19.760 | I've been doing that really my whole life.
00:23:22.040 | It started with C and C++, but now it's a lot of other languages, primarily Python and
00:23:28.920 | yes, JavaScript.
00:23:31.760 | Next is branching into two separate little worlds in computer science: algorithms and
00:23:37.800 | then data science.
00:23:40.760 | I think both are full of beautiful things to fall in love with.
00:23:45.160 | The thing you can really enjoy with algorithms is learning how to build more and more efficient
00:23:50.760 | algorithms.
00:23:51.920 | On the data side is learning how to process different data sets, how to clean them up,
00:23:59.600 | how to reorganize them and do different kind of statistics on them, processing on them.
00:24:05.880 | So we're not even talking about machine learning yet, it's just being able to visualize those
00:24:10.040 | data sets, all those kinds of stuff, just working with data.
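As a hypothetical first exercise on that data side (the data set and column names here are invented for illustration), a few lines of standard-library Python already cover loading a CSV, cleaning out bad rows, and computing simple statistics:

```python
import csv
import io
import statistics

def summarize_column(csv_text, column):
    """Parse CSV text, skip rows where the column is missing or
    non-numeric (a tiny bit of cleaning), and return basic stats."""
    values = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        cell = (row.get(column) or "").strip()
        try:
            values.append(float(cell))
        except ValueError:
            continue  # skip dirty rows instead of crashing
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
    }

# A messy little data set: one row has no score and gets cleaned out.
data = "name,score\nalice,90\nbob,\ncarol,70\n"
print(summarize_column(data, "score"))  # count 2, mean 80.0
```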
00:24:12.880 | And now we're starting to talk about a career because there's a lot of jobs that have to
00:24:18.280 | do with the use of computing techniques to process, visualize, interpret, aggregate,
00:24:27.120 | analyze data.
00:24:28.340 | So I guess you would call that field data science.
00:24:33.080 | So that's a really cool career trajectory and there's so many cool things to get into
00:24:38.600 | with, I think, a very reasonable, small learning curve, so that you can really, if you push yourself,
00:24:44.440 | do within weeks, maybe months, not years.
00:24:48.280 | And once you become comfortable with the data science world, you can start building on top
00:24:52.400 | of that quite naturally, doing some boilerplate machine learning, supervised learning projects
00:24:57.680 | and then building out into more specific, more useful, more novel, cutting edge applications
00:25:06.000 | of machine learning, reinforcement learning, that whole world.
00:25:10.860 | Maybe even taking that into physical systems of actually building robots.
00:25:14.320 | And I should backtrack, it sounds like I'm building towards something super complicated,
00:25:18.800 | but it's not.
00:25:19.800 | All of these can be really small projects.
00:25:22.120 | Even robotics projects, you can build a little robot that does some basic tasks, maybe does
00:25:28.560 | some basic computer vision.
00:25:30.520 | And it's a nice way to learn, on the robotics side, embedded systems programming.
00:25:33.880 | So it's just getting more comfortable with hardware and seeing if that's something you're
00:25:38.240 | interested in.
00:25:39.240 | Or on the data science side, where you're sticking much more to the software.
00:25:49.320 | With both of those, you can now start to figure out what the exciting career possibilities are.
00:25:49.320 | I think two things, I would even see them as skills that are important here, passion
00:25:54.840 | and Google.
00:25:56.760 | I see passion as a skill because it's about allowing yourself to be excited.
00:26:01.360 | It's finding things you could be excited about, allowing yourself to be excited,
00:26:05.440 | and seeing that excitement as an actually essential part of progress.
00:26:11.340 | And the reason I mentioned Google is because I find that in a lot of fields, but especially
00:26:16.120 | in computer science, with software engineering, with machine learning, there's so many amazing
00:26:20.840 | resources out there that the key skill actually ends up being is how good are you at discovering
00:26:28.200 | the exact page and resources that is allowing you to take the next step in your journey
00:26:35.420 | of exploration of learning.
00:26:37.480 | And that's fundamentally a skill of how do I Google the right thing?
00:26:42.660 | What pages do I click on?
00:26:44.420 | And all those kinds of things.
00:26:45.520 | I think it sounds almost kind of ridiculous to say that that's a skill, but that is one
00:26:50.480 | of the most essential skills of the modern-day student, the lifelong student: how to Google.
00:26:59.480 | So yeah, passion and Google.
00:27:02.840 | Allow yourself to fall in love with the project and keep taking the next step, the next step,
00:27:09.320 | the next step with the help of a good search engine and a bit of curiosity.
00:27:14.920 | Question is, what form factor of robots are you most excited about for the future?
00:27:19.520 | Bipeds, quads, arms, humanoids, maybe something else more obscure.
00:27:26.720 | This is a really tough question because I really like robots.
00:27:30.700 | I think that love is born in software and the hardware stuff just makes it a little
00:27:38.720 | more fun.
00:27:39.720 | So I think the things I'm really excited about, even in terms of form factors, is in the software.
00:27:44.880 | I think much of the exciting developments in robotics is actually in simulated worlds
00:27:50.080 | currently, and I think that will be true for quite a while to come.
00:27:55.400 | So I think in terms of human-robot interaction, the robots that will be really exciting are
00:28:01.720 | the ones that live in virtual worlds, like in virtual reality or even just on a screen.
00:28:07.480 | But I think what we would see more and more is entities, human-like entities, or entities
00:28:15.480 | that allow us to anthropomorphize a consciousness, a spirit onto them, living in the digital
00:28:24.280 | world.
00:28:26.040 | I think that's what I'm really excited about.
00:28:29.400 | And of course, slowly those entities taking a form in the physical space in terms of,
00:28:34.360 | I think probably the humanoid form.
00:28:37.680 | Unfortunately, though it's very difficult to engineer and to create a realistic, natural, fulfilling
00:28:46.600 | experience with, I think it's still probably the most exciting form to me.
00:28:51.720 | Although I do really like Boston Dynamics' Spot, the robot dog, which from a kind of having-a-pet
00:29:01.160 | perspective is a really exciting form.
00:29:03.640 | Again, very difficult to do stuff in the physical space.
00:29:07.320 | It's a huge engineering challenge that, as far as I can tell, is several orders of magnitude
00:29:14.440 | more difficult than the same challenge in the digital space.
00:29:18.200 | So I just see the digital simulated robotics advancing much quicker and having a much larger
00:29:25.680 | scale impact on the world, especially if we start seeing more and more virtual worlds
00:29:32.560 | being created.
00:29:34.440 | And that doesn't necessarily mean virtual reality or like augmented reality.
00:29:38.560 | It just means any medium within which you can interact with artificial intelligence
00:29:43.840 | systems in the digital space.
00:29:46.320 | And I do see that as a form factor, which is entities in digital space having a humanoid
00:29:54.760 | or a semi-humanoid form, something that we can anthropomorphize, something we can connect
00:29:59.720 | with on a human level.
00:30:01.600 | The question is, on the topic of suffering and growth, is happiness a healthy pursuit?
00:30:09.240 | Or do you agree with Einstein's view on happiness as the aspiration of a pig?
00:30:15.320 | Okay, let me quickly look up the Einstein quote here that you referenced about a pig
00:30:21.560 | and happiness.
00:30:22.560 | Okay, Einstein writes, "I have never looked upon ease and happiness as ends in themselves.
00:30:29.720 | This ethical basis I call the ideal of a pigsty.
00:30:34.120 | The ideals that have lighted my way and time after time have given me new courage to face
00:30:39.120 | life cheerfully have been kindness, beauty, and truth.
00:30:44.460 | Without the sense of kinship with men of like mind, without the occupation with the objective
00:30:49.200 | world, the eternally unattainable in the field of art and scientific endeavors, life would
00:30:54.840 | have seemed empty to me.
00:30:57.220 | The trite objects of human efforts, possessions, outward success, luxury, have always seemed
00:31:04.520 | to me contemptible."
00:31:06.080 | Okay, where do I start with this?
00:31:11.600 | I think I usually agree with Einstein, especially when he talks philosophy on most things, and
00:31:17.240 | I do here as well in terms of material possessions and all those kinds of things.
00:31:21.680 | But I think he unfairly attacks the word happiness and also pigs.
00:31:27.500 | So let me disagree with Einstein and try to defend the word happiness and also maybe defend
00:31:32.860 | pigs if I can somehow figure that out.
00:31:36.800 | The word happiness, I think, is one of those words that could mean a lot of things to a
00:31:42.940 | lot of people.
00:31:43.940 | And I think in this case, Einstein is using it as almost the pursuit of happiness as a
00:31:50.620 | kind of synonym for hedonism, so a kind of very narrow definition of what happiness is.
00:31:57.860 | I think I see happiness as an indicator that it's much bigger than direct pleasures, but
00:32:08.660 | as a word that includes those pleasures but also includes more meaningful, deep fulfillment
00:32:14.060 | in life.
00:32:15.060 | And so I'd like to reclaim the word happiness as a good thing, which is slightly implied
00:32:19.780 | in this discussion that happiness is a kind of distraction that shouldn't be thought about.
00:32:27.580 | I do think that happiness is a side effect of a life well lived, not a goal.
00:32:36.820 | I think the moment it becomes a goal in itself, I think it's easy to lose your way.
00:32:41.960 | And perhaps that's what in part Einstein means.
00:32:45.860 | I do think it's a really good signal of progress, happiness.
00:32:51.700 | So in losing yourself in the focus of battle, of just focusing on excellence and progress
00:33:01.500 | and improving and challenging yourself and growing all the time, I think a kind of running
00:33:10.500 | average measure of your happiness, day-to-day happiness, so you like average that over a
00:33:17.980 | period of weeks and months, is a good measure of how you're doing.
00:33:24.300 | And I think a more actionable process of collecting that signal is a process of just gratitude,
00:33:30.660 | of sitting back and thinking how grateful I am, how grateful you are for how it started
00:33:38.940 | and how it's going, for the progress that you've made.
00:33:44.260 | So I do think it's a good signal, not momentary happiness, but over a period of time, several
00:33:49.460 | weeks, several months, if there's not happiness, that you've probably lost your way as well.
00:33:55.980 | So it's a useful signal, not a goal in itself, but a useful signal.
00:34:00.300 | And kindness, beauty, and truth, as Einstein puts it, are good ideals, but they're a bit
00:34:08.920 | ambiguous, in a practical day-to-day sense.
00:34:13.940 | I share them, of course, but I think practically, if I were to put it into words, at least for
00:34:21.140 | myself, struggle is the process, and happiness is the measure.
00:34:32.340 | So day-to-day life actually looks like a constant struggle to improve yourself.
00:34:38.880 | And then the flip side of that is the gratitude of how amazing life is, the progress you've
00:34:47.720 | made, but also just the opportunity to struggle.
00:34:53.240 | You have to imagine Sisyphus happy, and ultimately, when I look back at my life, most
00:34:59.760 | days are spent truly happy to be alive.
00:35:05.400 | So in that sense, the pursuit of happiness is a good one.
00:35:10.200 | Not hedonistic, in the moment, local optima of pleasure, but more like stepping back,
00:35:19.440 | looking at the running average over the past few weeks and months, and making sure you're
00:35:24.160 | at a good level.
00:35:26.840 | So that's a bit of a disagreement with Einstein.
00:35:30.000 | And I also have to say that I think pigs are one of the most intelligent animals.
00:35:36.680 | So I'm still holding out for the possibility that pigs, or maybe dolphins, have life figured
00:35:43.440 | out quite a bit better than us humans.
00:35:47.460 | So on those two things, the pursuit of happiness, and on the brilliance of pigs, me and Einstein
00:35:53.840 | part ways for a brief moment.
00:35:59.280 | Question is, "Hey Lex, I was curious how you pick people to come on to the podcast."
00:36:06.360 | I think this process is actually quite difficult, and it evolved over time.
00:36:11.760 | So let me mention a few factors.
00:36:13.640 | I think first and foremost, it's important that a person is really passionate about what
00:36:21.600 | they do, and that passion can take all kinds of different forms.
00:36:24.840 | I know I sometimes, or all the time, completely lack emotion in my face, but I truly am passionate
00:36:33.760 | about the things I do, and so that passion can express itself in different ways.
00:36:38.880 | And so coupled with that passion, I look for people who are sort of not only passionate,
00:36:44.200 | but they appreciate, enjoy, are drawn to the long-form conversation format as a way to
00:36:52.560 | express that passion, which is not everybody.
00:36:55.480 | Some people love to express their passion, their interest, their expertise, their ideas
00:37:01.600 | in written form.
00:37:03.700 | Maybe that's more kind of edited over several passes of editing, versus a conversation format,
00:37:09.640 | especially long-form conversation where there's very little editing.
00:37:13.380 | In addition to that, I also try to make sure the person actually wants to come onto this
00:37:19.640 | particular podcast.
00:37:21.720 | There's so many amazing podcasts out there, and it's also just surprising to see how much
00:37:27.760 | better they are than me at talking and conversations, explaining stuff.
00:37:32.680 | It's humbling.
00:37:33.680 | It's also inspiring, because it pushes me to kind of improve, seeing what's possible.
00:37:38.240 | So I don't know, if people don't actually listen to this particular podcast, or at least
00:37:46.440 | have listened a little bit, and are not drawn to the particular flavor of weirdness that
00:37:52.400 | is me, like some kid who wears a suit all the time and mumbles, speaks slowly, asks
00:37:59.080 | these weird questions.
00:38:00.080 | I mean, if they're not drawn to whatever the hell that weird mystery is of this particular
00:38:05.040 | human, then there's no reason to talk.
00:38:08.960 | If they're drawn, I think there's a possibility of something magical happening.
00:38:13.200 | Me with my weirdness, and them with their weirdness kind of colliding in interesting
00:38:16.840 | ways that creates something new that both of us are surprised by.
00:38:22.680 | And on that topic, more and more I'm looking for people that are different than me.
00:38:28.480 | And that means the full spectrum of diversity.
00:38:31.680 | So it could be different backgrounds, different world views, different personalities.
00:38:37.880 | Like you can tell there'll be a clash of flavors.
00:38:41.920 | It's like chocolate and salt.
00:38:45.000 | But it can also turn out to be like a pineapple pizza that actually some people love, but
00:38:50.960 | I don't understand.
00:38:52.400 | It doesn't make any sense.
00:38:55.120 | Why? It doesn't make any sense.
00:38:57.120 | So it could be taking that risk of embracing that clash, and the chemistry can sometimes
00:39:03.840 | result in a pineapple pizza.
00:39:05.960 | So there's a cost to that risk.
00:39:08.440 | But I seek it out more because I think that's the possibility of some magical experience
00:39:13.520 | of a magical conversation.
00:39:16.240 | And on that topic, I should mention there's this kind of idea of platforming, which is
00:39:19.960 | I've been fortunate enough to have sort of enough listeners and viewers that the question
00:39:28.040 | of platforming even comes up.
00:39:30.320 | Meaning if you have this kind of guest with these kind of controversial viewpoints, why
00:39:37.680 | give them a platform that further spreads their viewpoints?
00:39:44.440 | And I understand, I empathize with this kind of view, but I don't like it.
00:39:52.800 | Because to me, if I'm successful, now that's the problem.
00:39:57.400 | I'm not very good at this thing, especially in challenging conversations.
00:40:03.120 | But if I'm successful, the tension in worldviews, the tension in personalities, the clash will
00:40:11.920 | create wisdom.
00:40:14.620 | So I really want to talk to very challenging people.
00:40:19.720 | I want to have really difficult conversations.
00:40:22.960 | And that means talking to people that are at the outskirts of society.
00:40:29.160 | I think it's something that I'm thinking about a lot.
00:40:34.320 | It's important to say that I'm not afraid of being canceled.
00:40:39.000 | I do think I'm afraid, or perhaps the better word is concerned, about doing a terrible
00:40:46.680 | job on an important, difficult conversation.
00:40:53.480 | Where as a result of me doing a terrible job, I don't add love or knowledge or inspiration
00:41:00.800 | to the world, but fuel further division.
00:41:05.300 | Not because of the guests I have on, but because of my failure to catalyze and steer an inspiring
00:41:16.600 | conversation.
00:41:17.600 | I see my skill in conversation as not, I mean, I don't know how to put it nicely, but not
00:41:23.280 | very good.
00:41:24.480 | I'm striving to improve constantly.
00:41:27.300 | So some of the guest selection has to do with the difficulty of the conversation and how
00:41:35.240 | prepared I am for that level of difficulty.
00:41:37.640 | I think, the way I think about difficult conversations is some of them might take years to prepare
00:41:46.240 | for, just intellectually.
00:41:48.680 | There's certain people and certain spaces of ideas that take a lot of time.
00:41:53.480 | You have to remember that I'm just an engineer.
00:41:58.720 | I have a set of things that preoccupied my mind for years, and there's a lot of difficult
00:42:04.840 | topics that I just won't do a good job of.
00:42:09.360 | So part of it is I have to work hard to learn more, to kind of constantly look outside the
00:42:15.240 | Overton window to try to explore difficult ideas.
00:42:19.720 | At the same time, build enough sort of reputation driven freedom to take risks and make mistakes,
00:42:29.480 | or try to inspire people in the community to allow me, to allow each other, all of us
00:42:34.360 | to make mistakes in conversation.
00:42:36.880 | So it's the coupling of extreme thorough preparation and allowing yourself to make mistakes.
00:42:44.400 | It's like excellence and not giving a damn combined.
00:42:50.200 | But overall, the thing I'm concerned about, and I take back the fear, I'm not afraid,
00:42:58.200 | I'm just concerned about doing a bad job of the conversation.
00:43:01.800 | I'm not concerned of being canceled or derided or criticized after having done a reasonably
00:43:09.480 | good job.
00:43:11.240 | I'm concerned of myself, it doesn't matter if I'm canceled or not.
00:43:15.960 | Just when I look in the mirror, when I look at the results of the conversation being a
00:43:21.160 | failure, something that doesn't add love to the world, but something that adds derision.
00:43:28.040 | And also this is the problem with words.
00:43:29.520 | I don't even like how I'm expressing myself currently.
00:43:33.360 | I really try not to have some kind of agenda or strategy going into a conversation.
00:43:39.160 | I really want to be fragile, open-minded, almost boring and naive, and just giving my
00:43:47.240 | trust to a person.
00:43:49.280 | Even when I challenge or play devil's advocate, all those kinds of things, I really want to
00:43:54.960 | place trust in the mutual respect and the love that the other person gives.
00:44:03.960 | And I trust that they won't take advantage of that.
00:44:07.880 | And so some of the guest selection has to do with, do I have enough trust yet that this
00:44:14.160 | person won't take advantage of my open-mindedness, of my childlike curiosity, all those kinds
00:44:21.680 | of things.
00:44:23.680 | But all of this is just a giant learning experience.
00:44:26.400 | I do want to be careful not to let my curiosity run, what should I say, too far ahead of me,
00:44:32.560 | to where my preparation doesn't meet the level of curiosity I exhibit.
00:44:38.800 | So again, like I said, I'm willing and I'm trying to be more and more willing to take
00:44:45.000 | risks and make mistakes in conversations, but I'm also not letting myself off the hook
00:44:51.080 | in terms of the level of preparation I put.
00:44:55.160 | And I really hope that we give each other the freedom and are patient with each other
00:45:02.500 | in nuanced conversation.
00:45:04.160 | That's what seems to be really missing in public discourse, is this kind of patience
00:45:11.200 | and allowing each other to make statements that we later change our mind on, and not
00:45:18.240 | putting that statement on us as this kind of scarlet letter that forever puts us in
00:45:24.480 | a bin of red or blue or some other bin.
00:45:28.800 | So I'm trying to navigate all of this while still being naive and open-minded as best
00:45:34.160 | I can.
00:45:35.160 | The question is, "Hey Lex, I was wondering how you manage to remain optimistic in the
00:45:40.040 | face of adversity when you encounter hostile people that don't want to even consider offering
00:45:45.720 | constructive criticism and would rather try to tear you down and force their ideology.
00:45:51.120 | I find pieces of hope for short periods of time and then they fade after I see the arguments
00:45:56.800 | surrounding whatever brought about hope to begin with.
00:45:59.960 | I guess to put it simply, how do you hold on to hope and optimism?"
00:46:05.160 | Thank you for the question.
00:46:06.160 | There's probably a lot to be said about this, but I'll try to keep it brief and simple.
00:46:14.000 | I try to ignore the noise of the world, the bickering of the moment.
00:46:21.720 | I find that if you give yourself a chance to see how amazing people are, that those
00:46:27.880 | people will reveal themselves to be amazing.
00:46:31.800 | That you will see it.
00:46:36.640 | That if you give yourself a chance to see it, you will see it.
00:46:41.120 | I see it.
00:46:43.080 | And I see gratitude for how amazing things are and optimism for how much even better
00:46:52.680 | things could be as a kind of superpower.
00:46:57.160 | It makes life exciting in a way that first, is just fun to live.
00:47:04.640 | And two, from just a productivity perspective, as an engineer or anybody who creates anything,
00:47:11.200 | it's fuel to create.
00:47:15.280 | I believe that to create new things, and especially for things that others will say is not possible
00:47:22.720 | to create, I find that optimism is a necessary precondition to give you the energy, the fuel,
00:47:30.000 | the drive, the inspiration to go for months, for years, to carry the fire of belief.
00:47:39.960 | That's where that optimism truly is, a superpower that enables that kind of perseverance.
00:47:46.440 | So I think the most important thing is it makes life more exciting and fun.
00:47:52.640 | And it's a good productivity hack, is the second thing.
00:47:58.440 | You also asked how, so I try to, my personal life and the influences I take in, the books
00:48:05.440 | I read and the people I talk to, I try to surround myself with people that are also
00:48:10.800 | full of optimism.
00:48:11.800 | And in general, I'm unapologetically a fan of a lot of people, especially sort of big
00:48:20.080 | thinkers, wild engineers and scientists and creators of all walks of life.
00:48:25.160 | People that shine in ways that surprise me or excite me.
00:48:32.080 | There's really thousands, to be honest, just off the top of my head, even people I talked
00:48:36.160 | to on this podcast.
00:48:38.160 | Chris Lattner always brings a smile to my face, one of the greatest engineers of the
00:48:42.680 | world.
00:48:44.080 | Jim Keller's from that ilk as well, though slightly different personalities, but also
00:48:50.400 | inspires me, makes me smile, such a deep and kind and brilliant human being.
00:48:57.160 | Along that line of engineers, Elon Musk, of course, also the embodiment of optimism about
00:49:04.560 | this world is an inspiration.
00:49:08.600 | And then maybe down the dimension of more wild, even George Hotz, with a chaotic style
00:49:17.080 | of thinking that's very different than my own, but one that I find just inspiring.
00:49:22.600 | Of course, Joe Rogan, for me, has been for many years a kind of example of somebody who
00:49:30.640 | doesn't take themselves too seriously.
00:49:33.360 | He's been for a lot of people.
00:49:35.080 | He has been for me a role model for a successful life that's not full of jealousy and kind
00:49:43.200 | of derision, but it's more being supportive of others, being a fan of others, all those
00:49:48.080 | kinds of things.
00:49:51.080 | On the darker side, Dan Carlin, of course, you don't often think of him as optimistic,
00:49:56.040 | but I truly think he's optimistic.
00:49:58.840 | He's just been so deeply soaking in the muck, the darkness of human history, that I think
00:50:08.320 | sometimes the things he talks about come off as deeply cynical about the future of human
00:50:16.240 | civilization, but they're not.
00:50:18.640 | There's a shining optimism to him.
00:50:21.840 | Even in my conversation with him, even though his words were saying that he's not always
00:50:26.080 | optimistic, I think his heart, his spirit was clearly optimistic.
00:50:30.640 | There's a hope for us in him, at least to me.
00:50:36.080 | That's what I see, and to me that hope glows pretty bright in the stuff that he creates
00:50:41.600 | and the passion that he has for human history.
00:50:44.360 | Of course, the scientist, Stephen Wolfram, on the computer science side, I can't tell
00:50:51.760 | you how much I love cellular automata.
00:50:53.760 | Sean Carroll, the way he loves everything about physics, this incredible communicator.
00:51:00.760 | Eric Weinstein, the way he loves everything geometrical, shapes of all things, whether
00:51:06.640 | they're mathematical or whether they're connected to physics, just his loves for symmetry, asymmetry.
00:51:14.240 | For topology, for the weird curvature of things in the visible dimensions of space-time or
00:51:21.200 | the invisible ones.
00:51:23.520 | That's just sticking to people I've talked to on this podcast.
00:51:26.560 | Of course, Joscha Bach, whose flow of consciousness is full of so much brilliance, it breaks my
00:51:32.080 | brain any time I try to process it.
00:51:34.880 | My Commodore 64 brain takes in his Pentium.
00:51:38.720 | I don't know what the analogy is, but it always breaks my brain.
00:51:43.760 | I'm especially inspired by the creations of software engineers, for example, because there's
00:51:48.280 | an inherent optimism to the creative process.
00:51:51.680 | A lot of people in the cryptocurrency space, like Vitalik Buterin, is a constant inspiration.
00:51:56.800 | It just goes on and on.
00:51:58.480 | Of course, the hundreds, probably thousands of dead folks, from Nietzsche to Dostoevsky,
00:52:05.360 | Freud, Jung, Camus, Hesse, Kerouac, everybody.
00:52:13.240 | I just feel like I exist in this world of people that are excited about the future.
00:52:23.040 | Then of course, the noise of the world that is lost in the bickering of the moment can
00:52:29.920 | seep in.
00:52:30.920 | That's where meditation comes in.
00:52:33.880 | I don't fully ignore it.
00:52:35.760 | I think that's running away from the world in a way that I don't find constructive.
00:52:43.200 | At least at this time in my life, I just take it in, but I don't let it linger.
00:52:49.320 | If there's any kind of harshness or trolling or just maybe destructive criticism, I try
00:52:58.240 | to pick from it pieces that I can use to grow to inspire me and let the rest go.
00:53:04.840 | That's the kind of muscle you have to build.
00:53:07.960 | Every once in a while, just disconnect from it all and recharge the mind in a way from
00:53:14.640 | just simple silence of nature.
00:53:19.680 | Question is, "What is something you changed your opinion about in the past few years?
00:53:24.960 | Thank you for everything you're doing.
00:53:26.320 | Love from Brussels."
00:53:27.320 | I love Belgium.
00:53:29.160 | Thank you for that question and the kind words.
00:53:31.760 | I changed my mind on a lot of things and I change my mind all the time.
00:53:35.520 | I'm in a constant flux.
00:53:37.920 | I'm constantly learning.
00:53:39.280 | I guess my mind is a quantum mechanical system.
00:53:42.000 | I can mention a few things that have been stable big shifts in my thinking at least
00:53:49.680 | over the past year or two, especially related to the podcast.
00:53:53.760 | On the topic of psychedelics, I've always found those fascinating.
00:53:59.040 | What I've changed my mind over the past couple of years is a hopeful message.
00:54:03.360 | I think that psychedelics can actually enter the realm of science and that there's a bunch
00:54:09.720 | of places that are starting to conduct large scale research studies on psychedelics.
00:54:15.320 | That's really exciting to me because I have a sense that that's just another perspective
00:54:20.520 | into the world of neuroscience that will help us understand the way the mind works and potentially
00:54:25.400 | how to engineer different aspects of what makes the human mind so special in our artificial
00:54:31.600 | intelligence systems.
00:54:33.840 | On the topic of social media, I've changed my mind over the past two years.
00:54:37.520 | I always felt that it had a bunch of complicated bad influences on society, but they were balanced
00:54:45.760 | with a lot of positive effects that build community, that give people a voice, all those
00:54:51.480 | kinds of things.
00:54:53.080 | More and more, I'm starting to think that the possible set of destructive trajectories
00:54:59.240 | that social media can take human civilization is much wider, much more destructive than
00:55:07.720 | I accounted for.
00:55:09.680 | It's something that I worry about.
00:55:11.240 | In the space of existential risk of artificial intelligence that people talk about, I think
00:55:16.600 | my mind more and more over the past two years has been focused on social media as the greatest
00:55:21.040 | threat of artificial intelligence.
00:55:23.360 | I also think it's the greatest set of possibilities.
00:55:27.240 | What I want to say is that the set of trajectories is wider than I expected.
00:55:31.720 | The set of possible trajectories that society might go down, as driven by, managed by, directed
00:55:38.620 | by our platforms.
00:55:41.920 | Hence it's been something that I've been working on to see if I can help.
00:55:48.060 | The biggest thing I probably changed my mind on is that extraterrestrial life, intelligence,
00:55:55.520 | consciousness is worthy of serious scientific investigation.
00:56:00.480 | It's similar how I felt before about consciousness, human consciousness, is that we lack the tools
00:56:07.520 | and we're very early in our ability to explore, to understand, to engineer consciousness.
00:56:16.160 | The same with extraterrestrial life.
00:56:19.140 | The tools are very crude in terms of the SETI efforts of trying to communicate with far
00:56:23.600 | away civilizations, also the listening, then there's the detection of far away exoplanets
00:56:31.440 | and whether they're habitable, and life forms on those planets.
00:56:35.920 | Also the hundreds of thousands of reports of UFO sightings, actually getting some high
00:56:40.400 | resolution sensory data around that.
00:56:43.040 | We're in the very early days of any of that kind of understanding, but what I've changed
00:56:48.200 | my mind on, or rather what I've come to understand, is closing my mind, closing the mind of other
00:56:57.320 | scientists to these fields of consciousness and extraterrestrial life prevents us from
00:57:04.720 | actually discovering new things.
00:57:07.920 | Basically what happens when you close your mind to these fascinating, inspiring, mysterious
00:57:15.160 | spaces of exploration, you leave the exploration of these topics to people that are not well
00:57:21.960 | equipped to explore them.
00:57:23.360 | They're just curious minds.
00:57:25.380 | By the way, those curious minds are magical and they're inspiring and I'm one such curious
00:57:30.400 | mind.
00:57:32.040 | But the rigors of science, the tools of science, the funding of science can crack these wide
00:57:38.560 | open and give us better data, better understanding.
00:57:42.680 | There could be totally new ways of thinking about consciousness, about extraterrestrial life,
00:57:50.680 | entire paradigm shifts in the way we approach our understanding of intelligence,
00:57:57.120 | of life forms in general.
00:57:59.320 | There's a lot of things that kind of opened my eyes to this fascinating world.
00:58:04.720 | The David Fravor conversation of the pilot that saw the Tic Tac UFO, the, it was just
00:58:13.280 | recent, 'Oumuamua conversation, but that was in 2017.
00:58:18.000 | I remember seeing Avi Loeb's thoughts about 'Oumuamua when it first came out.
00:58:22.760 | And even just thinking about the Drake equation more seriously and thinking about the different
00:58:27.320 | possibilities built into the uncertainty of the parameters just opened my eyes to the
00:58:34.040 | mystery and the wonder of the amazing universe we're in and how little we know about it.
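The point about "the different possibilities built into the uncertainty of the parameters" of the Drake equation can be made concrete with a tiny Monte Carlo sketch. The parameter ranges below are hypothetical, chosen only to illustrate how a few uncertain factors multiply into estimates spanning many orders of magnitude; none of these numbers come from the episode:

```python
import random

def drake_sample():
    """One random draw of N = R* · fp · ne · fl · fi · fc · L."""
    R_star = random.uniform(1, 3)         # new stars per year in the galaxy
    f_p    = random.uniform(0.2, 1.0)     # fraction of stars with planets
    n_e    = random.uniform(0.1, 5.0)     # potentially habitable planets per star
    f_l    = 10 ** random.uniform(-3, 0)  # fraction where life arises (log-uniform)
    f_i    = 10 ** random.uniform(-3, 0)  # fraction developing intelligence
    f_c    = 10 ** random.uniform(-2, 0)  # fraction that become detectable
    L      = 10 ** random.uniform(2, 8)   # years a civilization stays detectable
    return R_star * f_p * n_e * f_l * f_i * f_c * L

estimates = sorted(drake_sample() for _ in range(10_000))
print(f"5th pct: {estimates[500]:.2g}  median: {estimates[5000]:.2g}  "
      f"95th pct: {estimates[9500]:.2g}")
```

Running this, the spread between the low and high percentiles covers many orders of magnitude, which is exactly the sense in which the equation's uncertainty leaves room for both a lonely and a crowded galaxy.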
00:58:41.440 | And so I've definitely kind of become much more intellectually open to the exploration
00:58:49.640 | of what extraterrestrial life might look like, what are the ways we might be able to communicate
00:58:55.960 | with it, how we might be able to understand it, what does it teach us about ourselves.
00:59:00.680 | And also importantly, this very fascinating psychological effect of being open to these
00:59:08.840 | mysteries that we know very little about, what does that do to the actual productivity,
00:59:17.120 | the creative output of an engineering mind, that opening your mind in this way to think
00:59:24.680 | outside of the little box of things we understand well.
00:59:29.720 | What does that do in terms of the things you might be able to build, the ideas that might
00:59:37.280 | visit you and result in you being able to build something totally new.
00:59:42.760 | I think all of that changed my mind about aliens.
00:59:45.360 | That's why I've been having conversations about extraterrestrial life.
00:59:48.920 | I'm of course very careful walking down this line because I am first and foremost a scientist
00:59:55.100 | and engineer and I want to stay in that world, but I really do want to cultivate an open
01:00:00.540 | mind and a childlike curiosity.
01:00:03.860 | And I generally hope to see that in other scientists as well.
01:00:07.420 | That's what science is all about.
01:00:08.940 | I think incremental progress is essential for science, but it has to be coupled to that
01:00:15.940 | childlike wonder about the world and an open-minded, out-of-the-box thinking that results in major
01:00:23.320 | paradigm shifts that throw all those silly citations out the window and build totally
01:00:28.820 | new sciences, totally new approaches that make everything we did in the decades past
01:00:36.120 | meaningless or actually counterproductive.
01:00:39.160 | So they have to be coupled together.
01:00:41.560 | Incremental progress and first principles, deep thinking that results in paradigm shifts.
01:00:49.840 | Question is, what was your decision behind going on the keto diet, mainly meat-based, and how
01:00:55.820 | has it helped you?
01:00:57.780 | So the decision, or rather process of discovering the diets that work for me has to do with
01:01:02.840 | the fact that I wrestled, did combat sports my whole life, that has weight classes, so
01:01:08.060 | you're constantly figuring out how to perform optimally physically and mentally while going
01:01:15.260 | to school and so on, while also cutting weight.
01:01:18.960 | So grounded in that, I've developed a fascination with different diets.
01:01:23.420 | I've never thought about diet as a prescriptive thing for others.
01:01:28.060 | I've always thought of myself as a kind of a nutritional scientist running a study of
01:01:35.180 | N of one.
01:01:36.180 | So just studying myself and not trying to extrapolate to others, just understanding
01:01:41.220 | what makes me happy, what makes me perform the best, and that's where that journey took me.
01:01:44.940 | I've tried everything.
01:01:46.240 | I think about 15 or more years ago, I discovered the power of intermittent fasting, or fasting
01:01:51.480 | in general, and I can talk about that forever.
01:01:54.660 | I used to do a lot of weight lifting, sort of power lifting, all that kind of stuff.
01:01:58.920 | In the world of men's health, or rather men's muscle and fitness, kind of, where you eat
01:02:06.320 | six, seven times a day, small meals, chicken and broccoli, all that kind of stuff, in that
01:02:10.680 | kind of world, to realize that you can eat once a day and still train two, three times
01:02:17.260 | that day and actually have more energy, more focus, and perform better than you ever have,
01:02:25.520 | was mind-blowing.
01:02:27.660 | So I think fasting was the biggest paradigm shift for me, because it made me realize that
01:02:36.000 | I really need to study myself better, try new things all the time, to allow myself the
01:02:41.500 | opportunity to discover something that's totally transformative on my life, makes my life easier,
01:02:47.420 | makes my body, my mind work better, all that kind of stuff.
01:02:50.480 | I discovered intermittent fasting and fasting in general from the ultra-endurance athlete's
01:02:56.200 | world, and that's where also I came across the idea as a fat-adapted athlete, which is
01:03:02.920 | this kind of idea that you can use fat as an energy source, and then quickly discover
01:03:07.580 | that there is diets similar to a keto diet that are extremely low-carb that could allow
01:03:13.520 | you to perform well physically and mentally, all those kinds of things.
01:03:18.080 | I think it all sounded a little bit crazy to me.
01:03:20.760 | I grew up thinking low-fat is good, high-fat is bad, so it was always weird to eat something
01:03:27.280 | with fat in it, and for it not to be a cheat meal or something, but to be something that's
01:03:33.620 | part of the diet.
01:03:35.260 | So it was strange, but once I gave it a chance and did it properly with all the electrolytes
01:03:39.940 | and water and all those kinds of things, you can look it up, when you do it properly, it
01:03:44.300 | just felt great, and there was just a huge number of benefits I felt immediately, and
01:03:49.180 | I've been doing it ever since.
01:03:51.280 | So let me maybe quickly comment on some pros and cons of the keto diet.
01:03:56.700 | And again, this is all personal experience.
01:03:59.060 | I don't want to extrapolate this to others, but I do encourage people to try to explore,
01:04:04.380 | to be their own scientists of their own body.
01:04:07.420 | So for me, pros is the physical energy.
01:04:11.460 | First of all, the energy levels are more stable, but also I just feel more energized for exercise.
01:04:17.020 | This is both for explosive movements, heavy lifts, or jiu-jitsu, grappling, judo, wrestling,
01:04:25.740 | all those kinds of things, and also for prolonged endurance exercise.
01:04:30.340 | I find both are a real benefit for me.
01:04:32.940 | I think for explosive exercise, the biggest benefit for me is the mental focus, at least
01:04:37.780 | the way I approach the grappling sports, but even lifting.
01:04:42.020 | It's certainly very important how my body feels, but it's also important that the mind
01:04:47.320 | is really focused on the technique.
01:04:49.860 | And I find that the biggest benefit of keto combined with fasting is that my mind can
01:04:57.020 | achieve a greater level of stable, prolonged focus, which is useful for exercise, funny
01:05:05.380 | enough for me.
01:05:06.620 | Obviously, it's really useful for work, for deep work sessions, for thinking deeply for
01:05:13.580 | prolonged periods of time, whether that's programming, whether that's writing, or whether
01:05:18.020 | that's sitting behind a sheet of paper and designing new systems.
01:05:22.020 | It's both the energy of mental focus and the kind of clarity, I don't know how else to
01:05:27.580 | put it, but there's just a cleanness to the focus that I really enjoy.
01:05:34.140 | Also when you acclimate to it, I find that the number of hours in the day that I have
01:05:39.980 | a positive mood is just larger.
01:05:42.460 | I can be cranky sometimes, especially when I'm sleep deprived, or especially when stuff
01:05:47.660 | is just not working.
01:05:49.580 | So there will always be parts of the day when I'm cranky, but it just feels, I haven't quantified
01:05:55.980 | it, but I'm pretty sure, sort of anecdotally speaking, that the number of hours I feel
01:06:00.020 | just good about the day, just grateful to be alive, is higher with keto.
01:06:07.680 | Other benefits are better sleep, I fall asleep easier.
01:06:10.940 | That might have to do with just the lower volume of food.
01:06:13.460 | I don't know, but I enjoy naps and sleep better.
01:06:19.900 | There's also just in general small aches and pains from joints when you're exercising,
01:06:24.700 | all that kind of stuff.
01:06:25.700 | They seem to be less of an issue on keto.
01:06:28.660 | So that's just my own personal experience.
01:06:31.820 | Also when you're doing fasting and keto, because of the stable energy, you find that you can
01:06:36.520 | actually skip meals quite easily.
01:06:39.300 | So that gives you a nice gateway into fasting for longer periods of time if you like.
01:06:44.020 | There's a lot of benefits to fasting that I could talk about, but that's for another
01:06:47.580 | time.
01:06:48.580 | But in general, it gives you this freedom to live life, to enjoy life, and not be so
01:06:53.740 | obsessed about food.
01:06:54.900 | I think that's the biggest liberating thing about keto, is that if you do the keto diet
01:07:00.180 | well, that food ceases to be a kind of habitual obsession that drives the progress of the
01:07:10.940 | day. More of the day is spent kind of lost in the passions and the things you love doing.
01:07:18.620 | I just found that when I was doing the kind of many meals a day, I would find myself thinking
01:07:25.980 | about food a lot.
01:07:28.060 | It drove the structure of the day.
01:07:31.220 | It influenced a lot of the things I would talk about and think about.
01:07:34.860 | You don't really think of it that way until it's gone.
01:07:37.340 | And you notice with keto and fasting that you can spend really long hours of the day
01:07:45.620 | just doing some cool stuff that you love, and food doesn't come into play in your mind
01:07:52.300 | and your actual activity.
01:07:54.900 | My personal sort of con of the keto diet is that I enjoy eating higher volume.
01:08:02.060 | It gives you a feeling of fullness.
01:08:04.420 | And I think with a keto diet there is a lower volume of food in general.
01:08:09.420 | You're still full in terms of your body not saying you're hungry, but there's not a feeling
01:08:14.680 | of real fullness.
01:08:15.680 | Now that's also a benefit because you just feel better, you feel lighter, less bloated,
01:08:21.820 | and so on.
01:08:22.820 | I find this is actually changing a lot, but keto used to be a little bit less socially
01:08:28.380 | friendly.
01:08:30.460 | Most of the fun foods, foods you associate with kind of just like going crazy at parties
01:08:36.740 | or restaurants and so on, have a ton of carbs.
01:08:40.580 | And so in social settings, it often feels like you're being restrictive and not partaking
01:08:47.820 | in the fun if you're doing a keto diet.
01:08:49.780 | I think that's changing a lot, with people becoming much more accepting of it.
01:08:53.460 | For example, at McDonald's, you can order just the beef patties for $1.50 as I've talked
01:08:59.380 | about.
01:09:00.380 | And people don't look at you weird, at least in my experience, if you just get the burger
01:09:04.420 | without a bun.
01:09:06.340 | Another con is that keto and carnivore just don't sound healthy.
01:09:09.840 | So I usually try not to talk about it too much. It just makes me feel really
01:09:14.900 | good.
01:09:15.900 | My mind is focused.
01:09:18.020 | My body performs well. But I don't know if I want to sort of prescribe it to others.
01:09:23.900 | It's definitely something I recommend you try, but I just don't feel like conclusively
01:09:28.140 | saying this diet is great for everybody.
01:09:30.060 | I really don't.
01:09:31.260 | I certainly don't know enough to be able to say that.
01:09:33.780 | And also it just doesn't sound right to say that.
01:09:36.760 | And while I've loved meat my whole life and feel the best when I eat a lot of meat,
01:09:44.840 | I do think about the ethical side of veganism.
01:09:47.580 | It's something I'm reading about now
01:09:49.700 | and thinking a lot about.
01:09:51.260 | It's an ongoing journey.
01:09:53.540 | Perhaps I'll have more to say, more of my mind to be changed in the future.
01:10:00.540 | We'll see.
01:10:01.880 | But for now, for many years now, I've been really enjoying the keto diet, a mix of keto
01:10:08.500 | and carnivore diets.
01:10:10.720 | We'll see what the future holds.
01:10:12.700 | What was the darkest time in your life, and what did your road to recovery look like?
01:10:19.660 | In general, I love life, so it's difficult for me to talk about these kinds of things.
01:10:24.820 | But let me briefly say that I think the darkest times have been when I've put my faith in
01:10:33.220 | people, when I opened my heart to them, and they turned out not to be the best versions
01:10:42.100 | of themselves or maybe the kind of amazing people that I'd hoped they might be.
01:10:50.260 | So my heart has been broken in small ways in my life, as I'm sure it has been for many
01:10:58.700 | people.
01:11:00.060 | But the fire of hope still burns bright, perhaps even brighter.
01:11:07.620 | You mentioned road to recovery.
01:11:09.220 | I think with the people I mentioned, I focus on the positive moments, and there always are some,
01:11:16.940 | and just have gratitude for those, and just don't linger on the negative.
01:11:23.180 | I just remember the good times.
01:11:26.860 | That's how I recover.
01:11:27.980 | That's how I keep my optimism, and that's how I keep my heart open for future amazing
01:11:33.060 | people to take the risk.
01:11:35.300 | And I'm sure my heart will be broken again.
01:11:38.860 | Perhaps many times in the future, but I think it's always worth the risk.
01:11:44.340 | I like the, I wrote this down, the Marcus Aurelius quote, "Love the people with whom
01:11:50.740 | fate brings you together, and do so with all of your heart."
01:11:55.580 | I think that's all we can do.
01:11:57.660 | I hope some of these answers were at least somewhat interesting or useful.
01:12:04.660 | If so, I'll try to do it again in the future.
01:12:09.060 | It is currently 4, oh, it's 4:21.
01:12:13.740 | When I started saying that sentence, it was 4:20 AM.
01:12:17.700 | As good a time to end as any, perhaps the best.
01:12:21.820 | Good night.
01:12:22.820 | I love you all.
01:12:23.820 | Thanks for listening to this AMA episode, and thank you to our sponsors, Brooklinen
01:12:28.860 | sheets, Indeed hiring website, ExpressVPN, and Theragun muscle recovery device.
01:12:35.460 | So the choice is sleep, employment, privacy, or muscle recovery.
01:12:41.780 | Choose wisely, my friends.
01:12:43.300 | And if you wish, click the sponsor links below to get a discount and to support this podcast.
01:12:49.620 | And now, since we talked about Einstein's thoughts about happiness and pigs, let me leave
01:12:55.420 | you with some words from Winston Churchill.
01:12:58.340 | I'm fond of pigs.
01:13:00.140 | Dogs look up to us.
01:13:01.860 | Cats look down on us.
01:13:03.820 | Pigs treat us as equals.
01:13:06.700 | Thank you for listening, and hope to see you next time.
01:13:10.380 | [END]