
Dr Lex Fridman: Navigating Conflict, Finding Purpose & Maintaining Drive | Huberman Lab Podcast #100


Chapters

0:00 Dr. Lex Fridman
4:30 LMNT, Levels, Eight Sleep
8:28 Podcasting
12:11 Ukraine, Russia, War & Geopolitics
23:17 Conflict & Generalized Hate
26:23 Typical Day in Ukraine; American Military & Information Wars
37:28 AG1 (Athletic Greens)
38:42 Deliberate Cold Exposure & Sauna; Fertility
46:44 Ukraine: Science, Infrastructure & Military; Zelensky
53:33 Firearms; Violence & Sensitization
57:40 MIT & Artificial Intelligence (AI), University Teaching & Pandemic
65:51 Publications & Peer Review, Research, Social Media
73:05 InsideTracker
74:17 Twitter & Social Media Mindset, Andrew Tate & Masculinity
86:05 Donald Trump & Anthony Fauci; Ideological Extremes
95:11 Biotechnology & Biopharma; Money & Status
105:08 Robotics, AI & Social Media; Start-ups
113:50 Motivation & Competition; Relationships
121:55 Jobs; A Career vs. A Calling; Robotics & Relationships
132:11 Chess, Poker & Cheating
142:25 Ideas of Lately
144:44 Why Lex Wears a Suit & Tie
147:50 Is There an AI Equivalent of Psychedelics?
149:06 Hardest Jiu-Jitsu Belt to Achieve
152:7 Advice to Young People
159:29 Zero-Cost Support, YouTube Feedback, Spotify & Apple Reviews, Sponsors, Momentous Supplements, Neural Network Newsletter, Social Media


00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.280 | where we discuss science and science-based tools
00:00:04.880 | for everyday life.
00:00:05.900 | I'm Andrew Huberman,
00:00:10.200 | and I'm a professor of neurobiology and ophthalmology
00:00:13.240 | at Stanford School of Medicine.
00:00:15.140 | Today, my guest is Dr. Lex Fridman.
00:00:17.560 | Dr. Lex Fridman is an expert
00:00:19.120 | in electrical and computer engineering,
00:00:20.880 | artificial intelligence, and robotics.
00:00:23.200 | He is also the host of the Lex Fridman Podcast,
00:00:26.240 | which initially started as a podcast focused on technology
00:00:29.120 | and science of various kinds,
00:00:30.760 | including computer science and physics,
00:00:33.220 | but rapidly evolved to include guests and other topics
00:00:36.540 | as a matter of focus, including sport.
00:00:40.280 | For instance, Dr. Lex Fridman is a black belt
00:00:43.040 | in Brazilian jiu-jitsu,
00:00:44.360 | and he's had numerous guests on who come from the fields
00:00:46.720 | of Brazilian jiu-jitsu, both from the coaching side
00:00:49.520 | and from the competitor side.
00:00:52.280 | He also has shown an active interest in topics
00:00:54.920 | such as chess and essentially anything
00:00:57.680 | that involves intense activation and engagement
00:01:00.580 | of the mind and/or body.
00:01:02.560 | In fact, the Lex Fridman Podcast has evolved
00:01:04.840 | to take on very difficult topics, such as mental health.
00:01:08.080 | He's had various psychiatrists and other guests on
00:01:10.280 | that relate to mental health and mental illness,
00:01:12.680 | as well as guests focused on geopolitics
00:01:15.620 | and some of the more controversial issues
00:01:17.720 | that face our times.
00:01:19.040 | He's had comedians, he's had scientists, he's had friends,
00:01:23.620 | he's had enemies on his podcast.
00:01:26.040 | Lex has a phenomenal, I would say a one
00:01:28.000 | in eight billion ability to find these people,
00:01:32.860 | make them comfortable, and in that comfort,
00:01:36.240 | both try to understand them and to confront them
00:01:40.200 | and to push them so that we all learn.
00:01:42.540 | All of which is to say that Lex Fridman is no longer
00:01:44.920 | just an accomplished scientist.
00:01:46.720 | He certainly is that, but he has also become one
00:01:49.320 | of the more preeminent thought leaders on the planet.
00:01:51.960 | And if there's anything that really captures the essence
00:01:54.340 | of Lex Fridman, it's his love of learning,
00:01:57.100 | his desire to share with us the human experience
00:02:00.040 | and to broaden that experience so that we all may benefit.
00:02:03.540 | In many ways, our discussion during today's episode
00:02:06.080 | captures the many facets of Lex Fridman,
00:02:08.300 | although no conversation of course could capture them all.
00:02:11.560 | We sit down to the conversation just days
00:02:13.320 | after Lex returned from Ukraine,
00:02:15.440 | where he deliberately placed himself into the tension
00:02:18.560 | of that environment in order to understand the geopolitics
00:02:21.560 | of the region and to understand exactly what was happening
00:02:25.280 | at the level of the ground and the people there.
00:02:28.240 | You may notice that he carries quite a lot of both emotion
00:02:32.320 | and knowledge and understanding,
00:02:34.280 | and yet in a very classic Lex Friedman way,
00:02:36.880 | you'll notice that he's able to zoom out
00:02:38.620 | of his own experience around any number of different topics
00:02:41.500 | and view them through a variety of lenses
00:02:44.220 | so that first of all, everyone feels included,
00:02:46.740 | but most of all, so that everyone learns something new,
00:02:50.240 | that is to gain new perspective.
00:02:52.480 | Our discussion also ventures into the waters of social media
00:02:55.520 | and how that landscape is changing the way
00:02:58.140 | that science and technology are communicated.
00:03:00.520 | We also get into the topics of motivation, drive,
00:03:03.100 | and purpose, both finding it and executing
00:03:05.520 | on that drive and purpose.
00:03:07.420 | I should mention that this is episode 100
00:03:10.000 | of the Huberman Lab Podcast,
00:03:11.600 | and I would be remiss if I did not tell you
00:03:14.260 | that there would be no Huberman Lab Podcast
00:03:16.720 | were it not for Lex Fridman.
00:03:18.440 | I was a fan of the Lex Fridman Podcast
00:03:20.320 | long before I was ever invited onto the podcast as a guest,
00:03:23.660 | and after our first recording,
00:03:25.400 | Lex was the one that suggested that I start a podcast.
00:03:28.360 | He only gave me two pieces of advice.
00:03:30.160 | The first piece of advice was start a podcast,
00:03:33.480 | and the second piece of advice was that I not just make it
00:03:36.720 | me blabbing into the microphone and staring at the camera.
00:03:40.240 | So I can safely say that I at least followed
00:03:42.560 | half of his advice and that I am ever grateful for Lex,
00:03:46.080 | both as a friend, a colleague in science,
00:03:48.880 | and now fellow podcaster for making the suggestion
00:03:51.840 | that we start this podcast.
00:03:53.560 | I already mentioned a few of the topics
00:03:55.180 | covered on today's podcast,
00:03:56.840 | but I can assure you that there is far more to the person
00:04:00.280 | that many of us know as Lex Fridman.
00:04:02.560 | If you are somebody interested in artificial intelligence,
00:04:04.800 | engineering, or robotics,
00:04:06.080 | today's discussion is most certainly for you.
00:04:08.720 | And if you are not,
00:04:09.960 | but you are somebody who is interested in world politics
00:04:12.320 | and more importantly, the human experience,
00:04:15.000 | both the individual and the collective human experience,
00:04:18.660 | Lex shares what can only be described as incredible insights
00:04:23.140 | into what he views as the human experience
00:04:25.840 | and what is optimal in order to derive the most
00:04:28.240 | from our time on this planet.
00:04:29.900 | Before we begin, I'd like to emphasize that this podcast
00:04:32.520 | is separate from my teaching and research roles at Stanford.
00:04:35.280 | It is, however, part of my desire and effort
00:04:37.480 | to bring zero-cost-to-consumer information about science
00:04:39.960 | and science-related tools to the general public.
00:04:42.640 | In keeping with that theme,
00:04:43.700 | I'd like to thank the sponsors of today's podcast.
00:04:46.380 | Our first sponsor is Element.
00:04:48.300 | Element is an electrolyte drink
00:04:49.800 | with everything you need and nothing you don't.
00:04:52.260 | That means the electrolytes, sodium, potassium, and magnesium
00:04:55.780 | are in Element in the correct ratios,
00:04:57.840 | but it has no sugar.
00:04:59.760 | As I mentioned before on the podcast,
00:05:01.480 | electrolytes are critical to the function
00:05:03.360 | of every cell in the body
00:05:04.800 | and especially the cells in your brain,
00:05:06.720 | meaning the neurons or nerve cells.
00:05:09.060 | Indeed, the ability for nerve cells to be active
00:05:11.260 | and communicate with one another
00:05:12.640 | critically depends on sodium, potassium, and magnesium.
00:05:16.020 | You can get electrolytes from a variety of sources,
00:05:18.680 | but it's often hard to get them in the proper ratios,
00:05:20.860 | even from food.
00:05:22.100 | So if you're somebody who's exercising a lot and sweating,
00:05:24.880 | or if you're somebody following, for instance,
00:05:26.520 | a low carbohydrate or even a semi-low carbohydrate diet,
00:05:30.040 | that will cause you to excrete electrolytes.
00:05:32.080 | I tend to have my Element first thing in the morning
00:05:34.140 | when I wake up or within the first few hours of waking,
00:05:37.120 | anytime while or after I'm exercising or I've sweat a lot,
00:05:40.340 | such as exiting the sauna.
00:05:41.760 | If you'd like to try Element,
00:05:42.960 | you can go to Drink Element, that's lmnt.com/huberman,
00:05:46.920 | to claim a free Element sample pack with your purchase.
00:05:49.560 | Again, that's Drink Element, lmnt.com/huberman,
00:05:53.320 | to claim a free sample pack.
00:05:55.020 | Today's episode is also brought to us by Levels.
00:05:57.600 | Levels is a program that lets you see
00:05:59.120 | how different foods affect your health
00:06:00.540 | by giving you real-time feedback on your diet
00:06:02.560 | using a continuous glucose monitor.
00:06:04.840 | Blood glucose or blood sugar is a critical aspect
00:06:07.540 | of your immediate and long-term health,
00:06:09.040 | and indeed your feelings of vigor and mental clarity
00:06:11.760 | and wellbeing at any moment.
00:06:14.200 | One of the key things is to know how different foods
00:06:16.220 | and food combinations and timing of food intake
00:06:19.140 | are impacting blood glucose.
00:06:20.880 | And with Levels, you're able to assess
00:06:22.760 | all of that in real time.
00:06:24.960 | I tried Levels and what it taught me, for instance,
00:06:27.340 | was that I can eat certain foods at certain times of day,
00:06:30.000 | but if I eat them at other times of day,
00:06:31.800 | I get a blood sugar crash.
00:06:33.560 | It also taught me, for instance,
00:06:34.940 | how to space my exercise and my food intake.
00:06:38.040 | Turns out for me, exercising fasted is far more beneficial.
00:06:41.320 | That's something I learned using Levels,
00:06:43.080 | and it's completely transformed
00:06:44.680 | not just the spacing and timing of my diet and exercise,
00:06:48.200 | but also use of things like the sauna and other activities.
00:06:51.600 | It's been a tremendous learning for me
00:06:53.560 | that's really shaped an enormous number of factors
00:06:55.600 | in my life that have led to me feeling far more vigorous
00:06:58.740 | with far more mental focus
00:07:00.120 | and physical strength and endurance.
00:07:02.640 | So if you're interested in learning more about Levels
00:07:04.420 | and trying a continuous glucose monitor yourself,
00:07:06.800 | go to levels.link/huberman.
00:07:09.160 | Again, that's levels.link, L-I-N-K/huberman.
00:07:12.880 | Today's episode is also brought to us by Eight Sleep.
00:07:15.840 | Eight Sleep makes smart mattress covers
00:07:17.480 | with cooling, heating, and sleep tracking capacity.
00:07:20.600 | I've talked many times on this podcast
00:07:22.480 | about the critical relationship
00:07:23.760 | between sleep and body temperature.
00:07:26.240 | That is, in order to fall asleep
00:07:28.280 | and stay deeply asleep throughout the night,
00:07:30.240 | our body temperature needs to drop
00:07:31.620 | by about one to three degrees.
00:07:33.440 | And conversely, when we wake up in the morning,
00:07:36.200 | that is in large part because of our body heating up
00:07:39.360 | by one to three degrees.
00:07:41.080 | Now, people have different core body temperatures
00:07:43.000 | and they tend to run colder or hotter throughout the night.
00:07:45.600 | Eight Sleep allows you to adjust the temperature
00:07:47.600 | of your sleeping environment
00:07:48.880 | so that you have the optimal temperature
00:07:50.500 | that gets you the best night's sleep.
00:07:52.460 | I started sleeping on an Eight Sleep mattress cover
00:07:54.320 | about eight months ago,
00:07:55.480 | and it has completely transformed my sleep.
00:07:57.800 | I sleep so much deeper.
00:07:59.680 | I wake up far less during the middle of the night,
00:08:01.720 | if at all, and I wake up feeling far better than I ever have
00:08:05.000 | even after the same amount of sleep.
00:08:07.280 | If you want to try Eight Sleep,
00:08:08.120 | you can go to eightsleep.com/huberman
00:08:11.140 | to save up to $400 off their Sleep Fit Holiday bundle,
00:08:14.400 | which includes their new Pod 3 cover.
00:08:16.740 | Eight Sleep currently ships in the USA, Canada,
00:08:19.040 | United Kingdom, select countries in the EU, and Australia.
00:08:22.640 | Again, that's eightsleep.com/huberman.
00:08:25.380 | And now for my discussion with Dr. Lex Fridman.
00:08:28.840 | Welcome back.
00:08:30.320 | - It's good to be back in a bedroom.
00:08:32.960 | This feels like a porn set.
00:08:34.240 | I apologize to open that way.
00:08:36.320 | I've never been in a porn set, so I should admit this.
00:08:39.140 | - Our studio is being renovated.
00:08:40.720 | So here we are for the monumental recording of episode 100.
00:08:45.720 | - Episode 100.
00:08:46.960 | - Of the Huberman Lab Podcast,
00:08:48.500 | which was inspired by the Lex Fridman Podcast.
00:08:52.280 | Some people already know this story,
00:08:53.440 | but I'll repeat it again.
00:08:55.780 | For those that don't,
00:08:57.080 | there would not be a Huberman Lab Podcast
00:08:59.520 | were it not for Lex Fridman,
00:09:01.000 | because after recording as a guest on his podcast
00:09:05.440 | a few years ago,
00:09:06.280 | he made the suggestion that I start a podcast
00:09:08.600 | and he explained to me how it works.
00:09:10.200 | And he said, you should start a podcast,
00:09:12.940 | but just make sure that it's not you blabbing the whole time,
00:09:15.280 | Andrew, and I only sort of followed the advice.
00:09:18.340 | - Yeah, well, you surprised me,
00:09:20.460 | surprised the world that you're able to talk for hours
00:09:25.120 | and cite some of the best science going on
00:09:27.840 | and be able to give people advice
00:09:29.680 | without many interruptions or edits or any of that.
00:09:32.480 | I mean, that takes an incredible amount of skill
00:09:35.700 | that you're probably born with and some of it is developed.
00:09:38.280 | I mean, the whole science community is proud of you, man.
00:09:41.200 | Stanford is proud of you.
00:09:43.440 | So yeah, it's just a beautiful thing.
00:09:45.640 | It was really surprising
00:09:46.760 | 'cause it's unclear how a scientist can do a great podcast
00:09:50.740 | that's not just shooting the shit about random stuff,
00:09:52.920 | but really is giving very structured good advice
00:09:56.200 | that's boiling down the state-of-the-art science
00:10:00.080 | into something that's actually useful for people.
00:10:02.480 | So that was impressive.
00:10:04.040 | It's like, holy shit, he actually pulled this off.
00:10:06.080 | And doing it every week on a different topic.
00:10:10.540 | That, I mean, I'm usually positive,
00:10:14.680 | especially for people I love and support,
00:10:16.960 | but damn, I thought there's no way
00:10:18.480 | he's gonna be able to pull this off week after week.
00:10:20.660 | And it's been only getting better and better and better.
00:10:23.280 | I had a whole rant on a recent podcast,
00:10:25.180 | I forget with who, of how awesome you are,
00:10:27.840 | with Rana el Kaliouby.
00:10:29.400 | She's an emotion recognition person, an AI person.
00:10:34.080 | And then she didn't know who you were.
00:10:36.920 | And I was like, what the hell do you mean?
00:10:39.800 | And I just went on this whole rant of how awesome you are.
00:10:41.920 | It was hilarious.
00:10:43.460 | - Oh, well, I'm very gratified to hear this.
00:10:46.520 | It's a little uncomfortable for me to hear,
00:10:48.620 | but listen, I'm just really happy
00:10:51.360 | if people are getting information that they like
00:10:53.800 | and can make actionable.
00:10:55.240 | And it was inspired by you.
00:10:56.940 | And look right back at you,
00:10:58.480 | I've followed a number of your structural formats,
00:11:03.960 | attire, I don't wear a tie.
00:11:06.900 | I'm constantly reminded about this by my father,
00:11:08.980 | who saw my podcast and was like,
00:11:10.940 | why don't you dress properly like your friend Lex?
00:11:13.120 | He literally said that.
00:11:14.380 | And it's a debate that goes back and forth.
00:11:17.380 | But nonetheless.
00:11:19.380 | - How does it feel, episode 100?
00:11:21.080 | How does it feel?
00:11:22.280 | - You know, I think-
00:11:23.120 | - Imagine you're here.
00:11:24.800 | You're here after so many episodes and done so much.
00:11:27.500 | I mean, the number of hours is just insane.
00:11:31.120 | The amount of passion, the amount of work you put into this,
00:11:33.680 | what's it feel like?
00:11:34.680 | - It feels great.
00:11:38.560 | And it feels very much like the horizon
00:11:41.400 | is still at the same distance in front of me.
00:11:43.280 | You know, every episode,
00:11:44.220 | I just try and get information there.
00:11:46.840 | And the process that we talked about on your podcast,
00:11:48.800 | so we won't go into it, of collecting information,
00:11:51.480 | distilling it down to some simple notes,
00:11:52.980 | walking around, listening to music,
00:11:54.560 | trying to figure out what the motifs are.
00:11:57.880 | And then just like you,
00:11:59.100 | I don't use a teleprompter or anything like that.
00:12:01.240 | There's very minimal notes.
00:12:02.620 | So it feels great and I love it.
00:12:04.420 | And again, I'm just grateful to you for inspiring it.
00:12:08.280 | And I just want to keep going and do more of it.
00:12:10.860 | And I should say,
00:12:13.500 | I am also relieved that we're sitting here
00:12:16.120 | because you recently went overseas
00:12:19.240 | to a very intense war zone, literally, the Ukraine.
00:12:24.240 | And the entire time that you were there,
00:12:28.340 | I was genuinely concerned.
00:12:30.220 | You know, the world's an unpredictable place in general,
00:12:33.400 | and we don't always get the only vote
00:12:35.380 | in what happens to us.
00:12:36.680 | So first of all, welcome back safely.
00:12:39.620 | One piece, one alive piece.
00:12:41.960 | And what was that like?
00:12:45.040 | I mean, at a broad level, at a specific level,
00:12:49.200 | what drew you there?
00:12:50.400 | What surprised you?
00:12:51.960 | And how do you think it changed you in coming back here?
00:12:55.960 | - I think there's a lot to say,
00:12:58.300 | but first it is really good to be back.
00:13:00.580 | One of the things that
00:13:02.080 | when you go to a difficult part of the world
00:13:04.380 | or a part of the world
00:13:05.220 | that's going through something difficult,
00:13:07.300 | you really appreciate how great it is to be an American.
00:13:10.700 | Everything, the easy access to food,
00:13:15.720 | despite what people think,
00:13:17.400 | the stable, reliable rule of law,
00:13:20.300 | the lack of corruption in that you can trust
00:13:26.200 | that if you start a business
00:13:27.600 | or if you take on various pursuits in life,
00:13:31.140 | that there's not going to be at scale
00:13:34.000 | manipulation of your efforts such that you can't succeed.
00:13:36.800 | So this kind of, you know, capitalism is in its,
00:13:41.760 | the ideal of capitalism is really still burning bright
00:13:45.340 | in this country,
00:13:46.180 | and really makes you appreciate those aspects.
00:13:48.680 | And also just the ability to have a home for generations,
00:13:53.680 | across generations.
00:13:55.960 | So you can have your grandfather live in, I don't know,
00:13:59.020 | Kentucky in a certain city,
00:14:00.280 | and then his children lived there and you lived there,
00:14:04.520 | and then it just continues on and on.
00:14:07.000 | That's the kind of thing you can have
00:14:08.440 | when you don't have war,
00:14:10.480 | because war destroys entire communities.
00:14:14.040 | It destroys the histories, generations,
00:14:17.820 | like life stories that stretch across the generations.
00:14:21.500 | - Yeah, I didn't even think about that until you said,
00:14:23.560 | just now about photographs,
00:14:25.480 | hard drives get destroyed or just abandoned, right?
00:14:28.500 | Libraries.
00:14:29.680 | I mean, nowadays things exist in the cloud,
00:14:32.620 | but there's still a lot of material goods
00:14:35.060 | that are irreplaceable, right?
00:14:37.620 | - Well, even in rural parts of the United States,
00:14:41.460 | they don't exist in the cloud, right?
00:14:43.100 | A lot of people still, well, even in towns,
00:14:46.120 | they still love the physical photo album of your family.
00:14:50.340 | A lot of people still store their photographs of families
00:14:53.360 | and store the VHS tapes and all that kind of stuff.
00:14:57.000 | Yeah, but I think there's so many things I've learned
00:15:02.000 | and really felt the lessons.
00:15:04.940 | One of which is nobody gives a damn
00:15:08.140 | when your photos are gone and all that kind of stuff.
00:15:10.400 | Your house is gone.
00:15:12.260 | The thing time and time again,
00:15:13.940 | I saw for people that lost everything
00:15:17.800 | is how happy they are for the people they love,
00:15:20.780 | the friends, the family that are still alive.
00:15:24.520 | That's the only thing they talk about that,
00:15:28.220 | in fact, they don't mention actually with much dramatic
00:15:32.700 | sort of vigor about the trauma of losing your home.
00:15:36.780 | They're just nonstop saying how lucky they are
00:15:39.740 | that person X, person Y is still here.
00:15:42.940 | And that makes you realize that when you lose everything,
00:15:46.460 | it makes you realize what really matters,
00:15:49.420 | which is the people in your life.
00:15:51.100 | I mean, a lot of people kind of realize that later in life
00:15:53.460 | when you're facing mortality, when you're facing your death
00:15:56.380 | or you get a cancer diagnosis, that kind of stuff.
00:15:59.300 | I think people here in America, in California,
00:16:02.300 | with the fires, you can still lose your home.
00:16:06.180 | You realize like, nah, it doesn't really matter.
00:16:08.780 | It's a pain in the ass, but what matters is still
00:16:12.300 | the family, the people, and so on.
00:16:15.060 | I think the most intense thing,
00:16:17.820 | I talked to several hundred people,
00:16:20.040 | some of which is recorded.
00:16:21.580 | I've really been struggling to put that out
00:16:25.580 | 'cause I have to edit it myself.
00:16:27.460 | And so you're talking about 30, 40 hours of footage.
00:16:30.780 | - As in emotionally struggling?
00:16:31.940 | - Yeah, emotional struggle is extremely difficult.
00:16:34.740 | So I talk to a lot of politicians,
00:16:36.140 | the number two in the country, number three.
00:16:38.180 | I'll be back there to talk to the president
00:16:39.980 | to do a three-hour conversation.
00:16:42.380 | Those are easy to edit.
00:16:43.940 | You know, they're really heartfelt and thoughtful folks
00:16:47.620 | from different perspectives on the geopolitics of the war.
00:16:51.940 | But the ones that are really hard to edit
00:16:53.580 | is like grandmas that are like in the middle of nowhere.
00:16:56.620 | They lost everything.
00:16:57.540 | They still have hope.
00:16:58.380 | They still have love.
00:16:59.420 | And some of them have, some of them, many of them,
00:17:02.820 | unfortunately, have now hate in their heart.
00:17:05.500 | So in February, when Russia invaded Ukraine,
00:17:10.500 | this is the thing I realized about war.
00:17:14.260 | One of the most painful lessons
00:17:17.540 | is that war creates generational hate.
00:17:22.140 | You know, we sometimes think about war
00:17:24.940 | as a thing that kills people, kills civilians,
00:17:28.620 | kills soldiers, takes away lives, injures people.
00:17:32.100 | But we don't directly think about the secondary
00:17:36.260 | and tertiary effects of that, which lasts decades,
00:17:39.940 | which is anyone who's lost a father, or a mother,
00:17:43.060 | or a daughter, or a son, they now hate.
00:17:47.500 | Not just the individual soldiers or the leaders
00:17:51.460 | that invaded their country, but the entirety of the people.
00:17:55.020 | So it's not that they hate Vladimir Putin
00:17:59.780 | or hate the Russian military.
00:18:01.740 | They hate Russian people.
00:18:03.360 | So that tears the fabric of a thing that, for me,
00:18:08.660 | you know, half my family's from Ukraine,
00:18:12.220 | half my family is from Russia.
00:18:14.200 | But there's a, I remember the pain,
00:18:19.200 | the triumph of World War II still resonates
00:18:22.500 | through my entire family tree.
00:18:24.820 | And so you remember when the Russians and Ukrainians
00:18:27.300 | fought together against this Nazi invasion.
00:18:30.340 | You remember a lot of that.
00:18:31.980 | And now to see the fabric of these peoples
00:18:38.020 | torn apart completely with hate
00:18:39.820 | is really, really difficult for me,
00:18:41.700 | just to realize that things will just never be the same
00:18:45.180 | on this particular cultural, historical aspect.
00:18:48.160 | But also, there's so many painful ways
00:18:52.520 | in which things will never be the same,
00:18:54.500 | which is we've seen that it's possible
00:18:56.260 | to have a major hot war in the 21st century.
00:19:01.240 | I think a lot of people are watching this.
00:19:02.820 | China is watching this.
00:19:03.900 | India is watching this.
00:19:04.860 | The United States is watching this
00:19:06.260 | and thinking we can actually have a large-scale war.
00:19:11.120 | And I think the lessons learned from that
00:19:13.820 | might be the kind that lead to a major World War III
00:19:18.700 | in the 21st century.
00:19:20.700 | So one of the things I realized watching the whole scene
00:19:25.920 | is that we don't know shit about what's gonna happen
00:19:27.960 | in the 21st century.
00:19:29.260 | And it might, we kind of have this intuition,
00:19:31.360 | like surely there's not gonna be another war.
00:19:33.620 | - We'll just coast.
00:19:34.700 | - Yeah.
00:19:35.540 | - Yeah.
00:19:36.360 | - Yeah, pandemic.
00:19:37.540 | - Yeah, back to normal.
00:19:38.860 | - Back to normal.
00:19:39.700 | - Whatever that is.
00:19:40.540 | - But you have to remember, at the end of World War I,
00:19:45.460 | as Woodrow Wilson called it, the war to end all wars,
00:19:49.580 | ironically, in a dark way,
00:19:52.820 | it was also the war after which, in the '20s, people believed
00:19:56.080 | there would never be another World War.
00:19:58.440 | And 20 years after that, the rise of Nazi Germany,
00:20:03.440 | a charismatic leader that captivated the minds of millions
00:20:07.760 | and built up a military that can take on the whole world.
00:20:11.480 | And so it makes you realize that this is still possible.
00:20:15.440 | This is still possible.
00:20:16.680 | And then the tension, you see the media machine,
00:20:22.020 | the propaganda machine that I've gotten to see
00:20:24.120 | every aspect of, it's still fueling that division
00:20:28.740 | between America and China, between Russia and India.
00:20:33.740 | And then Africa has a complicated thing
00:20:37.280 | that's trying to figure out who are they with,
00:20:39.100 | who are they against, and just this tension
00:20:41.440 | is building and building.
00:20:42.880 | And it makes you realize, like we might,
00:20:47.060 | the thing that might shake human civilization
00:20:51.760 | may not be so far off.
00:20:55.380 | That's a realization you get to really feel.
00:20:58.200 | I mean, there's all kinds of other lessons,
00:21:01.520 | and one of which is propaganda,
00:21:03.920 | is I get a lot of letters, emails,
00:21:06.840 | and some of them are full of really intense language,
00:21:11.400 | full of hate from every side toward me.
00:21:16.260 | Well, the hate is towards me as representing side X,
00:21:21.260 | and X stands as a variable for every side.
00:21:26.160 | So either I'm a Zelensky shill, or I'm a Putin shill,
00:21:30.360 | or I'm a NATO shill, or I'm an American empire shill,
00:21:35.360 | or I'm a Democrat or a Republican,
00:21:41.160 | 'cause it's already been politicized in this country.
00:21:44.840 | I think there's a sense of Ukraine is this place
00:21:48.960 | that's full of corruption, why are we sending money there?
00:21:51.740 | I think that's kind of the messaging on the Republican side,
00:21:55.460 | on the Democratic side.
00:21:56.800 | I'm not even keeping track of the actual messaging
00:21:59.980 | and the conspiracy theories and the narratives,
00:22:02.940 | but they are, the tension is there,
00:22:05.100 | and I get to feel it directly.
00:22:07.140 | And what you get to really experience is
00:22:09.920 | there's a large number of narratives
00:22:12.840 | that all are extremely confident in themselves
00:22:16.320 | that they know the truth.
00:22:17.560 | People are convinced, first of all,
00:22:21.240 | that they're not being lied to.
00:22:22.940 | People in Russia think there's no propaganda.
00:22:26.060 | They think that yes, yes, there's state-sponsored propaganda,
00:22:30.540 | but we're all smart enough to ignore
00:22:32.520 | the sort of lame propaganda that's everywhere.
00:22:37.520 | They know that we can think on our own, we know the truth.
00:22:40.640 | And everybody kind of speaks in this way.
00:22:42.760 | Everybody in the United States says,
00:22:44.480 | well, yes, there's mainstream media,
00:22:46.020 | they're full of messaging and propaganda,
00:22:48.160 | but we're smart, we can think on our own.
00:22:50.780 | Of course, we see through that.
00:22:52.460 | Everybody says this, and then the conclusion
00:22:55.000 | of their thought is often hatred towards some group,
00:22:59.400 | whatever that group is, and the more you've lost,
00:23:01.800 | the more intense the feeling of hatred.
00:23:04.600 | It's a really difficult field to walk through calmly
00:23:12.080 | and with an open mind and try to understand
00:23:15.340 | what's really going on.
00:23:17.080 | - It's super intense.
00:23:19.220 | Those are the only words that come to mind as I hear this.
00:23:21.840 | You mentioned something that it seems that hate generalizes.
00:23:25.560 | You know, it's against an entire group or an entire country.
00:23:28.940 | Why do you think it is that hate generalizes
00:23:33.900 | and that love may or may not generalize?
00:23:37.060 | - I've had, so one of the, as you can imagine,
00:23:40.960 | the kind of question I asked is,
00:23:44.320 | do you have love or hate in your heart?
00:23:48.580 | It's a question I asked almost everybody,
00:23:50.840 | and then I would dig into this exact question
00:23:52.880 | that you're asking.
00:23:53.840 | I think some of the most beautiful things I've heard,
00:23:58.920 | which is people that are full of hate
00:24:00.880 | are able to self-introspect about it.
00:24:04.120 | They know they shouldn't feel it, but they can't help it.
00:24:07.520 | That's not, they know that ultimately the thing that helps
00:24:10.880 | them and helps everyone is to feel love for a fellow man,
00:24:15.300 | but they can't help it.
00:24:18.720 | They know it's like a drug.
00:24:20.480 | They say hate escalates.
00:24:23.520 | It's like a vicious spiral.
00:24:25.600 | You just can't help it.
00:24:27.400 | And the question I also asked is,
00:24:30.040 | do you think you'll ever be able to forgive Russia?
00:24:33.620 | And after much thought, almost, it's split,
00:24:38.620 | but most people will say, no,
00:24:42.440 | I will never be able to forgive.
00:24:45.220 | - And because of the generalization you talked about earlier,
00:24:48.640 | that could even include all Russians.
00:24:51.240 | - All Russians.
00:24:52.080 | - They mean all Russians.
00:24:52.900 | - Because if you do nothing, that's as bad or worse
00:25:02.920 | than being part of the army that invades.
00:25:07.780 | So the people that are just sitting there,
00:25:09.700 | the good Germans, the people that are just quietly going on
00:25:12.880 | with their lives, you're just as bad, if not worse,
00:25:16.580 | is their perspective.
00:25:17.800 | - Earlier you said that going over to the Ukraine now
00:25:23.300 | allowed you to realize just so many of the positives
00:25:27.620 | of being here in the United States.
00:25:30.180 | I have a good friend.
00:25:31.320 | We both know him.
00:25:32.160 | We won't name him by name,
00:25:33.380 | but we've communicated, the three of us.
00:25:35.500 | He's from tier one special operations.
00:25:37.680 | He spent years doing deployments, really amazing individual.
00:25:41.600 | And I remember when the pandemic hit,
00:25:44.220 | he said on a text thread,
00:25:47.180 | Americans aren't used to the government interfering
00:25:49.660 | with their plans.
00:25:50.760 | Around the world, many people are familiar with governments
00:25:55.060 | dramatically interfering with their plans,
00:25:57.020 | sometimes even in a seemingly random way.
00:25:59.740 | Here, we were not braced for that.
00:26:02.640 | There, I mean, we get speeding tickets
00:26:05.000 | and there's lines to vote and things like that.
00:26:07.800 | But I think the pandemic was one of the first times,
00:26:11.180 | at least in my life that I can remember,
00:26:12.300 | where it really seemed like the government was impeding
00:26:14.760 | what people naturally wanted to do.
00:26:16.940 | And that was a shock for people here.
00:26:19.640 | And I have what might seem like a somewhat mundane question,
00:26:24.560 | but it's something that I saw on social media.
00:26:26.200 | A lot of people were asking me to ask you.
00:26:28.640 | And I was curious about too,
00:26:30.840 | what was a typical day like over there?
00:26:34.120 | Were you sleeping in a bed?
00:26:35.400 | Were you sleeping on the ground?
00:26:37.000 | Everyone seems to want to know, what were you eating?
00:26:39.620 | Were you eating once a day?
00:26:40.720 | Were you eating your steak?
00:26:42.040 | Or were you in fairly deprived conditions over there?
00:26:45.720 | I saw a couple photos that you posted out of doors,
00:26:50.720 | in front of rubble, with pith helmet on in one case.
00:26:56.320 | What was that typical day like over there?
00:26:59.340 | - So there's two modes.
00:27:03.240 | One of them, I spent a lot of time in Kyiv,
00:27:06.320 | which is much safer than, it may be obvious these days,
00:27:11.320 | but for people who don't know,
00:27:14.100 | it's in the middle of the country
00:27:15.400 | and it's much safer than the actual front,
00:27:18.160 | where the battle is happening.
00:27:20.200 | So much, much safer than Kyiv even is Lviv,
00:27:23.640 | which is the western part of the country.
00:27:25.640 | So the times I spent in Kyiv were fundamentally different
00:27:28.360 | than the time I spent at the front.
00:27:30.140 | And I went to the Kherson region,
00:27:32.840 | which is where a lot of really heated battle was happening.
00:27:35.960 | There's several areas, so there's Kharkiv,
00:27:38.600 | it's northeast of the country.
00:27:40.840 | And then there's Donbass region,
00:27:42.640 | which is east of the country.
00:27:43.800 | And then there's the Kherson region,
00:27:46.360 | which by the way, I'm not good at geography,
00:27:48.460 | so it's the southeast of the country.
00:27:52.920 | And that's where, at least when I was there,
00:27:54.320 | there was a lot of really heated fighting happening.
00:27:57.120 | So when I was in the Kherson region,
00:27:59.360 | there's, you know, it's what you would imagine.
00:28:02.640 | The place, I stayed in a hotel,
00:28:06.380 | where all the lights have to stay off for the entire town.
00:28:10.260 | All the lights are off.
00:28:11.680 | You have to kind of navigate through the darkness
00:28:13.880 | and then use your phone to shine and so on.
00:28:16.320 | - This is terrible for the circadian system.
00:28:18.320 | - Yeah, that's exactly, how can I do this?
00:28:21.040 | Where's my Element and my Athletic Greens? How can I function?
00:28:24.440 | No, I think it was balanced
00:28:29.440 | by the deep appreciation of being alive.
00:28:31.920 | - Right, no, I mean, this is the reason I asked.
00:28:35.360 | This is the reason I ask is, you know,
00:28:37.080 | we get used to all these creature comforts
00:28:40.200 | and we don't need them,
00:28:43.440 | but we often come to depend on them
00:28:45.520 | in a way that makes us feel like we need them.
00:28:47.760 | - Yeah, but very quickly,
00:28:49.640 | there's something about the intensity of life
00:28:54.640 | that you see in people's eyes
00:28:56.320 | because they've been living through war
00:28:57.640 | that makes you forget all those creature comforts.
00:28:59.920 | And it was actually, you know,
00:29:02.080 | I'm somebody who hates traveling and so on.
00:29:03.880 | I love the creature habits.
00:29:06.240 | I love the comfort of the ritual, right?
00:29:10.080 | But all of that was forgotten very quickly.
00:29:12.320 | Just the intensity of feeling,
00:29:14.160 | the intensity of love that people have for each other.
00:29:16.880 | That was obvious.
00:29:18.280 | In terms of food, so there's a curfew.
00:29:20.400 | So depends on what part of the country,
00:29:24.480 | but usually you basically have to scamper home at like 9 p.m.
00:29:29.000 | So the hard curfew in a lot of places is 11 p.m. at night,
00:29:34.000 | but by then you have to be home.
00:29:37.600 | So in some places it's 10.
00:29:40.220 | So at 9 p.m. you start going home,
00:29:42.920 | which for me was kind of wonderful also
00:29:45.880 | because I get to spend,
00:29:48.340 | I get to be forced to spend time alone
00:29:53.280 | and think for many hours wherever I'm staying,
00:29:57.080 | which is really nice. In everybody,
00:29:58.400 | there's a calmness and a quietness to the whole thing.
00:30:01.200 | In terms of food, once a day,
00:30:03.100 | just the food is incredibly cheap and incredibly delicious.
00:30:09.400 | People are still,
00:30:10.960 | one of the things they can still take pride in
00:30:13.600 | is making the best possible food they can.
00:30:18.300 | So meat, but they do admire American meat,
00:30:22.720 | so the meat is not as great as it could be in that country,
00:30:25.320 | but I ate borscht every day, all that kind of stuff,
00:30:28.400 | mostly meat.
00:30:29.340 | So spend the entire day, wake up in the morning with coffee,
00:30:34.480 | spend the entire day talking to people,
00:30:36.520 | which for me is very difficult
00:30:38.320 | because of the intensity of the stories,
00:30:40.920 | one after the other after the other.
00:30:42.520 | We just talk to regular people, talk to soldiers,
00:30:45.080 | talk to politicians, all kinds of soldiers.
00:30:49.800 | I talk to people there who are doing rescue missions,
00:30:53.480 | so Americans.
00:30:54.680 | I hung out with Tim Kennedy.
00:30:58.160 | - Oh yeah, the great Tim Kennedy.
00:31:00.200 | - The great Tim Kennedy, who also him and many others
00:31:05.200 | revealed to me one of the many reasons
00:31:09.920 | I'm proud to be an American is how trained and skilled
00:31:14.920 | and effective American soldiers are.
00:31:19.400 | - I guess for listeners of this podcast,
00:31:21.320 | maybe we should familiarize them with who Tim Kennedy is
00:31:23.760 | 'cause I realize that a number of them won't know.
00:31:25.440 | - How do you do that?
00:31:26.600 | How do you try to summarize a man?
00:31:28.400 | - Right. Let's say we can be accurate but not exhaustive,
00:31:33.200 | as any good data are accurate but not exhaustive.
00:31:38.220 | Very skilled and accomplished MMA fighter,
00:31:40.360 | very skilled and accomplished former special operations.
00:31:43.720 | And an American patriot and a podcaster too, right?
00:31:47.820 | Does he have his own podcast?
00:31:49.520 | - Maybe, maybe.
00:31:51.440 | - We know Andy Stumpf has his own podcast.
00:31:53.100 | - Yes, it's an amazing podcast, which is great.
00:31:55.540 | - Yeah, the Cleared Hot podcast with Andy Stumpf.
00:31:58.660 | - But also Tim Kennedy's like the embodiment of America
00:32:02.160 | and to the most beautiful and the most ridiculous degree.
00:32:05.640 | So he's like, would you imagine, what is it, Team America?
00:32:10.640 | That like, I just imagined him like shirtless on a tank
00:32:16.540 | rolling into enemy territory,
00:32:18.160 | just screaming at the top of his lungs.
00:32:19.900 | That's just his personality.
00:32:21.360 | - But not posturing, that's it, he actually does the work
00:32:24.060 | as they say.
00:32:24.900 | - So this is the thing, he really embodies that.
00:32:27.780 | Now, some of that is just his personality and humor.
00:32:30.840 | I'd like to sort of comment on the humor of things,
00:32:33.200 | not just with him.
00:32:34.320 | It's one other interesting thing I've learned.
00:32:36.740 | But also when he's actually helping people,
00:32:39.460 | he's extremely good at what he does,
00:32:41.680 | which is building teams that rescue,
00:32:44.280 | that go into the most dangerous areas of Ukraine,
00:32:47.500 | dangerous areas anywhere else, and they get the job done.
00:32:50.940 | And one of the things I heard time and time again,
00:32:54.520 | which is really interesting to me,
00:32:57.520 | is that Ukrainian soldiers said that, comparing Ukrainian,
00:33:02.780 | Russian, and American soldiers,
00:33:04.940 | American soldiers are the bravest,
00:33:07.840 | which was very interesting for me to hear,
00:33:09.460 | given how high the morale is for the Ukrainian soldiers.
00:33:12.360 | But that just reveals that training enables you to be brave.
00:33:16.420 | So it's not just about how well trained they are and so on,
00:33:20.160 | it's how intense and ferocious they are in the fighting.
00:33:23.320 | And that makes you realize, like, this is the American army,
00:33:26.960 | not just through the technology,
00:33:29.160 | especially the special forces guys,
00:33:31.420 | they're still one of the most effective
00:33:34.420 | and terrifying armies in the world.
00:33:36.720 | And listen, just for context,
00:33:38.860 | I'm somebody who is for the most part anti-war, a pacifist,
00:33:43.860 | but you get to see some of the realities of war
00:33:49.220 | that kind of wake you up to what needs to get done
00:33:52.800 | to protect sovereignty, to protect some of the values,
00:33:58.680 | to protect civilians and homes and all that kind of stuff.
00:34:01.700 | Sometimes war has to happen.
00:34:04.340 | And I should also mention the Russian side
00:34:06.140 | because while I haven't gotten to experience
00:34:10.640 | the Russian side yet, I do fully plan to travel to Russia.
00:34:14.540 | As I've told everybody,
00:34:15.380 | I was very upfront with everybody about this.
00:34:18.820 | I would like to hear the story of Russians,
00:34:21.140 | but I do know from the Ukrainian side, like the grandmas,
00:34:24.780 | I love grandmas, they told me stories
00:34:27.180 | that the Russians really,
00:34:28.660 | the ones that entered their villages,
00:34:31.960 | they really, really believe they're saving Ukraine
00:34:34.260 | from Nazis, from Nazi occupation.
00:34:36.480 | So they feel that there's the,
00:34:40.860 | Ukraine is under control of Nazi organizations
00:34:44.260 | and they believe they're saving the country.
00:34:47.020 | That's their brothers and sisters.
00:34:48.580 | So I think, I think propaganda
00:34:53.640 | and I think truth is a very difficult thing to arrive at
00:34:58.640 | in that war zone.
00:35:01.440 | I think in the 21st century,
00:35:02.820 | one of the things you realize that so much of war,
00:35:06.020 | even more so than in the past, is an information war.
00:35:09.800 | And people that just use Twitter
00:35:11.860 | for their source of information might be surprised
00:35:15.620 | to know how much misinformation there is on Twitter,
00:35:18.440 | like real narratives being sold.
00:35:22.060 | And so it's really hard to know who to believe.
00:35:25.120 | And through all of that,
00:35:27.100 | you have to try to keep an open mind
00:35:28.700 | and ultimately ignore the powerful
00:35:32.460 | and listen to actual citizens, actual people.
00:35:35.080 | That's the other maybe obvious lesson is that
00:35:37.900 | war is waged by powerful, rich people
00:35:44.060 | and it's the poor people that suffer.
00:35:45.960 | And that's just visible time and time again.
00:35:50.420 | - You mentioned the fact that people still enjoy food
00:35:53.980 | or the pleasure of cooking,
00:35:55.220 | or there's occasional humor or maybe frequent humor.
00:35:58.900 | I know Jocko Willink has talked about this in warfare
00:36:01.840 | and that all the elements of the human spirit
00:36:05.340 | and conditions still emerge at various times.
00:36:09.660 | I find this amazing.
00:36:10.500 | And you and I have had conversations about this before,
00:36:12.420 | but the aperture of the mind,
00:36:14.780 | the classic story that comes to mind
00:36:18.080 | is the one of Viktor Frankl or Nelson Mandela.
00:36:21.420 | You put somebody into a small box of confinement
00:36:24.140 | and some people break under those conditions
00:36:27.940 | and other people find entire stories
00:36:31.300 | within a centimeter of concrete that can occupy them.
00:36:36.300 | And real stories and richness or humor or love
00:36:40.060 | or fascination and surprise.
00:36:42.080 | And I find this so interesting that the mind is so adaptable.
00:36:45.260 | We talked about creature comforts
00:36:46.460 | and then lack of creature comforts
00:36:48.240 | and the way that we can adapt.
00:36:50.640 | And yet humans are always striving, it seems,
00:36:52.860 | or one would hope for these better conditions
00:36:55.080 | to better their conditions.
00:36:57.000 | So as you've come back and you've been here now
00:37:00.300 | back in the States for how long after your trip?
00:37:03.520 | - It depends on when this podcast is released,
00:37:04.920 | but it felt like I never left.
00:37:07.460 | So practically speaking, a couple of months.
00:37:11.620 | - Okay.
00:37:12.460 | Yeah, and we won't be shy.
00:37:13.280 | We're recording this mid-September.
00:37:15.600 | - But we actually recorded this several years ago.
00:37:18.320 | So we're anticipating the future.
00:37:20.320 | - You're going to start telling me this is a simulation.
00:37:22.760 | You and Joe.
00:37:24.120 | I'm still trying to figure out what that actually means.
00:37:27.600 | I'd like to take a quick break
00:37:29.000 | and acknowledge one of our sponsors, Athletic Greens.
00:37:32.040 | Athletic Greens, now called AG1,
00:37:34.500 | is a vitamin mineral probiotic drink
00:37:36.860 | that covers all of your foundational nutritional needs.
00:37:39.760 | I've been taking Athletic Greens since 2012.
00:37:42.440 | So I'm delighted that they're sponsoring the podcast.
00:37:44.660 | The reason I started taking Athletic Greens
00:37:46.260 | and the reason I still take Athletic Greens
00:37:48.340 | once or usually twice a day
00:37:50.320 | is that it gets me the probiotics
00:37:52.240 | that I need for gut health.
00:37:53.920 | Our gut is very important.
00:37:55.020 | It's populated by gut microbiota
00:37:57.580 | that communicate with the brain, the immune system,
00:37:59.320 | and basically all the biological systems of our body
00:38:01.740 | to strongly impact our immediate and long-term health.
00:38:05.380 | And those probiotics in Athletic Greens
00:38:07.240 | are optimal and vital for microbiota health.
00:38:11.060 | In addition, Athletic Greens contains
00:38:12.620 | a number of adaptogens, vitamins, and minerals
00:38:14.540 | that make sure that all of my foundational
00:38:16.420 | nutritional needs are met, and it tastes great.
00:38:19.860 | If you'd like to try Athletic Greens,
00:38:21.320 | you can go to athleticgreens.com/huberman,
00:38:24.720 | and they'll give you five free travel packs
00:38:26.700 | that make it really easy to mix up Athletic Greens
00:38:29.020 | while you're on the road, in the car,
00:38:30.300 | on the plane, et cetera.
00:38:31.580 | And they'll give you a year's supply of vitamin D3K2.
00:38:35.000 | Again, that's athleticgreens.com/huberman
00:38:37.680 | to get the five free travel packs
00:38:39.060 | and the year's supply of vitamin D3K2.
00:38:41.900 | I know I speak for many people
00:38:43.220 | when I say that we are very happy that you're back.
00:38:45.940 | We know that it's not going to be the first and last trip,
00:38:48.760 | that there will be others,
00:38:50.220 | and that you'll be going to Russia as well,
00:38:52.180 | and presumably other places as well in order to explore.
00:38:56.380 | And I have to say, as a podcaster and as your friend,
00:38:58.980 | I was really inspired that your sense of adventure
00:39:04.540 | and your sense of not just adventure,
00:39:07.620 | but thoughtful, respectful adventure,
00:39:09.140 | you understood what you were doing.
00:39:10.660 | You weren't just going there to get some wartime footage
00:39:12.780 | or something.
00:39:13.620 | This wasn't a kick or a thrill.
00:39:15.160 | This is really serious and remains serious.
00:39:18.280 | So thank you for doing it.
00:39:20.580 | And please, next time you go, bring Tim Kennedy again.
00:39:25.100 | - I feel like Tim Kennedy gets you into,
00:39:29.320 | we'll take it,
00:39:30.160 | 'cause he really loves going to the most dangerous places
00:39:32.380 | and helping people.
00:39:33.300 | So I think he'd get me into more trouble in his word.
00:39:36.140 | And I should mention that,
00:39:37.600 | I mean, there's many reasons I went,
00:39:41.220 | but it's definitely not something I take lightly
00:39:43.380 | or want to do again.
00:39:45.420 | So I'm doing things that I don't want to do.
00:39:47.820 | I just feel like I have to.
00:39:49.220 | - You're compelled.
00:39:50.060 | - So I don't think there's,
00:39:51.820 | now I'll definitely talk about it as we all should.
00:39:54.380 | There's different areas of the world
00:39:55.860 | that are seeing a lot of suffering.
00:39:57.700 | Yemen, there's so many atrocities
00:40:01.460 | going on in the world today,
00:40:03.220 | but this one is just personal to me.
00:40:06.540 | So I want to,
00:40:08.140 | I feel like I'm qualified just because of the language.
00:40:10.400 | So most of the talking, by the way,
00:40:12.340 | I was doing it, it was in Russian.
00:40:14.580 | And so because of the language,
00:40:16.380 | because of my history,
00:40:18.060 | I felt like I have to do this particular thing.
00:40:20.860 | I think it's in many ways stupid and dangerous.
00:40:24.180 | And that was made clear to me,
00:40:27.100 | but I do many things of this nature
00:40:28.780 | because the heart says, pulls,
00:40:30.720 | pulls towards that.
00:40:33.140 | But also there's a freedom to not,
00:40:36.420 | you know, I'm afraid of death,
00:40:38.740 | but I think there's a freedom to,
00:40:43.220 | it's almost like, okay, if I die,
00:40:45.760 | I want to take full advantage
00:40:48.200 | of not having a family currently.
00:40:50.640 | I feel like when you have a family,
00:40:52.520 | there's a responsibility for others.
00:40:55.080 | So you immediately become more conservative and careful.
00:40:58.160 | I feel like I want to take full advantage
00:41:00.180 | of this particular moment in my life
00:41:01.740 | when you can be a little bit more accepting of risk.
00:41:05.840 | - Well, you should definitely reproduce at some point.
00:41:09.060 | Maybe before next time, you should just freeze some sperm.
00:41:11.900 | Really?
00:41:14.400 | - Is that what you do with the ice bath?
00:41:15.520 | Is that how that works?
00:41:16.360 | - You know, it's interesting.
00:41:17.480 | There's always an opportunity to do some science protocols.
00:41:20.580 | You know, there are products on the internet
00:41:22.540 | and there are actually a few decent manuscripts
00:41:25.280 | looking at how cold exposure
00:41:27.260 | can increase testosterone levels,
00:41:29.980 | but it doesn't happen by the cold directly.
00:41:33.320 | Good scientists, as the authors of those papers were and are,
00:41:37.180 | realize that it's the vasoconstriction
00:41:39.540 | and then the vasodilation.
00:41:42.200 | You know, as people warm up again,
00:41:44.120 | there's increased blood flow to the testicles.
00:41:46.260 | And in women, it seems there's probably increased blood flow
00:41:48.660 | to the reproductive organs as well
00:41:50.240 | after people warm back up.
00:41:52.140 | So that seems to cause some sort of hyper nourishment
00:41:55.400 | of the various cells,
00:41:57.220 | the Sertoli and Leydig cells of the testes
00:42:00.060 | that lead to increased output of testosterone and in women
00:42:04.000 | testosterone as well.
00:42:05.040 | So the cold exposure in any case is obviously,
00:42:10.040 | do you do the ice bath?
00:42:11.140 | Are you into that?
00:42:11.980 | - I've not done that.
00:42:12.800 | - As a Russian, you probably consider that a hot tub.
00:42:15.080 | - Yeah, exactly.
00:42:16.860 | Yeah, it's a nice thing to have fun with
00:42:19.640 | every once in a while to warm up.
00:42:21.900 | No, I haven't done that.
00:42:23.300 | I've been kind of waiting to maybe do it together
00:42:24.980 | with you at some point.
00:42:25.940 | - Great, well, we have one here.
00:42:28.020 | No, we have one here.
00:42:29.180 | - It'll be straightforward for you.
00:42:30.580 | I always say that the adrenaline comes in waves.
00:42:32.720 | And so if you just think about it as walls,
00:42:34.180 | like you're going through a number of walls of adrenaline,
00:42:35.980 | as opposed to going for time, it becomes rather trivial
00:42:38.460 | with your jiu-jitsu background and whatnot,
00:42:39.980 | you'll immediately recognize the physiological sensation,
00:42:42.920 | even though it's cold specifically,
00:42:45.240 | it's the adrenaline that makes you want to hop
00:42:46.920 | out of the thing.
00:42:47.760 | - And you've seen Joe's.
00:42:49.300 | So Joe set up a really nice man cave,
00:42:53.300 | or it's not even a cave because it's so big.
00:42:55.700 | It's like a network of man caves,
00:42:58.540 | but it has an ice bath and a sauna next to each other.
00:43:03.240 | - We have one of those here, ice bath and sauna.
00:43:05.000 | So we'll have to get you in it one of these days,
00:43:08.340 | maybe tonight, maybe tomorrow.
00:43:10.420 | No, although there is a,
00:43:11.920 | I don't know the underlying physiological basis,
00:43:14.840 | but there does seem to be a trend toward truth telling
00:43:18.140 | in the sauna.
00:43:19.520 | Some people refer to them as truth barrels.
00:43:21.680 | Mine's a barrel sauna shaped like a barrel.
00:43:23.380 | Who knows why, maybe under intense heat duress,
00:43:25.780 | people just feel compelled to share.
00:43:28.380 | - I have a complicated relationship with saunas
00:43:30.840 | because of all the weight cutting.
00:43:32.740 | - Oh.
00:43:33.580 | - Some of the deepest suffering, sorry to interrupt,
00:43:35.620 | I've done was in the sauna.
00:43:37.460 | That's very, it's, I mean I've gone to some dark places
00:43:40.460 | in the sauna.
00:43:41.300 | 'Cause I wrestled my whole life, judo, jiu-jitsu,
00:43:45.120 | and those weight cuts can really test the mind.
00:43:49.280 | So you're truth telling.
00:43:52.480 | Yeah, it's a certain kind of truth telling
00:43:54.000 | 'cause you're sitting there and the clock moves slower
00:43:57.700 | than it has ever moved in your life.
00:44:00.380 | Yeah, so I usually, for the most part,
00:44:02.340 | I would try to have a bunch of sweats, garbage bags,
00:44:06.620 | and all that kind of stuff and run.
00:44:08.380 | It's easier 'cause you can distract the mind.
00:44:10.580 | In the sauna, you can't distract the mind.
00:44:12.460 | It's just you and all the excuses
00:44:15.100 | and all the weaknesses in your mind
00:44:17.940 | just coming to the surface and you're just sitting there
00:44:20.260 | and sweating or not sweating.
00:44:21.980 | That's the worst.
00:44:22.940 | - And talk about visual aperture, you're in a small box
00:44:25.140 | so it also inspires some claustrophobia,
00:44:27.060 | even if you're not claustrophobic.
00:44:28.820 | That's absolutely true.
00:44:31.300 | And the desire to just get out of the thing
00:44:34.260 | is where the, you get a pretty serious adrenaline surge
00:44:37.280 | from in the sauna as well.
00:44:39.900 | Now the sauna actually will, it won't deplete testosterone,
00:44:43.220 | but it kills sperm.
00:44:44.520 | So for people that, sperm are on a 60-day sperm cycle.
00:44:47.780 | So if you're trying to donate sperm
00:44:50.300 | 'cause that's what got us onto this,
00:44:51.420 | or fertilize an egg or eggs in whatever format,
00:44:55.660 | dish or in vivo, as we say in science,
00:44:57.740 | which means, well, you can look it up, folks.
00:45:00.860 | The 60-day sperm cycle,
00:45:03.940 | so if you go into a really hot sauna
00:45:05.660 | or a hot bath or a hot tub,
00:45:09.140 | in 60 days, a significantly greater portion
00:45:11.420 | of those sperm will be dead,
00:45:13.300 | will be non-viable.
00:45:14.840 | So there's a simple solution
00:45:16.140 | that people just put ice pack down there,
00:45:18.240 | or a jar, not this jar,
00:45:21.660 | but a jar of cold fluid between their legs
00:45:24.420 | and just sit there or they go back and forth
00:45:26.440 | between the ice bath and the sauna.
00:45:28.100 | But you probably, if you're going to go back over there,
00:45:30.840 | you should freeze sperm.
00:45:32.000 | We're going to do a couple episodes on fertility.
00:45:34.720 | While it's relatively inexpensive and you're young,
00:45:37.540 | you should probably do it now
00:45:38.380 | 'cause there is an association with autism
00:45:40.860 | as males get older.
00:45:42.740 | It's not a strong one, it's significant,
00:45:44.600 | but it's still a small contribution
00:45:46.260 | to the autism phenotype.
00:45:47.380 | - As you age, don't sperm get wiser or no?
00:45:50.660 | There's no science to back that.
00:45:51.940 | - No, but you know, men can conceive healthy children
00:45:54.560 | at a considerable age, but in any case,
00:45:58.260 | but no, they don't get wiser.
00:46:00.020 | - It's like a fasting age state.
00:46:02.180 | - Well, it's a little bit like the maturation of the brain
00:46:04.740 | in the sense that some of the sperm
00:46:06.260 | get much better at swimming
00:46:08.200 | and then many of them get less good.
00:46:10.020 | Motility is a strong correlate of the DNA quality of the sperm.
00:46:12.880 | - This is probably a good time to announce
00:46:14.260 | that I'm selling my sperm as NFTs.
00:46:16.440 | I want to see how much that writing the-
00:46:21.220 | - Well, your children, your future children
00:46:23.400 | and my future children are supposed to do jiu-jitsu together
00:46:26.180 | since I've only done the one jiu-jitsu class.
00:46:28.120 | So I'm strongly invested in you having children,
00:46:31.900 | but only in the friendly kind of way.
00:46:34.920 | - Well, yes, the friendly competition kind of way, yeah.
00:46:39.180 | Dominance of the clan, yep, for sure.
00:46:43.340 | - So moving on to science,
00:46:48.300 | but still with our minds in the Ukraine,
00:46:51.340 | did you encounter any scientists or see any universities
00:46:54.900 | or, you know, as we know in this country
00:46:57.580 | and in Europe and elsewhere, you know,
00:47:00.100 | science takes infrastructure, you need buildings,
00:47:02.540 | you need laboratories, you need robots,
00:47:04.580 | you need a lot of equipment and you need minus 80 freezers
00:47:09.580 | and you need incubators and you need money
00:47:12.020 | and you need technicians.
00:47:13.340 | And typically it's been the wealthier countries
00:47:15.600 | that have been able to do more research for the sake of research
00:47:19.180 | and development and productization.
00:47:21.780 | Certainly the Ukraine had some marvelous universities
00:47:25.180 | and marvelous scientists.
00:47:26.960 | What's going on with science and scientists over there?
00:47:32.260 | And gosh, can we even calculate the loss of discovery
00:47:37.020 | that is occurring as a consequence of this conflict?
00:47:40.540 | - So science goes on.
00:47:44.440 | Before the war, Ukraine had a very vibrant tech sector,
00:47:47.960 | which means engineering and all that kind of stuff.
00:47:50.500 | Kiev has a lot of excellent universities
00:47:53.340 | and they still go on.
00:47:54.620 | The biggest hit, I would say,
00:47:57.840 | is not the infrastructure of the science,
00:47:59.640 | but the fact that, because of the high morale,
00:48:01.880 | everybody is joining the military.
00:48:05.420 | So everybody's going to the front to fight,
00:48:07.440 | including, you know, you, Andrew Huberman,
00:48:10.020 | would be fighting and not because you have to,
00:48:12.500 | but because you want to.
00:48:14.200 | And everybody you know would be really proud
00:48:15.940 | that you're fighting.
00:48:17.060 | Even though everyone tries to convince, you know,
00:48:20.220 | Andrew Huberman, you have much better ways to contribute.
00:48:23.780 | There's deep honor in fighting for your country, yes,
00:48:26.980 | but there are better ways to contribute to your country
00:48:30.500 | than just picking up a gun that you're not that trained with
00:48:34.060 | and going to the front.
00:48:35.300 | Still, they do it.
00:48:36.480 | Scientists, engineers, CEOs, professors, students.
00:48:43.860 | - Men and women. - Actors, men and women.
00:48:46.060 | Obviously, primarily men, but men and women.
00:48:49.820 | Like, much more than you would see in other militaries.
00:48:53.560 | Women, everybody.
00:48:55.060 | Everybody wants to fight.
00:48:56.260 | Everybody's proud of fighting.
00:48:57.860 | There's no discussion of kind of pacifism.
00:49:02.860 | Should we be fighting?
00:49:04.020 | Is this right?
00:49:04.900 | Is this, you know, everybody's really proud of fighting.
00:49:08.100 | So there's this kind of black hole that pulls everything,
00:49:12.980 | all the resource into the war effort
00:49:15.580 | that's not just financial but also psychological.
00:49:19.180 | So it's like if you're a scientist, it feels like,
00:49:23.020 | it feels like almost like you're dishonoring humanity
00:49:29.540 | by continuing to do things you were doing before.
00:49:35.680 | There's a lot of people that converted to being soldiers.
00:49:38.120 | They literally watch a YouTube video
00:49:40.520 | of how to shoot a particular gun,
00:49:43.360 | or how to arm a drone with a grenade.
00:49:46.180 | If you're a tech person, you know how to work with drones.
00:49:49.240 | So you're gonna use that, use whatever skills you got,
00:49:52.180 | figure out whatever skills you got and how to use them
00:49:55.400 | to help the effort on the front.
00:49:57.340 | And so that's a big hit.
00:49:59.260 | But that said, I've talked to a lot of folks in Kiev,
00:50:02.660 | faculty, primarily in the tech economics space.
00:50:06.320 | So I didn't get a chance to interact with folks
00:50:08.580 | who are on the biology, chemistry,
00:50:11.500 | neuroscience side of things, but that still goes on.
00:50:15.100 | So one of the really impressive things about Ukraine
00:50:18.480 | is that they're able to maintain infrastructure
00:50:20.660 | like road, food supply, all that kind of stuff,
00:50:25.280 | education while the war's going on, especially in Kiev.
00:50:28.900 | The war started where nobody knew
00:50:32.200 | whether Kiev was gonna be taken by the Russian forces.
00:50:34.780 | It was surrounded.
00:50:36.860 | And a lot of experts from outside
00:50:41.800 | were convinced that Russia would take Kiev, and they didn't.
00:50:45.620 | And one of the really impressive things as a leader,
00:50:48.260 | one of the things I really experienced
00:50:50.620 | is that a lot of people criticized Zelensky before the war.
00:50:54.800 | He only had about like a 30% approval rating.
00:50:57.100 | A lot of people didn't like Zelensky.
00:50:59.580 | But one of the great things he did as a leader,
00:51:04.060 | which I'm not sure many leaders would be able to do,
00:51:06.620 | is when Kiev was clearly being invaded, he chose to stay.
00:51:11.620 | He stayed in the capital. Everybody,
00:51:15.300 | all the American military, the intelligence agencies,
00:51:20.140 | NATO, his own staff, advisors,
00:51:23.060 | all told him to flee, and he stayed.
00:51:25.460 | And so that's, I think that was a beacon,
00:51:28.620 | a symbol for the rest, for the universities,
00:51:30.780 | for science, for the infrastructure, that we're staying too.
00:51:34.740 | And that kept the whole thing going.
00:51:37.840 | There's an interesting social experiment that happened.
00:51:40.580 | I think for folks who are interested in sort of gun control
00:51:44.380 | in this country in particular,
00:51:46.260 | is one of the decisions they made early on
00:51:50.300 | is to give guns to everybody, semi-automatics.
00:51:54.340 | - Early on in the war.
00:51:55.540 | - Early on in the war, yeah.
00:51:57.020 | - So everybody got a gun.
00:51:59.140 | - They also released a bunch of prisoners from prison
00:52:02.260 | because there was no staff to keep the prisons running.
00:52:07.260 | And so there's a very interesting psychological experiment
00:52:13.140 | of like, how is this gonna go?
00:52:16.060 | Everybody has a gun.
00:52:17.460 | Are they gonna start robbing places?
00:52:18.940 | Are they going to start taking advantage
00:52:20.460 | of a chaotic situation?
00:52:22.140 | And what happened is that crime went to zero.
00:52:26.620 | So it turned out that this,
00:52:28.680 | as an experiment, worked wonderfully.
00:52:30.700 | - That's a case where love generalized.
00:52:33.020 | - Yes.
00:52:33.840 | - Or at least hate did not.
00:52:34.780 | We don't know if it's love or it's sort of a lack of initiative
00:52:37.460 | for, you know, self- or common-culture-directed hate.
00:52:40.460 | - Yeah, I don't, right.
00:52:42.020 | It's, I think that's very correct to say
00:52:45.840 | that it wasn't hate that was unifying people.
00:52:48.300 | It was love of country, love of community.
00:52:50.940 | It's probably the same thing that will happen
00:52:53.020 | to humans when, like, aliens invade.
00:52:55.020 | It's the common effort.
00:52:57.840 | Everybody puts everything else to the side.
00:53:00.420 | Plus just the sheer amount of guns.
00:53:03.460 | It's similar to like Texas.
00:53:04.820 | You realize like, well, there's going to be
00:53:07.700 | a self-correcting mechanism very quickly
00:53:09.720 | because the rule of law was also put aside, right?
00:53:12.980 | Like basically the police force lost a lot of power
00:53:17.980 | because everybody else has guns
00:53:19.940 | and they're kind of taking the law into their own hands.
00:53:22.560 | And that system, at least in this particular case,
00:53:25.740 | in this particular moment in human history worked.
00:53:29.780 | It's an interesting lesson, you know.
00:53:32.380 | - It is.
00:53:33.940 | I had an interesting contrast that I'll share with you.
00:53:36.460 | I, because you mentioned Texas.
00:53:38.300 | So not so long ago, I was in Austin.
00:53:40.420 | I often visit you or others in Austin, as you know.
00:53:43.100 | And many doors that I walked past, including a school,
00:53:46.960 | said no firearms past this point.
00:53:50.140 | You know, there's a sticker on the door.
00:53:51.460 | You see this on hospitals sometimes.
00:53:53.060 | I saw this at Baylor College of Medicine, et cetera.
00:53:55.660 | Relatively common to see in Texas.
00:53:59.180 | Not so common in California.
00:54:01.500 | And then I flew to the San Francisco Bay area.
00:54:04.780 | I was walking by an elementary school
00:54:07.380 | in my old neighborhood and saw a similar sticker
00:54:09.680 | and looked at it and it said,
00:54:11.820 | no peanuts or other allergy-containing foods
00:54:15.860 | past this point on the door of this elementary school.
00:54:18.420 | So quite a different contrast, like, you know,
00:54:20.060 | guns and peanuts.
00:54:21.940 | Now, peanut allergies obviously are very serious
00:54:24.340 | for some people, although there's great research
00:54:26.000 | out of Stanford showing that early exposure to peanuts
00:54:29.480 | can prevent the allergies.
00:54:31.480 | But don't start rubbing yourself in peanut butter, folks.
00:54:33.960 | If you have a peanut allergy,
00:54:34.920 | that's not the best way to deal with it.
00:54:36.040 | In any case, the contrast of what's dangerous,
00:54:39.560 | the contrast of, you know,
00:54:43.080 | the familiarity with guns versus no familiarity.
00:54:46.000 | You know, in Israel and elsewhere,
00:54:48.360 | you see machine guns in the airport.
00:54:49.740 | In Germany, Frankfurt, you see machine guns in the airport.
00:54:52.520 | Not so common in the United States.
00:54:54.960 | So again, I feel like there's this aperture of vision.
00:54:57.960 | There's this aperture of pleasures
00:55:00.840 | versus creature comforts and lack of creature comforts.
00:55:04.080 | And then there's this aperture of danger, right?
00:55:07.780 | People who are familiar with guns, you know,
00:55:09.200 | are familiar with people coming in
00:55:10.400 | and setting their firearm on the table and eating dinner,
00:55:13.180 | you know, but if you're not accustomed to that,
00:55:16.240 | it's jarring, right?
00:55:17.480 | - I should mention people know this throughout human history,
00:55:21.160 | but the human ability to get assimilated,
00:55:26.160 | no, get used to violence is incredible.
00:55:31.420 | So like you could be living in a peaceful time,
00:55:34.500 | like we're here now, and there'll be one explosion,
00:55:38.280 | like a 9/11 type of situation.
00:55:40.280 | That'd be a huge shock, terrifying.
00:55:42.120 | Everybody freaks out.
00:55:43.240 | The second one is a huge drop-off
00:55:45.760 | in how freaked out you get.
00:55:47.200 | And in a matter of days, sometimes hours,
00:55:50.640 | it becomes the normal.
00:55:52.360 | I've talked to so many people in Kharkiv,
00:55:54.720 | which is one of the towns that's seen a lot of heated battle.
00:55:58.980 | You ask them, is it safe there?
00:56:01.680 | In fact, when I went to the,
00:56:03.180 | closer and closer to the war zone,
00:56:06.800 | you ask people, is it safe?
00:56:09.280 | And their answer's usually, yeah, it's pretty safe.
00:56:12.400 | - It's all signal to noise.
00:56:13.860 | - Nobody has told me, except like Western reporters
00:56:19.960 | sitting in the West side of Ukraine,
00:56:22.000 | it's really dangerous here.
00:56:23.960 | Everyone's like, yeah, you know, it's good.
00:56:26.460 | Like my uncle just died yesterday.
00:56:29.780 | Like he was shot.
00:56:30.920 | But it's pretty, you know, it's pretty good.
00:56:33.800 | Like the farm's still running.
00:56:35.740 | Like they, how do I put it?
00:56:39.040 | They focus on the positive, that's one.
00:56:40.680 | But there's a deeper truth there,
00:56:42.720 | which is you just get used to difficult situations.
00:56:46.160 | And the stuff that makes you happy
00:56:47.440 | and the stuff that makes you upset is relative
00:56:50.200 | to that new normal that you establish.
00:56:52.180 | - Well, I grew up in California
00:56:53.680 | and there were a lot of earthquakes.
00:56:54.680 | I remember the '89 quake.
00:56:55.960 | I remember the Embarcadero Freeway
00:56:57.160 | collapsing, pancaking on top of people and cars.
00:56:59.520 | I remember I moved to Southern California.
00:57:01.680 | There was a Northridge quake.
00:57:02.820 | Wherever I moved, there seemed to be earthquakes.
00:57:04.360 | I never worry about earthquakes ever.
00:57:06.320 | I just don't.
00:57:07.160 | In fact, I don't like the destruction they cause,
00:57:09.380 | but every once in a while, an earthquake will roll through
00:57:11.240 | and it's kind of exciting.
00:57:12.060 | It sounds like a train coming through.
00:57:13.080 | It's like, wow, like the earth is moving.
00:57:14.640 | You know, again, I don't want anyone to get harmed,
00:57:16.920 | but I enjoy a good rumble coming through.
00:57:20.600 | Nonetheless, it's signal to noise.
00:57:23.560 | But if I saw a tornado, I'd freak out.
00:57:26.080 | And people from the Midwest are probably comfortable
00:57:28.220 | with, you know, Dan Gable,
00:57:29.280 | the great wrestler from the Midwest that you know,
00:57:30.880 | and I've never met, but I have great respect for.
00:57:32.440 | He probably, you know, he sees a tornado.
00:57:33.960 | It's like, ah, yeah, maybe, yeah.
00:57:35.760 | You know, so I think signal to noise is real.
00:57:38.720 | Before I neglect, although I won't forget,
00:57:44.660 | speaking of signal to noise,
00:57:46.420 | an environment you are returning to,
00:57:49.660 | or have gone back to one of your original natural habitats,
00:57:54.060 | which is the Massachusetts Institute of Technology,
00:57:57.960 | which is actually difficult to pronounce in full, MIT, right?
00:58:02.120 | So you've been spending some time there teaching
00:58:04.540 | and doing other things.
00:58:06.300 | Tell us what you're up to with MIT recently.
00:58:08.420 | - Well, I'm really glad that you being on the West Coast,
00:58:12.280 | know the difference in like Boston, New York.
00:58:14.200 | I feel like a lot of people think
00:58:15.640 | it's like in the East Coast.
00:58:17.520 | - Very different, especially to Bostonians and New Yorkers.
00:58:21.020 | - They get very aggressive.
00:58:22.800 | Yeah, I love it.
00:58:24.080 | I gave lectures there in front of an in-person crowd.
00:58:29.080 | - What were you talking about?
00:58:30.220 | - For the AI, so different aspects of AI,
00:58:33.080 | and you know, robotics, machine learning.
00:58:37.080 | So for people who know the artificial intelligence field,
00:58:40.060 | they usually don't use the term AI,
00:58:41.520 | and people from outside use AI.
00:58:43.740 | The biggest breakthroughs in the machine learning field,
00:58:46.820 | and some discussion of robotics and so on.
00:58:50.640 | Yeah, it was in person, it was wonderful.
00:58:52.940 | I'm a sucker for that.
00:58:54.440 | I really avoided teaching
00:58:56.400 | or any kind of interaction during COVID
00:58:59.900 | because people put a lot of emphasis on,
00:59:03.300 | but also got comfortable with remote teaching,
00:59:05.960 | and I think nobody enjoyed it,
00:59:08.480 | except sort of there's a notion
00:59:12.340 | that it's much easier to do
00:59:14.720 | because you don't have to travel,
00:59:17.720 | you don't have to,
00:59:18.560 | you can do it in your pajamas kind of thing,
00:59:20.920 | but when you actually get to do it,
00:59:23.400 | you don't get the same kind of joy
00:59:25.680 | that you do when you're teaching.
00:59:27.360 | As a student, you don't get the same kind of joy of learning.
00:59:30.620 | It's not as effective and all that kind of stuff.
00:59:32.440 | So to be in person together with people,
00:59:34.880 | to see their eyes, to get their excitement,
00:59:36.880 | to get the questions and all the interactions,
00:59:39.380 | yeah, it was awesome.
00:59:40.560 | And I'm still a sucker and a believer
00:59:45.200 | in the ideal of MIT, of the university.
00:59:48.760 | I think it's an incredible place.
00:59:50.020 | There's something in the air still,
00:59:52.240 | but it really hit,
00:59:53.320 | the pandemic hit universities hard because,
00:59:57.460 | and I can say this, this is not you saying it,
00:59:59.260 | this is me saying it,
01:00:01.080 | that administrations, as in all cases
01:00:04.980 | when people criticize institutions,
01:00:07.060 | the pandemic has given more power to the administration
01:00:10.180 | and taken away power from the faculty and the students.
01:00:12.780 | And that's from everybody involved,
01:00:14.800 | including the administration, that's a concern
01:00:17.060 | because the university is about the teachers
01:00:19.480 | and the students, that should be primary.
01:00:21.760 | And whenever you have a pandemic,
01:00:23.520 | there's an opportunity to increase the amount of rules.
01:00:25.860 | Like one of the things that really bothered me,
01:00:28.320 | and I'll scream from the top of the MIT Dome about this,
01:00:33.320 | is they've instituted a new Tim Tickets system,
01:00:37.800 | which is if you're a visitor to the campus at MIT,
01:00:41.200 | you have to register, you have to,
01:00:42.980 | first of all, show that you're vaccinated,
01:00:44.540 | but more importantly, there's a process to visiting.
01:00:47.520 | You need to get permission to visit.
01:00:49.980 | One of the reasons I loved MIT,
01:00:53.060 | unlike some other institutions,
01:00:55.180 | MIT just leaves the door open to anyone.
01:00:58.600 | In classrooms, you can roll in; the ridiculous characters,
01:01:02.660 | the students that are usually doing business stuff
01:01:06.560 | or economics, can roll into a physics class,
01:01:09.040 | and just, you're kinda not allowed, but it's a gray area,
01:01:13.000 | so you let that happen,
01:01:15.480 | and that creates a flourishing of the community.
01:01:17.480 | That was beautiful.
01:01:18.400 | And I think adding extra rules puts a squeeze on
01:01:22.680 | and limits some of the flourishing.
01:01:25.320 | And I hope some of that dissipates over time
01:01:27.800 | as we kinda let go of the risk aversion
01:01:32.800 | that was created by the pandemic
01:01:34.680 | as we kinda enter the normal, return back,
01:01:38.400 | some of that flourishing can happen.
01:01:39.800 | But when you're actually in there with the students,
01:01:43.600 | yeah, it was magic.
01:01:44.800 | I love it, I love it.
01:01:46.240 | - Well, some of your earliest videos on your YouTube channel
01:01:48.440 | were of you in the classroom, right?
01:01:50.640 | That's how this all started.
01:01:51.800 | - Yeah, yeah, that's how YouTube,
01:01:53.840 | like putting stuff on YouTube was terrifying, right?
01:01:58.200 | - Well, especially at the time when you did it. Again,
01:01:59.940 | you're a pioneer in that sense.
01:02:03.240 | You did that, Jordan Peterson did that.
01:02:04.920 | Putting up lectures is, yeah...
01:02:09.920 | I still teach; every winter I teach, direct a course,
01:02:13.180 | and I'll be doing even more teaching going forward.
01:02:15.380 | But the idea of those videos being on the web is,
01:02:20.380 | yeah, that spikes my cortisol a little bit.
01:02:22.840 | - Yeah, it's terrifying 'cause you get to,
01:02:24.840 | and everybody has a different experience.
01:02:26.400 | Like for me being a junior research scientist,
01:02:31.120 | the kind of natural concern is like, who am I?
01:02:35.160 | When I was giving this lecture,
01:02:36.840 | it's like, I don't deserve any of this.
01:02:39.520 | - Yeah, but that's your humility coming through.
01:02:41.040 | And I actually think that humility
01:02:42.800 | on the part of an instructor is good
01:02:44.400 | because those that think that they are entitled
01:02:48.900 | and who else could give this lecture, then I worry more.
01:02:52.640 | I think it's, I once heard,
01:02:54.600 | I don't know if it's still true that the,
01:02:56.600 | at Caltech, right,
01:02:57.520 | the great California Institute of Technology,
01:02:59.280 | not far from here,
01:03:01.040 | that many of the faculty are actually afraid
01:03:03.700 | of the students, not physically afraid,
01:03:05.800 | but they're intellectually afraid
01:03:07.040 | because the students are so smart.
01:03:09.580 | And teaching there can be downright frightening, I've heard.
01:03:13.160 | But that's great, keeps everybody on their toes.
01:03:15.600 | And I think, and you know,
01:03:18.000 | I've been corrected in lecture before at Stanford
01:03:20.320 | and elsewhere, you know,
01:03:21.560 | when my lab was at UC San Diego where someone will say,
01:03:24.000 | hey wait, you know, last lecture you said this
01:03:27.520 | and now you said that, and we're on the podcast, you know?
01:03:29.560 | And I think it's that moment where, you know,
01:03:32.020 | you sometimes feel that urge to defend and you go,
01:03:34.280 | oh, you're right.
01:03:35.720 | And I think it depends on how one was trained.
01:03:37.300 | My graduate advisor was wonderful at saying,
01:03:40.320 | I don't know all the time.
01:03:41.540 | And she went to Harvard, Radcliffe, UCSF, and Caltech,
01:03:45.520 | a brilliant woman, and had no problem saying like,
01:03:48.440 | I don't know.
01:03:49.260 | - I don't have that problem.
01:03:50.100 | So I usually have two guys that if somebody speaks up,
01:03:52.820 | grab them, drag them out of the room,
01:03:55.160 | never see them again.
01:03:56.960 | So everybody is really supportive.
01:03:58.400 | I don't understand the amount of love and support I get.
01:04:01.680 | - Especially when the last few students are there
01:04:03.560 | and everybody seems to be nodding as you're going.
01:04:06.160 | No, I think that I'd love to sit in on one of your lectures.
01:04:09.180 | I know very little about AI, machine learning, or robotics.
01:04:12.680 | - Have you ever talked at MIT?
01:04:14.120 | Have you ever like given lectures?
01:04:16.680 | - Oh yeah, when I went on the job market
01:04:19.440 | as a faculty member,
01:04:20.520 | my final two choices were between MIT Picower,
01:04:23.720 | I had an on-paper offer, wonderful place,
01:04:26.120 | wonderful place to do neuroscience,
01:04:27.840 | and UC San Diego, which is a wonderful neuroscience program.
01:04:31.960 | In the end, it made sense for me to be on the West Coast
01:04:34.000 | for personal reasons,
01:04:34.880 | but there's some amazing neuroscience going on there.
01:04:38.140 | Goodness.
01:04:38.980 | And that's always been true and it's going to continue.
01:04:41.440 | It's been a long time since I've been invited back there.
01:04:44.400 | Oddly enough, when I started doing more podcasting
01:04:47.360 | and I still run a lab,
01:04:49.080 | but I shrunk my lab considerably
01:04:51.640 | as I've done more podcasting,
01:04:53.440 | I've received fewer academic lecture invites,
01:04:56.140 | which makes sense.
01:04:57.060 | But now they're sort of coming back.
01:04:58.680 | And so when people invite now, I always say, you know,
01:05:00.440 | do you want me to talk about the ventral thalamus
01:05:04.080 | and its role in anxiety and aggression,
01:05:06.960 | or do you want me to talk about the podcast?
01:05:09.080 | And my big fear is I'm going to go back to give a lecture
01:05:10.880 | about the retina or something,
01:05:13.660 | and I'll start off with an Athletic Greens read
01:05:13.660 | or something like that, just reflexively.
01:05:15.820 | Just kidding, that wouldn't happen.
01:05:17.100 | But listen, I think it's great to continue
01:05:20.360 | to keep a foot in both places.
01:05:22.560 | I was so happy to hear that you're teaching at MIT
01:05:24.560 | because podcasting is one thing,
01:05:26.320 | teaching is another,
01:05:27.200 | and there's overlap there in the Venn diagram.
01:05:29.020 | But listen, the students that get to sit in
01:05:31.760 | on one of your lectures,
01:05:32.600 | and you may see me sitting there in the audience soon,
01:05:34.540 | when I creep into your class.
01:05:36.880 | - Sunglasses.
01:05:37.720 | - That's right, wearing a red shirt.
01:05:39.860 | You won't recognize me.
01:05:40.960 | Well, they are certainly receiving a great gift.
01:05:45.360 | I've watched your lectures on YouTube, even the early ones.
01:05:47.860 | And listen, I know you to be a phenomenal teacher.
01:05:52.100 | - Yeah, there's something about,
01:05:53.240 | so I'm also doing, like I said, I was up pretty late last night,
01:05:57.880 | working for a deadline on a paper.
01:06:00.420 | One of the things that I hope to do
01:06:03.560 | for hopefully the rest of my life
01:06:05.360 | is to continue publishing.
01:06:08.080 | And I think it's really important to do that,
01:06:12.240 | even if you continue the podcast,
01:06:14.400 | because you wanna be just on your own intellectual
01:06:18.400 | and scientific journey as you do podcasting.
01:06:22.400 | 'Cause at least for me,
01:06:23.920 | and especially on the engineering side,
01:06:25.520 | 'cause I wanna build stuff.
01:06:27.620 | And I think that's like keeps your ego in check,
01:06:32.120 | keeps you humble.
01:06:33.860 | Because I think if you talk too much on a microphone,
01:06:36.760 | you start getting, you might lose track of,
01:06:40.420 | you know, the grounding that comes from engineering,
01:06:43.880 | from science and the scientific process
01:06:45.680 | and the criticisms that you get, all that kind of stuff.
01:06:47.880 | - And how slow and iterative it is.
01:06:49.520 | We have two papers right now
01:06:50.700 | that are in the revision stage.
01:06:52.360 | And it's been a very long road.
01:06:54.400 | And I was asked this recently
01:06:55.780 | because I met with my chairman.
01:06:56.880 | He said, do you wanna continue to run a lab
01:06:58.200 | or are you just gonna go full-time on the podcast?
01:06:59.840 | And Stanford has been very supportive, I must say,
01:07:01.760 | as I know MIT has been of you.
01:07:03.820 | And I said, oh, I absolutely wanna continue
01:07:06.660 | to be involved in research and do research.
01:07:09.560 | And we started talking about these papers
01:07:10.840 | and we're looking over my,
01:07:11.760 | this was my yearly review and looking back,
01:07:13.520 | like goodness, these papers have been in play
01:07:15.080 | for a very long time.
01:07:16.200 | So it's a long road, but you learn more and more.
01:07:19.300 | And the more time you spend, you know,
01:07:20.940 | myopically looking at a bunch of data
01:07:23.200 | the more you learn and the more you think.
01:07:24.860 | I totally agree.
01:07:26.060 | You know, talking into these devices for podcasts
01:07:28.080 | is wonderful 'cause it's fun.
01:07:29.960 | It relieves a certain itch that we both have
01:07:31.960 | and hopefully it lands some important information
01:07:35.180 | out there for people.
01:07:36.020 | But doing research is like the,
01:07:38.460 | I, you know, I guess if you know, you know,
01:07:42.200 | there's like the, you know, the unpeeling of the onion,
01:07:45.640 | knowing that there could be something there.
01:07:47.580 | There's just nothing like it.
01:07:49.520 | - I mean, you do, especially with the pandemic.
01:07:52.540 | And for me, both Twitter and the podcast
01:07:57.020 | have made me much more impatient
01:07:58.780 | about the slowness of the review process because--
01:08:01.940 | - Twitter will do that.
01:08:03.240 | - Twitter will do that.
01:08:04.080 | But even with podcasts, you have a cool,
01:08:06.020 | you'll find something cool and then you have ideas
01:08:08.420 | and all, and you'll just say them
01:08:10.160 | and it'll be out pretty quickly.
01:08:12.000 | - Then we do a post right now about something
01:08:13.700 | that we both found interesting and it's out in the world.
01:08:15.960 | - And you can write up something like,
01:08:18.000 | there is a culture in computer science of posting stuff
01:08:20.380 | on arXiv as preprints that don't get peer reviewed.
01:08:23.700 | And sometimes they don't even go through the review process
01:08:25.940 | ever because like people just start using them if it's code.
01:08:29.300 | And it's like, what's the point of this?
01:08:31.220 | It works, like, it's self-evident that it works
01:08:34.900 | because people are using it.
01:08:36.740 | And that I think applies more to engineering fields
01:08:40.260 | 'cause it's an actual tool that works.
01:08:42.300 | It doesn't matter if it,
01:08:43.340 | you don't have to scientifically prove that it works.
01:08:45.700 | It works 'cause it's in use by a lot of people.
01:08:47.740 | - Well, sorry to interrupt,
01:08:48.580 | but I'll just say, for point of reference,
01:08:50.760 | the famous paper describing the double helix,
01:08:52.980 | which earned Watson and Crick the Nobel Prize
01:08:55.200 | and should have earned Rosalind Franklin the Nobel Prize too,
01:08:58.500 | of course, but they got it for the structure of DNA.
01:09:02.360 | Of course, that paper was never reviewed at Nature.
01:09:05.100 | They published it because its importance was self-evident
01:09:08.700 | or whatever they just said.
01:09:10.060 | - So like the editors.
01:09:11.500 | - It was a purely editorial decision,
01:09:13.100 | I believe, I mean, that's what I was told
01:09:14.640 | by someone who's currently an editor at Nature.
01:09:17.620 | If that turns out to not be correct,
01:09:19.300 | someone will tell us in the comments for sure.
01:09:21.340 | - Well, I think--
01:09:22.180 | - That's pretty interesting, right?
01:09:23.020 | - That's really interesting.
01:09:23.860 | - Perhaps the most significant discovery in biology
01:09:26.440 | and bioengineering,
01:09:27.780 | which is leading to bioengineering as well, of course,
01:09:30.540 | of the last century was not peer reviewed.
01:09:34.140 | - Yeah, but so Eric Weinstein,
01:09:36.400 | but many others have talked about this,
01:09:38.800 | which is, I mean, I don't think people understand
01:09:43.700 | how poor the peer review process is,
01:09:48.380 | just the amount of, 'cause you think peer review,
01:09:50.860 | it means all the best peers get together
01:09:54.720 | and they review your stuff, but it's unpaid work
01:09:57.720 | and it's usually a small number of people
01:09:59.340 | and it's a very, they have a very select perspective
01:10:01.700 | so they might not be the best person,
01:10:03.180 | especially if it's super novel work.
01:10:04.940 | - And it's who has time to do it.
01:10:06.160 | I'm on a bunch of editorial boards still,
01:10:08.100 | why I don't know, but I enjoy the peer review process
01:10:11.180 | and sending papers out.
01:10:12.260 | Oftentimes the best scientists are very busy
01:10:14.380 | and don't have time to review.
01:10:15.780 | And oftentimes the more premier journals
01:10:19.540 | will select from a kind of a unique pool
01:10:21.940 | of very good scientists who are very close to the work.
01:10:26.100 | Sometimes the people are very far from the work.
01:10:27.940 | It really depends.
01:10:28.780 | - And both have negatives, right?
01:10:30.100 | If you're very close to the work,
01:10:31.920 | there's jealousy and all those basic human things,
01:10:33.980 | very far from the work, you might not appreciate
01:10:35.940 | the nuanced contribution, all that kind of stuff.
01:10:38.620 | - And there's psychology, sorry to interrupt again,
01:10:40.460 | but a good friend of mine who's an extremely successful
01:10:43.260 | neuroscientist, Howard Hughes investigator, et cetera,
01:10:45.860 | always told me that they,
01:10:47.640 | I won't even say who they are,
01:10:49.860 | they select their reviewers on the basis
01:10:52.100 | of who has been publishing very well recently,
01:10:55.220 | 'cause they assume that that person
01:10:56.340 | is going to be more benevolent
01:10:57.420 | because they've been doing well so that the love expands.
01:11:00.980 | - That's a good point, actually.
01:11:02.380 | But the idea is that editors
01:11:05.300 | might actually be the best reviewers.
01:11:07.840 | So that was the traditional,
01:11:09.180 | that's the thing I wanted to mention
01:11:10.860 | that Eric Weinstein talks about,
01:11:12.460 | that back several decades ago, editors had much more power.
01:11:16.700 | And there's something to be said for that,
01:11:18.020 | 'cause they, editors are the ones who are responsible
01:11:22.220 | for crafting the journal.
01:11:24.380 | They really are invested in this.
01:11:26.260 | And they're also often experts, right?
01:11:29.900 | So it makes sense for an editor
01:11:31.140 | to have a bit of power in this case.
01:11:33.620 | Usually if an idea is truly novel, you could see it.
01:11:38.100 | And so it makes sense for an editor
01:11:41.060 | to have more power in that regard.
01:11:42.700 | Of course, for me, I think peer reviews
01:11:45.180 | should be done the way tweets are done,
01:11:46.700 | which is like crowdsourced or Amazon reviews.
01:11:49.900 | - Let the crowd decide.
01:11:50.740 | - Let the crowd decide.
01:11:51.940 | And let the crowd add depth and breadth
01:11:56.940 | and context for the contribution.
01:12:01.260 | So if the paper overstates the degree of contribution,
01:12:06.260 | the crowd will check you on that.
01:12:08.580 | If there's not enough support,
01:12:10.660 | or like the conclusions are not supported by the evidence,
01:12:14.260 | the crowd will check you on that.
01:12:16.380 | There could be, of course, political bickering
01:12:19.380 | that enters the picture,
01:12:20.340 | especially on very controversial topics.
01:12:22.520 | But I think I trust the intelligence of human beings
01:12:24.820 | to figure that out.
01:12:25.660 | And I think most of us are trying
01:12:27.980 | to figure this whole process out.
01:12:29.940 | I just wish it was happening much faster,
01:12:32.660 | because on the important topics,
01:12:34.940 | the review cycle could be faster.
01:12:37.260 | And we learned that through COVID,
01:12:38.740 | that Twitter was actually pretty effective
01:12:41.920 | at doing science communication.
01:12:43.980 | It was really interesting.
01:12:45.620 | Some of the best scientists took to Twitter
01:12:48.340 | to communicate their own work and other people's work,
01:12:52.020 | and always putting in sort of the caveats
01:12:54.060 | that it's not peer reviewed and so on,
01:12:56.320 | but it's all out there.
01:12:58.020 | And the data just moves so fast.
01:13:00.160 | And if you want stuff to move fast,
01:13:02.340 | Twitter is the best medium of communication for that.
01:13:04.900 | It's cool to see.
01:13:06.140 | I'd like to take a brief break
01:13:07.780 | and thank our sponsor, InsideTracker.
01:13:10.860 | InsideTracker is a personalized nutrition platform
01:13:13.340 | that analyzes data from your blood and DNA
01:13:15.860 | to help you better understand your body
01:13:17.520 | and help you reach your health goals.
01:13:19.340 | I've long been a believer in getting regular blood work done
01:13:22.000 | for the simple reason that many of the factors
01:13:24.420 | that impact your immediate and long-term health
01:13:26.440 | can only be analyzed from a quality blood test.
01:13:29.020 | The problem with a lot of blood and DNA tests out there,
01:13:31.100 | however, is that you get data back about metabolic factors,
01:13:34.640 | lipids and hormones and so forth,
01:13:36.180 | but you don't know what to do with those data.
01:13:37.940 | InsideTracker solves that problem
01:13:39.700 | and makes it very easy for you to understand
01:13:42.020 | what sorts of nutritional, behavioral,
01:13:45.040 | maybe even supplementation-based interventions
01:13:47.880 | you might want to take on
01:13:49.040 | in order to adjust the numbers of those metabolic factors,
01:13:51.640 | hormones, lipids, and other things
01:13:53.420 | that impact your immediate and long-term health
01:13:55.380 | to bring those numbers into the ranges
01:13:57.660 | that are appropriate and indeed optimal for you.
01:14:00.260 | If you'd like to try InsideTracker,
01:14:01.660 | you can visit insidetracker.com/huberman
01:14:04.420 | to get $200 off an ultimate plan or 34% off the entire site
01:14:09.400 | as a special Black Friday deal
01:14:10.940 | now through the end of November.
01:14:12.660 | Again, that's insidetracker.com/huberman
01:14:15.540 | and use the code Huberman at checkout.
01:14:17.760 | I'm now on Twitter more regularly.
01:14:20.800 | And initially it was just Instagram.
01:14:22.560 | And I remember you and I used to have these
01:14:25.540 | over dinner drink conversations where I'd say,
01:14:28.420 | "I don't understand Twitter."
01:14:29.820 | And you'd say, "I don't understand Instagram."
01:14:31.820 | And of course we understand how it worked
01:14:33.260 | and how to work each respective platform.
01:14:35.580 | But I think we were both trying to figure out
01:14:38.220 | what is driving the psychology of these different venues?
01:14:41.340 | 'Cause they are quite distinct psychologies
01:14:44.180 | for whatever reason.
01:14:45.420 | I think I'm finally starting to understand Twitter
01:14:47.220 | and enjoy it a little bit.
01:14:49.180 | Initially, I wasn't prepared for the level
01:14:52.900 | of kind of reflexive scrutiny,
01:14:56.100 | that it sounds a little bit oxymoronic,
01:14:57.960 | but that people kind of like pick up on one small thing
01:14:59.760 | and then drive it down that trajectory.
01:15:01.900 | It didn't seem to be happening quite as much on Instagram,
01:15:04.380 | but I love your tweets.
01:15:06.060 | I do have a question about your Twitter account
01:15:10.460 | and how you, do you have sort of internal filters
01:15:12.940 | of what you'll put up and won't put up?
01:15:15.300 | Because sometimes you'll put up things
01:15:17.300 | that are about life and reflections.
01:15:18.740 | Other times you'll put up things
01:15:21.260 | like what you're excited about in AI,
01:15:23.020 | or of course point to various podcasts,
01:15:25.940 | including your own, but others as well.
01:15:27.940 | How do you approach social media?
01:15:30.980 | Not how do you regulate your behavior on there
01:15:33.340 | in terms of how much time, et cetera.
01:15:34.820 | I know you've talked about that before,
01:15:35.940 | but what's your mindset around social media
01:15:40.060 | when you go on there to either post or forage
01:15:45.060 | or respond to information?
01:15:47.980 | - I think I try to add some, not to sound cliche,
01:15:52.980 | but some love out there into the world,
01:15:56.940 | into as OJ Simpson calls it, Twitter world.
01:16:00.460 | I think there is this viral negativity that can take hold
01:16:05.460 | and I try to find the right language
01:16:09.180 | to add good vibes out there.
01:16:12.260 | And it's actually really, really tricky
01:16:14.180 | because there's something about positivity that sounds fake.
01:16:19.180 | And I'm not, I can't quite put my finger on it,
01:16:22.400 | but whenever I talk about love and the positive
01:16:27.000 | and almost childlike in my curiosity and positivity,
01:16:30.420 | people start to think like,
01:16:32.920 | surely he has skeletons in the closet.
01:16:36.740 | There's dead bodies in his basement.
01:16:38.800 | This must be a fake--
01:16:40.180 | - It's the attic.
01:16:41.520 | - It's the attic?
01:16:42.360 | - The attic.
01:16:43.180 | - I keep mine in the basement.
01:16:44.020 | That's the details.
01:16:45.260 | - I was referring to your attic.
01:16:46.240 | I don't have an attic or a basement, nor dead bodies.
01:16:48.900 | I just want to be very clear.
01:16:50.040 | - Yeah, I do have an attic
01:16:52.320 | and actually haven't been up to it,
01:16:55.060 | maybe there are bodies up there.
01:16:56.520 | But yes, I prefer the basement, it's colder down there.
01:16:59.020 | I like it.
01:16:59.860 | No, but there's an assumption that this is not genuine
01:17:04.120 | or it's disingenuous in some kind of way.
01:17:07.820 | And so I try to find the right language
01:17:10.160 | for that kind of stuff, how to be positive.
01:17:12.820 | Some of it, I was really inspired
01:17:15.300 | by Elon's approach to Twitter.
01:17:18.140 | Not all of it, but the one he just is silly.
01:17:23.180 | I found that silliness, I think it's Hermann Hesse said,
01:17:28.180 | something to paraphrase one of my favorite writers,
01:17:35.060 | I think in Steppenwolf, said,
01:17:39.300 | "Learn what is to be taken seriously and laugh at the rest."
01:17:44.300 | I think I try to be silly, laugh at myself,
01:17:49.160 | laugh at the absurdity of life,
01:17:51.940 | and then in part, when I'm serious,
01:17:53.840 | try to just be positive, just see a positive perspective.
01:17:58.840 | But, and also, as you said,
01:18:02.220 | people pick out certain words and so on
01:18:04.240 | and they attack each other, attack me
01:18:06.180 | over certain usage of words.
01:18:07.940 | And in particular, I think the thing I try to do
01:18:11.460 | is think positively towards them.
01:18:14.580 | Do not escalate.
01:18:15.900 | So whenever somebody's criticizing me and so on,
01:18:18.700 | I just smile.
01:18:20.860 | If there's a lesson to be learned, I learn it.
01:18:23.160 | And then I just send good vibes their way.
01:18:26.220 | Don't respond.
01:18:27.500 | And just hopefully sort of through karma
01:18:30.840 | and through kind of the ripple effect of positivity,
01:18:35.220 | have an impact on them and the rest of Twitter.
01:18:38.940 | And what you find is that builds,
01:18:42.740 | your actions create the community.
01:18:45.220 | So how I behave gets me surrounded by certain people.
01:18:49.380 | But lately, especially Ukraine is one topic like this.
01:18:54.340 | I also thought about talking to somebody
01:18:57.900 | who reached out to me, Andrew Tate,
01:18:59.980 | who's extremely controversial.
01:19:02.100 | From the perspective of a lot of people as a misogynist.
01:19:05.220 | And--
01:19:06.060 | - I've heard his name and I know
01:19:07.380 | that there's a lot of controversy around him.
01:19:09.060 | Maybe you could familiarize me.
01:19:11.340 | I've been pretty nose down in podcast prep
01:19:13.300 | and I tried to do this vacation thing
01:19:15.500 | for about three, four weeks.
01:19:17.060 | - I've heard about that.
01:19:17.900 | And it sort of worked.
01:19:20.580 | I did get some time in the Colorado wilderness by myself,
01:19:23.500 | which was great.
01:19:24.340 | I did get some downtime,
01:19:29.100 | but in any event, it mainly consists of reading and--
01:19:33.780 | - In nature.
01:19:35.420 | - Reading and nature, sauna, ice bath, working out,
01:19:40.060 | good food, a little extra sleep, these kinds of things.
01:19:42.420 | I really felt I needed it.
01:19:43.620 | But I am pretty naive when it comes to the kind
01:19:47.980 | of current controversies, but I've heard his name
01:19:50.740 | and I think he's been deplatformed
01:19:52.780 | on a couple of platforms.
01:19:53.680 | Do I have that right?
01:19:55.060 | - So I should also admit that while I might know more
01:19:58.400 | than you, it's not by much.
01:20:01.420 | So it's like a five-year-old talking
01:20:03.180 | to a four-year-old right now.
01:20:04.540 | - Is he an athlete, a podcaster?
01:20:06.540 | - So basic summary, he used to be a fighter,
01:20:11.180 | a kickboxer, I believe, was pretty successful.
01:20:14.540 | And then during that and after that,
01:20:18.920 | I think he was in a reality show
01:20:21.260 | and he had all these programs that are basically
01:20:24.860 | like pickup artist advice.
01:20:26.980 | He has this community of people where he gives advice
01:20:29.860 | on how to pick up women, how to be successful
01:20:32.820 | in relationships, how to make a lot of money.
01:20:35.140 | And there's like, it costs money to enter those programs.
01:20:39.140 | So a lot of the criticism that he gets is kind of,
01:20:41.960 | it's like a pyramid scheme where you convince people
01:20:48.000 | to join so that they can make more money
01:20:49.860 | and then they convince others to join, that kind of stuff.
01:20:52.320 | But that's not why I'm interested in talking to him.
01:20:55.060 | I'm interested because one of the guests,
01:20:57.580 | maybe I shouldn't mention who,
01:20:58.840 | but one of the female guests I had,
01:21:01.100 | really a big scientist, said that her two kids
01:21:06.380 | that are 13 and 12 really look up to Andrew.
01:21:10.560 | - Male children for male children?
01:21:13.820 | - Yeah, male.
01:21:14.980 | And I hear this time and time again.
01:21:16.580 | So he is somebody that a lot of teens, young teens,
01:21:21.420 | look up to.
01:21:22.580 | So I haven't done serious research.
01:21:26.900 | I usually try to avoid doing research
01:21:28.660 | until I agree to talk and then I go deep.
01:21:32.880 | But there is an aspect to the way he talks about women
01:21:37.880 | that while I understand, and I understand certain dynamics
01:21:44.040 | in relationships work for people,
01:21:46.040 | and he's one such person,
01:21:48.200 | but I think him being really disrespectful towards women
01:21:54.160 | is not what I,
01:21:55.760 | it's not how I see what it means to be a good man.
01:22:02.000 | So the conversation I want to have with him
01:22:04.100 | is about masculinity.
01:22:05.360 | What does masculinity mean in the 21st century?
01:22:08.220 | And so when I think about that kind of stuff,
01:22:11.820 | 'cause we were talking about Twitter,
01:22:14.100 | it's like going into a war zone.
01:22:16.420 | I'm like a happy-go-lucky person, but you're not-
01:22:20.500 | - Send me to the Ukraine,
01:22:21.620 | but I don't want to have this conversation on Twitter.
01:22:24.020 | - Because it's a really, really, really tricky one.
01:22:27.560 | Because also, as you know, when you do a podcast,
01:22:32.340 | everybody wants you to win.
01:22:35.420 | Everything you do is positive.
01:22:40.340 | Maybe you'll say the wrong thing.
01:22:41.800 | It's like an inaccurate thing and you can correct yourself.
01:22:45.620 | With Andrew Tate, with Donald Trump, with folks like this,
01:22:50.380 | you have to, I mean, it's like professional boxing.
01:22:53.540 | I think you have to push the person.
01:22:55.620 | You have to be really eloquent.
01:22:57.520 | You have to be also empathetic,
01:22:58.660 | 'cause you can't just do what journalists do,
01:23:00.220 | which is talk down to the person the entire time.
01:23:02.220 | That's easy.
01:23:03.220 | The hard thing is to empathize with the person,
01:23:05.140 | to understand them, to steel man their case,
01:23:08.260 | but also to make your own case.
01:23:10.020 | So in that case, about what it means to be a man,
01:23:12.720 | to me, a strong man is somebody who's respectful to women.
01:23:16.620 | Not out of weakness, not out of social justice
01:23:18.760 | warrior signaling and all that kind of stuff,
01:23:20.580 | but out of, that's what a strong man does.
01:23:24.060 | They don't need to be disrespectful
01:23:26.060 | to prove their position in life.
01:23:28.100 | He is often, now, a lot of people say it's a character.
01:23:31.480 | He's being misogynistic.
01:23:34.780 | He's being a misogynist as a kind of,
01:23:36.840 | for entertainment purposes.
01:23:37.960 | - So like an avatar.
01:23:39.280 | - Yeah.
01:23:40.440 | But to me, that avatar has a lot of influence on young folks.
01:23:44.720 | So the character has impact.
01:23:49.400 | - Oh, I don't think you can separate the avatar
01:23:51.360 | and the person in terms of the impact, as you said.
01:23:55.200 | In fact, there are a number of accounts
01:23:57.280 | on Twitter and Instagram and elsewhere,
01:23:59.200 | which people have only revealed their first names
01:24:01.840 | or they give themselves another name
01:24:03.360 | or they're using a cartoon image.
01:24:05.320 | And part of that, I believe,
01:24:07.560 | and at least from some of these individuals
01:24:09.160 | who actually know who they are,
01:24:10.420 | I understand is, A, an attempt to maintain their privacy,
01:24:13.760 | which is important to many people.
01:24:16.300 | And in some cases, so that they can be more inflammatory
01:24:22.960 | and then just pop up elsewhere as something else
01:24:25.360 | without anyone knowing that it's the same person.
01:24:27.920 | - Some of the, this is the dark stuff.
01:24:29.440 | I've been reading a lot about Ukraine and Nazi Germany.
01:24:34.060 | So the '30s and the '40s and so on.
01:24:36.000 | And you get to see how much the absurdity
01:24:39.000 | turns to evil quickly.
01:24:40.440 | One of the things I worry,
01:24:42.060 | one of the things I really don't like to see
01:24:44.120 | on Twitter and the internet
01:24:45.640 | is how many statements end with LOL.
01:24:48.880 | It's like you think just because something is kind of funny
01:24:53.880 | or is funny or is legitimately funny,
01:24:58.160 | it also doesn't have a deep effect on society.
01:25:02.360 | So that's such a difficult gray area
01:25:05.600 | because some of the best comedy is dark and mean,
01:25:09.840 | but it reveals some important truth
01:25:11.480 | that we need to consider.
01:25:12.720 | But sometimes comedy is just covering up
01:25:16.360 | for destructive ideology.
01:25:20.720 | And you have to know the line between those two.
01:25:23.200 | Hitler was seen as a joke in the late '20s and the '30s,
01:25:26.880 | the Nazi Germany, until the joke became very serious.
01:25:30.560 | You have to be careful to know the difference
01:25:33.240 | between the joke and the reality and do all that.
01:25:36.800 | I mean, in a conversation,
01:25:38.720 | I'm just such a big believer in conversation
01:25:41.040 | to be able to reveal something through conversation.
01:25:44.840 | But I don't know one of the big,
01:25:47.280 | you and I challenge ourselves all the time.
01:25:49.560 | I don't know if I have what it takes
01:25:52.320 | to have a good, empathetic, but adversarial conversation.
01:25:57.320 | - I need to learn more about this Tate person
01:26:02.400 | or not learn about it. - Or not.
01:26:03.520 | - Yeah, it sounds like maybe it's something to skip.
01:26:05.800 | I don't know 'cause again,
01:26:07.240 | I'm not familiar with the content,
01:26:08.480 | but I was gonna ask you whether or not you've sought out
01:26:11.680 | or whether or not you would ever consider having Donald Trump
01:26:14.000 | as a guest on your podcast.
01:26:15.480 | - Yeah, I've talked to Joe a lot about this,
01:26:19.400 | I really believe I can have a good conversation
01:26:28.280 | with Donald Trump,
01:26:29.200 | but I haven't seen many good conversations with him.
01:26:35.080 | So part of me thinks, part of me believes it's possible,
01:26:41.200 | but he often effectively runs over the interviewer.
01:26:46.200 | - Yeah, you can sit him down,
01:26:47.840 | give him an LMNT, an Athletic Greens.
01:26:50.240 | - Just relax.
01:26:51.180 | - I mean that nice, cool,
01:26:53.080 | air conditioned black curtain studio you've got,
01:26:56.240 | and a different side might come out.
01:26:58.420 | Context is powerful.
01:26:59.840 | - Well, Joe's really good at this,
01:27:01.720 | which is relaxing the person.
01:27:03.960 | Like here, have a drink, smoke a joint or whatever it is,
01:27:08.800 | but this energy of just let's relax,
01:27:10.980 | and there's laughter and so on.
01:27:12.640 | I don't think, as people know,
01:27:16.640 | I'm just not good at that kind of stuff.
01:27:19.440 | So I think the way I could have a good conversation with him
01:27:23.240 | is to really understand his worldview,
01:27:25.340 | be able to steel man his worldview,
01:27:28.100 | and those that support him,
01:27:29.740 | which, I'm sorry to say
01:27:32.200 | for people who seem to hate Donald Trump,
01:27:34.100 | is a very large percentage of the country.
01:27:36.160 | And so you have to really empathize
01:27:38.440 | with those people, you have to empathize with Donald Trump,
01:27:41.040 | the human being, and from that perspective,
01:27:45.180 | ask him hard questions.
01:27:46.800 | - Who do you think is the counterpoint?
01:27:50.060 | If you're going to seek balance in your guests,
01:27:52.920 | if you're going to have Trump on,
01:27:54.280 | then you have to have who on?
01:27:57.040 | - Well, it's interesting.
01:27:58.600 | - Anthony Fauci?
01:27:59.460 | Seems to be strongly associated with sort of counter values,
01:28:04.980 | at least in the eye of the public.
01:28:07.900 | I think he's retiring soon, but.
01:28:09.520 | - Yeah, he's retiring soon.
01:28:12.180 | That's really interesting, Anthony Fauci.
01:28:14.340 | Yeah, definitely, but I don't think he's a counterbalance.
01:28:17.580 | He's a complicated, fascinating figure
01:28:21.980 | who seems to have attracted a lot of hate and distrust,
01:28:25.700 | but also- - And love from some people.
01:28:27.020 | - And love. - And love from some people.
01:28:28.220 | I mean, I know people, not even necessarily scientists,
01:28:32.620 | who have pro-Fauci shirts.
01:28:35.560 | I've seen people with anti-Fauci shirts too, excuse me.
01:28:38.200 | But certainly there are people who adore him.
01:28:40.820 | There are people who adore him.
01:28:42.180 | In the same way, there are people that adore Trump.
01:28:43.860 | It's so interesting that one species of animal
01:28:47.680 | has such divergent neural circuitry.
01:28:49.840 | - It almost feels like it's by design
01:28:52.240 | that every single topic will find tension and division.
01:28:56.080 | It's fascinating to watch.
01:28:57.300 | I mean, I got to really witness it go from zero to 100
01:29:01.360 | in Ukraine, where there wasn't huge, significant division.
01:29:06.280 | There was in certain parts of Ukraine,
01:29:09.040 | but across Europe, across the world,
01:29:11.680 | there was not that much division between Russia and Ukraine.
01:29:14.580 | And it was just born overnight, this intense hatred.
01:29:18.240 | So, and you see the same kind of stuff
01:29:20.160 | with Fauci over the pandemic.
01:29:22.720 | At first, we were all kind of huddled in uncertainty;
01:29:27.180 | there was a kind of togetherness with the pandemic.
01:29:29.780 | Of course, it was more difficult
01:29:31.060 | 'cause you're isolated. But then, probably,
01:29:33.940 | the politicians and the media
01:29:37.060 | tried to figure out, how can I take a side here,
01:29:39.380 | and how can I now start reporting on this side
01:29:42.980 | or that side and say how the other side is wrong?
01:29:45.300 | And so I think Anthony Fauci
01:29:49.780 | is just being used as a scapegoat for certain things
01:29:53.440 | as part of that kind of narrative of division.
01:29:57.000 | But I think, so Trump is a singular figure
01:30:01.920 | that to me represents something important
01:30:04.920 | in American history.
01:30:05.760 | I'm not sure what that is, but I think you have to think,
01:30:09.160 | you put on your historian hat,
01:30:11.400 | go forward in time and think back.
01:30:14.000 | Like how will he be remembered
01:30:16.080 | 20, 30, 40, 50 years from now?
01:30:18.440 | Who is the opposite of that?
01:30:20.440 | You have to, I would really have to think about that
01:30:26.840 | because Trump was so singular.
01:30:28.640 | I think AOC is an interesting one,
01:30:31.360 | but she's so young that it's unclear
01:30:34.720 | whether she represents a legitimately large-scale movement
01:30:39.720 | or not.
01:30:40.760 | Bernie Sanders is an interesting option,
01:30:42.960 | but I wish he would be 30, 40 years younger.
01:30:45.940 | Like a young Bernie would be good.
01:30:47.840 | - There are scientists working on that.
01:30:49.200 | - Yeah, I think so.
01:30:50.460 | - Not him specifically, but.
01:30:54.360 | - Well, yeah, it may be him, we never know.
01:30:57.700 | There is a big conspiracy theory
01:30:59.120 | that Putin is, that's a body double.
01:31:03.000 | It's no longer-
01:31:04.360 | - Bernie is Putin?
01:31:05.520 | - No, no, no.
01:31:06.360 | - I'm having a hard time merging that image.
01:31:09.280 | - The conspiracy theories, no, no, no.
01:31:10.700 | That the Putin we see on camera today is a body double.
01:31:14.540 | - Well, one thing is that in science,
01:31:18.100 | and in particular in anatomy,
01:31:21.080 | there's a classification scheme
01:31:24.480 | for different types of anatomists:
01:31:25.880 | they say you're either a lumper or a splitter.
01:31:29.160 | Some people like to call a whole structure something,
01:31:31.440 | not necessarily just for simplicity,
01:31:33.000 | but for a lot of reasons.
01:31:34.440 | And then other people like to micro-divide the nucleus
01:31:36.680 | into multiple names.
01:31:37.560 | And of course, people used to be able
01:31:38.560 | to name different brain structures after themselves.
01:31:40.820 | So there'd be the nucleus of Lex
01:31:43.020 | and the Huberman fasciculus or whatever,
01:31:46.500 | less of that nowadays.
01:31:50.280 | And by the way, those structures don't actually exist
01:31:52.500 | just yet, we haven't defined those yet;
01:31:55.600 | I was making those names up.
01:31:56.760 | But what's interesting is it seems like
01:32:00.800 | in the last five years,
01:32:03.360 | there's been a trend, excuse me,
01:32:06.760 | toward a requirement for lumping.
01:32:09.960 | Like, it seems that it's not allowed,
01:32:13.280 | if you will, to mix and match.
01:32:16.380 | And here I'm not stating my preferences;
01:32:18.120 | I will never reveal my preferences about pandemic-related
01:32:20.440 | things, for hopefully obvious reasons.
01:32:23.320 | You know, some people will say vaccines, yes,
01:32:25.560 | but masks, no, or vaccines and masks, yes,
01:32:28.800 | but let people work.
01:32:30.080 | And other people will say, no, everyone stay home.
01:32:32.040 | And then other people will say, no, you know, no vaccines,
01:32:33.960 | no masks, let everybody work.
01:32:35.480 | No one was saying no vaccines, no masks and stay home,
01:32:38.360 | I don't think.
01:32:39.280 | So there's this sort of lumping, right?
01:32:45.460 | The boundaries around ideology
01:32:49.400 | really did start to defy science.
01:32:51.880 | I mean, it wasn't scientific.
01:32:52.860 | It was one part science-ish at times,
01:32:55.720 | and sometimes really hardcore science.
01:32:57.860 | Other times it was politics, economics.
01:32:59.760 | I mean, we really saw the confluence
01:33:01.580 | of all these different domains of society
01:33:03.660 | that use very different criteria to evaluate the world.
01:33:07.280 | I mean, as a scientist, you know,
01:33:09.560 | remember when the vaccines first came out
01:33:11.020 | and I asked somebody, you know,
01:33:12.960 | one of the early concerns I had, one that was actually satisfied
01:33:17.820 | for me, was: how does this thing turn off?
01:33:20.260 | You know, if you start introducing mRNA,
01:33:21.840 | how does it actually get turned off?
01:33:22.840 | So I asked a friend, you know,
01:33:24.000 | someone who knows a lot about RNA biology, and said,
01:33:27.740 | you know, how does it turn off?
01:33:28.580 | They explained it to me and I was like, okay, makes sense.
01:33:31.880 | I asked some other questions.
01:33:33.580 | But most people aren't going to think about it
01:33:36.660 | at that level of detail necessarily,
01:33:38.600 | and it did seem that there were just kind of amorphous blobs
01:33:42.440 | of ideology that grabbed onto things.
01:33:45.020 | And then there was this need for a chasm between them.
01:33:48.220 | It almost felt like it became illegal in some ways
01:33:52.240 | to want, you know, two of the things from one menu
01:33:55.860 | and one of the things from another menu.
01:33:57.240 | I really felt like I was being constrained
01:33:59.080 | by a kind of like bento box model
01:34:01.620 | where I didn't get to define what was in the bento box.
01:34:04.240 | I can either have bento box A or bento box Z,
01:34:08.240 | but nothing in between.
01:34:09.440 | - And I think on that topic, and I think a lot of topics,
01:34:14.440 | most people are in the middle with humility, uncertainty,
01:34:18.280 | and they're just kind of trying to figure it out.
01:34:20.600 | And I think there is just the extremes
01:34:22.640 | defining the nature of this division.
01:34:26.360 | So I think it's the role of a lot of us
01:34:28.080 | in our individual lives.
01:34:29.720 | And also if you have a platform of any kind,
01:34:32.720 | I think you have to try to walk in the middle,
01:34:34.960 | like with the empathy and humility.
01:34:36.400 | And that's actually what science is about,
01:34:37.820 | is the humility.
01:34:40.320 | I'm still thinking about who's the opposite of Trump.
01:34:43.440 | - Well, maybe there is not.
01:34:44.280 | I mean, maybe Fauci is orthogonal to Trump.
01:34:47.160 | I mean, not everything has an opposite.
01:34:48.840 | I mean, it's, you know, maybe he's an n of one.
01:34:52.080 | Maybe he's a minority of one
01:34:53.800 | because he was an outsider to Washington
01:34:56.500 | who then made it there.
01:34:58.400 | - But also I wonder, you know, you have to pick your battles
01:35:03.400 | because every battle you fight,
01:35:06.800 | you should take very seriously.
01:35:08.460 | And just the amount of hate I got
01:35:11.440 | and I still get for having sat down with the Pfizer CEO,
01:35:15.040 | that was a very valuable lesson for me.
01:35:17.560 | - Well, that one got you a lot of heat.
01:35:19.840 | - Yeah, it still does because in the end-
01:35:21.760 | - 'Cause you've had some pretty controversial guests on.
01:35:25.100 | - Yeah, but that one-
01:35:27.600 | - Is he still the Pfizer CEO?
01:35:29.560 | - I believe so.
01:35:30.440 | - CEOs turn over like crazy.
01:35:31.840 | This is the thing I didn't realize. You know,
01:35:33.040 | in science, if somebody moves institutions, it's a big deal;
01:35:36.220 | most people don't have more than two moves in their career,
01:35:39.360 | maybe, and often, you know,
01:35:41.200 | even a move to the next building is a big deal.
01:35:43.280 | But in biotech, it's like,
01:35:46.160 | I have a former colleague from San Diego
01:35:48.360 | and he's been a CEO here, then he's a CEO there.
01:35:50.280 | He went back to a company he was a CEO at before, you know,
01:35:53.240 | but he's probably back at the university we worked at
01:35:54.760 | for all I know.
01:35:55.840 | It's amazing how much moving around there is.
01:35:57.880 | It is a very itinerant profession.
01:35:59.900 | - Yeah, I think in certain companies,
01:36:02.320 | I guess biotech would be the case,
01:36:03.800 | the CEO is more of a manager type.
01:36:07.220 | So you can,
01:36:08.060 | so jumping around almost benefits your experience.
01:36:10.900 | So you become better and better at being a manager.
01:36:13.300 | There are some, like, revolutionary leader CEOs
01:36:17.180 | that stick around for longer
01:36:19.100 | because they're so critical to pivoting a company,
01:36:23.220 | like the Microsoft CEO currently,
01:36:25.800 | Satya Nadella, is somebody like that.
01:36:28.000 | Obviously Elon Musk is somebody like that,
01:36:30.580 | that is part of pivoting a company into new domains
01:36:34.300 | constantly, but yeah, in biotech, it's a machine.
01:36:37.600 | And in the eyes of a lot of people,
01:36:40.360 | big pharma is like big tobacco.
01:36:45.520 | It's the epitome of everything
01:36:49.480 | that is wrong with capitalism.
01:36:50.740 | It's evil, right?
01:36:53.260 | And so I showed up to the conversation,
01:36:55.340 | I thought, with a pretty open mind,
01:36:57.280 | and really asked what I thought were difficult questions
01:37:01.760 | of him.
01:37:02.600 | I don't think he's ever sat down to a grilling of that kind.
01:37:05.960 | In fact, I'm pretty sure they cut the interview short
01:37:08.600 | because of that.
01:37:09.720 | And, I mean, literally it was hot in the room
01:37:12.840 | and we were sweating.
01:37:13.680 | And I was asking tough questions of somebody
01:37:16.600 | that, like, half the country or a large percent of the country
01:37:19.600 | believes has alleviated a lot of,
01:37:22.200 | has helped, through the financial resources
01:37:24.700 | that Pfizer has, alleviate a lot of suffering
01:37:28.420 | in the world.
01:37:29.260 | And so I thought for somebody like that,
01:37:30.820 | I was asking pretty hard questions.
01:37:32.820 | Boy, did I get to hear from one side.
01:37:36.460 | Usually one of the sides is more intense in their anger.
01:37:41.460 | So on certain political topics,
01:37:47.660 | like with Andrew Tate, for example,
01:37:52.540 | I would hear from,
01:37:56.040 | it would probably be the left, the far left,
01:37:58.480 | that would write very angrily.
01:38:01.480 | And so that's a group you'll hear from.
01:38:03.680 | The Pfizer CEO, I didn't get almost any messages
01:38:08.320 | from people saying, why did you go so hard on him?
01:38:12.260 | He's an incredible human, incredible leader and CEO
01:38:16.560 | of a company that helped us with the vaccine
01:38:20.040 | and nobody thought it would be possible to develop so quickly.
01:38:22.820 | - You did not get letters like that.
01:38:23.960 | - I did not.
01:38:24.800 | I mean, like here and there, but the sea of people
01:38:28.320 | that said everything from me being weak,
01:38:31.120 | that I wasn't able to call out this person,
01:38:33.340 | how do you sit down, how do you platform this evil person,
01:38:37.160 | how do you make him look human, all that kind of stuff.
01:38:41.200 | And you have to deal with that.
01:38:42.560 | You have to. Of course, it's great.
01:38:45.520 | It's great because I have to do some soul searching,
01:38:48.600 | which is like, did I? You have to ask yourself
01:38:51.640 | some hard questions.
01:38:52.560 | I love criticism like that.
01:38:53.840 | But I hit some low points.
01:38:56.960 | There's definitely some despair.
01:38:58.240 | And you start to wonder like, was I too weak?
01:39:00.800 | Should I have talked to him?
01:39:04.280 | What is true?
01:39:05.760 | And you sit there alone and just marinate in that.
01:39:09.320 | And hopefully over time that makes you better.
01:39:10.920 | But I still don't know what the right answer
01:39:12.320 | with that one is.
01:39:13.160 | - Well, I feel that money plays a role here.
01:39:17.440 | You know, when people think big pharma,
01:39:22.440 | they think billions of dollars,
01:39:24.400 | maybe even trillions of dollars, really.
01:39:26.600 | And certainly people who make a lot of money
01:39:31.600 | get scrutiny that others don't.
01:39:35.560 | Part of it is that they are often, not always, visible.
01:39:38.580 | But I think that there is a natural and reflexive resentment,
01:39:42.420 | and I'm not justifying it.
01:39:44.380 | I certainly don't feel this, 'cause I do
01:39:46.640 | know some people who are very wealthy,
01:39:48.960 | some people who are very poor,
01:39:51.000 | and I can't say money scales with happiness at all.
01:39:54.560 | People are always shocked to hear that.
01:39:56.640 | But that's what I've observed in very wealthy people.
01:40:01.640 | But people who have a lot of money
01:40:04.520 | are often held to a different standard
01:40:07.880 | because people resent that.
01:40:10.920 | Some people resent that.
01:40:11.880 | And maybe there are other reasons as well.
01:40:14.720 | I mean, among people who are very wealthy,
01:40:17.840 | oftentimes the wish is for status, right?
01:40:21.440 | Not money.
01:40:22.280 | You get a bunch of billionaires in a room,
01:40:24.040 | and unless one of them is Elon,
01:40:26.600 | who also has immense status for his accomplishments,
01:40:30.020 | typically if you put a Nobel Prize winner in a room
01:40:33.400 | with a bunch of billionaires,
01:40:34.680 | they're all talking to that person, right?
01:40:37.280 | And there are many very interesting billionaires.
01:40:40.080 | But status is something that is often
01:40:45.080 | but not always associated with money,
01:40:48.280 | but is a much rarer form of uniqueness out there,
01:40:53.280 | positive uniqueness, if one considers status positive,
01:40:56.720 | 'cause there's a downside too.
01:40:58.160 | But so I wonder whether or not the Pfizer CEO
01:41:02.720 | caught extra heat because people assume,
01:41:05.080 | and I probably assume also that his salary is quite immense.
01:41:08.560 | - Yeah, so because I have a lot of data on this,
01:41:11.240 | I can answer it.
01:41:12.080 | It's a very good hypothesis.
01:41:13.600 | Let's test it scientifically.
01:41:15.040 | - He's about to tell me it's a great hypothesis,
01:41:16.760 | but it's wrong.
01:41:18.080 | I know this smirk.
01:41:20.040 | - I honestly think it's wrong.
01:41:21.160 | That effect is there for a lot of people,
01:41:23.880 | but I think the distrust is not towards the CEO.
01:41:28.040 | The distrust is towards the company.
01:41:29.920 | One of the really difficult bits of soul searching I had to do:
01:41:33.080 | just having interacted with Pfizer folks
01:41:36.320 | at every level, from junior to the CEO,
01:41:39.520 | they're all really nice people.
01:41:41.920 | They have a mission.
01:41:43.320 | They talk about trying to really help people
01:41:45.880 | 'cause that's the best way to make money
01:41:47.200 | is come up with a medicine that helps a lot of people.
01:41:50.480 | Like the mission is clear.
01:41:52.780 | They're all good people.
01:41:54.120 | A lot of really brilliant people, PhDs.
01:41:57.340 | So you can have a system where all the people are good,
01:42:00.600 | including the CEO.
01:42:01.720 | And by good, I mean people that really are trying
01:42:04.880 | to do everything, they dedicate their whole life to do good.
01:42:08.680 | And yet, you have to think that that system
01:42:12.120 | can deviate from a path that does good
01:42:16.360 | because you start to deceive yourself about what is good.
01:42:20.480 | You turn it into a game where money does come into play
01:42:23.640 | from a company perspective where you convince yourself
01:42:27.640 | the more money you make, the more good you'll be able to do,
01:42:30.220 | and then you start to focus more and more and more
01:42:33.220 | on making more money, and then you can really deviate
01:42:37.500 | and lose track of what is actually good.
01:42:40.020 | I'm not saying necessarily Pfizer does that,
01:42:42.700 | but I think companies could do that.
01:42:44.000 | You can apply that criticism to social media companies,
01:42:47.220 | to big pharma companies, that one of the big lessons for me,
01:42:51.880 | I don't know what the answer is,
01:42:53.440 | but that all the people inside a company could be good,
01:42:57.000 | people you would want to hang out with,
01:42:59.480 | people you'd want to work with,
01:43:01.140 | but as a company, it's doing evil.
01:43:04.120 | And that's a possibility.
01:43:05.720 | So the distrust, I don't think,
01:43:07.420 | is towards the billionaire as an individual,
01:43:10.160 | which you do see a lot of in other cases.
01:43:12.360 | I think it's like Wall Street distrust,
01:43:15.560 | that the machinery of this particular organization
01:43:19.000 | has gone off track.
01:43:20.800 | - It's the generalization of hate again.
01:43:22.800 | - Yeah, and then good luck figuring out what is true.
01:43:26.680 | This is the tough stuff.
01:43:28.380 | But I should say the individuals, like individual scientists
01:43:33.380 | at the NIH, in Pfizer, are just incredible people.
01:43:38.800 | Like they're really brilliant people.
01:43:44.620 | So I never trust the administration or the business people,
01:43:47.840 | no offense, business people,
01:43:49.480 | but the scientists are always good.
01:43:51.260 | They have the right motivator in life.
01:43:55.220 | But again, they can have blinders on to focus on the science.
01:43:59.240 | Nazi Germany has a history of people
01:44:00.940 | being just too focused on the science,
01:44:02.640 | and then the politicians using the scientists
01:44:05.260 | to achieve whatever ends they want.
01:44:06.640 | But if you just look narrowly at the journey of a scientist,
01:44:11.560 | it's a beautiful one,
01:44:12.560 | 'cause they're ultimately in it for the curiosity,
01:44:16.540 | the moment of discovery versus money.
01:44:19.280 | I mean, prestige probably does come into play
01:44:21.320 | later in life, but especially young scientists,
01:44:25.500 | they're after the, it's like they're pulling at the thread
01:44:29.580 | of curiosity to try to discover something big.
01:44:32.060 | They get excited by that kind of stuff,
01:44:33.540 | and it's beautiful to see.
01:44:34.940 | - It is beautiful to see.
01:44:35.920 | I have a former graduate student,
01:44:37.100 | now a postdoc at Caltech,
01:44:38.540 | and I don't even know if she had a cell phone.
01:44:40.720 | She would come into the lab,
01:44:41.560 | put her cell phone in the desk,
01:44:43.620 | and she was tremendously productive.
01:44:45.500 | But that wasn't why I brought it up.
01:44:48.180 | She was productive as a side effect
01:44:49.820 | of just being absolutely committed and obsessed
01:44:52.180 | to discover the answers to the questions she was asking
01:44:55.500 | as best she could.
01:44:56.340 | And it was, you could feel it.
01:44:58.320 | You could just feel the intensity
01:44:59.700 | and just incredibly low activation energy.
01:45:03.160 | If there was an experiment to do, she'd just go do it.
01:45:05.860 | You're teaching at MIT.
01:45:08.600 | You are obviously traveling the world.
01:45:11.940 | You're running the podcast,
01:45:13.520 | a lot of coverage of chess recently, which is interesting.
01:45:15.940 | I don't play chess, but I-
01:45:17.340 | - Oh, I have some scientific questions for you about that.
01:45:19.480 | - Oh, okay, sure.
01:45:20.380 | And then let's get to those for sure.
01:45:23.200 | And then-
01:45:24.040 | - You're not going to like it.
01:45:24.860 | - Oh no, okay.
01:45:25.700 | And then also some very,
01:45:29.100 | do I have to spell Massachusetts again?
01:45:31.780 | - Of course.
01:45:32.620 | - Also, you still seem to have a proclivity
01:45:37.140 | for finding guests that are controversial, right?
01:45:39.300 | You're thinking about Tate.
01:45:40.140 | We're talking about Trump.
01:45:40.960 | We're talking about the Pfizer CEO.
01:45:42.340 | We're talking about Fauci.
01:45:43.300 | These are intense people.
01:45:45.580 | And so what we're getting, folks,
01:45:47.100 | is we're not doing neuroimaging here
01:45:49.480 | in the traditional sense of putting someone into a scanner.
01:45:53.760 | What we're doing here is we're using the approach of
01:45:57.000 | the great Karl Deisseroth, who was on your podcast.
01:45:57.000 | - Thank you for that.
01:45:58.680 | Thank you for connecting us.
01:45:58.680 | He's an incredible person.
01:45:59.720 | - He's an incredible psychiatrist,
01:46:01.280 | bio-engineer and human being and writer.
01:46:03.960 | And your conversation with him was phenomenal.
01:46:06.680 | I listened to it twice.
01:46:08.520 | I actually have taken notes.
01:46:10.520 | We talk about it in this household.
01:46:12.140 | We really do.
01:46:14.500 | His description of love is not to be missed.
01:46:18.520 | I'll just leave it at that
01:46:19.360 | because if I try and say it, I won't capture it well.
01:46:21.360 | But we're getting a language-based map
01:46:24.920 | of at least a portion of Lex Friedman's brain here.
01:46:30.760 | So what else is going on these days in that brain
01:46:35.000 | as it relates to robotics, AI?
01:46:38.040 | Our last conversation was a lot about robots
01:46:41.000 | and the potential for robot-human interaction.
01:46:43.440 | Even what is a robot, et cetera.
01:46:46.060 | Are you still working on robots or focused on robots?
01:46:49.520 | And where is science showing up in your life
01:46:52.140 | besides the things we've already talked about?
01:46:53.900 | - So I think the last time we talked was before Ukraine.
01:46:57.320 | - Yes. - Before.
01:46:58.380 | - You were just about to leave.
01:47:00.480 | - Yes, so that, I mean--
01:47:02.160 | - So that's why I went on.
01:47:03.060 | I was like, you know, this might be the last time.
01:47:05.440 | You said you wanted to come out here before or after.
01:47:06.920 | I was like, come out here before.
01:47:08.660 | I want to see you before you go.
01:47:10.320 | But here you are in the flesh.
01:47:11.780 | - I think, so a lot of,
01:47:15.460 | just a lot of my mind has been occupied,
01:47:18.200 | obviously, with that part of the world.
01:47:19.840 | But one of the most difficult struggles
01:47:22.940 | that I'm still going through is that
01:47:24.960 | I haven't launched the company that I want to launch.
01:47:27.820 | And the company has to do with AI.
01:47:31.160 | I mean, it's maybe a longer conversation,
01:47:32.960 | but the ultimate dream is to put robots in every home.
01:47:36.200 | But short term, I see the possibility
01:47:40.220 | of launching a social media company.
01:47:42.440 | And it's a non-trivial explanation
01:47:45.740 | why that leads to robots in the home.
01:47:47.680 | But it's basically the algorithms
01:47:50.160 | that fuel effective social robotics.
01:47:53.800 | So robots that you can form a deep connection with.
01:47:56.560 | And so I've been really, yeah,
01:47:57.720 | I've been building prototypes,
01:48:00.140 | but struggling that I don't have maybe,
01:48:05.040 | if I were to be critical, the guts to launch a company.
01:48:10.340 | - Or the time.
01:48:12.060 | - Well, it's combined.
01:48:13.140 | - I think you've got the guts.
01:48:14.140 | I mean, it's clear if you'll do an interview
01:48:16.500 | with the Pfizer CEO,
01:48:17.560 | and you're considering putting this Tate fellow
01:48:19.440 | on your podcast and you've gone to the Ukraine,
01:48:22.620 | that you have the guts.
01:48:24.340 | It also means not doing quite a lot of other things.
01:48:28.820 | - That's what I mean, but it does take,
01:48:31.080 | the thing is, as many people know,
01:48:34.620 | when you fill your day and you're busy,
01:48:37.740 | that busyness becomes an excuse
01:48:41.100 | that you use against doing the things that scare you.
01:48:44.780 | A lot of people use family in this way.
01:48:46.980 | You know, my wife, my kids, I can't.
01:48:51.420 | When in reality, some of the most successful people
01:48:53.460 | have a wife and have kids and have families
01:48:56.620 | and they still do it.
01:48:57.820 | And so a lot of times we can fill the day with busy work,
01:49:00.980 | with like, yeah, of course I have podcasts
01:49:05.060 | and all this kind of stuff and they make me happy.
01:49:06.900 | And they're all, they're wonderful.
01:49:08.620 | And there's research, there's teaching and so on,
01:49:11.140 | but all of that can just serve as an excuse
01:49:13.100 | from the thing that my heart says is the right thing to do.
01:49:17.220 | And that's why I don't have the guts,
01:49:19.780 | the guts to say no to basically everything
01:49:22.020 | and then to focus all out.
01:49:23.620 | Because part of it is I'm unlikely to fail
01:49:28.620 | at anything in my life currently,
01:49:31.340 | 'cause I've already found a comfortable place.
01:49:33.740 | With the startup, it's most likely going to be a failure.
01:49:38.420 | It's not an embarrassing failure.
01:49:40.020 | - Well, the machine learning data that I'm aware of,
01:49:44.780 | and I don't know a lot about machine learning,
01:49:46.060 | but within the realm of neuroscience,
01:49:48.020 | say that a failure rate of about 15%
01:49:51.540 | is optimal for neuroplasticity and growth.
01:49:54.680 | Whether or not that translates
01:49:55.940 | to all kinds of practices isn't clear,
01:49:58.620 | but getting trials right 85% of the time
01:50:02.040 | seems to be optimal for language learning,
01:50:04.840 | seems to be optimal for mathematics
01:50:07.320 | and seems to be optimal for physical pursuits on average.
01:50:11.560 | I'm sure you have more machine learning geeks
01:50:16.560 | that listen to your podcast than listen to this podcast,
01:50:18.600 | but it doesn't mean you have to fail
01:50:20.320 | on 15% of your weight sets, folks.
01:50:22.440 | I mean, it could be 16%.
01:50:23.920 | No, I'm just kidding.
01:50:24.760 | But it's not exact, but it's a pretty good rule of thumb.
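For readers who want to poke at that claim, here is a minimal, illustrative simulation of the error-rate sweet spot, in the spirit of the "eighty-five percent rule" from the learning literature. Everything in it, the perceptron-style learner, the difficulty staircase, and all of the constants, is an assumption chosen for the toy demo, not anything specified in the conversation:

```python
# Toy simulation of the ~15% failure-rate sweet spot discussed above.
# All parameters are illustrative assumptions for the demo.
import numpy as np

rng = np.random.default_rng(0)

def run_learner(target_accuracy, n_trials=4000, lr=0.05, dim=20):
    # Hidden "true" decision axis the learner must discover.
    w_true = rng.standard_normal(dim)
    w_true /= np.linalg.norm(w_true)
    w = np.zeros(dim)
    snr = 1.0      # difficulty knob: per-trial signal strength
    recent = []    # rolling record of correct/incorrect outcomes
    for _ in range(n_trials):
        label = rng.choice([-1.0, 1.0])
        # Stimulus: signal along the true axis plus isotropic noise.
        x = label * snr * w_true + rng.standard_normal(dim)
        pred = 1.0 if w @ x >= 0 else -1.0
        correct = pred == label
        recent.append(correct)
        if not correct:
            # Perceptron-style update: learn only from failures.
            w += lr * label * x
        # Crude staircase holds empirical accuracy near the target:
        # weaken the signal when too easy, strengthen when too hard.
        if np.mean(recent[-50:]) > target_accuracy:
            snr *= 0.99
        else:
            snr *= 1.01
    # Score: cosine alignment of learned weights with the true axis.
    return float(w @ w_true / (np.linalg.norm(w) + 1e-12))

for acc in (0.60, 0.70, 0.85, 0.95):
    score = np.mean([run_learner(acc) for _ in range(10)])
    print(f"target accuracy {acc:.2f} -> alignment {score:.3f}")
```

In this toy setup, very easy regimes produce few corrective updates and very hard regimes produce noisy ones, so an intermediate target accuracy tends to recover the hidden decision axis best; the exact optimum depends on the learner, which is why the 15% figure is best read, as said above, as a rule of thumb.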
01:50:30.120 | - I think a lot of startup founders would literally murder
01:50:33.480 | for 85% chance of success.
01:50:36.100 | I think given all the opportunities I have,
01:50:40.920 | the skillset, the funding, all that kind of stuff,
01:50:44.760 | my chances are relatively high for success.
01:50:47.760 | But what relatively high means in the startup world
01:50:50.440 | is still far, far below 85.
01:50:53.600 | We are talking about single digit percentages.
01:50:55.980 | Most startups fail.
01:50:57.400 | - Well, I think the decision
01:50:59.400 | to focus on the company and not other things
01:51:01.100 | means the decision to close the hatch on dopamine retrieval
01:51:04.800 | from all these other things
01:51:05.840 | that are very predictable sources of dopamine.
01:51:08.440 | Not that everything is dopamine,
01:51:11.200 | but dopamine is I think the primary chemical driver
01:51:14.880 | of motivation.
01:51:15.880 | If you know that you can get some degree of satisfaction
01:51:19.320 | from scrolling social media
01:51:20.560 | or from that particular cup of coffee,
01:51:23.080 | that's what you're going to do.
01:51:24.480 | That's what you're going to consume
01:51:25.760 | unless you somehow invert the algorithm
01:51:28.760 | and you say, it's actually my denying myself
01:51:33.760 | that coffee that's going to be the dopamine.
01:51:37.480 | - Interesting.
01:51:38.320 | - Then, and that's the beauty of having a forebrain
01:51:40.840 | is that you can make those decisions.
01:51:42.640 | This is the essence, I do believe,
01:51:44.480 | of what we see of David Goggins.
01:51:46.640 | There's much more there.
01:51:47.480 | There's a person that none of us know and only he knows,
01:51:51.400 | of course, but the idea that the pain
01:51:55.320 | is the source of dopamine, the limbic friction,
01:51:59.040 | as I sometimes like to call it, is the source of dopamine.
01:52:01.280 | That runs counter to how most nervous systems work,
01:52:04.360 | but it's decision-based, right?
01:52:06.680 | It's not because his musculature is a certain way
01:52:09.200 | or he had CRISPR or something.
01:52:11.760 | It's because he decides that.
01:52:15.040 | And I think that's amazing,
01:52:17.060 | but what it means in terms of starting a company
01:52:18.640 | and changing priorities is closing the hatch
01:52:21.080 | on all or many of the current sources of dopamine
01:52:24.160 | so that you can derive dopamine from the failures
01:52:26.920 | within this narrow context.
01:52:28.440 | And this is a very reductionist view,
01:52:30.360 | a kind of neurocentric view of what we're talking about,
01:52:35.300 | but I think about this a lot.
01:52:36.960 | I mean, the decision to choose one relationship
01:52:39.580 | versus another is a decision to close down
01:52:42.240 | other opportunities, right?
01:52:43.640 | So I think that the decision to order one thing off the menu
01:52:48.120 | versus others is this decision to close down
01:52:50.680 | those other hatches.
01:52:51.960 | So I think that you absolutely can do it.
01:52:54.600 | It's just a question of, can you flip the algorithm?
01:52:58.440 | - Yeah, remap the source of dopamine to something else.
01:53:02.440 | - And maybe go out there not to succeed,
01:53:05.320 | but make the journey as the destination type thing.
01:53:08.640 | But when you're financially vested in your time,
01:53:11.800 | and as far as I know, we only get one life,
01:53:13.840 | at least on this planet,
01:53:15.520 | and you want to spend that wisely, right?
01:53:19.200 | - And a lot of that, the people that surround you,
01:53:23.640 | I mean, people are really important.
01:53:25.680 | And I don't have people around me
01:53:30.180 | that say you should do a startup.
01:53:33.620 | It's very difficult to find such people 'cause--
01:53:35.640 | - Is Austin big startup culture right now?
01:53:37.120 | - Yeah, it is, it is.
01:53:37.960 | But it doesn't make sense for me to do a startup.
01:53:40.120 | This is what the people that love me my whole life
01:53:43.040 | have been telling me it doesn't make sense
01:53:44.520 | what you're doing right now.
01:53:46.440 | Just do the thing you were doing previously.
01:53:48.280 | - Why do I get the sense that because they are saying
01:53:50.200 | this, you're apt to go against them?
01:53:51.040 | - No, I actually was never that, unfortunately.
01:53:53.400 | Unfortunately, I need, I've talked to people I love,
01:53:58.360 | my parents, family, and so on, friends.
01:54:00.500 | I'm one of those people that needs
01:54:02.680 | unconditional support for difficult things.
01:54:05.320 | Like, I know myself coaching-wise.
01:54:07.880 | So here's how I get coached best.
01:54:12.800 | Let's say wrestling.
01:54:14.480 | I like a coach that says, "You wanna win the Olympics?"
01:54:19.480 | Like, if I say I wanna win the gold medal
01:54:22.840 | at the Olympics in freestyle wrestling,
01:54:25.300 | I want a coach that doesn't blink once
01:54:29.160 | and hears me and believes that I can do it
01:54:32.200 | and then is viciously intense and cruel to me
01:54:37.080 | on that pursuit.
01:54:39.380 | Like, if you want to do this, let's do this.
01:54:41.800 | Right, so, but that's support.
01:54:44.740 | That positivity. I'm not,
01:54:48.160 | you know, I'm not energized by,
01:54:52.580 | nor do I see as love,
01:54:54.220 | a person basically criticizing that dream.
01:54:57.680 | Like, saying, like, you're too old
01:55:02.040 | to win the Olympic gold medal, right?
01:55:03.500 | Or, like, all the things you can come up with.
01:55:06.540 | That's not helpful to me.
01:55:07.760 | And I can't find, or I haven't yet found,
01:55:10.560 | a dopamine source in the haters.
01:55:13.860 | Like, basically, people that are criticizing you,
01:55:16.120 | and you just trying to prove them wrong.
01:55:17.960 | It never got me off.
01:55:20.480 | Like, it never...
01:55:21.760 | - Well, some people seem to like that.
01:55:25.160 | I mean, David Goggins always seems to come up,
01:55:28.200 | he seems driven by many sources that he has access to.
01:55:30.560 | I do, I don't know, 'cause I've never asked him,
01:55:32.760 | but I, if I were to venture a guess,
01:55:35.120 | I'd say that he probably has a lot of options
01:55:38.260 | inside his head as to how to push through challenge.
01:55:41.540 | Not just overcome pain, but he'll post sometimes
01:55:45.600 | about the fact that, you know, people will say this,
01:55:47.960 | or people will do this, and, you know,
01:55:49.580 | and talk about the pushback approach.
01:55:51.800 | He'll also talk about the pushback approach
01:55:54.700 | that's purely internal, that doesn't involve anyone else.
01:55:57.400 | Great versatility there.
01:55:58.640 | - Yeah, there's literally, like, a voice he yells at
01:56:01.160 | that represents some kind of, like, devil
01:56:05.580 | that wants him to fail.
01:56:06.820 | And he's, you know, he calls him bitch
01:56:09.920 | and all kinds of things, saying, you know,
01:56:11.180 | "Fuck you, I'm not, I'm not."
01:56:13.120 | He's, there's always, like, an enemy,
01:56:15.160 | and he's going against that enemy.
01:56:17.000 | I mean, I wish, maybe that's something,
01:56:18.760 | I mean, it's really interesting.
01:56:19.720 | Maybe you can remap it this way so that you can construct,
01:56:24.720 | like, that's a kind of obvious mechanism.
01:56:27.100 | Construct an amorphous blob that is a hater
01:56:31.300 | that wants you to fail, right?
01:56:33.140 | That's kind of the David Goggins thing.
01:56:35.020 | And that blob says you're too weak,
01:56:39.760 | you're too dumb, you're too old, you're too fat,
01:56:43.760 | you're too whatever, and getting you to want to quit,
01:56:46.920 | and so on, and then you start getting angry at that blob,
01:56:49.900 | and maybe that's a good motivator.
01:56:51.240 | I haven't personally really tried that.
01:56:53.260 | - Well, I've had external, you know, challenge
01:56:56.200 | when I was a postdoc, a very prominent laboratory,
01:57:00.100 | several prominent laboratories, in fact,
01:57:02.080 | were working on the same thing that I was,
01:57:03.640 | and I was just a lowly postdoc working on a project
01:57:06.160 | pretty independent from the lab I was in,
01:57:07.760 | and there was competition, but there was plenty of room
01:57:11.000 | for everybody to win. But in my head, and frankly,
01:57:14.880 | I won't disclose who this is,
01:57:17.120 | because there was some legitimate competition there,
01:57:20.400 | and a little bit of friction, not too much, healthy
01:57:23.160 | scientific friction, yeah, I might've pushed
01:57:26.880 | a few extra hours or more, a little bit.
01:57:29.520 | I have to say, it felt like it was metabolizing me.
01:57:33.240 | It felt catabolic, right?
01:57:34.960 | It didn't, I couldn't be sustained by it,
01:57:38.280 | and I contrast that with the podcast
01:57:40.560 | or the work that my laboratory is doing now,
01:57:42.580 | focused on stress and human performance, et cetera,
01:57:45.340 | and it's pure love.
01:57:47.120 | I just, I want, it's pure curiosity and love.
01:57:49.360 | I mean, there are hard days, but I never,
01:57:51.480 | there's no adversary in the picture.
01:57:54.520 | There are the practical, you know, workings of life that--
01:57:57.520 | - Well, that was the thing that Joe really inspired me on,
01:58:00.440 | and people do create adversarial relationships
01:58:03.400 | in podcasting, 'cause you get, like, YouTubers do this.
01:58:06.960 | They get, you know, they hate seeing
01:58:09.320 | somebody else be successful.
01:58:10.840 | There's a feeling of, like, jealousy,
01:58:15.840 | and some people even see that as healthy.
01:58:17.880 | Like, ooh, MrBeast,
01:58:20.720 | some of these popular YouTubers, how do they get
01:58:23.840 | 100 million views, and I only get 20 views.
01:58:26.680 | - Like, MrBeast, according to him,
01:58:29.000 | his entire life he's been focused on becoming
01:58:31.240 | this massive YouTube channel.
01:58:32.640 | - Well, that, you know, he's inspiring in many ways,
01:58:34.620 | but there are some people that become famous
01:58:37.080 | for a much less insane pursuit of greatness
01:58:42.080 | than MrBeast; like, people become famous on, you know,
01:58:47.400 | social media and so on, and it's easy to be jealous
01:58:50.320 | of them, I just, one of the early things I've learned
01:58:52.560 | from Joe, just being a fan of his podcast,
01:58:55.400 | is how much he celebrated everybody,
01:58:57.280 | and again, maybe I ruined my whole dopamine thing,
01:59:00.840 | but I don't get energized by people that become popular
01:59:05.000 | in the podcasting space and YouTube. It's awesome,
01:59:08.440 | all of it is awesome, and I'm inspired by that,
01:59:11.960 | but the problem is that's not a good motivator.
01:59:14.320 | Inspiration is like, ooh, cool, humans can do this,
01:59:17.140 | this is beautiful, but it's not a forcing function.
01:59:19.760 | I'm looking, you know, for a forcing function,
01:59:23.000 | that's why I gave away the salary from MIT,
01:59:25.420 | I was hoping my bank account would hit zero,
01:59:28.000 | that would be a forcing function to be like, oh shit,
01:59:30.940 | and, you know, you're not allowed
01:59:32.780 | to have a normal job, so I'd have to launch,
01:59:35.120 | and then the podcast becomes, you know, a source of income,
01:59:38.320 | and so it's like, goddammit.
01:59:40.380 | - Yeah, yeah, well, you know, and here I have
01:59:43.040 | to confess my biases, you are, you are so good
01:59:48.040 | at what you do in the realm of podcasting,
01:59:52.520 | and you're excellent at other things as well,
01:59:54.120 | I just have less, you know, experience in those things.
01:59:56.860 | I know here I'm taking the liberty of speaking
02:00:00.320 | for many, many people and just saying I sure as hell hope
02:00:05.320 | you don't shut down the podcast, but as your friend
02:00:09.200 | and as somebody who cares very deeply about your happiness
02:00:11.700 | and your deeper satisfaction, if it's in your heart's heart
02:00:16.040 | to do a company, well then, damn it, do the company.
02:00:21.040 | - And a lot of it, I wouldn't even categorize
02:00:23.700 | as happiness, I don't know if you have things like that
02:00:26.020 | in your life, but I'm probably the happiest
02:00:29.340 | I could possibly be right now.
02:00:32.020 | - That's wonderful.
02:00:32.980 | - But the thing is, there's a longing for the startup
02:00:35.980 | that has nothing to do with happiness, it's something else.
02:00:38.460 | - And that's that itch, that's that itch.
02:00:40.180 | - I'm pretty sure I'll be less happy,
02:00:42.600 | because it's a really tough process.
02:00:45.240 | I mean, to whatever degree you can extract happiness
02:00:50.540 | from struggle, yes, maybe, but I don't see it.
02:00:53.280 | - I think I'll have some very, very low points.
02:00:55.260 | There's a lot of people who found companies,
02:00:58.700 | who know about that,
02:01:01.940 | and I also want to be in a relationship,
02:01:03.600 | I want to get married, and sure as hell a startup
02:01:06.980 | is not gonna increase the likelihood of that.
02:01:11.040 | - We could start up a family and start a company.
02:01:13.620 | - I'm a huge believer in that, which is getting into
02:01:17.900 | a relationship at a low point in your life,
02:01:22.660 | which is-
02:01:24.140 | - Sorry, I'm not disputing your stance,
02:01:27.480 | nor am I agreeing with it.
02:01:29.280 | Every once in a while, there's a Lex Friedman-ism
02:01:32.720 | that hits a particular circuit in my brain,
02:01:35.820 | I had to just laugh out loud.
02:01:37.040 | - I just think that it's easy to have a relationship
02:01:40.440 | when everything is good.
02:01:42.100 | The relationships that become strong and are tested quickly
02:01:46.320 | are the ones when shit is going down.
02:01:48.480 | - Well, then there's hope for me yet.
02:01:50.140 | [laughing]
02:01:53.140 | So, you know, before we sat down,
02:01:58.140 | I was having a conversation with a podcast producer
02:02:01.340 | who is a, I wouldn't say avid,
02:02:03.940 | rather he's a rabid consumer of podcasts
02:02:06.500 | and finds these amazing podcasts.
02:02:08.420 | He finds small podcasts and unique episodes.
02:02:12.700 | Anyway, we were talking about some stuff that he had seen
02:02:16.620 | and read in the business sector,
02:02:17.940 | and he was talking about the difference
02:02:20.500 | between job, career, and a calling, right?
02:02:23.700 | And I think he was extracting this from conversations
02:02:26.820 | of CEOs and founders, et cetera.
02:02:30.740 | I forget the specific founders
02:02:32.700 | that brought this to light for him.
02:02:35.740 | But, you know, that this idea that if you focus on a job,
02:02:38.900 | you know, you can make an income
02:02:40.060 | and hopefully you enjoy your job or not hate it too much.
02:02:42.340 | A career represents a sort of, in my mind,
02:02:45.180 | a kind of series of evolutions
02:02:46.580 | that one can go through junior, professor, tenure, et cetera.
02:02:49.660 | But a calling has a whole other level of energetic pull
02:02:54.660 | to it because it includes career and job
02:02:57.580 | and includes this concept of sort of like a life.
02:02:59.860 | It's very hard to draw the line between a calling in career
02:03:03.260 | and a calling in the other parts of your life.
02:03:05.640 | So the question therefore is,
02:03:03.260 | It's very hard to draw the line between a calling in your career
02:03:13.680 | Or is it a born of a compulsion that irritates you?
02:03:18.680 | Is it like something you wish would go away?
02:03:21.600 | Or is it something that you hope will won't go away?
02:03:25.000 | - No, I hope it won't go away.
02:03:26.040 | It's a calling, it's a calling.
02:03:27.400 | It's like, it's like when I see a robot,
02:03:32.400 | when I first interacted with robots
02:03:36.580 | and it became even stronger,
02:03:36.580 | and it became even stronger
02:03:37.700 | the more sophisticated the robots I interacted with,
02:03:42.480 | And you're like, you look around,
02:03:44.240 | does anyone else see this magic?
02:03:46.620 | Like it's kind of like maybe when you fall in love,
02:03:50.420 | like that feeling like,
02:03:52.940 | does anyone else notice this person?
02:03:55.880 | I just walk in the room, I feel that way about robots
02:03:59.440 | and I can elaborate what that means,
02:04:03.180 | but I'm not even sure I can convert it into words.
02:04:05.520 | I just feel like the social integration of robots
02:04:10.520 | in society will create a really interesting world.
02:04:15.520 | And our ability to anthropomorphize when we look at a robot
02:04:20.720 | and our ability to feel things when we look at a robot
02:04:23.660 | is something that most of us don't yet experience,
02:04:25.980 | but I think everybody will experience
02:04:28.260 | in the next few decades.
02:04:29.720 | And I just want to be a part of exploring that
02:04:34.560 | because it hasn't been really thoroughly explored.
02:04:37.600 | The best roboticists in the world
02:04:39.720 | are not currently working on that problem at all.
02:04:42.160 | They try to avoid human beings completely.
02:04:44.680 | And nobody's really working on that problem
02:04:47.360 | in terms of when you look at the numbers.
02:04:48.800 | All the big tech companies that are investing money,
02:04:51.400 | the closest thing to that is Alexa
02:04:54.360 | and it's basically a servant that helps tell you the weather
02:04:59.240 | or play music and so on,
02:05:00.600 | it's not trying to form a deep connection.
02:05:02.560 | And so sometimes you just notice the thing.
02:05:07.100 | Not only do I notice the magic, there's a gut feeling,
02:05:12.100 | which I try not to speak to because there's no track record,
02:05:15.900 | but I feel like I can be good
02:05:19.600 | at bringing that magic out of the robot.
02:05:23.600 | And there's no data that says I would be good at that,
02:05:26.760 | but there's a feeling, it's just like a feeling.
02:05:28.960 | Like, you know, 'cause I've done so many things
02:05:32.640 | I love doing, playing guitar, all that kind of stuff,
02:05:36.060 | jiu-jitsu, I've never felt that feeling.
02:05:38.400 | When I'm doing jiu-jitsu, I don't feel the magic
02:05:42.360 | of the genius required to be extremely good.
02:05:44.860 | And guitar, I don't feel any of that,
02:05:46.900 | but I've noticed it in others.
02:05:48.480 | Great musicians, they noticed the magic
02:05:53.440 | about the thing they do and they ran with it.
02:05:57.260 | And I just always thought, I think it had a different form
02:06:02.000 | when I, before I knew robots existed, before AI existed.
02:06:02.000 | before I knew robots existed, before AI existed.
02:06:05.560 | The form was more about the magic between humans.
02:06:10.560 | Like, I think of it as love,
02:06:16.300 | like the smile two friends have towards each other.
02:06:19.100 | When I was really young, people would be excited
02:06:23.120 | when they first noticed each other, when they saw each other.
02:06:26.140 | And there's that moment that they share
02:06:28.740 | that feeling together.
02:06:33.260 | It is really interesting that these two separate
02:06:35.580 | intelligent organisms are able to connect all of a sudden
02:06:38.120 | on this deep emotional level.
02:06:40.100 | It's like, huh, it's just beautiful to see.
02:06:43.340 | And I noticed the magic of that.
02:06:44.820 | And then when I started programming,
02:06:49.820 | programming period, but then programming AI systems,
02:06:52.540 | you realize, oh, that could be,
02:06:54.860 | that's not just between humans and humans,
02:06:56.500 | that could be humans and other entities,
02:06:58.180 | dogs, cats, and robots.
02:07:02.180 | And that's, so I, for some reason it hit me
02:07:05.200 | the most intensely when I saw robots.
02:07:08.100 | So yeah, it's like a calling.
02:07:10.100 | But it's a calling where I can either just enjoy
02:07:15.100 | the vision of it, the vision of a future world,
02:07:22.060 | of an exciting future world that's full of cool stuff,
02:07:25.960 | or I can be part of building that.
02:07:27.860 | And being part of building that means doing
02:07:30.580 | the hard work of capitalism, which is raising funds
02:07:34.340 | from people, which for me right now is the easy part,
02:07:38.940 | and then hiring a lot of people.
02:07:40.980 | I don't know how much you know about hiring,
02:07:42.620 | but hiring excellent people.
02:07:44.260 | Excellent people that will define the trajectory
02:07:47.100 | of not only your company, but your whole existence
02:07:49.820 | as a human being, and building it up,
02:07:52.400 | not failing them, because now they all depend on you,
02:07:55.620 | and not failing the world with an opportunity
02:07:58.820 | to bring something that brings joy to people.
02:08:01.380 | And all that pressure, just nonstop fires
02:08:05.780 | that you have to put out.
02:08:06.620 | The drama, having to work with people you've never
02:08:09.860 | worked with, like lawyers, and the human resources,
02:08:12.960 | and supply chain, and, because this is very compute heavy,
02:08:17.960 | the compute infrastructure, the managing of security,
02:08:23.900 | cybersecurity, because you're dealing with people's data.
02:08:27.940 | So now you have to understand not only the cyber security
02:08:32.300 | of data and the privacy, how to maintain privacy
02:08:34.600 | correctly with data, but also the psychology
02:08:36.860 | of people trusting you with their data.
02:08:38.800 | And, you know, if you look at Mark Zuckerberg
02:08:43.800 | and Jack Dorsey and those folks, they seem to be hated
02:08:46.820 | by a large number of people.
02:08:48.420 | - Jack seemed, I didn't, you know, I didn't--
02:08:51.060 | - Much less so, yes.
02:08:51.900 | - I think, I always think of Jack as a loved individual, but--
02:08:56.160 | - Well, you have a very positive view of the world, yes.
02:08:58.600 | - I like Jack a lot, and I like his mind,
02:09:00.940 | and someone close to him described him to me recently
02:09:05.940 | as he's an excellent listener.
02:09:08.560 | That's what they said about Jack.
02:09:09.860 | And that's my experience of him too.
02:09:11.920 | A very private person, so we'll leave it at that.
02:09:14.220 | But listen, I think Jack Dorsey is one of the greats
02:09:22.080 | of our era, of the last 200 years, and is just much quieter
02:09:27.080 | about his stance on things than a lot of people.
02:09:29.980 | But much of what we see in the world that's wonderful,
02:09:32.880 | I think we owe him a debt of gratitude.
02:09:34.900 | I'm just voicing my stance here, but--
02:09:37.540 | - The person, this is really important.
02:09:39.420 | A wonderful person, a brilliant person, a good person,
02:09:44.080 | but you still have to pay the price
02:09:47.820 | of making any kind of mistakes as the head of a company.
02:09:51.620 | You don't get any extra bonus points for being a good person.
02:09:55.680 | - But his willingness to go on Rogan and deal directly,
02:10:00.680 | and say I don't know an answer to that in some cases,
02:10:04.300 | but to deal directly with some really challenging questions
02:10:06.940 | to me earned him tremendous respect.
02:10:09.240 | - Yes, as an individual; that was still part of him.
02:10:12.100 | So you've said, and I love Jack too,
02:10:15.300 | and I interact with him often.
02:10:16.660 | - Yeah, he's been on your podcast.
02:10:17.740 | - Yes, but he's also part of a system as we talked about.
02:10:21.920 | And I would argue that Jack shouldn't have brought
02:10:25.260 | anyone else with him on that podcast.
02:10:28.640 | If you go--
02:10:29.480 | - Oh, that's right, he had a cadre of--
02:10:35.620 | - Oh, he had, I guess, the head of legal with him.
02:10:35.620 | And also it requires a tremendous amount of skill
02:10:40.140 | to go on a podcast like Joe Rogan
02:10:42.340 | and be able to win over the trust of people
02:10:45.740 | by being able to be transparent and communicate
02:10:47.700 | how the company really works.
02:10:49.180 | Because the more you reveal
02:10:50.340 | about how a social media company works,
02:10:51.960 | the more you open up the attack surface;
02:10:55.060 | the vector of attacks increases.
02:10:57.420 | Also, there's a lot of difficult decisions
02:10:59.360 | in terms of censorship or not that are made,
02:11:01.900 | and if you make them transparent,
02:11:03.860 | you're gonna get an order of magnitude more hate.
02:11:06.740 | So you have to make all those kinds of decisions.
02:11:08.500 | And I think that's one of the things I have to realize
02:11:13.380 | is you have to take that avalanche of potential hate
02:11:18.380 | if you make mistakes.
02:11:23.420 | - Well, you have a very clear picture of this architecture
02:11:28.420 | of what's required in order to create a company.
02:11:31.840 | Of course, there's division of labor too.
02:11:34.320 | I mean, you don't have to do all of those things in detail,
02:11:36.500 | but finding people that are excellent,
02:11:40.180 | to run the critical segments, is obviously key.
02:11:44.020 | I'll just say what I said earlier,
02:11:45.220 | which is if it's in your heart's heart to start a company,
02:11:49.580 | if that indeed is your calling and it sounds like it is,
02:11:52.220 | then I can't wait.
02:11:54.660 | - Does the heart have a heart?
02:11:57.260 | I don't know, what's that expression even mean?
02:11:59.060 | - Probably not.
02:12:00.580 | - A romanticized heart. - In my lab at one point,
02:12:02.100 | early days, we worked on cuttlefish
02:12:03.780 | and they have multiple hearts and they pump green blood,
02:12:06.860 | believe it or not, a very fascinating animal.
02:12:09.580 | Speaking of hearts and green blood,
02:12:13.620 | earlier today, before we sat down,
02:12:15.680 | I solicited four questions on Instagram in a brief post.
02:12:20.680 | So if you'll- - Do you want to
02:12:23.100 | look at some of them?
02:12:23.940 | - Yes, let's take these in real time.
02:12:25.920 | My podcast team is always teasing me
02:12:29.100 | that I never have any charge on my phone.
02:12:31.220 | I'm one of these people that likes to run in the yellow
02:12:35.660 | or whatever it is on my phone. - On an iPhone?
02:12:37.500 | - Yeah, it never does. - It's funny how
02:12:39.000 | it's always the iPhone people who are out of battery, it's weird.
02:12:41.180 | - Well, I just got a new one.
02:12:42.360 | So I mean, this one has plenty of battery.
02:12:44.120 | I just got a new one.
02:12:44.960 | So I have different numbers for different things,
02:12:48.000 | personal and work, et cetera.
02:12:50.200 | I'm trying that now.
02:12:52.580 | All right, get into the-
02:12:55.880 | - I have a chess thing too, to mention to you.
02:12:58.860 | - Oh yes, please.
02:13:00.280 | Will I insult you if I look up these questions
02:13:02.420 | as you ask me? - No, no, no.
02:13:03.520 | No, no, but I will insult you by asking this question
02:13:06.260 | 'cause I think it's hilarious.
02:13:07.580 | - So there's been a controversy about cheating.
02:13:09.940 | - Okay. - Or Hans Niemann,
02:13:11.300 | who's a 2,700 player,
02:13:13.080 | was accused of cheating. - Oh yeah, I saw that clip
02:13:14.720 | on your clips channel.
02:13:15.780 | By the way, I love your clips channel.
02:13:17.080 | But I listen to your full channel.
02:13:18.400 | - The big accusation is that he cheated by having,
02:13:22.200 | I mean, it's half a joke,
02:13:23.640 | but it's starting to get me to wonder whether,
02:13:26.080 | so that you can cheat by having vibrating anal beads.
02:13:32.300 | So you can send messages to-
02:13:35.260 | - Well, let's rephrase that statement.
02:13:37.080 | Not you can, but one can.
02:13:39.400 | - One can, one can. - Thank you.
02:13:40.860 | - That was a personal attack, yes.
02:13:42.460 | But it made me realize, I mean-
02:13:44.660 | - I'm just gonna adjust myself in my seat here.
02:13:47.100 | - I use it all the time for podcasts.
02:13:49.860 | I need to send myself messages
02:13:51.580 | to remind myself of notes.
02:13:53.440 | But it's interesting, I mean, it-
02:13:56.860 | - I'm not gonna call you again.
02:13:58.360 | - Yeah, that's exactly where I keep my phone.
02:14:02.000 | It did get me down this whole rabbit hole of,
02:14:05.380 | well, how would you be able to send communication
02:14:08.020 | in order to cheat in different sports?
02:14:12.220 | I mean, that doesn't even have to do
02:14:13.420 | with chess in particular.
02:14:15.860 | But it's interesting in chess and poker that there are modern-day mechanisms where the competition is streamed live, so people can watch it on TV. If someone can only send you a signal back, you know, it's just a fun little thing to think about, whether it's possible to pull off.
02:14:38.620 | So I wanted to get your scientific
02:14:40.460 | evaluation of that technique.
02:14:43.420 | - To cheat using some sort of interoceptive device.
02:14:47.100 | - Yeah, vibrating of some kind, yeah.
02:14:48.860 | Well, no, no, that's one way to send signals, like Morse code basically.
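(As an aside, for readers curious how little machinery such a scheme needs: below is a minimal sketch of Morse-style pulse encoding. The timing constants, the message, and the pulse-list representation are all invented for illustration; this is not a description of any real device.)

```python
# Minimal sketch: encode text as Morse-style vibration pulses.
# Timings and the (on, off) pulse representation are illustrative only.
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

DOT, DASH, GAP = 0.1, 0.3, 0.2  # seconds of vibration on / off

def to_pulses(message: str) -> list[tuple[float, float]]:
    """Return (on_duration, off_duration) pairs for each Morse symbol."""
    pulses = []
    for ch in message.upper():
        for symbol in MORSE.get(ch, ""):  # skip characters without a code
            on = DOT if symbol == "." else DASH
            pulses.append((on, GAP))
    return pulses

print(to_pulses("HOLD"))  # e.g. [(0.1, 0.2), (0.1, 0.2), ...]
```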
02:14:52.840 | - Yeah, so there's a famous,
02:14:55.540 | I believe there's a famous real world story
02:14:58.500 | of physics students.
02:15:01.100 | I'm gonna get some of this wrong.
02:15:02.120 | So I'm saying this in kind of a coarse form
02:15:05.180 | so that somebody will correct this.
02:15:06.580 | But I believe it was physics graduate students
02:15:09.680 | from UC Santa Cruz or somewhere else.
02:15:13.220 | Maybe it was Caltech; a bunch of universities, so that no one, you know, associates it with any one university. They went to Vegas and used some sort of tactile device for kind of card counting, I think.
02:15:27.340 | Something like this was also demonstrated, not this particular incident I don't think, in the movie "Casino," where they spotted it. I remember Robert De Niro, who you have a not so vague resemblance to, by the way, in "Taxi Driver."
02:15:43.700 | - God, I wish I had a De Niro impression right now.
02:15:47.400 | - Travis Bickle.
02:15:48.440 | Look it up, folks.
02:15:50.320 | Travis Bickle is, you know,
02:15:51.700 | if Lex ever shaved his head into a mohawk.
02:15:54.460 | - I would.
02:15:55.480 | - So in the movie, the cheater had a tapping device on his ankle: someone else was counting cards and then signaling to him.
02:16:02.400 | So yeah, that could be done in the tactile way.
02:16:05.060 | It could be done, obviously, with earpieces; if it's a deep earpiece, I think there are ways that they look for that.
02:16:12.180 | Certainly any kind of vibrational device
02:16:14.220 | in whatever orifice,
02:16:16.280 | provided someone could pay attention to that
02:16:17.860 | while still playing the game.
02:16:19.400 | Yeah, I think it's entirely possible now.
02:16:22.060 | Could it be done purely neurally?
02:16:24.680 | You know, could there be something that was,
02:16:26.860 | and listen, it wouldn't have to even be below the skull.
02:16:29.220 | This is where whenever people hear about Neuralink
02:16:31.620 | or brain machine interface,
02:16:32.620 | they always think, oh, you have to drill down below the skull and put a chip in.
02:16:36.620 | I think there are people walking around nowadays
02:16:38.100 | with glucose monitoring devices like Levels,
02:16:40.580 | which I've used and it was very informative for me actually,
02:16:44.300 | as a kind of an experiment,
02:16:45.540 | gave me a lot of interesting insights
02:16:47.260 | about my blood sugar regulation,
02:16:48.620 | how it reacts to different foods, et cetera.
02:16:50.480 | Well, you know, you can implant a tactile device
02:16:55.220 | below the skin with a simple incision.
02:16:57.940 | Actually, one of the neurosurgeons at Neuralink I know well, because he came up at some point through my laboratory at Stanford.
02:17:04.380 | And he actually has put in a radio receiver in his hand
02:17:07.520 | and his wife has it too.
02:17:08.380 | And he can open the locks of his house and things like that.
02:17:11.740 | So he's been doing- - Under the skin?
02:17:12.880 | - Under the skin.
02:17:13.840 | You know, you can go to- - How does that work?
02:17:15.020 | So how do you use- - A piercer.
02:17:16.340 | You go to a, you know, a body piercer type person
02:17:19.780 | and they can just slide it under there.
02:17:21.220 | And it's got a battery life of something, you know,
02:17:23.980 | some fairly long duration.
02:17:25.540 | - How do you experience the tactile, the haptics of it?
02:17:28.220 | - Oh no, that just allows him to open certain locks
02:17:30.340 | with just his hand, but you could easily put
02:17:32.220 | some sort of tactile device in there.
02:17:34.920 | - But does it have to connect to the nerves
02:17:37.440 | or is it just like, just vibration?
02:17:39.340 | - No, just vibration.
02:17:40.180 | And you know- - And you can probably sense it
02:17:41.700 | even if it's under the skin.
02:17:42.980 | - And it can be Bluetooth linked.
02:17:44.980 | I mean, you know, I've seen,
02:17:46.000 | there's an engineering laboratory at the University of Illinois Urbana-Champaign,
02:17:50.860 | that's got an amazing device,
02:17:52.420 | which is about the size of a bandaid.
02:17:53.820 | It goes on the clavicles
02:17:55.660 | and it uses sound waves pinged into the body
02:17:58.860 | to measure cavitation.
02:18:00.260 | Think about this for a moment.
02:18:02.140 | This is being used in the military where,
02:18:04.180 | let's say you're leading an operation
02:18:06.680 | or something, and people are getting shot at.
02:18:08.740 | And on a laptop, you can see
02:18:10.540 | where the bullet entry points are.
02:18:12.420 | Are people dead?
02:18:13.240 | Are they bleeding out?
02:18:14.220 | You know, entry exit points.
02:18:16.720 | Take it out of the battlefield scenario: you can get breathing and body position 24 hours a day.
02:18:23.340 | There's so much that you can do looking at cavitation.
02:18:25.500 | So these same sorts of devices that run on 12-hour Bluetooth could be used to send all sorts of signals. Maybe every time you're supposed to hold your hand, and I'm not a good gambler, I only play roulette when I go to Vegas because it's just long, boring games, but you get some good mileage out of each run usually. But, you know, maybe every time you're supposed to hold, the person gets sort of a stomach cinching, because this is, you know, stimulating the vagus a little bit, and they get a little bit of an ache.
02:18:55.020 | So it doesn't have to be Morse code.
02:18:57.740 | It can be yes, no, maybe, right?
02:19:00.180 | It can be, you know,
02:19:01.340 | it can be a green, red, yellow type signaling.
02:19:04.260 | It doesn't have to be very sophisticated
02:19:05.940 | to give somebody a significant advantage.
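(And the coarser scheme needs even less. Here is a minimal, hypothetical sketch of that yes/no/maybe style of signaling, one distinct pulse count per state; the state names, counts, and timings are assumptions made up for this example, with a print call standing in for a vibration motor.)

```python
import time

# Hypothetical three-state signaling: one pulse = "no",
# two pulses = "maybe", three pulses = "yes".
PATTERNS = {"no": 1, "maybe": 2, "yes": 3}

def send(state: str, pulse_s: float = 0.15, gap_s: float = 0.3) -> None:
    """Emit the pulse pattern for a state; print() stands in for a motor."""
    for _ in range(PATTERNS[state]):
        print("bzz")          # stand-in for driving a vibration motor
        time.sleep(pulse_s)   # pulse duration
        time.sleep(gap_s)     # gap between pulses

send("yes")  # three short pulses
```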
02:19:08.460 | Anyway, I haven't thought about this in detail
02:19:10.380 | before this conversation,
02:19:11.460 | but oh yeah, there's an immense landscape.
02:19:13.580 | - I don't know if you know a poker player named Phil Ivey.
02:19:16.500 | - No, I don't follow the gambling.
02:19:18.780 | - Well, he's considered to be
02:19:20.300 | one of the greatest poker players of all time, legitimately.
02:19:23.220 | You know, he's just incredibly good.
02:19:24.500 | But there's this big case where he was accused of cheating, and it was proven, and yet it's not really cheating, which is what's really fascinating. It turns out, so he plays Texas Hold'em mostly, but all kinds of poker.
02:19:41.980 | It turns out that the grid on the back of the cards
02:19:46.060 | is often printed a little bit imperfectly.
02:19:49.580 | And so you can use the asymmetry of the imperfections
02:19:54.580 | to try to figure out certain cards.
02:19:56.780 | So if you play, and you remember that certain cards, I think in the deck he was accused with it was the eights and nines, were slightly different, symmetry-wise, he can now ask the dealer to rotate the card, to detect the asymmetry on the back of the card. And now he knows which cards are eights and nines, or likelier to be eights and nines.
02:20:25.980 | And he was using that information to play poker
02:20:29.920 | and win a lot of money, but it's just a slight advantage.
02:20:33.540 | And his case is, and in fact the judge found this, that he's not actually cheating, but it's not right; you can't use this kind of extra information.
02:20:42.840 | So it's fascinating that you can discover
02:20:44.740 | these little holes in games
02:20:45.800 | if you pay close enough attention.
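(To see how a tiny informational edge of this sort translates into expected value, here is a toy Monte Carlo sketch. It is a deliberately simplified high-card game invented for illustration, not the game Ivey played; the ranks, payoffs, and fold rule are all assumptions.)

```python
import random

RANKS = list(range(2, 15))  # card ranks 2..14, ace high

def play_round(know_marked: bool) -> int:
    """One round of a toy high-card game. Payoffs, by definition here:
    fold = -1, showdown win = +2, showdown loss = -2, tie = 0."""
    you, opp = random.choice(RANKS), random.choice(RANKS)
    # Edge-sorting analogue: the marked backs tell you when the opponent
    # holds an 8 or 9, so you fold your marginal hands against those.
    if know_marked and you < 10 and opp in (8, 9):
        return -1
    if you == opp:
        return 0
    return 2 if you > opp else -2

def expected_value(know_marked: bool, n: int = 200_000) -> float:
    return sum(play_round(know_marked) for _ in range(n)) / n

random.seed(0)
print(f"EV without the tell: {expected_value(False):+.3f}")  # ~ 0 by symmetry
print(f"EV with the tell:    {expected_value(True):+.3f}")   # slightly positive
```

Even in this crude model, the edge only appears in the rare hands where the tell fires, yet the long-run expected value shifts from roughly zero to slightly positive, which is the shape of the real advantage described above.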
02:20:47.480 | - Yeah, it's fascinating.
02:20:49.400 | And I think that, you know, I did watch that clip
02:20:53.300 | about the potential of a cheating event in chess
02:20:56.000 | and the fact that a number of chess players
02:20:58.920 | admit to cheating at some point in their career.
02:21:01.760 | Very, very interesting.
02:21:03.560 | - Well, it's online.
02:21:04.400 | So online cheating is easier, right?
02:21:07.300 | When you're playing online, in a game where the machine is much better than the human, it's very difficult to prove that you're human.
02:21:14.420 | And that applies, by the way, to another really big thing in social media: the bots. If you're running a social media company, you have to deal with the bots. And one of the really exciting things in machine learning and artificial intelligence to me is the very fast improvement of language models, so neural networks that generate text and interpret text, that generate images from text and all that kind of stuff. But with that, you're now going to create incredible bots that look an awful lot like humans.
02:21:46.000 | - Well, at least they're not going to be those crypto bots
02:21:47.820 | that seem to populate my comment section
02:21:49.560 | when I post anything on Instagram.
02:21:50.860 | I actually delete those
02:21:51.740 | even though they add to the comment roster.
02:21:54.500 | And, you know, they bother me so much.
02:21:58.220 | I spend, you know, at least 10, 15 minutes on each post
02:22:01.660 | just deleting those.
02:22:02.500 | I don't know what they're trying to do,
02:22:03.740 | but I'm not interested in those,
02:22:06.980 | whatever it is they're offering.
02:22:08.540 | Speaking of non-bots, I'm going to assume
02:22:11.620 | that all the questions are not from bots.
02:22:13.660 | There are a lot of questions here,
02:22:15.420 | more than 10,000 questions, goodness.
02:22:17.700 | I'll just take a few working from top to bottom.
02:22:20.580 | What ideas have you been wrestling with lately?
02:22:29.100 | I imagine the company is one, but as I scroll to the next questions, what are some others?
02:22:34.620 | - Well, some of the things we've talked about, which is, (sighs) the ideas of how to understand what is true, what is true about a human being, how to reveal that, how to reveal that through a conversation, how to challenge that properly, so that it leads to understanding, not division.
02:22:57.780 | So that applies to everybody from Donald Trump
02:23:00.100 | to Vladimir Putin.
02:23:05.060 | Also, another idea is the deep and growing distrust of science, and trying to understand what's the role of those of us that have a foot in the scientific community, how to regain some of that trust.
02:23:22.060 | Also, there's, as we talked about,
02:23:30.740 | how to find and how to maintain a good relationship.
02:23:35.660 | That's really been, I've never felt quite as lonely
02:23:39.220 | as I have this year with Ukraine.
02:23:41.740 | It's just, so many times I would just lie there, feeling so deeply alone, because I felt that my home, not my home literally, 'cause I'm an American, I'm a proud American, I'll die an American, but my home in the sense of, generationally, my family's home, has been changed forever.
02:24:06.880 | There's no more being proud of being from the former Russia or Ukraine; it's now a political message to show your pride.
02:24:18.660 | And so it's been extremely lonely and within that world,
02:24:21.940 | with all the things I'm pursuing,
02:24:23.460 | how do you find a successful relationship?
02:24:25.420 | That's been tough, but obviously,
02:24:27.140 | and there's a huge number of technical ideas
02:24:29.160 | with a startup of like,
02:24:30.140 | how the hell do you make this thing work?
02:24:32.760 | - Well, the relationship topic is one we talked a little bit about, and last time we touched on it in a little bit more detail.
02:24:40.300 | We're gonna come back to that, so I've made a note here.
02:24:43.640 | What or who inspired Lex, you,
02:24:48.080 | to wear a suit every time you podcast?
02:24:50.260 | That's a good question, I don't know the answer to that.
02:24:54.500 | So there's two answers to that question.
02:24:57.180 | One is a suit and two is a black suit and black tie.
02:25:02.180 | 'Cause I used to do, I used to have more variety,
02:25:06.620 | which is like, it was always a black suit,
02:25:08.940 | but I would sometimes do a red tie and a blue tie.
02:25:11.480 | But that was mostly me trying to fit into society,
02:25:14.820 | 'cause like variety is,
02:25:15.780 | you're supposed to have some variety.
02:25:18.280 | What inspired me at first was a general culture that doesn't take itself seriously in terms of how you present yourself to the world. So in academia, in the tech world, just at Google, everybody was wearing pajamas, very relaxed. I don't know how it is in the sciences, in chemistry, biology, and so on. But in computer science, everybody was very, I mean, very relaxed in terms of the stuff they wear.
02:25:50.920 | So I wanted to try to really take myself seriously
02:25:54.460 | and take every single moment seriously
02:25:55.820 | and everything I do seriously.
02:25:57.500 | And the suit made me feel that way.
02:25:59.200 | I don't know how it looks, but it made me feel that way.
02:26:01.960 | And I think in terms of people I look up to
02:26:03.900 | that wore a suit that made me think of that
02:26:06.140 | is probably Richard Feynman.
02:26:08.080 | I see him-- - Such a wonderful human being.
02:26:11.280 | - Yes, I see him as like the epitome of class
02:26:14.180 | and humor and brilliance.
02:26:17.940 | Obviously, I can never come close to that kind of ability to simply explain really complicated ideas and to have humor and wit, but I definitely aspire to that.
02:26:32.120 | And then there's just the Mad Men,
02:26:34.840 | that whole era of the '50s, the classiness of that.
02:26:37.640 | There's something about a suit that removes the importance of fashion from the character.
02:26:45.120 | You see the person.
02:26:46.420 | I think, I forgot who said this, might be Coco Chanel or somebody like that:
02:26:53.980 | is that you wear a shabby dress and everyone sees the dress.
02:26:58.980 | You wear a beautiful dress and everybody sees the woman.
02:27:07.700 | So in that sense, hopefully I'm quoting that correctly, but--
02:27:12.460 | - Sounds good.
02:27:13.380 | - I think there's a sense in which a simple, classy suit
02:27:18.380 | allows people to focus on your character
02:27:23.660 | and then do so with the full responsibility of that.
02:27:27.540 | This is who I am, yeah.
02:27:29.420 | - I love that.
02:27:30.260 | And I love what you said just prior to that.
02:27:32.240 | My father, who again is always asking me
02:27:34.780 | why I don't dress formally like you do,
02:27:36.960 | always said to me growing up,
02:27:40.340 | "If you overdress slightly,
02:27:43.700 | at least people know that you took them seriously."
02:27:46.180 | So it's a sign of respect for your audience too, in my eyes.
02:27:50.620 | Someone asked, "Is there an AI equivalent of psychedelics?"
02:27:55.600 | And I'm assuming they mean,
02:27:58.020 | is there something that machines can do for themselves
02:28:01.180 | in order to alter their neural circuitry
02:28:03.380 | through unconventional activation patterns?
02:28:06.320 | - Yes, obviously.
02:28:08.980 | Well, I don't know exactly how psychedelics work,
02:28:11.580 | but you can see that with all the diffusion models now, with DALL-E and Stable Diffusion, that generate art from text. It's basically a small injection of noise into a system that has a deep representation of visual information.
02:28:34.260 | So it's able to convert text to art by introducing uncertainty into that, noise into that.
02:28:42.420 | That's kind of maybe,
02:28:43.660 | I could see that as a parallel to psychedelics
02:28:45.680 | and it's able to create some incredible things.
02:28:48.060 | From a conceptual understanding of a thing, through a bit of introduced randomness, it can create incredible art that no human, I think, could have easily created.
02:29:00.980 | Randomness does a lot of work in the machine learning world.
02:29:05.120 | Just enough.
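(For the mechanics behind that "injection of noise": in a standard denoising-diffusion formulation, each forward step perturbs the representation with scheduled Gaussian noise, q(x_t | x_{t-1}) = N(sqrt(1 - beta_t) x_{t-1}, beta_t I). The sketch below applies that update to a stand-in embedding to show how fresh noise draws yield different outputs from the same starting point; the embedding, schedule, and loop sizes are invented for illustration and are not a real model.)

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_step(x: np.ndarray, beta: float) -> np.ndarray:
    """One DDPM-style noising step: x <- sqrt(1 - beta) * x + sqrt(beta) * eps."""
    return np.sqrt(1.0 - beta) * x + np.sqrt(beta) * rng.normal(size=x.shape)

# Stand-in for a "deep representation" of a prompt: a fixed vector.
prompt_embedding = np.ones(8)

# The same prompt plus different noise draws yields different results,
# which is the source of variety in generations.
for i in range(3):
    z = prompt_embedding.copy()
    for beta in np.linspace(1e-3, 2e-2, 50):  # a simple noise schedule
        z = forward_step(z, beta)
    print(f"sample {i}: first coords = {np.round(z[:3], 3)}")
```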
02:29:06.780 | - There are a lot of requests of you for relationship advice, a lot of requests for statistics about you, data about you specifically; flipping past those.
02:29:20.120 | What was the hardest belt to achieve in jiu-jitsu?
02:29:24.620 | I would have assumed the black belt,
02:29:25.980 | but is that actually true?
02:29:27.600 | - No, I mean, everybody has a different journey
02:29:31.100 | through jiu-jitsu, as people know.
02:29:33.700 | For me, the black belt was the ceremonial belt,
02:29:38.700 | which is not usually the case, because I had fought the wars already.
02:29:43.300 | I trained twice a day for, I don't know how many years,
02:29:47.460 | seven, eight years.
02:29:48.740 | I competed nonstop.
02:29:50.980 | I competed against people much better than me.
02:29:53.180 | I competed against many and beaten many black belts
02:29:56.340 | and brown belts.
02:29:58.180 | I think for me personally,
02:30:01.980 | the hardest belt was the brown belt.
02:30:06.980 | Because for people who know jiu-jitsu,
02:30:10.800 | the size of tournament divisions for blue belts
02:30:14.260 | and purple belts is just humongous.
02:30:17.460 | Like Worlds, when I competed at Worlds,
02:30:19.300 | it was like 140 people in a division,
02:30:21.940 | which means you have to win, I forget how many times,
02:30:23.940 | but seven, eight, nine times in a row to medal.
02:30:28.020 | And so I just had to put in a lot of work during that time.
02:30:32.760 | And especially for competitors,
02:30:35.020 | instructors usually really make you earn a belt.
02:30:38.780 | So to earn the purple belt was extremely difficult,
02:30:41.780 | extremely difficult.
02:30:42.620 | And then to earn the brown belt means I had to compete
02:30:45.200 | nonstop against other purple belts, which are young.
02:30:48.420 | You're talking about,
02:30:49.820 | the people that usually compete are 23, 24, 25 year olds.
02:30:53.980 | They're like shredded, incredible cardio.
02:30:57.120 | They're, for some reason, at a point in their life where they can, no kids, nothing, dedicate everything to this pursuit.
02:31:03.400 | So they're training two, three, four times a day.
02:31:05.800 | Diet is on point.
02:31:07.360 | And for me, because they're usually bigger and taller than me, and just more aggressive, actual good athletes.
02:31:16.780 | Yeah, I had to go through a lot of wars
02:31:18.460 | to earn that brown belt.
02:31:19.940 | But then--
02:31:20.780 | - I gotta try this jiu-jitsu thing.
02:31:21.680 | - Yeah, you should.
02:31:22.680 | But it's a different--
02:31:23.520 | - Well, I tried, I did the one class,
02:31:25.120 | but I really want to embrace it.
02:31:27.680 | - As you know, many pursuits like jiu-jitsu are different.
02:31:31.860 | Doing it in your 20s versus your 30s and later is different. You can have a bit of an ego in your 20s.
02:31:39.140 | You can have that fire under you,
02:31:40.580 | but you should be sort of more zen-like and wise
02:31:44.880 | and patient later in life.
02:31:48.040 | - Well, one would hope.
02:31:49.860 | That's the wisdom.
02:31:51.980 | - Well, I think Rogan is still a meathead.
02:31:54.460 | He still goes hard and crazy
02:31:55.940 | and he's still super competitive on that.
02:31:58.040 | So some people can. Jocko is somebody like that.
02:32:01.460 | - Well, whatever they're doing,
02:32:02.360 | they're doing something right because they're still in it.
02:32:04.480 | And that's super impressive.
02:32:07.660 | There were far too many questions to ask all of them,
02:32:11.440 | but several, if not many,
02:32:14.500 | asked a highly appropriate question for where we are
02:32:17.420 | in the arc of this discussion.
02:32:19.520 | And this is one admittedly that you ask in your podcast
02:32:24.100 | all the time, but I get the great pleasure
02:32:26.320 | of being in the question asker seat today.
02:32:29.920 | And so what is your advice to young people?
02:32:34.720 | - So I just gave a lecture at MIT and
02:32:40.820 | the amount of love I got there is incredible.
02:32:46.060 | And so of course, who you're talking to
02:32:48.700 | is usually undergrads, maybe young graduate students.
02:32:51.580 | And so one person did ask for advice
02:32:55.880 | as a question at the end.
02:32:56.820 | I did a bunch of Q&A.
02:32:58.100 | So my answer was that the world will tell you to find a work-life balance, to sort of explore, to try different fields to see what you really connect with, variety, general education, all that kind of stuff.
02:33:24.180 | And I said, in your 20s,
02:33:25.780 | I think you should find one thing you're passionate about
02:33:29.860 | and work harder at that
02:33:31.780 | than you worked at anything else in your life.
02:33:34.380 | And if it destroys you, it destroys you.
02:33:36.980 | That's advice for your 20s.
02:33:39.380 | I don't know how universally true that advice is,
02:33:44.220 | but I think at least give that a chance,
02:33:46.260 | like sacrifice, real sacrifice
02:33:48.020 | towards a thing you really care about and work your ass off.
02:33:52.620 | That said, I've met so many people
02:33:55.340 | and I'm starting to think that advice is best applied
02:33:59.920 | or best tried in the engineering disciplines,
02:34:02.260 | especially programming.
02:34:04.040 | I think there's a bunch of disciplines
02:34:06.920 | in which you can achieve success with far fewer hours.
02:34:09.860 | And it's much more important
02:34:11.820 | to actually have a clarity of thinking and great ideas
02:34:15.900 | and have an energetic mind.
02:34:17.660 | Like the grind in certain disciplines
02:34:20.140 | does not produce great work.
02:34:21.860 | I just know that in computer science and programming,
02:34:24.900 | it often does.
02:34:25.960 | Some of the best people ever that have built systems, have programmed systems, are usually the John Carmack kind of people that drink soda, eat pizza, and program 18 hours a day.
02:34:38.660 | So I don't know, actually; you have to, I think, really go discipline-specific.
02:34:44.860 | So my advice applies to my own life,
02:34:46.620 | which has been mostly spent behind that computer.
02:34:49.660 | And for that, you really, really have to put in the hours.
02:34:53.620 | And what that means is essentially it feels like a grind.
02:34:56.500 | I do recommend that you should at least try it in your own life. If you interview
02:35:03.880 | some of the most accomplished people ever,
02:35:06.820 | I think if they're honest with you,
02:35:08.860 | they're gonna talk about their 20s
02:35:10.580 | as a journey of a lot of pain
02:35:14.420 | and a lot of really hard work.
02:35:16.020 | I think what really happens, unfortunately,
02:35:19.140 | is a lot of those successful people later in life
02:35:22.260 | will talk about work-life balance.
02:35:23.780 | They'll say, "You know what I learned from that process
02:35:26.380 | "is that it's really important
02:35:27.980 | "to get sun in the morning, to have health,
02:35:33.120 | "to have good relationships and friends."
02:35:34.300 | - Hire a chef.
02:35:35.700 | - Exactly.
02:35:36.540 | But I think those people have forgotten
02:35:41.540 | the value of the journey they took to that lesson.
02:35:45.500 | I think work-life balance is best learned the hard way.
02:35:48.900 | My own perspective,
02:35:52.220 | there's certain things you can only learn the hard way.
02:35:54.700 | And so you should learn that the hard way.
02:35:57.040 | Yeah, so that's definitely advice.
02:35:58.720 | And I should say that I admire people that work hard.
02:36:03.720 | The way to get on my good side, I think, is to be one of the people that give everything they've got towards something.
02:36:11.180 | It doesn't actually matter what it is.
02:36:12.820 | But achieving excellence in a thing, that's the highest thing we can reach for as human beings, I think: excellence at a thing.
02:36:25.860 | - I love it.
02:36:28.540 | Well, speaking of excellence at a thing, whether it's teaching at MIT, or the podcast, or the company that you'll create in the near future,
02:36:41.060 | once again, I'm speaking for an enormous number of people when I say that excellence and hard work are woven through everything that you do.
02:36:51.380 | Every time I sit down with you,
02:36:52.620 | I begin and finish with such an immense feeling of joy
02:36:57.380 | and appreciation and gratitude.
02:37:00.500 | And it wouldn't be a Lex Friedman podcast, or in this case Lex Friedman being a guest on a podcast,
02:37:08.600 | if the word love weren't mentioned at least 10 times.
02:37:11.540 | So the feelings of gratitude for all the work you do
02:37:16.400 | for taking the time here today to share with us
02:37:19.240 | what you're doing, your thoughts, your insights,
02:37:22.300 | what you're perplexed about
02:37:23.520 | and what drives you and your callings.
02:37:26.420 | - Can I read a poem?
02:37:27.260 | - Yes, please.
02:37:28.100 | [laughing]
02:37:29.900 | He was trying to cut me off,
02:37:30.740 | but I was getting a little long.
02:37:32.520 | - No, no, no, this is, I was thinking about this recently.
02:37:35.720 | It's one of my favorite Robert Frost poems.
02:37:37.960 | And I wrote several essays on it, as you do, 'cause I think it's a popular one that's read, essays being, like, trying to interpret poetry. And it's one that sticks with me in both its calm beauty and in the seriousness of what it means. It's "Stopping by Woods on a Snowy Evening."
02:38:02.660 | I think it's ultimately a human being, a man asking
02:38:06.940 | the old Sisyphus, the old Camus question of why live.
02:38:11.940 | I think this poem, even though it doesn't seem like it,
02:38:16.620 | is a question of a man contending with suicide
02:38:19.480 | and choosing to live.
02:38:20.740 | Whose woods these are, I think I know.
02:38:26.700 | His house is in the village though.
02:38:28.820 | He will not see me stopping here
02:38:30.420 | to watch his woods fill up with snow.
02:38:33.380 | My little horse must think it queer
02:38:35.920 | to stop without a farmhouse near
02:38:38.320 | between the woods and frozen lake
02:38:39.900 | the darkest evening of the year.
02:38:41.640 | He gives his harness bells a shake
02:38:44.740 | to ask if there's some mistake.
02:38:47.000 | The only other sound's the sweep
02:38:49.260 | of easy wind and downy flake.
02:38:52.060 | The woods are lovely, dark and deep,
02:38:55.300 | but I have promises to keep
02:38:57.380 | and miles to go before I sleep
02:39:00.220 | and miles to go before I sleep.
02:39:03.580 | The woods representing the darkness,
02:39:06.040 | the comfort of the woods representing death
02:39:08.060 | and he is a man choosing to live.
02:39:10.840 | Yeah, I think about that often,
02:39:12.900 | especially in my darker moments: you have promises to keep.
02:39:19.040 | Thank you for having me, Andrew.
02:39:24.740 | You're a beautiful human being.
02:39:26.500 | I love you, brother.
02:39:27.340 | - I love you, brother.
02:39:28.400 | Thank you for joining me today
02:39:30.480 | for my discussion with Dr. Lex Friedman
02:39:32.880 | and special thanks to Dr. Lex Friedman
02:39:35.620 | for inspiring me to start this podcast.
02:39:38.580 | If you're learning from and or enjoying this podcast,
02:39:40.980 | please subscribe to our YouTube channel.
02:39:42.780 | That's a terrific zero cost way to support us.
02:39:45.200 | In addition, please subscribe to the podcast
02:39:47.420 | on Spotify and on Apple.
02:39:49.220 | And on both Spotify and Apple,
02:39:50.780 | you can leave us up to a five-star review.
02:39:52.960 | If you have questions or suggestions about topics
02:39:55.100 | and guests you'd like me to include
02:39:56.600 | on the Huberman Lab podcast,
02:39:58.180 | please put those in the comment section on YouTube.
02:40:00.380 | I do read all the comments.
02:40:02.300 | In addition, please check out the sponsors mentioned
02:40:04.340 | at the beginning of today's episode.
02:40:06.040 | That's the best way to support this podcast.
02:40:08.380 | During today's episode, we did not discuss supplements,
02:40:10.880 | but on many previous episodes of the Huberman Lab podcast,
02:40:13.780 | we do discuss supplements
02:40:15.460 | because while supplements aren't necessary for everybody,
02:40:18.060 | many people derive tremendous benefit from them
02:40:20.100 | for things like enhancing sleep and focus
02:40:22.200 | and hormone augmentation and so forth.
02:40:24.220 | The Huberman Lab podcast has partnered
02:40:25.680 | with Momentous Supplements
02:40:27.220 | because they are of the very highest quality
02:40:29.820 | and they ship internationally.
02:40:31.280 | In addition to that, they have single ingredient formulations
02:40:34.340 | that allow you to devise the supplement regimen
02:40:36.900 | that's most effective and most cost effective for you.
02:40:39.780 | If you'd like to see the supplements discussed
02:40:41.420 | on the Huberman Lab podcast,
02:40:42.860 | please go to livemomentous.com/huberman.
02:40:46.120 | If you haven't already signed up
02:40:47.380 | for the Huberman Lab podcast,
02:40:48.900 | zero cost neural network newsletter,
02:40:51.600 | we invite you to do so.
02:40:52.660 | It's a monthly newsletter
02:40:53.700 | that has summaries of podcast episodes
02:40:55.940 | and various protocols distilled into simple form.
02:40:59.420 | You can sign up for the newsletter
02:41:00.820 | by going to HubermanLab.com,
02:41:02.320 | go to the menu and look for newsletter.
02:41:04.280 | You supply your email,
02:41:05.340 | but we do not share it with anybody else.
02:41:07.080 | And as I mentioned before,
02:41:08.500 | the newsletter is completely zero cost.
02:41:10.680 | And if you're not already following us on social media,
02:41:13.200 | we are Huberman Lab on Instagram, Huberman Lab on Twitter
02:41:16.340 | and Huberman Lab on Facebook.
02:41:17.960 | And at all of those sites,
02:41:19.580 | I provide science and science related tools
02:41:21.920 | for mental health, physical health and performance,
02:41:23.880 | some of which overlap with information covered
02:41:26.120 | on the Huberman Lab podcast,
02:41:27.640 | but often which is distinct from information covered
02:41:30.240 | on the Huberman Lab podcast.
02:41:31.460 | So again, that's Huberman Lab on Instagram,
02:41:33.240 | Twitter and Facebook.
02:41:34.800 | Thank you again for joining me for the discussion
02:41:36.800 | with Dr. Lex Friedman.
02:41:38.420 | And as always, thank you for your interest in science.
02:41:41.400 | [upbeat music]