
Matt Walker: Sleep | Lex Fridman Podcast #210


Chapters

0:00 Introduction
2:05 Putin moment: Lex takes Matt's sunglasses
2:26 Fascination with sleep
6:35 Why do we sleep?
15:06 Computer vision for driver assistance
24:28 Consciousness is fundamental
32:34 Lex on human to robot connection
35:01 Scent of a Woman is better than "John Wick"
46:42 Distinction between coffee and caffeine
72:26 The science of 'sleeping on it'
86:19 Lex on his sleeping schedule
111:23 Chronotypes
118:52 How to overcome insomnia
136:15 Diet and sleep
145:12 Where do dreams come from?
158:50 How sleep affects emotions
165:43 Meaning of life

Whisper Transcript

00:00:00.000 | The following is a conversation with Matt Walker,
00:00:02.520 | sleep scientist, professor of neuroscience
00:00:04.840 | and psychology at Berkeley, author of "Why We Sleep"
00:00:08.700 | and the host of a new podcast
00:00:11.540 | called the Matt Walker Podcast.
00:00:14.240 | It's 10 minute episodes a couple of times a month,
00:00:17.140 | covering sleep and other health and science topics.
00:00:20.760 | I love it and recommend it highly.
00:00:22.560 | It's up there with the greats
00:00:24.420 | like the Huberman Lab Podcast with Andrew Huberman
00:00:28.080 | and I think David Sinclair is putting out
00:00:30.620 | an audio series soon too.
00:00:32.340 | I can't wait to listen to it.
00:00:33.780 | I'm really excited by the future of science
00:00:36.820 | in the podcasting world.
00:00:38.680 | To support this podcast, please check out our sponsors,
00:00:41.660 | Stamps.com, Squarespace, Athletic Greens,
00:00:45.300 | BetterHelp and Onnit.
00:00:47.180 | Their links are in the description.
00:00:49.840 | As a side note, let me say that to me,
00:00:52.240 | a healthy life is one in which you fall in love
00:00:55.120 | with the world around you, with ideas, with people,
00:00:58.640 | with small goals and big goals, no matter how difficult,
00:01:02.060 | with dreams you hold onto and chase for years.
00:01:05.840 | Life should be lived fully.
00:01:07.920 | That to me is the priority.
00:01:09.980 | That to me is a healthy life.
00:01:12.060 | Second to that is the understanding
00:01:14.180 | and the utilization of the best available science
00:01:16.900 | on diet, exercise, supplements, sleep
00:01:20.540 | and other lifestyle choices.
00:01:22.740 | To me, science in the realm of health is a guide
00:01:26.520 | for what we should try, not the absolute truth
00:01:28.960 | of how to live life.
00:01:30.980 | The goal is to learn to listen to your body
00:01:33.040 | and figure out what works best for you.
00:01:35.800 | All that said, a good night's sleep can be a great tool
00:01:39.480 | in making life awesome and productive.
00:01:41.720 | And Matt is a great advocate of the how and the why of sleep.
00:01:46.720 | We agree on some things and disagree on others,
00:01:49.760 | but he's a great human being, a great scientist
00:01:52.440 | and as of recently, a friend with whom I enjoy
00:01:55.500 | having these wide ranging conversations.
00:01:58.700 | This is the Lex Fridman Podcast
00:02:00.860 | and here is my conversation with Matt Walker.
00:02:04.200 | You should try these shades on and see what you look like.
00:02:08.020 | - So they are now your shades.
00:02:13.720 | And that's not a question.
00:02:14.560 | - It's the same thing as when Putin took the Super Bowl ring
00:02:17.700 | and it's now his ring.
00:02:18.900 | (both laughing)
00:02:20.960 | - Yeah, one wonders if he was offered it,
00:02:22.700 | but they are yours.
00:02:24.280 | - When did you first fall in love
00:02:28.940 | with the dream of understanding sleep?
00:02:32.900 | Like where did the fascination with sleep begin?
00:02:36.400 | - So back in the United Kingdom,
00:02:41.140 | you can sort of start doing medicine at age 18
00:02:44.140 | and it's a five year program.
00:02:46.100 | And I was at the Queen's Medical Center in the UK.
00:02:50.700 | And I remember just being fascinated
00:02:52.760 | by states of consciousness and particularly anesthesia.
00:02:57.220 | I was thinking, isn't that incredible?
00:02:59.100 | Within seconds, I can take a perfectly conscious
00:03:01.880 | human being and I can remove all existence
00:03:06.020 | of their mentality and their awareness within seconds.
00:03:09.860 | And that stunned me.
00:03:12.540 | So I started to get really interested in conscious states.
00:03:15.540 | I even started to read a lot about hypnosis
00:03:19.740 | and all of these things, hypnosis,
00:03:22.340 | even sleep and dreams at the time,
00:03:24.380 | they were very esoteric.
00:03:25.980 | It was sort of charlatan science at that stage.
00:03:29.780 | And I think almost all of my colleagues
00:03:33.260 | and I are accidental sleep researchers.
00:03:36.700 | No one, as I recall in the classroom
00:03:39.020 | when you're sort of five years old
00:03:40.380 | and the teacher says,
00:03:41.380 | "What would you like to be when you grow up?"
00:03:44.040 | No one's putting their hand up and saying,
00:03:45.300 | "I would love to be a sleep researcher."
00:03:48.540 | And so when I was doing my PhD,
00:03:51.500 | I was trying to identify different forms of dementia
00:03:55.060 | very early on in the course.
00:03:57.500 | And I was using electrical brainwave recordings to do that.
00:04:00.660 | And I was failing miserably.
00:04:02.620 | It was a disaster, just no result after no result.
00:04:07.260 | And I used to go home to the doctor's residence
00:04:09.220 | with this sort of a little igloo of journals
00:04:11.260 | that at the weekend I would sort of sit in and read.
00:04:15.060 | And which I'm now thinking,
00:04:16.980 | do I really want to admit this?
00:04:18.180 | 'Cause it sounds like I had no social life,
00:04:19.800 | which I didn't, I was a social leper.
00:04:21.900 | But, and I started to realize that some parts of the brain
00:04:26.260 | were sleep related areas.
00:04:29.020 | And some dementias were eating away
00:04:31.540 | those sleep related areas.
00:04:33.660 | Other dementias would leave them untouched.
00:04:35.780 | And I thought, well, I'm doing this all wrong.
00:04:38.060 | I'm measuring my patients while they're awake.
00:04:41.260 | Instead I should be measuring them while they're asleep.
00:04:43.980 | Started doing that, got some amazing results.
00:04:47.520 | And then I wanted to ask the question,
00:04:49.700 | is that sleep disruption that my patients are experiencing
00:04:54.100 | as they go into dementia,
00:04:55.680 | maybe it's not a symptom of the dementia.
00:04:57.900 | I wonder if it's a cause of the dementia.
00:05:01.940 | And at that point, which was, cough, cough, 20 years ago,
00:05:06.060 | no one could answer a very simple fundamental question.
00:05:11.060 | Why do we sleep?
00:05:12.020 | And I at the time didn't realize
00:05:17.020 | that some of the most brilliant minds
00:05:18.700 | in scientific history had tried to answer that question
00:05:21.340 | and failed.
00:05:22.580 | And at that point, I just thought, well,
00:05:24.540 | I'm going to go and do a couple of years of sleep research
00:05:27.660 | and I'll figure out why we sleep.
00:05:30.220 | And then I'll come back to my patients
00:05:31.700 | in this question of dementia.
00:05:33.720 | And as I said, that was 20 years ago.
00:05:35.420 | And what I realized is that hard questions
00:05:38.620 | care very little about who asks them.
00:05:41.460 | They will mete out their lessons of difficulty
00:05:44.040 | all the same.
00:05:45.200 | And I was schooled in the difficulty of the question,
00:05:48.880 | why do we sleep?
00:05:50.540 | But in truth, 20 years later,
00:05:53.040 | we've had to upend the question
00:05:55.320 | rather than saying, why do we sleep?
00:05:57.260 | And by the way, the answer then was
00:05:59.980 | that we sleep to cure sleepiness,
00:06:02.160 | which is like saying, we eat to cure hunger.
00:06:07.080 | That tells you nothing about the physiological benefits
00:06:09.360 | of food, same with sleep.
00:06:11.600 | Now we actually have to ask the question,
00:06:14.320 | is there any physiological system in the body
00:06:17.100 | or any major operation of the mind
00:06:19.360 | that isn't wonderfully enhanced when we get sleep
00:06:22.400 | or demonstrably impaired when we don't get enough?
00:06:25.560 | And so far, for the most part, the answer seems to be no.
00:06:28.880 | - So far, the answer seems to be no.
00:06:32.960 | So why does the body and the mind crave sleep then?
00:06:37.960 | Why do we sleep?
00:06:41.200 | How can we begin to answer that question then?
00:06:45.800 | - So I think one of the ways that I think about this
00:06:48.240 | or one of the answers that came to me is the following.
00:06:53.080 | The reason that we implode so quickly
00:06:55.480 | and so thoroughly with insufficient sleep
00:06:58.680 | is because human beings seem to be one of the few species
00:07:01.840 | that will deliberately deprive themselves of sleep
00:07:04.060 | for no apparent good biological reason.
00:07:07.760 | And what that led me then to was the following.
00:07:12.240 | Mother nature as a consequence,
00:07:14.280 | so no other species does what we do in that context.
00:07:18.480 | There are a few species that do undergo sleep deprivation,
00:07:21.600 | but for very obvious, clear biological reasons.
00:07:25.120 | One is when they're in a condition of severe starvation.
00:07:28.240 | The second is when they're caring for their newborn.
00:07:31.520 | So for example, killer whales will often deprive themselves.
00:07:34.800 | The female will go away from the pod,
00:07:37.560 | give birth and then bring the calf back.
00:07:40.400 | And during that time,
00:07:41.820 | the mother will undergo sleep deprivation.
00:07:44.520 | And then the third one is during migration
00:07:47.160 | when birds are flying transoceanic, up to 3,000 miles.
00:07:51.920 | But for the most part, it's never seen in the animal kingdom
00:07:55.480 | which brings me back to the point.
00:07:57.680 | Therefore, mother nature in the course of evolution
00:08:01.440 | has never had to face the challenge
00:08:03.460 | of this thing called sleep deprivation.
00:08:06.520 | And therefore she has never created a safety net in place
00:08:11.000 | to circumnavigate this common influence.
00:08:15.880 | And there is a good example where we have,
00:08:18.980 | which is called the adipose cell, the fat cell.
00:08:22.840 | Because during our evolutionary past,
00:08:25.000 | we had famine and we had feast.
00:08:27.440 | And mother nature came up with a very clever recipe,
00:08:29.680 | which is how can I store caloric credit
00:08:34.480 | so that I can spend it when I go into debt?
00:08:37.200 | And the fat cell was born, brilliant idea.
00:08:39.620 | Where is the fat cell for sleep?
00:08:42.040 | Where is that sort of banking chip for sleep?
00:08:44.400 | And unfortunately we don't seem to have one
00:08:46.560 | because she's never had to face that challenge.
00:08:49.400 | - So even if there's not some kind of physics
00:08:52.680 | fundamental need for sleep,
00:08:55.300 | that physiologically or psychologically,
00:08:59.120 | the fact is most organisms are built such that they need it.
00:09:03.640 | And then mother nature never built an extra mechanism
00:09:08.040 | for sleep deprivation.
00:09:09.560 | So it's interesting that why we sleep
00:09:12.160 | might not have a good answer,
00:09:13.620 | but we need to sleep to be healthy is nevertheless true.
00:09:19.440 | - Yeah, and we have many answers right now.
00:09:21.700 | In some ways, the question of why we sleep
00:09:23.940 | was the wrong question too.
00:09:25.960 | It's what are the pluripotent many reasons we sleep?
00:09:30.240 | We don't just sleep for one reason.
00:09:32.520 | Because from an evolutionary perspective,
00:09:35.800 | it is the most idiotic thing that you could imagine.
00:09:40.000 | When you're sleeping, you're not finding a mate,
00:09:43.000 | you're not reproducing, you're not caring for your young,
00:09:45.520 | you're not foraging for food,
00:09:47.460 | and worse still, you're vulnerable to predation.
00:09:50.640 | So on any one of those grounds,
00:09:52.720 | especially as a collective,
00:09:54.680 | sleep should have been strongly selected
00:09:56.560 | against in the course of evolution.
00:09:59.800 | But in every species that we've studied carefully to date,
00:10:03.920 | sleep is present.
00:10:05.920 | - So it is important.
00:10:07.040 | So you're right.
00:10:09.000 | I think I've heard arguments
00:10:10.380 | from an evolutionary biology perspective
00:10:12.240 | that sleep is actually advantageous.
00:10:15.780 | Maybe like some kind of predator-prey relationships.
00:10:18.600 | But you're saying, and it actually makes way more sense
00:10:21.320 | what you're saying, is it should have been selected against.
00:10:25.400 | Why close your eyes?
00:10:27.280 | - Yeah, why?
00:10:28.120 | Because, and there was an energy conservation hypothesis
00:10:31.880 | for a while, which is that we need to essentially
00:10:34.000 | go into low battery mode,
00:10:36.200 | power down because it's unsustainable.
00:10:38.860 | But in fact, that actually has been blasted out the water
00:10:41.880 | because sleep is an incredibly active process.
00:10:45.660 | In fact, the difference between you just lying on the couch,
00:10:48.900 | but remaining conscious,
00:10:50.380 | versus you lying on the couch and falling asleep,
00:10:53.040 | it's only a savings of about 140, 150 calories.
00:10:56.600 | In other words, you just go out and club another baby seal
00:11:00.040 | or whatever it was, and you wouldn't worry.
00:11:02.360 | So it has to be much more to it than energy conservation,
00:11:05.020 | much more to it than sharing ecosystem space and time,
00:11:09.760 | much more to it than simply predator-prey relationships.
00:11:13.600 | If sleep really did, and looking back,
00:11:17.220 | even very old evolutionary organisms like earthworms,
00:11:20.680 | millions of years old,
00:11:22.120 | they have periods where they're active
00:11:24.720 | and periods where they're passively asleep.
00:11:26.520 | It's called lethargus.
00:11:27.720 | (laughs)
00:11:28.680 | And so what that in some ways suggested to me
00:11:32.200 | was sleep evolved with life itself on this planet,
00:11:36.680 | and then it has fought its way through heroically
00:11:39.680 | every step along the evolutionary pathway,
00:11:42.640 | which then leads to the sort of famous sleep statement
00:11:46.540 | from a researcher that if sleep doesn't serve
00:11:48.860 | an absolutely vital function or functions,
00:11:52.140 | then it's the biggest mistake
00:11:53.160 | the evolutionary process has ever made.
00:11:55.640 | And we've now realized mother nature
00:11:57.240 | didn't make a spectacular blunder with sleep.
00:12:00.080 | - You've mentioned the idea of conscious states.
00:12:03.120 | Do you think of sleep as a fundamentally
00:12:06.080 | different conscious state than awake-ness?
00:12:11.080 | And how many conscious states are there?
00:12:13.280 | So when you're into it,
00:12:14.640 | in your understanding of what the mind can do,
00:12:17.320 | do you think awake state, sleep state,
00:12:20.480 | or is there some kind of continuum?
00:12:22.680 | There's a complicated state transition diagram.
00:12:26.200 | Like how do you think about this whole space?
00:12:28.520 | - I think about it as a state-space diagram,
00:12:32.040 | and I think it's probably more of a continuum
00:12:34.080 | than we have believed it to be or suggested it to be.
00:12:37.660 | So we used to think, absent anesthesia,
00:12:41.400 | that there were already three main states of consciousness.
00:12:44.520 | There was being awake,
00:12:45.960 | being in non-rapid eye movement sleep or non-dream sleep,
00:12:50.360 | and then being in rapid eye movement sleep or dream sleep.
00:12:53.280 | And those were the three states
00:12:55.200 | within which your brain could percolate and be conscious.
00:12:59.480 | Conscious during non-REM sleep is maybe a stretch to say,
00:13:05.080 | but I still believe there is plenty of consciousness there.
00:13:08.480 | I don't believe that though anymore.
00:13:11.440 | And the reason is because we can have daydreams,
00:13:15.020 | and we are in a very different wakeful state
00:13:20.140 | in those daydreams than we are when we are,
00:13:22.520 | as we are now together, present and exteroceptively focused
00:13:27.160 | rather than interoceptively focused.
00:13:29.920 | And then we also know that as you are sort of progressing
00:13:35.820 | into those different stages of sleep during non-REM sleep,
00:13:38.840 | you can also still dream.
00:13:40.320 | Depends on your definition of dreaming,
00:13:42.360 | but we seem to have some degree of dreaming
00:13:44.880 | in almost all stages of sleep.
00:13:48.560 | We've also then found that when you are sleep deprived,
00:13:51.960 | even individual brain cells will fall asleep.
00:13:56.220 | Despite the animal being, you know,
00:13:59.060 | behaviorally from best we can tell awake,
00:14:02.420 | individual brain cells and clusters of brain cells
00:14:05.100 | will go into a sleep-like state.
00:14:07.740 | And humans do this too.
00:14:09.460 | When we are sleep deprived,
00:14:10.940 | we have what are called micro-sleeps
00:14:13.580 | where the eyelid will partially close
00:14:17.180 | and the brain essentially lapses
00:14:20.520 | into a state of sleep,
00:14:23.060 | but behaviorally you seem to be awake
00:14:25.320 | and the danger here is road traffic accidents.
00:14:28.800 | So these are the, what we call these sort of micro-sleep
00:14:33.180 | events at the wheel.
00:14:34.780 | Now, if you're traveling at 65 miles an hour
00:14:38.080 | in a two ton vehicle, you know,
00:14:40.720 | it takes probably around one second to drift
00:14:43.160 | from one lane to the next.
00:14:44.360 | And it takes two seconds to go completely off the road.
00:14:48.160 | So if you have one of these micro-sleeps at the wheel,
00:14:50.660 | you know, it could be the last micro-sleep
00:14:52.580 | that you ever have.
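
As a quick sanity check on those distances: at 65 mph a vehicle covers roughly 29 metres (about 95 feet) every second, so a one-second micro-sleep is roughly a full lane drift and two seconds puts you well off the road. A minimal sketch of that arithmetic (the numbers here are just the conversion, not anything beyond what is said above):

```python
# Back-of-the-envelope check of the distances covered during a micro-sleep.
MPH_TO_MPS = 1609.344 / 3600          # metres per second in one mile per hour

speed_mps = 65 * MPH_TO_MPS           # ~29 m/s at 65 mph

for seconds in (1, 2):
    metres = speed_mps * seconds
    feet = metres * 3.28084
    print(f"{seconds} s at 65 mph ~ {metres:.0f} m ({feet:.0f} ft)")

# 1 s -> ~29 m (~95 ft): roughly a drift into the next lane.
# 2 s -> ~58 m (~191 ft): enough to leave the road entirely.
```
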
00:14:53.540 | But I don't now see it as a set of, you know,
00:14:57.740 | very binary distinct, you know, step function states.
00:15:02.300 | It's not a one or a zero.
00:15:04.480 | I see it more of a, as a continuum.
00:15:07.060 | - Yeah, so I've for five, six years at MIT
00:15:13.460 | really focused on this human side of driving question.
00:15:17.740 | And one of the big concerns is the micro-sleeps,
00:15:22.740 | drowsiness, these kinds of ideas.
00:15:25.340 | And one of the open questions was,
00:15:27.060 | is it possible through computer vision to detect,
00:15:29.920 | or any kind of sensors?
00:15:32.300 | The nice thing about computer vision
00:15:33.860 | is you don't have to have direct contact to the person.
00:15:36.860 | Is it possible to detect increases in drowsiness?
00:15:41.700 | Is it possible to detect these kinds of micro-sleeps
00:15:44.100 | or actually just sleep in general?
00:15:46.080 | Among other things, like distraction.
00:15:49.220 | These are all words that have so many meanings
00:15:52.020 | and so many debates, like attention is a whole nother one.
00:15:55.860 | Just because you're looking at something
00:15:57.260 | doesn't mean you're loading in the information.
00:16:00.140 | Just because you're looking away
00:16:02.220 | doesn't mean your peripheral vision
00:16:03.720 | can't pick up the important information.
00:16:05.740 | There's so many complicated vision science things there.
00:16:08.860 | So I wonder if you could say something to,
00:16:11.980 | they say the eyes are the windows to the soul.
00:16:15.300 | Do you think the eyes can reveal something about sleepiness
00:16:20.300 | through computer vision,
00:16:24.820 | through just looking at the video of the face?
00:16:27.620 | And Andrew Huberman and I, your friend,
00:16:30.220 | have talked about this.
00:16:31.180 | I would love to work on this together.
00:16:33.660 | - You should do.
00:16:34.500 | - It's a fascinating problem.
00:16:35.680 | But drowsiness is a tricky one.
00:16:37.220 | So there's what kind of information?
00:16:39.380 | There's blinking and there's eye movement.
00:16:42.740 | And those are the ones that can be picked up
00:16:46.020 | with computer vision.
00:16:46.940 | Do you think those are signals that could be used
00:16:49.020 | to say something about where we are in this continuum?
00:16:52.700 | - Yeah, I do.
00:16:53.580 | And I think there are a number of other features too.
00:16:55.620 | I think aperture of eye,
00:16:59.500 | so in other words, partial closures, full closures,
00:17:03.160 | duration of those closures,
00:17:04.820 | duration of those partial closures of the eyelid.
00:17:08.120 | I think there may be some information in the pupil as well,
00:17:13.380 | because as we're transitioning between those states,
00:17:16.420 | there are changes in what's called
00:17:17.900 | the automatic nervous system,
00:17:19.300 | or technically it's called the autonomic nervous system,
00:17:22.780 | part of which will control your pupillary size.
00:17:27.060 | So I actually think that there is probably
00:17:29.820 | a wealth of information.
00:17:32.280 | When you combine that probably with aspects
00:17:35.680 | of steering angle, steering maneuver,
00:17:38.940 | and if you can sense the pressure on the pedals as well,
00:17:44.480 | my guess is that there is some combinatorial feature
00:17:48.040 | that creates a phenotype of you are starting to fall asleep.
00:17:53.040 | And as the autonomous controls develop,
00:17:57.860 | it's time for them to kick in.
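
To make the idea of a combinatorial sleepiness "phenotype" concrete, here is a minimal sketch of how signals like eyelid aperture, closure duration, pupil size, steering angle, and pedal pressure might be fused into a single score. The feature names and weights are hypothetical illustrations, not taken from any real driver-monitoring system; a production system would learn the weights from labelled drowsy and alert driving data.

```python
import math
from dataclasses import dataclass


@dataclass
class DrowsinessFrame:
    """Hypothetical features computed over a short window of driving."""
    eyelid_aperture: float         # mean openness, 0 = closed, 1 = fully open
    long_closure_ratio: float      # fraction of window with closures > 500 ms
    pupil_diameter_norm: float     # pupil size relative to the driver's alert baseline
    steering_reversal_rate: float  # steering reversals per minute
    pedal_pressure_var: float      # variance of pedal pressure


def drowsiness_score(f: DrowsinessFrame) -> float:
    """Toy logistic fusion of the features into a 0-1 sleepiness score.
    The weights below are invented for illustration; a real system
    would learn them from labelled data."""
    z = (
        -3.0 * f.eyelid_aperture
        + 4.0 * f.long_closure_ratio
        - 1.5 * f.pupil_diameter_norm
        + 0.8 * (f.steering_reversal_rate / 10.0)
        + 0.5 * f.pedal_pressure_var
        + 1.0
    )
    return 1.0 / (1.0 + math.exp(-z))   # squash to (0, 1)


# Alert-looking window: wide-open eyes, few long closures -> low score (~0.06).
print(drowsiness_score(DrowsinessFrame(0.9, 0.02, 1.0, 3.0, 0.1)))
```
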
00:18:00.480 | - Some manufacturers, auto manufacturers,
00:18:02.800 | sort of have something, beta version,
00:18:05.640 | or maybe an alpha version of this
00:18:07.900 | already starting to come online,
00:18:09.400 | where they have a little camera in the wheel
00:18:11.960 | that I think tries to look at some features.
00:18:14.520 | - Almost everybody is doing this, and it's very alpha.
00:18:17.720 | So, you know, the thing that you currently have,
00:18:23.200 | some people have that in their car,
00:18:24.400 | there's a coffee cup or something that comes up
00:18:26.480 | warning that you might be sleepy.
00:18:27.980 | The primary signal that they're comfortable using
00:18:31.580 | is steering wheel reversals.
00:18:33.380 | So basically using your interaction with the steering wheel
00:18:37.460 | and how much you're interacting with it
00:18:39.700 | as a sign of sleepiness.
00:18:40.900 | So if you have to constantly correct the car,
00:18:43.620 | that's a sign of you starting to drift into micro sleep.
00:18:47.360 | I think that's a very, very crude signal.
00:18:49.540 | It's probably a powerful one.
00:18:51.180 | There's a whole nother component to this,
00:18:52.740 | which is it seems like it's so driver and subject dependent.
00:18:57.580 | How our behavior changes as we get sleepy and drowsy
00:19:02.580 | seems to be different in complicated, fascinating ways
00:19:08.200 | where you can't just use one signal.
00:19:09.820 | It's kind of like what you're saying,
00:19:11.140 | there has to be a lot of different signals
00:19:13.580 | that you should then be able to combine.
00:19:15.440 | The hope is, there's this search for, like, universal signals
00:19:19.820 | that are pretty damn good for like 90% of people.
00:19:23.600 | But I don't think we need to take necessarily
00:19:26.240 | quite that approach.
00:19:27.240 | I think what we could do in some clever fashion
00:19:31.100 | is using the individual.
00:19:33.520 | So what you and I are perhaps suggesting here
00:19:35.620 | is that there is an array of features that we know
00:19:39.440 | provide information that is sensitive to whether or not
00:19:42.800 | you're falling asleep at the wheel.
00:19:44.580 | Some of those, let's say that there are 10 of them.
00:19:47.940 | For me, seven of them are the cardinal features.
00:19:51.860 | For you, however, six of them,
00:19:54.560 | and they're not all the same sort of overlapping,
00:19:58.060 | are those for you.
00:19:59.940 | I think what we need is algorithms
00:20:01.880 | that can firstly understand when you are well slept.
00:20:04.460 | So let's say that people have sleep trackers at night
00:20:06.940 | and then your car integrates that information.
00:20:09.260 | - That would be amazing.
00:20:10.100 | - And it understands when you are well slept.
00:20:12.620 | And then you've got the data of the individual behavior
00:20:16.660 | unique to that individual, snowflake-like,
00:20:20.340 | when they are well slept.
00:20:21.900 | This is the signature of well-rested driving.
00:20:26.520 | Then you can look at deviations from that
00:20:29.220 | and pattern match it with the sleep history
00:20:32.220 | of that individual.
00:20:33.700 | And then I don't need to find the sort of, you know,
00:20:37.460 | the one size fits all approach for 99% of the people.
00:20:41.060 | I can create a very bespoke, tailor-like set of features,
00:20:45.380 | a Savile Row suit of sleepiness features.
00:20:49.560 | You know, that would be my,
00:20:50.520 | if you want to ask me about moonshots and crazy ideas,
00:20:53.540 | that's where I'd go.
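
A minimal sketch of the personalized, "Savile Row" approach being described here: learn each driver's well-rested signature, then score new drives by deviation from that baseline, gated by recent sleep history. All feature names, thresholds, and inputs are invented for illustration, under the assumption that per-drive summary features and a sleep-tracker reading are available.

```python
import statistics
from typing import Dict, List


class PersonalBaseline:
    """Per-driver model: collect feature statistics from well-rested drives,
    then score new drives by how far they deviate from that signature."""

    def __init__(self) -> None:
        self.history: Dict[str, List[float]] = {}

    def add_rested_drive(self, features: Dict[str, float]) -> None:
        # Only log drives taken after a good night of sleep
        # (e.g., confirmed by a wearable sleep tracker).
        for name, value in features.items():
            self.history.setdefault(name, []).append(value)

    def deviation_score(self, features: Dict[str, float]) -> float:
        # Mean absolute z-score of today's features against this driver's
        # own well-rested baseline (higher = more unusual driving).
        zs = []
        for name, value in features.items():
            vals = self.history.get(name, [])
            if len(vals) < 2:
                continue
            mu, sd = statistics.mean(vals), statistics.stdev(vals)
            if sd > 0:
                zs.append(abs(value - mu) / sd)
        return sum(zs) / len(zs) if zs else 0.0


# Usage sketch: warn when driving deviates strongly AND last night's sleep was short.
model = PersonalBaseline()
model.add_rested_drive({"blink_rate": 12.0, "steering_reversals": 3.0})
model.add_rested_drive({"blink_rate": 13.0, "steering_reversals": 4.0})

score = model.deviation_score({"blink_rate": 22.0, "steering_reversals": 9.0})
slept_hours = 4.5   # hypothetical reading from last night's sleep tracker
if score > 2.0 and slept_hours < 6.0:
    print("Warning: driving pattern deviates from your well-rested baseline.")
```
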
00:20:54.380 | But to start with, I think your approach is a great one.
00:20:58.180 | Let's find something that covers 99% of the people.
00:21:02.060 | Because the worrying thing about microsleeps, of course,
00:21:04.700 | unlike, you know, drugs or alcohol,
00:21:07.420 | which, you know, certainly is a terrible thing
00:21:10.300 | to be behind the wheel,
00:21:11.920 | with those, often you react too late.
00:21:16.920 | And that's the reason you get into an accident.
00:21:20.160 | When you fall asleep behind the wheel,
00:21:22.820 | you don't react at all.
00:21:24.240 | You know, at that point,
00:21:26.220 | there is a two-ton missile driving down the street
00:21:28.580 | and no one's in control.
00:21:29.920 | That's why those accidents can often be more dangerous.
00:21:34.380 | - Yeah, and the fascinating thing is,
00:21:37.320 | in the case of semi-autonomous vehicles,
00:21:39.380 | like Tesla Autopilot,
00:21:40.920 | this is where I've had disagreements with Mr. Elon Musk,
00:21:44.700 | and the human factors community,
00:21:50.020 | which is this community that one of the big things
00:21:52.420 | they study is human supervision over automation.
00:21:56.780 | So you have like pilots, you know,
00:21:59.220 | supervising an airplane that's mostly flying autonomously.
00:22:02.760 | The question is, when we're actually doing the driving,
00:22:06.740 | how do microsleeps or general,
00:22:09.900 | how does drowsiness progress
00:22:12.140 | and how does it affect our driving?
00:22:14.060 | That question becomes more fascinating, more complicated,
00:22:17.500 | when your task is not driving,
00:22:19.580 | but supervising the driving.
00:22:21.680 | So your task is to take over when stuff goes wrong.
00:22:24.820 | And that is complicated,
00:22:27.200 | but the basic conclusions from many decades
00:22:30.300 | is that humans are really crappy at supervising
00:22:34.340 | because they get drowsy
00:22:36.780 | and lose vigilance much, much faster.
00:22:39.540 | The really surprising thing with Tesla Autopilot,
00:22:42.260 | it was surprising to me,
00:22:43.660 | surprising to the human factors community,
00:22:47.060 | and in fact, they still argue with me about it,
00:22:49.380 | is it seems that humans in Teslas with Autopilot
00:22:54.380 | and other similar systems are not becoming less vigilant,
00:22:58.740 | at least with the studies we've done.
00:23:01.580 | So there's something about the urgency of driving.
00:23:05.300 | I can't, I'm not sure why,
00:23:07.220 | but there's something about the risk,
00:23:08.880 | I think the fact that you might die
00:23:10.780 | is still keeping people awake.
00:23:14.440 | The question is, as Tesla Autopilot
00:23:16.500 | or similar systems get better and better and better,
00:23:19.420 | how does that affect increasing drowsiness?
00:23:21.700 | And that's when you need to have,
00:23:23.220 | that's where the big disagreement was.
00:23:25.380 | You need to have driver sensing,
00:23:27.340 | meaning driver facing camera
00:23:29.140 | that tracks some kind of information about the face
00:23:33.660 | that can tell you drowsiness.
00:23:36.060 | So you can tell the car if you're drowsy
00:23:38.980 | so that the car can be like,
00:23:40.220 | you should probably be driving or pull to the side.
00:23:43.900 | - Right, or I need to do some of the heavy lifting here.
00:23:47.460 | - Yeah, so there needs to be that dance of interaction
00:23:51.700 | of a human and machine.
00:23:53.660 | But currently it's mostly steering wheel based.
00:23:56.820 | So this idea that your hands should be on the steering wheel
00:24:01.820 | that's a sign that you're paying attention
00:24:05.300 | is an outdated and a very crude metric.
00:24:09.460 | - I agree, yeah.
00:24:11.220 | I think there are far more sophisticated ways
00:24:13.340 | that we can solve that problem if we invest.
00:24:16.400 | - Can I ask you a big philosophical question
00:24:20.620 | before we get into fun details?
00:24:22.980 | On the topic of conscious states,
00:24:26.560 | how fundamental do you think is consciousness
00:24:31.220 | to the human mind?
00:24:33.340 | I ask this from almost like a robotics perspective.
00:24:36.420 | So in your study of sleep,
00:24:39.060 | do you think the hard question of consciousness,
00:24:42.100 | that it feels like something to be us,
00:24:45.380 | is that like a nice little feature,
00:24:46.860 | like a quirk of our mind,
00:24:50.220 | or is it somehow fundamental?
00:24:51.980 | Because sleep feels like we take a step out
00:24:55.820 | of that consciousness a little bit.
00:24:57.820 | So from all your study of sleep,
00:25:00.540 | do you think consciousness is like deeply part of who we are
00:25:04.300 | or is it just a nice trick?
00:25:05.820 | - I think it's a deeply embedded feature
00:25:09.700 | that I can imagine has a whole panoply
00:25:13.340 | of biological benefits.
00:25:16.100 | But to your point about sleep,
00:25:17.700 | what is interesting if you do a lot of dream research,
00:25:20.900 | and we've done some,
00:25:22.020 | it's very, very rare at all, in fact,
00:25:28.140 | for you to end up becoming someone other
00:25:31.540 | than who you are in your dreams.
00:25:33.420 | Now, you can have third-person perspective dreams
00:25:35.940 | where you can see yourself in the dream
00:25:38.760 | as if you've risen above your physical being.
00:25:43.760 | But for the most part,
00:25:46.220 | it's very rare that we lose our sense of conscious self.
00:25:51.220 | And maybe I'm sort of doing a sleight of hand
00:25:53.700 | because it's really what I'm saying
00:25:54.860 | is it's very rare that we lose our sense
00:25:56.620 | of who we are in dreams.
00:25:58.700 | We never do.
00:25:59.980 | Now, that's not to suggest that dreams aren't utterly bizarre
00:26:04.380 | and I mean, when you slept last night,
00:26:07.820 | which I know may have been perhaps a little less than me,
00:26:12.100 | but when you went into dreaming,
00:26:14.740 | you became flagrantly psychotic.
00:26:18.180 | And there are five essentially good reasons.
00:26:20.420 | Firstly, you started to see things which were not there,
00:26:23.460 | so you were hallucinating.
00:26:25.180 | Second, you believe things that couldn't possibly be true,
00:26:28.280 | so you were delusional.
00:26:30.180 | Third, you became confused about time and place and person,
00:26:35.180 | so you're suffering
00:26:36.140 | from what we would call disorientation.
00:26:38.580 | Fourth, you have wildly fluctuating emotions,
00:26:41.660 | something that psychiatrists
00:26:43.660 | will call being affectively labile.
00:26:46.460 | And then how wonderful, you woke up this morning
00:26:48.580 | and you forgot most, if not all of that dream experience,
00:26:51.140 | so you're suffering from amnesia.
00:26:52.820 | If you had to experience any one of those five things
00:26:55.900 | while you're awake,
00:26:57.060 | you would probably be seeking psychological help.
00:26:59.840 | So I place that as a backdrop
00:27:03.940 | against your astute question,
00:27:06.140 | because despite all of that psychosis,
00:27:10.220 | there is still a present self nested at the heart of it,
00:27:16.060 | meaning that I think it's very difficult for us
00:27:19.780 | to abandon our conscious sense of self.
00:27:24.100 | And if it's that hard,
00:27:25.900 | the old adage in some ways that you can't outrun your shadow,
00:27:29.260 | but here it's more of a philosophical question,
00:27:31.400 | which is about the conscious mind
00:27:33.660 | and what the state of consciousness actually means
00:27:36.220 | in a human being.
00:27:37.240 | So I think that that to me,
00:27:39.700 | you become so dislocated
00:27:43.700 | from so many other rational ways of waking consciousness,
00:27:47.860 | but one thing that won't go away,
00:27:49.500 | that won't get perturbed or sort of, you know,
00:27:54.500 | manacled, is this: your sense of conscious self.
00:27:58.180 | - Yeah, there's a strong sign
00:27:59.380 | that consciousness is fundamental to the human mind.
00:28:03.380 | - Or we're just creatures of habit.
00:28:04.900 | We've gotten used to having consciousness.
00:28:06.860 | Maybe it just takes a lot of either chemical substances
00:28:11.060 | or a lot of like mental work to escape that.
00:28:15.960 | I mean, it's like trying to launch a rocket.
00:28:19.340 | You know, the energy that has to be put in
00:28:22.060 | to create escape velocity from the gravitational pull
00:28:26.140 | of this thing called planet earth is immense.
00:28:29.060 | Well, the same thing is true for us
00:28:33.260 | to abandon our sense of conscious self.
00:28:36.700 | The amount of biological, the amount of substances,
00:28:39.360 | the amount of wacky stuff that you have to do
00:28:42.020 | to truly get escape velocity from your conscious self.
00:28:46.420 | What does that tell us about then
00:28:48.380 | the fundamental state of our conscious self?
00:28:52.220 | - Yeah, it also probably says that it's quite useful
00:28:55.540 | to have consciousness for survival
00:28:58.700 | and for just operation in this world.
00:29:01.420 | And perhaps for intelligence.
00:29:02.780 | I'm one of the, on the AI side,
00:29:05.020 | people that think that intelligence requires consciousness.
00:29:10.020 | So like high levels of general intelligence
00:29:12.960 | requires consciousness.
00:29:14.460 | Most people in the AI field think like consciousness
00:29:17.700 | and intelligence are fundamentally different.
00:29:19.580 | You can build a computer that's super intelligent.
00:29:22.100 | It doesn't have to be conscious.
00:29:23.840 | I think that if you define super intelligence
00:29:26.660 | by being good at chess, yes.
00:29:28.580 | But if you define super intelligence
00:29:30.820 | as being able to operate in this living world of humans
00:29:35.020 | and be able to perform all kinds of different tasks,
00:29:37.540 | consciousness, it seems to be somehow fundamental
00:29:41.020 | to richly integrate yourself into the human experience,
00:29:46.800 | into society.
00:29:48.320 | It feels like you have to be a conscious being.
00:29:50.960 | But then we don't even know what consciousness is
00:29:53.460 | and we certainly don't know how to engineer it
00:29:55.640 | in our machines.
00:29:56.680 | - I love the fact that there are still questions
00:30:00.120 | that are so embryonic because, you know,
00:30:03.260 | I suspect it's the same with you.
00:30:05.640 | Answers to me are simply ways to get to more questions.
00:30:09.820 | You know, it's questions where, you know,
00:30:12.160 | questions turn me on, answers less so.
00:30:15.040 | And I love the fact that we are still embryonic
00:30:18.840 | in our sense of arguing about even what the definition
00:30:22.200 | of consciousness is.
00:30:23.800 | But I also find it fascinating,
00:30:26.360 | I think it's thoroughly delightful to absorb yourself
00:30:28.540 | in the thought.
00:30:30.120 | Think about the brain and we can move back
00:30:33.760 | across the complexity of phylogeny from, you know,
00:30:36.520 | humans to mammals to sort of birds to reptiles,
00:30:40.360 | amphibians, fish, and you can, bacteria, whatever you want.
00:30:44.900 | And you can go through this and say, okay,
00:30:46.680 | where is the hard line of, you know,
00:30:48.680 | what we would define as consciousness?
00:30:51.240 | And I'm sure it's got something to do with the complexity
00:30:54.400 | of the neural system, of that I'm fairly certain.
00:30:57.600 | But to me, it's always been fascinating.
00:31:01.400 | So what is it then?
00:31:02.520 | You know, is it that I just keep adding neurons
00:31:04.680 | to a Petri dish and I just keep adding them
00:31:07.200 | and adding them and adding them.
00:31:08.280 | At some point when I hit a critical mass
00:31:10.280 | of interconnected neurons, that is the mass of the,
00:31:13.240 | you know, the interconnected human brain, then bingo.
00:31:16.320 | All of a sudden it kicks into gear
00:31:19.840 | and we have consciousness.
00:31:21.360 | - Like a phase shift, phase transition of some kind.
00:31:23.600 | - Correct, yeah.
00:31:24.520 | But there is something about the complexity
00:31:26.320 | of the nervous system that I think is fundamental
00:31:28.720 | to consciousness.
00:31:29.560 | And the reason I bring that up is because when we're trying
00:31:32.040 | to then think about creating it in an artificial way,
00:31:35.280 | does that inform us as to the complexity
00:31:37.760 | that we should be looking at in terms of development?
00:31:41.340 | I also think that it's a missed opportunity
00:31:44.480 | in the sort of digital space for us to try
00:31:49.020 | to recreate human consciousness.
00:31:51.400 | We've already got human consciousness.
00:31:54.720 | What if we were to think about creating some other form
00:31:58.520 | of consciousness?
00:31:59.360 | Why do we have to think that the ultimate in the creation
00:32:02.560 | of, you know, an artificial intelligence is the replication,
00:32:07.400 | you know, of a human state of consciousness?
00:32:11.920 | Can we not think outside of our own consciousness
00:32:16.680 | and believe that there is something even more
00:32:18.840 | incredible or more complementary, more orthogonal?
00:32:22.420 | So I'm sometimes perplexed that people are trying
00:32:28.820 | to mimic human consciousness rather than think
00:32:31.260 | about creating something that's different.
00:32:35.020 | - I think of human consciousness or consciousness in general
00:32:37.780 | as this magic superpower that allows us
00:32:42.780 | to deeply experience the world.
00:32:44.780 | And just as you're saying, I don't think that superpower
00:32:47.300 | has to take the exact flavor as humans have.
00:32:49.800 | That's my love for robots.
00:32:51.880 | I would love to add the ability to robots
00:32:56.200 | that can experience the world and other humans deeply.
00:33:01.200 | I'm humbled by the fact that that idea does not necessarily
00:33:04.880 | need to look anything like how humans experience the world.
00:33:09.360 | But there's a dance of human to robot connection,
00:33:14.840 | the same way human to dog or human to cat connection.
00:33:18.560 | There's a magic there to that interaction.
00:33:21.760 | And I'm not sure how to create that magic,
00:33:23.640 | but it's a worthy effort.
00:33:25.160 | I also love, just exactly as you said,
00:33:28.000 | on the question of consciousness
00:33:29.480 | or engineering consciousness,
00:33:31.140 | the fun thing about this problem is it seems obvious to me
00:33:36.640 | that 100 years from now, no matter what we do today,
00:33:41.340 | people, if we're still here,
00:33:43.340 | will laugh at how silly our notions were.
00:33:47.320 | So it's almost impossible for me to imagine
00:33:49.900 | that we will truly solve this problem fully in my lifetime.
00:33:54.900 | And more than that, everything we'll do
00:34:00.100 | will be silly 100 years from now.
00:34:02.620 | But it's still, that makes it fun to me
00:34:05.660 | because it's like you have the full freedom
00:34:07.940 | to not even be right, just to try,
00:34:11.260 | just to try is freedom.
00:34:12.900 | And that's how I see--
00:34:15.300 | - Get me that T-shirt, please.
00:34:17.020 | (laughing)
00:34:17.860 | I love that.
00:34:18.700 | - So, and human robot interaction is fascinating
00:34:22.100 | because it's like watching dancing.
00:34:24.260 | I've been dancing tango recently.
00:34:28.340 | And just, it's like, there is no goal.
00:34:31.100 | The goal is to create something magical.
00:34:33.820 | And whether consciousness or emotion
00:34:37.620 | or elegance of movement, all of those things,
00:34:41.460 | aid in the creation of the magic.
00:34:43.380 | And it's a free, it's an art form to explore
00:34:46.060 | how to make that, how to create that
00:34:49.900 | in a way that's compelling.
00:34:51.260 | - Yeah, I love the line in "Scent of a Woman"
00:34:53.480 | with Al Pacino where he's speaking about the tango.
00:34:55.980 | And he said, "Really, it's just freedom
00:34:57.660 | "that if you get tangled up, you just keep tangoing on."
00:35:01.900 | - I still to this day, I think, well,
00:35:05.340 | first or second time I talked to Joe Rogan on his podcast,
00:35:08.880 | I said, "We got into this heated argument
00:35:11.460 | "about whether 'Sense of a Woman'
00:35:13.880 | "is a better movie than 'John Wick'."
00:35:15.700 | Because it's one of my favorite movies for many reasons.
00:35:20.300 | One is-- - "Scent of a Woman."
00:35:21.940 | - "Scent of a Woman."
00:35:22.980 | - Partially, I didn't know that, by the way.
00:35:25.260 | I was just gonna--
00:35:26.100 | - You just--
00:35:26.940 | - Yeah, I didn't know if you would actually know
00:35:28.240 | of the movie. - Awesome, awesome.
00:35:29.540 | No, yeah, I said, "I love the tango scene.
00:35:31.840 | "I love Al Pacino's performance.
00:35:34.600 | "It's a wonderful movie."
00:35:36.880 | Then Joe was saying, "John Wick is better."
00:35:39.380 | So we, to this day, argue about this.
00:35:41.460 | - I think it depends on what conscious state you're in
00:35:44.980 | that you would be ready and receptive to.
00:35:46.980 | But "Scent of a Woman," I think it has one of the best
00:35:50.480 | monologues at the end of the movie
00:35:52.900 | that has ever been written or at least performed.
00:35:57.060 | - When Al Pacino defends the younger.
00:36:01.980 | Yeah, I often think about that.
00:36:05.940 | There's been times in my life, I don't know about you,
00:36:10.000 | where I wish I had an Al Pacino in my life,
00:36:12.600 | where integrity is really important in this life.
00:36:17.600 | - It is.
00:36:19.140 | - And sometimes you find yourself in places
00:36:20.820 | where there's pressure to sacrifice that integrity.
00:36:25.600 | And you want, what is it, Lieutenant Colonel
00:36:29.200 | or whatever he was--
00:36:30.040 | - (laughs) Slade.
00:36:31.620 | - Yeah, to come in on your side
00:36:34.800 | and scream at everyone and say,
00:36:36.340 | "What the hell are we doing here?"
00:36:38.520 | - Being, unfortunately, British
00:36:41.280 | and sort of having that slightly awkward
00:36:44.360 | sort of Hugh Grant gene, it's very, very,
00:36:47.240 | very at the opposite end of the spectrum
00:36:48.800 | of the remarkable feat of Al Pacino
00:36:51.840 | at the end of that scene.
00:36:53.080 | But, and yeah, integrity is, it's a challenging thing
00:36:58.080 | and I value it much.
00:37:00.300 | And I think it can take 20 years to build a reputation
00:37:04.860 | and two minutes to lose it.
00:37:06.540 | And there is nothing more that I value than integrity.
00:37:10.380 | And if I'm ever wrong about anything,
00:37:13.540 | I truly don't want to be wrong
00:37:15.980 | for any longer than I have to be.
00:37:17.920 | That's what being in some ways a scientist is.
00:37:22.660 | You're just driven by truth.
00:37:25.420 | And the irony relative to something like mathematics
00:37:29.140 | is that in science, you never find truth.
00:37:31.860 | What all you do in science is you discount the things
00:37:35.420 | that are likely to be untrue,
00:37:37.980 | leaving only the possibility of what could be true.
00:37:42.180 | But in math, when you create a proof, it's a proof for,
00:37:47.180 | from that point forward, there is truth in mathematics.
00:37:54.260 | And I think there's a beauty in that.
00:37:56.940 | But I kind of like the messiness of science,
00:38:00.180 | because again, to me, it's less about the truth
00:38:03.580 | of the answer and it is more about the pursuit of questions.
00:38:07.220 | - But their integrity becomes more and more important
00:38:10.060 | and it becomes more difficult.
00:38:11.780 | There's a lot of pressure,
00:38:12.900 | just like in the rest of the world,
00:38:14.060 | but there's a lot of pressures on a scientist.
00:38:17.140 | One is like funding sources.
00:38:19.420 | I've noticed this, that money affects everyone's mind,
00:38:23.060 | I think.
00:38:24.580 | I've been always somebody that I believe money can't,
00:38:28.340 | you can't buy my opinion.
00:38:31.500 | I don't care how much money, billions or trillions.
00:38:34.060 | But that pressure is there
00:38:37.020 | and you have to be very cognizant of it
00:38:38.740 | and make sure that your opinion is not defined
00:38:41.660 | by the funding sources.
00:38:43.380 | And then the other is just your own success of,
00:38:48.260 | for a couple of decades, publishing an idea
00:38:53.860 | and then realizing at some point
00:38:55.460 | that that idea was wrong all along.
00:38:57.900 | And that's a tough thing for people to do,
00:39:00.820 | but that's also integrity is to walk away,
00:39:03.220 | is to say that you were wrong.
00:39:04.820 | That doesn't have to be in some big dramatic way.
00:39:08.900 | It could be in a bunch of tiny ways along the way.
00:39:11.940 | Like reconfigure your intuition about a particular problem.
00:39:17.700 | And all of that is integrity.
00:39:20.540 | When everybody in the room believes a certain thing,
00:39:24.500 | everybody in the community believes a certain thing
00:39:27.460 | to be able to still be open-minded in the face of that.
00:39:31.020 | - Yeah, and I think it comes down in some ways
00:39:33.820 | to the issue of ego,
00:39:35.340 | that you bond your correctness or your rightness,
00:39:39.980 | your scientific theory with your sense of ego.
00:39:44.980 | I've never found it that difficult to let go
00:39:48.780 | of theories in the face of counter evidence,
00:39:52.780 | in part because I have such low self-esteem.
00:39:56.380 | - Well, I kind of like that.
00:39:57.900 | I've always liked that combination.
00:39:59.020 | I have the same.
00:40:00.020 | I'm like very self-critical, imposter syndrome,
00:40:02.460 | all those things, putting yourself below the podium,
00:40:05.740 | but at the same time, having the ego
00:40:08.140 | that drives the ambition to work your ass off.
00:40:11.060 | Like some kind of weird drive, maybe,
00:40:14.860 | like drive to be better.
00:40:16.740 | Like thinking yourself is not that great
00:40:18.940 | and always driving to be better.
00:40:20.940 | And at the same time,
00:40:22.020 | because that can be paralyzing and exhausting and so on,
00:40:25.940 | at the same time, just being grateful to be alive.
00:40:28.340 | But in the sciences, in the actual effort,
00:40:31.340 | never be satisfied, never think of yourself highly.
00:40:33.940 | It seems to be a nice combination.
00:40:35.860 | - I very much hope that that is part of who I am,
00:40:39.700 | and I remain very quietly motivated and driven,
00:40:44.420 | and I, like you, love the idea of perfection,
00:40:47.980 | and I know I will never achieve it,
00:40:49.740 | but I will never stop trying to.
00:40:51.380 | - So similar to you, which sounds weird
00:40:55.540 | because there's all these videos of me on the internet.
00:40:59.420 | So I think I just naturally lean into the things
00:41:04.420 | I'm afraid of and I'm uncomfortable doing.
00:41:07.060 | Like I'm very afraid of talking to people,
00:41:09.180 | and just even before talking to you today,
00:41:12.740 | just a lot of anxiety and all those kinds of things.
00:41:16.140 | - About talking to me?
00:41:17.580 | - Yeah, yeah.
00:41:18.420 | - Oh, I like.
00:41:19.240 | - Nervousness.
00:41:20.080 | Fear in some cases, self-doubt and all those kinds of things
00:41:24.580 | but I do it anyway.
00:41:26.020 | So the reason I bring that up is
00:41:27.820 | you've launched a podcast.
00:41:32.140 | - I have.
00:41:32.980 | - Allow me to say,
00:41:35.440 | I think you're a great science communicator.
00:41:37.420 | So this challenge of being
00:41:41.300 | afraid or cautious of being in the public eye
00:41:46.300 | and yet having a longing to communicate
00:41:51.420 | some of the things you're excited about
00:41:53.620 | in the space of sleep and beyond.
00:41:55.900 | What's your vision with this project?
00:41:57.800 | - I think firstly to that question,
00:42:02.780 | like you, I am always more afraid of not trying than trying.
00:42:07.780 | - Yeah.
00:42:08.820 | - That to me frightens me more.
00:42:11.040 | But with the podcast, I think really,
00:42:15.980 | I have two very simple goals.
00:42:18.640 | I want to try and democratize the science of sleep.
00:42:23.000 | And in doing so, my goal would be to try
00:42:25.020 | and reunite humanity with the sleep
00:42:27.060 | that it is so desperately bereft of.
00:42:29.200 | And if I can do that through a number of different means,
00:42:33.820 | the podcast is a little bit different than this format.
00:42:43.500 | It's going to be short-form monologues from yours truly,
00:42:43.500 | that will last usually less than just 10 minutes.
00:42:47.820 | And I see it as simply a little slice of sleep goodness
00:42:51.460 | that can accompany your waking day.
00:42:53.980 | - It's hard to know what is the right way
00:42:56.420 | to do science communication.
00:42:58.480 | Like your friend and mine, Andrew Huberman,
00:43:01.660 | he's an incredible human being.
00:43:05.500 | - Oh gosh.
00:43:06.340 | - He does like two hours of,
00:43:09.460 | I wonder how many takes he does, I don't know.
00:43:11.060 | But it looks like he doesn't do any.
00:43:12.860 | - Yeah, I suspect he's that magnificent of a human being.
00:43:16.340 | - When I talk to him in person,
00:43:18.980 | he always generates intelligent words,
00:43:21.620 | well cited nonstop for hours.
00:43:24.260 | So I don't.
00:43:25.740 | - He's a Gatling gun of information and it's pristine.
00:43:29.060 | - And passion and all those kinds of things.
00:43:30.700 | So that's an interesting medium.
00:43:35.260 | It's funny 'cause I wouldn't have done it
00:43:36.780 | the way he's doing it.
00:43:38.060 | I wouldn't advise him to do it the way he's doing it
00:43:40.380 | 'cause I thought there's no way
00:43:41.380 | you could do what you're doing.
00:43:42.860 | 'Cause it's a lot of work,
00:43:45.580 | but he is like doing an incredible job of it.
00:43:48.460 | I just think it's the same with like Dan Carlin
00:43:51.100 | in hardcore history.
00:43:52.300 | I thought that the way Andrew's doing it
00:43:56.740 | would crush him the way it crushes Dan Carlin.
00:44:00.900 | So Dan has so much pressure on him to do a good job
00:44:04.720 | that he ends up publishing like two episodes a year.
00:44:08.180 | So that pressure can be paralyzing.
00:44:10.140 | The pressure of like putting out
00:44:12.660 | like strong scientific statements,
00:44:16.180 | that can be overwhelming.
00:44:17.860 | Now, Andrew seems to be just plowing through anyway.
00:44:21.580 | If there's mistakes,
00:44:22.860 | he'll say there's corrections and so on.
00:44:25.260 | I just, I wonder,
00:44:26.280 | actually I haven't talked to him too much about it.
00:44:27.940 | Like psychologically,
00:44:28.980 | how difficult is it to put yourself out there
00:44:32.040 | for an hour to a week of just nonstop dropping knowledge?
00:44:37.040 | Any one sentence of which could be totally wrong.
00:44:41.180 | It could be a mistake.
00:44:42.020 | - And there will be mistakes.
00:44:44.580 | And I, in the first edition of my book,
00:44:47.500 | there were errors that we corrected
00:44:50.060 | in the second edition too.
00:44:51.540 | But there will be probabilistically,
00:44:55.900 | if you've got 10 facts per page of a book
00:44:59.140 | and you've got 350 pages,
00:45:02.520 | odds are it's probably not going to be
00:45:05.320 | utter perfection out the gate.
00:45:07.460 | And it will be the same way for Andrew too.
00:45:10.860 | But having the reverence of a humble mind
00:45:16.800 | and simply accepting the things that are wrong
00:45:22.000 | and correcting them and doing the right thing,
00:45:23.920 | I know that that's his mentality.
00:45:26.640 | - I do want to say that I'm just kind of honored to be,
00:45:30.260 | it's a cool group of like scientific people
00:45:34.780 | that I'm fortunate enough to now be interacting with.
00:45:37.820 | It's you and Andrew and David Sinclair
00:45:41.180 | has been thinking about throwing his hat in the ring.
00:45:43.100 | - Oh, I hope so.
00:45:43.940 | David is another one of those very special people
00:45:46.740 | in the world.
00:45:47.580 | - So it's cool because podcasts are, it's cool.
00:45:51.100 | It's such a powerful medium of communication.
00:45:53.460 | It's much freer than more constrained.
00:45:56.540 | Like publications and so on,
00:45:58.400 | or it's much more accessible and inspiring
00:46:01.260 | than like, I don't know,
00:46:02.240 | conference presentations or lectures.
00:46:04.020 | And so it's a really exciting medium to me.
00:46:06.900 | And it's cool that there's this group of people
00:46:08.860 | that are becoming friends and putting stuff out there
00:46:12.700 | and supporting each other.
00:46:13.660 | So it's fun to also watch
00:46:16.800 | how that's going to evolve in your case.
00:46:18.940 | 'Cause I wonder, it'll be two a month.
00:46:20.860 | - Or devolve is the answer to that.
00:46:24.980 | - Well, I mean, some of it is persistence
00:46:27.700 | through the challenges that we've been talking about,
00:46:30.220 | which is like-
00:46:31.060 | - I think I've got a lot to learn.
00:46:32.460 | - Yeah.
00:46:33.300 | - But I will persist.
00:46:34.120 | - Look, can I ask you some detailed stuff?
00:46:37.480 | You mentioned that one.
00:46:38.320 | - Oh my goodness, go anywhere you wish with sleep.
00:46:41.600 | - So I'm a big fan of coffee and caffeine.
00:46:45.300 | And I've been, especially in the last few days,
00:46:47.820 | consuming a very large amount.
00:46:50.140 | And I'm cognizant of the fact that
00:46:54.260 | my body is affected by caffeine
00:46:56.300 | different than the anecdotal information
00:46:59.060 | that other people tell me.
00:47:00.780 | I seem to be not at all affected by it.
00:47:03.940 | It's almost,
00:47:04.780 | it feels more like a ritual
00:47:09.300 | than it is a chemical boost to my performance.
00:47:12.180 | Like I can drink several cups of coffee right before bed
00:47:15.420 | and just knock out anyway.
00:47:17.780 | I'm not sure if it's a biological chemical
00:47:20.260 | or it has to do with just the fact
00:47:21.580 | that I'm consuming huge amounts of caffeine.
00:47:24.380 | All that to say,
00:47:25.720 | what do you think is the relationship
00:47:29.580 | between coffee and sleep, caffeine and sleep?
00:47:32.940 | If there's an interesting distinction there.
00:47:34.420 | - There is a distinction.
00:47:35.680 | So I think the first thing to say,
00:47:37.620 | which is going to sound strange coming from me,
00:47:40.220 | is drink coffee.
00:47:41.660 | The health benefits associated with drinking coffee
00:47:46.500 | are really quite well-established now.
00:47:51.220 | But I think that the counterpoint to that,
00:47:54.220 | well, firstly, the dose and the timing make the poison,
00:47:57.860 | and I'll perhaps come back to that in just a second.
00:48:01.260 | But for coffee, it's actually not the caffeine.
00:48:06.260 | So a lot of people have asked me
00:48:10.580 | about this rightful paradox between the fact
00:48:13.120 | that sleep provides all of these incredible health benefits
00:48:16.560 | and then coffee, which can have a deleterious impact
00:48:20.340 | on your sleep, has a whole collection of health benefits,
00:48:24.020 | many of them, Venn diagram overlapping
00:48:26.420 | with those that sleep provides.
00:48:28.340 | How on earth can you reconcile those two?
00:48:31.220 | And the answer is that, well, the answer is very simple.
00:48:34.780 | It's called antioxidants.
00:48:37.740 | That it turns out that for most people
00:48:39.860 | in Western civilization, because of diet not being
00:48:43.500 | quite what it should be, the major source
00:48:47.580 | through which they obtain antioxidants
00:48:50.160 | is the coffee bean.
00:48:51.260 | So the humble coffee bean has now been asked
00:48:54.520 | to carry the astronomical weight of serving up
00:48:58.460 | the large majority of people's antioxidant needs.
00:49:02.820 | And you can see this if, for example,
00:49:05.500 | you look at the health benefits of decaffeinated coffee.
00:49:09.100 | It has a whole constellation
00:49:10.940 | of really great health benefits too.
00:49:12.860 | So it's not the caffeine, and that's why I liked
00:49:15.100 | what you said, this sort of separation of church and state
00:49:18.460 | between coffee and caffeine.
00:49:20.660 | It's not the caffeine, it's the coffee bean itself
00:49:24.100 | that provides those health benefits.
00:49:25.600 | But coming back to how it impacts sleep,
00:49:30.320 | it impacts sleep in probably at least three different ways.
00:49:34.100 | The first is that for most people,
00:49:37.580 | caffeine can make it obviously a little harder
00:49:39.960 | to fall asleep.
00:49:41.700 | Caffeine can make it harder to stay asleep.
00:49:44.580 | But let's say that you are one of those individuals,
00:49:46.620 | and I think you are, and you can say,
00:49:48.540 | "Look, I can have three or four espressos with dinner,
00:49:51.220 | and I fall asleep just fine,
00:49:52.820 | and I stay asleep soundly across the night,
00:49:55.020 | so there's no problem."
00:49:56.740 | The downside there is that even if that is true,
00:50:00.320 | the amount of deep sleep that you get will not be as deep.
00:50:03.820 | And so you will actually lose somewhere between 10 to 30%
00:50:07.300 | of your deep sleep if you drink caffeine in the evening.
00:50:11.300 | So to give you some context, to drop your deep sleep
00:50:14.860 | by let's say 20%, I'd probably have to age you by 15 years,
00:50:19.260 | or you could do it every night with a cup of coffee.
00:50:22.540 | I think the fourth component that is perhaps
00:50:25.380 | less well understood about coffee is its timing,
00:50:28.860 | and that's why I was saying the timing
00:50:30.100 | and the dose make the poison.
00:50:31.580 | The dose, by the way, once you get past about three cups
00:50:34.540 | of coffee a day, the health benefits actually start
00:50:37.700 | to turn down in the opposite direction.
00:50:40.220 | So there is a U-shaped function.
00:50:41.960 | It's sort of the Goldilocks syndrome,
00:50:44.240 | not too little, not too much, just the right amount.
00:50:48.060 | The second component is the timing, though.
00:50:50.720 | Caffeine has a half-life of about five to six hours,
00:50:55.280 | meaning that after five to six hours, 50% of that,
00:50:59.000 | on average, for the average adult, is still in the system,
00:51:02.520 | which means that it has a quarter-life of 10 to 12 hours.
00:51:06.080 | So in other words, if you have a coffee at noon,
00:51:08.320 | a quarter of that caffeine is still circulating
00:51:10.600 | in your brain at midnight.
00:51:12.640 | So having a cup of coffee at noon, one could argue,
00:51:15.640 | is the equivalent of tucking yourself into bed at midnight,
00:51:18.080 | and before you turn the light out,
00:51:19.600 | you swig a quarter of a cup of coffee.
00:51:22.200 | But that doesn't still answer your question
00:51:24.320 | as to why are you so immune?
00:51:26.240 | So I'm someone who is actually, unfortunately,
00:51:28.200 | very sensitive to caffeine, and if I have, you know,
00:51:31.200 | even two cups of coffee in the morning,
00:51:34.120 | I don't sleep as well that night,
00:51:37.280 | and I find it miserable because I love the smell of coffee,
00:51:40.680 | I love the routine, I love the ritual,
00:51:43.560 | I think I would love to be invested in it.
00:51:46.240 | It's just terrible for my sleep, so I switch to decaf.
00:51:49.440 | There is a difference from one individual to the next,
00:51:52.640 | and it's controlled by a set of liver enzymes
00:51:56.640 | called cytochrome P450 enzymes,
00:52:00.720 | and there is a particular gene
00:52:03.040 | that if you have a different sort of version of this gene,
00:52:06.640 | it's called CYP1A2,
00:52:11.600 | that gene will determine the speed
00:52:14.720 | of the clearance of caffeine from your system.
00:52:18.080 | Some people will have a version of that gene
00:52:20.280 | that is very effective and efficient
00:52:23.040 | at clearing that caffeine,
00:52:24.680 | and so their half-life could be as short as two hours
00:52:28.440 | rather than five to six hours.
00:52:30.920 | Other people, hands up, Matt Walker,
00:52:34.080 | have a version of that gene that is not very effective
00:52:37.760 | at clearing out the caffeine,
00:52:40.360 | and therefore their half-life sort of sensitivity
00:52:43.680 | could be somewhere between eight to nine hours.
00:52:48.120 | So we understand that there are individual differences,
00:52:50.760 | but overall, I guess the top line here is drink coffee
00:52:55.760 | and understand that it's not the caffeine,
00:52:58.440 | it's the coffee that's the benefit,
00:53:00.080 | and the dose makes the poison.
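To make the half-life and quarter-life arithmetic above concrete, here is a minimal sketch in Python of the standard first-order decay model. The ~100 mg per cup and the 2-, 6-, and 9-hour half-lives are rough figures taken from the conversation, not clinical values.

```python
# Minimal sketch of first-order caffeine decay, using the rough numbers
# from the conversation (~100 mg per cup is an assumed round figure).

def caffeine_remaining(dose_mg: float, hours: float, half_life_h: float) -> float:
    """Milligrams of caffeine still circulating after `hours`."""
    return dose_mg * 0.5 ** (hours / half_life_h)

cup = 100.0  # assumed caffeine content of one cup of coffee, in mg

# Average adult (half-life ~6 h): coffee at noon, checked at midnight.
print(caffeine_remaining(cup, 12, 6))  # ~25 mg -> the "quarter of a cup" at lights-out

# Fast CYP1A2 metabolizer (half-life ~2 h): essentially cleared by midnight.
print(caffeine_remaining(cup, 12, 2))  # ~1.6 mg

# Slow metabolizer (half-life ~9 h): nearly half a cup still on board.
print(caffeine_remaining(cup, 12, 9))  # ~40 mg
```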
00:53:01.320 | - Is there some aspect to it
00:53:02.920 | that it's like a muscle in terms of the,
00:53:06.320 | all the combination of letters and numbers
00:53:08.200 | that you just said,
00:53:09.080 | is there some aspect that if I can improve
00:53:13.880 | the quarter-life, the half-life,
00:53:15.720 | could decrease that number if I just practice?
00:53:19.040 | Like if I drink a lot of coffee,
00:53:21.360 | is it like a habit that
00:53:22.840 | alters how your body's able to get rid of the caffeine?
00:53:27.160 | - Not how the body is able to get rid of the caffeine,
00:53:30.200 | but it does alter how sensitive the body is to the caffeine.
00:53:34.520 | And it's not at the level of the enzyme
00:53:36.720 | degrading the caffeine,
00:53:38.800 | it's at the level of the receptors
00:53:41.520 | that caffeine will act upon.
00:53:44.120 | Now, it turns out that those are called adenosine receptors,
00:53:47.000 | and maybe we can speak about what adenosine is
00:53:49.120 | and sleep pressure and all of that good stuff.
00:53:51.200 | But as you start to drink more and more coffee,
00:53:55.280 | the body tries to fight back,
00:53:59.000 | and it happens with many different drugs, by the way,
00:54:01.120 | and it's called tolerance.
00:54:03.160 | And so one of the ways that your body becomes tolerant
00:54:06.680 | to a drug is that the receptors that the drug is binding to,
00:54:10.160 | these sort of welcome sites,
00:54:11.400 | these sort of catcher's mitts, as it were,
00:54:14.320 | that receive the drug,
00:54:16.320 | those start to get taken away from the surface of the cell.
00:54:21.320 | And it's what we call receptor internalization.
00:54:25.120 | So the cell starts to think,
00:54:26.680 | gee whiz, there's a lot of stimulation going on,
00:54:30.120 | this is too much.
00:54:31.480 | So I'm just going to,
00:54:33.040 | when normally I would coat my cell with,
00:54:36.240 | let's just say five of these receptors for argument's sake,
00:54:40.680 | things are going a little bit too ballistic right now.
00:54:43.320 | I'm going to take away at least two of those receptors
00:54:46.120 | and downscale it to just having three of those.
00:54:49.120 | And now you need two cups of coffee to get the same effect
00:54:52.880 | that one cup of coffee got you before.
00:54:55.560 | And that's why then when you go cold turkey on coffee,
00:55:00.560 | all of a sudden the system has equilibrated itself
00:55:05.400 | to expecting X amount of stimulation.
00:55:08.560 | And now all of that stimulation is gone.
00:55:10.160 | So it's now got too few receptors
00:55:12.960 | and you have a caffeine withdrawal syndrome.
00:55:15.680 | And that's why, for example, with drugs of abuse,
00:55:18.520 | things like heroin, when people go into abstinence,
00:55:23.720 | as they're sort of moving into their addiction,
00:55:26.720 | they will build up a progressive tolerance to that drug.
00:55:30.640 | So they need to take more of it to get the same high.
00:55:34.160 | But then if they go cold turkey for some period of time,
00:55:38.160 | the system goes back to being more sensitive again.
00:55:40.640 | It starts to repopulate the surface of the cell
00:55:43.400 | with these receptors.
00:55:44.680 | But now when they reuse and they fall off the wagon,
00:55:47.680 | if they go back to the same dose that they were using
00:55:50.360 | before 10 weeks ago or three months ago,
00:55:54.760 | that dose can kill them.
00:55:56.560 | They can have an overdose.
00:55:58.160 | Even though they were using the same amount
00:56:00.360 | at those two different times,
00:56:02.280 | the difference is that it's not the dose of the drug,
00:56:05.820 | it's the sensitivity of the system.
00:56:08.160 | And that's the same thing that we see with caffeine.
00:56:11.000 | In terms of training the muscle, as it were,
00:56:13.480 | the system becomes less sensitive; it can recalibrate.
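A deliberately crude numerical sketch of the tolerance story as told here: the cell trims receptors whenever stimulation runs above the level it is "comfortable" with, so the same dose yields a shrinking effect, and stopping the dose lets the receptor count climb back (re-sensitization). Every number and the update rule are illustrative assumptions, not pharmacology.

```python
# Toy model of tolerance via receptor downregulation, following the
# "five receptors -> three receptors" description above. All values
# are made up for illustration; this is not a pharmacological model.

receptors = 5.0   # receptors on the cell surface
comfort = 2.0     # stimulation level the cell is "comfortable" with
rate = 0.3        # how quickly receptors are internalized or restored

def one_day(dose_cups: float) -> float:
    """Apply one day's dose, let the cell adapt, return the perceived effect."""
    global receptors
    effect = dose_cups * receptors               # perceived boost from today's dose
    receptors += rate * (comfort - effect)       # too much stimulation -> internalize;
    receptors = min(max(receptors, 1.0), 5.0)    # too little -> repopulate (bounded)
    return effect

for d in range(8):                               # one cup a day: the effect shrinks
    print(f"day {d}: effect {one_day(1.0):.2f}, receptors left {receptors:.2f}")

# Cold turkey: no dose, so no effect while the system is still
# downregulated (the withdrawal period), then receptors recover.
for d in range(8, 12):
    print(f"day {d}: effect {one_day(0.0):.2f}, receptors left {receptors:.2f}")
```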
00:56:17.640 | - Is there a time, the number of hours before bed,
00:56:22.640 | that's a safe bet to most people to recommend
00:56:26.600 | you shouldn't drink caffeine this many hours?
00:56:31.240 | Like, is there an average half-life
00:56:33.480 | that you should be aiming at?
00:56:35.520 | Or is this advice kind of impossible
00:56:38.080 | because there's so much variability?
00:56:39.640 | - There is huge variability.
00:56:41.300 | And I think everyone themselves, to a degree, knows it.
00:56:45.340 | Although I'll put a caveat on that too,
00:56:48.020 | because it's a slightly dangerous point.
00:56:50.460 | So the recommendation for the average adult,
00:56:53.220 | and where is the average adult in society?
00:56:55.700 | There is no such thing.
00:56:56.580 | But for the average adult,
00:56:58.500 | it would be probably cutting yourself off
00:57:00.380 | maybe 10 hours before.
00:57:02.900 | So assuming a normative bedtime in society,
00:57:06.260 | I would say try to stop drinking caffeine before 2 p.m.
00:57:10.900 | and just keep an eye out.
00:57:13.140 | And if you're struggling with sleep,
00:57:14.500 | dial down the caffeine and see if it makes a difference.
00:57:18.060 | - Can I ask you about sleep and learning?
00:57:22.500 | So how does sleep affect learning?
00:57:25.420 | Sleep before learning, sleep after learning,
00:57:30.420 | which are both fascinating kind of dynamics
00:57:33.140 | of the mind's interaction with this extra-conscious state.
00:57:36.600 | - Yeah, sleep is profoundly and very intimately related
00:57:42.300 | to your memory systems and your informational systems.
00:57:45.720 | The first, as you just mentioned,
00:57:47.820 | is that sleep before learning
00:57:50.740 | will essentially prepare your brain
00:57:53.420 | almost like a dry sponge ready to sort of,
00:57:56.860 | you know, initially soak up new information.
00:57:59.180 | In other words, you need sleep before learning
00:58:01.220 | to effectively imprint information into the brain,
00:58:04.940 | to lay down fresh memory traces.
00:58:07.940 | And without sleep, the memory circuits of the brain,
00:58:11.020 | and we know we've studied these memory circuits,
00:58:13.980 | will, you know, they essentially become waterlogged
00:58:17.060 | as it were for the sponge analogy,
00:58:18.500 | and you can't absorb the information as effectively.
00:58:22.400 | So you need sleep before learning,
00:58:26.340 | but you also need sleep, unfortunately, after learning too,
00:58:30.220 | to then take those freshly minted memories
00:58:33.620 | and effectively hit the save button on them.
00:58:36.260 | But it's nowhere near as quick as a digital system.
00:58:38.740 | It takes hours because it's a physical biological change
00:58:42.140 | that happens at the level of brain cells.
00:58:45.060 | But sleep after learning will cement and solidify
00:58:49.540 | that new memory into the neural architecture of the brain,
00:58:53.340 | therefore making it less likely to be forgotten.
00:58:56.960 | So, you know, I often think of sleep in that way as,
00:59:01.340 | it's almost sort of future-proofing information.
00:59:04.620 | - In what way?
00:59:06.980 | - Well, it means that it gives it a higher degree
00:59:10.340 | of assurance to be remembered in the future
00:59:15.060 | rather than go through the sort of degradation
00:59:19.380 | that we think of as forgetting.
00:59:21.260 | So the brain has, in some ways by default, you know,
00:59:27.180 | there is forgetting, and actually I would love to,
00:59:29.860 | I was going to say sleep is relevant for memory
00:59:31.900 | in three different ways, but I'm going to amend that
00:59:34.700 | and say there's four different ways
00:59:36.820 | which is learning, maintaining, memorizing,
00:59:41.820 | abstraction, assimilation, association,
00:59:45.580 | then forgetting, which the last one sounds oxymoronic
00:59:50.420 | based on the former three, but I'll see if I can explain.
00:59:53.460 | So sleep after learning then sort of, you know,
00:59:56.120 | sets that information, like in amber, you know,
01:00:02.140 | in solidification.
01:00:03.980 | The third benefit, however, is that sleep,
01:00:07.460 | we've learned more recently is much more intelligent
01:00:10.260 | than we ever gave it credit for.
01:00:12.340 | Sleep doesn't simply just take individual memories
01:00:15.980 | and strengthen them.
01:00:17.580 | Sleep will then intelligently integrate and cross-link
01:00:21.900 | and associate that information together.
01:00:24.840 | And it's almost like informational alchemy
01:00:28.940 | so that you wake up the next morning
01:00:30.780 | with a revised mind-wide web of associations.
01:00:34.900 | And that's probably the reason that, you know,
01:00:38.380 | you've never been told to stay awake on a problem.
01:00:40.920 | You know, and in every language
01:00:44.860 | that I've inquired about that phrase
01:00:46.540 | or something very similar seems to exist,
01:00:49.580 | which means to me that this creative associative benefit
01:00:54.220 | of sleep transcends cultural boundaries.
01:00:56.740 | It is a common experience across humanity.
01:01:00.580 | Now I should note that I think the French translation
01:01:05.180 | of that is much closer to,
01:01:07.420 | I think you sleep with a problem,
01:01:09.060 | whereas the British, you sleep on a problem.
01:01:11.380 | The French, you sleep with a problem.
01:01:12.940 | I think it says so much about the romantic difference
01:01:15.100 | between the British and the French,
01:01:17.180 | but let's not go there.
01:01:18.740 | - That's brilliant.
01:01:20.500 | So such a subtle, but such a fundamental difference.
01:01:23.460 | Yeah.
01:01:24.300 | - Oh, goodness.
01:01:25.140 | Yeah, goodness me.
01:01:25.960 | - Sleep with the problem.
01:01:27.340 | - Yes, exactly.
01:01:29.980 | - That's why I love the French.
01:01:29.980 | So, and we can just sort of double click
01:01:33.620 | on any one of these and go into detail,
01:01:36.420 | but the fourth I became really enchanted
01:01:41.300 | by about eight years ago in our research,
01:01:44.900 | which was this idea of forgetting.
01:01:46.920 | And I started to think that forgetting may be the price
01:01:52.940 | that we pay for remembering.
01:01:56.500 | And in that sense, there is an enormous benefit
01:02:01.500 | to letting go.
01:02:04.560 | And you may be thinking that sounds ridiculous.
01:02:08.460 | I don't want to forget.
01:02:09.780 | In fact, my biggest problem is I keep forgetting things,
01:02:13.300 | but the brain has a, well, we believe,
01:02:17.940 | has a finite storage capacity.
01:02:20.540 | We can't prove it yet,
01:02:21.700 | but my suspicion is that that's probably true.
01:02:23.480 | It doesn't have an infinite storage capacity.
01:02:25.580 | It has constraints.
01:02:26.900 | If that's the case, we can't simply go through life
01:02:31.580 | being constantly informational aggregators,
01:02:36.520 | unless we are programmed to say we've got a hard drive space
01:02:41.340 | of about 85 to 90 years and we're good and we can do that.
01:02:44.820 | Maybe that's true.
01:02:45.980 | I don't think that's true.
01:02:46.960 | I think forgetting is an incredibly good and useful thing.
01:02:50.620 | So for example, it's not beneficial
01:02:54.220 | from an evolutionary perspective for me to remember
01:02:57.140 | where I parked my car three years ago.
01:02:59.380 | So it's important that I can remember today's parking spot,
01:03:04.780 | but I don't want to have the junk kind of DNA
01:03:08.140 | from a memory perspective of where I parked my car
01:03:12.980 | two years ago.
01:03:14.020 | Now, I actually have, in some ways,
01:03:16.680 | a problem with forgetting.
01:03:18.740 | And again, I'm not trying to sort of be laudatory,
01:03:20.520 | but I tend not to forget too many things.
01:03:24.960 | And I don't think that that's a good thing.
01:03:27.200 | And there's a wonderful neurologist, Luria,
01:03:31.520 | who wrote a book called "The Mind of a Mnemonist."
01:03:35.400 | And it was a brilliant book,
01:03:37.920 | both because it was written exquisitely,
01:03:40.680 | but he was studying these sort of memory savants
01:03:44.840 | who basically could remember everything that he gave them.
01:03:49.520 | And he tried to find a chink in their armor.
01:03:53.640 | And the first half of the book is essentially about him
01:03:56.960 | seeing how far he can push them before they fail.
01:04:00.960 | And he never found that place.
01:04:03.840 | He could never find a place where they stopped remembering.
01:04:08.840 | And then in his brilliance,
01:04:10.840 | he turned the question on its head.
01:04:13.160 | He said, "Not what is the benefit
01:04:15.480 | "of constantly remembering,
01:04:17.760 | "but instead what is the detriment to never forgetting?"
01:04:22.280 | And when you read his descriptions
01:04:24.760 | of those individuals, you start to realize
01:04:26.120 | it's probably a life that you would not want.
01:04:28.360 | - But it's as fascinating, both from a human perspective,
01:04:31.960 | but also AI perspective.
01:04:33.560 | There's a big challenge in the machine learning community
01:04:38.360 | of how to build systems that are able to remember
01:04:40.700 | for prolonged periods of time,
01:04:42.680 | lifelong continuous learning.
01:04:44.960 | So where you build up information over time.
01:04:48.160 | So memory is one of the biggest open problems
01:04:51.240 | in AI and machine learning.
01:04:54.640 | But at the same time,
01:04:55.880 | the right way to formulate memory is actually forgetting
01:05:00.200 | because you have to be exceptionally selective
01:05:03.520 | at which kind of stuff you remember.
01:05:05.640 | And that's where the step of assimilation,
01:05:07.480 | integration that you're referring to is really important.
01:05:10.000 | I mean, we forget most of the things.
01:05:12.760 | And the question is exactly the cost of forgetting
01:05:16.720 | at the very edge of stuff that could be important
01:05:20.440 | or could not be, how do we remember or not those things?
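One minimal sketch of the "selective forgetting" idea being described, in Python: a bounded memory that, when full, evicts whatever it currently scores as least important. The capacity and the importance scores here are placeholders; real continual-learning systems use far richer criteria for what to keep.

```python
import heapq

class BoundedMemory:
    """Keep at most `capacity` items; when full, forget the least important one."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._heap = []       # min-heap of (importance, counter, item)
        self._counter = 0     # tie-breaker so items themselves are never compared

    def remember(self, item, importance: float):
        entry = (importance, self._counter, item)
        self._counter += 1
        if len(self._heap) < self.capacity:
            heapq.heappush(self._heap, entry)
        elif importance > self._heap[0][0]:
            heapq.heapreplace(self._heap, entry)  # forget the least important item
        # else: the new item is itself the least important, so it is never stored

    def recall(self):
        return [item for _, _, item in sorted(self._heap, reverse=True)]

memory = BoundedMemory(capacity=3)
for event, importance in [("parked car today", 0.9),
                          ("parked car two years ago", 0.1),
                          ("friend's name", 0.8),
                          ("book title", 0.4)]:
    memory.remember(event, importance)
print(memory.recall())  # the low-importance parking spot from two years ago is gone
```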
01:05:24.200 | Like for example, I've, you know, doing a podcast,
01:05:28.440 | I've become cognizant of one feature of my forgetting
01:05:32.440 | that's been problematic,
01:05:34.200 | which is I forget names and titles of books and so on.
01:05:39.420 | So when I read, I remember ideas, I remember quotes,
01:05:44.420 | I remember statements and like that's the space
01:05:48.900 | in which I'm thinking.
01:05:50.440 | But when you communicate to others,
01:05:53.140 | you have to say this person in this book said that.
01:05:56.740 | So it's the same thing with like Andrew Huberman
01:05:59.380 | is masterful at this.
01:06:01.200 | It's this important academia,
01:06:02.580 | remembering the authors of a paper
01:06:04.360 | and the title of the paper
01:06:05.900 | as part of remembering the idea.
01:06:09.140 | And I've been feeling the cost of not being able
01:06:13.020 | to naturally remember those things.
01:06:15.580 | And so that's something I need to sort of work on.
01:06:18.020 | But that's an example. - Are you good with faces?
01:06:20.140 | - Yes, very good at faces.
01:06:21.620 | - But not good with names.
01:06:23.140 | So I'm exactly like you.
01:06:25.220 | And there is, you know, an understanding of that
01:06:27.520 | in the brain too.
01:06:28.940 | We understand that there is partitioning of those
01:06:31.660 | in terms of the territory of the brain
01:06:33.340 | that takes care of faces and facts and places
01:06:36.140 | and they can be separate.
01:06:38.020 | So I will never forget a face,
01:06:40.580 | but, you know, and as I said,
01:06:42.020 | I usually forget very little,
01:06:44.540 | but for some reason names are a struggle.
01:06:47.660 | I think in some ways because I'm probably
01:06:49.220 | just a slightly anxious person.
01:06:50.980 | So when you first meet someone,
01:06:52.220 | which is usually the time when a name is introduced,
01:06:55.420 | you know, you were saying you were sort of anxious
01:06:57.260 | maybe about sort of sitting down with me,
01:06:59.780 | but I find that a little bit, you know, activating.
01:07:04.420 | And so it's not as though there's anything wrong
01:07:06.860 | with my memory.
01:07:07.700 | It's just the emotional state I'm in
01:07:09.500 | when I'm first meeting someone,
01:07:11.340 | you know, it's a little bit perturbing,
01:07:12.780 | but I will never forget that face.
01:07:15.300 | - I completely relate to that
01:07:16.540 | because I almost don't hear people's names
01:07:19.140 | when they tell me because I'm so anxious.
01:07:21.420 | - Yeah.
01:07:22.900 | - But I think there's certain quirks of social interaction
01:07:27.300 | that show that you care about the person,
01:07:30.460 | that you remember that person,
01:07:31.860 | that they matter to you,
01:07:33.580 | that they had an impact on you.
01:07:35.580 | And one of the ways to show that
01:07:36.980 | is you remember their name.
01:07:39.020 | But that's a quirk to me
01:07:40.060 | because there's a lot of people I meet
01:07:43.460 | have a deep impact on me,
01:07:46.140 | but I can't communicate that unless I know their name,
01:07:50.700 | unless I know some of the details
01:07:55.020 | that we humans seem to use to communicate
01:07:58.740 | that we remember each other.
01:08:00.180 | What I remember well is the feeling we shared,
01:08:04.980 | is the experience we shared.
01:08:07.420 | What I don't remember well
01:08:08.700 | is the detailed labels of those experiences.
01:08:12.100 | And I need to certainly work on that.
01:08:14.060 | - I don't know.
01:08:15.060 | I think it's, you know,
01:08:16.220 | just allowing yourself to be innate
01:08:19.300 | and who you are is also a beautiful thing too.
01:08:22.700 | I'm not suggesting it's not important
01:08:24.580 | to try and better oneself,
01:08:26.900 | but I also sometimes worry about the misery
01:08:29.780 | that that puts us in.
01:08:31.060 | But like you, I will.
01:08:34.900 | I do struggle with it.
01:08:35.980 | But I know the first time when we met in the lobby,
01:08:39.940 | I know exactly what you look like.
01:08:43.700 | I know that you were wearing headphones.
01:08:45.260 | I know the shape and the size of those headphones.
01:08:47.620 | You didn't have your black jacket on.
01:08:49.060 | I know exactly what the weave of your shirt looked like.
01:08:51.660 | I know what your shoes look like.
01:08:53.700 | And I knew exactly the height of your,
01:08:56.060 | the end of your pants from the top of your shoes.
01:08:59.580 | And so those things I don't forget, you know,
01:09:02.420 | and I can remember when people,
01:09:04.780 | I met people, you know, two years ago and I'll say,
01:09:06.980 | oh yes, we met there.
01:09:08.340 | And I remember you had those fantastic, you know, boots on.
01:09:12.340 | I thought they were a bloody great pair of boots,
01:09:14.740 | you know, and they're like, how do you,
01:09:15.940 | I didn't even remember what I was wearing that day.
01:09:18.620 | - It's fascinating.
01:09:19.780 | Yeah, I'm the exact the same way,
01:09:21.100 | but you can't until we have Neuralink
01:09:23.580 | or something like that,
01:09:24.420 | we can't communicate that you remember all those things.
01:09:26.380 | - I know, that's what I want.
01:09:27.980 | - So you have to be able to use tricks
01:09:29.860 | of human communication for that.
01:09:31.460 | But so that, I mean, that's the,
01:09:33.780 | it's ultimately is a trick of like,
01:09:35.740 | which to remember, which to forget.
01:09:37.420 | - Right.
01:09:38.260 | - And the forgetting is so,
01:09:39.620 | it's so fascinating you say this.
01:09:41.060 | I mean, it seems to be deeply connected
01:09:44.500 | to that assimilation process.
01:09:46.660 | So forgetting, you try to fit all the new stuff
01:09:50.620 | into this big web of the old stuff
01:09:55.020 | and the things that don't fit, you throw out.
01:09:58.260 | - I think the assimilation,
01:10:00.020 | the way I've been thinking about it with sleep
01:10:02.100 | and it's particularly sort of dream sleep
01:10:03.700 | that we think can help with this assimilation
01:10:06.140 | is that during wake,
01:10:09.620 | we have one version of associative processing.
01:10:13.540 | And what I mean by that
01:10:14.380 | is we see the most obvious connections.
01:10:17.180 | So I think of wakefulness as a Google search gone right.
01:10:22.180 | Whereas I see dream sleep
01:10:26.620 | as doing something very different.
01:10:29.580 | I think dream sleep is a little bit
01:10:31.180 | like group therapy for memories,
01:10:33.660 | that everyone gets a name badge
01:10:36.020 | and sleep gathers in all of the individual pieces
01:10:39.340 | of the day.
01:10:40.180 | And it sort of starts to get you to,
01:10:42.620 | forces you in fact, to speak to the people,
01:10:44.660 | not at the front of the room
01:10:45.700 | that you think you've got the most obvious connection with,
01:10:48.180 | but to speak with the people all the way
01:10:49.700 | at the back of the room,
01:10:50.580 | that at first you think I've got no obvious connection
01:10:53.180 | with them at all.
01:10:54.460 | But once you get chatting with them,
01:10:56.620 | you learn that you do have
01:10:57.900 | a very distant non-obvious connection,
01:10:59.940 | but it's still a connection all the same.
01:11:02.700 | And it's almost as though you're doing a Google search
01:11:05.300 | where I input Lex Friedman
01:11:08.660 | and it doesn't take me to the first page of your home site.
01:11:12.140 | - Page 20.
01:11:12.980 | - It takes me to page 20,
01:11:13.860 | which is about some like field hockey game in Utah.
01:11:16.940 | - Yeah, exactly.
01:11:17.780 | - It turns out that there actually is a link
01:11:19.580 | if I look at it, it's a distant non-obvious one.
01:11:22.340 | And to me, I find that exciting
01:11:24.180 | because when you fuse things together
01:11:25.860 | that shouldn't normally go together,
01:11:27.760 | but when they do, they cause marked advances
01:11:30.100 | in evolutionary fitness.
01:11:31.600 | - It sounds like the biological basis of creativity.
01:11:34.940 | - And that's exactly what I think dream sleep
01:11:37.500 | and the algorithm of dream sleep is designed to do.
01:11:40.460 | You know, it's not a Boolean-like system
01:11:42.580 | where you have, you know,
01:11:45.060 | the sort of assumptions of true and false,
01:11:49.020 | you know, maybe it's more of a fuzzy logic system.
01:11:52.020 | And I think REM sleep is a perfect environment
01:11:55.140 | within which we do, you know,
01:11:56.820 | it's almost like memory pinball.
01:11:58.940 | You know, you get the information
01:12:00.260 | that you've learned during the day,
01:12:02.100 | and then you pull the lever back
01:12:03.420 | and you shoot it up into the attic of your brain.
01:12:07.000 | You know, this cortex filled
01:12:08.700 | with all of your past historical knowledge,
01:12:11.300 | and you start to bounce it around
01:12:12.680 | and see where one of those things lights up
01:12:14.540 | and you build a new connection there
01:12:15.780 | and you build another one there too.
01:12:17.500 | You're developing schemas.
01:12:19.780 | And so in that way, I think you could argue,
01:12:22.340 | you know, we dream, therefore we are.
01:12:26.060 | (laughing)
01:12:27.100 | - Yeah, so in terms of this line between learning
01:12:31.140 | and thinking through a new thing
01:12:34.020 | that seems to be deeply connected,
01:12:36.460 | there's this legendary engineer named Jim Keller
01:12:40.980 | who keeps yelling at me about this.
01:12:42.820 | He says it's very effective.
01:12:44.900 | He likes to, for difficult problems before bed,
01:12:49.340 | think about that difficult problem.
01:12:51.100 | We're not talking about like drama at work
01:12:52.880 | or all that kind of stuff.
01:12:54.020 | No, like a scientific, for him, engineering problem.
01:12:57.900 | He likes to intensely think about it
01:13:01.220 | and to prime his mind before sleep and then go to sleep.
01:13:05.740 | And then he finds that the next day,
01:13:09.860 | he's able to think much clearer
01:13:12.020 | and there's new ideas that come,
01:13:13.380 | but also just, I guess it's more well-integrated.
01:13:16.820 | And sometimes during the process of,
01:13:21.200 | like he's able to like wake up and like see new insights.
01:13:26.200 | - That's right.
01:13:27.820 | - If he's deeply sort of aggressively
01:13:29.220 | thinking through a problem.
01:13:30.420 | - So there's many scientific, you know,
01:13:33.480 | demonstrations of this, you know,
01:13:35.260 | the Mendeleev with the periodic table of elements,
01:13:39.260 | you know, he was trying for months to understand.
01:13:42.260 | I mean, talk about an ecumenical problem
01:13:45.840 | of epic proportions.
01:13:47.900 | Here's your question today.
01:13:49.980 | You have to understand how all of the known elements
01:13:52.920 | in the universe fit together in a logical way.
01:13:55.640 | Good luck, take care.
01:13:57.120 | It was non-trivial at the time.
01:13:58.860 | And he would try and try, he was so obsessed with it.
01:14:01.680 | He created playing cards
01:14:03.940 | with all of the different elements on.
01:14:06.600 | And then he would go on these long train journeys
01:14:09.280 | around Europe and he would just sort of deal these cards
01:14:12.760 | in front of them and he would shuffle them,
01:14:15.200 | shuffling and shuffling.
01:14:16.180 | And he would just try to see if he could find
01:14:18.680 | what the answer was.
01:14:20.400 | And then, so the story goes, you know,
01:14:22.640 | he fell asleep and he had a dream.
01:14:25.120 | And in that dream, you know, all of these elements
01:14:28.480 | started to dance and play around
01:14:30.420 | and they snapped into a logical grid, you know,
01:14:33.120 | atomic weights, et cetera, et cetera.
01:14:35.680 | And it wasn't his waking brain that solved the problem.
01:14:40.680 | It was his sleeping brain that solved
01:14:43.120 | the impenetrable problem that his waking brain could not.
01:14:46.960 | And there have been countless, you know, even in the arts
01:14:49.920 | and in music, some wonderful dreams, you know,
01:14:52.720 | Frankenstein, Mary Shelley's epic Gothic novel
01:14:56.080 | came to her in a dream at Lord Byron's home.
01:14:59.180 | And then we've got, you know, Paul McCartney.
01:15:04.500 | "Yesterday," the song, came to him in a dream.
01:15:08.520 | He was filming, gosh, what was the movie?
01:15:12.760 | I don't recall it.
01:15:13.840 | I should be shot because I'm from Liverpool myself.
01:15:17.040 | But he was on Wimpole Street in London, filming.
01:15:21.360 | And he came up with that song,
01:15:24.560 | the melody in his sleep, not to be outdone by the Beatles.
01:15:28.680 | And by the way, "Let It Be" also came from a dream
01:15:33.360 | that McCartney had.
01:15:34.480 | People usually give it, you know, religious overtones.
01:15:38.000 | You know, mother Mary comes to me
01:15:40.000 | speaking words of wisdom, let it be.
01:15:42.000 | If you've ever asked who mother Mary is,
01:15:45.440 | it's not the, you know, the biblical content.
01:15:49.440 | It's his mother.
01:15:51.280 | It's Mary McCartney.
01:15:54.360 | And she came to him in a dream and gifted him the song.
01:15:57.960 | But the best story I've heard is not to be outdone
01:16:02.120 | by the Beatles, the Stones, Keith Richards,
01:16:07.120 | who I think once was suggested, who was it?
01:16:10.840 | It was a comedian who was saying that in an interview
01:16:15.160 | with Rolling Stone, Keith Richards suggested or implied
01:16:17.920 | that young kids should not do drugs.
01:16:20.800 | And they said, well, look, young kids can't do drugs
01:16:25.800 | because you've done all of them.
01:16:27.640 | - All of them, yeah.
01:16:28.480 | - And I always thought that, but Keith Richards described,
01:16:32.040 | he would always go to bed with his guitar
01:16:36.880 | and a tape recorder.
01:16:38.600 | And then probably he would have a whole set
01:16:42.640 | of other things in the bed with him.
01:16:43.960 | And who knows how many other people, but anyway.
01:16:47.480 | And then he said in his autobiography,
01:16:49.800 | and I'm paraphrasing here, but one morning I woke up
01:16:54.120 | and I realized that the tape had recorded
01:16:57.200 | all the way to the end.
01:16:58.640 | So I rewound the tape and I hit play.
01:17:02.800 | And there in some kind of ghostly form
01:17:06.000 | were the opening chords to "Satisfaction,"
01:17:08.880 | the most famous successful Rolling Stone song
01:17:11.280 | of all time.
01:17:13.720 | Followed by then 43 minutes of snoring.
01:17:16.240 | - That's awesome.
01:17:18.960 | - But that riff came to him,
01:17:20.520 | one of the most famous riffs in all of rock and roll
01:17:22.920 | came to him by way of a dream inspired insight.
01:17:25.480 | So I think there is too many of those anecdotes
01:17:30.320 | and we've now got the science,
01:17:31.440 | I don't rely on anecdotes as science.
01:17:33.960 | We've now done the studies in the laboratory
01:17:35.840 | and we can reliably demonstrate
01:17:37.600 | that sleep inspires creativity,
01:17:39.360 | inspires problem solving capacity.
01:17:41.840 | - Well, the interesting thing is,
01:17:43.120 | is it possible to some of the ideas that you talk about
01:17:45.800 | to turn them into a protocol
01:17:47.320 | that could be practiced rigorously?
01:17:48.960 | So what Jim Keller espouses is saying,
01:17:53.400 | not just the fact that sleep helps you
01:17:56.880 | increase the creativity, but turn it into a process.
01:18:00.440 | Like literally, like don't do it accidentally.
01:18:03.880 | You know, like an athlete does certain things
01:18:08.680 | to optimize their performance.
01:18:10.160 | They have a training routine, they have a regimen
01:18:12.720 | of like cycling and sprints and long distance stuff.
01:18:17.720 | In the same way, thinking about your job
01:18:22.320 | as an idea generator in the engineering space
01:18:25.280 | is like, this is good for my performance.
01:18:27.600 | So like for an hour before bed,
01:18:29.880 | think through a problem like every night
01:18:32.200 | and then use sleep to work through that problem.
01:18:36.680 | I mean, he's the first person that I heard
01:18:39.200 | like of the people I really respect that do like what I do,
01:18:42.680 | which is like programming engineering type work,
01:18:46.240 | like using sleep, not accidentally,
01:18:49.200 | but with a purpose, like using sleep.
01:18:52.720 | - You know, that's just basically the difference between,
01:18:54.880 | as you said, a passive approach to it
01:18:57.520 | versus an active deterministic
01:19:01.840 | or hoped-for deterministic approach to it.
01:19:04.000 | In other words, that you are actually trying to harness
01:19:08.440 | the power of sleep in a deliberate way
01:19:11.360 | rather than an unthoughtful way.
01:19:13.960 | I still think that, you know, mother nature through it,
01:19:17.240 | you know, 3.6 million years of evolution
01:19:20.240 | has probably got it mostly figured out
01:19:22.560 | in terms of what information should be uploaded at night
01:19:25.960 | and worked through.
01:19:27.080 | I think her algorithm is probably pretty good at this stage.
01:19:31.280 | It's not to suggest though that, you know,
01:19:33.080 | we can't try to tweak it and nudge it.
01:19:35.640 | You know, it's a very light hand on the tiller
01:19:37.520 | is what he's doing.
01:19:39.000 | I don't think there's anything wrong with that.
01:19:41.560 | - You know, just like, for example, for me,
01:19:43.480 | fasting has improved my ability to focus deeply
01:19:47.400 | and productivity significantly.
01:19:49.680 | And in that same way, you know,
01:19:52.560 | it's possible that playing with these ideas
01:19:55.480 | of thinking before bed or some hours before bed,
01:19:57.920 | or some playing with different protocols
01:20:00.280 | will have a significant leap
01:20:02.000 | over what mother nature naturally does.
01:20:04.280 | So if you let your body do what it naturally does,
01:20:06.640 | you may not achieve the same level of performance
01:20:09.320 | 'cause mother nature has not designed us
01:20:12.920 | to think deeply about chip design
01:20:16.480 | or programming artificial intelligence systems.
01:20:20.560 | - Well, she's gifted us the architecture
01:20:23.160 | and the capacity to do that.
01:20:25.600 | What we do with that is, you know,
01:20:28.080 | is what life's experience dictates.
01:20:31.520 | She gives us the blueprint to do many, you know.
01:20:34.560 | - Well, if I were to sort of introspect
01:20:37.400 | and self-analyze what mother nature wants me to do,
01:20:39.960 | I think given my current lifestyle
01:20:42.840 | that I have food in the fridge and a bed to sleep on,
01:20:47.240 | I think what mother nature wants me to do is to be lazy.
01:20:50.240 | (both laughing)
01:20:51.320 | And so I think I'm actually resisting mother nature
01:20:54.760 | because so many of my needs are satisfied.
01:20:59.680 | And so I have to resist some of the natural forces
01:21:04.320 | of the body and the mind when I do some of the things I do.
01:21:07.920 | So there's that dance.
01:21:09.520 | You know, like I've been thinking about doing a startup
01:21:12.880 | and that's obviously going against everything
01:21:15.280 | that my body and mind are telling me to do
01:21:18.080 | because it's going to be basically suffering.
01:21:21.240 | But the only reason I want--
01:21:22.720 | - Life as you know it will be over.
01:21:24.680 | (both laughing)
01:21:26.320 | - Yes.
01:21:27.160 | But nevertheless, there's some kind of inner drive
01:21:31.120 | that wants me to do it.
01:21:32.000 | And then you start to ask the question,
01:21:33.760 | well, how do you optimize the things you can optimize
01:21:36.520 | like sleep, like diet,
01:21:38.120 | like the people that you surround yourself with
01:21:40.360 | in order to maximize happiness and performance
01:21:43.520 | and all those kinds of things without also over-optimizing.
01:21:47.040 | And that's such an interesting idea from an engineer.
01:21:52.040 | So as you may know,
01:21:55.480 | you don't often get those kinds of ideas from engineers.
01:21:59.580 | Engineers usually just don't read books about sleeping.
01:22:04.060 | They're usually like, they're not the healthiest of people.
01:22:09.060 | I think that's changing over time,
01:22:13.100 | especially with Silicon Valley,
01:22:14.100 | especially with the tech sector,
01:22:15.500 | people are starting to understand
01:22:16.700 | what's a healthy lifestyle.
01:22:18.300 | But usually they're kind of on the insane side,
01:22:21.020 | especially programmers.
01:22:22.700 | But it's nice to hear somebody like that use sleep
01:22:27.460 | and use some of the things you talk about,
01:22:29.420 | strategically, on purpose.
01:22:32.940 | - You know, to that idea of not just trying to use
01:22:36.820 | what mother nature gave,
01:22:38.980 | but seeing if you can do something more or different.
01:22:43.980 | In a conservative mindset,
01:22:47.940 | I would then pose the question at what cost?
01:22:52.500 | Because when you do something perhaps that deviates
01:22:56.720 | from the typical pre-programmed, you know,
01:23:00.340 | mother nature's program,
01:23:02.460 | I suspect it usually comes at the cost of something else.
01:23:07.940 | So maybe he is able to direct and focus
01:23:12.500 | his sleeping cognition on those particular topics
01:23:16.620 | that will gain him better problem resolution
01:23:20.380 | the next day when he wakes up.
01:23:22.300 | The question is though, at what cost of the other things
01:23:24.900 | that didn't make it onto the menu
01:23:28.480 | of the finger buffet of sleep that night?
01:23:32.000 | And is it that you don't process
01:23:34.360 | the emotional difficulties or events,
01:23:37.560 | and therefore you are less emotionally resolved
01:23:40.280 | the next day, but you are more problem resolved
01:23:44.120 | the following day?
01:23:45.360 | And so I always try to think,
01:23:46.960 | and I truly don't want to sound puritanical
01:23:51.680 | either about sleep.
01:23:52.760 | And I think I've come off that way many a time,
01:23:56.100 | especially when I started out in the public.
01:23:58.600 | The tone of the book in some ways, you know,
01:24:02.340 | I look back and think, could I have been a little softer?
01:24:06.340 | And the reason was I was that way back then,
01:24:10.500 | but when I started writing the book,
01:24:11.700 | which was probably something like 2014 or '15,
01:24:15.400 | sleep was the neglected step-sister
01:24:18.660 | in the health conversation of the day.
01:24:21.500 | And I was just so sad to see the amount of suffering
01:24:25.360 | and disease and sickness that was caused
01:24:27.480 | by insufficient sleep.
01:24:29.640 | And for years before I'd been doing public speaking
01:24:33.080 | and I'd tell people about the great things
01:24:35.000 | that happen when you get sleep,
01:24:36.320 | people would say, that's fascinating.
01:24:37.940 | And then they would go back and keep doing the same thing
01:24:40.000 | about not sleeping enough.
01:24:41.800 | And then I realized you can't really speak
01:24:43.720 | about the good things that happen.
01:24:44.840 | It's like the news: if it bleeds, it leads.
01:24:47.040 | And if you speak about the alarmingly bad things
01:24:49.340 | that happen, people tend to have a behavioral change.
01:24:53.000 | And so the book as a consequence,
01:24:55.400 | I think probably came out a little bit on the strong side
01:24:59.880 | of trying to convince people.
01:25:03.640 | - You were trying to help a lot of people
01:25:05.120 | and that's a powerful way to help a lot of people.
01:25:07.760 | - I was genuinely trying to help people,
01:25:09.520 | but certainly for some people for whom sleep
01:25:12.160 | does not come easy, then it was probably
01:25:14.680 | a tricky book to read too.
01:25:17.520 | And I think I feel more sensitive to those people now
01:25:20.660 | and empathetically connected to them.
01:25:22.600 | So I think the, again, the point was simply
01:25:27.660 | that I don't mean to sound too puritanical in all of this.
01:25:32.580 | And the same way with caffeine and coffee,
01:25:36.900 | I am just a scientist and I am not here to tell anyone
01:25:40.780 | how to live their life.
01:25:41.980 | That is not my job at all.
01:25:44.020 | And life is to be lived to a degree.
01:25:47.180 | And life is to be lived if you want to do a startup.
01:25:50.160 | All I want to do is empower people with the understanding
01:25:55.700 | of the science of sleep.
01:25:57.500 | And then you can make an informed choice
01:25:59.360 | as to how you want to live your life.
01:26:01.020 | And I offer no judgment on how anyone
01:26:03.620 | wishes to live their life.
01:26:05.540 | I just want to try and see if the information
01:26:08.620 | that I have about sleep would ultimately change
01:26:12.060 | how you would think about your life decisions.
01:26:13.740 | And if it doesn't, no problem.
01:26:15.460 | And if it does, I hope it's been of use.
01:26:18.300 | - Well, maybe this is me trying to justify
01:26:21.300 | my lifestyle to you.
01:26:22.900 | But Dr. Seuss said, "You know you're in love
01:26:27.420 | "when you can't fall asleep because reality
01:26:29.580 | "is finally better than your dreams."
01:26:31.900 | - I love that quote too.
01:26:34.580 | - Okay.
01:26:35.420 | My sleeping schedule is complicated.
01:26:42.000 | And it has to do primarily with the fact
01:26:46.020 | that I love basically everything that I do.
01:26:49.620 | And that love takes a form that may not appear
01:26:53.560 | to be love from the external observer perspective.
01:26:56.700 | 'Cause it often includes struggle.
01:26:58.820 | It often includes something that looks like stress,
01:27:02.200 | even though it's not stress.
01:27:03.820 | It's like this excitement, it's this turmoil
01:27:06.980 | and chaos of passion, of struggling with a problem,
01:27:11.220 | of being sad and down to the point even depressed
01:27:15.980 | of how difficult the problem is, the disappointment
01:27:18.660 | that the last few weeks and months have been a failure
01:27:22.100 | and self-doubt, all that mix.
01:27:25.340 | But I love it.
01:27:27.180 | And a part of that is sometimes staying up all night
01:27:30.380 | working on a thing I'm really passionate about.
01:27:33.260 | And that means sleep schedules that are just like,
01:27:36.500 | sometimes sleeping during the day,
01:27:39.860 | sometimes very often sleeping very little,
01:27:43.260 | but taking naps that are like an hour or two hours
01:27:45.780 | or so on, that kind of weird chaos.
01:27:48.620 | Now, I'll also try to give myself backup.
01:27:54.260 | I was trying to like research yesterday,
01:27:56.220 | is anybody else productive while like this?
01:27:59.700 | And there's of course a lot of anecdotal evidence
01:28:02.180 | and some of it could be just narratives
01:28:05.860 | that people have told to the public
01:28:07.820 | when in reality they sleep way more,
01:28:09.900 | but there's a bunch of people that are famous
01:28:14.980 | for not sleeping much.
01:28:16.380 | So on the topic of naps, I read this a long time ago
01:28:21.380 | and I checked this, Churchill was big on big naps
01:28:26.340 | and actually, just reading more about Winston Churchill's
01:28:30.100 | sleep schedule, it's very much like mine.
01:28:33.060 | So I basically wanna give myself the opportunity
01:28:36.940 | to at night, to stay up all night if I want to.
01:28:40.780 | And a good nap is a big part of that in the late evening.
01:28:44.820 | Like I'll often, that just destroys social life completely,
01:28:48.100 | but I'll often take a nap in the late afternoon
01:28:52.020 | or the evening, and that sets me up
01:28:54.060 | if I want to stay up all night.
01:28:56.580 | And things like that, like I read that Nikola Tesla
01:29:00.700 | slept only two hours a night, Edison the same three hours,
01:29:05.180 | but he actually did the polyphasic sleep
01:29:07.700 | like where it's just a bunch of naps.
01:29:10.260 | What can you say about this madness of love and passion,
01:29:15.260 | of loving everything you do
01:29:18.940 | and the chaos of sleep that might result in?
01:29:22.260 | - I love the Seuss quote.
01:29:26.140 | And I've had that experience too, like you,
01:29:30.220 | I adore what I do.
01:29:33.140 | If someone gave you enough money
01:29:37.700 | to live the rest of your life,
01:29:40.260 | got a roof above my head, rice and beans on the table,
01:29:43.260 | and they said, "You don't have to work anymore,"
01:29:45.260 | I would do nothing different.
01:29:46.900 | I would do exactly, you know,
01:29:48.780 | this sounds a little crass
01:29:51.940 | and I hope it doesn't sound this way,
01:29:53.820 | but being a scientist is not what I do, it's who I am.
01:29:59.820 | It's not what I do, it's who I am.
01:30:01.660 | And when that's the case,
01:30:04.620 | sleep, working out, showering and eating
01:30:11.180 | are the things that I do in between my love affair with sleep.
01:30:16.180 | I fell for sleep like a blind roofer.
01:30:19.380 | And it was a love affair that started 20 years ago
01:30:28.340 | and I remain utterly besotted today.
01:30:33.140 | It's the most beguiling thing in the world to me.
01:30:37.460 | And I could easily, and I have, you know,
01:30:39.420 | it's kept me up at night.
01:30:41.580 | When my mind is fizzing with experimental ideas
01:30:44.180 | or I think I've got a new hypothesis or theory,
01:30:47.420 | I will struggle with sleep, I really will.
01:30:50.140 | It doesn't come easy to me
01:30:52.380 | because my mind is just so on fire with those ideas.
01:30:56.580 | So I understand the struggle,
01:30:58.940 | but I couldn't advocate from a scientific perspective
01:31:06.220 | the schedule because the science just doesn't,
01:31:09.740 | I would feel as though I'm doing you a disservice
01:31:14.860 | to say it's okay, that won't come with some blast radius,
01:31:19.860 | some health consequences.
01:31:23.060 | You can add Margaret Thatcher and Ronald Reagan
01:31:25.540 | to that list too.
01:31:26.700 | Both of them were very, you know, proud chest-beaters
01:31:30.900 | of how little sleep that they get.
01:31:32.300 | Thatcher said four hours, Reagan, something similar,
01:31:35.300 | you know, and I, knowing the links that we now know
01:31:38.180 | between sleep and Alzheimer's disease,
01:31:39.860 | I've often wondered whether it was coincidental then
01:31:42.940 | that both of them died of the terrible disease of Alzheimer's
01:31:46.380 | meaning, you know, maybe it doesn't get you
01:31:48.420 | by way of, you know, being popped out of the gene pool
01:31:51.140 | in a car accident 'cause you had a microsleep at the wheel
01:31:53.420 | at age 32, or it doesn't get you at 42 with, you know,
01:31:58.420 | heart attack or even 52 with cancer or a stroke,
01:32:03.820 | maybe it gets you in your 70s.
01:32:05.700 | I think the elastic band of sleep deprivation
01:32:07.780 | can stretch only so far before it snaps
01:32:10.780 | and it ultimately seems to snap.
01:32:12.980 | You know, Nikola Tesla, I think,
01:32:16.640 | died of a coronary thrombosis,
01:32:21.740 | I believe, and there was a wonderful study
01:32:24.420 | done out of Harvard where they took a group of people
01:32:26.580 | who had no signs of cardiovascular disease.
01:32:30.260 | And what they found is that when they track them
01:32:33.020 | for years afterwards, they were completely healthy
01:32:36.420 | to begin with, those people who were getting
01:32:38.380 | less than six hours of sleep ended up having
01:32:40.900 | a 300% increased risk of developing calcification
01:32:44.980 | of the coronary artery, which is the major
01:32:48.420 | sort of corridor of life for your heart
01:32:51.460 | when someone says, you know, he died of a massive coronary,
01:32:54.820 | it's because of a blockade of the coronary artery,
01:32:58.140 | you know, and Tesla, you know, passed away
01:33:00.340 | from a coronary thrombosis.
01:33:03.540 | We also know that insufficient sleep
01:33:05.140 | is linked to numerous mental health issues.
01:33:07.400 | We know that Churchill had a wicked battle with depression.
01:33:11.380 | Gosh, my goodness, he used to call it black dog
01:33:13.260 | that would come and visit him.
01:33:15.100 | And I think many of his paintings, he was an exquisite painter,
01:33:18.300 | but some of them would depict his darkness
01:33:20.820 | with depression as well.
01:33:22.840 | You know, Edison is interesting.
01:33:26.140 | People have argued that he would short sleep
01:33:28.620 | and he didn't put much value in sleep,
01:33:30.300 | whether or not that's true, we don't know,
01:33:31.780 | but he was a habitual napper, you're right,
01:33:33.540 | during the day, I've got some great pictures of him
01:33:35.340 | on his inventor's bench taking a nap.
01:33:37.340 | And in fact, I believe he set up nap cots around his house
01:33:41.100 | so he could nap.
01:33:42.740 | But what we also know is that a study, again,
01:33:44.740 | coming out of Harvard just a couple of months ago,
01:33:47.860 | demonstrated very clearly that polyphasic sleep
01:33:50.580 | is associated with worse physical outcomes,
01:33:52.980 | worse cognitive outcomes, and especially worse mood outcomes.
01:33:57.460 | So from that sense, you know,
01:33:58.900 | sleeping like a baby is not perfect for adults.
01:34:01.660 | - There's a fascinating dance here
01:34:05.700 | of the mean and the extreme,
01:34:09.100 | like the average and the high performers.
01:34:13.180 | So I,
01:34:17.900 | this gets to like the meaning of life kind of discussion,
01:34:21.420 | but let's go there.
01:34:24.140 | So, well, and also happiness.
01:34:25.900 | So when studying sleep and when studying anything
01:34:29.500 | like diet and exercise,
01:34:31.700 | I think you have to really get a lot more data
01:34:35.900 | about individuals to make a conclusive statement.
01:34:40.900 | When people talk about like, is meat,
01:34:44.380 | red meat good for you or bad for you, right?
01:34:47.060 | It's just so often correlated with other life decisions
01:34:50.660 | when you choose to eat meat or not.
01:34:53.060 | My sense is that whatever life decisions you make,
01:34:58.060 | if they reduce stress and lead to happiness,
01:35:03.500 | that's also going to be a big boost
01:35:06.300 | that needs to be integrated into the plots in the science.
01:35:09.020 | Right?
01:35:09.860 | So I'll give you an example of somebody
01:35:12.300 | who is unarguably seen as unhealthy.
01:35:16.580 | My friend, Mr. David Goggins.
01:35:18.940 | So he's clearly, obviously,
01:35:22.980 | almost on purpose destroying his body.
01:35:25.240 | And to say that he's doing the wrong thing
01:35:30.700 | or the unhealthy thing feels wrong.
01:35:35.700 | But I'm not sure exactly in which way he feels wrong.
01:35:41.580 | One of the things I'm bothered by,
01:35:43.700 | and again, I apologize for the therapy sessions,
01:35:47.100 | a framework of this, but I'm bothered by the fact
01:35:53.420 | that a lot of people tell me or David
01:35:58.140 | that they're doing things wrong.
01:35:59.740 | A lot of people in my life, when they see me not sleep,
01:36:05.260 | they'll tell me to sleep more.
01:36:09.260 | Now they're correct.
01:36:11.540 | But one fundamental aspect that I'd like to complain about
01:36:15.540 | is not enough people, almost nobody,
01:36:20.180 | especially people that care for me,
01:36:22.460 | will come to me and say,
01:36:24.240 | "You have a dream, work harder."
01:36:29.540 | It's like the healthy thing should be a component
01:36:39.700 | of a life well-lived.
01:36:41.340 | - Right.
01:36:42.180 | - But not everything.
01:36:44.420 | And I don't know what to do with that
01:36:45.820 | because you certainly don't want to espouse it.
01:36:48.540 | And just like you said, when you were working in your book,
01:36:51.380 | there is a belief, sleep was a secondary citizen
01:36:55.900 | in the full spectrum of what's a healthy life.
01:36:59.100 | But at the same time, what I'm bothered by in Silicon Valley
01:37:02.340 | and all these kinds of work environments
01:37:05.580 | that I get to work in with engineers
01:37:07.460 | is that there's, to me, too much focus on work-life balance.
01:37:12.060 | What that usually starts meaning is like,
01:37:16.100 | yeah, yeah, of course, it's good to have a social life,
01:37:18.460 | it's good to have a family,
01:37:19.540 | it's good to eat well and sleep well,
01:37:23.320 | but we should also discover our passion.
01:37:25.460 | We should also give ourselves a chance
01:37:30.140 | to work our ass off towards a dream
01:37:33.860 | and make mistakes and take big risks
01:37:36.940 | that in the short term seem to sacrifice health.
01:37:40.060 | - And I think, to come back to how you started
01:37:42.860 | about David Goggins, who I've never met,
01:37:45.340 | but who I admire incredibly
01:37:48.180 | and have an immense reverence for the man,
01:37:50.900 | you said two things.
01:37:53.740 | Is it wrong to do those things to yourself?
01:37:58.740 | And is it unhealthy to do those things to yourself?
01:38:03.700 | - I disagree with the former and I agree with the latter.
01:38:07.700 | So from a health biological medicine perspective,
01:38:10.900 | sleeping in the way that you've described
01:38:14.740 | or that other people may be sleeping
01:38:16.340 | in terms of insufficient amounts,
01:38:18.140 | now, to your point, too, about inter-individual differences,
01:38:24.340 | usually when I see a bar graph and a mean,
01:38:27.140 | I usually say, show me your variance.
01:38:30.380 | I want to see your variance.
01:38:32.140 | In other words, show me the distribution of that effect.
01:38:35.420 | How many people were below the mean?
01:38:36.940 | Is it all tightly clustered around this one thing,
01:38:39.980 | so it's a very robust effect?
01:38:41.620 | Or was there this huge fanning of effects where for some people
01:38:44.540 | there was no effect at all, for other people
01:38:46.260 | there was a whopping effect, and everything in between?
01:38:49.020 | So I don't discount inter-individual variability.
01:38:52.700 | But, and I will come back to those two points
01:38:56.820 | about is it wrong and is it unhealthy in just a second.
01:38:59.600 | When it comes to sleep, we have found huge amounts
01:39:02.440 | of inter-individual differences
01:39:04.680 | in your response to a lack of sleep.
01:39:07.240 | But one of the fascinating things,
01:39:09.480 | so let's say that I take you
01:39:10.920 | and we're going to measure your attention,
01:39:13.280 | your emotion, your mood, your blood pressure,
01:39:16.960 | your blood sugar glucose regulation,
01:39:19.200 | your autonomic nervous system,
01:39:20.660 | and your different gene expression.
01:39:22.400 | Let's say I'm just going to measure a whole kaleidoscope
01:39:25.760 | of different outcomes, brain and body.
01:39:28.760 | And I find that on our measure of cognition,
01:39:31.180 | on your attentional ability to focus,
01:39:33.640 | you are very resilient.
01:39:35.220 | You just don't show any impairments at all,
01:39:37.120 | even after being awake for 36 hours straight.
01:39:40.040 | Does that mean that you are resilient
01:39:42.560 | in all of those other domains as well?
01:39:44.600 | The answer is no, you're not.
01:39:46.420 | So you can be resilient in one,
01:39:48.840 | but very vulnerable in another.
01:39:51.600 | And we've not found anyone who isn't at least vulnerable
01:39:56.160 | in one of those domains.
01:39:58.360 | Meaning that it's somewhat safe to say
01:40:01.320 | that not getting sufficient sleep
01:40:03.480 | will lead to some kind of impairment
01:40:06.360 | in any one given individual.
01:40:08.040 | It may not be the same impairment,
01:40:10.560 | but it's likely to be an impairment.
01:40:12.680 | But to come back to the question,
01:40:14.500 | I think it's wrong to tell anyone
01:40:16.800 | that it's wrong to do what they're doing,
01:40:19.320 | even if they are compromising their sleep,
01:40:21.760 | even if they're compromising their mental health.
01:40:24.840 | As long as they're not hurting anyone else,
01:40:28.380 | then I think the answer is that's that person's choice.
01:40:33.020 | - Yeah, but that's that person's life.
01:40:34.220 | - I'd like to push back further.
01:40:35.620 | So see the way you kind of said it,
01:40:38.060 | yes, you're absolutely right.
01:40:41.740 | But I would like to say a stronger statement,
01:40:45.460 | which is you should let go of that judgment
01:40:49.420 | of somebody is wrong and allow yourself to be inspired
01:40:53.220 | by the great heights they have reached.
01:40:55.900 | So take yourself out of the seat of being a judger
01:40:59.980 | of what is healthy or not,
01:41:01.420 | and appreciate the greatness of a particular human.
01:41:05.260 | You watch the Olympics,
01:41:07.020 | the kind of things that some athletes do
01:41:09.100 | to reach the very heights.
01:41:11.780 | The Olympics are taking years off their life.
01:41:15.020 | They suffer depression after the Olympics often.
01:41:18.540 | - Their physiology is disastrous.
01:41:21.100 | - Everything, their personal life,
01:41:23.380 | their psychology, their physiology, everything.
01:41:27.900 | It's a giant mess.
01:41:29.060 | So the question is about life.
01:41:31.600 | Healthy now means longevity,
01:41:37.860 | quality of life over a prolonged period of time,
01:41:42.060 | optimal performance over a prolonged period of time.
01:41:46.300 | But to me, beauty is reaching great heights.
01:41:52.260 | And there's a dance there
01:41:54.100 | that sometimes reaching great heights
01:41:56.220 | requires sacrifice of health
01:41:58.300 | and not like a calculation
01:42:01.420 | where you sat down on a sheet of paper and say,
01:42:03.100 | I'm going to take seven years off my life
01:42:05.700 | for an Olympic gold medal.
01:42:07.540 | No, it requires a more chaotic journey
01:42:11.020 | that doesn't do that kind of calculus.
01:42:13.300 | And I just want to kind of speak to
01:42:15.820 | the struggle in our culture over what is healthy and not,
01:42:19.420 | we want to be able to speak to what is healthy
01:42:22.820 | and at the same time be inspired by the great heights
01:42:26.460 | that humans reach no matter how healthy
01:42:29.900 | or unhealthy they live.
01:42:32.700 | - Yeah, I agree with that.
01:42:34.140 | I think if that's a flag you're hoisting,
01:42:35.500 | I will definitely salute it because it really depends,
01:42:38.380 | what are you trying to optimize for in your life?
01:42:41.660 | And if you are, I think the only danger potentially
01:42:45.500 | with that mindset is that
01:42:48.100 | if you look at many of the studies of old age
01:42:51.460 | and end of life,
01:42:53.340 | most people say, I never look back on my life
01:42:57.220 | and wish I worked harder.
01:43:01.260 | I wish instead I'd spent more time with family, friends
01:43:06.260 | and engaged in that aspect.
01:43:08.380 | Now I'm not saying though, coming back to your point,
01:43:11.320 | that that is the standard rubric for everyone.
01:43:13.820 | I don't believe it is too.
01:43:15.860 | And there are many things that you and I
01:43:17.660 | are both benefiting from today,
01:43:19.620 | even in the field of medicine,
01:43:21.300 | where people have sacrificed their own longevity
01:43:26.460 | for the quest of solving a particular medical problem
01:43:30.180 | and they died quicker because of their commitment,
01:43:36.220 | because they wished to try and solve that problem
01:43:40.400 | in their pursuit of greatness scientifically.
01:43:43.100 | And I now benefit.
01:43:44.820 | Am I grateful that they did that?
01:43:46.580 | Incredibly grateful.
01:43:48.100 | A simpler demonstration is this,
01:43:51.700 | if tonight at 4 a.m. in the morning,
01:43:54.540 | I have a ruptured appendix, I have an appendicitis,
01:43:59.540 | I am incredibly grateful that there is an emergency team
01:44:04.580 | that will take me to the hospital at 4 a.m. in the morning.
01:44:07.700 | They are awake, they're not sleeping and they save my life.
01:44:12.380 | And that's part of what their life's mission and quest is.
01:44:17.380 | And they saved another's life by, in some ways,
01:44:21.500 | shaving a little of their own off.
01:44:24.740 | So I don't take, I have no umbrage
01:44:27.980 | with that mentality at all.
01:44:30.220 | I think you just have to be very clear
01:44:31.780 | about what you're optimizing for.
01:44:34.980 | And my worry is that most people fall into the rat race
01:44:39.740 | and they never actually ask the question,
01:44:42.180 | why am I doing this?
01:44:43.620 | - If you're just working nine to five,
01:44:46.300 | and you allow that nine to five to stretch into much longer,
01:44:51.700 | but it's nevertheless a job that's kind of like,
01:44:54.180 | wears you down, that's one thing.
01:44:56.500 | Another thing is when it is a lot like,
01:44:59.280 | it's a dream.
01:45:02.220 | - Your life mission is to accomplish.
01:45:04.420 | And for that, I think as long as you know
01:45:08.300 | what it is that you could be doing to yourself
01:45:11.060 | and you are comfortable and A-okay with that,
01:45:13.600 | I have no problem with that at all.
01:45:17.820 | Again, as I said, as a scientist,
01:45:20.220 | I cannot, should not, and will not tell anyone
01:45:23.060 | what they should do with their life.
01:45:24.860 | All I want you to be able to do is say,
01:45:27.340 | okay, now I understand more about the,
01:45:31.660 | previously these were the known unknowns
01:45:35.820 | and these were the unknown unknowns.
01:45:38.220 | And now I am slightly more cognizant,
01:45:41.400 | I have more knowns than I had before
01:45:46.400 | regarding my sleep and my health,
01:45:48.420 | knowing that information,
01:45:49.900 | do I still choose to make this decision?
01:45:53.860 | And if that's what I offered,
01:45:56.960 | then I think I've done my job.
01:45:59.520 | That's all I want to offer is just added information
01:46:02.380 | into the decision algorithm
01:46:04.860 | and what you end up choosing as an output of that algorithm
01:46:09.260 | has nothing to do with me.
01:46:11.220 | It's not my business and I will never judge anyone for it.
01:46:14.700 | And as I said, I'm immensely grateful
01:46:16.260 | for people who have sacrificed much in their lives
01:46:19.580 | to give me what I have.
01:46:21.380 | - So you're saying as long as the sacrifice is
01:46:23.180 | sort of grounded in knowledge of what the sacrifice is,
01:46:27.540 | that sleep is important, all those kinds of things.
01:46:29.020 | - And that you're comfortable with it,
01:46:30.260 | that it is your conscious choice
01:46:32.140 | rather than feeling as though you're trapped
01:46:34.500 | or that you are just, you haven't thought about it.
01:46:38.020 | And you start that job at age 32
01:46:41.260 | and then you wake up the next morning and you're 65
01:46:44.100 | and you think, where did my life go?
01:46:45.620 | What was I doing?
01:46:46.940 | That to me, I would feel, I would want to hug you
01:46:49.500 | and I would say, I'm just, and I'm not sounding,
01:46:52.940 | I don't mean to sound belittling here at all.
01:46:56.420 | I would just not wish that for you.
01:46:58.940 | I would wish that you could have thought about
01:47:04.140 | what it was that you're doing and not have that regret.
01:47:06.580 | - Yeah, so I guess I'm, this is for you, the listener.
01:47:10.060 | I'm coming out of the closet here a little bit.
01:47:12.020 | The fact that I enjoy the madness I live in.
01:47:14.660 | So please do not criticize me, embrace me.
01:47:17.460 | I understand the sacrifices I'm making.
01:47:20.860 | I enjoy sleeping on the floor
01:47:22.580 | when I'm passionately programming all night
01:47:25.060 | and just pass out on the carpet.
01:47:28.700 | I love this life, okay?
01:47:30.500 | So it's, but it's definitely something I think about
01:47:33.900 | that there's a balance to strike where--
01:47:38.060 | - I just want you to have as much of it though.
01:47:40.380 | - Of life.
01:47:41.260 | See, quality of life is important.
01:47:47.940 | - I should have said, I want you to have
01:47:51.020 | as much high quality life.
01:47:53.060 | And if high quality of life means
01:47:55.940 | I spend five decades on this planet,
01:48:01.260 | but yet in that time, I am thrilled every day.
01:48:04.660 | I'm turned on every day by what I do.
01:48:07.900 | And I reveled in this thing called my life's work.
01:48:12.900 | I think that that is a 50 year journey
01:48:17.260 | of absolute delight and fulfillment that you should take.
01:48:21.960 | - I think about my death all the time.
01:48:26.740 | I meditate on death.
01:48:28.000 | I'm okay to die today.
01:48:31.580 | So to me, longevity is not a significant goal.
01:48:36.580 | I'm so happy to be alive.
01:48:40.140 | I don't even think it would suck to die today.
01:48:43.620 | I'm as afraid of it today as I will be in 50 years.
01:48:47.500 | I don't wanna die as much today as I will in 50 years.
01:48:51.940 | There's of course all these experiences
01:48:55.060 | I would like to have, but everything's already amazing.
01:48:59.900 | It's like that Lego movie.
01:49:01.020 | So I don't know.
01:49:02.900 | So to me, I just wanna keep doing this.
01:49:06.540 | And there's of course things that could affect,
01:49:11.540 | like you mentioned dementia and these deterioration
01:49:16.880 | of the mind or the body that can significantly affect
01:49:21.140 | the quality of life.
01:49:23.580 | And so you want to-
01:49:24.420 | - As long as you're aware of that.
01:49:26.260 | And that's the price you pay for entry
01:49:28.540 | into this magical kingdom that you are experiencing,
01:49:32.900 | which is a lovely thing.
01:49:34.340 | I feel privileged too.
01:49:37.060 | I can't believe the life that I live.
01:49:38.940 | It's incredible.
01:49:41.620 | And just like you, I think about mortality a great deal.
01:49:46.620 | I think a lot about death, but I don't worry about death.
01:49:51.140 | I probably, with the exception of the potential pain
01:49:56.060 | that comes before it, that some people,
01:49:58.420 | many people can suffer, that maybe concerns me.
01:50:02.140 | But I actually think about mortality as a tool,
01:50:06.640 | as I use it as a lens through which I can then retrospect.
01:50:11.640 | And by placing myself at the point of future mortality,
01:50:16.940 | I can then use it as a retrospective lens to focus
01:50:21.580 | and ask the following question,
01:50:23.460 | is there anything I feel I would regret
01:50:26.920 | and therefore change in the life that I currently have now?
01:50:30.620 | That's the way I meditate and use mortality as a question,
01:50:36.220 | is to try and course correct and focus my life.
01:50:39.580 | I worry not about dying,
01:50:41.740 | but I like to think about death
01:50:44.980 | as a way to prioritize my life, if that makes sense.
01:50:48.260 | I don't know if that makes sense.
01:50:49.540 | - No, it makes total sense to decide
01:50:52.740 | how do you want to live today so that in the future,
01:50:57.740 | you do not regret the way you've lived today.
01:51:00.100 | - Right, and to place yourself in the future
01:51:02.340 | at your point of mortality is one way to, I think,
01:51:06.220 | as an exercise to retrospectively look back
01:51:10.460 | and not lose out on informed choices
01:51:13.020 | that you could otherwise lose out on
01:51:15.420 | if you weren't thinking about mortality.
01:51:18.060 | - Yeah, it clarifies your thinking.
01:51:22.140 | Is there, so I mentioned I sleep on the floor,
01:51:24.900 | take naps and power naps, and it's just kind of madness.
01:51:28.620 | Is there weirdnesses to your own sleep schedule
01:51:31.860 | as a scientist that does incredible work,
01:51:35.300 | has a lot of things going on,
01:51:37.100 | has to lead research, has to write research,
01:51:41.820 | has to be a science communicator,
01:51:44.580 | also have a social life, all those kinds of things.
01:51:46.780 | Is there certain patterns to your own sleep
01:51:49.940 | that you regret or you participate in
01:51:54.940 | that you find you enjoy?
01:51:58.420 | Is there some personal stuff, quirks,
01:52:03.140 | or things you're proud of that you do
01:52:05.020 | in terms of your sleep schedule?
01:52:07.340 | - The funny thing about being a sleep researcher
01:52:10.620 | is that it doesn't make you immune
01:52:13.020 | to the ravages of a difficult night of sleep,
01:52:16.300 | and I have battled my own periods of insomnia in my life too.
01:52:21.300 | And I think I've been fortunate in ways
01:52:26.900 | because I know how sleep works
01:52:28.460 | and I know how to combat insomnia.
01:52:30.300 | I know how to get it under control
01:52:32.660 | because insomnia in many ways is a condition
01:52:36.940 | where all of a sudden your sleep controls you
01:52:40.500 | rather than you control your sleep.
01:52:42.500 | - Wow, yeah, that's a beautiful way to put it, yeah.
01:52:46.460 | - And I know when I'm starting to lose control
01:52:51.460 | and it's starting to take control,
01:52:54.860 | I understand how to regain,
01:52:56.860 | but it doesn't happen overnight.
01:53:01.420 | It takes a long time.
01:53:03.620 | - So you've struggled with insomnia in your life?
01:53:06.780 | - I have, not all of my life.
01:53:08.340 | I would say I've probably had three
01:53:09.740 | or four really severe bouts,
01:53:12.420 | and all of them usually triggered
01:53:14.860 | by emotional circumstances, by stress.
01:53:17.740 | - Stress that's connected to actual events in life
01:53:22.180 | or stress that's unexplainable?
01:53:24.380 | - Well, externally triggered, yeah.
01:53:27.260 | It's sort of what we would call reactive stress.
01:53:30.020 | And so that's sort of point number one
01:53:36.460 | about the idiosyncrasies.
01:53:38.620 | The point number two is that when you are having
01:53:41.180 | a difficult night of sleep, as a sleep researcher,
01:53:45.020 | you basically have become the Woody Allen neurotic
01:53:47.820 | of the sleep world.
01:53:48.860 | Because at that moment, I'm trying to fall asleep
01:53:53.020 | and I'm not, and I'm starting to think,
01:53:55.020 | okay, my dorsolateral prefrontal cortex is not shutting down,
01:53:57.940 | my noradrenaline is not ramping down,
01:53:59.860 | my sympathetic nervous system
01:54:01.180 | is not giving way to my parasympathetic.
01:54:02.940 | At that point, you are dead in the water
01:54:05.380 | for the next two hours and nothing is bringing you back.
01:54:08.140 | So there is some irony in that too.
01:54:11.460 | I would say for myself though,
01:54:13.460 | if there is something I'm not proud of,
01:54:16.940 | it has been at times railing against my chronotype.
01:54:21.940 | So your chronotype is essentially,
01:54:25.540 | are you a morning type, evening type,
01:54:27.380 | or somewhere in between?
01:54:28.580 | - Yeah.
01:54:29.420 | - And there were times because society
01:54:32.580 | is desperately biased towards the morning types.
01:54:37.100 | This notion of the early bird catches the worm.
01:54:40.540 | Maybe that's true, but I'll also tell you
01:54:42.660 | that the second mouse gets the cheese.
01:54:44.540 | - Yeah.
01:54:47.060 | - So I think one of the issues--
01:54:49.540 | - That's a good line.
01:54:50.500 | - Around, firstly, people don't really understand chronotype
01:54:55.300 | because I'll have some people
01:54:56.580 | when I'm sort of out in the public,
01:54:58.100 | they'll say, look, I struggle with terrible insomnia
01:55:00.180 | and I'll ask them, is it problems falling asleep
01:55:02.380 | or staying asleep?
01:55:03.220 | And they'll say falling asleep.
01:55:05.140 | And then I'll say, look, if you are on a desert island
01:55:07.900 | with nothing to wake up for, no responsibilities,
01:55:11.660 | what time would you normally go to bed
01:55:13.140 | and what time would you wake up?
01:55:14.300 | And they would say, I'd probably like to go to bed
01:55:15.820 | about midnight and wake up maybe eight in the morning.
01:55:18.660 | And then I'd say, so what time do you now go to bed?
01:55:20.900 | And they'd say, well, I've got to be up for work early,
01:55:22.940 | so I get into bed at 10.
01:55:25.340 | I'd say, well, you don't have insomnia,
01:55:26.780 | you have a mismatch between your biological chronotype
01:55:29.860 | and your current sleep schedule.
01:55:31.780 | And when you align those two,
01:55:33.620 | and I was fighting that for some time too,
01:55:35.900 | I'm probably mostly right in the middle.
01:55:40.780 | I am desperately vanilla, unfortunately,
01:55:43.700 | in many aspects of life, but this included,
01:55:47.060 | I'm neither a strong morning type
01:55:48.420 | nor a strong evening type.
01:55:49.940 | So ideally, I'd probably like to go to bed around 11,
01:55:53.500 | 10.30, 11, probably somewhere between 10.30 and 11
01:55:56.900 | and wake up, I naturally wake up usually most days
01:56:00.220 | before my alarm at 7.04.
01:56:04.340 | And it's 7.04 because why not be idiosyncratic
01:56:08.540 | in terms of setting an alarm?
01:56:09.380 | - I love it.
01:56:10.980 | - And so I--
01:56:13.180 | - That's kind of awesome, I've never heard about that.
01:56:15.060 | That's amazing, I'm gonna start doing that now,
01:56:17.300 | setting alarms like a little bit off the--
01:56:19.380 | (laughing)
01:56:20.340 | - Yeah, I'm never quite sure why we all--
01:56:22.380 | - It's a celebration of uniqueness.
01:56:24.780 | - Yeah, and I am quite the odd snowflake in that sense too.
01:56:28.820 | So I would usually then try to force myself
01:56:31.500 | because I had that same mentality that if I wasn't up
01:56:34.940 | at 6.30 and in the gym by seven,
01:56:39.060 | that there was something wrong with me.
01:56:41.820 | And I quickly abandoned that.
01:56:44.540 | But if I look back, if there was a shameful act
01:56:46.620 | that I have around my sleep, I think it would be that
01:56:49.180 | for some years until I really started
01:56:50.860 | to get more detailed into sleep.
01:56:53.140 | And now I have no shame in telling people
01:56:56.620 | that I will probably usually wake up around 6.45 naturally,
01:57:01.620 | sometimes seven, when people are looking at me thinking,
01:57:06.100 | you're a sloth, you're lazy.
01:57:07.940 | And I don't finish my daily workout until,
01:57:13.780 | I'm not working until probably nine o'clock in the morning.
01:57:17.860 | They're thinking, what are you doing?
01:57:19.500 | Now I will work late into the day.
01:57:22.940 | If I could, I would work 16 hours.
01:57:25.500 | It's my passion just like yours.
01:57:27.300 | So I don't feel shame around that,
01:57:31.980 | but I have changed my mentality around that.
01:57:34.460 | - It's complicated because I'm probably happiest
01:57:40.540 | going to bed, if I'm being honest, like at 5 a.m.
01:57:45.940 | - That's fine, you're just an extreme evening type.
01:57:48.180 | - But the problem is, it's not that I'm ashamed for it.
01:57:54.700 | I actually kind of enjoy it 'cause I get to sleep
01:57:56.540 | through all the nonsense of the morning.
01:57:59.660 | - Isn't that a beautiful thing?
01:58:01.180 | - Like people are busy with their emails
01:58:03.260 | and I just am happy as a cow.
01:58:06.340 | And I wake up, after all the drama has been resolved.
01:58:09.540 | - Yeah, and cows are happy and the drama has been resolved.
01:58:12.260 | - Exactly, but in society you do,
01:58:15.580 | especially, I mean, this is what I think about is,
01:58:18.080 | when you work on a larger team, especially with companies,
01:58:22.940 | you are, everybody's awake at the same time.
01:58:26.780 | So that's definitely been a struggle to try to figure out,
01:58:31.100 | just like you said, how to balance that,
01:58:34.140 | how to fit into society and yet be optimal
01:58:36.540 | for your chronotype.
01:58:38.580 | - Yeah, you have to sleep in synchrony with it and harmony.
01:58:42.980 | Because normally what we know is that if you fight biology,
01:58:47.540 | you'll normally lose.
01:58:48.940 | And the way you know you've lost is
01:58:50.620 | through disease and sickness.
01:58:53.020 | - You said you suffered through several bouts of insomnia.
01:58:56.540 | Is there, aside from embracing your chronotype,
01:59:02.460 | is there advice you can give how to overcome insomnia
01:59:05.900 | from your own experience?
01:59:07.780 | - Right now, the best method that we have
01:59:09.660 | is something called cognitive behavioral therapy
01:59:11.780 | for insomnia or CBTI for short.
01:59:15.020 | And you work with, for people who don't know what it is,
01:59:18.720 | you work with a therapist for maybe six weeks
01:59:21.380 | and you can do it online, by the way,
01:59:22.740 | I recommend probably jumping online, it's just the easiest.
01:59:25.820 | And it will change your beliefs, your habits,
01:59:31.100 | your behaviors and your general stress
01:59:33.060 | around this thing called sleep.
01:59:34.780 | And it is just as effective as sleeping pills
01:59:36.700 | in the short term.
01:59:38.060 | But what's great is that unlike sleeping pills,
01:59:40.500 | when you stop working with your therapist,
01:59:43.100 | those benefits last for years later.
01:59:45.780 | Whereas when you stop your sleeping pills,
01:59:47.300 | you typically have what's called rebound insomnia
01:59:49.460 | where your sleep not only goes back
01:59:50.820 | to being as bad as it was before, it's usually even worse.
01:59:54.340 | For me, I think I found a number of things effective.
01:59:59.580 | The first is that I had to really address
02:00:02.580 | what was stressful and try to come up with some degree
02:00:07.580 | of meaningful rationality around it.
02:00:11.580 | Because I think one of the things that happens,
02:00:13.620 | there's something very, talking about conscious states
02:00:15.860 | to come all the way back to, gosh, I don't know,
02:00:19.540 | I feel like we've only been chatting for like 20 minutes,
02:00:21.740 | but you're gonna tell me it's been a while.
02:00:23.340 | - Yeah, it's been a while.
02:00:24.380 | - Okay, I'm desperately, I feel terribly sorry.
02:00:27.980 | But let's come back to conscious states
02:00:29.500 | which is where we started.
02:00:31.320 | There is something very strange about the night
02:00:36.380 | that thoughts and anxieties are not the same
02:00:41.300 | as they are in the waking day.
02:00:43.380 | They are worse, they are bigger.
02:00:45.780 | And I at least find that I am far more likely
02:00:50.300 | to catastrophize and ruminate at night about things
02:00:58.140 | that when I wake up the next day in the broad light of day,
02:01:01.340 | I think it's nowhere near that bad, man.
02:01:03.980 | What were you doing?
02:01:04.900 | It's not that bad at all.
02:01:06.900 | So to gain firstly, some rational understanding
02:01:10.420 | of my emotional state that's causing that insomnia
02:01:14.260 | was very helpful.
02:01:15.660 | The second thing was to keep regularity,
02:01:18.900 | just going to bed at the same time waking up.
02:01:21.260 | And here's an unconventional piece of sleep advice.
02:01:25.340 | After a bad night of sleep, do nothing.
02:01:30.340 | Don't wake up any later, don't go to bed any earlier,
02:01:36.800 | don't nap during the day, and don't drink any more coffee
02:01:40.980 | than you would otherwise.
02:01:42.480 | Because if you end up sleeping later into the morning,
02:01:47.120 | you're then not going to be tired
02:01:49.440 | at your normal time at night.
02:01:51.040 | So then you're gonna get into bed thinking,
02:01:53.200 | well, I had a terrible night of sleep last night.
02:01:55.040 | And yes, I slept in this morning to try and compensate,
02:01:58.780 | but I'm still gonna get to bed at my normal time.
02:02:00.900 | But now you get into bed and you haven't been awake
02:02:03.600 | for as long as you normally would.
02:02:05.360 | So you're not as sleepy as you normally would be.
02:02:07.640 | And so now you sit there lying in bed
02:02:10.240 | and it's another bad night.
02:02:12.120 | And the same thing is, if you go to bed any earlier,
02:02:15.960 | so don't wake up any later, wake up at the same time,
02:02:20.040 | don't go to bed any earlier,
02:02:21.560 | because then you're just probably, your chronotype,
02:02:23.800 | your biological rhythm doesn't want you to be asleep.
02:02:26.560 | And you think, well, terrible night,
02:02:28.960 | I'm gonna get into bed at 9 p.m.
02:02:31.040 | rather than my standard 10.
02:02:32.560 | You're just gonna be lying in bed awake for that hour.
02:02:35.260 | Naps are a double-edged sword.
02:02:37.320 | They can have wonderful benefits.
02:02:38.640 | And we've done lots of studies on naps
02:02:40.440 | for both the brain and the body,
02:02:42.240 | but they are a double-edged sword in the sense that
02:02:45.560 | napping will just take the edge off your sleepiness.
02:02:50.080 | It's a little bit like a valve on a pressure cooker.
02:02:52.240 | When you nap during the day,
02:02:54.280 | you can take some of that healthy sleepiness
02:02:56.840 | that you've been building up during the day.
02:02:59.120 | And for some people, not all people,
02:03:00.900 | but for some people, that can then make it harder
02:03:02.820 | for them to fall asleep at night
02:03:04.200 | and then stay asleep soundly across the night.
02:03:07.320 | So the advice would be,
02:03:08.440 | if you're struggling with sleep at night,
02:03:10.240 | don't nap during the day.
02:03:12.080 | But if you are not struggling with sleep
02:03:14.560 | and you can nap regularly, naps are just fine.
02:03:17.680 | And we can play around with optimal durations
02:03:20.200 | depending on what you want.
02:03:21.600 | Just try not to nap too late into the day
02:03:23.720 | because napping late into the day
02:03:25.240 | is like snacking before your main meal.
02:03:27.380 | It just takes the edge off your sleep hunger, as it were.
02:03:30.360 | But that would be,
02:03:32.360 | so that's my unconventional second piece of advice
02:03:35.320 | regarding insomnia.
02:03:37.160 | The third is meditation.
02:03:39.020 | I found meditation to be incredibly powerful.
02:03:41.280 | I started reading about meditation
02:03:43.940 | as I was researching that aspect of the book many years ago.
02:03:48.940 | And as a hard-nosed scientist,
02:03:51.060 | I thought this sounds very woo-woo.
02:03:53.720 | This is sort of, we all hold hands and sing "Kumbaya"
02:03:57.360 | and everything's going to be fine with sleep.
02:03:59.420 | I read the data and it was compelling.
02:04:02.680 | I couldn't ignore it.
02:04:04.280 | And I started meditating.
02:04:06.040 | And that was six years ago and I haven't stopped.
02:04:09.640 | And I find meditation before bed incredibly powerful.
02:04:14.280 | The meditation app companies
02:04:16.160 | were perplexed at this at first.
02:04:17.500 | They want people to meditate during the day.
02:04:19.440 | But when they looked at their usage statistics,
02:04:21.700 | they found that they would have people
02:04:23.500 | in the morning meditating.
02:04:24.960 | And then there's a huge number of people
02:04:26.900 | using the meditation app in the evening.
02:04:28.940 | What they were doing was self-medicating their insomnia.
02:04:32.300 | And they finally, rather than railing against it,
02:04:34.960 | they started to see it as a cash cow, rightly so.
02:04:39.720 | So I found meditation to be helpful.
02:04:41.840 | Having a wind down routine
02:04:43.400 | is the other thing that's critical for me.
02:04:45.840 | I can't just go from, because when my mind is switched on,
02:04:49.200 | and I think you may be like this too,
02:04:51.560 | if I get into bed, that rolodex of thoughts
02:04:55.720 | and information and excitement and anxiety and worry
02:05:00.000 | is just whirling away.
02:05:02.680 | And it's not gonna be a good night for me.
02:05:05.320 | So I have to find a wind down routine.
02:05:07.640 | And that makes sense when you realize what sleep is like.
02:05:10.640 | Sleep is not like a light switch.
02:05:12.360 | Sleep is much more like trying to land a plane.
02:05:17.320 | It takes time to descend down onto the terra firma
02:05:20.600 | that we call sound sleep at night.
02:05:23.360 | And we have this for kids.
02:05:25.240 | I don't have children, but a lot of parents will say,
02:05:31.320 | we have to have the bedroom, sorry, the bedtime routine.
02:05:35.120 | You bathe the kid, you put them in bed,
02:05:37.560 | you read them a story.
02:05:38.880 | You have to go through this routine,
02:05:40.560 | this wind down routine for them.
02:05:42.400 | And then they fall asleep wonderfully.
02:05:44.520 | Why do we abandon that as adults?
02:05:47.800 | We need that same wind down routine.
02:05:51.440 | So that's been the other thing
02:05:53.240 | that's been very helpful to me.
02:05:54.680 | So don't do anything different.
02:05:57.200 | If you have a bad night of sleep, keep doing the same thing.
02:06:01.000 | Manage your anxiety, understand it, rationalize it.
02:06:04.120 | Then meditation.
02:06:07.520 | And then finally having some kind
02:06:09.600 | of disengagement wind down routine.
02:06:12.020 | Those are the four things that have been very helpful to me.
02:06:15.640 | - That's brilliant.
02:06:16.480 | So the regularities really do a lot of work against insomnia.
02:06:20.600 | Is it possible to have a healthy sleep life
02:06:28.800 | without the regularities?
02:06:31.020 | - I say that because I'm all over the place
02:06:34.820 | and I've gotten good at being all over the place.
02:06:37.220 | So I'll often, like what happens,
02:06:40.780 | I'll go stretches of time.
02:06:42.140 | There'll be some times a month where my days are like,
02:06:46.260 | this is embarrassing to admit, but they're like-
02:06:49.180 | - It's just you and I here.
02:06:51.140 | Just you and I.
02:06:51.980 | - It's like 28 hours or 30 hour days.
02:06:54.420 | I'll just go all the way around comfortably and happily.
02:06:59.980 | I love it.
02:07:00.820 | And then there'll be a nap.
02:07:02.420 | I mean, if you add up the hours,
02:07:05.340 | when I'm just sleeping as much as I want,
02:07:09.020 | it'll probably be like six hour average per 24 hours.
02:07:12.500 | So it works out nicely.
02:07:16.700 | Maybe even seven hours, I don't know.
02:07:18.460 | But it's obviously irregular
02:07:21.500 | and there's chaos in the whole thing.
02:07:24.100 | So sometimes it's shorter sleep, sometimes it's longer.
02:07:26.740 | Is that totally not a good thing, do you think?
02:07:31.260 | - The best evidence that we have to speak to this question
02:07:34.100 | is people who are doing rotating shifts.
02:07:36.380 | And unfortunately, the news is not good.
02:07:42.100 | They usually have a higher incidence of many diseases
02:07:45.820 | such as depression, diabetes, cardiovascular disease,
02:07:50.820 | obesity, stroke, and so on.
02:07:57.260 | And again, that's just me communicating the data
02:08:04.100 | that we have and I'm not telling you
02:08:06.540 | that you should do anything different.
02:08:08.820 | The other thing is that there's nothing in your biology
02:08:13.260 | that suggests that that's how your body was designed
02:08:16.540 | to sleep.
02:08:17.380 | It is a system that loves habit.
02:08:24.940 | If your circadian clock in your brain,
02:08:27.940 | it's called the suprachiasmatic nucleus,
02:08:30.900 | sits in the middle of your brain, had a personality trait,
02:08:33.900 | it would be a creature of habit.
02:08:36.540 | It loves habit.
02:08:38.420 | That's how your biology is designed to work
02:08:42.180 | is through very archetypal, prototypical, expected cycles.
02:08:47.180 | And when we do something different to that,
02:08:52.540 | then you start to see some of the pressure,
02:08:55.860 | stress, fractures in the system.
02:08:58.820 | But again, to your point,
02:09:01.580 | if that's something that you don't mind,
02:09:04.100 | adopting and understanding,
02:09:07.100 | then I think you should keep doing what you're doing.
02:09:12.340 | - It's complicated.
02:09:13.580 | Of course, you have to be a student of your own body
02:09:15.380 | and explore it.
02:09:16.260 | One of the reasons I wanna have kids
02:09:19.980 | is kids enforce a stricter schedule.
02:09:22.580 | (laughing)
02:09:23.620 | I think I definitely--
02:09:25.060 | - So I've heard.
02:09:26.460 | - I definitely feel that I'm not living the sort of data-wise,
02:09:31.460 | scientifically speaking, the optimal life.
02:09:35.060 | And me just living the way I wanna live day-to-day
02:09:38.060 | is perhaps not the optimal way.
02:09:40.140 | And there's certain things that I've seen,
02:09:42.020 | very successful people that I know in my life,
02:09:45.740 | when they have kids,
02:09:48.540 | they actually, their productivity goes up,
02:09:50.500 | they get their shit together.
02:09:51.700 | There's a lot of aspects that--
02:09:53.100 | - Regularity, yeah.
02:09:54.300 | - Yeah, the regularity.
02:09:55.460 | I mean, that creature of habit.
02:09:56.900 | That's the thing, that's power.
02:09:58.940 | And then you start to optimally use the hours
02:10:01.580 | you have in the day.
02:10:02.780 | Let me ask you about--
02:10:03.620 | - Well, actually, I just have one quick point on that too.
02:10:06.820 | We often think about sleep as a cost,
02:10:11.820 | but instead I think of sleep as an investment.
02:10:18.220 | And the reason is because your effectiveness
02:10:20.700 | and your efficiency when you're well-slept
02:10:23.140 | typically exceeds that when you're not.
02:10:27.060 | And to me, it's the idea of,
02:10:28.700 | if I'm going to boil a pot of water,
02:10:31.460 | why would I boil it on medium
02:10:35.380 | when I could boil it in half the time on high?
02:10:37.900 | - Yeah.
02:10:40.380 | - And I sometimes worry that when I speak
02:10:43.620 | to Fortune 500 companies,
02:10:45.180 | and they're of this mentality of long hours,
02:10:49.580 | getting people to rise and grind.
02:10:51.700 | The first point is that after about 20 hours of being awake,
02:10:56.180 | a human being is as cognitively impaired
02:10:58.740 | as they would be if they were legally drunk.
02:11:01.700 | And the reason I bring that point up
02:11:03.980 | is because I don't know any company or CEO who would say,
02:11:08.980 | I've got this great team, they're drunk all the time.
02:11:13.260 | But we often laud the airport warrior
02:11:16.340 | who's flown through three different time zones
02:11:19.340 | in the past two days is on email at 2 a.m.
02:11:21.980 | and then is in the office at six.
02:11:24.740 | And I think there is some aspect, not in all people,
02:11:28.980 | but there is sort of some aspect
02:11:30.380 | of that slight sleep machismo.
02:11:33.060 | And that's not what, you are very different.
02:11:36.940 | You are driven by a purity of passion
02:11:40.460 | and a very authentic, incredibly genuine goal
02:11:44.180 | of wanting to do something remarkable with your life.
02:11:48.180 | That's not the issue I think I'm speaking about.
02:11:51.620 | It's just simply that I think the,
02:11:55.660 | this notion of wanting to be awake for longer
02:12:02.900 | to try and get more done can sometimes be at odds
02:12:06.820 | with the fact that you can actually get so much more done
02:12:11.260 | if you're well slept.
02:12:12.620 | And it's this trade off.
02:12:14.460 | - I actually admire people that take the big risk
02:12:17.660 | and work hard, whether that means staying up late at night,
02:12:20.260 | all those kinds of things,
02:12:21.900 | but it cannot be in the framework, in the context,
02:12:25.020 | like what Edison said,
02:12:26.140 | which is sleep feels like a waste of time.
02:12:28.740 | So like, if you're not sleeping
02:12:32.620 | because you think sleep is stupid, that's totally wrong.
02:12:36.540 | But if you're not sleeping
02:12:37.820 | because you're deeply passionate about something,
02:12:39.860 | that to me, it's a gray area, of course,
02:12:43.420 | but that to me is much more admirable.
02:12:45.260 | And everything you're espousing is saying like,
02:12:48.300 | whatever the hell you're doing,
02:12:49.660 | you better be aware that sleep,
02:12:51.700 | long-term and short-term is really good for you.
02:12:54.700 | So if you're not sleeping, you're sacrificing,
02:12:57.260 | just make sure you're sacrificing for the right thing.
02:13:00.100 | I see vodka and getting drunk the same way.
02:13:04.060 | I know it's not good for me.
02:13:05.540 | I know I'm not gonna feel good days after.
02:13:08.500 | I know it's gonna decrease my performance.
02:13:10.620 | There's nothing positive about it,
02:13:12.820 | except it introduces chaos in my life
02:13:17.540 | that introduces beautiful experiences
02:13:20.340 | that I would not otherwise have.
02:13:22.140 | It creates like this turmoil of social interaction
02:13:27.140 | that ultimately makes me happy
02:13:29.820 | that I've experienced them in the moment.
02:13:31.940 | And later the stories, you get to meet new people.
02:13:34.620 | So like alcohol in this society
02:13:36.620 | is an incredible facilitator of that.
02:13:40.380 | So like, that's a good example of like not sleeping
02:13:44.020 | and drinking way too much vodka.
02:13:46.420 | - Again, it's this notion of life is to be lived
02:13:49.580 | to a degree, but if you do have children,
02:13:54.460 | I think one of the other things
02:13:56.780 | that then maybe comes into the picture
02:13:58.660 | is the fact that now there are other people
02:14:03.380 | that you have to live for than yourself.
02:14:06.740 | - Yeah, but come on, like once they're old enough,
02:14:10.740 | like if you can't fend for yourself,
02:14:13.580 | you're too weak, get stronger.
02:14:15.620 | - It's gonna be that kind of fatherhood.
02:14:17.740 | I got it.
02:14:18.580 | I'm understanding so much more about like screaming
02:14:22.580 | than I did before.
02:14:23.780 | - That's why you have to have,
02:14:25.260 | for me that'd be, my wife would be probably softer.
02:14:29.500 | It's good cop, bad cop.
02:14:30.620 | 'Cause I think I'm...
02:14:32.380 | But of course, actually, 'cause I don't have kids,
02:14:35.540 | I've seen some tough dudes when they have kids
02:14:39.740 | become like the softies.
02:14:44.340 | They become like, they do everything for their kids.
02:14:47.260 | It becomes like, it totally transforms their life.
02:14:50.660 | I mean, Joe Rogan is an example of that.
02:14:53.180 | I've just seen so many tough guys completely become changed
02:14:57.380 | by having kids, which is fascinating to watch
02:15:00.420 | 'cause it just shows you how meaningful having kids is
02:15:03.500 | for a lot of people.
02:15:04.380 | - Although I would say, having
02:15:06.180 | chatted with Joe for some time,
02:15:11.340 | I think he is a delightful sweetheart,
02:15:13.740 | independent of children.
02:15:15.020 | I think, don't get me wrong,
02:15:17.220 | I don't wanna be in a ring with him.
02:15:19.740 | He would face me five ways till Tuesday,
02:15:22.260 | but I think he's a desperately sweet man
02:15:24.780 | and a very, very smart individual.
02:15:26.740 | - Yeah, I mean, but he talks about the compassion
02:15:29.460 | he's gained from realizing, just watching kids grow up,
02:15:32.980 | that we were all kids at some point.
02:15:35.100 | You get a new perspective.
02:15:37.140 | I think just like me, I still get this with him.
02:15:40.460 | He's super competitive
02:15:41.820 | and there's a certain way to approach life.
02:15:45.420 | Like you're striving to do great things
02:15:47.740 | and you're competitive against others
02:15:49.740 | and that intensity or that aggression,
02:15:52.260 | that can lack compassion sometimes and empathy.
02:15:56.660 | And when you have children, you get a sense like,
02:15:59.020 | oh, everybody was a child at some point.
02:16:00.940 | Everybody was a kid.
02:16:02.420 | And you see that whole development process.
02:16:04.780 | It can definitely enrich,
02:16:07.300 | expand your ability to be empathetic.
02:16:12.300 | Let me ask about diet.
02:16:15.020 | So what's the connection between diet and sleep?
02:16:19.580 | So I do intermittent fasting,
02:16:21.220 | sometimes only one meal a day, sometimes no meals a day.
02:16:24.740 | Is there a good science on the interaction
02:16:26.700 | between fasting and sleep?
02:16:29.860 | - We have some data.
02:16:31.500 | I would prefer more,
02:16:33.060 | but we have data both on time-restricted eating
02:16:36.980 | and then we have some data on fasting to a degree.
02:16:43.620 | On time-restricted eating,
02:16:47.860 | I think that it has some benefits,
02:16:51.340 | although the human replication studies
02:16:53.060 | have actually not borne out quite the same health benefit
02:16:56.660 | to the extent that the animal studies have.
02:16:59.060 | There've been some disappointing studies,
02:17:01.820 | one here close to where we are right now at UCSF recently.
02:17:05.820 | So I think time-restricted eating can be a good thing.
02:17:09.940 | And there are many benefits of time-restricted eating.
02:17:13.140 | Is sleep one of them?
02:17:14.260 | No, it doesn't seem to be
02:17:15.940 | because there are probably at the time
02:17:17.780 | that we're recording this,
02:17:18.740 | three pretty decent studies that I'm aware of.
02:17:21.580 | Two out of the three were in obese individuals.
02:17:25.740 | One out of the three were in healthy weight individuals.
02:17:29.340 | And what they found was that time-restricted eating
02:17:31.580 | in all three of those studies
02:17:33.500 | didn't have any advantageous benefit to sleep.
02:17:36.380 | It didn't necessarily harm sleep,
02:17:38.700 | but it didn't seem to improve it.
02:17:41.820 | When it comes to fasting though,
02:17:43.220 | which is a different state,
02:17:45.980 | we don't have too many studies,
02:17:47.500 | experimental studies with long-term fasting.
02:17:49.500 | The best data that we have
02:17:50.780 | is probably from religious practices.
02:17:53.020 | And probably the most data we have is during Ramadan
02:17:56.980 | where people will fast for 29 to 30 days
02:18:01.100 | from sunrise to sunset.
02:18:03.140 | And under those conditions,
02:18:07.220 | there are probably five distinct changes that we've seen.
02:18:12.220 | None of them seem to be particularly good for sleep.
02:18:16.260 | The first is that the amount of melatonin
02:18:18.420 | that you release, and melatonin is a hormone.
02:18:20.740 | It's often called the hormone of darkness
02:18:23.420 | or the vampire hormone,
02:18:25.300 | not because it makes you look longingly
02:18:27.100 | at people's necklines,
02:18:28.300 | but it's just because it comes out at night.
02:18:31.260 | Melatonin signals to your brain and your body
02:18:33.820 | that it's dark, it's nighttime, and it's time to sleep.
02:18:37.180 | Those individuals, when they were undergoing
02:18:39.300 | that regimen to fasting,
02:18:40.780 | the amount of melatonin that was released
02:18:44.700 | and when it was released,
02:18:46.020 | the amount of melatonin decreased
02:18:48.220 | and when it was released came later.
02:18:51.060 | That was the first thing.
02:18:52.260 | The second thing was that they ended up
02:18:55.460 | finding it harder to fall asleep
02:18:57.820 | as quickly as they normally would otherwise.
02:19:00.580 | The third thing was that the total amount of sleep
02:19:02.780 | that they were getting decreased.
02:19:05.260 | The fourth fascinating thing was that
02:19:07.460 | a wake-promoting chemical called orexin increased.
02:19:12.260 | And this is why a lot of people will say,
02:19:14.020 | when I'm fasting, it feels like I can stay awake for longer.
02:19:18.420 | And I'm more alert, I'm more active.
02:19:21.700 | And I'll come back to, from an evolutionary perspective,
02:19:24.020 | why we understand that to be the case.
02:19:26.660 | And then the fifth factor is that
02:19:28.460 | fasting didn't decrease the amount of deep sleep;
02:19:31.580 | that seemed to be unaffected.
02:19:33.300 | It did, however, decrease the amount of REM sleep
02:19:36.340 | or dream sleep.
02:19:37.460 | And we know that REM sleep dreaming is essential
02:19:39.500 | for emotional first aid, mental health,
02:19:42.460 | it's critical for memory, creativity.
02:19:45.340 | It's also critical for several hormone functions.
02:19:47.460 | You know,
02:19:49.700 | there are direct correlations with testosterone:
02:19:52.220 | testosterone release peaks
02:19:54.380 | just before you go into REM sleep and during REM sleep too.
02:19:58.100 | So REM sleep is critical.
02:20:00.260 | But, so those are the five changes that we've seen.
02:20:02.820 | None of them seem to be that advantageous for sleep.
02:20:06.140 | But the fourth point that I mentioned, which was orexin,
02:20:09.540 | which is this wake-promoting chemical,
02:20:12.340 | and a good demonstration or a very sad demonstration
02:20:15.300 | of its power is when it becomes very deficient in the brain
02:20:18.940 | and it leads to a condition called narcolepsy,
02:20:21.820 | where, you know, you're just unpredictable
02:20:25.540 | with your sleep. So,
02:20:28.080 | orexin, when it's in high concentrations,
02:20:33.140 | keeps you awake; when you lose it,
02:20:35.140 | it can, you know, put you very much
02:20:37.820 | into a state of narcolepsy where you're sleeping
02:20:40.100 | a lot of the time in unpredictable sleep.
02:20:43.460 | Why on earth, when you are fasting,
02:20:46.580 | would the brain release a wake-promoting chemical?
02:20:51.580 | And our answer is right now is the following.
02:20:54.620 | The, one of the few times that I mentioned before
02:20:56.940 | that we see animals undergoing insufficient sleep
02:21:00.580 | or prolonged sleep deprivation
02:21:03.660 | is under conditions of starvation.
02:21:05.660 | And that is an extreme evolutionary pressure.
02:21:11.020 | And at that point, the brain will forego some,
02:21:14.020 | it won't forego all, but it will forego some of its sleep.
02:21:18.460 | And the reason is so that it can stay awake for longer
02:21:21.240 | because the signal of starvation is saying to the brain,
02:21:24.460 | you can't find food in your normal foraging perimeter,
02:21:27.900 | you need to stay awake for longer
02:21:29.500 | so you can travel outside of your perimeter
02:21:31.700 | for a further distance,
02:21:33.980 | and maybe you will find food and save the organism.
02:21:37.300 | So in other words, when we fast,
02:21:39.660 | it's giving our brain this evolutionary signal
02:21:43.620 | that you are under conditions of starvation.
02:21:46.380 | So the brain responds by saying, oh my goodness,
02:21:49.020 | I need to release the chemical
02:21:50.300 | that helps the organism stay awake for longer,
02:21:52.900 | which is orexin, so that they can forage for more food.
02:21:57.420 | Now, of course, your brain, from an evolutionary perspective,
02:21:59.940 | doesn't know about this thing called Safeway
02:22:02.340 | that you could easily go to and break the fast.
02:22:05.780 | But that's how we understand fasting.
02:22:08.300 | And I think, my dear friend, Peter Attia
02:22:11.140 | has done a lot of work in this area too.
02:22:14.060 | I think fasting and David Sinclair's brilliant work,
02:22:17.340 | goodness me, what an individual too.
02:22:20.140 | The work is pretty clear there
02:22:21.900 | that time-restricted eating and fasting
02:22:25.100 | have wonderful health benefits.
02:22:28.540 | Fasting creates this thing called hormesis,
02:22:32.700 | just like exercise and low-level stress and sauna,
02:22:37.460 | heat shock. And hormesis is a biological process
02:22:41.580 | that, I think, as David Sinclair once said,
02:22:43.700 | in simple layman's terms is:
02:22:45.500 | what doesn't kill you makes you stronger.
02:22:47.540 | And I think there is certainly good data
02:22:53.340 | that fasting and time-restricted eating has many benefits.
02:22:56.260 | Is sleep one of them?
02:22:57.780 | It doesn't seem to be, it doesn't seem to enhance sleep.
02:23:00.700 | - But it's interesting to understand its effects on sleep.
02:23:04.780 | I've, like, I've fasted,
02:23:09.100 | it's a study of NF2,
02:23:12.460 | I've once fasted 72 hours and another time, 48 hours.
02:23:16.900 | And I found that I got much less sleep
02:23:20.380 | and was very restful though.
02:23:22.660 | I hesitate to say this, but this is how I felt,
02:23:24.780 | which is I needed less sleep.
02:23:27.020 | I wonder if my brain is deceiving me
02:23:28.900 | because it feels like I'm getting
02:23:31.100 | a whole extra amount of focus for free.
02:23:34.620 | And I wonder if there's long-term impacts of that.
02:23:39.740 | Because if I fast 24 hours,
02:23:41.780 | get the same amount of calories, one meal a day,
02:23:45.940 | there's a little bit of discomfort,
02:23:47.340 | like just maybe your body gets a little bit colder,
02:23:50.580 | maybe there's just, I mean, hunger,
02:23:54.620 | but the amount of focus is crazy.
02:23:58.780 | And so I wonder, it's like,
02:24:00.700 | I'm a little suspicious of that.
02:24:02.140 | I feel like I'm getting something for free.
02:24:04.340 | I'm the same way with sweetener, like Splenda or something.
02:24:07.820 | It's like, it's gotta be really bad for you, right?
02:24:10.380 | 'Cause why is it so tasty, right?
02:24:12.820 | - And I think, yeah, as we said before, with biology,
02:24:17.820 | you don't get, if there's a gain,
02:24:20.580 | there's, yeah, there's often a cost too.
02:24:23.460 | So, but we at least understand the biological basis
02:24:28.420 | of what you're describing.
02:24:29.260 | It's not that you actually need less sleep.
02:24:33.100 | It's that this chemical is present
02:24:36.300 | that forces you more awake.
02:24:39.220 | And so subjectively you feel as though,
02:24:42.020 | I don't need as much sleep because I'm wide awake.
02:24:45.220 | And those two things are quite different.
02:24:47.860 | It's not as though your sleep need has decreased.
02:24:51.780 | It's that your brain has hit the overdrive switch,
02:24:54.300 | the overboost switch to say, we need to keep you awake
02:24:57.540 | because food is in short supply.
02:24:59.660 | - So you mentioned during sleep, there's assimilation,
02:25:03.380 | all those kinds of things for learning purposes,
02:25:05.420 | but there's also these, you mentioned the five ways
02:25:08.180 | in which we become psychotic in dreams.
02:25:11.340 | What do you think dreams are about?
02:25:14.460 | Why do you think we dream?
02:25:17.780 | What place do we go to when we dream?
02:25:21.100 | And why are they useful?
02:25:22.380 | Not just the assimilation aspect,
02:25:25.820 | but just like all the crazy visuals that we get with dreams.
02:25:29.460 | Is there something you can speak to that's actually useful?
02:25:33.340 | Like why we have such fun experiences in that dream world?
02:25:38.340 | - So one of the camps in the sleep field
02:25:43.260 | is that dreams are meaningless,
02:25:46.420 | that they are an epiphenomenal by-product
02:25:49.580 | of this thing called REM sleep
02:25:51.420 | from which dreams come from as a physiological state.
02:25:55.420 | So the analogy would be, let's think of a light bulb,
02:26:00.340 | that the reason that you create the apparatus
02:26:02.900 | of a light bulb is to produce this thing called light
02:26:06.340 | in the same way that we've evolved
02:26:08.700 | to this thing called REM sleep
02:26:10.340 | to serve whatever functions REM sleep serves.
02:26:13.740 | But it turns out that when you create light in that way,
02:26:17.100 | you also produce something called heat.
02:26:19.860 | It was never the reason that you designed the light bulb,
02:26:22.820 | it's just what happens when you create light in that way.
02:26:26.180 | And the belief so too was that dreaming
02:26:29.180 | was essentially the heat of the light bulb.
02:26:32.140 | That REM sleep is critical,
02:26:34.860 | but when you have REM sleep with a complex brain like ours,
02:26:38.380 | you also produce this conscious epiphenomenon
02:26:42.180 | called dreaming.
02:26:43.660 | I don't believe that for a second.
02:26:46.280 | (laughs)
02:26:47.860 | And from a simple perspective, I suspect
02:26:50.740 | that dreaming is more metabolically costly
02:26:54.460 | as a conscious experience than not dreaming.
02:26:56.860 | So you could still have REM sleep,
02:26:58.640 | but absent the conscious experience of dreaming,
02:27:01.900 | which would probably be less metabolically costly.
02:27:05.580 | And whenever mother nature burns the energy unit called ATP,
02:27:10.420 | which is the most valuable thing,
02:27:12.340 | there's usually a reason for it.
02:27:15.260 | So if it's more energetically demanding,
02:27:20.380 | then I suspect that there is a function to it.
02:27:23.020 | And we've now since discovered that dreams have a function.
02:27:26.620 | The first, as we mentioned, creativity.
02:27:29.180 | The second is that dreams provide a form
02:27:32.340 | of overnight therapy.
02:27:35.140 | Dreaming is a form of emotional first aid.
02:27:38.860 | And it's during dream sleep at night
02:27:40.580 | that we take these difficult, painful experiences
02:27:43.900 | that we've had during the day, sometimes traumatic,
02:27:47.000 | and dream sleep acts almost like a nocturnal soothing balm.
02:27:51.360 | And it sort of just takes the sharp edges
02:27:53.480 | off those difficult, painful experiences
02:27:56.340 | so that you come back the next day
02:27:58.640 | and you feel better about them.
02:28:01.560 | And so I think in that sense, dreaming,
02:28:04.000 | it's not time that heals all wounds.
02:28:07.320 | It's time during dream sleep
02:28:09.380 | that provides emotional convalescence.
02:28:12.660 | So dreaming is almost a form of, you know,
02:28:15.720 | emotional windscreen wipers.
02:28:17.720 | (both laughing)
02:28:19.900 | And I think, and by the way, it's not just that you dream,
02:28:24.900 | it's what you dream about that also matters.
02:28:30.060 | So for example, scientists have done studies
02:28:32.100 | with learning and memory
02:28:33.060 | where they have people learn a virtual maze.
02:28:36.080 | And what they discovered was that those people
02:28:40.500 | who then dreamed, but dreamed of the maze,
02:28:45.220 | were the only ones who, when they woke up,
02:28:47.840 | ended up being better at navigating the maze.
02:28:50.760 | Whereas those people who dreamed,
02:28:53.400 | but didn't dream about the maze itself,
02:28:55.860 | they were no better at navigating the maze.
02:28:57.980 | So it's not just that you dream;
02:28:59.900 | it's sort of necessary, but not sufficient.
02:29:02.420 | It's necessary that you dream,
02:29:04.300 | but it's not sufficient to produce the benefit.
02:29:06.820 | You have to be dreaming about certain things itself.
02:29:09.900 | And the same is true for mental health.
02:29:13.340 | What we've discovered is that people who are going through
02:29:15.420 | a very difficult experience, a trauma, for example,
02:29:18.100 | a very painful divorce, those people who are dreaming,
02:29:23.060 | but dreaming of that difficult event itself,
02:29:26.360 | they go on to gain resolution
02:29:28.580 | to their clinical depression one year later.
02:29:31.140 | Whereas people who were dreaming just as much,
02:29:34.620 | but not dreaming about the trauma itself,
02:29:37.380 | did not go on to gain as much clinical resolution
02:29:40.740 | to their depression.
02:29:42.340 | So it's, I think to me, those are the lines of evidence
02:29:46.940 | that tell me dreaming is not epiphenomenal.
02:29:50.340 | And it's not just about the act of dreaming.
02:29:53.900 | It's about the content of the dreams,
02:29:57.220 | not just the fact of a dream itself.
02:30:00.500 | - It's, first of all, it's fascinating.
02:30:01.820 | It makes a lot of sense,
02:30:02.980 | but then immediately takes my mind to,
02:30:05.660 | from an engineering perspective,
02:30:07.180 | how that could be useful in, for example, AI systems.
02:30:10.480 | If you think about dreaming as an important part
02:30:16.860 | about learning and cognition
02:30:19.100 | and filtering previous memories of what's important,
02:30:23.580 | integrating them, maybe you can correct me,
02:30:27.100 | but I see dreaming as a kind of simulation of worlds
02:30:31.480 | that are not constrained by physics.
02:30:34.620 | So you get a chance to take some of your memories,
02:30:37.860 | some of your thoughts, some of your anxieties,
02:30:40.340 | and play with them,
02:30:42.380 | construct virtual worlds and see how it evolves,
02:30:46.380 | to play with those worlds
02:30:48.900 | in a safe environment of your mind,
02:30:51.100 | safe in quotes, 'cause you could probably get
02:30:53.020 | into a lot of trouble with the places your mind will go.
02:30:56.980 | But this definitely is applied in much cruder ways
02:31:04.340 | in artificial intelligence.
02:31:05.860 | So one context in which this is applied
02:31:08.740 | is the process called self-play,
02:31:12.780 | which is reinforcement learning,
02:31:14.420 | where an agent plays against itself or versions of itself.
02:31:19.420 | And it's all simulated, trying different versions
02:31:23.380 | of itself and playing against each other
02:31:25.300 | to see what ends up being good.
02:31:29.100 | The ultimate goal is to learn a function
02:31:32.620 | that represents what is good
02:31:33.900 | and what is not good in terms of how you should act
02:31:35.900 | in the world. - Right, you create a set
02:31:37.100 | of decision weights based on experience,
02:31:39.460 | and you constantly update those weights
02:31:41.500 | based on ongoing learning.
02:31:43.060 | - But the experience is artificially created
02:31:45.980 | versus actual real data. - That's right.
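As a rough sketch of the self-play idea described here, the following is a minimal toy example in Python, assuming a simple "race to 10" game and a tabular value function; the game, the epsilon-greedy move choice, and the update rule are illustrative choices, not taken from any specific system.

```python
import random

# Toy self-play sketch (hypothetical setup): an agent repeatedly plays a
# simple "race to 10" game against a copy of itself and updates a value
# table that estimates how good each position is for the player to move.

TARGET = 10          # first player to reach exactly 10 wins
MOVES = (1, 2, 3)    # each turn you may add 1, 2, or 3

value = {}           # state (current total) -> estimated value for the player to move
ALPHA = 0.1          # learning rate
EPSILON = 0.2        # exploration probability

def choose_move(total):
    """Pick a move: mostly greedy w.r.t. the value table, sometimes random."""
    legal = [m for m in MOVES if total + m <= TARGET]
    if random.random() < EPSILON:
        return random.choice(legal)
    # A resulting state is good for us if it is bad for the opponent who moves next.
    return min(legal, key=lambda m: value.get(total + m, 0.0))

def self_play_episode():
    """Play one game of the agent against itself; return visited states per player."""
    total, player = 0, 0
    history = {0: [], 1: []}
    while total < TARGET:
        history[player].append(total)
        total += choose_move(total)
        if total == TARGET:
            return history, player  # the current player just won
        player = 1 - player

def update(history, winner):
    """Push visited states toward +1 for the winner and -1 for the loser."""
    for player, states in history.items():
        outcome = 1.0 if player == winner else -1.0
        for s in states:
            v = value.get(s, 0.0)
            value[s] = v + ALPHA * (outcome - v)

for _ in range(5000):          # all experience is self-generated, no external data
    hist, winner = self_play_episode()
    update(hist, winner)

print(sorted(value.items()))   # inspect which positions the agent learned to prefer
```

The point of the sketch is only that all the "experience" comes from the agent playing itself rather than from real data, and that the value table plays the role of the decision weights being continually updated.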
02:31:48.620 | - So it's a crude approximation of what dreams are,
02:31:51.620 | which is you're hallucinating a lot of things
02:31:54.420 | to see which things are actually--
02:31:56.460 | - No, I think it's been a theory that's been put forward,
02:31:59.300 | which is that dreaming is a virtual reality test space
02:32:04.300 | that is largely consequence-free.
02:32:06.700 | What an incredible gift to give a conscious mind
02:32:11.140 | each and every night.
02:32:12.260 | - Now, the conscious mind, the human mind,
02:32:15.900 | is very good at constructing dreams
02:32:17.780 | that are nevertheless useful for you.
02:32:20.660 | Like, they're wild and crazy,
02:32:22.540 | but they're such that they are still grounded in reality
02:32:27.420 | to a degree where anything you learn in dreams
02:32:29.940 | might be useful in reality.
02:32:32.100 | This is a very difficult thing to do
02:32:34.380 | 'cause it requires a lot of intelligence,
02:32:36.540 | it requires consciousness.
02:32:38.000 | This has recently been used effectively
02:32:41.700 | in self-supervised learning for computer vision
02:32:45.980 | with the process of what's called data augmentation.
02:32:50.060 | That's a very crude version of dreams,
02:32:53.260 | which is you take data and you mess with it
02:32:56.700 | and you start to learn how a picture of a cat
02:33:01.220 | truly represents a cat by messing with it in different ways.
02:33:06.980 | Now, the crude methods currently are cropping,
02:33:09.060 | rotating, distorting, all that kind of stuff,
02:33:11.500 | but you can imagine much more complicated
02:33:14.100 | generative processes that start hallucinating
02:33:18.180 | different cats in order for you to understand deeply
02:33:21.460 | of what it means for something to look like a cat.
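As a loose illustration of the cropping, rotating, and distorting mentioned here, a minimal sketch assuming a NumPy image array; the crop size, noise scale, and number of views are arbitrary illustrative values, not from any particular self-supervised method.

```python
import numpy as np

# Minimal illustration of data augmentation: generate several distorted
# "views" of the same image so that a model can learn what stays invariant
# (the cat-ness) across crops, flips, rotations, and noise.

rng = np.random.default_rng(0)

def random_crop(img, size):
    """Cut out a random square patch of side `size` from an HxWxC image."""
    h, w = img.shape[:2]
    top = rng.integers(0, h - size + 1)
    left = rng.integers(0, w - size + 1)
    return img[top:top + size, left:left + size]

def random_flip(img):
    """Mirror the image left-right half of the time."""
    return img[:, ::-1] if rng.random() < 0.5 else img

def random_rotate(img):
    """Rotate by a random multiple of 90 degrees."""
    return np.rot90(img, k=int(rng.integers(0, 4)))

def add_noise(img, scale=0.05):
    """Add small Gaussian pixel noise, keeping values in [0, 1]."""
    return np.clip(img + rng.normal(0.0, scale, img.shape), 0.0, 1.0)

def augment(img, n_views=4, crop=24):
    """Produce several randomly distorted views of one image."""
    return [add_noise(random_rotate(random_flip(random_crop(img, crop))))
            for _ in range(n_views)]

# Fake 32x32 RGB "cat" image in [0, 1]; in practice this would be real data.
cat = rng.random((32, 32, 3))
views = augment(cat)
print([v.shape for v in views])  # four distorted views of the same underlying image
```

In an actual self-supervised setup, a model would then be trained so that different augmented views of the same image map to similar representations, which is one way of pushing it toward the invariant "cat-ness" rather than the raw pixels.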
02:33:25.020 | - What is the prototype, the archetype of a cat?
02:33:27.660 | - Yeah, the archetype.
02:33:28.940 | That's a very difficult process for computer vision
02:33:31.540 | to go from what are the pixels
02:33:36.060 | that are usually associated with a cat
02:33:38.500 | to what is a cat in the visual space?
02:33:42.860 | In the three-dimensional visual space
02:33:44.740 | that is projected on an image,
02:33:46.500 | on a two-dimensional image, what is a cat?
02:33:50.260 | Those are fundamentally philosophical questions
02:33:52.860 | that we humans don't know the answer to,
02:33:56.500 | like linguistically, but when we look at a picture
02:33:59.060 | of a cat and a dog, we can usually tell pretty damn well
02:34:03.180 | what's the difference.
02:34:04.300 | And I don't know what that is
02:34:05.900 | because you can't reduce that to pointy ears
02:34:08.380 | or non-pointy ears, furry or not furry,
02:34:11.500 | something about the eyes.
02:34:12.340 | - It's been a longstanding issue
02:34:14.860 | in cognitive neuroscience too,
02:34:16.660 | which is how does the brain create an archetype?
02:34:20.220 | How does it create schemas
02:34:22.340 | that have general applicability
02:34:24.980 | but yet still obtain specificity?
02:34:29.740 | That's a very difficult challenge.
02:34:31.700 | I mean, we can do it.
02:34:32.700 | We do it.
02:34:33.540 | It's rather bloody amazing.
02:34:35.660 | - It seems like part of the toolbox
02:34:37.740 | is this controlled hallucination, which is dreams.
02:34:41.140 | - Well, it's a relaxing of the rigid constraints.
02:34:45.860 | I often think of dreaming as,
02:34:48.780 | it's from an information processing standpoint,
02:34:52.740 | the prison guards are away
02:34:54.700 | and the prisoners are running amok in a delightful way.
02:34:59.060 | And part of the reason is because
02:35:00.260 | when you go into dream sleep,
02:35:01.940 | the rational part of your brain
02:35:03.660 | called the prefrontal cortex, which is the part,
02:35:05.940 | it's like the CEO of the brain.
02:35:07.700 | It's very good at making high level, rational,
02:35:09.580 | top-down decisions and controlled actions.
02:35:12.580 | That part of the brain is shut down during REM sleep.
02:35:17.340 | But then emotional centers, memory centers,
02:35:20.660 | visual centers, motoric centers,
02:35:24.220 | all of those centers actually become more active.
02:35:27.260 | In fact, some of them are more active
02:35:29.580 | than when we're awake in the dream state.
02:35:32.900 | - That's fascinating.
02:35:33.740 | - So your brain from a neural architecture perspective
02:35:37.340 | is radically different.
02:35:39.580 | Its network features are not the same as in wakefulness.
02:35:44.540 | And I think this is an immensely beneficial thing
02:35:48.140 | that we have at least two different rational
02:35:51.780 | and irrational conscious states
02:35:54.300 | that we do information processing in.
02:35:56.460 | The rational, the veridical,
02:35:58.220 | the page one of the Google search is wakefulness.
02:36:01.460 | The more irrational, illogical, hyper-associative,
02:36:06.020 | Google page 20 is the REM sleep.
02:36:08.700 | Both I think are critical, both are necessary.
02:36:11.980 | - That's fascinating.
02:36:12.860 | And again, fascinating to see how that could be integrated
02:36:15.660 | in the machines to help them learn better
02:36:18.700 | and to reason better.
02:36:21.660 | - And in some ways,
02:36:23.580 | we also know it from a chemical perspective too.
02:36:25.900 | When you go into dream sleep,
02:36:27.820 | it is a neurochemical cocktail like no other that we see
02:36:31.900 | in the rest of the 24-hour state.
02:36:34.900 | There is a chemical called noradrenaline
02:36:37.700 | or norepinephrine in the brain.
02:36:40.380 | And you know of its sister chemical in the body
02:36:42.620 | called adrenaline, but upstairs in the brain,
02:36:45.700 | noradrenaline is very good at creating
02:36:48.220 | a very hyper-focused, attentive, narrow,
02:36:52.260 | it's sort of very convergent way of thinking
02:36:57.260 | to a point, sharp, focus, that's the only thing.
02:37:01.220 | The spotlight of consciousness is very narrow.
02:37:04.140 | That's noradrenaline.
02:37:05.860 | When you remove noradrenaline,
02:37:08.820 | then you go from a high SNR, high signal to noise ratio,
02:37:13.820 | where it's just you and I in this moment.
02:37:16.900 | I don't even know what's going on elsewhere.
02:37:18.940 | I am with you, noradrenaline is present.
02:37:21.700 | But when you go into REM sleep,
02:37:24.940 | it is the only time during the 24-hour period
02:37:27.220 | where your brain is devoid of any noradrenaline.
02:37:30.060 | It is completely shut off.
02:37:31.740 | And so the signal to noise ratio is very different.
02:37:34.940 | It's almost as though we're injecting
02:37:36.660 | a greater amount of noise into the neural architecture
02:37:40.220 | of the brain during dream sleep,
02:37:42.460 | as if it's chemically brute-forced
02:37:45.780 | into this relaxed associative memory processing state.
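To make the signal-to-noise analogy concrete, here is a toy sketch, assuming a made-up associative memory of random vectors and treating the "noise" level as a softmax temperature on retrieval; this is only an illustration of the analogy, not a model of noradrenaline itself.

```python
import numpy as np

# Toy illustration of the signal-to-noise analogy: an associative memory
# queried at two "noise" levels. With low noise (focused, waking-like state)
# retrieval converges on the single closest memory; with high noise
# (REM-like state) retrieval spreads across loosely related items.

rng = np.random.default_rng(1)

memories = {name: rng.normal(size=8) for name in
            ["cat", "dog", "car", "ocean", "violin", "childhood home"]}

def recall(query, noise):
    """Softmax over similarity; `noise` acts like a temperature on retrieval."""
    names = list(memories)
    sims = np.array([memories[n] @ query for n in names])
    logits = sims / max(noise, 1e-6)          # low noise -> sharp, high -> diffuse
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()
    return dict(zip(names, np.round(weights, 3)))

query = memories["cat"] + rng.normal(scale=0.1, size=8)   # a cue resembling "cat"

print("focused (low noise):     ", recall(query, noise=0.1))
print("associative (high noise):", recall(query, noise=5.0))
```

With the low noise setting, the retrieval weights collapse onto the single nearest memory; with the high setting they spread across loosely related items, which is the hyper-associative flavor described above.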
02:37:50.780 | And then from an anatomical perspective,
02:37:53.180 | just as I described, the prefrontal cortex goes down
02:37:56.220 | and other regions light up.
02:37:58.620 | So it is a state that seems to be very,
02:38:01.700 | I mean, if you were to show me a brain scan of REM sleep
02:38:06.460 | and tell me that it's not REM sleep,
02:38:08.820 | just say, "Look, based on the pattern
02:38:10.660 | of this brain activity, what would you say
02:38:12.540 | is going on in this person's mind?"
02:38:14.460 | I would say, "Well, they're probably not rational.
02:38:16.500 | They're probably not having logical thought
02:38:18.140 | because their prefrontal cortex is down.
02:38:19.940 | They're probably feeling very emotional
02:38:21.460 | because their amygdala is active,
02:38:23.580 | which is an emotional center of the brain.
02:38:25.420 | They're definitely going to be thinking visually
02:38:27.340 | 'cause the back of the brain is lit up, the visual cortex.
02:38:30.620 | It's probably going to be filled with past experience
02:38:33.900 | and autobiographical memories
02:38:35.940 | because their memory centers are lighting up.
02:38:39.540 | And there's probably going to be movement
02:38:40.860 | because their motor cortex is very active."
02:38:43.060 | That to me sounds very much like a dream.
02:38:45.980 | And that's exactly what we see in brain scanners
02:38:48.060 | when we've put people inside of them.
02:38:50.420 | - One of the things I notice sleep affects
02:38:54.340 | is my ability to see the beauty in the world.
02:38:59.340 | - Yeah.
02:39:00.460 | - So what do you think is the connection
02:39:03.740 | between sleep and your emotional life?
02:39:06.020 | Your ability to love other human beings and love life?
02:39:10.620 | - Yeah.
02:39:15.460 | I think it's very powerful and strong.
02:39:19.780 | So we've done a lot of work in the field of sleep
02:39:22.860 | and emotion and sleep and moods.
02:39:25.020 | And you can separate your emotions into two main buckets,
02:39:29.780 | you know, positive and negative.
02:39:31.980 | And what's interesting is that when you are sleep deprived
02:39:35.380 | and the more hours that you go into being awake
02:39:38.980 | and the fewer hours that you've had to sleep,
02:39:41.220 | your negative mood starts to increase.
02:39:47.220 | And we know which individual types of emotions are changing.
02:39:52.220 | I've got a wonderful postdoc in my lab
02:39:55.060 | called Etty Ben-Simon, who's doing some incredible work
02:39:58.020 | on trying to understand the individual
02:40:01.140 | emotional tapestry of affective meltdown
02:40:06.140 | when you're not getting sufficient sleep.
02:40:12.060 | But let's just keep with two dimensions,
02:40:14.180 | positive and negative.
02:40:15.140 | - Yes.
02:40:15.980 | - The negative, most people would think,
02:40:17.620 | well, it's the negative that takes the biggest hit
02:40:19.660 | when I'm sleep deprived.
02:40:20.780 | It's not.
02:40:22.340 | By probably an order of magnitude larger
02:40:26.060 | is a hit on your positive emotions.
02:40:29.020 | In other words, you stop gaining pleasure
02:40:33.380 | from normally pleasurable things.
02:40:35.180 | And it's a state that we call anhedonia.
02:40:38.420 | And anhedonia is the state that we often call depression.
02:40:43.420 | So depression to most people's surprise
02:40:46.420 | isn't necessarily that you're always feeling
02:40:50.100 | negative emotions.
02:40:52.060 | It's often more about the fact that you lose
02:40:55.580 | the pleasure in the good things in life.
02:40:58.700 | That's what we call anhedonia.
02:41:00.620 | That's what we see in sleep and insufficient sleep.
02:41:03.740 | And it happens quite quickly.
02:41:05.540 | - Yeah, it's kind of fascinating.
02:41:07.180 | I think I do, it's not depression,
02:41:10.660 | but it's like a stroll in that direction,
02:41:15.180 | which is when I'm sleep deprived,
02:41:17.340 | I stop being able to see the meaning in life.
02:41:20.400 | The things that gave me meaning start to lose meaning.
02:41:24.980 | It sounds stupid, but it makes me realize how enjoyable
02:41:28.380 | everything is in my life.
02:41:29.940 | Because when I start to lose it,
02:41:31.420 | when I'm severely sleep deprived,
02:41:33.620 | you start to see how much life sucks when you lose it.
02:41:36.060 | But that said, I'm just cognizant enough
02:41:38.740 | that like sleep fixes all of that.
02:41:40.780 | So I use those states for what they're worth.
02:41:44.140 | In fact, I personally like to pay attention
02:41:52.780 | to the things that bother me during that time.
02:41:52.780 | 'Cause they also reveal important information to me.
02:41:56.800 | - That's an interesting method, to use it like a Rorschach.
02:42:01.500 | - Yeah, I mean, so I find this:
02:42:04.260 | when I fast combined with sleep deprivation,
02:42:06.900 | I'm clearer in seeing people,
02:42:10.700 | clearer in identifying the things
02:42:12.940 | that are not going right in my life,
02:42:15.740 | or the people that I'm working with
02:42:18.540 | who are not doing as good a job as they could be doing.
02:42:22.060 | Like people that are negative in my life,
02:42:25.940 | I'm more able to identify them.
02:42:28.180 | So I don't act on that.
02:42:30.020 | It's a very bad time to act on those decisions.
02:42:32.340 | But I'm like recording that information
02:42:36.860 | because usually when I'm well rested and happy,
02:42:40.060 | I see the beauty in everybody,
02:42:42.060 | which can get you into trouble.
02:42:44.020 | So you have to balance those two things.
02:42:46.020 | But yes, it's fascinating.
02:42:47.340 | - But there's irony there too,
02:42:48.740 | which is the fact that when you're well rested
02:42:51.380 | and well slept, as you said,
02:42:53.020 | you see the beauty in life and it sort of enlivens you
02:42:56.820 | and sort of gives you a quality of life
02:43:01.580 | that's emotionally very different.
02:43:03.760 | Yet then we are contrasting that against the need
02:43:09.900 | to not get enough sleep
02:43:15.580 | because of the beautiful things
02:43:17.620 | that you want to accomplish in life.
02:43:20.900 | And I don't actually see them as,
02:43:23.820 | sort of completely counterintuitive or paradoxical
02:43:30.340 | because I still think that you can strive
02:43:32.740 | for all of the brilliant things that you are striving for
02:43:35.580 | to have the monumental goals,
02:43:38.020 | the Herculean challenges that you wish to take on and solve.
02:43:42.160 | They can still enthrall you and excite you
02:43:46.140 | and stimulate you.
02:43:48.020 | But because of the insufficient sleep
02:43:52.060 | that that goal can produce,
02:43:55.420 | it will shave off the beauty of life
02:43:59.980 | that you experience in between.
02:44:01.300 | And again, this is just about the trade-off.
02:44:04.100 | I will say though that,
02:44:05.220 | and this is not applicable to your circumstance,
02:44:09.080 | we do know that insufficient sleep
02:44:14.380 | is very strongly linked to suicide ideation,
02:44:18.740 | suicide attempts and tragically suicide completion as well.
02:44:22.880 | And in fact, in 20 years of studying sleep,
02:44:28.260 | we have not been able to discover
02:44:29.820 | a single psychiatric condition in which sleep is normal.
02:44:32.900 | And I think that is a profound statement.
02:44:37.380 | I think it tells us so much about the role of sleep
02:44:41.860 | as a potential causal agent in psychiatric conditions.
02:44:45.100 | I also think it's a potential sign
02:44:47.620 | that we should be using sleep as a tool
02:44:50.020 | for the prevention of grave mental illness.
02:44:52.260 | - Yeah, it's both a cause and a solution.
02:44:54.900 | So, I mean, me personally,
02:44:57.060 | I've gone through a few dark periods quite recently
02:45:01.340 | and it was almost always that sleep is not the cause,
02:45:06.240 | but sleep is the catalyst for going from a bad time
02:45:10.820 | to a very bad time.
02:45:12.380 | - Yeah.
02:45:13.220 | - And so it's definitely true.
02:45:15.100 | And it's funny how sleep can just cure all of that.
02:45:18.580 | - There's actually a beautiful quote
02:45:20.100 | by an American entrepreneur called E. Joseph Cossman,
02:45:24.380 | who once said that the best bridge
02:45:26.100 | between despair and hope is a good night's sleep.
02:45:29.640 | And I spilled so much ink and hundreds of pages
02:45:35.820 | inelegantly trying to say the same thing in my book.
02:45:40.140 | And he said it in one line and it's beautiful.
02:45:43.860 | - What do you think is,
02:45:45.420 | we've been talking about how to extend this life,
02:45:47.500 | how to make it a good life.
02:45:48.960 | We've been talking about love.
02:45:53.460 | What do you think is the meaning of this whole ride?
02:45:55.860 | - Of life?
02:45:56.700 | - Of life.
02:45:57.520 | Why do we want to make it a good one?
02:46:00.820 | Do you think there's a meaning?
02:46:01.860 | Do you think there's an answer to the why?
02:46:03.860 | - For me personally,
02:46:09.220 | I think the meaning of life is to eat,
02:46:14.220 | is to sleep, is to fall in love,
02:46:17.940 | is to cry, and then to die.
02:46:23.740 | Oh, and probably race cars in between.
02:46:29.580 | - Race cars.
02:46:30.540 | Well, there's a whole topic of sex we didn't talk about,
02:46:33.380 | so that's probably in there somewhere.
02:46:34.900 | - Should we do that?
02:46:36.660 | - Maybe if you'll have me back, I would love to do it.
02:46:38.940 | I will go around.
02:46:39.780 | - Next time we will do another three hours on sex alone.
02:46:42.820 | - Yeah.
02:46:43.660 | - It has been over three hours, yes.
02:46:47.260 | - Okay.
02:46:48.100 | - Matt, I'm a big fan of your work.
02:46:50.340 | I think you're doing really important work,
02:46:51.980 | even despite all the things I've been saying
02:46:54.260 | about the madness of my own sleep schedule.
02:46:56.620 | I think you're helping millions of people.
02:46:58.700 | So it's an honor that you spend your valuable time with me.
02:47:02.280 | And I can't wait until your podcast comes out.
02:47:05.780 | I'm a huge fan of podcasts.
02:47:06.860 | I'm a huge fan of you.
02:47:08.260 | And it's just an honor to know you
02:47:10.820 | and to get a chance, hopefully in the future,
02:47:12.500 | to work together with you.
02:47:13.740 | - You're a brilliant man and you're doing amazing things.
02:47:17.340 | And I feel immensely honored to have met you,
02:47:22.220 | to now know you, and to start calling you a friend.
02:47:25.900 | Thank you for what you do for the world
02:47:27.660 | and for me included.
02:47:31.140 | - Thank you, Matt.
02:47:33.140 | - Take care.
02:47:34.660 | - Thanks for listening to this conversation
02:47:36.100 | with Matt Walker.
02:47:37.100 | And thank you to stamps.com, Squarespace,
02:47:40.740 | Athletic Greens, BetterHelp, and Onnit.
02:47:43.580 | Check them out in the description to support this podcast.
02:47:46.980 | And now let me leave you with some words from Nikola Tesla,
02:47:50.540 | whom we discussed in this podcast
02:47:52.300 | as someone who slept very few hours a night.
02:47:54.100 | "All that was great in the past was ridiculed,
02:47:57.540 | "condemned, combated, and suppressed,
02:48:00.100 | "only to emerge all the more powerfully,
02:48:02.700 | "all the more triumphantly from the struggle."
02:48:06.380 | Thank you for listening and hope to see you next time.
02:48:09.300 | (upbeat music)