
Dr. E.J. Chichilnisky: How the Brain Works, Curing Blindness & How to Navigate a Career Path


Chapters

0:00 Dr. E.J. Chichilnisky
2:31 Sponsors: Eight Sleep, ROKA & BetterHelp
6:06 Vision & Brain; Retina
11:23 Retina & Visual Processing
18:37 Vision in Humans & Other Animals, Color
23:01 Studying the Human Retina
29:48 Sponsor: AG1
31:16 Cell Types
36:00 Determining Cell Function in Retina
43:39 Retinal Cell Types & Stimuli
49:27 Retinal Prostheses, Implants
1:00:25 Artificial Retina, Augmenting Vision
1:06:05 Sponsor: InsideTracker
1:07:12 Neuroengineering, Neuroaugmentation & Specificity
1:17:01 Building a Smart Device, AI
1:20:02 Neural Prosthesis, Paralysis; Specificity
1:25:21 Neurodegeneration; Adult Neuroplasticity; Implant Specificity
1:34:00 Career Journey, Music & Dance, Neuroscience
1:42:55 Self-Understanding, Coffee; Self-Love, Meditation & Yoga
1:47:50 Body Signals & Decisions; Beauty
1:57:49 Zero-Cost Support, Spotify & Apple Reviews, Sponsors, YouTube Feedback, Momentous, Social Media, Neural Network Newsletter

Whisper Transcript

00:00:00.000 | Welcome to the Huberman Lab Podcast,
00:00:02.260 | where we discuss science
00:00:03.680 | and science-based tools for everyday life.
00:00:05.860 | I'm Andrew Huberman,
00:00:10.440 | and I'm a professor of neurobiology and ophthalmology
00:00:13.640 | at Stanford School of Medicine.
00:00:15.640 | My guest today is Dr. E.J. Chichilnisky.
00:00:18.680 | Dr. E.J. Chichilnisky is a professor of neurosurgery,
00:00:21.680 | ophthalmology, and neuroscience at Stanford University.
00:00:25.120 | He is one of the world's leading researchers
00:00:27.280 | trying to understand how we see the world around us,
00:00:30.160 | that is, how visual perception occurs,
00:00:32.360 | and then applying that information directly
00:00:35.120 | to the design of neural prostheses,
00:00:37.140 | literally robotic eyes that can allow blind people
00:00:40.640 | to see once again.
00:00:42.040 | Today's discussion is a very important one
00:00:44.120 | for anyone who wants to understand how their brain works.
00:00:47.500 | Indeed, E.J. spells out in very clear terms
00:00:50.940 | exactly how the world around us is encoded by the neurons,
00:00:55.120 | the nerve cells within our brain
00:00:56.880 | in order to create these elaborate visual images
00:00:59.760 | that we essentially see within our minds.
00:01:02.560 | And with that understanding,
00:01:04.120 | he explains how that can be applied
00:01:06.400 | to engineer specific robotic AI and machine learning devices
00:01:11.160 | that can allow human brains
00:01:12.640 | not only to see once again in the blind,
00:01:15.040 | but also to perceive things that typical human brains can't,
00:01:18.840 | and indeed for memory to be enhanced
00:01:21.300 | and for cognition to be enhanced.
00:01:23.120 | This is the direction that neuroscience is going.
00:01:25.760 | And in the course of today's discussion,
00:01:27.520 | we have the opportunity to learn
00:01:29.000 | from the world expert in these topics
00:01:31.880 | where the science is now and where it is headed.
00:01:35.000 | During today's discussion,
00:01:36.080 | we also get heavily into the topic
00:01:37.800 | of how to select one's professional and personal path.
00:01:41.800 | And indeed, you'll learn from Dr. Chichilnisky
00:01:44.320 | that he has a somewhat unusual path
00:01:46.600 | both into science and through science.
00:01:49.060 | So for those of you that believe
00:01:50.180 | that everyone that's highly accomplished in their career
00:01:52.720 | always knew exactly what they wanted to do at every stage,
00:01:55.960 | you will soon learn
00:01:56.800 | that that is absolutely not the case with EJ.
00:01:59.520 | He describes wandering
00:02:01.120 | through three different graduate programs,
00:02:03.320 | taking several years off from school in order to dance.
00:02:06.560 | Yes, you heard that correctly, to dance.
00:02:08.320 | And how that wandering and indeed dancing
00:02:10.800 | helped him decide exactly what he wanted to do
00:02:13.520 | with his professional life
00:02:15.040 | and exactly what specific problems to try and tackle
00:02:18.400 | in the realm of neuroscience and medicine.
00:02:20.560 | It's a discussion that I'm certain
00:02:21.800 | that everybody, scientist or no,
00:02:23.840 | young or old can benefit from
00:02:26.380 | and can apply the specific tools that EJ describes
00:02:29.440 | in their own life and pursuits.
00:02:31.400 | Before we begin,
00:02:32.240 | I'd like to emphasize that this podcast
00:02:34.200 | is separate from my teaching and research roles at Stanford.
00:02:36.900 | It is however, part of my desire and effort
00:02:38.840 | to bring zero cost to consumer information about science
00:02:41.480 | and science-related tools to the general public.
00:02:44.200 | In keeping with that theme,
00:02:45.200 | I'd like to thank the sponsors of today's podcast.
00:02:48.100 | Our first sponsor is Eight Sleep.
00:02:50.240 | Eight Sleep makes smart mattress covers
00:02:51.920 | with cooling, heating and sleep tracking capacity.
00:02:54.760 | Now I've spoken many times before on this and other podcasts
00:02:57.040 | about the fact that sleep is the foundation
00:02:58.820 | of mental health, physical health and performance.
00:03:01.120 | And one of the key aspects to getting a great night's sleep
00:03:03.360 | is to control the temperature of your sleeping environment.
00:03:05.840 | And that's because in order to fall and stay deeply asleep,
00:03:08.920 | your body temperature actually has to drop
00:03:10.700 | by about one to three degrees.
00:03:12.320 | And in order to wake up in the morning feeling refreshed,
00:03:14.280 | your body temperature actually has to increase
00:03:16.440 | by about one to three degrees.
00:03:18.040 | Eight Sleep makes it extremely easy
00:03:19.480 | to control the temperature of your sleeping environment
00:03:21.760 | at the beginning, middle and throughout the night
00:03:23.820 | and when you wake up in the morning.
00:03:25.340 | I've been sleeping on an Eight Sleep mattress cover
00:03:27.180 | for nearly three years now
00:03:28.680 | and it has dramatically improved my sleep.
00:03:31.060 | If you'd like to try Eight Sleep,
00:03:32.320 | you can go to eightsleep.com/huberman
00:03:35.420 | to save $150 off their Pod 3 Cover.
00:03:38.480 | Eight Sleep currently ships to the USA, Canada, UK,
00:03:41.400 | select countries in the EU and Australia.
00:03:43.760 | Again, that's eightsleep.com/huberman.
00:03:46.840 | Today's episode is also brought to us by Roka.
00:03:50.160 | Roka makes eyeglasses and sunglasses
00:03:52.300 | that are of the absolute highest quality.
00:03:54.880 | I've spent a lifetime working on the biology
00:03:56.800 | of the visual system and I can tell you
00:03:58.340 | that your visual system has to contend
00:04:00.200 | with an enormous number of challenges
00:04:01.920 | in order for you to be able to see clearly
00:04:03.820 | under different conditions.
00:04:05.040 | Roka understands this and designed all of their eyeglasses
00:04:07.840 | and sunglasses with the biology of the visual system in mind.
00:04:11.040 | Now, Roka eyeglasses and sunglasses were initially developed
00:04:13.640 | for use in sport and as a consequence,
00:04:15.720 | you can wear them without them slipping off your face
00:04:18.200 | while running or cycling and they're extremely lightweight.
00:04:21.000 | Roka eyeglasses and sunglasses are also designed
00:04:23.520 | with a new technology called Float Fit,
00:04:26.000 | which I really like because it makes their eyeglasses
00:04:28.000 | and sunglasses fit perfectly and they don't move around
00:04:30.640 | even when I'm active.
00:04:32.160 | So if I'm running and I'm wearing my glasses,
00:04:34.240 | they stay on my face.
00:04:35.080 | Most of the time, I don't even remember they're on my face
00:04:36.540 | because they're so lightweight.
00:04:38.040 | You can also use them while cycling or for other activities.
00:04:40.640 | So if you'd like to try Roka glasses,
00:04:42.520 | go to Roka, that's R-O-K-A.com and enter the code Huberman
00:04:46.720 | to save 20% off your first order.
00:04:48.880 | Again, that's R-O-K-A.com
00:04:51.240 | and enter the code Huberman at checkout.
00:04:53.640 | Today's episode is also brought to us by BetterHelp.
00:04:56.760 | BetterHelp offers professional therapy
00:04:58.560 | with a licensed therapist carried out online.
00:05:01.440 | I've been going to therapy for well over 30 years.
00:05:04.160 | Initially, I didn't have a choice.
00:05:05.620 | It was a condition of being allowed to stay in school,
00:05:08.080 | but pretty soon I realized that therapy
00:05:10.280 | is extremely valuable.
00:05:11.480 | In fact, I consider doing regular therapy
00:05:13.800 | just as important as getting regular exercise,
00:05:16.560 | including cardiovascular exercise and resistance training,
00:05:18.880 | which of course I also do every week.
00:05:20.940 | The reason I know therapy is so valuable
00:05:22.880 | is that if you can find a therapist
00:05:24.640 | with whom you can develop a really good rapport,
00:05:27.240 | you not only get terrific support
00:05:29.540 | for some of the challenges in your life,
00:05:31.480 | but you also can derive tremendous insights
00:05:34.080 | from that therapy.
00:05:35.220 | Insights that can allow you to better
00:05:37.000 | not just your emotional life and your relationship life,
00:05:39.680 | but of course also the relationship to yourself
00:05:41.960 | and to your professional life,
00:05:43.520 | to all sorts of career goals.
00:05:45.260 | In fact, I see therapy as one of the key components
00:05:47.520 | for meshing together all aspects of one's life
00:05:50.200 | and being able to really direct one's focus and attention
00:05:53.100 | toward what really matters.
00:05:54.820 | If you'd like to try BetterHelp,
00:05:56.040 | go to betterhelp.com/huberman
00:05:58.500 | to get 10% off your first month.
00:06:00.160 | Again, that's betterhelp.com/huberman.
00:06:03.240 | And now for my discussion with Dr. E.J. Chichilnisky.
00:06:06.960 | Dr. E.J. Chichilnisky, welcome.
00:06:09.920 | - Good to see you.
00:06:11.160 | - For the audience, we are friends.
00:06:12.520 | We go way back.
00:06:14.360 | EJ has been a few years or more ahead of me
00:06:18.880 | in the science game.
00:06:19.840 | And the best way to describe you and your work, EJ,
00:06:23.960 | is you're an astronaut.
00:06:26.240 | You go places no one else has been willing to go before.
00:06:30.160 | You develop new technologies in order to do that,
00:06:33.320 | all with the bold mission of trying to understand
00:06:36.200 | how the nervous system,
00:06:38.280 | which of course includes the brain,
00:06:40.020 | works and how to make it better with engineering.
00:06:45.020 | So today we are going to get into all of that.
00:06:47.440 | But just to start off and get everybody on the same page,
00:06:51.460 | maybe we could just take a moment
00:06:52.800 | and talk about the brain and nervous system
00:06:56.480 | and what it consists of that allows it to do
00:07:00.120 | all the sorts of things that we're going to get into,
00:07:01.880 | like see things in our environment
00:07:04.300 | and respond to those things in our environment.
00:07:06.300 | So at risk of throwing too much at you right out the gate,
00:07:10.340 | what's your one to five minute version
00:07:15.140 | of how the brain works?
00:07:17.060 | - Oh, I don't have a one to five minute version
00:07:21.340 | of how the brain works,
00:07:22.280 | but I can tell you how I think vision
00:07:26.660 | is initiated in the brain.
00:07:28.980 | And you and I go back a long way,
00:07:32.060 | so we have a lot of common understanding about this,
00:07:34.540 | but I'll narrate it from scratch if that makes sense.
00:07:37.720 | So vision is initiated in the retina of the eye,
00:07:43.460 | which is a sheet of neural tissue at the rear of the eye
00:07:46.800 | that captures the light that is incident on the eye
00:07:50.060 | that comes in through the eye,
00:07:52.660 | transforms that light into electrical signals,
00:07:55.780 | processes those electrical signals in interesting ways
00:07:59.380 | and changes them up and then sends that visual information
00:08:03.420 | to the brain where it is used
00:08:05.260 | to bring about our sense of vision.
00:08:07.240 | And you asked me about the one to five minute version
00:08:11.620 | of how the brain works, I don't know,
00:08:13.540 | but I do know that the brain receives
00:08:15.580 | all these patterns of electrical activity
00:08:17.900 | coming out of these nerve cells in the retina
00:08:20.540 | and somehow assembles that into our visual experience,
00:08:24.740 | whether that be responding to things coming at us
00:08:28.340 | or our circadian rhythms that govern our sleep and behavior
00:08:33.260 | or identifying objects for prey or avoiding predators
00:08:38.220 | or appreciating beauty.
00:08:39.980 | And what we know is that the brain receives
00:08:42.260 | a fantastically complex set of signals from the retina
00:08:46.260 | and puts that all together into our visual experience.
00:08:49.340 | And we are very visual creatures, obviously.
00:08:52.700 | So I think that's a big part of how the brain works
00:08:54.980 | because so much of what we do revolves around vision,
00:08:59.500 | revolves around how the brain puts together
00:09:01.300 | these signals coming out of the retina.
00:09:03.460 | And I would love to understand how that works.
00:09:05.940 | At the moment, I don't.
00:09:08.400 | And what we're trying to do
00:09:09.900 | is get a really complete understanding
00:09:11.700 | of how that begins in the retina
00:09:14.340 | and then how we can restore it in those who've lost sight.
00:09:17.920 | - Why focus on this issue in the retina,
00:09:22.540 | this thin set of layers of neurons
00:09:26.340 | that line the back of the eye?
00:09:27.780 | Why explore vision there?
00:09:29.740 | I mean, obviously there are centers within the brain
00:09:32.140 | that, of course, contain neurons, nerve cells
00:09:34.980 | that are involved in vision.
00:09:36.460 | If one wants to understand visual perception,
00:09:38.660 | and I agree, by the way, that visual perception
00:09:40.820 | is one of the most dominant forces
00:09:42.820 | in the quality and experience of our life,
00:09:46.320 | why focus on the retina?
00:09:49.020 | Why not focus on the visual cortex
00:09:51.260 | or the visual thalamus?
00:09:53.020 | What's so special about the retina?
00:09:55.780 | - Well, we have to focus on all of it
00:09:58.340 | because understanding the retina
00:09:59.780 | won't give us a full understanding
00:10:02.020 | of how all this works, obviously.
00:10:03.940 | And if you don't have your visual cortex
00:10:06.020 | and visual thalamus, you won't see.
00:10:08.260 | But if you don't have your retina, you also won't see.
00:10:11.480 | You won't even have a chance to see.
00:10:13.660 | So I focus on the retina
00:10:16.260 | because I enjoy the possibility
00:10:20.140 | that we can really understand
00:10:22.460 | a piece of the nervous system in my lifetime,
00:10:25.660 | in our lifetimes.
00:10:27.340 | We can understand it so well that we can build it,
00:10:30.900 | replace it, restore its function.
00:10:34.540 | That's farther off in the central regions of the brain.
00:10:38.300 | That's gonna be quite a bit harder.
00:10:40.060 | I find satisfaction in really understanding
00:10:45.220 | something so well that I can write down
00:10:47.500 | in a mathematical formula what it's doing,
00:10:50.580 | that I can test my hypotheses up and down.
00:10:52.940 | And yes, we really get how this little machine works
00:10:56.100 | and that I can engineer devices
00:10:57.780 | to replace the function of that circuit when it's lost.
00:11:00.620 | That to me is just deeply satisfying.
00:11:03.040 | But there also is a really fundamental role
00:11:06.460 | for people who wanna go and do more exploratory work
00:11:09.400 | in the visual brain, as you mentioned,
00:11:10.900 | in the visual cortex and the thalamus and other places.
00:11:14.060 | 'Cause ultimately those retinal signals
00:11:16.020 | won't lead to anything if those areas
00:11:17.740 | aren't putting it all together to govern our perception
00:11:20.980 | and ultimately our behavior.
00:11:23.440 | - So let's talk about the retina
00:11:24.660 | in its full beauty and detail.
00:11:26.500 | Three layers of cells that line the back of the eye
00:11:30.100 | like a pie crust.
00:11:31.100 | Somehow take light, it comes into the eye,
00:11:35.340 | lens focuses that light.
00:11:37.260 | If it doesn't do that well,
00:11:38.780 | we put lenses in front of our eyes,
00:11:40.800 | such as contact lenses or spectacles.
00:11:44.480 | And somehow takes that light and transforms it,
00:11:48.900 | as you said, into neural signals
00:11:50.600 | and processes that within the retina.
00:11:52.580 | So let's take a deep dive into the retina
00:11:55.500 | and do so with the understanding,
00:11:57.500 | at least my understanding is that,
00:11:59.400 | in part, thanks to your work and the work of others,
00:12:02.480 | this is perhaps the best understood piece of the brain.
00:12:06.000 | - Yes, I think it's a solid argument
00:12:09.720 | that it's the best understood piece of the brain.
00:12:11.700 | And we'll turn back to that in a minute.
00:12:13.720 | So the retina begins with a sheet of cells
00:12:18.420 | called the photoreceptor cells that are highly specialized.
00:12:21.140 | These are cells that essentially don't exist
00:12:23.100 | anywhere else in the brain.
00:12:24.380 | And what they do is transform light energy
00:12:27.300 | into electrical signals in neurons.
00:12:30.460 | Very specialized, very demanding cells.
00:12:34.180 | They require a lot of maintenance
00:12:35.660 | and they die relatively easily,
00:12:38.120 | which is what gives rise to some of the forms of blindness.
00:12:42.160 | Those are the, you might call them pixel detectors.
00:12:44.520 | They're tiny cells called photoreceptors
00:12:46.300 | that each one captures light
00:12:47.740 | from a particular location in the world.
00:12:49.740 | That sheet of cells has done
00:12:52.420 | that initial transduction process
00:12:54.380 | where light is converted into neural signals
00:12:56.620 | that the brain can then begin to work with.
00:12:59.040 | The second layer is responsible for processing,
00:13:04.180 | adjusting, changing, mixing and matching,
00:13:07.700 | comparing signals in different neurons,
00:13:09.780 | many complex operations
00:13:11.400 | that we're still trying to understand
00:13:13.560 | and consists of dozens of distinct cell types
00:13:16.460 | that extract features, if you will, of the visual world
00:13:20.320 | from the elementary pixels represented
00:13:23.320 | in the photoreceptor cells.
00:13:24.600 | So that second layer is receiving the input
00:13:26.500 | from that sheet of photoreceptors
00:13:28.640 | and picking stuff out of it.
00:13:30.180 | The third layer of cells
00:13:32.680 | is the so-called retinal ganglion cells.
00:13:34.320 | That's the only term that I'd like to,
00:13:37.040 | probably will come up repeatedly in this conversation.
00:13:39.720 | So for your viewers and listeners,
00:13:43.160 | these retinal ganglion cells
00:13:44.320 | are the ones who are responsible
00:13:45.720 | for taking the signals that are there in the retina
00:13:48.200 | and sending them to the brain
00:13:49.560 | so that the process of vision can begin.
00:13:51.640 | They are the messengers, if you will,
00:13:53.840 | from the retina to the brain.
00:13:56.200 | The retinal ganglion cells,
00:13:57.520 | and there are about 20 different types in humans,
00:14:00.760 | are again, feature extractors.
00:14:04.120 | They pick out different bits and pieces of the visual scene
00:14:06.960 | and send interesting stuff to the brain,
00:14:09.160 | trying to leave out the uninteresting stuff.
00:14:12.360 | And the 20 or so cell types
00:14:14.800 | all pick out different types of information
00:14:16.840 | from the visual scene.
00:14:18.280 | You can sort of think of them as Photoshop filters,
00:14:21.640 | each cell type in the retina.
00:14:23.200 | Again, about 20 different ganglion cell types.
00:14:26.480 | Each type represents the full scene,
00:14:29.080 | the entire visual world,
00:14:30.800 | but picks out different features.
00:14:32.360 | - Such as?
00:14:33.260 | - Some cells pick out spatial detail,
00:14:35.560 | tiny little points of light almost.
00:14:37.960 | Some cells pick out and signal information
00:14:40.520 | about things that are moving in the visual world.
00:14:43.440 | Some cells pick out information that's been captured
00:14:45.720 | about different wavelengths from the photoreceptor cells,
00:14:48.760 | and thereby giving us our sensations of color.
00:14:52.040 | And probably more things
00:14:54.240 | in those 20 different ganglion cell types
00:14:56.000 | that we don't fully understand.
00:14:57.560 | The result then is that the retina
00:15:00.600 | has this sort of a representation of the visual world,
00:15:04.040 | but it has 20 different representations, not one.
00:15:06.720 | It's not one picture that comes out of the retina
00:15:08.560 | and gets sent to the brain.
00:15:09.600 | No, no, no.
00:15:10.440 | It's 20 different pictures,
00:15:11.880 | and you can think of maybe
00:15:13.040 | as 20 different Photoshopped pictures,
00:15:15.180 | but one of them has the edges highlighted.
00:15:16.940 | One of them has the colors highlighted.
00:15:18.680 | One of them has movement encoded in it.
00:15:22.960 | And these, somehow, these filters
00:15:25.720 | send the information to many different targets in the brain,
00:15:28.160 | and then our brain puts it all together,
00:15:29.600 | and then we have a cohesive sense of the visual world,
00:15:32.360 | which is the remarkable feature
00:15:33.760 | that we really don't understand.
00:15:35.480 | - Amazing.
00:15:36.320 | Is it fair for those that don't work with Photoshop
00:15:41.520 | to think about these different Photoshop filters
00:15:44.160 | perhaps as like different movies of the visual world?
00:15:46.720 | One movie contains the outlines of objects
00:15:48.920 | and people and things.
00:15:50.160 | Another movie is showing the motion of blobs
00:15:54.320 | in the environment, meaning whatever's moving
00:15:56.120 | in the environment is kind of just represented as blobs.
00:15:58.080 | Another movie is just the color in the environment.
00:16:01.200 | Another movie, and then all of those,
00:16:03.800 | what I'm calling movies, are sent into the brain.
00:16:06.280 | And then the brain somehow combines those
00:16:09.680 | in ways that allow us to see each other
00:16:12.520 | and see cars and objects and recognize faces.
00:16:15.360 | Is that one way to think about it?
00:16:16.840 | - That's exactly how I think about it.
00:16:18.240 | Maybe it's a better way to say it.
00:16:19.360 | - No, I like the Photoshop filter analogy.
00:16:22.320 | I just, for those that don't work with Photoshop,
00:16:24.760 | I just think that the movie analogy
00:16:27.880 | might be a decent alternative.
00:16:30.240 | - How the retina works is an example, we think,
00:16:33.360 | of how all sensory systems work.
00:16:35.360 | There's an initial representation
00:16:37.240 | in a specialized cell type that is responsible for
00:16:41.000 | and capable of extracting physical features from the world.
00:16:45.000 | And then neural circuits in the brain
00:16:49.080 | use that information in different ways
00:16:51.840 | to grab stuff out of the visual world.
00:16:54.160 | In the auditory system, the sound world
00:16:57.840 | is represented also in specialized cells
00:17:00.000 | that capture sound energy and transduce that
00:17:02.720 | into neural signals.
00:17:04.560 | And then subsequent stages of processing
00:17:07.800 | in the auditory system pick out different features
00:17:10.800 | of our auditory world.
00:17:12.160 | - Like the frequency, how high or how low a tone is,
00:17:15.480 | the direction it's coming from.
00:17:17.160 | - Right, the movement of it, how loud it is,
00:17:20.680 | different features are extracted.
00:17:21.960 | So we think the visual system is just an example
00:17:25.720 | of how the external world is represented in our brain.
00:17:29.920 | And of course, in some sense,
00:17:32.600 | a philosophical approach to the brain
00:17:34.480 | is really saying, well, there's the sensory world,
00:17:37.080 | and then there's the actions we take,
00:17:40.240 | and there's almost nothing else that we really know
00:17:42.600 | other than those two things,
00:17:44.120 | how the sensory world comes in,
00:17:46.080 | and then finally it results in our action.
00:17:48.320 | That's what our brain is about.
00:17:50.320 | Because vision is so important for people,
00:17:52.600 | I find it absolutely compelling and fascinating.
00:17:55.840 | I mean, as an example, as you know well,
00:17:57.600 | many people study rodents to understand
00:18:00.400 | how different aspects of the brain work.
00:18:03.160 | And rodents are interesting animals
00:18:07.160 | and do all sorts of really cool things,
00:18:08.960 | but they interact with the world differently than we do.
00:18:11.920 | They, in a lot of ways, they sense by smelling,
00:18:15.200 | they identify objects by smelling,
00:18:17.840 | and they navigate with their whiskers to a large extent.
00:18:20.960 | We don't do any of that.
00:18:21.840 | You don't navigate with your whiskers,
00:18:23.160 | at least I don't think you do.
00:18:24.880 | You don't recognize me when I walk in the room
00:18:27.200 | by my smell, no, you use vision for all that.
00:18:30.440 | And we humans use vision,
00:18:32.160 | so it's a really fundamental aspect
00:18:33.760 | of who we are as biological creatures.
00:18:36.080 | - I wonder if, just for sake of entertainment,
00:18:42.000 | we could think about how the human retina,
00:18:45.000 | and therefore vision in our species,
00:18:47.960 | differs a little bit from some extreme examples
00:18:51.040 | of vision in other species,
00:18:53.280 | not to make this a comparative or zoological exploration,
00:18:58.120 | but just to really illustrate the fact
00:19:01.160 | that the specific cell types within our retinas
00:19:04.160 | create a visual representation of the outside world
00:19:08.400 | that can be and often is very different
00:19:11.480 | from that of other species.
00:19:13.040 | For instance, or at least my understanding,
00:19:16.880 | is that the mantis shrimp sees, I don't know,
00:19:20.720 | 60 to 100 different variations of each color
00:19:24.560 | that we are essentially blind to,
00:19:27.080 | because their photoreceptors
00:19:29.000 | can detect very subtle differences in red,
00:19:32.400 | for instance, long wavelengths of light,
00:19:34.360 | what most people refer to as red.
00:19:36.680 | Pit vipers can sense heat emissions,
00:19:39.640 | essentially with their eyes, but also other organs,
00:19:42.240 | and on and on.
00:19:44.240 | I raise this because I think the human neural retina
00:19:49.860 | is such an incredible example of extracting features
00:19:53.280 | from the visual world that then we recreate,
00:19:55.800 | but I think it's also worth reminding everyone
00:19:58.760 | and ourselves that it's not a complete representation
00:20:02.600 | of what's out there.
00:20:03.440 | Like there's a lot in light that we don't see
00:20:07.520 | because our neural retina
00:20:08.960 | just can't turn it into electrical signals, right?
00:20:11.520 | Do you want to give us some examples of what we can't see?
00:20:13.840 | And if any particular examples
00:20:15.920 | from the animal kingdom delight you,
00:20:17.200 | feel free to throw those out.
00:20:19.440 | - Well, one thing, you mentioned color.
00:20:22.560 | We experience a rich sensation of color
00:20:25.500 | when we look at the world and say,
00:20:26.340 | "Wow, I see all these colors."
00:20:27.940 | That's immediate.
00:20:29.480 | And that's just how we talk about it.
00:20:33.140 | But in fact, we have very little information about color.
00:20:37.300 | Color is a very high dimensional complex thing,
00:20:40.220 | or wavelength, I should say.
00:20:41.260 | Wavelength information really is about
00:20:44.140 | how much energy there is in the light around us
00:20:47.840 | at different wavelengths.
00:20:49.240 | We only have three sort of snapshots of that
00:20:53.580 | in our retinas with the three different types
00:20:56.380 | of photoreceptor cells that are sensitive
00:20:58.500 | to different wavelengths,
00:21:00.060 | different bands of the wavelength spectrum.
00:21:03.820 | Three is not a lot.
00:21:04.820 | As you just said, other creatures have many more ways
00:21:08.520 | of capturing wavelength information.
00:21:11.320 | And one way you can verify for yourself
00:21:13.100 | that we just have three is to realize
00:21:15.620 | that if you look at your TV,
00:21:17.300 | there are only three primaries on your TV.
00:21:19.140 | There's a red, there's a green, and there's a blue.
00:21:20.640 | That's it.
00:21:21.480 | And from those three primaries,
00:21:23.500 | the entire richness of the experience on your TV set
00:21:26.680 | is composed.
00:21:28.640 | So with just those three things,
00:21:30.000 | you basically are able to create
00:21:31.880 | any human visual sensation.
00:21:33.600 | Well, the mantis shrimp would be like,
00:21:36.100 | "That's nothing.
00:21:36.940 | "There's so much more stuff out there
00:21:38.440 | "that's not represented on this TV."
00:21:40.960 | If you could speak to the mantis shrimp.
00:21:43.200 | Another thing we maybe don't see,
00:21:45.240 | another example of a difference in the animal kingdom
00:21:47.580 | is, so again, taking rodents as an example,
00:21:51.200 | one of the things that rodents have to do
00:21:53.240 | is to not be hunted by birds
00:21:56.400 | that are coming down toward them.
00:21:58.420 | And so it appears that there are cells in the retina
00:22:02.400 | that seem to be quite sensitive to looming,
00:22:07.400 | to something dark that's getting bigger,
00:22:10.320 | like a shadow coming from a bird coming down at you.
00:22:13.920 | We don't know for sure
00:22:14.760 | that this is exactly what causes animals
00:22:17.340 | to avoid being eaten by birds,
00:22:19.340 | but there's interesting evidence in that direction.
00:22:22.820 | That's not really a big thing for us,
00:22:24.660 | for humans, as far as I know.
00:22:26.020 | We're not typically hunted by huge birds.
00:22:28.260 | So that's not a thing we need.
00:22:29.440 | And I think that's where it comes back to
00:22:32.180 | where you were headed, if I understand right,
00:22:34.380 | which is we occupy different biological niches,
00:22:37.940 | we and the mantis shrimp and the rodents,
00:22:40.320 | and our visual systems reflect that.
00:22:42.700 | We have different stuff that we're looking for
00:22:44.800 | in our visual environment than other creatures are.
00:22:48.120 | And so our eyes are different.
00:22:49.760 | And that's one of the reasons
00:22:50.680 | that we emphasize work on the human retina,
00:22:54.160 | as opposed to certain other animal species
00:22:56.620 | that would be less clearly relevant
00:22:58.680 | to the visual experience you and I have.
00:23:01.380 | - So let's talk about these incredible experiments
00:23:03.760 | that your laboratory has been doing for several decades now.
00:23:07.260 | I've had the privilege of sitting in
00:23:10.240 | on some of these experiments,
00:23:11.560 | and they are very involved, to say the least.
00:23:16.560 | If you could just walk us through one of these experiments,
00:23:19.580 | I think the audience would appreciate understanding
00:23:21.680 | what goes into, quote unquote,
00:23:24.480 | trying to understand what's going on
00:23:26.420 | in the electrical activity of these specific
00:23:28.260 | retinal cell types,
00:23:29.100 | the retinal ganglion cells in particular.
00:23:31.700 | What does this look like?
00:23:33.700 | You're in your laboratory at Stanford,
00:23:35.960 | and you get a phone call.
00:23:38.660 | Someone says, "I got a retina."
00:23:41.620 | What happens next?
00:23:43.220 | - We scramble like crazy.
00:23:45.460 | We drop everything we're doing,
00:23:46.540 | cancel all our appointments,
00:23:47.700 | and get ourselves ready for 48 hours of nonstop work
00:23:50.620 | down in the lab,
00:23:51.460 | getting as much data as we possibly can from the retina.
00:23:54.500 | The most exciting example of what you just said
00:23:56.940 | is when we get a human retina.
00:23:58.880 | When, for example, there's a donor who has died,
00:24:02.600 | and the retina is available for research,
00:24:04.540 | we jump at that opportunity.
00:24:06.300 | - How soon after the person is deceased
00:24:09.580 | do you need to get the eye globe, the eyeball,
00:24:12.640 | in order to get the retina in a condition
00:24:14.800 | that would allow you to record electrical signals from it?
00:24:17.360 | - A few minutes.
00:24:18.540 | Just a few minutes.
00:24:19.380 | - So you're waiting in the hospital?
00:24:21.520 | - The way we typically get these eyes
00:24:23.200 | is from brain-dead individuals,
00:24:25.380 | so people who are legally and medically dead,
00:24:27.920 | but their hearts are still pumping,
00:24:29.680 | and therefore their retinas are still alive and functioning.
00:24:33.440 | When the bodies of those individuals
00:24:38.080 | are used for organ donation,
00:24:42.000 | we can benefit from that organ donation setup
00:24:44.800 | that organ distribution centers do
00:24:47.920 | to save many people's lives,
00:24:50.000 | and also to promote research.
00:24:51.920 | So we sometimes get those retinas,
00:24:54.000 | and that begins the experiment for us.
00:24:56.380 | - I'm gonna ask for a few more details here
00:24:57.780 | just to put the picture in people's minds,
00:25:00.260 | and not to be gruesome,
00:25:01.280 | I just really want people to understand
00:25:02.760 | what's involved here.
00:25:03.600 | So you'll get a call.
00:25:04.660 | We've got a patient who is soon to be deceased.
00:25:08.040 | They've consented to giving their eye globes,
00:25:10.560 | their eyeballs, for research
00:25:12.200 | so that you can study the human retina.
00:25:14.160 | Is it you who goes over and takes the eyes out?
00:25:18.580 | Does somebody do that?
00:25:19.640 | Or hand them to you in a bucket of ice?
00:25:22.600 | I'm sorry if I'm making people queasy at all,
00:25:24.520 | but this is, folks, how one goes about
00:25:27.920 | trying to understand how the human brain works.
00:25:29.640 | - Absolutely, and this is also how you go about
00:25:31.900 | donating your heart so that you can save
00:25:34.100 | somebody else's life who needs a heart transplant.
00:25:36.240 | The same incredible organizations
00:25:37.840 | that do the harvesting of the tissue for us,
00:25:40.560 | their primary goal is to do that
00:25:42.160 | for organ donations to save lives.
00:25:43.760 | They save lives every day.
00:25:44.820 | These people are incredible.
00:25:46.040 | Donor Network West is one of those organizations,
00:25:48.320 | the one that we work with.
00:25:50.040 | They're really amazing.
00:25:52.000 | So their technicians or a retinal surgeon
00:25:54.860 | will take the eye out,
00:25:56.880 | give it to myself or somebody from my lab
00:25:59.320 | who will bring it back to the lab,
00:26:00.480 | and we have a way to keep the eye alive
00:26:02.800 | and functioning, just the eye by itself.
00:26:04.920 | - Is this always at Stanford
00:26:06.040 | or do you sometimes travel elsewhere?
00:26:07.320 | - Local hospitals, up to an hour away.
00:26:09.640 | - So then you drive them back.
00:26:10.920 | - We drive them back.
00:26:12.000 | It's the Retina Express.
00:26:13.800 | And when we're bringing back the Retina Express,
00:26:17.280 | it's, again, it's all hands on deck in our lab.
00:26:20.400 | We are scrambling, setting up all of our equipment,
00:26:22.440 | getting everything ready.
00:26:23.280 | You've been at these experiments.
00:26:24.320 | They're intense, and they really are 48-hour marathons
00:26:27.640 | of incredible activity by really dedicated individuals.
00:26:31.320 | So we might get those eyes sometimes,
00:26:34.760 | again, at two in the morning, that's common.
00:26:37.200 | And from that two in the morning time,
00:26:38.840 | it begins the experiment.
00:26:41.020 | So we bring the eyes back, we open them up,
00:26:44.480 | and we have access to the rear of the eye,
00:26:47.240 | which is where the retina is.
00:26:48.720 | It's a thin sheet of neural tissue
00:26:50.400 | at the back part of the eye.
00:26:51.840 | We hemisect the eye, cut it in half
00:26:55.960 | so that we can see the back.
00:26:57.200 | It's like half of a globe, if you will.
00:26:59.160 | And then we put in relaxing cuts and lay it out flat
00:27:02.040 | so we can see what we're working with.
00:27:03.920 | And we take little segments of retina out
00:27:06.360 | in the subsequent 48 hours, cut them out,
00:27:09.200 | maybe a three by three millimeter piece of the retina,
00:27:11.320 | a little chunk of retinal tissue,
00:27:13.080 | and bring it into an electrophysiology recording
00:27:15.800 | and stimulation apparatus that allows us
00:27:18.000 | to interact with it.
00:27:19.240 | And we do two types of experiments with that.
00:27:21.560 | So this electrophysiological recording
00:27:23.560 | and stimulation apparatus is very custom-built
00:27:26.960 | by our physics collaborators
00:27:28.400 | who have developed high-end equipment.
00:27:30.360 | It allows us to record and stimulate
00:27:32.320 | through 512 channels simultaneously at very high density.
00:27:36.520 | This is pretty high-end stuff in terms of technology
00:27:40.620 | for interrogating and manipulating
00:27:43.120 | the electrical signals in the retina.
00:27:44.720 | That's what we specialize in in my lab.
00:27:46.840 | - Could I just ask a question about this device?
00:27:49.360 | I've seen it before.
00:27:50.880 | It's very small.
00:27:51.720 | As you mentioned, you're recording
00:27:52.680 | from a few millimeters square of the retina.
00:27:57.080 | From this recently deceased patient,
00:27:59.340 | it looks a little bit like a bed of nails, right?
00:28:02.720 | Like tiny little micro wires
00:28:04.360 | all arranged very closely to one another.
00:28:08.220 | You've got the retina laying down on top of it.
00:28:09.840 | - That's right. - And that bed of nails
00:28:11.280 | can extract, meaning record the electrical signals
00:28:15.160 | that are coming out of the retinal ganglion cells.
00:28:17.720 | - That's right. - And the retina's still alive.
00:28:20.080 | So you are in a position to shine light on it
00:28:24.120 | and essentially make it behave in the same way it would
00:28:29.120 | if it were still lining the back of a healthy, alive person.
00:28:33.760 | - That is the beauty of these experiments.
00:28:35.480 | So because we can keep the retina alive and happy
00:28:38.120 | and because the retinal ganglion cells,
00:28:39.840 | the cells that are the ones
00:28:41.040 | that message the visual information to the brain
00:28:44.480 | are on the surface,
00:28:46.000 | we can put them right next to the electrodes
00:28:47.760 | and we can record their electrical activity.
00:28:49.600 | In other words, we record the signal
00:28:51.000 | that those cells would have sent to the brain
00:28:53.200 | if they were still in the living person.
00:28:56.000 | And at the same time, as you said,
00:28:57.240 | we can focus an image that we create on a computer display
00:29:00.720 | onto the retina.
00:29:01.600 | So we're treating the retina, if you will,
00:29:03.120 | as a little electronic circuit,
00:29:05.280 | which it almost is, honestly,
00:29:07.600 | delivering light to the photoreceptor cells
00:29:10.080 | so that they are electrically excited
00:29:13.520 | and then recording the electrical activity
00:29:15.960 | that the retina is sending out, if you will.
00:29:19.080 | That allows us to study how the retina works normally.
00:29:22.560 | What we also do with that same electrical apparatus
00:29:27.000 | is turn around and pass current through those electrodes
00:29:31.040 | in order to see if we can activate
00:29:33.200 | those ganglion cells directly with no light, just electrodes.
00:29:36.720 | Why do we do that?
00:29:37.720 | We do that because it allows us to design
00:29:40.920 | future methods of restoring vision
00:29:44.360 | by electrical stimulation of the retina,
00:29:46.360 | which we'll probably talk about in a few minutes.
00:29:49.280 | I'd like to take a brief moment
00:29:50.640 | and thank one of our sponsors, and that's AG-1.
00:29:53.600 | AG-1 is a vitamin mineral probiotic drink
00:29:56.240 | that also contains adaptogens.
00:29:58.520 | I started taking AG-1 way back in 2012.
00:30:01.640 | The reason I started taking it
00:30:02.920 | and the reason I still take it every day
00:30:04.680 | is that it ensures that I meet all of my quotas
00:30:07.080 | for vitamins and minerals.
00:30:08.840 | And it ensures that I get enough prebiotic and probiotic
00:30:11.720 | to support gut health.
00:30:13.240 | Now, gut health is something that over the last 10 years,
00:30:15.680 | we realized is not just important
00:30:17.680 | for the health of our gut,
00:30:19.360 | but also for our immune system
00:30:21.520 | and for the production of neurotransmitters
00:30:23.680 | and neuromodulators, things like dopamine and serotonin.
00:30:26.240 | In other words, gut health is critical
00:30:28.200 | for proper brain functioning.
00:30:30.080 | Now, of course, I strive to consume healthy whole foods
00:30:32.760 | for the majority of my nutritional intake every single day,
00:30:36.200 | but there are a number of things in AG-1,
00:30:38.160 | including specific micronutrients
00:30:39.840 | that are hard to get from whole foods
00:30:41.400 | or at least in sufficient quantities.
00:30:43.280 | So AG-1 allows me to get the vitamins and minerals
00:30:45.720 | that I need, probiotics, prebiotics, the adaptogens
00:30:48.480 | and critical micronutrients.
00:30:50.480 | So anytime somebody asks me
00:30:52.040 | if they were to take just one supplement,
00:30:54.200 | what that supplement should be, I tell them AG-1
00:30:57.320 | because AG-1 supports so many different systems
00:30:59.640 | within the body that are involved in mental health,
00:31:01.820 | physical health and performance.
00:31:03.440 | To try AG-1, go to drinkag1.com/huberman
00:31:07.340 | and you'll get a year supply of vitamin D3K2
00:31:10.480 | and five free travel packs of AG-1.
00:31:12.600 | Again, that's drinkag1.com/huberman.
00:31:16.720 | Let's take this moment to talk a little bit
00:31:18.720 | about cell types.
00:31:19.720 | So you mentioned there are about 20 different types
00:31:23.200 | of these retinal ganglion cells,
00:31:24.960 | what we may refer to in brief as RGCs.
00:31:28.920 | So retinal ganglion cells, RGCs, same thing.
00:31:31.600 | And as you mentioned, these cover the entire retina
00:31:35.680 | so that if each cell type is extracting a different set
00:31:39.760 | of features from the visual world, motion,
00:31:42.360 | color, specific colors, et cetera,
00:31:44.740 | that essentially no location in the world around us
00:31:50.720 | fails to be represented by these cells.
00:31:53.120 | Put differently, these cells are looking everywhere.
00:31:56.360 | - Each cell type is looking everywhere.
00:31:58.240 | - Each cell type is looking everywhere.
00:32:00.200 | So that if movement occurs in any region of our visual world,
00:32:04.620 | we are in a position to detect it.
00:32:06.560 | But maybe we could talk a little bit about cell types.
00:32:09.600 | Cell types is such an important theme
00:32:11.920 | in the field of neuroscience and indeed in all of biology,
00:32:15.040 | but it's actually not something we have talked about
00:32:17.240 | very much on this podcast before,
00:32:18.640 | either in solo episodes or in guest episodes.
00:32:21.640 | I don't have any specific reason for that.
00:32:24.800 | We've talked about brain areas, prefrontal cortex,
00:32:27.080 | basal ganglia, anterior midcingulate cortex,
00:32:29.680 | and on and on, we've talked about neural circuits,
00:32:31.880 | but we've never really talked about cell types.
00:32:34.360 | So the ganglion cells-
00:32:35.880 | - Brother, you let me down.
00:32:36.800 | No talking about cell types.
00:32:37.880 | - Well, but that's why you're here.
00:32:38.920 | - That's why I'm here.
00:32:39.760 | - That's why you're here.
00:32:40.720 | Tell us about cell types.
00:32:42.920 | How do you figure out if you have a cell type?
00:32:46.720 | How do you know if it's a cell type?
00:32:48.140 | Or is it the shape?
00:32:50.000 | Is it how it responds?
00:32:52.120 | How do you know if you have a cell type?
00:32:53.440 | What's this about?
00:32:54.920 | And I want to just put in the back of this question,
00:32:59.000 | or rather in the back of people's minds,
00:33:00.440 | that this issue of cell types
00:33:02.400 | is not just an issue pertinent to the retina.
00:33:04.720 | This is an issue that is critical
00:33:06.800 | to understanding how the brain works.
00:33:09.040 | - Absolutely.
00:33:09.880 | - It's critical to understanding consciousness.
00:33:10.940 | I know a lot of people are like, "What is consciousness?"
00:33:13.120 | We're not going there just yet.
00:33:14.560 | But what are cell types?
00:33:16.760 | How do you determine if you have a cell type?
00:33:18.400 | And why is this so important
00:33:19.560 | to understanding how the brain works?
00:33:21.960 | - Yeah, I mean, as you said, as far as we understand,
00:33:24.920 | every single brain circuit
00:33:26.160 | is full of very distinct cell types.
00:33:28.040 | And those cell types are distinguished
00:33:29.480 | by their genetic expression, their shapes and sizes,
00:33:34.160 | which other cells they do contact,
00:33:36.440 | and which cells they don't contact,
00:33:38.840 | where they send their information to
00:33:41.000 | in other parts of the brain, and what they represent.
00:33:44.120 | And as far as we know, this is true throughout the brain.
00:33:47.720 | And it's true in the retina,
00:33:49.120 | the different ganglion cell types,
00:33:50.560 | retinal ganglion cell types, about 20 of them,
00:33:53.400 | each of which is looking at the whole visual scene,
00:33:55.320 | extracts different stuff.
00:33:56.920 | This cell type one extracts one thing,
00:33:58.760 | cell type two extracts something else,
00:34:00.280 | but they all represent the entire visual scene.
00:34:03.440 | But those cell types we know from lots of beautiful work,
00:34:06.760 | work that you're closely connected to,
00:34:08.280 | and some of which you've done,
00:34:09.780 | those cell types have different morphology,
00:34:13.640 | different shapes and sizes,
00:34:14.960 | different patterns of gene expression,
00:34:17.000 | different targets in the brain.
00:34:18.500 | They send their outputs to different places in the brain.
00:34:21.900 | So really, to study the retina
00:34:26.200 | without understanding cell types,
00:34:27.840 | you're kind of lost right away.
00:34:29.240 | You have to know what's going on with the cell types,
00:34:30.840 | otherwise you can't make sense of this retinal signal.
00:34:33.440 | We identify them in two ways
00:34:35.360 | and for different purposes.
00:34:37.520 | The basic way we identify the different cell types
00:34:40.040 | is their function, because we study their function.
00:34:42.180 | We study how they respond to light images,
00:34:44.920 | and we can clearly separate them out.
00:34:46.960 | And in fact, it's a simple thing to say,
00:34:49.320 | but it's really true.
00:34:50.760 | Our 512 electrode array technology,
00:34:52.740 | which you've seen in our lab and stuff
00:34:54.320 | that we developed with collaborators about 20 years ago,
00:34:57.580 | was crucial for this.
00:34:59.680 | Because with that 512 electrode technology,
00:35:01.760 | we could see many cells of each type,
00:35:04.560 | and we could clearly parse them apart from one another.
00:35:07.440 | Whereas previous studies working on one cell at a time
00:35:10.200 | had great difficulty doing that.
00:35:12.420 | So with our technology, with 512 electrodes,
00:35:14.880 | we record hundreds of cells simultaneously.
00:35:16.680 | We can say, oh, there's 20 of these,
00:35:17.760 | there are 50 of those, there's 26 of those,
00:35:20.160 | and here they are, and we can just set them
00:35:21.960 | in different bins and say, okay,
00:35:23.620 | this is what's present in this retina,
00:35:26.880 | and what information they're extracting.
00:35:29.480 | There's another purpose, again,
00:35:32.560 | referring forward to the neuroengineering aspect.
00:35:34.880 | We need to identify the cell types,
00:35:36.320 | not just based on what visual information they carry,
00:35:39.360 | but based on their electrical features,
00:35:41.640 | the electrical properties of the cells.
00:35:44.000 | Cells, as you know, neurons are electrical cells.
00:35:47.480 | They fundamentally receive
00:35:49.440 | and transmit electrical information,
00:35:50.920 | and the way that they do that
00:35:52.360 | has a distinctive electrical signature.
00:35:54.100 | That turns out to be super important
00:35:56.360 | for developing devices to restore vision.
00:35:58.600 | - Could you explain how you determine
00:36:03.960 | what a given cell type does, its electrical properties?
00:36:07.520 | Let's just draw a mental image for people.
00:36:12.160 | The retina is taken out of this deceased individual,
00:36:14.040 | put down on this bed of nails, of electrodes.
00:36:16.640 | Those electrodes can detect electrical signals
00:36:19.040 | within the ganglion cells.
00:36:20.600 | You are able to shine light onto the retina
00:36:23.080 | and see how the retinal ganglion cells respond,
00:36:25.020 | meaning what electrical signals
00:36:26.840 | they would transmit to the brain
00:36:29.040 | if they were still connected to a brain.
00:36:31.760 | They're not connected to a brain in the experiment.
00:36:33.360 | They're sitting there- - But they're trying.
00:36:34.720 | - But they're trying.
00:36:35.760 | I could imagine playing those cells a movie of,
00:36:42.160 | I don't know, a checkerboard going
00:36:43.720 | where every square on the checkerboard
00:36:46.140 | goes from white to black to gray.
00:36:47.640 | Could do that.
00:36:48.520 | I could play a cartoon.
00:36:52.520 | I could show it this year's Academy Award winner
00:36:56.380 | for best picture.
00:36:57.940 | How do you decide what to show the retina?
00:37:00.220 | This is a human being's retina after all.
00:37:02.660 | Presumably it looked at things
00:37:05.460 | that are relevant to human beings until that person died.
00:37:08.200 | But how do you determine cell type electrical signals
00:37:10.980 | if you don't know what specific things to show it?
00:37:15.300 | I mean, you're going to show it, I don't know, Disney movies?
00:37:18.060 | Like, what do you show it?
00:37:21.420 | - So what we show now reflects the fact
00:37:25.120 | that we've built up a lot of information
00:37:26.640 | and our work stands on the shoulders of many scientists
00:37:29.400 | who have studied the retina for decades
00:37:31.000 | to figure out what different cell types respond to.
00:37:34.800 | And we know that certain cell types
00:37:38.880 | respond primarily to increments of light
00:37:41.320 | when light gets brighter than it was.
00:37:43.200 | So a change from a certain brightness
00:37:45.440 | to a higher brightness, this particular cell type fires.
00:37:49.000 | Another cell type fires or sends spikes to the brain
00:37:52.360 | when it gets darker.
00:37:53.720 | Some cell types respond primarily to large targets
00:37:59.360 | in the visual world.
00:38:00.840 | Other cell types respond better to small targets
00:38:03.760 | in the visual world.
00:38:04.760 | Some cell types respond to different wavelengths of light
00:38:08.520 | that we can identify.
00:38:10.480 | There exists certain cell types
00:38:12.000 | that are still poorly understood that respond to movement.
00:38:15.460 | So we can tailor visual stimuli
00:38:18.740 | to types that we kind of already know about
00:38:21.020 | because of much preceding research.
00:38:23.840 | That's not actually how we do it in our experiments
00:38:25.900 | for the most part.
00:38:26.860 | Instead, we use a very unbiased
00:38:28.680 | flickering checkerboard pattern, as it turns out,
00:38:31.020 | which is a really efficient, unbiased way
00:38:34.600 | to sample many cells simultaneously
00:38:36.860 | so that in a half hour of electrical recording from a retina,
00:38:40.500 | we can figure out what all the 512 or so cells are
00:38:44.540 | that we're recording and know all of their types.
00:38:47.020 | And the way we do that is to play
00:38:49.100 | essentially random garbage TV snow type image
00:38:52.300 | to the retina for a period of time
00:38:54.540 | and determine which bits of brightening or darkening
00:38:58.740 | or movement or whatever in that random garbage
00:39:01.260 | activated this particular cell
00:39:03.060 | by looking at average across the half hour recording
00:39:06.140 | and saying, "Oh, it looks like this cell was always firing
00:39:09.060 | "when it became bright in this region of the screen.
00:39:11.280 | "That must be an on cell sensitive to light
00:39:13.920 | "in this region of the screen," and so on.
00:39:16.660 | So we have sophisticated, efficient ways of doing it,
00:39:20.060 | but it all comes back to these basic things
00:39:22.140 | about what features in the visual world
00:39:25.180 | tend to cause a given cell to send a signal to the brain.
00:39:28.100 | - Yeah, that makes a lot of sense.
00:39:29.580 | So you take essentially what you called random garbage,
00:39:33.500 | snow, white, black, and gray pixels on a screen.
00:39:37.900 | The retina views that.
00:39:42.380 | And then the cells in the retina will respond
00:39:44.940 | every once in a while with an electrical potential.
00:39:47.800 | They'll fire, as we say.
00:39:50.180 | Spike, it's sometimes called.
00:39:52.620 | And then you take sort of a forensic approach a bit later.
00:39:57.060 | You look back in time and you say,
00:39:58.820 | what was the arrangement of pixels in this random garbage
00:40:03.940 | right before this cell fired an electrical potential?
00:40:08.020 | - That's right. - A spike.
00:40:09.300 | And then from that,
00:40:10.820 | you can reconstruct the preferred stimulus.
00:40:14.880 | You can say, oh, this cell and cells around it
00:40:20.500 | seem to like motion of things
00:40:23.020 | going in a particular direction, for instance.
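The "forensic" averaging described here is known as reverse correlation, or computing a spike-triggered average: average the random checkerboard frames that preceded each spike, and the cell's preferred stimulus emerges from the noise. A minimal sketch of the idea (the array shapes, variable names, and toy cell below are illustrative, not the lab's actual analysis code):

```python
import numpy as np

def spike_triggered_average(stimulus, spike_times, n_lags=10):
    """Estimate a cell's preferred stimulus by averaging the
    checkerboard frames that preceded each spike.

    stimulus    : (n_frames, height, width) random checkerboard,
                  contrasts coded as -1 (dark) / +1 (bright)
    spike_times : frame indices at which the cell fired
    n_lags      : how many frames before each spike to average
    """
    sta = np.zeros((n_lags,) + stimulus.shape[1:])
    count = 0
    for t in spike_times:
        if t >= n_lags:                    # need a full history window
            sta += stimulus[t - n_lags:t]  # frames leading up to the spike
            count += 1
    return sta / max(count, 1)

# Toy demo: a fake ON cell that spikes whenever pixel (2, 3) was bright
# on the previous frame.
rng = np.random.default_rng(0)
stim = rng.choice([-1.0, 1.0], size=(5000, 8, 8))
spikes = [t for t in range(1, 5000) if stim[t - 1, 2, 3] > 0]
sta = spike_triggered_average(stim, spikes, n_lags=2)

# The most recent lag lights up at (2, 3); unrelated pixels average
# toward zero because the checkerboard is random.
print(sta[-1, 2, 3])  # prints 1.0
```

The key property is the one described in the conversation: because the stimulus is unbiased "snow," anything the cell does not care about cancels out in the average, and only the triggering feature survives.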
00:40:25.740 | And how do you know that the cell doesn't also like
00:40:29.540 | a bunch of other stuff that you didn't pick up on
00:40:33.860 | using this random garbage?
00:40:37.100 | - Yeah, two things.
00:40:38.100 | Let me just say, for the record,
00:40:40.260 | we don't record from these cells
00:40:41.820 | that signal motion in particular directions.
00:40:43.880 | They are an elusive cell type that is best understood
00:40:47.060 | in rodents and other creatures
00:40:48.300 | and not well understood in the primate, as you know.
00:40:50.660 | Although some people are discovering potential cells
00:40:53.340 | of that type now and have recently discovered them.
00:40:56.380 | - Okay, so let's say cells that respond to--
00:40:58.420 | - Small, bright. - Like spots that are red,
00:41:01.380 | that go from dim red to bright red.
00:41:06.540 | - Right. - Yeah.
00:41:07.380 | So we can go through that colored TV snow
00:41:10.940 | and pick out the cells that responded to a transition
00:41:14.780 | of the kind you described, from darker to lighter,
00:41:17.700 | or from greener to redder, or something like that.
00:41:20.100 | Cells tend to respond to transitions in the visual scene
00:41:22.760 | rather than static imagery.
00:41:24.440 | And so we can pick that stuff out.
00:41:27.420 | But you asked the question, well, gee,
00:41:28.900 | is the TV snow gonna capture everything
00:41:30.940 | about what these cells are doing?
00:41:32.420 | That's a really important question
00:41:33.620 | that I wanna just mention more.
00:41:37.540 | Quite likely not.
00:41:39.580 | That's a scientific instrument.
00:41:41.140 | It's an unbiased way to sample a whole bunch of cells,
00:41:44.660 | first cut, look at, generally speaking, what are they up to?
00:41:49.180 | But that doesn't mean we've really captured their role
00:41:51.500 | in natural visual perception.
00:41:53.260 | 'Cause actually, you don't go through the world
00:41:55.340 | perceiving visual snow.
00:41:56.620 | You go through the world perceiving objects and meals
00:41:58.940 | and mates and targets and all these things, right?
00:42:01.980 | So the study of how the retina responds
00:42:05.360 | to more naturalistic visual stimuli in my lab
00:42:09.300 | and in many other labs around the world
00:42:11.660 | is really getting off the ground now.
00:42:14.020 | And I would say we have limited understanding.
00:42:16.980 | I would say we know that our simple laboratory experiments
00:42:23.300 | with the TV snow don't capture the whole story.
00:42:25.540 | There's more.
00:42:26.820 | There are about 20 different cell types in the retina.
00:42:28.740 | We have basic characterizations of seven of them,
00:42:32.420 | if you count a certain way.
00:42:35.020 | We know that there are another 15 or so lurking
00:42:38.540 | right behind the curtain that we've started to sample.
00:42:41.940 | We don't know what naturalistic targets they respond to
00:42:45.580 | in the visual life of the animal.
00:42:47.460 | That's work that's underway, exciting, interesting work.
00:42:50.380 | 'Cause we know that the retina,
00:42:53.100 | they gotta be there for something.
00:42:55.220 | One way to think of it,
00:42:56.300 | I'm pretty sure you think of it this way too,
00:42:58.100 | is that the retina is a highly evolved organ
00:43:01.420 | with a lot of evolutionary pressure for it to be efficient,
00:43:04.900 | to have a small optic nerve sending to the brain.
00:43:09.020 | It's probably the case that there's no accidental stuff
00:43:11.820 | sitting around in the retina that's vestigial
00:43:13.660 | and sending information to the brain.
00:43:14.980 | It's probably the case that those signals
00:43:17.300 | are all doing something important for our visual behavior,
00:43:20.460 | for our well-being, for our sleep, all sorts of stuff.
00:43:24.340 | And the field is still trying to figure that out.
00:43:27.380 | These are the big mysteries, I think,
00:43:30.220 | in terms of the retina.
00:43:31.060 | What are those signals exactly
00:43:32.580 | in all those different cell types?
00:43:33.820 | What different behaviors and aspects of our life
00:43:36.740 | do they control?
00:43:37.700 | - What is the wildest cell type you've ever encountered?
00:43:43.540 | (laughs)
00:43:45.500 | Like, what did it do?
00:43:46.380 | Like, what did it respond to?
00:43:47.820 | That's what I mean when I say wildest.
00:43:49.720 | You know, there are retinal ganglion cells
00:43:52.860 | that respond to, you know,
00:43:54.740 | increasingly red portions of the visual scene
00:43:57.820 | or decreasingly green portions of the visual scene.
00:44:00.100 | Like, okay, cool, that seems cool.
00:44:01.860 | Like, get some, you know,
00:44:03.700 | around the time of Christmas, that's useful.
00:44:05.820 | And it's useful on other days of the year as well.
00:44:08.740 | But, you know, given that the retina
00:44:12.620 | is indeed the best understood piece of the brain,
00:44:16.020 | and given that you have 20 cell types,
00:44:17.940 | 20 isn't 20 million, it's, you know, it seems tractable.
00:44:22.100 | You probably get to understanding it in its entirety
00:44:25.220 | or understanding them in their entirety, excuse me.
00:44:29.340 | - One would like to know, like, what stuff is,
00:44:32.300 | are we paying attention to at the level of the retina?
00:44:34.620 | I mean, are there like spiral cells
00:44:37.060 | that like spiral stuff in the environment?
00:44:38.980 | Are there cells that like emojis?
00:44:40.780 | Like, what's going on in there?
00:44:42.780 | You spend a lot of time doing this.
00:44:44.180 | - We do, we spend a lot of time doing this.
00:44:45.300 | - After all, you give up two nights sleep,
00:44:47.580 | which is kind of incredible, by the way.
00:44:48.820 | I'll just do a little, take a moment here
00:44:51.580 | and just say, you know, for a guy
00:44:53.140 | that's been doing this for this long
00:44:54.340 | with these sleepless nights, you look pretty good.
00:44:56.380 | You look pretty rested, you know?
00:44:58.260 | - I tend to go home.
00:44:59.660 | I go home before the graduate students do.
00:45:01.420 | - Oh, they stay up.
00:45:02.260 | - They stay up.
00:45:03.100 | - You used to stay up.
00:45:03.920 | - I used to stay up until my mid forties.
00:45:05.420 | I was in there doing the all-nighter type things.
00:45:07.980 | - Got it.
00:45:08.820 | - And maybe you can help me figure out
00:45:10.260 | my sleep patterns better so I can do it.
00:45:11.100 | - Yeah, yeah, we can talk about that this episode.
00:45:12.860 | We can talk about how to pull all-nighters
00:45:14.260 | and still survive, done plenty of those.
00:45:17.340 | But yeah, there's lots at stake here.
00:45:20.500 | There's a human retina, you know,
00:45:22.180 | meaning a human gave up their eyeballs
00:45:25.260 | for this experiment, after they died, of course.
00:45:27.820 | You've got many people on this.
00:45:28.980 | These sorts of experiments are very expensive,
00:45:33.980 | a lot of fancy equipment, a lot of salaries
00:45:37.340 | to try and figure this stuff out.
00:45:38.540 | This is the chief mission of the National Eye Institute.
00:45:40.460 | There's a lot of tax dollars, like this is, in my opinion,
00:45:44.140 | as important as the space program,
00:45:45.500 | probably more important, in my opinion,
00:45:48.020 | restoring vision to the blind, obviously.
00:45:49.940 | So what are you finding in there?
00:45:52.100 | - Yeah, and we have the privilege
00:45:53.860 | of being on the front lines of that,
00:45:55.140 | funded by the National Eye Institute
00:45:56.580 | and other institutions to be out there
00:45:58.460 | figuring out what's going on in this human retina.
00:46:00.820 | I'm with you on that.
00:46:02.260 | So I'll tell you how we go about it these days.
00:46:05.860 | There are about seven cell types
00:46:07.140 | that we understand pretty well what they're doing.
00:46:08.980 | They're not complicated.
00:46:10.900 | They just have different properties, color, size,
00:46:13.220 | this kind of thing, temporal properties,
00:46:14.820 | their timing of their signals.
00:46:16.740 | And those seven cell types, we understand pretty well,
00:46:18.860 | but we're trying to really nail down the details.
00:46:20.660 | Why, because of neuroengineering for vision restoration.
00:46:24.820 | Then there's another, I'm gonna say 15 or so.
00:46:27.820 | And we, and the anatomists,
00:46:29.540 | the people who study the shapes and sizes of the cells
00:46:33.020 | have long known that there were more cell types
00:46:35.500 | lurking in the retinal circuitry,
00:46:37.460 | but their function has not been known.
00:46:40.460 | And because we didn't have many recordings from them,
00:46:42.740 | we didn't have electrical recordings in response to light
00:46:45.620 | that would tell us what they naturally do.
00:46:48.340 | We've actually had a breakthrough in the last few years
00:46:51.020 | led by a senior researcher in my lab named Alexandra Kling,
00:46:55.580 | who has figured out that there's another 15 or so cell types
00:47:00.580 | lurking in those recordings,
00:47:01.980 | that if we look more carefully, they're there.
00:47:04.060 | And they have crazy properties.
00:47:06.300 | And so the crazy properties I can tell you about
00:47:09.300 | have to do with the spatial region of the visual world
00:47:13.060 | that they respond to.
00:47:14.580 | The well-known cell types that you know,
00:47:16.500 | and I know from the textbooks,
00:47:18.700 | kind of respond to a circular spot in the visual world.
00:47:21.700 | If there's light in this little circular area,
00:47:23.860 | they'll fire a spike.
00:47:24.820 | If there's no light there, they don't care.
00:47:26.860 | Well, okay, for some cells it's not quite circular.
00:47:31.260 | Some cells respond to the light that's there
00:47:33.460 | and the difference from the light that's around it.
00:47:36.460 | So if it's brighter than the light that's nearby,
00:47:38.340 | then you get a big response.
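[Editor's note: the center-surround behavior described here, a cell that compares light in a small central region against the light around it, is classically modeled as a difference of Gaussians. A minimal sketch of that model follows; the sigmas and grid size are illustrative, not measured retinal values.]

```python
import numpy as np

def dog_receptive_field(size=21, sigma_center=1.5, sigma_surround=4.0):
    """Difference-of-Gaussians model of a center-surround receptive field.
    Sigmas are illustrative, not measured retinal parameters."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    center = np.exp(-r2 / (2 * sigma_center**2))
    surround = np.exp(-r2 / (2 * sigma_surround**2))
    # Normalize each lobe so a uniform field produces ~zero response.
    return center / center.sum() - surround / surround.sum()

def response(rf, stimulus):
    """Linear response: how strongly the modeled cell reacts to a patch."""
    return float((rf * stimulus).sum())

rf = dog_receptive_field()

uniform = np.ones((21, 21))   # flat illumination: center equals surround
spot = np.zeros((21, 21))
spot[8:13, 8:13] = 1.0        # bright spot confined to the center

print(response(rf, uniform))  # near zero: no contrast with the surround
print(response(rf, spot))     # large positive: brighter than nearby light
```

The model captures exactly the comparison EJ describes: light that matches its surroundings cancels out, while light brighter than what is nearby drives a strong response.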
00:47:40.620 | The new cell types are more puzzling than that.
00:47:43.620 | Some of them respond to three or four blobs
00:47:46.460 | in the visual world.
00:47:47.620 | That's kind of strange, unexpected,
00:47:50.580 | definitely unexpected based on the textbooks.
00:47:53.340 | And the newest ones are weirder yet.
00:47:55.900 | Some of them, their visual response profiles,
00:47:58.900 | that is the region of the visual world
00:48:00.740 | that they are sensitive to light in,
00:48:03.140 | almost has a spidery shape,
00:48:05.980 | almost like the dendrites of a cell,
00:48:08.060 | like the processes of a cell.
00:48:10.500 | And some of them have blobby light sensitivity.
00:48:13.740 | They're sensitive to light increments here
00:48:16.620 | and decrements there and increments here
00:48:18.060 | and decrements there and some blue light over here
00:48:20.180 | and blue light over here.
00:48:21.740 | We don't understand these cells.
00:48:24.180 | To be clear, the seven that we understand reasonably well,
00:48:27.020 | the ones we're trying to pin down and really nail
00:48:29.180 | for vision restoration,
00:48:30.460 | the sort of first cut at cell-type-specific
00:48:33.700 | vision restoration.
00:48:35.660 | Those ones don't have these weird properties.
00:48:37.420 | They're a little simpler to understand.
00:48:39.780 | But we're just working out all the details
00:48:41.260 | of the timing of their responses and all that.
00:48:43.260 | These new ones, we don't know what's going on with them.
00:48:45.620 | So we're doing experiments.
00:48:47.100 | Those seven cell types constitute maybe 70%
00:48:50.620 | of all the neurons that send visual information
00:48:53.260 | from the eye to the brain.
00:48:54.460 | So we think it's a really solid target
00:48:56.700 | for vision restoration to work with the simple ones.
00:49:00.060 | And so when I say that I think that the retina
00:49:02.860 | is the best understood circuit in the nervous system,
00:49:04.820 | I'm talking about those seven cell types,
00:49:06.700 | which we know a lot about what they do.
00:49:08.220 | We really do know a lot.
00:49:09.300 | It's not done, but we know a lot.
00:49:11.340 | I'm not talking about the other 15 cell types,
00:49:13.860 | which are a minority of the population,
00:49:15.580 | but seem to be doing very strange and surprising things
00:49:18.300 | that are yet to be determined.
00:49:19.340 | So it's a mix.
00:49:21.100 | We know some really good stuff.
00:49:22.860 | And then there's really some deep mysteries out there
00:49:25.140 | about these other cell types.
00:49:27.300 | - So we've been talking a lot about how to understand
00:49:29.500 | the signals that the retina is sending the brain.
00:49:32.060 | And I know your lab has done incredible work in this arena
00:49:36.860 | and figured out a number of the different signals
00:49:38.940 | as you described some of the features
00:49:41.700 | that the different cell types are extracting
00:49:43.540 | just a moment ago, these blobs of different colors, et cetera.
00:49:46.900 | What good is this to the everyday person, right?
00:49:52.580 | In addition to wanting to understand how we see,
00:49:56.820 | what sorts of medical applications can this provide
00:50:00.060 | in terms of potentially restoring vision to the blind,
00:50:03.280 | but perhaps an even larger theme
00:50:08.340 | is this notion of neuroengineering, right?
00:50:11.620 | Taking this information and creating devices
00:50:14.780 | that can help our nervous system function better,
00:50:19.780 | maybe even function at super physiological levels.
00:50:22.460 | I know there's a lot of interest in this these days
00:50:25.380 | in part due to Neuralink, right?
00:50:27.200 | Because Elon's out there, front-facing, very vocal
00:50:29.860 | about his vision of the, no pun intended,
00:50:32.940 | of chips being implanted in people's brains
00:50:35.780 | that would allow them to be in conversation
00:50:37.900 | with a hundred people at the same time
00:50:39.940 | just by hearing those voices in the head,
00:50:41.980 | maybe filtering things out so it doesn't sound
00:50:43.660 | like a clamor of a hundred different voices,
00:50:45.860 | perhaps giving people super memory.
00:50:49.900 | I mean, you know, sky's the limit.
00:50:51.780 | No one really knows where this is all headed.
00:50:53.540 | You're working in what we call a very constrained system
00:50:57.320 | where it has specific properties
00:50:58.700 | that you're trying to understand.
00:51:00.140 | And once you understand those,
00:51:01.660 | you can start to think about real applications
00:51:05.020 | of like what's possible.
00:51:06.100 | Like, could you create a visual system
00:51:07.960 | that can extract more color features from the world
00:51:11.100 | that no other human can see?
00:51:13.200 | Can you restore pattern vision to somebody
00:51:16.060 | who is essentially blind and dependent on a cane or a dog,
00:51:18.620 | or, you know, God forbid can't even leave their house
00:51:21.040 | because they can't see anything at all.
00:51:22.880 | You know, where is this headed?
00:51:24.480 | What is the information useful for?
00:51:27.300 | And perhaps we should frame that first
00:51:28.760 | within the medical rehabilitative context
00:51:31.860 | of repairing or restoring vision rather,
00:51:34.700 | and then get into the more kind of
00:51:36.600 | sci-fi type neuroengineering stuff.
00:51:40.400 | - Absolutely.
00:51:41.240 | Yeah, this really is my passion these days,
00:51:44.200 | turning that corner.
00:51:45.920 | Continuing to figure out the mysteries in the retina,
00:51:48.560 | but also saying, wait a second,
00:51:49.880 | we actually know quite a lot about this.
00:51:51.880 | Shouldn't this be the first place
00:51:53.680 | that we can solve problems like restoring vision,
00:51:56.280 | restoring function, or augmenting our function?
00:51:58.560 | I think it should be.
00:52:00.780 | The concept of how to do this is straightforward
00:52:05.180 | and not invented by us in any way.
00:52:07.700 | And that is the following.
00:52:09.940 | One of the major sources of blindness in the Western world
00:52:14.940 | is loss of the photoreceptor cells that capture light.
00:52:20.220 | Macular degeneration and retinitis pigmentosa
00:52:22.940 | are two well-known ailments that give rise to vision loss.
00:52:28.340 | And the vision loss is because the cells
00:52:30.960 | that capture light in the first place
00:52:32.360 | that we talked about earlier die off.
00:52:35.000 | So you're no longer sensitive to light and you're blind.
00:52:38.540 | The concept is that you may be able to bypass
00:52:41.920 | those early sections of the retina
00:52:43.880 | that capture the light and process the signals,
00:52:45.800 | and instead build an electronic implant
00:52:49.680 | that connects up directly to the retinal ganglion cells.
00:52:53.040 | And this electronic implant would do the following.
00:52:55.920 | It would capture the light using a camera,
00:52:58.040 | which is relatively easy.
00:53:00.040 | It would process the visual information
00:53:01.760 | in a manner similar to the way that the retina normally does,
00:53:04.880 | as similar as possible.
00:53:06.500 | And then it would electrically activate
00:53:08.220 | the retinal ganglion cells by passing current
00:53:10.600 | and causing the ganglion cells to fire spikes
00:53:13.080 | and send those spikes to the brain.
00:53:14.580 | And if we do this really well,
00:53:16.680 | we can essentially replace those first two layers
00:53:19.120 | of the retina and piggyback onto the third layer
00:53:21.440 | and say, okay, we're just jumping right
00:53:22.680 | into that third layer.
00:53:23.880 | We're gonna force those ganglion cells
00:53:25.660 | to send reasonable visual signals to the brain,
00:53:27.940 | and then the brain is gonna think
00:53:29.720 | it got a natural visual signal and proceed accordingly.
00:53:32.960 | That's the concept.
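[Editor's note: the bypass concept just described, camera in, retina-like processing, patterned electrical stimulation out, is a three-stage pipeline. The toy sketch below illustrates the flow only; the stage names, the contrast-based encoding, and the pulse rule are placeholders, not the actual device's processing.]

```python
import numpy as np

def capture(scene):
    """Stage 1: a camera stands in for the lost photoreceptors."""
    return np.asarray(scene, dtype=float)

def encode(image):
    """Stage 2: process the image the way the retina's middle layers would.
    Toy stand-in: signal each pixel's deviation from the mean luminance."""
    return image - image.mean()

def stimulate(code, threshold=0.3):
    """Stage 3: convert the encoded signal into electrical pulses that drive
    ganglion cells to spike. Toy rule: pulse wherever contrast is high."""
    return (np.abs(code) > threshold).astype(int)

scene = [[0.1, 0.1, 0.9],
         [0.1, 0.1, 0.9],
         [0.1, 0.1, 0.9]]   # a bright edge on the right side of the scene

pulses = stimulate(encode(capture(scene)))
print(pulses)  # electrodes driven where the modeled retina responds
```

The hard part, as the rest of the conversation makes clear, is Stage 2 and Stage 3: making the encoding faithful to the real retinal circuitry, cell type by cell type.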
00:53:34.560 | This is not our idea.
00:53:35.600 | People have had this concept for decades,
00:53:37.840 | and some people have even started to make it work
00:53:41.080 | in human patients.
00:53:42.760 | And what I mean by that is implanting electrodes
00:53:45.240 | on the retina, stimulating and causing people
00:53:47.600 | who have been profoundly blind for decades
00:53:49.880 | to see visual sensations, blobs and streaks of light
00:53:53.560 | in their visual world that are reproducible.
00:53:56.360 | - So that's happening now.
00:53:57.520 | - That's happened.
00:53:58.360 | - People who were at once blind are able to see objects.
00:54:03.360 | - Are able to see crude blobs and flashes of light.
00:54:07.360 | - In ways that allow them to navigate their world better.
00:54:09.520 | - A little bit, a little bit.
00:54:10.840 | - Avoid a coffee table.
00:54:12.800 | - Maybe, or at least see a bright window in a dark wall
00:54:16.120 | and be able to point to that bright window
00:54:17.920 | or the bright doorway in a dark wall, something like that.
00:54:20.200 | So it's a glass half full, half empty story
00:54:22.280 | that I'd like to turn attention to in this conversation
00:54:26.280 | because I think it's very exciting.
00:54:28.880 | Yes, you can see stuff
00:54:32.080 | by artificially electrically stimulating the ganglion cells
00:54:35.440 | and you can see stuff that actually helps you interact
00:54:37.440 | with your world a tiny little bit.
00:54:39.600 | So great, that's the glass half full.
00:54:43.080 | The glass half empty is it's nothing remotely resembling
00:54:46.640 | what we understand as naturalistic vision,
00:54:49.800 | where we see fine spatial detail and color and objects
00:54:52.760 | and can navigate complex environments and all that stuff.
00:54:54.960 | But no chance, you can't do anything remotely like that.
00:54:57.200 | And you can see that there's a bright doorway over that way
00:55:01.080 | and turn toward it, which is a helpful step
00:55:04.240 | in your human experience, but there's a long way to go.
00:55:07.800 | So the question is, why does this existing technology
00:55:12.400 | fail to give us high quality vision?
00:55:15.440 | What's needed to give us high quality vision?
00:55:18.600 | And this is the piece I'm really passionate about.
00:55:22.600 | It turns out that the devices that have been implanted
00:55:25.440 | in humans so far by pioneering bioengineers
00:55:29.120 | who did really hard stuff were fairly simple devices
00:55:33.560 | that treated the retina as if it's a camera,
00:55:35.680 | that is just a grid of pixels.
00:55:38.240 | And they put a grid of electrodes down there
00:55:40.040 | and they stimulated according to the pixels
00:55:41.800 | in the visual world and thought,
00:55:42.840 | well, hopefully that will cause the retina
00:55:44.800 | to do the right thing and send a nice
00:55:48.040 | piece of visual information to the brain and initiate vision.
00:55:51.120 | Unfortunately, they left the science on the table.
00:55:55.160 | And this is actually what I'm dedicating
00:55:57.800 | the next phase of my career to.
00:55:59.360 | Bringing the science that we know, that we talked about
00:56:03.960 | to the table for vision engineering.
00:56:07.640 | And in particular, the fact that there are,
00:56:09.640 | there really are 20 or so distinct cell types.
00:56:12.560 | And they send different types of visual information
00:56:14.960 | to the different targets in the brain.
00:56:17.020 | I like to think of them a little bit as an orchestra
00:56:20.320 | playing a symphony.
00:56:21.720 | Each different cell type has its particular score.
00:56:24.400 | The violins do one thing, the oboes do something else.
00:56:27.760 | It's a very organized signal coming out of the retina,
00:56:31.240 | presenting to the brain this complex pattern
00:56:33.680 | of electrical activity that the brain assembles
00:56:35.280 | into our visual experience.
00:56:37.160 | Well, current retinal implants, unfortunately,
00:56:39.880 | are too crude to do anything like that.
00:56:42.200 | The conductor has just scattered the sheet music everywhere
00:56:46.400 | and people are playing whatever.
00:56:48.440 | It's cacophony, okay?
00:56:50.280 | You can maybe recognize a tune in there a little bit,
00:56:52.400 | sometimes, navigate toward a doorway,
00:56:55.440 | but you're not gonna get the full richness of the experience
00:56:58.040 | by ignoring the different cell types.
00:57:00.760 | And I'm so passionate about this in part for reasons
00:57:03.800 | that a little bit are similar to your reasons
00:57:06.000 | for doing this kind of work that you do,
00:57:07.800 | which I think is great.
00:57:08.960 | Which is, I feel we have a mission to give back
00:57:13.560 | as scientists, to take all this stuff
00:57:15.280 | we've been talking about because we find it
00:57:16.880 | so interesting and cool, and to deliver something
00:57:20.440 | to the society that has allowed us
00:57:22.840 | to explore these fascinating areas.
00:57:25.880 | And in our case, in the case of my lab and what we've done,
00:57:30.400 | it's to turn around and say, "Wait a second.
00:57:31.760 | "We understand that there's these different cell types.
00:57:33.620 | "We understand a lot about what they do.
00:57:35.800 | "None of this information appears
00:57:37.340 | "in current epiretinal implants.
00:57:40.340 | "Can't we do better by using the science
00:57:43.320 | "to restore vision in a way that respects
00:57:46.100 | "the circuitry of the retina?"
00:57:48.320 | That's what we're trying to do.
00:57:49.920 | And the mismatch is intense.
00:57:53.360 | I told you when we were chatting before
00:57:57.160 | that nothing that we have learned about the retina
00:58:01.920 | since the founding of the National Eye Institute in 1968
00:58:05.320 | is incorporated into the existing retinal implants.
00:58:10.840 | Nothing.
00:58:12.200 | We've learned a ton about the retina.
00:58:13.880 | Your research was funded by the National Eye Institute.
00:58:16.040 | My research is funded by the National Eye Institute,
00:58:17.740 | a fantastic organization that allows us
00:58:20.080 | to learn all these things.
00:58:22.100 | How is this showing up in the neuroengineering
00:58:24.080 | to restore vision to people?
00:58:25.160 | Well, currently it's not.
00:58:26.800 | And so we're trying to do that.
00:58:28.360 | Now, doing that turns out to be hard,
00:58:30.960 | and maybe we'll talk about that.
00:58:32.560 | It's a technological feat that's really challenging.
00:58:35.520 | You have to build a device that you can implant in a human
00:58:39.240 | that can recognize the distinct cell types,
00:58:42.200 | see where they are,
00:58:43.960 | stimulate them separately from one another,
00:58:46.680 | and conduct this orchestra to create a musical score
00:58:51.680 | that reasonably closely resembles the natural one.
00:58:55.760 | That's what we're all about doing.
00:58:57.400 | And as it turns out,
00:58:58.360 | and maybe we'll talk about that separately,
00:59:00.400 | that mission of being able to restore
00:59:03.460 | the patterns of activity that the retina normally creates
00:59:06.640 | also has extremely exciting spinoffs in three directions.
00:59:12.740 | One of them is understanding better
00:59:14.160 | how the brain puts the signals together.
00:59:16.600 | That's research for the brain.
00:59:19.300 | The second one is augmenting vision,
00:59:21.520 | creating novel types of visual sensations
00:59:23.760 | that weren't possible before,
00:59:25.000 | and maybe doing something along the Elon Musk lines
00:59:27.440 | of delivering more visual information
00:59:29.720 | than was ever possible.
00:59:30.880 | And third, figuring out how to interface
00:59:33.840 | to the brain more broadly.
00:59:35.420 | Because as you and I know,
00:59:37.460 | the structure of the retina is very much representative
00:59:40.680 | of the structure of many brain areas.
00:59:42.560 | And if we're gonna figure out how to interface
00:59:44.300 | to the visual cortex,
00:59:45.320 | we darn well better figure out how to interface
00:59:47.520 | to the retina first.
00:59:49.680 | That's what we're all about doing in my lab these days,
00:59:52.000 | is figuring out how to do that well.
00:59:54.000 | That's a mixed science and engineering effort.
00:59:57.000 | We've done about 15 years of basic science on that.
01:00:00.020 | How do we stimulate cells?
01:00:01.480 | How do we recognize cells?
01:00:02.740 | How are we gonna build a device that does all this,
01:00:05.180 | and talks to the cells in this way?
01:00:06.820 | And I can go into lots of gory detail about it.
01:00:09.760 | But that's what we've been doing the basic research on.
01:00:11.480 | In the last few years, we've worked at Stanford
01:00:13.520 | with fantastic engineers from various disciplines,
01:00:16.800 | electrical engineers, material scientists, others,
01:00:19.920 | to figure out how to put together the pieces
01:00:21.920 | and build an implant that can do this in a living human.
01:00:25.200 | - So is the idea to build a robotic retina,
01:00:30.040 | to build essentially an artificial retina
01:00:32.760 | that could be put into the eye of a blind person,
01:00:34.960 | or even put into the eye of a sighted person
01:00:37.600 | that would fundamentally change their ability to see,
01:00:40.500 | or in the latter case, enhance their ability
01:00:44.040 | to extract things from the visual world
01:00:45.680 | that they would otherwise not be able to see?
01:00:48.440 | Like seeing twice as far,
01:00:51.440 | getting hawk-like resolution of the visual world.
01:00:55.720 | - Yep. - Which that would be cool.
01:00:57.840 | - Yep. - Could be distracting.
01:00:59.720 | - Yep. - Like, I'm not sure
01:01:00.560 | I wanna see the fine movements of a piece of paper
01:01:03.800 | in somebody's notebook from across a cafe
01:01:05.960 | as they flutter the pages.
01:01:08.080 | - But you might want to for a moment.
01:01:10.240 | There might be a moment when you want to.
01:01:11.880 | And if you have an electronic device that you can control,
01:01:14.680 | that you can dial in to sense different aspects
01:01:17.240 | of the visual world by your choice,
01:01:21.320 | you might be like, yeah, that's pretty cool,
01:01:22.560 | I wanna be able to do that right now.
01:01:24.120 | There's an example I like to give,
01:01:25.440 | which I think maybe is helpful
01:01:27.000 | for interpreting what we're talking about
01:01:28.440 | when we say being able to do more things
01:01:30.840 | with the optic nerve.
01:01:32.000 | You gave the example of many voices and stuff.
01:01:34.160 | Here's an example that I like.
01:01:36.080 | We know that we can drive down the road
01:01:39.400 | and have a phone call hands-free
01:01:42.480 | and do that quite safely, pretty safely, right?
01:01:46.240 | And you're, why?
01:01:47.440 | Because you're tapping in,
01:01:48.840 | you've got two types of signals coming into your brain.
01:01:51.100 | Your visual signal of the cars on the freeway,
01:01:53.040 | any one of which could kill you in an instant.
01:01:55.760 | So it's important.
01:01:56.680 | And the sound coming into your ears,
01:01:59.840 | which carries the voice of your girlfriend
01:02:01.960 | that's telling, who's telling you something
01:02:03.440 | that you're interested in hearing.
01:02:05.320 | And these are different parts of the brain
01:02:06.880 | processing this information.
01:02:08.120 | And so you can do both of these things at the same time
01:02:10.000 | 'cause there's no interference.
01:02:11.200 | One part of the brain's working, doing one thing,
01:02:12.920 | another part's doing something else, you're good.
01:02:15.220 | What you can't do is to read your texts
01:02:18.360 | and drive down the freeway at the same time.
01:02:20.280 | That's not good because now that visual system of yours
01:02:23.160 | that needs to be detecting
01:02:24.120 | these fast moving dangerous objects
01:02:25.920 | is being distracted by looking at the text
01:02:28.280 | and you might die and some other people might die with you.
01:02:30.960 | - I see a lot of people texting and driving.
01:02:33.880 | - Yeah, that's why I like to point out this example
01:02:36.160 | to remind people you can't do it well.
01:02:38.080 | It's like- - You can't do it well.
01:02:39.540 | - You probably talk them- - Yeah, it used to be,
01:02:42.440 | we'll just take a brief tangent here into this topic.
01:02:45.580 | A few years back, there were a lot of news articles,
01:02:49.040 | a lot of discussion about texting and driving,
01:02:51.360 | a lot of attention to getting people
01:02:52.680 | to stop texting and driving.
01:02:54.120 | I've seen a few people pulled over
01:02:56.800 | for texting and driving before,
01:02:58.580 | but I would say texting and driving is rampant.
01:03:02.680 | Reading what's on one's phone while driving is rampant.
01:03:07.080 | All you have to do is be on the freeway here in Los Angeles,
01:03:09.120 | look in the car next to you.
01:03:10.280 | - Yeah, look where they're looking.
01:03:11.520 | - And people are reading and texting while driving.
01:03:14.800 | Presumably when they're doing that,
01:03:16.060 | they're just using their peripheral vision
01:03:17.540 | to detect any kind of motion.
01:03:19.400 | And no doubt this has caused the deaths of many people.
01:03:23.080 | - Yeah, change lanes, get away from them.
01:03:25.360 | And that's just like that other driver.
01:03:27.920 | So here's the thing.
01:03:30.040 | And this is, I say this a bit tongue in cheek,
01:03:33.240 | but it's sort of a real example.
01:03:35.640 | It may be that if we harness the different cell types
01:03:39.120 | in the retina to deliver different visual information
01:03:43.280 | to different cell types,
01:03:44.820 | like the image of the text on your screen
01:03:50.080 | to a certain cell type that you know,
01:03:51.760 | the so-called midget cells,
01:03:53.140 | or the motion of the objects in the visual scene,
01:03:58.040 | the cars to a different cell type that you know.
01:04:00.560 | - Midget cells, by the way, folks,
01:04:01.680 | because they're very, very small.
01:04:03.560 | - Yes, and by named by anatomists decades ago.
01:04:06.920 | So we carry that nomenclature forward.
01:04:09.140 | - Sure.
01:04:10.280 | - And the parasol cells, which are different cells
01:04:12.560 | that you can potentially encode the movement of the cars
01:04:16.040 | in the parasol cells.
01:04:17.500 | And if those two systems are operating independently,
01:04:19.980 | which we sort of think they may be
01:04:22.120 | from research that you know very well
01:04:23.520 | from your extensive studies in vision,
01:04:27.520 | maybe we can do those two things safely at the same time.
01:04:30.260 | Now, by the way, that's not my research goal
01:04:31.960 | to text and drive at the same time, just to be clear,
01:04:34.720 | but it's a very tangible real world example
01:04:37.280 | of if we do really have parallel pathways
01:04:39.840 | that can be modulated and controlled
01:04:41.640 | independently of one another,
01:04:43.540 | this opens the door to streaming
01:04:46.040 | all kinds of visual information in parallel
01:04:48.360 | into our very high bandwidth visual systems
01:04:50.680 | that wasn't possible during evolution
01:04:52.960 | because we didn't have control over the cell types.
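[Editor's note: the tongue-in-cheek example above amounts to multiplexing two independent streams onto distinct cell-type channels. A toy sketch of that routing idea follows; the two "channels" and their contents are purely illustrative and do not model real midget or parasol cell responses.]

```python
import numpy as np

def midget_stream(frame):
    """Fine-spatial-detail channel (midget-like): the static image itself."""
    return frame

def parasol_stream(prev_frame, frame):
    """Motion channel (parasol-like): frame-to-frame change."""
    return frame - prev_frame

# Two independent "sources": a static glyph and a moving object.
text = np.array([[0, 1, 0],
                 [0, 1, 0],
                 [0, 1, 0]], dtype=float)          # static text to read

road_t0 = np.zeros((3, 3)); road_t0[1, 0] = 1.0   # car on the left...
road_t1 = np.zeros((3, 3)); road_t1[1, 2] = 1.0   # ...now on the right

# Route each source to its own cell-type channel, in parallel.
detail = midget_stream(text)
motion = parasol_stream(road_t0, road_t1)

print(detail)   # fine detail carried by one cell population
print(motion)   # movement (-1 where the car was, +1 where it is) by another
```

The open scientific question EJ raises is whether the brain actually keeps these pathways independent enough that such parallel streams would be perceived separately.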
01:04:55.760 | So I think of that as the world of visual augmentation.
01:04:58.880 | And it starts to get interesting
01:05:00.480 | if the different cell types are behaving
01:05:02.720 | in an independent way
01:05:04.400 | when they transmit visual information to the brain.
01:05:06.240 | Now, how are we gonna figure that out?
01:05:08.080 | Well, we need a device that can stimulate
01:05:10.600 | the different cells independently,
01:05:12.940 | and then study that to see whether people
01:05:15.960 | can do this kind of thing, right?
01:05:19.180 | What's that device?
01:05:20.600 | The artificial retina, the same implant I'm telling you about
01:05:22.800 | that can restore vision to people
01:05:24.120 | because it's an electronic device
01:05:25.360 | that can dial up the activity in the different cell types,
01:05:27.960 | that same device is what we can use
01:05:30.120 | as a research instrument to understand
01:05:31.720 | if the different pathways are parallel,
01:05:33.800 | if the signals interact with one another,
01:05:35.880 | and explore how the brain receives that information.
01:05:38.800 | And then we can use that to explore,
01:05:40.240 | can we augment vision and allow ourselves
01:05:44.000 | to have newer visual sensations
01:05:45.580 | that we don't even know what that would look like.
01:05:47.080 | We don't even understand what it would look like
01:05:49.380 | to us to see those sensations,
01:05:50.760 | but it might be able to deliver
01:05:52.440 | lots of information to our brain.
01:05:53.880 | And if we can do all those things,
01:05:57.280 | then we can take that same set of tools
01:06:00.080 | and engineering technologies into the brain
01:06:03.000 | to access different cell types as well.
01:06:05.160 | - I'd like to just take a quick break
01:06:06.600 | and acknowledge one of our sponsors, InsideTracker.
01:06:09.600 | InsideTracker is a personalized nutrition platform
01:06:12.240 | that analyzes data from your blood and DNA
01:06:14.880 | to help you better understand your body
01:06:16.520 | and help you reach your health goals.
01:06:18.480 | Now, I've long been a believer
01:06:19.680 | in getting regular blood work done
01:06:21.040 | for the simple reason that many of the factors
01:06:23.480 | that impact our immediate and long-term health
01:06:25.840 | can only be addressed,
01:06:27.000 | that is can only be measured with a quality blood test.
01:06:30.320 | Now, one issue with many blood tests out there
01:06:32.240 | is that you get information back about lipid levels,
01:06:34.720 | hormone levels, metabolic factors, et cetera,
01:06:37.160 | but you don't know what to do with that information.
01:06:39.320 | With InsideTracker, they make it very easy
01:06:41.040 | to understand your levels and what they mean
01:06:43.640 | and specific actionable items that you can undertake
01:06:46.520 | in order to bring those levels
01:06:47.840 | into the ranges that are optimal for you.
01:06:49.960 | InsideTracker also offers InsideTracker Pro,
01:06:52.980 | which enables coaches and health professionals
01:06:55.240 | to provide premium and personalized services
01:06:57.740 | by leveraging InsideTracker's analysis
01:06:59.680 | and recommendations with their clients.
01:07:01.780 | If you'd like to try InsideTracker,
01:07:03.280 | you can go to insidetracker.com/huberman
01:07:06.320 | and you'll get 20% off any of InsideTracker's plans.
01:07:09.120 | Again, that's insidetracker.com/huberman.
01:07:12.400 | So to just summarize a little bit of the linear flow here
01:07:16.480 | of what you've done,
01:07:17.680 | you started off with the understanding
01:07:21.960 | that the neural retina is perhaps the best place
01:07:24.580 | to try and understand how the brain works
01:07:26.180 | because of its arrangement, the cell types, et cetera.
01:07:29.480 | You spend a number of decades doing these wild experiments
01:07:33.900 | on human retinas and other retinas,
01:07:36.580 | recording the different cell types with these high density,
01:07:38.940 | what I call bed of nails.
01:07:40.340 | - Two and a half decades, I'm not that old.
01:07:41.780 | - Two and a half decades.
01:07:42.920 | It's your robustness that matters, EJ,
01:07:48.500 | and you have plenty of it.
01:07:51.600 | You figure out what the cell types are.
01:07:53.020 | So then you gained an understanding
01:07:54.900 | of how light is transformed
01:07:56.580 | into different types of electrical signals
01:07:58.280 | that encode different features in the visual scene.
01:08:00.660 | Then comes the challenge
01:08:02.740 | of developing neuroengineering tools
01:08:06.020 | to try and stimulate the specific cell types
01:08:09.340 | in a way that more or less mimics
01:08:11.900 | their normal patterns of activation,
01:08:13.260 | like not activating a huge piece of retina
01:08:16.020 | so that the cells that increases in redness
01:08:19.580 | are also being stimulated with the cells
01:08:21.260 | that like edges in a way that would create some shmooey,
01:08:24.240 | like crazy representation of the outside world.
01:08:26.680 | No, you want the same precision
01:08:29.180 | that light stimulation of these cells
01:08:31.660 | in the intact human eye provides in this explanted retina,
01:08:36.660 | this retina on this bed of nails,
01:08:39.520 | but then a device that essentially can mimic
01:08:42.580 | what the retina does.
01:08:43.760 | And you needed to do all that earlier work,
01:08:46.220 | understanding like what does the normal retina do?
01:08:48.360 | What does the healthy retina do
01:08:49.640 | in order to try and develop this prosthetic device
01:08:52.040 | to either restore vision
01:08:53.300 | or because it puts you in the position
01:08:56.180 | of being able to stimulate cells however you want.
01:08:58.380 | In theory, you could create a situation in a human
01:09:01.480 | where the cells that respond to, I don't know,
01:09:06.180 | outlines of objects are hyperactive
01:09:08.340 | so that the person could effectively see objects
01:09:13.340 | in one's environment better than anyone else.
01:09:16.860 | Could perceive, I know motion is a tricky one,
01:09:19.260 | but motion better than anyone else.
01:09:21.780 | Could see detail in the visual world
01:09:23.540 | that no one else could detect.
01:09:25.460 | We're not talking about turning people into mantis shrimp,
01:09:28.780 | but the analogy works because mantis shrimp
01:09:31.620 | can see things that we can't and vice versa.
01:09:33.980 | And so what you're talking about here
01:09:35.240 | is neural augmentation through the use of engineering.
01:09:38.180 | - Yep, and we often do talk about it as sci-fi
01:09:41.620 | because the sci-fi writers have been writing
01:09:44.780 | about this for decades.
01:09:46.020 | It's not sci-fi anymore.
01:09:47.620 | It's sci, it's straight up sci right now.
01:09:49.940 | It's really, we just need to build the instrumentation
01:09:52.900 | and start working with those experiments
01:09:54.740 | to figure out how to make it work.
01:09:56.540 | I think we have a responsibility to do this
01:09:59.880 | because this is the way we take this kind of information,
01:10:03.540 | all that's been learned about the visual system
01:10:06.380 | by the National Eye Institute since 1968
01:10:09.340 | and all the people that it's funded to do this research
01:10:12.160 | and turn the corner and make a difference
01:10:13.580 | for humanity with it.
01:10:15.540 | And I assume and think that humanity
01:10:20.180 | will be leveraging nervous system knowledge
01:10:22.540 | to build all sorts of devices
01:10:24.380 | that we can interface to the world with.
01:10:27.020 | I think, you know, I don't know Elon Musk,
01:10:29.260 | but I think he's right about that,
01:10:30.820 | that that's where we're headed.
01:10:32.380 | It should be done well.
01:10:35.180 | It's important to do it well.
01:10:36.620 | We will hopefully be more connected to truth in the world
01:10:41.740 | if we're able to build devices that give us
01:10:43.420 | better sensations, more acute understanding
01:10:45.620 | of what's going on out there,
01:10:47.420 | better abilities to make decisions and all that,
01:10:49.940 | let alone just see stuff.
01:10:51.900 | So that frontier of developing technologies
01:10:55.340 | to allow our brains and our visual systems initially
01:10:58.380 | and then other parts of our brains to do things better
01:11:01.400 | is unbelievably exciting.
01:11:04.100 | It is sci-fi, but I just want to emphasize,
01:11:07.580 | I think it's the responsible way to go
01:11:10.280 | to think about how to do that well.
01:11:13.060 | All technologies that we develop
01:11:15.500 | can be used for good or for ill.
01:11:17.820 | And I'm sure some of your listeners
01:11:19.460 | a lot of very passionate, thinking people out there
01:11:22.660 | thinking about neuroscience and the implications, worry,
01:11:25.380 | what does this mean?
01:11:26.200 | We're gonna be introducing electronic circuits
01:11:28.020 | in our brains to do stuff?
01:11:29.700 | Yeah, well, we will.
01:11:31.080 | It's pretty much clear that humanity will do that.
01:11:34.540 | And so in any technology development,
01:11:37.660 | you have to think about, well, how do you do it well?
01:11:39.220 | How do you do it for good?
01:11:40.940 | There are popular movies right now
01:11:42.780 | about technology development,
01:11:44.060 | such as understanding the structure of the atom.
01:11:46.620 | And that technology development
01:11:49.060 | can be used for good or for ill,
01:11:51.460 | to blow up cities or to save civilizations.
01:11:54.620 | How's it gonna go?
01:11:56.700 | Well, I think as scientists,
01:11:58.540 | we're responsible for advancing that
01:12:01.580 | in a thoughtful and meaningful way.
01:12:03.460 | I think we can do this in the retina.
01:12:06.780 | It is the place to start.
01:12:09.740 | I'm curious what you think, actually, as a scientist,
01:12:11.980 | 'cause your background is in this field
01:12:13.540 | or a very close field to mine.
01:12:15.700 | I know you speak with all sorts of scientists
01:12:17.600 | on this podcast, but this is pretty much your field,
01:12:21.220 | or very close, not the neuroengineering part,
01:12:22.820 | but understanding the retina.
01:12:24.620 | And I'm curious if you agree
01:12:25.940 | that this is the place to start doing this stuff.
01:12:28.260 | - The first guest to ever ask me a question
01:12:30.100 | on this podcast during a guest interview.
01:12:33.260 | I think this would be a fun place
01:12:34.580 | to both answer and riff on this a little bit,
01:12:37.020 | because first of all, I think the retina
01:12:41.860 | is absolutely the place to start
01:12:43.100 | because we understand so much of what it does,
01:12:45.840 | what the cell types are.
01:12:47.140 | But maybe by comparison, a different brain region,
01:12:50.060 | the hippocampus, which is involved
01:12:51.700 | in the formation of memories and other stuff,
01:12:54.260 | but formation of memories about
01:12:56.300 | but in particular the formation of memories about
01:12:57.800 | what one did many years ago, et cetera,
01:13:00.340 | is an area that I think anytime the conversation
01:13:02.820 | about neuroprosthetics or neuroengineering
01:13:04.980 | or neural augmentation comes up,
01:13:06.300 | people think, oh, wouldn't it be cool
01:13:07.820 | to have like a little stimulating device in the hippocampus?
01:13:10.980 | And then if I want to remember a bunch of information
01:13:13.900 | from a page or from a lecture,
01:13:16.380 | I just stimulate and then voila,
01:13:18.580 | all the information is batched in there.
01:13:20.580 | While that's an attractive idea,
01:13:24.040 | I think it's worth pointing out right now that sure,
01:13:27.820 | there is a pretty decent understanding
01:13:30.180 | of the different cell types in terms of their shapes,
01:13:32.260 | some of their electrical properties of the hippocampus,
01:13:35.420 | but there is in no way, shape or form,
01:13:38.720 | the depth of understanding about the hippocampus
01:13:40.900 | and what the individual cell types do
01:13:42.340 | and what the different layers of it do
01:13:44.780 | that one has for the neural retina.
01:13:46.260 | So what we're really saying is
01:13:47.820 | if you stimulate the hippocampus,
01:13:50.220 | you'll likely get an effect,
01:13:52.220 | but it's unclear what the effect is
01:13:53.780 | and it's not clear how to stimulate.
01:13:55.680 | That's to me, the best reason to focus on the retina
01:13:58.560 | because you know what the cell types are,
01:14:00.380 | thanks to the work of your laboratory,
01:14:01.620 | many other laboratories,
01:14:02.700 | you know what sorts of stimulation matter
01:14:05.740 | and it provides the perfect test bed
01:14:08.140 | for this whole business
01:14:09.140 | of neural augmentation or neuroengineering.
01:14:11.580 | I think there's also a bigger discussion
01:14:13.280 | to just frame this in,
01:14:14.500 | which is so much of what we discuss on this podcast
01:14:18.040 | with guests and in solo episodes
01:14:19.900 | relates to neuromodulators like dopamine and serotonin.
01:14:23.060 | Everyone is interested in these things
01:14:24.760 | because they can profoundly change
01:14:26.780 | the way that we perceive and interact with the world,
01:14:29.100 | but one only has to look to the various pharmaceuticals
01:14:32.060 | that increase or decrease these neuromodulators
01:14:34.460 | and know that indeed those pharmaceuticals
01:14:36.860 | can be immensely beneficial to certain individuals.
01:14:39.380 | I wanna be clear about that,
01:14:41.140 | but that whatever quote unquote side effects one sees
01:14:43.520 | or lack of effect over time
01:14:45.260 | is because those receptors are essentially everywhere
01:14:47.940 | around the brain.
01:14:49.580 | So you can't just increase dopamine in the brain
01:14:52.100 | and expect to only get one specific desired effect.
01:14:54.900 | So the reason you're here today
01:14:58.140 | is not because we both worked on the retina
01:14:59.620 | and it's not because we happen to also be friends.
01:15:01.440 | It's because to my mind,
01:15:03.140 | your laboratory represents the apex of precision
01:15:07.860 | in terms of trying to figure out
01:15:10.440 | what a given piece of the brain,
01:15:13.240 | in your case, the neural retina,
01:15:14.680 | does, parse all its different components,
01:15:18.780 | and then use that knowledge
01:15:19.980 | to create a real world technology
01:15:22.100 | that can actually tickle and probe
01:15:24.580 | and stimulate that piece of the brain
01:15:27.140 | in a way that's meaningful, right?
01:15:28.740 | Not just like sending electrical activity in.
01:15:31.660 | And that to me is so important.
01:15:34.020 | I think if we were gonna think about levels of specificity
01:15:36.480 | for manipulating the human brain to get an effect,
01:15:40.580 | you would say, okay, crudest would be drugs.
01:15:44.900 | Take a drug, increases serotonin,
01:15:46.900 | which might bind to particular receptors.
01:15:48.660 | Let's take the drug psilocybin,
01:15:49.700 | a lot of excitement about psilocybin.
01:15:51.380 | We know that can lead to increased connectivity
01:15:53.620 | between different brain regions at rest.
01:15:55.660 | There's probably, there is,
01:15:57.140 | some demonstrated clinical benefit.
01:15:58.660 | There's also some potential hazards,
01:16:00.340 | but it's very broad scale.
01:16:02.340 | We don't know what's happening
01:16:03.980 | when the person is thinking about a piece of moss
01:16:07.300 | expanding into an image in a memory of their childhood.
01:16:10.940 | It's like a million different things are happening there.
01:16:14.140 | And then at the other far extreme
01:16:15.300 | is the kind of experiment that you're talking about,
01:16:17.460 | stimulating one known type of retinal cell,
01:16:20.720 | seeing what that means for visual processing
01:16:24.060 | or modeling what that means for visual processing,
01:16:26.220 | and then building a device that can do exactly that.
01:16:28.300 | And then maybe you ramp it up 20%, 50%.
01:16:31.700 | 'Cause I think that represents the first step into,
01:16:35.860 | okay, how would you stimulate the hippocampus
01:16:37.940 | to create a super memory?
01:16:39.660 | Would you stimulate a particular cell type
01:16:42.300 | in a particular way?
01:16:43.300 | And to my knowledge, there's,
01:16:45.460 | despite the immense excitement about the hippocampus
01:16:50.260 | and understandably so,
01:16:51.660 | there just isn't that precision of understanding yet.
01:16:55.620 | So forgive the long answer,
01:16:57.220 | but you asked me a question on this podcast,
01:16:59.780 | I'm gonna give you a long answer.
01:17:01.220 | - Yeah, and specificity is what you're talking about.
01:17:04.300 | And we need technology to do that,
01:17:06.820 | to modulate neural circuits in a highly specific way.
01:17:09.800 | We've got to start with the piece of the neural circuit
01:17:11.740 | we understand best and we have best access to,
01:17:13.740 | that's the retina.
01:17:14.560 | It's sitting right there on the surface.
01:17:15.740 | We can get right into it and install devices.
01:17:19.380 | We know so much about it.
01:17:20.700 | That's the place to start, the place that we understand.
01:17:23.740 | Build electronics that are adaptive,
01:17:27.420 | that sense what circuit they're embedded in
01:17:30.860 | and respond appropriately.
01:17:32.460 | It's not just electronics,
01:17:33.460 | you stick in there and it does something and that's it.
01:17:35.860 | No, it first figures out who it's talking to
01:17:38.540 | and then learns to speak the language
01:17:40.540 | of the nearby neural circuitry.
01:17:42.100 | - So a smart device.
01:17:43.300 | - Smart device.
01:17:44.380 | - Let's push on that a little bit.
01:17:45.540 | So put a little chip onto the retina
01:17:49.620 | that can stimulate specific cell types.
01:17:51.740 | Is there a way that it can use AI, machine learning,
01:17:56.100 | that it can learn something about the tissue
01:17:58.700 | it happens to be in contact with?
01:17:59.940 | - Absolutely.
01:18:00.780 | In the simplest possible way,
01:18:01.940 | the device works in three simple steps.
01:18:04.540 | Step number one, record electrical activity,
01:18:07.380 | which is what we do in our lab in a room full of equipment,
01:18:09.780 | but this is a two millimeter size chip
01:18:11.260 | implanted in your eye.
01:18:12.420 | Record the electrical activity.
01:18:13.940 | Just recognize what cells are there, when they're firing,
01:18:17.140 | what electrical properties they have
01:18:18.740 | to identify the cells and cell types
01:18:20.820 | in this specific circuit in this human.
01:18:23.140 | That's step one.
01:18:23.980 | Step two, stimulate and record.
01:18:27.140 | So you figure out, oh, this electrode
01:18:29.340 | activates cell number 14 with 50% probability.
01:18:32.820 | This electrode activates cell number 12,
01:18:35.060 | if you pass a microamp of current,
01:18:36.900 | with this probability and so on.
01:18:38.100 | You make a big table that tells you
01:18:40.220 | how these electrodes talk to these cells in your circuit.
01:18:43.940 | That's step two, calibrate the stimulation
01:18:46.020 | by stimulation and recording.
01:18:47.900 | Then finally, when you have a visual image
01:18:50.380 | and you want to represent it
01:18:52.780 | in the pattern of activity of these cells,
01:18:55.580 | you say, okay, I know from decades of basic science
01:18:58.780 | what these cells ought to be doing
01:19:00.660 | with this image that's coming in.
01:19:02.020 | I know exactly what they ought to be doing.
01:19:03.300 | That's what the science has been telling us.
01:19:04.660 | We've been studying the neural code
01:19:05.740 | for decades to understand this.
01:19:07.660 | I know what they should do.
01:19:09.540 | Use my device with my calibration,
01:19:11.380 | where I know where the cells are,
01:19:12.380 | I know how the electrodes talk to them,
01:19:13.660 | and bing, bing, bing, bing, bing,
01:19:14.660 | activate them in the correct sequence.
01:19:16.900 | That's what I think of as a smart device,
01:19:19.260 | a device that records, stimulates and records,
01:19:21.860 | and then finally stimulates.
01:19:23.740 | Yes, AI is part of that.
01:19:25.300 | Of course it is,
01:19:26.140 | because this is a very complex transformation,
01:19:29.460 | if you will, from the external visual world
01:19:31.860 | to the patterns of activity of these cells.
01:19:33.860 | It's not easy to write down in just a few lines of code
01:19:35.980 | or a few equations.
01:19:37.620 | It's complicated.
01:19:38.700 | And AI is really helpful for that.
01:19:40.940 | And learning by stimulating and recording
01:19:43.300 | and aggregating information quickly
01:19:45.100 | so that you can then use the device,
01:19:46.460 | that's absolutely a part of the engineering.
01:19:48.140 | Let me be clear, the AI doesn't help us to understand.
01:19:52.180 | It's just an engineering tool
01:19:54.220 | that helps us to capture what this thing normally does,
01:19:57.740 | and then go ahead and execute
01:19:59.260 | and make it do the thing it should normally do.
01:20:01.460 | - I hope people will appreciate this example.
01:20:06.380 | Perhaps not, you know, but goodness,
01:20:09.460 | I don't know, 40, 50 years ago,
01:20:11.060 | but still today, one treatment for depression
01:20:14.340 | is electric shock therapy.
01:20:16.100 | A very, you know, on the face of it, barbaric treatment,
01:20:21.100 | but effective in certain conditions.
01:20:23.180 | It's still used for a reason,
01:20:24.640 | but it can appear barbaric, right?
01:20:27.020 | You know, people have like a bite device, you know,
01:20:31.540 | so they don't bite through their tongue or their lips.
01:20:33.500 | They're, you know, they're strapped down
01:20:35.460 | and they stimulate.
01:20:37.060 | Just like stimulate all neurons in the brain, essentially.
01:20:40.540 | There's a huge dump of neurotransmitters
01:20:42.220 | and neuromodulators. - It's like drugs.
01:20:43.700 | It's completely nonspecific stimulation, effectively.
01:20:46.540 | - Probably even less specific than drugs,
01:20:49.340 | and yet the clinical outcomes from electric shock therapy,
01:20:53.220 | in some cases, are pretty impressive.
01:20:55.260 | Like people, the brain is quote-unquote reset.
01:20:58.580 | They still remember who they are,
01:21:00.320 | but presumably through the release of neuromodulators
01:21:06.220 | like dopamine, serotonin, and acetylcholine
01:21:07.780 | in a very nonspecific way,
01:21:09.820 | there has been some symptom relief in some cases.
01:21:14.140 | What you're talking about is really the opposite extreme.
01:21:17.540 | You know, before I said pharmaceuticals
01:21:18.980 | that tickle a particular neuromodulator pathway
01:21:21.060 | would be the opposite extreme.
01:21:22.060 | I think electric shock therapy is probably the most extreme.
01:21:26.380 | Where is this whole business of neuroprostheses
01:21:29.740 | going outside of the visual system?
01:21:32.180 | Like right now, I can imagine that there are
01:21:35.500 | little stimulators in the spinal cord
01:21:38.380 | for sake of restoring movement to paralyzed people.
01:21:41.900 | I realize this is not your field,
01:21:43.140 | but are you seeing impressive stuff there,
01:21:46.460 | or is it still really, really early days?
01:21:48.980 | - There's absolutely impressive stuff,
01:21:51.380 | particularly, for example, people reading out signals
01:21:54.900 | from the motor cortex or the language cortex
01:21:58.940 | in order to help people to communicate
01:22:01.180 | or to move cursors on screens
01:22:03.260 | in order to interact with devices.
01:22:05.220 | - These are paralyzed people.
01:22:06.460 | - Yeah, excuse me, paralyzed people
01:22:07.820 | who can't interact with technology the way that we do,
01:22:11.260 | but with their thoughts can send signals
01:22:15.220 | through an electronic device that can be used
01:22:17.860 | to control a mouse on a screen
01:22:19.620 | and have them connect to the internet.
01:22:21.340 | That's a huge deal to be able to have people do that.
01:22:23.820 | Imagine how life-changing it would be
01:22:25.220 | to be able to communicate if you couldn't before.
01:22:27.500 | So there are wonderful examples of that.
01:22:29.700 | You know of them, so do I.
01:22:32.060 | The work of Krishna Shenoy and Jamie Henderson at Stanford
01:22:34.940 | is one beautiful example over the recent years.
01:22:37.500 | Eddie Chang doing stuff now.
01:22:39.700 | You know, Neuralink doing this kind of stuff,
01:22:42.260 | built on the work of Shenoy and Henderson and stuff.
01:22:45.460 | So that's great.
01:22:46.800 | Stimulating in spinal circuits, as you said,
01:22:51.860 | for creating rhythmic movements, that's happening.
01:22:55.140 | So this is an enormous space.
01:22:57.620 | And in each case, what you said,
01:22:58.740 | I think is really highly relevant,
01:23:00.780 | that electroshock therapy, you can think of that as,
01:23:03.740 | look, let's say your computer is not behaving right.
01:23:06.700 | You can reboot it, might work.
01:23:10.220 | Then it'll start not working again.
01:23:12.380 | Then you have to reboot it again.
01:23:13.420 | Well, how often do you want to reboot your computer?
01:23:15.140 | It gets a little inconvenient
01:23:16.020 | to be rebooting your computer every five minutes.
01:23:18.220 | Maybe you want to go in and actually diagnose this thing
01:23:20.280 | and put in a piece of software
01:23:21.820 | that fixes the thing that was going wrong
01:23:24.500 | instead of rebooting your computer every five minutes, right?
01:23:26.700 | And I think of electroshock therapy
01:23:28.340 | a little bit as a reboot, it's at that level.
01:23:30.580 | So we want to intervene more specifically.
01:23:32.620 | How do you do that?
01:23:33.460 | You have to understand the software in order to do that.
01:23:36.260 | You have to have specificity
01:23:37.420 | controlling this thing in your computer,
01:23:39.420 | not this, this, this, this, or this,
01:23:41.280 | this particular thing that's going wrong.
01:23:43.260 | You got to interfere with that
01:23:45.060 | and change it and modulate it.
01:23:46.480 | Well, that's what understanding the neural circuit is about.
01:23:48.840 | That's what building specific hardware
01:23:50.380 | that can activate specific cells is about.
01:23:53.060 | That's in the case of the retina, again,
01:23:56.220 | it's just so obvious
01:23:58.940 | that it's right in front of us to do this stuff.
01:24:01.940 | And it's right in front of us to take us into augmentation,
01:24:05.100 | to giving us better senses.
01:24:07.180 | A fun example I like, it's an interesting topic
01:24:09.500 | because the National Institutes of Health
01:24:11.980 | that funds a lot of research that goes in this direction
01:24:16.980 | tends to not be interested in augmenting our senses.
01:24:21.740 | They kind of want to, they draw a line more or less
01:24:24.380 | at saying, look, we're trying to restore
01:24:26.780 | what we were as humans, not create a new kind of human.
01:24:31.500 | And that's an interesting question
01:24:33.580 | because I don't think there is
01:24:35.260 | an actual line, a bright line, between those things.
01:24:41.500 | The finest example I know is that even in the very crudest
01:24:47.300 | visual restoration devices,
01:24:50.280 | you have to actively suppress
01:24:52.460 | the infrared sensitivity of your camera
01:24:55.140 | to not have infrared vision.
01:24:58.200 | 'Cause many cameras are sensitive to infrared light.
01:25:00.820 | So in other words, if you don't put an infrared filter
01:25:02.780 | in front of your camera,
01:25:03.820 | you're going to have some infrared vision.
01:25:05.580 | Maybe it won't help you very much, whatever.
01:25:07.140 | I'm just trying to say,
01:25:08.780 | as soon as you start building devices to restore sensations,
01:25:11.720 | building electronic devices,
01:25:13.580 | augmentation is right around the corner.
01:25:15.380 | It'll creep up on you real fast.
01:25:16.860 | So I think you can't even really draw a line.
01:25:19.880 | - Throughout today's discussion,
01:25:24.340 | we've been thinking of the brain as kind of a,
01:25:27.160 | the rest of the brain, I should say,
01:25:28.540 | because the neural retina is two pieces of the brain
01:25:30.840 | extruded out into the eye globes during development.
01:25:33.300 | I like to remind people of that over and over.
01:25:35.440 | When you're looking at somebody's eyes,
01:25:36.660 | you are looking at two pieces of their brain.
01:25:39.500 | There's no debate about that,
01:25:41.440 | but most people don't realize that.
01:25:43.780 | You'll never look at anyone the same way again,
01:25:45.700 | but this is the reason why you can tell so much
01:25:47.740 | about people's inner state from their eyes.
01:25:50.180 | Somebody who's sleep deprived,
01:25:51.400 | it's not just about the droopiness of their eyelids
01:25:53.300 | or the circles under their eyes.
01:25:54.740 | There's an aliveness to the eyes.
01:25:56.980 | The yogis talk about people that sort of show up
01:26:01.980 | at the level of their eyes
01:26:04.260 | as opposed to sunk back into their brain.
01:26:07.580 | You know, these are very kind of abstract concepts,
01:26:10.720 | but they- - And there's some
01:26:11.560 | very non-abstract stuff these days,
01:26:13.060 | looking through the eye at the retina
01:26:15.020 | the way the ophthalmologists do.
01:26:16.980 | And there's a lot of diagnostic capability
01:26:18.900 | just in those images of the retina.
01:26:20.740 | - Oh, right, I'm glad you brought this up.
01:26:22.220 | There's some interesting and increasing evidence
01:26:25.760 | that looking at the neural retina,
01:26:27.700 | because it is a piece of the brain with neurons
01:26:29.700 | that have the potential to both thrive or degenerate,
01:26:32.620 | that looking at the neural retina,
01:26:34.080 | which one can do with these new technologies,
01:26:36.420 | can provide a window into whether or not
01:26:39.300 | there are forms of degeneration,
01:26:40.940 | such as Alzheimer's and other forms of neurodegeneration,
01:26:44.540 | deeper within the brain that one can't image directly
01:26:47.620 | because of the thick opacity of the skull, right?
01:26:51.020 | So in other words, imaging the eye
01:26:52.340 | in order to determine if someone is developing Alzheimer's.
01:26:55.100 | - 'Cause you have a direct view
01:26:56.080 | into a little piece of the brain.
01:26:57.440 | It's a good signal.
01:26:59.480 | It can help you figure stuff out
01:27:00.760 | about what's going on in your brain,
01:27:01.880 | even beyond the sunken eyes, absolutely.
01:27:05.160 | - Amazing.
01:27:06.560 | So I think the rest of the brain pieces
01:27:09.880 | is also really interesting.
01:27:10.880 | And maybe here we can go like semi-neurophilosophical.
01:27:14.640 | You know that there are clearly parts of the brain
01:27:18.060 | that are involved in essential,
01:27:20.120 | what I call housekeeping functions,
01:27:21.440 | regulating respiration, keeping us breathing,
01:27:23.920 | keeping our heart beating, digestion,
01:27:26.380 | responding to threats in some sort of basic way,
01:27:29.600 | like through the secretion of adrenaline
01:27:31.480 | and giving us the potential to move.
01:27:33.460 | But a lot of the brain is capable of plasticity.
01:27:37.400 | And one wonders if you were to, for instance,
01:27:41.360 | develop a retinal prosthesis that would allow me
01:27:46.080 | to see with twice the level of detail that I currently can,
01:27:51.220 | would my adult brain be capable
01:27:53.420 | of dealing with that information?
01:27:54.660 | We're talking about twice as much information coming in,
01:27:58.040 | same brain tissue on the receiving end.
01:28:01.780 | Can it make sense of it?
01:28:03.020 | Do we have any idea if it can make sense of it?
01:28:04.620 | Are there experiments that speak to that?
01:28:07.380 | - Yeah, that's a fantastic and interesting question.
01:28:10.500 | It makes you think about neural development
01:28:12.060 | all over again, right?
01:28:13.100 | And I take some inspiration on that
01:28:15.660 | from the work of someone you know, Eric Knudson,
01:28:18.900 | who discovered many wonderful things,
01:28:22.780 | among them that there is plasticity
01:28:25.100 | in certain animals
01:28:26.700 | well beyond the period of time
01:28:29.600 | that we thought there was.
01:28:31.940 | And in particular, that if you make gradual adjustments
01:28:35.420 | to the sensory world, you can exhibit plasticity
01:28:39.780 | that you can't see if you make an abrupt adjustment.
01:28:43.300 | So plasticity is there,
01:28:46.860 | it just has to be brought on by more subtle manipulations
01:28:51.100 | that take you from A to B in little incremental steps.
01:28:53.780 | And if you take those incremental steps,
01:28:55.740 | you can see the adult plasticity.
01:28:58.260 | So coming to your question,
01:29:00.940 | is the brain capable of receiving
01:29:04.500 | the kind of extra information we provide it?
01:29:06.860 | Could be that if we just show up on day one, bam,
01:29:09.500 | try to deliver twice the visual resolution
01:29:11.740 | or whatever in your visual system,
01:29:14.000 | it could be that if we try to deliver
01:29:16.340 | twice the visual resolution on day one, it won't work.
01:29:20.820 | You'll see, it won't look sensible to you.
01:29:24.340 | But if we gradually introduce it
01:29:26.460 | by the way that we're dialing up the resolution,
01:29:29.740 | by dialing up the resolution step by step,
01:29:31.260 | And there are fascinating studies to be done.
01:29:33.300 | You think about spike timing dependent plasticity,
01:29:35.920 | a term that your viewers may not all know,
01:29:38.580 | but is related to how neurons adjust
01:29:41.980 | their strength of connectivity to one another
01:29:44.780 | according to the timing of the signals in those cells.
01:29:47.940 | Mechanisms like that tell us,
01:29:49.140 | wow, the brain really cares
01:29:51.020 | about the very precise timing of stuff.
01:29:53.500 | And to the degree that that influences
01:29:55.220 | the way that neurons do or don't
01:29:57.060 | strengthen their connections to one another,
01:29:58.900 | it's so fundamental in everything from memory
01:30:01.140 | to visual function, what have you.
01:30:02.980 | - This relates to fire together, wire together.
01:30:05.340 | Although it highlights the together part,
01:30:08.380 | how closely in time do two neurons need to be active
01:30:11.220 | in order for them to subsequently
01:30:13.820 | increase their connectivity?
01:30:15.420 | - And indeed one of them needs to be active
01:30:16.940 | a little bit before the other one
01:30:18.620 | in order for it to work optimally, right?
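The timing rule being described here, pre-before-post strengthens, post-before-pre weakens, can be written down compactly. What follows is the standard textbook pair-based STDP model, not a model from either speaker's lab, and the amplitudes `A_PLUS`, `A_MINUS` and time constant `TAU` are arbitrary illustrative values.

```python
import math

# Pair-based STDP sketch. dt = t_post - t_pre in milliseconds:
# pre-before-post (dt > 0) potentiates the synapse, post-before-pre
# (dt < 0) depresses it, with exponential decay in |dt|.
A_PLUS, A_MINUS = 0.01, 0.012  # learning amplitudes (assumed values)
TAU = 20.0                     # decay time constant in ms (assumed)

def stdp_delta_w(dt_ms):
    """Weight change for one pre/post spike pairing separated by dt_ms."""
    if dt_ms > 0:                        # pre fired first: potentiation
        return A_PLUS * math.exp(-dt_ms / TAU)
    elif dt_ms < 0:                      # post fired first: depression
        return -A_MINUS * math.exp(dt_ms / TAU)
    return 0.0

# "Together" really means within a few tens of milliseconds:
print(stdp_delta_w(5.0))    # pre 5 ms before post: positive change
print(stdp_delta_w(-5.0))   # post 5 ms before pre: negative change
print(stdp_delta_w(200.0))  # spikes far apart: change near zero
```

The exponential window is what makes "a little bit before" matter: pairings separated by a few milliseconds change the synapse substantially, while pairings hundreds of milliseconds apart barely register.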
01:30:20.900 | So what I envision is that when we have full control
01:30:25.300 | of the neural code with an electronic implant
01:30:26.980 | that can talk to the cells
01:30:27.820 | and do all the things that I said,
01:30:28.860 | and we can really control the neural code
01:30:30.700 | coming out of the retina,
01:30:32.220 | we can then start to play games
01:30:33.580 | and dial up that neural code very slowly
01:30:36.260 | and teach the remaining brain
01:30:39.100 | how to understand these signals.
01:30:41.260 | Not introduce some crazy thing from day one.
01:30:43.540 | No, just gradually teach.
01:30:46.500 | Isn't that how we do everything well?
01:30:47.980 | Isn't that how plasticity works?
01:30:49.620 | - I love the subtlety and the rationality of your example,
01:30:55.580 | because so much of what we see in the internet
01:30:58.460 | and on the news is like chips inserted into the brain
01:31:01.780 | to create super memory,
01:31:03.060 | or conversations between 50 people at the same time
01:31:06.980 | without anyone speaking,
01:31:08.580 | just hearing other people's thoughts
01:31:09.860 | by way of neural signals being passed from one to the next.
01:31:13.820 | But yet another reason why you're here today
01:31:15.860 | is because you represent the realistic, grounded,
01:31:20.440 | incremental approach to really parsing this whole thing
01:31:25.700 | of how the brain works
01:31:27.020 | and how one then goes about engineering devices
01:31:31.500 | to augment the human brain.
01:31:32.780 | And as you just pointed out,
01:31:34.020 | it's not going to be done by just sticking electrodes in
01:31:36.660 | and stimulating and seeing what happens.
01:31:38.140 | In fact, those experiments were done in the 1960s.
01:31:40.260 | People like Robert Heath would put electrodes
01:31:42.120 | into the brain during neurosurgery,
01:31:43.260 | stimulate and just see what the patient would do
01:31:45.100 | or what they reported thinking.
01:31:46.940 | And nowadays that's done still,
01:31:48.720 | but with a lot more precision and intention.
01:31:51.140 | And we've moved far beyond that.
01:31:53.980 | - By the way, I just want to say,
01:31:54.980 | those were important first experiments.
01:31:56.300 | - Yeah, they're cool experiments.
01:31:57.120 | - That's the first thing you got to do.
01:31:57.960 | - Heath was, let's be honest,
01:32:00.900 | not the most savory character.
01:32:02.980 | He embarked on some experiments
01:32:04.540 | that had a social agenda to them
01:32:06.740 | and, at least by my read,
01:32:09.080 | was not the kind of person I'd want to spend time with,
01:32:11.820 | to say the least.
01:32:12.660 | But you're right, those experiments were critical
01:32:15.140 | because like electric shock therapy,
01:32:18.260 | like the formulation of drugs
01:32:19.900 | that can massively increase certain neuromodulators
01:32:22.220 | or decrease them, they led to some level, crude,
01:32:25.820 | but some level of understanding about how the brain works,
01:32:28.620 | which is what we're really talking about today.
01:32:31.140 | But you represent the, as I said, the astronauts of this.
01:32:35.380 | Astronauts don't go into space and just kind of blast off
01:32:38.020 | and see where they end up.
01:32:39.300 | There's math, there's physics, there's computer science.
01:32:44.300 | - Sensors, cameras.
01:32:45.920 | - That's right.
01:32:46.760 | - Looking, where are we about to land here on the moon?
01:32:49.020 | Is there a crater here or not?
01:32:51.020 | What's around us?
01:32:53.500 | We should sense what's there
01:32:54.820 | and then make our decisions accordingly.
01:32:56.580 | And our electronic implants in the brain,
01:32:59.380 | really, we should make them smart.
01:33:00.900 | Why make them dumb?
01:33:02.260 | We're smarter than that.
01:33:03.460 | We can build implants that can sense what's around them
01:33:05.940 | and change their patterns of activity accordingly.
01:33:09.340 | I use a metaphor sometimes.
01:33:11.340 | If you go as an American who doesn't speak Chinese to China
01:33:15.620 | and you start yelling in English,
01:33:17.760 | maybe somebody will, to learn which way to go on the street,
01:33:21.260 | somebody might understand you at some point
01:33:23.580 | and help you out, but it's gonna be a,
01:33:25.500 | it's gonna be a not very effective way to get around.
01:33:28.580 | It's way better if you speak the language,
01:33:30.220 | you talk to people and ask them where to go.
01:33:32.980 | So that's what we need to know.
01:33:34.780 | We need to say, look, we have the science.
01:33:37.440 | We have incredible devices we can engineer.
01:33:39.220 | We have AI now that even helps us to do this query
01:33:42.660 | of the outside world and turn it into a smarter instrument.
01:33:45.720 | Make our instruments smart.
01:33:47.740 | Make them so they know how to talk to the brain.
01:33:50.460 | Don't expect that the brain is just gonna wrap itself
01:33:52.700 | around your simple electronic device.
01:33:55.300 | No, make a smart device.
01:33:56.740 | That's what we wanna make, a smart retinal implant.
01:33:59.180 | - Maybe we could just take a couple of minutes
01:34:02.780 | and talk a little bit about you and some of the things
01:34:07.060 | that have led to your choice to go in this direction.
01:34:10.740 | So did you always know you wanted to be a neuroscientist
01:34:14.500 | from the time you started college?
01:34:16.140 | What was your trajectory?
01:34:17.020 | I should know this, but you were an undergraduate at?
01:34:20.540 | - At Princeton.
01:34:21.380 | - At Princeton, that's right, studying?
01:34:22.820 | - Math. - Math.
01:34:24.820 | So you could have just done all your work
01:34:26.220 | with a piece of paper and a pen,
01:34:27.420 | but you had to try and engineer all these electrodes.
01:34:30.220 | Okay, so much for pen and paper, then.
01:34:32.660 | - I took a very random route.
01:34:34.620 | I studied math as an undergrad.
01:34:36.660 | I spent a few years running around playing music
01:34:39.180 | and traveling and living a bohemian life.
01:34:41.140 | - Tell me more about that.
01:34:42.260 | - Oh, I basically just told you all I'm gonna tell.
01:34:45.980 | - Following the Grateful Dead.
01:34:47.940 | - No, not quite following them,
01:34:49.380 | but that was an important part of the story.
01:34:53.260 | - Was that an important part of your personal development?
01:34:56.180 | - Yes, very much so.
01:34:59.140 | Free expression, dance, music, creative,
01:35:02.780 | exploratory music, all that kind of stuff.
01:35:05.580 | - Such a contrast to the EJ that comes forward
01:35:09.500 | when we're talking about the precision
01:35:10.740 | of neural stimulation in specific retinal cell types.
01:35:13.960 | But I think it's useful for people to hear
01:35:15.900 | both young and old that one's nervous system
01:35:20.040 | can be partitioned into these different abilities,
01:35:22.140 | like go and dance and travel,
01:35:23.580 | and you weren't doing anything academic at that time?
01:35:26.140 | - No, for a few years I wasn't doing that,
01:35:27.540 | I was programming computers to make a living.
01:35:30.140 | And then I started three different PhD programs
01:35:34.320 | at Stanford before I settled.
01:35:35.740 | - Simultaneously?
01:35:36.580 | - No, no, no, in sequence.
01:35:38.140 | I started in the math PhD program.
01:35:39.600 | I learned that was not really for me.
01:35:41.560 | And I started in an economics PhD program
01:35:44.860 | in the business school there.
01:35:46.020 | And I realized after less than a year, that was not for me.
01:35:49.420 | I worked in a startup company for a while.
01:35:52.220 | I did a lot of stuff for a few years
01:35:53.940 | and it took some settling.
01:35:57.100 | But then I decided to go into neuroscience.
01:36:00.360 | And there were a couple of formative things.
01:36:02.980 | One is that I had gotten a really formative experience
01:36:05.980 | as an undergraduate from a wonderful guy called Don Reddy,
01:36:08.920 | who taught an introductory neuroscience course,
01:36:11.020 | who was a really inspiring mentor.
01:36:12.960 | And then when I was at Stanford,
01:36:17.040 | I met Brian Wandel, my PhD advisor,
01:36:19.920 | and I was inspired by him.
01:36:21.780 | I thought, I didn't know why he was studying
01:36:23.980 | what he was studying,
01:36:24.820 | but I just knew I wanted to learn from this man.
01:36:26.760 | And I wanted to study with him.
01:36:28.820 | I just knew this was the person who should be my mentor.
01:36:31.320 | - Based on something about him.
01:36:32.460 | - Yes.
01:36:33.380 | - Can I ask you about these three PhD programs?
01:36:35.660 | Because I think people hear,
01:36:37.060 | or see what you're doing
01:36:40.800 | and probably imagine a very linear trajectory.
01:36:43.820 | But now I'm hearing you like tour around playing music.
01:36:47.240 | Then you start a PhD program.
01:36:49.140 | Nope.
01:36:49.980 | Then another one.
01:36:50.800 | Nope.
01:36:51.640 | Then another one without getting into all the details.
01:36:53.340 | I mean, were there nights spent lying awake,
01:36:55.780 | thinking like, what am I going to do with my life?
01:36:57.140 | Or did you have the sense that you knew
01:36:59.340 | you wanted to do something important,
01:37:00.860 | you just hadn't found the right fit for you?
01:37:02.980 | Like how much anxiety on a scale of one to 10,
01:37:05.380 | 10 being total panic,
01:37:07.100 | did you experience at the apex of your anxiety
01:37:09.620 | in that kind of wandering?
01:37:10.860 | - Am I allowed to go above 10?
01:37:12.260 | Like turning up the amp to 11?
01:37:14.180 | - Sure.
01:37:15.020 | I just think it's really important for people to hear
01:37:16.820 | whether or not they want to be scientists or not,
01:37:18.260 | this idea that people that are doing important things
01:37:20.620 | in the world, in my view,
01:37:24.300 | rarely if ever understood that that's the thing
01:37:27.340 | that they were going to be doing.
01:37:28.220 | There was some wandering about.
01:37:29.660 | - That sure seems like it, doesn't it?
01:37:31.260 | Yeah.
01:37:32.100 | I experienced the same when I talked to other people
01:37:34.100 | and it seems like that.
01:37:34.940 | And for sure for me,
01:37:36.700 | it just took a while of trying different things to see,
01:37:39.140 | number one, what I was really good at
01:37:41.220 | and where I felt I could make a difference.
01:37:46.060 | And I realized I studied math and I was okay at math,
01:37:51.360 | but I have known mathematicians who are really talented,
01:37:55.640 | gifted mathematicians,
01:37:56.680 | the ones who really make a difference.
01:37:58.120 | I wasn't gonna be one of those people.
01:38:00.240 | Likewise, playing music,
01:38:01.640 | I don't have that intrinsic talent.
01:38:02.960 | It's fun, I can play songs in front of people and do stuff.
01:38:05.720 | I like it and stuff like that,
01:38:07.600 | but I don't have that kind of talent.
01:38:09.240 | In fact, I'll say something that I say to friends sometimes
01:38:13.080 | and you're a good friend of mine.
01:38:14.680 | If I had the talent to get a few thousand people
01:38:18.340 | on their feet dancing by playing music,
01:38:21.040 | I'd probably just do that.
01:38:22.360 | - Really?
01:38:23.340 | As long as we've been friends, I knew none of this.
01:38:26.000 | I knew none of this.
01:38:27.080 | Mostly because I think we always end up talking
01:38:28.800 | about neuroscience or other aspects of our life.
01:38:32.100 | But I didn't know, I know a great many things about you,
01:38:36.520 | but I had no idea.
01:38:37.680 | It's so interesting.
01:38:39.160 | Do you still do dance?
01:38:40.480 | We had Erich Jarvis on the podcast, by the way,
01:38:42.580 | professor at Rockefeller who studies,
01:38:44.420 | at one point was studying speech in New York.
01:38:50.900 | In birds, song in birds.
01:38:53.520 | And then he's done a great many other things now
01:38:55.840 | in genetics of vocalization.
01:38:57.780 | And he actually danced with,
01:39:00.940 | or was about to dance with the Alvin Ailey Dance Company.
01:39:04.940 | So he's a really, really talented dancer.
01:39:07.000 | And so dance seems to be like a theme that comes up
01:39:11.580 | among the neuroscience guests on this podcast.
01:39:13.540 | Do you still dance?
01:39:14.660 | - Yeah, I love to dance.
01:39:15.560 | I'm a free form dancer.
01:39:16.420 | I'm not a skilled dancer, but I love music.
01:39:19.060 | I love dancing.
01:39:19.900 | I think it's part of the human spirit.
01:39:21.180 | I someday will understand the neuroscience
01:39:23.020 | behind dancing, right?
01:39:23.980 | Dancing is a universal human thing in all cultures.
01:39:26.540 | What is this dancing thing?
01:39:27.780 | Why do we do this and other creatures don't?
01:39:29.780 | - Well, Jarvis thinks that perhaps it's one
01:39:33.100 | of the more early forms of language.
01:39:35.740 | And that song came before spoken language.
01:39:39.460 | It's sort of interesting that birds
01:39:41.660 | that can actually recreate human speech oftentimes
01:39:46.660 | have the ability to dance as well.
01:39:49.060 | - Oh, wow.
01:39:49.900 | - And so there's some commonality in the circuitry there.
01:39:53.180 | We'll provide a link to that episode.
01:39:54.860 | Jarvis, it's really interesting.
01:39:55.900 | - I would love to hear that.
01:39:57.580 | But if I may, I'd like to riff on that in a different way.
01:40:01.940 | I did spend some time wandering around, as many people do.
01:40:04.740 | And I think particularly for your young listeners
01:40:08.100 | and viewers who don't know, wow, could I ever be a scientist
01:40:11.760 | and develop new things, stuff like that?
01:40:13.340 | Yes, you can.
01:40:14.300 | And if you're messing around in your life,
01:40:15.660 | trying this, trying that, trying the other thing,
01:40:18.500 | definitely stick with it.
01:40:20.000 | Keep looking for the thing that works for you.
01:40:22.300 | I really deeply believe that.
01:40:24.180 | You gotta play around.
01:40:25.260 | You gotta find what it is that works for you.
01:40:27.820 | Interestingly enough, at least it's interesting for me,
01:40:31.560 | I spent a lot of years studying the retina
01:40:35.060 | in a pure basic science,
01:40:36.580 | just curiosity-driven research way,
01:40:39.680 | as you and I have both done in the past.
01:40:44.740 | And as it turned out, I learned all the stuff
01:40:49.660 | I needed to know about the retina
01:40:51.900 | to develop a high-fidelity adaptive retinal implant
01:40:56.900 | of the type that I'm talking about in that process.
01:40:59.460 | The technology, the stimulation, recording,
01:41:01.380 | figuring out the cell types,
01:41:02.220 | how do you stimulate, all that stuff.
01:41:03.820 | I learned all that stuff.
01:41:05.860 | And I have come to a point in my life where I realize,
01:41:09.140 | wow, if somebody's gonna do what I think needs to be done,
01:41:13.200 | which is to take everything we've learned about the retina
01:41:15.660 | and instantiate that in smart technology
01:41:19.060 | that can restore vision
01:41:20.060 | and do all the things we've been talking about,
01:41:22.940 | who are the people in the world
01:41:24.820 | who have the right training and background
01:41:26.940 | and know-how to do that stuff?
01:41:30.180 | I'm one of them.
01:41:31.760 | I know that.
01:41:32.860 | And it's totally by chance that I picked up and learned,
01:41:35.880 | or it seems by chance that I picked up and learned
01:41:37.820 | the things that I need to know for this.
01:41:40.780 | But I definitely have the right know-how to do this
01:41:43.660 | based on all my training and the research that I've done.
01:41:45.780 | And it feels accidental sometimes.
01:41:47.400 | I look back on my own history.
01:41:48.740 | I'm like, how did I get here
01:41:50.460 | where this is obviously the thing I need to do?
01:41:53.100 | Was this on purpose?
01:41:54.940 | It didn't seem like it was on purpose,
01:41:56.460 | but now I gotta do this because I know what needs to be done
01:42:00.060 | and it's something that needs to be done.
01:42:01.420 | So that's my mission for the coming decade or so.
01:42:05.900 | - Wild.
01:42:06.860 | I mean, I knew you had this engineer, mathy,
01:42:10.220 | geeky neuroscience.
01:42:11.340 | I don't wanna say geeky,
01:42:12.180 | 'cause it may sound like I'm not right there
01:42:14.300 | in the same raft with you,
01:42:16.520 | but I didn't know about this more free-spirited
01:42:20.840 | move in all directions,
01:42:21.940 | depending on what one feels in the moment, dancing EJ.
01:42:25.620 | It's very cool.
01:50:26.640 | Are you still an absolute level 11 coffee snob?
01:42:32.060 | - Yes.
01:42:32.900 | - Okay, yeah.
01:42:33.720 | I used to go to meetings
01:42:35.320 | and EJ would bring his own coffee maker
01:42:37.320 | and coffee to meetings.
01:42:39.200 | We're not talking about an espresso machine.
01:42:40.800 | We're talking about like extreme levels of coffee snobbery.
01:42:44.440 | - A press pot, the correct ground coffee,
01:42:46.520 | the correct kind of press pot, yeah.
01:42:48.040 | - Good, good, good.
01:42:49.120 | I expect nothing less.
01:42:50.640 | Proof that not all circuits in the brain are neuroplastic,
01:42:53.600 | nor should they be.
01:42:54.600 | - That's right.
01:42:55.440 | - But to bridge off of that in a more serious way,
01:43:00.560 | despite the free exploration aspect to yourself
01:43:05.560 | and that hopefully other people don't suppress,
01:43:09.960 | it seems like you really are good at knowing your taste.
01:43:14.960 | I think it was the great Marcus Meister,
01:43:19.840 | a colleague of ours who has also worked
01:43:22.400 | on the neural retina extensively, of course,
01:43:24.320 | who once said that there's a coding system in the brain
01:43:27.620 | that leads to either the perception,
01:43:30.300 | the feeling of yum, yuck, or meh.
01:43:35.300 | And that so much of life is being able to register that
01:43:38.260 | in terms of who you interact with and how,
01:43:40.580 | and the choices or problems to work on.
01:43:43.180 | It seems like you have a very keen sense of like, yes, that.
01:43:47.140 | And you move toward that.
01:43:48.540 | You've always been very goal-directed
01:43:51.340 | since the time I've known you.
01:43:52.620 | So, and you've picked such a huge problem,
01:43:56.580 | but going about it in such a precise way,
01:43:59.860 | hence the analogy to the space mission.
01:44:03.260 | So, like when you experienced that,
01:44:07.220 | may I ask, does it come about as like a thought?
01:44:10.660 | Like, oh yeah, that has to be the thing,
01:44:12.140 | or is it like a whole body sensation?
01:44:13.940 | - What a great question.
01:44:14.980 | I love that question.
01:44:16.140 | I have two things to say to that.
01:44:18.180 | The first is that for me, it's all feeling.
01:44:22.980 | I don't make hardly any decisions out of thoughts.
01:44:27.540 | I think, I process, I put it all into the hopper,
01:44:31.740 | and the hopper comes out and spits out a feeling,
01:44:33.740 | and the feeling's like, yeah, that's the thing to do.
01:44:36.500 | A hundred percent.
01:44:37.420 | And I know not everybody's like me.
01:44:39.300 | Lots of people aren't like me,
01:44:40.620 | and particularly lots of scientists aren't like me.
01:44:43.460 | You know, but I definitely roll that way.
01:44:45.700 | That is absolutely how I work.
01:44:47.780 | There's something that's related to that
01:44:49.660 | that I think is, you know, philosophically,
01:44:52.540 | and in terms of personal development
01:44:54.020 | and spiritual development stuff,
01:44:55.700 | I think is quite relevant that I think you'll relate to.
01:44:58.540 | My favorite aphorism is know thyself, the oracle.
01:45:03.980 | And I think, because if you don't know yourself,
01:45:10.180 | you can't do anything.
01:45:11.740 | You don't even know where to go.
01:45:13.820 | You can't even orient yourself
01:45:16.300 | at the next thing in your life.
01:45:19.300 | And I think it deserves to have two corollaries
01:45:23.820 | that go with it, or addenda.
01:45:26.500 | Know thyself, be thyself, which is not easy.
01:45:32.020 | It's not easy to really be yourself in this world.
01:45:34.540 | There are all sorts of things telling us
01:45:36.340 | to be something other than what we are.
01:45:38.300 | And the third one is love thyself.
01:45:41.500 | And it's, you know, having gone through much exploration
01:45:48.460 | of yourself and your life and your values and me too,
01:45:51.700 | and all the things we've talked about over time.
01:45:53.980 | That's not easy.
01:45:55.140 | Some of us are not necessarily programmed to love ourselves.
01:45:58.500 | And it's a skill.
01:46:00.700 | And I really, I try my best
01:46:05.060 | to be with those three things all the time.
01:46:07.380 | Know thyself, be thyself, love thyself.
01:46:10.220 | - Could you elaborate a little bit more
01:46:12.380 | on your process for the third?
01:46:14.460 | This is a concept that has been very challenging
01:46:17.940 | for me and I think for many other people.
01:46:20.500 | And it gets kind of opaque
01:46:25.900 | when it starts getting conflated
01:46:27.660 | with like self-respect and et cetera,
01:46:30.100 | like loving thyself.
01:46:31.780 | Do you have practices?
01:46:33.740 | Do you meditate?
01:46:35.460 | Do you journal?
01:46:36.740 | Do you spend time trying to cultivate a love for self?
01:46:41.460 | - Yeah.
01:46:42.580 | Yeah, I meditate in an informal way
01:46:45.020 | in the morning with my coffee.
01:46:46.940 | Every morning, I make a fantastic cup of coffee
01:46:50.660 | and I sit with it for five or 10 minutes
01:46:53.700 | and take in my world as it's coming toward me
01:46:56.220 | and start to, as I come into the day
01:46:59.140 | and come into consciousness, I meditate like that.
01:47:02.740 | And I have an Ashtanga-related yoga practice.
01:47:06.140 | Many of your viewers and listeners
01:47:08.940 | will know about Ashtanga yoga.
01:47:10.260 | It's a very physical, spiritual, traditional yoga practice
01:47:14.900 | that has a deep meditative and breath focus component.
01:47:18.140 | I know you've had lots of episodes
01:47:19.460 | and discussion about the breath
01:47:20.740 | and the importance of that for awareness.
01:47:23.180 | You know, at the end of many Western yoga practices,
01:47:28.800 | you end with namaste, which is expressing your respect
01:47:31.980 | for the connectedness of what is in front of you
01:47:36.240 | to the whole universe
01:47:37.660 | and what's common to all of us and everything.
01:47:41.460 | And I usually practice yoga by myself.
01:47:44.180 | And when I say namaste at the end of my yoga practice,
01:47:47.160 | a part of that is to myself.
01:47:50.080 | - Earlier, when I asked you about
01:47:52.260 | how you guide your decisions,
01:47:54.220 | you said it's all feel
01:47:56.660 | and you provide a beautiful description
01:47:58.700 | as to how and why that occurs for you
01:48:01.500 | and your trust in that.
01:48:02.780 | I don't recall you saying whether or not
01:48:06.700 | the feeling is in your head or it's a whole body feeling.
01:48:09.900 | Does it have a particular signature to it
01:48:11.680 | that you'd be willing to share?
01:48:12.980 | Is it excitement that makes you want to get up and move
01:48:15.140 | or is it a stillness?
01:48:16.680 | I think I ask because we've been talking
01:48:19.980 | throughout today's episode about the precision
01:48:22.380 | of neural coding and the signals
01:48:23.900 | that are at the level of individual cells.
01:48:25.500 | And yet when it comes to feeling,
01:48:27.900 | we actually have a pretty crude map
01:48:31.900 | and certainly a deficit in language
01:48:33.660 | to explain what this feeling thing is.
01:48:35.900 | And I know that people experience life
01:48:38.340 | and feelings differently,
01:48:39.240 | but I think it's often insightful
01:48:42.400 | when somebody with your understanding
01:48:45.060 | of the nervous system and yourself can share a bit.
01:48:48.020 | Like, what does it feel like?
01:48:50.060 | - I love that question.
01:48:53.020 | And it relates to something you said to me years ago.
01:48:56.200 | What it feels like is ease.
01:49:01.640 | And I remember years ago,
01:49:04.260 | when we were talking about challenging things
01:49:05.900 | that each of us was facing in our lives,
01:49:07.400 | you said to me something like,
01:49:09.020 | "I wish for you some ease with all of this."
01:49:12.660 | It was very moving, touching,
01:49:16.260 | as that's what a good friend does
01:49:17.960 | is to give that to somebody who they love.
01:49:21.140 | And it sticks with me probably 10 years later.
01:49:26.140 | So the feeling I feel when I'm on the path
01:49:32.600 | that makes sense for me is ease.
01:49:36.380 | It's, there's just, it's just, okay, this is it.
01:49:41.260 | - I love that.
01:49:44.460 | I don't actually recall that specific conversation
01:49:46.860 | because we had many, many conversations
01:49:48.600 | sitting in your yard in San Diego
01:49:50.720 | on those plastic chairs with my bulldog, Costello,
01:49:55.640 | hanging out.
01:49:56.480 | By the way, folks, EJ knew Costello,
01:49:58.120 | my bulldog master, very well.
01:50:00.500 | And he was not a huge fan of dogs
01:50:03.520 | prior to meeting Costello,
01:50:05.040 | but Costello flipped him.
01:50:07.320 | He became at least a Costello lover.
01:50:09.440 | - I love Costello.
01:50:10.880 | I'll never forget him.
01:50:11.840 | - Yeah, he embodied ease.
01:50:13.900 | (both laughing)
01:50:14.960 | He did nothing but sleep and eat.
01:50:16.860 | He embodied energetic efficiency.
01:50:21.360 | - And everybody loves him.
01:50:22.600 | Everybody who gets to be in the same room with him loves him.
01:50:25.440 | The people I just spoke to in your setup here,
01:50:28.040 | your colleagues, you know, yeah.
01:50:30.120 | Yeah, I could see why.
01:50:30.960 | And you have a beautiful photo of him hanging there.
01:50:32.760 | - Yeah, yeah, he's a--
01:50:34.200 | - What a creature.
01:50:35.020 | - He's a great memory.
01:50:36.000 | Definitely embedded in my nervous system.
01:50:37.540 | I often choke up when I think about him,
01:50:38.960 | but I want to be clear
01:50:40.040 | 'cause I've already cried once about him
01:50:41.520 | on a different podcast.
01:50:42.840 | I don't want to do that today.
01:50:43.840 | They're not tears of sadness.
01:50:45.520 | It's this crazy thing, like I love him so much,
01:50:47.520 | I just kind of want to explode.
01:50:49.160 | So damn it, Costello got me again and publicly again.
01:50:53.200 | So he's someplace laughing.
01:50:55.680 | So I love that.
01:50:57.920 | And I think, if I may, you know,
01:51:02.400 | do you think it's worth kids and adults
01:51:04.940 | learning to recognize those kinds of states,
01:51:08.440 | those signals that tell them they're on the right path
01:51:11.260 | by paying attention to, I don't know,
01:51:14.380 | like we said, there's sort of a deficit of language
01:51:16.100 | like eases in the body, eases in the mind.
01:51:18.460 | It's the release of, I mean, you know,
01:51:20.160 | it's not even worth exploring verbally
01:51:21.660 | because it's such a whole body, whole nervous system thing.
01:51:26.660 | - Yeah, I feel like I actually was thinking about,
01:51:31.140 | I was giving advice to a young fellow
01:51:33.160 | who is applying to graduate school recently
01:51:37.420 | and had a Zoom call with him about stuff.
01:51:41.060 | He had received some good advice from some other people.
01:51:43.540 | And then I gave him some advice.
01:51:45.640 | And I saw him speak and emote
01:51:50.640 | and with body language drop into like,
01:51:53.960 | oh yeah, that makes sense.
01:51:56.340 | Yeah, that, that feeling, of course.
01:51:59.640 | Kind of the, of course.
01:52:01.540 | And I think if you can teach people to do that,
01:52:04.460 | I don't know if the verbal communication of that is gonna,
01:52:06.940 | like you said, is that gonna do anything,
01:52:09.180 | but can you at least observe it in them as a teacher,
01:52:12.380 | as a mentor and do things.
01:52:15.180 | And when you know, when you've done it,
01:52:17.100 | because you see them drop into ease,
01:52:19.020 | do you think you can detect ease in people
01:52:20.820 | by looking at them
01:52:21.660 | and seeing their body language and everything?
01:52:23.740 | - It's gotta be an amalgam of different things,
01:52:26.060 | the cadence of their breathing, their pupil size.
01:52:28.620 | It's not worth dissecting.
01:52:29.880 | This is an experiment I would not want to run,
01:52:32.440 | but I wouldn't want to bring people into a laboratory
01:52:34.480 | and figure out, you know, what pupils of the eye dynamics
01:52:38.500 | combined with certain rhythms of breathing,
01:52:40.080 | relaxation of the shoulders, because-
01:52:41.600 | - It's too beautiful for that, isn't it?
01:52:42.440 | - It's too beautiful and it's too nuanced.
01:52:44.600 | And it's different when we're in motion
01:52:46.760 | versus when we're lying down.
01:52:48.600 | It's like, I mean, science is capable of a great many things,
01:52:52.040 | but I don't think it needs to be pointed
01:52:53.940 | at every aspect of human experience.
01:52:56.720 | I think some of these things
01:52:58.800 | are simply worth allowing to just be.
01:53:02.380 | - Do you feel the same way
01:53:03.220 | about when you have a feeling about a person,
01:53:05.220 | you meet somebody and their energy just captures you?
01:53:08.560 | It's like, wow, what a cool person, what amazing energy.
01:53:13.560 | Do you want to know the science behind that?
01:53:15.400 | I don't.
01:53:16.240 | - No, I don't.
01:53:17.120 | I think the word that comes to mind
01:53:18.680 | when I experience somebody like that
01:53:20.880 | or something like a beautiful animal,
01:53:23.460 | or you see something, the movement of something,
01:53:25.680 | or a beautiful piece of music or something
01:53:27.440 | is the word just behold.
01:53:30.300 | I just want to stop and take in as much of it as possible.
01:53:34.760 | - And here's something I know you've done,
01:53:36.280 | but I'm checking to make sure I really got this right,
01:53:38.800 | 'cause I've done it too.
01:53:40.640 | Because we sometimes get human retinas
01:53:42.400 | for doing our experiments.
01:53:43.800 | The first thing that happens when we get the human retina,
01:53:45.600 | we bring it back to our lab, it's a big production,
01:53:47.320 | everybody's getting ready to go,
01:53:48.480 | a whole bunch of moving parts going on.
01:53:51.120 | We have to open up the eye and look into it
01:53:53.760 | to see what condition it's in.
01:53:55.920 | And it's typically with a dissecting scope on a chair,
01:53:59.440 | it's open, sitting on a chair, dissecting scope,
01:54:01.420 | looking down, look into the eye.
01:54:03.020 | It is so beautiful.
01:54:05.980 | It's breathtaking.
01:54:09.280 | Each time, I've looked at the retina,
01:54:10.560 | I don't know how many times.
01:54:12.000 | Each time, wow, this is what's initiating
01:54:16.460 | all the visual experiences I've ever had in my life
01:54:18.960 | or that person ever had in their life, right?
01:54:22.480 | And the beauty just keeps coming.
01:54:25.460 | - I love it.
01:54:26.640 | And I love it because you're talking about a behold moment
01:54:31.600 | that isn't just to entertain your curiosity.
01:54:35.680 | Sure, it's that, you want to understand how the brain works,
01:54:39.220 | but also a behold moment that leads
01:54:42.200 | from that desire to understand
01:54:44.480 | to a deep level of understanding now,
01:54:46.800 | after more than two decades of exploration,
01:54:49.680 | to a mission in service to humanity,
01:54:52.440 | restore vision to the blind, develop neuroprostheses
01:54:55.720 | and other types of neuroengineering technologies
01:54:58.580 | that will allow the human brain to function better
01:55:01.840 | than it would otherwise.
01:55:02.780 | So there's real purpose there too.
01:55:04.640 | So it represents a kind of a perfect ecosystem of,
01:55:08.120 | it's not just about delighting in something
01:55:10.760 | and spending one's time there,
01:55:13.360 | that there's a real mission there.
01:55:17.040 | So I love it.
01:55:20.200 | Well, EJ, Dr. Chichilnisky.
01:55:25.640 | - Dr. Huberman.
01:55:26.480 | - I rarely get called that these days.
01:55:29.520 | When I invited you here today,
01:55:32.360 | I was absolutely sure that our listeners and viewers
01:55:35.880 | were going to get an absolutely world-class explanation
01:55:39.560 | of how the nervous system works and the retina
01:55:42.240 | and the visual system in particular,
01:55:43.640 | and that it would be delivered
01:55:46.800 | with the utmost clarity, which it was.
01:55:48.840 | So thank you.
01:55:49.820 | I know there's been so much learning in and around that,
01:55:52.500 | and you beautifully framed for us what that means
01:55:55.240 | in terms of the larger understanding
01:55:56.680 | of how the nervous system works
01:55:58.320 | and what you and other laboratories are now in a position
01:56:01.540 | to really do with that information
01:56:03.740 | and the technologies that are being built
01:56:05.860 | and that will be built.
01:56:07.400 | And the purpose in bringing you here today was just that,
01:56:11.520 | but not that alone.
01:56:12.920 | I think we hear so much about the brain
01:56:16.480 | and how it works and everyone wants to have tools
01:56:19.200 | and protocols to function better,
01:56:20.960 | but it's clear that the work that you're doing
01:56:24.260 | is headed in a direction that's going to vastly expand
01:56:26.780 | the possibilities for sake of treating human disease
01:56:29.920 | and for expanding human experience.
01:56:31.740 | I'm certain of that.
01:56:33.140 | What I did not expect, however,
01:56:36.800 | was that when I wrote down one bullet point,
01:56:40.600 | well, actually two,
01:56:41.440 | I wrote still a coffee snob, question mark?
01:56:44.400 | The answer is yes.
01:56:46.320 | And yoga, you know, that we would end up in territory
01:56:51.320 | where you would share some of your experience
01:56:53.520 | that I myself was not aware of about this.
01:56:56.200 | A bit of a wandering of three different PhD programs
01:57:00.860 | and of this cultivation of an intuitive sense
01:57:04.440 | of beauty and taste and preference
01:57:07.840 | that the way you described it
01:57:10.680 | takes you out of your rational mind
01:57:12.960 | and into the aspects of your nervous system
01:57:15.780 | that just really act as a compass
01:57:19.000 | toward what is absolutely right for you.
01:57:21.960 | And we're all so lucky that what's absolutely right for you
01:57:24.840 | turns out to also be what's absolutely right
01:57:28.240 | and beneficial for the world.
01:57:30.960 | So thank you for coming here today.
01:57:33.400 | Thank you for sharing your knowledge and your heart
01:57:38.480 | and for doing it with such an incredible degree
01:57:41.920 | of openness and respect.
01:57:44.000 | So thank you so much.
01:57:46.320 | - Thank you, Andrew.
01:57:47.140 | It's a great pleasure.
01:57:48.200 | Really appreciate it.
01:57:49.800 | - Thank you for joining me for today's discussion
01:57:51.640 | with Dr. E.J. Chichilnisky.
01:57:53.440 | To learn more about the work in the Chichilnisky Lab
01:57:55.840 | and to find links to E.J.'s social media handles,
01:57:58.400 | please see the links in the show note captions.
01:58:00.800 | If you're learning from and/or enjoying this podcast,
01:58:03.280 | please subscribe to our YouTube channel.
01:58:05.120 | That's a terrific zero-cost way to support us.
01:58:07.600 | In addition, please subscribe to the podcast
01:58:09.760 | on both Spotify and Apple.
01:58:11.540 | And on both Spotify and Apple,
01:58:12.920 | you can leave us up to a five-star review.
01:58:15.280 | Please check out the sponsors mentioned at the beginning
01:58:17.600 | and throughout today's episode.
01:58:19.040 | That's the best way to support this podcast.
01:58:21.720 | If you have questions for me or comments about the podcast
01:58:24.280 | or guests you'd like me to consider
01:58:25.720 | or topics you'd like me to consider
01:58:27.200 | for the "Huberman Lab" podcast,
01:58:28.680 | please put those in the comment section on YouTube.
01:58:31.120 | I do read all the comments.
01:58:32.840 | Not during today's episode,
01:58:34.020 | but on many previous episodes of the "Huberman Lab" podcast,
01:58:36.660 | we discuss supplements.
01:58:38.100 | While supplements aren't necessary for everybody,
01:58:40.240 | many people derive tremendous benefit from them
01:58:42.360 | for things like improving sleep,
01:58:43.880 | for hormone support, and for improving focus.
01:58:46.320 | To learn more about the supplements
01:58:47.560 | discussed on the "Huberman Lab" podcast,
01:58:49.360 | go to Live Momentous, spelled O-U-S,
01:58:51.600 | so it's livemomentous.com/huberman.
01:58:54.600 | If you're not already following me on social media,
01:58:56.680 | I am @hubermanlab on all social media platforms.
01:58:59.360 | That's Instagram, X (formerly Twitter), LinkedIn,
01:59:02.160 | Facebook, and Threads.
01:59:03.520 | And at all of those places,
01:59:04.860 | I discuss science and science-related tools,
01:59:07.320 | some of which overlap with the content
01:59:08.840 | of the "Huberman Lab" podcast,
01:59:10.040 | but much of which are distinct from the content
01:59:12.200 | of the "Huberman Lab" podcast.
01:59:13.400 | So again, it's @hubermanlab on all social media channels.
01:59:16.380 | If you haven't already subscribed
01:59:17.520 | to our Neural Network newsletter,
01:59:19.160 | the Neural Network newsletter
01:59:20.440 | is our zero-cost monthly newsletter
01:59:22.640 | that has podcast summaries and protocols
01:59:24.680 | in the form of brief one-to-three-page PDFs
01:59:27.880 | that spell out for you in very clear terms
01:59:30.600 | how, for instance, to optimize your physical health,
01:59:33.900 | how to optimize dopamine, how to improve your sleep,
01:59:36.920 | how to learn better through neuroplasticity,
01:59:39.080 | deliberate cold exposure, heat exposure, and on and on.
01:59:41.920 | Again, all completely zero cost.
01:59:43.500 | To sign up for the Neural Network newsletter,
01:59:45.360 | you simply go to hubermanlab.com, go to the Menu tab,
01:59:48.320 | scroll down to Newsletter, and enter your email.
01:59:50.640 | We do not share your email with anybody.
01:59:53.000 | Thank you once again for joining me
01:59:54.320 | for today's discussion with Dr. E.J. Chichilnisky.
01:59:57.620 | And last, but certainly not least,
01:59:59.680 | thank you for your interest in science.
02:00:01.520 | (upbeat music)
02:00:04.100 | (upbeat music)