
Mark Zuckerberg & Dr. Priscilla Chan: Curing All Human Diseases & the Future of Health & Technology


Chapters

0:00 Mark Zuckerberg & Dr. Priscilla Chan
2:15 Sponsors: Eight Sleep & LMNT; The Brain Body Contract
5:35 Chan Zuckerberg Initiative (CZI) & Human Disease Research
8:51 Innovation & Discovery, Science & Engineering
12:53 Funding, Building Tools & Imaging
17:57 Healthy vs. Diseased Cells, Human Cell Atlas & AI, Virtual Cells
21:59 Single Cell Methods & Disease; CELLxGENE Tool
28:22 Sponsor: AG1
29:53 AI & Hypothesis Generation; Long-term Projects & Collaboration
35:14 Large Language Models (LLMs), In Silico Experiments
42:11 CZI Biohubs, Chicago, New York
50:52 Universities & Biohubs; Therapeutics & Rare Diseases
57:23 Optimism; Children & Families
66:21 Sponsor: InsideTracker
67:25 Technology & Health, Positive & Negative Interactions
73:17 Algorithms, Clickbait News, Individual Experience
79:17 Parental Controls, Meta Social Media Tools & Tailoring Experience
84:51 Time, Usage & Technology, Parental Tools
88:55 Virtual Reality (VR), Mixed Reality Experiences & Smart Glasses
96:09 Physical Exercise & Virtual Product Development
104:19 Virtual Futures for Creativity & Social Interactions
109:31 Ray-Ban Meta Smart Glasses: Potential, Privacy & Risks
120:20 Visual System & Smart Glasses, Augmented Reality
126:42 AI Assistants & Creators, Identity Protection
133:26 Zero-Cost Support, Spotify & Apple Reviews, Sponsors, YouTube Feedback, Momentous, Social Media, Neural Network Newsletter

Whisper Transcript

00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.280 | where we discuss science and science-based tools
00:00:04.880 | for everyday life.
00:00:05.900 | I'm Andrew Huberman,
00:00:10.360 | and I'm a professor of neurobiology and ophthalmology
00:00:13.540 | at Stanford School of Medicine.
00:00:15.240 | My guests today are Mark Zuckerberg and Dr. Priscilla Chan.
00:00:19.080 | Mark Zuckerberg, as everybody knows,
00:00:21.080 | founded the company Facebook.
00:00:22.760 | He is now the CEO of Meta, which includes Facebook,
00:00:26.380 | Instagram, WhatsApp, and other technology platforms.
00:00:29.720 | Dr. Priscilla Chan graduated from Harvard
00:00:32.080 | and went on to do her medical degree
00:00:34.280 | at the University of California, San Francisco.
00:00:37.020 | Mark Zuckerberg and Dr. Priscilla Chan
00:00:39.160 | are married and the co-founders of the CZI,
00:00:41.980 | or Chan Zuckerberg Initiative,
00:00:44.000 | a philanthropic organization whose stated goal
00:00:46.460 | is to cure all human diseases.
00:00:49.020 | The Chan Zuckerberg Initiative is accomplishing that
00:00:51.360 | by providing critical funding not available elsewhere,
00:00:54.520 | as well as a novel framework for discovery
00:00:57.120 | of the basic functioning of cells,
00:00:59.600 | cataloging all the different human cell types,
00:01:02.480 | as well as providing AI or artificial intelligence platforms
00:01:05.680 | to mine all of that data to discover new pathways
00:01:08.640 | and cures for all human diseases.
00:01:11.260 | The first hour of today's discussion
00:01:13.340 | is held with both Dr. Priscilla Chan and Mark Zuckerberg,
00:01:16.400 | during which we discuss the CZI
00:01:18.520 | and what it really means to try and cure all human diseases.
00:01:22.040 | We talk about the motivational backbone for the CZI
00:01:24.700 | that extends well into each of their personal histories.
00:01:27.560 | Indeed, you'll learn quite a lot about Dr. Priscilla Chan,
00:01:30.600 | who has, I must say, an absolutely incredible family story
00:01:34.060 | leading up to her role as a physician
00:01:36.040 | and her motivations for the CZI and beyond.
00:01:38.800 | And you'll learn from Mark
00:01:40.120 | how he's bringing an engineering and AI perspective
00:01:42.860 | to the discovery of new cures for human disease.
00:01:45.840 | The second half of today's discussion
00:01:47.400 | is just between Mark Zuckerberg and me,
00:01:49.700 | during which we discuss various meta platforms,
00:01:52.200 | including, of course, social media platforms
00:01:54.360 | and their effects on mental health in children and adults.
00:01:57.480 | We also discuss VR, virtual reality,
00:01:59.980 | as well as augmented and mixed reality.
00:02:02.560 | And we discuss AI, artificial intelligence,
00:02:05.320 | and how it stands to transform
00:02:06.920 | not just our online experiences with social media
00:02:09.440 | and other technologies,
00:02:10.880 | but how it stands to potentially transform
00:02:13.000 | every aspect of everyday life.
00:02:15.540 | Before we begin, I'd like to emphasize that this podcast
00:02:18.300 | is separate from my teaching and research roles at Stanford.
00:02:20.820 | It is, however, part of my desire and effort
00:02:22.900 | to bring zero cost to consumer information about science
00:02:25.360 | and science-related tools to the general public.
00:02:27.980 | In keeping with that theme,
00:02:29.000 | I'd like to thank the sponsors of today's podcast.
00:02:31.880 | Our first sponsor is Eight Sleep.
00:02:33.940 | Eight Sleep makes smart mattress covers
00:02:35.540 | with cooling, heating, and sleep tracking capacity.
00:02:38.820 | I've spoken many times before on this podcast
00:02:40.720 | about the fact that getting a great night's sleep
00:02:43.120 | really is the foundation of mental health,
00:02:44.840 | physical health, and performance.
00:02:46.400 | One of the key things to getting a great night's sleep
00:02:48.560 | is to make sure that the temperature
00:02:49.940 | of your sleeping environment is correct.
00:02:51.680 | And that's because in order to fall and stay deeply asleep,
00:02:54.280 | your body temperature actually has to drop
00:02:55.920 | by about one to three degrees.
00:02:57.640 | And in order to wake up feeling refreshed and energized,
00:03:00.760 | your body temperature actually has to increase
00:03:02.700 | by about one to three degrees.
00:03:04.400 | With Eight Sleep, you can program the temperature
00:03:06.080 | of your sleeping environment in the beginning, middle,
00:03:08.400 | and end of your night.
00:03:09.680 | It has a number of other features,
00:03:10.840 | like tracking the amount of rapid eye movement
00:03:12.680 | and slow wave sleep that you get,
00:03:14.080 | things that are essential to really dialing in
00:03:16.280 | the perfect night's sleep for you.
00:03:17.880 | I've been sleeping on an Eight Sleep mattress cover
00:03:19.480 | for well over two years now,
00:03:21.000 | and it has greatly improved my sleep.
00:03:23.040 | I fall asleep far more quickly.
00:03:24.840 | I wake up far less often in the middle of the night,
00:03:27.040 | and I wake up feeling far more refreshed
00:03:29.060 | than I ever did prior to using an Eight Sleep mattress cover.
00:03:32.480 | If you'd like to try Eight Sleep,
00:03:33.740 | you can go to eightsleep.com/huberman
00:03:36.400 | to save $150 off their Pod 3 Cover.
00:03:39.280 | Eight Sleep currently ships to the USA, Canada, UK,
00:03:42.080 | select countries in the EU, and Australia.
00:03:44.160 | Again, that's eightsleep.com/huberman.
00:03:47.040 | Today's episode is also brought to us by Element.
00:03:49.800 | Element is an electrolyte drink
00:03:51.240 | that has everything you need and nothing you don't.
00:03:53.360 | That means plenty of electrolytes,
00:03:54.760 | sodium, magnesium, and potassium, and no sugar.
00:03:57.880 | The electrolytes are absolutely essential
00:03:59.580 | for the functioning of every cell in your body,
00:04:01.440 | and your neurons, your nerve cells,
00:04:02.880 | rely on sodium, magnesium, and potassium
00:04:05.080 | in order to communicate with one another
00:04:06.720 | electrically and chemically.
00:04:08.240 | Element contains the optimal ratio of electrolytes
00:04:10.400 | for the functioning of neurons
00:04:11.920 | and the other cells of your body.
00:04:13.360 | Every morning, I drink a packet of Element
00:04:15.180 | dissolved in about 32 ounces of water.
00:04:17.800 | I do that just for general hydration
00:04:20.000 | and to make sure that I have adequate electrolytes
00:04:22.160 | for any activities that day.
00:04:23.680 | I'll often also have an Element packet,
00:04:25.520 | or even two packets, in 32 to 60 ounces of water
00:04:29.020 | if I'm exercising very hard,
00:04:30.560 | and certainly if I'm sweating a lot
00:04:32.440 | in order to make sure that I replace those electrolytes.
00:04:35.000 | If you'd like to try Element,
00:04:36.160 | you can go to drinklmnt.com/huberman
00:04:40.060 | to get a free sample pack with your purchase.
00:04:41.960 | Again, that's drinklmnt.com/huberman.
00:04:45.800 | I'm pleased to announce that we will be hosting
00:04:47.720 | four live events in Australia,
00:04:50.100 | each of which is entitled The Brain-Body Contract,
00:04:52.920 | during which I will share science and science-related tools
00:04:55.800 | for mental health, physical health, and performance.
00:04:58.520 | There will also be a live question and answer session.
00:05:01.480 | We have limited tickets still available
00:05:03.120 | for the event in Melbourne on February 10th,
00:05:05.800 | as well as the event in Brisbane on February 24th.
00:05:09.280 | Our event in Sydney at the Sydney Opera House
00:05:11.420 | sold out very quickly, so as a consequence,
00:05:14.160 | we've now scheduled a second event in Sydney
00:05:16.680 | at the Aware Super Theater on February 18th.
00:05:19.680 | To access tickets to any of these events,
00:05:21.880 | you can go to hubermanlab.com/events
00:05:25.240 | and use the code Huberman at checkout.
00:05:27.600 | I hope to see you there, and as always,
00:05:29.780 | thank you for your interest in science.
00:05:31.680 | And now for my discussion with Mark Zuckerberg
00:05:34.280 | and Dr. Priscilla Chan.
00:05:35.840 | Priscilla and Mark, so great to meet you,
00:05:37.680 | and thank you for having me here in your home.
00:05:39.960 | - Oh, thanks for having us on the podcast.
00:05:41.360 | - Yeah.
00:05:42.520 | - I'd like to talk about the CZI,
00:05:44.480 | the Chan-Zuckerberg Initiative.
00:05:46.160 | I learned about this a few years ago
00:05:48.120 | when my lab was and still is now at Stanford
00:05:52.120 | as a very exciting philanthropic effort
00:05:54.920 | that has a truly big mission.
00:05:58.000 | I can't imagine a bigger mission.
00:06:00.120 | So maybe you could tell us what that big mission is,
00:06:01.680 | and then we can get into some of the mechanics
00:06:03.280 | of how that big mission can become a reality.
00:06:08.280 | - So like you're mentioning, in 2015,
00:06:14.320 | we launched the Chan-Zuckerberg Initiative,
00:06:16.880 | and what we were hoping to do at CZI
00:06:19.800 | was think about how do we build a better future for everyone,
00:06:23.080 | and looking for ways where we can contribute
00:06:25.440 | the resources that we have to bring philanthropically,
00:06:28.600 | and the experiences that Mark and I have had,
00:06:31.400 | for me as a physician and educator,
00:06:33.280 | for Mark as an engineer, and then our ability
00:06:37.160 | to bring teams together to build builders.
00:06:40.000 | Mark has been a builder throughout his career,
00:06:43.800 | and what could we do if we actually put together a team
00:06:46.880 | to build tools, do great science?
00:06:50.600 | And so within our science portfolio,
00:06:53.000 | we've really been focused on what some people think
00:06:56.840 | is either an incredibly audacious goal
00:06:59.640 | or an inevitable goal, but I think about it
00:07:02.920 | as something that will happen
00:07:04.400 | if we sort of continue focusing on it,
00:07:06.640 | which is to be able to cure, prevent, or manage
00:07:08.960 | all disease by the end of the century.
00:07:10.840 | - All disease. - All disease.
00:07:12.480 | So that's important, right?
00:07:13.600 | And a lot of times people ask like, which disease?
00:07:16.200 | And the whole point is that there is not one disease,
00:07:19.640 | and it's really about taking a step back
00:07:22.880 | to where I always found the most hope as a physician,
00:07:26.040 | which is new discoveries and new opportunities
00:07:29.680 | and new ways of understanding how to keep people well
00:07:32.800 | come from basic science.
00:07:34.480 | So our strategy at CZI is really to build tools,
00:07:39.280 | fund science, change the way basic scientists
00:07:43.800 | can see the world and how they can move quickly
00:07:47.640 | in their discoveries.
00:07:49.560 | And so that's what we launched in 2015.
00:07:53.760 | We do work in three ways.
00:07:55.840 | We fund great scientists.
00:07:58.800 | We build tools right now, software tools,
00:08:03.000 | to help move science along and make it easier
00:08:06.160 | for scientists to do their work, and we do science.
00:08:09.680 | You mentioned Stanford being an important pillar
00:08:12.000 | for our science work.
00:08:13.440 | We've built what we call bio hubs,
00:08:15.920 | institutes where teams can take on grand challenges
00:08:20.360 | to do work that wouldn't be possible in a single lab
00:08:25.080 | or within a single discipline.
00:08:27.000 | And our first bio hub was launched in San Francisco,
00:08:30.240 | a collaboration between Stanford, UC Berkeley, and UCSF.
00:08:35.800 | - Amazing.
00:08:36.640 | Curing all diseases implies that there will either be
00:08:42.720 | a ton of knowledge gleaned from this effort,
00:08:44.920 | which I'm certain there will be, and there already has been.
00:08:47.640 | We can talk about some of those early successes in a moment,
00:08:51.240 | but it also sort of implies that if we can understand
00:08:54.420 | some basic operations of diseases and cells
00:08:59.200 | that transcend autism, Huntington's, Parkinson's, cancer,
00:09:04.820 | and any other disease that perhaps
00:09:07.300 | there are some core principles that would make
00:09:09.980 | the big mission a reality, so to speak.
00:09:13.960 | What I'm basically saying is, how are you attacking this?
00:09:16.860 | My belief is that the cell sits at the center
00:09:20.900 | of all discussion about disease,
00:09:22.960 | given that our body is made up of cells
00:09:24.840 | and different types of cells.
00:09:26.300 | So maybe you could just illuminate for us
00:09:29.560 | a little bit of what the cell is in your mind
00:09:34.680 | as it relates to disease and how one goes
00:09:37.400 | about understanding disease in the context of cells,
00:09:40.060 | because ultimately that's what we're made up of.
00:09:42.600 | - Yeah, well, let's get to the cell thing in a moment,
00:09:45.080 | but just even taking a step back from that,
00:09:47.260 | we don't think that it's CZI that we're going to cure,
00:09:50.680 | prevent, or manage all diseases.
00:09:51.940 | The goal is to basically give the scientific community
00:09:54.980 | and scientists around the world the tools
00:09:57.520 | to accelerate the pace of science.
00:09:59.200 | And we spent a lot of time when we were getting started
00:10:02.400 | with this looking at the history of science
00:10:05.440 | and trying to understand the trends
00:10:06.840 | and how they've played out over time.
00:10:08.240 | And if you look over this very long-term arc,
00:10:11.880 | most large-scale discoveries are preceded
00:10:15.240 | by the invention of a new tool or a new way to see something.
00:10:18.320 | And it's not just in biology, right?
00:10:19.600 | It's like having a telescope came before a lot
00:10:22.680 | of discoveries in astronomy and astrophysics,
00:10:26.660 | but similarly, the microscope and just different ways
00:10:30.440 | to observe things, or different platforms,
00:10:32.180 | like the ability to do vaccines,
00:10:33.540 | preceded the ability to kind of cure
00:10:35.540 | a lot of different things.
00:10:36.840 | So this is sort of the engineering part
00:10:39.680 | that you were talking about, about building tools.
00:10:41.460 | We view our goal as to try to bring together
00:10:46.000 | some scientific and engineering knowledge
00:10:48.240 | to build tools that empower the whole field.
00:10:50.440 | And that's sort of the big arc,
00:10:52.920 | and a lot of the things that we're focused on,
00:10:54.840 | including the work in single-cell and cell understanding,
00:10:58.380 | which you can jump in and get into that if you want.
00:11:02.920 | But yeah, I think we generally agree with the premise
00:11:07.000 | that if you wanna understand this stuff
00:11:08.640 | from first principles, people study organs a lot,
00:11:12.600 | right, they study kind of how things present
00:11:15.120 | across the body, but there's not a very widespread
00:11:18.880 | understanding of how kind of each cell operates.
00:11:21.480 | And this is sort of a big part of some of the initial work
00:11:24.640 | that we tried to do on the Human Cell Atlas,
00:11:26.680 | and understanding what are the different cells,
00:11:28.280 | and there's a bunch more work that we wanna do
00:11:30.880 | to carry that forward.
00:11:31.880 | But overall, I think when we think about the next 10 years
00:11:36.680 | here of this long arc to try to empower the community
00:11:39.880 | to be able to cure, prevent, or manage all diseases,
00:11:44.240 | we think that the next 10 years should really be primarily
00:11:47.720 | about being able to measure and observe more things
00:11:50.760 | in human biology.
00:11:51.680 | There are a lot of limits to that.
00:11:52.840 | It's like you wanna look at something through a microscope,
00:11:54.720 | you can't usually see living tissues,
00:11:57.280 | because it's hard to see through skin, or things like that.
00:12:00.160 | So there are a lot of different techniques
00:12:03.640 | that will help us observe different things,
00:12:05.280 | and this is sort of where the engineering background
00:12:07.980 | comes in a bit, because when I think about this
00:12:11.040 | as from the perspective of how you'd write code or something,
00:12:14.320 | you know, the idea of trying to debug or fix a code base,
00:12:17.200 | but not be able to step through the code line by line,
00:12:19.380 | it's not gonna happen, right?
00:12:21.240 | And at the beginning of any big project that we do at Meta,
00:12:25.560 | you know, we like to spend a bunch of the time up front
00:12:27.280 | just trying to instrument things,
00:12:28.320 | and understand, you know, what are we gonna look at,
00:12:30.120 | and how are we gonna measure things before,
00:12:32.760 | so we know we're making progress, and know what to optimize.
00:12:35.000 | And this is such a long-term journey that we think
00:12:38.400 | that it actually makes sense to take the next 10 years
00:12:40.400 | to build those kind of tools for biology,
00:12:43.680 | and understanding just how the human body works in action,
00:12:48.460 | and a big part of that is cells.
00:12:50.040 | I don't know, do you wanna jump and talk
00:12:52.200 | about some of the efforts?
00:12:53.240 | - Could I just interrupt briefly and just ask about
00:12:56.200 | the different interventions, so to speak,
00:13:00.760 | that CZI is a unique position to bring to the quest
00:13:05.380 | to cure all diseases.
00:13:06.720 | So I can think of, I mean, I know as a scientist
00:13:10.000 | that money is necessary, but not sufficient, right?
00:13:13.000 | Like when you have money, you can hire more people,
00:13:14.620 | you can try different things, so that's critical,
00:13:17.440 | but a lot of philanthropy includes money.
00:13:22.060 | The other component is you wanna be able to see things,
00:13:26.100 | as you pointed out, so you wanna know
00:13:28.120 | the normal disease process, like what is a healthy cell?
00:13:31.300 | What's a diseased cell?
00:13:32.480 | Are cells constantly being bombarded with challenges
00:13:35.500 | and then repairing those, and then what we call cancer
00:13:37.960 | is just kind of the runaway train of those challenges
00:13:39.940 | not being met by the cell itself or something like that.
00:13:42.920 | So better imaging tools, and then it sounds like
00:13:45.720 | there's not just a hardware component,
00:13:47.600 | but a software component, this is where AI comes in.
00:13:50.100 | So maybe we can, at some point, we can break this up
00:13:52.560 | into three different avenues.
00:13:54.040 | One is understanding disease processes
00:13:56.400 | and healthy processes, we'll lump those together.
00:13:58.720 | Then there's hardware, so microscopes, lenses,
00:14:02.680 | digital deconvolution, ways of seeing things
00:14:05.040 | in bolder relief and more precision.
00:14:08.520 | And then there's how to manage all the data.
00:14:12.260 | And then I love the idea that maybe AI could do
00:14:15.160 | what human brains can't do alone,
00:14:17.440 | namely, manage understanding of the data.
00:14:19.240 | 'Cause it's one thing to organize data,
00:14:20.600 | it's another to say, oh, you know, this,
00:14:22.640 | as you point out in the analogy with code,
00:14:24.180 | that this particular gene and that particular gene
00:14:27.820 | are potentially interesting, whereas a human being
00:14:30.300 | would never make that potential connection, yeah.
00:14:33.360 | - So the tools that CZI can bring to the table,
00:14:36.520 | we fund science like you're talking about.
00:14:39.400 | And we try to, there's lots of ways to fund science.
00:14:43.520 | And just to be clear, what we fund is a tiny fraction
00:14:48.280 | of what the NIH funds, for instance.
00:14:50.440 | - You guys have been generous enough that it's,
00:14:53.040 | it definitely holds weight relative to NIH's contribution.
00:14:58.040 | - Yeah, but I think every funder has its own role
00:15:01.560 | in the ecosystem.
00:15:02.400 | And for us, it's really how do we incentivize
00:15:04.640 | new points of view?
00:15:05.520 | How do we incentivize collaboration?
00:15:07.240 | How do we incentivize open science?
00:15:09.760 | And so a lot of our grants include inviting people
00:15:14.440 | to look in, look at different fields.
00:15:17.080 | Our first neuroscience RFA was aimed towards incentivizing
00:15:22.080 | people from different backgrounds, immunologists,
00:15:24.960 | microbiologists, to come and look at how
00:15:28.320 | our nervous system works and how to keep it healthy.
00:15:31.340 | Or we ask that our grantees participate
00:15:35.280 | in the pre-print movement to accelerate the rate
00:15:37.400 | of sharing knowledge and actually others being able
00:15:40.640 | to build upon science.
00:15:42.360 | So that's the funding that we do.
00:15:45.040 | In terms of building, we build software and hardware,
00:15:50.000 | like you mentioned.
00:15:51.080 | We put together teams that can build tools
00:15:54.720 | that are more durable and scalable than someone
00:15:59.400 | in a single lab might be incentivized to do.
00:16:01.880 | There's a ton of great ideas and nowadays,
00:16:05.400 | most scientists can tinker and build something useful
00:16:08.400 | for their lab.
00:16:09.460 | But it's really hard for them to be able to share
00:16:12.960 | that tool sometimes beyond their own laptop
00:16:15.600 | or forget the next lab over or across the globe.
00:16:19.760 | So we partner with scientists to see what is useful,
00:16:23.400 | what kinds of tools.
00:16:24.680 | In imaging, napari is a useful image annotation tool
00:16:29.680 | that is born from an open-source community.
00:16:34.200 | And how can we contribute to that?
00:16:36.160 | Or CELLxGENE, which works on single-cell data sets,
00:16:40.320 | and how can we make it a useful tool
00:16:43.200 | so that scientists can share data sets, analyze their own
00:16:46.760 | and contribute to a larger corpus of information?
00:16:50.800 | So we have software teams that are building,
00:16:54.400 | collaborating with scientists to make sure
00:16:56.060 | that we're building easy to use, durable, translatable tools
00:17:00.800 | across the scientific community in the areas that we work in.
00:17:04.480 | We also have institutes.
00:17:05.920 | This is where the imaging work comes in,
00:17:09.100 | where, you know, we are proud owners
00:17:11.500 | of an electron microscope right now.
00:17:13.840 | And it's gonna be installed at our imaging institute
00:17:18.840 | and that will really contribute to a way
00:17:21.360 | where we can see work differently.
00:17:23.040 | But more hardware does need to be developed.
00:17:26.840 | You know, we're partnering with fantastic scientists
00:17:29.720 | in the Biohub network to build a mini phase plate
00:17:35.720 | to align the electrons
00:17:38.840 | through the electron microscope
00:17:43.120 | to be able to increase the resolution
00:17:44.680 | so we can see in sharper detail.
00:17:47.680 | So there's a lot of innovative work
00:17:49.560 | within the network that's happening.
00:17:51.360 | And these institutes have grand challenges
00:17:55.940 | that they're working on.
00:17:57.840 | Back to your question about cells.
00:18:00.040 | Cells are just the smallest unit that are alive.
00:18:04.120 | And all of our bodies have many, many, many cells.
00:18:09.120 | There's some estimate of like 37 trillion cells,
00:18:14.340 | different cells in your body.
00:18:15.800 | And what are they all doing?
00:18:17.300 | And what do they look like when you're healthy?
00:18:19.920 | What do they look like when you're sick?
00:18:21.880 | And where we're at right now
00:18:25.200 | with our understanding of cells
00:18:28.200 | and what happens when you get sick
00:18:30.940 | is basically we've gotten pretty good
00:18:34.780 | from the Human Genome Project
00:18:36.540 | looking at how different mutations in your genetic code
00:18:39.920 | lead for you to be more susceptible to get sick
00:18:42.520 | or directly cause you to get sick.
00:18:44.760 | So we go from a mutation in your DNA
00:18:48.000 | to wow, you now have Huntington's disease, for instance.
00:18:53.000 | And there's a lot that happens in the middle.
00:18:56.760 | And that's one of the questions
00:18:58.080 | that we're going after at CZI is what actually happens?
00:19:02.800 | So an analogy that I like to use to share with my friends
00:19:05.820 | is right now, say we have a recipe for a cake.
00:19:09.080 | We know there's a typo in the recipe.
00:19:11.400 | And then the cake is awful.
00:19:14.600 | That's all we know.
00:19:15.720 | We don't know how the chef interprets the typo.
00:19:18.840 | We don't know what happens in the oven.
00:19:20.940 | And we don't actually know sort of
00:19:23.160 | how it's exactly connected to how the cake didn't turn out,
00:19:26.280 | how you had expected.
00:19:28.800 | A lot of that is unknown,
00:19:31.960 | but we can actually systematically try to break this down.
00:19:35.520 | And one segment of that journey that we're looking at
00:19:38.840 | is how that mutation gets translated
00:19:41.680 | and acted upon in your cells.
00:19:43.940 | And all of your cells have what's called mRNA.
00:19:47.880 | mRNA are the actual instructions
00:19:51.480 | that are taken from the DNA.
00:19:53.640 | And what our work in single cell is looking at
00:19:56.500 | how every cell in your body
00:19:59.760 | is actually interpreting your DNA slightly differently.
00:20:04.280 | And what happens when healthy cells
00:20:06.500 | are interpreting the DNA instructions
00:20:08.140 | and when sick cells are interpreting those directions.
00:20:11.360 | And that is a ton of data.
00:20:13.520 | I just told you there's 37 trillion cells.
00:20:16.120 | There's different large sets of mRNA in each cell.
00:20:21.120 | But the work that we've been funding
00:20:22.760 | is looking at how, first of all, gathering that information.
00:20:27.760 | We've been incredibly lucky
00:20:31.200 | to be part of a very fast moving field
00:20:35.520 | where we've gone from in 2017 funding some methods work
00:20:40.520 | to now having really not complete,
00:20:43.760 | but nearly complete atlases of how the human body works,
00:20:47.400 | how flies work, how mice work at the single cell level,
00:20:52.720 | and being able to then try to piece together
00:20:55.360 | like how does that all come together
00:20:57.880 | and when you're healthy and when you're sick.
00:21:00.240 | And the neat thing about the sort of inflection point
00:21:05.000 | where we're at in AI is that I can't look at this data
00:21:09.200 | and make sense of it.
00:21:10.240 | There's just too much of it.
00:21:11.600 | And biology is complex, human bodies are complex.
00:21:15.060 | We need this much information.
00:21:17.440 | But the use of large language models
00:21:20.120 | can help us actually look at that data
00:21:22.640 | and gain insights, look at what trends are consistent
00:21:27.640 | with health and what trends are unsuspected.
00:21:32.720 | And eventually our hope through the use of these data sets
00:21:37.760 | that we've helped curate
00:21:39.160 | in the application of large language models
00:21:41.220 | is to be able to formulate a virtual cell,
00:21:43.960 | a cell that's completely built off of the data sets
00:21:49.000 | of what we know about the human body,
00:21:51.160 | but allows us to manipulate and learn faster
00:21:53.720 | and try new things to help move science
00:21:57.080 | and then medicine along.
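To make the single-cell idea above a bit more concrete, here is a minimal Python sketch of the kind of hypothesis-generating step being described: ranking genes whose average mRNA expression differs between healthy and diseased cells. This is illustrative only, not CZI's actual pipeline; the data, column names, and fold-change ranking are hypothetical placeholders.

```python
# Minimal sketch: rank genes by how differently they are expressed in
# "healthy" vs. "diseased" cells in a (hypothetical) single-cell table.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
genes = [f"GENE_{i}" for i in range(5)]

# Rows = individual cells, columns = gene expression counts, plus a label.
expr = pd.DataFrame(rng.poisson(5, size=(200, len(genes))), columns=genes)
expr["status"] = rng.choice(["healthy", "diseased"], size=200)

# Mean expression of each gene within each group of cells.
means = expr.groupby("status")[genes].mean()

# Log2 fold change (diseased vs. healthy) as a crude candidate ranking.
log2_fc = np.log2((means.loc["diseased"] + 1) / (means.loc["healthy"] + 1))
print(log2_fc.abs().sort_values(ascending=False))  # genes to follow up on
```

Each high-ranking gene is only a hypothesis; as the conversation goes on to emphasize, someone still has to validate it at the bench.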
00:21:59.000 | - Do you think we've cataloged the total number
00:22:03.880 | of different cell types?
00:22:05.820 | Every week I look at great journals
00:22:07.640 | like Cell, Nature and Science.
00:22:08.900 | And for instance, I saw recently
00:22:10.660 | that using single cell sequencing,
00:22:13.240 | they've categorized 18 plus different types of fat cells.
00:22:18.120 | We always think of like a fat cell versus a muscle cell.
00:22:20.200 | So now you've got 18 types.
00:22:22.520 | Each one is going to express many,
00:22:24.200 | many different genes and RNAs, mRNAs.
00:22:27.640 | And perhaps one of them is responsible
00:22:31.360 | for what we see in the advanced type two diabetes
00:22:35.040 | or in other forms of obesity
00:22:37.480 | or where people can't lay down fat cells,
00:22:39.280 | which turns out to be just as detrimental
00:22:41.240 | in those extreme cases.
00:22:42.920 | So now you've got all these lists of genes,
00:22:45.000 | but I always thought of single cell sequencing
00:22:48.800 | as necessary, but not sufficient.
00:22:51.160 | But you need the information,
00:22:52.760 | but it doesn't resolve the problem.
00:22:54.300 | And I think of it more as a hypothesis
00:22:57.360 | generating experiment.
00:22:59.400 | Okay, so you have all these genes and you could say,
00:23:00.760 | wow, this gene is particularly elevated
00:23:03.040 | in the diabetic's cell type of, let's say,
00:23:07.240 | one of these fat cells or muscle cells for that matter,
00:23:10.000 | whereas it's not in non-diabetics.
00:23:11.980 | So then of the millions of different cells,
00:23:15.720 | maybe only five of them differ dramatically.
00:23:19.640 | So then you generate a hypothesis,
00:23:21.220 | oh, it's the ones that differ dramatically
00:23:22.560 | that are important, but maybe one of those genes
00:23:25.380 | when it's only 50% changed has a huge effect
00:23:30.240 | because of some network biology effect.
00:23:32.340 | And so I guess what I'm trying to get to here is,
00:23:35.420 | how does one meet that challenge
00:23:37.600 | and can AI help resolve that challenge
00:23:40.080 | by essentially placing those lists of genes
00:23:42.160 | into 10,000 hypotheses?
00:23:45.260 | 'Cause I'll tell you that the graduate students
00:23:46.560 | and postdocs in my lab get a chance to test
00:23:48.760 | one hypothesis at a time.
00:23:50.420 | And that's really the challenge, let alone one lab.
00:23:52.580 | And so for those that are listening to this
00:23:55.060 | and hopefully it's not getting outside the scope
00:23:56.940 | of kind of like standard understanding
00:23:59.200 | or the understanding we've generated here,
00:24:01.040 | but we're basically saying is you have to pick at some point
00:24:04.280 | more data always sounds great,
00:24:05.780 | but then how do you decide what to test?
00:24:08.180 | - So, no, we don't know all the cell types.
00:24:10.960 | I think one thing that was really exciting
00:24:14.520 | when we first launched this work was cystic fibrosis.
00:24:18.720 | Cystic fibrosis is caused by a mutation in CFTR.
00:24:21.640 | That's pretty well known.
00:24:23.000 | It affects a certain channel that makes it hard
00:24:25.440 | for mucus to be cleared.
00:24:26.840 | That's the basics of cystic fibrosis.
00:24:28.960 | When I went to medical school, it was taught as fact.
00:24:31.300 | - So their lungs fill up with fluid.
00:24:32.800 | These are people carrying around sacks of fluid filling up.
00:24:35.760 | I've known people, I've worked with people
00:24:37.640 | and then they have to literally dump the fluid out.
00:24:39.720 | They can't run or do an intense exercise.
00:24:41.900 | Life is shorter.
00:24:42.740 | - Life is shorter.
00:24:43.840 | And when we applied single cell methodologies to the lungs,
00:24:48.220 | they discovered an entirely new cell type
00:24:51.680 | that actually is affected by the CF mutation,
00:24:54.840 | the cystic fibrosis mutation,
00:24:58.360 | that actually changes the paradigm
00:24:59.880 | of how we think about cystic fibrosis.
00:25:01.600 | - Amazing. - Just unknown.
00:25:02.900 | So I don't think we know all the cell types.
00:25:06.400 | I think we'll continue to discover them
00:25:07.960 | and we'll continue to discover new relationships
00:25:10.340 | between cell and disease,
00:25:11.840 | which leads me to the second example I wanna bring up
00:25:14.420 | is this large data set
00:25:17.720 | that the entire scientific community
00:25:19.800 | is built around single cell.
00:25:21.360 | It's starting to allow us to say this mutation,
00:25:25.400 | where is it expressed?
00:25:26.520 | What types of cell types it's expressed in?
00:25:28.720 | And we actually have built a tool at CZI called CELLxGENE
00:25:33.720 | where you can put in the mutation that you're interested in
00:25:37.680 | and it gives you a heat map of cross cell types
00:25:40.440 | of which cell types are expressing the gene
00:25:44.740 | that you're interested in.
00:25:46.180 | And so then you can start looking at,
00:25:47.980 | okay, if I look at gene X
00:25:52.180 | and I know it's related to heart disease,
00:25:55.400 | but if you look at the heat map,
00:25:56.780 | it's also spiking in the pancreas.
00:25:59.220 | That allows you to generate a hypothesis, why?
00:26:02.100 | And what happens when this gene is mutated
00:26:04.520 | in the function of your pancreas?
00:26:07.560 | Really exciting way to look and ask questions differently.
00:26:11.680 | And you can also imagine a world
00:26:13.920 | where if you're trying to develop a therapy, a drug,
00:26:18.140 | and the goal is to treat the function of the heart,
00:26:22.100 | but you know that it's also really active
00:26:24.440 | in the pancreas again.
00:26:25.680 | So what, is there going to be an unexpected side effect
00:26:29.360 | that you should think about
00:26:30.820 | as you're bringing this drug to clinical trials?
00:26:34.100 | So it's an incredibly exciting tool
00:26:37.000 | and one that's only going to get better
00:26:38.700 | as we get more and more sophisticated ways
00:26:41.320 | to analyze the data.
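As a rough illustration of the heat-map idea Priscilla describes, the sketch below plots the expression of a single gene of interest across cell types and tissues. It does not use the real CELLxGENE API; the table, cell types, and values are made up purely to show the shape of the question ("where else is this gene active?").

```python
# Minimal sketch: a heat map of one gene's expression across cell types and
# tissues. Hypothetical values; not the CELLxGENE tool or its API.
import pandas as pd
import matplotlib.pyplot as plt

records = [
    ("cardiomyocyte", "heart", 8.2),
    ("fibroblast", "heart", 1.1),
    ("beta cell", "pancreas", 7.9),   # the unexpected second hotspot
    ("acinar cell", "pancreas", 0.8),
    ("hepatocyte", "liver", 0.3),
]
df = pd.DataFrame(records, columns=["cell_type", "tissue", "mean_expression"])

# Pivot to a cell-type x tissue matrix and draw it.
matrix = df.pivot(index="cell_type", columns="tissue", values="mean_expression")
plt.imshow(matrix.fillna(0), cmap="viridis")
plt.xticks(range(len(matrix.columns)), matrix.columns)
plt.yticks(range(len(matrix.index)), matrix.index)
plt.colorbar(label="mean expression of gene X")
plt.title("Where is gene X expressed?")
plt.show()
```

A spike in an unexpected tissue, like the pancreas row here, is exactly the kind of pattern that prompts a new hypothesis or a side-effect check before a drug goes to trials.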
00:26:42.940 | - I must say, I love that
00:26:43.840 | because if I look at the advances in neuroscience
00:26:46.020 | over the last 15 years,
00:26:47.260 | most of them didn't necessarily come
00:26:50.500 | from looking at the nervous system,
00:26:52.100 | came from the understanding
00:26:53.780 | that the immune system impacts the brain.
00:26:55.660 | Everyone prior to that talked about the brain
00:26:57.560 | as immune privileged organ.
00:26:59.180 | What you just said also bridges the divide
00:27:02.420 | between single cells, organs, and systems, right?
00:27:05.500 | Because ultimately cells make up organs,
00:27:07.780 | organs make up systems,
00:27:08.620 | and they're all talking to one another.
00:27:09.820 | And everyone nowadays is familiar with like the gut-brain axis
00:27:12.360 | or the microbiome being so important,
00:27:14.100 | but rarely is the discussion between organs discussed,
00:27:19.100 | so to speak.
00:27:21.840 | So I think it's wonderful.
00:27:23.700 | So that tool was generated by CZI or CZI funded that tool.
00:27:28.000 | So how does this- - No, we built that.
00:27:29.100 | - We built that. - You built it.
00:27:30.280 | So is it built by meta?
00:27:31.800 | Is this meta- - No, no, no.
00:27:32.640 | CZI has its own engineers.
00:27:34.480 | - Got it. - Yeah.
00:27:35.780 | They're completely different organizations.
00:27:38.020 | - Incredible.
00:27:38.860 | And so a graduate student or postdoc
00:27:41.460 | who's interested in a particular mutation
00:27:43.100 | could put this mutation into this database.
00:27:45.380 | That graduate student or postdoc
00:27:46.660 | might be in a laboratory known for working on heart,
00:27:49.860 | but suddenly find that they're collaborating
00:27:51.800 | with other scientists that work on the pancreas,
00:27:54.620 | which also is wonderful
00:27:56.620 | because it bridges the divide between these fields.
00:27:58.540 | Fields are so siloed in science,
00:28:00.420 | not just different buildings, but people rarely talk
00:28:03.020 | unless things like this are happening.
00:28:04.220 | - I mean, the graduate student is someone
00:28:05.860 | that we want to empower
00:28:07.180 | 'cause one, they're the future of science, as you know.
00:28:09.540 | And within CELLxGENE,
00:28:11.500 | if you put in the gene you're interested in
00:28:13.180 | and it shows you the heat map,
00:28:14.680 | we also will pull up the most relevant papers to that gene.
00:28:18.640 | And so read these things.
00:28:21.060 | - Fantastic.
00:28:22.300 | As we all know, quality nutrition influences, of course,
00:28:25.220 | our physical health, but also our mental health
00:28:27.460 | and our cognitive functioning, our memory,
00:28:29.320 | our ability to learn new things and to focus.
00:28:31.700 | And we know that one of the most important features
00:28:33.860 | of high quality nutrition is making sure
00:28:35.780 | that we get enough vitamins and minerals
00:28:37.800 | from high quality unprocessed or minimally processed sources
00:28:41.420 | as well as enough probiotics and prebiotics and fiber
00:28:44.500 | to support basically all the cellular functions in our body,
00:28:47.580 | including the gut microbiome.
00:28:49.420 | Now, I, like most everybody,
00:28:51.560 | try to get optimal nutrition from whole foods,
00:28:54.620 | ideally mostly from minimally processed
00:28:57.340 | or non-processed foods.
00:28:58.800 | However, one of the challenges
00:28:59.900 | that I and so many other people face
00:29:01.760 | is getting enough servings of high quality fruits
00:29:03.820 | and vegetables per day, as well as fiber and probiotics
00:29:06.780 | that often accompany those fruits and vegetables.
00:29:08.980 | That's why way back in 2012,
00:29:11.000 | long before I ever had a podcast, I started drinking AG1.
00:29:14.840 | And so I'm delighted that AG1
00:29:16.460 | is sponsoring the Huberman Lab Podcast.
00:29:18.620 | The reason I started taking AG1
00:29:20.340 | and the reason I still drink AG1 once or twice a day
00:29:23.640 | is that it provides all of my foundational nutritional needs.
00:29:26.420 | That is, it provides insurance
00:29:28.380 | that I get the proper amounts of those vitamins, minerals,
00:29:31.180 | probiotics, and fiber to ensure optimal mental health,
00:29:34.820 | physical health, and performance.
00:29:36.980 | If you'd like to try AG1,
00:29:38.400 | you can go to drinkag1.com/huberman
00:29:41.660 | to claim a special offer.
00:29:43.220 | They're giving away five free travel packs
00:29:44.980 | plus a year supply of vitamin D3K2.
00:29:47.820 | Again, that's drinkag1.com/huberman
00:29:51.160 | to claim that special offer.
00:29:53.160 | - I just think going back to your question from before,
00:29:55.500 | I mean, are there going to be more cell types
00:29:57.300 | that get discovered?
00:29:58.140 | I mean, I assume so, right?
00:29:59.460 | I mean, no catalog of the stuff has ever,
00:30:01.460 | you know, it doesn't seem like we're ever done, right?
00:30:03.300 | We keep on finding more.
00:30:04.980 | But I think that that gets to one of the things
00:30:09.280 | that I think are the strengths of modern LLMs
00:30:13.020 | is the ability to kind of imagine different states
00:30:15.940 | that things can be in.
00:30:17.300 | So from all the work that we've done
00:30:20.140 | and funded on the Human Cell Atlas,
00:30:22.620 | there is a large corpus of data
00:30:24.180 | that you can now train a kind of large scale model on.
00:30:29.100 | And one of the things that we're doing at CZI,
00:30:32.660 | which I think is pretty exciting,
00:30:33.700 | is building what we think is one of the largest
00:30:37.460 | nonprofit life sciences AI clusters, right?
00:30:41.180 | It's like on the order of 1,000 GPUs.
00:30:44.440 | And it's larger than what most people have access to
00:30:47.500 | in academia that you can do serious engineering work on.
00:30:51.540 | And by basically training a model
00:30:55.820 | with all of the Human Cell Atlas data
00:30:57.700 | and a bunch of other inputs as well,
00:31:01.020 | we think you'll be able to basically imagine
00:31:04.500 | all of the different types of cells
00:31:06.140 | and all the different states that they can be in
00:31:07.960 | and when they're healthy and diseased
00:31:10.140 | and how they'll interact with, you know,
00:31:12.380 | different, you know, interact with each other,
00:31:16.000 | interact with different potential drugs.
00:31:18.220 | But, I mean, I think the state of LLMs,
00:31:20.100 | I think this is where it's helpful to understand,
00:31:22.420 | you know, have a good understanding
00:31:23.420 | and be grounded in like the modern state of AI.
00:31:25.820 | I mean, these things are not foolproof, right?
00:31:28.140 | I mean, one of the flaws of modern LLMs
00:31:30.380 | is they hallucinate, right?
00:31:31.940 | So the question is, how do you make it so that
00:31:34.420 | that can be an advantage rather than a disadvantage?
00:31:37.580 | And I think the way that it ends up being an advantage
00:31:39.780 | is when they help you imagine a bunch of states
00:31:42.420 | that someone could be in, but then you, you know,
00:31:44.380 | as the scientist or engineer go and validate
00:31:46.420 | that those are true, whether they're, you know,
00:31:47.940 | solutions to how a protein can be folded
00:31:49.900 | or possible states that a cell could be in
00:31:52.140 | when it's interacting with other things.
00:31:54.060 | But, you know, we're not yet at the state with AI
00:31:56.860 | that you can just take the outputs of these things
00:31:59.460 | as like as gospel and run from there.
00:32:02.780 | But they are very good, I think, as you said,
00:32:05.300 | hypothesis generators or possible solution generators
00:32:09.580 | that then you can go validate.
00:32:10.900 | So I think that that's a very powerful thing
00:32:14.100 | that we can basically, you know,
00:32:15.380 | building on the first five years of science work
00:32:17.620 | around the Human Cell Atlas and all the data
00:32:19.260 | that's been built out, carry that forward into something
00:32:21.580 | that I think is gonna be a very novel tool going forward.
00:32:24.940 | And that's the type of thing
00:32:27.100 | that I think we're set up to do well.
00:32:29.260 | I mean, you all, you had this exchange a little while back
00:32:33.260 | about, you know, funding levels and how CZI is, you know,
00:32:36.740 | just sort of a drop in the bucket compared to NIH.
00:32:41.060 | But I think we have this,
00:32:42.900 | the thing that I think we can do that's different
00:32:45.580 | is funding some of these longer term, bigger projects
00:32:50.020 | that it is hard to galvanize the,
00:32:52.540 | and pull together the energy to do that.
00:32:54.620 | And it's a lot of what most science funding is,
00:32:57.820 | is like relatively small projects that are exploring things
00:33:00.560 | over relatively short time horizons.
00:33:02.700 | And one of the things that we try to do is,
00:33:04.660 | is like build these tools over, you know,
00:33:06.940 | five, 10, 15 year periods.
00:33:09.100 | They're often projects that require, you know,
00:33:10.980 | hundreds of millions of dollars of funding
00:33:12.540 | and world-class engineering teams and infrastructure to do.
00:33:15.660 | And that I think is a pretty cool contribution to the field
00:33:19.300 | that I think is, there aren't as many other folks
00:33:23.420 | who are doing that kind of thing.
00:33:24.900 | But that's one of the reasons why I'm personally excited
00:33:27.140 | about the virtual cell stuff.
00:33:28.340 | 'Cause it just, it's like this perfect intersection
00:33:30.940 | of all the stuff that we've done and single cell,
00:33:32.740 | the previous collaborations that we've done with the field
00:33:36.220 | and, you know, bringing together the industry
00:33:39.260 | and AI expertise around this.
00:33:41.260 | - Yeah, I completely agree that the model of science
00:33:45.580 | that you're putting together with CZI
00:33:47.220 | isn't just unique from NIH, but it's extremely important.
00:33:50.700 | The independent investigator model
00:33:54.100 | is what's driven the progression of science in this country
00:33:56.340 | and to some extent in Northern Europe
00:33:58.380 | for the last hundred years.
00:34:00.140 | And it's wonderful on the one hand
00:34:04.220 | because it allows for that image we have of a scientist
00:34:07.980 | kind of tinkering away or the people in their lab
00:34:09.700 | and then like the Eurekas.
00:34:11.400 | And that hopefully translates to better human health.
00:34:16.780 | But I think in my opinion, we've moved past that model
00:34:21.100 | as the most effective model
00:34:22.380 | or the only model that should be explored.
00:34:24.380 | - Yeah, I just think it's a balance.
00:34:25.600 | - It's a balance. - I think you want that,
00:34:27.020 | but you want to empower those people.
00:34:28.180 | I think that that's, these tools empower those folks.
00:34:29.620 | - Sure, and there are mechanisms to do that like NIH,
00:34:31.820 | but it's hard to do collaborative sciences.
00:34:35.300 | It's sort of interesting that we're sitting here not far
00:34:37.700 | because I grew up right near here as well,
00:34:40.140 | I'm not far from the garage model of tech, right?
00:34:43.720 | The Hewlett-Packard model, not far from here at all.
00:34:47.720 | And the idea was that the tinkerer in the garage,
00:34:51.000 | the inventor, and then people often forget
00:34:53.480 | that to implement all the technologies they discovered
00:34:55.520 | took enormous factories and warehouses.
00:34:57.400 | So there's a similarity there to Facebook, meta, et cetera.
00:35:01.600 | But I think in science, we imagine that the scientists alone
00:35:04.320 | in their laboratory and those Eureka moments,
00:35:06.480 | but I think nowadays that the big questions
00:35:08.640 | really require extensive collaboration
00:35:11.920 | and certainly tool development.
00:35:13.920 | And one of the tools that you keep coming back to
00:35:15.560 | is these LLMs, these large language models.
00:35:17.500 | And maybe you could just elaborate
00:35:19.100 | for those that aren't familiar,
00:35:20.780 | what is a large language model for the uninformed?
00:35:25.780 | What is it?
00:35:27.760 | And what does it allow us to do that different types of,
00:35:32.760 | other types of AI don't allow?
00:35:35.600 | Or more importantly, perhaps, what does it allow us to do
00:35:38.040 | that a bunch of really smart people,
00:35:40.600 | highly informed in a given area of science,
00:35:42.580 | staring at the data, what can it do that they can't do?
00:35:45.620 | - Sure, so I think a lot of the progression
00:35:50.200 | of machine learning has been about building systems,
00:35:54.320 | neural networks or otherwise,
00:35:56.260 | that can basically make sense and find patterns
00:35:59.840 | in larger and larger amounts of data.
00:36:01.960 | And there was a breakthrough a number of years back
00:36:05.080 | that some folks at Google actually made
00:36:08.700 | called this transformer model architecture.
00:36:11.720 | And it was this huge breakthrough because before then,
00:36:15.360 | there was somewhat of a cap where if you fed more data
00:36:19.180 | into a neural network past some point,
00:36:22.220 | it didn't really glean more insights from it.
00:36:25.160 | Whereas transformers just, we haven't seen the end
00:36:27.720 | of how big that can scale to yet.
00:36:29.220 | I mean, I think that there's a chance
00:36:30.500 | that we run into some ceiling, but--
00:36:32.980 | - So it never asymptotes?
00:36:34.700 | - We haven't observed it yet,
00:36:35.760 | but we just haven't built big enough systems yet.
00:36:37.420 | So I would guess that, I don't know.
00:36:40.100 | I think this is actually one of the big questions
00:36:41.740 | in the AI field today is basically are transformers
00:36:46.280 | and are the current model architecture sufficient?
00:36:48.360 | And if you just build larger and larger clusters,
00:36:50.160 | do you eventually get something that's like
00:36:51.560 | human intelligence or super intelligence?
00:36:53.680 | Or is there some kind of fundamental limit
00:36:57.200 | to this architecture that we just haven't reached yet?
00:36:59.980 | And once we kind of get a little bit further
00:37:03.440 | in building them out, then we'll reach that
00:37:04.640 | and then we'll need a few more leaps
00:37:05.880 | before we get to the level of AI
00:37:09.340 | that I think will unlock a ton of really futuristic
00:37:13.400 | and amazing things.
00:37:14.240 | But there's no doubt that even just being able
00:37:16.020 | to process the amount of data that we can now
00:37:19.180 | with this model architecture has unlocked
00:37:22.200 | a lot of new use cases.
00:37:23.720 | And the reason why they're called large language models
00:37:25.780 | is because one of the first uses of them
00:37:28.440 | is people basically feed in all of the language
00:37:32.740 | from basically the World Wide Web.
00:37:35.380 | And you can think of them as basically prediction machines.
00:37:38.360 | So you feed it in and you put in a prompt
00:37:42.900 | and it can basically predict a version
00:37:45.980 | of what should come next.
00:37:47.020 | So you type in a headline for a news story
00:37:50.700 | and it can kind of predict what it thinks
00:37:53.540 | the story should be.
00:37:54.660 | Or you could train it so that it could be a chatbot, right?
00:37:57.460 | Where, okay, if you're prompted with this question,
00:38:00.020 | you can get this response.
00:38:02.780 | But one of the interesting things is it turns out
00:38:04.620 | that there's actually nothing specific
00:38:07.040 | to using human language in it.
00:38:08.540 | So if instead of feeding it human language,
00:38:11.160 | if you use that model architecture for a network
00:38:14.260 | and instead you feed it all of the human cell atlas data,
00:38:18.860 | then if you prompt it with a state of a cell,
00:38:21.860 | it can spit out different versions of like
00:38:26.300 | what, you know, how that cell can interact
00:38:28.300 | or different states that the cell could be in next
00:38:29.980 | when it interacts with different things.
00:38:31.340 | - Does it have to take a genetics class?
00:38:32.960 | So for instance, if you give it a bunch of genetics data,
00:38:34.940 | do you have to say, "Hey, by the way,"
00:38:36.520 | and then you give it a genetics class so it understands
00:38:38.620 | that, you know, you've got DNA, RNA, mRNA, and proteins?
00:38:41.580 | - No, I think that the basic nature
00:38:43.680 | of all these machine learning techniques
00:38:45.180 | is they're basically pattern recognition systems.
00:38:48.820 | So they're these like very deep statistical machines
00:38:53.820 | that are very efficient at finding patterns.
00:38:57.540 | So it's not actually, I mean, you don't need to teach
00:39:00.740 | a language model that's trying to, you know,
00:39:03.040 | speak a language, you know, a lot of specific things
00:39:07.560 | about that language either.
00:39:08.740 | You just feed it in a bunch of examples
00:39:10.280 | and then, you know, let's say you teach it
00:39:11.640 | about something in English, but then you also give it
00:39:16.240 | a bunch of examples of people speaking Italian,
00:39:19.280 | it'll actually be able to explain the thing
00:39:21.320 | that it learned in English and Italian, right?
00:39:22.880 | Even though it is, so the crossover
00:39:25.240 | and just the pattern recognition is the thing
00:39:28.920 | that is pretty profound and powerful about this.
00:39:31.700 | But it really does apply to a lot of different things.
00:39:34.500 | Another example in the scientific community
00:39:36.640 | has been the work that AlphaFold, you know,
00:39:40.440 | that basically the folks at DeepMind have done
00:39:43.280 | on protein folding.
00:39:45.420 | It's, you know, just basically a lot
00:39:47.060 | of the same model architecture, but instead of language,
00:39:50.680 | there they kind of fed in all of the protein data
00:39:54.860 | and you can give it a state and it can spit out solutions
00:39:57.640 | to how those proteins get folded.
00:39:59.560 | So it's very powerful, I don't think we know yet
00:40:01.960 | as an industry what the natural limits of it are
00:40:06.960 | and that that's one of the things that's pretty exciting
00:40:10.360 | about the current state.
00:40:11.760 | But it certainly allows you to solve problems
00:40:15.460 | that just weren't solved with the generation
00:40:18.700 | of machine learning that came before it.
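To ground the analogy Mark draws here, the sketch below shows the general shape of a transformer-based predictor: the same architecture that predicts the next word can be fed tokens representing cell states instead. This is a toy illustration, not CZI's or Meta's actual model; the vocabulary size, the dimensions, and the idea of a discretized "cell-state token" are placeholder assumptions.

```python
# Toy transformer that predicts the next "cell state" token in a sequence,
# in the same way a language model predicts the next word.
# Illustrative only; all sizes and the token scheme are hypothetical.
import torch
import torch.nn as nn

VOCAB = 1024    # number of possible discretized cell-state tokens (made up)
D_MODEL = 128   # embedding width

class TinyCellStateModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL)
        layer = nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(D_MODEL, VOCAB)   # scores for the next token

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        x = self.embed(tokens)
        x = self.encoder(x)
        return self.head(x)                     # (batch, seq_len, VOCAB)

model = TinyCellStateModel()
fake_cells = torch.randint(0, VOCAB, (8, 32))   # 8 "cells", 32 tokens each
print(model(fake_cells).shape)                  # torch.Size([8, 32, 1024])
```

Swapping the training data from web text to Human Cell Atlas-style measurements is what turns the same prediction machinery into the kind of virtual-cell model described earlier in the conversation.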
00:40:21.760 | - It sounds like CZI is moving a lot of work
00:40:24.440 | that was just done in vitro in dishes
00:40:27.400 | and in vivo in living organisms, model organisms or humans
00:40:32.280 | to in silico, as we say.
00:40:33.980 | So do you foresee a future where a lot
00:40:36.560 | of biomedical research, certainly the work of CZI,
00:40:40.640 | included is done by machines?
00:40:44.400 | I mean, obviously it's much lower cost
00:40:47.080 | and you can run millions of experiments,
00:40:49.060 | which of course is not to say
00:40:49.960 | that humans are not going to be involved,
00:40:52.260 | but I love the idea that we can run experiments
00:40:55.440 | in silico and mass.
00:40:58.320 | - I think the in silico experiments are going
00:41:01.560 | to be incredibly helpful to test things quickly,
00:41:04.920 | to cheaply and to just unleash a lot of creativity.
00:41:09.920 | I do think you need to be very careful
00:41:13.320 | about making sure it still translates
00:41:14.900 | and matches what happens in humans.
00:41:18.180 | One thing that's funny in basic science
00:41:22.080 | is we've basically cured every single disease in mice.
00:41:25.200 | Like mice have, we know what's going on
00:41:28.600 | when they have a number of diseases
00:41:30.520 | because they're used as a model organism,
00:41:33.000 | but they are not humans.
00:41:34.520 | And a lot of times that research is relevant,
00:41:38.400 | but not directly one-to-one translatable to humans.
00:41:42.320 | So you just have to be really careful about making sure
00:41:45.220 | that it actually works for humans.
00:41:47.600 | - Sounds like what CZI is doing
00:41:50.520 | is actually creating a new field.
00:41:54.000 | As I'm hearing all of this, I'm thinking, okay,
00:41:55.840 | this transcends immunology department,
00:41:59.600 | cardiothoracic surgery, neuroscience.
00:42:02.560 | The idea of a new field where you certainly embrace
00:42:06.280 | the realities of universities and laboratories,
00:42:08.120 | because that's where most of the work
00:42:09.280 | that you're funding is done, is that right?
00:42:11.400 | So maybe we need to think about what it means
00:42:14.540 | to do science differently.
00:42:16.620 | And I think that's one of the things that's most exciting.
00:42:19.540 | Along those lines, it seems that bringing together
00:42:21.680 | a lot of different types of people at different
00:42:24.280 | major institutions is going to be especially important.
00:42:28.960 | So I know that the initial CZI Biohub, gratefully,
00:42:33.960 | included Stanford, we'll put that first in the list,
00:42:37.800 | but also UCSF, forgive me.
00:42:40.640 | Many friends at UCSF and also Berkeley.
00:42:43.160 | But there are now some additional institutions involved.
00:42:47.760 | So maybe you could talk about that and what motivated
00:42:49.480 | the decision to branch outside the Bay Area
00:42:52.000 | and why you selected those particular
00:42:56.080 | additional institutions to be included.
00:42:57.960 | - Well, I'll just say, a big part of why we wanted
00:43:01.600 | to create additional Biohubs is we were just so impressed
00:43:03.920 | by the work that the folks who are running
00:43:05.800 | the first Biohub did.
00:43:07.600 | And I also think, and you should walk through the work
00:43:11.900 | of the Chicago Biohub and the New York Biohub
00:43:14.400 | that we just announced, but I think it's actually
00:43:16.260 | an interesting set of examples that balance the limits
00:43:21.260 | of what you want to do with like physical material
00:43:24.460 | engineering and where things are purely biological.
00:43:28.740 | Because the Chicago team is really building more sensors
00:43:32.260 | to be able to understand what's going on in your body.
00:43:33.880 | But that's more of like a physical kind of
00:43:35.740 | engineering challenge, whereas the New York team,
00:43:39.560 | we basically talk about this as like a cellular endoscope
00:43:42.700 | of being able to have like an immune cell or something
00:43:45.700 | that can go and understand what's the thing
00:43:50.360 | that's going on in your body.
00:43:51.260 | But it's not like a physical piece of hardware.
00:43:52.900 | It's a cell that can basically just go report out
00:43:57.900 | on different things that are happening inside the body.
00:44:00.920 | - Ooh, so making the cell the microscope.
00:44:03.140 | - Totally. - Yeah, and then eventually
00:44:04.740 | actually being able to act on it.
00:44:05.900 | But I mean, you should go into more detail on all this.
00:44:08.160 | - So a core principle of how we think about Biohubs
00:44:11.080 | is that it has to be, when we invited proposals,
00:44:15.440 | has to be at least three institutions.
00:44:17.860 | So really breaking down the barrier of a single university.
00:44:22.220 | Oftentimes asking for the people designing the research aim
00:44:26.060 | to come from all different backgrounds.
00:44:28.780 | And to explain why the problem that they wanna solve
00:44:32.300 | requires interdisciplinary, inter-university
00:44:38.180 | collaboration to actually make happen.
00:44:40.940 | We just put that request for a proposal out there
00:44:43.560 | with our San Francisco Biohub as an example,
00:44:46.580 | where they've done incredible work in single cell biology
00:44:50.140 | and infectious disease.
00:44:52.620 | And we got, I wanna say like 57 proposals
00:44:56.900 | from over 150 institutions.
00:45:00.540 | A lot of ideas came together.
00:45:02.260 | And we are so, so excited that we've been able
00:45:05.840 | to launch Chicago and New York.
00:45:07.680 | Chicago is a collaboration between UIUC,
00:45:11.900 | University of Illinois, Urbana-Champaign,
00:45:14.600 | and University of Chicago and Northwestern.
00:45:18.080 | And if I, obviously these universities are multifaceted,
00:45:21.320 | but if I were to sort of describe them
00:45:23.380 | by their like stereotypical strength,
00:45:26.100 | Northwestern has an incredible medical system
00:45:30.300 | and hospital system.
00:45:31.940 | University of Chicago brings to the table
00:45:35.760 | incredible basic science strengths.
00:45:38.180 | University of Illinois is a computing powerhouse.
00:45:41.940 | And so they came together and proposed
00:45:44.620 | that they were gonna start thinking about cells and tissue.
00:45:48.360 | So that's one of the layers that you just alluded to.
00:45:52.340 | So how do the cells that we know behave and act differently
00:45:56.900 | when they come together as a tissue?
00:45:58.860 | And one of the first tissues
00:46:00.520 | that they're starting with is skin.
00:46:02.400 | So they've already been able to,
00:46:05.140 | as a collaboration under the leadership of Shana Kelley,
00:46:08.080 | design engineered skin tissue.
00:46:13.020 | The architecture looks the same as what's in you and me.
00:46:17.400 | And what they've done is built these super,
00:46:21.080 | super thin sensors, and they embed these sensors
00:46:24.860 | throughout the layers of this engineered tissue.
00:46:27.340 | And they read out the data.
00:46:29.200 | They wanna see how these cells,
00:46:31.740 | what these cells are secreting,
00:46:33.120 | how these cells talk to each other,
00:46:34.840 | and what happens when these cells get inflamed.
00:46:37.640 | Inflammation is an incredibly important process
00:46:39.980 | that drives 50% of all deaths.
00:46:42.820 | And so this is another sort of disease agnostic approach.
00:46:46.380 | We wanna understand inflammation.
00:46:48.340 | And they're gonna get a ton of information out
00:46:50.460 | from these sensors that tell you
00:46:53.340 | what happens when something goes awry.
00:46:56.380 | 'Cause right now we can say,
00:46:57.940 | like when you have an allergic reaction,
00:47:00.060 | your skin gets red and puffy.
00:47:01.940 | What is the earliest signal of that?
00:47:04.540 | And these sensors can look at the behaviors
00:47:08.160 | of these cells over time,
00:47:09.560 | and then you can apply a large language model
00:47:11.760 | to look at the earliest statistically significant changes
00:47:15.680 | that can allow you to intervene as early as possible.
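As a concrete, if simplified, illustration of "look for the earliest statistically significant change," here is a minimal sketch assuming a simulated cytokine-like sensor readout and a plain z-score change detector. It is not CZI's or the Biohub's actual analysis pipeline; the signal, baseline window, and threshold are invented for the example.

```python
# Minimal sketch (illustration only, not the Biohub's actual pipeline):
# flag the earliest statistically significant shift in a simulated
# cytokine-like sensor readout using a z-score against a baseline window.
import numpy as np

rng = np.random.default_rng(0)
baseline = rng.normal(1.0, 0.1, 200)      # healthy tissue readout
inflamed = rng.normal(1.6, 0.1, 100)      # secretion rises after an insult
signal = np.concatenate([baseline, inflamed])

def earliest_change(signal, baseline_window=100, z_threshold=4.0):
    """Return the first index whose value deviates strongly from baseline."""
    mu = signal[:baseline_window].mean()
    sigma = signal[:baseline_window].std()
    z = np.abs(signal - mu) / sigma
    hits = np.flatnonzero(z[baseline_window:] > z_threshold)
    return baseline_window + hits[0] if hits.size else None

print(earliest_change(signal))            # ~200, where the shift begins
```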
00:47:19.700 | So that's what Chicago's doing.
00:47:21.800 | They're starting in the skin cells.
00:47:24.820 | They're also looking at the neuromuscular junction,
00:47:27.560 | which is the connection between where a neuron
00:47:30.800 | attaches to a muscle and tells the muscle how to behave.
00:47:33.860 | Super important in things like ALS, but also in aging.
00:47:38.020 | The slowed transmission of information
00:47:41.340 | across that neuromuscular junction
00:47:42.900 | is what causes old people to fall.
00:47:44.920 | Their brain cannot trigger their muscles
00:47:46.680 | to react fast enough.
00:47:48.060 | And so we wanna be able to embed these sensors
00:47:50.780 | to understand how these different interconnected systems
00:47:55.580 | within our bodies work together.
00:47:57.820 | In New York, they're doing a related,
00:48:02.180 | but equally exciting project
00:48:04.780 | where they're engineering individual cells
00:48:08.520 | to be able to go in and identify changes in a human body.
00:48:13.520 | So what they'll do is, they're calling it--
00:48:19.620 | - Wild, I mean, I love it.
00:48:20.780 | I mean, this is, I don't wanna go on a tangent,
00:48:23.500 | but for those that wanna look it up, adaptive optics,
00:48:26.460 | there's a lot of distortion and interference
00:48:29.240 | when you try and look at something really small
00:48:30.780 | or really far away and really smart physicists
00:48:33.980 | figured out, well, use the interference
00:48:36.360 | as part of the microscope.
00:48:37.460 | Make those actually lenses of the microscope.
00:48:40.260 | - We should talk about imaging separately,
00:48:41.460 | after you talk about the New York Biohub.
00:48:43.420 | - It's extremely clever along those lines.
00:48:45.320 | It's not intuitive, but then when you hear it,
00:48:46.980 | it's like, it makes so much sense.
00:48:49.060 | It's not immediately intuitive.
00:48:50.540 | Make the cells that can already navigate to tissues
00:48:53.240 | or embed themselves in tissues
00:48:54.600 | be the microscope within that tissue.
00:48:55.980 | - Totally. - I love it.
00:48:57.180 | - The way that I explain this to my friends
00:49:00.500 | and my family is, this is Fantastic Voyage, but real life.
00:49:04.720 | Like we are going into the human body
00:49:07.780 | and we're using the immune cells,
00:49:09.580 | which, you know, are privileged and already working
00:49:11.900 | to keep your body healthy and being able to target them
00:49:15.180 | to examine certain things.
00:49:17.020 | So like you can engineer an immune cell to go in your body
00:49:21.140 | and look inside your coronary arteries and say,
00:49:24.380 | are these arteries healthy or are there plaques?
00:49:27.860 | Because plaques lead to blockages, which lead to heart attacks
00:49:32.500 | and the cell can then record that information
00:49:35.300 | and report it back out.
00:49:36.840 | That's the first half
00:49:37.680 | of what the New York biohub is going to do.
00:49:39.860 | The second half is can you then engineer the cells
00:49:42.700 | to go do something about it?
00:49:44.280 | Can I then tell a different cell, immune cell,
00:49:46.780 | that is able to transport in your body to go in
00:49:49.660 | and clean that up in a targeted way?
00:49:52.540 | And so it's incredibly exciting.
00:49:56.060 | They're going to study things
00:49:57.860 | that are sort of immune privileged,
00:49:59.860 | that your immune system normally doesn't have access to.
00:50:03.340 | Things like ovarian and pancreatic cancer.
00:50:06.520 | They'll also look at a number of neurodegenerative diseases
00:50:09.680 | since the immune system doesn't presently have a ton
00:50:13.780 | of access into the nervous system.
00:50:17.240 | But it's both mind blowing and it feels like sci-fi,
00:50:22.020 | but science is actually in a place
00:50:23.720 | where if you really pushed a group
00:50:26.100 | of incredibly qualified scientists,
00:50:27.860 | say, could you do this if given the chance?
00:50:30.480 | The answer is like, probably.
00:50:33.300 | Give us enough time.
00:50:34.820 | The right team and resources, it's doable.
00:50:37.620 | - Yeah, I mean, it's a 10 to 15 year project.
00:50:39.980 | - Yeah.
00:50:40.820 | - But it's awesome.
00:50:41.940 | Engineered cells, yeah.
00:50:43.580 | - I love the optimism and the moment you said,
00:50:46.060 | "Make the cell the microscope," so to speak,
00:50:48.020 | it's like, yes, yes, and yes, it just makes so much sense.
00:50:52.580 | What motivated the decision to do the work of CZI
00:50:56.320 | in the context of existing universities as opposed to,
00:51:00.740 | there's still some real estate up in Redwood City
00:51:02.600 | where there's a bunch of space to put biotech companies
00:51:04.980 | and just hiring people from all backgrounds
00:51:08.080 | and saying, hey, have at it
00:51:10.080 | and doing this stuff from scratch.
00:51:11.680 | I mean, it's a very interesting decision to do this
00:51:15.100 | in the context of an existing framework
00:51:16.860 | of like graduate students that need to do a thesis
00:51:18.780 | and get a first author paper.
00:51:20.260 | 'Cause there's a whole set of structures within academia
00:51:22.320 | that I think both facilitate
00:51:23.860 | but also limit the progression of science.
00:51:26.100 | You know, that independent investigator model
00:51:28.340 | that we talked about a little bit earlier,
00:51:30.140 | it's so core to the way science has been done.
00:51:32.900 | This is very different and frankly sounds far more efficient
00:51:35.400 | if I'm to be completely honest.
00:51:37.120 | And, you know, we'll see if I renew my NIH funding
00:51:39.500 | after saying that, but I think we all want the same thing.
00:51:42.880 | We all want to, as scientists and as, you know, as humans,
00:51:47.620 | we want to understand the way we work
00:51:49.040 | and we want healthy people to persist to be healthy
00:51:53.460 | and we want sick people to get healthy.
00:51:54.780 | I mean, that's really ultimately the goal.
00:51:56.380 | It's not super complicated.
00:51:57.700 | It's just hard to do.
00:51:58.620 | - So the teams at the Biohub
00:52:00.260 | are actually independent of the universities.
00:52:04.260 | So each Biohub will probably have in total,
00:52:07.560 | maybe 50 people working on sort of deep efforts.
00:52:11.100 | However, it's an acknowledgement
00:52:13.100 | that not all of the best scientists
00:52:15.340 | who can contribute to this area
00:52:17.220 | are actually going to, one, want to leave a university
00:52:21.600 | or want to take on the full-time scope of this project.
00:52:25.580 | So it's the ability to partner with universities
00:52:29.580 | and to have the faculty at all the universities
00:52:33.760 | be able to contribute to the overall project
00:52:37.300 | is how the Biohub is structured.
00:52:39.500 | - Got it.
00:52:40.420 | But a lot of the way that we're approaching CZI
00:52:43.300 | is this long-term iterative project to figure out,
00:52:46.680 | try a bunch of different things,
00:52:48.260 | figure out which things produce the most interesting results
00:52:51.780 | and then double down on those in the next five-year push.
00:52:55.540 | So we just went through this period
00:52:57.740 | where we kind of wrapped up the first five years
00:53:00.480 | of the science program
00:53:01.700 | and we tried a lot of different models,
00:53:03.260 | all kinds of different things.
00:53:04.380 | And it's not that the Biohub model,
00:53:07.340 | we don't think it's like the best or only model,
00:53:10.100 | but we found that it was sort of a really interesting way
00:53:13.980 | to unlock a bunch of collaboration
00:53:16.320 | and bring some technical resources
00:53:18.780 | that allow for this longer-term development.
00:53:21.000 | And it's not something that is widely being pursued
00:53:25.340 | across the rest of the field.
00:53:26.580 | So we figured, okay, this is like an interesting thing
00:53:29.060 | that we can help push on.
00:53:30.500 | But I mean, yeah, we do believe in the collaboration,
00:53:33.540 | but I also think that we come at this with,
00:53:38.140 | you know, we don't think that the way
00:53:39.300 | that we're pursuing this is like the only way to do this
00:53:42.040 | or the way that everyone should do it.
00:53:43.580 | We're pretty aware of what is the rest of the ecosystem
00:53:48.580 | and how we can play a unique role in it.
00:53:51.040 | - It feels very synergistic
00:53:52.200 | with the way science is already done
00:53:53.680 | and also fills in an incredibly important niche
00:53:56.320 | that frankly wasn't filled before.
00:53:58.580 | Along the lines of implementation,
00:54:01.320 | so let's say your large language models
00:54:04.920 | combined with imaging tools reveal
00:54:07.860 | that a particular set of genes acting in a cluster,
00:54:12.360 | I don't know, set up an organ crash,
00:54:15.240 | let's say the pancreas crashes
00:54:17.080 | at a particular stage of pancreatic cancer.
00:54:19.800 | I mean, still one of the deadliest of the cancers.
00:54:22.800 | And there are others that you certainly wouldn't want to get,
00:54:26.360 | but that's among the ones you wouldn't want to get the most.
00:54:30.080 | So you discover that.
00:54:30.920 | And then, and the idea is that, okay,
00:54:32.200 | then AI reveals some potential drug targets
00:54:35.440 | that then bear out in vitro in a dish and in a mouse model.
00:54:40.840 | How does the actual implementation of drug discovery happen?
00:54:43.920 | Or maybe this target is druggable, maybe it's not.
00:54:47.060 | Maybe it requires some other approach,
00:54:49.040 | laser ablation approach or something.
00:54:52.640 | We don't know.
00:54:53.800 | But ultimately, is CZI going to be involved
00:54:56.120 | in the implementation of new therapeutics?
00:54:58.000 | Is that the idea?
00:54:59.620 | - Less so.
00:55:00.460 | - Less so.
00:55:01.300 | This is where it's important to work in an ecosystem
00:55:04.560 | and to know your own limitations.
00:55:06.720 | There are groups and startups and companies
00:55:10.040 | that take that and bring it to translation very effectively.
00:55:14.400 | I would say the place where we have a small window
00:55:18.000 | into that world is actually our work
00:55:20.320 | with rare disease groups.
00:55:21.780 | We have, through our Rare As One portfolio,
00:55:25.880 | funded patient advocates to create
00:55:29.880 | rare disease organizations where patients come together
00:55:34.280 | and actually pool their collective experience.
00:55:39.340 | They build bioregistries,
00:55:41.000 | registries of their natural history,
00:55:43.120 | and they both partner with researchers
00:55:46.460 | to do the research about their disease
00:55:48.540 | and with drug developers to incentivize drug developers
00:55:53.500 | to focus on what they may need for their disease.
00:55:56.880 | And one thing that's important to point out
00:55:59.440 | is that rare diseases aren't rare.
00:56:01.080 | There are over 7,000 rare diseases
00:56:04.080 | and collectively they impact many, many individuals.
00:56:08.600 | And I think, from
00:56:11.260 | a basic science perspective,
00:56:15.200 | the incredibly fascinating thing about rare diseases
00:56:17.780 | is that they are actually windows
00:56:19.260 | into how the body normally should work.
00:56:22.740 | And so there are often mutations
00:56:26.860 | in genes that cause very specific diseases,
00:56:31.860 | but that tell you how the normal biology works as well.
00:56:36.320 | - Got it.
00:56:37.600 | So you discussed basically the goals,
00:56:41.680 | major goals and initiatives of the CZI
00:56:43.560 | for the next say five to 10 years.
00:56:45.720 | And then beyond that,
00:56:48.040 | the targets will be explored by biotech companies.
00:56:51.280 | They'll grab those targets
00:56:52.280 | and test them and implement them.
00:56:54.480 | - There've also, I think, been a couple of teams
00:56:57.040 | from the initial Biohub
00:56:58.520 | that were interested in spinning out ideas
00:57:00.280 | right into startups.
00:57:01.120 | So that's just, even though it's not a thing
00:57:03.080 | that we're gonna pursue because we're a philanthropy,
00:57:08.080 | we want to enable the work that gets done
00:57:10.760 | to be able to get turned into companies
00:57:12.900 | and things that other people go take
00:57:14.980 | and run towards building ultimately therapeutics.
00:57:19.160 | So that's another zone,
00:57:21.060 | but that's not a thing that we're gonna do.
00:57:23.040 | - Got it.
00:57:24.480 | I gather you're both optimists.
00:57:26.200 | Is that part of what brought you together?
00:57:30.160 | Forgive me for switching to a personal question,
00:57:32.480 | but I love the optimism that seems to sit
00:57:35.320 | at the root of the CZI.
00:57:37.000 | - I will say that we are incredibly hopeful people,
00:57:40.360 | but it manifests in different ways between the two of us.
00:57:44.600 | - Yeah.
00:57:45.440 | - How would you describe your optimism versus mine?
00:57:49.720 | It's not a loaded question.
00:57:51.080 | - I don't know.
00:57:54.340 | - Um.
00:58:00.600 | I mean, I think I'm more probably technologically optimistic
00:58:06.340 | about what can be built.
00:58:08.220 | And I think you, because of your focus as an actual doctor,
00:58:13.220 | kind of have more of a sense of how that's gonna affect
00:58:19.700 | actual people in their lives.
00:58:22.860 | Whereas for me, it's like, I mean, a lot of my work,
00:58:26.500 | it is, it's like we touch a lot of people around the world,
00:58:30.660 | and the scale is sort of immense.
00:58:32.460 | And I think for you, it's like being able to improve
00:58:36.540 | the lives of individuals, whether it's students
00:58:40.060 | at any of the schools that you've started
00:58:41.660 | or any of the stuff that we've supported
00:58:42.900 | through the education work, which isn't the goal here,
00:58:45.600 | or just being able to improve people's lives in that way,
00:58:50.180 | I think is the thing that I've seen you
00:58:51.620 | be super passionate about.
00:58:53.820 | I don't know, does that,
00:58:54.980 | do you agree with that characterization?
00:58:56.220 | I'm trying to, I'm trying to.
00:58:57.400 | - Yeah, I agree with that.
00:58:58.760 | I think that's very fair.
00:59:00.860 | And I'm sort of giggling to myself,
00:59:02.900 | 'cause in a day-to-day life, as like life partners,
00:59:06.860 | our relative optimism comes through as,
00:59:10.500 | Mark just like is overly optimistic
00:59:13.500 | about his time management,
00:59:15.080 | and will get engrossed in interesting ideas.
00:59:17.580 | - I'm late.
00:59:18.420 | - And he's late.
00:59:20.420 | - The decisions are very punctual.
00:59:21.660 | - And because he's late, I have to channel
00:59:24.140 | Mark is an optimist whenever I'm waiting for him.
00:59:27.180 | - That's such a nice way.
00:59:28.780 | Okay, I'll start using that.
00:59:30.340 | - That's what I think when I'm in the driveway
00:59:32.180 | with the kids waiting for you.
00:59:33.380 | I'm like, Mark is an optimist.
00:59:35.140 | And so his optimism translates to some tardiness,
00:59:40.140 | whereas I'm sort of, I'm like,
00:59:42.460 | how is this gonna happen?
00:59:45.220 | Like, I'm gonna open a spreadsheet,
00:59:46.980 | I'm gonna start putting together a plan,
00:59:49.100 | and like pulling together all the pieces,
00:59:51.420 | calling people to sort of like bring something to life.
00:59:55.580 | - But it is, it's one of my favorite quotes,
00:59:57.340 | that is, optimists tend to be successful,
01:00:01.340 | and pessimists tend to be right.
01:00:02.980 | And yeah, I mean, I think it's true
01:00:06.260 | in a lot of different aspects of life, right?
01:00:09.140 | - We know who said that.
01:00:10.660 | Did you say that?
01:00:11.500 | Mark said-- - No, no, no.
01:00:12.340 | I did not. - Absolutely not.
01:00:13.660 | - No, no, no, no, no, no.
01:00:15.060 | I like it, I did not invent it.
01:00:17.220 | - We'll give it to you.
01:00:18.060 | We'll put it out there.
01:00:18.900 | - No, no, no, no, no, no.
01:00:19.740 | - Just kidding, just kidding.
01:00:21.220 | - But I do think that there's really something to it, right?
01:00:23.540 | And there's like, if you're discussing any idea,
01:00:26.420 | there's all these reasons why it might not work.
01:00:28.780 | And so I think that, and those reasons are probably true.
01:00:33.780 | The people who are stating them are,
01:00:36.520 | probably have some validity to it.
01:00:38.020 | But the question is that,
01:00:39.100 | is that the most productive way to view the world?
01:00:41.540 | And I think across the board,
01:00:43.900 | and I think the people who tend to be
01:00:45.540 | the most productive and get the most done,
01:00:48.140 | you kind of need to be optimistic.
01:00:49.460 | Because if you don't believe that something can get done,
01:00:51.820 | then why would you go work on it?
01:00:53.600 | - The reason I asked the question is that,
01:00:56.100 | these days we hear a lot about,
01:00:57.620 | the future is looking so dark in these various ways.
01:01:00.900 | And you have children, so you have families.
01:01:04.140 | And you are a family, excuse me.
01:01:06.420 | And you also have families independently
01:01:09.460 | that are now merged.
01:01:10.620 | But I love the optimism behind the CZI.
01:01:14.460 | Because behind all this,
01:01:18.180 | there's sort of a set of big statements on the wall.
01:01:21.020 | One, the future can be better than the present
01:01:23.940 | in terms of treating disease.
01:01:25.940 | Maybe even, you said eliminating diseases, all diseases.
01:01:29.940 | I love that optimism.
01:01:31.160 | And that there's a tractable path to do it.
01:01:34.940 | Like we're going to put, literally,
01:01:36.940 | money and time and energy and people and technology
01:01:40.100 | and AI behind that.
01:01:42.140 | And so I have to ask, was having children
01:01:47.140 | a significant modifier in terms of your view of the future?
01:01:51.700 | Like, wow, like you hear all this doom and gloom.
01:01:53.180 | Like what's the future going to be like for them?
01:01:55.300 | Did you sit back and think,
01:01:57.900 | what would it look like if there was a future
01:01:59.380 | with no diseases?
01:02:01.300 | Is that the future we want our children in?
01:02:02.860 | I mean, I'm voting a big yes.
01:02:04.460 | So we're not going to debate that at all.
01:02:06.480 | But was having children sort of an inspiration
01:02:09.260 | for the CZI in some way?
01:02:11.660 | - Yeah.
01:02:12.500 | - So I think my answer to that,
01:02:15.700 | I would dial backwards for me.
01:02:18.740 | And I'll just tell a very brief story about my family.
01:02:22.020 | I'm the daughter of Chinese Vietnamese refugees.
01:02:25.900 | My parents and grandparents were boat people.
01:02:29.420 | If you remember, people left Vietnam during the war
01:02:32.580 | in these small boats into the South China Sea.
01:02:35.180 | And there were stories about how these boats
01:02:40.100 | would sink with whole families on them.
01:02:41.980 | And so my grandparents, both sets of grandparents
01:02:44.380 | who knew each other,
01:02:45.700 | decided that there was a better future out there
01:02:49.020 | and they were willing to take risks for it.
01:02:51.660 | But they were afraid of losing all of their kids.
01:02:55.020 | My dad is one of six, my mom is one of 10.
01:02:58.180 | And so they decided that there was something out there
01:03:03.180 | in this bleak time and they paired up their kids,
01:03:06.700 | one from each family,
01:03:08.340 | and sent them out on these little boats
01:03:11.020 | before the internet, before cell phones,
01:03:14.860 | and just said, "We'll see you on the other side."
01:03:17.580 | And the kids were between the ages of like 10 to 25.
01:03:23.500 | So young kids.
01:03:25.620 | My mom was a teenager, early teen when this happened.
01:03:29.300 | And everyone made it.
01:03:32.260 | And I get to sit here and talk to you.
01:03:34.660 | So how could I not believe that better is possible?
01:03:38.980 | And I hope that that's in my epigenetics somewhere
01:03:42.580 | and that I carry that on.
01:03:44.220 | - That is a spectacular story.
01:03:45.900 | - Isn't that wild?
01:03:46.740 | - It is spectacular.
01:03:47.580 | - How can I be a pessimist with that?
01:03:49.500 | - I love it.
01:03:50.380 | And I so appreciate that you became a physician
01:03:52.340 | because you're now bringing that optimism
01:03:55.260 | and that epigenetic understanding
01:03:57.720 | and cognitive understanding and emotional understanding
01:04:00.120 | to the field of medicine.
01:04:01.460 | So I'm grateful to the people that made that decision.
01:04:05.460 | - Yeah.
01:04:06.300 | And then when I think you don't really,
01:04:08.340 | I've always known that story,
01:04:10.340 | but you don't understand how wild that feels
01:04:13.020 | until you have your own child.
01:04:14.620 | And you're like, "Well, I can't even,
01:04:16.380 | I refuse to let her use anything but glass bottles
01:04:20.500 | or something like that."
01:04:21.740 | And you're like, "Oh my God,
01:04:23.740 | the risk and sort of willingness of my grandparents
01:04:27.160 | to believe in something bigger and better
01:04:30.380 | is just astounding."
01:04:31.440 | And our own children sort of give it a sense of urgency.
01:04:34.900 | - Yeah, spectacular story.
01:04:37.940 | And you're sending knowledge out into the fields of science
01:04:41.300 | and bringing knowledge into the fields of science.
01:04:42.900 | And I love this, we'll see you on the other side.
01:04:45.520 | - Yeah.
01:04:46.360 | - I'm confident that it will all come back.
01:04:48.220 | Well, thank you so much for that.
01:04:52.820 | Mark, you have the opportunity to talk about
01:04:55.580 | did having kids change your worldview?
01:04:57.580 | - It's really tough to beat that story.
01:04:59.180 | It is tough to beat that story.
01:05:01.740 | And they are also your children.
01:05:03.000 | So in this case, you get two for the price of one,
01:05:07.180 | so to speak.
01:05:08.020 | - Having children definitely changes your time horizon.
01:05:11.860 | Something that that's one thing is you just,
01:05:13.900 | like there were all these things that I think
01:05:15.220 | we had talked about for as long as we've known each other
01:05:18.780 | that you eventually want to go do.
01:05:20.120 | But then it's like, "Oh, we're having kids.
01:05:21.820 | We need to like get on this."
01:05:23.860 | So I have theirs.
01:05:24.700 | - That was actually one of the items on the checklist,
01:05:26.700 | the baby checklist before the first.
01:05:28.540 | - It was like the baby's coming.
01:05:29.820 | You have to like start CZI.
01:05:32.260 | - Truly.
01:05:33.100 | - And like sitting in the hospital delivery room,
01:05:37.140 | finishing editing the letter that we were going to publish
01:05:40.980 | to announce the work that we're doing on CZI.
01:05:42.820 | - Some people think that is an exaggeration.
01:05:43.660 | It was not.
01:05:44.660 | We really were editing the final draft.
01:05:47.100 | - Birthed CZI before your first child was born.
01:05:51.100 | Well, it's an incredible initiative.
01:05:53.540 | I've been following it since its inception.
01:05:55.620 | And it's already been tremendously successful
01:06:00.260 | and everyone in the field of science,
01:06:02.260 | and I have a lot of communication with those folks,
01:06:04.020 | it feels the same way.
01:06:05.220 | And the future is even brighter for it.
01:06:07.220 | It's clear.
01:06:08.060 | And thank you for expanding to the Midwest and New York.
01:06:10.220 | And we're all very excited to see where all of this goes.
01:06:14.740 | I share in your optimism.
01:06:16.760 | And thank you for your time today.
01:06:18.900 | - Thank you.
01:06:19.740 | - Thank you.
01:06:20.560 | A lot more to do.
01:06:21.520 | - I'd like to take a quick break
01:06:22.620 | and thank our sponsor, Inside Tracker.
01:06:25.060 | Inside Tracker is a personalized nutrition platform
01:06:27.500 | that analyzes data from your blood and DNA
01:06:30.180 | to help you better understand your body
01:06:31.820 | and help you reach your health goals.
01:06:33.600 | I've long been a believer in getting regular blood work done
01:06:36.340 | for the simple reason that many of the factors
01:06:38.180 | that impact your immediate and long-term health
01:06:40.420 | can only be analyzed from a quality blood test.
01:06:43.260 | A major problem with a lot of blood tests out there,
01:06:45.380 | however, is that you get information back
01:06:47.740 | about metabolic factors, lipids and hormones and so forth,
01:06:50.620 | but you don't know what to do with that information.
01:06:52.680 | With Inside Tracker, they make it very easy
01:06:54.880 | because they have a personalized platform
01:06:56.820 | that allows you to see the levels of all those things,
01:06:59.360 | metabolic factors, lipids, hormones, et cetera,
01:07:01.860 | but it gives you specific directives that you can follow
01:07:04.740 | that relate to nutrition, behavioral modification,
01:07:07.100 | supplements, et cetera,
01:07:08.340 | that can help you bring those numbers into the ranges
01:07:10.620 | that are optimal for you.
01:07:11.980 | If you'd like to try Inside Tracker,
01:07:13.400 | you can go to insidetracker.com/huberman
01:07:16.680 | to get 20% off any of Inside Tracker's plans.
01:07:19.300 | Again, that's insidetracker.com/huberman.
01:07:22.780 | And now for my discussion with Mark Zuckerberg.
01:07:25.940 | Slight shift of topic here.
01:07:27.380 | You're extremely well-known
01:07:29.900 | for your role in technology development,
01:07:32.380 | but by virtue of your personal interests
01:07:35.620 | and also where Meta technology interfaces
01:07:39.540 | with mental health and physical health,
01:07:41.660 | you're starting to become synonymous with health,
01:07:44.900 | whether or not you realize it or not.
01:07:47.040 | Part of that is because there's posted footage
01:07:50.100 | of you rolling jiu-jitsu, you won a jiu-jitsu competition
01:07:53.240 | recently, you're doing other forms of martial arts,
01:07:57.000 | water sports, including surfing and on and on.
01:08:02.180 | So you're doing it yourself,
01:08:05.020 | but maybe we could just start off with technology
01:08:08.380 | and get this issue out of the way first,
01:08:12.500 | which is that I think many people assume that technology,
01:08:16.460 | especially technology that involves a screen, excuse me,
01:08:19.500 | of any kind is going to be detrimental to our health,
01:08:23.320 | but that doesn't necessarily have to be the case.
01:08:26.780 | So could you explain how you see technology meshing with,
01:08:31.780 | inhibiting, or maybe even promoting
01:08:35.140 | physical and mental health?
01:08:37.300 | - Sure, I mean, I think this is a really important topic.
01:08:41.540 | The research that we've done suggests
01:08:45.820 | that it's not like all good or all bad.
01:08:49.060 | I think how you're using the technology has a big impact
01:08:52.180 | on whether it is basically a positive experience for you.
01:08:56.360 | And even within technology, even within social media,
01:08:59.460 | there's not kind of one type of thing that people do.
01:09:02.560 | I think at its best, you're forming
01:09:05.500 | meaningful connections with other people.
01:09:08.960 | And there's a lot of research that basically suggests
01:09:12.420 | that it's the relationships that we have
01:09:15.500 | and the friendships that kind of bring the most happiness
01:09:19.260 | and in our lives and at some level end up even correlating
01:09:23.460 | with living a longer and healthier life,
01:09:25.200 | because that kind of grounding that you have in community
01:09:28.460 | ends up being important for that.
01:09:29.580 | So I think that that aspect of social media,
01:09:32.820 | which is the ability to connect with people,
01:09:36.260 | to understand what's going on in people's lives,
01:09:38.680 | have empathy for them, communicate what's going on
01:09:41.260 | with your life, express that, that's generally positive.
01:09:44.900 | There are ways that it can be negative
01:09:47.380 | in terms of bad interactions, things like bullying,
01:09:50.980 | which we can talk about because there's a lot
01:09:52.660 | that we've done to basically make sure
01:09:54.080 | that people can be safe from that and give people tools
01:09:56.740 | and give kids the ability to have the right parental
01:09:59.260 | controls so their parents can oversee that.
01:10:01.420 | But that's sort of the interacting with people side.
01:10:04.520 | There's another side of all this,
01:10:07.160 | which I think of as just like passive consumption,
01:10:10.860 | which at its best, it's entertainment, right?
01:10:15.260 | And entertainment is an important human thing too.
01:10:18.060 | But I don't think that that has quite the same association
01:10:22.420 | with the long-term wellbeing and health benefits
01:10:27.420 | as being able to help people connect with other people does.
01:10:30.560 | And I think at its worst, some of the stuff
01:10:36.640 | that we see online, I think these days,
01:10:40.360 | a lot of the news is just so relentlessly negative
01:10:43.960 | that it's just hard to come away from an experience
01:10:47.040 | where you're looking at the news for half an hour
01:10:50.680 | and feel better about the world.
01:10:53.040 | So I think that there's a mix on this.
01:10:56.560 | I think the more that social media is about connecting
01:11:00.220 | with people and the more that when you're consuming
01:11:05.220 | and using the media part of social media
01:11:09.640 | to learn about things that kind of enrich you
01:11:12.440 | and can provide inspiration or education
01:11:15.400 | as opposed to things that just leave you
01:11:18.440 | with a more toxic feeling,
01:11:20.360 | that that's sort of the balance
01:11:22.320 | that we try to get right across our products.
01:11:24.600 | And I think we're pretty aligned with the community
01:11:27.120 | because at the end of the day,
01:11:28.700 | people don't want to use a product
01:11:30.800 | and come away feeling bad.
01:11:32.360 | There's a lot that people talk about,
01:11:34.240 | evaluate a lot of these products
01:11:37.520 | in terms of information and utility,
01:11:39.760 | but I think it's as important when you're designing a product
01:11:43.060 | to think about what kind of feeling you're creating
01:11:45.760 | with the people who use it.
01:11:47.360 | Whether that's kind of an aesthetic sense
01:11:49.240 | when you're designing hardware
01:11:50.800 | or just kind of like what do you make people feel.
01:11:54.400 | And generally people don't want to feel bad.
01:11:56.880 | So I think that doesn't mean that we want to shelter people
01:12:00.960 | from bad things that are happening in the world,
01:12:03.060 | but I don't really think that people want
01:12:08.120 | us to just be kind of showing
01:12:11.600 | all this super negative stuff all day long.
01:12:13.860 | So we work hard on all these different problems,
01:12:17.080 | making sure that we're helping connect people
01:12:19.080 | as best as possible,
01:12:20.160 | helping make sure that we give people good tools
01:12:23.480 | to block people who might be bullying them or harass them,
01:12:25.960 | or especially for younger folks.
01:12:28.000 | Anyone under the age of 16 defaults into an experience
01:12:30.520 | where their experience is private.
01:12:32.140 | We have all these parental tools.
01:12:34.520 | So that way parents can kind of understand
01:12:36.920 | what their children are up to, in a good balance.
01:12:40.540 | And then on the other side,
01:12:43.240 | we try to give people tools to understand
01:12:45.520 | how they're spending their time.
01:12:47.840 | We try to give people tools so that if you're a teen
01:12:50.660 | and you're kind of stuck in some loop
01:12:54.160 | of just looking at one type of content,
01:12:56.000 | we'll nudge you and say,
01:12:56.840 | "Hey, you've been looking at content
01:12:58.000 | of this type for a while.
01:12:59.480 | Like how about something else?
01:13:00.440 | And here's a bunch of other examples."
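For illustration, here is a hypothetical sketch of what such a nudge could look like under the hood. This is not Meta's actual implementation; the category labels, threshold, and message are invented.

```python
# Hypothetical sketch (not Meta's real code): nudge a user when one content
# category dominates their recent viewing history.
from collections import Counter

def nudge_if_stuck(recent_categories, threshold=0.8, min_items=20):
    """Return a nudge message if a single category dominates recent views."""
    if len(recent_categories) < min_items:
        return None
    top_category, count = Counter(recent_categories).most_common(1)[0]
    if count / len(recent_categories) >= threshold:
        return (f"You've been looking at {top_category} content for a while. "
                f"How about something else?")
    return None

history = ["fitness"] * 18 + ["cooking", "travel"]
print(nudge_if_stuck(history))            # suggests branching out
```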
01:13:02.360 | So I think that there were things that you can do
01:13:03.920 | to kind of push this in a positive direction,
01:13:05.600 | but I think it just starts with having a more nuanced view
01:13:08.760 | of like, this isn't all good or all bad.
01:13:11.740 | And the more that you can make it kind of a positive thing,
01:13:14.280 | the better this will be for all the people
01:13:16.080 | who use our products.
01:13:17.240 | - That makes really good sense.
01:13:18.360 | In terms of the negative experience, I agree.
01:13:21.360 | I don't think anyone wants a negative experience
01:13:23.620 | in the moment.
01:13:24.460 | I think where some people get concerned perhaps,
01:13:27.200 | and I think about my own interactions with say Instagram,
01:13:29.360 | which I use all the time for getting information out,
01:13:32.560 | but also consuming information.
01:13:34.400 | And I happen to love it.
01:13:35.220 | Where I essentially launched the non-podcast segment
01:13:38.600 | of my podcast and continue to,
01:13:40.380 | I can think of experiences that are a little bit
01:13:44.040 | like highly processed food,
01:13:46.280 | where it tastes good at the time.
01:13:48.800 | It's highly engrossing,
01:13:51.000 | but it's not necessarily nutritious
01:13:53.240 | and you don't feel very good afterwards.
01:13:55.920 | So for me, that would be my,
01:13:58.480 | the little collage of default options
01:14:01.040 | to click on in Instagram.
01:14:02.260 | Occasionally I noticed, and this just reflects my failure,
01:14:05.220 | not Instagram's, right?
01:14:07.020 | That there are a lot of like street fight things,
01:14:10.120 | like of people beating people up on the street.
01:14:12.440 | And I have to say,
01:14:13.320 | these have a very strong gravitational pull.
01:14:15.800 | I'm not somebody that enjoys seeing violence per se,
01:14:18.060 | but I find myself, I'll click on one of these,
01:14:20.900 | like what happened?
01:14:21.740 | And I'll see someone like get hit
01:14:23.680 | and there's like a little melee on the street or something.
01:14:26.040 | And those seem to be offered to me a lot lately.
01:14:28.420 | And again, this is my fault.
01:14:29.820 | It reflects my prior searching experience.
01:14:32.380 | But I noticed that it has a bit of a gravitational pull
01:14:35.460 | where I didn't learn anything.
01:14:39.360 | It's not teaching me any kind of useful street
01:14:41.720 | self-defense skills.
01:14:44.520 | And at the same time,
01:14:48.120 | I also really enjoy some of the cute animal stuff.
01:14:50.960 | And so I get a lot of those also.
01:14:52.140 | So there's this polarized collage that's offered to me
01:14:55.440 | that reflects my prior search behavior.
01:14:58.480 | You could argue that the cute animal stuff
01:15:01.320 | is just entertainment,
01:15:02.780 | but actually it fills me with a feeling in some cases
01:15:05.740 | that truly delights me.
01:15:06.820 | I delight in animals.
01:15:07.860 | And we're not just talking about kittens.
01:15:08.760 | I mean, animals I've never seen before,
01:15:11.080 | interactions between animals I've never seen before
01:15:12.980 | that truly delight me.
01:15:14.000 | They energize me in a positive way
01:15:15.460 | that when I leave Instagram, I do think I'm better off.
01:15:19.020 | So I'm grateful for the algorithm in that sense.
01:15:21.560 | But I guess the direct question is,
01:15:24.660 | is the algorithm just reflective
01:15:26.980 | of what one has been looking at a lot
01:15:29.220 | prior to that moment where they log on?
01:15:31.560 | Or is it also trying to do exactly what you described,
01:15:35.200 | which is trying to give people a good feeling experience
01:15:38.280 | that leads to more good feelings?
01:15:40.980 | - Yeah, I mean, I think we try to do this
01:15:42.640 | in a long-term way, right?
01:15:44.220 | I think one simple example of this
01:15:46.680 | is we had this issue a number of years back
01:15:49.700 | about clickbait news, right?
01:15:51.620 | So articles that would have basically a headline
01:15:57.200 | that grabbed your attention, that made you feel like,
01:15:59.440 | oh, I need to click on this, and then you click on it.
01:16:01.500 | And then the article is actually about something
01:16:03.700 | that's somewhat tangential to it.
01:16:05.940 | But people clicked on it.
01:16:07.160 | So the naive version of this stuff,
01:16:09.540 | the 10-year-old version, it was like,
01:16:11.580 | oh, people seem to be clicking on this.
01:16:13.020 | Maybe that's good.
01:16:14.180 | But it's actually a pretty straightforward exercise
01:16:16.900 | to instrument the system to realize that,
01:16:19.020 | hey, people click on this,
01:16:20.340 | and then they don't really spend a lot of time
01:16:23.980 | reading the news after clicking on it.
01:16:27.260 | And after they do this a few times,
01:16:30.260 | it doesn't really correlate with them
01:16:33.560 | saying that they're having a good experience.
01:16:36.420 | Some of how we measure this
01:16:39.580 | is just by looking at how people use the services.
01:16:42.180 | But I think it's also important to balance that
01:16:44.460 | by having real people come in and tell us,
01:16:48.180 | okay, we show them,
01:16:49.500 | here are the stories that we could have showed you.
01:16:51.920 | Which of these are most meaningful to you
01:16:55.120 | or would make it so that you have the best experience
01:16:57.260 | and just kind of mapping the algorithm
01:17:00.140 | and what we do to that ground truth
01:17:01.560 | of what people say that they want.
01:17:02.780 | So I think that through a set of things like that,
01:17:06.040 | we really have made large steps
01:17:08.620 | to minimize things like clickbait over time.
01:17:10.720 | It's not gone from the internet,
01:17:12.000 | but I think we've done a good job
01:17:13.420 | of minimizing it on our services.
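To illustrate the kind of instrumentation being described, here is a hedged sketch: a toy ranking score that blends click-through rate with post-click dwell time and with survey "ground truth," so that items people click but immediately abandon score lower. This is my own illustration, not Meta's actual ranking code; every field, weight, and number is invented.

```python
# Toy sketch (not Meta's actual ranking system): down-weight "click then
# bounce" items by blending click-through rate, post-click dwell time,
# and survey feedback about what people say is meaningful.
from dataclasses import dataclass

@dataclass
class ArticleStats:
    impressions: int
    clicks: int
    total_dwell_seconds: float   # time spent reading after clicking
    survey_score: float          # 0-1, "was this meaningful?" from panels

def ranking_score(a: ArticleStats,
                  min_dwell: float = 20.0,
                  survey_weight: float = 0.5) -> float:
    ctr = a.clicks / max(a.impressions, 1)
    avg_dwell = a.total_dwell_seconds / max(a.clicks, 1)
    dwell_factor = min(avg_dwell / min_dwell, 1.0)   # bounces push this to 0
    engagement = ctr * dwell_factor
    return (1 - survey_weight) * engagement + survey_weight * a.survey_score

clickbait = ArticleStats(10_000, 900, 4_500, 0.2)    # many clicks, short reads
in_depth = ArticleStats(10_000, 300, 36_000, 0.8)    # fewer clicks, long reads
print(ranking_score(clickbait) < ranking_score(in_depth))   # True
```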
01:17:15.400 | Within that though,
01:17:17.940 | I do think that we need to be pretty careful
01:17:19.420 | about not being paternalistic
01:17:20.880 | about what makes different people feel good.
01:17:23.300 | I don't know that everyone feels good about cute animals.
01:17:28.060 | I can't imagine that people would feel really bad about it,
01:17:30.500 | but maybe they don't have as profound
01:17:31.700 | of a positive reaction to it as you just expressed.
01:17:35.360 | And I don't know, maybe people who are more into fighting
01:17:39.960 | would look at the street fighting videos,
01:17:42.420 | assuming that they're within our community standards.
01:17:44.100 | I think that there's a level of violence
01:17:45.460 | that we just don't want to be showing at all,
01:17:47.100 | but that's a separate question.
01:17:50.000 | But if they are, I mean, it's like,
01:17:51.860 | I mean, I'm pretty into MMA.
01:17:53.340 | I don't get a lot of street fighting videos,
01:17:55.260 | but if I did, maybe I'd feel like
01:17:56.660 | I was learning something from that.
01:17:58.460 | I think at various times in the company's history,
01:18:03.040 | we've been a little bit too paternalistic
01:18:05.380 | about saying this is good content, this is bad,
01:18:09.380 | you should like this, this is unhealthy for you.
01:18:13.140 | And I think that we want to look at the long-term effects.
01:18:17.420 | You don't want to get stuck in a short-term loop
01:18:19.540 | of like, okay, just 'cause you did this today
01:18:20.940 | doesn't mean it's what you aspire for yourself over time.
01:18:24.060 | But I think as long as you look at the long-term
01:18:28.120 | of what people both say they want and what they do,
01:18:31.200 | giving people a fair amount of latitude
01:18:33.040 | to like the things that they like,
01:18:35.020 | I just think feels like the right set of values
01:18:37.480 | to bring to this.
01:18:38.440 | Now, of course, that doesn't go for everything.
01:18:41.000 | There are things that are kind of truly off limits
01:18:43.060 | and things that, like bullying, for example,
01:18:47.340 | or things that are really like inciting violence,
01:18:50.020 | things like that.
01:18:50.860 | I mean, we have the whole community standards around this.
01:18:53.040 | But I think, except for those things
01:18:55.840 | which I would hope that most people can agree,
01:18:58.220 | okay, bullying is bad, right?
01:18:59.380 | I hope that 100% of people agree with that,
01:19:03.420 | not 100, maybe 99%,
01:19:06.220 | except for the things that feel
01:19:09.940 | pretty extreme and bad like that,
01:19:14.060 | I think you want to give people space
01:19:15.400 | to like what they want to like.
01:19:17.660 | - Yesterday, I had the very good experience
01:19:20.400 | of learning from the Meta team about safety protections
01:19:23.700 | that are in place for kids who are using Meta platforms.
01:19:28.700 | And frankly, I was like really positively surprised
01:19:32.780 | at the huge number of filter-based tools
01:19:36.140 | and just ability to customize the experience
01:19:39.880 | so that it can stand the best chance of enriching,
01:19:42.780 | not just remaining neutral, but enriching their mental health
01:19:46.220 | status.
01:19:47.060 | One thing that came about in that conversation, however,
01:19:51.160 | was I realized there are all these tools,
01:19:54.440 | but do people really know that these tools exist?
01:19:56.800 | And I think about my own experience with Instagram,
01:19:58.600 | I love watching Adam Mosseri's Friday Q&As
01:20:02.380 | because he explains a lot of the tools
01:20:04.920 | that I didn't know existed.
01:20:07.500 | And if people haven't seen that,
01:20:10.120 | I highly recommend that they watch that.
01:20:12.520 | I think he takes questions on Thursdays
01:20:14.320 | and answers them most every Friday.
01:20:16.120 | So if I'm not aware of the tools without watching that,
01:20:21.240 | tools that exist for adults,
01:20:22.780 | how does Meta look at the challenge of making sure
01:20:26.560 | that people know that there are all these tools?
01:20:28.280 | I mean, dozens and dozens of very useful tools,
01:20:30.640 | but I think most of us just know the hashtag,
01:20:32.960 | the tag, the click, stories versus feed.
01:20:37.480 | We now know that, you know, I also post to threads.
01:20:40.200 | I mean, so we know the major channels and tools,
01:20:42.980 | but this is like owning a vehicle
01:20:44.680 | that has incredible features
01:20:46.320 | that one doesn't realize can take you off-road,
01:20:48.980 | can allow your vehicle to fly.
01:20:50.600 | I mean, there's a lot there.
01:20:52.160 | So what do you think could be done
01:20:54.280 | to get that information out?
01:20:55.240 | Maybe this conversation could cue people to their existence.
01:20:57.560 | - I mean, that's part of the reason
01:20:59.220 | why I wanted to talk to you about this is,
01:21:00.960 | I mean, I think most of the narrative around social media
01:21:04.420 | is not, okay, all of the different tools
01:21:06.680 | that people have to control their experience.
01:21:08.500 | That's, you know, the kind of narrative of,
01:21:11.200 | is this just negative for teens or something?
01:21:14.740 | And I think, again, a lot of this comes down to,
01:21:17.140 | you know, how is the experience being tuned
01:21:20.940 | and is it actually, you know,
01:21:23.060 | like are people using it to connect in positive ways?
01:21:25.620 | And if so, I think it's really positive.
01:21:27.580 | So, yeah, I mean, I think part of this is
01:21:30.900 | we probably just need to get out
01:21:31.740 | and talk to people more about it.
01:21:33.300 | And then there's an in-product aspect,
01:21:36.220 | which is, you know, if you're a teen
01:21:38.320 | and you sign up, we take you through
01:21:40.460 | a pretty, you know, extensive experience
01:21:43.560 | that tries to outline some of this.
01:21:46.380 | But that has limits too, right?
01:21:48.060 | Because when you sign up for a new thing,
01:21:50.260 | if you're bombarded with like,
01:21:51.460 | here's a list of features, you're like,
01:21:53.360 | okay, I just signed up for this.
01:21:54.460 | I don't really understand much about what the service is.
01:21:57.140 | Like, let me go find some people to follow
01:22:00.220 | who are my friends on here
01:22:01.500 | before I like learn about controls
01:22:03.020 | to prevent people from harassing me or something.
01:22:07.980 | That's why I think it's really important
01:22:09.260 | to also show a bunch of these tools in context.
01:22:13.800 | So, you know, if you're looking at comments
01:22:16.140 | and, you know, if you go to, you know, delete a comment
01:22:20.600 | or you go to edit something, you know,
01:22:22.940 | try to give people prompts in line.
01:22:24.700 | It's like, hey, did you know that you can manage things
01:22:26.640 | in these ways around that?
01:22:29.080 | Or when you're in the inbox
01:22:30.300 | and you're filtering something, right?
01:22:32.080 | we remind people in line.
01:22:34.160 | So just because of the number of people
01:22:37.020 | who use the products and the level of nuance
01:22:39.340 | around each of the controls,
01:22:40.480 | I think the vast majority of that education,
01:22:44.740 | I think, needs to happen in the product.
01:22:47.200 | But I do think that through conversations like this
01:22:49.440 | and others that, you know, we need to be doing,
01:22:52.820 | I mean, we can create a broader awareness
01:22:55.260 | that those things exist.
01:22:56.380 | So that way at least people are primed.
01:22:58.300 | So that way when those things pop up in the product,
01:23:00.100 | people are like, oh yeah,
01:23:00.940 | like I knew that there was this control
01:23:02.300 | and like, here's like how I would use that.
01:23:05.460 | - Like, I find the restrict function to be very useful
01:23:08.660 | more than the block function.
01:23:09.980 | In most cases, I do sometimes have to block people,
01:23:12.340 | but the restrict function is really useful
01:23:13.820 | that you could filter specific comments.
01:23:15.980 | You know, someone might have a,
01:23:17.880 | you might recognize that someone has a tendency
01:23:19.540 | to be a little aggressive.
01:23:20.900 | And I should point out that I actually don't really mind
01:23:22.840 | what people say to me,
01:23:23.680 | but I try and maintain what I call classroom rules
01:23:25.700 | in my comment section
01:23:27.260 | where I don't like people attacking other people
01:23:28.860 | because I would never tolerate that
01:23:30.020 | in the university classroom.
01:23:31.080 | I'm not going to tolerate that in the comment section,
01:23:32.780 | for instance.
01:23:33.620 | - Yeah, and I think that the example that you just,
01:23:36.860 | you just used about restrict versus block
01:23:39.060 | gets to something about product design
01:23:40.660 | that's important too,
01:23:42.100 | which is that block is sort of this very powerful tool
01:23:46.940 | that if someone is giving you a hard time
01:23:48.540 | and you just want them to disappear from the experience,
01:23:50.500 | you can do it.
01:23:51.620 | But the design trade-off with that
01:23:55.820 | is that in order to make it so that the person
01:23:57.600 | is just gone from the experience
01:23:59.540 | and that you don't show up to them,
01:24:03.320 | they don't show up to you,
01:24:05.300 | inherent to that is that they will have a sense
01:24:08.260 | that you blocked them.
01:24:09.720 | And that's why I think some stuff like restrict
01:24:12.560 | or just filtering,
01:24:13.500 | like I just don't want to see as much stuff
01:24:15.100 | about this topic.
01:24:16.140 | People like using different tools for very subtle reasons.
01:24:20.300 | I mean, maybe you want the content to not show up,
01:24:23.860 | but you don't want the person who's posting the content
01:24:26.140 | to know that you don't want it to show up.
01:24:28.380 | Maybe you don't want to get the messages in your main inbox,
01:24:30.440 | but you don't want to tell the person
01:24:31.700 | that you're not friends or something like that.
01:24:36.620 | I mean, you actually need to give people different tools
01:24:39.020 | that have different levels of kind of power
01:24:41.340 | and nuance around how the social dynamics
01:24:45.300 | around using them play out
01:24:46.620 | in order to really allow people to tailor the experience
01:24:50.460 | in the ways that they want.
01:24:51.840 | - In terms of trying to limit total amount of time
01:24:55.040 | on social media,
01:24:56.480 | I couldn't find really good data on this.
01:25:01.400 | How much time is too much?
01:25:02.840 | I mean, I think it's going to depend on what one
01:25:04.940 | is looking at, the age of the user, et cetera.
01:25:07.420 | - Yeah, I agree.
01:25:08.260 | - But I know that you have tools that cue the user
01:25:12.100 | to how long they've been on a given platform.
01:25:15.260 | Are there tools to self-regulate?
01:25:17.420 | Like I'm thinking about like the Greek myth of the sirens
01:25:20.100 | and people tying themselves to the mast
01:25:22.420 | and covering their eyes
01:25:23.300 | so that they're not drawn in by the sirens.
01:25:25.180 | Is there a function aside from deleting the app temporarily
01:25:28.380 | and then reinstalling it every time you want to use it again?
01:25:31.740 | Is there a true lockout, self lockout function
01:25:34.600 | where one can lock themselves out of access to the app?
01:25:38.040 | - Well, I think we give people tools
01:25:39.300 | that let them manage this
01:25:40.980 | and there's the tools that you get to use
01:25:43.360 | and then there's the tools that the parents get to use
01:25:46.120 | to basically see how the usage works.
01:25:48.760 | But yeah, I think that there are different kinds of tools,
01:25:51.560 | and I think for now we've mostly focused
01:25:53.120 | on helping people understand this
01:25:55.280 | and then giving people reminders and things like that.
01:25:59.580 | It's tough though to answer the question
01:26:02.600 | that you were talking about before this
01:26:04.880 | of is there an amount of time which is too much?
01:26:07.520 | Because it does really get to what you're doing.
01:26:09.800 | But if you fast forward beyond just the apps
01:26:12.780 | that we have today to an experience
01:26:15.080 | that is like a social experience
01:26:16.980 | in the future of the augmented reality glasses
01:26:19.920 | or something that we're building,
01:26:22.160 | a lot of this is gonna be you're interacting with people
01:26:27.160 | in the way that you would physically
01:26:29.280 | as if you were kind of like hanging out with friends
01:26:31.440 | or working with people.
01:26:33.600 | But now they can show up as holograms
01:26:36.480 | and you can feel like you're present right there with them
01:26:39.040 | no matter where they actually are.
01:26:40.980 | And the question is, is there too much time
01:26:43.600 | to spend interacting with people like that?
01:26:45.760 | Well, at the limit, if we can get that experience
01:26:48.640 | to be kind of as rich and giving you as good
01:26:52.040 | of a sense of presence as you would have
01:26:55.780 | if you were physically there with someone,
01:26:57.840 | then I don't see why you would wanna restrict the amount
01:27:01.480 | that people use that technology to any less than
01:27:04.920 | what would be the amount of time that you'd be comfortable
01:27:08.600 | interacting with people physically.
01:27:10.940 | Which obviously is not gonna be 24 hours a day.
01:27:12.680 | You have to do other stuff.
01:27:14.520 | You have work, you need to sleep.
01:27:16.420 | But I think it really gets to kind of how you're using
01:27:18.900 | these things.
01:27:19.740 | Whereas if what you're primarily using the services for
01:27:22.380 | is getting stuck in loops reading news
01:27:26.820 | or something that is really kind of getting you
01:27:29.020 | into a negative mental state, then I don't know.
01:27:32.380 | I mean, I think that there's probably a relatively short
01:27:34.320 | period of time that maybe that's kind of a good thing
01:27:36.820 | that you wanna be doing.
01:27:38.720 | But again, even then it's not zero, right?
01:27:40.300 | 'Cause just because news might make you unhappy
01:27:43.460 | doesn't mean that the answer is to be unaware
01:27:45.460 | of negative things that are happening in the world.
01:27:47.260 | I just think that there's like different people
01:27:48.980 | have different tolerances for what they can take on that.
01:27:52.140 | And I think generally having some awareness
01:27:55.020 | is probably good as long as it's not more than
01:27:57.480 | you're kind of constitutionally able to take.
01:27:59.940 | So I don't know.
01:28:01.820 | We try to not be too paternalistic about this in our approach.
01:28:05.240 | But we want to empower people by giving them the tools,
01:28:08.700 | both you and, if you're a teen, your parents,
01:28:12.500 | to understand what you're experiencing
01:28:14.980 | and how you're using these things, and then go from there.
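As an illustration of the kind of usage reminders and self-imposed lockouts being discussed, here is a toy sketch; the class name, thresholds, and "reminder"/"locked" states are hypothetical and do not describe any actual Meta tool.

```python
import time
from typing import Optional

class UsageTimer:
    """Toy daily-usage tracker: nudges after a soft limit, optionally locks after a hard limit."""

    def __init__(self, soft_limit_min: float, hard_limit_min: Optional[float] = None):
        self.soft_limit = soft_limit_min * 60
        self.hard_limit = hard_limit_min * 60 if hard_limit_min is not None else None
        self.seconds_today = 0.0
        self.session_start: Optional[float] = None

    def open_app(self) -> None:
        self.session_start = time.monotonic()

    def close_app(self) -> None:
        if self.session_start is not None:
            self.seconds_today += time.monotonic() - self.session_start
            self.session_start = None

    def status(self) -> str:
        used = self.seconds_today
        if self.session_start is not None:
            used += time.monotonic() - self.session_start
        if self.hard_limit is not None and used >= self.hard_limit:
            return "locked"    # self-imposed lockout for the rest of the day
        if used >= self.soft_limit:
            return "reminder"  # the kind of usage nudge the apps already show
        return "ok"

timer = UsageTimer(soft_limit_min=30, hard_limit_min=60)
timer.open_app()
print(timer.status())  # "ok" right after opening
```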
01:28:19.220 | - Yeah, I think it requires of all of us
01:28:21.100 | some degree of self-regulation.
01:28:22.720 | I like this idea of not being too paternalistic.
01:28:24.740 | I mean, that's, it seems like the right way to go.
01:28:26.900 | I find myself occasionally having to make sure
01:28:29.540 | that I'm not just passively scrolling, that I'm learning.
01:28:32.660 | I like foraging for, organizing, and dispersing information.
01:28:36.480 | That's been my life's career.
01:28:38.480 | So I've learned so much from social media.
01:28:40.540 | I find great papers, great ideas.
01:28:43.620 | I think comments are a great source of feedback.
01:28:45.540 | And I'm not just saying that 'cause you're sitting here.
01:28:47.160 | I mean, Instagram in particular,
01:28:49.180 | but other meta platforms have been tremendously helpful
01:28:52.480 | for me to get science and health information out.
01:28:55.500 | One of the things that I'm really excited about,
01:28:58.340 | which I only had the chance to try for the first time today,
01:29:01.300 | is your new VR platforms, the newest Oculus.
01:29:04.940 | And then we can talk about the glasses, the Ray-Bans.
01:29:07.740 | - Sure.
01:29:08.580 | - Those are still, those two experiences
01:29:10.660 | are still kind of blowing my mind,
01:29:12.340 | especially the Ray-Ban glasses.
01:29:16.540 | And I have so many questions about this, so I'll resist.
01:29:19.980 | - We can get into that.
01:29:20.820 | - Okay, well, yeah, I have some experience with VR.
01:29:22.940 | My lab has used VR.
01:29:24.260 | Jeremy Bailenson's lab at Stanford
01:29:27.500 | is one of the pioneering labs of VR and mixed reality.
01:29:30.520 | I guess some used to call it augmented reality,
01:29:32.540 | but now mixed reality.
01:29:33.860 | I think what's so striking about the VR
01:29:36.200 | that you guys had me try today
01:29:38.780 | is how well it interfaces with the real room,
01:29:42.020 | let's call it the physical room.
01:29:44.300 | I could still see people.
01:29:45.500 | I could see where the furniture was,
01:29:46.700 | so I wasn't gonna bump into anything.
01:29:48.300 | I could see people's smiles.
01:29:49.320 | I could see my water on the table
01:29:52.500 | while I was doing this,
01:29:54.620 | what felt like a real martial arts experience,
01:29:57.740 | except I wasn't getting hit,
01:29:59.580 | well, I was getting hit virtually,
01:30:01.780 | but it's extremely engaging.
01:30:04.620 | And yet it, on the good side of things,
01:30:06.980 | it really bypasses a lot of the early concerns
01:30:10.020 | that Bailenson lab,
01:30:11.180 | again, Jeremy's lab was early to say that,
01:30:13.580 | oh, there's a limit to how much VR one can
01:30:15.820 | or should use each day, even for the adult brain,
01:30:19.820 | because it can really disrupt your vestibular system,
01:30:23.300 | your sense of balance.
01:30:24.520 | All of that seems to have been dealt with
01:30:26.080 | in this new iteration of VR.
01:30:28.300 | Like we didn't come out of it feeling dizzy at all.
01:30:30.100 | I didn't feel like I was reentering the room
01:30:31.940 | in a way that was really jarring.
01:30:33.620 | Going into it is obviously,
01:30:34.860 | whoa, this is a different world,
01:30:36.100 | but you can look to your left and say,
01:30:39.080 | oh, someone just came in the door.
01:30:40.460 | Hey, how's it going?
01:30:41.300 | Hold on, I'm playing this game just as it was
01:30:42.620 | when I was a kid playing a Nintendo and someone walked in.
01:30:45.180 | It's fully engrossing, but you'd be like,
01:30:46.460 | hold on and you see they're there.
01:30:47.720 | So first of all, Bravo, incredible.
01:30:51.920 | And then the next question is,
01:30:55.340 | what is this, what do we even call this experience?
01:30:58.040 | Because it is truly mixed.
01:31:00.140 | It's a truly mixed reality experience.
01:31:02.380 | - Yeah, I mean, mixed reality is sort of the umbrella term
01:31:05.420 | that refers to the combined experience
01:31:08.300 | of virtual and augmented reality.
01:31:10.300 | So augmented reality is what you're eventually going to get
01:31:13.800 | with some future version of the smart glasses
01:31:16.500 | where you're primarily seeing the world, right?
01:31:19.420 | But you can put holograms in it, right?
01:31:22.300 | So like that we'll have a future
01:31:25.700 | where you're going to walk into a room
01:31:26.980 | and there are going to be as many holograms
01:31:29.220 | as physical objects, right?
01:31:31.020 | If you just think about like all the paper,
01:31:32.820 | the kind of art, physical games, media, your workstation.
01:31:36.300 | - If we refer to, let's say an MMA fight,
01:31:38.260 | we could just draw it up on the table right here
01:31:39.980 | and just see it repeat as opposed to us turning
01:31:41.940 | and looking at a screen.
01:31:42.780 | - Yeah, I mean, pretty much any screen that exists
01:31:44.520 | could be a hologram in the future with smart glasses, right?
01:31:47.720 | There's nothing that actually physically needs to be there
01:31:50.300 | for that when you have glasses
01:31:51.780 | that can put a hologram there.
01:31:54.680 | And it's an interesting thought experiment
01:31:56.300 | to just go around and think about, okay,
01:31:57.540 | what of the things that are physical in the world
01:31:59.560 | need to actually be physical?
01:32:01.540 | And your chair does, right?
01:32:02.620 | 'Cause you're sitting on it.
01:32:03.460 | A hologram isn't going to support you.
01:32:05.100 | But I like that art on the wall.
01:32:06.760 | I mean, that doesn't need to physically be there.
01:32:09.080 | I mean, so I think that that's sort of
01:32:13.760 | the augmented reality experience that we're moving towards.
01:32:16.900 | And then we've had these headsets
01:32:18.780 | that historically we think about as VR.
01:32:22.380 | And that has been something that kind of,
01:32:25.300 | it's like a fully immersive experience.
01:32:27.680 | But now we're kind of getting something
01:32:30.020 | that's a hybrid in between the two and capable of both,
01:32:32.620 | which is a headset that can do both virtual reality
01:32:35.060 | and some of these augmented reality experiences.
01:32:38.300 | And I think that that's really powerful.
01:32:41.100 | Both because you're going to get new applications
01:32:43.900 | that kind of allow people to collaborate together.
01:32:46.260 | And maybe the two of us are here physically,
01:32:48.800 | but someone joins us and it's their avatar there.
01:32:51.860 | Or maybe it's some version of the future,
01:32:53.860 | like you're having a team meeting
01:32:56.180 | and you have some people there physically,
01:32:57.900 | and you have some people dialing in,
01:32:59.260 | and they're basically like a hologram.
01:33:00.680 | They're virtually, but then you also have some AI personas
01:33:04.060 | that are on your team
01:33:04.900 | that are helping you do different things.
01:33:06.060 | And they can be embodied as avatars
01:33:07.480 | and around the table meeting with you.
01:33:09.080 | - Are people going to be doing first dates
01:33:10.780 | that are physically separated?
01:34:12.680 | I could imagine that some people would do an
01:34:14.340 | "is it even worth leaving the house" type of date.
01:33:16.700 | And then they find out,
01:33:17.600 | and then they meet for the first time.
01:33:19.340 | - I mean, maybe.
01:33:20.220 | I think, you know, dating has physical aspects to it too.
01:34:25.220 | - And some people might want to know
01:34:28.040 | whether or not it's worth the effort
01:34:29.860 | to head out, whether they want to bridge the divide, right?
01:33:34.340 | - It is possible.
01:33:35.180 | I mean, I know like some of my friends
01:33:37.380 | who are dating basically say that in order to make sure
01:33:42.380 | that they have like a safe experience,
01:33:45.020 | then if they're going on a first date,
01:33:46.520 | they'll schedule something that's like shorter
01:33:49.340 | and maybe in the middle of the day,
01:33:50.420 | like maybe it's coffee,
01:33:51.420 | so that way if they don't like the person,
01:33:52.660 | they can just kind of get out
01:33:53.980 | before like going and scheduling a dinner
01:33:55.700 | or like a real full date.
01:33:57.160 | So I don't know, maybe in the future,
01:33:58.380 | people will kind of have that experience
01:34:00.180 | where you can feel like you're kind of sitting there,
01:34:02.800 | and it's even easier and lighter weight and safer.
01:34:06.420 | And if you're not having a good experience,
01:34:07.780 | you can just like teleport out of there and beyond.
01:34:10.700 | But yeah, I think that this will be an interesting question
01:34:14.700 | in the future is there are clearly a lot of things
01:34:18.840 | that are only possible physically that,
01:34:21.820 | or so much better physically.
01:34:24.260 | And then there are all these things that we're building up
01:34:25.920 | that can be digital experiences,
01:34:28.060 | but it's this weird artifact of kind of how this stuff
01:34:31.800 | has been developed that the digital world
01:34:34.100 | and the physical world exist in these
01:34:35.680 | like completely different planes,
01:34:37.320 | where if you want to interact with the digital world,
01:34:39.060 | well, we do it all the time,
01:34:39.920 | you pull out a small screen or you have a big screen.
01:34:42.660 | We're just basically
01:34:43.500 | interacting with the digital world
01:34:44.320 | through these screens.
01:34:45.580 | But I think if we fast forward a decade or more,
01:34:50.260 | it's I think one of the really interesting questions
01:34:54.320 | about what is the world that we're going to live in?
01:34:57.280 | I think it's going to increasingly be this mesh
01:34:59.120 | of the physical and digital worlds
01:35:00.720 | that will allow us to feel A,
01:35:03.520 | that the world that we're in is just a lot richer
01:35:06.420 | 'cause there can be all these things that people create
01:35:08.200 | that are just so much easier to do digitally than physically.
01:35:12.140 | But B, you're going to have a real kind of physical sense
01:35:17.140 | of presence with these things
01:35:19.200 | and not feel like interacting in the digital world
01:35:21.340 | is taking you away from the physical world,
01:35:24.040 | which today is just so much viscerally richer
01:35:26.200 | and more powerful.
01:35:27.660 | I think the digital world will sort of be embedded in that
01:35:31.560 | and will feel kind of just as vivid in a lot of ways.
01:35:35.080 | So that's why I always think,
01:35:36.500 | and when you were saying before,
01:35:38.000 | you felt like you could look around and see the real room.
01:35:41.040 | I actually think that there's an interesting
01:35:42.400 | kind of philosophical distinction
01:35:43.900 | between the real room and the physical room,
01:35:46.560 | which historically I think people would have said
01:35:48.740 | those are the same thing.
01:35:50.460 | But I actually think in the future,
01:35:52.040 | the real room is going to be the combination
01:35:54.580 | of the physical world with all the digital artifacts
01:35:57.120 | and objects that are in there
01:35:58.200 | that you can interact with them and feel present,
01:36:00.360 | whereas the physical world is just the part
01:36:01.940 | that's physically there.
01:36:03.320 | And I think it's possible to build a real world
01:36:05.240 | that's the sum of these two
01:36:06.280 | that will actually be a more profound experience
01:36:08.940 | than what we have today.
01:36:10.400 | - Well, I was struck by the smoothness of the interface
01:36:12.880 | between the VR and the physical room.
01:36:15.280 | Your team had me try a,
01:36:16.880 | I guess it was an exercise class in the form of a boxing workout.
01:36:21.040 | It was essentially like hitting mitts boxing,
01:36:23.500 | so hitting targets boxing. - Supernatural.
01:36:25.160 | - Yeah, and it comes at a fairly fast pace
01:36:27.820 | that then picks up.
01:36:28.660 | It's got some tutorial.
01:36:29.720 | It's very easy to use.
01:36:31.040 | And certainly got my heart rate up.
01:36:32.560 | I'm in at least decent shape.
01:36:34.840 | And I have to be honest,
01:36:36.520 | I've never once desired
01:36:38.240 | to do any of these onscreen fitness things.
01:36:40.500 | I mean, I can't think of anything more aversive
01:36:42.300 | than like a class,
01:36:44.560 | like I don't want to insult any particular products,
01:36:47.580 | but like riding a stationary bike while looking at a screen
01:36:50.440 | and pretending I'm on a road outside,
01:36:51.920 | I can't think of anything worse for me.
01:36:54.580 | Maybe only- - I do like the leaderboard.
01:36:56.320 | - Okay, yeah. - Maybe I'm just
01:36:57.160 | a very competitive person.
01:36:58.000 | It's like, if you're going to be running on a treadmill,
01:37:00.040 | at least give me a leaderboard
01:37:01.640 | so I can beat the people who are ahead of me.
01:37:03.360 | - I like moving outside,
01:37:04.880 | and certainly an exercise class or aerobics class,
01:37:07.620 | as they used to call them.
01:37:08.460 | But the experience I tried today was extremely engaging.
01:37:12.560 | And I've done enough boxing
01:37:14.640 | to at least know how to do a little bit of it.
01:37:17.260 | And I really enjoyed it.
01:37:18.320 | It gets your heart rate up.
01:37:19.160 | And I completely forgot
01:37:20.360 | that I was doing an onscreen experience
01:37:22.600 | in part because I believe
01:37:25.400 | I was still in that physical room.
01:37:28.320 | And I think there's something about the mesh
01:37:31.560 | of the physical room and the virtual experience
01:37:35.320 | that makes it neither of one world or the other.
01:37:38.920 | I mean, I really felt at the interface of those,
01:37:40.760 | and certainly got presence,
01:37:42.280 | this feeling of forgetting that I was in a virtual experience
01:37:45.160 | and got my heart rate up pretty quickly.
01:37:46.960 | We had to stop 'cause we were going to start recording,
01:37:48.560 | but I would do that for a good 45 minutes in the morning.
01:37:51.200 | And there's no amount of money you could pay me, truly,
01:37:54.800 | to look at a screen while pedaling on a bike
01:37:57.320 | or running on a treadmill.
01:37:58.640 | So, again, bravo.
01:38:00.540 | I think it's going to be very useful.
01:38:01.480 | It's going to get people moving their bodies more,
01:38:03.520 | which, certainly, social media up until now
01:38:07.400 | and a lot of technologies have been accused
01:38:10.060 | of limiting the amount of physical activity
01:38:13.440 | that both children and adults are engaged in.
01:38:15.960 | And we know we need physical activity.
01:38:17.560 | You're a big proponent of and practitioner
01:38:19.800 | of physical activity. - Yeah, totally, yeah.
01:38:20.640 | - So is this a major goal of Meta?
01:38:22.760 | Get people moving their bodies more
01:38:25.320 | and getting their heart rates up and so on?
01:38:28.280 | - I think we want to enable it and I think it's good,
01:38:31.840 | but I think it comes more from a philosophical view
01:38:36.840 | of the world than it is necessarily.
01:38:41.560 | I don't go into building products
01:38:43.320 | to try to shape people's behavior.
01:38:45.640 | I believe in empowering people to do what they want
01:38:49.960 | and be the best version of themselves that they can be.
01:38:53.640 | - So no agenda.
01:38:54.920 | - That said, I do believe that there's the previous
01:38:59.120 | generation of computers were devices for your mind.
01:39:02.380 | And I think that we are not brains in tanks.
01:39:06.480 | You know, I think that there's sort of a philosophical view
01:39:09.560 | of people of like, okay, you are primarily
01:39:13.500 | what you think about or your values or something.
01:39:15.720 | It's like, no, you are that
01:39:17.200 | and you are a physical manifestation.
01:39:19.280 | And people are very physical.
01:39:22.220 | And I think building a computer for your whole body
01:39:27.220 | and not just for your mind is very fitting
01:39:32.400 | with this worldview that like the actual essence of you,
01:39:35.840 | if you want to be present with another person,
01:39:37.680 | if you want to like be fully engaged in experience
01:39:40.600 | is not just, okay, it's not just a video conference call
01:39:44.400 | that looks at your face and where you can like share ideas.
01:39:48.000 | It's something that you can engage your whole body.
01:39:51.200 | So yeah, I mean, I think being physical
01:39:53.760 | is very important to me.
01:39:55.000 | I mean, it's just that's a lot of, you know,
01:39:59.520 | the most fun stuff that I get to do.
01:40:02.040 | It's a really important part of how I personally balance
01:40:04.800 | my energy levels and just get a diversity of experiences
01:40:09.360 | because I could spend all my time running the company.
01:40:12.080 | But I think it's good for people to do some different things
01:40:16.380 | and compete in different areas or learn different things.
01:40:19.340 | And all of that is good.
01:40:22.120 | If people want to do really intense workouts
01:40:27.040 | with the work that we're doing with Quest
01:40:30.000 | or with eventual AR glasses, great.
01:40:33.920 | But even if you don't want to do
01:40:36.040 | like a really intense workout,
01:40:37.920 | I think just having a computing environment and platform,
01:40:40.440 | which is inherently physical,
01:40:42.320 | captures more of the essence of what we are as people
01:40:45.460 | than any of the previous computing platforms
01:40:47.720 | that we've had to date.
01:40:49.000 | - Yeah, I was even thinking just of the simple task
01:40:51.560 | of getting a better range of motion, AKA flexibility.
01:40:55.920 | I could imagine inside of the VR experience,
01:40:58.560 | you know, leaning into a stretch,
01:40:59.720 | you know, a standard kind of like a lunge type stretch,
01:41:01.980 | but actually seeing a meter of like,
01:41:03.440 | are you approaching new levels of flexibility in that moment
01:41:06.680 | where it's actually measuring some kinesthetic elements
01:41:10.240 | on the body and the joints.
01:41:11.400 | And I mean, I was just trying it,
01:41:12.920 | whereas normally you might have to do that in front
01:41:15.460 | of a camera, which would give you the data on a screen
01:41:17.680 | to look at afterwards, or hire an expensive coach.
01:41:20.180 | Or looking at form in resistance training.
01:41:23.680 | So you're actually lifting physical weights,
01:41:25.860 | but it's telling you whether or not you're breaking form.
01:41:27.940 | I mean, there's just so much that could be done
01:41:29.740 | inside of there.
01:41:30.580 | And then my mind just starts to spiral into like, wow,
01:41:32.780 | this is very likely to transform what we think of
01:41:35.580 | as quote unquote exercise.
01:41:37.940 | - Yeah, I think so.
01:41:38.780 | I think there's still a bunch of questions
01:41:40.320 | that need to get answered.
01:41:42.660 | You know, I don't think most people are gonna necessarily
01:41:45.740 | want to install, you know, a lot of sensors or cameras
01:41:49.580 | to track their whole body.
01:41:50.580 | So over time we're just getting better, from the sensors
01:41:53.940 | that are on the headsets, at being able to do
01:41:56.220 | very good hand tracking, right?
01:41:58.180 | So we have this research demo where now,
01:42:00.620 | just with the hand tracking from the headset, you can type.
01:42:03.580 | It just projects a little keyboard onto your table
01:42:05.980 | and you can type, and people type like a hundred words
01:42:08.100 | a minute with that.
01:42:08.980 | - With a virtual keyboard.
01:42:10.360 | - Yeah, we're starting to be able to, using
01:42:14.840 | some modern AI techniques, simulate
01:42:18.440 | and understand where your torso's position is.
01:42:21.480 | Even though you can't always see it,
01:42:23.280 | you can see it a bunch of the time.
01:42:24.720 | And if you fuse together what you do see
01:42:27.540 | with like the accelerometer and understanding
01:42:29.960 | how the thing is moving, you can kind of understand
01:42:32.560 | what the body position is gonna be.
01:42:34.320 | But some things are still gonna be hard, right?
01:42:38.320 | So you mentioned boxing, that one works pretty well
01:42:42.900 | because we understand your head position,
01:42:45.160 | we understand your hands, and now we're kind of
01:42:48.580 | increasingly understanding your body position.
01:42:51.840 | But let's say you wanna expand that to Muay Thai
01:42:55.340 | or kickboxing, okay?
01:42:56.780 | So legs, that's a different part of tracking, that's harder
01:43:00.260 | 'cause that's out of the field of view more of the time.
01:43:03.320 | But there's also the element of resistance, right?
01:43:05.240 | So you can throw a punch and retract it and shadow box
01:43:07.980 | and do that without upsetting your kind of physical balance
01:43:12.900 | that much, but if you wanna throw a roundhouse kick
01:43:15.320 | and there's no one there, then the standard way
01:43:18.980 | that you do it when you're shadow boxing
01:43:20.340 | is you basically do a little 360.
01:43:22.400 | But like, I don't know, is that gonna feel great?
01:43:24.760 | I mean, I think that there's a question
01:43:26.120 | about what that experience should be.
01:43:28.460 | And then if you wanted to go even further,
01:43:30.880 | if you wanted to get like grappling to work,
01:43:33.960 | I'm not even sure how you would do that
01:43:36.580 | without having resistance of understanding
01:43:38.440 | what the forces applied to you would be.
01:43:40.120 | And just then you get into, okay,
01:43:41.440 | maybe you're gonna have some kind of body suit
01:43:43.880 | that can apply haptics, but I'm not even sure
01:43:46.960 | that even a pretty advanced haptic system
01:43:49.340 | is gonna be quite good enough
01:43:51.600 | to simulate the actual forces
01:43:53.360 | that would be applied to you in a grappling scenario.
01:43:56.240 | So this is part of what's fun about technology though,
01:43:58.760 | is you keep on getting new capabilities
01:44:01.680 | and then you need to figure out
01:44:02.600 | what things you can do with them.
01:44:04.000 | So I think it's really neat that we can kind of do boxing
01:44:06.580 | and we can do the supernatural thing.
01:44:08.140 | And there's a bunch of awesome cardio
01:44:10.060 | and dancing and things like that.
01:44:12.140 | And then there's also still so much more to do
01:44:14.020 | that I'm excited to kind of get to over time,
01:44:16.660 | but it's a long journey.
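The fusion described above, combining what the headset cameras do see with accelerometer motion to estimate body position, resembles classical sensor fusion. Below is a deliberately simplified, one-dimensional complementary-filter sketch; the function, weighting, and numbers are illustrative assumptions, not Meta's actual body-tracking pipeline.

```python
from typing import Optional

def fuse_joint_estimate(
    predicted: float,           # position predicted by integrating IMU/accelerometer motion
    observed: Optional[float],  # camera-based estimate, or None when the joint is out of view
    camera_weight: float = 0.8,
) -> float:
    """One step of a toy complementary filter for a single joint coordinate.

    When the headset cameras can see the joint, weight their estimate heavily;
    when they cannot (e.g. legs outside the field of view), fall back to the
    motion-model prediction alone.
    """
    if observed is None:
        return predicted
    return camera_weight * observed + (1.0 - camera_weight) * predicted

# Example: a torso point visible at t=0, occluded at t=1, visible again at t=2.
for predicted, observed in [(0.10, 0.12), (0.20, None), (0.31, 0.33)]:
    print(round(fuse_joint_estimate(predicted, observed), 3))
```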
01:44:19.460 | - And what about things like painting and art and music?
01:44:23.900 | You know, I imagine, of course, like different mediums,
01:44:28.380 | like I like to draw with pen and pencil,
01:44:30.240 | but I can imagine trying to learn how to paint virtually.
01:44:32.540 | And of course you could print out
01:44:34.080 | a physical version of that at the end.
01:44:35.980 | This doesn't have to depart from the physical world.
01:44:38.060 | It could end in the physical world.
01:44:39.400 | - Did you see the demo, the piano demo where you,
01:44:43.540 | either you're there with a physical keyboard
01:44:46.140 | or it could be a virtual keyboard,
01:44:47.820 | but the app basically highlights what keys you need to press
01:44:52.820 | in order to play the song.
01:44:55.420 | So it's basically like you're looking at your piano
01:44:58.460 | and it's teaching you how to play a song that you choose.
01:45:02.020 | - An actual piano, yeah.
01:45:03.820 | But it's illuminating certain keys in the virtual space.
01:45:06.940 | - And it could either be a virtual piano if you,
01:45:09.560 | or keyboard if you don't have a piano or keyboard,
01:45:12.060 | or it could use your actual keyboard.
01:45:15.460 | So yeah, I think stuff like that is going to be
01:45:19.700 | really fascinating for education and expression.
01:45:23.080 | - And for broad, excuse me,
01:45:24.880 | but for broadening access to expensive equipment.
01:45:28.180 | I mean, a piano is no small expense.
01:45:30.900 | - Exactly.
01:45:31.740 | And it takes up a lot of space and it needs to be tuned.
01:45:34.900 | You can think of all these things
01:45:35.740 | like the kid that has very little income
01:45:37.700 | or their family has very little income
01:45:39.140 | could learn to play a virtual piano at much lower cost.
01:45:42.300 | - Yeah, and it gets back to the question
01:45:43.320 | I was asking before about this thought experiment
01:45:45.700 | of how many of the things that we physically have today
01:45:49.140 | actually need to be physical.
01:45:51.060 | The piano doesn't.
01:45:52.820 | Maybe there's some premium where it's,
01:45:57.060 | maybe it's a somewhat better, more tactile experience
01:46:01.060 | to have a physical one,
01:46:02.500 | but for people who don't have the space for it
01:46:05.380 | or who can't afford to buy a piano
01:46:07.740 | or just aren't sure that they would want to make
01:46:08.980 | that investment at the beginning
01:46:09.980 | of learning how to play piano,
01:46:11.740 | I think in the future you'll have the option
01:46:13.300 | of just buying an app or a hologram piano,
01:46:16.540 | which will be a lot more affordable.
01:46:19.020 | And I think that's going to unlock a ton of creativity too,
01:46:24.800 | because instead of the market for piano makers
01:46:29.340 | being constrained to a relatively small set of experts
01:46:33.340 | who have perfected that craft,
01:46:35.620 | you're going to have kids or developers all around the world
01:46:40.360 | designing crazy designs for potential keyboards and pianos
01:46:44.440 | that look nothing like what we've seen before,
01:46:46.060 | but maybe bring even more joy
01:46:48.700 | and are even more kind of fun in the world
01:46:50.820 | where you have fewer of these physical constraints.
01:46:52.700 | So I don't know.
01:46:53.540 | I think it's going to be a lot of wild stuff to explore.
01:46:56.100 | - There's definitely going to be a lot of wild stuff
01:46:57.820 | to explore.
01:46:58.660 | I had this idea/image in my mind
01:47:03.560 | of what you were talking about merged
01:47:05.220 | with our earlier conversation when Priscilla was here.
01:47:07.940 | I could imagine a time not too long from now
01:47:10.020 | where you're using mixed reality
01:47:11.820 | to run experiments in the lab,
01:47:13.340 | literally mixing virtual solutions,
01:47:15.660 | getting potential outcomes,
01:47:17.220 | and then picking the best one
01:47:18.400 | to then go actually do in the real world,
01:47:19.940 | which is both financially costly and time-wise costly.
01:47:25.700 | - Yeah, I mean, people are already using VR
01:47:30.100 | for surgery and education on it.
01:47:32.860 | And there's some study that was done
01:47:35.400 | that basically tried to do a controlled experiment
01:47:39.980 | of people who learned how to do a specific surgery
01:47:42.940 | through just the normal kind of textbook and lecture method
01:47:46.980 | versus like you show the knee and you have it
01:47:51.700 | be a large blown up model and people can manipulate it
01:47:54.620 | and kind of practice where they would make the cuts
01:47:57.540 | and the people in that class did better.
01:48:01.040 | So I think that it's going to be profound
01:48:05.820 | for a lot of different areas.
01:48:07.140 | - And the last example that leaps to mind,
01:48:09.460 | I think social media and online culture
01:48:12.020 | has been accused of creating a lot of real world,
01:48:14.940 | let's call it physical world, social anxiety for people.
01:48:17.400 | But I could imagine practicing a social interaction
01:48:20.980 | or a kid that has a lot of social anxiety
01:48:22.940 | or that needs to advocate for themselves better,
01:48:24.780 | learning how to do that progressively
01:48:27.060 | through a virtual interaction
01:48:28.300 | and then taking that to the real world
01:48:29.620 | because it's in my very recent experience today,
01:48:32.520 | it's so blended now with real experience
01:48:35.860 | that the kid that feels terrified
01:48:37.220 | of advocating for themselves
01:48:38.420 | or just talking to another human being or an adult
01:48:41.020 | or being in a new circumstance of a room full of kids,
01:48:42.900 | you could really experience that in silico first
01:48:46.540 | and get comfortable, let the nervous system attenuate a bit
01:48:49.420 | and then take it into the quote unquote physical world.
01:48:53.400 | - Yeah, I think we'll see experiences like that.
01:48:55.680 | I mean, I also think that some of the social dynamics
01:48:58.160 | around how people interact
01:49:00.360 | in this kind of blended digital world
01:49:03.920 | will be more nuanced in other ways.
01:49:05.320 | So I'm sure that there will be kind of new anxieties
01:49:08.420 | that people develop, just like teens today
01:49:13.160 | need to navigate dynamics around texting constantly,
01:49:16.520 | which we just didn't have when we were kids.
01:49:21.500 | So I think it will help with some things.
01:49:23.360 | I think that there will be new issues
01:49:24.540 | that hopefully we can help people work through too.
01:49:26.320 | But overall, I think it's going to be
01:49:29.040 | really powerful and positive.
01:49:31.160 | - Let's talk about the glasses.
01:49:32.620 | - Sure.
01:49:33.460 | - This was wild.
01:49:34.280 | - Yeah.
01:49:35.120 | - I put on a pair of Ray-Bans.
01:49:36.460 | I like the way they look.
01:49:38.400 | They're clear.
01:49:39.240 | They look like any other Ray-Ban glasses,
01:49:42.000 | except that I could call out to the glasses.
01:49:46.600 | I could just say, "Hey, Meta,
01:49:48.940 | I want to listen to the Bach variations,"
01:49:51.360 | the Goldberg Variations of Bach.
01:49:53.760 | And Meta responded, and no one around me could hear,
01:49:58.640 | but I could hear with exquisite clarity.
01:50:02.620 | And by the way, I'm not getting paid to say any of this.
01:50:04.220 | I'm just still blown away by this, folks.
01:50:06.720 | I want a pair of these very badly.
01:50:09.760 | I could hear, "Okay, I'm selecting those now,
01:50:12.600 | or that music now."
01:50:13.680 | And then I could hear it in the background,
01:50:14.860 | but then I could still have a conversation.
01:50:16.300 | So this was neither headphones in nor headphones out.
01:50:20.440 | And I could say, "Wait, pause the music."
01:50:21.920 | And it would pause.
01:50:22.900 | And the best part was I didn't have to
01:50:25.760 | leave the room mentally.
01:50:27.400 | I didn't even have to take out a phone.
01:50:29.160 | It was all interfaced through this very local environment
01:50:32.180 | in and around the head.
01:50:33.640 | And as a neuroscientist, I'm fascinated by this
01:50:35.320 | because, of course, all of our perceptions,
01:50:36.780 | auditory, visual, et cetera,
01:50:37.800 | all occurring inside the casing of this thing
01:50:40.940 | we call a skull.
01:50:42.080 | But maybe you could comment on the origin of that design
01:50:46.800 | for you, the ideas behind that,
01:50:48.720 | and where you think it could go,
01:50:49.880 | because I'm sure I'm just scratching the surface.
01:50:52.840 | - The real product that we want to eventually get to
01:50:56.280 | is this kind of full augmented reality product
01:51:00.240 | in a kind of stylish and comfortable
01:51:02.800 | normal glasses form factor.
01:51:04.760 | - Not dorky VR headset, so to speak.
01:51:07.040 | - No, I mean-
01:51:07.860 | - 'Cause the VR headset does feel kind of like-
01:51:09.040 | - It will, but there's gonna be a place for that too,
01:51:11.940 | just like you have your laptop and you have your workstation
01:51:15.380 | or maybe the better analogy is you have your phone
01:51:17.300 | and you have your workstation.
01:51:19.640 | These AR glasses are gonna be like your phone
01:51:22.000 | in that you have something on your face
01:51:24.040 | and you will, I think, be able to, if you want,
01:51:27.360 | wear it for a lot of the day
01:51:29.040 | and interact with it very frequently.
01:51:32.880 | I don't think that people are gonna be
01:51:35.120 | walking around the world wearing VR headsets.
01:51:37.680 | - Let's hope not.
01:51:38.520 | - But yeah, that's certainly not the future
01:51:40.080 | that I'm kind of hoping we get to.
01:51:45.960 | But I do think that there is a place
01:51:48.600 | for having the headset, because it's a bigger form factor,
01:51:48.600 | it has more compute power.
01:51:50.080 | So just like your workstation or your bigger computer
01:51:53.800 | can do more than your phone can do, there's a place for that.
01:51:57.160 | When you want to settle into an intense task,
01:51:59.680 | if you have a doctor who's doing a surgery,
01:52:01.240 | I would want them doing it through the headset,
01:52:02.640 | not through their phone equivalent
01:52:05.080 | or the just kind of lower powered glasses.
01:52:07.500 | But just like phones are powerful enough
01:52:09.640 | to do a lot of things,
01:52:10.800 | I think the glasses will eventually get there too.
01:52:13.240 | Now that said, there's a bunch
01:52:15.600 | of really hard technology problems to address
01:52:20.080 | in order to be able to get to this point
01:52:22.000 | where you can like put kind of full holograms in the world.
01:52:26.260 | You're basically miniaturizing a supercomputer
01:52:29.240 | and putting it into a pair of glasses
01:52:31.320 | so that the pair of glasses still looks stylish and normal.
01:52:36.880 | And that's a really hard technology problem.
01:52:40.480 | Making things small is really hard.
01:52:43.320 | A holographic display is different
01:52:47.480 | from what our industry has optimized for
01:52:49.720 | for 30 or 40 years now, building screens.
01:52:53.640 | There's like a whole kind of industrial process around that
01:52:57.320 | that goes into phones and TVs and computers
01:53:00.200 | and like increasingly so many things
01:53:01.840 | that have different screens.
01:53:03.320 | Like there's a whole pipeline that's gotten very good
01:53:05.560 | at making that kind of screen.
01:53:07.240 | And the holographic displays
01:53:09.980 | are just a completely different thing, right?
01:53:12.160 | 'Cause it's not a screen, right?
01:53:14.080 | It's a thing that you can shoot light into
01:53:16.600 | through a laser or some other kind of projector
01:53:19.040 | and it can place that as an object in the world.
01:53:21.520 | So that's gonna need to be this whole other
01:53:24.360 | industrial process that gets built up
01:53:26.360 | to doing that like in an efficient way.
01:53:28.680 | So all that said, we're basically taking
01:53:32.840 | two different approaches towards building this at once.
01:53:36.140 | One is we are trying to keep in mind
01:53:40.000 | what the long-term thing is, and it's not super far off.
01:53:43.480 | I think within a few years,
01:53:45.980 | I think we'll have something that's sort of
01:53:48.240 | a first version of kind of this full vision
01:53:50.280 | that I'm talking about.
01:53:51.120 | And we have something that's working internally
01:53:52.320 | that we'll use as a dev kit.
01:53:54.840 | But that one, that's kind of a big challenge.
01:54:01.040 | It's gonna be more expensive
01:54:02.760 | and it's harder to get all the pieces working.
01:54:06.600 | The other approach has been,
01:54:08.760 | all right, let's start with what we know we can put into
01:54:12.680 | a pair of stylish sunglasses today
01:54:15.120 | and just make them as smart as we can.
01:54:18.900 | So for the first version,
01:54:21.580 | we did this collaboration with Ray-Ban, right?
01:54:24.760 | 'Cause that's like well-accepted,
01:54:27.400 | these are well-designed glasses.
01:54:29.200 | They're classic, people have used them for decades.
01:54:32.280 | For the first version, we got a sensor on the front
01:54:34.800 | so you could capture moments
01:54:36.160 | without having to take your phone out of your pocket.
01:54:37.920 | So you got photos and videos.
01:54:39.760 | You have the speaker and the microphones,
01:54:42.560 | you can listen to music.
01:54:43.820 | You could communicate with it.
01:54:47.360 | That was sort of the first version of it.
01:54:51.340 | We had a lot of the basics there,
01:54:52.480 | but we saw how people used it and we tuned it.
01:54:56.320 | We made the camera like twice as good
01:54:59.120 | for this new version that we made.
01:55:01.000 | The audio is a lot crisper for the use cases
01:55:02.960 | that we saw people actually use,
01:55:04.240 | which is some of it is listening to music,
01:55:06.000 | but a lot of it is like people wanna take calls
01:55:07.800 | on their glasses.
01:55:09.420 | They wanna listen to podcasts, right?
01:55:11.280 | But the biggest thing that I think is interesting
01:55:15.000 | is the ability to get AI running on it,
01:55:18.440 | which it doesn't just run on the glasses.
01:55:20.640 | It also, it kind of proxies through your phone.
01:55:23.820 | But I mean, with all the advances in LLMs,
01:55:28.560 | we talked about this a bit in the first part
01:55:30.400 | of the conversation,
01:55:32.120 | having the ability to have your meta AI assistant
01:55:36.080 | that you can just talk to
01:55:36.920 | and basically ask any question throughout the day
01:55:39.680 | is I think it'd be really fascinating.
01:55:42.040 | And like you were saying about
01:55:44.080 | kind of how we process the world as people,
01:55:47.840 | eventually I think you're gonna want your AI assistant
01:55:52.400 | to be able to see what you see and hear what you hear.
01:55:55.160 | Maybe not all the time,
01:55:56.000 | but you're gonna wanna be able to tell it
01:55:58.160 | to go into a mode where it can see what you see
01:56:00.160 | and hear what you hear.
01:56:01.400 | And what's the kind of device design
01:56:04.560 | that best kind of positions an AI assistant
01:56:08.240 | to be able to see what you see and hear what you hear
01:56:10.100 | so it can best help you?
01:56:11.560 | Well, it's glasses, right?
01:56:12.760 | Where it basically has a sensor
01:56:14.280 | to be able to see what you see
01:56:16.000 | and a microphone that is close to your ears
01:56:20.320 | that can hear what you hear.
01:56:21.780 | The other design goal is, like you said,
01:56:26.280 | to keep you present in the world, right?
01:56:28.860 | So I think one of the issues with phones
01:56:32.000 | is they kind of pull you away
01:56:34.680 | from what's physically happening around you.
01:56:36.440 | And I don't think that the next generation
01:56:37.760 | of computing will do that.
01:56:39.760 | - I just, I'm chuckling to myself because I have a friend,
01:56:41.720 | he's a very well-known photographer
01:56:43.200 | and he was laughing about how people go to a concert
01:56:46.440 | and everyone's filming the concert on their phone
01:56:49.120 | so that they can be the person that posts the thing.
01:56:50.820 | But like there are literally millions of other people
01:56:52.800 | who posted the exact same thing,
01:56:53.860 | but somehow it feels important
01:56:55.560 | to post our unique experience.
01:56:58.540 | With glasses, that gap would essentially be smoothed over
01:57:02.860 | completely.
01:57:03.700 | - Yeah, totally.
01:57:04.520 | - You can just worry about it later, download it.
01:57:06.020 | There are issues I realize with glasses
01:57:08.340 | because they are so seamless with everyday experience,
01:57:11.260 | even though you and I aren't wearing them now,
01:57:13.060 | it's very common for people to wear glasses,
01:57:16.060 | issues of recording and consent.
01:57:18.300 | - Yeah, that's where we-
01:57:19.140 | - Like if I go into a locker room at my gym.
01:57:21.020 | - That's where we have the light on.
01:57:21.860 | - I'm assuming that the people with glasses aren't filming,
01:57:23.500 | whereas right now, because there's a sharp transition
01:57:26.660 | when there's a phone in the room and someone's pointing it,
01:57:30.580 | people generally say no phones in the locker rooms
01:57:34.220 | and recording.
01:57:35.500 | So that's just one instance.
01:57:36.820 | I mean, there are other instances.
01:57:37.640 | - Well, we have the whole privacy light.
01:57:38.740 | I don't know.
01:57:39.580 | Did you get a chance to do this?
01:57:40.420 | - I didn't get a chance to explore that.
01:57:41.260 | - Yeah, so it's anytime that it's active,
01:57:43.660 | that the camera sensor is active,
01:57:45.780 | it's basically like pulsing a white bright light.
01:57:50.220 | - Got it.
01:57:51.060 | - So, which is, by the way, more than cameras do.
01:57:54.760 | - Right, someone could be holding a phone on the side.
01:57:55.600 | - Yeah, I mean, phones aren't kind of showing
01:57:59.440 | a bright light when you're taking a photo, so.
01:58:02.500 | - No, people oftentimes will pretend they're texting
01:58:04.380 | and they're actually recording.
01:58:05.660 | I actually saw an instance of this in a barbershop once
01:58:07.820 | where someone was recording and they were pretending
01:58:10.380 | that they were texting.
01:58:11.220 | And it was a pretty intense interaction that ensued.
01:58:15.380 | And it was like, wow, you know,
01:58:16.520 | it's pretty easy for people to feign texting
01:58:19.240 | while actually recording.
01:58:20.780 | - Yeah, so I think when you're evaluating a risk
01:58:25.400 | with a new technology, the bar shouldn't be,
01:58:29.740 | is it possible to do anything bad?
01:58:32.420 | It's does this new technology make it easier
01:58:36.180 | to do something bad than what people already had?
01:58:39.180 | And I think because you have this privacy light
01:58:42.100 | that is just broadcasting to everyone around you,
01:58:44.220 | hey, this thing is recording now.
01:58:46.540 | I think that that makes it actually less discreet
01:58:50.780 | to do it through the glasses than what you could do
01:58:52.860 | with a phone already, which I think is basically the bar
01:58:56.260 | that we wanted to get over from a design perspective.
01:58:59.020 | - Thank you for pointing out that it has the privacy light.
01:59:00.960 | I didn't get long enough in the experience
01:59:03.040 | to explore all the features.
01:59:04.480 | But again, I can think of a lot of uses,
01:59:07.980 | being able to look at a restaurant from the outside
01:59:11.240 | and see the menu, get status on how crowded it is.
01:59:15.500 | As much as I love, I don't want to call out,
01:59:19.160 | let's just say app-based map functions
01:59:22.560 | that allow you to navigate and the audio is okay.
01:59:25.420 | It's nice to have a conversation with somebody on the phone
01:59:28.500 | or in the vehicle and just,
01:59:29.900 | it'd be great if the road was traced where I should turn.
01:59:32.720 | - Yeah, absolutely.
01:59:33.540 | - These kinds of things seem like they're going to be
01:59:34.960 | straightforward for Meta engineers to create.
01:59:38.040 | - Yeah, in some future version,
01:59:39.320 | we'll have it so it'll also have the holographic display
01:59:41.200 | where it can show you the directions.
01:59:43.120 | But I think that there will basically just be
01:59:45.920 | different price points that pack
01:59:48.260 | different amounts of technology.
01:59:50.020 | The holographic display part I think is going to be
01:59:52.480 | more expensive than doing one that just has the AI,
01:59:55.600 | but is primarily communicating with you through audio.
02:00:00.040 | So I mean the current Ray-Ban Meta glasses are $299.
02:00:04.340 | I think when we have one that has a display in it,
02:00:08.160 | it'll probably be some amount more than that,
02:00:09.800 | but it'll also be more powerful.
02:00:11.160 | So I think that people will choose what they want to use
02:00:14.560 | based on what the capabilities are that they want
02:00:17.120 | and what they can afford.
02:00:18.200 | But a lot of our goal in building things is,
02:00:23.200 | you know, we try to make things
02:00:26.940 | that can be accessible to everyone.
02:00:29.440 | You know, our game as a company isn't to build things
02:00:33.060 | and then charge a premium price for it.
02:00:35.960 | We try to build things that then everyone can use
02:00:39.540 | and then become more useful because a very large number
02:00:42.280 | of people are using them.
02:00:43.520 | So it's just a very different approach.
02:00:46.480 | You know, we're not like Apple or some of these companies
02:00:50.560 | that just try to make something and then sell it
02:00:52.560 | for as much as they can.
02:00:55.080 | Which I mean, they're a great company.
02:00:56.940 | So I mean, I think that that model kind of is fine too.
02:01:00.180 | But our approach is going to be,
02:01:02.720 | we want stuff that can be affordable
02:01:04.480 | so that way everyone in the world can use it.
02:01:06.600 | - Along the lines of health, I think the glasses
02:01:08.560 | will also potentially solve a major problem in a real way,
02:01:12.420 | which is the following.
02:01:13.680 | For both children and adults, it's very clear
02:01:16.320 | that viewing objects in particular screens up close
02:01:18.920 | for too many hours per day leads to myopia.
02:01:21.480 | Literally a change in the length of the eyeball
02:01:24.160 | and nearsightedness.
02:01:25.340 | And on the positive side, we know,
02:01:28.840 | based on some really large clinical trials,
02:01:31.500 | that kids and adults who spend two hours a day
02:01:35.120 | or more out of doors, don't experience that
02:01:39.000 | and maybe even reverse their myopia.
02:01:40.560 | And it has something to do with exposure to sunlight,
02:01:42.860 | but it has a lot to do with long viewing.
02:01:44.740 | Viewing things at a distance greater
02:01:46.280 | than three or four feet away.
02:01:47.720 | And with the glasses, I realize one could actually
02:01:50.160 | do digital work out of doors.
02:01:53.640 | It could measure and tell you how much time you've spent
02:01:56.800 | looking at things up close versus far away.
02:01:58.900 | I mean, this is just another example that leaps to mind,
02:02:01.920 | but in accessing the visual system,
02:02:04.540 | you're effectively accessing the whole brain
02:02:06.740 | because the eyes are the only two bits of brain
02:02:08.100 | that are outside the cranial vault.
02:02:09.560 | So it just seems like putting technology
02:02:11.360 | right at the level of the eyes,
02:02:12.660 | seeing what the eyes see, is just gotta be the best way to go.
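As a sketch of the near-versus-far tally imagined above, and assuming the glasses could periodically estimate viewing distance (a capability only speculated about in this conversation), something like the following could summarize a day; the threshold, sampling interval, and function are hypothetical.

```python
def summarize_viewing(distances_m, near_threshold_m: float = 1.0, sample_seconds: float = 60.0):
    """Toy tally of near versus far viewing time from periodic distance estimates.

    `distances_m` is a list of estimated viewing distances in meters,
    e.g. one estimate per minute from a hypothetical depth or autofocus sensor.
    """
    near_s = sum(sample_seconds for d in distances_m if d < near_threshold_m)
    far_s = sample_seconds * len(distances_m) - near_s
    return {"near_hours": near_s / 3600, "far_hours": far_s / 3600}

# Example: an 8-hour day of one-minute samples, mostly close-up screen work.
day = [0.5] * 360 + [3.0] * 120  # 6 hours under 1 m, 2 hours at about 3 m
print(summarize_viewing(day))    # {'near_hours': 6.0, 'far_hours': 2.0}
```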
02:02:16.960 | - Yeah, I think, well, multimodal, right, I think is,
02:02:20.960 | you want the visual sensation,
02:02:24.320 | but you also want kind of text or language.
02:02:27.720 | So I think it's-
02:02:28.560 | - Sure, but that all can be brought
02:02:30.020 | to the level of the eyes, right?
02:02:32.320 | - What do you mean by that?
02:02:33.160 | - Well, I mean, I think what we're describing here
02:02:35.760 | is essentially taking the phone, the computer,
02:02:38.800 | and bringing it all to the level of the eyes.
02:02:40.680 | And of course, one would like more-
02:02:41.840 | - Physically at your face.
02:02:42.680 | - Physically at your eyes, right?
02:02:43.840 | And one would like more kinesthetic information,
02:02:46.200 | as you mentioned before, where the legs are,
02:02:47.520 | maybe the lung function.
02:02:48.740 | Hey, have you taken enough steps today?
02:02:50.140 | But all of that, if it can be figured out by the phone,
02:02:52.900 | can be figured out by the glasses.
02:02:56.560 | But there's additional information there,
02:02:58.200 | such as what are you focusing on in your world?
02:03:00.660 | How much of your time is spent
02:03:02.200 | looking at things far away versus up close?
02:03:04.020 | How much social time did you have today?
02:03:06.180 | It's really tricky to get that with a phone.
02:03:07.840 | Like if my phone were right in front of us
02:03:09.880 | as if we were at a standard lunch nowadays,
02:03:11.740 | certainly in Silicon Valley.
02:03:12.960 | And then people were peering at our phones.
02:03:14.440 | I mean, how much real direct attention
02:03:16.580 | was in the conversation at hand versus something else?
02:03:18.920 | I mean, you can get issues
02:03:19.780 | of where are you placing your attention
02:03:22.160 | by virtue of where you're placing your eyes.
02:03:24.920 | And I think that information is not accessible
02:03:27.000 | with a phone in your pocket or in front of you.
02:03:29.160 | I mean, a little bit,
02:03:30.920 | but not nearly as rich and complete information
02:03:33.960 | as one gets when you're really pulling the data
02:03:35.920 | from the level of vision and what kids and adults
02:03:38.760 | are actually looking at and attending to.
02:03:41.360 | - Yeah, yeah.
02:03:42.680 | - Yeah, so it seems extremely valuable.
02:03:44.940 | You get autonomic information, size of the pupils.
02:03:47.500 | So you get information about internal states.
02:03:49.380 | I mean, that you can't--
02:03:50.220 | - Well, there are internal sensors and external ones.
02:03:53.120 | So the sensor on the Ray-Ban Meta glasses is external.
02:03:57.840 | Right, so it basically allows, sorry,
02:04:00.460 | the AI assistant to see what you're seeing.
02:04:04.680 | There's a separate set of things which are eye tracking,
02:04:08.380 | which are also very powerful
02:04:11.500 | for enabling a lot of interfaces, right?
02:04:13.840 | So if you want to just like look at something
02:04:16.300 | and select it by looking at it with your eyes,
02:04:19.700 | rather than having to kind of drag a controller over
02:04:22.360 | or pick up a hologram or anything like that,
02:04:25.680 | you can do that with eye tracking.
02:04:27.400 | So that's a pretty profound and cool experience too,
02:04:33.420 | as well as just kind of understanding
02:04:34.740 | what you're looking at.
02:04:35.580 | So that way you're not kind of wasting compute power,
02:04:37.820 | drawing pixels in high resolution
02:04:40.120 | in a part of the kind of world
02:04:41.720 | that's gonna be in your peripheral vision.
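The idea of not shading full-resolution pixels in the periphery is commonly called foveated rendering. Here is a toy sketch of such a schedule; the eccentricity bands and rates are illustrative assumptions, not the actual Quest implementation.

```python
def shading_rate(pixel_angle_deg: float, gaze_angle_deg: float) -> float:
    """Toy foveated-rendering schedule: full detail near the gaze point,
    progressively coarser shading toward the periphery.

    Returns a fraction of full resolution (1.0 = shade every pixel,
    0.25 = one shaded sample per 2x2 block, and so on).
    """
    eccentricity = abs(pixel_angle_deg - gaze_angle_deg)
    if eccentricity < 5.0:    # foveal region: keep full detail
        return 1.0
    if eccentricity < 20.0:   # near periphery: half resolution
        return 0.5
    return 0.25               # far periphery: quarter resolution

# Example: the user is looking 10 degrees to the right of screen center.
for angle_deg in (10.0, 20.0, 45.0):
    print(angle_deg, shading_rate(angle_deg, gaze_angle_deg=10.0))
```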
02:04:43.860 | So yeah, all of these things,
02:04:47.300 | they're interesting design and technology trade-offs
02:04:51.980 | where if you want the external sensor, that's one thing.
02:04:55.820 | If you also want the eye tracking,
02:04:57.860 | now that's a different set of sensors.
02:04:59.340 | Each one of these consumes compute, which consumes battery.
02:05:04.220 | They take up more space, right?
02:05:05.580 | So it's like, where are the eye tracking sensors gonna be?
02:05:07.700 | It's like, well, you wanna make sure
02:05:08.980 | that the rim of the glasses is actually quite thin
02:05:11.620 | because I mean, there's a kind of range
02:05:15.500 | of how thick glasses can be
02:05:17.780 | before they look more like goggles than glasses.
02:05:20.340 | So there's this whole design space,
02:05:23.540 | and I think people are gonna end up choosing
02:05:25.380 | what product makes sense for them.
02:05:26.680 | Maybe they want something that's more powerful,
02:05:28.080 | that has more of the sensors,
02:05:30.180 | but it's gonna be a little more expensive,
02:05:31.980 | maybe like slightly thicker,
02:05:33.920 | or maybe you want like a more basic thing
02:05:35.620 | that just looks like very similar
02:05:38.100 | to what Ray-Ban glasses are
02:05:39.340 | that people have been wearing for decades,
02:05:41.320 | but kind of has AI in it and you can capture moments
02:05:44.820 | without having to take your phone out
02:05:45.940 | and send them to people.
02:05:47.180 | In the latest version, we got the ability to live stream.
02:05:52.180 | I think that that's pretty crazy
02:05:53.420 | that now, going back to your concert case
02:05:56.860 | or whatever else you're doing,
02:05:58.620 | you can be doing sports
02:06:00.980 | or just watching your kids play something,
02:06:04.140 | and you can be watching
02:06:05.700 | and live streaming it
02:06:07.220 | to your kind of family group so people can see it.
02:06:11.500 | I think that that stuff is,
02:06:14.180 | I think that's pretty cool
02:06:16.500 | that you basically have a normal looking pair of glasses
02:06:19.180 | at this point that can kind of live stream
02:06:21.220 | and has like an AI assistant.
02:06:22.740 | So this stuff is making much faster progress
02:06:26.060 | in a lot of ways than I would have thought.
02:06:28.180 | And I don't know, I think people are gonna like this version
02:06:31.040 | but there's a lot more still to do.
02:06:32.980 | I think it's super exciting
02:06:34.180 | and I see a lot of technologies.
02:06:35.620 | This one's particularly exciting to me
02:06:37.060 | because of how smooth the interface is
02:06:39.220 | and for all the reasons that you just mentioned.
02:06:42.460 | What's happening with and what can we expect
02:06:44.460 | around AI interfaces and maybe even avatars of people
02:06:49.020 | within social media?
02:06:50.180 | Are we not far off from a day
02:06:53.220 | where there are multiple versions of me
02:06:56.700 | and you on the internet where people, for instance,
02:06:58.540 | I get asked a lot of questions.
02:06:59.860 | I don't have the opportunity to respond
02:07:01.600 | to all those questions, but with things like ChatGPT,
02:07:04.620 | people are trying to generate answers to those questions
02:07:06.580 | on other platforms.
02:07:07.660 | Will I have the opportunity to soon have an AI version
02:07:10.300 | of myself where people can ask me questions
02:07:12.140 | about like what I recommend for sleep and circadian rhythm,
02:07:15.400 | fitness, mental health, et cetera,
02:07:17.620 | based on content I've already generated
02:07:19.220 | that will be accurate so they could just ask my avatar?
02:07:22.620 | - Yeah, this is something that I think a lot of creators
02:07:26.140 | are gonna want that we're trying to build.
02:07:35.100 | And I think we'll probably have a version of it next year,
02:07:35.100 | but there's a bunch of constraints that I think we need
02:07:38.400 | to make sure that we get right.
02:07:39.820 | So for one, I think it's really important that,
02:07:43.360 | it's not that there's a bunch of versions of you,
02:07:45.740 | it's that if anyone is creating
02:07:47.680 | like an AI assistant version of you,
02:07:50.620 | it should be something that you control, right?
02:07:53.180 | I think there are some platforms that are out there today
02:07:54.960 | that just let people like make, I don't know,
02:07:58.360 | the AI bot of me or other figures,
02:08:02.060 | and it's like, I don't know.
02:08:03.060 | I mean, we've had platform policies for decades,
02:08:07.060 | since the beginning of the company at this point,
02:08:13.580 | which is almost 20 years,
02:08:15.060 | that basically don't allow impersonation.
02:08:18.980 | Real identity is like one of the core aspects
02:08:21.180 | that our company was kind of started on,
02:08:23.680 | the idea that you wanna kind of authentically be yourself.
02:08:25.880 | So yeah, I think if you're almost any creator,
02:08:30.880 | being able to engage your community matters,
02:08:34.940 | and there's just gonna be more demand to interact with you
02:08:39.140 | than you have hours in the day.
02:08:40.820 | So there are both people out there who would benefit
02:08:43.740 | from being able to talk to an AI version of you.
02:08:46.720 | And I think you and other creators would benefit
02:08:49.620 | from being able to keep your community engaged
02:08:51.540 | and service that demand that people have to engage with you.
02:08:54.620 | But you're gonna wanna know that that AI kind of version
02:08:59.620 | of you or assistant is gonna represent you
02:09:02.900 | the way that you would want.
02:09:04.260 | And there are a lot of things that are awesome
02:09:07.020 | about kind of these modern LLMs,
02:09:09.700 | but having perfect predictability
02:09:13.440 | about how it's gonna represent something
02:09:15.140 | is not one of the current strengths.
02:09:17.060 | So I think that there's some work
02:09:19.300 | that needs to get done there.
02:09:20.140 | I don't think it needs to be 100% perfect all the time,
02:09:22.820 | but you need to have very good confidence, I would say,
02:09:25.100 | that it's gonna represent you the way that you'd want
02:09:27.540 | for you to wanna turn it on,
02:09:29.380 | which again, you should have control
02:09:30.780 | over whether you turn it on.
02:09:32.680 | So we wanted to start in a different place,
02:09:35.140 | which I think is a somewhat easier problem,
02:09:38.140 | which is creating new characters or AI personas.
02:09:43.140 | So that way, they're not versions of real people. We built,
02:09:48.540 | for example, one of the AIs to be like a chef,
02:09:51.500 | and they can help you kind of come up with things
02:09:55.500 | that you could cook and can help you cook them.
02:09:59.060 | There's like a couple of people
02:10:00.700 | that are interested in different types of fitness
02:10:02.640 | that can help you kind of plan out your workouts
02:10:05.620 | or help with recovery or different things like that.
02:10:09.360 | There are people, there's an AI
02:10:10.920 | that's focused on like DIY crafts.
02:10:13.960 | There's someone who's a travel expert
02:10:16.700 | that can help you make travel plans or give you ideas.
02:10:19.460 | But the key thing about all these is they're not,
02:10:22.340 | they're not modeled off of existing people.
02:10:24.860 | So they don't have to have kind of 100% fidelity
02:10:28.980 | to like making sure that they never say something
02:10:31.500 | that a real person who they're modeled after
02:10:34.580 | would never say because they're just made up characters.
02:10:37.020 | So I think that that's a somewhat easier problem.
02:10:41.400 | We actually got a bunch of different kind of
02:10:46.020 | well-known people to play those characters
02:10:49.420 | because we thought that would make it more fun.
02:10:50.820 | So there's like Snoop Dogg is the dungeon master.
02:10:53.460 | So you can like drop him into a thread
02:10:54.860 | and play text-based games.
02:10:56.780 | And it's just like, I do this with my daughter
02:10:59.060 | when I tuck her in at night.
02:11:01.100 | And she just like loves like storytelling, right?
02:11:05.020 | And it's like Snoop Dogg is the dungeon master.
02:11:08.180 | We'll come up with like, here's what's happening next.
02:11:09.900 | And she's like, okay, like turn into a mermaid.
02:11:12.340 | And then I like swim across the bay
02:11:14.000 | and I like go and find the treasure chest and unlock it.
02:11:17.460 | And then Snoop Dogg just always
02:11:19.140 | will have the next version of,
02:11:21.100 | like, the next iteration of the story.
02:11:22.500 | So I mean, this stuff is fun,
02:11:24.540 | but it's not actually Snoop Dogg.
02:11:26.120 | He's just kind of the actor.
02:11:27.180 | He's playing the dungeon master, which makes it more fun.
02:11:29.620 | So that's probably the right place to start,
02:11:32.020 | where you can kind of build versions
02:11:35.420 | of these characters that people can interact with
02:11:37.780 | doing different things.
02:11:39.260 | But I think where you want to get over time
02:11:41.540 | is to the place where any creator or any small business
02:11:45.320 | can very easily just create an AI assistant
02:11:49.020 | that can represent them and interact
02:11:50.900 | with your kind of community or customers
02:11:54.020 | if you're a business and basically just help you grow
02:11:59.020 | your enterprise.
02:12:00.940 | So I don't know, I think that's gonna be cool,
02:12:02.480 | but I think this is, it's a long-term project.
02:12:04.620 | I think we'll have more progress on it to report
02:12:06.780 | on next year, but I think that's coming.
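As a rough illustration of the creator-assistant idea discussed above, one common approach is to ground the assistant in the creator's own published content by retrieving relevant excerpts before generating a reply. The corpus, similarity scoring, and prompt format below are placeholder assumptions for a sketch, not a description of Meta's system.

```python
import math
import re
from collections import Counter

# Minimal sketch of grounding a creator "AI assistant" in that creator's own
# published content: retrieve the most relevant excerpts, then constrain the
# reply to them. Corpus, scoring, and prompt format are placeholders.

corpus = [
    "Morning sunlight viewing helps anchor the circadian rhythm.",
    "Most adults need roughly six to eight hours of sleep per night.",
    "Zone 2 cardio a few times per week supports cardiovascular health.",
]

def _bag_of_words(text):
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def _cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(question, k=2):
    """Rank the creator's own statements by similarity to the question."""
    q = _bag_of_words(question)
    ranked = sorted(corpus, key=lambda doc: _cosine(q, _bag_of_words(doc)), reverse=True)
    return ranked[:k]

def build_prompt(question):
    """Ask the underlying model to answer only from the retrieved excerpts."""
    excerpts = "\n".join(f"- {doc}" for doc in retrieve(question))
    return (
        "Answer only using these excerpts from the creator's own content. "
        "If they don't cover the question, say you don't know.\n"
        f"{excerpts}\nQuestion: {question}"
    )

print(build_prompt("What do you recommend for circadian rhythm?"))
```

Constraining replies to retrieved excerpts, and letting the assistant decline when the creator's material doesn't cover a question, is one way to address the predictability and control concerns raised here, alongside the requirement that the creator explicitly opts in.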
02:12:10.820 | - I'm super excited about it because, you know,
02:12:13.940 | we hear a lot about the downsides of AI.
02:12:15.900 | I mean, I think people are now coming around to
02:12:18.220 | the reality that AI is neither good nor bad.
02:12:20.700 | It can be used for good or bad,
02:12:21.800 | and that there are a lot of life-enhancing spaces
02:12:24.060 | where it's gonna show up and really, really improve
02:12:26.580 | the way that we engage socially, what we learn,
02:12:31.140 | and that mental health and physical health
02:12:33.120 | don't have to suffer and in fact can be enhanced
02:12:34.900 | by the sorts of technologies we've been talking about.
02:12:37.220 | So I know you're extremely busy.
02:12:39.780 | I so appreciate the large amount of time
02:12:43.500 | you've given me today to sort through all these things,
02:12:46.660 | and to talk with you and Priscilla and to hear
02:12:48.900 | what's happening and where things are headed.
02:12:51.700 | The future certainly is bright.
02:12:53.900 | I share in your optimism and it's been only strengthened
02:12:57.540 | by today's conversation.
02:12:58.980 | So thank you so much and keep doing what you're doing.
02:13:02.420 | And on behalf of myself and everyone listening, thank you,
02:13:06.680 | because regardless of what people say,
02:13:08.580 | we all use these platforms excitedly,
02:13:11.740 | and it's clear that there's a ton of intention and care
02:13:15.180 | and thought about what could be, in the positive sense,
02:13:20.180 | and that's really worth highlighting.
02:13:23.420 | - Awesome, thank you, I appreciate it.
02:13:26.260 | - Thank you for joining me for today's discussion
02:13:28.080 | with Mark Zuckerberg and Dr. Priscilla Chan.
02:13:30.760 | If you're learning from and/or enjoying this podcast,
02:13:33.180 | please subscribe to our YouTube channel.
02:13:35.000 | That's a terrific zero-cost way to support us.
02:13:37.520 | In addition, please subscribe to the podcast
02:13:39.500 | on both Spotify and Apple, and on both Spotify and Apple,
02:13:42.580 | you can leave us up to a five-star review.
02:13:44.980 | Please also check out the sponsors mentioned
02:13:46.700 | at the beginning and throughout today's episode.
02:13:48.820 | That's the best way to support this podcast.
02:13:51.420 | If you have questions for me or comments about the podcast
02:13:53.980 | or guests that you'd like me to consider hosting
02:13:55.780 | on the Huberman Lab Podcast,
02:13:57.220 | please put those in the comment section on YouTube.
02:13:59.540 | I do read all the comments.
02:14:01.180 | Not during today's episode,
02:14:02.300 | but on many previous episodes of the Huberman Lab Podcast,
02:14:05.080 | we discuss supplements.
02:14:06.320 | While supplements aren't necessary for everybody,
02:14:08.500 | many people derive tremendous benefit from them
02:14:10.820 | for things like enhancing sleep, hormone support,
02:14:13.500 | and improving focus.
02:14:14.900 | If you'd like to learn more about the supplements discussed
02:14:16.800 | on the Huberman Lab Podcast,
02:14:18.280 | you can go to Live Momentous, spelled O-U-S,
02:14:20.820 | so livemomentous.com/huberman.
02:14:24.020 | If you're not already following me on social media,
02:14:25.900 | it's Huberman Lab on all social media platforms.
02:14:28.580 | So that's Instagram, Twitter, now called X,
02:14:31.720 | Threads, Facebook, LinkedIn, and on all those places,
02:14:34.440 | I discuss science and science-related tools,
02:14:36.400 | some of which overlaps with the content
02:14:37.860 | of the Huberman Lab Podcast,
02:14:39.300 | but much of which is distinct from the content
02:14:41.180 | on the Huberman Lab Podcast.
02:14:42.380 | So again, it's Huberman Lab on all social media platforms.
02:14:45.820 | If you haven't already subscribed
02:14:46.980 | to our monthly neural network newsletter,
02:14:49.220 | the neural network newsletter
02:14:50.340 | is a completely zero cost newsletter
02:14:52.620 | that gives you podcast summaries,
02:14:54.220 | as well as toolkits in the form of brief PDFs.
02:14:56.860 | We've had toolkits related to optimizing sleep,
02:15:00.120 | to regulating dopamine, deliberate cold exposure, fitness,
02:15:03.800 | mental health, learning, and neuroplasticity, and much more.
02:15:07.340 | Again, it's completely zero cost to sign up.
02:15:09.140 | You simply go to HubermanLab.com,
02:15:10.860 | go over to the Menu tab, scroll down to Newsletter,
02:15:13.220 | and supply your email.
02:15:14.760 | I should emphasize that we do not share
02:15:16.300 | your email with anybody.
02:15:18.180 | Thank you once again for joining me for today's discussion
02:15:20.500 | with Mark Zuckerberg and Dr. Priscilla Chan.
02:15:23.060 | And last, but certainly not least,
02:15:25.260 | thank you for your interest in science.
02:15:27.020 | (upbeat music)
02:15:29.600 | (upbeat music)