Mark Zuckerberg & Dr. Priscilla Chan: Curing All Human Diseases & the Future of Health & Technology
Chapters
0:00 Mark Zuckerberg & Dr. Priscilla Chan
2:15 Sponsors: Eight Sleep & LMNT; The Brain Body Contract
5:35 Chan Zuckerberg Initiative (CZI) & Human Disease Research
8:51 Innovation & Discovery, Science & Engineering
12:53 Funding, Building Tools & Imaging
17:57 Healthy vs. Diseased Cells, Human Cell Atlas & AI, Virtual Cells
21:59 Single Cell Methods & Disease; CELLxGENE Tool
28:22 Sponsor: AG1
29:53 AI & Hypothesis Generation; Long-term Projects & Collaboration
35:14 Large Language Models (LLMs), In Silico Experiments
42:11 CZI Biohubs, Chicago, New York
50:52 Universities & Biohubs; Therapeutics & Rare Diseases
57:23 Optimism; Children & Families
66:21 Sponsor: InsideTracker
67:25 Technology & Health, Positive & Negative Interactions
73:17 Algorithms, Clickbait News, Individual Experience
79:17 Parental Controls, Meta Social Media Tools & Tailoring Experience
84:51 Time, Usage & Technology, Parental Tools
88:55 Virtual Reality (VR), Mixed Reality Experiences & Smart Glasses
96:09 Physical Exercise & Virtual Product Development
104:19 Virtual Futures for Creativity & Social Interactions
109:31 Ray-Ban Meta Smart Glasses: Potential, Privacy & Risks
120:20 Visual System & Smart Glasses, Augmented Reality
126:42 AI Assistants & Creators, Identity Protection
133:26 Zero-Cost Support, Spotify & Apple Reviews, Sponsors, YouTube Feedback, Momentous, Social Media, Neural Network Newsletter
Welcome to the Huberman Lab Podcast, 00:00:02.280 |
where we discuss science and science-based tools 00:00:10.360 |
and I'm a professor of neurobiology and ophthalmology 00:00:15.240 |
My guests today are Mark Zuckerberg and Dr. Priscilla Chan. 00:00:22.760 |
He is now the CEO of Meta, which includes Facebook, 00:00:26.380 |
Instagram, WhatsApp, and other technology platforms. 00:00:34.280 |
at the University of California, San Francisco. 00:00:44.000 |
a philanthropic organization whose stated goal 00:00:49.020 |
The Chan Zuckerberg Initiative is accomplishing that 00:00:51.360 |
by providing critical funding not available elsewhere, 00:00:59.600 |
cataloging all the different human cell types, 00:01:02.480 |
as well as providing AI or artificial intelligence platforms 00:01:05.680 |
to mine all of that data to discover new pathways 00:01:13.340 |
is held with both Dr. Priscilla Chan and Mark Zuckerberg, 00:01:18.520 |
and what it really means to try and cure all human diseases. 00:01:22.040 |
We talk about the motivational backbone for the CZI 00:01:24.700 |
that extends well into each of their personal histories. 00:01:27.560 |
Indeed, you'll learn quite a lot about Dr. Priscilla Chan, 00:01:30.600 |
who has, I must say, an absolutely incredible family story 00:01:40.120 |
how he's bringing an engineering and AI perspective 00:01:42.860 |
to the discovery of new cures for human disease. 00:01:49.700 |
during which we discuss various Meta platforms, 00:01:54.360 |
and their effects on mental health in children and adults. 00:02:06.920 |
not just our online experiences with social media 00:02:15.540 |
Before we begin, I'd like to emphasize that this podcast 00:02:18.300 |
is separate from my teaching and research roles at Stanford. 00:02:22.900 |
to bring zero-cost-to-consumer information about science 00:02:25.360 |
and science-related tools to the general public. 00:02:29.000 |
I'd like to thank the sponsors of today's podcast. 00:02:35.540 |
with cooling, heating, and sleep tracking capacity. 00:02:38.820 |
I've spoken many times before on this podcast 00:02:40.720 |
about the fact that getting a great night's sleep 00:02:46.400 |
One of the key things to getting a great night's sleep 00:02:51.680 |
And that's because in order to fall and stay deeply asleep, 00:02:57.640 |
And in order to wake up feeling refreshed and energized, 00:03:00.760 |
your body temperature actually has to increase 00:03:04.400 |
With Eight Sleep, you can program the temperature 00:03:06.080 |
of your sleeping environment in the beginning, middle, 00:03:10.840 |
like tracking the amount of rapid eye movement 00:03:14.080 |
things that are essential to really dialing in 00:03:17.880 |
I've been sleeping on an Eight Sleep mattress cover 00:03:24.840 |
I wake up far less often in the middle of the night, 00:03:29.060 |
than I ever did prior to using an Eight Sleep mattress cover. 00:03:39.280 |
Eight Sleep currently ships to the USA, Canada, UK, 00:03:47.040 |
Today's episode is also brought to us by LMNT. 00:03:51.240 |
that has everything you need and nothing you don't. 00:03:54.760 |
sodium, magnesium, and potassium, and no sugar. 00:03:59.580 |
for the functioning of every cell in your body, 00:04:08.240 |
LMNT contains the optimal ratio of electrolytes 00:04:20.000 |
and to make sure that I have adequate electrolytes 00:04:25.520 |
or even two packets, in 32 to 60 ounces of water 00:04:32.440 |
in order to make sure that I replace those electrolytes. 00:04:40.060 |
to get a free sample pack with your purchase. 00:04:45.800 |
I'm pleased to announce that we will be hosting 00:04:50.100 |
each of which is entitled The Brain-Body Contract, 00:04:52.920 |
during which I will share science and science-related tools 00:04:55.800 |
for mental health, physical health, and performance. 00:04:58.520 |
There will also be a live question and answer session. 00:05:05.800 |
as well as the event in Brisbane on February 24th. 00:05:09.280 |
Our event in Sydney at the Sydney Opera House 00:05:31.680 |
And now for my discussion with Mark Zuckerberg 00:05:37.680 |
and thank you for having me here in your home. 00:06:00.120 |
So maybe you could tell us what that big mission is, 00:06:01.680 |
and then we can get into some of the mechanics 00:06:03.280 |
of how that big mission can become a reality. 00:06:19.800 |
was think about how do we build a better future for everyone, 00:06:25.440 |
the resources that we have to bring philanthropically, 00:06:28.600 |
and the experiences that Mark and I have had, 00:06:33.280 |
for Mark as an engineer, and then our ability 00:06:40.000 |
Mark has been a builder throughout his career, 00:06:43.800 |
and what could we do if we actually put together a team 00:06:53.000 |
we've really been focused on what some people think 00:07:06.640 |
which is to be able to cure, prevent, or manage 00:07:13.600 |
And a lot of times people ask like, which disease? 00:07:16.200 |
And the whole point is that there is not one disease, 00:07:22.880 |
to where I always found the most hope as a physician, 00:07:26.040 |
which is new discoveries and new opportunities 00:07:29.680 |
and new ways of understanding how to keep people well 00:07:34.480 |
So our strategy at CZI is really to build tools, 00:07:39.280 |
fund science, change the way basic scientists 00:07:43.800 |
can see the world and how they can move quickly 00:08:03.000 |
to help move science along and make it easier 00:08:06.160 |
for scientists to do their work, and we do science. 00:08:09.680 |
You mentioned Stanford being an important pillar 00:08:15.920 |
institutes where teams can take on grand challenges 00:08:20.360 |
to do work that wouldn't be possible in a single lab 00:08:27.000 |
And our first Biohub was launched in San Francisco, 00:08:30.240 |
a collaboration between Stanford, UC Berkeley, and UCSF. 00:08:36.640 |
Curing all diseases implies that there will either be 00:08:44.920 |
which I'm certain there will be, and there already has been. 00:08:47.640 |
We can talk about some of those early successes in a moment, 00:08:51.240 |
but it also sort of implies that if we can understand 00:08:59.200 |
that transcend autism, Huntington's, Parkinson's, cancer, 00:09:07.300 |
there are some core principles that would make 00:09:13.960 |
What I'm basically saying is, how are you attacking this? 00:09:16.860 |
My belief is that the cell sits at the center 00:09:29.560 |
a little bit of what the cell is in your mind 00:09:37.400 |
about understanding disease in the context of cells, 00:09:40.060 |
because ultimately that's what we're made up of. 00:09:42.600 |
- Yeah, well, let's get to the cell thing in a moment, 00:09:47.260 |
we don't think that it's CZI itself that's going to cure them. 00:09:51.940 |
The goal is to basically give the scientific community 00:09:59.200 |
And we spent a lot of time when we were getting started 00:10:08.240 |
And if you look over this very long-term arc, 00:10:15.240 |
by the invention of a new tool or a new way to see something. 00:10:19.600 |
It's like having a telescope came before a lot 00:10:22.680 |
of discoveries in astronomy and astrophysics, 00:10:26.660 |
but similarly, the microscope and just different ways 00:10:39.680 |
that you were talking about, about building tools. 00:10:52.920 |
and a lot of the things that we're focused on, 00:10:54.840 |
including the work in single-cell and cell understanding, 00:10:58.380 |
which you can jump in and get into that if you want. 00:11:02.920 |
But yeah, I think we generally agree with the premise 00:11:08.640 |
from first principles, people study organs a lot, 00:11:15.120 |
across the body, but there's not a very widespread 00:11:18.880 |
understanding of how kind of each cell operates. 00:11:21.480 |
And this is sort of a big part of some of the initial work 00:11:26.680 |
and understanding what are the different cells, 00:11:28.280 |
and there's a bunch more work that we wanna do 00:11:31.880 |
But overall, I think when we think about the next 10 years 00:11:36.680 |
here of this long arc to try to empower the community 00:11:39.880 |
to be able to cure, prevent, or manage all diseases, 00:11:44.240 |
we think that the next 10 years should really be primarily 00:11:47.720 |
about being able to measure and observe more things 00:11:52.840 |
It's like you wanna look at something through a microscope, 00:11:57.280 |
because it's hard to see through skin, or things like that. 00:12:05.280 |
and this is sort of where the engineering background 00:12:07.980 |
comes in a bit, because when I think about this 00:12:11.040 |
from the perspective of how you'd write code or something, 00:12:14.320 |
you know, the idea of trying to debug or fix a code base, 00:12:17.200 |
but not be able to step through the code line by line, 00:12:21.240 |
And at the beginning of any big project that we do at Meta, 00:12:25.560 |
you know, we like to spend a bunch of the time up front 00:12:28.320 |
and understand, you know, what are we gonna look at, 00:12:32.760 |
so we know we're making progress, and know what to optimize. 00:12:35.000 |
And this is such a long-term journey that we think 00:12:38.400 |
that it actually makes sense to take the next 10 years 00:12:43.680 |
and understanding just how the human body works in action, 00:12:53.240 |
- Could I just interrupt briefly and just ask about 00:13:00.760 |
what CZI is in a unique position to bring to the quest 00:13:06.720 |
So I can think of, I mean, I know as a scientist 00:13:10.000 |
that money is necessary, but not sufficient, right? 00:13:13.000 |
Like when you have money, you can hire more people, 00:13:14.620 |
you can try different things, so that's critical, 00:13:22.060 |
The other component is you wanna be able to see things, 00:13:28.120 |
the normal disease process, like what is a healthy cell? 00:13:32.480 |
Are cells constantly being bombarded with challenges 00:13:35.500 |
and then repairing those, and then what we call cancer 00:13:37.960 |
is just kind of the runaway train of those challenges 00:13:39.940 |
not being met by the cell itself or something like that. 00:13:42.920 |
So better imaging tools, and then it sounds like 00:13:47.600 |
but a software component, this is where AI comes in. 00:13:50.100 |
So maybe we can, at some point, we can break this up 00:13:56.400 |
and healthy processes, we'll lump those together. 00:13:58.720 |
Then there's hardware, so microscopes, lenses, 00:14:12.260 |
And then I love the idea that maybe AI could do 00:14:24.180 |
that this particular gene and that particular gene 00:14:27.820 |
are potentially interesting, whereas a human being 00:14:30.300 |
would never make that potential connection, yeah. 00:14:33.360 |
- So the tools that CZI can bring to the table, 00:14:39.400 |
And we try to, there's lots of ways to fund science. 00:14:43.520 |
And just to be clear, what we fund is a tiny fraction 00:14:50.440 |
- You guys have been generous enough that it 00:14:53.040 |
definitely holds weight next to the NIH's contribution. 00:14:58.040 |
- Yeah, but I think every funder has its own role 00:15:02.400 |
And for us, it's really how do we incentivize 00:15:09.760 |
And so a lot of our grants include inviting people 00:15:17.080 |
Our first neuroscience RFA was aimed towards incentivizing 00:15:22.080 |
people from different backgrounds, immunologists, 00:15:28.320 |
our nervous system works and how to keep it healthy. 00:15:35.280 |
in the pre-print movement to accelerate the rate 00:15:37.400 |
of sharing knowledge and actually others being able 00:15:45.040 |
In terms of building, we build software and hardware, 00:15:54.720 |
that are more durable and scalable than someone 00:16:05.400 |
most scientists can tinker and build something useful 00:16:09.460 |
But it's really hard for them to be able to share 00:16:15.600 |
or forget the next lab over or across the globe. 00:16:19.760 |
So we partner with scientists to see what is useful, 00:16:24.680 |
In imaging, napari is a useful image annotation tool 00:16:36.160 |
Or CELLxGENE, which works on single-cell data sets 00:16:43.200 |
so that scientists can share data sets, analyze their own, 00:16:46.760 |
and contribute to a larger corpus of information. 00:16:56.060 |
that we're building easy to use, durable, translatable tools 00:17:00.800 |
across the scientific community in the areas that we work in. 00:17:13.840 |
And it's gonna be installed at our imaging institute 00:17:26.840 |
You know, we're partnering with fantastic scientists 00:17:29.720 |
in the Biohub network to build a mini phase plate 00:18:00.040 |
Cells are just the smallest units that are alive. 00:18:04.120 |
And all of our bodies have many, many, many cells. 00:18:09.120 |
There's some estimate of like 37 trillion cells, 00:18:17.300 |
And what do they look like when you're healthy? 00:18:36.540 |
looking at how different mutations in your genetic code 00:18:39.920 |
lead you to be more susceptible to getting sick 00:18:48.000 |
to wow, you now have Huntington's disease, for instance. 00:18:53.000 |
And there's a lot that happens in the middle. 00:18:58.080 |
And one of the questions that we're going after at CZI is what actually happens? 00:19:02.800 |
So an analogy that I like to use to share with my friends 00:19:05.820 |
is right now, say we have a recipe for a cake. 00:19:15.720 |
We don't know how the chef interprets the typo. 00:19:23.160 |
how it's exactly connected to how the cake didn't turn out, 00:19:31.960 |
but we can actually systematically try to break this down. 00:19:35.520 |
And one segment of that journey that we're looking at 00:19:43.940 |
And all of your cells have what's called mRNA. 00:19:53.640 |
And what our work in single cell is looking at 00:19:59.760 |
is actually interpreting your DNA slightly differently. 00:20:08.140 |
and when sick cells are interpreting those directions. 00:20:16.120 |
There's different large sets of mRNA in each cell. 00:20:22.760 |
is looking at how, first of all, gathering that information. 00:20:35.520 |
where we've gone from in 2017 funding some methods work 00:20:43.760 |
but nearly complete atlases of how the human body works, 00:20:47.400 |
how flies work, how mice work at the single cell level, 00:20:57.880 |
and when you're healthy and when you're sick. 00:21:00.240 |
And the neat thing about the sort of inflection point 00:21:05.000 |
where we're at in AI is that I can't look at this data 00:21:11.600 |
And biology is complex, human bodies are complex. 00:21:22.640 |
and gain insights, look at what trends are consistent 00:21:32.720 |
And eventually our hope through the use of these data sets 00:21:43.960 |
a cell that's completely built off of the data sets 00:21:59.000 |
- Do you think we've cataloged the total number 00:22:13.240 |
they've categorized 18 plus different types of fat cells. 00:22:18.120 |
We always think of like a fat cell versus a muscle cell. 00:22:31.360 |
for what we see in advanced type 2 diabetes 00:22:31.360 |
but I always thought of single cell sequencing 00:22:59.400 |
Okay, so you have all these genes and you could say, 00:23:07.240 |
one of these fat cells or muscle cells for that matter, 00:23:22.560 |
that are important, but maybe one of those genes 00:23:32.340 |
And so I guess what I'm trying to get to here is, 00:23:45.260 |
'Cause I'll tell you that the graduate students 00:23:50.420 |
And that's really the challenge, let alone one lab. 00:23:55.060 |
and hopefully it's not getting outside the scope 00:24:01.040 |
but what we're basically saying is you have to pick at some point 00:24:14.520 |
when we first launched this work was cystic fibrosis. 00:24:18.720 |
Cystic fibrosis is caused by a mutation in CFTR. 00:24:23.000 |
It affects a certain channel that makes it hard 00:24:28.960 |
When I went to medical school, it was taught as fact. 00:24:32.800 |
These are people carrying around sacs of fluid filling up. 00:24:37.640 |
and then they have to literally dump the fluid out. 00:24:43.840 |
And when we applied single cell methodologies to the lungs, 00:24:54.840 |
and the CF mutation, the cystic fibrosis mutation, 00:25:07.960 |
and we'll continue to discover new relationships 00:25:11.840 |
which leads me to the second example I wanna bring up 00:25:21.360 |
It's starting to allow us to say this mutation, 00:25:28.720 |
And we actually have built a tool at CZI called CELLxGENE 00:25:33.720 |
where you can put in the mutation that you're interested in 00:25:37.680 |
and it gives you a heat map across cell types 00:25:59.220 |
That allows you to generate a hypothesis, why? 00:26:07.560 |
Really exciting way to look and ask questions differently. 00:26:13.920 |
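To make the kind of query being described concrete (a gene of interest in, an expression-by-cell-type profile out), here is a minimal Python sketch. The table of mean expression values is invented toy data and the function name is hypothetical; the real CELLxGENE tool operates on full single-cell atlas datasets rather than a precomputed summary like this.

```python
# Minimal sketch of a gene -> expression-by-cell-type lookup, the kind
# of question CELLxGENE is described as answering. All numbers below
# are invented toy data for illustration only.
import pandas as pd

# Hypothetical precomputed table: mean expression of each gene (columns)
# in each cell type (rows), as might be summarized from a cell atlas.
mean_expression = pd.DataFrame(
    {
        "CFTR": [8.2, 0.3, 0.1, 0.4],  # cystic fibrosis gene
        "INS":  [0.0, 9.5, 0.0, 0.1],  # insulin
        "MYH7": [0.1, 0.0, 7.8, 0.2],  # cardiac muscle myosin
    },
    index=["lung secretory", "pancreatic beta", "cardiomyocyte", "T cell"],
)

def expression_profile(gene: str) -> pd.Series:
    """Return mean expression of one gene across all cell types,
    sorted so the strongest-expressing cell types come first."""
    return mean_expression[gene].sort_values(ascending=False)

# A researcher interested in a cystic fibrosis mutation would see at a
# glance which cell types actually express the gene it sits in.
print(expression_profile("CFTR"))
```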
where if you're trying to develop a therapy, a drug, 00:26:18.140 |
and the goal is to treat the function of the heart, 00:26:25.680 |
So what, is there going to be an unexpected side effect 00:26:30.820 |
as you're bringing this drug to clinical trials? 00:26:43.840 |
because if I look at the advances in neuroscience 00:26:55.660 |
Everyone prior to that talked about the brain 00:27:02.420 |
between single cells, organs, and systems, right? 00:27:09.820 |
And everyone nowadays is familiar with like the gut-brain axis, 00:27:14.100 |
but rarely is the interaction between organs discussed, 00:27:23.700 |
So that tool was generated by CZI or CZI funded that tool. 00:27:46.660 |
might be in a laboratory known for working on heart, 00:27:51.800 |
with other scientists that work on the pancreas, 00:27:56.620 |
because it bridges the divide between these fields. 00:28:00.420 |
not just different buildings, but people rarely talk 00:28:07.180 |
'cause one, they're the future of science, as you know. 00:28:14.680 |
we also will pull up the most relevant papers to that gene. 00:28:22.300 |
As we all know, quality nutrition influences, of course, 00:28:25.220 |
our physical health, but also our mental health 00:28:29.320 |
our ability to learn new things and to focus. 00:28:31.700 |
And we know that one of the most important features 00:28:37.800 |
from high quality unprocessed or minimally processed sources 00:28:41.420 |
as well as enough probiotics and prebiotics and fiber 00:28:44.500 |
to support basically all the cellular functions in our body, 00:28:51.560 |
try to get optimal nutrition from whole foods, 00:29:01.760 |
is getting enough servings of high quality fruits 00:29:03.820 |
and vegetables per day, as well as fiber and probiotics 00:29:06.780 |
that often accompany those fruits and vegetables. 00:29:11.000 |
long before I ever had a podcast, I started drinking AG1. 00:29:20.340 |
and the reason I still drink AG1 once or twice a day 00:29:23.640 |
is that it provides all of my foundational nutritional needs. 00:29:28.380 |
that I get the proper amounts of those vitamins, minerals, 00:29:31.180 |
probiotics, and fiber to ensure optimal mental health, 00:29:53.160 |
- I just think going back to your question from before, 00:29:55.500 |
I mean, are there going to be more cell types 00:30:01.460 |
you know, it doesn't seem like we're ever done, right? 00:30:04.980 |
But I think that that gets to one of the things 00:30:09.280 |
that I think are the strengths of modern LLMs 00:30:13.020 |
is the ability to kind of imagine different states 00:30:24.180 |
that you can now train a kind of large scale model on. 00:30:29.100 |
And one of the things that we're doing at CZI, 00:30:33.700 |
is building what we think is one of the largest 00:30:44.440 |
And it's larger than what most people have access to 00:30:47.500 |
in academia that you can do serious engineering work on. 00:31:06.140 |
and all the different states that they can be in 00:31:12.380 |
different, you know, interact with each other, 00:31:20.100 |
I think this is where it's helpful to understand, 00:31:23.420 |
and be grounded in like the modern state of AI. 00:31:25.820 |
I mean, these things are not foolproof, right? 00:31:31.940 |
So the question is, how do you make it so that 00:31:34.420 |
that can be an advantage rather than a disadvantage? 00:31:37.580 |
And I think the way that it ends up being an advantage 00:31:39.780 |
is when they help you imagine a bunch of states 00:31:42.420 |
that someone could be in, but then you, you know, 00:31:46.420 |
that those are true, whether they're, you know, 00:31:54.060 |
But, you know, we're not yet at the state with AI 00:31:56.860 |
that you can just take the outputs of these things 00:32:02.780 |
But they are very good, I think, as you said, 00:32:05.300 |
hypothesis generators or possible solution generators 00:32:15.380 |
building on the first five years of science work 00:32:19.260 |
that's been built out, carry that forward into something 00:32:21.580 |
that I think is gonna be a very novel tool going forward. 00:32:29.260 |
I mean, you all, you had this exchange a little while back 00:32:33.260 |
about, you know, funding levels and how CZI is, you know, 00:32:36.740 |
just sort of a drop in the bucket compared to NIH. 00:32:42.900 |
the thing that I think we can do that's different 00:32:45.580 |
is funding some of these longer term, bigger projects 00:32:54.620 |
And it's a lot of what most science funding is, 00:32:57.820 |
is like relatively small projects that are exploring things 00:33:09.100 |
They're often projects that require, you know, 00:33:12.540 |
and world-class engineering teams and infrastructure to do. 00:33:15.660 |
And that I think is a pretty cool contribution to the field 00:33:19.300 |
that I think is, there aren't as many other folks 00:33:24.900 |
But that's one of the reasons why I'm personally excited 00:33:28.340 |
'Cause it just, it's like this perfect intersection 00:33:30.940 |
of all the stuff that we've done and single cell, 00:33:32.740 |
the previous collaborations that we've done with the field 00:33:36.220 |
and, you know, bringing together the industry 00:33:41.260 |
- Yeah, I completely agree that the model of science 00:33:47.220 |
isn't just distinct from NIH's, but it's extremely important. 00:33:54.100 |
is what's driven the progression of science in this country 00:34:04.220 |
because it allows for that image we have of a scientist 00:34:07.980 |
kind of tinkering away or the people in their lab 00:34:11.400 |
And that hopefully translates to better human health. 00:34:16.780 |
But I think in my opinion, we've moved past that model 00:34:28.180 |
I think that that's, these tools empower those folks. 00:34:29.620 |
- Sure, and there are mechanisms to do that like NIH, 00:34:35.300 |
It's sort of interesting that we're sitting here not far 00:34:40.140 |
I mean, not far from the garage model of tech, right? 00:34:43.720 |
The Hewlett-Packard model, not far from here at all. 00:34:47.720 |
And the idea was that the tinkerer in the garage, 00:34:53.480 |
that to implement all the technologies they discovered 00:34:57.400 |
So there's a similarity there to Facebook, meta, et cetera. 00:35:01.600 |
But I think in science, we imagine that the scientists alone 00:35:04.320 |
in their laboratory and those Eureka moments, 00:35:13.920 |
And one of the tools that you keep coming back to 00:35:20.780 |
what is a large language model for the uninformed? 00:35:27.760 |
And what does it allow us to do that different types of, 00:35:35.600 |
Or more importantly, perhaps, what does it allow us to do 00:35:42.580 |
staring at the data, what can it do that they can't do? 00:35:50.200 |
of machine learning has been about building systems, 00:35:56.260 |
that can basically make sense and find patterns 00:36:01.960 |
And there was a breakthrough a number of years back 00:36:11.720 |
And it was this huge breakthrough because before then, 00:36:15.360 |
there was somewhat of a cap where if you fed more data 00:36:22.220 |
it didn't really glean more insights from it. 00:36:25.160 |
Whereas transformers just, we haven't seen the end 00:36:35.760 |
but we just haven't built big enough systems yet. 00:36:40.100 |
I think this is actually one of the big questions 00:36:41.740 |
in the AI field today is basically are transformers 00:36:46.280 |
and are the current model architecture sufficient? 00:36:48.360 |
And if you just build larger and larger clusters, 00:36:57.200 |
to this architecture that we just haven't reached yet? 00:37:09.340 |
that I think will unlock a ton of really futuristic 00:37:14.240 |
But there's no doubt that even just being able 00:37:16.020 |
to process the amount of data that we can now 00:37:23.720 |
And the reason why they're called large language models 00:37:28.440 |
is people basically feed in all of the language 00:37:35.380 |
And you can think of them as basically prediction machines. 00:37:54.660 |
Or you could train it so that it could be a chatbot, right? 00:37:57.460 |
Where, okay, if you're prompted with this question, 00:38:02.780 |
But one of the interesting things is it turns out 00:38:11.160 |
if you use that model architecture for a network 00:38:14.260 |
and instead you feed it all of the human cell atlas data, 00:38:18.860 |
then if you prompt it with a state of a cell, 00:38:28.300 |
or different states that the cell could be in next 00:38:32.960 |
So for instance, if you give it a bunch of genetics data, 00:38:36.520 |
and then you give it a genetics class so it understands 00:38:38.620 |
that, you know, you've got DNA, RNA, mRNA, and proteins? 00:38:45.180 |
is they're basically pattern recognition systems. 00:38:48.820 |
So they're these like very deep statistical machines 00:38:57.540 |
So it's not actually, I mean, you don't need to teach 00:39:03.040 |
it to speak a language, you know, a lot of specific things 00:39:11.640 |
about something in English, but then you also give it 00:39:16.240 |
a bunch of examples of people speaking Italian, 00:39:21.320 |
that it learned in English and Italian, right? 00:39:25.240 |
and just the pattern recognition is the thing 00:39:28.920 |
that is pretty profound and powerful about this. 00:39:31.700 |
But it really does apply to a lot of different things. 00:39:40.440 |
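To make the "prediction machine" framing concrete, here is a deliberately tiny Python sketch: a bigram counter that learns, from training sequences, which token tends to follow which, and predicts likely next tokens. Real transformers learn vastly richer patterns over long contexts, but the training objective sketched here, predict the next element of a sequence, is the same idea whether the tokens are words or, as described above, cell states. The state labels are invented for illustration.

```python
from collections import Counter, defaultdict

def train(sequences):
    """Count, for every token, which tokens follow it in the data."""
    follows = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            follows[current][nxt] += 1
    return follows

def predict_next(follows, token, k=3):
    """Return up to k most likely next tokens after `token`, with counts."""
    return follows[token].most_common(k)

# The tokens could be words in a language -- or, analogously, labeled
# cell states fed in from atlas-scale data (labels invented here).
corpus = [
    ["healthy", "stressed", "inflamed", "repaired", "healthy"],
    ["healthy", "stressed", "repaired", "healthy"],
    ["healthy", "stressed", "inflamed", "fibrotic"],
]
model = train(corpus)
print(predict_next(model, "stressed"))  # [('inflamed', 2), ('repaired', 1)]
print(predict_next(model, "inflamed"))  # plausible next states, with counts
```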
that basically the folks at DeepMind have done 00:39:47.060 |
of the same model architecture, but instead of language, 00:39:50.680 |
there they kind of fed in all of the protein data 00:39:54.860 |
and you can give it a state and it can spit out solutions 00:39:59.560 |
So it's very powerful, I don't think we know yet 00:40:01.960 |
as an industry what the natural limits of it are 00:40:06.960 |
and that that's one of the things that's pretty exciting 00:40:11.760 |
But it certainly allows you to solve problems 00:40:27.400 |
and in vivo in living organisms, model organisms or humans 00:40:36.560 |
of biomedical research, certainly the work of CZI, 00:40:52.260 |
but I love the idea that we can run experiments 00:40:58.320 |
- I think the in silico experiments are going 00:41:01.560 |
to be incredibly helpful to test things quickly 00:41:04.920 |
and cheaply, and to just unleash a lot of creativity. 00:41:22.080 |
is we've basically cured every single disease in mice. 00:41:34.520 |
And a lot of times that research is relevant, 00:41:38.400 |
but not directly one-to-one translatable to humans. 00:41:42.320 |
So you just have to be really careful about making sure 00:41:54.000 |
As I'm hearing all of this, I'm thinking, okay, 00:42:02.560 |
The idea of a new field where you certainly embrace 00:42:06.280 |
the realities of universities and laboratories, 00:42:11.400 |
So maybe we need to think about what it means 00:42:16.620 |
And I think that's one of the things that's most exciting. 00:42:19.540 |
Along those lines, it seems that bringing together 00:42:21.680 |
a lot of different types of people at different 00:42:24.280 |
major institutions is going to be especially important. 00:42:28.960 |
So I know that the initial CZI Biohub, gratefully, 00:42:33.960 |
included Stanford, we'll put that first in the list, 00:42:43.160 |
But there are now some additional institutions involved. 00:42:47.760 |
So maybe you could talk about that and what motivated 00:42:57.960 |
- Well, I'll just say, a big part of why we wanted 00:43:01.600 |
to create additional Biohubs is we were just so impressed 00:43:07.600 |
And I also think, and you should walk through the work 00:43:11.900 |
of the Chicago Biohub and the New York Biohub 00:43:14.400 |
that we just announced, but I think it's actually 00:43:16.260 |
an interesting set of examples that balance the limits 00:43:21.260 |
of what you want to do with like physical material 00:43:24.460 |
engineering and where things are purely biological. 00:43:28.740 |
Because the Chicago team is really building more sensors 00:43:32.260 |
to be able to understand what's going on in your body. 00:43:35.740 |
engineering challenge, whereas the New York team, 00:43:39.560 |
we basically talk about this as like a cellular endoscope 00:43:42.700 |
of being able to have like an immune cell or something 00:43:51.260 |
But it's not like a physical piece of hardware. 00:43:52.900 |
It's a cell that you can basically just go report out 00:43:57.900 |
on different things that are happening inside the body. 00:44:05.900 |
But I mean, you should go into more detail on all this. 00:44:08.160 |
- So a core principle of how we think about Biohubs 00:44:11.080 |
is that it has to be, when we invited proposals, 00:44:17.860 |
So really breaking down the barrier of a single university. 00:44:22.220 |
Oftentimes asking for the people designing the research aim 00:44:28.780 |
And to explain why the problem that they wanna solve 00:44:32.300 |
requires interdisciplinary, inter-university collaboration. 00:44:40.940 |
We just put that request for a proposal out there 00:44:46.580 |
where they've done incredible work in single cell biology 00:45:02.260 |
And we are so, so excited that we've been able 00:45:18.080 |
And if I, obviously these universities are multifaceted, 00:45:26.100 |
Northwestern has an incredible medical system 00:45:38.180 |
University of Illinois is a computing powerhouse. 00:45:44.620 |
that they were gonna start thinking about cells and tissue. 00:45:48.360 |
So that one of the layers that you just alluded to. 00:45:52.340 |
So how do the cells that we know behave and act differently 00:46:05.140 |
as a collaboration under the leadership of Shana Kelley, 00:46:13.020 |
The architecture looks the same as what's in you and I. 00:46:21.080 |
super thin sensors, and they embed these sensors 00:46:24.860 |
throughout the layers of this engineered tissue. 00:46:34.840 |
and what happens when these cells get inflamed. 00:46:37.640 |
Inflammation is an incredibly important process 00:46:42.820 |
And so this is another sort of disease agnostic approach. 00:46:48.340 |
And they're gonna get a ton of information out 00:47:09.560 |
and then you can apply a large language model 00:47:11.760 |
to look at the earliest statistically significant changes 00:47:15.680 |
that can allow you to intervene as early as possible. 00:47:24.820 |
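As a simple stand-in for the idea of flagging the earliest statistically significant change in a stream of sensor readings, here is a hedged Python sketch using an ordinary two-sample t-test per timepoint. The CZI work described above applies large models to far richer data; this only illustrates the "earliest detectable change" framing, on simulated numbers, and in practice one would also correct for multiple comparisons.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated sensor readings: 20 replicates per timepoint. Baseline is
# noise around 1.0; a gradual real shift begins at timepoint 6.
baseline = rng.normal(1.0, 0.1, size=20)
timepoints = [
    rng.normal(1.0 + max(0, t - 5) * 0.08, 0.1, size=20) for t in range(12)
]

def earliest_significant_change(baseline, timepoints, alpha=0.01):
    """Return (timepoint, p-value) for the first timepoint whose readings
    differ from baseline by a two-sample t-test, or None if none do."""
    for t, readings in enumerate(timepoints):
        p = stats.ttest_ind(baseline, readings).pvalue
        if p < alpha:
            return t, p
    return None

print(earliest_significant_change(baseline, timepoints))
```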
They're also looking at the neuromuscular junction, 00:47:27.560 |
which is the connection between where a neuron 00:47:30.800 |
attaches to a muscle and tells the muscle how to behave. 00:47:33.860 |
Super important in things like ALS, but also in aging. 00:47:48.060 |
And so we wanna be able to embed these sensors 00:47:50.780 |
to understand how these different interconnected systems 00:48:08.520 |
to be able to go in and identify changes in a human body. 00:48:20.780 |
I mean, this is, I don't wanna go on a tangent, 00:48:23.500 |
but for those that wanna look it up, adaptive optics, 00:48:29.240 |
when you try and look at something really small 00:48:30.780 |
or really far away, and really smart physicists 00:48:37.460 |
make those actual lenses of the microscope. 00:48:45.320 |
It's not intuitive, but then when you hear it, 00:48:50.540 |
Make the cells that already can navigate to tissues 00:49:00.500 |
and my family is, this is "Fantastic Voyage," but real life. 00:49:09.580 |
which, you know, are privileged and already working 00:49:11.900 |
to keep your body healthy and being able to target them 00:49:17.020 |
So like you can engineer an immune cell to go in your body 00:49:21.140 |
and look inside your coronary arteries and say, 00:49:24.380 |
are these arteries healthy or are there plaques? 00:49:27.860 |
Because plaques lead to blockage, which leads to heart attacks 00:49:32.500 |
and the cell can then record that information 00:49:39.860 |
The second half is can you then engineer the cells 00:49:44.280 |
Can I then tell a different cell, immune cell, 00:49:46.780 |
that is able to transport in your body to go in 00:49:59.860 |
that your immune system normally doesn't have access to. 00:50:06.520 |
They'll also look at a number of neurodegenerative diseases 00:50:09.680 |
since the immune system doesn't presently have a ton 00:50:17.240 |
But it's both mind-blowing and it feels like sci-fi, 00:50:37.620 |
- Yeah, I mean, it's a 10 to 15 year project. 00:50:43.580 |
- I love the optimism and the moment you said, 00:50:48.020 |
it's like, yes, yes, and yes, it just makes so much sense. 00:50:52.580 |
What motivated the decision to do the work of CZI 00:50:56.320 |
in the context of existing universities as opposed to, 00:51:00.740 |
there's still some real estate up in Redwood City 00:51:02.600 |
where there's a bunch of space to put biotech companies 00:51:11.680 |
I mean, it's a very interesting decision to do this 00:51:16.860 |
of like graduate students that need to do a thesis 00:51:20.260 |
'Cause there's a whole set of structures within academia 00:51:26.100 |
You know, that independent investigator model 00:51:30.140 |
it's so core to the way science has been done. 00:51:32.900 |
This is very different and frankly sounds far more efficient 00:51:37.120 |
And, you know, we'll see if I renew my NIH funding 00:51:39.500 |
after saying that, but I think we all want the same thing. 00:51:42.880 |
We all want to, as scientists and as, you know, as humans, 00:51:49.040 |
and we want healthy people to stay healthy 00:52:00.260 |
are actually independent of the universities. 00:52:07.560 |
maybe 50 people working on sort of deep efforts. 00:52:17.220 |
are actually going to, one, want to leave a university 00:52:21.600 |
or want to take on the full-time scope of this project. 00:52:25.580 |
So it's the ability to partner with universities 00:52:29.580 |
and to have the faculty at all the universities 00:52:40.420 |
But a lot of the way that we're approaching CZI 00:52:43.300 |
is this long-term iterative project to figure out, 00:52:48.260 |
figure out which things produce the most interesting results 00:52:51.780 |
and then double down on those in the next five-year push. 00:52:57.740 |
where we kind of wrapped up the first five years 00:53:07.340 |
we don't think it's like the best or only model, 00:53:10.100 |
but we found that it was sort of a really interesting way 00:53:21.000 |
And it's not something that is widely being pursued 00:53:26.580 |
So we figured, okay, this is like an interesting thing 00:53:30.500 |
But I mean, yeah, we do believe in the collaboration, 00:53:39.300 |
that we're pursuing this is like the only way to do this 00:53:43.580 |
We're pretty aware of what is the rest of the ecosystem 00:53:53.680 |
and also fills in an incredibly important niche 00:54:07.860 |
that a particular set of genes acting in a cluster, 00:54:19.800 |
I mean, still one of the deadliest of the cancers. 00:54:22.800 |
And there are others that you certainly wouldn't want to get, 00:54:26.360 |
but that's among the ones you wouldn't want to get the most. 00:54:35.440 |
that then bear out in vitro in a dish and in a mouse model. 00:54:40.840 |
How is the actual implementation to drug discovery? 00:54:43.920 |
Or maybe this target is druggable, maybe it's not. 00:55:01.300 |
This is where it's important to work in an ecosystem 00:55:10.040 |
that take that and bring it to translation very effectively. 00:55:14.400 |
I would say the place where we have a small window 00:55:29.880 |
rare disease organizations where patients come together 00:55:34.280 |
and actually pool their collective experience. 00:55:48.540 |
and with drug developers to incentivize drug developers 00:55:53.500 |
to focus on what they may need for their disease. 00:56:04.080 |
and collectively impact many, many individuals. 00:56:15.200 |
the incredibly fascinating thing about rare diseases 00:56:26.860 |
when genes that are mutated cause very specific diseases 00:56:31.860 |
but that tell you how the normal biology works as well. 00:56:48.040 |
the targets will be explored by biotech companies. 00:56:54.480 |
- There've also, I think, been a couple of teams 00:57:03.080 |
that we're gonna pursue because we're a philanthropy, 00:57:14.980 |
and run towards building ultimately therapeutics. 00:57:30.160 |
Forgive me for switching to a personal question, 00:57:37.000 |
- I will say that we are incredibly hopeful people, 00:57:40.360 |
but it manifests in different ways between the two of us. 00:57:45.440 |
- How would you describe your optimism versus mine? 00:58:00.600 |
I mean, I think I'm more probably technologically optimistic 00:58:08.220 |
And I think you, because of your focus as an actual doctor, 00:58:13.220 |
kind of have more of a sense of how that's gonna affect 00:58:22.860 |
Whereas for me, it's like, I mean, a lot of my work, 00:58:26.500 |
it is, it's like we touch a lot of people around the world, 00:58:32.460 |
And I think for you, it's like being able to improve 00:58:36.540 |
the lives of individuals, whether it's students 00:58:42.900 |
through the education work, which isn't the goal here, 00:58:45.600 |
or just being able to improve people's lives in that way, 00:59:02.900 |
'cause in a day-to-day life, as like life partners, 00:59:24.140 |
Mark is an optimist whenever I'm waiting for him. 00:59:30.340 |
- That's what I think when I'm in the driveway 00:59:35.140 |
And so his optimism translates to some tardiness, 00:59:51.420 |
calling people to sort of like bring something to life. 01:00:06.260 |
in a lot of different aspects of life, right? 01:00:21.220 |
- But I do think that there's really something to it, right? 01:00:23.540 |
And there's like, if you're discussing any idea, 01:00:26.420 |
there's all these reasons why it might not work. 01:00:28.780 |
And so I think that, and those reasons are probably true. 01:00:39.100 |
is that the most productive way to view the world? 01:00:49.460 |
Because if you don't believe that something can get done, 01:00:57.620 |
the future is looking so dark in these various ways. 01:01:18.180 |
there's sort of a set of big statements on the wall. 01:01:21.020 |
One, the future can be better than the present 01:01:25.940 |
Maybe even, you said eliminating diseases, all diseases. 01:01:36.940 |
money and time and energy and people and technology 01:01:47.140 |
a significant modifier in terms of your view of the future? 01:01:51.700 |
Like, wow, like you hear all this doom and gloom. 01:01:53.180 |
Like what's the future going to be like for them? 01:01:57.900 |
what would it look like if there was a future 01:02:06.480 |
But was having children sort of an inspiration 01:02:18.740 |
And I'll just tell a very brief story about my family. 01:02:22.020 |
I'm the daughter of Chinese Vietnamese refugees. 01:02:25.900 |
My parents and grandparents were boat people. 01:02:29.420 |
If you remember, people left Vietnam during the war 01:02:32.580 |
in these small boats into the South China Sea. 01:02:41.980 |
And so my grandparents, both sets of grandparents 01:02:45.700 |
decided that there was a better future out there 01:02:51.660 |
But they were afraid of losing all of their kids. 01:02:58.180 |
And so they decided that there was something out there 01:03:03.180 |
in this bleak time and they paired up their kids, 01:03:14.860 |
and just said, "We'll see you on the other side." 01:03:17.580 |
And the kids were between the ages of like 10 to 25. 01:03:25.620 |
My mom was a teenager, early teen when this happened. 01:03:34.660 |
So how could I not believe that better is possible? 01:03:38.980 |
And I hope that that's in my epigenetics somewhere 01:03:50.380 |
And I so appreciate that you became a physician 01:03:57.720 |
and cognitive understanding and emotional understanding 01:04:01.460 |
So I'm grateful to the people that made that decision. 01:04:23.740 |
the risk and sort of willingness of my grandparents 01:04:31.440 |
And our own children sort of give it a sense of urgency. 01:04:37.940 |
And you're sending knowledge out into the fields of science 01:04:41.300 |
and bringing knowledge into the fields of science. 01:04:42.900 |
And I love this, we'll see you on the other side. 01:05:03.000 |
So in this case, you get two for the price of one, 01:05:08.020 |
- Having children definitely changes your time horizon. 01:05:13.900 |
like there were all these things that I think 01:05:15.220 |
we had talked about for as long as we've known each other 01:05:33.100 |
- And like sitting in the hospital delivery room, 01:05:37.140 |
finishing editing the letter that we were going to publish 01:05:40.980 |
to announce the work that we're doing on CZI. 01:05:55.620 |
And it's already been tremendously successful 01:06:02.260 |
and I have a lot of communication with those folks, 01:06:08.060 |
And thank you for expanding to the Midwest and New York. 01:06:10.220 |
And we're all very excited to see where all of this goes. 01:06:25.060 |
InsideTracker is a personalized nutrition platform 01:06:33.600 |
I've long been a believer in getting regular blood work done 01:06:36.340 |
for the simple reason that many of the factors 01:06:38.180 |
that impact your immediate and long-term health 01:06:40.420 |
can only be analyzed from a quality blood test. 01:06:43.260 |
A major problem with a lot of blood tests out there, 01:06:47.740 |
about metabolic factors, lipids and hormones and so forth, 01:06:50.620 |
but you don't know what to do with that information. 01:06:56.820 |
that allows you to see the levels of all those things, 01:06:59.360 |
metabolic factors, lipids, hormones, et cetera, 01:07:01.860 |
but it gives you specific directives that you can follow 01:07:04.740 |
that relate to nutrition, behavioral modification, 01:07:08.340 |
that can help you bring those numbers into the ranges 01:07:16.680 |
to get 20% off any of InsideTracker's plans. 01:07:22.780 |
And now for my discussion with Mark Zuckerberg. 01:07:41.660 |
you're starting to become synonymous with health, 01:07:50.100 |
of you rolling jiu-jitsu, you won a jiu-jitsu competition 01:07:53.240 |
recently, you're doing other forms of martial arts, 01:07:57.000 |
water sports, including surfing and on and on. 01:08:05.020 |
but maybe we could just start off with technology 01:08:12.500 |
which is that I think many people assume that technology, 01:08:16.460 |
especially technology that involves a screen, excuse me, 01:08:19.500 |
of any kind is going to be detrimental to our health, 01:08:23.320 |
but that doesn't necessarily have to be the case. 01:08:26.780 |
So could you explain how you see technology meshing with, 01:08:37.300 |
- Sure, I mean, I think this is a really important topic. 01:08:49.060 |
I think how you're using the technology has a big impact 01:08:52.180 |
on whether it is basically a positive experience for you. 01:08:56.360 |
And even within technology, even within social media, 01:08:59.460 |
there's not kind of one type of thing that people do. 01:09:08.960 |
And there's a lot of research that basically suggests 01:09:15.500 |
and the friendships that kind of bring the most happiness 01:09:19.260 |
in our lives, and at some level end up even correlating 01:09:25.200 |
because that kind of grounding that you have in community 01:09:36.260 |
to understand what's going on in people's lives, 01:09:38.680 |
have empathy for them, communicate what's going on 01:09:41.260 |
with your life, express that, that's generally positive. 01:09:47.380 |
in terms of bad interactions, things like bullying, 01:09:50.980 |
which we can talk about because there's a lot 01:09:54.080 |
that people can be safe from that and give people tools 01:09:56.740 |
and give kids the ability to have the right parental 01:10:01.420 |
But that's sort of the interacting with people side. 01:10:07.160 |
which I think of as just like passive consumption, 01:10:10.860 |
which at its best, it's entertainment, right? 01:10:15.260 |
And entertainment is an important human thing too. 01:10:18.060 |
But I don't think that that has quite the same association 01:10:22.420 |
with the long-term wellbeing and health benefits 01:10:27.420 |
as being able to help people connect with other people does. 01:10:40.360 |
a lot of the news is just so relentlessly negative 01:10:43.960 |
that it's just hard to come away from an experience 01:10:47.040 |
where you're looking at the news for half an hour 01:10:56.560 |
I think the more that social media is about connecting 01:11:00.220 |
with people and the more that when you're consuming 01:11:09.640 |
to learn about things that kind of enrich you 01:11:22.320 |
that we try to get right across our products. 01:11:24.600 |
And I think we're pretty aligned with the community 01:11:39.760 |
but I think it's as important when you're designing a product 01:11:43.060 |
to think about what kind of feeling you're creating 01:11:50.800 |
or just kind of like what do you make people feel. 01:11:56.880 |
So I think that doesn't mean that we want to shelter people 01:12:00.960 |
from bad things that are happening in the world, 01:12:03.060 |
but I don't really think that it's what people want 01:12:13.860 |
So we work hard on all these different problems, 01:12:17.080 |
making sure that we're helping connect people 01:12:20.160 |
helping make sure that we give people good tools 01:12:23.480 |
to block people who might be bullying them or harass them, 01:12:28.000 |
Anyone under the age of 16 defaults into an experience 01:12:36.920 |
what their children are up to, in a good balance. 01:12:36.920 |
We try to give people tools so that if you're a teen 01:13:02.360 |
So I think that there were things that you can do 01:13:03.920 |
to kind of push this in a positive direction, 01:13:05.600 |
but I think it just starts with having a more nuanced view 01:13:11.740 |
And the more that you can make it kind of a positive thing, 01:13:18.360 |
In terms of the negative experience, I agree. 01:13:21.360 |
I don't think anyone wants a negative experience 01:13:24.460 |
I think where some people get concerned perhaps, 01:13:27.200 |
and I think about my own interactions with say Instagram, 01:13:29.360 |
which I use all the time for getting information out, 01:13:35.220 |
Where I essentially launched the non-podcast segment 01:13:40.380 |
I can think of experiences that are a little bit 01:14:02.260 |
Occasionally I noticed, and this just reflects my failure, 01:14:07.020 |
That there are a lot of like street fight things, 01:14:10.120 |
like of people beating people up on the street. 01:14:15.800 |
I'm not somebody that enjoys seeing violence per se, 01:14:18.060 |
but I find myself, I'll click on one of these, 01:14:23.680 |
and there's like a little melee on the street or something. 01:14:26.040 |
And those seem to be offered to me a lot lately. 01:14:32.380 |
But I noticed that it has a bit of a gravitational pull 01:14:39.360 |
It's not teaching me any kind of useful street 01:14:48.120 |
I also really enjoy some of the cute animal stuff. 01:14:52.140 |
So there's this polarized collage that's offered to me 01:15:02.780 |
but actually it fills me with a feeling in some cases 01:15:11.080 |
interactions between animals I've never seen before 01:15:15.460 |
that when I leave Instagram, I do think I'm better off. 01:15:19.020 |
So I'm grateful for the algorithm in that sense. 01:15:31.560 |
Or is it also trying to do exactly what you described, 01:15:35.200 |
which is trying to give people a good feeling experience 01:15:51.620 |
So articles that would have basically a headline 01:15:57.200 |
that grabbed your attention, that made you feel like, 01:15:59.440 |
oh, I need to click on this, and then you click on it. 01:16:01.500 |
And then the article is actually about something 01:16:14.180 |
But it's actually a pretty straightforward exercise 01:16:20.340 |
and then they don't really spend a lot of time 01:16:33.560 |
saying that they're having a good experience. 01:16:39.580 |
is just by looking at how people use the services. 01:16:42.180 |
But I think it's also important to balance that 01:16:49.500 |
here are the stories that we could have showed you. 01:16:55.120 |
or would make it so that you have the best experience 01:17:02.780 |
So I think that through a set of things like that, 01:17:23.300 |
I don't know that everyone feels good about cute animals. 01:17:28.060 |
I can't imagine that people would feel really bad about it, 01:17:31.700 |
of a positive reaction to it as you just expressed. 01:17:35.360 |
And I don't know, maybe people who are more into fighting 01:17:42.420 |
assuming that they're within our community standards. 01:17:45.460 |
that we just don't want to be showing at all, 01:17:58.460 |
I think at various times in the company's history, 01:18:05.380 |
about saying this is good content, this is bad, 01:18:09.380 |
you should like this, this is unhealthy for you. 01:18:13.140 |
And I think that we want to look at the long-term effects. 01:18:17.420 |
You don't want to get stuck in a short-term loop 01:18:19.540 |
of like, okay, just 'cause you did this today 01:18:20.940 |
doesn't mean it's what you aspire for yourself over time. 01:18:24.060 |
But I think as long as you look at the long-term 01:18:28.120 |
of what people both say they want and what they do, 01:18:35.020 |
I just think feels like the right set of values 01:18:38.440 |
Now, of course, that doesn't go for everything. 01:18:41.000 |
There are things that are kind of truly off limits 01:18:47.340 |
or things that are really like inciting violence, 01:18:50.860 |
I mean, we have the whole community standards around this. 01:18:55.840 |
which I would hope that most people can agree, 01:19:06.220 |
except for the things that kind of get that sort of very, 01:19:20.400 |
of learning from the Meta team about safety protections 01:19:23.700 |
that are in place for kids who are using Meta platforms. 01:19:28.700 |
And frankly, I was like really positively surprised 01:19:39.880 |
so that it can stand the best chance of enriching, 01:19:42.780 |
not just remaining neutral, but enriching their mental health 01:19:47.060 |
One thing that came about in that conversation, however, 01:19:54.440 |
but do people really know that these tools exist? 01:19:56.800 |
And I think about my own experience with Instagram, 01:19:58.600 |
I love watching Adam Mosseri's Friday Q&As 01:20:12.520 |
I think every, he takes questions on Thursdays 01:20:16.120 |
So if I'm not aware of the tools without watching that, 01:20:22.780 |
how does meta look at the challenge of making sure 01:20:26.560 |
that people know that there are all these tools? 01:20:28.280 |
I mean, dozens and dozens of very useful tools, 01:20:30.640 |
but I think most of us just know the hashtag, 01:20:37.480 |
We now know that, you know, I also post to Threads. 01:20:40.200 |
I mean, so we know the major channels and tools, 01:20:46.320 |
that one doesn't realize can take you off-road, 01:20:55.240 |
Maybe this conversation could cue people to their existence. 01:21:00.960 |
I mean, I think most of the narrative around social media 01:21:06.680 |
that people have to control their experience. 01:21:11.200 |
is this just negative for teens or something? 01:21:14.740 |
And I think, again, a lot of this comes down to, 01:21:23.060 |
like are people using it to connect in positive ways? 01:21:54.460 |
I don't really understand much about what the service is. 01:22:03.020 |
to prevent people from harassing me or something. 01:22:09.260 |
to also show a bunch of these tools in context. 01:22:16.140 |
and, you know, if you go to, you know, delete a comment 01:22:24.700 |
It's like, hey, did you know that you can manage things 01:22:47.200 |
But I do think that through conversations like this 01:22:49.440 |
and others that, you know, we need to be doing, 01:22:58.300 |
So that way when those things pop up in the product, 01:23:05.460 |
- Like, I find the restrict function to be very useful 01:23:09.980 |
In most cases, I do sometimes have to block people, 01:23:17.880 |
you might recognize that someone has a tendency 01:23:20.900 |
And I should point out that I actually don't really mind 01:23:23.680 |
but I try and maintain what I call classroom rules 01:23:27.260 |
where I don't like people attacking other people 01:23:31.080 |
I'm not going to tolerate that in the comment section, 01:23:33.620 |
- Yeah, and I think that the example that you just, 01:23:42.100 |
which is that block is sort of this very powerful tool 01:23:48.540 |
and you just want them to disappear from the experience, 01:23:55.820 |
is that in order to make it so that the person 01:24:05.300 |
inherent to that is that they will have a sense 01:24:09.720 |
And that's why I think some stuff like restrict 01:24:16.140 |
People like using different tools for very subtle reasons. 01:24:20.300 |
I mean, maybe you want the content to not show up, 01:24:23.860 |
but you don't want the person who's posting the content 01:24:28.380 |
Maybe you don't want to get the messages in your main inbox, 01:24:31.700 |
that you're not friends or something like that. 01:24:36.620 |
I mean, you actually need to give people different tools 01:24:46.620 |
in order to really allow people to tailor the experience 01:24:51.840 |
- In terms of trying to limit total amount of time 01:25:02.840 |
I mean, I think it's going to depend on what one 01:25:04.940 |
is looking at, the age of the user, et cetera. 01:25:08.260 |
- But I know that you have tools that cue the user 01:25:12.100 |
to how long they've been on a given platform. 01:25:17.420 |
Like I'm thinking about like the Greek myth of the sirens 01:25:25.180 |
Is there a function aside from deleting the app temporarily 01:25:28.380 |
and then reinstalling it every time you want to use it again? 01:25:31.740 |
Is there a true lockout, self lockout function 01:25:34.600 |
where one can lock themselves out of access to the app? 01:25:43.360 |
and then there's the tools that the parents get to use 01:25:48.760 |
But yeah, I think that there's different kind of, 01:25:55.280 |
and then give people reminders and things like that. 01:26:04.880 |
of is there an amount of time which is too much? 01:26:07.520 |
Because it does really get to what you're doing. 01:26:16.980 |
in the future of the augmented reality glasses 01:26:22.160 |
a lot of this is gonna be you're interacting with people 01:26:29.280 |
as if you were kind of like hanging out with friends 01:26:36.480 |
and you can feel like you're present right there with them 01:26:45.760 |
Well, at the limit, if we can get that experience 01:26:57.840 |
then I don't see why you would wanna restrict the amount 01:27:01.480 |
that people use that technology to any less than 01:27:04.920 |
what would be the amount of time that you'd be comfortable 01:27:10.940 |
Which obviously is not gonna be 24 hours a day. 01:27:16.420 |
But I think it really gets to kind of how you're using 01:27:19.740 |
Whereas if what you're primarily using the services for 01:27:22.380 |
is to, you're getting stuck in loops reading news 01:27:26.820 |
or something that is really kind of getting you 01:27:29.020 |
into a negative mental state, then I don't know. 01:27:32.380 |
I mean, I think that there's probably a relatively short 01:27:34.320 |
period of time that maybe that's kind of a good thing 01:27:40.300 |
'Cause it's just 'cause news might make you unhappy 01:27:43.460 |
doesn't mean that the answer is to be unaware 01:27:45.460 |
of negative things that are happening in the world. 01:27:47.260 |
I just think that there's like different people 01:27:48.980 |
have different tolerances for what they can take on that. 01:27:52.140 |
And I think we, it's generally having some awareness 01:27:55.020 |
is probably good as long as it's not more than 01:27:57.480 |
you're kind of constitutionally able to take. 01:28:01.820 |
We try to not be too paternalistic about this as our approach. 01:28:05.240 |
But we want to empower people by giving them the tools, 01:28:08.700 |
both people and if you're a teen, your parents, 01:28:12.500 |
to have tools to understand what you're experiencing 01:28:14.980 |
and how you're using these things and then go from there. 01:28:22.720 |
I like this idea of not being too paternalistic. 01:28:24.740 |
I mean, that's, it seems like the right way to go. 01:28:26.900 |
I find myself occasionally having to make sure 01:28:29.540 |
that I'm not just passively scrolling, that I'm learning. 01:28:32.660 |
I like foraging for, organizing, and dispersing information. 01:28:32.660 |
I think comments are a great source of feedback. 01:28:45.540 |
And I'm not just saying that 'cause you're sitting here. 01:28:49.180 |
but other Meta platforms have been tremendously helpful 01:28:49.180 |
for me to get science and health information out. 01:28:55.500 |
One of the things that I'm really excited about, 01:28:58.340 |
which I only had the chance to try for the first time today, 01:29:04.940 |
And then we can talk about the glasses, the Ray-Bans. 01:29:16.540 |
And I have so many questions about this, so I'll resist. 01:29:20.820 |
- Okay, well, yeah, I have some experience with VR. 01:29:27.500 |
is one of the pioneering labs of VR and mixed reality. 01:29:30.520 |
I guess some used to call it augmented reality, 01:29:38.780 |
is how well it interfaces with the real room, 01:29:54.620 |
what felt like a real martial arts experience, 01:30:06.980 |
it really bypasses a lot of the early concerns 01:30:06.980 |
about how much VR one could or should use each day, even for the adult brain, 01:30:15.820 |
because it can really disrupt your vestibular system, 01:30:28.300 |
Like we didn't come out of it feeling dizzy at all. 01:30:41.300 |
Hold on, I'm playing this game just as it was 01:30:42.620 |
when I was a kid playing a Nintendo and someone walked in. 01:30:55.340 |
what is this, what do we even call this experience? 01:31:02.380 |
- Yeah, I mean, mixed reality is sort of the umbrella term 01:31:10.300 |
So augmented reality is what you're eventually going to get 01:31:13.800 |
with some future version of the smart glasses 01:31:16.500 |
where you're primarily seeing the world, right? 01:31:26.980 |
and there are going to be like as many holograms 01:31:26.980 |
the kind of art, physical games, media, your workstation. 01:31:38.260 |
we could just draw it up on the table right here 01:31:39.980 |
and just see it repeat as opposed to us turning 01:31:42.780 |
- Yeah, I mean, pretty much any screen that exists 01:31:44.520 |
could be a hologram in the future with smart glasses, right? 01:31:47.720 |
There's nothing that actually physically needs to be there 01:31:57.540 |
what of the things that are physical in the world 01:32:06.760 |
I mean, that doesn't need to physically be there. 01:32:13.760 |
the augmented reality experience that we're moving towards. 01:32:30.020 |
that's a hybrid in between the two and capable of both, 01:32:32.620 |
which is a headset that can do both virtual reality 01:32:35.060 |
and some of these augmented reality experiences. 01:32:41.100 |
Both because you're going to get new applications 01:32:43.900 |
that kind of allow people to collaborate together. 01:32:48.800 |
but someone joins us and it's their avatar there. 01:33:00.680 |
They're there virtually, but then you also have some AI personas 01:33:00.680 |
is it even worth leaving the house type date? 01:33:20.220 |
I think, you know, dating has physical aspects to it too. 01:33:28.040 |
they want to know whether or not it's worth the effort 01:33:29.860 |
to head out; they want to breach the divide, right? 01:33:29.860 |
who are dating basically say that in order to make sure 01:33:46.520 |
they'll schedule something that's like shorter 01:34:00.180 |
where you can feel like you're kind of sitting there, 01:34:02.800 |
and it's even easier and lighter weight and safer. 01:34:07.780 |
you can just like teleport out of there and beyond. 01:34:10.700 |
But yeah, I think that this will be an interesting question 01:34:14.700 |
in the future is there are clearly a lot of things 01:34:24.260 |
And then there are all these things that we're building up 01:34:28.060 |
but it's this weird artifact of kind of how this stuff 01:34:37.320 |
or you want to interact with the digital world. 01:34:39.920 |
but we pull out a small screen or we have a big screen. 01:34:45.580 |
But I think if we fast forward a decade or more, 01:34:50.260 |
it's I think one of the really interesting questions 01:34:54.320 |
about what is the world that we're going to live in? 01:34:57.280 |
I think it's going to increasingly be this mesh 01:35:03.520 |
that the world that we're in is just a lot richer 01:35:06.420 |
'cause there can be all these things that people create 01:35:08.200 |
that are just so much easier to do digitally than physically. 01:35:12.140 |
But B, you're going to have a real kind of physical sense 01:35:19.200 |
and not feel like interacting in the digital world 01:35:24.040 |
which today is just so much viscerally richer 01:35:27.660 |
I think the digital world will sort of be embedded in that 01:35:31.560 |
and will feel kind of just as vivid in a lot of ways. 01:35:38.000 |
you felt like you could look around and see the real room. 01:35:46.560 |
which historically I think people would have said 01:35:54.580 |
of the physical world with all the digital artifacts 01:35:58.200 |
that you can interact with them and feel present, 01:36:03.320 |
And I think it's possible to build a real world 01:36:06.280 |
that will actually be a more profound experience 01:36:10.400 |
- Well, I was struck by the smoothness of the interface 01:36:16.880 |
I guess it was an exercise class in the form of boxing. 01:36:16.880 |
It was essentially like hitting mitts boxing, 01:36:40.500 |
I mean, I can't think of anything more aversive 01:36:44.560 |
like I don't want to insult any particular products, 01:36:47.580 |
but like riding a stationary bike while looking at a screen 01:36:58.000 |
It's like, if you're going to be running on a treadmill, 01:37:01.640 |
so I can beat the people who are ahead of me. 01:37:04.880 |
and certainly an exercise class or aerobics class, 01:37:08.460 |
But the experience I tried today was extremely engaging. 01:37:14.640 |
to at least know how to do a little bit of it. 01:37:31.560 |
of the physical room and the virtual experience 01:37:35.320 |
that makes it neither of one world or the other. 01:37:38.920 |
I mean, I really felt at the interface of those, 01:37:42.280 |
this feeling of forgetting that I was in a virtual experience 01:37:46.960 |
We had to stop 'cause we were going to start recording, 01:37:48.560 |
but I would do that for a good 45 minutes in the morning. 01:37:51.200 |
And there's no amount of money you could pay me, truly, 01:38:01.480 |
It's going to get people moving their bodies more, 01:38:13.440 |
that both children and adults are engaged in. 01:38:28.280 |
- I think we want to enable it and I think it's good, 01:38:31.840 |
but I think it comes more from a philosophical view 01:38:45.640 |
I believe in empowering people to do what they want 01:38:49.960 |
and be the best version of themselves that they can be. 01:38:54.920 |
- That said, I do believe that the previous 01:38:54.920 |
generation of computers were devices for your mind. 01:38:59.120 |
You know, I think that there's sort of a philosophical view 01:39:13.500 |
what you think about or your values or something. 01:39:22.220 |
And I think building a computer for your whole body 01:39:32.400 |
with this worldview that like the actual essence of you, 01:39:35.840 |
if you want to be present with another person, 01:39:37.680 |
if you want to like be fully engaged in experience 01:39:40.600 |
is not just, okay, it's not just a video conference call 01:39:44.400 |
that looks at your face and where you can like share ideas. 01:39:48.000 |
It's something that you can engage your whole body. 01:40:02.040 |
It's a really important part of how I personally balance 01:40:04.800 |
my energy levels and just get a diversity of experiences 01:40:09.360 |
because I could spend all my time running the company. 01:40:12.080 |
But I think it's good for people to do some different things 01:40:16.380 |
and compete in different areas or learn different things. 01:40:37.920 |
I think just having a computing environment and platform 01:40:37.920 |
that captures more of the essence of what we are as people 01:40:42.320 |
- Yeah, I was even thinking just of the simple task 01:40:51.560 |
of getting a better range of motion, AKA flexibility. 01:40:59.720 |
you know, a standard kind of like a lunge type stretch, 01:41:03.440 |
are you approaching new levels of flexibility in that moment 01:41:06.680 |
where it's actually measuring some kinesthetic elements 01:41:12.920 |
whereas normally you might have to do that in front 01:41:15.460 |
of a camera, which then would give you the data on a screen 01:41:17.680 |
that you'd look at afterwards or hire an expensive coach, 01:41:20.180 |
But also looking at form in resistance training. 01:41:20.180 |
but it's telling you whether or not you're breaking form. 01:41:27.940 |
I mean, there's just so much that could be done 01:41:30.580 |
And then my mind just starts to spiral into like, wow, 01:41:32.780 |
this is very likely to transform what we think of 01:41:42.660 |
You know, I don't think most people are gonna necessarily 01:41:45.740 |
want to install, you know, a lot of sensors or cameras 01:41:50.580 |
So we're just over time getting better from the sensors 01:42:00.620 |
just with the hand tracking from the headset, you can type, 01:42:03.580 |
it just projects a little keyboard onto your table 01:42:05.980 |
and you can type and people like type like a hundred words 01:42:14.840 |
some modern AI techniques, be able to like simulate 01:42:18.440 |
and understand where your torso's position is. 01:42:27.540 |
with like the accelerometer and understanding 01:42:29.960 |
how the thing is moving, you can kind of understand 01:42:34.320 |
But some things are still gonna be hard, right? 01:42:38.320 |
So you mentioned boxing, that one works pretty well 01:42:45.160 |
we understand your hands, and now we're kind of 01:42:48.580 |
increasingly understanding your body position. 01:42:51.840 |
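(Editor's note: to make the idea of inferring body position from sparse sensors concrete, here is a toy heuristic that guesses a torso position and facing direction from just the headset and two hand positions. Real systems use learned models trained on full-body motion capture, as described above; everything here, including the blend weights and offsets, is an assumption for illustration only.)

```python
import numpy as np

def estimate_torso(head_pos, left_hand, right_hand):
    """Crude torso guess from three tracked points (headset + two hands).
    A heuristic sketch only; production tracking uses learned models."""
    head = np.asarray(head_pos, dtype=float)
    lh = np.asarray(left_hand, dtype=float)
    rh = np.asarray(right_hand, dtype=float)

    # Torso center: a bit below the head, pulled toward the hands' midpoint.
    hands_mid = (lh + rh) / 2.0
    torso = 0.7 * (head - np.array([0.0, 0.3, 0.0])) + 0.3 * hands_mid

    # Facing direction: perpendicular to the left-to-right hand axis,
    # flattened onto the ground plane (y is up).
    across = rh - lh
    facing = np.array([-across[2], 0.0, across[0]])
    norm = np.linalg.norm(facing)
    facing = facing / norm if norm > 1e-6 else np.array([0.0, 0.0, 1.0])
    return torso, facing

# Example: head at 1.7 m, hands raised in a boxing guard in front of the body.
torso, facing = estimate_torso([0, 1.7, 0], [-0.2, 1.4, 0.3], [0.2, 1.4, 0.3])
print(torso, facing)
```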
But let's say you wanna expand that to Muay Thai 01:42:56.780 |
So legs, that's a different part of tracking, that's harder 01:43:00.260 |
'cause that's out of the field of view more of the time. 01:43:03.320 |
But there's also the element of resistance, right? 01:43:05.240 |
So you can throw a punch and retract it and shadow box 01:43:07.980 |
and do that without upsetting your kind of physical balance 01:43:12.900 |
that much, but if you wanna throw a roundhouse kick 01:43:15.320 |
and there's no one there, then the standard way 01:43:22.400 |
But like, I don't know, is that gonna feel great? 01:43:41.440 |
maybe you're gonna have some kind of body suit 01:43:43.880 |
that can apply haptics, but I'm not even sure 01:43:49.340 |
is gonna be able to be quite good enough to do 01:43:53.360 |
that would be applied to you in a grappling scenario. 01:43:56.240 |
So this is part of what's fun about technology though, 01:44:04.000 |
So I think it's really neat that we can kind of do boxing 01:44:12.140 |
And then there's also still so much more to do 01:44:14.020 |
that I'm excited to kind of get to over time, 01:44:19.460 |
- And what about things like painting and art and music? 01:44:23.900 |
You know, I imagine, of course, like different mediums, 01:44:30.240 |
but I can imagine trying to learn how to paint virtually. 01:44:35.980 |
This doesn't have to depart from the physical world. 01:44:39.400 |
- Did you see the demo, the piano demo where you, 01:44:47.820 |
but the app basically highlights what keys you need to press 01:44:55.420 |
So it's basically like you're looking at your piano 01:44:58.460 |
and it's teaching you how to play a song that you choose. 01:45:03.820 |
But it's illuminating certain keys in the virtual space. 01:45:06.940 |
- And it could either be a virtual piano if you, 01:45:09.560 |
or keyboard if you don't have a piano or keyboard, 01:45:15.460 |
So yeah, I think stuff like that is going to be 01:45:19.700 |
really fascinating for education and expression. 01:45:24.880 |
but for broadening access to expensive equipment. 01:45:31.740 |
And it takes up a lot of space and it needs to be tuned. 01:45:39.140 |
could learn to play a virtual piano at much lower cost. 01:45:43.320 |
I was asking before about this thought experiment 01:45:45.700 |
of how many of the things that we physically have today 01:45:57.060 |
maybe it's a somewhat better, more tactile experience 01:46:02.500 |
but for people who don't have the space for it 01:46:07.740 |
or just aren't sure that they would want to make 01:46:19.020 |
And I think that's going to unlock a ton of creativity too, 01:46:24.800 |
because instead of the market for piano makers 01:46:29.340 |
being constrained to a relatively small set of experts 01:46:35.620 |
you're going to have kids or developers all around the world 01:46:40.360 |
designing crazy designs for potential keyboards and pianos 01:46:44.440 |
that look nothing like what we've seen before, 01:46:50.820 |
where you have fewer of these physical constraints. 01:46:53.540 |
I think it's going to be a lot of wild stuff to explore. 01:46:56.100 |
- There's definitely going to be a lot of wild stuff 01:47:05.220 |
with our earlier conversation when Priscilla was here. 01:47:19.940 |
which is both financially costly and time-wise costly. 01:47:35.400 |
that basically tried to do a controlled experiment 01:47:39.980 |
of people who learned how to do a specific surgery 01:47:42.940 |
through just the normal kind of textbook and lecture method 01:47:46.980 |
versus like you show the knee and you have it 01:47:51.700 |
be a large blown up model and people can manipulate it 01:47:54.620 |
and kind of practice where they would make the cuts 01:48:12.020 |
has been accused of creating a lot of real world, 01:48:14.940 |
let's call it physical world, social anxiety for people. 01:48:17.400 |
But I could imagine practicing a social interaction 01:48:22.940 |
or that needs to advocate for themselves better, 01:48:29.620 |
because it's in my very recent experience today, 01:48:38.420 |
or just talking to another human being or an adult 01:48:41.020 |
or being in a new circumstance of a room full of kids, 01:48:42.900 |
you could really experience that in silico first 01:48:46.540 |
and get comfortable, let the nervous system attenuate a bit 01:48:49.420 |
and then take it into the quote unquote physical world. 01:48:53.400 |
- Yeah, I think we'll see experiences like that. 01:48:55.680 |
I mean, I also think that some of the social dynamics 01:49:05.320 |
So I'm sure that there will be kind of new anxieties 01:49:13.160 |
need to navigate dynamics around texting constantly 01:49:24.540 |
that hopefully we can help people work through too. 01:49:53.760 |
And Meta responded, and no one around me could hear, 01:49:53.760 |
And by the way, I'm not getting paid to say any of this. 01:50:09.760 |
I could hear, "Okay, I'm selecting those now, 01:50:16.300 |
So this was neither headphones in nor headphones out. 01:50:29.160 |
It was all interfaced through this very local environment 01:50:33.640 |
And as a neuroscientist, I'm fascinated by this 01:50:37.800 |
all occurring inside the casing of this thing 01:50:42.080 |
But maybe you could comment on the origin of that design 01:50:49.880 |
because I'm sure I'm just scratching the surface. 01:50:52.840 |
- The real product that we want to eventually get to 01:50:56.280 |
is this kind of full augmented reality product 01:51:07.860 |
- 'Cause the VR headset does feel kind of like- 01:51:09.040 |
- It will, but there's gonna be a place for that too, 01:51:11.940 |
just like you have your laptop and you have your workstation 01:51:15.380 |
or maybe the better analogy is you have your phone 01:51:19.640 |
These AR glasses are gonna be like your phone 01:51:24.040 |
and you will, I think, be able to, if you want, 01:51:35.120 |
walking around the world wearing VR headsets. 01:51:45.960 |
for having, because it's a big form factor, 01:51:45.960 |
So just like your workstation or your bigger computer 01:51:53.800 |
can do more than your phone can do, there's a place for that. 01:51:57.160 |
When you want to settle into an intense task, 01:52:01.240 |
I would want them doing it through the headset, 01:52:10.800 |
I think the glasses will eventually get there too. 01:52:15.600 |
of really hard technology problems to address 01:52:22.000 |
where you can like put kind of full holograms in the world. 01:52:26.260 |
You're basically miniaturizing a supercomputer 01:52:31.320 |
so that the pair of glasses still looks stylish and normal. 01:52:53.640 |
There's like a whole kind of industrial process around that 01:53:03.320 |
Like there's a whole pipeline that's gotten very good 01:53:09.980 |
are just a completely different thing, right? 01:53:16.600 |
through a laser or some other kind of projector 01:53:19.040 |
and it can place that as an object in the world. 01:53:32.840 |
two different approaches towards building this at once. 01:53:40.000 |
what is the long-term thing, and it's not super far off. 01:53:40.000 |
And we have something that's working internally 01:53:52.320 |
that we'll use as a dev kit. 01:53:52.320 |
But that one, that's kind of a big challenge. 01:54:02.760 |
and it's harder to get all the pieces working. 01:54:08.760 |
all right, let's start with what we know we can put into 01:54:21.580 |
we did this collaboration with Ray-Ban, right? 01:54:29.200 |
They're classic, people have used them for decades. 01:54:32.280 |
For the first version, we got a sensor on the front 01:54:36.160 |
without having to take your phone out of your pocket. 01:54:47.360 |
But it was, that was sort of the first version of it. 01:54:52.480 |
but we saw how people used it and we tuned it. 01:55:06.000 |
but a lot of it is like people wanna take calls 01:55:11.280 |
But the biggest thing that I think is interesting 01:55:20.640 |
It also, it kind of proxies through your phone. 01:55:32.120 |
having the ability to have your Meta AI assistant 01:55:32.120 |
and basically ask any question throughout the day 01:55:47.840 |
eventually I think you're gonna want your AI assistant 01:55:52.400 |
to be able to see what you see and hear what you hear. 01:55:58.160 |
to go into a mode where it can see what you see 01:56:08.240 |
to be able to see what you see and hear what you hear 01:56:39.760 |
- I just, I'm chuckling to myself because I have a friend, 01:56:43.200 |
and he was laughing about how people go to a concert 01:56:46.440 |
and everyone's filming the concert on their phone 01:56:49.120 |
so that they can be the person that posts the thing. 01:56:50.820 |
But like there are literally millions of other people 01:56:55.560 |
it feels important to post our unique experience 01:56:58.540 |
with glasses that would essentially smooth that gap 01:57:04.520 |
- You can just worry about it later, download it. 01:57:08.340 |
because they are so seamless with everyday experience, 01:57:11.260 |
even though you and I aren't wearing them now, 01:57:21.860 |
- I'm assuming that the people with glasses aren't filming, 01:57:23.500 |
whereas right now, because there's a sharp transition 01:57:26.660 |
when there's a phone in the room and someone's pointing it, 01:57:30.580 |
people generally say no phones in the locker rooms 01:57:45.780 |
it's basically like pulsing a white bright light. 01:57:51.060 |
- So, which is, by the way, more than cameras do. 01:57:54.760 |
- Right, someone could be holding a phone on the side. 01:57:55.600 |
- Yeah, I mean, phones aren't kind of showing a light, 01:57:59.440 |
a bright sensor when you're taking a photo, so. 01:58:02.500 |
- No, people oftentimes will pretend they're texting 01:58:05.660 |
I actually saw an instance of this in a barbershop once 01:58:07.820 |
where someone was recording and they were pretending 01:58:11.220 |
And it was a pretty intense interaction that ensued. 01:58:20.780 |
- Yeah, so I think when you're evaluating a risk 01:58:36.180 |
the question is whether it makes it easier to do something bad than what people already had? 01:58:36.180 |
And I think because you have this privacy light 01:58:42.100 |
that is just broadcasting to everyone around you, 01:58:46.540 |
I think that that makes it actually less discreet 01:58:50.780 |
to do it through the glasses than what you could do 01:58:52.860 |
with a phone already, which I think is basically the bar 01:58:56.260 |
that we wanted to get over from a design perspective. 01:58:59.020 |
- Thank you for pointing out that it has the privacy light. 01:59:07.980 |
being able to look at a restaurant from the outside 01:59:11.240 |
and see the menu, get status on how crowded it is. 01:59:22.560 |
that allow you to navigate and the audio is okay. 01:59:25.420 |
It's nice to have a conversation with somebody on the phone 01:59:29.900 |
it'd be great if the road was traced where I should turn. 01:59:33.540 |
- These kinds of things seem like it's going to be 01:59:34.960 |
straightforward for Meta engineers to create. 01:59:34.960 |
we'll have it so it'll also have the holographic display 01:59:43.120 |
But I think that there will basically just be 01:59:50.020 |
The holographic display part I think is going to be 01:59:52.480 |
more expensive than doing one that just has the AI, 01:59:55.600 |
but is primarily communicating with you through audio. 02:00:00.040 |
So I mean the current Ray-Ban Meta glasses are $299. 02:00:00.040 |
I think when we have one that has a display in it, 02:00:08.160 |
it'll probably be some amount more than that, 02:00:11.160 |
So I think that people will choose what they want to use 02:00:14.560 |
based on what the capabilities are that they want 02:00:29.440 |
You know, our game as a company isn't to build things 02:00:35.960 |
We try to build things that then everyone can use 02:00:39.540 |
and then become more useful because a very large number 02:00:46.480 |
You know, we're not like Apple or some of these companies 02:00:50.560 |
that just try to make something and then sell it 02:00:56.940 |
So I mean, I think that that model kind of is fine too. 02:01:04.480 |
so that way everyone in the world can use it. 02:01:08.560 |
will also potentially solve a major problem in a real way, 02:01:13.680 |
For both children and adults, it's very clear 02:01:16.320 |
that viewing objects in particular screens up close 02:01:21.480 |
Literally a change in the length of the eyeball 02:01:31.500 |
that kids who spend, and adults who spend two hours a day 02:01:40.560 |
And it has something to do with exposure to sunlight, 02:01:47.720 |
And with the glasses, I realize one could actually 02:01:53.640 |
It could measure and tell you how much time you've spent 02:01:58.900 |
I mean, this is just another example that leaps to mind, 02:02:12.660 |
seeing what the eyes see, has just gotta be the best way to go. 02:02:12.660 |
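(Editor's note: as an illustration of the bookkeeping such glasses could do for the myopia question raised above: given an estimated distance to whatever is in the center of view plus an ambient-light reading, you could bucket daily viewing time into the categories myopia research cares about. The thresholds, function names, and sample values below are hypothetical, not a real product feature.)

```python
from collections import defaultdict

NEAR_M = 0.5          # assumed: closer than ~50 cm counts as near work
OUTDOOR_LUX = 10_000  # assumed: ambient light this bright suggests daylight

def classify(depth_m: float, ambient_lux: float) -> str:
    """Label one viewing sample as outdoor, near work, or far indoor."""
    if ambient_lux >= OUTDOOR_LUX:
        return "outdoor"
    return "near_work" if depth_m < NEAR_M else "far_indoor"

def summarize(samples):
    """samples: iterable of (depth_m, ambient_lux, seconds) readings."""
    totals = defaultdict(float)
    for depth, lux, secs in samples:
        totals[classify(depth, lux)] += secs
    return dict(totals)

# One hypothetical afternoon of (distance, light, duration) samples.
day = [(0.35, 300, 3600), (2.0, 250, 1800), (5.0, 20000, 2700)]
print(summarize(day))
# {'near_work': 3600.0, 'far_indoor': 1800.0, 'outdoor': 2700.0}
```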
- Yeah, I think, well, multimodal, right, I think is, 02:02:33.160 |
- Well, I mean, I think what we're describing here 02:02:35.760 |
is essentially taking the phone, the computer, 02:02:38.800 |
and bringing it all to the level of the eyes. 02:02:43.840 |
And one would like more kinesthetic information, 02:02:50.140 |
But all of that, if it can be figured out on the phone, 02:02:50.140 |
it can be figured out by the glasses. 02:02:52.900 |
such as what are you focusing on in your world? 02:03:16.580 |
was in the conversation at hand versus something else? 02:03:24.920 |
And I think that information is not accessible 02:03:27.000 |
with a phone in your pocket or in front of you. 02:03:30.920 |
but not nearly as rich and complete information 02:03:33.960 |
as one gets when you're really pulling the data 02:03:35.920 |
from the level of vision and what kids and adults 02:03:44.940 |
You get autonomic information, size of the pupils. 02:03:47.500 |
So you get information about internal states. 02:03:53.120 |
So the sensor on the Ray-Ban Meta glasses is external. 02:03:53.120 |
Right, so it basically allows you to see what you see. 02:03:57.840 |
Then, sorry, it allows the AI assistant to see what you're seeing. 02:04:00.460 |
There's a separate set of things which are eye tracking, 02:04:13.840 |
So if you want to just like look at something 02:04:16.300 |
and select it by looking at it with your eyes, 02:04:19.700 |
rather than having to kind of drag a controller over 02:04:27.400 |
So that's a pretty profound and cool experience too. 02:04:27.400 |
So that way you're not kind of wasting compute power rendering detail where you're not looking. 02:04:35.580 |
There are interesting design and technology trade-offs 02:04:47.300 |
where if you want the external sensor, that's one thing. 02:04:59.340 |
Each one of these consumes compute, which consumes battery. 02:05:05.580 |
So it's like, where are the eye tracking sensors gonna be? 02:05:08.980 |
that the rim of the glasses is actually quite thin 02:05:17.780 |
before they look more like goggles than glasses. 02:05:20.340 |
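(Editor's note: the compute saving alluded to above is commonly called foveated rendering: render at full detail only near the gaze point and progressively coarser in the periphery. Below is a toy version of the level schedule with made-up eccentricity thresholds; real headsets implement this via the GPU's variable-rate shading, not per-pixel code like this.)

```python
import math

def shading_rate(gaze_deg: tuple, pixel_deg: tuple) -> int:
    """Return a subsampling factor for a pixel based on its angular
    distance from the gaze point (toy foveated-rendering schedule)."""
    dx = pixel_deg[0] - gaze_deg[0]
    dy = pixel_deg[1] - gaze_deg[1]
    ecc = math.hypot(dx, dy)  # eccentricity in visual degrees
    if ecc < 5:
        return 1   # fovea: full resolution, shade every pixel
    if ecc < 15:
        return 2   # near periphery: shade every 2nd pixel on each axis
    return 4       # far periphery: shade every 4th pixel on each axis

# A pixel 20 degrees from where the eye points gets 1/16 the shading work.
print(shading_rate(gaze_deg=(0, 0), pixel_deg=(12, 16)))  # -> 4
```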
So there's this whole space. 02:05:20.340 |
Maybe they want something that's more powerful, 02:05:41.320 |
but kind of has AI in it and you can capture moments 02:05:47.180 |
In the latest version, we got the ability to live stream. 02:05:47.180 |
that now you can be kind of going back to your concert case 02:06:07.220 |
to your kind of family group so people can see it. 02:06:16.500 |
that you basically have a normal looking pair of glasses 02:06:28.180 |
And I don't know, I think people are gonna like this version 02:06:39.220 |
and for all the reasons that you just mentioned. 02:06:44.460 |
around AI interfaces and maybe even avatars of people 02:06:56.700 |
and you on the internet where people, for instance, 02:07:01.600 |
to all those questions, but with things like ChatGPT, 02:07:04.620 |
people are trying to generate answers to those questions 02:07:07.660 |
Will I have the opportunity to soon have an AI version 02:07:12.140 |
about like what I recommend for sleep and circadian rhythm, 02:07:19.220 |
that will be accurate so they could just ask my avatar? 02:07:22.620 |
- Yeah, this is something that I think a lot of creators 02:07:31.620 |
And I think we'll probably have a version of next year, 02:07:35.100 |
but there's a bunch of constraints that I think we need 02:07:39.820 |
So for one, I think it's really important that, 02:07:43.360 |
it's not that there's a bunch of versions of you, 02:07:50.620 |
it should be something that you control, right? 02:07:53.180 |
I think there are some platforms that are out there today 02:07:54.960 |
that just let people like make, I don't know, 02:08:03.060 |
I mean, we've had platform policies for decades, 02:08:03.060 |
since the beginning of the company at this point, 02:08:18.980 |
Real identity is like one of the core aspects 02:08:23.680 |
is like you wanna kind of authentically be yourself. 02:08:25.880 |
So yeah, I think if you're almost any creator, 02:08:34.940 |
and there's just gonna be more demand to interact with you 02:08:40.820 |
So they're both people out there who would benefit 02:08:43.740 |
from being able to talk to an AI version of you. 02:08:46.720 |
And I think you and other creators would benefit 02:08:49.620 |
from being able to keep your community engaged 02:08:51.540 |
and service that demand that people have to engage with you. 02:08:54.620 |
But you're gonna wanna know that that AI kind of version 02:09:04.260 |
And there are a lot of things that are awesome 02:09:20.140 |
I don't think it needs to be 100% perfect all the time, 02:09:22.820 |
but you need to have very good confidence, I would say, 02:09:25.100 |
that it's gonna represent you the way that you'd want 02:09:38.140 |
which is creating new characters or AI personas. 02:09:51.500 |
and they can help you kind of come up with things 02:09:55.500 |
that you could cook and can help you cook them. 02:10:00.700 |
that are interested in different types of fitness 02:10:02.640 |
that can help you kind of plan out your workouts 02:10:05.620 |
or help with recovery or different things like that. 02:10:16.700 |
that can help you make travel plans or give you ideas. 02:10:19.460 |
But the key thing about all these is they're not, 02:10:24.860 |
So they don't have to have kind of 100% fidelity 02:10:28.980 |
to like making sure that they never say something 02:10:34.580 |
would never say because they're just made up characters. 02:10:37.020 |
So I think that that's a somewhat easier problem. 02:10:37.020 |
because we thought that would make it more fun. 02:10:50.820 |
So there's like Snoop Dogg is the dungeon master. 02:10:56.780 |
And it's just like, I do this with my daughter 02:11:01.100 |
And she just like loves like storytelling, right? 02:11:05.020 |
And it's like Snoop Dogg is the dungeon master. 02:11:08.180 |
We'll come up with like, here's what's happening next. 02:11:09.900 |
And she's like, okay, like turn into a mermaid. 02:11:14.000 |
and I like go and find the treasure chest and unlock it. 02:11:17.460 |
And it's like, and then Snoop Dogg just always 02:11:27.180 |
He's playing the dungeon master, which makes it more fun. 02:11:32.020 |
is you have like, you can kind of build versions 02:11:35.420 |
of these characters that people can interact with 02:11:41.540 |
is to the place where any creator or any small business 02:11:54.020 |
if you're a business and basically just help you grow 02:12:00.940 |
So I don't know, I think that's gonna be cool, 02:12:02.480 |
but I think this is, it's a long-term project. 02:12:04.620 |
I think we'll have more progress on it to report 02:12:10.820 |
- I'm super excited about it because of, you know, 02:12:15.900 |
I mean, I think people are now coming around to that, 02:12:21.800 |
and that there are a lot of life-enhancing spaces 02:12:24.060 |
that it's gonna show up and really, really improve 02:12:26.580 |
the way that we engage socially, what we learn, 02:12:33.120 |
don't have to suffer and in fact can be enhanced 02:12:34.900 |
by the sorts of technologies we've been talking about. 02:12:43.500 |
you've given me today to sort through all these things, 02:12:46.660 |
and to talk with you and Priscilla and to hear 02:12:48.900 |
what's happening and where things are headed. 02:12:53.900 |
I share in your optimism and it's been only strengthened 02:12:58.980 |
So thank you so much and keep doing what you're doing. 02:13:02.420 |
And on behalf of myself and everyone listening, thank you, 02:13:11.740 |
and it's clear that there's a ton of intention and care 02:13:15.180 |
and thought about what could be in the positive sense, 02:13:26.260 |
- Thank you for joining me for today's discussion 02:13:30.760 |
If you're learning from and/or enjoying this podcast, 02:13:35.000 |
That's a terrific zero-cost way to support us. 02:13:39.500 |
on both Spotify and Apple, and on both Spotify and Apple, 02:13:46.700 |
at the beginning and throughout today's episode. 02:13:51.420 |
If you have questions for me or comments about the podcast 02:13:53.980 |
or guests that you'd like me to consider hosting 02:13:57.220 |
please put those in the comment section on YouTube. 02:14:02.300 |
but on many previous episodes of the Huberman Lab Podcast, 02:14:06.320 |
While supplements aren't necessary for everybody, 02:14:08.500 |
many people derive tremendous benefit from them 02:14:10.820 |
for things like enhancing sleep, hormone support, 02:14:14.900 |
If you'd like to learn more about the supplements discussed 02:14:24.020 |
If you're not already following me on social media, 02:14:25.900 |
it's Huberman Lab on all social media platforms. 02:14:31.720 |
Threads, Facebook, LinkedIn, and on all those places, 02:14:39.300 |
but much of which is distinct from the content 02:14:42.380 |
So again, it's Huberman Lab on all social media platforms. 02:14:54.220 |
as well as toolkits in the form of brief PDFs. 02:14:56.860 |
We've had toolkits related to optimizing sleep, 02:15:00.120 |
to regulating dopamine, deliberate cold exposure, fitness, 02:15:03.800 |
mental health, learning, and neuroplasticity, and much more. 02:15:10.860 |
go over to the Menu tab, scroll down to Newsletter, 02:15:18.180 |
Thank you once again for joining me for today's discussion