Mark Zuckerberg: Meta, Facebook, Instagram, and the Metaverse | Lex Fridman Podcast #267
Chapters
0:00 Introduction
5:36 Metaverse
25:36 Identity in Metaverse
37:45 Security
42:10 Social Dilemma
64:16 Instagram whistleblower
69:01 Social media and mental health
74:26 Censorship
91:35 Translation
99:10 Advice for young people
104:58 Daughters
107:46 Mortality
112:19 Question for God
115:25 Meaning of life
unless you believe that people expressing themselves 00:00:08.620 |
does it weigh heavy on you that people get bullied 00:00:15.440 |
and I don't want to build products that make people angry. 00:00:19.320 |
- Why do you think so many people dislike you? 00:00:25.560 |
And how do you regain their trust and support? 00:00:30.160 |
- The following is a conversation with Mark Zuckerberg, 00:00:38.680 |
about this conversation with Mark Zuckerberg, 00:00:42.680 |
and about what troubles me in the world today, 00:00:57.760 |
and reveals the full complexity of our world, 00:01:01.040 |
shining a light on the dark aspects of human nature, 00:01:06.640 |
through compassionate but tense chaos of conversation 00:01:21.420 |
I think about the hundreds of millions of people 00:01:24.440 |
who are starving, and who live in extreme poverty. 00:01:28.440 |
The one million people who take their own life every year, 00:01:33.880 |
and the many, many more millions who suffer quietly, 00:01:44.600 |
Today, my heart goes out to the people of Ukraine. 00:01:48.520 |
My grandfather spilled his blood on this land, 00:01:52.200 |
held the line as a machine gunner against the Nazi invasion, 00:02:19.960 |
As I've said in the past, I don't care about access, 00:02:22.640 |
fame, money, or power, and I'm afraid of nothing. 00:02:26.720 |
But I am who I am, and my goal in conversation 00:02:33.420 |
no matter who they are, no matter their position. 00:02:36.520 |
And I do believe the line between good and evil 00:02:47.360 |
It is full of hate, violence, and destruction. 00:02:55.280 |
and the insatiable desire to help each other. 00:03:01.280 |
that show this world, that show us to ourselves, 00:03:11.720 |
we turn to social networks to share real human insights 00:03:14.400 |
and experiences, to organize protests and celebrations, 00:03:20.920 |
of the world, of our history, and of our future, 00:03:24.240 |
and above all, to be reminded of our common humanity. 00:03:30.460 |
they have the power to cause immense suffering. 00:03:44.040 |
This podcast conversation attempts to understand 00:03:46.400 |
the man and the company who take this responsibility on, 00:03:50.640 |
where they fail, and where they hope to succeed. 00:03:53.220 |
Mark Zuckerberg's feet are often held to the fire, 00:03:57.880 |
as they should be, and this actually gives me hope. 00:04:07.680 |
I believe can solve any problem in the world. 00:04:12.640 |
Both are necessary, the engineer and the critic. 00:04:32.160 |
and worse, suffocates the dreams of young minds 00:04:35.400 |
who want to build solutions to the problems of the world. 00:04:49.480 |
that don't just ride the viral wave of cynicism, 00:04:53.200 |
but seek to understand the failures and successes 00:05:05.800 |
and I count on the critic to point it out when I do. 00:05:15.200 |
especially in those who dream to build solutions, 00:05:31.620 |
And now, dear friends, here's Mark Zuckerberg. 00:05:39.160 |
Can you circle all the traffic lights, please? 00:05:56.800 |
Okay, now we can initiate the interview procedure. 00:06:00.000 |
Is it possible that this conversation is happening 00:06:15.200 |
and not the person who created that meta company. 00:06:26.400 |
who can do the Mark and the Lex conversation replay 00:06:34.800 |
before we have photorealistic avatars like this. 00:06:43.520 |
of the avatar representing who you are in the metaverse. 00:06:52.280 |
because there's a magic to the in-person conversation. 00:06:58.120 |
you can have the same kind of magic in the metaverse, 00:07:02.600 |
whatever the heck is there when we're talking in person? 00:07:09.280 |
- Well, I think this is like the key question, right? 00:07:12.920 |
Because the thing that's different about virtual 00:07:19.120 |
compared to all other forms of digital platforms before 00:07:29.680 |
And that's just different from all of the other screens 00:07:35.760 |
It's, you know, they're trying to, in some cases, 00:07:43.000 |
but at no point do you actually feel like you're in it, 00:07:46.960 |
At some level, your content is trying to sort of 00:07:51.360 |
that's happening, but all of the kind of subtle signals 00:07:54.160 |
are telling you, no, you're looking at a screen. 00:07:56.440 |
So the question about how you develop these systems 00:07:59.480 |
is like, what are all of the things that make 00:08:04.920 |
So I think on visual presence and spatial audio, 00:08:16.960 |
I don't know if you've tried this experience, 00:08:19.600 |
workrooms that we launched where you have meetings. 00:08:21.960 |
And, you know, I basically made a rule for, you know, 00:08:24.800 |
all of the top, you know, management folks at the company 00:08:31.720 |
I feel like we got to dog food this, you know, 00:08:33.600 |
this is how people are going to work in the future. 00:08:37.860 |
And there were already a lot of things that I think 00:08:41.120 |
feel significantly better than like typical Zoom meetings, 00:08:44.720 |
even though the avatars are a lot lower fidelity, 00:08:47.440 |
you know, the idea that you have spatial audio, 00:08:56.840 |
You can see, you know, the arm gestures and stuff 00:09:03.040 |
which is something that you can't really do in Zoom. 00:09:09.680 |
and if you're actually sitting around a table with people, 00:09:14.600 |
to the person next to you and like have a conversation 00:09:16.720 |
that you can't, you know, that you can't really do with 00:09:32.560 |
even when the visual fidelity isn't quite there, 00:09:35.000 |
but I think it'll get there over the next few years. 00:09:37.160 |
Now, I mean, you were asking about comparing that 00:09:38.720 |
to the true physical world, not Zoom or something like that. 00:09:42.720 |
And there, I mean, I think you have feelings of like 00:09:46.960 |
temperature, you know, olfactory, obviously touch, right? 00:09:54.480 |
the sense that you wanna be able to, you know, 00:09:56.400 |
put your hands down and feel some pressure from the table. 00:10:00.640 |
are gonna be really critical to be able to keep up 00:10:08.720 |
But I don't know, I think we're gonna have a lot 00:10:15.840 |
how much you're just gonna be able to build with software 00:10:27.200 |
- Yeah, so I mean, look, I mean, that's on the shorter end 00:10:32.520 |
But it's, but, you know, one of the things that we found 00:10:39.480 |
So the earliest VR, you just have the headset 00:10:41.960 |
and then, and that was cool, you could look around, 00:10:45.360 |
but you don't feel like you're really able to interact 00:10:48.520 |
And then there was this big question where once you got 00:10:50.440 |
hands, what's the right way to represent them? 00:10:53.400 |
And initially, all of our assumptions were, okay, 00:10:58.400 |
when I look down and see my hands in the physical world, 00:11:00.360 |
I see an arm and it's gonna be super weird if you see, 00:11:11.480 |
And if the elbow angle that we're kind of interpolating 00:11:14.720 |
based on where your hand is and where your headset is, 00:11:19.840 |
it creates this very uncomfortable feeling where it's like, 00:11:25.840 |
And that actually broke the feeling of presence a lot more. 00:11:29.440 |
Whereas it turns out that if you just show the hands 00:11:31.840 |
and you don't show the arms, it actually is fine for people. 00:11:36.120 |
So I think that there's a bunch of these interesting 00:11:39.200 |
psychological cues where it'll be more about getting 00:11:44.880 |
And I think a lot of that will be possible even over, 00:11:47.800 |
you know, a few year period or a five year period, 00:11:49.760 |
and we won't need like every single thing to be solved 00:11:54.600 |
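(An illustrative aside on the arm problem described above: the "interpolated elbow" is a two-bone inverse-kinematics guess made from only a headset pose and a hand position. The sketch below is a toy version of that guess, not Meta's tracking code; the segment lengths, the shoulder offset, and the downward "bend" hint are all assumptions for demonstration, and the error they introduce is exactly the kind of mismatch that breaks presence.)

```python
# Toy two-bone inverse-kinematics guess: place an elbow given only a headset
# position and a hand position. Illustrative only -- the fixed shoulder offset,
# segment lengths, and downward "bend" hint are assumptions.
import numpy as np

UPPER_ARM = 0.30  # assumed segment lengths in meters
FOREARM = 0.28

def estimate_elbow(head_pos, hand_pos, right_hand=True):
    side = 1.0 if right_hand else -1.0
    shoulder = head_pos + np.array([side * 0.18, -0.25, 0.0])  # guessed shoulder

    to_hand = hand_pos - shoulder
    dist = np.linalg.norm(to_hand) + 1e-9
    direction = to_hand / dist
    d = min(dist, UPPER_ARM + FOREARM)  # clamp unreachable poses

    # Law of cosines: where the elbow projects onto the shoulder->hand line,
    # and how far it sits off that line.
    a = (UPPER_ARM**2 - FOREARM**2 + d**2) / (2 * d)
    h = np.sqrt(max(UPPER_ARM**2 - a**2, 0.0))

    # Bend the elbow "downward" as a crude hint for which way the arm folds.
    down = np.array([0.0, -1.0, 0.0])
    bend = down - np.dot(down, direction) * direction
    bend /= np.linalg.norm(bend) + 1e-9

    return shoulder + a * direction + h * bend

head = np.array([0.0, 1.7, 0.0])
hand = np.array([0.3, 1.2, 0.4])
print(estimate_elbow(head, hand))
```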
- Yeah, it's a fascinating psychology question of 00:11:56.720 |
what is the essence that makes in-person conversation 00:12:04.360 |
It's like emojis are able to convey emotion really well, 00:12:08.000 |
even though they're obviously not photorealistic. 00:12:10.560 |
And so in that same way, just like you're saying, 00:12:12.440 |
just showing the hands is able to create a comfortable 00:12:19.280 |
You know, people in the world wars used to write letters 00:12:21.920 |
and you can fall in love with just writing letters. 00:12:26.640 |
You can convey emotion, you can convey depth of experience 00:12:32.720 |
So that's, I think, a fascinating place to explore 00:12:36.240 |
psychology of like, how do you find that intimacy? 00:12:39.240 |
- Yeah, and you know, the way that I come to all of this 00:12:41.480 |
stuff is, you know, I basically studied psychology 00:12:45.040 |
So all of the work that I do is sort of at the intersection 00:12:50.080 |
I think most of the other big tech companies are building 00:12:55.040 |
What I care about is building technology to help people 00:12:58.080 |
So I think it's a somewhat different approach than most of 00:13:00.360 |
the other tech entrepreneurs and big companies 00:13:04.320 |
And a lot of the lessons in terms of how I think about 00:13:09.960 |
designing products come from some just basic elements 00:13:15.920 |
In terms of, you know, our brains, you can compare to the 00:13:19.920 |
brains of other animals, you know, we're very wired to 00:13:28.120 |
So compared to other animals, I mean, that's clearly 00:13:32.920 |
But there's a whole part of your brain that's just kind of 00:13:38.520 |
So, you know, when we're designing the next version of Quest 00:13:42.120 |
or the VR headset, a big focus for us is face tracking 00:13:45.760 |
and basically eye tracking so you can make eye contact, 00:13:48.800 |
which again, isn't really something that you can do 00:13:51.440 |
It's sort of amazing how much, how far video conferencing 00:13:55.440 |
has gotten without the ability to make eye contact, right? 00:13:58.600 |
It's sort of a bizarre thing if you think about it, 00:14:00.480 |
you're like looking at someone's face, you know, 00:14:03.080 |
sometimes for an hour when you're in a meeting and like, 00:14:06.640 |
you looking at their eyes to them doesn't look like 00:14:15.680 |
You're not sending that signal. - Well, you're trying to. 00:14:17.320 |
- Right, you're trying to. - Like a lot of times, 00:14:19.760 |
I'm trying to look into the other person's eyes. 00:14:21.440 |
- But they don't feel like you're looking to their eyes. 00:14:24.120 |
am I supposed to look at the camera so that way you can, 00:14:26.080 |
you know, have a sensation that I'm looking at you? 00:14:30.080 |
And then, you know, with VR today, even without eye tracking 00:14:35.080 |
and knowing what your eyes are actually looking at, 00:14:39.400 |
So you can look at like where the head pose is, 00:14:39.400 |
in your general direction, then you can sort of assume 00:14:46.520 |
that maybe there's some eye contact intended, 00:14:50.880 |
Maybe not, it's like a, maybe it's not a, you know, 00:15:00.200 |
And I think that that's really important stuff. 00:15:02.120 |
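(A toy sketch of the head-pose heuristic just described: without eye tracking, treat "the head is pointed roughly at the other avatar" as intended eye contact. The 15-degree threshold and every name here are illustrative assumptions, not Meta's implementation.)

```python
# Infer "intended eye contact" from head pose alone: if my head's forward
# vector points within a small angle of the other avatar, assume eye contact.
import numpy as np

def looks_at(my_head_pos, my_head_forward, other_head_pos, threshold_deg=15.0):
    to_other = other_head_pos - my_head_pos
    to_other = to_other / (np.linalg.norm(to_other) + 1e-9)
    forward = my_head_forward / (np.linalg.norm(my_head_forward) + 1e-9)
    angle = np.degrees(np.arccos(np.clip(np.dot(forward, to_other), -1.0, 1.0)))
    return angle < threshold_deg

# Other avatar is nearly dead ahead, so this prints True.
print(looks_at(np.array([0.0, 1.7, 0.0]), np.array([0.0, 0.0, 1.0]),
               np.array([0.2, 1.7, 2.0])))
```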
So when I think about Meta's contribution to this field, 00:15:06.640 |
that any of the other companies that are focused 00:15:09.520 |
on the Metaverse or on virtual and augmented reality 00:15:13.320 |
are gonna prioritize putting these features in the hardware 00:15:15.840 |
because, like everything, there are trade-offs, right? 00:15:22.760 |
You could totally see another company taking the approach 00:15:24.840 |
of let's just make the lightest and thinnest thing possible. 00:15:34.320 |
'Cause so much of human emotion and expression 00:15:39.520 |
If I like move my eyebrow, you know, a millimeter, 00:15:41.800 |
you will notice and that like means something. 00:15:46.840 |
and a lot of communication I think is lost. 00:15:49.640 |
And so it's not like, okay, there's one feature 00:15:55.120 |
You can sort of look at how the human brain works 00:15:57.840 |
and how we express and kind of read emotions. 00:16:01.840 |
And you can just build a roadmap of that, you know, 00:16:06.440 |
to try to unlock over a five to 10 year period 00:16:19.360 |
where there's a lot of ways to ask this question, 00:16:31.320 |
And actually it's interesting to think about the fact 00:16:33.400 |
that a lot of people are having the most important moments 00:16:50.800 |
will provide those experiences for a large number, 00:16:57.240 |
There was someone, you know, I read this piece 00:17:06.040 |
but one definition of this is it's about a time 00:17:12.880 |
become the primary way that we live our lives 00:17:21.840 |
I think you also just want to look at this as a continuation 00:17:25.520 |
because it's not like, okay, we are building digital worlds, 00:17:29.760 |
I think, you know, you and I probably already live 00:17:32.240 |
a very large part of our life in digital worlds. 00:17:34.560 |
They're just not 3D immersive virtual reality, 00:17:37.160 |
but, you know, I do a lot of meetings over video 00:17:45.960 |
for kind of the immersive presence version of this, 00:17:50.960 |
And for that, I think that there's just a bunch 00:17:54.720 |
And I think when you're building virtual worlds, 00:18:02.880 |
a lot of it is just you're managing this duality 00:18:06.960 |
you want to build these elegant things that can scale 00:18:10.080 |
and, you know, have billions of people use them 00:18:14.480 |
you're fighting this kind of ground game where it's just, 00:18:22.200 |
So the first ones that we basically went after 00:18:25.920 |
were gaming with Quest and social experiences. 00:18:31.200 |
it goes back to when we started working on virtual reality. 00:18:39.400 |
but if you look at all computing platforms up to that point, 00:19:03.880 |
there were a lot of different games out there. 00:19:15.080 |
there tended to be a smaller number of those. 00:19:17.280 |
And that ended up being just as important of a thing 00:19:30.920 |
- Yeah, I think that there's a workrooms aspect of this, 00:19:35.360 |
And then I think that there's like a, you know, 00:19:40.360 |
You're like, you're working or coding or what, 00:19:49.440 |
You know, gaming, I think we're well on our way. 00:19:51.280 |
Social, I think we're just the kind of preeminent company 00:19:57.040 |
And I think that that's already on quest becoming the, 00:20:00.280 |
you know, if you look at the list of what are the top apps, 00:20:09.200 |
But I don't know, I would imagine for someone like you, 00:20:31.240 |
because pretty soon you're just gonna be able 00:20:32.640 |
to have a screen that's bigger than, you know, 00:20:34.200 |
it'll be your ideal setup and you can bring it with you 00:20:39.760 |
So I think that there are a few things to work out on that, 00:20:56.440 |
we were talking before about how you're into running 00:21:02.600 |
you know, different things in different places. 00:21:08.360 |
- Yeah, and surfing and I used to fence competitively, 00:21:42.640 |
- This is like, you know, you've seen "Rocky IV" 00:21:51.360 |
- The idea of me as Rocky and like fighting is-- 00:22:05.880 |
you know, I don't know if you've tried Supernatural 00:22:05.880 |
- So first of all, can I just comment on the fact 00:22:15.000 |
I just, I get giddy every time I step into virtual reality. 00:22:18.760 |
So you mentioned productivity and all those kinds of things. 00:22:20.880 |
That's definitely something I'm excited about, 00:22:30.440 |
but it just feels like the most convenient way 00:22:37.480 |
into worlds that are similar to the real world 00:22:55.000 |
In terms of the productivity as a programmer, 00:23:04.320 |
You have to develop, like there has to be a threshold 00:23:07.360 |
where a large amount of the programming community 00:23:10.520 |
But the collaborative aspects that are possible 00:23:29.640 |
and those are gonna be the amazing experiences. 00:23:33.360 |
Whether it's a real place or something that people made. 00:23:44.760 |
when I'm coding stuff, it's like, all right, you code it, 00:23:46.480 |
and then you build it, and then you see it afterwards. 00:23:50.440 |
you're in a world and you're building the world 00:23:52.720 |
as you are in it and kind of manipulating it. 00:23:55.720 |
One of the things that we showed at our Inside the Lab 00:24:02.440 |
is this BuilderBot program, where now you are, 00:24:08.520 |
"put some trees over there and it'll do that." 00:24:10.680 |
And like, "All right, put some bottles of water 00:24:17.800 |
And I think there are gonna be new paradigms for coding. 00:24:24.600 |
especially the first few times that you do them, 00:24:34.240 |
is not doing things that are amazing for the first time. 00:24:39.600 |
I mean, just answering your question from before around, 00:24:45.040 |
Well, first of all, let me just say this as an aside, 00:24:57.200 |
But I think you will spend most of your computing time 00:25:10.520 |
is a 5% improvement on your coding productivity. 00:25:18.400 |
But I mean, look, if I could increase the productivity 00:25:27.640 |
And I imagine a lot of other companies would too. 00:25:30.360 |
And that's how you start getting to the scale 00:25:34.480 |
some of the bigger computing platforms that exist today. 00:25:59.480 |
- Yeah, I mean, I think that there's gonna be a range. 00:26:03.200 |
So we're working on, for expression and avatars, 00:26:07.640 |
on one end of the spectrum are kind of expressive 00:26:20.800 |
there are gonna be different use cases for different things. 00:26:25.120 |
So if you're going from photorealistic to expressive, 00:26:48.960 |
I think getting to be more photorealistic as a soldier 00:26:56.760 |
There are times when I'm hanging out with friends 00:27:02.080 |
So a kind of cartoonish or expressive version of me is good. 00:27:11.600 |
where a lot of the experience is kind of dressing up 00:27:21.320 |
and it's almost like you have like a built-in icebreaker 00:27:24.560 |
because like you see people and you're just like, 00:27:27.320 |
all right, I'm cracking up at what you're wearing 00:27:34.360 |
Whereas, okay, if you're going into a work meeting, 00:27:38.920 |
maybe a photorealistic version of your real self 00:27:41.800 |
is gonna be the most appropriate thing for that. 00:27:43.600 |
So I think the reality is there aren't going to be, 00:27:49.040 |
My own sense of kind of how you wanna express identity 00:27:55.280 |
online has sort of evolved over time in that, 00:28:00.280 |
And now I think that's clearly not gonna be the case. 00:28:02.080 |
I think you're gonna have all these different things 00:28:04.400 |
and there's utility in being able to do different things. 00:28:12.880 |
how do you build the software to allow people 00:28:17.080 |
So say, so you could view them as just completely 00:28:25.160 |
but let's talk about the metaverse economy for a second. 00:28:31.240 |
for my photorealistic avatar, which by the way, 00:28:34.440 |
I think at the time where we're spending a lot of time 00:28:36.480 |
in the metaverse doing a lot of our work meetings 00:28:40.280 |
I would imagine that the economy around virtual clothing 00:28:56.680 |
let's say you buy some shirt for your photorealistic avatar. 00:28:59.800 |
Wouldn't it be cool if there was a way to basically 00:29:07.920 |
for your kind of cartoonish or expressive avatar? 00:29:12.520 |
You can view them as two discrete points and okay, 00:29:18.160 |
then it actually comes in a pack and there's two 00:29:22.320 |
But I actually think this stuff might exist more 00:29:26.080 |
And that's what, I do think the direction on some of the 00:29:35.920 |
being able to take a piece of art or express something 00:29:39.800 |
and say, okay, paint me this photo in the style of Gauguin 00:29:51.240 |
of what I've designed for my expressive avatar. 00:29:56.880 |
- And so the fashion, you might be buying like a generator, 00:30:05.480 |
they'll be able to infinitely generate outfits 00:30:08.120 |
thereby making it, so the reason I wear the same thing 00:30:16.640 |
Your closet generates your outfit for you every time. 00:30:19.520 |
So you have to live with the outfit it generates. 00:30:27.440 |
but I think like, I think that there's going to be 00:30:29.960 |
a huge aspect of just people doing creative commerce here. 00:30:34.960 |
So I think that there is going to be a big market 00:30:41.000 |
But the question is, if you're designing digital clothing, 00:30:42.960 |
do you need to design, if you're the designer, 00:30:44.800 |
do you need to make it for each kind of specific, 00:31:03.120 |
is there a way to morph that so it like goes on the dragon 00:31:07.640 |
And that I think is an interesting AI problem 00:31:09.440 |
because you're probably not going to make it so that, 00:31:11.880 |
like that designers have to go design for all those things. 00:31:17.240 |
that you buy in a lot of uses, in a lot of use cases, 00:31:23.440 |
And that's a lot of what, you know, all of the, 00:31:29.680 |
but I think a lot of the promise here is that 00:31:32.560 |
if the digital goods that you buy are not just tied 00:31:50.800 |
you can have people concealing their identity 00:32:06.920 |
- Well, let's break that down into a few different cases. 00:32:10.240 |
I mean, 'cause knowing that you're talking to someone 00:32:13.520 |
that I think is not even solved pretty much anywhere. 00:32:17.800 |
But if you're talking to someone who's a dragon, 00:32:21.160 |
that they're not representing themselves as a person. 00:32:36.120 |
you're in a future version of this conversation 00:32:50.280 |
And one of the things that we're thinking about is, 00:32:57.480 |
to be able to generate photorealistic avatars 00:33:03.320 |
And so you kind of have a map from your headset 00:33:06.160 |
and whatever sensors of what your body's actually doing 00:33:12.640 |
should there be some sort of biometric security 00:33:24.320 |
And I think you probably are gonna want something like that. 00:33:26.760 |
So that's, you know, as we're developing these technologies, 00:33:31.120 |
we're also thinking about the security for things like that 00:33:34.640 |
because people aren't gonna want to be impersonated. 00:33:39.520 |
Then you just get the question of people hiding behind 00:33:48.320 |
which is not gonna be unique to the metaverse. 00:33:51.000 |
Although, you know, certainly in an environment 00:34:06.480 |
in social media and the internet more broadly. 00:34:08.720 |
And there, I think there have been a bunch of tactics 00:34:17.760 |
you know, we've built up these different AI systems 00:34:21.880 |
is this account behaving in the way that a person would? 00:34:28.360 |
so in all of the work that we've done around, 00:34:33.320 |
and it's basically like policing harmful content 00:34:36.920 |
and trying to figure out where to draw the line. 00:34:41.280 |
where do you draw the line on some of this stuff? 00:34:42.880 |
And the thing that I've kind of found the most effective 00:34:55.520 |
in an overall harmful way at the account level, 00:35:01.400 |
Which I think the metaverse is gonna be even harder 00:35:14.680 |
So I think more of this stuff will have to be done 00:35:27.640 |
inside the company and like years of building 00:35:29.840 |
just different AI systems to basically detect 00:35:39.880 |
we are like years ahead of basically anyone else 00:35:43.520 |
in the industry in terms of having built those capabilities. 00:35:48.080 |
And I think that that just is gonna be incredibly important 00:35:54.920 |
it's an incredibly difficult problem to solve. 00:35:57.640 |
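(For readers curious what account-level behavioral detection can look like in the simplest possible form, here is an illustrative-only sketch. Real systems are large learned models trained on many signals; every feature and threshold below is a made-up assumption, not how Meta's systems work.)

```python
# Toy account-level scoring in the spirit of "does this account behave the way
# a person would?" -- look at behavior patterns, not individual pieces of content.
from dataclasses import dataclass

@dataclass
class AccountActivity:
    posts_per_hour: float
    distinct_groups_posted_to: int
    duplicate_post_ratio: float   # share of posts that are near-identical
    account_age_days: int

def inauthenticity_score(a: AccountActivity) -> float:
    score = 0.0
    if a.posts_per_hour > 20:             # humans rarely sustain this rate
        score += 0.4
    if a.duplicate_post_ratio > 0.8:      # mostly copy-pasted content
        score += 0.3
    if a.distinct_groups_posted_to > 50:  # spraying the same message widely
        score += 0.2
    if a.account_age_days < 7:            # brand-new account
        score += 0.1
    return min(score, 1.0)

suspicious = AccountActivity(posts_per_hour=45, distinct_groups_posted_to=120,
                             duplicate_post_ratio=0.95, account_age_days=2)
print(inauthenticity_score(suspicious))  # 1.0 -> flag for review, not auto-ban
```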
By the way, I would probably like to open source my avatar 00:36:03.160 |
so there could be like millions of Lexes walking around, 00:36:20.160 |
and I'd just been doing reinforcement learning on it. 00:36:32.400 |
they have to learn how to like walk around and stuff. 00:36:37.200 |
between biometric security, you want to have one identity, 00:36:40.320 |
but then certain avatars, you might have to have many. 00:36:53.840 |
- Well, how does flooding the world with Lexes 00:37:06.840 |
- I think that one's not gonna work that well for you. 00:37:09.480 |
- It's not gonna work that well, for the original copy. 00:37:14.760 |
and you're trying to have, you know, a bunch of, 00:37:19.440 |
in a bunch of different places in the future, 00:37:23.480 |
So that kind of replication I think will be useful. 00:37:26.120 |
But I do think that you're gonna want a notion of like, 00:37:34.440 |
start outperforming you in all your private relationships 00:37:38.720 |
I mean, that's a serious concern I have with clones. 00:37:43.320 |
Okay, so I recently got, I use QNAP NAS storage. 00:37:51.440 |
It was the first time for me with ransomware. 00:37:53.520 |
It's not me personally, it's all QNAP devices. 00:38:03.280 |
because I was doing a lot of the right things 00:38:13.720 |
to protect people's data on the security front? 00:38:16.880 |
- I think that there's different solutions for, 00:38:20.360 |
and strategies where it makes sense to have stuff 00:38:30.200 |
Then I think both have strengths and weaknesses. 00:38:38.760 |
you know, I mean, the advantage of something like, 00:38:52.600 |
And that's something that I think was a big advance 00:38:57.120 |
And one of the promises that we can basically make 00:39:07.800 |
So that way, if someone hacks Meta's servers, 00:39:19.040 |
because obviously if someone is able to compromise 00:39:21.840 |
a company's servers and that company has hundreds 00:39:31.040 |
You know, are you following security best practices 00:39:35.800 |
If you lose your phone, all your content is gone. 00:40:01.440 |
that can just do much better personalization. 00:40:09.840 |
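(A minimal sketch of the end-to-end property being described, using the PyNaCl library. This is not WhatsApp's actual protocol, which is based on the Signal protocol; it only illustrates the point above: the server relays and stores ciphertext without ever holding the keys, so compromising the server does not expose the messages. It also shows the trade-off mentioned above: the keys live only on the devices, so losing the device can mean losing the content.)

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustration only: the server would see and store only the ciphertext.
from nacl.public import PrivateKey, Box

alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Each side combines its own private key with the other's public key.
alice_box = Box(alice_sk, bob_sk.public_key)
bob_box = Box(bob_sk, alice_sk.public_key)

ciphertext = alice_box.encrypt(b"meet at noon")  # what a relay server would hold
print(bob_box.decrypt(ciphertext))               # b'meet at noon' -- only Bob can read it
```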
you know, especially if you're a serious company, 00:40:15.960 |
and you have red teams attacking your own stuff 00:40:24.200 |
and trying to attract some of the best hackers in the world 00:40:35.440 |
and as open about hardening the systems as possible, 00:40:35.440 |
rather than pretend that there aren't gonna be issues, 00:40:39.080 |
So I think you want to adopt that approach as a company 00:40:56.560 |
- Trying to stay one step ahead of the attackers. 00:41:11.760 |
there are ones where basically the adversaries 00:41:24.360 |
that are trying to interfere in elections or something, 00:41:33.160 |
but like if someone is saying something hateful, 00:41:36.680 |
people usually aren't getting smarter and smarter 00:41:47.400 |
and some of these other places get over time. 00:41:55.400 |
'cause they realize it's not fun being hateful. 00:42:13.800 |
that raised concerns about the effects of social media 00:42:29.600 |
algorithms want to maximize attention and engagement, 00:42:40.920 |
Can you steel man their criticisms and arguments 00:42:51.600 |
- Well, yeah, I think that that's a good conversation 00:42:56.880 |
I don't happen to agree with the conclusions, 00:43:08.880 |
But I understand overall why people would be concerned 00:43:20.720 |
as people use the service more in general, right? 00:43:34.560 |
So we think that if people are finding it useful, 00:43:47.920 |
So let's say you just kind of simplified it down to that, 00:43:51.440 |
then would we want people to use the services more? 00:43:55.000 |
But then, and then you get to the second question, 00:43:57.360 |
which is, does kind of getting people agitated 00:44:01.960 |
make them more likely to use the services more? 00:44:07.560 |
And I think from looking at other media in the world, 00:44:12.560 |
especially TV and, you know, there's the old news adage, 00:44:35.600 |
One is that what grabs someone's attention in the near term 00:44:47.400 |
is that we're not, I'm not building this company 00:44:55.560 |
I mean, I've been doing this for 17 years at this point, 00:45:03.320 |
So like, I think that it's too simplistic to say, 00:45:08.320 |
hey, this might increase time in the near term, 00:45:17.160 |
the incentives of a company that are focused on the long term 00:45:24.080 |
not what is gonna draw people's attention today. 00:45:40.240 |
is just that, okay, I'm a product designer, right? 00:45:51.960 |
And we spend a lot of time talking about virtual reality 00:45:55.600 |
and how the kind of key aspect of that experience 00:46:01.920 |
It's not just about the utility that you're delivering, 00:46:05.880 |
And similarly, I care a lot about how people feel 00:46:11.360 |
and I don't want to build products that make people angry. 00:46:18.360 |
is to build something that people spend a bunch of time doing 00:46:22.080 |
and it just kind of makes them angrier at other people. 00:47:25.080 |
for the next week, but like as people working on this, 00:47:35.680 |
I guess one other thing that I'd say is that, 00:47:37.720 |
while we're focused on the ads business model, 00:47:45.320 |
I mean, so take like a subscription news business model, 00:47:52.160 |
Maybe if someone's paying for a subscription, 00:47:55.200 |
you don't get paid per piece of content that they look at, 00:48:07.320 |
but you have these kind of partisan news organizations 00:48:15.720 |
and they're only gonna get people on one side 00:48:30.000 |
if they produce stuff that's kind of more polarizing 00:48:33.600 |
then that is what gets them more subscribers. 00:48:43.400 |
The thing that I think is great about advertising 00:48:46.440 |
is that it makes the consumer services free, 00:48:48.720 |
which if you believe that everyone should have a voice 00:49:00.000 |
about how you're implementing what you're doing. 00:49:08.960 |
like you're leaning into them making people angry 00:49:13.600 |
just because it makes more money in the short term, 00:49:17.840 |
But there's also a kind of reality of human nature. 00:49:50.520 |
But it does seem when you look at social media 00:50:01.920 |
sort of outrage in one direction or the other. 00:50:04.960 |
And so it's easy from that to infer the narrative 00:50:13.960 |
And so they're increasing the division in the world. 00:50:25.800 |
How can you be transparent about this battle? 00:50:28.440 |
Because I think it's not just motivation or financials, 00:50:43.040 |
- I think that going through some of the design decisions 00:50:56.920 |
And I think it's worth grounding this conversation 00:50:59.520 |
in the actual research that has been done on this, 00:51:10.800 |
And I mean, there's been a number of economists 00:51:14.800 |
and social scientists and folks who have studied this. 00:51:17.440 |
And polarization varies a lot around the world. 00:51:27.160 |
And you see different trends in different places 00:51:32.160 |
where in a lot of countries polarization is declining, 00:51:37.000 |
in some it's flat, in the US it's risen sharply. 00:51:43.200 |
what are the unique phenomena in the different places? 00:51:46.000 |
And I think for the people who are trying to say, 00:51:47.640 |
hey, social media is the thing that's doing this, 00:51:57.720 |
And you have researchers like this economist at Stanford, 00:52:00.600 |
Matthew Gentzkow, who's just written at length about this. 00:52:04.400 |
And it's a bunch of books by political scientists, 00:52:13.760 |
this decades long analysis in the US before I was born, 00:52:24.240 |
and different things that predate the internet 00:52:32.200 |
not only is it pretty clear that social media 00:52:37.560 |
but most of the academic studies that I've seen 00:52:45.360 |
Gentzkow, the same person who just did the study 00:52:57.480 |
that if you looked after the 2016 election in the US, 00:53:04.320 |
were actually the ones who were not on the internet. 00:53:07.560 |
So, and there have been recent other studies, 00:53:12.800 |
basically showing that as people stop using social media, 00:53:21.360 |
okay, well, polarization actually isn't even one thing. 00:53:24.760 |
'Cause having different opinions on something isn't, 00:53:28.920 |
What people who study this say is most problematic 00:53:35.920 |
which is basically, do you have negative feelings 00:53:41.040 |
And the way that a lot of scholars study this 00:53:46.800 |
would you let your kids marry someone of group X? 00:53:53.320 |
that someone might have negative feelings towards. 00:54:10.720 |
But overall, I think it's just worth grounding 00:54:13.280 |
that discussion in the research that's existed 00:54:17.480 |
that the mainstream narrative around this is just not right. 00:54:27.200 |
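(The "would you let your kids marry someone of group X" item is one standard survey measure of affective polarization; another common one is the gap in 0-100 "feeling thermometer" ratings of one's own party versus the other party. The sketch below computes that thermometer-gap measure on invented numbers, purely to make the definition concrete.)

```python
# One common affective-polarization measure: the average gap between how warmly
# respondents rate their own party vs. the other party on a 0-100 thermometer.
# The survey numbers here are made up for illustration.
def affective_polarization(responses):
    gaps = [r["in_party"] - r["out_party"] for r in responses]
    return sum(gaps) / len(gaps)

survey = [
    {"in_party": 85, "out_party": 20},
    {"in_party": 70, "out_party": 45},
    {"in_party": 90, "out_party": 10},
]
print(affective_polarization(survey))  # ~56.7 -> larger gap = more affective polarization
```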
There's another question I'd like to ask you on this. 00:54:31.320 |
I was looking at various polls and saw that you're 00:54:43.280 |
It's basically everybody has a very high unfavorable rating 00:54:56.880 |
And how do you regain their trust and support? 00:55:00.920 |
why are you losing the battle in explaining to people 00:55:08.120 |
what actual impact social media has on society? 00:55:11.120 |
- Well, I'm curious if that's a US survey or world. 00:55:19.360 |
One is that our brand has been somewhat uniquely challenged 00:55:42.880 |
before 2016, I think that there were probably 00:55:44.880 |
very few months if any where it was negative. 00:55:49.440 |
been very few months if any that it's been positive. 00:55:55.320 |
And this is very different from other places. 00:55:56.920 |
So I think in a lot of other countries in the world, 00:56:12.480 |
I think if you look at it from like a partisan perspective, 00:56:21.040 |
and there are people who are more right of center 00:56:22.480 |
and there's kind of blue America and red America. 00:56:29.840 |
has a more positive view towards than another. 00:56:39.760 |
is that because of a lot of the content decisions 00:56:45.760 |
and because we're not a partisan company, right? 00:56:49.520 |
We're not a Democrat company or a Republican company. 00:56:52.600 |
We're trying to make the best decisions we can 00:56:56.040 |
and help people have as much voice as they can 00:57:22.960 |
that's gonna upset, say, about half the country, 00:57:46.040 |
But their opinion doesn't tend to go up by that much. 00:57:48.920 |
Whereas the people who kind of are on the other side of it 00:57:58.200 |
and should be up and should not be censored?" 00:58:09.240 |
the people who thought it should be taken down, 00:58:17.920 |
are basically any time one of these big controversial things 00:58:22.000 |
our brand goes down with half of the country. 00:58:27.600 |
and then if you just kind of extrapolate that out, 00:58:29.600 |
it's just been very challenging for us to try to navigate 00:58:33.200 |
what is a polarizing country in a principled way, 00:58:36.640 |
where we're not trying to kind of hew to one side 00:58:38.600 |
or the other, we're trying to do what we think 00:58:44.080 |
So, I mean, that's what we'll try to keep doing. 00:58:47.360 |
- Just as a human being, how does it feel though, 00:58:50.160 |
when you're giving so much of your day-to-day life 00:58:55.720 |
to try to do good in the world, as we've talked about, 00:58:59.680 |
that so many people in the US, the place you call home, 00:59:18.280 |
I mean, look, if I wanted people to think positively 00:59:23.540 |
I don't know, I'm not sure if you go build a company. 00:59:32.900 |
- Yeah, so, I mean, I don't know, there is a dynamic 00:59:36.740 |
where a lot of the other people running these companies, 00:59:40.820 |
internet companies, have sort of stepped back 00:59:49.500 |
And some of it may be that they just get tired over time, 00:59:59.860 |
You only really do it for a long period of time 01:00:18.060 |
your experience as a public figure is gonna be 01:00:21.040 |
that there's a lot of people who dislike you, right? 01:00:23.340 |
So I actually am not sure how different it is. 01:00:28.340 |
Certainly, the country's gotten more polarized 01:00:32.740 |
and we in particular have gotten more controversial 01:00:39.220 |
but I don't know, I kind of think like as a public figure 01:00:49.420 |
- Part of, yeah, part of what you do is like, 01:00:56.920 |
you need to be getting feedback and internalizing feedback 01:01:09.460 |
because they're trying to help you do a better job 01:01:23.020 |
to tune out everyone who says anything negative 01:01:25.920 |
and just listen to the people who are kind of positive 01:01:29.320 |
and support you as it would be psychologically 01:01:40.080 |
but I mean, you kind of develop more of a feel for like, 01:01:48.400 |
and who has different ideas about how to do that 01:02:02.100 |
But I think at this point, I'm pretty tuned to just, 01:02:13.220 |
I think over time, it just doesn't bother you that much. 01:02:19.680 |
Are there friends or colleagues in your inner circle 01:02:30.920 |
- I think we have a famously open company culture 01:02:33.480 |
where we sort of encourage that kind of dissent internally, 01:02:39.400 |
which is why there's so much material internally 01:02:42.240 |
that can leak out with people sort of disagreeing 01:02:47.520 |
Our management team, I think it's a lot of people, 01:02:51.960 |
there's some folks who've kind of been there for a while, 01:03:01.040 |
And my friends and family, I think will push me on this. 01:03:09.240 |
It can't just be people who are your friends and family. 01:03:13.740 |
It's also, I mean, there are journalists or analysts 01:03:27.640 |
about thinking about the world, certain politicians 01:03:32.720 |
who I just think have like very insightful perspectives 01:03:39.840 |
they come at the world from a different perspective, 01:03:41.640 |
which is sort of what makes the perspective so valuable. 01:03:55.620 |
And I don't know, I mean, those are the people 01:04:02.940 |
And that's how I learn on a day-to-day basis. 01:04:05.680 |
People are constantly sending me comments on stuff 01:04:10.160 |
And I don't know, it's kind of constantly evolving 01:04:14.520 |
and kind of what we should be aspiring to be. 01:04:16.880 |
- You've talked about, you have a famously open culture 01:04:20.080 |
which comes with the criticism and the painful experiences. 01:04:25.940 |
So let me ask you another difficult question. 01:04:38.080 |
Her claim is that Instagram is choosing profit 01:05:07.600 |
Again, can you steel man and defend the point 01:05:10.960 |
and Frances Haugen's characterization of the study 01:05:24.220 |
around teen mental health that are really important. 01:05:26.600 |
It's hard to, as a parent, it's hard to imagine 01:05:29.500 |
any set of questions that are sort of more important. 01:05:32.040 |
I mean, I guess maybe other aspects of physical health 01:05:34.080 |
or wellbeing probably come to that level. 01:05:37.240 |
But like, these are really important questions, right? 01:05:40.580 |
Which is why we dedicate teams to studying them. 01:06:05.000 |
that everything was gonna be positive all the time. 01:06:07.000 |
So, I mean, the reason why you study this stuff 01:06:15.320 |
I thought some of the reporting and coverage of it 01:06:24.280 |
and on 18 or 19, the effect of using Instagram 01:06:27.620 |
was neutral or positive on the teens' wellbeing. 01:06:38.840 |
But I think having the coverage just focus on that one 01:06:43.080 |
you know, I mean, I think an accurate characterization 01:06:49.300 |
is generally positive for their mental health. 01:06:53.760 |
But of course, that was not the narrative that came out. 01:06:56.740 |
that's not a kind of logical thing to straw man, 01:07:01.480 |
but I sort of disagree with that overall characterization. 01:07:04.080 |
I think anyone sort of looking at this objectively would. 01:07:29.840 |
it's not clear that any of the other tech companies 01:07:33.360 |
So why the narrative should form that we did research, 01:07:38.840 |
'cause we wanted to understand it to improve, 01:07:40.800 |
and took steps after that to try to improve it, 01:07:55.960 |
that like, that's the accurate description of the actions 01:07:59.040 |
that we've taken compared to the others in the industry. 01:08:01.200 |
So I don't know, that's kind of, that's my view on it. 01:08:09.120 |
related to teen mental health for a long time, 01:08:14.280 |
And I would encourage everyone else in the industry 01:08:16.600 |
- Yeah, I would love there to be open conversations 01:08:21.920 |
and a lot of great research being released internally, 01:08:35.020 |
when they say negative things about social media. 01:08:46.060 |
And I don't understand how that's supposed to lead 01:08:53.060 |
and the negatives, the concerns about social media, 01:08:56.020 |
especially when you're doing that kind of research. 01:09:13.520 |
But now, because so much of our life is in the digital world, 01:09:39.700 |
have committed suicide or will commit suicide 01:09:45.260 |
based on the bullying that happens on social media. 01:09:51.540 |
that we basically track and build systems to fight against. 01:10:00.420 |
you know, I mean, these are some of the biggest things 01:10:18.300 |
and it's probably impossible to get rid of all of it, 01:10:27.220 |
And you also wanna make sure that people have the tools 01:10:39.700 |
they fight and, you know, push each other around 01:10:51.580 |
You know, we talked a little bit before around 01:10:57.100 |
to identify when something harmful is happening. 01:11:01.540 |
'cause a lot of bullying is very context-specific. 01:11:03.740 |
It's not like you're trying to fit a formula of like, 01:11:07.300 |
you know, if like looking at the different harms, 01:11:10.340 |
you know, someone promoting a terrorist group 01:11:14.420 |
to generally find, because things promoting that group 01:11:24.060 |
someone's appearance that's idiosyncratic to them. 01:11:44.820 |
So we've done things like making it so people 01:12:00.460 |
I just like, I just don't wanna hear about this, 01:12:01.940 |
but I also don't wanna signal at all publicly that, 01:12:08.940 |
And then you get to some of the more extreme cases 01:12:14.100 |
like you're talking about, where someone is thinking about 01:12:37.540 |
and, you know, hundreds of languages around the world 01:12:41.100 |
And one of the things that I'm actually quite proud of 01:12:45.380 |
is we've built these systems that I think are 01:12:48.860 |
clearly leading at this point that not only identify that, 01:12:57.060 |
and have been able to save, I think at this point, 01:13:09.660 |
between the AI work and being able to communicate 01:13:13.820 |
And we're rolling that out in more places around the world. 01:13:28.540 |
And also kind of understand that some of human nature 01:13:28.540 |
which is unfortunate, but you can give people tools 01:13:43.860 |
- A platform that allows people to fall in love 01:13:46.980 |
with each other is also by nature going to be a platform 01:13:52.860 |
And when you're managing such a platform, it's difficult. 01:13:57.140 |
And I think you spoke to it, but the psychology of that, 01:13:59.420 |
of being a leader in that space of creating technology 01:14:02.580 |
that's playing in this space, like you mentioned, 01:14:10.260 |
And I mean, the burden of that is just great. 01:14:13.100 |
I just wanted to hear you speak to that point. 01:14:17.220 |
I have to ask about the thing you've brought up a few times, 01:14:29.420 |
So there are two groups of people pressuring Meta on this. 01:14:29.420 |
One group is upset that Facebook, the social network, 01:14:37.220 |
allows misinformation in quotes to be spread. 01:14:40.260 |
On the platform, the other group are concerned 01:14:42.660 |
that Facebook censors speech by calling it misinformation. 01:14:48.780 |
You, in October 2019, at Georgetown University, 01:14:53.780 |
eloquently defended the importance of free speech, 01:14:58.200 |
but then COVID came and the 2020 election came. 01:15:04.300 |
Do you worry that outside pressures from advertisers, 01:15:09.660 |
to damage the ideal of free speech that you spoke highly of? 01:15:25.020 |
So let me just take you through our evolution 01:15:31.700 |
unless you believe that people expressing themselves 01:15:38.140 |
You can kind of think about our company as a formula 01:15:44.220 |
and helping people connect creates opportunity. 01:15:47.420 |
So those are the two things that we're always focused on 01:15:58.100 |
that's not like politically controversial content, 01:16:00.820 |
it's like expressing something about their identity 01:16:04.000 |
that's more related to the avatar conversation 01:16:06.460 |
we had earlier in terms of expressing some facet, 01:16:08.560 |
but that's what's important to people on a day-to-day basis. 01:16:11.220 |
And sometimes when people feel strongly enough 01:16:13.420 |
about something, it kind of becomes a political topic. 01:16:16.300 |
That's sort of always been a thing that we've focused on. 01:16:19.060 |
There's always been the question of safety in this, 01:16:26.020 |
We've had these community standards from early on, 01:16:28.260 |
and there are about 20 different kinds of harm 01:16:36.380 |
So it includes things like bullying and harassment, 01:16:40.620 |
it includes things like terrorism or promoting terrorism, 01:16:45.620 |
inciting violence, intellectual property theft. 01:16:49.300 |
And in general, I think call it about 18 out of 20 of those, 01:16:53.660 |
there's not really a particularly polarized definition 01:16:58.060 |
I think you're not really gonna find many people 01:17:16.340 |
And you're right, the misinformation is basically up there. 01:17:20.020 |
And I think sometimes the definition of hate speech 01:17:25.300 |
most of the content that I think we're working on for safety 01:17:30.300 |
is not actually, people don't kind of have these questions. 01:17:36.000 |
But if you go back to the beginning of the company, 01:17:42.680 |
And therefore, it was me, and then my roommate Dustin joined me. 01:17:42.680 |
to be able to go basically look at all the content. 01:18:06.160 |
that no one would expect that we could review it all. 01:18:12.720 |
and we try to look at stuff and deal with stuff. 01:18:43.440 |
sort of a larger and more profitable company. 01:19:03.480 |
- Oh, of course there are. - Technical papers. 01:19:10.640 |
understand the nuances of what is inciting violence 01:19:24.040 |
Because you're actually trying to denounce it, 01:19:26.320 |
in which case we probably shouldn't take that down. 01:19:30.760 |
in some kind of dialect in a corner of India, 01:19:35.280 |
as opposed to, okay, actually you're posting this thing 01:20:04.960 |
and there's a lot more stuff here than we realized. 01:20:08.040 |
Why weren't these internet companies on top of this? 01:20:20.640 |
some of this technology had started becoming possible. 01:20:26.320 |
we needed to make a substantially larger investment. 01:20:40.160 |
to be able to go and proactively identify the stuff 01:20:46.360 |
Now we've built the tools to be able to do that. 01:20:49.000 |
And now I think it's actually a much more complicated set 01:20:51.920 |
of philosophical rather than technical questions, 01:20:57.960 |
Now, the way that we basically hold ourselves accountable 01:21:02.960 |
is we issue these transparency reports every quarter. 01:21:05.400 |
And the metric that we track is for each of those 01:21:15.320 |
Right, so how effective is our AI at doing this? 01:21:18.120 |
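(The metric being described is what Meta's transparency reports call the proactive rate: of the content acted on in a category, the share found by automated systems before any user reported it. The arithmetic is simple; the numbers below are invented for illustration.)

```python
# Proactive rate: share of actioned content found before anyone reported it.
def proactive_rate(actioned_proactively: int, actioned_from_reports: int) -> float:
    total = actioned_proactively + actioned_from_reports
    return actioned_proactively / total if total else 0.0

print(f"{proactive_rate(970_000, 30_000):.1%}")  # 97.0%
```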
But that basically creates this big question, 01:21:20.560 |
which is, okay, now we need to really be careful 01:21:31.360 |
It's certainly at a point now where, you know, 01:21:46.320 |
that people aren't reporting or that you're not getting to, 01:21:52.920 |
But then I think at some point along the way, 01:21:54.800 |
there started to be almost equal issues on both sides 01:21:58.960 |
of, okay, actually you're kind of taking down 01:22:09.640 |
So is that really an issue that you need to take down? 01:22:13.000 |
Whereas we still have the critique on the other side too, 01:22:15.440 |
where a lot of people think we're not doing enough. 01:22:18.560 |
So it's become, as we built the technical capacity, 01:22:21.840 |
I think it becomes more philosophically interesting 01:22:27.600 |
And I just think like you don't want one person 01:22:33.440 |
So we've also tried to innovate in terms of building out 01:22:37.480 |
which has people who are dedicated to free expression, 01:22:46.400 |
basically go to them and they make the final binding 01:22:51.640 |
we then try to figure out what the principles are 01:22:54.000 |
behind those and encode them into the algorithms. 01:22:58.120 |
which you're outsourcing a difficult decision. 01:23:26.720 |
Okay, so there's also this idea of fact checkers, 01:23:29.280 |
so jumping around to the misinformation questions, 01:23:33.800 |
which is an exceptionally, speaking of polarization. 01:23:38.920 |
I mean, I think one of the hardest set of questions 01:23:44.520 |
And the answer to that is no, my stance has not changed. 01:23:49.120 |
It is fundamentally the same as when I was talking 01:23:52.720 |
at Georgetown, from a philosophical perspective. 01:23:52.720 |
The challenge with free speech is that everyone agrees 01:24:01.480 |
that there is a line where if you're actually 01:24:27.920 |
I mean, take it back to the bullying conversation 01:24:36.640 |
Or is it actually harm as they're being bullied? 01:24:39.880 |
And kind of at what point in the spectrum is that? 01:24:42.160 |
And that's the part that there's not agreement on. 01:24:44.480 |
But I think what people agree on pretty broadly 01:24:49.440 |
that it does make sense from a societal perspective 01:24:52.960 |
to tolerate less speech that could be potentially harmful 01:24:59.560 |
So I think where COVID got very difficult is, 01:25:02.800 |
I don't think anyone expected this to be going on for years, 01:25:10.360 |
would a global pandemic where a lot of people are dying 01:25:29.000 |
So I think that it's a very tricky situation, 01:25:32.320 |
but I think the fundamental commitment to free expression 01:25:39.840 |
And again, I don't think you start this company 01:25:52.400 |
- And what are the institutions that define that harm? 01:25:55.440 |
A lot of the criticism is that the CDC, the WHO, 01:25:59.720 |
the institutions we've come to trust as a civilization 01:26:07.760 |
in terms of health policy have failed in many ways, 01:26:11.640 |
in small ways, in big ways, depending on who you ask. 01:26:14.440 |
And then the perspective of Meta and Facebook is like, 01:26:17.120 |
well, where the hell do I get the information 01:26:30.120 |
And again, I think this actually calls to the fact 01:26:46.880 |
and he is called as a, I remember hanging out 01:26:54.600 |
I mean, and there's a lot of people that follow him 01:26:57.560 |
that believe he's not spreading misinformation. 01:27:05.680 |
that have a different definition of misinformation. 01:27:24.040 |
in a totally new pandemic or a new catastrophic event, 01:27:31.040 |
and especially when the news or the media involved 01:27:35.000 |
that can propagate sort of outrageous narratives 01:27:40.680 |
Like what the hell, where's the source of truth? 01:27:45.480 |
It's like, please tell me what the source of truth is. 01:27:49.040 |
- Well, I mean, well, how would you handle this 01:27:56.440 |
I would more speak about how difficult the choices are 01:28:05.360 |
Like here, you got exactly, ask the exact question 01:28:08.080 |
you just asked me, but to the broader public. 01:28:10.100 |
Like, okay, yeah, you guys tell me what to do. 01:28:23.420 |
and now there's a feeling like you're censoring 01:28:26.660 |
And so I would lean, I would try to be ahead of that feeling. 01:28:40.020 |
And I actually place, you know, this idea of misinformation, 01:28:46.360 |
on the poor communication skills of scientists. 01:28:57.400 |
against the vaccine, they should not be censored. 01:29:00.420 |
They should be talked with and you should show the data. 01:29:04.820 |
as opposed to rolling your eyes and saying, I'm the expert. 01:29:13.240 |
So that's the whole point of freedom of speech 01:29:22.080 |
on the poor communication skills of scientists. 01:29:26.600 |
Scientists are not trained communicators, and social media can help with that. 01:29:26.600 |
It's a way to learn to respond really quickly, 01:29:51.780 |
Don't talk down to people, all those kinds of things. 01:30:00.820 |
because there's a lot of stuff that can cause real harm 01:30:09.580 |
wouldn't be blaming you for the other ills of society, 01:30:14.580 |
The institutions have flaws, the political divide. 01:30:19.580 |
Obviously, politicians have flaws, that's news. 01:30:23.380 |
The media has flaws that they're all trying to work with. 01:30:28.020 |
And because of the central place of Facebook in the world, 01:30:31.100 |
all of those flaws somehow kind of propagate to Facebook. 01:30:34.340 |
And you're sitting there, as Plato, the philosopher, 01:30:38.140 |
have to answer to some of the most difficult questions 01:30:43.940 |
So I don't know, maybe this is an American answer, though, 01:30:59.140 |
and matter when people talk about this company 01:31:06.900 |
You talked about AI has to catch a bunch of harmful stuff 01:31:10.700 |
really quickly, just the sea of data you have to deal with. 01:31:28.180 |
that would wake people up to how difficult this problem is 01:31:37.820 |
just 'cause you mentioned language and my ineloquence, 01:31:41.620 |
translation is something I wanted to ask you about. 01:31:44.140 |
And first, just to give a shout out to the supercomputer, 01:31:47.780 |
you've recently announced the AI Research Supercluster, RSC. 01:32:04.100 |
And it will eventually, maybe this year, maybe soon, 01:32:15.040 |
There's a cool thing on the distributed storage aspect 01:32:21.060 |
that I think is super exciting is translation, 01:32:26.340 |
I mentioned to you that having a conversation, 01:32:32.340 |
and I'm having a conversation with Vladimir Putin, 01:32:51.220 |
and a powerful application of AI to connect the world, 01:32:54.700 |
that bridge the gap, not necessarily between me and Putin, 01:32:57.560 |
but people that don't have that shared language. 01:33:01.020 |
Can you just speak about your vision with translation? 01:33:04.120 |
'Cause I think that's a really exciting application. 01:33:11.600 |
and people in all these other places are interested in it. 01:33:16.640 |
just unlocks a lot of value on a day-to-day basis. 01:33:20.560 |
And so the kind of AI around translation is interesting 01:33:24.400 |
because it's gone through a bunch of iterations. 01:33:37.680 |
representations of language or something like that. 01:33:42.480 |
You basically wanna be able to map the concepts 01:33:46.920 |
and basically go directly from one language to another. 01:33:49.320 |
And you just can train bigger and bigger models 01:33:54.120 |
And that's where the research supercluster comes in 01:33:58.120 |
is basically a lot of the trend in machine learning 01:34:01.040 |
is just you're building bigger and bigger models 01:34:03.400 |
and you just need a lot of computation to train them. 01:34:11.400 |
which could have billions or trillions of examples 01:34:25.960 |
a much longer period of time on a smaller cluster. 01:34:28.160 |
So it just wouldn't be practical for most teams to do. 01:34:37.280 |
you're able to go between about 100 languages 01:34:39.680 |
seamlessly today to being able to go to about 300 languages 01:34:46.440 |
- So from any language to any other language. 01:34:48.680 |
- Yeah, and part of the issue when you get closer to 01:34:51.920 |
more languages is some of these get to be pretty, 01:35:07.240 |
and you need to kind of use a model that you've built up 01:35:12.080 |
And this is one of the big questions around AI 01:35:21.320 |
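(To make the "direct, many-to-many translation without pivoting through English" idea concrete, here is a short example using Meta AI's publicly released M2M-100 model through the Hugging Face transformers library. It is an illustration only and not necessarily the exact system discussed here; it requires the transformers and sentencepiece packages and downloads the model weights on first run.)

```python
# Direct many-to-many translation (no English pivot) with M2M-100.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "ru"  # translate Russian -> French directly
encoded = tokenizer("Жизнь прекрасна", return_tensors="pt")
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("fr"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```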
- But capturing, we talked about this with the metaverse, 01:35:23.800 |
capturing the magic of human to human interaction. 01:35:31.080 |
'cause you actually both speak Russian and English. 01:35:37.720 |
because we would both like to have an AI that improves 01:35:44.200 |
It would be nice to outperform our abilities. 01:35:50.640 |
because I think AI can really help in translation 01:35:53.600 |
for people that don't speak the language at all. 01:35:55.720 |
But to actually capture the magic of the chemistry, 01:36:01.280 |
which would make the metaverse super immersive. 01:36:08.680 |
- Yeah, so when people think about translation, 01:36:14.240 |
But speech to speech, I think, is a whole nother thing. 01:36:26.640 |
and then kind of output that as speech in that language. 01:36:30.400 |
You just wanna be able to go directly from speech 01:36:32.240 |
in one language to speech in another language 01:36:40.720 |
when you look at the progress in machine learning, 01:36:42.840 |
there have been big advances in the techniques. 01:36:46.200 |
Some of the advances in self-supervised learning, 01:36:52.840 |
and he's like one of the leading thinkers in this area. 01:36:55.280 |
I just think that that stuff is really exciting. 01:36:59.760 |
to just throw larger and larger amounts of compute 01:37:09.280 |
But we're asking more of our systems too, right? 01:37:24.320 |
hopefully like a normal-ish looking pair of glasses, 01:37:28.080 |
but they're gonna be able to put holograms in the world 01:37:31.160 |
and intermix virtual and physical objects in your scene. 01:37:35.920 |
And one of the things that's gonna be unique about this, 01:37:42.520 |
is that this is gonna be the first computing device 01:37:47.480 |
about what's going on around you that you have. 01:38:03.400 |
are gonna be right around where your ears are. 01:38:10.200 |
that can basically help you process the world 01:38:21.760 |
but the kinds of AI models that we're gonna need 01:38:27.360 |
I don't know, there's a lot that we're gonna need 01:38:31.800 |
But I mean, that's why I think these concepts, 01:38:31.800 |
they're kind of enabling each other. 01:38:38.720 |
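One generic technique for fitting AI models onto a small, power-constrained device like glasses is post-training quantization; here is a minimal PyTorch sketch showing dynamic int8 quantization of a toy model. This illustrates the general idea only, and is not how Meta's devices actually run their models.

```python
# Post-training dynamic quantization in PyTorch: store Linear weights as int8
# to shrink a model for constrained hardware. The model here is a toy stand-in.
import os
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(256, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 64),
).eval()

quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

def size_mb(m: torch.nn.Module, path: str = "tmp_weights.pt") -> float:
    """Rough on-disk size of a module's weights, in megabytes."""
    torch.save(m.state_dict(), path)
    size = os.path.getsize(path) / 1e6
    os.remove(path)
    return size

print(f"fp32: {size_mb(model):.2f} MB -> int8: {size_mb(quantized):.2f} MB")
out = quantized(torch.randn(1, 256))  # inference works the same way
```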
- Yeah, like the world builder is a really cool idea. 01:38:52.960 |
like enriching your experience in all kinds of ways. 01:38:58.160 |
how you can have AI assistants helping you out 01:39:00.800 |
on different levels of sort of intimacy of communication. 01:39:11.120 |
you're one of the most successful people ever. 01:39:21.120 |
today how to live a life they can be proud of? 01:39:27.720 |
that can have a big positive impact on the world? 01:39:48.920 |
but also, like, what age of people are you talking to? 01:39:58.240 |
and you really have a chance to do something unprecedented. 01:40:08.040 |
the kind of most philosophical and abstract version of this. 01:40:17.360 |
and like, let's say that they call it the good night things 01:40:21.760 |
'cause it's basically what we talk about at night. 01:40:21.760 |
but I basically go through with Max and Augie, 01:40:46.360 |
what are the things that are most important in life? 01:40:51.560 |
and just have like really ingrained in them as they grow up? 01:41:08.680 |
I think is perhaps one of the most important things. 01:41:12.880 |
And then the third is maybe a little more amorphous, 01:41:16.040 |
but it is something that you're excited about 01:41:25.200 |
But I think for most people, it's really hard. 01:41:34.760 |
is that we have things that we're looking forward to. 01:41:37.320 |
So whether it is building AR glasses for the future, 01:41:41.640 |
or being able to celebrate my 10-year wedding anniversary 01:41:49.360 |
you have things that you're looking forward to. 01:41:55.200 |
But it's like, that's a really critical thing. 01:41:57.040 |
And then the last thing is I ask them every day, 01:42:02.640 |
Because I just think that that's a really critical thing 01:42:07.120 |
is like, it's easy to kind of get caught up in yourself 01:42:10.760 |
and kind of stuff that's really far down the road. 01:42:19.800 |
okay, yeah, I helped set the table for lunch. 01:42:32.960 |
if you were to boil down my overall life philosophy 01:42:40.160 |
those are the things that I think are really important. 01:42:57.120 |
you're probably gonna make if you're in college 01:43:04.600 |
And I sort of have this hiring heuristic at Meta, 01:43:12.400 |
which is that I will only hire someone to work for me 01:43:19.800 |
Not necessarily that I want them to run the company 01:43:35.440 |
And when you're building something like this, 01:43:39.920 |
because there's a lot of stuff that you need to get done. 01:43:41.440 |
And everyone always says, don't compromise on quality, 01:43:44.880 |
okay, how do you know that someone is good enough? 01:43:52.920 |
But I think that's actually a pretty similar answer 01:44:04.160 |
trying to figure out what your circle is gonna be, 01:44:06.760 |
you're evaluating different job opportunities, 01:44:11.560 |
even if they're gonna be peers in what you're doing, 01:44:14.400 |
who are the people who, in an alternate universe, 01:44:18.680 |
because you think you're gonna learn a lot from them 01:44:30.640 |
that are kind of more of what you wanna become like 01:44:39.080 |
and maybe not focused enough on the connections 01:44:48.080 |
but my place in Austin now has seven legged robots. 01:44:55.400 |
which is probably something I should look into. 01:45:06.420 |
- Well, I think one of the promises of all this stuff 01:45:48.960 |
than what I think today we would just consider 01:45:56.280 |
And that's a lot of what people are here to do, 01:46:03.140 |
think of things that you wanna build and go do it. 01:46:08.100 |
one of the things that I just think is striking, 01:46:09.380 |
so I teach my daughters some basic coding with Scratch. 01:46:13.700 |
I mean, they're still obviously really young, 01:46:19.820 |
I'm like building something that I want to exist. 01:46:37.580 |
It's like, she's just very interested visually 01:46:39.340 |
in what she can kind of output and how it can move around. 01:46:53.060 |
but I mean, to me, this is like a beautiful thing, right? 01:46:58.220 |
for me coding was this functional thing and I enjoyed it. 01:47:01.420 |
And it like helped build something utilitarian, 01:47:10.500 |
of their kind of imagination and artistic sense 01:47:23.580 |
that that's like their existence going forward 01:47:25.900 |
is being able to basically create and live out, 01:47:31.260 |
you know, in all these different kinds of art. 01:47:34.380 |
and wonderful thing and will be very freeing for humanity 01:47:37.900 |
to spend more of our time on the things that matter to us. 01:47:40.380 |
- Yeah, allow more and more people to express their art 01:48:01.180 |
You didn't sign up for this on a podcast, did you? 01:48:08.820 |
I do a fair amount of like extreme sport type stuff. 01:48:33.860 |
and like make sure that I'm like used to that spot 01:48:50.080 |
I have a lot of stuff that I wanna build and wanna-- 01:48:53.980 |
- Does it freak you out that it's finite though? 01:49:00.680 |
And that there'll be a time when your daughters are around 01:49:17.360 |
Yeah, the finiteness makes ice cream taste more delicious 01:49:23.200 |
There's something about that with the metaverse too. 01:49:25.760 |
You want, we talked about this identity earlier, 01:49:30.280 |
There's something powerful about the constraint 01:49:39.840 |
- But I mean, a lot of, as you go through different waves 01:49:42.180 |
of technology, I think a lot of what is interesting 01:49:48.120 |
or kind of there can be many, many of a thing, 01:49:51.640 |
and then what ends up still being constrained. 01:50:04.280 |
hopefully close to an infinite amount of expression 01:50:17.080 |
And obviously all of our philanthropic work, 01:50:21.760 |
it's not focused on longevity, but it is focused 01:50:24.240 |
on trying to achieve what I think is a possible goal 01:50:28.080 |
in this century, which is to be able to cure, prevent, or manage all diseases. 01:50:32.520 |
So I certainly think people kind of getting sick 01:50:43.280 |
to push on that, which I mean, we could do a whole, 01:50:49.680 |
I mean, this is with your wife, Priscilla Chan, 01:51:02.000 |
about all the exciting things you're working on there, 01:51:06.120 |
including the sort of moonshot of eradicating disease 01:51:13.320 |
- I don't actually know if you're gonna ever eradicate it, 01:51:20.920 |
So people get diseases, but you can cure them. 01:51:27.440 |
ongoing things that are not gonna ruin your life. 01:51:34.360 |
I think saying that there's gonna be no disease at all 01:51:37.120 |
probably is not possible within the next several decades. 01:51:41.560 |
- Basic thing is increase the quality of life. 01:51:44.600 |
- And maybe keep the finiteness, because it tastes, 01:51:50.880 |
- Maybe that's just being a romantic 20th century human. 01:51:54.800 |
- Maybe, but I mean, it was an intentional decision 01:52:04.360 |
If at the moment of your death, and by the way, 01:52:19.760 |
At the moment of your death, you get to meet God, 01:52:28.120 |
Or maybe a whole conversation, I don't know, it's up to you. 01:52:32.740 |
It's more dramatic when it's just one question. 01:52:35.080 |
- Well, if it's only one question, and I died, 01:52:45.280 |
and my family, like if they were gonna be okay. 01:52:49.600 |
That might depend on the circumstances of my death, 01:52:54.640 |
but I think that in most circumstances that I can think of, 01:52:58.080 |
that's probably the main thing that I would care about. 01:53:07.720 |
- The humility and selflessness, all right, you're in. 01:53:14.600 |
- You're gonna be fine, don't worry, you're in. 01:53:16.520 |
- But I mean, one of the things that I struggle with, 01:53:18.920 |
at least, is on the one hand, that's probably the most, 01:53:29.480 |
but I don't know, one of the things that I just struggle with 01:53:32.440 |
in terms of running this large enterprise is like, 01:53:53.880 |
and at this point, I mean, I care deeply about it, 01:53:58.880 |
but yeah, I think that that's not as obvious of a question. 01:54:07.840 |
You get this ability to impact millions of lives, 01:54:12.800 |
and it's definitely something, billions of lives, 01:54:23.680 |
and I suppose that's the dream of the metaverse, 01:54:29.320 |
where you can have those intimate relationships. 01:54:36.920 |
not just based on who you happen to be next to. 01:54:39.880 |
I think that's what the internet is already doing, 01:54:46.520 |
I mean, I always think, when you think about the metaverse, 01:54:49.920 |
people ask this question about the real world. 01:54:52.120 |
It's like, the virtual world versus the real world. 01:54:54.840 |
And it's like, no, the real world is a combination of the physical world and the virtual world, 01:55:00.060 |
but I think over time, as we get more technology, 01:55:04.040 |
the physical world is becoming less of a percent of the real world. 01:55:08.160 |
And I think that that opens up a lot of opportunities 01:55:10.960 |
for people, 'cause you can work in different places, 01:55:13.440 |
you can stay closer to people who are in different places. 01:55:21.360 |
and then barriers of language, that's a beautiful vision. 01:55:31.800 |
- I think that, well, there are probably a couple 01:55:51.560 |
But I think it gets back to this last question 01:55:53.920 |
that we talked about, about the duality between what you care 01:55:59.120 |
the most about, and then there's this bigger thing 01:56:04.400 |
And I think that in my own life, I think about this tension, 01:56:09.360 |
but I started this whole company, and my life's work is around human connection. 01:56:15.120 |
So I think it's intellectually, probably the thing 01:56:20.120 |
that I go to first is just that human connection 01:56:28.560 |
And I think that it's a thing that our society 01:56:37.000 |
I just remember when I was growing up and in school, 01:56:45.200 |
And it's like, no, what if playing with your friends 01:56:48.360 |
- That sounds like an argument your daughters would make. 01:56:52.360 |
- Well, I mean, I don't know, I just think it's interesting. 01:56:54.360 |
- Like, the homework doesn't even matter, man. 01:56:56.360 |
- Well, I think it's interesting because it's, 01:56:58.560 |
people, I think people tend to think about that stuff 01:57:03.360 |
as wasting time, or that's what you do in the free time 01:57:07.080 |
that you have, but what if that's actually the point? 01:57:12.580 |
But here's maybe a different way of coming at this, 01:57:17.880 |
I mean, I always, there's a rabbi who I've studied with 01:57:25.400 |
who kind of gave me this, we were talking through Genesis 01:57:31.840 |
and they're basically walking through, it's like, 01:57:36.840 |
okay, you go through the seven days of creation, 01:57:48.120 |
Right, so it could have started anywhere, right, 01:58:04.760 |
So I actually think that there's like a compelling argument 01:58:09.760 |
that I think I've always just found meaningful 01:58:23.000 |
that what we should do is to create and build things. 01:58:23.000 |
because I've dedicated my life to creating things 01:58:42.520 |
I mean, getting back to what we talked about earlier, 01:58:48.280 |
So, I mean, these are like the two themes, 01:58:48.280 |
that's kind of what I think about. 01:58:54.440 |
I think this is one hell of an amazing replay experience 01:59:08.080 |
So whoever is using our avatars years from now, 01:59:12.000 |
I hope you had fun and thank you for talking today. 01:59:20.880 |
please check out our sponsors in the description. 01:59:23.680 |
And now let me leave you with the end of the poem 'If' by Rudyard Kipling. 01:59:29.160 |
If you can talk with crowds and keep your virtue 01:59:34.160 |
or walk with Kings, nor lose the common touch, 01:59:37.840 |
If neither foes nor loving friends can hurt you. 01:59:41.240 |
If all men count with you, but none too much. 01:59:52.360 |
yours is the earth and everything that's in it. 01:59:59.600 |
Thank you for listening and hope to see you next time.