
Mark Zuckerberg: Meta, Facebook, Instagram, and the Metaverse | Lex Fridman Podcast #267


Chapters

0:00 Introduction
5:36 Metaverse
25:36 Identity in Metaverse
37:45 Security
42:10 Social Dilemma
1:04:16 Instagram whistleblower
1:09:01 Social media and mental health
1:14:26 Censorship
1:31:35 Translation
1:39:10 Advice for young people
1:44:58 Daughters
1:47:46 Mortality
1:52:19 Question for God
1:55:25 Meaning of life

Whisper Transcript

00:00:00.000 | Let's talk about free speech and censorship.
00:00:02.560 | - You don't build a company like this
00:00:04.040 | unless you believe that people expressing themselves
00:00:06.440 | is a good thing.
00:00:07.280 | - Let me ask you as a father,
00:00:08.620 | does it weigh heavy on you that people get bullied
00:00:11.660 | on social networks?
00:00:12.980 | - I care a lot about how people feel
00:00:14.440 | when they use our products,
00:00:15.440 | and I don't want to build products that make people angry.
00:00:19.320 | - Why do you think so many people dislike you?
00:00:21.980 | Some even hate you.
00:00:25.560 | And how do you regain their trust and support?
00:00:28.120 | (air whooshing)
00:00:30.160 | - The following is a conversation with Mark Zuckerberg,
00:00:32.720 | CEO of Facebook, now called Meta.
00:00:35.720 | Please allow me to say a few words
00:00:38.680 | about this conversation with Mark Zuckerberg,
00:00:41.300 | about social media,
00:00:42.680 | and about what troubles me in the world today,
00:00:45.440 | and what gives me hope.
00:00:47.760 | If this is not interesting to you,
00:00:49.480 | I understand, please skip.
00:00:51.540 | I believe that at its best,
00:00:55.000 | social media puts a mirror to humanity,
00:00:57.760 | and reveals the full complexity of our world,
00:01:01.040 | shining a light on the dark aspects of human nature,
00:01:04.040 | and giving us hope, a way out,
00:01:06.640 | through compassionate but tense chaos of conversation
00:01:09.760 | that eventually can turn into understanding,
00:01:12.760 | friendship, and even love.
00:01:14.620 | But this is not simple.
00:01:17.400 | Our world is not simple.
00:01:19.520 | It is full of human suffering.
00:01:21.420 | I think about the hundreds of millions of people
00:01:24.440 | who are starving, and who live in extreme poverty.
00:01:28.440 | The one million people who take their own life every year,
00:01:31.560 | the 20 million people that attempt it,
00:01:33.880 | and the many, many more millions who suffer quietly,
00:01:37.360 | in ways that numbers can never know.
00:01:39.340 | I'm troubled by the cruelty and pain of war.
00:01:44.600 | Today, my heart goes out to the people of Ukraine.
00:01:48.520 | My grandfather spilled his blood on this land,
00:01:52.200 | held the line as a machine gunner against the Nazi invasion,
00:01:55.760 | surviving impossible odds.
00:01:59.160 | I am nothing without him.
00:02:01.360 | His blood runs in my blood.
00:02:03.520 | My words are useless here.
00:02:07.560 | I send my love, it's all I have.
00:02:10.160 | I hope to travel to Russia and Ukraine soon.
00:02:14.080 | I will speak to citizens and leaders,
00:02:16.760 | including Vladimir Putin.
00:02:19.960 | As I've said in the past, I don't care about access,
00:02:22.640 | fame, money, or power, and I'm afraid of nothing.
00:02:26.720 | But I am who I am, and my goal in conversation
00:02:31.080 | is to understand the human being before me,
00:02:33.420 | no matter who they are, no matter their position.
00:02:36.520 | And I do believe the line between good and evil
00:02:40.000 | runs through the heart of every man.
00:02:42.440 | So this is it.
00:02:45.320 | This is our world.
00:02:47.360 | It is full of hate, violence, and destruction.
00:02:50.620 | But it is also full of love, beauty,
00:02:55.280 | and the insatiable desire to help each other.
00:02:57.520 | The people who run the social networks
00:03:01.280 | that show this world, that show us to ourselves,
00:03:05.360 | have the greatest of responsibilities.
00:03:08.480 | In a time of war, pandemic, atrocity,
00:03:11.720 | we turn to social networks to share real human insights
00:03:14.400 | and experiences, to organize protests and celebrations,
00:03:18.560 | to learn and to challenge our understanding
00:03:20.920 | of the world, of our history, and of our future,
00:03:24.240 | and above all, to be reminded of our common humanity.
00:03:27.420 | When the social networks fail,
00:03:30.460 | they have the power to cause immense suffering.
00:03:33.600 | And when they succeed, they have the power
00:03:35.640 | to lessen that suffering.
00:03:37.780 | This is hard.
00:03:39.280 | It's a responsibility, perhaps almost unlike
00:03:41.880 | any other in history.
00:03:44.040 | This podcast conversation attempts to understand
00:03:46.400 | the man and the company who take this responsibility on,
00:03:50.640 | where they fail, and where they hope to succeed.
00:03:53.220 | Mark Zuckerberg's feet are often held to the fire,
00:03:57.880 | as they should be, and this actually gives me hope.
00:04:01.480 | The power of innovation and engineering,
00:04:03.840 | coupled with the freedom of speech
00:04:05.520 | in the form of its highest ideal,
00:04:07.680 | I believe can solve any problem in the world.
00:04:11.080 | But that's just it.
00:04:12.640 | Both are necessary, the engineer and the critic.
00:04:16.620 | I believe that criticism is essential,
00:04:20.520 | but cynicism is not.
00:04:23.240 | And I worry that in our public discourse,
00:04:25.640 | cynicism too easily masquerades as wisdom,
00:04:29.160 | as truth, becomes viral and takes over,
00:04:32.160 | and worse, suffocates the dreams of young minds
00:04:35.400 | who want to build solutions to the problems of the world.
00:04:39.200 | We need to inspire those young minds.
00:04:41.480 | At least for me, they give me hope.
00:04:43.680 | And one small way I'm trying to contribute
00:04:47.120 | is to have honest conversations like these
00:04:49.480 | that don't just ride the viral wave of cynicism,
00:04:53.200 | but seek to understand the failures and successes
00:04:55.440 | of the past, the problems before us,
00:04:57.900 | and the possible solutions
00:04:59.480 | in this very complicated world of ours.
00:05:02.520 | I'm sure I will fail often,
00:05:05.800 | and I count on the critic to point it out when I do.
00:05:10.200 | But I ask for one thing,
00:05:12.560 | and that is to fuel the fire of optimism,
00:05:15.200 | especially in those who dream to build solutions,
00:05:18.340 | because without that, we don't have a chance
00:05:21.760 | on this too fragile, tiny planet of ours.
00:05:24.600 | This is the Lex Fridman Podcast.
00:05:28.000 | To support it, please check out our sponsors
00:05:30.240 | in the description.
00:05:31.620 | And now, dear friends, here's Mark Zuckerberg.
00:05:36.320 | (papers rustling)
00:05:39.160 | Can you circle all the traffic lights, please?
00:05:43.260 | You actually did it.
00:05:54.840 | That is very impressive performance.
00:05:56.800 | Okay, now we can initiate the interview procedure.
00:06:00.000 | Is it possible that this conversation is happening
00:06:02.960 | inside the metaverse created by you,
00:06:05.360 | by Meta, many years from now,
00:06:07.160 | and we're doing a memory replay experience?
00:06:10.180 | - I don't know the answer to that.
00:06:11.240 | Then I'd be some computer construct
00:06:15.200 | and not the person who created that meta company.
00:06:19.000 | But that would truly be Meta.
00:06:20.500 | - Right, so this could be somebody else
00:06:23.360 | using the Mark Zuckerberg avatar
00:06:26.400 | who can do the Mark and the Lex conversation replay
00:06:29.200 | from four decades ago,
00:06:30.960 | when Meta, it was first--
00:06:33.360 | - I mean, it's not gonna be four decades
00:06:34.800 | before we have photorealistic avatars like this.
00:06:38.000 | So I think we're much closer to that.
00:06:40.080 | - Well, that's something you talk about
00:06:41.400 | is how passionate you are about the idea
00:06:43.520 | of the avatar representing who you are in the metaverse.
00:06:46.580 | So I do these podcasts in person.
00:06:49.260 | You know, I'm a stickler for that
00:06:52.280 | because there's a magic to the in-person conversation.
00:06:55.760 | How long do you think it'll be before
00:06:58.120 | you can have the same kind of magic in the metaverse,
00:07:00.640 | the same kind of intimacy in the chemistry,
00:07:02.600 | whatever the heck is there when we're talking in person?
00:07:06.040 | How difficult is it?
00:07:07.080 | How long before we have it in the metaverse?
00:07:09.280 | - Well, I think this is like the key question, right?
00:07:12.920 | Because the thing that's different about virtual
00:07:17.480 | and hopefully augmented reality
00:07:19.120 | compared to all other forms of digital platforms before
00:07:22.400 | is this feeling of presence, right?
00:07:24.240 | The feeling that you're right,
00:07:25.840 | that you're in an experience
00:07:27.000 | and that you're there with other people
00:07:28.280 | or in another place.
00:07:29.680 | And that's just different from all of the other screens
00:07:32.280 | that we have today, right?
00:07:33.880 | Phones, TVs, all this stuff.
00:07:35.760 | It's, you know, they're trying to, in some cases,
00:07:38.080 | deliver experiences that feel high fidelity,
00:07:43.000 | but at no point do you actually feel like you're in it,
00:07:46.120 | right?
00:07:46.960 | At some level, your content is trying to sort of
00:07:48.960 | convince you that this is a realistic thing
00:07:51.360 | that's happening, but all of the kind of subtle signals
00:07:54.160 | are telling you, no, you're looking at a screen.
00:07:56.440 | So the question about how you develop these systems
00:07:59.480 | is like, what are all of the things that make
00:08:02.760 | the physical world all the different cues?
00:08:04.920 | So I think on visual presence and spatial audio,
00:08:09.920 | we're making reasonable progress.
00:08:15.440 | Spatial audio makes a huge difference.
00:08:16.960 | I don't know if you've tried this experience,
00:08:19.600 | workrooms that we launched where you have meetings.
00:08:21.960 | And, you know, I basically made a rule for, you know,
00:08:24.800 | all of the top, you know, management folks at the company
00:08:27.720 | that they need to be doing standing meetings
00:08:29.520 | in workrooms already, right?
00:08:31.720 | I feel like we got to dog food this, you know,
00:08:33.600 | this is how people are going to work in the future.
00:08:35.760 | So we have to adopt this now.
00:08:37.860 | And there were already a lot of things that I think
00:08:41.120 | feel significantly better than like typical Zoom meetings,
00:08:44.720 | even though the avatars are a lot lower fidelity,
00:08:47.440 | you know, the idea that you have spatial audio,
00:08:50.760 | you're around a table in VR with people.
00:08:53.960 | If someone's talking from over there,
00:08:55.240 | it sounds like it's talking from over there.
00:08:56.840 | You can see, you know, the arm gestures and stuff
00:09:00.280 | feel more natural.
00:09:01.720 | You can have side conversations,
00:09:03.040 | which is something that you can't really do in Zoom.
00:09:04.840 | I mean, I guess you can text someone over,
00:09:06.880 | like out of band, but,
00:09:09.680 | and if you're actually sitting around a table with people,
00:09:12.920 | you know, you can lean over and whisper
00:09:14.600 | to the person next to you and like have a conversation
00:09:16.720 | that you can't, you know, that you can't really do with
00:09:19.480 | in just video communication.
00:09:23.480 | So I think it's interesting in what ways
00:09:27.480 | some of these things already feel more real
00:09:29.800 | than a lot of the technology that we have,
00:09:32.560 | even when the visual fidelity isn't quite there,
00:09:35.000 | but I think it'll get there over the next few years.
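
To make the spatial audio point concrete, here is a toy sketch, assuming a mono voice signal and a simple constant-power pan law; real VR spatial audio (including what Workrooms ships) additionally uses HRTFs, distance attenuation, and room acoustics, so this shows only the core idea that a voice "from over there" gets weighted toward that ear.

```python
# Toy constant-power stereo panner: a simplification of positional voice audio.
# Real systems use HRTFs, distance attenuation, and room acoustics.
import numpy as np

def pan_mono_to_stereo(mono, source_azimuth_deg):
    """source_azimuth_deg: 0 = straight ahead, -90 = hard left, +90 = hard right."""
    theta = np.radians(np.clip(source_azimuth_deg, -90.0, 90.0))
    pan = (theta + np.pi / 2) / np.pi            # map to 0 (left) .. 1 (right)
    left_gain = np.cos(pan * np.pi / 2)          # constant-power pan law
    right_gain = np.sin(pan * np.pi / 2)
    return np.stack([mono * left_gain, mono * right_gain], axis=-1)

voice = np.sin(2 * np.pi * 220 * np.linspace(0, 1, 48000))    # 1 s test tone
stereo = pan_mono_to_stereo(voice, source_azimuth_deg=60)     # mostly right ear
print(stereo.shape)   # (48000, 2)
```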
00:09:37.160 | Now, I mean, you were asking about comparing that
00:09:38.720 | to the true physical world, not Zoom or something like that.
00:09:42.720 | And there, I mean, I think you have feelings of like
00:09:46.960 | temperature, you know, olfactory, obviously touch, right?
00:09:52.080 | We're working on haptic gloves, you know,
00:09:54.480 | the sense that you wanna be able to, you know,
00:09:56.400 | put your hands down and feel some pressure from the table.
00:09:59.680 | You know, all of these things, I think,
00:10:00.640 | are gonna be really critical to be able to keep up
00:10:03.120 | this illusion that you're in a world
00:10:06.720 | and that you're fully present in this world.
00:10:08.720 | But I don't know, I think we're gonna have a lot
00:10:10.560 | of these building blocks within, you know,
00:10:12.600 | the next 10 years or so.
00:10:14.160 | And even before that, I think it's amazing
00:10:15.840 | how much you're just gonna be able to build with software
00:10:18.080 | that sort of masks some of these things.
00:10:21.320 | - I realize I'm going long, but I, you know,
00:10:23.280 | I was told we have a few hours here.
00:10:25.280 | - Yeah, we're here for five to six hours.
00:10:27.200 | - Yeah, so I mean, look, I mean, that's on the shorter end
00:10:30.440 | of the congressional testimonies I've done.
00:10:32.520 | But it's, but, you know, one of the things that we found
00:10:36.880 | with hand presence, right?
00:10:39.480 | So the earliest VR, you just have the headset
00:10:41.960 | and then, and that was cool, you could look around,
00:10:44.400 | you feel like you're in a place,
00:10:45.360 | but you don't feel like you're really able to interact
00:10:47.160 | with it until you have hands.
00:10:48.520 | And then there was this big question where once you got
00:10:50.440 | hands, what's the right way to represent them?
00:10:53.400 | And initially, all of our assumptions was, okay,
00:10:58.400 | when I look down and see my hands in the physical world,
00:11:00.360 | I see an arm and it's gonna be super weird if you see,
00:11:03.560 | you know, just your hand.
00:11:04.800 | But it turned out to not be the case,
00:11:08.000 | because there's this issue with your arms,
00:11:09.800 | which is like, what's your elbow angle?
00:11:11.480 | And if the elbow angle that we're kind of interpolating
00:11:14.720 | based on where your hand is and where your headset is,
00:11:18.600 | actually isn't accurate,
00:11:19.840 | it creates this very uncomfortable feeling where it's like,
00:11:22.600 | oh, like my arm is actually out like this,
00:11:24.360 | but it's like showing it in here.
00:11:25.840 | And that actually broke the feeling of presence a lot more.
00:11:29.440 | Whereas it turns out that if you just show the hands
00:11:31.840 | and you don't show the arms, it actually is fine for people.
00:11:36.120 | So I think that there's a bunch of these interesting
00:11:39.200 | psychological cues where it'll be more about getting
00:11:43.120 | the right details right.
00:11:44.880 | And I think a lot of that will be possible even over,
00:11:47.800 | you know, a few year period or a five year period,
00:11:49.760 | and we won't need like every single thing to be solved
00:11:52.080 | to deliver this like full sense of presence.
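
The elbow problem can be made concrete with a minimal two-bone IK sketch. This is not Meta's solver; it just shows why the elbow is a guess: given only the shoulder and hand positions, every point on a circle around the shoulder-hand axis is a geometrically valid elbow, and the tracker cannot observe which one the real arm chose. The `swivel_hint` parameter below is a hypothetical stand-in for whatever heuristic a runtime might use to pick one.

```python
# Minimal analytic two-bone IK: place an elbow given shoulder and hand positions.
# The choice of point on the "elbow circle" is exactly the unobservable part
# that can disagree with the user's real arm and break presence.
import numpy as np

def guess_elbow(shoulder, hand, upper_len, fore_len, swivel_hint):
    offset = hand - shoulder
    dist = np.linalg.norm(offset)
    axis = offset / dist                              # unit shoulder-to-hand axis
    dist = min(dist, upper_len + fore_len - 1e-6)     # clamp if out of reach

    # Law of cosines: distance along the axis to the elbow's projection,
    # and the radius of the circle of valid elbow positions around that point.
    along = (upper_len**2 - fore_len**2 + dist**2) / (2 * dist)
    radius = np.sqrt(max(upper_len**2 - along**2, 0.0))

    # Pick the point on that circle closest to the hint direction.
    perp = swivel_hint - np.dot(swivel_hint, axis) * axis
    perp /= np.linalg.norm(perp)
    return shoulder + along * axis + radius * perp

shoulder = np.array([0.0, 1.4, 0.0])                  # meters, made-up pose
hand = np.array([0.3, 1.1, 0.2])
print(guess_elbow(shoulder, hand, 0.30, 0.28, np.array([0.0, -1.0, 0.0])))
```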
00:11:54.600 | - Yeah, it's a fascinating psychology question of
00:11:56.720 | what is the essence that makes in-person conversation
00:12:01.720 | special?
00:12:04.360 | It's like emojis are able to convey emotion really well,
00:12:08.000 | even though they're obviously not photorealistic.
00:12:10.560 | And so in that same way, just like you're saying,
00:12:12.440 | just showing the hands is able to create a comfortable
00:12:16.240 | expression with your hands.
00:12:18.160 | So I wonder what that is.
00:12:19.280 | You know, people in the world wars used to write letters
00:12:21.920 | and you can fall in love with just writing letters.
00:12:24.360 | You don't need to see each other in person.
00:12:26.640 | You can convey emotion, you can be depth of experience
00:12:31.640 | with just words.
00:12:32.720 | So that's, I think, a fascinating place to explore
00:12:36.240 | psychology of like, how do you find that intimacy?
00:12:39.240 | - Yeah, and you know, the way that I come to all of this
00:12:41.480 | stuff is, you know, I basically studied psychology
00:12:44.000 | and computer science.
00:12:45.040 | So all of the work that I do is sort of at the intersection
00:12:49.240 | of those things.
00:12:50.080 | I think most of the other big tech companies are building
00:12:53.200 | technology for you to interact with.
00:12:55.040 | What I care about is building technology to help people
00:12:57.200 | interact with each other.
00:12:58.080 | So I think it's a somewhat different approach than most of
00:13:00.360 | the other tech entrepreneurs and big companies
00:13:03.280 | come at this from.
00:13:04.320 | And a lot of the lessons in terms of how I think about
00:13:09.960 | designing products come from some just basic elements
00:13:14.520 | of psychology, right?
00:13:15.920 | In terms of, you know, our brains, you can compare to the
00:13:19.920 | brains of other animals, you know, we're very wired to
00:13:22.960 | specific things, facial expressions, right?
00:13:25.640 | I mean, we're very visual, right?
00:13:28.120 | So compared to other animals, I mean, that's clearly
00:13:30.280 | the main sense that most people have.
00:13:32.920 | But there's a whole part of your brain that's just kind of
00:13:36.120 | focused on reading facial cues.
00:13:38.520 | So, you know, when we're designing the next version of Quest
00:13:42.120 | or the VR headset, a big focus for us is face tracking
00:13:45.760 | and basically eye tracking so you can make eye contact,
00:13:48.800 | which again, isn't really something that you can do
00:13:50.480 | over a video conference.
00:13:51.440 | It's sort of amazing how much, how far video conferencing
00:13:55.440 | has gotten without the ability to make eye contact, right?
00:13:58.600 | It's sort of a bizarre thing if you think about it,
00:14:00.480 | you're like looking at someone's face, you know,
00:14:03.080 | sometimes for an hour when you're in a meeting and like,
00:14:06.640 | you looking at their eyes to them doesn't look like
00:14:09.560 | you're looking at their eyes.
00:14:10.840 | So it's a-- - You're always looking,
00:14:12.840 | I mean, past each other, I guess.
00:14:14.680 | - Yeah. - I guess you're right.
00:14:15.680 | You're not sending that signal. - Well, you're trying to.
00:14:17.320 | - Right, you're trying to. - Like a lot of times,
00:14:18.480 | I mean, or at least I find myself,
00:14:19.760 | I'm trying to look into the other person's eyes.
00:14:21.440 | - But they don't feel like you're looking to their eyes.
00:14:23.080 | - So then the question is, all right,
00:14:24.120 | am I supposed to look at the camera so that way you can,
00:14:26.080 | you know, have a sensation that I'm looking at you?
00:14:28.640 | I think that that's an interesting question.
00:14:30.080 | And then, you know, with VR today, even without eye tracking
00:14:35.080 | and knowing what your eyes are actually looking at,
00:14:37.520 | you can fake it reasonably well, right?
00:14:39.400 | So you can look at like where the head poses,
00:14:42.240 | and if it looks like I'm kind of looking
00:14:43.720 | in your general direction, then you can sort of assume
00:14:46.520 | that maybe there's some eye contact intended,
00:14:48.640 | and you can do it in a way where it's okay.
00:14:50.880 | Maybe not, it's like a, maybe it's not a, you know,
00:14:53.040 | fixated stare, but it's somewhat natural.
00:14:56.880 | But once you have actual eye tracking,
00:14:58.760 | you can do it for real.
00:15:00.200 | And I think that that's really important stuff.
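
The head-pose trick described here can be sketched in a few lines: treat "your head is pointed roughly at my head" as intended eye contact and aim the avatar's eyes directly at the other person. The threshold and names below are illustrative assumptions, not Quest's actual logic; with real eye tracking you would use measured gaze instead of a guess.

```python
# Fake eye contact from head pose alone: if the head forward vector points
# within a small cone of the other person's head, assume eye contact.
import numpy as np

def infer_eye_contact(head_pos, head_forward, other_head_pos, max_deg=15.0):
    to_other = other_head_pos - head_pos
    to_other = to_other / np.linalg.norm(to_other)
    forward = head_forward / np.linalg.norm(head_forward)
    angle = np.degrees(np.arccos(np.clip(np.dot(forward, to_other), -1.0, 1.0)))
    return angle < max_deg

print(infer_eye_contact(np.array([0.0, 1.6, 0.0]),      # my head position
                        np.array([0.05, 0.0, 1.0]),     # my head forward vector
                        np.array([0.2, 1.6, 2.0])))     # your head position
```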
00:15:02.120 | So when I think about Meta's contribution to this field,
00:15:05.320 | I have to say it's not clear to me
00:15:06.640 | that any of the other companies that are focused
00:15:09.520 | on the Metaverse or on virtual and augmented reality
00:15:13.320 | are gonna prioritize putting these features in the hardware
00:15:15.840 | because like everything, they're trade-offs, right?
00:15:18.240 | I mean, it adds some weight to the device.
00:15:21.480 | Maybe it adds some thickness.
00:15:22.760 | You could totally see another company taking the approach
00:15:24.840 | of let's just make the lightest and thinnest thing possible.
00:15:27.600 | But, you know, I want us to design
00:15:29.320 | the most human thing possible
00:15:31.360 | that creates the richest sense of presence.
00:15:34.320 | 'Cause so much of human emotion and expression
00:15:37.920 | comes from these like micro movements.
00:15:39.520 | If I like move my eyebrow, you know, a millimeter,
00:15:41.800 | you will notice and that like means something.
00:15:44.640 | So the fact that we're losing these signals
00:15:46.840 | and a lot of communication I think is a loss.
00:15:49.640 | And so it's not like, okay, there's one feature
00:15:51.720 | and you add this, then it all of a sudden
00:15:53.320 | is gonna feel like we have real presence.
00:15:55.120 | You can sort of look at how the human brain works
00:15:57.840 | and how we express and kind of read emotions.
00:16:01.840 | And you can just build a roadmap of that, you know,
00:16:04.720 | of just what are the most important things
00:16:06.440 | to try to unlock over a five to 10 year period
00:16:08.520 | and just try to make the experience
00:16:10.040 | more and more human and social.
00:16:12.800 | - When do you think would be a moment
00:16:16.640 | like a singularity moment for the metaverse
00:16:19.360 | where there's a lot of ways to ask this question,
00:16:22.280 | but, you know, people will have many
00:16:25.920 | or most of their meaningful experiences
00:16:28.800 | in the metaverse versus the real world.
00:16:31.320 | And actually it's interesting to think about the fact
00:16:33.400 | that a lot of people are having the most important moments
00:16:36.640 | of their life happen in the digital sphere,
00:16:39.040 | especially now during COVID, you know,
00:16:41.640 | like even falling in love or meeting friends
00:16:45.000 | or getting excited about stuff
00:16:46.400 | that is happening on the 2D digital plane.
00:16:49.640 | When do you think the metaverse
00:16:50.800 | will provide those experiences for a large number,
00:16:54.080 | like a majority of the population?
00:16:54.920 | - Yeah, I think it's a really good question.
00:16:57.240 | There was someone, you know, I read this piece
00:17:00.240 | that framed this as, a lot of people think
00:17:03.720 | that the metaverse is about a place,
00:17:06.040 | but one definition of this is it's about a time
00:17:10.400 | when basically immersive digital worlds
00:17:12.880 | become the primary way that we live our lives
00:17:17.080 | and spend our time.
00:17:18.720 | I think that that's a reasonable construct.
00:17:20.160 | And from that perspective, you know,
00:17:21.840 | I think you also just want to look at this as a continuation
00:17:25.520 | because it's not like, okay, we are building digital worlds,
00:17:28.920 | but we don't have that today.
00:17:29.760 | I think, you know, you and I probably already live
00:17:32.240 | a very large part of our life in digital worlds.
00:17:34.560 | They're just not 3D immersive virtual reality,
00:17:37.160 | but, you know, I do a lot of meetings over video
00:17:39.720 | or I spend a lot of time writing things
00:17:41.600 | over email or WhatsApp or whatever.
00:17:44.480 | So what is it gonna take to get there
00:17:45.960 | for kind of the immersive presence version of this,
00:17:48.600 | which I think is what you're asking.
00:17:50.960 | And for that, I think that there's just a bunch
00:17:52.880 | of different use cases, right?
00:17:54.720 | And I think when you're building virtual worlds,
00:17:59.760 | technology, I think you're,
00:18:02.880 | a lot of it is just you're managing this duality
00:18:05.760 | where on the one hand,
00:18:06.960 | you want to build these elegant things that can scale
00:18:10.080 | and, you know, have billions of people use them
00:18:12.080 | and get value from them.
00:18:13.320 | And then on the other hand,
00:18:14.480 | you're fighting this kind of ground game where it's just,
00:18:18.080 | there are just a lot of different use cases
00:18:19.640 | and people do different things
00:18:20.840 | and like you want to be able to unlock them.
00:18:22.200 | So the first ones that we basically went after
00:18:25.920 | were gaming with Quest and social experiences.
00:18:30.360 | And this is, you know,
00:18:31.200 | it goes back to when we started working on virtual reality.
00:18:33.360 | My theory at the time was basically,
00:18:36.200 | people thought about it as gaming,
00:18:39.400 | but if you look at all computing platforms up to that point,
00:18:44.400 | you know, gaming is a huge part.
00:18:46.080 | It was a huge part of PCs.
00:18:47.440 | It was a huge part of mobile,
00:18:49.440 | but it was also very decentralized, right?
00:18:51.960 | There wasn't, you know, for the most part,
00:18:54.080 | you know, one or two gaming companies,
00:18:55.680 | there were a lot of gaming companies
00:18:57.440 | and gaming is somewhat hits based.
00:18:58.680 | I mean, we're getting some games
00:18:59.960 | that have more longevity,
00:19:01.440 | but in general, you know,
00:19:03.880 | there were a lot of different games out there.
00:19:06.560 | But on PC and on mobile,
00:19:10.720 | the companies that focused on communication
00:19:13.680 | and social interaction,
00:19:15.080 | there tended to be a smaller number of those.
00:19:17.280 | And that ended up being just as important of a thing
00:19:19.160 | as all of the games that you did combined.
00:19:21.360 | I think productivity is another area.
00:19:23.160 | That's obviously something
00:19:24.000 | we've historically been less focused on,
00:19:26.000 | but I think it's gonna be really important.
00:19:27.200 | - With workroom, do you mean productivity
00:19:29.560 | in the collaborative aspect?
00:19:30.920 | - Yeah, I think that there's a workrooms aspect of this,
00:19:34.360 | like a meeting aspect.
00:19:35.360 | And then I think that there's like a, you know,
00:19:37.600 | Word, Excel, you know, productivity.
00:19:40.360 | You're like, you're working or coding or what,
00:19:43.520 | knowledge work, right?
00:19:44.400 | It's as opposed to just meetings.
00:19:46.760 | So you can kind of go through
00:19:47.760 | all these different use cases.
00:19:49.440 | You know, gaming, I think we're well on our way.
00:19:51.280 | Social, I think we're just the kind of preeminent company
00:19:56.080 | that focuses on this.
00:19:57.040 | And I think that that's already on quest becoming the,
00:20:00.280 | you know, if you look at the list of what are the top apps,
00:20:03.280 | you know, social apps are already,
00:20:04.840 | you know, number one, two, three.
00:20:06.480 | So that's kind of becoming a critical thing.
00:20:09.200 | But I don't know, I would imagine for someone like you,
00:20:12.560 | it'll be, you know, until we get, you know,
00:20:15.840 | a lot of the work things dialed in, right?
00:20:17.720 | When this is just like much more adopted
00:20:20.840 | and clearly better than Zoom for VC,
00:20:24.280 | when, you know, if you're doing your coding
00:20:26.360 | or your writing or whatever it is in VR,
00:20:29.400 | which it's not that far off to imagine that
00:20:31.240 | because pretty soon you're just gonna be able
00:20:32.640 | to have a screen that's bigger than, you know,
00:20:34.200 | it'll be your ideal setup and you can bring it with you
00:20:36.120 | and put it on anywhere
00:20:37.520 | and have your kind of ideal workstation.
00:20:39.760 | So I think that there are a few things to work out on that,
00:20:42.560 | but I don't think that that's more than,
00:20:44.720 | you know, five years off.
00:20:46.920 | And then you'll get a bunch of other things
00:20:48.120 | that like aren't even possible
00:20:50.200 | or you don't even think about using a phone
00:20:52.000 | or PC for today, like fitness, right?
00:20:54.440 | So, I mean, I know that you're,
00:20:56.440 | we were talking before about how you're into running
00:20:58.960 | and like, I'm really into, you know,
00:21:00.280 | a lot of things around fitness as well,
00:21:02.600 | you know, different things in different places.
00:21:04.120 | I got really into hydrofoiling recently.
00:21:06.040 | - Nice, I saw a video.
00:21:08.360 | - Yeah, and surfing and I used to fence competitively,
00:21:12.440 | I like run, so.
00:21:13.680 | - And you were saying that you were thinking
00:21:14.840 | about trying different martial arts
00:21:16.360 | and I tried to trick you and convince you
00:21:18.200 | into doing Brazilian Jiu-Jitsu
00:21:19.960 | or you actually mentioned that that was one
00:21:21.520 | you're curious about and I--
00:21:22.960 | - Was that a trick?
00:21:24.120 | - Yeah, I don't know.
00:21:26.040 | We're in the metaverse now.
00:21:27.400 | - Yeah, no, I mean, I took that seriously.
00:21:29.560 | I thought that that was a real suggestion.
00:21:34.360 | - That would be an amazing chance
00:21:36.400 | if we ever step on the mat together
00:21:37.800 | and just like roll around.
00:21:39.040 | I'll show you some moves.
00:21:40.120 | - Well, give me a year to train
00:21:41.800 | and then we can do it.
00:21:42.640 | - This is like, you know, you've seen "Rocky IV"
00:21:44.800 | where the Russian faces off the American,
00:21:46.360 | I'm the Russian in this picture.
00:21:48.000 | And then you're the Rocky, the underdog
00:21:49.960 | that gets to win in the end.
00:21:51.360 | - The idea of me as Rocky and like fighting is--
00:21:55.280 | - If he dies, he dies.
00:21:58.080 | (laughing)
00:21:59.120 | Sorry, just had to say.
00:22:00.720 | - I mean-- - Anyway, yeah.
00:22:02.640 | - But I mean, a lot of aspects of fitness,
00:22:05.880 | you know, I don't know if you've tried supernatural
00:22:08.760 | on Quest or--
00:22:10.160 | - So first of all, can I just comment on the fact
00:22:12.080 | every time I played around with Quest 2,
00:22:15.000 | I just, I get giddy every time I step into virtual reality.
00:22:18.760 | So you mentioned productivity and all those kinds of things.
00:22:20.880 | That's definitely something I'm excited about,
00:22:23.800 | but really I just love the possibilities
00:22:26.760 | of stepping into that world.
00:22:28.840 | Maybe it's the introvert in me,
00:22:30.440 | but it just feels like the most convenient way
00:22:34.080 | to travel into worlds,
00:22:37.480 | into worlds that are similar to the real world
00:22:40.320 | or totally different.
00:22:41.480 | So it's like "Alice in Wonderland,"
00:22:42.840 | just try out crazy stuff.
00:22:44.680 | The possibilities are endless.
00:22:45.760 | And I just, I personally,
00:22:47.800 | and just love, get excited for stepping
00:22:52.800 | in those virtual worlds.
00:22:53.960 | So I'm a huge fan.
00:22:55.000 | In terms of the productivity as a programmer,
00:22:58.280 | I spend most of my day programming.
00:23:00.040 | That's really interesting also,
00:23:02.000 | but then you have to develop the right IDEs.
00:23:04.320 | You have to develop, like there has to be a threshold
00:23:07.360 | where a large amount of the programming community
00:23:09.320 | moves there.
00:23:10.520 | But the collaborative aspects that are possible
00:23:13.040 | in terms of meetings, in terms of the,
00:23:15.800 | when two coders are working together,
00:23:18.320 | I mean, that, the possibilities there
00:23:19.960 | are super, super exciting.
00:23:21.720 | - I think that in building this,
00:23:24.160 | we sort of need to balance,
00:23:25.760 | there are gonna be some new things
00:23:28.120 | that you just couldn't do before,
00:23:29.640 | and those are gonna be the amazing experiences.
00:23:31.480 | So teleporting to any place, right?
00:23:33.360 | Whether it's a real place or something that people made.
00:23:36.960 | I mean, some of the experiences around
00:23:40.520 | how we can build stuff in new ways,
00:23:42.040 | where a lot of the stuff that,
00:23:44.760 | when I'm coding stuff, it's like, all right, you code it,
00:23:46.480 | and then you build it, and then you see it afterwards.
00:23:48.200 | But increasingly, it's gonna be possible to,
00:23:50.440 | you're in a world and you're building the world
00:23:52.720 | as you are in it and kind of manipulating it.
00:23:55.720 | One of the things that we showed at our Inside the Lab
00:23:58.680 | for recent artificial intelligence progress
00:24:02.440 | is this BuilderBot program, where now you are,
00:24:05.640 | you can just talk to it and say,
00:24:07.080 | "Hey, okay, I'm in this world,
00:24:08.520 | "put some trees over there and it'll do that."
00:24:10.680 | And like, "All right, put some bottles of water
00:24:13.160 | "on our picnic blanket and it'll do that,
00:24:16.960 | "and you're in the world."
00:24:17.800 | And I think there are gonna be new paradigms for coding.
00:24:19.920 | So yeah, there are gonna be some things
00:24:22.080 | that I think are just pretty amazing,
00:24:24.600 | especially the first few times that you do them,
00:24:26.560 | that you're like, "Whoa,
00:24:28.440 | "I've never had an experience like this."
00:24:30.600 | But most of your life, I would imagine,
00:24:34.240 | is not doing things that are amazing for the first time.
00:24:37.880 | A lot of this in terms of,
00:24:39.600 | I mean, just answering your question from before around,
00:24:42.040 | what is it gonna take before you're spending
00:24:43.440 | most of your time in this?
00:24:45.040 | Well, first of all, let me just say this as an aside,
00:24:48.200 | the goal isn't to have people spend
00:24:49.600 | a lot more time in computing.
00:24:50.960 | - I'm asking for myself.
00:24:52.360 | When will I spend all my time in that?
00:24:54.960 | - Yeah, it's to make computing more natural.
00:24:57.200 | But I think you will spend most of your computing time
00:25:02.720 | in this when it does the things
00:25:04.880 | that you use computing for somewhat better.
00:25:07.280 | So maybe having your perfect workstation
00:25:10.520 | is a 5% improvement on your coding productivity.
00:25:15.120 | Maybe it's not like a completely new thing.
00:25:18.400 | But I mean, look, if I could increase the productivity
00:25:21.600 | of every engineer at Meta by 5%,
00:25:24.000 | we'd buy those devices for everyone.
00:25:27.640 | And I imagine a lot of other companies would too.
00:25:30.360 | And that's how you start getting to the scale
00:25:31.840 | that I think makes this rival
00:25:34.480 | some of the bigger computing platforms that exist today.
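
The 5% argument is easy to sanity-check with back-of-the-envelope arithmetic; the dollar figures below are hypothetical placeholders, not Meta's actual numbers.

```python
# Rough payback math behind "a 5% productivity gain pays for the devices."
engineer_cost_per_year = 300_000   # hypothetical fully loaded cost, USD/year
productivity_gain = 0.05           # the 5% improvement being discussed
headset_cost = 1_500               # hypothetical device plus peripherals, USD

value_per_engineer = engineer_cost_per_year * productivity_gain
print(value_per_engineer, value_per_engineer / headset_cost)   # 15000.0, 10.0
```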
00:25:37.040 | - Let me ask you about identity.
00:25:38.280 | We talked about the avatar.
00:25:40.440 | How do you see identity in the metaverse?
00:25:42.720 | Should the avatar be tied to your identity?
00:25:46.400 | Or can I be anything in the metaverse?
00:25:49.280 | Like can I be whatever the heck I want?
00:25:52.160 | Can I even be a troll?
00:25:53.640 | So there's exciting, freeing possibilities,
00:25:57.440 | and there's the darker possibilities too.
00:25:59.480 | - Yeah, I mean, I think that there's gonna be a range.
00:26:03.200 | So we're working on, for expression and avatars,
00:26:07.640 | on one end of the spectrum are kind of expressive
00:26:13.080 | and cartoonish avatars.
00:26:14.920 | And then on the other end of the spectrum
00:26:16.440 | are photorealistic avatars.
00:26:18.480 | And I just think the reality is that
00:26:20.800 | there are gonna be different use cases for different things.
00:26:23.640 | And I guess there's another axis.
00:26:25.120 | So if you're going from photorealistic to expressive,
00:26:28.680 | there's also like representing you directly
00:26:31.120 | versus like some fantasy identity.
00:26:33.720 | And I think that there are gonna be things
00:26:35.360 | on all ends of that spectrum too.
00:26:37.880 | So you'll want photo,
00:26:39.680 | in some experience you might wanna be like
00:26:42.080 | a photorealistic dragon, right?
00:26:44.280 | Or if I'm playing Onward,
00:26:46.960 | or just this military simulator game,
00:26:48.960 | I think getting to be more photorealistic as a soldier
00:26:53.640 | in that could enhance the experience.
00:26:56.760 | There are times when I'm hanging out with friends
00:26:59.560 | where I want them to know it's me.
00:27:02.080 | So a kind of cartoonish or expressive version of me is good.
00:27:06.200 | But there are also experiences like,
00:27:09.440 | now VR Chat does this well today,
00:27:11.600 | where a lot of the experience is kind of dressing up
00:27:14.920 | and wearing a fantastical avatar
00:27:17.800 | that's almost like a meme or is humorous.
00:27:19.600 | So you come into an experience
00:27:21.320 | and it's almost like you have like a built-in icebreaker
00:27:24.560 | because like you see people and you're just like,
00:27:27.320 | all right, I'm cracking up at what you're wearing
00:27:29.960 | because that's funny.
00:27:30.800 | And it's just like, where'd you get that?
00:27:31.920 | Or, oh, you made that?
00:27:32.760 | That's, it's awesome.
00:27:34.360 | Whereas, okay, if you're going into a work meeting,
00:27:38.920 | maybe a photorealistic version of your real self
00:27:41.800 | is gonna be the most appropriate thing for that.
00:27:43.600 | So I think the reality is there aren't going to be,
00:27:47.400 | it's not just gonna be one thing.
00:27:49.040 | My own sense of kind of how you wanna express identity
00:27:55.280 | online has sort of evolved over time in that,
00:27:57.560 | early days in Facebook, I thought, okay,
00:27:59.120 | people are gonna have one identity.
00:28:00.280 | And now I think that's clearly not gonna be the case.
00:28:02.080 | I think you're gonna have all these different things
00:28:04.400 | and there's utility in being able to do different things.
00:28:07.320 | So some of the technical challenges
00:28:10.120 | that I'm really interested in around it are,
00:28:12.880 | how do you build the software to allow people
00:28:14.840 | to seamlessly go between them?
00:28:17.080 | So say, so you could view them as just completely
00:28:23.200 | discrete points on a spectrum,
00:28:25.160 | but let's talk about the metaverse economy for a second.
00:28:28.520 | Let's say I buy a digital shirt
00:28:31.240 | for my photorealistic avatar, which by the way,
00:28:34.440 | I think at the time where we're spending a lot of time
00:28:36.480 | in the metaverse doing a lot of our work meetings
00:28:38.760 | in the metaverse and et cetera,
00:28:40.280 | I would imagine that the economy around virtual clothing
00:28:42.480 | as an example is going to be quite big.
00:28:44.680 | Why wouldn't I spend almost as much money
00:28:46.800 | in investing in my appearance or expression
00:28:49.800 | for my photorealistic avatar for meetings
00:28:52.440 | as I would for the, whatever I'm gonna wear
00:28:54.360 | in my video chat.
00:28:55.520 | But the question is, okay, so you,
00:28:56.680 | let's say you buy some shirt for your photorealistic avatar.
00:28:59.800 | Wouldn't it be cool if there was a way to basically
00:29:03.520 | translate that into a more expressive thing
00:29:07.920 | for your kind of cartoonish or expressive avatar?
00:29:11.160 | And there are multiple ways to do that.
00:29:12.520 | You can view them as two discrete points and okay,
00:29:14.880 | maybe if a designer sells one thing,
00:29:18.160 | then it actually comes in a pack and there's two
00:29:19.880 | and you can use either one on that.
00:29:22.320 | But I actually think this stuff might exist more
00:29:24.400 | as a spectrum in the future.
00:29:26.080 | And that's what, I do think the direction on some of the
00:29:30.240 | AI advances that is happening to be able to,
00:29:34.000 | especially stuff around like style transfer,
00:29:35.920 | being able to take a piece of art or express something
00:29:39.800 | and say, okay, paint me this photo in the style of Gauguin
00:29:44.800 | or whoever it is that you're interested in.
00:29:48.000 | Take this shirt and put it in the style
00:29:51.240 | of what I've designed for my expressive avatar.
00:29:53.760 | I think that's gonna be pretty compelling.
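
The style-transfer idea referenced here maps onto the classic Gatys et al. formulation: keep the content of one image while matching the Gram-matrix statistics of another. The sketch below is a condensed, generic version of that technique using a pretrained VGG-19 from torchvision; the layer choices and weights are conventional defaults, the shirt/style inputs are illustrative, and this is not Meta's avatar-clothing pipeline.

```python
# Condensed neural style transfer: optimize an image so its content features
# match the content image and its Gram matrices match the style image.
import torch
import torch.nn.functional as F
from torchvision import models

vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()
for m in vgg:
    if isinstance(m, torch.nn.ReLU):
        m.inplace = False                 # keep activations intact for autograd
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {1, 6, 11, 20, 29}         # ReLU outputs used for style statistics
CONTENT_LAYER = 21                        # mid-level layer used for content

def features(x):
    style, content = [], None
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            style.append(x)
        if i == CONTENT_LAYER:
            content = x
    return style, content

def gram(feat):                           # feat: [1, C, H, W]
    _, c, h, w = feat.shape
    f = feat.view(c, h * w)
    return f @ f.t() / (c * h * w)

def restyle(content_img, style_img, steps=200, style_weight=1e5):
    """content_img: e.g. a photoreal shirt texture; style_img: the target style.
    Both are tensors of shape [1, 3, H, W], normalized for VGG."""
    target = content_img.detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([target], lr=0.02)
    style_grams = [gram(f) for f in features(style_img)[0]]
    content_feat = features(content_img)[1]
    for _ in range(steps):
        opt.zero_grad()
        s_feats, c_feat = features(target)
        loss = F.mse_loss(c_feat, content_feat)
        for sf, sg in zip(s_feats, style_grams):
            loss = loss + style_weight * F.mse_loss(gram(sf), sg)
        loss.backward()
        opt.step()
    return target.detach()
```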
00:29:56.880 | - And so the fashion, you might be buying like a generator,
00:30:00.000 | like a closet that generates a style.
00:30:03.240 | And then like with GANs,
00:30:05.480 | they'll be able to infinitely generate outfits
00:30:08.120 | thereby making it, so the reason I wear the same thing
00:30:10.720 | all the time is I don't like choice.
00:30:12.360 | You've talked about the same thing,
00:30:15.120 | but now you don't even have to choose.
00:30:16.640 | Your closet generates your outfit for you every time.
00:30:19.520 | So you have to live with the outfit it generates.
00:30:23.400 | - I mean, you could do that, although,
00:30:25.440 | no, I think some people will,
00:30:27.440 | but I think like, I think that there's going to be
00:30:29.960 | a huge aspect of just people doing creative commerce here.
00:30:34.960 | So I think that there is going to be a big market
00:30:37.800 | around people designing digital clothing.
00:30:41.000 | But the question is, if you're designing digital clothing,
00:30:42.960 | do you need to design, if you're the designer,
00:30:44.800 | do you need to make it for each kind of specific,
00:30:47.600 | discrete point along a spectrum?
00:30:49.920 | Or are you just designing it for kind of
00:30:52.000 | a photorealistic case or an expressive case,
00:30:54.080 | or can you design one and have it translate
00:30:56.160 | across these things?
00:30:57.920 | You know, if I buy a style from a designer
00:31:01.040 | who I care about and now I'm a dragon,
00:31:03.120 | is there a way to morph that so it like goes on the dragon
00:31:05.520 | in a way that makes sense?
00:31:07.640 | And that I think is an interesting AI problem
00:31:09.440 | because you're probably not going to make it so that,
00:31:11.880 | like that designers have to go design for all those things.
00:31:14.680 | But the more useful the digital content is
00:31:17.240 | that you buy in a lot of uses, in a lot of use cases,
00:31:21.280 | the more that economy will just explode.
00:31:23.440 | And that's a lot of what, you know, all of the,
00:31:25.840 | you know, we were joking about NFTs before,
00:31:29.680 | but I think a lot of the promise here is that
00:31:32.560 | if the digital goods that you buy are not just tied
00:31:35.000 | to one platform or one use case,
00:31:37.040 | they end up being more valuable,
00:31:38.280 | which means that people are more willing
00:31:39.800 | and more likely to invest in them.
00:31:41.280 | And that just spurs the whole economy.
00:31:44.240 | - But the question is,
00:31:45.400 | so that's a fascinating positive aspect,
00:31:47.280 | but the potential negative aspect is that
00:31:50.800 | you can have people concealing their identity
00:31:52.680 | in order to troll or even not people, bots.
00:31:57.040 | So how do you know in the metaverse
00:31:58.760 | that you're talking to a real human or an AI
00:32:02.040 | or a well-intentioned human?
00:32:03.920 | Is that something you think about,
00:32:04.960 | something you're concerned about?
00:32:06.920 | - Well, let's break that down into a few different cases.
00:32:10.240 | I mean, 'cause knowing that you're talking to someone
00:32:11.960 | who has good intentions is something
00:32:13.520 | that I think is not even solved in pretty much anywhere.
00:32:17.800 | But if you're talking to someone who's a dragon,
00:32:20.320 | I think it's pretty clear
00:32:21.160 | that they're not representing themselves as a person.
00:32:23.280 | I think probably the most pernicious thing
00:32:25.280 | that you want to solve for is,
00:32:28.560 | I think probably one of the scariest ones
00:32:32.000 | is how do you make sure
00:32:32.840 | that someone isn't impersonating you?
00:32:34.760 | Right, so like, okay,
00:32:36.120 | you're in a future version of this conversation
00:32:39.280 | and we have photorealistic avatars
00:32:41.680 | and we're doing this in workrooms
00:32:43.280 | or whatever the future version of that is.
00:32:44.960 | And someone walks in who like looks like me.
00:32:48.840 | How do you know that that's me?
00:32:50.280 | And one of the things that we're thinking about is,
00:32:54.680 | you know, it's still a pretty big AI project
00:32:57.480 | to be able to generate photorealistic avatars
00:32:59.480 | that basically can like,
00:33:00.840 | they work like these codecs of you, right?
00:33:03.320 | And so you kind of have a map from your headset
00:33:06.160 | and whatever sensors of what your body's actually doing
00:33:08.000 | and it takes the model in it
00:33:09.200 | and it kind of displays it in VR.
00:33:11.160 | But there's a question which is,
00:33:12.640 | should there be some sort of biometric security
00:33:15.400 | so that like when I put on my VR headset
00:33:18.200 | or I'm going to go use that avatar,
00:33:20.920 | I need to first prove that I am that.
00:33:24.320 | And I think you probably are gonna want something like that.
00:33:26.760 | So that's, you know, as we're developing these technologies,
00:33:31.120 | we're also thinking about the security for things like that
00:33:34.640 | because people aren't gonna want to be impersonated.
00:33:37.080 | That's a huge security issue.
00:33:39.520 | Then you just get the question of people hiding behind
00:33:44.520 | fake accounts to do malicious things,
00:33:48.320 | which is not gonna be unique to the metaverse.
00:33:51.000 | Although, you know, certainly in a environment
00:33:56.000 | where it's more immersive
00:33:57.320 | and you have more of a sense of presence,
00:33:58.640 | it could be more painful.
00:34:01.720 | But this is obviously something
00:34:03.160 | that we've just dealt with for years
00:34:06.480 | in social media and the internet more broadly.
00:34:08.720 | And there, I think there have been a bunch of tactics
00:34:13.120 | that I think we've just evolved to,
00:34:17.760 | you know, we've built up these different AI systems
00:34:20.480 | to basically get a sense of,
00:34:21.880 | is this account behaving in the way that a person would?
00:34:26.360 | And it turns out, you know,
00:34:28.360 | so in all of the work that we've done around,
00:34:31.800 | you know, we call it community integrity
00:34:33.320 | and it's basically like policing harmful content
00:34:36.920 | and trying to figure out where to draw the line.
00:34:38.360 | And there are all these like really hard
00:34:39.800 | and philosophical questions around like,
00:34:41.280 | where do you draw the line on some of this stuff?
00:34:42.880 | And the thing that I've kind of found the most effective
00:34:47.800 | is as much as possible trying to figure out
00:34:51.200 | who are the inauthentic accounts
00:34:53.360 | or where are the accounts that are behaving
00:34:55.520 | in an overall harmful way at the account level,
00:34:58.400 | rather than trying to get into like policing
00:35:00.400 | what they're saying, right?
00:35:01.400 | Which I think the metaverse is gonna be even harder
00:35:03.680 | because the metaverse, I think,
00:35:04.760 | will have more properties of,
00:35:07.280 | it's almost more like a phone call, right?
00:35:09.160 | Or like, or you're, you know,
00:35:10.480 | it's not like I post a piece of content
00:35:12.360 | and is that piece of content good or bad?
00:35:14.680 | So I think more of this stuff will have to be done
00:35:16.200 | at the level of the account.
00:35:19.400 | But this is the area where, you know,
00:35:21.680 | between the kind of, you know,
00:35:25.880 | counterintelligence teams that we built up
00:35:27.640 | inside the company and like years of building
00:35:29.840 | just different AI systems to basically detect
00:35:34.280 | what is a real account and what isn't.
00:35:36.840 | I'm not saying we're perfect,
00:35:37.880 | but like this is an area where I just think
00:35:39.880 | we are like years ahead of basically anyone else
00:35:43.520 | in the industry in terms of having built those capabilities.
00:35:48.080 | And I think that that just is gonna be incredibly important
00:35:50.160 | for this next wave of things.
00:35:51.480 | - And like you said, on a technical level,
00:35:53.440 | on a philosophical level,
00:35:54.920 | it's an incredibly difficult problem to solve.
00:35:57.640 | By the way, I would probably like to open source my avatar
00:36:03.160 | so that could be like millions of Lex's walking around,
00:36:05.880 | just like an army.
00:36:07.000 | - Like Agent Smith.
00:36:08.440 | - Agent Smith, yeah, exactly.
00:36:10.640 | So the Unity ML folks built a copy of me
00:36:15.640 | and they sent it to me.
00:36:17.520 | So there's a person running around
00:36:20.160 | and I'd just been doing reinforcement learning on it.
00:36:22.400 | I was gonna release it because, you know,
00:36:26.400 | just to have sort of like thousands of Lex's
00:36:29.800 | doing reinforcement learning.
00:36:31.120 | So they fall over naturally,
00:36:32.400 | they have to learn how to like walk around and stuff.
00:36:34.840 | So I love that idea of this tension
00:36:37.200 | between biometric security, you want to have one identity,
00:36:40.320 | but then certain avatars, you might have to have many.
00:36:43.600 | I don't know which is better security,
00:36:45.360 | sort of flooding the world with Lex's
00:36:48.120 | and thereby achieving security
00:36:49.400 | or really being protective of your identity.
00:36:51.760 | I have to ask a security question actually.
00:36:53.840 | - Well, how does flooding the world with Lex's
00:36:56.080 | help me know in our conversation
00:36:58.040 | that I'm talking to the real Lex?
00:36:59.600 | - I completely destroy the trust
00:37:01.520 | in all my relationships then, right?
00:37:03.000 | If I flood, 'cause then it's, yeah, that...
00:37:06.840 | - I think that one's not gonna work that well for you.
00:37:09.480 | - It's not gonna work that, for the original copy.
00:37:11.720 | - It probably fits some things,
00:37:13.320 | like if you're a public figure
00:37:14.760 | and you're trying to have, you know, a bunch of,
00:37:18.440 | if you're trying to show up
00:37:19.440 | in a bunch of different places in the future,
00:37:21.040 | you'll be able to do that in the metaverse.
00:37:23.480 | So that kind of replication I think will be useful.
00:37:26.120 | But I do think that you're gonna want a notion of like,
00:37:29.240 | I am talking to the real one.
00:37:31.480 | - Yeah, yeah, especially if the fake ones
00:37:34.440 | start outperforming you in all your private relationships
00:37:37.440 | and then you're left behind.
00:37:38.720 | I mean, that's a serious concern I have with clones.
00:37:41.040 | Again, the things I think about.
00:37:43.320 | Okay, so I recently got, I use QNAP NAS storage.
00:37:48.320 | So just storage for video and stuff.
00:37:50.200 | And I recently got hacked.
00:37:51.440 | It was the first time for me with ransomware.
00:37:53.520 | It's not me personally, it's all QNAP devices.
00:37:58.680 | So the question that people have
00:38:00.760 | is about security in general,
00:38:03.280 | because I was doing a lot of the right things
00:38:05.000 | in terms of security and nevertheless,
00:38:06.800 | ransomware basically disabled my device.
00:38:10.880 | Is that something you think about?
00:38:12.000 | What are the different steps you could take
00:38:13.720 | to protect people's data on the security front?
00:38:16.880 | - I think that there's different solutions for,
00:38:20.360 | and strategies where it makes sense to have stuff
00:38:23.600 | kind of put behind a fortress, right?
00:38:25.400 | So the centralized model
00:38:27.160 | versus the decentralizing.
00:38:30.200 | Then I think both have strengths and weaknesses.
00:38:32.080 | So I think anyone who says,
00:38:33.000 | okay, just decentralize everything,
00:38:34.960 | that'll make it more secure.
00:38:36.600 | I think that that's tough because,
00:38:38.760 | you know, I mean, the advantage of something like,
00:38:41.000 | you know, encryption is that, you know,
00:38:44.480 | we run the largest encrypted service
00:38:46.360 | in the world with WhatsApp.
00:38:47.640 | And we're one of the first to roll out
00:38:49.520 | a multi-platform encryption service.
00:38:52.600 | And that's something that I think was a big advance
00:38:55.920 | for the industry.
00:38:57.120 | And one of the promises that we can basically make
00:38:59.240 | because of that,
00:39:00.320 | our company doesn't see, when you're sending
00:39:02.920 | an encrypted message,
00:39:05.840 | what the content is of what you're sharing.
00:39:07.800 | So that way, if someone hacks Meta's servers,
00:39:10.560 | they're not gonna be able to access,
00:39:13.480 | you know, the WhatsApp message that,
00:39:15.200 | you know, you're sending to your friend.
00:39:16.880 | And that I think matters a lot to people
00:39:19.040 | because obviously if someone is able to compromise
00:39:21.840 | a company's servers and that company has hundreds
00:39:23.840 | of millions or billions of people,
00:39:25.160 | then that ends up being a very big deal.
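
The property described here, that the relaying server only ever holds ciphertext, is easy to demonstrate with an off-the-shelf library. The sketch below uses PyNaCl's public-key Box as a stand-in; WhatsApp actually uses the Signal protocol with ratcheting keys, so this shows the concept, not the product's implementation.

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl):
# the server relays only ciphertext, so compromising it exposes no contents.
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; private keys never leave the device.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts to Bob's public key on her device.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# The server stores and forwards only this opaque blob.
print(bytes(ciphertext).hex()[:32], "...")

# Bob decrypts on his device with his private key and Alice's public key.
print(Box(bob_sk, alice_sk.public_key).decrypt(ciphertext))
```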
00:39:27.880 | The flip side of that is, okay,
00:39:29.320 | all the content is on your phone.
00:39:31.040 | You know, are you following security best practices
00:39:34.720 | on your phone?
00:39:35.800 | If you lose your phone, all your content is gone.
00:39:38.000 | So that's an issue.
00:39:39.360 | You know, maybe you go back up your content
00:39:41.480 | from WhatsApp or some other service
00:39:44.000 | in iCloud or something,
00:39:45.680 | but then you're just at Apple's whims
00:39:47.720 | about are they gonna go turn over the data
00:39:50.280 | to, you know, some government,
00:39:51.840 | or are they gonna get hacked?
00:39:53.320 | So a lot of the time it is useful
00:39:55.800 | to have data in a centralized place too,
00:39:58.760 | because then you can train systems
00:40:01.440 | that can just do much better personalization.
00:40:04.680 | I think that in a lot of cases,
00:40:07.280 | you know, centralized systems can offer,
00:40:09.840 | you know, especially if you're a serious company,
00:40:13.360 | you're running the state of the art stuff
00:40:15.960 | and you have red teams attacking your own stuff
00:40:19.520 | and you're putting out bounty programs
00:40:24.200 | and trying to attract some of the best hackers in the world
00:40:26.200 | to go break into your stuff all the time.
00:40:27.760 | So any system is gonna have security issues,
00:40:30.440 | but I think the best way forward
00:40:33.400 | is to basically try to be as aggressive
00:40:35.440 | and open about hardening the systems as possible,
00:40:37.520 | not trying to kind of hide
00:40:39.080 | and pretend that there aren't gonna be issues,
00:40:40.680 | which I think is over time
00:40:41.640 | why a lot of open source systems
00:40:43.760 | have gotten relatively more secure,
00:40:45.480 | because they're open and, you know,
00:40:46.960 | it's not rather than pretending
00:40:48.360 | that there aren't gonna be issues,
00:40:49.400 | just people surface them quicker.
00:40:50.840 | So I think you want to adopt that approach as a company
00:40:53.680 | and just constantly be hardening yourself.
00:40:56.560 | - Trying to stay one step ahead of the attackers.
00:41:00.000 | - It's an inherently adversarial space.
00:41:03.200 | - Yeah.
00:41:04.040 | - Right, I think it's an interesting,
00:41:04.880 | security is interesting
00:41:07.200 | because of the different kind of threats
00:41:09.080 | that we've managed over the last five years,
00:41:11.760 | there are ones where basically the adversaries
00:41:15.480 | keep on getting better and better.
00:41:16.840 | So trying to kind of interfere with,
00:41:19.880 | security is certainly one area of this.
00:41:23.080 | If you have like nation states
00:41:24.360 | that are trying to interfere in elections or something,
00:41:27.120 | like they're kind of evolving their tactics.
00:41:29.480 | Whereas on the other hand,
00:41:31.120 | I don't wanna be too simplistic about it,
00:41:33.160 | but like if someone is saying something hateful,
00:41:36.680 | people usually aren't getting smarter and smarter
00:41:38.720 | about how they say hateful things, right?
00:41:40.480 | So maybe there's some element of that,
00:41:42.600 | but it's a very small dynamic compared to,
00:41:46.040 | you know, how advanced attackers
00:41:47.400 | and some of these other places get over time.
00:41:49.960 | - I believe most people are good,
00:41:51.360 | so they actually get better over time
00:41:53.640 | at being less hateful,
00:41:55.400 | 'cause they realize it's not fun being hateful.
00:41:59.080 | That's at least the belief I have.
00:42:01.960 | But first, bathroom break.
00:42:04.960 | - Sure. - Okay.
00:42:05.800 | So we'll come back to AI,
00:42:08.160 | but let me ask some difficult questions now.
00:42:11.000 | Social Dilemma is a popular documentary
00:42:13.800 | that raised concerns about the effects of social media
00:42:16.200 | on society.
00:42:17.560 | You responded with a point by point rebuttal
00:42:19.960 | titled "What the Social Dilemma Gets Wrong."
00:42:23.120 | People should read that.
00:42:25.000 | I would say the key point they make
00:42:26.680 | is because social media is funded by ads,
00:42:29.600 | algorithms want to maximize attention and engagement,
00:42:33.280 | and an effective way to do so
00:42:36.240 | is to get people angry at each other,
00:42:38.920 | increase division, and so on.
00:42:40.920 | Can you steel man their criticisms and arguments
00:42:44.240 | that they make in the documentary
00:42:46.240 | as a way to understand the concern
00:42:48.560 | and as a way to respond to it?
00:42:51.600 | - Well, yeah, I think that that's a good conversation
00:42:55.600 | to have.
00:42:56.880 | I don't happen to agree with the conclusions,
00:43:00.400 | and I think that they make a few assumptions
00:43:02.120 | that are just very big jumps
00:43:06.280 | that I don't think are reasonable to make.
00:43:08.880 | But I understand overall why people would be concerned
00:43:13.880 | that our business model and ads in general,
00:43:18.440 | we do make more money
00:43:20.720 | as people use the service more in general, right?
00:43:23.360 | So as a kind of basic assumption,
00:43:26.360 | okay, do we have an incentive for people
00:43:28.360 | to build a service that people use more?
00:43:31.400 | Yes, on a lot of levels.
00:43:32.920 | I mean, we think what we're doing is good.
00:43:34.560 | So we think that if people are finding it useful,
00:43:37.240 | they'll use it more.
00:43:38.640 | Or if you just look at it as this sort of,
00:43:41.200 | if the only thing we cared about is money,
00:43:43.400 | which it's not, for anyone who knows me,
00:43:46.320 | but okay, we're a company.
00:43:47.920 | So let's say you just kind of simplified it down to that,
00:43:51.440 | then would we want people to use the services more?
00:43:55.000 | But then you get to the second question,
00:43:57.360 | which is, does kind of getting people agitated
00:44:01.960 | make them more likely to use the services more?
00:44:07.560 | And I think from looking at other media in the world,
00:44:12.560 | especially TV and, you know, there's the old news adage,
00:44:17.240 | if it bleeds, it leads.
00:44:18.640 | Like, I think that this is, there are,
00:44:21.280 | I think there are a bunch of reasons
00:44:24.080 | why someone might think that
00:44:26.160 | that kind of provocative content
00:44:30.480 | would be the most engaging.
00:44:32.600 | Now, what I've always found is two things.
00:44:35.600 | One is that what grabs someone's attention in the near term
00:44:39.120 | is not necessarily something
00:44:40.800 | that they're going to appreciate having seen
00:44:43.600 | or going to be the best over the long term.
00:44:45.280 | So I think what a lot of people get wrong
00:44:47.400 | is that we're not, I'm not building this company
00:44:50.360 | to like make the most money
00:44:51.840 | or get people to spend the most time on this
00:44:53.560 | in the next quarter or the next year, right?
00:44:55.560 | I mean, I've been doing this for 17 years at this point,
00:44:58.920 | and I'm still relatively young,
00:45:00.360 | and I have a lot more that I wanna do
00:45:02.000 | over the coming decades.
00:45:03.320 | So like, I think that it's too simplistic to say,
00:45:08.320 | hey, this might increase time in the near term,
00:45:11.760 | therefore it's what you're gonna do,
00:45:13.360 | because I actually think a deeper look
00:45:15.280 | at kind of what my incentives are,
00:45:17.160 | the incentives of a company that are focused on the long term
00:45:20.440 | is to basically do what people
00:45:22.640 | are gonna find valuable over time,
00:45:24.080 | not what is gonna draw people's attention today.
00:45:26.720 | The other thing that I'd say is that
00:45:28.520 | I think a lot of times people look at this
00:45:31.440 | from the perspective of media,
00:45:33.160 | or kind of information or civic discourse,
00:45:37.720 | but one other way of looking at this
00:45:40.240 | is just that, okay, I'm a product designer, right?
00:45:42.480 | Our company, we build products,
00:45:45.120 | and a big part of building a product
00:45:47.280 | is not just the function and utility
00:45:48.960 | of what you're delivering,
00:45:50.120 | but the feeling of how it feels, right?
00:45:51.960 | And we spend a lot of time talking about virtual reality
00:45:55.600 | and how the kind of key aspect of that experience
00:45:58.760 | is the feeling of presence,
00:46:00.600 | which it's a visceral thing.
00:46:01.920 | It's not just about the utility that you're delivering,
00:46:03.880 | it's about like the sensation.
00:46:05.880 | And similarly, I care a lot about how people feel
00:46:10.360 | when they use our products,
00:46:11.360 | and I don't want to build products that make people angry.
00:46:15.240 | I mean, that's not, I think,
00:46:16.960 | what we're here on this earth to do,
00:46:18.360 | to build something that people spend a bunch of time doing
00:46:22.080 | that just kind of makes them angrier at other people.
00:46:23.960 | I mean, I think that that's not good.
00:46:26.240 | That's not what I think would be
00:46:30.040 | sort of a good use of our time
00:46:31.960 | or a good contribution to the world.
00:46:33.600 | So, okay, it's like people,
00:46:35.680 | they tell us on a per content basis,
00:46:38.080 | does this thing, do I like it?
00:46:39.800 | Do I love it?
00:46:40.640 | Does it make me angry?
00:46:41.560 | Does it make me sad?
00:46:42.880 | And based on that,
00:46:44.040 | I mean, we choose to basically show content
00:46:47.160 | that makes people angry less,
00:46:49.080 | because of course, right?
00:46:51.160 | If you're designing a product
00:46:52.640 | and you want people to be able to connect
00:46:56.240 | and feel good over a long period of time,
00:46:59.120 | then that's naturally what you're gonna do.
00:47:02.040 | So, I don't know, I think overall,
00:47:04.360 | I understand at a high level,
00:47:10.520 | if you're not thinking too deeply about it,
00:47:13.640 | why that argument might be appealing.
00:47:16.080 | But I just think if you actually look
00:47:19.200 | at what our real incentives are,
00:47:20.880 | not just like, if we were trying to optimize
00:47:25.080 | for the next week, but like as people working on this,
00:47:28.920 | like why are we here?
00:47:30.440 | And I think it's pretty clear
00:47:32.880 | that that's not actually how you would wanna
00:47:34.240 | design the system.
00:47:35.680 | I guess one other thing that I'd say is that,
00:47:37.720 | while we're focused on the ads business model,
00:47:40.760 | I do think it's important to note that
00:47:43.080 | a lot of these issues are not unique to ads.
00:47:45.320 | I mean, so take like a subscription news business model,
00:47:47.840 | for example, I think that has,
00:47:50.200 | just as many potential pitfalls.
00:47:52.160 | Maybe if someone's paying for a subscription,
00:47:55.200 | you don't get paid per piece of content that they look at,
00:47:57.880 | but say for example, I think like a bunch
00:48:02.600 | of the partisanship that we see
00:48:04.400 | could potentially be made worse
00:48:07.320 | by you have these kind of partisan news organizations
00:48:12.320 | that basically sell subscriptions,
00:48:15.720 | and they're only gonna get people on one side
00:48:17.520 | to basically subscribe to them.
00:48:19.840 | So their incentive is not to print content
00:48:22.720 | or produce content that's kind of centrist
00:48:26.080 | or down the line either.
00:48:27.760 | I bet that what a lot of them find is that
00:48:30.000 | if they produce stuff that's kind of more polarizing
00:48:32.440 | or more partisan,
00:48:33.600 | then that is what gets them more subscribers.
00:48:36.800 | So I think that this stuff is all,
00:48:40.200 | there's no perfect business model.
00:48:41.880 | Everything has pitfalls.
00:48:43.400 | The thing that I think is great about advertising
00:48:46.440 | is it makes the consumer services free,
00:48:48.720 | which if you believe that everyone should have a voice
00:48:50.840 | and everyone should be able to connect,
00:48:52.000 | then that's a great thing,
00:48:53.920 | as opposed to building a luxury service
00:48:55.840 | that not everyone can afford.
00:48:57.200 | But look, I mean, every business model,
00:48:59.160 | you have to be careful
00:49:00.000 | about how you're implementing what you're doing.
00:49:02.440 | - You responded to a few things there.
00:49:04.600 | You spoke to the fact that
00:49:06.360 | there is a narrative of malevolence,
00:49:08.960 | like you're leaning into them making people angry
00:49:13.600 | just because it makes more money in the short term,
00:49:15.680 | that kind of thing.
00:49:16.520 | So you responded to that.
00:49:17.840 | But there's also a kind of reality of human nature.
00:49:22.040 | Just like you spoke about,
00:49:23.640 | there's fights, arguments we get in,
00:49:26.840 | and we don't like ourselves afterwards,
00:49:28.720 | but we got into them anyway.
00:49:30.320 | So our long-term growth is,
00:49:32.880 | I believe for most of us,
00:49:34.520 | has to do with learning,
00:49:36.520 | challenging yourself, improving,
00:49:39.680 | being kind to each other,
00:49:40.960 | finding a community of people
00:49:42.920 | that you connect with on a real human level,
00:49:47.920 | all that kind of stuff.
00:49:50.520 | But it does seem when you look at social media
00:49:54.660 | that a lot of fights break out,
00:49:56.560 | a lot of arguments break out,
00:49:58.200 | a lot of viral content ends up being
00:50:01.920 | sort of outrage in one direction or the other.
00:50:04.960 | And so it's easy from that to infer the narrative
00:50:08.000 | that social media companies
00:50:10.280 | are letting this outrage become viral.
00:50:13.960 | And so they're increasing the division in the world.
00:50:16.840 | I mean, perhaps you can comment on that
00:50:18.840 | or further, how can you be,
00:50:21.160 | how can you push back on this narrative?
00:50:25.800 | How can you be transparent about this battle?
00:50:28.440 | Because I think it's not just motivation or financials,
00:50:33.440 | it's a technical problem too,
00:50:36.000 | which is how do you improve
00:50:39.440 | long-term wellbeing of human beings?
00:50:43.040 | - I think that going through some of the design decisions
00:50:47.960 | would be a good conversation.
00:50:49.680 | But first, I actually think
00:50:51.800 | you'd acknowledge that
00:50:54.280 | that narrative is somewhat anecdotal.
00:50:56.920 | And I think it's worth grounding this conversation
00:50:59.520 | in the actual research that has been done on this,
00:51:02.600 | which by and large finds that social media
00:51:07.600 | is not a large driver of polarization.
00:51:10.800 | And I mean, there's been a number of economists
00:51:14.800 | and social scientists and folks who have studied this.
00:51:17.440 | And polarization varies a lot around the world.
00:51:21.320 | Social media is basically in every country,
00:51:23.080 | Facebook's in pretty much every country
00:51:24.600 | except for China and maybe North Korea.
00:51:27.160 | And you see different trends in different places
00:51:32.160 | where in a lot of countries polarization is declining,
00:51:37.000 | in some it's flat, in the US it's risen sharply.
00:51:41.640 | So the question is,
00:51:43.200 | what are the unique phenomena in the different places?
00:51:46.000 | And I think for the people who are trying to say,
00:51:47.640 | hey, social media is the thing that's doing this,
00:51:50.200 | I think that that clearly doesn't hold up
00:51:52.920 | because social media is a phenomenon
00:51:54.480 | that is pretty much equivalent
00:51:56.040 | in all of these different countries.
00:51:57.720 | And you have researchers like this economist at Stanford,
00:52:00.600 | Matthew Gentzkow, who's just written at length about this.
00:52:04.400 | And there are a bunch of books by political scientists
00:52:10.320 | and folks like Ezra Klein;
00:52:11.960 | "Why We're Polarized" basically goes through
00:52:13.760 | this decades-long analysis in the US before I was born,
00:52:18.160 | basically talking about some of the forces
00:52:20.640 | in kind of partisan politics and Fox News
00:52:24.240 | and different things that predate the internet
00:52:26.480 | in a lot of ways that I think
00:52:28.360 | are likely larger contributors.
00:52:30.040 | So to the contrary on this,
00:52:32.200 | not only is it pretty clear that social media
00:52:35.320 | is not a major contributor,
00:52:37.560 | but most of the academic studies that I've seen
00:52:40.040 | actually show that social media use
00:52:42.600 | is correlated with lower polarization.
00:52:45.360 | Gentzkow, the same person who just did the study
00:52:48.640 | that I cited about longitudinal polarization
00:52:51.640 | across different countries,
00:52:53.040 | also did a study that basically showed
00:52:57.480 | that if you looked after the 2016 election in the US,
00:53:02.120 | the voters who were the most polarized
00:53:04.320 | were actually the ones who were not on the internet.
00:53:07.560 | So, and there have been recent other studies,
00:53:10.280 | I think in Europe and around the world,
00:53:12.800 | basically showing that as people stop using social media,
00:53:16.720 | they tend to get more polarized.
00:53:19.200 | Then there's a deeper analysis around,
00:53:21.360 | okay, well, polarization actually isn't even one thing.
00:53:24.760 | 'Cause having different opinions on something isn't,
00:53:27.080 | I don't think that that's by itself bad.
00:53:28.920 | What people who study this say is most problematic
00:53:33.920 | is what they call affective polarization,
00:53:35.920 | which is basically, do you have negative feelings
00:53:39.280 | towards people of another group?
00:53:41.040 | And the way that a lot of scholars study this
00:53:43.760 | is they basically ask a group,
00:53:46.800 | would you let your kids marry someone of group X?
00:53:50.600 | Whatever the groups are that you're worried
00:53:53.320 | that someone might have negative feelings towards.
00:53:55.520 | And in general, use of social media
00:53:58.160 | has corresponded to decreases
00:53:59.880 | in that kind of affective polarization.
00:54:01.960 | So I just wanna say, I think we should talk
00:54:04.720 | through the design decisions and how we handle
00:54:06.840 | the kind of specific pieces of content.
00:54:10.720 | But overall, I think it's just worth grounding
00:54:13.280 | that discussion in the research that's existed
00:54:15.640 | that I think overwhelmingly shows
00:54:17.480 | that the mainstream narrative around this is just not right.
00:54:21.080 | - But the narrative does take hold
00:54:23.120 | and it's compelling to a lot of people.
00:54:27.200 | There's another question I'd like to ask you on this.
00:54:31.320 | I was looking at various polls and saw that you're
00:54:33.800 | one of the most disliked tech leaders today.
00:54:38.160 | 54% unfavorable rating.
00:54:41.440 | Elon Musk is 23%.
00:54:43.280 | Basically every tech leader
00:54:46.280 | has a very high unfavorable rating.
00:54:48.000 | Maybe you can help me understand that.
00:54:50.640 | Why do you think so many people dislike you?
00:54:53.320 | Some even hate you.
00:54:56.880 | And how do you regain their trust and support?
00:54:59.160 | Given everything you just said,
00:55:00.920 | why are you losing the battle in explaining to people
00:55:08.120 | what actual impact social media has on society?
00:55:11.120 | - Well, I'm curious if that's a US survey or world.
00:55:16.760 | - It is US, yeah.
00:55:17.960 | - So I think that there's a few dynamics.
00:55:19.360 | One is that our brand has been somewhat uniquely challenged
00:55:24.360 | in the US compared to other places.
00:55:29.000 | It's not that there aren't issues elsewhere,
00:55:29.920 | I mean, in other countries we have issues too.
00:55:32.640 | But I think in the US there was this dynamic
00:55:35.840 | where if you look at like the net sentiment
00:55:38.880 | of kind of coverage or attitude towards us,
00:55:42.880 | before 2016, I think that there were probably
00:55:44.880 | very few months if any where it was negative.
00:55:47.440 | And since 2016, I think there have probably
00:55:49.440 | been very few months, if any, that it's been positive.
00:55:51.840 | - The politics.
00:55:53.000 | - So, but I think it's a specific thing.
00:55:55.320 | And this is very different from other places.
00:55:56.920 | So I think in a lot of other countries in the world,
00:55:59.800 | the sentiment towards meta and our services
00:56:02.400 | is extremely positive.
00:56:04.800 | In the US we have more challenges.
00:56:06.560 | And I think compared to other companies,
00:56:08.760 | you can look at certain industries,
00:56:12.480 | I think if you look at it from like a partisan perspective,
00:56:16.280 | not from like a political perspective,
00:56:18.000 | but just kind of culturally,
00:56:19.040 | it's like there are people
00:56:19.880 | who are probably more left of center
00:56:21.040 | and there are people who are more right of center
00:56:22.480 | and there's kind of blue America and red America.
00:56:25.840 | There are certain industries that I think
00:56:27.600 | maybe one half of the country
00:56:29.840 | has a more positive view towards than another.
00:56:32.160 | And I think we're in a,
00:56:33.640 | one of the positions that we're in
00:56:37.960 | that I think is really challenging
00:56:39.760 | is that because of a lot of the content decisions
00:56:42.880 | that we've basically had to arbitrate,
00:56:45.760 | and because we're not a partisan company, right?
00:56:49.520 | We're not a Democrat company or a Republican company.
00:56:52.600 | We're trying to make the best decisions we can
00:56:55.040 | to help people connect
00:56:56.040 | and help people have as much voice as they can
00:56:59.320 | while having some rules
00:57:01.080 | because we're running a community.
00:57:02.880 | The net effect of that is that
00:57:06.240 | we're kind of constantly making decisions
00:57:08.720 | that piss off people in both camps.
00:57:11.720 | And the effect that I've sort of seen
00:57:16.520 | is that when we make a decision
00:57:18.520 | that's a controversial one,
00:57:22.960 | that's gonna upset, say, about half the country,
00:57:26.480 | those decisions are all negative-sum,
00:57:30.320 | from a brand perspective,
00:57:31.920 | because it's not like,
00:57:33.400 | like if we make that decision in one way
00:57:35.600 | and say half the country is happy
00:57:37.800 | about that particular decision that we make,
00:57:39.960 | they tend to not say,
00:57:41.160 | "Oh, sweet, Meta got that one right."
00:57:43.680 | They're just like,
00:57:44.520 | "Ah, you didn't mess that one up."
00:57:46.040 | But their opinion doesn't tend to go up by that much.
00:57:48.920 | Whereas the people who kind of are on the other side of it
00:57:51.920 | are like, "God, how could you mess that up?
00:57:54.920 | Like, how could you possibly think
00:57:56.360 | that like that piece of content is okay
00:57:58.200 | and should be up and should not be censored?"
00:58:00.080 | And so, whereas if you take it down,
00:58:09.240 | the people who thought it should be taken down
00:58:10.720 | are like, "All right, fine, great.
00:58:12.760 | You didn't mess that one up."
00:58:14.080 | So our internal assessment of,
00:58:16.080 | and kind of analytics on our brand
00:58:17.920 | are basically any time one of these big controversial things
00:58:20.560 | comes up in society,
00:58:22.000 | our brand goes down with half of the country.
00:58:26.080 | And then if you just kind of
00:58:27.600 | extrapolate that out,
00:58:29.600 | it's just been very challenging for us to try to navigate
00:58:33.200 | what is a polarizing country in a principled way,
00:58:36.640 | where we're not trying to kind of hew to one side
00:58:38.600 | or the other, we're trying to do what we think
00:58:40.040 | is the right thing.
00:58:41.040 | But that's what I think is the right thing
00:58:43.240 | for us to do though.
00:58:44.080 | So, I mean, that's what we'll try to keep doing.
00:58:47.360 | - Just as a human being, how does it feel though,
00:58:50.160 | when you're giving so much of your day-to-day life
00:58:53.360 | to try to heal division,
00:58:55.720 | to try to do good in the world, as we've talked about,
00:58:59.680 | that so many people in the US, the place you call home,
00:59:03.720 | have a negative view of you as a leader,
00:59:07.920 | as a human being and the company you love?
00:59:11.380 | - Well, I mean, it's not great, but I,
00:59:18.280 | I mean, look, if I wanted people to think positively
00:59:21.040 | about me as a person,
00:59:23.540 | I don't know, I'm not sure you'd go build a company.
00:59:27.980 | I mean, it's like--
00:59:28.820 | - Or a social media company.
00:59:30.300 | It seems exceptionally difficult to do
00:59:32.060 | with a social media company.
00:59:32.900 | - Yeah, so, I mean, I don't know, there is a dynamic
00:59:36.740 | where a lot of the other people running these companies,
00:59:40.820 | internet companies, have sort of stepped back
00:59:44.060 | and they just do things that are sort of,
00:59:46.420 | I don't know, less controversial.
00:59:49.500 | And some of it may be that they just get tired over time,
00:59:52.740 | but it's, so I don't know, I think that,
00:59:57.020 | running a company is hard,
00:59:58.100 | building something at scale is hard.
00:59:59.860 | You only really do it for a long period of time
01:00:01.620 | if you really care about what you're doing.
01:00:04.220 | And yeah, so I mean, it's not great,
01:00:07.540 | but look, I think that at some level,
01:00:10.380 | whether 25% of people dislike you
01:00:14.980 | or 75% of people dislike you,
01:00:18.060 | your experience as a public figure is gonna be
01:00:21.040 | that there's a lot of people who dislike you, right?
01:00:23.340 | So I actually am not sure how different it is.
01:00:28.340 | Certainly, the country's gotten more polarized
01:00:32.740 | and we in particular have gotten more controversial
01:00:36.020 | over the last five years or so,
01:00:39.220 | but I don't know, I kind of think like as a public figure
01:00:45.220 | and leader of one of these enterprises--
01:00:48.540 | - Comes with a job.
01:00:49.420 | - Part of, yeah, part of what you do is like,
01:00:51.440 | and look, you can't just,
01:00:52.760 | the answer can't just be ignore it, right?
01:00:54.620 | Because like a huge part of the job is like,
01:00:56.920 | you need to be getting feedback and internalizing feedback
01:00:59.360 | on how you can do better.
01:01:00.760 | But I think increasingly what you need to do
01:01:02.500 | is be able to figure out,
01:01:04.520 | who are the kind of good faith critics
01:01:07.980 | who are criticizing you
01:01:09.460 | because they're trying to help you do a better job
01:01:12.500 | rather than tear you down.
01:01:13.960 | And those are the people who I just think
01:01:15.400 | you have to cherish and listen very closely
01:01:19.100 | to the things that they're saying,
01:01:20.280 | because I think it's just as dangerous
01:01:23.020 | to tune out everyone who says anything negative
01:01:25.920 | and just listen to the people who are kind of positive
01:01:29.320 | and support you as it would be psychologically
01:01:32.720 | to pay attention to trying to make people
01:01:34.480 | who are never gonna like you, like you.
01:01:36.600 | So I think that's just kind of a dance
01:01:38.880 | that people have to do,
01:01:40.080 | but I mean, you kind of develop more of a feel for like,
01:01:44.720 | who actually is trying to accomplish
01:01:46.280 | the same types of things in the world
01:01:48.400 | and who has different ideas about how to do that
01:01:51.420 | and how can I learn from those people?
01:01:52.860 | And like, yeah, we get stuff wrong.
01:01:54.820 | And when the people whose opinions I respect
01:01:57.780 | call me out on getting stuff wrong,
01:01:59.820 | that hurts and makes me wanna do better.
01:02:02.100 | But I think at this point, I'm pretty tuned to just,
01:02:04.660 | all right, if someone, if I know
01:02:05.820 | they're kind of like operating in bad faith
01:02:08.100 | and they're not really trying to help,
01:02:10.780 | then I don't know, it doesn't,
01:02:13.220 | I think over time, it just doesn't bother you that much.
01:02:15.300 | - But you are surrounded by people
01:02:17.420 | that believe in the mission, that love you.
01:02:19.680 | Are there friends or colleagues in your inner circle
01:02:23.600 | you trust that call you out on your bullshit
01:02:26.560 | whenever your thinking may be misguided,
01:02:28.600 | as it is for leaders at times?
01:02:30.920 | - I think we have a famously open company culture
01:02:33.480 | where we sort of encourage that kind of dissent internally,
01:02:39.400 | which is why there's so much material internally
01:02:42.240 | that can leak out with people sort of disagreeing
01:02:44.500 | is because that's sort of the culture.
01:02:47.520 | Our management team, I think it's a lot of people,
01:02:50.560 | there's some newer folks who come in,
01:02:51.960 | there's some folks who've kind of been there for a while,
01:02:54.780 | but there's a very high level of trust.
01:02:56.800 | And I would say it is a relatively
01:02:58.960 | confrontational group of people.
01:03:01.040 | And my friends and family, I think will push me on this.
01:03:04.520 | But look, it's not just,
01:03:06.360 | but I think you need some diversity, right?
01:03:09.240 | It can't just be people who are your friends and family.
01:03:13.740 | It's also, I mean, there are journalists or analysts
01:03:17.920 | or peer executives at other companies
01:03:22.920 | or other people who sort of are insightful
01:03:27.640 | about thinking about the world, certain politicians
01:03:30.840 | or people kind of in that sphere
01:03:32.720 | who I just think have like very insightful perspectives
01:03:36.200 | who even if they would,
01:03:39.840 | they come at the world from a different perspective,
01:03:41.640 | which is sort of what makes the perspective so valuable.
01:03:44.380 | But I think fundamentally
01:03:46.240 | we're trying to get to the same place
01:03:47.600 | in terms of helping people connect more,
01:03:50.720 | helping the whole world function better,
01:03:53.480 | not just one place or another.
01:03:55.620 | And I don't know, I mean, those are the people
01:03:59.660 | whose opinions really matter to me.
01:04:02.940 | And that's how I learn on a day-to-day basis.
01:04:05.680 | People are constantly sending me comments on stuff
01:04:07.880 | or links to things they found interesting.
01:04:10.160 | And I don't know, it's kind of constantly evolving
01:04:13.440 | this model of the world
01:04:14.520 | and kind of what we should be aspiring to be.
01:04:16.880 | - You've talked about, you have a famously open culture
01:04:20.080 | which comes with the criticism and the painful experiences.
01:04:25.940 | So let me ask you another difficult question.
01:04:31.000 | Frances Haugen, the Facebook whistleblower,
01:04:33.480 | leaked the internal Instagram research
01:04:35.840 | into teenagers and wellbeing.
01:04:38.080 | Her claim is that Instagram is choosing profit
01:04:41.260 | over wellbeing of teenage girls.
01:04:43.120 | So Instagram is quote, toxic for them.
01:04:46.720 | Your response titled,
01:04:48.080 | "What our research really says
01:04:51.320 | about teen wellbeing and Instagram,"
01:04:53.120 | says, "No, Instagram research shows
01:04:55.520 | that 11 of 12 wellbeing issues,
01:04:58.800 | teenage girls who said they struggle
01:05:02.720 | with those difficult issues
01:05:03.800 | also said that Instagram made them better
01:05:05.900 | rather than worse."
01:05:07.600 | Again, can you steel man and defend the point
01:05:10.960 | and Frances Haugen's characterization of the study
01:05:14.800 | and then help me understand the positive
01:05:17.040 | and negative effects of Instagram
01:05:19.000 | and Facebook on young people?
01:05:20.880 | - So there are certainly questions
01:05:24.220 | around teen mental health that are really important.
01:05:26.600 | It's hard to, as a parent, it's hard to imagine
01:05:29.500 | any set of questions that are sort of more important.
01:05:32.040 | I mean, I guess maybe other aspects of physical health
01:05:34.080 | or wellbeing probably come to that level.
01:05:37.240 | But like, these are really important questions, right?
01:05:40.580 | Which is why we dedicate teams to studying them.
01:05:43.780 | I don't think the internet or social media
01:05:48.280 | are unique in having these questions.
01:05:49.980 | I mean, I think people,
01:05:51.240 | and there've been sort of magazines
01:05:53.160 | promoting certain body types
01:05:55.180 | for women and kids for decades.
01:05:58.500 | But we really care about this stuff.
01:06:01.440 | So we wanted to study it.
01:06:02.760 | And of course, we didn't expect
01:06:05.000 | that everything was gonna be positive all the time.
01:06:07.000 | So, I mean, the reason why you study this stuff
01:06:08.520 | is to try to improve and get better.
01:06:10.760 | So, I mean, look, the place where I disagree
01:06:13.200 | with the characterization, first,
01:06:15.320 | I thought some of the reporting and coverage of it
01:06:18.720 | just took the whole thing out of proportion
01:06:20.840 | and that it focused on, as you said,
01:06:22.660 | I think there were like 20 metrics in there
01:06:24.280 | and on 18 or 19, the effect of using Instagram
01:06:27.620 | was neutral or positive on the teens' wellbeing.
01:06:30.920 | And there was one area where I think
01:06:34.160 | it showed that we needed to improve
01:06:35.500 | and we took some steps to try to do that
01:06:37.760 | after doing the research.
01:06:38.840 | But I think having the coverage just focus on that one
01:06:41.720 | without focusing on the,
01:06:43.080 | you know, I mean, I think an accurate characterization
01:06:45.120 | would have been that kids using Instagram,
01:06:47.960 | or not kids, teens,
01:06:49.300 | is generally positive for their mental health.
01:06:53.760 | But of course, that was not the narrative that came out.
01:06:55.580 | So I think it's hard to,
01:06:56.740 | that's not a kind of logical thing to steel man,
01:07:01.480 | but I sort of disagree with that overall characterization.
01:07:04.080 | I think anyone sort of looking at this objectively would.
01:07:08.840 | But then, you know, I mean,
01:07:11.760 | there is this sort of intent critique
01:07:15.080 | that I think you were getting at before,
01:07:16.400 | which says, you know,
01:07:17.800 | it assumes some sort of malevolence, right?
01:07:19.720 | It's like, which it's really hard for me to
01:07:23.640 | really wrap my head around this,
01:07:26.580 | because as far as I know,
01:07:29.840 | it's not clear that any of the other tech companies
01:07:31.780 | are doing this kind of research.
01:07:33.360 | So why the narrative should form that, because we did research,
01:07:37.680 | you know, because we were studying an issue
01:07:38.840 | 'cause we wanted to understand it and improve,
01:07:40.800 | and took steps after that to try to improve it,
01:07:43.600 | the interpretation of that would be
01:07:46.300 | that we did the research
01:07:47.920 | and tried to sweep it under the rug.
01:07:49.280 | It just, it sort of is like,
01:07:53.680 | I don't know, it's beyond credibility to me,
01:07:55.960 | that like, that's the accurate description of the actions
01:07:59.040 | that we've taken compared to the others in the industry.
01:08:01.200 | So I don't know, that's kind of, that's my view on it.
01:08:05.280 | These are really important issues,
01:08:06.600 | and there's a lot of stuff
01:08:08.000 | that I think we're gonna be working on
01:08:09.120 | related to teen mental health for a long time,
01:08:11.420 | including trying to understand this better.
01:08:14.280 | And I would encourage everyone else in the industry
01:08:15.760 | to do this too.
01:08:16.600 | - Yeah, I would love there to be open conversations
01:08:21.920 | and a lot of great research being released internally,
01:08:25.720 | and then also externally.
01:08:27.980 | It doesn't make me feel good to see press
01:08:32.780 | obviously get way more clicks
01:08:35.020 | when they say negative things about social media.
01:08:38.900 | Objectively speaking, I can just tell
01:08:41.560 | that there's a hunger to say negative things
01:08:44.440 | about social media.
01:08:46.060 | And I don't understand how that's supposed to lead
01:08:50.760 | to an open conversation about the positives
01:08:53.060 | and the negatives, the concerns about social media,
01:08:56.020 | especially when you're doing that kind of research.
01:08:59.260 | I mean, I don't know what to do with that,
01:09:01.700 | but let me ask you as a father,
01:09:03.920 | is there a weight heavy on you
01:09:06.720 | that people get bullied on social networks?
01:09:10.300 | So people get bullied in their private life.
01:09:13.520 | But now, because so much of our life is in the digital world,
01:09:17.100 | the bullying moves from the physical world
01:09:19.620 | to the digital world.
01:09:21.180 | So you're now creating a platform
01:09:24.540 | on which bullying happens,
01:09:26.500 | and some of that bullying can lead to damage
01:09:30.460 | to mental health, and some of that bullying
01:09:32.820 | can lead to depression, even suicide.
01:09:36.320 | There's a weight heavy on you that people
01:09:39.700 | have committed suicide or will commit suicide
01:09:45.260 | based on the bullying that happens on social media.
01:09:48.100 | - Yeah, I mean, there's a set of harms
01:09:51.540 | that we basically track and build systems to fight against.
01:09:55.420 | And bullying and self-harm are,
01:10:00.420 | you know, I mean, these are some of the biggest things
01:10:03.260 | that we are most focused on.
01:10:06.040 | For bullying, like you say,
01:10:14.420 | while this predates the internet
01:10:18.300 | and it's probably impossible to get rid of all of it,
01:10:21.940 | you wanna give people tools to fight it,
01:10:24.140 | and you wanna fight it yourself.
01:10:27.220 | And you also wanna make sure that people have the tools
01:10:28.820 | to get help when they need it.
01:10:30.180 | So I think this isn't like a question of,
01:10:33.060 | you know, can you get rid of all bullying?
01:10:34.660 | I mean, it's like, all right.
01:10:36.060 | I mean, I have two daughters, and, you know,
01:10:39.700 | they fight and, you know, push each other around
01:10:43.180 | and stuff too, and the question is just,
01:10:44.740 | how do you handle that situation?
01:10:47.100 | And there's a handful of things that I think
01:10:50.100 | you can do.
01:10:51.580 | You know, we talked a little bit before around
01:10:54.980 | some of the AI tools that you can build
01:10:57.100 | to identify when something harmful is happening.
01:11:00.180 | It's actually, it's very hard in bullying,
01:11:01.540 | 'cause a lot of bullying is very context-specific.
01:11:03.740 | It's not like you're trying to fit a formula.
01:11:07.300 | You know, if you look at the different harms,
01:11:10.340 | someone promoting a terrorist group
01:11:12.100 | is probably one of the simpler things
01:11:14.420 | to generally find, because things promoting that group
01:11:16.820 | are gonna, you know, look a certain way
01:11:18.660 | or feel a certain way.
01:11:20.260 | Bullying could just be, you know,
01:11:21.860 | someone making some subtle comment about
01:11:24.060 | someone's appearance that's idiosyncratic to them.
01:11:26.860 | - And it could look just like humor.
01:11:28.740 | So humor to one person could be destructive
01:11:31.020 | to another human being, yeah.
01:11:32.300 | - So with bullying, I think there are,
01:11:34.180 | there are certain things that you can find
01:11:37.780 | through AI systems, but I think it is
01:11:41.140 | increasingly important to just give people
01:11:43.500 | more agency themselves.
01:11:44.820 | So we've done things like making it so people
01:11:46.780 | can turn off comments or, you know,
01:11:48.460 | take a break from, you know,
01:11:50.420 | hearing from a specific person
01:11:52.260 | without having to signal at all
01:11:54.180 | that they're gonna stop following them
01:11:55.620 | or kind of make some stand that,
01:11:58.220 | okay, I'm not friends with you anymore,
01:11:59.460 | I'm not following you.
01:12:00.460 | I just like, I just don't wanna hear about this,
01:12:01.940 | but I also don't wanna signal at all publicly that,
01:12:06.500 | or to them that there's been an issue.
01:12:08.940 | And then you get to some of the more extreme cases
01:12:14.100 | like you're talking about, where someone is thinking about
01:12:16.740 | self-harm or suicide.
01:12:19.180 | And there, we've found that that is a place
01:12:24.100 | where AI can identify a lot,
01:12:26.420 | as well as people flagging things.
01:12:28.380 | You know, if people are expressing something
01:12:31.100 | that is, you know, potentially,
01:12:33.380 | they're thinking of hurting themselves,
01:12:35.020 | those are cues that you can build systems
01:12:37.540 | and, you know, hundreds of languages around the world
01:12:39.580 | to be able to identify that.
01:12:41.100 | And one of the things that I'm actually quite proud of
01:12:45.380 | is we've built these systems that I think are
01:12:48.860 | clearly leading at this point that not only identify that,
01:12:53.060 | but then connect with local first responders
01:12:57.060 | and have been able to save, I think at this point,
01:12:59.700 | it's, you know, in thousands of cases,
01:13:01.980 | be able to get first responders to people
01:13:04.620 | through these systems who really need them
01:13:06.980 | because of specific plumbing that we've done
01:13:09.660 | between the AI work and being able to communicate
01:13:11.700 | with local first responder organizations.
01:13:13.820 | And we're rolling that out in more places around the world.
01:13:15.820 | And I think the team that worked on that
01:13:18.220 | just did awesome stuff.
01:13:19.380 | So I think that that's a long way of saying,
01:13:22.780 | yeah, I mean, this is a heavy topic
01:13:25.500 | and there's, you want to attack it
01:13:27.060 | in a bunch of different ways.
01:13:28.540 | And also kind of understand that it's part of human nature
01:13:33.260 | for some people to do this to each other,
01:13:36.380 | which is unfortunate, but you can give people tools
01:13:39.020 | and build things that help.
01:13:40.620 | - It's still one hell of a burden though.
01:13:43.860 | - A platform that allows people to fall in love
01:13:46.980 | with each other is also by nature going to be a platform
01:13:50.980 | that allows people to hurt each other.
01:13:52.860 | And when you're managing such a platform, it's difficult.
01:13:57.140 | And I think you spoke to it, but the psychology of that,
01:13:59.420 | of being a leader in that space of creating technology
01:14:02.580 | that's playing in this space, like you mentioned,
01:14:05.940 | psychology is really damn difficult.
01:14:10.260 | And I mean, the burden of that is just great.
01:14:13.100 | I just wanted to hear you speak to that point.
01:14:17.220 | I have to ask about the thing you've brought up a few times,
01:14:23.140 | which is making controversial decisions.
01:14:25.320 | Let's talk about free speech and censorship.
01:14:29.420 | So there are two groups of people pressuring meta on this.
01:14:33.900 | One group is upset that Facebook, the social network,
01:14:37.220 | allows "misinformation," in quotes, to be spread
01:14:40.260 | on the platform. The other group is concerned
01:14:42.660 | that Facebook censors speech by calling it misinformation.
01:14:46.540 | So you're getting it from both sides.
01:14:48.780 | You, in October 2019, at Georgetown University,
01:14:53.780 | eloquently defended the importance of free speech,
01:14:58.200 | but then COVID came and the 2020 election came.
01:15:04.300 | Do you worry that outside pressures from advertisers,
01:15:07.300 | politicians, the public have forced meta
01:15:09.660 | to damage the ideal of free speech that you spoke highly of?
01:15:13.040 | - Just to say some obvious things up front,
01:15:16.820 | I don't think pressure from advertisers
01:15:18.780 | or politicians directly in any way
01:15:21.380 | affects how we think about this.
01:15:22.620 | I think these are just hard topics.
01:15:25.020 | So let me just take you through our evolution
01:15:26.820 | from kind of the beginning of the company
01:15:28.180 | to where we are now.
01:15:29.340 | You don't build a company like this
01:15:31.700 | unless you believe that people expressing themselves
01:15:34.100 | is a good thing.
01:15:35.700 | So that's sort of the foundational thing.
01:15:38.140 | You can kind of think about our company as a formula
01:15:41.860 | where we think giving people voice
01:15:44.220 | and helping people connect creates opportunity.
01:15:47.420 | So those are the two things that we're always focused on
01:15:49.620 | are sort of helping people connect,
01:15:50.780 | we talked about that a lot,
01:15:52.020 | but also giving people voice
01:15:53.860 | and ability to express themselves.
01:15:55.780 | Then by the way, most of the time
01:15:56.940 | when people express themselves,
01:15:58.100 | that's not like politically controversial content,
01:16:00.820 | it's like expressing something about their identity
01:16:04.000 | that's more related to the avatar conversation
01:16:06.460 | we had earlier in terms of expressing some facet,
01:16:08.560 | but that's what's important to people on a day-to-day basis.
01:16:11.220 | And sometimes when people feel strongly enough
01:16:13.420 | about something, it kind of becomes a political topic.
01:16:16.300 | That's sort of always been a thing that we've focused on.
01:16:19.060 | There's always been the question of safety in this,
01:16:22.260 | which if you're building a community,
01:16:24.340 | I think you have to focus on safety.
01:16:26.020 | We've had these community standards from early on,
01:16:28.260 | and there are about 20 different kinds of harm
01:16:32.580 | that we track and try to fight actively.
01:16:34.780 | And we've talked about some of them already.
01:16:36.380 | So it includes things like bullying and harassment,
01:16:40.620 | it includes things like terrorism or promoting terrorism,
01:16:45.620 | inciting violence, intellectual property theft.
01:16:49.300 | And in general, I think call it about 18 out of 20 of those,
01:16:53.660 | there's not really a particularly polarized definition
01:16:57.220 | of that.
01:16:58.060 | I think you're not really gonna find many people
01:17:01.440 | in the country or in the world
01:17:03.740 | who are trying to say we should be
01:17:05.780 | fighting terrorist content less.
01:17:09.340 | I think that the content where,
01:17:11.300 | there are a couple of areas where I think
01:17:12.580 | this has gotten more controversial recently,
01:17:14.400 | which I'll talk about.
01:17:16.340 | And you're right, the misinformation is basically up there.
01:17:20.020 | And I think sometimes the definition of hate speech
01:17:21.940 | is up there too.
01:17:23.660 | But I think in general,
01:17:25.300 | most of the content that I think we're working on for safety
01:17:30.300 | is not actually, people don't kind of have these questions.
01:17:33.280 | So it's sort of this subset.
01:17:36.000 | But if you go back to the beginning of the company,
01:17:38.120 | this was sort of pre deep learning days.
01:17:42.680 | And therefore, it was me, and my roommate Dustin joined me.
01:17:47.680 | And like, if someone posted something bad,
01:17:53.480 | the AI technology did not exist yet
01:17:57.920 | to be able to go basically look at all the content.
01:18:01.300 | And we were a small enough outfit
01:18:06.160 | that no one would expect that we could review it all.
01:18:08.820 | Even if like someone reported it to us,
01:18:10.420 | we basically did our best, right?
01:18:11.760 | It's like someone would report it
01:18:12.720 | and we try to look at stuff and deal with stuff.
01:18:16.940 | And for, call it, the first, I don't know,
01:18:21.540 | seven or eight years of the company,
01:18:23.980 | you know, we weren't that big of a company.
01:18:26.260 | You know, for a lot of that period,
01:18:27.420 | we weren't even really profitable.
01:18:28.800 | The AI didn't really exist to be able to do
01:18:30.560 | the kind of moderation that we do today.
01:18:32.760 | And then at some point,
01:18:34.800 | in kind of the middle of the last decade,
01:18:36.880 | that started to flip.
01:18:38.180 | And we, you know, we became,
01:18:41.460 | it got to the point where we were
01:18:43.440 | sort of a larger and more profitable company.
01:18:45.280 | And the AI was starting to come online
01:18:48.000 | to be able to proactively detect
01:18:50.500 | some of the simpler forms of this.
01:18:52.840 | So things like pornography,
01:18:54.800 | you could train an image classifier
01:18:57.640 | to identify what a nipple was,
01:18:59.520 | or you can fight against terrorist content.
01:19:01.320 | You still could- - There's actually
01:19:02.160 | papers on this, it's great.
01:19:03.480 | - Oh, of course there are. - Technical papers.
01:19:05.120 | - Of course there are.
01:19:06.240 | You know, those are relatively easier things
01:19:07.900 | to train AI to do than, for example,
01:19:10.640 | understand the nuances of what is inciting violence
01:19:14.000 | in a hundred languages around the world
01:19:15.820 | and not have the false positives of like,
01:19:20.220 | okay, are you posting about this thing
01:19:22.360 | that might be inciting violence?
01:19:24.040 | Because you're actually trying to denounce it,
01:19:26.320 | in which case we probably shouldn't take that down.
01:19:28.280 | Where if you're trying to denounce something
01:19:29.520 | that's inciting violence
01:19:30.760 | in some kind of dialect in a corner of India,
01:19:35.280 | as opposed to, okay, actually you're posting this thing
01:19:38.440 | because you're trying to incite violence.
01:19:39.600 | Okay, building an AI that can basically
01:19:42.040 | get to that level of nuance
01:19:43.560 | in all the languages that we serve
01:19:45.240 | is something that I think
01:19:47.120 | is only really becoming possible now,
01:19:49.640 | not towards the middle of the last decade.
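As a rough illustration of the simpler classifier case described above, here is a minimal sketch of fine-tuning a small pretrained image model for a binary policy label (benign vs. violating). The batch here is a random placeholder rather than real data, none of this reflects Meta's production systems, and the harder multilingual nuance problem he describes is not captured by something this simple.

```python
# Minimal sketch of the "train an image classifier" idea: fine-tune a small
# pretrained backbone for a binary policy label. Purely illustrative; the
# random tensors stand in for a real labeled dataset.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: benign / violating

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Placeholder batch: 8 RGB images at 224x224 with made-up labels.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

model.train()
optimizer.zero_grad()
logits = model(images)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
print(f"one illustrative training step, loss = {loss.item():.3f}")
```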
01:19:51.880 | But there's been this evolution,
01:19:54.840 | and I think what happened,
01:19:56.140 | you know, people sort of woke up after 2016,
01:20:00.080 | and, you know, a lot of people are like,
01:20:02.560 | okay, the country is a lot more polarized,
01:20:04.960 | and there's a lot more stuff here than we realized.
01:20:08.040 | Why weren't these internet companies on top of this?
01:20:12.840 | And I think at that point,
01:20:15.740 | it was reasonable feedback that, you know,
01:20:20.640 | some of this technology had started becoming possible.
01:20:23.480 | And at that point, I really did feel like
01:20:26.320 | we needed to make a substantially larger investment.
01:20:28.920 | We'd already worked on this stuff a lot,
01:20:30.680 | on AI and on these integrity problems,
01:20:33.400 | but that we should basically invest,
01:20:36.200 | you know, have a thousand or more engineers
01:20:38.160 | basically work on building these AI systems
01:20:40.160 | to be able to go and proactively identify the stuff
01:20:42.560 | across all these different areas.
01:20:44.680 | Okay, so we went and did that.
01:20:46.360 | Now we've built the tools to be able to do that.
01:20:49.000 | And now I think it's actually a much more complicated set
01:20:51.920 | of philosophical rather than technical questions,
01:20:54.420 | which is the exact policies, which are okay.
01:20:57.960 | Now, the way that we basically hold ourselves accountable
01:21:02.960 | is we issue these transparency reports every quarter.
01:21:05.400 | And the metric that we track is for each of those
01:21:07.720 | 20 types of harmful content,
01:21:11.480 | how much of that content are we taking down
01:21:13.520 | before someone even has to report it to us?
01:21:15.320 | Right, so how effective is our AI at doing this?
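The proactive-rate metric described above could be computed roughly as in the sketch below: of all the content actioned for a given harm type, what share was found by automated systems before any user report. The field names and numbers are invented placeholders and may not match how the transparency reports actually define or compute it.

```python
# Rough sketch of the "proactive rate" idea: of all content actioned for a
# given harm type, what share did automated systems find before any user
# reported it? Values below are made-up placeholders.
enforcement_log = [
    {"harm": "hate_speech", "actioned": 1200, "found_proactively": 1140},
    {"harm": "bullying",    "actioned": 800,  "found_proactively": 520},
    {"harm": "terrorism",   "actioned": 300,  "found_proactively": 297},
]

for row in enforcement_log:
    proactive_rate = row["found_proactively"] / row["actioned"]
    print(f"{row['harm']:12s} proactive rate: {proactive_rate:.1%}")
```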
01:21:18.120 | But that basically creates this big question,
01:21:20.560 | which is, okay, now we need to really be careful
01:21:24.160 | about how proactive we set the AI
01:21:26.640 | and where the exact policy lines are
01:21:29.120 | around what we're taking down.
01:21:31.360 | It's certainly at a point now where, you know,
01:21:35.240 | I felt like at the beginning of that journey
01:21:37.720 | of building those AI systems,
01:21:39.440 | there was a lot of push.
01:21:43.160 | They're saying, okay, you've got to do more.
01:21:44.360 | There's clearly a lot more bad content
01:21:46.320 | that people aren't reporting or that you're not getting to,
01:21:49.880 | and you need to get more effective at that.
01:21:51.200 | And I was pretty sympathetic to that.
01:21:52.920 | But then I think at some point along the way,
01:21:54.800 | there started to be almost equal issues on both sides
01:21:58.960 | of, okay, actually you're kind of taking down
01:22:00.960 | too much stuff, right?
01:22:02.080 | Or some of the stuff is borderline
01:22:05.560 | and it wasn't really bothering anyone
01:22:07.560 | and they didn't report it.
01:22:09.640 | So is that really an issue that you need to take down?
01:22:13.000 | Whereas we still have the critique on the other side too,
01:22:15.440 | where a lot of people think we're not doing enough.
01:22:18.560 | So it's become, as we built the technical capacity,
01:22:21.840 | I think it becomes more philosophically interesting
01:22:25.600 | almost where you want to be on the line.
01:22:27.600 | And I just think like you don't want one person
01:22:32.080 | making those decisions.
01:22:33.440 | So we've also tried to innovate in terms of building out
01:22:36.080 | this independent oversight board,
01:22:37.480 | which has people who are dedicated to free expression,
01:22:40.480 | but from around the world,
01:22:42.440 | who people can appeal cases to.
01:22:44.560 | So a lot of the most controversial cases
01:22:46.400 | basically go to them and they make the final binding
01:22:48.200 | decision on how we should handle that.
01:22:50.040 | And then of course their decisions,
01:22:51.640 | we then try to figure out what the principles are
01:22:54.000 | behind those and encode them into the algorithms.
01:22:56.680 | - And how are those people chosen,
01:22:58.120 | since you're outsourcing a difficult decision.
01:23:01.040 | - Yeah, the initial people,
01:23:03.320 | we chose a handful of chairs for the group.
01:23:09.840 | And we basically chose the people
01:23:13.240 | for a commitment to free expression
01:23:16.280 | and a broad understanding of human rights
01:23:20.400 | and the trade-offs around free expression.
01:23:22.280 | So they're fundamentally people
01:23:23.440 | who are gonna lean towards free expression.
01:23:25.600 | - Towards freedom of speech.
01:23:26.720 | Okay, so there's also this idea of fact checkers,
01:23:29.280 | so jumping around to the misinformation questions,
01:23:32.320 | especially during COVID,
01:23:33.800 | which is an exceptionally, speaking of polarization.
01:23:36.800 | - Can I speak to the COVID thing?
01:23:38.920 | I mean, I think one of the hardest set of questions
01:23:40.920 | around free expression,
01:23:41.920 | 'cause you asked about Georgetown,
01:23:42.960 | is my stance fundamentally changed?
01:23:44.520 | And the answer to that is no, my stance has not changed.
01:23:49.120 | It is fundamentally the same as when I was talking
01:23:52.720 | about Georgetown from a philosophical perspective.
01:23:56.480 | The challenge with free speech is that everyone agrees
01:24:01.480 | that there is a line where if you're actually
01:24:05.440 | about to do physical harm to people,
01:24:08.160 | that there should be restrictions.
01:24:10.560 | So I mean, there's the famous Supreme Court
01:24:13.960 | historical example of like,
01:24:15.120 | you can't yell fire in a crowded theater.
01:24:18.040 | The thing that everyone disagrees on
01:24:20.360 | is what is the definition of real harm?
01:24:22.640 | Where I think some people think,
01:24:24.560 | okay, this should only be a very literal,
01:24:27.920 | I mean, take it back to the bullying conversation
01:24:29.840 | we were just having, where is it just harm
01:24:32.760 | if the person is about to hurt themselves
01:24:34.800 | because they've been bullied so hard?
01:24:36.640 | Or is it actually harm as they're being bullied?
01:24:39.880 | And kind of at what point in the spectrum is that?
01:24:42.160 | And that's the part that there's not agreement on.
01:24:44.480 | But I think what people agree on pretty broadly
01:24:47.000 | is that when there is an acute threat,
01:24:49.440 | that it does make sense from a societal perspective
01:24:52.960 | to tolerate less speech that could be potentially harmful
01:24:57.960 | in that acute situation.
01:24:59.560 | So I think where COVID got very difficult is,
01:25:02.800 | I don't think anyone expected this to be going on for years,
01:25:06.000 | but if you'd kind of asked a priori,
01:25:10.360 | would a global pandemic where a lot of people are dying
01:25:14.880 | and catching this, is that an emergency
01:25:19.000 | that where you'd kind of consider it that,
01:25:22.680 | it's problematic to basically yell fire
01:25:25.600 | in a crowded theater,
01:25:26.840 | I think that that probably passes that test.
01:25:29.000 | So I think that it's a very tricky situation,
01:25:32.320 | but I think the fundamental commitment to free expression
01:25:37.120 | is there.
01:25:38.200 | And that's what I believe.
01:25:39.840 | And again, I don't think you start this company
01:25:41.440 | unless you care about people being able
01:25:42.720 | to express themselves as much as possible.
01:25:44.800 | But I think that that's the question, right?
01:25:48.840 | Is like, how do you define what the harm is
01:25:50.440 | and how acute that is?
01:25:52.400 | - And what are the institutions that define that harm?
01:25:55.440 | A lot of the criticism is that the CDC, the WHO,
01:25:59.720 | the institutions we've come to trust as a civilization
01:26:03.800 | to give the line of what is and isn't harm
01:26:07.760 | in terms of health policy have failed in many ways,
01:26:11.640 | in small ways, in big ways, depending on who you ask.
01:26:14.440 | And then the perspective of Meta and Facebook is like,
01:26:17.120 | well, where the hell do I get the information
01:26:20.120 | of what is and isn't misinformation?
01:26:22.360 | So it's a really difficult place to be in,
01:26:25.120 | but it's great to hear that you're leaning
01:26:26.680 | towards freedom of speech on this aspect.
01:26:30.120 | And again, I think this actually speaks to the fact
01:26:32.960 | that we need to reform institutions
01:26:35.280 | that help us keep an open mind about what is
01:26:37.200 | and isn't misinformation.
01:26:38.960 | And misinformation has been used to bully.
01:26:42.040 | On the internet, I mean, I just have,
01:26:45.400 | you know, I'm friends with Joe Rogan
01:26:46.880 | and he gets called that. I remember hanging out
01:26:50.200 | with him in Vegas and somebody yelled,
01:26:51.960 | "Stop spreading misinformation."
01:26:54.600 | I mean, and there's a lot of people that follow him
01:26:57.560 | that believe he's not spreading misinformation.
01:26:59.840 | Like you can't just not acknowledge the fact
01:27:02.880 | that there's a large number of people
01:27:05.680 | that have a different definition of misinformation.
01:27:08.800 | And that's such a tough place to be.
01:27:10.720 | Like who do you listen to?
01:27:11.800 | Do you listen to quote unquote experts?
01:27:14.320 | Who gets to decide? As a person who has a PhD,
01:27:16.800 | I gotta say, I mean, I'm not sure I know
01:27:18.760 | what defines an expert, especially in a new,
01:27:24.040 | in a totally new pandemic or a new catastrophic event,
01:27:29.040 | especially when politics is involved
01:27:31.040 | and especially when the news or the media is involved
01:27:35.000 | that can propagate sort of outrageous narratives
01:27:39.480 | and thereby make a lot of money.
01:27:40.680 | Like what the hell, where's the source of truth?
01:27:43.240 | And then everybody turns to Facebook.
01:27:45.480 | It's like, please tell me what the source of truth is.
01:27:48.000 | (laughing)
01:27:49.040 | - Well, I mean, well, how would you handle this
01:27:50.720 | if you were in my position?
01:27:52.680 | - It's very, very, very, very difficult.
01:27:55.160 | I would say,
01:27:56.440 | I would more speak about how difficult the choices are
01:28:02.660 | and be transparent about like,
01:28:04.020 | what the hell do you do with this?
01:28:05.360 | Like here, ask exactly the question
01:28:08.080 | you just asked me, but to the broader public.
01:28:10.100 | Like, okay, yeah, you guys tell me what to do.
01:28:12.380 | So like crowdsource it.
01:28:14.200 | And then the other aspect is when you spoke
01:28:18.980 | really eloquently about the fact that
01:28:21.780 | there's this going back and forth
01:28:23.420 | and now there's a feeling like you're censoring
01:28:25.220 | a little bit too much.
01:28:26.660 | And so I would lean, I would try to be ahead of that feeling.
01:28:30.220 | I would now lean towards freedom of speech
01:28:32.500 | and say, you know, we're not the ones
01:28:33.900 | that are going to define misinformation.
01:28:36.220 | Let it be a public debate.
01:28:38.560 | Let the idea stand.
01:28:40.020 | And I actually place, you know, this idea of misinformation,
01:28:44.300 | I place the responsibility
01:28:46.360 | on the poor communication skills of scientists.
01:28:50.020 | They should be in the battlefield of ideas
01:28:52.660 | and everybody who is spreading information
01:28:57.400 | against the vaccine, they should not be censored.
01:29:00.420 | They should be talked with and you should show the data.
01:29:03.020 | You should have open discussion
01:29:04.820 | as opposed to rolling your eyes and saying, I'm the expert.
01:29:08.260 | I know what I'm talking about.
01:29:09.840 | No, you need to convince people.
01:29:11.620 | It's a battle of ideas.
01:29:13.240 | So that's the whole point of freedom of speech
01:29:15.360 | is the way to defeat bad ideas
01:29:17.100 | is with good ideas, with speech.
01:29:20.080 | So like the responsibility here falls
01:29:22.080 | on the poor communication skills of scientists.
01:29:26.600 | Scientists are not communicators, but thanks to social media,
01:29:31.600 | they have the power to communicate.
01:29:34.040 | Some of the best stuff I've seen about COVID
01:29:36.800 | from doctors is on social media.
01:29:38.840 | It's a way to learn to respond really quickly,
01:29:41.520 | to go faster than the peer review process.
01:29:43.840 | And so they just need to get way better
01:29:45.480 | at that communication.
01:29:46.520 | And by better, I don't mean just more
01:29:49.420 | convincing, I also mean speaking with humility.
01:29:51.780 | Don't talk down to people, all those kinds of things.
01:29:54.260 | And as a platform, I would say,
01:29:56.020 | I would step back a little bit.
01:29:59.820 | Not all the way, of course,
01:30:00.820 | because there's a lot of stuff that can cause real harm
01:30:03.540 | as we've talked about,
01:30:04.440 | but you lean more towards freedom of speech
01:30:06.940 | because then people from a brand perspective
01:30:09.580 | wouldn't be blaming you for the other ills of society,
01:30:12.740 | of which there are many.
01:30:14.580 | The institutions have flaws, the political divide.
01:30:19.580 | Obviously, politicians have flaws, that's news.
01:30:23.380 | The media has flaws that they're all trying to work with.
01:30:28.020 | And because of the central place of Facebook in the world,
01:30:31.100 | all of those flaws somehow kind of propagate to Facebook.
01:30:34.340 | And you're sitting there, like Plato the philosopher,
01:30:38.140 | having to answer some of the most difficult questions
01:30:40.740 | being asked of human civilization.
01:30:43.940 | So I don't know, maybe this is an American answer, though,
01:30:47.020 | to lean towards freedom of speech.
01:30:48.420 | I don't know if that applies globally.
01:30:50.320 | So yeah, I don't know.
01:30:52.620 | But transparency and saying,
01:30:55.220 | I think as a technologist,
01:30:57.380 | one of the things I sense about Facebook
01:30:59.140 | and Meta when people talk about this company
01:31:02.380 | is they don't necessarily understand fully
01:31:05.220 | how difficult the problem is.
01:31:06.900 | You talked about AI has to catch a bunch of harmful stuff
01:31:10.700 | really quickly, just the sea of data you have to deal with.
01:31:14.620 | It's a really difficult problem.
01:31:16.700 | So any of the critics,
01:31:18.400 | if you just hand them the helm for a week,
01:31:21.520 | let's see how well they can do.
01:31:24.340 | To me, that's definitely something
01:31:28.180 | that would wake people up to how difficult this problem is
01:31:31.300 | if there's more transparency
01:31:32.660 | in saying how difficult this problem is.
01:31:35.580 | Let me ask you about, on the AI front,
01:31:37.820 | just 'cause you mentioned language and my ineloquence,
01:31:41.620 | translation is something I wanted to ask you about.
01:31:44.140 | And first, just to give a shout out to the supercomputer,
01:31:47.780 | you've recently announced the AI Research Supercluster, RSC.
01:31:51.980 | Obviously, I'm somebody who loves the GPUs.
01:31:54.700 | It currently has 6,000 GPUs in total,
01:31:57.140 | built out of NVIDIA DGX A100 systems.
01:32:01.260 | And it will eventually,
01:32:04.100 | maybe this year, maybe soon,
01:32:07.300 | have 16,000 GPUs.
01:32:09.980 | So it can do a bunch of different kinds
01:32:11.660 | of machine learning applications.
01:32:15.040 | There's a cool thing on the distributed storage aspect
01:32:18.580 | and all that kind of stuff.
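A quick back-of-the-envelope aside on those numbers: a standard NVIDIA DGX A100 system packs 8 A100 GPUs (that per-node figure is an assumption about the hardware, not something stated in the conversation), so the quoted GPU counts translate roughly into node counts as follows.

```python
# Rough node-count arithmetic for the RSC figures quoted above,
# assuming the standard 8 A100 GPUs per NVIDIA DGX A100 system.
GPUS_PER_DGX_A100 = 8

current_gpus = 6_000    # GPU count quoted for RSC today
target_gpus = 16_000    # GPU count quoted for the full build-out

print(current_gpus // GPUS_PER_DGX_A100)   # roughly 750 DGX A100 systems today
print(target_gpus // GPUS_PER_DGX_A100)    # roughly 2,000 systems at full scale
```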
01:32:19.660 | So one of the applications
01:32:21.060 | that I think is super exciting is translation,
01:32:24.660 | real-time translation.
01:32:26.340 | I mentioned to you that having a conversation,
01:32:29.100 | I speak Russian fluently,
01:32:30.220 | I speak English somewhat fluently,
01:32:32.340 | and I'm having a conversation with Vladimir Putin,
01:32:34.940 | say, as a use case.
01:32:36.060 | Me as a user coming to you as a use case.
01:32:38.480 | We both speak each other's language.
01:32:42.500 | I speak Russian, he speaks English.
01:32:45.020 | How can we have that communication go well
01:32:47.980 | with the help of AI?
01:32:49.380 | I think it's such a beautiful
01:32:51.220 | and a powerful application of AI to connect the world,
01:32:54.700 | that bridge the gap, not necessarily between me and Putin,
01:32:57.560 | but people that don't have that shared language.
01:33:01.020 | Can you just speak about your vision with translation?
01:33:04.120 | 'Cause I think that's a really exciting application.
01:33:06.600 | - If you're trying to help people connect
01:33:08.000 | all around the world,
01:33:09.400 | a lot of content is produced in one language
01:33:11.600 | and people in all these other places are interested in it.
01:33:14.720 | So being able to translate that
01:33:16.640 | just unlocks a lot of value on a day-to-day basis.
01:33:20.560 | And so the kind of AI around translation is interesting
01:33:24.400 | because it's gone through a bunch of iterations.
01:33:27.880 | But the basic state of the art
01:33:29.680 | is that you don't wanna go through
01:33:33.560 | different kind of intermediate symbolic
01:33:37.680 | representations of language or something like that.
01:33:42.480 | You basically wanna be able to map the concepts
01:33:46.920 | and basically go directly from one language to another.
01:33:49.320 | And you just can train bigger and bigger models
01:33:53.040 | in order to be able to do that.
01:33:54.120 | And that's where the research supercluster comes in
01:33:58.120 | is basically a lot of the trend in machine learning
01:34:01.040 | is just you're building bigger and bigger models
01:34:03.400 | and you just need a lot of computation to train them.
01:34:05.720 | So it's not that the translation itself
01:34:08.040 | would run on the supercomputer,
01:34:09.440 | it's the training of the model,
01:34:11.400 | which could have billions or trillions of examples.
01:34:15.800 | Basically,
01:34:18.240 | you're training models on this supercluster
01:34:22.360 | in days or weeks that might take
01:34:25.960 | a much longer period of time on a smaller cluster.
01:34:28.160 | So it just wouldn't be practical for most teams to do.
01:34:30.200 | But the translation work,
01:34:32.520 | we're basically going from
01:34:37.280 | being able to go between about 100 languages
01:34:39.680 | seamlessly today to being able to go between about 300 languages
01:34:44.680 | in the near term.
01:34:46.440 | - So from any language to any other language.
01:34:48.680 | - Yeah, and part of the issue when you get closer to
01:34:51.920 | more languages is some of these get to be pretty,
01:34:57.640 | not very popular languages, right?
01:35:01.880 | Where there isn't that much content in them.
01:35:04.280 | So you end up having less data
01:35:07.240 | and you need to kind of use a model that you've built up
01:35:10.960 | around other examples.
01:35:12.080 | And this is one of the big questions around AI
01:35:14.000 | is like how generalizable can things be?
01:35:16.680 | And that I think is one of the things
01:35:18.760 | that's just kind of exciting here
01:35:19.800 | from a technical perspective.
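To make the "go directly from one language to another" idea concrete, here is a minimal sketch using M2M-100, a many-to-many translation model that Meta AI has released publicly on the Hugging Face hub. It is a stand-in for illustration, assuming the transformers library is installed; it is not necessarily the production system being described in the conversation.

```python
# Minimal sketch: direct Russian -> English translation with M2M-100,
# a publicly released many-to-many model (no pivot language in between).
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")
model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "ru"  # source language: Russian
encoded = tokenizer("Жизнь прекрасна", return_tensors="pt")

# Force the decoder to start generating in the target language (English).
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("en"),
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
# Expected output: an English translation along the lines of "Life is beautiful".
```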
01:35:21.320 | - But capturing, we talked about this with the metaverse,
01:35:23.800 | capturing the magic of human to human interaction.
01:35:26.440 | So me and Putin, okay?
01:35:29.200 | Again, this is-
01:35:30.040 | - I mean, it's a tough example
01:35:31.080 | 'cause you actually both speak Russian and English.
01:35:33.840 | But in the future-
01:35:34.680 | - I see it as a Turing test of a kind
01:35:37.720 | because we would both like to have an AI that improves
01:35:40.440 | 'cause I don't speak Russian that well.
01:35:42.240 | He doesn't speak English that well.
01:35:44.200 | It would be nice to outperform our abilities.
01:35:48.640 | And it sets a really nice bar
01:35:50.640 | because I think AI can really help in translation
01:35:53.600 | for people that don't speak the language at all.
01:35:55.720 | But to actually capture the magic of the chemistry,
01:36:00.120 | the translation,
01:36:01.280 | which would make the metaverse super immersive.
01:36:04.800 | I mean, that's exciting.
01:36:06.040 | You remove the barrier of language, period.
01:36:08.680 | - Yeah, so when people think about translation,
01:36:11.220 | I think a lot of that is,
01:36:12.400 | they're thinking about text to text.
01:36:14.240 | But speech to speech, I think, is a whole nother thing.
01:36:17.120 | And I mean, one of the big lessons on that,
01:36:19.040 | which I was referring to before is,
01:36:21.160 | I think early models, it's like, all right,
01:36:23.000 | they take speech, they translate it to text,
01:36:25.080 | translate the text to another language,
01:36:26.640 | and then kind of output that as speech in that language.
01:36:29.360 | And you don't wanna do that.
01:36:30.400 | You just wanna be able to go directly from speech
01:36:32.240 | in one language to speech in another language
01:36:34.160 | and build up the models to do that.
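For contrast with the direct approach described here, below is a rough sketch of the cascaded pipeline (speech to text, then text to translated text), assembled from publicly available checkpoints as stand-ins. The direct speech-to-speech models being discussed are not assumed to be publicly packaged, and the audio file path is hypothetical.

```python
# Rough sketch of the cascaded approach described above, using public
# checkpoints as stand-ins. A direct speech-to-speech model would collapse
# these stages (plus the final text-to-speech step) into a single model.
from transformers import pipeline

# Stage 1: English speech -> English text ("clip_en.wav" is a hypothetical file).
# Note: this checkpoint emits uppercase, unpunctuated text, fine for a sketch.
asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-base-960h")
english_text = asr("clip_en.wav")["text"]

# Stage 2: English text -> Russian text with a many-to-many translation model.
translator = pipeline("translation", model="facebook/m2m100_418M")
russian_text = translator(english_text, src_lang="en", tgt_lang="ru")[0]["translation_text"]

print(russian_text)
# Stage 3 (Russian text -> Russian speech) would require a separate TTS model.
```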
01:36:36.360 | And I mean, I think one of the,
01:36:39.120 | there have been,
01:36:40.720 | when you look at the progress in machine learning,
01:36:42.840 | there have been big advances in the techniques.
01:36:46.200 | Some of the advances in self-supervised learning,
01:36:51.480 | which I know you talked to Yann about,
01:36:52.840 | and he's like one of the leading thinkers in this area.
01:36:55.280 | I just think that that stuff is really exciting.
01:36:57.480 | But then you couple that with the ability
01:36:59.760 | to just throw larger and larger amounts of compute
01:37:02.400 | at training these models,
01:37:03.920 | and you can just do a lot of things
01:37:05.520 | that were harder to do before.
01:37:09.280 | But we're asking more of our systems too, right?
01:37:12.840 | So if you think about the applications
01:37:14.760 | that we're gonna need for the metaverse,
01:37:18.280 | or think about it, okay,
01:37:19.280 | so let's talk about AR here for a second.
01:37:21.400 | You're gonna have these glasses.
01:37:23.040 | They're gonna look,
01:37:24.320 | hopefully like a normal-ish looking pair of glasses,
01:37:28.080 | but they're gonna be able to put holograms in the world
01:37:31.160 | and intermix virtual and physical objects in your scene.
01:37:35.920 | And one of the things that's gonna be unique about this,
01:37:39.000 | compared to every other computing device
01:37:41.160 | that you've had before,
01:37:42.520 | is that this is gonna be the first computing device
01:37:45.120 | that has all the same signals
01:37:47.480 | about what's going on around you that you have.
01:37:49.400 | Right, so your phone,
01:37:50.400 | you can have it take a photo or a video,
01:37:54.080 | but I mean, these glasses are gonna,
01:37:56.400 | you know, whenever you activate them,
01:37:57.440 | they're gonna be able to see what you see
01:37:58.960 | from your perspective.
01:38:00.200 | They're gonna be able to hear what you hear
01:38:01.480 | because the microphones and all that
01:38:03.400 | are gonna be right around where your ears are.
01:38:05.760 | So you're gonna want an AI assistant
01:38:08.120 | that's a new kind of AI assistant
01:38:10.200 | that can basically help you process the world
01:38:13.840 | from this first-person perspective,
01:38:16.960 | or from the perspective that you have.
01:38:18.520 | And the utility of that is gonna be huge,
01:38:21.760 | but the kinds of AI models that we're gonna need
01:38:25.400 | are going to be just,
01:38:27.360 | I don't know, there's a lot that we're gonna need
01:38:30.000 | to basically make advances in.
01:38:31.800 | But I mean, but that's why I think these concepts
01:38:33.720 | of the metaverse and the advances in AI
01:38:36.560 | are so fundamentally interlinked
01:38:38.720 | that I mean, they're kind of enabling each other.
01:38:42.880 | - Yeah, like the world builder is a really cool idea.
01:38:45.440 | Like you can be like a Bob Ross,
01:38:47.240 | like I'm gonna put a little tree right here.
01:38:49.120 | - Yeah. - I need a little tree.
01:38:50.320 | It's missing a little tree.
01:38:51.200 | And then, but at scale,
01:38:52.960 | like enriching your experience in all kinds of ways.
01:38:55.680 | You mentioned the assistant too.
01:38:56.960 | That's really interesting
01:38:58.160 | how you can have AI assistants helping you out
01:39:00.800 | on different levels of sort of intimacy of communication.
01:39:04.000 | It could be just like scheduling
01:39:05.480 | or it could be like almost like therapy.
01:39:08.120 | Clearly I need some.
01:39:09.880 | So let me ask you,
01:39:11.120 | you're one of the most successful people ever.
01:39:14.000 | You've built an incredible company
01:39:16.240 | that has a lot of impact.
01:39:18.080 | What advice do you have for young people
01:39:21.120 | today how to live a life they can be proud of?
01:39:25.480 | How to build something
01:39:27.720 | that can have a big positive impact on the world?
01:39:31.040 | - Well, let's break that down
01:39:37.480 | 'cause I think a life you're proud of,
01:39:39.840 | having a big positive impact,
01:39:41.280 | - Wow, you're actually listening.
01:39:42.320 | - And how to live your life
01:39:43.880 | are actually three different things
01:39:45.560 | that I think, I mean, they could line up,
01:39:48.920 | but, and also like what age of people are you talking to?
01:39:52.480 | 'Cause I mean, I can like--
01:39:53.520 | - High school and college.
01:39:54.720 | So you don't really know what you're doing,
01:39:56.520 | but you dream big
01:39:58.240 | and you really have a chance to do something unprecedented.
01:40:02.320 | - Yeah.
01:40:03.160 | So I guess just-- - Also for people my age.
01:40:06.320 | - Okay, so let's maybe start with
01:40:08.040 | the kind of most philosophical and abstract version of this.
01:40:12.080 | Every night when I put my daughters to bed,
01:40:16.240 | we go through this thing
01:40:17.360 | and, like, they call it the good night things
01:40:21.760 | 'cause that's basically what we talk about at night.
01:40:25.440 | And I just, I go through with them.
01:40:29.720 | - Sounds like a good show.
01:40:30.960 | - It's the good night things.
01:40:32.720 | Yeah, Priscilla's always asking,
01:40:33.760 | she's like, "Can I get good night things?"
01:40:35.080 | Like, I don't know, you go to bed too early.
01:40:37.120 | But it's,
01:40:37.960 | but I basically go through with Max and Augie,
01:40:46.360 | what are the things that are most important in life?
01:40:48.960 | Right, that I just, it's like,
01:40:50.200 | what do I want them to remember
01:40:51.560 | and just have like really ingrained in them as they grow up?
01:40:53.960 | And it's health, right?
01:40:56.720 | Making sure that you take care of yourself
01:40:58.800 | and keep yourself in good shape,
01:41:00.720 | loving friends and family, right?
01:41:02.920 | Because having the relationships, the family
01:41:06.200 | and making time for friends,
01:41:08.680 | I think is perhaps one of the most important things.
01:41:12.880 | And then the third is maybe a little more amorphous,
01:41:16.040 | but it is something that you're excited about
01:41:18.520 | for the future.
01:41:19.400 | And when I'm talking to a four-year-old,
01:41:21.280 | often I'll ask her what she's excited about
01:41:23.480 | for tomorrow or the week ahead.
01:41:25.200 | But I think for most people, it's really hard.
01:41:29.600 | I mean, the world is a heavy place.
01:41:31.440 | And I think like the way that we navigate it
01:41:34.760 | is that we have things that we're looking forward to.
01:41:37.320 | So whether it is building AR glasses for the future,
01:41:41.640 | or being able to celebrate my 10-year wedding anniversary
01:41:45.520 | with my wife that's coming up.
01:41:47.360 | It's like, I think people,
01:41:49.360 | you have things that you're looking forward to.
01:41:51.880 | Or for the girls, it's often,
01:41:53.120 | I wanna see mom in the morning, right?
01:41:55.200 | But it's like, that's a really critical thing.
01:41:57.040 | And then the last thing is I ask them every day,
01:42:00.360 | what did you do today to help someone?
01:42:02.640 | Because I just think that that's a really critical thing
01:42:07.120 | is like, it's easy to kind of get caught up in yourself
01:42:10.760 | and kind of stuff that's really far down the road.
01:42:14.280 | But did you do something just concrete today
01:42:17.520 | to help someone?
01:42:18.360 | And it can just be as simple as,
01:42:19.800 | okay, yeah, I helped set the table for lunch.
01:42:23.400 | Or this other kid in our school
01:42:26.440 | was having a hard time with something
01:42:27.920 | and I helped explain it to him.
01:42:29.280 | But those are, that's sort of like,
01:42:32.960 | if you were to boil down my overall life philosophy
01:42:36.080 | into what I try to impart to my kids,
01:42:40.160 | those are the things that I think are really important.
01:42:43.000 | So, okay, so let's say college.
01:42:44.320 | So if you're graduating college,
01:42:45.840 | probably more practical advice:
01:42:47.760 | I'm someone who's very focused on people.
01:42:53.160 | And I think the most important decision
01:42:57.120 | you're probably gonna make if you're in college
01:42:59.320 | is who you surround yourself with,
01:43:01.600 | because you become like the people
01:43:02.960 | you surround yourself with.
01:43:04.600 | And I sort of have this hiring heuristic at Meta,
01:43:12.400 | which is that I will only hire someone to work for me
01:43:16.280 | if I could see myself working for them.
01:43:19.800 | Not necessarily that I want them to run the company
01:43:21.640 | 'cause I like my job,
01:43:22.760 | but in an alternate universe,
01:43:24.960 | if it was their company
01:43:25.880 | and I was looking to go work somewhere,
01:43:28.440 | would I be happy to work for them?
01:43:29.680 | And I think that that's a helpful heuristic
01:43:32.720 | to help balance.
01:43:35.440 | And when you're building something like this,
01:43:36.600 | there's a lot of pressure to,
01:43:38.560 | you wanna build out your teams
01:43:39.920 | because there's a lot of stuff that you need to get done.
01:43:41.440 | And everyone always says, don't compromise on quality,
01:43:43.920 | but there's this question of,
01:43:44.880 | okay, how do you know that someone is good enough?
01:43:46.240 | And I think my answer is,
01:43:48.280 | I would want someone to be on my team
01:43:51.120 | if I would work for them.
01:43:52.920 | But I think that's actually a pretty similar answer
01:43:55.560 | to if you were choosing friends
01:43:59.360 | or a partner or something like that.
01:44:01.680 | So when you're kind of in college,
01:44:04.160 | trying to figure out what your circle is gonna be,
01:44:05.840 | trying to figure out,
01:44:06.760 | you're evaluating different job opportunities,
01:44:10.320 | who are the people,
01:44:11.560 | even if they're gonna be peers in what you're doing,
01:44:14.400 | who are the people who, in an alternate universe,
01:44:17.000 | you would wanna work for,
01:44:18.680 | because you think you're gonna learn a lot from them
01:44:20.400 | because they are kind of values aligned
01:44:24.120 | on the things that you care about
01:44:25.440 | and they're gonna push you,
01:44:28.240 | but also they know different things
01:44:29.480 | and have different experiences
01:44:30.640 | that are kind of more of what you wanna become like
01:44:32.920 | over time.
01:44:33.760 | So I don't know,
01:44:34.600 | I think probably people are,
01:44:36.160 | in general, too objective-focused
01:44:39.080 | and maybe not focused enough on the connections
01:44:42.680 | and the people who they're basically
01:44:45.680 | building relationships with.
01:44:46.840 | - I don't know what it says about me,
01:44:48.080 | but my place in Austin now has seven legged robots.
01:44:53.080 | So I'm surrounding myself by robots,
01:44:55.400 | which is probably something I should look into.
01:44:59.080 | What kind of world would you like
01:45:01.400 | to see your daughters grow up in,
01:45:03.960 | even after you're gone?
01:45:06.420 | - Well, I think one of the promises of all this stuff
01:45:08.980 | that is getting built now
01:45:10.620 | is that it can be a world where more people
01:45:14.060 | can just live out their imagination.
01:45:19.340 | One of my favorite quotes,
01:45:20.580 | I think it was attributed to Picasso,
01:45:22.420 | it's that all children are artists
01:45:24.100 | and the challenge is how do you remain one
01:45:25.660 | when you grow up?
01:45:26.700 | And if you have kids,
01:45:29.300 | this is pretty clear,
01:45:30.700 | they just have wonderful imaginations.
01:45:36.220 | And part of what I think is gonna be great
01:45:38.940 | about the creator economy and the metaverse
01:45:41.340 | and all this stuff is this notion around
01:45:43.820 | that a lot more people in the future
01:45:46.260 | are gonna get to work doing creative stuff
01:45:48.960 | than what I think today we would just consider
01:45:51.380 | traditional labor or service.
01:45:53.700 | And I think that that's awesome.
01:45:56.280 | And a lot of what people are here to do
01:46:00.140 | is collaborate together, work together,
01:46:03.140 | think of things that you wanna build and go do it.
01:46:06.460 | And I don't know,
01:46:08.100 | one of the things that I just think is striking,
01:46:09.380 | so I teach my daughters some basic coding with Scratch.
01:46:13.700 | I mean, they're still obviously really young,
01:46:15.460 | but I think of coding as building, right?
01:46:18.580 | It's like when I'm coding,
01:46:19.820 | I'm like building something that I want to exist.
01:46:22.340 | But my youngest daughter,
01:46:25.160 | she's very musical and pretty artistic
01:46:29.740 | and she thinks about coding as art.
01:46:32.860 | She calls it code art, not the code,
01:46:35.580 | but the output of what she is making.
01:46:37.580 | It's like, she's just very interested visually
01:46:39.340 | in what she can kind of output and how it can move around.
01:46:42.580 | And do we need to fix that?
01:46:45.060 | Are we good?
01:46:45.900 | - What happened?
01:46:47.500 | Do we have to clap?
01:46:48.620 | - Yeah. - Alexa.
01:46:49.500 | - Yeah, so I was just talking about
01:46:51.500 | Augie and her code art,
01:46:53.060 | but I mean, to me, this is like a beautiful thing, right?
01:46:56.580 | The notion that like,
01:46:58.220 | for me coding was this functional thing and I enjoyed it.
01:47:01.420 | And it like helped build something utilitarian,
01:47:04.660 | but that for the next generation of people,
01:47:06.860 | it will be even more an expression
01:47:10.500 | of their kind of imagination and artistic sense
01:47:14.920 | for what they want to exist.
01:47:15.980 | So I don't know, if that happens,
01:47:17.660 | if we can help bring about this world where,
01:47:21.340 | you know, for a lot more people,
01:47:23.580 | their existence going forward
01:47:25.900 | is being able to basically create and live out,
01:47:31.260 | you know, all these different kinds of art.
01:47:32.900 | I just think that that's like a beautiful
01:47:34.380 | and wonderful thing and will be very freeing for humanity
01:47:37.900 | to spend more of our time on the things that matter to us.
01:47:40.380 | - Yeah, allow more and more people to express their art
01:47:43.140 | in the full meaning of that word.
01:47:44.500 | - Yeah.
01:47:45.340 | - That's a beautiful vision.
01:47:46.860 | We mentioned that you are mortal.
01:47:49.260 | Are you afraid of death?
01:47:51.860 | Do you think about your mortality?
01:47:53.560 | And are you afraid of it?
01:48:01.180 | You didn't sign up for this on a podcast, did you?
01:48:03.060 | - No, I mean, it's an interesting question.
01:48:05.220 | I mean, I'm definitely aware of it.
01:48:08.820 | I do a fair amount of like extreme sport type stuff.
01:48:20.260 | So, like, I'm definitely aware of it.
01:48:20.260 | - Yeah.
01:48:21.100 | And you're flirting with it a bit.
01:48:23.980 | - I train hard.
01:48:25.820 | I mean, so it's like, if I'm gonna go out
01:48:27.260 | in like a 15 foot wave.
01:48:29.780 | - Go out big.
01:48:30.820 | - Then, well, then it's like, all right,
01:48:32.020 | I'll make sure we have the right safety gear
01:48:33.860 | and like make sure that I'm like used to that spot
01:48:36.900 | and all that stuff.
01:48:37.940 | But like, but yeah, I mean, you--
01:48:40.020 | - The risk is still there.
01:48:41.300 | - It takes some head blows along the way.
01:48:42.780 | - Yes.
01:48:43.780 | - But definitely aware of it.
01:48:46.300 | Definitely would like to stay safe.
01:48:50.080 | I have a lot of stuff that I wanna build and wanna--
01:48:53.980 | - Does it freak you out that it's finite though?
01:48:56.380 | That there's a deadline when it's all over?
01:49:00.680 | And that there'll be a time when your daughters are around
01:49:03.560 | and you're gone?
01:49:05.240 | - I don't know, that doesn't freak me out.
01:49:07.120 | I think...
01:49:07.960 | Constraints are helpful.
01:49:16.280 | - Yeah.
01:49:17.360 | Yeah, the finiteness makes ice cream taste more delicious
01:49:21.240 | somehow, the fact that it's gonna be over.
01:49:23.200 | There's something about that with the metaverse too.
01:49:25.760 | You want, we talked about this identity earlier,
01:49:28.560 | like having just one, like NFTs.
01:49:30.280 | There's something powerful about the constraint
01:49:34.360 | of finiteness or uniqueness.
01:49:36.960 | That this moment is singular in history.
01:49:39.840 | - But I mean, a lot of, as you go through different waves
01:49:42.180 | of technology, I think a lot of what is interesting
01:49:43.960 | is what becomes in practice infinite,
01:49:48.120 | or kind of there can be many, many of a thing,
01:49:51.640 | and then what ends up still being constrained.
01:49:53.760 | So the metaverse should hopefully allow
01:49:58.760 | a very large number, or maybe in practice,
01:50:04.280 | hopefully close to an infinite amount of expression
01:50:06.900 | and worlds, but we'll still only have
01:50:09.740 | a finite amount of time.
01:50:11.240 | - Yes.
01:50:12.080 | - I think living longer is good.
01:50:17.080 | And obviously all of my, our philanthropic work is,
01:50:21.760 | it's not focused on longevity, but it is focused
01:50:24.240 | on trying to achieve what I think is a possible goal
01:50:28.080 | in this century, which is to be able to cure,
01:50:30.720 | prevent, or manage all diseases.
01:50:32.520 | So I certainly think people kind of getting sick
01:50:36.200 | and dying is a bad thing,
01:50:37.800 | and I'm dedicating almost all of my capital
01:50:40.280 | towards advancing research in that area
01:50:43.280 | to push on that, which I mean, we could do a whole,
01:50:45.480 | another one of these podcasts about that.
01:50:46.880 | - Exactly.
01:50:47.720 | - Because that's a fascinating topic.
01:50:49.680 | I mean, this is with your wife, Priscilla Chan,
01:50:51.760 | you formed the Chan Zuckerberg Initiative,
01:50:54.160 | gave away 99%, or pledged to give away 99%
01:50:57.040 | of Facebook, now Meta, shares.
01:50:59.280 | I mean, like you said, we could talk forever
01:51:02.000 | about all the exciting things you're working on there,
01:51:06.120 | including the sort of moonshot of eradicating disease
01:51:11.120 | by the mid-century mark, or--
01:51:13.320 | - I don't actually know if you're gonna ever eradicate it,
01:51:15.440 | but I think you can get to a point where you
01:51:18.020 | can either cure things that happen, right?
01:51:20.920 | So people get diseases, but you can cure them.
01:51:22.980 | Prevent is probably closest to eradication.
01:51:25.640 | Or just be able to manage as sort of like
01:51:27.440 | ongoing things that are not gonna ruin your life.
01:51:32.440 | And I think that that's possible.
01:51:34.360 | I think saying that there's gonna be no disease at all
01:51:37.120 | probably is not possible within the next several decades.
01:51:41.560 | - Basic thing is increase the quality of life.
01:51:43.760 | - Yeah.
01:51:44.600 | - And maybe keep the finiteness, because it tastes,
01:51:47.800 | it makes everything taste more delicious.
01:51:50.040 | - Yeah.
01:51:50.880 | - Maybe that's just being a romantic 20th century human.
01:51:54.800 | - Maybe, but I mean, but it was an intentional decision
01:51:57.160 | to not focus our philanthropy, like,
01:52:00.200 | explicitly on longevity or living forever.
01:52:03.520 | - Yes.
01:52:04.360 | If at the moment of your death, and by the way,
01:52:09.080 | I like that the lights went out
01:52:11.560 | when we started talking about death,
01:52:13.400 | you get to meet--
01:52:14.240 | - It does make it a lot more dramatic.
01:52:15.720 | - It does.
01:52:16.720 | (laughs)
01:52:17.960 | I should get closer to the mic.
01:52:19.760 | At the moment of your death, you get to meet God,
01:52:22.200 | and you get to ask one question.
01:52:26.160 | What question would you like to ask?
01:52:28.120 | Or maybe a whole conversation, I don't know, it's up to you.
01:52:32.740 | It's more dramatic when it's just one question.
01:52:35.080 | - Well, if it's only one question, and I died,
01:52:40.540 | I would just wanna know that Priscilla
01:52:45.280 | and my family, like if they were gonna be okay.
01:52:49.600 | That might depend on the circumstances of my death,
01:52:54.640 | but I think that in most circumstances that I can think of,
01:52:58.080 | that's probably the main thing that I would care about.
01:53:01.080 | - Yeah, I think God would hear that question
01:53:02.520 | and be like, all right, fine, you get in.
01:53:04.280 | That's the right question to ask.
01:53:06.400 | - Is it? I don't know.
01:53:07.720 | - The humility and selflessness, all right, you're in.
01:53:10.840 | - I mean, but, well, maybe--
01:53:14.600 | - You're gonna be fine, don't worry, you're in.
01:53:16.520 | - But I mean, one of the things that I struggle with,
01:53:18.920 | at least, is on the one hand, that's probably the most,
01:53:23.660 | the thing that's closest to me
01:53:25.700 | and maybe the most common human experience,
01:53:29.480 | but I don't know, one of the things that I just struggle with
01:53:32.440 | in terms of running this large enterprise is like,
01:53:36.220 | should the thing that I care more about be
01:53:42.800 | that responsibility?
01:53:44.880 | And I think it's shifted over time.
01:53:49.320 | I mean, before I really had a family
01:53:52.040 | that was the only thing I cared about,
01:53:53.880 | and at this point, I mean, I care deeply about it,
01:53:58.880 | but yeah, I think that that's not as obvious of a question.
01:54:04.960 | - Yeah, we humans are weird.
01:54:07.840 | You get this ability to impact millions of lives,
01:54:12.800 | and it's definitely something, billions of lives,
01:54:15.640 | it's something you care about,
01:54:16.960 | but the weird humans that are closest to us,
01:54:21.080 | those are the ones that mean the most,
01:54:23.680 | and I suppose that's the dream of the metaverse,
01:54:26.160 | is to connect, form small groups like that,
01:54:29.320 | where you can have those intimate relationships.
01:54:31.700 | Let me ask you the big, ridiculous--
01:54:33.600 | - One, to be able to be close,
01:54:36.920 | not just based on who you happen to be next to.
01:54:39.880 | I think that's what the internet is already doing,
01:54:41.960 | is allowing you to spend more of your time
01:54:44.520 | not physically proximate.
01:54:46.520 | I mean, I always think, when you think about the metaverse,
01:54:49.920 | people ask this question about the real world.
01:54:52.120 | It's like, the virtual world versus the real world.
01:54:54.840 | And it's like, no, the real world is a combination
01:54:58.160 | of the virtual world and the physical world,
01:55:00.060 | but I think over time, as we get more technology,
01:55:04.040 | the physical world is becoming less of a percent
01:55:06.760 | of the real world.
01:55:08.160 | And I think that that opens up a lot of opportunities
01:55:10.960 | for people, 'cause you can work in different places,
01:55:13.440 | you can stay closer to people who are in different places.
01:55:18.440 | I think that's good.
01:55:19.320 | - Removing barriers of geography,
01:55:21.360 | and then barriers of language, that's a beautiful vision.
01:55:24.900 | Big, ridiculous question.
01:55:27.580 | What do you think is the meaning of life?
01:55:29.640 | (silence)
01:55:31.800 | - I think that, well, there are probably a couple
01:55:46.560 | of different ways that I would go at this.
01:55:51.560 | But I think it gets back to this last question
01:55:53.920 | that we talked about, about the duality between,
01:55:56.560 | you have the people around you who you care
01:55:59.120 | the most about, and then there's this bigger thing
01:56:03.040 | that maybe you're building.
01:56:04.400 | And I think that in my own life, I think about this tension,
01:56:09.360 | but I started this whole company, and my life's work
01:56:12.680 | is around human connection.
01:56:15.120 | So I think it's intellectually, probably the thing
01:56:20.120 | that I go to first is just that human connection
01:56:27.720 | is the meaning.
01:56:28.560 | And I think that it's a thing that our society
01:56:33.300 | probably systematically undervalues.
01:56:37.000 | I just remember when I was growing up and in school,
01:56:42.000 | it's like, do your homework and then go play
01:56:43.680 | with your friends after.
01:56:45.200 | And it's like, no, what if playing with your friends
01:56:47.520 | is the point?
01:56:48.360 | - That sounds like an argument your daughters would make.
01:56:52.360 | - Well, I mean, I don't know, I just think it's interesting.
01:56:54.360 | - Like, the homework doesn't even matter, man.
01:56:56.360 | - Well, I think it's interesting because it's,
01:56:58.560 | people, I think people tend to think about that stuff
01:57:03.360 | as wasting time, or that's what you do in the free time
01:57:07.080 | that you have, but what if that's actually the point?
01:57:11.120 | So that's one.
01:57:12.580 | But here's maybe a different way of coming at this,
01:57:14.760 | which is maybe more religious in nature.
01:57:17.880 | I mean, I always, there's a rabbi who I've studied with
01:57:25.400 | who kind of gave me this, we were talking through Genesis
01:57:29.640 | and the Bible and the Torah,
01:57:31.840 | and they're basically walking through, it's like,
01:57:36.840 | okay, you go through the seven days of creation,
01:57:40.720 | and it's basically, it's like,
01:57:45.640 | why does the Bible start there?
01:57:48.120 | Right, so it could have started anywhere, right,
01:57:49.640 | in terms of like how to live.
01:57:52.000 | But basically it starts with talking about
01:57:54.640 | how God created people in his, her image.
01:57:58.940 | But the Bible starts by talking about
01:58:02.720 | how God created everything.
01:58:04.760 | So I actually think that there's like a compelling argument
01:58:09.760 | that I think I've always just found meaningful
01:58:13.000 | and inspiring, that a lot of the point
01:58:18.400 | of what sort of religion has been telling us
01:58:23.000 | that we should do is to create and build things.
01:58:28.000 | So these things are not necessarily at odds.
01:58:32.080 | I mean, I think like, I mean, that's,
01:58:34.760 | and I think probably to some degree,
01:58:36.080 | you'd expect me to say something like this
01:58:37.720 | because I've dedicated my life to creating things
01:58:39.760 | that help people connect.
01:58:40.600 | So I mean, that's sort of the fusion of,
01:58:42.520 | I mean, getting back to what we talked about earlier,
01:58:45.260 | it's, I mean, what I studied in school
01:58:46.500 | were psychology and computer science, right?
01:58:48.280 | So it's, I mean, these are like the two themes
01:58:50.600 | that I care about, but I don't know, for me,
01:58:54.440 | that's what, that's kind of what I think about.
01:58:56.320 | That's what matters.
01:58:57.160 | - To create and to love,
01:59:00.960 | which is the ultimate form of connection.
01:59:03.080 | I think this is one hell of an amazing replay experience
01:59:07.240 | in the metaverse.
01:59:08.080 | So whoever is using our avatars years from now,
01:59:12.000 | I hope you had fun and thank you for talking today.
01:59:14.800 | - Thank you.
01:59:15.640 | - Thanks for listening to this conversation
01:59:18.160 | with Mark Zuckerberg.
01:59:19.480 | To support this podcast,
01:59:20.880 | please check out our sponsors in the description.
01:59:23.680 | And now let me leave you with the end of the poem,
01:59:27.080 | If by Rudyard Kipling.
01:59:29.160 | If you can talk with crowds and keep your virtue
01:59:34.160 | or walk with Kings, nor lose the common touch.
01:59:37.840 | If neither foes nor loving friends can hurt you.
01:59:41.240 | If all men count with you, but none too much.
01:59:47.120 | If you can fill the unforgiving minute
01:59:49.240 | with 60 seconds worth of distance run,
01:59:52.360 | yours is the earth and everything that's in it.
01:59:56.360 | And which is more, you'll be a man, my son.
01:59:59.600 | Thank you for listening and hope to see you next time.
02:00:04.320 | (upbeat music)
02:00:06.900 | (upbeat music)