
Kevin Systrom: Instagram | Lex Fridman Podcast #243


Chapters

0:00 Introduction
0:22 Origin of Instagram
42:29 New social networks
61:34 Selling Instagram to Facebook
84:36 Features
88:42 Facebook
125:24 Whistleblower
136:43 Machine learning
146:41 Advice for startups
152:33 Money
158:33 Love
160:50 Meaning of life


00:00:00.000 | The following is a conversation with Kevin Systrom,
00:00:03.200 | co-founder and longtime CEO of Instagram,
00:00:06.680 | including for six years
00:00:08.000 | after Facebook's acquisition of Instagram.
00:00:10.760 | This is the Lex Fridman Podcast.
00:00:13.260 | To support it, please check out our sponsors
00:00:15.520 | in the description.
00:00:16.820 | And now, here's my conversation with Kevin Systrom.
00:00:21.200 | At the risk of asking the Rolling Stones
00:00:24.740 | to play Satisfaction,
00:00:25.880 | let me ask you about the origin story of Instagram.
00:00:28.400 | - Sure.
00:00:29.280 | - So maybe some context.
00:00:30.520 | You, like we were talking about offline,
00:00:32.840 | grew up in Massachusetts,
00:00:34.840 | learned computer programming there,
00:00:36.600 | liked to play Doom II,
00:00:38.000 | (Kevin chuckles)
00:00:38.960 | worked at a vinyl record store,
00:00:40.920 | then you went to Stanford,
00:00:42.840 | turned down Mr. Mark Zuckerberg and Facebook,
00:00:47.320 | went to Florence to study photography.
00:00:49.040 | Those are just some random, beautiful,
00:00:51.700 | impossibly brief glimpses into a life.
00:00:54.800 | So let me ask again,
00:00:56.120 | can you take me through the origin story of Instagram,
00:00:58.440 | given that context? - Yeah,
00:00:59.280 | you basically set it up.
00:01:00.480 | (Kevin chuckles)
00:01:01.920 | All right, so we have a fair amount of time,
00:01:04.640 | so I'll go into some detail,
00:01:05.920 | but basically what I'll say is,
00:01:07.480 | Instagram started out of a company actually called Burbn,
00:01:13.520 | and it was spelled B-U-R-B-N.
00:01:16.200 | And a couple of things were happening at the time.
00:01:19.080 | So if we zoom back to 2010,
00:01:21.400 | not a lot of people remember what was happening
00:01:23.440 | in the dot-com world then,
00:01:26.520 | but check-in apps were all the rage.
00:01:29.480 | So-- - What's a check-in app?
00:01:31.040 | - Gowalla, Foursquare, Hot Potato.
00:01:34.360 | - So I'm at a place,
00:01:35.880 | I'm gonna tell the world that I'm at this place.
00:01:37.720 | - That's right.
00:01:38.680 | - What's the idea behind this kind of app, by the way?
00:01:41.280 | - You know what, I'm gonna answer that,
00:01:42.760 | but through what Instagram became
00:01:45.320 | and why I believe Instagram replaced them.
00:01:48.080 | So the whole idea was to share with the world
00:01:49.840 | what you were doing, specifically with your friends, right?
00:01:52.800 | But they were all the rage,
00:01:54.840 | and Foursquare was getting all the press,
00:01:56.400 | and I remember sitting around saying,
00:01:57.720 | "Hey, I wanna build something,
00:01:58.880 | "but I don't know what I wanna build.
00:02:00.520 | "What if I built a better version of Foursquare?"
00:02:04.360 | And I asked myself,
00:02:05.440 | why don't I like Foursquare or how could it be improved?
00:02:09.440 | And basically I sat down and I said,
00:02:12.960 | "I think that if you have a few extra features,
00:02:16.480 | "it might be enough."
00:02:17.480 | One of which happened to be posting a photo
00:02:19.080 | of where you were.
00:02:20.320 | There were some others.
00:02:21.520 | It turns out that wasn't enough.
00:02:23.120 | My co-founder joined,
00:02:24.240 | we were going to attack Foursquare and the likes
00:02:27.560 | and try to build something interesting.
00:02:29.520 | And no one used it, no one cared,
00:02:31.760 | because it wasn't enough.
00:02:32.600 | It wasn't different enough, right?
00:02:35.720 | So one day we were sitting down and we asked ourselves,
00:02:37.960 | "Okay, it's come to Jesus moment.
00:02:40.800 | "Are we gonna do this startup?
00:02:43.040 | "And if we're going to,
00:02:44.320 | "we can't do what we're currently doing.
00:02:46.040 | "We have to switch it up.
00:02:46.920 | "So what do people love the most?"
00:02:48.360 | So we sat down and we wrote out three things
00:02:51.760 | that we thought people uniquely loved about our product
00:02:54.640 | that weren't in other products.
00:02:57.200 | Photos happened to be the top one.
00:02:59.200 | So sharing a photo of what you were doing,
00:03:01.360 | where you were at the moment
00:03:03.160 | was not something products let you do, really.
00:03:06.640 | Facebook was like,
00:03:07.800 | "Post an album of your vacation from two weeks ago."
00:03:11.400 | Twitter allowed you to post a photo,
00:03:13.400 | but their feed was primarily text
00:03:15.560 | and they didn't show the photo in line,
00:03:17.400 | or at least I don't think they did at the time.
00:03:19.720 | So even though it seems totally stupid
00:03:23.520 | and obvious to us now,
00:03:25.240 | at the moment then,
00:03:27.280 | posting a photo of what you were doing at the moment
00:03:29.360 | was like not a thing.
00:03:30.600 | So we decided to go after that
00:03:34.160 | because we had noticed that people who used our service,
00:03:36.480 | the one thing they happened to like the most
00:03:38.640 | was posting a photo.
00:03:40.240 | So that was the beginning of Instagram.
00:03:41.760 | And yes, like we went through and we added filters
00:03:44.480 | and there's a bunch of stories around that.
00:03:46.440 | But the origin of this was that
00:03:48.160 | we were trying to be a check-in app,
00:03:49.560 | realized that no one wanted another check-in app.
00:03:52.360 | It became a photo sharing app,
00:03:54.000 | but one that was much more about what you're doing
00:03:56.280 | and where you are.
00:03:57.480 | And that's why when I say
00:03:58.600 | I think we've replaced check-in apps,
00:04:01.000 | it became a check-in via a photo
00:04:03.680 | rather than saying your location
00:04:06.560 | and then optionally adding a photo.
00:04:08.760 | - When you were thinking about what people like,
00:04:11.440 | from where did you get a sense
00:04:13.240 | that this is what people like?
00:04:14.480 | You said, "We sat down, we wrote some stuff down on paper."
00:04:18.920 | Where is that intuition
00:04:20.840 | that seems fundamental to the success
00:04:23.280 | of an app like Instagram?
00:04:26.800 | Where does that idea,
00:04:27.880 | where does that list of three things come from exactly?
00:04:31.520 | - Only after having studied machine learning now
00:04:33.640 | for a couple of years, I like, I have a-
00:04:36.880 | - You have understood yourself?
00:04:38.440 | - I've started to make connections,
00:04:40.800 | like we can go into this later,
00:04:43.360 | but obviously the connections between
00:04:49.120 | machine learning and the human brain,
00:04:51.680 | I think are stretched sometimes, right?
00:04:53.760 | At the same time, being able to back prop
00:04:57.400 | and being able to like look at the world, try something,
00:05:01.000 | figure out how you're wrong, how wrong you are,
00:05:04.240 | and then nudge your company in the right direction
00:05:08.040 | based on how wrong you are.
00:05:10.080 | It's like a fascinating concept, right?
00:05:12.280 | And I don't, we didn't know we were doing it at the time,
00:05:14.720 | but that's basically what we were doing, right?
00:05:17.120 | We put it out to, call it, a hundred people
00:05:19.360 | and you would look at their data.
00:05:22.800 | You would say, what are they sharing?
00:05:24.840 | What, like what resonates, what doesn't resonate?
00:05:26.960 | We think they're going to resonate with X,
00:05:29.040 | but it turns out they resonate with Y.
00:05:31.120 | Okay, shift the company towards Y.
00:05:33.680 | And it turns out if you do that enough, quickly enough,
00:05:36.320 | you can get to a solution that has product market fit.
00:05:39.800 | Most companies fail because they sit there
00:05:42.640 | and they don't, either their learning rate's too slow,
00:05:45.520 | they sit there and they're just, they're adamant
00:05:47.440 | that they're right, even though the data's telling them
00:05:49.280 | they're not right, or their learning rate's too high
00:05:53.240 | and they wildly chase different ideas
00:05:55.240 | and they never actually settle on one
00:05:57.080 | where they find a groove, right?
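To make the analogy concrete, here is a toy gradient-descent sketch (illustrative numbers only, not anything Instagram actually ran): too small a learning rate barely moves, too large a rate diverges, and a tuned one converges.

```python
# Toy illustration of the learning-rate analogy: gradient descent on f(x) = x^2.
def gradient_descent(lr, steps=50, x=5.0):
    for _ in range(steps):
        x -= lr * 2 * x  # gradient of x^2 is 2x; nudge toward the minimum
    return x

print(gradient_descent(lr=0.001))  # too slow: barely moves (the "adamant" company)
print(gradient_descent(lr=1.1))    # too high: overshoots and diverges (wild pivots)
print(gradient_descent(lr=0.1))    # tuned: settles near the minimum (finds a groove)
```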
00:05:59.520 | And I think when we sat down
00:06:00.600 | and we wrote out those three ideas,
00:06:01.960 | what we were saying is, what are the three possible,
00:06:05.720 | whether they're local or global maxima in our world, right?
00:06:09.520 | That users are telling us they like
00:06:12.800 | because they're using the product that way.
00:06:14.960 | It was clear people liked the photos
00:06:16.480 | 'cause that was the thing they were doing.
00:06:18.680 | And we just said, okay, like, what if we just cut out
00:06:21.400 | most of the other stuff and focus on that thing?
00:06:24.080 | And then it happened to be a multi-billion dollar business
00:06:27.480 | and it's that easy, by the way.
00:06:30.240 | - Yeah, I guess so.
00:06:32.320 | Well, nobody ever writes about neural networks
00:06:34.320 | that miserably failed.
00:06:36.160 | So this particular neural network succeeded.
00:06:38.480 | - Oh, they fail all the time, right?
00:06:40.200 | - Yeah, but nobody writes about it.
00:06:41.040 | - The default state is failing.
00:06:42.920 | - Yes.
00:06:44.680 | - When you said the way people are using the app,
00:06:47.360 | is that the loss function for this neural network
00:06:51.200 | or is it also self-report?
00:06:53.000 | Like, do you ever ask people what they like
00:06:54.920 | or do you have to track exactly what they're doing,
00:06:58.560 | not what they're saying?
00:07:00.040 | - I once made a Thanksgiving dinner, okay?
00:07:03.160 | And it was for relatives and I like to cook a lot.
00:07:08.160 | And I worked really hard on picking the specific dishes
00:07:13.720 | and I was really proud because I had planned it out
00:07:17.080 | using a Gantt chart and it was ready on time
00:07:19.880 | and everything was hot.
00:07:21.280 | - Nice.
00:07:22.120 | - I don't know if you're a big Thanksgiving guy,
00:07:23.320 | but the worst thing about Thanksgiving
00:07:25.680 | is when the turkey is cold and some things are hot
00:07:28.160 | and some things, anyway.
00:07:29.240 | - You had a Gantt chart.
00:07:30.080 | Did you actually have a chart?
00:07:31.360 | - Oh yeah, yeah, OmniPlan.
00:07:33.200 | Fairly expensive Gantt chart thing
00:07:36.080 | that I think maybe 10 people have purchased in the world,
00:07:39.400 | but I'm one of them and I use it for recipe planning,
00:07:42.320 | only around big holidays.
00:07:44.440 | - That's brilliant, by the way.
00:07:45.960 | Do people do this kind of--
00:07:47.880 | - Over-engineering?
00:07:49.000 | - It's not over-engineering, it's just engineering.
00:07:50.840 | It's planning.
00:07:51.680 | Thanksgiving is a complicated set of events
00:07:54.760 | with some uncertainty with a lot of things going on.
00:07:57.460 | You should be planning it in this way.
00:07:59.800 | There should be a chart.
00:08:00.880 | It's not over-engineering.
00:08:02.040 | - So what's funny is, brief aside.
00:08:04.880 | - Yes, it's brilliant.
00:08:06.480 | - I love cooking, I love food, I love coffee,
00:08:08.680 | and I've spent some time with some chefs
00:08:10.400 | who know their stuff.
00:08:12.200 | And they always just take out a piece of paper
00:08:15.560 | and just work backwards in rough order.
00:08:18.200 | It's never perfect, but rough order.
00:08:20.560 | It's just like, oh, that makes sense.
00:08:21.920 | Why not just work backwards from the end goal, right?
00:08:24.800 | And put in some buffer time.
00:08:26.840 | So I probably over-specified it a bit using a Gantt chart,
00:08:29.880 | but the fact that you can do it,
00:08:32.760 | it's what professional kitchens roughly do.
00:08:35.280 | They just don't call it a Gantt chart,
00:08:36.920 | or at least I don't think they do.
00:08:38.720 | Anyway, I was telling a story about Thanksgiving.
00:08:40.320 | So here's the thing.
00:08:42.960 | I'm sitting down, we have the meal,
00:08:44.760 | and then I got to know Ray Dalio fairly well
00:08:49.240 | over maybe the last year of Instagram.
00:08:51.680 | And one thing that he kept saying was like,
00:08:55.160 | feedback is really hard to get, honestly, from people.
00:08:59.920 | And I sat down after dinner, I said, guys, I want feedback.
00:09:03.720 | What was good and what was bad?
00:09:05.680 | - Yes.
00:09:06.520 | - And what's funny is like,
00:09:07.840 | literally everyone just said everything was great.
00:09:10.320 | And I like personally knew
00:09:12.640 | I had screwed up a handful of things,
00:09:14.680 | but no one would say it.
00:09:17.640 | And can you imagine now,
00:09:19.560 | not something as high stakes as Thanksgiving dinner, okay?
00:09:22.080 | Thanksgiving dinner, it's not that high stakes,
00:09:25.160 | but you're trying to build a product
00:09:26.440 | and everyone knows you left your job for it,
00:09:28.440 | and you're trying to build it out,
00:09:29.600 | and you're trying to make something wonderful,
00:09:32.040 | and it's yours, right?
00:09:33.480 | You designed it.
00:09:34.600 | Now try asking for feedback.
00:09:37.920 | And know that you're giving this
00:09:39.200 | to your friends and your family.
00:09:41.120 | People have trouble giving hard feedback.
00:09:45.560 | People have trouble saying, I don't like this,
00:09:48.840 | or this isn't great, or this is how it's failed me.
00:09:51.960 | In fact, you usually have two classes of people.
00:09:56.600 | People who just won't say bad things.
00:09:58.800 | You can literally say to them,
00:10:00.440 | please tell me what you hate most about this,
00:10:02.360 | and they won't do it.
00:10:03.200 | They'll try, but they won't.
00:10:04.480 | And then the other class of people
00:10:06.920 | are just negative, period, about everything,
00:10:09.720 | and it's hard to parse out what is true and what isn't.
00:10:13.960 | So my rule of thumb with this
00:10:17.000 | is you should always ask people.
00:10:20.160 | But at the end of the day,
00:10:21.200 | it's amazing what data will tell you.
00:10:23.480 | And that's why with whatever project I work on, even now,
00:10:27.320 | collecting data from the beginning on usage patterns,
00:10:30.280 | so engagement, how many days of the week do they use it,
00:10:34.360 | how many, I don't know, if we were to go back to Instagram,
00:10:36.720 | how many impressions per day, right?
00:10:39.480 | Is that growing, is that shrinking?
00:10:41.160 | And don't be overly scientific about it, right?
00:10:44.760 | 'Cause maybe you have 50 beta users or something.
00:10:48.360 | But what's fascinating is that data doesn't lie.
00:10:52.800 | People are very defensive about their time.
00:10:57.600 | They'll say, oh, I'm so busy,
00:10:59.320 | I'm sorry I didn't get to use the app.
00:11:01.320 | But I don't know,
00:11:04.200 | they're posting on Instagram the whole time.
00:11:07.160 | So I don't know, at the end of the day,
00:11:09.240 | like at Facebook, there was,
00:11:11.520 | before time spent became kind of this loaded term there,
00:11:15.800 | the idea that people's currency in their lives is time,
00:11:21.040 | and they only have a certain amount of time to give things,
00:11:23.400 | whether it's friends or family or apps
00:11:25.440 | or TV shows or whatever,
00:11:27.560 | there's no way of inventing more of it,
00:11:28.960 | at least not that I know of.
00:11:32.720 | If they don't use it, it's because it's not great.
00:11:35.560 | So the moral of the story is you can ask all you want,
00:11:39.720 | but you just have to look at the data.
00:11:41.880 | And data doesn't lie, right?
00:11:45.360 | - I mean, there's metrics,
00:11:46.840 | data can obscure the key insight if you're not careful.
00:11:54.040 | So time spent in the app, that's one.
00:11:57.920 | There's so many metrics you can put at this,
00:11:59.840 | and they will give you totally different insights,
00:12:02.520 | especially when you're trying to create something
00:12:04.600 | that doesn't obviously exist yet.
00:12:07.680 | So measuring maybe why you left the app
00:12:12.680 | or measuring special moments of happiness
00:12:17.760 | that will make sure you return to the app
00:12:20.000 | or moments of happiness that are long lasting
00:12:22.360 | versus like dopamine short-term, all of those things.
00:12:26.440 | But I think, I suppose in the beginning,
00:12:29.680 | you can just get away with just asking the question,
00:12:33.200 | which features are used a lot?
00:12:35.400 | Let's do more of that.
00:12:37.320 | And how hard was the decision?
00:12:40.720 | And I mean, maybe you can tell me
00:12:42.560 | what Instagram looked in the beginning,
00:12:44.440 | but how hard was it to make pictures the first class citizen?
00:12:48.480 | That's a revolutionary idea.
00:12:50.480 | Like at whatever point Instagram became this feed of photos,
00:12:55.480 | that's quite brilliant.
00:12:58.480 | Plus, I also don't know when this happened,
00:13:02.040 | but they're all shaped the same.
00:13:04.380 | It's like- - I have to tell you why.
00:13:07.760 | That's the interesting part.
00:13:09.160 | - Why is that? - So a couple of things.
00:13:11.940 | One is data, like you're right,
00:13:16.860 | you can over-interpret data.
00:13:18.060 | Like imagine trying to fly a plane
00:13:20.760 | by staring at, I don't know, a single metric like airspeed.
00:13:25.760 | You don't know if you're going up or down.
00:13:27.080 | I mean, it correlates with up or down,
00:13:28.640 | but you don't actually know.
00:13:30.000 | It will never help you land the plane.
00:13:32.280 | So don't stare at one metric.
00:13:33.960 | Like it turns out you have to synthesize a bunch of metrics
00:13:36.880 | to know where to go, but it doesn't lie.
00:13:40.160 | Like if your airspeed is zero,
00:13:41.440 | unless it's not working, right?
00:13:43.120 | If it's zero, you're probably gonna fall out of the sky.
00:13:46.400 | So generally you look around and you have the scan going.
00:13:50.560 | - Yes.
00:13:51.700 | - And you're just asking yourself,
00:13:52.940 | is this working or is this not working?
00:13:56.400 | But people have trouble explaining how they actually feel.
00:14:01.400 | So just, it's about synthesizing both of them.
00:14:05.160 | So then Instagram, right?
00:14:07.840 | We were talking about revolutionary moment
00:14:10.520 | where the feed became square photos basically.
00:14:14.240 | - And photos first and then square photos.
00:14:16.400 | - Yeah.
00:14:17.240 | It was clear to me that the biggest,
00:14:21.600 | so I believe the biggest companies are founded
00:14:26.600 | when enormous technical shifts happen.
00:14:30.840 | And the biggest technical shift that happened
00:14:32.920 | right before Instagram was founded
00:14:34.920 | was the advent of a phone that didn't suck, the iPhone.
00:14:38.720 | Right?
00:14:39.540 | Like in retrospect, we're like, oh my God,
00:14:40.800 | the first iPhone almost had, like it wasn't that good,
00:14:44.600 | but compared to everything else at the time, it was amazing.
00:14:49.000 | And by the way, the first phone that had an incredible camera
00:14:54.000 | that could like do as well as the point and shoot
00:14:57.760 | you might carry around was the iPhone 4.
00:15:01.160 | And that was right when Instagram launched.
00:15:02.980 | And we looked around and we said,
00:15:04.680 | what will change because everyone has a camera
00:15:07.080 | in their pocket?
00:15:08.600 | And it was so clear to me that the world of social networks
00:15:13.300 | before it was based in the desktop and sitting there
00:15:17.540 | and having a link you could share, right?
00:15:20.800 | And that wasn't gonna be the case.
00:15:21.840 | The question is, what would you share
00:15:23.440 | if you were out and about in the world?
00:15:25.440 | If not only did you have a camera that fit in your pocket,
00:15:29.640 | but by the way, that camera had a network attached to it
00:15:32.020 | that allowed you to share instantly.
00:15:34.800 | That seemed revolutionary.
00:15:36.120 | And a bunch of people saw it at the same time.
00:15:37.740 | It wasn't just Instagram.
00:15:38.800 | There were a bunch of competitors.
00:15:40.500 | The thing we did, I think, was not only,
00:15:44.120 | well, we focused on two things.
00:15:45.360 | So we wrote down those things.
00:15:46.880 | We circled photos and we said,
00:15:48.040 | I think we should invest in this.
00:15:50.140 | But then we said, what sucks about photos?
00:15:52.800 | One, they look like crap, right?
00:15:55.560 | They just, at least back then.
00:15:57.080 | Now my phone takes pretty great photos, right?
00:16:01.360 | Back then they were blurry, not so great, compressed, right?
00:16:05.300 | Two, it was really slow, like really slow to upload a photo.
00:16:11.160 | And I'll tell a fun story about that
00:16:13.760 | and explain to you why they're all the same size
00:16:15.940 | and square as well.
00:16:17.020 | And three, man, if you wanted to share a photo
00:16:21.820 | on different networks,
00:16:23.240 | you had to go to each of the individual apps
00:16:25.240 | and select all of them and upload individually.
00:16:27.840 | And so we were like, all right, those are the pain points.
00:16:30.600 | We're gonna focus on that.
00:16:32.000 | So one, instead of, because they weren't beautiful,
00:16:36.080 | we were like, why don't we lean into the fact
00:16:37.600 | that they're not beautiful?
00:16:38.440 | And I remember studying in Florence.
00:16:40.680 | My photography teacher gave me this Holga camera
00:16:43.040 | and I'm not sure everyone knows what a Holga camera is,
00:16:44.960 | but they're these old school plastic cameras.
00:16:47.200 | I think they were produced in China at the time.
00:16:50.800 | And I wanna say the original ones were like from the 70s
00:16:55.040 | or the 80s or something.
00:16:56.040 | They're supposed to be like $3 cameras for the every person.
00:16:59.040 | They took nice medium format films, large negatives,
00:17:05.040 | but they kind of blurred the light
00:17:07.920 | and they kind of like light leaked into the side.
00:17:10.280 | And there was this whole resurgence
00:17:12.480 | where people looked at that and said,
00:17:13.880 | oh my God, this is a style, right?
00:17:16.920 | And I remember using that in Florence and just saying,
00:17:19.040 | well, why don't we just like lean into the fact
00:17:20.880 | that these photos suck and make them suck more,
00:17:24.360 | but in an artistic way.
00:17:26.260 | And it turns out that had product market fit.
00:17:28.080 | People really liked that.
00:17:29.240 | They were willing to share their not so great photos
00:17:31.760 | if they looked not so great on purpose, okay?
00:17:35.880 | The second part.
00:17:37.440 | - That's where the filters come into the picture.
00:17:40.360 | So computational modification of photos
00:17:42.600 | and make them look extra crappy to where it becomes art.
00:17:46.200 | - Yeah, yeah.
00:17:47.040 | And I mean, add light leaks, add like an overlay filter,
00:17:51.720 | make them more contrasty than they should be.
00:17:54.120 | The first filter we ever produced was called X-Pro2
00:17:58.080 | and I designed it while I was in this small little
00:18:01.000 | bed and breakfast room in Todos Santos, Mexico.
00:18:03.640 | I was trying to take a break from the Burbn days.
00:18:06.800 | And I remember saying to my co-founder,
00:18:08.840 | I just need like a week to reset.
00:18:11.080 | And that was on that trip, worked on the first filter
00:18:15.080 | because I said, you know, I think I can do this.
00:18:17.240 | And I literally iterated one by one over the RGB values
00:18:22.060 | in the array that was the photo and just slightly shifted.
00:18:26.280 | Basically there was a function of R, function of G,
00:18:29.400 | function of B that just shifted them slightly.
00:18:32.480 | It wasn't rocket science.
00:18:34.000 | And it turns out that actually
00:18:36.540 | made your photo look pretty cool.
00:18:38.520 | It just mapped from one color space to another color space.
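A minimal sketch of the per-channel shift described here, assuming Python with Pillow; the actual X-Pro2 curves aren't public, so the multipliers below are invented for illustration:

```python
# Hedged sketch: iterate pixel by pixel and apply f(R), f(G), f(B) independently.
# Pure-Python loops like this are slow, which matches the 2-3 second renders.
from PIL import Image

def xpro_like(path):
    img = Image.open(path).convert("RGB")
    px = img.load()
    w, h = img.size
    for y in range(h):
        for x in range(w):
            r, g, b = px[x, y]
            px[x, y] = (min(255, int(r * 1.10)),  # warm the reds (invented curve)
                        min(255, int(g * 1.05)),  # lift the greens slightly
                        max(0, int(b * 0.90)))    # crush the blues for contrast
    return img
```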
00:18:42.820 | It was simple, but it was really slow.
00:18:44.440 | I mean, if you applied a filter,
00:18:47.320 | I think it used to take two or three seconds to render.
00:18:50.500 | Only eventually would I figure out how to do it on the GPU.
00:18:53.200 | And I'm not even sure it was a GPU,
00:18:55.240 | but it was using OpenGL, but anyway.
00:18:57.080 | I would eventually figure that out
00:18:59.960 | and then it would be instant,
00:19:01.000 | but it used to be really slow.
00:19:02.800 | By the way, anyone who's watching or listening,
00:19:06.000 | it's amazing what you can get away with in a startup,
00:19:10.080 | as long as the product outcome is right for the user.
00:19:14.080 | Like you can be slow, you can be terrible, you can be,
00:19:17.240 | as long as you have product market fit,
00:19:21.040 | people will put up with a lot.
00:19:22.480 | And then the question is just about compressing,
00:19:24.920 | making it more performant over time
00:19:27.680 | so that they get that product market fit instantly.
00:19:30.460 | - So fascinating because there's some things
00:19:32.840 | where those three seconds would make or break the app,
00:19:37.480 | but some things you're saying not.
00:19:39.520 | It's hard to know when,
00:19:41.040 | you know, it's the problem Spotify solved,
00:19:43.580 | making streaming work.
00:19:47.500 | And delays in listening to music is a huge negative,
00:19:53.040 | even slight delays.
00:19:54.620 | But here you're saying, I mean,
00:19:57.200 | how do you know when those three seconds are okay?
00:20:00.520 | Or are you just gonna have to try it out?
00:20:03.760 | Because to me, my intuition would be
00:20:07.320 | those three seconds would kill the app.
00:20:09.900 | Like I would try to do the OpenGL thing.
00:20:12.240 | - Right.
00:20:13.240 | So I wish I were that smart at the time.
00:20:16.180 | I wasn't, I just knew how to do what I knew how to do.
00:20:20.720 | Right?
00:20:21.680 | And I decided, okay, like,
00:20:23.560 | why don't I just iterate over the values and change them?
00:20:27.200 | And what's interesting is that
00:20:29.600 | compared to the alternatives, no one else used OpenGL.
00:20:35.000 | - Right.
00:20:35.840 | - So everyone else was doing it the dumb way.
00:20:37.120 | And in fact, they were doing it at a high resolution.
00:20:40.440 | Now comes in the small resolution
00:20:42.520 | that we'll talk about in a second.
00:20:45.120 | By choosing 512 pixels by 512 pixels,
00:20:48.620 | which I believe it was at the time,
00:20:51.000 | we iterated over a lot fewer pixels than our competitors
00:20:54.360 | who were trying to do these enormous output,
00:20:56.560 | like images.
00:20:57.840 | - Yeah.
00:20:58.680 | - So instead of taking 20 seconds,
00:21:00.960 | I mean, three seconds feels pretty good, right?
00:21:03.320 | So on a relative basis, we were winning like a lot.
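The arithmetic behind that relative win is simple (the full-resolution size here is an assumption for illustration):

```python
small = 512 * 512    # 262,144 pixels per Instagram upload
full = 2048 * 1536   # ~3.1 million pixels for a hypothetical full-resolution output
print(full / small)  # 12.0: roughly 12x fewer pixels to iterate over,
                     # which is how a ~20s render becomes a few seconds
```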
00:21:06.340 | Okay.
00:21:07.180 | So that's answer number one.
00:21:08.520 | Answer number two is,
00:21:09.760 | we actually focused on latency in the right places.
00:21:13.120 | So we did this really wonderful thing when you uploaded.
00:21:17.480 | So the way it would work is, you know,
00:21:19.720 | you'd take your phone, you'd take the photo,
00:21:22.720 | and then you'd go to the, you'd go to the edit screen,
00:21:26.420 | where you would caption it.
00:21:27.760 | And on that caption screen, you'd start typing,
00:21:31.200 | and you'd think, okay, like, what's a clever caption?
00:21:34.200 | And I said to Mike, "Hey, when I worked on the Gmail team,
00:21:36.360 | you know what they did?
00:21:37.440 | When you typed in your username or your email address,
00:21:40.860 | even before you've entered in your password,
00:21:43.260 | like the probability, once you enter in your username,
00:21:47.160 | that you're going to actually sign in is extremely high.
00:21:51.120 | So why not just start loading your account
00:21:52.640 | in the background?
00:21:53.480 | Not like sending it down to the desktop,
00:21:55.720 | that would be a security issue,
00:21:59.400 | but like loaded into memory on the server,
00:22:01.360 | like get it ready, prepare it."
00:22:03.640 | I always thought that was so fascinating and unintuitive.
00:22:07.000 | And I was like, "Mike, why don't we just do that?
00:22:08.920 | But like, we'll just upload the photo and like assume
00:22:12.500 | you're going to upload the photo.
00:22:14.160 | And if you don't, forget about it, we'll delete it, right?"
00:22:17.640 | So what ended up happening
00:22:19.360 | was people would caption their photo,
00:22:22.160 | they'd press done or upload,
00:22:24.680 | and you'd see this little progress bar just go,
00:22:26.800 | and it was lightning fast, okay?
00:22:30.200 | We were no faster than anyone else at the time,
00:22:33.000 | but by choosing 512 by 512 and doing it in the background,
00:22:37.480 | it almost guaranteed that it was done
00:22:39.720 | by the time you captioned.
00:22:41.360 | And everyone, when they used it, was like,
00:22:43.960 | "How the hell is this thing so fast?"
00:22:47.040 | But we were slow, we just hid the slowness.
00:22:49.380 | It wasn't like, these things are just like,
00:22:51.520 | it's a shell game, you're just hiding the latency.
00:22:54.260 | That mattered to people like a lot.
00:22:58.760 | And I think that's why you were willing to put up
00:23:01.080 | with a slow filter if it meant you could share it immediately.
00:23:04.720 | And of course we added sharing options
00:23:06.600 | which let you distribute it really quickly,
00:23:08.260 | that was the third part.
00:23:09.460 | So latency matters, but relative to what?
00:23:14.560 | And then there's some like tricks you get around
00:23:18.520 | to just hiding the latency.
00:23:20.600 | Like, I don't know if Spotify starts downloading
00:23:23.080 | the next song eagerly, I'm assuming they do.
00:23:25.800 | There are a bunch of ideas here
00:23:27.840 | that are not rocket science that really help.
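A minimal sketch of that latency-hiding trick, with all names hypothetical: start the upload the moment the caption screen opens, so the transfer races the user's typing instead of blocking the Done button.

```python
import threading
import time

def upload_photo(photo_bytes):
    time.sleep(2)  # stand-in for the real network transfer
    return "server-photo-id"

class CaptionScreen:
    def __init__(self, photo_bytes):
        self.photo_id = None
        # eager: start uploading before the user has typed a single character
        self.thread = threading.Thread(target=self._upload, args=(photo_bytes,))
        self.thread.start()

    def _upload(self, photo_bytes):
        self.photo_id = upload_photo(photo_bytes)

    def done(self, caption):
        # usually already finished while the caption was being typed;
        # if the user cancels instead, the already-uploaded copy gets deleted
        self.thread.join()
        return self.photo_id, caption
```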
00:23:31.760 | - And all of that was stuff you were explicitly
00:23:35.120 | having a discussion about?
00:23:36.240 | Like those designs and argument,
00:23:38.480 | you were having like arguments, discussions.
00:23:40.720 | - I'm not sure it was arguments.
00:23:42.600 | I mean, I'm not sure if you've met my co-founder, Mike,
00:23:45.520 | but he's a pretty nice guy and he's very reasonable.
00:23:48.520 | And we both just saw eye to eye and we're like, yeah,
00:23:51.720 | it's like, if you make this fast or at least seem fast,
00:23:55.560 | it'll be great.
00:23:57.400 | I mean, honestly, I think the most contentious thing,
00:23:59.480 | and he would say this too initially,
00:24:01.280 | was I was on an iPhone 3G, so like the not so fast one.
00:24:07.760 | And he had a brand new iPhone 4; I was cheap.
00:24:10.880 | - Nice.
00:24:12.520 | - And his feed loaded super smoothly.
00:24:15.160 | Like when he would scroll from photo to photo,
00:24:18.480 | buttery smooth, right?
00:24:20.120 | But on my phone, every time you got to a new photo,
00:24:22.520 | it was like, ka-chunk, ka-chunk, allocate memory,
00:24:25.160 | like all this stuff, right?
00:24:27.200 | I was like, Mike, that's unacceptable.
00:24:29.040 | He's like, oh, come on, man,
00:24:30.040 | just like upgrade your phone, basically.
00:24:32.120 | You didn't actually say that, he's nicer than that.
00:24:34.560 | But I could tell he wished like I would just stop being cheap
00:24:37.720 | and just get a new phone.
00:24:39.440 | But what's funny is we actually sat there working
00:24:41.280 | on that little detail for a few days before launch.
00:24:45.160 | And that polished experience,
00:24:48.360 | plus the fact that uploading seemed fast
00:24:50.800 | for all these people who didn't have nice phones,
00:24:54.120 | I think meant a lot.
00:24:55.640 | Because far too often you see teams focus
00:24:58.640 | not on performance, they focus on
00:25:01.880 | what's the cool computer science problem they can solve,
00:25:04.920 | right?
00:25:05.760 | Can we scale this thing to a billion users
00:25:08.240 | and they've got like a hundred, right?
00:25:10.760 | - Yeah.
00:25:11.600 | - You talked about loss function,
00:25:13.760 | so I wanna come back to that.
00:25:15.800 | Like the loss function is like,
00:25:17.520 | do you provide a great, happy, magical,
00:25:20.440 | whatever experience for the consumer?
00:25:22.920 | And listen, if it happens to involve something complex
00:25:25.720 | and technical, then great.
00:25:27.520 | But it turns out, I think most of the time,
00:25:30.520 | those experiences are just sitting there waiting to be built
00:25:34.560 | with like not that complex solutions.
00:25:37.520 | But everyone is just like so stuck in their own head
00:25:40.600 | that they have to over-engineer everything
00:25:42.400 | and then they forget about the easy stuff.
00:25:44.800 | - And you also, maybe to flip the loss function there is,
00:25:48.200 | you're trying to minimize the number of times
00:25:50.480 | there's unpleasant experience, right?
00:25:53.800 | Like the one you mentioned where
00:25:55.440 | when you go to the next photo, it freezes for a little bit.
00:25:58.200 | So it's almost, as opposed to maximizing pleasure,
00:26:00.720 | it's probably easier to minimize the number of like,
00:26:04.480 | the friction.
00:26:05.600 | - Yeah.
00:26:06.440 | And as we all know, you just make the pleasure negative
00:26:10.160 | and then minimize everything, so.
00:26:13.080 | - We're mapping this all back to neural networks.
00:26:15.000 | - But actually, can I say one thing on that?
00:26:16.560 | Which is, I don't know a lot about machine learning,
00:26:19.760 | but I feel like I've tried studying a bunch.
00:26:23.080 | That whole idea of reinforcement learning
00:26:25.920 | and planning out more than the greedy single experience,
00:26:30.600 | I think is the closest you can get
00:26:34.040 | to like ideal product design thinking,
00:26:38.560 | where you're not saying, hey, like,
00:26:40.680 | can we have a great experience?
00:26:42.000 | Just this one time.
00:26:43.720 | But like, what is the right way to onboard someone?
00:26:46.320 | What series of experiences correlate most
00:26:49.120 | with them hanging on long-term, right?
00:26:52.640 | So not just saying, oh, did the photo load slowly
00:26:55.640 | a couple times, or did they get a great photo
00:26:57.920 | at the top of their feed?
00:27:00.080 | But like, what are the things that are gonna make
00:27:01.880 | this person come back over the next week,
00:27:04.920 | over the next month?
00:27:06.400 | And as a product designer asking yourself,
00:27:08.440 | okay, I wanna optimize,
00:27:10.120 | not just minimize bad experiences in the short run,
00:27:12.400 | but like, how do I get someone to engage
00:27:16.000 | over the next month?
00:27:17.880 | And I'm not gonna claim that I thought that way at all
00:27:21.120 | at the time, 'cause I certainly didn't.
00:27:23.120 | But if I were going back and giving myself any advice,
00:27:25.560 | it would be thinking, what are those second order effects
00:27:28.720 | that you can create?
00:27:30.360 | And it turns out having your friends on the service
00:27:33.360 | is an enormous win.
00:27:34.840 | So starting with a very small group of people
00:27:37.600 | that produce content that you wanted to see,
00:27:39.920 | which we did, we seeded the community very well, I think,
00:27:43.840 | ended up mattering, and so.
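A toy version of that greedy-versus-long-horizon contrast (numbers invented purely for illustration):

```python
# Per-session "reward" for two hypothetical onboarding designs over four weeks.
designs = {
    "flashy_first_session": [1.0, 0.2, 0.1, 0.0],  # great day one, no reason to return
    "seed_friend_graph":    [0.6, 0.7, 0.8, 0.9],  # slower start, compounding value
}
greedy = max(designs, key=lambda k: designs[k][0])       # optimize the single experience
long_term = max(designs, key=lambda k: sum(designs[k]))  # optimize the month
print(greedy, long_term)  # flashy_first_session seed_friend_graph
```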
00:27:46.600 | - Yeah, you said that community
00:27:47.920 | is one of the most important things.
00:27:49.640 | So it's from a metrics perspective,
00:27:52.000 | from maybe a philosophy perspective,
00:27:55.200 | building a certain kind of community within the app.
00:27:57.880 | See, I wasn't sure what exactly you meant by that
00:28:00.240 | when I've heard you say that.
00:28:01.920 | Maybe you can elaborate, but as I understand now,
00:28:04.400 | it can literally mean get your friends onto the app.
00:28:09.800 | - Yeah, think of it this way.
00:28:11.960 | You can build an amazing restaurant or bar or whatever,
00:28:16.680 | right, but if you show up and you're the only one there,
00:28:20.280 | is it like, does it matter how good the food is?
00:28:22.920 | The drinks, whatever?
00:28:25.600 | No, these are inherently social experiences
00:28:28.800 | that we were working on.
00:28:30.440 | So the idea of having people there,
00:28:34.280 | like you needed to have that,
00:28:36.880 | otherwise it was just a filter app.
00:28:39.200 | But by the way, part of the genius,
00:28:41.360 | I'm gonna say genius, even though it wasn't really genius,
00:28:43.880 | was that masquerading
00:28:45.600 | as a filter app was awesome.
00:28:50.280 | The fact that you could,
00:28:51.200 | so we talk about single player mode a lot,
00:28:53.000 | which is like, can you play the game alone?
00:28:55.360 | And Instagram, you could totally play alone.
00:28:57.960 | You could filter your photos.
00:28:59.360 | And a lot of people would tell me,
00:29:00.880 | I didn't even realize that this thing was a social network
00:29:04.560 | until my friend showed up.
00:29:06.120 | It totally worked as a single player game.
00:29:09.520 | And then when your friend showed up,
00:29:10.920 | all of a sudden it was like,
00:29:11.760 | oh, not only was this great alone,
00:29:14.080 | but now I actually have this trove of photos
00:29:16.920 | that people can look at and start liking,
00:29:19.320 | and then I can like theirs.
00:29:20.440 | And so it was this bootstrap method
00:29:23.160 | of how do you make the thing not suck
00:29:25.160 | when the restaurant is empty?
00:29:27.000 | - Yeah, but the thing is, when you say friends,
00:29:29.560 | I mean, we're not necessarily referring to friends
00:29:31.680 | in the physical space.
00:29:32.720 | So you're not bringing your physical friends with you.
00:29:35.280 | You're also making new friends.
00:29:36.720 | So you're finding new community.
00:29:38.360 | So it's not immediately obvious to me
00:29:40.120 | that it's almost like building any kind of community.
00:29:45.000 | - It was both.
00:29:46.520 | And what we learned very early on
00:29:48.640 | was what made Instagram special
00:29:50.160 | and the reason why you would sign up for it
00:29:51.840 | versus say, just sit on Facebook
00:29:53.480 | and look at your friends' photos.
00:29:55.800 | Of course we were live,
00:29:56.880 | and of course it was interesting
00:29:58.240 | to see what your friends were doing now.
00:30:00.320 | But the fact that you could connect with people
00:30:02.160 | who took really beautiful photos in a certain style
00:30:05.320 | all around the world, whether they were travelers,
00:30:07.320 | it was the beginning of the influencer economy.
00:30:11.400 | It was these people who became professional Instagrammers
00:30:13.920 | way back when, right?
00:30:15.080 | But they took these amazing photos
00:30:18.800 | and some of them were photographers, right?
00:30:21.240 | Like professionally.
00:30:24.120 | And all of a sudden you had this moment in the day
00:30:25.920 | when you could open up this app
00:30:27.120 | and sure, you could see what your friends were doing,
00:30:28.880 | but also it was like, oh my God,
00:30:30.320 | that's a beautiful waterfall,
00:30:32.400 | or oh my God, I didn't realize
00:30:33.720 | there was that corner of England,
00:30:35.080 | or like really cool stuff.
00:30:37.200 | And the beauty about Instagram early on
00:30:40.280 | was that it was international by default.
00:30:43.920 | You didn't have to speak English to use it, right?
00:30:46.760 | You could just look at the photos.
00:30:48.600 | Works great.
00:30:49.760 | We did translate.
00:30:50.880 | We had some pretty bad translations,
00:30:52.360 | but we did translate the app.
00:30:54.400 | And even if our translations were pretty poor,
00:30:57.840 | the idea that you could just connect with other people
00:31:00.080 | through their images was pretty powerful.
00:31:02.280 | - How much technical difficulty is there
00:31:05.920 | with the programming?
00:31:07.120 | Like what programming language you were talking about?
00:31:09.120 | - Oh, zero.
00:31:10.440 | Like maybe it was hard for us,
00:31:12.040 | but I mean, we,
00:31:13.960 | there was nothing.
00:31:16.280 | The only thing that was complex about Instagram
00:31:18.680 | at the beginning, technically, was making it scale.
00:31:23.480 | And we were just plain old objective C for the client.
00:31:27.920 | - So it was iPhone only at first?
00:31:29.920 | - iPhone only, yep.
00:31:30.880 | - As an Android person, I'm deeply offended,
00:31:33.080 | but go ahead.
00:31:33.920 | - Come on, this was 2010.
00:31:35.880 | - Oh, sure, sure, sorry.
00:31:36.720 | - Android's getting a lot better, right?
00:31:40.960 | - I take it back, you're right.
00:31:41.800 | - If I were to do something today,
00:31:43.120 | I think it would be very different
00:31:44.400 | in terms of launch strategy, right?
00:31:46.640 | Android's enormous too.
00:31:48.280 | But anyway, back to that moment,
00:31:51.160 | it was Objective-C,
00:31:53.400 | and then we were Python based,
00:31:56.920 | which is just like,
00:31:58.200 | this is before Python was really cool.
00:32:00.240 | Like now it's cool,
00:32:01.240 | 'cause it's all these machine learning libraries
00:32:03.120 | like support Python and right?
00:32:05.280 | Now it's super,
00:32:06.960 | now it's like cool to be Python.
00:32:08.160 | Back then it was like,
00:32:09.000 | "Oh, Google uses Python,
00:32:10.840 | like maybe you should use Python."
00:32:12.640 | Facebook was PHP.
00:32:14.400 | Like I had worked at a small startup
00:32:16.800 | of some ex-Googlers that used Python,
00:32:19.360 | so we used it.
00:32:20.200 | And we used a framework called Django,
00:32:22.520 | which still exists and people still use, for basically the backend.
00:32:27.280 | And then you threw a couple interesting things in there.
00:32:30.120 | I mean, we used Postgres,
00:32:31.360 | which was kind of fun.
00:32:32.200 | It was a little bit like hipster database at the time.
00:32:34.720 | - Versus MySQL.
00:32:36.160 | - MySQL, like everyone used MySQL.
00:32:37.920 | So like using Postgres
00:32:39.000 | was like an interesting decision, right?
00:32:41.040 | But we used it because it had a bunch of geo features built
00:32:45.600 | in because we thought we were gonna be a check-in app.
00:32:47.600 | Remember?
00:32:48.440 | - It's also super cool now.
00:32:49.280 | So you were into Python before it was cool,
00:32:51.440 | and you were into Postgres before it was cool.
00:32:53.480 | - Yeah, we were basically like,
00:32:55.040 | not only hipster photo company, hipster tech company.
00:32:59.040 | Right?
00:33:00.120 | We also adopted Redis early and like loved it.
00:33:04.000 | I mean, it solved so many problems for us.
00:33:06.960 | And turns out that's still pretty cool.
00:33:09.600 | But the programming was very easy.
00:33:11.280 | It was like, sign up a user, have a feed.
00:33:13.960 | There was nothing, no machine learning at all, zero.
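For flavor, a minimal sketch of that kind of stack: a Django model on Postgres with geo support via PostGIS (the feature that motivated the database choice). The schema and the `following` relation are invented for illustration, not Instagram's actual code.

```python
from django.contrib.gis.db import models  # standard fields plus geo fields

class Photo(models.Model):
    user = models.ForeignKey("auth.User", on_delete=models.CASCADE)
    image = models.ImageField(upload_to="photos/")
    caption = models.CharField(max_length=200, blank=True)
    location = models.PointField(null=True)  # the Postgres/PostGIS geo column
    created_at = models.DateTimeField(auto_now_add=True)

# "Sign up a user, have a feed": a feed is roughly a reverse-chronological query.
def feed(user):
    return (Photo.objects
            .filter(user__in=user.following.all())  # hypothetical follow relation
            .order_by("-created_at"))
```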
00:33:17.440 | - Can you give some context,
00:33:18.400 | how many users at each of these stages?
00:33:20.760 | Are we talking about a hundred users, a thousand users?
00:33:24.120 | - So the stage I just described,
00:33:25.680 | I mean, that technical stack lasted
00:33:27.960 | through probably 50 million users.
00:33:30.960 | I mean, seriously, like you can get away with a lot
00:33:36.160 | with a pretty basic stack.
00:33:38.400 | Like I think a lot of startups try to over-engineer
00:33:41.080 | their solutions from the beginning to like really scale
00:33:43.680 | and you can get away with a lot.
00:33:45.680 | That being said, most of the first two years of Instagram
00:33:48.600 | was literally just trying to make that stack scale.
00:33:51.320 | And it wasn't, it was not a Python problem.
00:33:54.160 | It was like, literally just like, where do we put the data?
00:33:58.040 | Like, it's all coming in too fast.
00:33:59.920 | Like, how do we store it?
00:34:01.000 | How do we make sure to be up?
00:34:02.360 | How do we like, how do we make sure
00:34:05.680 | we're on the right size boxes,
00:34:07.160 | that they have enough memory?
00:34:09.320 | Those were the issues, but.
00:34:11.200 | - Can you speak to the choices you make at that stage
00:34:14.280 | when you're growing so quickly?
00:34:16.200 | Do you use something like somebody else's
00:34:18.480 | computer infrastructure or do you build in-house?
00:34:22.640 | - I'm only laughing because we, when we launched,
00:34:25.640 | we had a single computer that we had rented
00:34:30.640 | in some color space in LA.
00:34:32.160 | I don't even remember what it was called.
00:34:34.720 | 'Cause I thought that's what you did.
00:34:35.920 | When I worked at a company called Odeo that became Twitter,
00:34:38.520 | I remember visiting our space in San Francisco.
00:34:41.200 | You walked in, you had to wear the ear things
00:34:43.440 | and it was cold and fans everywhere, right?
00:34:46.880 | And we had to plug one out, replace one.
00:34:49.920 | And I was the intern, so I just like held things.
00:34:52.720 | But I thought to myself, oh, this is how it goes.
00:34:54.960 | And then I remember being in a VC's office.
00:34:57.520 | I think it was Benchmark's office.
00:35:00.040 | And I think we ran into another entrepreneur
00:35:02.280 | and they were like, oh, how are things going?
00:35:03.560 | We're like, ah, you know, try to scale this thing.
00:35:06.840 | And they were like, well, I mean,
00:35:08.440 | can't you just add more instances?
00:35:09.960 | And I was like, what do you mean?
00:35:11.560 | And they're like, instances on Amazon.
00:35:13.600 | I was like, what are those?
00:35:15.520 | And it was this moment where we realized
00:35:17.760 | how deep in it we were because we had no idea
00:35:20.360 | that AWS existed, nor should we be using it.
00:35:24.400 | Anyway, that night we went back to the office
00:35:27.000 | and we got on AWS, but we did this really dumb thing
00:35:29.840 | where, I'm so sorry to people listening,
00:35:32.960 | but we brought up an instance, which was our database.
00:35:37.960 | It was gonna be a replacement for our database.
00:35:40.920 | But we had it talking over the public internet
00:35:43.720 | to our little box in LA that was our app server.
00:35:46.560 | - Very nice.
00:35:47.400 | - Yeah, that's how sophisticated we were.
00:35:50.000 | And obviously that was very, very slow.
00:35:52.000 | Didn't work at all.
00:35:53.840 | I mean, it worked, but didn't work.
00:35:56.000 | Only later that night did we realize
00:35:58.080 | we had to have it all together.
00:36:00.160 | But at least if you're listening right now
00:36:02.120 | and you're thinking, I have no chance,
00:36:04.760 | I'm gonna start a startup, I have no chance.
00:36:06.920 | I don't know, we did it.
00:36:07.920 | We made a bunch of really dumb mistakes initially.
00:36:10.640 | I think the question is how quickly do you learn
00:36:12.560 | that you're making a mistake?
00:36:13.600 | And do you do the right thing immediately right after?
00:36:16.360 | - So you didn't pay for those mistakes by failure.
00:36:20.320 | So yeah, how quickly did you fix it?
00:36:23.600 | I guess there's a lot of ways to sneak up to this question
00:36:26.320 | of how the hell do you scale the thing?
00:36:28.440 | Other startups, if you have an idea,
00:36:30.360 | how do you scale the thing?
00:36:31.640 | Is it just AWS?
00:36:34.040 | And you try to write the kind of code
00:36:37.680 | that's easy to spread across a large number of instances.
00:36:41.800 | And then the rest is just put money into it?
00:36:45.160 | - Basically, I would say a couple of things.
00:36:48.840 | First off, don't even ask the question.
00:36:51.600 | Just find product market fit, duct tape it together.
00:36:55.120 | Right, like if you have to.
00:36:56.600 | I think there's a big caveat here, which I wanna get to.
00:36:59.520 | But generally all that matters is product market fit.
00:37:03.800 | That's all that matters.
00:37:04.840 | If people like your product.
00:37:06.880 | Do not worry about when 50,000 people use your product
00:37:10.280 | because you will be happy that you have that problem
00:37:12.760 | when you get there.
00:37:13.920 | I actually can't name many startups
00:37:18.640 | where they go from nothing to something overnight
00:37:22.080 | and they can't figure out how to scale it.
00:37:24.280 | There are some, but I think nowadays,
00:37:27.600 | it's a, when I say a solved problem,
00:37:29.640 | like there are ways of solving it.
00:37:32.160 | The base case is typically that startups
00:37:35.040 | worry way too much about scaling way too early
00:37:38.000 | and forget that they actually have to make something
00:37:39.800 | that people like.
00:37:40.640 | That's the default mistake case.
00:37:43.720 | But what I'll say is once you start scaling,
00:37:48.000 | I mean, hiring quickly people who have seen the game before
00:37:51.440 | and just know how to do it,
00:37:52.840 | it becomes a bit of like, yeah,
00:37:56.680 | just throw instances at the problem, right?
00:37:59.320 | But the last thing I'll say on this
00:38:00.720 | that I think did save us,
00:38:02.320 | we were pretty rigorous about writing tests
00:38:06.680 | from the beginning.
00:38:08.560 | That helped us move very, very quickly
00:38:11.400 | when we wanted to rewrite parts of the product
00:38:14.600 | and know that we weren't breaking something else.
00:38:17.240 | Tests are one of those things where it's like,
00:38:18.720 | you go slow to go fast and they suck
00:38:22.040 | when you have to write them
00:38:22.920 | 'cause you have to figure it out.
00:38:24.200 | And there are always those ones that break
00:38:27.000 | when you don't want them to break and they're annoying
00:38:29.040 | and it feels like you spent all this time.
00:38:30.440 | But looking back, I think that like long-term optimal,
00:38:34.760 | even with a team of four,
00:38:36.880 | it allowed us to move very, very quickly
00:38:38.760 | because anyone could touch any part of the product
00:38:41.920 | and know that they weren't gonna bring down the site
00:38:44.400 | or at least in general.
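In that spirit, a generic pytest-style sketch of the kind of small regression test that makes rewrites safe; `resize_to_square` is a hypothetical helper, not Instagram code:

```python
def resize_to_square(width, height, target=512):
    """Scale so the shorter side lands exactly on `target` (hypothetical helper)."""
    scale = target / min(width, height)
    return round(width * scale), round(height * scale)

def test_shorter_side_pinned_to_target():
    assert resize_to_square(1024, 2048) == (512, 1024)

def test_identity_at_target_size():
    assert resize_to_square(512, 512) == (512, 512)
```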
00:38:45.880 | - At which point do you know product market fit?
00:38:48.640 | How many users would you say?
00:38:50.520 | Is it all it takes is like 10 people or is it a thousand?
00:38:53.760 | Is it 50,000?
00:38:55.800 | - I don't think it is generally a question
00:38:58.560 | of absolute numbers.
00:38:59.840 | I think it's a question of cohorts
00:39:01.840 | and I think it's a question of trends.
00:39:03.840 | So, it depends how big your business is trying to be.
00:39:08.840 | But if I were signing up a thousand people a week
00:39:12.240 | and they all retain,
00:39:14.080 | like the retention curves for those cohorts
00:39:16.040 | looked good, healthy.
00:39:18.360 | And even like as you started getting more people
00:39:21.320 | on the service, maybe those earlier cohorts
00:39:23.440 | started curving up again
00:39:24.640 | because now there are network effects
00:39:26.040 | and their friends are on the service
00:39:27.360 | or totally depends what type of business you're in.
00:39:29.840 | But I'm talking purely social.
00:39:32.320 | I don't think it's an absolute number.
00:39:36.560 | I think it is a,
00:39:37.880 | I guess you could call it a marginal number.
00:39:39.640 | So, I spent a lot of time when I work with startups
00:39:42.280 | asking them like, okay,
00:39:44.360 | have you looked at that cohort versus this cohort,
00:39:47.200 | whether it's your clients
00:39:48.440 | or whether it's people signing up for the service.
00:39:52.280 | But a lot of people think you just have to hit some mark
00:39:55.120 | like 10,000 people or 50,000 people,
00:39:57.800 | but really seven-ish billion people in the world,
00:40:01.960 | most people forever will not know about your product.
00:40:05.320 | There are always more people out there to sign up.
00:40:07.880 | It's just a question of how you turn on the spigot.
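A hedged sketch of that cohort view (data shapes invented for illustration): group users by signup week and watch whether each cohort's retention curve flattens, decays, or curves back up.

```python
from collections import defaultdict

def retention_curves(users, horizon=8):
    """users: iterable of dicts with 'signup_week' (int) and 'active_weeks' (set of ints).
    Returns {cohort_week: [fraction active 0, 1, ... weeks after signup]}.
    Healthy sign: curves flatten out (or bend back up as friends join the
    service) rather than decaying toward zero."""
    cohorts = defaultdict(list)
    for u in users:
        cohorts[u["signup_week"]].append(u)
    return {
        week: [sum(1 for u in members if week + n in u["active_weeks"]) / len(members)
               for n in range(horizon)]
        for week, members in cohorts.items()
    }
```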
00:40:11.360 | - At that stage, early stage,
00:40:13.160 | yourself, but also by way of advice,
00:40:16.680 | should you worry about money at all?
00:40:18.200 | How this thing is going to make money?
00:40:20.320 | Or do you just try to find product market fit
00:40:24.280 | and get a lot of users to enjoy using your thing?
00:40:27.800 | - I think it totally depends.
00:40:30.040 | And that's an unsatisfying answer.
00:40:32.640 | I was talking with a friend today who,
00:40:35.440 | he was one of our earlier investors and he was saying,
00:40:39.120 | hey, like, have you been doing any angel investing lately?
00:40:41.560 | I said, not really.
00:40:42.560 | I'm just like focused on what I want to do next.
00:40:44.760 | And he said, the number of financings has just gone bonkers,
00:40:51.760 | like people are throwing money everywhere right now.
00:40:54.520 | And I think the question is,
00:41:00.640 | do you have an inkling of how you're going to make money?
00:41:04.200 | Or are you really just like waving your hands?
00:41:07.440 | I would not like to be an entrepreneur in the position of,
00:41:11.480 | well, I have no idea how this will eventually make money.
00:41:14.440 | That's not fun.
00:41:15.280 | If you are in an area, like,
00:41:19.160 | let's say you wanted to start a social network, right?
00:41:22.920 | Not saying this is a good idea, but if you did,
00:41:25.920 | there are only a handful of ways they've made money
00:41:27.800 | and really only one way they've made money in the past,
00:41:29.840 | and that's ads.
00:41:30.920 | So, you know, if you have a service that's amenable to that,
00:41:36.800 | and then I wouldn't worry too much about that
00:41:39.800 | because if you get to the scale,
00:41:40.920 | you can hire some smart people and figure that out.
00:41:44.240 | I do think that it is really healthy
00:41:47.680 | for a lot of startups these days,
00:41:49.080 | especially the ones doing like enterprise software,
00:41:52.880 | slacks of the world, et cetera,
00:41:54.880 | to be worried about money from the beginning,
00:41:56.680 | but mostly as a way of winning over clients
00:42:00.240 | and having stickiness.
00:42:01.640 | Of course you need to be worried about money,
00:42:07.000 | but I'm going to also say this again,
00:42:08.640 | which is it's like long-term profitability.
00:42:12.120 | If you have a roadmap to that, then that's great.
00:42:15.880 | But if you're just like, I don't know,
00:42:17.480 | maybe never, like we're working on this Metaverse thing,
00:42:19.880 | I think maybe someday, I don't know.
00:42:22.920 | Like that seems harder to me.
00:42:24.760 | So you have to be as big as Facebook
00:42:26.760 | to like finance that bet, right?
00:42:29.480 | - Do you think it's possible,
00:42:30.440 | you said you're not saying it's necessarily a good idea
00:42:33.400 | to launch a social network.
00:42:35.280 | Do you think it's possible today,
00:42:39.040 | maybe you can put yourself in those shoes,
00:42:41.280 | to launch a social network
00:42:44.120 | that achieves the scale of a Facebook
00:42:46.840 | or a Twitter or an Instagram and maybe even greater scale?
00:42:51.360 | - Absolutely.
00:42:52.200 | - How do you do it?
00:42:54.240 | Asking for a friend.
00:42:56.400 | - Yeah, if I knew, I'd probably be doing it right now
00:42:59.040 | and not sitting here.
00:43:00.160 | - I mean, there's a lot of ways to ask this question.
00:43:03.640 | One is create a totally new product market fit,
00:43:07.240 | create a new market,
00:43:08.840 | create something like Instagram did,
00:43:10.360 | which is like create something kind of new,
00:43:13.160 | or literally out-compete Facebook at its own thing
00:43:17.120 | or out-compete Twitter at its own thing.
00:43:19.880 | - The only way to compete now,
00:43:21.560 | if you want to build a large social network
00:43:23.840 | is to look for the cracks, look for the openings.
00:43:26.880 | You know, no one competed with the core business of Google.
00:43:33.480 | No one competed with the core business of Microsoft.
00:43:36.920 | You don't go at the big guys
00:43:39.440 | doing exactly what they're doing.
00:43:41.920 | Instagram didn't win, quote unquote,
00:43:43.920 | because it tried to be a visual Twitter.
00:43:46.960 | Like we spotted things that either Twitter
00:43:50.000 | wasn't going to do or refused to do,
00:43:52.840 | like images in the feed for the longest time, right?
00:43:56.040 | Or that Facebook wasn't doing or not paying attention to
00:43:58.600 | because they were mostly desktop at the time
00:44:00.640 | and we were purely mobile, purely visual.
00:44:03.720 | Often there are opportunities sitting there.
00:44:07.640 | You just have to figure out where they are.
00:44:12.000 | I think there's a strategy book,
00:44:13.840 | I can't remember the name, but it talks about moats,
00:44:16.400 | and the best place to play
00:44:19.720 | is where your competitor literally can't pivot
00:44:22.240 | because structurally they're set up not to be there.
00:44:26.080 | And that's where you win.
00:44:28.120 | And what's fascinating is like,
00:44:30.480 | do you know how many people are like, images?
00:44:33.320 | Facebook does that, Twitter does that.
00:44:35.280 | I mean, how wrong were they?
00:44:36.520 | Really wrong.
00:44:37.360 | And these are some of the smartest people
00:44:38.560 | in Silicon Valley, right?
00:44:40.400 | But now Instagram exists for a while.
00:44:42.760 | How is it that Snapchat could then exist?
00:44:45.280 | Makes no sense.
00:44:47.280 | Like plenty of people would say,
00:44:48.720 | well, there's Facebook, no images.
00:44:50.120 | Okay, okay.
00:44:50.960 | I mean, Instagram, I'll give you that one,
00:44:52.840 | but wait, now another image-based social network
00:44:55.280 | is going to get really big.
00:44:57.160 | And then TikTok comes along.
00:44:58.800 | Like the prior, so you asked me, is it possible?
00:45:03.520 | The only reason I'm answering yes
00:45:05.680 | is because my prior is that it's happened once every,
00:45:09.600 | I don't know, three, four or five years consistently.
00:45:13.000 | And I can't imagine there's anything structurally
00:45:15.360 | that would change that.
00:45:16.680 | So that's why I answer that way.
00:45:19.000 | Not because I know how, I just,
00:45:21.280 | when you see a pattern, you see a pattern
00:45:22.920 | and there's no reason to believe that's going to stop.
00:45:25.560 | - And it's subtle too, because like you said,
00:45:27.800 | Snapchat and TikTok,
00:45:29.720 | they're all doing the same space of things,
00:45:32.640 | but there's something fundamentally different
00:45:34.840 | about like a three second video
00:45:37.600 | and a five second video and a 15 second video
00:45:40.040 | and a one minute video and a one hour video,
00:45:42.400 | like fundamentally different.
00:45:43.920 | - Fundamentally different.
00:45:45.000 | I mean, I think one of the reasons Snapchat exists
00:45:48.240 | is because Instagram was so focused
00:45:50.160 | on posting great, beautiful manicured versions
00:45:54.680 | of yourself throughout time.
00:45:57.200 | And there was this enormous demand of like,
00:45:58.760 | hey, I really like this behavior.
00:46:00.720 | I love using Instagram, but man,
00:46:03.400 | I just like wish I could share something going on in my day.
00:46:06.760 | Like, do I really have to put it on my profile?
00:46:10.000 | Do I really have to make it last forever?
00:46:11.920 | Do I really?
00:46:12.760 | And that opened up a door, it created a market, right?
00:46:16.640 | And then what's fascinating is Instagram had an explore page
00:46:20.440 | for the longest time and it was image driven, right?
00:46:23.720 | But there's absolutely a behavior
00:46:25.720 | where you open up Instagram
00:46:26.560 | and you sit on the explore page all day.
00:46:28.360 | That is effectively TikTok, but obviously focused on videos.
00:46:32.520 | And it's not like you could just put the explore page
00:46:35.440 | in TikTok form and it works.
00:46:37.160 | It had to be video, it had to have music.
00:46:39.720 | These are the hard parts about product development
00:46:42.400 | that are very hard to predict,
00:46:44.200 | but they're all versions of the same thing:
00:46:49.000 | if you line them up in a bunch of dimensions,
00:46:54.320 | they're just different values of the same dimensions,
00:46:57.000 | which is, I guess, easy to say in retrospect.
00:46:59.680 | But like, if I were an entrepreneur going after that area,
00:47:02.560 | I'd ask myself like, where's the opening?
00:47:05.160 | What needs to exist because TikTok exists now?
00:47:08.480 | - So I wonder how much things that don't yet exist
00:47:13.040 | and can exist is in the space of algorithms,
00:47:15.880 | in the space of recommender systems.
00:47:18.040 | So in the space of how the feed is generated.
00:47:21.680 | So we kind of talk about the actual elements of the content.
00:47:26.680 | That's what we've been talking,
00:47:27.880 | the difference between photos,
00:47:29.840 | between short videos, longer videos.
00:47:32.440 | I wonder how much disruption is possible
00:47:35.160 | in the way the algorithms work.
00:47:37.240 | Because a lot of the criticism towards social media
00:47:39.520 | is in the way the algorithms work currently.
00:47:41.880 | And it feels like, first of all,
00:47:44.640 | talking about product market fit,
00:47:47.360 | there's certainly a hunger for social media algorithms
00:47:52.360 | that do something different.
00:47:56.840 | Everyone's complaining,
00:47:59.560 | this is hurting me
00:48:02.400 | and this is hurting society,
00:48:04.160 | but I keep doing it 'cause I'm addicted to it.
00:48:07.360 | And they say, we want something different,
00:48:09.920 | but we don't know what.
00:48:11.360 | Just different.
00:48:15.320 | It feels like there's a hunger for that,
00:48:17.720 | but that's in the space of algorithms.
00:48:19.320 | I wonder if it's possible to disrupt in that space.
00:48:22.040 | - Absolutely.
00:48:22.960 | I have this thesis that the worst part
00:48:27.880 | about social networks
00:48:30.040 | is the people.
00:48:32.920 | It's a line that sounds funny, right?
00:48:36.720 | Because that's why you call it a social network.
00:48:39.720 | But what do social networks actually do for you?
00:48:42.480 | Like just think, imagine you were an alien
00:48:45.760 | and you landed and someone says,
00:48:47.520 | hey, there's this site, it's a social network.
00:48:49.480 | We're not gonna tell you what it is,
00:48:50.440 | but just what does it do?
00:48:51.920 | And you had to explain it to them.
00:48:53.200 | It does two things.
00:48:54.400 | One is that people you know and have social ties with
00:48:59.080 | distribute updates through whether it's photos or videos
00:49:05.280 | about their lives so that you don't have
00:49:07.240 | to physically be with them,
00:49:08.320 | but you can keep in touch with them.
00:49:09.920 | That's one.
00:49:10.760 | That's like a big part of Instagram.
00:49:12.320 | That's a big part of Snap.
00:49:14.080 | It is not part of TikTok at all.
00:49:16.320 | So there's another big part,
00:49:18.120 | which is there's all this content out in the world
00:49:20.840 | that's entertaining, whether you wanna watch it
00:49:24.000 | or you wanna read it.
00:49:25.160 | And matchmaking between content that exists in the world
00:49:30.320 | and people that want that content
00:49:33.480 | turns out to be like a really big business.
00:49:35.800 | - Search and discovery, would you-
00:49:37.280 | - Search and discovery.
00:49:38.480 | But my point is it could be video, it could be text,
00:49:40.720 | it could be websites, it could be,
00:49:42.000 | I mean, think back to like Digg, right?
00:49:46.560 | Or StumbleUpon, right?
00:49:49.240 | - Nice.
00:49:50.600 | - But like, what did those do?
00:49:51.800 | Like they basically distributed interesting content to you.
00:49:54.800 | I think the most interesting part
00:49:59.480 | or the future of social networks
00:50:01.720 | is going to be making them less social
00:50:03.560 | because I think people are part of the root cause
00:50:06.360 | of the problem.
00:50:07.200 | So for instance, often in recommender systems,
00:50:10.360 | we talk about two stages.
00:50:11.520 | There's a candidate generation step,
00:50:13.760 | which is just like of our vast trove of stuff
00:50:16.280 | that you might wanna see,
00:50:18.400 | what small subset should we pick for you, okay?
00:50:23.080 | Typically that is grabbed
00:50:25.320 | from things your friends have shared, right?
00:50:28.360 | Then there's a ranking step, which says, okay,
00:50:30.800 | now given these 100, 200 things,
00:50:32.720 | depends on the network, right?
00:50:34.280 | Let's like be really good about ranking them
00:50:36.480 | and generally rank the things up higher
00:50:38.920 | that get the most engagement, right?
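A minimal sketch of that two-stage pipeline, in Python, assuming a toy `Item` type and a precomputed engagement score (both invented for illustration; real systems use learned models at each stage):

```python
# Toy two-stage recommender: candidate generation, then ranking.
# The Item fields and scores are illustrative, not any real network's internals.
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    author_id: str
    predicted_engagement: float  # stand-in for a learned model's output

def generate_candidates(all_items, followed_ids, limit=200):
    # Stage 1: from the vast trove, keep a small subset;
    # classically, only things shared by people you follow.
    return [it for it in all_items if it.author_id in followed_ids][:limit]

def rank(candidates):
    # Stage 2: order the subset, typically by predicted engagement.
    return sorted(candidates, key=lambda it: it.predicted_engagement, reverse=True)

items = [Item("1", "alice", 0.7), Item("2", "bob", 0.9), Item("3", "carol", 0.4)]
feed = rank(generate_candidates(items, followed_ids={"alice", "bob"}))
print([it.item_id for it in feed])  # ['2', '1']
```

The "less social" idea amounts to changing stage one: drop the `followed_ids` filter and draw candidates from the whole pool, which is the question raised below.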
00:50:40.840 | So what's the problem with that?
00:50:42.520 | Step one is we've limited everything you could possibly see
00:50:46.840 | to things that your friends have chosen to share
00:50:49.520 | or maybe not friends, but influencers.
00:50:52.320 | What things do people generally want to share?
00:50:54.600 | They wanna share things that are gonna get likes,
00:50:56.320 | that are gonna show up broadly.
00:50:58.640 | So they tend to be more emotionally driven.
00:51:00.760 | They tend to be more risque or whatever.
00:51:03.800 | So why do we have this problem?
00:51:05.120 | It's because we show people things
00:51:08.120 | people have decided to share
00:51:09.560 | and those things self-select
00:51:10.800 | to be the things that are most divisive.
00:51:14.040 | So how do you fix that?
00:51:16.400 | Well, just imagine for a second:
00:51:20.160 | why do you have to grab things
00:51:21.640 | from things your friends have shared?
00:51:23.160 | Why not just grab things from anywhere?
00:51:25.440 | That's really fascinating to me.
00:51:27.440 | And that's something I've been thinking a lot about
00:51:29.120 | and just like, you know,
00:51:31.720 | why is it that when you log onto Twitter,
00:51:34.960 | you're just sitting there looking at things
00:51:38.080 | from accounts that you've followed for whatever reason?
00:51:40.960 | And TikTok, I think has done a wonderful job here,
00:51:44.480 | which is like, you can literally be anyone.
00:51:47.200 | And if you produce something fascinating, it'll go viral.
00:51:50.920 | But like, you don't have to be someone that anyone knows.
00:51:54.800 | You don't have to have built up a giant following.
00:51:57.360 | You don't have to have paid for followers.
00:52:00.080 | You don't have to try to maintain those followers.
00:52:01.880 | You literally just have to produce something interesting.
00:52:04.680 | That is, I think, the future of social networking.
00:52:07.280 | That's the direction things will head.
00:52:10.080 | And I think what you'll find is
00:52:11.720 | it's far less about people manipulating distribution
00:52:15.880 | and far more about what is like, is this content good?
00:52:18.760 | And good is obviously a vague definition
00:52:22.440 | that we could spend hours on,
00:52:23.720 | but different networks, I think,
00:52:26.440 | will decide different value functions
00:52:28.240 | to decide what is good and what isn't good.
00:52:30.000 | And I think that's a fascinating direction.
00:52:32.120 | - So that's almost like creating an internet.
00:52:33.640 | I mean, that's what Google did for web pages.
00:52:36.240 | They did, you know, page rank search.
00:52:39.280 | - Yeah.
00:52:40.120 | - You don't follow anybody on Google
00:52:42.680 | when you use a search engine.
00:52:44.400 | You just discover web pages.
00:52:46.240 | And so what TikTok does is saying,
00:52:49.760 | let's start from scratch.
00:52:51.720 | Let's like start a new internet
00:52:54.520 | and have people discover stuff on that new internet
00:52:56.800 | within a particular kind of pool of people.
00:52:59.800 | - Well, what's so fascinating about this
00:53:01.280 | is like the field of information retrieval.
00:53:05.120 | Like I always talked about, as I was studying this stuff,
00:53:08.480 | I would always use the word query and document.
00:53:10.520 | So I was like, why are they saying query and document?
00:53:12.800 | Like if you just stop thinking about a query
00:53:17.080 | as literally a search query,
00:53:18.920 | a query could be a person.
00:53:20.520 | I mean, a lot of the way,
00:53:22.120 | I'm not gonna claim to know how Instagram or Facebook
00:53:24.240 | machine learning works today,
00:53:26.200 | but you know, if you want to find a match for a query,
00:53:30.320 | the query is actually the attributes of the person,
00:53:33.320 | their age, their gender, where they're from,
00:53:36.760 | maybe some kind of summarization of their interests.
00:53:39.760 | And that's a query, right?
00:53:42.000 | And that matches against documents.
00:53:43.520 | And by the way, documents don't have to be text.
00:53:45.160 | They can be videos.
00:53:47.080 | They're however long,
00:53:48.240 | I don't know what the limit is on TikTok these days.
00:53:50.200 | They keep changing it.
00:53:51.360 | My point is just, you've got a query,
00:53:53.440 | which is someone in search of something
00:53:55.320 | that they want to match.
00:53:57.200 | And you've got the document
00:53:58.280 | and it doesn't have to be text.
00:53:59.480 | It could be anything.
00:54:00.720 | And how do you match make?
00:54:02.120 | And that's one of these, like,
00:54:03.240 | I mean, I have spent a lot of time thinking about this
00:54:06.280 | and I don't claim to have mastered it at all,
00:54:08.840 | but I think it's so fascinating about where that will go
00:54:11.720 | with new social networks.
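One hedged way to picture the query-and-document framing in code: embed the person and each piece of content in a shared vector space and matchmake by similarity. Everything below (the feature vectors, the names, the numbers) is invented for illustration, not how Instagram or Facebook actually do it:

```python
# Illustrative query-document matchmaking via embeddings; all values invented.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The "query" is a person: some summarization of age, interests, and so on.
person = np.array([0.9, 0.1, 0.4])

# The "documents" don't have to be text: videos, websites, anything.
documents = {
    "cooking_video": np.array([0.8, 0.2, 0.3]),
    "finance_thread": np.array([0.1, 0.9, 0.5]),
}

best = max(documents, key=lambda name: cosine(person, documents[name]))
print(best)  # cooking_video: the content most similar to this person-as-query
```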
00:54:13.320 | - See, what I'm also fascinated by is
00:54:15.560 | metrics that are different than engagement.
00:54:18.160 | So the other thing from an alien perspective,
00:54:20.920 | what social networks are doing is,
00:54:23.840 | in the short term, they bring out
00:54:28.320 | different aspects of each human being.
00:54:30.120 | So first, let me say that
00:54:35.160 | an algorithm or a social network
00:54:38.640 | for each individual can bring out
00:54:40.680 | the best of that person or the worst of that person.
00:54:43.400 | Or there's a bunch of different parts to us,
00:54:45.680 | parts we're proud of,
00:54:48.800 | parts we're not so proud of.
00:54:50.840 | When we look at the big picture of our lives,
00:54:53.120 | when we look back 30 days from now,
00:54:55.200 | am I proud that I said those things or not?
00:54:57.600 | Am I proud that I felt those things?
00:54:59.320 | Am I proud that I experienced or read those things
00:55:02.880 | or thought about those things?
00:55:04.840 | Just in that kind of self-reflective kind of way.
00:55:08.120 | And so coupled with that,
00:55:10.240 | I wonder if it's possible to have different metrics
00:55:12.240 | that are not just about engagement,
00:55:14.160 | but are about long-term happiness,
00:55:18.920 | growth of a human being,
00:55:20.320 | where they look back and say,
00:55:22.160 | "I am a better human being
00:55:23.720 | for having spent 100 hours on that app."
00:55:26.920 | And that feels like it's actually strongly correlated
00:55:30.600 | with engagement in the long-term.
00:55:33.360 | In the short term, it may not be,
00:55:34.760 | but in the long-term, it's like the same kind of thing
00:55:37.880 | where you really fall in love with a product.
00:55:40.640 | You fall in love with an iPhone,
00:55:42.040 | you fall in love with a car.
00:55:43.800 | That's what makes you fall in love,
00:55:45.120 | is like really being proud
00:55:49.320 | and just in a self-reflective way,
00:55:51.640 | understanding that you're a better human being
00:55:53.680 | for having used the thing.
00:55:55.080 | And that's what great relationships are made from.
00:55:58.800 | It's not just like you're hot
00:56:01.680 | and we like being together or something like that.
00:56:04.320 | It's more like, "I'm a better human being
00:56:06.080 | because I'm with you."
00:56:07.600 | And that feels like a metric
00:56:09.280 | that could be optimized for by the algorithms.
00:56:12.160 | But anytime I kind of talk about this with anybody,
00:56:17.120 | they seem to say, "Yeah, okay,
00:56:18.600 | that's going to get out-competed immediately
00:56:21.040 | by the engagement, if it's ad-driven especially."
00:56:24.680 | I just don't think so.
00:56:26.200 | A lot of it is just implementation.
00:56:30.680 | - I'll say a couple of things.
00:56:31.680 | One is to pull back the curtain on daily meetings
00:56:36.680 | inside of these large social media companies.
00:56:39.200 | A lot of what management,
00:56:42.880 | or at least the people that are tweaking these algorithms
00:56:45.680 | spend their time on are trade-offs.
00:56:47.920 | And there's these things called value functions,
00:56:50.080 | which are like, "Okay, we can predict the probability
00:56:54.200 | that you'll click on this thing,
00:56:55.560 | or the probability that you'll share it,
00:56:58.240 | or the probability that you will leave a comment on it,
00:57:01.160 | or the probability you'll dwell on it."
00:57:03.600 | Individual actions, right?
00:57:06.360 | And you've got this neural network
00:57:09.480 | that basically has a bunch of heads at the end
00:57:11.440 | and all of them are between zero and one,
00:57:13.680 | and great, they all have values, right?
00:57:16.000 | Or they all have probabilities.
00:57:18.240 | And then in these meetings, what they will do is say,
00:57:21.480 | "Well, how much do we value a comment
00:57:25.600 | versus a click versus a share versus a..."
00:57:29.160 | And there may even be some downstream thing, right?
00:57:31.920 | That has nothing to do with the item there,
00:57:34.080 | but like driving follows or something.
00:57:37.520 | And what typically happens is they will say,
00:57:40.080 | "Well, what are our goals for this quarter at the company?
00:57:42.720 | Oh, we want to drive sharing up.
00:57:44.040 | Okay, well, let's turn down these metrics
00:57:46.720 | and turn up these metrics."
00:57:48.600 | And they blend them, right?
00:57:50.640 | Into a single scalar which they're trying to optimize.
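A minimal sketch of that blending step, assuming a model that already outputs per-action probabilities; the head names and weights are invented stand-ins for the knobs those meetings turn up and down:

```python
# Blend multi-head probabilities into the single scalar used for ranking.
# Head names and weights are illustrative, not any company's real values.

def value_score(heads, weights):
    # heads: model outputs in [0, 1], e.g. P(click), P(share), P(comment).
    # weights: human-set trade-offs, retuned as company goals change.
    return sum(weights.get(name, 0.0) * p for name, p in heads.items())

weights = {"click": 1.0, "share": 3.0, "comment": 2.0, "dwell": 0.5}
item = {"click": 0.20, "share": 0.05, "comment": 0.10, "dwell": 0.60}
print(value_score(item, weights))  # ~0.85; items get ranked by this scalar
```

If the quarter's goal is to drive sharing, you turn up the share weight; the trade-offs discussed next come from everything else that reweighting happens to reward.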
00:57:53.880 | That is really hard because invariably
00:57:58.080 | you think you're solving for, I don't know,
00:57:59.840 | something called meaningful interactions, right?
00:58:02.760 | This was the big Facebook pivot.
00:58:04.160 | And I don't actually have any internal knowledge.
00:58:06.680 | Like I wasn't in those meetings,
00:58:09.040 | but at least from what we've seen
00:58:11.360 | over the last month or so,
00:58:12.880 | it seems that by actually trying to optimize
00:58:16.440 | for meaningful interactions,
00:58:18.120 | it had all these side effects of optimizing
00:58:20.080 | for these other things.
00:58:22.080 | And I don't claim to fully understand them,
00:58:24.560 | but what I will say is that trade-offs abound.
00:58:28.760 | And as much as you'd like to solve for one thing,
00:58:31.680 | if you have a network of over a billion people,
00:58:34.680 | you're going to have unintended consequences either way.
00:58:36.680 | And it gets really hard.
00:58:38.840 | So what you're describing is effectively a value model
00:58:41.400 | that says like, can we capture,
00:58:43.400 | this is the thing that I spent a lot of time thinking about.
00:58:45.600 | Like, can you capture utility
00:58:48.600 | in a way that like actually measures someone's happiness?
00:58:53.520 | That isn't just a, what do they call it?
00:58:56.600 | A surrogate problem where you say,
00:58:58.160 | well, I kind of think like the more you use the product,
00:59:01.840 | the happier you are.
00:59:02.960 | That was always the argument at Facebook, by the way.
00:59:04.720 | It was like, well, people use it more,
00:59:07.440 | so they must be more happy.
00:59:09.240 | Turns out there are like a lot of things you use more
00:59:11.080 | that make you less happy in the world.
00:59:12.840 | Not talking about Facebook,
00:59:14.040 | just let's think about whether it's gambling or whatever,
00:59:17.600 | like that you can do more of,
00:59:19.040 | but doesn't necessarily make you happier.
00:59:20.720 | So the idea that time equals happiness,
00:59:22.760 | obviously you can't map utility and time together easily.
00:59:27.040 | There are a lot of edge cases.
00:59:28.800 | So when you look around the world and you say,
00:59:30.440 | well, what are all the ways we can model utility?
00:59:32.680 | That is like one of the,
00:59:34.360 | please, if you know someone smart doing this,
00:59:36.240 | introduce me because I'm fascinated by it.
00:59:38.480 | And it seems really tough.
00:59:40.560 | But the idea that reinforcement learning,
00:59:42.680 | like everyone interesting I know in machine learning,
00:59:46.440 | like I was really interested in recommender systems
00:59:48.760 | and supervised learning.
00:59:49.760 | And the more I dug into it, I was like,
00:59:52.640 | oh, literally everyone smart
00:59:54.400 | is working on reinforcement learning.
00:59:56.400 | Like literally everyone.
00:59:57.640 | - You just made people at OpenAI and DeepMind very happy.
01:00:01.120 | - But I mean, but what's interesting is like,
01:00:02.640 | it's one thing to train a game and like,
01:00:06.640 | I mean that paper where they just took Atari
01:00:09.200 | and used a ConvNet to basically
01:00:11.960 | train simple actions, mind-blowing, right?
01:00:15.080 | Absolutely mind blowing, but it's a game, great.
01:00:17.760 | So now what if you're constructing a feed
01:00:22.400 | for a person, right?
01:00:24.400 | Like how can you construct that feed in such a way
01:00:29.400 | that optimizes for a diversity of experience,
01:00:33.240 | a long-term happiness, right?
01:00:35.800 | But that reward function,
01:00:38.400 | it turns out in reinforcement learning, again,
01:00:41.160 | as I've learned, like reward design is really hard.
01:00:45.160 | And I don't know, like how do you design a scalar reward
01:00:49.680 | for someone's happiness over time?
01:00:51.280 | I mean, do you have to measure dopamine levels?
01:00:53.240 | Like, do you have to?
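To make the reward-design difficulty concrete, here is one hedged sketch of a scalar reward for a feed session; every term and coefficient is an assumption, and the open problem is that nobody knows how to actually measure the long-term term:

```python
# Toy scalar reward for a feed session, mixing short-term engagement
# with diversity and a long-term well-being proxy. All terms are assumptions.

def feed_reward(engagement, diversity, long_term, w_div=0.3, w_long=0.5):
    # engagement: short-term interaction signal in [0, 1]
    # diversity: e.g., normalized entropy over topics shown
    # long_term: stand-in for retention or self-reported well-being
    return engagement + w_div * diversity + w_long * long_term

print(feed_reward(engagement=0.4, diversity=0.7, long_term=0.2))  # ~0.71
```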
01:00:54.080 | - Well, you have to have a lot of,
01:00:56.520 | a lot more signals from the human being.
01:00:59.440 | Currently it feels like there's not enough signals
01:01:01.640 | coming from the human beings using this algorithm.
01:01:06.080 | So for reinforcement learning to work well,
01:01:08.680 | it needs to have a lot more data.
01:01:10.800 | - Needs to have a lot of data.
01:01:11.880 | And that actually is a challenge for anyone
01:01:13.360 | who wants to start something,
01:01:14.400 | which is you don't have a lot of data.
01:01:16.400 | So how do you compete?
01:01:17.440 | But I do think back to your original point,
01:01:19.960 | rethinking the algorithm, rethinking reward functions,
01:01:23.360 | rethinking utility, that's fascinating.
01:01:28.040 | That's cool.
01:01:28.880 | And I think that's an open opportunity
01:01:31.000 | for a company that figures it out.
01:01:33.560 | - I have to ask about April, 2012,
01:01:37.400 | when Instagram, along with its massive employee base
01:01:42.400 | of 13 people was sold to Facebook for $1 billion.
01:01:48.000 | What was the process like on a business level,
01:01:50.160 | engineering level, human level?
01:01:52.120 | What was that process of selling to Facebook like?
01:01:54.920 | What did it feel like?
01:01:56.440 | - So I want to provide some context,
01:01:58.080 | which is I worked in corporate development at Google,
01:02:00.880 | which not a lot of people know,
01:02:02.640 | but corporate development is effectively
01:02:04.160 | the group that buys companies, right?
01:02:06.200 | You sit there and you acquire companies.
01:02:08.080 | And I had sat through so many of these meetings
01:02:10.920 | with entrepreneurs.
01:02:12.080 | We actually, fun fact, we never acquired a single company
01:02:14.520 | when I worked in corporate development.
01:02:15.720 | So I can't claim that I had like a lot of experience,
01:02:19.560 | but I had enough experience to understand,
01:02:23.400 | okay, like what prices are people getting
01:02:25.560 | and what's the process?
01:02:27.320 | And as we started to grow,
01:02:30.400 | we were trying to keep this thing running
01:02:33.960 | and we were exhausted and we were 13 people.
01:02:36.200 | And I mean, trying to think back,
01:02:38.840 | I was probably 27 then, I'm 37 now, so young.
01:02:45.520 | And on a relative basis, right?
01:02:48.360 | And we're trying to keep the thing running.
01:02:50.520 | And then, we go out to raise money
01:02:53.000 | and we're kind of like the hot startup at the time.
01:02:57.400 | And I remember going into a specific VC and saying,
01:03:00.440 | our terms we're looking for are,
01:03:02.280 | we're looking for a $500 million valuation.
01:03:04.520 | And I've never seen so many jaws drop, all in unison, right?
01:03:10.200 | And I was like thanked and walked out the door
01:03:12.280 | very kindly after.
01:03:14.240 | And then I got a call the next day
01:03:16.600 | from someone who was connected to them and said,
01:03:19.000 | they said, we just want to let you know
01:03:21.160 | that like it was pretty offensive
01:03:22.880 | that you asked for a $500 million valuation.
01:03:25.000 | And I can't tell if that was like just negotiating or what,
01:03:30.000 | but it's true, like no one offered us more, right?
01:03:32.840 | So we were-
01:03:33.680 | - So can you clarify the number again?
01:03:36.000 | You said how many million?
01:03:37.320 | - 500.
01:03:38.160 | - 500 million.
01:03:39.120 | - 500 million, yeah, half a billion.
01:03:40.880 | - Yeah.
01:03:42.760 | So in my mind, I'm anchored like, okay,
01:03:44.480 | well, literally no one's biting at 500 million.
01:03:47.000 | And eventually we would get Sequoia and Greylock
01:03:50.480 | and others together at 500 million basically post.
01:03:54.520 | It was 450 pre, I think we raised $50 million.
01:03:57.160 | But just like no one was used to seeing
01:04:00.200 | a $500 million company then.
01:04:02.800 | Like, I don't know if it was because we were just coming out
01:04:05.280 | of the 2008 hangover and things were still in recovery mode.
01:04:11.480 | But then along comes Facebook and after some negotiation,
01:04:16.480 | we 2Xed the number from half a billion to a billion.
01:04:21.960 | Yeah, it seems pretty good, you know?
01:04:25.120 | And I think Mark and I really saw eye to eye
01:04:28.480 | that this thing could be big.
01:04:30.200 | We thought their resources would help us scale it.
01:04:33.720 | And in a lot of ways it de-risked, I mean,
01:04:36.320 | it de-risked a lot of the employees' lives
01:04:38.120 | for the rest of their lives,
01:04:39.520 | including me, including Mike, right?
01:04:41.360 | I think I might've had like 10 grand
01:04:44.000 | in my bank account at the time, right?
01:04:45.720 | Like we're working hard, we had nothing.
01:04:48.520 | So on a relative basis, it seems very high.
01:04:52.360 | And then I think the last company to exit
01:04:54.960 | for anywhere close to a billion was YouTube
01:04:56.800 | that I could think of.
01:04:58.760 | And thus began the giant long bull run of 2012
01:05:03.240 | to all the way to where we are now
01:05:05.400 | where I saw some stat yesterday
01:05:07.840 | about like how many unicorns exist and it's absurd.
01:05:11.880 | But then again, never underestimate technology
01:05:14.400 | and like the value it can provide.
01:05:16.560 | And man, costs have dropped and man scale has increased.
01:05:19.880 | And you can make businesses make a lot of money now,
01:05:23.760 | but on a fundamental level, I don't know,
01:05:26.920 | like how do you describe the decision
01:05:29.760 | to sell a company with 13 people for a billion dollars?
01:05:33.080 | - So first of all, like how,
01:05:35.680 | did it take a lot of guts to sit at a table
01:05:38.160 | and say 500 million or 1 billion with Mark Zuckerberg?
01:05:42.280 | It seems like a very large number with 13,
01:05:44.800 | like especially-
01:05:46.120 | - It doesn't seem, it is.
01:05:47.760 | - It is.
01:05:48.600 | - They're all large numbers.
01:05:49.440 | - Especially like you said before the unicorn parade.
01:05:54.120 | - I like that, I'm gonna use that.
01:05:56.840 | - The unicorn parade?
01:05:57.960 | - Yeah.
01:05:58.800 | - You were at the head of the unicorn parade.
01:06:01.680 | - It's the, yeah, it's a massive unicorn parade.
01:06:04.880 | Okay, so no, I mean, we knew we were worth
01:06:09.880 | quote unquote a lot, but we didn't,
01:06:12.320 | I mean, there was no market for Instagram.
01:06:14.560 | I mean, it's not, you couldn't mark to market
01:06:16.880 | this thing in the public markets.
01:06:18.040 | You didn't quite understand what it would be worth
01:06:20.920 | or was worth at the time.
01:06:23.040 | So in a market, an illiquid market where you have one buyer
01:06:25.640 | and one seller and you're going back and forth.
01:06:27.600 | And well, I guess there were like VC firms
01:06:31.320 | who were willing to invest at a certain valuation.
01:06:34.840 | So I don't know, you just go with your gut.
01:06:38.440 | And at the end of the day, I would say
01:06:41.520 | the hardest part of it was something I didn't realize at the time.
01:06:48.400 | Like when we sold, it was tough
01:06:52.160 | because literally everywhere I went,
01:06:54.880 | restaurants, whatever, like for a good six months after,
01:06:59.200 | it was a lot of attention on the deal,
01:07:01.040 | a lot of attention on the product, a lot of attention.
01:07:04.120 | It was kind of miserable, right?
01:07:06.360 | And you're like, wait, like I made a lot of money,
01:07:08.120 | but like, why is this not great?
01:07:10.200 | And it's because it turns out, you know,
01:07:12.520 | I don't know, like I don't really keep in touch with Mark,
01:07:16.280 | but I've got to assume his job right now
01:07:17.920 | is not exactly the most happy job in the world.
01:07:19.960 | It's really tough when you're on top
01:07:21.720 | and it's really tough when you're in the limelight.
01:07:24.400 | So the decision itself was like, oh, cool, this is great.
01:07:27.480 | How lucky are we, right?
01:07:29.280 | - So, okay, there's a million questions.
01:07:30.600 | - Yeah, go, go, go.
01:07:32.080 | - First of all, why is it hard to be on top?
01:07:37.080 | Why did you not feel good?
01:07:39.400 | Like, can you dig into that?
01:07:40.800 | It always, I've heard like Olympic athletes say
01:07:45.080 | after they win gold, they get depressed.
01:07:48.320 | Is it something like that where it feels like
01:07:52.960 | it was kind of like a thing you were working towards?
01:07:56.400 | - Yeah, sure.
01:07:57.240 | - Some loose definition of success.
01:07:58.920 | And this sure feels like,
01:08:00.600 | at least according to other startups,
01:08:03.480 | this is what success looks like.
01:08:04.880 | And now why don't I feel any better?
01:08:08.480 | I'm still human, I still have all the same problems.
01:08:10.600 | Is that the nature?
01:08:11.760 | Or is it just like negative attention of some kind?
01:08:14.440 | - I think it's all of the above.
01:08:16.080 | But to be clear, there was a lot of happiness
01:08:18.600 | in terms of like, oh my God, this is great.
01:08:20.840 | Like we won the Superbowl of startups, right?
01:08:25.720 | Anyone who can get to a liquidity event
01:08:28.760 | of anything meaningful feels like,
01:08:31.080 | wow, this is what we started out to do.
01:08:32.920 | Of course we want to create great things that people love,
01:08:35.160 | but like we won in a big way.
01:08:38.640 | But yeah, there's this big like,
01:08:40.080 | oh, if we won, what's next?
01:08:43.960 | So they call it the we have arrived syndrome,
01:08:46.960 | which I need to go back and look
01:08:51.240 | where I can quote that from.
01:08:52.720 | But I remember reading about it at the time.
01:08:54.200 | I was like, oh yeah, that's that.
01:08:56.120 | And I remember we had a product manager leave very early on
01:08:59.280 | when we got to Facebook.
01:09:00.440 | And he said to me, I just don't believe I can learn anything
01:09:03.800 | at this company anymore.
01:09:05.160 | It's like, it's hit its apex.
01:09:08.240 | We sold it, great.
01:09:09.960 | I just don't have anything else to learn.
01:09:12.400 | So from 2012, all the way to the day I left in 2018,
01:09:16.000 | like the amount I learned and the humility
01:09:18.960 | with which I realized, oh, we thought we won.
01:09:22.440 | A billion dollars is cool,
01:09:23.560 | but like there are a hundred billion dollar companies.
01:09:26.200 | And by the way, on top of that, we had no revenue.
01:09:29.640 | We had, I mean, we had a cool product,
01:09:31.320 | but we didn't scale it yet.
01:09:32.720 | And there's so much to learn.
01:09:34.960 | And then competitors and how fun was it to fight Snapchat?
01:09:38.360 | Oh my God, like it was, it's like Yankees, Red Sox.
01:09:42.040 | It's great.
01:09:42.920 | Like that's what you live for.
01:09:44.640 | You know, you win some, you lose some,
01:09:47.920 | but the amount you can learn through that process,
01:09:52.720 | what I've realized in life is that there is no top,
01:09:56.920 | there's always someone who has more,
01:09:58.680 | there's always more challenge, just at different scales.
01:10:02.320 | And it sounds like a little Buddhist,
01:10:04.960 | but everything is super challenging,
01:10:09.720 | whether you're like a small business
01:10:11.160 | or an enormous business.
01:10:13.160 | I say like, choose the game you like to play, right?
01:10:17.120 | You've got to imagine that if you're
01:10:18.320 | an amazing basketball player,
01:10:19.680 | you enjoy to some extent practicing basketball.
01:10:22.920 | It's gotta be something you love.
01:10:24.040 | It's gonna suck.
01:10:24.880 | It's gonna be hard.
01:10:26.320 | You're gonna have injuries, right?
01:10:27.480 | But you gotta love it.
01:10:28.440 | And the same thing with Instagram,
01:10:30.480 | which is we might've sold, but it was like, great.
01:10:35.040 | There's one Superbowl title.
01:10:37.240 | Can we win five?
01:10:38.720 | What else can we do?
01:10:40.240 | Now I imagine you didn't ask this, but okay, so I left.
01:10:43.560 | There's a little bit of like, what do you do next?
01:10:46.680 | Right, like, how do you top that thing?
01:10:49.240 | It's the wrong question.
01:10:50.720 | The question is like, when you wake up every day,
01:10:53.120 | what is the hardest, most interesting thing
01:10:54.920 | you can go work on?
01:10:56.520 | Because like, at the end of the day,
01:10:58.200 | we all turn into dirt, doesn't matter, right?
01:11:00.800 | But what does matter is like,
01:11:02.600 | can we really enjoy this life?
01:11:05.440 | Not in a hedonistic way, because that's,
01:11:07.520 | it's like the reinforcement learning,
01:11:09.800 | learning like short-term versus long-term objectives.
01:11:12.880 | Can you wake up every day and truly enjoy
01:11:16.560 | what you're doing, knowing that it's gonna be painful?
01:11:21.280 | Knowing that like, no matter what you choose,
01:11:24.120 | it's gonna be painful.
01:11:25.560 | Whether you sit on a beach or whether you manage
01:11:28.480 | a thousand people or 10,000, it's gonna be painful.
01:11:31.320 | So choose something that's fun to have pain.
01:11:34.360 | But yes, there was a lot of, we have arrived
01:11:38.560 | and it's a maturation process you just have to go through.
01:11:44.440 | So no matter how much success there is,
01:11:47.120 | how much money you make, you have to wake up the next day
01:11:50.120 | and choose the hard life, whatever that means next.
01:11:52.880 | That's fun, the fun/hard life.
01:11:55.480 | Well, hard life, that's fun.
01:11:56.720 | - I guess what I'm trying to say is slightly different,
01:12:00.400 | which is just that no one realizes
01:12:03.000 | everything's gonna be hard.
01:12:05.360 | Even chilling out is hard.
01:12:07.800 | And then you just start worrying about stupid stuff.
01:12:09.960 | Like, I don't know, like did so-and-so forget
01:12:13.160 | to paint the house today?
01:12:14.400 | Or like, did the gardener come or whatever?
01:12:16.240 | Like, or, oh, I'm so angry and my shipment of wine
01:12:19.400 | didn't show up and I'm sitting here on the beach
01:12:21.200 | without my wine.
01:12:22.160 | I don't know, I'm making shit up now.
01:12:23.480 | But like--
01:12:24.320 | - It turns out that even chilling,
01:12:25.920 | aka meditation, is hard work.
01:12:27.760 | - Yeah, and at least meditation is like productive chilling,
01:12:31.000 | where you're actually training yourself
01:12:32.600 | to calm down and just be. But backing up for a moment:
01:12:36.280 | Everything's hard.
01:12:37.320 | You might as well be like playing in the game
01:12:40.240 | you love to play.
01:12:41.080 | And I just like playing and winning.
01:12:43.560 | And I'm still in, I think,
01:12:48.560 | the first half of life, knock on wood.
01:12:50.680 | And I've got a lot of years.
01:12:53.320 | And what am I gonna do, sit around?
01:12:54.720 | And the other way of looking at this, by the way,
01:12:57.160 | imagine you made one movie and it was great.
01:13:02.400 | Would you just like stop making movies?
01:13:05.160 | No, generally you're like, wow,
01:13:06.960 | I really like making movies.
01:13:08.200 | Let's make another one.
01:13:09.400 | A lot of times, by the way,
01:13:10.360 | the second one or the third one, not that great.
01:13:11.960 | But the fourth one, awesome.
01:13:13.960 | And everyone forgets the second and the third one.
01:13:17.520 | So there's just this constant process of like,
01:13:20.720 | can I produce?
01:13:22.920 | And is that fun?
01:13:23.760 | Is that exciting?
01:13:24.600 | What else can I learn?
01:13:25.440 | So this machine learning stuff for me
01:13:26.800 | has been this awesome new chapter of being like,
01:13:30.640 | man, that's something I didn't understand at all.
01:13:33.640 | And now I feel like I'm 1/10 of the way there.
01:13:37.160 | And that feels like a big mountain to climb.
01:13:40.240 | So I distracted us from the original question.
01:13:42.600 | - No, and we'll return to the machine learning
01:13:44.560 | 'cause I'd love to explore your interest there.
01:13:46.520 | But I mean, speaking of sort of challenges and hard things,
01:13:50.040 | is there a possible world where sitting in a room
01:13:54.080 | with Mark Zuckerberg with a $1 billion deal,
01:13:58.400 | you turn it down?
01:14:00.000 | - Yeah, of course.
01:14:01.120 | - What does that world look like?
01:14:03.240 | Why would you turn it down?
01:14:04.640 | Why did you take it?
01:14:05.640 | What was the calculation that you were making?
01:14:08.440 | - That's enters the world of counterfactuals
01:14:11.480 | and not really knowing.
01:14:13.160 | And if only we could run that experiment.
01:14:14.960 | - Well, the universe exists.
01:14:16.440 | It's just running in parallel to our own.
01:14:18.640 | - Yeah, it's so fascinating, right?
01:14:21.000 | I mean, we're talking a lot about money,
01:14:24.720 | but the real question was,
01:14:26.360 | I'm not sure you'll believe me when I say this,
01:14:30.480 | but could we strap our little company
01:14:33.640 | onto the side of a rocket ship
01:14:36.160 | and get out to a lot of people really, really quickly
01:14:39.560 | with the support, with the talent of a place like Facebook?
01:14:44.560 | I mean, people often ask me what I would do differently
01:14:48.200 | at Instagram today.
01:14:49.040 | And I say, well, I'd probably hire more carefully
01:14:51.240 | 'cause we showed up just like before I knew it,
01:14:53.760 | we had like 100 people on the team and 200, then 300.
01:14:57.800 | I don't know where all these people were coming from.
01:14:59.560 | I never had to recruit them.
01:15:00.800 | I never had to screen them.
01:15:02.400 | They were just like internal transfers, right?
01:15:05.120 | So it's like relying on the Facebook hiring machine,
01:15:07.480 | which is quite sort of, I mean, it's an elaborate machine.
01:15:11.800 | - It's great, by the way.
01:15:12.800 | They have really talented people there.
01:15:15.320 | But my point is,
01:15:16.960 | the choice was like, take this thing,
01:15:21.480 | put it on the side of a rocket ship
01:15:22.960 | that you know is growing very quickly.
01:15:25.000 | Like I had seen what had happened
01:15:27.280 | when Ev sold Blogger to Google and then Google went public.
01:15:30.920 | Remember, we sold before Facebook went public.
01:15:34.620 | There was a moment at which the stock price was $17,
01:15:37.680 | by the way, Facebook stock price was $17.
01:15:40.200 | I remember thinking, what the did I just do, right?
01:15:43.360 | Now at 320-ish, I don't know where we are today,
01:15:48.200 | but like, okay, like the best thing by the way is like,
01:15:53.200 | when the stock is down, everyone calls you a dope.
01:15:56.800 | And then when it's up, they also call you a dope,
01:15:58.680 | but just for a different reason, right?
01:16:01.480 | Like you can't win.
01:16:02.520 | - Lesson in there somewhere.
01:16:03.840 | - Yeah.
01:16:04.680 | - So, but you know, the choice is to strap yourself
01:16:06.720 | to a rocket ship or to build your own.
01:16:08.680 | You know, Mr. Elon built his own,
01:16:11.120 | literally with a rocket ship.
01:16:12.800 | That's a difficult choice because there's a world.
01:16:18.360 | - Actually, I would say something different,
01:16:19.800 | which is Elon and others decided to sell PayPal
01:16:23.280 | for not that much.
01:16:25.140 | I mean, how much was it, about a billion dollars?
01:16:26.920 | I can't remember. - Something like that, yeah.
01:16:28.400 | - I mean, it was early, but it's worth a lot more now.
01:16:31.560 | - To then build a new rocket ship.
01:16:33.160 | So this is the cool part, right?
01:16:34.800 | If you are an entrepreneur and you own a controlling stake
01:16:38.680 | in the company, not only is it really hard
01:16:41.920 | to do something else with your life,
01:16:43.400 | because all of the value is tied up in you
01:16:47.480 | as a personality attached to this company, right?
01:16:50.920 | But if you sell it and you're getting yourself
01:16:53.000 | enough capital and you like have enough energy,
01:16:55.840 | you can do another thing or 10 other things.
01:16:57.980 | Or in Elon's case, like a bunch of other things.
01:16:59.960 | I don't know, like I lost count at this point.
01:17:02.480 | (laughing)
01:17:03.360 | And it might've seemed silly at the time.
01:17:05.600 | And sure, like if you look back,
01:17:07.300 | man, PayPal's worth a lot now, right?
01:17:09.600 | But I don't know, like, do you think Elon like cares about,
01:17:13.300 | like, are we gonna buy Pinterest or not?
01:17:15.240 | Like, he created massive capital
01:17:20.240 | that allowed him to do what he wants to do.
01:17:23.280 | And that's awesome.
01:17:24.120 | That's more freeing than anything,
01:17:25.480 | because when you are an entrepreneur attached to a company,
01:17:28.400 | you gotta stay at that company for a really long time.
01:17:30.720 | It's really hard to remove yourself.
01:17:32.960 | - But I'm not sure how much he loved PayPal
01:17:35.000 | versus SpaceX and Tesla.
01:17:37.440 | I have a sense that you love Instagram.
01:17:41.400 | - Yeah, I loved it enough to work
01:17:43.600 | for six years beyond the deal.
01:17:45.120 | - Which is rare, which is very rare.
01:17:46.680 | You chose.
01:17:47.520 | - But can I tell you why?
01:17:48.680 | - Sure, please.
01:17:51.160 | - There are not a lot of companies that you can be part of
01:17:55.040 | where the Pope's like,
01:17:57.360 | I would like to sign up for your product.
01:17:59.240 | Like I'm not a religious person at all.
01:18:01.600 | I'm really not.
01:18:03.200 | But when you go to the Vatican
01:18:04.520 | and you're like walking among giant columns
01:18:07.040 | and you're hearing the music and everything,
01:18:08.400 | and like the Pope walks in
01:18:10.540 | and he wants to press the sign up button on your product.
01:18:12.600 | Like it's a moment in life, okay?
01:18:15.040 | No matter what your persuasion, okay?
01:18:17.420 | The number of doors and experiences that that opened up was,
01:18:22.920 | it was incredible.
01:18:23.920 | I mean, the people I got to meet, the places I got to go,
01:18:27.800 | I assume maybe like a payments company
01:18:30.120 | is slightly different, right?
01:18:32.040 | But that's why, like it was so fun.
01:18:34.760 | And plus I truly believed
01:18:36.760 | we were building such a great product
01:18:38.560 | and I loved the game.
01:18:40.840 | It wasn't about the money, it was about the game.
01:18:43.560 | - Do you think you had the guts to say no?
01:18:46.840 | Is that, so here's, I often think about this,
01:18:49.360 | like how hard is it for an entrepreneur to say no?
01:18:52.480 | Because of the peer pressure.
01:18:54.880 | Basically the sea of entrepreneurs
01:18:57.880 | in Silicon Valley are gonna tell you,
01:18:59.720 | I mean, this is their dream.
01:19:01.680 | The thing sitting before you is a dream.
01:19:05.200 | To walk away from that is really,
01:19:07.480 | it seems like nearly impossible.
01:19:09.460 | Because Instagram could in 10 years be,
01:19:15.640 | you know, you could talk about Google,
01:19:17.440 | you could be making self-driving cars
01:19:20.280 | and building rockets that go to Mars
01:19:21.920 | and compete with SpaceX.
01:19:23.600 | - Totally.
01:19:24.440 | - And so that's an interesting decision to say,
01:19:27.340 | am I willing to risk it?
01:19:32.120 | The reason I also say it's an interesting decision
01:19:34.520 | because it feels like per our previous discussion,
01:19:37.800 | if you're launching a social network company,
01:19:42.160 | there's going to be that meeting,
01:19:43.800 | whatever that number is, if you're successful.
01:19:46.720 | If you're in this rocket ship of success,
01:19:49.160 | there's going to be a meeting
01:19:50.780 | with one of the social media, social network companies,
01:19:53.800 | that want to buy you, whether it's Facebook or Twitter,
01:19:58.280 | but it could also very well be Google,
01:20:00.280 | who seems to have like a graveyard
01:20:04.520 | of failed social networks.
01:20:06.600 | And it's, I mean, I don't know.
01:20:08.960 | I think about that, how difficult it is
01:20:11.480 | for an entrepreneur to make that decision.
01:20:13.680 | How many have successfully made that decision, I guess,
01:20:16.640 | is a big question.
01:20:18.200 | It's sad to me, to be honest,
01:20:20.360 | that too many make that decision,
01:20:21.960 | perhaps for the wrong reason.
01:20:23.640 | - Sorry, when you say make the decision,
01:20:25.520 | you mean to the affirmative.
01:20:26.680 | - To the affirmative, yeah. - Got it, yeah.
01:20:28.600 | There are also companies that don't sell, right?
01:20:31.400 | And take the path and say, we're going to be independent.
01:20:34.840 | And then you've never heard of them again.
01:20:37.320 | Like I remember Path, right?
01:20:40.360 | Was one of our competitors early on.
01:20:42.300 | There was a big moment when they had,
01:20:45.800 | I can't remember what it was,
01:20:46.840 | like a $110 million offer from Google or something.
01:20:50.760 | It might've been larger, I don't know.
01:20:52.660 | And I remember there was like this big TechCrunch article
01:20:56.560 | that was like, they turned it down
01:20:58.440 | after talking deeply about their values and everything.
01:21:01.760 | And I don't know the inner workings of Foursquare,
01:21:05.720 | but like, I'm certain there were many conversations
01:21:08.480 | over time where there were companies
01:21:09.960 | that wanted Foursquare as well.
01:21:11.760 | Recently, like, I mean, what other companies?
01:21:15.920 | There's Clubhouse, right?
01:21:16.960 | Like, I don't know, maybe people were really interested
01:21:19.440 | in them too.
01:21:20.560 | Like, there are plenty of moments where people say no
01:21:25.560 | and we just forget that those things happen.
01:21:28.480 | We only focus on the ones where like,
01:21:31.200 | they said yes and like, wow,
01:21:34.200 | like what if they had stayed independent?
01:21:36.280 | So I don't know, I used to think a lot about this
01:21:38.920 | and now I just don't because I'm like, whatever.
01:21:41.320 | Things have gone pretty well.
01:21:44.840 | I'm ready for the next game.
01:21:45.960 | I mean, think about an athlete where,
01:21:50.280 | I don't know, maybe they do something wrong
01:21:53.040 | in the World Series or whatever.
01:21:54.080 | And like, if you let it haunt you
01:21:56.080 | for the rest of your career, like,
01:21:58.440 | why not just be like, I don't know, it was a game,
01:22:00.360 | next game, next shot, right?
01:22:02.920 | If you just move to that world,
01:22:05.080 | like at least I have a next shot, right?
01:22:07.440 | - No, that's beautiful.
01:22:08.600 | But I mean, just in insights,
01:22:10.720 | and it's funny you brought up Clubhouse, it is very true.
01:22:14.200 | It seems like Clubhouse is, you know, on the downward path
01:22:19.480 | and it's very possible to see a billion plus dollar deal
01:22:23.240 | at some stage, maybe like a year ago, half a year ago,
01:22:27.200 | from Facebook, from Google.
01:22:28.520 | I think Facebook was flirting with that idea too.
01:22:30.960 | And I think a lot of companies probably were.
01:22:34.760 | I wish it was more public.
01:22:37.120 | You know what, there's not like a badass public story
01:22:41.200 | about them making the decision to walk away.
01:22:43.880 | We just don't hear about it
01:22:45.000 | and then we get to see the results of that,
01:22:47.000 | the success or the failure, more often failure.
01:22:49.360 | - So a couple of things.
01:22:51.480 | One is, I would not assume Clubhouse
01:22:53.680 | is down for the count at all.
01:22:54.920 | They're young, they have plenty of money,
01:22:56.560 | they're run by really smart people.
01:22:58.560 | I'd give them like a very fighting chance to figure it out.
01:23:02.440 | There are a lot of times when people call Twitter
01:23:04.160 | down for the count and they figure it out
01:23:05.840 | and they seem to be doing well, right?
01:23:08.360 | So just backing up, like,
01:23:09.800 | and not knowing anything about their internals,
01:23:11.800 | like there's a strong chance they will figure it out
01:23:15.560 | and that people are just down
01:23:16.760 | 'cause they like being down about companies.
01:23:18.520 | They like assuming that they're gonna fail.
01:23:20.080 | So who knows, right?
01:23:21.240 | But let's take the ones in the past
01:23:22.600 | where like we know how it played out.
01:23:24.760 | There are plenty of examples
01:23:25.920 | where people have turned down big offers
01:23:28.160 | and then you've just never heard from them again,
01:23:30.920 | but we never focus on the companies
01:23:32.440 | 'cause you just forget that those were big.
01:23:34.680 | But inside your psyche,
01:23:36.200 | I think it's easy for someone with enough money
01:23:42.240 | to say money doesn't matter,
01:23:43.800 | which I think is like, it's bullshit.
01:23:45.640 | Of course money matters.
01:23:46.720 | It does to people.
01:23:47.560 | But at the moment, you just can't even grasp
01:23:51.560 | like the number of zeros that you're talking about.
01:23:53.920 | It just doesn't make sense, right?
01:23:56.600 | So to think rationally in that moment
01:23:58.560 | is not something many people are equipped to do,
01:24:01.360 | especially not people where I think we had founded
01:24:03.880 | the company a year earlier, maybe two years earlier,
01:24:06.720 | like a year and a half, we were 13 people.
01:24:08.760 | But I will say, I still don't know
01:24:13.360 | if it was the right decision
01:24:14.400 | 'cause I don't have that counterfactual.
01:24:15.960 | I don't know that other world.
01:24:17.600 | I'm just thankful that by and large,
01:24:21.240 | most people love Instagram, still do.
01:24:23.800 | By and large, people are very happy
01:24:25.520 | with like the time we had there
01:24:28.240 | and I'm proud of what we built.
01:24:29.680 | So like, I'm cool.
01:24:31.640 | Like we're, you know, now it's next shot, right?
01:24:35.600 | - Well, if we could just linger on this.
01:24:38.000 | - Sure. - Yankees versus Red Sox,
01:24:40.360 | the fun of it, the competition over,
01:24:42.440 | I would say over the space of features.
01:24:45.400 | So there are a bunch of features,
01:24:49.200 | like there's photos, there's one minute videos on Instagram,
01:24:54.200 | there's IGTV, there's Stories, there's Reels, there's Live.
01:24:58.880 | So that sounds like it's like a long list
01:25:01.960 | of too much stuff, but it's not
01:25:04.320 | because it feels like they're close together,
01:25:07.560 | but they're somehow, like what we're saying,
01:25:09.600 | fundamentally distinct, like each of the things I mentioned.
01:25:13.320 | Maybe can you describe the philosophies,
01:25:15.920 | the design philosophies behind some of these,
01:25:17.840 | how you were thinking about it during the historic war
01:25:22.480 | between Snapchat and Instagram, or just in general,
01:25:25.800 | like this space of features that was discovered.
01:25:30.080 | - There's this great book by Clay Christensen
01:25:34.320 | called "Competing Against Luck."
01:25:36.240 | It's like a terrible title,
01:25:39.280 | but within it, there's effectively an expression
01:25:43.440 | of this thing called jobs to be done theory.
01:25:45.760 | And it's unclear if like he came up with it
01:25:49.000 | or some of his colleagues;
01:25:50.720 | there are a bunch of places where you can find
01:25:52.040 | people claiming to have come up
01:25:53.280 | with this jobs to be done theory.
01:25:55.120 | But the idea is if you like zoom out
01:25:59.160 | and you look at your product, you ask yourself,
01:26:01.680 | why are people hiring your product?
01:26:04.120 | Like imagine every product in your life
01:26:07.000 | is effectively an employee, you know,
01:26:10.120 | your CEO of your life,
01:26:11.680 | and you hire products to be employees effectively.
01:26:13.840 | They all have roles and jobs, right?
01:26:15.640 | Why are you hiring a product?
01:26:18.360 | Why do you want that product
01:26:19.560 | to perform something in your life?
01:26:21.000 | And what are the hidden reasons
01:26:22.680 | why you're in love with this product?
01:26:25.600 | Instagram was about sharing your life
01:26:28.800 | with others visually, period, right?
01:26:31.640 | Why? Because you feel connected with them.
01:26:34.320 | You get to show off.
01:26:36.800 | You get to feel good and cared about, right?
01:26:39.920 | With likes and so on. And it turns out that that will,
01:26:44.920 | I think forever define Instagram
01:26:47.560 | and any product that serves that job
01:26:50.800 | is going to do very well, okay?
01:26:53.400 | Stories, let's take it as an example,
01:26:56.240 | is very much serving that job.
01:26:59.080 | In fact, it serves it better than the original product
01:27:01.960 | because when you're large and have an enormous audience,
01:27:06.120 | you're worried about people seeing your stuff
01:27:08.040 | or you're worried about posts being permanent,
01:27:09.520 | so that a college admissions person
01:27:11.600 | is going to see a photo of you doing something.
01:27:13.480 | And so it turns out that that is a more efficient way
01:27:16.680 | of performing that job than the original product was.
01:27:19.800 | The original product still has its value,
01:27:22.640 | but at scale, these two things together
01:27:24.680 | work really, really well.
01:27:26.400 | Now, I will claim that other parts of the product over time
01:27:30.920 | didn't perform that job as well.
01:27:32.760 | I think IGTV probably didn't, right?
01:27:35.640 | Shopping is like completely unrelated
01:27:38.280 | to what I just described,
01:27:40.120 | but it might work, I don't know, right?
01:27:42.120 | Products I think, products that succeed
01:27:46.800 | are products that all share this parent node
01:27:49.560 | of like this job to be done that is in common.
01:27:52.080 | And then there's just like different ways of doing it, right?
01:27:55.400 | Apple, I think does a great job with this, right?
01:27:57.680 | It's like managing your digital life
01:27:59.680 | and all the products just work together.
01:28:01.960 | They sync, they like, it's beautiful, right?
01:28:06.200 | Even if they require like silly specific cords to work,
01:28:10.440 | but they're all part of a system.
01:28:12.920 | It's when you leave that system
01:28:14.240 | and you start doing something weird
01:28:15.640 | that people start scratching their head.
01:28:17.840 | And I think you are less successful.
01:28:19.320 | So I think one of the challenges Facebook
01:28:20.920 | has had throughout its life
01:28:22.480 | is that it has never fully, I think, appreciated the job
01:28:25.920 | to be done of the main product.
01:28:28.120 | And what it's done is said,
01:28:29.240 | ooh, there's a shiny object over there.
01:28:31.440 | That startup's getting some traction.
01:28:32.800 | Let's go copy that thing.
01:28:34.480 | And then they're confused why it doesn't work.
01:28:36.920 | Like, why doesn't it work?
01:28:37.960 | It's because the people who show up for this
01:28:40.160 | don't want that, it's different.
01:28:42.640 | - What's the purpose of Facebook?
01:28:44.320 | So I remember I was a very early Facebook user.
01:28:47.520 | The reason I was personally excited about Facebook
01:28:50.720 | is because you can, first of all, use your real name.
01:28:55.600 | Like I can exist in this world.
01:28:58.080 | I can, like, formally exist.
01:29:00.840 | I like anonymity for certain things, Reddit and so on,
01:29:04.800 | but I want it to also exist not anonymously
01:29:09.280 | so that I can connect with other friends of mine
01:29:12.280 | not anonymously.
01:29:13.640 | And there's a reliable way to know that I'm real
01:29:17.320 | and they're real and that we're connecting.
01:29:20.000 | And it's kind of like, I liked it for the reasons
01:29:24.840 | that people like LinkedIn, I guess.
01:29:27.140 | But like without the formality, like not everybody is dressed up
01:29:30.940 | and being super polite, like more like with friends.
01:29:34.620 | But then it became something much bigger than that,
01:29:37.580 | I suppose, there's a feed.
01:29:39.260 | It became this, I mean, it became a place
01:29:44.260 | to get, discover content, to share content
01:29:49.420 | that's not just about connecting directly with friends.
01:29:53.780 | I mean, it became something else.
01:29:54.800 | I don't even know what it is really.
01:29:56.580 | - So you said Instagram is a place
01:29:58.860 | where you visually share your life.
01:30:02.140 | What is Facebook?
01:30:03.660 | - Well, let's go back to the founding of Facebook
01:30:06.180 | and why it worked really well initially at Harvard
01:30:09.700 | and then Dartmouth and Stanford.
01:30:11.340 | And I can't remember, probably MIT.
01:30:13.020 | There were like a handful of schools
01:30:14.660 | in that first tranche, right?
01:30:16.100 | It worked because there are communities
01:30:19.840 | that exist in the world that want to transact.
01:30:23.340 | And when I say transact, I don't mean commercially.
01:30:25.380 | I just mean they want to share, they want to coordinate,
01:30:28.420 | they want to communicate, they want a space for themselves.
01:30:32.060 | And Facebook at its best, I think is that.
01:30:36.380 | And if actually you look at the most popular products
01:30:39.240 | that Facebook has built over time,
01:30:42.100 | if you look at things like Groups, Marketplace,
01:30:45.300 | Groups is enormous.
01:30:47.540 | And Groups is effectively like everyone can found
01:30:50.380 | their own little Stanford or Dartmouth or MIT, right?
01:30:52.900 | And find each other and share and communicate
01:30:57.100 | about something that matters deeply to them.
01:31:00.380 | That is the core of what Facebook was built around.
01:31:03.880 | And I think today is where it stands most strongly.
01:31:08.880 | - It's brilliant.
01:31:09.980 | The Groups, I wish Groups were done better.
01:31:13.680 | It feels like it's not a first-class citizen.
01:31:16.100 | I know I may be saying something without much knowledge,
01:31:19.840 | but it feels like it's kind of bolted on
01:31:24.080 | while being used a lot.
01:31:26.220 | It feels like there needs to be a little bit more structure
01:31:29.020 | in terms of discovery, in terms of like-
01:31:32.100 | - Yeah, I mean, look at Reddit.
01:31:33.860 | Like Reddit is basically groups that are public and open
01:31:36.620 | and a little bit crazy, right?
01:31:38.680 | In a good way.
01:31:39.520 | But there's clear product market fit
01:31:42.480 | for that specific use case.
01:31:44.900 | And it doesn't have to be a college.
01:31:46.380 | It can be anything.
01:31:47.220 | It can be a small group, a big group.
01:31:48.500 | It can be group messaging.
01:31:50.220 | Facebook shines, I think, when it leans into that.
01:31:53.260 | I think when there are other companies
01:31:56.640 | that just seem exciting,
01:31:58.340 | and now all of a sudden the product shifts
01:32:01.300 | in some fundamental way to go try to compete
01:32:03.960 | with that other thing,
01:32:05.840 | that's when I think consumers get confused.
01:32:08.280 | Even if you can be successful,
01:32:10.700 | like even if you can compete with that other company,
01:32:13.060 | even if you can figure out how to bolt it on,
01:32:16.240 | eventually you come back and you look at the app
01:32:18.020 | and you're like, "I just don't know why I opened this app.
01:32:20.660 | Like why, like too many things going on."
01:32:23.220 | And that was always a worry.
01:32:24.620 | I mean, you listed all the things at Instagram
01:32:26.260 | and it almost gave me a heart attack,
01:32:27.620 | like way too many things.
01:32:30.500 | But I don't know, entrepreneurs get bored.
01:32:31.920 | They want to add things.
01:32:32.940 | They want to like, right?
01:32:34.180 | I don't have a good answer for it,
01:32:37.320 | except for that, I think being true
01:32:40.180 | to your original use case,
01:32:41.540 | and not even original use case,
01:32:43.100 | but sorry, actually not use case, original job.
01:32:46.300 | There are many use cases under that job.
01:32:49.340 | Being true to that and like being really good at it
01:32:52.260 | over time and morphing as needs change,
01:32:56.140 | I think that's how to make a company last forever.
01:32:59.540 | And I mean, honestly, like my main thesis
01:33:03.900 | about why Facebook is in the position it is today
01:33:07.580 | is that if they had had a series of product launches
01:33:12.580 | that delighted people over time,
01:33:16.340 | I think they'd be in a totally different world.
01:33:17.980 | So just like imagine for a moment,
01:33:20.420 | and by the way, Apple's entering this,
01:33:21.780 | but like Apple for so long,
01:33:23.860 | just like product after product,
01:33:25.300 | you couldn't wait for it.
01:33:26.620 | You stood in line for it.
01:33:27.900 | You talked about it.
01:33:28.740 | You got excited.
01:33:29.940 | Amazon makes your life so easy.
01:33:31.780 | It's like, "Wow, I needed this thing
01:33:34.220 | and it showed up at my door two days later."
01:33:36.180 | And like both of these companies, by the way,
01:33:39.160 | Amazon, Apple have issues, right?
01:33:41.220 | There are labor issues,
01:33:43.260 | whether it's here in the US or in China,
01:33:45.380 | there are environmental issues.
01:33:47.460 | But like when's the last time you heard
01:33:49.980 | like a large chorus being like,
01:33:52.740 | "These companies better pay
01:33:54.020 | for what they're doing on these things," right?
01:33:56.220 | I think Facebook's main issue today
01:33:58.540 | is like you need to produce a hit.
01:34:01.740 | If you don't produce hits,
01:34:03.780 | it's really hard to keep consumers on your side.
01:34:06.460 | Then people just start picking on you
01:34:08.260 | for a variety of reasons,
01:34:09.460 | whether it's right or wrong.
01:34:11.260 | I'm not even gonna place a judgment right here and right now.
01:34:13.340 | I'm just gonna say that it is way better to be in a world
01:34:17.900 | where you're producing hits
01:34:19.180 | and consumers love what you're doing
01:34:21.900 | because then they're on your side.
01:34:23.460 | And I think that it's the past 10 years
01:34:26.580 | for Facebook has been fairly hard on this dimension.
01:34:29.900 | - So, and by hits,
01:34:31.260 | it doesn't necessarily mean financial hits.
01:34:33.020 | It feels like to me what you're saying
01:34:34.580 | is something that brings joy.
01:34:36.860 | - Yeah.
01:34:37.700 | - A product that brings joy
01:34:38.520 | to some fraction of the population.
01:34:40.100 | - Yeah, I mean, TikTok isn't just literally an algorithm.
01:34:44.460 | In some ways TikTok's content and algorithm
01:34:49.460 | have more sway now over the American psyche
01:34:53.420 | than Facebook's algorithm, right?
01:34:54.860 | It's visual, it's video.
01:34:57.220 | By the way, it's not defined by who you follow.
01:34:59.480 | It's defined by some magical thing that, by the way,
01:35:01.620 | if someone wanted to tweak
01:35:02.700 | to show you a certain type of content for some reason,
01:35:05.540 | they could, right?
01:35:08.820 | And people love it.
01:35:09.820 | - So as a CEO, let me ask you a question
01:35:13.940 | 'cause leadership matters.
01:35:15.560 | This is a complicated question.
01:35:19.900 | Why is Mark Zuckerberg distrusted, disliked,
01:35:23.780 | and sometimes even hated by many people in public?
01:35:27.040 | - Right, that is a complicated question.
01:35:30.160 | Well, the premise,
01:35:31.340 | I'm not sure I agree with the premise.
01:35:34.700 | - And I can expand that to include
01:35:38.100 | even a more mysterious question for me, Bill Gates.
01:35:41.840 | - What is the Bill Gates version of the question?
01:35:47.380 | Do you think people hate Bill Gates?
01:35:49.660 | - No, distrust.
01:35:51.340 | - Ah.
01:35:52.300 | - So takeaway one, it's a checklist.
01:35:55.680 | I think Mark Zuckerberg's distrust is the primary one,
01:36:01.220 | but there's also like a dislike,
01:36:04.220 | maybe hate is too strong a word,
01:36:05.580 | but it's just, if you look at like the articles
01:36:09.020 | that are being written and so on,
01:36:10.880 | there's a dislike and it makes,
01:36:14.500 | it's confusing to me because it's like the public
01:36:17.020 | picks certain individuals
01:36:18.540 | and they attach certain kinds of emotions
01:36:21.940 | to those individuals.
01:36:22.940 | - Yeah, so someone once just recently said,
01:36:27.340 | "There's a strong case that founder-led companies
01:36:30.460 | "have this problem and that a lot of Mark's issues
01:36:34.400 | "today come from the fact that he is a visible founder
01:36:38.320 | "with this story that people have watched in both a movie
01:36:42.900 | "and they followed along and he's this boy wonder kid
01:36:45.440 | "who became one of the world's richest people."
01:36:48.400 | And he's no longer Mark the person,
01:36:50.560 | he's Mark this image of a person
01:36:53.220 | with enormous wealth and power.
01:36:55.640 | And in today's world, we have issues
01:36:59.020 | with enormous wealth and power for a variety of reasons.
01:37:02.260 | One of which is we've been stuck inside,
01:37:04.300 | for a year and a half, two years,
01:37:07.400 | one of which is a lot of people were really unhappy
01:37:10.400 | about not the last election, but the last, last election.
01:37:13.860 | And where do you take out that anger?
01:37:15.240 | Who do you blame but the people in charge?
01:37:18.360 | That's one example or one reason why I think
01:37:21.200 | a lot of people express anger or resentment
01:37:26.560 | or unhappiness with Mark.
01:37:29.680 | At the same time, I don't know,
01:37:31.640 | I pointed out to that person, I was like,
01:37:33.400 | "Well, I don't know,
01:37:34.980 | "I think a lot of people really like Elon."
01:37:37.200 | Like Elon arguably, like he kept his factory open here
01:37:41.360 | throughout COVID protocols,
01:37:42.640 | which arguably a lot of people would be against.
01:37:47.220 | - While saying a bunch of crazy offensive things
01:37:50.680 | on the internet, they still--
01:37:52.360 | - And like basically, gives the middle finger to the SEC,
01:37:56.720 | like on Twitter and like, I don't know,
01:37:59.100 | I'm like, well, there's a founder
01:38:00.420 | and like people kind of like him.
01:38:02.920 | - So I do think that the founder and slash CEO of a company
01:38:07.920 | that's a social network company
01:38:11.680 | is like an extra level of difficulty.
01:38:13.860 | If life is a video game,
01:38:15.440 | you just chose the harder video game.
01:38:17.720 | So, I mean, that's why it's interesting to ask you
01:38:20.120 | because you were the founder and CEO of a social network.
01:38:24.600 | - Right, exactly. - I challenge it because--
01:38:25.920 | - Exactly.
01:38:26.960 | - But you're one of the rare examples,
01:38:30.780 | even Jack Dorsey's this light, not to the degree,
01:38:34.500 | but it just seems harder
01:38:35.860 | when you're running a social media company.
01:38:38.900 | - It's interesting.
01:38:39.740 | I never thought of Jack as just like,
01:38:41.380 | I think generally he's well-respected.
01:38:43.980 | - Yeah, I think so.
01:38:45.940 | I think you're right, but like he's not loved.
01:38:50.540 | - Yeah, no, I--
01:38:51.380 | - And I feel like you, I mean, to me,
01:38:53.260 | Twitter's an incredible thing.
01:38:54.860 | - Yeah.
01:38:55.700 | - Again, can I just come back to this point,
01:38:57.500 | which seems oversimplistic,
01:38:59.220 | but like I really do think how a product makes someone feel,
01:39:04.220 | they ascribe that feeling to the founder.
01:39:07.460 | - Yep.
01:39:08.300 | So, make people feel good.
01:39:11.900 | - So, think about it,
01:39:12.740 | like let's just go with this thesis for a second.
01:39:14.900 | - Sure, I like it though.
01:39:16.180 | - Amazon's pretty utilitarian, right?
01:39:19.620 | It delivers brown boxes to your front door.
01:39:21.540 | Sure, you can have Alexa
01:39:23.420 | and you can have all these things, right?
01:39:25.580 | But in general, it delivers stuff quickly to you
01:39:29.020 | at a reasonable price, right?
01:39:30.620 | I think Jeff Bezos is wonderfully wealthy,
01:39:34.260 | thoughtful, smart guy, right?
01:39:36.700 | But like people kind of feel that way about them.
01:39:38.860 | They're like, wow, this is really big.
01:39:40.340 | We're impressed that this is really big.
01:39:42.860 | But he's doing the same space stuff Elon's doing,
01:39:45.860 | but they don't necessarily ascribe
01:39:47.260 | the same sense of wonder, right?
01:39:50.100 | Now let's take Elon.
01:39:51.060 | And again, this is pet theory.
01:39:52.900 | I don't have much proof other than my own intuition.
01:39:55.500 | He is literally about living the future, Mars, space.
01:40:01.100 | It's about wonder.
01:40:01.980 | It's about going back to that feeling as a kid
01:40:04.060 | when you looked up to the stars and asked,
01:40:05.940 | is there life out there?
01:40:08.180 | People get behind that because it's a sense of hope
01:40:11.100 | and excitement and innovation.
01:40:13.500 | And like, you can say whatever you want,
01:40:15.380 | but we ascribe that emotion to that person.
01:40:18.900 | Now, let's say you're on a social network
01:40:21.180 | and people make you kind of angry
01:40:23.300 | because they disagree with you
01:40:24.500 | or they say something ridiculous
01:40:25.940 | or they're living a FOMO type life where you're like,
01:40:28.700 | wow, I wish I was doing that thing.
01:40:30.460 | I think Instagram, if I were to think back,
01:40:34.100 | by and large when I was there was not about FOMO,
01:40:37.500 | was not about this influencer economy,
01:40:39.900 | although it certainly became that way closer to the end.
01:40:43.660 | It was about the sense of wonder and happiness
01:40:45.900 | and beautiful things in the world.
01:40:47.340 | And I don't know, I mean, like,
01:40:49.460 | I don't want to have a blind spot,
01:40:50.620 | I don't think anyone had a strong opinion
01:40:52.500 | about me one way or the other.
01:40:53.580 | - For the longest time, the way people explained to me,
01:40:55.660 | I mean, if you want to go for toxicity,
01:40:57.540 | you go to Facebook or Twitter.
01:40:59.380 | If you want to go to make, feel good about life,
01:41:01.820 | you go to Instagram to enjoy, celebrate life.
01:41:04.460 | - And my experience when talking to people
01:41:05.860 | is they gave me the benefit of the doubt because of that.
01:41:08.340 | But if your experience of the product
01:41:10.020 | is kind of makes you angry, it's where you argue.
01:41:13.660 | I mean, a big part of Jack might be
01:41:15.180 | that he wasn't actually the CEO for a very long time
01:41:17.740 | and only became recently.
01:41:19.260 | So I'm not sure how much of the connection got made.
01:41:22.180 | But in general, I mean, if you hate,
01:41:26.220 | I'm just thinking about other companies
01:41:29.620 | that aren't tech companies.
01:41:30.740 | If you hate like what a company is doing
01:41:32.580 | or it makes you not feel happy, I don't know,
01:41:36.740 | like people are really angry about Comcast or whatever.
01:41:39.140 | Are they even called Comcast anymore?
01:41:40.660 | It's like Xfinity or something, right?
01:41:42.100 | They had to rebrand; they became Meta, right?
01:41:44.660 | It's like, but my point is if it makes you-
01:41:48.540 | - That's beautiful, yeah.
01:41:50.020 | But the thing is, this is me saying this.
01:41:54.940 | I think your thesis is very strong and correct,
01:41:58.780 | or at least has elements of correctness,
01:42:00.100 | but I still personally put some blame on individuals.
01:42:05.100 | - Of course.
01:42:06.340 | - I think you said, Elon, looking up,
01:42:10.360 | there's something about childlike wonder to him,
01:42:13.500 | like to his personality, his character.
01:42:16.400 | Something about, I think more so than others,
01:42:19.900 | where people can trust them.
01:42:21.340 | And there's, I don't know,
01:42:23.060 | Sundar Pichai is an example of somebody
01:42:25.420 | who's like, there's some kind,
01:42:27.940 | it's hard to put into words,
01:42:29.660 | but there's something about the human being
01:42:32.120 | where he's trustworthy.
01:42:34.300 | He's human in a way that connects to us.
01:42:38.240 | And the same with Satya Nadella.
01:42:40.780 | I mean, some of these folks,
01:42:44.420 | something about us is drawn to them,
01:42:47.420 | even when they're flawed.
01:42:48.620 | So your thesis really holds up for Steve Jobs,
01:42:53.060 | because I think people didn't like Steve Jobs,
01:42:55.040 | but he delivered products
01:42:57.160 | that and then they fell in love every time.
01:42:59.500 | I guess you could say that the CEO,
01:43:03.000 | the leader is also a product.
01:43:05.800 | And if they keep delivering a product that people like
01:43:08.940 | by being public and saying things that people like,
01:43:11.440 | that's also a way to make people happy.
01:43:14.640 | But from a social network perspective,
01:43:16.240 | it makes me wonder how difficult it is
01:43:19.840 | to explain to people why certain things happen,
01:43:22.080 | like to explain machine learning,
01:43:24.500 | to explain why certain things,
01:43:27.180 | like the woke mob effect, happen,
01:43:32.480 | or why certain kinds of bullying happen,
01:43:36.440 | which is like, it's human nature combined with algorithm.
01:43:40.800 | And it's very difficult to control for
01:43:42.560 | how the spread of quote unquote misinformation happens.
01:43:45.600 | It's very difficult to control for that.
01:43:47.680 | And so you try to decelerate certain parts
01:43:50.520 | and you create more problems than you solve.
01:43:53.720 | And anything that looks at all like censorship
01:43:55.840 | can create huge amounts of problems as a slippery slope.
01:43:58.840 | And then you have to inject humans
01:44:01.120 | to oversee the machine learning algorithms.
01:44:03.200 | And anytime you inject humans into the system,
01:44:05.560 | it's gonna create a huge number of problems.
01:44:07.240 | And I feel like it's up to the leader
01:44:08.800 | to communicate that effectively,
01:44:10.320 | to be transparent.
01:44:12.760 | First of all, design products that don't have those problems
01:44:15.200 | and second of all, when they have those problems,
01:44:17.480 | to be able to communicate with them.
01:44:19.040 | I guess where that's all going is,
01:44:21.080 | when you run a social network company, your job is hard.
01:44:25.040 | - Yeah.
01:44:26.000 | I will say the one element that you haven't named
01:44:28.760 | that I think you're getting at is just bedside manner.
01:44:31.640 | Which Steve Jobs, I never worked for him,
01:44:35.480 | I never met him in person,
01:44:38.600 | had an uncanny ability in public to have bedside manner.
01:44:43.600 | I mean, some of the best clips of Steve Jobs
01:44:46.520 | from like, I wanna say, maybe the 80s
01:44:49.520 | when he's on the stage and getting questions
01:44:52.000 | from the audience about life.
01:44:54.440 | And he'll take this question that is like,
01:44:56.600 | how are you gonna compete with blah?
01:44:58.440 | And it's super boring.
01:44:59.440 | And I don't even know the name of the company.
01:45:01.480 | And his answer is as if you just asked
01:45:04.240 | like your grandfather the meaning of life.
01:45:06.240 | - Yeah.
01:45:07.440 | - And you sit there and you're just like, what?
01:45:09.520 | Like, and there's that bedside manner.
01:45:11.800 | And if you lack that, or if that's just not intuitive to you,
01:45:16.960 | I think that it can be a lot harder
01:45:20.240 | to gain the trust of people.
01:45:21.520 | And then add on top of that, missteps of companies, right?
01:45:25.780 | I don't know if you have any friends from the past
01:45:29.080 | where like, maybe they crossed you once,
01:45:31.360 | or like, maybe you get back together
01:45:33.400 | and you're friends again,
01:45:34.400 | but you just never really forget that thing.
01:45:36.920 | It's human nature not to forget.
01:45:38.880 | - I'm Russian, you crossed me once.
01:45:40.680 | (Kevin laughs)
01:45:41.760 | We solved the problem.
01:45:43.320 | - So my point is,
01:45:44.640 | humans don't forget.
01:45:48.920 | And if there are times in the past
01:45:50.400 | where they feel like they don't trust the company
01:45:52.200 | or the company hasn't had their back,
01:45:54.040 | that is really hard to earn back,
01:45:57.080 | especially if you don't have that bedside manner.
01:46:00.000 | And again, like I'm not attributing this specifically
01:46:02.520 | to Mark because I think a lot of companies have this issue
01:46:06.320 | where, one, you have to be trustworthy of the company
01:46:10.600 | and live by it and live by those actions.
01:46:12.580 | And then two, I think you need to be able
01:46:14.480 | to be really relatable in a way that's very difficult
01:46:18.080 | if you're worth like what these people are.
01:46:20.520 | It's really hard.
01:46:22.040 | - Yeah.
01:46:23.200 | Jack does a pretty good job of this by being a monk.
01:46:26.960 | - But I also, like Jack eschews attention.
01:46:29.520 | Like he's not out there almost on purpose.
01:46:32.040 | He's just working hard, doing square, right?
01:46:34.380 | Like I literally shared a desk like this with him at Odeo.
01:46:38.080 | I mean, just normal guy who likes painting.
01:46:40.920 | I remember he would leave early on like Wednesdays
01:46:43.920 | or something to go to like a painting class.
01:46:46.640 | - Yeah.
01:46:47.480 | - And he's creative, he's thoughtful.
01:46:49.360 | I mean, money makes people like more creative
01:46:52.240 | and more thoughtful, like extreme versions of themselves.
01:46:54.880 | And this was a long, long time ago.
01:46:58.240 | - You mentioned that he asked you
01:47:00.480 | to do some kind of JavaScript thing.
01:47:02.720 | - We were working on some JavaScript together.
01:47:05.220 | - That's hilarious.
01:47:06.180 | Like pre-Twitter, early Twitter days,
01:47:08.020 | you and Jack Dorsey are in a room together
01:47:11.220 | talking about JavaScript,
01:47:12.780 | solving some kind of menial problem.
01:47:14.780 | - Terrible problems.
01:47:15.620 | Yeah, I mean, not terrible, just like boring,
01:47:17.620 | boring widget problem.
01:47:18.460 | I think it was the Odeo widget
01:47:19.660 | we were working on at the time.
01:47:21.660 | I'm surprised anyone paid me to be in the room as an intern
01:47:24.820 | 'cause I didn't really provide any value.
01:47:26.500 | I'm very thankful to anyone who included me
01:47:29.620 | back in the day.
01:47:31.200 | It was very helpful, so thank you if you're listening.
01:47:33.780 | - I mean, this is Odeo, the precursor to Twitter?
01:47:37.700 | First of all, did you have any anticipation
01:47:40.700 | that this Jack Dorsey guy could also be the head
01:47:44.020 | of a major social network?
01:47:46.180 | And second, did you learn anything from the guy that,
01:47:49.420 | like, do you think it's a coincidence
01:47:52.460 | that you two were in the room together?
01:47:54.420 | And it's the coincidence meaning like,
01:47:58.900 | why does the world play its game in a certain way
01:48:01.720 | where these two founders of social networks?
01:48:04.120 | - I don't know.
01:48:04.960 | It's so weird, right?
01:48:05.800 | Like, I mean, it's also weird that Mark showed up
01:48:10.800 | in our fraternity my sophomore year
01:48:13.960 | and we got to know each other then,
01:48:16.320 | like long before Instagram.
01:48:18.160 | It's a small world, but let me tell a fun story about Jack.
01:48:23.160 | We're at Odeo and I don't know,
01:48:27.240 | I think Ev was feeling like people weren't working hard
01:48:29.780 | enough or something.
01:48:31.340 | - Nice.
01:48:32.180 | - And I can't remember exactly what he,
01:48:35.060 | he created this thing where every Friday,
01:48:39.140 | I don't know if it was every Friday.
01:48:40.380 | I only remember this happening once,
01:48:43.140 | but he had like a statuette.
01:48:46.180 | - It's like of Mary.
01:48:47.540 | - And in the bottom it's hollow, right?
01:48:50.740 | And I remember on a Friday, Ev decided he was going
01:48:55.700 | to let everyone vote for who had worked
01:48:57.920 | the hardest that week.
01:48:59.600 | We all voted, closed ballot, right?
01:49:01.360 | We all put in a bucket and he tallied the votes.
01:49:04.440 | And then whoever got the most votes,
01:49:06.960 | as I recall, got the statuette.
01:49:09.000 | And in the statuette was a thousand bucks.
01:49:12.280 | Or as I recall, there was a thousand bucks.
01:49:14.320 | It might've been a hundred bucks,
01:49:15.280 | but let's call it a thousand.
01:49:16.240 | It's more exciting that way.
01:49:17.080 | - It felt like a thousand, yeah.
01:49:18.020 | - It did to me for sure.
01:49:19.440 | I actually got two votes.
01:49:20.560 | I was very happy.
01:49:21.400 | We were a small company, but I, as the intern,
01:49:23.200 | I got at least two votes.
01:49:24.040 | - So everybody knew how many votes they got individually?
01:49:26.260 | - Yeah, yeah.
01:49:27.100 | And I think it was one of these self-accountability things.
01:49:29.060 | Anyway, I remember Jack just getting like
01:49:31.660 | the vast majority of votes from everyone.
01:49:34.180 | And I remember just thinking like,
01:49:37.100 | like I couldn't imagine he would become what he'd become
01:49:39.740 | and do what he would do.
01:49:41.120 | But I had a profound respect that the new guy
01:49:45.620 | who I really liked worked that hard.
01:49:48.820 | And you could see his dedication even then
01:49:51.700 | and that people respected him.
01:49:53.860 | But that's the one story that I remember of him,
01:49:55.680 | like working with him specifically from that summer.
01:49:58.040 | - Can I take a small tangent on that?
01:49:59.520 | - Of course.
01:50:00.360 | - There's kind of a pushback in Silicon Valley
01:50:02.080 | a little bit against hard work.
01:50:04.440 | Can you speak to the sort of,
01:50:06.120 | the thing you admired to see the new guy working so hard,
01:50:10.280 | that thing, what is the value of that thing in a company?
01:50:13.800 | - See, this is, I like, just to be very frank,
01:50:16.480 | it drives me nuts.
01:50:17.760 | Like I saw this really funny video on TikTok.
01:50:21.880 | Was it on TikTok?
01:50:22.720 | It was like, I'm taking a break from my mental health
01:50:24.920 | to work on my career.
01:50:26.320 | I thought that was funny.
01:50:27.560 | So I was like, oh, it is kind of phrased that way,
01:50:31.720 | the opposite often, right?
01:50:33.600 | Okay, so a couple of things.
01:50:35.000 | I have worked so hard to do the things that I did.
01:50:43.680 | Like Mike and I lost years off of our lives,
01:50:47.560 | staying up late, figuring things out,
01:50:50.120 | the stress that comes with the job.
01:50:51.720 | I have a lot more gray hair now than I did back then.
01:50:54.800 | It requires an enormous amount of work
01:50:57.080 | and most people aren't successful, right?
01:50:59.440 | But even the ones that do don't skate by.
01:51:01.640 | I am okay if people choose not to work hard
01:51:07.200 | because I don't actually think there's anything
01:51:09.040 | in this world that says you have to work hard.
01:51:11.380 | But I do think that great things require a lot of hard work.
01:51:16.400 | So there's no way you can expect to change the world
01:51:18.640 | without working really hard.
01:51:19.720 | By the way, even changing the world,
01:51:22.880 | the folks that I respect the most have nudged the world
01:51:25.160 | in like a slight direction, slight, very, very slight.
01:51:29.520 | Like even if Elon accomplishes all the things
01:51:34.000 | he wants to accomplish, we will have nudged the world
01:51:37.040 | in a slight direction, but it requires enormous amount.
01:51:40.160 | There was an interview with him where he was just like,
01:51:43.200 | he was interviewed, I think at the Tesla factory
01:51:45.080 | and he was like, work is really hard.
01:51:47.560 | This is actually unhealthy.
01:51:49.440 | And I can't recall the exact,
01:51:51.240 | but he was like visibly shaken about how hard
01:51:53.560 | he had been working and he was like, this is bad.
01:51:55.880 | And unfortunately, I think to have great outcomes,
01:51:57.640 | you actually do need to work at like three standard
01:51:59.920 | deviations above the mean, but there's nothing saying
01:52:02.660 | that people have to go for that.
01:52:04.040 | - See, the thing is, but what I would argue,
01:52:06.440 | this is my personal opinion,
01:52:08.120 | is nobody has to do anything, first of all.
01:52:09.960 | And second, they certainly don't have to work hard.
01:52:11.960 | - Exactly.
01:52:12.800 | - But I think hard work in a company should be admired.
01:52:18.880 | - I do too.
01:52:19.920 | - And you should not feel like,
01:52:22.960 | you shouldn't feel good about yourself for not working hard.
01:52:28.640 | So for example, I don't have to work out.
01:52:32.480 | I don't have to run.
01:52:34.360 | I hate running, but I certainly don't feel good
01:52:38.000 | if I don't run because I know for my health,
01:52:39.920 | like there's certain values, I guess,
01:52:41.720 | is what I'm trying to get at.
01:52:42.960 | There's certain values that you have in life.
01:52:44.560 | It feels like there's certain values
01:52:45.960 | that companies should have and hard work
01:52:47.640 | is one of the things I think that should be admired.
01:52:51.400 | I often ask this kind of silly question,
01:52:53.640 | just to get a sense of people,
01:52:56.560 | like if I'm hiring and so on,
01:52:58.520 | I just ask if they think it's better
01:53:00.800 | to work hard or work smart.
01:53:03.120 | It was helpful for me to get a sense of people from that.
01:53:07.380 | 'Cause you think like the right--
01:53:08.220 | - The answer's both.
01:53:09.680 | - What's that?
01:53:10.520 | - The answer's both.
01:53:11.340 | - The answer's both.
01:53:12.180 | I usually try not to give them that,
01:53:13.880 | but sometimes I'll say both if that's an option.
01:53:16.880 | But a lot of people kind of,
01:53:19.760 | a surprising number will say work smart.
01:53:22.040 | And they're usually people who don't know
01:53:24.760 | how to work smart; they're literally just lazy.
01:53:29.760 | Not just lazy, there are two effects behind that.
01:53:32.280 | One is laziness and the other is ego.
01:53:36.400 | When you're younger and you say it's better to work smart,
01:53:40.440 | it means you think you know what it means
01:53:44.240 | to work smart at this early stage.
01:53:46.800 | To me, people that say work hard or both,
01:53:49.920 | they have the humility to understand like,
01:53:52.360 | I'm going to have to work my ass off
01:53:54.320 | because I'm too dumb to know how to work smart.
01:53:56.840 | And people who are self-critical in this way,
01:53:59.120 | in some small amount, you have to have some confidence.
01:54:01.480 | But if you have humility,
01:54:03.880 | that means you're going to actually eventually figure out
01:54:06.440 | what it means to work smart.
01:54:07.520 | And then to actually be successful, you should do both.
01:54:11.200 | - So I have a very particular take on this,
01:54:14.040 | which is that no one's forcing you to do anything.
01:54:19.040 | All choices have consequences.
01:54:22.280 | So if you major in, I don't know, theoretical literature,
01:54:27.280 | I don't even know if that's a major,
01:54:28.880 | I'm just making something up.
01:54:30.040 | - As opposed to regular literature.
01:54:31.720 | Applied literature.
01:54:33.560 | - Yeah, think about like theoretical Spanish lit
01:54:38.280 | from the 14th century.
01:54:39.400 | Like just make up your esoteric thing.
01:54:41.720 | And then the number of people I went to Stanford with
01:54:44.280 | who get out in the world and they're like, wait, what?
01:54:46.200 | I can't find a job?
01:54:47.240 | Like no one wants a theoretical,
01:54:48.960 | like there are plenty of counter examples
01:54:51.640 | of people who have majored in esoteric things
01:54:53.360 | and gone on to be very successful.
01:54:54.800 | So I just want to be clear, it's not about the major,
01:54:56.680 | but every choice you make, whether it's to have kids,
01:55:00.800 | like I love my children.
01:55:02.840 | It's so awesome to have two kids.
01:55:04.600 | And it is so hard to work really hard
01:55:06.960 | and also have kids, it's really hard.
01:55:09.360 | And there's a reason why certain very successful people
01:55:12.600 | like don't have, or not successful,
01:55:14.520 | but people who run very, very large companies or startups
01:55:17.400 | have chosen not to have kids for a while
01:55:19.120 | or chosen not to like prioritize them.
01:55:21.560 | Everything's a choice.
01:55:22.560 | And like, I choose to prioritize my children
01:55:24.840 | because I want to do that, right?
01:55:27.720 | So everything's a choice.
01:55:29.360 | Now, once you've made that choice,
01:55:33.080 | I think it's important that the contract is clear,
01:55:36.120 | which is to say, let's imagine you were joining
01:55:38.080 | a new startup.
01:55:39.400 | It's important that that startup communicate
01:55:43.480 | that like the expectation is like,
01:55:44.680 | we're all working really, really hard right now.
01:55:46.400 | You don't have to join this startup.
01:55:48.760 | But like, if you do just know, like you're,
01:55:51.440 | it's almost as if you join, I don't know,
01:55:53.680 | pick your, pick your like sports team.
01:55:57.920 | Like, let's go back to the Yankees for a second.
01:56:00.800 | You want to join the Yankees,
01:56:01.840 | but you don't really want to work that hard.
01:56:03.600 | You don't really want to do batting practice
01:56:05.760 | or pitching practice or whatever for your position, right?
01:56:08.600 | That to me is wacko.
01:56:11.400 | And that's actually the world that it feels like we live in
01:56:13.920 | in tech sometimes, where people both want to work
01:56:16.560 | for the Yankees because that pays a lot,
01:56:18.480 | but like don't actually want to work that hard.
01:56:21.080 | That I don't fully understand.
01:56:23.280 | Because if you sign up for some of these things,
01:56:26.000 | just sign up for it, but it's okay
01:56:27.440 | if you don't want to sign up for it.
01:56:29.000 | There's so many wonderful careers in this world
01:56:31.680 | that don't require 80 hours a week.
01:56:33.760 | But when I read about companies going to like four day
01:56:35.640 | work weeks and stuff, I just like, I chuckle
01:56:38.280 | because I can't get enough done with a seven day week.
01:56:41.080 | I don't know how, and people will say,
01:56:43.400 | "Oh, you're just not working smart."
01:56:44.920 | And it's like, no, I work pretty smart, I think in general.
01:56:47.560 | Like I wouldn't have gotten to this point
01:56:49.320 | if I hadn't like some amount of working smart.
01:56:52.760 | And there is balance though.
01:56:53.840 | So I used to be like a pretty big cyclist.
01:56:55.880 | I don't do it much anymore just because of kids
01:56:58.240 | and like prioritizing other things, right?
01:57:00.340 | But one of the most important things to learn
01:57:03.360 | as a cyclist is to take a rest day.
01:57:06.140 | But to me and to cyclists, like resting is a function
01:57:10.340 | of optimizing for the long run.
01:57:12.680 | It's not like a thing that you do for its own merits.
01:57:16.240 | It's actually like, if you don't rest,
01:57:17.880 | your muscles don't recover, and then you're just not as,
01:57:19.880 | like you're not training as efficiently.
01:57:21.520 | - You should probably, the successful people I've known
01:57:24.680 | in terms of athletes, they hate rest days,
01:57:27.160 | but they know they have to do it for the longterm.
01:57:28.960 | - Absolutely.
01:57:29.800 | - They think their opposition is getting stronger
01:57:32.000 | and stronger, and that's the feeling,
01:57:34.360 | but you know it's the right thing.
01:57:36.000 | And usually you need a coach to help you.
01:57:38.200 | - Yeah, totally.
01:57:39.040 | So, I mean, I use this thing called training peaks,
01:57:41.240 | and it's interesting 'cause it actually mathematically shows
01:57:44.060 | like where you are on the curve and all this stuff.
01:57:46.600 | - Yeah.
01:57:47.440 | - But you have to rest, like you have to have that rest,
01:57:49.400 | but it's a function of going harder for longer.
01:57:52.320 | Again, it's this reinforcement learning,
01:57:53.840 | like planning the aggregate and the long,
01:57:56.480 | but a lot of people will hide behind laziness
01:57:58.580 | by saying that they're trying to optimize for the long run,
01:58:00.860 | and they're not, they're just not working very hard.
01:58:03.160 | But again, you don't have to sign up for it.
01:58:05.000 | It's totally cool.
01:58:05.840 | Like I don't think less of people
01:58:07.400 | for like not working super hard.
01:58:08.920 | It's just like, don't sign up for things
01:58:10.260 | that require working super hard.
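As an aside on the math: the "curve" described here is the standard fitness/fatigue impulse-response model that tools like TrainingPeaks build on. A minimal sketch, assuming the conventional time constants of roughly 42 days for fitness and 7 days for fatigue; the starting values and daily loads are made-up illustrative numbers, not TrainingPeaks' actual implementation.

```python
# Sketch of a fitness/fatigue ("impulse-response") training model.
# Fitness is a slow exponential average of daily load; fatigue is a fast one.
# "Form" (freshness) is fitness minus fatigue. Time constants and loads here
# are illustrative assumptions.

def ewma_step(prev: float, today_load: float, time_constant_days: float) -> float:
    """Advance an exponentially weighted moving average by one day."""
    k = 1.0 / time_constant_days
    return prev + k * (today_load - prev)

fitness, fatigue = 50.0, 50.0             # starting training-load averages
week = [100, 100, 100, 100, 100, 100, 0]  # six hard days, then one rest day
for day, load in enumerate(week, start=1):
    fitness = ewma_step(fitness, load, time_constant_days=42.0)
    fatigue = ewma_step(fatigue, load, time_constant_days=7.0)
    print(f"day {day}: fitness={fitness:5.1f}  fatigue={fatigue:5.1f}  "
          f"form={fitness - fatigue:6.1f}")
```

The hard days drive fatigue up far faster than fitness; the rest day drops fatigue sharply while fitness barely moves, which is the quantitative sense in which rest is "a function of going harder for longer."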
01:58:11.840 | - And some of that requires for the leadership
01:58:13.880 | to have the guts, the boldness to communicate effectively
01:58:17.080 | at the very beginning.
01:58:17.920 | I mean, sometimes I think most of the problems arise
01:58:20.760 | from the fact that the leadership is kind of hesitant
01:58:24.600 | to communicate the socially difficult truth
01:58:30.580 | of what it takes to be at this company.
01:58:33.820 | So they kind of say, "Hey, come with us.
01:58:36.500 | There's, we have snacks."
01:58:37.860 | You know, but-
01:58:39.460 | - Unlimited vacation.
01:58:40.620 | And, you know, Ray at Bridgewater is always fascinating
01:58:44.220 | because, you know, people, it's been called like a cult
01:58:47.100 | on the outside or cult-ish.
01:58:48.820 | And, but what's fascinating is like,
01:58:50.980 | they just don't give on their principles.
01:58:52.820 | They're like, "Listen, this is what it's like to work here.
01:58:55.220 | We record every meeting.
01:58:57.340 | We're like brutally honest."
01:58:59.260 | And that's not going to feel right to everyone.
01:59:00.920 | And if it doesn't feel right to you, totally cool.
01:59:03.440 | Just go work somewhere else.
01:59:04.880 | But if you work here, you are signing up for this.
01:59:08.040 | And that's, that's been fascinating to me
01:59:10.400 | because it's honesty upfront.
01:59:12.220 | It's a system in which you operate.
01:59:15.480 | And if it's not for you,
01:59:16.720 | like no one's forcing you to work there, right?
01:59:19.600 | - I actually did.
01:59:20.440 | So I did a conversation with him at,
01:59:23.560 | kind of got stuck in a funny moment, which is,
01:59:26.640 | at the end I asked him, you know,
01:59:28.340 | to give me honest feedback of how I did on the interview.
01:59:31.420 | And I was- - Did he?
01:59:33.460 | - I don't think so.
01:59:34.740 | He was super nice.
01:59:36.220 | He asked me, he's like, "Well, tell me,
01:59:39.220 | did you accomplish what you were hoping to accomplish?"
01:59:42.020 | I was like, "That's not, that's not."
01:59:44.100 | I'm asking you as an objective observer
01:59:47.900 | of two people talking, how do we do today?
01:59:50.780 | And then he's like, "Well, I'm,"
01:59:53.780 | he gave me this politician answer.
01:59:55.220 | Well, I feel like we've accomplished
01:59:57.240 | successful communication of like ideas,
02:00:00.320 | which is, I'd love to spread some of the ideas
02:00:03.000 | in that, like in principles and so on.
02:00:06.920 | I was like, "All right." - Back to my original point,
02:00:08.800 | it's really hard to get feedback.
02:00:10.080 | - Even for Ray Dalio.
02:00:11.920 | - It's really hard to give feedback.
02:00:13.920 | And one of the other things I learned from him
02:00:15.760 | and just people in that world is like,
02:00:18.820 | man, humans really like to pretend like they've come to,
02:00:24.760 | that they've come to some kind of meeting of the minds.
02:00:27.120 | Like if there's conflict, if you and I have conflict,
02:00:30.160 | it's always better to meet face-to-face, right?
02:00:32.640 | Or on the phone, Slack is not great, right?
02:00:35.280 | Email's not great, but face-to-face.
02:00:37.200 | What's crazy is you and I get together
02:00:38.700 | and we actively try to,
02:00:40.700 | even if we're not actually solving the conflict,
02:00:43.120 | we actively try to paper over the conflict.
02:00:45.920 | Oh yeah, it didn't really bother me that much.
02:00:48.640 | Oh yeah, I'm sure you didn't mean it.
02:00:50.880 | But like, no, in our minds, we're still there.
02:00:54.360 | - So this is one of the things that as a leader,
02:00:56.240 | you always have to be digging,
02:00:58.480 | especially as you ascend.
02:00:59.320 | - Straight to the conflict.
02:01:00.920 | - Yeah, but as you ascend,
02:01:02.080 | no one wants to tell you you're crazy.
02:01:03.640 | No one wants to tell you your idea's bad.
02:01:06.760 | And you're like, oh, I'm going to be a leader.
02:01:09.200 | And the idea is, well, I'm just going to ask people.
02:01:12.320 | No one tells you.
02:01:13.640 | So like, you have to look for the markers,
02:01:16.600 | knowing that literally just people
02:01:18.280 | aren't going to tell you along the way.
02:01:20.440 | And be paranoid.
02:01:21.320 | I mean, you asked about selling the company.
02:01:24.000 | I think one of the biggest differences
02:01:25.360 | between me and a lot of other entrepreneurs is like,
02:01:28.280 | I wasn't completely confident we could do it.
02:01:30.440 | Like that we could go it alone and actually be great.
02:01:35.440 | And if any entrepreneur is honest with you,
02:01:37.560 | they also feel that way.
02:01:39.120 | But a lot of people are like, well,
02:01:40.600 | I have to be cocky and just say, I can do this on my own.
02:01:43.960 | We're going to be fine.
02:01:44.840 | We're going to crush everyone.
02:01:46.800 | Some people do say that, and then it's not right.
02:01:49.200 | And they fail.
02:01:52.320 | - Being honest in that moment with yourself,
02:01:55.480 | with those close to you.
02:01:57.440 | - And also you talked about the personality of leaders
02:02:00.840 | and who resonates and who doesn't.
02:02:02.960 | It's rare that I see leaders be vulnerable.
02:02:08.800 | Rare.
02:02:09.840 | And one thing I tried to do at Instagram,
02:02:12.480 | at least internally, was like, say when I screwed up.
02:02:15.880 | And like, point out how I was wrong about things.
02:02:19.640 | And point out where my judgment was off.
02:02:21.640 | Everyone thinks they have to bat 1,000, right?
02:02:25.320 | Like, that's crazy.
02:02:27.040 | The best quant hedge funds in the world bat 50.001%.
02:02:31.680 | They just take a lot of bets, right?
02:02:34.000 | Renaissance.
02:02:35.480 | They might bat 51%, right?
02:02:37.760 | But holy hell, like the question isn't,
02:02:41.600 | are you right every single time
02:02:43.320 | and you have to seem invincible?
02:02:45.000 | The question is, how many at-bats do you get?
02:02:48.160 | And on average, are you better on average, right?
02:02:52.600 | With enough bets and enough at-bats
02:02:55.600 | that your aggregate can be very high.
02:02:57.640 | I mean, Steve Jobs was wrong at a lot of stuff.
02:03:01.920 | The Newton, too early, right?
02:03:03.720 | Next, not quite right.
02:03:05.240 | There was even a time where he said like,
02:03:08.120 | no one will ever want to watch video on the iPod.
02:03:13.120 | Totally wrong.
02:03:14.560 | But who cares if you come around
02:03:16.160 | and realize your mistake and fix it?
02:03:18.560 | - It becomes just like you said,
02:03:19.640 | harder and harder when your ego grows
02:03:21.280 | and the number of people around you
02:03:22.960 | that say positive things towards you grows.
02:03:26.200 | - I actually think it's really valuable that,
02:03:28.520 | like let's imagine a counterfactual
02:03:31.520 | where Instagram became worth like $300 billion
02:03:34.760 | or something crazy, right?
02:03:36.080 | I kind of like that my life is relatively normal now.
02:03:40.200 | When I say relatively, you get what I mean.
02:03:41.800 | I'm not making a claim that I live a normal life.
02:03:44.280 | But like, I certainly don't live in a world
02:03:46.200 | where there are like 15 Sherpas following me,
02:03:49.480 | like fetching me water or whatever.
02:03:51.160 | Like, that's not how it works.
02:03:52.640 | I actually like that I have a sense of humility of like,
02:03:56.560 | I may not found another thing that's nearly as big,
02:03:59.200 | so I have to work twice as hard.
02:04:01.160 | Or I have to like learn twice as much.
02:04:04.440 | I have to, we haven't talked about machine learning yet,
02:04:07.360 | but my favorite thing is all these like famous,
02:04:11.280 | you know, tech guys who have worked in the industry,
02:04:15.000 | pontificating about the future of machine learning
02:04:17.320 | and how it's gonna kill us all.
02:04:18.880 | Right, like, and like, I'm pretty sure
02:04:21.080 | they've never tried to build anything
02:04:23.120 | with machine learning themselves.
02:04:24.440 | - Yes.
02:04:26.000 | So there's a nice line between people
02:04:27.960 | that actually build stuff with machine,
02:04:29.680 | like actually program something,
02:04:31.760 | or at least understand some of those fundamentals,
02:04:33.760 | and the people that are just saying philosophical stuff
02:04:36.840 | for journalists and so on.
02:04:38.480 | It's an interesting line to walk
02:04:40.880 | because the people who program are often not philosophers.
02:04:44.400 | - No, or don't have the attention.
02:04:45.840 | They can't write an op-ed for the Wall Street Journal.
02:04:48.080 | Like it doesn't work.
02:04:49.080 | - So like, it's nice to be both a little bit,
02:04:51.400 | like to have elements of both.
02:04:52.880 | - My point is the fact that I have to learn stuff
02:04:55.400 | from scratch or that I choose to, or like-
02:04:58.160 | - It's humbling.
02:04:59.640 | - Yeah, I mean, again, I have a lot of advantages.
02:05:03.040 | Like, but my point is it's awesome to be back in a game
02:05:09.400 | where you have to fight.
02:05:11.800 | That's fun.
02:05:13.200 | So being humble, being vulnerable,
02:05:15.240 | it's an important aspect of a leader,
02:05:18.040 | and I hope it serves me well,
02:05:19.240 | but like I can't fast forward 10 years to know.
02:05:22.000 | I've just, that's my game plan.
02:05:24.040 | - Before I forget,
02:05:24.920 | I have to ask you one last thing on Instagram.
02:05:27.380 | What do you think about the whistleblower,
02:05:30.760 | Frances Haugen, recently coming out
02:05:33.400 | and saying that Facebook is aware
02:05:34.960 | of Instagram's harmful effect on teenage girls
02:05:38.720 | as per their own internal research studies on the matter.
02:05:43.320 | What do you think about this baby of yours,
02:05:46.040 | Instagram, being under fire now,
02:05:48.240 | as we've been talking about
02:05:50.240 | under the leadership of Facebook?
02:05:52.080 | - You know, I often question where does the blame lie?
02:05:59.240 | Is the blame at the people that originated the network, me,
02:06:05.240 | right?
02:06:06.760 | Is the blame at like the decision to combine the network
02:06:10.520 | with another network with a certain set of values?
02:06:14.420 | Is the blame at how it gets run after I left?
02:06:20.500 | Like, is it the driver or is it the car, right?
02:06:23.660 | Is it that someone enabled these devices in the first place,
02:06:29.160 | if you go to an extreme, right?
02:06:31.800 | Or is it the users themselves, just human nature?
02:06:35.760 | Is it just the way of human nature?
02:06:37.760 | - Sure, and like the idea that we're going to find
02:06:39.840 | a mutually exclusive answer here is crazy.
02:06:42.040 | There's not one place.
02:06:43.240 | It's a combination of a lot of these things.
02:06:45.680 | And then the question is like, is it true at all, right?
02:06:48.040 | Like I'm not actually saying that's not true
02:06:50.800 | or that it's true, but there's always more nuance here.
02:06:55.520 | Do I believe that social media has an effect
02:06:59.240 | on young people?
02:07:00.240 | Well, it's got to, they use it a lot.
02:07:02.400 | And I bet you there are a lot of positive effects
02:07:04.280 | and I bet you there are negative effects,
02:07:05.720 | just like any technology.
02:07:07.600 | And where I've come to in my thinking on this
02:07:10.160 | is that I think any technology has negative side effects.
02:07:13.400 | The question is, as a leader, what do you do about them?
02:07:16.520 | And are you actively working on them
02:07:18.160 | or do you just like not really believe in them?
02:07:20.600 | If you're a leader that sits there and says,
02:07:22.360 | well, we're going to put an enormous amount
02:07:24.040 | of resources against this,
02:07:26.240 | we're going to acknowledge when there are true criticisms,
02:07:29.520 | we're going to be vulnerable and that we're not perfect
02:07:32.560 | and we're going to go fix them
02:07:33.640 | and we're going to be held accountable along the way.
02:07:36.240 | I think that people generally really respect that.
02:07:39.880 | But I think that where Facebook, I think,
02:07:43.640 | has had issues in the past is where they say things like,
02:07:46.840 | can't remember what Mark said about misinformation
02:07:49.440 | during the election.
02:07:50.280 | There was that like famous quote where he was like,
02:07:52.600 | it's pretty crazy to think that Facebook
02:07:54.280 | had anything to do with this election.
02:07:55.600 | Like that was something like that quote.
02:07:57.080 | And I don't remember what stage he was on.
02:07:58.800 | And, but ooh, that did not age well, right?
02:08:02.520 | Like you have to be willing to say,
02:08:05.640 | well, maybe there's something there and wow,
02:08:09.680 | like I want to go look into it
02:08:11.040 | and truly believe it in your gut.
02:08:12.840 | But if people look at you and how you act
02:08:14.520 | and what you say and don't believe you truly feel that way.
02:08:18.640 | - It's not just the words you say,
02:08:19.800 | but how you say them and that people believe
02:08:21.880 | that you actually feel the pain
02:08:23.760 | of having caused any suffering in the world.
02:08:25.360 | - So to me, it's much more about your actions
02:08:29.320 | and your posture post event
02:08:31.520 | than it is about debugging the why.
02:08:33.040 | 'Cause I don't know, is it,
02:08:34.360 | like I don't know this research.
02:08:36.800 | It was written well after I left, right?
02:08:38.440 | Like, is it the algorithm?
02:08:41.040 | Is it the explore page?
02:08:42.360 | Is it the people you might know unit connecting you to,
02:08:46.040 | you know, ideas that are dangerous?
02:08:47.880 | Like I really don't know.
02:08:49.640 | - Yeah.
02:08:50.480 | - So we'd have to have a much deeper, I think,
02:08:53.960 | dive to understand where the blame lies.
02:08:56.600 | - What's very unpleasant to me to consider,
02:08:58.720 | now, I don't know if this is true,
02:09:00.040 | but to consider the very fact that
02:09:02.560 | there might be some complicated games being played here.
02:09:07.000 | For example, you know, I really love psychology
02:09:10.800 | and I love it enough to know that the field
02:09:13.520 | is pretty broken in the following way.
02:09:15.640 | It's very difficult to study human beings well at scale
02:09:19.960 | because the questions you ask affect the results.
02:09:22.800 | You can basically get any results you want.
02:09:25.120 | And so you have an internal Facebook study
02:09:27.240 | that asks some question of which we don't know
02:09:29.640 | the full details and there's some kind of analysis,
02:09:32.560 | but that's just the one little tiny slice
02:09:34.800 | into some much bigger picture.
02:09:37.000 | And so you can have thousands of employees at Facebook.
02:09:40.920 | One of them comes out and picks whatever narrative
02:09:44.320 | knowing that they become famous,
02:09:46.400 | coupled with the other really uncomfortable thing
02:09:49.160 | I see in the world, which is journalists
02:09:51.720 | seem to understand they get a lot of clickbait attention
02:09:55.800 | from saying something negative about social networks.
02:09:58.400 | Certain companies, they even get some clickbait stuff
02:10:03.400 | about Tesla, especially
02:10:07.680 | when there's a public, famous CEO type of person.
02:10:11.040 | They get a lot of views on the negative,
02:10:13.140 | not the positive.
02:10:16.120 | The positive they'll cover, it actually goes to the thing
02:10:17.320 | you were saying before.
02:10:18.280 | If there's a hot, sexy new product
02:10:20.760 | that's great to look forward to,
02:10:22.120 | they get positive coverage on that, but absent a product,
02:10:25.320 | it's nice to have the CEO
02:10:28.560 | messing up in some kind of way.
02:10:30.240 | And so couple that with the whistleblower
02:10:33.560 | and with this whole dynamic of journalism and so on,
02:10:38.260 | with "Social Dilemma" being really popular documentary,
02:10:42.100 | it's like, all right, my concern is there's deep flaws
02:10:47.100 | in human nature here in terms of things
02:10:49.640 | we need to deal with, like the nature of hate,
02:10:53.280 | of bullying, all those kinds of things.
02:10:56.280 | And then there's people who are trying to use that
02:10:59.080 | potentially to become famous and make money
02:11:02.400 | off of blaming others for causing more of the problem
02:11:06.960 | as opposed to helping solve the problem.
02:11:08.880 | So I don't know what to think.
02:11:10.240 | I'm not saying this is like, I'm just uncomfortable
02:11:12.880 | with, I guess, not knowing what to think about any of this
02:11:16.720 | because a bunch of folks I know that work at Facebook
02:11:21.480 | on the machine learning side, so Yann LeCun,
02:11:21.480 | I mean, they're quite upset by what's happening
02:11:25.160 | because there's a lot of really brilliant,
02:11:26.880 | good people inside Facebook that are trying to do good.
02:11:30.680 | And so, with all of this press, Yann is one of them.
02:11:33.400 | And he has an amazing team
02:11:34.480 | of machine learning researchers.
02:11:35.720 | Like he's really upset with the fact
02:11:38.520 | that people don't seem to understand
02:11:40.520 | that the portrayal does not represent
02:11:43.640 | the full nature of efforts that's going on at Facebook.
02:11:46.620 | So I don't know what to think about that.
02:11:48.200 | - Well, you just, I think, very helpfully explained
02:11:52.680 | the nuance of the situation
02:11:54.040 | and why it's so hard to understand.
02:11:56.280 | But a couple of things.
02:11:57.120 | One is, I think I have been surprised at the scale
02:12:02.120 | with which some product manager can do an enormous amount
02:12:13.920 | of harm to a very, very large company
02:12:16.100 | by releasing a trove of documents.
02:12:19.260 | Like I think I read a couple of them when they got published
02:12:21.900 | and I haven't even spent any time going deep.
02:12:24.140 | Part of it's like, I don't really feel
02:12:25.420 | like reliving a previous life, but wow.
02:12:30.420 | Like talk about challenging the idea of open culture
02:12:34.380 | and like what that does to Facebook internally.
02:12:37.500 | If Facebook was built, like I remember like my office,
02:12:43.300 | we had this like no visitors rule around my office
02:12:45.680 | 'cause we always had like confidential stuff up on the walls
02:12:48.160 | and everyone was super angry 'cause they were like,
02:12:50.840 | that goes against our culture of transparency.
02:12:52.880 | And like Mark's in the fish cube or whatever they call it,
02:12:55.640 | the aquarium, I think they called it,
02:12:57.920 | where like literally anyone could see
02:12:59.320 | what he was doing at any point.
02:13:00.800 | And I don't know, I mean, other companies like Apple
02:13:04.720 | have been quiet/locked down.
02:13:07.640 | Snapchat's the same way for a reason.
02:13:10.240 | And I don't know what this does to transparency
02:13:14.280 | on the inside of startups that value that.
02:13:16.820 | I think that it's a seminal moment.
02:13:19.500 | And you can say, well, you should have nothing to hide,
02:13:22.020 | right, but to your point, you can pick out documents
02:13:24.640 | that show anything, right?
02:13:26.620 | But I don't know.
02:13:28.500 | So what happens to transparency inside of startups
02:13:31.380 | and the culture that startups or companies in the future
02:13:36.380 | will grow, like the startup of the future
02:13:37.980 | that becomes the next Facebook, will it be locked down?
02:13:40.440 | And what does that do, right?
02:13:42.580 | So that's part one.
02:13:44.220 | Part two, I don't think you could design
02:13:49.220 | a more well-orchestrated handful of events,
02:13:54.520 | from the 60 Minutes interview to releasing the documents
02:14:00.540 | in the way that they were released, at the right time.
02:14:03.420 | That takes a lot of planning and partnership.
02:14:06.500 | And it seems like she has a partner at some firm, right,
02:14:09.980 | that probably helped a lot with this.
02:14:12.320 | But man, at a personal level, if you're her,
02:14:16.300 | you'd have to really believe in what you are doing,
02:14:20.900 | really believe in it, because you are personally
02:14:23.500 | putting your ass on the line, right?
02:14:25.160 | Like you've got a very large company
02:14:29.020 | that doesn't like enemies, right?
02:14:31.540 | It takes a lot of guts.
02:14:36.120 | And I don't love these conspiracy theories about like,
02:14:39.700 | oh, she's being financed from some person or people.
02:14:42.380 | Like I don't love them
02:14:43.260 | because that's like the easy thing to say.
02:14:45.700 | I think the Occam's razor here is like,
02:14:48.660 | someone thought they were doing something wrong
02:14:51.300 | and was like very, very courageous.
02:14:54.700 | I don't know if courageous is the word, but like,
02:14:58.260 | so without getting into like, is she a martyr?
02:15:01.780 | Is she courageous?
02:15:02.620 | Is she, right, like, let's put that aside for a second.
02:15:05.860 | Then there are the documents themselves.
02:15:07.440 | They say what they say.
02:15:08.640 | To your point, a lot of the things that like,
02:15:12.120 | people have been worried about are already in the documents
02:15:15.080 | or they're already been said externally.
02:15:17.240 | And I don't know, I'm just like,
02:15:21.040 | I'm thankful that I am focused on new things with my life.
02:15:24.560 | (laughing)
02:15:25.820 | - Well, let me just say, I just think
02:15:27.320 | it's a really hard problem that probably Facebook
02:15:30.280 | and Twitter are trying to solve.
02:15:32.280 | I'm actually just fascinated by how hard this problem is.
02:15:35.340 | There are fundamental issues at Facebook in tone
02:15:38.400 | and in the approach of how product gets built
02:15:41.720 | and the objective functions.
02:15:43.240 | And organizations are not people.
02:15:48.280 | So Yann and FAIR, right?
02:15:50.000 | Like there are a lot of really great people
02:15:51.760 | who like literally just want to push reinforcement
02:15:53.960 | learning forward.
02:15:55.200 | They literally just want to teach a robot
02:15:56.920 | to touch, feel, lift, right?
02:15:59.400 | Like they're not thinking about political misinformation.
02:16:02.840 | Right?
02:16:03.880 | But there's a strong connection
02:16:06.300 | between what funds that research
02:16:08.380 | and an enormously profitable machine that has trade-offs.
02:16:13.340 | And one cannot separate the two.
02:16:18.200 | You are not completely separate from the system.
02:16:21.400 | So I agree, it can feel really frustrating to feel
02:16:25.100 | if you're internally, internal there,
02:16:27.220 | that you're working on something completely unrelated
02:16:29.580 | and you feel like your group's good.
02:16:31.180 | I can understand that.
02:16:32.820 | - But there's some responsibility still.
02:16:34.420 | You have to acknowledge, it's like the Ray Dalio thing.
02:16:36.420 | You have to look in the mirror and see if there's problems
02:16:38.940 | and you have to fix those problems.
02:16:40.280 | - Yeah.
02:16:41.120 | - You've mentioned machine learning
02:16:44.780 | and reinforcement quite a bit.
02:16:46.740 | I mean, to me, social networks is one
02:16:48.620 | of the exciting places, the recommender systems
02:16:51.300 | where machine learning is applied.
02:16:52.820 | Where else in the world, in the space of possibilities
02:16:56.460 | over the next five, 10, 20 years,
02:16:58.220 | do you think we're going to see the impact of machine learning
02:17:02.140 | when you apply it?
02:17:03.220 | On the philosophical level, on a technical level,
02:17:06.540 | what do you think?
02:17:07.380 | Or within social networks themselves?
02:17:09.500 | - Well, I think the obvious answers are climate change.
02:17:16.720 | Think about how much fuel or just waste there is
02:17:21.780 | in energy consumption today
02:17:24.520 | because we don't plan accordingly,
02:17:27.460 | because we take the least efficient route.
02:17:30.940 | So logistics and stuff, supply chain,
02:17:32.660 | all that kind of stuff.
02:17:33.500 | - Yeah, I mean, listen,
02:17:35.260 | if we're going to fight climate change,
02:17:36.620 | like one awesome way to do it is figure out
02:17:40.540 | how to optimize how we operate as a species
02:17:43.300 | and minimize the amount of energy we consume
02:17:47.260 | to maximize whatever economic impact we want to have.
02:17:51.120 | Because right now, those two are very much tied together.
02:17:53.940 | And I don't believe that that has to be the case.
02:17:56.580 | There's this really interesting, you've read it.
02:18:00.040 | For people who are listening,
02:18:00.880 | there's this really interesting paper
02:18:02.720 | on reinforcement learning
02:18:04.040 | and energy consumption inside buildings.
02:18:06.240 | It's like one of the seminal ones, right?
02:18:08.320 | But imagine that at massive scale.
02:18:10.320 | That's super interesting.
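For readers who want the idea in code rather than prose, here is a minimal tabular Q-learning sketch of the building-control idea. To be clear, this is not the method from the paper mentioned above; the states, actions, and reward shape are all invented for illustration.

```python
import random

# Toy tabular Q-learning for thermostat control: a minimal sketch of
# the buildings idea, with invented states, actions, and rewards.

SETPOINTS = [18, 20, 22, 24]          # actions: thermostat setpoints (deg C)
TEMPS = list(range(-10, 35, 5))       # states: outdoor temperature bands

Q = {(t, a): 0.0 for t in TEMPS for a in SETPOINTS}
alpha, gamma, eps = 0.1, 0.9, 0.1     # learning rate, discount, exploration

def reward(outdoor, setpoint):
    energy = 0.5 * abs(setpoint - outdoor)   # crude heating/cooling cost
    discomfort = (setpoint - 21) ** 2        # distance from a comfort target
    return -(energy + discomfort)

for _ in range(10_000):
    t = random.choice(TEMPS)
    if random.random() < eps:                # epsilon-greedy exploration
        a = random.choice(SETPOINTS)
    else:
        a = max(SETPOINTS, key=lambda s: Q[(t, s)])
    r = reward(t, a)
    t_next = random.choice(TEMPS)            # weather evolves exogenously here
    best_next = max(Q[(t_next, s)] for s in SETPOINTS)
    Q[(t, a)] += alpha * (r + gamma * best_next - Q[(t, a)])
```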
02:18:11.320 | I mean, they've done like resource planning for servers
02:18:15.340 | for peak load using reinforcement learning.
02:18:17.840 | I don't know if that was at Google or somewhere else,
02:18:19.600 | but like, okay, great, you do it for servers,
02:18:21.920 | but what if you could do it for just capacity
02:18:24.080 | in general, energy capacity for cities
02:18:26.240 | and planning for traffic?
02:18:28.000 | Of course, there's all the self-driving cars.
02:18:31.120 | I don't know, I'm not gonna pontificate crazy ideas
02:18:36.360 | using reinforcement learning or machine learning.
02:18:40.000 | It's just so clear to me
02:18:41.220 | that humans don't think quickly enough.
02:18:43.640 | - So it's interesting to think about machine learning
02:18:46.520 | helping a little bit at scale.
02:18:49.680 | So a little bit to a large number of people,
02:18:52.000 | that has a huge impact.
02:18:53.440 | So if you optimize, say, Google Maps, something like that,
02:18:57.280 | trajectory planning, or what was it, MapQuest first.
02:19:00.100 | - Getting here, I looked and it was like,
02:19:02.940 | here's the most energy efficient route.
02:19:04.520 | And I was like, I'm gonna be late.
02:19:05.680 | I need to take the fastest route.
02:19:07.280 | - As opposed to unrolling the map.
02:19:09.160 | - Yeah.
02:19:10.000 | - Yeah.
02:19:10.820 | - Like, and that's going to be very inefficient
02:19:12.960 | no matter what.
02:19:13.780 | - I was definitely the other day,
02:19:14.800 | like part of the epsilon in epsilon-greedy
02:19:17.820 | with Waze, where I was sent on a weird route
02:19:21.800 | that I could tell they're like,
02:19:22.960 | we just need to collect data of this road.
02:19:25.040 | (laughing)
02:19:25.880 | Like we just, Kevin's-
02:19:26.720 | - You were the ant they sent off for exploration.
02:19:28.400 | - Kevin's definitely gonna be the guinea pig.
02:19:30.400 | And great, now we have-
02:19:32.120 | - Did you at least feel pride?
02:19:34.160 | - Oh, going through it, I was like, oh, this is fun.
02:19:36.000 | Like now they get data about this weird shortcut.
02:19:38.280 | And actually I hit all the green lights and it worked.
02:19:40.200 | I'm like, this is a problem.
02:19:41.320 | - This is bad data.
02:19:42.720 | - Bad data, they're just gonna imagine.
02:19:44.720 | - I could see you slowing down and stopping at a green light
02:19:47.360 | just to give them the right kind of data.
02:19:49.000 | - Yeah.
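For anyone curious, the joke above maps onto code directly. A minimal epsilon-greedy sketch of the routing idea, where the route names, travel times, and epsilon value are all made up for illustration:

```python
import random

# Epsilon-greedy routing in miniature: most drivers get the route
# currently believed fastest, but a small epsilon fraction are sent
# down an under-observed road so the estimates keep improving.
# All names and numbers here are hypothetical.

epsilon = 0.05
est_minutes = {"highway": 21.0, "surface_streets": 24.5, "weird_shortcut": 30.0}
trip_counts = {route: 1 for route in est_minutes}

def pick_route():
    if random.random() < epsilon:
        return random.choice(list(est_minutes))    # explore: collect data
    return min(est_minutes, key=est_minutes.get)   # exploit: best known route

def record_trip(route, observed_minutes):
    # incremental mean update of the travel-time estimate
    trip_counts[route] += 1
    est_minutes[route] += (observed_minutes - est_minutes[route]) / trip_counts[route]

record_trip("weird_shortcut", 19.0)   # Kevin hit all the green lights
print(est_minutes["weird_shortcut"])  # the "bad data" nudges the estimate down
```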
02:19:50.100 | But to answer your question,
02:19:51.000 | like I feel like that was fairly unsatisfying
02:19:53.400 | and it's easy to say climate change.
02:19:55.000 | But what I would say is at Instagram,
02:20:02.120 | everything we applied machine learning to
02:20:02.120 | got better for users and it got better for the company.
02:20:04.800 | I saw the power.
02:20:06.200 | I didn't fully understand it as an executive.
02:20:08.200 | And I think that's actually one of the issues that,
02:20:11.040 | and when I say understand, I mean the mathematics of it.
02:20:14.040 | Like I understand what it does, I understand that it helps.
02:20:17.040 | But there are a lot of executives now that talk about it
02:20:21.800 | in the way that they talk about the internet
02:20:23.680 | or they talked about the internet like 10 years ago.
02:20:25.560 | They're like, we're gonna build mobile.
02:20:27.080 | And you're like, what does that mean?
02:20:27.960 | They're like, we're just gonna do mobile.
02:20:29.440 | And you're like, okay.
02:20:31.240 | So my sense is the next generation of leaders
02:20:33.920 | will have grown up having had classes
02:20:37.200 | in reinforcement learning, supervised learning, whatever.
02:20:40.200 | And they will be able to thoughtfully apply it
02:20:42.920 | to their companies in the places it is needed most.
02:20:46.320 | And that's really cool.
02:20:47.680 | 'Cause I mean, talk about efficiency gains.
02:20:53.040 | That's what excites me the most about it.
02:20:54.800 | - Yeah, so there's, it's interesting.
02:20:56.440 | Just to get a fundamental first principles understanding
02:20:59.480 | of certain concepts of machine learning.
02:21:01.040 | So supervised learning from an executive perspective,
02:21:04.400 | supervised learning, you have to have a lot of humans
02:21:07.080 | label a lot of data.
02:21:08.480 | So the question there is, okay,
02:21:10.000 | can we gather a large amount of data
02:21:12.440 | that can be labeled well?
02:21:14.400 | And that's the question Tesla asked.
02:21:16.120 | Like, can we create a data engine that keeps
02:21:18.560 | sending an imperfect machine learning system out there,
02:21:22.960 | and whenever it fails, it gives us data back.
02:21:25.040 | We label it by human and we send it back out,
02:21:27.640 | and so on, back and forth.
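A hypothetical toy version of that data-engine loop, with a simulated human standing in for the labeling team; the numbers are invented, but the shape of the loop is the point:

```python
import random

# Data-engine loop in miniature: deploy a model, collect the examples
# it is least sure about, have a "human" label them, refit, repeat.
# The model here is just a 1-D threshold; everything is invented.

TRUE_BOUNDARY = 0.63          # what the human labeler knows
threshold = 0.50              # the deployed model's current guess
labeled = []

for round_num in range(10):
    batch = [random.random() for _ in range(200)]            # field data
    hard = [x for x in batch if abs(x - threshold) < 0.15]   # uncertain cases
    labeled += [(x, x > TRUE_BOUNDARY) for x in hard]        # human labels them
    pos = [x for x, y in labeled if y]
    neg = [x for x, y in labeled if not y]
    if pos and neg:
        threshold = (min(pos) + max(neg)) / 2                # refit the model
    print(f"round {round_num}: threshold={threshold:.3f}, labeled={len(labeled)}")
```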
02:21:28.640 | Then there's self-supervised learning,
02:21:30.440 | which Yann LeCun is excited about,
02:21:32.400 | where you do much less human labeling.
02:21:36.120 | And there's some kind of mechanism
02:21:38.200 | for the system to learn it by itself
02:21:40.640 | on the human generated data.
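In code, the self-supervised trick is simply that the labels come from the data itself, with no annotators in the loop. A deliberately tiny example, next-character prediction from bigram counts over raw text:

```python
from collections import Counter, defaultdict

# Self-supervision in miniature: the "label" for each character is just
# the character that follows it, so raw text supervises itself.

corpus = "the cat sat on the mat and the cat ran"
bigrams = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    bigrams[cur][nxt] += 1            # supervision extracted from the data

def predict_next(ch):
    # most frequent continuation seen in the raw text
    return bigrams[ch].most_common(1)[0][0] if bigrams[ch] else " "

print(predict_next("t"))              # learned with zero human labels
```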
02:21:42.760 | And then there's the reinforcement learning,
02:21:44.760 | which is basically applying
02:21:47.800 | the AlphaZero technology that allows a system,
02:21:52.080 | through self-play, to learn how to
02:21:55.760 | solve the game of Go and achieve incredible levels
02:21:59.920 | at the game of chess.
02:22:01.120 | Can you formulate the problem you're trying to solve
02:22:05.160 | in a way that's amenable to reinforcement learning?
02:22:07.760 | And can you get the right kind of signal at scale?
02:22:09.960 | 'Cause you need a lot, a lot of signal.
02:22:11.840 | And that's kind of fascinating to see which part
02:22:15.480 | of a social network can you convert
02:22:17.640 | into a reinforcement learning problem?
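To make that question concrete, one hedged way to write it down: an RL formulation is just a choice of state, action, and reward. Everything below is a hypothetical formulation for a feed, not any company's actual system.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# A feed-ranking problem cast as RL: the agent's action is which post
# to show next, and the reward encodes what "meaningful" means.
# All names, fields, and weights here are invented for illustration.

@dataclass
class FeedState:
    recent_topics: List[str] = field(default_factory=list)  # what was just shown
    session_minutes: float = 0.0                             # scrolling so far

def actions(state: FeedState, inventory: List[str]) -> List[str]:
    return inventory          # action space: candidate posts to show next

def reward(signal: Dict[str, float]) -> float:
    # the hard part named above: getting the right signal, at scale.
    # A "meaningful interaction" reward might weight replies and reports
    # very differently from raw watch time.
    return (2.0 * signal.get("reply", 0.0)
            + 1.0 * signal.get("like", 0.0)
            - 5.0 * signal.get("report", 0.0)
            - 0.1 * signal.get("passive_minutes", 0.0))

print(reward({"reply": 1.0, "passive_minutes": 30.0}))  # engagement isn't everything
```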
02:22:19.440 | - The fascinating thing about reinforcement learning,
02:22:22.240 | I think, is that we now have learned
02:22:25.920 | to apply neural networks to guess,
02:22:29.680 | like the Q function, basically the values
02:22:37.120 | for any state and action.
02:22:37.120 | And that is fascinating 'cause we used to just like,
02:22:40.480 | I don't know, have like a linear regression,
02:22:42.160 | like hope it worked and that was the fanciest version of it.
02:22:45.320 | But now you look at it, I'm like trying to learn this stuff.
02:22:47.400 | And I look at it, I'm like, there are like 17
02:22:49.840 | different acronyms of different ways
02:22:51.640 | you can try to apply this.
02:22:52.760 | No one quite agrees, like what's the best.
02:22:56.080 | Generally, if you're trying to like build a neural network,
02:22:58.880 | they're pretty well-trodden ways of doing that.
02:23:02.120 | You use Adam, you use ReLU,
02:23:04.520 | like there's just like general good ideas.
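To make the contrast with a linear model concrete, here is a generic DQN-style sketch, assuming PyTorch (an assumption of ours, not anything from the conversation): a small ReLU network trained with Adam toward the Bellman target r + gamma * max_a' Q(s', a').

```python
import torch
import torch.nn as nn

# "A neural network guessing the Q function": the net maps a state to
# one value per action and is regressed toward the Bellman target.
# Sizes and the single fake transition below are purely illustrative.

n_state, n_actions, gamma = 8, 4, 0.99

q_net = nn.Sequential(                  # the step up from linear regression
    nn.Linear(n_state, 64), nn.ReLU(),  # ReLU: one of the well-trodden choices
    nn.Linear(64, n_actions),
)
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)  # Adam: the other

# one fake transition (s, a, r, s') just to show the update
s, s_next = torch.randn(1, n_state), torch.randn(1, n_state)
a, r = torch.tensor([2]), torch.tensor([1.0])

with torch.no_grad():
    target = r + gamma * q_net(s_next).max(dim=1).values   # Bellman target
pred = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)       # Q(s, a)
loss = nn.functional.mse_loss(pred, target)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```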
02:23:07.360 | And in reinforcement learning, I feel like the consensus
02:23:10.160 | is like, it totally depends.
02:23:12.640 | And by the way, it's really hard to get it to converge
02:23:16.440 | and it's noisy.
02:23:17.560 | And it like, so there are all these really interesting ideas
02:23:20.680 | around building simulators,
02:23:22.280 | you know, like for instance, in self-driving, right?
02:23:25.720 | Like you don't want to like actually have someone
02:23:29.360 | get in an accident to learn that an accident is bad.
02:23:31.920 | So you start simulating accidents,
02:23:33.960 | simulating aggressive drivers,
02:23:35.400 | just simulating crazy dogs that run into the street.
02:23:37.680 | And wow, fascinating, right?
02:23:41.320 | Like my mind starts racing.
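A toy sketch of why the simulator matters, with events, probabilities, and rewards all invented here: rare dangerous situations can be injected at will, so a policy experiences thousands of them without anyone getting hurt.

```python
import random

# Simulated driving in miniature: hazards that are rare in the real
# world are injected on demand, and mistakes cost nothing but compute.

EVENTS = ["clear_road", "aggressive_driver", "jaywalker", "dog_in_street"]
WEIGHTS = [0.7, 0.1, 0.1, 0.1]   # far more hazards than real roads have

def simulate_step(action):
    event = random.choices(EVENTS, weights=WEIGHTS)[0]
    if event != "clear_road" and action != "brake":
        return -100.0, event      # simulated crash: large negative reward
    return 1.0, event             # safe progress

total = 0.0
for _ in range(1000):
    action = random.choice(["brake", "cruise"])  # stand-in for a learned policy
    step_reward, _ = simulate_step(action)
    total += step_reward
print(f"return over 1000 simulated steps: {total:.0f}")
```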
02:23:42.840 | And then the question is, okay, forget about
02:23:45.160 | self-driving cars, let's talk about social networks.
02:23:48.520 | How can you produce a better, more thoughtful experience
02:23:52.960 | using these types of algorithms?
02:23:55.840 | And honestly, in talking to some of the people
02:23:58.200 | that work at Facebook and old Instagrammers,
02:24:01.640 | most people are like, yeah, we tried a lot of things,
02:24:04.160 | didn't quite ever make it work.
02:24:05.600 | I mean, for the longest time, Facebook ads
02:24:07.440 | was effectively a logistic regression, okay?
02:24:10.600 | I don't know what it is now, but like,
02:24:12.120 | if you look at this paper that they published
02:24:13.640 | back in the day, it was literally
02:24:14.600 | just a logistic regression.
02:24:16.120 | Made a lot of money.
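For reference, the published work being described here paired learned feature transforms with a plain logistic regression scorer; at its core, a click predictor of that shape is as simple as this sketch, with features and weights invented for illustration:

```python
import math

# Click-through prediction as plain logistic regression: a weighted sum
# of features squashed through a sigmoid into a probability of a click.
# Feature names and weights are hypothetical.

def predict_ctr(features, weights, bias):
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))    # sigmoid

# hypothetical features: [user-ad affinity, ad quality, time-of-day match]
ctr = predict_ctr([0.8, 0.5, 0.1], weights=[1.2, 0.7, 0.3], bias=-2.0)
print(f"predicted click probability: {ctr:.3f}")
```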
02:24:17.120 | So even at these like extremely large scales,
02:24:21.760 | if we are not yet touching what reinforcement learning
02:24:24.320 | can truly do, imagine what the next 10 years looks like.
02:24:27.400 | - Yeah. - How cool is that?
02:24:28.600 | - It's amazing.
02:24:29.600 | So I really like the use of reinforcement learning
02:24:32.200 | as part of the simulation, for example,
02:24:34.760 | like with self-driving cars, it's modeling pedestrians.
02:24:37.760 | So the nice thing about reinforcement learning,
02:24:40.440 | it can be used to learn agents within the world.
02:24:45.000 | So they can learn to behave properly.
02:24:47.080 | Like you can teach pedestrians to,
02:24:49.360 | like you don't hard code the way they behave.
02:24:51.640 | They learn how to behave.
02:24:53.280 | In that same way, I do have a hope.
02:24:55.200 | Was it Jack Dorsey who talks about healthy conversations?
02:24:58.280 | You talked about meaningful interactions, I believe.
02:25:01.520 | - Yeah.
02:25:02.360 | - Like simulating interactions.
02:25:06.920 | So you can learn how to manage that.
02:25:08.720 | It's fascinating.
02:25:09.560 | So most of your algorithm development
02:25:12.280 | happens in virtual worlds.
02:25:15.000 | And then you can really learn how to design the interface,
02:25:18.120 | how you design many aspects of the experience
02:25:21.400 | in terms of how you select what's shown in the feed,
02:25:24.480 | all those kinds of things.
02:25:26.120 | It feels like if you can connect reinforcement learning
02:25:28.440 | to that, that's super exciting.
02:25:30.080 | - Yep.
02:25:30.920 | And I think if you have a company and leadership
02:25:35.080 | that believe in doing the right things
02:25:36.560 | and can apply this technology in the right way,
02:25:38.920 | some really special stuff can happen.
02:25:41.440 | It is most likely going to be a group of people
02:25:44.440 | we've never heard about, startup from scratch, right?
02:25:47.960 | And you asked if like new social networks could be built.
02:25:52.200 | I've got to imagine they will be.
02:25:54.300 | And whoever starts it, it might be some kids in a garage
02:25:58.400 | that took these classes from these people, you, right?
02:26:01.280 | Like, and they're building all of these things
02:26:04.380 | with this tech at the core.
02:26:06.320 | So I'm trying not to be someone
02:26:07.400 | who just like throws around reinforcement learning
02:26:10.800 | as a buzzword.
02:26:11.880 | I truly believe that it is the most cutting edge
02:26:16.240 | in what can happen in social networks.
02:26:18.320 | And I also believe it's super hard.
02:26:20.880 | Like it's super hard to make it work.
02:26:22.480 | It's super hard to do it at scale.
02:26:24.040 | It's super hard to find people that truly understand it.
02:26:26.840 | So I'm not going to say that like,
02:26:28.600 | I think it'll be applied in social networks
02:26:31.760 | before we have true self-driving.
02:26:33.840 | - Yeah.
02:26:34.680 | - Let me put it that way.
02:26:35.500 | - We could argue about this for a long time,
02:26:36.920 | but yes, I agree with you.
02:26:38.280 | I think self-driving is way harder than people realize.
02:26:40.480 | - Oh, absolutely.
02:26:41.840 | - Let me ask you in terms of that kid in the garage
02:26:44.040 | or those couple of kids in the garage,
02:26:45.520 | what advice would you give to them
02:26:47.040 | if they want to start a new social network or a business?
02:26:50.760 | What advice would you give to somebody with a big dream
02:26:53.520 | and a young startup?
02:26:54.700 | - To me, you have to choose to do something
02:26:59.500 | that even if it fails, like it was so fun, right?
02:27:03.800 | Like we never started Instagram
02:27:06.400 | knowing it was going to be big.
02:27:07.760 | We started Instagram because we loved photography.
02:27:10.440 | We loved social networks.
02:27:12.460 | I had seen what other social networks had done
02:27:14.520 | and I thought, hmm, maybe we could do a spin on this,
02:27:17.640 | but like nowhere was our fate predestined.
02:27:20.400 | Like it wasn't like, it wasn't written out anywhere
02:27:23.640 | that everything was going to go great.
02:27:25.600 | And I often think the counterfactual,
02:27:27.000 | like what if it had not gone well?
02:27:28.840 | I would have been like, I don't know, that was fun.
02:27:30.360 | We raised some money, we learned some stuff
02:27:32.280 | and does it position you well for the next experience?
02:27:37.280 | That's the advice that I would give to anyone
02:27:39.720 | wanting to start something today,
02:27:41.160 | which is like, does this meet with your ultimate goals?
02:27:45.080 | Not wealth, not fame, none of that,
02:27:47.160 | because all of that, by the way, is bullshit.
02:27:48.800 | Like you can get super famous and super wealthy.
02:27:52.320 | And I think generally those are not the things that matter.
02:27:55.920 | Again, it's easy to say, when you have a lot of money,
02:27:59.400 | that somehow it's not good to have a lot of money.
02:28:01.760 | It's just, I think it complicates life enormously
02:28:04.880 | in a way that people don't fully comprehend.
02:28:06.840 | So I think it is way more interesting to shoot for,
02:28:09.480 | can I make something that people love,
02:28:11.720 | that provides value in the world, that I love building,
02:28:15.120 | that I love working on, right?
02:28:17.480 | That's what I would do if I were starting from scratch.
02:28:21.880 | And by the way, like in some ways
02:28:22.920 | that I will do that personally,
02:28:25.160 | which is like choose the thing
02:28:26.400 | that you get up every morning and you're like,
02:28:27.880 | I love this, even when it's painful.
02:28:32.880 | - Even when it's painful.
02:28:34.440 | What about a social network specifically?
02:28:36.720 | If you were to imagine, put yourself in the mind of some-
02:28:40.600 | - I don't wanna compete against myself.
02:28:42.000 | I can't give out ideas.
02:28:43.520 | - Okay, I got you.
02:28:44.360 | No, but it's like high level, you focus on community.
02:28:47.600 | - Yeah.
02:28:48.440 | I said that as a half joke.
02:28:53.320 | In all honesty, I think these things are so hard to build
02:28:56.960 | that like ideas are a dime a dozen, but-
02:29:00.320 | - You have talked about keeping it simple.
02:29:02.920 | - Can I tell you-
02:29:03.760 | - Which is a liberating idea.
02:29:04.960 | - My model is three circles and they overlap.
02:29:08.280 | One circle is what do I have experience at
02:29:11.360 | slash what am I good at?
02:29:12.800 | I don't like saying what am I good at
02:29:14.120 | because it just like seems like,
02:29:16.040 | what do I have experience in, right?
02:29:18.280 | What can I bring to the table?
02:29:19.720 | What am I excited about is the other circle.
02:29:22.040 | What's just super cool, right?
02:29:24.640 | That I wanna work on because even when this is hard,
02:29:27.200 | I think it's so cool, I wanna stick with it.
02:29:31.720 | And the last circle is like, what does the world need?
02:29:34.960 | And if that circle ain't there,
02:29:36.200 | it doesn't matter what you work on.
02:29:37.720 | 'Cause there are a lot of startups that exist
02:29:39.280 | that just no one needs or very small markets need.
02:29:43.200 | But if you wanna be successful,
02:29:44.840 | I think if you're good at it,
02:29:47.240 | you're passionate about it,
02:29:49.680 | and the world needs it.
02:29:51.640 | I mean, this sounds simple,
02:29:53.000 | but not enough people sit down
02:29:54.160 | and just think about those circles and think,
02:29:57.160 | do these things overlap?
02:29:58.120 | And then can I get that middle section?
02:29:59.480 | It's small, but can I get that middle section?
02:30:01.780 | I think a lot about that personally.
02:30:04.600 | - And then you have to be really honest
02:30:07.720 | about the circle that you're good at
02:30:10.120 | and really honest about the circle that the world needs.
02:30:16.320 | And also really honest about the passion,
02:30:20.280 | like what do you actually love?
02:30:22.520 | As opposed to like some kind of dream of making money,
02:30:24.600 | all those kinds of stuff, like literally love doing.
02:30:26.600 | - I had a former engineer who decided to start a startup.
02:30:29.920 | And I was like, are you sure you wanna start a company
02:30:32.080 | versus like join something else?
02:30:34.440 | Because being a coach of an NBA team
02:30:38.160 | and playing basketball are two very, very different things.
02:30:41.400 | And like not everyone fully understands the difference.
02:30:45.880 | I think you can kind of do it both.
02:30:47.640 | And I don't know, jury's out on that one
02:30:51.960 | because like they're in the middle of it now.
02:30:54.600 | But it's really important to figure out what you're good at,
02:30:58.560 | not be full of yourself, like truly look
02:31:00.840 | at your track record.
02:31:02.060 | What's the saying?
02:31:04.560 | Like it ain't bragging if you can do it.
02:31:07.400 | But too many people are delusional
02:31:12.280 | and like think they're better at things
02:31:14.280 | than they actually are,
02:31:15.480 | or think there's a bigger market than there actually is.
02:31:18.640 | When you confuse your passion for things with a big market,
02:31:21.080 | that's really scary, right?
02:31:23.600 | Like just because you think it's cool doesn't mean
02:31:25.640 | that it's a big business opportunity.
02:31:27.120 | So like what evidence do you have?
02:31:28.760 | Again, I'm a fairly like, I'm a strict rationalist on this.
02:31:32.580 | And like sometimes people don't like working with me
02:31:35.000 | because I'm pretty pragmatic about things.
02:31:37.160 | Like I'm not Elon, like I don't sit
02:31:40.440 | and make bold proclamations about visiting Mars.
02:31:44.160 | Like that's just not how I work.
02:31:46.160 | I'm like, okay, I wanna build this really cool thing
02:31:48.480 | that's fairly practical and I think we could do it.
02:31:50.840 | And it's in this way.
02:31:52.320 | And what's cool though is like, that's just my sweet spot.
02:31:55.060 | I'm not like, I just, I can't with a straight face
02:31:58.800 | talk about the metaverse.
02:31:59.680 | I can't, I just, it's not me.
02:32:01.800 | - What do you think about the Facebook renaming itself
02:32:05.560 | to Meta? - I didn't mean that as a dig.
02:32:06.480 | I just literally mean I'm fairly pragmatic.
02:32:09.480 | I like to live in the next five years.
02:32:11.240 | And like what things can I get out in a year
02:32:13.560 | that people will use at scale?
02:32:15.080 | And so it's just, again, those circles I think
02:32:20.000 | are different for different people,
02:32:21.040 | but it's important to realize that like market matters,
02:32:24.600 | you being good at it matters
02:32:26.060 | and having passion for it matters.
02:32:27.980 | Your question, sorry.
02:32:29.440 | - Well, on this topic in terms of funding,
02:32:33.780 | is there by way of advice,
02:32:36.500 | was funding in your own journey helpful, unhelpful?
02:32:44.220 | Like is there a right time to get funding?
02:32:46.560 | - You mean venture funding? - Venture funding
02:32:48.360 | or anything, borrow some money from your parents.
02:32:51.120 | I don't know.
02:32:51.960 | But like is money getting in the way?
02:32:54.880 | Does it help?
02:32:56.060 | Is the timing important?
02:32:57.440 | Is there some kind of wisdom you can give there
02:33:00.280 | because you were exceptionally successful very quickly?
02:33:03.040 | - Funding helps as long as it's from the right people.
02:33:09.740 | That includes yourself.
02:33:10.640 | And I'll talk about myself funding myself in a second,
02:33:13.680 | which is like, because I can fund myself
02:33:15.820 | doing whatever projects I can do,
02:33:18.020 | I don't really have another person putting pressure on me
02:33:20.640 | except for myself.
02:33:21.600 | And that creates strange dynamics, right?
02:33:24.480 | But let's like talk about people getting funding
02:33:27.120 | from a venture capitalist initially.
02:33:29.020 | We raised money from Matt Kohler at Benchmark.
02:33:32.520 | He's brilliant, amazing guy, very thoughtful.
02:33:36.520 | And he was very helpful early on.
02:33:39.480 | But I have stories from entrepreneurs
02:33:41.140 | where they raised money from the wrong person
02:33:42.920 | or the wrong firm where incentives weren't aligned.
02:33:46.800 | They didn't think in the same way
02:33:48.640 | and bad things happened because of that.
02:33:51.360 | The boardroom was always noisy.
02:33:53.520 | There were fights.
02:33:54.360 | Like we just never had that.
02:33:55.540 | Matt was great.
02:33:57.120 | I think like capital these days
02:33:59.920 | is kind of a dime a dozen, right?
02:34:01.760 | Like as long as you're fundable,
02:34:03.160 | like it seems like there's money out there
02:34:05.120 | is what I'm hearing.
02:34:06.540 | It's really important that you are aligned
02:34:10.440 | and that you think of raising money
02:34:12.040 | as hiring someone for your team rather than taking money.
02:34:15.960 | If capital is plentiful, right?
02:34:18.360 | It provides a certain amount of pressure
02:34:20.920 | to do the right thing that I think is healthy
02:34:23.100 | for any startup.
02:34:24.160 | And it keeps you real and honest
02:34:25.740 | because they don't wanna lose their money.
02:34:27.400 | They're paid to not lose their money.
02:34:29.680 | The problem, maybe I could depersonalize it,
02:34:32.740 | but like I remember having lunch with Elon.
02:34:35.840 | It's only happened once.
02:34:37.680 | And I asked him, like I was trying to figure out
02:34:40.000 | what I was doing after Instagram, right?
02:34:42.600 | And I asked him something about like angel investing.
02:34:44.900 | And he looked at me with a straight face.
02:34:46.320 | He was like, "Why the F would I do that?"
02:34:48.120 | Like, "Why?"
02:34:48.960 | I was like, "I don't know, like you're connected.
02:34:51.560 | Like, it seems like."
02:34:52.400 | He's like, "I only invest in myself."
02:34:54.760 | I was like, "Ooh, okay."
02:34:56.880 | You know, like not the confidence.
02:34:59.240 | I was just like, "What a novel idea."
02:35:00.840 | It's like, "Yeah, if you have money,
02:35:03.080 | like why not just put it behind your own bets
02:35:05.960 | and enable visiting Mars or something, right?
02:35:10.960 | Like, that's awesome, great."
02:35:12.800 | But I had never really thought of it that way.
02:35:14.920 | But also with that comes an interesting dynamic
02:35:17.580 | where you don't actually have people
02:35:22.180 | who are gonna lose that money telling you,
02:35:23.820 | "Hey, don't do this."
02:35:24.980 | Or, "Hey, you need to face this reality."
02:35:26.960 | So you need to create other versions of that truth teller.
02:35:30.980 | And whatever I do next,
02:35:34.540 | that's gonna be one of the interesting challenges
02:35:36.700 | is how do you create that truth telling situation?
02:35:39.780 | And that's part of why, by the way,
02:35:41.700 | I think someone like Jack,
02:35:42.900 | when you start Square, you have money,
02:35:44.460 | but you still, you bring on partners
02:35:46.760 | because I think it creates a truth telling type environment.
02:35:51.380 | I'm still trying to figure this out.
02:35:52.420 | Like, it's an interesting dynamic.
02:35:56.380 | - So you're thinking of perhaps launching
02:35:59.060 | some kind of venture where you're investing in yourself?
02:36:01.700 | - I mean-
02:36:02.620 | - Is that in the books potentially?
02:36:03.620 | - I'm 37 going on 38 next month.
02:36:06.420 | I have a long life to live.
02:36:07.860 | I'm definitely not gonna sit on the beach, right?
02:36:10.700 | So I'm gonna do something at some point,
02:36:13.520 | and I gotta imagine I will help fund it, right?
02:36:18.180 | So the other way of thinking about this
02:36:20.020 | is you can park your money in the S&P,
02:36:21.740 | and this is a bad example 'cause the S&P's done wonderfully
02:36:23.940 | well the last year, right?
02:36:26.020 | Or you can invest in yourself.
02:36:27.500 | And if you're not gonna invest in yourself,
02:36:30.800 | you probably shouldn't do a startup.
02:36:32.100 | It's kind of the way of thinking about it.
02:36:34.200 | - And you can invest in yourself in the way Elon does,
02:36:37.780 | which is basically go all in on this investment.
02:36:41.340 | Maybe that's one way to achieve accountability
02:36:43.340 | is like you're kind of screwed if you fail.
02:36:45.980 | - Man, yeah, that's, yeah.
02:36:49.980 | - I personally like that.
02:36:50.980 | I like burning bridges behind me
02:36:52.620 | so that I'm fucked if it fails.
02:36:55.100 | - Yeah, yeah, yeah.
02:36:57.340 | It's really important though.
02:36:58.840 | One of the things I think Mark said to me early on
02:37:03.700 | that sticks with me that I think is true,
02:37:06.300 | we were talking about people who had left operating roles
02:37:10.500 | and started doing venture or something,
02:37:11.700 | and he was like, "A lot of people convince themselves
02:37:13.380 | "they work really hard, like they think they work really hard
02:37:15.840 | "and they put on the show,
02:37:17.500 | "and in their minds they work really hard,
02:37:19.420 | "but they don't work very hard."
02:37:21.060 | There is something about lighting a fire underneath you
02:37:25.020 | and burning bridges such that you can't turn back
02:37:28.340 | that I think, we didn't talk about this specifically,
02:37:31.300 | but I think you're right.
02:37:33.980 | You need to have that because there's this self-delusion
02:37:37.060 | at a certain scale, "Oh, I have so many board calls.
02:37:40.940 | "Oh, we have all these things to figure out."
02:37:43.260 | It's like, "Mm."
02:37:44.100 | This is one of the hard parts about being an operator.
02:37:47.420 | It's like there are so many people
02:37:50.300 | that have made a lot of money not operating,
02:37:52.900 | but operating is just one of the hardest things on earth.
02:37:55.540 | It is just so effing hard.
02:37:58.300 | It is stressful.
02:37:59.460 | It is, you're dealing with real humans,
02:38:01.540 | not just throwing capital in and hoping it grows.
02:38:04.900 | I'm not undermining the VC mindset.
02:38:06.780 | I think it's a wonderful thing and needed
02:38:08.580 | and so many wonderful VCs I've worked with.
02:38:11.580 | But yeah, when your ass is on the line and it's your money,
02:38:15.860 | talk to me in 10 years, we'll see how it goes.
02:38:20.540 | - [Lex] [laughs]
02:38:21.780 | Yeah, but you were saying that as a source,
02:38:23.500 | when you wake up in the morning
02:38:24.980 | and you look forward to the day full of challenges,
02:38:28.940 | that's also where you can find happiness.
02:38:31.100 | Let me ask you about love and friendship.
02:38:32.780 | - Sure.
02:38:33.620 | - What's the role in this heck of a difficult journey
02:38:36.700 | you have been on of love, of friendship?
02:38:41.700 | What's the role of love in the human condition?
02:38:44.060 | - Well, first things first,
02:38:47.260 | the woman I married, my wife, Nicole,
02:38:49.500 | no way I could do what I do if we weren't together.
02:38:53.820 | - She had the filter idea.
02:38:55.100 | - Yeah, yeah, exactly.
02:38:56.660 | We didn't go over that story.
02:38:59.340 | Everything is a partnership, right?
02:39:01.500 | And to achieve great things,
02:39:03.820 | it's not about someone pulling their weight in places.
02:39:06.460 | Like, it's not like someone's supporting you
02:39:08.820 | so that you could do this other thing.
02:39:11.180 | It's literally like,
02:39:12.580 | Mike and I and our partnership as co-founders is fascinating
02:39:18.940 | 'cause I don't think Instagram would have happened
02:39:20.420 | without that partnership.
02:39:21.820 | Either him or me alone, no way.
02:39:25.780 | We pushed and pulled each other in a way
02:39:28.620 | that allowed us to build a better thing because of it.
02:39:32.380 | Nicole, sure, she pushed me to work on the filters early on.
02:39:35.860 | And yes, that's exciting.
02:39:36.780 | It's a fun story, right?
02:39:38.580 | But the truth of it is being able to level with someone
02:39:42.460 | about how hard the process is
02:39:44.740 | and have someone see you for who you are before Instagram
02:39:49.340 | and know that there's a constant you throughout all of this
02:39:53.260 | and be able to call you when you're drifting from that,
02:39:55.980 | but also support you when you're trying to stick with that.
02:39:58.380 | That's true friendship/love, whatever you want to call it.
02:40:03.380 | But also it's for someone not to care.
02:40:07.140 | I remember Nicole saying,
02:40:08.540 | "Hey, I know you're going to do this Instagram thing."
02:40:11.420 | I guess it was bourbon at the time.
02:40:12.860 | "You should do it because even if it doesn't work,
02:40:16.820 | we can move to a smaller apartment and it'll be fine.
02:40:20.420 | We'll make it work."
02:40:21.460 | How beautiful is that, right?
02:40:23.580 | - Yeah, that's almost like a superpower.
02:40:25.740 | It gives you permission to fail
02:40:27.660 | and somehow that actually leads to success.
02:40:29.780 | - But also she's like the least impressed
02:40:31.780 | about Instagram of anyone.
02:40:33.520 | She's like, "Yeah, it's great, but I love you for you.
02:40:36.380 | I like that you're a decent cook."
02:40:37.980 | - That's beautiful.
02:40:38.820 | (laughing)
02:40:39.660 | That's beautiful with the Gantt chart and Thanksgiving,
02:40:42.700 | which I still think is a brilliant effing idea.
02:40:44.900 | - Thank you.
02:40:46.220 | - Big, ridiculous question.
02:40:47.900 | You're old and wise at this stage,
02:40:53.060 | so have you discovered meaning to this whole thing?
02:40:55.460 | Why the hell are we descendants of apes here on earth?
02:40:59.620 | What's the meaning of it?
02:41:00.660 | What's the meaning of life?
02:41:01.980 | - I haven't.
02:41:02.820 | So the best learning for me has been
02:41:09.300 | no matter what level of success you achieve,
02:41:12.620 | you're still worried about similar things,
02:41:14.900 | just maybe on a slightly different scale.
02:41:16.320 | You're still concerned about the same thing.
02:41:18.940 | You're still self-conscious about the same things.
02:41:22.660 | And actually that moment going through that
02:41:26.700 | is what makes you believe there's gotta be
02:41:29.020 | more machinery to life or purpose to life
02:41:31.700 | and that we're all chasing these materialistic things,
02:41:35.260 | but you start realizing,
02:41:38.220 | it's almost like the Truman Show when he gets to the edge
02:41:40.740 | and he knocks against it.
02:41:42.900 | He's like, "What?"
02:41:44.380 | There's this awakening that happens
02:41:45.860 | when you get to that edge that you realize,
02:41:47.580 | "Oh, sure, it's great that we all chase money and fame
02:41:51.580 | and success," but you hit the edge.
02:41:54.820 | And I'm not even claiming I hit an edge
02:41:56.740 | like Elon's hit an edge.
02:41:58.380 | There's clearly larger scales.
02:42:00.020 | But what's cool is you learn that
02:42:02.620 | it doesn't actually matter
02:42:03.860 | and that there are all these other things that truly matter.
02:42:07.220 | That's not a case for working less hard.
02:42:09.300 | That's not a case for taking it easy.
02:42:11.500 | That's not a case for the four-day work week.
02:42:13.900 | What that is a case for is designing your life
02:42:16.660 | exactly the way you want to design it,
02:42:18.980 | 'cause I don't know.
02:42:20.460 | I think we go around the sun a certain number of times
02:42:25.060 | and then we die and then that's it.
02:42:27.660 | That's me.
02:42:28.500 | - Are you afraid of that moment?
02:42:29.740 | - No, not at all.
02:42:30.900 | In fact, or at least not yet.
02:42:32.820 | [laughing]
02:42:35.060 | Listen, I'm like a pilot.
02:42:37.740 | I do crazy things and I like,
02:42:40.360 | no, if anything, I'm like,
02:42:43.380 | "Ooh, I got to choose mindfully and purposefully
02:42:49.660 | the thing I am doing right now
02:42:51.420 | and not just fall into it."
02:42:54.060 | Because you're going to wake up one day and ask yourself,
02:42:55.740 | "Why the hell you spent the last 10 years doing X, Y, or Z?"
02:42:58.900 | So I guess my shorter answer to this is
02:43:03.700 | doing things on purpose because you choose to do them,
02:43:08.300 | so important in life.
02:43:10.420 | And not just floating down the river of life,
02:43:12.860 | hitting branches along the way,
02:43:14.700 | 'cause you will hit branches, right?
02:43:17.140 | But rather like literally plotting a course
02:43:19.680 | and not having a 10-year plan,
02:43:21.060 | but just choosing every day to opt in.
02:43:23.740 | That I think has been more,
02:43:28.300 | like I haven't figured out the meaning of life
02:43:29.900 | by any stretch of the imagination,
02:43:31.620 | but it certainly isn't money
02:43:32.700 | and it certainly isn't fame
02:43:33.820 | and it certainly isn't travel.
02:43:35.060 | And it's way more of like opting into the game
02:43:38.340 | you love playing.
02:43:39.400 | - Every day, opting in.
02:43:41.980 | - Just opting in.
02:43:43.220 | And like, don't let it happen to you, opt in.
02:43:46.940 | - Kevin, it's great to end on love and the meaning of life.
02:43:51.940 | This was an amazing conversation.
02:43:53.900 | - It was a lot of fun, thank you.
02:43:54.740 | - You gave me like a light into some fascinating aspects
02:43:57.900 | of this technical world.
02:44:00.180 | And I can't honestly wait to see what you do next.
02:44:04.060 | Thank you so much.
02:44:05.260 | - Thanks for having me.
02:44:06.420 | - Thanks for listening to this conversation
02:44:09.020 | with Kevin Systrom.
02:44:10.500 | To support this podcast,
02:44:11.980 | please check out our sponsors in the description.
02:44:14.660 | And now, let me leave you with some words
02:44:16.860 | from Kevin Systrom himself.
02:44:19.460 | "Focusing on one thing and doing it really, really well
02:44:24.220 | can get you very far."
02:44:26.500 | Thank you for listening and hope to see you next time.
02:44:29.580 | (upbeat music)
02:44:32.160 | (upbeat music)